Feb 23 06:36:01 localhost kernel: Linux version 5.14.0-284.11.1.el9_2.x86_64 (mockbuild@x86-vm-09.build.eng.bos.redhat.com) (gcc (GCC) 11.3.1 20221121 (Red Hat 11.3.1-4), GNU ld version 2.35.2-37.el9) #1 SMP PREEMPT_DYNAMIC Wed Apr 12 10:45:03 EDT 2023
Feb 23 06:36:01 localhost kernel: The list of certified hardware and cloud instances for Red Hat Enterprise Linux 9 can be viewed at the Red Hat Ecosystem Catalog, https://catalog.redhat.com.
Feb 23 06:36:01 localhost kernel: Command line: BOOT_IMAGE=(hd0,gpt3)/vmlinuz-5.14.0-284.11.1.el9_2.x86_64 root=UUID=a3dd82de-ffc6-4652-88b9-80e003b8f20a console=tty0 console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-4G:192M,4G-64G:256M,64G-:512M
Feb 23 06:36:01 localhost kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Feb 23 06:36:01 localhost kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Feb 23 06:36:01 localhost kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Feb 23 06:36:01 localhost kernel: x86/fpu: xstate_offset[2]:  576, xstate_sizes[2]:  256
Feb 23 06:36:01 localhost kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'standard' format.
Feb 23 06:36:01 localhost kernel: signal: max sigframe size: 1776
Feb 23 06:36:01 localhost kernel: BIOS-provided physical RAM map:
Feb 23 06:36:01 localhost kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Feb 23 06:36:01 localhost kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Feb 23 06:36:01 localhost kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Feb 23 06:36:01 localhost kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000bffdafff] usable
Feb 23 06:36:01 localhost kernel: BIOS-e820: [mem 0x00000000bffdb000-0x00000000bfffffff] reserved
Feb 23 06:36:01 localhost kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Feb 23 06:36:01 localhost kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Feb 23 06:36:01 localhost kernel: BIOS-e820: [mem 0x0000000100000000-0x000000043fffffff] usable
Feb 23 06:36:01 localhost kernel: NX (Execute Disable) protection: active
Feb 23 06:36:01 localhost kernel: SMBIOS 2.8 present.
Feb 23 06:36:01 localhost kernel: DMI: OpenStack Foundation OpenStack Nova, BIOS 1.15.0-1 04/01/2014
Feb 23 06:36:01 localhost kernel: Hypervisor detected: KVM
Feb 23 06:36:01 localhost kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Feb 23 06:36:01 localhost kernel: kvm-clock: using sched offset of 2904884916 cycles
Feb 23 06:36:01 localhost kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Feb 23 06:36:01 localhost kernel: tsc: Detected 2799.998 MHz processor
Feb 23 06:36:01 localhost kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Feb 23 06:36:01 localhost kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Feb 23 06:36:01 localhost kernel: last_pfn = 0x440000 max_arch_pfn = 0x400000000
Feb 23 06:36:01 localhost kernel: x86/PAT: Configuration [0-7]: WB  WC  UC- UC  WB  WP  UC- WT  
Feb 23 06:36:01 localhost kernel: last_pfn = 0xbffdb max_arch_pfn = 0x400000000
Feb 23 06:36:01 localhost kernel: found SMP MP-table at [mem 0x000f5ae0-0x000f5aef]
Feb 23 06:36:01 localhost kernel: Using GB pages for direct mapping
Feb 23 06:36:01 localhost kernel: RAMDISK: [mem 0x2eef4000-0x33771fff]
Feb 23 06:36:01 localhost kernel: ACPI: Early table checksum verification disabled
Feb 23 06:36:01 localhost kernel: ACPI: RSDP 0x00000000000F5AA0 000014 (v00 BOCHS )
Feb 23 06:36:01 localhost kernel: ACPI: RSDT 0x00000000BFFE16BD 000030 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Feb 23 06:36:01 localhost kernel: ACPI: FACP 0x00000000BFFE1571 000074 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Feb 23 06:36:01 localhost kernel: ACPI: DSDT 0x00000000BFFDFC80 0018F1 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Feb 23 06:36:01 localhost kernel: ACPI: FACS 0x00000000BFFDFC40 000040
Feb 23 06:36:01 localhost kernel: ACPI: APIC 0x00000000BFFE15E5 0000B0 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Feb 23 06:36:01 localhost kernel: ACPI: WAET 0x00000000BFFE1695 000028 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Feb 23 06:36:01 localhost kernel: ACPI: Reserving FACP table memory at [mem 0xbffe1571-0xbffe15e4]
Feb 23 06:36:01 localhost kernel: ACPI: Reserving DSDT table memory at [mem 0xbffdfc80-0xbffe1570]
Feb 23 06:36:01 localhost kernel: ACPI: Reserving FACS table memory at [mem 0xbffdfc40-0xbffdfc7f]
Feb 23 06:36:01 localhost kernel: ACPI: Reserving APIC table memory at [mem 0xbffe15e5-0xbffe1694]
Feb 23 06:36:01 localhost kernel: ACPI: Reserving WAET table memory at [mem 0xbffe1695-0xbffe16bc]
Feb 23 06:36:01 localhost kernel: No NUMA configuration found
Feb 23 06:36:01 localhost kernel: Faking a node at [mem 0x0000000000000000-0x000000043fffffff]
Feb 23 06:36:01 localhost kernel: NODE_DATA(0) allocated [mem 0x43ffd5000-0x43fffffff]
Feb 23 06:36:01 localhost kernel: Reserving 256MB of memory at 2800MB for crashkernel (System RAM: 16383MB)
Feb 23 06:36:01 localhost kernel: Zone ranges:
Feb 23 06:36:01 localhost kernel:   DMA      [mem 0x0000000000001000-0x0000000000ffffff]
Feb 23 06:36:01 localhost kernel:   DMA32    [mem 0x0000000001000000-0x00000000ffffffff]
Feb 23 06:36:01 localhost kernel:   Normal   [mem 0x0000000100000000-0x000000043fffffff]
Feb 23 06:36:01 localhost kernel:   Device   empty
Feb 23 06:36:01 localhost kernel: Movable zone start for each node
Feb 23 06:36:01 localhost kernel: Early memory node ranges
Feb 23 06:36:01 localhost kernel:   node   0: [mem 0x0000000000001000-0x000000000009efff]
Feb 23 06:36:01 localhost kernel:   node   0: [mem 0x0000000000100000-0x00000000bffdafff]
Feb 23 06:36:01 localhost kernel:   node   0: [mem 0x0000000100000000-0x000000043fffffff]
Feb 23 06:36:01 localhost kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000043fffffff]
Feb 23 06:36:01 localhost kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Feb 23 06:36:01 localhost kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Feb 23 06:36:01 localhost kernel: On node 0, zone Normal: 37 pages in unavailable ranges
Feb 23 06:36:01 localhost kernel: ACPI: PM-Timer IO Port: 0x608
Feb 23 06:36:01 localhost kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Feb 23 06:36:01 localhost kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Feb 23 06:36:01 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Feb 23 06:36:01 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Feb 23 06:36:01 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Feb 23 06:36:01 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Feb 23 06:36:01 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Feb 23 06:36:01 localhost kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Feb 23 06:36:01 localhost kernel: TSC deadline timer available
Feb 23 06:36:01 localhost kernel: smpboot: Allowing 8 CPUs, 0 hotplug CPUs
Feb 23 06:36:01 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x00000000-0x00000fff]
Feb 23 06:36:01 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x0009f000-0x0009ffff]
Feb 23 06:36:01 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x000a0000-0x000effff]
Feb 23 06:36:01 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x000f0000-0x000fffff]
Feb 23 06:36:01 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xbffdb000-0xbfffffff]
Feb 23 06:36:01 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xc0000000-0xfeffbfff]
Feb 23 06:36:01 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xfeffc000-0xfeffffff]
Feb 23 06:36:01 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xff000000-0xfffbffff]
Feb 23 06:36:01 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xfffc0000-0xffffffff]
Feb 23 06:36:01 localhost kernel: [mem 0xc0000000-0xfeffbfff] available for PCI devices
Feb 23 06:36:01 localhost kernel: Booting paravirtualized kernel on KVM
Feb 23 06:36:01 localhost kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Feb 23 06:36:01 localhost kernel: setup_percpu: NR_CPUS:8192 nr_cpumask_bits:8 nr_cpu_ids:8 nr_node_ids:1
Feb 23 06:36:01 localhost kernel: percpu: Embedded 55 pages/cpu s188416 r8192 d28672 u262144
Feb 23 06:36:01 localhost kernel: pcpu-alloc: s188416 r8192 d28672 u262144 alloc=1*2097152
Feb 23 06:36:01 localhost kernel: pcpu-alloc: [0] 0 1 2 3 4 5 6 7 
Feb 23 06:36:01 localhost kernel: kvm-guest: PV spinlocks disabled, no host support
Feb 23 06:36:01 localhost kernel: Fallback order for Node 0: 0 
Feb 23 06:36:01 localhost kernel: Built 1 zonelists, mobility grouping on.  Total pages: 4128475
Feb 23 06:36:01 localhost kernel: Policy zone: Normal
Feb 23 06:36:01 localhost kernel: Kernel command line: BOOT_IMAGE=(hd0,gpt3)/vmlinuz-5.14.0-284.11.1.el9_2.x86_64 root=UUID=a3dd82de-ffc6-4652-88b9-80e003b8f20a console=tty0 console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-4G:192M,4G-64G:256M,64G-:512M
Feb 23 06:36:01 localhost kernel: Unknown kernel command line parameters "BOOT_IMAGE=(hd0,gpt3)/vmlinuz-5.14.0-284.11.1.el9_2.x86_64", will be passed to user space.
Feb 23 06:36:01 localhost kernel: Dentry cache hash table entries: 2097152 (order: 12, 16777216 bytes, linear)
Feb 23 06:36:01 localhost kernel: Inode-cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear)
Feb 23 06:36:01 localhost kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Feb 23 06:36:01 localhost kernel: software IO TLB: area num 8.
Feb 23 06:36:01 localhost kernel: Memory: 2873456K/16776676K available (14342K kernel code, 5536K rwdata, 10180K rodata, 2792K init, 7524K bss, 741260K reserved, 0K cma-reserved)
Feb 23 06:36:01 localhost kernel: random: get_random_u64 called from kmem_cache_open+0x1e/0x210 with crng_init=0
Feb 23 06:36:01 localhost kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=8, Nodes=1
Feb 23 06:36:01 localhost kernel: ftrace: allocating 44803 entries in 176 pages
Feb 23 06:36:01 localhost kernel: ftrace: allocated 176 pages with 3 groups
Feb 23 06:36:01 localhost kernel: Dynamic Preempt: voluntary
Feb 23 06:36:01 localhost kernel: rcu: Preemptible hierarchical RCU implementation.
Feb 23 06:36:01 localhost kernel: rcu:         RCU restricting CPUs from NR_CPUS=8192 to nr_cpu_ids=8.
Feb 23 06:36:01 localhost kernel:         Trampoline variant of Tasks RCU enabled.
Feb 23 06:36:01 localhost kernel:         Rude variant of Tasks RCU enabled.
Feb 23 06:36:01 localhost kernel:         Tracing variant of Tasks RCU enabled.
Feb 23 06:36:01 localhost kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Feb 23 06:36:01 localhost kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=8
Feb 23 06:36:01 localhost kernel: NR_IRQS: 524544, nr_irqs: 488, preallocated irqs: 16
Feb 23 06:36:01 localhost kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Feb 23 06:36:01 localhost kernel: kfence: initialized - using 2097152 bytes for 255 objects at 0x(____ptrval____)-0x(____ptrval____)
Feb 23 06:36:01 localhost kernel: random: crng init done (trusting CPU's manufacturer)
Feb 23 06:36:01 localhost kernel: Console: colour VGA+ 80x25
Feb 23 06:36:01 localhost kernel: printk: console [tty0] enabled
Feb 23 06:36:01 localhost kernel: printk: console [ttyS0] enabled
Feb 23 06:36:01 localhost kernel: ACPI: Core revision 20211217
Feb 23 06:36:01 localhost kernel: APIC: Switch to symmetric I/O mode setup
Feb 23 06:36:01 localhost kernel: x2apic enabled
Feb 23 06:36:01 localhost kernel: Switched APIC routing to physical x2apic.
Feb 23 06:36:01 localhost kernel: tsc: Marking TSC unstable due to TSCs unsynchronized
Feb 23 06:36:01 localhost kernel: Calibrating delay loop (skipped) preset value.. 5599.99 BogoMIPS (lpj=2799998)
Feb 23 06:36:01 localhost kernel: pid_max: default: 32768 minimum: 301
Feb 23 06:36:01 localhost kernel: LSM: Security Framework initializing
Feb 23 06:36:01 localhost kernel: Yama: becoming mindful.
Feb 23 06:36:01 localhost kernel: SELinux:  Initializing.
Feb 23 06:36:01 localhost kernel: LSM support for eBPF active
Feb 23 06:36:01 localhost kernel: Mount-cache hash table entries: 32768 (order: 6, 262144 bytes, linear)
Feb 23 06:36:01 localhost kernel: Mountpoint-cache hash table entries: 32768 (order: 6, 262144 bytes, linear)
Feb 23 06:36:01 localhost kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Feb 23 06:36:01 localhost kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
Feb 23 06:36:01 localhost kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
Feb 23 06:36:01 localhost kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Feb 23 06:36:01 localhost kernel: Spectre V2 : Mitigation: Retpolines
Feb 23 06:36:01 localhost kernel: Spectre V2 : Spectre v2 / SpectreRSB mitigation: Filling RSB on context switch
Feb 23 06:36:01 localhost kernel: Spectre V2 : Spectre v2 / SpectreRSB : Filling RSB on VMEXIT
Feb 23 06:36:01 localhost kernel: Spectre V2 : Enabling Speculation Barrier for firmware calls
Feb 23 06:36:01 localhost kernel: RETBleed: Mitigation: untrained return thunk
Feb 23 06:36:01 localhost kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Feb 23 06:36:01 localhost kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Feb 23 06:36:01 localhost kernel: Freeing SMP alternatives memory: 36K
Feb 23 06:36:01 localhost kernel: smpboot: CPU0: AMD EPYC-Rome Processor (family: 0x17, model: 0x31, stepping: 0x0)
Feb 23 06:36:01 localhost kernel: cblist_init_generic: Setting adjustable number of callback queues.
Feb 23 06:36:01 localhost kernel: cblist_init_generic: Setting shift to 3 and lim to 1.
Feb 23 06:36:01 localhost kernel: cblist_init_generic: Setting shift to 3 and lim to 1.
Feb 23 06:36:01 localhost kernel: cblist_init_generic: Setting shift to 3 and lim to 1.
Feb 23 06:36:01 localhost kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver.
Feb 23 06:36:01 localhost kernel: ... version:                0
Feb 23 06:36:01 localhost kernel: ... bit width:              48
Feb 23 06:36:01 localhost kernel: ... generic registers:      6
Feb 23 06:36:01 localhost kernel: ... value mask:             0000ffffffffffff
Feb 23 06:36:01 localhost kernel: ... max period:             00007fffffffffff
Feb 23 06:36:01 localhost kernel: ... fixed-purpose events:   0
Feb 23 06:36:01 localhost kernel: ... event mask:             000000000000003f
Feb 23 06:36:01 localhost kernel: rcu: Hierarchical SRCU implementation.
Feb 23 06:36:01 localhost kernel: rcu:         Max phase no-delay instances is 400.
Feb 23 06:36:01 localhost kernel: smp: Bringing up secondary CPUs ...
Feb 23 06:36:01 localhost kernel: x86: Booting SMP configuration:
Feb 23 06:36:01 localhost kernel: .... node  #0, CPUs:      #1 #2 #3 #4 #5 #6 #7
Feb 23 06:36:01 localhost kernel: smp: Brought up 1 node, 8 CPUs
Feb 23 06:36:01 localhost kernel: smpboot: Max logical packages: 8
Feb 23 06:36:01 localhost kernel: smpboot: Total of 8 processors activated (44799.96 BogoMIPS)
Feb 23 06:36:01 localhost kernel: node 0 deferred pages initialised in 23ms
Feb 23 06:36:01 localhost kernel: devtmpfs: initialized
Feb 23 06:36:01 localhost kernel: x86/mm: Memory block size: 128MB
Feb 23 06:36:01 localhost kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Feb 23 06:36:01 localhost kernel: futex hash table entries: 2048 (order: 5, 131072 bytes, linear)
Feb 23 06:36:01 localhost kernel: pinctrl core: initialized pinctrl subsystem
Feb 23 06:36:01 localhost kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Feb 23 06:36:01 localhost kernel: DMA: preallocated 2048 KiB GFP_KERNEL pool for atomic allocations
Feb 23 06:36:01 localhost kernel: DMA: preallocated 2048 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Feb 23 06:36:01 localhost kernel: DMA: preallocated 2048 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Feb 23 06:36:01 localhost kernel: audit: initializing netlink subsys (disabled)
Feb 23 06:36:01 localhost kernel: audit: type=2000 audit(1771828560.287:1): state=initialized audit_enabled=0 res=1
Feb 23 06:36:01 localhost kernel: thermal_sys: Registered thermal governor 'fair_share'
Feb 23 06:36:01 localhost kernel: thermal_sys: Registered thermal governor 'step_wise'
Feb 23 06:36:01 localhost kernel: thermal_sys: Registered thermal governor 'user_space'
Feb 23 06:36:01 localhost kernel: cpuidle: using governor menu
Feb 23 06:36:01 localhost kernel: HugeTLB: can optimize 4095 vmemmap pages for hugepages-1048576kB
Feb 23 06:36:01 localhost kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Feb 23 06:36:01 localhost kernel: PCI: Using configuration type 1 for base access
Feb 23 06:36:01 localhost kernel: PCI: Using configuration type 1 for extended access
Feb 23 06:36:01 localhost kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Feb 23 06:36:01 localhost kernel: HugeTLB: can optimize 7 vmemmap pages for hugepages-2048kB
Feb 23 06:36:01 localhost kernel: HugeTLB registered 1.00 GiB page size, pre-allocated 0 pages
Feb 23 06:36:01 localhost kernel: HugeTLB registered 2.00 MiB page size, pre-allocated 0 pages
Feb 23 06:36:01 localhost kernel: cryptd: max_cpu_qlen set to 1000
Feb 23 06:36:01 localhost kernel: ACPI: Added _OSI(Module Device)
Feb 23 06:36:01 localhost kernel: ACPI: Added _OSI(Processor Device)
Feb 23 06:36:01 localhost kernel: ACPI: Added _OSI(3.0 _SCP Extensions)
Feb 23 06:36:01 localhost kernel: ACPI: Added _OSI(Processor Aggregator Device)
Feb 23 06:36:01 localhost kernel: ACPI: Added _OSI(Linux-Dell-Video)
Feb 23 06:36:01 localhost kernel: ACPI: Added _OSI(Linux-Lenovo-NV-HDMI-Audio)
Feb 23 06:36:01 localhost kernel: ACPI: Added _OSI(Linux-HPI-Hybrid-Graphics)
Feb 23 06:36:01 localhost kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Feb 23 06:36:01 localhost kernel: ACPI: Interpreter enabled
Feb 23 06:36:01 localhost kernel: ACPI: PM: (supports S0 S3 S4 S5)
Feb 23 06:36:01 localhost kernel: ACPI: Using IOAPIC for interrupt routing
Feb 23 06:36:01 localhost kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Feb 23 06:36:01 localhost kernel: PCI: Using E820 reservations for host bridge windows
Feb 23 06:36:01 localhost kernel: ACPI: Enabled 2 GPEs in block 00 to 0F
Feb 23 06:36:01 localhost kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Feb 23 06:36:01 localhost kernel: acpi PNP0A03:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI EDR HPX-Type3]
Feb 23 06:36:01 localhost kernel: acpiphp: Slot [3] registered
Feb 23 06:36:01 localhost kernel: acpiphp: Slot [4] registered
Feb 23 06:36:01 localhost kernel: acpiphp: Slot [5] registered
Feb 23 06:36:01 localhost kernel: acpiphp: Slot [6] registered
Feb 23 06:36:01 localhost kernel: acpiphp: Slot [7] registered
Feb 23 06:36:01 localhost kernel: acpiphp: Slot [8] registered
Feb 23 06:36:01 localhost kernel: acpiphp: Slot [9] registered
Feb 23 06:36:01 localhost kernel: acpiphp: Slot [10] registered
Feb 23 06:36:01 localhost kernel: acpiphp: Slot [11] registered
Feb 23 06:36:01 localhost kernel: acpiphp: Slot [12] registered
Feb 23 06:36:01 localhost kernel: acpiphp: Slot [13] registered
Feb 23 06:36:01 localhost kernel: acpiphp: Slot [14] registered
Feb 23 06:36:01 localhost kernel: acpiphp: Slot [15] registered
Feb 23 06:36:01 localhost kernel: acpiphp: Slot [16] registered
Feb 23 06:36:01 localhost kernel: acpiphp: Slot [17] registered
Feb 23 06:36:01 localhost kernel: acpiphp: Slot [18] registered
Feb 23 06:36:01 localhost kernel: acpiphp: Slot [19] registered
Feb 23 06:36:01 localhost kernel: acpiphp: Slot [20] registered
Feb 23 06:36:01 localhost kernel: acpiphp: Slot [21] registered
Feb 23 06:36:01 localhost kernel: acpiphp: Slot [22] registered
Feb 23 06:36:01 localhost kernel: acpiphp: Slot [23] registered
Feb 23 06:36:01 localhost kernel: acpiphp: Slot [24] registered
Feb 23 06:36:01 localhost kernel: acpiphp: Slot [25] registered
Feb 23 06:36:01 localhost kernel: acpiphp: Slot [26] registered
Feb 23 06:36:01 localhost kernel: acpiphp: Slot [27] registered
Feb 23 06:36:01 localhost kernel: acpiphp: Slot [28] registered
Feb 23 06:36:01 localhost kernel: acpiphp: Slot [29] registered
Feb 23 06:36:01 localhost kernel: acpiphp: Slot [30] registered
Feb 23 06:36:01 localhost kernel: acpiphp: Slot [31] registered
Feb 23 06:36:01 localhost kernel: PCI host bridge to bus 0000:00
Feb 23 06:36:01 localhost kernel: pci_bus 0000:00: root bus resource [io  0x0000-0x0cf7 window]
Feb 23 06:36:01 localhost kernel: pci_bus 0000:00: root bus resource [io  0x0d00-0xffff window]
Feb 23 06:36:01 localhost kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Feb 23 06:36:01 localhost kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Feb 23 06:36:01 localhost kernel: pci_bus 0000:00: root bus resource [mem 0x440000000-0x4bfffffff window]
Feb 23 06:36:01 localhost kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Feb 23 06:36:01 localhost kernel: pci 0000:00:00.0: [8086:1237] type 00 class 0x060000
Feb 23 06:36:01 localhost kernel: pci 0000:00:01.0: [8086:7000] type 00 class 0x060100
Feb 23 06:36:01 localhost kernel: pci 0000:00:01.1: [8086:7010] type 00 class 0x010180
Feb 23 06:36:01 localhost kernel: pci 0000:00:01.1: reg 0x20: [io  0xc140-0xc14f]
Feb 23 06:36:01 localhost kernel: pci 0000:00:01.1: legacy IDE quirk: reg 0x10: [io  0x01f0-0x01f7]
Feb 23 06:36:01 localhost kernel: pci 0000:00:01.1: legacy IDE quirk: reg 0x14: [io  0x03f6]
Feb 23 06:36:01 localhost kernel: pci 0000:00:01.1: legacy IDE quirk: reg 0x18: [io  0x0170-0x0177]
Feb 23 06:36:01 localhost kernel: pci 0000:00:01.1: legacy IDE quirk: reg 0x1c: [io  0x0376]
Feb 23 06:36:01 localhost kernel: pci 0000:00:01.2: [8086:7020] type 00 class 0x0c0300
Feb 23 06:36:01 localhost kernel: pci 0000:00:01.2: reg 0x20: [io  0xc100-0xc11f]
Feb 23 06:36:01 localhost kernel: pci 0000:00:01.3: [8086:7113] type 00 class 0x068000
Feb 23 06:36:01 localhost kernel: pci 0000:00:01.3: quirk: [io  0x0600-0x063f] claimed by PIIX4 ACPI
Feb 23 06:36:01 localhost kernel: pci 0000:00:01.3: quirk: [io  0x0700-0x070f] claimed by PIIX4 SMB
Feb 23 06:36:01 localhost kernel: pci 0000:00:02.0: [1af4:1050] type 00 class 0x030000
Feb 23 06:36:01 localhost kernel: pci 0000:00:02.0: reg 0x10: [mem 0xfe000000-0xfe7fffff pref]
Feb 23 06:36:01 localhost kernel: pci 0000:00:02.0: reg 0x18: [mem 0xfe800000-0xfe803fff 64bit pref]
Feb 23 06:36:01 localhost kernel: pci 0000:00:02.0: reg 0x20: [mem 0xfeb90000-0xfeb90fff]
Feb 23 06:36:01 localhost kernel: pci 0000:00:02.0: reg 0x30: [mem 0xfeb80000-0xfeb8ffff pref]
Feb 23 06:36:01 localhost kernel: pci 0000:00:02.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Feb 23 06:36:01 localhost kernel: pci 0000:00:03.0: [1af4:1000] type 00 class 0x020000
Feb 23 06:36:01 localhost kernel: pci 0000:00:03.0: reg 0x10: [io  0xc080-0xc0bf]
Feb 23 06:36:01 localhost kernel: pci 0000:00:03.0: reg 0x14: [mem 0xfeb91000-0xfeb91fff]
Feb 23 06:36:01 localhost kernel: pci 0000:00:03.0: reg 0x20: [mem 0xfe804000-0xfe807fff 64bit pref]
Feb 23 06:36:01 localhost kernel: pci 0000:00:03.0: reg 0x30: [mem 0xfeb00000-0xfeb7ffff pref]
Feb 23 06:36:01 localhost kernel: pci 0000:00:04.0: [1af4:1001] type 00 class 0x010000
Feb 23 06:36:01 localhost kernel: pci 0000:00:04.0: reg 0x10: [io  0xc000-0xc07f]
Feb 23 06:36:01 localhost kernel: pci 0000:00:04.0: reg 0x14: [mem 0xfeb92000-0xfeb92fff]
Feb 23 06:36:01 localhost kernel: pci 0000:00:04.0: reg 0x20: [mem 0xfe808000-0xfe80bfff 64bit pref]
Feb 23 06:36:01 localhost kernel: pci 0000:00:05.0: [1af4:1002] type 00 class 0x00ff00
Feb 23 06:36:01 localhost kernel: pci 0000:00:05.0: reg 0x10: [io  0xc0c0-0xc0ff]
Feb 23 06:36:01 localhost kernel: pci 0000:00:05.0: reg 0x20: [mem 0xfe80c000-0xfe80ffff 64bit pref]
Feb 23 06:36:01 localhost kernel: pci 0000:00:06.0: [1af4:1005] type 00 class 0x00ff00
Feb 23 06:36:01 localhost kernel: pci 0000:00:06.0: reg 0x10: [io  0xc120-0xc13f]
Feb 23 06:36:01 localhost kernel: pci 0000:00:06.0: reg 0x20: [mem 0xfe810000-0xfe813fff 64bit pref]
Feb 23 06:36:01 localhost kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Feb 23 06:36:01 localhost kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Feb 23 06:36:01 localhost kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Feb 23 06:36:01 localhost kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Feb 23 06:36:01 localhost kernel: ACPI: PCI: Interrupt link LNKS configured for IRQ 9
Feb 23 06:36:01 localhost kernel: iommu: Default domain type: Translated 
Feb 23 06:36:01 localhost kernel: iommu: DMA domain TLB invalidation policy: lazy mode 
Feb 23 06:36:01 localhost kernel: SCSI subsystem initialized
Feb 23 06:36:01 localhost kernel: ACPI: bus type USB registered
Feb 23 06:36:01 localhost kernel: usbcore: registered new interface driver usbfs
Feb 23 06:36:01 localhost kernel: usbcore: registered new interface driver hub
Feb 23 06:36:01 localhost kernel: usbcore: registered new device driver usb
Feb 23 06:36:01 localhost kernel: pps_core: LinuxPPS API ver. 1 registered
Feb 23 06:36:01 localhost kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti <giometti@linux.it>
Feb 23 06:36:01 localhost kernel: PTP clock support registered
Feb 23 06:36:01 localhost kernel: EDAC MC: Ver: 3.0.0
Feb 23 06:36:01 localhost kernel: NetLabel: Initializing
Feb 23 06:36:01 localhost kernel: NetLabel:  domain hash size = 128
Feb 23 06:36:01 localhost kernel: NetLabel:  protocols = UNLABELED CIPSOv4 CALIPSO
Feb 23 06:36:01 localhost kernel: NetLabel:  unlabeled traffic allowed by default
Feb 23 06:36:01 localhost kernel: PCI: Using ACPI for IRQ routing
Feb 23 06:36:01 localhost kernel: PCI: pci_cache_line_size set to 64 bytes
Feb 23 06:36:01 localhost kernel: e820: reserve RAM buffer [mem 0x0009fc00-0x0009ffff]
Feb 23 06:36:01 localhost kernel: e820: reserve RAM buffer [mem 0xbffdb000-0xbfffffff]
Feb 23 06:36:01 localhost kernel: pci 0000:00:02.0: vgaarb: setting as boot VGA device
Feb 23 06:36:01 localhost kernel: pci 0000:00:02.0: vgaarb: bridge control possible
Feb 23 06:36:01 localhost kernel: pci 0000:00:02.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Feb 23 06:36:01 localhost kernel: vgaarb: loaded
Feb 23 06:36:01 localhost kernel: clocksource: Switched to clocksource kvm-clock
Feb 23 06:36:01 localhost kernel: VFS: Disk quotas dquot_6.6.0
Feb 23 06:36:01 localhost kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Feb 23 06:36:01 localhost kernel: pnp: PnP ACPI init
Feb 23 06:36:01 localhost kernel: pnp 00:03: [dma 2]
Feb 23 06:36:01 localhost kernel: pnp: PnP ACPI: found 5 devices
Feb 23 06:36:01 localhost kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Feb 23 06:36:01 localhost kernel: NET: Registered PF_INET protocol family
Feb 23 06:36:01 localhost kernel: IP idents hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Feb 23 06:36:01 localhost kernel: tcp_listen_portaddr_hash hash table entries: 8192 (order: 5, 131072 bytes, linear)
Feb 23 06:36:01 localhost kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Feb 23 06:36:01 localhost kernel: TCP established hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Feb 23 06:36:01 localhost kernel: TCP bind hash table entries: 65536 (order: 8, 1048576 bytes, linear)
Feb 23 06:36:01 localhost kernel: TCP: Hash tables configured (established 131072 bind 65536)
Feb 23 06:36:01 localhost kernel: MPTCP token hash table entries: 16384 (order: 6, 393216 bytes, linear)
Feb 23 06:36:01 localhost kernel: UDP hash table entries: 8192 (order: 6, 262144 bytes, linear)
Feb 23 06:36:01 localhost kernel: UDP-Lite hash table entries: 8192 (order: 6, 262144 bytes, linear)
Feb 23 06:36:01 localhost kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Feb 23 06:36:01 localhost kernel: NET: Registered PF_XDP protocol family
Feb 23 06:36:01 localhost kernel: pci_bus 0000:00: resource 4 [io  0x0000-0x0cf7 window]
Feb 23 06:36:01 localhost kernel: pci_bus 0000:00: resource 5 [io  0x0d00-0xffff window]
Feb 23 06:36:01 localhost kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Feb 23 06:36:01 localhost kernel: pci_bus 0000:00: resource 7 [mem 0xc0000000-0xfebfffff window]
Feb 23 06:36:01 localhost kernel: pci_bus 0000:00: resource 8 [mem 0x440000000-0x4bfffffff window]
Feb 23 06:36:01 localhost kernel: pci 0000:00:01.0: PIIX3: Enabling Passive Release
Feb 23 06:36:01 localhost kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers
Feb 23 06:36:01 localhost kernel: ACPI: \_SB_.LNKD: Enabled at IRQ 11
Feb 23 06:36:01 localhost kernel: pci 0000:00:01.2: quirk_usb_early_handoff+0x0/0x140 took 34339 usecs
Feb 23 06:36:01 localhost kernel: PCI: CLS 0 bytes, default 64
Feb 23 06:36:01 localhost kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB)
Feb 23 06:36:01 localhost kernel: Trying to unpack rootfs image as initramfs...
Feb 23 06:36:01 localhost kernel: software IO TLB: mapped [mem 0x00000000ab000000-0x00000000af000000] (64MB)
Feb 23 06:36:01 localhost kernel: ACPI: bus type thunderbolt registered
Feb 23 06:36:01 localhost kernel: Initialise system trusted keyrings
Feb 23 06:36:01 localhost kernel: Key type blacklist registered
Feb 23 06:36:01 localhost kernel: workingset: timestamp_bits=36 max_order=22 bucket_order=0
Feb 23 06:36:01 localhost kernel: zbud: loaded
Feb 23 06:36:01 localhost kernel: integrity: Platform Keyring initialized
Feb 23 06:36:01 localhost kernel: NET: Registered PF_ALG protocol family
Feb 23 06:36:01 localhost kernel: xor: automatically using best checksumming function   avx       
Feb 23 06:36:01 localhost kernel: Key type asymmetric registered
Feb 23 06:36:01 localhost kernel: Asymmetric key parser 'x509' registered
Feb 23 06:36:01 localhost kernel: Running certificate verification selftests
Feb 23 06:36:01 localhost kernel: Loaded X.509 cert 'Certificate verification self-testing key: f58703bb33ce1b73ee02eccdee5b8817518fe3db'
Feb 23 06:36:01 localhost kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 246)
Feb 23 06:36:01 localhost kernel: io scheduler mq-deadline registered
Feb 23 06:36:01 localhost kernel: io scheduler kyber registered
Feb 23 06:36:01 localhost kernel: io scheduler bfq registered
Feb 23 06:36:01 localhost kernel: atomic64_test: passed for x86-64 platform with CX8 and with SSE
Feb 23 06:36:01 localhost kernel: shpchp: Standard Hot Plug PCI Controller Driver version: 0.4
Feb 23 06:36:01 localhost kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input0
Feb 23 06:36:01 localhost kernel: ACPI: button: Power Button [PWRF]
Feb 23 06:36:01 localhost kernel: ACPI: \_SB_.LNKB: Enabled at IRQ 10
Feb 23 06:36:01 localhost kernel: ACPI: \_SB_.LNKC: Enabled at IRQ 11
Feb 23 06:36:01 localhost kernel: ACPI: \_SB_.LNKA: Enabled at IRQ 10
Feb 23 06:36:01 localhost kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Feb 23 06:36:01 localhost kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Feb 23 06:36:01 localhost kernel: Non-volatile memory driver v1.3
Feb 23 06:36:01 localhost kernel: rdac: device handler registered
Feb 23 06:36:01 localhost kernel: hp_sw: device handler registered
Feb 23 06:36:01 localhost kernel: emc: device handler registered
Feb 23 06:36:01 localhost kernel: alua: device handler registered
Feb 23 06:36:01 localhost kernel: libphy: Fixed MDIO Bus: probed
Feb 23 06:36:01 localhost kernel: ehci_hcd: USB 2.0 'Enhanced' Host Controller (EHCI) Driver
Feb 23 06:36:01 localhost kernel: ehci-pci: EHCI PCI platform driver
Feb 23 06:36:01 localhost kernel: ohci_hcd: USB 1.1 'Open' Host Controller (OHCI) Driver
Feb 23 06:36:01 localhost kernel: ohci-pci: OHCI PCI platform driver
Feb 23 06:36:01 localhost kernel: uhci_hcd: USB Universal Host Controller Interface driver
Feb 23 06:36:01 localhost kernel: uhci_hcd 0000:00:01.2: UHCI Host Controller
Feb 23 06:36:01 localhost kernel: uhci_hcd 0000:00:01.2: new USB bus registered, assigned bus number 1
Feb 23 06:36:01 localhost kernel: uhci_hcd 0000:00:01.2: detected 2 ports
Feb 23 06:36:01 localhost kernel: uhci_hcd 0000:00:01.2: irq 11, io port 0x0000c100
Feb 23 06:36:01 localhost kernel: usb usb1: New USB device found, idVendor=1d6b, idProduct=0001, bcdDevice= 5.14
Feb 23 06:36:01 localhost kernel: usb usb1: New USB device strings: Mfr=3, Product=2, SerialNumber=1
Feb 23 06:36:01 localhost kernel: usb usb1: Product: UHCI Host Controller
Feb 23 06:36:01 localhost kernel: usb usb1: Manufacturer: Linux 5.14.0-284.11.1.el9_2.x86_64 uhci_hcd
Feb 23 06:36:01 localhost kernel: usb usb1: SerialNumber: 0000:00:01.2
Feb 23 06:36:01 localhost kernel: hub 1-0:1.0: USB hub found
Feb 23 06:36:01 localhost kernel: hub 1-0:1.0: 2 ports detected
Feb 23 06:36:01 localhost kernel: usbcore: registered new interface driver usbserial_generic
Feb 23 06:36:01 localhost kernel: usbserial: USB Serial support registered for generic
Feb 23 06:36:01 localhost kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Feb 23 06:36:01 localhost kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Feb 23 06:36:01 localhost kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Feb 23 06:36:01 localhost kernel: mousedev: PS/2 mouse device common for all mice
Feb 23 06:36:01 localhost kernel: rtc_cmos 00:04: RTC can wake from S4
Feb 23 06:36:01 localhost kernel: rtc_cmos 00:04: registered as rtc0
Feb 23 06:36:01 localhost kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input1
Feb 23 06:36:01 localhost kernel: rtc_cmos 00:04: setting system clock to 2026-02-23T06:36:00 UTC (1771828560)
Feb 23 06:36:01 localhost kernel: rtc_cmos 00:04: alarms up to one day, y3k, 242 bytes nvram
Feb 23 06:36:01 localhost kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input4
Feb 23 06:36:01 localhost kernel: hid: raw HID events driver (C) Jiri Kosina
Feb 23 06:36:01 localhost kernel: usbcore: registered new interface driver usbhid
Feb 23 06:36:01 localhost kernel: usbhid: USB HID core driver
Feb 23 06:36:01 localhost kernel: drop_monitor: Initializing network drop monitor service
Feb 23 06:36:01 localhost kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input3
Feb 23 06:36:01 localhost kernel: Initializing XFRM netlink socket
Feb 23 06:36:01 localhost kernel: NET: Registered PF_INET6 protocol family
Feb 23 06:36:01 localhost kernel: Segment Routing with IPv6
Feb 23 06:36:01 localhost kernel: NET: Registered PF_PACKET protocol family
Feb 23 06:36:01 localhost kernel: mpls_gso: MPLS GSO support
Feb 23 06:36:01 localhost kernel: IPI shorthand broadcast: enabled
Feb 23 06:36:01 localhost kernel: AVX2 version of gcm_enc/dec engaged.
Feb 23 06:36:01 localhost kernel: AES CTR mode by8 optimization enabled
Feb 23 06:36:01 localhost kernel: sched_clock: Marking stable (728017336, 177516314)->(1032399249, -126865599)
Feb 23 06:36:01 localhost kernel: registered taskstats version 1
Feb 23 06:36:01 localhost kernel: Loading compiled-in X.509 certificates
Feb 23 06:36:01 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux kernel signing key: aaec4b640ef162b54684864066c7d4ffd428cd72'
Feb 23 06:36:01 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux Driver Update Program (key 3): bf57f3e87362bc7229d9f465321773dfd1f77a80'
Feb 23 06:36:01 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux kpatch signing key: 4d38fd864ebe18c5f0b72e3852e2014c3a676fc8'
Feb 23 06:36:01 localhost kernel: zswap: loaded using pool lzo/zbud
Feb 23 06:36:01 localhost kernel: page_owner is disabled
Feb 23 06:36:01 localhost kernel: Key type big_key registered
Feb 23 06:36:01 localhost kernel: usb 1-1: new full-speed USB device number 2 using uhci_hcd
Feb 23 06:36:01 localhost kernel: Freeing initrd memory: 74232K
Feb 23 06:36:01 localhost kernel: Key type encrypted registered
Feb 23 06:36:01 localhost kernel: ima: No TPM chip found, activating TPM-bypass!
Feb 23 06:36:01 localhost kernel: Loading compiled-in module X.509 certificates
Feb 23 06:36:01 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux kernel signing key: aaec4b640ef162b54684864066c7d4ffd428cd72'
Feb 23 06:36:01 localhost kernel: ima: Allocated hash algorithm: sha256
Feb 23 06:36:01 localhost kernel: ima: No architecture policies found
Feb 23 06:36:01 localhost kernel: evm: Initialising EVM extended attributes:
Feb 23 06:36:01 localhost kernel: evm: security.selinux
Feb 23 06:36:01 localhost kernel: evm: security.SMACK64 (disabled)
Feb 23 06:36:01 localhost kernel: evm: security.SMACK64EXEC (disabled)
Feb 23 06:36:01 localhost kernel: evm: security.SMACK64TRANSMUTE (disabled)
Feb 23 06:36:01 localhost kernel: evm: security.SMACK64MMAP (disabled)
Feb 23 06:36:01 localhost kernel: evm: security.apparmor (disabled)
Feb 23 06:36:01 localhost kernel: evm: security.ima
Feb 23 06:36:01 localhost kernel: evm: security.capability
Feb 23 06:36:01 localhost kernel: evm: HMAC attrs: 0x1
Feb 23 06:36:01 localhost kernel: usb 1-1: New USB device found, idVendor=0627, idProduct=0001, bcdDevice= 0.00
Feb 23 06:36:01 localhost kernel: usb 1-1: New USB device strings: Mfr=1, Product=3, SerialNumber=10
Feb 23 06:36:01 localhost kernel: usb 1-1: Product: QEMU USB Tablet
Feb 23 06:36:01 localhost kernel: usb 1-1: Manufacturer: QEMU
Feb 23 06:36:01 localhost kernel: usb 1-1: SerialNumber: 28754-0000:00:01.2-1
Feb 23 06:36:01 localhost kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:01.2/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input5
Feb 23 06:36:01 localhost kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:00:01.2-1/input0
Feb 23 06:36:01 localhost kernel: Freeing unused decrypted memory: 2036K
Feb 23 06:36:01 localhost kernel: Freeing unused kernel image (initmem) memory: 2792K
Feb 23 06:36:01 localhost kernel: Write protecting the kernel read-only data: 26624k
Feb 23 06:36:01 localhost kernel: Freeing unused kernel image (text/rodata gap) memory: 2040K
Feb 23 06:36:01 localhost kernel: Freeing unused kernel image (rodata/data gap) memory: 60K
Feb 23 06:36:01 localhost kernel: x86/mm: Checked W+X mappings: passed, no W+X pages found.
Feb 23 06:36:01 localhost kernel: Run /init as init process
Feb 23 06:36:01 localhost kernel:   with arguments:
Feb 23 06:36:01 localhost kernel:     /init
Feb 23 06:36:01 localhost kernel:   with environment:
Feb 23 06:36:01 localhost kernel:     HOME=/
Feb 23 06:36:01 localhost kernel:     TERM=linux
Feb 23 06:36:01 localhost kernel:     BOOT_IMAGE=(hd0,gpt3)/vmlinuz-5.14.0-284.11.1.el9_2.x86_64
Feb 23 06:36:01 localhost systemd[1]: systemd 252-13.el9_2 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Feb 23 06:36:01 localhost systemd[1]: Detected virtualization kvm.
Feb 23 06:36:01 localhost systemd[1]: Detected architecture x86-64.
Feb 23 06:36:01 localhost systemd[1]: Running in initrd.
Feb 23 06:36:01 localhost systemd[1]: No hostname configured, using default hostname.
Feb 23 06:36:01 localhost systemd[1]: Hostname set to <localhost>.
Feb 23 06:36:01 localhost systemd[1]: Initializing machine ID from VM UUID.
Feb 23 06:36:01 localhost systemd[1]: Queued start job for default target Initrd Default Target.
Feb 23 06:36:01 localhost systemd[1]: Started Dispatch Password Requests to Console Directory Watch.
Feb 23 06:36:01 localhost systemd[1]: Reached target Local Encrypted Volumes.
Feb 23 06:36:01 localhost systemd[1]: Reached target Initrd /usr File System.
Feb 23 06:36:01 localhost systemd[1]: Reached target Local File Systems.
Feb 23 06:36:01 localhost systemd[1]: Reached target Path Units.
Feb 23 06:36:01 localhost systemd[1]: Reached target Slice Units.
Feb 23 06:36:01 localhost systemd[1]: Reached target Swaps.
Feb 23 06:36:01 localhost systemd[1]: Reached target Timer Units.
Feb 23 06:36:01 localhost systemd[1]: Listening on D-Bus System Message Bus Socket.
Feb 23 06:36:01 localhost systemd[1]: Listening on Journal Socket (/dev/log).
Feb 23 06:36:01 localhost systemd[1]: Listening on Journal Socket.
Feb 23 06:36:01 localhost systemd[1]: Listening on udev Control Socket.
Feb 23 06:36:01 localhost systemd[1]: Listening on udev Kernel Socket.
Feb 23 06:36:01 localhost systemd[1]: Reached target Socket Units.
Feb 23 06:36:01 localhost systemd[1]: Starting Create List of Static Device Nodes...
Feb 23 06:36:01 localhost systemd[1]: Starting Journal Service...
Feb 23 06:36:01 localhost systemd[1]: Starting Load Kernel Modules...
Feb 23 06:36:01 localhost systemd[1]: Starting Create System Users...
Feb 23 06:36:01 localhost systemd[1]: Starting Setup Virtual Console...
Feb 23 06:36:01 localhost systemd[1]: Finished Create List of Static Device Nodes.
Feb 23 06:36:01 localhost systemd-journald[283]: Journal started
Feb 23 06:36:01 localhost systemd-journald[283]: Runtime Journal (/run/log/journal/bdcaa433cfc7450a99abf0985ab59447) is 8.0M, max 314.7M, 306.7M free.
Feb 23 06:36:01 localhost systemd-modules-load[284]: Module 'msr' is built in
Feb 23 06:36:01 localhost systemd[1]: Started Journal Service.
Feb 23 06:36:01 localhost systemd[1]: Finished Load Kernel Modules.
Feb 23 06:36:01 localhost systemd[1]: Finished Setup Virtual Console.
Feb 23 06:36:01 localhost systemd[1]: dracut ask for additional cmdline parameters was skipped because no trigger condition checks were met.
Feb 23 06:36:01 localhost systemd[1]: Starting dracut cmdline hook...
Feb 23 06:36:01 localhost systemd[1]: Starting Apply Kernel Variables...
Feb 23 06:36:01 localhost systemd-sysusers[285]: Creating group 'sgx' with GID 997.
Feb 23 06:36:01 localhost systemd-sysusers[285]: Creating group 'users' with GID 100.
Feb 23 06:36:01 localhost systemd-sysusers[285]: Creating group 'dbus' with GID 81.
Feb 23 06:36:01 localhost systemd-sysusers[285]: Creating user 'dbus' (System Message Bus) with UID 81 and GID 81.
Feb 23 06:36:01 localhost systemd[1]: Finished Create System Users.
Feb 23 06:36:01 localhost systemd[1]: Finished Apply Kernel Variables.
Feb 23 06:36:01 localhost systemd[1]: Starting Create Static Device Nodes in /dev...
Feb 23 06:36:01 localhost systemd[1]: Starting Create Volatile Files and Directories...
Feb 23 06:36:01 localhost systemd[1]: Finished Create Static Device Nodes in /dev.
Feb 23 06:36:01 localhost dracut-cmdline[288]: dracut-9.2 (Plow) dracut-057-21.git20230214.el9
Feb 23 06:36:01 localhost dracut-cmdline[288]: Using kernel command line parameters:    BOOT_IMAGE=(hd0,gpt3)/vmlinuz-5.14.0-284.11.1.el9_2.x86_64 root=UUID=a3dd82de-ffc6-4652-88b9-80e003b8f20a console=tty0 console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-4G:192M,4G-64G:256M,64G-:512M
Feb 23 06:36:01 localhost systemd[1]: Finished Create Volatile Files and Directories.
Feb 23 06:36:01 localhost systemd[1]: Finished dracut cmdline hook.
Feb 23 06:36:01 localhost systemd[1]: Starting dracut pre-udev hook...
Feb 23 06:36:01 localhost kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Feb 23 06:36:01 localhost kernel: device-mapper: uevent: version 1.0.3
Feb 23 06:36:01 localhost kernel: device-mapper: ioctl: 4.47.0-ioctl (2022-07-28) initialised: dm-devel@redhat.com
Feb 23 06:36:01 localhost kernel: RPC: Registered named UNIX socket transport module.
Feb 23 06:36:01 localhost kernel: RPC: Registered udp transport module.
Feb 23 06:36:01 localhost kernel: RPC: Registered tcp transport module.
Feb 23 06:36:01 localhost kernel: RPC: Registered tcp NFSv4.1 backchannel transport module.
Feb 23 06:36:01 localhost rpc.statd[406]: Version 2.5.4 starting
Feb 23 06:36:01 localhost rpc.statd[406]: Initializing NSM state
Feb 23 06:36:01 localhost rpc.idmapd[411]: Setting log level to 0
Feb 23 06:36:01 localhost systemd[1]: Finished dracut pre-udev hook.
Feb 23 06:36:01 localhost systemd[1]: Starting Rule-based Manager for Device Events and Files...
Feb 23 06:36:01 localhost systemd-udevd[424]: Using default interface naming scheme 'rhel-9.0'.
Feb 23 06:36:01 localhost systemd[1]: Started Rule-based Manager for Device Events and Files.
Feb 23 06:36:01 localhost systemd[1]: Starting dracut pre-trigger hook...
Feb 23 06:36:01 localhost systemd[1]: Finished dracut pre-trigger hook.
Feb 23 06:36:01 localhost systemd[1]: Starting Coldplug All udev Devices...
Feb 23 06:36:01 localhost systemd[1]: Finished Coldplug All udev Devices.
Feb 23 06:36:01 localhost systemd[1]: Reached target System Initialization.
Feb 23 06:36:01 localhost systemd[1]: Reached target Basic System.
Feb 23 06:36:01 localhost systemd[1]: nm-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Feb 23 06:36:01 localhost systemd[1]: Reached target Network.
Feb 23 06:36:01 localhost systemd[1]: nm-wait-online-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Feb 23 06:36:01 localhost systemd[1]: Starting dracut initqueue hook...
Feb 23 06:36:01 localhost kernel: virtio_blk virtio2: [vda] 838860800 512-byte logical blocks (429 GB/400 GiB)
Feb 23 06:36:01 localhost kernel: GPT:Primary header thinks Alt. header is not at the end of the disk.
Feb 23 06:36:01 localhost kernel: GPT:20971519 != 838860799
Feb 23 06:36:01 localhost kernel: GPT:Alternate GPT header not at the end of the disk.
Feb 23 06:36:01 localhost kernel: GPT:20971519 != 838860799
Feb 23 06:36:01 localhost kernel: GPT: Use GNU Parted to correct GPT errors.
Feb 23 06:36:01 localhost kernel:  vda: vda1 vda2 vda3 vda4
Feb 23 06:36:01 localhost kernel: libata version 3.00 loaded.
Feb 23 06:36:01 localhost kernel: ata_piix 0000:00:01.1: version 2.13
Feb 23 06:36:01 localhost kernel: scsi host0: ata_piix
Feb 23 06:36:01 localhost kernel: scsi host1: ata_piix
Feb 23 06:36:01 localhost kernel: ata1: PATA max MWDMA2 cmd 0x1f0 ctl 0x3f6 bmdma 0xc140 irq 14
Feb 23 06:36:01 localhost kernel: ata2: PATA max MWDMA2 cmd 0x170 ctl 0x376 bmdma 0xc148 irq 15
Feb 23 06:36:02 localhost systemd-udevd[426]: Network interface NamePolicy= disabled on kernel command line.
Feb 23 06:36:02 localhost systemd[1]: Found device /dev/disk/by-uuid/a3dd82de-ffc6-4652-88b9-80e003b8f20a.
Feb 23 06:36:02 localhost systemd[1]: Reached target Initrd Root Device.
Feb 23 06:36:02 localhost kernel: ata1: found unknown device (class 0)
Feb 23 06:36:02 localhost kernel: ata1.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100
Feb 23 06:36:02 localhost kernel: scsi 0:0:0:0: CD-ROM            QEMU     QEMU DVD-ROM     2.5+ PQ: 0 ANSI: 5
Feb 23 06:36:02 localhost kernel: scsi 0:0:0:0: Attached scsi generic sg0 type 5
Feb 23 06:36:02 localhost kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray
Feb 23 06:36:02 localhost kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Feb 23 06:36:02 localhost kernel: sr 0:0:0:0: Attached scsi CD-ROM sr0
Feb 23 06:36:02 localhost systemd[1]: Finished dracut initqueue hook.
Feb 23 06:36:02 localhost systemd[1]: Reached target Preparation for Remote File Systems.
Feb 23 06:36:02 localhost systemd[1]: Reached target Remote Encrypted Volumes.
Feb 23 06:36:02 localhost systemd[1]: Reached target Remote File Systems.
Feb 23 06:36:02 localhost systemd[1]: Starting dracut pre-mount hook...
Feb 23 06:36:02 localhost systemd[1]: Finished dracut pre-mount hook.
Feb 23 06:36:02 localhost systemd[1]: Starting File System Check on /dev/disk/by-uuid/a3dd82de-ffc6-4652-88b9-80e003b8f20a...
Feb 23 06:36:02 localhost systemd-fsck[512]: /usr/sbin/fsck.xfs: XFS file system.
Feb 23 06:36:02 localhost systemd[1]: Finished File System Check on /dev/disk/by-uuid/a3dd82de-ffc6-4652-88b9-80e003b8f20a.
Feb 23 06:36:02 localhost systemd[1]: Mounting /sysroot...
Feb 23 06:36:02 localhost kernel: SGI XFS with ACLs, security attributes, scrub, quota, no debug enabled
Feb 23 06:36:02 localhost kernel: XFS (vda4): Mounting V5 Filesystem
Feb 23 06:36:02 localhost kernel: XFS (vda4): Ending clean mount
Feb 23 06:36:02 localhost systemd[1]: Mounted /sysroot.
Feb 23 06:36:02 localhost systemd[1]: Reached target Initrd Root File System.
Feb 23 06:36:02 localhost systemd[1]: Starting Mountpoints Configured in the Real Root...
Feb 23 06:36:02 localhost systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Feb 23 06:36:02 localhost systemd[1]: Finished Mountpoints Configured in the Real Root.
Feb 23 06:36:02 localhost systemd[1]: Reached target Initrd File Systems.
Feb 23 06:36:02 localhost systemd[1]: Reached target Initrd Default Target.
Feb 23 06:36:02 localhost systemd[1]: Starting dracut mount hook...
Feb 23 06:36:02 localhost systemd[1]: Finished dracut mount hook.
Feb 23 06:36:02 localhost systemd[1]: Starting dracut pre-pivot and cleanup hook...
Feb 23 06:36:02 localhost rpc.idmapd[411]: exiting on signal 15
Feb 23 06:36:02 localhost systemd[1]: var-lib-nfs-rpc_pipefs.mount: Deactivated successfully.
Feb 23 06:36:02 localhost systemd[1]: Finished dracut pre-pivot and cleanup hook.
Feb 23 06:36:02 localhost systemd[1]: Starting Cleaning Up and Shutting Down Daemons...
Feb 23 06:36:02 localhost systemd[1]: Stopped target Network.
Feb 23 06:36:02 localhost systemd[1]: Stopped target Remote Encrypted Volumes.
Feb 23 06:36:02 localhost systemd[1]: Stopped target Timer Units.
Feb 23 06:36:02 localhost systemd[1]: dbus.socket: Deactivated successfully.
Feb 23 06:36:02 localhost systemd[1]: Closed D-Bus System Message Bus Socket.
Feb 23 06:36:02 localhost systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Feb 23 06:36:02 localhost systemd[1]: Stopped dracut pre-pivot and cleanup hook.
Feb 23 06:36:02 localhost systemd[1]: Stopped target Initrd Default Target.
Feb 23 06:36:02 localhost systemd[1]: Stopped target Basic System.
Feb 23 06:36:02 localhost systemd[1]: Stopped target Initrd Root Device.
Feb 23 06:36:02 localhost systemd[1]: Stopped target Initrd /usr File System.
Feb 23 06:36:02 localhost systemd[1]: Stopped target Path Units.
Feb 23 06:36:02 localhost systemd[1]: Stopped target Remote File Systems.
Feb 23 06:36:02 localhost systemd[1]: Stopped target Preparation for Remote File Systems.
Feb 23 06:36:02 localhost systemd[1]: Stopped target Slice Units.
Feb 23 06:36:02 localhost systemd[1]: Stopped target Socket Units.
Feb 23 06:36:02 localhost systemd[1]: Stopped target System Initialization.
Feb 23 06:36:02 localhost systemd[1]: Stopped target Local File Systems.
Feb 23 06:36:02 localhost systemd[1]: Stopped target Swaps.
Feb 23 06:36:02 localhost systemd[1]: dracut-mount.service: Deactivated successfully.
Feb 23 06:36:02 localhost systemd[1]: Stopped dracut mount hook.
Feb 23 06:36:02 localhost systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Feb 23 06:36:02 localhost systemd[1]: Stopped dracut pre-mount hook.
Feb 23 06:36:02 localhost systemd[1]: Stopped target Local Encrypted Volumes.
Feb 23 06:36:02 localhost systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Feb 23 06:36:02 localhost systemd[1]: Stopped Dispatch Password Requests to Console Directory Watch.
Feb 23 06:36:02 localhost systemd[1]: dracut-initqueue.service: Deactivated successfully.
Feb 23 06:36:02 localhost systemd[1]: Stopped dracut initqueue hook.
Feb 23 06:36:02 localhost systemd[1]: systemd-sysctl.service: Deactivated successfully.
Feb 23 06:36:02 localhost systemd[1]: Stopped Apply Kernel Variables.
Feb 23 06:36:02 localhost systemd[1]: systemd-modules-load.service: Deactivated successfully.
Feb 23 06:36:02 localhost systemd[1]: Stopped Load Kernel Modules.
Feb 23 06:36:02 localhost systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Feb 23 06:36:02 localhost systemd[1]: Stopped Create Volatile Files and Directories.
Feb 23 06:36:02 localhost systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Feb 23 06:36:02 localhost systemd[1]: Stopped Coldplug All udev Devices.
Feb 23 06:36:02 localhost systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Feb 23 06:36:02 localhost systemd[1]: Stopped dracut pre-trigger hook.
Feb 23 06:36:02 localhost systemd[1]: Stopping Rule-based Manager for Device Events and Files...
Feb 23 06:36:02 localhost systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Feb 23 06:36:02 localhost systemd[1]: Stopped Setup Virtual Console.
Feb 23 06:36:02 localhost systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Feb 23 06:36:02 localhost systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Feb 23 06:36:02 localhost systemd[1]: systemd-udevd.service: Deactivated successfully.
Feb 23 06:36:02 localhost systemd[1]: Stopped Rule-based Manager for Device Events and Files.
Feb 23 06:36:02 localhost systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Feb 23 06:36:02 localhost systemd[1]: Closed udev Control Socket.
Feb 23 06:36:02 localhost systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Feb 23 06:36:02 localhost systemd[1]: Closed udev Kernel Socket.
Feb 23 06:36:02 localhost systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Feb 23 06:36:02 localhost systemd[1]: Stopped dracut pre-udev hook.
Feb 23 06:36:02 localhost systemd[1]: dracut-cmdline.service: Deactivated successfully.
Feb 23 06:36:02 localhost systemd[1]: Stopped dracut cmdline hook.
Feb 23 06:36:02 localhost systemd[1]: Starting Cleanup udev Database...
Feb 23 06:36:03 localhost systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Feb 23 06:36:03 localhost systemd[1]: Stopped Create Static Device Nodes in /dev.
Feb 23 06:36:03 localhost systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Feb 23 06:36:03 localhost systemd[1]: Stopped Create List of Static Device Nodes.
Feb 23 06:36:03 localhost systemd[1]: systemd-sysusers.service: Deactivated successfully.
Feb 23 06:36:03 localhost systemd[1]: Stopped Create System Users.
Feb 23 06:36:03 localhost systemd[1]: initrd-cleanup.service: Deactivated successfully.
Feb 23 06:36:03 localhost systemd[1]: Finished Cleaning Up and Shutting Down Daemons.
Feb 23 06:36:03 localhost systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Feb 23 06:36:03 localhost systemd[1]: Finished Cleanup udev Database.
Feb 23 06:36:03 localhost systemd[1]: Reached target Switch Root.
Feb 23 06:36:03 localhost systemd[1]: Starting Switch Root...
Feb 23 06:36:03 localhost systemd[1]: Switching root.
Feb 23 06:36:03 localhost systemd-journald[283]: Journal stopped
Feb 23 06:36:03 localhost systemd-journald[283]: Received SIGTERM from PID 1 (systemd).
Feb 23 06:36:03 localhost kernel: audit: type=1404 audit(1771828563.167:2): enforcing=1 old_enforcing=0 auid=4294967295 ses=4294967295 enabled=1 old-enabled=1 lsm=selinux res=1
Feb 23 06:36:03 localhost kernel: SELinux:  policy capability network_peer_controls=1
Feb 23 06:36:03 localhost kernel: SELinux:  policy capability open_perms=1
Feb 23 06:36:03 localhost kernel: SELinux:  policy capability extended_socket_class=1
Feb 23 06:36:03 localhost kernel: SELinux:  policy capability always_check_network=0
Feb 23 06:36:03 localhost kernel: SELinux:  policy capability cgroup_seclabel=1
Feb 23 06:36:03 localhost kernel: SELinux:  policy capability nnp_nosuid_transition=1
Feb 23 06:36:03 localhost kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Feb 23 06:36:03 localhost kernel: audit: type=1403 audit(1771828563.299:3): auid=4294967295 ses=4294967295 lsm=selinux res=1
Feb 23 06:36:03 localhost systemd[1]: Successfully loaded SELinux policy in 137.357ms.
Feb 23 06:36:03 localhost systemd[1]: Relabelled /dev, /dev/shm, /run, /sys/fs/cgroup in 33.289ms.
Feb 23 06:36:03 localhost systemd[1]: systemd 252-13.el9_2 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Feb 23 06:36:03 localhost systemd[1]: Detected virtualization kvm.
Feb 23 06:36:03 localhost systemd[1]: Detected architecture x86-64.
Feb 23 06:36:03 localhost systemd-rc-local-generator[582]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 06:36:03 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 06:36:03 localhost systemd[1]: initrd-switch-root.service: Deactivated successfully.
Feb 23 06:36:03 localhost systemd[1]: Stopped Switch Root.
Feb 23 06:36:03 localhost systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Feb 23 06:36:03 localhost systemd[1]: Created slice Slice /system/getty.
Feb 23 06:36:03 localhost systemd[1]: Created slice Slice /system/modprobe.
Feb 23 06:36:03 localhost systemd[1]: Created slice Slice /system/serial-getty.
Feb 23 06:36:03 localhost systemd[1]: Created slice Slice /system/sshd-keygen.
Feb 23 06:36:03 localhost systemd[1]: Created slice Slice /system/systemd-fsck.
Feb 23 06:36:03 localhost systemd[1]: Created slice User and Session Slice.
Feb 23 06:36:03 localhost systemd[1]: Started Dispatch Password Requests to Console Directory Watch.
Feb 23 06:36:03 localhost systemd[1]: Started Forward Password Requests to Wall Directory Watch.
Feb 23 06:36:03 localhost systemd[1]: Set up automount Arbitrary Executable File Formats File System Automount Point.
Feb 23 06:36:03 localhost systemd[1]: Reached target Local Encrypted Volumes.
Feb 23 06:36:03 localhost systemd[1]: Stopped target Switch Root.
Feb 23 06:36:03 localhost systemd[1]: Stopped target Initrd File Systems.
Feb 23 06:36:03 localhost systemd[1]: Stopped target Initrd Root File System.
Feb 23 06:36:03 localhost systemd[1]: Reached target Local Integrity Protected Volumes.
Feb 23 06:36:03 localhost systemd[1]: Reached target Path Units.
Feb 23 06:36:03 localhost systemd[1]: Reached target rpc_pipefs.target.
Feb 23 06:36:03 localhost systemd[1]: Reached target Slice Units.
Feb 23 06:36:03 localhost systemd[1]: Reached target Swaps.
Feb 23 06:36:03 localhost systemd[1]: Reached target Local Verity Protected Volumes.
Feb 23 06:36:03 localhost systemd[1]: Listening on RPCbind Server Activation Socket.
Feb 23 06:36:03 localhost systemd[1]: Reached target RPC Port Mapper.
Feb 23 06:36:03 localhost systemd[1]: Listening on Process Core Dump Socket.
Feb 23 06:36:03 localhost systemd[1]: Listening on initctl Compatibility Named Pipe.
Feb 23 06:36:03 localhost systemd[1]: Listening on udev Control Socket.
Feb 23 06:36:03 localhost systemd[1]: Listening on udev Kernel Socket.
Feb 23 06:36:03 localhost systemd[1]: Mounting Huge Pages File System...
Feb 23 06:36:03 localhost systemd[1]: Mounting POSIX Message Queue File System...
Feb 23 06:36:03 localhost systemd[1]: Mounting Kernel Debug File System...
Feb 23 06:36:03 localhost systemd[1]: Mounting Kernel Trace File System...
Feb 23 06:36:03 localhost systemd[1]: Kernel Module supporting RPCSEC_GSS was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Feb 23 06:36:03 localhost systemd[1]: Starting Create List of Static Device Nodes...
Feb 23 06:36:03 localhost systemd[1]: Starting Load Kernel Module configfs...
Feb 23 06:36:03 localhost systemd[1]: Starting Load Kernel Module drm...
Feb 23 06:36:03 localhost systemd[1]: Starting Load Kernel Module fuse...
Feb 23 06:36:03 localhost systemd[1]: Starting Read and set NIS domainname from /etc/sysconfig/network...
Feb 23 06:36:03 localhost systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Feb 23 06:36:03 localhost systemd[1]: Stopped File System Check on Root Device.
Feb 23 06:36:03 localhost systemd[1]: Stopped Journal Service.
Feb 23 06:36:03 localhost kernel: fuse: init (API version 7.36)
Feb 23 06:36:03 localhost systemd[1]: Starting Journal Service...
Feb 23 06:36:03 localhost systemd[1]: Starting Load Kernel Modules...
Feb 23 06:36:03 localhost systemd[1]: Starting Generate network units from Kernel command line...
Feb 23 06:36:03 localhost systemd[1]: Starting Remount Root and Kernel File Systems...
Feb 23 06:36:03 localhost systemd[1]: Repartition Root Disk was skipped because no trigger condition checks were met.
Feb 23 06:36:03 localhost systemd-journald[618]: Journal started
Feb 23 06:36:03 localhost systemd-journald[618]: Runtime Journal (/run/log/journal/c0212a8b024a111cfc61293864f36c87) is 8.0M, max 314.7M, 306.7M free.
Feb 23 06:36:03 localhost systemd[1]: Queued start job for default target Multi-User System.
Feb 23 06:36:03 localhost systemd[1]: systemd-journald.service: Deactivated successfully.
Feb 23 06:36:03 localhost systemd-modules-load[619]: Module 'msr' is built in
Feb 23 06:36:04 localhost systemd[1]: Starting Coldplug All udev Devices...
Feb 23 06:36:04 localhost systemd[1]: Started Journal Service.
Feb 23 06:36:04 localhost systemd[1]: Mounted Huge Pages File System.
Feb 23 06:36:04 localhost systemd[1]: Mounted POSIX Message Queue File System.
Feb 23 06:36:04 localhost systemd[1]: Mounted Kernel Debug File System.
Feb 23 06:36:04 localhost systemd[1]: Mounted Kernel Trace File System.
Feb 23 06:36:04 localhost systemd[1]: Finished Create List of Static Device Nodes.
Feb 23 06:36:04 localhost kernel: xfs filesystem being remounted at / supports timestamps until 2038 (0x7fffffff)
Feb 23 06:36:04 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Feb 23 06:36:04 localhost systemd[1]: Finished Load Kernel Module configfs.
Feb 23 06:36:04 localhost systemd[1]: modprobe@fuse.service: Deactivated successfully.
Feb 23 06:36:04 localhost systemd[1]: Finished Load Kernel Module fuse.
Feb 23 06:36:04 localhost systemd[1]: Finished Read and set NIS domainname from /etc/sysconfig/network.
Feb 23 06:36:04 localhost kernel: ACPI: bus type drm_connector registered
Feb 23 06:36:04 localhost systemd[1]: modprobe@drm.service: Deactivated successfully.
Feb 23 06:36:04 localhost systemd[1]: Finished Load Kernel Module drm.
Feb 23 06:36:04 localhost systemd[1]: Finished Load Kernel Modules.
Feb 23 06:36:04 localhost systemd[1]: Finished Generate network units from Kernel command line.
Feb 23 06:36:04 localhost systemd[1]: Finished Remount Root and Kernel File Systems.
Feb 23 06:36:04 localhost systemd[1]: Mounting FUSE Control File System...
Feb 23 06:36:04 localhost systemd[1]: Mounting Kernel Configuration File System...
Feb 23 06:36:04 localhost systemd[1]: First Boot Wizard was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Feb 23 06:36:04 localhost systemd[1]: Starting Rebuild Hardware Database...
Feb 23 06:36:04 localhost systemd[1]: Starting Flush Journal to Persistent Storage...
Feb 23 06:36:04 localhost systemd[1]: Starting Load/Save Random Seed...
Feb 23 06:36:04 localhost systemd[1]: Starting Apply Kernel Variables...
Feb 23 06:36:04 localhost systemd[1]: Starting Create System Users...
Feb 23 06:36:04 localhost systemd-journald[618]: Runtime Journal (/run/log/journal/c0212a8b024a111cfc61293864f36c87) is 8.0M, max 314.7M, 306.7M free.
Feb 23 06:36:04 localhost systemd-journald[618]: Received client request to flush runtime journal.
Feb 23 06:36:04 localhost systemd[1]: Mounted FUSE Control File System.
Feb 23 06:36:04 localhost systemd[1]: Mounted Kernel Configuration File System.
Feb 23 06:36:04 localhost systemd[1]: Finished Flush Journal to Persistent Storage.
Feb 23 06:36:04 localhost systemd[1]: Finished Apply Kernel Variables.
Feb 23 06:36:04 localhost systemd[1]: Finished Load/Save Random Seed.
Feb 23 06:36:04 localhost systemd[1]: First Boot Complete was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Feb 23 06:36:04 localhost systemd[1]: Finished Coldplug All udev Devices.
Feb 23 06:36:04 localhost systemd-sysusers[631]: Creating group 'sgx' with GID 989.
Feb 23 06:36:04 localhost systemd-sysusers[631]: Creating group 'systemd-oom' with GID 988.
Feb 23 06:36:04 localhost systemd-sysusers[631]: Creating user 'systemd-oom' (systemd Userspace OOM Killer) with UID 988 and GID 988.
Feb 23 06:36:04 localhost systemd[1]: Finished Create System Users.
Feb 23 06:36:04 localhost systemd[1]: Starting Create Static Device Nodes in /dev...
Feb 23 06:36:04 localhost systemd[1]: Finished Create Static Device Nodes in /dev.
Feb 23 06:36:04 localhost systemd[1]: Reached target Preparation for Local File Systems.
Feb 23 06:36:04 localhost systemd[1]: Set up automount EFI System Partition Automount.
Feb 23 06:36:04 localhost systemd[1]: Finished Rebuild Hardware Database.
Feb 23 06:36:04 localhost systemd[1]: Starting Rule-based Manager for Device Events and Files...
Feb 23 06:36:04 localhost systemd-udevd[635]: Using default interface naming scheme 'rhel-9.0'.
Feb 23 06:36:04 localhost systemd[1]: Started Rule-based Manager for Device Events and Files.
Feb 23 06:36:04 localhost systemd[1]: Starting Load Kernel Module configfs...
Feb 23 06:36:04 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Feb 23 06:36:04 localhost systemd[1]: Finished Load Kernel Module configfs.
Feb 23 06:36:04 localhost systemd[1]: Condition check resulted in /dev/ttyS0 being skipped.
Feb 23 06:36:04 localhost systemd-udevd[639]: Network interface NamePolicy= disabled on kernel command line.
Feb 23 06:36:04 localhost systemd[1]: Condition check resulted in /dev/disk/by-uuid/7B77-95E7 being skipped.
Feb 23 06:36:04 localhost systemd[1]: Starting File System Check on /dev/disk/by-uuid/7B77-95E7...
Feb 23 06:36:04 localhost systemd[1]: Condition check resulted in /dev/disk/by-uuid/b141154b-6a70-437a-a97f-d160c9ba37eb being skipped.
Feb 23 06:36:04 localhost kernel: input: PC Speaker as /devices/platform/pcspkr/input/input6
Feb 23 06:36:04 localhost kernel: piix4_smbus 0000:00:01.3: SMBus Host Controller at 0x700, revision 0
Feb 23 06:36:04 localhost systemd-fsck[681]: fsck.fat 4.2 (2021-01-31)
Feb 23 06:36:04 localhost systemd-fsck[681]: /dev/vda2: 12 files, 1782/51145 clusters
Feb 23 06:36:04 localhost systemd[1]: Finished File System Check on /dev/disk/by-uuid/7B77-95E7.
Feb 23 06:36:04 localhost kernel: [drm] pci: virtio-vga detected at 0000:00:02.0
Feb 23 06:36:04 localhost kernel: virtio-pci 0000:00:02.0: vgaarb: deactivate vga console
Feb 23 06:36:04 localhost kernel: Console: switching to colour dummy device 80x25
Feb 23 06:36:04 localhost kernel: [drm] features: -virgl +edid -resource_blob -host_visible
Feb 23 06:36:04 localhost kernel: [drm] features: -context_init
Feb 23 06:36:04 localhost kernel: [drm] number of scanouts: 1
Feb 23 06:36:04 localhost kernel: [drm] number of cap sets: 0
Feb 23 06:36:04 localhost kernel: [drm] Initialized virtio_gpu 0.1.0 0 for virtio0 on minor 0
Feb 23 06:36:04 localhost kernel: virtio_gpu virtio0: [drm] drm_plane_enable_fb_damage_clips() not called
Feb 23 06:36:04 localhost kernel: Console: switching to colour frame buffer device 128x48
Feb 23 06:36:04 localhost kernel: virtio_gpu virtio0: [drm] fb0: virtio_gpudrmfb frame buffer device
Feb 23 06:36:04 localhost kernel: SVM: TSC scaling supported
Feb 23 06:36:04 localhost kernel: kvm: Nested Virtualization enabled
Feb 23 06:36:04 localhost kernel: SVM: kvm: Nested Paging enabled
Feb 23 06:36:04 localhost kernel: SVM: LBR virtualization supported
Feb 23 06:36:05 localhost systemd[1]: Mounting /boot...
Feb 23 06:36:05 localhost kernel: XFS (vda3): Mounting V5 Filesystem
Feb 23 06:36:05 localhost kernel: XFS (vda3): Ending clean mount
Feb 23 06:36:05 localhost kernel: xfs filesystem being mounted at /boot supports timestamps until 2038 (0x7fffffff)
Feb 23 06:36:05 localhost systemd[1]: Mounted /boot.
Feb 23 06:36:05 localhost systemd[1]: Mounting /boot/efi...
Feb 23 06:36:05 localhost systemd[1]: Mounted /boot/efi.
Feb 23 06:36:05 localhost systemd[1]: Reached target Local File Systems.
Feb 23 06:36:05 localhost systemd[1]: Starting Rebuild Dynamic Linker Cache...
Feb 23 06:36:05 localhost systemd[1]: Mark the need to relabel after reboot was skipped because of an unmet condition check (ConditionSecurity=!selinux).
Feb 23 06:36:05 localhost systemd[1]: Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Feb 23 06:36:05 localhost systemd[1]: Store a System Token in an EFI Variable was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/LoaderFeatures-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Feb 23 06:36:05 localhost systemd[1]: Starting Automatic Boot Loader Update...
Feb 23 06:36:05 localhost systemd[1]: Commit a transient machine-id on disk was skipped because of an unmet condition check (ConditionPathIsMountPoint=/etc/machine-id).
Feb 23 06:36:05 localhost systemd[1]: Starting Create Volatile Files and Directories...
Feb 23 06:36:05 localhost systemd[1]: efi.automount: Got automount request for /efi, triggered by 716 (bootctl)
Feb 23 06:36:05 localhost systemd[1]: Starting File System Check on /dev/vda2...
Feb 23 06:36:05 localhost systemd[1]: Finished File System Check on /dev/vda2.
Feb 23 06:36:05 localhost systemd[1]: Mounting EFI System Partition Automount...
Feb 23 06:36:05 localhost systemd[1]: Mounted EFI System Partition Automount.
Feb 23 06:36:05 localhost systemd[1]: Finished Automatic Boot Loader Update.
Feb 23 06:36:05 localhost systemd[1]: Finished Create Volatile Files and Directories.
Feb 23 06:36:05 localhost systemd[1]: Starting Security Auditing Service...
Feb 23 06:36:05 localhost systemd[1]: Starting RPC Bind...
Feb 23 06:36:05 localhost systemd[1]: Starting Rebuild Journal Catalog...
Feb 23 06:36:05 localhost auditd[725]: audit dispatcher initialized with q_depth=1200 and 1 active plugins
Feb 23 06:36:05 localhost auditd[725]: Init complete, auditd 3.0.7 listening for events (startup state enable)
Feb 23 06:36:05 localhost systemd[1]: Finished Rebuild Journal Catalog.
Feb 23 06:36:05 localhost systemd[1]: Started RPC Bind.
Feb 23 06:36:05 localhost augenrules[730]: /sbin/augenrules: No change
Feb 23 06:36:05 localhost augenrules[740]: No rules
Feb 23 06:36:05 localhost augenrules[740]: enabled 1
Feb 23 06:36:05 localhost augenrules[740]: failure 1
Feb 23 06:36:05 localhost augenrules[740]: pid 725
Feb 23 06:36:05 localhost augenrules[740]: rate_limit 0
Feb 23 06:36:05 localhost augenrules[740]: backlog_limit 8192
Feb 23 06:36:05 localhost augenrules[740]: lost 0
Feb 23 06:36:05 localhost augenrules[740]: backlog 0
Feb 23 06:36:05 localhost augenrules[740]: backlog_wait_time 60000
Feb 23 06:36:05 localhost augenrules[740]: backlog_wait_time_actual 0
Feb 23 06:36:05 localhost augenrules[740]: enabled 1
Feb 23 06:36:05 localhost augenrules[740]: failure 1
Feb 23 06:36:05 localhost augenrules[740]: pid 725
Feb 23 06:36:05 localhost augenrules[740]: rate_limit 0
Feb 23 06:36:05 localhost augenrules[740]: backlog_limit 8192
Feb 23 06:36:05 localhost augenrules[740]: lost 0
Feb 23 06:36:05 localhost augenrules[740]: backlog 3
Feb 23 06:36:05 localhost augenrules[740]: backlog_wait_time 60000
Feb 23 06:36:05 localhost augenrules[740]: backlog_wait_time_actual 0
Feb 23 06:36:05 localhost augenrules[740]: enabled 1
Feb 23 06:36:05 localhost augenrules[740]: failure 1
Feb 23 06:36:05 localhost augenrules[740]: pid 725
Feb 23 06:36:05 localhost augenrules[740]: rate_limit 0
Feb 23 06:36:05 localhost augenrules[740]: backlog_limit 8192
Feb 23 06:36:05 localhost augenrules[740]: lost 0
Feb 23 06:36:05 localhost augenrules[740]: backlog 0
Feb 23 06:36:05 localhost augenrules[740]: backlog_wait_time 60000
Feb 23 06:36:05 localhost augenrules[740]: backlog_wait_time_actual 0
Feb 23 06:36:05 localhost systemd[1]: Started Security Auditing Service.
Feb 23 06:36:05 localhost systemd[1]: Starting Record System Boot/Shutdown in UTMP...
Feb 23 06:36:05 localhost systemd[1]: Finished Record System Boot/Shutdown in UTMP.
Feb 23 06:36:05 localhost systemd[1]: Finished Rebuild Dynamic Linker Cache.
Feb 23 06:36:05 localhost systemd[1]: Starting Update is Completed...
Feb 23 06:36:05 localhost systemd[1]: Finished Update is Completed.
Feb 23 06:36:05 localhost systemd[1]: Reached target System Initialization.
Feb 23 06:36:05 localhost systemd[1]: Started dnf makecache --timer.
Feb 23 06:36:05 localhost systemd[1]: Started Daily rotation of log files.
Feb 23 06:36:05 localhost systemd[1]: Started Daily Cleanup of Temporary Directories.
Feb 23 06:36:05 localhost systemd[1]: Reached target Timer Units.
Feb 23 06:36:05 localhost systemd[1]: Listening on D-Bus System Message Bus Socket.
Feb 23 06:36:05 localhost systemd[1]: Listening on SSSD Kerberos Cache Manager responder socket.
Feb 23 06:36:05 localhost systemd[1]: Reached target Socket Units.
Feb 23 06:36:05 localhost systemd[1]: Starting Initial cloud-init job (pre-networking)...
Feb 23 06:36:05 localhost systemd[1]: Starting D-Bus System Message Bus...
Feb 23 06:36:05 localhost systemd[1]: TPM2 PCR Barrier (Initialization) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Feb 23 06:36:05 localhost systemd[1]: Started D-Bus System Message Bus.
Feb 23 06:36:05 localhost systemd[1]: Reached target Basic System.
Feb 23 06:36:05 localhost dbus-broker-lau[750]: Ready
Feb 23 06:36:05 localhost systemd[1]: Starting NTP client/server...
Feb 23 06:36:05 localhost systemd[1]: Starting Restore /run/initramfs on shutdown...
Feb 23 06:36:05 localhost systemd[1]: Started irqbalance daemon.
Feb 23 06:36:05 localhost systemd[1]: Load CPU microcode update was skipped because of an unmet condition check (ConditionPathExists=/sys/devices/system/cpu/microcode/reload).
Feb 23 06:36:05 localhost systemd[1]: Starting System Logging Service...
Feb 23 06:36:05 localhost systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Feb 23 06:36:05 localhost systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Feb 23 06:36:05 localhost systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Feb 23 06:36:05 localhost systemd[1]: Reached target sshd-keygen.target.
Feb 23 06:36:05 localhost systemd[1]: System Security Services Daemon was skipped because no trigger condition checks were met.
Feb 23 06:36:05 localhost systemd[1]: Reached target User and Group Name Lookups.
Feb 23 06:36:05 localhost systemd[1]: Starting User Login Management...
Feb 23 06:36:05 localhost systemd[1]: Finished Restore /run/initramfs on shutdown.
Feb 23 06:36:05 localhost rsyslogd[758]: [origin software="rsyslogd" swVersion="8.2102.0-111.el9" x-pid="758" x-info="https://www.rsyslog.com"] start
Feb 23 06:36:05 localhost rsyslogd[758]: imjournal: No statefile exists, /var/lib/rsyslog/imjournal.state will be created (ignore if this is first run): No such file or directory [v8.2102.0-111.el9 try https://www.rsyslog.com/e/2040 ]
Feb 23 06:36:05 localhost systemd[1]: Started System Logging Service.
Feb 23 06:36:05 localhost systemd-logind[759]: New seat seat0.
Feb 23 06:36:05 localhost systemd-logind[759]: Watching system buttons on /dev/input/event0 (Power Button)
Feb 23 06:36:05 localhost systemd-logind[759]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Feb 23 06:36:05 localhost systemd[1]: Started User Login Management.
Feb 23 06:36:05 localhost chronyd[765]: chronyd version 4.3 starting (+CMDMON +NTP +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +ASYNCDNS +NTS +SECHASH +IPV6 +DEBUG)
Feb 23 06:36:05 localhost chronyd[765]: Using right/UTC timezone to obtain leap second data
Feb 23 06:36:05 localhost chronyd[765]: Loaded seccomp filter (level 2)
Feb 23 06:36:05 localhost systemd[1]: Started NTP client/server.
Feb 23 06:36:05 localhost rsyslogd[758]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Feb 23 06:36:06 localhost cloud-init[769]: Cloud-init v. 22.1-9.el9 running 'init-local' at Mon, 23 Feb 2026 06:36:06 +0000. Up 6.32 seconds.
Feb 23 06:36:06 localhost kernel: ISO 9660 Extensions: Microsoft Joliet Level 3
Feb 23 06:36:06 localhost kernel: ISO 9660 Extensions: RRIP_1991A
Feb 23 06:36:06 localhost systemd[1]: run-cloud\x2dinit-tmp-tmp4o17gt98.mount: Deactivated successfully.
Feb 23 06:36:06 localhost systemd[1]: Starting Hostname Service...
Feb 23 06:36:06 localhost systemd[1]: Started Hostname Service.
Feb 23 06:36:06 np0005626463.novalocal systemd-hostnamed[783]: Hostname set to <np0005626463.novalocal> (static)
Feb 23 06:36:06 np0005626463.novalocal systemd[1]: Finished Initial cloud-init job (pre-networking).
Feb 23 06:36:06 np0005626463.novalocal systemd[1]: Reached target Preparation for Network.
Feb 23 06:36:06 np0005626463.novalocal systemd[1]: Starting Network Manager...
Feb 23 06:36:06 np0005626463.novalocal NetworkManager[788]: <info>  [1771828566.7061] NetworkManager (version 1.42.2-1.el9) is starting... (boot:7e1679c6-ea6b-4cb0-813d-ca6f65e53cae)
Feb 23 06:36:06 np0005626463.novalocal NetworkManager[788]: <info>  [1771828566.7066] Read config: /etc/NetworkManager/NetworkManager.conf (run: 15-carrier-timeout.conf)
Feb 23 06:36:06 np0005626463.novalocal systemd[1]: Started Network Manager.
Feb 23 06:36:06 np0005626463.novalocal NetworkManager[788]: <info>  [1771828566.7113] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Feb 23 06:36:06 np0005626463.novalocal systemd[1]: Reached target Network.
Feb 23 06:36:06 np0005626463.novalocal NetworkManager[788]: <info>  [1771828566.7205] manager[0x55ed99fff020]: monitoring kernel firmware directory '/lib/firmware'.
Feb 23 06:36:06 np0005626463.novalocal systemd[1]: Starting Network Manager Wait Online...
Feb 23 06:36:06 np0005626463.novalocal NetworkManager[788]: <info>  [1771828566.7241] hostname: hostname: using hostnamed
Feb 23 06:36:06 np0005626463.novalocal NetworkManager[788]: <info>  [1771828566.7241] hostname: static hostname changed from (none) to "np0005626463.novalocal"
Feb 23 06:36:06 np0005626463.novalocal NetworkManager[788]: <info>  [1771828566.7253] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Feb 23 06:36:06 np0005626463.novalocal systemd[1]: Starting GSSAPI Proxy Daemon...
Feb 23 06:36:06 np0005626463.novalocal systemd[1]: Starting Enable periodic update of entitlement certificates....
Feb 23 06:36:06 np0005626463.novalocal NetworkManager[788]: <info>  [1771828566.7384] manager[0x55ed99fff020]: rfkill: Wi-Fi hardware radio set enabled
Feb 23 06:36:06 np0005626463.novalocal NetworkManager[788]: <info>  [1771828566.7385] manager[0x55ed99fff020]: rfkill: WWAN hardware radio set enabled
Feb 23 06:36:06 np0005626463.novalocal systemd[1]: Starting Dynamic System Tuning Daemon...
Feb 23 06:36:06 np0005626463.novalocal systemd[1]: Started GSSAPI Proxy Daemon.
Feb 23 06:36:06 np0005626463.novalocal systemd[1]: Started Enable periodic update of entitlement certificates..
Feb 23 06:36:06 np0005626463.novalocal NetworkManager[788]: <info>  [1771828566.7494] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.42.2-1.el9/libnm-device-plugin-team.so)
Feb 23 06:36:06 np0005626463.novalocal NetworkManager[788]: <info>  [1771828566.7495] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Feb 23 06:36:06 np0005626463.novalocal NetworkManager[788]: <info>  [1771828566.7506] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Feb 23 06:36:06 np0005626463.novalocal NetworkManager[788]: <info>  [1771828566.7507] manager: Networking is enabled by state file
Feb 23 06:36:06 np0005626463.novalocal NetworkManager[788]: <info>  [1771828566.7558] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.42.2-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Feb 23 06:36:06 np0005626463.novalocal NetworkManager[788]: <info>  [1771828566.7559] settings: Loaded settings plugin: keyfile (internal)
Feb 23 06:36:06 np0005626463.novalocal NetworkManager[788]: <info>  [1771828566.7585] dhcp: init: Using DHCP client 'internal'
Feb 23 06:36:06 np0005626463.novalocal NetworkManager[788]: <info>  [1771828566.7587] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Feb 23 06:36:06 np0005626463.novalocal NetworkManager[788]: <info>  [1771828566.7596] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', sys-iface-state: 'external')
Feb 23 06:36:06 np0005626463.novalocal systemd[1]: Listening on Load/Save RF Kill Switch Status /dev/rfkill Watch.
Feb 23 06:36:06 np0005626463.novalocal NetworkManager[788]: <info>  [1771828566.7602] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', sys-iface-state: 'external')
Feb 23 06:36:06 np0005626463.novalocal NetworkManager[788]: <info>  [1771828566.7607] device (lo): Activation: starting connection 'lo' (8bdfeccc-b3ac-4c33-8351-8677ac367e4c)
Feb 23 06:36:06 np0005626463.novalocal NetworkManager[788]: <info>  [1771828566.7614] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Feb 23 06:36:06 np0005626463.novalocal NetworkManager[788]: <info>  [1771828566.7616] device (eth0): state change: unmanaged -> unavailable (reason 'managed', sys-iface-state: 'external')
Feb 23 06:36:06 np0005626463.novalocal NetworkManager[788]: <info>  [1771828566.7650] device (lo): state change: disconnected -> prepare (reason 'none', sys-iface-state: 'external')
Feb 23 06:36:06 np0005626463.novalocal NetworkManager[788]: <info>  [1771828566.7654] device (lo): state change: prepare -> config (reason 'none', sys-iface-state: 'external')
Feb 23 06:36:06 np0005626463.novalocal NetworkManager[788]: <info>  [1771828566.7655] device (lo): state change: config -> ip-config (reason 'none', sys-iface-state: 'external')
Feb 23 06:36:06 np0005626463.novalocal NetworkManager[788]: <info>  [1771828566.7657] device (eth0): carrier: link connected
Feb 23 06:36:06 np0005626463.novalocal NetworkManager[788]: <info>  [1771828566.7660] device (lo): state change: ip-config -> ip-check (reason 'none', sys-iface-state: 'external')
Feb 23 06:36:06 np0005626463.novalocal NetworkManager[788]: <info>  [1771828566.7666] device (eth0): state change: unavailable -> disconnected (reason 'carrier-changed', sys-iface-state: 'managed')
Feb 23 06:36:06 np0005626463.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Feb 23 06:36:06 np0005626463.novalocal systemd[1]: RPC security service for NFS client and server was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Feb 23 06:36:06 np0005626463.novalocal systemd[1]: Reached target NFS client services.
Feb 23 06:36:06 np0005626463.novalocal NetworkManager[788]: <info>  [1771828566.7698] policy: auto-activating connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Feb 23 06:36:06 np0005626463.novalocal NetworkManager[788]: <info>  [1771828566.7702] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Feb 23 06:36:06 np0005626463.novalocal NetworkManager[788]: <info>  [1771828566.7703] device (eth0): state change: disconnected -> prepare (reason 'none', sys-iface-state: 'managed')
Feb 23 06:36:06 np0005626463.novalocal NetworkManager[788]: <info>  [1771828566.7706] manager: NetworkManager state is now CONNECTING
Feb 23 06:36:06 np0005626463.novalocal NetworkManager[788]: <info>  [1771828566.7708] device (eth0): state change: prepare -> config (reason 'none', sys-iface-state: 'managed')
Feb 23 06:36:06 np0005626463.novalocal NetworkManager[788]: <info>  [1771828566.7718] device (eth0): state change: config -> ip-config (reason 'none', sys-iface-state: 'managed')
Feb 23 06:36:06 np0005626463.novalocal systemd[1]: Reached target Preparation for Remote File Systems.
Feb 23 06:36:06 np0005626463.novalocal NetworkManager[788]: <info>  [1771828566.7722] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Feb 23 06:36:06 np0005626463.novalocal systemd[1]: Reached target Remote File Systems.
Feb 23 06:36:06 np0005626463.novalocal NetworkManager[788]: <info>  [1771828566.7786] dhcp4 (eth0): state changed new lease, address=38.102.83.164
Feb 23 06:36:06 np0005626463.novalocal NetworkManager[788]: <info>  [1771828566.7790] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Feb 23 06:36:06 np0005626463.novalocal NetworkManager[788]: <info>  [1771828566.7812] device (eth0): state change: ip-config -> ip-check (reason 'none', sys-iface-state: 'managed')
Feb 23 06:36:06 np0005626463.novalocal systemd[1]: TPM2 PCR Barrier (User) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Feb 23 06:36:06 np0005626463.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Feb 23 06:36:06 np0005626463.novalocal NetworkManager[788]: <info>  [1771828566.7988] device (lo): state change: ip-check -> secondaries (reason 'none', sys-iface-state: 'external')
Feb 23 06:36:06 np0005626463.novalocal NetworkManager[788]: <info>  [1771828566.7991] device (lo): state change: secondaries -> activated (reason 'none', sys-iface-state: 'external')
Feb 23 06:36:06 np0005626463.novalocal NetworkManager[788]: <info>  [1771828566.7995] device (lo): Activation: successful, device activated.
Feb 23 06:36:06 np0005626463.novalocal NetworkManager[788]: <info>  [1771828566.8000] device (eth0): state change: ip-check -> secondaries (reason 'none', sys-iface-state: 'managed')
Feb 23 06:36:06 np0005626463.novalocal NetworkManager[788]: <info>  [1771828566.8002] device (eth0): state change: secondaries -> activated (reason 'none', sys-iface-state: 'managed')
Feb 23 06:36:06 np0005626463.novalocal NetworkManager[788]: <info>  [1771828566.8004] manager: NetworkManager state is now CONNECTED_SITE
Feb 23 06:36:06 np0005626463.novalocal NetworkManager[788]: <info>  [1771828566.8006] device (eth0): Activation: successful, device activated.
Feb 23 06:36:06 np0005626463.novalocal NetworkManager[788]: <info>  [1771828566.8010] manager: NetworkManager state is now CONNECTED_GLOBAL
Feb 23 06:36:06 np0005626463.novalocal NetworkManager[788]: <info>  [1771828566.8012] manager: startup complete
Feb 23 06:36:06 np0005626463.novalocal systemd[1]: Finished Network Manager Wait Online.
Feb 23 06:36:06 np0005626463.novalocal systemd[1]: Starting Initial cloud-init job (metadata service crawler)...
Feb 23 06:36:07 np0005626463.novalocal cloud-init[951]: Cloud-init v. 22.1-9.el9 running 'init' at Mon, 23 Feb 2026 06:36:07 +0000. Up 7.20 seconds.
Feb 23 06:36:07 np0005626463.novalocal cloud-init[951]: ci-info: +++++++++++++++++++++++++++++++++++++++Net device info+++++++++++++++++++++++++++++++++++++++
Feb 23 06:36:07 np0005626463.novalocal cloud-init[951]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Feb 23 06:36:07 np0005626463.novalocal cloud-init[951]: ci-info: | Device |  Up  |           Address            |      Mask     | Scope  |     Hw-Address    |
Feb 23 06:36:07 np0005626463.novalocal cloud-init[951]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Feb 23 06:36:07 np0005626463.novalocal cloud-init[951]: ci-info: |  eth0  | True |        38.102.83.164         | 255.255.255.0 | global | fa:16:3e:9a:b6:c6 |
Feb 23 06:36:07 np0005626463.novalocal cloud-init[951]: ci-info: |  eth0  | True | fe80::f816:3eff:fe9a:b6c6/64 |       .       |  link  | fa:16:3e:9a:b6:c6 |
Feb 23 06:36:07 np0005626463.novalocal cloud-init[951]: ci-info: |   lo   | True |          127.0.0.1           |   255.0.0.0   |  host  |         .         |
Feb 23 06:36:07 np0005626463.novalocal cloud-init[951]: ci-info: |   lo   | True |           ::1/128            |       .       |  host  |         .         |
Feb 23 06:36:07 np0005626463.novalocal cloud-init[951]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Feb 23 06:36:07 np0005626463.novalocal cloud-init[951]: ci-info: +++++++++++++++++++++++++++++++++Route IPv4 info+++++++++++++++++++++++++++++++++
Feb 23 06:36:07 np0005626463.novalocal cloud-init[951]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Feb 23 06:36:07 np0005626463.novalocal cloud-init[951]: ci-info: | Route |   Destination   |    Gateway    |     Genmask     | Interface | Flags |
Feb 23 06:36:07 np0005626463.novalocal cloud-init[951]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Feb 23 06:36:07 np0005626463.novalocal cloud-init[951]: ci-info: |   0   |     0.0.0.0     |  38.102.83.1  |     0.0.0.0     |    eth0   |   UG  |
Feb 23 06:36:07 np0005626463.novalocal cloud-init[951]: ci-info: |   1   |   38.102.83.0   |    0.0.0.0    |  255.255.255.0  |    eth0   |   U   |
Feb 23 06:36:07 np0005626463.novalocal cloud-init[951]: ci-info: |   2   | 169.254.169.254 | 38.102.83.126 | 255.255.255.255 |    eth0   |  UGH  |
Feb 23 06:36:07 np0005626463.novalocal cloud-init[951]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Feb 23 06:36:07 np0005626463.novalocal cloud-init[951]: ci-info: +++++++++++++++++++Route IPv6 info+++++++++++++++++++
Feb 23 06:36:07 np0005626463.novalocal cloud-init[951]: ci-info: +-------+-------------+---------+-----------+-------+
Feb 23 06:36:07 np0005626463.novalocal cloud-init[951]: ci-info: | Route | Destination | Gateway | Interface | Flags |
Feb 23 06:36:07 np0005626463.novalocal cloud-init[951]: ci-info: +-------+-------------+---------+-----------+-------+
Feb 23 06:36:07 np0005626463.novalocal cloud-init[951]: ci-info: |   1   |  fe80::/64  |    ::   |    eth0   |   U   |
Feb 23 06:36:07 np0005626463.novalocal cloud-init[951]: ci-info: |   3   |  multicast  |    ::   |    eth0   |   U   |
Feb 23 06:36:07 np0005626463.novalocal cloud-init[951]: ci-info: +-------+-------------+---------+-----------+-------+
Feb 23 06:36:07 np0005626463.novalocal systemd[1]: Starting Authorization Manager...
Feb 23 06:36:07 np0005626463.novalocal systemd[1]: Started Dynamic System Tuning Daemon.
Feb 23 06:36:07 np0005626463.novalocal polkitd[1033]: Started polkitd version 0.117
Feb 23 06:36:07 np0005626463.novalocal polkitd[1033]: Loading rules from directory /etc/polkit-1/rules.d
Feb 23 06:36:07 np0005626463.novalocal polkitd[1033]: Loading rules from directory /usr/share/polkit-1/rules.d
Feb 23 06:36:07 np0005626463.novalocal polkitd[1033]: Finished loading, compiling and executing 4 rules
Feb 23 06:36:07 np0005626463.novalocal systemd[1]: Started Authorization Manager.
Feb 23 06:36:07 np0005626463.novalocal polkitd[1033]: Acquired the name org.freedesktop.PolicyKit1 on the system bus
Feb 23 06:36:09 np0005626463.novalocal useradd[1118]: new group: name=cloud-user, GID=1001
Feb 23 06:36:09 np0005626463.novalocal useradd[1118]: new user: name=cloud-user, UID=1001, GID=1001, home=/home/cloud-user, shell=/bin/bash, from=none
Feb 23 06:36:09 np0005626463.novalocal useradd[1118]: add 'cloud-user' to group 'adm'
Feb 23 06:36:09 np0005626463.novalocal useradd[1118]: add 'cloud-user' to group 'systemd-journal'
Feb 23 06:36:09 np0005626463.novalocal useradd[1118]: add 'cloud-user' to shadow group 'adm'
Feb 23 06:36:09 np0005626463.novalocal useradd[1118]: add 'cloud-user' to shadow group 'systemd-journal'
Feb 23 06:36:10 np0005626463.novalocal cloud-init[951]: Generating public/private rsa key pair.
Feb 23 06:36:10 np0005626463.novalocal cloud-init[951]: Your identification has been saved in /etc/ssh/ssh_host_rsa_key
Feb 23 06:36:10 np0005626463.novalocal cloud-init[951]: Your public key has been saved in /etc/ssh/ssh_host_rsa_key.pub
Feb 23 06:36:10 np0005626463.novalocal cloud-init[951]: The key fingerprint is:
Feb 23 06:36:10 np0005626463.novalocal cloud-init[951]: SHA256:pdNZ75QGpUNpcq8gaslC+2P1B6YHH1UKWMec5bQ/qws root@np0005626463.novalocal
Feb 23 06:36:10 np0005626463.novalocal cloud-init[951]: The key's randomart image is:
Feb 23 06:36:10 np0005626463.novalocal cloud-init[951]: +---[RSA 3072]----+
Feb 23 06:36:10 np0005626463.novalocal cloud-init[951]: |         o.oo++  |
Feb 23 06:36:10 np0005626463.novalocal cloud-init[951]: |        . ooB=.. |
Feb 23 06:36:10 np0005626463.novalocal cloud-init[951]: |          .==+o  |
Feb 23 06:36:10 np0005626463.novalocal cloud-init[951]: |    .   .+.oo+.o |
Feb 23 06:36:10 np0005626463.novalocal cloud-init[951]: |   . o oS.oo .=..|
Feb 23 06:36:10 np0005626463.novalocal cloud-init[951]: |    o = o.+ .+  o|
Feb 23 06:36:10 np0005626463.novalocal cloud-init[951]: |     + . * E  .. |
Feb 23 06:36:10 np0005626463.novalocal cloud-init[951]: |      + . + o .  |
Feb 23 06:36:10 np0005626463.novalocal cloud-init[951]: |     . . . . o.  |
Feb 23 06:36:10 np0005626463.novalocal cloud-init[951]: +----[SHA256]-----+
Feb 23 06:36:10 np0005626463.novalocal cloud-init[951]: Generating public/private ecdsa key pair.
Feb 23 06:36:10 np0005626463.novalocal cloud-init[951]: Your identification has been saved in /etc/ssh/ssh_host_ecdsa_key
Feb 23 06:36:10 np0005626463.novalocal cloud-init[951]: Your public key has been saved in /etc/ssh/ssh_host_ecdsa_key.pub
Feb 23 06:36:10 np0005626463.novalocal cloud-init[951]: The key fingerprint is:
Feb 23 06:36:10 np0005626463.novalocal cloud-init[951]: SHA256:OD3aGn6FewPNOJKfrmSp+eMwt9pK7oB2rTS1kgG+Y2g root@np0005626463.novalocal
Feb 23 06:36:10 np0005626463.novalocal cloud-init[951]: The key's randomart image is:
Feb 23 06:36:10 np0005626463.novalocal cloud-init[951]: +---[ECDSA 256]---+
Feb 23 06:36:10 np0005626463.novalocal cloud-init[951]: |                 |
Feb 23 06:36:10 np0005626463.novalocal cloud-init[951]: |                 |
Feb 23 06:36:10 np0005626463.novalocal cloud-init[951]: | .               |
Feb 23 06:36:10 np0005626463.novalocal cloud-init[951]: |. .    o         |
Feb 23 06:36:10 np0005626463.novalocal cloud-init[951]: | . . .o.S=       |
Feb 23 06:36:10 np0005626463.novalocal cloud-init[951]: |... = +==.+      |
Feb 23 06:36:10 np0005626463.novalocal cloud-init[951]: |oE.*+oBo.*       |
Feb 23 06:36:10 np0005626463.novalocal cloud-init[951]: |+ +++@.+= o      |
Feb 23 06:36:10 np0005626463.novalocal cloud-init[951]: |   oB=O=.. .     |
Feb 23 06:36:10 np0005626463.novalocal cloud-init[951]: +----[SHA256]-----+
Feb 23 06:36:10 np0005626463.novalocal cloud-init[951]: Generating public/private ed25519 key pair.
Feb 23 06:36:10 np0005626463.novalocal cloud-init[951]: Your identification has been saved in /etc/ssh/ssh_host_ed25519_key
Feb 23 06:36:10 np0005626463.novalocal cloud-init[951]: Your public key has been saved in /etc/ssh/ssh_host_ed25519_key.pub
Feb 23 06:36:10 np0005626463.novalocal cloud-init[951]: The key fingerprint is:
Feb 23 06:36:10 np0005626463.novalocal cloud-init[951]: SHA256:1oXKoehgbeTcKm6BaoZ08cczaBK1XI/qMeTUf9gvmKw root@np0005626463.novalocal
Feb 23 06:36:10 np0005626463.novalocal cloud-init[951]: The key's randomart image is:
Feb 23 06:36:10 np0005626463.novalocal cloud-init[951]: +--[ED25519 256]--+
Feb 23 06:36:10 np0005626463.novalocal cloud-init[951]: |                 |
Feb 23 06:36:10 np0005626463.novalocal cloud-init[951]: |     . .   .     |
Feb 23 06:36:10 np0005626463.novalocal cloud-init[951]: |    + + + . .    |
Feb 23 06:36:10 np0005626463.novalocal cloud-init[951]: |   B B = = .     |
Feb 23 06:36:10 np0005626463.novalocal cloud-init[951]: | .o & * S +      |
Feb 23 06:36:10 np0005626463.novalocal cloud-init[951]: |.o.* X * o o     |
Feb 23 06:36:10 np0005626463.novalocal cloud-init[951]: |+ o.* + + + .    |
Feb 23 06:36:10 np0005626463.novalocal cloud-init[951]: |o+.. .   + . .   |
Feb 23 06:36:10 np0005626463.novalocal cloud-init[951]: |o..    E.   .    |
Feb 23 06:36:10 np0005626463.novalocal cloud-init[951]: +----[SHA256]-----+
Feb 23 06:36:10 np0005626463.novalocal systemd[1]: Finished Initial cloud-init job (metadata service crawler).
Feb 23 06:36:10 np0005626463.novalocal systemd[1]: Reached target Cloud-config availability.
Feb 23 06:36:10 np0005626463.novalocal systemd[1]: Reached target Network is Online.
Feb 23 06:36:10 np0005626463.novalocal systemd[1]: Starting Apply the settings specified in cloud-config...
Feb 23 06:36:10 np0005626463.novalocal systemd[1]: Run Insights Client at boot was skipped because of an unmet condition check (ConditionPathExists=/etc/insights-client/.run_insights_client_next_boot).
Feb 23 06:36:10 np0005626463.novalocal systemd[1]: Starting Crash recovery kernel arming...
Feb 23 06:36:10 np0005626463.novalocal systemd[1]: Starting Notify NFS peers of a restart...
Feb 23 06:36:10 np0005626463.novalocal systemd[1]: Starting OpenSSH server daemon...
Feb 23 06:36:10 np0005626463.novalocal sm-notify[1131]: Version 2.5.4 starting
Feb 23 06:36:10 np0005626463.novalocal systemd[1]: Starting Permit User Sessions...
Feb 23 06:36:10 np0005626463.novalocal systemd[1]: Started Notify NFS peers of a restart.
Feb 23 06:36:10 np0005626463.novalocal systemd[1]: Finished Permit User Sessions.
Feb 23 06:36:10 np0005626463.novalocal sshd[1132]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 06:36:10 np0005626463.novalocal systemd[1]: Started Command Scheduler.
Feb 23 06:36:10 np0005626463.novalocal sshd[1132]: Server listening on 0.0.0.0 port 22.
Feb 23 06:36:10 np0005626463.novalocal sshd[1132]: Server listening on :: port 22.
Feb 23 06:36:10 np0005626463.novalocal systemd[1]: Started Getty on tty1.
Feb 23 06:36:10 np0005626463.novalocal crond[1134]: (CRON) STARTUP (1.5.7)
Feb 23 06:36:10 np0005626463.novalocal crond[1134]: (CRON) INFO (Syslog will be used instead of sendmail.)
Feb 23 06:36:10 np0005626463.novalocal crond[1134]: (CRON) INFO (RANDOM_DELAY will be scaled with factor 14% if used.)
Feb 23 06:36:10 np0005626463.novalocal crond[1134]: (CRON) INFO (running with inotify support)
Feb 23 06:36:10 np0005626463.novalocal systemd[1]: Started Serial Getty on ttyS0.
Feb 23 06:36:10 np0005626463.novalocal systemd[1]: Reached target Login Prompts.
Feb 23 06:36:10 np0005626463.novalocal systemd[1]: Started OpenSSH server daemon.
Feb 23 06:36:10 np0005626463.novalocal systemd[1]: Reached target Multi-User System.
Feb 23 06:36:10 np0005626463.novalocal systemd[1]: Starting Record Runlevel Change in UTMP...
Feb 23 06:36:10 np0005626463.novalocal systemd[1]: systemd-update-utmp-runlevel.service: Deactivated successfully.
Feb 23 06:36:10 np0005626463.novalocal systemd[1]: Finished Record Runlevel Change in UTMP.
Feb 23 06:36:10 np0005626463.novalocal sshd[1146]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 06:36:10 np0005626463.novalocal sshd[1163]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 06:36:10 np0005626463.novalocal sshd[1163]: Unable to negotiate with 38.102.83.114 port 55210: no matching host key type found. Their offer: ssh-ed25519,ssh-ed25519-cert-v01@openssh.com [preauth]
Feb 23 06:36:10 np0005626463.novalocal sshd[1176]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 06:36:10 np0005626463.novalocal kdumpctl[1137]: kdump: No kdump initial ramdisk found.
Feb 23 06:36:10 np0005626463.novalocal kdumpctl[1137]: kdump: Rebuilding /boot/initramfs-5.14.0-284.11.1.el9_2.x86_64kdump.img
Feb 23 06:36:10 np0005626463.novalocal sshd[1176]: Connection reset by 38.102.83.114 port 55212 [preauth]
Feb 23 06:36:10 np0005626463.novalocal sshd[1193]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 06:36:10 np0005626463.novalocal sshd[1193]: Unable to negotiate with 38.102.83.114 port 55222: no matching host key type found. Their offer: ecdsa-sha2-nistp384,ecdsa-sha2-nistp384-cert-v01@openssh.com [preauth]
Feb 23 06:36:10 np0005626463.novalocal sshd[1201]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 06:36:10 np0005626463.novalocal sshd[1201]: Unable to negotiate with 38.102.83.114 port 55232: no matching host key type found. Their offer: ecdsa-sha2-nistp521,ecdsa-sha2-nistp521-cert-v01@openssh.com [preauth]
Feb 23 06:36:10 np0005626463.novalocal sshd[1146]: Connection closed by 38.102.83.114 port 55206 [preauth]
Feb 23 06:36:10 np0005626463.novalocal sshd[1220]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 06:36:10 np0005626463.novalocal sshd[1247]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 06:36:10 np0005626463.novalocal sshd[1265]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 06:36:10 np0005626463.novalocal cloud-init[1266]: Cloud-init v. 22.1-9.el9 running 'modules:config' at Mon, 23 Feb 2026 06:36:10 +0000. Up 10.67 seconds.
Feb 23 06:36:10 np0005626463.novalocal sshd[1265]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 06:36:10 np0005626463.novalocal sshd[1278]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 06:36:10 np0005626463.novalocal sshd[1278]: Unable to negotiate with 38.102.83.114 port 55262: no matching host key type found. Their offer: ssh-dss,ssh-dss-cert-v01@openssh.com [preauth]
Feb 23 06:36:10 np0005626463.novalocal sshd[1220]: Connection closed by 38.102.83.114 port 55238 [preauth]
Feb 23 06:36:10 np0005626463.novalocal sshd[1247]: Connection closed by 38.102.83.114 port 55254 [preauth]
Feb 23 06:36:10 np0005626463.novalocal systemd[1]: Finished Apply the settings specified in cloud-config.
Feb 23 06:36:10 np0005626463.novalocal systemd[1]: Starting Execute cloud user/final scripts...
Feb 23 06:36:10 np0005626463.novalocal dracut[1434]: dracut-057-21.git20230214.el9
Feb 23 06:36:10 np0005626463.novalocal cloud-init[1452]: Cloud-init v. 22.1-9.el9 running 'modules:final' at Mon, 23 Feb 2026 06:36:10 +0000. Up 11.06 seconds.
Feb 23 06:36:11 np0005626463.novalocal cloud-init[1457]: #############################################################
Feb 23 06:36:11 np0005626463.novalocal cloud-init[1464]: -----BEGIN SSH HOST KEY FINGERPRINTS-----
Feb 23 06:36:11 np0005626463.novalocal dracut[1436]: Executing: /usr/bin/dracut --add kdumpbase --quiet --hostonly --hostonly-cmdline --hostonly-i18n --hostonly-mode strict --hostonly-nics  -o "plymouth resume ifcfg earlykdump" --mount "/dev/disk/by-uuid/a3dd82de-ffc6-4652-88b9-80e003b8f20a /sysroot xfs rw,relatime,seclabel,attr2,inode64,logbufs=8,logbsize=32k,noquota" --squash-compressor zstd --no-hostonly-default-device -f /boot/initramfs-5.14.0-284.11.1.el9_2.x86_64kdump.img 5.14.0-284.11.1.el9_2.x86_64
Feb 23 06:36:11 np0005626463.novalocal cloud-init[1472]: 256 SHA256:OD3aGn6FewPNOJKfrmSp+eMwt9pK7oB2rTS1kgG+Y2g root@np0005626463.novalocal (ECDSA)
Feb 23 06:36:11 np0005626463.novalocal cloud-init[1479]: 256 SHA256:1oXKoehgbeTcKm6BaoZ08cczaBK1XI/qMeTUf9gvmKw root@np0005626463.novalocal (ED25519)
Feb 23 06:36:11 np0005626463.novalocal cloud-init[1483]: 3072 SHA256:pdNZ75QGpUNpcq8gaslC+2P1B6YHH1UKWMec5bQ/qws root@np0005626463.novalocal (RSA)
Feb 23 06:36:11 np0005626463.novalocal cloud-init[1486]: -----END SSH HOST KEY FINGERPRINTS-----
Feb 23 06:36:11 np0005626463.novalocal cloud-init[1490]: #############################################################
Feb 23 06:36:11 np0005626463.novalocal cloud-init[1452]: Cloud-init v. 22.1-9.el9 finished at Mon, 23 Feb 2026 06:36:11 +0000. Datasource DataSourceConfigDrive [net,ver=2][source=/dev/sr0].  Up 11.32 seconds
Feb 23 06:36:11 np0005626463.novalocal dracut[1436]: dracut module 'systemd-networkd' will not be installed, because command 'networkctl' could not be found!
Feb 23 06:36:11 np0005626463.novalocal dracut[1436]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd' could not be found!
Feb 23 06:36:11 np0005626463.novalocal dracut[1436]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd-wait-online' could not be found!
Feb 23 06:36:11 np0005626463.novalocal dracut[1436]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Feb 23 06:36:11 np0005626463.novalocal dracut[1436]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Feb 23 06:36:11 np0005626463.novalocal systemd[1]: Reloading Network Manager...
Feb 23 06:36:11 np0005626463.novalocal dracut[1436]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Feb 23 06:36:11 np0005626463.novalocal dracut[1436]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Feb 23 06:36:11 np0005626463.novalocal NetworkManager[788]: <info>  [1771828571.2636] audit: op="reload" arg="0" pid=1592 uid=0 result="success"
Feb 23 06:36:11 np0005626463.novalocal NetworkManager[788]: <info>  [1771828571.2645] config: signal: SIGHUP (no changes from disk)
Feb 23 06:36:11 np0005626463.novalocal systemd[1]: Reloaded Network Manager.
Feb 23 06:36:11 np0005626463.novalocal systemd[1]: Finished Execute cloud user/final scripts.
Feb 23 06:36:11 np0005626463.novalocal dracut[1436]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Feb 23 06:36:11 np0005626463.novalocal systemd[1]: Reached target Cloud-init target.
Feb 23 06:36:11 np0005626463.novalocal dracut[1436]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Feb 23 06:36:11 np0005626463.novalocal dracut[1436]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Feb 23 06:36:11 np0005626463.novalocal dracut[1436]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Feb 23 06:36:11 np0005626463.novalocal dracut[1436]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Feb 23 06:36:11 np0005626463.novalocal dracut[1436]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Feb 23 06:36:11 np0005626463.novalocal dracut[1436]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Feb 23 06:36:11 np0005626463.novalocal dracut[1436]: dracut module 'ifcfg' will not be installed, because it's in the list to be omitted!
Feb 23 06:36:11 np0005626463.novalocal dracut[1436]: dracut module 'plymouth' will not be installed, because it's in the list to be omitted!
Feb 23 06:36:11 np0005626463.novalocal dracut[1436]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Feb 23 06:36:11 np0005626463.novalocal dracut[1436]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Feb 23 06:36:11 np0005626463.novalocal dracut[1436]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Feb 23 06:36:11 np0005626463.novalocal dracut[1436]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Feb 23 06:36:11 np0005626463.novalocal dracut[1436]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Feb 23 06:36:11 np0005626463.novalocal dracut[1436]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Feb 23 06:36:11 np0005626463.novalocal dracut[1436]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Feb 23 06:36:11 np0005626463.novalocal dracut[1436]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Feb 23 06:36:11 np0005626463.novalocal dracut[1436]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Feb 23 06:36:11 np0005626463.novalocal dracut[1436]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Feb 23 06:36:11 np0005626463.novalocal dracut[1436]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Feb 23 06:36:11 np0005626463.novalocal dracut[1436]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Feb 23 06:36:11 np0005626463.novalocal dracut[1436]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Feb 23 06:36:11 np0005626463.novalocal dracut[1436]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Feb 23 06:36:11 np0005626463.novalocal dracut[1436]: dracut module 'resume' will not be installed, because it's in the list to be omitted!
Feb 23 06:36:11 np0005626463.novalocal dracut[1436]: dracut module 'biosdevname' will not be installed, because command 'biosdevname' could not be found!
Feb 23 06:36:11 np0005626463.novalocal dracut[1436]: dracut module 'earlykdump' will not be installed, because it's in the list to be omitted!
Feb 23 06:36:11 np0005626463.novalocal dracut[1436]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Feb 23 06:36:11 np0005626463.novalocal dracut[1436]: memstrack is not available
Feb 23 06:36:11 np0005626463.novalocal dracut[1436]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Feb 23 06:36:11 np0005626463.novalocal dracut[1436]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Feb 23 06:36:11 np0005626463.novalocal dracut[1436]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Feb 23 06:36:11 np0005626463.novalocal dracut[1436]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Feb 23 06:36:11 np0005626463.novalocal dracut[1436]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Feb 23 06:36:11 np0005626463.novalocal dracut[1436]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Feb 23 06:36:11 np0005626463.novalocal chronyd[765]: Selected source 23.159.16.194 (2.rhel.pool.ntp.org)
Feb 23 06:36:11 np0005626463.novalocal chronyd[765]: System clock TAI offset set to 37 seconds
Feb 23 06:36:11 np0005626463.novalocal dracut[1436]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Feb 23 06:36:11 np0005626463.novalocal dracut[1436]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Feb 23 06:36:11 np0005626463.novalocal dracut[1436]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Feb 23 06:36:11 np0005626463.novalocal dracut[1436]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Feb 23 06:36:11 np0005626463.novalocal dracut[1436]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Feb 23 06:36:11 np0005626463.novalocal dracut[1436]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Feb 23 06:36:11 np0005626463.novalocal dracut[1436]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Feb 23 06:36:11 np0005626463.novalocal dracut[1436]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Feb 23 06:36:11 np0005626463.novalocal dracut[1436]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Feb 23 06:36:11 np0005626463.novalocal dracut[1436]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Feb 23 06:36:11 np0005626463.novalocal dracut[1436]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Feb 23 06:36:11 np0005626463.novalocal dracut[1436]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Feb 23 06:36:11 np0005626463.novalocal dracut[1436]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Feb 23 06:36:12 np0005626463.novalocal dracut[1436]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Feb 23 06:36:12 np0005626463.novalocal dracut[1436]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Feb 23 06:36:12 np0005626463.novalocal dracut[1436]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Feb 23 06:36:12 np0005626463.novalocal dracut[1436]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Feb 23 06:36:12 np0005626463.novalocal dracut[1436]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Feb 23 06:36:12 np0005626463.novalocal dracut[1436]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Feb 23 06:36:12 np0005626463.novalocal dracut[1436]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Feb 23 06:36:12 np0005626463.novalocal dracut[1436]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Feb 23 06:36:12 np0005626463.novalocal dracut[1436]: memstrack is not available
Feb 23 06:36:12 np0005626463.novalocal dracut[1436]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Feb 23 06:36:12 np0005626463.novalocal dracut[1436]: *** Including module: systemd ***
Feb 23 06:36:12 np0005626463.novalocal dracut[1436]: *** Including module: systemd-initrd ***
Feb 23 06:36:12 np0005626463.novalocal dracut[1436]: *** Including module: i18n ***
Feb 23 06:36:12 np0005626463.novalocal dracut[1436]: No KEYMAP configured.
Feb 23 06:36:12 np0005626463.novalocal dracut[1436]: *** Including module: drm ***
Feb 23 06:36:13 np0005626463.novalocal dracut[1436]: *** Including module: prefixdevname ***
Feb 23 06:36:13 np0005626463.novalocal dracut[1436]: *** Including module: kernel-modules ***
Feb 23 06:36:13 np0005626463.novalocal dracut[1436]: *** Including module: kernel-modules-extra ***
Feb 23 06:36:13 np0005626463.novalocal dracut[1436]:   kernel-modules-extra: configuration source "/run/depmod.d" does not exist
Feb 23 06:36:13 np0005626463.novalocal dracut[1436]:   kernel-modules-extra: configuration source "/lib/depmod.d" does not exist
Feb 23 06:36:13 np0005626463.novalocal dracut[1436]:   kernel-modules-extra: parsing configuration file "/etc/depmod.d/dist.conf"
Feb 23 06:36:13 np0005626463.novalocal dracut[1436]:   kernel-modules-extra: /etc/depmod.d/dist.conf: added "updates extra built-in weak-updates" to the list of search directories
Feb 23 06:36:13 np0005626463.novalocal dracut[1436]: *** Including module: qemu ***
Feb 23 06:36:14 np0005626463.novalocal dracut[1436]: *** Including module: fstab-sys ***
Feb 23 06:36:14 np0005626463.novalocal dracut[1436]: *** Including module: rootfs-block ***
Feb 23 06:36:14 np0005626463.novalocal dracut[1436]: *** Including module: terminfo ***
Feb 23 06:36:14 np0005626463.novalocal dracut[1436]: *** Including module: udev-rules ***
Feb 23 06:36:14 np0005626463.novalocal dracut[1436]: Skipping udev rule: 91-permissions.rules
Feb 23 06:36:14 np0005626463.novalocal dracut[1436]: Skipping udev rule: 80-drivers-modprobe.rules
Feb 23 06:36:14 np0005626463.novalocal dracut[1436]: *** Including module: virtiofs ***
Feb 23 06:36:14 np0005626463.novalocal dracut[1436]: *** Including module: dracut-systemd ***
Feb 23 06:36:14 np0005626463.novalocal dracut[1436]: *** Including module: usrmount ***
Feb 23 06:36:14 np0005626463.novalocal dracut[1436]: *** Including module: base ***
Feb 23 06:36:14 np0005626463.novalocal dracut[1436]: *** Including module: fs-lib ***
Feb 23 06:36:14 np0005626463.novalocal dracut[1436]: *** Including module: kdumpbase ***
Feb 23 06:36:15 np0005626463.novalocal dracut[1436]: *** Including module: microcode_ctl-fw_dir_override ***
Feb 23 06:36:15 np0005626463.novalocal dracut[1436]:   microcode_ctl module: mangling fw_dir
Feb 23 06:36:15 np0005626463.novalocal dracut[1436]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel"...
Feb 23 06:36:15 np0005626463.novalocal dracut[1436]:     microcode_ctl: configuration "intel" is ignored
Feb 23 06:36:15 np0005626463.novalocal dracut[1436]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-2d-07"...
Feb 23 06:36:15 np0005626463.novalocal dracut[1436]:     microcode_ctl: configuration "intel-06-2d-07" is ignored
Feb 23 06:36:15 np0005626463.novalocal dracut[1436]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4e-03"...
Feb 23 06:36:15 np0005626463.novalocal dracut[1436]:     microcode_ctl: configuration "intel-06-4e-03" is ignored
Feb 23 06:36:15 np0005626463.novalocal dracut[1436]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4f-01"...
Feb 23 06:36:15 np0005626463.novalocal dracut[1436]:     microcode_ctl: configuration "intel-06-4f-01" is ignored
Feb 23 06:36:15 np0005626463.novalocal dracut[1436]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-55-04"...
Feb 23 06:36:15 np0005626463.novalocal dracut[1436]:     microcode_ctl: configuration "intel-06-55-04" is ignored
Feb 23 06:36:15 np0005626463.novalocal dracut[1436]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-5e-03"...
Feb 23 06:36:15 np0005626463.novalocal dracut[1436]:     microcode_ctl: configuration "intel-06-5e-03" is ignored
Feb 23 06:36:15 np0005626463.novalocal dracut[1436]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8c-01"...
Feb 23 06:36:15 np0005626463.novalocal dracut[1436]:     microcode_ctl: configuration "intel-06-8c-01" is ignored
Feb 23 06:36:15 np0005626463.novalocal dracut[1436]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-0xca"...
Feb 23 06:36:15 np0005626463.novalocal dracut[1436]:     microcode_ctl: configuration "intel-06-8e-9e-0x-0xca" is ignored
Feb 23 06:36:15 np0005626463.novalocal dracut[1436]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-dell"...
Feb 23 06:36:15 np0005626463.novalocal dracut[1436]:     microcode_ctl: configuration "intel-06-8e-9e-0x-dell" is ignored
Feb 23 06:36:15 np0005626463.novalocal dracut[1436]:     microcode_ctl: final fw_dir: "/lib/firmware/updates/5.14.0-284.11.1.el9_2.x86_64 /lib/firmware/updates /lib/firmware/5.14.0-284.11.1.el9_2.x86_64 /lib/firmware"
Feb 23 06:36:15 np0005626463.novalocal dracut[1436]: *** Including module: shutdown ***
Feb 23 06:36:15 np0005626463.novalocal dracut[1436]: *** Including module: squash ***
Feb 23 06:36:15 np0005626463.novalocal dracut[1436]: *** Including modules done ***
Feb 23 06:36:15 np0005626463.novalocal dracut[1436]: *** Installing kernel module dependencies ***
Feb 23 06:36:16 np0005626463.novalocal dracut[1436]: *** Installing kernel module dependencies done ***
Feb 23 06:36:16 np0005626463.novalocal dracut[1436]: *** Resolving executable dependencies ***
Feb 23 06:36:16 np0005626463.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Feb 23 06:36:17 np0005626463.novalocal dracut[1436]: *** Resolving executable dependencies done ***
Feb 23 06:36:17 np0005626463.novalocal dracut[1436]: *** Hardlinking files ***
Feb 23 06:36:17 np0005626463.novalocal dracut[1436]: Mode:           real
Feb 23 06:36:17 np0005626463.novalocal dracut[1436]: Files:          1099
Feb 23 06:36:17 np0005626463.novalocal dracut[1436]: Linked:         3 files
Feb 23 06:36:17 np0005626463.novalocal dracut[1436]: Compared:       0 xattrs
Feb 23 06:36:17 np0005626463.novalocal dracut[1436]: Compared:       373 files
Feb 23 06:36:17 np0005626463.novalocal dracut[1436]: Saved:          61.04 KiB
Feb 23 06:36:17 np0005626463.novalocal dracut[1436]: Duration:       0.035691 seconds
Feb 23 06:36:17 np0005626463.novalocal dracut[1436]: *** Hardlinking files done ***
Feb 23 06:36:17 np0005626463.novalocal dracut[1436]: Could not find 'strip'. Not stripping the initramfs.
Feb 23 06:36:17 np0005626463.novalocal dracut[1436]: *** Generating early-microcode cpio image ***
Feb 23 06:36:17 np0005626463.novalocal dracut[1436]: *** Constructing AuthenticAMD.bin ***
Feb 23 06:36:17 np0005626463.novalocal dracut[1436]: *** Store current command line parameters ***
Feb 23 06:36:17 np0005626463.novalocal dracut[1436]: Stored kernel commandline:
Feb 23 06:36:17 np0005626463.novalocal dracut[1436]: No dracut internal kernel commandline stored in the initramfs
Feb 23 06:36:18 np0005626463.novalocal dracut[1436]: *** Install squash loader ***
Feb 23 06:36:18 np0005626463.novalocal dracut[1436]: *** Squashing the files inside the initramfs ***
Feb 23 06:36:19 np0005626463.novalocal dracut[1436]: *** Squashing the files inside the initramfs done ***
Feb 23 06:36:19 np0005626463.novalocal dracut[1436]: *** Creating image file '/boot/initramfs-5.14.0-284.11.1.el9_2.x86_64kdump.img' ***
Feb 23 06:36:19 np0005626463.novalocal dracut[1436]: *** Creating initramfs image file '/boot/initramfs-5.14.0-284.11.1.el9_2.x86_64kdump.img' done ***
Feb 23 06:36:20 np0005626463.novalocal kdumpctl[1137]: kdump: kexec: loaded kdump kernel
Feb 23 06:36:20 np0005626463.novalocal kdumpctl[1137]: kdump: Starting kdump: [OK]
Feb 23 06:36:20 np0005626463.novalocal systemd[1]: Finished Crash recovery kernel arming.
Feb 23 06:36:20 np0005626463.novalocal systemd[1]: Startup finished in 1.320s (kernel) + 2.026s (initrd) + 17.208s (userspace) = 20.555s.
Feb 23 06:36:32 np0005626463.novalocal sshd[4172]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 06:36:32 np0005626463.novalocal sshd[4172]: Accepted publickey for zuul from 38.102.83.114 port 59642 ssh2: RSA SHA256:zhs3MiW0JhxzckYcMHQES8SMYHj1iGcomnyzmbiwor8
Feb 23 06:36:32 np0005626463.novalocal systemd[1]: Created slice User Slice of UID 1000.
Feb 23 06:36:32 np0005626463.novalocal systemd[1]: Starting User Runtime Directory /run/user/1000...
Feb 23 06:36:32 np0005626463.novalocal systemd-logind[759]: New session 1 of user zuul.
Feb 23 06:36:32 np0005626463.novalocal systemd[1]: Finished User Runtime Directory /run/user/1000.
Feb 23 06:36:32 np0005626463.novalocal systemd[1]: Starting User Manager for UID 1000...
Feb 23 06:36:32 np0005626463.novalocal systemd[4176]: pam_unix(systemd-user:session): session opened for user zuul(uid=1000) by (uid=0)
Feb 23 06:36:32 np0005626463.novalocal systemd[4176]: Queued start job for default target Main User Target.
Feb 23 06:36:32 np0005626463.novalocal systemd[4176]: Created slice User Application Slice.
Feb 23 06:36:32 np0005626463.novalocal systemd[4176]: Started Mark boot as successful after the user session has run 2 minutes.
Feb 23 06:36:32 np0005626463.novalocal systemd[4176]: Started Daily Cleanup of User's Temporary Directories.
Feb 23 06:36:32 np0005626463.novalocal systemd[4176]: Reached target Paths.
Feb 23 06:36:32 np0005626463.novalocal systemd[4176]: Reached target Timers.
Feb 23 06:36:32 np0005626463.novalocal systemd[4176]: Starting D-Bus User Message Bus Socket...
Feb 23 06:36:32 np0005626463.novalocal systemd[4176]: Starting Create User's Volatile Files and Directories...
Feb 23 06:36:32 np0005626463.novalocal systemd[4176]: Finished Create User's Volatile Files and Directories.
Feb 23 06:36:32 np0005626463.novalocal systemd[4176]: Listening on D-Bus User Message Bus Socket.
Feb 23 06:36:32 np0005626463.novalocal systemd[4176]: Reached target Sockets.
Feb 23 06:36:32 np0005626463.novalocal systemd[4176]: Reached target Basic System.
Feb 23 06:36:32 np0005626463.novalocal systemd[4176]: Reached target Main User Target.
Feb 23 06:36:32 np0005626463.novalocal systemd[4176]: Startup finished in 111ms.
Feb 23 06:36:32 np0005626463.novalocal systemd[1]: Started User Manager for UID 1000.
Feb 23 06:36:32 np0005626463.novalocal systemd[1]: Started Session 1 of User zuul.
Feb 23 06:36:32 np0005626463.novalocal sshd[4172]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Feb 23 06:36:33 np0005626463.novalocal python3[4228]: ansible-setup Invoked with gather_subset=['!all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 23 06:36:36 np0005626463.novalocal systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Feb 23 06:36:42 np0005626463.novalocal python3[4248]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 23 06:36:50 np0005626463.novalocal python3[4302]: ansible-setup Invoked with gather_subset=['network'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 23 06:36:51 np0005626463.novalocal python3[4332]: ansible-zuul_console Invoked with path=/tmp/console-{log_uuid}.log port=19885 state=present
Feb 23 06:36:54 np0005626463.novalocal python3[4348]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCaFiAv9bTisS17GN1FZ7h/VJaLu2YTZdiVc9K45WqX/JhZ8Pwx1BXqBJEGlK+qmwEqEak4GGxIE3mEhiQwiY15E3nfJntSoOu8LhTWltcLr5Uy2mx8nXUZwB3QoNiY4y5uXCriTu6HLCXbdwhnWWT8PgX7V31GosHI0JQEpo4ixoShAJC2sFXL7wd3dTaZBo73qVrUhmekv/2GJo179k6wGblVjOwgB9nkfcuo0acLUaot1Uhc7ZrZM3Nfa7bjrW7OigrLvtNra7bBsjfeTgu6vOxxy1DcTD1xBKab631zTIIugiPViMTGrxgsBpyOc4tJpl1grZCJxm8mBDN25oK2jvP/NwUcC3C9ASWEr9U4QiAOTYN03OAyLGbhq48W3SYHwIi8/awDfJapvA+5rlCq/Xb+Fi/KrAaPxVoEehqWLBzCv0u/ZLZarmOih5rcgNTLMQ3l1/nVNtx+VP3eLtqA5a71JqntFfS80adH/3Px+Wen0lIixRqguGNzrD8ZGcU= zuul-build-sshkey manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 23 06:36:55 np0005626463.novalocal python3[4362]: ansible-file Invoked with state=directory path=/home/zuul/.ssh mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 06:36:56 np0005626463.novalocal python3[4421]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 06:36:56 np0005626463.novalocal python3[4462]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1771828616.28678-387-259426370767621/source dest=/home/zuul/.ssh/id_rsa mode=384 force=False _original_basename=75c3b09aedfa4a0eb967a11aba86ff70_id_rsa follow=False checksum=3856428e4c0cdf708f3b02cf6f4769559d121f25 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 06:36:58 np0005626463.novalocal python3[4535]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa.pub follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 06:36:58 np0005626463.novalocal python3[4576]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1771828618.2974393-485-158436035931816/source dest=/home/zuul/.ssh/id_rsa.pub mode=420 force=False _original_basename=75c3b09aedfa4a0eb967a11aba86ff70_id_rsa.pub follow=False checksum=24c5085c987d798738c880bb8143c9f9cd19ae33 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 06:37:17 np0005626463.novalocal chronyd[765]: Selected source 167.160.187.179 (2.rhel.pool.ntp.org)
Feb 23 06:37:37 np0005626463.novalocal python3[4605]: ansible-ping Invoked with data=pong
Feb 23 06:37:39 np0005626463.novalocal python3[4619]: ansible-setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 23 06:37:43 np0005626463.novalocal python3[4672]: ansible-zuul_debug_info Invoked with ipv4_route_required=False ipv6_route_required=False image_manifest_files=['/etc/dib-builddate.txt', '/etc/image-hostname.txt'] image_manifest=None traceroute_host=None
Feb 23 06:37:45 np0005626463.novalocal python3[4694]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 06:37:45 np0005626463.novalocal python3[4708]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 06:37:45 np0005626463.novalocal python3[4722]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 06:37:47 np0005626463.novalocal python3[4736]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 06:37:47 np0005626463.novalocal python3[4750]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 06:37:47 np0005626463.novalocal python3[4764]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 06:37:49 np0005626463.novalocal sudo[4778]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bnuuhnbqmpbmqpyxkzqxzlhkedufezqb ; /usr/bin/python3
Feb 23 06:37:49 np0005626463.novalocal sudo[4778]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 23 06:37:50 np0005626463.novalocal python3[4780]: ansible-file Invoked with path=/etc/ci state=directory owner=root group=root mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 06:37:50 np0005626463.novalocal sudo[4778]: pam_unix(sudo:session): session closed for user root
Feb 23 06:37:51 np0005626463.novalocal sudo[4826]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lhkrihwvxwrqidzeiwhawnqbqilqinau ; /usr/bin/python3
Feb 23 06:37:51 np0005626463.novalocal sudo[4826]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 23 06:37:51 np0005626463.novalocal python3[4828]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/mirror_info.sh follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 06:37:51 np0005626463.novalocal sudo[4826]: pam_unix(sudo:session): session closed for user root
Feb 23 06:37:51 np0005626463.novalocal sudo[4869]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rcofyehfjuckfxjyfzrawunwyeeeflwa ; /usr/bin/python3
Feb 23 06:37:51 np0005626463.novalocal sudo[4869]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 23 06:37:51 np0005626463.novalocal python3[4871]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/mirror_info.sh owner=root group=root mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1771828671.2630305-95-108966421959354/source follow=False _original_basename=mirror_info.sh.j2 checksum=92d92a03afdddee82732741071f662c729080c35 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 06:37:51 np0005626463.novalocal sudo[4869]: pam_unix(sudo:session): session closed for user root
Feb 23 06:37:58 np0005626463.novalocal python3[4899]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAABIwAAAQEA4Z/c9osaGGtU6X8fgELwfj/yayRurfcKA0HMFfdpPxev2dbwljysMuzoVp4OZmW1gvGtyYPSNRvnzgsaabPNKNo2ym5NToCP6UM+KSe93aln4BcM/24mXChYAbXJQ5Bqq/pIzsGs/pKetQN+vwvMxLOwTvpcsCJBXaa981RKML6xj9l/UZ7IIq1HSEKMvPLxZMWdu0Ut8DkCd5F4nOw9Wgml2uYpDCj5LLCrQQ9ChdOMz8hz6SighhNlRpPkvPaet3OXxr/ytFMu7j7vv06CaEnuMMiY2aTWN1Imin9eHAylIqFHta/3gFfQSWt9jXM7owkBLKL7ATzhaAn+fjNupw== arxcruz@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 23 06:37:59 np0005626463.novalocal python3[4913]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQDS4Fn6k4deCnIlOtLWqZJyksbepjQt04j8Ed8CGx9EKkj0fKiAxiI4TadXQYPuNHMixZy4Nevjb6aDhL5Z906TfvNHKUrjrG7G26a0k8vdc61NEQ7FmcGMWRLwwc6ReDO7lFpzYKBMk4YqfWgBuGU/K6WLKiVW2cVvwIuGIaYrE1OiiX0iVUUk7KApXlDJMXn7qjSYynfO4mF629NIp8FJal38+Kv+HA+0QkE5Y2xXnzD4Lar5+keymiCHRntPppXHeLIRzbt0gxC7v3L72hpQ3BTBEzwHpeS8KY+SX1y5lRMN45thCHfJqGmARJREDjBvWG8JXOPmVIKQtZmVcD5b mandreou@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 23 06:37:59 np0005626463.novalocal python3[4927]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC9MiLfy30deHA7xPOAlew5qUq3UP2gmRMYJi8PtkjFB20/DKeWwWNnkZPqP9AayruRoo51SIiVg870gbZE2jYl+Ncx/FYDe56JeC3ySZsXoAVkC9bP7gkOGqOmJjirvAgPMI7bogVz8i+66Q4Ar7OKTp3762G4IuWPPEg4ce4Y7lx9qWocZapHYq4cYKMxrOZ7SEbFSATBbe2bPZAPKTw8do/Eny+Hq/LkHFhIeyra6cqTFQYShr+zPln0Cr+ro/pDX3bB+1ubFgTpjpkkkQsLhDfR6cCdCWM2lgnS3BTtYj5Ct9/JRPR5YOphqZz+uB+OEu2IL68hmU9vNTth1KeX rlandy@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 23 06:37:59 np0005626463.novalocal python3[4941]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIFCbgz8gdERiJlk2IKOtkjQxEXejrio6ZYMJAVJYpOIp raukadah@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 23 06:38:00 np0005626463.novalocal python3[4955]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIBqb3Q/9uDf4LmihQ7xeJ9gA/STIQUFPSfyyV0m8AoQi bshewale@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 23 06:38:00 np0005626463.novalocal python3[4969]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC0I8QqQx0Az2ysJt2JuffucLijhBqnsXKEIx5GyHwxVULROa8VtNFXUDH6ZKZavhiMcmfHB2+TBTda+lDP4FldYj06dGmzCY+IYGa+uDRdxHNGYjvCfLFcmLlzRK6fNbTcui+KlUFUdKe0fb9CRoGKyhlJD5GRkM1Dv+Yb6Bj+RNnmm1fVGYxzmrD2utvffYEb0SZGWxq2R9gefx1q/3wCGjeqvufEV+AskPhVGc5T7t9eyZ4qmslkLh1/nMuaIBFcr9AUACRajsvk6mXrAN1g3HlBf2gQlhi1UEyfbqIQvzzFtsbLDlSum/KmKjy818GzvWjERfQ0VkGzCd9bSLVL dviroel@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 23 06:38:00 np0005626463.novalocal python3[4983]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDLOQd4ZLtkZXQGY6UwAr/06ppWQK4fDO3HaqxPk98csyOCBXsliSKK39Bso828+5srIXiW7aI6aC9P5mwi4mUZlGPfJlQbfrcGvY+b/SocuvaGK+1RrHLoJCT52LBhwgrzlXio2jeksZeein8iaTrhsPrOAs7KggIL/rB9hEiB3NaOPWhhoCP4vlW6MEMExGcqB/1FVxXFBPnLkEyW0Lk7ycVflZl2ocRxbfjZi0+tI1Wlinp8PvSQSc/WVrAcDgKjc/mB4ODPOyYy3G8FHgfMsrXSDEyjBKgLKMsdCrAUcqJQWjkqXleXSYOV4q3pzL+9umK+q/e3P/bIoSFQzmJKTU1eDfuvPXmow9F5H54fii/Da7ezlMJ+wPGHJrRAkmzvMbALy7xwswLhZMkOGNtRcPqaKYRmIBKpw3o6bCTtcNUHOtOQnzwY8JzrM2eBWJBXAANYw+9/ho80JIiwhg29CFNpVBuHbql2YxJQNrnl90guN65rYNpDxdIluweyUf8= anbanerj@kaermorhen manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 23 06:38:00 np0005626463.novalocal python3[4997]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC3VwV8Im9kRm49lt3tM36hj4Zv27FxGo4C1Q/0jqhzFmHY7RHbmeRr8ObhwWoHjXSozKWg8FL5ER0z3hTwL0W6lez3sL7hUaCmSuZmG5Hnl3x4vTSxDI9JZ/Y65rtYiiWQo2fC5xJhU/4+0e5e/pseCm8cKRSu+SaxhO+sd6FDojA2x1BzOzKiQRDy/1zWGp/cZkxcEuB1wHI5LMzN03c67vmbu+fhZRAUO4dQkvcnj2LrhQtpa+ytvnSjr8icMDosf1OsbSffwZFyHB/hfWGAfe0eIeSA2XPraxiPknXxiPKx2MJsaUTYbsZcm3EjFdHBBMumw5rBI74zLrMRvCO9GwBEmGT4rFng1nP+yw5DB8sn2zqpOsPg1LYRwCPOUveC13P6pgsZZPh812e8v5EKnETct+5XI3dVpdw6CnNiLwAyVAF15DJvBGT/u1k0Myg/bQn+Gv9k2MSj6LvQmf6WbZu2Wgjm30z3FyCneBqTL7mLF19YXzeC0ufHz5pnO1E= dasm@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 23 06:38:01 np0005626463.novalocal python3[5011]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIHUnwjB20UKmsSed9X73eGNV5AOEFccQ3NYrRW776pEk cjeanner manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 23 06:38:01 np0005626463.novalocal python3[5025]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDercCMGn8rW1C4P67tHgtflPdTeXlpyUJYH+6XDd2lR jgilaber@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 23 06:38:01 np0005626463.novalocal python3[5039]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIAMI6kkg9Wg0sG7jIJmyZemEBwUn1yzNpQQd3gnulOmZ adrianfuscoarnejo@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 23 06:38:01 np0005626463.novalocal python3[5053]: ansible-authorized_key Invoked with user=zuul state=present key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPijwpQu/3jhhhBZInXNOLEH57DrknPc3PLbsRvYyJIFzwYjX+WD4a7+nGnMYS42MuZk6TJcVqgnqofVx4isoD4= ramishra@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 23 06:38:02 np0005626463.novalocal python3[5067]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAICWBreHW95Wz2Toz5YwCGQwFcUG8oFYkienDh9tntmDc ralfieri@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 23 06:38:02 np0005626463.novalocal python3[5081]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDK0iKdi8jQTpQrDdLVH/AAgLVYyTXF7AQ1gjc/5uT3t ykarel@yatinkarel manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 23 06:38:02 np0005626463.novalocal python3[5095]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIF/V/cLotA6LZeO32VL45Hd78skuA2lJA425Sm2LlQeZ fmount@horcrux manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 23 06:38:02 np0005626463.novalocal python3[5109]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDa7QCjuDMVmRPo1rREbGwzYeBCYVN+Ou/3WKXZEC6Sr manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 23 06:38:03 np0005626463.novalocal python3[5123]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQCfNtF7NvKl915TGsGGoseUb06Hj8L/S4toWf0hExeY+F00woL6NvBlJD0nDct+P5a22I4EhvoQCRQ8reaPCm1lybR3uiRIJsj+8zkVvLwby9LXzfZorlNG9ofjd00FEmB09uW/YvTl6Q9XwwwX6tInzIOv3TMqTHHGOL74ibbj8J/FJR0cFEyj0z4WQRvtkh32xAHl83gbuINryMt0sqRI+clj2381NKL55DRLQrVw0gsfqqxiHAnXg21qWmc4J+b9e9kiuAFQjcjwTVkwJCcg3xbPwC/qokYRby/Y5S40UUd7/jEARGXT7RZgpzTuDd1oZiCVrnrqJNPaMNdVv5MLeFdf1B7iIe5aa/fGouX7AO4SdKhZUdnJmCFAGvjC6S3JMZ2wAcUl+OHnssfmdj7XL50cLo27vjuzMtLAgSqi6N99m92WCF2s8J9aVzszX7Xz9OKZCeGsiVJp3/NdABKzSEAyM9xBD/5Vho894Sav+otpySHe3p6RUTgbB5Zu8VyZRZ/UtB3ueXxyo764yrc6qWIDqrehm84Xm9g+/jpIBzGPl07NUNJpdt/6Sgf9RIKXw/7XypO5yZfUcuFNGTxLfqjTNrtgLZNcjfav6sSdVXVcMPL//XNuRdKmVFaO76eV/oGMQGr1fGcCD+N+CpI7+Q+fCNB6VFWG4nZFuI/Iuw== averdagu@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 23 06:38:03 np0005626463.novalocal python3[5137]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDq8l27xI+QlQVdS4djp9ogSoyrNE2+Ox6vKPdhSNL1J3PE5w+WCSvMz9A5gnNuH810zwbekEApbxTze/gLQJwBHA52CChfURpXrFaxY7ePXRElwKAL3mJfzBWY/c5jnNL9TCVmFJTGZkFZP3Nh+BMgZvL6xBkt3WKm6Uq18qzd9XeKcZusrA+O+uLv1fVeQnadY9RIqOCyeFYCzLWrUfTyE8x/XG0hAWIM7qpnF2cALQS2h9n4hW5ybiUN790H08wf9hFwEf5nxY9Z9dVkPFQiTSGKNBzmnCXU9skxS/xhpFjJ5duGSZdtAHe9O+nGZm9c67hxgtf8e5PDuqAdXEv2cf6e3VBAt+Bz8EKI3yosTj0oZHfwr42Yzb1l/SKy14Rggsrc9KAQlrGXan6+u2jcQqqx7l+SWmnpFiWTV9u5cWj2IgOhApOitmRBPYqk9rE2usfO0hLn/Pj/R/Nau4803e1/EikdLE7Ps95s9mX5jRDjAoUa2JwFF5RsVFyL910= ashigupt@ashigupt.remote.csb manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 23 06:38:03 np0005626463.novalocal python3[5151]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIOKLl0NYKwoZ/JY5KeZU8VwRAggeOxqQJeoqp3dsAaY9 manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 23 06:38:03 np0005626463.novalocal python3[5165]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIASASQOH2BcOyLKuuDOdWZlPi2orcjcA8q4400T73DLH evallesp@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 23 06:38:04 np0005626463.novalocal python3[5179]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILeBWlamUph+jRKV2qrx1PGU7vWuGIt5+z9k96I8WehW amsinha@amsinha-mac manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 23 06:38:04 np0005626463.novalocal python3[5193]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIANvVgvJBlK3gb1yz5uef/JqIGq4HLEmY2dYA8e37swb morenod@redhat-laptop manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 23 06:38:04 np0005626463.novalocal python3[5207]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQDZdI7t1cxYx65heVI24HTV4F7oQLW1zyfxHreL2TIJKxjyrUUKIFEUmTutcBlJRLNT2Eoix6x1sOw9YrchloCLcn//SGfTElr9mSc5jbjb7QXEU+zJMhtxyEJ1Po3CUGnj7ckiIXw7wcawZtrEOAQ9pH3ExYCJcEMiyNjRQZCxT3tPK+S4B95EWh5Fsrz9CkwpjNRPPH7LigCeQTM3Wc7r97utAslBUUvYceDSLA7rMgkitJE38b7rZBeYzsGQ8YYUBjTCtehqQXxCRjizbHWaaZkBU+N3zkKB6n/iCNGIO690NK7A/qb6msTijiz1PeuM8ThOsi9qXnbX5v0PoTpcFSojV7NHAQ71f0XXuS43FhZctT+Dcx44dT8Fb5vJu2cJGrk+qF8ZgJYNpRS7gPg0EG2EqjK7JMf9ULdjSu0r+KlqIAyLvtzT4eOnQipoKlb/WG5D/0ohKv7OMQ352ggfkBFIQsRXyyTCT98Ft9juqPuahi3CAQmP4H9dyE+7+Kz437PEtsxLmfm6naNmWi7Ee1DqWPwS8rEajsm4sNM4wW9gdBboJQtc0uZw0DfLj1I9r3Mc8Ol0jYtz0yNQDSzVLrGCaJlC311trU70tZ+ZkAVV6Mn8lOhSbj1cK0lvSr6ZK4dgqGl3I1eTZJJhbLNdg7UOVaiRx9543+C/p/As7w== brjackma@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 23 06:38:04 np0005626463.novalocal python3[5221]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKwedoZ0TWPJX/z/4TAbO/kKcDZOQVgRH0hAqrL5UCI1 vcastell@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 23 06:38:05 np0005626463.novalocal python3[5235]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIEmv8sE8GCk6ZTPIqF0FQrttBdL3mq7rCm/IJy0xDFh7 michburk@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 23 06:38:05 np0005626463.novalocal python3[5249]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAICy6GpGEtwevXEEn4mmLR5lmSLe23dGgAvzkB9DMNbkf rsafrono@rsafrono manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 23 06:38:07 np0005626463.novalocal sudo[5264]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-esgkccmixamoucbwbhpjhogqclglaesu ; /usr/bin/python3
Feb 23 06:38:07 np0005626463.novalocal sudo[5264]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 23 06:38:07 np0005626463.novalocal python3[5266]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Feb 23 06:38:07 np0005626463.novalocal systemd[1]: Starting Time & Date Service...
Feb 23 06:38:07 np0005626463.novalocal systemd[1]: Started Time & Date Service.
Feb 23 06:38:07 np0005626463.novalocal systemd-timedated[5268]: Changed time zone to 'UTC' (UTC).
Feb 23 06:38:07 np0005626463.novalocal sudo[5264]: pam_unix(sudo:session): session closed for user root
Feb 23 06:38:08 np0005626463.novalocal sudo[5285]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ikjenyospecfefckncppeftxwogfpkio ; /usr/bin/python3
Feb 23 06:38:08 np0005626463.novalocal sudo[5285]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 23 06:38:09 np0005626463.novalocal python3[5287]: ansible-file Invoked with path=/etc/nodepool state=directory mode=511 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 06:38:09 np0005626463.novalocal sudo[5285]: pam_unix(sudo:session): session closed for user root
Feb 23 06:38:10 np0005626463.novalocal python3[5333]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 06:38:10 np0005626463.novalocal python3[5374]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes src=/home/zuul/.ansible/tmp/ansible-tmp-1771828689.974417-491-24273867821658/source _original_basename=tmp27944crw follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 06:38:11 np0005626463.novalocal python3[5434]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 06:38:11 np0005626463.novalocal python3[5475]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes_private src=/home/zuul/.ansible/tmp/ansible-tmp-1771828691.467247-584-83037043803092/source _original_basename=tmp_9pjbo_8 follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 06:38:13 np0005626463.novalocal sudo[5535]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ucwffhyihjroabirxqwcnpmtreowpuhv ; /usr/bin/python3
Feb 23 06:38:13 np0005626463.novalocal sudo[5535]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 23 06:38:13 np0005626463.novalocal python3[5537]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/node_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 06:38:13 np0005626463.novalocal sudo[5535]: pam_unix(sudo:session): session closed for user root
Feb 23 06:38:14 np0005626463.novalocal sudo[5578]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oklnmpbydssnxodfpcdvjcmxcuxxrjvu ; /usr/bin/python3
Feb 23 06:38:14 np0005626463.novalocal sudo[5578]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 23 06:38:14 np0005626463.novalocal python3[5580]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/node_private src=/home/zuul/.ansible/tmp/ansible-tmp-1771828693.6696973-723-110828000767333/source _original_basename=tmpdlis6age follow=False checksum=9313104c4584898a1afe992edc322b557e0f1f28 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 06:38:14 np0005626463.novalocal sudo[5578]: pam_unix(sudo:session): session closed for user root
Feb 23 06:38:15 np0005626463.novalocal python3[5608]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa /etc/nodepool/id_rsa zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 06:38:15 np0005626463.novalocal python3[5624]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa.pub /etc/nodepool/id_rsa.pub zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 06:38:16 np0005626463.novalocal sudo[5672]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-erzmqjgmbwaadhfoucpzjctoltljwnlf ; /usr/bin/python3
Feb 23 06:38:16 np0005626463.novalocal sudo[5672]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 23 06:38:16 np0005626463.novalocal python3[5674]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/zuul-sudo-grep follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 06:38:16 np0005626463.novalocal sudo[5672]: pam_unix(sudo:session): session closed for user root
Feb 23 06:38:16 np0005626463.novalocal sudo[5715]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ynwsmjbalpaoajiozdfgduycfvgvizpj ; /usr/bin/python3
Feb 23 06:38:16 np0005626463.novalocal sudo[5715]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 23 06:38:17 np0005626463.novalocal python3[5717]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/zuul-sudo-grep mode=288 src=/home/zuul/.ansible/tmp/ansible-tmp-1771828696.5313103-852-14575786188836/source _original_basename=tmpi06nmjml follow=False checksum=bdca1a77493d00fb51567671791f4aa30f66c2f0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 06:38:17 np0005626463.novalocal sudo[5715]: pam_unix(sudo:session): session closed for user root
Feb 23 06:38:18 np0005626463.novalocal sudo[5746]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-voxycqtoditfjxcznfhpabzsemvgtlav ; /usr/bin/python3
Feb 23 06:38:18 np0005626463.novalocal sudo[5746]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 23 06:38:18 np0005626463.novalocal python3[5748]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/visudo -c zuul_log_id=fa163ef9-e89a-16c2-0802-000000000023-1-overcloudnovacompute0 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 06:38:18 np0005626463.novalocal sudo[5746]: pam_unix(sudo:session): session closed for user root
Feb 23 06:38:29 np0005626463.novalocal python3[5766]: ansible-ansible.legacy.command Invoked with executable=/bin/bash _raw_params=env _uses_shell=True zuul_log_id=fa163ef9-e89a-16c2-0802-000000000024-1-overcloudnovacompute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None creates=None removes=None stdin=None
Feb 23 06:38:37 np0005626463.novalocal systemd[1]: systemd-timedated.service: Deactivated successfully.
Feb 23 06:38:41 np0005626463.novalocal python3[5786]: ansible-file Invoked with path=/home/zuul/workspace state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 06:38:47 np0005626463.novalocal systemd[4176]: Starting Mark boot as successful...
Feb 23 06:38:47 np0005626463.novalocal systemd[4176]: Finished Mark boot as successful.
Feb 23 06:39:00 np0005626463.novalocal sudo[5802]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-keiogcgipplakscztzjjzgnokypkhsdx ; /usr/bin/python3
Feb 23 06:39:00 np0005626463.novalocal sudo[5802]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 23 06:39:00 np0005626463.novalocal python3[5804]: ansible-ansible.builtin.file Invoked with path=/etc/ci/env state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 06:39:00 np0005626463.novalocal sudo[5802]: pam_unix(sudo:session): session closed for user root
Feb 23 06:40:00 np0005626463.novalocal sshd[4185]: Received disconnect from 38.102.83.114 port 59642:11: disconnected by user
Feb 23 06:40:00 np0005626463.novalocal sshd[4185]: Disconnected from user zuul 38.102.83.114 port 59642
Feb 23 06:40:00 np0005626463.novalocal sshd[4172]: pam_unix(sshd:session): session closed for user zuul
Feb 23 06:40:00 np0005626463.novalocal systemd-logind[759]: Session 1 logged out. Waiting for processes to exit.
Feb 23 06:40:07 np0005626463.novalocal sshd[5807]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 06:40:07 np0005626463.novalocal sshd[5807]: error: kex_exchange_identification: Connection closed by remote host
Feb 23 06:40:07 np0005626463.novalocal sshd[5807]: Connection closed by 178.128.89.99 port 51904
Feb 23 06:40:46 np0005626463.novalocal systemd[1]: Unmounting EFI System Partition Automount...
Feb 23 06:40:46 np0005626463.novalocal systemd[1]: efi.mount: Deactivated successfully.
Feb 23 06:40:46 np0005626463.novalocal systemd[1]: Unmounted EFI System Partition Automount.
Feb 23 06:41:47 np0005626463.novalocal systemd[4176]: Created slice User Background Tasks Slice.
Feb 23 06:41:47 np0005626463.novalocal systemd[4176]: Starting Cleanup of User's Temporary Files and Directories...
Feb 23 06:41:47 np0005626463.novalocal systemd[4176]: Finished Cleanup of User's Temporary Files and Directories.
Feb 23 06:42:08 np0005626463.novalocal kernel: pci 0000:00:07.0: [1af4:1000] type 00 class 0x020000
Feb 23 06:42:08 np0005626463.novalocal kernel: pci 0000:00:07.0: reg 0x10: [io  0x0000-0x003f]
Feb 23 06:42:08 np0005626463.novalocal kernel: pci 0000:00:07.0: reg 0x14: [mem 0x00000000-0x00000fff]
Feb 23 06:42:08 np0005626463.novalocal kernel: pci 0000:00:07.0: reg 0x20: [mem 0x00000000-0x00003fff 64bit pref]
Feb 23 06:42:08 np0005626463.novalocal kernel: pci 0000:00:07.0: reg 0x30: [mem 0x00000000-0x0007ffff pref]
Feb 23 06:42:08 np0005626463.novalocal kernel: pci 0000:00:07.0: BAR 6: assigned [mem 0xc0000000-0xc007ffff pref]
Feb 23 06:42:08 np0005626463.novalocal kernel: pci 0000:00:07.0: BAR 4: assigned [mem 0x440000000-0x440003fff 64bit pref]
Feb 23 06:42:08 np0005626463.novalocal kernel: pci 0000:00:07.0: BAR 1: assigned [mem 0xc0080000-0xc0080fff]
Feb 23 06:42:08 np0005626463.novalocal kernel: pci 0000:00:07.0: BAR 0: assigned [io  0x1000-0x103f]
Feb 23 06:42:08 np0005626463.novalocal kernel: virtio-pci 0000:00:07.0: enabling device (0000 -> 0003)
Feb 23 06:42:08 np0005626463.novalocal NetworkManager[788]: <info>  [1771828928.6815] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Feb 23 06:42:08 np0005626463.novalocal systemd-udevd[5813]: Network interface NamePolicy= disabled on kernel command line.
Feb 23 06:42:08 np0005626463.novalocal NetworkManager[788]: <info>  [1771828928.6968] device (eth1): state change: unmanaged -> unavailable (reason 'managed', sys-iface-state: 'external')
Feb 23 06:42:08 np0005626463.novalocal NetworkManager[788]: <info>  [1771828928.7007] settings: (eth1): created default wired connection 'Wired connection 1'
Feb 23 06:42:08 np0005626463.novalocal NetworkManager[788]: <info>  [1771828928.7014] device (eth1): carrier: link connected
Feb 23 06:42:08 np0005626463.novalocal NetworkManager[788]: <info>  [1771828928.7017] device (eth1): state change: unavailable -> disconnected (reason 'carrier-changed', sys-iface-state: 'managed')
Feb 23 06:42:08 np0005626463.novalocal NetworkManager[788]: <info>  [1771828928.7023] policy: auto-activating connection 'Wired connection 1' (14875e01-091c-3944-aefd-45256309e1cb)
Feb 23 06:42:08 np0005626463.novalocal NetworkManager[788]: <info>  [1771828928.7030] device (eth1): Activation: starting connection 'Wired connection 1' (14875e01-091c-3944-aefd-45256309e1cb)
Feb 23 06:42:08 np0005626463.novalocal NetworkManager[788]: <info>  [1771828928.7031] device (eth1): state change: disconnected -> prepare (reason 'none', sys-iface-state: 'managed')
Feb 23 06:42:08 np0005626463.novalocal NetworkManager[788]: <info>  [1771828928.7036] device (eth1): state change: prepare -> config (reason 'none', sys-iface-state: 'managed')
Feb 23 06:42:08 np0005626463.novalocal NetworkManager[788]: <info>  [1771828928.7042] device (eth1): state change: config -> ip-config (reason 'none', sys-iface-state: 'managed')
Feb 23 06:42:08 np0005626463.novalocal NetworkManager[788]: <info>  [1771828928.7048] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Feb 23 06:42:09 np0005626463.novalocal kernel: IPv6: ADDRCONF(NETDEV_CHANGE): eth1: link becomes ready
Feb 23 06:42:10 np0005626463.novalocal sshd[5816]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 06:42:10 np0005626463.novalocal sshd[5816]: Accepted publickey for zuul from 38.102.83.114 port 48796 ssh2: RSA SHA256:/ShS2J5Dq7o9P59e/NmgQORSAcJOBwu46Huo03HBdB4
Feb 23 06:42:10 np0005626463.novalocal systemd[1]: Started Session 3 of User zuul.
Feb 23 06:42:10 np0005626463.novalocal systemd-logind[759]: New session 3 of user zuul.
Feb 23 06:42:10 np0005626463.novalocal sshd[5816]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Feb 23 06:42:10 np0005626463.novalocal python3[5833]: ansible-ansible.legacy.command Invoked with _raw_params=ip -j link zuul_log_id=fa163ef9-e89a-116e-582b-00000000039b-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 06:42:23 np0005626463.novalocal sudo[5881]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-roojdpmdpuzchdoifbmgtkbnzvvmkxns ; OS_CLOUD=vexxhost /usr/bin/python3
Feb 23 06:42:23 np0005626463.novalocal sudo[5881]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 23 06:42:23 np0005626463.novalocal python3[5883]: ansible-ansible.legacy.stat Invoked with path=/etc/NetworkManager/system-connections/ci-private-network.nmconnection follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 06:42:23 np0005626463.novalocal sudo[5881]: pam_unix(sudo:session): session closed for user root
Feb 23 06:42:23 np0005626463.novalocal sudo[5924]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-heuhgyvzcfvwsyngshxqpzjzuypsgyck ; OS_CLOUD=vexxhost /usr/bin/python3
Feb 23 06:42:23 np0005626463.novalocal sudo[5924]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 23 06:42:24 np0005626463.novalocal python3[5926]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1771828943.449911-435-249364340139678/source dest=/etc/NetworkManager/system-connections/ci-private-network.nmconnection mode=0600 owner=root group=root follow=False _original_basename=bootstrap-ci-network-nm-connection.nmconnection.j2 checksum=9dbfb9b07f02d8db06baa922059ec27b6663d592 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 06:42:24 np0005626463.novalocal sudo[5924]: pam_unix(sudo:session): session closed for user root
Feb 23 06:42:24 np0005626463.novalocal sudo[5954]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fkogkonhnioqfgsgtzwrmboychtvhabq ; OS_CLOUD=vexxhost /usr/bin/python3
Feb 23 06:42:24 np0005626463.novalocal sudo[5954]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 23 06:42:24 np0005626463.novalocal python3[5956]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 23 06:42:24 np0005626463.novalocal systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Feb 23 06:42:24 np0005626463.novalocal systemd[1]: Stopped Network Manager Wait Online.
Feb 23 06:42:24 np0005626463.novalocal systemd[1]: Stopping Network Manager Wait Online...
Feb 23 06:42:24 np0005626463.novalocal systemd[1]: Stopping Network Manager...
Feb 23 06:42:24 np0005626463.novalocal NetworkManager[788]: <info>  [1771828944.7055] caught SIGTERM, shutting down normally.
Feb 23 06:42:24 np0005626463.novalocal NetworkManager[788]: <info>  [1771828944.7169] dhcp4 (eth0): canceled DHCP transaction
Feb 23 06:42:24 np0005626463.novalocal NetworkManager[788]: <info>  [1771828944.7170] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Feb 23 06:42:24 np0005626463.novalocal NetworkManager[788]: <info>  [1771828944.7170] dhcp4 (eth0): state changed no lease
Feb 23 06:42:24 np0005626463.novalocal NetworkManager[788]: <info>  [1771828944.7174] manager: NetworkManager state is now CONNECTING
Feb 23 06:42:24 np0005626463.novalocal NetworkManager[788]: <info>  [1771828944.7229] dhcp4 (eth1): canceled DHCP transaction
Feb 23 06:42:24 np0005626463.novalocal NetworkManager[788]: <info>  [1771828944.7230] dhcp4 (eth1): state changed no lease
Feb 23 06:42:24 np0005626463.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Feb 23 06:42:24 np0005626463.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Feb 23 06:42:24 np0005626463.novalocal NetworkManager[788]: <info>  [1771828944.7491] exiting (success)
Feb 23 06:42:24 np0005626463.novalocal systemd[1]: NetworkManager.service: Deactivated successfully.
Feb 23 06:42:24 np0005626463.novalocal systemd[1]: Stopped Network Manager.
Feb 23 06:42:24 np0005626463.novalocal systemd[1]: NetworkManager.service: Consumed 1.839s CPU time.
Feb 23 06:42:24 np0005626463.novalocal systemd[1]: Starting Network Manager...
Feb 23 06:42:24 np0005626463.novalocal NetworkManager[5974]: <info>  [1771828944.7922] NetworkManager (version 1.42.2-1.el9) is starting... (after a restart, boot:7e1679c6-ea6b-4cb0-813d-ca6f65e53cae)
Feb 23 06:42:24 np0005626463.novalocal NetworkManager[5974]: <info>  [1771828944.7924] Read config: /etc/NetworkManager/NetworkManager.conf (run: 15-carrier-timeout.conf)
Feb 23 06:42:24 np0005626463.novalocal NetworkManager[5974]: <info>  [1771828944.7942] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Feb 23 06:42:24 np0005626463.novalocal systemd[1]: Started Network Manager.
Feb 23 06:42:24 np0005626463.novalocal systemd[1]: Starting Network Manager Wait Online...
Feb 23 06:42:24 np0005626463.novalocal NetworkManager[5974]: <info>  [1771828944.8006] manager[0x5640e0c73090]: monitoring kernel firmware directory '/lib/firmware'.
Feb 23 06:42:24 np0005626463.novalocal systemd[1]: Starting Hostname Service...
Feb 23 06:42:24 np0005626463.novalocal sudo[5954]: pam_unix(sudo:session): session closed for user root
Feb 23 06:42:24 np0005626463.novalocal systemd[1]: Started Hostname Service.
Feb 23 06:42:24 np0005626463.novalocal NetworkManager[5974]: <info>  [1771828944.8804] hostname: hostname: using hostnamed
Feb 23 06:42:24 np0005626463.novalocal NetworkManager[5974]: <info>  [1771828944.8804] hostname: static hostname changed from (none) to "np0005626463.novalocal"
Feb 23 06:42:24 np0005626463.novalocal NetworkManager[5974]: <info>  [1771828944.8812] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Feb 23 06:42:24 np0005626463.novalocal NetworkManager[5974]: <info>  [1771828944.8819] manager[0x5640e0c73090]: rfkill: Wi-Fi hardware radio set enabled
Feb 23 06:42:24 np0005626463.novalocal NetworkManager[5974]: <info>  [1771828944.8819] manager[0x5640e0c73090]: rfkill: WWAN hardware radio set enabled
Feb 23 06:42:24 np0005626463.novalocal NetworkManager[5974]: <info>  [1771828944.8861] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.42.2-1.el9/libnm-device-plugin-team.so)
Feb 23 06:42:24 np0005626463.novalocal NetworkManager[5974]: <info>  [1771828944.8862] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Feb 23 06:42:24 np0005626463.novalocal NetworkManager[5974]: <info>  [1771828944.8863] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Feb 23 06:42:24 np0005626463.novalocal NetworkManager[5974]: <info>  [1771828944.8864] manager: Networking is enabled by state file
Feb 23 06:42:24 np0005626463.novalocal NetworkManager[5974]: <info>  [1771828944.8873] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.42.2-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Feb 23 06:42:24 np0005626463.novalocal NetworkManager[5974]: <info>  [1771828944.8873] settings: Loaded settings plugin: keyfile (internal)
Feb 23 06:42:24 np0005626463.novalocal NetworkManager[5974]: <info>  [1771828944.8934] dhcp: init: Using DHCP client 'internal'
Feb 23 06:42:24 np0005626463.novalocal NetworkManager[5974]: <info>  [1771828944.8938] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Feb 23 06:42:24 np0005626463.novalocal NetworkManager[5974]: <info>  [1771828944.8946] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', sys-iface-state: 'external')
Feb 23 06:42:24 np0005626463.novalocal NetworkManager[5974]: <info>  [1771828944.8954] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', sys-iface-state: 'external')
Feb 23 06:42:24 np0005626463.novalocal NetworkManager[5974]: <info>  [1771828944.8967] device (lo): Activation: starting connection 'lo' (8bdfeccc-b3ac-4c33-8351-8677ac367e4c)
Feb 23 06:42:24 np0005626463.novalocal NetworkManager[5974]: <info>  [1771828944.8977] device (eth0): carrier: link connected
Feb 23 06:42:24 np0005626463.novalocal NetworkManager[5974]: <info>  [1771828944.8983] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Feb 23 06:42:24 np0005626463.novalocal NetworkManager[5974]: <info>  [1771828944.8991] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Feb 23 06:42:24 np0005626463.novalocal NetworkManager[5974]: <info>  [1771828944.8992] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', sys-iface-state: 'assume')
Feb 23 06:42:24 np0005626463.novalocal NetworkManager[5974]: <info>  [1771828944.9001] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', sys-iface-state: 'assume')
Feb 23 06:42:24 np0005626463.novalocal NetworkManager[5974]: <info>  [1771828944.9012] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Feb 23 06:42:24 np0005626463.novalocal NetworkManager[5974]: <info>  [1771828944.9020] device (eth1): carrier: link connected
Feb 23 06:42:24 np0005626463.novalocal NetworkManager[5974]: <info>  [1771828944.9026] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Feb 23 06:42:24 np0005626463.novalocal NetworkManager[5974]: <info>  [1771828944.9034] manager: (eth1): assume: will attempt to assume matching connection 'Wired connection 1' (14875e01-091c-3944-aefd-45256309e1cb) (indicated)
Feb 23 06:42:24 np0005626463.novalocal NetworkManager[5974]: <info>  [1771828944.9034] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', sys-iface-state: 'assume')
Feb 23 06:42:24 np0005626463.novalocal NetworkManager[5974]: <info>  [1771828944.9041] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', sys-iface-state: 'assume')
Feb 23 06:42:24 np0005626463.novalocal NetworkManager[5974]: <info>  [1771828944.9052] device (eth1): Activation: starting connection 'Wired connection 1' (14875e01-091c-3944-aefd-45256309e1cb)
Feb 23 06:42:24 np0005626463.novalocal NetworkManager[5974]: <info>  [1771828944.9080] device (lo): state change: disconnected -> prepare (reason 'none', sys-iface-state: 'external')
Feb 23 06:42:24 np0005626463.novalocal NetworkManager[5974]: <info>  [1771828944.9099] device (lo): state change: prepare -> config (reason 'none', sys-iface-state: 'external')
Feb 23 06:42:24 np0005626463.novalocal NetworkManager[5974]: <info>  [1771828944.9101] device (lo): state change: config -> ip-config (reason 'none', sys-iface-state: 'external')
Feb 23 06:42:24 np0005626463.novalocal NetworkManager[5974]: <info>  [1771828944.9104] device (eth0): state change: disconnected -> prepare (reason 'none', sys-iface-state: 'assume')
Feb 23 06:42:24 np0005626463.novalocal NetworkManager[5974]: <info>  [1771828944.9110] device (eth0): state change: prepare -> config (reason 'none', sys-iface-state: 'assume')
Feb 23 06:42:24 np0005626463.novalocal NetworkManager[5974]: <info>  [1771828944.9112] device (eth1): state change: disconnected -> prepare (reason 'none', sys-iface-state: 'assume')
Feb 23 06:42:24 np0005626463.novalocal NetworkManager[5974]: <info>  [1771828944.9115] device (eth1): state change: prepare -> config (reason 'none', sys-iface-state: 'assume')
Feb 23 06:42:24 np0005626463.novalocal NetworkManager[5974]: <info>  [1771828944.9118] device (lo): state change: ip-config -> ip-check (reason 'none', sys-iface-state: 'external')
Feb 23 06:42:24 np0005626463.novalocal NetworkManager[5974]: <info>  [1771828944.9125] device (eth0): state change: config -> ip-config (reason 'none', sys-iface-state: 'assume')
Feb 23 06:42:24 np0005626463.novalocal NetworkManager[5974]: <info>  [1771828944.9128] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Feb 23 06:42:24 np0005626463.novalocal NetworkManager[5974]: <info>  [1771828944.9140] device (eth1): state change: config -> ip-config (reason 'none', sys-iface-state: 'assume')
Feb 23 06:42:24 np0005626463.novalocal NetworkManager[5974]: <info>  [1771828944.9142] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Feb 23 06:42:24 np0005626463.novalocal NetworkManager[5974]: <info>  [1771828944.9179] device (lo): state change: ip-check -> secondaries (reason 'none', sys-iface-state: 'external')
Feb 23 06:42:24 np0005626463.novalocal NetworkManager[5974]: <info>  [1771828944.9185] device (lo): state change: secondaries -> activated (reason 'none', sys-iface-state: 'external')
Feb 23 06:42:24 np0005626463.novalocal NetworkManager[5974]: <info>  [1771828944.9196] device (lo): Activation: successful, device activated.
Feb 23 06:42:24 np0005626463.novalocal NetworkManager[5974]: <info>  [1771828944.9204] dhcp4 (eth0): state changed new lease, address=38.102.83.164
Feb 23 06:42:24 np0005626463.novalocal NetworkManager[5974]: <info>  [1771828944.9209] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Feb 23 06:42:24 np0005626463.novalocal NetworkManager[5974]: <info>  [1771828944.9360] device (eth0): state change: ip-config -> ip-check (reason 'none', sys-iface-state: 'assume')
Feb 23 06:42:24 np0005626463.novalocal NetworkManager[5974]: <info>  [1771828944.9412] device (eth0): state change: ip-check -> secondaries (reason 'none', sys-iface-state: 'assume')
Feb 23 06:42:24 np0005626463.novalocal NetworkManager[5974]: <info>  [1771828944.9416] device (eth0): state change: secondaries -> activated (reason 'none', sys-iface-state: 'assume')
Feb 23 06:42:24 np0005626463.novalocal NetworkManager[5974]: <info>  [1771828944.9423] manager: NetworkManager state is now CONNECTED_SITE
Feb 23 06:42:24 np0005626463.novalocal NetworkManager[5974]: <info>  [1771828944.9427] device (eth0): Activation: successful, device activated.
Feb 23 06:42:24 np0005626463.novalocal NetworkManager[5974]: <info>  [1771828944.9437] manager: NetworkManager state is now CONNECTED_GLOBAL
Feb 23 06:42:25 np0005626463.novalocal python3[6018]: ansible-ansible.legacy.command Invoked with _raw_params=ip route zuul_log_id=fa163ef9-e89a-116e-582b-000000000120-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 06:42:35 np0005626463.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Feb 23 06:42:54 np0005626463.novalocal systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Feb 23 06:43:09 np0005626463.novalocal NetworkManager[5974]: <info>  [1771828989.8108] device (eth1): state change: ip-config -> ip-check (reason 'none', sys-iface-state: 'assume')
Feb 23 06:43:09 np0005626463.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Feb 23 06:43:09 np0005626463.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Feb 23 06:43:09 np0005626463.novalocal NetworkManager[5974]: <info>  [1771828989.8309] device (eth1): state change: ip-check -> secondaries (reason 'none', sys-iface-state: 'assume')
Feb 23 06:43:09 np0005626463.novalocal NetworkManager[5974]: <info>  [1771828989.8313] device (eth1): state change: secondaries -> activated (reason 'none', sys-iface-state: 'assume')
Feb 23 06:43:09 np0005626463.novalocal NetworkManager[5974]: <info>  [1771828989.8323] device (eth1): Activation: successful, device activated.
Feb 23 06:43:09 np0005626463.novalocal NetworkManager[5974]: <info>  [1771828989.8332] manager: startup complete
Feb 23 06:43:09 np0005626463.novalocal systemd[1]: Finished Network Manager Wait Online.
Feb 23 06:43:19 np0005626463.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Feb 23 06:43:25 np0005626463.novalocal sshd[5819]: Received disconnect from 38.102.83.114 port 48796:11: disconnected by user
Feb 23 06:43:25 np0005626463.novalocal sshd[5819]: Disconnected from user zuul 38.102.83.114 port 48796
Feb 23 06:43:25 np0005626463.novalocal sshd[5816]: pam_unix(sshd:session): session closed for user zuul
Feb 23 06:43:25 np0005626463.novalocal systemd[1]: session-3.scope: Deactivated successfully.
Feb 23 06:43:25 np0005626463.novalocal systemd[1]: session-3.scope: Consumed 1.457s CPU time.
Feb 23 06:43:25 np0005626463.novalocal systemd-logind[759]: Session 3 logged out. Waiting for processes to exit.
Feb 23 06:43:25 np0005626463.novalocal systemd-logind[759]: Removed session 3.
Feb 23 06:43:44 np0005626463.novalocal chronyd[765]: Selected source 23.159.16.194 (2.rhel.pool.ntp.org)
Feb 23 06:44:15 np0005626463.novalocal sshd[6057]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 06:44:16 np0005626463.novalocal sshd[6057]: Connection closed by authenticating user sshd 185.156.73.233 port 60638 [preauth]
Feb 23 06:44:32 np0005626463.novalocal sshd[6059]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 06:44:32 np0005626463.novalocal sshd[6059]: Accepted publickey for zuul from 38.102.83.114 port 45474 ssh2: RSA SHA256:/ShS2J5Dq7o9P59e/NmgQORSAcJOBwu46Huo03HBdB4
Feb 23 06:44:32 np0005626463.novalocal systemd-logind[759]: New session 4 of user zuul.
Feb 23 06:44:32 np0005626463.novalocal systemd[1]: Started Session 4 of User zuul.
Feb 23 06:44:32 np0005626463.novalocal sshd[6059]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Feb 23 06:44:32 np0005626463.novalocal sudo[6108]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aruqogdijghzojlxwyznhjvaawpobisj ; OS_CLOUD=vexxhost /usr/bin/python3
Feb 23 06:44:32 np0005626463.novalocal sudo[6108]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 23 06:44:32 np0005626463.novalocal python3[6110]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/env/networking-info.yml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 06:44:32 np0005626463.novalocal sudo[6108]: pam_unix(sudo:session): session closed for user root
Feb 23 06:44:32 np0005626463.novalocal sudo[6151]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tdbuvpxzymnayhfstrdfgajuodbjndlt ; OS_CLOUD=vexxhost /usr/bin/python3
Feb 23 06:44:32 np0005626463.novalocal sudo[6151]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 23 06:44:32 np0005626463.novalocal python3[6153]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/env/networking-info.yml owner=root group=root mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771829072.3180947-628-278240485659165/source _original_basename=tmp9nrsig8v follow=False checksum=393f60ce964bed22379b4d5935087c828e1455a0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 06:44:32 np0005626463.novalocal sudo[6151]: pam_unix(sudo:session): session closed for user root
Feb 23 06:44:37 np0005626463.novalocal sshd[6059]: pam_unix(sshd:session): session closed for user zuul
Feb 23 06:44:37 np0005626463.novalocal systemd[1]: session-4.scope: Deactivated successfully.
Feb 23 06:44:37 np0005626463.novalocal systemd-logind[759]: Session 4 logged out. Waiting for processes to exit.
Feb 23 06:44:37 np0005626463.novalocal systemd-logind[759]: Removed session 4.
Feb 23 06:48:40 np0005626463.novalocal sshd[6170]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 06:48:40 np0005626463.novalocal sshd[6170]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 06:49:42 np0005626463.novalocal sshd[6172]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 06:49:43 np0005626463.novalocal sshd[6172]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 06:50:43 np0005626463.novalocal sshd[6175]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 06:50:43 np0005626463.novalocal sshd[6175]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 06:51:03 np0005626463.novalocal sshd[6179]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 06:51:03 np0005626463.novalocal systemd[1]: Starting Cleanup of Temporary Directories...
Feb 23 06:51:03 np0005626463.novalocal systemd[1]: systemd-tmpfiles-clean.service: Deactivated successfully.
Feb 23 06:51:03 np0005626463.novalocal systemd[1]: Finished Cleanup of Temporary Directories.
Feb 23 06:51:03 np0005626463.novalocal systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dclean.service.mount: Deactivated successfully.
Feb 23 06:51:03 np0005626463.novalocal sshd[6179]: Accepted publickey for zuul from 38.102.83.114 port 39676 ssh2: RSA SHA256:/ShS2J5Dq7o9P59e/NmgQORSAcJOBwu46Huo03HBdB4
Feb 23 06:51:03 np0005626463.novalocal systemd-logind[759]: New session 5 of user zuul.
Feb 23 06:51:03 np0005626463.novalocal systemd[1]: Started Session 5 of User zuul.
Feb 23 06:51:03 np0005626463.novalocal sshd[6179]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Feb 23 06:51:03 np0005626463.novalocal sudo[6198]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vebuaidtbxqngypmbfafohxmcmjqjztp ; /usr/bin/python3
Feb 23 06:51:03 np0005626463.novalocal sudo[6198]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 23 06:51:03 np0005626463.novalocal python3[6200]: ansible-ansible.legacy.command Invoked with _raw_params=lsblk -nd -o MAJ:MIN /dev/vda
                                                       _uses_shell=True zuul_log_id=fa163ef9-e89a-8ad4-7d7f-00000000219f-1-overcloudnovacompute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 06:51:03 np0005626463.novalocal sudo[6198]: pam_unix(sudo:session): session closed for user root
Feb 23 06:51:15 np0005626463.novalocal sudo[6217]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ymcbwexwnnoyacmwtccedphqngbdzbmy ; /usr/bin/python3
Feb 23 06:51:15 np0005626463.novalocal sudo[6217]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 23 06:51:15 np0005626463.novalocal python3[6219]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/init.scope state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 06:51:15 np0005626463.novalocal sudo[6217]: pam_unix(sudo:session): session closed for user root
Feb 23 06:51:15 np0005626463.novalocal sudo[6233]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kklcsvitfcbnlovftlfydskiuptnseri ; /usr/bin/python3
Feb 23 06:51:15 np0005626463.novalocal sudo[6233]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 23 06:51:15 np0005626463.novalocal python3[6235]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/machine.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 06:51:15 np0005626463.novalocal sudo[6233]: pam_unix(sudo:session): session closed for user root
Feb 23 06:51:15 np0005626463.novalocal sudo[6249]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eluwusmlvenlskphzegiobwydoaktrhe ; /usr/bin/python3
Feb 23 06:51:15 np0005626463.novalocal sudo[6249]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 23 06:51:15 np0005626463.novalocal python3[6251]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/system.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 06:51:15 np0005626463.novalocal sudo[6249]: pam_unix(sudo:session): session closed for user root
Feb 23 06:51:15 np0005626463.novalocal sudo[6265]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-elliuglfwksznbqyzynrapijnepcnriy ; /usr/bin/python3
Feb 23 06:51:15 np0005626463.novalocal sudo[6265]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 23 06:51:16 np0005626463.novalocal python3[6267]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/user.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 06:51:16 np0005626463.novalocal sudo[6265]: pam_unix(sudo:session): session closed for user root
Feb 23 06:51:16 np0005626463.novalocal sudo[6281]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fxovsejfebcgzygepifhyfayckfzfuho ; /usr/bin/python3
Feb 23 06:51:16 np0005626463.novalocal sudo[6281]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 23 06:51:16 np0005626463.novalocal python3[6283]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system.conf.d state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 06:51:16 np0005626463.novalocal sudo[6281]: pam_unix(sudo:session): session closed for user root
Feb 23 06:51:17 np0005626463.novalocal sudo[6329]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qitwipuondcwnowdrwfajfckefmsuzla ; /usr/bin/python3
Feb 23 06:51:17 np0005626463.novalocal sudo[6329]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 23 06:51:18 np0005626463.novalocal python3[6331]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system.conf.d/override.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 06:51:18 np0005626463.novalocal sudo[6329]: pam_unix(sudo:session): session closed for user root
Feb 23 06:51:18 np0005626463.novalocal sudo[6372]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-skoakyvmmaonvmxbeerilbthyasjeglg ; /usr/bin/python3
Feb 23 06:51:18 np0005626463.novalocal sudo[6372]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 23 06:51:18 np0005626463.novalocal python3[6374]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system.conf.d/override.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771829477.7201293-660-186436323015140/source _original_basename=tmprc1eb3c2 follow=False checksum=a05098bd3d2321238ea1169d0e6f135b35b392d4 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 06:51:18 np0005626463.novalocal sudo[6372]: pam_unix(sudo:session): session closed for user root
Feb 23 06:51:19 np0005626463.novalocal sudo[6402]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nneerrfacjnjemzgqcemdrilstxujqxg ; /usr/bin/python3
Feb 23 06:51:19 np0005626463.novalocal sudo[6402]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 23 06:51:19 np0005626463.novalocal python3[6404]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 23 06:51:19 np0005626463.novalocal systemd[1]: Reloading.
Feb 23 06:51:20 np0005626463.novalocal systemd-rc-local-generator[6422]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 06:51:20 np0005626463.novalocal systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 06:51:20 np0005626463.novalocal sudo[6402]: pam_unix(sudo:session): session closed for user root
Feb 23 06:51:21 np0005626463.novalocal sudo[6449]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dgaxaqjezrdycmtzbcelbrnrdlsvznlg ; /usr/bin/python3
Feb 23 06:51:21 np0005626463.novalocal sudo[6449]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 23 06:51:21 np0005626463.novalocal python3[6451]: ansible-ansible.builtin.wait_for Invoked with path=/sys/fs/cgroup/system.slice/io.max state=present timeout=30 host=127.0.0.1 connect_timeout=5 delay=0 active_connection_states=['ESTABLISHED', 'FIN_WAIT1', 'FIN_WAIT2', 'SYN_RECV', 'SYN_SENT', 'TIME_WAIT'] sleep=1 port=None search_regex=None exclude_hosts=None msg=None
Feb 23 06:51:21 np0005626463.novalocal sudo[6449]: pam_unix(sudo:session): session closed for user root
Feb 23 06:51:22 np0005626463.novalocal sudo[6465]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-algfegsduiehlyrepagpcbkpzldaxrwk ; /usr/bin/python3
Feb 23 06:51:22 np0005626463.novalocal sudo[6465]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 23 06:51:22 np0005626463.novalocal python3[6467]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/init.scope/io.max _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 06:51:22 np0005626463.novalocal sudo[6465]: pam_unix(sudo:session): session closed for user root
Feb 23 06:51:23 np0005626463.novalocal sudo[6483]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qmuzudjitqglwsfuymgjzhcqlfydefzy ; /usr/bin/python3
Feb 23 06:51:23 np0005626463.novalocal sudo[6483]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 23 06:51:23 np0005626463.novalocal python3[6485]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/machine.slice/io.max _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 06:51:23 np0005626463.novalocal sudo[6483]: pam_unix(sudo:session): session closed for user root
Feb 23 06:51:23 np0005626463.novalocal sudo[6501]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fvvfohiiiiwmcxdysnvedpmmpydbaagv ; /usr/bin/python3
Feb 23 06:51:23 np0005626463.novalocal sudo[6501]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 23 06:51:23 np0005626463.novalocal python3[6503]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/system.slice/io.max _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 06:51:23 np0005626463.novalocal sudo[6501]: pam_unix(sudo:session): session closed for user root
Feb 23 06:51:23 np0005626463.novalocal sudo[6519]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tplievcknjysxsvoreolqnfvjttprphy ; /usr/bin/python3
Feb 23 06:51:23 np0005626463.novalocal sudo[6519]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 23 06:51:23 np0005626463.novalocal python3[6521]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/user.slice/io.max _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 06:51:23 np0005626463.novalocal sudo[6519]: pam_unix(sudo:session): session closed for user root
Feb 23 06:51:24 np0005626463.novalocal python3[6538]: ansible-ansible.legacy.command Invoked with _raw_params=echo "init";    cat /sys/fs/cgroup/init.scope/io.max; echo "machine"; cat /sys/fs/cgroup/machine.slice/io.max; echo "system";  cat /sys/fs/cgroup/system.slice/io.max; echo "user";    cat /sys/fs/cgroup/user.slice/io.max; _uses_shell=True zuul_log_id=fa163ef9-e89a-8ad4-7d7f-0000000021a6-1-overcloudnovacompute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 06:51:35 np0005626463.novalocal python3[6558]: ansible-ansible.builtin.stat Invoked with path=/sys/fs/cgroup/kubepods.slice/io.max follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 23 06:51:38 np0005626463.novalocal sshd[6179]: pam_unix(sshd:session): session closed for user zuul
Feb 23 06:51:38 np0005626463.novalocal systemd[1]: session-5.scope: Deactivated successfully.
Feb 23 06:51:38 np0005626463.novalocal systemd[1]: session-5.scope: Consumed 4.056s CPU time.
Feb 23 06:51:38 np0005626463.novalocal systemd-logind[759]: Session 5 logged out. Waiting for processes to exit.
Feb 23 06:51:38 np0005626463.novalocal systemd-logind[759]: Removed session 5.
Feb 23 06:51:46 np0005626463.novalocal sshd[6563]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 06:51:47 np0005626463.novalocal sshd[6563]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 06:52:30 np0005626463.novalocal sshd[6567]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 06:52:30 np0005626463.novalocal sshd[6567]: Accepted publickey for zuul from 38.102.83.114 port 46046 ssh2: RSA SHA256:/ShS2J5Dq7o9P59e/NmgQORSAcJOBwu46Huo03HBdB4
Feb 23 06:52:30 np0005626463.novalocal systemd-logind[759]: New session 6 of user zuul.
Feb 23 06:52:30 np0005626463.novalocal systemd[1]: Started Session 6 of User zuul.
Feb 23 06:52:30 np0005626463.novalocal sshd[6567]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Feb 23 06:52:30 np0005626463.novalocal sudo[6584]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wkcqphqhavqpadmlofztcqzrgwdobfgd ; /usr/bin/python3
Feb 23 06:52:30 np0005626463.novalocal sudo[6584]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 23 06:52:31 np0005626463.novalocal systemd[1]: Starting RHSM dbus service...
Feb 23 06:52:31 np0005626463.novalocal systemd[1]: Started RHSM dbus service.
Feb 23 06:52:31 np0005626463.novalocal rhsm-service[6591]:  INFO [subscription_manager.i18n:169] Could not import locale for C: [Errno 2] No translation file found for domain: 'rhsm'
Feb 23 06:52:31 np0005626463.novalocal rhsm-service[6591]:  INFO [subscription_manager.i18n:139] Could not import locale either for C_C: [Errno 2] No translation file found for domain: 'rhsm'
Feb 23 06:52:31 np0005626463.novalocal rhsm-service[6591]:  INFO [subscription_manager.i18n:169] Could not import locale for C: [Errno 2] No translation file found for domain: 'rhsm'
Feb 23 06:52:31 np0005626463.novalocal rhsm-service[6591]:  INFO [subscription_manager.i18n:139] Could not import locale either for C_C: [Errno 2] No translation file found for domain: 'rhsm'
Feb 23 06:52:34 np0005626463.novalocal rhsm-service[6591]:  INFO [subscription_manager.managerlib:90] Consumer created: np0005626463.novalocal (71d8a449-76d3-4525-90bb-1ec088bb454f)
Feb 23 06:52:34 np0005626463.novalocal subscription-manager[6591]: Registered system with identity: 71d8a449-76d3-4525-90bb-1ec088bb454f
Feb 23 06:52:35 np0005626463.novalocal rhsm-service[6591]:  INFO [subscription_manager.entcertlib:131] certs updated:
Feb 23 06:52:35 np0005626463.novalocal rhsm-service[6591]: Total updates: 1
Feb 23 06:52:35 np0005626463.novalocal rhsm-service[6591]: Found (local) serial# []
Feb 23 06:52:35 np0005626463.novalocal rhsm-service[6591]: Expected (UEP) serial# [3819360702608339394]
Feb 23 06:52:35 np0005626463.novalocal rhsm-service[6591]: Added (new)
Feb 23 06:52:35 np0005626463.novalocal rhsm-service[6591]:   [sn:3819360702608339394 ( Content Access,) @ /etc/pki/entitlement/3819360702608339394.pem]
Feb 23 06:52:35 np0005626463.novalocal rhsm-service[6591]: Deleted (rogue):
Feb 23 06:52:35 np0005626463.novalocal rhsm-service[6591]:   <NONE>
Feb 23 06:52:35 np0005626463.novalocal subscription-manager[6591]: Added subscription for 'Content Access' contract 'None'
Feb 23 06:52:35 np0005626463.novalocal subscription-manager[6591]: Added subscription for product ' Content Access'
Feb 23 06:52:36 np0005626463.novalocal rhsm-service[6591]:  INFO [subscription_manager.i18n:169] Could not import locale for C: [Errno 2] No translation file found for domain: 'rhsm'
Feb 23 06:52:36 np0005626463.novalocal rhsm-service[6591]:  INFO [subscription_manager.i18n:139] Could not import locale either for C_C: [Errno 2] No translation file found for domain: 'rhsm'
Feb 23 06:52:36 np0005626463.novalocal rhsm-service[6591]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Feb 23 06:52:36 np0005626463.novalocal rhsm-service[6591]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Feb 23 06:52:36 np0005626463.novalocal rhsm-service[6591]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Feb 23 06:52:37 np0005626463.novalocal rhsm-service[6591]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Feb 23 06:52:37 np0005626463.novalocal rhsm-service[6591]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Feb 23 06:52:37 np0005626463.novalocal sudo[6584]: pam_unix(sudo:session): session closed for user root
Feb 23 06:52:43 np0005626463.novalocal python3[6682]: ansible-ansible.legacy.command Invoked with _raw_params=cat /etc/redhat-release zuul_log_id=fa163ef9-e89a-32b8-b7f7-00000000000d-1-overcloudnovacompute0 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 06:52:44 np0005626463.novalocal sudo[6699]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tqmqqomsojmxmysgxxganbnbocwxmavy ; /usr/bin/python3
Feb 23 06:52:44 np0005626463.novalocal sudo[6699]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 23 06:52:45 np0005626463.novalocal python3[6701]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Feb 23 06:52:48 np0005626463.novalocal sshd[6708]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 06:52:48 np0005626463.novalocal sshd[6708]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 06:53:14 np0005626463.novalocal setsebool[6778]: The virt_use_nfs policy boolean was changed to 1 by root
Feb 23 06:53:14 np0005626463.novalocal setsebool[6778]: The virt_sandbox_use_all_caps policy boolean was changed to 1 by root
Feb 23 06:53:22 np0005626463.novalocal kernel: SELinux:  Converting 406 SID table entries...
Feb 23 06:53:22 np0005626463.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Feb 23 06:53:22 np0005626463.novalocal kernel: SELinux:  policy capability open_perms=1
Feb 23 06:53:22 np0005626463.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Feb 23 06:53:22 np0005626463.novalocal kernel: SELinux:  policy capability always_check_network=0
Feb 23 06:53:22 np0005626463.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Feb 23 06:53:22 np0005626463.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Feb 23 06:53:22 np0005626463.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Feb 23 06:53:35 np0005626463.novalocal dbus-broker-launch[754]: avc:  op=load_policy lsm=selinux seqno=3 res=1
Feb 23 06:53:35 np0005626463.novalocal systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 23 06:53:35 np0005626463.novalocal systemd[1]: Starting man-db-cache-update.service...
Feb 23 06:53:35 np0005626463.novalocal systemd[1]: Reloading.
Feb 23 06:53:35 np0005626463.novalocal systemd-rc-local-generator[7639]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 06:53:35 np0005626463.novalocal systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 06:53:35 np0005626463.novalocal systemd[1]: Queuing reload/restart jobs for marked units…
Feb 23 06:53:36 np0005626463.novalocal rhsm-service[6591]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Feb 23 06:53:36 np0005626463.novalocal sudo[6699]: pam_unix(sudo:session): session closed for user root
Feb 23 06:53:43 np0005626463.novalocal systemd[1]: man-db-cache-update.service: Deactivated successfully.
Feb 23 06:53:43 np0005626463.novalocal systemd[1]: Finished man-db-cache-update.service.
Feb 23 06:53:43 np0005626463.novalocal systemd[1]: man-db-cache-update.service: Consumed 9.292s CPU time.
Feb 23 06:53:43 np0005626463.novalocal systemd[1]: run-r81631f5033c74fd0a27ced99f8b99169.service: Deactivated successfully.
Feb 23 06:53:47 np0005626463.novalocal sshd[18356]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 06:53:47 np0005626463.novalocal sshd[18356]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 06:53:52 np0005626463.novalocal sshd[18358]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 06:53:54 np0005626463.novalocal sshd[18358]: Connection closed by authenticating user root 185.156.73.233 port 49050 [preauth]
Feb 23 06:54:29 np0005626463.novalocal sudo[18373]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-djmgugjgvymwtjumcmonhzpyuroygxxp ; /usr/bin/python3
Feb 23 06:54:29 np0005626463.novalocal sudo[18373]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 23 06:54:30 np0005626463.novalocal podman[18376]: 2026-02-23 06:54:30.056621751 +0000 UTC m=+0.106063630 system refresh
Feb 23 06:54:30 np0005626463.novalocal systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 23 06:54:30 np0005626463.novalocal sudo[18373]: pam_unix(sudo:session): session closed for user root
Feb 23 06:54:30 np0005626463.novalocal systemd[4176]: Starting D-Bus User Message Bus...
Feb 23 06:54:30 np0005626463.novalocal dbus-broker-launch[18433]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +31: Eavesdropping is deprecated and ignored
Feb 23 06:54:30 np0005626463.novalocal dbus-broker-launch[18433]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +33: Eavesdropping is deprecated and ignored
Feb 23 06:54:30 np0005626463.novalocal systemd[4176]: Started D-Bus User Message Bus.
Feb 23 06:54:30 np0005626463.novalocal dbus-broker-lau[18433]: Ready
Feb 23 06:54:30 np0005626463.novalocal systemd[4176]: selinux: avc:  op=load_policy lsm=selinux seqno=3 res=1
Feb 23 06:54:30 np0005626463.novalocal systemd[4176]: Created slice Slice /user.
Feb 23 06:54:30 np0005626463.novalocal systemd[4176]: podman-18416.scope: unit configures an IP firewall, but not running as root.
Feb 23 06:54:30 np0005626463.novalocal systemd[4176]: (This warning is only shown for the first unit using IP firewalling.)
Feb 23 06:54:30 np0005626463.novalocal systemd[4176]: Started podman-18416.scope.
Feb 23 06:54:31 np0005626463.novalocal systemd[4176]: Started podman-pause-a5e6aa7e.scope.
Feb 23 06:54:33 np0005626463.novalocal sshd[6567]: pam_unix(sshd:session): session closed for user zuul
Feb 23 06:54:33 np0005626463.novalocal systemd[1]: session-6.scope: Deactivated successfully.
Feb 23 06:54:33 np0005626463.novalocal systemd[1]: session-6.scope: Consumed 49.807s CPU time.
Feb 23 06:54:33 np0005626463.novalocal systemd-logind[759]: Session 6 logged out. Waiting for processes to exit.
Feb 23 06:54:33 np0005626463.novalocal systemd-logind[759]: Removed session 6.
Feb 23 06:54:43 np0005626463.novalocal sshd[18436]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 06:54:44 np0005626463.novalocal sshd[18436]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 06:54:47 np0005626463.novalocal sshd[18438]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 06:54:47 np0005626463.novalocal sshd[18441]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 06:54:47 np0005626463.novalocal sshd[18439]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 06:54:48 np0005626463.novalocal sshd[18442]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 06:54:48 np0005626463.novalocal sshd[18440]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 06:54:48 np0005626463.novalocal sshd[18438]: Unable to negotiate with 38.102.83.20 port 53740: no matching host key type found. Their offer: sk-ecdsa-sha2-nistp256@openssh.com [preauth]
Feb 23 06:54:48 np0005626463.novalocal sshd[18442]: Connection closed by 38.102.83.20 port 53718 [preauth]
Feb 23 06:54:48 np0005626463.novalocal sshd[18441]: Connection closed by 38.102.83.20 port 53728 [preauth]
Feb 23 06:54:48 np0005626463.novalocal sshd[18439]: Unable to negotiate with 38.102.83.20 port 53732: no matching host key type found. Their offer: ssh-ed25519 [preauth]
Feb 23 06:54:48 np0005626463.novalocal sshd[18440]: Unable to negotiate with 38.102.83.20 port 53752: no matching host key type found. Their offer: sk-ssh-ed25519@openssh.com [preauth]
Feb 23 06:54:52 np0005626463.novalocal sshd[18448]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 06:54:52 np0005626463.novalocal sshd[18448]: Accepted publickey for zuul from 38.102.83.114 port 49792 ssh2: RSA SHA256:/ShS2J5Dq7o9P59e/NmgQORSAcJOBwu46Huo03HBdB4
Feb 23 06:54:52 np0005626463.novalocal systemd-logind[759]: New session 7 of user zuul.
Feb 23 06:54:52 np0005626463.novalocal systemd[1]: Started Session 7 of User zuul.
Feb 23 06:54:52 np0005626463.novalocal sshd[18448]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Feb 23 06:54:53 np0005626463.novalocal python3[18465]: ansible-ansible.posix.authorized_key Invoked with user=zuul key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBD0suk+oGhrLCF0TQEPuL+1TMMXZ4ZyjwmaIk09J9Zppa5UYl2p4E22RKwDBWJVKjp5+lVBFxSdpKjyFnuMgKyY= zuul@np0005626456.novalocal manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 23 06:54:53 np0005626463.novalocal sudo[18479]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wxusxpeknhflzxgpbelsmrhxqycbjnkb ; /usr/bin/python3
Feb 23 06:54:53 np0005626463.novalocal sudo[18479]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 23 06:54:53 np0005626463.novalocal python3[18481]: ansible-ansible.posix.authorized_key Invoked with user=root key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBD0suk+oGhrLCF0TQEPuL+1TMMXZ4ZyjwmaIk09J9Zppa5UYl2p4E22RKwDBWJVKjp5+lVBFxSdpKjyFnuMgKyY= zuul@np0005626456.novalocal manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 23 06:54:54 np0005626463.novalocal sudo[18479]: pam_unix(sudo:session): session closed for user root
Feb 23 06:54:55 np0005626463.novalocal sshd[18448]: pam_unix(sshd:session): session closed for user zuul
Feb 23 06:54:55 np0005626463.novalocal systemd[1]: session-7.scope: Deactivated successfully.
Feb 23 06:54:55 np0005626463.novalocal systemd-logind[759]: Session 7 logged out. Waiting for processes to exit.
Feb 23 06:54:55 np0005626463.novalocal systemd-logind[759]: Removed session 7.
Feb 23 06:55:38 np0005626463.novalocal sshd[18482]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 06:55:38 np0005626463.novalocal sshd[18482]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 06:56:13 np0005626463.novalocal sshd[18486]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 06:56:13 np0005626463.novalocal sshd[18486]: Accepted publickey for zuul from 38.102.83.114 port 44302 ssh2: RSA SHA256:/ShS2J5Dq7o9P59e/NmgQORSAcJOBwu46Huo03HBdB4
Feb 23 06:56:13 np0005626463.novalocal systemd-logind[759]: New session 8 of user zuul.
Feb 23 06:56:13 np0005626463.novalocal systemd[1]: Started Session 8 of User zuul.
Feb 23 06:56:13 np0005626463.novalocal sshd[18486]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Feb 23 06:56:13 np0005626463.novalocal sudo[18503]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lfafkmqmqmgysuwigkhrjtypnmpyoguf ; /usr/bin/python3
Feb 23 06:56:13 np0005626463.novalocal sudo[18503]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 23 06:56:13 np0005626463.novalocal python3[18505]: ansible-authorized_key Invoked with user=root manage_dir=True key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCaFiAv9bTisS17GN1FZ7h/VJaLu2YTZdiVc9K45WqX/JhZ8Pwx1BXqBJEGlK+qmwEqEak4GGxIE3mEhiQwiY15E3nfJntSoOu8LhTWltcLr5Uy2mx8nXUZwB3QoNiY4y5uXCriTu6HLCXbdwhnWWT8PgX7V31GosHI0JQEpo4ixoShAJC2sFXL7wd3dTaZBo73qVrUhmekv/2GJo179k6wGblVjOwgB9nkfcuo0acLUaot1Uhc7ZrZM3Nfa7bjrW7OigrLvtNra7bBsjfeTgu6vOxxy1DcTD1xBKab631zTIIugiPViMTGrxgsBpyOc4tJpl1grZCJxm8mBDN25oK2jvP/NwUcC3C9ASWEr9U4QiAOTYN03OAyLGbhq48W3SYHwIi8/awDfJapvA+5rlCq/Xb+Fi/KrAaPxVoEehqWLBzCv0u/ZLZarmOih5rcgNTLMQ3l1/nVNtx+VP3eLtqA5a71JqntFfS80adH/3Px+Wen0lIixRqguGNzrD8ZGcU= zuul-build-sshkey state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 23 06:56:13 np0005626463.novalocal sudo[18503]: pam_unix(sudo:session): session closed for user root
Feb 23 06:56:14 np0005626463.novalocal sudo[18519]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-toixcoijmnibmklaudwitqcxyxqacqpu ; /usr/bin/python3
Feb 23 06:56:14 np0005626463.novalocal sudo[18519]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 23 06:56:14 np0005626463.novalocal python3[18521]: ansible-user Invoked with name=root state=present non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005626463.novalocal update_password=always uid=None group=None groups=None comment=None home=None shell=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None
Feb 23 06:56:14 np0005626463.novalocal sudo[18519]: pam_unix(sudo:session): session closed for user root
Feb 23 06:56:16 np0005626463.novalocal sudo[18569]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xhoaveibxufszhvrxwkryxvcsshutwpt ; /usr/bin/python3
Feb 23 06:56:16 np0005626463.novalocal sudo[18569]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 23 06:56:16 np0005626463.novalocal python3[18571]: ansible-ansible.legacy.stat Invoked with path=/root/.ssh/id_rsa follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 06:56:16 np0005626463.novalocal sudo[18569]: pam_unix(sudo:session): session closed for user root
Feb 23 06:56:16 np0005626463.novalocal sudo[18612]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aogspdrnywsmkraemzumtexclvvslyax ; /usr/bin/python3
Feb 23 06:56:16 np0005626463.novalocal sudo[18612]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 23 06:56:16 np0005626463.novalocal python3[18614]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1771829775.99336-133-41768577803103/source dest=/root/.ssh/id_rsa mode=384 owner=root force=False _original_basename=75c3b09aedfa4a0eb967a11aba86ff70_id_rsa follow=False checksum=3856428e4c0cdf708f3b02cf6f4769559d121f25 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 06:56:16 np0005626463.novalocal sudo[18612]: pam_unix(sudo:session): session closed for user root
Feb 23 06:56:17 np0005626463.novalocal sudo[18674]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zgymydysmvyeoiiggwdprztoydnajhpf ; /usr/bin/python3
Feb 23 06:56:17 np0005626463.novalocal sudo[18674]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 23 06:56:17 np0005626463.novalocal python3[18676]: ansible-ansible.legacy.stat Invoked with path=/root/.ssh/id_rsa.pub follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 06:56:17 np0005626463.novalocal sudo[18674]: pam_unix(sudo:session): session closed for user root
Feb 23 06:56:18 np0005626463.novalocal sudo[18717]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xbulwwvmlttkndqeykhgpzuvwhslavgx ; /usr/bin/python3
Feb 23 06:56:18 np0005626463.novalocal sudo[18717]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 23 06:56:18 np0005626463.novalocal python3[18719]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1771829777.5856373-219-264846205402667/source dest=/root/.ssh/id_rsa.pub mode=420 owner=root force=False _original_basename=75c3b09aedfa4a0eb967a11aba86ff70_id_rsa.pub follow=False checksum=24c5085c987d798738c880bb8143c9f9cd19ae33 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 06:56:18 np0005626463.novalocal sudo[18717]: pam_unix(sudo:session): session closed for user root
Feb 23 06:56:20 np0005626463.novalocal sudo[18747]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wyynkzcqorfysjoqtsuzdfzlohubmire ; /usr/bin/python3
Feb 23 06:56:20 np0005626463.novalocal sudo[18747]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 23 06:56:20 np0005626463.novalocal python3[18749]: ansible-ansible.builtin.file Invoked with path=/etc/nodepool state=directory mode=0777 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 06:56:20 np0005626463.novalocal sudo[18747]: pam_unix(sudo:session): session closed for user root
Feb 23 06:56:21 np0005626463.novalocal python3[18795]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 06:56:21 np0005626463.novalocal python3[18811]: ansible-ansible.legacy.file Invoked with dest=/etc/nodepool/sub_nodes _original_basename=tmpd42ngr18 recurse=False state=file path=/etc/nodepool/sub_nodes force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 06:56:22 np0005626463.novalocal python3[18871]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 06:56:22 np0005626463.novalocal python3[18887]: ansible-ansible.legacy.file Invoked with dest=/etc/nodepool/sub_nodes_private _original_basename=tmpr86mzc8q recurse=False state=file path=/etc/nodepool/sub_nodes_private force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 06:56:24 np0005626463.novalocal python3[18947]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/node_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 06:56:24 np0005626463.novalocal python3[18963]: ansible-ansible.legacy.file Invoked with dest=/etc/nodepool/node_private _original_basename=tmpglgfau3f recurse=False state=file path=/etc/nodepool/node_private force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 06:56:25 np0005626463.novalocal sshd[18486]: pam_unix(sshd:session): session closed for user zuul
Feb 23 06:56:25 np0005626463.novalocal systemd[1]: session-8.scope: Deactivated successfully.
Feb 23 06:56:25 np0005626463.novalocal systemd[1]: session-8.scope: Consumed 3.582s CPU time.
Feb 23 06:56:25 np0005626463.novalocal systemd-logind[759]: Session 8 logged out. Waiting for processes to exit.
Feb 23 06:56:25 np0005626463.novalocal systemd-logind[759]: Removed session 8.
Feb 23 06:56:34 np0005626463.novalocal sshd[18979]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 06:56:35 np0005626463.novalocal sshd[18979]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 06:57:30 np0005626463.novalocal sshd[18981]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 06:57:30 np0005626463.novalocal sshd[18981]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 06:58:26 np0005626463.novalocal sshd[18984]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 06:58:26 np0005626463.novalocal sshd[18985]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 06:58:26 np0005626463.novalocal sshd[18985]: Accepted publickey for zuul from 38.102.83.20 port 47686 ssh2: RSA SHA256:/ShS2J5Dq7o9P59e/NmgQORSAcJOBwu46Huo03HBdB4
Feb 23 06:58:26 np0005626463.novalocal systemd-logind[759]: New session 9 of user zuul.
Feb 23 06:58:26 np0005626463.novalocal systemd[1]: Started Session 9 of User zuul.
Feb 23 06:58:26 np0005626463.novalocal sshd[18985]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Feb 23 06:58:26 np0005626463.novalocal sshd[18984]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 06:58:26 np0005626463.novalocal python3[19032]: ansible-ansible.legacy.command Invoked with _raw_params=hostname _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 06:59:23 np0005626463.novalocal sshd[19034]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 06:59:23 np0005626463.novalocal sshd[19034]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 07:00:18 np0005626463.novalocal sshd[19037]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 07:00:18 np0005626463.novalocal sshd[19037]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 07:01:01 np0005626463.novalocal CROND[19040]: (root) CMD (run-parts /etc/cron.hourly)
Feb 23 07:01:01 np0005626463.novalocal run-parts[19043]: (/etc/cron.hourly) starting 0anacron
Feb 23 07:01:01 np0005626463.novalocal anacron[19051]: Anacron started on 2026-02-23
Feb 23 07:01:01 np0005626463.novalocal anacron[19051]: Will run job `cron.daily' in 29 min.
Feb 23 07:01:01 np0005626463.novalocal anacron[19051]: Will run job `cron.weekly' in 49 min.
Feb 23 07:01:01 np0005626463.novalocal anacron[19051]: Will run job `cron.monthly' in 69 min.
Feb 23 07:01:01 np0005626463.novalocal anacron[19051]: Jobs will be executed sequentially
Feb 23 07:01:01 np0005626463.novalocal run-parts[19053]: (/etc/cron.hourly) finished 0anacron
Feb 23 07:01:01 np0005626463.novalocal CROND[19039]: (root) CMDEND (run-parts /etc/cron.hourly)
Feb 23 07:01:13 np0005626463.novalocal sshd[19054]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 07:01:13 np0005626463.novalocal sshd[19054]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 07:02:07 np0005626463.novalocal sshd[19057]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 07:02:07 np0005626463.novalocal sshd[19057]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 07:03:02 np0005626463.novalocal sshd[19059]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 07:03:03 np0005626463.novalocal sshd[19059]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 07:03:26 np0005626463.novalocal sshd[18989]: Received disconnect from 38.102.83.20 port 47686:11: disconnected by user
Feb 23 07:03:26 np0005626463.novalocal sshd[18989]: Disconnected from user zuul 38.102.83.20 port 47686
Feb 23 07:03:26 np0005626463.novalocal sshd[18985]: pam_unix(sshd:session): session closed for user zuul
Feb 23 07:03:26 np0005626463.novalocal systemd[1]: session-9.scope: Deactivated successfully.
Feb 23 07:03:26 np0005626463.novalocal systemd-logind[759]: Session 9 logged out. Waiting for processes to exit.
Feb 23 07:03:26 np0005626463.novalocal systemd-logind[759]: Removed session 9.
Feb 23 07:03:41 np0005626463.novalocal sshd[19062]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 07:03:42 np0005626463.novalocal sshd[19062]: Invalid user teste from 185.156.73.233 port 19940
Feb 23 07:03:43 np0005626463.novalocal sshd[19062]: Connection closed by invalid user teste 185.156.73.233 port 19940 [preauth]
Feb 23 07:04:02 np0005626463.novalocal sshd[19064]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 07:04:02 np0005626463.novalocal sshd[19064]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 07:05:02 np0005626463.novalocal sshd[19066]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 07:05:02 np0005626463.novalocal sshd[19066]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 07:06:01 np0005626463.novalocal sshd[19069]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 07:06:01 np0005626463.novalocal sshd[19069]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 07:06:57 np0005626463.novalocal sshd[19071]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 07:06:57 np0005626463.novalocal sshd[19071]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 07:07:49 np0005626463.novalocal sshd[19074]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 07:07:50 np0005626463.novalocal sshd[19074]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 07:08:42 np0005626463.novalocal sshd[19076]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 07:08:43 np0005626463.novalocal sshd[19076]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 07:09:37 np0005626463.novalocal sshd[19079]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 07:09:37 np0005626463.novalocal sshd[19079]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 07:09:57 np0005626463.novalocal sshd[19082]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 07:09:57 np0005626463.novalocal sshd[19082]: Accepted publickey for zuul from 38.102.83.114 port 37580 ssh2: RSA SHA256:/ShS2J5Dq7o9P59e/NmgQORSAcJOBwu46Huo03HBdB4
Feb 23 07:09:57 np0005626463.novalocal systemd-logind[759]: New session 10 of user zuul.
Feb 23 07:09:57 np0005626463.novalocal systemd[1]: Started Session 10 of User zuul.
Feb 23 07:09:57 np0005626463.novalocal sshd[19082]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Feb 23 07:09:57 np0005626463.novalocal python3[19099]: ansible-ansible.legacy.command Invoked with _raw_params=cat /etc/redhat-release zuul_log_id=fa163ef9-e89a-669c-02d2-00000000000c-1-overcloudnovacompute0 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 07:09:59 np0005626463.novalocal sudo[19117]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-crcrxzmarutuuauvbnhtfttvezknjhsl ; /usr/bin/python3
Feb 23 07:09:59 np0005626463.novalocal sudo[19117]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 23 07:09:59 np0005626463.novalocal python3[19119]: ansible-ansible.legacy.command Invoked with _raw_params=yum clean all zuul_log_id=fa163ef9-e89a-669c-02d2-00000000000d-1-overcloudnovacompute0 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 07:10:01 np0005626463.novalocal sudo[19117]: pam_unix(sudo:session): session closed for user root
Feb 23 07:10:04 np0005626463.novalocal sudo[19136]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-azdcanhcujapnwlnzmnyciobpwzfymjn ; /usr/bin/python3
Feb 23 07:10:04 np0005626463.novalocal sudo[19136]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 23 07:10:04 np0005626463.novalocal python3[19138]: ansible-community.general.rhsm_repository Invoked with name=['rhel-9-for-x86_64-baseos-eus-rpms'] state=enabled purge=False
Feb 23 07:10:07 np0005626463.novalocal rhsm-service[6591]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Feb 23 07:10:07 np0005626463.novalocal rhsm-service[6591]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Feb 23 07:10:31 np0005626463.novalocal sudo[19136]: pam_unix(sudo:session): session closed for user root
Feb 23 07:10:31 np0005626463.novalocal sshd[19280]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 07:10:32 np0005626463.novalocal sshd[19280]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 07:11:04 np0005626463.novalocal sudo[19297]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-osictpidniykozrehlkvvmveoerqrhsq ; /usr/bin/python3
Feb 23 07:11:04 np0005626463.novalocal sudo[19297]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 23 07:11:04 np0005626463.novalocal python3[19299]: ansible-community.general.rhsm_repository Invoked with name=['rhel-9-for-x86_64-appstream-eus-rpms'] state=enabled purge=False
Feb 23 07:11:07 np0005626463.novalocal rhsm-service[6591]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Feb 23 07:11:09 np0005626463.novalocal sudo[19297]: pam_unix(sudo:session): session closed for user root
Feb 23 07:11:14 np0005626463.novalocal sudo[19496]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cuicvtlfnxcdhcmnsgwenttfaeuyrpgv ; /usr/bin/python3
Feb 23 07:11:14 np0005626463.novalocal sudo[19496]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 23 07:11:15 np0005626463.novalocal python3[19498]: ansible-community.general.rhsm_repository Invoked with name=['rhel-9-for-x86_64-highavailability-eus-rpms'] state=enabled purge=False
Feb 23 07:11:17 np0005626463.novalocal rhsm-service[6591]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Feb 23 07:11:18 np0005626463.novalocal rhsm-service[6591]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Feb 23 07:11:22 np0005626463.novalocal rhsm-service[6591]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Feb 23 07:11:22 np0005626463.novalocal rhsm-service[6591]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Feb 23 07:11:28 np0005626463.novalocal sshd[19818]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 07:11:29 np0005626463.novalocal sudo[19496]: pam_unix(sudo:session): session closed for user root
Feb 23 07:11:29 np0005626463.novalocal sshd[19818]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 07:11:44 np0005626463.novalocal sudo[19833]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oexjdcfzmpklhymbkdzkjgnldryjlcts ; /usr/bin/python3
Feb 23 07:11:44 np0005626463.novalocal sudo[19833]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 23 07:11:44 np0005626463.novalocal python3[19835]: ansible-community.general.rhsm_repository Invoked with name=['fast-datapath-for-rhel-9-x86_64-rpms'] state=enabled purge=False
Feb 23 07:11:46 np0005626463.novalocal rhsm-service[6591]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Feb 23 07:11:47 np0005626463.novalocal rhsm-service[6591]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Feb 23 07:11:51 np0005626463.novalocal rhsm-service[6591]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Feb 23 07:11:57 np0005626463.novalocal sudo[19833]: pam_unix(sudo:session): session closed for user root
Feb 23 07:12:13 np0005626463.novalocal sudo[20109]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jmpfwwwfyjuppyeiirzuttdqfxlweqsr ; /usr/bin/python3
Feb 23 07:12:13 np0005626463.novalocal sudo[20109]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 23 07:12:13 np0005626463.novalocal python3[20111]: ansible-community.general.rhsm_repository Invoked with name=['openstack-17.1-for-rhel-9-x86_64-rpms'] state=enabled purge=False
Feb 23 07:12:15 np0005626463.novalocal rhsm-service[6591]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Feb 23 07:12:15 np0005626463.novalocal rhsm-service[6591]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Feb 23 07:12:17 np0005626463.novalocal sshd[20238]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 07:12:17 np0005626463.novalocal sshd[20238]: error: kex_exchange_identification: Connection closed by remote host
Feb 23 07:12:17 np0005626463.novalocal sshd[20238]: Connection closed by 92.118.39.72 port 48032
Feb 23 07:12:20 np0005626463.novalocal rhsm-service[6591]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Feb 23 07:12:20 np0005626463.novalocal rhsm-service[6591]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Feb 23 07:12:23 np0005626463.novalocal sshd[20365]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 07:12:23 np0005626463.novalocal sshd[20365]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 07:12:27 np0005626463.novalocal sudo[20109]: pam_unix(sudo:session): session closed for user root
Feb 23 07:12:42 np0005626463.novalocal sudo[20391]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ozlynkzzqmwryswrldnexznerzaoumsr ; /usr/bin/python3
Feb 23 07:12:42 np0005626463.novalocal sudo[20391]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 23 07:12:42 np0005626463.novalocal python3[20393]: ansible-ansible.legacy.command Invoked with _raw_params=yum repolist --enabled _uses_shell=True zuul_log_id=fa163ef9-e89a-669c-02d2-000000000013-1-overcloudnovacompute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 07:12:44 np0005626463.novalocal sudo[20391]: pam_unix(sudo:session): session closed for user root
Feb 23 07:12:47 np0005626463.novalocal sudo[20410]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wrgzsrkspspnlwhnjcagyitxawlmpyag ; /usr/bin/python3
Feb 23 07:12:47 np0005626463.novalocal sudo[20410]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 23 07:12:47 np0005626463.novalocal python3[20412]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch', 'os-net-config', 'ansible-core'] state=present update_cache=True allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Feb 23 07:12:59 np0005626463.novalocal groupadd[20498]: group added to /etc/group: name=unbound, GID=987
Feb 23 07:12:59 np0005626463.novalocal groupadd[20498]: group added to /etc/gshadow: name=unbound
Feb 23 07:12:59 np0005626463.novalocal groupadd[20498]: new group: name=unbound, GID=987
Feb 23 07:12:59 np0005626463.novalocal useradd[20505]: new user: name=unbound, UID=987, GID=987, home=/etc/unbound, shell=/sbin/nologin, from=none
Feb 23 07:12:59 np0005626463.novalocal systemd[1]: Started daily update of the root trust anchor for DNSSEC.
Feb 23 07:13:09 np0005626463.novalocal kernel: SELinux:  Converting 499 SID table entries...
Feb 23 07:13:09 np0005626463.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Feb 23 07:13:09 np0005626463.novalocal kernel: SELinux:  policy capability open_perms=1
Feb 23 07:13:09 np0005626463.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Feb 23 07:13:09 np0005626463.novalocal kernel: SELinux:  policy capability always_check_network=0
Feb 23 07:13:09 np0005626463.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Feb 23 07:13:09 np0005626463.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Feb 23 07:13:09 np0005626463.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Feb 23 07:13:09 np0005626463.novalocal groupadd[20719]: group added to /etc/group: name=openvswitch, GID=986
Feb 23 07:13:09 np0005626463.novalocal groupadd[20719]: group added to /etc/gshadow: name=openvswitch
Feb 23 07:13:09 np0005626463.novalocal groupadd[20719]: new group: name=openvswitch, GID=986
Feb 23 07:13:09 np0005626463.novalocal useradd[20726]: new user: name=openvswitch, UID=986, GID=986, home=/, shell=/sbin/nologin, from=none
Feb 23 07:13:09 np0005626463.novalocal groupadd[20734]: group added to /etc/group: name=hugetlbfs, GID=985
Feb 23 07:13:09 np0005626463.novalocal groupadd[20734]: group added to /etc/gshadow: name=hugetlbfs
Feb 23 07:13:09 np0005626463.novalocal groupadd[20734]: new group: name=hugetlbfs, GID=985
Feb 23 07:13:09 np0005626463.novalocal usermod[20742]: add 'openvswitch' to group 'hugetlbfs'
Feb 23 07:13:09 np0005626463.novalocal usermod[20742]: add 'openvswitch' to shadow group 'hugetlbfs'
Feb 23 07:13:11 np0005626463.novalocal dbus-broker-launch[754]: avc:  op=load_policy lsm=selinux seqno=4 res=1
Feb 23 07:13:11 np0005626463.novalocal systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 23 07:13:11 np0005626463.novalocal systemd[1]: Starting man-db-cache-update.service...
Feb 23 07:13:11 np0005626463.novalocal systemd[1]: Reloading.
Feb 23 07:13:11 np0005626463.novalocal systemd-rc-local-generator[21247]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 07:13:11 np0005626463.novalocal systemd-sysv-generator[21250]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 07:13:11 np0005626463.novalocal systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 07:13:11 np0005626463.novalocal systemd[1]: Queuing reload/restart jobs for marked units…
Feb 23 07:13:12 np0005626463.novalocal systemd[1]: man-db-cache-update.service: Deactivated successfully.
Feb 23 07:13:12 np0005626463.novalocal systemd[1]: Finished man-db-cache-update.service.
Feb 23 07:13:12 np0005626463.novalocal systemd[1]: run-rb609a7387f0c424b8fcd92e8d489019d.service: Deactivated successfully.
Feb 23 07:13:13 np0005626463.novalocal rhsm-service[6591]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Feb 23 07:13:13 np0005626463.novalocal sudo[20410]: pam_unix(sudo:session): session closed for user root
Feb 23 07:13:13 np0005626463.novalocal rhsm-service[6591]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Feb 23 07:13:17 np0005626463.novalocal sshd[21783]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 07:13:17 np0005626463.novalocal sshd[21783]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 07:13:48 np0005626463.novalocal sshd[21785]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 07:13:49 np0005626463.novalocal sshd[21785]: Invalid user support from 80.94.95.115 port 52908
Feb 23 07:13:49 np0005626463.novalocal sudo[21801]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qcpogbynzlbjboinruuavhdefpskrxrv ; /usr/bin/python3
Feb 23 07:13:49 np0005626463.novalocal sudo[21801]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 23 07:13:49 np0005626463.novalocal sshd[21785]: Connection closed by invalid user support 80.94.95.115 port 52908 [preauth]
Feb 23 07:13:49 np0005626463.novalocal python3[21803]: ansible-ansible.legacy.command Invoked with _raw_params=ansible-galaxy collection install ansible.posix _uses_shell=True zuul_log_id=fa163ef9-e89a-669c-02d2-000000000015-1-overcloudnovacompute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 07:14:02 np0005626463.novalocal sudo[21801]: pam_unix(sudo:session): session closed for user root
Feb 23 07:14:11 np0005626463.novalocal sshd[21808]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 07:14:11 np0005626463.novalocal sshd[21808]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 07:14:21 np0005626463.novalocal sudo[21824]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mhelaqpjkicbsqppwugluutadlaxciae ; /usr/bin/python3
Feb 23 07:14:21 np0005626463.novalocal sudo[21824]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 23 07:14:21 np0005626463.novalocal python3[21826]: ansible-ansible.builtin.file Invoked with path=/etc/os-net-config state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 07:14:21 np0005626463.novalocal sudo[21824]: pam_unix(sudo:session): session closed for user root
Feb 23 07:14:22 np0005626463.novalocal sudo[21872]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eguufjonaeufsbrfseueogppinoppzdx ; /usr/bin/python3
Feb 23 07:14:22 np0005626463.novalocal sudo[21872]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 23 07:14:22 np0005626463.novalocal python3[21874]: ansible-ansible.legacy.stat Invoked with path=/etc/os-net-config/tripleo_config.yaml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 07:14:22 np0005626463.novalocal sudo[21872]: pam_unix(sudo:session): session closed for user root
Feb 23 07:14:22 np0005626463.novalocal sudo[21915]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rjyoodakwpgmhxldkcwltjhtahdjmhpu ; /usr/bin/python3
Feb 23 07:14:22 np0005626463.novalocal sudo[21915]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 23 07:14:23 np0005626463.novalocal python3[21917]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1771830862.3150995-291-83123791134579/source dest=/etc/os-net-config/tripleo_config.yaml mode=None follow=False _original_basename=overcloud_net_config.j2 checksum=3358dfc6c6ce646155135d0cad900026cb34ba08 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 07:14:23 np0005626463.novalocal sudo[21915]: pam_unix(sudo:session): session closed for user root
Feb 23 07:14:24 np0005626463.novalocal sudo[21945]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ldktsyqmmlviffkuctgvczqpponwjyow ; /usr/bin/python3
Feb 23 07:14:24 np0005626463.novalocal sudo[21945]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 23 07:14:24 np0005626463.novalocal python3[21947]: ansible-community.general.nmcli Invoked with conn_name=ci-private-network  state=absent ignore_unsupported_suboptions=False autoconnect=True gw4_ignore_auto=False never_default4=False dns4_ignore_auto=False may_fail4=True gw6_ignore_auto=False dns6_ignore_auto=False mode=balance-rr stp=True priority=128 slavepriority=32 forwarddelay=15 hellotime=2 maxage=20 ageingtime=300 hairpin=False path_cost=100 runner=roundrobin master=None slave_type=None ifname=None type=None ip4=None gw4=None routes4=None routes4_extended=None route_metric4=None routing_rules4=None dns4=None dns4_search=None dns4_options=None method4=None dhcp_client_id=None ip6=None gw6=None dns6=None dns6_search=None dns6_options=None routes6=None routes6_extended=None route_metric6=None method6=None ip_privacy6=None addr_gen_mode6=None miimon=None downdelay=None updelay=None xmit_hash_policy=None arp_interval=None arp_ip_target=None primary=None mtu=None mac=None zone=None runner_hwaddr_policy=None runner_fast_rate=None vlanid=None vlandev=None flags=None ingress=None egress=None vxlan_id=None vxlan_local=None vxlan_remote=None ip_tunnel_dev=None ip_tunnel_local=None ip_tunnel_remote=None ip_tunnel_input_key=NOT_LOGGING_PARAMETER ip_tunnel_output_key=NOT_LOGGING_PARAMETER ssid=None wifi=None wifi_sec=NOT_LOGGING_PARAMETER gsm=None macvlan=None wireguard=None vpn=None transport_mode=None
Feb 23 07:14:24 np0005626463.novalocal sudo[21945]: pam_unix(sudo:session): session closed for user root
Feb 23 07:14:24 np0005626463.novalocal systemd-journald[618]: Field hash table of /run/log/journal/c0212a8b024a111cfc61293864f36c87/system.journal has a fill level at 89.2 (297 of 333 items), suggesting rotation.
Feb 23 07:14:24 np0005626463.novalocal systemd-journald[618]: /run/log/journal/c0212a8b024a111cfc61293864f36c87/system.journal: Journal header limits reached or header out-of-date, rotating.
Feb 23 07:14:24 np0005626463.novalocal rsyslogd[758]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Feb 23 07:14:24 np0005626463.novalocal rsyslogd[758]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Feb 23 07:14:24 np0005626463.novalocal sudo[21966]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-loeiujfursuugixjmokbqaxinnypvsoe ; /usr/bin/python3
Feb 23 07:14:24 np0005626463.novalocal sudo[21966]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 23 07:14:24 np0005626463.novalocal python3[21968]: ansible-community.general.nmcli Invoked with conn_name=ci-private-network-20 state=absent ignore_unsupported_suboptions=False autoconnect=True gw4_ignore_auto=False never_default4=False dns4_ignore_auto=False may_fail4=True gw6_ignore_auto=False dns6_ignore_auto=False mode=balance-rr stp=True priority=128 slavepriority=32 forwarddelay=15 hellotime=2 maxage=20 ageingtime=300 hairpin=False path_cost=100 runner=roundrobin master=None slave_type=None ifname=None type=None ip4=None gw4=None routes4=None routes4_extended=None route_metric4=None routing_rules4=None dns4=None dns4_search=None dns4_options=None method4=None dhcp_client_id=None ip6=None gw6=None dns6=None dns6_search=None dns6_options=None routes6=None routes6_extended=None route_metric6=None method6=None ip_privacy6=None addr_gen_mode6=None miimon=None downdelay=None updelay=None xmit_hash_policy=None arp_interval=None arp_ip_target=None primary=None mtu=None mac=None zone=None runner_hwaddr_policy=None runner_fast_rate=None vlanid=None vlandev=None flags=None ingress=None egress=None vxlan_id=None vxlan_local=None vxlan_remote=None ip_tunnel_dev=None ip_tunnel_local=None ip_tunnel_remote=None ip_tunnel_input_key=NOT_LOGGING_PARAMETER ip_tunnel_output_key=NOT_LOGGING_PARAMETER ssid=None wifi=None wifi_sec=NOT_LOGGING_PARAMETER gsm=None macvlan=None wireguard=None vpn=None transport_mode=None
Feb 23 07:14:24 np0005626463.novalocal sudo[21966]: pam_unix(sudo:session): session closed for user root
Feb 23 07:14:24 np0005626463.novalocal sudo[21986]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-czhxxsalankedeexeterghtpejxriaqm ; /usr/bin/python3
Feb 23 07:14:24 np0005626463.novalocal sudo[21986]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 23 07:14:25 np0005626463.novalocal python3[21988]: ansible-community.general.nmcli Invoked with conn_name=ci-private-network-21 state=absent ignore_unsupported_suboptions=False autoconnect=True gw4_ignore_auto=False never_default4=False dns4_ignore_auto=False may_fail4=True gw6_ignore_auto=False dns6_ignore_auto=False mode=balance-rr stp=True priority=128 slavepriority=32 forwarddelay=15 hellotime=2 maxage=20 ageingtime=300 hairpin=False path_cost=100 runner=roundrobin master=None slave_type=None ifname=None type=None ip4=None gw4=None routes4=None routes4_extended=None route_metric4=None routing_rules4=None dns4=None dns4_search=None dns4_options=None method4=None dhcp_client_id=None ip6=None gw6=None dns6=None dns6_search=None dns6_options=None routes6=None routes6_extended=None route_metric6=None method6=None ip_privacy6=None addr_gen_mode6=None miimon=None downdelay=None updelay=None xmit_hash_policy=None arp_interval=None arp_ip_target=None primary=None mtu=None mac=None zone=None runner_hwaddr_policy=None runner_fast_rate=None vlanid=None vlandev=None flags=None ingress=None egress=None vxlan_id=None vxlan_local=None vxlan_remote=None ip_tunnel_dev=None ip_tunnel_local=None ip_tunnel_remote=None ip_tunnel_input_key=NOT_LOGGING_PARAMETER ip_tunnel_output_key=NOT_LOGGING_PARAMETER ssid=None wifi=None wifi_sec=NOT_LOGGING_PARAMETER gsm=None macvlan=None wireguard=None vpn=None transport_mode=None
Feb 23 07:14:25 np0005626463.novalocal sudo[21986]: pam_unix(sudo:session): session closed for user root
Feb 23 07:14:25 np0005626463.novalocal sudo[22006]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-smwuhiumizpfmpgnaijaabdptdsuprqc ; /usr/bin/python3
Feb 23 07:14:25 np0005626463.novalocal sudo[22006]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 23 07:14:25 np0005626463.novalocal python3[22008]: ansible-community.general.nmcli Invoked with conn_name=ci-private-network-22 state=absent ignore_unsupported_suboptions=False autoconnect=True gw4_ignore_auto=False never_default4=False dns4_ignore_auto=False may_fail4=True gw6_ignore_auto=False dns6_ignore_auto=False mode=balance-rr stp=True priority=128 slavepriority=32 forwarddelay=15 hellotime=2 maxage=20 ageingtime=300 hairpin=False path_cost=100 runner=roundrobin master=None slave_type=None ifname=None type=None ip4=None gw4=None routes4=None routes4_extended=None route_metric4=None routing_rules4=None dns4=None dns4_search=None dns4_options=None method4=None dhcp_client_id=None ip6=None gw6=None dns6=None dns6_search=None dns6_options=None routes6=None routes6_extended=None route_metric6=None method6=None ip_privacy6=None addr_gen_mode6=None miimon=None downdelay=None updelay=None xmit_hash_policy=None arp_interval=None arp_ip_target=None primary=None mtu=None mac=None zone=None runner_hwaddr_policy=None runner_fast_rate=None vlanid=None vlandev=None flags=None ingress=None egress=None vxlan_id=None vxlan_local=None vxlan_remote=None ip_tunnel_dev=None ip_tunnel_local=None ip_tunnel_remote=None ip_tunnel_input_key=NOT_LOGGING_PARAMETER ip_tunnel_output_key=NOT_LOGGING_PARAMETER ssid=None wifi=None wifi_sec=NOT_LOGGING_PARAMETER gsm=None macvlan=None wireguard=None vpn=None transport_mode=None
Feb 23 07:14:25 np0005626463.novalocal sudo[22006]: pam_unix(sudo:session): session closed for user root
Feb 23 07:14:25 np0005626463.novalocal sudo[22026]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uzzusgsjgckmktvgypzjumnedeclyuza ; /usr/bin/python3
Feb 23 07:14:25 np0005626463.novalocal sudo[22026]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 23 07:14:25 np0005626463.novalocal python3[22028]: ansible-community.general.nmcli Invoked with conn_name=ci-private-network-23 state=absent ignore_unsupported_suboptions=False autoconnect=True gw4_ignore_auto=False never_default4=False dns4_ignore_auto=False may_fail4=True gw6_ignore_auto=False dns6_ignore_auto=False mode=balance-rr stp=True priority=128 slavepriority=32 forwarddelay=15 hellotime=2 maxage=20 ageingtime=300 hairpin=False path_cost=100 runner=roundrobin master=None slave_type=None ifname=None type=None ip4=None gw4=None routes4=None routes4_extended=None route_metric4=None routing_rules4=None dns4=None dns4_search=None dns4_options=None method4=None dhcp_client_id=None ip6=None gw6=None dns6=None dns6_search=None dns6_options=None routes6=None routes6_extended=None route_metric6=None method6=None ip_privacy6=None addr_gen_mode6=None miimon=None downdelay=None updelay=None xmit_hash_policy=None arp_interval=None arp_ip_target=None primary=None mtu=None mac=None zone=None runner_hwaddr_policy=None runner_fast_rate=None vlanid=None vlandev=None flags=None ingress=None egress=None vxlan_id=None vxlan_local=None vxlan_remote=None ip_tunnel_dev=None ip_tunnel_local=None ip_tunnel_remote=None ip_tunnel_input_key=NOT_LOGGING_PARAMETER ip_tunnel_output_key=NOT_LOGGING_PARAMETER ssid=None wifi=None wifi_sec=NOT_LOGGING_PARAMETER gsm=None macvlan=None wireguard=None vpn=None transport_mode=None
Feb 23 07:14:25 np0005626463.novalocal sudo[22026]: pam_unix(sudo:session): session closed for user root
Feb 23 07:14:27 np0005626463.novalocal sudo[22046]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jhrwkizbbgktiqaiblvpdgvnrnlexyie ; /usr/bin/python3
Feb 23 07:14:27 np0005626463.novalocal sudo[22046]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 23 07:14:28 np0005626463.novalocal python3[22048]: ansible-ansible.builtin.systemd Invoked with name=network state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 23 07:14:28 np0005626463.novalocal systemd[1]: Starting LSB: Bring up/down networking...
Feb 23 07:14:28 np0005626463.novalocal network[22051]: WARN      : [network] You are using 'network' service provided by 'network-scripts', which are now deprecated.
Feb 23 07:14:28 np0005626463.novalocal network[22062]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Feb 23 07:14:28 np0005626463.novalocal network[22051]: WARN      : [network] 'network-scripts' will be removed from distribution in near future.
Feb 23 07:14:28 np0005626463.novalocal network[22063]: 'network-scripts' will be removed from distribution in near future.
Feb 23 07:14:28 np0005626463.novalocal network[22051]: WARN      : [network] It is advised to switch to 'NetworkManager' instead for network management.
Feb 23 07:14:28 np0005626463.novalocal network[22064]: It is advised to switch to 'NetworkManager' instead for network management.
Feb 23 07:14:28 np0005626463.novalocal NetworkManager[5974]: <info>  [1771830868.2834] audit: op="connections-reload" pid=22092 uid=0 result="success"
Feb 23 07:14:28 np0005626463.novalocal network[22051]: Bringing up loopback interface:  [  OK  ]
Feb 23 07:14:28 np0005626463.novalocal NetworkManager[5974]: <info>  [1771830868.4730] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-eth0" pid=22180 uid=0 result="success"
Feb 23 07:14:28 np0005626463.novalocal network[22051]: Bringing up interface eth0:  [  OK  ]
Feb 23 07:14:28 np0005626463.novalocal systemd[1]: Started LSB: Bring up/down networking.
Feb 23 07:14:28 np0005626463.novalocal sudo[22046]: pam_unix(sudo:session): session closed for user root
Feb 23 07:14:28 np0005626463.novalocal sudo[22219]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qwabkxisennsjacgvbpupcjxiyczykux ; /usr/bin/python3
Feb 23 07:14:28 np0005626463.novalocal sudo[22219]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 23 07:14:28 np0005626463.novalocal python3[22221]: ansible-ansible.builtin.systemd Invoked with name=openvswitch state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 23 07:14:29 np0005626463.novalocal systemd[1]: Starting Open vSwitch Database Unit...
Feb 23 07:14:29 np0005626463.novalocal chown[22225]: /usr/bin/chown: cannot access '/run/openvswitch': No such file or directory
Feb 23 07:14:29 np0005626463.novalocal ovs-ctl[22230]: /etc/openvswitch/conf.db does not exist ... (warning).
Feb 23 07:14:29 np0005626463.novalocal ovs-ctl[22230]: Creating empty database /etc/openvswitch/conf.db [  OK  ]
Feb 23 07:14:29 np0005626463.novalocal ovs-ctl[22230]: Starting ovsdb-server [  OK  ]
Feb 23 07:14:29 np0005626463.novalocal ovs-vsctl[22279]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait -- init -- set Open_vSwitch . db-version=8.5.1
Feb 23 07:14:29 np0005626463.novalocal ovs-vsctl[22299]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait set Open_vSwitch . ovs-version=3.3.6-141.el9fdp "external-ids:system-id=\"96b5bb93-7341-4ce6-9b93-6a5de566c711\"" "external-ids:rundir=\"/var/run/openvswitch\"" "system-type=\"rhel\"" "system-version=\"9.2\""
Feb 23 07:14:29 np0005626463.novalocal ovs-ctl[22230]: Configuring Open vSwitch system IDs [  OK  ]
Feb 23 07:14:29 np0005626463.novalocal ovs-ctl[22230]: Enabling remote OVSDB managers [  OK  ]
Feb 23 07:14:29 np0005626463.novalocal ovs-vsctl[22305]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=np0005626463.novalocal
Feb 23 07:14:29 np0005626463.novalocal systemd[1]: Started Open vSwitch Database Unit.
Feb 23 07:14:29 np0005626463.novalocal systemd[1]: Starting Open vSwitch Delete Transient Ports...
Feb 23 07:14:29 np0005626463.novalocal systemd[1]: Finished Open vSwitch Delete Transient Ports.
Feb 23 07:14:29 np0005626463.novalocal systemd[1]: Starting Open vSwitch Forwarding Unit...
Feb 23 07:14:29 np0005626463.novalocal kernel: openvswitch: Open vSwitch switching datapath
Feb 23 07:14:29 np0005626463.novalocal ovs-ctl[22349]: Inserting openvswitch module [  OK  ]
Feb 23 07:14:29 np0005626463.novalocal ovs-ctl[22318]: Starting ovs-vswitchd [  OK  ]
Feb 23 07:14:29 np0005626463.novalocal ovs-ctl[22318]: Enabling remote OVSDB managers [  OK  ]
Feb 23 07:14:29 np0005626463.novalocal ovs-vsctl[22367]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=np0005626463.novalocal
Feb 23 07:14:29 np0005626463.novalocal systemd[1]: Started Open vSwitch Forwarding Unit.
Feb 23 07:14:29 np0005626463.novalocal systemd[1]: Starting Open vSwitch...
Feb 23 07:14:29 np0005626463.novalocal systemd[1]: Finished Open vSwitch.
Feb 23 07:14:29 np0005626463.novalocal sudo[22219]: pam_unix(sudo:session): session closed for user root
Feb 23 07:14:33 np0005626463.novalocal sudo[22383]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-usoycqohqncvsiykdjwjbsyqbehnwsfy ; /usr/bin/python3
Feb 23 07:14:33 np0005626463.novalocal sudo[22383]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 23 07:14:33 np0005626463.novalocal python3[22385]: ansible-ansible.legacy.command Invoked with _raw_params=os-net-config -c /etc/os-net-config/tripleo_config.yaml
                                                        _uses_shell=True zuul_log_id=fa163ef9-e89a-669c-02d2-00000000001a-1-overcloudnovacompute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 07:14:34 np0005626463.novalocal NetworkManager[5974]: <info>  [1771830874.5153] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-br-ex" pid=22581 uid=0 result="success"
Feb 23 07:14:34 np0005626463.novalocal ifup[22582]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Feb 23 07:14:34 np0005626463.novalocal ifup[22583]: 'network-scripts' will be removed from distribution in near future.
Feb 23 07:14:34 np0005626463.novalocal ifup[22584]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Feb 23 07:14:34 np0005626463.novalocal NetworkManager[5974]: <info>  [1771830874.5498] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-br-ex" pid=22590 uid=0 result="success"
Feb 23 07:14:34 np0005626463.novalocal ovs-vsctl[22592]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --may-exist add-br br-ex -- set bridge br-ex other-config:mac-table-size=50000 -- set bridge br-ex other-config:hwaddr=fa:16:3e:f0:80:57 -- set bridge br-ex fail_mode=standalone -- del-controller br-ex
Feb 23 07:14:34 np0005626463.novalocal kernel: device ovs-system entered promiscuous mode
Feb 23 07:14:34 np0005626463.novalocal NetworkManager[5974]: <info>  [1771830874.5790] manager: (ovs-system): new Generic device (/org/freedesktop/NetworkManager/Devices/4)
Feb 23 07:14:34 np0005626463.novalocal kernel: Timeout policy base is empty
Feb 23 07:14:34 np0005626463.novalocal kernel: Failed to associated timeout policy `ovs_test_tp'
Feb 23 07:14:34 np0005626463.novalocal systemd-udevd[22594]: Network interface NamePolicy= disabled on kernel command line.
Feb 23 07:14:34 np0005626463.novalocal kernel: device br-ex entered promiscuous mode
Feb 23 07:14:34 np0005626463.novalocal NetworkManager[5974]: <info>  [1771830874.6271] manager: (br-ex): new Generic device (/org/freedesktop/NetworkManager/Devices/5)
Feb 23 07:14:34 np0005626463.novalocal NetworkManager[5974]: <info>  [1771830874.6550] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-br-ex" pid=22619 uid=0 result="success"
Feb 23 07:14:34 np0005626463.novalocal NetworkManager[5974]: <info>  [1771830874.6765] device (br-ex): carrier: link connected
Feb 23 07:14:37 np0005626463.novalocal NetworkManager[5974]: <info>  [1771830877.7325] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-br-ex" pid=22648 uid=0 result="success"
Feb 23 07:14:37 np0005626463.novalocal NetworkManager[5974]: <info>  [1771830877.7779] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-br-ex" pid=22663 uid=0 result="success"
Feb 23 07:14:37 np0005626463.novalocal NET[22688]: /etc/sysconfig/network-scripts/ifup-post : updated /etc/resolv.conf
Feb 23 07:14:37 np0005626463.novalocal NetworkManager[5974]: <info>  [1771830877.8657] device (eth1): state change: activated -> unmanaged (reason 'unmanaged', sys-iface-state: 'managed')
Feb 23 07:14:37 np0005626463.novalocal NetworkManager[5974]: <info>  [1771830877.8739] dhcp4 (eth1): canceled DHCP transaction
Feb 23 07:14:37 np0005626463.novalocal NetworkManager[5974]: <info>  [1771830877.8739] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Feb 23 07:14:37 np0005626463.novalocal NetworkManager[5974]: <info>  [1771830877.8739] dhcp4 (eth1): state changed no lease
Feb 23 07:14:37 np0005626463.novalocal NetworkManager[5974]: <info>  [1771830877.8779] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-eth1" pid=22697 uid=0 result="success"
Feb 23 07:14:37 np0005626463.novalocal ifup[22698]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Feb 23 07:14:37 np0005626463.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Feb 23 07:14:37 np0005626463.novalocal ifup[22699]: 'network-scripts' will be removed from distribution in near future.
Feb 23 07:14:37 np0005626463.novalocal ifup[22701]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Feb 23 07:14:37 np0005626463.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Feb 23 07:14:37 np0005626463.novalocal NetworkManager[5974]: <info>  [1771830877.9113] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-eth1" pid=22715 uid=0 result="success"
Feb 23 07:14:37 np0005626463.novalocal NetworkManager[5974]: <info>  [1771830877.9558] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-eth1" pid=22725 uid=0 result="success"
Feb 23 07:14:37 np0005626463.novalocal NetworkManager[5974]: <info>  [1771830877.9626] device (eth1): carrier: link connected
Feb 23 07:14:37 np0005626463.novalocal NetworkManager[5974]: <info>  [1771830877.9842] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-eth1" pid=22734 uid=0 result="success"
Feb 23 07:14:38 np0005626463.novalocal ipv6_wait_tentative[22746]: Waiting for interface eth1 IPv6 address(es) to leave the 'tentative' state
Feb 23 07:14:39 np0005626463.novalocal ipv6_wait_tentative[22751]: Waiting for interface eth1 IPv6 address(es) to leave the 'tentative' state
Feb 23 07:14:40 np0005626463.novalocal NetworkManager[5974]: <info>  [1771830880.0566] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-eth1" pid=22761 uid=0 result="success"
Feb 23 07:14:40 np0005626463.novalocal ovs-vsctl[22776]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex eth1 -- add-port br-ex eth1
Feb 23 07:14:40 np0005626463.novalocal kernel: device eth1 entered promiscuous mode
Feb 23 07:14:40 np0005626463.novalocal NetworkManager[5974]: <info>  [1771830880.1325] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-br-ex" pid=22783 uid=0 result="success"
Feb 23 07:14:40 np0005626463.novalocal ifup[22784]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Feb 23 07:14:40 np0005626463.novalocal ifup[22785]: 'network-scripts' will be removed from distribution in near future.
Feb 23 07:14:40 np0005626463.novalocal ifup[22786]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Feb 23 07:14:40 np0005626463.novalocal NetworkManager[5974]: <info>  [1771830880.1641] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-br-ex" pid=22792 uid=0 result="success"
Feb 23 07:14:40 np0005626463.novalocal NetworkManager[5974]: <info>  [1771830880.2090] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=22802 uid=0 result="success"
Feb 23 07:14:40 np0005626463.novalocal ifup[22803]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Feb 23 07:14:40 np0005626463.novalocal ifup[22804]: 'network-scripts' will be removed from distribution in near future.
Feb 23 07:14:40 np0005626463.novalocal ifup[22805]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Feb 23 07:14:40 np0005626463.novalocal NetworkManager[5974]: <info>  [1771830880.2426] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=22811 uid=0 result="success"
Feb 23 07:14:40 np0005626463.novalocal ovs-vsctl[22814]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan20 -- add-port br-ex vlan20 tag=20 -- set Interface vlan20 type=internal
Feb 23 07:14:40 np0005626463.novalocal kernel: device vlan20 entered promiscuous mode
Feb 23 07:14:40 np0005626463.novalocal NetworkManager[5974]: <info>  [1771830880.2851] manager: (vlan20): new Generic device (/org/freedesktop/NetworkManager/Devices/6)
Feb 23 07:14:40 np0005626463.novalocal systemd-udevd[22816]: Network interface NamePolicy= disabled on kernel command line.
Feb 23 07:14:40 np0005626463.novalocal NetworkManager[5974]: <info>  [1771830880.3118] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=22825 uid=0 result="success"
Feb 23 07:14:40 np0005626463.novalocal NetworkManager[5974]: <info>  [1771830880.3338] device (vlan20): carrier: link connected
Feb 23 07:14:43 np0005626463.novalocal NetworkManager[5974]: <info>  [1771830883.3950] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=22854 uid=0 result="success"
Feb 23 07:14:43 np0005626463.novalocal NetworkManager[5974]: <info>  [1771830883.4410] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=22869 uid=0 result="success"
Feb 23 07:14:43 np0005626463.novalocal NetworkManager[5974]: <info>  [1771830883.4999] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=22890 uid=0 result="success"
Feb 23 07:14:43 np0005626463.novalocal ifup[22891]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Feb 23 07:14:43 np0005626463.novalocal ifup[22892]: 'network-scripts' will be removed from distribution in near future.
Feb 23 07:14:43 np0005626463.novalocal ifup[22893]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Feb 23 07:14:43 np0005626463.novalocal NetworkManager[5974]: <info>  [1771830883.5309] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=22899 uid=0 result="success"
Feb 23 07:14:43 np0005626463.novalocal ovs-vsctl[22902]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan21 -- add-port br-ex vlan21 tag=21 -- set Interface vlan21 type=internal
Feb 23 07:14:43 np0005626463.novalocal kernel: device vlan21 entered promiscuous mode
Feb 23 07:14:43 np0005626463.novalocal systemd-udevd[22904]: Network interface NamePolicy= disabled on kernel command line.
Feb 23 07:14:43 np0005626463.novalocal NetworkManager[5974]: <info>  [1771830883.5725] manager: (vlan21): new Generic device (/org/freedesktop/NetworkManager/Devices/7)
Feb 23 07:14:43 np0005626463.novalocal NetworkManager[5974]: <info>  [1771830883.5953] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=22914 uid=0 result="success"
Feb 23 07:14:43 np0005626463.novalocal NetworkManager[5974]: <info>  [1771830883.6160] device (vlan21): carrier: link connected
Feb 23 07:14:46 np0005626463.novalocal NetworkManager[5974]: <info>  [1771830886.6694] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=22944 uid=0 result="success"
Feb 23 07:14:46 np0005626463.novalocal NetworkManager[5974]: <info>  [1771830886.7159] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=22959 uid=0 result="success"
Feb 23 07:14:46 np0005626463.novalocal NetworkManager[5974]: <info>  [1771830886.7731] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=22980 uid=0 result="success"
Feb 23 07:14:46 np0005626463.novalocal ifup[22981]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Feb 23 07:14:46 np0005626463.novalocal ifup[22982]: 'network-scripts' will be removed from distribution in near future.
Feb 23 07:14:46 np0005626463.novalocal ifup[22983]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Feb 23 07:14:46 np0005626463.novalocal NetworkManager[5974]: <info>  [1771830886.8040] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=22989 uid=0 result="success"
Feb 23 07:14:46 np0005626463.novalocal ovs-vsctl[22992]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan23 -- add-port br-ex vlan23 tag=23 -- set Interface vlan23 type=internal
Feb 23 07:14:46 np0005626463.novalocal systemd-udevd[22994]: Network interface NamePolicy= disabled on kernel command line.
Feb 23 07:14:46 np0005626463.novalocal kernel: device vlan23 entered promiscuous mode
Feb 23 07:14:46 np0005626463.novalocal NetworkManager[5974]: <info>  [1771830886.8437] manager: (vlan23): new Generic device (/org/freedesktop/NetworkManager/Devices/8)
Feb 23 07:14:46 np0005626463.novalocal NetworkManager[5974]: <info>  [1771830886.8678] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=23004 uid=0 result="success"
Feb 23 07:14:46 np0005626463.novalocal NetworkManager[5974]: <info>  [1771830886.8878] device (vlan23): carrier: link connected
Feb 23 07:14:47 np0005626463.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Feb 23 07:14:49 np0005626463.novalocal NetworkManager[5974]: <info>  [1771830889.9385] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=23035 uid=0 result="success"
Feb 23 07:14:49 np0005626463.novalocal NetworkManager[5974]: <info>  [1771830889.9847] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=23050 uid=0 result="success"
Feb 23 07:14:50 np0005626463.novalocal NetworkManager[5974]: <info>  [1771830890.0281] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=23071 uid=0 result="success"
Feb 23 07:14:50 np0005626463.novalocal ifup[23072]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Feb 23 07:14:50 np0005626463.novalocal ifup[23073]: 'network-scripts' will be removed from distribution in near future.
Feb 23 07:14:50 np0005626463.novalocal ifup[23074]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Feb 23 07:14:50 np0005626463.novalocal NetworkManager[5974]: <info>  [1771830890.0561] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=23080 uid=0 result="success"
Feb 23 07:14:50 np0005626463.novalocal ovs-vsctl[23083]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan22 -- add-port br-ex vlan22 tag=22 -- set Interface vlan22 type=internal
Feb 23 07:14:50 np0005626463.novalocal systemd-udevd[23085]: Network interface NamePolicy= disabled on kernel command line.
Feb 23 07:14:50 np0005626463.novalocal kernel: device vlan22 entered promiscuous mode
Feb 23 07:14:50 np0005626463.novalocal NetworkManager[5974]: <info>  [1771830890.0900] manager: (vlan22): new Generic device (/org/freedesktop/NetworkManager/Devices/9)
Feb 23 07:14:50 np0005626463.novalocal NetworkManager[5974]: <info>  [1771830890.1106] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=23095 uid=0 result="success"
Feb 23 07:14:50 np0005626463.novalocal NetworkManager[5974]: <info>  [1771830890.1300] device (vlan22): carrier: link connected
Feb 23 07:14:53 np0005626463.novalocal NetworkManager[5974]: <info>  [1771830893.1942] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=23125 uid=0 result="success"
Feb 23 07:14:53 np0005626463.novalocal NetworkManager[5974]: <info>  [1771830893.2423] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=23140 uid=0 result="success"
Feb 23 07:14:53 np0005626463.novalocal NetworkManager[5974]: <info>  [1771830893.3036] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=23161 uid=0 result="success"
Feb 23 07:14:53 np0005626463.novalocal ifup[23162]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Feb 23 07:14:53 np0005626463.novalocal ifup[23163]: 'network-scripts' will be removed from distribution in near future.
Feb 23 07:14:53 np0005626463.novalocal ifup[23164]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Feb 23 07:14:53 np0005626463.novalocal NetworkManager[5974]: <info>  [1771830893.3368] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=23170 uid=0 result="success"
Feb 23 07:14:53 np0005626463.novalocal ovs-vsctl[23173]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan44 -- add-port br-ex vlan44 tag=44 -- set Interface vlan44 type=internal
Feb 23 07:14:53 np0005626463.novalocal systemd-udevd[23175]: Network interface NamePolicy= disabled on kernel command line.
Feb 23 07:14:53 np0005626463.novalocal kernel: device vlan44 entered promiscuous mode
Feb 23 07:14:53 np0005626463.novalocal NetworkManager[5974]: <info>  [1771830893.3842] manager: (vlan44): new Generic device (/org/freedesktop/NetworkManager/Devices/10)
Feb 23 07:14:53 np0005626463.novalocal NetworkManager[5974]: <info>  [1771830893.4125] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=23185 uid=0 result="success"
Feb 23 07:14:53 np0005626463.novalocal NetworkManager[5974]: <info>  [1771830893.4369] device (vlan44): carrier: link connected
Feb 23 07:14:56 np0005626463.novalocal NetworkManager[5974]: <info>  [1771830896.4946] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=23215 uid=0 result="success"
Feb 23 07:14:56 np0005626463.novalocal NetworkManager[5974]: <info>  [1771830896.5474] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=23230 uid=0 result="success"
Feb 23 07:14:56 np0005626463.novalocal NetworkManager[5974]: <info>  [1771830896.6128] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=23251 uid=0 result="success"
Feb 23 07:14:56 np0005626463.novalocal ifup[23252]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Feb 23 07:14:56 np0005626463.novalocal ifup[23253]: 'network-scripts' will be removed from distribution in near future.
Feb 23 07:14:56 np0005626463.novalocal ifup[23254]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Feb 23 07:14:56 np0005626463.novalocal NetworkManager[5974]: <info>  [1771830896.6469] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=23260 uid=0 result="success"
Feb 23 07:14:56 np0005626463.novalocal ovs-vsctl[23263]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan44 -- add-port br-ex vlan44 tag=44 -- set Interface vlan44 type=internal
Feb 23 07:14:56 np0005626463.novalocal NetworkManager[5974]: <info>  [1771830896.7126] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=23270 uid=0 result="success"
Feb 23 07:14:57 np0005626463.novalocal NetworkManager[5974]: <info>  [1771830897.7811] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=23297 uid=0 result="success"
Feb 23 07:14:57 np0005626463.novalocal NetworkManager[5974]: <info>  [1771830897.8281] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=23312 uid=0 result="success"
Feb 23 07:14:57 np0005626463.novalocal NetworkManager[5974]: <info>  [1771830897.8897] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=23333 uid=0 result="success"
Feb 23 07:14:57 np0005626463.novalocal ifup[23334]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Feb 23 07:14:57 np0005626463.novalocal ifup[23335]: 'network-scripts' will be removed from distribution in near future.
Feb 23 07:14:57 np0005626463.novalocal ifup[23336]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Feb 23 07:14:57 np0005626463.novalocal NetworkManager[5974]: <info>  [1771830897.9231] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=23342 uid=0 result="success"
Feb 23 07:14:57 np0005626463.novalocal ovs-vsctl[23345]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan20 -- add-port br-ex vlan20 tag=20 -- set Interface vlan20 type=internal
Feb 23 07:14:57 np0005626463.novalocal NetworkManager[5974]: <info>  [1771830897.9825] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=23352 uid=0 result="success"
Feb 23 07:14:59 np0005626463.novalocal NetworkManager[5974]: <info>  [1771830899.0422] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=23380 uid=0 result="success"
Feb 23 07:14:59 np0005626463.novalocal NetworkManager[5974]: <info>  [1771830899.0911] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=23395 uid=0 result="success"
Feb 23 07:14:59 np0005626463.novalocal NetworkManager[5974]: <info>  [1771830899.1485] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=23416 uid=0 result="success"
Feb 23 07:14:59 np0005626463.novalocal ifup[23417]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Feb 23 07:14:59 np0005626463.novalocal ifup[23418]: 'network-scripts' will be removed from distribution in near future.
Feb 23 07:14:59 np0005626463.novalocal ifup[23419]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Feb 23 07:14:59 np0005626463.novalocal NetworkManager[5974]: <info>  [1771830899.1690] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=23425 uid=0 result="success"
Feb 23 07:14:59 np0005626463.novalocal ovs-vsctl[23428]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan21 -- add-port br-ex vlan21 tag=21 -- set Interface vlan21 type=internal
Feb 23 07:14:59 np0005626463.novalocal NetworkManager[5974]: <info>  [1771830899.2119] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=23435 uid=0 result="success"
Feb 23 07:15:00 np0005626463.novalocal NetworkManager[5974]: <info>  [1771830900.2674] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=23463 uid=0 result="success"
Feb 23 07:15:00 np0005626463.novalocal NetworkManager[5974]: <info>  [1771830900.3170] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=23478 uid=0 result="success"
Feb 23 07:15:00 np0005626463.novalocal NetworkManager[5974]: <info>  [1771830900.3767] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=23499 uid=0 result="success"
Feb 23 07:15:00 np0005626463.novalocal ifup[23500]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Feb 23 07:15:00 np0005626463.novalocal ifup[23501]: 'network-scripts' will be removed from distribution in near future.
Feb 23 07:15:00 np0005626463.novalocal ifup[23502]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Feb 23 07:15:00 np0005626463.novalocal NetworkManager[5974]: <info>  [1771830900.4093] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=23508 uid=0 result="success"
Feb 23 07:15:00 np0005626463.novalocal ovs-vsctl[23511]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan23 -- add-port br-ex vlan23 tag=23 -- set Interface vlan23 type=internal
Feb 23 07:15:00 np0005626463.novalocal NetworkManager[5974]: <info>  [1771830900.4687] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=23518 uid=0 result="success"
Feb 23 07:15:01 np0005626463.novalocal NetworkManager[5974]: <info>  [1771830901.5263] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=23546 uid=0 result="success"
Feb 23 07:15:01 np0005626463.novalocal NetworkManager[5974]: <info>  [1771830901.5717] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=23561 uid=0 result="success"
Feb 23 07:15:01 np0005626463.novalocal NetworkManager[5974]: <info>  [1771830901.6304] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=23582 uid=0 result="success"
Feb 23 07:15:01 np0005626463.novalocal ifup[23583]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Feb 23 07:15:01 np0005626463.novalocal ifup[23584]: 'network-scripts' will be removed from distribution in near future.
Feb 23 07:15:01 np0005626463.novalocal ifup[23585]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Feb 23 07:15:01 np0005626463.novalocal NetworkManager[5974]: <info>  [1771830901.6625] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=23591 uid=0 result="success"
Feb 23 07:15:01 np0005626463.novalocal ovs-vsctl[23594]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan22 -- add-port br-ex vlan22 tag=22 -- set Interface vlan22 type=internal
Feb 23 07:15:01 np0005626463.novalocal NetworkManager[5974]: <info>  [1771830901.7221] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=23601 uid=0 result="success"
Feb 23 07:15:02 np0005626463.novalocal NetworkManager[5974]: <info>  [1771830902.7855] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=23629 uid=0 result="success"
Feb 23 07:15:02 np0005626463.novalocal NetworkManager[5974]: <info>  [1771830902.8342] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=23644 uid=0 result="success"
Feb 23 07:15:02 np0005626463.novalocal sudo[22383]: pam_unix(sudo:session): session closed for user root
Feb 23 07:15:06 np0005626463.novalocal sshd[23662]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 07:15:07 np0005626463.novalocal sshd[23662]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 07:15:47 np0005626463.novalocal sshd[23665]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 07:15:47 np0005626463.novalocal sshd[23665]: Invalid user solana from 92.118.39.72 port 46588
Feb 23 07:15:48 np0005626463.novalocal sshd[23665]: Connection closed by invalid user solana 92.118.39.72 port 46588 [preauth]
Feb 23 07:15:55 np0005626463.novalocal python3[23681]: ansible-ansible.legacy.command Invoked with _raw_params=ip a
                                                       ping -c 2 -W 2 192.168.122.10
                                                       ping -c 2 -W 2 192.168.122.11
                                                        _uses_shell=True zuul_log_id=fa163ef9-e89a-669c-02d2-00000000001b-1-overcloudnovacompute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 07:16:01 np0005626463.novalocal python3[23700]: ansible-ansible.posix.authorized_key Invoked with user=zuul key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCaFiAv9bTisS17GN1FZ7h/VJaLu2YTZdiVc9K45WqX/JhZ8Pwx1BXqBJEGlK+qmwEqEak4GGxIE3mEhiQwiY15E3nfJntSoOu8LhTWltcLr5Uy2mx8nXUZwB3QoNiY4y5uXCriTu6HLCXbdwhnWWT8PgX7V31GosHI0JQEpo4ixoShAJC2sFXL7wd3dTaZBo73qVrUhmekv/2GJo179k6wGblVjOwgB9nkfcuo0acLUaot1Uhc7ZrZM3Nfa7bjrW7OigrLvtNra7bBsjfeTgu6vOxxy1DcTD1xBKab631zTIIugiPViMTGrxgsBpyOc4tJpl1grZCJxm8mBDN25oK2jvP/NwUcC3C9ASWEr9U4QiAOTYN03OAyLGbhq48W3SYHwIi8/awDfJapvA+5rlCq/Xb+Fi/KrAaPxVoEehqWLBzCv0u/ZLZarmOih5rcgNTLMQ3l1/nVNtx+VP3eLtqA5a71JqntFfS80adH/3Px+Wen0lIixRqguGNzrD8ZGcU= zuul-build-sshkey manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 23 07:16:01 np0005626463.novalocal sudo[23714]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fdtcolyziciczmlnilxsuivxamnmvhns ; /usr/bin/python3
Feb 23 07:16:01 np0005626463.novalocal sudo[23714]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 23 07:16:01 np0005626463.novalocal python3[23716]: ansible-ansible.posix.authorized_key Invoked with user=root key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCaFiAv9bTisS17GN1FZ7h/VJaLu2YTZdiVc9K45WqX/JhZ8Pwx1BXqBJEGlK+qmwEqEak4GGxIE3mEhiQwiY15E3nfJntSoOu8LhTWltcLr5Uy2mx8nXUZwB3QoNiY4y5uXCriTu6HLCXbdwhnWWT8PgX7V31GosHI0JQEpo4ixoShAJC2sFXL7wd3dTaZBo73qVrUhmekv/2GJo179k6wGblVjOwgB9nkfcuo0acLUaot1Uhc7ZrZM3Nfa7bjrW7OigrLvtNra7bBsjfeTgu6vOxxy1DcTD1xBKab631zTIIugiPViMTGrxgsBpyOc4tJpl1grZCJxm8mBDN25oK2jvP/NwUcC3C9ASWEr9U4QiAOTYN03OAyLGbhq48W3SYHwIi8/awDfJapvA+5rlCq/Xb+Fi/KrAaPxVoEehqWLBzCv0u/ZLZarmOih5rcgNTLMQ3l1/nVNtx+VP3eLtqA5a71JqntFfS80adH/3Px+Wen0lIixRqguGNzrD8ZGcU= zuul-build-sshkey manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 23 07:16:01 np0005626463.novalocal sudo[23714]: pam_unix(sudo:session): session closed for user root
Feb 23 07:16:02 np0005626463.novalocal sshd[23717]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 07:16:02 np0005626463.novalocal sshd[23717]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 07:16:03 np0005626463.novalocal python3[23732]: ansible-ansible.posix.authorized_key Invoked with user=zuul key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCaFiAv9bTisS17GN1FZ7h/VJaLu2YTZdiVc9K45WqX/JhZ8Pwx1BXqBJEGlK+qmwEqEak4GGxIE3mEhiQwiY15E3nfJntSoOu8LhTWltcLr5Uy2mx8nXUZwB3QoNiY4y5uXCriTu6HLCXbdwhnWWT8PgX7V31GosHI0JQEpo4ixoShAJC2sFXL7wd3dTaZBo73qVrUhmekv/2GJo179k6wGblVjOwgB9nkfcuo0acLUaot1Uhc7ZrZM3Nfa7bjrW7OigrLvtNra7bBsjfeTgu6vOxxy1DcTD1xBKab631zTIIugiPViMTGrxgsBpyOc4tJpl1grZCJxm8mBDN25oK2jvP/NwUcC3C9ASWEr9U4QiAOTYN03OAyLGbhq48W3SYHwIi8/awDfJapvA+5rlCq/Xb+Fi/KrAaPxVoEehqWLBzCv0u/ZLZarmOih5rcgNTLMQ3l1/nVNtx+VP3eLtqA5a71JqntFfS80adH/3Px+Wen0lIixRqguGNzrD8ZGcU= zuul-build-sshkey manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 23 07:16:03 np0005626463.novalocal sudo[23746]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qvnyybdhnyuzzxpgcinihhypreojfgdm ; /usr/bin/python3
Feb 23 07:16:03 np0005626463.novalocal sudo[23746]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 23 07:16:03 np0005626463.novalocal python3[23748]: ansible-ansible.posix.authorized_key Invoked with user=root key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCaFiAv9bTisS17GN1FZ7h/VJaLu2YTZdiVc9K45WqX/JhZ8Pwx1BXqBJEGlK+qmwEqEak4GGxIE3mEhiQwiY15E3nfJntSoOu8LhTWltcLr5Uy2mx8nXUZwB3QoNiY4y5uXCriTu6HLCXbdwhnWWT8PgX7V31GosHI0JQEpo4ixoShAJC2sFXL7wd3dTaZBo73qVrUhmekv/2GJo179k6wGblVjOwgB9nkfcuo0acLUaot1Uhc7ZrZM3Nfa7bjrW7OigrLvtNra7bBsjfeTgu6vOxxy1DcTD1xBKab631zTIIugiPViMTGrxgsBpyOc4tJpl1grZCJxm8mBDN25oK2jvP/NwUcC3C9ASWEr9U4QiAOTYN03OAyLGbhq48W3SYHwIi8/awDfJapvA+5rlCq/Xb+Fi/KrAaPxVoEehqWLBzCv0u/ZLZarmOih5rcgNTLMQ3l1/nVNtx+VP3eLtqA5a71JqntFfS80adH/3Px+Wen0lIixRqguGNzrD8ZGcU= zuul-build-sshkey manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 23 07:16:03 np0005626463.novalocal sudo[23746]: pam_unix(sudo:session): session closed for user root
Feb 23 07:16:04 np0005626463.novalocal python3[23762]: ansible-ansible.builtin.slurp Invoked with path=/etc/hostname src=/etc/hostname
Feb 23 07:16:05 np0005626463.novalocal python3[23777]: ansible-ansible.legacy.command Invoked with _raw_params=hostname="np0005626463.novalocal"
                                                       hostname_str_array=(${hostname//./ })
                                                       echo ${hostname_str_array[0]} > /home/zuul/ansible_hostname
                                                        _uses_shell=True zuul_log_id=fa163ef9-e89a-669c-02d2-000000000022-1-overcloudnovacompute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 07:16:06 np0005626463.novalocal sudo[23795]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-btvxfigoqgvebpxpnukdkdwrkppwuyfh ; /usr/bin/python3
Feb 23 07:16:06 np0005626463.novalocal sudo[23795]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 23 07:16:06 np0005626463.novalocal python3[23797]: ansible-ansible.legacy.command Invoked with _raw_params=hostname=$(cat /home/zuul/ansible_hostname)
                                                       hostnamectl hostname "$hostname.localdomain"
                                                        _uses_shell=True zuul_log_id=fa163ef9-e89a-669c-02d2-000000000023-1-overcloudnovacompute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 07:16:06 np0005626463.novalocal systemd[1]: Starting Hostname Service...
Feb 23 07:16:06 np0005626463.novalocal systemd[1]: Started Hostname Service.
Feb 23 07:16:06 np0005626463.localdomain systemd-hostnamed[23801]: Hostname set to <np0005626463.localdomain> (static)
Feb 23 07:16:06 np0005626463.localdomain NetworkManager[5974]: <info>  [1771830966.8026] hostname: static hostname changed from "np0005626463.novalocal" to "np0005626463.localdomain"
Feb 23 07:16:06 np0005626463.localdomain systemd[1]: Starting Network Manager Script Dispatcher Service...
Feb 23 07:16:06 np0005626463.localdomain sudo[23795]: pam_unix(sudo:session): session closed for user root
Feb 23 07:16:06 np0005626463.localdomain systemd[1]: Started Network Manager Script Dispatcher Service.
Feb 23 07:16:08 np0005626463.localdomain sshd[19082]: pam_unix(sshd:session): session closed for user zuul
Feb 23 07:16:08 np0005626463.localdomain systemd[1]: session-10.scope: Deactivated successfully.
Feb 23 07:16:08 np0005626463.localdomain systemd[1]: session-10.scope: Consumed 1min 42.279s CPU time.
Feb 23 07:16:08 np0005626463.localdomain systemd-logind[759]: Session 10 logged out. Waiting for processes to exit.
Feb 23 07:16:08 np0005626463.localdomain systemd-logind[759]: Removed session 10.
Feb 23 07:16:10 np0005626463.localdomain sshd[23812]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 07:16:10 np0005626463.localdomain sshd[23812]: Accepted publickey for zuul from 38.102.83.114 port 34928 ssh2: RSA SHA256:/ShS2J5Dq7o9P59e/NmgQORSAcJOBwu46Huo03HBdB4
Feb 23 07:16:10 np0005626463.localdomain systemd[1]: Started Session 11 of User zuul.
Feb 23 07:16:10 np0005626463.localdomain systemd-logind[759]: New session 11 of user zuul.
Feb 23 07:16:10 np0005626463.localdomain sshd[23812]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Feb 23 07:16:11 np0005626463.localdomain python3[23829]: ansible-ansible.builtin.slurp Invoked with path=/home/zuul/ansible_hostname src=/home/zuul/ansible_hostname
Feb 23 07:16:13 np0005626463.localdomain sshd[23812]: pam_unix(sshd:session): session closed for user zuul
Feb 23 07:16:13 np0005626463.localdomain systemd[1]: session-11.scope: Deactivated successfully.
Feb 23 07:16:13 np0005626463.localdomain systemd-logind[759]: Session 11 logged out. Waiting for processes to exit.
Feb 23 07:16:13 np0005626463.localdomain systemd-logind[759]: Removed session 11.
Feb 23 07:16:16 np0005626463.localdomain systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Feb 23 07:16:36 np0005626463.localdomain systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Feb 23 07:16:58 np0005626463.localdomain sshd[23834]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 07:16:58 np0005626463.localdomain sshd[23834]: Accepted publickey for zuul from 38.102.83.114 port 38296 ssh2: RSA SHA256:/ShS2J5Dq7o9P59e/NmgQORSAcJOBwu46Huo03HBdB4
Feb 23 07:16:58 np0005626463.localdomain systemd-logind[759]: New session 12 of user zuul.
Feb 23 07:16:58 np0005626463.localdomain systemd[1]: Started Session 12 of User zuul.
Feb 23 07:16:59 np0005626463.localdomain sshd[23834]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Feb 23 07:16:59 np0005626463.localdomain sudo[23851]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vxgavsyzzxefuqznhvtjhdhfyrsmoiyd ; /usr/bin/python3
Feb 23 07:16:59 np0005626463.localdomain sudo[23851]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 23 07:16:59 np0005626463.localdomain python3[23853]: ansible-ansible.legacy.dnf Invoked with name=['lvm2', 'jq'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Feb 23 07:16:59 np0005626463.localdomain sshd[23855]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 07:17:00 np0005626463.localdomain sshd[23855]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 07:17:03 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 07:17:03 np0005626463.localdomain systemd-rc-local-generator[23892]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 07:17:03 np0005626463.localdomain systemd-sysv-generator[23897]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 07:17:03 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 07:17:03 np0005626463.localdomain systemd[1]: Starting dnf makecache...
Feb 23 07:17:03 np0005626463.localdomain systemd[1]: Listening on Device-mapper event daemon FIFOs.
Feb 23 07:17:03 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 07:17:03 np0005626463.localdomain dnf[23910]: Updating Subscription Management repositories.
Feb 23 07:17:03 np0005626463.localdomain systemd-rc-local-generator[23937]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 07:17:03 np0005626463.localdomain systemd-sysv-generator[23941]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 07:17:03 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 07:17:03 np0005626463.localdomain systemd[1]: Starting Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling...
Feb 23 07:17:03 np0005626463.localdomain systemd[1]: Finished Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling.
Feb 23 07:17:03 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 07:17:03 np0005626463.localdomain systemd-rc-local-generator[23975]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 07:17:03 np0005626463.localdomain systemd-sysv-generator[23978]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 07:17:03 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 07:17:04 np0005626463.localdomain systemd[1]: Listening on LVM2 poll daemon socket.
Feb 23 07:17:04 np0005626463.localdomain systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 23 07:17:04 np0005626463.localdomain systemd[1]: Starting man-db-cache-update.service...
Feb 23 07:17:04 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 07:17:04 np0005626463.localdomain systemd-rc-local-generator[24022]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 07:17:04 np0005626463.localdomain systemd-sysv-generator[24027]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 07:17:04 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 07:17:04 np0005626463.localdomain systemd[1]: Queuing reload/restart jobs for marked units…
Feb 23 07:17:04 np0005626463.localdomain systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 23 07:17:04 np0005626463.localdomain systemd[1]: man-db-cache-update.service: Deactivated successfully.
Feb 23 07:17:04 np0005626463.localdomain systemd[1]: Finished man-db-cache-update.service.
Feb 23 07:17:04 np0005626463.localdomain systemd[1]: run-rb78154d55188478394a5af81ba94bc52.service: Deactivated successfully.
Feb 23 07:17:04 np0005626463.localdomain systemd[1]: run-r32bec21cddeb4fa780f7571856fe16fe.service: Deactivated successfully.
Feb 23 07:17:05 np0005626463.localdomain dnf[23910]: Failed determining last makecache time.
Feb 23 07:17:05 np0005626463.localdomain sudo[23851]: pam_unix(sudo:session): session closed for user root
Feb 23 07:17:05 np0005626463.localdomain dnf[23910]: Red Hat Enterprise Linux 9 for x86_64 - High Av  29 kB/s | 4.0 kB     00:00
Feb 23 07:17:05 np0005626463.localdomain dnf[23910]: Fast Datapath for RHEL 9 x86_64 (RPMs)           26 kB/s | 4.0 kB     00:00
Feb 23 07:17:05 np0005626463.localdomain dnf[23910]: Red Hat Enterprise Linux 9 for x86_64 - BaseOS   28 kB/s | 4.1 kB     00:00
Feb 23 07:17:05 np0005626463.localdomain dnf[23910]: Red Hat Enterprise Linux 9 for x86_64 - BaseOS   30 kB/s | 4.1 kB     00:00
Feb 23 07:17:06 np0005626463.localdomain dnf[23910]: Red Hat OpenStack Platform 17.1 for RHEL 9 x86_  30 kB/s | 4.0 kB     00:00
Feb 23 07:17:06 np0005626463.localdomain dnf[23910]: Red Hat Enterprise Linux 9 for x86_64 - AppStre  31 kB/s | 4.5 kB     00:00
Feb 23 07:17:06 np0005626463.localdomain dnf[23910]: Red Hat Enterprise Linux 9 for x86_64 - AppStre  33 kB/s | 4.5 kB     00:00
Feb 23 07:17:06 np0005626463.localdomain dnf[23910]: Metadata cache created.
Feb 23 07:17:06 np0005626463.localdomain systemd[1]: dnf-makecache.service: Deactivated successfully.
Feb 23 07:17:06 np0005626463.localdomain systemd[1]: Finished dnf makecache.
Feb 23 07:17:06 np0005626463.localdomain systemd[1]: dnf-makecache.service: Consumed 2.833s CPU time.
Feb 23 07:17:40 np0005626463.localdomain sshd[24635]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 07:17:40 np0005626463.localdomain sshd[24635]: error: kex_exchange_identification: Connection closed by remote host
Feb 23 07:17:40 np0005626463.localdomain sshd[24635]: Connection closed by 165.245.131.32 port 37054
Feb 23 07:17:56 np0005626463.localdomain sshd[24636]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 07:17:56 np0005626463.localdomain sshd[24636]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 07:18:05 np0005626463.localdomain sshd[23837]: Received disconnect from 38.102.83.114 port 38296:11: disconnected by user
Feb 23 07:18:05 np0005626463.localdomain sshd[23837]: Disconnected from user zuul 38.102.83.114 port 38296
Feb 23 07:18:05 np0005626463.localdomain sshd[23834]: pam_unix(sshd:session): session closed for user zuul
Feb 23 07:18:05 np0005626463.localdomain systemd-logind[759]: Session 12 logged out. Waiting for processes to exit.
Feb 23 07:18:05 np0005626463.localdomain systemd[1]: session-12.scope: Deactivated successfully.
Feb 23 07:18:05 np0005626463.localdomain systemd[1]: session-12.scope: Consumed 4.768s CPU time.
Feb 23 07:18:05 np0005626463.localdomain systemd-logind[759]: Removed session 12.
Feb 23 07:18:51 np0005626463.localdomain sshd[24638]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 07:18:51 np0005626463.localdomain sshd[24638]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 07:19:05 np0005626463.localdomain sshd[24640]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 07:19:06 np0005626463.localdomain sshd[24640]: Invalid user sol from 92.118.39.72 port 42426
Feb 23 07:19:06 np0005626463.localdomain sshd[24640]: Connection closed by invalid user sol 92.118.39.72 port 42426 [preauth]
Feb 23 07:19:42 np0005626463.localdomain sshd[24642]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 07:19:43 np0005626463.localdomain sshd[24642]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 07:20:36 np0005626463.localdomain sshd[24644]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 07:20:36 np0005626463.localdomain sshd[24644]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 07:21:29 np0005626463.localdomain sshd[24646]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 07:21:30 np0005626463.localdomain sshd[24646]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 07:22:13 np0005626463.localdomain sshd[24649]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 07:22:14 np0005626463.localdomain sshd[24649]: Invalid user sol from 92.118.39.72 port 38282
Feb 23 07:22:14 np0005626463.localdomain sshd[24649]: Connection closed by invalid user sol 92.118.39.72 port 38282 [preauth]
Feb 23 07:22:23 np0005626463.localdomain sshd[24651]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 07:22:23 np0005626463.localdomain sshd[24651]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 07:24:12 np0005626463.localdomain sshd[24653]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 07:24:13 np0005626463.localdomain sshd[24653]: Invalid user config from 80.94.95.116 port 20552
Feb 23 07:24:14 np0005626463.localdomain sshd[24653]: Connection closed by invalid user config 80.94.95.116 port 20552 [preauth]
Feb 23 07:25:17 np0005626463.localdomain sshd[24655]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 07:25:18 np0005626463.localdomain sshd[24655]: Invalid user validator from 92.118.39.72 port 34128
Feb 23 07:25:18 np0005626463.localdomain sshd[24655]: Connection closed by invalid user validator 92.118.39.72 port 34128 [preauth]
Feb 23 07:28:14 np0005626463.localdomain sshd[24659]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 07:28:15 np0005626463.localdomain sshd[24659]: Invalid user blockchain from 92.118.39.72 port 58194
Feb 23 07:28:15 np0005626463.localdomain sshd[24659]: Connection closed by invalid user blockchain 92.118.39.72 port 58194 [preauth]
Feb 23 07:30:01 np0005626463.localdomain anacron[19051]: Job `cron.daily' started
Feb 23 07:30:01 np0005626463.localdomain anacron[19051]: Job `cron.daily' terminated
Feb 23 07:31:23 np0005626463.localdomain sshd[24663]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 07:31:23 np0005626463.localdomain sshd[24663]: Invalid user pool from 92.118.39.72 port 54040
Feb 23 07:31:23 np0005626463.localdomain sshd[24663]: Connection closed by invalid user pool 92.118.39.72 port 54040 [preauth]
Feb 23 07:34:09 np0005626463.localdomain sshd[24666]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 07:34:09 np0005626463.localdomain sshd[24666]: Accepted publickey for zuul from 192.168.122.100 port 44382 ssh2: RSA SHA256:/ShS2J5Dq7o9P59e/NmgQORSAcJOBwu46Huo03HBdB4
Feb 23 07:34:09 np0005626463.localdomain systemd-logind[759]: New session 13 of user zuul.
Feb 23 07:34:09 np0005626463.localdomain systemd[1]: Started Session 13 of User zuul.
Feb 23 07:34:09 np0005626463.localdomain sshd[24666]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Feb 23 07:34:09 np0005626463.localdomain sudo[24712]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rcuylysbkdbtlkkmgiehmyeekwcaptfg ; /usr/bin/python3
Feb 23 07:34:09 np0005626463.localdomain sudo[24712]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 23 07:34:09 np0005626463.localdomain python3[24714]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 23 07:34:10 np0005626463.localdomain sudo[24712]: pam_unix(sudo:session): session closed for user root
Feb 23 07:34:11 np0005626463.localdomain sudo[24799]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oczyhtvuvjtayzmpadujrskmotnhgeyj ; /usr/bin/python3
Feb 23 07:34:11 np0005626463.localdomain sudo[24799]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 23 07:34:11 np0005626463.localdomain python3[24801]: ansible-ansible.builtin.dnf Invoked with name=['util-linux', 'lvm2', 'jq', 'podman'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Feb 23 07:34:14 np0005626463.localdomain sudo[24799]: pam_unix(sudo:session): session closed for user root
Feb 23 07:34:14 np0005626463.localdomain sudo[24816]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nebtrrxgpptctdaqcbccuiokuqpzlkcz ; /usr/bin/python3
Feb 23 07:34:14 np0005626463.localdomain sudo[24816]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 23 07:34:14 np0005626463.localdomain python3[24818]: ansible-ansible.builtin.stat Invoked with path=/dev/loop3 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 23 07:34:14 np0005626463.localdomain sudo[24816]: pam_unix(sudo:session): session closed for user root
Feb 23 07:34:15 np0005626463.localdomain sudo[24832]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-takpftorvxzoerhybtiganlchmvyvsyv ; /usr/bin/python3
Feb 23 07:34:15 np0005626463.localdomain sudo[24832]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 23 07:34:15 np0005626463.localdomain python3[24834]: ansible-ansible.legacy.command Invoked with _raw_params=dd if=/dev/zero of=/var/lib/ceph-osd-0.img bs=1 count=0 seek=7G
                                                         losetup /dev/loop3 /var/lib/ceph-osd-0.img
                                                         lsblk _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 07:34:15 np0005626463.localdomain kernel: loop: module loaded
Feb 23 07:34:15 np0005626463.localdomain kernel: loop3: detected capacity change from 0 to 14680064
Feb 23 07:34:15 np0005626463.localdomain sudo[24832]: pam_unix(sudo:session): session closed for user root
Feb 23 07:34:15 np0005626463.localdomain sudo[24858]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wyeuphusmewkyxxjrhnipecoxtxwkydc ; /usr/bin/python3
Feb 23 07:34:15 np0005626463.localdomain sudo[24858]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 23 07:34:15 np0005626463.localdomain python3[24860]: ansible-ansible.legacy.command Invoked with _raw_params=pvcreate /dev/loop3
                                                         vgcreate ceph_vg0 /dev/loop3
                                                         lvcreate -n ceph_lv0 -l +100%FREE ceph_vg0
                                                         lvs _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 07:34:16 np0005626463.localdomain lvm[24863]: PV /dev/loop3 not used.
Feb 23 07:34:16 np0005626463.localdomain lvm[24865]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Feb 23 07:34:16 np0005626463.localdomain systemd[1]: Started /usr/sbin/lvm vgchange -aay --autoactivation event ceph_vg0.
Feb 23 07:34:16 np0005626463.localdomain lvm[24870]:   1 logical volume(s) in volume group "ceph_vg0" now active
Feb 23 07:34:16 np0005626463.localdomain lvm[24875]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Feb 23 07:34:16 np0005626463.localdomain lvm[24875]: VG ceph_vg0 finished
Feb 23 07:34:16 np0005626463.localdomain systemd[1]: lvm-activate-ceph_vg0.service: Deactivated successfully.
Feb 23 07:34:16 np0005626463.localdomain sudo[24858]: pam_unix(sudo:session): session closed for user root
Feb 23 07:34:16 np0005626463.localdomain sudo[24921]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gpayvinztbewgtwhdoycobhhiieaypjd ; /usr/bin/python3
Feb 23 07:34:16 np0005626463.localdomain sudo[24921]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 23 07:34:16 np0005626463.localdomain python3[24923]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/ceph-osd-losetup-0.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 07:34:16 np0005626463.localdomain sudo[24921]: pam_unix(sudo:session): session closed for user root
Feb 23 07:34:17 np0005626463.localdomain sudo[24964]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bmfyzvgsjvamneorutiqblpwuynyhpws ; /usr/bin/python3
Feb 23 07:34:17 np0005626463.localdomain sudo[24964]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 23 07:34:17 np0005626463.localdomain python3[24966]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1771832056.4736516-55332-185809912784296/source dest=/etc/systemd/system/ceph-osd-losetup-0.service mode=0644 force=True follow=False _original_basename=ceph-osd-losetup.service.j2 checksum=427b1db064a970126b729b07acf99fa7d0eecb9c backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 07:34:17 np0005626463.localdomain sudo[24964]: pam_unix(sudo:session): session closed for user root
Feb 23 07:34:17 np0005626463.localdomain sudo[24994]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ymnbkwjtclegpmpxwfasuytqkfzwgyic ; /usr/bin/python3
Feb 23 07:34:17 np0005626463.localdomain sudo[24994]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 23 07:34:18 np0005626463.localdomain python3[24996]: ansible-ansible.builtin.systemd Invoked with state=started enabled=True name=ceph-osd-losetup-0.service daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 07:34:19 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 07:34:19 np0005626463.localdomain systemd-sysv-generator[25029]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 07:34:19 np0005626463.localdomain systemd-rc-local-generator[25022]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 07:34:19 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 07:34:19 np0005626463.localdomain systemd[1]: Starting Ceph OSD losetup...
Feb 23 07:34:19 np0005626463.localdomain bash[25038]: /dev/loop3: [64516]:8399529 (/var/lib/ceph-osd-0.img)
Feb 23 07:34:19 np0005626463.localdomain systemd[1]: Finished Ceph OSD losetup.
Feb 23 07:34:19 np0005626463.localdomain lvm[25040]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Feb 23 07:34:19 np0005626463.localdomain lvm[25040]: VG ceph_vg0 finished
Feb 23 07:34:19 np0005626463.localdomain sudo[24994]: pam_unix(sudo:session): session closed for user root
Feb 23 07:34:19 np0005626463.localdomain sudo[25054]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hqaxugiaxdsnwuauqrcteemcaoxihbjz ; /usr/bin/python3
Feb 23 07:34:19 np0005626463.localdomain sudo[25054]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 23 07:34:20 np0005626463.localdomain python3[25056]: ansible-ansible.builtin.dnf Invoked with name=['util-linux', 'lvm2', 'jq', 'podman'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Feb 23 07:34:22 np0005626463.localdomain sudo[25054]: pam_unix(sudo:session): session closed for user root
Feb 23 07:34:22 np0005626463.localdomain sudo[25071]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kaykrlnptpvadwgiwphilnhbwxdjilcv ; /usr/bin/python3
Feb 23 07:34:22 np0005626463.localdomain sudo[25071]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 23 07:34:22 np0005626463.localdomain python3[25073]: ansible-ansible.builtin.stat Invoked with path=/dev/loop4 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 23 07:34:22 np0005626463.localdomain sudo[25071]: pam_unix(sudo:session): session closed for user root
Feb 23 07:34:23 np0005626463.localdomain sudo[25087]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tyuzilrevkwpzrinacgbsuimmkvzbdsh ; /usr/bin/python3
Feb 23 07:34:23 np0005626463.localdomain sudo[25087]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 23 07:34:23 np0005626463.localdomain python3[25089]: ansible-ansible.legacy.command Invoked with _raw_params=dd if=/dev/zero of=/var/lib/ceph-osd-1.img bs=1 count=0 seek=7G
                                                         losetup /dev/loop4 /var/lib/ceph-osd-1.img
                                                         lsblk _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 07:34:23 np0005626463.localdomain kernel: loop4: detected capacity change from 0 to 14680064
Feb 23 07:34:23 np0005626463.localdomain sudo[25087]: pam_unix(sudo:session): session closed for user root
Feb 23 07:34:23 np0005626463.localdomain sudo[25109]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rzvjdzktbwayfctnegrboezzkiefeilj ; /usr/bin/python3
Feb 23 07:34:23 np0005626463.localdomain sudo[25109]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 23 07:34:23 np0005626463.localdomain python3[25111]: ansible-ansible.legacy.command Invoked with _raw_params=pvcreate /dev/loop4
                                                         vgcreate ceph_vg1 /dev/loop4
                                                         lvcreate -n ceph_lv1 -l +100%FREE ceph_vg1
                                                         lvs _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 07:34:24 np0005626463.localdomain lvm[25114]: PV /dev/loop4 not used.
Feb 23 07:34:24 np0005626463.localdomain lvm[25116]: PV /dev/loop4 online, VG ceph_vg1 is complete.
Feb 23 07:34:24 np0005626463.localdomain systemd[1]: Started /usr/sbin/lvm vgchange -aay --autoactivation event ceph_vg1.
Feb 23 07:34:24 np0005626463.localdomain lvm[25126]: PV /dev/loop4 online, VG ceph_vg1 is complete.
Feb 23 07:34:24 np0005626463.localdomain lvm[25126]: VG ceph_vg1 finished
Feb 23 07:34:24 np0005626463.localdomain lvm[25127]:   1 logical volume(s) in volume group "ceph_vg1" now active
Feb 23 07:34:24 np0005626463.localdomain systemd[1]: lvm-activate-ceph_vg1.service: Deactivated successfully.
Feb 23 07:34:24 np0005626463.localdomain sudo[25109]: pam_unix(sudo:session): session closed for user root
Feb 23 07:34:24 np0005626463.localdomain sudo[25173]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qfsgdmjsxycqhnyyiorprtcnzxegpcki ; /usr/bin/python3
Feb 23 07:34:24 np0005626463.localdomain sudo[25173]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 23 07:34:24 np0005626463.localdomain python3[25175]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/ceph-osd-losetup-1.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 07:34:24 np0005626463.localdomain sudo[25173]: pam_unix(sudo:session): session closed for user root
Feb 23 07:34:25 np0005626463.localdomain sudo[25216]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vwtcfoidzamarhgmmbfxyypklccuzcja ; /usr/bin/python3
Feb 23 07:34:25 np0005626463.localdomain sudo[25216]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 23 07:34:25 np0005626463.localdomain python3[25218]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1771832064.5415137-55436-245063323876259/source dest=/etc/systemd/system/ceph-osd-losetup-1.service mode=0644 force=True follow=False _original_basename=ceph-osd-losetup.service.j2 checksum=19612168ea279db4171b94ee1f8625de1ec44b58 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 07:34:25 np0005626463.localdomain sudo[25216]: pam_unix(sudo:session): session closed for user root
Feb 23 07:34:25 np0005626463.localdomain sudo[25246]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-swytfbnhghovqaqgfjpcdrjhkwhcxedk ; /usr/bin/python3
Feb 23 07:34:25 np0005626463.localdomain sudo[25246]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 23 07:34:25 np0005626463.localdomain python3[25248]: ansible-ansible.builtin.systemd Invoked with state=started enabled=True name=ceph-osd-losetup-1.service daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 07:34:26 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 07:34:26 np0005626463.localdomain systemd-sysv-generator[25281]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 07:34:26 np0005626463.localdomain systemd-rc-local-generator[25276]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 07:34:26 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 07:34:27 np0005626463.localdomain systemd[1]: Starting Ceph OSD losetup...
Feb 23 07:34:27 np0005626463.localdomain bash[25289]: /dev/loop4: [64516]:9169183 (/var/lib/ceph-osd-1.img)
Feb 23 07:34:27 np0005626463.localdomain systemd[1]: Finished Ceph OSD losetup.
Feb 23 07:34:27 np0005626463.localdomain lvm[25290]: PV /dev/loop4 online, VG ceph_vg1 is complete.
Feb 23 07:34:27 np0005626463.localdomain lvm[25290]: VG ceph_vg1 finished
Feb 23 07:34:27 np0005626463.localdomain sudo[25246]: pam_unix(sudo:session): session closed for user root
Feb 23 07:34:30 np0005626463.localdomain sshd[25293]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 07:34:31 np0005626463.localdomain sshd[25293]: Invalid user miner from 92.118.39.72 port 49908
Feb 23 07:34:31 np0005626463.localdomain sshd[25293]: Connection closed by invalid user miner 92.118.39.72 port 49908 [preauth]
Feb 23 07:34:35 np0005626463.localdomain sshd[25338]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 07:34:35 np0005626463.localdomain sudo[25337]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-meeiabxfjygqfrljzyqjvaktyryezwym ; /usr/bin/python3
Feb 23 07:34:35 np0005626463.localdomain sudo[25337]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 23 07:34:35 np0005626463.localdomain python3[25340]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all', 'min'] gather_timeout=45 filter=[] fact_path=/etc/ansible/facts.d
Feb 23 07:34:35 np0005626463.localdomain sudo[25337]: pam_unix(sudo:session): session closed for user root
Feb 23 07:34:36 np0005626463.localdomain sshd[25338]: Invalid user support from 80.94.95.115 port 47832
Feb 23 07:34:36 np0005626463.localdomain sudo[25359]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wbsojejbyllmtsljkmqtiqvllqqbbuxq ; /usr/bin/python3
Feb 23 07:34:36 np0005626463.localdomain sudo[25359]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 23 07:34:36 np0005626463.localdomain sshd[25338]: Connection closed by invalid user support 80.94.95.115 port 47832 [preauth]
Feb 23 07:34:37 np0005626463.localdomain python3[25361]: ansible-hostname Invoked with name=np0005626463.localdomain use=None
Feb 23 07:34:37 np0005626463.localdomain systemd[1]: Starting Hostname Service...
Feb 23 07:34:37 np0005626463.localdomain systemd[1]: Started Hostname Service.
Feb 23 07:34:37 np0005626463.localdomain sudo[25359]: pam_unix(sudo:session): session closed for user root
Feb 23 07:34:39 np0005626463.localdomain sudo[25382]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rzwqjfevyfslozartquovftqmcygdgtz ; /usr/bin/python3
Feb 23 07:34:39 np0005626463.localdomain sudo[25382]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 23 07:34:39 np0005626463.localdomain python3[25384]: ansible-tempfile Invoked with state=file suffix=tmphosts prefix=ansible. path=None
Feb 23 07:34:39 np0005626463.localdomain sudo[25382]: pam_unix(sudo:session): session closed for user root
Feb 23 07:34:39 np0005626463.localdomain sudo[25430]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hutceeehjajiocjjgklkqfricuencesk ; /usr/bin/python3
Feb 23 07:34:39 np0005626463.localdomain sudo[25430]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 23 07:34:40 np0005626463.localdomain python3[25432]: ansible-ansible.legacy.copy Invoked with remote_src=True src=/etc/hosts dest=/tmp/ansible.eefre24ztmphosts mode=preserve backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 07:34:40 np0005626463.localdomain sudo[25430]: pam_unix(sudo:session): session closed for user root
Feb 23 07:34:40 np0005626463.localdomain sudo[25460]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-isluejpbvrvtoocwhrjneuvonhyascbb ; /usr/bin/python3
Feb 23 07:34:40 np0005626463.localdomain sudo[25460]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 23 07:34:40 np0005626463.localdomain python3[25462]: ansible-blockinfile Invoked with state=absent path=/tmp/ansible.eefre24ztmphosts block= marker=# {mark} marker_begin=HEAT_HOSTS_START - Do not edit manually within this section! marker_end=HEAT_HOSTS_END create=False backup=False unsafe_writes=False insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 07:34:40 np0005626463.localdomain sudo[25460]: pam_unix(sudo:session): session closed for user root
Feb 23 07:34:40 np0005626463.localdomain sudo[25476]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mmfryptsvmwqzysktgzjlmaheftntkah ; /usr/bin/python3
Feb 23 07:34:40 np0005626463.localdomain sudo[25476]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 23 07:34:41 np0005626463.localdomain python3[25478]: ansible-blockinfile Invoked with create=True path=/tmp/ansible.eefre24ztmphosts insertbefore=BOF block=192.168.122.106 np0005626463.localdomain np0005626463
                                                         192.168.122.106 np0005626463.ctlplane.localdomain np0005626463.ctlplane
                                                         192.168.122.107 np0005626465.localdomain np0005626465
                                                         192.168.122.107 np0005626465.ctlplane.localdomain np0005626465.ctlplane
                                                         192.168.122.108 np0005626466.localdomain np0005626466
                                                         192.168.122.108 np0005626466.ctlplane.localdomain np0005626466.ctlplane
                                                         192.168.122.103 np0005626459.localdomain np0005626459
                                                         192.168.122.103 np0005626459.ctlplane.localdomain np0005626459.ctlplane
                                                         192.168.122.104 np0005626460.localdomain np0005626460
                                                         192.168.122.104 np0005626460.ctlplane.localdomain np0005626460.ctlplane
                                                         192.168.122.105 np0005626461.localdomain np0005626461
                                                         192.168.122.105 np0005626461.ctlplane.localdomain np0005626461.ctlplane
                                                         
                                                         192.168.122.100 undercloud.ctlplane.localdomain undercloud.ctlplane
                                                          marker=# {mark} marker_begin=START_HOST_ENTRIES_FOR_STACK: overcloud marker_end=END_HOST_ENTRIES_FOR_STACK: overcloud state=present backup=False unsafe_writes=False insertafter=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 07:34:41 np0005626463.localdomain sudo[25476]: pam_unix(sudo:session): session closed for user root
Feb 23 07:34:41 np0005626463.localdomain sudo[25492]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-anuxmbrfpigknkaaazblbcjlkrhpdbhr ; /usr/bin/python3
Feb 23 07:34:41 np0005626463.localdomain sudo[25492]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 23 07:34:41 np0005626463.localdomain python3[25494]: ansible-ansible.legacy.command Invoked with _raw_params=cp "/tmp/ansible.eefre24ztmphosts" "/etc/hosts" _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 07:34:41 np0005626463.localdomain sudo[25492]: pam_unix(sudo:session): session closed for user root
Feb 23 07:34:41 np0005626463.localdomain sudo[25509]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ycdxnlicbxsmfbfegovxwaprcconcwxo ; /usr/bin/python3
Feb 23 07:34:41 np0005626463.localdomain sudo[25509]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 23 07:34:42 np0005626463.localdomain python3[25511]: ansible-file Invoked with path=/tmp/ansible.eefre24ztmphosts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 07:34:42 np0005626463.localdomain sudo[25509]: pam_unix(sudo:session): session closed for user root
Feb 23 07:34:44 np0005626463.localdomain sudo[25525]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rfejpheffwmmghxkzdmjvdvopzywsysk ; /usr/bin/python3
Feb 23 07:34:44 np0005626463.localdomain sudo[25525]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 23 07:34:44 np0005626463.localdomain python3[25527]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-active ntpd.service || systemctl is-enabled ntpd.service _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 07:34:44 np0005626463.localdomain sudo[25525]: pam_unix(sudo:session): session closed for user root
Feb 23 07:34:44 np0005626463.localdomain sudo[25543]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qpinzzuissssifzplbwmxthjmhvtfwsj ; /usr/bin/python3
Feb 23 07:34:44 np0005626463.localdomain sudo[25543]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 23 07:34:45 np0005626463.localdomain python3[25545]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Feb 23 07:34:47 np0005626463.localdomain sudo[25543]: pam_unix(sudo:session): session closed for user root
Feb 23 07:34:49 np0005626463.localdomain sudo[25592]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ofmmyvecrxhjakuiiczqdfmtrxzprokr ; /usr/bin/python3
Feb 23 07:34:49 np0005626463.localdomain sudo[25592]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 23 07:34:49 np0005626463.localdomain python3[25594]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 07:34:49 np0005626463.localdomain sudo[25592]: pam_unix(sudo:session): session closed for user root
Feb 23 07:34:49 np0005626463.localdomain sudo[25637]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kemjidrfcppogxmsnsyissqnwitobiqj ; /usr/bin/python3
Feb 23 07:34:49 np0005626463.localdomain sudo[25637]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 23 07:34:49 np0005626463.localdomain python3[25639]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1771832088.7710097-56351-32313271728406/source dest=/etc/chrony.conf owner=root group=root mode=420 follow=False _original_basename=chrony.conf.j2 checksum=4fd4fbbb2de00c70a54478b7feb8ef8adf6a3362 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 07:34:49 np0005626463.localdomain sudo[25637]: pam_unix(sudo:session): session closed for user root
Feb 23 07:34:50 np0005626463.localdomain sudo[25667]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mszvlqmbujhugoptkgkllcskbkvmvalt ; /usr/bin/python3
Feb 23 07:34:50 np0005626463.localdomain sudo[25667]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 23 07:34:51 np0005626463.localdomain python3[25669]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 07:34:52 np0005626463.localdomain sudo[25667]: pam_unix(sudo:session): session closed for user root
Feb 23 07:34:52 np0005626463.localdomain sudo[25685]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ispdqcnzxgvlgepnmefsofqqzpiksmzr ; /usr/bin/python3
Feb 23 07:34:52 np0005626463.localdomain sudo[25685]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 23 07:34:52 np0005626463.localdomain python3[25687]: ansible-ansible.legacy.systemd Invoked with name=chronyd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 23 07:34:52 np0005626463.localdomain chronyd[765]: chronyd exiting
Feb 23 07:34:52 np0005626463.localdomain systemd[1]: Stopping NTP client/server...
Feb 23 07:34:52 np0005626463.localdomain systemd[1]: chronyd.service: Deactivated successfully.
Feb 23 07:34:52 np0005626463.localdomain systemd[1]: Stopped NTP client/server.
Feb 23 07:34:52 np0005626463.localdomain systemd[1]: chronyd.service: Consumed 110ms CPU time, read 1.9M from disk, written 0B to disk.
Feb 23 07:34:52 np0005626463.localdomain systemd[1]: Starting NTP client/server...
Feb 23 07:34:52 np0005626463.localdomain chronyd[25695]: chronyd version 4.3 starting (+CMDMON +NTP +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +ASYNCDNS +NTS +SECHASH +IPV6 +DEBUG)
Feb 23 07:34:52 np0005626463.localdomain chronyd[25695]: Frequency -30.767 +/- 0.158 ppm read from /var/lib/chrony/drift
Feb 23 07:34:52 np0005626463.localdomain chronyd[25695]: Loaded seccomp filter (level 2)
Feb 23 07:34:52 np0005626463.localdomain systemd[1]: Started NTP client/server.
Feb 23 07:34:52 np0005626463.localdomain sudo[25685]: pam_unix(sudo:session): session closed for user root
Feb 23 07:34:54 np0005626463.localdomain sudo[25742]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-brrgkdpzckaiuciheqhvigmlydignvve ; /usr/bin/python3
Feb 23 07:34:54 np0005626463.localdomain sudo[25742]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 23 07:34:54 np0005626463.localdomain python3[25744]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/chrony-online.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 07:34:54 np0005626463.localdomain sudo[25742]: pam_unix(sudo:session): session closed for user root
Feb 23 07:34:54 np0005626463.localdomain sudo[25785]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dtowovxibojzjsxvjduzeyjeaawxxvgq ; /usr/bin/python3
Feb 23 07:34:54 np0005626463.localdomain sudo[25785]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 23 07:34:55 np0005626463.localdomain python3[25787]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1771832094.2500186-56514-214745890760840/source dest=/etc/systemd/system/chrony-online.service _original_basename=chrony-online.service follow=False checksum=d4d85e046d61f558ac7ec8178c6d529d893e81e1 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 07:34:55 np0005626463.localdomain sudo[25785]: pam_unix(sudo:session): session closed for user root
Feb 23 07:34:55 np0005626463.localdomain sudo[25815]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ptgdjyjiljhbyzmfaphlmdsfpfpaecys ; /usr/bin/python3
Feb 23 07:34:55 np0005626463.localdomain sudo[25815]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 23 07:34:55 np0005626463.localdomain python3[25817]: ansible-systemd Invoked with state=started name=chrony-online.service enabled=True daemon-reload=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 07:34:55 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 07:34:55 np0005626463.localdomain systemd-rc-local-generator[25840]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 07:34:55 np0005626463.localdomain systemd-sysv-generator[25845]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 07:34:55 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 07:34:55 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 07:34:56 np0005626463.localdomain systemd-rc-local-generator[25885]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 07:34:56 np0005626463.localdomain systemd-sysv-generator[25888]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 07:34:56 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 07:34:56 np0005626463.localdomain systemd[1]: Starting chronyd online sources service...
Feb 23 07:34:56 np0005626463.localdomain chronyc[25894]: 200 OK
Feb 23 07:34:56 np0005626463.localdomain systemd[1]: chrony-online.service: Deactivated successfully.
Feb 23 07:34:56 np0005626463.localdomain systemd[1]: Finished chronyd online sources service.
Feb 23 07:34:56 np0005626463.localdomain sudo[25815]: pam_unix(sudo:session): session closed for user root
Feb 23 07:34:57 np0005626463.localdomain sudo[25908]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-maingxnaegizryedajlvstqviwsffvqc ; /usr/bin/python3
Feb 23 07:34:57 np0005626463.localdomain sudo[25908]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 23 07:34:57 np0005626463.localdomain python3[25910]: ansible-ansible.legacy.command Invoked with _raw_params=chronyc makestep _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 07:34:57 np0005626463.localdomain chronyd[25695]: System clock was stepped by 0.000000 seconds
Feb 23 07:34:57 np0005626463.localdomain sudo[25908]: pam_unix(sudo:session): session closed for user root
Feb 23 07:34:57 np0005626463.localdomain sudo[25925]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-smblqwacudreujpuzhwyrrtxvpndvjhn ; /usr/bin/python3
Feb 23 07:34:57 np0005626463.localdomain sudo[25925]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 23 07:34:57 np0005626463.localdomain python3[25927]: ansible-ansible.legacy.command Invoked with _raw_params=chronyc waitsync 30 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 07:34:57 np0005626463.localdomain chronyd[25695]: Selected source 167.160.187.179 (pool.ntp.org)
Feb 23 07:34:57 np0005626463.localdomain sudo[25925]: pam_unix(sudo:session): session closed for user root
Feb 23 07:35:07 np0005626463.localdomain systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Feb 23 07:35:07 np0005626463.localdomain sudo[25945]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-stpkupjczjdeirlnnrdgwjweyollhunt ; /usr/bin/python3
Feb 23 07:35:07 np0005626463.localdomain sudo[25945]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 23 07:35:08 np0005626463.localdomain python3[25947]: ansible-timezone Invoked with name=UTC hwclock=None
Feb 23 07:35:08 np0005626463.localdomain systemd[1]: Starting Time & Date Service...
Feb 23 07:35:08 np0005626463.localdomain systemd[1]: Started Time & Date Service.
Feb 23 07:35:08 np0005626463.localdomain sudo[25945]: pam_unix(sudo:session): session closed for user root
Feb 23 07:35:08 np0005626463.localdomain sudo[25965]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-beawhpqwneoghzsfxvjmcdancwcnxveo ; /usr/bin/python3
Feb 23 07:35:08 np0005626463.localdomain sudo[25965]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 23 07:35:09 np0005626463.localdomain python3[25967]: ansible-ansible.legacy.systemd Invoked with name=chronyd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 23 07:35:09 np0005626463.localdomain chronyd[25695]: chronyd exiting
Feb 23 07:35:09 np0005626463.localdomain systemd[1]: Stopping NTP client/server...
Feb 23 07:35:09 np0005626463.localdomain systemd[1]: chronyd.service: Deactivated successfully.
Feb 23 07:35:09 np0005626463.localdomain systemd[1]: Stopped NTP client/server.
Feb 23 07:35:09 np0005626463.localdomain systemd[1]: Starting NTP client/server...
Feb 23 07:35:09 np0005626463.localdomain chronyd[25974]: chronyd version 4.3 starting (+CMDMON +NTP +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +ASYNCDNS +NTS +SECHASH +IPV6 +DEBUG)
Feb 23 07:35:09 np0005626463.localdomain chronyd[25974]: Frequency -30.767 +/- 0.170 ppm read from /var/lib/chrony/drift
Feb 23 07:35:09 np0005626463.localdomain chronyd[25974]: Loaded seccomp filter (level 2)
Feb 23 07:35:09 np0005626463.localdomain systemd[1]: Started NTP client/server.
Feb 23 07:35:09 np0005626463.localdomain sudo[25965]: pam_unix(sudo:session): session closed for user root
Feb 23 07:35:13 np0005626463.localdomain chronyd[25974]: Selected source 167.160.187.179 (pool.ntp.org)
Feb 23 07:35:24 np0005626463.localdomain sudo[25989]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yubwbuwivwuwhvfotniuxzxnrzgbdmxq ; /usr/bin/python3
Feb 23 07:35:24 np0005626463.localdomain sudo[25989]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 23 07:35:24 np0005626463.localdomain useradd[25993]: new group: name=ceph-admin, GID=1002
Feb 23 07:35:24 np0005626463.localdomain useradd[25993]: new user: name=ceph-admin, UID=1002, GID=1002, home=/home/ceph-admin, shell=/bin/bash, from=none
Feb 23 07:35:24 np0005626463.localdomain sudo[25989]: pam_unix(sudo:session): session closed for user root
Feb 23 07:35:25 np0005626463.localdomain sudo[26045]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ulwkhcurvinzpuohdvfbqehgjsmpwuet ; /usr/bin/python3
Feb 23 07:35:25 np0005626463.localdomain sudo[26045]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 23 07:35:25 np0005626463.localdomain sudo[26045]: pam_unix(sudo:session): session closed for user root
Feb 23 07:35:25 np0005626463.localdomain sudo[26088]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jeguludnnhntfrzxkxddwengkkynvoer ; /usr/bin/python3
Feb 23 07:35:25 np0005626463.localdomain sudo[26088]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 23 07:35:25 np0005626463.localdomain sudo[26088]: pam_unix(sudo:session): session closed for user root
Feb 23 07:35:26 np0005626463.localdomain sudo[26118]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hanydelgnlklmtqbtrcdbanthydceeap ; /usr/bin/python3
Feb 23 07:35:26 np0005626463.localdomain sudo[26118]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 23 07:35:26 np0005626463.localdomain sudo[26118]: pam_unix(sudo:session): session closed for user root
Feb 23 07:35:26 np0005626463.localdomain sudo[26134]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zpaemvzqayeagqbofdwleanlpptkhhqv ; /usr/bin/python3
Feb 23 07:35:26 np0005626463.localdomain sudo[26134]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 23 07:35:26 np0005626463.localdomain sudo[26134]: pam_unix(sudo:session): session closed for user root
Feb 23 07:35:26 np0005626463.localdomain sudo[26150]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rxpqhslegyykaxukzvddahsbetweguzo ; /usr/bin/python3
Feb 23 07:35:26 np0005626463.localdomain sudo[26150]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 23 07:35:26 np0005626463.localdomain sudo[26150]: pam_unix(sudo:session): session closed for user root
Feb 23 07:35:27 np0005626463.localdomain sudo[26166]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-assbkelzxscusmeagyflszsqtvyytkdp ; /usr/bin/python3
Feb 23 07:35:27 np0005626463.localdomain sudo[26166]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 23 07:35:27 np0005626463.localdomain sudo[26166]: pam_unix(sudo:session): session closed for user root
Feb 23 07:35:28 np0005626463.localdomain sshd[26169]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 07:35:29 np0005626463.localdomain sshd[26169]: error: kex_exchange_identification: Connection closed by remote host
Feb 23 07:35:29 np0005626463.localdomain sshd[26169]: Connection closed by 209.38.85.143 port 47486
Feb 23 07:35:38 np0005626463.localdomain systemd[1]: systemd-timedated.service: Deactivated successfully.
Feb 23 07:36:51 np0005626463.localdomain sshd[26173]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 07:36:51 np0005626463.localdomain sshd[26173]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 07:37:03 np0005626463.localdomain sshd[26175]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 07:37:03 np0005626463.localdomain sshd[26175]: Accepted publickey for ceph-admin from 192.168.122.103 port 37576 ssh2: RSA SHA256:Xa/VMkXtB77nHz5d33Gpc1SPjvrShbbTtqHwAtI7vJo
Feb 23 07:37:03 np0005626463.localdomain systemd-logind[759]: New session 14 of user ceph-admin.
Feb 23 07:37:03 np0005626463.localdomain systemd[1]: Created slice User Slice of UID 1002.
Feb 23 07:37:03 np0005626463.localdomain systemd[1]: Starting User Runtime Directory /run/user/1002...
Feb 23 07:37:03 np0005626463.localdomain systemd[1]: Finished User Runtime Directory /run/user/1002.
Feb 23 07:37:03 np0005626463.localdomain systemd[1]: Starting User Manager for UID 1002...
Feb 23 07:37:03 np0005626463.localdomain systemd[26179]: pam_unix(systemd-user:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Feb 23 07:37:03 np0005626463.localdomain sshd[26192]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 07:37:03 np0005626463.localdomain systemd[26179]: Queued start job for default target Main User Target.
Feb 23 07:37:03 np0005626463.localdomain systemd[26179]: Created slice User Application Slice.
Feb 23 07:37:03 np0005626463.localdomain systemd[26179]: Started Mark boot as successful after the user session has run 2 minutes.
Feb 23 07:37:03 np0005626463.localdomain systemd[26179]: Started Daily Cleanup of User's Temporary Directories.
Feb 23 07:37:03 np0005626463.localdomain systemd[26179]: Reached target Paths.
Feb 23 07:37:03 np0005626463.localdomain systemd[26179]: Reached target Timers.
Feb 23 07:37:03 np0005626463.localdomain systemd[26179]: Starting D-Bus User Message Bus Socket...
Feb 23 07:37:03 np0005626463.localdomain systemd[26179]: Starting Create User's Volatile Files and Directories...
Feb 23 07:37:03 np0005626463.localdomain systemd[26179]: Listening on D-Bus User Message Bus Socket.
Feb 23 07:37:03 np0005626463.localdomain systemd[26179]: Reached target Sockets.
Feb 23 07:37:03 np0005626463.localdomain systemd[26179]: Finished Create User's Volatile Files and Directories.
Feb 23 07:37:03 np0005626463.localdomain systemd[26179]: Reached target Basic System.
Feb 23 07:37:03 np0005626463.localdomain systemd[26179]: Reached target Main User Target.
Feb 23 07:37:03 np0005626463.localdomain systemd[26179]: Startup finished in 117ms.
Feb 23 07:37:03 np0005626463.localdomain systemd[1]: Started User Manager for UID 1002.
Feb 23 07:37:03 np0005626463.localdomain systemd[1]: Started Session 14 of User ceph-admin.
Feb 23 07:37:03 np0005626463.localdomain sshd[26175]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Feb 23 07:37:03 np0005626463.localdomain sshd[26192]: Accepted publickey for ceph-admin from 192.168.122.103 port 37592 ssh2: RSA SHA256:Xa/VMkXtB77nHz5d33Gpc1SPjvrShbbTtqHwAtI7vJo
Feb 23 07:37:03 np0005626463.localdomain systemd-logind[759]: New session 16 of user ceph-admin.
Feb 23 07:37:03 np0005626463.localdomain systemd[1]: Started Session 16 of User ceph-admin.
Feb 23 07:37:03 np0005626463.localdomain sshd[26192]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Feb 23 07:37:04 np0005626463.localdomain sudo[26199]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 07:37:04 np0005626463.localdomain sudo[26199]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 07:37:04 np0005626463.localdomain sudo[26199]: pam_unix(sudo:session): session closed for user root
Feb 23 07:37:04 np0005626463.localdomain sshd[26214]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 07:37:04 np0005626463.localdomain sshd[26214]: Accepted publickey for ceph-admin from 192.168.122.103 port 37602 ssh2: RSA SHA256:Xa/VMkXtB77nHz5d33Gpc1SPjvrShbbTtqHwAtI7vJo
Feb 23 07:37:04 np0005626463.localdomain systemd-logind[759]: New session 17 of user ceph-admin.
Feb 23 07:37:04 np0005626463.localdomain systemd[1]: Started Session 17 of User ceph-admin.
Feb 23 07:37:04 np0005626463.localdomain sshd[26214]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Feb 23 07:37:04 np0005626463.localdomain sudo[26218]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 check-host --expect-hostname np0005626463.localdomain
Feb 23 07:37:04 np0005626463.localdomain sudo[26218]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 07:37:04 np0005626463.localdomain sudo[26218]: pam_unix(sudo:session): session closed for user root
Feb 23 07:37:04 np0005626463.localdomain sshd[26233]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 07:37:04 np0005626463.localdomain sshd[26233]: Accepted publickey for ceph-admin from 192.168.122.103 port 37610 ssh2: RSA SHA256:Xa/VMkXtB77nHz5d33Gpc1SPjvrShbbTtqHwAtI7vJo
Feb 23 07:37:04 np0005626463.localdomain systemd-logind[759]: New session 18 of user ceph-admin.
Feb 23 07:37:04 np0005626463.localdomain systemd[1]: Started Session 18 of User ceph-admin.
Feb 23 07:37:04 np0005626463.localdomain sshd[26233]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Feb 23 07:37:04 np0005626463.localdomain sudo[26237]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f
Feb 23 07:37:04 np0005626463.localdomain sudo[26237]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 07:37:04 np0005626463.localdomain sudo[26237]: pam_unix(sudo:session): session closed for user root
Feb 23 07:37:05 np0005626463.localdomain sshd[26252]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 07:37:05 np0005626463.localdomain sshd[26252]: Accepted publickey for ceph-admin from 192.168.122.103 port 37616 ssh2: RSA SHA256:Xa/VMkXtB77nHz5d33Gpc1SPjvrShbbTtqHwAtI7vJo
Feb 23 07:37:05 np0005626463.localdomain systemd-logind[759]: New session 19 of user ceph-admin.
Feb 23 07:37:05 np0005626463.localdomain systemd[1]: Started Session 19 of User ceph-admin.
Feb 23 07:37:05 np0005626463.localdomain sshd[26252]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Feb 23 07:37:05 np0005626463.localdomain sudo[26256]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46
Feb 23 07:37:05 np0005626463.localdomain sudo[26256]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 07:37:05 np0005626463.localdomain sudo[26256]: pam_unix(sudo:session): session closed for user root
Feb 23 07:37:05 np0005626463.localdomain sshd[26271]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 07:37:05 np0005626463.localdomain sshd[26271]: Accepted publickey for ceph-admin from 192.168.122.103 port 37622 ssh2: RSA SHA256:Xa/VMkXtB77nHz5d33Gpc1SPjvrShbbTtqHwAtI7vJo
Feb 23 07:37:05 np0005626463.localdomain systemd-logind[759]: New session 20 of user ceph-admin.
Feb 23 07:37:05 np0005626463.localdomain systemd[1]: Started Session 20 of User ceph-admin.
Feb 23 07:37:05 np0005626463.localdomain sshd[26271]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Feb 23 07:37:05 np0005626463.localdomain sudo[26275]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46
Feb 23 07:37:05 np0005626463.localdomain sudo[26275]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 07:37:05 np0005626463.localdomain sudo[26275]: pam_unix(sudo:session): session closed for user root
Feb 23 07:37:05 np0005626463.localdomain sshd[26290]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 07:37:05 np0005626463.localdomain sshd[26290]: Accepted publickey for ceph-admin from 192.168.122.103 port 37638 ssh2: RSA SHA256:Xa/VMkXtB77nHz5d33Gpc1SPjvrShbbTtqHwAtI7vJo
Feb 23 07:37:05 np0005626463.localdomain systemd-logind[759]: New session 21 of user ceph-admin.
Feb 23 07:37:05 np0005626463.localdomain systemd[1]: Started Session 21 of User ceph-admin.
Feb 23 07:37:05 np0005626463.localdomain sshd[26290]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Feb 23 07:37:06 np0005626463.localdomain sudo[26294]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f.new
Feb 23 07:37:06 np0005626463.localdomain sudo[26294]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 07:37:06 np0005626463.localdomain sudo[26294]: pam_unix(sudo:session): session closed for user root
Feb 23 07:37:06 np0005626463.localdomain sshd[26309]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 07:37:06 np0005626463.localdomain sshd[26309]: Accepted publickey for ceph-admin from 192.168.122.103 port 37646 ssh2: RSA SHA256:Xa/VMkXtB77nHz5d33Gpc1SPjvrShbbTtqHwAtI7vJo
Feb 23 07:37:06 np0005626463.localdomain systemd-logind[759]: New session 22 of user ceph-admin.
Feb 23 07:37:06 np0005626463.localdomain systemd[1]: Started Session 22 of User ceph-admin.
Feb 23 07:37:06 np0005626463.localdomain sshd[26309]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Feb 23 07:37:06 np0005626463.localdomain sudo[26313]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46
Feb 23 07:37:06 np0005626463.localdomain sudo[26313]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 07:37:06 np0005626463.localdomain sudo[26313]: pam_unix(sudo:session): session closed for user root
Feb 23 07:37:06 np0005626463.localdomain sshd[26328]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 07:37:06 np0005626463.localdomain sshd[26328]: Accepted publickey for ceph-admin from 192.168.122.103 port 37656 ssh2: RSA SHA256:Xa/VMkXtB77nHz5d33Gpc1SPjvrShbbTtqHwAtI7vJo
Feb 23 07:37:06 np0005626463.localdomain systemd-logind[759]: New session 23 of user ceph-admin.
Feb 23 07:37:06 np0005626463.localdomain systemd[1]: Started Session 23 of User ceph-admin.
Feb 23 07:37:06 np0005626463.localdomain sshd[26328]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Feb 23 07:37:06 np0005626463.localdomain sudo[26332]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f.new
Feb 23 07:37:06 np0005626463.localdomain sudo[26332]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 07:37:06 np0005626463.localdomain sudo[26332]: pam_unix(sudo:session): session closed for user root
Feb 23 07:37:06 np0005626463.localdomain sshd[26347]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 07:37:07 np0005626463.localdomain sshd[26347]: Accepted publickey for ceph-admin from 192.168.122.103 port 37660 ssh2: RSA SHA256:Xa/VMkXtB77nHz5d33Gpc1SPjvrShbbTtqHwAtI7vJo
Feb 23 07:37:07 np0005626463.localdomain systemd-logind[759]: New session 24 of user ceph-admin.
Feb 23 07:37:07 np0005626463.localdomain systemd[1]: Started Session 24 of User ceph-admin.
Feb 23 07:37:07 np0005626463.localdomain sshd[26347]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Feb 23 07:37:07 np0005626463.localdomain sshd[26364]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 07:37:07 np0005626463.localdomain sshd[26364]: Accepted publickey for ceph-admin from 192.168.122.103 port 37674 ssh2: RSA SHA256:Xa/VMkXtB77nHz5d33Gpc1SPjvrShbbTtqHwAtI7vJo
Feb 23 07:37:07 np0005626463.localdomain systemd-logind[759]: New session 25 of user ceph-admin.
Feb 23 07:37:07 np0005626463.localdomain systemd[1]: Started Session 25 of User ceph-admin.
Feb 23 07:37:07 np0005626463.localdomain sshd[26364]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Feb 23 07:37:07 np0005626463.localdomain sudo[26368]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f.new /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f
Feb 23 07:37:07 np0005626463.localdomain sudo[26368]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 07:37:07 np0005626463.localdomain sudo[26368]: pam_unix(sudo:session): session closed for user root
Feb 23 07:37:07 np0005626463.localdomain sshd[26383]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 07:37:08 np0005626463.localdomain sshd[26383]: Accepted publickey for ceph-admin from 192.168.122.103 port 37682 ssh2: RSA SHA256:Xa/VMkXtB77nHz5d33Gpc1SPjvrShbbTtqHwAtI7vJo
Feb 23 07:37:08 np0005626463.localdomain systemd-logind[759]: New session 26 of user ceph-admin.
Feb 23 07:37:08 np0005626463.localdomain systemd[1]: Started Session 26 of User ceph-admin.
Feb 23 07:37:08 np0005626463.localdomain sshd[26383]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Feb 23 07:37:08 np0005626463.localdomain sudo[26387]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 check-host --expect-hostname np0005626463.localdomain
Feb 23 07:37:08 np0005626463.localdomain sudo[26387]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 07:37:08 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 23 07:37:08 np0005626463.localdomain sudo[26387]: pam_unix(sudo:session): session closed for user root
Feb 23 07:37:38 np0005626463.localdomain sudo[26422]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 07:37:38 np0005626463.localdomain sudo[26422]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 07:37:38 np0005626463.localdomain sudo[26422]: pam_unix(sudo:session): session closed for user root
Feb 23 07:37:39 np0005626463.localdomain sudo[26437]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 07:37:39 np0005626463.localdomain sudo[26437]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 07:37:39 np0005626463.localdomain sudo[26437]: pam_unix(sudo:session): session closed for user root
Feb 23 07:37:39 np0005626463.localdomain sudo[26452]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 check-host
Feb 23 07:37:39 np0005626463.localdomain sudo[26452]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 07:37:39 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 23 07:37:39 np0005626463.localdomain sudo[26452]: pam_unix(sudo:session): session closed for user root
Feb 23 07:37:39 np0005626463.localdomain sudo[26487]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 07:37:39 np0005626463.localdomain sudo[26487]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 07:37:39 np0005626463.localdomain sudo[26487]: pam_unix(sudo:session): session closed for user root
Feb 23 07:37:39 np0005626463.localdomain sudo[26502]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Feb 23 07:37:39 np0005626463.localdomain sudo[26502]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 07:37:40 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 23 07:37:40 np0005626463.localdomain sudo[26502]: pam_unix(sudo:session): session closed for user root
Feb 23 07:37:40 np0005626463.localdomain sudo[26557]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 07:37:40 np0005626463.localdomain sudo[26557]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 07:37:40 np0005626463.localdomain sudo[26557]: pam_unix(sudo:session): session closed for user root
Feb 23 07:37:40 np0005626463.localdomain sudo[26572]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 23 07:37:40 np0005626463.localdomain sudo[26572]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 07:37:40 np0005626463.localdomain sshd[26587]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 07:37:40 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 23 07:37:40 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 23 07:37:40 np0005626463.localdomain systemd[1]: proc-sys-fs-binfmt_misc.automount: Got automount request for /proc/sys/fs/binfmt_misc, triggered by 26601 (sysctl)
Feb 23 07:37:40 np0005626463.localdomain systemd[1]: Mounting Arbitrary Executable File Formats File System...
Feb 23 07:37:40 np0005626463.localdomain systemd[1]: Mounted Arbitrary Executable File Formats File System.
Feb 23 07:37:40 np0005626463.localdomain sshd[26587]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 07:37:40 np0005626463.localdomain sudo[26572]: pam_unix(sudo:session): session closed for user root
Feb 23 07:37:41 np0005626463.localdomain sudo[26623]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 07:37:41 np0005626463.localdomain sudo[26623]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 07:37:41 np0005626463.localdomain sudo[26623]: pam_unix(sudo:session): session closed for user root
Feb 23 07:37:41 np0005626463.localdomain sudo[26638]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 list-networks
Feb 23 07:37:41 np0005626463.localdomain sudo[26638]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 07:37:41 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 23 07:37:41 np0005626463.localdomain sudo[26638]: pam_unix(sudo:session): session closed for user root
Feb 23 07:37:41 np0005626463.localdomain sudo[26671]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 07:37:41 np0005626463.localdomain sudo[26671]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 07:37:41 np0005626463.localdomain sudo[26671]: pam_unix(sudo:session): session closed for user root
Feb 23 07:37:41 np0005626463.localdomain sudo[26686]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ceph-volume --fsid f1fea371-cb69-578d-a3d0-b5c472a84b46 -- inventory --format=json-pretty --filter-for-batch
Feb 23 07:37:41 np0005626463.localdomain sudo[26686]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 07:37:42 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 23 07:37:44 np0005626463.localdomain sshd[26754]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 07:37:46 np0005626463.localdomain sshd[26754]: Invalid user user from 92.118.39.72 port 45784
Feb 23 07:37:46 np0005626463.localdomain sshd[26754]: Connection closed by invalid user user 92.118.39.72 port 45784 [preauth]
Feb 23 07:37:47 np0005626463.localdomain kernel: VFS: idmapped mount is not enabled.
Feb 23 07:38:11 np0005626463.localdomain podman[26739]: 
Feb 23 07:38:11 np0005626463.localdomain podman[26739]: 2026-02-23 07:38:11.901544641 +0000 UTC m=+29.685957491 container create 8f998872d27d03833ca99a5745743a8bc54420c1993e9de2a808878626c4e388 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nifty_grothendieck, GIT_BRANCH=main, name=rhceph, release=1770267347, build-date=2026-02-09T10:25:24Z, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, architecture=x86_64, com.redhat.component=rhceph-container, io.openshift.expose-services=, ceph=True, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, version=7, org.opencontainers.image.created=2026-02-09T10:25:24Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, CEPH_POINT_RELEASE=, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, distribution-scope=public, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.42.2, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc.)
Feb 23 07:38:11 np0005626463.localdomain podman[26739]: 2026-02-23 07:37:42.259044241 +0000 UTC m=+0.043457091 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 23 07:38:11 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-volatile\x2dcheck2997277124-merged.mount: Deactivated successfully.
Feb 23 07:38:11 np0005626463.localdomain systemd[1]: Created slice Slice /machine.
Feb 23 07:38:11 np0005626463.localdomain systemd[1]: Started libpod-conmon-8f998872d27d03833ca99a5745743a8bc54420c1993e9de2a808878626c4e388.scope.
Feb 23 07:38:11 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 07:38:12 np0005626463.localdomain podman[26739]: 2026-02-23 07:38:12.015487488 +0000 UTC m=+29.799900328 container init 8f998872d27d03833ca99a5745743a8bc54420c1993e9de2a808878626c4e388 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nifty_grothendieck, io.buildah.version=1.42.2, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.tags=rhceph ceph, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, CEPH_POINT_RELEASE=, GIT_BRANCH=main, ceph=True, release=1770267347, build-date=2026-02-09T10:25:24Z, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, com.redhat.component=rhceph-container, org.opencontainers.image.created=2026-02-09T10:25:24Z, vendor=Red Hat, Inc., org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Feb 23 07:38:12 np0005626463.localdomain podman[26739]: 2026-02-23 07:38:12.025035648 +0000 UTC m=+29.809448498 container start 8f998872d27d03833ca99a5745743a8bc54420c1993e9de2a808878626c4e388 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nifty_grothendieck, distribution-scope=public, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2026-02-09T10:25:24Z, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.42.2, architecture=x86_64, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, io.openshift.expose-services=, version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, name=rhceph, ceph=True, release=1770267347, RELEASE=main)
Feb 23 07:38:12 np0005626463.localdomain podman[26739]: 2026-02-23 07:38:12.025303777 +0000 UTC m=+29.809716617 container attach 8f998872d27d03833ca99a5745743a8bc54420c1993e9de2a808878626c4e388 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nifty_grothendieck, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, description=Red Hat Ceph Storage 7, distribution-scope=public, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., name=rhceph, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_CLEAN=True, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2026-02-09T10:25:24Z, io.openshift.expose-services=, ceph=True, GIT_BRANCH=main, io.buildah.version=1.42.2, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1770267347, org.opencontainers.image.created=2026-02-09T10:25:24Z, url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Feb 23 07:38:12 np0005626463.localdomain nifty_grothendieck[27003]: 167 167
Feb 23 07:38:12 np0005626463.localdomain systemd[1]: libpod-8f998872d27d03833ca99a5745743a8bc54420c1993e9de2a808878626c4e388.scope: Deactivated successfully.
Feb 23 07:38:12 np0005626463.localdomain podman[26739]: 2026-02-23 07:38:12.029406275 +0000 UTC m=+29.813819145 container died 8f998872d27d03833ca99a5745743a8bc54420c1993e9de2a808878626c4e388 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nifty_grothendieck, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, ceph=True, build-date=2026-02-09T10:25:24Z, vendor=Red Hat, Inc., vcs-type=git, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, org.opencontainers.image.created=2026-02-09T10:25:24Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, CEPH_POINT_RELEASE=, release=1770267347, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, description=Red Hat Ceph Storage 7, io.buildah.version=1.42.2, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, GIT_BRANCH=main, version=7)
Feb 23 07:38:12 np0005626463.localdomain podman[27008]: 2026-02-23 07:38:12.105587624 +0000 UTC m=+0.067433226 container remove 8f998872d27d03833ca99a5745743a8bc54420c1993e9de2a808878626c4e388 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nifty_grothendieck, name=rhceph, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, RELEASE=main, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, io.buildah.version=1.42.2, org.opencontainers.image.created=2026-02-09T10:25:24Z, build-date=2026-02-09T10:25:24Z, ceph=True, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, release=1770267347, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Feb 23 07:38:12 np0005626463.localdomain systemd[1]: libpod-conmon-8f998872d27d03833ca99a5745743a8bc54420c1993e9de2a808878626c4e388.scope: Deactivated successfully.
Feb 23 07:38:12 np0005626463.localdomain podman[27028]: 
Feb 23 07:38:12 np0005626463.localdomain podman[27028]: 2026-02-23 07:38:12.339531111 +0000 UTC m=+0.080969601 container create 1af4f24cd1021bf6174e7978dbe57e2c0a33842ea874cd3715ffb89dfa4d4dde (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=angry_dirac, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.created=2026-02-09T10:25:24Z, name=rhceph, build-date=2026-02-09T10:25:24Z, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=7, distribution-scope=public, vcs-type=git, architecture=x86_64, com.redhat.component=rhceph-container, io.openshift.expose-services=, RELEASE=main, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, url=https://catalog.redhat.com/en/search?searchType=containers, release=1770267347, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.42.2, GIT_BRANCH=main)
Feb 23 07:38:12 np0005626463.localdomain systemd[1]: Started libpod-conmon-1af4f24cd1021bf6174e7978dbe57e2c0a33842ea874cd3715ffb89dfa4d4dde.scope.
Feb 23 07:38:12 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 07:38:12 np0005626463.localdomain podman[27028]: 2026-02-23 07:38:12.300024774 +0000 UTC m=+0.041463274 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 23 07:38:12 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5a33ff749735729ecbdef763b7fb50eadabab15c6924fd681e80a2b1c973bd42/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Feb 23 07:38:12 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5a33ff749735729ecbdef763b7fb50eadabab15c6924fd681e80a2b1c973bd42/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Feb 23 07:38:12 np0005626463.localdomain podman[27028]: 2026-02-23 07:38:12.423692597 +0000 UTC m=+0.165131097 container init 1af4f24cd1021bf6174e7978dbe57e2c0a33842ea874cd3715ffb89dfa4d4dde (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=angry_dirac, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, io.buildah.version=1.42.2, GIT_BRANCH=main, org.opencontainers.image.created=2026-02-09T10:25:24Z, distribution-scope=public, RELEASE=main, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, build-date=2026-02-09T10:25:24Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, architecture=x86_64, version=7, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, release=1770267347, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True)
Feb 23 07:38:12 np0005626463.localdomain podman[27028]: 2026-02-23 07:38:12.432387919 +0000 UTC m=+0.173826419 container start 1af4f24cd1021bf6174e7978dbe57e2c0a33842ea874cd3715ffb89dfa4d4dde (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=angry_dirac, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.42.2, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, io.openshift.expose-services=, architecture=x86_64, build-date=2026-02-09T10:25:24Z, version=7, vendor=Red Hat, Inc., org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.component=rhceph-container, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-02-09T10:25:24Z, RELEASE=main, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, release=1770267347, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Feb 23 07:38:12 np0005626463.localdomain podman[27028]: 2026-02-23 07:38:12.432649008 +0000 UTC m=+0.174087508 container attach 1af4f24cd1021bf6174e7978dbe57e2c0a33842ea874cd3715ffb89dfa4d4dde (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=angry_dirac, io.openshift.expose-services=, distribution-scope=public, io.buildah.version=1.42.2, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.created=2026-02-09T10:25:24Z, RELEASE=main, com.redhat.component=rhceph-container, GIT_CLEAN=True, version=7, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, release=1770267347, CEPH_POINT_RELEASE=, architecture=x86_64, build-date=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vendor=Red Hat, Inc.)
Feb 23 07:38:12 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-fb7a309750f49f2c9aff72314ba3c61d17c899c3c3b2d9a9fecc7124a146141a-merged.mount: Deactivated successfully.
Feb 23 07:38:13 np0005626463.localdomain angry_dirac[27043]: [
Feb 23 07:38:13 np0005626463.localdomain angry_dirac[27043]:     {
Feb 23 07:38:13 np0005626463.localdomain angry_dirac[27043]:         "available": false,
Feb 23 07:38:13 np0005626463.localdomain angry_dirac[27043]:         "ceph_device": false,
Feb 23 07:38:13 np0005626463.localdomain angry_dirac[27043]:         "device_id": "QEMU_DVD-ROM_QM00001",
Feb 23 07:38:13 np0005626463.localdomain angry_dirac[27043]:         "lsm_data": {},
Feb 23 07:38:13 np0005626463.localdomain angry_dirac[27043]:         "lvs": [],
Feb 23 07:38:13 np0005626463.localdomain angry_dirac[27043]:         "path": "/dev/sr0",
Feb 23 07:38:13 np0005626463.localdomain angry_dirac[27043]:         "rejected_reasons": [
Feb 23 07:38:13 np0005626463.localdomain angry_dirac[27043]:             "Insufficient space (<5GB)",
Feb 23 07:38:13 np0005626463.localdomain angry_dirac[27043]:             "Has a FileSystem"
Feb 23 07:38:13 np0005626463.localdomain angry_dirac[27043]:         ],
Feb 23 07:38:13 np0005626463.localdomain angry_dirac[27043]:         "sys_api": {
Feb 23 07:38:13 np0005626463.localdomain angry_dirac[27043]:             "actuators": null,
Feb 23 07:38:13 np0005626463.localdomain angry_dirac[27043]:             "device_nodes": "sr0",
Feb 23 07:38:13 np0005626463.localdomain angry_dirac[27043]:             "human_readable_size": "482.00 KB",
Feb 23 07:38:13 np0005626463.localdomain angry_dirac[27043]:             "id_bus": "ata",
Feb 23 07:38:13 np0005626463.localdomain angry_dirac[27043]:             "model": "QEMU DVD-ROM",
Feb 23 07:38:13 np0005626463.localdomain angry_dirac[27043]:             "nr_requests": "2",
Feb 23 07:38:13 np0005626463.localdomain angry_dirac[27043]:             "partitions": {},
Feb 23 07:38:13 np0005626463.localdomain angry_dirac[27043]:             "path": "/dev/sr0",
Feb 23 07:38:13 np0005626463.localdomain angry_dirac[27043]:             "removable": "1",
Feb 23 07:38:13 np0005626463.localdomain angry_dirac[27043]:             "rev": "2.5+",
Feb 23 07:38:13 np0005626463.localdomain angry_dirac[27043]:             "ro": "0",
Feb 23 07:38:13 np0005626463.localdomain angry_dirac[27043]:             "rotational": "1",
Feb 23 07:38:13 np0005626463.localdomain angry_dirac[27043]:             "sas_address": "",
Feb 23 07:38:13 np0005626463.localdomain angry_dirac[27043]:             "sas_device_handle": "",
Feb 23 07:38:13 np0005626463.localdomain angry_dirac[27043]:             "scheduler_mode": "mq-deadline",
Feb 23 07:38:13 np0005626463.localdomain angry_dirac[27043]:             "sectors": 0,
Feb 23 07:38:13 np0005626463.localdomain angry_dirac[27043]:             "sectorsize": "2048",
Feb 23 07:38:13 np0005626463.localdomain angry_dirac[27043]:             "size": 493568.0,
Feb 23 07:38:13 np0005626463.localdomain angry_dirac[27043]:             "support_discard": "0",
Feb 23 07:38:13 np0005626463.localdomain angry_dirac[27043]:             "type": "disk",
Feb 23 07:38:13 np0005626463.localdomain angry_dirac[27043]:             "vendor": "QEMU"
Feb 23 07:38:13 np0005626463.localdomain angry_dirac[27043]:         }
Feb 23 07:38:13 np0005626463.localdomain angry_dirac[27043]:     }
Feb 23 07:38:13 np0005626463.localdomain angry_dirac[27043]: ]
Feb 23 07:38:13 np0005626463.localdomain systemd[1]: libpod-1af4f24cd1021bf6174e7978dbe57e2c0a33842ea874cd3715ffb89dfa4d4dde.scope: Deactivated successfully.
Feb 23 07:38:13 np0005626463.localdomain podman[27028]: 2026-02-23 07:38:13.197656689 +0000 UTC m=+0.939095239 container died 1af4f24cd1021bf6174e7978dbe57e2c0a33842ea874cd3715ffb89dfa4d4dde (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=angry_dirac, org.opencontainers.image.created=2026-02-09T10:25:24Z, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, release=1770267347, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2026-02-09T10:25:24Z, url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, GIT_CLEAN=True, ceph=True, version=7, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.buildah.version=1.42.2, vcs-type=git, io.openshift.tags=rhceph ceph)
Feb 23 07:38:13 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-5a33ff749735729ecbdef763b7fb50eadabab15c6924fd681e80a2b1c973bd42-merged.mount: Deactivated successfully.
Feb 23 07:38:13 np0005626463.localdomain podman[28412]: 2026-02-23 07:38:13.287119474 +0000 UTC m=+0.075235118 container remove 1af4f24cd1021bf6174e7978dbe57e2c0a33842ea874cd3715ffb89dfa4d4dde (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=angry_dirac, vendor=Red Hat, Inc., RELEASE=main, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.buildah.version=1.42.2, version=7, distribution-scope=public, release=1770267347, url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, build-date=2026-02-09T10:25:24Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2026-02-09T10:25:24Z, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-type=git)
Feb 23 07:38:13 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 23 07:38:13 np0005626463.localdomain systemd[1]: libpod-conmon-1af4f24cd1021bf6174e7978dbe57e2c0a33842ea874cd3715ffb89dfa4d4dde.scope: Deactivated successfully.
Feb 23 07:38:13 np0005626463.localdomain sudo[26686]: pam_unix(sudo:session): session closed for user root
Feb 23 07:38:13 np0005626463.localdomain sudo[28426]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 07:38:13 np0005626463.localdomain sudo[28426]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 07:38:13 np0005626463.localdomain sudo[28426]: pam_unix(sudo:session): session closed for user root
Feb 23 07:38:13 np0005626463.localdomain sudo[28441]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 _orch set-coredump-overrides --fsid f1fea371-cb69-578d-a3d0-b5c472a84b46 --coredump-max-size=32G
Feb 23 07:38:13 np0005626463.localdomain sudo[28441]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 07:38:13 np0005626463.localdomain systemd[1]: systemd-coredump.socket: Deactivated successfully.
Feb 23 07:38:13 np0005626463.localdomain systemd[1]: Closed Process Core Dump Socket.
Feb 23 07:38:13 np0005626463.localdomain systemd[1]: Stopping Process Core Dump Socket...
Feb 23 07:38:13 np0005626463.localdomain systemd[1]: Listening on Process Core Dump Socket.
Feb 23 07:38:13 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 07:38:13 np0005626463.localdomain systemd-sysv-generator[28498]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 07:38:13 np0005626463.localdomain systemd-rc-local-generator[28492]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 07:38:14 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 07:38:14 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 07:38:14 np0005626463.localdomain systemd-rc-local-generator[28530]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 07:38:14 np0005626463.localdomain systemd-sysv-generator[28534]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 07:38:14 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 07:38:14 np0005626463.localdomain sudo[28441]: pam_unix(sudo:session): session closed for user root
Feb 23 07:38:31 np0005626463.localdomain sshd[28542]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 07:38:32 np0005626463.localdomain sshd[28542]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 07:38:33 np0005626463.localdomain sudo[28544]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 07:38:33 np0005626463.localdomain sudo[28544]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 07:38:33 np0005626463.localdomain sudo[28544]: pam_unix(sudo:session): session closed for user root
Feb 23 07:38:33 np0005626463.localdomain sudo[28559]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid f1fea371-cb69-578d-a3d0-b5c472a84b46
Feb 23 07:38:33 np0005626463.localdomain sudo[28559]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 07:38:34 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 23 07:38:34 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 23 07:38:34 np0005626463.localdomain podman[28616]: 
Feb 23 07:38:34 np0005626463.localdomain podman[28616]: 2026-02-23 07:38:34.517499148 +0000 UTC m=+0.067506972 container create 98d363638c91aa57109e86c0b3f714e363f8ab8eb9a3414d970068a9700e18e0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=tender_ardinghelli, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1770267347, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, version=7, build-date=2026-02-09T10:25:24Z, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.42.2, architecture=x86_64, vcs-type=git, io.openshift.expose-services=, RELEASE=main, name=rhceph)
Feb 23 07:38:34 np0005626463.localdomain systemd[1]: Started libpod-conmon-98d363638c91aa57109e86c0b3f714e363f8ab8eb9a3414d970068a9700e18e0.scope.
Feb 23 07:38:34 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 07:38:34 np0005626463.localdomain podman[28616]: 2026-02-23 07:38:34.588078132 +0000 UTC m=+0.138085976 container init 98d363638c91aa57109e86c0b3f714e363f8ab8eb9a3414d970068a9700e18e0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=tender_ardinghelli, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, vcs-type=git, GIT_BRANCH=main, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-02-09T10:25:24Z, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, name=rhceph, RELEASE=main, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, ceph=True, com.redhat.component=rhceph-container, io.buildah.version=1.42.2, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7, build-date=2026-02-09T10:25:24Z, io.openshift.expose-services=, distribution-scope=public, release=1770267347, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 23 07:38:34 np0005626463.localdomain podman[28616]: 2026-02-23 07:38:34.495988971 +0000 UTC m=+0.045996805 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 23 07:38:34 np0005626463.localdomain podman[28616]: 2026-02-23 07:38:34.598645065 +0000 UTC m=+0.148652909 container start 98d363638c91aa57109e86c0b3f714e363f8ab8eb9a3414d970068a9700e18e0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=tender_ardinghelli, architecture=x86_64, RELEASE=main, name=rhceph, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, version=7, vcs-type=git, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, release=1770267347, io.buildah.version=1.42.2, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, CEPH_POINT_RELEASE=, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, build-date=2026-02-09T10:25:24Z, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc.)
Feb 23 07:38:34 np0005626463.localdomain podman[28616]: 2026-02-23 07:38:34.59892691 +0000 UTC m=+0.148934764 container attach 98d363638c91aa57109e86c0b3f714e363f8ab8eb9a3414d970068a9700e18e0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=tender_ardinghelli, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-02-09T10:25:24Z, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.buildah.version=1.42.2, RELEASE=main, description=Red Hat Ceph Storage 7, build-date=2026-02-09T10:25:24Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, distribution-scope=public, release=1770267347, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, io.openshift.expose-services=, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.component=rhceph-container, name=rhceph, GIT_CLEAN=True, ceph=True)
Feb 23 07:38:34 np0005626463.localdomain tender_ardinghelli[28632]: 167 167
Feb 23 07:38:34 np0005626463.localdomain systemd[1]: libpod-98d363638c91aa57109e86c0b3f714e363f8ab8eb9a3414d970068a9700e18e0.scope: Deactivated successfully.
Feb 23 07:38:34 np0005626463.localdomain podman[28616]: 2026-02-23 07:38:34.602810284 +0000 UTC m=+0.152818158 container died 98d363638c91aa57109e86c0b3f714e363f8ab8eb9a3414d970068a9700e18e0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=tender_ardinghelli, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, architecture=x86_64, distribution-scope=public, release=1770267347, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.expose-services=, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.component=rhceph-container, name=rhceph, io.buildah.version=1.42.2, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, description=Red Hat Ceph Storage 7, build-date=2026-02-09T10:25:24Z, org.opencontainers.image.created=2026-02-09T10:25:24Z, RELEASE=main, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=)
Feb 23 07:38:34 np0005626463.localdomain podman[28637]: 2026-02-23 07:38:34.700445296 +0000 UTC m=+0.080817891 container remove 98d363638c91aa57109e86c0b3f714e363f8ab8eb9a3414d970068a9700e18e0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=tender_ardinghelli, io.openshift.expose-services=, build-date=2026-02-09T10:25:24Z, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.42.2, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=rhceph-container, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, distribution-scope=public, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=7, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1770267347, org.opencontainers.image.created=2026-02-09T10:25:24Z, architecture=x86_64, vcs-type=git, RELEASE=main)
Feb 23 07:38:34 np0005626463.localdomain systemd[1]: libpod-conmon-98d363638c91aa57109e86c0b3f714e363f8ab8eb9a3414d970068a9700e18e0.scope: Deactivated successfully.
Feb 23 07:38:34 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 07:38:34 np0005626463.localdomain systemd-rc-local-generator[28676]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 07:38:34 np0005626463.localdomain systemd-sysv-generator[28681]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 07:38:34 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 07:38:35 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 23 07:38:35 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 07:38:35 np0005626463.localdomain systemd-sysv-generator[28718]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 07:38:35 np0005626463.localdomain systemd-rc-local-generator[28713]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 07:38:35 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 07:38:35 np0005626463.localdomain systemd[1]: Reached target All Ceph clusters and services.
Feb 23 07:38:35 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 07:38:35 np0005626463.localdomain systemd-rc-local-generator[28751]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 07:38:35 np0005626463.localdomain systemd-sysv-generator[28756]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 07:38:35 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 07:38:35 np0005626463.localdomain systemd[1]: Reached target Ceph cluster f1fea371-cb69-578d-a3d0-b5c472a84b46.
Feb 23 07:38:35 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 07:38:35 np0005626463.localdomain systemd-sysv-generator[28792]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 07:38:35 np0005626463.localdomain systemd-rc-local-generator[28789]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 07:38:35 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 07:38:35 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 07:38:35 np0005626463.localdomain systemd-rc-local-generator[28830]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 07:38:35 np0005626463.localdomain systemd-sysv-generator[28835]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 07:38:35 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 07:38:36 np0005626463.localdomain systemd[1]: Created slice Slice /system/ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46.
Feb 23 07:38:36 np0005626463.localdomain systemd[1]: Reached target System Time Set.
Feb 23 07:38:36 np0005626463.localdomain systemd[1]: Reached target System Time Synchronized.
Feb 23 07:38:36 np0005626463.localdomain systemd[1]: Starting Ceph crash.np0005626463 for f1fea371-cb69-578d-a3d0-b5c472a84b46...
Feb 23 07:38:36 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 23 07:38:36 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 23 07:38:36 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 23 07:38:36 np0005626463.localdomain podman[28893]: 
Feb 23 07:38:36 np0005626463.localdomain podman[28893]: 2026-02-23 07:38:36.371389811 +0000 UTC m=+0.078063116 container create fdf07215f0388d0ebc44f1f3744080ba594441e647c300d0dade62ff5beba234 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-crash-np0005626463, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1770267347, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, io.buildah.version=1.42.2, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, com.redhat.component=rhceph-container, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2026-02-09T10:25:24Z, build-date=2026-02-09T10:25:24Z, RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, version=7, GIT_BRANCH=main)
Feb 23 07:38:36 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/808d93a0a95d4410bf1c501ca81cea82f0256a147cb0c664b66924089e99de68/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Feb 23 07:38:36 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/808d93a0a95d4410bf1c501ca81cea82f0256a147cb0c664b66924089e99de68/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Feb 23 07:38:36 np0005626463.localdomain podman[28893]: 2026-02-23 07:38:36.340234317 +0000 UTC m=+0.046907632 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 23 07:38:36 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/808d93a0a95d4410bf1c501ca81cea82f0256a147cb0c664b66924089e99de68/merged/etc/ceph/ceph.client.crash.np0005626463.keyring supports timestamps until 2038 (0x7fffffff)
Feb 23 07:38:36 np0005626463.localdomain podman[28893]: 2026-02-23 07:38:36.462938394 +0000 UTC m=+0.169611699 container init fdf07215f0388d0ebc44f1f3744080ba594441e647c300d0dade62ff5beba234 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-crash-np0005626463, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.created=2026-02-09T10:25:24Z, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, ceph=True, vcs-type=git, version=7, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.buildah.version=1.42.2, GIT_CLEAN=True, name=rhceph, build-date=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1770267347, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Feb 23 07:38:36 np0005626463.localdomain podman[28893]: 2026-02-23 07:38:36.486172242 +0000 UTC m=+0.192845557 container start fdf07215f0388d0ebc44f1f3744080ba594441e647c300d0dade62ff5beba234 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-crash-np0005626463, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=rhceph-container, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, RELEASE=main, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, io.buildah.version=1.42.2, io.openshift.expose-services=, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, vendor=Red Hat, Inc., release=1770267347, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2026-02-09T10:25:24Z, distribution-scope=public, architecture=x86_64, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git)
Feb 23 07:38:36 np0005626463.localdomain bash[28893]: fdf07215f0388d0ebc44f1f3744080ba594441e647c300d0dade62ff5beba234
Feb 23 07:38:36 np0005626463.localdomain systemd[1]: Started Ceph crash.np0005626463 for f1fea371-cb69-578d-a3d0-b5c472a84b46.
Feb 23 07:38:36 np0005626463.localdomain sudo[28559]: pam_unix(sudo:session): session closed for user root
Feb 23 07:38:36 np0005626463.localdomain ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-crash-np0005626463[28907]: INFO:ceph-crash:pinging cluster to exercise our key, trying key client.crash.np0005626463.
Feb 23 07:38:37 np0005626463.localdomain ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-crash-np0005626463[28907]:   cluster:
Feb 23 07:38:37 np0005626463.localdomain ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-crash-np0005626463[28907]:     id:     f1fea371-cb69-578d-a3d0-b5c472a84b46
Feb 23 07:38:37 np0005626463.localdomain ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-crash-np0005626463[28907]:     health: HEALTH_WARN
Feb 23 07:38:37 np0005626463.localdomain ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-crash-np0005626463[28907]:             OSD count 0 < osd_pool_default_size 3
Feb 23 07:38:37 np0005626463.localdomain ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-crash-np0005626463[28907]:  
Feb 23 07:38:37 np0005626463.localdomain ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-crash-np0005626463[28907]:   services:
Feb 23 07:38:37 np0005626463.localdomain ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-crash-np0005626463[28907]:     mon: 3 daemons, quorum np0005626459,np0005626461,np0005626460 (age 5s)
Feb 23 07:38:37 np0005626463.localdomain ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-crash-np0005626463[28907]:     mgr: np0005626459.pmtxxl(active, since 2m), standbys: np0005626461.lrfquh
Feb 23 07:38:37 np0005626463.localdomain ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-crash-np0005626463[28907]:     osd: 0 osds: 0 up, 0 in
Feb 23 07:38:37 np0005626463.localdomain ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-crash-np0005626463[28907]:  
Feb 23 07:38:37 np0005626463.localdomain ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-crash-np0005626463[28907]:   data:
Feb 23 07:38:37 np0005626463.localdomain ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-crash-np0005626463[28907]:     pools:   0 pools, 0 pgs
Feb 23 07:38:37 np0005626463.localdomain ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-crash-np0005626463[28907]:     objects: 0 objects, 0 B
Feb 23 07:38:37 np0005626463.localdomain ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-crash-np0005626463[28907]:     usage:   0 B used, 0 B / 0 B avail
Feb 23 07:38:37 np0005626463.localdomain ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-crash-np0005626463[28907]:     pgs:     
Feb 23 07:38:37 np0005626463.localdomain ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-crash-np0005626463[28907]:  
Feb 23 07:38:37 np0005626463.localdomain ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-crash-np0005626463[28907]:   progress:
Feb 23 07:38:37 np0005626463.localdomain ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-crash-np0005626463[28907]:     Updating crash deployment (+4 -> 6) (0s)
Feb 23 07:38:37 np0005626463.localdomain ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-crash-np0005626463[28907]:       [............................] 
Feb 23 07:38:37 np0005626463.localdomain ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-crash-np0005626463[28907]:  
Feb 23 07:38:37 np0005626463.localdomain ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-crash-np0005626463[28907]: INFO:ceph-crash:monitoring path /var/lib/ceph/crash, delay 600s
Feb 23 07:38:44 np0005626463.localdomain sudo[28935]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 07:38:44 np0005626463.localdomain sudo[28935]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 07:38:44 np0005626463.localdomain sudo[28935]: pam_unix(sudo:session): session closed for user root
Feb 23 07:38:44 np0005626463.localdomain sudo[28950]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --env CEPH_VOLUME_OSDSPEC_AFFINITY=default_drive_group --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ceph-volume --fsid f1fea371-cb69-578d-a3d0-b5c472a84b46 --config-json - -- lvm batch --no-auto /dev/ceph_vg0/ceph_lv0 /dev/ceph_vg1/ceph_lv1 --yes --no-systemd
Feb 23 07:38:44 np0005626463.localdomain sudo[28950]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 07:38:44 np0005626463.localdomain podman[29002]: 2026-02-23 07:38:44.841987415 +0000 UTC m=+0.073625934 container create 1200a2a161681097adfee2d2e6c3337792273985f4f3bfd3c8148e587a280c5c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=youthful_engelbart, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.42.2, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, vendor=Red Hat, Inc., vcs-type=git, version=7, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, name=rhceph, io.openshift.expose-services=, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.created=2026-02-09T10:25:24Z, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.tags=rhceph ceph, release=1770267347, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, architecture=x86_64, build-date=2026-02-09T10:25:24Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Feb 23 07:38:44 np0005626463.localdomain systemd[1]: Started libpod-conmon-1200a2a161681097adfee2d2e6c3337792273985f4f3bfd3c8148e587a280c5c.scope.
Feb 23 07:38:44 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 07:38:44 np0005626463.localdomain podman[29002]: 2026-02-23 07:38:44.811405421 +0000 UTC m=+0.043043940 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 23 07:38:44 np0005626463.localdomain podman[29002]: 2026-02-23 07:38:44.916675262 +0000 UTC m=+0.148313791 container init 1200a2a161681097adfee2d2e6c3337792273985f4f3bfd3c8148e587a280c5c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=youthful_engelbart, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_BRANCH=main, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, io.buildah.version=1.42.2, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., architecture=x86_64, version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_CLEAN=True, release=1770267347, RELEASE=main, ceph=True, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2026-02-09T10:25:24Z, io.openshift.expose-services=)
Feb 23 07:38:44 np0005626463.localdomain podman[29002]: 2026-02-23 07:38:44.926579622 +0000 UTC m=+0.158218161 container start 1200a2a161681097adfee2d2e6c3337792273985f4f3bfd3c8148e587a280c5c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=youthful_engelbart, version=7, com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, org.opencontainers.image.created=2026-02-09T10:25:24Z, description=Red Hat Ceph Storage 7, distribution-scope=public, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, release=1770267347, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2026-02-09T10:25:24Z, ceph=True, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, io.buildah.version=1.42.2, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, RELEASE=main, name=rhceph, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True)
Feb 23 07:38:44 np0005626463.localdomain podman[29002]: 2026-02-23 07:38:44.92691117 +0000 UTC m=+0.158549709 container attach 1200a2a161681097adfee2d2e6c3337792273985f4f3bfd3c8148e587a280c5c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=youthful_engelbart, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_BRANCH=main, vendor=Red Hat, Inc., ceph=True, version=7, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2026-02-09T10:25:24Z, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, release=1770267347, architecture=x86_64, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, io.buildah.version=1.42.2, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.created=2026-02-09T10:25:24Z, io.openshift.expose-services=, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14)
Feb 23 07:38:44 np0005626463.localdomain youthful_engelbart[29017]: 167 167
Feb 23 07:38:44 np0005626463.localdomain systemd[1]: libpod-1200a2a161681097adfee2d2e6c3337792273985f4f3bfd3c8148e587a280c5c.scope: Deactivated successfully.
Feb 23 07:38:44 np0005626463.localdomain podman[29002]: 2026-02-23 07:38:44.929218831 +0000 UTC m=+0.160857350 container died 1200a2a161681097adfee2d2e6c3337792273985f4f3bfd3c8148e587a280c5c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=youthful_engelbart, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.expose-services=, GIT_CLEAN=True, release=1770267347, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, io.buildah.version=1.42.2, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.created=2026-02-09T10:25:24Z, build-date=2026-02-09T10:25:24Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, distribution-scope=public, version=7, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, ceph=True, name=rhceph)
Feb 23 07:38:45 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-cef3fbdac90d616fc0d7c3545aa1aa61742dbae67db60e53c99ac07b9835c7c9-merged.mount: Deactivated successfully.
Feb 23 07:38:45 np0005626463.localdomain podman[29022]: 2026-02-23 07:38:45.016499589 +0000 UTC m=+0.074149470 container remove 1200a2a161681097adfee2d2e6c3337792273985f4f3bfd3c8148e587a280c5c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=youthful_engelbart, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7, architecture=x86_64, CEPH_POINT_RELEASE=, build-date=2026-02-09T10:25:24Z, distribution-scope=public, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_CLEAN=True, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, io.buildah.version=1.42.2, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, release=1770267347, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, version=7)
Feb 23 07:38:45 np0005626463.localdomain systemd[1]: libpod-conmon-1200a2a161681097adfee2d2e6c3337792273985f4f3bfd3c8148e587a280c5c.scope: Deactivated successfully.
Feb 23 07:38:45 np0005626463.localdomain podman[29044]: 2026-02-23 07:38:45.236648938 +0000 UTC m=+0.072707935 container create 215492e48e1299c27dbf8ddaa82dff00c684596d495de2e7dff14262cb6267cb (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=relaxed_newton, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, io.openshift.expose-services=, vendor=Red Hat, Inc., release=1770267347, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, CEPH_POINT_RELEASE=, io.buildah.version=1.42.2, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.created=2026-02-09T10:25:24Z, com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, build-date=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=7, ceph=True, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, RELEASE=main, name=rhceph)
Feb 23 07:38:45 np0005626463.localdomain systemd[1]: Started libpod-conmon-215492e48e1299c27dbf8ddaa82dff00c684596d495de2e7dff14262cb6267cb.scope.
Feb 23 07:38:45 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 07:38:45 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ea140969b302d3c2ad5fdfb4390c95d64337f400c1fa01e354c1e83ab8644fb3/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Feb 23 07:38:45 np0005626463.localdomain podman[29044]: 2026-02-23 07:38:45.206657895 +0000 UTC m=+0.042716882 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 23 07:38:45 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ea140969b302d3c2ad5fdfb4390c95d64337f400c1fa01e354c1e83ab8644fb3/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Feb 23 07:38:45 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ea140969b302d3c2ad5fdfb4390c95d64337f400c1fa01e354c1e83ab8644fb3/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Feb 23 07:38:45 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ea140969b302d3c2ad5fdfb4390c95d64337f400c1fa01e354c1e83ab8644fb3/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Feb 23 07:38:45 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ea140969b302d3c2ad5fdfb4390c95d64337f400c1fa01e354c1e83ab8644fb3/merged/var/lib/ceph/bootstrap-osd/ceph.keyring supports timestamps until 2038 (0x7fffffff)
Feb 23 07:38:45 np0005626463.localdomain podman[29044]: 2026-02-23 07:38:45.362070847 +0000 UTC m=+0.198129844 container init 215492e48e1299c27dbf8ddaa82dff00c684596d495de2e7dff14262cb6267cb (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=relaxed_newton, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.42.2, release=1770267347, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, build-date=2026-02-09T10:25:24Z, com.redhat.component=rhceph-container, distribution-scope=public, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, RELEASE=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.created=2026-02-09T10:25:24Z, version=7, GIT_CLEAN=True, vcs-type=git, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, name=rhceph)
Feb 23 07:38:45 np0005626463.localdomain podman[29044]: 2026-02-23 07:38:45.37299019 +0000 UTC m=+0.209049187 container start 215492e48e1299c27dbf8ddaa82dff00c684596d495de2e7dff14262cb6267cb (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=relaxed_newton, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, io.buildah.version=1.42.2, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, build-date=2026-02-09T10:25:24Z, org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-type=git, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, description=Red Hat Ceph Storage 7, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, name=rhceph, release=1770267347, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main)
Feb 23 07:38:45 np0005626463.localdomain podman[29044]: 2026-02-23 07:38:45.373257574 +0000 UTC m=+0.209316611 container attach 215492e48e1299c27dbf8ddaa82dff00c684596d495de2e7dff14262cb6267cb (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=relaxed_newton, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1770267347, ceph=True, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, io.openshift.expose-services=, io.buildah.version=1.42.2, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, com.redhat.component=rhceph-container, name=rhceph, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, build-date=2026-02-09T10:25:24Z, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, distribution-scope=public)
Feb 23 07:38:45 np0005626463.localdomain relaxed_newton[29060]: --> passed data devices: 0 physical, 2 LVM
Feb 23 07:38:45 np0005626463.localdomain relaxed_newton[29060]: --> relative data size: 1.0
Feb 23 07:38:45 np0005626463.localdomain relaxed_newton[29060]: Running command: /usr/bin/ceph-authtool --gen-print-key
Feb 23 07:38:45 np0005626463.localdomain relaxed_newton[29060]: Running command: /usr/bin/ceph --cluster ceph --name client.bootstrap-osd --keyring /var/lib/ceph/bootstrap-osd/ceph.keyring -i - osd new 3c38c3a7-5c4b-4b97-99e3-119e348f6df6
Feb 23 07:38:46 np0005626463.localdomain relaxed_newton[29060]: Running command: /usr/bin/ceph-authtool --gen-print-key
Feb 23 07:38:46 np0005626463.localdomain lvm[29114]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Feb 23 07:38:46 np0005626463.localdomain lvm[29114]: VG ceph_vg0 finished
Feb 23 07:38:46 np0005626463.localdomain relaxed_newton[29060]: Running command: /usr/bin/mount -t tmpfs tmpfs /var/lib/ceph/osd/ceph-2
Feb 23 07:38:46 np0005626463.localdomain relaxed_newton[29060]: Running command: /usr/bin/chown -h ceph:ceph /dev/ceph_vg0/ceph_lv0
Feb 23 07:38:46 np0005626463.localdomain relaxed_newton[29060]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Feb 23 07:38:46 np0005626463.localdomain relaxed_newton[29060]: Running command: /usr/bin/ln -s /dev/ceph_vg0/ceph_lv0 /var/lib/ceph/osd/ceph-2/block
Feb 23 07:38:46 np0005626463.localdomain relaxed_newton[29060]: Running command: /usr/bin/ceph --cluster ceph --name client.bootstrap-osd --keyring /var/lib/ceph/bootstrap-osd/ceph.keyring mon getmap -o /var/lib/ceph/osd/ceph-2/activate.monmap
Feb 23 07:38:46 np0005626463.localdomain relaxed_newton[29060]:  stderr: got monmap epoch 3
Feb 23 07:38:46 np0005626463.localdomain relaxed_newton[29060]: --> Creating keyring file for osd.2
Feb 23 07:38:46 np0005626463.localdomain relaxed_newton[29060]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2/keyring
Feb 23 07:38:47 np0005626463.localdomain relaxed_newton[29060]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2/
Feb 23 07:38:47 np0005626463.localdomain relaxed_newton[29060]: Running command: /usr/bin/ceph-osd --cluster ceph --osd-objectstore bluestore --mkfs -i 2 --monmap /var/lib/ceph/osd/ceph-2/activate.monmap --keyfile - --osdspec-affinity default_drive_group --osd-data /var/lib/ceph/osd/ceph-2/ --osd-uuid 3c38c3a7-5c4b-4b97-99e3-119e348f6df6 --setuser ceph --setgroup ceph
Feb 23 07:38:49 np0005626463.localdomain relaxed_newton[29060]:  stderr: 2026-02-23T07:38:47.061+0000 7f6e37b08a80 -1 bluestore(/var/lib/ceph/osd/ceph-2//block) _read_bdev_label unable to decode label at offset 102: void bluestore_bdev_label_t::decode(ceph::buffer::v15_2_0::list::const_iterator&) decode past end of struct encoding: Malformed input [buffer:3]
Feb 23 07:38:49 np0005626463.localdomain relaxed_newton[29060]:  stderr: 2026-02-23T07:38:47.061+0000 7f6e37b08a80 -1 bluestore(/var/lib/ceph/osd/ceph-2/) _read_fsid unparsable uuid
Feb 23 07:38:49 np0005626463.localdomain relaxed_newton[29060]: --> ceph-volume lvm prepare successful for: ceph_vg0/ceph_lv0
Feb 23 07:38:49 np0005626463.localdomain relaxed_newton[29060]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2
Feb 23 07:38:49 np0005626463.localdomain relaxed_newton[29060]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph_vg0/ceph_lv0 --path /var/lib/ceph/osd/ceph-2 --no-mon-config
Feb 23 07:38:49 np0005626463.localdomain relaxed_newton[29060]: Running command: /usr/bin/ln -snf /dev/ceph_vg0/ceph_lv0 /var/lib/ceph/osd/ceph-2/block
Feb 23 07:38:49 np0005626463.localdomain relaxed_newton[29060]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-2/block
Feb 23 07:38:49 np0005626463.localdomain relaxed_newton[29060]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Feb 23 07:38:49 np0005626463.localdomain relaxed_newton[29060]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2
Feb 23 07:38:49 np0005626463.localdomain relaxed_newton[29060]: --> ceph-volume lvm activate successful for osd ID: 2
Feb 23 07:38:49 np0005626463.localdomain relaxed_newton[29060]: --> ceph-volume lvm create successful for: ceph_vg0/ceph_lv0
Feb 23 07:38:49 np0005626463.localdomain relaxed_newton[29060]: Running command: /usr/bin/ceph-authtool --gen-print-key
Feb 23 07:38:49 np0005626463.localdomain relaxed_newton[29060]: Running command: /usr/bin/ceph --cluster ceph --name client.bootstrap-osd --keyring /var/lib/ceph/bootstrap-osd/ceph.keyring -i - osd new 79650a5e-2685-4848-a7c4-7cead1e09ea1
Feb 23 07:38:50 np0005626463.localdomain lvm[30046]: PV /dev/loop4 online, VG ceph_vg1 is complete.
Feb 23 07:38:50 np0005626463.localdomain relaxed_newton[29060]: Running command: /usr/bin/ceph-authtool --gen-print-key
Feb 23 07:38:50 np0005626463.localdomain lvm[30046]: VG ceph_vg1 finished
Feb 23 07:38:50 np0005626463.localdomain relaxed_newton[29060]: Running command: /usr/bin/mount -t tmpfs tmpfs /var/lib/ceph/osd/ceph-5
Feb 23 07:38:50 np0005626463.localdomain relaxed_newton[29060]: Running command: /usr/bin/chown -h ceph:ceph /dev/ceph_vg1/ceph_lv1
Feb 23 07:38:50 np0005626463.localdomain relaxed_newton[29060]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-1
Feb 23 07:38:50 np0005626463.localdomain relaxed_newton[29060]: Running command: /usr/bin/ln -s /dev/ceph_vg1/ceph_lv1 /var/lib/ceph/osd/ceph-5/block
Feb 23 07:38:50 np0005626463.localdomain relaxed_newton[29060]: Running command: /usr/bin/ceph --cluster ceph --name client.bootstrap-osd --keyring /var/lib/ceph/bootstrap-osd/ceph.keyring mon getmap -o /var/lib/ceph/osd/ceph-5/activate.monmap
Feb 23 07:38:50 np0005626463.localdomain relaxed_newton[29060]:  stderr: got monmap epoch 3
Feb 23 07:38:50 np0005626463.localdomain relaxed_newton[29060]: --> Creating keyring file for osd.5
Feb 23 07:38:50 np0005626463.localdomain relaxed_newton[29060]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-5/keyring
Feb 23 07:38:50 np0005626463.localdomain relaxed_newton[29060]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-5/
Feb 23 07:38:50 np0005626463.localdomain relaxed_newton[29060]: Running command: /usr/bin/ceph-osd --cluster ceph --osd-objectstore bluestore --mkfs -i 5 --monmap /var/lib/ceph/osd/ceph-5/activate.monmap --keyfile - --osdspec-affinity default_drive_group --osd-data /var/lib/ceph/osd/ceph-5/ --osd-uuid 79650a5e-2685-4848-a7c4-7cead1e09ea1 --setuser ceph --setgroup ceph
Feb 23 07:38:53 np0005626463.localdomain relaxed_newton[29060]:  stderr: 2026-02-23T07:38:50.824+0000 7f44dd1baa80 -1 bluestore(/var/lib/ceph/osd/ceph-5//block) _read_bdev_label unable to decode label at offset 102: void bluestore_bdev_label_t::decode(ceph::buffer::v15_2_0::list::const_iterator&) decode past end of struct encoding: Malformed input [buffer:3]
Feb 23 07:38:53 np0005626463.localdomain relaxed_newton[29060]:  stderr: 2026-02-23T07:38:50.824+0000 7f44dd1baa80 -1 bluestore(/var/lib/ceph/osd/ceph-5/) _read_fsid unparsable uuid
Feb 23 07:38:53 np0005626463.localdomain relaxed_newton[29060]: --> ceph-volume lvm prepare successful for: ceph_vg1/ceph_lv1
Feb 23 07:38:53 np0005626463.localdomain relaxed_newton[29060]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-5
Feb 23 07:38:53 np0005626463.localdomain relaxed_newton[29060]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph_vg1/ceph_lv1 --path /var/lib/ceph/osd/ceph-5 --no-mon-config
Feb 23 07:38:53 np0005626463.localdomain relaxed_newton[29060]: Running command: /usr/bin/ln -snf /dev/ceph_vg1/ceph_lv1 /var/lib/ceph/osd/ceph-5/block
Feb 23 07:38:53 np0005626463.localdomain relaxed_newton[29060]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-5/block
Feb 23 07:38:53 np0005626463.localdomain relaxed_newton[29060]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-1
Feb 23 07:38:53 np0005626463.localdomain relaxed_newton[29060]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-5
Feb 23 07:38:53 np0005626463.localdomain relaxed_newton[29060]: --> ceph-volume lvm activate successful for osd ID: 5
Feb 23 07:38:53 np0005626463.localdomain relaxed_newton[29060]: --> ceph-volume lvm create successful for: ceph_vg1/ceph_lv1
Feb 23 07:38:53 np0005626463.localdomain systemd[1]: libpod-215492e48e1299c27dbf8ddaa82dff00c684596d495de2e7dff14262cb6267cb.scope: Deactivated successfully.
Feb 23 07:38:53 np0005626463.localdomain systemd[1]: libpod-215492e48e1299c27dbf8ddaa82dff00c684596d495de2e7dff14262cb6267cb.scope: Consumed 3.706s CPU time.
Feb 23 07:38:53 np0005626463.localdomain podman[30944]: 2026-02-23 07:38:53.490366165 +0000 UTC m=+0.049480497 container died 215492e48e1299c27dbf8ddaa82dff00c684596d495de2e7dff14262cb6267cb (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=relaxed_newton, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, CEPH_POINT_RELEASE=, build-date=2026-02-09T10:25:24Z, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, version=7, ceph=True, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.42.2, architecture=x86_64, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1770267347, io.openshift.tags=rhceph ceph, vcs-type=git, io.openshift.expose-services=, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, description=Red Hat Ceph Storage 7, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Feb 23 07:38:53 np0005626463.localdomain systemd[1]: tmp-crun.caYVru.mount: Deactivated successfully.
Feb 23 07:38:53 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-ea140969b302d3c2ad5fdfb4390c95d64337f400c1fa01e354c1e83ab8644fb3-merged.mount: Deactivated successfully.
Feb 23 07:38:53 np0005626463.localdomain podman[30944]: 2026-02-23 07:38:53.529956612 +0000 UTC m=+0.089070904 container remove 215492e48e1299c27dbf8ddaa82dff00c684596d495de2e7dff14262cb6267cb (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=relaxed_newton, release=1770267347, org.opencontainers.image.created=2026-02-09T10:25:24Z, version=7, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=rhceph-container, RELEASE=main, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.42.2, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, build-date=2026-02-09T10:25:24Z, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, ceph=True, architecture=x86_64, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vendor=Red Hat, Inc.)
Feb 23 07:38:53 np0005626463.localdomain systemd[1]: libpod-conmon-215492e48e1299c27dbf8ddaa82dff00c684596d495de2e7dff14262cb6267cb.scope: Deactivated successfully.
Feb 23 07:38:53 np0005626463.localdomain sudo[28950]: pam_unix(sudo:session): session closed for user root
Feb 23 07:38:53 np0005626463.localdomain sudo[30957]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 07:38:53 np0005626463.localdomain sudo[30957]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 07:38:53 np0005626463.localdomain sudo[30957]: pam_unix(sudo:session): session closed for user root
Feb 23 07:38:53 np0005626463.localdomain sudo[30972]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ceph-volume --fsid f1fea371-cb69-578d-a3d0-b5c472a84b46 -- lvm list --format json
Feb 23 07:38:53 np0005626463.localdomain sudo[30972]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 07:38:54 np0005626463.localdomain podman[31025]: 
Feb 23 07:38:54 np0005626463.localdomain podman[31025]: 2026-02-23 07:38:54.2707077 +0000 UTC m=+0.069032852 container create 1edf31218dfc798eb52a2c69add5ca018cfe2dc2a3e4bd4269f445de0b0ac93a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=quizzical_chatelet, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2026-02-09T10:25:24Z, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, release=1770267347, name=rhceph, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.42.2, com.redhat.component=rhceph-container, vcs-type=git, architecture=x86_64, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, distribution-scope=public, CEPH_POINT_RELEASE=, version=7)
Feb 23 07:38:54 np0005626463.localdomain systemd[1]: Started libpod-conmon-1edf31218dfc798eb52a2c69add5ca018cfe2dc2a3e4bd4269f445de0b0ac93a.scope.
Feb 23 07:38:54 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 07:38:54 np0005626463.localdomain podman[31025]: 2026-02-23 07:38:54.335438655 +0000 UTC m=+0.133763807 container init 1edf31218dfc798eb52a2c69add5ca018cfe2dc2a3e4bd4269f445de0b0ac93a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=quizzical_chatelet, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, version=7, GIT_CLEAN=True, vcs-type=git, release=1770267347, ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, RELEASE=main, distribution-scope=public, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, com.redhat.component=rhceph-container, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-02-09T10:25:24Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, io.buildah.version=1.42.2)
Feb 23 07:38:54 np0005626463.localdomain podman[31025]: 2026-02-23 07:38:54.242757464 +0000 UTC m=+0.041082616 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 23 07:38:54 np0005626463.localdomain podman[31025]: 2026-02-23 07:38:54.344000926 +0000 UTC m=+0.142326078 container start 1edf31218dfc798eb52a2c69add5ca018cfe2dc2a3e4bd4269f445de0b0ac93a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=quizzical_chatelet, ceph=True, name=rhceph, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.42.2, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1770267347, distribution-scope=public, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_CLEAN=True, build-date=2026-02-09T10:25:24Z, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, version=7, io.openshift.expose-services=, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git)
Feb 23 07:38:54 np0005626463.localdomain podman[31025]: 2026-02-23 07:38:54.344247318 +0000 UTC m=+0.142572470 container attach 1edf31218dfc798eb52a2c69add5ca018cfe2dc2a3e4bd4269f445de0b0ac93a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=quizzical_chatelet, GIT_CLEAN=True, build-date=2026-02-09T10:25:24Z, ceph=True, io.openshift.tags=rhceph ceph, vcs-type=git, description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2026-02-09T10:25:24Z, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=rhceph-container, version=7, CEPH_POINT_RELEASE=, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1770267347, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, io.buildah.version=1.42.2, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., io.openshift.expose-services=)
Feb 23 07:38:54 np0005626463.localdomain quizzical_chatelet[31041]: 167 167
Feb 23 07:38:54 np0005626463.localdomain systemd[1]: libpod-1edf31218dfc798eb52a2c69add5ca018cfe2dc2a3e4bd4269f445de0b0ac93a.scope: Deactivated successfully.
Feb 23 07:38:54 np0005626463.localdomain podman[31025]: 2026-02-23 07:38:54.347770503 +0000 UTC m=+0.146095685 container died 1edf31218dfc798eb52a2c69add5ca018cfe2dc2a3e4bd4269f445de0b0ac93a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=quizzical_chatelet, GIT_CLEAN=True, release=1770267347, RELEASE=main, build-date=2026-02-09T10:25:24Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.42.2, version=7, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, architecture=x86_64, ceph=True, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vendor=Red Hat, Inc., name=rhceph, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=rhceph-container)
Feb 23 07:38:54 np0005626463.localdomain podman[31046]: 2026-02-23 07:38:54.430979978 +0000 UTC m=+0.071929104 container remove 1edf31218dfc798eb52a2c69add5ca018cfe2dc2a3e4bd4269f445de0b0ac93a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=quizzical_chatelet, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, build-date=2026-02-09T10:25:24Z, name=rhceph, GIT_CLEAN=True, ceph=True, io.openshift.tags=rhceph ceph, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.expose-services=, RELEASE=main, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1770267347, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.buildah.version=1.42.2, version=7, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., architecture=x86_64)
Feb 23 07:38:54 np0005626463.localdomain systemd[1]: libpod-conmon-1edf31218dfc798eb52a2c69add5ca018cfe2dc2a3e4bd4269f445de0b0ac93a.scope: Deactivated successfully.
Feb 23 07:38:54 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-7b9d5611a60fac7d0fb8943e20608c63a05668d286a9d63cf33afc8a831c9bc1-merged.mount: Deactivated successfully.
Feb 23 07:38:54 np0005626463.localdomain podman[31065]: 
Feb 23 07:38:54 np0005626463.localdomain podman[31065]: 2026-02-23 07:38:54.610715747 +0000 UTC m=+0.041389143 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 23 07:38:55 np0005626463.localdomain podman[31065]: 2026-02-23 07:38:55.075754252 +0000 UTC m=+0.506427648 container create f9d441f2c6fa2a875cf3f1b07b186f42916036919f6a4db4ba4dddeebb2a2bb1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=flamboyant_curie, build-date=2026-02-09T10:25:24Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., name=rhceph, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.tags=rhceph ceph, ceph=True, description=Red Hat Ceph Storage 7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, version=7, vcs-type=git, release=1770267347, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.42.2, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.created=2026-02-09T10:25:24Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, io.openshift.expose-services=, distribution-scope=public)
Feb 23 07:38:55 np0005626463.localdomain systemd[1]: Started libpod-conmon-f9d441f2c6fa2a875cf3f1b07b186f42916036919f6a4db4ba4dddeebb2a2bb1.scope.
Feb 23 07:38:55 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 07:38:55 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6e7bc5eefd4eabb95507f861615322272d0998c6323fea80fee47b37ac39fb9b/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Feb 23 07:38:55 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6e7bc5eefd4eabb95507f861615322272d0998c6323fea80fee47b37ac39fb9b/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Feb 23 07:38:55 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6e7bc5eefd4eabb95507f861615322272d0998c6323fea80fee47b37ac39fb9b/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Feb 23 07:38:55 np0005626463.localdomain podman[31065]: 2026-02-23 07:38:55.172839724 +0000 UTC m=+0.603513120 container init f9d441f2c6fa2a875cf3f1b07b186f42916036919f6a4db4ba4dddeebb2a2bb1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=flamboyant_curie, io.openshift.expose-services=, GIT_CLEAN=True, release=1770267347, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, version=7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, ceph=True, build-date=2026-02-09T10:25:24Z, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., RELEASE=main, CEPH_POINT_RELEASE=, vcs-type=git, io.buildah.version=1.42.2, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.created=2026-02-09T10:25:24Z, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7)
Feb 23 07:38:55 np0005626463.localdomain podman[31065]: 2026-02-23 07:38:55.182605147 +0000 UTC m=+0.613278553 container start f9d441f2c6fa2a875cf3f1b07b186f42916036919f6a4db4ba4dddeebb2a2bb1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=flamboyant_curie, GIT_BRANCH=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.component=rhceph-container, distribution-scope=public, io.buildah.version=1.42.2, version=7, ceph=True, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, architecture=x86_64, build-date=2026-02-09T10:25:24Z, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, GIT_CLEAN=True, release=1770267347, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7)
Feb 23 07:38:55 np0005626463.localdomain podman[31065]: 2026-02-23 07:38:55.182839849 +0000 UTC m=+0.613513245 container attach f9d441f2c6fa2a875cf3f1b07b186f42916036919f6a4db4ba4dddeebb2a2bb1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=flamboyant_curie, RELEASE=main, release=1770267347, build-date=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, GIT_CLEAN=True, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., GIT_BRANCH=main, description=Red Hat Ceph Storage 7, io.buildah.version=1.42.2, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.created=2026-02-09T10:25:24Z, name=rhceph, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, architecture=x86_64, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, ceph=True)
Feb 23 07:38:55 np0005626463.localdomain flamboyant_curie[31081]: {
Feb 23 07:38:55 np0005626463.localdomain flamboyant_curie[31081]:     "2": [
Feb 23 07:38:55 np0005626463.localdomain flamboyant_curie[31081]:         {
Feb 23 07:38:55 np0005626463.localdomain flamboyant_curie[31081]:             "devices": [
Feb 23 07:38:55 np0005626463.localdomain flamboyant_curie[31081]:                 "/dev/loop3"
Feb 23 07:38:55 np0005626463.localdomain flamboyant_curie[31081]:             ],
Feb 23 07:38:55 np0005626463.localdomain flamboyant_curie[31081]:             "lv_name": "ceph_lv0",
Feb 23 07:38:55 np0005626463.localdomain flamboyant_curie[31081]:             "lv_path": "/dev/ceph_vg0/ceph_lv0",
Feb 23 07:38:55 np0005626463.localdomain flamboyant_curie[31081]:             "lv_size": "7511998464",
Feb 23 07:38:55 np0005626463.localdomain flamboyant_curie[31081]:             "lv_tags": "ceph.block_device=/dev/ceph_vg0/ceph_lv0,ceph.block_uuid=QzbSmV-1Ft3-Pm3v-HD9F-5Gxf-hEgf-DOunqH,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=f1fea371-cb69-578d-a3d0-b5c472a84b46,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=3c38c3a7-5c4b-4b97-99e3-119e348f6df6,ceph.osd_id=2,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Feb 23 07:38:55 np0005626463.localdomain flamboyant_curie[31081]:             "lv_uuid": "QzbSmV-1Ft3-Pm3v-HD9F-5Gxf-hEgf-DOunqH",
Feb 23 07:38:55 np0005626463.localdomain flamboyant_curie[31081]:             "name": "ceph_lv0",
Feb 23 07:38:55 np0005626463.localdomain flamboyant_curie[31081]:             "path": "/dev/ceph_vg0/ceph_lv0",
Feb 23 07:38:55 np0005626463.localdomain flamboyant_curie[31081]:             "tags": {
Feb 23 07:38:55 np0005626463.localdomain flamboyant_curie[31081]:                 "ceph.block_device": "/dev/ceph_vg0/ceph_lv0",
Feb 23 07:38:55 np0005626463.localdomain flamboyant_curie[31081]:                 "ceph.block_uuid": "QzbSmV-1Ft3-Pm3v-HD9F-5Gxf-hEgf-DOunqH",
Feb 23 07:38:55 np0005626463.localdomain flamboyant_curie[31081]:                 "ceph.cephx_lockbox_secret": "",
Feb 23 07:38:55 np0005626463.localdomain flamboyant_curie[31081]:                 "ceph.cluster_fsid": "f1fea371-cb69-578d-a3d0-b5c472a84b46",
Feb 23 07:38:55 np0005626463.localdomain flamboyant_curie[31081]:                 "ceph.cluster_name": "ceph",
Feb 23 07:38:55 np0005626463.localdomain flamboyant_curie[31081]:                 "ceph.crush_device_class": "",
Feb 23 07:38:55 np0005626463.localdomain flamboyant_curie[31081]:                 "ceph.encrypted": "0",
Feb 23 07:38:55 np0005626463.localdomain flamboyant_curie[31081]:                 "ceph.osd_fsid": "3c38c3a7-5c4b-4b97-99e3-119e348f6df6",
Feb 23 07:38:55 np0005626463.localdomain flamboyant_curie[31081]:                 "ceph.osd_id": "2",
Feb 23 07:38:55 np0005626463.localdomain flamboyant_curie[31081]:                 "ceph.osdspec_affinity": "default_drive_group",
Feb 23 07:38:55 np0005626463.localdomain flamboyant_curie[31081]:                 "ceph.type": "block",
Feb 23 07:38:55 np0005626463.localdomain flamboyant_curie[31081]:                 "ceph.vdo": "0"
Feb 23 07:38:55 np0005626463.localdomain flamboyant_curie[31081]:             },
Feb 23 07:38:55 np0005626463.localdomain flamboyant_curie[31081]:             "type": "block",
Feb 23 07:38:55 np0005626463.localdomain flamboyant_curie[31081]:             "vg_name": "ceph_vg0"
Feb 23 07:38:55 np0005626463.localdomain flamboyant_curie[31081]:         }
Feb 23 07:38:55 np0005626463.localdomain flamboyant_curie[31081]:     ],
Feb 23 07:38:55 np0005626463.localdomain flamboyant_curie[31081]:     "5": [
Feb 23 07:38:55 np0005626463.localdomain flamboyant_curie[31081]:         {
Feb 23 07:38:55 np0005626463.localdomain flamboyant_curie[31081]:             "devices": [
Feb 23 07:38:55 np0005626463.localdomain flamboyant_curie[31081]:                 "/dev/loop4"
Feb 23 07:38:55 np0005626463.localdomain flamboyant_curie[31081]:             ],
Feb 23 07:38:55 np0005626463.localdomain flamboyant_curie[31081]:             "lv_name": "ceph_lv1",
Feb 23 07:38:55 np0005626463.localdomain flamboyant_curie[31081]:             "lv_path": "/dev/ceph_vg1/ceph_lv1",
Feb 23 07:38:55 np0005626463.localdomain flamboyant_curie[31081]:             "lv_size": "7511998464",
Feb 23 07:38:55 np0005626463.localdomain flamboyant_curie[31081]:             "lv_tags": "ceph.block_device=/dev/ceph_vg1/ceph_lv1,ceph.block_uuid=viXkli-dRMc-hUAL-qD67-UJ3I-enjo-w2BLkV,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=f1fea371-cb69-578d-a3d0-b5c472a84b46,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=79650a5e-2685-4848-a7c4-7cead1e09ea1,ceph.osd_id=5,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Feb 23 07:38:55 np0005626463.localdomain flamboyant_curie[31081]:             "lv_uuid": "viXkli-dRMc-hUAL-qD67-UJ3I-enjo-w2BLkV",
Feb 23 07:38:55 np0005626463.localdomain flamboyant_curie[31081]:             "name": "ceph_lv1",
Feb 23 07:38:55 np0005626463.localdomain flamboyant_curie[31081]:             "path": "/dev/ceph_vg1/ceph_lv1",
Feb 23 07:38:55 np0005626463.localdomain flamboyant_curie[31081]:             "tags": {
Feb 23 07:38:55 np0005626463.localdomain flamboyant_curie[31081]:                 "ceph.block_device": "/dev/ceph_vg1/ceph_lv1",
Feb 23 07:38:55 np0005626463.localdomain flamboyant_curie[31081]:                 "ceph.block_uuid": "viXkli-dRMc-hUAL-qD67-UJ3I-enjo-w2BLkV",
Feb 23 07:38:55 np0005626463.localdomain flamboyant_curie[31081]:                 "ceph.cephx_lockbox_secret": "",
Feb 23 07:38:55 np0005626463.localdomain flamboyant_curie[31081]:                 "ceph.cluster_fsid": "f1fea371-cb69-578d-a3d0-b5c472a84b46",
Feb 23 07:38:55 np0005626463.localdomain flamboyant_curie[31081]:                 "ceph.cluster_name": "ceph",
Feb 23 07:38:55 np0005626463.localdomain flamboyant_curie[31081]:                 "ceph.crush_device_class": "",
Feb 23 07:38:55 np0005626463.localdomain flamboyant_curie[31081]:                 "ceph.encrypted": "0",
Feb 23 07:38:55 np0005626463.localdomain flamboyant_curie[31081]:                 "ceph.osd_fsid": "79650a5e-2685-4848-a7c4-7cead1e09ea1",
Feb 23 07:38:55 np0005626463.localdomain flamboyant_curie[31081]:                 "ceph.osd_id": "5",
Feb 23 07:38:55 np0005626463.localdomain flamboyant_curie[31081]:                 "ceph.osdspec_affinity": "default_drive_group",
Feb 23 07:38:55 np0005626463.localdomain flamboyant_curie[31081]:                 "ceph.type": "block",
Feb 23 07:38:55 np0005626463.localdomain flamboyant_curie[31081]:                 "ceph.vdo": "0"
Feb 23 07:38:55 np0005626463.localdomain flamboyant_curie[31081]:             },
Feb 23 07:38:55 np0005626463.localdomain flamboyant_curie[31081]:             "type": "block",
Feb 23 07:38:55 np0005626463.localdomain flamboyant_curie[31081]:             "vg_name": "ceph_vg1"
Feb 23 07:38:55 np0005626463.localdomain flamboyant_curie[31081]:         }
Feb 23 07:38:55 np0005626463.localdomain flamboyant_curie[31081]:     ]
Feb 23 07:38:55 np0005626463.localdomain flamboyant_curie[31081]: }
Feb 23 07:38:55 np0005626463.localdomain systemd[1]: libpod-f9d441f2c6fa2a875cf3f1b07b186f42916036919f6a4db4ba4dddeebb2a2bb1.scope: Deactivated successfully.
Feb 23 07:38:55 np0005626463.localdomain podman[31065]: 2026-02-23 07:38:55.522629414 +0000 UTC m=+0.953302850 container died f9d441f2c6fa2a875cf3f1b07b186f42916036919f6a4db4ba4dddeebb2a2bb1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=flamboyant_curie, release=1770267347, build-date=2026-02-09T10:25:24Z, org.opencontainers.image.created=2026-02-09T10:25:24Z, architecture=x86_64, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., ceph=True, io.buildah.version=1.42.2, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, version=7, io.openshift.expose-services=, distribution-scope=public)
Feb 23 07:38:55 np0005626463.localdomain systemd[1]: tmp-crun.2EhrIm.mount: Deactivated successfully.
Feb 23 07:38:55 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-6e7bc5eefd4eabb95507f861615322272d0998c6323fea80fee47b37ac39fb9b-merged.mount: Deactivated successfully.
Feb 23 07:38:55 np0005626463.localdomain podman[31090]: 2026-02-23 07:38:55.607859015 +0000 UTC m=+0.075814468 container remove f9d441f2c6fa2a875cf3f1b07b186f42916036919f6a4db4ba4dddeebb2a2bb1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=flamboyant_curie, io.buildah.version=1.42.2, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, build-date=2026-02-09T10:25:24Z, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, name=rhceph, distribution-scope=public, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, release=1770267347, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.created=2026-02-09T10:25:24Z, ceph=True, io.openshift.expose-services=)
Feb 23 07:38:55 np0005626463.localdomain systemd[1]: libpod-conmon-f9d441f2c6fa2a875cf3f1b07b186f42916036919f6a4db4ba4dddeebb2a2bb1.scope: Deactivated successfully.
Feb 23 07:38:55 np0005626463.localdomain sudo[30972]: pam_unix(sudo:session): session closed for user root
Feb 23 07:38:55 np0005626463.localdomain sudo[31104]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 07:38:55 np0005626463.localdomain sudo[31104]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 07:38:55 np0005626463.localdomain sudo[31104]: pam_unix(sudo:session): session closed for user root
Feb 23 07:38:55 np0005626463.localdomain sudo[31119]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid f1fea371-cb69-578d-a3d0-b5c472a84b46
Feb 23 07:38:55 np0005626463.localdomain sudo[31119]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 07:38:56 np0005626463.localdomain podman[31177]: 
Feb 23 07:38:56 np0005626463.localdomain podman[31177]: 2026-02-23 07:38:56.343921767 +0000 UTC m=+0.070319580 container create 942c2fb3150f2fcc2e3f0e57cf1c2f786f36ec8484e24ad57b698b25d1c7ce84 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=funny_colden, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.buildah.version=1.42.2, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., distribution-scope=public, RELEASE=main, name=rhceph, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2026-02-09T10:25:24Z, release=1770267347, ceph=True, version=7, GIT_BRANCH=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.tags=rhceph ceph)
Feb 23 07:38:56 np0005626463.localdomain systemd[1]: Started libpod-conmon-942c2fb3150f2fcc2e3f0e57cf1c2f786f36ec8484e24ad57b698b25d1c7ce84.scope.
Feb 23 07:38:56 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 07:38:56 np0005626463.localdomain podman[31177]: 2026-02-23 07:38:56.409102127 +0000 UTC m=+0.135499940 container init 942c2fb3150f2fcc2e3f0e57cf1c2f786f36ec8484e24ad57b698b25d1c7ce84 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=funny_colden, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.buildah.version=1.42.2, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, version=7, io.openshift.expose-services=, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2026-02-09T10:25:24Z, ceph=True, architecture=x86_64, GIT_BRANCH=main, distribution-scope=public, org.opencontainers.image.created=2026-02-09T10:25:24Z, release=1770267347, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14)
Feb 23 07:38:56 np0005626463.localdomain podman[31177]: 2026-02-23 07:38:56.316190733 +0000 UTC m=+0.042588546 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 23 07:38:56 np0005626463.localdomain podman[31177]: 2026-02-23 07:38:56.418974235 +0000 UTC m=+0.145372048 container start 942c2fb3150f2fcc2e3f0e57cf1c2f786f36ec8484e24ad57b698b25d1c7ce84 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=funny_colden, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.buildah.version=1.42.2, io.openshift.expose-services=, release=1770267347, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, name=rhceph, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, build-date=2026-02-09T10:25:24Z, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, version=7)
Feb 23 07:38:56 np0005626463.localdomain podman[31177]: 2026-02-23 07:38:56.419208707 +0000 UTC m=+0.145606530 container attach 942c2fb3150f2fcc2e3f0e57cf1c2f786f36ec8484e24ad57b698b25d1c7ce84 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=funny_colden, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, RELEASE=main, ceph=True, name=rhceph, version=7, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1770267347, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.component=rhceph-container, GIT_BRANCH=main, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.buildah.version=1.42.2, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_CLEAN=True, CEPH_POINT_RELEASE=, build-date=2026-02-09T10:25:24Z, distribution-scope=public, architecture=x86_64, description=Red Hat Ceph Storage 7)
Feb 23 07:38:56 np0005626463.localdomain funny_colden[31192]: 167 167
Feb 23 07:38:56 np0005626463.localdomain systemd[1]: libpod-942c2fb3150f2fcc2e3f0e57cf1c2f786f36ec8484e24ad57b698b25d1c7ce84.scope: Deactivated successfully.
Feb 23 07:38:56 np0005626463.localdomain podman[31177]: 2026-02-23 07:38:56.421580651 +0000 UTC m=+0.147978534 container died 942c2fb3150f2fcc2e3f0e57cf1c2f786f36ec8484e24ad57b698b25d1c7ce84 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=funny_colden, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, ceph=True, description=Red Hat Ceph Storage 7, io.buildah.version=1.42.2, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, version=7, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, vendor=Red Hat, Inc., RELEASE=main, vcs-type=git, name=rhceph, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, architecture=x86_64, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1770267347, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Feb 23 07:38:56 np0005626463.localdomain podman[31197]: 2026-02-23 07:38:56.516911252 +0000 UTC m=+0.083010595 container remove 942c2fb3150f2fcc2e3f0e57cf1c2f786f36ec8484e24ad57b698b25d1c7ce84 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=funny_colden, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, name=rhceph, RELEASE=main, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, io.buildah.version=1.42.2, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, release=1770267347, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2026-02-09T10:25:24Z, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, ceph=True, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, build-date=2026-02-09T10:25:24Z)
Feb 23 07:38:56 np0005626463.localdomain systemd[1]: libpod-conmon-942c2fb3150f2fcc2e3f0e57cf1c2f786f36ec8484e24ad57b698b25d1c7ce84.scope: Deactivated successfully.
Feb 23 07:38:56 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-3dba304662f17f95e27d680ba2f2ef51ce43633d3acbeb54c4928dd0f4233264-merged.mount: Deactivated successfully.
Feb 23 07:38:56 np0005626463.localdomain podman[31224]: 
Feb 23 07:38:56 np0005626463.localdomain podman[31224]: 2026-02-23 07:38:56.854195616 +0000 UTC m=+0.076771299 container create 42208a29ce264b4593c6dd65cea427617bdba7d45838078937dd494e4be83ce2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-osd-2-activate-test, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.42.2, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., io.openshift.expose-services=, version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-type=git, name=rhceph, com.redhat.component=rhceph-container, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, build-date=2026-02-09T10:25:24Z, release=1770267347, RELEASE=main, GIT_BRANCH=main, ceph=True, architecture=x86_64)
Feb 23 07:38:56 np0005626463.localdomain systemd[1]: Started libpod-conmon-42208a29ce264b4593c6dd65cea427617bdba7d45838078937dd494e4be83ce2.scope.
Feb 23 07:38:56 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 07:38:56 np0005626463.localdomain podman[31224]: 2026-02-23 07:38:56.824785653 +0000 UTC m=+0.047361326 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 23 07:38:56 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7f36cb8087ed81274f1b45435412b2e883469909d555438e5d1c1c13f4447f30/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Feb 23 07:38:56 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7f36cb8087ed81274f1b45435412b2e883469909d555438e5d1c1c13f4447f30/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Feb 23 07:38:56 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7f36cb8087ed81274f1b45435412b2e883469909d555438e5d1c1c13f4447f30/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Feb 23 07:38:56 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7f36cb8087ed81274f1b45435412b2e883469909d555438e5d1c1c13f4447f30/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Feb 23 07:38:56 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7f36cb8087ed81274f1b45435412b2e883469909d555438e5d1c1c13f4447f30/merged/var/lib/ceph/osd/ceph-2 supports timestamps until 2038 (0x7fffffff)
Feb 23 07:38:56 np0005626463.localdomain podman[31224]: 2026-02-23 07:38:56.983035215 +0000 UTC m=+0.205610888 container init 42208a29ce264b4593c6dd65cea427617bdba7d45838078937dd494e4be83ce2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-osd-2-activate-test, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, architecture=x86_64, vendor=Red Hat, Inc., release=1770267347, com.redhat.component=rhceph-container, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.buildah.version=1.42.2, name=rhceph, distribution-scope=public, build-date=2026-02-09T10:25:24Z, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, ceph=True, RELEASE=main, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, version=7, CEPH_POINT_RELEASE=)
Feb 23 07:38:56 np0005626463.localdomain podman[31224]: 2026-02-23 07:38:56.993984388 +0000 UTC m=+0.216560071 container start 42208a29ce264b4593c6dd65cea427617bdba7d45838078937dd494e4be83ce2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-osd-2-activate-test, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, org.opencontainers.image.created=2026-02-09T10:25:24Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.42.2, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, build-date=2026-02-09T10:25:24Z, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, description=Red Hat Ceph Storage 7, name=rhceph, release=1770267347, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, GIT_BRANCH=main, version=7, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, io.openshift.expose-services=)
Feb 23 07:38:56 np0005626463.localdomain podman[31224]: 2026-02-23 07:38:56.99419587 +0000 UTC m=+0.216771543 container attach 42208a29ce264b4593c6dd65cea427617bdba7d45838078937dd494e4be83ce2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-osd-2-activate-test, release=1770267347, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, io.openshift.expose-services=, CEPH_POINT_RELEASE=, distribution-scope=public, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, version=7, ceph=True, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_CLEAN=True, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.42.2, name=rhceph, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, architecture=x86_64, build-date=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 23 07:38:57 np0005626463.localdomain ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-osd-2-activate-test[31240]: usage: ceph-volume activate [-h] [--osd-id OSD_ID] [--osd-uuid OSD_UUID]
Feb 23 07:38:57 np0005626463.localdomain ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-osd-2-activate-test[31240]:                             [--no-systemd] [--no-tmpfs]
Feb 23 07:38:57 np0005626463.localdomain ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-osd-2-activate-test[31240]: ceph-volume activate: error: unrecognized arguments: --bad-option
Feb 23 07:38:57 np0005626463.localdomain systemd[1]: libpod-42208a29ce264b4593c6dd65cea427617bdba7d45838078937dd494e4be83ce2.scope: Deactivated successfully.
Feb 23 07:38:57 np0005626463.localdomain podman[31224]: 2026-02-23 07:38:57.205623211 +0000 UTC m=+0.428198944 container died 42208a29ce264b4593c6dd65cea427617bdba7d45838078937dd494e4be83ce2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-osd-2-activate-test, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2026-02-09T10:25:24Z, RELEASE=main, distribution-scope=public, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, release=1770267347, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.42.2, name=rhceph, version=7, com.redhat.component=rhceph-container, GIT_BRANCH=main, GIT_CLEAN=True, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, vcs-type=git, io.openshift.tags=rhceph ceph)
Feb 23 07:38:57 np0005626463.localdomain podman[31245]: 2026-02-23 07:38:57.279413052 +0000 UTC m=+0.064040250 container remove 42208a29ce264b4593c6dd65cea427617bdba7d45838078937dd494e4be83ce2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-osd-2-activate-test, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, vendor=Red Hat, Inc., version=7, com.redhat.component=rhceph-container, distribution-scope=public, name=rhceph, release=1770267347, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.created=2026-02-09T10:25:24Z, io.openshift.tags=rhceph ceph, architecture=x86_64, CEPH_POINT_RELEASE=, build-date=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.42.2, io.openshift.expose-services=, RELEASE=main, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, GIT_BRANCH=main)
Feb 23 07:38:57 np0005626463.localdomain systemd-journald[618]: Field hash table of /run/log/journal/c0212a8b024a111cfc61293864f36c87/system.journal has a fill level at 75.1 (250 of 333 items), suggesting rotation.
Feb 23 07:38:57 np0005626463.localdomain systemd-journald[618]: /run/log/journal/c0212a8b024a111cfc61293864f36c87/system.journal: Journal header limits reached or header out-of-date, rotating.
Feb 23 07:38:57 np0005626463.localdomain rsyslogd[758]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Feb 23 07:38:57 np0005626463.localdomain systemd[1]: libpod-conmon-42208a29ce264b4593c6dd65cea427617bdba7d45838078937dd494e4be83ce2.scope: Deactivated successfully.
Feb 23 07:38:57 np0005626463.localdomain rsyslogd[758]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Feb 23 07:38:57 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 07:38:57 np0005626463.localdomain systemd-sysv-generator[31307]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 07:38:57 np0005626463.localdomain systemd-rc-local-generator[31301]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 07:38:57 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 07:38:57 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-7f36cb8087ed81274f1b45435412b2e883469909d555438e5d1c1c13f4447f30-merged.mount: Deactivated successfully.
Feb 23 07:38:57 np0005626463.localdomain systemd[1]: tmp-crun.tseIDR.mount: Deactivated successfully.
Feb 23 07:38:57 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 07:38:57 np0005626463.localdomain systemd-rc-local-generator[31344]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 07:38:57 np0005626463.localdomain systemd-sysv-generator[31350]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 07:38:57 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 07:38:58 np0005626463.localdomain systemd[1]: Starting Ceph osd.2 for f1fea371-cb69-578d-a3d0-b5c472a84b46...
Feb 23 07:38:58 np0005626463.localdomain podman[31410]: 
Feb 23 07:38:58 np0005626463.localdomain podman[31410]: 2026-02-23 07:38:58.45535684 +0000 UTC m=+0.075271810 container create 27737286289b4f8ecf5f7e7d41ff1505e11b4ac91e778dac58f59d9cdf9ebe41 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-osd-2-activate, distribution-scope=public, release=1770267347, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, version=7, io.openshift.tags=rhceph ceph, io.buildah.version=1.42.2, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, build-date=2026-02-09T10:25:24Z, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, org.opencontainers.image.created=2026-02-09T10:25:24Z, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, architecture=x86_64, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main)
Feb 23 07:38:58 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 07:38:58 np0005626463.localdomain podman[31410]: 2026-02-23 07:38:58.423658797 +0000 UTC m=+0.043573767 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 23 07:38:58 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ce8a25d5b5f74137ce6b7cf3b305bb834daae40d6b3bf105988c1aa66548c3ed/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Feb 23 07:38:58 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ce8a25d5b5f74137ce6b7cf3b305bb834daae40d6b3bf105988c1aa66548c3ed/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Feb 23 07:38:58 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ce8a25d5b5f74137ce6b7cf3b305bb834daae40d6b3bf105988c1aa66548c3ed/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Feb 23 07:38:58 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ce8a25d5b5f74137ce6b7cf3b305bb834daae40d6b3bf105988c1aa66548c3ed/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Feb 23 07:38:58 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ce8a25d5b5f74137ce6b7cf3b305bb834daae40d6b3bf105988c1aa66548c3ed/merged/var/lib/ceph/osd/ceph-2 supports timestamps until 2038 (0x7fffffff)
Feb 23 07:38:58 np0005626463.localdomain podman[31410]: 2026-02-23 07:38:58.584178018 +0000 UTC m=+0.204092998 container init 27737286289b4f8ecf5f7e7d41ff1505e11b4ac91e778dac58f59d9cdf9ebe41 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-osd-2-activate, ceph=True, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, io.buildah.version=1.42.2, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2026-02-09T10:25:24Z, GIT_CLEAN=True, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main, distribution-scope=public, release=1770267347, org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.tags=rhceph ceph, name=rhceph, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, io.openshift.expose-services=, vcs-type=git, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Feb 23 07:38:58 np0005626463.localdomain podman[31410]: 2026-02-23 07:38:58.594352552 +0000 UTC m=+0.214267532 container start 27737286289b4f8ecf5f7e7d41ff1505e11b4ac91e778dac58f59d9cdf9ebe41 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-osd-2-activate, io.openshift.expose-services=, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, ceph=True, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.42.2, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_CLEAN=True, org.opencontainers.image.created=2026-02-09T10:25:24Z, version=7, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, vcs-type=git, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, distribution-scope=public, release=1770267347, name=rhceph, GIT_BRANCH=main, vendor=Red Hat, Inc., RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-02-09T10:25:24Z)
Feb 23 07:38:58 np0005626463.localdomain podman[31410]: 2026-02-23 07:38:58.594584564 +0000 UTC m=+0.214499574 container attach 27737286289b4f8ecf5f7e7d41ff1505e11b4ac91e778dac58f59d9cdf9ebe41 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-osd-2-activate, release=1770267347, url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2026-02-09T10:25:24Z, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.expose-services=, distribution-scope=public, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, vcs-type=git, io.buildah.version=1.42.2, org.opencontainers.image.created=2026-02-09T10:25:24Z, version=7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64)
Feb 23 07:38:59 np0005626463.localdomain ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-osd-2-activate[31424]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2
Feb 23 07:38:59 np0005626463.localdomain bash[31410]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2
Feb 23 07:38:59 np0005626463.localdomain ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-osd-2-activate[31424]: Running command: /usr/bin/ceph-bluestore-tool prime-osd-dir --path /var/lib/ceph/osd/ceph-2 --no-mon-config --dev /dev/mapper/ceph_vg0-ceph_lv0
Feb 23 07:38:59 np0005626463.localdomain bash[31410]: Running command: /usr/bin/ceph-bluestore-tool prime-osd-dir --path /var/lib/ceph/osd/ceph-2 --no-mon-config --dev /dev/mapper/ceph_vg0-ceph_lv0
Feb 23 07:38:59 np0005626463.localdomain ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-osd-2-activate[31424]: Running command: /usr/bin/chown -h ceph:ceph /dev/mapper/ceph_vg0-ceph_lv0
Feb 23 07:38:59 np0005626463.localdomain bash[31410]: Running command: /usr/bin/chown -h ceph:ceph /dev/mapper/ceph_vg0-ceph_lv0
Feb 23 07:38:59 np0005626463.localdomain ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-osd-2-activate[31424]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Feb 23 07:38:59 np0005626463.localdomain bash[31410]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Feb 23 07:38:59 np0005626463.localdomain ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-osd-2-activate[31424]: Running command: /usr/bin/ln -s /dev/mapper/ceph_vg0-ceph_lv0 /var/lib/ceph/osd/ceph-2/block
Feb 23 07:38:59 np0005626463.localdomain bash[31410]: Running command: /usr/bin/ln -s /dev/mapper/ceph_vg0-ceph_lv0 /var/lib/ceph/osd/ceph-2/block
Feb 23 07:38:59 np0005626463.localdomain ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-osd-2-activate[31424]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2
Feb 23 07:38:59 np0005626463.localdomain bash[31410]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2
Feb 23 07:38:59 np0005626463.localdomain ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-osd-2-activate[31424]: --> ceph-volume raw activate successful for osd ID: 2
Feb 23 07:38:59 np0005626463.localdomain bash[31410]: --> ceph-volume raw activate successful for osd ID: 2
Feb 23 07:38:59 np0005626463.localdomain systemd[1]: libpod-27737286289b4f8ecf5f7e7d41ff1505e11b4ac91e778dac58f59d9cdf9ebe41.scope: Deactivated successfully.
Feb 23 07:38:59 np0005626463.localdomain podman[31410]: 2026-02-23 07:38:59.347653189 +0000 UTC m=+0.967568149 container died 27737286289b4f8ecf5f7e7d41ff1505e11b4ac91e778dac58f59d9cdf9ebe41 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-osd-2-activate, com.redhat.component=rhceph-container, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., version=7, GIT_BRANCH=main, distribution-scope=public, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2026-02-09T10:25:24Z, ceph=True, RELEASE=main, io.openshift.tags=rhceph ceph, release=1770267347, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2026-02-09T10:25:24Z, io.buildah.version=1.42.2, vcs-type=git)
Feb 23 07:38:59 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-ce8a25d5b5f74137ce6b7cf3b305bb834daae40d6b3bf105988c1aa66548c3ed-merged.mount: Deactivated successfully.
Feb 23 07:38:59 np0005626463.localdomain podman[31554]: 2026-02-23 07:38:59.433315482 +0000 UTC m=+0.073312727 container remove 27737286289b4f8ecf5f7e7d41ff1505e11b4ac91e778dac58f59d9cdf9ebe41 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-osd-2-activate, CEPH_POINT_RELEASE=, GIT_CLEAN=True, distribution-scope=public, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1770267347, org.opencontainers.image.created=2026-02-09T10:25:24Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, build-date=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, io.openshift.expose-services=, version=7, architecture=x86_64, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, io.buildah.version=1.42.2, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-type=git, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Feb 23 07:38:59 np0005626463.localdomain podman[31615]: 
Feb 23 07:38:59 np0005626463.localdomain podman[31615]: 2026-02-23 07:38:59.755111333 +0000 UTC m=+0.073650254 container create 862eadaff641589ceb245e67477cf75d6f44dd8a2e370794aa63510852a63e9d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-osd-2, GIT_BRANCH=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7, build-date=2026-02-09T10:25:24Z, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, distribution-scope=public, io.openshift.expose-services=, ceph=True, CEPH_POINT_RELEASE=, release=1770267347, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main, io.buildah.version=1.42.2, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, architecture=x86_64)
Feb 23 07:38:59 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5137b21979481ad80623953c6db1be153fc037ec2d4ab0d00401398e0ddaecd5/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Feb 23 07:38:59 np0005626463.localdomain podman[31615]: 2026-02-23 07:38:59.727471873 +0000 UTC m=+0.046010824 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 23 07:38:59 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5137b21979481ad80623953c6db1be153fc037ec2d4ab0d00401398e0ddaecd5/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Feb 23 07:38:59 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5137b21979481ad80623953c6db1be153fc037ec2d4ab0d00401398e0ddaecd5/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Feb 23 07:38:59 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5137b21979481ad80623953c6db1be153fc037ec2d4ab0d00401398e0ddaecd5/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Feb 23 07:38:59 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5137b21979481ad80623953c6db1be153fc037ec2d4ab0d00401398e0ddaecd5/merged/var/lib/ceph/osd/ceph-2 supports timestamps until 2038 (0x7fffffff)
Feb 23 07:38:59 np0005626463.localdomain podman[31615]: 2026-02-23 07:38:59.878944309 +0000 UTC m=+0.197483240 container init 862eadaff641589ceb245e67477cf75d6f44dd8a2e370794aa63510852a63e9d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-osd-2, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., distribution-scope=public, version=7, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, architecture=x86_64, name=rhceph, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_CLEAN=True, GIT_BRANCH=main, io.openshift.expose-services=, io.buildah.version=1.42.2, release=1770267347, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2026-02-09T10:25:24Z)
Feb 23 07:38:59 np0005626463.localdomain podman[31615]: 2026-02-23 07:38:59.88754034 +0000 UTC m=+0.206079271 container start 862eadaff641589ceb245e67477cf75d6f44dd8a2e370794aa63510852a63e9d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-osd-2, vcs-type=git, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, io.buildah.version=1.42.2, ceph=True, vendor=Red Hat, Inc., io.openshift.expose-services=, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, org.opencontainers.image.created=2026-02-09T10:25:24Z, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-02-09T10:25:24Z, release=1770267347, com.redhat.component=rhceph-container, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_REPO=https://github.com/ceph/ceph-container.git)
Feb 23 07:38:59 np0005626463.localdomain bash[31615]: 862eadaff641589ceb245e67477cf75d6f44dd8a2e370794aa63510852a63e9d
Feb 23 07:38:59 np0005626463.localdomain systemd[1]: Started Ceph osd.2 for f1fea371-cb69-578d-a3d0-b5c472a84b46.
Feb 23 07:38:59 np0005626463.localdomain sudo[31119]: pam_unix(sudo:session): session closed for user root
Feb 23 07:38:59 np0005626463.localdomain ceph-osd[31633]: set uid:gid to 167:167 (ceph:ceph)
Feb 23 07:38:59 np0005626463.localdomain ceph-osd[31633]: ceph version 18.2.1-381.el9cp (984f410e2a30899deb131725765b62212b1621db) reef (stable), process ceph-osd, pid 2
Feb 23 07:38:59 np0005626463.localdomain ceph-osd[31633]: pidfile_write: ignore empty --pid-file
Feb 23 07:38:59 np0005626463.localdomain ceph-osd[31633]: bdev(0x557956c84e00 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Feb 23 07:38:59 np0005626463.localdomain ceph-osd[31633]: bdev(0x557956c84e00 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Feb 23 07:38:59 np0005626463.localdomain ceph-osd[31633]: bdev(0x557956c84e00 /var/lib/ceph/osd/ceph-2/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Feb 23 07:38:59 np0005626463.localdomain ceph-osd[31633]: bluestore(/var/lib/ceph/osd/ceph-2) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 data 0.06
Feb 23 07:38:59 np0005626463.localdomain ceph-osd[31633]: bdev(0x557956c85180 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Feb 23 07:38:59 np0005626463.localdomain ceph-osd[31633]: bdev(0x557956c85180 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Feb 23 07:38:59 np0005626463.localdomain ceph-osd[31633]: bdev(0x557956c85180 /var/lib/ceph/osd/ceph-2/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Feb 23 07:38:59 np0005626463.localdomain ceph-osd[31633]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-2/block size 7.0 GiB
Feb 23 07:38:59 np0005626463.localdomain ceph-osd[31633]: bdev(0x557956c85180 /var/lib/ceph/osd/ceph-2/block) close
Feb 23 07:39:00 np0005626463.localdomain sudo[31646]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 07:39:00 np0005626463.localdomain sudo[31646]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 07:39:00 np0005626463.localdomain sudo[31646]: pam_unix(sudo:session): session closed for user root
Feb 23 07:39:00 np0005626463.localdomain sudo[31661]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid f1fea371-cb69-578d-a3d0-b5c472a84b46
Feb 23 07:39:00 np0005626463.localdomain sudo[31661]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 07:39:00 np0005626463.localdomain ceph-osd[31633]: bdev(0x557956c84e00 /var/lib/ceph/osd/ceph-2/block) close
Feb 23 07:39:00 np0005626463.localdomain ceph-osd[31633]: starting osd.2 osd_data /var/lib/ceph/osd/ceph-2 /var/lib/ceph/osd/ceph-2/journal
Feb 23 07:39:00 np0005626463.localdomain ceph-osd[31633]: load: jerasure load: lrc 
Feb 23 07:39:00 np0005626463.localdomain ceph-osd[31633]: bdev(0x557956c85180 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Feb 23 07:39:00 np0005626463.localdomain ceph-osd[31633]: bdev(0x557956c85180 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Feb 23 07:39:00 np0005626463.localdomain ceph-osd[31633]: bdev(0x557956c85180 /var/lib/ceph/osd/ceph-2/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Feb 23 07:39:00 np0005626463.localdomain ceph-osd[31633]: bluestore(/var/lib/ceph/osd/ceph-2) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 data 0.06
Feb 23 07:39:00 np0005626463.localdomain ceph-osd[31633]: bdev(0x557956c85180 /var/lib/ceph/osd/ceph-2/block) close
Feb 23 07:39:00 np0005626463.localdomain ceph-osd[31633]: bdev(0x557956c85180 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Feb 23 07:39:00 np0005626463.localdomain ceph-osd[31633]: bdev(0x557956c85180 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Feb 23 07:39:00 np0005626463.localdomain ceph-osd[31633]: bdev(0x557956c85180 /var/lib/ceph/osd/ceph-2/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Feb 23 07:39:00 np0005626463.localdomain ceph-osd[31633]: bluestore(/var/lib/ceph/osd/ceph-2) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 data 0.06
Feb 23 07:39:00 np0005626463.localdomain ceph-osd[31633]: bdev(0x557956c85180 /var/lib/ceph/osd/ceph-2/block) close
Feb 23 07:39:00 np0005626463.localdomain podman[31726]: 
Feb 23 07:39:00 np0005626463.localdomain podman[31726]: 2026-02-23 07:39:00.803737172 +0000 UTC m=+0.067161584 container create 82d3efef4415469d7279043f8766d4c08df298d22f726e60fc08640edaa6c0be (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=magical_gould, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.42.2, org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, build-date=2026-02-09T10:25:24Z, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1770267347, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, vendor=Red Hat, Inc., GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, RELEASE=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, version=7, io.openshift.expose-services=, name=rhceph, vcs-type=git, com.redhat.component=rhceph-container)
Feb 23 07:39:00 np0005626463.localdomain systemd[1]: Started libpod-conmon-82d3efef4415469d7279043f8766d4c08df298d22f726e60fc08640edaa6c0be.scope.
Feb 23 07:39:00 np0005626463.localdomain podman[31726]: 2026-02-23 07:39:00.77452016 +0000 UTC m=+0.037944572 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 23 07:39:00 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 07:39:00 np0005626463.localdomain podman[31726]: 2026-02-23 07:39:00.892626495 +0000 UTC m=+0.156050907 container init 82d3efef4415469d7279043f8766d4c08df298d22f726e60fc08640edaa6c0be (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=magical_gould, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.42.2, org.opencontainers.image.created=2026-02-09T10:25:24Z, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, name=rhceph, vcs-type=git, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, version=7, distribution-scope=public, release=1770267347, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2026-02-09T10:25:24Z, description=Red Hat Ceph Storage 7, architecture=x86_64, RELEASE=main)
Feb 23 07:39:00 np0005626463.localdomain systemd[1]: tmp-crun.QghnOo.mount: Deactivated successfully.
Feb 23 07:39:00 np0005626463.localdomain podman[31726]: 2026-02-23 07:39:00.904690428 +0000 UTC m=+0.168114840 container start 82d3efef4415469d7279043f8766d4c08df298d22f726e60fc08640edaa6c0be (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=magical_gould, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, io.buildah.version=1.42.2, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., release=1770267347, org.opencontainers.image.created=2026-02-09T10:25:24Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, RELEASE=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, build-date=2026-02-09T10:25:24Z, vcs-type=git, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, description=Red Hat Ceph Storage 7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, ceph=True, com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Feb 23 07:39:00 np0005626463.localdomain podman[31726]: 2026-02-23 07:39:00.905169693 +0000 UTC m=+0.168594105 container attach 82d3efef4415469d7279043f8766d4c08df298d22f726e60fc08640edaa6c0be (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=magical_gould, io.buildah.version=1.42.2, architecture=x86_64, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, release=1770267347, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.created=2026-02-09T10:25:24Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, build-date=2026-02-09T10:25:24Z, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, name=rhceph, GIT_BRANCH=main, version=7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, RELEASE=main)
Feb 23 07:39:00 np0005626463.localdomain magical_gould[31745]: 167 167
Feb 23 07:39:00 np0005626463.localdomain systemd[1]: libpod-82d3efef4415469d7279043f8766d4c08df298d22f726e60fc08640edaa6c0be.scope: Deactivated successfully.
Feb 23 07:39:00 np0005626463.localdomain podman[31726]: 2026-02-23 07:39:00.909023565 +0000 UTC m=+0.172448047 container died 82d3efef4415469d7279043f8766d4c08df298d22f726e60fc08640edaa6c0be (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=magical_gould, distribution-scope=public, GIT_CLEAN=True, io.buildah.version=1.42.2, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, ceph=True, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, vcs-type=git, release=1770267347, build-date=2026-02-09T10:25:24Z, architecture=x86_64, CEPH_POINT_RELEASE=, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, version=7, org.opencontainers.image.created=2026-02-09T10:25:24Z, name=rhceph, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Feb 23 07:39:01 np0005626463.localdomain podman[31750]: 2026-02-23 07:39:01.005406011 +0000 UTC m=+0.081117406 container remove 82d3efef4415469d7279043f8766d4c08df298d22f726e60fc08640edaa6c0be (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=magical_gould, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, io.buildah.version=1.42.2, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2026-02-09T10:25:24Z, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, architecture=x86_64, com.redhat.component=rhceph-container, RELEASE=main, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, distribution-scope=public, release=1770267347, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, vcs-type=git, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.expose-services=, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_REPO=https://github.com/ceph/ceph-container.git)
Feb 23 07:39:01 np0005626463.localdomain systemd[1]: libpod-conmon-82d3efef4415469d7279043f8766d4c08df298d22f726e60fc08640edaa6c0be.scope: Deactivated successfully.
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: mClockScheduler: set_osd_capacity_params_from_config: osd_bandwidth_cost_per_io: 499321.90 bytes/io, osd_bandwidth_capacity_per_shard 157286400.00 bytes/second
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: osd.2:0.OSDShard using op scheduler mclock_scheduler, cutoff=196
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: bdev(0x557956c85180 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: bdev(0x557956c85180 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: bdev(0x557956c85180 /var/lib/ceph/osd/ceph-2/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: bluestore(/var/lib/ceph/osd/ceph-2) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 data 0.06
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: bdev(0x557956c85500 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: bdev(0x557956c85500 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: bdev(0x557956c85500 /var/lib/ceph/osd/ceph-2/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-2/block size 7.0 GiB
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: bluefs mount
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: bluefs _init_alloc shared, id 1, capacity 0x1bfc00000, block size 0x10000
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: bluefs mount shared_bdev_used = 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: bluestore(/var/lib/ceph/osd/ceph-2) _prepare_db_environment set db_paths to db,7136398540 db.slow,7136398540
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: RocksDB version: 7.9.2
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Git sha 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Compile date 2026-02-06 00:00:00
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: DB SUMMARY
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: DB Session ID:  T252W6HLUGEFWZ0H917R
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: CURRENT file:  CURRENT
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: IDENTITY file:  IDENTITY
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: MANIFEST file:  MANIFEST-000032 size: 1007 Bytes
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: SST files in db dir, Total Num: 1, files: 000030.sst 
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: SST files in db.slow dir, Total Num: 0, files: 
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Write Ahead Log file in db.wal: 000031.log size: 5093 ; 
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                         Options.error_if_exists: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                       Options.create_if_missing: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                         Options.paranoid_checks: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:             Options.flush_verify_memtable_count: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                                     Options.env: 0x557957a9fe30
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                                      Options.fs: LegacyFileSystem
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                                Options.info_log: 0x557956cc6740
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                Options.max_file_opening_threads: 16
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                              Options.statistics: (nil)
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                               Options.use_fsync: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                       Options.max_log_file_size: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                   Options.log_file_time_to_roll: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                       Options.keep_log_file_num: 1000
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                    Options.recycle_log_file_num: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                         Options.allow_fallocate: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                        Options.allow_mmap_reads: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                       Options.allow_mmap_writes: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                        Options.use_direct_reads: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:          Options.create_missing_column_families: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                              Options.db_log_dir: 
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                                 Options.wal_dir: db.wal
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                Options.table_cache_numshardbits: 6
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                         Options.WAL_ttl_seconds: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                       Options.WAL_size_limit_MB: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:             Options.manifest_preallocation_size: 4194304
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                     Options.is_fd_close_on_exec: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                   Options.advise_random_on_open: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                    Options.db_write_buffer_size: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                    Options.write_buffer_manager: 0x557956c6f4a0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.access_hint_on_compaction_start: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                      Options.use_adaptive_mutex: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                            Options.rate_limiter: (nil)
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                       Options.wal_recovery_mode: 2
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                  Options.enable_thread_tracking: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                  Options.enable_pipelined_write: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                  Options.unordered_write: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:             Options.write_thread_max_yield_usec: 100
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                               Options.row_cache: None
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                              Options.wal_filter: None
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:             Options.avoid_flush_during_recovery: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:             Options.allow_ingest_behind: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:             Options.two_write_queues: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:             Options.manual_wal_flush: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:             Options.wal_compression: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:             Options.atomic_flush: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                 Options.persist_stats_to_disk: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                 Options.write_dbid_to_manifest: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                 Options.log_readahead_size: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                 Options.best_efforts_recovery: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:             Options.allow_data_in_errors: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:             Options.db_host_id: __hostname__
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:             Options.enforce_single_del_contracts: true
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:             Options.max_background_jobs: 4
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:             Options.max_background_compactions: -1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:             Options.max_subcompactions: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:           Options.writable_file_max_buffer_size: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:             Options.delayed_write_rate : 16777216
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:             Options.max_total_wal_size: 1073741824
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                   Options.stats_dump_period_sec: 600
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                 Options.stats_persist_period_sec: 600
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                          Options.max_open_files: -1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                          Options.bytes_per_sync: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                      Options.wal_bytes_per_sync: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                   Options.strict_bytes_per_sync: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:       Options.compaction_readahead_size: 2097152
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                  Options.max_background_flushes: -1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Compression algorithms supported:
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         kZSTD supported: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         kXpressCompression supported: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         kBZip2Compression supported: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         kZSTDNotFinalCompression supported: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         kLZ4Compression supported: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         kZlibCompression supported: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         kLZ4HCCompression supported: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         kSnappyCompression supported: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Fast CRC32 supported: Supported on x86
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: DMutex implementation: pthread_mutex_t
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: [db/db_impl/db_impl_readonly.cc:25] Opening the db in read only mode
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: db/MANIFEST-000032
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 0, name: default)
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:           Options.merge_operator: .T:int64_array.b:bitwise_xor
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:        Options.compaction_filter: None
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:        Options.compaction_filter_factory: None
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:  Options.sst_partitioner_factory: None
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:            Options.table_factory: BlockBasedTable
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x557956cc6900)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x557956c5c850
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:        Options.write_buffer_size: 16777216
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:  Options.max_write_buffer_number: 64
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:          Options.compression: LZ4
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                  Options.bottommost_compression: Disabled
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:       Options.prefix_extractor: nullptr
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:             Options.num_levels: 7
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:            Options.compression_opts.window_bits: -14
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                  Options.compression_opts.level: 32767
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:               Options.compression_opts.strategy: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                  Options.compression_opts.enabled: false
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                   Options.target_file_size_base: 67108864
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:             Options.target_file_size_multiplier: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                        Options.arena_block_size: 1048576
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                Options.disable_auto_compactions: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                   Options.table_properties_collectors: 
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                   Options.inplace_update_support: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:   Options.memtable_huge_page_size: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                           Options.bloom_locality: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                    Options.max_successive_merges: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                Options.paranoid_file_checks: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                Options.force_consistency_checks: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                Options.report_bg_io_stats: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                               Options.ttl: 2592000
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                       Options.enable_blob_files: false
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                           Options.min_blob_size: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                          Options.blob_file_size: 268435456
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                Options.blob_file_starting_level: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 1, name: m-0)
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-0]:
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:           Options.merge_operator: None
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:        Options.compaction_filter: None
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:        Options.compaction_filter_factory: None
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:  Options.sst_partitioner_factory: None
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:            Options.table_factory: BlockBasedTable
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x557956cc6900)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x557956c5c850
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:        Options.write_buffer_size: 16777216
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:  Options.max_write_buffer_number: 64
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:          Options.compression: LZ4
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                  Options.bottommost_compression: Disabled
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:       Options.prefix_extractor: nullptr
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:             Options.num_levels: 7
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:            Options.compression_opts.window_bits: -14
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                  Options.compression_opts.level: 32767
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:               Options.compression_opts.strategy: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                  Options.compression_opts.enabled: false
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                   Options.target_file_size_base: 67108864
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:             Options.target_file_size_multiplier: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                        Options.arena_block_size: 1048576
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                Options.disable_auto_compactions: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                   Options.inplace_update_support: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:   Options.memtable_huge_page_size: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                           Options.bloom_locality: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                    Options.max_successive_merges: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                Options.paranoid_file_checks: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                Options.force_consistency_checks: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                Options.report_bg_io_stats: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                               Options.ttl: 2592000
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                       Options.enable_blob_files: false
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                           Options.min_blob_size: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                          Options.blob_file_size: 268435456
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                Options.blob_file_starting_level: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 2, name: m-1)
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-1]:
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:           Options.merge_operator: None
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:        Options.compaction_filter: None
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:        Options.compaction_filter_factory: None
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:  Options.sst_partitioner_factory: None
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:            Options.table_factory: BlockBasedTable
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x557956cc6900)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x557956c5c850
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:        Options.write_buffer_size: 16777216
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:  Options.max_write_buffer_number: 64
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:          Options.compression: LZ4
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                  Options.bottommost_compression: Disabled
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:       Options.prefix_extractor: nullptr
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:             Options.num_levels: 7
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:            Options.compression_opts.window_bits: -14
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                  Options.compression_opts.level: 32767
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:               Options.compression_opts.strategy: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                  Options.compression_opts.enabled: false
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                   Options.target_file_size_base: 67108864
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:             Options.target_file_size_multiplier: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                        Options.arena_block_size: 1048576
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                Options.disable_auto_compactions: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                   Options.inplace_update_support: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:   Options.memtable_huge_page_size: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                           Options.bloom_locality: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                    Options.max_successive_merges: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                Options.paranoid_file_checks: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                Options.force_consistency_checks: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                Options.report_bg_io_stats: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                               Options.ttl: 2592000
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                       Options.enable_blob_files: false
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                           Options.min_blob_size: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                          Options.blob_file_size: 268435456
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                Options.blob_file_starting_level: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 3, name: m-2)
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-2]:
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:           Options.merge_operator: None
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:        Options.compaction_filter: None
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:        Options.compaction_filter_factory: None
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:  Options.sst_partitioner_factory: None
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:            Options.table_factory: BlockBasedTable
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x557956cc6900)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x557956c5c850
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:        Options.write_buffer_size: 16777216
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:  Options.max_write_buffer_number: 64
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:          Options.compression: LZ4
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                  Options.bottommost_compression: Disabled
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:       Options.prefix_extractor: nullptr
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:             Options.num_levels: 7
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:            Options.compression_opts.window_bits: -14
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                  Options.compression_opts.level: 32767
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:               Options.compression_opts.strategy: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                  Options.compression_opts.enabled: false
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                   Options.target_file_size_base: 67108864
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:             Options.target_file_size_multiplier: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                        Options.arena_block_size: 1048576
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                Options.disable_auto_compactions: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                   Options.inplace_update_support: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:   Options.memtable_huge_page_size: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                           Options.bloom_locality: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                    Options.max_successive_merges: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                Options.paranoid_file_checks: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                Options.force_consistency_checks: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                Options.report_bg_io_stats: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                               Options.ttl: 2592000
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                       Options.enable_blob_files: false
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                           Options.min_blob_size: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                          Options.blob_file_size: 268435456
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                Options.blob_file_starting_level: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 4, name: p-0)
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-0]:
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:           Options.merge_operator: None
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:        Options.compaction_filter: None
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:        Options.compaction_filter_factory: None
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:  Options.sst_partitioner_factory: None
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:            Options.table_factory: BlockBasedTable
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x557956cc6900)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x557956c5c850
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:        Options.write_buffer_size: 16777216
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:  Options.max_write_buffer_number: 64
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:          Options.compression: LZ4
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                  Options.bottommost_compression: Disabled
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:       Options.prefix_extractor: nullptr
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:             Options.num_levels: 7
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:            Options.compression_opts.window_bits: -14
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                  Options.compression_opts.level: 32767
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:               Options.compression_opts.strategy: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                  Options.compression_opts.enabled: false
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                   Options.target_file_size_base: 67108864
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:             Options.target_file_size_multiplier: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                        Options.arena_block_size: 1048576
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                Options.disable_auto_compactions: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                   Options.inplace_update_support: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:   Options.memtable_huge_page_size: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                           Options.bloom_locality: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                    Options.max_successive_merges: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                Options.paranoid_file_checks: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                Options.force_consistency_checks: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                Options.report_bg_io_stats: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                               Options.ttl: 2592000
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                       Options.enable_blob_files: false
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                           Options.min_blob_size: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                          Options.blob_file_size: 268435456
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                Options.blob_file_starting_level: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 5, name: p-1)
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-1]:
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:           Options.merge_operator: None
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:        Options.compaction_filter: None
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:        Options.compaction_filter_factory: None
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:  Options.sst_partitioner_factory: None
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:            Options.table_factory: BlockBasedTable
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x557956cc6900)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x557956c5c850
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:        Options.write_buffer_size: 16777216
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:  Options.max_write_buffer_number: 64
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:          Options.compression: LZ4
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                  Options.bottommost_compression: Disabled
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:       Options.prefix_extractor: nullptr
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:             Options.num_levels: 7
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:            Options.compression_opts.window_bits: -14
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                  Options.compression_opts.level: 32767
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:               Options.compression_opts.strategy: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                  Options.compression_opts.enabled: false
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                   Options.target_file_size_base: 67108864
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:             Options.target_file_size_multiplier: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                        Options.arena_block_size: 1048576
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                Options.disable_auto_compactions: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                   Options.inplace_update_support: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:   Options.memtable_huge_page_size: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                           Options.bloom_locality: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                    Options.max_successive_merges: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                Options.paranoid_file_checks: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                Options.force_consistency_checks: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                Options.report_bg_io_stats: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                               Options.ttl: 2592000
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                       Options.enable_blob_files: false
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                           Options.min_blob_size: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                          Options.blob_file_size: 268435456
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                Options.blob_file_starting_level: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 6, name: p-2)
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-2]:
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:           Options.merge_operator: None
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:        Options.compaction_filter: None
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:        Options.compaction_filter_factory: None
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:  Options.sst_partitioner_factory: None
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:            Options.table_factory: BlockBasedTable
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x557956cc6900)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x557956c5c850
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:        Options.write_buffer_size: 16777216
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:  Options.max_write_buffer_number: 64
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:          Options.compression: LZ4
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                  Options.bottommost_compression: Disabled
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:       Options.prefix_extractor: nullptr
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:             Options.num_levels: 7
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:            Options.compression_opts.window_bits: -14
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                  Options.compression_opts.level: 32767
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:               Options.compression_opts.strategy: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                  Options.compression_opts.enabled: false
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                   Options.target_file_size_base: 67108864
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:             Options.target_file_size_multiplier: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                        Options.arena_block_size: 1048576
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                Options.disable_auto_compactions: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                   Options.inplace_update_support: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:   Options.memtable_huge_page_size: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                           Options.bloom_locality: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                    Options.max_successive_merges: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                Options.paranoid_file_checks: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                Options.force_consistency_checks: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                Options.report_bg_io_stats: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                               Options.ttl: 2592000
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                       Options.enable_blob_files: false
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                           Options.min_blob_size: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                          Options.blob_file_size: 268435456
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                Options.blob_file_starting_level: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 7, name: O-0)
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-0]:
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:           Options.merge_operator: None
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:        Options.compaction_filter: None
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:        Options.compaction_filter_factory: None
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:  Options.sst_partitioner_factory: None
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:            Options.table_factory: BlockBasedTable
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x557956cc6b20)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x557956c5c2d0
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 536870912
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:        Options.write_buffer_size: 16777216
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:  Options.max_write_buffer_number: 64
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:          Options.compression: LZ4
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                  Options.bottommost_compression: Disabled
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:       Options.prefix_extractor: nullptr
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:             Options.num_levels: 7
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:            Options.compression_opts.window_bits: -14
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                  Options.compression_opts.level: 32767
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:               Options.compression_opts.strategy: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                  Options.compression_opts.enabled: false
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                   Options.target_file_size_base: 67108864
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:             Options.target_file_size_multiplier: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                        Options.arena_block_size: 1048576
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                Options.disable_auto_compactions: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                   Options.inplace_update_support: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:   Options.memtable_huge_page_size: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                           Options.bloom_locality: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                    Options.max_successive_merges: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                Options.paranoid_file_checks: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                Options.force_consistency_checks: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                Options.report_bg_io_stats: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                               Options.ttl: 2592000
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                       Options.enable_blob_files: false
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                           Options.min_blob_size: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                          Options.blob_file_size: 268435456
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                Options.blob_file_starting_level: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 8, name: O-1)
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-1]:
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:           Options.merge_operator: None
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:        Options.compaction_filter: None
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:        Options.compaction_filter_factory: None
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:  Options.sst_partitioner_factory: None
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:            Options.table_factory: BlockBasedTable
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x557956cc6b20)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x557956c5c2d0
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 536870912
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:        Options.write_buffer_size: 16777216
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:  Options.max_write_buffer_number: 64
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:          Options.compression: LZ4
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                  Options.bottommost_compression: Disabled
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:       Options.prefix_extractor: nullptr
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:             Options.num_levels: 7
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:            Options.compression_opts.window_bits: -14
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                  Options.compression_opts.level: 32767
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:               Options.compression_opts.strategy: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                  Options.compression_opts.enabled: false
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                   Options.target_file_size_base: 67108864
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:             Options.target_file_size_multiplier: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                        Options.arena_block_size: 1048576
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                Options.disable_auto_compactions: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                   Options.inplace_update_support: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:   Options.memtable_huge_page_size: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                           Options.bloom_locality: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                    Options.max_successive_merges: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                Options.paranoid_file_checks: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                Options.force_consistency_checks: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                Options.report_bg_io_stats: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                               Options.ttl: 2592000
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                       Options.enable_blob_files: false
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                           Options.min_blob_size: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                          Options.blob_file_size: 268435456
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                Options.blob_file_starting_level: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 9, name: O-2)
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-2]:
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:           Options.merge_operator: None
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:        Options.compaction_filter: None
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:        Options.compaction_filter_factory: None
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:  Options.sst_partitioner_factory: None
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:            Options.table_factory: BlockBasedTable
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x557956cc6b20)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x557956c5c2d0
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 536870912
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:        Options.write_buffer_size: 16777216
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:  Options.max_write_buffer_number: 64
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:          Options.compression: LZ4
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                  Options.bottommost_compression: Disabled
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:       Options.prefix_extractor: nullptr
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:             Options.num_levels: 7
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:            Options.compression_opts.window_bits: -14
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                  Options.compression_opts.level: 32767
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:               Options.compression_opts.strategy: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                  Options.compression_opts.enabled: false
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                   Options.target_file_size_base: 67108864
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:             Options.target_file_size_multiplier: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                        Options.arena_block_size: 1048576
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                Options.disable_auto_compactions: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                   Options.inplace_update_support: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:   Options.memtable_huge_page_size: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                           Options.bloom_locality: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                    Options.max_successive_merges: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                Options.paranoid_file_checks: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                Options.force_consistency_checks: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                Options.report_bg_io_stats: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                               Options.ttl: 2592000
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                       Options.enable_blob_files: false
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                           Options.min_blob_size: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                          Options.blob_file_size: 268435456
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                Options.blob_file_starting_level: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 10, name: L)
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: [db/column_family.cc:635]         (skipping printing options)
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 11, name: P)
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: [db/column_family.cc:635]         (skipping printing options)
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:db/MANIFEST-000032 succeeded,manifest_file_number is 32, next_file_number is 34, last_sequence is 12, log_number is 5,prev_log_number is 0,max_column_family is 11,min_log_number_to_keep is 5
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 5
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: [db/version_set.cc:5581] Column family [m-0] (ID 1), log number is 5
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: [db/version_set.cc:5581] Column family [m-1] (ID 2), log number is 5
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: [db/version_set.cc:5581] Column family [m-2] (ID 3), log number is 5
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: [db/version_set.cc:5581] Column family [p-0] (ID 4), log number is 5
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: [db/version_set.cc:5581] Column family [p-1] (ID 5), log number is 5
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: [db/version_set.cc:5581] Column family [p-2] (ID 6), log number is 5
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: [db/version_set.cc:5581] Column family [O-0] (ID 7), log number is 5
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: [db/version_set.cc:5581] Column family [O-1] (ID 8), log number is 5
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: [db/version_set.cc:5581] Column family [O-2] (ID 9), log number is 5
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: [db/version_set.cc:5581] Column family [L] (ID 10), log number is 5
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: [db/version_set.cc:5581] Column family [P] (ID 11), log number is 5
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: 30259abc-06cf-4764-a5f7-1c462702ec25
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771832341059543, "job": 1, "event": "recovery_started", "wal_files": [31]}
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #31 mode 2
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771832341059832, "job": 1, "event": "recovery_finished"}
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: bluestore(/var/lib/ceph/osd/ceph-2) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: bluestore(/var/lib/ceph/osd/ceph-2) _open_super_meta old nid_max 1025
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: bluestore(/var/lib/ceph/osd/ceph-2) _open_super_meta old blobid_max 10240
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: bluestore(/var/lib/ceph/osd/ceph-2) _open_super_meta ondisk_format 4 compat_ondisk_format 3
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: bluestore(/var/lib/ceph/osd/ceph-2) _open_super_meta min_alloc_size 0x1000
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: freelist init
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: freelist _read_cfg
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: bluestore(/var/lib/ceph/osd/ceph-2) _init_alloc loaded 7.0 GiB in 2 extents, allocator type hybrid, capacity 0x1bfc00000, block size 0x1000, free 0x1bfbfd000, fragmentation 5.5e-07
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: [db/db_impl/db_impl.cc:496] Shutdown: canceling all background work
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: [db/db_impl/db_impl.cc:704] Shutdown complete
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: bluefs umount
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: bdev(0x557956c85500 /var/lib/ceph/osd/ceph-2/block) close
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: bdev(0x557956c85500 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: bdev(0x557956c85500 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: bdev(0x557956c85500 /var/lib/ceph/osd/ceph-2/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-2/block size 7.0 GiB
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: bluefs mount
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: bluefs _init_alloc shared, id 1, capacity 0x1bfc00000, block size 0x10000
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: bluefs mount shared_bdev_used = 4718592
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: bluestore(/var/lib/ceph/osd/ceph-2) _prepare_db_environment set db_paths to db,7136398540 db.slow,7136398540
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: RocksDB version: 7.9.2
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Git sha 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Compile date 2026-02-06 00:00:00
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: DB SUMMARY
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: DB Session ID:  T252W6HLUGEFWZ0H917Q
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: CURRENT file:  CURRENT
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: IDENTITY file:  IDENTITY
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: MANIFEST file:  MANIFEST-000032 size: 1007 Bytes
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: SST files in db dir, Total Num: 1, files: 000030.sst 
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: SST files in db.slow dir, Total Num: 0, files: 
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Write Ahead Log file in db.wal: 000031.log size: 5093 ; 
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                         Options.error_if_exists: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                       Options.create_if_missing: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                         Options.paranoid_checks: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:             Options.flush_verify_memtable_count: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                                     Options.env: 0x557957ac0380
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                                      Options.fs: LegacyFileSystem
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                                Options.info_log: 0x557956cc68c0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                Options.max_file_opening_threads: 16
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                              Options.statistics: (nil)
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                               Options.use_fsync: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                       Options.max_log_file_size: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                   Options.log_file_time_to_roll: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                       Options.keep_log_file_num: 1000
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                    Options.recycle_log_file_num: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                         Options.allow_fallocate: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                        Options.allow_mmap_reads: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                       Options.allow_mmap_writes: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                        Options.use_direct_reads: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:          Options.create_missing_column_families: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                              Options.db_log_dir: 
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                                 Options.wal_dir: db.wal
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                Options.table_cache_numshardbits: 6
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                         Options.WAL_ttl_seconds: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                       Options.WAL_size_limit_MB: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:             Options.manifest_preallocation_size: 4194304
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                     Options.is_fd_close_on_exec: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                   Options.advise_random_on_open: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                    Options.db_write_buffer_size: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                    Options.write_buffer_manager: 0x557956c6f4a0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.access_hint_on_compaction_start: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                      Options.use_adaptive_mutex: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                            Options.rate_limiter: (nil)
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                       Options.wal_recovery_mode: 2
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                  Options.enable_thread_tracking: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                  Options.enable_pipelined_write: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                  Options.unordered_write: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:             Options.write_thread_max_yield_usec: 100
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                               Options.row_cache: None
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                              Options.wal_filter: None
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:             Options.avoid_flush_during_recovery: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:             Options.allow_ingest_behind: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:             Options.two_write_queues: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:             Options.manual_wal_flush: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:             Options.wal_compression: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:             Options.atomic_flush: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                 Options.persist_stats_to_disk: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                 Options.write_dbid_to_manifest: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                 Options.log_readahead_size: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                 Options.best_efforts_recovery: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:             Options.allow_data_in_errors: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:             Options.db_host_id: __hostname__
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:             Options.enforce_single_del_contracts: true
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:             Options.max_background_jobs: 4
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:             Options.max_background_compactions: -1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:             Options.max_subcompactions: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:           Options.writable_file_max_buffer_size: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:             Options.delayed_write_rate : 16777216
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:             Options.max_total_wal_size: 1073741824
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                   Options.stats_dump_period_sec: 600
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                 Options.stats_persist_period_sec: 600
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                          Options.max_open_files: -1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                          Options.bytes_per_sync: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                      Options.wal_bytes_per_sync: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                   Options.strict_bytes_per_sync: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:       Options.compaction_readahead_size: 2097152
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                  Options.max_background_flushes: -1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Compression algorithms supported:
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         kZSTD supported: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         kXpressCompression supported: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         kBZip2Compression supported: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         kZSTDNotFinalCompression supported: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         kLZ4Compression supported: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         kZlibCompression supported: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         kLZ4HCCompression supported: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         kSnappyCompression supported: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Fast CRC32 supported: Supported on x86
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: DMutex implementation: pthread_mutex_t
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: db/MANIFEST-000032
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 0, name: default)
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:           Options.merge_operator: .T:int64_array.b:bitwise_xor
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:        Options.compaction_filter: None
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:        Options.compaction_filter_factory: None
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:  Options.sst_partitioner_factory: None
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:            Options.table_factory: BlockBasedTable
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x557956cc6b20)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x557956c5c850
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:        Options.write_buffer_size: 16777216
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:  Options.max_write_buffer_number: 64
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:          Options.compression: LZ4
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                  Options.bottommost_compression: Disabled
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:       Options.prefix_extractor: nullptr
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:             Options.num_levels: 7
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:            Options.compression_opts.window_bits: -14
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                  Options.compression_opts.level: 32767
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:               Options.compression_opts.strategy: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                  Options.compression_opts.enabled: false
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                   Options.target_file_size_base: 67108864
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:             Options.target_file_size_multiplier: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                        Options.arena_block_size: 1048576
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                Options.disable_auto_compactions: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                   Options.table_properties_collectors: 
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                   Options.inplace_update_support: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:   Options.memtable_huge_page_size: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                           Options.bloom_locality: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                    Options.max_successive_merges: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                Options.paranoid_file_checks: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                Options.force_consistency_checks: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                Options.report_bg_io_stats: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                               Options.ttl: 2592000
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                       Options.enable_blob_files: false
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                           Options.min_blob_size: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                          Options.blob_file_size: 268435456
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                Options.blob_file_starting_level: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 1, name: m-0)
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-0]:
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:           Options.merge_operator: None
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:        Options.compaction_filter: None
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:        Options.compaction_filter_factory: None
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:  Options.sst_partitioner_factory: None
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:            Options.table_factory: BlockBasedTable
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x557956cc6b20)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x557956c5c850
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:        Options.write_buffer_size: 16777216
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:  Options.max_write_buffer_number: 64
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:          Options.compression: LZ4
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                  Options.bottommost_compression: Disabled
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:       Options.prefix_extractor: nullptr
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:             Options.num_levels: 7
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:            Options.compression_opts.window_bits: -14
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                  Options.compression_opts.level: 32767
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:               Options.compression_opts.strategy: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                  Options.compression_opts.enabled: false
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                   Options.target_file_size_base: 67108864
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:             Options.target_file_size_multiplier: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                        Options.arena_block_size: 1048576
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                Options.disable_auto_compactions: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                   Options.inplace_update_support: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:   Options.memtable_huge_page_size: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                           Options.bloom_locality: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                    Options.max_successive_merges: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                Options.paranoid_file_checks: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                Options.force_consistency_checks: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                Options.report_bg_io_stats: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                               Options.ttl: 2592000
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                       Options.enable_blob_files: false
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                           Options.min_blob_size: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                          Options.blob_file_size: 268435456
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                Options.blob_file_starting_level: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 2, name: m-1)
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-1]:
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:           Options.merge_operator: None
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:        Options.compaction_filter: None
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:        Options.compaction_filter_factory: None
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:  Options.sst_partitioner_factory: None
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:            Options.table_factory: BlockBasedTable
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x557956cc6b20)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x557956c5c850
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:        Options.write_buffer_size: 16777216
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:  Options.max_write_buffer_number: 64
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:          Options.compression: LZ4
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                  Options.bottommost_compression: Disabled
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:       Options.prefix_extractor: nullptr
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:             Options.num_levels: 7
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:            Options.compression_opts.window_bits: -14
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                  Options.compression_opts.level: 32767
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:               Options.compression_opts.strategy: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                  Options.compression_opts.enabled: false
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                   Options.target_file_size_base: 67108864
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:             Options.target_file_size_multiplier: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                        Options.arena_block_size: 1048576
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                Options.disable_auto_compactions: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                   Options.inplace_update_support: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:   Options.memtable_huge_page_size: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                           Options.bloom_locality: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                    Options.max_successive_merges: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                Options.paranoid_file_checks: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                Options.force_consistency_checks: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                Options.report_bg_io_stats: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                               Options.ttl: 2592000
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                       Options.enable_blob_files: false
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                           Options.min_blob_size: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                          Options.blob_file_size: 268435456
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                Options.blob_file_starting_level: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 3, name: m-2)
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-2]:
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:           Options.merge_operator: None
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:        Options.compaction_filter: None
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:        Options.compaction_filter_factory: None
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:  Options.sst_partitioner_factory: None
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:            Options.table_factory: BlockBasedTable
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x557956cc6b20)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x557956c5c850
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:        Options.write_buffer_size: 16777216
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:  Options.max_write_buffer_number: 64
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:          Options.compression: LZ4
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                  Options.bottommost_compression: Disabled
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:       Options.prefix_extractor: nullptr
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:             Options.num_levels: 7
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:            Options.compression_opts.window_bits: -14
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                  Options.compression_opts.level: 32767
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:               Options.compression_opts.strategy: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                  Options.compression_opts.enabled: false
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                   Options.target_file_size_base: 67108864
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:             Options.target_file_size_multiplier: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                        Options.arena_block_size: 1048576
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                Options.disable_auto_compactions: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                   Options.inplace_update_support: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:   Options.memtable_huge_page_size: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                           Options.bloom_locality: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                    Options.max_successive_merges: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                Options.paranoid_file_checks: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                Options.force_consistency_checks: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                Options.report_bg_io_stats: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                               Options.ttl: 2592000
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                       Options.enable_blob_files: false
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                           Options.min_blob_size: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                          Options.blob_file_size: 268435456
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                Options.blob_file_starting_level: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 4, name: p-0)
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-0]:
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:           Options.merge_operator: None
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:        Options.compaction_filter: None
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:        Options.compaction_filter_factory: None
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:  Options.sst_partitioner_factory: None
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:            Options.table_factory: BlockBasedTable
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x557956cc6b20)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x557956c5c850
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:        Options.write_buffer_size: 16777216
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:  Options.max_write_buffer_number: 64
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:          Options.compression: LZ4
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                  Options.bottommost_compression: Disabled
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:       Options.prefix_extractor: nullptr
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:             Options.num_levels: 7
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:            Options.compression_opts.window_bits: -14
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                  Options.compression_opts.level: 32767
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:               Options.compression_opts.strategy: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                  Options.compression_opts.enabled: false
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                   Options.target_file_size_base: 67108864
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:             Options.target_file_size_multiplier: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                        Options.arena_block_size: 1048576
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                Options.disable_auto_compactions: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                   Options.inplace_update_support: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:   Options.memtable_huge_page_size: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                           Options.bloom_locality: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                    Options.max_successive_merges: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                Options.paranoid_file_checks: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                Options.force_consistency_checks: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                Options.report_bg_io_stats: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                               Options.ttl: 2592000
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                       Options.enable_blob_files: false
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                           Options.min_blob_size: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                          Options.blob_file_size: 268435456
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                Options.blob_file_starting_level: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 5, name: p-1)
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-1]:
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:           Options.merge_operator: None
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:        Options.compaction_filter: None
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:        Options.compaction_filter_factory: None
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:  Options.sst_partitioner_factory: None
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:            Options.table_factory: BlockBasedTable
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x557956cc6b20)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x557956c5c850
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:        Options.write_buffer_size: 16777216
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:  Options.max_write_buffer_number: 64
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:          Options.compression: LZ4
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                  Options.bottommost_compression: Disabled
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:       Options.prefix_extractor: nullptr
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:             Options.num_levels: 7
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:            Options.compression_opts.window_bits: -14
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                  Options.compression_opts.level: 32767
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:               Options.compression_opts.strategy: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                  Options.compression_opts.enabled: false
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                   Options.target_file_size_base: 67108864
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:             Options.target_file_size_multiplier: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                        Options.arena_block_size: 1048576
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                Options.disable_auto_compactions: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                   Options.inplace_update_support: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:   Options.memtable_huge_page_size: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                           Options.bloom_locality: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                    Options.max_successive_merges: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                Options.paranoid_file_checks: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                Options.force_consistency_checks: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                Options.report_bg_io_stats: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                               Options.ttl: 2592000
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                       Options.enable_blob_files: false
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                           Options.min_blob_size: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                          Options.blob_file_size: 268435456
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                Options.blob_file_starting_level: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 6, name: p-2)
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-2]:
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:           Options.merge_operator: None
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:        Options.compaction_filter: None
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:        Options.compaction_filter_factory: None
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:  Options.sst_partitioner_factory: None
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:            Options.table_factory: BlockBasedTable
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x557956cc6b20)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x557956c5c850
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:        Options.write_buffer_size: 16777216
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:  Options.max_write_buffer_number: 64
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:          Options.compression: LZ4
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                  Options.bottommost_compression: Disabled
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:       Options.prefix_extractor: nullptr
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:             Options.num_levels: 7
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:            Options.compression_opts.window_bits: -14
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                  Options.compression_opts.level: 32767
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:               Options.compression_opts.strategy: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                  Options.compression_opts.enabled: false
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                   Options.target_file_size_base: 67108864
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:             Options.target_file_size_multiplier: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                        Options.arena_block_size: 1048576
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                Options.disable_auto_compactions: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                   Options.inplace_update_support: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:   Options.memtable_huge_page_size: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                           Options.bloom_locality: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                    Options.max_successive_merges: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                Options.paranoid_file_checks: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                Options.force_consistency_checks: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                Options.report_bg_io_stats: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                               Options.ttl: 2592000
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                       Options.enable_blob_files: false
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                           Options.min_blob_size: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                          Options.blob_file_size: 268435456
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                Options.blob_file_starting_level: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 7, name: O-0)
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-0]:
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:           Options.merge_operator: None
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:        Options.compaction_filter: None
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:        Options.compaction_filter_factory: None
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:  Options.sst_partitioner_factory: None
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:            Options.table_factory: BlockBasedTable
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x557957c53380)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x557956c5c2d0
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 536870912
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:        Options.write_buffer_size: 16777216
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:  Options.max_write_buffer_number: 64
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:          Options.compression: LZ4
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                  Options.bottommost_compression: Disabled
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:       Options.prefix_extractor: nullptr
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:             Options.num_levels: 7
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:            Options.compression_opts.window_bits: -14
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                  Options.compression_opts.level: 32767
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:               Options.compression_opts.strategy: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                  Options.compression_opts.enabled: false
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                   Options.target_file_size_base: 67108864
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:             Options.target_file_size_multiplier: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                        Options.arena_block_size: 1048576
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                Options.disable_auto_compactions: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                   Options.inplace_update_support: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:   Options.memtable_huge_page_size: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                           Options.bloom_locality: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                    Options.max_successive_merges: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                Options.paranoid_file_checks: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                Options.force_consistency_checks: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                Options.report_bg_io_stats: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                               Options.ttl: 2592000
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                       Options.enable_blob_files: false
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                           Options.min_blob_size: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                          Options.blob_file_size: 268435456
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                Options.blob_file_starting_level: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 8, name: O-1)
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-1]:
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:           Options.merge_operator: None
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:        Options.compaction_filter: None
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:        Options.compaction_filter_factory: None
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:  Options.sst_partitioner_factory: None
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:            Options.table_factory: BlockBasedTable
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x557957c53380)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x557956c5c2d0
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 536870912
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:        Options.write_buffer_size: 16777216
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:  Options.max_write_buffer_number: 64
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:          Options.compression: LZ4
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                  Options.bottommost_compression: Disabled
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:       Options.prefix_extractor: nullptr
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:             Options.num_levels: 7
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:            Options.compression_opts.window_bits: -14
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                  Options.compression_opts.level: 32767
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:               Options.compression_opts.strategy: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                  Options.compression_opts.enabled: false
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                   Options.target_file_size_base: 67108864
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:             Options.target_file_size_multiplier: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                        Options.arena_block_size: 1048576
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                Options.disable_auto_compactions: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                   Options.inplace_update_support: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:   Options.memtable_huge_page_size: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                           Options.bloom_locality: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                    Options.max_successive_merges: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                Options.paranoid_file_checks: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                Options.force_consistency_checks: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                Options.report_bg_io_stats: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                               Options.ttl: 2592000
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                       Options.enable_blob_files: false
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                           Options.min_blob_size: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                          Options.blob_file_size: 268435456
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                Options.blob_file_starting_level: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 9, name: O-2)
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-2]:
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:           Options.merge_operator: None
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:        Options.compaction_filter: None
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:        Options.compaction_filter_factory: None
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:  Options.sst_partitioner_factory: None
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:            Options.table_factory: BlockBasedTable
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x557957c53380)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x557956c5c2d0
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 536870912
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:        Options.write_buffer_size: 16777216
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:  Options.max_write_buffer_number: 64
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:          Options.compression: LZ4
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                  Options.bottommost_compression: Disabled
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:       Options.prefix_extractor: nullptr
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:             Options.num_levels: 7
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:            Options.compression_opts.window_bits: -14
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                  Options.compression_opts.level: 32767
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:               Options.compression_opts.strategy: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                  Options.compression_opts.enabled: false
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                   Options.target_file_size_base: 67108864
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:             Options.target_file_size_multiplier: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                        Options.arena_block_size: 1048576
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                Options.disable_auto_compactions: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                   Options.inplace_update_support: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:   Options.memtable_huge_page_size: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                           Options.bloom_locality: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                    Options.max_successive_merges: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                Options.paranoid_file_checks: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                Options.force_consistency_checks: 1
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                Options.report_bg_io_stats: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                               Options.ttl: 2592000
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                       Options.enable_blob_files: false
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                           Options.min_blob_size: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                          Options.blob_file_size: 268435456
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb:                Options.blob_file_starting_level: 0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 10, name: L)
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: [db/column_family.cc:635]         (skipping printing options)
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 11, name: P)
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: [db/column_family.cc:635]         (skipping printing options)
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:db/MANIFEST-000032 succeeded,manifest_file_number is 32, next_file_number is 34, last_sequence is 12, log_number is 5,prev_log_number is 0,max_column_family is 11,min_log_number_to_keep is 5
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 5
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: [db/version_set.cc:5581] Column family [m-0] (ID 1), log number is 5
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: [db/version_set.cc:5581] Column family [m-1] (ID 2), log number is 5
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: [db/version_set.cc:5581] Column family [m-2] (ID 3), log number is 5
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: [db/version_set.cc:5581] Column family [p-0] (ID 4), log number is 5
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: [db/version_set.cc:5581] Column family [p-1] (ID 5), log number is 5
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: [db/version_set.cc:5581] Column family [p-2] (ID 6), log number is 5
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: [db/version_set.cc:5581] Column family [O-0] (ID 7), log number is 5
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: [db/version_set.cc:5581] Column family [O-1] (ID 8), log number is 5
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: [db/version_set.cc:5581] Column family [O-2] (ID 9), log number is 5
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: [db/version_set.cc:5581] Column family [L] (ID 10), log number is 5
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: [db/version_set.cc:5581] Column family [P] (ID 11), log number is 5
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: 30259abc-06cf-4764-a5f7-1c462702ec25
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771832341339176, "job": 1, "event": "recovery_started", "wal_files": [31]}
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #31 mode 2
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771832341344340, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 35, "file_size": 1261, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 13, "largest_seqno": 21, "table_properties": {"data_size": 128, "index_size": 27, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 87, "raw_average_key_size": 17, "raw_value_size": 82, "raw_average_value_size": 16, "num_data_blocks": 1, "num_entries": 5, "num_filter_entries": 5, "num_deletions": 0, "num_merge_operands": 2, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": ".T:int64_array.b:bitwise_xor", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771832341, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "30259abc-06cf-4764-a5f7-1c462702ec25", "db_session_id": "T252W6HLUGEFWZ0H917Q", "orig_file_number": 35, "seqno_to_time_mapping": "N/A"}}
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771832341348734, "cf_name": "p-0", "job": 1, "event": "table_file_creation", "file_number": 36, "file_size": 1609, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 14, "largest_seqno": 15, "table_properties": {"data_size": 468, "index_size": 39, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 72, "raw_average_key_size": 36, "raw_value_size": 567, "raw_average_value_size": 283, "num_data_blocks": 1, "num_entries": 2, "num_filter_entries": 2, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "p-0", "column_family_id": 4, "comparator": "leveldb.BytewiseComparator", "merge_operator": "nullptr", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771832341, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "30259abc-06cf-4764-a5f7-1c462702ec25", "db_session_id": "T252W6HLUGEFWZ0H917Q", "orig_file_number": 36, "seqno_to_time_mapping": "N/A"}}
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771832341353551, "cf_name": "O-2", "job": 1, "event": "table_file_creation", "file_number": 37, "file_size": 1290, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 16, "largest_seqno": 16, "table_properties": {"data_size": 121, "index_size": 64, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 55, "raw_average_key_size": 55, "raw_value_size": 50, "raw_average_value_size": 50, "num_data_blocks": 1, "num_entries": 1, "num_filter_entries": 1, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "O-2", "column_family_id": 9, "comparator": "leveldb.BytewiseComparator", "merge_operator": "nullptr", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771832341, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "30259abc-06cf-4764-a5f7-1c462702ec25", "db_session_id": "T252W6HLUGEFWZ0H917Q", "orig_file_number": 37, "seqno_to_time_mapping": "N/A"}}
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: [db/db_impl/db_impl_open.cc:1432] Failed to truncate log #31: IO error: No such file or directory: While open a file for appending: db.wal/000031.log: No such file or directory
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771832341357946, "job": 1, "event": "recovery_finished"}
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: [db/version_set.cc:5047] Creating manifest 40
Feb 23 07:39:01 np0005626463.localdomain podman[31971]: 
Feb 23 07:39:01 np0005626463.localdomain podman[31971]: 2026-02-23 07:39:01.369306731 +0000 UTC m=+0.089500016 container create b07f9afdcc33ffaa0bd9ee09509fb8ff2a7b04ccc6db6917389d8b3278c460db (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-osd-5-activate-test, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., io.buildah.version=1.42.2, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, org.opencontainers.image.created=2026-02-09T10:25:24Z, version=7, GIT_BRANCH=main, build-date=2026-02-09T10:25:24Z, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-type=git, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, release=1770267347, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, name=rhceph, description=Red Hat Ceph Storage 7)
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x557956d1e700
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: DB pointer 0x557956da3a00
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: bluestore(/var/lib/ceph/osd/ceph-2) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: bluestore(/var/lib/ceph/osd/ceph-2) _upgrade_super from 4, latest 4
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: bluestore(/var/lib/ceph/osd/ceph-2) _upgrade_super done
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s
                                                          Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                          Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          
                                                          ** Compaction Stats [default] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      2/0    2.61 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.005       0      0       0.0       0.0
                                                           Sum      2/0    2.61 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.005       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.005       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [default] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.2      0.01              0.00         1    0.005       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.02 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.02 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x557956c5c850#2 capacity: 460.80 MB usage: 1.39 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 7.7e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(2,0.72 KB,0.000152323%) FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [default] **
                                                          
                                                          ** Compaction Stats [m-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x557956c5c850#2 capacity: 460.80 MB usage: 1.39 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 7.7e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(2,0.72 KB,0.000152323%) FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-0] **
                                                          
                                                          ** Compaction Stats [m-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x557956c5c850#2 capacity: 460.80 MB usage: 1.39 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 7.7e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(2,0.72 KB,0.000152323%) FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-1] **
                                                          
                                                          ** Compaction Stats [m-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x557956c5c850#2 capacity: 460.80 MB usage: 1.39 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 7.7e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(2,0.72 KB,0.000152323%) FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-2] **
                                                          
                                                          ** Compaction Stats [p-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      1/0    1.57 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Sum      1/0    1.57 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.4      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.02 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.02 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x557956c5c850#2 capacity: 460.80 MB usage: 1.39 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 7.7e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(2,0.72 KB,0.000152323%) FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-0] **
                                                          
                                                          ** Compaction Stats [p-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x557956c5c850#2 capacity: 460.80 MB usage: 1.39 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 7.7e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(2,0.72 KB,0.000152323%) FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-1] **
                                                          
                                                          ** Compaction Stats [p-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x557956c5c850#2 capacity: 460.80 MB usage: 1.39 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 7.7e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(2,0.72 KB,0.000152323%) FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-2] **
                                                          
                                                          ** Compaction Stats [O-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x557956c5c2d0#2 capacity: 512.00 MB usage: 0.25 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 2 last_secs: 2.2e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.14 KB,2.68221e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-0] **
                                                          
                                                          ** Compaction Stats [O-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x557956c5c2d0#2 capacity: 512.00 MB usage: 0.25 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 2 last_secs: 2.2e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.14 KB,2.68221e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-1] **
                                                          
                                                          ** Compaction Stats [O-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      1/0    1.26 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0
                                                           Sum      1/0    1.26 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.02 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.02 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x557956c5c2d0#2 capacity: 512.00 MB usage: 0.25 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 2 last_secs: 2.2e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.14 KB,2.68221e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-2] **
                                                          
                                                          ** Compaction Stats [L] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [L] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x557956c5c850#2 capacity: 460.80 MB usage: 1.39 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 7.7e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(2,0.72 KB,0.000152323%) FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [L] **
                                                          
                                                          ** Compaction Stats [P] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [P] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x557956c5c850#2 capacity: 460.80 MB usage: 1.39 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 7.7e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(2,0.72 KB,0.000152323%) FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [P] **
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: <cls> /builddir/build/BUILD/ceph-18.2.1/src/cls/cephfs/cls_cephfs.cc:201: loading cephfs
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: <cls> /builddir/build/BUILD/ceph-18.2.1/src/cls/hello/cls_hello.cc:316: loading cls_hello
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: _get_class not permitted to load lua
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: _get_class not permitted to load sdk
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: _get_class not permitted to load test_remote_reads
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: osd.2 0 crush map has features 288232575208783872, adjusting msgr requires for clients
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: osd.2 0 crush map has features 288232575208783872 was 8705, adjusting msgr requires for mons
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: osd.2 0 crush map has features 288232575208783872, adjusting msgr requires for osds
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: osd.2 0 check_osdmap_features enabling on-disk ERASURE CODES compat feature
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: osd.2 0 load_pgs
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: osd.2 0 load_pgs opened 0 pgs
Feb 23 07:39:01 np0005626463.localdomain ceph-osd[31633]: osd.2 0 log_to_monitors true
Feb 23 07:39:01 np0005626463.localdomain ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-osd-2[31629]: 2026-02-23T07:39:01.397+0000 7feb3f211a80 -1 osd.2 0 log_to_monitors true
Feb 23 07:39:01 np0005626463.localdomain systemd[1]: Started libpod-conmon-b07f9afdcc33ffaa0bd9ee09509fb8ff2a7b04ccc6db6917389d8b3278c460db.scope.
Feb 23 07:39:01 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 07:39:01 np0005626463.localdomain podman[31971]: 2026-02-23 07:39:01.339147319 +0000 UTC m=+0.059340604 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 23 07:39:01 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4deba69c444be98bfd31554cd680a1f083fb003199d093505109e816a6f9290d/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Feb 23 07:39:01 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4deba69c444be98bfd31554cd680a1f083fb003199d093505109e816a6f9290d/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Feb 23 07:39:01 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4deba69c444be98bfd31554cd680a1f083fb003199d093505109e816a6f9290d/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Feb 23 07:39:01 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4deba69c444be98bfd31554cd680a1f083fb003199d093505109e816a6f9290d/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Feb 23 07:39:01 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4deba69c444be98bfd31554cd680a1f083fb003199d093505109e816a6f9290d/merged/var/lib/ceph/osd/ceph-5 supports timestamps until 2038 (0x7fffffff)
Feb 23 07:39:01 np0005626463.localdomain podman[31971]: 2026-02-23 07:39:01.502255606 +0000 UTC m=+0.222448871 container init b07f9afdcc33ffaa0bd9ee09509fb8ff2a7b04ccc6db6917389d8b3278c460db (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-osd-5-activate-test, release=1770267347, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.expose-services=, version=7, GIT_BRANCH=main, ceph=True, build-date=2026-02-09T10:25:24Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, org.opencontainers.image.created=2026-02-09T10:25:24Z, architecture=x86_64, io.openshift.tags=rhceph ceph, name=rhceph, io.buildah.version=1.42.2, GIT_CLEAN=True)
Feb 23 07:39:01 np0005626463.localdomain podman[31971]: 2026-02-23 07:39:01.512987609 +0000 UTC m=+0.233180904 container start b07f9afdcc33ffaa0bd9ee09509fb8ff2a7b04ccc6db6917389d8b3278c460db (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-osd-5-activate-test, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2026-02-09T10:25:24Z, io.openshift.tags=rhceph ceph, name=rhceph, vendor=Red Hat, Inc., RELEASE=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1770267347, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.42.2, vcs-type=git, ceph=True, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, CEPH_POINT_RELEASE=, GIT_CLEAN=True, version=7, distribution-scope=public, com.redhat.component=rhceph-container, org.opencontainers.image.created=2026-02-09T10:25:24Z)
Feb 23 07:39:01 np0005626463.localdomain podman[31971]: 2026-02-23 07:39:01.513259243 +0000 UTC m=+0.233452538 container attach b07f9afdcc33ffaa0bd9ee09509fb8ff2a7b04ccc6db6917389d8b3278c460db (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-osd-5-activate-test, distribution-scope=public, RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.created=2026-02-09T10:25:24Z, architecture=x86_64, version=7, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vendor=Red Hat, Inc., GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1770267347, io.openshift.tags=rhceph ceph, build-date=2026-02-09T10:25:24Z, io.k8s.description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.component=rhceph-container, io.buildah.version=1.42.2, vcs-type=git, CEPH_POINT_RELEASE=, ceph=True, GIT_CLEAN=True)
Feb 23 07:39:01 np0005626463.localdomain ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-osd-5-activate-test[32200]: usage: ceph-volume activate [-h] [--osd-id OSD_ID] [--osd-uuid OSD_UUID]
Feb 23 07:39:01 np0005626463.localdomain ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-osd-5-activate-test[32200]:                             [--no-systemd] [--no-tmpfs]
Feb 23 07:39:01 np0005626463.localdomain ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-osd-5-activate-test[32200]: ceph-volume activate: error: unrecognized arguments: --bad-option
Feb 23 07:39:01 np0005626463.localdomain systemd[1]: libpod-b07f9afdcc33ffaa0bd9ee09509fb8ff2a7b04ccc6db6917389d8b3278c460db.scope: Deactivated successfully.
Feb 23 07:39:01 np0005626463.localdomain podman[31971]: 2026-02-23 07:39:01.773380221 +0000 UTC m=+0.493573506 container died b07f9afdcc33ffaa0bd9ee09509fb8ff2a7b04ccc6db6917389d8b3278c460db (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-osd-5-activate-test, version=7, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1770267347, io.buildah.version=1.42.2, build-date=2026-02-09T10:25:24Z, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-02-09T10:25:24Z, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, name=rhceph, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14)
Feb 23 07:39:01 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-5a37738ab23f69cfa016b219998c51db9748d85f1b0540dfceedf387378eb99f-merged.mount: Deactivated successfully.
Feb 23 07:39:01 np0005626463.localdomain systemd[1]: tmp-crun.uBmgG4.mount: Deactivated successfully.
Feb 23 07:39:01 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-4deba69c444be98bfd31554cd680a1f083fb003199d093505109e816a6f9290d-merged.mount: Deactivated successfully.
Feb 23 07:39:01 np0005626463.localdomain podman[32205]: 2026-02-23 07:39:01.86164332 +0000 UTC m=+0.077261134 container remove b07f9afdcc33ffaa0bd9ee09509fb8ff2a7b04ccc6db6917389d8b3278c460db (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-osd-5-activate-test, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, vcs-type=git, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1770267347, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.created=2026-02-09T10:25:24Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, description=Red Hat Ceph Storage 7, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.42.2, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, version=7, build-date=2026-02-09T10:25:24Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, RELEASE=main, name=rhceph)
Feb 23 07:39:01 np0005626463.localdomain systemd[1]: libpod-conmon-b07f9afdcc33ffaa0bd9ee09509fb8ff2a7b04ccc6db6917389d8b3278c460db.scope: Deactivated successfully.
Feb 23 07:39:02 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 07:39:02 np0005626463.localdomain systemd-rc-local-generator[32257]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 07:39:02 np0005626463.localdomain systemd-sysv-generator[32264]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 07:39:02 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 07:39:02 np0005626463.localdomain ceph-osd[31633]: log_channel(cluster) log [DBG] : purged_snaps scrub starts
Feb 23 07:39:02 np0005626463.localdomain ceph-osd[31633]: log_channel(cluster) log [DBG] : purged_snaps scrub ok
Feb 23 07:39:02 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 07:39:02 np0005626463.localdomain systemd-sysv-generator[32308]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 07:39:02 np0005626463.localdomain systemd-rc-local-generator[32303]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 07:39:02 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 07:39:02 np0005626463.localdomain systemd[1]: Starting Ceph osd.5 for f1fea371-cb69-578d-a3d0-b5c472a84b46...
Feb 23 07:39:03 np0005626463.localdomain podman[32366]: 
Feb 23 07:39:03 np0005626463.localdomain podman[32366]: 2026-02-23 07:39:03.109811445 +0000 UTC m=+0.082519130 container create 8345c25358efb72de5c480aed55d8b89f911e3fbc376f576e5f5cda716717077 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-osd-5-activate, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, io.openshift.tags=rhceph ceph, vcs-type=git, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.42.2, version=7, description=Red Hat Ceph Storage 7, name=rhceph, org.opencontainers.image.created=2026-02-09T10:25:24Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., release=1770267347, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, GIT_CLEAN=True, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2026-02-09T10:25:24Z)
Feb 23 07:39:03 np0005626463.localdomain systemd[1]: tmp-crun.qguPYS.mount: Deactivated successfully.
Feb 23 07:39:03 np0005626463.localdomain podman[32366]: 2026-02-23 07:39:03.07653387 +0000 UTC m=+0.049241555 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 23 07:39:03 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 07:39:03 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e0e38258921875b1b49a9719926b0ea2472d1235d7f77c36277a39f70203edc0/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Feb 23 07:39:03 np0005626463.localdomain ceph-osd[31633]: osd.2 0 done with init, starting boot process
Feb 23 07:39:03 np0005626463.localdomain ceph-osd[31633]: osd.2 0 start_boot
Feb 23 07:39:03 np0005626463.localdomain ceph-osd[31633]: osd.2 0 maybe_override_options_for_qos osd_max_backfills set to 1
Feb 23 07:39:03 np0005626463.localdomain ceph-osd[31633]: osd.2 0 maybe_override_options_for_qos osd_recovery_max_active set to 0
Feb 23 07:39:03 np0005626463.localdomain ceph-osd[31633]: osd.2 0 maybe_override_options_for_qos osd_recovery_max_active_hdd set to 3
Feb 23 07:39:03 np0005626463.localdomain ceph-osd[31633]: osd.2 0 maybe_override_options_for_qos osd_recovery_max_active_ssd set to 10
Feb 23 07:39:03 np0005626463.localdomain ceph-osd[31633]: osd.2 0  bench count 12288000 bsize 4 KiB
Feb 23 07:39:03 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e0e38258921875b1b49a9719926b0ea2472d1235d7f77c36277a39f70203edc0/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Feb 23 07:39:03 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e0e38258921875b1b49a9719926b0ea2472d1235d7f77c36277a39f70203edc0/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Feb 23 07:39:03 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e0e38258921875b1b49a9719926b0ea2472d1235d7f77c36277a39f70203edc0/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Feb 23 07:39:03 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e0e38258921875b1b49a9719926b0ea2472d1235d7f77c36277a39f70203edc0/merged/var/lib/ceph/osd/ceph-5 supports timestamps until 2038 (0x7fffffff)
Feb 23 07:39:03 np0005626463.localdomain podman[32366]: 2026-02-23 07:39:03.26585109 +0000 UTC m=+0.238558765 container init 8345c25358efb72de5c480aed55d8b89f911e3fbc376f576e5f5cda716717077 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-osd-5-activate, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., RELEASE=main, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1770267347, version=7, architecture=x86_64, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, distribution-scope=public, io.openshift.tags=rhceph ceph, org.opencontainers.image.created=2026-02-09T10:25:24Z, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, io.buildah.version=1.42.2, build-date=2026-02-09T10:25:24Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, com.redhat.component=rhceph-container)
Feb 23 07:39:03 np0005626463.localdomain podman[32366]: 2026-02-23 07:39:03.291990352 +0000 UTC m=+0.264698027 container start 8345c25358efb72de5c480aed55d8b89f911e3fbc376f576e5f5cda716717077 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-osd-5-activate, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, org.opencontainers.image.created=2026-02-09T10:25:24Z, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, build-date=2026-02-09T10:25:24Z, distribution-scope=public, architecture=x86_64, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, release=1770267347, io.buildah.version=1.42.2, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 23 07:39:03 np0005626463.localdomain podman[32366]: 2026-02-23 07:39:03.292312789 +0000 UTC m=+0.265020464 container attach 8345c25358efb72de5c480aed55d8b89f911e3fbc376f576e5f5cda716717077 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-osd-5-activate, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1770267347, build-date=2026-02-09T10:25:24Z, RELEASE=main, version=7, org.opencontainers.image.created=2026-02-09T10:25:24Z, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, ceph=True, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, io.openshift.expose-services=, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, io.buildah.version=1.42.2, description=Red Hat Ceph Storage 7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph)
Feb 23 07:39:03 np0005626463.localdomain ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-osd-5-activate[32380]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-5
Feb 23 07:39:03 np0005626463.localdomain bash[32366]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-5
Feb 23 07:39:03 np0005626463.localdomain ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-osd-5-activate[32380]: Running command: /usr/bin/ceph-bluestore-tool prime-osd-dir --path /var/lib/ceph/osd/ceph-5 --no-mon-config --dev /dev/mapper/ceph_vg1-ceph_lv1
Feb 23 07:39:03 np0005626463.localdomain bash[32366]: Running command: /usr/bin/ceph-bluestore-tool prime-osd-dir --path /var/lib/ceph/osd/ceph-5 --no-mon-config --dev /dev/mapper/ceph_vg1-ceph_lv1
Feb 23 07:39:04 np0005626463.localdomain ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-osd-5-activate[32380]: Running command: /usr/bin/chown -h ceph:ceph /dev/mapper/ceph_vg1-ceph_lv1
Feb 23 07:39:04 np0005626463.localdomain bash[32366]: Running command: /usr/bin/chown -h ceph:ceph /dev/mapper/ceph_vg1-ceph_lv1
Feb 23 07:39:04 np0005626463.localdomain ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-osd-5-activate[32380]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-1
Feb 23 07:39:04 np0005626463.localdomain bash[32366]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-1
Feb 23 07:39:04 np0005626463.localdomain ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-osd-5-activate[32380]: Running command: /usr/bin/ln -s /dev/mapper/ceph_vg1-ceph_lv1 /var/lib/ceph/osd/ceph-5/block
Feb 23 07:39:04 np0005626463.localdomain bash[32366]: Running command: /usr/bin/ln -s /dev/mapper/ceph_vg1-ceph_lv1 /var/lib/ceph/osd/ceph-5/block
Feb 23 07:39:04 np0005626463.localdomain ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-osd-5-activate[32380]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-5
Feb 23 07:39:04 np0005626463.localdomain bash[32366]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-5
Feb 23 07:39:04 np0005626463.localdomain ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-osd-5-activate[32380]: --> ceph-volume raw activate successful for osd ID: 5
Feb 23 07:39:04 np0005626463.localdomain bash[32366]: --> ceph-volume raw activate successful for osd ID: 5
Feb 23 07:39:04 np0005626463.localdomain systemd[1]: libpod-8345c25358efb72de5c480aed55d8b89f911e3fbc376f576e5f5cda716717077.scope: Deactivated successfully.
Feb 23 07:39:04 np0005626463.localdomain podman[32366]: 2026-02-23 07:39:04.071314804 +0000 UTC m=+1.044022459 container died 8345c25358efb72de5c480aed55d8b89f911e3fbc376f576e5f5cda716717077 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-osd-5-activate, org.opencontainers.image.created=2026-02-09T10:25:24Z, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, name=rhceph, io.buildah.version=1.42.2, build-date=2026-02-09T10:25:24Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, ceph=True, release=1770267347, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vendor=Red Hat, Inc., RELEASE=main, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, vcs-type=git, distribution-scope=public)
Feb 23 07:39:04 np0005626463.localdomain systemd[1]: tmp-crun.A2j4RE.mount: Deactivated successfully.
Feb 23 07:39:04 np0005626463.localdomain systemd[26179]: Starting Mark boot as successful...
Feb 23 07:39:04 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-e0e38258921875b1b49a9719926b0ea2472d1235d7f77c36277a39f70203edc0-merged.mount: Deactivated successfully.
Feb 23 07:39:04 np0005626463.localdomain systemd[26179]: Finished Mark boot as successful.
Feb 23 07:39:04 np0005626463.localdomain podman[32495]: 2026-02-23 07:39:04.188064468 +0000 UTC m=+0.104364166 container remove 8345c25358efb72de5c480aed55d8b89f911e3fbc376f576e5f5cda716717077 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-osd-5-activate, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, build-date=2026-02-09T10:25:24Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.42.2, GIT_CLEAN=True, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, vendor=Red Hat, Inc., ceph=True, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, version=7, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, distribution-scope=public, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2026-02-09T10:25:24Z, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1770267347, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Feb 23 07:39:04 np0005626463.localdomain podman[32556]: 
Feb 23 07:39:04 np0005626463.localdomain podman[32556]: 2026-02-23 07:39:04.56438526 +0000 UTC m=+0.091005126 container create 6682f631389f5fea34334a28d29db8aea85f2971ab07e24b48970944e80cac0e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-osd-5, io.buildah.version=1.42.2, io.openshift.tags=rhceph ceph, build-date=2026-02-09T10:25:24Z, GIT_CLEAN=True, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, RELEASE=main, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1770267347, com.redhat.component=rhceph-container, version=7, GIT_BRANCH=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, distribution-scope=public, io.openshift.expose-services=, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://catalog.redhat.com/en/search?searchType=containers)
Feb 23 07:39:04 np0005626463.localdomain podman[32556]: 2026-02-23 07:39:04.524008141 +0000 UTC m=+0.050628027 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 23 07:39:04 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/646d94a2aef9f44152fb31e8a85ecadbceb6f276fba1497b89b238712cecdceb/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Feb 23 07:39:04 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/646d94a2aef9f44152fb31e8a85ecadbceb6f276fba1497b89b238712cecdceb/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Feb 23 07:39:04 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/646d94a2aef9f44152fb31e8a85ecadbceb6f276fba1497b89b238712cecdceb/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Feb 23 07:39:04 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/646d94a2aef9f44152fb31e8a85ecadbceb6f276fba1497b89b238712cecdceb/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Feb 23 07:39:04 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/646d94a2aef9f44152fb31e8a85ecadbceb6f276fba1497b89b238712cecdceb/merged/var/lib/ceph/osd/ceph-5 supports timestamps until 2038 (0x7fffffff)
Feb 23 07:39:04 np0005626463.localdomain podman[32556]: 2026-02-23 07:39:04.698105245 +0000 UTC m=+0.224725111 container init 6682f631389f5fea34334a28d29db8aea85f2971ab07e24b48970944e80cac0e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-osd-5, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., GIT_BRANCH=main, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, com.redhat.component=rhceph-container, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, CEPH_POINT_RELEASE=, name=rhceph, distribution-scope=public, org.opencontainers.image.created=2026-02-09T10:25:24Z, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.42.2, architecture=x86_64, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1770267347, build-date=2026-02-09T10:25:24Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main)
Feb 23 07:39:04 np0005626463.localdomain podman[32556]: 2026-02-23 07:39:04.71803438 +0000 UTC m=+0.244654236 container start 6682f631389f5fea34334a28d29db8aea85f2971ab07e24b48970944e80cac0e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-osd-5, com.redhat.component=rhceph-container, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.created=2026-02-09T10:25:24Z, CEPH_POINT_RELEASE=, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.description=Red Hat Ceph Storage 7, version=7, RELEASE=main, architecture=x86_64, vendor=Red Hat, Inc., distribution-scope=public, GIT_BRANCH=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-02-09T10:25:24Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1770267347, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.42.2, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Feb 23 07:39:04 np0005626463.localdomain bash[32556]: 6682f631389f5fea34334a28d29db8aea85f2971ab07e24b48970944e80cac0e
Feb 23 07:39:04 np0005626463.localdomain systemd[1]: Started Ceph osd.5 for f1fea371-cb69-578d-a3d0-b5c472a84b46.
Feb 23 07:39:04 np0005626463.localdomain ceph-osd[32575]: set uid:gid to 167:167 (ceph:ceph)
Feb 23 07:39:04 np0005626463.localdomain ceph-osd[32575]: ceph version 18.2.1-381.el9cp (984f410e2a30899deb131725765b62212b1621db) reef (stable), process ceph-osd, pid 2
Feb 23 07:39:04 np0005626463.localdomain ceph-osd[32575]: pidfile_write: ignore empty --pid-file
Feb 23 07:39:04 np0005626463.localdomain ceph-osd[32575]: bdev(0x564b5612ce00 /var/lib/ceph/osd/ceph-5/block) open path /var/lib/ceph/osd/ceph-5/block
Feb 23 07:39:04 np0005626463.localdomain ceph-osd[32575]: bdev(0x564b5612ce00 /var/lib/ceph/osd/ceph-5/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-5/block failed: (22) Invalid argument
Feb 23 07:39:04 np0005626463.localdomain ceph-osd[32575]: bdev(0x564b5612ce00 /var/lib/ceph/osd/ceph-5/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Feb 23 07:39:04 np0005626463.localdomain ceph-osd[32575]: bluestore(/var/lib/ceph/osd/ceph-5) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 data 0.06
Feb 23 07:39:04 np0005626463.localdomain ceph-osd[32575]: bdev(0x564b5612d180 /var/lib/ceph/osd/ceph-5/block) open path /var/lib/ceph/osd/ceph-5/block
Feb 23 07:39:04 np0005626463.localdomain ceph-osd[32575]: bdev(0x564b5612d180 /var/lib/ceph/osd/ceph-5/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-5/block failed: (22) Invalid argument
Feb 23 07:39:04 np0005626463.localdomain ceph-osd[32575]: bdev(0x564b5612d180 /var/lib/ceph/osd/ceph-5/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Feb 23 07:39:04 np0005626463.localdomain ceph-osd[32575]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-5/block size 7.0 GiB
Feb 23 07:39:04 np0005626463.localdomain ceph-osd[32575]: bdev(0x564b5612d180 /var/lib/ceph/osd/ceph-5/block) close
Feb 23 07:39:04 np0005626463.localdomain sudo[31661]: pam_unix(sudo:session): session closed for user root
Feb 23 07:39:04 np0005626463.localdomain sudo[32588]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 07:39:04 np0005626463.localdomain sudo[32588]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 07:39:04 np0005626463.localdomain sudo[32588]: pam_unix(sudo:session): session closed for user root
Feb 23 07:39:04 np0005626463.localdomain sudo[32603]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ceph-volume --fsid f1fea371-cb69-578d-a3d0-b5c472a84b46 -- raw list --format json
Feb 23 07:39:04 np0005626463.localdomain sudo[32603]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: bdev(0x564b5612ce00 /var/lib/ceph/osd/ceph-5/block) close
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: starting osd.5 osd_data /var/lib/ceph/osd/ceph-5 /var/lib/ceph/osd/ceph-5/journal
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: load: jerasure load: lrc 
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: bdev(0x564b5612ce00 /var/lib/ceph/osd/ceph-5/block) open path /var/lib/ceph/osd/ceph-5/block
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: bdev(0x564b5612ce00 /var/lib/ceph/osd/ceph-5/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-5/block failed: (22) Invalid argument
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: bdev(0x564b5612ce00 /var/lib/ceph/osd/ceph-5/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: bluestore(/var/lib/ceph/osd/ceph-5) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 data 0.06
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: bdev(0x564b5612ce00 /var/lib/ceph/osd/ceph-5/block) close
Feb 23 07:39:05 np0005626463.localdomain podman[32661]: 
Feb 23 07:39:05 np0005626463.localdomain podman[32661]: 2026-02-23 07:39:05.57416127 +0000 UTC m=+0.080703824 container create d1408fe2c24d5ba44d9a845c6d3a4b4c1720109216931c1aeab55fcbf64375a8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=lucid_swartz, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, GIT_CLEAN=True, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, version=7, ceph=True, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.expose-services=, release=1770267347, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, vcs-type=git, io.buildah.version=1.42.2, org.opencontainers.image.created=2026-02-09T10:25:24Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, com.redhat.component=rhceph-container, architecture=x86_64, build-date=2026-02-09T10:25:24Z)
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: bdev(0x564b5612ce00 /var/lib/ceph/osd/ceph-5/block) open path /var/lib/ceph/osd/ceph-5/block
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: bdev(0x564b5612ce00 /var/lib/ceph/osd/ceph-5/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-5/block failed: (22) Invalid argument
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: bdev(0x564b5612ce00 /var/lib/ceph/osd/ceph-5/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: bluestore(/var/lib/ceph/osd/ceph-5) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 data 0.06
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: bdev(0x564b5612ce00 /var/lib/ceph/osd/ceph-5/block) close
Feb 23 07:39:05 np0005626463.localdomain systemd[1]: Started libpod-conmon-d1408fe2c24d5ba44d9a845c6d3a4b4c1720109216931c1aeab55fcbf64375a8.scope.
Feb 23 07:39:05 np0005626463.localdomain podman[32661]: 2026-02-23 07:39:05.53832073 +0000 UTC m=+0.044863274 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 23 07:39:05 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 07:39:05 np0005626463.localdomain podman[32661]: 2026-02-23 07:39:05.667406263 +0000 UTC m=+0.173948877 container init d1408fe2c24d5ba44d9a845c6d3a4b4c1720109216931c1aeab55fcbf64375a8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=lucid_swartz, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2026-02-09T10:25:24Z, ceph=True, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2026-02-09T10:25:24Z, architecture=x86_64, RELEASE=main, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, GIT_CLEAN=True, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, release=1770267347, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, io.buildah.version=1.42.2, CEPH_POINT_RELEASE=, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, name=rhceph)
Feb 23 07:39:05 np0005626463.localdomain podman[32661]: 2026-02-23 07:39:05.676406495 +0000 UTC m=+0.182949039 container start d1408fe2c24d5ba44d9a845c6d3a4b4c1720109216931c1aeab55fcbf64375a8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=lucid_swartz, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, GIT_CLEAN=True, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, org.opencontainers.image.created=2026-02-09T10:25:24Z, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, release=1770267347, com.redhat.component=rhceph-container, build-date=2026-02-09T10:25:24Z, io.buildah.version=1.42.2, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public)
Feb 23 07:39:05 np0005626463.localdomain podman[32661]: 2026-02-23 07:39:05.676612945 +0000 UTC m=+0.183155489 container attach d1408fe2c24d5ba44d9a845c6d3a4b4c1720109216931c1aeab55fcbf64375a8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=lucid_swartz, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2026-02-09T10:25:24Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, GIT_CLEAN=True, RELEASE=main, description=Red Hat Ceph Storage 7, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, vcs-type=git, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, release=1770267347, org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, CEPH_POINT_RELEASE=, io.buildah.version=1.42.2, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True)
Feb 23 07:39:05 np0005626463.localdomain lucid_swartz[32679]: 167 167
Feb 23 07:39:05 np0005626463.localdomain systemd[1]: libpod-d1408fe2c24d5ba44d9a845c6d3a4b4c1720109216931c1aeab55fcbf64375a8.scope: Deactivated successfully.
Feb 23 07:39:05 np0005626463.localdomain podman[32661]: 2026-02-23 07:39:05.682328005 +0000 UTC m=+0.188870579 container died d1408fe2c24d5ba44d9a845c6d3a4b4c1720109216931c1aeab55fcbf64375a8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=lucid_swartz, version=7, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, release=1770267347, ceph=True, distribution-scope=public, vendor=Red Hat, Inc., GIT_CLEAN=True, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, CEPH_POINT_RELEASE=, build-date=2026-02-09T10:25:24Z, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, architecture=x86_64, RELEASE=main, io.buildah.version=1.42.2, GIT_BRANCH=main)
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[31633]: osd.2 0 maybe_override_max_osd_capacity_for_qos osd bench result - bandwidth (MiB/sec): 30.720 iops: 7864.273 elapsed_sec: 0.381
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[31633]: log_channel(cluster) log [WRN] : OSD bench result of 7864.273296 IOPS is not within the threshold limit range of 50.000000 IOPS and 500.000000 IOPS for osd.2. IOPS capacity is unchanged at 315.000000 IOPS. The recommendation is to establish the osd's IOPS capacity using other benchmark tools (e.g. Fio) and then override osd_mclock_max_capacity_iops_[hdd|ssd].
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[31633]: osd.2 0 waiting for initial osdmap
Feb 23 07:39:05 np0005626463.localdomain ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-osd-2[31629]: 2026-02-23T07:39:05.686+0000 7feb3b9a5640 -1 osd.2 0 waiting for initial osdmap
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[31633]: osd.2 12 crush map has features 288514050185494528, adjusting msgr requires for clients
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[31633]: osd.2 12 crush map has features 288514050185494528 was 288232575208792577, adjusting msgr requires for mons
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[31633]: osd.2 12 crush map has features 3314932999778484224, adjusting msgr requires for osds
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[31633]: osd.2 12 check_osdmap_features require_osd_release unknown -> reef
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[31633]: osd.2 12 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[31633]: osd.2 12 set_numa_affinity not setting numa affinity
Feb 23 07:39:05 np0005626463.localdomain ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-osd-2[31629]: 2026-02-23T07:39:05.705+0000 7feb367ba640 -1 osd.2 12 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[31633]: osd.2 12 _collect_metadata loop3:  no unique device id for loop3: fallback method has no model nor serial
Feb 23 07:39:05 np0005626463.localdomain podman[32684]: 2026-02-23 07:39:05.790012784 +0000 UTC m=+0.091857030 container remove d1408fe2c24d5ba44d9a845c6d3a4b4c1720109216931c1aeab55fcbf64375a8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=lucid_swartz, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, version=7, io.openshift.expose-services=, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, vendor=Red Hat, Inc., architecture=x86_64, release=1770267347, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-02-09T10:25:24Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, build-date=2026-02-09T10:25:24Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_BRANCH=main, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.42.2)
Feb 23 07:39:05 np0005626463.localdomain systemd[1]: libpod-conmon-d1408fe2c24d5ba44d9a845c6d3a4b4c1720109216931c1aeab55fcbf64375a8.scope: Deactivated successfully.
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: mClockScheduler: set_osd_capacity_params_from_config: osd_bandwidth_cost_per_io: 499321.90 bytes/io, osd_bandwidth_capacity_per_shard 157286400.00 bytes/second
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: osd.5:0.OSDShard using op scheduler mclock_scheduler, cutoff=196
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: bdev(0x564b5612ce00 /var/lib/ceph/osd/ceph-5/block) open path /var/lib/ceph/osd/ceph-5/block
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: bdev(0x564b5612ce00 /var/lib/ceph/osd/ceph-5/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-5/block failed: (22) Invalid argument
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: bdev(0x564b5612ce00 /var/lib/ceph/osd/ceph-5/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: bluestore(/var/lib/ceph/osd/ceph-5) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 data 0.06
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: bdev(0x564b5612d180 /var/lib/ceph/osd/ceph-5/block) open path /var/lib/ceph/osd/ceph-5/block
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: bdev(0x564b5612d180 /var/lib/ceph/osd/ceph-5/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-5/block failed: (22) Invalid argument
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: bdev(0x564b5612d180 /var/lib/ceph/osd/ceph-5/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-5/block size 7.0 GiB
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: bluefs mount
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: bluefs _init_alloc shared, id 1, capacity 0x1bfc00000, block size 0x10000
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: bluefs mount shared_bdev_used = 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: bluestore(/var/lib/ceph/osd/ceph-5) _prepare_db_environment set db_paths to db,7136398540 db.slow,7136398540
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: RocksDB version: 7.9.2
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Git sha 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Compile date 2026-02-06 00:00:00
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: DB SUMMARY
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: DB Session ID:  VGZ6RC4M9VF4RKBK61PE
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: CURRENT file:  CURRENT
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: IDENTITY file:  IDENTITY
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: MANIFEST file:  MANIFEST-000032 size: 1007 Bytes
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: SST files in db dir, Total Num: 1, files: 000030.sst 
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: SST files in db.slow dir, Total Num: 0, files: 
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Write Ahead Log file in db.wal: 000031.log size: 5093 ; 
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                         Options.error_if_exists: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                       Options.create_if_missing: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                         Options.paranoid_checks: 1
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:             Options.flush_verify_memtable_count: 1
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                                     Options.env: 0x564b56f51c70
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                                      Options.fs: LegacyFileSystem
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                                Options.info_log: 0x564b570d07e0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                Options.max_file_opening_threads: 16
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                              Options.statistics: (nil)
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                               Options.use_fsync: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                       Options.max_log_file_size: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                   Options.log_file_time_to_roll: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                       Options.keep_log_file_num: 1000
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                    Options.recycle_log_file_num: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                         Options.allow_fallocate: 1
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                        Options.allow_mmap_reads: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                       Options.allow_mmap_writes: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                        Options.use_direct_reads: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:          Options.create_missing_column_families: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                              Options.db_log_dir: 
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                                 Options.wal_dir: db.wal
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                Options.table_cache_numshardbits: 6
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                         Options.WAL_ttl_seconds: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                       Options.WAL_size_limit_MB: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:             Options.manifest_preallocation_size: 4194304
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                     Options.is_fd_close_on_exec: 1
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                   Options.advise_random_on_open: 1
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                    Options.db_write_buffer_size: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                    Options.write_buffer_manager: 0x564b56116140
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.access_hint_on_compaction_start: 1
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                      Options.use_adaptive_mutex: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                            Options.rate_limiter: (nil)
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                       Options.wal_recovery_mode: 2
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                  Options.enable_thread_tracking: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                  Options.enable_pipelined_write: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                  Options.unordered_write: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:             Options.write_thread_max_yield_usec: 100
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                               Options.row_cache: None
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                              Options.wal_filter: None
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:             Options.avoid_flush_during_recovery: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:             Options.allow_ingest_behind: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:             Options.two_write_queues: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:             Options.manual_wal_flush: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:             Options.wal_compression: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:             Options.atomic_flush: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                 Options.persist_stats_to_disk: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                 Options.write_dbid_to_manifest: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                 Options.log_readahead_size: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                 Options.best_efforts_recovery: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:             Options.allow_data_in_errors: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:             Options.db_host_id: __hostname__
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:             Options.enforce_single_del_contracts: true
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:             Options.max_background_jobs: 4
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:             Options.max_background_compactions: -1
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:             Options.max_subcompactions: 1
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:           Options.writable_file_max_buffer_size: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:             Options.delayed_write_rate : 16777216
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:             Options.max_total_wal_size: 1073741824
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                   Options.stats_dump_period_sec: 600
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                 Options.stats_persist_period_sec: 600
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                          Options.max_open_files: -1
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                          Options.bytes_per_sync: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                      Options.wal_bytes_per_sync: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                   Options.strict_bytes_per_sync: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:       Options.compaction_readahead_size: 2097152
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                  Options.max_background_flushes: -1
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Compression algorithms supported:
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:         kZSTD supported: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:         kXpressCompression supported: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:         kBZip2Compression supported: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:         kZSTDNotFinalCompression supported: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:         kLZ4Compression supported: 1
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:         kZlibCompression supported: 1
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:         kLZ4HCCompression supported: 1
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:         kSnappyCompression supported: 1
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Fast CRC32 supported: Supported on x86
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: DMutex implementation: pthread_mutex_t
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: [db/db_impl/db_impl_readonly.cc:25] Opening the db in read only mode
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: db/MANIFEST-000032
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 0, name: default)
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:           Options.merge_operator: .T:int64_array.b:bitwise_xor
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:        Options.compaction_filter: None
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:        Options.compaction_filter_factory: None
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:  Options.sst_partitioner_factory: None
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:            Options.table_factory: BlockBasedTable
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x564b570d09a0)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x564b56104850
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:        Options.write_buffer_size: 16777216
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:  Options.max_write_buffer_number: 64
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:          Options.compression: LZ4
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                  Options.bottommost_compression: Disabled
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:       Options.prefix_extractor: nullptr
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:             Options.num_levels: 7
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:            Options.compression_opts.window_bits: -14
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                  Options.compression_opts.level: 32767
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:               Options.compression_opts.strategy: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                  Options.compression_opts.enabled: false
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                   Options.target_file_size_base: 67108864
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:             Options.target_file_size_multiplier: 1
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                        Options.arena_block_size: 1048576
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                Options.disable_auto_compactions: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                   Options.table_properties_collectors: 
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                   Options.inplace_update_support: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:   Options.memtable_huge_page_size: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                           Options.bloom_locality: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                    Options.max_successive_merges: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                Options.paranoid_file_checks: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                Options.force_consistency_checks: 1
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                Options.report_bg_io_stats: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                               Options.ttl: 2592000
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                       Options.enable_blob_files: false
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                           Options.min_blob_size: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                          Options.blob_file_size: 268435456
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                Options.blob_file_starting_level: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 1, name: m-0)
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-0]:
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:           Options.merge_operator: None
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:        Options.compaction_filter: None
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:        Options.compaction_filter_factory: None
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:  Options.sst_partitioner_factory: None
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:            Options.table_factory: BlockBasedTable
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x564b570d09a0)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x564b56104850
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:        Options.write_buffer_size: 16777216
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:  Options.max_write_buffer_number: 64
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:          Options.compression: LZ4
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                  Options.bottommost_compression: Disabled
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:       Options.prefix_extractor: nullptr
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:             Options.num_levels: 7
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:            Options.compression_opts.window_bits: -14
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                  Options.compression_opts.level: 32767
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:               Options.compression_opts.strategy: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                  Options.compression_opts.enabled: false
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                   Options.target_file_size_base: 67108864
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:             Options.target_file_size_multiplier: 1
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                        Options.arena_block_size: 1048576
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                Options.disable_auto_compactions: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                   Options.inplace_update_support: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:   Options.memtable_huge_page_size: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                           Options.bloom_locality: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                    Options.max_successive_merges: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                Options.paranoid_file_checks: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                Options.force_consistency_checks: 1
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                Options.report_bg_io_stats: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                               Options.ttl: 2592000
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                       Options.enable_blob_files: false
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                           Options.min_blob_size: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                          Options.blob_file_size: 268435456
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                Options.blob_file_starting_level: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 2, name: m-1)
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-1]:
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:           Options.merge_operator: None
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:        Options.compaction_filter: None
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:        Options.compaction_filter_factory: None
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:  Options.sst_partitioner_factory: None
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:            Options.table_factory: BlockBasedTable
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x564b570d09a0)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x564b56104850
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:        Options.write_buffer_size: 16777216
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:  Options.max_write_buffer_number: 64
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:          Options.compression: LZ4
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                  Options.bottommost_compression: Disabled
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:       Options.prefix_extractor: nullptr
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:             Options.num_levels: 7
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:            Options.compression_opts.window_bits: -14
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                  Options.compression_opts.level: 32767
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:               Options.compression_opts.strategy: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                  Options.compression_opts.enabled: false
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                   Options.target_file_size_base: 67108864
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:             Options.target_file_size_multiplier: 1
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                        Options.arena_block_size: 1048576
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                Options.disable_auto_compactions: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                   Options.inplace_update_support: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:   Options.memtable_huge_page_size: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                           Options.bloom_locality: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                    Options.max_successive_merges: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                Options.paranoid_file_checks: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                Options.force_consistency_checks: 1
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                Options.report_bg_io_stats: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                               Options.ttl: 2592000
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                       Options.enable_blob_files: false
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                           Options.min_blob_size: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                          Options.blob_file_size: 268435456
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                Options.blob_file_starting_level: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 3, name: m-2)
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-2]:
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:           Options.merge_operator: None
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:        Options.compaction_filter: None
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:        Options.compaction_filter_factory: None
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:  Options.sst_partitioner_factory: None
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:            Options.table_factory: BlockBasedTable
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x564b570d09a0)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x564b56104850
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:        Options.write_buffer_size: 16777216
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:  Options.max_write_buffer_number: 64
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:          Options.compression: LZ4
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                  Options.bottommost_compression: Disabled
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:       Options.prefix_extractor: nullptr
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:             Options.num_levels: 7
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:            Options.compression_opts.window_bits: -14
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                  Options.compression_opts.level: 32767
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:               Options.compression_opts.strategy: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                  Options.compression_opts.enabled: false
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                   Options.target_file_size_base: 67108864
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:             Options.target_file_size_multiplier: 1
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                        Options.arena_block_size: 1048576
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                Options.disable_auto_compactions: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                   Options.inplace_update_support: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:   Options.memtable_huge_page_size: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                           Options.bloom_locality: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                    Options.max_successive_merges: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                Options.paranoid_file_checks: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                Options.force_consistency_checks: 1
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                Options.report_bg_io_stats: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                               Options.ttl: 2592000
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                       Options.enable_blob_files: false
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                           Options.min_blob_size: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                          Options.blob_file_size: 268435456
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                Options.blob_file_starting_level: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 4, name: p-0)
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-0]:
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:           Options.merge_operator: None
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:        Options.compaction_filter: None
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:        Options.compaction_filter_factory: None
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:  Options.sst_partitioner_factory: None
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:            Options.table_factory: BlockBasedTable
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x564b570d09a0)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x564b56104850
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:        Options.write_buffer_size: 16777216
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:  Options.max_write_buffer_number: 64
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:          Options.compression: LZ4
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                  Options.bottommost_compression: Disabled
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:       Options.prefix_extractor: nullptr
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:             Options.num_levels: 7
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:            Options.compression_opts.window_bits: -14
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                  Options.compression_opts.level: 32767
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:               Options.compression_opts.strategy: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                  Options.compression_opts.enabled: false
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                   Options.target_file_size_base: 67108864
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:             Options.target_file_size_multiplier: 1
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                        Options.arena_block_size: 1048576
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                Options.disable_auto_compactions: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                   Options.inplace_update_support: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:   Options.memtable_huge_page_size: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                           Options.bloom_locality: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                    Options.max_successive_merges: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                Options.paranoid_file_checks: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                Options.force_consistency_checks: 1
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                Options.report_bg_io_stats: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                               Options.ttl: 2592000
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                       Options.enable_blob_files: false
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                           Options.min_blob_size: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                          Options.blob_file_size: 268435456
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                Options.blob_file_starting_level: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 5, name: p-1)
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-1]:
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:           Options.merge_operator: None
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:        Options.compaction_filter: None
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:        Options.compaction_filter_factory: None
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:  Options.sst_partitioner_factory: None
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:            Options.table_factory: BlockBasedTable
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x564b570d09a0)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x564b56104850
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:        Options.write_buffer_size: 16777216
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:  Options.max_write_buffer_number: 64
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:          Options.compression: LZ4
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                  Options.bottommost_compression: Disabled
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:       Options.prefix_extractor: nullptr
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:             Options.num_levels: 7
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:            Options.compression_opts.window_bits: -14
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                  Options.compression_opts.level: 32767
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:               Options.compression_opts.strategy: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                  Options.compression_opts.enabled: false
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                   Options.target_file_size_base: 67108864
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:             Options.target_file_size_multiplier: 1
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                        Options.arena_block_size: 1048576
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                Options.disable_auto_compactions: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                   Options.inplace_update_support: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:   Options.memtable_huge_page_size: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                           Options.bloom_locality: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                    Options.max_successive_merges: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                Options.paranoid_file_checks: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                Options.force_consistency_checks: 1
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                Options.report_bg_io_stats: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                               Options.ttl: 2592000
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                       Options.enable_blob_files: false
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                           Options.min_blob_size: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                          Options.blob_file_size: 268435456
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                Options.blob_file_starting_level: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 6, name: p-2)
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-2]:
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:           Options.merge_operator: None
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:        Options.compaction_filter: None
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:        Options.compaction_filter_factory: None
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:  Options.sst_partitioner_factory: None
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:            Options.table_factory: BlockBasedTable
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x564b570d09a0)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x564b56104850
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:        Options.write_buffer_size: 16777216
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:  Options.max_write_buffer_number: 64
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:          Options.compression: LZ4
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                  Options.bottommost_compression: Disabled
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:       Options.prefix_extractor: nullptr
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:             Options.num_levels: 7
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:            Options.compression_opts.window_bits: -14
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                  Options.compression_opts.level: 32767
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:               Options.compression_opts.strategy: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                  Options.compression_opts.enabled: false
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                   Options.target_file_size_base: 67108864
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:             Options.target_file_size_multiplier: 1
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                        Options.arena_block_size: 1048576
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                Options.disable_auto_compactions: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                   Options.inplace_update_support: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:   Options.memtable_huge_page_size: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                           Options.bloom_locality: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                    Options.max_successive_merges: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                Options.paranoid_file_checks: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                Options.force_consistency_checks: 1
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                Options.report_bg_io_stats: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                               Options.ttl: 2592000
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                       Options.enable_blob_files: false
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                           Options.min_blob_size: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                          Options.blob_file_size: 268435456
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                Options.blob_file_starting_level: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 7, name: O-0)
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-0]:
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:           Options.merge_operator: None
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:        Options.compaction_filter: None
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:        Options.compaction_filter_factory: None
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:  Options.sst_partitioner_factory: None
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:            Options.table_factory: BlockBasedTable
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x564b570d0bc0)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x564b561042d0
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 536870912
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:        Options.write_buffer_size: 16777216
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:  Options.max_write_buffer_number: 64
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:          Options.compression: LZ4
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                  Options.bottommost_compression: Disabled
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:       Options.prefix_extractor: nullptr
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:             Options.num_levels: 7
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:            Options.compression_opts.window_bits: -14
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                  Options.compression_opts.level: 32767
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:               Options.compression_opts.strategy: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                  Options.compression_opts.enabled: false
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                   Options.target_file_size_base: 67108864
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:             Options.target_file_size_multiplier: 1
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                        Options.arena_block_size: 1048576
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                Options.disable_auto_compactions: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                   Options.inplace_update_support: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:   Options.memtable_huge_page_size: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                           Options.bloom_locality: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                    Options.max_successive_merges: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                Options.paranoid_file_checks: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                Options.force_consistency_checks: 1
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                Options.report_bg_io_stats: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                               Options.ttl: 2592000
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                       Options.enable_blob_files: false
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                           Options.min_blob_size: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                          Options.blob_file_size: 268435456
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                Options.blob_file_starting_level: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 8, name: O-1)
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-1]:
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:           Options.merge_operator: None
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:        Options.compaction_filter: None
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:        Options.compaction_filter_factory: None
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:  Options.sst_partitioner_factory: None
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:            Options.table_factory: BlockBasedTable
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x564b570d0bc0)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x564b561042d0
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 536870912
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:        Options.write_buffer_size: 16777216
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:  Options.max_write_buffer_number: 64
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:          Options.compression: LZ4
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                  Options.bottommost_compression: Disabled
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:       Options.prefix_extractor: nullptr
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:             Options.num_levels: 7
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:            Options.compression_opts.window_bits: -14
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                  Options.compression_opts.level: 32767
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:               Options.compression_opts.strategy: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                  Options.compression_opts.enabled: false
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                   Options.target_file_size_base: 67108864
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:             Options.target_file_size_multiplier: 1
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                        Options.arena_block_size: 1048576
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                Options.disable_auto_compactions: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                   Options.inplace_update_support: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:   Options.memtable_huge_page_size: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                           Options.bloom_locality: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                    Options.max_successive_merges: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                Options.paranoid_file_checks: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                Options.force_consistency_checks: 1
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                Options.report_bg_io_stats: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                               Options.ttl: 2592000
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                       Options.enable_blob_files: false
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                           Options.min_blob_size: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                          Options.blob_file_size: 268435456
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                Options.blob_file_starting_level: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 9, name: O-2)
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-2]:
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:           Options.merge_operator: None
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:        Options.compaction_filter: None
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:        Options.compaction_filter_factory: None
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:  Options.sst_partitioner_factory: None
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:            Options.table_factory: BlockBasedTable
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x564b570d0bc0)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x564b561042d0
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 536870912
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:        Options.write_buffer_size: 16777216
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:  Options.max_write_buffer_number: 64
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:          Options.compression: LZ4
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                  Options.bottommost_compression: Disabled
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:       Options.prefix_extractor: nullptr
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:             Options.num_levels: 7
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:            Options.compression_opts.window_bits: -14
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                  Options.compression_opts.level: 32767
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:               Options.compression_opts.strategy: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                  Options.compression_opts.enabled: false
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                   Options.target_file_size_base: 67108864
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:             Options.target_file_size_multiplier: 1
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                        Options.arena_block_size: 1048576
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                Options.disable_auto_compactions: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                   Options.inplace_update_support: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:   Options.memtable_huge_page_size: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                           Options.bloom_locality: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                    Options.max_successive_merges: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                Options.paranoid_file_checks: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                Options.force_consistency_checks: 1
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                Options.report_bg_io_stats: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                               Options.ttl: 2592000
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                       Options.enable_blob_files: false
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                           Options.min_blob_size: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                          Options.blob_file_size: 268435456
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb:                Options.blob_file_starting_level: 0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 10, name: L)
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: [db/column_family.cc:635]         (skipping printing options)
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 11, name: P)
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: [db/column_family.cc:635]         (skipping printing options)
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:db/MANIFEST-000032 succeeded,manifest_file_number is 32, next_file_number is 34, last_sequence is 12, log_number is 5,prev_log_number is 0,max_column_family is 11,min_log_number_to_keep is 5
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 5
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: [db/version_set.cc:5581] Column family [m-0] (ID 1), log number is 5
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: [db/version_set.cc:5581] Column family [m-1] (ID 2), log number is 5
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: [db/version_set.cc:5581] Column family [m-2] (ID 3), log number is 5
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: [db/version_set.cc:5581] Column family [p-0] (ID 4), log number is 5
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: [db/version_set.cc:5581] Column family [p-1] (ID 5), log number is 5
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: [db/version_set.cc:5581] Column family [p-2] (ID 6), log number is 5
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: [db/version_set.cc:5581] Column family [O-0] (ID 7), log number is 5
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: [db/version_set.cc:5581] Column family [O-1] (ID 8), log number is 5
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: [db/version_set.cc:5581] Column family [O-2] (ID 9), log number is 5
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: [db/version_set.cc:5581] Column family [L] (ID 10), log number is 5
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: [db/version_set.cc:5581] Column family [P] (ID 11), log number is 5
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: fbdd6168-4234-4c0e-b38b-7b6ed96d1042
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771832345888137, "job": 1, "event": "recovery_started", "wal_files": [31]}
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #31 mode 2
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771832345888404, "job": 1, "event": "recovery_finished"}
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: bluestore(/var/lib/ceph/osd/ceph-5) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: bluestore(/var/lib/ceph/osd/ceph-5) _open_super_meta old nid_max 1025
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: bluestore(/var/lib/ceph/osd/ceph-5) _open_super_meta old blobid_max 10240
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: bluestore(/var/lib/ceph/osd/ceph-5) _open_super_meta ondisk_format 4 compat_ondisk_format 3
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: bluestore(/var/lib/ceph/osd/ceph-5) _open_super_meta min_alloc_size 0x1000
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: freelist init
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: freelist _read_cfg
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: bluestore(/var/lib/ceph/osd/ceph-5) _init_alloc loaded 7.0 GiB in 2 extents, allocator type hybrid, capacity 0x1bfc00000, block size 0x1000, free 0x1bfbfd000, fragmentation 5.5e-07
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: [db/db_impl/db_impl.cc:496] Shutdown: canceling all background work
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: rocksdb: [db/db_impl/db_impl.cc:704] Shutdown complete
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: bluefs umount
Feb 23 07:39:05 np0005626463.localdomain ceph-osd[32575]: bdev(0x564b5612d180 /var/lib/ceph/osd/ceph-5/block) close
Feb 23 07:39:06 np0005626463.localdomain podman[32901]: 
Feb 23 07:39:06 np0005626463.localdomain podman[32901]: 2026-02-23 07:39:06.018779635 +0000 UTC m=+0.081642084 container create 84814a137f6f0e56f29f34478c6746623b245c6e6498e993e1ad5f21a1158564 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=frosty_lewin, distribution-scope=public, RELEASE=main, release=1770267347, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, build-date=2026-02-09T10:25:24Z, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.42.2, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-type=git)
Feb 23 07:39:06 np0005626463.localdomain systemd[1]: Started libpod-conmon-84814a137f6f0e56f29f34478c6746623b245c6e6498e993e1ad5f21a1158564.scope.
Feb 23 07:39:06 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 07:39:06 np0005626463.localdomain podman[32901]: 2026-02-23 07:39:05.989455126 +0000 UTC m=+0.052317565 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 23 07:39:06 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d19888b42ed47ae85e70be871f8597528c5ba300873456929a49fbcdee1aad66/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Feb 23 07:39:06 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d19888b42ed47ae85e70be871f8597528c5ba300873456929a49fbcdee1aad66/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Feb 23 07:39:06 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d19888b42ed47ae85e70be871f8597528c5ba300873456929a49fbcdee1aad66/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Feb 23 07:39:06 np0005626463.localdomain podman[32901]: 2026-02-23 07:39:06.134170878 +0000 UTC m=+0.197033317 container init 84814a137f6f0e56f29f34478c6746623b245c6e6498e993e1ad5f21a1158564 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=frosty_lewin, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2026-02-09T10:25:24Z, RELEASE=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_CLEAN=True, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, release=1770267347, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.42.2, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7, vendor=Red Hat, Inc., distribution-scope=public, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64)
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: bdev(0x564b5612d180 /var/lib/ceph/osd/ceph-5/block) open path /var/lib/ceph/osd/ceph-5/block
Feb 23 07:39:06 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-3cac4e07189b1dadf83d6b52eea540775735b7fd67deefac60dc012da60a41b5-merged.mount: Deactivated successfully.
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: bdev(0x564b5612d180 /var/lib/ceph/osd/ceph-5/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-5/block failed: (22) Invalid argument
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: bdev(0x564b5612d180 /var/lib/ceph/osd/ceph-5/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-5/block size 7.0 GiB
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: bluefs mount
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: bluefs _init_alloc shared, id 1, capacity 0x1bfc00000, block size 0x10000
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: bluefs mount shared_bdev_used = 4718592
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: bluestore(/var/lib/ceph/osd/ceph-5) _prepare_db_environment set db_paths to db,7136398540 db.slow,7136398540
Feb 23 07:39:06 np0005626463.localdomain podman[32901]: 2026-02-23 07:39:06.1506109 +0000 UTC m=+0.213473319 container start 84814a137f6f0e56f29f34478c6746623b245c6e6498e993e1ad5f21a1158564 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=frosty_lewin, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, vcs-type=git, distribution-scope=public, RELEASE=main, io.openshift.expose-services=, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, description=Red Hat Ceph Storage 7, version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.description=Red Hat Ceph Storage 7, release=1770267347, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.42.2, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2026-02-09T10:25:24Z)
Feb 23 07:39:06 np0005626463.localdomain podman[32901]: 2026-02-23 07:39:06.150842812 +0000 UTC m=+0.213705231 container attach 84814a137f6f0e56f29f34478c6746623b245c6e6498e993e1ad5f21a1158564 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=frosty_lewin, description=Red Hat Ceph Storage 7, architecture=x86_64, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, io.buildah.version=1.42.2, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, build-date=2026-02-09T10:25:24Z, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, release=1770267347, vcs-type=git, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, name=rhceph, vendor=Red Hat, Inc.)
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: RocksDB version: 7.9.2
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Git sha 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Compile date 2026-02-06 00:00:00
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: DB SUMMARY
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: DB Session ID:  VGZ6RC4M9VF4RKBK61PF
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: CURRENT file:  CURRENT
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: IDENTITY file:  IDENTITY
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: MANIFEST file:  MANIFEST-000032 size: 1007 Bytes
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: SST files in db dir, Total Num: 1, files: 000030.sst 
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: SST files in db.slow dir, Total Num: 0, files: 
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Write Ahead Log file in db.wal: 000031.log size: 5093 ; 
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                         Options.error_if_exists: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                       Options.create_if_missing: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                         Options.paranoid_checks: 1
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:             Options.flush_verify_memtable_count: 1
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                                     Options.env: 0x564b562522a0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                                      Options.fs: LegacyFileSystem
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                                Options.info_log: 0x564b571440c0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                Options.max_file_opening_threads: 16
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                              Options.statistics: (nil)
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                               Options.use_fsync: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                       Options.max_log_file_size: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                   Options.log_file_time_to_roll: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                       Options.keep_log_file_num: 1000
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                    Options.recycle_log_file_num: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                         Options.allow_fallocate: 1
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                        Options.allow_mmap_reads: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                       Options.allow_mmap_writes: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                        Options.use_direct_reads: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:          Options.create_missing_column_families: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                              Options.db_log_dir: 
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                                 Options.wal_dir: db.wal
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                Options.table_cache_numshardbits: 6
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                         Options.WAL_ttl_seconds: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                       Options.WAL_size_limit_MB: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:             Options.manifest_preallocation_size: 4194304
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                     Options.is_fd_close_on_exec: 1
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                   Options.advise_random_on_open: 1
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                    Options.db_write_buffer_size: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                    Options.write_buffer_manager: 0x564b56116140
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.access_hint_on_compaction_start: 1
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                      Options.use_adaptive_mutex: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                            Options.rate_limiter: (nil)
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                       Options.wal_recovery_mode: 2
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                  Options.enable_thread_tracking: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                  Options.enable_pipelined_write: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                  Options.unordered_write: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:             Options.write_thread_max_yield_usec: 100
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                               Options.row_cache: None
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                              Options.wal_filter: None
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:             Options.avoid_flush_during_recovery: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:             Options.allow_ingest_behind: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:             Options.two_write_queues: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:             Options.manual_wal_flush: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:             Options.wal_compression: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:             Options.atomic_flush: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                 Options.persist_stats_to_disk: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                 Options.write_dbid_to_manifest: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                 Options.log_readahead_size: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                 Options.best_efforts_recovery: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:             Options.allow_data_in_errors: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:             Options.db_host_id: __hostname__
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:             Options.enforce_single_del_contracts: true
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:             Options.max_background_jobs: 4
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:             Options.max_background_compactions: -1
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:             Options.max_subcompactions: 1
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:           Options.writable_file_max_buffer_size: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:             Options.delayed_write_rate : 16777216
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:             Options.max_total_wal_size: 1073741824
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                   Options.stats_dump_period_sec: 600
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                 Options.stats_persist_period_sec: 600
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                          Options.max_open_files: -1
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                          Options.bytes_per_sync: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                      Options.wal_bytes_per_sync: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                   Options.strict_bytes_per_sync: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:       Options.compaction_readahead_size: 2097152
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                  Options.max_background_flushes: -1
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Compression algorithms supported:
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:         kZSTD supported: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:         kXpressCompression supported: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:         kBZip2Compression supported: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:         kZSTDNotFinalCompression supported: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:         kLZ4Compression supported: 1
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:         kZlibCompression supported: 1
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:         kLZ4HCCompression supported: 1
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:         kSnappyCompression supported: 1
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Fast CRC32 supported: Supported on x86
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: DMutex implementation: pthread_mutex_t
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: db/MANIFEST-000032
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 0, name: default)
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:           Options.merge_operator: .T:int64_array.b:bitwise_xor
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:        Options.compaction_filter: None
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:        Options.compaction_filter_factory: None
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:  Options.sst_partitioner_factory: None
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:            Options.table_factory: BlockBasedTable
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x564b57144280)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x564b561042d0
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:        Options.write_buffer_size: 16777216
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:  Options.max_write_buffer_number: 64
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:          Options.compression: LZ4
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                  Options.bottommost_compression: Disabled
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:       Options.prefix_extractor: nullptr
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:             Options.num_levels: 7
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:            Options.compression_opts.window_bits: -14
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                  Options.compression_opts.level: 32767
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:               Options.compression_opts.strategy: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                  Options.compression_opts.enabled: false
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                   Options.target_file_size_base: 67108864
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:             Options.target_file_size_multiplier: 1
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                        Options.arena_block_size: 1048576
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                Options.disable_auto_compactions: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                   Options.table_properties_collectors: 
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                   Options.inplace_update_support: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:   Options.memtable_huge_page_size: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                           Options.bloom_locality: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                    Options.max_successive_merges: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                Options.paranoid_file_checks: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                Options.force_consistency_checks: 1
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                Options.report_bg_io_stats: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                               Options.ttl: 2592000
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                       Options.enable_blob_files: false
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                           Options.min_blob_size: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                          Options.blob_file_size: 268435456
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                Options.blob_file_starting_level: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 1, name: m-0)
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-0]:
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:           Options.merge_operator: None
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:        Options.compaction_filter: None
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:        Options.compaction_filter_factory: None
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:  Options.sst_partitioner_factory: None
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:            Options.table_factory: BlockBasedTable
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x564b57144280)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x564b561042d0
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:        Options.write_buffer_size: 16777216
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:  Options.max_write_buffer_number: 64
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:          Options.compression: LZ4
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                  Options.bottommost_compression: Disabled
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:       Options.prefix_extractor: nullptr
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:             Options.num_levels: 7
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:            Options.compression_opts.window_bits: -14
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                  Options.compression_opts.level: 32767
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:               Options.compression_opts.strategy: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                  Options.compression_opts.enabled: false
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                   Options.target_file_size_base: 67108864
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:             Options.target_file_size_multiplier: 1
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                        Options.arena_block_size: 1048576
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                Options.disable_auto_compactions: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                   Options.inplace_update_support: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:   Options.memtable_huge_page_size: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                           Options.bloom_locality: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                    Options.max_successive_merges: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                Options.paranoid_file_checks: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                Options.force_consistency_checks: 1
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                Options.report_bg_io_stats: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                               Options.ttl: 2592000
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                       Options.enable_blob_files: false
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                           Options.min_blob_size: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                          Options.blob_file_size: 268435456
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                Options.blob_file_starting_level: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 2, name: m-1)
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-1]:
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:           Options.merge_operator: None
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:        Options.compaction_filter: None
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:        Options.compaction_filter_factory: None
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:  Options.sst_partitioner_factory: None
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:            Options.table_factory: BlockBasedTable
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x564b57144280)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x564b561042d0
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:        Options.write_buffer_size: 16777216
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:  Options.max_write_buffer_number: 64
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:          Options.compression: LZ4
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                  Options.bottommost_compression: Disabled
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:       Options.prefix_extractor: nullptr
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:             Options.num_levels: 7
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:            Options.compression_opts.window_bits: -14
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                  Options.compression_opts.level: 32767
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:               Options.compression_opts.strategy: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                  Options.compression_opts.enabled: false
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                   Options.target_file_size_base: 67108864
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:             Options.target_file_size_multiplier: 1
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                        Options.arena_block_size: 1048576
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                Options.disable_auto_compactions: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                   Options.inplace_update_support: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:   Options.memtable_huge_page_size: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                           Options.bloom_locality: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                    Options.max_successive_merges: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                Options.paranoid_file_checks: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                Options.force_consistency_checks: 1
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                Options.report_bg_io_stats: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                               Options.ttl: 2592000
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                       Options.enable_blob_files: false
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                           Options.min_blob_size: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                          Options.blob_file_size: 268435456
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                Options.blob_file_starting_level: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 3, name: m-2)
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-2]:
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:           Options.merge_operator: None
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:        Options.compaction_filter: None
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:        Options.compaction_filter_factory: None
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:  Options.sst_partitioner_factory: None
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:            Options.table_factory: BlockBasedTable
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x564b57144280)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x564b561042d0
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:        Options.write_buffer_size: 16777216
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:  Options.max_write_buffer_number: 64
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:          Options.compression: LZ4
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                  Options.bottommost_compression: Disabled
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:       Options.prefix_extractor: nullptr
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:             Options.num_levels: 7
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:            Options.compression_opts.window_bits: -14
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                  Options.compression_opts.level: 32767
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:               Options.compression_opts.strategy: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                  Options.compression_opts.enabled: false
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                   Options.target_file_size_base: 67108864
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:             Options.target_file_size_multiplier: 1
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                        Options.arena_block_size: 1048576
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                Options.disable_auto_compactions: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                   Options.inplace_update_support: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:   Options.memtable_huge_page_size: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                           Options.bloom_locality: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                    Options.max_successive_merges: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                Options.paranoid_file_checks: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                Options.force_consistency_checks: 1
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                Options.report_bg_io_stats: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                               Options.ttl: 2592000
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                       Options.enable_blob_files: false
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                           Options.min_blob_size: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                          Options.blob_file_size: 268435456
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                Options.blob_file_starting_level: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 4, name: p-0)
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-0]:
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:           Options.merge_operator: None
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:        Options.compaction_filter: None
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:        Options.compaction_filter_factory: None
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:  Options.sst_partitioner_factory: None
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:            Options.table_factory: BlockBasedTable
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x564b57144280)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x564b561042d0
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:        Options.write_buffer_size: 16777216
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:  Options.max_write_buffer_number: 64
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:          Options.compression: LZ4
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                  Options.bottommost_compression: Disabled
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:       Options.prefix_extractor: nullptr
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:             Options.num_levels: 7
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:            Options.compression_opts.window_bits: -14
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                  Options.compression_opts.level: 32767
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:               Options.compression_opts.strategy: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                  Options.compression_opts.enabled: false
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                   Options.target_file_size_base: 67108864
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:             Options.target_file_size_multiplier: 1
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                        Options.arena_block_size: 1048576
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                Options.disable_auto_compactions: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                   Options.inplace_update_support: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:   Options.memtable_huge_page_size: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                           Options.bloom_locality: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                    Options.max_successive_merges: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                Options.paranoid_file_checks: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                Options.force_consistency_checks: 1
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                Options.report_bg_io_stats: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                               Options.ttl: 2592000
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                       Options.enable_blob_files: false
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                           Options.min_blob_size: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                          Options.blob_file_size: 268435456
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                Options.blob_file_starting_level: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 5, name: p-1)
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-1]:
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:           Options.merge_operator: None
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:        Options.compaction_filter: None
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:        Options.compaction_filter_factory: None
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:  Options.sst_partitioner_factory: None
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:            Options.table_factory: BlockBasedTable
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x564b57144280)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x564b561042d0
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:        Options.write_buffer_size: 16777216
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:  Options.max_write_buffer_number: 64
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:          Options.compression: LZ4
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                  Options.bottommost_compression: Disabled
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:       Options.prefix_extractor: nullptr
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:             Options.num_levels: 7
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:            Options.compression_opts.window_bits: -14
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                  Options.compression_opts.level: 32767
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:               Options.compression_opts.strategy: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                  Options.compression_opts.enabled: false
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                   Options.target_file_size_base: 67108864
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:             Options.target_file_size_multiplier: 1
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                        Options.arena_block_size: 1048576
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                Options.disable_auto_compactions: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                   Options.inplace_update_support: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:   Options.memtable_huge_page_size: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                           Options.bloom_locality: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                    Options.max_successive_merges: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                Options.paranoid_file_checks: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                Options.force_consistency_checks: 1
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                Options.report_bg_io_stats: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                               Options.ttl: 2592000
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                       Options.enable_blob_files: false
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                           Options.min_blob_size: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                          Options.blob_file_size: 268435456
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                Options.blob_file_starting_level: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 6, name: p-2)
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-2]:
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:           Options.merge_operator: None
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:        Options.compaction_filter: None
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:        Options.compaction_filter_factory: None
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:  Options.sst_partitioner_factory: None
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:            Options.table_factory: BlockBasedTable
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x564b57144280)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x564b561042d0
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:        Options.write_buffer_size: 16777216
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:  Options.max_write_buffer_number: 64
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:          Options.compression: LZ4
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                  Options.bottommost_compression: Disabled
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:       Options.prefix_extractor: nullptr
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:             Options.num_levels: 7
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:            Options.compression_opts.window_bits: -14
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                  Options.compression_opts.level: 32767
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:               Options.compression_opts.strategy: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                  Options.compression_opts.enabled: false
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                   Options.target_file_size_base: 67108864
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:             Options.target_file_size_multiplier: 1
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                        Options.arena_block_size: 1048576
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                Options.disable_auto_compactions: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                   Options.inplace_update_support: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:   Options.memtable_huge_page_size: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                           Options.bloom_locality: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                    Options.max_successive_merges: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                Options.paranoid_file_checks: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                Options.force_consistency_checks: 1
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                Options.report_bg_io_stats: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                               Options.ttl: 2592000
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                       Options.enable_blob_files: false
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                           Options.min_blob_size: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                          Options.blob_file_size: 268435456
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                Options.blob_file_starting_level: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 7, name: O-0)
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-0]:
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:           Options.merge_operator: None
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:        Options.compaction_filter: None
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:        Options.compaction_filter_factory: None
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:  Options.sst_partitioner_factory: None
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:            Options.table_factory: BlockBasedTable
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x564b571444c0)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x564b56105610
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 536870912
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:        Options.write_buffer_size: 16777216
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:  Options.max_write_buffer_number: 64
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:          Options.compression: LZ4
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                  Options.bottommost_compression: Disabled
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:       Options.prefix_extractor: nullptr
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:             Options.num_levels: 7
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:            Options.compression_opts.window_bits: -14
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                  Options.compression_opts.level: 32767
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:               Options.compression_opts.strategy: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                  Options.compression_opts.enabled: false
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                   Options.target_file_size_base: 67108864
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:             Options.target_file_size_multiplier: 1
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                        Options.arena_block_size: 1048576
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                Options.disable_auto_compactions: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                   Options.inplace_update_support: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:   Options.memtable_huge_page_size: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                           Options.bloom_locality: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                    Options.max_successive_merges: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                Options.paranoid_file_checks: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                Options.force_consistency_checks: 1
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                Options.report_bg_io_stats: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                               Options.ttl: 2592000
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                       Options.enable_blob_files: false
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                           Options.min_blob_size: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                          Options.blob_file_size: 268435456
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                Options.blob_file_starting_level: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 8, name: O-1)
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-1]:
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:           Options.merge_operator: None
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:        Options.compaction_filter: None
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:        Options.compaction_filter_factory: None
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:  Options.sst_partitioner_factory: None
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:            Options.table_factory: BlockBasedTable
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x564b571444c0)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x564b56105610
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 536870912
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:        Options.write_buffer_size: 16777216
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:  Options.max_write_buffer_number: 64
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:          Options.compression: LZ4
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                  Options.bottommost_compression: Disabled
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:       Options.prefix_extractor: nullptr
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:             Options.num_levels: 7
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:            Options.compression_opts.window_bits: -14
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                  Options.compression_opts.level: 32767
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:               Options.compression_opts.strategy: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                  Options.compression_opts.enabled: false
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                   Options.target_file_size_base: 67108864
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:             Options.target_file_size_multiplier: 1
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                        Options.arena_block_size: 1048576
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                Options.disable_auto_compactions: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                   Options.inplace_update_support: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:   Options.memtable_huge_page_size: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                           Options.bloom_locality: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                    Options.max_successive_merges: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                Options.paranoid_file_checks: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                Options.force_consistency_checks: 1
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                Options.report_bg_io_stats: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                               Options.ttl: 2592000
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                       Options.enable_blob_files: false
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                           Options.min_blob_size: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                          Options.blob_file_size: 268435456
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                Options.blob_file_starting_level: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 9, name: O-2)
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-2]:
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:           Options.merge_operator: None
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:        Options.compaction_filter: None
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:        Options.compaction_filter_factory: None
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:  Options.sst_partitioner_factory: None
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:            Options.table_factory: BlockBasedTable
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x564b571444c0)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x564b56105610
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 536870912
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:        Options.write_buffer_size: 16777216
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:  Options.max_write_buffer_number: 64
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:          Options.compression: LZ4
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                  Options.bottommost_compression: Disabled
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:       Options.prefix_extractor: nullptr
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:             Options.num_levels: 7
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:            Options.compression_opts.window_bits: -14
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                  Options.compression_opts.level: 32767
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:               Options.compression_opts.strategy: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                  Options.compression_opts.enabled: false
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                   Options.target_file_size_base: 67108864
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:             Options.target_file_size_multiplier: 1
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                        Options.arena_block_size: 1048576
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                Options.disable_auto_compactions: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                   Options.inplace_update_support: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:   Options.memtable_huge_page_size: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                           Options.bloom_locality: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                    Options.max_successive_merges: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                Options.paranoid_file_checks: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                Options.force_consistency_checks: 1
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                Options.report_bg_io_stats: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                               Options.ttl: 2592000
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                       Options.enable_blob_files: false
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                           Options.min_blob_size: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                          Options.blob_file_size: 268435456
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb:                Options.blob_file_starting_level: 0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 10, name: L)
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: [db/column_family.cc:635]         (skipping printing options)
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 11, name: P)
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: [db/column_family.cc:635]         (skipping printing options)
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:db/MANIFEST-000032 succeeded,manifest_file_number is 32, next_file_number is 34, last_sequence is 12, log_number is 5,prev_log_number is 0,max_column_family is 11,min_log_number_to_keep is 5
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 5
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: [db/version_set.cc:5581] Column family [m-0] (ID 1), log number is 5
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: [db/version_set.cc:5581] Column family [m-1] (ID 2), log number is 5
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: [db/version_set.cc:5581] Column family [m-2] (ID 3), log number is 5
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: [db/version_set.cc:5581] Column family [p-0] (ID 4), log number is 5
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: [db/version_set.cc:5581] Column family [p-1] (ID 5), log number is 5
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: [db/version_set.cc:5581] Column family [p-2] (ID 6), log number is 5
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: [db/version_set.cc:5581] Column family [O-0] (ID 7), log number is 5
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: [db/version_set.cc:5581] Column family [O-1] (ID 8), log number is 5
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: [db/version_set.cc:5581] Column family [O-2] (ID 9), log number is 5
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: [db/version_set.cc:5581] Column family [L] (ID 10), log number is 5
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: [db/version_set.cc:5581] Column family [P] (ID 11), log number is 5
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: fbdd6168-4234-4c0e-b38b-7b6ed96d1042
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771832346200632, "job": 1, "event": "recovery_started", "wal_files": [31]}
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #31 mode 2
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771832346207914, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 35, "file_size": 1261, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 13, "largest_seqno": 21, "table_properties": {"data_size": 128, "index_size": 27, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 87, "raw_average_key_size": 17, "raw_value_size": 82, "raw_average_value_size": 16, "num_data_blocks": 1, "num_entries": 5, "num_filter_entries": 5, "num_deletions": 0, "num_merge_operands": 2, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": ".T:int64_array.b:bitwise_xor", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771832346, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "fbdd6168-4234-4c0e-b38b-7b6ed96d1042", "db_session_id": "VGZ6RC4M9VF4RKBK61PF", "orig_file_number": 35, "seqno_to_time_mapping": "N/A"}}
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771832346213050, "cf_name": "p-0", "job": 1, "event": "table_file_creation", "file_number": 36, "file_size": 1609, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 14, "largest_seqno": 15, "table_properties": {"data_size": 468, "index_size": 39, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 72, "raw_average_key_size": 36, "raw_value_size": 567, "raw_average_value_size": 283, "num_data_blocks": 1, "num_entries": 2, "num_filter_entries": 2, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "p-0", "column_family_id": 4, "comparator": "leveldb.BytewiseComparator", "merge_operator": "nullptr", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771832346, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "fbdd6168-4234-4c0e-b38b-7b6ed96d1042", "db_session_id": "VGZ6RC4M9VF4RKBK61PF", "orig_file_number": 36, "seqno_to_time_mapping": "N/A"}}
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771832346217702, "cf_name": "O-2", "job": 1, "event": "table_file_creation", "file_number": 37, "file_size": 1290, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 16, "largest_seqno": 16, "table_properties": {"data_size": 121, "index_size": 64, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 55, "raw_average_key_size": 55, "raw_value_size": 50, "raw_average_value_size": 50, "num_data_blocks": 1, "num_entries": 1, "num_filter_entries": 1, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "O-2", "column_family_id": 9, "comparator": "leveldb.BytewiseComparator", "merge_operator": "nullptr", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771832346, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "fbdd6168-4234-4c0e-b38b-7b6ed96d1042", "db_session_id": "VGZ6RC4M9VF4RKBK61PF", "orig_file_number": 37, "seqno_to_time_mapping": "N/A"}}
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: [db/db_impl/db_impl_open.cc:1432] Failed to truncate log #31: IO error: No such file or directory: While open a file for appending: db.wal/000031.log: No such file or directory
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771832346222637, "job": 1, "event": "recovery_finished"}
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: [db/version_set.cc:5047] Creating manifest 40
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x564b56242700
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: DB pointer 0x564b57025a00
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: bluestore(/var/lib/ceph/osd/ceph-5) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: bluestore(/var/lib/ceph/osd/ceph-5) _upgrade_super from 4, latest 4
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: bluestore(/var/lib/ceph/osd/ceph-5) _upgrade_super done
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s
                                                          Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                          Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          
                                                          ** Compaction Stats [default] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      2/0    2.61 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.007       0      0       0.0       0.0
                                                           Sum      2/0    2.61 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.007       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.007       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [default] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.2      0.01              0.00         1    0.007       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.02 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.02 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x564b561042d0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 4e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [default] **
                                                          
                                                          ** Compaction Stats [m-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x564b561042d0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 4e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-0] **
                                                          
                                                          ** Compaction Stats [m-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x564b561042d0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 4e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-1] **
                                                          
                                                          ** Compaction Stats [m-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x564b561042d0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 4e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-2] **
                                                          
                                                          ** Compaction Stats [p-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      1/0    1.57 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.01              0.00         1    0.005       0      0       0.0       0.0
                                                           Sum      1/0    1.57 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.01              0.00         1    0.005       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.01              0.00         1    0.005       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.3      0.01              0.00         1    0.005       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.02 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.02 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x564b561042d0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 4e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-0] **
                                                          
                                                          ** Compaction Stats [p-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x564b561042d0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 4e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-1] **
                                                          
                                                          ** Compaction Stats [p-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x564b561042d0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 4e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-2] **
                                                          
                                                          ** Compaction Stats [O-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x564b56105610#2 capacity: 512.00 MB usage: 0.25 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 2 last_secs: 1.3e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.14 KB,2.68221e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-0] **
                                                          
                                                          ** Compaction Stats [O-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x564b56105610#2 capacity: 512.00 MB usage: 0.25 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 2 last_secs: 1.3e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.14 KB,2.68221e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-1] **
                                                          
                                                          ** Compaction Stats [O-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      1/0    1.26 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0
                                                           Sum      1/0    1.26 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.02 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.02 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x564b56105610#2 capacity: 512.00 MB usage: 0.25 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 2 last_secs: 1.3e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.14 KB,2.68221e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-2] **
                                                          
                                                          ** Compaction Stats [L] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [L] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x564b561042d0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 4e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [L] **
                                                          
                                                          ** Compaction Stats [P] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [P] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x564b561042d0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 4e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [P] **
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: <cls> /builddir/build/BUILD/ceph-18.2.1/src/cls/cephfs/cls_cephfs.cc:201: loading cephfs
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: <cls> /builddir/build/BUILD/ceph-18.2.1/src/cls/hello/cls_hello.cc:316: loading cls_hello
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: _get_class not permitted to load lua
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: _get_class not permitted to load sdk
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: _get_class not permitted to load test_remote_reads
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: osd.5 0 crush map has features 288232575208783872, adjusting msgr requires for clients
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: osd.5 0 crush map has features 288232575208783872 was 8705, adjusting msgr requires for mons
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: osd.5 0 crush map has features 288232575208783872, adjusting msgr requires for osds
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: osd.5 0 check_osdmap_features enabling on-disk ERASURE CODES compat feature
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: osd.5 0 load_pgs
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: osd.5 0 load_pgs opened 0 pgs
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[32575]: osd.5 0 log_to_monitors true
Feb 23 07:39:06 np0005626463.localdomain ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-osd-5[32571]: 2026-02-23T07:39:06.268+0000 7f8bc82f7a80 -1 osd.5 0 log_to_monitors true
Feb 23 07:39:06 np0005626463.localdomain ceph-osd[31633]: osd.2 13 state: booting -> active
Feb 23 07:39:06 np0005626463.localdomain frosty_lewin[32916]: {
Feb 23 07:39:06 np0005626463.localdomain frosty_lewin[32916]:     "3c38c3a7-5c4b-4b97-99e3-119e348f6df6": {
Feb 23 07:39:06 np0005626463.localdomain frosty_lewin[32916]:         "ceph_fsid": "f1fea371-cb69-578d-a3d0-b5c472a84b46",
Feb 23 07:39:06 np0005626463.localdomain frosty_lewin[32916]:         "device": "/dev/mapper/ceph_vg0-ceph_lv0",
Feb 23 07:39:06 np0005626463.localdomain frosty_lewin[32916]:         "osd_id": 2,
Feb 23 07:39:06 np0005626463.localdomain frosty_lewin[32916]:         "osd_uuid": "3c38c3a7-5c4b-4b97-99e3-119e348f6df6",
Feb 23 07:39:06 np0005626463.localdomain frosty_lewin[32916]:         "type": "bluestore"
Feb 23 07:39:06 np0005626463.localdomain frosty_lewin[32916]:     },
Feb 23 07:39:06 np0005626463.localdomain frosty_lewin[32916]:     "79650a5e-2685-4848-a7c4-7cead1e09ea1": {
Feb 23 07:39:06 np0005626463.localdomain frosty_lewin[32916]:         "ceph_fsid": "f1fea371-cb69-578d-a3d0-b5c472a84b46",
Feb 23 07:39:06 np0005626463.localdomain frosty_lewin[32916]:         "device": "/dev/mapper/ceph_vg1-ceph_lv1",
Feb 23 07:39:06 np0005626463.localdomain frosty_lewin[32916]:         "osd_id": 5,
Feb 23 07:39:06 np0005626463.localdomain frosty_lewin[32916]:         "osd_uuid": "79650a5e-2685-4848-a7c4-7cead1e09ea1",
Feb 23 07:39:06 np0005626463.localdomain frosty_lewin[32916]:         "type": "bluestore"
Feb 23 07:39:06 np0005626463.localdomain frosty_lewin[32916]:     }
Feb 23 07:39:06 np0005626463.localdomain frosty_lewin[32916]: }
Feb 23 07:39:06 np0005626463.localdomain systemd[1]: libpod-84814a137f6f0e56f29f34478c6746623b245c6e6498e993e1ad5f21a1158564.scope: Deactivated successfully.
Feb 23 07:39:06 np0005626463.localdomain podman[32901]: 2026-02-23 07:39:06.788509204 +0000 UTC m=+0.851371683 container died 84814a137f6f0e56f29f34478c6746623b245c6e6498e993e1ad5f21a1158564 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=frosty_lewin, architecture=x86_64, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1770267347, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, version=7, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, io.openshift.expose-services=, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, build-date=2026-02-09T10:25:24Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.42.2, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, org.opencontainers.image.created=2026-02-09T10:25:24Z, name=rhceph)
Feb 23 07:39:06 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-d19888b42ed47ae85e70be871f8597528c5ba300873456929a49fbcdee1aad66-merged.mount: Deactivated successfully.
Feb 23 07:39:06 np0005626463.localdomain podman[33168]: 2026-02-23 07:39:06.890688373 +0000 UTC m=+0.092616389 container remove 84814a137f6f0e56f29f34478c6746623b245c6e6498e993e1ad5f21a1158564 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=frosty_lewin, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., io.buildah.version=1.42.2, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, RELEASE=main, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, release=1770267347, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, GIT_CLEAN=True, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, build-date=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.created=2026-02-09T10:25:24Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, description=Red Hat Ceph Storage 7, architecture=x86_64)
Feb 23 07:39:06 np0005626463.localdomain systemd[1]: libpod-conmon-84814a137f6f0e56f29f34478c6746623b245c6e6498e993e1ad5f21a1158564.scope: Deactivated successfully.
Feb 23 07:39:06 np0005626463.localdomain sudo[32603]: pam_unix(sudo:session): session closed for user root
Feb 23 07:39:07 np0005626463.localdomain ceph-osd[32575]: log_channel(cluster) log [DBG] : purged_snaps scrub starts
Feb 23 07:39:07 np0005626463.localdomain ceph-osd[32575]: log_channel(cluster) log [DBG] : purged_snaps scrub ok
Feb 23 07:39:07 np0005626463.localdomain sudo[33183]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 07:39:07 np0005626463.localdomain sudo[33183]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 07:39:07 np0005626463.localdomain sudo[33183]: pam_unix(sudo:session): session closed for user root
Feb 23 07:39:07 np0005626463.localdomain ceph-osd[32575]: osd.5 0 done with init, starting boot process
Feb 23 07:39:07 np0005626463.localdomain ceph-osd[32575]: osd.5 0 start_boot
Feb 23 07:39:07 np0005626463.localdomain ceph-osd[32575]: osd.5 0 maybe_override_options_for_qos osd_max_backfills set to 1
Feb 23 07:39:07 np0005626463.localdomain ceph-osd[32575]: osd.5 0 maybe_override_options_for_qos osd_recovery_max_active set to 0
Feb 23 07:39:07 np0005626463.localdomain ceph-osd[32575]: osd.5 0 maybe_override_options_for_qos osd_recovery_max_active_hdd set to 3
Feb 23 07:39:07 np0005626463.localdomain ceph-osd[32575]: osd.5 0 maybe_override_options_for_qos osd_recovery_max_active_ssd set to 10
Feb 23 07:39:07 np0005626463.localdomain ceph-osd[32575]: osd.5 0  bench count 12288000 bsize 4 KiB
Feb 23 07:39:07 np0005626463.localdomain sudo[33198]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 07:39:07 np0005626463.localdomain sudo[33198]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 07:39:07 np0005626463.localdomain sudo[33198]: pam_unix(sudo:session): session closed for user root
Feb 23 07:39:07 np0005626463.localdomain sudo[33213]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Feb 23 07:39:07 np0005626463.localdomain sudo[33213]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 07:39:08 np0005626463.localdomain podman[33297]: 2026-02-23 07:39:08.507561801 +0000 UTC m=+0.110880327 container exec fdf07215f0388d0ebc44f1f3744080ba594441e647c300d0dade62ff5beba234 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-crash-np0005626463, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.42.2, com.redhat.component=rhceph-container, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, ceph=True, CEPH_POINT_RELEASE=, GIT_CLEAN=True, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., architecture=x86_64, build-date=2026-02-09T10:25:24Z, org.opencontainers.image.created=2026-02-09T10:25:24Z, release=1770267347, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main)
Feb 23 07:39:08 np0005626463.localdomain podman[33297]: 2026-02-23 07:39:08.630634108 +0000 UTC m=+0.233952614 container exec_died fdf07215f0388d0ebc44f1f3744080ba594441e647c300d0dade62ff5beba234 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-crash-np0005626463, release=1770267347, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.42.2, io.openshift.expose-services=, GIT_CLEAN=True, name=rhceph, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, version=7, RELEASE=main, org.opencontainers.image.created=2026-02-09T10:25:24Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, ceph=True, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2026-02-09T10:25:24Z, distribution-scope=public)
Feb 23 07:39:08 np0005626463.localdomain ceph-osd[31633]: osd.2 15 crush map has features 288514051259236352, adjusting msgr requires for clients
Feb 23 07:39:08 np0005626463.localdomain ceph-osd[31633]: osd.2 15 crush map has features 288514051259236352 was 288514050185503233, adjusting msgr requires for mons
Feb 23 07:39:08 np0005626463.localdomain ceph-osd[31633]: osd.2 15 crush map has features 3314933000852226048, adjusting msgr requires for osds
Feb 23 07:39:08 np0005626463.localdomain sudo[33213]: pam_unix(sudo:session): session closed for user root
Feb 23 07:39:09 np0005626463.localdomain sudo[33362]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 07:39:09 np0005626463.localdomain sudo[33362]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 07:39:09 np0005626463.localdomain sudo[33362]: pam_unix(sudo:session): session closed for user root
Feb 23 07:39:09 np0005626463.localdomain sudo[33377]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 23 07:39:09 np0005626463.localdomain sudo[33377]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 07:39:09 np0005626463.localdomain sudo[33377]: pam_unix(sudo:session): session closed for user root
Feb 23 07:39:09 np0005626463.localdomain sudo[33423]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 07:39:09 np0005626463.localdomain sudo[33423]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 07:39:09 np0005626463.localdomain sudo[33423]: pam_unix(sudo:session): session closed for user root
Feb 23 07:39:09 np0005626463.localdomain sudo[33438]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ceph-volume --fsid f1fea371-cb69-578d-a3d0-b5c472a84b46 -- inventory --format=json-pretty --filter-for-batch
Feb 23 07:39:09 np0005626463.localdomain sudo[33438]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 07:39:10 np0005626463.localdomain ceph-osd[32575]: osd.5 0 maybe_override_max_osd_capacity_for_qos osd bench result - bandwidth (MiB/sec): 28.360 iops: 7260.109 elapsed_sec: 0.413
Feb 23 07:39:10 np0005626463.localdomain ceph-osd[32575]: log_channel(cluster) log [WRN] : OSD bench result of 7260.108975 IOPS is not within the threshold limit range of 50.000000 IOPS and 500.000000 IOPS for osd.5. IOPS capacity is unchanged at 315.000000 IOPS. The recommendation is to establish the osd's IOPS capacity using other benchmark tools (e.g. Fio) and then override osd_mclock_max_capacity_iops_[hdd|ssd].
Feb 23 07:39:10 np0005626463.localdomain ceph-osd[32575]: osd.5 0 waiting for initial osdmap
Feb 23 07:39:10 np0005626463.localdomain ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-osd-5[32571]: 2026-02-23T07:39:10.372+0000 7f8bc4a8b640 -1 osd.5 0 waiting for initial osdmap
Feb 23 07:39:10 np0005626463.localdomain ceph-osd[32575]: osd.5 16 crush map has features 288514051259236352, adjusting msgr requires for clients
Feb 23 07:39:10 np0005626463.localdomain ceph-osd[32575]: osd.5 16 crush map has features 288514051259236352 was 288232575208792577, adjusting msgr requires for mons
Feb 23 07:39:10 np0005626463.localdomain ceph-osd[32575]: osd.5 16 crush map has features 3314933000852226048, adjusting msgr requires for osds
Feb 23 07:39:10 np0005626463.localdomain ceph-osd[32575]: osd.5 16 check_osdmap_features require_osd_release unknown -> reef
Feb 23 07:39:10 np0005626463.localdomain ceph-osd[32575]: osd.5 16 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
Feb 23 07:39:10 np0005626463.localdomain ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-osd-5[32571]: 2026-02-23T07:39:10.394+0000 7f8bbf8a0640 -1 osd.5 16 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
Feb 23 07:39:10 np0005626463.localdomain ceph-osd[32575]: osd.5 16 set_numa_affinity not setting numa affinity
Feb 23 07:39:10 np0005626463.localdomain ceph-osd[32575]: osd.5 16 _collect_metadata loop4:  no unique device id for loop4: fallback method has no model nor serial
Feb 23 07:39:10 np0005626463.localdomain podman[33495]: 
Feb 23 07:39:10 np0005626463.localdomain podman[33495]: 2026-02-23 07:39:10.487401961 +0000 UTC m=+0.060927168 container create fd486a448af1d0c901efcb6325cf0fa9c13b6eb53cf3e0f94d36f53135d2eeb5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=busy_dewdney, build-date=2026-02-09T10:25:24Z, org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-type=git, name=rhceph, CEPH_POINT_RELEASE=, architecture=x86_64, distribution-scope=public, RELEASE=main, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1770267347, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vendor=Red Hat, Inc., ceph=True, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.42.2)
Feb 23 07:39:10 np0005626463.localdomain systemd[1]: Started libpod-conmon-fd486a448af1d0c901efcb6325cf0fa9c13b6eb53cf3e0f94d36f53135d2eeb5.scope.
Feb 23 07:39:10 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 07:39:10 np0005626463.localdomain podman[33495]: 2026-02-23 07:39:10.458040921 +0000 UTC m=+0.031566178 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 23 07:39:10 np0005626463.localdomain podman[33495]: 2026-02-23 07:39:10.559840661 +0000 UTC m=+0.133365868 container init fd486a448af1d0c901efcb6325cf0fa9c13b6eb53cf3e0f94d36f53135d2eeb5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=busy_dewdney, org.opencontainers.image.created=2026-02-09T10:25:24Z, version=7, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1770267347, name=rhceph, io.buildah.version=1.42.2, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, architecture=x86_64, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, distribution-scope=public, build-date=2026-02-09T10:25:24Z, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, description=Red Hat Ceph Storage 7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Feb 23 07:39:10 np0005626463.localdomain podman[33495]: 2026-02-23 07:39:10.570899441 +0000 UTC m=+0.144424648 container start fd486a448af1d0c901efcb6325cf0fa9c13b6eb53cf3e0f94d36f53135d2eeb5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=busy_dewdney, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_BRANCH=main, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, CEPH_POINT_RELEASE=, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, distribution-scope=public, release=1770267347, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, build-date=2026-02-09T10:25:24Z, io.buildah.version=1.42.2, description=Red Hat Ceph Storage 7, vcs-type=git, RELEASE=main)
Feb 23 07:39:10 np0005626463.localdomain podman[33495]: 2026-02-23 07:39:10.571392977 +0000 UTC m=+0.144918214 container attach fd486a448af1d0c901efcb6325cf0fa9c13b6eb53cf3e0f94d36f53135d2eeb5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=busy_dewdney, RELEASE=main, distribution-scope=public, name=rhceph, com.redhat.component=rhceph-container, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2026-02-09T10:25:24Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, release=1770267347, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, vcs-type=git, GIT_BRANCH=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, io.buildah.version=1.42.2)
Feb 23 07:39:10 np0005626463.localdomain busy_dewdney[33510]: 167 167
Feb 23 07:39:10 np0005626463.localdomain systemd[1]: libpod-fd486a448af1d0c901efcb6325cf0fa9c13b6eb53cf3e0f94d36f53135d2eeb5.scope: Deactivated successfully.
Feb 23 07:39:10 np0005626463.localdomain podman[33495]: 2026-02-23 07:39:10.575042448 +0000 UTC m=+0.148567705 container died fd486a448af1d0c901efcb6325cf0fa9c13b6eb53cf3e0f94d36f53135d2eeb5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=busy_dewdney, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.42.2, version=7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-02-09T10:25:24Z, build-date=2026-02-09T10:25:24Z, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1770267347, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, description=Red Hat Ceph Storage 7, name=rhceph, io.openshift.tags=rhceph ceph, vcs-type=git, vendor=Red Hat, Inc., GIT_BRANCH=main, io.openshift.expose-services=, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14)
Feb 23 07:39:10 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-f3a00d67f949491969554252455383321180dec115fef8603cccfb97e1e81e82-merged.mount: Deactivated successfully.
Feb 23 07:39:10 np0005626463.localdomain podman[33515]: 2026-02-23 07:39:10.663686288 +0000 UTC m=+0.079257648 container remove fd486a448af1d0c901efcb6325cf0fa9c13b6eb53cf3e0f94d36f53135d2eeb5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=busy_dewdney, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, build-date=2026-02-09T10:25:24Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, io.buildah.version=1.42.2, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main, io.openshift.expose-services=, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, architecture=x86_64, io.openshift.tags=rhceph ceph, org.opencontainers.image.created=2026-02-09T10:25:24Z, release=1770267347, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, name=rhceph, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, vcs-type=git, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, description=Red Hat Ceph Storage 7)
Feb 23 07:39:10 np0005626463.localdomain systemd[1]: libpod-conmon-fd486a448af1d0c901efcb6325cf0fa9c13b6eb53cf3e0f94d36f53135d2eeb5.scope: Deactivated successfully.
Feb 23 07:39:10 np0005626463.localdomain ceph-osd[32575]: osd.5 17 state: booting -> active
Feb 23 07:39:10 np0005626463.localdomain podman[33534]: 
Feb 23 07:39:10 np0005626463.localdomain podman[33534]: 2026-02-23 07:39:10.870712849 +0000 UTC m=+0.071303482 container create c30dd87d971428f3344c194f9916593dd0504efce4b5d45abd13ae49956d0943 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elated_proskuriakova, com.redhat.component=rhceph-container, org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2026-02-09T10:25:24Z, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_CLEAN=True, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.42.2, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, CEPH_POINT_RELEASE=, RELEASE=main, release=1770267347, GIT_BRANCH=main, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Feb 23 07:39:10 np0005626463.localdomain systemd[1]: Started libpod-conmon-c30dd87d971428f3344c194f9916593dd0504efce4b5d45abd13ae49956d0943.scope.
Feb 23 07:39:10 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 07:39:10 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2ff7c36fe8ea2aa868693fd5fa630c6d71881a5a14da6fa3f1ad5e023647a557/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Feb 23 07:39:10 np0005626463.localdomain podman[33534]: 2026-02-23 07:39:10.842078397 +0000 UTC m=+0.042669470 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 23 07:39:10 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2ff7c36fe8ea2aa868693fd5fa630c6d71881a5a14da6fa3f1ad5e023647a557/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Feb 23 07:39:10 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2ff7c36fe8ea2aa868693fd5fa630c6d71881a5a14da6fa3f1ad5e023647a557/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Feb 23 07:39:10 np0005626463.localdomain podman[33534]: 2026-02-23 07:39:10.970684853 +0000 UTC m=+0.171275486 container init c30dd87d971428f3344c194f9916593dd0504efce4b5d45abd13ae49956d0943 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elated_proskuriakova, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, vendor=Red Hat, Inc., io.buildah.version=1.42.2, org.opencontainers.image.created=2026-02-09T10:25:24Z, architecture=x86_64, com.redhat.component=rhceph-container, release=1770267347, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, RELEASE=main, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, build-date=2026-02-09T10:25:24Z, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7, vcs-type=git, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 23 07:39:10 np0005626463.localdomain podman[33534]: 2026-02-23 07:39:10.981480839 +0000 UTC m=+0.182071482 container start c30dd87d971428f3344c194f9916593dd0504efce4b5d45abd13ae49956d0943 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elated_proskuriakova, org.opencontainers.image.created=2026-02-09T10:25:24Z, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1770267347, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vendor=Red Hat, Inc., ceph=True, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, distribution-scope=public, RELEASE=main, architecture=x86_64, build-date=2026-02-09T10:25:24Z, io.buildah.version=1.42.2, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Feb 23 07:39:10 np0005626463.localdomain podman[33534]: 2026-02-23 07:39:10.981765815 +0000 UTC m=+0.182356448 container attach c30dd87d971428f3344c194f9916593dd0504efce4b5d45abd13ae49956d0943 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elated_proskuriakova, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.expose-services=, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.created=2026-02-09T10:25:24Z, distribution-scope=public, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, CEPH_POINT_RELEASE=, io.buildah.version=1.42.2, ceph=True, RELEASE=main, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, release=1770267347, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, GIT_CLEAN=True, build-date=2026-02-09T10:25:24Z, io.k8s.description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container)
Feb 23 07:39:11 np0005626463.localdomain elated_proskuriakova[33549]: [
Feb 23 07:39:11 np0005626463.localdomain elated_proskuriakova[33549]:     {
Feb 23 07:39:11 np0005626463.localdomain elated_proskuriakova[33549]:         "available": false,
Feb 23 07:39:11 np0005626463.localdomain elated_proskuriakova[33549]:         "ceph_device": false,
Feb 23 07:39:11 np0005626463.localdomain elated_proskuriakova[33549]:         "device_id": "QEMU_DVD-ROM_QM00001",
Feb 23 07:39:11 np0005626463.localdomain elated_proskuriakova[33549]:         "lsm_data": {},
Feb 23 07:39:11 np0005626463.localdomain elated_proskuriakova[33549]:         "lvs": [],
Feb 23 07:39:11 np0005626463.localdomain elated_proskuriakova[33549]:         "path": "/dev/sr0",
Feb 23 07:39:11 np0005626463.localdomain elated_proskuriakova[33549]:         "rejected_reasons": [
Feb 23 07:39:11 np0005626463.localdomain elated_proskuriakova[33549]:             "Insufficient space (<5GB)",
Feb 23 07:39:11 np0005626463.localdomain elated_proskuriakova[33549]:             "Has a FileSystem"
Feb 23 07:39:11 np0005626463.localdomain elated_proskuriakova[33549]:         ],
Feb 23 07:39:11 np0005626463.localdomain elated_proskuriakova[33549]:         "sys_api": {
Feb 23 07:39:11 np0005626463.localdomain elated_proskuriakova[33549]:             "actuators": null,
Feb 23 07:39:11 np0005626463.localdomain elated_proskuriakova[33549]:             "device_nodes": "sr0",
Feb 23 07:39:11 np0005626463.localdomain elated_proskuriakova[33549]:             "human_readable_size": "482.00 KB",
Feb 23 07:39:11 np0005626463.localdomain elated_proskuriakova[33549]:             "id_bus": "ata",
Feb 23 07:39:11 np0005626463.localdomain elated_proskuriakova[33549]:             "model": "QEMU DVD-ROM",
Feb 23 07:39:11 np0005626463.localdomain elated_proskuriakova[33549]:             "nr_requests": "2",
Feb 23 07:39:11 np0005626463.localdomain elated_proskuriakova[33549]:             "partitions": {},
Feb 23 07:39:11 np0005626463.localdomain elated_proskuriakova[33549]:             "path": "/dev/sr0",
Feb 23 07:39:11 np0005626463.localdomain elated_proskuriakova[33549]:             "removable": "1",
Feb 23 07:39:11 np0005626463.localdomain elated_proskuriakova[33549]:             "rev": "2.5+",
Feb 23 07:39:11 np0005626463.localdomain elated_proskuriakova[33549]:             "ro": "0",
Feb 23 07:39:11 np0005626463.localdomain elated_proskuriakova[33549]:             "rotational": "1",
Feb 23 07:39:11 np0005626463.localdomain elated_proskuriakova[33549]:             "sas_address": "",
Feb 23 07:39:11 np0005626463.localdomain elated_proskuriakova[33549]:             "sas_device_handle": "",
Feb 23 07:39:11 np0005626463.localdomain elated_proskuriakova[33549]:             "scheduler_mode": "mq-deadline",
Feb 23 07:39:11 np0005626463.localdomain elated_proskuriakova[33549]:             "sectors": 0,
Feb 23 07:39:11 np0005626463.localdomain elated_proskuriakova[33549]:             "sectorsize": "2048",
Feb 23 07:39:11 np0005626463.localdomain elated_proskuriakova[33549]:             "size": 493568.0,
Feb 23 07:39:11 np0005626463.localdomain elated_proskuriakova[33549]:             "support_discard": "0",
Feb 23 07:39:11 np0005626463.localdomain elated_proskuriakova[33549]:             "type": "disk",
Feb 23 07:39:11 np0005626463.localdomain elated_proskuriakova[33549]:             "vendor": "QEMU"
Feb 23 07:39:11 np0005626463.localdomain elated_proskuriakova[33549]:         }
Feb 23 07:39:11 np0005626463.localdomain elated_proskuriakova[33549]:     }
Feb 23 07:39:11 np0005626463.localdomain elated_proskuriakova[33549]: ]
Feb 23 07:39:11 np0005626463.localdomain systemd[1]: libpod-c30dd87d971428f3344c194f9916593dd0504efce4b5d45abd13ae49956d0943.scope: Deactivated successfully.
Feb 23 07:39:11 np0005626463.localdomain podman[33534]: 2026-02-23 07:39:11.80976196 +0000 UTC m=+1.010352603 container died c30dd87d971428f3344c194f9916593dd0504efce4b5d45abd13ae49956d0943 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elated_proskuriakova, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2026-02-09T10:25:24Z, distribution-scope=public, name=rhceph, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.buildah.version=1.42.2, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1770267347, io.openshift.expose-services=, RELEASE=main, CEPH_POINT_RELEASE=, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers)
Feb 23 07:39:11 np0005626463.localdomain systemd[1]: tmp-crun.jTKsZx.mount: Deactivated successfully.
Feb 23 07:39:11 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-2ff7c36fe8ea2aa868693fd5fa630c6d71881a5a14da6fa3f1ad5e023647a557-merged.mount: Deactivated successfully.
Feb 23 07:39:11 np0005626463.localdomain podman[34766]: 2026-02-23 07:39:11.897433288 +0000 UTC m=+0.077465594 container remove c30dd87d971428f3344c194f9916593dd0504efce4b5d45abd13ae49956d0943 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elated_proskuriakova, RELEASE=main, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, org.opencontainers.image.created=2026-02-09T10:25:24Z, vendor=Red Hat, Inc., io.buildah.version=1.42.2, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, build-date=2026-02-09T10:25:24Z, name=rhceph, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1770267347, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, version=7, io.openshift.expose-services=, architecture=x86_64)
Feb 23 07:39:11 np0005626463.localdomain systemd[1]: libpod-conmon-c30dd87d971428f3344c194f9916593dd0504efce4b5d45abd13ae49956d0943.scope: Deactivated successfully.
Feb 23 07:39:11 np0005626463.localdomain sudo[33438]: pam_unix(sudo:session): session closed for user root
Feb 23 07:39:12 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 17 pg[1.0( empty local-lis/les=0/0 n=0 ec=15/15 lis/c=0/0 les/c/f=0/0/0 sis=17) [3,5,4] r=1 lpr=17 pi=[15,17)/0 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 23 07:39:13 np0005626463.localdomain sudo[34778]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 07:39:13 np0005626463.localdomain sudo[34778]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 07:39:13 np0005626463.localdomain sudo[34778]: pam_unix(sudo:session): session closed for user root
Feb 23 07:39:15 np0005626463.localdomain sshd[34793]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 07:39:15 np0005626463.localdomain sshd[34793]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 07:39:20 np0005626463.localdomain sudo[34795]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 07:39:20 np0005626463.localdomain sudo[34795]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 07:39:20 np0005626463.localdomain sudo[34795]: pam_unix(sudo:session): session closed for user root
Feb 23 07:39:20 np0005626463.localdomain sudo[34810]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Feb 23 07:39:20 np0005626463.localdomain sudo[34810]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 07:39:20 np0005626463.localdomain podman[34895]: 2026-02-23 07:39:20.987046892 +0000 UTC m=+0.080644251 container exec fdf07215f0388d0ebc44f1f3744080ba594441e647c300d0dade62ff5beba234 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-crash-np0005626463, GIT_CLEAN=True, vcs-type=git, name=rhceph, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, architecture=x86_64, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.42.2, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, GIT_BRANCH=main, RELEASE=main, vendor=Red Hat, Inc., org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, release=1770267347, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2026-02-09T10:25:24Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Feb 23 07:39:21 np0005626463.localdomain podman[34895]: 2026-02-23 07:39:21.093222486 +0000 UTC m=+0.186819915 container exec_died fdf07215f0388d0ebc44f1f3744080ba594441e647c300d0dade62ff5beba234 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-crash-np0005626463, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, description=Red Hat Ceph Storage 7, distribution-scope=public, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.created=2026-02-09T10:25:24Z, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, io.buildah.version=1.42.2, RELEASE=main, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, GIT_BRANCH=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, build-date=2026-02-09T10:25:24Z, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, release=1770267347)
Feb 23 07:39:21 np0005626463.localdomain sudo[34810]: pam_unix(sudo:session): session closed for user root
Feb 23 07:39:21 np0005626463.localdomain sudo[34963]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 07:39:21 np0005626463.localdomain sudo[34963]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 07:39:21 np0005626463.localdomain sudo[34963]: pam_unix(sudo:session): session closed for user root
Feb 23 07:39:58 np0005626463.localdomain sshd[34978]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 07:39:58 np0005626463.localdomain sshd[34978]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 07:40:22 np0005626463.localdomain sudo[34980]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 07:40:22 np0005626463.localdomain sudo[34980]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 07:40:22 np0005626463.localdomain sudo[34980]: pam_unix(sudo:session): session closed for user root
Feb 23 07:40:22 np0005626463.localdomain sudo[34995]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Feb 23 07:40:22 np0005626463.localdomain sudo[34995]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 07:40:22 np0005626463.localdomain systemd[1]: tmp-crun.16wQsG.mount: Deactivated successfully.
Feb 23 07:40:22 np0005626463.localdomain podman[35081]: 2026-02-23 07:40:22.936717138 +0000 UTC m=+0.092625787 container exec fdf07215f0388d0ebc44f1f3744080ba594441e647c300d0dade62ff5beba234 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-crash-np0005626463, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, build-date=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, description=Red Hat Ceph Storage 7, vcs-type=git, architecture=x86_64, org.opencontainers.image.created=2026-02-09T10:25:24Z, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, CEPH_POINT_RELEASE=, ceph=True, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, release=1770267347, GIT_CLEAN=True, io.buildah.version=1.42.2, distribution-scope=public, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc.)
Feb 23 07:40:23 np0005626463.localdomain podman[35081]: 2026-02-23 07:40:23.050702434 +0000 UTC m=+0.206611083 container exec_died fdf07215f0388d0ebc44f1f3744080ba594441e647c300d0dade62ff5beba234 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-crash-np0005626463, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.42.2, version=7, build-date=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, release=1770267347, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, distribution-scope=public, io.openshift.expose-services=, GIT_BRANCH=main, name=rhceph, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, org.opencontainers.image.created=2026-02-09T10:25:24Z, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers)
Feb 23 07:40:23 np0005626463.localdomain sudo[34995]: pam_unix(sudo:session): session closed for user root
Feb 23 07:40:23 np0005626463.localdomain sudo[35145]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 07:40:23 np0005626463.localdomain sudo[35145]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 07:40:23 np0005626463.localdomain sudo[35145]: pam_unix(sudo:session): session closed for user root
Feb 23 07:40:23 np0005626463.localdomain sudo[35160]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 23 07:40:23 np0005626463.localdomain sudo[35160]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 07:40:24 np0005626463.localdomain sudo[35160]: pam_unix(sudo:session): session closed for user root
Feb 23 07:40:24 np0005626463.localdomain sudo[35207]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 07:40:24 np0005626463.localdomain sudo[35207]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 07:40:24 np0005626463.localdomain sudo[35207]: pam_unix(sudo:session): session closed for user root
Feb 23 07:40:27 np0005626463.localdomain sshd[24669]: Received disconnect from 192.168.122.100 port 44382:11: disconnected by user
Feb 23 07:40:27 np0005626463.localdomain sshd[24669]: Disconnected from user zuul 192.168.122.100 port 44382
Feb 23 07:40:27 np0005626463.localdomain sshd[24666]: pam_unix(sshd:session): session closed for user zuul
Feb 23 07:40:27 np0005626463.localdomain systemd[1]: session-13.scope: Deactivated successfully.
Feb 23 07:40:27 np0005626463.localdomain systemd[1]: session-13.scope: Consumed 20.999s CPU time.
Feb 23 07:40:27 np0005626463.localdomain systemd-logind[759]: Session 13 logged out. Waiting for processes to exit.
Feb 23 07:40:27 np0005626463.localdomain systemd-logind[759]: Removed session 13.
Feb 23 07:40:41 np0005626463.localdomain sshd[35222]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 07:40:41 np0005626463.localdomain sshd[35222]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 07:41:24 np0005626463.localdomain sshd[35224]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 07:41:24 np0005626463.localdomain sudo[35226]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 07:41:24 np0005626463.localdomain sudo[35226]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 07:41:24 np0005626463.localdomain sudo[35226]: pam_unix(sudo:session): session closed for user root
Feb 23 07:41:24 np0005626463.localdomain sshd[35224]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 07:41:24 np0005626463.localdomain sudo[35241]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 23 07:41:24 np0005626463.localdomain sudo[35241]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 07:41:25 np0005626463.localdomain sudo[35241]: pam_unix(sudo:session): session closed for user root
Feb 23 07:41:25 np0005626463.localdomain sudo[35289]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 07:41:25 np0005626463.localdomain sudo[35289]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 07:41:25 np0005626463.localdomain sudo[35289]: pam_unix(sudo:session): session closed for user root
Feb 23 07:42:08 np0005626463.localdomain sshd[35304]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 07:42:09 np0005626463.localdomain sshd[35304]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 07:42:26 np0005626463.localdomain sudo[35306]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 07:42:26 np0005626463.localdomain sudo[35306]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 07:42:26 np0005626463.localdomain sudo[35306]: pam_unix(sudo:session): session closed for user root
Feb 23 07:42:26 np0005626463.localdomain sudo[35321]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 23 07:42:26 np0005626463.localdomain sudo[35321]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 07:42:26 np0005626463.localdomain sudo[35321]: pam_unix(sudo:session): session closed for user root
Feb 23 07:42:28 np0005626463.localdomain sudo[35367]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 07:42:28 np0005626463.localdomain sudo[35367]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 07:42:28 np0005626463.localdomain sudo[35367]: pam_unix(sudo:session): session closed for user root
Feb 23 07:42:47 np0005626463.localdomain systemd[26179]: Created slice User Background Tasks Slice.
Feb 23 07:42:47 np0005626463.localdomain systemd[26179]: Starting Cleanup of User's Temporary Files and Directories...
Feb 23 07:42:47 np0005626463.localdomain systemd[26179]: Finished Cleanup of User's Temporary Files and Directories.
Feb 23 07:42:54 np0005626463.localdomain sshd[35383]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 07:42:54 np0005626463.localdomain sshd[35383]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 07:42:57 np0005626463.localdomain sshd[35385]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 07:42:57 np0005626463.localdomain sshd[35385]: error: kex_exchange_identification: read: Connection reset by peer
Feb 23 07:42:57 np0005626463.localdomain sshd[35385]: Connection reset by 165.245.131.32 port 52082
Feb 23 07:42:58 np0005626463.localdomain sshd[35386]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 07:42:58 np0005626463.localdomain sshd[35386]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 07:43:32 np0005626463.localdomain sudo[35388]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 07:43:32 np0005626463.localdomain sudo[35388]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 07:43:32 np0005626463.localdomain sudo[35388]: pam_unix(sudo:session): session closed for user root
Feb 23 07:43:32 np0005626463.localdomain sudo[35403]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 23 07:43:32 np0005626463.localdomain sudo[35403]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 07:43:33 np0005626463.localdomain sudo[35403]: pam_unix(sudo:session): session closed for user root
Feb 23 07:43:34 np0005626463.localdomain sudo[35449]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 07:43:34 np0005626463.localdomain sudo[35449]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 07:43:34 np0005626463.localdomain sudo[35449]: pam_unix(sudo:session): session closed for user root
Feb 23 07:43:37 np0005626463.localdomain sshd[35464]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 07:43:38 np0005626463.localdomain sshd[35464]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 07:43:55 np0005626463.localdomain sshd[35466]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 07:43:55 np0005626463.localdomain sshd[35466]: Accepted publickey for zuul from 192.168.122.100 port 36462 ssh2: RSA SHA256:/ShS2J5Dq7o9P59e/NmgQORSAcJOBwu46Huo03HBdB4
Feb 23 07:43:55 np0005626463.localdomain systemd-logind[759]: New session 27 of user zuul.
Feb 23 07:43:56 np0005626463.localdomain systemd[1]: Started Session 27 of User zuul.
Feb 23 07:43:56 np0005626463.localdomain sshd[35466]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Feb 23 07:43:56 np0005626463.localdomain sudo[35512]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ybsjhpclygoprwmyqcugjonpvxwtknei ; /usr/bin/python3
Feb 23 07:43:56 np0005626463.localdomain sudo[35512]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 23 07:43:56 np0005626463.localdomain python3[35514]: ansible-ansible.legacy.ping Invoked with data=pong
Feb 23 07:43:56 np0005626463.localdomain sudo[35512]: pam_unix(sudo:session): session closed for user root
Feb 23 07:43:57 np0005626463.localdomain sudo[35557]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-axjyewidzfhdxsahaikhppdejvfcrcsi ; /usr/bin/python3
Feb 23 07:43:57 np0005626463.localdomain sudo[35557]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 23 07:43:57 np0005626463.localdomain python3[35559]: ansible-setup Invoked with gather_subset=['!facter', '!ohai'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 23 07:43:57 np0005626463.localdomain sudo[35557]: pam_unix(sudo:session): session closed for user root
Feb 23 07:43:57 np0005626463.localdomain sudo[35577]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pssgbpiviavkhkmkperotnhtgpebhhsc ; /usr/bin/python3
Feb 23 07:43:57 np0005626463.localdomain sudo[35577]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 23 07:43:57 np0005626463.localdomain python3[35579]: ansible-user Invoked with name=tripleo-admin generate_ssh_key=False state=present non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005626463.localdomain update_password=always uid=None group=None groups=None comment=None home=None shell=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None hidden=None seuser=None skeleton=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None
Feb 23 07:43:57 np0005626463.localdomain useradd[35581]: new group: name=tripleo-admin, GID=1003
Feb 23 07:43:57 np0005626463.localdomain useradd[35581]: new user: name=tripleo-admin, UID=1003, GID=1003, home=/home/tripleo-admin, shell=/bin/bash, from=none
Feb 23 07:43:57 np0005626463.localdomain sudo[35577]: pam_unix(sudo:session): session closed for user root
Feb 23 07:43:58 np0005626463.localdomain sudo[35633]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-juablyihpieknanfdzxsultufofzgbws ; /usr/bin/python3
Feb 23 07:43:58 np0005626463.localdomain sudo[35633]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 23 07:43:58 np0005626463.localdomain python3[35636]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/tripleo-admin follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 07:43:58 np0005626463.localdomain sudo[35633]: pam_unix(sudo:session): session closed for user root
Feb 23 07:43:58 np0005626463.localdomain sudo[35677]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hfrdvjgzljpfvgyixganpezhrxfplxkq ; /usr/bin/python3
Feb 23 07:43:58 np0005626463.localdomain sudo[35677]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 23 07:43:58 np0005626463.localdomain python3[35679]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/tripleo-admin mode=288 owner=root group=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771832638.0528462-66849-101500868619506/source _original_basename=tmp7f398_j3 follow=False checksum=b3e7ecdcc699d217c6b083a91b07208207813d93 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 07:43:58 np0005626463.localdomain sudo[35677]: pam_unix(sudo:session): session closed for user root
Feb 23 07:43:59 np0005626463.localdomain sudo[35707]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-klqzmfqzlmaqmgkvlrxxqfrnrsjmapyw ; /usr/bin/python3
Feb 23 07:43:59 np0005626463.localdomain sudo[35707]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 23 07:43:59 np0005626463.localdomain python3[35709]: ansible-file Invoked with path=/home/tripleo-admin state=directory owner=tripleo-admin group=tripleo-admin mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 07:43:59 np0005626463.localdomain sudo[35707]: pam_unix(sudo:session): session closed for user root
Feb 23 07:43:59 np0005626463.localdomain sudo[35723]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-edqgiidvdfkshkjhwanxmozikyxgkrlk ; /usr/bin/python3
Feb 23 07:43:59 np0005626463.localdomain sudo[35723]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 23 07:43:59 np0005626463.localdomain python3[35725]: ansible-file Invoked with path=/home/tripleo-admin/.ssh state=directory owner=tripleo-admin group=tripleo-admin mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 07:43:59 np0005626463.localdomain sudo[35723]: pam_unix(sudo:session): session closed for user root
Feb 23 07:43:59 np0005626463.localdomain sudo[35739]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-icyknnkkludmpegsrkxjtynffqxewilb ; /usr/bin/python3
Feb 23 07:43:59 np0005626463.localdomain sudo[35739]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 23 07:43:59 np0005626463.localdomain python3[35741]: ansible-file Invoked with path=/home/tripleo-admin/.ssh/authorized_keys state=touch owner=tripleo-admin group=tripleo-admin mode=384 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 07:43:59 np0005626463.localdomain sudo[35739]: pam_unix(sudo:session): session closed for user root
Feb 23 07:44:00 np0005626463.localdomain sudo[35755]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wxfncwkwpojtvqmggqxoxwfclynhsypc ; /usr/bin/python3
Feb 23 07:44:00 np0005626463.localdomain sudo[35755]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 23 07:44:00 np0005626463.localdomain python3[35757]: ansible-lineinfile Invoked with path=/home/tripleo-admin/.ssh/authorized_keys line=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCaFiAv9bTisS17GN1FZ7h/VJaLu2YTZdiVc9K45WqX/JhZ8Pwx1BXqBJEGlK+qmwEqEak4GGxIE3mEhiQwiY15E3nfJntSoOu8LhTWltcLr5Uy2mx8nXUZwB3QoNiY4y5uXCriTu6HLCXbdwhnWWT8PgX7V31GosHI0JQEpo4ixoShAJC2sFXL7wd3dTaZBo73qVrUhmekv/2GJo179k6wGblVjOwgB9nkfcuo0acLUaot1Uhc7ZrZM3Nfa7bjrW7OigrLvtNra7bBsjfeTgu6vOxxy1DcTD1xBKab631zTIIugiPViMTGrxgsBpyOc4tJpl1grZCJxm8mBDN25oK2jvP/NwUcC3C9ASWEr9U4QiAOTYN03OAyLGbhq48W3SYHwIi8/awDfJapvA+5rlCq/Xb+Fi/KrAaPxVoEehqWLBzCv0u/ZLZarmOih5rcgNTLMQ3l1/nVNtx+VP3eLtqA5a71JqntFfS80adH/3Px+Wen0lIixRqguGNzrD8ZGcU= zuul-build-sshkey
                                                          regexp=Generated by TripleO state=present backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 07:44:00 np0005626463.localdomain sudo[35755]: pam_unix(sudo:session): session closed for user root
Feb 23 07:44:01 np0005626463.localdomain python3[35771]: ansible-ping Invoked with data=pong
Feb 23 07:44:03 np0005626463.localdomain sshd[35772]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 07:44:04 np0005626463.localdomain sshd[35772]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 07:44:12 np0005626463.localdomain sshd[35774]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 07:44:12 np0005626463.localdomain sshd[35774]: Accepted publickey for tripleo-admin from 192.168.122.100 port 52882 ssh2: RSA SHA256:/ShS2J5Dq7o9P59e/NmgQORSAcJOBwu46Huo03HBdB4
Feb 23 07:44:12 np0005626463.localdomain systemd-logind[759]: New session 28 of user tripleo-admin.
Feb 23 07:44:12 np0005626463.localdomain systemd[1]: Created slice User Slice of UID 1003.
Feb 23 07:44:12 np0005626463.localdomain systemd[1]: Starting User Runtime Directory /run/user/1003...
Feb 23 07:44:12 np0005626463.localdomain systemd[1]: Finished User Runtime Directory /run/user/1003.
Feb 23 07:44:12 np0005626463.localdomain systemd[1]: Starting User Manager for UID 1003...
Feb 23 07:44:12 np0005626463.localdomain systemd[35778]: pam_unix(systemd-user:session): session opened for user tripleo-admin(uid=1003) by (uid=0)
Feb 23 07:44:12 np0005626463.localdomain systemd[35778]: Queued start job for default target Main User Target.
Feb 23 07:44:12 np0005626463.localdomain systemd[35778]: Created slice User Application Slice.
Feb 23 07:44:12 np0005626463.localdomain systemd[35778]: Started Mark boot as successful after the user session has run 2 minutes.
Feb 23 07:44:12 np0005626463.localdomain systemd[35778]: Started Daily Cleanup of User's Temporary Directories.
Feb 23 07:44:12 np0005626463.localdomain systemd[35778]: Reached target Paths.
Feb 23 07:44:12 np0005626463.localdomain systemd[35778]: Reached target Timers.
Feb 23 07:44:12 np0005626463.localdomain systemd[35778]: Starting D-Bus User Message Bus Socket...
Feb 23 07:44:12 np0005626463.localdomain systemd[35778]: Starting Create User's Volatile Files and Directories...
Feb 23 07:44:12 np0005626463.localdomain systemd[35778]: Finished Create User's Volatile Files and Directories.
Feb 23 07:44:12 np0005626463.localdomain systemd[35778]: Listening on D-Bus User Message Bus Socket.
Feb 23 07:44:12 np0005626463.localdomain systemd[35778]: Reached target Sockets.
Feb 23 07:44:12 np0005626463.localdomain systemd[35778]: Reached target Basic System.
Feb 23 07:44:12 np0005626463.localdomain systemd[35778]: Reached target Main User Target.
Feb 23 07:44:12 np0005626463.localdomain systemd[35778]: Startup finished in 119ms.
Feb 23 07:44:12 np0005626463.localdomain systemd[1]: Started User Manager for UID 1003.
Feb 23 07:44:12 np0005626463.localdomain systemd[1]: Started Session 28 of User tripleo-admin.
Feb 23 07:44:12 np0005626463.localdomain sshd[35774]: pam_unix(sshd:session): session opened for user tripleo-admin(uid=1003) by (uid=0)
Feb 23 07:44:13 np0005626463.localdomain sudo[35837]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-drciwriyrtprweexrvgoeysoowkrkyrh ; /usr/bin/python3
Feb 23 07:44:13 np0005626463.localdomain sudo[35837]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:44:13 np0005626463.localdomain python3[35839]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all', 'min'] gather_timeout=45 filter=[] fact_path=/etc/ansible/facts.d
Feb 23 07:44:13 np0005626463.localdomain sudo[35837]: pam_unix(sudo:session): session closed for user root
Feb 23 07:44:18 np0005626463.localdomain sudo[35857]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wvxfnlidsrbgxntosrjfujwzjnvfnsje ; /usr/bin/python3
Feb 23 07:44:18 np0005626463.localdomain sudo[35857]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:44:18 np0005626463.localdomain python3[35859]: ansible-selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config
Feb 23 07:44:18 np0005626463.localdomain sudo[35857]: pam_unix(sudo:session): session closed for user root
Feb 23 07:44:18 np0005626463.localdomain sudo[35873]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ersdyztnsbtlskiodmonqijtwopbkvtz ; /usr/bin/python3
Feb 23 07:44:18 np0005626463.localdomain sudo[35873]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:44:19 np0005626463.localdomain python3[35875]: ansible-tempfile Invoked with state=file suffix=tmphosts prefix=ansible. path=None
Feb 23 07:44:19 np0005626463.localdomain sudo[35873]: pam_unix(sudo:session): session closed for user root
Feb 23 07:44:19 np0005626463.localdomain sudo[35921]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qmgalrpzklvgczuziemqofethzcasfav ; /usr/bin/python3
Feb 23 07:44:19 np0005626463.localdomain sudo[35921]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:44:19 np0005626463.localdomain python3[35923]: ansible-ansible.legacy.copy Invoked with remote_src=True src=/etc/hosts dest=/tmp/ansible.2_n_ngzjtmphosts mode=preserve backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 07:44:19 np0005626463.localdomain sudo[35921]: pam_unix(sudo:session): session closed for user root
Feb 23 07:44:19 np0005626463.localdomain sshd[35938]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 07:44:20 np0005626463.localdomain sudo[35953]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-epasatdthanhqpmtmshwnoztynptvmxv ; /usr/bin/python3
Feb 23 07:44:20 np0005626463.localdomain sudo[35953]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:44:20 np0005626463.localdomain python3[35955]: ansible-blockinfile Invoked with state=absent path=/tmp/ansible.2_n_ngzjtmphosts block= marker=# {mark} marker_begin=HEAT_HOSTS_START - Do not edit manually within this section! marker_end=HEAT_HOSTS_END create=False backup=False unsafe_writes=False insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 07:44:20 np0005626463.localdomain sudo[35953]: pam_unix(sudo:session): session closed for user root
Feb 23 07:44:20 np0005626463.localdomain sshd[35938]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 07:44:21 np0005626463.localdomain sudo[35969]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pqbcwytlzimwoobrgdldwufxqdzxuwyb ; /usr/bin/python3
Feb 23 07:44:21 np0005626463.localdomain sudo[35969]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:44:21 np0005626463.localdomain python3[35971]: ansible-blockinfile Invoked with create=True path=/tmp/ansible.2_n_ngzjtmphosts insertbefore=BOF block=172.17.0.106 np0005626463.localdomain np0005626463
                                                         172.18.0.106 np0005626463.storage.localdomain np0005626463.storage
                                                         172.20.0.106 np0005626463.storagemgmt.localdomain np0005626463.storagemgmt
                                                         172.17.0.106 np0005626463.internalapi.localdomain np0005626463.internalapi
                                                         172.19.0.106 np0005626463.tenant.localdomain np0005626463.tenant
                                                         192.168.122.106 np0005626463.ctlplane.localdomain np0005626463.ctlplane
                                                         172.17.0.107 np0005626465.localdomain np0005626465
                                                         172.18.0.107 np0005626465.storage.localdomain np0005626465.storage
                                                         172.20.0.107 np0005626465.storagemgmt.localdomain np0005626465.storagemgmt
                                                         172.17.0.107 np0005626465.internalapi.localdomain np0005626465.internalapi
                                                         172.19.0.107 np0005626465.tenant.localdomain np0005626465.tenant
                                                         192.168.122.107 np0005626465.ctlplane.localdomain np0005626465.ctlplane
                                                         172.17.0.108 np0005626466.localdomain np0005626466
                                                         172.18.0.108 np0005626466.storage.localdomain np0005626466.storage
                                                         172.20.0.108 np0005626466.storagemgmt.localdomain np0005626466.storagemgmt
                                                         172.17.0.108 np0005626466.internalapi.localdomain np0005626466.internalapi
                                                         172.19.0.108 np0005626466.tenant.localdomain np0005626466.tenant
                                                         192.168.122.108 np0005626466.ctlplane.localdomain np0005626466.ctlplane
                                                         172.17.0.103 np0005626459.localdomain np0005626459
                                                         172.18.0.103 np0005626459.storage.localdomain np0005626459.storage
                                                         172.20.0.103 np0005626459.storagemgmt.localdomain np0005626459.storagemgmt
                                                         172.17.0.103 np0005626459.internalapi.localdomain np0005626459.internalapi
                                                         172.19.0.103 np0005626459.tenant.localdomain np0005626459.tenant
                                                         192.168.122.103 np0005626459.ctlplane.localdomain np0005626459.ctlplane
                                                         172.17.0.104 np0005626460.localdomain np0005626460
                                                         172.18.0.104 np0005626460.storage.localdomain np0005626460.storage
                                                         172.20.0.104 np0005626460.storagemgmt.localdomain np0005626460.storagemgmt
                                                         172.17.0.104 np0005626460.internalapi.localdomain np0005626460.internalapi
                                                         172.19.0.104 np0005626460.tenant.localdomain np0005626460.tenant
                                                         192.168.122.104 np0005626460.ctlplane.localdomain np0005626460.ctlplane
                                                         172.17.0.105 np0005626461.localdomain np0005626461
                                                         172.18.0.105 np0005626461.storage.localdomain np0005626461.storage
                                                         172.20.0.105 np0005626461.storagemgmt.localdomain np0005626461.storagemgmt
                                                         172.17.0.105 np0005626461.internalapi.localdomain np0005626461.internalapi
                                                         172.19.0.105 np0005626461.tenant.localdomain np0005626461.tenant
                                                         192.168.122.105 np0005626461.ctlplane.localdomain np0005626461.ctlplane
                                                         
                                                         192.168.122.100 undercloud.ctlplane.localdomain undercloud.ctlplane
                                                         192.168.122.99  overcloud.ctlplane.localdomain
                                                         172.18.0.134  overcloud.storage.localdomain
                                                         172.20.0.172  overcloud.storagemgmt.localdomain
                                                         172.17.0.129  overcloud.internalapi.localdomain
                                                         172.21.0.176  overcloud.localdomain
                                                          marker=# {mark} marker_begin=START_HOST_ENTRIES_FOR_STACK: overcloud marker_end=END_HOST_ENTRIES_FOR_STACK: overcloud state=present backup=False unsafe_writes=False insertafter=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 07:44:21 np0005626463.localdomain sudo[35969]: pam_unix(sudo:session): session closed for user root
Feb 23 07:44:21 np0005626463.localdomain sudo[35985]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vxvrgcfxbfiiguofrffysicuvtmsojvp ; /usr/bin/python3
Feb 23 07:44:21 np0005626463.localdomain sudo[35985]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:44:21 np0005626463.localdomain python3[35987]: ansible-ansible.legacy.command Invoked with _raw_params=cp "/tmp/ansible.2_n_ngzjtmphosts" "/etc/hosts" _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 07:44:21 np0005626463.localdomain sudo[35985]: pam_unix(sudo:session): session closed for user root
Feb 23 07:44:22 np0005626463.localdomain sudo[36002]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ayvkcsbfocugnxwrwrqbtpfbcpzgsphl ; /usr/bin/python3
Feb 23 07:44:22 np0005626463.localdomain sudo[36002]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:44:22 np0005626463.localdomain python3[36004]: ansible-file Invoked with path=/tmp/ansible.2_n_ngzjtmphosts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 07:44:22 np0005626463.localdomain sudo[36002]: pam_unix(sudo:session): session closed for user root
Feb 23 07:44:23 np0005626463.localdomain sudo[36018]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-diespaqiakzoxrilxycqrftufpwaknat ; /usr/bin/python3
Feb 23 07:44:23 np0005626463.localdomain sudo[36018]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:44:23 np0005626463.localdomain python3[36020]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -q --whatprovides rhosp-release _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 07:44:23 np0005626463.localdomain sudo[36018]: pam_unix(sudo:session): session closed for user root
Feb 23 07:44:23 np0005626463.localdomain sudo[36035]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qrmaytjggxeqbdriqxsgpnpsovascbjl ; /usr/bin/python3
Feb 23 07:44:23 np0005626463.localdomain sudo[36035]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:44:23 np0005626463.localdomain python3[36037]: ansible-ansible.legacy.dnf Invoked with name=['rhosp-release'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Feb 23 07:44:27 np0005626463.localdomain sudo[36035]: pam_unix(sudo:session): session closed for user root
Feb 23 07:44:28 np0005626463.localdomain sudo[36054]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bbozrmwfofzztxcharlphsvlqbjngdnu ; /usr/bin/python3
Feb 23 07:44:28 np0005626463.localdomain sudo[36054]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:44:28 np0005626463.localdomain python3[36056]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -q --whatprovides driverctl lvm2 jq nftables openvswitch openstack-heat-agents openstack-selinux os-net-config python3-libselinux python3-pyyaml puppet-tripleo rsync tmpwatch sysstat iproute-tc _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 07:44:28 np0005626463.localdomain sudo[36054]: pam_unix(sudo:session): session closed for user root
Feb 23 07:44:28 np0005626463.localdomain sudo[36071]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cfnoaqojxfscrpziscbcxagthjrzssfg ; /usr/bin/python3
Feb 23 07:44:28 np0005626463.localdomain sudo[36071]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:44:29 np0005626463.localdomain python3[36073]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'jq', 'nftables', 'openvswitch', 'openstack-heat-agents', 'openstack-selinux', 'os-net-config', 'python3-libselinux', 'python3-pyyaml', 'puppet-tripleo', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Feb 23 07:44:34 np0005626463.localdomain sudo[36095]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 07:44:34 np0005626463.localdomain sudo[36095]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 07:44:34 np0005626463.localdomain sudo[36095]: pam_unix(sudo:session): session closed for user root
Feb 23 07:44:34 np0005626463.localdomain sudo[36114]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 23 07:44:34 np0005626463.localdomain sudo[36114]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 07:44:35 np0005626463.localdomain sudo[36114]: pam_unix(sudo:session): session closed for user root
Feb 23 07:44:35 np0005626463.localdomain sudo[36213]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 07:44:35 np0005626463.localdomain sudo[36213]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 07:44:35 np0005626463.localdomain sudo[36213]: pam_unix(sudo:session): session closed for user root
Feb 23 07:44:41 np0005626463.localdomain groupadd[36318]: group added to /etc/group: name=puppet, GID=52
Feb 23 07:44:41 np0005626463.localdomain groupadd[36318]: group added to /etc/gshadow: name=puppet
Feb 23 07:44:41 np0005626463.localdomain groupadd[36318]: new group: name=puppet, GID=52
Feb 23 07:44:41 np0005626463.localdomain useradd[36325]: new user: name=puppet, UID=52, GID=52, home=/var/lib/puppet, shell=/sbin/nologin, from=none
Feb 23 07:44:57 np0005626463.localdomain sshd[37119]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 07:44:59 np0005626463.localdomain sshd[37119]: Invalid user admin from 80.94.95.115 port 58260
Feb 23 07:44:59 np0005626463.localdomain sshd[37119]: Connection closed by invalid user admin 80.94.95.115 port 58260 [preauth]
Feb 23 07:45:02 np0005626463.localdomain sshd[37143]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 07:45:02 np0005626463.localdomain sshd[37149]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 07:45:03 np0005626463.localdomain sshd[37149]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 07:45:03 np0005626463.localdomain sshd[37143]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 07:45:35 np0005626463.localdomain kernel: SELinux:  Converting 2700 SID table entries...
Feb 23 07:45:35 np0005626463.localdomain kernel: SELinux:  policy capability network_peer_controls=1
Feb 23 07:45:35 np0005626463.localdomain kernel: SELinux:  policy capability open_perms=1
Feb 23 07:45:35 np0005626463.localdomain kernel: SELinux:  policy capability extended_socket_class=1
Feb 23 07:45:35 np0005626463.localdomain kernel: SELinux:  policy capability always_check_network=0
Feb 23 07:45:35 np0005626463.localdomain kernel: SELinux:  policy capability cgroup_seclabel=1
Feb 23 07:45:35 np0005626463.localdomain kernel: SELinux:  policy capability nnp_nosuid_transition=1
Feb 23 07:45:35 np0005626463.localdomain kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Feb 23 07:45:35 np0005626463.localdomain sudo[37351]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 07:45:35 np0005626463.localdomain dbus-broker-launch[754]: avc:  op=load_policy lsm=selinux seqno=5 res=1
Feb 23 07:45:35 np0005626463.localdomain sudo[37351]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 07:45:35 np0005626463.localdomain sudo[37351]: pam_unix(sudo:session): session closed for user root
Feb 23 07:45:36 np0005626463.localdomain sudo[37368]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 23 07:45:36 np0005626463.localdomain dbus-broker-launch[754]: avc:  op=load_policy lsm=selinux seqno=6 res=1
Feb 23 07:45:36 np0005626463.localdomain sudo[37368]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 07:45:36 np0005626463.localdomain systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 23 07:45:36 np0005626463.localdomain systemd[1]: Starting man-db-cache-update.service...
Feb 23 07:45:36 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 07:45:36 np0005626463.localdomain systemd-rc-local-generator[37461]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 07:45:36 np0005626463.localdomain systemd-sysv-generator[37465]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 07:45:36 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 07:45:36 np0005626463.localdomain systemd[1]: Queuing reload/restart jobs for marked units…
Feb 23 07:45:36 np0005626463.localdomain sudo[37368]: pam_unix(sudo:session): session closed for user root
Feb 23 07:45:36 np0005626463.localdomain systemd[1]: man-db-cache-update.service: Deactivated successfully.
Feb 23 07:45:36 np0005626463.localdomain systemd[1]: Finished man-db-cache-update.service.
Feb 23 07:45:36 np0005626463.localdomain systemd[1]: run-rbec30ba95a1043c08d2e4681aac6c730.service: Deactivated successfully.
Feb 23 07:45:37 np0005626463.localdomain sudo[37904]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 07:45:37 np0005626463.localdomain sudo[37904]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 07:45:37 np0005626463.localdomain sudo[37904]: pam_unix(sudo:session): session closed for user root
Feb 23 07:45:37 np0005626463.localdomain sudo[36071]: pam_unix(sudo:session): session closed for user root
Feb 23 07:45:39 np0005626463.localdomain sudo[37932]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ddbvppsffwfrddoleojlhsfszolhzhif ; /usr/bin/python3
Feb 23 07:45:39 np0005626463.localdomain sudo[37932]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:45:39 np0005626463.localdomain python3[37934]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 jq nftables openvswitch openstack-heat-agents openstack-selinux os-net-config python3-libselinux python3-pyyaml puppet-tripleo rsync tmpwatch sysstat iproute-tc _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 07:45:39 np0005626463.localdomain sudo[37932]: pam_unix(sudo:session): session closed for user root
Feb 23 07:45:40 np0005626463.localdomain sudo[38071]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ejnrznprsepbivvxpihsghxftjsidiiw ; /usr/bin/python3
Feb 23 07:45:40 np0005626463.localdomain sudo[38071]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:45:40 np0005626463.localdomain python3[38073]: ansible-ansible.legacy.systemd Invoked with name=openvswitch enabled=True state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 07:45:40 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 07:45:40 np0005626463.localdomain systemd-sysv-generator[38101]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 07:45:40 np0005626463.localdomain systemd-rc-local-generator[38098]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 07:45:41 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 07:45:41 np0005626463.localdomain sudo[38071]: pam_unix(sudo:session): session closed for user root
Feb 23 07:45:42 np0005626463.localdomain sudo[38125]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uvrtuxgzgcuwwgqykzccpluipfqmkrol ; /usr/bin/python3
Feb 23 07:45:42 np0005626463.localdomain sudo[38125]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:45:42 np0005626463.localdomain python3[38127]: ansible-file Invoked with path=/var/lib/heat-config/tripleo-config-download state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 07:45:42 np0005626463.localdomain sudo[38125]: pam_unix(sudo:session): session closed for user root
Feb 23 07:45:42 np0005626463.localdomain sudo[38141]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-irydttqoinnimfyekcyypluyalleawrg ; /usr/bin/python3
Feb 23 07:45:42 np0005626463.localdomain sudo[38141]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:45:42 np0005626463.localdomain python3[38143]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -q --whatprovides openstack-network-scripts _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 07:45:42 np0005626463.localdomain sudo[38141]: pam_unix(sudo:session): session closed for user root
Feb 23 07:45:43 np0005626463.localdomain sudo[38158]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-freiwtniifvtrgwrjutrmscqzncnkykp ; /usr/bin/python3
Feb 23 07:45:43 np0005626463.localdomain sudo[38158]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:45:43 np0005626463.localdomain python3[38160]: ansible-systemd Invoked with name=NetworkManager enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Feb 23 07:45:43 np0005626463.localdomain sudo[38158]: pam_unix(sudo:session): session closed for user root
Feb 23 07:45:44 np0005626463.localdomain sudo[38176]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gxqyrltssiwsyfyfdgsyastdsmyqppwi ; /usr/bin/python3
Feb 23 07:45:44 np0005626463.localdomain sudo[38176]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:45:44 np0005626463.localdomain python3[38178]: ansible-ini_file Invoked with path=/etc/NetworkManager/NetworkManager.conf state=present no_extra_spaces=True section=main option=dns value=none backup=True exclusive=True allow_no_value=False create=True unsafe_writes=False values=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 07:45:44 np0005626463.localdomain sudo[38176]: pam_unix(sudo:session): session closed for user root
Feb 23 07:45:44 np0005626463.localdomain sudo[38194]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dhlohzmiipugvsslwpackphntwhgwvfo ; /usr/bin/python3
Feb 23 07:45:44 np0005626463.localdomain sudo[38194]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:45:44 np0005626463.localdomain python3[38196]: ansible-ini_file Invoked with path=/etc/NetworkManager/NetworkManager.conf state=present no_extra_spaces=True section=main option=rc-manager value=unmanaged backup=True exclusive=True allow_no_value=False create=True unsafe_writes=False values=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 07:45:44 np0005626463.localdomain sudo[38194]: pam_unix(sudo:session): session closed for user root
Feb 23 07:45:44 np0005626463.localdomain sudo[38212]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jbteukacocyxtdwvqsgtsfvderifhddg ; /usr/bin/python3
Feb 23 07:45:44 np0005626463.localdomain sudo[38212]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:45:45 np0005626463.localdomain python3[38214]: ansible-ansible.legacy.systemd Invoked with name=NetworkManager state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 23 07:45:45 np0005626463.localdomain systemd[1]: Reloading Network Manager...
Feb 23 07:45:45 np0005626463.localdomain NetworkManager[5974]: <info>  [1771832745.1935] audit: op="reload" arg="0" pid=38217 uid=0 result="success"
Feb 23 07:45:45 np0005626463.localdomain NetworkManager[5974]: <info>  [1771832745.1943] config: signal: SIGHUP,config-files,values,values-user,no-auto-default,dns-mode,rc-manager (/etc/NetworkManager/NetworkManager.conf (lib: 00-server.conf) (run: 15-carrier-timeout.conf))
Feb 23 07:45:45 np0005626463.localdomain NetworkManager[5974]: <info>  [1771832745.1944] dns-mgr: init: dns=none,systemd-resolved rc-manager=unmanaged
Feb 23 07:45:45 np0005626463.localdomain systemd[1]: Reloaded Network Manager.
Feb 23 07:45:45 np0005626463.localdomain sudo[38212]: pam_unix(sudo:session): session closed for user root
Feb 23 07:45:45 np0005626463.localdomain sshd[38228]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 07:45:45 np0005626463.localdomain sudo[38232]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nxlbxoffativplpxnkboefpadrqiwpmm ; /usr/bin/python3
Feb 23 07:45:45 np0005626463.localdomain sudo[38232]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:45:45 np0005626463.localdomain python3[38235]: ansible-ansible.legacy.command Invoked with _raw_params=ln -f -s /usr/share/openstack-puppet/modules/* /etc/puppet/modules/ _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 07:45:45 np0005626463.localdomain sudo[38232]: pam_unix(sudo:session): session closed for user root
Feb 23 07:45:45 np0005626463.localdomain sshd[38228]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 07:45:46 np0005626463.localdomain sudo[38250]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qafbjcsenhyngulihoyhjhpajsyiqlka ; /usr/bin/python3
Feb 23 07:45:46 np0005626463.localdomain sudo[38250]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:45:46 np0005626463.localdomain python3[38252]: ansible-stat Invoked with path=/usr/bin/ansible-playbook follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 23 07:45:46 np0005626463.localdomain sudo[38250]: pam_unix(sudo:session): session closed for user root
Feb 23 07:45:46 np0005626463.localdomain sudo[38268]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vickezodraljortirwrybzfcraqmylda ; /usr/bin/python3
Feb 23 07:45:46 np0005626463.localdomain sudo[38268]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:45:46 np0005626463.localdomain python3[38270]: ansible-stat Invoked with path=/usr/bin/ansible-playbook-3 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 23 07:45:46 np0005626463.localdomain sudo[38268]: pam_unix(sudo:session): session closed for user root
Feb 23 07:45:46 np0005626463.localdomain sudo[38284]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rytsngealthtnebqnepomvpwwzycxusq ; /usr/bin/python3
Feb 23 07:45:46 np0005626463.localdomain sudo[38284]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:45:46 np0005626463.localdomain python3[38286]: ansible-file Invoked with state=link src=/usr/bin/ansible-playbook path=/usr/bin/ansible-playbook-3 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 07:45:46 np0005626463.localdomain sudo[38284]: pam_unix(sudo:session): session closed for user root
Feb 23 07:45:47 np0005626463.localdomain sudo[38300]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uyevnwhpljtuurrovxxhmvovszznsuzh ; /usr/bin/python3
Feb 23 07:45:47 np0005626463.localdomain sudo[38300]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:45:47 np0005626463.localdomain python3[38302]: ansible-tempfile Invoked with state=file prefix=ansible. suffix= path=None
Feb 23 07:45:47 np0005626463.localdomain sudo[38300]: pam_unix(sudo:session): session closed for user root
Feb 23 07:45:48 np0005626463.localdomain sudo[38316]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ewrskvxowgajljtftgioccnhigmvjmuc ; /usr/bin/python3
Feb 23 07:45:48 np0005626463.localdomain sudo[38316]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:45:48 np0005626463.localdomain python3[38318]: ansible-stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 23 07:45:48 np0005626463.localdomain sudo[38316]: pam_unix(sudo:session): session closed for user root
Feb 23 07:45:48 np0005626463.localdomain sudo[38332]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xulcrworbjevenhjxyifsbfgudjwziua ; /usr/bin/python3
Feb 23 07:45:48 np0005626463.localdomain sudo[38332]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:45:49 np0005626463.localdomain python3[38334]: ansible-blockinfile Invoked with path=/tmp/ansible.vetg1p3y block=[192.168.122.106]*,[np0005626463.ctlplane.localdomain]*,[172.17.0.106]*,[np0005626463.internalapi.localdomain]*,[172.18.0.106]*,[np0005626463.storage.localdomain]*,[172.20.0.106]*,[np0005626463.storagemgmt.localdomain]*,[172.19.0.106]*,[np0005626463.tenant.localdomain]*,[np0005626463.localdomain]*,[np0005626463]* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC/Caj4zYKd24ctvaRU1Hf9nT058OF4bRnDJ3bHimmkyIL7cccXAxo3lx50wZHWRYBhF5Wes6TmqnUTTK1h5wVdI8f7YtQ9IyMIlfoEiTThF5PgODVuRYq+YGjFIy7MTPyBnB2428aT4dlYqHSuxK2gL6ALlCJHNyeh3RW3jCOG89veDoRmbqHGoaD+xPRnfsdHLoLFNfxT4UJiKRuqsEd5fNtc392ROSa5XM3PPIs3YTypYmpfFHs1B1j+y6oZV8Ha/QXqURpI7/aJmfnDzXLMsLWp4GRpkwzljvNp87S5HL+kJMo79n0Vmh2JdN1orNP/4A2t/TENckHbrZCm+YmPqUqvpHkAZfFfmvP62YZTPq/qOjBMMq6ulGSHd2I4XfE7NNZRKoS3G4HVlBb0ONS13PaWx9rrJCRlF64L1dHSt9zpKrvRbWkSdXA0PwwehrU5/OBo1IY4WsRlWmPeET1/dFWiIr1t9uGjp5vmACAx7rnC6G5qSEhQ3/k1Wa57k/k=
                                                         [192.168.122.107]*,[np0005626465.ctlplane.localdomain]*,[172.17.0.107]*,[np0005626465.internalapi.localdomain]*,[172.18.0.107]*,[np0005626465.storage.localdomain]*,[172.20.0.107]*,[np0005626465.storagemgmt.localdomain]*,[172.19.0.107]*,[np0005626465.tenant.localdomain]*,[np0005626465.localdomain]*,[np0005626465]* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCUc8l2oYgfdO7xb3vN27co3Q/sFNU6Rw5wThiW1JMfeIzI90ZzS/L+BpsDsX8q2CW9QOHXrbUormpGsiNnix5j1P29Jc6e9A2mDlipXBrFSUiVZa8UOL03lFSz4nElapkASin2GCdHqy7//gGdQMKRP62VXpdhofb7i/N/gGoV5hSc8Q36KFDbWpvPkhD5H8nZtAfyxM99KwlC62D8jSN+gdoRtMRFPQTtyvyskyrgnXGC6xV71WTa6LJ6Meo7tfj4JlvDAWwlD+f9Ruu2ty2aHd2feVVKYvxZ4Z45iSfJnNxRFJvu1QOY0IU4Fj942leKwr6f0B5ogPFlTI7wRrAB1d9tri1WW2aL1AqYhdZscWi0VArYxLQr7BCVqz8KgFIzjbPoJ7uYnWcuDSiWlC1NJVO7Ij2natf8wZyvSyH+vydamkyoaNwxMnm4qs0/rvjwL49MdrHB79rXjHYJpt/JCBvn9a/rh5KqVH40P00DP35H71zyHPCSu1L20S/wY1k=
                                                         [192.168.122.108]*,[np0005626466.ctlplane.localdomain]*,[172.17.0.108]*,[np0005626466.internalapi.localdomain]*,[172.18.0.108]*,[np0005626466.storage.localdomain]*,[172.20.0.108]*,[np0005626466.storagemgmt.localdomain]*,[172.19.0.108]*,[np0005626466.tenant.localdomain]*,[np0005626466.localdomain]*,[np0005626466]* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQD4dg5LfbOyIHJudQjfDyIcqYXRqMUeYQIpjQPmNS0Tl7/EpBaYixjqlNovKIWOwkS4E2n4hwPLSTGSihYb5BeUDw32T80RumycS2tjBCSLiuq93xpTOaL2X+7wykkOSfY5xya13qrTg0ROJip0B6PSSF+Rn28SAKLh91euCdRaxWTAMeOSTP9WeCA3d0gsgb4xSMMWZxR4o1BU2bixjAcJHAlKYDc1OGpKkirRoziu9Y4nq2lmbwTg5HiS8STVkqyGHba9k6IC0eF2ZmT6M2thoHatYVtjuUeEE9bSvaAFB8oSI9Np6+OaluvuoKJYjRA3dzEQOi4ft/wwUrJfvyypDAxKBkxo7lCWIDEBK5Zb9BVoo68psz2IVPNGNZJtKXiq58CAqZTR02l/wEq4wB1/hp7ZW+ZMnHQUq1FpGITIA89KZeL9xNlnHqYak58B2GCYgK6OdvWktr4WHN8nbEmwZvaTrijZvnww7h2FQG4BMcSlO6AWKAdjksJZlVDYLJs=
                                                         [192.168.122.103]*,[np0005626459.ctlplane.localdomain]*,[172.17.0.103]*,[np0005626459.internalapi.localdomain]*,[172.18.0.103]*,[np0005626459.storage.localdomain]*,[172.20.0.103]*,[np0005626459.storagemgmt.localdomain]*,[172.19.0.103]*,[np0005626459.tenant.localdomain]*,[np0005626459.localdomain]*,[np0005626459]* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC9VsrIfV6Z4AiMtHfmjOpcBCt5sMsGmP0fOSak1UBP4r9lW4eYyoJY7Rtt1LDAcbGqdL3Nh3yc8ub0ekpXF6MA0vKucLb+jtjexv6t21W2grJ+ucwsvDhTDhDXmOUwD5G7A9Zj2WDqt/DN4DxeEqvQ6v1dSQaG+17BVPvM7mhgd5CSYOdUphCC81TPZgj3xyK31Q89biIS6pCBSKnsyN7qcU38bFGvRN0sTFaFt9KrIUfJJdcAZudw5Q/R775pmaaeHTSVPL05gE7dyz8RicEpenh6X0aZCOVt0+4VBnfXXSIL9QIwjrarPPKRdtmQY7dZ3dVNI1ZWA5YOl0y6R3fmxaRV5y1ZkDW6vG0463hYjKaAVqILAAPZGzhuzL7/1zxIv0guUB58tOUrCkkPIRzd6NQLL2j8L7RLIj3bZjG2xf0WiierxPsCEhl3wmdIVRUReE6jYalNGlscGUr1JWproKoaQqfck0OWhGy7jCCe8Gd8a/pr7jtg+X3bEMQ3HAc=
                                                         [192.168.122.104]*,[np0005626460.ctlplane.localdomain]*,[172.17.0.104]*,[np0005626460.internalapi.localdomain]*,[172.18.0.104]*,[np0005626460.storage.localdomain]*,[172.20.0.104]*,[np0005626460.storagemgmt.localdomain]*,[172.19.0.104]*,[np0005626460.tenant.localdomain]*,[np0005626460.localdomain]*,[np0005626460]* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCeQmwl5IUCA7h6xphf+o3WARi0Xlj+0K08ltN/FCX7iF0EALCfDqtKOHz7wv5gS04Zx4aeNfcVHv9bHLRJxTPzliSNVutqA7vdFa0R/kRMdNzkqSOCuJ64sQ8GwSOHSrcFy7qC87BuP6xB9atSBjpAEB4NZOuXbvmSN/dCa/nNpUWoWNNg3eR5AalrExCptFYZ4E7YWvJ6HdZpr1QhcAJW0V1y4+u4FfzxHT2SQfGmua4TFHH1lUMiMrgAoELLe+pYdnWooEhRlkPulWy/wOyNz7aCCDP462XBhCc0CmiBDRwMBaJISck1pJCOIksvu8TYa6Fp8aayZqJvbUJYl5C1Z/o+zgHMTjeec0Th5GIuw9XUJkkx8TT5Fh7aWJvX9BbHlMaJjAqc+G/wiIImvKlsuIsovU6TH0P/XiysoWXeUWM7JqR8Y/05+yELy+xAMKT7PfEXE1fWOlGcCJsarLYGhh/7Jypwfh8Y/wOtYdKOGODxDnzq2f2VySsEiAf0EL0=
                                                         [192.168.122.105]*,[np0005626461.ctlplane.localdomain]*,[172.17.0.105]*,[np0005626461.internalapi.localdomain]*,[172.18.0.105]*,[np0005626461.storage.localdomain]*,[172.20.0.105]*,[np0005626461.storagemgmt.localdomain]*,[172.19.0.105]*,[np0005626461.tenant.localdomain]*,[np0005626461.localdomain]*,[np0005626461]* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDDBCzU24t9gA5R+exm4rHJ2VytHuq8uUoKuu6SZ07dskKR77n7TwlsZhsDjpzwsddHd+lvsfvOVmolxjJsCmq7LJRMGA/mczHXsGGb43YPZPKsiJ6KMPDORy5/ihhnqixBYVmBGtdPu/Hh/udGnymZgR/RYGltDDHoCfGGiEcHJSIuf/Bv2Uv4xFnxFjDrWQFrkJ5Grq1xC7cGXgC3gAiTCjGHkG9rb/oyTUjjM8LaaRYIjeoDQZu1/8y5pl6cnhW21VTA+u55SkSimb/g5oOuSmrv899iHFwb54uLINXvA4aTtduUnxNQBVRyFvWa3yCZXVJeYlcVP8Q9tljn9anN1aISnS311Jmay6zUY927bxnzrpkwaV7Ggwtvi6vlVy84ZvOJ/IJ2boDiMujh1ZpT3bxXG3Oy0EjfBVbpkS6r2MbGTPj/xWnosJ6JNVbb9LW7Ftfi3/NFfAb7PpTgY036DA8LYoYIfqxVJUhlo5fJjqqOLa/zbvZVwrFCG+Zm160=
                                                          create=True state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END unsafe_writes=False insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 07:45:49 np0005626463.localdomain sudo[38332]: pam_unix(sudo:session): session closed for user root
Feb 23 07:45:49 np0005626463.localdomain sudo[38348]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-efbelvomvahyglzmfikmjjuolnapxbwk ; /usr/bin/python3
Feb 23 07:45:49 np0005626463.localdomain sudo[38348]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:45:49 np0005626463.localdomain python3[38350]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.vetg1p3y' > /etc/ssh/ssh_known_hosts _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 07:45:49 np0005626463.localdomain sudo[38348]: pam_unix(sudo:session): session closed for user root
Feb 23 07:45:49 np0005626463.localdomain sudo[38366]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hcokjlpjmkeffckmflcuuxbyfqzufuen ; /usr/bin/python3
Feb 23 07:45:49 np0005626463.localdomain sudo[38366]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:45:49 np0005626463.localdomain python3[38368]: ansible-file Invoked with path=/tmp/ansible.vetg1p3y state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 07:45:49 np0005626463.localdomain sudo[38366]: pam_unix(sudo:session): session closed for user root
Feb 23 07:45:50 np0005626463.localdomain sudo[38382]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pemhteseezzghyrvkqpcalczdbfunkpa ; /usr/bin/python3
Feb 23 07:45:50 np0005626463.localdomain sudo[38382]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:45:50 np0005626463.localdomain python3[38384]: ansible-file Invoked with path=/var/log/journal state=directory mode=0750 owner=root group=root setype=var_log_t recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 23 07:45:50 np0005626463.localdomain sudo[38382]: pam_unix(sudo:session): session closed for user root
Feb 23 07:45:50 np0005626463.localdomain sudo[38398]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bexsvgvedyaioucdpbplhekpigvtrhbs ; /usr/bin/python3
Feb 23 07:45:50 np0005626463.localdomain sudo[38398]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:45:50 np0005626463.localdomain python3[38400]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-active cloud-init.service || systemctl is-enabled cloud-init.service _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 07:45:50 np0005626463.localdomain sudo[38398]: pam_unix(sudo:session): session closed for user root
Feb 23 07:45:51 np0005626463.localdomain sudo[38416]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ittbbsldrauxhxfkbkfbaskerqnkzlew ; /usr/bin/python3
Feb 23 07:45:51 np0005626463.localdomain sudo[38416]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:45:51 np0005626463.localdomain python3[38418]: ansible-ansible.legacy.command Invoked with _raw_params=cat /proc/cmdline | grep -q cloud-init=disabled _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 07:45:51 np0005626463.localdomain sudo[38416]: pam_unix(sudo:session): session closed for user root
Feb 23 07:45:51 np0005626463.localdomain sudo[38435]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jaxsofykugdrfztnusqoitlczuhxzvhj ; /usr/bin/python3
Feb 23 07:45:51 np0005626463.localdomain sudo[38435]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:45:51 np0005626463.localdomain python3[38437]: ansible-community.general.cloud_init_data_facts Invoked with filter=status
Feb 23 07:45:51 np0005626463.localdomain sudo[38435]: pam_unix(sudo:session): session closed for user root
Feb 23 07:45:51 np0005626463.localdomain sudo[38451]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-binogkzpfftjatlcelrobgbkqfxqwuvn ; /usr/bin/python3
Feb 23 07:45:51 np0005626463.localdomain sudo[38451]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:45:52 np0005626463.localdomain sudo[38451]: pam_unix(sudo:session): session closed for user root
Feb 23 07:45:52 np0005626463.localdomain sudo[38499]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pbethwfnfqebcjvixutvgezfiteseqeb ; /usr/bin/python3
Feb 23 07:45:52 np0005626463.localdomain sudo[38499]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:45:52 np0005626463.localdomain sudo[38499]: pam_unix(sudo:session): session closed for user root
Feb 23 07:45:52 np0005626463.localdomain sudo[38542]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nzjetppjonnypvztyidlifrcfnnecdsf ; /usr/bin/python3
Feb 23 07:45:52 np0005626463.localdomain sudo[38542]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:45:52 np0005626463.localdomain sudo[38542]: pam_unix(sudo:session): session closed for user root
Feb 23 07:45:54 np0005626463.localdomain sudo[38572]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mqrvvdziphbbadtpmvhycqosozsmddjn ; /usr/bin/python3
Feb 23 07:45:54 np0005626463.localdomain sudo[38572]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:45:54 np0005626463.localdomain python3[38574]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -q --whatprovides tuned tuned-profiles-cpu-partitioning _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 07:45:54 np0005626463.localdomain sudo[38572]: pam_unix(sudo:session): session closed for user root
Feb 23 07:45:54 np0005626463.localdomain sudo[38589]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-funaalblnenabpugnvhubmccjlkxudyw ; /usr/bin/python3
Feb 23 07:45:54 np0005626463.localdomain sudo[38589]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:45:54 np0005626463.localdomain python3[38591]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Feb 23 07:45:57 np0005626463.localdomain dbus-broker-launch[750]: Noticed file-system modification, trigger reload.
Feb 23 07:45:57 np0005626463.localdomain dbus-broker-launch[750]: Noticed file-system modification, trigger reload.
Feb 23 07:45:58 np0005626463.localdomain systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 23 07:45:58 np0005626463.localdomain systemd[1]: Starting man-db-cache-update.service...
Feb 23 07:45:58 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 07:45:58 np0005626463.localdomain systemd-rc-local-generator[38680]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 07:45:58 np0005626463.localdomain systemd-sysv-generator[38684]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 07:45:58 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 07:45:58 np0005626463.localdomain systemd[1]: Queuing reload/restart jobs for marked units…
Feb 23 07:45:58 np0005626463.localdomain systemd[1]: Stopping Dynamic System Tuning Daemon...
Feb 23 07:45:58 np0005626463.localdomain systemd[1]: tuned.service: Deactivated successfully.
Feb 23 07:45:58 np0005626463.localdomain systemd[1]: Stopped Dynamic System Tuning Daemon.
Feb 23 07:45:58 np0005626463.localdomain systemd[1]: tuned.service: Consumed 1.572s CPU time.
Feb 23 07:45:58 np0005626463.localdomain systemd[1]: Starting Dynamic System Tuning Daemon...
Feb 23 07:45:58 np0005626463.localdomain systemd[1]: man-db-cache-update.service: Deactivated successfully.
Feb 23 07:45:58 np0005626463.localdomain systemd[1]: Finished man-db-cache-update.service.
Feb 23 07:45:58 np0005626463.localdomain systemd[1]: run-r3c66311e429d44ecb657594e1d396238.service: Deactivated successfully.
Feb 23 07:45:59 np0005626463.localdomain sshd[39003]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 07:45:59 np0005626463.localdomain systemd[1]: Started Dynamic System Tuning Daemon.
Feb 23 07:45:59 np0005626463.localdomain systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 23 07:45:59 np0005626463.localdomain sshd[39003]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 07:45:59 np0005626463.localdomain systemd[1]: Starting man-db-cache-update.service...
Feb 23 07:46:00 np0005626463.localdomain systemd[1]: man-db-cache-update.service: Deactivated successfully.
Feb 23 07:46:00 np0005626463.localdomain systemd[1]: Finished man-db-cache-update.service.
Feb 23 07:46:00 np0005626463.localdomain systemd[1]: run-rb6d2207b3fb54b0dbea8f6d4ee22507c.service: Deactivated successfully.
Feb 23 07:46:00 np0005626463.localdomain sudo[38589]: pam_unix(sudo:session): session closed for user root
Feb 23 07:46:00 np0005626463.localdomain sudo[39027]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rphdirkduebdtjgrgsznuytwklkgnixf ; /usr/bin/python3
Feb 23 07:46:00 np0005626463.localdomain sudo[39027]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:46:01 np0005626463.localdomain python3[39029]: ansible-systemd Invoked with name=tuned state=restarted enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 07:46:01 np0005626463.localdomain systemd[1]: Stopping Dynamic System Tuning Daemon...
Feb 23 07:46:01 np0005626463.localdomain systemd[1]: tuned.service: Deactivated successfully.
Feb 23 07:46:01 np0005626463.localdomain systemd[1]: Stopped Dynamic System Tuning Daemon.
Feb 23 07:46:01 np0005626463.localdomain systemd[1]: Starting Dynamic System Tuning Daemon...
Feb 23 07:46:02 np0005626463.localdomain systemd[1]: Started Dynamic System Tuning Daemon.
Feb 23 07:46:02 np0005626463.localdomain sudo[39027]: pam_unix(sudo:session): session closed for user root
Feb 23 07:46:02 np0005626463.localdomain sudo[39222]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jkojkyaybvpxuqyjrhmgryxumvqyiagv ; PATH=/bin:/usr/bin:/sbin:/usr/sbin /usr/bin/python3
Feb 23 07:46:02 np0005626463.localdomain sudo[39222]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:46:03 np0005626463.localdomain python3[39224]: ansible-ansible.legacy.command Invoked with _raw_params=which tuned-adm _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 07:46:03 np0005626463.localdomain sudo[39222]: pam_unix(sudo:session): session closed for user root
Feb 23 07:46:03 np0005626463.localdomain sudo[39239]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fpzporcpldqfcrnbqsfsdpqbuwrjqocu ; /usr/bin/python3
Feb 23 07:46:03 np0005626463.localdomain sudo[39239]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:46:03 np0005626463.localdomain python3[39241]: ansible-slurp Invoked with src=/etc/tuned/active_profile
Feb 23 07:46:03 np0005626463.localdomain sudo[39239]: pam_unix(sudo:session): session closed for user root
Feb 23 07:46:04 np0005626463.localdomain sudo[39255]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ufezuuhjetxscidxxurdvhtzwazkzeqj ; /usr/bin/python3
Feb 23 07:46:04 np0005626463.localdomain sudo[39255]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:46:04 np0005626463.localdomain python3[39257]: ansible-stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 23 07:46:04 np0005626463.localdomain sudo[39255]: pam_unix(sudo:session): session closed for user root
Feb 23 07:46:04 np0005626463.localdomain sudo[39271]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zjkdgvwlxitwqtilkxldbchzqdzomffw ; PATH=/bin:/usr/bin:/sbin:/usr/sbin /usr/bin/python3
Feb 23 07:46:04 np0005626463.localdomain sudo[39271]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:46:04 np0005626463.localdomain python3[39273]: ansible-ansible.legacy.command Invoked with _raw_params=tuned-adm profile throughput-performance _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 07:46:06 np0005626463.localdomain sudo[39271]: pam_unix(sudo:session): session closed for user root
Feb 23 07:46:06 np0005626463.localdomain sudo[39291]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-epqefveemkxtlwqwegqnhpikwsknybrm ; /usr/bin/python3
Feb 23 07:46:06 np0005626463.localdomain sudo[39291]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:46:06 np0005626463.localdomain python3[39293]: ansible-ansible.legacy.command Invoked with _raw_params=cat /proc/cmdline _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 07:46:06 np0005626463.localdomain sudo[39291]: pam_unix(sudo:session): session closed for user root
Feb 23 07:46:07 np0005626463.localdomain sudo[39308]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qlaunztwfnshzihvecrzigkfvfestxqf ; /usr/bin/python3
Feb 23 07:46:07 np0005626463.localdomain sudo[39308]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:46:07 np0005626463.localdomain python3[39310]: ansible-stat Invoked with path=/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova/nova.conf follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 23 07:46:07 np0005626463.localdomain sudo[39308]: pam_unix(sudo:session): session closed for user root
Feb 23 07:46:09 np0005626463.localdomain sudo[39324]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aokuzwncjejhjmaiunrojtefdlzsiudk ; /usr/bin/python3
Feb 23 07:46:09 np0005626463.localdomain sudo[39324]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:46:09 np0005626463.localdomain python3[39326]: ansible-replace Invoked with regexp=TRIPLEO_HEAT_TEMPLATE_KERNEL_ARGS dest=/etc/default/grub replace= path=/etc/default/grub backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 07:46:09 np0005626463.localdomain sudo[39324]: pam_unix(sudo:session): session closed for user root
Feb 23 07:46:14 np0005626463.localdomain sudo[39340]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-getucxizomjtvdelulqrwknesuqtxabj ; /usr/bin/python3
Feb 23 07:46:14 np0005626463.localdomain sudo[39340]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:46:15 np0005626463.localdomain python3[39342]: ansible-file Invoked with path=/etc/puppet/hieradata state=directory mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 07:46:15 np0005626463.localdomain sudo[39340]: pam_unix(sudo:session): session closed for user root
Feb 23 07:46:15 np0005626463.localdomain sudo[39388]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-poctmpjkaeyglvwziitaxochiplmmzek ; /usr/bin/python3
Feb 23 07:46:15 np0005626463.localdomain sudo[39388]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:46:15 np0005626463.localdomain python3[39390]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hiera.yaml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 07:46:15 np0005626463.localdomain sudo[39388]: pam_unix(sudo:session): session closed for user root
Feb 23 07:46:15 np0005626463.localdomain sudo[39433]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wcyixuctjdjniomwqxcdafuhhhrdwyge ; /usr/bin/python3
Feb 23 07:46:15 np0005626463.localdomain sudo[39433]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:46:16 np0005626463.localdomain python3[39435]: ansible-ansible.legacy.copy Invoked with mode=384 dest=/etc/puppet/hiera.yaml src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771832775.2933772-71384-262129182644031/source _original_basename=tmpbu91ngux follow=False checksum=aaf3699defba931d532f4955ae152f505046749a backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 07:46:16 np0005626463.localdomain sudo[39433]: pam_unix(sudo:session): session closed for user root
Feb 23 07:46:16 np0005626463.localdomain sudo[39463]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-emdgwpjsbcohzekrrbdrhaupdpyyntrg ; /usr/bin/python3
Feb 23 07:46:16 np0005626463.localdomain sudo[39463]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:46:16 np0005626463.localdomain python3[39465]: ansible-file Invoked with src=/etc/puppet/hiera.yaml dest=/etc/hiera.yaml state=link force=True path=/etc/hiera.yaml recurse=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 07:46:16 np0005626463.localdomain sudo[39463]: pam_unix(sudo:session): session closed for user root
Feb 23 07:46:17 np0005626463.localdomain sudo[39511]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lvfdblxcvgbtywtphmqlcyaeynswvyld ; /usr/bin/python3
Feb 23 07:46:17 np0005626463.localdomain sudo[39511]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:46:17 np0005626463.localdomain python3[39513]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/all_nodes.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 07:46:17 np0005626463.localdomain sudo[39511]: pam_unix(sudo:session): session closed for user root
Feb 23 07:46:17 np0005626463.localdomain sudo[39554]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sgpyybgproqylxxokbrdditbnobkgnbk ; /usr/bin/python3
Feb 23 07:46:17 np0005626463.localdomain sudo[39554]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:46:17 np0005626463.localdomain python3[39556]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771832776.945125-71480-188798672625667/source dest=/etc/puppet/hieradata/all_nodes.json _original_basename=overcloud.json follow=False checksum=cb4e2d65c3f4c3faf38650c4c339d73dfcec347e backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 07:46:17 np0005626463.localdomain sudo[39554]: pam_unix(sudo:session): session closed for user root
Feb 23 07:46:18 np0005626463.localdomain sudo[39616]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tulsbdpmhpoahsbwdwhdypydhctzhyho ; /usr/bin/python3
Feb 23 07:46:18 np0005626463.localdomain sudo[39616]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:46:18 np0005626463.localdomain python3[39618]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/bootstrap_node.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 07:46:18 np0005626463.localdomain sudo[39616]: pam_unix(sudo:session): session closed for user root
Feb 23 07:46:18 np0005626463.localdomain sudo[39659]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gvpqzvbtbmfopgenhbxbglxbqokzmjkl ; /usr/bin/python3
Feb 23 07:46:18 np0005626463.localdomain sudo[39659]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:46:18 np0005626463.localdomain python3[39661]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771832777.899731-71543-22303193468700/source dest=/etc/puppet/hieradata/bootstrap_node.json mode=None follow=False _original_basename=bootstrap_node.j2 checksum=e3816c2e211db94b1efb9354b78e4bda87216798 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 07:46:18 np0005626463.localdomain sudo[39659]: pam_unix(sudo:session): session closed for user root
Feb 23 07:46:19 np0005626463.localdomain sudo[39721]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bxrbmbsesituncewozzeiabounygvhqn ; /usr/bin/python3
Feb 23 07:46:19 np0005626463.localdomain sudo[39721]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:46:19 np0005626463.localdomain python3[39723]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/vip_data.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 07:46:19 np0005626463.localdomain sudo[39721]: pam_unix(sudo:session): session closed for user root
Feb 23 07:46:19 np0005626463.localdomain sudo[39764]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tpdkwirdxgixdwomzbjevvxacpdcgqen ; /usr/bin/python3
Feb 23 07:46:19 np0005626463.localdomain sudo[39764]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:46:19 np0005626463.localdomain python3[39766]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771832778.780203-71543-268509960303819/source dest=/etc/puppet/hieradata/vip_data.json mode=None follow=False _original_basename=vip_data.j2 checksum=426c74ff16c690bcb458d5adf7a90df54cf7398a backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 07:46:19 np0005626463.localdomain sudo[39764]: pam_unix(sudo:session): session closed for user root
Feb 23 07:46:20 np0005626463.localdomain sudo[39826]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xxomvluqdvfqraccnpxkemcclyvdpblh ; /usr/bin/python3
Feb 23 07:46:20 np0005626463.localdomain sudo[39826]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:46:20 np0005626463.localdomain python3[39828]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/net_ip_map.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 07:46:20 np0005626463.localdomain sudo[39826]: pam_unix(sudo:session): session closed for user root
Feb 23 07:46:20 np0005626463.localdomain sudo[39869]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-levcnwhvevkalqdwnsrfygwohlxjwazo ; /usr/bin/python3
Feb 23 07:46:20 np0005626463.localdomain sudo[39869]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:46:20 np0005626463.localdomain python3[39871]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771832779.7350893-71543-17093954431605/source dest=/etc/puppet/hieradata/net_ip_map.json mode=None follow=False _original_basename=net_ip_map.j2 checksum=68b5a56a66cb10764ef3288009ad5e9b7e8faf12 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 07:46:20 np0005626463.localdomain sudo[39869]: pam_unix(sudo:session): session closed for user root
Feb 23 07:46:20 np0005626463.localdomain sudo[39931]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ziqbfxwdxojtqymoqmfomomsowsuztca ; /usr/bin/python3
Feb 23 07:46:20 np0005626463.localdomain sudo[39931]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:46:21 np0005626463.localdomain python3[39933]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/cloud_domain.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 07:46:21 np0005626463.localdomain sudo[39931]: pam_unix(sudo:session): session closed for user root
Feb 23 07:46:21 np0005626463.localdomain sudo[39974]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pqcimktbjzfggdvgsfmnbawweshkqocd ; /usr/bin/python3
Feb 23 07:46:21 np0005626463.localdomain sudo[39974]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:46:21 np0005626463.localdomain python3[39976]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771832780.8098772-71543-242073356606177/source dest=/etc/puppet/hieradata/cloud_domain.json mode=None follow=False _original_basename=cloud_domain.j2 checksum=5dd835a63e6a03d74797c2e2eadf4bea1cecd9d9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 07:46:21 np0005626463.localdomain sudo[39974]: pam_unix(sudo:session): session closed for user root
Feb 23 07:46:21 np0005626463.localdomain sudo[40036]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kcxtaeqmcwelomagvkfpwrjseagveego ; /usr/bin/python3
Feb 23 07:46:21 np0005626463.localdomain sudo[40036]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:46:22 np0005626463.localdomain python3[40038]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/fqdn.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 07:46:22 np0005626463.localdomain sudo[40036]: pam_unix(sudo:session): session closed for user root
Feb 23 07:46:22 np0005626463.localdomain sudo[40079]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kaxifrumcxjyrnyxkkjghulogpfsbdlj ; /usr/bin/python3
Feb 23 07:46:22 np0005626463.localdomain sudo[40079]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:46:22 np0005626463.localdomain python3[40081]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771832781.6953962-71543-233202908316202/source dest=/etc/puppet/hieradata/fqdn.json mode=None follow=False _original_basename=fqdn.j2 checksum=7b67a93cb6155d994227cb0fb8cb85d0abcca135 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 07:46:22 np0005626463.localdomain sudo[40079]: pam_unix(sudo:session): session closed for user root
Feb 23 07:46:22 np0005626463.localdomain sudo[40141]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-voowthbekqalxbhtpaspxnimssfsgvjk ; /usr/bin/python3
Feb 23 07:46:22 np0005626463.localdomain sudo[40141]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:46:22 np0005626463.localdomain python3[40143]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/service_names.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 07:46:22 np0005626463.localdomain sudo[40141]: pam_unix(sudo:session): session closed for user root
Feb 23 07:46:23 np0005626463.localdomain sudo[40184]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xprzvlbxmnaywkvyczghhelrvquecpmq ; /usr/bin/python3
Feb 23 07:46:23 np0005626463.localdomain sudo[40184]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:46:23 np0005626463.localdomain python3[40186]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771832782.560317-71543-240669836635481/source dest=/etc/puppet/hieradata/service_names.json mode=None follow=False _original_basename=service_names.j2 checksum=ff586b96402d8ae133745cf06f17e772b2f22d52 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 07:46:23 np0005626463.localdomain sudo[40184]: pam_unix(sudo:session): session closed for user root
Feb 23 07:46:23 np0005626463.localdomain sudo[40246]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xncmcocxmzynarlmwvsgkkmvzzwcqyrx ; /usr/bin/python3
Feb 23 07:46:23 np0005626463.localdomain sudo[40246]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:46:23 np0005626463.localdomain python3[40248]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/service_configs.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 07:46:23 np0005626463.localdomain sudo[40246]: pam_unix(sudo:session): session closed for user root
Feb 23 07:46:24 np0005626463.localdomain sudo[40289]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aqsfitvxxfcbbgvaauflygnukbfgaqzp ; /usr/bin/python3
Feb 23 07:46:24 np0005626463.localdomain sudo[40289]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:46:24 np0005626463.localdomain python3[40291]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771832783.4746933-71543-180116458338279/source dest=/etc/puppet/hieradata/service_configs.json mode=None follow=False _original_basename=service_configs.j2 checksum=4e4e677ff4d1886f9c2ad18567185be59ce1ed84 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 07:46:24 np0005626463.localdomain sudo[40289]: pam_unix(sudo:session): session closed for user root
Feb 23 07:46:24 np0005626463.localdomain sudo[40351]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lfonrrbebnugydinhqgfgfltfmtipwmw ; /usr/bin/python3
Feb 23 07:46:24 np0005626463.localdomain sudo[40351]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:46:24 np0005626463.localdomain python3[40353]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/extraconfig.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 07:46:24 np0005626463.localdomain sudo[40351]: pam_unix(sudo:session): session closed for user root
Feb 23 07:46:24 np0005626463.localdomain sudo[40394]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iepmnhfakboqgkyjherolpfsankmmkmm ; /usr/bin/python3
Feb 23 07:46:24 np0005626463.localdomain sudo[40394]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:46:25 np0005626463.localdomain python3[40396]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771832784.3707433-71543-210944442131248/source dest=/etc/puppet/hieradata/extraconfig.json mode=None follow=False _original_basename=extraconfig.j2 checksum=5f36b2ea290645ee34d943220a14b54ee5ea5be5 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 07:46:25 np0005626463.localdomain sudo[40394]: pam_unix(sudo:session): session closed for user root
Feb 23 07:46:25 np0005626463.localdomain sudo[40456]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kammygptamejhybgsltwtunzsuwkglaw ; /usr/bin/python3
Feb 23 07:46:25 np0005626463.localdomain sudo[40456]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:46:25 np0005626463.localdomain python3[40458]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/role_extraconfig.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 07:46:25 np0005626463.localdomain sudo[40456]: pam_unix(sudo:session): session closed for user root
Feb 23 07:46:25 np0005626463.localdomain sshd[40472]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 07:46:25 np0005626463.localdomain sudo[40501]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kuyaqjgwbiqnznoqfxiwyotvcwdnmzym ; /usr/bin/python3
Feb 23 07:46:25 np0005626463.localdomain sudo[40501]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:46:25 np0005626463.localdomain python3[40503]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771832785.2519917-71543-76207560337378/source dest=/etc/puppet/hieradata/role_extraconfig.json mode=None follow=False _original_basename=role_extraconfig.j2 checksum=34875968bf996542162e620523f9dcfb3deac331 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 07:46:25 np0005626463.localdomain sudo[40501]: pam_unix(sudo:session): session closed for user root
Feb 23 07:46:26 np0005626463.localdomain sshd[40472]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 07:46:26 np0005626463.localdomain sudo[40563]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oxgpacmcolaitqcpvruwthoimloumuva ; /usr/bin/python3
Feb 23 07:46:26 np0005626463.localdomain sudo[40563]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:46:26 np0005626463.localdomain python3[40565]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/ovn_chassis_mac_map.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 07:46:26 np0005626463.localdomain sudo[40563]: pam_unix(sudo:session): session closed for user root
Feb 23 07:46:26 np0005626463.localdomain sudo[40606]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rfuygwyqhrjlqcublhpvkaazldnigbjy ; /usr/bin/python3
Feb 23 07:46:26 np0005626463.localdomain sudo[40606]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:46:26 np0005626463.localdomain python3[40608]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771832786.142713-71543-22452970220595/source dest=/etc/puppet/hieradata/ovn_chassis_mac_map.json mode=None follow=False _original_basename=ovn_chassis_mac_map.j2 checksum=6914e83d930180efd1febf3d10b0106910a745c1 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 07:46:26 np0005626463.localdomain sudo[40606]: pam_unix(sudo:session): session closed for user root
Feb 23 07:46:27 np0005626463.localdomain sudo[40636]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cgzeycarxymutkurlmhadnwbarbewfxe ; /usr/bin/python3
Feb 23 07:46:27 np0005626463.localdomain sudo[40636]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:46:27 np0005626463.localdomain python3[40638]: ansible-stat Invoked with path={'src': '/etc/puppet/hieradata/ansible_managed.json'} follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 23 07:46:27 np0005626463.localdomain sudo[40636]: pam_unix(sudo:session): session closed for user root
Feb 23 07:46:28 np0005626463.localdomain sudo[40684]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xpipqfinobusgqdrpzpgxtrqaszkkehb ; /usr/bin/python3
Feb 23 07:46:28 np0005626463.localdomain sudo[40684]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:46:28 np0005626463.localdomain python3[40686]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/ansible_managed.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 07:46:28 np0005626463.localdomain sudo[40684]: pam_unix(sudo:session): session closed for user root
Feb 23 07:46:28 np0005626463.localdomain sudo[40727]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-imdbxnecsznoaxiohjhqlzmcbxgsezpp ; /usr/bin/python3
Feb 23 07:46:28 np0005626463.localdomain sudo[40727]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:46:28 np0005626463.localdomain python3[40729]: ansible-ansible.legacy.copy Invoked with dest=/etc/puppet/hieradata/ansible_managed.json owner=root group=root mode=0644 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771832788.02342-72366-192368920528160/source _original_basename=tmp0wffb24f follow=False checksum=bf21a9e8fbc5a3846fb05b4fa0859e0917b2202f backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 07:46:28 np0005626463.localdomain sudo[40727]: pam_unix(sudo:session): session closed for user root
Feb 23 07:46:32 np0005626463.localdomain sudo[40757]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fwihiomxqqkjolfgsujmbzodqhoigaew ; /usr/bin/python3
Feb 23 07:46:32 np0005626463.localdomain sudo[40757]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:46:33 np0005626463.localdomain python3[40759]: ansible-setup Invoked with gather_subset=['!all', '!min', 'network'] filter=['ansible_default_ipv4'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 23 07:46:33 np0005626463.localdomain sudo[40757]: pam_unix(sudo:session): session closed for user root
Feb 23 07:46:33 np0005626463.localdomain sudo[40818]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qfftiyvdihnamihumnabznidckanbzjp ; /usr/bin/python3
Feb 23 07:46:33 np0005626463.localdomain sudo[40818]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:46:33 np0005626463.localdomain python3[40820]: ansible-ansible.legacy.command Invoked with _raw_params=ping -w 10 -c 5 38.102.83.1 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 07:46:37 np0005626463.localdomain sudo[40822]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 07:46:37 np0005626463.localdomain sudo[40822]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 07:46:37 np0005626463.localdomain sudo[40822]: pam_unix(sudo:session): session closed for user root
Feb 23 07:46:37 np0005626463.localdomain sudo[40837]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 23 07:46:37 np0005626463.localdomain sudo[40837]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 07:46:37 np0005626463.localdomain sudo[40818]: pam_unix(sudo:session): session closed for user root
Feb 23 07:46:38 np0005626463.localdomain sudo[40886]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hdjeamstaryiuswbwigolcpksnjiscxt ; /usr/bin/python3
Feb 23 07:46:38 np0005626463.localdomain sudo[40886]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:46:38 np0005626463.localdomain sudo[40837]: pam_unix(sudo:session): session closed for user root
Feb 23 07:46:38 np0005626463.localdomain python3[40889]: ansible-ansible.legacy.command Invoked with _raw_params=ping -w 10 -c 5 192.168.122.10 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 07:46:38 np0005626463.localdomain sudo[40902]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 07:46:38 np0005626463.localdomain sudo[40902]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 07:46:38 np0005626463.localdomain sudo[40902]: pam_unix(sudo:session): session closed for user root
Feb 23 07:46:42 np0005626463.localdomain sudo[40886]: pam_unix(sudo:session): session closed for user root
Feb 23 07:46:43 np0005626463.localdomain sudo[40930]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yohfkpxsztjdunlnvykiwuzvvlbjubeu ; /usr/bin/python3
Feb 23 07:46:43 np0005626463.localdomain sudo[40930]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:46:43 np0005626463.localdomain python3[40932]: ansible-ansible.legacy.command Invoked with _raw_params=INT=$(ip ro get 192.168.122.106 | head -1 | sed -nr "s/.* dev (\w+) .*/\1/p")
                                                         MTU=$(cat /sys/class/net/${INT}/mtu 2>/dev/null || echo "0")
                                                         echo "$INT $MTU"
                                                          _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 07:46:43 np0005626463.localdomain sudo[40930]: pam_unix(sudo:session): session closed for user root
Feb 23 07:46:44 np0005626463.localdomain sudo[40953]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oqthtvzysyzaxhdmqvjviupfhfibxwyd ; /usr/bin/python3
Feb 23 07:46:44 np0005626463.localdomain sudo[40953]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:46:44 np0005626463.localdomain python3[40955]: ansible-ansible.legacy.command Invoked with _raw_params=INT=$(ip ro get 172.18.0.106 | head -1 | sed -nr "s/.* dev (\w+) .*/\1/p")
                                                         MTU=$(cat /sys/class/net/${INT}/mtu 2>/dev/null || echo "0")
                                                         echo "$INT $MTU"
                                                          _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 07:46:44 np0005626463.localdomain sudo[40953]: pam_unix(sudo:session): session closed for user root
Feb 23 07:46:44 np0005626463.localdomain sudo[40976]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bofqpektqdleaawtkajsyinuvkhefjei ; /usr/bin/python3
Feb 23 07:46:44 np0005626463.localdomain sudo[40976]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:46:44 np0005626463.localdomain python3[40978]: ansible-ansible.legacy.command Invoked with _raw_params=INT=$(ip ro get 172.20.0.106 | head -1 | sed -nr "s/.* dev (\w+) .*/\1/p")
                                                         MTU=$(cat /sys/class/net/${INT}/mtu 2>/dev/null || echo "0")
                                                         echo "$INT $MTU"
                                                          _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 07:46:44 np0005626463.localdomain sudo[40976]: pam_unix(sudo:session): session closed for user root
Feb 23 07:46:45 np0005626463.localdomain sudo[40999]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nyeiesmgxlberrldlufxoovczyuuwctr ; /usr/bin/python3
Feb 23 07:46:45 np0005626463.localdomain sudo[40999]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:46:45 np0005626463.localdomain python3[41001]: ansible-ansible.legacy.command Invoked with _raw_params=INT=$(ip ro get 172.17.0.106 | head -1 | sed -nr "s/.* dev (\w+) .*/\1/p")
                                                         MTU=$(cat /sys/class/net/${INT}/mtu 2>/dev/null || echo "0")
                                                         echo "$INT $MTU"
                                                          _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 07:46:45 np0005626463.localdomain sudo[40999]: pam_unix(sudo:session): session closed for user root
Feb 23 07:46:45 np0005626463.localdomain sudo[41022]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-arwggghpkvkvjmibrhbdskptqykhqgtt ; /usr/bin/python3
Feb 23 07:46:45 np0005626463.localdomain sudo[41022]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:46:46 np0005626463.localdomain python3[41024]: ansible-ansible.legacy.command Invoked with _raw_params=INT=$(ip ro get 172.19.0.106 | head -1 | sed -nr "s/.* dev (\w+) .*/\1/p")
                                                         MTU=$(cat /sys/class/net/${INT}/mtu 2>/dev/null || echo "0")
                                                         echo "$INT $MTU"
                                                          _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 07:46:46 np0005626463.localdomain sudo[41022]: pam_unix(sudo:session): session closed for user root
Feb 23 07:46:47 np0005626463.localdomain systemd[35778]: Starting Mark boot as successful...
Feb 23 07:46:47 np0005626463.localdomain systemd[35778]: Finished Mark boot as successful.
Feb 23 07:46:52 np0005626463.localdomain sshd[41033]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 07:46:54 np0005626463.localdomain sshd[41033]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 07:47:08 np0005626463.localdomain sshd[41035]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 07:47:09 np0005626463.localdomain sshd[41035]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 07:47:27 np0005626463.localdomain sudo[41050]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ynuakenyrjwuokhrkrxpohqlbevuoxzo ; /usr/bin/python3
Feb 23 07:47:27 np0005626463.localdomain sudo[41050]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:47:27 np0005626463.localdomain python3[41052]: ansible-file Invoked with path=/etc/puppet/hieradata state=directory mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 07:47:27 np0005626463.localdomain sudo[41050]: pam_unix(sudo:session): session closed for user root
Feb 23 07:47:28 np0005626463.localdomain sudo[41098]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lnzvddgcvttchdwphtmkqsbxxolnzmeh ; /usr/bin/python3
Feb 23 07:47:28 np0005626463.localdomain sudo[41098]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:47:28 np0005626463.localdomain python3[41100]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hiera.yaml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 07:47:28 np0005626463.localdomain sudo[41098]: pam_unix(sudo:session): session closed for user root
Feb 23 07:47:28 np0005626463.localdomain sudo[41116]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uwvyiqqvvbngrhktsjsakplrlgselivq ; /usr/bin/python3
Feb 23 07:47:28 np0005626463.localdomain sudo[41116]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:47:28 np0005626463.localdomain python3[41118]: ansible-ansible.legacy.file Invoked with mode=384 dest=/etc/puppet/hiera.yaml _original_basename=tmpvgyu3l0p recurse=False state=file path=/etc/puppet/hiera.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 07:47:28 np0005626463.localdomain sudo[41116]: pam_unix(sudo:session): session closed for user root
Feb 23 07:47:28 np0005626463.localdomain sudo[41146]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eqejlfqkfhfaudluttajqzuknfwbsrhn ; /usr/bin/python3
Feb 23 07:47:28 np0005626463.localdomain sudo[41146]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:47:28 np0005626463.localdomain python3[41148]: ansible-file Invoked with src=/etc/puppet/hiera.yaml dest=/etc/hiera.yaml state=link force=True path=/etc/hiera.yaml recurse=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 07:47:28 np0005626463.localdomain sudo[41146]: pam_unix(sudo:session): session closed for user root
Feb 23 07:47:29 np0005626463.localdomain sudo[41194]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gtxdwmybcuzkergwxjgvfhgvrpsonmwz ; /usr/bin/python3
Feb 23 07:47:29 np0005626463.localdomain sudo[41194]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:47:29 np0005626463.localdomain python3[41196]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/all_nodes.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 07:47:29 np0005626463.localdomain sudo[41194]: pam_unix(sudo:session): session closed for user root
Feb 23 07:47:29 np0005626463.localdomain sudo[41212]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ijuzxsmanmmduaunbbrdvdxjjlqlfubz ; /usr/bin/python3
Feb 23 07:47:29 np0005626463.localdomain sudo[41212]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:47:29 np0005626463.localdomain python3[41214]: ansible-ansible.legacy.file Invoked with dest=/etc/puppet/hieradata/all_nodes.json _original_basename=overcloud.json recurse=False state=file path=/etc/puppet/hieradata/all_nodes.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 07:47:29 np0005626463.localdomain sudo[41212]: pam_unix(sudo:session): session closed for user root
Feb 23 07:47:30 np0005626463.localdomain sudo[41274]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yctpgbvkxwmqpyacbjwpsncqrovhcrrq ; /usr/bin/python3
Feb 23 07:47:30 np0005626463.localdomain sudo[41274]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:47:30 np0005626463.localdomain python3[41276]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/bootstrap_node.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 07:47:30 np0005626463.localdomain sudo[41274]: pam_unix(sudo:session): session closed for user root
Feb 23 07:47:30 np0005626463.localdomain sudo[41292]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dpfaohksasuhpotcutildzacinvfolnj ; /usr/bin/python3
Feb 23 07:47:30 np0005626463.localdomain sudo[41292]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:47:30 np0005626463.localdomain python3[41294]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/bootstrap_node.json _original_basename=bootstrap_node.j2 recurse=False state=file path=/etc/puppet/hieradata/bootstrap_node.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 07:47:30 np0005626463.localdomain sudo[41292]: pam_unix(sudo:session): session closed for user root
Feb 23 07:47:31 np0005626463.localdomain sudo[41354]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kgamrujqmklqupybjukwnbepxizwedpp ; /usr/bin/python3
Feb 23 07:47:31 np0005626463.localdomain sudo[41354]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:47:31 np0005626463.localdomain python3[41356]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/vip_data.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 07:47:31 np0005626463.localdomain sudo[41354]: pam_unix(sudo:session): session closed for user root
Feb 23 07:47:31 np0005626463.localdomain sudo[41372]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yuatevuhhuoiybzzbsxlfpeazbgdttip ; /usr/bin/python3
Feb 23 07:47:31 np0005626463.localdomain sudo[41372]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:47:31 np0005626463.localdomain python3[41374]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/vip_data.json _original_basename=vip_data.j2 recurse=False state=file path=/etc/puppet/hieradata/vip_data.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 07:47:31 np0005626463.localdomain sudo[41372]: pam_unix(sudo:session): session closed for user root
Feb 23 07:47:32 np0005626463.localdomain sudo[41434]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hvubexuqfvlfotakxvzcnwyjyskfnwfw ; /usr/bin/python3
Feb 23 07:47:32 np0005626463.localdomain sudo[41434]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:47:32 np0005626463.localdomain python3[41436]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/net_ip_map.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 07:47:32 np0005626463.localdomain sudo[41434]: pam_unix(sudo:session): session closed for user root
Feb 23 07:47:32 np0005626463.localdomain sudo[41452]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vwfmwqvawmbvinykhtrwhefbsslqjlty ; /usr/bin/python3
Feb 23 07:47:32 np0005626463.localdomain sudo[41452]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:47:32 np0005626463.localdomain python3[41454]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/net_ip_map.json _original_basename=net_ip_map.j2 recurse=False state=file path=/etc/puppet/hieradata/net_ip_map.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 07:47:32 np0005626463.localdomain sudo[41452]: pam_unix(sudo:session): session closed for user root
Feb 23 07:47:32 np0005626463.localdomain sudo[41514]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nyolkwibtgzzhxlonigzwrpcolvmxyck ; /usr/bin/python3
Feb 23 07:47:32 np0005626463.localdomain sudo[41514]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:47:33 np0005626463.localdomain python3[41516]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/cloud_domain.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 07:47:33 np0005626463.localdomain sudo[41514]: pam_unix(sudo:session): session closed for user root
Feb 23 07:47:33 np0005626463.localdomain sudo[41532]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gorwnvixlzbjzprfgrqepuiczkguzzzw ; /usr/bin/python3
Feb 23 07:47:33 np0005626463.localdomain sudo[41532]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:47:33 np0005626463.localdomain python3[41534]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/cloud_domain.json _original_basename=cloud_domain.j2 recurse=False state=file path=/etc/puppet/hieradata/cloud_domain.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 07:47:33 np0005626463.localdomain sudo[41532]: pam_unix(sudo:session): session closed for user root
Feb 23 07:47:33 np0005626463.localdomain sudo[41594]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-arfotyzyytcydpwnzymzhmhjtthasyyn ; /usr/bin/python3
Feb 23 07:47:33 np0005626463.localdomain sudo[41594]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:47:33 np0005626463.localdomain python3[41596]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/fqdn.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 07:47:33 np0005626463.localdomain sudo[41594]: pam_unix(sudo:session): session closed for user root
Feb 23 07:47:33 np0005626463.localdomain sudo[41612]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-teofmqdlstprpitoitnqcjmgkueazwxf ; /usr/bin/python3
Feb 23 07:47:33 np0005626463.localdomain sudo[41612]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:47:34 np0005626463.localdomain python3[41614]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/fqdn.json _original_basename=fqdn.j2 recurse=False state=file path=/etc/puppet/hieradata/fqdn.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 07:47:34 np0005626463.localdomain sudo[41612]: pam_unix(sudo:session): session closed for user root
Feb 23 07:47:34 np0005626463.localdomain sudo[41674]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oujzqhnrwjhdduqtlreysoixuydosgqk ; /usr/bin/python3
Feb 23 07:47:34 np0005626463.localdomain sudo[41674]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:47:34 np0005626463.localdomain python3[41676]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/service_names.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 07:47:34 np0005626463.localdomain sudo[41674]: pam_unix(sudo:session): session closed for user root
Feb 23 07:47:34 np0005626463.localdomain sudo[41692]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pofrxivjbgfthaqtpqsfguncrmwpvsdh ; /usr/bin/python3
Feb 23 07:47:34 np0005626463.localdomain sudo[41692]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:47:34 np0005626463.localdomain python3[41694]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/service_names.json _original_basename=service_names.j2 recurse=False state=file path=/etc/puppet/hieradata/service_names.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 07:47:34 np0005626463.localdomain sudo[41692]: pam_unix(sudo:session): session closed for user root
Feb 23 07:47:35 np0005626463.localdomain sudo[41754]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zhccdgxpuehacyrenzzzajcmzfhbfsnm ; /usr/bin/python3
Feb 23 07:47:35 np0005626463.localdomain sudo[41754]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:47:35 np0005626463.localdomain python3[41756]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/service_configs.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 07:47:35 np0005626463.localdomain sudo[41754]: pam_unix(sudo:session): session closed for user root
Feb 23 07:47:35 np0005626463.localdomain sudo[41772]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mgfruglypntokctmjpsppwbzpymvwgsr ; /usr/bin/python3
Feb 23 07:47:35 np0005626463.localdomain sudo[41772]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:47:35 np0005626463.localdomain python3[41774]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/service_configs.json _original_basename=service_configs.j2 recurse=False state=file path=/etc/puppet/hieradata/service_configs.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 07:47:35 np0005626463.localdomain sudo[41772]: pam_unix(sudo:session): session closed for user root
Feb 23 07:47:35 np0005626463.localdomain sudo[41834]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ngmgkkpepcwmbalcnjxpzetijhavtoyb ; /usr/bin/python3
Feb 23 07:47:35 np0005626463.localdomain sudo[41834]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:47:35 np0005626463.localdomain python3[41836]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/extraconfig.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 07:47:35 np0005626463.localdomain sudo[41834]: pam_unix(sudo:session): session closed for user root
Feb 23 07:47:36 np0005626463.localdomain sudo[41852]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wdrsysepaznfdzuedqolzhwjocyigndn ; /usr/bin/python3
Feb 23 07:47:36 np0005626463.localdomain sudo[41852]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:47:36 np0005626463.localdomain python3[41854]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/extraconfig.json _original_basename=extraconfig.j2 recurse=False state=file path=/etc/puppet/hieradata/extraconfig.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 07:47:36 np0005626463.localdomain sudo[41852]: pam_unix(sudo:session): session closed for user root
Feb 23 07:47:36 np0005626463.localdomain sudo[41914]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yexprukekcogqgrpufrxsenhjvtirhcj ; /usr/bin/python3
Feb 23 07:47:36 np0005626463.localdomain sudo[41914]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:47:37 np0005626463.localdomain python3[41916]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/role_extraconfig.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 07:47:37 np0005626463.localdomain sudo[41914]: pam_unix(sudo:session): session closed for user root
Feb 23 07:47:37 np0005626463.localdomain sudo[41932]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jqrzsamczwlajvvhpzvkpjrtgllfwhgn ; /usr/bin/python3
Feb 23 07:47:37 np0005626463.localdomain sudo[41932]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:47:37 np0005626463.localdomain python3[41934]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/role_extraconfig.json _original_basename=role_extraconfig.j2 recurse=False state=file path=/etc/puppet/hieradata/role_extraconfig.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 07:47:37 np0005626463.localdomain sudo[41932]: pam_unix(sudo:session): session closed for user root
Feb 23 07:47:37 np0005626463.localdomain sudo[41994]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rwfcqkbqoajiepngnoyjkghvoqosuqex ; /usr/bin/python3
Feb 23 07:47:37 np0005626463.localdomain sudo[41994]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:47:37 np0005626463.localdomain python3[41996]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/ovn_chassis_mac_map.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 07:47:37 np0005626463.localdomain sudo[41994]: pam_unix(sudo:session): session closed for user root
Feb 23 07:47:37 np0005626463.localdomain sudo[42012]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ebguydgbpchugzoukqjdmtdbtjeahaqv ; /usr/bin/python3
Feb 23 07:47:37 np0005626463.localdomain sudo[42012]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:47:38 np0005626463.localdomain python3[42014]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/ovn_chassis_mac_map.json _original_basename=ovn_chassis_mac_map.j2 recurse=False state=file path=/etc/puppet/hieradata/ovn_chassis_mac_map.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 07:47:38 np0005626463.localdomain sudo[42012]: pam_unix(sudo:session): session closed for user root
Feb 23 07:47:38 np0005626463.localdomain sudo[42042]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rtxmcmhfnszhnnxvtbffkmuxusppsosi ; /usr/bin/python3
Feb 23 07:47:38 np0005626463.localdomain sudo[42042]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:47:38 np0005626463.localdomain python3[42044]: ansible-stat Invoked with path={'src': '/etc/puppet/hieradata/ansible_managed.json'} follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 23 07:47:38 np0005626463.localdomain sudo[42042]: pam_unix(sudo:session): session closed for user root
Feb 23 07:47:38 np0005626463.localdomain sudo[42045]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 07:47:38 np0005626463.localdomain sudo[42045]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 07:47:38 np0005626463.localdomain sudo[42045]: pam_unix(sudo:session): session closed for user root
Feb 23 07:47:38 np0005626463.localdomain sudo[42060]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 23 07:47:38 np0005626463.localdomain sudo[42060]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 07:47:39 np0005626463.localdomain sudo[42120]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zsifboiysufgrnpminowljqwyovhsftp ; /usr/bin/python3
Feb 23 07:47:39 np0005626463.localdomain sudo[42120]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:47:39 np0005626463.localdomain python3[42122]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/ansible_managed.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 07:47:39 np0005626463.localdomain sudo[42120]: pam_unix(sudo:session): session closed for user root
Feb 23 07:47:39 np0005626463.localdomain sudo[42154]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vcciqeyrrilsmxfzoqmihwhtseicwkbr ; /usr/bin/python3
Feb 23 07:47:39 np0005626463.localdomain sudo[42154]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:47:39 np0005626463.localdomain python3[42156]: ansible-ansible.legacy.file Invoked with owner=root group=root mode=0644 dest=/etc/puppet/hieradata/ansible_managed.json _original_basename=tmpzcj0ua9b recurse=False state=file path=/etc/puppet/hieradata/ansible_managed.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 07:47:39 np0005626463.localdomain sudo[42154]: pam_unix(sudo:session): session closed for user root
Feb 23 07:47:39 np0005626463.localdomain sudo[42060]: pam_unix(sudo:session): session closed for user root
Feb 23 07:47:42 np0005626463.localdomain sudo[42188]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 07:47:42 np0005626463.localdomain sudo[42188]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 07:47:42 np0005626463.localdomain sudo[42188]: pam_unix(sudo:session): session closed for user root
Feb 23 07:47:42 np0005626463.localdomain sudo[42216]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-smnvjoctkhmlyoimbgjdkshcdtezedzd ; /usr/bin/python3
Feb 23 07:47:42 np0005626463.localdomain sudo[42216]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:47:42 np0005626463.localdomain python3[42218]: ansible-dnf Invoked with name=['firewalld'] state=absent allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Feb 23 07:47:44 np0005626463.localdomain sudo[42216]: pam_unix(sudo:session): session closed for user root
Feb 23 07:47:46 np0005626463.localdomain sudo[42233]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ikeuhybjazgrtqqljgenzabylheoosfi ; /usr/bin/python3
Feb 23 07:47:46 np0005626463.localdomain sudo[42233]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:47:46 np0005626463.localdomain python3[42235]: ansible-ansible.builtin.systemd Invoked with name=iptables.service state=stopped enabled=False daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 07:47:46 np0005626463.localdomain sudo[42233]: pam_unix(sudo:session): session closed for user root
Feb 23 07:47:46 np0005626463.localdomain sudo[42251]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-btuwbtsjjvuownmktlhveydgtumjcmai ; /usr/bin/python3
Feb 23 07:47:46 np0005626463.localdomain sudo[42251]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:47:47 np0005626463.localdomain python3[42253]: ansible-ansible.builtin.systemd Invoked with name=ip6tables.service state=stopped enabled=False daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 07:47:48 np0005626463.localdomain sudo[42251]: pam_unix(sudo:session): session closed for user root
Feb 23 07:47:48 np0005626463.localdomain sudo[42269]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xamoscbpmsltsdmyalowvbkexinzhuut ; /usr/bin/python3
Feb 23 07:47:48 np0005626463.localdomain sudo[42269]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:47:48 np0005626463.localdomain python3[42271]: ansible-ansible.builtin.systemd Invoked with name=nftables state=started enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 07:47:48 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 07:47:48 np0005626463.localdomain systemd-rc-local-generator[42298]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 07:47:49 np0005626463.localdomain systemd-sysv-generator[42302]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 07:47:49 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 07:47:49 np0005626463.localdomain systemd[1]: Starting Netfilter Tables...
Feb 23 07:47:49 np0005626463.localdomain systemd[1]: Finished Netfilter Tables.
Feb 23 07:47:49 np0005626463.localdomain sudo[42269]: pam_unix(sudo:session): session closed for user root
Feb 23 07:47:49 np0005626463.localdomain sudo[42359]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jiqioabgbsrbifvrcxvgtucwyetmuohs ; /usr/bin/python3
Feb 23 07:47:49 np0005626463.localdomain sudo[42359]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:47:49 np0005626463.localdomain python3[42361]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 07:47:49 np0005626463.localdomain sudo[42359]: pam_unix(sudo:session): session closed for user root
Feb 23 07:47:50 np0005626463.localdomain sudo[42402]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xpiaflbvlwsojochqhrctilvskurrbns ; /usr/bin/python3
Feb 23 07:47:50 np0005626463.localdomain sudo[42402]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:47:50 np0005626463.localdomain python3[42404]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/iptables.nft src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771832869.5947292-75156-190287893829712/source _original_basename=iptables.nft follow=False checksum=ede9860c99075946a7bc827210247aac639bc84a backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 07:47:50 np0005626463.localdomain sudo[42402]: pam_unix(sudo:session): session closed for user root
Feb 23 07:47:50 np0005626463.localdomain sudo[42432]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ydrsgzvghlmeuuysrcjpqchkhweuiedp ; /usr/bin/python3
Feb 23 07:47:50 np0005626463.localdomain sudo[42432]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:47:50 np0005626463.localdomain python3[42434]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/iptables.nft _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 07:47:50 np0005626463.localdomain sudo[42432]: pam_unix(sudo:session): session closed for user root
Feb 23 07:47:51 np0005626463.localdomain sudo[42450]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mtxjacuabtqirzcpiifbvuupokkravzh ; /usr/bin/python3
Feb 23 07:47:51 np0005626463.localdomain sudo[42450]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:47:51 np0005626463.localdomain python3[42452]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 07:47:51 np0005626463.localdomain sudo[42450]: pam_unix(sudo:session): session closed for user root
Feb 23 07:47:51 np0005626463.localdomain sshd[42478]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 07:47:51 np0005626463.localdomain sshd[42488]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 07:47:52 np0005626463.localdomain sudo[42502]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vxxfizbdqummzabmsiijcudovbrnqvwc ; /usr/bin/python3
Feb 23 07:47:52 np0005626463.localdomain sudo[42502]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:47:52 np0005626463.localdomain sshd[42478]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 07:47:52 np0005626463.localdomain python3[42504]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/tripleo-jumps.nft follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 07:47:52 np0005626463.localdomain sudo[42502]: pam_unix(sudo:session): session closed for user root
Feb 23 07:47:52 np0005626463.localdomain sudo[42546]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xsymojcyrzrguvtzzddptnhjjulgbyoq ; /usr/bin/python3
Feb 23 07:47:52 np0005626463.localdomain sudo[42546]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:47:52 np0005626463.localdomain python3[42548]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/tripleo-jumps.nft src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771832871.8704898-75283-169528438329383/source mode=None follow=False _original_basename=jump-chain.j2 checksum=eec306c3276262a27663d76bd0ea526457445afa backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 07:47:52 np0005626463.localdomain sudo[42546]: pam_unix(sudo:session): session closed for user root
Feb 23 07:47:53 np0005626463.localdomain sudo[42608]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tuhmwqczjyjzegxvxlfprsailwhlnfeg ; /usr/bin/python3
Feb 23 07:47:53 np0005626463.localdomain sudo[42608]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:47:53 np0005626463.localdomain python3[42610]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/tripleo-update-jumps.nft follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 07:47:53 np0005626463.localdomain sudo[42608]: pam_unix(sudo:session): session closed for user root
Feb 23 07:47:53 np0005626463.localdomain sudo[42651]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qwljhqddqrcbzxeccgffywbjiwsdkdig ; /usr/bin/python3
Feb 23 07:47:53 np0005626463.localdomain sudo[42651]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:47:53 np0005626463.localdomain sshd[42488]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 07:47:53 np0005626463.localdomain python3[42653]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/tripleo-update-jumps.nft src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771832873.1378508-75360-151464270445541/source mode=None follow=False _original_basename=jump-chain.j2 checksum=eec306c3276262a27663d76bd0ea526457445afa backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 07:47:53 np0005626463.localdomain sudo[42651]: pam_unix(sudo:session): session closed for user root
Feb 23 07:47:54 np0005626463.localdomain sudo[42713]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-whfdljilwwgoielsudcjhoobipxzzuzo ; /usr/bin/python3
Feb 23 07:47:54 np0005626463.localdomain sudo[42713]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:47:54 np0005626463.localdomain python3[42715]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/tripleo-flushes.nft follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 07:47:54 np0005626463.localdomain sudo[42713]: pam_unix(sudo:session): session closed for user root
Feb 23 07:47:54 np0005626463.localdomain sudo[42756]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zwacijqmbmioxszrqfpdfklhyhhdfoje ; /usr/bin/python3
Feb 23 07:47:54 np0005626463.localdomain sudo[42756]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:47:54 np0005626463.localdomain python3[42758]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/tripleo-flushes.nft src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771832874.1292903-75423-111902914142600/source mode=None follow=False _original_basename=flush-chain.j2 checksum=e8e7b8db0d61a7fe393441cc91613f470eb34a6e backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 07:47:54 np0005626463.localdomain sudo[42756]: pam_unix(sudo:session): session closed for user root
Feb 23 07:47:55 np0005626463.localdomain sudo[42818]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cetvmeutrahnkptvedipmsuzdiaodzfw ; /usr/bin/python3
Feb 23 07:47:55 np0005626463.localdomain sudo[42818]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:47:55 np0005626463.localdomain python3[42820]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/tripleo-chains.nft follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 07:47:55 np0005626463.localdomain sudo[42818]: pam_unix(sudo:session): session closed for user root
Feb 23 07:47:56 np0005626463.localdomain sudo[42861]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cegzhaummipomfnyshogykvkjxhancyt ; /usr/bin/python3
Feb 23 07:47:56 np0005626463.localdomain sudo[42861]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:47:56 np0005626463.localdomain python3[42863]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/tripleo-chains.nft src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771832875.4343626-75719-78158252116706/source mode=None follow=False _original_basename=chains.j2 checksum=e60ee651f5014e83924f4e901ecc8e25b1906610 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 07:47:56 np0005626463.localdomain sudo[42861]: pam_unix(sudo:session): session closed for user root
Feb 23 07:47:57 np0005626463.localdomain sudo[42923]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qccwvjcbrlejebepzfbhsfhihdlrogka ; /usr/bin/python3
Feb 23 07:47:57 np0005626463.localdomain sudo[42923]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:47:57 np0005626463.localdomain python3[42925]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/tripleo-rules.nft follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 07:47:57 np0005626463.localdomain sudo[42923]: pam_unix(sudo:session): session closed for user root
Feb 23 07:47:57 np0005626463.localdomain sudo[42966]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yegzsprkmrshrxyrdpptxpdmluoarqbd ; /usr/bin/python3
Feb 23 07:47:57 np0005626463.localdomain sudo[42966]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:47:57 np0005626463.localdomain python3[42968]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/tripleo-rules.nft src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771832876.7569578-75771-59745741056343/source mode=None follow=False _original_basename=ruleset.j2 checksum=0444e4206083f91e2fb2aabfa2928244c2db35ed backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 07:47:57 np0005626463.localdomain sudo[42966]: pam_unix(sudo:session): session closed for user root
Feb 23 07:47:58 np0005626463.localdomain sudo[42996]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bhreqyuuvrgfxsxryckklxteebunywhz ; /usr/bin/python3
Feb 23 07:47:58 np0005626463.localdomain sudo[42996]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:47:58 np0005626463.localdomain python3[42998]: ansible-ansible.legacy.command Invoked with _raw_params=cat /etc/nftables/tripleo-chains.nft /etc/nftables/tripleo-flushes.nft /etc/nftables/tripleo-rules.nft /etc/nftables/tripleo-update-jumps.nft /etc/nftables/tripleo-jumps.nft | nft -c -f - _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 07:47:58 np0005626463.localdomain sudo[42996]: pam_unix(sudo:session): session closed for user root
Feb 23 07:47:58 np0005626463.localdomain sudo[43061]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qlffeanqkyqzlnxiwsqqrvuthvjizmud ; /usr/bin/python3
Feb 23 07:47:58 np0005626463.localdomain sudo[43061]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:47:59 np0005626463.localdomain python3[43063]: ansible-ansible.builtin.blockinfile Invoked with path=/etc/sysconfig/nftables.conf backup=False validate=nft -c -f %s block=include "/etc/nftables/iptables.nft"
                                                         include "/etc/nftables/tripleo-chains.nft"
                                                         include "/etc/nftables/tripleo-rules.nft"
                                                         include "/etc/nftables/tripleo-jumps.nft"
                                                          state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 07:47:59 np0005626463.localdomain sudo[43061]: pam_unix(sudo:session): session closed for user root
Feb 23 07:47:59 np0005626463.localdomain sudo[43078]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-klghelxtbnioeywnmjlitncfhlqujlyw ; /usr/bin/python3
Feb 23 07:47:59 np0005626463.localdomain sudo[43078]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:47:59 np0005626463.localdomain python3[43080]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/tripleo-chains.nft _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 07:47:59 np0005626463.localdomain sudo[43078]: pam_unix(sudo:session): session closed for user root
Feb 23 07:47:59 np0005626463.localdomain sudo[43095]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lstwsprsvipgwuqfszztwizhrvwmhcdc ; /usr/bin/python3
Feb 23 07:47:59 np0005626463.localdomain sudo[43095]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:47:59 np0005626463.localdomain python3[43097]: ansible-ansible.legacy.command Invoked with _raw_params=cat /etc/nftables/tripleo-flushes.nft /etc/nftables/tripleo-rules.nft /etc/nftables/tripleo-update-jumps.nft | nft -f - _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 07:47:59 np0005626463.localdomain sudo[43095]: pam_unix(sudo:session): session closed for user root
Feb 23 07:48:00 np0005626463.localdomain sudo[43114]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zfotcjbnozpmltjisbwygswhwqyucgui ; /usr/bin/python3
Feb 23 07:48:00 np0005626463.localdomain sudo[43114]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:48:00 np0005626463.localdomain python3[43116]: ansible-file Invoked with mode=0750 path=/var/log/containers/collectd setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 23 07:48:00 np0005626463.localdomain sudo[43114]: pam_unix(sudo:session): session closed for user root
Feb 23 07:48:00 np0005626463.localdomain sudo[43130]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-inrldsvoydjqryhewdkkfvuhbzmesccg ; /usr/bin/python3
Feb 23 07:48:00 np0005626463.localdomain sudo[43130]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:48:00 np0005626463.localdomain python3[43132]: ansible-file Invoked with mode=0755 path=/var/lib/container-user-scripts/ setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 23 07:48:00 np0005626463.localdomain sudo[43130]: pam_unix(sudo:session): session closed for user root
Feb 23 07:48:00 np0005626463.localdomain sudo[43146]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ufcmxgkxlrbbbzjyqlvzykqeorwltzbe ; /usr/bin/python3
Feb 23 07:48:00 np0005626463.localdomain sudo[43146]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:48:01 np0005626463.localdomain python3[43148]: ansible-file Invoked with mode=0750 path=/var/log/containers/ceilometer setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 23 07:48:01 np0005626463.localdomain sudo[43146]: pam_unix(sudo:session): session closed for user root
Feb 23 07:48:01 np0005626463.localdomain sudo[43162]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mfgizxpgjweamrilwmkelskjvejezekg ; /usr/bin/python3
Feb 23 07:48:01 np0005626463.localdomain sudo[43162]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:48:01 np0005626463.localdomain python3[43164]: ansible-seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Feb 23 07:48:02 np0005626463.localdomain sudo[43162]: pam_unix(sudo:session): session closed for user root
Feb 23 07:48:02 np0005626463.localdomain sudo[43182]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oljhsdrylpxmluqlkxrcgbmxwesufoed ; /usr/bin/python3
Feb 23 07:48:02 np0005626463.localdomain dbus-broker-launch[754]: avc:  op=load_policy lsm=selinux seqno=7 res=1
Feb 23 07:48:02 np0005626463.localdomain sudo[43182]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:48:02 np0005626463.localdomain python3[43184]: ansible-community.general.sefcontext Invoked with setype=container_file_t state=present target=/etc/iscsi(/.*)? ignore_selinux_state=False ftype=a reload=True seuser=None selevel=None
Feb 23 07:48:03 np0005626463.localdomain kernel: SELinux:  Converting 2704 SID table entries...
Feb 23 07:48:03 np0005626463.localdomain kernel: SELinux:  policy capability network_peer_controls=1
Feb 23 07:48:03 np0005626463.localdomain kernel: SELinux:  policy capability open_perms=1
Feb 23 07:48:03 np0005626463.localdomain kernel: SELinux:  policy capability extended_socket_class=1
Feb 23 07:48:03 np0005626463.localdomain kernel: SELinux:  policy capability always_check_network=0
Feb 23 07:48:03 np0005626463.localdomain kernel: SELinux:  policy capability cgroup_seclabel=1
Feb 23 07:48:03 np0005626463.localdomain kernel: SELinux:  policy capability nnp_nosuid_transition=1
Feb 23 07:48:03 np0005626463.localdomain kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Feb 23 07:48:03 np0005626463.localdomain sudo[43182]: pam_unix(sudo:session): session closed for user root
Feb 23 07:48:03 np0005626463.localdomain sudo[43203]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qdzgzoheworloaiwgpcibwubdniqviiz ; /usr/bin/python3
Feb 23 07:48:03 np0005626463.localdomain dbus-broker-launch[754]: avc:  op=load_policy lsm=selinux seqno=8 res=1
Feb 23 07:48:03 np0005626463.localdomain sudo[43203]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:48:03 np0005626463.localdomain python3[43205]: ansible-community.general.sefcontext Invoked with setype=container_file_t state=present target=/etc/target(/.*)? ignore_selinux_state=False ftype=a reload=True seuser=None selevel=None
Feb 23 07:48:04 np0005626463.localdomain kernel: SELinux:  Converting 2704 SID table entries...
Feb 23 07:48:04 np0005626463.localdomain kernel: SELinux:  policy capability network_peer_controls=1
Feb 23 07:48:04 np0005626463.localdomain kernel: SELinux:  policy capability open_perms=1
Feb 23 07:48:04 np0005626463.localdomain kernel: SELinux:  policy capability extended_socket_class=1
Feb 23 07:48:04 np0005626463.localdomain kernel: SELinux:  policy capability always_check_network=0
Feb 23 07:48:04 np0005626463.localdomain kernel: SELinux:  policy capability cgroup_seclabel=1
Feb 23 07:48:04 np0005626463.localdomain kernel: SELinux:  policy capability nnp_nosuid_transition=1
Feb 23 07:48:04 np0005626463.localdomain kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Feb 23 07:48:04 np0005626463.localdomain sudo[43203]: pam_unix(sudo:session): session closed for user root
Feb 23 07:48:04 np0005626463.localdomain sudo[43224]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-isrfkbxbrqiierrmnuzbufzqpyjktpan ; /usr/bin/python3
Feb 23 07:48:04 np0005626463.localdomain dbus-broker-launch[754]: avc:  op=load_policy lsm=selinux seqno=9 res=1
Feb 23 07:48:04 np0005626463.localdomain sudo[43224]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:48:05 np0005626463.localdomain python3[43226]: ansible-community.general.sefcontext Invoked with setype=container_file_t state=present target=/var/lib/iscsi(/.*)? ignore_selinux_state=False ftype=a reload=True seuser=None selevel=None
Feb 23 07:48:05 np0005626463.localdomain kernel: SELinux:  Converting 2704 SID table entries...
Feb 23 07:48:05 np0005626463.localdomain kernel: SELinux:  policy capability network_peer_controls=1
Feb 23 07:48:05 np0005626463.localdomain kernel: SELinux:  policy capability open_perms=1
Feb 23 07:48:05 np0005626463.localdomain kernel: SELinux:  policy capability extended_socket_class=1
Feb 23 07:48:05 np0005626463.localdomain kernel: SELinux:  policy capability always_check_network=0
Feb 23 07:48:05 np0005626463.localdomain kernel: SELinux:  policy capability cgroup_seclabel=1
Feb 23 07:48:05 np0005626463.localdomain kernel: SELinux:  policy capability nnp_nosuid_transition=1
Feb 23 07:48:05 np0005626463.localdomain kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Feb 23 07:48:06 np0005626463.localdomain sudo[43224]: pam_unix(sudo:session): session closed for user root
Feb 23 07:48:06 np0005626463.localdomain sudo[43245]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ijvpapnocidmfqthklnrcxhwarnkgtxk ; /usr/bin/python3
Feb 23 07:48:06 np0005626463.localdomain dbus-broker-launch[754]: avc:  op=load_policy lsm=selinux seqno=10 res=1
Feb 23 07:48:06 np0005626463.localdomain sudo[43245]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:48:06 np0005626463.localdomain python3[43247]: ansible-file Invoked with path=/etc/iscsi setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 23 07:48:06 np0005626463.localdomain sudo[43245]: pam_unix(sudo:session): session closed for user root
Feb 23 07:48:06 np0005626463.localdomain sudo[43261]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ntsxjhzukzlsqpgioxnmgqpiyvzotron ; /usr/bin/python3
Feb 23 07:48:06 np0005626463.localdomain sudo[43261]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:48:06 np0005626463.localdomain python3[43263]: ansible-file Invoked with path=/etc/target setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 23 07:48:06 np0005626463.localdomain sudo[43261]: pam_unix(sudo:session): session closed for user root
Feb 23 07:48:06 np0005626463.localdomain sudo[43277]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-achjdfhmcqnxzubgnbkjnaqykhybnkwh ; /usr/bin/python3
Feb 23 07:48:06 np0005626463.localdomain sudo[43277]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:48:07 np0005626463.localdomain python3[43279]: ansible-file Invoked with path=/var/lib/iscsi setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 23 07:48:07 np0005626463.localdomain sudo[43277]: pam_unix(sudo:session): session closed for user root
Feb 23 07:48:07 np0005626463.localdomain sudo[43293]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xuhfpgqtlzhxorlplqvhcgipzkhohqtg ; /usr/bin/python3
Feb 23 07:48:07 np0005626463.localdomain sudo[43293]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:48:07 np0005626463.localdomain python3[43295]: ansible-stat Invoked with path=/lib/systemd/system/iscsid.socket follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 23 07:48:07 np0005626463.localdomain sudo[43293]: pam_unix(sudo:session): session closed for user root
Feb 23 07:48:07 np0005626463.localdomain sudo[43309]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fyokuwxvrcxmtjwkhlgwozlrpshusvlc ; /usr/bin/python3
Feb 23 07:48:07 np0005626463.localdomain sudo[43309]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:48:07 np0005626463.localdomain python3[43311]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-enabled --quiet iscsi.service _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 07:48:07 np0005626463.localdomain sudo[43309]: pam_unix(sudo:session): session closed for user root
Feb 23 07:48:08 np0005626463.localdomain sudo[43326]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vzprfiuysiopxjeyocuzyqvyzesgfzvl ; /usr/bin/python3
Feb 23 07:48:08 np0005626463.localdomain sudo[43326]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:48:08 np0005626463.localdomain python3[43328]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Feb 23 07:48:11 np0005626463.localdomain sudo[43326]: pam_unix(sudo:session): session closed for user root
Feb 23 07:48:12 np0005626463.localdomain sudo[43343]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nszlwrefezjjupfugxveruoaidcmlbjp ; /usr/bin/python3
Feb 23 07:48:12 np0005626463.localdomain sudo[43343]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:48:12 np0005626463.localdomain python3[43345]: ansible-file Invoked with path=/etc/modules-load.d state=directory mode=493 owner=root group=root setype=etc_t recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 23 07:48:12 np0005626463.localdomain sudo[43343]: pam_unix(sudo:session): session closed for user root
Feb 23 07:48:12 np0005626463.localdomain sudo[43391]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wixwkoxrmooolelevxnhewnpudaqkdpj ; /usr/bin/python3
Feb 23 07:48:12 np0005626463.localdomain sudo[43391]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:48:12 np0005626463.localdomain python3[43393]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-tripleo.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 07:48:12 np0005626463.localdomain sudo[43391]: pam_unix(sudo:session): session closed for user root
Feb 23 07:48:12 np0005626463.localdomain sudo[43434]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ppvkgxzaaxfgejmnsrrszllakbvkbxor ; /usr/bin/python3
Feb 23 07:48:12 np0005626463.localdomain sudo[43434]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:48:13 np0005626463.localdomain python3[43436]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771832892.3124158-76451-162393161049223/source dest=/etc/modules-load.d/99-tripleo.conf mode=420 owner=root group=root setype=etc_t follow=False _original_basename=tripleo-modprobe.conf.j2 checksum=8021efe01721d8fa8cab46b95c00ec1be6dbb9d0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None attributes=None
Feb 23 07:48:13 np0005626463.localdomain sudo[43434]: pam_unix(sudo:session): session closed for user root
Feb 23 07:48:13 np0005626463.localdomain sudo[43464]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zjwekvuoxvjgsiptythmdhjiglivghbz ; /usr/bin/python3
Feb 23 07:48:13 np0005626463.localdomain sudo[43464]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:48:13 np0005626463.localdomain python3[43466]: ansible-systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 23 07:48:13 np0005626463.localdomain systemd[1]: systemd-modules-load.service: Deactivated successfully.
Feb 23 07:48:13 np0005626463.localdomain systemd[1]: Stopped Load Kernel Modules.
Feb 23 07:48:13 np0005626463.localdomain systemd[1]: Stopping Load Kernel Modules...
Feb 23 07:48:13 np0005626463.localdomain systemd[1]: Starting Load Kernel Modules...
Feb 23 07:48:13 np0005626463.localdomain kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Feb 23 07:48:13 np0005626463.localdomain kernel: Bridge firewalling registered
Feb 23 07:48:13 np0005626463.localdomain systemd-modules-load[43469]: Inserted module 'br_netfilter'
Feb 23 07:48:13 np0005626463.localdomain systemd-modules-load[43469]: Module 'msr' is built in
Feb 23 07:48:13 np0005626463.localdomain systemd[1]: Finished Load Kernel Modules.
Feb 23 07:48:13 np0005626463.localdomain sudo[43464]: pam_unix(sudo:session): session closed for user root
Feb 23 07:48:13 np0005626463.localdomain sudo[43518]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-acyfkpxrnxoeajkekqzrricdgvighipa ; /usr/bin/python3
Feb 23 07:48:13 np0005626463.localdomain sudo[43518]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:48:14 np0005626463.localdomain python3[43520]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-tripleo.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 07:48:14 np0005626463.localdomain sudo[43518]: pam_unix(sudo:session): session closed for user root
Feb 23 07:48:14 np0005626463.localdomain sudo[43561]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-exoysswkgqlatklxdnznwsmvleqmgqfl ; /usr/bin/python3
Feb 23 07:48:14 np0005626463.localdomain sudo[43561]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:48:14 np0005626463.localdomain python3[43563]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771832893.7593112-76550-127741251738842/source dest=/etc/sysctl.d/99-tripleo.conf mode=420 owner=root group=root setype=etc_t follow=False _original_basename=tripleo-sysctl.conf.j2 checksum=cddb9401fdafaaf28a4a94b98448f98ae93c94c9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None attributes=None
Feb 23 07:48:14 np0005626463.localdomain sudo[43561]: pam_unix(sudo:session): session closed for user root
Feb 23 07:48:14 np0005626463.localdomain sudo[43591]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jljxmxzxpymbviadmekxgzlaooeejtgl ; /usr/bin/python3
Feb 23 07:48:14 np0005626463.localdomain sudo[43591]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:48:14 np0005626463.localdomain python3[43593]: ansible-sysctl Invoked with name=fs.aio-max-nr value=1048576 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Feb 23 07:48:14 np0005626463.localdomain sudo[43591]: pam_unix(sudo:session): session closed for user root
Feb 23 07:48:15 np0005626463.localdomain sudo[43608]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tqbslavwgpzfijhxepllkcbbuqezrsef ; /usr/bin/python3
Feb 23 07:48:15 np0005626463.localdomain sudo[43608]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:48:15 np0005626463.localdomain python3[43610]: ansible-sysctl Invoked with name=fs.inotify.max_user_instances value=1024 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Feb 23 07:48:16 np0005626463.localdomain sudo[43608]: pam_unix(sudo:session): session closed for user root
Feb 23 07:48:16 np0005626463.localdomain sudo[43626]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bkknpcrqekqqqoevojrazlivzjmtdlbb ; /usr/bin/python3
Feb 23 07:48:16 np0005626463.localdomain sudo[43626]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:48:16 np0005626463.localdomain python3[43628]: ansible-sysctl Invoked with name=kernel.pid_max value=1048576 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Feb 23 07:48:16 np0005626463.localdomain sudo[43626]: pam_unix(sudo:session): session closed for user root
Feb 23 07:48:16 np0005626463.localdomain sudo[43644]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sashxrdnghvkgihanlgqpmywvujqafra ; /usr/bin/python3
Feb 23 07:48:16 np0005626463.localdomain sudo[43644]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:48:16 np0005626463.localdomain python3[43646]: ansible-sysctl Invoked with name=net.bridge.bridge-nf-call-arptables value=1 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Feb 23 07:48:16 np0005626463.localdomain sudo[43644]: pam_unix(sudo:session): session closed for user root
Feb 23 07:48:17 np0005626463.localdomain sudo[43661]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-unxjihfjswngzfhdqsirdnzxtiqfwtew ; /usr/bin/python3
Feb 23 07:48:17 np0005626463.localdomain sudo[43661]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:48:17 np0005626463.localdomain python3[43663]: ansible-sysctl Invoked with name=net.bridge.bridge-nf-call-ip6tables value=1 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Feb 23 07:48:17 np0005626463.localdomain sudo[43661]: pam_unix(sudo:session): session closed for user root
Feb 23 07:48:17 np0005626463.localdomain sudo[43678]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-coljqorgsbmyroqwmcvftwkggqzqwlwt ; /usr/bin/python3
Feb 23 07:48:17 np0005626463.localdomain sudo[43678]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:48:17 np0005626463.localdomain python3[43680]: ansible-sysctl Invoked with name=net.bridge.bridge-nf-call-iptables value=1 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Feb 23 07:48:17 np0005626463.localdomain sudo[43678]: pam_unix(sudo:session): session closed for user root
Feb 23 07:48:17 np0005626463.localdomain sudo[43695]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-twjhxzxjzbimpqjfhgrpsokrnyocrfvm ; /usr/bin/python3
Feb 23 07:48:17 np0005626463.localdomain sudo[43695]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:48:17 np0005626463.localdomain python3[43697]: ansible-sysctl Invoked with name=net.ipv4.conf.all.rp_filter value=1 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Feb 23 07:48:17 np0005626463.localdomain sudo[43695]: pam_unix(sudo:session): session closed for user root
Feb 23 07:48:17 np0005626463.localdomain sudo[43713]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-egiydljhzprthtpqquzqohowkjnooiar ; /usr/bin/python3
Feb 23 07:48:17 np0005626463.localdomain sudo[43713]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:48:18 np0005626463.localdomain python3[43715]: ansible-sysctl Invoked with name=net.ipv4.ip_forward value=1 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Feb 23 07:48:18 np0005626463.localdomain sudo[43713]: pam_unix(sudo:session): session closed for user root
Feb 23 07:48:18 np0005626463.localdomain sudo[43731]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fyiyuwzyepkqfpfwuectkdnibtcljacb ; /usr/bin/python3
Feb 23 07:48:18 np0005626463.localdomain sudo[43731]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:48:18 np0005626463.localdomain python3[43733]: ansible-sysctl Invoked with name=net.ipv4.ip_local_reserved_ports value=35357,49000-49001 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Feb 23 07:48:18 np0005626463.localdomain sudo[43731]: pam_unix(sudo:session): session closed for user root
Feb 23 07:48:18 np0005626463.localdomain sudo[43749]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-beidctainchavoksgktcfiwfvkczkdra ; /usr/bin/python3
Feb 23 07:48:18 np0005626463.localdomain sudo[43749]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:48:18 np0005626463.localdomain python3[43751]: ansible-sysctl Invoked with name=net.ipv4.ip_nonlocal_bind value=1 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Feb 23 07:48:18 np0005626463.localdomain sudo[43749]: pam_unix(sudo:session): session closed for user root
Feb 23 07:48:18 np0005626463.localdomain sudo[43767]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rwmlzwvmvfwmbaxslhzyzwvzpyncmblh ; /usr/bin/python3
Feb 23 07:48:18 np0005626463.localdomain sudo[43767]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:48:19 np0005626463.localdomain python3[43769]: ansible-sysctl Invoked with name=net.ipv4.neigh.default.gc_thresh1 value=1024 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Feb 23 07:48:19 np0005626463.localdomain sudo[43767]: pam_unix(sudo:session): session closed for user root
Feb 23 07:48:19 np0005626463.localdomain sudo[43785]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wnhbaxdtfdasnziqzfsjjslgajseufkm ; /usr/bin/python3
Feb 23 07:48:19 np0005626463.localdomain sudo[43785]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:48:19 np0005626463.localdomain python3[43787]: ansible-sysctl Invoked with name=net.ipv4.neigh.default.gc_thresh2 value=2048 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Feb 23 07:48:19 np0005626463.localdomain sudo[43785]: pam_unix(sudo:session): session closed for user root
Feb 23 07:48:19 np0005626463.localdomain sudo[43803]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pddhdqebceqxfhncplkciiekgevhfiog ; /usr/bin/python3
Feb 23 07:48:19 np0005626463.localdomain sudo[43803]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:48:19 np0005626463.localdomain python3[43805]: ansible-sysctl Invoked with name=net.ipv4.neigh.default.gc_thresh3 value=4096 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Feb 23 07:48:19 np0005626463.localdomain sudo[43803]: pam_unix(sudo:session): session closed for user root
Feb 23 07:48:19 np0005626463.localdomain sudo[43821]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-etmnwsrswkrnsbfdqrtxuabmdepfcxzw ; /usr/bin/python3
Feb 23 07:48:19 np0005626463.localdomain sudo[43821]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:48:20 np0005626463.localdomain python3[43823]: ansible-sysctl Invoked with name=net.ipv6.conf.all.disable_ipv6 value=0 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Feb 23 07:48:20 np0005626463.localdomain sudo[43821]: pam_unix(sudo:session): session closed for user root
Feb 23 07:48:20 np0005626463.localdomain sudo[43838]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qbueucaoyqrganmixldtucypqsonupiv ; /usr/bin/python3
Feb 23 07:48:20 np0005626463.localdomain sudo[43838]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:48:20 np0005626463.localdomain python3[43840]: ansible-sysctl Invoked with name=net.ipv6.conf.all.forwarding value=0 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Feb 23 07:48:20 np0005626463.localdomain sudo[43838]: pam_unix(sudo:session): session closed for user root
Feb 23 07:48:20 np0005626463.localdomain sudo[43855]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fnlczcfbktxuehlwwiyhnkmphvdsmjmw ; /usr/bin/python3
Feb 23 07:48:20 np0005626463.localdomain sudo[43855]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:48:20 np0005626463.localdomain python3[43857]: ansible-sysctl Invoked with name=net.ipv6.conf.default.disable_ipv6 value=0 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Feb 23 07:48:20 np0005626463.localdomain sudo[43855]: pam_unix(sudo:session): session closed for user root
Feb 23 07:48:20 np0005626463.localdomain sudo[43872]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pcnwvxswqzbifjawnxlcbyugwmymzxpd ; /usr/bin/python3
Feb 23 07:48:20 np0005626463.localdomain sudo[43872]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:48:20 np0005626463.localdomain python3[43874]: ansible-sysctl Invoked with name=net.ipv6.conf.lo.disable_ipv6 value=0 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Feb 23 07:48:20 np0005626463.localdomain sudo[43872]: pam_unix(sudo:session): session closed for user root
Feb 23 07:48:21 np0005626463.localdomain sudo[43889]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-raiiwpitpgcgnadkeayftptrfpzkwwtv ; /usr/bin/python3
Feb 23 07:48:21 np0005626463.localdomain sudo[43889]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:48:21 np0005626463.localdomain python3[43891]: ansible-sysctl Invoked with name=net.ipv6.ip_nonlocal_bind value=1 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Feb 23 07:48:21 np0005626463.localdomain sudo[43889]: pam_unix(sudo:session): session closed for user root
Feb 23 07:48:21 np0005626463.localdomain sudo[43907]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fntyjivrfkkydiclrhtfmkxecfsajxbu ; /usr/bin/python3
Feb 23 07:48:21 np0005626463.localdomain sudo[43907]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:48:21 np0005626463.localdomain python3[43909]: ansible-systemd Invoked with name=systemd-sysctl.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 23 07:48:21 np0005626463.localdomain systemd[1]: systemd-sysctl.service: Deactivated successfully.
Feb 23 07:48:21 np0005626463.localdomain systemd[1]: Stopped Apply Kernel Variables.
Feb 23 07:48:21 np0005626463.localdomain systemd[1]: Stopping Apply Kernel Variables...
Feb 23 07:48:21 np0005626463.localdomain systemd[1]: Starting Apply Kernel Variables...
Feb 23 07:48:21 np0005626463.localdomain systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Feb 23 07:48:21 np0005626463.localdomain systemd[1]: Finished Apply Kernel Variables.
Feb 23 07:48:21 np0005626463.localdomain sudo[43907]: pam_unix(sudo:session): session closed for user root
Feb 23 07:48:22 np0005626463.localdomain sudo[43927]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-veujtetzgysfcmmbdwabclcnzjmkjdln ; /usr/bin/python3
Feb 23 07:48:22 np0005626463.localdomain sudo[43927]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:48:22 np0005626463.localdomain python3[43929]: ansible-file Invoked with mode=0750 path=/var/log/containers/metrics_qdr setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 23 07:48:22 np0005626463.localdomain sudo[43927]: pam_unix(sudo:session): session closed for user root
Feb 23 07:48:22 np0005626463.localdomain sudo[43943]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zxsjquvnhdhsraayonusobrxfsumkzmy ; /usr/bin/python3
Feb 23 07:48:22 np0005626463.localdomain sudo[43943]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:48:22 np0005626463.localdomain python3[43945]: ansible-file Invoked with path=/var/lib/metrics_qdr setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 23 07:48:22 np0005626463.localdomain sudo[43943]: pam_unix(sudo:session): session closed for user root
Feb 23 07:48:22 np0005626463.localdomain sudo[43959]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hfhvorwwpcmwjdkiaqkctjhvbsagtumu ; /usr/bin/python3
Feb 23 07:48:22 np0005626463.localdomain sudo[43959]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:48:22 np0005626463.localdomain python3[43961]: ansible-file Invoked with mode=0750 path=/var/log/containers/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 23 07:48:22 np0005626463.localdomain sudo[43959]: pam_unix(sudo:session): session closed for user root
Feb 23 07:48:23 np0005626463.localdomain sudo[43975]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nvoewtidvyiaehmxjkmrnxzqxpqjrlvb ; /usr/bin/python3
Feb 23 07:48:23 np0005626463.localdomain sudo[43975]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:48:23 np0005626463.localdomain python3[43977]: ansible-stat Invoked with path=/var/lib/nova/instances follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 23 07:48:23 np0005626463.localdomain sudo[43975]: pam_unix(sudo:session): session closed for user root
Feb 23 07:48:23 np0005626463.localdomain sudo[43991]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gjwpriwvsyqhrtohneoseyocvrjvuyxp ; /usr/bin/python3
Feb 23 07:48:23 np0005626463.localdomain sudo[43991]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:48:23 np0005626463.localdomain python3[43993]: ansible-file Invoked with path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 23 07:48:23 np0005626463.localdomain sudo[43991]: pam_unix(sudo:session): session closed for user root
Feb 23 07:48:23 np0005626463.localdomain sudo[44007]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wjtgmrnvwvfcvtggedaobyjfstgkauxw ; /usr/bin/python3
Feb 23 07:48:23 np0005626463.localdomain sudo[44007]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:48:23 np0005626463.localdomain python3[44009]: ansible-file Invoked with path=/var/lib/_nova_secontext setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 23 07:48:23 np0005626463.localdomain sudo[44007]: pam_unix(sudo:session): session closed for user root
Feb 23 07:48:23 np0005626463.localdomain sudo[44023]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xzczzhqboiihyacfqvecvtfntsjtcxtg ; /usr/bin/python3
Feb 23 07:48:23 np0005626463.localdomain sudo[44023]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:48:24 np0005626463.localdomain python3[44025]: ansible-file Invoked with path=/var/lib/nova/instances setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 23 07:48:24 np0005626463.localdomain sudo[44023]: pam_unix(sudo:session): session closed for user root
Feb 23 07:48:24 np0005626463.localdomain sudo[44039]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jixarzffcsbfhpxokhnjhemlxqmlfiic ; /usr/bin/python3
Feb 23 07:48:24 np0005626463.localdomain sudo[44039]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:48:24 np0005626463.localdomain python3[44041]: ansible-file Invoked with path=/var/lib/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 23 07:48:24 np0005626463.localdomain sudo[44039]: pam_unix(sudo:session): session closed for user root
Feb 23 07:48:24 np0005626463.localdomain sudo[44055]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qbfvliuoppijdvolticllulvornztwze ; /usr/bin/python3
Feb 23 07:48:24 np0005626463.localdomain sudo[44055]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:48:24 np0005626463.localdomain python3[44057]: ansible-file Invoked with path=/etc/tmpfiles.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 07:48:24 np0005626463.localdomain sudo[44055]: pam_unix(sudo:session): session closed for user root
Feb 23 07:48:25 np0005626463.localdomain sudo[44103]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eqwdpmlpdlokwmkzlqbozfihmfttkiqa ; /usr/bin/python3
Feb 23 07:48:25 np0005626463.localdomain sudo[44103]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:48:25 np0005626463.localdomain python3[44105]: ansible-ansible.legacy.stat Invoked with path=/etc/tmpfiles.d/run-nova.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 07:48:25 np0005626463.localdomain sudo[44103]: pam_unix(sudo:session): session closed for user root
Feb 23 07:48:25 np0005626463.localdomain sudo[44146]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-asjamshzivxirdicffcwwpvgikbvfwac ; /usr/bin/python3
Feb 23 07:48:25 np0005626463.localdomain sudo[44146]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:48:25 np0005626463.localdomain python3[44148]: ansible-ansible.legacy.copy Invoked with dest=/etc/tmpfiles.d/run-nova.conf src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771832904.8883739-76904-214070854249969/source _original_basename=tmpvoha4s0v follow=False checksum=f834349098718ec09c7562bcb470b717a83ff411 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 07:48:25 np0005626463.localdomain sudo[44146]: pam_unix(sudo:session): session closed for user root
Feb 23 07:48:25 np0005626463.localdomain sudo[44176]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pfhhsdybxovsipdnbptnirfnpifpeplu ; /usr/bin/python3
Feb 23 07:48:25 np0005626463.localdomain sudo[44176]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:48:25 np0005626463.localdomain python3[44178]: ansible-ansible.legacy.command Invoked with _raw_params=systemd-tmpfiles --create _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 07:48:26 np0005626463.localdomain sudo[44176]: pam_unix(sudo:session): session closed for user root
Feb 23 07:48:27 np0005626463.localdomain sudo[44193]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zbmfmgwfnwujosjjrjknxycismyxxprr ; /usr/bin/python3
Feb 23 07:48:27 np0005626463.localdomain sudo[44193]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:48:27 np0005626463.localdomain python3[44195]: ansible-file Invoked with path=/var/lib/tripleo-config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 07:48:27 np0005626463.localdomain sudo[44193]: pam_unix(sudo:session): session closed for user root
Feb 23 07:48:27 np0005626463.localdomain sudo[44241]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-plozvxuhgqdswsddfrgwuioltapfwjci ; /usr/bin/python3
Feb 23 07:48:27 np0005626463.localdomain sudo[44241]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:48:27 np0005626463.localdomain python3[44243]: ansible-ansible.legacy.stat Invoked with path=/var/lib/nova/delay-nova-compute follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 07:48:27 np0005626463.localdomain sudo[44241]: pam_unix(sudo:session): session closed for user root
Feb 23 07:48:28 np0005626463.localdomain sudo[44284]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-slvmldngdbmopaepxsrkzmtsecmhmxog ; /usr/bin/python3
Feb 23 07:48:28 np0005626463.localdomain sudo[44284]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:48:28 np0005626463.localdomain python3[44286]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/nova/delay-nova-compute mode=493 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771832907.6361003-77346-32685908769013/source _original_basename=tmp2pg00k07 follow=False checksum=f07ad3e8cf3766b3b3b07ae8278826a0ef3bb5e3 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 07:48:28 np0005626463.localdomain sudo[44284]: pam_unix(sudo:session): session closed for user root
Feb 23 07:48:28 np0005626463.localdomain sudo[44314]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fuppkvkpfoftkqalmrophhkysfgcebrb ; /usr/bin/python3
Feb 23 07:48:28 np0005626463.localdomain sudo[44314]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:48:28 np0005626463.localdomain python3[44316]: ansible-file Invoked with mode=0750 path=/var/log/containers/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 23 07:48:28 np0005626463.localdomain sudo[44314]: pam_unix(sudo:session): session closed for user root
Feb 23 07:48:29 np0005626463.localdomain sudo[44330]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ucebxhqnurbxvjonfuydhthwrhcwpved ; /usr/bin/python3
Feb 23 07:48:29 np0005626463.localdomain sudo[44330]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:48:29 np0005626463.localdomain python3[44332]: ansible-file Invoked with path=/etc/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 23 07:48:29 np0005626463.localdomain sudo[44330]: pam_unix(sudo:session): session closed for user root
Feb 23 07:48:29 np0005626463.localdomain sudo[44346]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jhrdudyqelwhsokvfnoqmabcqkmtpmpm ; /usr/bin/python3
Feb 23 07:48:29 np0005626463.localdomain sudo[44346]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:48:29 np0005626463.localdomain python3[44348]: ansible-file Invoked with path=/etc/libvirt/secrets setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 23 07:48:29 np0005626463.localdomain sudo[44346]: pam_unix(sudo:session): session closed for user root
Feb 23 07:48:29 np0005626463.localdomain sudo[44362]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xdfyjygavnagdxeytqffpabyrqlmynmy ; /usr/bin/python3
Feb 23 07:48:29 np0005626463.localdomain sudo[44362]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:48:29 np0005626463.localdomain python3[44364]: ansible-file Invoked with path=/etc/libvirt/qemu setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 23 07:48:29 np0005626463.localdomain sudo[44362]: pam_unix(sudo:session): session closed for user root
Feb 23 07:48:29 np0005626463.localdomain sudo[44378]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ikxmvjdzsfvefgeqvvspzaziplmzbalu ; /usr/bin/python3
Feb 23 07:48:29 np0005626463.localdomain sudo[44378]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:48:30 np0005626463.localdomain python3[44380]: ansible-file Invoked with path=/var/lib/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 23 07:48:30 np0005626463.localdomain sudo[44378]: pam_unix(sudo:session): session closed for user root
Feb 23 07:48:30 np0005626463.localdomain sudo[44394]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aiqksikhauocvblomofqdyzhqrcgcmbt ; /usr/bin/python3
Feb 23 07:48:30 np0005626463.localdomain sudo[44394]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:48:30 np0005626463.localdomain python3[44396]: ansible-file Invoked with path=/var/cache/libvirt state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 07:48:30 np0005626463.localdomain sudo[44394]: pam_unix(sudo:session): session closed for user root
Feb 23 07:48:30 np0005626463.localdomain sudo[44410]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vgpqvpujtfzhgqbldkdqokmylonmljjb ; /usr/bin/python3
Feb 23 07:48:30 np0005626463.localdomain sudo[44410]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:48:30 np0005626463.localdomain python3[44412]: ansible-file Invoked with path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 23 07:48:30 np0005626463.localdomain sudo[44410]: pam_unix(sudo:session): session closed for user root
Feb 23 07:48:30 np0005626463.localdomain sudo[44426]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qatajbhyonwvrvbdkwupqmnhxbuzbbyn ; /usr/bin/python3
Feb 23 07:48:30 np0005626463.localdomain sudo[44426]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:48:31 np0005626463.localdomain python3[44428]: ansible-file Invoked with path=/run/libvirt state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 07:48:31 np0005626463.localdomain sudo[44426]: pam_unix(sudo:session): session closed for user root
Feb 23 07:48:31 np0005626463.localdomain sudo[44442]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hrtvtiqvcwrmrydjmdfurjnfvmhtugku ; /usr/bin/python3
Feb 23 07:48:31 np0005626463.localdomain sudo[44442]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:48:31 np0005626463.localdomain python3[44444]: ansible-file Invoked with mode=0770 path=/var/log/containers/libvirt/swtpm setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 23 07:48:31 np0005626463.localdomain sudo[44442]: pam_unix(sudo:session): session closed for user root
Feb 23 07:48:31 np0005626463.localdomain sudo[44458]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zdoacdrevbloxcfifvqkzhetnleuwrqb ; /usr/bin/python3
Feb 23 07:48:31 np0005626463.localdomain sudo[44458]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:48:31 np0005626463.localdomain python3[44460]: ansible-group Invoked with gid=107 name=qemu state=present system=False local=False non_unique=False
Feb 23 07:48:31 np0005626463.localdomain groupadd[44461]: group added to /etc/group: name=qemu, GID=107
Feb 23 07:48:31 np0005626463.localdomain groupadd[44461]: group added to /etc/gshadow: name=qemu
Feb 23 07:48:31 np0005626463.localdomain groupadd[44461]: new group: name=qemu, GID=107
Feb 23 07:48:31 np0005626463.localdomain sudo[44458]: pam_unix(sudo:session): session closed for user root
Feb 23 07:48:31 np0005626463.localdomain sudo[44480]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ntcthfhgklohiwgmbtdsfglkzbenmifg ; /usr/bin/python3
Feb 23 07:48:31 np0005626463.localdomain sudo[44480]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:48:32 np0005626463.localdomain python3[44482]: ansible-user Invoked with comment=qemu user group=qemu name=qemu shell=/sbin/nologin state=present uid=107 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005626463.localdomain update_password=always groups=None home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None
Feb 23 07:48:32 np0005626463.localdomain useradd[44484]: new user: name=qemu, UID=107, GID=107, home=/home/qemu, shell=/sbin/nologin, from=none
Feb 23 07:48:32 np0005626463.localdomain sudo[44480]: pam_unix(sudo:session): session closed for user root
Feb 23 07:48:32 np0005626463.localdomain sudo[44504]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wuspyjlxkbsgxnxxqbazkuztflelghbh ; /usr/bin/python3
Feb 23 07:48:32 np0005626463.localdomain sudo[44504]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:48:32 np0005626463.localdomain python3[44506]: ansible-file Invoked with group=qemu owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None serole=None selevel=None attributes=None
Feb 23 07:48:32 np0005626463.localdomain sudo[44504]: pam_unix(sudo:session): session closed for user root
Feb 23 07:48:32 np0005626463.localdomain sudo[44520]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-whfaxbbqghyddmiumublldguxjpyxinh ; /usr/bin/python3
Feb 23 07:48:32 np0005626463.localdomain sudo[44520]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:48:32 np0005626463.localdomain python3[44522]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/rpm -q libvirt-daemon _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 07:48:32 np0005626463.localdomain sudo[44520]: pam_unix(sudo:session): session closed for user root
Feb 23 07:48:33 np0005626463.localdomain sudo[44569]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cqeknfxummkecuuannmsgmlljmiuamby ; /usr/bin/python3
Feb 23 07:48:33 np0005626463.localdomain sudo[44569]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:48:33 np0005626463.localdomain python3[44571]: ansible-ansible.legacy.stat Invoked with path=/etc/tmpfiles.d/run-libvirt.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 07:48:33 np0005626463.localdomain sudo[44569]: pam_unix(sudo:session): session closed for user root
Feb 23 07:48:33 np0005626463.localdomain sudo[44612]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hmlmdjeaqiriqzcrlnnflxsiufwayjvm ; /usr/bin/python3
Feb 23 07:48:33 np0005626463.localdomain sudo[44612]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:48:33 np0005626463.localdomain python3[44614]: ansible-ansible.legacy.copy Invoked with dest=/etc/tmpfiles.d/run-libvirt.conf src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771832913.1796858-77560-186325677768155/source _original_basename=tmp6h7dzj6q follow=False checksum=57f3ff94c666c6aae69ae22e23feb750cf9e8b13 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 07:48:33 np0005626463.localdomain sudo[44612]: pam_unix(sudo:session): session closed for user root
Feb 23 07:48:34 np0005626463.localdomain sudo[44642]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nyashrurvryoiwbxtwfrrilpobmaebkk ; /usr/bin/python3
Feb 23 07:48:34 np0005626463.localdomain sudo[44642]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:48:34 np0005626463.localdomain python3[44644]: ansible-seboolean Invoked with name=os_enable_vtpm persistent=True state=True ignore_selinux_state=False
Feb 23 07:48:34 np0005626463.localdomain sudo[44642]: pam_unix(sudo:session): session closed for user root
Feb 23 07:48:35 np0005626463.localdomain sudo[44662]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ugtddjsqxmxwdihzpycxtfbufuuevkzw ; /usr/bin/python3
Feb 23 07:48:35 np0005626463.localdomain dbus-broker-launch[754]: avc:  op=load_policy lsm=selinux seqno=11 res=1
Feb 23 07:48:35 np0005626463.localdomain sudo[44662]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:48:35 np0005626463.localdomain python3[44664]: ansible-file Invoked with path=/etc/crypto-policies/local.d/gnutls-qemu.config state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 07:48:35 np0005626463.localdomain sudo[44662]: pam_unix(sudo:session): session closed for user root
Feb 23 07:48:35 np0005626463.localdomain sudo[44678]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-llwoiplqifgxsfhtfmugbxxuhghjvule ; /usr/bin/python3
Feb 23 07:48:35 np0005626463.localdomain sudo[44678]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:48:35 np0005626463.localdomain python3[44680]: ansible-file Invoked with path=/run/libvirt setype=virt_var_run_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 23 07:48:35 np0005626463.localdomain sudo[44678]: pam_unix(sudo:session): session closed for user root
Feb 23 07:48:35 np0005626463.localdomain sudo[44694]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mgfodondicxkdgzcuzofwglyhgdtancf ; /usr/bin/python3
Feb 23 07:48:35 np0005626463.localdomain sudo[44694]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:48:35 np0005626463.localdomain python3[44696]: ansible-seboolean Invoked with name=logrotate_read_inside_containers persistent=True state=True ignore_selinux_state=False
Feb 23 07:48:36 np0005626463.localdomain sudo[44694]: pam_unix(sudo:session): session closed for user root
Feb 23 07:48:37 np0005626463.localdomain sshd[44704]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 07:48:37 np0005626463.localdomain sudo[44719]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-egdsdcudmxcgculsernszfupkxftomfa ; /usr/bin/python3
Feb 23 07:48:37 np0005626463.localdomain dbus-broker-launch[754]: avc:  op=load_policy lsm=selinux seqno=12 res=1
Feb 23 07:48:37 np0005626463.localdomain sudo[44719]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:48:37 np0005626463.localdomain python3[44721]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Feb 23 07:48:37 np0005626463.localdomain sshd[44704]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 07:48:40 np0005626463.localdomain sudo[44719]: pam_unix(sudo:session): session closed for user root
Feb 23 07:48:40 np0005626463.localdomain sudo[44736]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pmsoeqjplwmvcexhohddvtqsgjwqipob ; /usr/bin/python3
Feb 23 07:48:40 np0005626463.localdomain sudo[44736]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:48:40 np0005626463.localdomain python3[44738]: ansible-setup Invoked with gather_subset=['!all', '!min', 'network'] filter=['ansible_interfaces'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 23 07:48:40 np0005626463.localdomain sudo[44736]: pam_unix(sudo:session): session closed for user root
Feb 23 07:48:41 np0005626463.localdomain sudo[44797]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zgazhfkkzltpmwpealdywhoncgdaivys ; /usr/bin/python3
Feb 23 07:48:41 np0005626463.localdomain sudo[44797]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:48:41 np0005626463.localdomain python3[44799]: ansible-file Invoked with path=/etc/containers/networks state=directory recurse=True mode=493 owner=root group=root force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 07:48:41 np0005626463.localdomain sudo[44797]: pam_unix(sudo:session): session closed for user root
Feb 23 07:48:41 np0005626463.localdomain sudo[44813]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xruqrmreiyiaifbkjamfywqexcticnmp ; /usr/bin/python3
Feb 23 07:48:41 np0005626463.localdomain sudo[44813]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:48:41 np0005626463.localdomain python3[44815]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 07:48:41 np0005626463.localdomain sudo[44813]: pam_unix(sudo:session): session closed for user root
Feb 23 07:48:42 np0005626463.localdomain sudo[44881]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fnikiefdhumfjjowjgtesufggisvvban ; /usr/bin/python3
Feb 23 07:48:42 np0005626463.localdomain sudo[44881]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:48:42 np0005626463.localdomain sudo[44867]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 07:48:42 np0005626463.localdomain sudo[44867]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 07:48:42 np0005626463.localdomain sudo[44867]: pam_unix(sudo:session): session closed for user root
Feb 23 07:48:42 np0005626463.localdomain sudo[44891]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 check-host
Feb 23 07:48:42 np0005626463.localdomain sudo[44891]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 07:48:42 np0005626463.localdomain python3[44889]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 07:48:42 np0005626463.localdomain sudo[44881]: pam_unix(sudo:session): session closed for user root
Feb 23 07:48:42 np0005626463.localdomain sudo[44946]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yxxoaqtedhldnzekboljhacdzvfakrnn ; /usr/bin/python3
Feb 23 07:48:42 np0005626463.localdomain sudo[44946]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:48:42 np0005626463.localdomain python3[44948]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771832921.9844236-77903-250384013012976/source dest=/etc/containers/networks/podman.json mode=0644 owner=root group=root follow=False _original_basename=podman_network_config.j2 checksum=553fb9d1f969873d7079b6b23ab84e26a2830710 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 07:48:42 np0005626463.localdomain sudo[44946]: pam_unix(sudo:session): session closed for user root
Feb 23 07:48:42 np0005626463.localdomain sudo[44891]: pam_unix(sudo:session): session closed for user root
Feb 23 07:48:42 np0005626463.localdomain sudo[44983]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 07:48:42 np0005626463.localdomain sudo[44983]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 07:48:42 np0005626463.localdomain sudo[44983]: pam_unix(sudo:session): session closed for user root
Feb 23 07:48:43 np0005626463.localdomain sudo[45030]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 23 07:48:43 np0005626463.localdomain sudo[45030]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 07:48:43 np0005626463.localdomain sudo[45058]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zrzqykhgddbumcfjcbvoirzrlplvuzhd ; /usr/bin/python3
Feb 23 07:48:43 np0005626463.localdomain sudo[45058]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:48:43 np0005626463.localdomain python3[45060]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 07:48:43 np0005626463.localdomain sudo[45058]: pam_unix(sudo:session): session closed for user root
Feb 23 07:48:43 np0005626463.localdomain sudo[45118]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rbumjiaaflqiqpqhoafcbsuwxogakvlf ; /usr/bin/python3
Feb 23 07:48:43 np0005626463.localdomain sudo[45118]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:48:43 np0005626463.localdomain python3[45123]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771832922.9251978-77955-206199377258367/source dest=/etc/containers/registries.conf owner=root group=root setype=etc_t mode=0644 follow=False _original_basename=registries.conf.j2 checksum=710a00cfb11a4c3eba9c028ef1984a9fea9ba83a backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None attributes=None
Feb 23 07:48:43 np0005626463.localdomain sudo[45118]: pam_unix(sudo:session): session closed for user root
Feb 23 07:48:43 np0005626463.localdomain sudo[45030]: pam_unix(sudo:session): session closed for user root
Feb 23 07:48:43 np0005626463.localdomain sudo[45165]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nplhfvqwmiqchhmbnlqrnotqtzqhgfwy ; /usr/bin/python3
Feb 23 07:48:43 np0005626463.localdomain sudo[45165]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:48:44 np0005626463.localdomain sudo[45167]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 07:48:44 np0005626463.localdomain sudo[45167]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 07:48:44 np0005626463.localdomain sudo[45167]: pam_unix(sudo:session): session closed for user root
Feb 23 07:48:44 np0005626463.localdomain python3[45173]: ansible-ini_file Invoked with path=/etc/containers/containers.conf owner=root group=root setype=etc_t mode=0644 create=True section=containers option=pids_limit value=4096 backup=False state=present exclusive=True no_extra_spaces=False allow_no_value=False unsafe_writes=False values=None seuser=None serole=None selevel=None attributes=None
Feb 23 07:48:44 np0005626463.localdomain sudo[45165]: pam_unix(sudo:session): session closed for user root
Feb 23 07:48:44 np0005626463.localdomain sudo[45196]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rwiwstxoduulahyoxcewynnsbhnzcuxo ; /usr/bin/python3
Feb 23 07:48:44 np0005626463.localdomain sudo[45196]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:48:44 np0005626463.localdomain python3[45198]: ansible-ini_file Invoked with path=/etc/containers/containers.conf owner=root group=root setype=etc_t mode=0644 create=True section=engine option=events_logger value="journald" backup=False state=present exclusive=True no_extra_spaces=False allow_no_value=False unsafe_writes=False values=None seuser=None serole=None selevel=None attributes=None
Feb 23 07:48:44 np0005626463.localdomain sudo[45196]: pam_unix(sudo:session): session closed for user root
Feb 23 07:48:44 np0005626463.localdomain sudo[45212]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ecezeplhezvcfveakivgshqreijwecvm ; /usr/bin/python3
Feb 23 07:48:44 np0005626463.localdomain sudo[45212]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:48:44 np0005626463.localdomain python3[45214]: ansible-ini_file Invoked with path=/etc/containers/containers.conf owner=root group=root setype=etc_t mode=0644 create=True section=engine option=runtime value="crun" backup=False state=present exclusive=True no_extra_spaces=False allow_no_value=False unsafe_writes=False values=None seuser=None serole=None selevel=None attributes=None
Feb 23 07:48:44 np0005626463.localdomain sudo[45212]: pam_unix(sudo:session): session closed for user root
Feb 23 07:48:44 np0005626463.localdomain sudo[45228]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xtkfpusrohbghguxegnkoutaorbjtxrk ; /usr/bin/python3
Feb 23 07:48:44 np0005626463.localdomain sudo[45228]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:48:45 np0005626463.localdomain python3[45230]: ansible-ini_file Invoked with path=/etc/containers/containers.conf owner=root group=root setype=etc_t mode=0644 create=True section=network option=network_backend value="netavark" backup=False state=present exclusive=True no_extra_spaces=False allow_no_value=False unsafe_writes=False values=None seuser=None serole=None selevel=None attributes=None
Feb 23 07:48:45 np0005626463.localdomain sudo[45228]: pam_unix(sudo:session): session closed for user root
Feb 23 07:48:45 np0005626463.localdomain sudo[45276]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rrkqrqgavamkbjpzkhgwjtjatdugijsb ; /usr/bin/python3
Feb 23 07:48:45 np0005626463.localdomain sudo[45276]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:48:45 np0005626463.localdomain python3[45278]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 07:48:45 np0005626463.localdomain sudo[45276]: pam_unix(sudo:session): session closed for user root
Feb 23 07:48:46 np0005626463.localdomain sudo[45319]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fwmbqrfteckxxsexiklfxvkdnbdgzmna ; /usr/bin/python3
Feb 23 07:48:46 np0005626463.localdomain sudo[45319]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:48:46 np0005626463.localdomain python3[45321]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysconfig/podman_drop_in src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771832925.4933586-78133-176510434695991/source _original_basename=tmp3fas637t follow=False checksum=0bfbc70e9a4740c9004b9947da681f723d529c83 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 07:48:46 np0005626463.localdomain sudo[45319]: pam_unix(sudo:session): session closed for user root
Feb 23 07:48:46 np0005626463.localdomain sudo[45349]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oelerejvnyxgtudkujixwcqmlolqxffi ; /usr/bin/python3
Feb 23 07:48:46 np0005626463.localdomain sudo[45349]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:48:46 np0005626463.localdomain python3[45351]: ansible-file Invoked with mode=0750 path=/var/log/containers/rsyslog setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 23 07:48:46 np0005626463.localdomain sudo[45349]: pam_unix(sudo:session): session closed for user root
Feb 23 07:48:46 np0005626463.localdomain sudo[45365]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hfrrhhgcykuxukgbuivgpsifvoisqkax ; /usr/bin/python3
Feb 23 07:48:46 np0005626463.localdomain sudo[45365]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:48:47 np0005626463.localdomain python3[45367]: ansible-file Invoked with path=/var/lib/rsyslog.container setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 23 07:48:47 np0005626463.localdomain sudo[45365]: pam_unix(sudo:session): session closed for user root
Feb 23 07:48:47 np0005626463.localdomain sudo[45381]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fsdjxojhugbzfwmirlnqwdnsneqeycgp ; /usr/bin/python3
Feb 23 07:48:47 np0005626463.localdomain sudo[45381]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:48:47 np0005626463.localdomain python3[45383]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Feb 23 07:48:50 np0005626463.localdomain sshd[45385]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 07:48:50 np0005626463.localdomain sudo[45381]: pam_unix(sudo:session): session closed for user root
Feb 23 07:48:51 np0005626463.localdomain sudo[45432]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wvyixyivuszwtodyukdhflzgbelolzyv ; /usr/bin/python3
Feb 23 07:48:51 np0005626463.localdomain sudo[45432]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:48:51 np0005626463.localdomain sshd[45385]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 07:48:51 np0005626463.localdomain python3[45434]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 07:48:51 np0005626463.localdomain sudo[45432]: pam_unix(sudo:session): session closed for user root
Feb 23 07:48:51 np0005626463.localdomain sudo[45477]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oesybnndrmyxpjprxdxmugxonrvlmxas ; /usr/bin/python3
Feb 23 07:48:51 np0005626463.localdomain sudo[45477]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:48:51 np0005626463.localdomain python3[45479]: ansible-ansible.legacy.copy Invoked with dest=/etc/ssh/sshd_config src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771832930.8503578-78353-208169489117578/source validate=/usr/sbin/sshd -T -f %s mode=None follow=False _original_basename=sshd_config_block.j2 checksum=913c99ed7d5c33615bfb07a6792a4ef143dcfd2b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 07:48:51 np0005626463.localdomain sudo[45477]: pam_unix(sudo:session): session closed for user root
Feb 23 07:48:51 np0005626463.localdomain sudo[45508]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qchkpqmizorozglsyynlehmdfulpgerp ; /usr/bin/python3
Feb 23 07:48:51 np0005626463.localdomain sudo[45508]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:48:52 np0005626463.localdomain python3[45510]: ansible-systemd Invoked with name=sshd state=restarted enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 07:48:52 np0005626463.localdomain sshd[1132]: Received signal 15; terminating.
Feb 23 07:48:52 np0005626463.localdomain systemd[1]: Stopping OpenSSH server daemon...
Feb 23 07:48:52 np0005626463.localdomain systemd[1]: sshd.service: Deactivated successfully.
Feb 23 07:48:52 np0005626463.localdomain systemd[1]: Stopped OpenSSH server daemon.
Feb 23 07:48:52 np0005626463.localdomain systemd[1]: sshd.service: Consumed 3.936s CPU time, read 1.9M from disk, written 56.0K to disk.
Feb 23 07:48:52 np0005626463.localdomain systemd[1]: Stopped target sshd-keygen.target.
Feb 23 07:48:52 np0005626463.localdomain systemd[1]: Stopping sshd-keygen.target...
Feb 23 07:48:52 np0005626463.localdomain systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Feb 23 07:48:52 np0005626463.localdomain systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Feb 23 07:48:52 np0005626463.localdomain systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Feb 23 07:48:52 np0005626463.localdomain systemd[1]: Reached target sshd-keygen.target.
Feb 23 07:48:52 np0005626463.localdomain systemd[1]: Starting OpenSSH server daemon...
Feb 23 07:48:52 np0005626463.localdomain sshd[45514]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 07:48:52 np0005626463.localdomain sshd[45514]: Server listening on 0.0.0.0 port 22.
Feb 23 07:48:52 np0005626463.localdomain sshd[45514]: Server listening on :: port 22.
Feb 23 07:48:52 np0005626463.localdomain systemd[1]: Started OpenSSH server daemon.
Feb 23 07:48:52 np0005626463.localdomain sudo[45508]: pam_unix(sudo:session): session closed for user root
Feb 23 07:48:52 np0005626463.localdomain sudo[45528]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aiiftfbfyoycyasuxcfqbvfaladhunpn ; /usr/bin/python3
Feb 23 07:48:52 np0005626463.localdomain sudo[45528]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:48:52 np0005626463.localdomain python3[45530]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-active ntpd.service || systemctl is-enabled ntpd.service _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 07:48:52 np0005626463.localdomain sudo[45528]: pam_unix(sudo:session): session closed for user root
Feb 23 07:48:53 np0005626463.localdomain sudo[45546]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yoaslwknnwheirchdqhgzfqsnrzuunki ; /usr/bin/python3
Feb 23 07:48:53 np0005626463.localdomain sudo[45546]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:48:53 np0005626463.localdomain python3[45548]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-active ntpd.service || systemctl is-enabled ntpd.service _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 07:48:53 np0005626463.localdomain sudo[45546]: pam_unix(sudo:session): session closed for user root
Feb 23 07:48:53 np0005626463.localdomain sudo[45564]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bsmpxhozgrhefxiozgybpijyanurctaw ; /usr/bin/python3
Feb 23 07:48:53 np0005626463.localdomain sudo[45564]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:48:54 np0005626463.localdomain python3[45566]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Feb 23 07:48:56 np0005626463.localdomain sudo[45564]: pam_unix(sudo:session): session closed for user root
Feb 23 07:48:57 np0005626463.localdomain sudo[45613]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ejgbvmqbbxmvqvafegixccqrmbgiarge ; /usr/bin/python3
Feb 23 07:48:57 np0005626463.localdomain sudo[45613]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:48:57 np0005626463.localdomain python3[45615]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 07:48:57 np0005626463.localdomain sudo[45613]: pam_unix(sudo:session): session closed for user root
Feb 23 07:48:57 np0005626463.localdomain sudo[45631]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zphscbmfpaffqzxgliwflkahyzfpkgdk ; /usr/bin/python3
Feb 23 07:48:57 np0005626463.localdomain sudo[45631]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:48:57 np0005626463.localdomain python3[45633]: ansible-ansible.legacy.file Invoked with owner=root group=root mode=420 dest=/etc/chrony.conf _original_basename=chrony.conf.j2 recurse=False state=file path=/etc/chrony.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 07:48:57 np0005626463.localdomain sudo[45631]: pam_unix(sudo:session): session closed for user root
Feb 23 07:48:58 np0005626463.localdomain sudo[45661]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mybkdvzctykuilpzkfkogxwzvifstkuz ; /usr/bin/python3
Feb 23 07:48:58 np0005626463.localdomain sudo[45661]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:48:58 np0005626463.localdomain python3[45663]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 07:48:58 np0005626463.localdomain sudo[45661]: pam_unix(sudo:session): session closed for user root
Feb 23 07:48:59 np0005626463.localdomain sudo[45711]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zhpgawzysmvfiyscksomtwepgjsnajja ; /usr/bin/python3
Feb 23 07:48:59 np0005626463.localdomain sudo[45711]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:48:59 np0005626463.localdomain python3[45713]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/chrony-online.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 07:48:59 np0005626463.localdomain sudo[45711]: pam_unix(sudo:session): session closed for user root
Feb 23 07:48:59 np0005626463.localdomain sudo[45729]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cxhgecrqrjubvbyvznqiwqhdsefsjcuj ; /usr/bin/python3
Feb 23 07:48:59 np0005626463.localdomain sudo[45729]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:48:59 np0005626463.localdomain python3[45731]: ansible-ansible.legacy.file Invoked with dest=/etc/systemd/system/chrony-online.service _original_basename=chrony-online.service recurse=False state=file path=/etc/systemd/system/chrony-online.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 07:48:59 np0005626463.localdomain sudo[45729]: pam_unix(sudo:session): session closed for user root
Feb 23 07:48:59 np0005626463.localdomain sudo[45759]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rmpahuadgykiitsexlgkuphwrzrsitlf ; /usr/bin/python3
Feb 23 07:48:59 np0005626463.localdomain sudo[45759]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:49:00 np0005626463.localdomain python3[45761]: ansible-systemd Invoked with state=started name=chrony-online.service enabled=True daemon-reload=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 07:49:00 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 07:49:00 np0005626463.localdomain systemd-rc-local-generator[45783]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 07:49:00 np0005626463.localdomain systemd-sysv-generator[45787]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 07:49:00 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 07:49:00 np0005626463.localdomain systemd[1]: Starting chronyd online sources service...
Feb 23 07:49:00 np0005626463.localdomain chronyc[45801]: 200 OK
Feb 23 07:49:00 np0005626463.localdomain systemd[1]: chrony-online.service: Deactivated successfully.
Feb 23 07:49:00 np0005626463.localdomain systemd[1]: Finished chronyd online sources service.
Feb 23 07:49:00 np0005626463.localdomain sudo[45759]: pam_unix(sudo:session): session closed for user root
Feb 23 07:49:00 np0005626463.localdomain sudo[45815]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dzcjehjfblfzppfxtrswawnjbexyrivs ; /usr/bin/python3
Feb 23 07:49:00 np0005626463.localdomain sudo[45815]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:49:00 np0005626463.localdomain python3[45817]: ansible-ansible.legacy.command Invoked with _raw_params=chronyc makestep _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 07:49:00 np0005626463.localdomain chronyd[25974]: System clock was stepped by 0.000003 seconds
Feb 23 07:49:00 np0005626463.localdomain sudo[45815]: pam_unix(sudo:session): session closed for user root
Feb 23 07:49:01 np0005626463.localdomain sudo[45832]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-azfedotcemgtsntdyodbttrjxdbcrafu ; /usr/bin/python3
Feb 23 07:49:01 np0005626463.localdomain sudo[45832]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:49:01 np0005626463.localdomain python3[45834]: ansible-ansible.legacy.command Invoked with _raw_params=chronyc waitsync 30 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 07:49:01 np0005626463.localdomain sudo[45832]: pam_unix(sudo:session): session closed for user root
Feb 23 07:49:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb 23 07:49:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Cumulative writes: 3257 writes, 16K keys, 3257 commit groups, 1.0 writes per commit group, ingest: 0.01 GB, 0.02 MB/s
                                                          Cumulative WAL: 3257 writes, 144 syncs, 22.62 writes per sync, written: 0.01 GB, 0.02 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 3257 writes, 16K keys, 3257 commit groups, 1.0 writes per commit group, ingest: 14.67 MB, 0.02 MB/s
                                                          Interval WAL: 3257 writes, 144 syncs, 22.62 writes per sync, written: 0.01 GB, 0.02 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          
                                                          ** Compaction Stats [default] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      2/0    2.61 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.005       0      0       0.0       0.0
                                                           Sum      2/0    2.61 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.005       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [default] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.2      0.01              0.00         1    0.005       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x557956c5c850#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [default] **
                                                          
                                                          ** Compaction Stats [m-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x557956c5c850#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-0] **
                                                          
                                                          ** Compaction Stats [m-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x557956c5c850#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-1] **
                                                          
                                                          ** Compaction Stats [m-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x557956c5c850#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-2] **
                                                          
                                                          ** Compaction Stats [p-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      1/0    1.57 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Sum      1/0    1.57 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.4      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x557956c5c850#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-0] **
                                                          
                                                          ** Compaction Stats [p-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x557956c5c850#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-1] **
                                                          
                                                          ** Compaction Stats [p-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x557956c5c850#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-2] **
                                                          
                                                          ** Compaction Stats [O-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x557956c5c2d0#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 9e-06 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-0] **
                                                          
                                                          ** Compaction Stats [O-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x557956c5c2d0#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 9e-06 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-1] **
                                                          
                                                          ** Compaction Stats [O-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      1/0    1.26 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0
                                                           Sum      1/0    1.26 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x557956c5c2d0#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 9e-06 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-2] **
                                                          
                                                          ** Compaction Stats [L] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [L] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x557956c5c850#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [L] **
                                                          
                                                          ** Compaction Stats [P] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [P] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x557956c5c850#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [P] **
Feb 23 07:49:01 np0005626463.localdomain sudo[45849]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yzodgqucliurqrkgwzhzjdyujzhgszgb ; /usr/bin/python3
Feb 23 07:49:01 np0005626463.localdomain sudo[45849]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:49:01 np0005626463.localdomain python3[45851]: ansible-ansible.legacy.command Invoked with _raw_params=chronyc makestep _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 07:49:01 np0005626463.localdomain chronyd[25974]: System clock was stepped by 0.000000 seconds
Feb 23 07:49:01 np0005626463.localdomain sudo[45849]: pam_unix(sudo:session): session closed for user root
Feb 23 07:49:01 np0005626463.localdomain sudo[45866]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wbxjrcltyhnmzwqolqegdtjoqihjovqw ; /usr/bin/python3
Feb 23 07:49:01 np0005626463.localdomain sudo[45866]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:49:01 np0005626463.localdomain python3[45868]: ansible-ansible.legacy.command Invoked with _raw_params=chronyc waitsync 30 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 07:49:01 np0005626463.localdomain sudo[45866]: pam_unix(sudo:session): session closed for user root
Feb 23 07:49:02 np0005626463.localdomain sudo[45883]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-paowknvjxasqjnhjwikggryhqjfbwgoy ; /usr/bin/python3
Feb 23 07:49:02 np0005626463.localdomain sudo[45883]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:49:02 np0005626463.localdomain python3[45885]: ansible-timezone Invoked with name=UTC hwclock=None
Feb 23 07:49:02 np0005626463.localdomain systemd[1]: Starting Time & Date Service...
Feb 23 07:49:02 np0005626463.localdomain systemd[1]: Started Time & Date Service.
Feb 23 07:49:02 np0005626463.localdomain sudo[45883]: pam_unix(sudo:session): session closed for user root
Feb 23 07:49:03 np0005626463.localdomain sudo[45903]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ghyrdrfbysdvixaiydrdxcwjbcotlbkq ; /usr/bin/python3
Feb 23 07:49:03 np0005626463.localdomain sudo[45903]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:49:03 np0005626463.localdomain python3[45905]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -q --whatprovides tuned tuned-profiles-cpu-partitioning _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 07:49:03 np0005626463.localdomain sudo[45903]: pam_unix(sudo:session): session closed for user root
Feb 23 07:49:04 np0005626463.localdomain sudo[45920]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vmfrkqongwdnprvdqxjnxszjztobdsul ; PATH=/bin:/usr/bin:/sbin:/usr/sbin /usr/bin/python3
Feb 23 07:49:04 np0005626463.localdomain sudo[45920]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:49:04 np0005626463.localdomain python3[45922]: ansible-ansible.legacy.command Invoked with _raw_params=which tuned-adm _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 07:49:04 np0005626463.localdomain sudo[45920]: pam_unix(sudo:session): session closed for user root
Feb 23 07:49:04 np0005626463.localdomain sudo[45937]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mcsjgkkhmxjyosqatxglzendwudceisz ; /usr/bin/python3
Feb 23 07:49:04 np0005626463.localdomain sudo[45937]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:49:04 np0005626463.localdomain python3[45939]: ansible-slurp Invoked with src=/etc/tuned/active_profile
Feb 23 07:49:04 np0005626463.localdomain sudo[45937]: pam_unix(sudo:session): session closed for user root
Feb 23 07:49:04 np0005626463.localdomain sudo[45953]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fegwozuzplrtplcuytyyrlqcoxthxrky ; /usr/bin/python3
Feb 23 07:49:04 np0005626463.localdomain sudo[45953]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:49:05 np0005626463.localdomain python3[45955]: ansible-stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 23 07:49:05 np0005626463.localdomain sudo[45953]: pam_unix(sudo:session): session closed for user root
Feb 23 07:49:05 np0005626463.localdomain sudo[45969]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jzvzkyedcdidsoebgksevuzoyzdbzxig ; /usr/bin/python3
Feb 23 07:49:05 np0005626463.localdomain sudo[45969]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:49:05 np0005626463.localdomain python3[45971]: ansible-file Invoked with mode=0750 path=/var/log/containers/openvswitch setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 23 07:49:05 np0005626463.localdomain sudo[45969]: pam_unix(sudo:session): session closed for user root
Feb 23 07:49:05 np0005626463.localdomain sudo[45985]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-twihkwusaojyfgpuxzwrjpfkldhhyibr ; /usr/bin/python3
Feb 23 07:49:05 np0005626463.localdomain sudo[45985]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:49:06 np0005626463.localdomain python3[45987]: ansible-file Invoked with path=/var/lib/openvswitch/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 23 07:49:06 np0005626463.localdomain sudo[45985]: pam_unix(sudo:session): session closed for user root
Feb 23 07:49:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb 23 07:49:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Cumulative writes: 3385 writes, 16K keys, 3385 commit groups, 1.0 writes per commit group, ingest: 0.01 GB, 0.03 MB/s
                                                          Cumulative WAL: 3385 writes, 196 syncs, 17.27 writes per sync, written: 0.01 GB, 0.03 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 3385 writes, 16K keys, 3385 commit groups, 1.0 writes per commit group, ingest: 15.28 MB, 0.03 MB/s
                                                          Interval WAL: 3385 writes, 196 syncs, 17.27 writes per sync, written: 0.01 GB, 0.03 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          
                                                          ** Compaction Stats [default] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      2/0    2.61 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.007       0      0       0.0       0.0
                                                           Sum      2/0    2.61 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.007       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [default] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.2      0.01              0.00         1    0.007       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x564b561042d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.4e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [default] **
                                                          
                                                          ** Compaction Stats [m-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x564b561042d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.4e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-0] **
                                                          
                                                          ** Compaction Stats [m-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x564b561042d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.4e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-1] **
                                                          
                                                          ** Compaction Stats [m-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x564b561042d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.4e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-2] **
                                                          
                                                          ** Compaction Stats [p-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      1/0    1.57 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.01              0.00         1    0.005       0      0       0.0       0.0
                                                           Sum      1/0    1.57 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.01              0.00         1    0.005       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.3      0.01              0.00         1    0.005       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x564b561042d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.4e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-0] **
                                                          
                                                          ** Compaction Stats [p-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x564b561042d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.4e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-1] **
                                                          
                                                          ** Compaction Stats [p-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x564b561042d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.4e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-2] **
                                                          
                                                          ** Compaction Stats [O-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x564b56105610#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 8e-06 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-0] **
                                                          
                                                          ** Compaction Stats [O-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x564b56105610#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 8e-06 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-1] **
                                                          
                                                          ** Compaction Stats [O-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      1/0    1.26 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0
                                                           Sum      1/0    1.26 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x564b56105610#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 8e-06 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-2] **
                                                          
                                                          ** Compaction Stats [L] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [L] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x564b561042d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.4e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [L] **
                                                          
                                                          ** Compaction Stats [P] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [P] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x564b561042d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.4e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [P] **
Feb 23 07:49:06 np0005626463.localdomain sudo[46033]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jibdvejyccqrwhgmvzbupuwlcqhektbe ; /usr/bin/python3
Feb 23 07:49:06 np0005626463.localdomain sudo[46033]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:49:06 np0005626463.localdomain python3[46035]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/neutron-cleanup follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 07:49:06 np0005626463.localdomain sudo[46033]: pam_unix(sudo:session): session closed for user root
Feb 23 07:49:06 np0005626463.localdomain sudo[46076]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ikbahthellfiebasqwqghlymrhyuixjb ; /usr/bin/python3
Feb 23 07:49:06 np0005626463.localdomain sudo[46076]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:49:06 np0005626463.localdomain python3[46078]: ansible-ansible.legacy.copy Invoked with dest=/usr/libexec/neutron-cleanup force=True mode=0755 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771832946.1500823-79369-133495529295441/source _original_basename=tmpthq512t7 follow=False checksum=f9cc7d1e91fbae49caa7e35eb2253bba146a73b4 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 07:49:06 np0005626463.localdomain sudo[46076]: pam_unix(sudo:session): session closed for user root
Feb 23 07:49:07 np0005626463.localdomain sudo[46138]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xwpykthmxkayptdnnttwnetmkbgkgodb ; /usr/bin/python3
Feb 23 07:49:07 np0005626463.localdomain sudo[46138]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:49:07 np0005626463.localdomain python3[46140]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/neutron-cleanup.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 07:49:07 np0005626463.localdomain sudo[46138]: pam_unix(sudo:session): session closed for user root
Feb 23 07:49:07 np0005626463.localdomain sudo[46181]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mougizlpymbkqwxafdhhpmrtyhctxuao ; /usr/bin/python3
Feb 23 07:49:07 np0005626463.localdomain sudo[46181]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:49:07 np0005626463.localdomain python3[46183]: ansible-ansible.legacy.copy Invoked with dest=/usr/lib/systemd/system/neutron-cleanup.service force=True src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771832947.008525-79419-69708615694195/source _original_basename=tmp18s25o1d follow=False checksum=6b6cd9f074903a28d054eb530a10c7235d0c39fc backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 07:49:07 np0005626463.localdomain sudo[46181]: pam_unix(sudo:session): session closed for user root
Feb 23 07:49:08 np0005626463.localdomain sudo[46211]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hejlzygwdbtzepjhntyigxubueqvtijl ; /usr/bin/python3
Feb 23 07:49:08 np0005626463.localdomain sudo[46211]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:49:08 np0005626463.localdomain python3[46213]: ansible-ansible.legacy.systemd Invoked with enabled=True name=neutron-cleanup daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Feb 23 07:49:08 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 07:49:08 np0005626463.localdomain systemd-sysv-generator[46242]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 07:49:08 np0005626463.localdomain systemd-rc-local-generator[46238]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 07:49:08 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 07:49:08 np0005626463.localdomain sudo[46211]: pam_unix(sudo:session): session closed for user root
Feb 23 07:49:08 np0005626463.localdomain sudo[46265]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cucwlcoizujuubcevlzypmwlqvjkoupm ; /usr/bin/python3
Feb 23 07:49:08 np0005626463.localdomain sudo[46265]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:49:09 np0005626463.localdomain python3[46267]: ansible-file Invoked with mode=0750 path=/var/log/containers/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 23 07:49:09 np0005626463.localdomain sudo[46265]: pam_unix(sudo:session): session closed for user root
Feb 23 07:49:09 np0005626463.localdomain sudo[46281]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-paffaiqbxqfztwebbbmtonayahsgjijq ; /usr/bin/python3
Feb 23 07:49:09 np0005626463.localdomain sudo[46281]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:49:09 np0005626463.localdomain python3[46283]: ansible-ansible.legacy.command Invoked with _raw_params=ip netns add ns_temp _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 07:49:09 np0005626463.localdomain sudo[46281]: pam_unix(sudo:session): session closed for user root
Feb 23 07:49:09 np0005626463.localdomain sudo[46298]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fxgxrrkmcqzdhnncvhjgmrcnzrextmkh ; /usr/bin/python3
Feb 23 07:49:09 np0005626463.localdomain sudo[46298]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:49:09 np0005626463.localdomain python3[46300]: ansible-ansible.legacy.command Invoked with _raw_params=ip netns delete ns_temp _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 07:49:09 np0005626463.localdomain systemd[1]: run-netns-ns_temp.mount: Deactivated successfully.
Feb 23 07:49:09 np0005626463.localdomain sudo[46298]: pam_unix(sudo:session): session closed for user root
Feb 23 07:49:09 np0005626463.localdomain sudo[46315]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qcwdonqwovoarfetoajjffdocevztjrp ; /usr/bin/python3
Feb 23 07:49:09 np0005626463.localdomain sudo[46315]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:49:10 np0005626463.localdomain python3[46317]: ansible-file Invoked with path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 23 07:49:10 np0005626463.localdomain sudo[46315]: pam_unix(sudo:session): session closed for user root
Feb 23 07:49:10 np0005626463.localdomain sudo[46331]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dgoblaedgwzvbhjgmsuvmysssncosuyy ; /usr/bin/python3
Feb 23 07:49:10 np0005626463.localdomain sudo[46331]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:49:10 np0005626463.localdomain python3[46333]: ansible-file Invoked with path=/var/lib/neutron/kill_scripts state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 07:49:10 np0005626463.localdomain sudo[46331]: pam_unix(sudo:session): session closed for user root
Feb 23 07:49:10 np0005626463.localdomain sudo[46379]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ytcqeopfkabrdnrkhydgewyhycnrnsfn ; /usr/bin/python3
Feb 23 07:49:10 np0005626463.localdomain sudo[46379]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:49:10 np0005626463.localdomain python3[46381]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/haproxy-kill follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 07:49:10 np0005626463.localdomain sudo[46379]: pam_unix(sudo:session): session closed for user root
Feb 23 07:49:11 np0005626463.localdomain sudo[46422]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ndmcmwvdiwjowyxneuonyjeubabdefxp ; /usr/bin/python3
Feb 23 07:49:11 np0005626463.localdomain sudo[46422]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:49:11 np0005626463.localdomain python3[46424]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/kill_scripts/haproxy-kill mode=493 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771832950.6138618-79644-257178056254103/source _original_basename=tmpdxa2_iy_ follow=False checksum=2f369fbe8f83639cdfd4efc53e7feb4ee77d1ed7 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 07:49:11 np0005626463.localdomain sudo[46422]: pam_unix(sudo:session): session closed for user root
Feb 23 07:49:23 np0005626463.localdomain sshd[46439]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 07:49:24 np0005626463.localdomain sshd[46439]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 07:49:32 np0005626463.localdomain systemd[1]: systemd-timedated.service: Deactivated successfully.
Feb 23 07:49:33 np0005626463.localdomain sudo[46456]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qecsioczyearfslpzdgeetixthspqlgc ; /usr/bin/python3
Feb 23 07:49:33 np0005626463.localdomain sudo[46456]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:49:33 np0005626463.localdomain python3[46458]: ansible-file Invoked with path=/var/log/containers state=directory setype=container_file_t selevel=s0 mode=488 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Feb 23 07:49:33 np0005626463.localdomain sudo[46456]: pam_unix(sudo:session): session closed for user root
Feb 23 07:49:34 np0005626463.localdomain sudo[46472]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-odnhfumydevgbmqlvnqvoeabwffebzdy ; /usr/bin/python3
Feb 23 07:49:34 np0005626463.localdomain sudo[46472]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:49:34 np0005626463.localdomain python3[46474]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory selevel=s0 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None setype=None attributes=None
Feb 23 07:49:34 np0005626463.localdomain sudo[46472]: pam_unix(sudo:session): session closed for user root
Feb 23 07:49:34 np0005626463.localdomain sudo[46488]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zpstsknbzzwymuxpednwxdokeouzlvmr ; /usr/bin/python3
Feb 23 07:49:34 np0005626463.localdomain sudo[46488]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:49:34 np0005626463.localdomain python3[46490]: ansible-file Invoked with path=/var/lib/tripleo-config state=directory setype=container_file_t selevel=s0 recurse=True force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None
Feb 23 07:49:34 np0005626463.localdomain sudo[46488]: pam_unix(sudo:session): session closed for user root
Feb 23 07:49:34 np0005626463.localdomain sudo[46504]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kplsjvogmqbowcmlnxccfcrkghfatmow ; /usr/bin/python3
Feb 23 07:49:34 np0005626463.localdomain sudo[46504]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:49:34 np0005626463.localdomain python3[46506]: ansible-file Invoked with path=/var/lib/container-startup-configs.json state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 07:49:34 np0005626463.localdomain sudo[46504]: pam_unix(sudo:session): session closed for user root
Feb 23 07:49:35 np0005626463.localdomain sudo[46520]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nrhovnoxrnhrhfyqtmuohexyvxmaelyq ; /usr/bin/python3
Feb 23 07:49:35 np0005626463.localdomain sudo[46520]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:49:35 np0005626463.localdomain python3[46522]: ansible-file Invoked with path=/var/lib/docker-container-startup-configs.json state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 07:49:35 np0005626463.localdomain sudo[46520]: pam_unix(sudo:session): session closed for user root
Feb 23 07:49:35 np0005626463.localdomain sudo[46536]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-axsyclesewkjvkjtgeinzvmtoheclldh ; /usr/bin/python3
Feb 23 07:49:35 np0005626463.localdomain sudo[46536]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:49:35 np0005626463.localdomain python3[46538]: ansible-community.general.sefcontext Invoked with target=/var/lib/container-config-scripts(/.*)? setype=container_file_t state=present ignore_selinux_state=False ftype=a reload=True seuser=None selevel=None
Feb 23 07:49:36 np0005626463.localdomain kernel: SELinux:  Converting 2707 SID table entries...
Feb 23 07:49:36 np0005626463.localdomain kernel: SELinux:  policy capability network_peer_controls=1
Feb 23 07:49:36 np0005626463.localdomain kernel: SELinux:  policy capability open_perms=1
Feb 23 07:49:36 np0005626463.localdomain kernel: SELinux:  policy capability extended_socket_class=1
Feb 23 07:49:36 np0005626463.localdomain kernel: SELinux:  policy capability always_check_network=0
Feb 23 07:49:36 np0005626463.localdomain kernel: SELinux:  policy capability cgroup_seclabel=1
Feb 23 07:49:36 np0005626463.localdomain kernel: SELinux:  policy capability nnp_nosuid_transition=1
Feb 23 07:49:36 np0005626463.localdomain kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Feb 23 07:49:36 np0005626463.localdomain sudo[46536]: pam_unix(sudo:session): session closed for user root
Feb 23 07:49:36 np0005626463.localdomain sudo[46557]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rkzauivaasstxrmqwgurvsivfqfwecuo ; /usr/bin/python3
Feb 23 07:49:36 np0005626463.localdomain dbus-broker-launch[754]: avc:  op=load_policy lsm=selinux seqno=13 res=1
Feb 23 07:49:36 np0005626463.localdomain sudo[46557]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:49:36 np0005626463.localdomain python3[46559]: ansible-file Invoked with path=/var/lib/container-config-scripts state=directory setype=container_file_t recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 23 07:49:36 np0005626463.localdomain sudo[46557]: pam_unix(sudo:session): session closed for user root
Feb 23 07:49:37 np0005626463.localdomain sudo[46573]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-omfisrwcgsdkehuwildhugdpervluzot ; /usr/bin/python3
Feb 23 07:49:37 np0005626463.localdomain sudo[46573]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:49:37 np0005626463.localdomain sudo[46573]: pam_unix(sudo:session): session closed for user root
Feb 23 07:49:37 np0005626463.localdomain sudo[46621]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xvrnwkiqacwrklhhyyinbshvbvvezwyo ; /usr/bin/python3
Feb 23 07:49:37 np0005626463.localdomain sudo[46621]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:49:37 np0005626463.localdomain sudo[46621]: pam_unix(sudo:session): session closed for user root
Feb 23 07:49:38 np0005626463.localdomain sudo[46664]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hlfmmteogyshdwrdofuavaqspgkgepsv ; /usr/bin/python3
Feb 23 07:49:38 np0005626463.localdomain sudo[46664]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:49:38 np0005626463.localdomain sudo[46664]: pam_unix(sudo:session): session closed for user root
Feb 23 07:49:38 np0005626463.localdomain sudo[46694]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-garomvftmercfyvkilpefdrbtewqksff ; /usr/bin/python3
Feb 23 07:49:38 np0005626463.localdomain sudo[46694]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:49:39 np0005626463.localdomain python3[46696]: ansible-container_startup_config Invoked with config_base_dir=/var/lib/tripleo-config/container-startup-config config_data={'step_1': {'metrics_qdr': {'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, 'metrics_qdr_init_logs': {'command': ['/bin/bash', '-c', 'chown -R qdrouterd:qdrouterd /var/log/qdrouterd'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'none', 'privileged': False, 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}}, 'step_2': {'create_haproxy_wrapper': {'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z']}, 'create_virtlogd_wrapper': {'cgroupns': 'host', 'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::nova::virtlogd_wrapper'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1771832380'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z']}, 'nova_compute_init_log': {'command': ['/bin/bash', '-c', 'chown -R nova:nova /var/log/nova'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1771832380'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 
'privileged': False, 'user': 'root', 'volumes': ['/var/log/containers/nova:/var/log/nova:z']}, 'nova_virtqemud_init_logs': {'command': ['/bin/bash', '-c', 'chown -R tss:tss /var/log/swtpm'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1771832380'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'none', 'privileged': True, 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'user': 'root', 'volumes': ['/var/log/containers/libvirt/swtpm:/var/log/swtpm:shared,z']}}, 'step_3': {'ceilometer_init_log': {'command': ['/bin/bash', '-c', 'chown -R ceilometer:ceilometer /var/log/ceilometer'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'none', 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 'collectd': {'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', 
'/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, 'iscsid': {'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, 'nova_statedir_owner': {'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': 'triliovault-mounts', 'TRIPLEO_DEPLOY_IDENTIFIER': '1771832380', '__OS_DEBUG': 'true'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/container-config-scripts:/container-config-scripts:z']}, 'nova_virtlogd_wrapper': {'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 
'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']}, 'nova_virtnodedevd': {'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 
2, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro']}, 'nova_virtproxyd': {'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 5, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro']}, 'nova_virtqemud': {'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', '/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']}, 'nova_virtsecretd': {'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 1, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', 
'/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro']}, 'nova_virtstoraged': {'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 3, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', 
'/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro']}, 'rsyslog': {'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}}, 'step_4': {'ceilometer_agent_compute': {'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 'ceilometer_agent_ipmi': {'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 'configure_cms_options': {'command': ['/bin/bash', '-c', 'CMS_OPTS=$(hiera ovn::controller::ovn_cms_options -c /etc/puppet/hiera.yaml);  if [ X"$CMS_OPTS" !=  X ]; then ovs-vsctl set open . external_ids:ovn-cms-options=$CMS_OPTS;else ovs-vsctl remove open . 
external_ids ovn-cms-options; fi'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1771832380'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, 'logrotate_crond': {'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, 'nova_libvirt_init_secret': {'cgroupns': 'host', 'command': '/nova_libvirt_init_secret.sh ceph:openstack', 'detach': False, 'environment': {'LIBVIRT_DEFAULT_URI': 'qemu:///system'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova:/etc/nova', '/etc/libvirt:/etc/libvirt', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/lib/container-config-scripts/nova_libvirt_init_secret.sh:/nova_libvirt_init_secret.sh:ro', '/var/lib/tripleo-config/ceph:/etc/ceph:ro']}, 'nova_migration_target': {'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, 'ovn_controller': 
{'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, 'ovn_metadata_agent': {'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, 'setup_ovs_manager': {'command': 
['/container_puppet_apply.sh', '4', 'exec', 'include tripleo::profile::base::neutron::ovn_metadata'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1771832380'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}}, 'step_5': {'nova_compute': {'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, 'nova_wait_for_compute_service': {'detach': False, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', '__OS_DEBUG': 'true'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'start_order': 4, 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/log/containers/nova:/var/log/nova', '/var/lib/container-config-scripts:/container-config-scripts']}}}
Feb 23 07:49:39 np0005626463.localdomain sudo[46694]: pam_unix(sudo:session): session closed for user root
Feb 23 07:49:39 np0005626463.localdomain rsyslogd[758]: message too long (31243) with configured size 8096, begin of message is: ansible-container_startup_config Invoked with config_base_dir=/var/lib/tripleo-c [v8.2102.0-111.el9 try https://www.rsyslog.com/e/2445 ]
Feb 23 07:49:39 np0005626463.localdomain sudo[46710]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-koosxchwpmedpepllsjujrrboxxnsnvw ; /usr/bin/python3
Feb 23 07:49:39 np0005626463.localdomain sudo[46710]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:49:39 np0005626463.localdomain python3[46712]: ansible-file Invoked with path=/var/lib/kolla/config_files state=directory setype=container_file_t selevel=s0 recurse=True force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None
Feb 23 07:49:39 np0005626463.localdomain sudo[46710]: pam_unix(sudo:session): session closed for user root
Feb 23 07:49:39 np0005626463.localdomain sudo[46726]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fgygocbksctuszrzirfqwyvajovktozw ; /usr/bin/python3
Feb 23 07:49:39 np0005626463.localdomain sudo[46726]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:49:40 np0005626463.localdomain python3[46728]: ansible-file Invoked with path=/var/lib/config-data mode=493 state=directory setype=container_file_t selevel=s0 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Feb 23 07:49:40 np0005626463.localdomain sudo[46726]: pam_unix(sudo:session): session closed for user root
Feb 23 07:49:40 np0005626463.localdomain sudo[46742]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bpounjnphfcroqwtehtrpvdqylirvnij ; /usr/bin/python3
Feb 23 07:49:40 np0005626463.localdomain sudo[46742]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:49:40 np0005626463.localdomain python3[46744]: ansible-tripleo_container_configs Invoked with config_data={'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json': {'command': '/usr/bin/ceilometer-polling --polling-namespaces ipmi --logfile /var/log/ceilometer/ipmi.log', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}]}, '/var/lib/kolla/config_files/ceilometer_agent_compute.json': {'command': '/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /var/log/ceilometer/compute.log', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}]}, '/var/lib/kolla/config_files/collectd.json': {'command': '/usr/sbin/collectd -f', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/', 'merge': False, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/etc/collectd.d'}], 'permissions': [{'owner': 'collectd:collectd', 'path': '/var/log/collectd', 'recurse': True}, {'owner': 'collectd:collectd', 'path': '/scripts', 'recurse': True}, {'owner': 'collectd:collectd', 'path': '/config-scripts', 'recurse': True}]}, '/var/lib/kolla/config_files/iscsid.json': {'command': '/usr/sbin/iscsid -f', 'config_files': [{'dest': '/etc/iscsi/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-iscsid/'}]}, '/var/lib/kolla/config_files/logrotate-crond.json': {'command': '/usr/sbin/crond -s -n', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}]}, '/var/lib/kolla/config_files/metrics_qdr.json': {'command': '/usr/sbin/qdrouterd -c /etc/qpid-dispatch/qdrouterd.conf', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/', 'merge': True, 'optional': 
True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-tls/*'}], 'permissions': [{'owner': 'qdrouterd:qdrouterd', 'path': '/var/lib/qdrouterd', 'recurse': True}, {'optional': True, 'owner': 'qdrouterd:qdrouterd', 'path': '/etc/pki/tls/certs/metrics_qdr.crt'}, {'optional': True, 'owner': 'qdrouterd:qdrouterd', 'path': '/etc/pki/tls/private/metrics_qdr.key'}]}, '/var/lib/kolla/config_files/nova-migration-target.json': {'command': 'dumb-init --single-child -- /usr/sbin/sshd -D -p 2022', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/ssh/', 'owner': 'root', 'perm': '0600', 'source': '/host-ssh/ssh_host_*_key'}]}, '/var/lib/kolla/config_files/nova_compute.json': {'command': '/var/lib/nova/delay-nova-compute --delay 180 --nova-binary /usr/bin/nova-compute ', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/iscsi/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-iscsid/*'}, {'dest': '/etc/ceph/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-ceph/'}], 'permissions': [{'owner': 'nova:nova', 'path': '/var/log/nova', 'recurse': True}, {'owner': 'nova:nova', 'path': '/etc/ceph/ceph.client.openstack.keyring', 'perm': '0600'}]}, '/var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json': {'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_wait_for_compute_service.py', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}], 'permissions': [{'owner': 'nova:nova', 'path': '/var/log/nova', 'recurse': True}]}, '/var/lib/kolla/config_files/nova_virtlogd.json': {'command': '/usr/local/bin/virtlogd_wrapper', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': 
'/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/ceph/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-ceph/'}], 'permissions': [{'owner': 'nova:nova', 'path': '/etc/ceph/ceph.client.openstack.keyring', 'perm': '0600'}]}, '/var/lib/kolla/config_files/nova_virtnodedevd.json': {'command': '/usr/sbin/virtnodedevd --config /etc/libvirt/virtnodedevd.conf', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/ceph/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-ceph/'}], 'permissions': [{'owner': 'nova:nova', 'path': '/etc/ceph/ceph.client.openstack.keyring', 'perm': '0600'}]}, '/var/lib/kolla/config_files/nova_virtproxyd.json': {'command': '/usr/sbin/virtproxyd --config /etc/libvirt/virtproxyd.conf', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/ceph/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-ceph/'}], 'permissions': [{'owner': 'nova:nova', 'path': '/etc/ceph/ceph.client.openstack.keyring', 'perm': '0600'}]}, '/var/lib/kolla/config_files/nova_virtqemud.json': {'command': '/usr/sbin/virtqemud --config /etc/libvirt/virtqemud.conf', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/ceph/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-ceph/'}], 'permissions': [{'owner': 'nova:nova', 'path': '/etc/ceph/ceph.client.openstack.keyring', 'perm': '0600'}]}, '/var/lib/kolla/config_files/nova_virtsecretd.json': {'command': '/usr/sbin/virtsecretd --config /etc/libvirt/virtsecretd.conf', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/ceph/', 'merge': True, 
'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-ceph/'}], 'permissions': [{'owner': 'nova:nova', 'path': '/etc/ceph/ceph.client.openstack.keyring', 'perm': '0600'}]}, '/var/lib/kolla/config_files/nova_virtstoraged.json': {'command': '/usr/sbin/virtstoraged --config /etc/libvirt/virtstoraged.conf', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/ceph/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-ceph/'}], 'permissions': [{'owner': 'nova:nova', 'path': '/etc/ceph/ceph.client.openstack.keyring', 'perm': '0600'}]}, '/var/lib/kolla/config_files/ovn_controller.json': {'command': '/usr/bin/ovn-controller --pidfile --log-file unix:/run/openvswitch/db.sock ', 'permissions': [{'owner': 'root:root', 'path': '/var/log/openvswitch', 'recurse': True}, {'owner': 'root:root', 'path': '/var/log/ovn', 'recurse': True}]}, '/var/lib/kolla/config_files/ovn_metadata_agent.json': {'command': '/usr/bin/networking-ovn-metadata-agent --config-file /etc/neutron/neutron.conf --config-file /etc/neutron/plugins/networking-ovn/networking-ovn-metadata-agent.ini --log-file=/var/log/neutron/ovn-metadata-agent.log', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}], 'permissions': [{'owner': 'neutron:neutron', 'path': '/var/log/neutron', 'recurse': True}, {'owner': 'neutron:neutron', 'path': '/var/lib/neutron', 'recurse': True}, {'optional': True, 'owner': 'neutron:neutron', 'path': '/etc/pki/tls/certs/ovn_metadata.crt', 'perm': '0644'}, {'optional': True, 'owner': 'neutron:neutron', 'path': '/etc/pki/tls/private/ovn_metadata.key', 'perm': '0644'}]}, '/var/lib/kolla/config_files/rsyslog.json': {'command': '/usr/sbin/rsyslogd -n -iNONE', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}], 'permissions': 
[{'owner': 'root:root', 'path': '/var/lib/rsyslog', 'recurse': True}, {'owner': 'root:root', 'path': '/var/log/rsyslog', 'recurse': True}]}}
Feb 23 07:49:40 np0005626463.localdomain sudo[46742]: pam_unix(sudo:session): session closed for user root
Feb 23 07:49:44 np0005626463.localdomain sudo[46745]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 07:49:44 np0005626463.localdomain sudo[46745]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 07:49:44 np0005626463.localdomain sudo[46745]: pam_unix(sudo:session): session closed for user root
Feb 23 07:49:44 np0005626463.localdomain sudo[46760]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 23 07:49:44 np0005626463.localdomain sudo[46760]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 07:49:44 np0005626463.localdomain sudo[46760]: pam_unix(sudo:session): session closed for user root
Feb 23 07:49:45 np0005626463.localdomain sudo[46852]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hffohhpgdxnorsxcjhxpzalbvmhuyfvy ; /usr/bin/python3
Feb 23 07:49:45 np0005626463.localdomain sudo[46852]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:49:45 np0005626463.localdomain python3[46854]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/config_step.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 07:49:45 np0005626463.localdomain sudo[46852]: pam_unix(sudo:session): session closed for user root
Feb 23 07:49:45 np0005626463.localdomain sudo[46895]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fheqvyfjeecmbouaosgtbhewoohthgje ; /usr/bin/python3
Feb 23 07:49:45 np0005626463.localdomain sudo[46895]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:49:45 np0005626463.localdomain python3[46897]: ansible-ansible.legacy.copy Invoked with dest=/etc/puppet/hieradata/config_step.json force=True mode=0600 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771832985.2449641-81140-78626805301400/source _original_basename=tmp23rxc6sd follow=False checksum=dfdcc7695edd230e7a2c06fc7b739bfa56506d8f backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 07:49:45 np0005626463.localdomain sudo[46895]: pam_unix(sudo:session): session closed for user root
Feb 23 07:49:46 np0005626463.localdomain sudo[46925]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dguavjhnumcxwphvhorswngxyjygkszn ; /usr/bin/python3
Feb 23 07:49:46 np0005626463.localdomain sudo[46925]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:49:46 np0005626463.localdomain python3[46927]: ansible-stat Invoked with path=/var/lib/tripleo-config/container-startup-config/step_1 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 23 07:49:46 np0005626463.localdomain sudo[46925]: pam_unix(sudo:session): session closed for user root
Feb 23 07:49:46 np0005626463.localdomain sudo[46930]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 07:49:46 np0005626463.localdomain sudo[46930]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 07:49:46 np0005626463.localdomain sudo[46930]: pam_unix(sudo:session): session closed for user root
Feb 23 07:49:47 np0005626463.localdomain sudo[46990]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fnanlljwtzsaqqlqnapxrieqzkbbpmgy ; /usr/bin/python3
Feb 23 07:49:47 np0005626463.localdomain sudo[46990]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:49:47 np0005626463.localdomain sudo[46990]: pam_unix(sudo:session): session closed for user root
Feb 23 07:49:47 np0005626463.localdomain sudo[47033]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nrycdxkmawsesxgtpptkpiiycuwqcejs ; /usr/bin/python3
Feb 23 07:49:47 np0005626463.localdomain sudo[47033]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:49:47 np0005626463.localdomain systemd[35778]: Created slice User Background Tasks Slice.
Feb 23 07:49:47 np0005626463.localdomain systemd[35778]: Starting Cleanup of User's Temporary Files and Directories...
Feb 23 07:49:47 np0005626463.localdomain systemd[35778]: Finished Cleanup of User's Temporary Files and Directories.
Feb 23 07:49:47 np0005626463.localdomain sudo[47033]: pam_unix(sudo:session): session closed for user root
Feb 23 07:49:48 np0005626463.localdomain sudo[47064]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hnkyxuftoigdwqwfxelxxyvohnvtijep ; /usr/bin/python3
Feb 23 07:49:48 np0005626463.localdomain sudo[47064]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:49:48 np0005626463.localdomain python3[47066]: ansible-file Invoked with path=/var/lib/container-puppet state=directory setype=container_file_t selevel=s0 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None
Feb 23 07:49:48 np0005626463.localdomain sudo[47064]: pam_unix(sudo:session): session closed for user root
Feb 23 07:49:49 np0005626463.localdomain sudo[47112]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uthxlvbsswwjdlwxrbfzbvxmtnlmeenx ; /usr/bin/python3
Feb 23 07:49:49 np0005626463.localdomain sudo[47112]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:49:49 np0005626463.localdomain sshd[47115]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 07:49:49 np0005626463.localdomain sudo[47112]: pam_unix(sudo:session): session closed for user root
Feb 23 07:49:49 np0005626463.localdomain sudo[47156]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xshrasdbbokptzbmbzxjlgabjmbhtbbo ; /usr/bin/python3
Feb 23 07:49:49 np0005626463.localdomain sudo[47156]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:49:49 np0005626463.localdomain sudo[47156]: pam_unix(sudo:session): session closed for user root
Feb 23 07:49:50 np0005626463.localdomain sshd[47115]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 07:49:50 np0005626463.localdomain sudo[47187]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zvpeimweaobslbekfgznijpmxzsprqcl ; /usr/bin/python3
Feb 23 07:49:50 np0005626463.localdomain sudo[47187]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:49:50 np0005626463.localdomain python3[47189]: ansible-container_puppet_config Invoked with update_config_hash_only=True no_archive=True check_mode=False config_vol_prefix=/var/lib/config-data debug=False net_host=True puppet_config= short_hostname= step=6
Feb 23 07:49:50 np0005626463.localdomain sudo[47187]: pam_unix(sudo:session): session closed for user root
Feb 23 07:49:52 np0005626463.localdomain sudo[47203]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ciwfosqxmjiklcpulqkwbspaeuiahmrb ; /usr/bin/python3
Feb 23 07:49:52 np0005626463.localdomain sudo[47203]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:49:52 np0005626463.localdomain python3[47205]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -q lvm2 _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 07:49:52 np0005626463.localdomain sudo[47203]: pam_unix(sudo:session): session closed for user root
Feb 23 07:49:53 np0005626463.localdomain sudo[47220]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yiugeipecpffvyavjutbfxlgsopqmpku ; /usr/bin/python3
Feb 23 07:49:53 np0005626463.localdomain sudo[47220]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:49:53 np0005626463.localdomain python3[47222]: ansible-ansible.legacy.dnf Invoked with name=['systemd-container'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Feb 23 07:50:01 np0005626463.localdomain anacron[19051]: Job `cron.weekly' started
Feb 23 07:50:01 np0005626463.localdomain anacron[19051]: Job `cron.weekly' terminated
Feb 23 07:50:02 np0005626463.localdomain dbus-broker-launch[18433]: Noticed file-system modification, trigger reload.
Feb 23 07:50:02 np0005626463.localdomain dbus-broker-launch[750]: Noticed file-system modification, trigger reload.
Feb 23 07:50:02 np0005626463.localdomain dbus-broker-launch[18433]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +31: Eavesdropping is deprecated and ignored
Feb 23 07:50:02 np0005626463.localdomain dbus-broker-launch[18433]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +33: Eavesdropping is deprecated and ignored
Feb 23 07:50:02 np0005626463.localdomain dbus-broker-launch[750]: Noticed file-system modification, trigger reload.
Feb 23 07:50:02 np0005626463.localdomain systemd[1]: Reexecuting.
Feb 23 07:50:02 np0005626463.localdomain systemd[1]: systemd 252-14.el9_2.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Feb 23 07:50:02 np0005626463.localdomain systemd[1]: Detected virtualization kvm.
Feb 23 07:50:02 np0005626463.localdomain systemd[1]: Detected architecture x86-64.
Feb 23 07:50:02 np0005626463.localdomain systemd-rc-local-generator[47277]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 07:50:02 np0005626463.localdomain systemd-sysv-generator[47281]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 07:50:02 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 07:50:08 np0005626463.localdomain sshd[47299]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 07:50:08 np0005626463.localdomain sshd[47299]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 07:50:10 np0005626463.localdomain kernel: SELinux:  Converting 2707 SID table entries...
Feb 23 07:50:10 np0005626463.localdomain kernel: SELinux:  policy capability network_peer_controls=1
Feb 23 07:50:10 np0005626463.localdomain kernel: SELinux:  policy capability open_perms=1
Feb 23 07:50:10 np0005626463.localdomain kernel: SELinux:  policy capability extended_socket_class=1
Feb 23 07:50:10 np0005626463.localdomain kernel: SELinux:  policy capability always_check_network=0
Feb 23 07:50:10 np0005626463.localdomain kernel: SELinux:  policy capability cgroup_seclabel=1
Feb 23 07:50:10 np0005626463.localdomain kernel: SELinux:  policy capability nnp_nosuid_transition=1
Feb 23 07:50:10 np0005626463.localdomain kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Feb 23 07:50:11 np0005626463.localdomain dbus-broker-launch[750]: Noticed file-system modification, trigger reload.
Feb 23 07:50:11 np0005626463.localdomain dbus-broker-launch[754]: avc:  op=load_policy lsm=selinux seqno=14 res=1
Feb 23 07:50:11 np0005626463.localdomain dbus-broker-launch[750]: Noticed file-system modification, trigger reload.
Feb 23 07:50:12 np0005626463.localdomain systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 23 07:50:12 np0005626463.localdomain systemd[1]: Starting man-db-cache-update.service...
Feb 23 07:50:12 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 07:50:12 np0005626463.localdomain systemd-rc-local-generator[47369]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 07:50:12 np0005626463.localdomain systemd-sysv-generator[47375]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 07:50:12 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 07:50:12 np0005626463.localdomain systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 23 07:50:12 np0005626463.localdomain systemd[1]: Queuing reload/restart jobs for marked units…
Feb 23 07:50:12 np0005626463.localdomain systemd-journald[618]: Journal stopped
Feb 23 07:50:12 np0005626463.localdomain systemd-journald[618]: Received SIGTERM from PID 1 (systemd).
Feb 23 07:50:12 np0005626463.localdomain systemd[1]: Stopping Journal Service...
Feb 23 07:50:12 np0005626463.localdomain systemd[1]: Stopping Rule-based Manager for Device Events and Files...
Feb 23 07:50:12 np0005626463.localdomain systemd[1]: systemd-journald.service: Deactivated successfully.
Feb 23 07:50:12 np0005626463.localdomain systemd[1]: Stopped Journal Service.
Feb 23 07:50:12 np0005626463.localdomain systemd[1]: systemd-journald.service: Consumed 1.766s CPU time.
Feb 23 07:50:12 np0005626463.localdomain systemd[1]: Starting Journal Service...
Feb 23 07:50:12 np0005626463.localdomain systemd[1]: systemd-udevd.service: Deactivated successfully.
Feb 23 07:50:12 np0005626463.localdomain systemd[1]: Stopped Rule-based Manager for Device Events and Files.
Feb 23 07:50:12 np0005626463.localdomain systemd[1]: systemd-udevd.service: Consumed 3.126s CPU time.
Feb 23 07:50:12 np0005626463.localdomain systemd[1]: Starting Rule-based Manager for Device Events and Files...
Feb 23 07:50:12 np0005626463.localdomain systemd-journald[47710]: Journal started
Feb 23 07:50:12 np0005626463.localdomain systemd-journald[47710]: Runtime Journal (/run/log/journal/c0212a8b024a111cfc61293864f36c87) is 12.2M, max 314.7M, 302.5M free.
Feb 23 07:50:12 np0005626463.localdomain systemd[1]: Started Journal Service.
Feb 23 07:50:12 np0005626463.localdomain systemd-journald[47710]: Field hash table of /run/log/journal/c0212a8b024a111cfc61293864f36c87/system.journal has a fill level at 75.4 (251 of 333 items), suggesting rotation.
Feb 23 07:50:12 np0005626463.localdomain systemd-journald[47710]: /run/log/journal/c0212a8b024a111cfc61293864f36c87/system.journal: Journal header limits reached or header out-of-date, rotating.
Feb 23 07:50:12 np0005626463.localdomain rsyslogd[758]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Feb 23 07:50:12 np0005626463.localdomain systemd-udevd[47718]: Using default interface naming scheme 'rhel-9.0'.
Feb 23 07:50:12 np0005626463.localdomain systemd[1]: Started Rule-based Manager for Device Events and Files.
Feb 23 07:50:12 np0005626463.localdomain rsyslogd[758]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Feb 23 07:50:12 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 07:50:13 np0005626463.localdomain systemd-rc-local-generator[48322]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 07:50:13 np0005626463.localdomain systemd-sysv-generator[48331]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 07:50:13 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 07:50:13 np0005626463.localdomain systemd[1]: Queuing reload/restart jobs for marked units…
Feb 23 07:50:13 np0005626463.localdomain systemd[1]: man-db-cache-update.service: Deactivated successfully.
Feb 23 07:50:13 np0005626463.localdomain systemd[1]: Finished man-db-cache-update.service.
Feb 23 07:50:13 np0005626463.localdomain systemd[1]: man-db-cache-update.service: Consumed 1.342s CPU time.
Feb 23 07:50:13 np0005626463.localdomain systemd[1]: run-r1c26139e872348a7b6cc176f2f200207.service: Deactivated successfully.
Feb 23 07:50:13 np0005626463.localdomain systemd[1]: run-ra2dd25815b9c4efc8a03f822211f16fd.service: Deactivated successfully.
Feb 23 07:50:14 np0005626463.localdomain sudo[47220]: pam_unix(sudo:session): session closed for user root
Feb 23 07:50:14 np0005626463.localdomain sudo[48716]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-edfueynzqxfshuuzjyduclpewsniakfy ; /usr/bin/python3
Feb 23 07:50:14 np0005626463.localdomain sudo[48716]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:50:15 np0005626463.localdomain python3[48718]: ansible-sysctl Invoked with name=vm.unprivileged_userfaultfd reload=True state=present sysctl_file=/etc/sysctl.d/99-tripleo-postcopy.conf sysctl_set=True value=1 ignoreerrors=False
Feb 23 07:50:15 np0005626463.localdomain sudo[48716]: pam_unix(sudo:session): session closed for user root
Feb 23 07:50:15 np0005626463.localdomain sudo[48735]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zqcxsmmcfryjtxdxpxxfqlbxsieduyuz ; /usr/bin/python3
Feb 23 07:50:15 np0005626463.localdomain sudo[48735]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:50:15 np0005626463.localdomain python3[48737]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-active ksm.service || systemctl is-enabled ksm.service _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 07:50:15 np0005626463.localdomain sudo[48735]: pam_unix(sudo:session): session closed for user root
Feb 23 07:50:16 np0005626463.localdomain sudo[48753]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ehpxrghpmfquwouxlpnpnggfbczoneru ; /usr/bin/python3
Feb 23 07:50:16 np0005626463.localdomain sudo[48753]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:50:16 np0005626463.localdomain python3[48755]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None
Feb 23 07:50:16 np0005626463.localdomain python3[48755]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1 --format json
Feb 23 07:50:16 np0005626463.localdomain python3[48755]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1 -q --tls-verify=false
Feb 23 07:50:23 np0005626463.localdomain podman[48768]: 2026-02-23 07:50:16.669682688 +0000 UTC m=+0.039947640 image pull  registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1
Feb 23 07:50:23 np0005626463.localdomain python3[48755]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect 591bb9fb46a70e9f840f28502388406078442df6b6701a3c17990ee75e333673 --format json
Feb 23 07:50:23 np0005626463.localdomain sudo[48753]: pam_unix(sudo:session): session closed for user root
Feb 23 07:50:23 np0005626463.localdomain sudo[48868]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-znguksyljqurmevewwamdqdhmhthsfxy ; /usr/bin/python3
Feb 23 07:50:24 np0005626463.localdomain sudo[48868]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:50:24 np0005626463.localdomain python3[48870]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None
Feb 23 07:50:24 np0005626463.localdomain python3[48870]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1 --format json
Feb 23 07:50:24 np0005626463.localdomain python3[48870]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1 -q --tls-verify=false
Feb 23 07:50:31 np0005626463.localdomain podman[48883]: 2026-02-23 07:50:24.298567215 +0000 UTC m=+0.044786519 image pull  registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1
Feb 23 07:50:31 np0005626463.localdomain python3[48870]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect d59b33e7fb841c47a47a12b18fb68b11debd968b4596c63f3177ecc7400fb1bc --format json
Feb 23 07:50:31 np0005626463.localdomain sudo[48868]: pam_unix(sudo:session): session closed for user root
Feb 23 07:50:31 np0005626463.localdomain sudo[48985]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hrtyxuutrdwomwgnskmptwdvwhdlkmdi ; /usr/bin/python3
Feb 23 07:50:31 np0005626463.localdomain sudo[48985]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:50:31 np0005626463.localdomain python3[48987]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None
Feb 23 07:50:31 np0005626463.localdomain python3[48987]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 --format json
Feb 23 07:50:31 np0005626463.localdomain python3[48987]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 -q --tls-verify=false
Feb 23 07:50:44 np0005626463.localdomain sshd[49308]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 07:50:44 np0005626463.localdomain sshd[49308]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 07:50:46 np0005626463.localdomain sudo[49321]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 07:50:46 np0005626463.localdomain sudo[49321]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 07:50:46 np0005626463.localdomain sudo[49321]: pam_unix(sudo:session): session closed for user root
Feb 23 07:50:46 np0005626463.localdomain sudo[49336]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Feb 23 07:50:46 np0005626463.localdomain sudo[49336]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 07:50:51 np0005626463.localdomain podman[48999]: 2026-02-23 07:50:31.883839947 +0000 UTC m=+0.026881540 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Feb 23 07:50:51 np0005626463.localdomain python3[48987]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect 6eddd23e1e6adfbfa713a747123707c02f92ffdbf1913da92f171aba1d6d7856 --format json
Feb 23 07:50:51 np0005626463.localdomain sudo[48985]: pam_unix(sudo:session): session closed for user root
Feb 23 07:50:51 np0005626463.localdomain sudo[49456]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ljaiirkjiikkwxshteagyxbximowrfte ; /usr/bin/python3
Feb 23 07:50:51 np0005626463.localdomain sudo[49456]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:50:51 np0005626463.localdomain systemd[1]: tmp-crun.mvJ2O7.mount: Deactivated successfully.
Feb 23 07:50:51 np0005626463.localdomain podman[49440]: 2026-02-23 07:50:51.635154564 +0000 UTC m=+0.118668395 container exec fdf07215f0388d0ebc44f1f3744080ba594441e647c300d0dade62ff5beba234 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-crash-np0005626463, vcs-type=git, RELEASE=main, GIT_BRANCH=main, distribution-scope=public, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.created=2026-02-09T10:25:24Z, release=1770267347, name=rhceph, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.42.2, version=7, build-date=2026-02-09T10:25:24Z)
Feb 23 07:50:51 np0005626463.localdomain python3[49464]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None
Feb 23 07:50:51 np0005626463.localdomain python3[49464]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 --format json
Feb 23 07:50:51 np0005626463.localdomain python3[49464]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 -q --tls-verify=false
Feb 23 07:50:51 np0005626463.localdomain podman[49440]: 2026-02-23 07:50:51.746252873 +0000 UTC m=+0.229766724 container exec_died fdf07215f0388d0ebc44f1f3744080ba594441e647c300d0dade62ff5beba234 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-crash-np0005626463, name=rhceph, release=1770267347, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, RELEASE=main, com.redhat.component=rhceph-container, build-date=2026-02-09T10:25:24Z, version=7, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.tags=rhceph ceph, io.buildah.version=1.42.2, vcs-type=git, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, CEPH_POINT_RELEASE=, org.opencontainers.image.created=2026-02-09T10:25:24Z, ceph=True)
Feb 23 07:50:51 np0005626463.localdomain sudo[49336]: pam_unix(sudo:session): session closed for user root
Feb 23 07:50:52 np0005626463.localdomain sudo[49542]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 07:50:52 np0005626463.localdomain sudo[49542]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 07:50:52 np0005626463.localdomain sudo[49542]: pam_unix(sudo:session): session closed for user root
Feb 23 07:50:52 np0005626463.localdomain sudo[49557]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 23 07:50:52 np0005626463.localdomain sudo[49557]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 07:50:52 np0005626463.localdomain sudo[49557]: pam_unix(sudo:session): session closed for user root
Feb 23 07:50:53 np0005626463.localdomain sudo[49629]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 07:50:53 np0005626463.localdomain sudo[49629]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 07:50:53 np0005626463.localdomain sudo[49629]: pam_unix(sudo:session): session closed for user root
Feb 23 07:50:53 np0005626463.localdomain sshd[49644]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 07:50:53 np0005626463.localdomain sshd[49644]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 07:51:04 np0005626463.localdomain podman[49485]: 2026-02-23 07:50:51.783340871 +0000 UTC m=+0.035920803 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1
Feb 23 07:51:04 np0005626463.localdomain python3[49464]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect 2c8610235afe953aa46efb141a5a988799548b22280d65a7e7ab21889422df37 --format json
Feb 23 07:51:04 np0005626463.localdomain sudo[49456]: pam_unix(sudo:session): session closed for user root
Feb 23 07:51:04 np0005626463.localdomain sudo[49697]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oovhdpysgtzixdomtjwccsqqciwgznur ; /usr/bin/python3
Feb 23 07:51:04 np0005626463.localdomain sudo[49697]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:51:04 np0005626463.localdomain python3[49699]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None
Feb 23 07:51:04 np0005626463.localdomain python3[49699]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1 --format json
Feb 23 07:51:04 np0005626463.localdomain python3[49699]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1 -q --tls-verify=false
Feb 23 07:51:12 np0005626463.localdomain podman[49712]: 2026-02-23 07:51:04.714589934 +0000 UTC m=+0.042490858 image pull  registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1
Feb 23 07:51:12 np0005626463.localdomain python3[49699]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect 9ab5aab6d0c3ec80926032b7acf4cec1d4710f1c2daccd17ae4daa64399ec237 --format json
Feb 23 07:51:12 np0005626463.localdomain sudo[49697]: pam_unix(sudo:session): session closed for user root
Feb 23 07:51:12 np0005626463.localdomain sudo[49962]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qqshojdgmnlpelpvyivdvuyctultbzfq ; /usr/bin/python3
Feb 23 07:51:12 np0005626463.localdomain sudo[49962]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:51:12 np0005626463.localdomain python3[49964]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None
Feb 23 07:51:12 np0005626463.localdomain python3[49964]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1 --format json
Feb 23 07:51:12 np0005626463.localdomain python3[49964]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1 -q --tls-verify=false
Feb 23 07:51:16 np0005626463.localdomain podman[49977]: 2026-02-23 07:51:12.855533354 +0000 UTC m=+0.041293521 image pull  registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1
Feb 23 07:51:16 np0005626463.localdomain python3[49964]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect 4853142d85dba3766b28d28ae195b26f7242230fe3646e9590a7aee2dc2e0dfa --format json
Feb 23 07:51:16 np0005626463.localdomain sudo[49962]: pam_unix(sudo:session): session closed for user root
Feb 23 07:51:17 np0005626463.localdomain sudo[50054]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kjlrkxqamoujpfwxaembbpxqmcokzsvm ; /usr/bin/python3
Feb 23 07:51:17 np0005626463.localdomain sudo[50054]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:51:17 np0005626463.localdomain python3[50056]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None
Feb 23 07:51:17 np0005626463.localdomain python3[50056]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1 --format json
Feb 23 07:51:17 np0005626463.localdomain python3[50056]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1 -q --tls-verify=false
Feb 23 07:51:19 np0005626463.localdomain podman[50069]: 2026-02-23 07:51:17.318590851 +0000 UTC m=+0.028555403 image pull  registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1
Feb 23 07:51:19 np0005626463.localdomain python3[50056]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect 9ac6ea63c0fb4851145e847f9ced2f20804afc8472907b63a82d5866f5cf608a --format json
Feb 23 07:51:19 np0005626463.localdomain sudo[50054]: pam_unix(sudo:session): session closed for user root
Feb 23 07:51:19 np0005626463.localdomain sudo[50145]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qopesnzouhlvzmuldintshdstdltvujj ; /usr/bin/python3
Feb 23 07:51:19 np0005626463.localdomain sudo[50145]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:51:19 np0005626463.localdomain python3[50147]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None
Feb 23 07:51:19 np0005626463.localdomain python3[50147]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1 --format json
Feb 23 07:51:19 np0005626463.localdomain python3[50147]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1 -q --tls-verify=false
Feb 23 07:51:21 np0005626463.localdomain podman[50161]: 2026-02-23 07:51:19.989754824 +0000 UTC m=+0.042773566 image pull  registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1
Feb 23 07:51:21 np0005626463.localdomain python3[50147]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect ba1a08ea1c1207b471b1f02cee16ff456b8a812662cce16906d16de330a66d63 --format json
Feb 23 07:51:21 np0005626463.localdomain sudo[50145]: pam_unix(sudo:session): session closed for user root
Feb 23 07:51:22 np0005626463.localdomain sudo[50236]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hxkxqsxhrxhtjdaqnkuhuwwpakxvnekm ; /usr/bin/python3
Feb 23 07:51:22 np0005626463.localdomain sudo[50236]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:51:22 np0005626463.localdomain python3[50238]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None
Feb 23 07:51:22 np0005626463.localdomain python3[50238]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1 --format json
Feb 23 07:51:22 np0005626463.localdomain python3[50238]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1 -q --tls-verify=false
Feb 23 07:51:24 np0005626463.localdomain podman[50250]: 2026-02-23 07:51:22.422270718 +0000 UTC m=+0.044129849 image pull  registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1
Feb 23 07:51:24 np0005626463.localdomain python3[50238]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect 8576d3a17e57ea28f29435f132f583320941b5aa7bf0aa02e998b09a094d1fe8 --format json
Feb 23 07:51:24 np0005626463.localdomain sudo[50236]: pam_unix(sudo:session): session closed for user root
Feb 23 07:51:25 np0005626463.localdomain sudo[50327]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uuknayqaysoourivwbsdsuotolcvnrdp ; /usr/bin/python3
Feb 23 07:51:25 np0005626463.localdomain sudo[50327]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:51:25 np0005626463.localdomain python3[50329]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None
Feb 23 07:51:25 np0005626463.localdomain python3[50329]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1 --format json
Feb 23 07:51:25 np0005626463.localdomain python3[50329]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1 -q --tls-verify=false
Feb 23 07:51:28 np0005626463.localdomain podman[50341]: 2026-02-23 07:51:25.396542765 +0000 UTC m=+0.043706366 image pull  registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1
Feb 23 07:51:28 np0005626463.localdomain python3[50329]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect 7fcbf63c0504494c8fcaa07583f909a06486472a0982aeac9554c6fdbeb04c9a --format json
Feb 23 07:51:28 np0005626463.localdomain sudo[50327]: pam_unix(sudo:session): session closed for user root
Feb 23 07:51:29 np0005626463.localdomain sudo[50428]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kusccbhtqqmegslcmbemjincnwmupqxl ; /usr/bin/python3
Feb 23 07:51:29 np0005626463.localdomain sudo[50428]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:51:29 np0005626463.localdomain python3[50430]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None
Feb 23 07:51:29 np0005626463.localdomain python3[50430]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-cron:17.1 --format json
Feb 23 07:51:29 np0005626463.localdomain python3[50430]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-cron:17.1 -q --tls-verify=false
Feb 23 07:51:31 np0005626463.localdomain podman[50442]: 2026-02-23 07:51:29.306750704 +0000 UTC m=+0.048789324 image pull  registry.redhat.io/rhosp-rhel9/openstack-cron:17.1
Feb 23 07:51:31 np0005626463.localdomain python3[50430]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect 72ddf109f135b64d3116af7b84caaa358dc72e2e60f4c8753fa54fa65b76ba35 --format json
Feb 23 07:51:31 np0005626463.localdomain sudo[50428]: pam_unix(sudo:session): session closed for user root
Feb 23 07:51:31 np0005626463.localdomain sudo[50517]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zpxcnlxfflcpkbmyqnjwubzeynqageql ; /usr/bin/python3
Feb 23 07:51:31 np0005626463.localdomain sudo[50517]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:51:31 np0005626463.localdomain python3[50519]: ansible-stat Invoked with path=/var/lib/tripleo-config/container-startup-config/step_1 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 23 07:51:31 np0005626463.localdomain sudo[50517]: pam_unix(sudo:session): session closed for user root
Feb 23 07:51:32 np0005626463.localdomain sudo[50567]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ifmdhszlrobccsmixemknsxjwjvgdtgu ; /usr/bin/python3
Feb 23 07:51:32 np0005626463.localdomain sudo[50567]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:51:32 np0005626463.localdomain sudo[50567]: pam_unix(sudo:session): session closed for user root
Feb 23 07:51:32 np0005626463.localdomain sudo[50585]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-opaqvgxaphkixalpuoxfkckecusszejx ; /usr/bin/python3
Feb 23 07:51:32 np0005626463.localdomain sudo[50585]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:51:32 np0005626463.localdomain sudo[50585]: pam_unix(sudo:session): session closed for user root
Feb 23 07:51:33 np0005626463.localdomain sudo[50689]: tripleo-admin : TTY=pts/0 ; PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qpmgkhokerktkzrctbttbqqqeyocwqvx ; ANSIBLE_ASYNC_DIR=/tmp/.ansible_async /usr/bin/python3 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1771833092.9381783-84129-259736671855792/async_wrapper.py 138320676718 3600 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1771833092.9381783-84129-259736671855792/AnsiballZ_command.py _
Feb 23 07:51:33 np0005626463.localdomain sudo[50689]: pam_unix(sudo:session): session opened for user root(uid=0) by tripleo-admin(uid=1003)
Feb 23 07:51:33 np0005626463.localdomain ansible-async_wrapper.py[50691]: Invoked with 138320676718 3600 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1771833092.9381783-84129-259736671855792/AnsiballZ_command.py _
Feb 23 07:51:33 np0005626463.localdomain ansible-async_wrapper.py[50694]: Starting module and watcher
Feb 23 07:51:33 np0005626463.localdomain ansible-async_wrapper.py[50694]: Start watching 50695 (3600)
Feb 23 07:51:33 np0005626463.localdomain ansible-async_wrapper.py[50695]: Start module (50695)
Feb 23 07:51:33 np0005626463.localdomain ansible-async_wrapper.py[50691]: Return async_wrapper task started.
Feb 23 07:51:33 np0005626463.localdomain sudo[50689]: pam_unix(sudo:session): session closed for user root
Feb 23 07:51:33 np0005626463.localdomain sudo[50710]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eapmzxlzizkyhdqrvkasnresasyszgou ; /usr/bin/python3
Feb 23 07:51:33 np0005626463.localdomain sudo[50710]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:51:33 np0005626463.localdomain python3[50715]: ansible-ansible.legacy.async_status Invoked with jid=138320676718.50691 mode=status _async_dir=/tmp/.ansible_async
Feb 23 07:51:33 np0005626463.localdomain sudo[50710]: pam_unix(sudo:session): session closed for user root
Feb 23 07:51:36 np0005626463.localdomain sshd[50825]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 07:51:37 np0005626463.localdomain sshd[50825]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 07:51:37 np0005626463.localdomain puppet-user[50714]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5
Feb 23 07:51:37 np0005626463.localdomain puppet-user[50714]:    (file: /etc/puppet/hiera.yaml)
Feb 23 07:51:37 np0005626463.localdomain puppet-user[50714]: Warning: Undefined variable '::deploy_config_name';
Feb 23 07:51:37 np0005626463.localdomain puppet-user[50714]:    (file & line not available)
Feb 23 07:51:37 np0005626463.localdomain puppet-user[50714]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html
Feb 23 07:51:37 np0005626463.localdomain puppet-user[50714]:    (file & line not available)
Feb 23 07:51:37 np0005626463.localdomain puppet-user[50714]: Warning: Unknown variable: '::deployment_type'. (file: /etc/puppet/modules/tripleo/manifests/profile/base/database/mysql/client.pp, line: 89, column: 8)
Feb 23 07:51:38 np0005626463.localdomain puppet-user[50714]: Warning: Unknown variable: '::deployment_type'. (file: /etc/puppet/modules/tripleo/manifests/packages.pp, line: 39, column: 69)
Feb 23 07:51:38 np0005626463.localdomain puppet-user[50714]: Notice: Compiled catalog for np0005626463.localdomain in environment production in 0.12 seconds
Feb 23 07:51:38 np0005626463.localdomain puppet-user[50714]: Notice: /Stage[main]/Tripleo::Profile::Base::Database::Mysql::Client/Exec[directory-create-etc-my.cnf.d]/returns: executed successfully
Feb 23 07:51:38 np0005626463.localdomain puppet-user[50714]: Notice: /Stage[main]/Tripleo::Profile::Base::Database::Mysql::Client/File[/etc/my.cnf.d/tripleo.cnf]/ensure: created
Feb 23 07:51:38 np0005626463.localdomain puppet-user[50714]: Notice: /Stage[main]/Tripleo::Profile::Base::Database::Mysql::Client/Augeas[tripleo-mysql-client-conf]/returns: executed successfully
Feb 23 07:51:38 np0005626463.localdomain puppet-user[50714]: Notice: Applied catalog in 0.06 seconds
Feb 23 07:51:38 np0005626463.localdomain puppet-user[50714]: Application:
Feb 23 07:51:38 np0005626463.localdomain puppet-user[50714]:    Initial environment: production
Feb 23 07:51:38 np0005626463.localdomain puppet-user[50714]:    Converged environment: production
Feb 23 07:51:38 np0005626463.localdomain puppet-user[50714]:          Run mode: user
Feb 23 07:51:38 np0005626463.localdomain puppet-user[50714]: Changes:
Feb 23 07:51:38 np0005626463.localdomain puppet-user[50714]:             Total: 3
Feb 23 07:51:38 np0005626463.localdomain puppet-user[50714]: Events:
Feb 23 07:51:38 np0005626463.localdomain puppet-user[50714]:           Success: 3
Feb 23 07:51:38 np0005626463.localdomain puppet-user[50714]:             Total: 3
Feb 23 07:51:38 np0005626463.localdomain puppet-user[50714]: Resources:
Feb 23 07:51:38 np0005626463.localdomain puppet-user[50714]:           Changed: 3
Feb 23 07:51:38 np0005626463.localdomain puppet-user[50714]:       Out of sync: 3
Feb 23 07:51:38 np0005626463.localdomain puppet-user[50714]:             Total: 10
Feb 23 07:51:38 np0005626463.localdomain puppet-user[50714]: Time:
Feb 23 07:51:38 np0005626463.localdomain puppet-user[50714]:          Schedule: 0.00
Feb 23 07:51:38 np0005626463.localdomain puppet-user[50714]:              File: 0.00
Feb 23 07:51:38 np0005626463.localdomain puppet-user[50714]:              Exec: 0.02
Feb 23 07:51:38 np0005626463.localdomain puppet-user[50714]:            Augeas: 0.02
Feb 23 07:51:38 np0005626463.localdomain puppet-user[50714]:    Transaction evaluation: 0.05
Feb 23 07:51:38 np0005626463.localdomain puppet-user[50714]:    Catalog application: 0.06
Feb 23 07:51:38 np0005626463.localdomain puppet-user[50714]:    Config retrieval: 0.16
Feb 23 07:51:38 np0005626463.localdomain puppet-user[50714]:          Last run: 1771833098
Feb 23 07:51:38 np0005626463.localdomain puppet-user[50714]:        Filebucket: 0.00
Feb 23 07:51:38 np0005626463.localdomain puppet-user[50714]:             Total: 0.06
Feb 23 07:51:38 np0005626463.localdomain puppet-user[50714]: Version:
Feb 23 07:51:38 np0005626463.localdomain puppet-user[50714]:            Config: 1771833097
Feb 23 07:51:38 np0005626463.localdomain puppet-user[50714]:            Puppet: 7.10.0
Feb 23 07:51:38 np0005626463.localdomain ansible-async_wrapper.py[50695]: Module complete (50695)
Feb 23 07:51:38 np0005626463.localdomain ansible-async_wrapper.py[50694]: Done in kid B.
Feb 23 07:51:40 np0005626463.localdomain sshd[51071]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 07:51:43 np0005626463.localdomain sshd[51071]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 07:51:43 np0005626463.localdomain sudo[51086]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tlvdjvognsloiijgflomkiewenntcbtr ; /usr/bin/python3
Feb 23 07:51:43 np0005626463.localdomain sudo[51086]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:51:44 np0005626463.localdomain python3[51088]: ansible-ansible.legacy.async_status Invoked with jid=138320676718.50691 mode=status _async_dir=/tmp/.ansible_async
Feb 23 07:51:44 np0005626463.localdomain sudo[51086]: pam_unix(sudo:session): session closed for user root
Feb 23 07:51:44 np0005626463.localdomain sudo[51102]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vrgyyzrxuklqghvpqyvajdarardfiusm ; /usr/bin/python3
Feb 23 07:51:44 np0005626463.localdomain sudo[51102]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:51:44 np0005626463.localdomain python3[51104]: ansible-file Invoked with path=/var/lib/container-puppet/puppetlabs state=directory setype=svirt_sandbox_file_t selevel=s0 recurse=True force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None
Feb 23 07:51:44 np0005626463.localdomain sudo[51102]: pam_unix(sudo:session): session closed for user root
Feb 23 07:51:44 np0005626463.localdomain sudo[51118]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rovltewqeqfeaxqlyirjnlakoevqqbbl ; /usr/bin/python3
Feb 23 07:51:44 np0005626463.localdomain sudo[51118]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:51:45 np0005626463.localdomain python3[51120]: ansible-stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 23 07:51:45 np0005626463.localdomain sudo[51118]: pam_unix(sudo:session): session closed for user root
Feb 23 07:51:45 np0005626463.localdomain sudo[51166]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dbuhmqvhlbrzhvpbwsecugyqshvtzcik ; /usr/bin/python3
Feb 23 07:51:45 np0005626463.localdomain sudo[51166]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:51:45 np0005626463.localdomain python3[51168]: ansible-ansible.legacy.stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 07:51:45 np0005626463.localdomain sudo[51166]: pam_unix(sudo:session): session closed for user root
Feb 23 07:51:45 np0005626463.localdomain sudo[51209]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-udpcdpfhpckxxivfqrfxscsgidnyciwr ; /usr/bin/python3
Feb 23 07:51:45 np0005626463.localdomain sudo[51209]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:51:46 np0005626463.localdomain python3[51211]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/container-puppet/puppetlabs/facter.conf setype=svirt_sandbox_file_t selevel=s0 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771833105.2946615-84391-65683064465177/source _original_basename=tmpw3qn043_ follow=False checksum=53908622cb869db5e2e2a68e737aa2ab1a872111 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None attributes=None
Feb 23 07:51:46 np0005626463.localdomain sudo[51209]: pam_unix(sudo:session): session closed for user root
Feb 23 07:51:46 np0005626463.localdomain sudo[51239]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jlatlgwxsbutwazawktmuihpkglygvcs ; /usr/bin/python3
Feb 23 07:51:46 np0005626463.localdomain sudo[51239]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:51:46 np0005626463.localdomain python3[51241]: ansible-file Invoked with path=/opt/puppetlabs/facter state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 07:51:46 np0005626463.localdomain sudo[51239]: pam_unix(sudo:session): session closed for user root
Feb 23 07:51:46 np0005626463.localdomain sudo[51255]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-knexkqxngyjibovsrinagdmwflyignjs ; /usr/bin/python3
Feb 23 07:51:46 np0005626463.localdomain sudo[51255]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:51:47 np0005626463.localdomain sudo[51255]: pam_unix(sudo:session): session closed for user root
Feb 23 07:51:47 np0005626463.localdomain sudo[51342]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-phjostclqhxszszzjgzyacscsealzqve ; /usr/bin/python3
Feb 23 07:51:47 np0005626463.localdomain sudo[51342]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:51:47 np0005626463.localdomain python3[51344]: ansible-ansible.posix.synchronize Invoked with src=/opt/puppetlabs/ dest=/var/lib/container-puppet/puppetlabs/ _local_rsync_path=rsync _local_rsync_password=NOT_LOGGING_PARAMETER rsync_path=None delete=False _substitute_controller=False archive=True checksum=False compress=True existing_only=False dirs=False copy_links=False set_remote_user=True rsync_timeout=0 rsync_opts=[] ssh_connection_multiplexing=False partial=False verify_host=False mode=push dest_port=None private_key=None recursive=None links=None perms=None times=None owner=None group=None ssh_args=None link_dest=None
Feb 23 07:51:47 np0005626463.localdomain sudo[51342]: pam_unix(sudo:session): session closed for user root
Feb 23 07:51:47 np0005626463.localdomain sudo[51361]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vlgvbjjanlxjwzncayahgxqnxkdipesw ; /usr/bin/python3
Feb 23 07:51:47 np0005626463.localdomain sudo[51361]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:51:48 np0005626463.localdomain python3[51363]: ansible-file Invoked with path=/var/lib/tripleo-config/container-puppet-config mode=448 recurse=True setype=container_file_t force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 23 07:51:48 np0005626463.localdomain sudo[51361]: pam_unix(sudo:session): session closed for user root
Feb 23 07:51:48 np0005626463.localdomain sudo[51377]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-holvhuymurwyrylvgmyaxushagobdall ; /usr/bin/python3
Feb 23 07:51:48 np0005626463.localdomain sudo[51377]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:51:48 np0005626463.localdomain python3[51379]: ansible-container_puppet_config Invoked with check_mode=False config_vol_prefix=/var/lib/config-data debug=True net_host=True no_archive=False puppet_config=/var/lib/container-puppet/container-puppet.json short_hostname=np0005626463 step=1 update_config_hash_only=False
Feb 23 07:51:48 np0005626463.localdomain sudo[51377]: pam_unix(sudo:session): session closed for user root
Feb 23 07:51:48 np0005626463.localdomain sudo[51393]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-parxckyuhhgvhjcfocybodzdazzgignu ; /usr/bin/python3
Feb 23 07:51:48 np0005626463.localdomain sudo[51393]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:51:49 np0005626463.localdomain python3[51395]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 07:51:49 np0005626463.localdomain sudo[51393]: pam_unix(sudo:session): session closed for user root
Feb 23 07:51:49 np0005626463.localdomain sudo[51409]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fyuxbzdiuwxjjzrdlsmvdgcnkyprtzum ; /usr/bin/python3
Feb 23 07:51:49 np0005626463.localdomain sudo[51409]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:51:49 np0005626463.localdomain python3[51411]: ansible-container_config_data Invoked with config_path=/var/lib/tripleo-config/container-puppet-config/step_1 config_pattern=container-puppet-*.json config_overrides={} debug=True
Feb 23 07:51:49 np0005626463.localdomain sudo[51409]: pam_unix(sudo:session): session closed for user root
Feb 23 07:51:49 np0005626463.localdomain sudo[51425]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dqzwfxiimmubvvigjaheukhbgwjepeeu ; /usr/bin/python3
Feb 23 07:51:49 np0005626463.localdomain sudo[51425]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:51:50 np0005626463.localdomain python3[51427]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Feb 23 07:51:50 np0005626463.localdomain sudo[51425]: pam_unix(sudo:session): session closed for user root
Feb 23 07:51:50 np0005626463.localdomain sudo[51464]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uixyezojdnbkcjjecbzmccfndytqqxfz ; /usr/bin/python3
Feb 23 07:51:50 np0005626463.localdomain sudo[51464]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:51:50 np0005626463.localdomain python3[51466]: ansible-tripleo_container_manage Invoked with config_id=tripleo_puppet_step1 config_dir=/var/lib/tripleo-config/container-puppet-config/step_1 config_patterns=container-puppet-*.json config_overrides={} concurrency=6 log_base_path=/var/log/containers/stdouts debug=False
Feb 23 07:51:51 np0005626463.localdomain podman[51642]: 2026-02-23 07:51:51.271615711 +0000 UTC m=+0.068008691 container create 9d8e4721b422d8307f077c5cd6534db0034288ebb74472da492d4e73cd4218cc (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=container-puppet-iscsid, config_id=tripleo_puppet_step1, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005626463', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,iscsid_config', 'NAME': 'iscsid', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::iscsid\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/iscsi:/tmp/iscsi.host:z', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, io.openshift.expose-services=, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.5, container_name=container-puppet-iscsid, distribution-scope=public, vcs-ref=705339545363fec600102567c4e923938e0f43b3, architecture=x86_64, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:34:43Z, name=rhosp-rhel9/openstack-iscsid, com.redhat.component=openstack-iscsid-container, release=1766032510, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, summary=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, vcs-type=git, build-date=2026-01-12T22:34:43Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 23 07:51:51 np0005626463.localdomain podman[51664]: 2026-02-23 07:51:51.306306177 +0000 UTC m=+0.088385513 container create 97c7bbaed7d61c77b044fd13301ae7ea20d1e615cfd076275e14c85aa945e52d (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=container-puppet-metrics_qdr, architecture=x86_64, vcs-type=git, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-qdrouterd, container_name=container-puppet-metrics_qdr, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, build-date=2026-01-12T22:10:14Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005626463', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'metrics_qdr', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::qdr\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_puppet_step1, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, release=1766032510)
Feb 23 07:51:51 np0005626463.localdomain podman[51674]: 2026-02-23 07:51:51.315433701 +0000 UTC m=+0.074187313 container create eacb05cb99bd4011e7883f35a49a01d17539879336879b6da96321ac1ca91be4 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=container-puppet-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, distribution-scope=public, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005626463', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,collectd_client_config,exec', 'NAME': 'collectd', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::collectd'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, build-date=2026-01-12T22:10:15Z, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, vendor=Red Hat, Inc., io.openshift.expose-services=, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, summary=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, managed_by=tripleo_ansible, io.buildah.version=1.41.5, vcs-type=git, config_id=tripleo_puppet_step1, container_name=container-puppet-collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 collectd)
Feb 23 07:51:51 np0005626463.localdomain systemd[1]: Started libpod-conmon-9d8e4721b422d8307f077c5cd6534db0034288ebb74472da492d4e73cd4218cc.scope.
Feb 23 07:51:51 np0005626463.localdomain podman[51673]: 2026-02-23 07:51:51.330757086 +0000 UTC m=+0.101187061 container create 6b008fe2ff44586e63a5b97cc40b6845717ae89a6c7db0ce7b334251da9dbb22 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=container-puppet-crond, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, com.redhat.component=openstack-cron-container, url=https://www.redhat.com, container_name=container-puppet-crond, batch=17.1_20260112.1, vcs-type=git, build-date=2026-01-12T22:10:15Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_id=tripleo_puppet_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005626463', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'crond', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::logrotate'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, description=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., release=1766032510, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-cron, tcib_managed=true)
Feb 23 07:51:51 np0005626463.localdomain podman[51684]: 2026-02-23 07:51:51.342453639 +0000 UTC m=+0.085646998 container create ae6d349aadf9563baef5ab53f4404eb45b0b3df6e2f63b0bac1cbbfd883bf757 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=container-puppet-nova_libvirt, build-date=2026-01-12T23:31:49Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, container_name=container-puppet-nova_libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.created=2026-01-12T23:31:49Z, batch=17.1_20260112.1, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005626463', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,nova_config,libvirtd_config,virtlogd_config,virtproxyd_config,virtqemud_config,virtnodedevd_config,virtsecretd_config,virtstoraged_config,nova_config,file,libvirt_tls_password,libvirtd_config,nova_config,file,libvirt_tls_password', 'NAME': 'nova_libvirt', 'STEP_CONFIG': "include ::tripleo::packages\n# TODO(emilien): figure how to deal with libvirt profile.\n# We'll probably treat it like we do with Neutron plugins.\n# Until then, just include it in the default nova-compute role.\ninclude tripleo::profile::base::nova::compute::libvirt\n\ninclude tripleo::profile::base::nova::libvirt\n\ninclude tripleo::profile::base::nova::compute::libvirt_guests\n\ninclude tripleo::profile::base::sshd\ninclude tripleo::profile::base::nova::migration::target"}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, release=1766032510, tcib_managed=true, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, io.openshift.expose-services=, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-libvirt, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., url=https://www.redhat.com, config_id=tripleo_puppet_step1, maintainer=OpenStack TripleO Team, architecture=x86_64, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public)
Feb 23 07:51:51 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 07:51:51 np0005626463.localdomain systemd[1]: Started libpod-conmon-97c7bbaed7d61c77b044fd13301ae7ea20d1e615cfd076275e14c85aa945e52d.scope.
Feb 23 07:51:51 np0005626463.localdomain systemd[1]: Started libpod-conmon-eacb05cb99bd4011e7883f35a49a01d17539879336879b6da96321ac1ca91be4.scope.
Feb 23 07:51:51 np0005626463.localdomain podman[51642]: 2026-02-23 07:51:51.243616222 +0000 UTC m=+0.040009232 image pull  registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1
Feb 23 07:51:51 np0005626463.localdomain systemd[1]: Started libpod-conmon-6b008fe2ff44586e63a5b97cc40b6845717ae89a6c7db0ce7b334251da9dbb22.scope.
Feb 23 07:51:51 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bf00c918822c143f438250923a86f39afe39c46ee0adc9fcf99ac7bc5e8117c1/merged/tmp/iscsi.host supports timestamps until 2038 (0x7fffffff)
Feb 23 07:51:51 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bf00c918822c143f438250923a86f39afe39c46ee0adc9fcf99ac7bc5e8117c1/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff)
Feb 23 07:51:51 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 07:51:51 np0005626463.localdomain podman[51673]: 2026-02-23 07:51:51.253307853 +0000 UTC m=+0.023737818 image pull  registry.redhat.io/rhosp-rhel9/openstack-cron:17.1
Feb 23 07:51:51 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6de1ea423bd6d005b3c98cfa65644155837a90a885b901ec0c789a8fa4573360/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff)
Feb 23 07:51:51 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 07:51:51 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1a309f49ba9a6af0e1193f20a6ae2dd065eaab6a23a55dd0b287fffd33cd7437/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff)
Feb 23 07:51:51 np0005626463.localdomain podman[51664]: 2026-02-23 07:51:51.266187213 +0000 UTC m=+0.048266569 image pull  registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1
Feb 23 07:51:51 np0005626463.localdomain podman[51664]: 2026-02-23 07:51:51.366991811 +0000 UTC m=+0.149071157 container init 97c7bbaed7d61c77b044fd13301ae7ea20d1e615cfd076275e14c85aa945e52d (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=container-puppet-metrics_qdr, build-date=2026-01-12T22:10:14Z, config_id=tripleo_puppet_step1, io.buildah.version=1.41.5, architecture=x86_64, container_name=container-puppet-metrics_qdr, io.openshift.expose-services=, name=rhosp-rhel9/openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005626463', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'metrics_qdr', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::qdr\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.k8s.description=Red Hat OpenStack 
Platform 17.1 qdrouterd, vendor=Red Hat, Inc., version=17.1.13, batch=17.1_20260112.1, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:14Z, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 23 07:51:51 np0005626463.localdomain podman[51674]: 2026-02-23 07:51:51.26966054 +0000 UTC m=+0.028414182 image pull  registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1
Feb 23 07:51:51 np0005626463.localdomain podman[51674]: 2026-02-23 07:51:51.368768076 +0000 UTC m=+0.127521688 container init eacb05cb99bd4011e7883f35a49a01d17539879336879b6da96321ac1ca91be4 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=container-puppet-collectd, config_id=tripleo_puppet_step1, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, build-date=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, distribution-scope=public, com.redhat.component=openstack-collectd-container, vendor=Red Hat, Inc., vcs-type=git, container_name=container-puppet-collectd, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, batch=17.1_20260112.1, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005626463', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,collectd_client_config,exec', 'NAME': 'collectd', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::collectd'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64)
Feb 23 07:51:51 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 07:51:51 np0005626463.localdomain systemd[1]: Started libpod-conmon-ae6d349aadf9563baef5ab53f4404eb45b0b3df6e2f63b0bac1cbbfd883bf757.scope.
Feb 23 07:51:51 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/912153276d119d62292ee43dc157a09b9029351ec12b42dcebd4c826260b5572/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff)
Feb 23 07:51:51 np0005626463.localdomain podman[51664]: 2026-02-23 07:51:51.379079996 +0000 UTC m=+0.161159342 container start 97c7bbaed7d61c77b044fd13301ae7ea20d1e615cfd076275e14c85aa945e52d (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=container-puppet-metrics_qdr, name=rhosp-rhel9/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.5, version=17.1.13, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005626463', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'metrics_qdr', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::qdr\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', 
'/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, tcib_managed=true, vcs-type=git, build-date=2026-01-12T22:10:14Z, distribution-scope=public, url=https://www.redhat.com, container_name=container-puppet-metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, config_id=tripleo_puppet_step1, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:14Z, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team)
Feb 23 07:51:51 np0005626463.localdomain podman[51664]: 2026-02-23 07:51:51.382103549 +0000 UTC m=+0.164182905 container attach 97c7bbaed7d61c77b044fd13301ae7ea20d1e615cfd076275e14c85aa945e52d (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=container-puppet-metrics_qdr, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005626463', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'metrics_qdr', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::qdr\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 
qdrouterd, tcib_managed=true, vendor=Red Hat, Inc., vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, url=https://www.redhat.com, config_id=tripleo_puppet_step1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:10:14Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-qdrouterd, architecture=x86_64, vcs-type=git, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, container_name=container-puppet-metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.openshift.expose-services=)
Feb 23 07:51:51 np0005626463.localdomain podman[51684]: 2026-02-23 07:51:51.292272302 +0000 UTC m=+0.035465671 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Feb 23 07:51:51 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 07:51:51 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f1c03aa8e256d3d38d275b9e911c2e9e69db76da13bfae548890815046fc902e/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff)
Feb 23 07:51:52 np0005626463.localdomain podman[51684]: 2026-02-23 07:51:52.338242141 +0000 UTC m=+1.081435560 container init ae6d349aadf9563baef5ab53f4404eb45b0b3df6e2f63b0bac1cbbfd883bf757 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=container-puppet-nova_libvirt, tcib_managed=true, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.buildah.version=1.41.5, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, org.opencontainers.image.created=2026-01-12T23:31:49Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, vendor=Red Hat, Inc., config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005626463', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,nova_config,libvirtd_config,virtlogd_config,virtproxyd_config,virtqemud_config,virtnodedevd_config,virtsecretd_config,virtstoraged_config,nova_config,file,libvirt_tls_password,libvirtd_config,nova_config,file,libvirt_tls_password', 'NAME': 'nova_libvirt', 'STEP_CONFIG': "include ::tripleo::packages\n# TODO(emilien): figure how to deal with libvirt profile.\n# We'll probably treat it like we do with Neutron plugins.\n# Until then, just include it in the 
default nova-compute role.\ninclude tripleo::profile::base::nova::compute::libvirt\n\ninclude tripleo::profile::base::nova::libvirt\n\ninclude tripleo::profile::base::nova::compute::libvirt_guests\n\ninclude tripleo::profile::base::sshd\ninclude tripleo::profile::base::nova::migration::target"}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, release=1766032510, name=rhosp-rhel9/openstack-nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, config_id=tripleo_puppet_step1, com.redhat.component=openstack-nova-libvirt-container, build-date=2026-01-12T23:31:49Z, container_name=container-puppet-nova_libvirt, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 23 07:51:52 np0005626463.localdomain systemd[1]: tmp-crun.n3i5Fg.mount: Deactivated successfully.
Feb 23 07:51:52 np0005626463.localdomain podman[51684]: 2026-02-23 07:51:52.360708528 +0000 UTC m=+1.103901927 container start ae6d349aadf9563baef5ab53f4404eb45b0b3df6e2f63b0bac1cbbfd883bf757 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=container-puppet-nova_libvirt, release=1766032510, config_id=tripleo_puppet_step1, architecture=x86_64, managed_by=tripleo_ansible, io.openshift.expose-services=, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005626463', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,nova_config,libvirtd_config,virtlogd_config,virtproxyd_config,virtqemud_config,virtnodedevd_config,virtsecretd_config,virtstoraged_config,nova_config,file,libvirt_tls_password,libvirtd_config,nova_config,file,libvirt_tls_password', 'NAME': 'nova_libvirt', 'STEP_CONFIG': "include ::tripleo::packages\n# TODO(emilien): figure how to deal with libvirt profile.\n# We'll probably treat it like we do with Neutron plugins.\n# Until then, just include it in the default nova-compute role.\ninclude tripleo::profile::base::nova::compute::libvirt\n\ninclude tripleo::profile::base::nova::libvirt\n\ninclude tripleo::profile::base::nova::compute::libvirt_guests\n\ninclude tripleo::profile::base::sshd\ninclude tripleo::profile::base::nova::migration::target"}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', 
'/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, com.redhat.component=openstack-nova-libvirt-container, build-date=2026-01-12T23:31:49Z, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, name=rhosp-rhel9/openstack-nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=container-puppet-nova_libvirt, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, org.opencontainers.image.created=2026-01-12T23:31:49Z, distribution-scope=public, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, version=17.1.13)
Feb 23 07:51:52 np0005626463.localdomain podman[51684]: 2026-02-23 07:51:52.361034498 +0000 UTC m=+1.104227897 container attach ae6d349aadf9563baef5ab53f4404eb45b0b3df6e2f63b0bac1cbbfd883bf757 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=container-puppet-nova_libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T23:31:49Z, vcs-type=git, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, release=1766032510, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, batch=17.1_20260112.1, com.redhat.component=openstack-nova-libvirt-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, build-date=2026-01-12T23:31:49Z, container_name=container-puppet-nova_libvirt, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005626463', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,nova_config,libvirtd_config,virtlogd_config,virtproxyd_config,virtqemud_config,virtnodedevd_config,virtsecretd_config,virtstoraged_config,nova_config,file,libvirt_tls_password,libvirtd_config,nova_config,file,libvirt_tls_password', 'NAME': 'nova_libvirt', 'STEP_CONFIG': "include ::tripleo::packages\n# TODO(emilien): figure how to deal with libvirt 
profile.\n# We'll probably treat it like we do with Neutron plugins.\n# Until then, just include it in the default nova-compute role.\ninclude tripleo::profile::base::nova::compute::libvirt\n\ninclude tripleo::profile::base::nova::libvirt\n\ninclude tripleo::profile::base::nova::compute::libvirt_guests\n\ninclude tripleo::profile::base::sshd\ninclude tripleo::profile::base::nova::migration::target"}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, config_id=tripleo_puppet_step1, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, version=17.1.13, name=rhosp-rhel9/openstack-nova-libvirt, distribution-scope=public, managed_by=tripleo_ansible, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe)
Feb 23 07:51:52 np0005626463.localdomain podman[51642]: 2026-02-23 07:51:52.376616801 +0000 UTC m=+1.173009821 container init 9d8e4721b422d8307f077c5cd6534db0034288ebb74472da492d4e73cd4218cc (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=container-puppet-iscsid, url=https://www.redhat.com, vcs-type=git, build-date=2026-01-12T22:34:43Z, distribution-scope=public, batch=17.1_20260112.1, version=17.1.13, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005626463', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,iscsid_config', 'NAME': 'iscsid', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::iscsid\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/iscsi:/tmp/iscsi.host:z', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, architecture=x86_64, com.redhat.component=openstack-iscsid-container, summary=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, 
konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_puppet_step1, container_name=container-puppet-iscsid, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.5, release=1766032510, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=705339545363fec600102567c4e923938e0f43b3)
Feb 23 07:51:52 np0005626463.localdomain podman[51674]: 2026-02-23 07:51:52.388370356 +0000 UTC m=+1.147124008 container start eacb05cb99bd4011e7883f35a49a01d17539879336879b6da96321ac1ca91be4 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=container-puppet-collectd, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-collectd, config_id=tripleo_puppet_step1, io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:10:15Z, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, tcib_managed=true, managed_by=tripleo_ansible, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005626463', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,collectd_client_config,exec', 'NAME': 'collectd', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::collectd'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', 
'/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, vcs-type=git, architecture=x86_64, version=17.1.13, com.redhat.component=openstack-collectd-container, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=container-puppet-collectd, release=1766032510, distribution-scope=public, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:10:15Z)
Feb 23 07:51:52 np0005626463.localdomain podman[51674]: 2026-02-23 07:51:52.389282374 +0000 UTC m=+1.148036076 container attach eacb05cb99bd4011e7883f35a49a01d17539879336879b6da96321ac1ca91be4 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=container-puppet-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, com.redhat.component=openstack-collectd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, version=17.1.13, tcib_managed=true, config_id=tripleo_puppet_step1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., release=1766032510, batch=17.1_20260112.1, architecture=x86_64, url=https://www.redhat.com, container_name=container-puppet-collectd, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:10:15Z, vcs-type=git, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005626463', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,collectd_client_config,exec', 'NAME': 'collectd', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::collectd'}, 'net': ['host'], 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee)
Feb 23 07:51:52 np0005626463.localdomain podman[51642]: 2026-02-23 07:51:52.396963383 +0000 UTC m=+1.193356393 container start 9d8e4721b422d8307f077c5cd6534db0034288ebb74472da492d4e73cd4218cc (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=container-puppet-iscsid, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005626463', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,iscsid_config', 'NAME': 'iscsid', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::iscsid\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/iscsi:/tmp/iscsi.host:z', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:34:43Z, build-date=2026-01-12T22:34:43Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, summary=Red Hat OpenStack Platform 17.1 iscsid, container_name=container-puppet-iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.buildah.version=1.41.5, version=17.1.13, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, config_id=tripleo_puppet_step1, tcib_managed=true, url=https://www.redhat.com, name=rhosp-rhel9/openstack-iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, distribution-scope=public)
Feb 23 07:51:52 np0005626463.localdomain podman[51642]: 2026-02-23 07:51:52.398260423 +0000 UTC m=+1.194653473 container attach 9d8e4721b422d8307f077c5cd6534db0034288ebb74472da492d4e73cd4218cc (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=container-puppet-iscsid, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005626463', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,iscsid_config', 'NAME': 'iscsid', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::iscsid\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/iscsi:/tmp/iscsi.host:z', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, 
distribution-scope=public, build-date=2026-01-12T22:34:43Z, summary=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_puppet_step1, org.opencontainers.image.created=2026-01-12T22:34:43Z, name=rhosp-rhel9/openstack-iscsid, io.buildah.version=1.41.5, managed_by=tripleo_ansible, batch=17.1_20260112.1, version=17.1.13, url=https://www.redhat.com, vcs-type=git, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=705339545363fec600102567c4e923938e0f43b3, release=1766032510, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, container_name=container-puppet-iscsid)
Feb 23 07:51:52 np0005626463.localdomain podman[51673]: 2026-02-23 07:51:52.439071369 +0000 UTC m=+1.209501344 container init 6b008fe2ff44586e63a5b97cc40b6845717ae89a6c7db0ce7b334251da9dbb22 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=container-puppet-crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005626463', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'crond', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::logrotate'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 cron, name=rhosp-rhel9/openstack-cron, vcs-type=git, 
config_id=tripleo_puppet_step1, build-date=2026-01-12T22:10:15Z, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, com.redhat.component=openstack-cron-container, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, distribution-scope=public, architecture=x86_64, tcib_managed=true, container_name=container-puppet-crond, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 23 07:51:52 np0005626463.localdomain podman[51673]: 2026-02-23 07:51:52.447313015 +0000 UTC m=+1.217743030 container start 6b008fe2ff44586e63a5b97cc40b6845717ae89a6c7db0ce7b334251da9dbb22 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=container-puppet-crond, vcs-type=git, build-date=2026-01-12T22:10:15Z, config_id=tripleo_puppet_step1, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:15Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005626463', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'crond', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::logrotate'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, version=17.1.13, maintainer=OpenStack TripleO Team, architecture=x86_64, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-cron, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=container-puppet-crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, release=1766032510, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 cron, name=rhosp-rhel9/openstack-cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee)
Feb 23 07:51:52 np0005626463.localdomain podman[51673]: 2026-02-23 07:51:52.447585224 +0000 UTC m=+1.218015219 container attach 6b008fe2ff44586e63a5b97cc40b6845717ae89a6c7db0ce7b334251da9dbb22 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=container-puppet-crond, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005626463', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'crond', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::logrotate'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, distribution-scope=public, tcib_managed=true, build-date=2026-01-12T22:10:15Z, config_id=tripleo_puppet_step1, io.openshift.expose-services=, 
name=rhosp-rhel9/openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-cron-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, release=1766032510, version=17.1.13, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, container_name=container-puppet-crond, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 23 07:51:53 np0005626463.localdomain podman[51543]: 2026-02-23 07:51:51.133301009 +0000 UTC m=+0.042195790 image pull  registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1
Feb 23 07:51:53 np0005626463.localdomain systemd[1]: tmp-crun.Qptawy.mount: Deactivated successfully.
Feb 23 07:51:53 np0005626463.localdomain podman[51868]: 2026-02-23 07:51:53.396437978 +0000 UTC m=+0.071921893 container create 5d2e10e62642c4fc5eee57693129d6f6e40295ab192929a8d8f0657a22ca4cf7 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1, name=container-puppet-ceilometer, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-central, vendor=Red Hat, Inc., build-date=2026-01-12T23:07:24Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-central, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-ceilometer-central-container, container_name=container-puppet-ceilometer, summary=Red Hat OpenStack Platform 17.1 ceilometer-central, distribution-scope=public, url=https://www.redhat.com, release=1766032510, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 ceilometer-central, name=rhosp-rhel9/openstack-ceilometer-central, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005626463', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,ceilometer_config,ceilometer_config', 'NAME': 'ceilometer', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::ceilometer::agent::polling\ninclude tripleo::profile::base::ceilometer::agent::polling\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, org.opencontainers.image.created=2026-01-12T23:07:24Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, version=17.1.13, config_id=tripleo_puppet_step1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, architecture=x86_64, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-central, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1)
Feb 23 07:51:53 np0005626463.localdomain sudo[51879]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 07:51:53 np0005626463.localdomain sudo[51879]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 07:51:53 np0005626463.localdomain sudo[51879]: pam_unix(sudo:session): session closed for user root
Feb 23 07:51:53 np0005626463.localdomain systemd[1]: Started libpod-conmon-5d2e10e62642c4fc5eee57693129d6f6e40295ab192929a8d8f0657a22ca4cf7.scope.
Feb 23 07:51:53 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 07:51:53 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/af4f3c6c37d07fdae63d07a90ecbe84180ed2951bbcccf4f59b147e3f8b29057/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff)
Feb 23 07:51:53 np0005626463.localdomain podman[51868]: 2026-02-23 07:51:53.351950728 +0000 UTC m=+0.027434613 image pull  registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1
Feb 23 07:51:53 np0005626463.localdomain podman[51868]: 2026-02-23 07:51:53.467630827 +0000 UTC m=+0.143114702 container init 5d2e10e62642c4fc5eee57693129d6f6e40295ab192929a8d8f0657a22ca4cf7 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1, name=container-puppet-ceilometer, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-central, name=rhosp-rhel9/openstack-ceilometer-central, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T23:07:24Z, release=1766032510, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-central, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, io.openshift.expose-services=, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-central, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, vcs-type=git, managed_by=tripleo_ansible, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-central-container, version=17.1.13, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-central, container_name=container-puppet-ceilometer, summary=Red Hat OpenStack Platform 17.1 ceilometer-central, build-date=2026-01-12T23:07:24Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_id=tripleo_puppet_step1, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005626463', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 
'file,file_line,concat,augeas,cron,ceilometer_config,ceilometer_config', 'NAME': 'ceilometer', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::ceilometer::agent::polling\ninclude tripleo::profile::base::ceilometer::agent::polling\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']})
Feb 23 07:51:53 np0005626463.localdomain podman[51868]: 2026-02-23 07:51:53.476828992 +0000 UTC m=+0.152312847 container start 5d2e10e62642c4fc5eee57693129d6f6e40295ab192929a8d8f0657a22ca4cf7 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1, name=container-puppet-ceilometer, description=Red Hat OpenStack Platform 17.1 ceilometer-central, name=rhosp-rhel9/openstack-ceilometer-central, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-central, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, container_name=container-puppet-ceilometer, build-date=2026-01-12T23:07:24Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-central, vcs-type=git, config_id=tripleo_puppet_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-central, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-central, version=17.1.13, vendor=Red Hat, Inc., managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:07:24Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, io.openshift.expose-services=, distribution-scope=public, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005626463', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,ceilometer_config,ceilometer_config', 'NAME': 'ceilometer', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::ceilometer::agent::polling\ninclude tripleo::profile::base::ceilometer::agent::polling\n'}, 'net': ['host'], 
'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1, com.redhat.component=openstack-ceilometer-central-container, io.buildah.version=1.41.5)
Feb 23 07:51:53 np0005626463.localdomain podman[51868]: 2026-02-23 07:51:53.477926427 +0000 UTC m=+0.153410342 container attach 5d2e10e62642c4fc5eee57693129d6f6e40295ab192929a8d8f0657a22ca4cf7 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1, name=container-puppet-ceilometer, io.openshift.expose-services=, managed_by=tripleo_ansible, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-central, distribution-scope=public, architecture=x86_64, maintainer=OpenStack TripleO Team, release=1766032510, build-date=2026-01-12T23:07:24Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-central, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, container_name=container-puppet-ceilometer, vendor=Red Hat, Inc., vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005626463', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,ceilometer_config,ceilometer_config', 'NAME': 'ceilometer', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::ceilometer::agent::polling\ninclude tripleo::profile::base::ceilometer::agent::polling\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 ceilometer-central, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, description=Red Hat OpenStack Platform 17.1 ceilometer-central, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-central, name=rhosp-rhel9/openstack-ceilometer-central, url=https://www.redhat.com, config_id=tripleo_puppet_step1, org.opencontainers.image.created=2026-01-12T23:07:24Z, com.redhat.component=openstack-ceilometer-central-container)
Feb 23 07:51:53 np0005626463.localdomain sudo[51897]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 23 07:51:53 np0005626463.localdomain sudo[51897]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 07:51:54 np0005626463.localdomain sudo[51897]: pam_unix(sudo:session): session closed for user root
Feb 23 07:51:54 np0005626463.localdomain ovs-vsctl[52067]: ovs|00001|db_ctl_base|ERR|unix:/var/run/openvswitch/db.sock: database connection failed (No such file or directory)
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51801]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51801]:    (file: /etc/puppet/hiera.yaml)
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51801]: Warning: Undefined variable '::deploy_config_name';
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51801]:    (file & line not available)
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51798]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51798]:    (file: /etc/puppet/hiera.yaml)
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51798]: Warning: Undefined variable '::deploy_config_name';
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51798]:    (file & line not available)
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51801]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51801]:    (file & line not available)
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51798]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51798]:    (file & line not available)
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51801]: Notice: Accepting previously invalid value for target type 'Integer'
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51833]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51833]:    (file: /etc/puppet/hiera.yaml)
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51833]: Warning: Undefined variable '::deploy_config_name';
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51833]:    (file & line not available)
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51819]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51819]:    (file: /etc/puppet/hiera.yaml)
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51819]: Warning: Undefined variable '::deploy_config_name';
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51819]:    (file & line not available)
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51801]: Notice: Compiled catalog for np0005626463.localdomain in environment production in 0.11 seconds
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51806]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51806]:    (file: /etc/puppet/hiera.yaml)
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51806]: Warning: Undefined variable '::deploy_config_name';
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51806]:    (file & line not available)
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51833]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51833]:    (file & line not available)
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51819]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51819]:    (file & line not available)
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51801]: Notice: /Stage[main]/Qdr::Config/File[/var/lib/qdrouterd]/owner: owner changed 'qdrouterd' to 'root'
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51833]: Notice: Compiled catalog for np0005626463.localdomain in environment production in 0.07 seconds
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51801]: Notice: /Stage[main]/Qdr::Config/File[/var/lib/qdrouterd]/group: group changed 'qdrouterd' to 'root'
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51801]: Notice: /Stage[main]/Qdr::Config/File[/var/lib/qdrouterd]/mode: mode changed '0700' to '0755'
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51806]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51806]:    (file & line not available)
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51801]: Notice: /Stage[main]/Qdr::Config/File[/etc/qpid-dispatch/ssl]/ensure: created
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51801]: Notice: /Stage[main]/Qdr::Config/File[qdrouterd.conf]/content: content changed '{sha256}89e10d8896247f992c5f0baf027c25a8ca5d0441be46d8859d9db2067ea74cd3' to '{sha256}29d074022d6cafdf94866dc1f307d9105f785dc4a34888c55632376b0a0d6303'
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51801]: Notice: /Stage[main]/Qdr::Config/File[/var/log/qdrouterd]/ensure: created
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51801]: Notice: /Stage[main]/Qdr::Config/File[/var/log/qdrouterd/metrics_qdr.log]/ensure: created
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51801]: Notice: Applied catalog in 0.03 seconds
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51801]: Application:
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51801]:    Initial environment: production
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51801]:    Converged environment: production
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51801]:          Run mode: user
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51801]: Changes:
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51801]:             Total: 7
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51801]: Events:
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51801]:           Success: 7
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51801]:             Total: 7
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51801]: Resources:
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51801]:           Skipped: 13
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51801]:           Changed: 5
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51801]:       Out of sync: 5
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51801]:             Total: 20
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51801]: Time:
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51801]:              File: 0.02
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51801]:    Transaction evaluation: 0.03
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51801]:    Catalog application: 0.03
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51801]:    Config retrieval: 0.14
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51801]:          Last run: 1771833114
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51801]:             Total: 0.03
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51801]: Version:
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51801]:            Config: 1771833114
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51801]:            Puppet: 7.10.0
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51819]: Notice: Compiled catalog for np0005626463.localdomain in environment production in 0.10 seconds
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51833]: Notice: /Stage[main]/Tripleo::Profile::Base::Logging::Logrotate/File[/etc/logrotate-crond.conf]/ensure: defined content as '{sha256}1c3202f58bd2ae16cb31badcbb7f0d4e6697157b987d1887736ad96bb73d70b0'
Feb 23 07:51:54 np0005626463.localdomain crontab[52277]: (root) LIST (root)
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51833]: Notice: /Stage[main]/Tripleo::Profile::Base::Logging::Logrotate/Cron[logrotate-crond]/ensure: created
Feb 23 07:51:54 np0005626463.localdomain crontab[52281]: (root) REPLACE (root)
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51833]: Notice: Applied catalog in 0.04 seconds
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51833]: Application:
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51833]:    Initial environment: production
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51833]:    Converged environment: production
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51833]:          Run mode: user
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51833]: Changes:
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51833]:             Total: 2
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51833]: Events:
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51833]:           Success: 2
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51833]:             Total: 2
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51833]: Resources:
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51833]:           Changed: 2
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51833]:       Out of sync: 2
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51833]:           Skipped: 7
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51833]:             Total: 9
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51833]: Time:
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51833]:              File: 0.01
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51833]:              Cron: 0.01
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51833]:    Transaction evaluation: 0.04
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51833]:    Catalog application: 0.04
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51833]:    Config retrieval: 0.11
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51833]:          Last run: 1771833114
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51833]:             Total: 0.04
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51833]: Version:
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51833]:            Config: 1771833114
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51833]:            Puppet: 7.10.0
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51819]: Notice: /Stage[main]/Tripleo::Profile::Base::Iscsid/Exec[reset-iscsi-initiator-name]/returns: executed successfully
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51819]: Notice: /Stage[main]/Tripleo::Profile::Base::Iscsid/File[/etc/iscsi/.initiator_reset]/ensure: created
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51819]: Notice: /Stage[main]/Tripleo::Profile::Base::Iscsid/Exec[sync-iqn-to-host]/returns: executed successfully
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51798]: Notice: Compiled catalog for np0005626463.localdomain in environment production in 0.29 seconds
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51806]: Warning: Scope(Class[Nova]): The os_region_name parameter is deprecated and will be removed \
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51806]: in a future release. Use nova::cinder::os_region_name instead
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51806]: Warning: Scope(Class[Nova]): The catalog_info parameter is deprecated and will be removed \
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51806]: in a future release. Use nova::cinder::catalog_info instead
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51798]: Notice: /Stage[main]/Collectd::Config/File[collectd.conf]/content: content changed '{sha256}aea388a73ebafc7e07a81ddb930a91099211f660eee55fbf92c13007a77501e5' to '{sha256}2523d01ee9c3022c0e9f61d896b1474a168e18472aee141cc278e69fe13f41c1'
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51798]: Notice: /Stage[main]/Collectd::Config/File[collectd.conf]/owner: owner changed 'collectd' to 'root'
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51798]: Notice: /Stage[main]/Collectd::Config/File[collectd.conf]/group: group changed 'collectd' to 'root'
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51798]: Notice: /Stage[main]/Collectd::Config/File[collectd.conf]/mode: mode changed '0644' to '0640'
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51798]: Notice: /Stage[main]/Collectd::Config/File[collectd.d]/owner: owner changed 'collectd' to 'root'
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51798]: Notice: /Stage[main]/Collectd::Config/File[collectd.d]/group: group changed 'collectd' to 'root'
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51798]: Notice: /Stage[main]/Collectd::Config/File[collectd.d]/mode: mode changed '0755' to '0750'
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51798]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/90-default-plugins-cpu.conf]/ensure: removed
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51798]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/90-default-plugins-interface.conf]/ensure: removed
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51798]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/90-default-plugins-load.conf]/ensure: removed
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51798]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/90-default-plugins-memory.conf]/ensure: removed
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51798]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/90-default-plugins-syslog.conf]/ensure: removed
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51798]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/apache.conf]/ensure: removed
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51798]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/dns.conf]/ensure: removed
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51798]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/ipmi.conf]/ensure: removed
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51798]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/mcelog.conf]/ensure: removed
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51798]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/mysql.conf]/ensure: removed
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51798]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/ovs-events.conf]/ensure: removed
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51798]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/ovs-stats.conf]/ensure: removed
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51798]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/ping.conf]/ensure: removed
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51798]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/pmu.conf]/ensure: removed
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51798]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/rdt.conf]/ensure: removed
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51798]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/sensors.conf]/ensure: removed
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51798]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/snmp.conf]/ensure: removed
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51798]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/write_prometheus.conf]/ensure: removed
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51798]: Notice: /Stage[main]/Collectd::Plugin::Python/File[/usr/lib/python3.9/site-packages]/mode: mode changed '0755' to '0750'
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51798]: Notice: /Stage[main]/Collectd::Plugin::Python/Collectd::Plugin[python]/File[python.load]/ensure: defined content as '{sha256}0163924a0099dd43fe39cb85e836df147fd2cfee8197dc6866d3c384539eb6ee'
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51798]: Notice: /Stage[main]/Collectd::Plugin::Python/Concat[/etc/collectd.d/python-config.conf]/File[/etc/collectd.d/python-config.conf]/ensure: defined content as '{sha256}2e5fb20e60b30f84687fc456a37fc62451000d2d85f5bbc1b3fca3a5eac9deeb'
Feb 23 07:51:54 np0005626463.localdomain systemd[1]: libpod-97c7bbaed7d61c77b044fd13301ae7ea20d1e615cfd076275e14c85aa945e52d.scope: Deactivated successfully.
Feb 23 07:51:54 np0005626463.localdomain systemd[1]: libpod-97c7bbaed7d61c77b044fd13301ae7ea20d1e615cfd076275e14c85aa945e52d.scope: Consumed 2.104s CPU time.
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51798]: Notice: /Stage[main]/Collectd::Plugin::Logfile/Collectd::Plugin[logfile]/File[logfile.load]/ensure: defined content as '{sha256}07bbda08ef9b824089500bdc6ac5a86e7d1ef2ae3ed4ed423c0559fe6361e5af'
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51806]: Warning: Unknown variable: '::nova::compute::verify_glance_signatures'. (file: /etc/puppet/modules/nova/manifests/glance.pp, line: 62, column: 41)
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51798]: Notice: /Stage[main]/Collectd::Plugin::Amqp1/Collectd::Plugin[amqp1]/File[amqp1.load]/ensure: defined content as '{sha256}dee3f10cb1ff461ac3f1e743a5ef3f06993398c6c829895de1dae7f242a64b39'
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51798]: Notice: /Stage[main]/Collectd::Plugin::Ceph/Collectd::Plugin[ceph]/File[ceph.load]/ensure: defined content as '{sha256}c796abffda2e860875295b4fc11cc95c6032b4e13fa8fb128e839a305aa1676c'
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51798]: Notice: /Stage[main]/Collectd::Plugin::Cpu/Collectd::Plugin[cpu]/File[cpu.load]/ensure: defined content as '{sha256}67d4c8bf6bf5785f4cb6b596712204d9eacbcebbf16fe289907195d4d3cb0e34'
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51798]: Notice: /Stage[main]/Collectd::Plugin::Df/Collectd::Plugin[df]/File[df.load]/ensure: defined content as '{sha256}edeb4716d96fc9dca2c6adfe07bae70ba08c6af3944a3900581cba0f08f3c4ba'
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51798]: Notice: /Stage[main]/Collectd::Plugin::Disk/Collectd::Plugin[disk]/File[disk.load]/ensure: defined content as '{sha256}1d0cb838278f3226fcd381f0fc2e0e1abaf0d590f4ba7bcb2fc6ec113d3ebde7'
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51798]: Notice: /Stage[main]/Collectd::Plugin::Hugepages/Collectd::Plugin[hugepages]/File[hugepages.load]/ensure: defined content as '{sha256}9b9f35b65a73da8d4037e4355a23b678f2cf61997ccf7a5e1adf2a7ce6415827'
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51798]: Notice: /Stage[main]/Collectd::Plugin::Hugepages/Collectd::Plugin[hugepages]/File[older_hugepages.load]/ensure: removed
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51798]: Notice: /Stage[main]/Collectd::Plugin::Interface/Collectd::Plugin[interface]/File[interface.load]/ensure: defined content as '{sha256}b76b315dc312e398940fe029c6dbc5c18d2b974ff7527469fc7d3617b5222046'
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51798]: Notice: /Stage[main]/Collectd::Plugin::Load/Collectd::Plugin[load]/File[load.load]/ensure: defined content as '{sha256}af2403f76aebd2f10202d66d2d55e1a8d987eed09ced5a3e3873a4093585dc31'
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51798]: Notice: /Stage[main]/Collectd::Plugin::Memory/Collectd::Plugin[memory]/File[memory.load]/ensure: defined content as '{sha256}0f270425ee6b05fc9440ee32b9afd1010dcbddd9b04ca78ff693858f7ecb9d0e'
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51798]: Notice: /Stage[main]/Collectd::Plugin::Unixsock/Collectd::Plugin[unixsock]/File[unixsock.load]/ensure: defined content as '{sha256}9d1ec1c51ba386baa6f62d2e019dbd6998ad924bf868b3edc2d24d3dc3c63885'
Feb 23 07:51:54 np0005626463.localdomain systemd[1]: libpod-6b008fe2ff44586e63a5b97cc40b6845717ae89a6c7db0ce7b334251da9dbb22.scope: Deactivated successfully.
Feb 23 07:51:54 np0005626463.localdomain systemd[1]: libpod-6b008fe2ff44586e63a5b97cc40b6845717ae89a6c7db0ce7b334251da9dbb22.scope: Consumed 2.128s CPU time.
Feb 23 07:51:54 np0005626463.localdomain podman[51673]: 2026-02-23 07:51:54.7013199 +0000 UTC m=+3.471749885 container died 6b008fe2ff44586e63a5b97cc40b6845717ae89a6c7db0ce7b334251da9dbb22 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=container-puppet-crond, batch=17.1_20260112.1, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005626463', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'crond', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::logrotate'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, org.opencontainers.image.created=2026-01-12T22:10:15Z, description=Red Hat OpenStack Platform 17.1 cron, 
tcib_managed=true, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, com.redhat.component=openstack-cron-container, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, vcs-type=git, vendor=Red Hat, Inc., build-date=2026-01-12T22:10:15Z, distribution-scope=public, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_puppet_step1, container_name=container-puppet-crond)
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51798]: Notice: /Stage[main]/Collectd::Plugin::Uptime/Collectd::Plugin[uptime]/File[uptime.load]/ensure: defined content as '{sha256}f7a26c6369f904d0ca1af59627ebea15f5e72160bcacdf08d217af282b42e5c0'
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51806]: Warning: Unknown variable: '::nova::compute::libvirt::remove_unused_base_images'. (file: /etc/puppet/modules/nova/manifests/compute/image_cache.pp, line: 44, column: 5)
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51806]: Warning: Unknown variable: '::nova::compute::libvirt::remove_unused_original_minimum_age_seconds'. (file: /etc/puppet/modules/nova/manifests/compute/image_cache.pp, line: 48, column: 5)
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51806]: Warning: Unknown variable: '::nova::compute::libvirt::remove_unused_resized_minimum_age_seconds'. (file: /etc/puppet/modules/nova/manifests/compute/image_cache.pp, line: 52, column: 5)
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51798]: Notice: /Stage[main]/Collectd::Plugin::Virt/Collectd::Plugin[virt]/File[virt.load]/ensure: defined content as '{sha256}9a2bcf913f6bf8a962a0ff351a9faea51ae863cc80af97b77f63f8ab68941c62'
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51798]: Notice: /Stage[main]/Collectd::Plugin::Virt/Collectd::Plugin[virt]/File[older_virt.load]/ensure: removed
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51798]: Notice: Applied catalog in 0.21 seconds
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51798]: Application:
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51798]:    Initial environment: production
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51798]:    Converged environment: production
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51798]:          Run mode: user
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51798]: Changes:
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51798]:             Total: 43
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51798]: Events:
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51798]:           Success: 43
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51798]:             Total: 43
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51798]: Resources:
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51798]:           Skipped: 14
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51798]:           Changed: 38
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51798]:       Out of sync: 38
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51798]:             Total: 82
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51798]: Time:
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51798]:              File: 0.11
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51798]:    Transaction evaluation: 0.20
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51798]:    Catalog application: 0.21
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51798]:    Config retrieval: 0.39
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51798]:          Last run: 1771833114
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51798]:    Concat fragment: 0.00
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51798]:       Concat file: 0.00
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51798]:             Total: 0.21
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51798]: Version:
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51798]:            Config: 1771833114
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51798]:            Puppet: 7.10.0
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51806]: Warning: Scope(Class[Tripleo::Profile::Base::Nova::Compute]): The keymgr_backend parameter has been deprecated
Feb 23 07:51:54 np0005626463.localdomain podman[52341]: 2026-02-23 07:51:54.759286269 +0000 UTC m=+0.122292306 container died 97c7bbaed7d61c77b044fd13301ae7ea20d1e615cfd076275e14c85aa945e52d (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=container-puppet-metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005626463', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'metrics_qdr', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::qdr\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, vendor=Red Hat, Inc., batch=17.1_20260112.1, architecture=x86_64, container_name=container-puppet-metrics_qdr, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, 
org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, version=17.1.13, build-date=2026-01-12T22:10:14Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_puppet_step1, name=rhosp-rhel9/openstack-qdrouterd, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, vcs-type=git, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee)
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51806]: Warning: Scope(Class[Nova::Compute]): vcpu_pin_set is deprecated, instead use cpu_dedicated_set or cpu_shared_set.
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51806]: Warning: Scope(Class[Nova::Compute]): verify_glance_signatures is deprecated. Use the same parameter in nova::glance
Feb 23 07:51:54 np0005626463.localdomain systemd[1]: tmp-crun.kXysIb.mount: Deactivated successfully.
Feb 23 07:51:54 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-97c7bbaed7d61c77b044fd13301ae7ea20d1e615cfd076275e14c85aa945e52d-userdata-shm.mount: Deactivated successfully.
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51819]: Notice: /Stage[main]/Tripleo::Profile::Base::Iscsid/Augeas[chap_algs in /etc/iscsi/iscsid.conf]/returns: executed successfully
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51819]: Notice: Applied catalog in 0.48 seconds
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51819]: Application:
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51819]:    Initial environment: production
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51819]:    Converged environment: production
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51819]:          Run mode: user
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51819]: Changes:
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51819]:             Total: 4
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51819]: Events:
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51819]:           Success: 4
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51819]:             Total: 4
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51819]: Resources:
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51819]:           Changed: 4
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51819]:       Out of sync: 4
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51819]:           Skipped: 8
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51819]:             Total: 13
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51819]: Time:
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51819]:              File: 0.00
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51819]:              Exec: 0.06
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51819]:    Config retrieval: 0.13
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51819]:            Augeas: 0.40
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51819]:    Transaction evaluation: 0.47
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51819]:    Catalog application: 0.48
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51819]:          Last run: 1771833114
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51819]:             Total: 0.48
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51819]: Version:
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51819]:            Config: 1771833114
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51819]:            Puppet: 7.10.0
Feb 23 07:51:54 np0005626463.localdomain podman[52341]: 2026-02-23 07:51:54.816079581 +0000 UTC m=+0.179085588 container cleanup 97c7bbaed7d61c77b044fd13301ae7ea20d1e615cfd076275e14c85aa945e52d (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=container-puppet-metrics_qdr, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005626463', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'metrics_qdr', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::qdr\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, config_id=tripleo_puppet_step1, io.buildah.version=1.41.5, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=container-puppet-metrics_qdr, io.openshift.tags=rhosp osp openstack 
osp-17.1 openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, com.redhat.component=openstack-qdrouterd-container, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, batch=17.1_20260112.1, tcib_managed=true, managed_by=tripleo_ansible, build-date=2026-01-12T22:10:14Z, name=rhosp-rhel9/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13)
Feb 23 07:51:54 np0005626463.localdomain systemd[1]: libpod-conmon-97c7bbaed7d61c77b044fd13301ae7ea20d1e615cfd076275e14c85aa945e52d.scope: Deactivated successfully.
Feb 23 07:51:54 np0005626463.localdomain podman[52368]: 2026-02-23 07:51:54.827857797 +0000 UTC m=+0.114482453 container cleanup 6b008fe2ff44586e63a5b97cc40b6845717ae89a6c7db0ce7b334251da9dbb22 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=container-puppet-crond, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, build-date=2026-01-12T22:10:15Z, url=https://www.redhat.com, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005626463', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'crond', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::logrotate'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, tcib_managed=true, managed_by=tripleo_ansible, container_name=container-puppet-crond, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., batch=17.1_20260112.1, 
summary=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-cron, architecture=x86_64, distribution-scope=public, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-cron-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_puppet_step1, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 23 07:51:54 np0005626463.localdomain systemd[1]: libpod-conmon-6b008fe2ff44586e63a5b97cc40b6845717ae89a6c7db0ce7b334251da9dbb22.scope: Deactivated successfully.
Feb 23 07:51:54 np0005626463.localdomain python3[51466]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-crond --conmon-pidfile /run/container-puppet-crond.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005626463 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron --env NAME=crond --env STEP_CONFIG=include ::tripleo::packages
                                                         include tripleo::profile::base::logging::logrotate --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-crond --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005626463', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'crond', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::logrotate'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-crond.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume 
/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-cron:17.1
Feb 23 07:51:54 np0005626463.localdomain python3[51466]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-metrics_qdr --conmon-pidfile /run/container-puppet-metrics_qdr.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005626463 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron --env NAME=metrics_qdr --env STEP_CONFIG=include ::tripleo::packages
                                                         include tripleo::profile::base::metrics::qdr
                                                          --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-metrics_qdr --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005626463', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'metrics_qdr', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::qdr\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-metrics_qdr.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume 
/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1
Feb 23 07:51:54 np0005626463.localdomain puppet-user[51806]: Warning: Scope(Class[Nova::Compute::Libvirt]): nova::compute::libvirt::images_type will be required if rbd ephemeral storage is used.
Feb 23 07:51:55 np0005626463.localdomain sudo[52430]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 07:51:55 np0005626463.localdomain sudo[52430]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 07:51:55 np0005626463.localdomain sudo[52430]: pam_unix(sudo:session): session closed for user root
Feb 23 07:51:55 np0005626463.localdomain systemd[1]: libpod-9d8e4721b422d8307f077c5cd6534db0034288ebb74472da492d4e73cd4218cc.scope: Deactivated successfully.
Feb 23 07:51:55 np0005626463.localdomain systemd[1]: libpod-9d8e4721b422d8307f077c5cd6534db0034288ebb74472da492d4e73cd4218cc.scope: Consumed 2.511s CPU time.
Feb 23 07:51:55 np0005626463.localdomain podman[51642]: 2026-02-23 07:51:55.125975898 +0000 UTC m=+3.922368908 container died 9d8e4721b422d8307f077c5cd6534db0034288ebb74472da492d4e73cd4218cc (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=container-puppet-iscsid, io.openshift.expose-services=, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-iscsid, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.component=openstack-iscsid-container, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, vcs-type=git, container_name=container-puppet-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:34:43Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13, config_id=tripleo_puppet_step1, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005626463', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,iscsid_config', 'NAME': 'iscsid', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::iscsid\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', 
'/etc/iscsi:/tmp/iscsi.host:z', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, vcs-ref=705339545363fec600102567c4e923938e0f43b3, release=1766032510, tcib_managed=true, build-date=2026-01-12T22:34:43Z, summary=Red Hat OpenStack Platform 17.1 iscsid)
Feb 23 07:51:55 np0005626463.localdomain systemd[1]: libpod-eacb05cb99bd4011e7883f35a49a01d17539879336879b6da96321ac1ca91be4.scope: Deactivated successfully.
Feb 23 07:51:55 np0005626463.localdomain systemd[1]: libpod-eacb05cb99bd4011e7883f35a49a01d17539879336879b6da96321ac1ca91be4.scope: Consumed 2.589s CPU time.
Feb 23 07:51:55 np0005626463.localdomain podman[51674]: 2026-02-23 07:51:55.175483965 +0000 UTC m=+3.934237627 container died eacb05cb99bd4011e7883f35a49a01d17539879336879b6da96321ac1ca91be4 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=container-puppet-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_puppet_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=container-puppet-collectd, version=17.1.13, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:10:15Z, org.opencontainers.image.created=2026-01-12T22:10:15Z, description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, io.openshift.expose-services=, distribution-scope=public, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005626463', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,collectd_client_config,exec', 'NAME': 'collectd', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::collectd'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-collectd-container, vendor=Red Hat, Inc., release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee)
Feb 23 07:51:55 np0005626463.localdomain podman[52556]: 2026-02-23 07:51:55.205984241 +0000 UTC m=+0.059808887 container create bd0a74a8cd004f505267dc81a174257d2b7a291386b2f29bde186b94923c26a5 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=container-puppet-rsyslog, description=Red Hat OpenStack Platform 17.1 rsyslog, architecture=x86_64, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 rsyslog, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005626463', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,rsyslog::generate_concat,concat::fragment', 'NAME': 'rsyslog', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::rsyslog'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, vcs-type=git, com.redhat.component=openstack-rsyslog-container, name=rhosp-rhel9/openstack-rsyslog, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, build-date=2026-01-12T22:10:09Z, org.opencontainers.image.created=2026-01-12T22:10:09Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, tcib_managed=true, container_name=container-puppet-rsyslog, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, distribution-scope=public, version=17.1.13, config_id=tripleo_puppet_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, vendor=Red Hat, Inc., io.buildah.version=1.41.5)
Feb 23 07:51:55 np0005626463.localdomain systemd[1]: Started libpod-conmon-bd0a74a8cd004f505267dc81a174257d2b7a291386b2f29bde186b94923c26a5.scope.
Feb 23 07:51:55 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 07:51:55 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/df719217e40f9ffd193139cfaeeaaebcf46866a6b616db04d8d1f0793e86d521/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff)
Feb 23 07:51:55 np0005626463.localdomain podman[52556]: 2026-02-23 07:51:55.261542725 +0000 UTC m=+0.115367381 container init bd0a74a8cd004f505267dc81a174257d2b7a291386b2f29bde186b94923c26a5 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=container-puppet-rsyslog, io.openshift.expose-services=, architecture=x86_64, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=container-puppet-rsyslog, distribution-scope=public, release=1766032510, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, config_id=tripleo_puppet_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, summary=Red Hat OpenStack Platform 17.1 rsyslog, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:09Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 rsyslog, build-date=2026-01-12T22:10:09Z, url=https://www.redhat.com, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005626463', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,rsyslog::generate_concat,concat::fragment', 'NAME': 'rsyslog', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::rsyslog'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, com.redhat.component=openstack-rsyslog-container, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, tcib_managed=true, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-rsyslog)
Feb 23 07:51:55 np0005626463.localdomain podman[52543]: 2026-02-23 07:51:55.268185962 +0000 UTC m=+0.134621899 container cleanup 9d8e4721b422d8307f077c5cd6534db0034288ebb74472da492d4e73cd4218cc (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=container-puppet-iscsid, io.openshift.expose-services=, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005626463', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,iscsid_config', 'NAME': 'iscsid', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::iscsid\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/iscsi:/tmp/iscsi.host:z', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, config_id=tripleo_puppet_step1, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.component=openstack-iscsid-container, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, tcib_managed=true, description=Red Hat OpenStack 
Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, name=rhosp-rhel9/openstack-iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:34:43Z, architecture=x86_64, io.buildah.version=1.41.5, build-date=2026-01-12T22:34:43Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, url=https://www.redhat.com, container_name=container-puppet-iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 23 07:51:55 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-912153276d119d62292ee43dc157a09b9029351ec12b42dcebd4c826260b5572-merged.mount: Deactivated successfully.
Feb 23 07:51:55 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-6b008fe2ff44586e63a5b97cc40b6845717ae89a6c7db0ce7b334251da9dbb22-userdata-shm.mount: Deactivated successfully.
Feb 23 07:51:55 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-1a309f49ba9a6af0e1193f20a6ae2dd065eaab6a23a55dd0b287fffd33cd7437-merged.mount: Deactivated successfully.
Feb 23 07:51:55 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-6de1ea423bd6d005b3c98cfa65644155837a90a885b901ec0c789a8fa4573360-merged.mount: Deactivated successfully.
Feb 23 07:51:55 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-eacb05cb99bd4011e7883f35a49a01d17539879336879b6da96321ac1ca91be4-userdata-shm.mount: Deactivated successfully.
Feb 23 07:51:55 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-bf00c918822c143f438250923a86f39afe39c46ee0adc9fcf99ac7bc5e8117c1-merged.mount: Deactivated successfully.
Feb 23 07:51:55 np0005626463.localdomain podman[52556]: 2026-02-23 07:51:55.176585269 +0000 UTC m=+0.030409935 image pull  registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1
Feb 23 07:51:55 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9d8e4721b422d8307f077c5cd6534db0034288ebb74472da492d4e73cd4218cc-userdata-shm.mount: Deactivated successfully.
Feb 23 07:51:55 np0005626463.localdomain systemd[1]: libpod-conmon-9d8e4721b422d8307f077c5cd6534db0034288ebb74472da492d4e73cd4218cc.scope: Deactivated successfully.
Feb 23 07:51:55 np0005626463.localdomain podman[52582]: 2026-02-23 07:51:55.286367235 +0000 UTC m=+0.100681345 container cleanup eacb05cb99bd4011e7883f35a49a01d17539879336879b6da96321ac1ca91be4 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=container-puppet-collectd, container_name=container-puppet-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_puppet_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005626463', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,collectd_client_config,exec', 'NAME': 'collectd', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::collectd'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, maintainer=OpenStack 
TripleO Team, io.openshift.expose-services=, architecture=x86_64, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, batch=17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:10:15Z, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:15Z, name=rhosp-rhel9/openstack-collectd, io.buildah.version=1.41.5, release=1766032510, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.component=openstack-collectd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee)
Feb 23 07:51:55 np0005626463.localdomain python3[51466]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-collectd --conmon-pidfile /run/container-puppet-collectd.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005626463 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron,collectd_client_config,exec --env NAME=collectd --env STEP_CONFIG=include ::tripleo::packages
                                                         include tripleo::profile::base::metrics::collectd --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-collectd --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005626463', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,collectd_client_config,exec', 'NAME': 'collectd', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::collectd'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-collectd.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume 
/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1
Feb 23 07:51:55 np0005626463.localdomain systemd[1]: libpod-conmon-eacb05cb99bd4011e7883f35a49a01d17539879336879b6da96321ac1ca91be4.scope: Deactivated successfully.
Feb 23 07:51:55 np0005626463.localdomain podman[52556]: 2026-02-23 07:51:55.329929387 +0000 UTC m=+0.183754033 container start bd0a74a8cd004f505267dc81a174257d2b7a291386b2f29bde186b94923c26a5 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=container-puppet-rsyslog, distribution-scope=public, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:09Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005626463', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,rsyslog::generate_concat,concat::fragment', 'NAME': 'rsyslog', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::rsyslog'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', 
'/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, description=Red Hat OpenStack Platform 17.1 rsyslog, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, io.openshift.expose-services=, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:09Z, config_id=tripleo_puppet_step1, container_name=container-puppet-rsyslog, com.redhat.component=openstack-rsyslog-container, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, vcs-type=git, managed_by=tripleo_ansible, batch=17.1_20260112.1, release=1766032510, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 rsyslog, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog)
Feb 23 07:51:55 np0005626463.localdomain podman[52556]: 2026-02-23 07:51:55.330239137 +0000 UTC m=+0.184063823 container attach bd0a74a8cd004f505267dc81a174257d2b7a291386b2f29bde186b94923c26a5 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=container-puppet-rsyslog, version=17.1.13, vcs-type=git, com.redhat.component=openstack-rsyslog-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=container-puppet-rsyslog, io.openshift.expose-services=, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005626463', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,rsyslog::generate_concat,concat::fragment', 'NAME': 'rsyslog', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::rsyslog'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, batch=17.1_20260112.1, 
io.buildah.version=1.41.5, config_id=tripleo_puppet_step1, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, release=1766032510, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 rsyslog, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, build-date=2026-01-12T22:10:09Z, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:10:09Z, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 rsyslog, managed_by=tripleo_ansible, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-rsyslog, distribution-scope=public)
Feb 23 07:51:55 np0005626463.localdomain podman[52613]: 2026-02-23 07:51:55.332542149 +0000 UTC m=+0.073283876 container create 99554d424a6d099346c4a7ba038c3fd113ee65313c60f1a445ba39a36fbd8b95 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=container-puppet-ovn_controller, url=https://www.redhat.com, tcib_managed=true, name=rhosp-rhel9/openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_puppet_step1, com.redhat.component=openstack-ovn-controller-container, build-date=2026-01-12T22:36:40Z, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, org.opencontainers.image.created=2026-01-12T22:36:40Z, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=container-puppet-ovn_controller, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005626463', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,vs_config,exec', 'NAME': 'ovn_controller', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::agents::ovn\n'}, 'net': ['host'], 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/etc/sysconfig/modules:/etc/sysconfig/modules', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., architecture=x86_64, version=17.1.13, distribution-scope=public)
Feb 23 07:51:55 np0005626463.localdomain systemd[1]: Started libpod-conmon-99554d424a6d099346c4a7ba038c3fd113ee65313c60f1a445ba39a36fbd8b95.scope.
Feb 23 07:51:55 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 07:51:55 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4603fc849c2ecb1a2dd39fe5f99a90015995e0b99d1b206aafaed4ee8a276f7b/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff)
Feb 23 07:51:55 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4603fc849c2ecb1a2dd39fe5f99a90015995e0b99d1b206aafaed4ee8a276f7b/merged/etc/sysconfig/modules supports timestamps until 2038 (0x7fffffff)
Feb 23 07:51:55 np0005626463.localdomain podman[52613]: 2026-02-23 07:51:55.382801108 +0000 UTC m=+0.123542835 container init 99554d424a6d099346c4a7ba038c3fd113ee65313c60f1a445ba39a36fbd8b95 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=container-puppet-ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, tcib_managed=true, vcs-type=git, vendor=Red Hat, Inc., url=https://www.redhat.com, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, version=17.1.13, container_name=container-puppet-ovn_controller, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, managed_by=tripleo_ansible, com.redhat.component=openstack-ovn-controller-container, config_id=tripleo_puppet_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, release=1766032510, org.opencontainers.image.created=2026-01-12T22:36:40Z, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005626463', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,vs_config,exec', 'NAME': 'ovn_controller', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::agents::ovn\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/etc/sysconfig/modules:/etc/sysconfig/modules', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', 
'/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2026-01-12T22:36:40Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, name=rhosp-rhel9/openstack-ovn-controller)
Feb 23 07:51:55 np0005626463.localdomain podman[52613]: 2026-02-23 07:51:55.394201722 +0000 UTC m=+0.134943439 container start 99554d424a6d099346c4a7ba038c3fd113ee65313c60f1a445ba39a36fbd8b95 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=container-puppet-ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-ovn-controller-container, version=17.1.13, batch=17.1_20260112.1, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, name=rhosp-rhel9/openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., architecture=x86_64, vcs-type=git, distribution-scope=public, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005626463', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,vs_config,exec', 'NAME': 'ovn_controller', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::agents::ovn\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', 
'/etc/sysconfig/modules:/etc/sysconfig/modules', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, url=https://www.redhat.com, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:36:40Z, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_puppet_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:36:40Z, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, container_name=container-puppet-ovn_controller)
Feb 23 07:51:55 np0005626463.localdomain podman[52613]: 2026-02-23 07:51:55.394344686 +0000 UTC m=+0.135086443 container attach 99554d424a6d099346c4a7ba038c3fd113ee65313c60f1a445ba39a36fbd8b95 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=container-puppet-ovn_controller, release=1766032510, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005626463', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,vs_config,exec', 'NAME': 'ovn_controller', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::agents::ovn\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/etc/sysconfig/modules:/etc/sysconfig/modules', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, 
org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, version=17.1.13, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:36:40Z, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_puppet_step1, org.opencontainers.image.created=2026-01-12T22:36:40Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, container_name=container-puppet-ovn_controller, managed_by=tripleo_ansible, url=https://www.redhat.com, com.redhat.component=openstack-ovn-controller-container, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public)
Feb 23 07:51:55 np0005626463.localdomain python3[51466]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-iscsid --conmon-pidfile /run/container-puppet-iscsid.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005626463 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron,iscsid_config --env NAME=iscsid --env STEP_CONFIG=include ::tripleo::packages
                                                         include tripleo::profile::base::iscsid
                                                          --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-iscsid --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005626463', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,iscsid_config', 'NAME': 'iscsid', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::iscsid\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/iscsi:/tmp/iscsi.host:z', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-iscsid.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/iscsi:/tmp/iscsi.host:z --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume 
/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1
Feb 23 07:51:55 np0005626463.localdomain podman[52613]: 2026-02-23 07:51:55.29941463 +0000 UTC m=+0.040156387 image pull  registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1
Feb 23 07:51:55 np0005626463.localdomain puppet-user[51942]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5
Feb 23 07:51:55 np0005626463.localdomain puppet-user[51942]:    (file: /etc/puppet/hiera.yaml)
Feb 23 07:51:55 np0005626463.localdomain puppet-user[51942]: Warning: Undefined variable '::deploy_config_name';
Feb 23 07:51:55 np0005626463.localdomain puppet-user[51942]:    (file & line not available)
Feb 23 07:51:55 np0005626463.localdomain puppet-user[51942]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html
Feb 23 07:51:55 np0005626463.localdomain puppet-user[51942]:    (file & line not available)
Feb 23 07:51:55 np0005626463.localdomain puppet-user[51806]: Notice: Compiled catalog for np0005626463.localdomain in environment production in 1.27 seconds
Feb 23 07:51:55 np0005626463.localdomain puppet-user[51942]: Warning: Unknown variable: '::ceilometer::cache_backend'. (file: /etc/puppet/modules/ceilometer/manifests/cache.pp, line: 145, column: 39)
Feb 23 07:51:55 np0005626463.localdomain puppet-user[51942]: Warning: Unknown variable: '::ceilometer::memcache_servers'. (file: /etc/puppet/modules/ceilometer/manifests/cache.pp, line: 146, column: 39)
Feb 23 07:51:55 np0005626463.localdomain puppet-user[51942]: Warning: Unknown variable: '::ceilometer::cache_tls_enabled'. (file: /etc/puppet/modules/ceilometer/manifests/cache.pp, line: 147, column: 39)
Feb 23 07:51:55 np0005626463.localdomain puppet-user[51942]: Warning: Unknown variable: '::ceilometer::cache_tls_cafile'. (file: /etc/puppet/modules/ceilometer/manifests/cache.pp, line: 148, column: 39)
Feb 23 07:51:55 np0005626463.localdomain puppet-user[51942]: Warning: Unknown variable: '::ceilometer::cache_tls_certfile'. (file: /etc/puppet/modules/ceilometer/manifests/cache.pp, line: 149, column: 39)
Feb 23 07:51:55 np0005626463.localdomain puppet-user[51942]: Warning: Unknown variable: '::ceilometer::cache_tls_keyfile'. (file: /etc/puppet/modules/ceilometer/manifests/cache.pp, line: 150, column: 39)
Feb 23 07:51:55 np0005626463.localdomain puppet-user[51942]: Warning: Unknown variable: '::ceilometer::cache_tls_allowed_ciphers'. (file: /etc/puppet/modules/ceilometer/manifests/cache.pp, line: 151, column: 39)
Feb 23 07:51:55 np0005626463.localdomain puppet-user[51942]: Warning: Unknown variable: '::ceilometer::manage_backend_package'. (file: /etc/puppet/modules/ceilometer/manifests/cache.pp, line: 152, column: 39)
Feb 23 07:51:55 np0005626463.localdomain puppet-user[51942]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_password'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 63, column: 25)
Feb 23 07:51:55 np0005626463.localdomain puppet-user[51942]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_url'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 68, column: 25)
Feb 23 07:51:55 np0005626463.localdomain puppet-user[51942]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_region'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 69, column: 28)
Feb 23 07:51:55 np0005626463.localdomain puppet-user[51942]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_user'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 70, column: 25)
Feb 23 07:51:55 np0005626463.localdomain puppet-user[51942]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_tenant_name'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 71, column: 29)
Feb 23 07:51:55 np0005626463.localdomain puppet-user[51942]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_cacert'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 72, column: 23)
Feb 23 07:51:55 np0005626463.localdomain puppet-user[51942]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_endpoint_type'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 73, column: 26)
Feb 23 07:51:55 np0005626463.localdomain puppet-user[51942]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_user_domain_name'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 74, column: 33)
Feb 23 07:51:55 np0005626463.localdomain puppet-user[51942]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_project_domain_name'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 75, column: 36)
Feb 23 07:51:55 np0005626463.localdomain puppet-user[51942]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_type'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 76, column: 26)
Feb 23 07:51:55 np0005626463.localdomain puppet-user[51942]: Notice: Compiled catalog for np0005626463.localdomain in environment production in 0.36 seconds
Feb 23 07:51:55 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Tripleo::Profile::Base::Nova::Migration::Client/File[/etc/nova/migration/identity]/content: content changed '{sha256}86610d84e745a3992358ae0b747297805d075492e5114c666fa08f8aecce7da0' to '{sha256}489b6455d50f9ee989125e261ff880fe0ce273a5c46439278b09842d2e1f5116'
Feb 23 07:51:55 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Tripleo::Profile::Base::Nova::Migration::Client/File_line[nova_ssh_port]/ensure: created
Feb 23 07:51:55 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Tripleo::Profile::Base::Nova::Libvirt/File[/etc/sasl2/libvirt.conf]/content: content changed '{sha256}78510a0d6f14b269ddeb9f9638dfdfba9f976d370ee2ec04ba25352a8af6df35' to '{sha256}6d7bcae773217a30c0772f75d0d1b6d21f5d64e72853f5e3d91bb47799dbb7fe'
Feb 23 07:51:55 np0005626463.localdomain puppet-user[51806]: Warning: Empty environment setting 'TLS_PASSWORD'
Feb 23 07:51:55 np0005626463.localdomain puppet-user[51806]:    (file: /etc/puppet/modules/tripleo/manifests/profile/base/nova/libvirt.pp, line: 182)
Feb 23 07:51:55 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Tripleo::Profile::Base::Nova::Libvirt/Exec[set libvirt sasl credentials]/returns: executed successfully
Feb 23 07:51:55 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Tripleo::Profile::Base::Nova::Migration::Target/File[/etc/nova/migration/authorized_keys]/content: content changed '{sha256}0d05a8832f36c0517b84e9c3ad11069d531c7d2be5297661e5552fd29e3a5e47' to '{sha256}8656b3c96dc5b23eeff252eb63947bbb521645e181af749f7bc85fd2f92d7747'
Feb 23 07:51:55 np0005626463.localdomain puppet-user[51942]: Notice: /Stage[main]/Ceilometer/Ceilometer_config[DEFAULT/http_timeout]/ensure: created
Feb 23 07:51:55 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Tripleo::Profile::Base::Nova::Migration::Target/File_line[nova_migration_logindefs]/ensure: created
Feb 23 07:51:55 np0005626463.localdomain puppet-user[51942]: Notice: /Stage[main]/Ceilometer/Ceilometer_config[DEFAULT/host]/ensure: created
Feb 23 07:51:55 np0005626463.localdomain puppet-user[51942]: Notice: /Stage[main]/Ceilometer/Ceilometer_config[publisher/telemetry_secret]/ensure: created
Feb 23 07:51:55 np0005626463.localdomain puppet-user[51942]: Notice: /Stage[main]/Ceilometer/Ceilometer_config[hardware/readonly_user_name]/ensure: created
Feb 23 07:51:55 np0005626463.localdomain puppet-user[51942]: Notice: /Stage[main]/Ceilometer/Ceilometer_config[hardware/readonly_user_password]/ensure: created
Feb 23 07:51:55 np0005626463.localdomain puppet-user[51942]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/auth_url]/ensure: created
Feb 23 07:51:55 np0005626463.localdomain puppet-user[51942]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/region_name]/ensure: created
Feb 23 07:51:55 np0005626463.localdomain puppet-user[51942]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/username]/ensure: created
Feb 23 07:51:55 np0005626463.localdomain puppet-user[51942]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/password]/ensure: created
Feb 23 07:51:55 np0005626463.localdomain puppet-user[51942]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/project_name]/ensure: created
Feb 23 07:51:55 np0005626463.localdomain puppet-user[51942]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/interface]/ensure: created
Feb 23 07:51:55 np0005626463.localdomain puppet-user[51942]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/user_domain_name]/ensure: created
Feb 23 07:51:55 np0005626463.localdomain puppet-user[51942]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/project_domain_name]/ensure: created
Feb 23 07:51:55 np0005626463.localdomain puppet-user[51942]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/auth_type]/ensure: created
Feb 23 07:51:55 np0005626463.localdomain puppet-user[51942]: Notice: /Stage[main]/Ceilometer::Agent::Polling/Ceilometer_config[compute/instance_discovery_method]/ensure: created
Feb 23 07:51:55 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova::Workarounds/Nova_config[workarounds/never_download_image_if_on_rbd]/ensure: created
Feb 23 07:51:55 np0005626463.localdomain puppet-user[51942]: Notice: /Stage[main]/Ceilometer::Agent::Polling/Ceilometer_config[DEFAULT/polling_namespaces]/ensure: created
Feb 23 07:51:55 np0005626463.localdomain puppet-user[51942]: Notice: /Stage[main]/Ceilometer::Agent::Polling/Ceilometer_config[polling/tenant_name_discovery]/ensure: created
Feb 23 07:51:55 np0005626463.localdomain puppet-user[51942]: Notice: /Stage[main]/Ceilometer::Agent::Polling/Ceilometer_config[coordination/backend_url]/ensure: created
Feb 23 07:51:55 np0005626463.localdomain puppet-user[51942]: Notice: /Stage[main]/Ceilometer::Cache/Oslo::Cache[ceilometer_config]/Ceilometer_config[cache/backend]/ensure: created
Feb 23 07:51:55 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova::Workarounds/Nova_config[workarounds/disable_compute_service_check_for_ffu]/ensure: created
Feb 23 07:51:55 np0005626463.localdomain puppet-user[51942]: Notice: /Stage[main]/Ceilometer::Cache/Oslo::Cache[ceilometer_config]/Ceilometer_config[cache/enabled]/ensure: created
Feb 23 07:51:55 np0005626463.localdomain puppet-user[51942]: Notice: /Stage[main]/Ceilometer::Cache/Oslo::Cache[ceilometer_config]/Ceilometer_config[cache/memcache_servers]/ensure: created
Feb 23 07:51:55 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/ssl_only]/ensure: created
Feb 23 07:51:55 np0005626463.localdomain puppet-user[51942]: Notice: /Stage[main]/Ceilometer::Cache/Oslo::Cache[ceilometer_config]/Ceilometer_config[cache/tls_enabled]/ensure: created
Feb 23 07:51:55 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/my_ip]/ensure: created
Feb 23 07:51:55 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/host]/ensure: created
Feb 23 07:51:56 np0005626463.localdomain puppet-user[51942]: Notice: /Stage[main]/Ceilometer/Oslo::Messaging::Rabbit[ceilometer_config]/Ceilometer_config[oslo_messaging_rabbit/heartbeat_in_pthread]/ensure: created
Feb 23 07:51:56 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/cpu_allocation_ratio]/ensure: created
Feb 23 07:51:56 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/ram_allocation_ratio]/ensure: created
Feb 23 07:51:56 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/disk_allocation_ratio]/ensure: created
Feb 23 07:51:56 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/dhcp_domain]/ensure: created
Feb 23 07:51:56 np0005626463.localdomain puppet-user[51942]: Notice: /Stage[main]/Ceilometer/Oslo::Messaging::Amqp[ceilometer_config]/Ceilometer_config[oslo_messaging_amqp/rpc_address_prefix]/ensure: created
Feb 23 07:51:56 np0005626463.localdomain puppet-user[51942]: Notice: /Stage[main]/Ceilometer/Oslo::Messaging::Amqp[ceilometer_config]/Ceilometer_config[oslo_messaging_amqp/notify_address_prefix]/ensure: created
Feb 23 07:51:56 np0005626463.localdomain puppet-user[51942]: Notice: /Stage[main]/Ceilometer/Oslo::Messaging::Notifications[ceilometer_config]/Ceilometer_config[oslo_messaging_notifications/driver]/ensure: created
Feb 23 07:51:56 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova/Nova_config[vif_plug_ovs/ovsdb_connection]/ensure: created
Feb 23 07:51:56 np0005626463.localdomain puppet-user[51942]: Notice: /Stage[main]/Ceilometer/Oslo::Messaging::Notifications[ceilometer_config]/Ceilometer_config[oslo_messaging_notifications/transport_url]/ensure: created
Feb 23 07:51:56 np0005626463.localdomain puppet-user[51942]: Notice: /Stage[main]/Ceilometer/Oslo::Messaging::Notifications[ceilometer_config]/Ceilometer_config[oslo_messaging_notifications/topics]/ensure: created
Feb 23 07:51:56 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova/Nova_config[notifications/notification_format]/ensure: created
Feb 23 07:51:56 np0005626463.localdomain puppet-user[51942]: Notice: /Stage[main]/Ceilometer/Oslo::Messaging::Default[ceilometer_config]/Ceilometer_config[DEFAULT/transport_url]/ensure: created
Feb 23 07:51:56 np0005626463.localdomain puppet-user[51942]: Notice: /Stage[main]/Ceilometer::Logging/Oslo::Log[ceilometer_config]/Ceilometer_config[DEFAULT/debug]/ensure: created
Feb 23 07:51:56 np0005626463.localdomain puppet-user[51942]: Notice: /Stage[main]/Ceilometer::Logging/Oslo::Log[ceilometer_config]/Ceilometer_config[DEFAULT/log_dir]/ensure: created
Feb 23 07:51:56 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/state_path]/ensure: created
Feb 23 07:51:56 np0005626463.localdomain puppet-user[51942]: Notice: Applied catalog in 0.59 seconds
Feb 23 07:51:56 np0005626463.localdomain puppet-user[51942]: Application:
Feb 23 07:51:56 np0005626463.localdomain puppet-user[51942]:    Initial environment: production
Feb 23 07:51:56 np0005626463.localdomain puppet-user[51942]:    Converged environment: production
Feb 23 07:51:56 np0005626463.localdomain puppet-user[51942]:          Run mode: user
Feb 23 07:51:56 np0005626463.localdomain puppet-user[51942]: Changes:
Feb 23 07:51:56 np0005626463.localdomain puppet-user[51942]:             Total: 31
Feb 23 07:51:56 np0005626463.localdomain puppet-user[51942]: Events:
Feb 23 07:51:56 np0005626463.localdomain puppet-user[51942]:           Success: 31
Feb 23 07:51:56 np0005626463.localdomain puppet-user[51942]:             Total: 31
Feb 23 07:51:56 np0005626463.localdomain puppet-user[51942]: Resources:
Feb 23 07:51:56 np0005626463.localdomain puppet-user[51942]:           Skipped: 22
Feb 23 07:51:56 np0005626463.localdomain puppet-user[51942]:           Changed: 31
Feb 23 07:51:56 np0005626463.localdomain puppet-user[51942]:       Out of sync: 31
Feb 23 07:51:56 np0005626463.localdomain puppet-user[51942]:             Total: 151
Feb 23 07:51:56 np0005626463.localdomain puppet-user[51942]: Time:
Feb 23 07:51:56 np0005626463.localdomain puppet-user[51942]:           Package: 0.02
Feb 23 07:51:56 np0005626463.localdomain puppet-user[51942]:    Config retrieval: 0.43
Feb 23 07:51:56 np0005626463.localdomain puppet-user[51942]:    Ceilometer config: 0.49
Feb 23 07:51:56 np0005626463.localdomain puppet-user[51942]:    Transaction evaluation: 0.58
Feb 23 07:51:56 np0005626463.localdomain puppet-user[51942]:    Catalog application: 0.59
Feb 23 07:51:56 np0005626463.localdomain puppet-user[51942]:          Last run: 1771833116
Feb 23 07:51:56 np0005626463.localdomain puppet-user[51942]:         Resources: 0.00
Feb 23 07:51:56 np0005626463.localdomain puppet-user[51942]:             Total: 0.59
Feb 23 07:51:56 np0005626463.localdomain puppet-user[51942]: Version:
Feb 23 07:51:56 np0005626463.localdomain puppet-user[51942]:            Config: 1771833115
Feb 23 07:51:56 np0005626463.localdomain puppet-user[51942]:            Puppet: 7.10.0
Feb 23 07:51:56 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/service_down_time]/ensure: created
Feb 23 07:51:56 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/rootwrap_config]/ensure: created
Feb 23 07:51:56 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/report_interval]/ensure: created
Feb 23 07:51:56 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova/Nova_config[notifications/notify_on_state_change]/ensure: created
Feb 23 07:51:56 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova/Nova_config[cinder/cross_az_attach]/ensure: created
Feb 23 07:51:56 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova::Glance/Nova_config[glance/valid_interfaces]/ensure: created
Feb 23 07:51:56 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/auth_type]/ensure: created
Feb 23 07:51:56 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/auth_url]/ensure: created
Feb 23 07:51:56 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/password]/ensure: created
Feb 23 07:51:56 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/project_domain_name]/ensure: created
Feb 23 07:51:56 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/project_name]/ensure: created
Feb 23 07:51:56 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/user_domain_name]/ensure: created
Feb 23 07:51:56 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/username]/ensure: created
Feb 23 07:51:56 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/region_name]/ensure: created
Feb 23 07:51:56 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/valid_interfaces]/ensure: created
Feb 23 07:51:56 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/password]/ensure: created
Feb 23 07:51:56 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/auth_type]/ensure: created
Feb 23 07:51:56 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/auth_url]/ensure: created
Feb 23 07:51:56 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/region_name]/ensure: created
Feb 23 07:51:56 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/project_name]/ensure: created
Feb 23 07:51:56 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/project_domain_name]/ensure: created
Feb 23 07:51:56 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/username]/ensure: created
Feb 23 07:51:56 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/user_domain_name]/ensure: created
Feb 23 07:51:56 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/os_region_name]/ensure: created
Feb 23 07:51:56 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/catalog_info]/ensure: created
Feb 23 07:51:56 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova::Compute::Image_cache/Nova_config[image_cache/manager_interval]/ensure: created
Feb 23 07:51:56 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova::Compute::Image_cache/Nova_config[image_cache/remove_unused_base_images]/ensure: created
Feb 23 07:51:56 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova::Compute::Image_cache/Nova_config[image_cache/remove_unused_original_minimum_age_seconds]/ensure: created
Feb 23 07:51:56 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova::Compute::Image_cache/Nova_config[image_cache/remove_unused_resized_minimum_age_seconds]/ensure: created
Feb 23 07:51:56 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova::Compute::Image_cache/Nova_config[image_cache/precache_concurrency]/ensure: created
Feb 23 07:51:56 np0005626463.localdomain systemd[1]: libpod-5d2e10e62642c4fc5eee57693129d6f6e40295ab192929a8d8f0657a22ca4cf7.scope: Deactivated successfully.
Feb 23 07:51:56 np0005626463.localdomain systemd[1]: libpod-5d2e10e62642c4fc5eee57693129d6f6e40295ab192929a8d8f0657a22ca4cf7.scope: Consumed 3.036s CPU time.
Feb 23 07:51:56 np0005626463.localdomain podman[51868]: 2026-02-23 07:51:56.869771881 +0000 UTC m=+3.545255786 container died 5d2e10e62642c4fc5eee57693129d6f6e40295ab192929a8d8f0657a22ca4cf7 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1, name=container-puppet-ceilometer, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T23:07:24Z, vcs-type=git, config_id=tripleo_puppet_step1, summary=Red Hat OpenStack Platform 17.1 ceilometer-central, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-central, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-central, maintainer=OpenStack TripleO Team, release=1766032510, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T23:07:24Z, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-central, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-ceilometer-central, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-central, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005626463', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,ceilometer_config,ceilometer_config', 'NAME': 'ceilometer', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::ceilometer::agent::polling\ninclude tripleo::profile::base::ceilometer::agent::polling\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, url=https://www.redhat.com, io.openshift.expose-services=, version=17.1.13, container_name=container-puppet-ceilometer, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64, managed_by=tripleo_ansible, batch=17.1_20260112.1, com.redhat.component=openstack-ceilometer-central-container)
Feb 23 07:51:56 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova::Vendordata/Nova_config[vendordata_dynamic_auth/project_domain_name]/ensure: created
Feb 23 07:51:56 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova::Vendordata/Nova_config[vendordata_dynamic_auth/user_domain_name]/ensure: created
Feb 23 07:51:56 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova::Compute::Provider/Nova_config[compute/provider_config_location]/ensure: created
Feb 23 07:51:56 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova::Compute::Provider/File[/etc/nova/provider_config]/ensure: created
Feb 23 07:51:56 np0005626463.localdomain systemd[1]: tmp-crun.TJGeAN.mount: Deactivated successfully.
Feb 23 07:51:56 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5d2e10e62642c4fc5eee57693129d6f6e40295ab192929a8d8f0657a22ca4cf7-userdata-shm.mount: Deactivated successfully.
Feb 23 07:51:57 np0005626463.localdomain podman[52837]: 2026-02-23 07:51:57.027589329 +0000 UTC m=+0.143620189 container cleanup 5d2e10e62642c4fc5eee57693129d6f6e40295ab192929a8d8f0657a22ca4cf7 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1, name=container-puppet-ceilometer, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-central, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, batch=17.1_20260112.1, config_id=tripleo_puppet_step1, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, tcib_managed=true, name=rhosp-rhel9/openstack-ceilometer-central, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-central, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 ceilometer-central, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005626463', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,ceilometer_config,ceilometer_config', 'NAME': 'ceilometer', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::ceilometer::agent::polling\ninclude tripleo::profile::base::ceilometer::agent::polling\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', 
'/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, container_name=container-puppet-ceilometer, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-central-container, org.opencontainers.image.created=2026-01-12T23:07:24Z, build-date=2026-01-12T23:07:24Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, vendor=Red Hat, Inc., distribution-scope=public, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-central, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-central, release=1766032510, maintainer=OpenStack TripleO Team)
Feb 23 07:51:57 np0005626463.localdomain systemd[1]: libpod-conmon-5d2e10e62642c4fc5eee57693129d6f6e40295ab192929a8d8f0657a22ca4cf7.scope: Deactivated successfully.
Feb 23 07:51:57 np0005626463.localdomain python3[51466]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-ceilometer --conmon-pidfile /run/container-puppet-ceilometer.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005626463 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron,ceilometer_config,ceilometer_config --env NAME=ceilometer --env STEP_CONFIG=include ::tripleo::packages
                                                         include tripleo::profile::base::ceilometer::agent::polling
                                                         include tripleo::profile::base::ceilometer::agent::polling
                                                          --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-ceilometer --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005626463', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,ceilometer_config,ceilometer_config', 'NAME': 'ceilometer', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::ceilometer::agent::polling\ninclude tripleo::profile::base::ceilometer::agent::polling\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-ceilometer.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume 
/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1
Feb 23 07:51:57 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/use_cow_images]/ensure: created
Feb 23 07:51:57 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/mkisofs_cmd]/ensure: created
Feb 23 07:51:57 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/force_raw_images]/ensure: created
Feb 23 07:51:57 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/reserved_host_memory_mb]/ensure: created
Feb 23 07:51:57 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/reserved_huge_pages]/ensure: created
Feb 23 07:51:57 np0005626463.localdomain puppet-user[52737]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5
Feb 23 07:51:57 np0005626463.localdomain puppet-user[52737]:    (file: /etc/puppet/hiera.yaml)
Feb 23 07:51:57 np0005626463.localdomain puppet-user[52737]: Warning: Undefined variable '::deploy_config_name';
Feb 23 07:51:57 np0005626463.localdomain puppet-user[52737]:    (file & line not available)
Feb 23 07:51:57 np0005626463.localdomain puppet-user[52644]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5
Feb 23 07:51:57 np0005626463.localdomain puppet-user[52644]:    (file: /etc/puppet/hiera.yaml)
Feb 23 07:51:57 np0005626463.localdomain puppet-user[52644]: Warning: Undefined variable '::deploy_config_name';
Feb 23 07:51:57 np0005626463.localdomain puppet-user[52644]:    (file & line not available)
Feb 23 07:51:57 np0005626463.localdomain puppet-user[52737]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html
Feb 23 07:51:57 np0005626463.localdomain puppet-user[52737]:    (file & line not available)
Feb 23 07:51:57 np0005626463.localdomain puppet-user[52644]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html
Feb 23 07:51:57 np0005626463.localdomain puppet-user[52644]:    (file & line not available)
Feb 23 07:51:57 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/resume_guests_state_on_host_boot]/ensure: created
Feb 23 07:51:57 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova::Compute/Nova_config[key_manager/backend]/ensure: created
Feb 23 07:51:57 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/sync_power_state_interval]/ensure: created
Feb 23 07:51:57 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-af4f3c6c37d07fdae63d07a90ecbe84180ed2951bbcccf4f59b147e3f8b29057-merged.mount: Deactivated successfully.
Feb 23 07:51:57 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova::Compute/Nova_config[compute/consecutive_build_service_disable_threshold]/ensure: created
Feb 23 07:51:57 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova::Compute/Nova_config[compute/live_migration_wait_for_vif_plug]/ensure: created
Feb 23 07:51:57 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova::Compute/Nova_config[compute/max_disk_devices_to_attach]/ensure: created
Feb 23 07:51:57 np0005626463.localdomain puppet-user[52644]: Notice: Compiled catalog for np0005626463.localdomain in environment production in 0.21 seconds
Feb 23 07:51:57 np0005626463.localdomain puppet-user[52737]: Notice: Compiled catalog for np0005626463.localdomain in environment production in 0.21 seconds
Feb 23 07:51:57 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova::Vncproxy::Common/Nova_config[vnc/novncproxy_base_url]/ensure: created
Feb 23 07:51:57 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova::Compute/Nova_config[vnc/server_proxyclient_address]/ensure: created
Feb 23 07:51:57 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova::Compute/Nova_config[vnc/enabled]/ensure: created
Feb 23 07:51:57 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova::Compute/Nova_config[spice/enabled]/ensure: created
Feb 23 07:51:57 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/instance_usage_audit]/ensure: created
Feb 23 07:51:57 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/instance_usage_audit_period]/ensure: created
Feb 23 07:51:57 np0005626463.localdomain ovs-vsctl[53007]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-remote=tcp:172.17.0.103:6642,tcp:172.17.0.104:6642,tcp:172.17.0.105:6642
Feb 23 07:51:57 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[DEFAULT/vif_plugging_is_fatal]/ensure: created
Feb 23 07:51:57 np0005626463.localdomain puppet-user[52644]: Notice: /Stage[main]/Rsyslog::Base/File[/etc/rsyslog.conf]/content: content changed '{sha256}d6f679f6a4eb6f33f9fc20c846cb30bef93811e1c86bc4da1946dc3100b826c3' to '{sha256}7963bd801fadd49a17561f4d3f80738c3f504b413b11c443432d8303138041f2'
Feb 23 07:51:57 np0005626463.localdomain puppet-user[52737]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-remote]/ensure: created
Feb 23 07:51:57 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[DEFAULT/vif_plugging_timeout]/ensure: created
Feb 23 07:51:57 np0005626463.localdomain puppet-user[52644]: Notice: /Stage[main]/Rsyslog::Config::Global/Rsyslog::Component::Global_config[MaxMessageSize]/Rsyslog::Generate_concat[rsyslog::concat::global_config::MaxMessageSize]/Concat[/etc/rsyslog.d/00_rsyslog.conf]/File[/etc/rsyslog.d/00_rsyslog.conf]/ensure: defined content as '{sha256}a291d5cc6d5884a978161f4c7b5831d43edd07797cc590bae366e7f150b8643b'
Feb 23 07:51:57 np0005626463.localdomain puppet-user[52644]: Notice: /Stage[main]/Rsyslog::Config::Templates/Rsyslog::Component::Template[rsyslog-node-index]/Rsyslog::Generate_concat[rsyslog::concat::template::rsyslog-node-index]/Concat[/etc/rsyslog.d/50_openstack_logs.conf]/File[/etc/rsyslog.d/50_openstack_logs.conf]/ensure: defined content as '{sha256}5c90e46cff2762d304afc2b09980a80c0046036381fac743e7540f8aa8df54d3'
Feb 23 07:51:57 np0005626463.localdomain puppet-user[52644]: Notice: Applied catalog in 0.10 seconds
Feb 23 07:51:57 np0005626463.localdomain puppet-user[52644]: Application:
Feb 23 07:51:57 np0005626463.localdomain puppet-user[52644]:    Initial environment: production
Feb 23 07:51:57 np0005626463.localdomain puppet-user[52644]:    Converged environment: production
Feb 23 07:51:57 np0005626463.localdomain puppet-user[52644]:          Run mode: user
Feb 23 07:51:57 np0005626463.localdomain puppet-user[52644]: Changes:
Feb 23 07:51:57 np0005626463.localdomain puppet-user[52644]:             Total: 3
Feb 23 07:51:57 np0005626463.localdomain puppet-user[52644]: Events:
Feb 23 07:51:57 np0005626463.localdomain puppet-user[52644]:           Success: 3
Feb 23 07:51:57 np0005626463.localdomain puppet-user[52644]:             Total: 3
Feb 23 07:51:57 np0005626463.localdomain puppet-user[52644]: Resources:
Feb 23 07:51:57 np0005626463.localdomain puppet-user[52644]:           Skipped: 11
Feb 23 07:51:57 np0005626463.localdomain puppet-user[52644]:           Changed: 3
Feb 23 07:51:57 np0005626463.localdomain puppet-user[52644]:       Out of sync: 3
Feb 23 07:51:57 np0005626463.localdomain puppet-user[52644]:             Total: 25
Feb 23 07:51:57 np0005626463.localdomain puppet-user[52644]: Time:
Feb 23 07:51:57 np0005626463.localdomain puppet-user[52644]:       Concat file: 0.00
Feb 23 07:51:57 np0005626463.localdomain puppet-user[52644]:    Concat fragment: 0.00
Feb 23 07:51:57 np0005626463.localdomain puppet-user[52644]:              File: 0.01
Feb 23 07:51:57 np0005626463.localdomain puppet-user[52644]:    Transaction evaluation: 0.09
Feb 23 07:51:57 np0005626463.localdomain puppet-user[52644]:    Catalog application: 0.10
Feb 23 07:51:57 np0005626463.localdomain puppet-user[52644]:    Config retrieval: 0.25
Feb 23 07:51:57 np0005626463.localdomain puppet-user[52644]:          Last run: 1771833117
Feb 23 07:51:57 np0005626463.localdomain puppet-user[52644]:             Total: 0.10
Feb 23 07:51:57 np0005626463.localdomain puppet-user[52644]: Version:
Feb 23 07:51:57 np0005626463.localdomain puppet-user[52644]:            Config: 1771833117
Feb 23 07:51:57 np0005626463.localdomain puppet-user[52644]:            Puppet: 7.10.0
Feb 23 07:51:57 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/default_floating_pool]/ensure: created
Feb 23 07:51:57 np0005626463.localdomain ovs-vsctl[53009]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-encap-type=geneve
Feb 23 07:51:57 np0005626463.localdomain puppet-user[52737]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-encap-type]/ensure: created
Feb 23 07:51:57 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/timeout]/ensure: created
Feb 23 07:51:57 np0005626463.localdomain ovs-vsctl[53011]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-encap-ip=172.19.0.106
Feb 23 07:51:57 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/project_name]/ensure: created
Feb 23 07:51:57 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/project_domain_name]/ensure: created
Feb 23 07:51:57 np0005626463.localdomain puppet-user[52737]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-encap-ip]/ensure: created
Feb 23 07:51:57 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/region_name]/ensure: created
Feb 23 07:51:57 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/username]/ensure: created
Feb 23 07:51:57 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/user_domain_name]/ensure: created
Feb 23 07:51:57 np0005626463.localdomain ovs-vsctl[53019]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:hostname=np0005626463.localdomain
Feb 23 07:51:57 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/password]/ensure: created
Feb 23 07:51:57 np0005626463.localdomain puppet-user[52737]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:hostname]/value: value changed 'np0005626463.novalocal' to 'np0005626463.localdomain'
Feb 23 07:51:57 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/auth_url]/ensure: created
Feb 23 07:51:57 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/valid_interfaces]/ensure: created
Feb 23 07:51:57 np0005626463.localdomain ovs-vsctl[53022]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-bridge=br-int
Feb 23 07:51:57 np0005626463.localdomain puppet-user[52737]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-bridge]/ensure: created
Feb 23 07:51:57 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/ovs_bridge]/ensure: created
Feb 23 07:51:57 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/extension_sync_interval]/ensure: created
Feb 23 07:51:57 np0005626463.localdomain ovs-vsctl[53026]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-remote-probe-interval=60000
Feb 23 07:51:57 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/auth_type]/ensure: created
Feb 23 07:51:57 np0005626463.localdomain puppet-user[52737]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-remote-probe-interval]/ensure: created
Feb 23 07:51:57 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova::Migration::Libvirt/Nova_config[libvirt/live_migration_uri]/ensure: created
Feb 23 07:51:57 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova::Migration::Libvirt/Nova_config[libvirt/live_migration_tunnelled]/ensure: created
Feb 23 07:51:57 np0005626463.localdomain ovs-vsctl[53029]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-openflow-probe-interval=60
Feb 23 07:51:57 np0005626463.localdomain puppet-user[52737]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-openflow-probe-interval]/ensure: created
Feb 23 07:51:57 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova::Migration::Libvirt/Nova_config[libvirt/live_migration_inbound_addr]/ensure: created
Feb 23 07:51:57 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova::Migration::Libvirt/Nova_config[libvirt/live_migration_permit_post_copy]/ensure: created
Feb 23 07:51:57 np0005626463.localdomain ovs-vsctl[53031]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-monitor-all=true
Feb 23 07:51:57 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova::Migration::Libvirt/Nova_config[libvirt/live_migration_permit_auto_converge]/ensure: created
Feb 23 07:51:57 np0005626463.localdomain puppet-user[52737]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-monitor-all]/ensure: created
Feb 23 07:51:57 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova::Migration::Libvirt/Virtproxyd_config[listen_tls]/ensure: created
Feb 23 07:51:57 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova::Migration::Libvirt/Virtproxyd_config[listen_tcp]/ensure: created
Feb 23 07:51:57 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova::Compute::Rbd/Nova_config[libvirt/rbd_user]/ensure: created
Feb 23 07:51:57 np0005626463.localdomain ovs-vsctl[53047]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-ofctrl-wait-before-clear=8000
Feb 23 07:51:57 np0005626463.localdomain puppet-user[52737]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-ofctrl-wait-before-clear]/ensure: created
Feb 23 07:51:57 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova::Compute::Rbd/Nova_config[libvirt/rbd_secret_uuid]/ensure: created
Feb 23 07:51:57 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova::Compute::Rbd/File[/etc/nova/secret.xml]/ensure: defined content as '{sha256}9109162380f9c461e3d8ec780edb8a48cdd59dabd84e70a5fe7d1088fe416c1b'
Feb 23 07:51:57 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova::Compute::Rbd/Nova_config[libvirt/images_type]/ensure: created
Feb 23 07:51:57 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova::Compute::Rbd/Nova_config[libvirt/images_rbd_pool]/ensure: created
Feb 23 07:51:57 np0005626463.localdomain ovs-vsctl[53049]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-encap-tos=0
Feb 23 07:51:57 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova::Compute::Rbd/Nova_config[libvirt/images_rbd_ceph_conf]/ensure: created
Feb 23 07:51:57 np0005626463.localdomain puppet-user[52737]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-encap-tos]/ensure: created
Feb 23 07:51:57 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova::Compute::Rbd/Nova_config[libvirt/images_rbd_glance_store_name]/ensure: created
Feb 23 07:51:57 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova::Compute::Rbd/Nova_config[libvirt/images_rbd_glance_copy_poll_interval]/ensure: created
Feb 23 07:51:57 np0005626463.localdomain ovs-vsctl[53056]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-chassis-mac-mappings=datacentre:fa:16:3e:96:08:8c
Feb 23 07:51:57 np0005626463.localdomain puppet-user[52737]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-chassis-mac-mappings]/ensure: created
Feb 23 07:51:57 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova::Compute::Rbd/Nova_config[libvirt/images_rbd_glance_copy_timeout]/ensure: created
Feb 23 07:51:57 np0005626463.localdomain systemd[1]: libpod-bd0a74a8cd004f505267dc81a174257d2b7a291386b2f29bde186b94923c26a5.scope: Deactivated successfully.
Feb 23 07:51:57 np0005626463.localdomain systemd[1]: libpod-bd0a74a8cd004f505267dc81a174257d2b7a291386b2f29bde186b94923c26a5.scope: Consumed 2.415s CPU time.
Feb 23 07:51:57 np0005626463.localdomain podman[52556]: 2026-02-23 07:51:57.822500915 +0000 UTC m=+2.676325631 container died bd0a74a8cd004f505267dc81a174257d2b7a291386b2f29bde186b94923c26a5 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=container-puppet-rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 rsyslog, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005626463', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,rsyslog::generate_concat,concat::fragment', 'NAME': 'rsyslog', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::rsyslog'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., config_id=tripleo_puppet_step1, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-rsyslog, url=https://www.redhat.com, com.redhat.component=openstack-rsyslog-container, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 rsyslog, distribution-scope=public, version=17.1.13, vcs-type=git, name=rhosp-rhel9/openstack-rsyslog, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, org.opencontainers.image.created=2026-01-12T22:10:09Z, managed_by=tripleo_ansible, build-date=2026-01-12T22:10:09Z, io.buildah.version=1.41.5, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=container-puppet-rsyslog)
Feb 23 07:51:57 np0005626463.localdomain ovs-vsctl[53064]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-bridge-mappings=datacentre:br-ex
Feb 23 07:51:57 np0005626463.localdomain puppet-user[52737]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-bridge-mappings]/ensure: created
Feb 23 07:51:57 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[DEFAULT/compute_driver]/ensure: created
Feb 23 07:51:57 np0005626463.localdomain ovs-vsctl[53072]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-match-northd-version=false
Feb 23 07:51:57 np0005626463.localdomain puppet-user[52737]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-match-northd-version]/ensure: created
Feb 23 07:51:57 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[DEFAULT/preallocate_images]/ensure: created
Feb 23 07:51:57 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[vnc/server_listen]/ensure: created
Feb 23 07:51:57 np0005626463.localdomain ovs-vsctl[53078]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:garp-max-timeout-sec=0
Feb 23 07:51:57 np0005626463.localdomain puppet-user[52737]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:garp-max-timeout-sec]/ensure: created
Feb 23 07:51:57 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/virt_type]/ensure: created
Feb 23 07:51:57 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/cpu_mode]/ensure: created
Feb 23 07:51:57 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/inject_password]/ensure: created
Feb 23 07:51:57 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/inject_key]/ensure: created
Feb 23 07:51:57 np0005626463.localdomain puppet-user[52737]: Notice: Applied catalog in 0.51 seconds
Feb 23 07:51:57 np0005626463.localdomain puppet-user[52737]: Application:
Feb 23 07:51:57 np0005626463.localdomain puppet-user[52737]:    Initial environment: production
Feb 23 07:51:57 np0005626463.localdomain puppet-user[52737]:    Converged environment: production
Feb 23 07:51:57 np0005626463.localdomain puppet-user[52737]:          Run mode: user
Feb 23 07:51:57 np0005626463.localdomain puppet-user[52737]: Changes:
Feb 23 07:51:57 np0005626463.localdomain puppet-user[52737]:             Total: 14
Feb 23 07:51:57 np0005626463.localdomain puppet-user[52737]: Events:
Feb 23 07:51:57 np0005626463.localdomain puppet-user[52737]:           Success: 14
Feb 23 07:51:57 np0005626463.localdomain puppet-user[52737]:             Total: 14
Feb 23 07:51:57 np0005626463.localdomain puppet-user[52737]: Resources:
Feb 23 07:51:57 np0005626463.localdomain puppet-user[52737]:           Skipped: 12
Feb 23 07:51:57 np0005626463.localdomain puppet-user[52737]:           Changed: 14
Feb 23 07:51:57 np0005626463.localdomain puppet-user[52737]:       Out of sync: 14
Feb 23 07:51:57 np0005626463.localdomain puppet-user[52737]:             Total: 29
Feb 23 07:51:57 np0005626463.localdomain puppet-user[52737]: Time:
Feb 23 07:51:57 np0005626463.localdomain puppet-user[52737]:              Exec: 0.02
Feb 23 07:51:57 np0005626463.localdomain puppet-user[52737]:    Config retrieval: 0.27
Feb 23 07:51:57 np0005626463.localdomain puppet-user[52737]:         Vs config: 0.43
Feb 23 07:51:57 np0005626463.localdomain puppet-user[52737]:    Transaction evaluation: 0.50
Feb 23 07:51:57 np0005626463.localdomain puppet-user[52737]:    Catalog application: 0.51
Feb 23 07:51:57 np0005626463.localdomain puppet-user[52737]:          Last run: 1771833117
Feb 23 07:51:57 np0005626463.localdomain puppet-user[52737]:             Total: 0.51
Feb 23 07:51:57 np0005626463.localdomain puppet-user[52737]: Version:
Feb 23 07:51:57 np0005626463.localdomain puppet-user[52737]:            Config: 1771833117
Feb 23 07:51:57 np0005626463.localdomain puppet-user[52737]:            Puppet: 7.10.0
Feb 23 07:51:57 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/inject_partition]/ensure: created
Feb 23 07:51:57 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/hw_disk_discard]/ensure: created
Feb 23 07:51:57 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-bd0a74a8cd004f505267dc81a174257d2b7a291386b2f29bde186b94923c26a5-userdata-shm.mount: Deactivated successfully.
Feb 23 07:51:57 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-df719217e40f9ffd193139cfaeeaaebcf46866a6b616db04d8d1f0793e86d521-merged.mount: Deactivated successfully.
Feb 23 07:51:57 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/hw_machine_type]/ensure: created
Feb 23 07:51:57 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/enabled_perf_events]/ensure: created
Feb 23 07:51:57 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/rx_queue_size]/ensure: created
Feb 23 07:51:57 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/tx_queue_size]/ensure: created
Feb 23 07:51:58 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/file_backed_memory]/ensure: created
Feb 23 07:51:58 np0005626463.localdomain podman[53065]: 2026-02-23 07:51:58.018105975 +0000 UTC m=+0.181784502 container cleanup bd0a74a8cd004f505267dc81a174257d2b7a291386b2f29bde186b94923c26a5 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=container-puppet-rsyslog, summary=Red Hat OpenStack Platform 17.1 rsyslog, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, org.opencontainers.image.created=2026-01-12T22:10:09Z, distribution-scope=public, build-date=2026-01-12T22:10:09Z, com.redhat.component=openstack-rsyslog-container, architecture=x86_64, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005626463', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,rsyslog::generate_concat,concat::fragment', 'NAME': 'rsyslog', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::rsyslog'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', 
'/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, config_id=tripleo_puppet_step1, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 rsyslog, url=https://www.redhat.com, io.openshift.expose-services=, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., container_name=container-puppet-rsyslog, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, io.buildah.version=1.41.5, managed_by=tripleo_ansible, vcs-type=git, name=rhosp-rhel9/openstack-rsyslog)
Feb 23 07:51:58 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/volume_use_multipath]/ensure: created
Feb 23 07:51:58 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/num_pcie_ports]/ensure: created
Feb 23 07:51:58 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/mem_stats_period_seconds]/ensure: created
Feb 23 07:51:58 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/pmem_namespaces]/ensure: created
Feb 23 07:51:58 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/swtpm_enabled]/ensure: created
Feb 23 07:51:58 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/cpu_model_extra_flags]/ensure: created
Feb 23 07:51:58 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/disk_cachemodes]/ensure: created
Feb 23 07:51:58 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtlogd/Virtlogd_config[log_filters]/ensure: created
Feb 23 07:51:58 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtlogd/Virtlogd_config[log_outputs]/ensure: created
Feb 23 07:51:58 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtproxyd/Virtproxyd_config[log_filters]/ensure: created
Feb 23 07:51:58 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtproxyd/Virtproxyd_config[log_outputs]/ensure: created
Feb 23 07:51:58 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtqemud/Virtqemud_config[log_filters]/ensure: created
Feb 23 07:51:58 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtqemud/Virtqemud_config[log_outputs]/ensure: created
Feb 23 07:51:58 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtnodedevd/Virtnodedevd_config[log_filters]/ensure: created
Feb 23 07:51:58 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtnodedevd/Virtnodedevd_config[log_outputs]/ensure: created
Feb 23 07:51:58 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtstoraged/Virtstoraged_config[log_filters]/ensure: created
Feb 23 07:51:58 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtstoraged/Virtstoraged_config[log_outputs]/ensure: created
Feb 23 07:51:58 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtsecretd/Virtsecretd_config[log_filters]/ensure: created
Feb 23 07:51:58 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtsecretd/Virtsecretd_config[log_outputs]/ensure: created
Feb 23 07:51:58 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtnodedevd_config[unix_sock_group]/ensure: created
Feb 23 07:51:58 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtnodedevd_config[auth_unix_ro]/ensure: created
Feb 23 07:51:58 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtnodedevd_config[auth_unix_rw]/ensure: created
Feb 23 07:51:58 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtnodedevd_config[unix_sock_ro_perms]/ensure: created
Feb 23 07:51:58 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtnodedevd_config[unix_sock_rw_perms]/ensure: created
Feb 23 07:51:58 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtproxyd_config[unix_sock_group]/ensure: created
Feb 23 07:51:58 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtproxyd_config[auth_unix_ro]/ensure: created
Feb 23 07:51:58 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtproxyd_config[auth_unix_rw]/ensure: created
Feb 23 07:51:58 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtproxyd_config[unix_sock_ro_perms]/ensure: created
Feb 23 07:51:58 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtproxyd_config[unix_sock_rw_perms]/ensure: created
Feb 23 07:51:58 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtqemud_config[unix_sock_group]/ensure: created
Feb 23 07:51:58 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtqemud_config[auth_unix_ro]/ensure: created
Feb 23 07:51:58 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtqemud_config[auth_unix_rw]/ensure: created
Feb 23 07:51:58 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtqemud_config[unix_sock_ro_perms]/ensure: created
Feb 23 07:51:58 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtqemud_config[unix_sock_rw_perms]/ensure: created
Feb 23 07:51:58 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtsecretd_config[unix_sock_group]/ensure: created
Feb 23 07:51:58 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtsecretd_config[auth_unix_ro]/ensure: created
Feb 23 07:51:58 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtsecretd_config[auth_unix_rw]/ensure: created
Feb 23 07:51:58 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtsecretd_config[unix_sock_ro_perms]/ensure: created
Feb 23 07:51:58 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtsecretd_config[unix_sock_rw_perms]/ensure: created
Feb 23 07:51:58 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtstoraged_config[unix_sock_group]/ensure: created
Feb 23 07:51:58 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtstoraged_config[auth_unix_ro]/ensure: created
Feb 23 07:51:58 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtstoraged_config[auth_unix_rw]/ensure: created
Feb 23 07:51:58 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtstoraged_config[unix_sock_ro_perms]/ensure: created
Feb 23 07:51:58 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtstoraged_config[unix_sock_rw_perms]/ensure: created
Feb 23 07:51:58 np0005626463.localdomain systemd[1]: libpod-99554d424a6d099346c4a7ba038c3fd113ee65313c60f1a445ba39a36fbd8b95.scope: Deactivated successfully.
Feb 23 07:51:58 np0005626463.localdomain systemd[1]: libpod-99554d424a6d099346c4a7ba038c3fd113ee65313c60f1a445ba39a36fbd8b95.scope: Consumed 2.834s CPU time.
Feb 23 07:51:58 np0005626463.localdomain podman[52613]: 2026-02-23 07:51:58.47199408 +0000 UTC m=+3.212735817 container died 99554d424a6d099346c4a7ba038c3fd113ee65313c60f1a445ba39a36fbd8b95 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=container-puppet-ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., url=https://www.redhat.com, vcs-type=git, managed_by=tripleo_ansible, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005626463', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,vs_config,exec', 'NAME': 'ovn_controller', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::agents::ovn\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/etc/sysconfig/modules:/etc/sysconfig/modules', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, 
distribution-scope=public, config_id=tripleo_puppet_step1, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, name=rhosp-rhel9/openstack-ovn-controller, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, build-date=2026-01-12T22:36:40Z, container_name=container-puppet-ovn_controller, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, architecture=x86_64, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 23 07:51:58 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova::Compute::Libvirt::Qemu/Augeas[qemu-conf-limits]/returns: executed successfully
Feb 23 07:51:58 np0005626463.localdomain systemd[1]: libpod-conmon-bd0a74a8cd004f505267dc81a174257d2b7a291386b2f29bde186b94923c26a5.scope: Deactivated successfully.
Feb 23 07:51:58 np0005626463.localdomain python3[51466]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-rsyslog --conmon-pidfile /run/container-puppet-rsyslog.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005626463 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron,rsyslog::generate_concat,concat::fragment --env NAME=rsyslog --env STEP_CONFIG=include ::tripleo::packages
                                                         include tripleo::profile::base::logging::rsyslog --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-rsyslog --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005626463', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,rsyslog::generate_concat,concat::fragment', 'NAME': 'rsyslog', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::rsyslog'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-rsyslog.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume 
/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1
Feb 23 07:51:58 np0005626463.localdomain systemd[1]: tmp-crun.vSCaLd.mount: Deactivated successfully.
Feb 23 07:51:58 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-99554d424a6d099346c4a7ba038c3fd113ee65313c60f1a445ba39a36fbd8b95-userdata-shm.mount: Deactivated successfully.
Feb 23 07:51:58 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-4603fc849c2ecb1a2dd39fe5f99a90015995e0b99d1b206aafaed4ee8a276f7b-merged.mount: Deactivated successfully.
Feb 23 07:51:59 np0005626463.localdomain podman[53130]: 2026-02-23 07:51:59.142393025 +0000 UTC m=+0.663382128 container cleanup 99554d424a6d099346c4a7ba038c3fd113ee65313c60f1a445ba39a36fbd8b95 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=container-puppet-ovn_controller, com.redhat.component=openstack-ovn-controller-container, config_id=tripleo_puppet_step1, description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1766032510, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, vcs-type=git, name=rhosp-rhel9/openstack-ovn-controller, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, tcib_managed=true, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005626463', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,vs_config,exec', 'NAME': 'ovn_controller', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::agents::ovn\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/etc/sysconfig/modules:/etc/sysconfig/modules', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, container_name=container-puppet-ovn_controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.openshift.expose-services=, build-date=2026-01-12T22:36:40Z, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc.)
Feb 23 07:51:59 np0005626463.localdomain python3[51466]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-ovn_controller --conmon-pidfile /run/container-puppet-ovn_controller.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005626463 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron,vs_config,exec --env NAME=ovn_controller --env STEP_CONFIG=include ::tripleo::packages
                                                         include tripleo::profile::base::neutron::agents::ovn
                                                          --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-ovn_controller --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005626463', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,vs_config,exec', 'NAME': 'ovn_controller', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::agents::ovn\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/etc/sysconfig/modules:/etc/sysconfig/modules', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-ovn_controller.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume 
/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /etc/sysconfig/modules:/etc/sysconfig/modules --volume /lib/modules:/lib/modules:ro --volume /run/openvswitch:/run/openvswitch:shared,z --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1
Feb 23 07:51:59 np0005626463.localdomain systemd[1]: libpod-conmon-99554d424a6d099346c4a7ba038c3fd113ee65313c60f1a445ba39a36fbd8b95.scope: Deactivated successfully.
Feb 23 07:51:59 np0005626463.localdomain podman[52775]: 2026-02-23 07:51:55.673672084 +0000 UTC m=+0.032370055 image pull  registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1
Feb 23 07:51:59 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova::Migration::Qemu/Augeas[qemu-conf-migration-ports]/returns: executed successfully
Feb 23 07:51:59 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova::Logging/Oslo::Log[nova_config]/Nova_config[DEFAULT/debug]/ensure: created
Feb 23 07:51:59 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova::Logging/Oslo::Log[nova_config]/Nova_config[DEFAULT/log_dir]/ensure: created
Feb 23 07:51:59 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova::Cache/Oslo::Cache[nova_config]/Nova_config[cache/backend]/ensure: created
Feb 23 07:51:59 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova::Cache/Oslo::Cache[nova_config]/Nova_config[cache/enabled]/ensure: created
Feb 23 07:51:59 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova::Cache/Oslo::Cache[nova_config]/Nova_config[cache/memcache_servers]/ensure: created
Feb 23 07:51:59 np0005626463.localdomain podman[53257]: 2026-02-23 07:51:59.442343683 +0000 UTC m=+0.082704008 container create 40aa3a549aedee591e3aede6855bf29a6727a6d1aa5622c36500db7d39298f59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1, name=container-puppet-neutron, release=1766032510, org.opencontainers.image.created=2026-01-12T22:57:35Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-server, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-server, name=rhosp-rhel9/openstack-neutron-server, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-server, vcs-type=git, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-server, batch=17.1_20260112.1, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005626463', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,neutron_config,ovn_metadata_agent_config', 'NAME': 'neutron', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::ovn_metadata\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', 
'/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.buildah.version=1.41.5, architecture=x86_64, version=17.1.13, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, url=https://www.redhat.com, build-date=2026-01-12T22:57:35Z, tcib_managed=true, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 neutron-server, container_name=container-puppet-neutron, com.redhat.component=openstack-neutron-server-container, config_id=tripleo_puppet_step1, vendor=Red Hat, Inc., org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 23 07:51:59 np0005626463.localdomain systemd[1]: Started libpod-conmon-40aa3a549aedee591e3aede6855bf29a6727a6d1aa5622c36500db7d39298f59.scope.
Feb 23 07:51:59 np0005626463.localdomain podman[53257]: 2026-02-23 07:51:59.395487538 +0000 UTC m=+0.035847893 image pull  registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1
Feb 23 07:51:59 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 07:51:59 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova::Cache/Oslo::Cache[nova_config]/Nova_config[cache/tls_enabled]/ensure: created
Feb 23 07:51:59 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/40ec17e1ee8e7a0751fc146049cedcb53f05d4808bfead6a438b854c73d49686/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff)
Feb 23 07:51:59 np0005626463.localdomain podman[53257]: 2026-02-23 07:51:59.512000794 +0000 UTC m=+0.152361119 container init 40aa3a549aedee591e3aede6855bf29a6727a6d1aa5622c36500db7d39298f59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1, name=container-puppet-neutron, batch=17.1_20260112.1, config_id=tripleo_puppet_step1, version=17.1.13, url=https://www.redhat.com, vcs-type=git, name=rhosp-rhel9/openstack-neutron-server, container_name=container-puppet-neutron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-server, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-server, distribution-scope=public, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:57:35Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005626463', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,neutron_config,ovn_metadata_agent_config', 'NAME': 'neutron', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::ovn_metadata\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', 
'/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, build-date=2026-01-12T22:57:35Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 neutron-server, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, managed_by=tripleo_ansible, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-server, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 neutron-server, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, com.redhat.component=openstack-neutron-server-container)
Feb 23 07:51:59 np0005626463.localdomain podman[53257]: 2026-02-23 07:51:59.520404955 +0000 UTC m=+0.160765280 container start 40aa3a549aedee591e3aede6855bf29a6727a6d1aa5622c36500db7d39298f59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1, name=container-puppet-neutron, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-server, release=1766032510, config_id=tripleo_puppet_step1, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, com.redhat.component=openstack-neutron-server-container, build-date=2026-01-12T22:57:35Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-server, name=rhosp-rhel9/openstack-neutron-server, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, io.buildah.version=1.41.5, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, tcib_managed=true, container_name=container-puppet-neutron, io.openshift.expose-services=, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 neutron-server, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-server, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005626463', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,neutron_config,ovn_metadata_agent_config', 'NAME': 'neutron', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::ovn_metadata\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1', 'volumes': 
['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, url=https://www.redhat.com, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-server, org.opencontainers.image.created=2026-01-12T22:57:35Z, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 23 07:51:59 np0005626463.localdomain podman[53257]: 2026-02-23 07:51:59.520695414 +0000 UTC m=+0.161055779 container attach 40aa3a549aedee591e3aede6855bf29a6727a6d1aa5622c36500db7d39298f59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1, name=container-puppet-neutron, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:57:35Z, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, managed_by=tripleo_ansible, tcib_managed=true, container_name=container-puppet-neutron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-neutron-server-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-server, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, name=rhosp-rhel9/openstack-neutron-server, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-server, description=Red Hat OpenStack Platform 17.1 neutron-server, build-date=2026-01-12T22:57:35Z, summary=Red Hat OpenStack Platform 17.1 neutron-server, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005626463', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,neutron_config,ovn_metadata_agent_config', 'NAME': 'neutron', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::ovn_metadata\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, release=1766032510, config_id=tripleo_puppet_step1, version=17.1.13, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-server, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5)
Feb 23 07:51:59 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova/Oslo::Messaging::Rabbit[nova_config]/Nova_config[oslo_messaging_rabbit/heartbeat_in_pthread]/ensure: created
Feb 23 07:51:59 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova/Oslo::Messaging::Rabbit[nova_config]/Nova_config[oslo_messaging_rabbit/heartbeat_timeout_threshold]/ensure: created
Feb 23 07:51:59 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova/Oslo::Messaging::Rabbit[nova_config]/Nova_config[oslo_messaging_rabbit/ssl]/ensure: created
Feb 23 07:52:00 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova/Oslo::Messaging::Default[nova_config]/Nova_config[DEFAULT/transport_url]/ensure: created
Feb 23 07:52:00 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova/Oslo::Messaging::Notifications[nova_config]/Nova_config[oslo_messaging_notifications/driver]/ensure: created
Feb 23 07:52:00 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova/Oslo::Messaging::Notifications[nova_config]/Nova_config[oslo_messaging_notifications/transport_url]/ensure: created
Feb 23 07:52:00 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova/Oslo::Concurrency[nova_config]/Nova_config[oslo_concurrency/lock_path]/ensure: created
Feb 23 07:52:00 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/auth_type]/ensure: created
Feb 23 07:52:00 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/region_name]/ensure: created
Feb 23 07:52:00 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/auth_url]/ensure: created
Feb 23 07:52:00 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/username]/ensure: created
Feb 23 07:52:00 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/password]/ensure: created
Feb 23 07:52:00 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/user_domain_name]/ensure: created
Feb 23 07:52:00 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/project_name]/ensure: created
Feb 23 07:52:00 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/project_domain_name]/ensure: created
Feb 23 07:52:00 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/send_service_user_token]/ensure: created
Feb 23 07:52:00 np0005626463.localdomain puppet-user[51806]: Notice: /Stage[main]/Ssh::Server::Config/Concat[/etc/ssh/sshd_config]/File[/etc/ssh/sshd_config]/ensure: defined content as '{sha256}3ccd56cc76ec60fa08fd698d282c9c89b1e8c485a00f47d57569ed8f6f8a16e4'
Feb 23 07:52:00 np0005626463.localdomain puppet-user[51806]: Notice: Applied catalog in 4.54 seconds
Feb 23 07:52:00 np0005626463.localdomain puppet-user[51806]: Application:
Feb 23 07:52:00 np0005626463.localdomain puppet-user[51806]:    Initial environment: production
Feb 23 07:52:00 np0005626463.localdomain puppet-user[51806]:    Converged environment: production
Feb 23 07:52:00 np0005626463.localdomain puppet-user[51806]:          Run mode: user
Feb 23 07:52:00 np0005626463.localdomain puppet-user[51806]: Changes:
Feb 23 07:52:00 np0005626463.localdomain puppet-user[51806]:             Total: 183
Feb 23 07:52:00 np0005626463.localdomain puppet-user[51806]: Events:
Feb 23 07:52:00 np0005626463.localdomain puppet-user[51806]:           Success: 183
Feb 23 07:52:00 np0005626463.localdomain puppet-user[51806]:             Total: 183
Feb 23 07:52:00 np0005626463.localdomain puppet-user[51806]: Resources:
Feb 23 07:52:00 np0005626463.localdomain puppet-user[51806]:           Changed: 183
Feb 23 07:52:00 np0005626463.localdomain puppet-user[51806]:       Out of sync: 183
Feb 23 07:52:00 np0005626463.localdomain puppet-user[51806]:           Skipped: 57
Feb 23 07:52:00 np0005626463.localdomain puppet-user[51806]:             Total: 487
Feb 23 07:52:00 np0005626463.localdomain puppet-user[51806]: Time:
Feb 23 07:52:00 np0005626463.localdomain puppet-user[51806]:    Concat fragment: 0.00
Feb 23 07:52:00 np0005626463.localdomain puppet-user[51806]:            Anchor: 0.00
Feb 23 07:52:00 np0005626463.localdomain puppet-user[51806]:         File line: 0.00
Feb 23 07:52:00 np0005626463.localdomain puppet-user[51806]:    Virtlogd config: 0.00
Feb 23 07:52:00 np0005626463.localdomain puppet-user[51806]:    Virtsecretd config: 0.01
Feb 23 07:52:00 np0005626463.localdomain puppet-user[51806]:    Virtstoraged config: 0.02
Feb 23 07:52:00 np0005626463.localdomain puppet-user[51806]:              Exec: 0.02
Feb 23 07:52:00 np0005626463.localdomain puppet-user[51806]:              File: 0.02
Feb 23 07:52:00 np0005626463.localdomain puppet-user[51806]:    Virtproxyd config: 0.03
Feb 23 07:52:00 np0005626463.localdomain puppet-user[51806]:           Package: 0.03
Feb 23 07:52:00 np0005626463.localdomain puppet-user[51806]:    Virtqemud config: 0.03
Feb 23 07:52:00 np0005626463.localdomain puppet-user[51806]:    Virtnodedevd config: 0.04
Feb 23 07:52:00 np0005626463.localdomain puppet-user[51806]:            Augeas: 0.95
Feb 23 07:52:00 np0005626463.localdomain puppet-user[51806]:    Config retrieval: 1.50
Feb 23 07:52:00 np0005626463.localdomain puppet-user[51806]:          Last run: 1771833120
Feb 23 07:52:00 np0005626463.localdomain puppet-user[51806]:       Nova config: 3.16
Feb 23 07:52:00 np0005626463.localdomain puppet-user[51806]:    Transaction evaluation: 4.50
Feb 23 07:52:00 np0005626463.localdomain puppet-user[51806]:    Catalog application: 4.54
Feb 23 07:52:00 np0005626463.localdomain puppet-user[51806]:         Resources: 0.00
Feb 23 07:52:00 np0005626463.localdomain puppet-user[51806]:       Concat file: 0.00
Feb 23 07:52:00 np0005626463.localdomain puppet-user[51806]:             Total: 4.55
Feb 23 07:52:00 np0005626463.localdomain puppet-user[51806]: Version:
Feb 23 07:52:00 np0005626463.localdomain puppet-user[51806]:            Config: 1771833114
Feb 23 07:52:00 np0005626463.localdomain puppet-user[51806]:            Puppet: 7.10.0
Feb 23 07:52:01 np0005626463.localdomain systemd[1]: libpod-ae6d349aadf9563baef5ab53f4404eb45b0b3df6e2f63b0bac1cbbfd883bf757.scope: Deactivated successfully.
Feb 23 07:52:01 np0005626463.localdomain systemd[1]: libpod-ae6d349aadf9563baef5ab53f4404eb45b0b3df6e2f63b0bac1cbbfd883bf757.scope: Consumed 8.410s CPU time.
Feb 23 07:52:01 np0005626463.localdomain podman[53334]: 2026-02-23 07:52:01.195068862 +0000 UTC m=+0.034997536 container died ae6d349aadf9563baef5ab53f4404eb45b0b3df6e2f63b0bac1cbbfd883bf757 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=container-puppet-nova_libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, distribution-scope=public, maintainer=OpenStack TripleO Team, build-date=2026-01-12T23:31:49Z, release=1766032510, description=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, name=rhosp-rhel9/openstack-nova-libvirt, config_id=tripleo_puppet_step1, vendor=Red Hat, Inc., batch=17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.component=openstack-nova-libvirt-container, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005626463', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,nova_config,libvirtd_config,virtlogd_config,virtproxyd_config,virtqemud_config,virtnodedevd_config,virtsecretd_config,virtstoraged_config,nova_config,file,libvirt_tls_password,libvirtd_config,nova_config,file,libvirt_tls_password', 'NAME': 'nova_libvirt', 'STEP_CONFIG': "include ::tripleo::packages\n# TODO(emilien): figure how to deal with libvirt profile.\n# We'll probably treat it like we do with Neutron plugins.\n# Until then, just include it in the default nova-compute role.\ninclude tripleo::profile::base::nova::compute::libvirt\n\ninclude tripleo::profile::base::nova::libvirt\n\ninclude 
tripleo::profile::base::nova::compute::libvirt_guests\n\ninclude tripleo::profile::base::sshd\ninclude tripleo::profile::base::nova::migration::target"}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.created=2026-01-12T23:31:49Z, io.openshift.expose-services=, vcs-type=git, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, version=17.1.13, container_name=container-puppet-nova_libvirt, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 23 07:52:01 np0005626463.localdomain puppet-user[53288]: Error: Facter: error while resolving custom fact "haproxy_version": undefined method `strip' for nil:NilClass
Feb 23 07:52:01 np0005626463.localdomain systemd[1]: tmp-crun.BztZin.mount: Deactivated successfully.
Feb 23 07:52:01 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ae6d349aadf9563baef5ab53f4404eb45b0b3df6e2f63b0bac1cbbfd883bf757-userdata-shm.mount: Deactivated successfully.
Feb 23 07:52:01 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-f1c03aa8e256d3d38d275b9e911c2e9e69db76da13bfae548890815046fc902e-merged.mount: Deactivated successfully.
Feb 23 07:52:01 np0005626463.localdomain podman[53334]: 2026-02-23 07:52:01.317274055 +0000 UTC m=+0.157202739 container cleanup ae6d349aadf9563baef5ab53f4404eb45b0b3df6e2f63b0bac1cbbfd883bf757 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=container-puppet-nova_libvirt, architecture=x86_64, url=https://www.redhat.com, version=17.1.13, io.openshift.expose-services=, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, vendor=Red Hat, Inc., config_id=tripleo_puppet_step1, vcs-type=git, container_name=container-puppet-nova_libvirt, org.opencontainers.image.created=2026-01-12T23:31:49Z, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, build-date=2026-01-12T23:31:49Z, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005626463', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,nova_config,libvirtd_config,virtlogd_config,virtproxyd_config,virtqemud_config,virtnodedevd_config,virtsecretd_config,virtstoraged_config,nova_config,file,libvirt_tls_password,libvirtd_config,nova_config,file,libvirt_tls_password', 'NAME': 'nova_libvirt', 'STEP_CONFIG': "include ::tripleo::packages\n# TODO(emilien): figure 
how to deal with libvirt profile.\n# We'll probably treat it like we do with Neutron plugins.\n# Until then, just include it in the default nova-compute role.\ninclude tripleo::profile::base::nova::compute::libvirt\n\ninclude tripleo::profile::base::nova::libvirt\n\ninclude tripleo::profile::base::nova::compute::libvirt_guests\n\ninclude tripleo::profile::base::sshd\ninclude tripleo::profile::base::nova::migration::target"}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, com.redhat.component=openstack-nova-libvirt-container, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, distribution-scope=public, batch=17.1_20260112.1)
Feb 23 07:52:01 np0005626463.localdomain systemd[1]: libpod-conmon-ae6d349aadf9563baef5ab53f4404eb45b0b3df6e2f63b0bac1cbbfd883bf757.scope: Deactivated successfully.
Feb 23 07:52:01 np0005626463.localdomain python3[51466]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-nova_libvirt --conmon-pidfile /run/container-puppet-nova_libvirt.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005626463 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron,nova_config,libvirtd_config,virtlogd_config,virtproxyd_config,virtqemud_config,virtnodedevd_config,virtsecretd_config,virtstoraged_config,nova_config,file,libvirt_tls_password,libvirtd_config,nova_config,file,libvirt_tls_password --env NAME=nova_libvirt --env STEP_CONFIG=include ::tripleo::packages
                                                         # TODO(emilien): figure how to deal with libvirt profile.
                                                         # We'll probably treat it like we do with Neutron plugins.
                                                         # Until then, just include it in the default nova-compute role.
                                                         include tripleo::profile::base::nova::compute::libvirt
                                                         
                                                         include tripleo::profile::base::nova::libvirt
                                                         
                                                         include tripleo::profile::base::nova::compute::libvirt_guests
                                                         
                                                         include tripleo::profile::base::sshd
                                                         include tripleo::profile::base::nova::migration::target --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-nova_libvirt --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005626463', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,nova_config,libvirtd_config,virtlogd_config,virtproxyd_config,virtqemud_config,virtnodedevd_config,virtsecretd_config,virtstoraged_config,nova_config,file,libvirt_tls_password,libvirtd_config,nova_config,file,libvirt_tls_password', 'NAME': 'nova_libvirt', 'STEP_CONFIG': "include ::tripleo::packages\n# TODO(emilien): figure how to deal with libvirt profile.\n# We'll probably treat it like we do with Neutron plugins.\n# Until then, just include it in the default nova-compute role.\ninclude tripleo::profile::base::nova::compute::libvirt\n\ninclude tripleo::profile::base::nova::libvirt\n\ninclude tripleo::profile::base::nova::compute::libvirt_guests\n\ninclude tripleo::profile::base::sshd\ninclude tripleo::profile::base::nova::migration::target"}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', 
'/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-nova_libvirt.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Feb 23 07:52:01 np0005626463.localdomain puppet-user[53288]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5
Feb 23 07:52:01 np0005626463.localdomain puppet-user[53288]:    (file: /etc/puppet/hiera.yaml)
Feb 23 07:52:01 np0005626463.localdomain puppet-user[53288]: Warning: Undefined variable '::deploy_config_name';
Feb 23 07:52:01 np0005626463.localdomain puppet-user[53288]:    (file & line not available)
Feb 23 07:52:01 np0005626463.localdomain puppet-user[53288]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html
Feb 23 07:52:01 np0005626463.localdomain puppet-user[53288]:    (file & line not available)
Feb 23 07:52:01 np0005626463.localdomain puppet-user[53288]: Warning: Unknown variable: 'dhcp_agents_per_net'. (file: /etc/puppet/modules/tripleo/manifests/profile/base/neutron.pp, line: 154, column: 37)
Feb 23 07:52:02 np0005626463.localdomain puppet-user[53288]: Notice: Compiled catalog for np0005626463.localdomain in environment production in 0.58 seconds
Feb 23 07:52:02 np0005626463.localdomain puppet-user[53288]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/auth_strategy]/ensure: created
Feb 23 07:52:02 np0005626463.localdomain puppet-user[53288]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/core_plugin]/ensure: created
Feb 23 07:52:02 np0005626463.localdomain puppet-user[53288]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/host]/ensure: created
Feb 23 07:52:02 np0005626463.localdomain puppet-user[53288]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/dns_domain]/ensure: created
Feb 23 07:52:02 np0005626463.localdomain puppet-user[53288]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/dhcp_agent_notification]/ensure: created
Feb 23 07:52:02 np0005626463.localdomain puppet-user[53288]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/allow_overlapping_ips]/ensure: created
Feb 23 07:52:02 np0005626463.localdomain puppet-user[53288]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/global_physnet_mtu]/ensure: created
Feb 23 07:52:02 np0005626463.localdomain puppet-user[53288]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/vlan_transparent]/ensure: created
Feb 23 07:52:02 np0005626463.localdomain puppet-user[53288]: Notice: /Stage[main]/Neutron/Neutron_config[agent/root_helper]/ensure: created
Feb 23 07:52:02 np0005626463.localdomain puppet-user[53288]: Notice: /Stage[main]/Neutron/Neutron_config[agent/report_interval]/ensure: created
Feb 23 07:52:02 np0005626463.localdomain puppet-user[53288]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/service_plugins]/ensure: created
Feb 23 07:52:02 np0005626463.localdomain puppet-user[53288]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[DEFAULT/debug]/ensure: created
Feb 23 07:52:02 np0005626463.localdomain puppet-user[53288]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[DEFAULT/nova_metadata_host]/ensure: created
Feb 23 07:52:02 np0005626463.localdomain puppet-user[53288]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[DEFAULT/nova_metadata_protocol]/ensure: created
Feb 23 07:52:02 np0005626463.localdomain puppet-user[53288]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[DEFAULT/metadata_proxy_shared_secret]/ensure: created
Feb 23 07:52:02 np0005626463.localdomain puppet-user[53288]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[DEFAULT/metadata_workers]/ensure: created
Feb 23 07:52:02 np0005626463.localdomain puppet-user[53288]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[DEFAULT/state_path]/ensure: created
Feb 23 07:52:02 np0005626463.localdomain puppet-user[53288]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[DEFAULT/hwol_qos_enabled]/ensure: created
Feb 23 07:52:02 np0005626463.localdomain puppet-user[53288]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[agent/root_helper]/ensure: created
Feb 23 07:52:02 np0005626463.localdomain puppet-user[53288]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[ovs/ovsdb_connection]/ensure: created
Feb 23 07:52:02 np0005626463.localdomain puppet-user[53288]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[ovs/ovsdb_connection_timeout]/ensure: created
Feb 23 07:52:02 np0005626463.localdomain puppet-user[53288]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[ovn/ovsdb_probe_interval]/ensure: created
Feb 23 07:52:02 np0005626463.localdomain puppet-user[53288]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[ovn/ovn_nb_connection]/ensure: created
Feb 23 07:52:02 np0005626463.localdomain puppet-user[53288]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[ovn/ovn_sb_connection]/ensure: created
Feb 23 07:52:02 np0005626463.localdomain puppet-user[53288]: Notice: /Stage[main]/Neutron/Oslo::Messaging::Default[neutron_config]/Neutron_config[DEFAULT/transport_url]/ensure: created
Feb 23 07:52:02 np0005626463.localdomain puppet-user[53288]: Notice: /Stage[main]/Neutron/Oslo::Messaging::Default[neutron_config]/Neutron_config[DEFAULT/control_exchange]/ensure: created
Feb 23 07:52:02 np0005626463.localdomain puppet-user[53288]: Notice: /Stage[main]/Neutron/Oslo::Concurrency[neutron_config]/Neutron_config[oslo_concurrency/lock_path]/ensure: created
Feb 23 07:52:02 np0005626463.localdomain puppet-user[53288]: Notice: /Stage[main]/Neutron/Oslo::Messaging::Notifications[neutron_config]/Neutron_config[oslo_messaging_notifications/driver]/ensure: created
Feb 23 07:52:02 np0005626463.localdomain puppet-user[53288]: Notice: /Stage[main]/Neutron/Oslo::Messaging::Notifications[neutron_config]/Neutron_config[oslo_messaging_notifications/transport_url]/ensure: created
Feb 23 07:52:02 np0005626463.localdomain puppet-user[53288]: Notice: /Stage[main]/Neutron/Oslo::Messaging::Rabbit[neutron_config]/Neutron_config[oslo_messaging_rabbit/heartbeat_in_pthread]/ensure: created
Feb 23 07:52:02 np0005626463.localdomain puppet-user[53288]: Notice: /Stage[main]/Neutron/Oslo::Messaging::Rabbit[neutron_config]/Neutron_config[oslo_messaging_rabbit/heartbeat_timeout_threshold]/ensure: created
Feb 23 07:52:02 np0005626463.localdomain puppet-user[53288]: Notice: /Stage[main]/Neutron::Logging/Oslo::Log[neutron_config]/Neutron_config[DEFAULT/debug]/ensure: created
Feb 23 07:52:02 np0005626463.localdomain puppet-user[53288]: Notice: /Stage[main]/Neutron::Logging/Oslo::Log[neutron_config]/Neutron_config[DEFAULT/log_dir]/ensure: created
Feb 23 07:52:02 np0005626463.localdomain puppet-user[53288]: Notice: Applied catalog in 0.41 seconds
Feb 23 07:52:02 np0005626463.localdomain puppet-user[53288]: Application:
Feb 23 07:52:02 np0005626463.localdomain puppet-user[53288]:    Initial environment: production
Feb 23 07:52:02 np0005626463.localdomain puppet-user[53288]:    Converged environment: production
Feb 23 07:52:02 np0005626463.localdomain puppet-user[53288]:          Run mode: user
Feb 23 07:52:02 np0005626463.localdomain puppet-user[53288]: Changes:
Feb 23 07:52:02 np0005626463.localdomain puppet-user[53288]:             Total: 33
Feb 23 07:52:02 np0005626463.localdomain puppet-user[53288]: Events:
Feb 23 07:52:02 np0005626463.localdomain puppet-user[53288]:           Success: 33
Feb 23 07:52:02 np0005626463.localdomain puppet-user[53288]:             Total: 33
Feb 23 07:52:02 np0005626463.localdomain puppet-user[53288]: Resources:
Feb 23 07:52:02 np0005626463.localdomain puppet-user[53288]:           Skipped: 21
Feb 23 07:52:02 np0005626463.localdomain puppet-user[53288]:           Changed: 33
Feb 23 07:52:02 np0005626463.localdomain puppet-user[53288]:       Out of sync: 33
Feb 23 07:52:02 np0005626463.localdomain puppet-user[53288]:             Total: 155
Feb 23 07:52:02 np0005626463.localdomain puppet-user[53288]: Time:
Feb 23 07:52:02 np0005626463.localdomain puppet-user[53288]:         Resources: 0.00
Feb 23 07:52:02 np0005626463.localdomain puppet-user[53288]:    Ovn metadata agent config: 0.02
Feb 23 07:52:02 np0005626463.localdomain puppet-user[53288]:    Neutron config: 0.32
Feb 23 07:52:02 np0005626463.localdomain puppet-user[53288]:    Transaction evaluation: 0.40
Feb 23 07:52:02 np0005626463.localdomain puppet-user[53288]:    Catalog application: 0.41
Feb 23 07:52:02 np0005626463.localdomain puppet-user[53288]:    Config retrieval: 0.65
Feb 23 07:52:02 np0005626463.localdomain puppet-user[53288]:          Last run: 1771833122
Feb 23 07:52:02 np0005626463.localdomain puppet-user[53288]:             Total: 0.41
Feb 23 07:52:02 np0005626463.localdomain puppet-user[53288]: Version:
Feb 23 07:52:02 np0005626463.localdomain puppet-user[53288]:            Config: 1771833121
Feb 23 07:52:02 np0005626463.localdomain puppet-user[53288]:            Puppet: 7.10.0
Feb 23 07:52:02 np0005626463.localdomain systemd[1]: libpod-40aa3a549aedee591e3aede6855bf29a6727a6d1aa5622c36500db7d39298f59.scope: Deactivated successfully.
Feb 23 07:52:02 np0005626463.localdomain systemd[1]: libpod-40aa3a549aedee591e3aede6855bf29a6727a6d1aa5622c36500db7d39298f59.scope: Consumed 3.441s CPU time.
Feb 23 07:52:03 np0005626463.localdomain podman[53257]: 2026-02-23 07:52:03.00313662 +0000 UTC m=+3.643496975 container died 40aa3a549aedee591e3aede6855bf29a6727a6d1aa5622c36500db7d39298f59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1, name=container-puppet-neutron, summary=Red Hat OpenStack Platform 17.1 neutron-server, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, distribution-scope=public, container_name=container-puppet-neutron, io.openshift.expose-services=, release=1766032510, vcs-type=git, name=rhosp-rhel9/openstack-neutron-server, build-date=2026-01-12T22:57:35Z, io.buildah.version=1.41.5, com.redhat.component=openstack-neutron-server-container, config_id=tripleo_puppet_step1, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-server, url=https://www.redhat.com, tcib_managed=true, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-server, description=Red Hat OpenStack Platform 17.1 neutron-server, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005626463', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,neutron_config,ovn_metadata_agent_config', 'NAME': 'neutron', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::ovn_metadata\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, org.opencontainers.image.created=2026-01-12T22:57:35Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-server, version=17.1.13, maintainer=OpenStack TripleO Team)
Feb 23 07:52:03 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-40aa3a549aedee591e3aede6855bf29a6727a6d1aa5622c36500db7d39298f59-userdata-shm.mount: Deactivated successfully.
Feb 23 07:52:03 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-40ec17e1ee8e7a0751fc146049cedcb53f05d4808bfead6a438b854c73d49686-merged.mount: Deactivated successfully.
Feb 23 07:52:03 np0005626463.localdomain podman[53471]: 2026-02-23 07:52:03.132759402 +0000 UTC m=+0.116286309 container cleanup 40aa3a549aedee591e3aede6855bf29a6727a6d1aa5622c36500db7d39298f59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1, name=container-puppet-neutron, com.redhat.component=openstack-neutron-server-container, release=1766032510, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, config_id=tripleo_puppet_step1, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, container_name=container-puppet-neutron, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-server, tcib_managed=true, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:57:35Z, description=Red Hat OpenStack Platform 17.1 neutron-server, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:57:35Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-server, name=rhosp-rhel9/openstack-neutron-server, summary=Red Hat OpenStack Platform 17.1 neutron-server, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-server, version=17.1.13, vendor=Red Hat, Inc., config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005626463', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,neutron_config,ovn_metadata_agent_config', 'NAME': 'neutron', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::ovn_metadata\n'}, 'net': ['host'], 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, vcs-type=git, io.openshift.expose-services=, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 23 07:52:03 np0005626463.localdomain systemd[1]: libpod-conmon-40aa3a549aedee591e3aede6855bf29a6727a6d1aa5622c36500db7d39298f59.scope: Deactivated successfully.
Feb 23 07:52:03 np0005626463.localdomain python3[51466]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-neutron --conmon-pidfile /run/container-puppet-neutron.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005626463 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron,neutron_config,ovn_metadata_agent_config --env NAME=neutron --env STEP_CONFIG=include ::tripleo::packages
                                                         include tripleo::profile::base::neutron::ovn_metadata
                                                          --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-neutron --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005626463', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,neutron_config,ovn_metadata_agent_config', 'NAME': 'neutron', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::ovn_metadata\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-neutron.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro 
--volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /lib/modules:/lib/modules:ro --volume /run/openvswitch:/run/openvswitch:shared,z --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1
Feb 23 07:52:03 np0005626463.localdomain sudo[51464]: pam_unix(sudo:session): session closed for user root
Feb 23 07:52:03 np0005626463.localdomain sudo[53523]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-siqnyczmahmyjcuebsyrqbxpscyqmtjz ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 07:52:03 np0005626463.localdomain sudo[53523]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:52:04 np0005626463.localdomain python3[53525]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 07:52:04 np0005626463.localdomain sudo[53523]: pam_unix(sudo:session): session closed for user root
Feb 23 07:52:04 np0005626463.localdomain sudo[53539]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tbcodcfmoqoeesnhsvcdwlloflnewkes ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 07:52:04 np0005626463.localdomain sudo[53539]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:52:04 np0005626463.localdomain sudo[53539]: pam_unix(sudo:session): session closed for user root
Feb 23 07:52:04 np0005626463.localdomain sudo[53555]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pzfbqfvgiorgxykjetgokzikhnlcjqbp ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 07:52:04 np0005626463.localdomain sudo[53555]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:52:05 np0005626463.localdomain python3[53557]: ansible-stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 23 07:52:05 np0005626463.localdomain sudo[53555]: pam_unix(sudo:session): session closed for user root
Feb 23 07:52:05 np0005626463.localdomain sudo[53605]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dywilpgtevbbdwftaqtlzlwjkfkefwwk ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 07:52:05 np0005626463.localdomain sudo[53605]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:52:05 np0005626463.localdomain python3[53607]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-container-shutdown follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 07:52:05 np0005626463.localdomain sudo[53605]: pam_unix(sudo:session): session closed for user root
Feb 23 07:52:05 np0005626463.localdomain sudo[53648]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vtzdijjdtsccddothgugwsczwjyhyljh ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 07:52:05 np0005626463.localdomain sudo[53648]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:52:06 np0005626463.localdomain python3[53650]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771833125.2457435-84994-250329414396068/source dest=/usr/libexec/tripleo-container-shutdown mode=0700 owner=root group=root _original_basename=tripleo-container-shutdown follow=False checksum=7d67b1986212f5548057505748cd74cfcf9c0d35 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 07:52:06 np0005626463.localdomain sudo[53648]: pam_unix(sudo:session): session closed for user root
Feb 23 07:52:06 np0005626463.localdomain sudo[53710]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lfylcunqcadydhwtarxzyjbzsrnpozyc ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 07:52:06 np0005626463.localdomain sudo[53710]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:52:06 np0005626463.localdomain python3[53712]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-start-podman-container follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 07:52:06 np0005626463.localdomain sudo[53710]: pam_unix(sudo:session): session closed for user root
Feb 23 07:52:06 np0005626463.localdomain sudo[53753]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uhzysheljaciwxtxybxaiulbdkgdujjx ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 07:52:06 np0005626463.localdomain sudo[53753]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:52:06 np0005626463.localdomain python3[53755]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771833126.192573-84994-228560224324970/source dest=/usr/libexec/tripleo-start-podman-container mode=0700 owner=root group=root _original_basename=tripleo-start-podman-container follow=False checksum=536965633b8d3b1ce794269ffb07be0105a560a0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 07:52:06 np0005626463.localdomain sudo[53753]: pam_unix(sudo:session): session closed for user root
Feb 23 07:52:07 np0005626463.localdomain sudo[53815]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sdorcwtroshbspmfmfcemwdijcmjweij ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 07:52:07 np0005626463.localdomain sudo[53815]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:52:07 np0005626463.localdomain python3[53817]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/tripleo-container-shutdown.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 07:52:07 np0005626463.localdomain sudo[53815]: pam_unix(sudo:session): session closed for user root
Feb 23 07:52:07 np0005626463.localdomain sudo[53858]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nkxloqacvkczifcnufwqeeglcuoffxiy ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 07:52:07 np0005626463.localdomain sudo[53858]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:52:07 np0005626463.localdomain python3[53860]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771833127.080182-85042-154961231096837/source dest=/usr/lib/systemd/system/tripleo-container-shutdown.service mode=0644 owner=root group=root _original_basename=tripleo-container-shutdown-service follow=False checksum=66c1d41406ba8714feb9ed0a35259a7a57ef9707 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 07:52:07 np0005626463.localdomain sudo[53858]: pam_unix(sudo:session): session closed for user root
Feb 23 07:52:08 np0005626463.localdomain sudo[53920]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vpesexnocbigejfugdysqdcvlsnptgyw ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 07:52:08 np0005626463.localdomain sudo[53920]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:52:08 np0005626463.localdomain python3[53922]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 07:52:08 np0005626463.localdomain sudo[53920]: pam_unix(sudo:session): session closed for user root
Feb 23 07:52:08 np0005626463.localdomain sudo[53963]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-llaporikwdvihfahymkkyjgubiqhfrew ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 07:52:08 np0005626463.localdomain sudo[53963]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:52:08 np0005626463.localdomain python3[53965]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771833127.9265833-85065-71420798557053/source dest=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset mode=0644 owner=root group=root _original_basename=91-tripleo-container-shutdown-preset follow=False checksum=bccb1207dcbcfaa5ca05f83c8f36ce4c2460f081 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 07:52:08 np0005626463.localdomain sudo[53963]: pam_unix(sudo:session): session closed for user root
Feb 23 07:52:08 np0005626463.localdomain sudo[53993]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nlwymoxrwivqxngpiwuijaayuztahztq ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 07:52:08 np0005626463.localdomain sudo[53993]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:52:09 np0005626463.localdomain python3[53995]: ansible-systemd Invoked with name=tripleo-container-shutdown state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 07:52:09 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 07:52:09 np0005626463.localdomain systemd-sysv-generator[54020]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 07:52:09 np0005626463.localdomain systemd-rc-local-generator[54014]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 07:52:09 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 07:52:09 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 07:52:09 np0005626463.localdomain systemd-rc-local-generator[54057]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 07:52:09 np0005626463.localdomain systemd-sysv-generator[54062]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 07:52:09 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 07:52:09 np0005626463.localdomain systemd[1]: Starting TripleO Container Shutdown...
Feb 23 07:52:09 np0005626463.localdomain systemd[1]: Finished TripleO Container Shutdown.
Feb 23 07:52:09 np0005626463.localdomain sudo[53993]: pam_unix(sudo:session): session closed for user root
Feb 23 07:52:09 np0005626463.localdomain sudo[54115]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rjzyeuopfxqnbcttjuooiqcadnwklgxz ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 07:52:09 np0005626463.localdomain sudo[54115]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:52:10 np0005626463.localdomain python3[54117]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/netns-placeholder.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 07:52:10 np0005626463.localdomain sudo[54115]: pam_unix(sudo:session): session closed for user root
Feb 23 07:52:10 np0005626463.localdomain sudo[54158]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pxurxxsnxhjlhrcbggblyrzsjbqzbfpj ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 07:52:10 np0005626463.localdomain sudo[54158]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:52:10 np0005626463.localdomain python3[54160]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771833129.831725-85146-209533956272378/source dest=/usr/lib/systemd/system/netns-placeholder.service mode=0644 owner=root group=root _original_basename=netns-placeholder-service follow=False checksum=8e9c6d5ce3a6e7f71c18780ec899f32f23de4c71 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 07:52:10 np0005626463.localdomain sudo[54158]: pam_unix(sudo:session): session closed for user root
Feb 23 07:52:10 np0005626463.localdomain sudo[54220]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lbyyxqxtixtmkunssxuwdnlcxixmazma ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 07:52:10 np0005626463.localdomain sudo[54220]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:52:11 np0005626463.localdomain python3[54222]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 07:52:11 np0005626463.localdomain sudo[54220]: pam_unix(sudo:session): session closed for user root
Feb 23 07:52:11 np0005626463.localdomain sudo[54263]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rzrjyuegszcuzeymlmsrhehmqsyvqqlj ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 07:52:11 np0005626463.localdomain sudo[54263]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:52:11 np0005626463.localdomain python3[54265]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771833130.6995149-85160-67469161869531/source dest=/usr/lib/systemd/system-preset/91-netns-placeholder.preset mode=0644 owner=root group=root _original_basename=91-netns-placeholder-preset follow=False checksum=28b7b9aa893525d134a1eeda8a0a48fb25b736b9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 07:52:11 np0005626463.localdomain sudo[54263]: pam_unix(sudo:session): session closed for user root
Feb 23 07:52:11 np0005626463.localdomain sudo[54293]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ynxquoyisxroqrughzrsfsaozrdnmdos ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 07:52:11 np0005626463.localdomain sudo[54293]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:52:11 np0005626463.localdomain python3[54295]: ansible-systemd Invoked with name=netns-placeholder state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 07:52:11 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 07:52:12 np0005626463.localdomain systemd-rc-local-generator[54319]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 07:52:12 np0005626463.localdomain systemd-sysv-generator[54322]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 07:52:12 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 07:52:12 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 07:52:12 np0005626463.localdomain systemd-rc-local-generator[54358]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 07:52:12 np0005626463.localdomain systemd-sysv-generator[54361]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 07:52:12 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 07:52:12 np0005626463.localdomain systemd[1]: Starting Create netns directory...
Feb 23 07:52:12 np0005626463.localdomain systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Feb 23 07:52:12 np0005626463.localdomain systemd[1]: netns-placeholder.service: Deactivated successfully.
Feb 23 07:52:12 np0005626463.localdomain systemd[1]: Finished Create netns directory.
Feb 23 07:52:12 np0005626463.localdomain sudo[54293]: pam_unix(sudo:session): session closed for user root
Feb 23 07:52:12 np0005626463.localdomain sudo[54386]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nxncbciwcdhhltynjplxsvemhrtklwul ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 07:52:12 np0005626463.localdomain sudo[54386]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:52:12 np0005626463.localdomain python3[54388]: ansible-container_puppet_config Invoked with update_config_hash_only=True no_archive=True check_mode=False config_vol_prefix=/var/lib/config-data debug=False net_host=True puppet_config= short_hostname= step=6
Feb 23 07:52:12 np0005626463.localdomain python3[54388]: ansible-container_puppet_config [WARNING] Config change detected for metrics_qdr, new hash: 90a8871bd317528138d212bd0375f6aa
Feb 23 07:52:12 np0005626463.localdomain python3[54388]: ansible-container_puppet_config [WARNING] Config change detected for collectd, new hash: da9a0dc7b40588672419e3ce10063e21
Feb 23 07:52:12 np0005626463.localdomain python3[54388]: ansible-container_puppet_config [WARNING] Config change detected for iscsid, new hash: 45772c82d00b8348e0440509154d74a9
Feb 23 07:52:12 np0005626463.localdomain python3[54388]: ansible-container_puppet_config [WARNING] Config change detected for nova_virtlogd_wrapper, new hash: b5f04eda8e5f004a5ff6ec948b25cc1e
Feb 23 07:52:12 np0005626463.localdomain python3[54388]: ansible-container_puppet_config [WARNING] Config change detected for nova_virtnodedevd, new hash: b5f04eda8e5f004a5ff6ec948b25cc1e
Feb 23 07:52:12 np0005626463.localdomain python3[54388]: ansible-container_puppet_config [WARNING] Config change detected for nova_virtproxyd, new hash: b5f04eda8e5f004a5ff6ec948b25cc1e
Feb 23 07:52:12 np0005626463.localdomain python3[54388]: ansible-container_puppet_config [WARNING] Config change detected for nova_virtqemud, new hash: b5f04eda8e5f004a5ff6ec948b25cc1e
Feb 23 07:52:12 np0005626463.localdomain python3[54388]: ansible-container_puppet_config [WARNING] Config change detected for nova_virtsecretd, new hash: b5f04eda8e5f004a5ff6ec948b25cc1e
Feb 23 07:52:12 np0005626463.localdomain python3[54388]: ansible-container_puppet_config [WARNING] Config change detected for nova_virtstoraged, new hash: b5f04eda8e5f004a5ff6ec948b25cc1e
Feb 23 07:52:12 np0005626463.localdomain python3[54388]: ansible-container_puppet_config [WARNING] Config change detected for rsyslog, new hash: 8e5028e38f7077561ef1e3e50ec174a3
Feb 23 07:52:12 np0005626463.localdomain python3[54388]: ansible-container_puppet_config [WARNING] Config change detected for ceilometer_agent_compute, new hash: 44281c742f88411d75916a4e58499720
Feb 23 07:52:12 np0005626463.localdomain python3[54388]: ansible-container_puppet_config [WARNING] Config change detected for ceilometer_agent_ipmi, new hash: 44281c742f88411d75916a4e58499720
Feb 23 07:52:12 np0005626463.localdomain python3[54388]: ansible-container_puppet_config [WARNING] Config change detected for logrotate_crond, new hash: 53ed83bb0cae779ff95edb2002262c6f
Feb 23 07:52:12 np0005626463.localdomain python3[54388]: ansible-container_puppet_config [WARNING] Config change detected for nova_libvirt_init_secret, new hash: b5f04eda8e5f004a5ff6ec948b25cc1e
Feb 23 07:52:12 np0005626463.localdomain python3[54388]: ansible-container_puppet_config [WARNING] Config change detected for nova_migration_target, new hash: b5f04eda8e5f004a5ff6ec948b25cc1e
Feb 23 07:52:12 np0005626463.localdomain python3[54388]: ansible-container_puppet_config [WARNING] Config change detected for ovn_metadata_agent, new hash: cf62475d9880911ecf982eff6ab572ad
Feb 23 07:52:12 np0005626463.localdomain python3[54388]: ansible-container_puppet_config [WARNING] Config change detected for nova_compute, new hash: 45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e
Feb 23 07:52:12 np0005626463.localdomain python3[54388]: ansible-container_puppet_config [WARNING] Config change detected for nova_wait_for_compute_service, new hash: b5f04eda8e5f004a5ff6ec948b25cc1e
Feb 23 07:52:12 np0005626463.localdomain sudo[54386]: pam_unix(sudo:session): session closed for user root
Feb 23 07:52:13 np0005626463.localdomain sudo[54402]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fcvkvhixlvtemddcfsdibarozeijtqha ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 07:52:13 np0005626463.localdomain sudo[54402]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:52:13 np0005626463.localdomain sudo[54402]: pam_unix(sudo:session): session closed for user root
Feb 23 07:52:14 np0005626463.localdomain sudo[54444]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gjcmxjwhklaqakoysdpzricsdcpuucsc ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 07:52:14 np0005626463.localdomain sudo[54444]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:52:14 np0005626463.localdomain python3[54446]: ansible-tripleo_container_manage Invoked with config_id=tripleo_step1 config_dir=/var/lib/tripleo-config/container-startup-config/step_1 config_patterns=*.json config_overrides={} concurrency=5 log_base_path=/var/log/containers/stdouts debug=False
Feb 23 07:52:14 np0005626463.localdomain podman[54483]: 2026-02-23 07:52:14.876395215 +0000 UTC m=+0.088663072 container create 1521039cbc1f0b454f347c398b2f9d9f82e9d5fd05702e1b7ff5d0597e25598f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr_init_logs, container_name=metrics_qdr_init_logs, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-qdrouterd, build-date=2026-01-12T22:10:14Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, config_data={'command': ['/bin/bash', '-c', 'chown -R qdrouterd:qdrouterd /var/log/qdrouterd'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'none', 'privileged': False, 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, managed_by=tripleo_ansible, io.buildah.version=1.41.5, com.redhat.component=openstack-qdrouterd-container, url=https://www.redhat.com, vendor=Red Hat, Inc., org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:14Z, release=1766032510, maintainer=OpenStack TripleO Team, distribution-scope=public, batch=17.1_20260112.1, version=17.1.13, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=)
Feb 23 07:52:14 np0005626463.localdomain systemd[1]: Started libpod-conmon-1521039cbc1f0b454f347c398b2f9d9f82e9d5fd05702e1b7ff5d0597e25598f.scope.
Feb 23 07:52:14 np0005626463.localdomain podman[54483]: 2026-02-23 07:52:14.834874307 +0000 UTC m=+0.047142194 image pull  registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1
Feb 23 07:52:14 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 07:52:14 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/eb85f8b1f9f1bb7644ed891399fb297bad9a6f983f4d7e10e6f8474d89d107e3/merged/var/log/qdrouterd supports timestamps until 2038 (0x7fffffff)
Feb 23 07:52:14 np0005626463.localdomain podman[54483]: 2026-02-23 07:52:14.952360992 +0000 UTC m=+0.164628849 container init 1521039cbc1f0b454f347c398b2f9d9f82e9d5fd05702e1b7ff5d0597e25598f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr_init_logs, build-date=2026-01-12T22:10:14Z, distribution-scope=public, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, managed_by=tripleo_ansible, version=17.1.13, vendor=Red Hat, Inc., vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, com.redhat.component=openstack-qdrouterd-container, url=https://www.redhat.com, batch=17.1_20260112.1, tcib_managed=true, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=metrics_qdr_init_logs, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, name=rhosp-rhel9/openstack-qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, vcs-type=git, maintainer=OpenStack TripleO Team, config_data={'command': ['/bin/bash', '-c', 'chown -R qdrouterd:qdrouterd /var/log/qdrouterd'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'none', 'privileged': False, 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']})
Feb 23 07:52:14 np0005626463.localdomain podman[54483]: 2026-02-23 07:52:14.962972112 +0000 UTC m=+0.175239969 container start 1521039cbc1f0b454f347c398b2f9d9f82e9d5fd05702e1b7ff5d0597e25598f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr_init_logs, com.redhat.component=openstack-qdrouterd-container, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:10:14Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, io.buildah.version=1.41.5, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, config_data={'command': ['/bin/bash', '-c', 'chown -R qdrouterd:qdrouterd /var/log/qdrouterd'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'none', 'privileged': False, 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr_init_logs, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, distribution-scope=public, version=17.1.13, name=rhosp-rhel9/openstack-qdrouterd)
Feb 23 07:52:14 np0005626463.localdomain podman[54483]: 2026-02-23 07:52:14.963273431 +0000 UTC m=+0.175541338 container attach 1521039cbc1f0b454f347c398b2f9d9f82e9d5fd05702e1b7ff5d0597e25598f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr_init_logs, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-qdrouterd, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, managed_by=tripleo_ansible, build-date=2026-01-12T22:10:14Z, distribution-scope=public, container_name=metrics_qdr_init_logs, tcib_managed=true, config_data={'command': ['/bin/bash', '-c', 'chown -R qdrouterd:qdrouterd /var/log/qdrouterd'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'none', 'privileged': False, 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, vcs-type=git, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, version=17.1.13, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:10:14Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 23 07:52:14 np0005626463.localdomain systemd[1]: libpod-1521039cbc1f0b454f347c398b2f9d9f82e9d5fd05702e1b7ff5d0597e25598f.scope: Deactivated successfully.
Feb 23 07:52:14 np0005626463.localdomain podman[54483]: 2026-02-23 07:52:14.970736973 +0000 UTC m=+0.183004810 container died 1521039cbc1f0b454f347c398b2f9d9f82e9d5fd05702e1b7ff5d0597e25598f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr_init_logs, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'command': ['/bin/bash', '-c', 'chown -R qdrouterd:qdrouterd /var/log/qdrouterd'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'none', 'privileged': False, 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, name=rhosp-rhel9/openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, container_name=metrics_qdr_init_logs, build-date=2026-01-12T22:10:14Z, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:14Z, architecture=x86_64, vcs-type=git, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd)
Feb 23 07:52:15 np0005626463.localdomain podman[54502]: 2026-02-23 07:52:15.061609063 +0000 UTC m=+0.077287049 container cleanup 1521039cbc1f0b454f347c398b2f9d9f82e9d5fd05702e1b7ff5d0597e25598f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr_init_logs, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, distribution-scope=public, batch=17.1_20260112.1, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, config_id=tripleo_step1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=metrics_qdr_init_logs, tcib_managed=true, version=17.1.13, build-date=2026-01-12T22:10:14Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'command': ['/bin/bash', '-c', 'chown -R qdrouterd:qdrouterd /var/log/qdrouterd'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'none', 'privileged': False, 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, name=rhosp-rhel9/openstack-qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, vendor=Red Hat, Inc.)
Feb 23 07:52:15 np0005626463.localdomain systemd[1]: libpod-conmon-1521039cbc1f0b454f347c398b2f9d9f82e9d5fd05702e1b7ff5d0597e25598f.scope: Deactivated successfully.
Feb 23 07:52:15 np0005626463.localdomain python3[54446]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name metrics_qdr_init_logs --conmon-pidfile /run/metrics_qdr_init_logs.pid --detach=False --label config_id=tripleo_step1 --label container_name=metrics_qdr_init_logs --label managed_by=tripleo_ansible --label config_data={'command': ['/bin/bash', '-c', 'chown -R qdrouterd:qdrouterd /var/log/qdrouterd'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'none', 'privileged': False, 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/metrics_qdr_init_logs.log --network none --privileged=False --user root --volume /var/log/containers/metrics_qdr:/var/log/qdrouterd:z registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1 /bin/bash -c chown -R qdrouterd:qdrouterd /var/log/qdrouterd
Feb 23 07:52:15 np0005626463.localdomain podman[54579]: 2026-02-23 07:52:15.55293999 +0000 UTC m=+0.080141888 container create f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:10:14Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, release=1766032510, io.openshift.expose-services=, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.created=2026-01-12T22:10:14Z, 
architecture=x86_64, managed_by=tripleo_ansible, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, io.buildah.version=1.41.5, version=17.1.13, tcib_managed=true)
Feb 23 07:52:15 np0005626463.localdomain systemd[1]: Started libpod-conmon-f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.scope.
Feb 23 07:52:15 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 07:52:15 np0005626463.localdomain podman[54579]: 2026-02-23 07:52:15.512143544 +0000 UTC m=+0.039345512 image pull  registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1
Feb 23 07:52:15 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/72bff2249ea9ee03825bd3e8fa07150769abcfe162fde9078852b16a351c2e6d/merged/var/log/qdrouterd supports timestamps until 2038 (0x7fffffff)
Feb 23 07:52:15 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/72bff2249ea9ee03825bd3e8fa07150769abcfe162fde9078852b16a351c2e6d/merged/var/lib/qdrouterd supports timestamps until 2038 (0x7fffffff)
Feb 23 07:52:15 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.
Feb 23 07:52:15 np0005626463.localdomain podman[54579]: 2026-02-23 07:52:15.640983412 +0000 UTC m=+0.168185310 container init f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, distribution-scope=public, build-date=2026-01-12T22:10:14Z, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-qdrouterd, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step1, 
batch=17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, version=17.1.13, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=metrics_qdr, org.opencontainers.image.created=2026-01-12T22:10:14Z, vendor=Red Hat, Inc., tcib_managed=true, managed_by=tripleo_ansible)
Feb 23 07:52:15 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.
Feb 23 07:52:15 np0005626463.localdomain sudo[54600]: qdrouterd : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Feb 23 07:52:15 np0005626463.localdomain sudo[54600]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42465)
Feb 23 07:52:15 np0005626463.localdomain podman[54579]: 2026-02-23 07:52:15.675551294 +0000 UTC m=+0.202753202 container start f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, container_name=metrics_qdr, io.openshift.expose-services=, vcs-type=git, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:10:14Z, config_id=tripleo_step1, release=1766032510, distribution-scope=public, 
com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, batch=17.1_20260112.1, build-date=2026-01-12T22:10:14Z, io.buildah.version=1.41.5, version=17.1.13, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd)
Feb 23 07:52:15 np0005626463.localdomain python3[54446]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name metrics_qdr --conmon-pidfile /run/metrics_qdr.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=90a8871bd317528138d212bd0375f6aa --healthcheck-command /openstack/healthcheck --label config_id=tripleo_step1 --label container_name=metrics_qdr --label managed_by=tripleo_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/metrics_qdr.log --network host --privileged=False --user qdrouterd --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume 
/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro --volume /var/lib/metrics_qdr:/var/lib/qdrouterd:z --volume /var/log/containers/metrics_qdr:/var/log/qdrouterd:z registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1
Feb 23 07:52:15 np0005626463.localdomain sudo[54600]: pam_unix(sudo:session): session closed for user root
Feb 23 07:52:15 np0005626463.localdomain podman[54601]: 2026-02-23 07:52:15.765760694 +0000 UTC m=+0.081054536 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=starting, summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, name=rhosp-rhel9/openstack-qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, version=17.1.13, vendor=Red Hat, Inc., com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, io.buildah.version=1.41.5, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:14Z, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, managed_by=tripleo_ansible)
Feb 23 07:52:15 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-eb85f8b1f9f1bb7644ed891399fb297bad9a6f983f4d7e10e6f8474d89d107e3-merged.mount: Deactivated successfully.
Feb 23 07:52:15 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-1521039cbc1f0b454f347c398b2f9d9f82e9d5fd05702e1b7ff5d0597e25598f-userdata-shm.mount: Deactivated successfully.
Feb 23 07:52:15 np0005626463.localdomain sudo[54444]: pam_unix(sudo:session): session closed for user root
Feb 23 07:52:15 np0005626463.localdomain podman[54601]: 2026-02-23 07:52:15.995198613 +0000 UTC m=+0.310492465 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_id=tripleo_step1, distribution-scope=public, vcs-type=git, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-qdrouterd, build-date=2026-01-12T22:10:14Z, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, 
org.opencontainers.image.created=2026-01-12T22:10:14Z, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, io.openshift.expose-services=, container_name=metrics_qdr, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510)
Feb 23 07:52:16 np0005626463.localdomain systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully.
Feb 23 07:52:16 np0005626463.localdomain sudo[54674]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nkvderavvcmvipowyofdbyfkkhbpihhn ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 07:52:16 np0005626463.localdomain sudo[54674]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:52:16 np0005626463.localdomain python3[54676]: ansible-file Invoked with path=/etc/systemd/system/tripleo_metrics_qdr.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 07:52:16 np0005626463.localdomain sudo[54674]: pam_unix(sudo:session): session closed for user root
Feb 23 07:52:16 np0005626463.localdomain sudo[54690]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lcxwbavimmlbbabasgixsrffmrrdrsar ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 07:52:16 np0005626463.localdomain sudo[54690]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:52:16 np0005626463.localdomain python3[54692]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_metrics_qdr_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 23 07:52:16 np0005626463.localdomain sudo[54690]: pam_unix(sudo:session): session closed for user root
Feb 23 07:52:17 np0005626463.localdomain sudo[54751]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-blwoerpckoqqfwafiayjuvkjykwecgax ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 07:52:17 np0005626463.localdomain sudo[54751]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:52:17 np0005626463.localdomain python3[54753]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771833136.6384125-85348-131987307837601/source dest=/etc/systemd/system/tripleo_metrics_qdr.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 07:52:17 np0005626463.localdomain sudo[54751]: pam_unix(sudo:session): session closed for user root
Feb 23 07:52:17 np0005626463.localdomain sudo[54767]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zfptxgvjvpktdtzloaplgdvejgeffott ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 07:52:17 np0005626463.localdomain sudo[54767]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:52:17 np0005626463.localdomain python3[54769]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 23 07:52:17 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 07:52:17 np0005626463.localdomain systemd-rc-local-generator[54791]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 07:52:17 np0005626463.localdomain systemd-sysv-generator[54797]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 07:52:17 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 07:52:17 np0005626463.localdomain sudo[54767]: pam_unix(sudo:session): session closed for user root
Feb 23 07:52:18 np0005626463.localdomain sshd[54806]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 07:52:18 np0005626463.localdomain sudo[54821]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rtqpekiiioafpacvdgwdpzmkxanackmr ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 07:52:18 np0005626463.localdomain sudo[54821]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:52:18 np0005626463.localdomain python3[54823]: ansible-systemd Invoked with state=restarted name=tripleo_metrics_qdr.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 07:52:18 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 07:52:18 np0005626463.localdomain sshd[54806]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 07:52:18 np0005626463.localdomain systemd-sysv-generator[54851]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 07:52:18 np0005626463.localdomain systemd-rc-local-generator[54847]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 07:52:18 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 07:52:18 np0005626463.localdomain systemd[1]: Starting metrics_qdr container...
Feb 23 07:52:18 np0005626463.localdomain systemd[1]: Started metrics_qdr container.
Feb 23 07:52:18 np0005626463.localdomain sudo[54821]: pam_unix(sudo:session): session closed for user root
Feb 23 07:52:19 np0005626463.localdomain sudo[54900]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dhscfbfziizafszthdpwfbysxffbqyhf ; /usr/bin/python3
Feb 23 07:52:19 np0005626463.localdomain sudo[54900]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:52:19 np0005626463.localdomain python3[54902]: ansible-file Invoked with path=/var/lib/container-puppet/container-puppet-tasks1.json state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 07:52:19 np0005626463.localdomain sudo[54900]: pam_unix(sudo:session): session closed for user root
Feb 23 07:52:19 np0005626463.localdomain sudo[54948]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fpvcwhvarieoydnzctynfpocosqsxaoy ; /usr/bin/python3
Feb 23 07:52:19 np0005626463.localdomain sudo[54948]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:52:19 np0005626463.localdomain sudo[54948]: pam_unix(sudo:session): session closed for user root
Feb 23 07:52:20 np0005626463.localdomain sudo[54991]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wydwsyznawihcjpgewqliaeinsxpjxle ; /usr/bin/python3
Feb 23 07:52:20 np0005626463.localdomain sudo[54991]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:52:20 np0005626463.localdomain sudo[54991]: pam_unix(sudo:session): session closed for user root
Feb 23 07:52:20 np0005626463.localdomain sudo[55021]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vwqcieatyjotvdrcgikoqhvimbnbyzxs ; /usr/bin/python3
Feb 23 07:52:20 np0005626463.localdomain sudo[55021]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:52:20 np0005626463.localdomain python3[55023]: ansible-container_puppet_config Invoked with check_mode=False config_vol_prefix=/var/lib/config-data debug=True net_host=True no_archive=True puppet_config=/var/lib/container-puppet/container-puppet-tasks1.json short_hostname=np0005626463 step=1 update_config_hash_only=False
Feb 23 07:52:20 np0005626463.localdomain sudo[55021]: pam_unix(sudo:session): session closed for user root
Feb 23 07:52:21 np0005626463.localdomain sudo[55037]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kpjpcnqgtxtjfvyryxmhmgyqsuchiirm ; /usr/bin/python3
Feb 23 07:52:21 np0005626463.localdomain sudo[55037]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:52:21 np0005626463.localdomain python3[55039]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 07:52:21 np0005626463.localdomain sudo[55037]: pam_unix(sudo:session): session closed for user root
Feb 23 07:52:21 np0005626463.localdomain sudo[55053]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tqdzwshlzotrzmxezfkaxpibykvukirg ; /usr/bin/python3
Feb 23 07:52:21 np0005626463.localdomain sudo[55053]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:52:21 np0005626463.localdomain python3[55055]: ansible-container_config_data Invoked with config_path=/var/lib/tripleo-config/container-puppet-config/step_1 config_pattern=container-puppet-*.json config_overrides={} debug=True
Feb 23 07:52:21 np0005626463.localdomain sudo[55053]: pam_unix(sudo:session): session closed for user root
Feb 23 07:52:40 np0005626463.localdomain sshd[55056]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 07:52:41 np0005626463.localdomain sshd[55056]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 07:52:46 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.
Feb 23 07:52:46 np0005626463.localdomain systemd[1]: tmp-crun.iVbmdT.mount: Deactivated successfully.
Feb 23 07:52:46 np0005626463.localdomain podman[55058]: 2026-02-23 07:52:46.930019352 +0000 UTC m=+0.098481529 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, vcs-type=git, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:14Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, vendor=Red Hat, Inc., vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, release=1766032510, version=17.1.13, batch=17.1_20260112.1, container_name=metrics_qdr)
Feb 23 07:52:47 np0005626463.localdomain podman[55058]: 2026-02-23 07:52:47.165247212 +0000 UTC m=+0.333709419 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, distribution-scope=public, vcs-type=git, version=17.1.13, io.buildah.version=1.41.5, io.openshift.expose-services=, name=rhosp-rhel9/openstack-qdrouterd, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:10:14Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, vendor=Red Hat, Inc., build-date=2026-01-12T22:10:14Z, container_name=metrics_qdr, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee)
Feb 23 07:52:47 np0005626463.localdomain systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully.
Feb 23 07:52:55 np0005626463.localdomain sudo[55088]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 07:52:55 np0005626463.localdomain sudo[55088]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 07:52:55 np0005626463.localdomain sudo[55088]: pam_unix(sudo:session): session closed for user root
Feb 23 07:52:55 np0005626463.localdomain sudo[55103]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 23 07:52:55 np0005626463.localdomain sudo[55103]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 07:52:56 np0005626463.localdomain sudo[55103]: pam_unix(sudo:session): session closed for user root
Feb 23 07:52:56 np0005626463.localdomain sudo[55150]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 07:52:56 np0005626463.localdomain sudo[55150]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 07:52:56 np0005626463.localdomain sudo[55150]: pam_unix(sudo:session): session closed for user root
Feb 23 07:52:58 np0005626463.localdomain sshd[55165]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 07:52:58 np0005626463.localdomain sshd[55165]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 07:53:17 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.
Feb 23 07:53:17 np0005626463.localdomain podman[55167]: 2026-02-23 07:53:17.920229013 +0000 UTC m=+0.091409036 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, config_id=tripleo_step1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, architecture=x86_64, version=17.1.13, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, url=https://www.redhat.com, release=1766032510, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, batch=17.1_20260112.1, build-date=2026-01-12T22:10:14Z, container_name=metrics_qdr, org.opencontainers.image.created=2026-01-12T22:10:14Z, name=rhosp-rhel9/openstack-qdrouterd, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 23 07:53:18 np0005626463.localdomain podman[55167]: 2026-02-23 07:53:18.114964232 +0000 UTC m=+0.286144295 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.buildah.version=1.41.5, distribution-scope=public, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step1, vendor=Red Hat, Inc., architecture=x86_64, build-date=2026-01-12T22:10:14Z, maintainer=OpenStack TripleO Team, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1766032510, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com)
Feb 23 07:53:18 np0005626463.localdomain systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully.
Feb 23 07:53:39 np0005626463.localdomain sshd[55196]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 07:53:40 np0005626463.localdomain sshd[55196]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 07:53:40 np0005626463.localdomain sshd[55198]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 07:53:40 np0005626463.localdomain sshd[55198]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 07:53:48 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.
Feb 23 07:53:48 np0005626463.localdomain systemd[1]: tmp-crun.5zsS08.mount: Deactivated successfully.
Feb 23 07:53:48 np0005626463.localdomain podman[55200]: 2026-02-23 07:53:48.910436471 +0000 UTC m=+0.086378874 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-qdrouterd-container, version=17.1.13, architecture=x86_64, name=rhosp-rhel9/openstack-qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, managed_by=tripleo_ansible, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, container_name=metrics_qdr, distribution-scope=public, build-date=2026-01-12T22:10:14Z, batch=17.1_20260112.1, url=https://www.redhat.com, io.openshift.expose-services=, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc.)
Feb 23 07:53:49 np0005626463.localdomain podman[55200]: 2026-02-23 07:53:49.125115048 +0000 UTC m=+0.301057411 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, architecture=x86_64, build-date=2026-01-12T22:10:14Z, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=metrics_qdr, managed_by=tripleo_ansible, version=17.1.13, vendor=Red Hat, Inc., distribution-scope=public, vcs-type=git, release=1766032510, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:14Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, url=https://www.redhat.com)
Feb 23 07:53:49 np0005626463.localdomain systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully.
Feb 23 07:53:56 np0005626463.localdomain sudo[55229]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 07:53:56 np0005626463.localdomain sudo[55229]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 07:53:56 np0005626463.localdomain sudo[55229]: pam_unix(sudo:session): session closed for user root
Feb 23 07:53:56 np0005626463.localdomain sudo[55244]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 23 07:53:56 np0005626463.localdomain sudo[55244]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 07:53:57 np0005626463.localdomain sudo[55244]: pam_unix(sudo:session): session closed for user root
Feb 23 07:53:58 np0005626463.localdomain sudo[55292]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 07:53:58 np0005626463.localdomain sudo[55292]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 07:53:58 np0005626463.localdomain sudo[55292]: pam_unix(sudo:session): session closed for user root
Feb 23 07:54:19 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.
Feb 23 07:54:19 np0005626463.localdomain systemd[1]: tmp-crun.kR7Ivp.mount: Deactivated successfully.
Feb 23 07:54:19 np0005626463.localdomain podman[55307]: 2026-02-23 07:54:19.91578942 +0000 UTC m=+0.091147202 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, org.opencontainers.image.created=2026-01-12T22:10:14Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., build-date=2026-01-12T22:10:14Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-qdrouterd, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, url=https://www.redhat.com, architecture=x86_64, managed_by=tripleo_ansible, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, config_id=tripleo_step1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, vcs-type=git, com.redhat.component=openstack-qdrouterd-container)
Feb 23 07:54:20 np0005626463.localdomain podman[55307]: 2026-02-23 07:54:20.123338095 +0000 UTC m=+0.298695887 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, version=17.1.13, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:10:14Z, batch=17.1_20260112.1, architecture=x86_64, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, build-date=2026-01-12T22:10:14Z, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, config_id=tripleo_step1, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true)
Feb 23 07:54:20 np0005626463.localdomain systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully.
Feb 23 07:54:20 np0005626463.localdomain sshd[55336]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 07:54:21 np0005626463.localdomain sshd[55336]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 07:54:36 np0005626463.localdomain sshd[55338]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 07:54:36 np0005626463.localdomain sshd[55338]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 07:54:50 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.
Feb 23 07:54:50 np0005626463.localdomain systemd[1]: tmp-crun.CwkCtg.mount: Deactivated successfully.
Feb 23 07:54:50 np0005626463.localdomain podman[55340]: 2026-02-23 07:54:50.908167573 +0000 UTC m=+0.082886835 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, release=1766032510, architecture=x86_64, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-qdrouterd-container, distribution-scope=public, name=rhosp-rhel9/openstack-qdrouterd, tcib_managed=true, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:14Z, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1)
Feb 23 07:54:51 np0005626463.localdomain podman[55340]: 2026-02-23 07:54:51.122596946 +0000 UTC m=+0.297316188 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, batch=17.1_20260112.1, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, release=1766032510, url=https://www.redhat.com, vcs-type=git, container_name=metrics_qdr, org.opencontainers.image.created=2026-01-12T22:10:14Z, build-date=2026-01-12T22:10:14Z, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, io.openshift.expose-services=, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, maintainer=OpenStack TripleO Team, config_id=tripleo_step1)
Feb 23 07:54:51 np0005626463.localdomain systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully.
Feb 23 07:54:58 np0005626463.localdomain sudo[55369]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 07:54:58 np0005626463.localdomain sudo[55369]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 07:54:58 np0005626463.localdomain sudo[55369]: pam_unix(sudo:session): session closed for user root
Feb 23 07:54:58 np0005626463.localdomain sudo[55384]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 23 07:54:58 np0005626463.localdomain sudo[55384]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 07:54:59 np0005626463.localdomain sudo[55384]: pam_unix(sudo:session): session closed for user root
Feb 23 07:54:59 np0005626463.localdomain sudo[55430]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 07:54:59 np0005626463.localdomain sudo[55430]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 07:54:59 np0005626463.localdomain sudo[55430]: pam_unix(sudo:session): session closed for user root
Feb 23 07:55:02 np0005626463.localdomain sshd[55445]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 07:55:03 np0005626463.localdomain sshd[55445]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 07:55:19 np0005626463.localdomain sshd[55447]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 07:55:20 np0005626463.localdomain sshd[55447]: Invalid user teste from 80.94.95.116 port 20234
Feb 23 07:55:21 np0005626463.localdomain sshd[55447]: Connection closed by invalid user teste 80.94.95.116 port 20234 [preauth]
Feb 23 07:55:21 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.
Feb 23 07:55:21 np0005626463.localdomain podman[55449]: 2026-02-23 07:55:21.364617151 +0000 UTC m=+0.074052225 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, distribution-scope=public, architecture=x86_64, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step1, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:10:14Z, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:10:14Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=metrics_qdr, name=rhosp-rhel9/openstack-qdrouterd, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, release=1766032510, vendor=Red Hat, Inc.)
Feb 23 07:55:21 np0005626463.localdomain podman[55449]: 2026-02-23 07:55:21.588482762 +0000 UTC m=+0.297917796 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, version=17.1.13, url=https://www.redhat.com, vcs-type=git, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, io.openshift.expose-services=, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, config_id=tripleo_step1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, distribution-scope=public)
Feb 23 07:55:21 np0005626463.localdomain systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully.
Feb 23 07:55:31 np0005626463.localdomain sshd[55478]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 07:55:32 np0005626463.localdomain sshd[55478]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 07:55:44 np0005626463.localdomain sshd[55480]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 07:55:44 np0005626463.localdomain sshd[55480]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 07:55:51 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.
Feb 23 07:55:51 np0005626463.localdomain systemd[1]: tmp-crun.QrwgpO.mount: Deactivated successfully.
Feb 23 07:55:51 np0005626463.localdomain podman[55482]: 2026-02-23 07:55:51.908510504 +0000 UTC m=+0.085814295 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:14Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, container_name=metrics_qdr, vcs-type=git, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-qdrouterd, io.buildah.version=1.41.5, build-date=2026-01-12T22:10:14Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1766032510, url=https://www.redhat.com, config_id=tripleo_step1, com.redhat.component=openstack-qdrouterd-container, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, version=17.1.13, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1)
Feb 23 07:55:52 np0005626463.localdomain podman[55482]: 2026-02-23 07:55:52.105337781 +0000 UTC m=+0.282641612 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.created=2026-01-12T22:10:14Z, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, distribution-scope=public, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, url=https://www.redhat.com, io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, config_id=tripleo_step1, batch=17.1_20260112.1, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-qdrouterd, build-date=2026-01-12T22:10:14Z, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, release=1766032510, description=Red Hat OpenStack Platform 17.1 qdrouterd)
Feb 23 07:55:52 np0005626463.localdomain systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully.
Feb 23 07:56:00 np0005626463.localdomain sudo[55512]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 07:56:00 np0005626463.localdomain sudo[55512]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 07:56:00 np0005626463.localdomain sudo[55512]: pam_unix(sudo:session): session closed for user root
Feb 23 07:56:00 np0005626463.localdomain sudo[55527]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 23 07:56:00 np0005626463.localdomain sudo[55527]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 07:56:00 np0005626463.localdomain sudo[55527]: pam_unix(sudo:session): session closed for user root
Feb 23 07:56:01 np0005626463.localdomain sudo[55573]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 07:56:01 np0005626463.localdomain sudo[55573]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 07:56:01 np0005626463.localdomain sudo[55573]: pam_unix(sudo:session): session closed for user root
Feb 23 07:56:22 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.
Feb 23 07:56:22 np0005626463.localdomain systemd[1]: tmp-crun.w3okcp.mount: Deactivated successfully.
Feb 23 07:56:22 np0005626463.localdomain podman[55589]: 2026-02-23 07:56:22.913600585 +0000 UTC m=+0.084202375 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, name=rhosp-rhel9/openstack-qdrouterd, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, build-date=2026-01-12T22:10:14Z, config_id=tripleo_step1, org.opencontainers.image.created=2026-01-12T22:10:14Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., tcib_managed=true, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, io.openshift.expose-services=, url=https://www.redhat.com, batch=17.1_20260112.1, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=metrics_qdr, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd)
Feb 23 07:56:23 np0005626463.localdomain podman[55589]: 2026-02-23 07:56:23.145064013 +0000 UTC m=+0.315665763 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, container_name=metrics_qdr, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:10:14Z, release=1766032510, vendor=Red Hat, Inc., architecture=x86_64, distribution-scope=public, batch=17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, config_id=tripleo_step1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 23 07:56:23 np0005626463.localdomain systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully.
Feb 23 07:56:25 np0005626463.localdomain sshd[55618]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 07:56:26 np0005626463.localdomain sshd[55618]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 07:56:27 np0005626463.localdomain sshd[55620]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 07:56:29 np0005626463.localdomain sshd[55620]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 07:56:53 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.
Feb 23 07:56:53 np0005626463.localdomain podman[55622]: 2026-02-23 07:56:53.902347499 +0000 UTC m=+0.080881762 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.41.5, batch=17.1_20260112.1, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, config_id=tripleo_step1, com.redhat.component=openstack-qdrouterd-container, version=17.1.13, url=https://www.redhat.com, io.openshift.expose-services=, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:10:14Z, architecture=x86_64, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:10:14Z, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd)
Feb 23 07:56:54 np0005626463.localdomain podman[55622]: 2026-02-23 07:56:54.089437915 +0000 UTC m=+0.267972138 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, url=https://www.redhat.com, managed_by=tripleo_ansible, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:14Z, batch=17.1_20260112.1, build-date=2026-01-12T22:10:14Z, config_id=tripleo_step1)
Feb 23 07:56:54 np0005626463.localdomain systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully.
Feb 23 07:57:01 np0005626463.localdomain sudo[55652]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 07:57:01 np0005626463.localdomain sudo[55652]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 07:57:01 np0005626463.localdomain sudo[55652]: pam_unix(sudo:session): session closed for user root
Feb 23 07:57:01 np0005626463.localdomain sudo[55667]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 23 07:57:01 np0005626463.localdomain sudo[55667]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 07:57:02 np0005626463.localdomain sudo[55667]: pam_unix(sudo:session): session closed for user root
Feb 23 07:57:02 np0005626463.localdomain sudo[55714]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 07:57:02 np0005626463.localdomain sudo[55714]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 07:57:02 np0005626463.localdomain sudo[55714]: pam_unix(sudo:session): session closed for user root
Feb 23 07:57:06 np0005626463.localdomain sshd[55729]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 07:57:07 np0005626463.localdomain sshd[55729]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 07:57:22 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 20 pg[2.0( empty local-lis/les=0/0 n=0 ec=20/20 lis/c=0/0 les/c/f=0/0/0 sis=20) [3,5,1] r=1 lpr=20 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 23 07:57:23 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 22 pg[3.0( empty local-lis/les=0/0 n=0 ec=22/22 lis/c=0/0 les/c/f=0/0/0 sis=22) [5,3,1] r=0 lpr=22 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 23 07:57:24 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 23 pg[3.0( empty local-lis/les=22/23 n=0 ec=22/22 lis/c=0/0 les/c/f=0/0/0 sis=22) [5,3,1] r=0 lpr=22 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 23 07:57:24 np0005626463.localdomain sshd[55731]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 07:57:24 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.
Feb 23 07:57:24 np0005626463.localdomain podman[55732]: 2026-02-23 07:57:24.906289379 +0000 UTC m=+0.079972914 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vendor=Red Hat, Inc., com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, url=https://www.redhat.com, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, build-date=2026-01-12T22:10:14Z, release=1766032510, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, name=rhosp-rhel9/openstack-qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, tcib_managed=true, version=17.1.13, config_id=tripleo_step1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public)
Feb 23 07:57:25 np0005626463.localdomain sshd[55731]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 07:57:25 np0005626463.localdomain podman[55732]: 2026-02-23 07:57:25.124535559 +0000 UTC m=+0.298219134 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_id=tripleo_step1, com.redhat.component=openstack-qdrouterd-container, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=metrics_qdr, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.openshift.expose-services=, tcib_managed=true, name=rhosp-rhel9/openstack-qdrouterd, build-date=2026-01-12T22:10:14Z, release=1766032510, version=17.1.13, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, distribution-scope=public)
Feb 23 07:57:25 np0005626463.localdomain systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully.
Feb 23 07:57:26 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 24 pg[4.0( empty local-lis/les=0/0 n=0 ec=24/24 lis/c=0/0 les/c/f=0/0/0 sis=24) [4,0,5] r=2 lpr=24 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 23 07:57:27 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 26 pg[5.0( empty local-lis/les=0/0 n=0 ec=26/26 lis/c=0/0 les/c/f=0/0/0 sis=26) [2,4,3] r=0 lpr=26 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 23 07:57:28 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 27 pg[3.0( empty local-lis/les=22/23 n=0 ec=22/22 lis/c=22/22 les/c/f=23/23/0 sis=27 pruub=12.108013153s) [5,3,1] r=0 lpr=27 pi=[22,27)/1 crt=0'0 mlcod 0'0 active pruub 1113.961425781s@ mbc={}] start_peering_interval up [5,3,1] -> [5,3,1], acting [5,3,1] -> [5,3,1], acting_primary 5 -> 5, up_primary 5 -> 5, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:57:28 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 27 pg[3.0( empty local-lis/les=22/23 n=0 ec=22/22 lis/c=22/22 les/c/f=23/23/0 sis=27 pruub=12.108013153s) [5,3,1] r=0 lpr=27 pi=[22,27)/1 crt=0'0 mlcod 0'0 unknown pruub 1113.961425781s@ mbc={}] state<Start>: transitioning to Primary
Feb 23 07:57:28 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 27 pg[2.0( empty local-lis/les=20/21 n=0 ec=20/20 lis/c=20/20 les/c/f=21/21/0 sis=27 pruub=9.998037338s) [3,5,1] r=1 lpr=27 pi=[20,27)/1 crt=0'0 mlcod 0'0 active pruub 1111.856933594s@ mbc={}] start_peering_interval up [3,5,1] -> [3,5,1], acting [3,5,1] -> [3,5,1], acting_primary 3 -> 3, up_primary 3 -> 3, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:57:28 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 27 pg[2.0( empty local-lis/les=20/21 n=0 ec=20/20 lis/c=20/20 les/c/f=21/21/0 sis=27 pruub=9.994277954s) [3,5,1] r=1 lpr=27 pi=[20,27)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1111.856933594s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:57:28 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 27 pg[5.0( empty local-lis/les=26/27 n=0 ec=26/26 lis/c=0/0 les/c/f=0/0/0 sis=26) [2,4,3] r=0 lpr=26 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 23 07:57:29 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 28 pg[3.19( empty local-lis/les=22/23 n=0 ec=27/22 lis/c=22/22 les/c/f=23/23/0 sis=27) [5,3,1] r=0 lpr=27 pi=[22,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 23 07:57:29 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 28 pg[2.19( empty local-lis/les=20/21 n=0 ec=27/20 lis/c=20/20 les/c/f=21/21/0 sis=27) [3,5,1] r=1 lpr=27 pi=[20,27)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 23 07:57:29 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 28 pg[3.18( empty local-lis/les=22/23 n=0 ec=27/22 lis/c=22/22 les/c/f=23/23/0 sis=27) [5,3,1] r=0 lpr=27 pi=[22,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 23 07:57:29 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 28 pg[3.16( empty local-lis/les=22/23 n=0 ec=27/22 lis/c=22/22 les/c/f=23/23/0 sis=27) [5,3,1] r=0 lpr=27 pi=[22,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 23 07:57:29 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 28 pg[2.17( empty local-lis/les=20/21 n=0 ec=27/20 lis/c=20/20 les/c/f=21/21/0 sis=27) [3,5,1] r=1 lpr=27 pi=[20,27)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 23 07:57:29 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 28 pg[2.16( empty local-lis/les=20/21 n=0 ec=27/20 lis/c=20/20 les/c/f=21/21/0 sis=27) [3,5,1] r=1 lpr=27 pi=[20,27)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 23 07:57:29 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 28 pg[3.17( empty local-lis/les=22/23 n=0 ec=27/22 lis/c=22/22 les/c/f=23/23/0 sis=27) [5,3,1] r=0 lpr=27 pi=[22,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 23 07:57:29 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 28 pg[2.18( empty local-lis/les=20/21 n=0 ec=27/20 lis/c=20/20 les/c/f=21/21/0 sis=27) [3,5,1] r=1 lpr=27 pi=[20,27)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 23 07:57:29 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 28 pg[3.14( empty local-lis/les=22/23 n=0 ec=27/22 lis/c=22/22 les/c/f=23/23/0 sis=27) [5,3,1] r=0 lpr=27 pi=[22,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 23 07:57:29 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 28 pg[2.15( empty local-lis/les=20/21 n=0 ec=27/20 lis/c=20/20 les/c/f=21/21/0 sis=27) [3,5,1] r=1 lpr=27 pi=[20,27)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 23 07:57:29 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 28 pg[3.15( empty local-lis/les=22/23 n=0 ec=27/22 lis/c=22/22 les/c/f=23/23/0 sis=27) [5,3,1] r=0 lpr=27 pi=[22,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 23 07:57:29 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 28 pg[3.12( empty local-lis/les=22/23 n=0 ec=27/22 lis/c=22/22 les/c/f=23/23/0 sis=27) [5,3,1] r=0 lpr=27 pi=[22,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 23 07:57:29 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 28 pg[2.13( empty local-lis/les=20/21 n=0 ec=27/20 lis/c=20/20 les/c/f=21/21/0 sis=27) [3,5,1] r=1 lpr=27 pi=[20,27)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 23 07:57:29 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 28 pg[2.12( empty local-lis/les=20/21 n=0 ec=27/20 lis/c=20/20 les/c/f=21/21/0 sis=27) [3,5,1] r=1 lpr=27 pi=[20,27)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 23 07:57:29 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 28 pg[2.11( empty local-lis/les=20/21 n=0 ec=27/20 lis/c=20/20 les/c/f=21/21/0 sis=27) [3,5,1] r=1 lpr=27 pi=[20,27)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 23 07:57:29 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 28 pg[3.10( empty local-lis/les=22/23 n=0 ec=27/22 lis/c=22/22 les/c/f=23/23/0 sis=27) [5,3,1] r=0 lpr=27 pi=[22,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 23 07:57:29 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 28 pg[2.10( empty local-lis/les=20/21 n=0 ec=27/20 lis/c=20/20 les/c/f=21/21/0 sis=27) [3,5,1] r=1 lpr=27 pi=[20,27)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 23 07:57:29 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 28 pg[3.11( empty local-lis/les=22/23 n=0 ec=27/22 lis/c=22/22 les/c/f=23/23/0 sis=27) [5,3,1] r=0 lpr=27 pi=[22,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 23 07:57:29 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 28 pg[2.f( empty local-lis/les=20/21 n=0 ec=27/20 lis/c=20/20 les/c/f=21/21/0 sis=27) [3,5,1] r=1 lpr=27 pi=[20,27)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 23 07:57:29 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 28 pg[3.e( empty local-lis/les=22/23 n=0 ec=27/22 lis/c=22/22 les/c/f=23/23/0 sis=27) [5,3,1] r=0 lpr=27 pi=[22,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 23 07:57:29 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 28 pg[3.13( empty local-lis/les=22/23 n=0 ec=27/22 lis/c=22/22 les/c/f=23/23/0 sis=27) [5,3,1] r=0 lpr=27 pi=[22,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 23 07:57:29 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 28 pg[3.f( empty local-lis/les=22/23 n=0 ec=27/22 lis/c=22/22 les/c/f=23/23/0 sis=27) [5,3,1] r=0 lpr=27 pi=[22,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 23 07:57:29 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 28 pg[2.e( empty local-lis/les=20/21 n=0 ec=27/20 lis/c=20/20 les/c/f=21/21/0 sis=27) [3,5,1] r=1 lpr=27 pi=[20,27)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 23 07:57:29 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 28 pg[2.14( empty local-lis/les=20/21 n=0 ec=27/20 lis/c=20/20 les/c/f=21/21/0 sis=27) [3,5,1] r=1 lpr=27 pi=[20,27)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 23 07:57:29 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 28 pg[2.d( empty local-lis/les=20/21 n=0 ec=27/20 lis/c=20/20 les/c/f=21/21/0 sis=27) [3,5,1] r=1 lpr=27 pi=[20,27)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 23 07:57:29 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 28 pg[3.c( empty local-lis/les=22/23 n=0 ec=27/22 lis/c=22/22 les/c/f=23/23/0 sis=27) [5,3,1] r=0 lpr=27 pi=[22,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 23 07:57:29 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 28 pg[3.a( empty local-lis/les=22/23 n=0 ec=27/22 lis/c=22/22 les/c/f=23/23/0 sis=27) [5,3,1] r=0 lpr=27 pi=[22,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 23 07:57:29 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 28 pg[2.c( empty local-lis/les=20/21 n=0 ec=27/20 lis/c=20/20 les/c/f=21/21/0 sis=27) [3,5,1] r=1 lpr=27 pi=[20,27)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 23 07:57:29 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 28 pg[3.d( empty local-lis/les=22/23 n=0 ec=27/22 lis/c=22/22 les/c/f=23/23/0 sis=27) [5,3,1] r=0 lpr=27 pi=[22,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 23 07:57:29 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 28 pg[2.a( empty local-lis/les=20/21 n=0 ec=27/20 lis/c=20/20 les/c/f=21/21/0 sis=27) [3,5,1] r=1 lpr=27 pi=[20,27)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 23 07:57:29 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 28 pg[3.b( empty local-lis/les=22/23 n=0 ec=27/22 lis/c=22/22 les/c/f=23/23/0 sis=27) [5,3,1] r=0 lpr=27 pi=[22,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 23 07:57:29 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 28 pg[2.3( empty local-lis/les=20/21 n=0 ec=27/20 lis/c=20/20 les/c/f=21/21/0 sis=27) [3,5,1] r=1 lpr=27 pi=[20,27)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 23 07:57:29 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 28 pg[3.1( empty local-lis/les=22/23 n=0 ec=27/22 lis/c=22/22 les/c/f=23/23/0 sis=27) [5,3,1] r=0 lpr=27 pi=[22,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 23 07:57:29 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 28 pg[3.2( empty local-lis/les=22/23 n=0 ec=27/22 lis/c=22/22 les/c/f=23/23/0 sis=27) [5,3,1] r=0 lpr=27 pi=[22,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 23 07:57:29 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 28 pg[2.b( empty local-lis/les=20/21 n=0 ec=27/20 lis/c=20/20 les/c/f=21/21/0 sis=27) [3,5,1] r=1 lpr=27 pi=[20,27)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 23 07:57:29 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 28 pg[2.1( empty local-lis/les=20/21 n=0 ec=27/20 lis/c=20/20 les/c/f=21/21/0 sis=27) [3,5,1] r=1 lpr=27 pi=[20,27)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 23 07:57:29 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 28 pg[2.2( empty local-lis/les=20/21 n=0 ec=27/20 lis/c=20/20 les/c/f=21/21/0 sis=27) [3,5,1] r=1 lpr=27 pi=[20,27)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 23 07:57:29 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 28 pg[3.7( empty local-lis/les=22/23 n=0 ec=27/22 lis/c=22/22 les/c/f=23/23/0 sis=27) [5,3,1] r=0 lpr=27 pi=[22,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 23 07:57:29 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 28 pg[2.6( empty local-lis/les=20/21 n=0 ec=27/20 lis/c=20/20 les/c/f=21/21/0 sis=27) [3,5,1] r=1 lpr=27 pi=[20,27)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 23 07:57:29 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 28 pg[3.3( empty local-lis/les=22/23 n=0 ec=27/22 lis/c=22/22 les/c/f=23/23/0 sis=27) [5,3,1] r=0 lpr=27 pi=[22,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 23 07:57:29 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 28 pg[2.4( empty local-lis/les=20/21 n=0 ec=27/20 lis/c=20/20 les/c/f=21/21/0 sis=27) [3,5,1] r=1 lpr=27 pi=[20,27)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 23 07:57:29 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 28 pg[3.5( empty local-lis/les=22/23 n=0 ec=27/22 lis/c=22/22 les/c/f=23/23/0 sis=27) [5,3,1] r=0 lpr=27 pi=[22,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 23 07:57:29 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 28 pg[3.4( empty local-lis/les=22/23 n=0 ec=27/22 lis/c=22/22 les/c/f=23/23/0 sis=27) [5,3,1] r=0 lpr=27 pi=[22,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 23 07:57:29 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 28 pg[2.5( empty local-lis/les=20/21 n=0 ec=27/20 lis/c=20/20 les/c/f=21/21/0 sis=27) [3,5,1] r=1 lpr=27 pi=[20,27)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 23 07:57:29 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 28 pg[2.8( empty local-lis/les=20/21 n=0 ec=27/20 lis/c=20/20 les/c/f=21/21/0 sis=27) [3,5,1] r=1 lpr=27 pi=[20,27)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 23 07:57:29 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 28 pg[3.6( empty local-lis/les=22/23 n=0 ec=27/22 lis/c=22/22 les/c/f=23/23/0 sis=27) [5,3,1] r=0 lpr=27 pi=[22,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 23 07:57:29 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 28 pg[2.7( empty local-lis/les=20/21 n=0 ec=27/20 lis/c=20/20 les/c/f=21/21/0 sis=27) [3,5,1] r=1 lpr=27 pi=[20,27)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 23 07:57:29 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 28 pg[3.8( empty local-lis/les=22/23 n=0 ec=27/22 lis/c=22/22 les/c/f=23/23/0 sis=27) [5,3,1] r=0 lpr=27 pi=[22,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 23 07:57:29 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 28 pg[2.9( empty local-lis/les=20/21 n=0 ec=27/20 lis/c=20/20 les/c/f=21/21/0 sis=27) [3,5,1] r=1 lpr=27 pi=[20,27)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 23 07:57:29 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 28 pg[3.9( empty local-lis/les=22/23 n=0 ec=27/22 lis/c=22/22 les/c/f=23/23/0 sis=27) [5,3,1] r=0 lpr=27 pi=[22,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 23 07:57:29 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 28 pg[3.1b( empty local-lis/les=22/23 n=0 ec=27/22 lis/c=22/22 les/c/f=23/23/0 sis=27) [5,3,1] r=0 lpr=27 pi=[22,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 23 07:57:29 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 28 pg[2.1a( empty local-lis/les=20/21 n=0 ec=27/20 lis/c=20/20 les/c/f=21/21/0 sis=27) [3,5,1] r=1 lpr=27 pi=[20,27)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 23 07:57:29 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 28 pg[3.1a( empty local-lis/les=22/23 n=0 ec=27/22 lis/c=22/22 les/c/f=23/23/0 sis=27) [5,3,1] r=0 lpr=27 pi=[22,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 23 07:57:29 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 28 pg[2.1b( empty local-lis/les=20/21 n=0 ec=27/20 lis/c=20/20 les/c/f=21/21/0 sis=27) [3,5,1] r=1 lpr=27 pi=[20,27)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 23 07:57:29 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 28 pg[3.1d( empty local-lis/les=22/23 n=0 ec=27/22 lis/c=22/22 les/c/f=23/23/0 sis=27) [5,3,1] r=0 lpr=27 pi=[22,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 23 07:57:29 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 28 pg[3.1c( empty local-lis/les=22/23 n=0 ec=27/22 lis/c=22/22 les/c/f=23/23/0 sis=27) [5,3,1] r=0 lpr=27 pi=[22,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 23 07:57:29 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 28 pg[2.1c( empty local-lis/les=20/21 n=0 ec=27/20 lis/c=20/20 les/c/f=21/21/0 sis=27) [3,5,1] r=1 lpr=27 pi=[20,27)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 23 07:57:29 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 28 pg[2.1d( empty local-lis/les=20/21 n=0 ec=27/20 lis/c=20/20 les/c/f=21/21/0 sis=27) [3,5,1] r=1 lpr=27 pi=[20,27)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 23 07:57:29 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 28 pg[3.1f( empty local-lis/les=22/23 n=0 ec=27/22 lis/c=22/22 les/c/f=23/23/0 sis=27) [5,3,1] r=0 lpr=27 pi=[22,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 23 07:57:29 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 28 pg[2.1e( empty local-lis/les=20/21 n=0 ec=27/20 lis/c=20/20 les/c/f=21/21/0 sis=27) [3,5,1] r=1 lpr=27 pi=[20,27)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 23 07:57:29 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 28 pg[2.1f( empty local-lis/les=20/21 n=0 ec=27/20 lis/c=20/20 les/c/f=21/21/0 sis=27) [3,5,1] r=1 lpr=27 pi=[20,27)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 23 07:57:29 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 28 pg[3.1e( empty local-lis/les=22/23 n=0 ec=27/22 lis/c=22/22 les/c/f=23/23/0 sis=27) [5,3,1] r=0 lpr=27 pi=[22,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 23 07:57:29 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 28 pg[3.0( empty local-lis/les=27/28 n=0 ec=22/22 lis/c=22/22 les/c/f=23/23/0 sis=27) [5,3,1] r=0 lpr=27 pi=[22,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 23 07:57:29 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 28 pg[3.19( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=22/22 les/c/f=23/23/0 sis=27) [5,3,1] r=0 lpr=27 pi=[22,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 23 07:57:29 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 28 pg[3.14( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=22/22 les/c/f=23/23/0 sis=27) [5,3,1] r=0 lpr=27 pi=[22,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 23 07:57:29 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 28 pg[3.17( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=22/22 les/c/f=23/23/0 sis=27) [5,3,1] r=0 lpr=27 pi=[22,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 23 07:57:29 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 28 pg[3.18( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=22/22 les/c/f=23/23/0 sis=27) [5,3,1] r=0 lpr=27 pi=[22,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 23 07:57:29 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 28 pg[3.15( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=22/22 les/c/f=23/23/0 sis=27) [5,3,1] r=0 lpr=27 pi=[22,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 23 07:57:29 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 28 pg[3.13( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=22/22 les/c/f=23/23/0 sis=27) [5,3,1] r=0 lpr=27 pi=[22,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 23 07:57:29 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 28 pg[3.12( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=22/22 les/c/f=23/23/0 sis=27) [5,3,1] r=0 lpr=27 pi=[22,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 23 07:57:29 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 28 pg[3.10( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=22/22 les/c/f=23/23/0 sis=27) [5,3,1] r=0 lpr=27 pi=[22,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 23 07:57:29 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 28 pg[3.11( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=22/22 les/c/f=23/23/0 sis=27) [5,3,1] r=0 lpr=27 pi=[22,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 23 07:57:29 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 28 pg[3.e( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=22/22 les/c/f=23/23/0 sis=27) [5,3,1] r=0 lpr=27 pi=[22,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 23 07:57:29 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 28 pg[3.c( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=22/22 les/c/f=23/23/0 sis=27) [5,3,1] r=0 lpr=27 pi=[22,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 23 07:57:29 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 28 pg[3.d( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=22/22 les/c/f=23/23/0 sis=27) [5,3,1] r=0 lpr=27 pi=[22,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 23 07:57:29 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 28 pg[3.f( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=22/22 les/c/f=23/23/0 sis=27) [5,3,1] r=0 lpr=27 pi=[22,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 23 07:57:29 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 28 pg[3.a( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=22/22 les/c/f=23/23/0 sis=27) [5,3,1] r=0 lpr=27 pi=[22,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 23 07:57:29 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 28 pg[3.b( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=22/22 les/c/f=23/23/0 sis=27) [5,3,1] r=0 lpr=27 pi=[22,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 23 07:57:29 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 28 pg[3.2( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=22/22 les/c/f=23/23/0 sis=27) [5,3,1] r=0 lpr=27 pi=[22,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 23 07:57:29 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 28 pg[3.3( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=22/22 les/c/f=23/23/0 sis=27) [5,3,1] r=0 lpr=27 pi=[22,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 23 07:57:29 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 28 pg[3.1( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=22/22 les/c/f=23/23/0 sis=27) [5,3,1] r=0 lpr=27 pi=[22,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 23 07:57:29 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 28 pg[3.6( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=22/22 les/c/f=23/23/0 sis=27) [5,3,1] r=0 lpr=27 pi=[22,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 23 07:57:29 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 28 pg[3.16( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=22/22 les/c/f=23/23/0 sis=27) [5,3,1] r=0 lpr=27 pi=[22,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 23 07:57:29 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 28 pg[3.5( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=22/22 les/c/f=23/23/0 sis=27) [5,3,1] r=0 lpr=27 pi=[22,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 23 07:57:29 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 28 pg[3.7( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=22/22 les/c/f=23/23/0 sis=27) [5,3,1] r=0 lpr=27 pi=[22,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 23 07:57:29 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 28 pg[3.4( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=22/22 les/c/f=23/23/0 sis=27) [5,3,1] r=0 lpr=27 pi=[22,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 23 07:57:29 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 28 pg[3.9( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=22/22 les/c/f=23/23/0 sis=27) [5,3,1] r=0 lpr=27 pi=[22,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 23 07:57:29 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 28 pg[3.1b( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=22/22 les/c/f=23/23/0 sis=27) [5,3,1] r=0 lpr=27 pi=[22,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 23 07:57:29 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 28 pg[3.8( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=22/22 les/c/f=23/23/0 sis=27) [5,3,1] r=0 lpr=27 pi=[22,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 23 07:57:29 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 28 pg[3.1a( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=22/22 les/c/f=23/23/0 sis=27) [5,3,1] r=0 lpr=27 pi=[22,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 23 07:57:29 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 28 pg[3.1c( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=22/22 les/c/f=23/23/0 sis=27) [5,3,1] r=0 lpr=27 pi=[22,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 23 07:57:29 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 28 pg[3.1f( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=22/22 les/c/f=23/23/0 sis=27) [5,3,1] r=0 lpr=27 pi=[22,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 23 07:57:29 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 28 pg[3.1d( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=22/22 les/c/f=23/23/0 sis=27) [5,3,1] r=0 lpr=27 pi=[22,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 23 07:57:29 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 28 pg[3.1e( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=22/22 les/c/f=23/23/0 sis=27) [5,3,1] r=0 lpr=27 pi=[22,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 23 07:57:32 np0005626463.localdomain ceph-osd[32575]: log_channel(cluster) log [DBG] : 3.0 scrub starts
Feb 23 07:57:32 np0005626463.localdomain ceph-osd[32575]: log_channel(cluster) log [DBG] : 3.0 scrub ok
Feb 23 07:57:33 np0005626463.localdomain ceph-osd[32575]: log_channel(cluster) log [DBG] : 3.19 scrub starts
Feb 23 07:57:33 np0005626463.localdomain ceph-osd[32575]: log_channel(cluster) log [DBG] : 3.19 scrub ok
Feb 23 07:57:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 31 pg[2.19( empty local-lis/les=27/28 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.933093071s) [4,0,2] r=-1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 active pruub 1118.913330078s@ mbc={}] start_peering_interval up [3,5,1] -> [4,0,2], acting [3,5,1] -> [4,0,2], acting_primary 3 -> 4, up_primary 3 -> 4, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:57:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 31 pg[2.19( empty local-lis/les=27/28 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.933013916s) [4,0,2] r=-1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1118.913330078s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:57:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 31 pg[3.18( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.924542427s) [4,2,3] r=-1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 active pruub 1118.905029297s@ mbc={}] start_peering_interval up [5,3,1] -> [4,2,3], acting [5,3,1] -> [4,2,3], acting_primary 5 -> 4, up_primary 5 -> 4, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:57:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 31 pg[3.18( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.924520493s) [4,2,3] r=-1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1118.905029297s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:57:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 31 pg[2.16( empty local-lis/les=27/28 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.935929298s) [0,2,1] r=-1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 active pruub 1118.916870117s@ mbc={}] start_peering_interval up [3,5,1] -> [0,2,1], acting [3,5,1] -> [0,2,1], acting_primary 3 -> 0, up_primary 3 -> 0, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:57:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 31 pg[3.16( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.927214622s) [3,1,5] r=2 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 active pruub 1118.908081055s@ mbc={}] start_peering_interval up [5,3,1] -> [3,1,5], acting [5,3,1] -> [3,1,5], acting_primary 5 -> 3, up_primary 5 -> 3, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:57:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 31 pg[2.16( empty local-lis/les=27/28 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.935905457s) [0,2,1] r=-1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1118.916870117s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:57:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 31 pg[3.19( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.923992157s) [1,0,2] r=-1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 active pruub 1118.904907227s@ mbc={}] start_peering_interval up [5,3,1] -> [1,0,2], acting [5,3,1] -> [1,0,2], acting_primary 5 -> 1, up_primary 5 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:57:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 31 pg[3.17( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.923963547s) [1,3,5] r=2 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 active pruub 1118.905029297s@ mbc={}] start_peering_interval up [5,3,1] -> [1,3,5], acting [5,3,1] -> [1,3,5], acting_primary 5 -> 1, up_primary 5 -> 1, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:57:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 31 pg[2.18( empty local-lis/les=27/28 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.935873985s) [5,3,4] r=0 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 active pruub 1118.916992188s@ mbc={}] start_peering_interval up [3,5,1] -> [5,3,4], acting [3,5,1] -> [5,3,4], acting_primary 3 -> 5, up_primary 3 -> 5, role 1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:57:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 31 pg[3.17( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.923938751s) [1,3,5] r=2 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1118.905029297s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:57:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 31 pg[2.18( empty local-lis/les=27/28 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.935873985s) [5,3,4] r=0 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown pruub 1118.916992188s@ mbc={}] state<Start>: transitioning to Primary
Feb 23 07:57:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 31 pg[3.19( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.923839569s) [1,0,2] r=-1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1118.904907227s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:57:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 31 pg[2.15( empty local-lis/les=27/28 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.935940742s) [5,1,3] r=0 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 active pruub 1118.917114258s@ mbc={}] start_peering_interval up [3,5,1] -> [5,1,3], acting [3,5,1] -> [5,1,3], acting_primary 3 -> 5, up_primary 3 -> 5, role 1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:57:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 31 pg[2.15( empty local-lis/les=27/28 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.935940742s) [5,1,3] r=0 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown pruub 1118.917114258s@ mbc={}] state<Start>: transitioning to Primary
Feb 23 07:57:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 31 pg[3.16( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.926757812s) [3,1,5] r=2 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1118.908081055s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:57:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 31 pg[2.17( empty local-lis/les=27/28 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.935349464s) [3,5,4] r=1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 active pruub 1118.916748047s@ mbc={}] start_peering_interval up [3,5,1] -> [3,5,4], acting [3,5,1] -> [3,5,4], acting_primary 3 -> 3, up_primary 3 -> 3, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:57:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 31 pg[3.15( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.923473358s) [2,3,4] r=-1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 active pruub 1118.905151367s@ mbc={}] start_peering_interval up [5,3,1] -> [2,3,4], acting [5,3,1] -> [2,3,4], acting_primary 5 -> 2, up_primary 5 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:57:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 31 pg[3.14( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.923303604s) [3,2,4] r=-1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 active pruub 1118.905029297s@ mbc={}] start_peering_interval up [5,3,1] -> [3,2,4], acting [5,3,1] -> [3,2,4], acting_primary 5 -> 3, up_primary 5 -> 3, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:57:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 31 pg[3.15( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.923446655s) [2,3,4] r=-1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1118.905151367s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:57:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 31 pg[2.14( empty local-lis/les=27/28 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.935741425s) [3,2,1] r=-1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 active pruub 1118.917480469s@ mbc={}] start_peering_interval up [3,5,1] -> [3,2,1], acting [3,5,1] -> [3,2,1], acting_primary 3 -> 3, up_primary 3 -> 3, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:57:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 31 pg[3.14( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.923200607s) [3,2,4] r=-1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1118.905029297s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:57:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 31 pg[3.12( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.924057961s) [1,3,5] r=2 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 active pruub 1118.906005859s@ mbc={}] start_peering_interval up [5,3,1] -> [1,3,5], acting [5,3,1] -> [1,3,5], acting_primary 5 -> 1, up_primary 5 -> 1, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:57:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 31 pg[2.14( empty local-lis/les=27/28 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.935695648s) [3,2,1] r=-1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1118.917480469s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:57:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 31 pg[3.12( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.924035072s) [1,3,5] r=2 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1118.906005859s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:57:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 31 pg[3.13( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.923510551s) [3,4,2] r=-1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 active pruub 1118.905517578s@ mbc={}] start_peering_interval up [5,3,1] -> [3,4,2], acting [5,3,1] -> [3,4,2], acting_primary 5 -> 3, up_primary 5 -> 3, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:57:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 31 pg[3.13( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.923488617s) [3,4,2] r=-1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1118.905517578s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:57:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 31 pg[2.12( empty local-lis/les=27/28 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.935044289s) [5,1,3] r=0 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 active pruub 1118.917114258s@ mbc={}] start_peering_interval up [3,5,1] -> [5,1,3], acting [3,5,1] -> [5,1,3], acting_primary 3 -> 5, up_primary 3 -> 5, role 1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:57:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 31 pg[2.12( empty local-lis/les=27/28 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.935044289s) [5,1,3] r=0 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown pruub 1118.917114258s@ mbc={}] state<Start>: transitioning to Primary
Feb 23 07:57:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 31 pg[2.11( empty local-lis/les=27/28 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.935156822s) [0,1,2] r=-1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 active pruub 1118.917236328s@ mbc={}] start_peering_interval up [3,5,1] -> [0,1,2], acting [3,5,1] -> [0,1,2], acting_primary 3 -> 0, up_primary 3 -> 0, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:57:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 31 pg[3.10( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.923770905s) [3,5,4] r=1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 active pruub 1118.906005859s@ mbc={}] start_peering_interval up [5,3,1] -> [3,5,4], acting [5,3,1] -> [3,5,4], acting_primary 5 -> 3, up_primary 5 -> 3, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:57:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 31 pg[3.10( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.923749924s) [3,5,4] r=1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1118.906005859s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:57:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 31 pg[2.11( empty local-lis/les=27/28 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.935102463s) [0,1,2] r=-1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1118.917236328s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:57:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 31 pg[2.10( empty local-lis/les=27/28 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.934917450s) [2,1,3] r=-1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 active pruub 1118.917236328s@ mbc={}] start_peering_interval up [3,5,1] -> [2,1,3], acting [3,5,1] -> [2,1,3], acting_primary 3 -> 2, up_primary 3 -> 2, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:57:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 31 pg[2.10( empty local-lis/les=27/28 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.934895515s) [2,1,3] r=-1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1118.917236328s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:57:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 31 pg[3.e( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.923784256s) [2,3,1] r=-1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 active pruub 1118.906372070s@ mbc={}] start_peering_interval up [5,3,1] -> [2,3,1], acting [5,3,1] -> [2,3,1], acting_primary 5 -> 2, up_primary 5 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:57:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 31 pg[3.e( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.923726082s) [2,3,1] r=-1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1118.906372070s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:57:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 31 pg[2.13( empty local-lis/les=27/28 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.934187889s) [2,0,4] r=-1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 active pruub 1118.916870117s@ mbc={}] start_peering_interval up [3,5,1] -> [2,0,4], acting [3,5,1] -> [2,0,4], acting_primary 3 -> 2, up_primary 3 -> 2, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:57:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 31 pg[2.f( empty local-lis/les=27/28 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.935432434s) [2,0,4] r=-1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 active pruub 1118.917846680s@ mbc={}] start_peering_interval up [3,5,1] -> [2,0,4], acting [3,5,1] -> [2,0,4], acting_primary 3 -> 2, up_primary 3 -> 2, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:57:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 31 pg[2.e( empty local-lis/les=27/28 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.934660912s) [4,2,3] r=-1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 active pruub 1118.917480469s@ mbc={}] start_peering_interval up [3,5,1] -> [4,2,3], acting [3,5,1] -> [4,2,3], acting_primary 3 -> 4, up_primary 3 -> 4, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:57:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 31 pg[2.f( empty local-lis/les=27/28 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.935007095s) [2,0,4] r=-1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1118.917846680s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:57:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 31 pg[2.e( empty local-lis/les=27/28 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.934618950s) [4,2,3] r=-1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1118.917480469s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:57:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 31 pg[3.f( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.924374580s) [0,5,4] r=1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 active pruub 1118.907348633s@ mbc={}] start_peering_interval up [5,3,1] -> [0,5,4], acting [5,3,1] -> [0,5,4], acting_primary 5 -> 0, up_primary 5 -> 0, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:57:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 31 pg[3.f( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.924353600s) [0,5,4] r=1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1118.907348633s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:57:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 31 pg[3.c( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.923598289s) [0,4,5] r=2 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 active pruub 1118.906738281s@ mbc={}] start_peering_interval up [5,3,1] -> [0,4,5], acting [5,3,1] -> [0,4,5], acting_primary 5 -> 0, up_primary 5 -> 0, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:57:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 31 pg[2.d( empty local-lis/les=27/28 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.934491158s) [5,0,1] r=0 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 active pruub 1118.917724609s@ mbc={}] start_peering_interval up [3,5,1] -> [5,0,1], acting [3,5,1] -> [5,0,1], acting_primary 3 -> 5, up_primary 3 -> 5, role 1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:57:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 31 pg[3.11( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.922766685s) [5,0,4] r=0 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 active pruub 1118.906005859s@ mbc={}] start_peering_interval up [5,3,1] -> [5,0,4], acting [5,3,1] -> [5,0,4], acting_primary 5 -> 5, up_primary 5 -> 5, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:57:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 31 pg[3.c( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.923577309s) [0,4,5] r=2 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1118.906738281s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:57:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 31 pg[2.d( empty local-lis/les=27/28 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.934491158s) [5,0,1] r=0 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown pruub 1118.917724609s@ mbc={}] state<Start>: transitioning to Primary
Feb 23 07:57:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 31 pg[2.13( empty local-lis/les=27/28 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.933991432s) [2,0,4] r=-1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1118.916870117s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:57:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 31 pg[3.11( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.922766685s) [5,0,4] r=0 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown pruub 1118.906005859s@ mbc={}] state<Start>: transitioning to Primary
Feb 23 07:57:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 31 pg[2.b( empty local-lis/les=27/28 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.934110641s) [5,3,4] r=0 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 active pruub 1118.918090820s@ mbc={}] start_peering_interval up [3,5,1] -> [5,3,4], acting [3,5,1] -> [5,3,4], acting_primary 3 -> 5, up_primary 3 -> 5, role 1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:57:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 31 pg[2.b( empty local-lis/les=27/28 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.934110641s) [5,3,4] r=0 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown pruub 1118.918090820s@ mbc={}] state<Start>: transitioning to Primary
Feb 23 07:57:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 31 pg[3.d( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.922724724s) [3,2,4] r=-1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 active pruub 1118.906860352s@ mbc={}] start_peering_interval up [5,3,1] -> [3,2,4], acting [5,3,1] -> [3,2,4], acting_primary 5 -> 3, up_primary 5 -> 3, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:57:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 31 pg[2.c( empty local-lis/les=27/28 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.933381081s) [2,1,0] r=-1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 active pruub 1118.917724609s@ mbc={}] start_peering_interval up [3,5,1] -> [2,1,0], acting [3,5,1] -> [2,1,0], acting_primary 3 -> 2, up_primary 3 -> 2, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:57:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 31 pg[3.d( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.922611237s) [3,2,4] r=-1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1118.906860352s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:57:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 31 pg[2.c( empty local-lis/les=27/28 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.933361053s) [2,1,0] r=-1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1118.917724609s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:57:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 31 pg[3.a( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.923115730s) [3,1,5] r=2 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 active pruub 1118.907348633s@ mbc={}] start_peering_interval up [5,3,1] -> [3,1,5], acting [5,3,1] -> [3,1,5], acting_primary 5 -> 3, up_primary 5 -> 3, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:57:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 31 pg[2.17( empty local-lis/les=27/28 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.933684349s) [3,5,4] r=1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1118.916748047s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:57:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 31 pg[2.a( empty local-lis/les=27/28 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.933485985s) [2,4,3] r=-1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 active pruub 1118.917724609s@ mbc={}] start_peering_interval up [3,5,1] -> [2,4,3], acting [3,5,1] -> [2,4,3], acting_primary 3 -> 2, up_primary 3 -> 2, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:57:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 31 pg[3.b( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.922876358s) [4,0,5] r=2 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 active pruub 1118.907714844s@ mbc={}] start_peering_interval up [5,3,1] -> [4,0,5], acting [5,3,1] -> [4,0,5], acting_primary 5 -> 4, up_primary 5 -> 4, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:57:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 31 pg[3.b( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.922852516s) [4,0,5] r=2 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1118.907714844s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:57:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 31 pg[2.3( empty local-lis/les=27/28 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.933574677s) [3,4,5] r=2 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 active pruub 1118.918457031s@ mbc={}] start_peering_interval up [3,5,1] -> [3,4,5], acting [3,5,1] -> [3,4,5], acting_primary 3 -> 3, up_primary 3 -> 3, role 1 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:57:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 31 pg[2.3( empty local-lis/les=27/28 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.933526993s) [3,4,5] r=2 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1118.918457031s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:57:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 31 pg[3.2( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.922780991s) [4,5,0] r=1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 active pruub 1118.907836914s@ mbc={}] start_peering_interval up [5,3,1] -> [4,5,0], acting [5,3,1] -> [4,5,0], acting_primary 5 -> 4, up_primary 5 -> 4, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:57:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 31 pg[3.2( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.922744751s) [4,5,0] r=1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1118.907836914s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:57:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 31 pg[3.1( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.922776222s) [4,3,2] r=-1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 active pruub 1118.907958984s@ mbc={}] start_peering_interval up [5,3,1] -> [4,3,2], acting [5,3,1] -> [4,3,2], acting_primary 5 -> 4, up_primary 5 -> 4, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:57:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 31 pg[2.a( empty local-lis/les=27/28 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.933062553s) [2,4,3] r=-1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1118.917724609s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:57:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 31 pg[3.1( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.922705650s) [4,3,2] r=-1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1118.907958984s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:57:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 31 pg[2.6( empty local-lis/les=27/28 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.932945251s) [1,2,3] r=-1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 active pruub 1118.918334961s@ mbc={}] start_peering_interval up [3,5,1] -> [1,2,3], acting [3,5,1] -> [1,2,3], acting_primary 3 -> 1, up_primary 3 -> 1, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:57:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 31 pg[2.6( empty local-lis/les=27/28 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.932924271s) [1,2,3] r=-1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1118.918334961s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:57:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 31 pg[3.7( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.924454689s) [4,5,3] r=1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 active pruub 1118.909912109s@ mbc={}] start_peering_interval up [5,3,1] -> [4,5,3], acting [5,3,1] -> [4,5,3], acting_primary 5 -> 4, up_primary 5 -> 4, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:57:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 31 pg[2.2( empty local-lis/les=27/28 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.933115005s) [3,1,2] r=-1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 active pruub 1118.918701172s@ mbc={}] start_peering_interval up [3,5,1] -> [3,1,2], acting [3,5,1] -> [3,1,2], acting_primary 3 -> 3, up_primary 3 -> 3, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:57:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 31 pg[2.2( empty local-lis/les=27/28 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.933100700s) [3,1,2] r=-1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1118.918701172s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:57:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 31 pg[3.7( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.924364090s) [4,5,3] r=1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1118.909912109s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:57:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 31 pg[3.3( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.922254562s) [0,4,5] r=2 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 active pruub 1118.907958984s@ mbc={}] start_peering_interval up [5,3,1] -> [0,4,5], acting [5,3,1] -> [0,4,5], acting_primary 5 -> 0, up_primary 5 -> 0, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:57:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 31 pg[3.a( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.921727180s) [3,1,5] r=2 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1118.907348633s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:57:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 31 pg[3.3( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.921961784s) [0,4,5] r=2 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1118.907958984s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:57:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 31 pg[2.1( empty local-lis/les=27/28 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.932245255s) [4,5,3] r=1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 active pruub 1118.918334961s@ mbc={}] start_peering_interval up [3,5,1] -> [4,5,3], acting [3,5,1] -> [4,5,3], acting_primary 3 -> 4, up_primary 3 -> 4, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:57:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 31 pg[2.1( empty local-lis/les=27/28 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.932209969s) [4,5,3] r=1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1118.918334961s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:57:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 31 pg[3.4( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.923923492s) [1,2,3] r=-1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 active pruub 1118.910156250s@ mbc={}] start_peering_interval up [5,3,1] -> [1,2,3], acting [5,3,1] -> [1,2,3], acting_primary 5 -> 1, up_primary 5 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:57:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 31 pg[2.5( empty local-lis/les=27/28 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.932508469s) [2,4,0] r=-1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 active pruub 1118.918701172s@ mbc={}] start_peering_interval up [3,5,1] -> [2,4,0], acting [3,5,1] -> [2,4,0], acting_primary 3 -> 2, up_primary 3 -> 2, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:57:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 31 pg[2.4( empty local-lis/les=27/28 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.932271957s) [1,2,3] r=-1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 active pruub 1118.918334961s@ mbc={}] start_peering_interval up [3,5,1] -> [1,2,3], acting [3,5,1] -> [1,2,3], acting_primary 3 -> 1, up_primary 3 -> 1, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:57:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 31 pg[2.5( empty local-lis/les=27/28 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.932435036s) [2,4,0] r=-1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1118.918701172s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:57:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 31 pg[2.4( empty local-lis/les=27/28 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.931917191s) [1,2,3] r=-1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1118.918334961s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:57:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 31 pg[3.4( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.923786163s) [1,2,3] r=-1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1118.910156250s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:57:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 31 pg[2.9( empty local-lis/les=27/28 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.932497025s) [4,0,5] r=2 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 active pruub 1118.919067383s@ mbc={}] start_peering_interval up [3,5,1] -> [4,0,5], acting [3,5,1] -> [4,0,5], acting_primary 3 -> 4, up_primary 3 -> 4, role 1 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:57:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 31 pg[2.9( empty local-lis/les=27/28 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.932441711s) [4,0,5] r=2 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1118.919067383s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:57:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 31 pg[3.6( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.921246529s) [4,0,5] r=2 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 active pruub 1118.907958984s@ mbc={}] start_peering_interval up [5,3,1] -> [4,0,5], acting [5,3,1] -> [4,0,5], acting_primary 5 -> 4, up_primary 5 -> 4, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:57:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 31 pg[2.8( empty local-lis/les=27/28 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.932141304s) [0,2,1] r=-1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 active pruub 1118.918945312s@ mbc={}] start_peering_interval up [3,5,1] -> [0,2,1], acting [3,5,1] -> [0,2,1], acting_primary 3 -> 0, up_primary 3 -> 0, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:57:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 31 pg[3.6( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.921147346s) [4,0,5] r=2 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1118.907958984s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:57:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 31 pg[2.8( empty local-lis/les=27/28 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.932101250s) [0,2,1] r=-1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1118.918945312s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:57:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 31 pg[3.8( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.924057007s) [2,1,0] r=-1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 active pruub 1118.910888672s@ mbc={}] start_peering_interval up [5,3,1] -> [2,1,0], acting [5,3,1] -> [2,1,0], acting_primary 5 -> 2, up_primary 5 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:57:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 31 pg[2.7( empty local-lis/les=27/28 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.936571121s) [3,2,1] r=-1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 active pruub 1118.923095703s@ mbc={}] start_peering_interval up [3,5,1] -> [3,2,1], acting [3,5,1] -> [3,2,1], acting_primary 3 -> 3, up_primary 3 -> 3, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:57:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 31 pg[3.8( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.924020767s) [2,1,0] r=-1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1118.910888672s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:57:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 31 pg[2.7( empty local-lis/les=27/28 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.936212540s) [3,2,1] r=-1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1118.923095703s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:57:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 31 pg[3.1b( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.923654556s) [5,0,1] r=0 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 active pruub 1118.910766602s@ mbc={}] start_peering_interval up [5,3,1] -> [5,0,1], acting [5,3,1] -> [5,0,1], acting_primary 5 -> 5, up_primary 5 -> 5, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:57:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 31 pg[3.5( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.922659874s) [3,4,5] r=2 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 active pruub 1118.909423828s@ mbc={}] start_peering_interval up [5,3,1] -> [3,4,5], acting [5,3,1] -> [3,4,5], acting_primary 5 -> 3, up_primary 5 -> 3, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:57:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 31 pg[3.1b( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.923654556s) [5,0,1] r=0 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown pruub 1118.910766602s@ mbc={}] state<Start>: transitioning to Primary
Feb 23 07:57:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 31 pg[3.1a( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.924242020s) [5,1,0] r=0 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 active pruub 1118.911376953s@ mbc={}] start_peering_interval up [5,3,1] -> [5,1,0], acting [5,3,1] -> [5,1,0], acting_primary 5 -> 5, up_primary 5 -> 5, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:57:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 31 pg[2.1b( empty local-lis/les=27/28 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.931631088s) [5,0,4] r=0 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 active pruub 1118.919067383s@ mbc={}] start_peering_interval up [3,5,1] -> [5,0,4], acting [3,5,1] -> [5,0,4], acting_primary 3 -> 5, up_primary 3 -> 5, role 1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:57:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 31 pg[3.1a( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.924242020s) [5,1,0] r=0 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown pruub 1118.911376953s@ mbc={}] state<Start>: transitioning to Primary
Feb 23 07:57:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 31 pg[2.1b( empty local-lis/les=27/28 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.931631088s) [5,0,4] r=0 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown pruub 1118.919067383s@ mbc={}] state<Start>: transitioning to Primary
Feb 23 07:57:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 31 pg[3.5( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.922058105s) [3,4,5] r=2 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1118.909423828s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:57:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 31 pg[3.1d( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.924921036s) [5,0,1] r=0 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 active pruub 1118.912475586s@ mbc={}] start_peering_interval up [5,3,1] -> [5,0,1], acting [5,3,1] -> [5,0,1], acting_primary 5 -> 5, up_primary 5 -> 5, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:57:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 31 pg[3.1d( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.924921036s) [5,0,1] r=0 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown pruub 1118.912475586s@ mbc={}] state<Start>: transitioning to Primary
Feb 23 07:57:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 31 pg[2.1c( empty local-lis/les=27/28 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.935391426s) [2,3,1] r=-1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 active pruub 1118.923095703s@ mbc={}] start_peering_interval up [3,5,1] -> [2,3,1], acting [3,5,1] -> [2,3,1], acting_primary 3 -> 2, up_primary 3 -> 2, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:57:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 31 pg[2.1c( empty local-lis/les=27/28 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.935353279s) [2,3,1] r=-1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1118.923095703s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:57:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 31 pg[3.1c( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.924047470s) [0,4,2] r=-1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 active pruub 1118.911987305s@ mbc={}] start_peering_interval up [5,3,1] -> [0,4,2], acting [5,3,1] -> [0,4,2], acting_primary 5 -> 0, up_primary 5 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:57:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 31 pg[3.1c( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.924012184s) [0,4,2] r=-1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1118.911987305s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:57:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 31 pg[2.1d( empty local-lis/les=27/28 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.935465813s) [2,3,4] r=-1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 active pruub 1118.923339844s@ mbc={}] start_peering_interval up [3,5,1] -> [2,3,4], acting [3,5,1] -> [2,3,4], acting_primary 3 -> 2, up_primary 3 -> 2, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:57:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 31 pg[2.1d( empty local-lis/les=27/28 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.935432434s) [2,3,4] r=-1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1118.923339844s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:57:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 31 pg[3.1f( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.923787117s) [1,3,5] r=2 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 active pruub 1118.911987305s@ mbc={}] start_peering_interval up [5,3,1] -> [1,3,5], acting [5,3,1] -> [1,3,5], acting_primary 5 -> 1, up_primary 5 -> 1, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:57:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 31 pg[2.1e( empty local-lis/les=27/28 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.935186386s) [4,5,3] r=1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 active pruub 1118.923339844s@ mbc={}] start_peering_interval up [3,5,1] -> [4,5,3], acting [3,5,1] -> [4,5,3], acting_primary 3 -> 4, up_primary 3 -> 4, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:57:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 31 pg[3.1f( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.923679352s) [1,3,5] r=2 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1118.911987305s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:57:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 31 pg[2.1e( empty local-lis/les=27/28 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.935120583s) [4,5,3] r=1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1118.923339844s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:57:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 31 pg[2.1f( empty local-lis/les=27/28 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.934896469s) [4,3,2] r=-1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 active pruub 1118.923339844s@ mbc={}] start_peering_interval up [3,5,1] -> [4,3,2], acting [3,5,1] -> [4,3,2], acting_primary 3 -> 4, up_primary 3 -> 4, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:57:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 31 pg[2.1f( empty local-lis/les=27/28 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.934829712s) [4,3,2] r=-1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1118.923339844s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:57:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 31 pg[3.1e( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.923625946s) [4,2,3] r=-1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 active pruub 1118.912475586s@ mbc={}] start_peering_interval up [5,3,1] -> [4,2,3], acting [5,3,1] -> [4,2,3], acting_primary 5 -> 4, up_primary 5 -> 4, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:57:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 31 pg[3.1e( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.923505783s) [4,2,3] r=-1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1118.912475586s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:57:34 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 31 pg[3.8( empty local-lis/les=0/0 n=0 ec=27/22 lis/c=27/27 les/c/f=28/28/0 sis=31) [2,1,0] r=0 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 23 07:57:34 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 31 pg[2.a( empty local-lis/les=0/0 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31) [2,4,3] r=0 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 23 07:57:34 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 31 pg[2.c( empty local-lis/les=0/0 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31) [2,1,0] r=0 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 23 07:57:34 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 31 pg[2.1c( empty local-lis/les=0/0 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31) [2,3,1] r=0 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 23 07:57:34 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 31 pg[2.13( empty local-lis/les=0/0 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31) [2,0,4] r=0 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 23 07:57:34 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 31 pg[2.f( empty local-lis/les=0/0 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31) [2,0,4] r=0 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 23 07:57:34 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 31 pg[3.e( empty local-lis/les=0/0 n=0 ec=27/22 lis/c=27/27 les/c/f=28/28/0 sis=31) [2,3,1] r=0 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 23 07:57:34 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 31 pg[2.5( empty local-lis/les=0/0 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31) [2,4,0] r=0 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 23 07:57:34 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 31 pg[3.15( empty local-lis/les=0/0 n=0 ec=27/22 lis/c=27/27 les/c/f=28/28/0 sis=31) [2,3,4] r=0 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 23 07:57:34 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 31 pg[2.10( empty local-lis/les=0/0 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31) [2,1,3] r=0 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 23 07:57:34 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 31 pg[2.1d( empty local-lis/les=0/0 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31) [2,3,4] r=0 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 23 07:57:35 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 31 pg[2.8( empty local-lis/les=0/0 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31) [0,2,1] r=1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 23 07:57:35 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 31 pg[2.11( empty local-lis/les=0/0 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31) [0,1,2] r=2 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 23 07:57:35 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 31 pg[2.16( empty local-lis/les=0/0 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31) [0,2,1] r=1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 23 07:57:35 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 31 pg[3.1c( empty local-lis/les=0/0 n=0 ec=27/22 lis/c=27/27 les/c/f=28/28/0 sis=31) [0,4,2] r=2 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 23 07:57:35 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 31 pg[2.e( empty local-lis/les=0/0 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31) [4,2,3] r=1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 23 07:57:35 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 31 pg[3.18( empty local-lis/les=0/0 n=0 ec=27/22 lis/c=27/27 les/c/f=28/28/0 sis=31) [4,2,3] r=1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 23 07:57:35 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 31 pg[2.19( empty local-lis/les=0/0 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31) [4,0,2] r=2 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 23 07:57:35 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 31 pg[3.1( empty local-lis/les=0/0 n=0 ec=27/22 lis/c=27/27 les/c/f=28/28/0 sis=31) [4,3,2] r=2 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 23 07:57:35 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 31 pg[2.1f( empty local-lis/les=0/0 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31) [4,3,2] r=2 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 23 07:57:35 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 31 pg[3.1e( empty local-lis/les=0/0 n=0 ec=27/22 lis/c=27/27 les/c/f=28/28/0 sis=31) [4,2,3] r=1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 23 07:57:35 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 31 pg[2.2( empty local-lis/les=0/0 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31) [3,1,2] r=2 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 23 07:57:35 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 31 pg[3.d( empty local-lis/les=0/0 n=0 ec=27/22 lis/c=27/27 les/c/f=28/28/0 sis=31) [3,2,4] r=1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 23 07:57:35 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 31 pg[3.13( empty local-lis/les=0/0 n=0 ec=27/22 lis/c=27/27 les/c/f=28/28/0 sis=31) [3,4,2] r=2 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 23 07:57:35 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 31 pg[3.14( empty local-lis/les=0/0 n=0 ec=27/22 lis/c=27/27 les/c/f=28/28/0 sis=31) [3,2,4] r=1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 23 07:57:35 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 31 pg[2.14( empty local-lis/les=0/0 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31) [3,2,1] r=1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 23 07:57:35 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 31 pg[2.7( empty local-lis/les=0/0 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31) [3,2,1] r=1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 23 07:57:35 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 31 pg[2.6( empty local-lis/les=0/0 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31) [1,2,3] r=1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 23 07:57:35 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 31 pg[2.4( empty local-lis/les=0/0 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31) [1,2,3] r=1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 23 07:57:35 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 31 pg[3.4( empty local-lis/les=0/0 n=0 ec=27/22 lis/c=27/27 les/c/f=28/28/0 sis=31) [1,2,3] r=1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 23 07:57:35 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 32 pg[2.1d( empty local-lis/les=31/32 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31) [2,3,4] r=0 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 23 07:57:35 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 31 pg[3.19( empty local-lis/les=0/0 n=0 ec=27/22 lis/c=27/27 les/c/f=28/28/0 sis=31) [1,0,2] r=2 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 23 07:57:35 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 32 pg[2.b( empty local-lis/les=31/32 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31) [5,3,4] r=0 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 23 07:57:35 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 32 pg[3.11( empty local-lis/les=31/32 n=0 ec=27/22 lis/c=27/27 les/c/f=28/28/0 sis=31) [5,0,4] r=0 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 23 07:57:35 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 32 pg[2.1b( empty local-lis/les=31/32 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31) [5,0,4] r=0 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 23 07:57:35 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 32 pg[3.1d( empty local-lis/les=31/32 n=0 ec=27/22 lis/c=27/27 les/c/f=28/28/0 sis=31) [5,0,1] r=0 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 23 07:57:35 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 32 pg[2.d( empty local-lis/les=31/32 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31) [5,0,1] r=0 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 23 07:57:35 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 32 pg[3.1b( empty local-lis/les=31/32 n=0 ec=27/22 lis/c=27/27 les/c/f=28/28/0 sis=31) [5,0,1] r=0 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 23 07:57:35 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 32 pg[3.1a( empty local-lis/les=31/32 n=0 ec=27/22 lis/c=27/27 les/c/f=28/28/0 sis=31) [5,1,0] r=0 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 23 07:57:35 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 32 pg[2.12( empty local-lis/les=31/32 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31) [5,1,3] r=0 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 23 07:57:35 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 32 pg[2.15( empty local-lis/les=31/32 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31) [5,1,3] r=0 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 23 07:57:35 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 32 pg[2.18( empty local-lis/les=31/32 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31) [5,3,4] r=0 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 23 07:57:35 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 32 pg[2.13( empty local-lis/les=31/32 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31) [2,0,4] r=0 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 23 07:57:35 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 32 pg[2.c( empty local-lis/les=31/32 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31) [2,1,0] r=0 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 23 07:57:35 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 32 pg[2.5( empty local-lis/les=31/32 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31) [2,4,0] r=0 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 23 07:57:35 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 32 pg[3.8( empty local-lis/les=31/32 n=0 ec=27/22 lis/c=27/27 les/c/f=28/28/0 sis=31) [2,1,0] r=0 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 23 07:57:35 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 32 pg[2.a( empty local-lis/les=31/32 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31) [2,4,3] r=0 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 23 07:57:35 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 32 pg[2.f( empty local-lis/les=31/32 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31) [2,0,4] r=0 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 23 07:57:35 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 32 pg[2.10( empty local-lis/les=31/32 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31) [2,1,3] r=0 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 23 07:57:35 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 32 pg[3.15( empty local-lis/les=31/32 n=0 ec=27/22 lis/c=27/27 les/c/f=28/28/0 sis=31) [2,3,4] r=0 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 23 07:57:35 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 32 pg[2.1c( empty local-lis/les=31/32 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31) [2,3,1] r=0 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 23 07:57:35 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 32 pg[3.e( empty local-lis/les=31/32 n=0 ec=27/22 lis/c=27/27 les/c/f=28/28/0 sis=31) [2,3,1] r=0 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 23 07:57:36 np0005626463.localdomain ceph-osd[31633]: log_channel(cluster) log [DBG] : 3.e scrub starts
Feb 23 07:57:38 np0005626463.localdomain ceph-osd[32575]: log_channel(cluster) log [DBG] : 3.9 scrub starts
Feb 23 07:57:39 np0005626463.localdomain ceph-osd[31633]: log_channel(cluster) log [DBG] : 3.e scrub ok
Feb 23 07:57:39 np0005626463.localdomain ceph-osd[31633]: log_channel(cluster) log [DBG] : 2.a scrub starts
Feb 23 07:57:39 np0005626463.localdomain ceph-osd[31633]: log_channel(cluster) log [DBG] : 2.a scrub ok
Feb 23 07:57:40 np0005626463.localdomain ceph-osd[32575]: log_channel(cluster) log [DBG] : 2.d scrub starts
Feb 23 07:57:40 np0005626463.localdomain ceph-osd[32575]: log_channel(cluster) log [DBG] : 2.d scrub ok
Feb 23 07:57:40 np0005626463.localdomain ceph-osd[31633]: log_channel(cluster) log [DBG] : 3.15 scrub starts
Feb 23 07:57:40 np0005626463.localdomain ceph-osd[31633]: log_channel(cluster) log [DBG] : 3.15 scrub ok
Feb 23 07:57:41 np0005626463.localdomain ceph-osd[31633]: log_channel(cluster) log [DBG] : 2.c scrub starts
Feb 23 07:57:41 np0005626463.localdomain ceph-osd[31633]: log_channel(cluster) log [DBG] : 2.c scrub ok
Feb 23 07:57:44 np0005626463.localdomain ceph-osd[31633]: log_channel(cluster) log [DBG] : 3.8 scrub starts
Feb 23 07:57:44 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 33 pg[6.0( empty local-lis/les=0/0 n=0 ec=33/33 lis/c=0/0 les/c/f=0/0/0 sis=33) [4,0,2] r=2 lpr=33 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 23 07:57:44 np0005626463.localdomain ceph-osd[31633]: log_channel(cluster) log [DBG] : 3.8 scrub ok
Feb 23 07:57:45 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 34 pg[7.0( empty local-lis/les=0/0 n=0 ec=34/34 lis/c=0/0 les/c/f=0/0/0 sis=34) [0,5,4] r=1 lpr=34 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 23 07:57:45 np0005626463.localdomain ceph-osd[32575]: log_channel(cluster) log [DBG] : 2.12 scrub starts
Feb 23 07:57:45 np0005626463.localdomain ceph-osd[32575]: log_channel(cluster) log [DBG] : 2.12 scrub ok
Feb 23 07:57:47 np0005626463.localdomain ceph-osd[32575]: log_channel(cluster) log [DBG] : 3.1d scrub starts
Feb 23 07:57:47 np0005626463.localdomain ceph-osd[32575]: log_channel(cluster) log [DBG] : 3.1d scrub ok
Feb 23 07:57:47 np0005626463.localdomain sshd[55764]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 07:57:48 np0005626463.localdomain sshd[55764]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 07:57:50 np0005626463.localdomain sudo[55766]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 07:57:50 np0005626463.localdomain sudo[55766]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 07:57:50 np0005626463.localdomain sudo[55766]: pam_unix(sudo:session): session closed for user root
Feb 23 07:57:51 np0005626463.localdomain ceph-osd[32575]: log_channel(cluster) log [DBG] : 2.15 scrub starts
Feb 23 07:57:51 np0005626463.localdomain ceph-osd[32575]: log_channel(cluster) log [DBG] : 2.15 scrub ok
Feb 23 07:57:52 np0005626463.localdomain ceph-osd[31633]: log_channel(cluster) log [DBG] : 2.f scrub starts
Feb 23 07:57:52 np0005626463.localdomain ceph-osd[31633]: log_channel(cluster) log [DBG] : 2.f scrub ok
Feb 23 07:57:52 np0005626463.localdomain sudo[55781]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 07:57:52 np0005626463.localdomain sudo[55781]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 07:57:52 np0005626463.localdomain sudo[55781]: pam_unix(sudo:session): session closed for user root
Feb 23 07:57:53 np0005626463.localdomain sudo[55796]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 07:57:53 np0005626463.localdomain sudo[55796]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 07:57:53 np0005626463.localdomain sudo[55796]: pam_unix(sudo:session): session closed for user root
Feb 23 07:57:54 np0005626463.localdomain ceph-osd[32575]: log_channel(cluster) log [DBG] : 3.1a deep-scrub starts
Feb 23 07:57:54 np0005626463.localdomain ceph-osd[32575]: log_channel(cluster) log [DBG] : 3.1a deep-scrub ok
Feb 23 07:57:54 np0005626463.localdomain ceph-osd[31633]: log_channel(cluster) log [DBG] : 2.10 scrub starts
Feb 23 07:57:54 np0005626463.localdomain ceph-osd[31633]: log_channel(cluster) log [DBG] : 2.10 scrub ok
Feb 23 07:57:56 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.
Feb 23 07:57:56 np0005626463.localdomain podman[55811]: 2026-02-23 07:57:56.312204291 +0000 UTC m=+0.090153830 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, org.opencontainers.image.created=2026-01-12T22:10:14Z, tcib_managed=true, architecture=x86_64, io.openshift.expose-services=, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=metrics_qdr, config_id=tripleo_step1, vcs-type=git, url=https://www.redhat.com, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.5, managed_by=tripleo_ansible, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:10:14Z, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-qdrouterd, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc.)
Feb 23 07:57:56 np0005626463.localdomain ceph-osd[32575]: log_channel(cluster) log [DBG] : 2.b scrub starts
Feb 23 07:57:56 np0005626463.localdomain ceph-osd[32575]: log_channel(cluster) log [DBG] : 2.b scrub ok
Feb 23 07:57:56 np0005626463.localdomain podman[55811]: 2026-02-23 07:57:56.531525135 +0000 UTC m=+0.309474734 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, config_id=tripleo_step1, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, distribution-scope=public, io.openshift.expose-services=, name=rhosp-rhel9/openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack 
Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, vendor=Red Hat, Inc., version=17.1.13, container_name=metrics_qdr, architecture=x86_64, build-date=2026-01-12T22:10:14Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 23 07:57:56 np0005626463.localdomain systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully.
Feb 23 07:57:57 np0005626463.localdomain ceph-osd[31633]: log_channel(cluster) log [DBG] : 2.13 scrub starts
Feb 23 07:57:57 np0005626463.localdomain ceph-osd[31633]: log_channel(cluster) log [DBG] : 2.13 scrub ok
Feb 23 07:57:58 np0005626463.localdomain ceph-osd[32575]: log_channel(cluster) log [DBG] : 2.18 deep-scrub starts
Feb 23 07:57:58 np0005626463.localdomain ceph-osd[32575]: log_channel(cluster) log [DBG] : 2.18 deep-scrub ok
Feb 23 07:57:59 np0005626463.localdomain ceph-osd[32575]: log_channel(cluster) log [DBG] : 3.11 scrub starts
Feb 23 07:57:59 np0005626463.localdomain ceph-osd[32575]: log_channel(cluster) log [DBG] : 3.11 scrub ok
Feb 23 07:58:00 np0005626463.localdomain ceph-osd[31633]: log_channel(cluster) log [DBG] : 2.5 deep-scrub starts
Feb 23 07:58:00 np0005626463.localdomain ceph-osd[31633]: log_channel(cluster) log [DBG] : 2.5 deep-scrub ok
Feb 23 07:58:01 np0005626463.localdomain ceph-osd[31633]: log_channel(cluster) log [DBG] : 2.1c scrub starts
Feb 23 07:58:01 np0005626463.localdomain ceph-osd[31633]: log_channel(cluster) log [DBG] : 2.1c scrub ok
Feb 23 07:58:03 np0005626463.localdomain ceph-osd[31633]: log_channel(cluster) log [DBG] : 2.1d scrub starts
Feb 23 07:58:03 np0005626463.localdomain ceph-osd[31633]: log_channel(cluster) log [DBG] : 2.1d scrub ok
Feb 23 07:58:05 np0005626463.localdomain ceph-osd[32575]: log_channel(cluster) log [DBG] : 2.1b scrub starts
Feb 23 07:58:05 np0005626463.localdomain ceph-osd[32575]: log_channel(cluster) log [DBG] : 2.1b scrub ok
Feb 23 07:58:06 np0005626463.localdomain ceph-osd[32575]: log_channel(cluster) log [DBG] : 3.1b scrub starts
Feb 23 07:58:06 np0005626463.localdomain ceph-osd[32575]: log_channel(cluster) log [DBG] : 3.1b scrub ok
Feb 23 07:58:07 np0005626463.localdomain ceph-osd[32575]: log_channel(cluster) log [DBG] : 3.9 scrub starts
Feb 23 07:58:07 np0005626463.localdomain ceph-osd[32575]: log_channel(cluster) log [DBG] : 3.9 scrub ok
Feb 23 07:58:18 np0005626463.localdomain sudo[55853]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ukytyprbhhwttirixhfubxgheslzoczc ; /usr/bin/python3
Feb 23 07:58:18 np0005626463.localdomain sudo[55853]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:58:18 np0005626463.localdomain python3[55855]: ansible-file Invoked with path=/var/lib/tripleo-config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 07:58:18 np0005626463.localdomain sudo[55853]: pam_unix(sudo:session): session closed for user root
Feb 23 07:58:20 np0005626463.localdomain sudo[55869]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hnpeluoedfisedezdjaxikdqyjtvzhcz ; /usr/bin/python3
Feb 23 07:58:20 np0005626463.localdomain sudo[55869]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:58:20 np0005626463.localdomain python3[55871]: ansible-file Invoked with path=/var/lib/tripleo-config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 07:58:20 np0005626463.localdomain sudo[55869]: pam_unix(sudo:session): session closed for user root
Feb 23 07:58:22 np0005626463.localdomain sudo[55885]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ljamgwunhwavsnthhbohbasciszmhmis ; /usr/bin/python3
Feb 23 07:58:22 np0005626463.localdomain sudo[55885]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:58:22 np0005626463.localdomain python3[55887]: ansible-file Invoked with path=/var/lib/tripleo-config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 07:58:22 np0005626463.localdomain sudo[55885]: pam_unix(sudo:session): session closed for user root
Feb 23 07:58:23 np0005626463.localdomain sshd[55888]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 07:58:25 np0005626463.localdomain sudo[55934]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rldxjvtrlnmmjtarsecizkdghfziuqbc ; /usr/bin/python3
Feb 23 07:58:25 np0005626463.localdomain sudo[55934]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:58:25 np0005626463.localdomain python3[55936]: ansible-ansible.legacy.stat Invoked with path=/var/lib/tripleo-config/ceph/ceph.client.openstack.keyring follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 07:58:25 np0005626463.localdomain sudo[55934]: pam_unix(sudo:session): session closed for user root
Feb 23 07:58:25 np0005626463.localdomain sudo[55977]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jjlwwkqzzrylpcloyirpabexjdcfbpjo ; /usr/bin/python3
Feb 23 07:58:25 np0005626463.localdomain sudo[55977]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:58:25 np0005626463.localdomain python3[55979]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771833504.8338923-92678-100508287553593/source dest=/var/lib/tripleo-config/ceph/ceph.client.openstack.keyring mode=600 _original_basename=ceph.client.openstack.keyring follow=False checksum=bb97f2335ebfccbfb2bd8d50bbb589ce7e034c5d backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 07:58:25 np0005626463.localdomain sudo[55977]: pam_unix(sudo:session): session closed for user root
Feb 23 07:58:26 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.
Feb 23 07:58:26 np0005626463.localdomain systemd[1]: tmp-crun.WQt1PZ.mount: Deactivated successfully.
Feb 23 07:58:26 np0005626463.localdomain podman[55994]: 2026-02-23 07:58:26.904970549 +0000 UTC m=+0.081157908 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, name=rhosp-rhel9/openstack-qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:10:14Z, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, url=https://www.redhat.com, architecture=x86_64, vcs-type=git, build-date=2026-01-12T22:10:14Z, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, io.openshift.expose-services=, batch=17.1_20260112.1, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13)
Feb 23 07:58:27 np0005626463.localdomain podman[55994]: 2026-02-23 07:58:27.106402048 +0000 UTC m=+0.282589387 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, name=rhosp-rhel9/openstack-qdrouterd, maintainer=OpenStack TripleO Team, container_name=metrics_qdr, vcs-type=git, tcib_managed=true, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step1, build-date=2026-01-12T22:10:14Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, batch=17.1_20260112.1, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:10:14Z, architecture=x86_64, vendor=Red Hat, Inc., io.buildah.version=1.41.5, managed_by=tripleo_ansible)
Feb 23 07:58:27 np0005626463.localdomain systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully.
Feb 23 07:58:27 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 39 pg[4.0( empty local-lis/les=24/25 n=0 ec=24/24 lis/c=24/24 les/c/f=25/25/0 sis=39 pruub=10.236914635s) [4,0,5] r=2 lpr=39 pi=[24,39)/1 crt=0'0 mlcod 0'0 active pruub 1171.792724609s@ mbc={}] start_peering_interval up [4,0,5] -> [4,0,5], acting [4,0,5] -> [4,0,5], acting_primary 4 -> 4, up_primary 4 -> 4, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:58:27 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 39 pg[4.0( empty local-lis/les=24/25 n=0 ec=24/24 lis/c=24/24 les/c/f=25/25/0 sis=39 pruub=10.234679222s) [4,0,5] r=2 lpr=39 pi=[24,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1171.792724609s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:27 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 39 pg[5.0( empty local-lis/les=26/27 n=0 ec=26/26 lis/c=26/26 les/c/f=27/27/0 sis=39 pruub=12.557024956s) [2,4,3] r=0 lpr=39 pi=[26,39)/1 crt=0'0 mlcod 0'0 active pruub 1178.992553711s@ mbc={}] start_peering_interval up [2,4,3] -> [2,4,3], acting [2,4,3] -> [2,4,3], acting_primary 2 -> 2, up_primary 2 -> 2, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:58:27 np0005626463.localdomain sshd[56024]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 07:58:27 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 39 pg[5.0( empty local-lis/les=26/27 n=0 ec=26/26 lis/c=26/26 les/c/f=27/27/0 sis=39 pruub=12.557024956s) [2,4,3] r=0 lpr=39 pi=[26,39)/1 crt=0'0 mlcod 0'0 unknown pruub 1178.992553711s@ mbc={}] state<Start>: transitioning to Primary
Feb 23 07:58:28 np0005626463.localdomain sshd[56024]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 07:58:28 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 40 pg[5.1c( empty local-lis/les=26/27 n=0 ec=39/26 lis/c=26/26 les/c/f=27/27/0 sis=39) [2,4,3] r=0 lpr=39 pi=[26,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 23 07:58:28 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 40 pg[5.1a( empty local-lis/les=26/27 n=0 ec=39/26 lis/c=26/26 les/c/f=27/27/0 sis=39) [2,4,3] r=0 lpr=39 pi=[26,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 23 07:58:28 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 40 pg[5.17( empty local-lis/les=26/27 n=0 ec=39/26 lis/c=26/26 les/c/f=27/27/0 sis=39) [2,4,3] r=0 lpr=39 pi=[26,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 23 07:58:28 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 40 pg[5.1d( empty local-lis/les=26/27 n=0 ec=39/26 lis/c=26/26 les/c/f=27/27/0 sis=39) [2,4,3] r=0 lpr=39 pi=[26,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 23 07:58:28 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 40 pg[5.13( empty local-lis/les=26/27 n=0 ec=39/26 lis/c=26/26 les/c/f=27/27/0 sis=39) [2,4,3] r=0 lpr=39 pi=[26,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 23 07:58:28 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 40 pg[5.2( empty local-lis/les=26/27 n=0 ec=39/26 lis/c=26/26 les/c/f=27/27/0 sis=39) [2,4,3] r=0 lpr=39 pi=[26,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 23 07:58:28 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 40 pg[5.1b( empty local-lis/les=26/27 n=0 ec=39/26 lis/c=26/26 les/c/f=27/27/0 sis=39) [2,4,3] r=0 lpr=39 pi=[26,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 23 07:58:28 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 40 pg[5.8( empty local-lis/les=26/27 n=0 ec=39/26 lis/c=26/26 les/c/f=27/27/0 sis=39) [2,4,3] r=0 lpr=39 pi=[26,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 23 07:58:28 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 40 pg[5.14( empty local-lis/les=26/27 n=0 ec=39/26 lis/c=26/26 les/c/f=27/27/0 sis=39) [2,4,3] r=0 lpr=39 pi=[26,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 23 07:58:28 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 40 pg[5.b( empty local-lis/les=26/27 n=0 ec=39/26 lis/c=26/26 les/c/f=27/27/0 sis=39) [2,4,3] r=0 lpr=39 pi=[26,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 23 07:58:28 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 40 pg[5.d( empty local-lis/les=26/27 n=0 ec=39/26 lis/c=26/26 les/c/f=27/27/0 sis=39) [2,4,3] r=0 lpr=39 pi=[26,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 23 07:58:28 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 40 pg[5.e( empty local-lis/les=26/27 n=0 ec=39/26 lis/c=26/26 les/c/f=27/27/0 sis=39) [2,4,3] r=0 lpr=39 pi=[26,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 23 07:58:28 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 40 pg[5.f( empty local-lis/les=26/27 n=0 ec=39/26 lis/c=26/26 les/c/f=27/27/0 sis=39) [2,4,3] r=0 lpr=39 pi=[26,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 23 07:58:28 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 40 pg[5.16( empty local-lis/les=26/27 n=0 ec=39/26 lis/c=26/26 les/c/f=27/27/0 sis=39) [2,4,3] r=0 lpr=39 pi=[26,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 23 07:58:28 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 40 pg[5.1e( empty local-lis/les=26/27 n=0 ec=39/26 lis/c=26/26 les/c/f=27/27/0 sis=39) [2,4,3] r=0 lpr=39 pi=[26,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 23 07:58:28 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 40 pg[5.9( empty local-lis/les=26/27 n=0 ec=39/26 lis/c=26/26 les/c/f=27/27/0 sis=39) [2,4,3] r=0 lpr=39 pi=[26,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 23 07:58:28 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 40 pg[5.7( empty local-lis/les=26/27 n=0 ec=39/26 lis/c=26/26 les/c/f=27/27/0 sis=39) [2,4,3] r=0 lpr=39 pi=[26,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 23 07:58:28 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 40 pg[5.5( empty local-lis/les=26/27 n=0 ec=39/26 lis/c=26/26 les/c/f=27/27/0 sis=39) [2,4,3] r=0 lpr=39 pi=[26,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 23 07:58:28 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 40 pg[5.15( empty local-lis/les=26/27 n=0 ec=39/26 lis/c=26/26 les/c/f=27/27/0 sis=39) [2,4,3] r=0 lpr=39 pi=[26,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 23 07:58:28 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 40 pg[5.12( empty local-lis/les=26/27 n=0 ec=39/26 lis/c=26/26 les/c/f=27/27/0 sis=39) [2,4,3] r=0 lpr=39 pi=[26,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 23 07:58:28 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 40 pg[5.18( empty local-lis/les=26/27 n=0 ec=39/26 lis/c=26/26 les/c/f=27/27/0 sis=39) [2,4,3] r=0 lpr=39 pi=[26,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 23 07:58:28 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 40 pg[5.1( empty local-lis/les=26/27 n=0 ec=39/26 lis/c=26/26 les/c/f=27/27/0 sis=39) [2,4,3] r=0 lpr=39 pi=[26,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 23 07:58:28 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 40 pg[5.1f( empty local-lis/les=26/27 n=0 ec=39/26 lis/c=26/26 les/c/f=27/27/0 sis=39) [2,4,3] r=0 lpr=39 pi=[26,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 23 07:58:28 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 40 pg[5.3( empty local-lis/les=26/27 n=0 ec=39/26 lis/c=26/26 les/c/f=27/27/0 sis=39) [2,4,3] r=0 lpr=39 pi=[26,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 23 07:58:28 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 40 pg[5.11( empty local-lis/les=26/27 n=0 ec=39/26 lis/c=26/26 les/c/f=27/27/0 sis=39) [2,4,3] r=0 lpr=39 pi=[26,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 23 07:58:28 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 40 pg[5.4( empty local-lis/les=26/27 n=0 ec=39/26 lis/c=26/26 les/c/f=27/27/0 sis=39) [2,4,3] r=0 lpr=39 pi=[26,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 23 07:58:28 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 40 pg[5.6( empty local-lis/les=26/27 n=0 ec=39/26 lis/c=26/26 les/c/f=27/27/0 sis=39) [2,4,3] r=0 lpr=39 pi=[26,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 23 07:58:28 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 40 pg[5.a( empty local-lis/les=26/27 n=0 ec=39/26 lis/c=26/26 les/c/f=27/27/0 sis=39) [2,4,3] r=0 lpr=39 pi=[26,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 23 07:58:28 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 40 pg[5.c( empty local-lis/les=26/27 n=0 ec=39/26 lis/c=26/26 les/c/f=27/27/0 sis=39) [2,4,3] r=0 lpr=39 pi=[26,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 23 07:58:28 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 40 pg[5.19( empty local-lis/les=26/27 n=0 ec=39/26 lis/c=26/26 les/c/f=27/27/0 sis=39) [2,4,3] r=0 lpr=39 pi=[26,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 23 07:58:28 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 40 pg[5.10( empty local-lis/les=26/27 n=0 ec=39/26 lis/c=26/26 les/c/f=27/27/0 sis=39) [2,4,3] r=0 lpr=39 pi=[26,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 23 07:58:28 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 40 pg[4.1a( empty local-lis/les=24/25 n=0 ec=39/24 lis/c=24/24 les/c/f=25/25/0 sis=39) [4,0,5] r=2 lpr=39 pi=[24,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:28 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 40 pg[4.1d( empty local-lis/les=24/25 n=0 ec=39/24 lis/c=24/24 les/c/f=25/25/0 sis=39) [4,0,5] r=2 lpr=39 pi=[24,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:28 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 40 pg[4.18( empty local-lis/les=24/25 n=0 ec=39/24 lis/c=24/24 les/c/f=25/25/0 sis=39) [4,0,5] r=2 lpr=39 pi=[24,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:28 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 40 pg[4.e( empty local-lis/les=24/25 n=0 ec=39/24 lis/c=24/24 les/c/f=25/25/0 sis=39) [4,0,5] r=2 lpr=39 pi=[24,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:28 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 40 pg[4.f( empty local-lis/les=24/25 n=0 ec=39/24 lis/c=24/24 les/c/f=25/25/0 sis=39) [4,0,5] r=2 lpr=39 pi=[24,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:28 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 40 pg[4.1( empty local-lis/les=24/25 n=0 ec=39/24 lis/c=24/24 les/c/f=25/25/0 sis=39) [4,0,5] r=2 lpr=39 pi=[24,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:28 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 40 pg[4.2( empty local-lis/les=24/25 n=0 ec=39/24 lis/c=24/24 les/c/f=25/25/0 sis=39) [4,0,5] r=2 lpr=39 pi=[24,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:28 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 40 pg[4.4( empty local-lis/les=24/25 n=0 ec=39/24 lis/c=24/24 les/c/f=25/25/0 sis=39) [4,0,5] r=2 lpr=39 pi=[24,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:28 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 40 pg[4.6( empty local-lis/les=24/25 n=0 ec=39/24 lis/c=24/24 les/c/f=25/25/0 sis=39) [4,0,5] r=2 lpr=39 pi=[24,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:28 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 40 pg[4.7( empty local-lis/les=24/25 n=0 ec=39/24 lis/c=24/24 les/c/f=25/25/0 sis=39) [4,0,5] r=2 lpr=39 pi=[24,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:28 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 40 pg[4.5( empty local-lis/les=24/25 n=0 ec=39/24 lis/c=24/24 les/c/f=25/25/0 sis=39) [4,0,5] r=2 lpr=39 pi=[24,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:28 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 40 pg[4.b( empty local-lis/les=24/25 n=0 ec=39/24 lis/c=24/24 les/c/f=25/25/0 sis=39) [4,0,5] r=2 lpr=39 pi=[24,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:28 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 40 pg[4.d( empty local-lis/les=24/25 n=0 ec=39/24 lis/c=24/24 les/c/f=25/25/0 sis=39) [4,0,5] r=2 lpr=39 pi=[24,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:28 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 40 pg[4.8( empty local-lis/les=24/25 n=0 ec=39/24 lis/c=24/24 les/c/f=25/25/0 sis=39) [4,0,5] r=2 lpr=39 pi=[24,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:28 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 40 pg[4.16( empty local-lis/les=24/25 n=0 ec=39/24 lis/c=24/24 les/c/f=25/25/0 sis=39) [4,0,5] r=2 lpr=39 pi=[24,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:28 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 40 pg[4.1c( empty local-lis/les=24/25 n=0 ec=39/24 lis/c=24/24 les/c/f=25/25/0 sis=39) [4,0,5] r=2 lpr=39 pi=[24,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:28 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 40 pg[4.17( empty local-lis/les=24/25 n=0 ec=39/24 lis/c=24/24 les/c/f=25/25/0 sis=39) [4,0,5] r=2 lpr=39 pi=[24,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:28 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 40 pg[4.15( empty local-lis/les=24/25 n=0 ec=39/24 lis/c=24/24 les/c/f=25/25/0 sis=39) [4,0,5] r=2 lpr=39 pi=[24,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:28 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 40 pg[4.c( empty local-lis/les=24/25 n=0 ec=39/24 lis/c=24/24 les/c/f=25/25/0 sis=39) [4,0,5] r=2 lpr=39 pi=[24,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:28 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 40 pg[4.13( empty local-lis/les=24/25 n=0 ec=39/24 lis/c=24/24 les/c/f=25/25/0 sis=39) [4,0,5] r=2 lpr=39 pi=[24,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:28 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 40 pg[4.10( empty local-lis/les=24/25 n=0 ec=39/24 lis/c=24/24 les/c/f=25/25/0 sis=39) [4,0,5] r=2 lpr=39 pi=[24,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:28 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 40 pg[4.11( empty local-lis/les=24/25 n=0 ec=39/24 lis/c=24/24 les/c/f=25/25/0 sis=39) [4,0,5] r=2 lpr=39 pi=[24,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:28 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 40 pg[4.9( empty local-lis/les=24/25 n=0 ec=39/24 lis/c=24/24 les/c/f=25/25/0 sis=39) [4,0,5] r=2 lpr=39 pi=[24,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:28 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 40 pg[4.3( empty local-lis/les=24/25 n=0 ec=39/24 lis/c=24/24 les/c/f=25/25/0 sis=39) [4,0,5] r=2 lpr=39 pi=[24,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:28 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 40 pg[4.a( empty local-lis/les=24/25 n=0 ec=39/24 lis/c=24/24 les/c/f=25/25/0 sis=39) [4,0,5] r=2 lpr=39 pi=[24,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:28 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 40 pg[4.19( empty local-lis/les=24/25 n=0 ec=39/24 lis/c=24/24 les/c/f=25/25/0 sis=39) [4,0,5] r=2 lpr=39 pi=[24,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:28 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 40 pg[4.12( empty local-lis/les=24/25 n=0 ec=39/24 lis/c=24/24 les/c/f=25/25/0 sis=39) [4,0,5] r=2 lpr=39 pi=[24,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:28 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 40 pg[4.1b( empty local-lis/les=24/25 n=0 ec=39/24 lis/c=24/24 les/c/f=25/25/0 sis=39) [4,0,5] r=2 lpr=39 pi=[24,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:28 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 40 pg[4.1f( empty local-lis/les=24/25 n=0 ec=39/24 lis/c=24/24 les/c/f=25/25/0 sis=39) [4,0,5] r=2 lpr=39 pi=[24,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:28 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 40 pg[4.1e( empty local-lis/les=24/25 n=0 ec=39/24 lis/c=24/24 les/c/f=25/25/0 sis=39) [4,0,5] r=2 lpr=39 pi=[24,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:28 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 40 pg[4.14( empty local-lis/les=24/25 n=0 ec=39/24 lis/c=24/24 les/c/f=25/25/0 sis=39) [4,0,5] r=2 lpr=39 pi=[24,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:28 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 40 pg[5.0( empty local-lis/les=39/40 n=0 ec=26/26 lis/c=26/26 les/c/f=27/27/0 sis=39) [2,4,3] r=0 lpr=39 pi=[26,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 23 07:58:28 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 40 pg[5.18( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=26/26 les/c/f=27/27/0 sis=39) [2,4,3] r=0 lpr=39 pi=[26,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 23 07:58:28 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 40 pg[5.1b( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=26/26 les/c/f=27/27/0 sis=39) [2,4,3] r=0 lpr=39 pi=[26,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 23 07:58:28 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 40 pg[5.1a( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=26/26 les/c/f=27/27/0 sis=39) [2,4,3] r=0 lpr=39 pi=[26,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 23 07:58:28 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 40 pg[5.5( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=26/26 les/c/f=27/27/0 sis=39) [2,4,3] r=0 lpr=39 pi=[26,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 23 07:58:28 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 40 pg[5.2( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=26/26 les/c/f=27/27/0 sis=39) [2,4,3] r=0 lpr=39 pi=[26,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 23 07:58:28 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 40 pg[5.7( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=26/26 les/c/f=27/27/0 sis=39) [2,4,3] r=0 lpr=39 pi=[26,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 23 07:58:28 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 40 pg[5.1( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=26/26 les/c/f=27/27/0 sis=39) [2,4,3] r=0 lpr=39 pi=[26,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 23 07:58:28 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 40 pg[5.6( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=26/26 les/c/f=27/27/0 sis=39) [2,4,3] r=0 lpr=39 pi=[26,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 23 07:58:28 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 40 pg[5.f( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=26/26 les/c/f=27/27/0 sis=39) [2,4,3] r=0 lpr=39 pi=[26,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 23 07:58:28 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 40 pg[5.4( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=26/26 les/c/f=27/27/0 sis=39) [2,4,3] r=0 lpr=39 pi=[26,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 23 07:58:28 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 40 pg[5.19( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=26/26 les/c/f=27/27/0 sis=39) [2,4,3] r=0 lpr=39 pi=[26,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 23 07:58:28 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 40 pg[5.b( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=26/26 les/c/f=27/27/0 sis=39) [2,4,3] r=0 lpr=39 pi=[26,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 23 07:58:28 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 40 pg[5.1d( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=26/26 les/c/f=27/27/0 sis=39) [2,4,3] r=0 lpr=39 pi=[26,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 23 07:58:28 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 40 pg[5.14( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=26/26 les/c/f=27/27/0 sis=39) [2,4,3] r=0 lpr=39 pi=[26,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 23 07:58:28 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 40 pg[5.9( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=26/26 les/c/f=27/27/0 sis=39) [2,4,3] r=0 lpr=39 pi=[26,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 23 07:58:28 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 40 pg[5.17( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=26/26 les/c/f=27/27/0 sis=39) [2,4,3] r=0 lpr=39 pi=[26,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 23 07:58:28 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 40 pg[5.16( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=26/26 les/c/f=27/27/0 sis=39) [2,4,3] r=0 lpr=39 pi=[26,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 23 07:58:28 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 40 pg[5.13( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=26/26 les/c/f=27/27/0 sis=39) [2,4,3] r=0 lpr=39 pi=[26,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 23 07:58:28 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 40 pg[5.8( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=26/26 les/c/f=27/27/0 sis=39) [2,4,3] r=0 lpr=39 pi=[26,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 23 07:58:28 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 40 pg[5.3( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=26/26 les/c/f=27/27/0 sis=39) [2,4,3] r=0 lpr=39 pi=[26,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 23 07:58:28 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 40 pg[5.12( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=26/26 les/c/f=27/27/0 sis=39) [2,4,3] r=0 lpr=39 pi=[26,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 23 07:58:28 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 40 pg[5.1e( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=26/26 les/c/f=27/27/0 sis=39) [2,4,3] r=0 lpr=39 pi=[26,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 23 07:58:28 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 40 pg[5.10( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=26/26 les/c/f=27/27/0 sis=39) [2,4,3] r=0 lpr=39 pi=[26,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 23 07:58:28 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 40 pg[5.1f( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=26/26 les/c/f=27/27/0 sis=39) [2,4,3] r=0 lpr=39 pi=[26,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 23 07:58:28 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 40 pg[5.15( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=26/26 les/c/f=27/27/0 sis=39) [2,4,3] r=0 lpr=39 pi=[26,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 23 07:58:28 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 40 pg[5.c( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=26/26 les/c/f=27/27/0 sis=39) [2,4,3] r=0 lpr=39 pi=[26,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 23 07:58:28 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 40 pg[5.e( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=26/26 les/c/f=27/27/0 sis=39) [2,4,3] r=0 lpr=39 pi=[26,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 23 07:58:28 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 40 pg[5.1c( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=26/26 les/c/f=27/27/0 sis=39) [2,4,3] r=0 lpr=39 pi=[26,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 23 07:58:28 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 40 pg[5.11( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=26/26 les/c/f=27/27/0 sis=39) [2,4,3] r=0 lpr=39 pi=[26,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 23 07:58:28 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 40 pg[5.a( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=26/26 les/c/f=27/27/0 sis=39) [2,4,3] r=0 lpr=39 pi=[26,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 23 07:58:28 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 40 pg[5.d( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=26/26 les/c/f=27/27/0 sis=39) [2,4,3] r=0 lpr=39 pi=[26,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 23 07:58:29 np0005626463.localdomain ceph-osd[31633]: log_channel(cluster) log [DBG] : 5.0 deep-scrub starts
Feb 23 07:58:29 np0005626463.localdomain ceph-osd[31633]: log_channel(cluster) log [DBG] : 5.0 deep-scrub ok
Feb 23 07:58:29 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 41 pg[7.0( v 36'39 (0'0,36'39] local-lis/les=34/35 n=22 ec=34/34 lis/c=34/34 les/c/f=35/35/0 sis=41 pruub=11.574728966s) [0,5,4] r=1 lpr=41 pi=[34,41)/1 luod=0'0 lua=36'37 crt=36'39 lcod 36'38 mlcod 0'0 active pruub 1175.151367188s@ mbc={}] start_peering_interval up [0,5,4] -> [0,5,4], acting [0,5,4] -> [0,5,4], acting_primary 0 -> 0, up_primary 0 -> 0, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:58:29 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 41 pg[7.0( v 36'39 lc 0'0 (0'0,36'39] local-lis/les=34/35 n=1 ec=34/34 lis/c=34/34 les/c/f=35/35/0 sis=41 pruub=11.572754860s) [0,5,4] r=1 lpr=41 pi=[34,41)/1 crt=36'39 lcod 36'38 mlcod 0'0 unknown NOTIFY pruub 1175.151367188s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:29 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 41 pg[6.0( empty local-lis/les=33/34 n=0 ec=33/33 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.716135979s) [4,0,2] r=2 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active pruub 1179.187866211s@ mbc={}] start_peering_interval up [4,0,2] -> [4,0,2], acting [4,0,2] -> [4,0,2], acting_primary 4 -> 4, up_primary 4 -> 4, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:58:29 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 41 pg[6.0( empty local-lis/les=33/34 n=0 ec=33/33 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.713320732s) [4,0,2] r=2 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1179.187866211s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:30 np0005626463.localdomain sudo[56071]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ktsjjecaazhuilphtspdbsrzanjswkug ; /usr/bin/python3
Feb 23 07:58:30 np0005626463.localdomain sudo[56071]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:58:30 np0005626463.localdomain python3[56073]: ansible-ansible.legacy.stat Invoked with path=/var/lib/tripleo-config/ceph/ceph.client.manila.keyring follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 07:58:30 np0005626463.localdomain sudo[56071]: pam_unix(sudo:session): session closed for user root
Feb 23 07:58:30 np0005626463.localdomain sudo[56114]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kvmjvifwbziwvygeafaedytnktqkrqnv ; /usr/bin/python3
Feb 23 07:58:30 np0005626463.localdomain sudo[56114]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:58:30 np0005626463.localdomain python3[56116]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771833510.1481686-92678-160076057041176/source dest=/var/lib/tripleo-config/ceph/ceph.client.manila.keyring mode=600 _original_basename=ceph.client.manila.keyring follow=False checksum=04bfb06bbb9d2445e353d8ca8467b47fb8316e81 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 07:58:30 np0005626463.localdomain sudo[56114]: pam_unix(sudo:session): session closed for user root
Feb 23 07:58:31 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 42 pg[7.2( v 36'39 lc 0'0 (0'0,36'39] local-lis/les=34/35 n=2 ec=41/34 lis/c=34/34 les/c/f=35/35/0 sis=41) [0,5,4] r=1 lpr=41 pi=[34,41)/1 crt=36'39 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:31 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 42 pg[7.d( v 36'39 lc 0'0 (0'0,36'39] local-lis/les=34/35 n=1 ec=41/34 lis/c=34/34 les/c/f=35/35/0 sis=41) [0,5,4] r=1 lpr=41 pi=[34,41)/1 crt=36'39 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:31 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 42 pg[7.1( v 36'39 (0'0,36'39] local-lis/les=34/35 n=2 ec=41/34 lis/c=34/34 les/c/f=35/35/0 sis=41) [0,5,4] r=1 lpr=41 pi=[34,41)/1 crt=36'39 lcod 0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:31 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 42 pg[7.7( v 36'39 lc 0'0 (0'0,36'39] local-lis/les=34/35 n=1 ec=41/34 lis/c=34/34 les/c/f=35/35/0 sis=41) [0,5,4] r=1 lpr=41 pi=[34,41)/1 crt=36'39 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:31 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 42 pg[7.3( v 36'39 lc 0'0 (0'0,36'39] local-lis/les=34/35 n=2 ec=41/34 lis/c=34/34 les/c/f=35/35/0 sis=41) [0,5,4] r=1 lpr=41 pi=[34,41)/1 crt=36'39 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:31 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 42 pg[7.4( v 36'39 lc 0'0 (0'0,36'39] local-lis/les=34/35 n=2 ec=41/34 lis/c=34/34 les/c/f=35/35/0 sis=41) [0,5,4] r=1 lpr=41 pi=[34,41)/1 crt=36'39 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:31 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 42 pg[7.5( v 36'39 lc 0'0 (0'0,36'39] local-lis/les=34/35 n=2 ec=41/34 lis/c=34/34 les/c/f=35/35/0 sis=41) [0,5,4] r=1 lpr=41 pi=[34,41)/1 crt=36'39 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:31 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 42 pg[7.f( v 36'39 lc 0'0 (0'0,36'39] local-lis/les=34/35 n=1 ec=41/34 lis/c=34/34 les/c/f=35/35/0 sis=41) [0,5,4] r=1 lpr=41 pi=[34,41)/1 crt=36'39 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:31 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 42 pg[7.e( v 36'39 lc 0'0 (0'0,36'39] local-lis/les=34/35 n=1 ec=41/34 lis/c=34/34 les/c/f=35/35/0 sis=41) [0,5,4] r=1 lpr=41 pi=[34,41)/1 crt=36'39 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:31 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 42 pg[7.8( v 36'39 lc 0'0 (0'0,36'39] local-lis/les=34/35 n=1 ec=41/34 lis/c=34/34 les/c/f=35/35/0 sis=41) [0,5,4] r=1 lpr=41 pi=[34,41)/1 crt=36'39 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:31 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 42 pg[7.c( v 36'39 lc 0'0 (0'0,36'39] local-lis/les=34/35 n=1 ec=41/34 lis/c=34/34 les/c/f=35/35/0 sis=41) [0,5,4] r=1 lpr=41 pi=[34,41)/1 crt=36'39 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:31 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 42 pg[7.b( v 36'39 lc 0'0 (0'0,36'39] local-lis/les=34/35 n=1 ec=41/34 lis/c=34/34 les/c/f=35/35/0 sis=41) [0,5,4] r=1 lpr=41 pi=[34,41)/1 crt=36'39 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:31 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 42 pg[7.6( v 36'39 lc 0'0 (0'0,36'39] local-lis/les=34/35 n=2 ec=41/34 lis/c=34/34 les/c/f=35/35/0 sis=41) [0,5,4] r=1 lpr=41 pi=[34,41)/1 crt=36'39 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:31 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 42 pg[7.a( v 36'39 lc 0'0 (0'0,36'39] local-lis/les=34/35 n=1 ec=41/34 lis/c=34/34 les/c/f=35/35/0 sis=41) [0,5,4] r=1 lpr=41 pi=[34,41)/1 crt=36'39 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:31 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 42 pg[7.9( v 36'39 lc 0'0 (0'0,36'39] local-lis/les=34/35 n=1 ec=41/34 lis/c=34/34 les/c/f=35/35/0 sis=41) [0,5,4] r=1 lpr=41 pi=[34,41)/1 crt=36'39 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:31 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 42 pg[6.1a( empty local-lis/les=33/34 n=0 ec=41/33 lis/c=33/33 les/c/f=34/34/0 sis=41) [4,0,2] r=2 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:31 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 42 pg[6.13( empty local-lis/les=33/34 n=0 ec=41/33 lis/c=33/33 les/c/f=34/34/0 sis=41) [4,0,2] r=2 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:31 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 42 pg[6.5( empty local-lis/les=33/34 n=0 ec=41/33 lis/c=33/33 les/c/f=34/34/0 sis=41) [4,0,2] r=2 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:31 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 42 pg[6.7( empty local-lis/les=33/34 n=0 ec=41/33 lis/c=33/33 les/c/f=34/34/0 sis=41) [4,0,2] r=2 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:31 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 42 pg[6.1c( empty local-lis/les=33/34 n=0 ec=41/33 lis/c=33/33 les/c/f=34/34/0 sis=41) [4,0,2] r=2 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:31 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 42 pg[6.2( empty local-lis/les=33/34 n=0 ec=41/33 lis/c=33/33 les/c/f=34/34/0 sis=41) [4,0,2] r=2 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:31 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 42 pg[6.11( empty local-lis/les=33/34 n=0 ec=41/33 lis/c=33/33 les/c/f=34/34/0 sis=41) [4,0,2] r=2 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:31 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 42 pg[6.16( empty local-lis/les=33/34 n=0 ec=41/33 lis/c=33/33 les/c/f=34/34/0 sis=41) [4,0,2] r=2 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:31 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 42 pg[6.6( empty local-lis/les=33/34 n=0 ec=41/33 lis/c=33/33 les/c/f=34/34/0 sis=41) [4,0,2] r=2 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:31 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 42 pg[6.1d( empty local-lis/les=33/34 n=0 ec=41/33 lis/c=33/33 les/c/f=34/34/0 sis=41) [4,0,2] r=2 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:31 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 42 pg[6.1b( empty local-lis/les=33/34 n=0 ec=41/33 lis/c=33/33 les/c/f=34/34/0 sis=41) [4,0,2] r=2 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:31 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 42 pg[6.f( empty local-lis/les=33/34 n=0 ec=41/33 lis/c=33/33 les/c/f=34/34/0 sis=41) [4,0,2] r=2 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:31 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 42 pg[6.a( empty local-lis/les=33/34 n=0 ec=41/33 lis/c=33/33 les/c/f=34/34/0 sis=41) [4,0,2] r=2 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:31 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 42 pg[6.12( empty local-lis/les=33/34 n=0 ec=41/33 lis/c=33/33 les/c/f=34/34/0 sis=41) [4,0,2] r=2 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:31 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 42 pg[6.15( empty local-lis/les=33/34 n=0 ec=41/33 lis/c=33/33 les/c/f=34/34/0 sis=41) [4,0,2] r=2 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:31 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 42 pg[6.9( empty local-lis/les=33/34 n=0 ec=41/33 lis/c=33/33 les/c/f=34/34/0 sis=41) [4,0,2] r=2 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:31 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 42 pg[6.d( empty local-lis/les=33/34 n=0 ec=41/33 lis/c=33/33 les/c/f=34/34/0 sis=41) [4,0,2] r=2 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:31 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 42 pg[6.c( empty local-lis/les=33/34 n=0 ec=41/33 lis/c=33/33 les/c/f=34/34/0 sis=41) [4,0,2] r=2 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:31 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 42 pg[6.e( empty local-lis/les=33/34 n=0 ec=41/33 lis/c=33/33 les/c/f=34/34/0 sis=41) [4,0,2] r=2 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:31 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 42 pg[6.8( empty local-lis/les=33/34 n=0 ec=41/33 lis/c=33/33 les/c/f=34/34/0 sis=41) [4,0,2] r=2 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:31 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 42 pg[6.18( empty local-lis/les=33/34 n=0 ec=41/33 lis/c=33/33 les/c/f=34/34/0 sis=41) [4,0,2] r=2 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:31 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 42 pg[6.b( empty local-lis/les=33/34 n=0 ec=41/33 lis/c=33/33 les/c/f=34/34/0 sis=41) [4,0,2] r=2 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:31 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 42 pg[6.17( empty local-lis/les=33/34 n=0 ec=41/33 lis/c=33/33 les/c/f=34/34/0 sis=41) [4,0,2] r=2 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:31 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 42 pg[6.1( empty local-lis/les=33/34 n=0 ec=41/33 lis/c=33/33 les/c/f=34/34/0 sis=41) [4,0,2] r=2 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:31 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 42 pg[6.10( empty local-lis/les=33/34 n=0 ec=41/33 lis/c=33/33 les/c/f=34/34/0 sis=41) [4,0,2] r=2 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:31 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 42 pg[6.14( empty local-lis/les=33/34 n=0 ec=41/33 lis/c=33/33 les/c/f=34/34/0 sis=41) [4,0,2] r=2 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:31 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 42 pg[6.3( empty local-lis/les=33/34 n=0 ec=41/33 lis/c=33/33 les/c/f=34/34/0 sis=41) [4,0,2] r=2 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:31 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 42 pg[6.19( empty local-lis/les=33/34 n=0 ec=41/33 lis/c=33/33 les/c/f=34/34/0 sis=41) [4,0,2] r=2 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:31 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 42 pg[6.4( empty local-lis/les=33/34 n=0 ec=41/33 lis/c=33/33 les/c/f=34/34/0 sis=41) [4,0,2] r=2 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:31 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 42 pg[6.1f( empty local-lis/les=33/34 n=0 ec=41/33 lis/c=33/33 les/c/f=34/34/0 sis=41) [4,0,2] r=2 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:31 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 42 pg[6.1e( empty local-lis/les=33/34 n=0 ec=41/33 lis/c=33/33 les/c/f=34/34/0 sis=41) [4,0,2] r=2 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:32 np0005626463.localdomain sshd[55888]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 43 pg[6.1e( empty local-lis/les=0/0 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43) [5,0,4] r=0 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 43 pg[7.d( v 36'39 (0'0,36'39] local-lis/les=41/42 n=1 ec=41/34 lis/c=41/41 les/c/f=42/42/0 sis=43 pruub=12.828355789s) [0,2,4] r=-1 lpr=43 pi=[41,43)/1 luod=0'0 crt=36'39 lcod 0'0 mlcod 0'0 active pruub 1180.764892578s@ mbc={}] start_peering_interval up [0,5,4] -> [0,2,4], acting [0,5,4] -> [0,2,4], acting_primary 0 -> 0, up_primary 0 -> 0, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 43 pg[4.f( empty local-lis/les=39/40 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.637487411s) [1,2,3] r=-1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active pruub 1178.573974609s@ mbc={}] start_peering_interval up [4,0,5] -> [1,2,3], acting [4,0,5] -> [1,2,3], acting_primary 4 -> 1, up_primary 4 -> 1, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 43 pg[7.d( v 36'39 (0'0,36'39] local-lis/les=41/42 n=1 ec=41/34 lis/c=41/41 les/c/f=42/42/0 sis=43 pruub=12.828296661s) [0,2,4] r=-1 lpr=43 pi=[41,43)/1 crt=36'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1180.764892578s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 43 pg[4.f( empty local-lis/les=39/40 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.637428284s) [1,2,3] r=-1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1178.573974609s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 43 pg[4.2( empty local-lis/les=0/0 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43) [2,3,4] r=0 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 43 pg[6.5( empty local-lis/les=41/42 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43 pruub=12.835462570s) [0,2,1] r=1 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 active pruub 1185.642700195s@ mbc={}] start_peering_interval up [4,0,2] -> [0,2,1], acting [4,0,2] -> [0,2,1], acting_primary 4 -> 0, up_primary 4 -> 0, role 2 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 43 pg[7.1( v 36'39 (0'0,36'39] local-lis/les=41/42 n=2 ec=41/34 lis/c=41/41 les/c/f=42/42/0 sis=43 pruub=12.827306747s) [0,2,4] r=-1 lpr=43 pi=[41,43)/1 luod=0'0 crt=36'39 lcod 0'0 mlcod 0'0 active pruub 1180.764160156s@ mbc={}] start_peering_interval up [0,5,4] -> [0,2,4], acting [0,5,4] -> [0,2,4], acting_primary 0 -> 0, up_primary 0 -> 0, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 43 pg[7.1( v 36'39 (0'0,36'39] local-lis/les=41/42 n=2 ec=41/34 lis/c=41/41 les/c/f=42/42/0 sis=43 pruub=12.827263832s) [0,2,4] r=-1 lpr=43 pi=[41,43)/1 crt=36'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1180.764160156s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 43 pg[4.6( empty local-lis/les=39/40 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.640380859s) [5,1,0] r=0 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active pruub 1178.577514648s@ mbc={}] start_peering_interval up [4,0,5] -> [5,1,0], acting [4,0,5] -> [5,1,0], acting_primary 4 -> 5, up_primary 4 -> 5, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 43 pg[4.4( empty local-lis/les=39/40 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.640395164s) [1,5,3] r=1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active pruub 1178.577514648s@ mbc={}] start_peering_interval up [4,0,5] -> [1,5,3], acting [4,0,5] -> [1,5,3], acting_primary 4 -> 1, up_primary 4 -> 1, role 2 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 43 pg[4.6( empty local-lis/les=39/40 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.640380859s) [5,1,0] r=0 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown pruub 1178.577514648s@ mbc={}] state<Start>: transitioning to Primary
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 43 pg[4.4( empty local-lis/les=39/40 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.640355110s) [1,5,3] r=1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1178.577514648s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 43 pg[4.c( empty local-lis/les=39/40 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.640630722s) [0,4,2] r=-1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active pruub 1178.578002930s@ mbc={}] start_peering_interval up [4,0,5] -> [0,4,2], acting [4,0,5] -> [0,4,2], acting_primary 4 -> 0, up_primary 4 -> 0, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 43 pg[7.3( v 36'39 (0'0,36'39] local-lis/les=41/42 n=2 ec=41/34 lis/c=41/41 les/c/f=42/42/0 sis=43 pruub=12.826497078s) [0,2,4] r=-1 lpr=43 pi=[41,43)/1 luod=0'0 crt=36'39 lcod 0'0 mlcod 0'0 active pruub 1180.763916016s@ mbc={}] start_peering_interval up [0,5,4] -> [0,2,4], acting [0,5,4] -> [0,2,4], acting_primary 0 -> 0, up_primary 0 -> 0, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 43 pg[4.c( empty local-lis/les=39/40 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.640596390s) [0,4,2] r=-1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1178.578002930s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 43 pg[6.7( empty local-lis/les=41/42 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43 pruub=12.834444046s) [3,1,2] r=2 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 active pruub 1185.642822266s@ mbc={}] start_peering_interval up [4,0,2] -> [3,1,2], acting [4,0,2] -> [3,1,2], acting_primary 4 -> 3, up_primary 4 -> 3, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 43 pg[7.3( v 36'39 (0'0,36'39] local-lis/les=41/42 n=2 ec=41/34 lis/c=41/41 les/c/f=42/42/0 sis=43 pruub=12.826431274s) [0,2,4] r=-1 lpr=43 pi=[41,43)/1 crt=36'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1180.763916016s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 43 pg[7.f( v 36'39 (0'0,36'39] local-lis/les=41/42 n=1 ec=41/34 lis/c=41/41 les/c/f=42/42/0 sis=43 pruub=12.828287125s) [0,2,4] r=-1 lpr=43 pi=[41,43)/1 luod=0'0 crt=36'39 lcod 0'0 mlcod 0'0 active pruub 1180.765869141s@ mbc={}] start_peering_interval up [0,5,4] -> [0,2,4], acting [0,5,4] -> [0,2,4], acting_primary 0 -> 0, up_primary 0 -> 0, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 43 pg[7.7( v 36'39 (0'0,36'39] local-lis/les=41/42 n=1 ec=41/34 lis/c=41/41 les/c/f=42/42/0 sis=43 pruub=12.826339722s) [0,2,4] r=-1 lpr=43 pi=[41,43)/1 luod=0'0 crt=36'39 lcod 0'0 mlcod 0'0 active pruub 1180.763916016s@ mbc={}] start_peering_interval up [0,5,4] -> [0,2,4], acting [0,5,4] -> [0,2,4], acting_primary 0 -> 0, up_primary 0 -> 0, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 43 pg[7.f( v 36'39 (0'0,36'39] local-lis/les=41/42 n=1 ec=41/34 lis/c=41/41 les/c/f=42/42/0 sis=43 pruub=12.828247070s) [0,2,4] r=-1 lpr=43 pi=[41,43)/1 crt=36'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1180.765869141s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 43 pg[7.7( v 36'39 (0'0,36'39] local-lis/les=41/42 n=1 ec=41/34 lis/c=41/41 les/c/f=42/42/0 sis=43 pruub=12.826306343s) [0,2,4] r=-1 lpr=43 pi=[41,43)/1 crt=36'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1180.763916016s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 43 pg[6.2( empty local-lis/les=41/42 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43 pruub=12.835981369s) [3,4,2] r=2 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 active pruub 1185.644531250s@ mbc={}] start_peering_interval up [4,0,2] -> [3,4,2], acting [4,0,2] -> [3,4,2], acting_primary 4 -> 3, up_primary 4 -> 3, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 43 pg[4.8( empty local-lis/les=39/40 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.643124580s) [5,3,4] r=0 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active pruub 1178.580932617s@ mbc={}] start_peering_interval up [4,0,5] -> [5,3,4], acting [4,0,5] -> [5,3,4], acting_primary 4 -> 5, up_primary 4 -> 5, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 43 pg[4.8( empty local-lis/les=39/40 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.643124580s) [5,3,4] r=0 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown pruub 1178.580932617s@ mbc={}] state<Start>: transitioning to Primary
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 43 pg[7.b( v 36'39 (0'0,36'39] local-lis/les=41/42 n=1 ec=41/34 lis/c=41/41 les/c/f=42/42/0 sis=43 pruub=12.826300621s) [0,2,4] r=-1 lpr=43 pi=[41,43)/1 luod=0'0 crt=36'39 lcod 0'0 mlcod 0'0 active pruub 1180.764160156s@ mbc={}] start_peering_interval up [0,5,4] -> [0,2,4], acting [0,5,4] -> [0,2,4], acting_primary 0 -> 0, up_primary 0 -> 0, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 43 pg[7.5( v 36'39 (0'0,36'39] local-lis/les=41/42 n=2 ec=41/34 lis/c=41/41 les/c/f=42/42/0 sis=43 pruub=12.828134537s) [0,2,4] r=-1 lpr=43 pi=[41,43)/1 luod=0'0 crt=36'39 lcod 0'0 mlcod 0'0 active pruub 1180.766235352s@ mbc={}] start_peering_interval up [0,5,4] -> [0,2,4], acting [0,5,4] -> [0,2,4], acting_primary 0 -> 0, up_primary 0 -> 0, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 43 pg[7.5( v 36'39 (0'0,36'39] local-lis/les=41/42 n=2 ec=41/34 lis/c=41/41 les/c/f=42/42/0 sis=43 pruub=12.828027725s) [0,2,4] r=-1 lpr=43 pi=[41,43)/1 crt=36'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1180.766235352s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 43 pg[7.b( v 36'39 (0'0,36'39] local-lis/les=41/42 n=1 ec=41/34 lis/c=41/41 les/c/f=42/42/0 sis=43 pruub=12.826264381s) [0,2,4] r=-1 lpr=43 pi=[41,43)/1 crt=36'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1180.764160156s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 43 pg[4.17( empty local-lis/les=39/40 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.639589310s) [1,3,5] r=2 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active pruub 1178.577880859s@ mbc={}] start_peering_interval up [4,0,5] -> [1,3,5], acting [4,0,5] -> [1,3,5], acting_primary 4 -> 1, up_primary 4 -> 1, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 43 pg[6.2( empty local-lis/les=41/42 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43 pruub=12.835575104s) [3,4,2] r=2 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1185.644531250s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 43 pg[4.17( empty local-lis/les=39/40 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.639551163s) [1,3,5] r=2 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1178.577880859s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 43 pg[6.5( empty local-lis/les=41/42 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43 pruub=12.833481789s) [0,2,1] r=1 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1185.642700195s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 43 pg[4.15( empty local-lis/les=39/40 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.641688347s) [5,4,0] r=0 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active pruub 1178.580200195s@ mbc={}] start_peering_interval up [4,0,5] -> [5,4,0], acting [4,0,5] -> [5,4,0], acting_primary 4 -> 5, up_primary 4 -> 5, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 43 pg[4.15( empty local-lis/les=39/40 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.641688347s) [5,4,0] r=0 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown pruub 1178.580200195s@ mbc={}] state<Start>: transitioning to Primary
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 43 pg[4.14( empty local-lis/les=39/40 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.646197319s) [5,1,0] r=0 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active pruub 1178.584716797s@ mbc={}] start_peering_interval up [4,0,5] -> [5,1,0], acting [4,0,5] -> [5,1,0], acting_primary 4 -> 5, up_primary 4 -> 5, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 43 pg[4.14( empty local-lis/les=39/40 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.646197319s) [5,1,0] r=0 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown pruub 1178.584716797s@ mbc={}] state<Start>: transitioning to Primary
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 43 pg[6.7( empty local-lis/les=41/42 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43 pruub=12.833578110s) [3,1,2] r=2 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1185.642822266s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 43 pg[6.17( empty local-lis/les=0/0 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43) [5,4,0] r=0 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 43 pg[4.b( empty local-lis/les=39/40 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.641355515s) [1,0,5] r=2 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active pruub 1178.580322266s@ mbc={}] start_peering_interval up [4,0,5] -> [1,0,5], acting [4,0,5] -> [1,0,5], acting_primary 4 -> 1, up_primary 4 -> 1, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 43 pg[4.b( empty local-lis/les=39/40 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.641232491s) [1,0,5] r=2 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1178.580322266s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 43 pg[4.19( empty local-lis/les=0/0 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43) [2,1,0] r=0 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 43 pg[4.13( empty local-lis/les=39/40 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.640675545s) [0,2,1] r=-1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active pruub 1178.580078125s@ mbc={}] start_peering_interval up [4,0,5] -> [0,2,1], acting [4,0,5] -> [0,2,1], acting_primary 4 -> 0, up_primary 4 -> 0, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 43 pg[4.10( empty local-lis/les=39/40 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.640784264s) [1,2,3] r=-1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active pruub 1178.580200195s@ mbc={}] start_peering_interval up [4,0,5] -> [1,2,3], acting [4,0,5] -> [1,2,3], acting_primary 4 -> 1, up_primary 4 -> 1, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 43 pg[4.13( empty local-lis/les=39/40 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.640604973s) [0,2,1] r=-1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1178.580078125s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 43 pg[4.10( empty local-lis/les=39/40 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.640713692s) [1,2,3] r=-1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1178.580200195s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 43 pg[6.12( empty local-lis/les=0/0 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43) [5,0,1] r=0 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 43 pg[4.1f( empty local-lis/les=0/0 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43) [2,3,1] r=0 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 43 pg[4.1e( empty local-lis/les=39/40 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.644660950s) [1,0,5] r=2 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active pruub 1178.584716797s@ mbc={}] start_peering_interval up [4,0,5] -> [1,0,5], acting [4,0,5] -> [1,0,5], acting_primary 4 -> 1, up_primary 4 -> 1, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 43 pg[6.a( empty local-lis/les=41/42 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43 pruub=12.832584381s) [0,4,2] r=2 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 active pruub 1185.643554688s@ mbc={}] start_peering_interval up [4,0,2] -> [0,4,2], acting [4,0,2] -> [0,4,2], acting_primary 4 -> 0, up_primary 4 -> 0, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 43 pg[4.1e( empty local-lis/les=39/40 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.644611359s) [1,0,5] r=2 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1178.584716797s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 43 pg[6.a( empty local-lis/les=41/42 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43 pruub=12.832528114s) [0,4,2] r=2 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1185.643554688s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 43 pg[6.1c( empty local-lis/les=0/0 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43) [5,1,3] r=0 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 43 pg[6.c( empty local-lis/les=41/42 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43 pruub=12.832357407s) [1,0,5] r=-1 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 active pruub 1185.643554688s@ mbc={}] start_peering_interval up [4,0,2] -> [1,0,5], acting [4,0,2] -> [1,0,5], acting_primary 4 -> 1, up_primary 4 -> 1, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 43 pg[6.c( empty local-lis/les=41/42 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43 pruub=12.832250595s) [1,0,5] r=-1 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1185.643554688s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 43 pg[4.9( empty local-lis/les=39/40 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.639661789s) [5,1,0] r=0 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active pruub 1178.580322266s@ mbc={}] start_peering_interval up [4,0,5] -> [5,1,0], acting [4,0,5] -> [5,1,0], acting_primary 4 -> 5, up_primary 4 -> 5, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 43 pg[4.9( empty local-lis/les=39/40 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.639661789s) [5,1,0] r=0 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown pruub 1178.580322266s@ mbc={}] state<Start>: transitioning to Primary
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 43 pg[4.a( empty local-lis/les=39/40 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.639704704s) [0,1,2] r=-1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active pruub 1178.580566406s@ mbc={}] start_peering_interval up [4,0,5] -> [0,1,2], acting [4,0,5] -> [0,1,2], acting_primary 4 -> 0, up_primary 4 -> 0, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 43 pg[6.d( empty local-lis/les=41/42 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43 pruub=12.832991600s) [0,4,2] r=2 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 active pruub 1185.644531250s@ mbc={}] start_peering_interval up [4,0,2] -> [0,4,2], acting [4,0,2] -> [0,4,2], acting_primary 4 -> 0, up_primary 4 -> 0, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 43 pg[6.d( empty local-lis/les=41/42 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43 pruub=12.832948685s) [0,4,2] r=2 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1185.644531250s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 43 pg[4.a( empty local-lis/les=39/40 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.639678001s) [0,1,2] r=-1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1178.580566406s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 43 pg[7.9( v 36'39 (0'0,36'39] local-lis/les=41/42 n=1 ec=41/34 lis/c=41/41 les/c/f=42/42/0 sis=43 pruub=12.823955536s) [0,2,4] r=-1 lpr=43 pi=[41,43)/1 luod=0'0 crt=36'39 lcod 0'0 mlcod 0'0 active pruub 1180.765014648s@ mbc={}] start_peering_interval up [0,5,4] -> [0,2,4], acting [0,5,4] -> [0,2,4], acting_primary 0 -> 0, up_primary 0 -> 0, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 43 pg[7.9( v 36'39 (0'0,36'39] local-lis/les=41/42 n=1 ec=41/34 lis/c=41/41 les/c/f=42/42/0 sis=43 pruub=12.823917389s) [0,2,4] r=-1 lpr=43 pi=[41,43)/1 crt=36'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1180.765014648s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 43 pg[6.e( empty local-lis/les=41/42 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43 pruub=12.831695557s) [3,1,2] r=2 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 active pruub 1185.643676758s@ mbc={}] start_peering_interval up [4,0,2] -> [3,1,2], acting [4,0,2] -> [3,1,2], acting_primary 4 -> 3, up_primary 4 -> 3, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 43 pg[6.e( empty local-lis/les=41/42 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43 pruub=12.831587791s) [3,1,2] r=2 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1185.643676758s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 43 pg[6.1b( empty local-lis/les=0/0 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43) [5,3,1] r=0 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 43 pg[6.b( empty local-lis/les=41/42 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43 pruub=12.831339836s) [1,3,2] r=2 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 active pruub 1185.643676758s@ mbc={}] start_peering_interval up [4,0,2] -> [1,3,2], acting [4,0,2] -> [1,3,2], acting_primary 4 -> 1, up_primary 4 -> 1, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 43 pg[4.1f( empty local-lis/les=39/40 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.643181801s) [2,3,1] r=-1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active pruub 1178.584716797s@ mbc={}] start_peering_interval up [4,0,5] -> [2,3,1], acting [4,0,5] -> [2,3,1], acting_primary 4 -> 2, up_primary 4 -> 2, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 43 pg[4.7( empty local-lis/les=39/40 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.635969162s) [1,5,3] r=1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active pruub 1178.577514648s@ mbc={}] start_peering_interval up [4,0,5] -> [1,5,3], acting [4,0,5] -> [1,5,3], acting_primary 4 -> 1, up_primary 4 -> 1, role 2 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 43 pg[6.b( empty local-lis/les=41/42 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43 pruub=12.831272125s) [1,3,2] r=2 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1185.643676758s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 43 pg[4.7( empty local-lis/les=39/40 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.635932922s) [1,5,3] r=1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1178.577514648s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 43 pg[4.1f( empty local-lis/les=39/40 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.643082619s) [2,3,1] r=-1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1178.584716797s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 43 pg[6.1( empty local-lis/les=41/42 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43 pruub=12.831761360s) [2,0,4] r=0 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 active pruub 1185.644287109s@ mbc={}] start_peering_interval up [4,0,2] -> [2,0,4], acting [4,0,2] -> [2,0,4], acting_primary 4 -> 2, up_primary 4 -> 2, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 43 pg[4.1b( empty local-lis/les=39/40 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.642868996s) [3,1,5] r=2 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active pruub 1178.584716797s@ mbc={}] start_peering_interval up [4,0,5] -> [3,1,5], acting [4,0,5] -> [3,1,5], acting_primary 4 -> 3, up_primary 4 -> 3, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 43 pg[4.19( empty local-lis/les=39/40 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.639142990s) [2,1,0] r=-1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active pruub 1178.581054688s@ mbc={}] start_peering_interval up [4,0,5] -> [2,1,0], acting [4,0,5] -> [2,1,0], acting_primary 4 -> 2, up_primary 4 -> 2, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 43 pg[4.1b( empty local-lis/les=39/40 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.642843246s) [3,1,5] r=2 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1178.584716797s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 43 pg[6.1( empty local-lis/les=41/42 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43 pruub=12.831761360s) [2,0,4] r=0 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 unknown pruub 1185.644287109s@ mbc={}] state<Start>: transitioning to Primary
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 43 pg[4.19( empty local-lis/les=39/40 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.639110565s) [2,1,0] r=-1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1178.581054688s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 43 pg[4.12( empty local-lis/les=39/40 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.642658234s) [1,5,3] r=1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active pruub 1178.584594727s@ mbc={}] start_peering_interval up [4,0,5] -> [1,5,3], acting [4,0,5] -> [1,5,3], acting_primary 4 -> 1, up_primary 4 -> 1, role 2 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 43 pg[4.12( empty local-lis/les=39/40 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.642630577s) [1,5,3] r=1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1178.584594727s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 43 pg[4.3( empty local-lis/les=39/40 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.638156891s) [2,3,1] r=-1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active pruub 1178.580322266s@ mbc={}] start_peering_interval up [4,0,5] -> [2,3,1], acting [4,0,5] -> [2,3,1], acting_primary 4 -> 2, up_primary 4 -> 2, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 43 pg[4.3( empty local-lis/les=0/0 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43) [2,3,1] r=0 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 43 pg[4.3( empty local-lis/les=39/40 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.638125420s) [2,3,1] r=-1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1178.580322266s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 43 pg[6.3( empty local-lis/les=41/42 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43 pruub=12.831203461s) [0,5,1] r=-1 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 active pruub 1185.644409180s@ mbc={}] start_peering_interval up [4,0,2] -> [0,5,1], acting [4,0,2] -> [0,5,1], acting_primary 4 -> 0, up_primary 4 -> 0, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 43 pg[6.3( empty local-lis/les=41/42 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43 pruub=12.831161499s) [0,5,1] r=-1 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1185.644409180s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 43 pg[4.d( empty local-lis/les=39/40 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.635065079s) [3,2,1] r=-1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active pruub 1178.577636719s@ mbc={}] start_peering_interval up [4,0,5] -> [3,2,1], acting [4,0,5] -> [3,2,1], acting_primary 4 -> 3, up_primary 4 -> 3, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 43 pg[4.5( empty local-lis/les=39/40 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.634901047s) [3,5,1] r=1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active pruub 1178.577514648s@ mbc={}] start_peering_interval up [4,0,5] -> [3,5,1], acting [4,0,5] -> [3,5,1], acting_primary 4 -> 3, up_primary 4 -> 3, role 2 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 43 pg[4.1( empty local-lis/les=0/0 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43) [2,0,1] r=0 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 43 pg[4.5( empty local-lis/les=39/40 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.634852409s) [3,5,1] r=1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1178.577514648s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 43 pg[4.d( empty local-lis/les=39/40 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.635020256s) [3,2,1] r=-1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1178.577636719s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 43 pg[4.11( empty local-lis/les=39/40 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.637431145s) [4,0,2] r=-1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active pruub 1178.580322266s@ mbc={}] start_peering_interval up [4,0,5] -> [4,0,2], acting [4,0,5] -> [4,0,2], acting_primary 4 -> 4, up_primary 4 -> 4, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 43 pg[4.11( empty local-lis/les=39/40 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.637386322s) [4,0,2] r=-1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1178.580322266s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 43 pg[4.2( empty local-lis/les=39/40 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.630867004s) [2,3,4] r=-1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active pruub 1178.573974609s@ mbc={}] start_peering_interval up [4,0,5] -> [2,3,4], acting [4,0,5] -> [2,3,4], acting_primary 4 -> 2, up_primary 4 -> 2, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 43 pg[4.1( empty local-lis/les=39/40 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.630843163s) [2,0,1] r=-1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active pruub 1178.573974609s@ mbc={}] start_peering_interval up [4,0,5] -> [2,0,1], acting [4,0,5] -> [2,0,1], acting_primary 4 -> 2, up_primary 4 -> 2, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 43 pg[4.2( empty local-lis/les=39/40 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.630802155s) [2,3,4] r=-1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1178.573974609s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 43 pg[4.1( empty local-lis/les=39/40 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.630814552s) [2,0,1] r=-1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1178.573974609s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 43 pg[6.19( empty local-lis/les=41/42 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43 pruub=12.830554008s) [0,1,2] r=2 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 active pruub 1185.644409180s@ mbc={}] start_peering_interval up [4,0,2] -> [0,1,2], acting [4,0,2] -> [0,1,2], acting_primary 4 -> 0, up_primary 4 -> 0, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 43 pg[4.1d( empty local-lis/les=0/0 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43) [2,3,4] r=0 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 43 pg[4.e( empty local-lis/les=39/40 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.630595207s) [3,5,1] r=1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active pruub 1178.573852539s@ mbc={}] start_peering_interval up [4,0,5] -> [3,5,1], acting [4,0,5] -> [3,5,1], acting_primary 4 -> 3, up_primary 4 -> 3, role 2 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 43 pg[4.e( empty local-lis/les=39/40 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.630550385s) [3,5,1] r=1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1178.573852539s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 43 pg[4.1c( empty local-lis/les=39/40 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.634633064s) [2,1,3] r=-1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active pruub 1178.578002930s@ mbc={}] start_peering_interval up [4,0,5] -> [2,1,3], acting [4,0,5] -> [2,1,3], acting_primary 4 -> 2, up_primary 4 -> 2, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 43 pg[4.1d( empty local-lis/les=39/40 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.629957199s) [2,3,4] r=-1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active pruub 1178.573486328s@ mbc={}] start_peering_interval up [4,0,5] -> [2,3,4], acting [4,0,5] -> [2,3,4], acting_primary 4 -> 2, up_primary 4 -> 2, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 43 pg[4.1c( empty local-lis/les=39/40 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.634598732s) [2,1,3] r=-1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1178.578002930s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 43 pg[6.19( empty local-lis/les=41/42 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43 pruub=12.830190659s) [0,1,2] r=2 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1185.644409180s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 43 pg[4.1d( empty local-lis/les=39/40 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.629925728s) [2,3,4] r=-1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1178.573486328s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 43 pg[4.18( empty local-lis/les=39/40 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.636716843s) [3,1,5] r=2 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active pruub 1178.580444336s@ mbc={}] start_peering_interval up [4,0,5] -> [3,1,5], acting [4,0,5] -> [3,1,5], acting_primary 4 -> 3, up_primary 4 -> 3, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 43 pg[4.1a( empty local-lis/les=39/40 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.629552841s) [3,4,5] r=2 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active pruub 1178.573242188s@ mbc={}] start_peering_interval up [4,0,5] -> [3,4,5], acting [4,0,5] -> [3,4,5], acting_primary 4 -> 3, up_primary 4 -> 3, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 43 pg[4.18( empty local-lis/les=39/40 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.636683464s) [3,1,5] r=2 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1178.580444336s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 43 pg[4.1a( empty local-lis/les=39/40 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.629494667s) [3,4,5] r=2 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1178.573242188s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 43 pg[4.1c( empty local-lis/les=0/0 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43) [2,1,3] r=0 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 43 pg[6.18( empty local-lis/les=41/42 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43 pruub=12.829218864s) [4,3,2] r=2 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 active pruub 1185.644287109s@ mbc={}] start_peering_interval up [4,0,2] -> [4,3,2], acting [4,0,2] -> [4,3,2], acting_primary 4 -> 4, up_primary 4 -> 4, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 43 pg[6.1e( empty local-lis/les=41/42 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43 pruub=12.829623222s) [5,0,4] r=-1 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 active pruub 1185.644775391s@ mbc={}] start_peering_interval up [4,0,2] -> [5,0,4], acting [4,0,2] -> [5,0,4], acting_primary 4 -> 5, up_primary 4 -> 5, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 43 pg[6.18( empty local-lis/les=41/42 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43 pruub=12.829169273s) [4,3,2] r=2 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1185.644287109s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 43 pg[6.1e( empty local-lis/les=41/42 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43 pruub=12.829527855s) [5,0,4] r=-1 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1185.644775391s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 43 pg[5.1d( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.636437416s) [1,0,5] r=-1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active pruub 1183.452026367s@ mbc={}] start_peering_interval up [2,4,3] -> [1,0,5], acting [2,4,3] -> [1,0,5], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 43 pg[5.1c( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.640314102s) [0,2,4] r=1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active pruub 1183.455932617s@ mbc={}] start_peering_interval up [2,4,3] -> [0,2,4], acting [2,4,3] -> [0,2,4], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 43 pg[6.1f( empty local-lis/les=41/42 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43 pruub=12.828975677s) [4,5,0] r=-1 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 active pruub 1185.644531250s@ mbc={}] start_peering_interval up [4,0,2] -> [4,5,0], acting [4,0,2] -> [4,5,0], acting_primary 4 -> 4, up_primary 4 -> 4, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 43 pg[5.1c( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.640275002s) [0,2,4] r=1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1183.455932617s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 43 pg[5.1d( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.636365891s) [1,0,5] r=-1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1183.452026367s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 43 pg[5.1a( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.636495590s) [2,0,4] r=0 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active pruub 1183.452148438s@ mbc={}] start_peering_interval up [2,4,3] -> [2,0,4], acting [2,4,3] -> [2,0,4], acting_primary 2 -> 2, up_primary 2 -> 2, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 43 pg[6.1f( empty local-lis/les=41/42 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43 pruub=12.828919411s) [4,5,0] r=-1 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1185.644531250s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 43 pg[5.1a( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.636495590s) [2,0,4] r=0 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown pruub 1183.452148438s@ mbc={}] state<Start>: transitioning to Primary
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 43 pg[6.14( empty local-lis/les=41/42 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43 pruub=12.828634262s) [4,3,5] r=-1 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 active pruub 1185.644409180s@ mbc={}] start_peering_interval up [4,0,2] -> [4,3,5], acting [4,0,2] -> [4,3,5], acting_primary 4 -> 4, up_primary 4 -> 4, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 43 pg[5.17( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.637408257s) [4,5,0] r=-1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active pruub 1183.453247070s@ mbc={}] start_peering_interval up [2,4,3] -> [4,5,0], acting [2,4,3] -> [4,5,0], acting_primary 2 -> 4, up_primary 2 -> 4, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 43 pg[5.17( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.637366295s) [4,5,0] r=-1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1183.453247070s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 43 pg[6.10( empty local-lis/les=41/42 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43 pruub=12.828404427s) [4,2,3] r=1 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 active pruub 1185.644287109s@ mbc={}] start_peering_interval up [4,0,2] -> [4,2,3], acting [4,0,2] -> [4,2,3], acting_primary 4 -> 4, up_primary 4 -> 4, role 2 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 43 pg[6.10( empty local-lis/les=41/42 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43 pruub=12.828297615s) [4,2,3] r=1 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1185.644287109s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 43 pg[5.13( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.637184143s) [5,1,3] r=-1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active pruub 1183.453247070s@ mbc={}] start_peering_interval up [2,4,3] -> [5,1,3], acting [2,4,3] -> [5,1,3], acting_primary 2 -> 5, up_primary 2 -> 5, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 43 pg[5.13( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.637134552s) [5,1,3] r=-1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1183.453247070s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 43 pg[5.2( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.636083603s) [0,1,2] r=2 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active pruub 1183.452392578s@ mbc={}] start_peering_interval up [2,4,3] -> [0,1,2], acting [2,4,3] -> [0,1,2], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 43 pg[5.14( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.636673927s) [4,2,0] r=1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active pruub 1183.453002930s@ mbc={}] start_peering_interval up [2,4,3] -> [4,2,0], acting [2,4,3] -> [4,2,0], acting_primary 2 -> 4, up_primary 2 -> 4, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 43 pg[5.2( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.636037827s) [0,1,2] r=2 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1183.452392578s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 43 pg[6.17( empty local-lis/les=41/42 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43 pruub=12.827363014s) [5,4,0] r=-1 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 active pruub 1185.643676758s@ mbc={}] start_peering_interval up [4,0,2] -> [5,4,0], acting [4,0,2] -> [5,4,0], acting_primary 4 -> 5, up_primary 4 -> 5, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 43 pg[5.14( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.636640549s) [4,2,0] r=1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1183.453002930s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 43 pg[6.17( empty local-lis/les=41/42 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43 pruub=12.827264786s) [5,4,0] r=-1 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1185.643676758s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 43 pg[5.1b( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.635220528s) [0,4,2] r=2 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active pruub 1183.451782227s@ mbc={}] start_peering_interval up [2,4,3] -> [0,4,2], acting [2,4,3] -> [0,4,2], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 43 pg[5.1b( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.635187149s) [0,4,2] r=2 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1183.451782227s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 43 pg[6.8( empty local-lis/les=41/42 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43 pruub=12.827570915s) [3,2,4] r=1 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 active pruub 1185.644165039s@ mbc={}] start_peering_interval up [4,0,2] -> [3,2,4], acting [4,0,2] -> [3,2,4], acting_primary 4 -> 3, up_primary 4 -> 3, role 2 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 43 pg[6.8( empty local-lis/les=41/42 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43 pruub=12.827534676s) [3,2,4] r=1 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1185.644165039s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 43 pg[6.14( empty local-lis/les=41/42 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43 pruub=12.828581810s) [4,3,5] r=-1 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1185.644409180s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 43 pg[5.b( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.635999680s) [5,1,3] r=-1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active pruub 1183.452880859s@ mbc={}] start_peering_interval up [2,4,3] -> [5,1,3], acting [2,4,3] -> [5,1,3], acting_primary 2 -> 5, up_primary 2 -> 5, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 43 pg[5.d( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.639410973s) [2,0,4] r=0 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active pruub 1183.456298828s@ mbc={}] start_peering_interval up [2,4,3] -> [2,0,4], acting [2,4,3] -> [2,0,4], acting_primary 2 -> 2, up_primary 2 -> 2, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 43 pg[5.b( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.635934830s) [5,1,3] r=-1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1183.452880859s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 43 pg[5.d( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.639410973s) [2,0,4] r=0 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown pruub 1183.456298828s@ mbc={}] state<Start>: transitioning to Primary
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 43 pg[5.f( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.635552406s) [0,2,4] r=1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active pruub 1183.452514648s@ mbc={}] start_peering_interval up [2,4,3] -> [0,2,4], acting [2,4,3] -> [0,2,4], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 43 pg[5.f( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.635453224s) [0,2,4] r=1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1183.452514648s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 43 pg[6.15( empty local-lis/les=41/42 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43 pruub=12.826209068s) [3,5,1] r=-1 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 active pruub 1185.643554688s@ mbc={}] start_peering_interval up [4,0,2] -> [3,5,1], acting [4,0,2] -> [3,5,1], acting_primary 4 -> 3, up_primary 4 -> 3, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 43 pg[6.15( empty local-lis/les=41/42 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43 pruub=12.826155663s) [3,5,1] r=-1 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1185.643554688s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 43 pg[5.16( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.635988235s) [3,1,2] r=2 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active pruub 1183.453247070s@ mbc={}] start_peering_interval up [2,4,3] -> [3,1,2], acting [2,4,3] -> [3,1,2], acting_primary 2 -> 3, up_primary 2 -> 3, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 43 pg[5.16( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.635951996s) [3,1,2] r=2 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1183.453247070s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 43 pg[6.12( empty local-lis/les=41/42 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43 pruub=12.826085091s) [5,0,1] r=-1 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 active pruub 1185.643554688s@ mbc={}] start_peering_interval up [4,0,2] -> [5,0,1], acting [4,0,2] -> [5,0,1], acting_primary 4 -> 5, up_primary 4 -> 5, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 43 pg[5.11( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.638475418s) [3,2,4] r=1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active pruub 1183.456054688s@ mbc={}] start_peering_interval up [2,4,3] -> [3,2,4], acting [2,4,3] -> [3,2,4], acting_primary 2 -> 3, up_primary 2 -> 3, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 43 pg[6.12( empty local-lis/les=41/42 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43 pruub=12.826013565s) [5,0,1] r=-1 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1185.643554688s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 43 pg[5.11( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.638434410s) [3,2,4] r=1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1183.456054688s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 43 pg[5.9( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.635375977s) [0,5,1] r=-1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active pruub 1183.453002930s@ mbc={}] start_peering_interval up [2,4,3] -> [0,5,1], acting [2,4,3] -> [0,5,1], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 43 pg[5.12( empty local-lis/les=0/0 n=0 ec=39/26 lis/c=39/39 les/c/f=40/40/0 sis=43) [5,3,4] r=0 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 43 pg[5.9( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.635248184s) [0,5,1] r=-1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1183.453002930s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 43 pg[5.1e( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.636285782s) [1,3,2] r=2 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active pruub 1183.454345703s@ mbc={}] start_peering_interval up [2,4,3] -> [1,3,2], acting [2,4,3] -> [1,3,2], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 43 pg[5.1e( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.636236191s) [1,3,2] r=2 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1183.454345703s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 43 pg[5.13( empty local-lis/les=0/0 n=0 ec=39/26 lis/c=39/39 les/c/f=40/40/0 sis=43) [5,1,3] r=0 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 43 pg[6.1d( empty local-lis/les=41/42 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43 pruub=12.825163841s) [4,5,3] r=-1 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 active pruub 1185.643188477s@ mbc={}] start_peering_interval up [4,0,2] -> [4,5,3], acting [4,0,2] -> [4,5,3], acting_primary 4 -> 4, up_primary 4 -> 4, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 43 pg[5.7( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.634002686s) [3,1,5] r=-1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active pruub 1183.452392578s@ mbc={}] start_peering_interval up [2,4,3] -> [3,1,5], acting [2,4,3] -> [3,1,5], acting_primary 2 -> 3, up_primary 2 -> 3, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 43 pg[6.4( empty local-lis/les=41/42 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43 pruub=12.826061249s) [4,0,5] r=-1 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 active pruub 1185.644409180s@ mbc={}] start_peering_interval up [4,0,2] -> [4,0,5], acting [4,0,2] -> [4,0,5], acting_primary 4 -> 4, up_primary 4 -> 4, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 43 pg[5.7( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.633947372s) [3,1,5] r=-1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1183.452392578s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 43 pg[6.4( empty local-lis/les=41/42 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43 pruub=12.826021194s) [4,0,5] r=-1 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1185.644409180s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 43 pg[6.1b( empty local-lis/les=41/42 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43 pruub=12.824807167s) [5,3,1] r=-1 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 active pruub 1185.643188477s@ mbc={}] start_peering_interval up [4,0,2] -> [5,3,1], acting [4,0,2] -> [5,3,1], acting_primary 4 -> 5, up_primary 4 -> 5, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 43 pg[5.b( empty local-lis/les=0/0 n=0 ec=39/26 lis/c=39/39 les/c/f=40/40/0 sis=43) [5,1,3] r=0 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 43 pg[6.1b( empty local-lis/les=41/42 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43 pruub=12.824769974s) [5,3,1] r=-1 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1185.643188477s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 43 pg[5.18( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.627758026s) [0,2,4] r=1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active pruub 1183.446411133s@ mbc={}] start_peering_interval up [2,4,3] -> [0,2,4], acting [2,4,3] -> [0,2,4], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 43 pg[5.18( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.627714157s) [0,2,4] r=1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1183.446411133s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 43 pg[6.6( empty local-lis/les=41/42 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43 pruub=12.824810028s) [1,3,5] r=-1 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 active pruub 1185.643554688s@ mbc={}] start_peering_interval up [4,0,2] -> [1,3,5], acting [4,0,2] -> [1,3,5], acting_primary 4 -> 1, up_primary 4 -> 1, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 43 pg[6.1d( empty local-lis/les=41/42 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43 pruub=12.824954033s) [4,5,3] r=-1 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1185.643188477s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 43 pg[5.5( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.633456230s) [4,3,5] r=-1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active pruub 1183.452270508s@ mbc={}] start_peering_interval up [2,4,3] -> [4,3,5], acting [2,4,3] -> [4,3,5], acting_primary 2 -> 4, up_primary 2 -> 4, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 43 pg[6.6( empty local-lis/les=41/42 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43 pruub=12.824737549s) [1,3,5] r=-1 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1185.643554688s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 43 pg[5.5( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.633417130s) [4,3,5] r=-1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1183.452270508s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 43 pg[5.4( empty local-lis/les=0/0 n=0 ec=39/26 lis/c=39/39 les/c/f=40/40/0 sis=43) [5,1,3] r=0 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 43 pg[6.16( empty local-lis/les=41/42 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43 pruub=12.823819160s) [4,0,5] r=-1 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 active pruub 1185.642822266s@ mbc={}] start_peering_interval up [4,0,2] -> [4,0,5], acting [4,0,2] -> [4,0,5], acting_primary 4 -> 4, up_primary 4 -> 4, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 43 pg[6.16( empty local-lis/les=41/42 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43 pruub=12.823763847s) [4,0,5] r=-1 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1185.642822266s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 43 pg[5.15( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.635361671s) [3,4,5] r=-1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active pruub 1183.454467773s@ mbc={}] start_peering_interval up [2,4,3] -> [3,4,5], acting [2,4,3] -> [3,4,5], acting_primary 2 -> 3, up_primary 2 -> 3, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 43 pg[5.15( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.635313988s) [3,4,5] r=-1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1183.454467773s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 43 pg[5.12( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.634619713s) [5,3,4] r=-1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active pruub 1183.454101562s@ mbc={}] start_peering_interval up [2,4,3] -> [5,3,4], acting [2,4,3] -> [5,3,4], acting_primary 2 -> 5, up_primary 2 -> 5, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 43 pg[5.12( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.634549141s) [5,3,4] r=-1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1183.454101562s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 43 pg[5.1( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.632687569s) [3,4,5] r=-1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active pruub 1183.452392578s@ mbc={}] start_peering_interval up [2,4,3] -> [3,4,5], acting [2,4,3] -> [3,4,5], acting_primary 2 -> 3, up_primary 2 -> 3, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 43 pg[5.3( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.634292603s) [4,3,2] r=2 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active pruub 1183.453979492s@ mbc={}] start_peering_interval up [2,4,3] -> [4,3,2], acting [2,4,3] -> [4,3,2], acting_primary 2 -> 4, up_primary 2 -> 4, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 43 pg[5.1( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.632593155s) [3,4,5] r=-1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1183.452392578s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 43 pg[5.3( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.634172440s) [4,3,2] r=2 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1183.453979492s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 43 pg[5.1f( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.634132385s) [3,5,1] r=-1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active pruub 1183.454101562s@ mbc={}] start_peering_interval up [2,4,3] -> [3,5,1], acting [2,4,3] -> [3,5,1], acting_primary 2 -> 3, up_primary 2 -> 3, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 43 pg[6.1c( empty local-lis/les=41/42 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43 pruub=12.822861671s) [5,1,3] r=-1 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 active pruub 1185.642700195s@ mbc={}] start_peering_interval up [4,0,2] -> [5,1,3], acting [4,0,2] -> [5,1,3], acting_primary 4 -> 5, up_primary 4 -> 5, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 43 pg[5.1f( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.634088516s) [3,5,1] r=-1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1183.454101562s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 43 pg[6.1c( empty local-lis/les=41/42 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43 pruub=12.822648048s) [5,1,3] r=-1 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1185.642700195s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 43 pg[5.6( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.631948471s) [4,0,2] r=2 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active pruub 1183.452514648s@ mbc={}] start_peering_interval up [2,4,3] -> [4,0,2], acting [2,4,3] -> [4,0,2], acting_primary 2 -> 4, up_primary 2 -> 4, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 43 pg[5.4( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.632093430s) [5,1,3] r=-1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active pruub 1183.452636719s@ mbc={}] start_peering_interval up [2,4,3] -> [5,1,3], acting [2,4,3] -> [5,1,3], acting_primary 2 -> 5, up_primary 2 -> 5, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 43 pg[5.6( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.631814957s) [4,0,2] r=2 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1183.452514648s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 43 pg[6.11( empty local-lis/les=41/42 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43 pruub=12.822202682s) [4,5,3] r=-1 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 active pruub 1185.642944336s@ mbc={}] start_peering_interval up [4,0,2] -> [4,5,3], acting [4,0,2] -> [4,5,3], acting_primary 4 -> 4, up_primary 4 -> 4, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 43 pg[5.4( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.631904602s) [5,1,3] r=-1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1183.452636719s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 43 pg[5.a( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.635508537s) [1,2,3] r=1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active pruub 1183.456176758s@ mbc={}] start_peering_interval up [2,4,3] -> [1,2,3], acting [2,4,3] -> [1,2,3], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 43 pg[5.a( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.635181427s) [1,2,3] r=1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1183.456176758s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 43 pg[6.11( empty local-lis/les=41/42 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43 pruub=12.822152138s) [4,5,3] r=-1 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1185.642944336s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 43 pg[5.c( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.633783340s) [1,0,2] r=2 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active pruub 1183.455078125s@ mbc={}] start_peering_interval up [2,4,3] -> [1,0,2], acting [2,4,3] -> [1,0,2], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 43 pg[5.c( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.633742332s) [1,0,2] r=2 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1183.455078125s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 43 pg[6.f( empty local-lis/les=41/42 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43 pruub=12.821985245s) [4,5,0] r=-1 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 active pruub 1185.643310547s@ mbc={}] start_peering_interval up [4,0,2] -> [4,5,0], acting [4,0,2] -> [4,5,0], acting_primary 4 -> 4, up_primary 4 -> 4, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 43 pg[6.f( empty local-lis/les=41/42 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43 pruub=12.821942329s) [4,5,0] r=-1 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1185.643310547s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 43 pg[6.9( empty local-lis/les=41/42 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43 pruub=12.822620392s) [4,2,0] r=1 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 active pruub 1185.644165039s@ mbc={}] start_peering_interval up [4,0,2] -> [4,2,0], acting [4,0,2] -> [4,2,0], acting_primary 4 -> 4, up_primary 4 -> 4, role 2 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 43 pg[6.9( empty local-lis/les=41/42 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43 pruub=12.822558403s) [4,2,0] r=1 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1185.644165039s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 43 pg[5.10( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.632662773s) [0,5,1] r=-1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active pruub 1183.454467773s@ mbc={}] start_peering_interval up [2,4,3] -> [0,5,1], acting [2,4,3] -> [0,5,1], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 43 pg[6.1a( empty local-lis/les=41/42 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43 pruub=12.820868492s) [3,2,1] r=1 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 active pruub 1185.642578125s@ mbc={}] start_peering_interval up [4,0,2] -> [3,2,1], acting [4,0,2] -> [3,2,1], acting_primary 4 -> 3, up_primary 4 -> 3, role 2 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 43 pg[6.13( empty local-lis/les=41/42 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43 pruub=12.821425438s) [1,2,3] r=1 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 active pruub 1185.642944336s@ mbc={}] start_peering_interval up [4,0,2] -> [1,2,3], acting [4,0,2] -> [1,2,3], acting_primary 4 -> 1, up_primary 4 -> 1, role 2 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 43 pg[5.19( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.630964279s) [1,3,5] r=-1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active pruub 1183.452758789s@ mbc={}] start_peering_interval up [2,4,3] -> [1,3,5], acting [2,4,3] -> [1,3,5], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 43 pg[6.1a( empty local-lis/les=41/42 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43 pruub=12.820826530s) [3,2,1] r=1 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1185.642578125s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 43 pg[6.13( empty local-lis/les=41/42 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43 pruub=12.821215630s) [1,2,3] r=1 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1185.642944336s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 43 pg[5.10( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.632577896s) [0,5,1] r=-1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1183.454467773s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:34 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 43 pg[5.19( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.630919456s) [1,3,5] r=-1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1183.452758789s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:35 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 43 pg[5.1d( empty local-lis/les=0/0 n=0 ec=39/26 lis/c=39/39 les/c/f=40/40/0 sis=43) [1,0,5] r=2 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:35 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 43 pg[6.c( empty local-lis/les=0/0 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43) [1,0,5] r=2 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:35 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 43 pg[6.3( empty local-lis/les=0/0 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43) [0,5,1] r=1 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:35 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 43 pg[6.6( empty local-lis/les=0/0 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43) [1,3,5] r=2 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:35 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 43 pg[7.1( empty local-lis/les=0/0 n=0 ec=41/34 lis/c=41/41 les/c/f=42/42/0 sis=43) [0,2,4] r=1 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:35 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 43 pg[5.19( empty local-lis/les=0/0 n=0 ec=39/26 lis/c=39/39 les/c/f=40/40/0 sis=43) [1,3,5] r=2 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:35 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 43 pg[7.3( empty local-lis/les=0/0 n=0 ec=41/34 lis/c=41/41 les/c/f=42/42/0 sis=43) [0,2,4] r=1 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:35 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 43 pg[5.9( empty local-lis/les=0/0 n=0 ec=39/26 lis/c=39/39 les/c/f=40/40/0 sis=43) [0,5,1] r=1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:35 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 43 pg[5.10( empty local-lis/les=0/0 n=0 ec=39/26 lis/c=39/39 les/c/f=40/40/0 sis=43) [0,5,1] r=1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:35 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 43 pg[6.1d( empty local-lis/les=0/0 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43) [4,5,3] r=1 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:35 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 43 pg[6.14( empty local-lis/les=0/0 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43) [4,3,5] r=2 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:35 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 43 pg[6.1f( empty local-lis/les=0/0 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43) [4,5,0] r=1 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:35 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 43 pg[6.f( empty local-lis/les=0/0 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43) [4,5,0] r=1 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:35 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 43 pg[4.10( empty local-lis/les=0/0 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43) [1,2,3] r=1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:35 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 43 pg[6.11( empty local-lis/les=0/0 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43) [4,5,3] r=1 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:35 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 43 pg[6.4( empty local-lis/les=0/0 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43) [4,0,5] r=2 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:35 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 43 pg[6.15( empty local-lis/les=0/0 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43) [3,5,1] r=1 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:35 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 43 pg[5.1f( empty local-lis/les=0/0 n=0 ec=39/26 lis/c=39/39 les/c/f=40/40/0 sis=43) [3,5,1] r=1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:35 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 43 pg[7.d( empty local-lis/les=0/0 n=0 ec=41/34 lis/c=41/41 les/c/f=42/42/0 sis=43) [0,2,4] r=1 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:35 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 43 pg[5.15( empty local-lis/les=0/0 n=0 ec=39/26 lis/c=39/39 les/c/f=40/40/0 sis=43) [3,4,5] r=2 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:35 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 43 pg[6.16( empty local-lis/les=0/0 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43) [4,0,5] r=2 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:35 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 43 pg[4.f( empty local-lis/les=0/0 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43) [1,2,3] r=1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:35 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 43 pg[5.7( empty local-lis/les=0/0 n=0 ec=39/26 lis/c=39/39 les/c/f=40/40/0 sis=43) [3,1,5] r=2 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:35 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 43 pg[4.c( empty local-lis/les=0/0 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43) [0,4,2] r=2 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:35 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 43 pg[5.17( empty local-lis/les=0/0 n=0 ec=39/26 lis/c=39/39 les/c/f=40/40/0 sis=43) [4,5,0] r=1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:35 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 43 pg[7.f( empty local-lis/les=0/0 n=0 ec=41/34 lis/c=41/41 les/c/f=42/42/0 sis=43) [0,2,4] r=1 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:35 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 43 pg[5.1( empty local-lis/les=0/0 n=0 ec=39/26 lis/c=39/39 les/c/f=40/40/0 sis=43) [3,4,5] r=2 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:35 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 43 pg[7.9( empty local-lis/les=0/0 n=0 ec=41/34 lis/c=41/41 les/c/f=42/42/0 sis=43) [0,2,4] r=1 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:35 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 43 pg[5.5( empty local-lis/les=0/0 n=0 ec=39/26 lis/c=39/39 les/c/f=40/40/0 sis=43) [4,3,5] r=2 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:35 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 43 pg[4.a( empty local-lis/les=0/0 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43) [0,1,2] r=2 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:35 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 43 pg[7.b( empty local-lis/les=0/0 n=0 ec=41/34 lis/c=41/41 les/c/f=42/42/0 sis=43) [0,2,4] r=1 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:35 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 44 pg[6.1e( empty local-lis/les=43/44 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43) [5,0,4] r=0 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 23 07:58:35 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 43 pg[7.7( empty local-lis/les=0/0 n=0 ec=41/34 lis/c=41/41 les/c/f=42/42/0 sis=43) [0,2,4] r=1 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:35 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 43 pg[7.5( empty local-lis/les=0/0 n=0 ec=41/34 lis/c=41/41 les/c/f=42/42/0 sis=43) [0,2,4] r=1 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:35 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 43 pg[4.13( empty local-lis/les=0/0 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43) [0,2,1] r=1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:35 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 43 pg[4.11( empty local-lis/les=0/0 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43) [4,0,2] r=2 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:35 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 43 pg[4.d( empty local-lis/les=0/0 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43) [3,2,1] r=1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:35 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 44 pg[4.15( empty local-lis/les=43/44 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43) [5,4,0] r=0 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 23 07:58:35 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 44 pg[4.8( empty local-lis/les=43/44 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43) [5,3,4] r=0 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 23 07:58:35 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 44 pg[6.17( empty local-lis/les=43/44 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43) [5,4,0] r=0 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 23 07:58:35 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 44 pg[5.12( empty local-lis/les=43/44 n=0 ec=39/26 lis/c=39/39 les/c/f=40/40/0 sis=43) [5,3,4] r=0 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 23 07:58:35 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 44 pg[4.2( empty local-lis/les=43/44 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43) [2,3,4] r=0 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 23 07:58:35 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 44 pg[6.12( empty local-lis/les=43/44 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43) [5,0,1] r=0 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 23 07:58:35 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 44 pg[6.1( empty local-lis/les=43/44 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43) [2,0,4] r=0 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 23 07:58:35 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 44 pg[4.1d( empty local-lis/les=43/44 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43) [2,3,4] r=0 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 23 07:58:35 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 44 pg[4.9( empty local-lis/les=43/44 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43) [5,1,0] r=0 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 23 07:58:35 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 44 pg[6.1b( empty local-lis/les=43/44 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43) [5,3,1] r=0 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 23 07:58:35 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 44 pg[6.1c( empty local-lis/les=43/44 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43) [5,1,3] r=0 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 23 07:58:35 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 44 pg[4.1f( empty local-lis/les=43/44 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43) [2,3,1] r=0 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 23 07:58:35 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 44 pg[5.1a( empty local-lis/les=43/44 n=0 ec=39/26 lis/c=39/39 les/c/f=40/40/0 sis=43) [2,0,4] r=0 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 23 07:58:35 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 44 pg[5.13( empty local-lis/les=43/44 n=0 ec=39/26 lis/c=39/39 les/c/f=40/40/0 sis=43) [5,1,3] r=0 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 23 07:58:35 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 44 pg[4.3( empty local-lis/les=43/44 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43) [2,3,1] r=0 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 23 07:58:35 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 44 pg[4.1( empty local-lis/les=43/44 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43) [2,0,1] r=0 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 23 07:58:35 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 44 pg[5.b( empty local-lis/les=43/44 n=0 ec=39/26 lis/c=39/39 les/c/f=40/40/0 sis=43) [5,1,3] r=0 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 23 07:58:35 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 44 pg[4.14( empty local-lis/les=43/44 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43) [5,1,0] r=0 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 23 07:58:35 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 44 pg[5.4( empty local-lis/les=43/44 n=0 ec=39/26 lis/c=39/39 les/c/f=40/40/0 sis=43) [5,1,3] r=0 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 23 07:58:35 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 44 pg[4.1c( empty local-lis/les=43/44 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43) [2,1,3] r=0 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 23 07:58:35 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 44 pg[5.d( empty local-lis/les=43/44 n=0 ec=39/26 lis/c=39/39 les/c/f=40/40/0 sis=43) [2,0,4] r=0 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 23 07:58:35 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 44 pg[4.6( empty local-lis/les=43/44 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43) [5,1,0] r=0 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 23 07:58:35 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 44 pg[4.19( empty local-lis/les=43/44 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43) [2,1,0] r=0 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 23 07:58:35 np0005626463.localdomain sudo[56177]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hjfvpedgtvujusxxaqlgqrxjrnetbyqi ; /usr/bin/python3
Feb 23 07:58:35 np0005626463.localdomain sudo[56177]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:58:35 np0005626463.localdomain python3[56179]: ansible-ansible.legacy.stat Invoked with path=/var/lib/tripleo-config/ceph/ceph.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 07:58:35 np0005626463.localdomain sudo[56177]: pam_unix(sudo:session): session closed for user root
Feb 23 07:58:35 np0005626463.localdomain sudo[56220]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xnofmyrvdbclvvvxzosazwpxblfataub ; /usr/bin/python3
Feb 23 07:58:35 np0005626463.localdomain sudo[56220]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:58:36 np0005626463.localdomain python3[56222]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771833515.3279996-92678-225510035549802/source dest=/var/lib/tripleo-config/ceph/ceph.conf mode=644 _original_basename=ceph.conf follow=False checksum=b30b176c5dadfc33fbdfb5fdc77f69e2337fe39c backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 07:58:36 np0005626463.localdomain sudo[56220]: pam_unix(sudo:session): session closed for user root
Feb 23 07:58:36 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 45 pg[7.a( v 36'39 (0'0,36'39] local-lis/les=41/42 n=1 ec=41/34 lis/c=41/41 les/c/f=42/42/0 sis=45 pruub=10.783137321s) [4,5,0] r=1 lpr=45 pi=[41,45)/1 luod=0'0 crt=36'39 lcod 0'0 mlcod 0'0 active pruub 1180.764770508s@ mbc={}] start_peering_interval up [0,5,4] -> [4,5,0], acting [0,5,4] -> [4,5,0], acting_primary 0 -> 4, up_primary 0 -> 4, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:58:36 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 45 pg[7.a( v 36'39 (0'0,36'39] local-lis/les=41/42 n=1 ec=41/34 lis/c=41/41 les/c/f=42/42/0 sis=45 pruub=10.783046722s) [4,5,0] r=1 lpr=45 pi=[41,45)/1 crt=36'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1180.764770508s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:36 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 45 pg[7.e( v 36'39 (0'0,36'39] local-lis/les=41/42 n=1 ec=41/34 lis/c=41/41 les/c/f=42/42/0 sis=45 pruub=10.780872345s) [4,5,0] r=1 lpr=45 pi=[41,45)/1 luod=0'0 crt=36'39 lcod 0'0 mlcod 0'0 active pruub 1180.764038086s@ mbc={}] start_peering_interval up [0,5,4] -> [4,5,0], acting [0,5,4] -> [4,5,0], acting_primary 0 -> 4, up_primary 0 -> 4, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:58:36 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 45 pg[7.e( v 36'39 (0'0,36'39] local-lis/les=41/42 n=1 ec=41/34 lis/c=41/41 les/c/f=42/42/0 sis=45 pruub=10.780620575s) [4,5,0] r=1 lpr=45 pi=[41,45)/1 crt=36'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1180.764038086s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:36 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 45 pg[7.6( v 36'39 (0'0,36'39] local-lis/les=41/42 n=2 ec=41/34 lis/c=41/41 les/c/f=42/42/0 sis=45 pruub=10.782555580s) [4,5,0] r=1 lpr=45 pi=[41,45)/1 luod=0'0 crt=36'39 lcod 0'0 mlcod 0'0 active pruub 1180.765991211s@ mbc={}] start_peering_interval up [0,5,4] -> [4,5,0], acting [0,5,4] -> [4,5,0], acting_primary 0 -> 4, up_primary 0 -> 4, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:58:36 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 45 pg[7.6( v 36'39 (0'0,36'39] local-lis/les=41/42 n=2 ec=41/34 lis/c=41/41 les/c/f=42/42/0 sis=45 pruub=10.782437325s) [4,5,0] r=1 lpr=45 pi=[41,45)/1 crt=36'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1180.765991211s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:36 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 45 pg[7.2( v 36'39 (0'0,36'39] local-lis/les=41/42 n=2 ec=41/34 lis/c=41/41 les/c/f=42/42/0 sis=45 pruub=10.780421257s) [4,5,0] r=1 lpr=45 pi=[41,45)/1 luod=0'0 crt=36'39 lcod 0'0 mlcod 0'0 active pruub 1180.764526367s@ mbc={}] start_peering_interval up [0,5,4] -> [4,5,0], acting [0,5,4] -> [4,5,0], acting_primary 0 -> 4, up_primary 0 -> 4, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:58:36 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 45 pg[7.2( v 36'39 (0'0,36'39] local-lis/les=41/42 n=2 ec=41/34 lis/c=41/41 les/c/f=42/42/0 sis=45 pruub=10.780347824s) [4,5,0] r=1 lpr=45 pi=[41,45)/1 crt=36'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1180.764526367s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:37 np0005626463.localdomain ceph-osd[32575]: log_channel(cluster) log [DBG] : 6.12 deep-scrub starts
Feb 23 07:58:38 np0005626463.localdomain ceph-osd[32575]: log_channel(cluster) log [DBG] : 4.14 deep-scrub starts
Feb 23 07:58:39 np0005626463.localdomain ceph-osd[32575]: log_channel(cluster) log [DBG] : 6.1e scrub starts
Feb 23 07:58:39 np0005626463.localdomain ceph-osd[31633]: log_channel(cluster) log [DBG] : 5.8 deep-scrub starts
Feb 23 07:58:40 np0005626463.localdomain ceph-osd[31633]: log_channel(cluster) log [DBG] : 5.e scrub starts
Feb 23 07:58:42 np0005626463.localdomain sudo[56282]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zqvzykfoyunblwuhuunlgqpcidufjayo ; /usr/bin/python3
Feb 23 07:58:42 np0005626463.localdomain sudo[56282]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:58:42 np0005626463.localdomain python3[56284]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/config_step.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 07:58:42 np0005626463.localdomain sudo[56282]: pam_unix(sudo:session): session closed for user root
Feb 23 07:58:42 np0005626463.localdomain sudo[56327]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jzmyhbqupmgbxnaprdwesqqnpjuiwfge ; /usr/bin/python3
Feb 23 07:58:42 np0005626463.localdomain sudo[56327]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:58:42 np0005626463.localdomain python3[56329]: ansible-ansible.legacy.copy Invoked with dest=/etc/puppet/hieradata/config_step.json force=True mode=0600 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771833521.8651876-92992-123673506454102/source _original_basename=tmpxhp5uara follow=False checksum=f17091ee142621a3c8290c8c96b5b52d67b3a864 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 07:58:42 np0005626463.localdomain sudo[56327]: pam_unix(sudo:session): session closed for user root
Feb 23 07:58:42 np0005626463.localdomain ceph-osd[32575]: log_channel(cluster) log [DBG] : 6.1e scrub ok
Feb 23 07:58:43 np0005626463.localdomain ceph-osd[32575]: log_channel(cluster) log [DBG] : 4.6 scrub starts
Feb 23 07:58:43 np0005626463.localdomain ceph-osd[32575]: log_channel(cluster) log [DBG] : 4.6 scrub ok
Feb 23 07:58:43 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 47 pg[7.7( v 36'39 (0'0,36'39] local-lis/les=43/44 n=1 ec=41/34 lis/c=43/43 les/c/f=44/46/0 sis=47 pruub=15.887340546s) [1,3,2] r=2 lpr=47 pi=[43,47)/1 luod=0'0 crt=36'39 mlcod 0'0 active pruub 1197.873168945s@ mbc={}] start_peering_interval up [0,2,4] -> [1,3,2], acting [0,2,4] -> [1,3,2], acting_primary 0 -> 1, up_primary 0 -> 1, role 1 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:58:43 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 47 pg[7.b( v 36'39 (0'0,36'39] local-lis/les=43/44 n=1 ec=41/34 lis/c=43/43 les/c/f=44/46/0 sis=47 pruub=15.879220009s) [1,3,2] r=2 lpr=47 pi=[43,47)/1 luod=0'0 crt=36'39 mlcod 0'0 active pruub 1197.864990234s@ mbc={}] start_peering_interval up [0,2,4] -> [1,3,2], acting [0,2,4] -> [1,3,2], acting_primary 0 -> 1, up_primary 0 -> 1, role 1 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:58:43 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 47 pg[7.7( v 36'39 (0'0,36'39] local-lis/les=43/44 n=1 ec=41/34 lis/c=43/43 les/c/f=44/46/0 sis=47 pruub=15.887211800s) [1,3,2] r=2 lpr=47 pi=[43,47)/1 crt=36'39 mlcod 0'0 unknown NOTIFY pruub 1197.873168945s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:43 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 47 pg[7.f( v 36'39 (0'0,36'39] local-lis/les=43/44 n=1 ec=41/34 lis/c=43/43 les/c/f=44/46/0 sis=47 pruub=15.878021240s) [1,3,2] r=2 lpr=47 pi=[43,47)/1 luod=0'0 crt=36'39 mlcod 0'0 active pruub 1197.864257812s@ mbc={}] start_peering_interval up [0,2,4] -> [1,3,2], acting [0,2,4] -> [1,3,2], acting_primary 0 -> 1, up_primary 0 -> 1, role 1 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:58:43 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 47 pg[7.3( v 36'39 (0'0,36'39] local-lis/les=43/44 n=2 ec=41/34 lis/c=43/43 les/c/f=44/46/0 sis=47 pruub=15.868871689s) [1,3,2] r=2 lpr=47 pi=[43,47)/1 luod=0'0 crt=36'39 mlcod 0'0 active pruub 1197.854614258s@ mbc={}] start_peering_interval up [0,2,4] -> [1,3,2], acting [0,2,4] -> [1,3,2], acting_primary 0 -> 1, up_primary 0 -> 1, role 1 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:58:43 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 47 pg[7.f( v 36'39 (0'0,36'39] local-lis/les=43/44 n=1 ec=41/34 lis/c=43/43 les/c/f=44/46/0 sis=47 pruub=15.877858162s) [1,3,2] r=2 lpr=47 pi=[43,47)/1 crt=36'39 mlcod 0'0 unknown NOTIFY pruub 1197.864257812s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:43 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 47 pg[7.3( v 36'39 (0'0,36'39] local-lis/les=43/44 n=2 ec=41/34 lis/c=43/43 les/c/f=44/46/0 sis=47 pruub=15.868235588s) [1,3,2] r=2 lpr=47 pi=[43,47)/1 crt=36'39 mlcod 0'0 unknown NOTIFY pruub 1197.854614258s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:43 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 47 pg[7.b( v 36'39 (0'0,36'39] local-lis/les=43/44 n=1 ec=41/34 lis/c=43/43 les/c/f=44/46/0 sis=47 pruub=15.878250122s) [1,3,2] r=2 lpr=47 pi=[43,47)/1 crt=36'39 mlcod 0'0 unknown NOTIFY pruub 1197.864990234s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:43 np0005626463.localdomain ceph-osd[31633]: log_channel(cluster) log [DBG] : 4.2 scrub starts
Feb 23 07:58:43 np0005626463.localdomain sudo[56389]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aezjmymhxerzqzygmmzieltjemictwls ; /usr/bin/python3
Feb 23 07:58:43 np0005626463.localdomain sudo[56389]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:58:43 np0005626463.localdomain ceph-osd[31633]: log_channel(cluster) log [DBG] : 4.2 scrub ok
Feb 23 07:58:43 np0005626463.localdomain python3[56391]: ansible-ansible.legacy.stat Invoked with path=/usr/local/sbin/containers-tmpwatch follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 07:58:43 np0005626463.localdomain sudo[56389]: pam_unix(sudo:session): session closed for user root
Feb 23 07:58:44 np0005626463.localdomain sudo[56432]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ataypyjknqltsnfryscahzyddarxzvxj ; /usr/bin/python3
Feb 23 07:58:44 np0005626463.localdomain sudo[56432]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:58:44 np0005626463.localdomain python3[56434]: ansible-ansible.legacy.copy Invoked with dest=/usr/local/sbin/containers-tmpwatch group=root mode=493 owner=root src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771833523.4632533-93080-128830097371912/source _original_basename=tmpww2g8u89 follow=False checksum=84397b037dad9813fed388c4bcdd4871f384cd22 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 07:58:44 np0005626463.localdomain sudo[56432]: pam_unix(sudo:session): session closed for user root
Feb 23 07:58:44 np0005626463.localdomain sudo[56462]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fmujggzevsmzesxepahaqqjxsberdism ; /usr/bin/python3
Feb 23 07:58:44 np0005626463.localdomain sudo[56462]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:58:44 np0005626463.localdomain python3[56464]: ansible-cron Invoked with job=/usr/local/sbin/containers-tmpwatch name=Remove old logs special_time=daily user=root state=present backup=False minute=* hour=* day=* month=* weekday=* disabled=False env=False cron_file=None insertafter=None insertbefore=None
Feb 23 07:58:44 np0005626463.localdomain crontab[56465]: (root) LIST (root)
Feb 23 07:58:44 np0005626463.localdomain crontab[56466]: (root) REPLACE (root)
Feb 23 07:58:44 np0005626463.localdomain sudo[56462]: pam_unix(sudo:session): session closed for user root
Feb 23 07:58:45 np0005626463.localdomain ceph-osd[32575]: log_channel(cluster) log [DBG] : 5.12 scrub starts
Feb 23 07:58:45 np0005626463.localdomain sudo[56480]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uacxmfbpippfdctyeexdqptmgjvkpnnw ; /usr/bin/python3
Feb 23 07:58:45 np0005626463.localdomain sudo[56480]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:58:45 np0005626463.localdomain python3[56482]: ansible-stat Invoked with path=/var/lib/tripleo-config/container-startup-config/step_2 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 23 07:58:45 np0005626463.localdomain sudo[56480]: pam_unix(sudo:session): session closed for user root
Feb 23 07:58:45 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 49 pg[7.4( v 36'39 (0'0,36'39] local-lis/les=41/42 n=2 ec=41/34 lis/c=41/41 les/c/f=42/42/0 sis=49 pruub=9.516323090s) [1,3,2] r=-1 lpr=49 pi=[41,49)/1 luod=0'0 crt=36'39 lcod 0'0 mlcod 0'0 active pruub 1188.767578125s@ mbc={}] start_peering_interval up [0,5,4] -> [1,3,2], acting [0,5,4] -> [1,3,2], acting_primary 0 -> 1, up_primary 0 -> 1, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:58:45 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 49 pg[7.c( v 36'39 (0'0,36'39] local-lis/les=41/42 n=1 ec=41/34 lis/c=41/41 les/c/f=42/42/0 sis=49 pruub=9.514539719s) [1,3,2] r=-1 lpr=49 pi=[41,49)/1 luod=0'0 crt=36'39 lcod 0'0 mlcod 0'0 active pruub 1188.765991211s@ mbc={}] start_peering_interval up [0,5,4] -> [1,3,2], acting [0,5,4] -> [1,3,2], acting_primary 0 -> 1, up_primary 0 -> 1, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:58:45 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 49 pg[7.4( v 36'39 (0'0,36'39] local-lis/les=41/42 n=2 ec=41/34 lis/c=41/41 les/c/f=42/42/0 sis=49 pruub=9.516221046s) [1,3,2] r=-1 lpr=49 pi=[41,49)/1 crt=36'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1188.767578125s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:45 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 49 pg[7.c( v 36'39 (0'0,36'39] local-lis/les=41/42 n=1 ec=41/34 lis/c=41/41 les/c/f=42/42/0 sis=49 pruub=9.514459610s) [1,3,2] r=-1 lpr=49 pi=[41,49)/1 crt=36'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1188.765991211s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:45 np0005626463.localdomain sudo[56530]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-owualtlmotchhagwekmntgmivbanhgfd ; /usr/bin/python3
Feb 23 07:58:45 np0005626463.localdomain sudo[56530]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:58:45 np0005626463.localdomain sudo[56530]: pam_unix(sudo:session): session closed for user root
Feb 23 07:58:45 np0005626463.localdomain sudo[56548]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rhnvsasqlflbocqqxyznucwviikqqhdh ; /usr/bin/python3
Feb 23 07:58:45 np0005626463.localdomain sudo[56548]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:58:46 np0005626463.localdomain sudo[56548]: pam_unix(sudo:session): session closed for user root
Feb 23 07:58:46 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 49 pg[7.4( empty local-lis/les=0/0 n=0 ec=41/34 lis/c=41/41 les/c/f=42/42/0 sis=49) [1,3,2] r=2 lpr=49 pi=[41,49)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:46 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 49 pg[7.c( empty local-lis/les=0/0 n=0 ec=41/34 lis/c=41/41 les/c/f=42/42/0 sis=49) [1,3,2] r=2 lpr=49 pi=[41,49)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:46 np0005626463.localdomain sudo[56652]: tripleo-admin : TTY=pts/0 ; PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lcqwlkzpwskqffkrpyyappurzfoufxar ; ANSIBLE_ASYNC_DIR=/tmp/.ansible_async /usr/bin/python3 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1771833526.3414202-93440-221100782994957/async_wrapper.py 726138795097 3600 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1771833526.3414202-93440-221100782994957/AnsiballZ_command.py _
Feb 23 07:58:46 np0005626463.localdomain sudo[56652]: pam_unix(sudo:session): session opened for user root(uid=0) by tripleo-admin(uid=1003)
Feb 23 07:58:46 np0005626463.localdomain ansible-async_wrapper.py[56654]: Invoked with 726138795097 3600 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1771833526.3414202-93440-221100782994957/AnsiballZ_command.py _
Feb 23 07:58:46 np0005626463.localdomain ansible-async_wrapper.py[56657]: Starting module and watcher
Feb 23 07:58:46 np0005626463.localdomain ansible-async_wrapper.py[56657]: Start watching 56658 (3600)
Feb 23 07:58:46 np0005626463.localdomain ansible-async_wrapper.py[56658]: Start module (56658)
Feb 23 07:58:46 np0005626463.localdomain ansible-async_wrapper.py[56654]: Return async_wrapper task started.
Feb 23 07:58:46 np0005626463.localdomain sudo[56652]: pam_unix(sudo:session): session closed for user root
Feb 23 07:58:47 np0005626463.localdomain sudo[56674]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ycybvtifylkgpukkdttjdpwctkmwzngr ; /usr/bin/python3
Feb 23 07:58:47 np0005626463.localdomain sudo[56674]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:58:47 np0005626463.localdomain python3[56678]: ansible-ansible.legacy.async_status Invoked with jid=726138795097.56654 mode=status _async_dir=/tmp/.ansible_async
Feb 23 07:58:47 np0005626463.localdomain sudo[56674]: pam_unix(sudo:session): session closed for user root
Feb 23 07:58:48 np0005626463.localdomain ceph-osd[31633]: log_channel(cluster) log [DBG] : 5.d scrub starts
Feb 23 07:58:48 np0005626463.localdomain ceph-osd[31633]: log_channel(cluster) log [DBG] : 5.d scrub ok
Feb 23 07:58:49 np0005626463.localdomain ceph-osd[32575]: log_channel(cluster) log [DBG] : 5.b scrub starts
Feb 23 07:58:49 np0005626463.localdomain ceph-osd[32575]: log_channel(cluster) log [DBG] : 5.b scrub ok
Feb 23 07:58:49 np0005626463.localdomain ceph-osd[31633]: log_channel(cluster) log [DBG] : 4.1d scrub starts
Feb 23 07:58:49 np0005626463.localdomain ceph-osd[31633]: log_channel(cluster) log [DBG] : 4.1d scrub ok
Feb 23 07:58:50 np0005626463.localdomain puppet-user[56676]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5
Feb 23 07:58:50 np0005626463.localdomain puppet-user[56676]:    (file: /etc/puppet/hiera.yaml)
Feb 23 07:58:50 np0005626463.localdomain puppet-user[56676]: Warning: Undefined variable '::deploy_config_name';
Feb 23 07:58:50 np0005626463.localdomain puppet-user[56676]:    (file & line not available)
Feb 23 07:58:50 np0005626463.localdomain puppet-user[56676]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html
Feb 23 07:58:50 np0005626463.localdomain puppet-user[56676]:    (file & line not available)
Feb 23 07:58:50 np0005626463.localdomain puppet-user[56676]: Warning: Unknown variable: '::deployment_type'. (file: /etc/puppet/modules/tripleo/manifests/profile/base/database/mysql/client.pp, line: 89, column: 8)
Feb 23 07:58:50 np0005626463.localdomain puppet-user[56676]: Warning: Unknown variable: '::deployment_type'. (file: /etc/puppet/modules/tripleo/manifests/packages.pp, line: 39, column: 69)
Feb 23 07:58:50 np0005626463.localdomain puppet-user[56676]: Notice: Compiled catalog for np0005626463.localdomain in environment production in 0.14 seconds
Feb 23 07:58:51 np0005626463.localdomain puppet-user[56676]: Notice: Applied catalog in 0.04 seconds
Feb 23 07:58:51 np0005626463.localdomain puppet-user[56676]: Application:
Feb 23 07:58:51 np0005626463.localdomain puppet-user[56676]:    Initial environment: production
Feb 23 07:58:51 np0005626463.localdomain puppet-user[56676]:    Converged environment: production
Feb 23 07:58:51 np0005626463.localdomain puppet-user[56676]:          Run mode: user
Feb 23 07:58:51 np0005626463.localdomain puppet-user[56676]: Changes:
Feb 23 07:58:51 np0005626463.localdomain puppet-user[56676]: Events:
Feb 23 07:58:51 np0005626463.localdomain puppet-user[56676]: Resources:
Feb 23 07:58:51 np0005626463.localdomain puppet-user[56676]:             Total: 10
Feb 23 07:58:51 np0005626463.localdomain puppet-user[56676]: Time:
Feb 23 07:58:51 np0005626463.localdomain puppet-user[56676]:          Schedule: 0.00
Feb 23 07:58:51 np0005626463.localdomain puppet-user[56676]:              File: 0.00
Feb 23 07:58:51 np0005626463.localdomain puppet-user[56676]:              Exec: 0.01
Feb 23 07:58:51 np0005626463.localdomain puppet-user[56676]:            Augeas: 0.01
Feb 23 07:58:51 np0005626463.localdomain puppet-user[56676]:    Transaction evaluation: 0.04
Feb 23 07:58:51 np0005626463.localdomain puppet-user[56676]:    Catalog application: 0.04
Feb 23 07:58:51 np0005626463.localdomain puppet-user[56676]:    Config retrieval: 0.18
Feb 23 07:58:51 np0005626463.localdomain puppet-user[56676]:          Last run: 1771833531
Feb 23 07:58:51 np0005626463.localdomain puppet-user[56676]:        Filebucket: 0.00
Feb 23 07:58:51 np0005626463.localdomain puppet-user[56676]:             Total: 0.05
Feb 23 07:58:51 np0005626463.localdomain puppet-user[56676]: Version:
Feb 23 07:58:51 np0005626463.localdomain puppet-user[56676]:            Config: 1771833530
Feb 23 07:58:51 np0005626463.localdomain puppet-user[56676]:            Puppet: 7.10.0
Feb 23 07:58:51 np0005626463.localdomain ansible-async_wrapper.py[56658]: Module complete (56658)
Feb 23 07:58:51 np0005626463.localdomain ansible-async_wrapper.py[56657]: Done in kid B.
Feb 23 07:58:52 np0005626463.localdomain ceph-osd[32575]: log_channel(cluster) log [DBG] : 6.1c scrub starts
Feb 23 07:58:52 np0005626463.localdomain ceph-osd[32575]: log_channel(cluster) log [DBG] : 6.1c scrub ok
Feb 23 07:58:53 np0005626463.localdomain ceph-osd[32575]: log_channel(cluster) log [DBG] : 4.8 scrub starts
Feb 23 07:58:53 np0005626463.localdomain ceph-osd[32575]: log_channel(cluster) log [DBG] : 4.8 scrub ok
Feb 23 07:58:53 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 51 pg[7.5( v 36'39 (0'0,36'39] local-lis/les=43/44 n=2 ec=41/34 lis/c=43/43 les/c/f=44/46/0 sis=51 pruub=13.619922638s) [2,4,0] r=0 lpr=51 pi=[43,51)/1 luod=0'0 crt=36'39 mlcod 0'0 active pruub 1205.873291016s@ mbc={}] start_peering_interval up [0,2,4] -> [2,4,0], acting [0,2,4] -> [2,4,0], acting_primary 0 -> 2, up_primary 0 -> 2, role 1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:58:53 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 51 pg[7.5( v 36'39 (0'0,36'39] local-lis/les=43/44 n=2 ec=41/34 lis/c=43/43 les/c/f=44/46/0 sis=51 pruub=13.619922638s) [2,4,0] r=0 lpr=51 pi=[43,51)/1 crt=36'39 mlcod 0'0 unknown pruub 1205.873291016s@ mbc={}] state<Start>: transitioning to Primary
Feb 23 07:58:53 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 51 pg[7.d( v 36'39 (0'0,36'39] local-lis/les=43/44 n=1 ec=41/34 lis/c=43/43 les/c/f=44/46/0 sis=51 pruub=13.609990120s) [2,4,0] r=0 lpr=51 pi=[43,51)/1 luod=0'0 crt=36'39 mlcod 0'0 active pruub 1205.864257812s@ mbc={}] start_peering_interval up [0,2,4] -> [2,4,0], acting [0,2,4] -> [2,4,0], acting_primary 0 -> 2, up_primary 0 -> 2, role 1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:58:53 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 51 pg[7.d( v 36'39 (0'0,36'39] local-lis/les=43/44 n=1 ec=41/34 lis/c=43/43 les/c/f=44/46/0 sis=51 pruub=13.609990120s) [2,4,0] r=0 lpr=51 pi=[43,51)/1 crt=36'39 mlcod 0'0 unknown pruub 1205.864257812s@ mbc={}] state<Start>: transitioning to Primary
Feb 23 07:58:53 np0005626463.localdomain sudo[56790]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 07:58:53 np0005626463.localdomain sudo[56790]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 07:58:53 np0005626463.localdomain sudo[56790]: pam_unix(sudo:session): session closed for user root
Feb 23 07:58:53 np0005626463.localdomain sudo[56805]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 check-host
Feb 23 07:58:53 np0005626463.localdomain sudo[56805]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 07:58:54 np0005626463.localdomain sudo[56805]: pam_unix(sudo:session): session closed for user root
Feb 23 07:58:54 np0005626463.localdomain sudo[56840]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 07:58:54 np0005626463.localdomain sudo[56840]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 07:58:54 np0005626463.localdomain sudo[56840]: pam_unix(sudo:session): session closed for user root
Feb 23 07:58:54 np0005626463.localdomain sudo[56855]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 23 07:58:54 np0005626463.localdomain sudo[56855]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 07:58:54 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 52 pg[7.5( v 36'39 (0'0,36'39] local-lis/les=51/52 n=2 ec=41/34 lis/c=43/43 les/c/f=44/46/0 sis=51) [2,4,0] r=0 lpr=51 pi=[43,51)/1 crt=36'39 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 23 07:58:54 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 52 pg[7.d( v 36'39 (0'0,36'39] local-lis/les=51/52 n=1 ec=41/34 lis/c=43/43 les/c/f=44/46/0 sis=51) [2,4,0] r=0 lpr=51 pi=[43,51)/1 crt=36'39 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 23 07:58:55 np0005626463.localdomain ceph-osd[32575]: log_channel(cluster) log [DBG] : 4.15 scrub starts
Feb 23 07:58:55 np0005626463.localdomain ceph-osd[32575]: log_channel(cluster) log [DBG] : 4.15 scrub ok
Feb 23 07:58:55 np0005626463.localdomain sudo[56855]: pam_unix(sudo:session): session closed for user root
Feb 23 07:58:55 np0005626463.localdomain sudo[56901]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 07:58:55 np0005626463.localdomain sudo[56901]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 07:58:55 np0005626463.localdomain sudo[56901]: pam_unix(sudo:session): session closed for user root
Feb 23 07:58:55 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 53 pg[7.6( v 36'39 (0'0,36'39] local-lis/les=45/46 n=2 ec=41/34 lis/c=45/45 les/c/f=46/46/0 sis=53 pruub=13.583827972s) [1,0,5] r=2 lpr=53 pi=[45,53)/1 luod=0'0 crt=36'39 lcod 0'0 mlcod 0'0 active pruub 1203.042968750s@ mbc={}] start_peering_interval up [4,5,0] -> [1,0,5], acting [4,5,0] -> [1,0,5], acting_primary 4 -> 1, up_primary 4 -> 1, role 1 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:58:55 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 53 pg[7.e( v 36'39 (0'0,36'39] local-lis/les=45/46 n=1 ec=41/34 lis/c=45/45 les/c/f=46/46/0 sis=53 pruub=13.579640388s) [1,0,5] r=2 lpr=53 pi=[45,53)/1 luod=0'0 crt=36'39 lcod 0'0 mlcod 0'0 active pruub 1203.038696289s@ mbc={}] start_peering_interval up [4,5,0] -> [1,0,5], acting [4,5,0] -> [1,0,5], acting_primary 4 -> 1, up_primary 4 -> 1, role 1 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:58:55 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 53 pg[7.6( v 36'39 (0'0,36'39] local-lis/les=45/46 n=2 ec=41/34 lis/c=45/45 les/c/f=46/46/0 sis=53 pruub=13.583735466s) [1,0,5] r=2 lpr=53 pi=[45,53)/1 crt=36'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1203.042968750s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:55 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 53 pg[7.e( v 36'39 (0'0,36'39] local-lis/les=45/46 n=1 ec=41/34 lis/c=45/45 les/c/f=46/46/0 sis=53 pruub=13.579553604s) [1,0,5] r=2 lpr=53 pi=[45,53)/1 crt=36'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1203.038696289s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:58:56 np0005626463.localdomain ceph-osd[32575]: log_channel(cluster) log [DBG] : 5.4 scrub starts
Feb 23 07:58:56 np0005626463.localdomain ceph-osd[32575]: log_channel(cluster) log [DBG] : 5.4 scrub ok
Feb 23 07:58:57 np0005626463.localdomain sudo[56929]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gcyglfftcymgvsclvdqaxafczdipispe ; /usr/bin/python3
Feb 23 07:58:57 np0005626463.localdomain sudo[56929]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:58:57 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.
Feb 23 07:58:57 np0005626463.localdomain systemd[1]: tmp-crun.YsNcMR.mount: Deactivated successfully.
Feb 23 07:58:57 np0005626463.localdomain podman[56932]: 2026-02-23 07:58:57.542230134 +0000 UTC m=+0.097080617 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=metrics_qdr, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:14Z, url=https://www.redhat.com, version=17.1.13, build-date=2026-01-12T22:10:14Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, tcib_managed=true, vcs-type=git, batch=17.1_20260112.1, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, name=rhosp-rhel9/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']})
Feb 23 07:58:57 np0005626463.localdomain python3[56931]: ansible-ansible.legacy.async_status Invoked with jid=726138795097.56654 mode=status _async_dir=/tmp/.ansible_async
Feb 23 07:58:57 np0005626463.localdomain sudo[56929]: pam_unix(sudo:session): session closed for user root
Feb 23 07:58:57 np0005626463.localdomain podman[56932]: 2026-02-23 07:58:57.729860551 +0000 UTC m=+0.284710964 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, tcib_managed=true, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, vendor=Red Hat, Inc., release=1766032510, container_name=metrics_qdr, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, name=rhosp-rhel9/openstack-qdrouterd, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, build-date=2026-01-12T22:10:14Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, architecture=x86_64, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, com.redhat.component=openstack-qdrouterd-container)
Feb 23 07:58:57 np0005626463.localdomain systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully.
Feb 23 07:58:58 np0005626463.localdomain sudo[56976]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gxmlrfczzyfethrfmybdptajmnvqijfc ; /usr/bin/python3
Feb 23 07:58:58 np0005626463.localdomain sudo[56976]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:58:58 np0005626463.localdomain python3[56978]: ansible-file Invoked with path=/var/lib/container-puppet/puppetlabs state=directory setype=svirt_sandbox_file_t selevel=s0 recurse=True force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None
Feb 23 07:58:58 np0005626463.localdomain sudo[56976]: pam_unix(sudo:session): session closed for user root
Feb 23 07:58:58 np0005626463.localdomain sudo[56992]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qrbjtzrexilqiskvqjmxhzbnnjpeymfd ; /usr/bin/python3
Feb 23 07:58:58 np0005626463.localdomain sudo[56992]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:58:58 np0005626463.localdomain python3[56994]: ansible-stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 23 07:58:58 np0005626463.localdomain sudo[56992]: pam_unix(sudo:session): session closed for user root
Feb 23 07:58:59 np0005626463.localdomain sudo[57042]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-clrsjjwlvxdyfbugpsjshedmqfvsgxug ; /usr/bin/python3
Feb 23 07:58:59 np0005626463.localdomain sudo[57042]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:58:59 np0005626463.localdomain ceph-osd[32575]: log_channel(cluster) log [DBG] : 4.9 deep-scrub starts
Feb 23 07:58:59 np0005626463.localdomain ceph-osd[32575]: log_channel(cluster) log [DBG] : 4.9 deep-scrub ok
Feb 23 07:58:59 np0005626463.localdomain python3[57044]: ansible-ansible.legacy.stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 07:58:59 np0005626463.localdomain sudo[57042]: pam_unix(sudo:session): session closed for user root
Feb 23 07:58:59 np0005626463.localdomain sudo[57060]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-genwmbwquofgpwenhhuygbkeyrcwdwwp ; /usr/bin/python3
Feb 23 07:58:59 np0005626463.localdomain sudo[57060]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:58:59 np0005626463.localdomain python3[57062]: ansible-ansible.legacy.file Invoked with setype=svirt_sandbox_file_t selevel=s0 dest=/var/lib/container-puppet/puppetlabs/facter.conf _original_basename=tmpy_6141th recurse=False state=file path=/var/lib/container-puppet/puppetlabs/facter.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None
Feb 23 07:58:59 np0005626463.localdomain sudo[57060]: pam_unix(sudo:session): session closed for user root
Feb 23 07:58:59 np0005626463.localdomain sudo[57090]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zfvoeytcpbozuyirxvuxmhmuevxqqasn ; /usr/bin/python3
Feb 23 07:58:59 np0005626463.localdomain sudo[57090]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:58:59 np0005626463.localdomain python3[57092]: ansible-file Invoked with path=/opt/puppetlabs/facter state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 07:58:59 np0005626463.localdomain sudo[57090]: pam_unix(sudo:session): session closed for user root
Feb 23 07:59:00 np0005626463.localdomain sudo[57106]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wuunajyohrlxlsoskaewwkbdgidwddwx ; /usr/bin/python3
Feb 23 07:59:00 np0005626463.localdomain sudo[57106]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:59:00 np0005626463.localdomain ceph-osd[32575]: log_channel(cluster) log [DBG] : 6.1b scrub starts
Feb 23 07:59:00 np0005626463.localdomain ceph-osd[32575]: log_channel(cluster) log [DBG] : 6.1b scrub ok
Feb 23 07:59:00 np0005626463.localdomain ceph-osd[31633]: log_channel(cluster) log [DBG] : 5.1a scrub starts
Feb 23 07:59:00 np0005626463.localdomain sudo[57106]: pam_unix(sudo:session): session closed for user root
Feb 23 07:59:00 np0005626463.localdomain ceph-osd[31633]: log_channel(cluster) log [DBG] : 5.1a scrub ok
Feb 23 07:59:01 np0005626463.localdomain sudo[57193]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gdnvcfandubgmxkcirfysfeahrxdirup ; /usr/bin/python3
Feb 23 07:59:01 np0005626463.localdomain sudo[57193]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:59:01 np0005626463.localdomain python3[57195]: ansible-ansible.posix.synchronize Invoked with src=/opt/puppetlabs/ dest=/var/lib/container-puppet/puppetlabs/ _local_rsync_path=rsync _local_rsync_password=NOT_LOGGING_PARAMETER rsync_path=None delete=False _substitute_controller=False archive=True checksum=False compress=True existing_only=False dirs=False copy_links=False set_remote_user=True rsync_timeout=0 rsync_opts=[] ssh_connection_multiplexing=False partial=False verify_host=False mode=push dest_port=None private_key=None recursive=None links=None perms=None times=None owner=None group=None ssh_args=None link_dest=None
Feb 23 07:59:01 np0005626463.localdomain sudo[57193]: pam_unix(sudo:session): session closed for user root
Feb 23 07:59:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb 23 07:59:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Cumulative writes: 4315 writes, 20K keys, 4315 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s
                                                          Cumulative WAL: 4315 writes, 358 syncs, 12.05 writes per sync, written: 0.02 GB, 0.01 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 1058 writes, 3920 keys, 1058 commit groups, 1.0 writes per commit group, ingest: 1.69 MB, 0.00 MB/s
                                                          Interval WAL: 1058 writes, 214 syncs, 4.94 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          
                                                          ** Compaction Stats [default] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      2/0    2.61 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.005       0      0       0.0       0.0
                                                           Sum      2/0    2.61 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.005       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [default] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.2      0.01              0.00         1    0.005       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x557956c5c850#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 7.8e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [default] **
                                                          
                                                          ** Compaction Stats [m-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x557956c5c850#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 7.8e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-0] **
                                                          
                                                          ** Compaction Stats [m-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x557956c5c850#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 7.8e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-1] **
                                                          
                                                          ** Compaction Stats [m-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x557956c5c850#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 7.8e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-2] **
                                                          
                                                          ** Compaction Stats [p-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      1/0    1.57 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Sum      1/0    1.57 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.4      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x557956c5c850#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 7.8e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-0] **
                                                          
                                                          ** Compaction Stats [p-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x557956c5c850#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 7.8e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-1] **
                                                          
                                                          ** Compaction Stats [p-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x557956c5c850#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 7.8e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-2] **
                                                          
                                                          ** Compaction Stats [O-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x557956c5c2d0#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 1.3e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-0] **
                                                          
                                                          ** Compaction Stats [O-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x557956c5c2d0#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 1.3e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-1] **
                                                          
                                                          ** Compaction Stats [O-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      1/0    1.26 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0
                                                           Sum      1/0    1.26 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x557956c5c2d0#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 1.3e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-2] **
                                                          
                                                          ** Compaction Stats [L] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [L] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x557956c5c850#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 7.8e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [L] **
                                                          
                                                          ** Compaction Stats [P] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [P] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x557956c5c850#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 7.8e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [P] **
Feb 23 07:59:01 np0005626463.localdomain ceph-osd[31633]: log_channel(cluster) log [DBG] : 4.1f scrub starts
Feb 23 07:59:01 np0005626463.localdomain ceph-osd[31633]: log_channel(cluster) log [DBG] : 4.1f scrub ok
Feb 23 07:59:01 np0005626463.localdomain sudo[57212]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bmpgungctbuwcmxnqlevdezcylehskxs ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 07:59:01 np0005626463.localdomain sudo[57212]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:59:01 np0005626463.localdomain python3[57214]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 07:59:01 np0005626463.localdomain sudo[57212]: pam_unix(sudo:session): session closed for user root
Feb 23 07:59:02 np0005626463.localdomain ceph-osd[32575]: log_channel(cluster) log [DBG] : 6.17 scrub starts
Feb 23 07:59:02 np0005626463.localdomain ceph-osd[32575]: log_channel(cluster) log [DBG] : 6.17 scrub ok
Feb 23 07:59:02 np0005626463.localdomain sudo[57228]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-irofvuhdqfwkrczcwbzxwfeoberboxvd ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 07:59:02 np0005626463.localdomain sudo[57228]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:59:02 np0005626463.localdomain sudo[57228]: pam_unix(sudo:session): session closed for user root
Feb 23 07:59:02 np0005626463.localdomain sudo[57244]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ladnwvmvuzujwffztzwooumbuxyyrtlm ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 07:59:02 np0005626463.localdomain sudo[57244]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:59:02 np0005626463.localdomain python3[57246]: ansible-stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 23 07:59:02 np0005626463.localdomain sudo[57244]: pam_unix(sudo:session): session closed for user root
Feb 23 07:59:03 np0005626463.localdomain sudo[57294]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ysawtyhmujyecmjxcdvipalstpynyxnx ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 07:59:03 np0005626463.localdomain sudo[57294]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:59:03 np0005626463.localdomain python3[57296]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-container-shutdown follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 07:59:03 np0005626463.localdomain sudo[57294]: pam_unix(sudo:session): session closed for user root
Feb 23 07:59:03 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 55 pg[7.f( v 36'39 (0'0,36'39] local-lis/les=47/48 n=1 ec=41/34 lis/c=47/47 les/c/f=48/48/0 sis=55 pruub=12.910213470s) [3,5,1] r=-1 lpr=55 pi=[47,55)/1 luod=0'0 crt=36'39 mlcod 0'0 active pruub 1215.075073242s@ mbc={}] start_peering_interval up [1,3,2] -> [3,5,1], acting [1,3,2] -> [3,5,1], acting_primary 1 -> 3, up_primary 1 -> 3, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:59:03 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 55 pg[7.f( v 36'39 (0'0,36'39] local-lis/les=47/48 n=1 ec=41/34 lis/c=47/47 les/c/f=48/48/0 sis=55 pruub=12.910005569s) [3,5,1] r=-1 lpr=55 pi=[47,55)/1 crt=36'39 mlcod 0'0 unknown NOTIFY pruub 1215.075073242s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:59:03 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 55 pg[7.7( v 36'39 (0'0,36'39] local-lis/les=47/48 n=1 ec=41/34 lis/c=47/47 les/c/f=48/48/0 sis=55 pruub=12.909722328s) [3,5,1] r=-1 lpr=55 pi=[47,55)/1 luod=0'0 crt=36'39 mlcod 0'0 active pruub 1215.074951172s@ mbc={}] start_peering_interval up [1,3,2] -> [3,5,1], acting [1,3,2] -> [3,5,1], acting_primary 1 -> 3, up_primary 1 -> 3, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:59:03 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 55 pg[7.7( v 36'39 (0'0,36'39] local-lis/les=47/48 n=1 ec=41/34 lis/c=47/47 les/c/f=48/48/0 sis=55 pruub=12.909592628s) [3,5,1] r=-1 lpr=55 pi=[47,55)/1 crt=36'39 mlcod 0'0 unknown NOTIFY pruub 1215.074951172s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:59:03 np0005626463.localdomain sudo[57312]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nvlhoywgjwfwvdbmsblwcnwaxpkuuqah ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 07:59:03 np0005626463.localdomain sudo[57312]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:59:03 np0005626463.localdomain python3[57314]: ansible-ansible.legacy.file Invoked with mode=0700 owner=root group=root dest=/usr/libexec/tripleo-container-shutdown _original_basename=tripleo-container-shutdown recurse=False state=file path=/usr/libexec/tripleo-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 07:59:03 np0005626463.localdomain sudo[57312]: pam_unix(sudo:session): session closed for user root
Feb 23 07:59:04 np0005626463.localdomain ceph-osd[32575]: log_channel(cluster) log [DBG] : 5.13 scrub starts
Feb 23 07:59:04 np0005626463.localdomain sudo[57374]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-otywcfqgxhoisyedqfirpkijhbswhjbf ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 07:59:04 np0005626463.localdomain sudo[57374]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:59:04 np0005626463.localdomain ceph-osd[32575]: log_channel(cluster) log [DBG] : 5.13 scrub ok
Feb 23 07:59:04 np0005626463.localdomain python3[57376]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-start-podman-container follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 07:59:04 np0005626463.localdomain sudo[57374]: pam_unix(sudo:session): session closed for user root
Feb 23 07:59:04 np0005626463.localdomain sudo[57392]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ypaddmedapidfpgpmbxtkidgfefhufzc ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 07:59:04 np0005626463.localdomain sudo[57392]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:59:04 np0005626463.localdomain ceph-osd[31633]: log_channel(cluster) log [DBG] : 4.1 scrub starts
Feb 23 07:59:04 np0005626463.localdomain ceph-osd[31633]: log_channel(cluster) log [DBG] : 4.1 scrub ok
Feb 23 07:59:04 np0005626463.localdomain python3[57394]: ansible-ansible.legacy.file Invoked with mode=0700 owner=root group=root dest=/usr/libexec/tripleo-start-podman-container _original_basename=tripleo-start-podman-container recurse=False state=file path=/usr/libexec/tripleo-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 07:59:04 np0005626463.localdomain sudo[57392]: pam_unix(sudo:session): session closed for user root
Feb 23 07:59:04 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 55 pg[7.7( empty local-lis/les=0/0 n=0 ec=41/34 lis/c=47/47 les/c/f=48/48/0 sis=55) [3,5,1] r=1 lpr=55 pi=[47,55)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 23 07:59:04 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 55 pg[7.f( empty local-lis/les=0/0 n=0 ec=41/34 lis/c=47/47 les/c/f=48/48/0 sis=55) [3,5,1] r=1 lpr=55 pi=[47,55)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 23 07:59:04 np0005626463.localdomain sudo[57454]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xajzomxioshxajehpgkhtdnqyaiqtrtw ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 07:59:04 np0005626463.localdomain sudo[57454]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:59:05 np0005626463.localdomain python3[57456]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/tripleo-container-shutdown.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 07:59:05 np0005626463.localdomain sudo[57454]: pam_unix(sudo:session): session closed for user root
Feb 23 07:59:05 np0005626463.localdomain sudo[57472]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ucoanxatoyvopheavjbrficbxmteiwgt ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 07:59:05 np0005626463.localdomain sudo[57472]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:59:05 np0005626463.localdomain python3[57474]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system/tripleo-container-shutdown.service _original_basename=tripleo-container-shutdown-service recurse=False state=file path=/usr/lib/systemd/system/tripleo-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 07:59:05 np0005626463.localdomain sudo[57472]: pam_unix(sudo:session): session closed for user root
Feb 23 07:59:05 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 57 pg[7.8( v 36'39 (0'0,36'39] local-lis/les=41/42 n=1 ec=41/34 lis/c=41/41 les/c/f=42/42/0 sis=57 pruub=13.205279350s) [1,0,5] r=2 lpr=57 pi=[41,57)/1 luod=0'0 crt=36'39 lcod 0'0 mlcod 0'0 active pruub 1212.765014648s@ mbc={}] start_peering_interval up [0,5,4] -> [1,0,5], acting [0,5,4] -> [1,0,5], acting_primary 0 -> 1, up_primary 0 -> 1, role 1 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:59:05 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 57 pg[7.8( v 36'39 (0'0,36'39] local-lis/les=41/42 n=1 ec=41/34 lis/c=41/41 les/c/f=42/42/0 sis=57 pruub=13.205115318s) [1,0,5] r=2 lpr=57 pi=[41,57)/1 crt=36'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1212.765014648s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:59:05 np0005626463.localdomain sudo[57534]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bsryzuelwhhcsugodaexugauunqftppp ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 07:59:05 np0005626463.localdomain sudo[57534]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:59:06 np0005626463.localdomain python3[57536]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 07:59:06 np0005626463.localdomain sudo[57534]: pam_unix(sudo:session): session closed for user root
Feb 23 07:59:06 np0005626463.localdomain sudo[57552]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vedrkvhsxljtytpbezbbmonijubsqjbw ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 07:59:06 np0005626463.localdomain sudo[57552]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:59:06 np0005626463.localdomain ceph-osd[32575]: log_channel(cluster) log [DBG] : 6.12 scrub starts
Feb 23 07:59:06 np0005626463.localdomain ceph-osd[32575]: log_channel(cluster) log [DBG] : 6.12 scrub ok
Feb 23 07:59:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb 23 07:59:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Cumulative writes: 4867 writes, 22K keys, 4867 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s
                                                          Cumulative WAL: 4867 writes, 489 syncs, 9.95 writes per sync, written: 0.02 GB, 0.01 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 1482 writes, 5427 keys, 1482 commit groups, 1.0 writes per commit group, ingest: 2.10 MB, 0.00 MB/s
                                                          Interval WAL: 1482 writes, 293 syncs, 5.06 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          
                                                          ** Compaction Stats [default] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      2/0    2.61 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.007       0      0       0.0       0.0
                                                           Sum      2/0    2.61 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.007       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [default] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.2      0.01              0.00         1    0.007       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x564b561042d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 6.7e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [default] **
                                                          
                                                          ** Compaction Stats [m-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x564b561042d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 6.7e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-0] **
                                                          
                                                          ** Compaction Stats [m-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x564b561042d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 6.7e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-1] **
                                                          
                                                          ** Compaction Stats [m-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x564b561042d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 6.7e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-2] **
                                                          
                                                          ** Compaction Stats [p-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      1/0    1.57 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.01              0.00         1    0.005       0      0       0.0       0.0
                                                           Sum      1/0    1.57 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.01              0.00         1    0.005       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.3      0.01              0.00         1    0.005       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x564b561042d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 6.7e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-0] **
                                                          
                                                          ** Compaction Stats [p-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x564b561042d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 6.7e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-1] **
                                                          
                                                          ** Compaction Stats [p-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x564b561042d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 6.7e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-2] **
                                                          
                                                          ** Compaction Stats [O-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x564b56105610#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 1e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-0] **
                                                          
                                                          ** Compaction Stats [O-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x564b56105610#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 1e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-1] **
                                                          
                                                          ** Compaction Stats [O-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      1/0    1.26 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0
                                                           Sum      1/0    1.26 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x564b56105610#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 1e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-2] **
                                                          
                                                          ** Compaction Stats [L] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [L] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x564b561042d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 6.7e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [L] **
                                                          
                                                          ** Compaction Stats [P] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [P] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x564b561042d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 6.7e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [P] **
Feb 23 07:59:06 np0005626463.localdomain python3[57554]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset _original_basename=91-tripleo-container-shutdown-preset recurse=False state=file path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 07:59:06 np0005626463.localdomain sudo[57552]: pam_unix(sudo:session): session closed for user root
Feb 23 07:59:06 np0005626463.localdomain sshd[57569]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 07:59:06 np0005626463.localdomain sudo[57584]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nhbedcenwncdvvxyglqrurcnpsviwdxb ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 07:59:06 np0005626463.localdomain sudo[57584]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:59:06 np0005626463.localdomain python3[57586]: ansible-systemd Invoked with name=tripleo-container-shutdown state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 07:59:06 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 07:59:06 np0005626463.localdomain sshd[57569]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 07:59:06 np0005626463.localdomain systemd-sysv-generator[57616]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 07:59:06 np0005626463.localdomain systemd-rc-local-generator[57610]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 07:59:07 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 07:59:07 np0005626463.localdomain ceph-osd[32575]: log_channel(cluster) log [DBG] : 4.14 scrub starts
Feb 23 07:59:07 np0005626463.localdomain sudo[57584]: pam_unix(sudo:session): session closed for user root
Feb 23 07:59:07 np0005626463.localdomain ceph-osd[32575]: log_channel(cluster) log [DBG] : 4.14 scrub ok
Feb 23 07:59:07 np0005626463.localdomain sudo[57670]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fpiwcmqtdgdqgndrxildeejsocordgsw ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 07:59:07 np0005626463.localdomain sudo[57670]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:59:07 np0005626463.localdomain python3[57672]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/netns-placeholder.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 07:59:07 np0005626463.localdomain sudo[57670]: pam_unix(sudo:session): session closed for user root
Feb 23 07:59:07 np0005626463.localdomain sudo[57688]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-txdcmfekgasoqcuckibphepasjgtomjz ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 07:59:07 np0005626463.localdomain sudo[57688]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:59:08 np0005626463.localdomain python3[57690]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/usr/lib/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 07:59:08 np0005626463.localdomain sudo[57688]: pam_unix(sudo:session): session closed for user root
Feb 23 07:59:08 np0005626463.localdomain ceph-osd[32575]: log_channel(cluster) log [DBG] : 5.12 scrub starts
Feb 23 07:59:08 np0005626463.localdomain ceph-osd[32575]: log_channel(cluster) log [DBG] : 5.12 scrub ok
Feb 23 07:59:08 np0005626463.localdomain sudo[57750]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eidaexnzdbtjwuvppybguqyfcunmmixr ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 07:59:08 np0005626463.localdomain sudo[57750]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:59:08 np0005626463.localdomain python3[57752]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 07:59:08 np0005626463.localdomain sudo[57750]: pam_unix(sudo:session): session closed for user root
Feb 23 07:59:08 np0005626463.localdomain sudo[57768]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ofrtihythzrhavlapwqkaxbuzhbztiyl ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 07:59:08 np0005626463.localdomain sudo[57768]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:59:08 np0005626463.localdomain python3[57770]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 07:59:08 np0005626463.localdomain sudo[57768]: pam_unix(sudo:session): session closed for user root
Feb 23 07:59:09 np0005626463.localdomain sudo[57798]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zvgndxbsaqnflqgahamngliihhthbqyk ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 07:59:09 np0005626463.localdomain sudo[57798]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:59:09 np0005626463.localdomain python3[57800]: ansible-systemd Invoked with name=netns-placeholder state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 07:59:09 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 07:59:09 np0005626463.localdomain ceph-osd[31633]: log_channel(cluster) log [DBG] : 4.19 scrub starts
Feb 23 07:59:09 np0005626463.localdomain ceph-osd[31633]: log_channel(cluster) log [DBG] : 4.19 scrub ok
Feb 23 07:59:09 np0005626463.localdomain systemd-rc-local-generator[57822]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 07:59:09 np0005626463.localdomain systemd-sysv-generator[57828]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 07:59:09 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 07:59:09 np0005626463.localdomain systemd[1]: Starting Create netns directory...
Feb 23 07:59:09 np0005626463.localdomain systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Feb 23 07:59:09 np0005626463.localdomain systemd[1]: netns-placeholder.service: Deactivated successfully.
Feb 23 07:59:09 np0005626463.localdomain systemd[1]: Finished Create netns directory.
Feb 23 07:59:09 np0005626463.localdomain sudo[57798]: pam_unix(sudo:session): session closed for user root
Feb 23 07:59:10 np0005626463.localdomain sudo[57855]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-abozgimzrcksnlameeebkyszhvzbfnbf ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 07:59:10 np0005626463.localdomain sudo[57855]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:59:10 np0005626463.localdomain python3[57857]: ansible-container_puppet_config Invoked with update_config_hash_only=True no_archive=True check_mode=False config_vol_prefix=/var/lib/config-data debug=False net_host=True puppet_config= short_hostname= step=6
Feb 23 07:59:10 np0005626463.localdomain sudo[57855]: pam_unix(sudo:session): session closed for user root
Feb 23 07:59:10 np0005626463.localdomain sudo[57871]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rupixxupfrvydrsptukxeszfnnocgwwp ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 07:59:10 np0005626463.localdomain sudo[57871]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:59:11 np0005626463.localdomain sudo[57871]: pam_unix(sudo:session): session closed for user root
Feb 23 07:59:11 np0005626463.localdomain sudo[57914]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gmicailehapjrmobvdwzbshssbtfphce ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 07:59:11 np0005626463.localdomain sudo[57914]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:59:11 np0005626463.localdomain python3[57916]: ansible-tripleo_container_manage Invoked with config_id=tripleo_step2 config_dir=/var/lib/tripleo-config/container-startup-config/step_2 config_patterns=*.json config_overrides={} concurrency=5 log_base_path=/var/log/containers/stdouts debug=False
Feb 23 07:59:12 np0005626463.localdomain podman[57993]: 2026-02-23 07:59:12.241465965 +0000 UTC m=+0.075517733 container create bdb13ba4b492bf8033ec5cf92c719fd10dc0a22d07c6f677643c18f7f61a227b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute_init_log, tcib_managed=true, config_data={'command': ['/bin/bash', '-c', 'chown -R nova:nova /var/log/nova'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1771832380'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'user': 'root', 'volumes': ['/var/log/containers/nova:/var/log/nova:z']}, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.13, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, container_name=nova_compute_init_log, name=rhosp-rhel9/openstack-nova-compute, config_id=tripleo_step2, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, io.openshift.expose-services=, vendor=Red Hat, Inc.)
Feb 23 07:59:12 np0005626463.localdomain podman[57994]: 2026-02-23 07:59:12.277943745 +0000 UTC m=+0.102908619 container create cad8a5ff693667746f9d919cd3db2e0eb6451f9ad1ca0a5475ab95ac31073c64 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud_init_logs, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible, config_id=tripleo_step2, org.opencontainers.image.created=2026-01-12T23:31:49Z, vcs-type=git, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.expose-services=, tcib_managed=true, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-nova-libvirt, io.buildah.version=1.41.5, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, container_name=nova_virtqemud_init_logs, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-libvirt, release=1766032510, com.redhat.component=openstack-nova-libvirt-container, config_data={'command': ['/bin/bash', '-c', 'chown -R tss:tss /var/log/swtpm'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1771832380'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'none', 'privileged': True, 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'user': 'root', 'volumes': ['/var/log/containers/libvirt/swtpm:/var/log/swtpm:shared,z']}, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:31:49Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt)
Feb 23 07:59:12 np0005626463.localdomain systemd[1]: Started libpod-conmon-bdb13ba4b492bf8033ec5cf92c719fd10dc0a22d07c6f677643c18f7f61a227b.scope.
Feb 23 07:59:12 np0005626463.localdomain podman[57993]: 2026-02-23 07:59:12.201190866 +0000 UTC m=+0.035242614 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1
Feb 23 07:59:12 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 07:59:12 np0005626463.localdomain systemd[1]: Started libpod-conmon-cad8a5ff693667746f9d919cd3db2e0eb6451f9ad1ca0a5475ab95ac31073c64.scope.
Feb 23 07:59:12 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f56bdf141506d099102e067531f1fbfb1e40a67a70799f11577fc9c27fb9f83a/merged/var/log/nova supports timestamps until 2038 (0x7fffffff)
Feb 23 07:59:12 np0005626463.localdomain podman[57994]: 2026-02-23 07:59:12.228451217 +0000 UTC m=+0.053416081 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Feb 23 07:59:12 np0005626463.localdomain podman[57993]: 2026-02-23 07:59:12.330281252 +0000 UTC m=+0.164333020 container init bdb13ba4b492bf8033ec5cf92c719fd10dc0a22d07c6f677643c18f7f61a227b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute_init_log, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.buildah.version=1.41.5, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, build-date=2026-01-12T23:32:04Z, url=https://www.redhat.com, release=1766032510, batch=17.1_20260112.1, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'command': ['/bin/bash', '-c', 'chown -R nova:nova /var/log/nova'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1771832380'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'user': 'root', 'volumes': ['/var/log/containers/nova:/var/log/nova:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., vcs-type=git, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, name=rhosp-rhel9/openstack-nova-compute, config_id=tripleo_step2, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, container_name=nova_compute_init_log)
Feb 23 07:59:12 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 07:59:12 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e7e11ab6b4147a24f37e25dab2cf55bde3a4412e647a5968367e3a7c4331cac7/merged/var/log/swtpm supports timestamps until 2038 (0x7fffffff)
Feb 23 07:59:12 np0005626463.localdomain podman[57993]: 2026-02-23 07:59:12.337014882 +0000 UTC m=+0.171066660 container start bdb13ba4b492bf8033ec5cf92c719fd10dc0a22d07c6f677643c18f7f61a227b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute_init_log, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, version=17.1.13, org.opencontainers.image.created=2026-01-12T23:32:04Z, release=1766032510, config_data={'command': ['/bin/bash', '-c', 'chown -R nova:nova /var/log/nova'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1771832380'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'user': 'root', 'volumes': ['/var/log/containers/nova:/var/log/nova:z']}, build-date=2026-01-12T23:32:04Z, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, config_id=tripleo_step2, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, container_name=nova_compute_init_log, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.openshift.expose-services=, vendor=Red Hat, Inc.)
Feb 23 07:59:12 np0005626463.localdomain systemd[1]: libpod-bdb13ba4b492bf8033ec5cf92c719fd10dc0a22d07c6f677643c18f7f61a227b.scope: Deactivated successfully.
Feb 23 07:59:12 np0005626463.localdomain python3[57916]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_compute_init_log --conmon-pidfile /run/nova_compute_init_log.pid --detach=True --env TRIPLEO_DEPLOY_IDENTIFIER=1771832380 --label config_id=tripleo_step2 --label container_name=nova_compute_init_log --label managed_by=tripleo_ansible --label config_data={'command': ['/bin/bash', '-c', 'chown -R nova:nova /var/log/nova'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1771832380'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'user': 'root', 'volumes': ['/var/log/containers/nova:/var/log/nova:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_compute_init_log.log --network none --privileged=False --user root --volume /var/log/containers/nova:/var/log/nova:z registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 /bin/bash -c chown -R nova:nova /var/log/nova
Feb 23 07:59:12 np0005626463.localdomain podman[57994]: 2026-02-23 07:59:12.346957234 +0000 UTC m=+0.171922098 container init cad8a5ff693667746f9d919cd3db2e0eb6451f9ad1ca0a5475ab95ac31073c64 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud_init_logs, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:31:49Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, maintainer=OpenStack TripleO Team, release=1766032510, config_data={'command': ['/bin/bash', '-c', 'chown -R tss:tss /var/log/swtpm'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1771832380'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'none', 'privileged': True, 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'user': 'root', 'volumes': ['/var/log/containers/libvirt/swtpm:/var/log/swtpm:shared,z']}, container_name=nova_virtqemud_init_logs, config_id=tripleo_step2, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, io.openshift.expose-services=, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, batch=17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.component=openstack-nova-libvirt-container, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2026-01-12T23:31:49Z, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-nova-libvirt, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, url=https://www.redhat.com, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-libvirt)
Feb 23 07:59:12 np0005626463.localdomain podman[57994]: 2026-02-23 07:59:12.354779907 +0000 UTC m=+0.179744771 container start cad8a5ff693667746f9d919cd3db2e0eb6451f9ad1ca0a5475ab95ac31073c64 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud_init_logs, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, url=https://www.redhat.com, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-nova-libvirt, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, org.opencontainers.image.created=2026-01-12T23:31:49Z, com.redhat.component=openstack-nova-libvirt-container, config_id=tripleo_step2, io.buildah.version=1.41.5, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, config_data={'command': ['/bin/bash', '-c', 'chown -R tss:tss /var/log/swtpm'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1771832380'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'none', 'privileged': True, 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'user': 'root', 'volumes': ['/var/log/containers/libvirt/swtpm:/var/log/swtpm:shared,z']}, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, container_name=nova_virtqemud_init_logs, build-date=2026-01-12T23:31:49Z, release=1766032510, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 23 07:59:12 np0005626463.localdomain python3[57916]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_virtqemud_init_logs --conmon-pidfile /run/nova_virtqemud_init_logs.pid --detach=True --env TRIPLEO_DEPLOY_IDENTIFIER=1771832380 --label config_id=tripleo_step2 --label container_name=nova_virtqemud_init_logs --label managed_by=tripleo_ansible --label config_data={'command': ['/bin/bash', '-c', 'chown -R tss:tss /var/log/swtpm'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1771832380'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'none', 'privileged': True, 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'user': 'root', 'volumes': ['/var/log/containers/libvirt/swtpm:/var/log/swtpm:shared,z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_virtqemud_init_logs.log --network none --privileged=True --security-opt label=level:s0 --security-opt label=type:spc_t --security-opt label=filetype:container_file_t --user root --volume /var/log/containers/libvirt/swtpm:/var/log/swtpm:shared,z registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 /bin/bash -c chown -R tss:tss /var/log/swtpm
Feb 23 07:59:12 np0005626463.localdomain systemd[1]: libpod-cad8a5ff693667746f9d919cd3db2e0eb6451f9ad1ca0a5475ab95ac31073c64.scope: Deactivated successfully.
Feb 23 07:59:12 np0005626463.localdomain podman[58044]: 2026-02-23 07:59:12.41497266 +0000 UTC m=+0.044144382 container died cad8a5ff693667746f9d919cd3db2e0eb6451f9ad1ca0a5475ab95ac31073c64 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud_init_logs, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., version=17.1.13, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:31:49Z, org.opencontainers.image.created=2026-01-12T23:31:49Z, batch=17.1_20260112.1, config_id=tripleo_step2, description=Red Hat OpenStack Platform 17.1 nova-libvirt, distribution-scope=public, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, config_data={'command': ['/bin/bash', '-c', 'chown -R tss:tss /var/log/swtpm'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1771832380'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'none', 'privileged': True, 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'user': 'root', 'volumes': ['/var/log/containers/libvirt/swtpm:/var/log/swtpm:shared,z']}, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-nova-libvirt, architecture=x86_64, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, vcs-type=git, container_name=nova_virtqemud_init_logs, com.redhat.component=openstack-nova-libvirt-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt)
Feb 23 07:59:12 np0005626463.localdomain podman[58030]: 2026-02-23 07:59:12.474961755 +0000 UTC m=+0.116297397 container died bdb13ba4b492bf8033ec5cf92c719fd10dc0a22d07c6f677643c18f7f61a227b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute_init_log, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, tcib_managed=true, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'command': ['/bin/bash', '-c', 'chown -R nova:nova /var/log/nova'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1771832380'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'user': 'root', 'volumes': ['/var/log/containers/nova:/var/log/nova:z']}, managed_by=tripleo_ansible, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, io.openshift.expose-services=, release=1766032510, container_name=nova_compute_init_log, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, build-date=2026-01-12T23:32:04Z, maintainer=OpenStack TripleO Team, version=17.1.13, config_id=tripleo_step2, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe)
Feb 23 07:59:12 np0005626463.localdomain podman[58050]: 2026-02-23 07:59:12.517138854 +0000 UTC m=+0.135297321 container cleanup cad8a5ff693667746f9d919cd3db2e0eb6451f9ad1ca0a5475ab95ac31073c64 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud_init_logs, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step2, architecture=x86_64, url=https://www.redhat.com, container_name=nova_virtqemud_init_logs, version=17.1.13, managed_by=tripleo_ansible, release=1766032510, tcib_managed=true, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T23:31:49Z, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T23:31:49Z, config_data={'command': ['/bin/bash', '-c', 'chown -R tss:tss /var/log/swtpm'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1771832380'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'none', 'privileged': True, 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'user': 'root', 'volumes': ['/var/log/containers/libvirt/swtpm:/var/log/swtpm:shared,z']}, name=rhosp-rhel9/openstack-nova-libvirt, vendor=Red Hat, Inc., io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt)
Feb 23 07:59:12 np0005626463.localdomain systemd[1]: libpod-conmon-cad8a5ff693667746f9d919cd3db2e0eb6451f9ad1ca0a5475ab95ac31073c64.scope: Deactivated successfully.
Feb 23 07:59:12 np0005626463.localdomain podman[58032]: 2026-02-23 07:59:12.648234494 +0000 UTC m=+0.291623140 container cleanup bdb13ba4b492bf8033ec5cf92c719fd10dc0a22d07c6f677643c18f7f61a227b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute_init_log, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, batch=17.1_20260112.1, url=https://www.redhat.com, config_id=tripleo_step2, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, build-date=2026-01-12T23:32:04Z, container_name=nova_compute_init_log, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, name=rhosp-rhel9/openstack-nova-compute, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T23:32:04Z, version=17.1.13, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'command': ['/bin/bash', '-c', 'chown -R nova:nova /var/log/nova'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1771832380'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'user': 'root', 'volumes': ['/var/log/containers/nova:/var/log/nova:z']}, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510)
Feb 23 07:59:12 np0005626463.localdomain systemd[1]: libpod-conmon-bdb13ba4b492bf8033ec5cf92c719fd10dc0a22d07c6f677643c18f7f61a227b.scope: Deactivated successfully.
Feb 23 07:59:12 np0005626463.localdomain podman[58178]: 2026-02-23 07:59:12.929563891 +0000 UTC m=+0.100940707 container create 9cd8a900e61995a816f43443ec7eb7bdcb3798f95e9d558c75665ba510a0aca4 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=create_virtlogd_wrapper, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-libvirt, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, tcib_managed=true, com.redhat.component=openstack-nova-libvirt-container, vendor=Red Hat, Inc., architecture=x86_64, distribution-scope=public, container_name=create_virtlogd_wrapper, build-date=2026-01-12T23:31:49Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:31:49Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step2, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, config_data={'cgroupns': 'host', 'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::nova::virtlogd_wrapper'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1771832380'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z']}, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, managed_by=tripleo_ansible, io.openshift.expose-services=, name=rhosp-rhel9/openstack-nova-libvirt)
Feb 23 07:59:12 np0005626463.localdomain podman[58179]: 2026-02-23 07:59:12.949534655 +0000 UTC m=+0.113799239 container create c1ab6b8b4cb9390f2f764277e6ce50695b194b3a53a4862001a92a1a68dbfd36 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=create_haproxy_wrapper, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:56:19Z, build-date=2026-01-12T22:56:19Z, io.buildah.version=1.41.5, config_id=tripleo_step2, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, architecture=x86_64, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, distribution-scope=public, url=https://www.redhat.com, version=17.1.13, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, release=1766032510, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-type=git, batch=17.1_20260112.1, config_data={'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z']}, container_name=create_haproxy_wrapper, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=)
Feb 23 07:59:12 np0005626463.localdomain podman[58178]: 2026-02-23 07:59:12.878299047 +0000 UTC m=+0.049675883 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Feb 23 07:59:12 np0005626463.localdomain systemd[1]: Started libpod-conmon-9cd8a900e61995a816f43443ec7eb7bdcb3798f95e9d558c75665ba510a0aca4.scope.
Feb 23 07:59:12 np0005626463.localdomain podman[58179]: 2026-02-23 07:59:12.888234268 +0000 UTC m=+0.052498852 image pull  registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1
Feb 23 07:59:12 np0005626463.localdomain systemd[1]: Started libpod-conmon-c1ab6b8b4cb9390f2f764277e6ce50695b194b3a53a4862001a92a1a68dbfd36.scope.
Feb 23 07:59:12 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 07:59:13 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 07:59:13 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c3d90fa516ab23da2bd722ed78a874566a7daf8e8d3d852895f80962cb5a1d59/merged/var/lib/container-config-scripts supports timestamps until 2038 (0x7fffffff)
Feb 23 07:59:13 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/eaf5c828e8984d86d81a6eee5a482e70c553115148192fac48b0718754776f54/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 23 07:59:13 np0005626463.localdomain podman[58178]: 2026-02-23 07:59:13.012726271 +0000 UTC m=+0.184103097 container init 9cd8a900e61995a816f43443ec7eb7bdcb3798f95e9d558c75665ba510a0aca4 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=create_virtlogd_wrapper, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T23:31:49Z, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.13, managed_by=tripleo_ansible, release=1766032510, batch=17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, config_id=tripleo_step2, vcs-type=git, com.redhat.component=openstack-nova-libvirt-container, architecture=x86_64, name=rhosp-rhel9/openstack-nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=create_virtlogd_wrapper, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:31:49Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'cgroupns': 'host', 'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::nova::virtlogd_wrapper'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1771832380'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z']}, io.openshift.expose-services=, vendor=Red Hat, Inc.)
Feb 23 07:59:13 np0005626463.localdomain podman[58179]: 2026-02-23 07:59:13.01526222 +0000 UTC m=+0.179526804 container init c1ab6b8b4cb9390f2f764277e6ce50695b194b3a53a4862001a92a1a68dbfd36 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=create_haproxy_wrapper, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, config_id=tripleo_step2, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, io.openshift.expose-services=, release=1766032510, version=17.1.13, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, config_data={'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z']}, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:56:19Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=create_haproxy_wrapper, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f)
Feb 23 07:59:13 np0005626463.localdomain podman[58178]: 2026-02-23 07:59:13.023220089 +0000 UTC m=+0.194596915 container start 9cd8a900e61995a816f43443ec7eb7bdcb3798f95e9d558c75665ba510a0aca4 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=create_virtlogd_wrapper, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, config_data={'cgroupns': 'host', 'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::nova::virtlogd_wrapper'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1771832380'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, config_id=tripleo_step2, vcs-type=git, architecture=x86_64, 
org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, url=https://www.redhat.com, name=rhosp-rhel9/openstack-nova-libvirt, vendor=Red Hat, Inc., managed_by=tripleo_ansible, build-date=2026-01-12T23:31:49Z, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, container_name=create_virtlogd_wrapper, description=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T23:31:49Z, com.redhat.component=openstack-nova-libvirt-container)
Feb 23 07:59:13 np0005626463.localdomain podman[58178]: 2026-02-23 07:59:13.023515028 +0000 UTC m=+0.194891854 container attach 9cd8a900e61995a816f43443ec7eb7bdcb3798f95e9d558c75665ba510a0aca4 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=create_virtlogd_wrapper, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.created=2026-01-12T23:31:49Z, distribution-scope=public, com.redhat.component=openstack-nova-libvirt-container, config_id=tripleo_step2, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, build-date=2026-01-12T23:31:49Z, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.buildah.version=1.41.5, container_name=create_virtlogd_wrapper, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, maintainer=OpenStack TripleO Team, architecture=x86_64, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-libvirt, batch=17.1_20260112.1, managed_by=tripleo_ansible, config_data={'cgroupns': 'host', 'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::nova::virtlogd_wrapper'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1771832380'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z']}, io.openshift.expose-services=)
Feb 23 07:59:13 np0005626463.localdomain podman[58179]: 2026-02-23 07:59:13.075012488 +0000 UTC m=+0.239277072 container start c1ab6b8b4cb9390f2f764277e6ce50695b194b3a53a4862001a92a1a68dbfd36 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=create_haproxy_wrapper, release=1766032510, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z']}, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, architecture=x86_64, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 
17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step2, version=17.1.13, container_name=create_haproxy_wrapper, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, tcib_managed=true, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.buildah.version=1.41.5, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f)
Feb 23 07:59:13 np0005626463.localdomain podman[58179]: 2026-02-23 07:59:13.075489844 +0000 UTC m=+0.239754478 container attach c1ab6b8b4cb9390f2f764277e6ce50695b194b3a53a4862001a92a1a68dbfd36 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=create_haproxy_wrapper, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=create_haproxy_wrapper, vendor=Red Hat, Inc., vcs-type=git, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:56:19Z, build-date=2026-01-12T22:56:19Z, architecture=x86_64, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, release=1766032510, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.expose-services=, distribution-scope=public, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, config_data={'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z']}, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, config_id=tripleo_step2)
Feb 23 07:59:13 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-e7e11ab6b4147a24f37e25dab2cf55bde3a4412e647a5968367e3a7c4331cac7-merged.mount: Deactivated successfully.
Feb 23 07:59:13 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-cad8a5ff693667746f9d919cd3db2e0eb6451f9ad1ca0a5475ab95ac31073c64-userdata-shm.mount: Deactivated successfully.
Feb 23 07:59:13 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-f56bdf141506d099102e067531f1fbfb1e40a67a70799f11577fc9c27fb9f83a-merged.mount: Deactivated successfully.
Feb 23 07:59:13 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-bdb13ba4b492bf8033ec5cf92c719fd10dc0a22d07c6f677643c18f7f61a227b-userdata-shm.mount: Deactivated successfully.
Feb 23 07:59:13 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 59 pg[7.9( v 36'39 (0'0,36'39] local-lis/les=43/44 n=1 ec=41/34 lis/c=43/43 les/c/f=44/44/0 sis=59 pruub=9.269790649s) [4,2,3] r=1 lpr=59 pi=[43,59)/1 luod=0'0 crt=36'39 lcod 0'0 mlcod 0'0 active pruub 1221.865112305s@ mbc={}] start_peering_interval up [0,2,4] -> [4,2,3], acting [0,2,4] -> [4,2,3], acting_primary 0 -> 4, up_primary 0 -> 4, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:59:13 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 59 pg[7.9( v 36'39 (0'0,36'39] local-lis/les=43/44 n=1 ec=41/34 lis/c=43/43 les/c/f=44/44/0 sis=59 pruub=9.269694328s) [4,2,3] r=1 lpr=59 pi=[43,59)/1 crt=36'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1221.865112305s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:59:14 np0005626463.localdomain ovs-vsctl[58282]: ovs|00001|db_ctl_base|ERR|unix:/var/run/openvswitch/db.sock: database connection failed (No such file or directory)
Feb 23 07:59:15 np0005626463.localdomain systemd[1]: libpod-9cd8a900e61995a816f43443ec7eb7bdcb3798f95e9d558c75665ba510a0aca4.scope: Deactivated successfully.
Feb 23 07:59:15 np0005626463.localdomain systemd[1]: libpod-9cd8a900e61995a816f43443ec7eb7bdcb3798f95e9d558c75665ba510a0aca4.scope: Consumed 2.189s CPU time.
Feb 23 07:59:15 np0005626463.localdomain podman[58178]: 2026-02-23 07:59:15.209158951 +0000 UTC m=+2.380535797 container died 9cd8a900e61995a816f43443ec7eb7bdcb3798f95e9d558c75665ba510a0aca4 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=create_virtlogd_wrapper, maintainer=OpenStack TripleO Team, config_id=tripleo_step2, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-libvirt, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, version=17.1.13, config_data={'cgroupns': 'host', 'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::nova::virtlogd_wrapper'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1771832380'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z']}, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:31:49Z, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., 
org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2026-01-12T23:31:49Z, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, container_name=create_virtlogd_wrapper, com.redhat.component=openstack-nova-libvirt-container, architecture=x86_64, release=1766032510, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=)
Feb 23 07:59:15 np0005626463.localdomain systemd[1]: tmp-crun.lHo5ud.mount: Deactivated successfully.
Feb 23 07:59:15 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9cd8a900e61995a816f43443ec7eb7bdcb3798f95e9d558c75665ba510a0aca4-userdata-shm.mount: Deactivated successfully.
Feb 23 07:59:15 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-c3d90fa516ab23da2bd722ed78a874566a7daf8e8d3d852895f80962cb5a1d59-merged.mount: Deactivated successfully.
Feb 23 07:59:15 np0005626463.localdomain podman[58431]: 2026-02-23 07:59:15.326521691 +0000 UTC m=+0.105323774 container cleanup 9cd8a900e61995a816f43443ec7eb7bdcb3798f95e9d558c75665ba510a0aca4 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=create_virtlogd_wrapper, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., vcs-type=git, name=rhosp-rhel9/openstack-nova-libvirt, org.opencontainers.image.created=2026-01-12T23:31:49Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, architecture=x86_64, managed_by=tripleo_ansible, container_name=create_virtlogd_wrapper, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.13, build-date=2026-01-12T23:31:49Z, description=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, batch=17.1_20260112.1, config_id=tripleo_step2, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-nova-libvirt-container, tcib_managed=true, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, config_data={'cgroupns': 'host', 'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::nova::virtlogd_wrapper'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1771832380'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z']}, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe)
Feb 23 07:59:15 np0005626463.localdomain systemd[1]: libpod-conmon-9cd8a900e61995a816f43443ec7eb7bdcb3798f95e9d558c75665ba510a0aca4.scope: Deactivated successfully.
Feb 23 07:59:15 np0005626463.localdomain python3[57916]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name create_virtlogd_wrapper --cgroupns=host --conmon-pidfile /run/create_virtlogd_wrapper.pid --detach=False --env TRIPLEO_DEPLOY_IDENTIFIER=1771832380 --label config_id=tripleo_step2 --label container_name=create_virtlogd_wrapper --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::nova::virtlogd_wrapper'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1771832380'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/create_virtlogd_wrapper.log --network host --pid host --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume 
/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 /container_puppet_apply.sh 4 file include ::tripleo::profile::base::nova::virtlogd_wrapper
Feb 23 07:59:15 np0005626463.localdomain ceph-osd[31633]: log_channel(cluster) log [DBG] : 4.3 scrub starts
Feb 23 07:59:15 np0005626463.localdomain ceph-osd[31633]: log_channel(cluster) log [DBG] : 4.3 scrub ok
Feb 23 07:59:15 np0005626463.localdomain systemd[1]: libpod-c1ab6b8b4cb9390f2f764277e6ce50695b194b3a53a4862001a92a1a68dbfd36.scope: Deactivated successfully.
Feb 23 07:59:15 np0005626463.localdomain systemd[1]: libpod-c1ab6b8b4cb9390f2f764277e6ce50695b194b3a53a4862001a92a1a68dbfd36.scope: Consumed 2.269s CPU time.
Feb 23 07:59:15 np0005626463.localdomain podman[58179]: 2026-02-23 07:59:15.954650392 +0000 UTC m=+3.118914976 container died c1ab6b8b4cb9390f2f764277e6ce50695b194b3a53a4862001a92a1a68dbfd36 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=create_haproxy_wrapper, org.opencontainers.image.created=2026-01-12T22:56:19Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, io.openshift.expose-services=, config_data={'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z']}, container_name=create_haproxy_wrapper, managed_by=tripleo_ansible, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, architecture=x86_64, tcib_managed=true, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, distribution-scope=public, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_id=tripleo_step2, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, build-date=2026-01-12T22:56:19Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc.)
Feb 23 07:59:16 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 61 pg[7.a( v 36'39 (0'0,36'39] local-lis/les=45/46 n=1 ec=41/34 lis/c=45/45 les/c/f=46/46/0 sis=61 pruub=9.284608841s) [2,4,3] r=-1 lpr=61 pi=[45,61)/1 luod=0'0 crt=36'39 lcod 0'0 mlcod 0'0 active pruub 1219.043090820s@ mbc={}] start_peering_interval up [4,5,0] -> [2,4,3], acting [4,5,0] -> [2,4,3], acting_primary 4 -> 2, up_primary 4 -> 2, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:59:16 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 61 pg[7.a( v 36'39 (0'0,36'39] local-lis/les=45/46 n=1 ec=41/34 lis/c=45/45 les/c/f=46/46/0 sis=61 pruub=9.284519196s) [2,4,3] r=-1 lpr=61 pi=[45,61)/1 crt=36'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1219.043090820s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:59:16 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 61 pg[7.a( empty local-lis/les=0/0 n=0 ec=41/34 lis/c=45/45 les/c/f=46/46/0 sis=61) [2,4,3] r=0 lpr=61 pi=[45,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 23 07:59:16 np0005626463.localdomain podman[58471]: 2026-02-23 07:59:16.062501905 +0000 UTC m=+0.092565356 container cleanup c1ab6b8b4cb9390f2f764277e6ce50695b194b3a53a4862001a92a1a68dbfd36 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=create_haproxy_wrapper, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, distribution-scope=public, release=1766032510, architecture=x86_64, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:56:19Z, version=17.1.13, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, managed_by=tripleo_ansible, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., build-date=2026-01-12T22:56:19Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z']}, container_name=create_haproxy_wrapper, config_id=tripleo_step2, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true)
Feb 23 07:59:16 np0005626463.localdomain systemd[1]: libpod-conmon-c1ab6b8b4cb9390f2f764277e6ce50695b194b3a53a4862001a92a1a68dbfd36.scope: Deactivated successfully.
Feb 23 07:59:16 np0005626463.localdomain python3[57916]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name create_haproxy_wrapper --conmon-pidfile /run/create_haproxy_wrapper.pid --detach=False --label config_id=tripleo_step2 --label container_name=create_haproxy_wrapper --label managed_by=tripleo_ansible --label config_data={'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/create_haproxy_wrapper.log --network host --pid host --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume 
/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /run/openvswitch:/run/openvswitch:shared,z --volume /var/lib/neutron:/var/lib/neutron:shared,z registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1 /container_puppet_apply.sh 4 file include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers
Feb 23 07:59:16 np0005626463.localdomain sudo[57914]: pam_unix(sudo:session): session closed for user root
Feb 23 07:59:16 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-eaf5c828e8984d86d81a6eee5a482e70c553115148192fac48b0718754776f54-merged.mount: Deactivated successfully.
Feb 23 07:59:16 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c1ab6b8b4cb9390f2f764277e6ce50695b194b3a53a4862001a92a1a68dbfd36-userdata-shm.mount: Deactivated successfully.
Feb 23 07:59:16 np0005626463.localdomain ceph-osd[31633]: log_channel(cluster) log [DBG] : 6.1 scrub starts
Feb 23 07:59:16 np0005626463.localdomain ceph-osd[31633]: log_channel(cluster) log [DBG] : 6.1 scrub ok
Feb 23 07:59:16 np0005626463.localdomain sudo[58525]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cmiuxvvxlfzgaydmuwiputsgmclbzxai ; /usr/bin/python3
Feb 23 07:59:16 np0005626463.localdomain sudo[58525]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:59:16 np0005626463.localdomain python3[58527]: ansible-file Invoked with path=/var/lib/container-puppet/container-puppet-tasks2.json state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 07:59:16 np0005626463.localdomain sudo[58525]: pam_unix(sudo:session): session closed for user root
Feb 23 07:59:17 np0005626463.localdomain sudo[58573]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uphtviqyxtwygrjtrvplfteqlrcdtkxa ; /usr/bin/python3
Feb 23 07:59:17 np0005626463.localdomain sudo[58573]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:59:17 np0005626463.localdomain sudo[58573]: pam_unix(sudo:session): session closed for user root
Feb 23 07:59:17 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 62 pg[7.a( v 36'39 (0'0,36'39] local-lis/les=61/62 n=1 ec=41/34 lis/c=45/45 les/c/f=46/46/0 sis=61) [2,4,3] r=0 lpr=61 pi=[45,61)/1 crt=36'39 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 23 07:59:17 np0005626463.localdomain sudo[58616]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wyrvchkoumdfmjzfxnbdkoallrcxivcm ; /usr/bin/python3
Feb 23 07:59:17 np0005626463.localdomain sudo[58616]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:59:17 np0005626463.localdomain sudo[58616]: pam_unix(sudo:session): session closed for user root
Feb 23 07:59:18 np0005626463.localdomain sudo[58646]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tpprjmxpdwfjwqcwfptdmyqzspktqvsi ; /usr/bin/python3
Feb 23 07:59:18 np0005626463.localdomain sudo[58646]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:59:18 np0005626463.localdomain python3[58648]: ansible-container_puppet_config Invoked with check_mode=False config_vol_prefix=/var/lib/config-data debug=True net_host=True no_archive=True puppet_config=/var/lib/container-puppet/container-puppet-tasks2.json short_hostname=np0005626463 step=2 update_config_hash_only=False
Feb 23 07:59:18 np0005626463.localdomain sudo[58646]: pam_unix(sudo:session): session closed for user root
Feb 23 07:59:18 np0005626463.localdomain sudo[58662]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-paxaemkzkwjvheghzkzltrwxecpwwxcy ; /usr/bin/python3
Feb 23 07:59:18 np0005626463.localdomain sudo[58662]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:59:18 np0005626463.localdomain python3[58664]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 07:59:18 np0005626463.localdomain sudo[58662]: pam_unix(sudo:session): session closed for user root
Feb 23 07:59:19 np0005626463.localdomain sudo[58678]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fgjwjqdzwggoxxmjnflxstxlpmwzarmb ; /usr/bin/python3
Feb 23 07:59:19 np0005626463.localdomain sudo[58678]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 07:59:19 np0005626463.localdomain python3[58680]: ansible-container_config_data Invoked with config_path=/var/lib/tripleo-config/container-puppet-config/step_2 config_pattern=container-puppet-*.json config_overrides={} debug=True
Feb 23 07:59:19 np0005626463.localdomain sudo[58678]: pam_unix(sudo:session): session closed for user root
Feb 23 07:59:23 np0005626463.localdomain ceph-osd[31633]: log_channel(cluster) log [DBG] : 4.1c deep-scrub starts
Feb 23 07:59:23 np0005626463.localdomain ceph-osd[31633]: log_channel(cluster) log [DBG] : 4.1c deep-scrub ok
Feb 23 07:59:24 np0005626463.localdomain ceph-osd[31633]: osd.2 64 crush map has features 432629239337189376, adjusting msgr requires for clients
Feb 23 07:59:24 np0005626463.localdomain ceph-osd[31633]: osd.2 64 crush map has features 432629239337189376 was 288514051259245057, adjusting msgr requires for mons
Feb 23 07:59:24 np0005626463.localdomain ceph-osd[31633]: osd.2 64 crush map has features 3314933000854323200, adjusting msgr requires for osds
Feb 23 07:59:24 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 64 pg[4.1( empty local-lis/les=43/44 n=0 ec=39/24 lis/c=43/43 les/c/f=44/44/0 sis=64 pruub=15.048739433s) [2,0,4] r=0 lpr=64 pi=[43,64)/1 crt=0'0 mlcod 0'0 active pruub 1237.859008789s@ mbc={}] start_peering_interval up [2,0,1] -> [2,0,4], acting [2,0,1] -> [2,0,4], acting_primary 2 -> 2, up_primary 2 -> 2, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:59:24 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 64 pg[4.1( empty local-lis/les=43/44 n=0 ec=39/24 lis/c=43/43 les/c/f=44/44/0 sis=64 pruub=15.048739433s) [2,0,4] r=0 lpr=64 pi=[43,64)/1 crt=0'0 mlcod 0'0 unknown pruub 1237.859008789s@ mbc={}] state<Start>: transitioning to Primary
Feb 23 07:59:24 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 64 pg[7.c( v 36'39 (0'0,36'39] local-lis/les=49/50 n=1 ec=41/34 lis/c=49/49 les/c/f=50/50/0 sis=64 pruub=10.373780251s) [0,1,2] r=2 lpr=64 pi=[49,64)/1 luod=0'0 crt=36'39 mlcod 0'0 active pruub 1233.184448242s@ mbc={}] start_peering_interval up [1,3,2] -> [0,1,2], acting [1,3,2] -> [0,1,2], acting_primary 1 -> 0, up_primary 1 -> 0, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:59:24 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 64 pg[7.c( v 36'39 (0'0,36'39] local-lis/les=49/50 n=1 ec=41/34 lis/c=49/49 les/c/f=50/50/0 sis=64 pruub=10.373656273s) [0,1,2] r=2 lpr=64 pi=[49,64)/1 crt=36'39 mlcod 0'0 unknown NOTIFY pruub 1233.184448242s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:59:24 np0005626463.localdomain ceph-osd[32575]: osd.5 64 crush map has features 432629239337189376, adjusting msgr requires for clients
Feb 23 07:59:24 np0005626463.localdomain ceph-osd[32575]: osd.5 64 crush map has features 432629239337189376 was 288514051259245057, adjusting msgr requires for mons
Feb 23 07:59:24 np0005626463.localdomain ceph-osd[32575]: osd.5 64 crush map has features 3314933000854323200, adjusting msgr requires for osds
Feb 23 07:59:24 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 64 pg[4.1b( empty local-lis/les=43/44 n=0 ec=39/24 lis/c=43/43 les/c/f=44/44/0 sis=64 pruub=15.022865295s) [3,4,5] r=2 lpr=64 pi=[43,64)/1 crt=0'0 mlcod 0'0 active pruub 1232.983154297s@ mbc={}] start_peering_interval up [3,1,5] -> [3,4,5], acting [3,1,5] -> [3,4,5], acting_primary 3 -> 3, up_primary 3 -> 3, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:59:24 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 64 pg[4.1b( empty local-lis/les=43/44 n=0 ec=39/24 lis/c=43/43 les/c/f=44/44/0 sis=64 pruub=15.022788048s) [3,4,5] r=2 lpr=64 pi=[43,64)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1232.983154297s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:59:24 np0005626463.localdomain ceph-osd[31633]: log_channel(cluster) log [DBG] : 7.d scrub starts
Feb 23 07:59:24 np0005626463.localdomain ceph-osd[31633]: log_channel(cluster) log [DBG] : 7.d scrub ok
Feb 23 07:59:25 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 65 pg[4.1( empty local-lis/les=64/65 n=0 ec=39/24 lis/c=43/43 les/c/f=44/44/0 sis=64) [2,0,4] r=0 lpr=64 pi=[43,64)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 23 07:59:26 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 66 pg[7.d( v 36'39 (0'0,36'39] local-lis/les=51/52 n=2 ec=41/34 lis/c=51/51 les/c/f=52/52/0 sis=66 pruub=8.429811478s) [3,4,5] r=-1 lpr=66 pi=[51,66)/1 crt=36'39 mlcod 0'0 active pruub 1233.304321289s@ mbc={}] start_peering_interval up [2,4,0] -> [3,4,5], acting [2,4,0] -> [3,4,5], acting_primary 2 -> 3, up_primary 2 -> 3, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:59:26 np0005626463.localdomain ceph-osd[31633]: osd.2 pg_epoch: 66 pg[7.d( v 36'39 (0'0,36'39] local-lis/les=51/52 n=2 ec=41/34 lis/c=51/51 les/c/f=52/52/0 sis=66 pruub=8.429599762s) [3,4,5] r=-1 lpr=66 pi=[51,66)/1 crt=36'39 mlcod 0'0 unknown NOTIFY pruub 1233.304321289s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:59:27 np0005626463.localdomain sshd[58681]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 07:59:27 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.
Feb 23 07:59:27 np0005626463.localdomain podman[58682]: 2026-02-23 07:59:27.92364386 +0000 UTC m=+0.090258253 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, build-date=2026-01-12T22:10:14Z, distribution-scope=public, name=rhosp-rhel9/openstack-qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step1, vendor=Red Hat, Inc., architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=metrics_qdr, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, io.openshift.expose-services=)
Feb 23 07:59:28 np0005626463.localdomain podman[58682]: 2026-02-23 07:59:28.141993688 +0000 UTC m=+0.308608051 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vendor=Red Hat, Inc., distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:14Z, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, name=rhosp-rhel9/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=metrics_qdr, io.openshift.expose-services=, release=1766032510, io.buildah.version=1.41.5, config_id=tripleo_step1, build-date=2026-01-12T22:10:14Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, managed_by=tripleo_ansible)
Feb 23 07:59:28 np0005626463.localdomain systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully.
Feb 23 07:59:28 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 66 pg[7.d( empty local-lis/les=0/0 n=0 ec=41/34 lis/c=51/51 les/c/f=52/52/0 sis=66) [3,4,5] r=2 lpr=66 pi=[51,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 23 07:59:28 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 68 pg[7.e( v 36'39 (0'0,36'39] local-lis/les=53/54 n=1 ec=41/34 lis/c=53/53 les/c/f=54/54/0 sis=68 pruub=8.680847168s) [1,5,3] r=1 lpr=68 pi=[53,68)/1 luod=0'0 crt=36'39 lcod 0'0 mlcod 0'0 active pruub 1230.706176758s@ mbc={}] start_peering_interval up [1,0,5] -> [1,5,3], acting [1,0,5] -> [1,5,3], acting_primary 1 -> 1, up_primary 1 -> 1, role 2 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:59:28 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 68 pg[7.e( v 36'39 (0'0,36'39] local-lis/les=53/54 n=1 ec=41/34 lis/c=53/53 les/c/f=54/54/0 sis=68 pruub=8.680780411s) [1,5,3] r=1 lpr=68 pi=[53,68)/1 crt=36'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1230.706176758s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:59:28 np0005626463.localdomain ceph-osd[31633]: log_channel(cluster) log [DBG] : 7.5 deep-scrub starts
Feb 23 07:59:28 np0005626463.localdomain ceph-osd[31633]: log_channel(cluster) log [DBG] : 7.5 deep-scrub ok
Feb 23 07:59:29 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 69 pg[7.f( v 36'39 (0'0,36'39] local-lis/les=55/56 n=1 ec=41/34 lis/c=55/55 les/c/f=56/56/0 sis=69 pruub=15.563361168s) [1,5,3] r=1 lpr=69 pi=[55,69)/1 luod=0'0 crt=36'39 mlcod 0'0 active pruub 1238.626098633s@ mbc={}] start_peering_interval up [3,5,1] -> [1,5,3], acting [3,5,1] -> [1,5,3], acting_primary 3 -> 1, up_primary 3 -> 1, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 07:59:29 np0005626463.localdomain ceph-osd[32575]: osd.5 pg_epoch: 69 pg[7.f( v 36'39 (0'0,36'39] local-lis/les=55/56 n=1 ec=41/34 lis/c=55/55 les/c/f=56/56/0 sis=69 pruub=15.563264847s) [1,5,3] r=1 lpr=69 pi=[55,69)/1 crt=36'39 mlcod 0'0 unknown NOTIFY pruub 1238.626098633s@ mbc={}] state<Start>: transitioning to Stray
Feb 23 07:59:29 np0005626463.localdomain sshd[58681]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 07:59:30 np0005626463.localdomain ceph-osd[31633]: log_channel(cluster) log [DBG] : 7.a scrub starts
Feb 23 07:59:30 np0005626463.localdomain ceph-osd[31633]: log_channel(cluster) log [DBG] : 7.a scrub ok
Feb 23 07:59:33 np0005626463.localdomain ceph-osd[31633]: log_channel(cluster) log [DBG] : 5.8 scrub starts
Feb 23 07:59:33 np0005626463.localdomain ceph-osd[31633]: log_channel(cluster) log [DBG] : 5.8 scrub ok
Feb 23 07:59:37 np0005626463.localdomain ceph-osd[31633]: log_channel(cluster) log [DBG] : 5.e scrub starts
Feb 23 07:59:37 np0005626463.localdomain ceph-osd[31633]: log_channel(cluster) log [DBG] : 5.e scrub ok
Feb 23 07:59:45 np0005626463.localdomain sshd[58711]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 07:59:46 np0005626463.localdomain sshd[58711]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 07:59:55 np0005626463.localdomain sudo[58713]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 07:59:55 np0005626463.localdomain sudo[58713]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 07:59:55 np0005626463.localdomain sudo[58713]: pam_unix(sudo:session): session closed for user root
Feb 23 07:59:56 np0005626463.localdomain sudo[58728]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 23 07:59:56 np0005626463.localdomain sudo[58728]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 07:59:56 np0005626463.localdomain sudo[58728]: pam_unix(sudo:session): session closed for user root
Feb 23 07:59:57 np0005626463.localdomain sudo[58774]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 07:59:57 np0005626463.localdomain sudo[58774]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 07:59:57 np0005626463.localdomain sudo[58774]: pam_unix(sudo:session): session closed for user root
Feb 23 07:59:58 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.
Feb 23 07:59:58 np0005626463.localdomain systemd[1]: tmp-crun.rILy3a.mount: Deactivated successfully.
Feb 23 07:59:58 np0005626463.localdomain podman[58789]: 2026-02-23 07:59:58.914831452 +0000 UTC m=+0.091897135 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, distribution-scope=public, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, managed_by=tripleo_ansible, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, url=https://www.redhat.com, io.buildah.version=1.41.5, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, tcib_managed=true, io.openshift.expose-services=, build-date=2026-01-12T22:10:14Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee)
Feb 23 07:59:59 np0005626463.localdomain podman[58789]: 2026-02-23 07:59:59.113222855 +0000 UTC m=+0.290288518 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-qdrouterd, release=1766032510, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, build-date=2026-01-12T22:10:14Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, batch=17.1_20260112.1, io.openshift.expose-services=, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, url=https://www.redhat.com, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step1, container_name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public)
Feb 23 07:59:59 np0005626463.localdomain systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully.
Feb 23 08:00:24 np0005626463.localdomain sshd[58819]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:00:27 np0005626463.localdomain sshd[58821]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:00:27 np0005626463.localdomain sshd[58821]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:00:29 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.
Feb 23 08:00:29 np0005626463.localdomain systemd[1]: tmp-crun.z28wnv.mount: Deactivated successfully.
Feb 23 08:00:29 np0005626463.localdomain podman[58823]: 2026-02-23 08:00:29.922586569 +0000 UTC m=+0.093597371 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, version=17.1.13, build-date=2026-01-12T22:10:14Z, container_name=metrics_qdr, maintainer=OpenStack TripleO Team, config_id=tripleo_step1, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-qdrouterd, tcib_managed=true, batch=17.1_20260112.1, architecture=x86_64, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-qdrouterd-container, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']})
Feb 23 08:00:30 np0005626463.localdomain podman[58823]: 2026-02-23 08:00:30.113389302 +0000 UTC m=+0.284400104 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, url=https://www.redhat.com, architecture=x86_64, vendor=Red Hat, Inc., config_id=tripleo_step1, io.buildah.version=1.41.5, managed_by=tripleo_ansible, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 23 08:00:30 np0005626463.localdomain systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully.
Feb 23 08:00:32 np0005626463.localdomain sshd[58819]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:00:57 np0005626463.localdomain sudo[58854]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 08:00:57 np0005626463.localdomain sudo[58854]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:00:57 np0005626463.localdomain sudo[58854]: pam_unix(sudo:session): session closed for user root
Feb 23 08:00:57 np0005626463.localdomain sudo[58869]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Feb 23 08:00:57 np0005626463.localdomain sudo[58869]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:00:58 np0005626463.localdomain podman[58953]: 2026-02-23 08:00:58.422364951 +0000 UTC m=+0.106662709 container exec fdf07215f0388d0ebc44f1f3744080ba594441e647c300d0dade62ff5beba234 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-crash-np0005626463, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, GIT_CLEAN=True, com.redhat.component=rhceph-container, distribution-scope=public, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2026-02-09T10:25:24Z, vendor=Red Hat, Inc., version=7, build-date=2026-02-09T10:25:24Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, release=1770267347, ceph=True, io.buildah.version=1.42.2, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph, RELEASE=main)
Feb 23 08:00:58 np0005626463.localdomain podman[58953]: 2026-02-23 08:00:58.528400973 +0000 UTC m=+0.212698731 container exec_died fdf07215f0388d0ebc44f1f3744080ba594441e647c300d0dade62ff5beba234 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-crash-np0005626463, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, distribution-scope=public, version=7, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, io.buildah.version=1.42.2, RELEASE=main, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, architecture=x86_64, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, release=1770267347, build-date=2026-02-09T10:25:24Z, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container)
Feb 23 08:00:58 np0005626463.localdomain sudo[58869]: pam_unix(sudo:session): session closed for user root
Feb 23 08:00:58 np0005626463.localdomain sudo[59017]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 08:00:58 np0005626463.localdomain sudo[59017]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:00:58 np0005626463.localdomain sudo[59017]: pam_unix(sudo:session): session closed for user root
Feb 23 08:00:59 np0005626463.localdomain sudo[59032]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 23 08:00:59 np0005626463.localdomain sudo[59032]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:00:59 np0005626463.localdomain sudo[59032]: pam_unix(sudo:session): session closed for user root
Feb 23 08:01:00 np0005626463.localdomain sudo[59080]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 08:01:00 np0005626463.localdomain sudo[59080]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:01:00 np0005626463.localdomain sudo[59080]: pam_unix(sudo:session): session closed for user root
Feb 23 08:01:00 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.
Feb 23 08:01:00 np0005626463.localdomain podman[59095]: 2026-02-23 08:01:00.920347129 +0000 UTC m=+0.092793200 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, build-date=2026-01-12T22:10:14Z, url=https://www.redhat.com, config_id=tripleo_step1, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, io.k8s.description=Red Hat 
OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, container_name=metrics_qdr, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:10:14Z, version=17.1.13, release=1766032510, batch=17.1_20260112.1, com.redhat.component=openstack-qdrouterd-container, name=rhosp-rhel9/openstack-qdrouterd, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vendor=Red Hat, Inc., io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 23 08:01:01 np0005626463.localdomain podman[59095]: 2026-02-23 08:01:01.112687634 +0000 UTC m=+0.285133665 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step1, org.opencontainers.image.created=2026-01-12T22:10:14Z, batch=17.1_20260112.1, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, release=1766032510, 
container_name=metrics_qdr, build-date=2026-01-12T22:10:14Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, tcib_managed=true, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, managed_by=tripleo_ansible, architecture=x86_64, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, name=rhosp-rhel9/openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd)
Feb 23 08:01:01 np0005626463.localdomain systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully.
Feb 23 08:01:01 np0005626463.localdomain CROND[59124]: (root) CMD (run-parts /etc/cron.hourly)
Feb 23 08:01:01 np0005626463.localdomain run-parts[59127]: (/etc/cron.hourly) starting 0anacron
Feb 23 08:01:01 np0005626463.localdomain run-parts[59133]: (/etc/cron.hourly) finished 0anacron
Feb 23 08:01:01 np0005626463.localdomain CROND[59123]: (root) CMDEND (run-parts /etc/cron.hourly)
Feb 23 08:01:08 np0005626463.localdomain sshd[59134]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:01:09 np0005626463.localdomain sshd[59134]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:01:21 np0005626463.localdomain sshd[59136]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:01:22 np0005626463.localdomain sshd[59136]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:01:31 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.
Feb 23 08:01:31 np0005626463.localdomain podman[59138]: 2026-02-23 08:01:31.914124359 +0000 UTC m=+0.084158226 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, build-date=2026-01-12T22:10:14Z, version=17.1.13, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:10:14Z, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, vendor=Red Hat, Inc., url=https://www.redhat.com)
Feb 23 08:01:32 np0005626463.localdomain podman[59138]: 2026-02-23 08:01:32.105110502 +0000 UTC m=+0.275144769 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:10:14Z, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, batch=17.1_20260112.1, url=https://www.redhat.com, tcib_managed=true, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:14Z, architecture=x86_64, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, config_id=tripleo_step1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=metrics_qdr, managed_by=tripleo_ansible, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 23 08:01:32 np0005626463.localdomain systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully.
Feb 23 08:01:50 np0005626463.localdomain sshd[59167]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:01:51 np0005626463.localdomain sshd[59167]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:02:00 np0005626463.localdomain sudo[59169]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 08:02:00 np0005626463.localdomain sudo[59169]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:02:00 np0005626463.localdomain sudo[59169]: pam_unix(sudo:session): session closed for user root
Feb 23 08:02:00 np0005626463.localdomain sudo[59184]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 23 08:02:00 np0005626463.localdomain sudo[59184]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:02:01 np0005626463.localdomain sudo[59184]: pam_unix(sudo:session): session closed for user root
Feb 23 08:02:01 np0005626463.localdomain sudo[59231]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 08:02:01 np0005626463.localdomain sudo[59231]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:02:01 np0005626463.localdomain sudo[59231]: pam_unix(sudo:session): session closed for user root
Feb 23 08:02:02 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.
Feb 23 08:02:02 np0005626463.localdomain systemd[1]: tmp-crun.Nz7jYf.mount: Deactivated successfully.
Feb 23 08:02:02 np0005626463.localdomain podman[59246]: 2026-02-23 08:02:02.919152813 +0000 UTC m=+0.093362617 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, distribution-scope=public, managed_by=tripleo_ansible, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.buildah.version=1.41.5, com.redhat.component=openstack-qdrouterd-container, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, vcs-type=git, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, vendor=Red Hat, Inc., container_name=metrics_qdr, org.opencontainers.image.created=2026-01-12T22:10:14Z, url=https://www.redhat.com)
Feb 23 08:02:03 np0005626463.localdomain podman[59246]: 2026-02-23 08:02:03.163258592 +0000 UTC m=+0.337468366 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, name=rhosp-rhel9/openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, url=https://www.redhat.com, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, 
cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., build-date=2026-01-12T22:10:14Z, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, config_id=tripleo_step1, container_name=metrics_qdr, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd)
Feb 23 08:02:03 np0005626463.localdomain systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully.
Feb 23 08:02:14 np0005626463.localdomain sshd[59276]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:02:16 np0005626463.localdomain sshd[59276]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:02:33 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.
Feb 23 08:02:33 np0005626463.localdomain podman[59278]: 2026-02-23 08:02:33.904899982 +0000 UTC m=+0.080425265 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:14Z, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, architecture=x86_64, tcib_managed=true, url=https://www.redhat.com, io.openshift.expose-services=, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, name=rhosp-rhel9/openstack-qdrouterd, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd)
Feb 23 08:02:34 np0005626463.localdomain podman[59278]: 2026-02-23 08:02:34.095455404 +0000 UTC m=+0.270980677 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, url=https://www.redhat.com, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, container_name=metrics_qdr, org.opencontainers.image.created=2026-01-12T22:10:14Z, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, architecture=x86_64, build-date=2026-01-12T22:10:14Z)
Feb 23 08:02:34 np0005626463.localdomain systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully.
Feb 23 08:02:34 np0005626463.localdomain sshd[59307]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:02:34 np0005626463.localdomain sshd[59307]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:03:01 np0005626463.localdomain sudo[59309]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 08:03:01 np0005626463.localdomain sudo[59309]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:03:01 np0005626463.localdomain sudo[59309]: pam_unix(sudo:session): session closed for user root
Feb 23 08:03:01 np0005626463.localdomain sudo[59324]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 23 08:03:01 np0005626463.localdomain sudo[59324]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:03:02 np0005626463.localdomain sudo[59324]: pam_unix(sudo:session): session closed for user root
Feb 23 08:03:03 np0005626463.localdomain sudo[59371]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 08:03:03 np0005626463.localdomain sudo[59371]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:03:03 np0005626463.localdomain sudo[59371]: pam_unix(sudo:session): session closed for user root
Feb 23 08:03:04 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.
Feb 23 08:03:04 np0005626463.localdomain systemd[1]: tmp-crun.3iAM2K.mount: Deactivated successfully.
Feb 23 08:03:04 np0005626463.localdomain podman[59386]: 2026-02-23 08:03:04.908657731 +0000 UTC m=+0.080516637 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, vcs-type=git, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vendor=Red Hat, Inc., io.openshift.expose-services=, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, url=https://www.redhat.com, distribution-scope=public, build-date=2026-01-12T22:10:14Z, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible, container_name=metrics_qdr, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:14Z, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 23 08:03:05 np0005626463.localdomain podman[59386]: 2026-02-23 08:03:05.097354915 +0000 UTC m=+0.269213811 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.41.5, com.redhat.component=openstack-qdrouterd-container, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, 
managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vendor=Red Hat, Inc., config_id=tripleo_step1, version=17.1.13, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:14Z, container_name=metrics_qdr, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:10:14Z, description=Red Hat OpenStack Platform 17.1 qdrouterd)
Feb 23 08:03:05 np0005626463.localdomain systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully.
Feb 23 08:03:09 np0005626463.localdomain sshd[59415]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:03:13 np0005626463.localdomain sshd[59415]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:03:17 np0005626463.localdomain sshd[59417]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:03:17 np0005626463.localdomain sshd[59417]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:03:28 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:19:01:95 MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.104 DST=192.168.122.106 LEN=40 TOS=0x00 PREC=0x00 TTL=64 ID=0 DF PROTO=TCP SPT=5668 DPT=49960 SEQ=0 ACK=3124770151 WINDOW=0 RES=0x00 ACK RST URGP=0 
Feb 23 08:03:35 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.
Feb 23 08:03:35 np0005626463.localdomain systemd[1]: tmp-crun.yanLUg.mount: Deactivated successfully.
Feb 23 08:03:35 np0005626463.localdomain podman[59419]: 2026-02-23 08:03:35.909360466 +0000 UTC m=+0.084747059 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.component=openstack-qdrouterd-container, batch=17.1_20260112.1, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, org.opencontainers.image.created=2026-01-12T22:10:14Z, config_id=tripleo_step1, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-qdrouterd, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.41.5, vendor=Red Hat, Inc., vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, url=https://www.redhat.com, tcib_managed=true)
Feb 23 08:03:36 np0005626463.localdomain podman[59419]: 2026-02-23 08:03:36.104061848 +0000 UTC m=+0.279448461 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, 
org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., io.openshift.expose-services=, release=1766032510, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.5, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:14Z, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, tcib_managed=true, container_name=metrics_qdr, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 qdrouterd)
Feb 23 08:03:36 np0005626463.localdomain systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully.
Feb 23 08:03:55 np0005626463.localdomain sshd[59448]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:03:56 np0005626463.localdomain sudo[59495]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-phwkdvpyqcqjovyjvfqmyeypydnzmrzv ; /usr/bin/python3
Feb 23 08:03:56 np0005626463.localdomain sudo[59495]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:03:56 np0005626463.localdomain python3[59497]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/config_step.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 08:03:56 np0005626463.localdomain sudo[59495]: pam_unix(sudo:session): session closed for user root
Feb 23 08:03:57 np0005626463.localdomain sudo[59540]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-easgyevohuirccgkqnwggnmmmkwuvmkt ; /usr/bin/python3
Feb 23 08:03:57 np0005626463.localdomain sudo[59540]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:03:57 np0005626463.localdomain sshd[59448]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:03:57 np0005626463.localdomain python3[59542]: ansible-ansible.legacy.copy Invoked with dest=/etc/puppet/hieradata/config_step.json force=True mode=0600 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771833836.58139-99598-278474677941260/source _original_basename=tmp9owi_7bf follow=False checksum=62439dd24dde40c90e7a39f6a1b31cc6061fe59b backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 08:03:57 np0005626463.localdomain sudo[59540]: pam_unix(sudo:session): session closed for user root
Feb 23 08:03:58 np0005626463.localdomain sudo[59570]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oqeuhuylwrxckqlkpasvczihxwighdtk ; /usr/bin/python3
Feb 23 08:03:58 np0005626463.localdomain sudo[59570]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:03:58 np0005626463.localdomain python3[59572]: ansible-stat Invoked with path=/var/lib/tripleo-config/container-startup-config/step_3 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 23 08:03:58 np0005626463.localdomain sudo[59570]: pam_unix(sudo:session): session closed for user root
Feb 23 08:03:59 np0005626463.localdomain sudo[59620]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vcdhyxcsfclmyrmalponscuxjerrbjoq ; /usr/bin/python3
Feb 23 08:03:59 np0005626463.localdomain sudo[59620]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:03:59 np0005626463.localdomain sshd[59623]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:03:59 np0005626463.localdomain sudo[59620]: pam_unix(sudo:session): session closed for user root
Feb 23 08:03:59 np0005626463.localdomain sudo[59640]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-deyhmqgwcehxlznnwxpxuqymredsjnmg ; /usr/bin/python3
Feb 23 08:03:59 np0005626463.localdomain sudo[59640]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:03:59 np0005626463.localdomain sudo[59640]: pam_unix(sudo:session): session closed for user root
Feb 23 08:03:59 np0005626463.localdomain sshd[59623]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:04:00 np0005626463.localdomain sudo[59744]: tripleo-admin : TTY=pts/0 ; PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-prlshdpkbvzoglbhlhesgctrxleguuhn ; ANSIBLE_ASYNC_DIR=/tmp/.ansible_async /usr/bin/python3 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1771833839.8767023-99778-228112044019917/async_wrapper.py 839121906300 3600 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1771833839.8767023-99778-228112044019917/AnsiballZ_command.py _
Feb 23 08:04:00 np0005626463.localdomain sudo[59744]: pam_unix(sudo:session): session opened for user root(uid=0) by tripleo-admin(uid=1003)
Feb 23 08:04:00 np0005626463.localdomain ansible-async_wrapper.py[59746]: Invoked with 839121906300 3600 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1771833839.8767023-99778-228112044019917/AnsiballZ_command.py _
Feb 23 08:04:00 np0005626463.localdomain ansible-async_wrapper.py[59749]: Starting module and watcher
Feb 23 08:04:00 np0005626463.localdomain ansible-async_wrapper.py[59749]: Start watching 59750 (3600)
Feb 23 08:04:00 np0005626463.localdomain ansible-async_wrapper.py[59750]: Start module (59750)
Feb 23 08:04:00 np0005626463.localdomain ansible-async_wrapper.py[59746]: Return async_wrapper task started.
Feb 23 08:04:00 np0005626463.localdomain sudo[59744]: pam_unix(sudo:session): session closed for user root
Feb 23 08:04:00 np0005626463.localdomain sudo[59768]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-evogrkubxerfcqaqwpakutphsnjlfnfh ; /usr/bin/python3
Feb 23 08:04:00 np0005626463.localdomain sudo[59768]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:04:00 np0005626463.localdomain python3[59770]: ansible-ansible.legacy.async_status Invoked with jid=839121906300.59746 mode=status _async_dir=/tmp/.ansible_async
Feb 23 08:04:00 np0005626463.localdomain sudo[59768]: pam_unix(sudo:session): session closed for user root
Feb 23 08:04:03 np0005626463.localdomain sudo[59881]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 08:04:03 np0005626463.localdomain sudo[59881]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:04:03 np0005626463.localdomain sudo[59881]: pam_unix(sudo:session): session closed for user root
Feb 23 08:04:03 np0005626463.localdomain sudo[59896]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 23 08:04:03 np0005626463.localdomain sudo[59896]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:04:03 np0005626463.localdomain sudo[59896]: pam_unix(sudo:session): session closed for user root
Feb 23 08:04:03 np0005626463.localdomain puppet-user[59754]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5
Feb 23 08:04:03 np0005626463.localdomain puppet-user[59754]:    (file: /etc/puppet/hiera.yaml)
Feb 23 08:04:03 np0005626463.localdomain puppet-user[59754]: Warning: Undefined variable '::deploy_config_name';
Feb 23 08:04:03 np0005626463.localdomain puppet-user[59754]:    (file & line not available)
Feb 23 08:04:04 np0005626463.localdomain puppet-user[59754]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html
Feb 23 08:04:04 np0005626463.localdomain puppet-user[59754]:    (file & line not available)
Feb 23 08:04:04 np0005626463.localdomain puppet-user[59754]: Warning: Unknown variable: '::deployment_type'. (file: /etc/puppet/modules/tripleo/manifests/profile/base/database/mysql/client.pp, line: 89, column: 8)
Feb 23 08:04:04 np0005626463.localdomain puppet-user[59754]: Warning: Unknown variable: '::deployment_type'. (file: /etc/puppet/modules/tripleo/manifests/packages.pp, line: 39, column: 69)
Feb 23 08:04:04 np0005626463.localdomain puppet-user[59754]: Notice: Compiled catalog for np0005626463.localdomain in environment production in 0.13 seconds
Feb 23 08:04:04 np0005626463.localdomain puppet-user[59754]: Notice: Applied catalog in 0.05 seconds
Feb 23 08:04:04 np0005626463.localdomain puppet-user[59754]: Application:
Feb 23 08:04:04 np0005626463.localdomain puppet-user[59754]:    Initial environment: production
Feb 23 08:04:04 np0005626463.localdomain puppet-user[59754]:    Converged environment: production
Feb 23 08:04:04 np0005626463.localdomain puppet-user[59754]:          Run mode: user
Feb 23 08:04:04 np0005626463.localdomain puppet-user[59754]: Changes:
Feb 23 08:04:04 np0005626463.localdomain puppet-user[59754]: Events:
Feb 23 08:04:04 np0005626463.localdomain puppet-user[59754]: Resources:
Feb 23 08:04:04 np0005626463.localdomain puppet-user[59754]:             Total: 10
Feb 23 08:04:04 np0005626463.localdomain puppet-user[59754]: Time:
Feb 23 08:04:04 np0005626463.localdomain puppet-user[59754]:          Schedule: 0.00
Feb 23 08:04:04 np0005626463.localdomain puppet-user[59754]:              File: 0.00
Feb 23 08:04:04 np0005626463.localdomain puppet-user[59754]:            Augeas: 0.01
Feb 23 08:04:04 np0005626463.localdomain puppet-user[59754]:              Exec: 0.01
Feb 23 08:04:04 np0005626463.localdomain puppet-user[59754]:    Transaction evaluation: 0.04
Feb 23 08:04:04 np0005626463.localdomain puppet-user[59754]:    Catalog application: 0.05
Feb 23 08:04:04 np0005626463.localdomain puppet-user[59754]:    Config retrieval: 0.17
Feb 23 08:04:04 np0005626463.localdomain puppet-user[59754]:          Last run: 1771833844
Feb 23 08:04:04 np0005626463.localdomain puppet-user[59754]:        Filebucket: 0.00
Feb 23 08:04:04 np0005626463.localdomain puppet-user[59754]:             Total: 0.06
Feb 23 08:04:04 np0005626463.localdomain puppet-user[59754]: Version:
Feb 23 08:04:04 np0005626463.localdomain puppet-user[59754]:            Config: 1771833843
Feb 23 08:04:04 np0005626463.localdomain puppet-user[59754]:            Puppet: 7.10.0
Feb 23 08:04:04 np0005626463.localdomain ansible-async_wrapper.py[59750]: Module complete (59750)
Feb 23 08:04:04 np0005626463.localdomain sudo[59943]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 08:04:04 np0005626463.localdomain sudo[59943]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:04:04 np0005626463.localdomain sudo[59943]: pam_unix(sudo:session): session closed for user root
Feb 23 08:04:05 np0005626463.localdomain ansible-async_wrapper.py[59749]: Done in kid B.
Feb 23 08:04:06 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.
Feb 23 08:04:06 np0005626463.localdomain podman[59958]: 2026-02-23 08:04:06.921104596 +0000 UTC m=+0.090164138 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:14Z, architecture=x86_64, container_name=metrics_qdr, build-date=2026-01-12T22:10:14Z, version=17.1.13, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, url=https://www.redhat.com, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, release=1766032510, distribution-scope=public, vcs-type=git)
Feb 23 08:04:07 np0005626463.localdomain podman[59958]: 2026-02-23 08:04:07.157025711 +0000 UTC m=+0.326085243 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, config_id=tripleo_step1, build-date=2026-01-12T22:10:14Z, distribution-scope=public, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=metrics_qdr, managed_by=tripleo_ansible, vcs-type=git, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, io.buildah.version=1.41.5, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd)
Feb 23 08:04:07 np0005626463.localdomain systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully.
Feb 23 08:04:11 np0005626463.localdomain sudo[60000]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tdqyuzudrrswexbmrshbwcojwxsajvvz ; /usr/bin/python3
Feb 23 08:04:11 np0005626463.localdomain sudo[60000]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:04:11 np0005626463.localdomain python3[60002]: ansible-ansible.legacy.async_status Invoked with jid=839121906300.59746 mode=status _async_dir=/tmp/.ansible_async
Feb 23 08:04:11 np0005626463.localdomain sudo[60000]: pam_unix(sudo:session): session closed for user root
Feb 23 08:04:11 np0005626463.localdomain sudo[60016]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nuhabqhtsqjjgbzsirffqrzbhkwxppkg ; /usr/bin/python3
Feb 23 08:04:11 np0005626463.localdomain sudo[60016]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:04:11 np0005626463.localdomain python3[60018]: ansible-file Invoked with path=/var/lib/container-puppet/puppetlabs state=directory setype=svirt_sandbox_file_t selevel=s0 recurse=True force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None
Feb 23 08:04:11 np0005626463.localdomain sudo[60016]: pam_unix(sudo:session): session closed for user root
Feb 23 08:04:12 np0005626463.localdomain sudo[60032]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vniyczbaqgrihthzxziennmmgufisiah ; /usr/bin/python3
Feb 23 08:04:12 np0005626463.localdomain sudo[60032]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:04:12 np0005626463.localdomain python3[60034]: ansible-stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 23 08:04:12 np0005626463.localdomain sudo[60032]: pam_unix(sudo:session): session closed for user root
Feb 23 08:04:12 np0005626463.localdomain sudo[60082]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fkhiwrfwqckpmayluiokjdetqjmmbvvv ; /usr/bin/python3
Feb 23 08:04:12 np0005626463.localdomain sudo[60082]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:04:12 np0005626463.localdomain python3[60084]: ansible-ansible.legacy.stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 08:04:12 np0005626463.localdomain sudo[60082]: pam_unix(sudo:session): session closed for user root
Feb 23 08:04:13 np0005626463.localdomain sudo[60100]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yhtienxlchxucuevuqlgqkbbrqrydcjz ; /usr/bin/python3
Feb 23 08:04:13 np0005626463.localdomain sudo[60100]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:04:13 np0005626463.localdomain python3[60102]: ansible-ansible.legacy.file Invoked with setype=svirt_sandbox_file_t selevel=s0 dest=/var/lib/container-puppet/puppetlabs/facter.conf _original_basename=tmpfm4zvh7f recurse=False state=file path=/var/lib/container-puppet/puppetlabs/facter.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None
Feb 23 08:04:13 np0005626463.localdomain sudo[60100]: pam_unix(sudo:session): session closed for user root
Feb 23 08:04:13 np0005626463.localdomain sudo[60130]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wjixcpamtprotkdfihpwiyzetqoadolq ; /usr/bin/python3
Feb 23 08:04:13 np0005626463.localdomain sudo[60130]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:04:13 np0005626463.localdomain python3[60132]: ansible-file Invoked with path=/opt/puppetlabs/facter state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 08:04:13 np0005626463.localdomain sudo[60130]: pam_unix(sudo:session): session closed for user root
Feb 23 08:04:13 np0005626463.localdomain sudo[60146]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ajmwoqlrygfngfduvhnnwcfffrquopdb ; /usr/bin/python3
Feb 23 08:04:13 np0005626463.localdomain sudo[60146]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:04:14 np0005626463.localdomain sudo[60146]: pam_unix(sudo:session): session closed for user root
Feb 23 08:04:14 np0005626463.localdomain sudo[60234]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gxgafcnkhshxhicdleqquvgbefthbdgd ; /usr/bin/python3
Feb 23 08:04:14 np0005626463.localdomain sudo[60234]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:04:14 np0005626463.localdomain python3[60236]: ansible-ansible.posix.synchronize Invoked with src=/opt/puppetlabs/ dest=/var/lib/container-puppet/puppetlabs/ _local_rsync_path=rsync _local_rsync_password=NOT_LOGGING_PARAMETER rsync_path=None delete=False _substitute_controller=False archive=True checksum=False compress=True existing_only=False dirs=False copy_links=False set_remote_user=True rsync_timeout=0 rsync_opts=[] ssh_connection_multiplexing=False partial=False verify_host=False mode=push dest_port=None private_key=None recursive=None links=None perms=None times=None owner=None group=None ssh_args=None link_dest=None
Feb 23 08:04:14 np0005626463.localdomain sudo[60234]: pam_unix(sudo:session): session closed for user root
Feb 23 08:04:15 np0005626463.localdomain sudo[60253]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-atpkwyqecwfzfsnjbvniudufygcgorox ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 08:04:15 np0005626463.localdomain sudo[60253]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:04:15 np0005626463.localdomain python3[60255]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 08:04:15 np0005626463.localdomain sudo[60253]: pam_unix(sudo:session): session closed for user root
Feb 23 08:04:16 np0005626463.localdomain sudo[60269]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fvjqqncqofyxjzffnxanwohesitdewwi ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 08:04:16 np0005626463.localdomain sudo[60269]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:04:16 np0005626463.localdomain sudo[60269]: pam_unix(sudo:session): session closed for user root
Feb 23 08:04:16 np0005626463.localdomain sudo[60285]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ntnijbwyzalwiaievsfooyweyhbqullt ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 08:04:16 np0005626463.localdomain sudo[60285]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:04:16 np0005626463.localdomain python3[60287]: ansible-stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 23 08:04:16 np0005626463.localdomain sudo[60285]: pam_unix(sudo:session): session closed for user root
Feb 23 08:04:17 np0005626463.localdomain sudo[60335]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zllbfpujwgpeijvnrgzdpkaupaiohkcu ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 08:04:17 np0005626463.localdomain sudo[60335]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:04:17 np0005626463.localdomain python3[60337]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-container-shutdown follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 08:04:17 np0005626463.localdomain sudo[60335]: pam_unix(sudo:session): session closed for user root
Feb 23 08:04:17 np0005626463.localdomain sudo[60353]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zuuklmhtypagdonbvzsigtetsktuymrv ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 08:04:17 np0005626463.localdomain sudo[60353]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:04:17 np0005626463.localdomain python3[60355]: ansible-ansible.legacy.file Invoked with mode=0700 owner=root group=root dest=/usr/libexec/tripleo-container-shutdown _original_basename=tripleo-container-shutdown recurse=False state=file path=/usr/libexec/tripleo-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 08:04:17 np0005626463.localdomain sudo[60353]: pam_unix(sudo:session): session closed for user root
Feb 23 08:04:18 np0005626463.localdomain sudo[60415]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hhankxvwosnwhwtwxhztqscqkeexdtmz ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 08:04:18 np0005626463.localdomain sudo[60415]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:04:18 np0005626463.localdomain python3[60417]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-start-podman-container follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 08:04:18 np0005626463.localdomain sudo[60415]: pam_unix(sudo:session): session closed for user root
Feb 23 08:04:18 np0005626463.localdomain sudo[60433]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ghrqmsyjijyfrekjqxtbvdwjlpnzvxge ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 08:04:18 np0005626463.localdomain sudo[60433]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:04:18 np0005626463.localdomain python3[60435]: ansible-ansible.legacy.file Invoked with mode=0700 owner=root group=root dest=/usr/libexec/tripleo-start-podman-container _original_basename=tripleo-start-podman-container recurse=False state=file path=/usr/libexec/tripleo-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 08:04:18 np0005626463.localdomain sudo[60433]: pam_unix(sudo:session): session closed for user root
Feb 23 08:04:19 np0005626463.localdomain sudo[60495]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dkywuzvmkdpxzjnmhzdydriukaddidwl ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 08:04:19 np0005626463.localdomain sudo[60495]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:04:19 np0005626463.localdomain python3[60497]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/tripleo-container-shutdown.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 08:04:19 np0005626463.localdomain sudo[60495]: pam_unix(sudo:session): session closed for user root
Feb 23 08:04:19 np0005626463.localdomain sudo[60513]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xluftznsvbemjnxegnhiqddgtzumibeg ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 08:04:19 np0005626463.localdomain sudo[60513]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:04:19 np0005626463.localdomain python3[60515]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system/tripleo-container-shutdown.service _original_basename=tripleo-container-shutdown-service recurse=False state=file path=/usr/lib/systemd/system/tripleo-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 08:04:19 np0005626463.localdomain sudo[60513]: pam_unix(sudo:session): session closed for user root
Feb 23 08:04:19 np0005626463.localdomain sudo[60575]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mgqrmlnofghhhogtlvilpqroiokeocoi ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 08:04:19 np0005626463.localdomain sudo[60575]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:04:20 np0005626463.localdomain python3[60577]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 08:04:20 np0005626463.localdomain sudo[60575]: pam_unix(sudo:session): session closed for user root
Feb 23 08:04:20 np0005626463.localdomain sudo[60593]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vwbcuzldcmdtuyudgbwdvdewhxukcpcy ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 08:04:20 np0005626463.localdomain sudo[60593]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:04:20 np0005626463.localdomain python3[60595]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset _original_basename=91-tripleo-container-shutdown-preset recurse=False state=file path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 08:04:20 np0005626463.localdomain sudo[60593]: pam_unix(sudo:session): session closed for user root
Feb 23 08:04:20 np0005626463.localdomain sudo[60623]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-njlheyxjzamhrwfzmsamfybmpliltrjr ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 08:04:20 np0005626463.localdomain sudo[60623]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:04:20 np0005626463.localdomain python3[60625]: ansible-systemd Invoked with name=tripleo-container-shutdown state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 08:04:20 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 08:04:21 np0005626463.localdomain systemd-sysv-generator[60653]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 08:04:21 np0005626463.localdomain systemd-rc-local-generator[60648]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 08:04:21 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 08:04:21 np0005626463.localdomain sudo[60623]: pam_unix(sudo:session): session closed for user root
Feb 23 08:04:21 np0005626463.localdomain sudo[60708]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hvpkugifeqwdtcseetywxqhffbklkqpo ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 08:04:21 np0005626463.localdomain sudo[60708]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:04:21 np0005626463.localdomain python3[60710]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/netns-placeholder.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 08:04:21 np0005626463.localdomain sudo[60708]: pam_unix(sudo:session): session closed for user root
Feb 23 08:04:21 np0005626463.localdomain sudo[60726]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-libsfzsmyeeirqepmlyeqekwyfwpbweq ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 08:04:21 np0005626463.localdomain sudo[60726]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:04:22 np0005626463.localdomain python3[60728]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/usr/lib/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 08:04:22 np0005626463.localdomain sudo[60726]: pam_unix(sudo:session): session closed for user root
Feb 23 08:04:22 np0005626463.localdomain sudo[60788]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-beusjydjzvgznpgtdhvcjqxkpowruvbk ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 08:04:22 np0005626463.localdomain sudo[60788]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:04:22 np0005626463.localdomain python3[60790]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 08:04:22 np0005626463.localdomain sudo[60788]: pam_unix(sudo:session): session closed for user root
Feb 23 08:04:22 np0005626463.localdomain sudo[60806]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ajzicivplmdamhpxdbfcnhtgrryxowva ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 08:04:22 np0005626463.localdomain sudo[60806]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:04:22 np0005626463.localdomain python3[60808]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 08:04:23 np0005626463.localdomain sudo[60806]: pam_unix(sudo:session): session closed for user root
Feb 23 08:04:23 np0005626463.localdomain sudo[60836]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yeeiaohzbbdqlkpmwwfvoleokcxvsotw ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 08:04:23 np0005626463.localdomain sudo[60836]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:04:23 np0005626463.localdomain python3[60838]: ansible-systemd Invoked with name=netns-placeholder state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 08:04:23 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 08:04:23 np0005626463.localdomain systemd-rc-local-generator[60859]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 08:04:23 np0005626463.localdomain systemd-sysv-generator[60863]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 08:04:23 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 08:04:23 np0005626463.localdomain systemd[1]: Starting Create netns directory...
Feb 23 08:04:23 np0005626463.localdomain systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Feb 23 08:04:23 np0005626463.localdomain systemd[1]: netns-placeholder.service: Deactivated successfully.
Feb 23 08:04:23 np0005626463.localdomain systemd[1]: Finished Create netns directory.
Feb 23 08:04:23 np0005626463.localdomain sudo[60836]: pam_unix(sudo:session): session closed for user root
Feb 23 08:04:24 np0005626463.localdomain sudo[60895]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-namlcffecufeaqrwyeyvaositupvnauy ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 08:04:24 np0005626463.localdomain sudo[60895]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:04:24 np0005626463.localdomain python3[60897]: ansible-container_puppet_config Invoked with update_config_hash_only=True no_archive=True check_mode=False config_vol_prefix=/var/lib/config-data debug=False net_host=True puppet_config= short_hostname= step=6
Feb 23 08:04:24 np0005626463.localdomain sudo[60895]: pam_unix(sudo:session): session closed for user root
Feb 23 08:04:24 np0005626463.localdomain sudo[60911]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vijxjkdmitiysbiznqsfqyuvdzubrtrp ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 08:04:24 np0005626463.localdomain sudo[60911]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:04:25 np0005626463.localdomain sudo[60911]: pam_unix(sudo:session): session closed for user root
Feb 23 08:04:26 np0005626463.localdomain sudo[60954]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nsswsttbvcusljargnljetybgliflnrk ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 08:04:26 np0005626463.localdomain sudo[60954]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:04:26 np0005626463.localdomain python3[60956]: ansible-tripleo_container_manage Invoked with config_id=tripleo_step3 config_dir=/var/lib/tripleo-config/container-startup-config/step_3 config_patterns=*.json config_overrides={} concurrency=5 log_base_path=/var/log/containers/stdouts debug=False
Feb 23 08:04:27 np0005626463.localdomain podman[61111]: 2026-02-23 08:04:27.183177363 +0000 UTC m=+0.075249914 container create 215843e184902f90be3eb81ce66aa24806af3685bfe1167fb47a6baf7e2cdc6b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd_wrapper, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']}, description=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, com.redhat.component=openstack-nova-libvirt-container, version=17.1.13, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, architecture=x86_64, config_id=tripleo_step3, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vendor=Red Hat, Inc., build-date=2026-01-12T23:31:49Z, io.openshift.expose-services=, managed_by=tripleo_ansible, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, release=1766032510, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, name=rhosp-rhel9/openstack-nova-libvirt, org.opencontainers.image.created=2026-01-12T23:31:49Z, tcib_managed=true, container_name=nova_virtlogd_wrapper, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt)
Feb 23 08:04:27 np0005626463.localdomain podman[61125]: 2026-02-23 08:04:27.188351979 +0000 UTC m=+0.070920175 container create 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vendor=Red Hat, Inc., config_id=tripleo_step3, distribution-scope=public, io.openshift.expose-services=, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:15Z, name=rhosp-rhel9/openstack-collectd, version=17.1.13, batch=17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 collectd, build-date=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, architecture=x86_64, tcib_managed=true, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-collectd-container, description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 23 08:04:27 np0005626463.localdomain podman[61140]: 2026-02-23 08:04:27.214351222 +0000 UTC m=+0.075489731 container create 94a5c22430959b0767f37ebdcadc6a54425aaa14ae15dbfff3d47acdfec747f4 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, batch=17.1_20260112.1, managed_by=tripleo_ansible, version=17.1.13, name=rhosp-rhel9/openstack-rsyslog, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:09Z, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, container_name=rsyslog, tcib_managed=true, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:10:09Z, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 rsyslog, summary=Red Hat OpenStack Platform 17.1 rsyslog, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8e5028e38f7077561ef1e3e50ec174a3'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., com.redhat.component=openstack-rsyslog-container, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step3, vcs-type=git, io.buildah.version=1.41.5)
Feb 23 08:04:27 np0005626463.localdomain systemd[1]: Started libpod-conmon-186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.scope.
Feb 23 08:04:27 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 08:04:27 np0005626463.localdomain podman[61111]: 2026-02-23 08:04:27.140020599 +0000 UTC m=+0.032093160 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Feb 23 08:04:27 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/70b7b3f393818c1da1a59fd13309ca9cf26b2dd139b3696bb046bf52c3291b46/merged/scripts supports timestamps until 2038 (0x7fffffff)
Feb 23 08:04:27 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/70b7b3f393818c1da1a59fd13309ca9cf26b2dd139b3696bb046bf52c3291b46/merged/var/log/collectd supports timestamps until 2038 (0x7fffffff)
Feb 23 08:04:27 np0005626463.localdomain podman[61125]: 2026-02-23 08:04:27.145677221 +0000 UTC m=+0.028245427 image pull  registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1
Feb 23 08:04:27 np0005626463.localdomain podman[61138]: 2026-02-23 08:04:27.249449618 +0000 UTC m=+0.112535449 container create 6ae4af24cc7f8254adc5ccdc0b5d346c50b5973ee040979ebfa9ae1599478537 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_init_log, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-ceilometer-ipmi-container, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, name=rhosp-rhel9/openstack-ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1, io.openshift.expose-services=, url=https://www.redhat.com, vcs-type=git, org.opencontainers.image.created=2026-01-12T23:07:30Z, config_data={'command': ['/bin/bash', '-c', 'chown -R ceilometer:ceilometer /var/log/ceilometer'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'none', 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1766032510, build-date=2026-01-12T23:07:30Z, config_id=tripleo_step3, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_init_log, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, version=17.1.13)
Feb 23 08:04:27 np0005626463.localdomain systemd[1]: Started libpod-conmon-215843e184902f90be3eb81ce66aa24806af3685bfe1167fb47a6baf7e2cdc6b.scope.
Feb 23 08:04:27 np0005626463.localdomain systemd[1]: Started libpod-conmon-94a5c22430959b0767f37ebdcadc6a54425aaa14ae15dbfff3d47acdfec747f4.scope.
Feb 23 08:04:27 np0005626463.localdomain systemd[1]: Started libpod-conmon-6ae4af24cc7f8254adc5ccdc0b5d346c50b5973ee040979ebfa9ae1599478537.scope.
Feb 23 08:04:27 np0005626463.localdomain podman[61143]: 2026-02-23 08:04:27.175182456 +0000 UTC m=+0.032614666 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1
Feb 23 08:04:27 np0005626463.localdomain podman[61140]: 2026-02-23 08:04:27.178180462 +0000 UTC m=+0.039319001 image pull  registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1
Feb 23 08:04:27 np0005626463.localdomain podman[61138]: 2026-02-23 08:04:27.178546484 +0000 UTC m=+0.041632315 image pull  registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1
Feb 23 08:04:27 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 08:04:27 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 08:04:27 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/298dde95429fec5a28a160c47b2187ae70f7a465a9ef6c2faaa9c2f451a444ab/merged/etc/libvirt supports timestamps until 2038 (0x7fffffff)
Feb 23 08:04:27 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/298dde95429fec5a28a160c47b2187ae70f7a465a9ef6c2faaa9c2f451a444ab/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Feb 23 08:04:27 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/298dde95429fec5a28a160c47b2187ae70f7a465a9ef6c2faaa9c2f451a444ab/merged/var/log/libvirt supports timestamps until 2038 (0x7fffffff)
Feb 23 08:04:27 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/298dde95429fec5a28a160c47b2187ae70f7a465a9ef6c2faaa9c2f451a444ab/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Feb 23 08:04:27 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/298dde95429fec5a28a160c47b2187ae70f7a465a9ef6c2faaa9c2f451a444ab/merged/var/lib/vhost_sockets supports timestamps until 2038 (0x7fffffff)
Feb 23 08:04:27 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/298dde95429fec5a28a160c47b2187ae70f7a465a9ef6c2faaa9c2f451a444ab/merged/var/cache/libvirt supports timestamps until 2038 (0x7fffffff)
Feb 23 08:04:27 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 08:04:27 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/298dde95429fec5a28a160c47b2187ae70f7a465a9ef6c2faaa9c2f451a444ab/merged/var/lib/kolla/config_files/src-ceph supports timestamps until 2038 (0x7fffffff)
Feb 23 08:04:27 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3e3fe691531a0d3ed4e0bd844aee95e09028b37c75ae9985cc1386696cb9ad2a/merged/var/log/ceilometer supports timestamps until 2038 (0x7fffffff)
Feb 23 08:04:27 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/abedacbc22b2fa6d036e7ebf1118c866641128a14eaf11021e9356d60564993d/merged/var/log/rsyslog supports timestamps until 2038 (0x7fffffff)
Feb 23 08:04:27 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/abedacbc22b2fa6d036e7ebf1118c866641128a14eaf11021e9356d60564993d/merged/var/lib/rsyslog supports timestamps until 2038 (0x7fffffff)
Feb 23 08:04:27 np0005626463.localdomain podman[61111]: 2026-02-23 08:04:27.293906584 +0000 UTC m=+0.185979145 container init 215843e184902f90be3eb81ce66aa24806af3685bfe1167fb47a6baf7e2cdc6b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd_wrapper, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-nova-libvirt, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, build-date=2026-01-12T23:31:49Z, distribution-scope=public, architecture=x86_64, io.openshift.expose-services=, container_name=nova_virtlogd_wrapper, io.buildah.version=1.41.5, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-libvirt-container, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']}, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, tcib_managed=true, managed_by=tripleo_ansible, config_id=tripleo_step3, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.created=2026-01-12T23:31:49Z)
Feb 23 08:04:27 np0005626463.localdomain podman[61111]: 2026-02-23 08:04:27.299662169 +0000 UTC m=+0.191734730 container start 215843e184902f90be3eb81ce66aa24806af3685bfe1167fb47a6baf7e2cdc6b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd_wrapper, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, container_name=nova_virtlogd_wrapper, config_id=tripleo_step3, batch=17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.component=openstack-nova-libvirt-container, url=https://www.redhat.com, build-date=2026-01-12T23:31:49Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:31:49Z, name=rhosp-rhel9/openstack-nova-libvirt, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.13, managed_by=tripleo_ansible, distribution-scope=public, vendor=Red Hat, Inc., vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': 
['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']}, vcs-type=git)
Feb 23 08:04:27 np0005626463.localdomain python3[60956]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_virtlogd_wrapper --cgroupns=host --conmon-pidfile /run/nova_virtlogd_wrapper.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=b5f04eda8e5f004a5ff6ec948b25cc1e --label config_id=tripleo_step3 --label container_name=nova_virtlogd_wrapper --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_virtlogd_wrapper.log --network host --pid host --privileged=True --security-opt label=level:s0 --security-opt label=type:spc_t --security-opt label=filetype:container_file_t --ulimit nofile=131072 --ulimit nproc=126960 --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/log/containers/libvirt:/var/log/libvirt:shared,z --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /run:/run --volume /sys/fs/cgroup:/sys/fs/cgroup --volume /sys/fs/selinux:/sys/fs/selinux --volume /etc/selinux/config:/etc/selinux/config:ro --volume /etc/libvirt:/etc/libvirt:shared --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro --volume /run/libvirt:/run/libvirt:shared,z --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/libvirt:/var/lib/libvirt:shared --volume /var/cache/libvirt:/var/cache/libvirt:shared --volume /var/lib/vhost_sockets:/var/lib/vhost_sockets --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z --volume /var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro --volume 
/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Feb 23 08:04:27 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.
Feb 23 08:04:27 np0005626463.localdomain podman[61125]: 2026-02-23 08:04:27.321376154 +0000 UTC m=+0.203944380 container init 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 collectd, build-date=2026-01-12T22:10:15Z, tcib_managed=true, container_name=collectd, vendor=Red Hat, Inc., vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:10:15Z, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, distribution-scope=public, batch=17.1_20260112.1, release=1766032510, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, com.redhat.component=openstack-collectd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd)
Feb 23 08:04:27 np0005626463.localdomain sudo[61208]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Feb 23 08:04:27 np0005626463.localdomain sudo[61223]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Feb 23 08:04:27 np0005626463.localdomain systemd-logind[759]: Existing logind session ID 28 used by new audit session, ignoring.
Feb 23 08:04:27 np0005626463.localdomain podman[61143]: 2026-02-23 08:04:27.344096244 +0000 UTC m=+0.201528454 container create c65bb836e8986e21560c647871dbfc4241d11dbb3f691c6e6e88dd3439c9e7c6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_statedir_owner, version=17.1.13, tcib_managed=true, build-date=2026-01-12T23:32:04Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., io.buildah.version=1.41.5, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, config_data={'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': 'triliovault-mounts', 'TRIPLEO_DEPLOY_IDENTIFIER': '1771832380', '__OS_DEBUG': 'true'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/container-config-scripts:/container-config-scripts:z']}, managed_by=tripleo_ansible, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, batch=17.1_20260112.1, io.openshift.expose-services=, 
container_name=nova_statedir_owner, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3)
Feb 23 08:04:27 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.
Feb 23 08:04:27 np0005626463.localdomain podman[61140]: 2026-02-23 08:04:27.34773485 +0000 UTC m=+0.208873369 container init 94a5c22430959b0767f37ebdcadc6a54425aaa14ae15dbfff3d47acdfec747f4 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, url=https://www.redhat.com, batch=17.1_20260112.1, distribution-scope=public, io.buildah.version=1.41.5, release=1766032510, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, build-date=2026-01-12T22:10:09Z, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8e5028e38f7077561ef1e3e50ec174a3'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=rsyslog, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 rsyslog, vcs-type=git, version=17.1.13, com.redhat.component=openstack-rsyslog-container, org.opencontainers.image.created=2026-01-12T22:10:09Z, name=rhosp-rhel9/openstack-rsyslog, description=Red Hat OpenStack Platform 17.1 rsyslog)
Feb 23 08:04:27 np0005626463.localdomain podman[61140]: 2026-02-23 08:04:27.357641367 +0000 UTC m=+0.218779886 container start 94a5c22430959b0767f37ebdcadc6a54425aaa14ae15dbfff3d47acdfec747f4 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, org.opencontainers.image.created=2026-01-12T22:10:09Z, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, release=1766032510, io.buildah.version=1.41.5, build-date=2026-01-12T22:10:09Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8e5028e38f7077561ef1e3e50ec174a3'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, name=rhosp-rhel9/openstack-rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 rsyslog, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., 
com.redhat.component=openstack-rsyslog-container, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, version=17.1.13, architecture=x86_64, url=https://www.redhat.com, batch=17.1_20260112.1, io.openshift.expose-services=, distribution-scope=public, vcs-type=git, container_name=rsyslog, description=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog)
Feb 23 08:04:27 np0005626463.localdomain systemd[1]: Created slice User Slice of UID 0.
Feb 23 08:04:27 np0005626463.localdomain systemd-logind[759]: Existing logind session ID 28 used by new audit session, ignoring.
Feb 23 08:04:27 np0005626463.localdomain systemd[1]: Starting User Runtime Directory /run/user/0...
Feb 23 08:04:27 np0005626463.localdomain python3[60956]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name rsyslog --conmon-pidfile /run/rsyslog.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=8e5028e38f7077561ef1e3e50ec174a3 --label config_id=tripleo_step3 --label container_name=rsyslog --label managed_by=tripleo_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8e5028e38f7077561ef1e3e50ec174a3'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/rsyslog.log --network host --privileged=True --security-opt label=disable --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume 
/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro --volume /var/log/containers:/var/log/containers:ro --volume /var/log/containers/rsyslog:/var/log/rsyslog:rw,z --volume /var/log:/var/log/host:ro --volume /var/lib/rsyslog.container:/var/lib/rsyslog:rw,z registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1
Feb 23 08:04:27 np0005626463.localdomain sudo[61232]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Feb 23 08:04:27 np0005626463.localdomain sudo[61232]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Feb 23 08:04:27 np0005626463.localdomain systemd[1]: Finished User Runtime Directory /run/user/0.
Feb 23 08:04:27 np0005626463.localdomain systemd[1]: Starting User Manager for UID 0...
Feb 23 08:04:27 np0005626463.localdomain podman[61138]: 2026-02-23 08:04:27.394939714 +0000 UTC m=+0.258025545 container init 6ae4af24cc7f8254adc5ccdc0b5d346c50b5973ee040979ebfa9ae1599478537 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_init_log, com.redhat.component=openstack-ceilometer-ipmi-container, org.opencontainers.image.created=2026-01-12T23:07:30Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13, distribution-scope=public, config_data={'command': ['/bin/bash', '-c', 'chown -R ceilometer:ceilometer /var/log/ceilometer'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'none', 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp-rhel9/openstack-ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, container_name=ceilometer_init_log, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, config_id=tripleo_step3, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.buildah.version=1.41.5, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=)
Feb 23 08:04:27 np0005626463.localdomain systemd[61241]: pam_unix(systemd-user:session): session opened for user root(uid=0) by (uid=0)
Feb 23 08:04:27 np0005626463.localdomain podman[61125]: 2026-02-23 08:04:27.405265735 +0000 UTC m=+0.287833921 container start 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vcs-type=git, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, version=17.1.13, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, 
konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step3, distribution-scope=public, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=collectd, vendor=Red Hat, Inc., io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, batch=17.1_20260112.1, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-collectd-container, build-date=2026-01-12T22:10:15Z, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.openshift.expose-services=)
Feb 23 08:04:27 np0005626463.localdomain systemd[1]: libpod-6ae4af24cc7f8254adc5ccdc0b5d346c50b5973ee040979ebfa9ae1599478537.scope: Deactivated successfully.
Feb 23 08:04:27 np0005626463.localdomain python3[60956]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name collectd --cap-add IPC_LOCK --conmon-pidfile /run/collectd.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=da9a0dc7b40588672419e3ce10063e21 --healthcheck-command /openstack/healthcheck --label config_id=tripleo_step3 --label container_name=collectd --label managed_by=tripleo_ansible --label config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/collectd.log --memory 512m --network host --pid host --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume 
/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro --volume /var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro --volume /var/log/containers/collectd:/var/log/collectd:rw,z --volume /var/lib/container-config-scripts:/config-scripts:ro --volume /var/lib/container-user-scripts:/scripts:z --volume /run:/run:rw --volume /sys/fs/cgroup:/sys/fs/cgroup:ro registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1
Feb 23 08:04:27 np0005626463.localdomain sudo[61232]: pam_unix(sudo:session): session closed for user root
Feb 23 08:04:27 np0005626463.localdomain systemd[1]: libpod-94a5c22430959b0767f37ebdcadc6a54425aaa14ae15dbfff3d47acdfec747f4.scope: Deactivated successfully.
Feb 23 08:04:27 np0005626463.localdomain podman[61236]: 2026-02-23 08:04:27.440408612 +0000 UTC m=+0.060881393 container died 94a5c22430959b0767f37ebdcadc6a54425aaa14ae15dbfff3d47acdfec747f4 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, name=rhosp-rhel9/openstack-rsyslog, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8e5028e38f7077561ef1e3e50ec174a3'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, description=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, managed_by=tripleo_ansible, tcib_managed=true, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, config_id=tripleo_step3, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, architecture=x86_64, 
io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=rsyslog, org.opencontainers.image.created=2026-01-12T22:10:09Z, com.redhat.component=openstack-rsyslog-container, version=17.1.13, build-date=2026-01-12T22:10:09Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, url=https://www.redhat.com, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 23 08:04:27 np0005626463.localdomain systemd[1]: Started libpod-conmon-c65bb836e8986e21560c647871dbfc4241d11dbb3f691c6e6e88dd3439c9e7c6.scope.
Feb 23 08:04:27 np0005626463.localdomain podman[61138]: 2026-02-23 08:04:27.509805068 +0000 UTC m=+0.372890919 container start 6ae4af24cc7f8254adc5ccdc0b5d346c50b5973ee040979ebfa9ae1599478537 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_init_log, config_data={'command': ['/bin/bash', '-c', 'chown -R ceilometer:ceilometer /var/log/ceilometer'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'none', 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, config_id=tripleo_step3, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-type=git, architecture=x86_64, name=rhosp-rhel9/openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.buildah.version=1.41.5, io.openshift.expose-services=, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, container_name=ceilometer_init_log, tcib_managed=true, vendor=Red Hat, Inc.)
Feb 23 08:04:27 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 08:04:27 np0005626463.localdomain python3[60956]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name ceilometer_init_log --conmon-pidfile /run/ceilometer_init_log.pid --detach=True --label config_id=tripleo_step3 --label container_name=ceilometer_init_log --label managed_by=tripleo_ansible --label config_data={'command': ['/bin/bash', '-c', 'chown -R ceilometer:ceilometer /var/log/ceilometer'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'none', 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/ceilometer:/var/log/ceilometer:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/ceilometer_init_log.log --network none --user root --volume /var/log/containers/ceilometer:/var/log/ceilometer:z registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1 /bin/bash -c chown -R ceilometer:ceilometer /var/log/ceilometer
Feb 23 08:04:27 np0005626463.localdomain podman[61256]: 2026-02-23 08:04:27.514795678 +0000 UTC m=+0.096026451 container died 6ae4af24cc7f8254adc5ccdc0b5d346c50b5973ee040979ebfa9ae1599478537 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_init_log, distribution-scope=public, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:07:30Z, config_data={'command': ['/bin/bash', '-c', 'chown -R ceilometer:ceilometer /var/log/ceilometer'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'none', 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-ipmi-container, architecture=x86_64, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, version=17.1.13, container_name=ceilometer_init_log, managed_by=tripleo_ansible, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, name=rhosp-rhel9/openstack-ceilometer-ipmi)
Feb 23 08:04:27 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d7608d8b6f0c641f6e65bb7bd3e1d2a7040712e7934c1516102890a576b77876/merged/container-config-scripts supports timestamps until 2038 (0x7fffffff)
Feb 23 08:04:27 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d7608d8b6f0c641f6e65bb7bd3e1d2a7040712e7934c1516102890a576b77876/merged/var/lib/_nova_secontext supports timestamps until 2038 (0x7fffffff)
Feb 23 08:04:27 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d7608d8b6f0c641f6e65bb7bd3e1d2a7040712e7934c1516102890a576b77876/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Feb 23 08:04:27 np0005626463.localdomain systemd[61241]: Queued start job for default target Main User Target.
Feb 23 08:04:27 np0005626463.localdomain podman[61143]: 2026-02-23 08:04:27.523127124 +0000 UTC m=+0.380559324 container init c65bb836e8986e21560c647871dbfc4241d11dbb3f691c6e6e88dd3439c9e7c6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_statedir_owner, vcs-type=git, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, config_data={'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': 'triliovault-mounts', 'TRIPLEO_DEPLOY_IDENTIFIER': '1771832380', '__OS_DEBUG': 'true'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/container-config-scripts:/container-config-scripts:z']}, config_id=tripleo_step3, managed_by=tripleo_ansible, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:32:04Z, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, name=rhosp-rhel9/openstack-nova-compute, 
org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, container_name=nova_statedir_owner)
Feb 23 08:04:27 np0005626463.localdomain systemd[61241]: Created slice User Application Slice.
Feb 23 08:04:27 np0005626463.localdomain systemd[61241]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Feb 23 08:04:27 np0005626463.localdomain systemd[61241]: Started Daily Cleanup of User's Temporary Directories.
Feb 23 08:04:27 np0005626463.localdomain systemd[61241]: Reached target Paths.
Feb 23 08:04:27 np0005626463.localdomain systemd[61241]: Reached target Timers.
Feb 23 08:04:27 np0005626463.localdomain systemd[61241]: Starting D-Bus User Message Bus Socket...
Feb 23 08:04:27 np0005626463.localdomain systemd[61241]: Starting Create User's Volatile Files and Directories...
Feb 23 08:04:27 np0005626463.localdomain podman[61143]: 2026-02-23 08:04:27.530247403 +0000 UTC m=+0.387679603 container start c65bb836e8986e21560c647871dbfc4241d11dbb3f691c6e6e88dd3439c9e7c6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_statedir_owner, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, release=1766032510, build-date=2026-01-12T23:32:04Z, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, version=17.1.13, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_statedir_owner, name=rhosp-rhel9/openstack-nova-compute, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': 'triliovault-mounts', 'TRIPLEO_DEPLOY_IDENTIFIER': '1771832380', '__OS_DEBUG': 'true'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/container-config-scripts:/container-config-scripts:z']}, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, 
org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, org.opencontainers.image.created=2026-01-12T23:32:04Z, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64)
Feb 23 08:04:27 np0005626463.localdomain podman[61143]: 2026-02-23 08:04:27.530416218 +0000 UTC m=+0.387848408 container attach c65bb836e8986e21560c647871dbfc4241d11dbb3f691c6e6e88dd3439c9e7c6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_statedir_owner, distribution-scope=public, build-date=2026-01-12T23:32:04Z, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:32:04Z, container_name=nova_statedir_owner, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, config_data={'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': 'triliovault-mounts', 'TRIPLEO_DEPLOY_IDENTIFIER': '1771832380', '__OS_DEBUG': 'true'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/container-config-scripts:/container-config-scripts:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, config_id=tripleo_step3, managed_by=tripleo_ansible, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., release=1766032510, version=17.1.13, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64)
Feb 23 08:04:27 np0005626463.localdomain systemd[61241]: Finished Create User's Volatile Files and Directories.
Feb 23 08:04:27 np0005626463.localdomain podman[61256]: 2026-02-23 08:04:27.540927476 +0000 UTC m=+0.122158209 container cleanup 6ae4af24cc7f8254adc5ccdc0b5d346c50b5973ee040979ebfa9ae1599478537 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_init_log, release=1766032510, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step3, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, vendor=Red Hat, Inc., architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, version=17.1.13, container_name=ceilometer_init_log, config_data={'command': ['/bin/bash', '-c', 'chown -R ceilometer:ceilometer /var/log/ceilometer'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'none', 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2026-01-12T23:07:30Z, org.opencontainers.image.created=2026-01-12T23:07:30Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, tcib_managed=true, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-ceilometer-ipmi, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 23 08:04:27 np0005626463.localdomain systemd[61241]: Listening on D-Bus User Message Bus Socket.
Feb 23 08:04:27 np0005626463.localdomain systemd[61241]: Reached target Sockets.
Feb 23 08:04:27 np0005626463.localdomain systemd[61241]: Reached target Basic System.
Feb 23 08:04:27 np0005626463.localdomain systemd[61241]: Reached target Main User Target.
Feb 23 08:04:27 np0005626463.localdomain systemd[61241]: Startup finished in 131ms.
Feb 23 08:04:27 np0005626463.localdomain systemd[1]: Started User Manager for UID 0.
Feb 23 08:04:27 np0005626463.localdomain systemd[1]: Started Session c1 of User root.
Feb 23 08:04:27 np0005626463.localdomain systemd[1]: Started Session c2 of User root.
Feb 23 08:04:27 np0005626463.localdomain systemd[1]: libpod-conmon-6ae4af24cc7f8254adc5ccdc0b5d346c50b5973ee040979ebfa9ae1599478537.scope: Deactivated successfully.
Feb 23 08:04:27 np0005626463.localdomain sudo[61208]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Feb 23 08:04:27 np0005626463.localdomain sudo[61223]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Feb 23 08:04:27 np0005626463.localdomain systemd[1]: libpod-c65bb836e8986e21560c647871dbfc4241d11dbb3f691c6e6e88dd3439c9e7c6.scope: Deactivated successfully.
Feb 23 08:04:27 np0005626463.localdomain podman[61143]: 2026-02-23 08:04:27.576465466 +0000 UTC m=+0.433897686 container died c65bb836e8986e21560c647871dbfc4241d11dbb3f691c6e6e88dd3439c9e7c6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_statedir_owner, batch=17.1_20260112.1, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, tcib_managed=true, org.opencontainers.image.created=2026-01-12T23:32:04Z, release=1766032510, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': 'triliovault-mounts', 'TRIPLEO_DEPLOY_IDENTIFIER': '1771832380', '__OS_DEBUG': 'true'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/container-config-scripts:/container-config-scripts:z']}, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, com.redhat.component=openstack-nova-compute-container, name=rhosp-rhel9/openstack-nova-compute, io.buildah.version=1.41.5, distribution-scope=public, version=17.1.13, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, 
build-date=2026-01-12T23:32:04Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_statedir_owner, maintainer=OpenStack TripleO Team, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe)
Feb 23 08:04:27 np0005626463.localdomain podman[61278]: 2026-02-23 08:04:27.622099499 +0000 UTC m=+0.173157514 container cleanup 94a5c22430959b0767f37ebdcadc6a54425aaa14ae15dbfff3d47acdfec747f4 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, summary=Red Hat OpenStack Platform 17.1 rsyslog, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, description=Red Hat OpenStack Platform 17.1 rsyslog, com.redhat.component=openstack-rsyslog-container, org.opencontainers.image.created=2026-01-12T22:10:09Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, name=rhosp-rhel9/openstack-rsyslog, distribution-scope=public, release=1766032510, io.openshift.expose-services=, container_name=rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:10:09Z, version=17.1.13, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, url=https://www.redhat.com, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8e5028e38f7077561ef1e3e50ec174a3'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, config_id=tripleo_step3, vcs-type=git, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog)
Feb 23 08:04:27 np0005626463.localdomain sudo[61208]: pam_unix(sudo:session): session closed for user root
Feb 23 08:04:27 np0005626463.localdomain systemd[1]: libpod-conmon-94a5c22430959b0767f37ebdcadc6a54425aaa14ae15dbfff3d47acdfec747f4.scope: Deactivated successfully.
Feb 23 08:04:27 np0005626463.localdomain systemd[1]: session-c1.scope: Deactivated successfully.
Feb 23 08:04:27 np0005626463.localdomain sudo[61223]: pam_unix(sudo:session): session closed for user root
Feb 23 08:04:27 np0005626463.localdomain systemd[1]: session-c2.scope: Deactivated successfully.
Feb 23 08:04:27 np0005626463.localdomain podman[61225]: 2026-02-23 08:04:27.594186344 +0000 UTC m=+0.236925550 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=starting, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:10:15Z, container_name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, distribution-scope=public, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-collectd-container, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, vendor=Red Hat, Inc., config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, io.openshift.expose-services=, vcs-type=git, description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, io.buildah.version=1.41.5, batch=17.1_20260112.1, build-date=2026-01-12T22:10:15Z, architecture=x86_64)
Feb 23 08:04:27 np0005626463.localdomain podman[61225]: 2026-02-23 08:04:27.677154434 +0000 UTC m=+0.319893620 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, build-date=2026-01-12T22:10:15Z, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:10:15Z, description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20260112.1, com.redhat.component=openstack-collectd-container, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, config_id=tripleo_step3, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, io.openshift.expose-services=, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, tcib_managed=true, vcs-type=git, io.buildah.version=1.41.5, managed_by=tripleo_ansible, distribution-scope=public, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']})
Feb 23 08:04:27 np0005626463.localdomain podman[61225]: unhealthy
Feb 23 08:04:27 np0005626463.localdomain systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Main process exited, code=exited, status=1/FAILURE
Feb 23 08:04:27 np0005626463.localdomain systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Failed with result 'exit-code'.
Feb 23 08:04:27 np0005626463.localdomain podman[61363]: 2026-02-23 08:04:27.761150198 +0000 UTC m=+0.169936850 container cleanup c65bb836e8986e21560c647871dbfc4241d11dbb3f691c6e6e88dd3439c9e7c6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_statedir_owner, url=https://www.redhat.com, version=17.1.13, config_id=tripleo_step3, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, build-date=2026-01-12T23:32:04Z, container_name=nova_statedir_owner, config_data={'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': 'triliovault-mounts', 'TRIPLEO_DEPLOY_IDENTIFIER': '1771832380', '__OS_DEBUG': 'true'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/container-config-scripts:/container-config-scripts:z']}, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., batch=17.1_20260112.1, release=1766032510, org.opencontainers.image.created=2026-01-12T23:32:04Z, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp-rhel9/openstack-nova-compute, 
description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 23 08:04:27 np0005626463.localdomain systemd[1]: libpod-conmon-c65bb836e8986e21560c647871dbfc4241d11dbb3f691c6e6e88dd3439c9e7c6.scope: Deactivated successfully.
Feb 23 08:04:27 np0005626463.localdomain python3[60956]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_statedir_owner --conmon-pidfile /run/nova_statedir_owner.pid --detach=False --env NOVA_STATEDIR_OWNERSHIP_SKIP=triliovault-mounts --env TRIPLEO_DEPLOY_IDENTIFIER=1771832380 --env __OS_DEBUG=true --label config_id=tripleo_step3 --label container_name=nova_statedir_owner --label managed_by=tripleo_ansible --label config_data={'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': 'triliovault-mounts', 'TRIPLEO_DEPLOY_IDENTIFIER': '1771832380', '__OS_DEBUG': 'true'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/container-config-scripts:/container-config-scripts:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_statedir_owner.log --network none --privileged=False --security-opt label=disable --user root --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z --volume /var/lib/container-config-scripts:/container-config-scripts:z registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 /container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py
Feb 23 08:04:27 np0005626463.localdomain podman[61473]: 2026-02-23 08:04:27.965315365 +0000 UTC m=+0.076738451 container create 3df457bc42eb89aa6185b88a77d4c5934a55cee63ff7da200319cff5193cbe74 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, tcib_managed=true, vendor=Red Hat, Inc., batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:31:49Z, release=1766032510, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp-rhel9/openstack-nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, build-date=2026-01-12T23:31:49Z, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 nova-libvirt, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 23 08:04:28 np0005626463.localdomain systemd[1]: Started libpod-conmon-3df457bc42eb89aa6185b88a77d4c5934a55cee63ff7da200319cff5193cbe74.scope.
Feb 23 08:04:28 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 08:04:28 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0f5369acf1913ef4c00375204a8528c500963efed1d6cd27d7b10a2d16e203b5/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Feb 23 08:04:28 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0f5369acf1913ef4c00375204a8528c500963efed1d6cd27d7b10a2d16e203b5/merged/var/log/libvirt supports timestamps until 2038 (0x7fffffff)
Feb 23 08:04:28 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0f5369acf1913ef4c00375204a8528c500963efed1d6cd27d7b10a2d16e203b5/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Feb 23 08:04:28 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0f5369acf1913ef4c00375204a8528c500963efed1d6cd27d7b10a2d16e203b5/merged/var/log/swtpm/libvirt supports timestamps until 2038 (0x7fffffff)
Feb 23 08:04:28 np0005626463.localdomain podman[61473]: 2026-02-23 08:04:27.93428612 +0000 UTC m=+0.045709246 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Feb 23 08:04:28 np0005626463.localdomain podman[61473]: 2026-02-23 08:04:28.034134573 +0000 UTC m=+0.145557739 container init 3df457bc42eb89aa6185b88a77d4c5934a55cee63ff7da200319cff5193cbe74 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2026-01-12T23:31:49Z, architecture=x86_64, io.buildah.version=1.41.5, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.created=2026-01-12T23:31:49Z, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, release=1766032510, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, distribution-scope=public, name=rhosp-rhel9/openstack-nova-libvirt, vendor=Red Hat, Inc.)
Feb 23 08:04:28 np0005626463.localdomain podman[61473]: 2026-02-23 08:04:28.040495736 +0000 UTC m=+0.151918822 container start 3df457bc42eb89aa6185b88a77d4c5934a55cee63ff7da200319cff5193cbe74 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd, version=17.1.13, build-date=2026-01-12T23:31:49Z, io.openshift.expose-services=, name=rhosp-rhel9/openstack-nova-libvirt, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, release=1766032510, com.redhat.component=openstack-nova-libvirt-container, architecture=x86_64, vendor=Red Hat, Inc., url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.created=2026-01-12T23:31:49Z, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, batch=17.1_20260112.1, distribution-scope=public)
Feb 23 08:04:28 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-abedacbc22b2fa6d036e7ebf1118c866641128a14eaf11021e9356d60564993d-merged.mount: Deactivated successfully.
Feb 23 08:04:28 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-94a5c22430959b0767f37ebdcadc6a54425aaa14ae15dbfff3d47acdfec747f4-userdata-shm.mount: Deactivated successfully.
Feb 23 08:04:28 np0005626463.localdomain podman[61536]: 2026-02-23 08:04:28.237567637 +0000 UTC m=+0.087666583 container create c5df1accada064756d125bdbb5695a73715fe028f05153add7eb5ec7749f050f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtsecretd, io.openshift.expose-services=, distribution-scope=public, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible, version=17.1.13, architecture=x86_64, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 1, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', 
'/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro']}, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:31:49Z, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2026-01-12T23:31:49Z, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-libvirt-container, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=nova_virtsecretd, config_id=tripleo_step3, name=rhosp-rhel9/openstack-nova-libvirt, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, url=https://www.redhat.com, io.buildah.version=1.41.5, tcib_managed=true, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe)
Feb 23 08:04:28 np0005626463.localdomain podman[61536]: 2026-02-23 08:04:28.191152968 +0000 UTC m=+0.041251934 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Feb 23 08:04:28 np0005626463.localdomain systemd[1]: Started libpod-conmon-c5df1accada064756d125bdbb5695a73715fe028f05153add7eb5ec7749f050f.scope.
Feb 23 08:04:28 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 08:04:28 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fda7ddd4426914a36b65b3677210da7055750d28e58e5eb1d0839c5cab6710a1/merged/etc/libvirt supports timestamps until 2038 (0x7fffffff)
Feb 23 08:04:28 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fda7ddd4426914a36b65b3677210da7055750d28e58e5eb1d0839c5cab6710a1/merged/var/lib/vhost_sockets supports timestamps until 2038 (0x7fffffff)
Feb 23 08:04:28 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fda7ddd4426914a36b65b3677210da7055750d28e58e5eb1d0839c5cab6710a1/merged/var/cache/libvirt supports timestamps until 2038 (0x7fffffff)
Feb 23 08:04:28 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fda7ddd4426914a36b65b3677210da7055750d28e58e5eb1d0839c5cab6710a1/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Feb 23 08:04:28 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fda7ddd4426914a36b65b3677210da7055750d28e58e5eb1d0839c5cab6710a1/merged/var/log/libvirt supports timestamps until 2038 (0x7fffffff)
Feb 23 08:04:28 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fda7ddd4426914a36b65b3677210da7055750d28e58e5eb1d0839c5cab6710a1/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Feb 23 08:04:28 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fda7ddd4426914a36b65b3677210da7055750d28e58e5eb1d0839c5cab6710a1/merged/var/lib/kolla/config_files/src-ceph supports timestamps until 2038 (0x7fffffff)
Feb 23 08:04:28 np0005626463.localdomain podman[61536]: 2026-02-23 08:04:28.338427991 +0000 UTC m=+0.188526927 container init c5df1accada064756d125bdbb5695a73715fe028f05153add7eb5ec7749f050f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtsecretd, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp-rhel9/openstack-nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, architecture=x86_64, build-date=2026-01-12T23:31:49Z, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 1, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', 
'/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro']}, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:31:49Z, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, distribution-scope=public, io.buildah.version=1.41.5, vendor=Red Hat, Inc., org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, tcib_managed=true, release=1766032510, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, container_name=nova_virtsecretd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt)
Feb 23 08:04:28 np0005626463.localdomain podman[61536]: 2026-02-23 08:04:28.350409205 +0000 UTC m=+0.200508141 container start c5df1accada064756d125bdbb5695a73715fe028f05153add7eb5ec7749f050f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtsecretd, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, maintainer=OpenStack TripleO Team, version=17.1.13, distribution-scope=public, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:31:49Z, container_name=nova_virtsecretd, build-date=2026-01-12T23:31:49Z, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, release=1766032510, io.openshift.expose-services=, architecture=x86_64, managed_by=tripleo_ansible, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 
'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 1, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro']})
Feb 23 08:04:28 np0005626463.localdomain python3[60956]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_virtsecretd --cgroupns=host --conmon-pidfile /run/nova_virtsecretd.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=b5f04eda8e5f004a5ff6ec948b25cc1e --label config_id=tripleo_step3 --label container_name=nova_virtsecretd --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 1, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_virtsecretd.log --network host --pid host --pids-limit 65536 --privileged=True --security-opt label=level:s0 --security-opt label=type:spc_t --security-opt label=filetype:container_file_t --ulimit nofile=131072 --ulimit nproc=126960 --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/log/containers/libvirt:/var/log/libvirt:shared,z --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /run:/run --volume /sys/fs/cgroup:/sys/fs/cgroup --volume /sys/fs/selinux:/sys/fs/selinux --volume /etc/selinux/config:/etc/selinux/config:ro --volume /etc/libvirt:/etc/libvirt:shared --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro --volume /run/libvirt:/run/libvirt:shared,z --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/libvirt:/var/lib/libvirt:shared --volume /var/cache/libvirt:/var/cache/libvirt:shared --volume /var/lib/vhost_sockets:/var/lib/vhost_sockets --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z --volume /var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro 
registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Feb 23 08:04:28 np0005626463.localdomain sudo[61555]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Feb 23 08:04:28 np0005626463.localdomain systemd-logind[759]: Existing logind session ID 28 used by new audit session, ignoring.
Feb 23 08:04:28 np0005626463.localdomain systemd[1]: Started Session c3 of User root.
Feb 23 08:04:28 np0005626463.localdomain sudo[61555]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Feb 23 08:04:28 np0005626463.localdomain sudo[61555]: pam_unix(sudo:session): session closed for user root
Feb 23 08:04:28 np0005626463.localdomain systemd[1]: session-c3.scope: Deactivated successfully.
Feb 23 08:04:28 np0005626463.localdomain podman[61676]: 2026-02-23 08:04:28.83016426 +0000 UTC m=+0.073164787 container create 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, com.redhat.component=openstack-iscsid-container, url=https://www.redhat.com, batch=17.1_20260112.1, container_name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, version=17.1.13, architecture=x86_64, build-date=2026-01-12T22:34:43Z, vcs-type=git, config_id=tripleo_step3, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-iscsid, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, distribution-scope=public)
Feb 23 08:04:28 np0005626463.localdomain podman[61677]: 2026-02-23 08:04:28.858432397 +0000 UTC m=+0.097323942 container create 930c6611f5962d33efb440404474ddbbb597649f3d0829a0c1f4e90bb33c2ea2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtnodedevd, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, io.openshift.expose-services=, release=1766032510, org.opencontainers.image.created=2026-01-12T23:31:49Z, build-date=2026-01-12T23:31:49Z, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, io.buildah.version=1.41.5, vcs-type=git, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 2, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro']}, managed_by=tripleo_ansible, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team, distribution-scope=public, tcib_managed=true, name=rhosp-rhel9/openstack-nova-libvirt, container_name=nova_virtnodedevd, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-nova-libvirt-container)
Feb 23 08:04:28 np0005626463.localdomain systemd[1]: Started libpod-conmon-40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.scope.
Feb 23 08:04:28 np0005626463.localdomain systemd[1]: Started libpod-conmon-930c6611f5962d33efb440404474ddbbb597649f3d0829a0c1f4e90bb33c2ea2.scope.
Feb 23 08:04:28 np0005626463.localdomain podman[61676]: 2026-02-23 08:04:28.794078063 +0000 UTC m=+0.037078550 image pull  registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1
Feb 23 08:04:28 np0005626463.localdomain podman[61677]: 2026-02-23 08:04:28.795723196 +0000 UTC m=+0.034614791 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Feb 23 08:04:28 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 08:04:28 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 08:04:28 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/52b3b14c7b87d61fbd3bfa894ff158a1c8322ab7dde44afc684a91162f67f067/merged/etc/target supports timestamps until 2038 (0x7fffffff)
Feb 23 08:04:28 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/52b3b14c7b87d61fbd3bfa894ff158a1c8322ab7dde44afc684a91162f67f067/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Feb 23 08:04:28 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/49d558d6cd3227e44d6cf362abc8b50d968f0fb79b74496dd7e1499d728668a7/merged/etc/libvirt supports timestamps until 2038 (0x7fffffff)
Feb 23 08:04:28 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/49d558d6cd3227e44d6cf362abc8b50d968f0fb79b74496dd7e1499d728668a7/merged/var/cache/libvirt supports timestamps until 2038 (0x7fffffff)
Feb 23 08:04:28 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/49d558d6cd3227e44d6cf362abc8b50d968f0fb79b74496dd7e1499d728668a7/merged/var/log/libvirt supports timestamps until 2038 (0x7fffffff)
Feb 23 08:04:28 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/49d558d6cd3227e44d6cf362abc8b50d968f0fb79b74496dd7e1499d728668a7/merged/var/lib/vhost_sockets supports timestamps until 2038 (0x7fffffff)
Feb 23 08:04:28 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/49d558d6cd3227e44d6cf362abc8b50d968f0fb79b74496dd7e1499d728668a7/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Feb 23 08:04:28 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/49d558d6cd3227e44d6cf362abc8b50d968f0fb79b74496dd7e1499d728668a7/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Feb 23 08:04:28 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/49d558d6cd3227e44d6cf362abc8b50d968f0fb79b74496dd7e1499d728668a7/merged/var/lib/kolla/config_files/src-ceph supports timestamps until 2038 (0x7fffffff)
Feb 23 08:04:28 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.
Feb 23 08:04:28 np0005626463.localdomain podman[61676]: 2026-02-23 08:04:28.934148415 +0000 UTC m=+0.177148982 container init 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vcs-type=git, distribution-scope=public, url=https://www.redhat.com, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, container_name=iscsid, com.redhat.component=openstack-iscsid-container, build-date=2026-01-12T22:34:43Z, version=17.1.13, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, vcs-ref=705339545363fec600102567c4e923938e0f43b3, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:34:43Z, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, vendor=Red Hat, Inc., batch=17.1_20260112.1)
Feb 23 08:04:28 np0005626463.localdomain sudo[61715]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Feb 23 08:04:28 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.
Feb 23 08:04:28 np0005626463.localdomain podman[61677]: 2026-02-23 08:04:28.966627067 +0000 UTC m=+0.205518612 container init 930c6611f5962d33efb440404474ddbbb597649f3d0829a0c1f4e90bb33c2ea2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtnodedevd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, io.openshift.expose-services=, name=rhosp-rhel9/openstack-nova-libvirt, build-date=2026-01-12T23:31:49Z, architecture=x86_64, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:31:49Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 2, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', 
'/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro']}, vendor=Red Hat, Inc., managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, version=17.1.13, com.redhat.component=openstack-nova-libvirt-container, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, container_name=nova_virtnodedevd, description=Red Hat OpenStack Platform 17.1 nova-libvirt, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, config_id=tripleo_step3)
Feb 23 08:04:28 np0005626463.localdomain systemd-logind[759]: Existing logind session ID 28 used by new audit session, ignoring.
Feb 23 08:04:28 np0005626463.localdomain podman[61676]: 2026-02-23 08:04:28.974191819 +0000 UTC m=+0.217192336 container start 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, config_id=tripleo_step3, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-iscsid-container, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2026-01-12T22:34:43Z, batch=17.1_20260112.1, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, container_name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.openshift.expose-services=, architecture=x86_64, vcs-type=git, vcs-ref=705339545363fec600102567c4e923938e0f43b3, distribution-scope=public, name=rhosp-rhel9/openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:34:43Z, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 23 08:04:28 np0005626463.localdomain systemd[1]: Started Session c4 of User root.
Feb 23 08:04:28 np0005626463.localdomain podman[61677]: 2026-02-23 08:04:28.979774778 +0000 UTC m=+0.218666313 container start 930c6611f5962d33efb440404474ddbbb597649f3d0829a0c1f4e90bb33c2ea2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtnodedevd, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:31:49Z, release=1766032510, description=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.created=2026-01-12T23:31:49Z, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-nova-libvirt, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 2, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro']}, config_id=tripleo_step3, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, maintainer=OpenStack TripleO Team, container_name=nova_virtnodedevd, architecture=x86_64, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, com.redhat.component=openstack-nova-libvirt-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, url=https://www.redhat.com)
Feb 23 08:04:28 np0005626463.localdomain python3[60956]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name iscsid --conmon-pidfile /run/iscsid.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=45772c82d00b8348e0440509154d74a9 --healthcheck-command /openstack/healthcheck --label config_id=tripleo_step3 --label container_name=iscsid --label managed_by=tripleo_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/iscsid.log --network host --privileged=True --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume 
/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro --volume /dev:/dev --volume /run:/run --volume /sys:/sys --volume /lib/modules:/lib/modules:ro --volume /var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro --volume /etc/target:/etc/target:z --volume /var/lib/iscsi:/var/lib/iscsi:z registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1
Feb 23 08:04:28 np0005626463.localdomain python3[60956]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_virtnodedevd --cgroupns=host --conmon-pidfile /run/nova_virtnodedevd.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=b5f04eda8e5f004a5ff6ec948b25cc1e --label config_id=tripleo_step3 --label container_name=nova_virtnodedevd --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 2, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_virtnodedevd.log --network host --pid host --pids-limit 65536 --privileged=True --security-opt label=level:s0 --security-opt label=type:spc_t --security-opt label=filetype:container_file_t --ulimit nofile=131072 --ulimit nproc=126960 --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/log/containers/libvirt:/var/log/libvirt:shared,z --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /run:/run --volume /sys/fs/cgroup:/sys/fs/cgroup --volume /sys/fs/selinux:/sys/fs/selinux --volume /etc/selinux/config:/etc/selinux/config:ro --volume /etc/libvirt:/etc/libvirt:shared --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro --volume /run/libvirt:/run/libvirt:shared,z --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/libvirt:/var/lib/libvirt:shared --volume /var/cache/libvirt:/var/cache/libvirt:shared --volume /var/lib/vhost_sockets:/var/lib/vhost_sockets --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z --volume /var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro 
registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Feb 23 08:04:28 np0005626463.localdomain sudo[61715]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Feb 23 08:04:29 np0005626463.localdomain sudo[61724]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Feb 23 08:04:29 np0005626463.localdomain systemd-logind[759]: Existing logind session ID 28 used by new audit session, ignoring.
Feb 23 08:04:29 np0005626463.localdomain systemd[1]: Started Session c5 of User root.
Feb 23 08:04:29 np0005626463.localdomain sudo[61724]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Feb 23 08:04:29 np0005626463.localdomain sudo[61715]: pam_unix(sudo:session): session closed for user root
Feb 23 08:04:29 np0005626463.localdomain systemd[1]: session-c4.scope: Deactivated successfully.
Feb 23 08:04:29 np0005626463.localdomain podman[61716]: 2026-02-23 08:04:29.066316663 +0000 UTC m=+0.088535379 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=starting, com.redhat.component=openstack-iscsid-container, managed_by=tripleo_ansible, version=17.1.13, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-iscsid, tcib_managed=true, build-date=2026-01-12T22:34:43Z, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, container_name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, 
distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:34:43Z, url=https://www.redhat.com, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3)
Feb 23 08:04:29 np0005626463.localdomain kernel: Loading iSCSI transport class v2.0-870.
Feb 23 08:04:29 np0005626463.localdomain sudo[61724]: pam_unix(sudo:session): session closed for user root
Feb 23 08:04:29 np0005626463.localdomain systemd[1]: session-c5.scope: Deactivated successfully.
Feb 23 08:04:29 np0005626463.localdomain podman[61716]: 2026-02-23 08:04:29.153773499 +0000 UTC m=+0.175992215 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, version=17.1.13, description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, build-date=2026-01-12T22:34:43Z, io.buildah.version=1.41.5, config_id=tripleo_step3, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, name=rhosp-rhel9/openstack-iscsid, managed_by=tripleo_ansible, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, architecture=x86_64, container_name=iscsid, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, com.redhat.component=openstack-iscsid-container, batch=17.1_20260112.1)
Feb 23 08:04:29 np0005626463.localdomain systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully.
Feb 23 08:04:29 np0005626463.localdomain podman[61859]: 2026-02-23 08:04:29.566829715 +0000 UTC m=+0.087585260 container create 5312b8f35cc3bb510b53656211ca10c6fa5b207da4896c4ba9d4afdec879f843 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtstoraged, vcs-type=git, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, version=17.1.13, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp-rhel9/openstack-nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, release=1766032510, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=nova_virtstoraged, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:31:49Z, vendor=Red Hat, Inc., build-date=2026-01-12T23:31:49Z, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 3, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, com.redhat.component=openstack-nova-libvirt-container)
Feb 23 08:04:29 np0005626463.localdomain podman[61859]: 2026-02-23 08:04:29.517855175 +0000 UTC m=+0.038610730 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Feb 23 08:04:29 np0005626463.localdomain systemd[1]: Started libpod-conmon-5312b8f35cc3bb510b53656211ca10c6fa5b207da4896c4ba9d4afdec879f843.scope.
Feb 23 08:04:29 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 08:04:29 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/34d62c030d25095ae1697db07157c262435d04349696135717e45f6132a7e460/merged/etc/libvirt supports timestamps until 2038 (0x7fffffff)
Feb 23 08:04:29 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/34d62c030d25095ae1697db07157c262435d04349696135717e45f6132a7e460/merged/var/log/libvirt supports timestamps until 2038 (0x7fffffff)
Feb 23 08:04:29 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/34d62c030d25095ae1697db07157c262435d04349696135717e45f6132a7e460/merged/var/lib/vhost_sockets supports timestamps until 2038 (0x7fffffff)
Feb 23 08:04:29 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/34d62c030d25095ae1697db07157c262435d04349696135717e45f6132a7e460/merged/var/cache/libvirt supports timestamps until 2038 (0x7fffffff)
Feb 23 08:04:29 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/34d62c030d25095ae1697db07157c262435d04349696135717e45f6132a7e460/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Feb 23 08:04:29 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/34d62c030d25095ae1697db07157c262435d04349696135717e45f6132a7e460/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Feb 23 08:04:29 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/34d62c030d25095ae1697db07157c262435d04349696135717e45f6132a7e460/merged/var/lib/kolla/config_files/src-ceph supports timestamps until 2038 (0x7fffffff)
Feb 23 08:04:29 np0005626463.localdomain podman[61859]: 2026-02-23 08:04:29.649212766 +0000 UTC m=+0.169968311 container init 5312b8f35cc3bb510b53656211ca10c6fa5b207da4896c4ba9d4afdec879f843 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtstoraged, vendor=Red Hat, Inc., vcs-type=git, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, io.buildah.version=1.41.5, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.component=openstack-nova-libvirt-container, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, container_name=nova_virtstoraged, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 3, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro']}, architecture=x86_64, distribution-scope=public, org.opencontainers.image.created=2026-01-12T23:31:49Z, version=17.1.13, config_id=tripleo_step3, name=rhosp-rhel9/openstack-nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, build-date=2026-01-12T23:31:49Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt)
Feb 23 08:04:29 np0005626463.localdomain podman[61859]: 2026-02-23 08:04:29.658503295 +0000 UTC m=+0.179258840 container start 5312b8f35cc3bb510b53656211ca10c6fa5b207da4896c4ba9d4afdec879f843 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtstoraged, config_id=tripleo_step3, org.opencontainers.image.created=2026-01-12T23:31:49Z, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 3, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro']}, name=rhosp-rhel9/openstack-nova-libvirt, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, version=17.1.13, com.redhat.component=openstack-nova-libvirt-container, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, tcib_managed=true, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, container_name=nova_virtstoraged, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2026-01-12T23:31:49Z, release=1766032510, io.openshift.expose-services=)
Feb 23 08:04:29 np0005626463.localdomain python3[60956]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_virtstoraged --cgroupns=host --conmon-pidfile /run/nova_virtstoraged.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=b5f04eda8e5f004a5ff6ec948b25cc1e --label config_id=tripleo_step3 --label container_name=nova_virtstoraged --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 3, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_virtstoraged.log --network host --pid host --pids-limit 65536 --privileged=True --security-opt label=level:s0 --security-opt label=type:spc_t --security-opt label=filetype:container_file_t --ulimit nofile=131072 --ulimit nproc=126960 --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/log/containers/libvirt:/var/log/libvirt:shared,z --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /run:/run --volume /sys/fs/cgroup:/sys/fs/cgroup --volume /sys/fs/selinux:/sys/fs/selinux --volume /etc/selinux/config:/etc/selinux/config:ro --volume /etc/libvirt:/etc/libvirt:shared --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro --volume /run/libvirt:/run/libvirt:shared,z --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/libvirt:/var/lib/libvirt:shared --volume /var/cache/libvirt:/var/cache/libvirt:shared --volume /var/lib/vhost_sockets:/var/lib/vhost_sockets --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z --volume /var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro 
registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Feb 23 08:04:29 np0005626463.localdomain sudo[61878]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Feb 23 08:04:29 np0005626463.localdomain systemd-logind[759]: Existing logind session ID 28 used by new audit session, ignoring.
Feb 23 08:04:29 np0005626463.localdomain systemd[1]: Started Session c6 of User root.
Feb 23 08:04:29 np0005626463.localdomain sudo[61878]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Feb 23 08:04:29 np0005626463.localdomain sudo[61878]: pam_unix(sudo:session): session closed for user root
Feb 23 08:04:29 np0005626463.localdomain systemd[1]: session-c6.scope: Deactivated successfully.
Feb 23 08:04:30 np0005626463.localdomain podman[61963]: 2026-02-23 08:04:30.126584146 +0000 UTC m=+0.083417057 container create ac0f39392edf3bb25f5f81a87a3332c629d29b63674f9d0e6bbbde82fcbac468 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, description=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, vendor=Red Hat, Inc., vcs-type=git, build-date=2026-01-12T23:31:49Z, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, config_id=tripleo_step3, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', 
'/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', '/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']}, architecture=x86_64, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, release=1766032510, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-libvirt, distribution-scope=public, com.redhat.component=openstack-nova-libvirt-container, container_name=nova_virtqemud, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T23:31:49Z)
Feb 23 08:04:30 np0005626463.localdomain systemd[1]: Started libpod-conmon-ac0f39392edf3bb25f5f81a87a3332c629d29b63674f9d0e6bbbde82fcbac468.scope.
Feb 23 08:04:30 np0005626463.localdomain podman[61963]: 2026-02-23 08:04:30.082637176 +0000 UTC m=+0.039470097 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Feb 23 08:04:30 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 08:04:30 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7e96575c95037d7184ed74bc1e793aa507f3bf187550011550ade6ddf34aa4ff/merged/etc/libvirt supports timestamps until 2038 (0x7fffffff)
Feb 23 08:04:30 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7e96575c95037d7184ed74bc1e793aa507f3bf187550011550ade6ddf34aa4ff/merged/var/lib/vhost_sockets supports timestamps until 2038 (0x7fffffff)
Feb 23 08:04:30 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7e96575c95037d7184ed74bc1e793aa507f3bf187550011550ade6ddf34aa4ff/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Feb 23 08:04:30 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7e96575c95037d7184ed74bc1e793aa507f3bf187550011550ade6ddf34aa4ff/merged/var/log/swtpm supports timestamps until 2038 (0x7fffffff)
Feb 23 08:04:30 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7e96575c95037d7184ed74bc1e793aa507f3bf187550011550ade6ddf34aa4ff/merged/var/log/libvirt supports timestamps until 2038 (0x7fffffff)
Feb 23 08:04:30 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7e96575c95037d7184ed74bc1e793aa507f3bf187550011550ade6ddf34aa4ff/merged/var/cache/libvirt supports timestamps until 2038 (0x7fffffff)
Feb 23 08:04:30 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7e96575c95037d7184ed74bc1e793aa507f3bf187550011550ade6ddf34aa4ff/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Feb 23 08:04:30 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7e96575c95037d7184ed74bc1e793aa507f3bf187550011550ade6ddf34aa4ff/merged/var/lib/kolla/config_files/src-ceph supports timestamps until 2038 (0x7fffffff)
Feb 23 08:04:30 np0005626463.localdomain podman[61963]: 2026-02-23 08:04:30.197692366 +0000 UTC m=+0.154525277 container init ac0f39392edf3bb25f5f81a87a3332c629d29b63674f9d0e6bbbde82fcbac468 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud, com.redhat.component=openstack-nova-libvirt-container, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 nova-libvirt, distribution-scope=public, org.opencontainers.image.created=2026-01-12T23:31:49Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2026-01-12T23:31:49Z, io.buildah.version=1.41.5, container_name=nova_virtqemud, url=https://www.redhat.com, config_id=tripleo_step3, name=rhosp-rhel9/openstack-nova-libvirt, managed_by=tripleo_ansible, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, vcs-type=git, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': 
['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', '/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']}, release=1766032510, io.openshift.expose-services=, tcib_managed=true)
Feb 23 08:04:30 np0005626463.localdomain podman[61963]: 2026-02-23 08:04:30.2099671 +0000 UTC m=+0.166800021 container start ac0f39392edf3bb25f5f81a87a3332c629d29b63674f9d0e6bbbde82fcbac468 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, vendor=Red Hat, Inc., config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', '/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']}, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, config_id=tripleo_step3, distribution-scope=public, org.opencontainers.image.created=2026-01-12T23:31:49Z, build-date=2026-01-12T23:31:49Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp-rhel9/openstack-nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, version=17.1.13, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, container_name=nova_virtqemud, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, managed_by=tripleo_ansible, architecture=x86_64, com.redhat.component=openstack-nova-libvirt-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true)
Feb 23 08:04:30 np0005626463.localdomain python3[60956]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_virtqemud --cgroupns=host --conmon-pidfile /run/nova_virtqemud.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=b5f04eda8e5f004a5ff6ec948b25cc1e --label config_id=tripleo_step3 --label container_name=nova_virtqemud --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', '/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_virtqemud.log --network host --pid host --pids-limit 65536 --privileged=True --security-opt label=level:s0 --security-opt label=type:spc_t --security-opt label=filetype:container_file_t --ulimit nofile=131072 --ulimit nproc=126960 --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/log/containers/libvirt:/var/log/libvirt:shared,z --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /run:/run --volume /sys/fs/cgroup:/sys/fs/cgroup --volume /sys/fs/selinux:/sys/fs/selinux --volume /etc/selinux/config:/etc/selinux/config:ro --volume /etc/libvirt:/etc/libvirt:shared --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro --volume /run/libvirt:/run/libvirt:shared,z --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/libvirt:/var/lib/libvirt:shared --volume /var/cache/libvirt:/var/cache/libvirt:shared --volume /var/lib/vhost_sockets:/var/lib/vhost_sockets --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z --volume 
/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro --volume /var/log/containers/libvirt/swtpm:/var/log/swtpm:z registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Feb 23 08:04:30 np0005626463.localdomain sudo[61983]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Feb 23 08:04:30 np0005626463.localdomain systemd-logind[759]: Existing logind session ID 28 used by new audit session, ignoring.
Feb 23 08:04:30 np0005626463.localdomain systemd[1]: Started Session c7 of User root.
Feb 23 08:04:30 np0005626463.localdomain sudo[61983]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Feb 23 08:04:30 np0005626463.localdomain sudo[61983]: pam_unix(sudo:session): session closed for user root
Feb 23 08:04:30 np0005626463.localdomain systemd[1]: session-c7.scope: Deactivated successfully.
Feb 23 08:04:30 np0005626463.localdomain podman[62068]: 2026-02-23 08:04:30.66399139 +0000 UTC m=+0.082519438 container create 33b24c0b43815062cb55aef2cae21d9e0731c73ede479175c35b6a75b4421d9a (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtproxyd, config_id=tripleo_step3, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, url=https://www.redhat.com, container_name=nova_virtproxyd, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T23:31:49Z, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, batch=17.1_20260112.1, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-nova-libvirt, io.buildah.version=1.41.5, architecture=x86_64, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-libvirt-container, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 5, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro']}, build-date=2026-01-12T23:31:49Z, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, release=1766032510, io.openshift.expose-services=, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-libvirt)
Feb 23 08:04:30 np0005626463.localdomain systemd[1]: Started libpod-conmon-33b24c0b43815062cb55aef2cae21d9e0731c73ede479175c35b6a75b4421d9a.scope.
Feb 23 08:04:30 np0005626463.localdomain podman[62068]: 2026-02-23 08:04:30.618427558 +0000 UTC m=+0.036955586 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Feb 23 08:04:30 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 08:04:30 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2be3d0bba76fb52fbeba06c336dea0a1698df79193676f245ce702f60a0a9fa3/merged/etc/libvirt supports timestamps until 2038 (0x7fffffff)
Feb 23 08:04:30 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2be3d0bba76fb52fbeba06c336dea0a1698df79193676f245ce702f60a0a9fa3/merged/var/cache/libvirt supports timestamps until 2038 (0x7fffffff)
Feb 23 08:04:30 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2be3d0bba76fb52fbeba06c336dea0a1698df79193676f245ce702f60a0a9fa3/merged/var/log/libvirt supports timestamps until 2038 (0x7fffffff)
Feb 23 08:04:30 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2be3d0bba76fb52fbeba06c336dea0a1698df79193676f245ce702f60a0a9fa3/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Feb 23 08:04:30 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2be3d0bba76fb52fbeba06c336dea0a1698df79193676f245ce702f60a0a9fa3/merged/var/lib/vhost_sockets supports timestamps until 2038 (0x7fffffff)
Feb 23 08:04:30 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2be3d0bba76fb52fbeba06c336dea0a1698df79193676f245ce702f60a0a9fa3/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Feb 23 08:04:30 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2be3d0bba76fb52fbeba06c336dea0a1698df79193676f245ce702f60a0a9fa3/merged/var/lib/kolla/config_files/src-ceph supports timestamps until 2038 (0x7fffffff)
Feb 23 08:04:30 np0005626463.localdomain podman[62068]: 2026-02-23 08:04:30.741161994 +0000 UTC m=+0.159689992 container init 33b24c0b43815062cb55aef2cae21d9e0731c73ede479175c35b6a75b4421d9a (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtproxyd, tcib_managed=true, com.redhat.component=openstack-nova-libvirt-container, container_name=nova_virtproxyd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, org.opencontainers.image.created=2026-01-12T23:31:49Z, io.buildah.version=1.41.5, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-nova-libvirt, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible, version=17.1.13, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 5, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro']}, vcs-type=git, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, build-date=2026-01-12T23:31:49Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt)
Feb 23 08:04:30 np0005626463.localdomain podman[62068]: 2026-02-23 08:04:30.750954709 +0000 UTC m=+0.169482717 container start 33b24c0b43815062cb55aef2cae21d9e0731c73ede479175c35b6a75b4421d9a (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtproxyd, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 5, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', 
'/var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro']}, name=rhosp-rhel9/openstack-nova-libvirt, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, com.redhat.component=openstack-nova-libvirt-container, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, batch=17.1_20260112.1, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.buildah.version=1.41.5, build-date=2026-01-12T23:31:49Z, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-libvirt, release=1766032510, vendor=Red Hat, Inc., tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=nova_virtproxyd, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T23:31:49Z, version=17.1.13)
Feb 23 08:04:30 np0005626463.localdomain python3[60956]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_virtproxyd --cgroupns=host --conmon-pidfile /run/nova_virtproxyd.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=b5f04eda8e5f004a5ff6ec948b25cc1e --label config_id=tripleo_step3 --label container_name=nova_virtproxyd --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 5, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_virtproxyd.log --network host --pid host --pids-limit 65536 --privileged=True --security-opt label=level:s0 --security-opt label=type:spc_t --security-opt label=filetype:container_file_t --ulimit nofile=131072 --ulimit nproc=126960 --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/log/containers/libvirt:/var/log/libvirt:shared,z --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /run:/run --volume /sys/fs/cgroup:/sys/fs/cgroup --volume /sys/fs/selinux:/sys/fs/selinux --volume /etc/selinux/config:/etc/selinux/config:ro --volume /etc/libvirt:/etc/libvirt:shared --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro --volume /run/libvirt:/run/libvirt:shared,z --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/libvirt:/var/lib/libvirt:shared --volume /var/cache/libvirt:/var/cache/libvirt:shared --volume /var/lib/vhost_sockets:/var/lib/vhost_sockets --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z --volume /var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro 
registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Feb 23 08:04:30 np0005626463.localdomain sudo[62088]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Feb 23 08:04:30 np0005626463.localdomain systemd-logind[759]: Existing logind session ID 28 used by new audit session, ignoring.
Feb 23 08:04:30 np0005626463.localdomain systemd[1]: Started Session c8 of User root.
Feb 23 08:04:30 np0005626463.localdomain sudo[62088]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Feb 23 08:04:30 np0005626463.localdomain sudo[62088]: pam_unix(sudo:session): session closed for user root
Feb 23 08:04:30 np0005626463.localdomain systemd[1]: session-c8.scope: Deactivated successfully.
Feb 23 08:04:30 np0005626463.localdomain sudo[60954]: pam_unix(sudo:session): session closed for user root
Feb 23 08:04:31 np0005626463.localdomain sudo[62148]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rakxctqujrgorpjkehubavrrsnekznav ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 08:04:31 np0005626463.localdomain sudo[62148]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:04:31 np0005626463.localdomain python3[62150]: ansible-file Invoked with path=/etc/systemd/system/tripleo_collectd.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 08:04:31 np0005626463.localdomain sudo[62148]: pam_unix(sudo:session): session closed for user root
Feb 23 08:04:31 np0005626463.localdomain sudo[62164]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mpxvrlobsfgpswoaxrunrhhrzrgsqzbe ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 08:04:31 np0005626463.localdomain sudo[62164]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:04:31 np0005626463.localdomain python3[62166]: ansible-file Invoked with path=/etc/systemd/system/tripleo_iscsid.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 08:04:31 np0005626463.localdomain sudo[62164]: pam_unix(sudo:session): session closed for user root
Feb 23 08:04:31 np0005626463.localdomain sudo[62180]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xnhvwwadxislpauvjgdwlmsmedxedslj ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 08:04:31 np0005626463.localdomain sudo[62180]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:04:31 np0005626463.localdomain python3[62182]: ansible-file Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 08:04:31 np0005626463.localdomain sudo[62180]: pam_unix(sudo:session): session closed for user root
Feb 23 08:04:31 np0005626463.localdomain sudo[62196]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qmdfjmbbtiueelpuhfskafxeqmixmknf ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 08:04:31 np0005626463.localdomain sudo[62196]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:04:32 np0005626463.localdomain python3[62198]: ansible-file Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 08:04:32 np0005626463.localdomain sudo[62196]: pam_unix(sudo:session): session closed for user root
Feb 23 08:04:32 np0005626463.localdomain sudo[62212]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dpvvoeunxzvotnacgfifmwtuadwjzcjg ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 08:04:32 np0005626463.localdomain sudo[62212]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:04:32 np0005626463.localdomain python3[62214]: ansible-file Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 08:04:32 np0005626463.localdomain sudo[62212]: pam_unix(sudo:session): session closed for user root
Feb 23 08:04:32 np0005626463.localdomain sudo[62228]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zjgyqkixraliqcukqcnzaebymxqguohf ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 08:04:32 np0005626463.localdomain sudo[62228]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:04:32 np0005626463.localdomain python3[62230]: ansible-file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 08:04:32 np0005626463.localdomain sudo[62228]: pam_unix(sudo:session): session closed for user root
Feb 23 08:04:32 np0005626463.localdomain sudo[62244]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nknctggustukkgbiuctztjvmoqutgttp ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 08:04:32 np0005626463.localdomain sudo[62244]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:04:32 np0005626463.localdomain python3[62246]: ansible-file Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 08:04:32 np0005626463.localdomain sudo[62244]: pam_unix(sudo:session): session closed for user root
Feb 23 08:04:33 np0005626463.localdomain sudo[62260]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hxketoxuisdwhxuvrrcaczqawxkeaxff ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 08:04:33 np0005626463.localdomain sudo[62260]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:04:33 np0005626463.localdomain python3[62262]: ansible-file Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 08:04:33 np0005626463.localdomain sudo[62260]: pam_unix(sudo:session): session closed for user root
Feb 23 08:04:33 np0005626463.localdomain sudo[62276]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zblxzjdesjxemotohrvvhxjqywcponob ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 08:04:33 np0005626463.localdomain sudo[62276]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:04:33 np0005626463.localdomain python3[62278]: ansible-file Invoked with path=/etc/systemd/system/tripleo_rsyslog.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 08:04:33 np0005626463.localdomain sudo[62276]: pam_unix(sudo:session): session closed for user root
Feb 23 08:04:33 np0005626463.localdomain sudo[62293]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-defxjqtxetapawuqrjlyptvzeolljfyo ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 08:04:33 np0005626463.localdomain sudo[62293]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:04:33 np0005626463.localdomain python3[62295]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_collectd_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 23 08:04:33 np0005626463.localdomain sudo[62293]: pam_unix(sudo:session): session closed for user root
Feb 23 08:04:33 np0005626463.localdomain sudo[62309]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ksyztusczlzmjnbtuongpzughoicxprh ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 08:04:33 np0005626463.localdomain sudo[62309]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:04:33 np0005626463.localdomain python3[62311]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_iscsid_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 23 08:04:34 np0005626463.localdomain sudo[62309]: pam_unix(sudo:session): session closed for user root
Feb 23 08:04:34 np0005626463.localdomain sudo[62325]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-htjtxxvcydhfuggkoesqdepxtlinhrjr ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 08:04:34 np0005626463.localdomain sudo[62325]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:04:34 np0005626463.localdomain python3[62327]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 23 08:04:34 np0005626463.localdomain sudo[62325]: pam_unix(sudo:session): session closed for user root
Feb 23 08:04:34 np0005626463.localdomain sudo[62341]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mnujjwznphamxbrpdzljvupfrgzvzfqs ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 08:04:34 np0005626463.localdomain sudo[62341]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:04:34 np0005626463.localdomain python3[62343]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 23 08:04:34 np0005626463.localdomain sudo[62341]: pam_unix(sudo:session): session closed for user root
Feb 23 08:04:34 np0005626463.localdomain sudo[62357]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qsackojenbjjrrjnkqhxlxlorlorwxrc ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 08:04:34 np0005626463.localdomain sudo[62357]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:04:34 np0005626463.localdomain python3[62359]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 23 08:04:34 np0005626463.localdomain sudo[62357]: pam_unix(sudo:session): session closed for user root
Feb 23 08:04:34 np0005626463.localdomain sudo[62373]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-igvlczuynxazyjzhwxwgsxkbodqdifad ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 08:04:34 np0005626463.localdomain sudo[62373]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:04:34 np0005626463.localdomain python3[62375]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 23 08:04:35 np0005626463.localdomain sudo[62373]: pam_unix(sudo:session): session closed for user root
Feb 23 08:04:35 np0005626463.localdomain sudo[62389]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tsinybisvoujdpexvdwnmqrwtugmnllk ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 08:04:35 np0005626463.localdomain sudo[62389]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:04:35 np0005626463.localdomain python3[62391]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 23 08:04:35 np0005626463.localdomain sudo[62389]: pam_unix(sudo:session): session closed for user root
Feb 23 08:04:35 np0005626463.localdomain sudo[62405]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hakoiiuduedciqwdbkuxwwqtyvribhsc ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 08:04:35 np0005626463.localdomain sudo[62405]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:04:35 np0005626463.localdomain python3[62407]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 23 08:04:35 np0005626463.localdomain sudo[62405]: pam_unix(sudo:session): session closed for user root
Feb 23 08:04:35 np0005626463.localdomain sudo[62421]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wtfwvicvmkypajrvbrbcepdgkuaajfpp ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 08:04:35 np0005626463.localdomain sudo[62421]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:04:35 np0005626463.localdomain python3[62423]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_rsyslog_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 23 08:04:35 np0005626463.localdomain sudo[62421]: pam_unix(sudo:session): session closed for user root
Feb 23 08:04:36 np0005626463.localdomain sudo[62482]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qqctvgaddobeerclfqhdnmvatlvfchht ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 08:04:36 np0005626463.localdomain sudo[62482]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:04:36 np0005626463.localdomain python3[62484]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771833875.9017687-101034-25393036279417/source dest=/etc/systemd/system/tripleo_collectd.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 08:04:36 np0005626463.localdomain sudo[62482]: pam_unix(sudo:session): session closed for user root
Feb 23 08:04:36 np0005626463.localdomain sudo[62511]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-visqfxqkamvazguzofmojhryflvnmbpx ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 08:04:36 np0005626463.localdomain sudo[62511]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:04:37 np0005626463.localdomain python3[62513]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771833875.9017687-101034-25393036279417/source dest=/etc/systemd/system/tripleo_iscsid.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 08:04:37 np0005626463.localdomain sudo[62511]: pam_unix(sudo:session): session closed for user root
Feb 23 08:04:37 np0005626463.localdomain sudo[62540]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fiwrafuugbzbbqaoorektiyzvujqhule ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 08:04:37 np0005626463.localdomain sudo[62540]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:04:37 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.
Feb 23 08:04:37 np0005626463.localdomain systemd[1]: tmp-crun.RKrzlX.mount: Deactivated successfully.
Feb 23 08:04:37 np0005626463.localdomain podman[62543]: 2026-02-23 08:04:37.515075018 +0000 UTC m=+0.107049014 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, batch=17.1_20260112.1, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, distribution-scope=public, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.openshift.expose-services=, container_name=metrics_qdr, build-date=2026-01-12T22:10:14Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc.)
Feb 23 08:04:37 np0005626463.localdomain python3[62542]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771833875.9017687-101034-25393036279417/source dest=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 08:04:37 np0005626463.localdomain sudo[62540]: pam_unix(sudo:session): session closed for user root
Feb 23 08:04:37 np0005626463.localdomain podman[62543]: 2026-02-23 08:04:37.717175129 +0000 UTC m=+0.309149115 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.expose-services=, url=https://www.redhat.com, version=17.1.13, container_name=metrics_qdr, io.buildah.version=1.41.5, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:10:14Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:10:14Z, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-qdrouterd-container, batch=17.1_20260112.1, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd)
Feb 23 08:04:37 np0005626463.localdomain systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully.
Feb 23 08:04:37 np0005626463.localdomain sudo[62599]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xlaifupkijbyrziqvgplzmoaioimuujq ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 08:04:37 np0005626463.localdomain sudo[62599]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:04:38 np0005626463.localdomain python3[62601]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771833875.9017687-101034-25393036279417/source dest=/etc/systemd/system/tripleo_nova_virtnodedevd.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 08:04:38 np0005626463.localdomain sudo[62599]: pam_unix(sudo:session): session closed for user root
Feb 23 08:04:38 np0005626463.localdomain sudo[62628]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mfesrgxeankptxpbysgygfqnwlpgmvjy ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 08:04:38 np0005626463.localdomain sudo[62628]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:04:38 np0005626463.localdomain python3[62630]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771833875.9017687-101034-25393036279417/source dest=/etc/systemd/system/tripleo_nova_virtproxyd.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 08:04:38 np0005626463.localdomain sudo[62628]: pam_unix(sudo:session): session closed for user root
Feb 23 08:04:38 np0005626463.localdomain sudo[62657]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zmlyalcgsjmbgloheylwblyvuzxlhtpz ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 08:04:38 np0005626463.localdomain sudo[62657]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:04:39 np0005626463.localdomain python3[62659]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771833875.9017687-101034-25393036279417/source dest=/etc/systemd/system/tripleo_nova_virtqemud.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 08:04:39 np0005626463.localdomain sudo[62657]: pam_unix(sudo:session): session closed for user root
Feb 23 08:04:39 np0005626463.localdomain sshd[62660]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:04:39 np0005626463.localdomain sudo[62688]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-piavvzewauaveqywzwiozqsvkxhidsnp ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 08:04:39 np0005626463.localdomain sudo[62688]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:04:39 np0005626463.localdomain sshd[62660]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:04:39 np0005626463.localdomain python3[62690]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771833875.9017687-101034-25393036279417/source dest=/etc/systemd/system/tripleo_nova_virtsecretd.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 08:04:39 np0005626463.localdomain sudo[62688]: pam_unix(sudo:session): session closed for user root
Feb 23 08:04:40 np0005626463.localdomain sudo[62717]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-seaqlgaytxhfpzgifjlkhumymdmtsfgo ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 08:04:40 np0005626463.localdomain sudo[62717]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:04:40 np0005626463.localdomain python3[62719]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771833875.9017687-101034-25393036279417/source dest=/etc/systemd/system/tripleo_nova_virtstoraged.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 08:04:40 np0005626463.localdomain sudo[62717]: pam_unix(sudo:session): session closed for user root
Feb 23 08:04:40 np0005626463.localdomain sudo[62746]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zvccgvopfdvreqxmaeagbsjlgyrqndzi ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 08:04:40 np0005626463.localdomain sudo[62746]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:04:40 np0005626463.localdomain python3[62748]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771833875.9017687-101034-25393036279417/source dest=/etc/systemd/system/tripleo_rsyslog.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 08:04:40 np0005626463.localdomain sudo[62746]: pam_unix(sudo:session): session closed for user root
Feb 23 08:04:40 np0005626463.localdomain sudo[62762]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ceqdqbdgszycymvsoivpvuakecqmicnn ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 08:04:40 np0005626463.localdomain sudo[62762]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:04:41 np0005626463.localdomain systemd[1]: Stopping User Manager for UID 0...
Feb 23 08:04:41 np0005626463.localdomain systemd[61241]: Activating special unit Exit the Session...
Feb 23 08:04:41 np0005626463.localdomain systemd[61241]: Stopped target Main User Target.
Feb 23 08:04:41 np0005626463.localdomain systemd[61241]: Stopped target Basic System.
Feb 23 08:04:41 np0005626463.localdomain systemd[61241]: Stopped target Paths.
Feb 23 08:04:41 np0005626463.localdomain systemd[61241]: Stopped target Sockets.
Feb 23 08:04:41 np0005626463.localdomain systemd[61241]: Stopped target Timers.
Feb 23 08:04:41 np0005626463.localdomain systemd[61241]: Stopped Daily Cleanup of User's Temporary Directories.
Feb 23 08:04:41 np0005626463.localdomain systemd[61241]: Closed D-Bus User Message Bus Socket.
Feb 23 08:04:41 np0005626463.localdomain systemd[61241]: Stopped Create User's Volatile Files and Directories.
Feb 23 08:04:41 np0005626463.localdomain systemd[61241]: Removed slice User Application Slice.
Feb 23 08:04:41 np0005626463.localdomain systemd[61241]: Reached target Shutdown.
Feb 23 08:04:41 np0005626463.localdomain systemd[61241]: Finished Exit the Session.
Feb 23 08:04:41 np0005626463.localdomain systemd[61241]: Reached target Exit the Session.
Feb 23 08:04:41 np0005626463.localdomain systemd[1]: user@0.service: Deactivated successfully.
Feb 23 08:04:41 np0005626463.localdomain systemd[1]: Stopped User Manager for UID 0.
Feb 23 08:04:41 np0005626463.localdomain python3[62764]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 23 08:04:41 np0005626463.localdomain systemd[1]: Stopping User Runtime Directory /run/user/0...
Feb 23 08:04:41 np0005626463.localdomain systemd[1]: run-user-0.mount: Deactivated successfully.
Feb 23 08:04:41 np0005626463.localdomain systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Feb 23 08:04:41 np0005626463.localdomain systemd[1]: Stopped User Runtime Directory /run/user/0.
Feb 23 08:04:41 np0005626463.localdomain systemd[1]: Removed slice User Slice of UID 0.
Feb 23 08:04:41 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 08:04:41 np0005626463.localdomain sshd[62767]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:04:41 np0005626463.localdomain systemd-rc-local-generator[62789]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 08:04:41 np0005626463.localdomain systemd-sysv-generator[62795]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 08:04:41 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 08:04:41 np0005626463.localdomain sudo[62762]: pam_unix(sudo:session): session closed for user root
Feb 23 08:04:41 np0005626463.localdomain sudo[62817]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vhvdubahfsroekdagkmfvlabwdhaulma ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 08:04:41 np0005626463.localdomain sudo[62817]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:04:41 np0005626463.localdomain python3[62819]: ansible-systemd Invoked with state=restarted name=tripleo_collectd.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 08:04:42 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 08:04:42 np0005626463.localdomain systemd-rc-local-generator[62848]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 08:04:42 np0005626463.localdomain systemd-sysv-generator[62852]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 08:04:42 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 08:04:42 np0005626463.localdomain systemd[1]: Starting collectd container...
Feb 23 08:04:42 np0005626463.localdomain systemd[1]: Started collectd container.
Feb 23 08:04:42 np0005626463.localdomain sudo[62817]: pam_unix(sudo:session): session closed for user root
Feb 23 08:04:42 np0005626463.localdomain sshd[62767]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:04:42 np0005626463.localdomain sudo[62883]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zpnrcfojmwtegujisbqfvmgyesusuvhd ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 08:04:42 np0005626463.localdomain sudo[62883]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:04:43 np0005626463.localdomain python3[62885]: ansible-systemd Invoked with state=restarted name=tripleo_iscsid.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 08:04:43 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 08:04:43 np0005626463.localdomain systemd-rc-local-generator[62912]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 08:04:43 np0005626463.localdomain systemd-sysv-generator[62917]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 08:04:43 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 08:04:43 np0005626463.localdomain systemd[1]: Starting iscsid container...
Feb 23 08:04:43 np0005626463.localdomain systemd[1]: Started iscsid container.
Feb 23 08:04:43 np0005626463.localdomain sudo[62883]: pam_unix(sudo:session): session closed for user root
Feb 23 08:04:43 np0005626463.localdomain sudo[62949]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hgtztwwzsmuoqlzweqifzhyggkfgpyzj ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 08:04:43 np0005626463.localdomain sudo[62949]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:04:44 np0005626463.localdomain python3[62951]: ansible-systemd Invoked with state=restarted name=tripleo_nova_virtlogd_wrapper.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 08:04:44 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 08:04:44 np0005626463.localdomain systemd-sysv-generator[62983]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 08:04:44 np0005626463.localdomain systemd-rc-local-generator[62976]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 08:04:44 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 08:04:44 np0005626463.localdomain systemd[1]: Starting nova_virtlogd_wrapper container...
Feb 23 08:04:44 np0005626463.localdomain systemd[1]: Started nova_virtlogd_wrapper container.
Feb 23 08:04:44 np0005626463.localdomain sudo[62949]: pam_unix(sudo:session): session closed for user root
Feb 23 08:04:44 np0005626463.localdomain sudo[63015]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iuziqazrjkvjsuiniwmzdudkpweblwux ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 08:04:44 np0005626463.localdomain sudo[63015]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:04:45 np0005626463.localdomain python3[63017]: ansible-systemd Invoked with state=restarted name=tripleo_nova_virtnodedevd.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 08:04:45 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 08:04:45 np0005626463.localdomain systemd-sysv-generator[63048]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 08:04:45 np0005626463.localdomain systemd-rc-local-generator[63043]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 08:04:45 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 08:04:45 np0005626463.localdomain systemd[1]: Starting nova_virtnodedevd container...
Feb 23 08:04:45 np0005626463.localdomain tripleo-start-podman-container[63057]: Creating additional drop-in dependency for "nova_virtnodedevd" (930c6611f5962d33efb440404474ddbbb597649f3d0829a0c1f4e90bb33c2ea2)
Feb 23 08:04:45 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 08:04:45 np0005626463.localdomain systemd-sysv-generator[63119]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 08:04:45 np0005626463.localdomain systemd-rc-local-generator[63115]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 08:04:45 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 08:04:46 np0005626463.localdomain systemd[1]: Started nova_virtnodedevd container.
Feb 23 08:04:46 np0005626463.localdomain sudo[63015]: pam_unix(sudo:session): session closed for user root
Feb 23 08:04:46 np0005626463.localdomain sudo[63138]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dassvrcqooxapoolxxhourzijvvgasuy ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 08:04:46 np0005626463.localdomain sudo[63138]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:04:46 np0005626463.localdomain python3[63140]: ansible-systemd Invoked with state=restarted name=tripleo_nova_virtproxyd.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 08:04:46 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 08:04:46 np0005626463.localdomain systemd-rc-local-generator[63166]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 08:04:46 np0005626463.localdomain systemd-sysv-generator[63171]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 08:04:46 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 08:04:47 np0005626463.localdomain systemd[1]: Starting nova_virtproxyd container...
Feb 23 08:04:47 np0005626463.localdomain tripleo-start-podman-container[63179]: Creating additional drop-in dependency for "nova_virtproxyd" (33b24c0b43815062cb55aef2cae21d9e0731c73ede479175c35b6a75b4421d9a)
Feb 23 08:04:47 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 08:04:47 np0005626463.localdomain systemd-rc-local-generator[63235]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 08:04:47 np0005626463.localdomain systemd-sysv-generator[63240]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 08:04:47 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 08:04:47 np0005626463.localdomain systemd[1]: Started nova_virtproxyd container.
Feb 23 08:04:47 np0005626463.localdomain sudo[63138]: pam_unix(sudo:session): session closed for user root
Feb 23 08:04:47 np0005626463.localdomain sudo[63260]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-scaofileedadnwhsiqxaxlkwlomlidlj ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 08:04:47 np0005626463.localdomain sudo[63260]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:04:48 np0005626463.localdomain python3[63262]: ansible-systemd Invoked with state=restarted name=tripleo_nova_virtqemud.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 08:04:48 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 08:04:48 np0005626463.localdomain systemd-rc-local-generator[63287]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 08:04:48 np0005626463.localdomain systemd-sysv-generator[63293]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 08:04:48 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 08:04:48 np0005626463.localdomain systemd[1]: Starting nova_virtqemud container...
Feb 23 08:04:48 np0005626463.localdomain tripleo-start-podman-container[63302]: Creating additional drop-in dependency for "nova_virtqemud" (ac0f39392edf3bb25f5f81a87a3332c629d29b63674f9d0e6bbbde82fcbac468)
Feb 23 08:04:48 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 08:04:48 np0005626463.localdomain systemd-rc-local-generator[63358]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 08:04:48 np0005626463.localdomain systemd-sysv-generator[63363]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 08:04:48 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 08:04:48 np0005626463.localdomain systemd[1]: Started nova_virtqemud container.
Feb 23 08:04:49 np0005626463.localdomain sudo[63260]: pam_unix(sudo:session): session closed for user root
Feb 23 08:04:49 np0005626463.localdomain sudo[63383]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gidvvatwvrewmuewjhpibaumifyerzpk ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 08:04:49 np0005626463.localdomain sudo[63383]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:04:49 np0005626463.localdomain python3[63385]: ansible-systemd Invoked with state=restarted name=tripleo_nova_virtsecretd.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 08:04:49 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 08:04:49 np0005626463.localdomain systemd-sysv-generator[63417]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 08:04:49 np0005626463.localdomain systemd-rc-local-generator[63413]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 08:04:49 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 08:04:50 np0005626463.localdomain systemd[1]: Starting nova_virtsecretd container...
Feb 23 08:04:50 np0005626463.localdomain tripleo-start-podman-container[63425]: Creating additional drop-in dependency for "nova_virtsecretd" (c5df1accada064756d125bdbb5695a73715fe028f05153add7eb5ec7749f050f)
Feb 23 08:04:50 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 08:04:50 np0005626463.localdomain systemd-sysv-generator[63488]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 08:04:50 np0005626463.localdomain systemd-rc-local-generator[63482]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 08:04:50 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 08:04:50 np0005626463.localdomain systemd[1]: Started nova_virtsecretd container.
Feb 23 08:04:50 np0005626463.localdomain sudo[63383]: pam_unix(sudo:session): session closed for user root
Feb 23 08:04:50 np0005626463.localdomain sudo[63508]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ylxebqupgmwleuaidlcjfpumubxtyaco ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 08:04:50 np0005626463.localdomain sudo[63508]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:04:51 np0005626463.localdomain python3[63510]: ansible-systemd Invoked with state=restarted name=tripleo_nova_virtstoraged.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 08:04:51 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 08:04:51 np0005626463.localdomain systemd-rc-local-generator[63535]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 08:04:51 np0005626463.localdomain systemd-sysv-generator[63538]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 08:04:51 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 08:04:51 np0005626463.localdomain systemd[1]: Starting nova_virtstoraged container...
Feb 23 08:04:51 np0005626463.localdomain tripleo-start-podman-container[63550]: Creating additional drop-in dependency for "nova_virtstoraged" (5312b8f35cc3bb510b53656211ca10c6fa5b207da4896c4ba9d4afdec879f843)
Feb 23 08:04:51 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 08:04:51 np0005626463.localdomain systemd-sysv-generator[63612]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 08:04:51 np0005626463.localdomain systemd-rc-local-generator[63607]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 08:04:51 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 08:04:51 np0005626463.localdomain systemd[1]: Started nova_virtstoraged container.
Feb 23 08:04:52 np0005626463.localdomain sudo[63508]: pam_unix(sudo:session): session closed for user root
Feb 23 08:04:52 np0005626463.localdomain sudo[63632]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mqxkhucarvniwalarmrbdbuabndxzlsq ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 08:04:52 np0005626463.localdomain sudo[63632]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:04:52 np0005626463.localdomain python3[63634]: ansible-systemd Invoked with state=restarted name=tripleo_rsyslog.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 08:04:52 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 08:04:52 np0005626463.localdomain systemd-rc-local-generator[63661]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 08:04:52 np0005626463.localdomain systemd-sysv-generator[63666]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 08:04:52 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 08:04:52 np0005626463.localdomain systemd[1]: Starting rsyslog container...
Feb 23 08:04:53 np0005626463.localdomain systemd[1]: tmp-crun.d9cMHi.mount: Deactivated successfully.
Feb 23 08:04:53 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 08:04:53 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/abedacbc22b2fa6d036e7ebf1118c866641128a14eaf11021e9356d60564993d/merged/var/log/rsyslog supports timestamps until 2038 (0x7fffffff)
Feb 23 08:04:53 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/abedacbc22b2fa6d036e7ebf1118c866641128a14eaf11021e9356d60564993d/merged/var/lib/rsyslog supports timestamps until 2038 (0x7fffffff)
Feb 23 08:04:53 np0005626463.localdomain podman[63674]: 2026-02-23 08:04:53.082321327 +0000 UTC m=+0.125412534 container init 94a5c22430959b0767f37ebdcadc6a54425aaa14ae15dbfff3d47acdfec747f4 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, org.opencontainers.image.created=2026-01-12T22:10:09Z, io.openshift.expose-services=, container_name=rsyslog, tcib_managed=true, batch=17.1_20260112.1, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 rsyslog, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-rsyslog, config_id=tripleo_step3, com.redhat.component=openstack-rsyslog-container, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, managed_by=tripleo_ansible, vendor=Red Hat, Inc., build-date=2026-01-12T22:10:09Z, version=17.1.13, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, architecture=x86_64, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8e5028e38f7077561ef1e3e50ec174a3'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']})
Feb 23 08:04:53 np0005626463.localdomain podman[63674]: 2026-02-23 08:04:53.092587696 +0000 UTC m=+0.135678903 container start 94a5c22430959b0767f37ebdcadc6a54425aaa14ae15dbfff3d47acdfec747f4 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, tcib_managed=true, build-date=2026-01-12T22:10:09Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8e5028e38f7077561ef1e3e50ec174a3'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:10:09Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, summary=Red Hat OpenStack Platform 17.1 rsyslog, container_name=rsyslog, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-rsyslog, vcs-type=git, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, io.buildah.version=1.41.5, com.redhat.component=openstack-rsyslog-container, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 rsyslog, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., architecture=x86_64, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510)
Feb 23 08:04:53 np0005626463.localdomain podman[63674]: rsyslog
Feb 23 08:04:53 np0005626463.localdomain systemd[1]: Started rsyslog container.
Feb 23 08:04:53 np0005626463.localdomain sudo[63692]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Feb 23 08:04:53 np0005626463.localdomain sudo[63692]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Feb 23 08:04:53 np0005626463.localdomain sudo[63632]: pam_unix(sudo:session): session closed for user root
Feb 23 08:04:53 np0005626463.localdomain sudo[63692]: pam_unix(sudo:session): session closed for user root
Feb 23 08:04:53 np0005626463.localdomain systemd[1]: libpod-94a5c22430959b0767f37ebdcadc6a54425aaa14ae15dbfff3d47acdfec747f4.scope: Deactivated successfully.
Feb 23 08:04:53 np0005626463.localdomain podman[63708]: 2026-02-23 08:04:53.260015056 +0000 UTC m=+0.049554321 container died 94a5c22430959b0767f37ebdcadc6a54425aaa14ae15dbfff3d47acdfec747f4 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, container_name=rsyslog, managed_by=tripleo_ansible, tcib_managed=true, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-rsyslog, com.redhat.component=openstack-rsyslog-container, config_id=tripleo_step3, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:09Z, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 rsyslog, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, architecture=x86_64, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 rsyslog, distribution-scope=public, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8e5028e38f7077561ef1e3e50ec174a3'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, vcs-type=git, build-date=2026-01-12T22:10:09Z, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 23 08:04:53 np0005626463.localdomain podman[63708]: 2026-02-23 08:04:53.286398571 +0000 UTC m=+0.075937806 container cleanup 94a5c22430959b0767f37ebdcadc6a54425aaa14ae15dbfff3d47acdfec747f4 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:09Z, container_name=rsyslog, com.redhat.component=openstack-rsyslog-container, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-rsyslog, url=https://www.redhat.com, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, config_id=tripleo_step3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8e5028e38f7077561ef1e3e50ec174a3'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, version=17.1.13, build-date=2026-01-12T22:10:09Z, vcs-type=git, batch=17.1_20260112.1, release=1766032510, summary=Red Hat OpenStack Platform 17.1 rsyslog, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 rsyslog, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.expose-services=)
Feb 23 08:04:53 np0005626463.localdomain systemd[1]: tripleo_rsyslog.service: Main process exited, code=exited, status=1/FAILURE
Feb 23 08:04:53 np0005626463.localdomain podman[63723]: 2026-02-23 08:04:53.373600938 +0000 UTC m=+0.062450624 container cleanup 94a5c22430959b0767f37ebdcadc6a54425aaa14ae15dbfff3d47acdfec747f4 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8e5028e38f7077561ef1e3e50ec174a3'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, build-date=2026-01-12T22:10:09Z, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, managed_by=tripleo_ansible, config_id=tripleo_step3, io.openshift.expose-services=, url=https://www.redhat.com, tcib_managed=true, release=1766032510, io.buildah.version=1.41.5, version=17.1.13, com.redhat.component=openstack-rsyslog-container, summary=Red Hat OpenStack Platform 17.1 rsyslog, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 rsyslog, org.opencontainers.image.created=2026-01-12T22:10:09Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, name=rhosp-rhel9/openstack-rsyslog, vendor=Red Hat, Inc., container_name=rsyslog, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, maintainer=OpenStack TripleO Team, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog)
Feb 23 08:04:53 np0005626463.localdomain podman[63723]: rsyslog
Feb 23 08:04:53 np0005626463.localdomain systemd[1]: tripleo_rsyslog.service: Failed with result 'exit-code'.
Feb 23 08:04:53 np0005626463.localdomain sudo[63748]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jubydqboerojujnutprwsexlcmkydffb ; /usr/bin/python3
Feb 23 08:04:53 np0005626463.localdomain sudo[63748]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:04:53 np0005626463.localdomain systemd[1]: tripleo_rsyslog.service: Scheduled restart job, restart counter is at 1.
Feb 23 08:04:53 np0005626463.localdomain systemd[1]: Stopped rsyslog container.
Feb 23 08:04:53 np0005626463.localdomain systemd[1]: Starting rsyslog container...
Feb 23 08:04:53 np0005626463.localdomain python3[63750]: ansible-file Invoked with path=/var/lib/container-puppet/container-puppet-tasks3.json state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 08:04:53 np0005626463.localdomain sudo[63748]: pam_unix(sudo:session): session closed for user root
Feb 23 08:04:53 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 08:04:53 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/abedacbc22b2fa6d036e7ebf1118c866641128a14eaf11021e9356d60564993d/merged/var/log/rsyslog supports timestamps until 2038 (0x7fffffff)
Feb 23 08:04:53 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/abedacbc22b2fa6d036e7ebf1118c866641128a14eaf11021e9356d60564993d/merged/var/lib/rsyslog supports timestamps until 2038 (0x7fffffff)
Feb 23 08:04:53 np0005626463.localdomain podman[63751]: 2026-02-23 08:04:53.668308438 +0000 UTC m=+0.094222491 container init 94a5c22430959b0767f37ebdcadc6a54425aaa14ae15dbfff3d47acdfec747f4 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, summary=Red Hat OpenStack Platform 17.1 rsyslog, managed_by=tripleo_ansible, com.redhat.component=openstack-rsyslog-container, tcib_managed=true, config_id=tripleo_step3, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, description=Red Hat OpenStack Platform 17.1 rsyslog, io.buildah.version=1.41.5, url=https://www.redhat.com, vcs-type=git, batch=17.1_20260112.1, build-date=2026-01-12T22:10:09Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, io.openshift.expose-services=, architecture=x86_64, maintainer=OpenStack TripleO Team, container_name=rsyslog, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, org.opencontainers.image.created=2026-01-12T22:10:09Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8e5028e38f7077561ef1e3e50ec174a3'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, distribution-scope=public, vendor=Red Hat, Inc., org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 23 08:04:53 np0005626463.localdomain podman[63751]: 2026-02-23 08:04:53.678274249 +0000 UTC m=+0.104188322 container start 94a5c22430959b0767f37ebdcadc6a54425aaa14ae15dbfff3d47acdfec747f4 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, vcs-type=git, batch=17.1_20260112.1, build-date=2026-01-12T22:10:09Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, vendor=Red Hat, Inc., managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-rsyslog, version=17.1.13, release=1766032510, description=Red Hat OpenStack Platform 17.1 rsyslog, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-rsyslog-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8e5028e38f7077561ef1e3e50ec174a3'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, distribution-scope=public, container_name=rsyslog, org.opencontainers.image.created=2026-01-12T22:10:09Z, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 rsyslog)
Feb 23 08:04:53 np0005626463.localdomain podman[63751]: rsyslog
Feb 23 08:04:53 np0005626463.localdomain systemd[1]: Started rsyslog container.
Feb 23 08:04:53 np0005626463.localdomain sudo[63769]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Feb 23 08:04:53 np0005626463.localdomain sudo[63769]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Feb 23 08:04:53 np0005626463.localdomain sudo[63769]: pam_unix(sudo:session): session closed for user root
Feb 23 08:04:53 np0005626463.localdomain systemd[1]: libpod-94a5c22430959b0767f37ebdcadc6a54425aaa14ae15dbfff3d47acdfec747f4.scope: Deactivated successfully.
Feb 23 08:04:53 np0005626463.localdomain podman[63772]: 2026-02-23 08:04:53.846503873 +0000 UTC m=+0.054848610 container died 94a5c22430959b0767f37ebdcadc6a54425aaa14ae15dbfff3d47acdfec747f4 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, container_name=rsyslog, org.opencontainers.image.created=2026-01-12T22:10:09Z, name=rhosp-rhel9/openstack-rsyslog, config_id=tripleo_step3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8e5028e38f7077561ef1e3e50ec174a3'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, vendor=Red Hat, Inc., io.buildah.version=1.41.5, io.openshift.expose-services=, build-date=2026-01-12T22:10:09Z, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, batch=17.1_20260112.1, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, release=1766032510, architecture=x86_64, com.redhat.component=openstack-rsyslog-container)
Feb 23 08:04:53 np0005626463.localdomain podman[63772]: 2026-02-23 08:04:53.872967182 +0000 UTC m=+0.081311879 container cleanup 94a5c22430959b0767f37ebdcadc6a54425aaa14ae15dbfff3d47acdfec747f4 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, vendor=Red Hat, Inc., batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, name=rhosp-rhel9/openstack-rsyslog, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 rsyslog, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 rsyslog, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, config_id=tripleo_step3, distribution-scope=public, release=1766032510, com.redhat.component=openstack-rsyslog-container, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8e5028e38f7077561ef1e3e50ec174a3'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:10:09Z, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, container_name=rsyslog, org.opencontainers.image.created=2026-01-12T22:10:09Z, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog)
Feb 23 08:04:53 np0005626463.localdomain systemd[1]: tripleo_rsyslog.service: Main process exited, code=exited, status=1/FAILURE
Feb 23 08:04:53 np0005626463.localdomain podman[63784]: 2026-02-23 08:04:53.959256859 +0000 UTC m=+0.059393895 container cleanup 94a5c22430959b0767f37ebdcadc6a54425aaa14ae15dbfff3d47acdfec747f4 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, batch=17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., com.redhat.component=openstack-rsyslog-container, container_name=rsyslog, version=17.1.13, io.openshift.expose-services=, vcs-type=git, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:09Z, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, architecture=x86_64, url=https://www.redhat.com, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8e5028e38f7077561ef1e3e50ec174a3'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, build-date=2026-01-12T22:10:09Z, name=rhosp-rhel9/openstack-rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, summary=Red Hat OpenStack Platform 17.1 rsyslog, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 rsyslog, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog)
Feb 23 08:04:53 np0005626463.localdomain podman[63784]: rsyslog
Feb 23 08:04:53 np0005626463.localdomain systemd[1]: tripleo_rsyslog.service: Failed with result 'exit-code'.
Feb 23 08:04:54 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-abedacbc22b2fa6d036e7ebf1118c866641128a14eaf11021e9356d60564993d-merged.mount: Deactivated successfully.
Feb 23 08:04:54 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-94a5c22430959b0767f37ebdcadc6a54425aaa14ae15dbfff3d47acdfec747f4-userdata-shm.mount: Deactivated successfully.
Feb 23 08:04:54 np0005626463.localdomain sudo[63840]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yegxhdvbekdueabmindviqplpakluewi ; /usr/bin/python3
Feb 23 08:04:54 np0005626463.localdomain systemd[1]: tripleo_rsyslog.service: Scheduled restart job, restart counter is at 2.
Feb 23 08:04:54 np0005626463.localdomain systemd[1]: Stopped rsyslog container.
Feb 23 08:04:54 np0005626463.localdomain sudo[63840]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:04:54 np0005626463.localdomain systemd[1]: Starting rsyslog container...
Feb 23 08:04:54 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 08:04:54 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/abedacbc22b2fa6d036e7ebf1118c866641128a14eaf11021e9356d60564993d/merged/var/log/rsyslog supports timestamps until 2038 (0x7fffffff)
Feb 23 08:04:54 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/abedacbc22b2fa6d036e7ebf1118c866641128a14eaf11021e9356d60564993d/merged/var/lib/rsyslog supports timestamps until 2038 (0x7fffffff)
Feb 23 08:04:54 np0005626463.localdomain podman[63843]: 2026-02-23 08:04:54.305442031 +0000 UTC m=+0.124184493 container init 94a5c22430959b0767f37ebdcadc6a54425aaa14ae15dbfff3d47acdfec747f4 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:10:09Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-rsyslog-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8e5028e38f7077561ef1e3e50ec174a3'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 rsyslog, version=17.1.13, build-date=2026-01-12T22:10:09Z, cpe=cpe:/a:redhat:openstack:17.1::el9, 
summary=Red Hat OpenStack Platform 17.1 rsyslog, url=https://www.redhat.com, vendor=Red Hat, Inc., config_id=tripleo_step3, name=rhosp-rhel9/openstack-rsyslog, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, tcib_managed=true, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, batch=17.1_20260112.1, managed_by=tripleo_ansible, architecture=x86_64, io.buildah.version=1.41.5, distribution-scope=public, container_name=rsyslog)
Feb 23 08:04:54 np0005626463.localdomain podman[63843]: 2026-02-23 08:04:54.314803031 +0000 UTC m=+0.133545503 container start 94a5c22430959b0767f37ebdcadc6a54425aaa14ae15dbfff3d47acdfec747f4 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, summary=Red Hat OpenStack Platform 17.1 rsyslog, build-date=2026-01-12T22:10:09Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-rsyslog-container, vendor=Red Hat, Inc., version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8e5028e38f7077561ef1e3e50ec174a3'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, container_name=rsyslog, release=1766032510, name=rhosp-rhel9/openstack-rsyslog, description=Red Hat OpenStack Platform 17.1 rsyslog, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:09Z, tcib_managed=true, distribution-scope=public)
Feb 23 08:04:54 np0005626463.localdomain podman[63843]: rsyslog
Feb 23 08:04:54 np0005626463.localdomain systemd[1]: Started rsyslog container.
Feb 23 08:04:54 np0005626463.localdomain sudo[63863]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Feb 23 08:04:54 np0005626463.localdomain sudo[63863]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Feb 23 08:04:54 np0005626463.localdomain sudo[63840]: pam_unix(sudo:session): session closed for user root
Feb 23 08:04:54 np0005626463.localdomain sudo[63863]: pam_unix(sudo:session): session closed for user root
Feb 23 08:04:54 np0005626463.localdomain systemd[1]: libpod-94a5c22430959b0767f37ebdcadc6a54425aaa14ae15dbfff3d47acdfec747f4.scope: Deactivated successfully.
Feb 23 08:04:54 np0005626463.localdomain podman[63877]: 2026-02-23 08:04:54.466165255 +0000 UTC m=+0.050166010 container died 94a5c22430959b0767f37ebdcadc6a54425aaa14ae15dbfff3d47acdfec747f4 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, io.openshift.expose-services=, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-rsyslog-container, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, summary=Red Hat OpenStack Platform 17.1 rsyslog, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8e5028e38f7077561ef1e3e50ec174a3'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 
17.1 rsyslog, release=1766032510, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, name=rhosp-rhel9/openstack-rsyslog, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, vcs-type=git, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:09Z, distribution-scope=public, version=17.1.13, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, container_name=rsyslog, build-date=2026-01-12T22:10:09Z)
Feb 23 08:04:54 np0005626463.localdomain podman[63877]: 2026-02-23 08:04:54.486660953 +0000 UTC m=+0.070661708 container cleanup 94a5c22430959b0767f37ebdcadc6a54425aaa14ae15dbfff3d47acdfec747f4 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, vendor=Red Hat, Inc., vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:09Z, container_name=rsyslog, architecture=x86_64, tcib_managed=true, release=1766032510, com.redhat.component=openstack-rsyslog-container, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8e5028e38f7077561ef1e3e50ec174a3'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', 
'/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, config_id=tripleo_step3, org.opencontainers.image.created=2026-01-12T22:10:09Z, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, description=Red Hat OpenStack Platform 17.1 rsyslog, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vcs-type=git, io.buildah.version=1.41.5, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, name=rhosp-rhel9/openstack-rsyslog)
Feb 23 08:04:54 np0005626463.localdomain systemd[1]: tripleo_rsyslog.service: Main process exited, code=exited, status=1/FAILURE
Feb 23 08:04:54 np0005626463.localdomain podman[63905]: 2026-02-23 08:04:54.572452734 +0000 UTC m=+0.059589862 container cleanup 94a5c22430959b0767f37ebdcadc6a54425aaa14ae15dbfff3d47acdfec747f4 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, config_id=tripleo_step3, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, com.redhat.component=openstack-rsyslog-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, summary=Red Hat OpenStack Platform 17.1 rsyslog, tcib_managed=true, container_name=rsyslog, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 rsyslog, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, io.openshift.expose-services=, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:09Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:09Z, name=rhosp-rhel9/openstack-rsyslog, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8e5028e38f7077561ef1e3e50ec174a3'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, url=https://www.redhat.com)
Feb 23 08:04:54 np0005626463.localdomain podman[63905]: rsyslog
Feb 23 08:04:54 np0005626463.localdomain systemd[1]: tripleo_rsyslog.service: Failed with result 'exit-code'.
Feb 23 08:04:54 np0005626463.localdomain sudo[63929]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xeftmhocbauoxfllvqemnnlvwquxebzn ; /usr/bin/python3
Feb 23 08:04:54 np0005626463.localdomain sudo[63929]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:04:54 np0005626463.localdomain sudo[63929]: pam_unix(sudo:session): session closed for user root
Feb 23 08:04:54 np0005626463.localdomain systemd[1]: tripleo_rsyslog.service: Scheduled restart job, restart counter is at 3.
Feb 23 08:04:54 np0005626463.localdomain systemd[1]: Stopped rsyslog container.
Feb 23 08:04:54 np0005626463.localdomain systemd[1]: Starting rsyslog container...
Feb 23 08:04:54 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 08:04:54 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/abedacbc22b2fa6d036e7ebf1118c866641128a14eaf11021e9356d60564993d/merged/var/log/rsyslog supports timestamps until 2038 (0x7fffffff)
Feb 23 08:04:54 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/abedacbc22b2fa6d036e7ebf1118c866641128a14eaf11021e9356d60564993d/merged/var/lib/rsyslog supports timestamps until 2038 (0x7fffffff)
Feb 23 08:04:54 np0005626463.localdomain podman[63937]: 2026-02-23 08:04:54.922750328 +0000 UTC m=+0.098445808 container init 94a5c22430959b0767f37ebdcadc6a54425aaa14ae15dbfff3d47acdfec747f4 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 rsyslog, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8e5028e38f7077561ef1e3e50ec174a3'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, org.opencontainers.image.created=2026-01-12T22:10:09Z, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-rsyslog-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, container_name=rsyslog, 
io.openshift.expose-services=, vendor=Red Hat, Inc., architecture=x86_64, version=17.1.13, batch=17.1_20260112.1, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, release=1766032510, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-rsyslog, managed_by=tripleo_ansible, build-date=2026-01-12T22:10:09Z, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, io.buildah.version=1.41.5, distribution-scope=public, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 rsyslog, maintainer=OpenStack TripleO Team)
Feb 23 08:04:54 np0005626463.localdomain podman[63937]: 2026-02-23 08:04:54.932036186 +0000 UTC m=+0.107731656 container start 94a5c22430959b0767f37ebdcadc6a54425aaa14ae15dbfff3d47acdfec747f4 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, tcib_managed=true, version=17.1.13, build-date=2026-01-12T22:10:09Z, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step3, com.redhat.component=openstack-rsyslog-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 rsyslog, release=1766032510, container_name=rsyslog, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 rsyslog, architecture=x86_64, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, vcs-type=git, name=rhosp-rhel9/openstack-rsyslog, distribution-scope=public, io.openshift.expose-services=, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, org.opencontainers.image.created=2026-01-12T22:10:09Z, maintainer=OpenStack TripleO Team, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8e5028e38f7077561ef1e3e50ec174a3'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 23 08:04:54 np0005626463.localdomain podman[63937]: rsyslog
Feb 23 08:04:54 np0005626463.localdomain systemd[1]: Started rsyslog container.
Feb 23 08:04:54 np0005626463.localdomain sudo[63967]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Feb 23 08:04:54 np0005626463.localdomain sudo[63967]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Feb 23 08:04:55 np0005626463.localdomain sudo[63967]: pam_unix(sudo:session): session closed for user root
Feb 23 08:04:55 np0005626463.localdomain systemd[1]: libpod-94a5c22430959b0767f37ebdcadc6a54425aaa14ae15dbfff3d47acdfec747f4.scope: Deactivated successfully.
Feb 23 08:04:55 np0005626463.localdomain sudo[63992]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lhclwyerdyqbxzgzepxzvlwwjzfminyp ; /usr/bin/python3
Feb 23 08:04:55 np0005626463.localdomain sudo[63992]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:04:55 np0005626463.localdomain podman[63971]: 2026-02-23 08:04:55.095274891 +0000 UTC m=+0.057289189 container died 94a5c22430959b0767f37ebdcadc6a54425aaa14ae15dbfff3d47acdfec747f4 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:10:09Z, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., distribution-scope=public, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, com.redhat.component=openstack-rsyslog-container, release=1766032510, config_id=tripleo_step3, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8e5028e38f7077561ef1e3e50ec174a3'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, container_name=rsyslog, version=17.1.13, 
io.openshift.expose-services=, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 rsyslog, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-rsyslog, org.opencontainers.image.created=2026-01-12T22:10:09Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, description=Red Hat OpenStack Platform 17.1 rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, managed_by=tripleo_ansible)
Feb 23 08:04:55 np0005626463.localdomain systemd[1]: tmp-crun.rWPFJj.mount: Deactivated successfully.
Feb 23 08:04:55 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-94a5c22430959b0767f37ebdcadc6a54425aaa14ae15dbfff3d47acdfec747f4-userdata-shm.mount: Deactivated successfully.
Feb 23 08:04:55 np0005626463.localdomain podman[63971]: 2026-02-23 08:04:55.123607089 +0000 UTC m=+0.085621337 container cleanup 94a5c22430959b0767f37ebdcadc6a54425aaa14ae15dbfff3d47acdfec747f4 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:09Z, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:10:09Z, container_name=rsyslog, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, url=https://www.redhat.com, io.buildah.version=1.41.5, release=1766032510, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 rsyslog, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8e5028e38f7077561ef1e3e50ec174a3'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, com.redhat.component=openstack-rsyslog-container, config_id=tripleo_step3, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 rsyslog, version=17.1.13, name=rhosp-rhel9/openstack-rsyslog, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, tcib_managed=true, architecture=x86_64)
Feb 23 08:04:55 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-abedacbc22b2fa6d036e7ebf1118c866641128a14eaf11021e9356d60564993d-merged.mount: Deactivated successfully.
Feb 23 08:04:55 np0005626463.localdomain systemd[1]: tripleo_rsyslog.service: Main process exited, code=exited, status=1/FAILURE
Feb 23 08:04:55 np0005626463.localdomain podman[64000]: 2026-02-23 08:04:55.205521316 +0000 UTC m=+0.050048666 container cleanup 94a5c22430959b0767f37ebdcadc6a54425aaa14ae15dbfff3d47acdfec747f4 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:10:09Z, architecture=x86_64, vendor=Red Hat, Inc., container_name=rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, release=1766032510, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-rsyslog, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, io.openshift.expose-services=, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-rsyslog-container, batch=17.1_20260112.1, build-date=2026-01-12T22:10:09Z, maintainer=OpenStack TripleO Team, version=17.1.13, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 rsyslog, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8e5028e38f7077561ef1e3e50ec174a3'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, summary=Red Hat OpenStack Platform 17.1 rsyslog, tcib_managed=true)
Feb 23 08:04:55 np0005626463.localdomain podman[64000]: rsyslog
Feb 23 08:04:55 np0005626463.localdomain systemd[1]: tripleo_rsyslog.service: Failed with result 'exit-code'.
Feb 23 08:04:55 np0005626463.localdomain python3[63998]: ansible-container_puppet_config Invoked with check_mode=False config_vol_prefix=/var/lib/config-data debug=True net_host=True no_archive=True puppet_config=/var/lib/container-puppet/container-puppet-tasks3.json short_hostname=np0005626463 step=3 update_config_hash_only=False
Feb 23 08:04:55 np0005626463.localdomain sudo[63992]: pam_unix(sudo:session): session closed for user root
Feb 23 08:04:55 np0005626463.localdomain systemd[1]: tripleo_rsyslog.service: Scheduled restart job, restart counter is at 4.
Feb 23 08:04:55 np0005626463.localdomain systemd[1]: Stopped rsyslog container.
Feb 23 08:04:55 np0005626463.localdomain systemd[1]: Starting rsyslog container...
Feb 23 08:04:55 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 08:04:55 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/abedacbc22b2fa6d036e7ebf1118c866641128a14eaf11021e9356d60564993d/merged/var/log/rsyslog supports timestamps until 2038 (0x7fffffff)
Feb 23 08:04:55 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/abedacbc22b2fa6d036e7ebf1118c866641128a14eaf11021e9356d60564993d/merged/var/lib/rsyslog supports timestamps until 2038 (0x7fffffff)
Feb 23 08:04:55 np0005626463.localdomain podman[64013]: 2026-02-23 08:04:55.672254004 +0000 UTC m=+0.101827937 container init 94a5c22430959b0767f37ebdcadc6a54425aaa14ae15dbfff3d47acdfec747f4 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, summary=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8e5028e38f7077561ef1e3e50ec174a3'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, com.redhat.component=openstack-rsyslog-container, batch=17.1_20260112.1, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-rsyslog, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, version=17.1.13, 
org.opencontainers.image.created=2026-01-12T22:10:09Z, release=1766032510, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 rsyslog, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, container_name=rsyslog, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:10:09Z, config_id=tripleo_step3, io.buildah.version=1.41.5, tcib_managed=true)
Feb 23 08:04:55 np0005626463.localdomain podman[64013]: 2026-02-23 08:04:55.681679556 +0000 UTC m=+0.111253489 container start 94a5c22430959b0767f37ebdcadc6a54425aaa14ae15dbfff3d47acdfec747f4 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, com.redhat.component=openstack-rsyslog-container, summary=Red Hat OpenStack Platform 17.1 rsyslog, architecture=x86_64, vcs-type=git, maintainer=OpenStack TripleO Team, version=17.1.13, io.buildah.version=1.41.5, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 rsyslog, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8e5028e38f7077561ef1e3e50ec174a3'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, container_name=rsyslog, managed_by=tripleo_ansible, build-date=2026-01-12T22:10:09Z, io.k8s.description=Red Hat OpenStack Platform 
17.1 rsyslog, name=rhosp-rhel9/openstack-rsyslog, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:10:09Z, batch=17.1_20260112.1, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, vendor=Red Hat, Inc.)
Feb 23 08:04:55 np0005626463.localdomain podman[64013]: rsyslog
Feb 23 08:04:55 np0005626463.localdomain systemd[1]: Started rsyslog container.
Feb 23 08:04:55 np0005626463.localdomain sudo[64033]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Feb 23 08:04:55 np0005626463.localdomain sudo[64033]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Feb 23 08:04:55 np0005626463.localdomain sudo[64033]: pam_unix(sudo:session): session closed for user root
Feb 23 08:04:55 np0005626463.localdomain systemd[1]: libpod-94a5c22430959b0767f37ebdcadc6a54425aaa14ae15dbfff3d47acdfec747f4.scope: Deactivated successfully.
Feb 23 08:04:55 np0005626463.localdomain podman[64037]: 2026-02-23 08:04:55.812202622 +0000 UTC m=+0.041942696 container died 94a5c22430959b0767f37ebdcadc6a54425aaa14ae15dbfff3d47acdfec747f4 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.expose-services=, version=17.1.13, managed_by=tripleo_ansible, batch=17.1_20260112.1, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 rsyslog, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:10:09Z, architecture=x86_64, com.redhat.component=openstack-rsyslog-container, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, container_name=rsyslog, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8e5028e38f7077561ef1e3e50ec174a3'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, name=rhosp-rhel9/openstack-rsyslog, org.opencontainers.image.created=2026-01-12T22:10:09Z, summary=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog)
Feb 23 08:04:55 np0005626463.localdomain podman[64037]: 2026-02-23 08:04:55.837837874 +0000 UTC m=+0.067577898 container cleanup 94a5c22430959b0767f37ebdcadc6a54425aaa14ae15dbfff3d47acdfec747f4 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, com.redhat.component=openstack-rsyslog-container, summary=Red Hat OpenStack Platform 17.1 rsyslog, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, batch=17.1_20260112.1, io.buildah.version=1.41.5, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=rsyslog, org.opencontainers.image.created=2026-01-12T22:10:09Z, url=https://www.redhat.com, vcs-type=git, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, name=rhosp-rhel9/openstack-rsyslog, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8e5028e38f7077561ef1e3e50ec174a3'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, build-date=2026-01-12T22:10:09Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., release=1766032510)
Feb 23 08:04:55 np0005626463.localdomain systemd[1]: tripleo_rsyslog.service: Main process exited, code=exited, status=1/FAILURE
Feb 23 08:04:55 np0005626463.localdomain podman[64051]: 2026-02-23 08:04:55.92565702 +0000 UTC m=+0.056067079 container cleanup 94a5c22430959b0767f37ebdcadc6a54425aaa14ae15dbfff3d47acdfec747f4 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, managed_by=tripleo_ansible, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-rsyslog-container, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8e5028e38f7077561ef1e3e50ec174a3'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', 
'/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, config_id=tripleo_step3, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 rsyslog, vendor=Red Hat, Inc., distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 rsyslog, url=https://www.redhat.com, release=1766032510, build-date=2026-01-12T22:10:09Z, vcs-type=git, name=rhosp-rhel9/openstack-rsyslog, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, org.opencontainers.image.created=2026-01-12T22:10:09Z, batch=17.1_20260112.1)
Feb 23 08:04:55 np0005626463.localdomain podman[64051]: rsyslog
Feb 23 08:04:55 np0005626463.localdomain systemd[1]: tripleo_rsyslog.service: Failed with result 'exit-code'.
Feb 23 08:04:56 np0005626463.localdomain systemd[1]: tripleo_rsyslog.service: Scheduled restart job, restart counter is at 5.
Feb 23 08:04:56 np0005626463.localdomain systemd[1]: Stopped rsyslog container.
Feb 23 08:04:56 np0005626463.localdomain systemd[1]: tripleo_rsyslog.service: Start request repeated too quickly.
Feb 23 08:04:56 np0005626463.localdomain systemd[1]: tripleo_rsyslog.service: Failed with result 'exit-code'.
Feb 23 08:04:56 np0005626463.localdomain systemd[1]: Failed to start rsyslog container.
Feb 23 08:04:56 np0005626463.localdomain sudo[64077]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pdouyzycddnzaptqvbyjgaoztpviqktw ; /usr/bin/python3
Feb 23 08:04:56 np0005626463.localdomain sudo[64077]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:04:56 np0005626463.localdomain python3[64079]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 08:04:56 np0005626463.localdomain sudo[64077]: pam_unix(sudo:session): session closed for user root
Feb 23 08:04:57 np0005626463.localdomain sudo[64093]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sjiyjceblbrauqmipdkpxqqevvcipgyl ; /usr/bin/python3
Feb 23 08:04:57 np0005626463.localdomain sudo[64093]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:04:57 np0005626463.localdomain python3[64095]: ansible-container_config_data Invoked with config_path=/var/lib/tripleo-config/container-puppet-config/step_3 config_pattern=container-puppet-*.json config_overrides={} debug=True
Feb 23 08:04:57 np0005626463.localdomain sudo[64093]: pam_unix(sudo:session): session closed for user root
Feb 23 08:04:57 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.
Feb 23 08:04:57 np0005626463.localdomain podman[64096]: 2026-02-23 08:04:57.911385461 +0000 UTC m=+0.085181932 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=starting, tcib_managed=true, vendor=Red Hat, Inc., com.redhat.component=openstack-collectd-container, org.opencontainers.image.created=2026-01-12T22:10:15Z, managed_by=tripleo_ansible, architecture=x86_64, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, summary=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20260112.1, config_id=tripleo_step3, version=17.1.13, container_name=collectd, build-date=2026-01-12T22:10:15Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com)
Feb 23 08:04:57 np0005626463.localdomain podman[64096]: 2026-02-23 08:04:57.951235769 +0000 UTC m=+0.125032220 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, managed_by=tripleo_ansible, io.openshift.expose-services=, tcib_managed=true, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, build-date=2026-01-12T22:10:15Z, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, vcs-type=git, com.redhat.component=openstack-collectd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:15Z, version=17.1.13)
Feb 23 08:04:57 np0005626463.localdomain systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully.
Feb 23 08:04:59 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.
Feb 23 08:04:59 np0005626463.localdomain podman[64116]: 2026-02-23 08:04:59.89080482 +0000 UTC m=+0.070150881 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vcs-ref=705339545363fec600102567c4e923938e0f43b3, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, container_name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, managed_by=tripleo_ansible, url=https://www.redhat.com, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, 
org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, io.buildah.version=1.41.5, version=17.1.13, name=rhosp-rhel9/openstack-iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, build-date=2026-01-12T22:34:43Z, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, architecture=x86_64)
Feb 23 08:04:59 np0005626463.localdomain podman[64116]: 2026-02-23 08:04:59.906175882 +0000 UTC m=+0.085521903 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, release=1766032510, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:34:43Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-iscsid, io.openshift.expose-services=, url=https://www.redhat.com, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, architecture=x86_64, container_name=iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, config_id=tripleo_step3)
Feb 23 08:04:59 np0005626463.localdomain systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully.
Feb 23 08:05:04 np0005626463.localdomain sudo[64135]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 08:05:04 np0005626463.localdomain sudo[64135]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:05:04 np0005626463.localdomain sudo[64135]: pam_unix(sudo:session): session closed for user root
Feb 23 08:05:04 np0005626463.localdomain sudo[64150]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 23 08:05:04 np0005626463.localdomain sudo[64150]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:05:05 np0005626463.localdomain sudo[64150]: pam_unix(sudo:session): session closed for user root
Feb 23 08:05:06 np0005626463.localdomain sudo[64196]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 08:05:06 np0005626463.localdomain sudo[64196]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:05:06 np0005626463.localdomain sudo[64196]: pam_unix(sudo:session): session closed for user root
Feb 23 08:05:07 np0005626463.localdomain sshd[64211]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:05:07 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.
Feb 23 08:05:07 np0005626463.localdomain podman[64212]: 2026-02-23 08:05:07.92344877 +0000 UTC m=+0.092097775 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, tcib_managed=true, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., release=1766032510, name=rhosp-rhel9/openstack-qdrouterd, config_id=tripleo_step1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, maintainer=OpenStack TripleO Team, 
org.opencontainers.image.created=2026-01-12T22:10:14Z, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-qdrouterd-container, version=17.1.13, io.buildah.version=1.41.5, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, managed_by=tripleo_ansible, build-date=2026-01-12T22:10:14Z)
Feb 23 08:05:08 np0005626463.localdomain podman[64212]: 2026-02-23 08:05:08.129450267 +0000 UTC m=+0.298099292 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, batch=17.1_20260112.1, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, url=https://www.redhat.com, architecture=x86_64, vendor=Red Hat, Inc., io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, distribution-scope=public, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, container_name=metrics_qdr)
Feb 23 08:05:08 np0005626463.localdomain systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully.
Feb 23 08:05:09 np0005626463.localdomain sshd[64211]: Invalid user admin from 185.156.73.233 port 50002
Feb 23 08:05:09 np0005626463.localdomain sshd[64211]: Connection closed by invalid user admin 185.156.73.233 port 50002 [preauth]
Feb 23 08:05:17 np0005626463.localdomain sshd[64242]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:05:18 np0005626463.localdomain sshd[64242]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:05:28 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.
Feb 23 08:05:28 np0005626463.localdomain podman[64244]: 2026-02-23 08:05:28.919113594 +0000 UTC m=+0.091210236 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, architecture=x86_64, build-date=2026-01-12T22:10:15Z, tcib_managed=true, name=rhosp-rhel9/openstack-collectd, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, com.redhat.component=openstack-collectd-container, managed_by=tripleo_ansible, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step3, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, vcs-type=git, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 23 08:05:28 np0005626463.localdomain podman[64244]: 2026-02-23 08:05:28.954721395 +0000 UTC m=+0.126818037 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vcs-type=git, url=https://www.redhat.com, architecture=x86_64, build-date=2026-01-12T22:10:15Z, io.buildah.version=1.41.5, tcib_managed=true, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, config_id=tripleo_step3, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-collectd, managed_by=tripleo_ansible, release=1766032510, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 23 08:05:28 np0005626463.localdomain systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully.
Feb 23 08:05:30 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.
Feb 23 08:05:30 np0005626463.localdomain podman[64265]: 2026-02-23 08:05:30.903806731 +0000 UTC m=+0.081796024 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, version=17.1.13, release=1766032510, io.buildah.version=1.41.5, architecture=x86_64, distribution-scope=public, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vcs-type=git, vcs-ref=705339545363fec600102567c4e923938e0f43b3, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, build-date=2026-01-12T22:34:43Z, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1, vendor=Red Hat, Inc., com.redhat.component=openstack-iscsid-container, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, container_name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:34:43Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 23 08:05:30 np0005626463.localdomain podman[64265]: 2026-02-23 08:05:30.91217636 +0000 UTC m=+0.090165623 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2026-01-12T22:34:43Z, distribution-scope=public, config_id=tripleo_step3, org.opencontainers.image.created=2026-01-12T22:34:43Z, version=17.1.13, vcs-type=git, container_name=iscsid, name=rhosp-rhel9/openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, url=https://www.redhat.com, vcs-ref=705339545363fec600102567c4e923938e0f43b3, vendor=Red Hat, Inc., tcib_managed=true)
Feb 23 08:05:30 np0005626463.localdomain systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully.
Feb 23 08:05:38 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.
Feb 23 08:05:38 np0005626463.localdomain podman[64286]: 2026-02-23 08:05:38.900339024 +0000 UTC m=+0.069875562 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, url=https://www.redhat.com, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.created=2026-01-12T22:10:14Z, vendor=Red Hat, Inc., batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, build-date=2026-01-12T22:10:14Z, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public, version=17.1.13)
Feb 23 08:05:39 np0005626463.localdomain podman[64286]: 2026-02-23 08:05:39.102236448 +0000 UTC m=+0.271772976 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-qdrouterd-container, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, build-date=2026-01-12T22:10:14Z, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, io.buildah.version=1.41.5, vendor=Red Hat, Inc., url=https://www.redhat.com, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, distribution-scope=public, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step1, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:14Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13)
Feb 23 08:05:39 np0005626463.localdomain systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully.
Feb 23 08:05:44 np0005626463.localdomain sshd[64315]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:05:45 np0005626463.localdomain sshd[64315]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:05:56 np0005626463.localdomain sshd[64317]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:05:57 np0005626463.localdomain sshd[64317]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:05:59 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.
Feb 23 08:05:59 np0005626463.localdomain podman[64319]: 2026-02-23 08:05:59.90731125 +0000 UTC m=+0.083071574 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, container_name=collectd, build-date=2026-01-12T22:10:15Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, architecture=x86_64, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, 
release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., version=17.1.13, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, managed_by=tripleo_ansible, url=https://www.redhat.com, name=rhosp-rhel9/openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.component=openstack-collectd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, batch=17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true)
Feb 23 08:05:59 np0005626463.localdomain podman[64319]: 2026-02-23 08:05:59.921304258 +0000 UTC m=+0.097064602 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, architecture=x86_64, build-date=2026-01-12T22:10:15Z, release=1766032510, tcib_managed=true, 
config_id=tripleo_step3, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, com.redhat.component=openstack-collectd-container, container_name=collectd, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:10:15Z, distribution-scope=public, name=rhosp-rhel9/openstack-collectd, io.buildah.version=1.41.5, url=https://www.redhat.com)
Feb 23 08:05:59 np0005626463.localdomain systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully.
Feb 23 08:06:01 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.
Feb 23 08:06:01 np0005626463.localdomain systemd[1]: tmp-crun.nd9wS2.mount: Deactivated successfully.
Feb 23 08:06:01 np0005626463.localdomain podman[64340]: 2026-02-23 08:06:01.907733652 +0000 UTC m=+0.083600302 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-iscsid, container_name=iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, version=17.1.13, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-iscsid-container, org.opencontainers.image.created=2026-01-12T22:34:43Z, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, distribution-scope=public, build-date=2026-01-12T22:34:43Z, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-type=git)
Feb 23 08:06:01 np0005626463.localdomain podman[64340]: 2026-02-23 08:06:01.91918183 +0000 UTC m=+0.095048500 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, config_id=tripleo_step3, io.openshift.expose-services=, release=1766032510, vcs-type=git, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, distribution-scope=public, container_name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, tcib_managed=true, vcs-ref=705339545363fec600102567c4e923938e0f43b3, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.buildah.version=1.41.5, com.redhat.component=openstack-iscsid-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, summary=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:34:43Z, build-date=2026-01-12T22:34:43Z, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64)
Feb 23 08:06:01 np0005626463.localdomain systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully.
Feb 23 08:06:06 np0005626463.localdomain sudo[64359]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 08:06:06 np0005626463.localdomain sudo[64359]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:06:06 np0005626463.localdomain sudo[64359]: pam_unix(sudo:session): session closed for user root
Feb 23 08:06:06 np0005626463.localdomain sudo[64374]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 23 08:06:06 np0005626463.localdomain sudo[64374]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:06:07 np0005626463.localdomain sudo[64374]: pam_unix(sudo:session): session closed for user root
Feb 23 08:06:07 np0005626463.localdomain sudo[64420]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 08:06:07 np0005626463.localdomain sudo[64420]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:06:07 np0005626463.localdomain sudo[64420]: pam_unix(sudo:session): session closed for user root
Feb 23 08:06:09 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.
Feb 23 08:06:09 np0005626463.localdomain podman[64435]: 2026-02-23 08:06:09.907283421 +0000 UTC m=+0.080644247 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:10:14Z, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, tcib_managed=true, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., version=17.1.13, io.buildah.version=1.41.5, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, container_name=metrics_qdr, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:10:14Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-type=git, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-qdrouterd, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, summary=Red Hat OpenStack Platform 17.1 qdrouterd)
Feb 23 08:06:10 np0005626463.localdomain podman[64435]: 2026-02-23 08:06:10.109416943 +0000 UTC m=+0.282777739 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, container_name=metrics_qdr, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, 
org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_id=tripleo_step1, name=rhosp-rhel9/openstack-qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, tcib_managed=true, build-date=2026-01-12T22:10:14Z, org.opencontainers.image.created=2026-01-12T22:10:14Z, managed_by=tripleo_ansible, release=1766032510)
Feb 23 08:06:10 np0005626463.localdomain systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully.
Feb 23 08:06:30 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.
Feb 23 08:06:30 np0005626463.localdomain podman[64465]: 2026-02-23 08:06:30.913790052 +0000 UTC m=+0.087031962 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, config_id=tripleo_step3, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, tcib_managed=true, com.redhat.component=openstack-collectd-container, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20260112.1, architecture=x86_64, vcs-type=git, io.openshift.expose-services=, release=1766032510, org.opencontainers.image.created=2026-01-12T22:10:15Z, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:10:15Z, container_name=collectd, name=rhosp-rhel9/openstack-collectd, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.buildah.version=1.41.5)
Feb 23 08:06:30 np0005626463.localdomain podman[64465]: 2026-02-23 08:06:30.923605056 +0000 UTC m=+0.096847016 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-collectd-container, tcib_managed=true, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', 
'/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.5, vendor=Red Hat, Inc., org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, distribution-scope=public, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, container_name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, config_id=tripleo_step3, build-date=2026-01-12T22:10:15Z, release=1766032510, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-type=git)
Feb 23 08:06:30 np0005626463.localdomain systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully.
Feb 23 08:06:32 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.
Feb 23 08:06:32 np0005626463.localdomain systemd[1]: tmp-crun.sE8eI4.mount: Deactivated successfully.
Feb 23 08:06:32 np0005626463.localdomain podman[64486]: 2026-02-23 08:06:32.913341945 +0000 UTC m=+0.088608032 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, version=17.1.13, vcs-ref=705339545363fec600102567c4e923938e0f43b3, maintainer=OpenStack TripleO Team, architecture=x86_64, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1, config_id=tripleo_step3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, build-date=2026-01-12T22:34:43Z, vendor=Red Hat, Inc., tcib_managed=true, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, release=1766032510, url=https://www.redhat.com, name=rhosp-rhel9/openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:34:43Z, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, com.redhat.component=openstack-iscsid-container, container_name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid)
Feb 23 08:06:32 np0005626463.localdomain podman[64486]: 2026-02-23 08:06:32.952363517 +0000 UTC m=+0.127629604 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, container_name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, vcs-type=git, release=1766032510, tcib_managed=true, architecture=x86_64, maintainer=OpenStack TripleO Team, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, name=rhosp-rhel9/openstack-iscsid, version=17.1.13, 
io.openshift.expose-services=, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:34:43Z, vcs-ref=705339545363fec600102567c4e923938e0f43b3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:34:43Z, com.redhat.component=openstack-iscsid-container, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 iscsid)
Feb 23 08:06:32 np0005626463.localdomain systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully.
Feb 23 08:06:37 np0005626463.localdomain sshd[64506]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:06:37 np0005626463.localdomain sshd[64506]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:06:39 np0005626463.localdomain sshd[64508]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:06:39 np0005626463.localdomain sshd[64508]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:06:40 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.
Feb 23 08:06:40 np0005626463.localdomain systemd[1]: tmp-crun.QuEhI8.mount: Deactivated successfully.
Feb 23 08:06:40 np0005626463.localdomain podman[64510]: 2026-02-23 08:06:40.910813735 +0000 UTC m=+0.085533121 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.buildah.version=1.41.5, architecture=x86_64, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:10:14Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, distribution-scope=public, config_id=tripleo_step1, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:10:14Z, release=1766032510, version=17.1.13, container_name=metrics_qdr, konflux.additional-tags=17.1.13 17.1_20260112.1, 
org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, vendor=Red Hat, Inc., com.redhat.component=openstack-qdrouterd-container)
Feb 23 08:06:41 np0005626463.localdomain podman[64510]: 2026-02-23 08:06:41.119375215 +0000 UTC m=+0.294094591 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, 
managed_by=tripleo_ansible, config_id=tripleo_step1, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:14Z, tcib_managed=true, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:14Z, url=https://www.redhat.com, container_name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13)
Feb 23 08:06:41 np0005626463.localdomain systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully.
Feb 23 08:07:01 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.
Feb 23 08:07:01 np0005626463.localdomain systemd[1]: tmp-crun.ZE94jW.mount: Deactivated successfully.
Feb 23 08:07:01 np0005626463.localdomain podman[64539]: 2026-02-23 08:07:01.921593648 +0000 UTC m=+0.094842492 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, distribution-scope=public, io.buildah.version=1.41.5, tcib_managed=true, com.redhat.component=openstack-collectd-container, build-date=2026-01-12T22:10:15Z, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.expose-services=, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, config_id=tripleo_step3, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=collectd)
Feb 23 08:07:01 np0005626463.localdomain podman[64539]: 2026-02-23 08:07:01.933020756 +0000 UTC m=+0.106269630 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, name=rhosp-rhel9/openstack-collectd, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, 
org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-collectd-container, build-date=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, distribution-scope=public, config_id=tripleo_step3, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.buildah.version=1.41.5, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, vcs-type=git)
Feb 23 08:07:01 np0005626463.localdomain systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully.
Feb 23 08:07:03 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.
Feb 23 08:07:03 np0005626463.localdomain podman[64558]: 2026-02-23 08:07:03.910694376 +0000 UTC m=+0.080996175 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vendor=Red Hat, Inc., 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, build-date=2026-01-12T22:34:43Z, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, com.redhat.component=openstack-iscsid-container, distribution-scope=public)
Feb 23 08:07:03 np0005626463.localdomain podman[64558]: 2026-02-23 08:07:03.945482609 +0000 UTC m=+0.115784478 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, build-date=2026-01-12T22:34:43Z, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, summary=Red 
Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, io.buildah.version=1.41.5, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, container_name=iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, vendor=Red Hat, Inc., org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, release=1766032510, architecture=x86_64, com.redhat.component=openstack-iscsid-container, version=17.1.13, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, distribution-scope=public, name=rhosp-rhel9/openstack-iscsid)
Feb 23 08:07:03 np0005626463.localdomain systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully.
Feb 23 08:07:08 np0005626463.localdomain sudo[64577]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 08:07:08 np0005626463.localdomain sudo[64577]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:07:08 np0005626463.localdomain sudo[64577]: pam_unix(sudo:session): session closed for user root
Feb 23 08:07:08 np0005626463.localdomain sudo[64592]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 23 08:07:08 np0005626463.localdomain sudo[64592]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:07:08 np0005626463.localdomain sudo[64592]: pam_unix(sudo:session): session closed for user root
Feb 23 08:07:11 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.
Feb 23 08:07:11 np0005626463.localdomain podman[64639]: 2026-02-23 08:07:11.913983285 +0000 UTC m=+0.083954230 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:14Z, vcs-type=git, batch=17.1_20260112.1, release=1766032510, io.openshift.expose-services=, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, name=rhosp-rhel9/openstack-qdrouterd, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc., build-date=2026-01-12T22:10:14Z, container_name=metrics_qdr, 
com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd)
Feb 23 08:07:12 np0005626463.localdomain podman[64639]: 2026-02-23 08:07:12.131583927 +0000 UTC m=+0.301554892 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, build-date=2026-01-12T22:10:14Z, container_name=metrics_qdr, name=rhosp-rhel9/openstack-qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, vcs-type=git, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, com.redhat.component=openstack-qdrouterd-container, version=17.1.13, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd)
Feb 23 08:07:12 np0005626463.localdomain systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully.
Feb 23 08:07:12 np0005626463.localdomain sudo[64668]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 08:07:12 np0005626463.localdomain sudo[64668]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:07:12 np0005626463.localdomain sudo[64668]: pam_unix(sudo:session): session closed for user root
Feb 23 08:07:17 np0005626463.localdomain sshd[64683]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:07:18 np0005626463.localdomain sshd[64683]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:07:32 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.
Feb 23 08:07:32 np0005626463.localdomain podman[64685]: 2026-02-23 08:07:32.913934248 +0000 UTC m=+0.088100263 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, config_id=tripleo_step3, org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.5, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, managed_by=tripleo_ansible, container_name=collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', 
'/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, build-date=2026-01-12T22:10:15Z, tcib_managed=true, com.redhat.component=openstack-collectd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-collectd, architecture=x86_64, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, vcs-type=git, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc.)
Feb 23 08:07:32 np0005626463.localdomain podman[64685]: 2026-02-23 08:07:32.952282376 +0000 UTC m=+0.126448411 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, name=rhosp-rhel9/openstack-collectd, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-collectd-container, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:10:15Z, version=17.1.13, release=1766032510, container_name=collectd, distribution-scope=public, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.5, config_id=tripleo_step3, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., build-date=2026-01-12T22:10:15Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, url=https://www.redhat.com)
Feb 23 08:07:32 np0005626463.localdomain systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully.
Feb 23 08:07:34 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.
Feb 23 08:07:34 np0005626463.localdomain podman[64706]: 2026-02-23 08:07:34.905547269 +0000 UTC m=+0.082106770 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, version=17.1.13, architecture=x86_64, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., vcs-ref=705339545363fec600102567c4e923938e0f43b3, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:34:43Z, 
org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, build-date=2026-01-12T22:34:43Z, tcib_managed=true, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, release=1766032510, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-iscsid, maintainer=OpenStack TripleO Team, vcs-type=git)
Feb 23 08:07:34 np0005626463.localdomain podman[64706]: 2026-02-23 08:07:34.944277999 +0000 UTC m=+0.120837490 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, release=1766032510, version=17.1.13, url=https://www.redhat.com, vcs-type=git, com.redhat.component=openstack-iscsid-container, container_name=iscsid, build-date=2026-01-12T22:34:43Z, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.created=2026-01-12T22:34:43Z, config_id=tripleo_step3)
Feb 23 08:07:34 np0005626463.localdomain systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully.
Feb 23 08:07:35 np0005626463.localdomain sshd[64726]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:07:35 np0005626463.localdomain sshd[64726]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:07:42 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.
Feb 23 08:07:42 np0005626463.localdomain podman[64728]: 2026-02-23 08:07:42.917150438 +0000 UTC m=+0.095209683 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, distribution-scope=public, version=17.1.13, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-qdrouterd, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, tcib_managed=true, managed_by=tripleo_ansible, config_id=tripleo_step1, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, url=https://www.redhat.com, io.openshift.expose-services=, vcs-type=git, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:14Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd)
Feb 23 08:07:43 np0005626463.localdomain podman[64728]: 2026-02-23 08:07:43.14841286 +0000 UTC m=+0.326472115 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, name=rhosp-rhel9/openstack-qdrouterd, release=1766032510, tcib_managed=true, container_name=metrics_qdr, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, version=17.1.13, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:10:14Z, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, io.buildah.version=1.41.5, vcs-type=git, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-qdrouterd-container)
Feb 23 08:07:43 np0005626463.localdomain systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully.
Feb 23 08:07:59 np0005626463.localdomain sshd[64758]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:08:00 np0005626463.localdomain sshd[64758]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:08:03 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.
Feb 23 08:08:03 np0005626463.localdomain podman[64760]: 2026-02-23 08:08:03.913445782 +0000 UTC m=+0.082886425 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.component=openstack-collectd-container, distribution-scope=public, vcs-type=git, container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, build-date=2026-01-12T22:10:15Z, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, version=17.1.13, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, architecture=x86_64, url=https://www.redhat.com)
Feb 23 08:08:03 np0005626463.localdomain podman[64760]: 2026-02-23 08:08:03.921782811 +0000 UTC m=+0.091223464 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, distribution-scope=public, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, io.buildah.version=1.41.5, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, release=1766032510, name=rhosp-rhel9/openstack-collectd, build-date=2026-01-12T22:10:15Z, architecture=x86_64, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.component=openstack-collectd-container, maintainer=OpenStack TripleO Team, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, container_name=collectd, tcib_managed=true, url=https://www.redhat.com)
Feb 23 08:08:03 np0005626463.localdomain systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully.
Feb 23 08:08:05 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.
Feb 23 08:08:05 np0005626463.localdomain podman[64780]: 2026-02-23 08:08:05.907349796 +0000 UTC m=+0.083720373 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, distribution-scope=public, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, build-date=2026-01-12T22:34:43Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, vcs-ref=705339545363fec600102567c4e923938e0f43b3, description=Red Hat OpenStack Platform 17.1 iscsid, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, container_name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, vcs-type=git, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, tcib_managed=true, com.redhat.component=openstack-iscsid-container, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:34:43Z, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, config_id=tripleo_step3, managed_by=tripleo_ansible, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid, vendor=Red Hat, Inc.)
Feb 23 08:08:05 np0005626463.localdomain podman[64780]: 2026-02-23 08:08:05.920259482 +0000 UTC m=+0.096630019 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, release=1766032510, name=rhosp-rhel9/openstack-iscsid, container_name=iscsid, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:34:43Z, description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, summary=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vendor=Red Hat, Inc., architecture=x86_64, managed_by=tripleo_ansible, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2026-01-12T22:34:43Z, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, vcs-type=git, vcs-ref=705339545363fec600102567c4e923938e0f43b3)
Feb 23 08:08:05 np0005626463.localdomain systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully.
Feb 23 08:08:12 np0005626463.localdomain sudo[64800]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 08:08:12 np0005626463.localdomain sudo[64800]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:08:12 np0005626463.localdomain sudo[64800]: pam_unix(sudo:session): session closed for user root
Feb 23 08:08:12 np0005626463.localdomain sudo[64815]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 23 08:08:12 np0005626463.localdomain sudo[64815]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:08:13 np0005626463.localdomain sudo[64815]: pam_unix(sudo:session): session closed for user root
Feb 23 08:08:13 np0005626463.localdomain sudo[64863]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 08:08:13 np0005626463.localdomain sudo[64863]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:08:13 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.
Feb 23 08:08:13 np0005626463.localdomain sudo[64863]: pam_unix(sudo:session): session closed for user root
Feb 23 08:08:13 np0005626463.localdomain sudo[64879]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 list-networks
Feb 23 08:08:13 np0005626463.localdomain sudo[64879]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:08:13 np0005626463.localdomain podman[64878]: 2026-02-23 08:08:13.897016646 +0000 UTC m=+0.094600644 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, release=1766032510, batch=17.1_20260112.1, config_id=tripleo_step1, version=17.1.13, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-qdrouterd-container, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, tcib_managed=true, 
cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, build-date=2026-01-12T22:10:14Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, url=https://www.redhat.com, vcs-type=git, name=rhosp-rhel9/openstack-qdrouterd, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible)
Feb 23 08:08:14 np0005626463.localdomain podman[64878]: 2026-02-23 08:08:14.094145296 +0000 UTC m=+0.291729204 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, version=17.1.13, distribution-scope=public, container_name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, vcs-type=git, tcib_managed=true, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp 
openstack osp-17.1 openstack-qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, architecture=x86_64, config_id=tripleo_step1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z)
Feb 23 08:08:14 np0005626463.localdomain systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully.
Feb 23 08:08:14 np0005626463.localdomain sudo[64879]: pam_unix(sudo:session): session closed for user root
Feb 23 08:08:18 np0005626463.localdomain sudo[64941]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 08:08:18 np0005626463.localdomain sudo[64941]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:08:18 np0005626463.localdomain sudo[64941]: pam_unix(sudo:session): session closed for user root
Feb 23 08:08:30 np0005626463.localdomain sshd[64957]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:08:32 np0005626463.localdomain sshd[64957]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:08:34 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.
Feb 23 08:08:34 np0005626463.localdomain systemd[1]: tmp-crun.uwi1pF.mount: Deactivated successfully.
Feb 23 08:08:34 np0005626463.localdomain podman[64959]: 2026-02-23 08:08:34.910585636 +0000 UTC m=+0.080133767 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, build-date=2026-01-12T22:10:15Z, tcib_managed=true, container_name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, architecture=x86_64, io.buildah.version=1.41.5, version=17.1.13, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, name=rhosp-rhel9/openstack-collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, batch=17.1_20260112.1, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.component=openstack-collectd-container)
Feb 23 08:08:34 np0005626463.localdomain podman[64959]: 2026-02-23 08:08:34.922722887 +0000 UTC m=+0.092271038 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-collectd-container, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vendor=Red Hat, Inc., architecture=x86_64, batch=17.1_20260112.1, io.buildah.version=1.41.5, url=https://www.redhat.com, name=rhosp-rhel9/openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, build-date=2026-01-12T22:10:15Z, container_name=collectd, vcs-type=git, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true)
Feb 23 08:08:34 np0005626463.localdomain systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully.
Feb 23 08:08:36 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.
Feb 23 08:08:36 np0005626463.localdomain podman[64980]: 2026-02-23 08:08:36.916762856 +0000 UTC m=+0.093009262 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, com.redhat.component=openstack-iscsid-container, name=rhosp-rhel9/openstack-iscsid, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, architecture=x86_64, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=iscsid, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, config_id=tripleo_step3, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:34:43Z, description=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=705339545363fec600102567c4e923938e0f43b3, build-date=2026-01-12T22:34:43Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vendor=Red Hat, Inc., managed_by=tripleo_ansible, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']})
Feb 23 08:08:36 np0005626463.localdomain podman[64980]: 2026-02-23 08:08:36.952366595 +0000 UTC m=+0.128613031 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, container_name=iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, architecture=x86_64, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:34:43Z, url=https://www.redhat.com, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_id=tripleo_step3, vcs-type=git, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, distribution-scope=public, io.buildah.version=1.41.5, release=1766032510, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, build-date=2026-01-12T22:34:43Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, name=rhosp-rhel9/openstack-iscsid)
Feb 23 08:08:36 np0005626463.localdomain systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully.
Feb 23 08:08:40 np0005626463.localdomain sshd[64999]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:08:41 np0005626463.localdomain sshd[64999]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:08:44 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.
Feb 23 08:08:44 np0005626463.localdomain systemd[1]: tmp-crun.3ykbQh.mount: Deactivated successfully.
Feb 23 08:08:44 np0005626463.localdomain podman[65001]: 2026-02-23 08:08:44.912745099 +0000 UTC m=+0.085795067 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:10:14Z, com.redhat.component=openstack-qdrouterd-container, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, io.openshift.expose-services=, release=1766032510, version=17.1.13, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:10:14Z, config_id=tripleo_step1, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, container_name=metrics_qdr)
Feb 23 08:08:45 np0005626463.localdomain podman[65001]: 2026-02-23 08:08:45.105911147 +0000 UTC m=+0.278961075 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.5, io.openshift.expose-services=, config_id=tripleo_step1, managed_by=tripleo_ansible, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-qdrouterd, architecture=x86_64, 
description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, maintainer=OpenStack TripleO Team, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, org.opencontainers.image.created=2026-01-12T22:10:14Z, distribution-scope=public, container_name=metrics_qdr, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true)
Feb 23 08:08:45 np0005626463.localdomain systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully.
Feb 23 08:08:50 np0005626463.localdomain sudo[65076]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rfutabdodaizecjedhhxtovpezbyreqm ; /usr/bin/python3
Feb 23 08:08:50 np0005626463.localdomain sudo[65076]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:08:50 np0005626463.localdomain python3[65078]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/config_step.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 08:08:50 np0005626463.localdomain sudo[65076]: pam_unix(sudo:session): session closed for user root
Feb 23 08:08:50 np0005626463.localdomain sudo[65121]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-enphkplrwtpjcdbtvyslxioguwgooafk ; /usr/bin/python3
Feb 23 08:08:50 np0005626463.localdomain sudo[65121]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:08:50 np0005626463.localdomain python3[65123]: ansible-ansible.legacy.copy Invoked with dest=/etc/puppet/hieradata/config_step.json force=True mode=0600 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771834130.036081-107933-2862644198044/source _original_basename=tmp3dlyj6vc follow=False checksum=ee48fb03297eb703b1954c8852d0f67fab51dac1 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 08:08:50 np0005626463.localdomain sudo[65121]: pam_unix(sudo:session): session closed for user root
Feb 23 08:08:51 np0005626463.localdomain sudo[65183]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wqiyqprfovzmhxkcuwbsykqvbvpncrfx ; /usr/bin/python3
Feb 23 08:08:51 np0005626463.localdomain sudo[65183]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:08:52 np0005626463.localdomain python3[65185]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/recover_tripleo_nova_virtqemud.sh follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 08:08:52 np0005626463.localdomain sudo[65183]: pam_unix(sudo:session): session closed for user root
Feb 23 08:08:52 np0005626463.localdomain sudo[65226]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kddkuseicscycimjezmrdokdsyecfzby ; /usr/bin/python3
Feb 23 08:08:52 np0005626463.localdomain sudo[65226]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:08:52 np0005626463.localdomain python3[65228]: ansible-ansible.legacy.copy Invoked with dest=/usr/libexec/recover_tripleo_nova_virtqemud.sh mode=0755 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771834131.7460716-108020-175022870004861/source _original_basename=tmpgzuvoap0 follow=False checksum=922b8aa8342176110bffc2e39abdccc2b39e53a9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 08:08:52 np0005626463.localdomain sudo[65226]: pam_unix(sudo:session): session closed for user root
Feb 23 08:08:52 np0005626463.localdomain sudo[65288]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-notcagklqrqrrmdgjnlznljzslwtjlqm ; /usr/bin/python3
Feb 23 08:08:52 np0005626463.localdomain sudo[65288]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:08:53 np0005626463.localdomain python3[65290]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud_recover.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 08:08:53 np0005626463.localdomain sudo[65288]: pam_unix(sudo:session): session closed for user root
Feb 23 08:08:53 np0005626463.localdomain sudo[65331]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-shqxmmvbahvjwrdvxykofoechqylynfl ; /usr/bin/python3
Feb 23 08:08:53 np0005626463.localdomain sudo[65331]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:08:53 np0005626463.localdomain python3[65333]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/tripleo_nova_virtqemud_recover.service mode=0644 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771834132.7300253-108074-181335551316782/source _original_basename=tmpiy7e7gqm follow=False checksum=92f73544b703afc85885fa63ab07bdf8f8671554 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 08:08:53 np0005626463.localdomain sudo[65331]: pam_unix(sudo:session): session closed for user root
Feb 23 08:08:53 np0005626463.localdomain sudo[65393]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-weczxzazieripbgslaakkasjlnmiquxb ; /usr/bin/python3
Feb 23 08:08:53 np0005626463.localdomain sudo[65393]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:08:54 np0005626463.localdomain python3[65395]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud_recover.timer follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 08:08:54 np0005626463.localdomain sudo[65393]: pam_unix(sudo:session): session closed for user root
Feb 23 08:08:54 np0005626463.localdomain sudo[65436]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sgbvrhkxrkhuiavsbksnebghfykglobx ; /usr/bin/python3
Feb 23 08:08:54 np0005626463.localdomain sudo[65436]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:08:54 np0005626463.localdomain python3[65438]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/tripleo_nova_virtqemud_recover.timer mode=0644 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771834133.7447448-108147-246640380453297/source _original_basename=tmpm63ch93t follow=False checksum=c6e5f76a53c0d6ccaf46c4b48d813dc2891ad8e9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 08:08:54 np0005626463.localdomain sudo[65436]: pam_unix(sudo:session): session closed for user root
Feb 23 08:08:54 np0005626463.localdomain sudo[65466]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wekwjkuqpflhcoivvbkoysxxtjxxphbw ; /usr/bin/python3
Feb 23 08:08:54 np0005626463.localdomain sudo[65466]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:08:55 np0005626463.localdomain python3[65468]: ansible-systemd Invoked with daemon_reload=True enabled=True name=tripleo_nova_virtqemud_recover.service daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Feb 23 08:08:55 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 08:08:55 np0005626463.localdomain systemd-rc-local-generator[65489]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 08:08:55 np0005626463.localdomain systemd-sysv-generator[65493]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 08:08:55 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 08:08:55 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 08:08:55 np0005626463.localdomain systemd-sysv-generator[65535]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 08:08:55 np0005626463.localdomain systemd-rc-local-generator[65529]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 08:08:55 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 08:08:55 np0005626463.localdomain sudo[65466]: pam_unix(sudo:session): session closed for user root
Feb 23 08:08:55 np0005626463.localdomain sudo[65556]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xbxywlvkplearyvemsqzkxggimhvdmyb ; /usr/bin/python3
Feb 23 08:08:55 np0005626463.localdomain sudo[65556]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:08:56 np0005626463.localdomain python3[65558]: ansible-systemd Invoked with daemon_reload=True enabled=True name=tripleo_nova_virtqemud_recover.timer state=restarted daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 08:08:56 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 08:08:56 np0005626463.localdomain systemd-rc-local-generator[65580]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 08:08:56 np0005626463.localdomain systemd-sysv-generator[65586]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 08:08:56 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 08:08:56 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 08:08:56 np0005626463.localdomain systemd-rc-local-generator[65621]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 08:08:56 np0005626463.localdomain systemd-sysv-generator[65627]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 08:08:56 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 08:08:56 np0005626463.localdomain systemd[1]: Started Check and recover tripleo_nova_virtqemud every 10m.
Feb 23 08:08:56 np0005626463.localdomain sudo[65556]: pam_unix(sudo:session): session closed for user root
Feb 23 08:08:57 np0005626463.localdomain sudo[65647]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mndjhbctylvrbftusqgpidibhuyzbxih ; /usr/bin/python3
Feb 23 08:08:57 np0005626463.localdomain sudo[65647]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:08:57 np0005626463.localdomain python3[65649]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl enable --now tripleo_nova_virtqemud_recover.timer _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 08:08:57 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 08:08:57 np0005626463.localdomain systemd-sysv-generator[65677]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 08:08:57 np0005626463.localdomain systemd-rc-local-generator[65674]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 08:08:57 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 08:08:57 np0005626463.localdomain sudo[65647]: pam_unix(sudo:session): session closed for user root
Feb 23 08:08:58 np0005626463.localdomain sudo[65732]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zdlrcajmowmgjldxosgibzaqyamqlbwj ; /usr/bin/python3
Feb 23 08:08:58 np0005626463.localdomain sudo[65732]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:08:58 np0005626463.localdomain python3[65734]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/tripleo_nova_libvirt.target follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 08:08:58 np0005626463.localdomain sudo[65732]: pam_unix(sudo:session): session closed for user root
Feb 23 08:08:58 np0005626463.localdomain sudo[65775]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wonbnmdfscnzwwghiyfavulsxyffxkez ; /usr/bin/python3
Feb 23 08:08:58 np0005626463.localdomain sudo[65775]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:08:58 np0005626463.localdomain python3[65777]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/tripleo_nova_libvirt.target group=root mode=0644 owner=root src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771834137.842005-108308-8662771612812/source _original_basename=tmpdo6uhayd follow=False checksum=c064b4a8e7d3d1d7c62d1f80a09e350659996afd backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 08:08:58 np0005626463.localdomain sudo[65775]: pam_unix(sudo:session): session closed for user root
Feb 23 08:08:58 np0005626463.localdomain sudo[65805]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ivzbkbqhfixbptplbhuoysjnebkrthjz ; /usr/bin/python3
Feb 23 08:08:58 np0005626463.localdomain sudo[65805]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:08:59 np0005626463.localdomain python3[65807]: ansible-systemd Invoked with daemon_reload=True enabled=True name=tripleo_nova_libvirt.target state=restarted daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 08:08:59 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 08:08:59 np0005626463.localdomain systemd-sysv-generator[65837]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 08:08:59 np0005626463.localdomain systemd-rc-local-generator[65834]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 08:08:59 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 08:08:59 np0005626463.localdomain systemd[1]: Reached target tripleo_nova_libvirt.target.
Feb 23 08:08:59 np0005626463.localdomain sudo[65805]: pam_unix(sudo:session): session closed for user root
Feb 23 08:08:59 np0005626463.localdomain sudo[65860]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gbkpttrwjfcmrgcfxtiqaubjarnashrx ; /usr/bin/python3
Feb 23 08:08:59 np0005626463.localdomain sudo[65860]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:08:59 np0005626463.localdomain python3[65862]: ansible-stat Invoked with path=/var/lib/tripleo-config/container-startup-config/step_4 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 23 08:08:59 np0005626463.localdomain sudo[65860]: pam_unix(sudo:session): session closed for user root
Feb 23 08:09:00 np0005626463.localdomain sudo[65910]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mactmulvxwausdzjtyfoopkukxnhczbo ; /usr/bin/python3
Feb 23 08:09:00 np0005626463.localdomain sudo[65910]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:09:00 np0005626463.localdomain sudo[65910]: pam_unix(sudo:session): session closed for user root
Feb 23 08:09:00 np0005626463.localdomain sudo[65928]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lkgucpnsckhttcuxtgaaaoxqffisgtek ; /usr/bin/python3
Feb 23 08:09:00 np0005626463.localdomain sudo[65928]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:09:00 np0005626463.localdomain sudo[65928]: pam_unix(sudo:session): session closed for user root
Feb 23 08:09:01 np0005626463.localdomain sudo[66032]: tripleo-admin : TTY=pts/0 ; PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ktpuovgvgbnaqmqbalffbbidwdouscnm ; ANSIBLE_ASYNC_DIR=/tmp/.ansible_async /usr/bin/python3 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1771834140.9264715-108392-75319085338427/async_wrapper.py 133668917751 3600 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1771834140.9264715-108392-75319085338427/AnsiballZ_command.py _
Feb 23 08:09:01 np0005626463.localdomain sudo[66032]: pam_unix(sudo:session): session opened for user root(uid=0) by tripleo-admin(uid=1003)
Feb 23 08:09:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb 23 08:09:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 1800.1 total, 600.0 interval
                                                          Cumulative writes: 4530 writes, 20K keys, 4530 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s
                                                          Cumulative WAL: 4530 writes, 464 syncs, 9.76 writes per sync, written: 0.02 GB, 0.01 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 215 writes, 495 keys, 215 commit groups, 1.0 writes per commit group, ingest: 0.42 MB, 0.00 MB/s
                                                          Interval WAL: 215 writes, 106 syncs, 2.03 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Feb 23 08:09:01 np0005626463.localdomain ansible-async_wrapper.py[66034]: Invoked with 133668917751 3600 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1771834140.9264715-108392-75319085338427/AnsiballZ_command.py _
Feb 23 08:09:01 np0005626463.localdomain ansible-async_wrapper.py[66037]: Starting module and watcher
Feb 23 08:09:01 np0005626463.localdomain ansible-async_wrapper.py[66037]: Start watching 66038 (3600)
Feb 23 08:09:01 np0005626463.localdomain ansible-async_wrapper.py[66038]: Start module (66038)
Feb 23 08:09:01 np0005626463.localdomain ansible-async_wrapper.py[66034]: Return async_wrapper task started.
Feb 23 08:09:01 np0005626463.localdomain sudo[66032]: pam_unix(sudo:session): session closed for user root
Feb 23 08:09:01 np0005626463.localdomain sudo[66056]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ofqqdnbtmpffsofrtdzxurezctohnfla ; /usr/bin/python3
Feb 23 08:09:01 np0005626463.localdomain sudo[66056]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:09:01 np0005626463.localdomain python3[66058]: ansible-ansible.legacy.async_status Invoked with jid=133668917751.66034 mode=status _async_dir=/tmp/.ansible_async
Feb 23 08:09:01 np0005626463.localdomain sudo[66056]: pam_unix(sudo:session): session closed for user root
Feb 23 08:09:05 np0005626463.localdomain puppet-user[66045]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5
Feb 23 08:09:05 np0005626463.localdomain puppet-user[66045]:    (file: /etc/puppet/hiera.yaml)
Feb 23 08:09:05 np0005626463.localdomain puppet-user[66045]: Warning: Undefined variable '::deploy_config_name';
Feb 23 08:09:05 np0005626463.localdomain puppet-user[66045]:    (file & line not available)
Feb 23 08:09:05 np0005626463.localdomain puppet-user[66045]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html
Feb 23 08:09:05 np0005626463.localdomain puppet-user[66045]:    (file & line not available)
Feb 23 08:09:05 np0005626463.localdomain puppet-user[66045]: Warning: Unknown variable: '::deployment_type'. (file: /etc/puppet/modules/tripleo/manifests/profile/base/database/mysql/client.pp, line: 89, column: 8)
Feb 23 08:09:05 np0005626463.localdomain puppet-user[66045]: Warning: This method is deprecated, please use match expressions with Stdlib::Compat::String instead. They are described at https://docs.puppet.com/puppet/latest/reference/lang_data_type.html#match-expressions. at ["/etc/puppet/modules/snmp/manifests/params.pp", 310]:["/var/lib/tripleo-config/puppet_step_config.pp", 4]
Feb 23 08:09:05 np0005626463.localdomain puppet-user[66045]:    (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation')
Feb 23 08:09:05 np0005626463.localdomain puppet-user[66045]: Warning: This method is deprecated, please use the stdlib validate_legacy function,
Feb 23 08:09:05 np0005626463.localdomain puppet-user[66045]:                     with Stdlib::Compat::Bool. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 358]:["/var/lib/tripleo-config/puppet_step_config.pp", 4]
Feb 23 08:09:05 np0005626463.localdomain puppet-user[66045]:    (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation')
Feb 23 08:09:05 np0005626463.localdomain puppet-user[66045]: Warning: This method is deprecated, please use the stdlib validate_legacy function,
Feb 23 08:09:05 np0005626463.localdomain puppet-user[66045]:                     with Stdlib::Compat::Array. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 367]:["/var/lib/tripleo-config/puppet_step_config.pp", 4]
Feb 23 08:09:05 np0005626463.localdomain puppet-user[66045]:    (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation')
Feb 23 08:09:05 np0005626463.localdomain puppet-user[66045]: Warning: This method is deprecated, please use the stdlib validate_legacy function,
Feb 23 08:09:05 np0005626463.localdomain puppet-user[66045]:                     with Stdlib::Compat::String. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 382]:["/var/lib/tripleo-config/puppet_step_config.pp", 4]
Feb 23 08:09:05 np0005626463.localdomain puppet-user[66045]:    (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation')
Feb 23 08:09:05 np0005626463.localdomain puppet-user[66045]: Warning: This method is deprecated, please use the stdlib validate_legacy function,
Feb 23 08:09:05 np0005626463.localdomain puppet-user[66045]:                     with Stdlib::Compat::Numeric. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 388]:["/var/lib/tripleo-config/puppet_step_config.pp", 4]
Feb 23 08:09:05 np0005626463.localdomain puppet-user[66045]:    (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation')
Feb 23 08:09:05 np0005626463.localdomain puppet-user[66045]: Warning: This method is deprecated, please use the stdlib validate_legacy function,
Feb 23 08:09:05 np0005626463.localdomain puppet-user[66045]:                     with Pattern[]. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 393]:["/var/lib/tripleo-config/puppet_step_config.pp", 4]
Feb 23 08:09:05 np0005626463.localdomain puppet-user[66045]:    (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation')
Feb 23 08:09:05 np0005626463.localdomain puppet-user[66045]: Warning: Unknown variable: '::deployment_type'. (file: /etc/puppet/modules/tripleo/manifests/packages.pp, line: 39, column: 69)
Feb 23 08:09:05 np0005626463.localdomain puppet-user[66045]: Notice: Compiled catalog for np0005626463.localdomain in environment production in 0.23 seconds
Feb 23 08:09:05 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.
Feb 23 08:09:05 np0005626463.localdomain podman[66174]: 2026-02-23 08:09:05.929717017 +0000 UTC m=+0.096912086 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, vcs-type=git, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, release=1766032510, com.redhat.component=openstack-collectd-container, build-date=2026-01-12T22:10:15Z, 
description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, url=https://www.redhat.com, name=rhosp-rhel9/openstack-collectd, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., distribution-scope=public, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.buildah.version=1.41.5, tcib_managed=true)
Feb 23 08:09:05 np0005626463.localdomain podman[66174]: 2026-02-23 08:09:05.943481846 +0000 UTC m=+0.110676955 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vcs-type=git, description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., release=1766032510, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, io.openshift.expose-services=, com.redhat.component=openstack-collectd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20260112.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', 
'/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, maintainer=OpenStack TripleO Team, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step3, managed_by=tripleo_ansible, tcib_managed=true, build-date=2026-01-12T22:10:15Z, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, name=rhosp-rhel9/openstack-collectd)
Feb 23 08:09:05 np0005626463.localdomain systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully.
Feb 23 08:09:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb 23 08:09:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 1800.1 total, 600.0 interval
                                                          Cumulative writes: 5013 writes, 22K keys, 5013 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s
                                                          Cumulative WAL: 5013 writes, 561 syncs, 8.94 writes per sync, written: 0.02 GB, 0.01 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 146 writes, 382 keys, 146 commit groups, 1.0 writes per commit group, ingest: 0.34 MB, 0.00 MB/s
                                                          Interval WAL: 146 writes, 72 syncs, 2.03 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Feb 23 08:09:06 np0005626463.localdomain ansible-async_wrapper.py[66037]: 66038 still running (3600)
Feb 23 08:09:07 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.
Feb 23 08:09:07 np0005626463.localdomain systemd[1]: tmp-crun.kjmOyP.mount: Deactivated successfully.
Feb 23 08:09:07 np0005626463.localdomain podman[66197]: 2026-02-23 08:09:07.894991537 +0000 UTC m=+0.076699024 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, distribution-scope=public, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, vcs-type=git, url=https://www.redhat.com, release=1766032510, vcs-ref=705339545363fec600102567c4e923938e0f43b3, container_name=iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., config_id=tripleo_step3, managed_by=tripleo_ansible, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:34:43Z, com.redhat.component=openstack-iscsid-container, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, build-date=2026-01-12T22:34:43Z, summary=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']})
Feb 23 08:09:07 np0005626463.localdomain podman[66197]: 2026-02-23 08:09:07.903818592 +0000 UTC m=+0.085526109 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, tcib_managed=true, config_id=tripleo_step3, url=https://www.redhat.com, name=rhosp-rhel9/openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, vcs-ref=705339545363fec600102567c4e923938e0f43b3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, 
summary=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, batch=17.1_20260112.1, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, architecture=x86_64, build-date=2026-01-12T22:34:43Z, io.buildah.version=1.41.5, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid)
Feb 23 08:09:07 np0005626463.localdomain systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully.
Feb 23 08:09:11 np0005626463.localdomain ansible-async_wrapper.py[66037]: 66038 still running (3595)
Feb 23 08:09:12 np0005626463.localdomain sudo[66282]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-schfjsshgonnrobfywxfbdnkoieiplre ; /usr/bin/python3
Feb 23 08:09:12 np0005626463.localdomain sudo[66282]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:09:12 np0005626463.localdomain python3[66286]: ansible-ansible.legacy.async_status Invoked with jid=133668917751.66034 mode=status _async_dir=/tmp/.ansible_async
Feb 23 08:09:12 np0005626463.localdomain sudo[66282]: pam_unix(sudo:session): session closed for user root
Feb 23 08:09:14 np0005626463.localdomain systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 23 08:09:14 np0005626463.localdomain systemd[1]: Starting man-db-cache-update.service...
Feb 23 08:09:14 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 08:09:14 np0005626463.localdomain systemd-sysv-generator[66393]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 08:09:14 np0005626463.localdomain systemd-rc-local-generator[66388]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 08:09:14 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 08:09:14 np0005626463.localdomain systemd[1]: Queuing reload/restart jobs for marked units…
Feb 23 08:09:15 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.
Feb 23 08:09:15 np0005626463.localdomain systemd[1]: man-db-cache-update.service: Deactivated successfully.
Feb 23 08:09:15 np0005626463.localdomain systemd[1]: Finished man-db-cache-update.service.
Feb 23 08:09:15 np0005626463.localdomain systemd[1]: man-db-cache-update.service: Consumed 1.282s CPU time.
Feb 23 08:09:15 np0005626463.localdomain systemd[1]: run-r66cda147f9a64a7fb4a68f0618c030d7.service: Deactivated successfully.
Feb 23 08:09:15 np0005626463.localdomain systemd[1]: tmp-crun.tbCC8h.mount: Deactivated successfully.
Feb 23 08:09:15 np0005626463.localdomain podman[67389]: 2026-02-23 08:09:15.578128328 +0000 UTC m=+0.096653637 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:14Z, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, io.openshift.expose-services=, batch=17.1_20260112.1, config_id=tripleo_step1, url=https://www.redhat.com, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:10:14Z, architecture=x86_64, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, distribution-scope=public, io.buildah.version=1.41.5)
Feb 23 08:09:15 np0005626463.localdomain podman[67389]: 2026-02-23 08:09:15.797503793 +0000 UTC m=+0.316029162 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, container_name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:14Z, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, url=https://www.redhat.com, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, io.openshift.expose-services=, version=17.1.13, architecture=x86_64, maintainer=OpenStack TripleO Team, vcs-type=git, batch=17.1_20260112.1, distribution-scope=public, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z)
Feb 23 08:09:15 np0005626463.localdomain systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully.
Feb 23 08:09:16 np0005626463.localdomain puppet-user[66045]: Notice: /Stage[main]/Snmp/Package[snmpd]/ensure: created
Feb 23 08:09:16 np0005626463.localdomain puppet-user[66045]: Notice: /Stage[main]/Snmp/File[snmpd.conf]/content: content changed '{sha256}2b743f970e80e2150759bfc66f2d8d0fbd8b31624f79e2991248d1a5ac57494e' to '{sha256}aa026ec7dfd630c335e361683a845a48dbc161e201acd6e8ba3a46c1ecc947e6'
Feb 23 08:09:16 np0005626463.localdomain puppet-user[66045]: Notice: /Stage[main]/Snmp/File[snmpd.sysconfig]/content: content changed '{sha256}b63afb2dee7419b6834471f88581d981c8ae5c8b27b9d329ba67a02f3ddd8221' to '{sha256}3917ee8bbc680ad50d77186ad4a1d2705c2025c32fc32f823abbda7f2328dfbd'
Feb 23 08:09:16 np0005626463.localdomain puppet-user[66045]: Notice: /Stage[main]/Snmp/File[snmptrapd.conf]/content: content changed '{sha256}2e1ca894d609ef337b6243909bf5623c87fd5df98ecbd00c7d4c12cf12f03c4e' to '{sha256}3ecf18da1ba84ea3932607f2b903ee6a038b6f9ac4e1e371e48f3ef61c5052ea'
Feb 23 08:09:16 np0005626463.localdomain puppet-user[66045]: Notice: /Stage[main]/Snmp/File[snmptrapd.sysconfig]/content: content changed '{sha256}86ee5797ad10cb1ea0f631e9dfa6ae278ecf4f4d16f4c80f831cdde45601b23c' to '{sha256}2244553364afcca151958f8e2003e4c182f5e2ecfbe55405cec73fd818581e97'
Feb 23 08:09:16 np0005626463.localdomain puppet-user[66045]: Notice: /Stage[main]/Snmp/Service[snmptrapd]: Triggered 'refresh' from 2 events
Feb 23 08:09:16 np0005626463.localdomain ansible-async_wrapper.py[66037]: 66038 still running (3590)
Feb 23 08:09:18 np0005626463.localdomain sudo[67429]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 08:09:18 np0005626463.localdomain sudo[67429]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:09:18 np0005626463.localdomain sudo[67429]: pam_unix(sudo:session): session closed for user root
Feb 23 08:09:19 np0005626463.localdomain sudo[67444]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 check-host
Feb 23 08:09:19 np0005626463.localdomain sudo[67444]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:09:19 np0005626463.localdomain sudo[67444]: pam_unix(sudo:session): session closed for user root
Feb 23 08:09:19 np0005626463.localdomain sudo[67479]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 08:09:19 np0005626463.localdomain sudo[67479]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:09:19 np0005626463.localdomain sudo[67479]: pam_unix(sudo:session): session closed for user root
Feb 23 08:09:19 np0005626463.localdomain sudo[67494]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 23 08:09:19 np0005626463.localdomain sudo[67494]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:09:20 np0005626463.localdomain sudo[67494]: pam_unix(sudo:session): session closed for user root
Feb 23 08:09:20 np0005626463.localdomain sudo[67541]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 08:09:20 np0005626463.localdomain sudo[67541]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:09:20 np0005626463.localdomain sudo[67541]: pam_unix(sudo:session): session closed for user root
Feb 23 08:09:20 np0005626463.localdomain sudo[67556]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ceph-volume --fsid f1fea371-cb69-578d-a3d0-b5c472a84b46 -- inventory --format=json-pretty --filter-for-batch
Feb 23 08:09:20 np0005626463.localdomain sudo[67556]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:09:21 np0005626463.localdomain podman[67612]: 
Feb 23 08:09:21 np0005626463.localdomain puppet-user[66045]: Notice: /Stage[main]/Tripleo::Profile::Base::Snmp/Snmp::Snmpv3_user[ro_snmp_user]/Exec[create-snmpv3-user-ro_snmp_user]/returns: executed successfully
Feb 23 08:09:21 np0005626463.localdomain podman[67612]: 2026-02-23 08:09:21.275378476 +0000 UTC m=+0.081148303 container create 16803b33e6478136b4d9c01d5ba933ddc5dc797998af1cff9b5a3e5d32d9b6e5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=stupefied_turing, description=Red Hat Ceph Storage 7, architecture=x86_64, build-date=2026-02-09T10:25:24Z, CEPH_POINT_RELEASE=, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.k8s.description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.component=rhceph-container, release=1770267347, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.42.2, io.openshift.tags=rhceph ceph, version=7, ceph=True, distribution-scope=public)
Feb 23 08:09:21 np0005626463.localdomain sshd[67627]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:09:21 np0005626463.localdomain systemd[1]: Started libpod-conmon-16803b33e6478136b4d9c01d5ba933ddc5dc797998af1cff9b5a3e5d32d9b6e5.scope.
Feb 23 08:09:21 np0005626463.localdomain podman[67612]: 2026-02-23 08:09:21.242655225 +0000 UTC m=+0.048425082 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 23 08:09:21 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 08:09:21 np0005626463.localdomain podman[67612]: 2026-02-23 08:09:21.360194972 +0000 UTC m=+0.165964799 container init 16803b33e6478136b4d9c01d5ba933ddc5dc797998af1cff9b5a3e5d32d9b6e5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=stupefied_turing, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, name=rhceph, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.created=2026-02-09T10:25:24Z, com.redhat.component=rhceph-container, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1770267347, io.buildah.version=1.42.2, build-date=2026-02-09T10:25:24Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, architecture=x86_64, distribution-scope=public, RELEASE=main)
Feb 23 08:09:21 np0005626463.localdomain podman[67612]: 2026-02-23 08:09:21.372227257 +0000 UTC m=+0.177997104 container start 16803b33e6478136b4d9c01d5ba933ddc5dc797998af1cff9b5a3e5d32d9b6e5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=stupefied_turing, GIT_CLEAN=True, vcs-type=git, release=1770267347, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, version=7, com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, build-date=2026-02-09T10:25:24Z, description=Red Hat Ceph Storage 7, ceph=True, vendor=Red Hat, Inc., architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, GIT_BRANCH=main, org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, name=rhceph, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.42.2, CEPH_POINT_RELEASE=, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Feb 23 08:09:21 np0005626463.localdomain podman[67612]: 2026-02-23 08:09:21.372524747 +0000 UTC m=+0.178294574 container attach 16803b33e6478136b4d9c01d5ba933ddc5dc797998af1cff9b5a3e5d32d9b6e5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=stupefied_turing, RELEASE=main, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, version=7, io.openshift.expose-services=, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://catalog.redhat.com/en/search?searchType=containers, release=1770267347, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, GIT_BRANCH=main, com.redhat.component=rhceph-container, name=rhceph, GIT_CLEAN=True, org.opencontainers.image.created=2026-02-09T10:25:24Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, ceph=True, description=Red Hat Ceph Storage 7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, io.buildah.version=1.42.2, build-date=2026-02-09T10:25:24Z, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64)
Feb 23 08:09:21 np0005626463.localdomain stupefied_turing[67632]: 167 167
Feb 23 08:09:21 np0005626463.localdomain systemd[1]: libpod-16803b33e6478136b4d9c01d5ba933ddc5dc797998af1cff9b5a3e5d32d9b6e5.scope: Deactivated successfully.
Feb 23 08:09:21 np0005626463.localdomain podman[67612]: 2026-02-23 08:09:21.37552639 +0000 UTC m=+0.181296247 container died 16803b33e6478136b4d9c01d5ba933ddc5dc797998af1cff9b5a3e5d32d9b6e5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=stupefied_turing, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-02-09T10:25:24Z, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.tags=rhceph ceph, version=7, vcs-type=git, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1770267347, CEPH_POINT_RELEASE=, GIT_BRANCH=main, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.42.2, RELEASE=main, name=rhceph, vendor=Red Hat, Inc.)
Feb 23 08:09:21 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 08:09:21 np0005626463.localdomain systemd-sysv-generator[67680]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 08:09:21 np0005626463.localdomain systemd-rc-local-generator[67673]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 08:09:21 np0005626463.localdomain ansible-async_wrapper.py[66037]: 66038 still running (3585)
Feb 23 08:09:21 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 08:09:21 np0005626463.localdomain podman[67639]: 2026-02-23 08:09:21.747362883 +0000 UTC m=+0.354144431 container remove 16803b33e6478136b4d9c01d5ba933ddc5dc797998af1cff9b5a3e5d32d9b6e5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=stupefied_turing, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, version=7, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.42.2, io.openshift.expose-services=, name=rhceph, release=1770267347, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2026-02-09T10:25:24Z, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, ceph=True, vcs-type=git, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.created=2026-02-09T10:25:24Z, RELEASE=main)
Feb 23 08:09:21 np0005626463.localdomain systemd[1]: Starting Simple Network Management Protocol (SNMP) Daemon....
Feb 23 08:09:21 np0005626463.localdomain systemd[1]: libpod-conmon-16803b33e6478136b4d9c01d5ba933ddc5dc797998af1cff9b5a3e5d32d9b6e5.scope: Deactivated successfully.
Feb 23 08:09:21 np0005626463.localdomain snmpd[67690]: Can't find directory of RPM packages
Feb 23 08:09:21 np0005626463.localdomain snmpd[67690]: Duplicate IPv4 address detected, some interfaces may not be visible in IP-MIB
Feb 23 08:09:21 np0005626463.localdomain systemd[1]: Started Simple Network Management Protocol (SNMP) Daemon..
Feb 23 08:09:21 np0005626463.localdomain sshd[67627]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:09:21 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 08:09:21 np0005626463.localdomain podman[67700]: 
Feb 23 08:09:21 np0005626463.localdomain podman[67700]: 2026-02-23 08:09:21.989955172 +0000 UTC m=+0.085918522 container create 7f6a1be6e3f428dd2bd6810198d4774605a62d37052bd0779a7ae2bd22983b0a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=xenodochial_rhodes, io.openshift.tags=rhceph ceph, version=7, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, release=1770267347, architecture=x86_64, GIT_CLEAN=True, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, RELEASE=main, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, build-date=2026-02-09T10:25:24Z, vendor=Red Hat, Inc., io.buildah.version=1.42.2, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git)
Feb 23 08:09:21 np0005626463.localdomain systemd-rc-local-generator[67735]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 08:09:21 np0005626463.localdomain systemd-sysv-generator[67740]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 08:09:22 np0005626463.localdomain podman[67700]: 2026-02-23 08:09:21.953552466 +0000 UTC m=+0.049515846 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 23 08:09:22 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 08:09:22 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-fade5e80820262dc4cd0d76f1c0aef2432ea8e922812ec0df05298c03d71e744-merged.mount: Deactivated successfully.
Feb 23 08:09:22 np0005626463.localdomain systemd[1]: Started libpod-conmon-7f6a1be6e3f428dd2bd6810198d4774605a62d37052bd0779a7ae2bd22983b0a.scope.
Feb 23 08:09:22 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 08:09:22 np0005626463.localdomain systemd-rc-local-generator[67776]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 08:09:22 np0005626463.localdomain systemd-sysv-generator[67780]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 08:09:22 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 08:09:22 np0005626463.localdomain sudo[67802]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-raromtvokhmzqlncqnbjvunowwqwhyud ; /usr/bin/python3
Feb 23 08:09:22 np0005626463.localdomain puppet-user[66045]: Notice: /Stage[main]/Snmp/Service[snmpd]/ensure: ensure changed 'stopped' to 'running'
Feb 23 08:09:22 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 08:09:22 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b623fcab808d66d09a5f747fe60e1e62189817e8e088d3cf19e4817793f01314/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Feb 23 08:09:22 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b623fcab808d66d09a5f747fe60e1e62189817e8e088d3cf19e4817793f01314/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Feb 23 08:09:22 np0005626463.localdomain sudo[67802]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:09:22 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b623fcab808d66d09a5f747fe60e1e62189817e8e088d3cf19e4817793f01314/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Feb 23 08:09:22 np0005626463.localdomain puppet-user[66045]: Notice: Applied catalog in 16.95 seconds
Feb 23 08:09:22 np0005626463.localdomain puppet-user[66045]: Application:
Feb 23 08:09:22 np0005626463.localdomain puppet-user[66045]:    Initial environment: production
Feb 23 08:09:22 np0005626463.localdomain puppet-user[66045]:    Converged environment: production
Feb 23 08:09:22 np0005626463.localdomain puppet-user[66045]:          Run mode: user
Feb 23 08:09:22 np0005626463.localdomain puppet-user[66045]: Changes:
Feb 23 08:09:22 np0005626463.localdomain puppet-user[66045]:             Total: 8
Feb 23 08:09:22 np0005626463.localdomain puppet-user[66045]: Events:
Feb 23 08:09:22 np0005626463.localdomain puppet-user[66045]:           Success: 8
Feb 23 08:09:22 np0005626463.localdomain puppet-user[66045]:             Total: 8
Feb 23 08:09:22 np0005626463.localdomain puppet-user[66045]: Resources:
Feb 23 08:09:22 np0005626463.localdomain puppet-user[66045]:         Restarted: 1
Feb 23 08:09:22 np0005626463.localdomain puppet-user[66045]:           Changed: 8
Feb 23 08:09:22 np0005626463.localdomain puppet-user[66045]:       Out of sync: 8
Feb 23 08:09:22 np0005626463.localdomain puppet-user[66045]:             Total: 19
Feb 23 08:09:22 np0005626463.localdomain puppet-user[66045]: Time:
Feb 23 08:09:22 np0005626463.localdomain puppet-user[66045]:        Filebucket: 0.00
Feb 23 08:09:22 np0005626463.localdomain puppet-user[66045]:          Schedule: 0.00
Feb 23 08:09:22 np0005626463.localdomain puppet-user[66045]:            Augeas: 0.01
Feb 23 08:09:22 np0005626463.localdomain puppet-user[66045]:              File: 0.10
Feb 23 08:09:22 np0005626463.localdomain puppet-user[66045]:    Config retrieval: 0.30
Feb 23 08:09:22 np0005626463.localdomain puppet-user[66045]:           Service: 1.30
Feb 23 08:09:22 np0005626463.localdomain puppet-user[66045]:           Package: 10.21
Feb 23 08:09:22 np0005626463.localdomain puppet-user[66045]:    Transaction evaluation: 16.94
Feb 23 08:09:22 np0005626463.localdomain puppet-user[66045]:    Catalog application: 16.95
Feb 23 08:09:22 np0005626463.localdomain puppet-user[66045]:          Last run: 1771834162
Feb 23 08:09:22 np0005626463.localdomain puppet-user[66045]:              Exec: 5.10
Feb 23 08:09:22 np0005626463.localdomain puppet-user[66045]:             Total: 16.95
Feb 23 08:09:22 np0005626463.localdomain puppet-user[66045]: Version:
Feb 23 08:09:22 np0005626463.localdomain puppet-user[66045]:            Config: 1771834145
Feb 23 08:09:22 np0005626463.localdomain puppet-user[66045]:            Puppet: 7.10.0
Feb 23 08:09:22 np0005626463.localdomain podman[67700]: 2026-02-23 08:09:22.541994817 +0000 UTC m=+0.637958187 container init 7f6a1be6e3f428dd2bd6810198d4774605a62d37052bd0779a7ae2bd22983b0a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=xenodochial_rhodes, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, ceph=True, org.opencontainers.image.created=2026-02-09T10:25:24Z, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, release=1770267347, architecture=x86_64, io.buildah.version=1.42.2, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-type=git, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, build-date=2026-02-09T10:25:24Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git)
Feb 23 08:09:22 np0005626463.localdomain systemd[1]: tmp-crun.BES9Lf.mount: Deactivated successfully.
Feb 23 08:09:22 np0005626463.localdomain podman[67700]: 2026-02-23 08:09:22.559933006 +0000 UTC m=+0.655896366 container start 7f6a1be6e3f428dd2bd6810198d4774605a62d37052bd0779a7ae2bd22983b0a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=xenodochial_rhodes, name=rhceph, io.buildah.version=1.42.2, build-date=2026-02-09T10:25:24Z, RELEASE=main, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, architecture=x86_64, version=7, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, release=1770267347, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container)
Feb 23 08:09:22 np0005626463.localdomain podman[67700]: 2026-02-23 08:09:22.560207245 +0000 UTC m=+0.656170605 container attach 7f6a1be6e3f428dd2bd6810198d4774605a62d37052bd0779a7ae2bd22983b0a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=xenodochial_rhodes, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, name=rhceph, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, architecture=x86_64, io.buildah.version=1.42.2, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, GIT_BRANCH=main, build-date=2026-02-09T10:25:24Z, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, com.redhat.component=rhceph-container, RELEASE=main, org.opencontainers.image.created=2026-02-09T10:25:24Z, release=1770267347, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True)
Feb 23 08:09:22 np0005626463.localdomain ansible-async_wrapper.py[66038]: Module complete (66038)
Feb 23 08:09:22 np0005626463.localdomain python3[67804]: ansible-ansible.legacy.async_status Invoked with jid=133668917751.66034 mode=status _async_dir=/tmp/.ansible_async
Feb 23 08:09:22 np0005626463.localdomain sudo[67802]: pam_unix(sudo:session): session closed for user root
Feb 23 08:09:23 np0005626463.localdomain sudo[68145]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ouvudsdxltarbgwsyxpsxxhdwlxjijsp ; /usr/bin/python3
Feb 23 08:09:23 np0005626463.localdomain sudo[68145]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:09:23 np0005626463.localdomain python3[68340]: ansible-file Invoked with path=/var/lib/container-puppet/puppetlabs state=directory setype=svirt_sandbox_file_t selevel=s0 recurse=True force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None
Feb 23 08:09:23 np0005626463.localdomain sudo[68145]: pam_unix(sudo:session): session closed for user root
Feb 23 08:09:23 np0005626463.localdomain xenodochial_rhodes[67752]: [
Feb 23 08:09:23 np0005626463.localdomain xenodochial_rhodes[67752]:     {
Feb 23 08:09:23 np0005626463.localdomain xenodochial_rhodes[67752]:         "available": false,
Feb 23 08:09:23 np0005626463.localdomain xenodochial_rhodes[67752]:         "ceph_device": false,
Feb 23 08:09:23 np0005626463.localdomain xenodochial_rhodes[67752]:         "device_id": "QEMU_DVD-ROM_QM00001",
Feb 23 08:09:23 np0005626463.localdomain xenodochial_rhodes[67752]:         "lsm_data": {},
Feb 23 08:09:23 np0005626463.localdomain xenodochial_rhodes[67752]:         "lvs": [],
Feb 23 08:09:23 np0005626463.localdomain xenodochial_rhodes[67752]:         "path": "/dev/sr0",
Feb 23 08:09:23 np0005626463.localdomain xenodochial_rhodes[67752]:         "rejected_reasons": [
Feb 23 08:09:23 np0005626463.localdomain xenodochial_rhodes[67752]:             "Has a FileSystem",
Feb 23 08:09:23 np0005626463.localdomain xenodochial_rhodes[67752]:             "Insufficient space (<5GB)"
Feb 23 08:09:23 np0005626463.localdomain xenodochial_rhodes[67752]:         ],
Feb 23 08:09:23 np0005626463.localdomain xenodochial_rhodes[67752]:         "sys_api": {
Feb 23 08:09:23 np0005626463.localdomain xenodochial_rhodes[67752]:             "actuators": null,
Feb 23 08:09:23 np0005626463.localdomain xenodochial_rhodes[67752]:             "device_nodes": "sr0",
Feb 23 08:09:23 np0005626463.localdomain xenodochial_rhodes[67752]:             "human_readable_size": "482.00 KB",
Feb 23 08:09:23 np0005626463.localdomain xenodochial_rhodes[67752]:             "id_bus": "ata",
Feb 23 08:09:23 np0005626463.localdomain xenodochial_rhodes[67752]:             "model": "QEMU DVD-ROM",
Feb 23 08:09:23 np0005626463.localdomain xenodochial_rhodes[67752]:             "nr_requests": "2",
Feb 23 08:09:23 np0005626463.localdomain xenodochial_rhodes[67752]:             "partitions": {},
Feb 23 08:09:23 np0005626463.localdomain xenodochial_rhodes[67752]:             "path": "/dev/sr0",
Feb 23 08:09:23 np0005626463.localdomain xenodochial_rhodes[67752]:             "removable": "1",
Feb 23 08:09:23 np0005626463.localdomain xenodochial_rhodes[67752]:             "rev": "2.5+",
Feb 23 08:09:23 np0005626463.localdomain xenodochial_rhodes[67752]:             "ro": "0",
Feb 23 08:09:23 np0005626463.localdomain xenodochial_rhodes[67752]:             "rotational": "1",
Feb 23 08:09:23 np0005626463.localdomain xenodochial_rhodes[67752]:             "sas_address": "",
Feb 23 08:09:23 np0005626463.localdomain xenodochial_rhodes[67752]:             "sas_device_handle": "",
Feb 23 08:09:23 np0005626463.localdomain xenodochial_rhodes[67752]:             "scheduler_mode": "mq-deadline",
Feb 23 08:09:23 np0005626463.localdomain xenodochial_rhodes[67752]:             "sectors": 0,
Feb 23 08:09:23 np0005626463.localdomain xenodochial_rhodes[67752]:             "sectorsize": "2048",
Feb 23 08:09:23 np0005626463.localdomain xenodochial_rhodes[67752]:             "size": 493568.0,
Feb 23 08:09:23 np0005626463.localdomain xenodochial_rhodes[67752]:             "support_discard": "0",
Feb 23 08:09:23 np0005626463.localdomain xenodochial_rhodes[67752]:             "type": "disk",
Feb 23 08:09:23 np0005626463.localdomain xenodochial_rhodes[67752]:             "vendor": "QEMU"
Feb 23 08:09:23 np0005626463.localdomain xenodochial_rhodes[67752]:         }
Feb 23 08:09:23 np0005626463.localdomain xenodochial_rhodes[67752]:     }
Feb 23 08:09:23 np0005626463.localdomain xenodochial_rhodes[67752]: ]
Feb 23 08:09:23 np0005626463.localdomain systemd[1]: libpod-7f6a1be6e3f428dd2bd6810198d4774605a62d37052bd0779a7ae2bd22983b0a.scope: Deactivated successfully.
Feb 23 08:09:23 np0005626463.localdomain podman[67700]: 2026-02-23 08:09:23.516760662 +0000 UTC m=+1.612724002 container died 7f6a1be6e3f428dd2bd6810198d4774605a62d37052bd0779a7ae2bd22983b0a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=xenodochial_rhodes, vcs-type=git, version=7, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main, ceph=True, org.opencontainers.image.created=2026-02-09T10:25:24Z, name=rhceph, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=rhceph-container, architecture=x86_64, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.buildah.version=1.42.2, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1770267347, build-date=2026-02-09T10:25:24Z)
Feb 23 08:09:23 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-b623fcab808d66d09a5f747fe60e1e62189817e8e088d3cf19e4817793f01314-merged.mount: Deactivated successfully.
Feb 23 08:09:23 np0005626463.localdomain podman[69387]: 2026-02-23 08:09:23.60741468 +0000 UTC m=+0.082665520 container remove 7f6a1be6e3f428dd2bd6810198d4774605a62d37052bd0779a7ae2bd22983b0a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=xenodochial_rhodes, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, CEPH_POINT_RELEASE=, vcs-type=git, release=1770267347, com.redhat.component=rhceph-container, version=7, GIT_BRANCH=main, GIT_CLEAN=True, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2026-02-09T10:25:24Z, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, io.buildah.version=1.42.2, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, distribution-scope=public, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 23 08:09:23 np0005626463.localdomain systemd[1]: libpod-conmon-7f6a1be6e3f428dd2bd6810198d4774605a62d37052bd0779a7ae2bd22983b0a.scope: Deactivated successfully.
Feb 23 08:09:23 np0005626463.localdomain sudo[69413]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wlqkjnforimdygywxnbaitjamuslwqlt ; /usr/bin/python3
Feb 23 08:09:23 np0005626463.localdomain sudo[69413]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:09:23 np0005626463.localdomain sudo[67556]: pam_unix(sudo:session): session closed for user root
Feb 23 08:09:23 np0005626463.localdomain python3[69415]: ansible-stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 23 08:09:23 np0005626463.localdomain sudo[69413]: pam_unix(sudo:session): session closed for user root
Feb 23 08:09:24 np0005626463.localdomain sudo[69418]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 08:09:24 np0005626463.localdomain sudo[69418]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:09:24 np0005626463.localdomain sudo[69418]: pam_unix(sudo:session): session closed for user root
Feb 23 08:09:24 np0005626463.localdomain sudo[69478]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-curdnuvxutkpsoshubtstujlzictuuuo ; /usr/bin/python3
Feb 23 08:09:24 np0005626463.localdomain sudo[69478]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:09:24 np0005626463.localdomain python3[69480]: ansible-ansible.legacy.stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 08:09:24 np0005626463.localdomain sudo[69478]: pam_unix(sudo:session): session closed for user root
Feb 23 08:09:24 np0005626463.localdomain sudo[69496]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sbzlimbjraluzemtfuyntylguvwaeoki ; /usr/bin/python3
Feb 23 08:09:24 np0005626463.localdomain sudo[69496]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:09:24 np0005626463.localdomain python3[69498]: ansible-ansible.legacy.file Invoked with setype=svirt_sandbox_file_t selevel=s0 dest=/var/lib/container-puppet/puppetlabs/facter.conf _original_basename=tmpxjnmgbdj recurse=False state=file path=/var/lib/container-puppet/puppetlabs/facter.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None
Feb 23 08:09:24 np0005626463.localdomain sudo[69496]: pam_unix(sudo:session): session closed for user root
Feb 23 08:09:24 np0005626463.localdomain sudo[69526]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-clfavywyvawjdkinfmzpywhvtsrdwetr ; /usr/bin/python3
Feb 23 08:09:24 np0005626463.localdomain sudo[69526]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:09:25 np0005626463.localdomain python3[69528]: ansible-file Invoked with path=/opt/puppetlabs/facter state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 08:09:25 np0005626463.localdomain sudo[69526]: pam_unix(sudo:session): session closed for user root
Feb 23 08:09:25 np0005626463.localdomain sudo[69542]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hjfxdvbsqvopatmgikvwoztwvncxocdp ; /usr/bin/python3
Feb 23 08:09:25 np0005626463.localdomain sudo[69542]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:09:25 np0005626463.localdomain sudo[69542]: pam_unix(sudo:session): session closed for user root
Feb 23 08:09:26 np0005626463.localdomain sudo[69629]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ycwcottizzgqcjinumwuovlsaidufqfu ; /usr/bin/python3
Feb 23 08:09:26 np0005626463.localdomain sudo[69629]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:09:26 np0005626463.localdomain python3[69631]: ansible-ansible.posix.synchronize Invoked with src=/opt/puppetlabs/ dest=/var/lib/container-puppet/puppetlabs/ _local_rsync_path=rsync _local_rsync_password=NOT_LOGGING_PARAMETER rsync_path=None delete=False _substitute_controller=False archive=True checksum=False compress=True existing_only=False dirs=False copy_links=False set_remote_user=True rsync_timeout=0 rsync_opts=[] ssh_connection_multiplexing=False partial=False verify_host=False mode=push dest_port=None private_key=None recursive=None links=None perms=None times=None owner=None group=None ssh_args=None link_dest=None
Feb 23 08:09:26 np0005626463.localdomain sudo[69629]: pam_unix(sudo:session): session closed for user root
Feb 23 08:09:26 np0005626463.localdomain ansible-async_wrapper.py[66037]: Done in kid B.
Feb 23 08:09:26 np0005626463.localdomain sudo[69648]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vnnrhxvryrmvxsvvyglqwylmoihvtugm ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 08:09:26 np0005626463.localdomain sudo[69648]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:09:26 np0005626463.localdomain python3[69650]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 08:09:26 np0005626463.localdomain sudo[69648]: pam_unix(sudo:session): session closed for user root
Feb 23 08:09:27 np0005626463.localdomain sudo[69664]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wbiwkyipnghitjzxhubbecedgrobwvgj ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 08:09:27 np0005626463.localdomain sudo[69664]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:09:27 np0005626463.localdomain sudo[69664]: pam_unix(sudo:session): session closed for user root
Feb 23 08:09:27 np0005626463.localdomain sudo[69680]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tpnofqdmrsmlswpzfrtnlfmcwztujcof ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 08:09:27 np0005626463.localdomain sudo[69680]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:09:27 np0005626463.localdomain python3[69682]: ansible-stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 23 08:09:27 np0005626463.localdomain sudo[69680]: pam_unix(sudo:session): session closed for user root
Feb 23 08:09:28 np0005626463.localdomain sshd[69722]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:09:28 np0005626463.localdomain sudo[69731]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ttpgxrluhlwthmewcvcvpffyqmqhylne ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 08:09:28 np0005626463.localdomain sudo[69731]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:09:28 np0005626463.localdomain python3[69733]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-container-shutdown follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 08:09:28 np0005626463.localdomain sudo[69731]: pam_unix(sudo:session): session closed for user root
Feb 23 08:09:28 np0005626463.localdomain sudo[69750]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-azdoieiklfmfaylfuquwnfygcdblmshd ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 08:09:28 np0005626463.localdomain sudo[69750]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:09:28 np0005626463.localdomain python3[69752]: ansible-ansible.legacy.file Invoked with mode=0700 owner=root group=root dest=/usr/libexec/tripleo-container-shutdown _original_basename=tripleo-container-shutdown recurse=False state=file path=/usr/libexec/tripleo-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 08:09:28 np0005626463.localdomain sudo[69750]: pam_unix(sudo:session): session closed for user root
Feb 23 08:09:29 np0005626463.localdomain sudo[69812]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-khevwwwfilvcmwwypwcgfanchhawkmww ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 08:09:29 np0005626463.localdomain sudo[69812]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:09:29 np0005626463.localdomain python3[69814]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-start-podman-container follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 08:09:29 np0005626463.localdomain sudo[69812]: pam_unix(sudo:session): session closed for user root
Feb 23 08:09:29 np0005626463.localdomain sudo[69830]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nyriznayksojbzejjvuhzxrtghwlcsvm ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 08:09:29 np0005626463.localdomain sudo[69830]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:09:29 np0005626463.localdomain python3[69832]: ansible-ansible.legacy.file Invoked with mode=0700 owner=root group=root dest=/usr/libexec/tripleo-start-podman-container _original_basename=tripleo-start-podman-container recurse=False state=file path=/usr/libexec/tripleo-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 08:09:29 np0005626463.localdomain sudo[69830]: pam_unix(sudo:session): session closed for user root
Feb 23 08:09:30 np0005626463.localdomain sudo[69893]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vjsmiqprhbyckhhiaytqjtawgxtanbcz ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 08:09:30 np0005626463.localdomain sudo[69893]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:09:30 np0005626463.localdomain python3[69895]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/tripleo-container-shutdown.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 08:09:30 np0005626463.localdomain sudo[69893]: pam_unix(sudo:session): session closed for user root
Feb 23 08:09:30 np0005626463.localdomain sudo[69911]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kejfzmhpusgbcsyqmberwlcmsatlhyng ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 08:09:30 np0005626463.localdomain sudo[69911]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:09:30 np0005626463.localdomain python3[69913]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system/tripleo-container-shutdown.service _original_basename=tripleo-container-shutdown-service recurse=False state=file path=/usr/lib/systemd/system/tripleo-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 08:09:30 np0005626463.localdomain sudo[69911]: pam_unix(sudo:session): session closed for user root
Feb 23 08:09:30 np0005626463.localdomain sudo[69973]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aruswhxitkxfsykxyijlieggkszpykqm ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 08:09:30 np0005626463.localdomain sudo[69973]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:09:31 np0005626463.localdomain python3[69975]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 08:09:31 np0005626463.localdomain sudo[69973]: pam_unix(sudo:session): session closed for user root
Feb 23 08:09:31 np0005626463.localdomain sudo[69991]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vohvpqyvorhqxgvaobbmofjabrarjsjy ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 08:09:31 np0005626463.localdomain sudo[69991]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:09:31 np0005626463.localdomain python3[69993]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset _original_basename=91-tripleo-container-shutdown-preset recurse=False state=file path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 08:09:31 np0005626463.localdomain sudo[69991]: pam_unix(sudo:session): session closed for user root
Feb 23 08:09:31 np0005626463.localdomain sudo[70021]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hkwsoujrrvccfhidcyojvkcjoraublhu ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 08:09:31 np0005626463.localdomain sudo[70021]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:09:31 np0005626463.localdomain python3[70023]: ansible-systemd Invoked with name=tripleo-container-shutdown state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 08:09:31 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 08:09:32 np0005626463.localdomain systemd-rc-local-generator[70049]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 08:09:32 np0005626463.localdomain systemd-sysv-generator[70053]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 08:09:32 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 08:09:33 np0005626463.localdomain sudo[70021]: pam_unix(sudo:session): session closed for user root
Feb 23 08:09:33 np0005626463.localdomain sudo[70107]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zhdjpajdjieicelapjgmwteuvlxyoksp ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 08:09:33 np0005626463.localdomain sudo[70107]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:09:33 np0005626463.localdomain python3[70109]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/netns-placeholder.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 08:09:33 np0005626463.localdomain sudo[70107]: pam_unix(sudo:session): session closed for user root
Feb 23 08:09:33 np0005626463.localdomain sudo[70125]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-izoafzzrozmiuqxajuugcrxwihzesvhe ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 08:09:33 np0005626463.localdomain sudo[70125]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:09:34 np0005626463.localdomain python3[70127]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/usr/lib/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 08:09:34 np0005626463.localdomain sudo[70125]: pam_unix(sudo:session): session closed for user root
Feb 23 08:09:34 np0005626463.localdomain sudo[70187]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-plsxmccfviualenfjppzzmmdalbwcpsb ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 08:09:34 np0005626463.localdomain sudo[70187]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:09:34 np0005626463.localdomain python3[70189]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 08:09:35 np0005626463.localdomain sshd[69722]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:09:35 np0005626463.localdomain sudo[70187]: pam_unix(sudo:session): session closed for user root
Feb 23 08:09:35 np0005626463.localdomain sudo[70205]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yzsvbnkpyvcpkprugwelhvgxzbcnqoca ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 08:09:35 np0005626463.localdomain sudo[70205]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:09:35 np0005626463.localdomain python3[70207]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 08:09:36 np0005626463.localdomain sudo[70205]: pam_unix(sudo:session): session closed for user root
Feb 23 08:09:36 np0005626463.localdomain sudo[70235]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wvisrrapbyqphqsyieihahonfkcypgiw ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 08:09:36 np0005626463.localdomain sudo[70235]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:09:36 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.
Feb 23 08:09:36 np0005626463.localdomain systemd[1]: tmp-crun.7s4xhL.mount: Deactivated successfully.
Feb 23 08:09:36 np0005626463.localdomain podman[70238]: 2026-02-23 08:09:36.398908224 +0000 UTC m=+0.100579789 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.5, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-collectd-container, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:10:15Z, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, distribution-scope=public, url=https://www.redhat.com, vendor=Red Hat, Inc., config_id=tripleo_step3, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:10:15Z, release=1766032510, tcib_managed=true, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, container_name=collectd, name=rhosp-rhel9/openstack-collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee)
Feb 23 08:09:36 np0005626463.localdomain podman[70238]: 2026-02-23 08:09:36.409553786 +0000 UTC m=+0.111225291 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, version=17.1.13, release=1766032510, vcs-type=git, description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, config_id=tripleo_step3, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., architecture=x86_64, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, build-date=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, tcib_managed=true, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-collectd-container, container_name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:15Z)
Feb 23 08:09:36 np0005626463.localdomain systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully.
Feb 23 08:09:36 np0005626463.localdomain python3[70237]: ansible-systemd Invoked with name=netns-placeholder state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 08:09:36 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 08:09:36 np0005626463.localdomain systemd-sysv-generator[70283]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 08:09:36 np0005626463.localdomain systemd-rc-local-generator[70278]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 08:09:36 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 08:09:36 np0005626463.localdomain systemd[1]: Starting Create netns directory...
Feb 23 08:09:36 np0005626463.localdomain systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Feb 23 08:09:36 np0005626463.localdomain systemd[1]: netns-placeholder.service: Deactivated successfully.
Feb 23 08:09:36 np0005626463.localdomain systemd[1]: Finished Create netns directory.
Feb 23 08:09:36 np0005626463.localdomain sudo[70235]: pam_unix(sudo:session): session closed for user root
Feb 23 08:09:37 np0005626463.localdomain sudo[70312]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rlsfnwounukmaebjktlnxknrfxanqkgq ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 08:09:37 np0005626463.localdomain sudo[70312]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:09:37 np0005626463.localdomain python3[70314]: ansible-container_puppet_config Invoked with update_config_hash_only=True no_archive=True check_mode=False config_vol_prefix=/var/lib/config-data debug=False net_host=True puppet_config= short_hostname= step=6
Feb 23 08:09:37 np0005626463.localdomain sudo[70312]: pam_unix(sudo:session): session closed for user root
Feb 23 08:09:37 np0005626463.localdomain sudo[70328]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-twhrlfdhaneurtcabmuhyzipbkrnbxbq ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 08:09:37 np0005626463.localdomain sudo[70328]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:09:38 np0005626463.localdomain sudo[70328]: pam_unix(sudo:session): session closed for user root
Feb 23 08:09:38 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.
Feb 23 08:09:38 np0005626463.localdomain podman[70357]: 2026-02-23 08:09:38.913104663 +0000 UTC m=+0.080967038 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, tcib_managed=true, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, io.openshift.expose-services=, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, container_name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, build-date=2026-01-12T22:34:43Z, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, vcs-ref=705339545363fec600102567c4e923938e0f43b3, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']})
Feb 23 08:09:38 np0005626463.localdomain podman[70357]: 2026-02-23 08:09:38.924749816 +0000 UTC m=+0.092612231 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.openshift.expose-services=, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vcs-type=git, batch=17.1_20260112.1, config_id=tripleo_step3, io.buildah.version=1.41.5, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, container_name=iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, build-date=2026-01-12T22:34:43Z, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-ref=705339545363fec600102567c4e923938e0f43b3, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true)
Feb 23 08:09:38 np0005626463.localdomain systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully.
Feb 23 08:09:39 np0005626463.localdomain sudo[70390]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pgcbpjfuekjeanhiaythllbasjmkuzch ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 08:09:39 np0005626463.localdomain sudo[70390]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:09:39 np0005626463.localdomain python3[70392]: ansible-tripleo_container_manage Invoked with config_id=tripleo_step4 config_dir=/var/lib/tripleo-config/container-startup-config/step_4 config_patterns=*.json config_overrides={} concurrency=5 log_base_path=/var/log/containers/stdouts debug=False
Feb 23 08:09:39 np0005626463.localdomain podman[70562]: 2026-02-23 08:09:39.807662375 +0000 UTC m=+0.088820643 container create b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, managed_by=tripleo_ansible, build-date=2026-01-12T22:10:15Z, com.redhat.component=openstack-cron-container, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, container_name=logrotate_crond, architecture=x86_64, name=rhosp-rhel9/openstack-cron, io.openshift.expose-services=, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, version=17.1.13, release=1766032510, config_id=tripleo_step4, batch=17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee)
Feb 23 08:09:39 np0005626463.localdomain podman[70548]: 2026-02-23 08:09:39.830458406 +0000 UTC m=+0.130459941 container create daefcff5bb86c96499bf95810fc86a5c166aff728ac3ee427f8a688f0b693d1d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_libvirt_init_secret, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, distribution-scope=public, vendor=Red Hat, Inc., config_data={'cgroupns': 'host', 'command': '/nova_libvirt_init_secret.sh ceph:openstack', 'detach': False, 'environment': {'LIBVIRT_DEFAULT_URI': 'qemu:///system', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova:/etc/nova', '/etc/libvirt:/etc/libvirt', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/lib/container-config-scripts/nova_libvirt_init_secret.sh:/nova_libvirt_init_secret.sh:ro', '/var/lib/tripleo-config/ceph:/etc/ceph:ro']}, build-date=2026-01-12T23:31:49Z, container_name=nova_libvirt_init_secret, architecture=x86_64, io.openshift.expose-services=, managed_by=tripleo_ansible, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, version=17.1.13, com.redhat.component=openstack-nova-libvirt-container, config_id=tripleo_step4, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, name=rhosp-rhel9/openstack-nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.created=2026-01-12T23:31:49Z, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 23 08:09:39 np0005626463.localdomain podman[70559]: 2026-02-23 08:09:39.753293609 +0000 UTC m=+0.034885670 image pull  registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1
Feb 23 08:09:39 np0005626463.localdomain podman[70604]: 2026-02-23 08:09:39.858122549 +0000 UTC m=+0.088603466 container create 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, build-date=2026-01-12T23:07:30Z, maintainer=OpenStack TripleO Team, tcib_managed=true, config_id=tripleo_step4, vendor=Red Hat, Inc., container_name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, release=1766032510, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:07:30Z, name=rhosp-rhel9/openstack-ceilometer-ipmi, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13, url=https://www.redhat.com)
Feb 23 08:09:39 np0005626463.localdomain systemd[1]: Started libpod-conmon-b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.scope.
Feb 23 08:09:39 np0005626463.localdomain podman[70548]: 2026-02-23 08:09:39.76394818 +0000 UTC m=+0.063949735 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Feb 23 08:09:39 np0005626463.localdomain systemd[1]: Started libpod-conmon-daefcff5bb86c96499bf95810fc86a5c166aff728ac3ee427f8a688f0b693d1d.scope.
Feb 23 08:09:39 np0005626463.localdomain podman[70562]: 2026-02-23 08:09:39.766075347 +0000 UTC m=+0.047233625 image pull  registry.redhat.io/rhosp-rhel9/openstack-cron:17.1
Feb 23 08:09:39 np0005626463.localdomain podman[70559]: 2026-02-23 08:09:39.867684577 +0000 UTC m=+0.149276628 container create 59486ce627fd459b6e97a4f486334ca248af41ccabbc9b8b32e2d2fa128e6dfd (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=configure_cms_options, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-ovn-controller-container, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:36:40Z, tcib_managed=true, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, container_name=configure_cms_options, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, release=1766032510, distribution-scope=public, config_id=tripleo_step4, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.openshift.expose-services=, vcs-type=git, build-date=2026-01-12T22:36:40Z, config_data={'command': ['/bin/bash', '-c', 'CMS_OPTS=$(hiera ovn::controller::ovn_cms_options -c /etc/puppet/hiera.yaml);  if [ X"$CMS_OPTS" !=  X ]; then ovs-vsctl set open . external_ids:ovn-cms-options=$CMS_OPTS;else ovs-vsctl remove open . 
external_ids ovn-cms-options; fi'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1771832380'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c)
Feb 23 08:09:39 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 08:09:39 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 08:09:39 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e701559fdd80af17422acb214daf2f2ee3f38cde2d9b282e59bb97f69f05cdde/merged/etc/nova supports timestamps until 2038 (0x7fffffff)
Feb 23 08:09:39 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e701559fdd80af17422acb214daf2f2ee3f38cde2d9b282e59bb97f69f05cdde/merged/etc/libvirt supports timestamps until 2038 (0x7fffffff)
Feb 23 08:09:39 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e701559fdd80af17422acb214daf2f2ee3f38cde2d9b282e59bb97f69f05cdde/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Feb 23 08:09:39 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b2c770567d2f47629c218ae90d489529d9f3e3ed2618072d59a3365c20854653/merged/var/log/containers supports timestamps until 2038 (0x7fffffff)
Feb 23 08:09:39 np0005626463.localdomain podman[70585]: 2026-02-23 08:09:39.789813338 +0000 UTC m=+0.045221733 image pull  registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1
Feb 23 08:09:39 np0005626463.localdomain systemd[1]: Started libpod-conmon-59486ce627fd459b6e97a4f486334ca248af41ccabbc9b8b32e2d2fa128e6dfd.scope.
Feb 23 08:09:39 np0005626463.localdomain podman[70548]: 2026-02-23 08:09:39.892480831 +0000 UTC m=+0.192482356 container init daefcff5bb86c96499bf95810fc86a5c166aff728ac3ee427f8a688f0b693d1d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_libvirt_init_secret, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, managed_by=tripleo_ansible, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, org.opencontainers.image.created=2026-01-12T23:31:49Z, name=rhosp-rhel9/openstack-nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, config_data={'cgroupns': 'host', 'command': '/nova_libvirt_init_secret.sh ceph:openstack', 'detach': False, 'environment': {'LIBVIRT_DEFAULT_URI': 'qemu:///system', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova:/etc/nova', '/etc/libvirt:/etc/libvirt', 
'/run/libvirt:/run/libvirt:shared,z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/lib/container-config-scripts/nova_libvirt_init_secret.sh:/nova_libvirt_init_secret.sh:ro', '/var/lib/tripleo-config/ceph:/etc/ceph:ro']}, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, build-date=2026-01-12T23:31:49Z, com.redhat.component=openstack-nova-libvirt-container, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-libvirt, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, container_name=nova_libvirt_init_secret, distribution-scope=public, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 23 08:09:39 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 08:09:39 np0005626463.localdomain podman[70604]: 2026-02-23 08:09:39.802329108 +0000 UTC m=+0.032810035 image pull  registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1
Feb 23 08:09:39 np0005626463.localdomain podman[70548]: 2026-02-23 08:09:39.903644699 +0000 UTC m=+0.203646234 container start daefcff5bb86c96499bf95810fc86a5c166aff728ac3ee427f8a688f0b693d1d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_libvirt_init_secret, description=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, container_name=nova_libvirt_init_secret, build-date=2026-01-12T23:31:49Z, config_data={'cgroupns': 'host', 'command': '/nova_libvirt_init_secret.sh ceph:openstack', 'detach': False, 'environment': {'LIBVIRT_DEFAULT_URI': 'qemu:///system', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova:/etc/nova', '/etc/libvirt:/etc/libvirt', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/lib/container-config-scripts/nova_libvirt_init_secret.sh:/nova_libvirt_init_secret.sh:ro', '/var/lib/tripleo-config/ceph:/etc/ceph:ro']}, architecture=x86_64, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, release=1766032510, 
managed_by=tripleo_ansible, url=https://www.redhat.com, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, config_id=tripleo_step4, distribution-scope=public, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:31:49Z, tcib_managed=true)
Feb 23 08:09:39 np0005626463.localdomain podman[70548]: 2026-02-23 08:09:39.914971913 +0000 UTC m=+0.214973428 container attach daefcff5bb86c96499bf95810fc86a5c166aff728ac3ee427f8a688f0b693d1d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_libvirt_init_secret, container_name=nova_libvirt_init_secret, build-date=2026-01-12T23:31:49Z, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, distribution-scope=public, release=1766032510, batch=17.1_20260112.1, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vendor=Red Hat, Inc., config_data={'cgroupns': 'host', 'command': '/nova_libvirt_init_secret.sh ceph:openstack', 'detach': False, 'environment': {'LIBVIRT_DEFAULT_URI': 'qemu:///system', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova:/etc/nova', '/etc/libvirt:/etc/libvirt', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/lib/container-config-scripts/nova_libvirt_init_secret.sh:/nova_libvirt_init_secret.sh:ro', '/var/lib/tripleo-config/ceph:/etc/ceph:ro']}, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T23:31:49Z, name=rhosp-rhel9/openstack-nova-libvirt, io.buildah.version=1.41.5)
Feb 23 08:09:39 np0005626463.localdomain podman[70585]: 2026-02-23 08:09:39.916103089 +0000 UTC m=+0.171511474 container create 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-compute-container, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, io.openshift.expose-services=, managed_by=tripleo_ansible, io.buildah.version=1.41.5, distribution-scope=public, version=17.1.13, release=1766032510, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, name=rhosp-rhel9/openstack-ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, build-date=2026-01-12T23:07:47Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20260112.1, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, config_id=tripleo_step4)
Feb 23 08:09:39 np0005626463.localdomain systemd[1]: Started libpod-conmon-68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.scope.
Feb 23 08:09:39 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 08:09:39 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/85b58d6db47c08b0dc415e7676af05b270f66bddea6d8ca4f2d3998d7b04080d/merged/var/log/ceilometer supports timestamps until 2038 (0x7fffffff)
Feb 23 08:09:39 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.
Feb 23 08:09:39 np0005626463.localdomain podman[70562]: 2026-02-23 08:09:39.968555815 +0000 UTC m=+0.249714093 container init b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, maintainer=OpenStack TripleO Team, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, tcib_managed=true, container_name=logrotate_crond, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., version=17.1.13, url=https://www.redhat.com, config_id=tripleo_step4, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:15Z, name=rhosp-rhel9/openstack-cron, build-date=2026-01-12T22:10:15Z, distribution-scope=public, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.component=openstack-cron-container, batch=17.1_20260112.1, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 23 08:09:39 np0005626463.localdomain podman[70559]: 2026-02-23 08:09:39.970344771 +0000 UTC m=+0.251936812 container init 59486ce627fd459b6e97a4f486334ca248af41ccabbc9b8b32e2d2fa128e6dfd (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=configure_cms_options, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:36:40Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, batch=17.1_20260112.1, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=configure_cms_options, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.5, config_data={'command': ['/bin/bash', '-c', 'CMS_OPTS=$(hiera ovn::controller::ovn_cms_options -c /etc/puppet/hiera.yaml);  if [ X"$CMS_OPTS" !=  X ]; then ovs-vsctl set open . external_ids:ovn-cms-options=$CMS_OPTS;else ovs-vsctl remove open . external_ids ovn-cms-options; fi'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1771832380'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, 
build-date=2026-01-12T22:36:40Z, name=rhosp-rhel9/openstack-ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, tcib_managed=true, maintainer=OpenStack TripleO Team, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, managed_by=tripleo_ansible, distribution-scope=public, architecture=x86_64, vendor=Red Hat, Inc.)
Feb 23 08:09:39 np0005626463.localdomain podman[70559]: 2026-02-23 08:09:39.977936828 +0000 UTC m=+0.259528859 container start 59486ce627fd459b6e97a4f486334ca248af41ccabbc9b8b32e2d2fa128e6dfd (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=configure_cms_options, config_data={'command': ['/bin/bash', '-c', 'CMS_OPTS=$(hiera ovn::controller::ovn_cms_options -c /etc/puppet/hiera.yaml);  if [ X"$CMS_OPTS" !=  X ]; then ovs-vsctl set open . external_ids:ovn-cms-options=$CMS_OPTS;else ovs-vsctl remove open . external_ids ovn-cms-options; fi'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1771832380'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, url=https://www.redhat.com, com.redhat.component=openstack-ovn-controller-container, tcib_managed=true, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:36:40Z, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:36:40Z, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, architecture=x86_64, version=17.1.13, release=1766032510, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, name=rhosp-rhel9/openstack-ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, batch=17.1_20260112.1, container_name=configure_cms_options, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=)
Feb 23 08:09:39 np0005626463.localdomain podman[70559]: 2026-02-23 08:09:39.978262988 +0000 UTC m=+0.259855059 container attach 59486ce627fd459b6e97a4f486334ca248af41ccabbc9b8b32e2d2fa128e6dfd (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=configure_cms_options, config_id=tripleo_step4, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.openshift.expose-services=, name=rhosp-rhel9/openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:36:40Z, batch=17.1_20260112.1, tcib_managed=true, com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, config_data={'command': ['/bin/bash', '-c', 'CMS_OPTS=$(hiera ovn::controller::ovn_cms_options -c /etc/puppet/hiera.yaml);  if [ X"$CMS_OPTS" !=  X ]; then ovs-vsctl set open . external_ids:ovn-cms-options=$CMS_OPTS;else ovs-vsctl remove open . 
external_ids ovn-cms-options; fi'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1771832380'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, managed_by=tripleo_ansible, architecture=x86_64, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, distribution-scope=public, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:36:40Z, description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, container_name=configure_cms_options, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 23 08:09:39 np0005626463.localdomain sudo[70662]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Feb 23 08:09:39 np0005626463.localdomain sudo[70662]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Feb 23 08:09:39 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.
Feb 23 08:09:40 np0005626463.localdomain podman[70562]: 2026-02-23 08:09:40.004382813 +0000 UTC m=+0.285541081 container start b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., distribution-scope=public, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, vcs-type=git, name=rhosp-rhel9/openstack-cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, build-date=2026-01-12T22:10:15Z, version=17.1.13, com.redhat.component=openstack-cron-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, config_id=tripleo_step4, managed_by=tripleo_ansible, container_name=logrotate_crond, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 23 08:09:40 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.
Feb 23 08:09:40 np0005626463.localdomain podman[70585]: 2026-02-23 08:09:40.010952748 +0000 UTC m=+0.266361123 container init 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, config_id=tripleo_step4, build-date=2026-01-12T23:07:47Z, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-ceilometer-compute-container, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, maintainer=OpenStack TripleO Team, tcib_managed=true, io.openshift.expose-services=, container_name=ceilometer_agent_compute, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, io.buildah.version=1.41.5, distribution-scope=public, name=rhosp-rhel9/openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1766032510, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git)
Feb 23 08:09:40 np0005626463.localdomain python3[70392]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name logrotate_crond --conmon-pidfile /run/logrotate_crond.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=53ed83bb0cae779ff95edb2002262c6f --healthcheck-command /usr/share/openstack-tripleo-common/healthcheck/cron --label config_id=tripleo_step4 --label container_name=logrotate_crond --label managed_by=tripleo_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/logrotate_crond.log --network none --pid host --privileged=True --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume 
/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro --volume /var/log/containers:/var/log/containers:z registry.redhat.io/rhosp-rhel9/openstack-cron:17.1
Feb 23 08:09:40 np0005626463.localdomain sudo[70676]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Feb 23 08:09:40 np0005626463.localdomain sudo[70676]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Feb 23 08:09:40 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.
Feb 23 08:09:40 np0005626463.localdomain systemd[1]: libpod-daefcff5bb86c96499bf95810fc86a5c166aff728ac3ee427f8a688f0b693d1d.scope: Deactivated successfully.
Feb 23 08:09:40 np0005626463.localdomain podman[70585]: 2026-02-23 08:09:40.050009217 +0000 UTC m=+0.305417602 container start 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.buildah.version=1.41.5, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=ceilometer_agent_compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, build-date=2026-01-12T23:07:47Z, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, architecture=x86_64, url=https://www.redhat.com, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, release=1766032510, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-compute-container, version=17.1.13, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 23 08:09:40 np0005626463.localdomain podman[70548]: 2026-02-23 08:09:40.051223754 +0000 UTC m=+0.351225279 container died daefcff5bb86c96499bf95810fc86a5c166aff728ac3ee427f8a688f0b693d1d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_libvirt_init_secret, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T23:31:49Z, tcib_managed=true, release=1766032510, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.13, io.openshift.expose-services=, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-nova-libvirt-container, distribution-scope=public, name=rhosp-rhel9/openstack-nova-libvirt, build-date=2026-01-12T23:31:49Z, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, container_name=nova_libvirt_init_secret, vcs-type=git, config_id=tripleo_step4, io.buildah.version=1.41.5, architecture=x86_64, managed_by=tripleo_ansible, config_data={'cgroupns': 'host', 'command': '/nova_libvirt_init_secret.sh ceph:openstack', 'detach': False, 'environment': {'LIBVIRT_DEFAULT_URI': 'qemu:///system', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova:/etc/nova', '/etc/libvirt:/etc/libvirt', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/lib/container-config-scripts/nova_libvirt_init_secret.sh:/nova_libvirt_init_secret.sh:ro', '/var/lib/tripleo-config/ceph:/etc/ceph:ro']}, description=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt)
Feb 23 08:09:40 np0005626463.localdomain python3[70392]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name ceilometer_agent_compute --conmon-pidfile /run/ceilometer_agent_compute.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=44281c742f88411d75916a4e58499720 --healthcheck-command /openstack/healthcheck --label config_id=tripleo_step4 --label container_name=ceilometer_agent_compute --label managed_by=tripleo_ansible --label config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/ceilometer_agent_compute.log --network host --privileged=False --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume 
/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro --volume /run/libvirt:/run/libvirt:shared,z --volume /var/log/containers/ceilometer:/var/log/ceilometer:z registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1
Feb 23 08:09:40 np0005626463.localdomain sudo[70662]: pam_unix(sudo:session): session closed for user root
Feb 23 08:09:40 np0005626463.localdomain crond[70659]: (CRON) STARTUP (1.5.7)
Feb 23 08:09:40 np0005626463.localdomain crond[70659]: (CRON) INFO (RANDOM_DELAY will be scaled with factor 67% if used.)
Feb 23 08:09:40 np0005626463.localdomain crond[70659]: (CRON) INFO (running with inotify support)
Feb 23 08:09:40 np0005626463.localdomain systemd[1]: Started libpod-conmon-9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.scope.
Feb 23 08:09:40 np0005626463.localdomain ovs-vsctl[70713]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove open . external_ids ovn-cms-options
Feb 23 08:09:40 np0005626463.localdomain sudo[70676]: pam_unix(sudo:session): session closed for user root
Feb 23 08:09:40 np0005626463.localdomain podman[70559]: 2026-02-23 08:09:40.127330997 +0000 UTC m=+0.408923028 container died 59486ce627fd459b6e97a4f486334ca248af41ccabbc9b8b32e2d2fa128e6dfd (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=configure_cms_options, name=rhosp-rhel9/openstack-ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_id=tripleo_step4, build-date=2026-01-12T22:36:40Z, tcib_managed=true, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, container_name=configure_cms_options, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, config_data={'command': ['/bin/bash', '-c', 'CMS_OPTS=$(hiera ovn::controller::ovn_cms_options -c /etc/puppet/hiera.yaml);  if [ X"$CMS_OPTS" !=  X ]; then ovs-vsctl set open . external_ids:ovn-cms-options=$CMS_OPTS;else ovs-vsctl remove open . 
external_ids ovn-cms-options; fi'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1771832380'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:36:40Z, vendor=Red Hat, Inc., vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, url=https://www.redhat.com, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, release=1766032510, com.redhat.component=openstack-ovn-controller-container)
Feb 23 08:09:40 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 08:09:40 np0005626463.localdomain systemd[1]: libpod-59486ce627fd459b6e97a4f486334ca248af41ccabbc9b8b32e2d2fa128e6dfd.scope: Deactivated successfully.
Feb 23 08:09:40 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/51915910ced93426f00f1704499e6c4900ce6f68bf275b1a1584b9abaa73dcbc/merged/var/log/ceilometer supports timestamps until 2038 (0x7fffffff)
Feb 23 08:09:40 np0005626463.localdomain podman[70665]: 2026-02-23 08:09:40.145929448 +0000 UTC m=+0.127113386 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=starting, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, build-date=2026-01-12T22:10:15Z, release=1766032510, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, vcs-type=git, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, tcib_managed=true, com.redhat.component=openstack-cron-container, container_name=logrotate_crond, distribution-scope=public, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron)
Feb 23 08:09:40 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.
Feb 23 08:09:40 np0005626463.localdomain podman[70604]: 2026-02-23 08:09:40.155064023 +0000 UTC m=+0.385544950 container init 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, container_name=ceilometer_agent_ipmi, distribution-scope=public, build-date=2026-01-12T23:07:30Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.buildah.version=1.41.5, 
org.opencontainers.image.created=2026-01-12T23:07:30Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_id=tripleo_step4, managed_by=tripleo_ansible, io.openshift.expose-services=, url=https://www.redhat.com, name=rhosp-rhel9/openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Feb 23 08:09:40 np0005626463.localdomain podman[70665]: 2026-02-23 08:09:40.15528846 +0000 UTC m=+0.136472408 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=logrotate_crond, io.openshift.expose-services=, name=rhosp-rhel9/openstack-cron, url=https://www.redhat.com, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, maintainer=OpenStack TripleO Team, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, 
batch=17.1_20260112.1, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:10:15Z, io.buildah.version=1.41.5, com.redhat.component=openstack-cron-container, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, vendor=Red Hat, Inc., version=17.1.13)
Feb 23 08:09:40 np0005626463.localdomain sudo[70751]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Feb 23 08:09:40 np0005626463.localdomain sudo[70751]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Feb 23 08:09:40 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.
Feb 23 08:09:40 np0005626463.localdomain podman[70604]: 2026-02-23 08:09:40.183855152 +0000 UTC m=+0.414336069 container start 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, name=rhosp-rhel9/openstack-ceilometer-ipmi, vendor=Red Hat, Inc., release=1766032510, vcs-type=git, org.opencontainers.image.created=2026-01-12T23:07:30Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:07:30Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13, config_id=tripleo_step4, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, distribution-scope=public, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']})
Feb 23 08:09:40 np0005626463.localdomain python3[70392]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name ceilometer_agent_ipmi --conmon-pidfile /run/ceilometer_agent_ipmi.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=44281c742f88411d75916a4e58499720 --healthcheck-command /openstack/healthcheck --label config_id=tripleo_step4 --label container_name=ceilometer_agent_ipmi --label managed_by=tripleo_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/ceilometer_agent_ipmi.log --network host --privileged=True --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume 
/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro --volume /var/log/containers/ceilometer:/var/log/ceilometer:z registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1
Feb 23 08:09:40 np0005626463.localdomain podman[70697]: 2026-02-23 08:09:40.187248258 +0000 UTC m=+0.124850156 container cleanup daefcff5bb86c96499bf95810fc86a5c166aff728ac3ee427f8a688f0b693d1d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_libvirt_init_secret, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, config_id=tripleo_step4, com.redhat.component=openstack-nova-libvirt-container, container_name=nova_libvirt_init_secret, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'cgroupns': 'host', 'command': '/nova_libvirt_init_secret.sh ceph:openstack', 'detach': False, 'environment': {'LIBVIRT_DEFAULT_URI': 'qemu:///system', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova:/etc/nova', '/etc/libvirt:/etc/libvirt', 
'/run/libvirt:/run/libvirt:shared,z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/lib/container-config-scripts/nova_libvirt_init_secret.sh:/nova_libvirt_init_secret.sh:ro', '/var/lib/tripleo-config/ceph:/etc/ceph:ro']}, description=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.13, distribution-scope=public, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, vcs-type=git, build-date=2026-01-12T23:31:49Z, release=1766032510, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:31:49Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, tcib_managed=true, url=https://www.redhat.com, managed_by=tripleo_ansible)
Feb 23 08:09:40 np0005626463.localdomain systemd[1]: libpod-conmon-daefcff5bb86c96499bf95810fc86a5c166aff728ac3ee427f8a688f0b693d1d.scope: Deactivated successfully.
Feb 23 08:09:40 np0005626463.localdomain python3[70392]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_libvirt_init_secret --cgroupns=host --conmon-pidfile /run/nova_libvirt_init_secret.pid --detach=False --env LIBVIRT_DEFAULT_URI=qemu:///system --env TRIPLEO_CONFIG_HASH=b5f04eda8e5f004a5ff6ec948b25cc1e --label config_id=tripleo_step4 --label container_name=nova_libvirt_init_secret --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'command': '/nova_libvirt_init_secret.sh ceph:openstack', 'detach': False, 'environment': {'LIBVIRT_DEFAULT_URI': 'qemu:///system', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova:/etc/nova', '/etc/libvirt:/etc/libvirt', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/lib/container-config-scripts/nova_libvirt_init_secret.sh:/nova_libvirt_init_secret.sh:ro', '/var/lib/tripleo-config/ceph:/etc/ceph:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_libvirt_init_secret.log --network host --privileged=False --security-opt label=disable --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume 
/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/config-data/puppet-generated/nova_libvirt/etc/nova:/etc/nova --volume /etc/libvirt:/etc/libvirt --volume /run/libvirt:/run/libvirt:shared,z --volume /var/lib/libvirt:/var/lib/libvirt:shared --volume /var/lib/container-config-scripts/nova_libvirt_init_secret.sh:/nova_libvirt_init_secret.sh:ro --volume /var/lib/tripleo-config/ceph:/etc/ceph:ro registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 /nova_libvirt_init_secret.sh ceph:openstack
Feb 23 08:09:40 np0005626463.localdomain systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully.
Feb 23 08:09:40 np0005626463.localdomain sudo[70751]: pam_unix(sudo:session): session closed for user root
Feb 23 08:09:40 np0005626463.localdomain podman[70687]: 2026-02-23 08:09:40.277216276 +0000 UTC m=+0.228760929 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=starting, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, version=17.1.13, architecture=x86_64, container_name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 
ceilometer-compute, maintainer=OpenStack TripleO Team, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.buildah.version=1.41.5, release=1766032510, distribution-scope=public, build-date=2026-01-12T23:07:47Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-compute-container, tcib_managed=true, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc.)
Feb 23 08:09:40 np0005626463.localdomain podman[70727]: 2026-02-23 08:09:40.299971206 +0000 UTC m=+0.157498356 container cleanup 59486ce627fd459b6e97a4f486334ca248af41ccabbc9b8b32e2d2fa128e6dfd (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=configure_cms_options, distribution-scope=public, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, io.openshift.expose-services=, com.redhat.component=openstack-ovn-controller-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, config_data={'command': ['/bin/bash', '-c', 'CMS_OPTS=$(hiera ovn::controller::ovn_cms_options -c /etc/puppet/hiera.yaml);  if [ X"$CMS_OPTS" !=  X ]; then ovs-vsctl set open . external_ids:ovn-cms-options=$CMS_OPTS;else ovs-vsctl remove open . external_ids ovn-cms-options; fi'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1771832380'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, org.opencontainers.image.created=2026-01-12T22:36:40Z, vendor=Red Hat, Inc., batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, 
description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2026-01-12T22:36:40Z, container_name=configure_cms_options, architecture=x86_64, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_id=tripleo_step4)
Feb 23 08:09:40 np0005626463.localdomain systemd[1]: libpod-conmon-59486ce627fd459b6e97a4f486334ca248af41ccabbc9b8b32e2d2fa128e6dfd.scope: Deactivated successfully.
Feb 23 08:09:40 np0005626463.localdomain podman[70760]: 2026-02-23 08:09:40.262117745 +0000 UTC m=+0.073825905 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=starting, version=17.1.13, distribution-scope=public, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_id=tripleo_step4, io.openshift.expose-services=, build-date=2026-01-12T23:07:30Z, tcib_managed=true, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, container_name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1766032510, vcs-type=git, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team)
Feb 23 08:09:40 np0005626463.localdomain podman[70760]: 2026-02-23 08:09:40.346144906 +0000 UTC m=+0.157853076 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, build-date=2026-01-12T23:07:30Z, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-ceilometer-ipmi, config_id=tripleo_step4, vcs-type=git, version=17.1.13, tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.openshift.expose-services=, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06)
Feb 23 08:09:40 np0005626463.localdomain podman[70760]: unhealthy
Feb 23 08:09:40 np0005626463.localdomain systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Main process exited, code=exited, status=1/FAILURE
Feb 23 08:09:40 np0005626463.localdomain systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Failed with result 'exit-code'.
Feb 23 08:09:40 np0005626463.localdomain podman[70687]: 2026-02-23 08:09:40.361123164 +0000 UTC m=+0.312667827 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.buildah.version=1.41.5, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, batch=17.1_20260112.1, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T23:07:47Z, org.opencontainers.image.created=2026-01-12T23:07:47Z, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-ceilometer-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1766032510, vendor=Red Hat, Inc., distribution-scope=public, container_name=ceilometer_agent_compute, io.openshift.expose-services=)
Feb 23 08:09:40 np0005626463.localdomain podman[70687]: unhealthy
Feb 23 08:09:40 np0005626463.localdomain python3[70392]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name configure_cms_options --conmon-pidfile /run/configure_cms_options.pid --detach=False --env TRIPLEO_DEPLOY_IDENTIFIER=1771832380 --label config_id=tripleo_step4 --label container_name=configure_cms_options --label managed_by=tripleo_ansible --label config_data={'command': ['/bin/bash', '-c', 'CMS_OPTS=$(hiera ovn::controller::ovn_cms_options -c /etc/puppet/hiera.yaml);  if [ X"$CMS_OPTS" !=  X ]; then ovs-vsctl set open . external_ids:ovn-cms-options=$CMS_OPTS;else ovs-vsctl remove open . external_ids ovn-cms-options; fi'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1771832380'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/configure_cms_options.log --network host --privileged=True --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume 
/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /lib/modules:/lib/modules:ro --volume /run/openvswitch:/run/openvswitch:shared,z registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1 /bin/bash -c CMS_OPTS=$(hiera ovn::controller::ovn_cms_options -c /etc/puppet/hiera.yaml);  if [ X"$CMS_OPTS" !=  X ]; then ovs-vsctl set open . external_ids:ovn-cms-options=$CMS_OPTS;else ovs-vsctl remove open . external_ids ovn-cms-options; fi
Feb 23 08:09:40 np0005626463.localdomain systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Main process exited, code=exited, status=1/FAILURE
Feb 23 08:09:40 np0005626463.localdomain systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Failed with result 'exit-code'.
Feb 23 08:09:40 np0005626463.localdomain podman[70934]: 2026-02-23 08:09:40.693629479 +0000 UTC m=+0.085251091 container create 46cd96bfccaf34fdb6b427dbf0502b9947fc7c26e9225c58dea27a019553bc83 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=setup_ovs_manager, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, architecture=x86_64, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, vcs-type=git, version=17.1.13, release=1766032510, tcib_managed=true, io.buildah.version=1.41.5, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:56:19Z, config_data={'command': ['/container_puppet_apply.sh', '4', 'exec', 'include tripleo::profile::base::neutron::ovn_metadata'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1771832380'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, managed_by=tripleo_ansible, build-date=2026-01-12T22:56:19Z, config_id=tripleo_step4, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, container_name=setup_ovs_manager)
Feb 23 08:09:40 np0005626463.localdomain systemd[1]: Started libpod-conmon-46cd96bfccaf34fdb6b427dbf0502b9947fc7c26e9225c58dea27a019553bc83.scope.
Feb 23 08:09:40 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 08:09:40 np0005626463.localdomain podman[70934]: 2026-02-23 08:09:40.648026076 +0000 UTC m=+0.039647748 image pull  registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1
Feb 23 08:09:40 np0005626463.localdomain podman[70934]: 2026-02-23 08:09:40.768734562 +0000 UTC m=+0.160356194 container init 46cd96bfccaf34fdb6b427dbf0502b9947fc7c26e9225c58dea27a019553bc83 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=setup_ovs_manager, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, release=1766032510, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:56:19Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, container_name=setup_ovs_manager, config_data={'command': ['/container_puppet_apply.sh', '4', 'exec', 'include tripleo::profile::base::neutron::ovn_metadata'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1771832380'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, vendor=Red Hat, Inc., managed_by=tripleo_ansible, version=17.1.13, build-date=2026-01-12T22:56:19Z, config_id=tripleo_step4, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, batch=17.1_20260112.1)
Feb 23 08:09:40 np0005626463.localdomain podman[70934]: 2026-02-23 08:09:40.780408576 +0000 UTC m=+0.172030208 container start 46cd96bfccaf34fdb6b427dbf0502b9947fc7c26e9225c58dea27a019553bc83 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=setup_ovs_manager, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, url=https://www.redhat.com, version=17.1.13, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=setup_ovs_manager, io.buildah.version=1.41.5, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': ['/container_puppet_apply.sh', '4', 'exec', 'include tripleo::profile::base::neutron::ovn_metadata'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1771832380'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, build-date=2026-01-12T22:56:19Z, tcib_managed=true, 
release=1766032510, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., managed_by=tripleo_ansible, config_id=tripleo_step4, batch=17.1_20260112.1, io.openshift.expose-services=, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64)
Feb 23 08:09:40 np0005626463.localdomain podman[70934]: 2026-02-23 08:09:40.780658864 +0000 UTC m=+0.172280496 container attach 46cd96bfccaf34fdb6b427dbf0502b9947fc7c26e9225c58dea27a019553bc83 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=setup_ovs_manager, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, build-date=2026-01-12T22:56:19Z, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=setup_ovs_manager, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, release=1766032510, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, config_data={'command': ['/container_puppet_apply.sh', '4', 'exec', 'include tripleo::profile::base::neutron::ovn_metadata'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1771832380'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, tcib_managed=true, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, vcs-type=git, url=https://www.redhat.com)
Feb 23 08:09:40 np0005626463.localdomain podman[70953]: 2026-02-23 08:09:40.810723793 +0000 UTC m=+0.151414506 container create 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=nova_migration_target, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, 
io.openshift.expose-services=, release=1766032510, name=rhosp-rhel9/openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, vendor=Red Hat, Inc., build-date=2026-01-12T23:32:04Z, distribution-scope=public, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, url=https://www.redhat.com, vcs-type=git, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64)
Feb 23 08:09:40 np0005626463.localdomain systemd[1]: Started libpod-conmon-0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.scope.
Feb 23 08:09:40 np0005626463.localdomain podman[70953]: 2026-02-23 08:09:40.753398384 +0000 UTC m=+0.094089117 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1
Feb 23 08:09:40 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 08:09:40 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4752b195a0319d00ad3a4bd86f4312afcec268e914950a9934c95f1e8044f1fa/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Feb 23 08:09:40 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.
Feb 23 08:09:40 np0005626463.localdomain podman[70953]: 2026-02-23 08:09:40.904046364 +0000 UTC m=+0.244737077 container init 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, release=1766032510, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2026-01-12T23:32:04Z, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, 
batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., url=https://www.redhat.com, config_id=tripleo_step4, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.openshift.expose-services=, vcs-type=git, container_name=nova_migration_target, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team, tcib_managed=true)
Feb 23 08:09:40 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-025f13926bcfeedaf7e085dde432a5009541a3e067b7031c72f0d516a81ad107-merged.mount: Deactivated successfully.
Feb 23 08:09:40 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-59486ce627fd459b6e97a4f486334ca248af41ccabbc9b8b32e2d2fa128e6dfd-userdata-shm.mount: Deactivated successfully.
Feb 23 08:09:40 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-e701559fdd80af17422acb214daf2f2ee3f38cde2d9b282e59bb97f69f05cdde-merged.mount: Deactivated successfully.
Feb 23 08:09:40 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-daefcff5bb86c96499bf95810fc86a5c166aff728ac3ee427f8a688f0b693d1d-userdata-shm.mount: Deactivated successfully.
Feb 23 08:09:40 np0005626463.localdomain sudo[70987]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Feb 23 08:09:40 np0005626463.localdomain sudo[70987]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Feb 23 08:09:40 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.
Feb 23 08:09:40 np0005626463.localdomain podman[70953]: 2026-02-23 08:09:40.966739061 +0000 UTC m=+0.307429754 container start 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:32:04Z, build-date=2026-01-12T23:32:04Z, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.5, version=17.1.13, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, url=https://www.redhat.com, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 23 08:09:40 np0005626463.localdomain python3[70392]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_migration_target --conmon-pidfile /run/nova_migration_target.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=b5f04eda8e5f004a5ff6ec948b25cc1e --healthcheck-command /openstack/healthcheck --label config_id=tripleo_step4 --label container_name=nova_migration_target --label managed_by=tripleo_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_migration_target.log --network host --privileged=True --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro 
--volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /etc/ssh:/host-ssh:ro --volume /run/libvirt:/run/libvirt:shared,z --volume /var/lib/nova:/var/lib/nova:shared registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1
Feb 23 08:09:41 np0005626463.localdomain sudo[70987]: pam_unix(sudo:session): session closed for user root
Feb 23 08:09:41 np0005626463.localdomain sshd[71013]: Server listening on 0.0.0.0 port 2022.
Feb 23 08:09:41 np0005626463.localdomain sshd[71013]: Server listening on :: port 2022.
Feb 23 08:09:41 np0005626463.localdomain sudo[71018]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmp0ctxozx6/privsep.sock
Feb 23 08:09:41 np0005626463.localdomain sudo[71018]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Feb 23 08:09:41 np0005626463.localdomain podman[70989]: 2026-02-23 08:09:41.113327864 +0000 UTC m=+0.137848402 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=starting, io.openshift.expose-services=, version=17.1.13, name=rhosp-rhel9/openstack-nova-compute, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, distribution-scope=public, container_name=nova_migration_target, vendor=Red Hat, Inc., io.buildah.version=1.41.5, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, 
url=https://www.redhat.com, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, tcib_managed=true, release=1766032510, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 23 08:09:41 np0005626463.localdomain podman[70989]: 2026-02-23 08:09:41.475932519 +0000 UTC m=+0.500453117 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, io.openshift.expose-services=, version=17.1.13, url=https://www.redhat.com, batch=17.1_20260112.1, vcs-type=git, managed_by=tripleo_ansible, build-date=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp-rhel9/openstack-nova-compute, 
container_name=nova_migration_target, release=1766032510, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, org.opencontainers.image.created=2026-01-12T23:32:04Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 23 08:09:41 np0005626463.localdomain systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully.
Feb 23 08:09:41 np0005626463.localdomain kernel: capability: warning: `privsep-helper' uses deprecated v2 capabilities in a way that may be insecure
Feb 23 08:09:41 np0005626463.localdomain sudo[71018]: pam_unix(sudo:session): session closed for user root
Feb 23 08:09:43 np0005626463.localdomain ovs-vsctl[71159]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --timeout=5 --id=@manager -- create Manager "target=\"ptcp:6640:127.0.0.1\"" -- add Open_vSwitch . manager_options @manager
Feb 23 08:09:43 np0005626463.localdomain systemd[1]: libpod-46cd96bfccaf34fdb6b427dbf0502b9947fc7c26e9225c58dea27a019553bc83.scope: Deactivated successfully.
Feb 23 08:09:43 np0005626463.localdomain systemd[1]: libpod-46cd96bfccaf34fdb6b427dbf0502b9947fc7c26e9225c58dea27a019553bc83.scope: Consumed 2.846s CPU time.
Feb 23 08:09:43 np0005626463.localdomain podman[71160]: 2026-02-23 08:09:43.699922602 +0000 UTC m=+0.039688499 container died 46cd96bfccaf34fdb6b427dbf0502b9947fc7c26e9225c58dea27a019553bc83 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=setup_ovs_manager, build-date=2026-01-12T22:56:19Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.13, release=1766032510, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'command': ['/container_puppet_apply.sh', '4', 'exec', 'include tripleo::profile::base::neutron::ovn_metadata'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1771832380'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.buildah.version=1.41.5, vcs-type=git, 
konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, vendor=Red Hat, Inc., vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, maintainer=OpenStack TripleO Team, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=setup_ovs_manager, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, distribution-scope=public, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4)
Feb 23 08:09:43 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-46cd96bfccaf34fdb6b427dbf0502b9947fc7c26e9225c58dea27a019553bc83-userdata-shm.mount: Deactivated successfully.
Feb 23 08:09:43 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-6c9a1aacc166b15a11e7e7c477a5bec4f993243887d4de4680eae8258483d960-merged.mount: Deactivated successfully.
Feb 23 08:09:43 np0005626463.localdomain podman[71160]: 2026-02-23 08:09:43.751555763 +0000 UTC m=+0.091321610 container cleanup 46cd96bfccaf34fdb6b427dbf0502b9947fc7c26e9225c58dea27a019553bc83 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=setup_ovs_manager, release=1766032510, build-date=2026-01-12T22:56:19Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=setup_ovs_manager, config_data={'command': ['/container_puppet_apply.sh', '4', 'exec', 'include tripleo::profile::base::neutron::ovn_metadata'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1771832380'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, 
vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, architecture=x86_64, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, tcib_managed=true, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.buildah.version=1.41.5, url=https://www.redhat.com, vcs-type=git, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13)
Feb 23 08:09:43 np0005626463.localdomain systemd[1]: libpod-conmon-46cd96bfccaf34fdb6b427dbf0502b9947fc7c26e9225c58dea27a019553bc83.scope: Deactivated successfully.
Feb 23 08:09:43 np0005626463.localdomain python3[70392]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name setup_ovs_manager --conmon-pidfile /run/setup_ovs_manager.pid --detach=False --env TRIPLEO_DEPLOY_IDENTIFIER=1771832380 --label config_id=tripleo_step4 --label container_name=setup_ovs_manager --label managed_by=tripleo_ansible --label config_data={'command': ['/container_puppet_apply.sh', '4', 'exec', 'include tripleo::profile::base::neutron::ovn_metadata'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1771832380'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/setup_ovs_manager.log --network host --privileged=True --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume 
/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /lib/modules:/lib/modules:ro --volume /run/openvswitch:/run/openvswitch:shared,z registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1 /container_puppet_apply.sh 4 exec include tripleo::profile::base::neutron::ovn_metadata
Feb 23 08:09:44 np0005626463.localdomain podman[71273]: 2026-02-23 08:09:44.214276561 +0000 UTC m=+0.067095415 container create 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ovn-controller-container, batch=17.1_20260112.1, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2026-01-12T22:36:40Z, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=ovn_controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vendor=Red Hat, Inc., distribution-scope=public, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', 
'/var/log/containers/openvswitch:/var/log/ovn:z']}, name=rhosp-rhel9/openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:36:40Z)
Feb 23 08:09:44 np0005626463.localdomain podman[71274]: 2026-02-23 08:09:44.254597579 +0000 UTC m=+0.101569600 container create 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1766032510, io.openshift.expose-services=, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:56:19Z, url=https://www.redhat.com, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:56:19Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-type=git, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.13, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, container_name=ovn_metadata_agent, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 23 08:09:44 np0005626463.localdomain systemd[1]: Started libpod-conmon-1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.scope.
Feb 23 08:09:44 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 08:09:44 np0005626463.localdomain podman[71273]: 2026-02-23 08:09:44.177172153 +0000 UTC m=+0.029991057 image pull  registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1
Feb 23 08:09:44 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4c9543ebc437e655c5d4bfa732d117b93efd0b526ada85fb52dfe4d58e51e764/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Feb 23 08:09:44 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4c9543ebc437e655c5d4bfa732d117b93efd0b526ada85fb52dfe4d58e51e764/merged/var/log/openvswitch supports timestamps until 2038 (0x7fffffff)
Feb 23 08:09:44 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4c9543ebc437e655c5d4bfa732d117b93efd0b526ada85fb52dfe4d58e51e764/merged/var/log/ovn supports timestamps until 2038 (0x7fffffff)
Feb 23 08:09:44 np0005626463.localdomain systemd[1]: Started libpod-conmon-9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.scope.
Feb 23 08:09:44 np0005626463.localdomain podman[71274]: 2026-02-23 08:09:44.208057797 +0000 UTC m=+0.055029848 image pull  registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1
Feb 23 08:09:44 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 08:09:44 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6ccd32925d3fd52f95d80d5d3005423a627f4aa5e2e72537587c6c2e01c55ed4/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 23 08:09:44 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6ccd32925d3fd52f95d80d5d3005423a627f4aa5e2e72537587c6c2e01c55ed4/merged/var/log/neutron supports timestamps until 2038 (0x7fffffff)
Feb 23 08:09:44 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6ccd32925d3fd52f95d80d5d3005423a627f4aa5e2e72537587c6c2e01c55ed4/merged/etc/neutron/kill_scripts supports timestamps until 2038 (0x7fffffff)
Feb 23 08:09:44 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.
Feb 23 08:09:44 np0005626463.localdomain podman[71273]: 2026-02-23 08:09:44.33410554 +0000 UTC m=+0.186924414 container init 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, config_id=tripleo_step4, url=https://www.redhat.com, tcib_managed=true, release=1766032510, name=rhosp-rhel9/openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:36:40Z, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, summary=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, 
distribution-scope=public, version=17.1.13, description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., io.buildah.version=1.41.5, build-date=2026-01-12T22:36:40Z, vcs-type=git)
Feb 23 08:09:44 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.
Feb 23 08:09:44 np0005626463.localdomain podman[71274]: 2026-02-23 08:09:44.356381425 +0000 UTC m=+0.203353446 container init 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:56:19Z, release=1766032510, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, url=https://www.redhat.com, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, config_id=tripleo_step4, tcib_managed=true, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, container_name=ovn_metadata_agent)
Feb 23 08:09:44 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.
Feb 23 08:09:44 np0005626463.localdomain sudo[71315]:  neutron : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Feb 23 08:09:44 np0005626463.localdomain podman[71273]: 2026-02-23 08:09:44.382994585 +0000 UTC m=+0.235813459 container start 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, url=https://www.redhat.com, batch=17.1_20260112.1, architecture=x86_64, release=1766032510, build-date=2026-01-12T22:36:40Z, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, com.redhat.component=openstack-ovn-controller-container, version=17.1.13, vcs-type=git, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.k8s.display-name=Red Hat 
OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=ovn_controller, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team)
Feb 23 08:09:44 np0005626463.localdomain systemd-logind[759]: Existing logind session ID 28 used by new audit session, ignoring.
Feb 23 08:09:44 np0005626463.localdomain sudo[71315]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42435)
Feb 23 08:09:44 np0005626463.localdomain python3[70392]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name ovn_controller --conmon-pidfile /run/ovn_controller.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck 6642 --label config_id=tripleo_step4 --label container_name=ovn_controller --label managed_by=tripleo_ansible --label config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/ovn_controller.log --network host --privileged=True --user root --volume /var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro --volume /lib/modules:/lib/modules:ro --volume /run:/run --volume /var/lib/openvswitch/ovn:/run/ovn:shared,z --volume /var/log/containers/openvswitch:/var/log/openvswitch:z --volume /var/log/containers/openvswitch:/var/log/ovn:z registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1
Feb 23 08:09:44 np0005626463.localdomain systemd[1]: Created slice User Slice of UID 0.
Feb 23 08:09:44 np0005626463.localdomain systemd[1]: Starting User Runtime Directory /run/user/0...
Feb 23 08:09:44 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.
Feb 23 08:09:44 np0005626463.localdomain systemd[1]: Finished User Runtime Directory /run/user/0.
Feb 23 08:09:44 np0005626463.localdomain podman[71274]: 2026-02-23 08:09:44.429992902 +0000 UTC m=+0.276964923 container start 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, batch=17.1_20260112.1, distribution-scope=public, vcs-type=git, vendor=Red Hat, Inc., release=1766032510, org.opencontainers.image.created=2026-01-12T22:56:19Z, architecture=x86_64, managed_by=tripleo_ansible, tcib_managed=true, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.13, build-date=2026-01-12T22:56:19Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, maintainer=OpenStack TripleO Team)
Feb 23 08:09:44 np0005626463.localdomain systemd[1]: Starting User Manager for UID 0...
Feb 23 08:09:44 np0005626463.localdomain python3[70392]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name ovn_metadata_agent --cgroupns=host --conmon-pidfile /run/ovn_metadata_agent.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=cf62475d9880911ecf982eff6ab572ad --healthcheck-command /openstack/healthcheck --label config_id=tripleo_step4 --label container_name=ovn_metadata_agent --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/ovn_metadata_agent.log --network host --pid host --privileged=True 
--volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/log/containers/neutron:/var/log/neutron:z --volume /var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro --volume /lib/modules:/lib/modules:ro --volume /run/openvswitch:/run/openvswitch:shared,z --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /run/netns:/run/netns:shared --volume /var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z --volume /var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1
Feb 23 08:09:44 np0005626463.localdomain sudo[71315]: pam_unix(sudo:session): session closed for user root
Feb 23 08:09:44 np0005626463.localdomain systemd[71337]: pam_unix(systemd-user:session): session opened for user root(uid=0) by (uid=0)
Feb 23 08:09:44 np0005626463.localdomain podman[71316]: 2026-02-23 08:09:44.487998842 +0000 UTC m=+0.098229966 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=starting, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, org.opencontainers.image.created=2026-01-12T22:36:40Z, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, tcib_managed=true, io.buildah.version=1.41.5, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-ovn-controller, architecture=x86_64, url=https://www.redhat.com, 
config_id=tripleo_step4, io.openshift.expose-services=, release=1766032510, vcs-type=git, build-date=2026-01-12T22:36:40Z, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc.)
Feb 23 08:09:44 np0005626463.localdomain podman[71316]: 2026-02-23 08:09:44.571162596 +0000 UTC m=+0.181393680 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, architecture=x86_64, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., com.redhat.component=openstack-ovn-controller-container, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:36:40Z, tcib_managed=true, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-ovn-controller, url=https://www.redhat.com, version=17.1.13, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, release=1766032510, build-date=2026-01-12T22:36:40Z, io.k8s.description=Red Hat OpenStack Platform 17.1 
ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=ovn_controller, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5)
Feb 23 08:09:44 np0005626463.localdomain podman[71316]: unhealthy
Feb 23 08:09:44 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Main process exited, code=exited, status=1/FAILURE
Feb 23 08:09:44 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Failed with result 'exit-code'.
Feb 23 08:09:44 np0005626463.localdomain systemd[71337]: Queued start job for default target Main User Target.
Feb 23 08:09:44 np0005626463.localdomain systemd[71337]: Created slice User Application Slice.
Feb 23 08:09:44 np0005626463.localdomain systemd[71337]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Feb 23 08:09:44 np0005626463.localdomain systemd[71337]: Started Daily Cleanup of User's Temporary Directories.
Feb 23 08:09:44 np0005626463.localdomain systemd[71337]: Reached target Paths.
Feb 23 08:09:44 np0005626463.localdomain systemd[71337]: Reached target Timers.
Feb 23 08:09:44 np0005626463.localdomain systemd[71337]: Starting D-Bus User Message Bus Socket...
Feb 23 08:09:44 np0005626463.localdomain systemd[71337]: Starting Create User's Volatile Files and Directories...
Feb 23 08:09:44 np0005626463.localdomain podman[71333]: 2026-02-23 08:09:44.620776615 +0000 UTC m=+0.180362969 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=starting, vendor=Red Hat, Inc., vcs-type=git, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, distribution-scope=public, managed_by=tripleo_ansible, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.expose-services=, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, maintainer=OpenStack TripleO Team, architecture=x86_64, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, tcib_managed=true, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, container_name=ovn_metadata_agent, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:56:19Z, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com)
Feb 23 08:09:44 np0005626463.localdomain systemd[71337]: Listening on D-Bus User Message Bus Socket.
Feb 23 08:09:44 np0005626463.localdomain systemd[71337]: Reached target Sockets.
Feb 23 08:09:44 np0005626463.localdomain systemd[71337]: Finished Create User's Volatile Files and Directories.
Feb 23 08:09:44 np0005626463.localdomain systemd[71337]: Reached target Basic System.
Feb 23 08:09:44 np0005626463.localdomain systemd[71337]: Reached target Main User Target.
Feb 23 08:09:44 np0005626463.localdomain systemd[71337]: Startup finished in 152ms.
Feb 23 08:09:44 np0005626463.localdomain systemd[1]: Started User Manager for UID 0.
Feb 23 08:09:44 np0005626463.localdomain systemd[1]: Started Session c9 of User root.
Feb 23 08:09:44 np0005626463.localdomain podman[71333]: 2026-02-23 08:09:44.663138396 +0000 UTC m=+0.222724760 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, architecture=x86_64, distribution-scope=public, io.openshift.expose-services=, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat 
OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, tcib_managed=true, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:56:19Z, build-date=2026-01-12T22:56:19Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, container_name=ovn_metadata_agent, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, vendor=Red Hat, Inc., config_id=tripleo_step4)
Feb 23 08:09:44 np0005626463.localdomain podman[71333]: unhealthy
Feb 23 08:09:44 np0005626463.localdomain systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Main process exited, code=exited, status=1/FAILURE
Feb 23 08:09:44 np0005626463.localdomain systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Failed with result 'exit-code'.
Feb 23 08:09:44 np0005626463.localdomain sudo[70390]: pam_unix(sudo:session): session closed for user root
Feb 23 08:09:44 np0005626463.localdomain systemd[1]: session-c9.scope: Deactivated successfully.
Feb 23 08:09:44 np0005626463.localdomain kernel: device br-int entered promiscuous mode
Feb 23 08:09:44 np0005626463.localdomain NetworkManager[5974]: <info>  [1771834184.8065] manager: (br-int): new Generic device (/org/freedesktop/NetworkManager/Devices/11)
Feb 23 08:09:44 np0005626463.localdomain systemd-udevd[71428]: Network interface NamePolicy= disabled on kernel command line.
Feb 23 08:09:45 np0005626463.localdomain sudo[71446]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-foaoyhwkazwfljmckjskveguhbrreeto ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 08:09:45 np0005626463.localdomain sudo[71446]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:09:45 np0005626463.localdomain python3[71448]: ansible-file Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 08:09:45 np0005626463.localdomain sudo[71446]: pam_unix(sudo:session): session closed for user root
Feb 23 08:09:45 np0005626463.localdomain sudo[71462]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-muliazyvxaioromhqdbjawnuxjiqdtiv ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 08:09:45 np0005626463.localdomain sudo[71462]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:09:45 np0005626463.localdomain python3[71464]: ansible-file Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_ipmi.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 08:09:45 np0005626463.localdomain sudo[71462]: pam_unix(sudo:session): session closed for user root
Feb 23 08:09:45 np0005626463.localdomain sudo[71478]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sdjrcmirnbtqwfbrlxgrhielezulqned ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 08:09:45 np0005626463.localdomain sudo[71478]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:09:45 np0005626463.localdomain python3[71480]: ansible-file Invoked with path=/etc/systemd/system/tripleo_logrotate_crond.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 08:09:45 np0005626463.localdomain sudo[71478]: pam_unix(sudo:session): session closed for user root
Feb 23 08:09:45 np0005626463.localdomain sudo[71494]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rbhzmnrwxfvlvseqmuhovihlylfqecep ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 08:09:45 np0005626463.localdomain kernel: device genev_sys_6081 entered promiscuous mode
Feb 23 08:09:45 np0005626463.localdomain NetworkManager[5974]: <info>  [1771834185.8497] device (genev_sys_6081): carrier: link connected
Feb 23 08:09:45 np0005626463.localdomain systemd-udevd[71430]: Network interface NamePolicy= disabled on kernel command line.
Feb 23 08:09:45 np0005626463.localdomain NetworkManager[5974]: <info>  [1771834185.8502] manager: (genev_sys_6081): new Generic device (/org/freedesktop/NetworkManager/Devices/12)
Feb 23 08:09:45 np0005626463.localdomain sudo[71494]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:09:45 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.
Feb 23 08:09:45 np0005626463.localdomain podman[71499]: 2026-02-23 08:09:45.990132922 +0000 UTC m=+0.093130097 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, io.buildah.version=1.41.5, com.redhat.component=openstack-qdrouterd-container, url=https://www.redhat.com, config_id=tripleo_step1, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:14Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-qdrouterd, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:10:14Z, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, tcib_managed=true, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=metrics_qdr)
Feb 23 08:09:46 np0005626463.localdomain python3[71500]: ansible-file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 08:09:46 np0005626463.localdomain sudo[71494]: pam_unix(sudo:session): session closed for user root
Feb 23 08:09:46 np0005626463.localdomain sudo[71541]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nzxyvgqablynnihgxcznrieaokzeszqy ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 08:09:46 np0005626463.localdomain sudo[71541]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:09:46 np0005626463.localdomain podman[71499]: 2026-02-23 08:09:46.173693019 +0000 UTC m=+0.276690154 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, io.openshift.expose-services=, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, container_name=metrics_qdr, io.buildah.version=1.41.5, build-date=2026-01-12T22:10:14Z, maintainer=OpenStack TripleO Team, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, 
konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:10:14Z, architecture=x86_64, name=rhosp-rhel9/openstack-qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, com.redhat.component=openstack-qdrouterd-container)
Feb 23 08:09:46 np0005626463.localdomain systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully.
Feb 23 08:09:46 np0005626463.localdomain sudo[71545]:  neutron : PWD=/ ; USER=root ; COMMAND=/usr/bin/neutron-rootwrap /etc/neutron/rootwrap.conf privsep-helper --config-file /etc/neutron/neutron.conf --config-file /etc/neutron/plugins/networking-ovn/networking-ovn-metadata-agent.ini --privsep_context neutron.privileged.default --privsep_sock_path /tmp/tmp8bt8m_cy/privsep.sock
Feb 23 08:09:46 np0005626463.localdomain sudo[71545]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42435)
Feb 23 08:09:46 np0005626463.localdomain python3[71543]: ansible-file Invoked with path=/etc/systemd/system/tripleo_ovn_controller.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 08:09:46 np0005626463.localdomain sudo[71541]: pam_unix(sudo:session): session closed for user root
Feb 23 08:09:46 np0005626463.localdomain sudo[71560]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qsousuyiofptmsrqwgoswvuoctudooyg ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 08:09:46 np0005626463.localdomain sudo[71560]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:09:46 np0005626463.localdomain python3[71563]: ansible-file Invoked with path=/etc/systemd/system/tripleo_ovn_metadata_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 08:09:46 np0005626463.localdomain sudo[71560]: pam_unix(sudo:session): session closed for user root
Feb 23 08:09:46 np0005626463.localdomain sudo[71577]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xoyawswtvxfgclohrryseyshwdogultu ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 08:09:46 np0005626463.localdomain sudo[71577]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:09:46 np0005626463.localdomain python3[71579]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_compute_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 23 08:09:46 np0005626463.localdomain sudo[71577]: pam_unix(sudo:session): session closed for user root
Feb 23 08:09:46 np0005626463.localdomain sudo[71545]: pam_unix(sudo:session): session closed for user root
Feb 23 08:09:46 np0005626463.localdomain sudo[71595]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nnjhshmqujpgyjadftmgsvmaulhrkmwh ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 08:09:46 np0005626463.localdomain sudo[71595]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:09:47 np0005626463.localdomain python3[71597]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_ipmi_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 23 08:09:47 np0005626463.localdomain sudo[71595]: pam_unix(sudo:session): session closed for user root
Feb 23 08:09:47 np0005626463.localdomain sudo[71611]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lztzfyjnqbdmrutqpggydfronnmwkdkc ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 08:09:47 np0005626463.localdomain sudo[71611]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:09:47 np0005626463.localdomain python3[71615]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_logrotate_crond_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 23 08:09:47 np0005626463.localdomain sudo[71611]: pam_unix(sudo:session): session closed for user root
Feb 23 08:09:47 np0005626463.localdomain sudo[71629]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ystvcmouvsrsdfpjbqwkvkfptglozmpb ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 08:09:47 np0005626463.localdomain sudo[71629]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:09:47 np0005626463.localdomain python3[71631]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_nova_migration_target_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 23 08:09:47 np0005626463.localdomain sudo[71629]: pam_unix(sudo:session): session closed for user root
Feb 23 08:09:47 np0005626463.localdomain sudo[71645]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-egitttpspqfdocvomxwyuxptwsydyecg ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 08:09:47 np0005626463.localdomain sudo[71645]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:09:47 np0005626463.localdomain python3[71647]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_ovn_controller_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 23 08:09:47 np0005626463.localdomain sudo[71645]: pam_unix(sudo:session): session closed for user root
Feb 23 08:09:48 np0005626463.localdomain sudo[71661]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vxbraoihsdxpmizfxfziqivwbiwxdays ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 08:09:48 np0005626463.localdomain sudo[71661]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:09:48 np0005626463.localdomain python3[71663]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_ovn_metadata_agent_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 23 08:09:48 np0005626463.localdomain sudo[71661]: pam_unix(sudo:session): session closed for user root
Feb 23 08:09:48 np0005626463.localdomain sudo[71722]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-posjnvhwrrraiwxisrgcslrbcximdsbq ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 08:09:48 np0005626463.localdomain sudo[71722]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:09:48 np0005626463.localdomain python3[71724]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771834188.2694278-110042-138288306449896/source dest=/etc/systemd/system/tripleo_ceilometer_agent_compute.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 08:09:48 np0005626463.localdomain sudo[71722]: pam_unix(sudo:session): session closed for user root
Feb 23 08:09:49 np0005626463.localdomain sudo[71751]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lcptjnzyftiavlgindjgtdlisjjfafyp ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 08:09:49 np0005626463.localdomain sudo[71751]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:09:49 np0005626463.localdomain python3[71753]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771834188.2694278-110042-138288306449896/source dest=/etc/systemd/system/tripleo_ceilometer_agent_ipmi.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 08:09:49 np0005626463.localdomain sudo[71751]: pam_unix(sudo:session): session closed for user root
Feb 23 08:09:49 np0005626463.localdomain sudo[71780]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rziimxpbhlsjhqexsdukjwmhxkmbdnhy ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 08:09:49 np0005626463.localdomain sudo[71780]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:09:49 np0005626463.localdomain python3[71782]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771834188.2694278-110042-138288306449896/source dest=/etc/systemd/system/tripleo_logrotate_crond.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 08:09:49 np0005626463.localdomain sudo[71780]: pam_unix(sudo:session): session closed for user root
Feb 23 08:09:50 np0005626463.localdomain sudo[71809]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zyolffhsbmxvbrvpwcxtfhkmsnrxrjkf ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 08:09:50 np0005626463.localdomain sudo[71809]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:09:50 np0005626463.localdomain python3[71811]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771834188.2694278-110042-138288306449896/source dest=/etc/systemd/system/tripleo_nova_migration_target.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 08:09:50 np0005626463.localdomain sudo[71809]: pam_unix(sudo:session): session closed for user root
Feb 23 08:09:50 np0005626463.localdomain sudo[71838]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ajwdbzhbedlshatatfnihqizebrmyboh ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 08:09:50 np0005626463.localdomain sudo[71838]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:09:50 np0005626463.localdomain python3[71840]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771834188.2694278-110042-138288306449896/source dest=/etc/systemd/system/tripleo_ovn_controller.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 08:09:50 np0005626463.localdomain sudo[71838]: pam_unix(sudo:session): session closed for user root
Feb 23 08:09:51 np0005626463.localdomain sudo[71867]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aulejltelldwcesgxfnkjjqollkakubj ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 08:09:51 np0005626463.localdomain sudo[71867]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:09:51 np0005626463.localdomain python3[71869]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771834188.2694278-110042-138288306449896/source dest=/etc/systemd/system/tripleo_ovn_metadata_agent.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 08:09:51 np0005626463.localdomain sudo[71867]: pam_unix(sudo:session): session closed for user root
Feb 23 08:09:51 np0005626463.localdomain sudo[71883]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jojlcbkldyoowjkthktxxlzrlplcfeje ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 08:09:51 np0005626463.localdomain sudo[71883]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:09:51 np0005626463.localdomain python3[71885]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 23 08:09:51 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 08:09:51 np0005626463.localdomain systemd-rc-local-generator[71908]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 08:09:51 np0005626463.localdomain systemd-sysv-generator[71914]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 08:09:52 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 08:09:52 np0005626463.localdomain sudo[71883]: pam_unix(sudo:session): session closed for user root
Feb 23 08:09:52 np0005626463.localdomain sudo[71936]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ousbvpyodeumdsarbfsgqsdoogdrsngo ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 08:09:52 np0005626463.localdomain sudo[71936]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:09:52 np0005626463.localdomain python3[71938]: ansible-systemd Invoked with state=restarted name=tripleo_ceilometer_agent_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 08:09:52 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 08:09:53 np0005626463.localdomain systemd-rc-local-generator[71965]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 08:09:53 np0005626463.localdomain systemd-sysv-generator[71970]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 08:09:53 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 08:09:53 np0005626463.localdomain systemd[1]: Starting ceilometer_agent_compute container...
Feb 23 08:09:53 np0005626463.localdomain tripleo-start-podman-container[71978]: Creating additional drop-in dependency for "ceilometer_agent_compute" (68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9)
Feb 23 08:09:53 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 08:09:53 np0005626463.localdomain systemd-rc-local-generator[72033]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 08:09:53 np0005626463.localdomain systemd-sysv-generator[72037]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 08:09:53 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 08:09:53 np0005626463.localdomain systemd[1]: Started ceilometer_agent_compute container.
Feb 23 08:09:53 np0005626463.localdomain sudo[71936]: pam_unix(sudo:session): session closed for user root
Feb 23 08:09:54 np0005626463.localdomain sudo[72059]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ymojggpskttaqyyzeburaiurcyncdnnx ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 08:09:54 np0005626463.localdomain sudo[72059]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:09:54 np0005626463.localdomain python3[72061]: ansible-systemd Invoked with state=restarted name=tripleo_ceilometer_agent_ipmi.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 08:09:54 np0005626463.localdomain systemd[1]: Stopping User Manager for UID 0...
Feb 23 08:09:54 np0005626463.localdomain systemd[71337]: Activating special unit Exit the Session...
Feb 23 08:09:54 np0005626463.localdomain systemd[71337]: Stopped target Main User Target.
Feb 23 08:09:54 np0005626463.localdomain systemd[71337]: Stopped target Basic System.
Feb 23 08:09:54 np0005626463.localdomain systemd[71337]: Stopped target Paths.
Feb 23 08:09:54 np0005626463.localdomain systemd[71337]: Stopped target Sockets.
Feb 23 08:09:54 np0005626463.localdomain systemd[71337]: Stopped target Timers.
Feb 23 08:09:54 np0005626463.localdomain systemd[71337]: Stopped Daily Cleanup of User's Temporary Directories.
Feb 23 08:09:54 np0005626463.localdomain systemd[71337]: Closed D-Bus User Message Bus Socket.
Feb 23 08:09:54 np0005626463.localdomain systemd[71337]: Stopped Create User's Volatile Files and Directories.
Feb 23 08:09:54 np0005626463.localdomain systemd[71337]: Removed slice User Application Slice.
Feb 23 08:09:54 np0005626463.localdomain systemd[71337]: Reached target Shutdown.
Feb 23 08:09:54 np0005626463.localdomain systemd[71337]: Finished Exit the Session.
Feb 23 08:09:54 np0005626463.localdomain systemd[71337]: Reached target Exit the Session.
Feb 23 08:09:54 np0005626463.localdomain systemd[1]: user@0.service: Deactivated successfully.
Feb 23 08:09:54 np0005626463.localdomain systemd[1]: Stopped User Manager for UID 0.
Feb 23 08:09:54 np0005626463.localdomain systemd[1]: Stopping User Runtime Directory /run/user/0...
Feb 23 08:09:54 np0005626463.localdomain systemd[1]: run-user-0.mount: Deactivated successfully.
Feb 23 08:09:54 np0005626463.localdomain systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Feb 23 08:09:54 np0005626463.localdomain systemd[1]: Stopped User Runtime Directory /run/user/0.
Feb 23 08:09:54 np0005626463.localdomain systemd[1]: Removed slice User Slice of UID 0.
Feb 23 08:09:55 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 08:09:55 np0005626463.localdomain systemd-rc-local-generator[72091]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 08:09:55 np0005626463.localdomain systemd-sysv-generator[72095]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 08:09:55 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 08:09:55 np0005626463.localdomain systemd[1]: Starting ceilometer_agent_ipmi container...
Feb 23 08:09:56 np0005626463.localdomain systemd[1]: Started ceilometer_agent_ipmi container.
Feb 23 08:09:56 np0005626463.localdomain sudo[72059]: pam_unix(sudo:session): session closed for user root
Feb 23 08:09:56 np0005626463.localdomain sudo[72128]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cbetgtyzjigwydosoqlrhgfbigdurvac ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 08:09:56 np0005626463.localdomain sudo[72128]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:09:56 np0005626463.localdomain python3[72130]: ansible-systemd Invoked with state=restarted name=tripleo_logrotate_crond.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 08:09:56 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 08:09:56 np0005626463.localdomain systemd-rc-local-generator[72158]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 08:09:56 np0005626463.localdomain systemd-sysv-generator[72161]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 08:09:56 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 08:09:57 np0005626463.localdomain systemd[1]: Starting logrotate_crond container...
Feb 23 08:09:57 np0005626463.localdomain systemd[1]: Started logrotate_crond container.
Feb 23 08:09:57 np0005626463.localdomain sudo[72128]: pam_unix(sudo:session): session closed for user root
Feb 23 08:09:57 np0005626463.localdomain sudo[72195]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wbzppomfyafrdgcvwipssxlishswosiy ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 08:09:58 np0005626463.localdomain sudo[72195]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:09:58 np0005626463.localdomain python3[72197]: ansible-systemd Invoked with state=restarted name=tripleo_nova_migration_target.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 08:09:58 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 08:09:59 np0005626463.localdomain systemd-rc-local-generator[72220]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 08:09:59 np0005626463.localdomain systemd-sysv-generator[72224]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 08:09:59 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 08:09:59 np0005626463.localdomain systemd[1]: Starting nova_migration_target container...
Feb 23 08:09:59 np0005626463.localdomain systemd[1]: Started nova_migration_target container.
Feb 23 08:09:59 np0005626463.localdomain sudo[72195]: pam_unix(sudo:session): session closed for user root
Feb 23 08:09:59 np0005626463.localdomain sudo[72261]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jswbuhkccmaudhuuwaenxpaurhrjrqvd ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 08:09:59 np0005626463.localdomain sudo[72261]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:10:00 np0005626463.localdomain python3[72263]: ansible-systemd Invoked with state=restarted name=tripleo_ovn_controller.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 08:10:00 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 08:10:00 np0005626463.localdomain systemd-sysv-generator[72295]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 08:10:00 np0005626463.localdomain systemd-rc-local-generator[72289]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 08:10:00 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 08:10:00 np0005626463.localdomain systemd[1]: Starting ovn_controller container...
Feb 23 08:10:00 np0005626463.localdomain tripleo-start-podman-container[72303]: Creating additional drop-in dependency for "ovn_controller" (1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e)
Feb 23 08:10:00 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 08:10:00 np0005626463.localdomain systemd-sysv-generator[72363]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 08:10:00 np0005626463.localdomain systemd-rc-local-generator[72360]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 08:10:00 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 08:10:01 np0005626463.localdomain systemd[1]: Started ovn_controller container.
Feb 23 08:10:01 np0005626463.localdomain sudo[72261]: pam_unix(sudo:session): session closed for user root
Feb 23 08:10:01 np0005626463.localdomain sshd[72373]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:10:01 np0005626463.localdomain anacron[19051]: Job `cron.monthly' started
Feb 23 08:10:01 np0005626463.localdomain anacron[19051]: Job `cron.monthly' terminated
Feb 23 08:10:01 np0005626463.localdomain anacron[19051]: Normal exit (3 jobs run)
Feb 23 08:10:01 np0005626463.localdomain sudo[72390]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-afqspxhmztrlhwivkkgqsyvehsseidzr ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 08:10:01 np0005626463.localdomain sudo[72390]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:10:01 np0005626463.localdomain sshd[72373]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:10:01 np0005626463.localdomain python3[72392]: ansible-systemd Invoked with state=restarted name=tripleo_ovn_metadata_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 08:10:01 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 08:10:01 np0005626463.localdomain systemd-rc-local-generator[72418]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 08:10:01 np0005626463.localdomain systemd-sysv-generator[72422]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 08:10:01 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 08:10:02 np0005626463.localdomain systemd[1]: Starting ovn_metadata_agent container...
Feb 23 08:10:02 np0005626463.localdomain systemd[1]: Started ovn_metadata_agent container.
Feb 23 08:10:02 np0005626463.localdomain sudo[72390]: pam_unix(sudo:session): session closed for user root
Feb 23 08:10:02 np0005626463.localdomain sudo[72471]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jvsdptrndkbhygqxifcqwcbctbajdjbg ; /usr/bin/python3
Feb 23 08:10:02 np0005626463.localdomain sudo[72471]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:10:02 np0005626463.localdomain python3[72473]: ansible-file Invoked with path=/var/lib/container-puppet/container-puppet-tasks4.json state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 08:10:02 np0005626463.localdomain sudo[72471]: pam_unix(sudo:session): session closed for user root
Feb 23 08:10:03 np0005626463.localdomain sudo[72520]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ztillstjkromwvpmtfpyzuscskrisjkr ; /usr/bin/python3
Feb 23 08:10:03 np0005626463.localdomain sudo[72520]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:10:03 np0005626463.localdomain sudo[72520]: pam_unix(sudo:session): session closed for user root
Feb 23 08:10:03 np0005626463.localdomain sudo[72563]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zqlayojsrmnfaykclqrltwavbxjjarxc ; /usr/bin/python3
Feb 23 08:10:03 np0005626463.localdomain sudo[72563]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:10:03 np0005626463.localdomain sudo[72563]: pam_unix(sudo:session): session closed for user root
Feb 23 08:10:03 np0005626463.localdomain sudo[72593]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nhwqhavykigfxqkramlkhwipnsxfxgnr ; /usr/bin/python3
Feb 23 08:10:04 np0005626463.localdomain sudo[72593]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:10:04 np0005626463.localdomain python3[72595]: ansible-container_puppet_config Invoked with check_mode=False config_vol_prefix=/var/lib/config-data debug=True net_host=True no_archive=True puppet_config=/var/lib/container-puppet/container-puppet-tasks4.json short_hostname=np0005626463 step=4 update_config_hash_only=False
Feb 23 08:10:04 np0005626463.localdomain sudo[72593]: pam_unix(sudo:session): session closed for user root
Feb 23 08:10:04 np0005626463.localdomain sudo[72609]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wxpijusrrbblhrnpfsxmfcdpfbivqvez ; /usr/bin/python3
Feb 23 08:10:04 np0005626463.localdomain sudo[72609]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:10:04 np0005626463.localdomain python3[72611]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 08:10:04 np0005626463.localdomain sudo[72609]: pam_unix(sudo:session): session closed for user root
Feb 23 08:10:04 np0005626463.localdomain sudo[72625]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jqtzongkyrfvusfqfnqonvunotgecmdw ; /usr/bin/python3
Feb 23 08:10:04 np0005626463.localdomain sudo[72625]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:10:05 np0005626463.localdomain python3[72627]: ansible-container_config_data Invoked with config_path=/var/lib/tripleo-config/container-puppet-config/step_4 config_pattern=container-puppet-*.json config_overrides={} debug=True
Feb 23 08:10:05 np0005626463.localdomain sudo[72625]: pam_unix(sudo:session): session closed for user root
Feb 23 08:10:06 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.
Feb 23 08:10:06 np0005626463.localdomain podman[72628]: 2026-02-23 08:10:06.914907771 +0000 UTC m=+0.088795281 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.buildah.version=1.41.5, build-date=2026-01-12T22:10:15Z, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, architecture=x86_64, batch=17.1_20260112.1, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.component=openstack-collectd-container, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step3, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public)
Feb 23 08:10:06 np0005626463.localdomain podman[72628]: 2026-02-23 08:10:06.927530916 +0000 UTC m=+0.101418436 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, name=rhosp-rhel9/openstack-collectd, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, version=17.1.13, io.buildah.version=1.41.5, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:10:15Z, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, managed_by=tripleo_ansible, tcib_managed=true, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, container_name=collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd)
Feb 23 08:10:06 np0005626463.localdomain systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully.
Feb 23 08:10:09 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.
Feb 23 08:10:09 np0005626463.localdomain podman[72650]: 2026-02-23 08:10:09.895721269 +0000 UTC m=+0.070305935 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, name=rhosp-rhel9/openstack-iscsid, config_id=tripleo_step3, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, tcib_managed=true, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, version=17.1.13, container_name=iscsid, managed_by=tripleo_ansible, vcs-ref=705339545363fec600102567c4e923938e0f43b3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, vcs-type=git, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:34:43Z, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, build-date=2026-01-12T22:34:43Z)
Feb 23 08:10:09 np0005626463.localdomain podman[72650]: 2026-02-23 08:10:09.909364345 +0000 UTC m=+0.083949071 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid, build-date=2026-01-12T22:34:43Z, description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, config_id=tripleo_step3, org.opencontainers.image.created=2026-01-12T22:34:43Z, vcs-ref=705339545363fec600102567c4e923938e0f43b3, container_name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, maintainer=OpenStack TripleO Team, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, managed_by=tripleo_ansible, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', 
'/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.component=openstack-iscsid-container, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, release=1766032510)
Feb 23 08:10:09 np0005626463.localdomain systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully.
Feb 23 08:10:10 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.
Feb 23 08:10:10 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.
Feb 23 08:10:10 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.
Feb 23 08:10:10 np0005626463.localdomain systemd[1]: tmp-crun.wWNCkC.mount: Deactivated successfully.
Feb 23 08:10:10 np0005626463.localdomain podman[72671]: 2026-02-23 08:10:10.897575439 +0000 UTC m=+0.073719181 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=starting, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.5, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T23:07:47Z, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git, batch=17.1_20260112.1, build-date=2026-01-12T23:07:47Z, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_compute, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, release=1766032510, version=17.1.13, com.redhat.component=openstack-ceilometer-compute-container, name=rhosp-rhel9/openstack-ceilometer-compute, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 
'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']})
Feb 23 08:10:10 np0005626463.localdomain podman[72673]: 2026-02-23 08:10:10.957262761 +0000 UTC m=+0.128020815 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., batch=17.1_20260112.1, io.openshift.expose-services=, container_name=logrotate_crond, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, name=rhosp-rhel9/openstack-cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-cron-container, vcs-type=git, release=1766032510, version=17.1.13, config_id=tripleo_step4, build-date=2026-01-12T22:10:15Z, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, distribution-scope=public)
Feb 23 08:10:10 np0005626463.localdomain podman[72672]: 2026-02-23 08:10:10.9213351 +0000 UTC m=+0.092024042 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=starting, name=rhosp-rhel9/openstack-ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, managed_by=tripleo_ansible, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 
io.openshift.expose-services=, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vendor=Red Hat, Inc., release=1766032510, container_name=ceilometer_agent_ipmi, tcib_managed=true, config_id=tripleo_step4, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, io.buildah.version=1.41.5, com.redhat.component=openstack-ceilometer-ipmi-container)
Feb 23 08:10:10 np0005626463.localdomain podman[72671]: 2026-02-23 08:10:10.983376086 +0000 UTC m=+0.159519809 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, name=rhosp-rhel9/openstack-ceilometer-compute, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T23:07:47Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_compute, com.redhat.component=openstack-ceilometer-compute-container, vcs-type=git, build-date=2026-01-12T23:07:47Z, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_id=tripleo_step4, distribution-scope=public, tcib_managed=true, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20260112.1, vendor=Red Hat, Inc., io.buildah.version=1.41.5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 
'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1766032510)
Feb 23 08:10:11 np0005626463.localdomain systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully.
Feb 23 08:10:11 np0005626463.localdomain podman[72672]: 2026-02-23 08:10:11.004399212 +0000 UTC m=+0.175088084 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.expose-services=, distribution-scope=public, vcs-type=git, vendor=Red Hat, Inc., vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, name=rhosp-rhel9/openstack-ceilometer-ipmi, version=17.1.13, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, org.opencontainers.image.created=2026-01-12T23:07:30Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1766032510, container_name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, architecture=x86_64, config_id=tripleo_step4, url=https://www.redhat.com, build-date=2026-01-12T23:07:30Z)
Feb 23 08:10:11 np0005626463.localdomain systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully.
Feb 23 08:10:11 np0005626463.localdomain podman[72673]: 2026-02-23 08:10:11.041772368 +0000 UTC m=+0.212530372 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, version=17.1.13, build-date=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, name=rhosp-rhel9/openstack-cron, 
org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:10:15Z, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, release=1766032510, container_name=logrotate_crond)
Feb 23 08:10:11 np0005626463.localdomain systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully.
Feb 23 08:10:11 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.
Feb 23 08:10:11 np0005626463.localdomain podman[72744]: 2026-02-23 08:10:11.886484755 +0000 UTC m=+0.064558025 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, build-date=2026-01-12T23:32:04Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, version=17.1.13, architecture=x86_64, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, container_name=nova_migration_target, vcs-type=git, managed_by=tripleo_ansible, release=1766032510, tcib_managed=true, io.buildah.version=1.41.5, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.created=2026-01-12T23:32:04Z, url=https://www.redhat.com, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']})
Feb 23 08:10:12 np0005626463.localdomain podman[72744]: 2026-02-23 08:10:12.262190158 +0000 UTC m=+0.440263488 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, architecture=x86_64, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, tcib_managed=true, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, build-date=2026-01-12T23:32:04Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=nova_migration_target, vendor=Red Hat, Inc., config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T23:32:04Z, batch=17.1_20260112.1)
Feb 23 08:10:12 np0005626463.localdomain systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully.
Feb 23 08:10:14 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.
Feb 23 08:10:14 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.
Feb 23 08:10:14 np0005626463.localdomain systemd[1]: tmp-crun.JxDMPP.mount: Deactivated successfully.
Feb 23 08:10:14 np0005626463.localdomain podman[72766]: 2026-02-23 08:10:14.914844547 +0000 UTC m=+0.093262901 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=starting, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, architecture=x86_64, tcib_managed=true, release=1766032510, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:36:40Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, url=https://www.redhat.com, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, container_name=ovn_controller, managed_by=tripleo_ansible, build-date=2026-01-12T22:36:40Z, vendor=Red Hat, Inc., 
version=17.1.13, name=rhosp-rhel9/openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4)
Feb 23 08:10:14 np0005626463.localdomain podman[72766]: 2026-02-23 08:10:14.938151694 +0000 UTC m=+0.116570138 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, tcib_managed=true, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, build-date=2026-01-12T22:36:40Z, com.redhat.component=openstack-ovn-controller-container, description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-type=git, container_name=ovn_controller, release=1766032510, version=17.1.13, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:36:40Z, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, distribution-scope=public, name=rhosp-rhel9/openstack-ovn-controller, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, architecture=x86_64, config_id=tripleo_step4)
Feb 23 08:10:14 np0005626463.localdomain systemd[1]: tmp-crun.2m37Cg.mount: Deactivated successfully.
Feb 23 08:10:14 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Deactivated successfully.
Feb 23 08:10:14 np0005626463.localdomain podman[72767]: 2026-02-23 08:10:14.961134932 +0000 UTC m=+0.137435140 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=starting, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, container_name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', 
'/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:56:19Z, io.openshift.expose-services=, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, version=17.1.13, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, url=https://www.redhat.com, architecture=x86_64, vendor=Red Hat, Inc., managed_by=tripleo_ansible, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, distribution-scope=public)
Feb 23 08:10:15 np0005626463.localdomain podman[72767]: 2026-02-23 08:10:15.032260851 +0000 UTC m=+0.208560979 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, distribution-scope=public, vendor=Red Hat, Inc., build-date=2026-01-12T22:56:19Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, tcib_managed=true, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, config_id=tripleo_step4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, container_name=ovn_metadata_agent, org.opencontainers.image.created=2026-01-12T22:56:19Z, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Feb 23 08:10:15 np0005626463.localdomain systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Deactivated successfully.
Feb 23 08:10:16 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.
Feb 23 08:10:16 np0005626463.localdomain podman[72811]: 2026-02-23 08:10:16.916318967 +0000 UTC m=+0.088163541 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., config_id=tripleo_step1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, build-date=2026-01-12T22:10:14Z, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, 
cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, release=1766032510, com.redhat.component=openstack-qdrouterd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=metrics_qdr, name=rhosp-rhel9/openstack-qdrouterd, version=17.1.13, batch=17.1_20260112.1, io.buildah.version=1.41.5, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, vcs-type=git, managed_by=tripleo_ansible)
Feb 23 08:10:17 np0005626463.localdomain podman[72811]: 2026-02-23 08:10:17.131046248 +0000 UTC m=+0.302890882 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, batch=17.1_20260112.1, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, url=https://www.redhat.com, name=rhosp-rhel9/openstack-qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.openshift.expose-services=, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step1, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', 
'/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc., version=17.1.13, tcib_managed=true, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=metrics_qdr, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 23 08:10:17 np0005626463.localdomain systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully.
Feb 23 08:10:20 np0005626463.localdomain sshd[72841]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:10:21 np0005626463.localdomain sshd[72841]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:10:21 np0005626463.localdomain snmpd[67690]: empty variable list in _query
Feb 23 08:10:21 np0005626463.localdomain snmpd[67690]: empty variable list in _query
Feb 23 08:10:24 np0005626463.localdomain sudo[72843]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 08:10:24 np0005626463.localdomain sudo[72843]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:10:24 np0005626463.localdomain sudo[72843]: pam_unix(sudo:session): session closed for user root
Feb 23 08:10:24 np0005626463.localdomain sudo[72858]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 23 08:10:24 np0005626463.localdomain sudo[72858]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:10:25 np0005626463.localdomain sudo[72858]: pam_unix(sudo:session): session closed for user root
Feb 23 08:10:25 np0005626463.localdomain sudo[72905]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 08:10:25 np0005626463.localdomain sudo[72905]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:10:25 np0005626463.localdomain sudo[72905]: pam_unix(sudo:session): session closed for user root
Feb 23 08:10:37 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.
Feb 23 08:10:37 np0005626463.localdomain systemd[1]: tmp-crun.ZAAYEN.mount: Deactivated successfully.
Feb 23 08:10:37 np0005626463.localdomain podman[72920]: 2026-02-23 08:10:37.941289754 +0000 UTC m=+0.108054943 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=collectd, name=rhosp-rhel9/openstack-collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, batch=17.1_20260112.1, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:10:15Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, vcs-type=git, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:10:15Z, url=https://www.redhat.com, com.redhat.component=openstack-collectd-container, version=17.1.13, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public)
Feb 23 08:10:37 np0005626463.localdomain podman[72920]: 2026-02-23 08:10:37.957488449 +0000 UTC m=+0.124253688 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, com.redhat.component=openstack-collectd-container, container_name=collectd, architecture=x86_64, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, build-date=2026-01-12T22:10:15Z, name=rhosp-rhel9/openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, 
distribution-scope=public, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, tcib_managed=true, config_id=tripleo_step3, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, release=1766032510, vendor=Red Hat, Inc., vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, io.buildah.version=1.41.5, version=17.1.13)
Feb 23 08:10:37 np0005626463.localdomain systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully.
Feb 23 08:10:40 np0005626463.localdomain sshd[72942]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:10:40 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.
Feb 23 08:10:40 np0005626463.localdomain sshd[72942]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:10:40 np0005626463.localdomain podman[72944]: 2026-02-23 08:10:40.919451468 +0000 UTC m=+0.094274393 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vcs-ref=705339545363fec600102567c4e923938e0f43b3, summary=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, tcib_managed=true, distribution-scope=public, vcs-type=git, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, build-date=2026-01-12T22:34:43Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, com.redhat.component=openstack-iscsid-container, maintainer=OpenStack TripleO Team, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:34:43Z, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=iscsid, io.buildah.version=1.41.5, release=1766032510, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid)
Feb 23 08:10:40 np0005626463.localdomain podman[72944]: 2026-02-23 08:10:40.938076569 +0000 UTC m=+0.112899504 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, com.redhat.component=openstack-iscsid-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, name=rhosp-rhel9/openstack-iscsid, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, distribution-scope=public, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:34:43Z, container_name=iscsid, release=1766032510, vcs-ref=705339545363fec600102567c4e923938e0f43b3, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, build-date=2026-01-12T22:34:43Z, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., batch=17.1_20260112.1, architecture=x86_64, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, description=Red Hat OpenStack Platform 17.1 iscsid)
Feb 23 08:10:40 np0005626463.localdomain systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully.
Feb 23 08:10:41 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.
Feb 23 08:10:41 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.
Feb 23 08:10:41 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.
Feb 23 08:10:41 np0005626463.localdomain podman[72965]: 2026-02-23 08:10:41.909208841 +0000 UTC m=+0.083218458 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vcs-type=git, container_name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, version=17.1.13, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T23:07:30Z, maintainer=OpenStack TripleO Team, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-ceilometer-ipmi, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, tcib_managed=true)
Feb 23 08:10:41 np0005626463.localdomain podman[72965]: 2026-02-23 08:10:41.942211531 +0000 UTC m=+0.116221188 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, release=1766032510, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, vcs-type=git, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, batch=17.1_20260112.1, io.openshift.expose-services=, architecture=x86_64, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-ceilometer-ipmi-container, build-date=2026-01-12T23:07:30Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, managed_by=tripleo_ansible, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi)
Feb 23 08:10:41 np0005626463.localdomain systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully.
Feb 23 08:10:41 np0005626463.localdomain podman[72964]: 2026-02-23 08:10:41.965468136 +0000 UTC m=+0.141565357 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, url=https://www.redhat.com, name=rhosp-rhel9/openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, build-date=2026-01-12T23:07:47Z, distribution-scope=public, tcib_managed=true, maintainer=OpenStack TripleO Team, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., vcs-type=git, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:47Z, container_name=ceilometer_agent_compute, io.buildah.version=1.41.5, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06)
Feb 23 08:10:42 np0005626463.localdomain podman[72966]: 2026-02-23 08:10:42.010923914 +0000 UTC m=+0.180788711 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, release=1766032510, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, name=rhosp-rhel9/openstack-cron, vendor=Red Hat, Inc., io.openshift.expose-services=, build-date=2026-01-12T22:10:15Z, summary=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-cron-container, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, container_name=logrotate_crond, tcib_managed=true, url=https://www.redhat.com, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, distribution-scope=public, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-type=git, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 cron)
Feb 23 08:10:42 np0005626463.localdomain podman[72966]: 2026-02-23 08:10:42.019607765 +0000 UTC m=+0.189472532 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, config_id=tripleo_step4, vcs-type=git, architecture=x86_64, container_name=logrotate_crond, build-date=2026-01-12T22:10:15Z, com.redhat.component=openstack-cron-container, io.openshift.expose-services=, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-cron, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-cron, vendor=Red Hat, Inc., tcib_managed=true, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:10:15Z)
Feb 23 08:10:42 np0005626463.localdomain systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully.
Feb 23 08:10:42 np0005626463.localdomain podman[72964]: 2026-02-23 08:10:42.069902135 +0000 UTC m=+0.245999306 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:07:47Z, version=17.1.13, container_name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-ceilometer-compute-container, name=rhosp-rhel9/openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-ceilometer-compute, io.openshift.expose-services=, managed_by=tripleo_ansible, release=1766032510, io.buildah.version=1.41.5, architecture=x86_64, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T23:07:47Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06)
Feb 23 08:10:42 np0005626463.localdomain systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully.
Feb 23 08:10:42 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.
Feb 23 08:10:42 np0005626463.localdomain podman[73035]: 2026-02-23 08:10:42.913613881 +0000 UTC m=+0.087424920 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.created=2026-01-12T23:32:04Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:32:04Z, vcs-type=git, maintainer=OpenStack TripleO Team, architecture=x86_64, url=https://www.redhat.com, config_id=tripleo_step4, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-compute, 
batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, distribution-scope=public, container_name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, version=17.1.13, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, name=rhosp-rhel9/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5)
Feb 23 08:10:43 np0005626463.localdomain podman[73035]: 2026-02-23 08:10:43.331340965 +0000 UTC m=+0.505152044 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, build-date=2026-01-12T23:32:04Z, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T23:32:04Z, container_name=nova_migration_target, io.buildah.version=1.41.5)
Feb 23 08:10:43 np0005626463.localdomain systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully.
Feb 23 08:10:45 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.
Feb 23 08:10:45 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.
Feb 23 08:10:45 np0005626463.localdomain podman[73059]: 2026-02-23 08:10:45.912855003 +0000 UTC m=+0.091569808 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, org.opencontainers.image.created=2026-01-12T22:36:40Z, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-ovn-controller, maintainer=OpenStack TripleO Team, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_id=tripleo_step4, distribution-scope=public, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, release=1766032510, description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, 
build-date=2026-01-12T22:36:40Z, vcs-type=git, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-ovn-controller-container, container_name=ovn_controller)
Feb 23 08:10:45 np0005626463.localdomain podman[73060]: 2026-02-23 08:10:45.954926535 +0000 UTC m=+0.130925185 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, build-date=2026-01-12T22:56:19Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=ovn_metadata_agent, version=17.1.13, io.buildah.version=1.41.5, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, config_id=tripleo_step4, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:56:19Z, release=1766032510, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.expose-services=, url=https://www.redhat.com, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn)
Feb 23 08:10:45 np0005626463.localdomain podman[73059]: 2026-02-23 08:10:45.965391352 +0000 UTC m=+0.144106157 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-ovn-controller, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.buildah.version=1.41.5, io.openshift.expose-services=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, build-date=2026-01-12T22:36:40Z, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ovn-controller, 
summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., distribution-scope=public, com.redhat.component=openstack-ovn-controller-container, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, release=1766032510)
Feb 23 08:10:45 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Deactivated successfully.
Feb 23 08:10:46 np0005626463.localdomain podman[73060]: 2026-02-23 08:10:46.022254786 +0000 UTC m=+0.198253366 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:56:19Z, config_id=tripleo_step4, tcib_managed=true, build-date=2026-01-12T22:56:19Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, vendor=Red Hat, Inc., distribution-scope=public, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.13, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, url=https://www.redhat.com, managed_by=tripleo_ansible, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, container_name=ovn_metadata_agent, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, release=1766032510, io.openshift.expose-services=)
Feb 23 08:10:46 np0005626463.localdomain systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Deactivated successfully.
Feb 23 08:10:47 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.
Feb 23 08:10:47 np0005626463.localdomain systemd[1]: tmp-crun.ELZPGa.mount: Deactivated successfully.
Feb 23 08:10:47 np0005626463.localdomain podman[73107]: 2026-02-23 08:10:47.935603867 +0000 UTC m=+0.092982812 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, org.opencontainers.image.created=2026-01-12T22:10:14Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, build-date=2026-01-12T22:10:14Z, 
konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, io.buildah.version=1.41.5, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, batch=17.1_20260112.1, release=1766032510, name=rhosp-rhel9/openstack-qdrouterd)
Feb 23 08:10:48 np0005626463.localdomain podman[73107]: 2026-02-23 08:10:48.13533385 +0000 UTC m=+0.292712785 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, release=1766032510, build-date=2026-01-12T22:10:14Z, config_id=tripleo_step1, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, url=https://www.redhat.com, name=rhosp-rhel9/openstack-qdrouterd, vcs-type=git, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, distribution-scope=public, architecture=x86_64, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., batch=17.1_20260112.1)
Feb 23 08:10:48 np0005626463.localdomain systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully.
Feb 23 08:11:08 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.
Feb 23 08:11:08 np0005626463.localdomain podman[73137]: 2026-02-23 08:11:08.905444756 +0000 UTC m=+0.081004532 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.buildah.version=1.41.5, architecture=x86_64, io.openshift.expose-services=, vcs-type=git, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:10:15Z, tcib_managed=true, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, com.redhat.component=openstack-collectd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, batch=17.1_20260112.1, distribution-scope=public)
Feb 23 08:11:08 np0005626463.localdomain podman[73137]: 2026-02-23 08:11:08.919234284 +0000 UTC m=+0.094794060 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, io.openshift.expose-services=, container_name=collectd, release=1766032510, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, managed_by=tripleo_ansible, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, 
konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-collectd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.buildah.version=1.41.5, vcs-type=git, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-collectd, build-date=2026-01-12T22:10:15Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_id=tripleo_step3, distribution-scope=public, batch=17.1_20260112.1)
Feb 23 08:11:08 np0005626463.localdomain systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully.
Feb 23 08:11:11 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.
Feb 23 08:11:11 np0005626463.localdomain systemd[1]: tmp-crun.MANFgF.mount: Deactivated successfully.
Feb 23 08:11:11 np0005626463.localdomain podman[73157]: 2026-02-23 08:11:11.9425808 +0000 UTC m=+0.077789450 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, build-date=2026-01-12T22:34:43Z, vcs-ref=705339545363fec600102567c4e923938e0f43b3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, version=17.1.13, vcs-type=git, distribution-scope=public, com.redhat.component=openstack-iscsid-container, managed_by=tripleo_ansible, release=1766032510, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, summary=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', 
'/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, batch=17.1_20260112.1)
Feb 23 08:11:11 np0005626463.localdomain podman[73157]: 2026-02-23 08:11:11.975501215 +0000 UTC m=+0.110709825 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:34:43Z, vcs-ref=705339545363fec600102567c4e923938e0f43b3, tcib_managed=true, url=https://www.redhat.com, com.redhat.component=openstack-iscsid-container, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, container_name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, build-date=2026-01-12T22:34:43Z, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, release=1766032510, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, name=rhosp-rhel9/openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, batch=17.1_20260112.1, distribution-scope=public, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3)
Feb 23 08:11:11 np0005626463.localdomain systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully.
Feb 23 08:11:11 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.
Feb 23 08:11:12 np0005626463.localdomain podman[73176]: 2026-02-23 08:11:12.091402333 +0000 UTC m=+0.084253625 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, managed_by=tripleo_ansible, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, tcib_managed=true, release=1766032510, name=rhosp-rhel9/openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2026-01-12T23:07:30Z, maintainer=OpenStack TripleO Team, vcs-type=git, io.buildah.version=1.41.5, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 23 08:11:12 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.
Feb 23 08:11:12 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.
Feb 23 08:11:12 np0005626463.localdomain podman[73176]: 2026-02-23 08:11:12.12724265 +0000 UTC m=+0.120093962 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.buildah.version=1.41.5, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, container_name=ceilometer_agent_ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T23:07:30Z, vendor=Red Hat, Inc., build-date=2026-01-12T23:07:30Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, distribution-scope=public)
Feb 23 08:11:12 np0005626463.localdomain systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully.
Feb 23 08:11:12 np0005626463.localdomain podman[73198]: 2026-02-23 08:11:12.202067255 +0000 UTC m=+0.084573675 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-cron-container, vcs-type=git, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, release=1766032510, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, build-date=2026-01-12T22:10:15Z, container_name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, org.opencontainers.image.created=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, tcib_managed=true, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-cron)
Feb 23 08:11:12 np0005626463.localdomain podman[73198]: 2026-02-23 08:11:12.213577641 +0000 UTC m=+0.096084091 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, distribution-scope=public, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-cron-container, io.openshift.expose-services=, managed_by=tripleo_ansible, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20260112.1, build-date=2026-01-12T22:10:15Z, container_name=logrotate_crond, 
config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, maintainer=OpenStack TripleO Team, release=1766032510, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:10:15Z, name=rhosp-rhel9/openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron)
Feb 23 08:11:12 np0005626463.localdomain systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully.
Feb 23 08:11:12 np0005626463.localdomain podman[73196]: 2026-02-23 08:11:12.295445528 +0000 UTC m=+0.179399094 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64, io.buildah.version=1.41.5, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-ceilometer-compute, io.openshift.expose-services=, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:47Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=ceilometer_agent_compute, release=1766032510, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, org.opencontainers.image.created=2026-01-12T23:07:47Z, version=17.1.13, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 
'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4)
Feb 23 08:11:12 np0005626463.localdomain podman[73196]: 2026-02-23 08:11:12.324782669 +0000 UTC m=+0.208736205 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, url=https://www.redhat.com, tcib_managed=true, managed_by=tripleo_ansible, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, io.openshift.expose-services=, 
konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, com.redhat.component=openstack-ceilometer-compute-container, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2026-01-12T23:07:47Z, org.opencontainers.image.created=2026-01-12T23:07:47Z, batch=17.1_20260112.1, container_name=ceilometer_agent_compute, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 23 08:11:12 np0005626463.localdomain systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully.
Feb 23 08:11:12 np0005626463.localdomain systemd[1]: tmp-crun.bOSQiJ.mount: Deactivated successfully.
Feb 23 08:11:13 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.
Feb 23 08:11:13 np0005626463.localdomain podman[73250]: 2026-02-23 08:11:13.883141494 +0000 UTC m=+0.066522482 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, version=17.1.13, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20260112.1, release=1766032510, maintainer=OpenStack TripleO Team, tcib_managed=true, distribution-scope=public, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_migration_target, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., config_id=tripleo_step4, name=rhosp-rhel9/openstack-nova-compute, architecture=x86_64, com.redhat.component=openstack-nova-compute-container)
Feb 23 08:11:14 np0005626463.localdomain podman[73250]: 2026-02-23 08:11:14.244644286 +0000 UTC m=+0.428025294 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, maintainer=OpenStack TripleO Team, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=nova_migration_target, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, url=https://www.redhat.com, build-date=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, version=17.1.13, config_id=tripleo_step4, name=rhosp-rhel9/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 23 08:11:14 np0005626463.localdomain systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully.
Feb 23 08:11:16 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.
Feb 23 08:11:16 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.
Feb 23 08:11:16 np0005626463.localdomain podman[73274]: 2026-02-23 08:11:16.940962043 +0000 UTC m=+0.110380204 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, version=17.1.13, batch=17.1_20260112.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, managed_by=tripleo_ansible, build-date=2026-01-12T22:36:40Z, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.buildah.version=1.41.5, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:36:40Z, container_name=ovn_controller, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ovn-controller, 
name=rhosp-rhel9/openstack-ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ovn-controller-container, vcs-type=git, release=1766032510)
Feb 23 08:11:16 np0005626463.localdomain podman[73274]: 2026-02-23 08:11:16.986695484 +0000 UTC m=+0.156113625 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, batch=17.1_20260112.1, vcs-type=git, url=https://www.redhat.com, build-date=2026-01-12T22:36:40Z, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, release=1766032510, name=rhosp-rhel9/openstack-ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., version=17.1.13, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_controller, io.buildah.version=1.41.5, 
architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:36:40Z, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true)
Feb 23 08:11:16 np0005626463.localdomain systemd[1]: tmp-crun.up6nSo.mount: Deactivated successfully.
Feb 23 08:11:17 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Deactivated successfully.
Feb 23 08:11:17 np0005626463.localdomain podman[73275]: 2026-02-23 08:11:17.005280414 +0000 UTC m=+0.174173518 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, url=https://www.redhat.com, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:56:19Z, vcs-type=git, architecture=x86_64, io.buildah.version=1.41.5, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, build-date=2026-01-12T22:56:19Z, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team)
Feb 23 08:11:17 np0005626463.localdomain podman[73275]: 2026-02-23 08:11:17.049212139 +0000 UTC m=+0.218105293 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, io.openshift.expose-services=, batch=17.1_20260112.1, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, 
cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, vcs-type=git, architecture=x86_64, url=https://www.redhat.com, release=1766032510, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T22:56:19Z)
Feb 23 08:11:17 np0005626463.localdomain systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Deactivated successfully.
Feb 23 08:11:17 np0005626463.localdomain sshd[73322]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:11:18 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.
Feb 23 08:11:18 np0005626463.localdomain podman[73323]: 2026-02-23 08:11:18.911101715 +0000 UTC m=+0.084576455 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, name=rhosp-rhel9/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, distribution-scope=public, managed_by=tripleo_ansible, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, architecture=x86_64, container_name=metrics_qdr, build-date=2026-01-12T22:10:14Z, config_id=tripleo_step1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, tcib_managed=true, io.buildah.version=1.41.5, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, url=https://www.redhat.com, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510)
Feb 23 08:11:19 np0005626463.localdomain podman[73323]: 2026-02-23 08:11:19.112060443 +0000 UTC m=+0.285535143 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, container_name=metrics_qdr, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, vendor=Red Hat, Inc., architecture=x86_64, build-date=2026-01-12T22:10:14Z, maintainer=OpenStack TripleO Team, tcib_managed=true, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, name=rhosp-rhel9/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1)
Feb 23 08:11:19 np0005626463.localdomain systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully.
Feb 23 08:11:19 np0005626463.localdomain sshd[73352]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:11:19 np0005626463.localdomain sshd[73352]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:11:21 np0005626463.localdomain sshd[73322]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:11:25 np0005626463.localdomain sudo[73355]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 08:11:25 np0005626463.localdomain sudo[73355]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:11:25 np0005626463.localdomain sudo[73355]: pam_unix(sudo:session): session closed for user root
Feb 23 08:11:25 np0005626463.localdomain sudo[73370]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Feb 23 08:11:25 np0005626463.localdomain sudo[73370]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:11:26 np0005626463.localdomain podman[73456]: 2026-02-23 08:11:26.738980953 +0000 UTC m=+0.094620393 container exec fdf07215f0388d0ebc44f1f3744080ba594441e647c300d0dade62ff5beba234 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-crash-np0005626463, com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.42.2, ceph=True, CEPH_POINT_RELEASE=, version=7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.openshift.expose-services=, vcs-type=git, release=1770267347, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, description=Red Hat Ceph Storage 7, name=rhceph, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2026-02-09T10:25:24Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph)
Feb 23 08:11:26 np0005626463.localdomain podman[73456]: 2026-02-23 08:11:26.848398666 +0000 UTC m=+0.204038096 container exec_died fdf07215f0388d0ebc44f1f3744080ba594441e647c300d0dade62ff5beba234 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-crash-np0005626463, io.buildah.version=1.42.2, org.opencontainers.image.created=2026-02-09T10:25:24Z, distribution-scope=public, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2026-02-09T10:25:24Z, vendor=Red Hat, Inc., ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1770267347, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, RELEASE=main, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_BRANCH=main, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, version=7, name=rhceph, com.redhat.component=rhceph-container, GIT_CLEAN=True)
Feb 23 08:11:27 np0005626463.localdomain sudo[73370]: pam_unix(sudo:session): session closed for user root
Feb 23 08:11:27 np0005626463.localdomain sudo[73525]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 08:11:27 np0005626463.localdomain sudo[73525]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:11:27 np0005626463.localdomain sudo[73525]: pam_unix(sudo:session): session closed for user root
Feb 23 08:11:27 np0005626463.localdomain sudo[73540]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 23 08:11:27 np0005626463.localdomain sudo[73540]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:11:27 np0005626463.localdomain sudo[73540]: pam_unix(sudo:session): session closed for user root
Feb 23 08:11:29 np0005626463.localdomain sudo[73587]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 08:11:29 np0005626463.localdomain sudo[73587]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:11:29 np0005626463.localdomain sudo[73587]: pam_unix(sudo:session): session closed for user root
Feb 23 08:11:39 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.
Feb 23 08:11:39 np0005626463.localdomain systemd[1]: tmp-crun.0vd7f6.mount: Deactivated successfully.
Feb 23 08:11:39 np0005626463.localdomain podman[73602]: 2026-02-23 08:11:39.935841287 +0000 UTC m=+0.109658161 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, container_name=collectd, io.openshift.expose-services=, name=rhosp-rhel9/openstack-collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, vcs-type=git, managed_by=tripleo_ansible, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:15Z, url=https://www.redhat.com, com.redhat.component=openstack-collectd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, release=1766032510, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, build-date=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 23 08:11:39 np0005626463.localdomain podman[73602]: 2026-02-23 08:11:39.948286732 +0000 UTC m=+0.122103636 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, com.redhat.component=openstack-collectd-container, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, version=17.1.13, batch=17.1_20260112.1, distribution-scope=public, vcs-type=git, name=rhosp-rhel9/openstack-collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, vendor=Red Hat, Inc., config_id=tripleo_step3, build-date=2026-01-12T22:10:15Z, managed_by=tripleo_ansible, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']})
Feb 23 08:11:39 np0005626463.localdomain systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully.
Feb 23 08:11:42 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.
Feb 23 08:11:42 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.
Feb 23 08:11:42 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.
Feb 23 08:11:42 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.
Feb 23 08:11:42 np0005626463.localdomain podman[73623]: 2026-02-23 08:11:42.921100374 +0000 UTC m=+0.092548118 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, org.opencontainers.image.created=2026-01-12T23:07:47Z, tcib_managed=true, version=17.1.13, com.redhat.component=openstack-ceilometer-compute-container, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., container_name=ceilometer_agent_compute, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-ceilometer-compute, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, release=1766032510, build-date=2026-01-12T23:07:47Z, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, architecture=x86_64, distribution-scope=public, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team)
Feb 23 08:11:42 np0005626463.localdomain podman[73623]: 2026-02-23 08:11:42.954215335 +0000 UTC m=+0.125663049 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:47Z, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20260112.1, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-compute-container, tcib_managed=true, build-date=2026-01-12T23:07:47Z, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, io.buildah.version=1.41.5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, release=1766032510, distribution-scope=public, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64, vendor=Red Hat, Inc., version=17.1.13, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_compute, name=rhosp-rhel9/openstack-ceilometer-compute)
Feb 23 08:11:42 np0005626463.localdomain systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully.
Feb 23 08:11:43 np0005626463.localdomain podman[73622]: 2026-02-23 08:11:43.03064722 +0000 UTC m=+0.202897639 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, distribution-scope=public, url=https://www.redhat.com, container_name=iscsid, io.buildah.version=1.41.5, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, summary=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, managed_by=tripleo_ansible, config_id=tripleo_step3, release=1766032510, batch=17.1_20260112.1, com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, 
cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid, build-date=2026-01-12T22:34:43Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, vcs-ref=705339545363fec600102567c4e923938e0f43b3, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:34:43Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid)
Feb 23 08:11:43 np0005626463.localdomain podman[73625]: 2026-02-23 08:11:43.087081531 +0000 UTC m=+0.249971103 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, version=17.1.13, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, name=rhosp-rhel9/openstack-cron, release=1766032510, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, maintainer=OpenStack TripleO Team, architecture=x86_64, build-date=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, tcib_managed=true, com.redhat.component=openstack-cron-container)
Feb 23 08:11:43 np0005626463.localdomain podman[73625]: 2026-02-23 08:11:43.095034494 +0000 UTC m=+0.257924086 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:10:15Z, architecture=x86_64, build-date=2026-01-12T22:10:15Z, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-cron, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, container_name=logrotate_crond, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-cron-container, url=https://www.redhat.com, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 cron)
Feb 23 08:11:43 np0005626463.localdomain systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully.
Feb 23 08:11:43 np0005626463.localdomain podman[73622]: 2026-02-23 08:11:43.115644248 +0000 UTC m=+0.287894677 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, release=1766032510, summary=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vendor=Red Hat, Inc., com.redhat.component=openstack-iscsid-container, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, batch=17.1_20260112.1, io.buildah.version=1.41.5, managed_by=tripleo_ansible, distribution-scope=public, config_id=tripleo_step3, 
io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, maintainer=OpenStack TripleO Team, container_name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, build-date=2026-01-12T22:34:43Z, io.openshift.expose-services=, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, name=rhosp-rhel9/openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z)
Feb 23 08:11:43 np0005626463.localdomain systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully.
Feb 23 08:11:43 np0005626463.localdomain podman[73624]: 2026-02-23 08:11:43.133920758 +0000 UTC m=+0.300837558 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, name=rhosp-rhel9/openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, build-date=2026-01-12T23:07:30Z, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, distribution-scope=public, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 
ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vendor=Red Hat, Inc., io.buildah.version=1.41.5, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-ipmi-container, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, managed_by=tripleo_ansible)
Feb 23 08:11:43 np0005626463.localdomain podman[73624]: 2026-02-23 08:11:43.168222646 +0000 UTC m=+0.335139436 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, vcs-type=git, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, build-date=2026-01-12T23:07:30Z, name=rhosp-rhel9/openstack-ceilometer-ipmi, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, container_name=ceilometer_agent_ipmi, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T23:07:30Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, managed_by=tripleo_ansible, version=17.1.13, io.buildah.version=1.41.5)
Feb 23 08:11:43 np0005626463.localdomain systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully.
Feb 23 08:11:44 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.
Feb 23 08:11:44 np0005626463.localdomain podman[73715]: 2026-02-23 08:11:44.907635546 +0000 UTC m=+0.082808878 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T23:32:04Z, release=1766032510, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.buildah.version=1.41.5, architecture=x86_64, distribution-scope=public, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, build-date=2026-01-12T23:32:04Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, container_name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute)
Feb 23 08:11:45 np0005626463.localdomain podman[73715]: 2026-02-23 08:11:45.230260315 +0000 UTC m=+0.405433677 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.openshift.expose-services=, distribution-scope=public, version=17.1.13, build-date=2026-01-12T23:32:04Z, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, 
io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1766032510, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible)
Feb 23 08:11:45 np0005626463.localdomain systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully.
Feb 23 08:11:47 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.
Feb 23 08:11:47 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.
Feb 23 08:11:47 np0005626463.localdomain podman[73739]: 2026-02-23 08:11:47.911449145 +0000 UTC m=+0.084899896 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, io.openshift.expose-services=, build-date=2026-01-12T22:36:40Z, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, container_name=ovn_controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.buildah.version=1.41.5, vcs-type=git, vendor=Red Hat, Inc., distribution-scope=public, name=rhosp-rhel9/openstack-ovn-controller, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-ovn-controller-container, architecture=x86_64, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller)
Feb 23 08:11:47 np0005626463.localdomain podman[73739]: 2026-02-23 08:11:47.966490021 +0000 UTC m=+0.139940742 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, name=rhosp-rhel9/openstack-ovn-controller, batch=17.1_20260112.1, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, build-date=2026-01-12T22:36:40Z, architecture=x86_64, version=17.1.13, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.41.5, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:36:40Z)
Feb 23 08:11:47 np0005626463.localdomain systemd[1]: tmp-crun.cq10J5.mount: Deactivated successfully.
Feb 23 08:11:47 np0005626463.localdomain podman[73740]: 2026-02-23 08:11:47.978865695 +0000 UTC m=+0.147174123 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, tcib_managed=true, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, maintainer=OpenStack TripleO Team, version=17.1.13, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', 
'/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, io.buildah.version=1.41.5, build-date=2026-01-12T22:56:19Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:56:19Z, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Feb 23 08:11:47 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Deactivated successfully.
Feb 23 08:11:48 np0005626463.localdomain podman[73740]: 2026-02-23 08:11:48.051790019 +0000 UTC m=+0.220098377 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, version=17.1.13, container_name=ovn_metadata_agent, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, 
managed_by=tripleo_ansible, batch=17.1_20260112.1, build-date=2026-01-12T22:56:19Z, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T22:56:19Z, architecture=x86_64, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 23 08:11:48 np0005626463.localdomain systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Deactivated successfully.
Feb 23 08:11:49 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.
Feb 23 08:11:49 np0005626463.localdomain podman[73789]: 2026-02-23 08:11:49.921227655 +0000 UTC m=+0.095774510 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:10:14Z, config_id=tripleo_step1, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-qdrouterd, container_name=metrics_qdr, release=1766032510, com.redhat.component=openstack-qdrouterd-container)
Feb 23 08:11:50 np0005626463.localdomain podman[73789]: 2026-02-23 08:11:50.118800895 +0000 UTC m=+0.293347740 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, release=1766032510, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, 
org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:14Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-qdrouterd, tcib_managed=true, io.buildah.version=1.41.5, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, io.openshift.expose-services=, architecture=x86_64, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:10:14Z, managed_by=tripleo_ansible, vendor=Red Hat, Inc., container_name=metrics_qdr)
Feb 23 08:11:50 np0005626463.localdomain systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully.
Feb 23 08:11:56 np0005626463.localdomain sshd[73816]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:11:57 np0005626463.localdomain sshd[73816]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:12:00 np0005626463.localdomain sshd[73818]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:12:01 np0005626463.localdomain sshd[73818]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:12:10 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.
Feb 23 08:12:10 np0005626463.localdomain podman[73820]: 2026-02-23 08:12:10.927060103 +0000 UTC m=+0.099700487 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., build-date=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, version=17.1.13, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, io.openshift.expose-services=, release=1766032510, container_name=collectd, com.redhat.component=openstack-collectd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, distribution-scope=public, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']})
Feb 23 08:12:10 np0005626463.localdomain podman[73820]: 2026-02-23 08:12:10.934931803 +0000 UTC m=+0.107572167 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.component=openstack-collectd-container, org.opencontainers.image.created=2026-01-12T22:10:15Z, tcib_managed=true, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', 
'/sys/fs/cgroup:/sys/fs/cgroup:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, architecture=x86_64, container_name=collectd, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20260112.1, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:10:15Z, vcs-type=git, config_id=tripleo_step3, name=rhosp-rhel9/openstack-collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, release=1766032510)
Feb 23 08:12:10 np0005626463.localdomain systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully.
Feb 23 08:12:13 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.
Feb 23 08:12:13 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.
Feb 23 08:12:13 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.
Feb 23 08:12:13 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.
Feb 23 08:12:13 np0005626463.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Feb 23 08:12:13 np0005626463.localdomain recover_tripleo_nova_virtqemud[73865]: 61982
Feb 23 08:12:13 np0005626463.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Feb 23 08:12:13 np0005626463.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Feb 23 08:12:13 np0005626463.localdomain podman[73840]: 2026-02-23 08:12:13.927526218 +0000 UTC m=+0.098534199 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, name=rhosp-rhel9/openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-iscsid-container, build-date=2026-01-12T22:34:43Z, managed_by=tripleo_ansible, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step3, org.opencontainers.image.created=2026-01-12T22:34:43Z, container_name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.openshift.expose-services=, vcs-ref=705339545363fec600102567c4e923938e0f43b3)
Feb 23 08:12:13 np0005626463.localdomain podman[73840]: 2026-02-23 08:12:13.968284083 +0000 UTC m=+0.139292044 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step3, release=1766032510, distribution-scope=public, build-date=2026-01-12T22:34:43Z, container_name=iscsid, com.redhat.component=openstack-iscsid-container, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-iscsid, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.buildah.version=1.41.5, url=https://www.redhat.com, version=17.1.13, vcs-type=git, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, 
io.openshift.expose-services=, vcs-ref=705339545363fec600102567c4e923938e0f43b3, tcib_managed=true, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 23 08:12:13 np0005626463.localdomain podman[73841]: 2026-02-23 08:12:13.976126392 +0000 UTC m=+0.146010077 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, batch=17.1_20260112.1, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-compute-container, version=17.1.13, build-date=2026-01-12T23:07:47Z, org.opencontainers.image.created=2026-01-12T23:07:47Z, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, managed_by=tripleo_ansible, release=1766032510, container_name=ceilometer_agent_compute, tcib_managed=true, distribution-scope=public, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 23 08:12:13 np0005626463.localdomain systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully.
Feb 23 08:12:13 np0005626463.localdomain podman[73843]: 2026-02-23 08:12:13.996752307 +0000 UTC m=+0.157806902 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., batch=17.1_20260112.1, managed_by=tripleo_ansible, build-date=2026-01-12T22:10:15Z, container_name=logrotate_crond, architecture=x86_64, com.redhat.component=openstack-cron-container, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, name=rhosp-rhel9/openstack-cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, io.buildah.version=1.41.5, version=17.1.13, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']})
Feb 23 08:12:14 np0005626463.localdomain podman[73842]: 2026-02-23 08:12:14.033501314 +0000 UTC m=+0.198151754 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.expose-services=, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, release=1766032510, vendor=Red Hat, Inc., build-date=2026-01-12T23:07:30Z, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-ipmi-container, batch=17.1_20260112.1, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, io.buildah.version=1.41.5, tcib_managed=true, name=rhosp-rhel9/openstack-ceilometer-ipmi)
Feb 23 08:12:14 np0005626463.localdomain podman[73843]: 2026-02-23 08:12:14.038214914 +0000 UTC m=+0.199269529 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, managed_by=tripleo_ansible, distribution-scope=public, url=https://www.redhat.com, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:10:15Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:15Z, vendor=Red Hat, Inc., io.openshift.expose-services=, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, architecture=x86_64, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, container_name=logrotate_crond, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-cron, com.redhat.component=openstack-cron-container, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 cron)
Feb 23 08:12:14 np0005626463.localdomain systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully.
Feb 23 08:12:14 np0005626463.localdomain podman[73841]: 2026-02-23 08:12:14.062191954 +0000 UTC m=+0.232075669 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, distribution-scope=public, org.opencontainers.image.created=2026-01-12T23:07:47Z, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T23:07:47Z, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.buildah.version=1.41.5, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_compute, 
name=rhosp-rhel9/openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, io.openshift.expose-services=, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-ceilometer-compute-container, vcs-type=git)
Feb 23 08:12:14 np0005626463.localdomain systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully.
Feb 23 08:12:14 np0005626463.localdomain podman[73842]: 2026-02-23 08:12:14.08723934 +0000 UTC m=+0.251889750 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_id=tripleo_step4, release=1766032510, build-date=2026-01-12T23:07:30Z, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-ceilometer-ipmi, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, io.openshift.expose-services=, container_name=ceilometer_agent_ipmi, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:07:30Z, 
maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06)
Feb 23 08:12:14 np0005626463.localdomain systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully.
Feb 23 08:12:14 np0005626463.localdomain systemd[1]: tmp-crun.moivRN.mount: Deactivated successfully.
Feb 23 08:12:15 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.
Feb 23 08:12:15 np0005626463.localdomain systemd[1]: tmp-crun.Dp0eP3.mount: Deactivated successfully.
Feb 23 08:12:15 np0005626463.localdomain podman[73932]: 2026-02-23 08:12:15.919080276 +0000 UTC m=+0.091297569 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-nova-compute, url=https://www.redhat.com, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, container_name=nova_migration_target, managed_by=tripleo_ansible, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., distribution-scope=public, build-date=2026-01-12T23:32:04Z, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, org.opencontainers.image.created=2026-01-12T23:32:04Z, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true)
Feb 23 08:12:16 np0005626463.localdomain podman[73932]: 2026-02-23 08:12:16.304448083 +0000 UTC m=+0.476665406 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, container_name=nova_migration_target, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, managed_by=tripleo_ansible, io.openshift.expose-services=, build-date=2026-01-12T23:32:04Z, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, name=rhosp-rhel9/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, release=1766032510, version=17.1.13, org.opencontainers.image.created=2026-01-12T23:32:04Z, maintainer=OpenStack TripleO Team)
Feb 23 08:12:16 np0005626463.localdomain systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully.
Feb 23 08:12:18 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.
Feb 23 08:12:18 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.
Feb 23 08:12:18 np0005626463.localdomain podman[73957]: 2026-02-23 08:12:18.904579356 +0000 UTC m=+0.075610861 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, org.opencontainers.image.created=2026-01-12T22:56:19Z, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:56:19Z, io.openshift.expose-services=, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, vcs-type=git, io.buildah.version=1.41.5, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, release=1766032510, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, com.redhat.component=openstack-neutron-metadata-agent-ovn-container)
Feb 23 08:12:18 np0005626463.localdomain podman[73957]: 2026-02-23 08:12:18.951020282 +0000 UTC m=+0.122051757 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, distribution-scope=public, vcs-type=git, config_id=tripleo_step4, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1766032510, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, version=17.1.13, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_metadata_agent, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, org.opencontainers.image.created=2026-01-12T22:56:19Z, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, tcib_managed=true, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team)
Feb 23 08:12:18 np0005626463.localdomain systemd[1]: tmp-crun.YN1gty.mount: Deactivated successfully.
Feb 23 08:12:18 np0005626463.localdomain systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Deactivated successfully.
Feb 23 08:12:18 np0005626463.localdomain podman[73956]: 2026-02-23 08:12:18.974946441 +0000 UTC m=+0.148154116 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, vendor=Red Hat, Inc., container_name=ovn_controller, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, batch=17.1_20260112.1, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, build-date=2026-01-12T22:36:40Z, name=rhosp-rhel9/openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:36:40Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_id=tripleo_step4, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public)
Feb 23 08:12:19 np0005626463.localdomain podman[73956]: 2026-02-23 08:12:19.027290993 +0000 UTC m=+0.200498638 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, tcib_managed=true, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, batch=17.1_20260112.1, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-type=git, io.buildah.version=1.41.5, build-date=2026-01-12T22:36:40Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, release=1766032510, container_name=ovn_controller, io.openshift.expose-services=, architecture=x86_64, version=17.1.13, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, managed_by=tripleo_ansible)
Feb 23 08:12:19 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Deactivated successfully.
Feb 23 08:12:20 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.
Feb 23 08:12:20 np0005626463.localdomain podman[74002]: 2026-02-23 08:12:20.91516557 +0000 UTC m=+0.090066581 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, architecture=x86_64, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, build-date=2026-01-12T22:10:14Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1766032510, container_name=metrics_qdr, batch=17.1_20260112.1, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-qdrouterd, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible)
Feb 23 08:12:21 np0005626463.localdomain podman[74002]: 2026-02-23 08:12:21.109951204 +0000 UTC m=+0.284852245 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, batch=17.1_20260112.1, com.redhat.component=openstack-qdrouterd-container, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:10:14Z, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, url=https://www.redhat.com, name=rhosp-rhel9/openstack-qdrouterd, vcs-type=git, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, config_id=tripleo_step1, build-date=2026-01-12T22:10:14Z)
Feb 23 08:12:21 np0005626463.localdomain systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully.
Feb 23 08:12:29 np0005626463.localdomain sudo[74031]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 08:12:29 np0005626463.localdomain sudo[74031]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:12:29 np0005626463.localdomain sudo[74031]: pam_unix(sudo:session): session closed for user root
Feb 23 08:12:29 np0005626463.localdomain sudo[74046]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 23 08:12:29 np0005626463.localdomain sudo[74046]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:12:30 np0005626463.localdomain sudo[74046]: pam_unix(sudo:session): session closed for user root
Feb 23 08:12:30 np0005626463.localdomain sudo[74094]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 08:12:30 np0005626463.localdomain sudo[74094]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:12:30 np0005626463.localdomain sudo[74094]: pam_unix(sudo:session): session closed for user root
Feb 23 08:12:36 np0005626463.localdomain sshd[74109]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:12:37 np0005626463.localdomain sshd[74109]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:12:41 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.
Feb 23 08:12:41 np0005626463.localdomain systemd[1]: tmp-crun.56vU8m.mount: Deactivated successfully.
Feb 23 08:12:41 np0005626463.localdomain podman[74111]: 2026-02-23 08:12:41.933147823 +0000 UTC m=+0.102226126 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, version=17.1.13, distribution-scope=public, com.redhat.component=openstack-collectd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20260112.1, build-date=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, release=1766032510, container_name=collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-type=git, url=https://www.redhat.com, managed_by=tripleo_ansible, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step3)
Feb 23 08:12:41 np0005626463.localdomain podman[74111]: 2026-02-23 08:12:41.950203755 +0000 UTC m=+0.119282068 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, build-date=2026-01-12T22:10:15Z, summary=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, release=1766032510, batch=17.1_20260112.1, vendor=Red Hat, Inc., tcib_managed=true, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, com.redhat.component=openstack-collectd-container, config_id=tripleo_step3, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd)
Feb 23 08:12:41 np0005626463.localdomain systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully.
Feb 23 08:12:43 np0005626463.localdomain sudo[74178]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zphyixfzvcybbsskniqvcesppxhciudo ; /usr/bin/python3
Feb 23 08:12:43 np0005626463.localdomain sudo[74178]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:12:43 np0005626463.localdomain python3[74180]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/config_step.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 08:12:43 np0005626463.localdomain sudo[74178]: pam_unix(sudo:session): session closed for user root
Feb 23 08:12:43 np0005626463.localdomain sudo[74223]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-akybyaluwrhqadwgwktsbxwnakzraopt ; /usr/bin/python3
Feb 23 08:12:43 np0005626463.localdomain sudo[74223]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:12:44 np0005626463.localdomain python3[74225]: ansible-ansible.legacy.copy Invoked with dest=/etc/puppet/hieradata/config_step.json force=True mode=0600 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771834363.3614283-114199-27635855336001/source _original_basename=tmpyj21v48e follow=False checksum=039e0b234f00fbd1242930f0d5dc67e8b4c067fe backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 08:12:44 np0005626463.localdomain sudo[74223]: pam_unix(sudo:session): session closed for user root
Feb 23 08:12:44 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.
Feb 23 08:12:44 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.
Feb 23 08:12:44 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.
Feb 23 08:12:44 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.
Feb 23 08:12:44 np0005626463.localdomain sudo[74288]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-llhjnvmjhxrehfmcvmrgddppudaadzev ; /usr/bin/python3
Feb 23 08:12:44 np0005626463.localdomain sudo[74288]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:12:44 np0005626463.localdomain systemd[1]: tmp-crun.AaxDDS.mount: Deactivated successfully.
Feb 23 08:12:44 np0005626463.localdomain podman[74241]: 2026-02-23 08:12:44.945362212 +0000 UTC m=+0.110871372 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-compute-container, release=1766032510, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1, vcs-type=git, tcib_managed=true, url=https://www.redhat.com, architecture=x86_64, name=rhosp-rhel9/openstack-ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vendor=Red Hat, Inc., config_id=tripleo_step4, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T23:07:47Z, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, container_name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.openshift.expose-services=)
Feb 23 08:12:44 np0005626463.localdomain podman[74243]: 2026-02-23 08:12:44.978945158 +0000 UTC m=+0.137517837 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, managed_by=tripleo_ansible, distribution-scope=public, release=1766032510, container_name=logrotate_crond, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, build-date=2026-01-12T22:10:15Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-cron-container, vcs-type=git, name=rhosp-rhel9/openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc.)
Feb 23 08:12:44 np0005626463.localdomain podman[74241]: 2026-02-23 08:12:44.983243174 +0000 UTC m=+0.148752294 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:07:47Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., container_name=ceilometer_agent_compute, batch=17.1_20260112.1, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_id=tripleo_step4, release=1766032510, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:07:47Z, name=rhosp-rhel9/openstack-ceilometer-compute, vcs-type=git, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true)
Feb 23 08:12:44 np0005626463.localdomain systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully.
Feb 23 08:12:45 np0005626463.localdomain podman[74240]: 2026-02-23 08:12:45.045067977 +0000 UTC m=+0.213531341 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vcs-type=git, build-date=2026-01-12T22:34:43Z, architecture=x86_64, com.redhat.component=openstack-iscsid-container, name=rhosp-rhel9/openstack-iscsid, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, tcib_managed=true, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, release=1766032510, 
version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., distribution-scope=public, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, vcs-ref=705339545363fec600102567c4e923938e0f43b3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 iscsid)
Feb 23 08:12:45 np0005626463.localdomain python3[74302]: ansible-stat Invoked with path=/var/lib/tripleo-config/container-startup-config/step_5 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 23 08:12:45 np0005626463.localdomain podman[74240]: 2026-02-23 08:12:45.085983887 +0000 UTC m=+0.254447201 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, tcib_managed=true, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, config_id=tripleo_step3, org.opencontainers.image.created=2026-01-12T22:34:43Z, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., architecture=x86_64, distribution-scope=public, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, 
name=rhosp-rhel9/openstack-iscsid, version=17.1.13, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.5, vcs-ref=705339545363fec600102567c4e923938e0f43b3, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, build-date=2026-01-12T22:34:43Z, vcs-type=git, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-iscsid-container)
Feb 23 08:12:45 np0005626463.localdomain systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully.
Feb 23 08:12:45 np0005626463.localdomain podman[74242]: 2026-02-23 08:12:45.102577304 +0000 UTC m=+0.263622082 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, container_name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.buildah.version=1.41.5, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, vcs-type=git, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, release=1766032510, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, url=https://www.redhat.com, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-ipmi-container, managed_by=tripleo_ansible, tcib_managed=true, build-date=2026-01-12T23:07:30Z, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-ceilometer-ipmi, distribution-scope=public)
Feb 23 08:12:45 np0005626463.localdomain sudo[74288]: pam_unix(sudo:session): session closed for user root
Feb 23 08:12:45 np0005626463.localdomain podman[74243]: 2026-02-23 08:12:45.117536289 +0000 UTC m=+0.276109028 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, architecture=x86_64, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step4, batch=17.1_20260112.1, version=17.1.13, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, build-date=2026-01-12T22:10:15Z, url=https://www.redhat.com, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack 
Platform 17.1 cron, com.redhat.component=openstack-cron-container, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:10:15Z, summary=Red Hat OpenStack Platform 17.1 cron, name=rhosp-rhel9/openstack-cron, tcib_managed=true, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, container_name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron)
Feb 23 08:12:45 np0005626463.localdomain systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully.
Feb 23 08:12:45 np0005626463.localdomain podman[74242]: 2026-02-23 08:12:45.142427979 +0000 UTC m=+0.303472747 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-ipmi-container, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-ceilometer-ipmi, architecture=x86_64, release=1766032510, build-date=2026-01-12T23:07:30Z, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi, vcs-type=git, distribution-scope=public, tcib_managed=true, vendor=Red Hat, Inc.)
Feb 23 08:12:45 np0005626463.localdomain systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully.
Feb 23 08:12:45 np0005626463.localdomain sudo[74392]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uvccufrpirwopnfngfjjtfkdriqfsuwj ; /usr/bin/python3
Feb 23 08:12:45 np0005626463.localdomain sudo[74392]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:12:45 np0005626463.localdomain sudo[74392]: pam_unix(sudo:session): session closed for user root
Feb 23 08:12:45 np0005626463.localdomain sudo[74410]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-voflgktpybdltdlbkulhkgsppolcndyz ; /usr/bin/python3
Feb 23 08:12:45 np0005626463.localdomain sudo[74410]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:12:45 np0005626463.localdomain sudo[74410]: pam_unix(sudo:session): session closed for user root
Feb 23 08:12:46 np0005626463.localdomain sudo[74514]: tripleo-admin : TTY=pts/0 ; PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ufgafmkcirjtcxsptypowwzzwkultpxr ; ANSIBLE_ASYNC_DIR=/tmp/.ansible_async /usr/bin/python3 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1771834366.184046-114335-11440511828170/async_wrapper.py 305274563872 3600 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1771834366.184046-114335-11440511828170/AnsiballZ_command.py _
Feb 23 08:12:46 np0005626463.localdomain sudo[74514]: pam_unix(sudo:session): session opened for user root(uid=0) by tripleo-admin(uid=1003)
Feb 23 08:12:46 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.
Feb 23 08:12:46 np0005626463.localdomain systemd[1]: tmp-crun.usEInn.mount: Deactivated successfully.
Feb 23 08:12:46 np0005626463.localdomain podman[74517]: 2026-02-23 08:12:46.722168492 +0000 UTC m=+0.088954406 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.openshift.expose-services=, name=rhosp-rhel9/openstack-nova-compute, 
container_name=nova_migration_target, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, batch=17.1_20260112.1, release=1766032510, distribution-scope=public, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, version=17.1.13, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2026-01-12T23:32:04Z, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 23 08:12:46 np0005626463.localdomain ansible-async_wrapper.py[74516]: Invoked with 305274563872 3600 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1771834366.184046-114335-11440511828170/AnsiballZ_command.py _
Feb 23 08:12:46 np0005626463.localdomain ansible-async_wrapper.py[74539]: Starting module and watcher
Feb 23 08:12:46 np0005626463.localdomain ansible-async_wrapper.py[74539]: Start watching 74540 (3600)
Feb 23 08:12:46 np0005626463.localdomain ansible-async_wrapper.py[74540]: Start module (74540)
Feb 23 08:12:46 np0005626463.localdomain ansible-async_wrapper.py[74516]: Return async_wrapper task started.
Feb 23 08:12:46 np0005626463.localdomain sudo[74514]: pam_unix(sudo:session): session closed for user root
Feb 23 08:12:46 np0005626463.localdomain sudo[74557]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zjuwbmsywqbztgltxhkucmbuvgpahkde ; /usr/bin/python3
Feb 23 08:12:46 np0005626463.localdomain sudo[74557]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:12:47 np0005626463.localdomain podman[74517]: 2026-02-23 08:12:47.107216638 +0000 UTC m=+0.474002512 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, name=rhosp-rhel9/openstack-nova-compute, url=https://www.redhat.com, io.buildah.version=1.41.5, batch=17.1_20260112.1, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, vcs-type=git, io.openshift.expose-services=, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., version=17.1.13, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:32:04Z, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 23 08:12:47 np0005626463.localdomain systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully.
Feb 23 08:12:47 np0005626463.localdomain python3[74561]: ansible-ansible.legacy.async_status Invoked with jid=305274563872.74516 mode=status _async_dir=/tmp/.ansible_async
Feb 23 08:12:47 np0005626463.localdomain sudo[74557]: pam_unix(sudo:session): session closed for user root
Feb 23 08:12:49 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.
Feb 23 08:12:49 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.
Feb 23 08:12:49 np0005626463.localdomain podman[74616]: 2026-02-23 08:12:49.691475948 +0000 UTC m=+0.093619124 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., build-date=2026-01-12T22:36:40Z, distribution-scope=public, container_name=ovn_controller, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-ovn-controller, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1766032510, url=https://www.redhat.com, 
konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, config_id=tripleo_step4, batch=17.1_20260112.1, io.openshift.expose-services=, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, summary=Red Hat OpenStack Platform 17.1 ovn-controller)
Feb 23 08:12:49 np0005626463.localdomain podman[74616]: 2026-02-23 08:12:49.724249989 +0000 UTC m=+0.126393175 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, architecture=x86_64, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2026-01-12T22:36:40Z, config_id=tripleo_step4, distribution-scope=public, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, version=17.1.13, com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:36:40Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, name=rhosp-rhel9/openstack-ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, tcib_managed=true, 
container_name=ovn_controller, io.buildah.version=1.41.5, io.openshift.expose-services=, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller)
Feb 23 08:12:49 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Deactivated successfully.
Feb 23 08:12:49 np0005626463.localdomain podman[74617]: 2026-02-23 08:12:49.748277762 +0000 UTC m=+0.148584229 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, config_id=tripleo_step4, build-date=2026-01-12T22:56:19Z, release=1766032510, managed_by=tripleo_ansible, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:56:19Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, vcs-type=git, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, distribution-scope=public, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, container_name=ovn_metadata_agent, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.buildah.version=1.41.5, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=)
Feb 23 08:12:49 np0005626463.localdomain podman[74617]: 2026-02-23 08:12:49.798531657 +0000 UTC m=+0.198838134 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., tcib_managed=true, build-date=2026-01-12T22:56:19Z, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-type=git, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, container_name=ovn_metadata_agent, version=17.1.13, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T22:56:19Z, release=1766032510, managed_by=tripleo_ansible, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 23 08:12:49 np0005626463.localdomain systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Deactivated successfully.
Feb 23 08:12:50 np0005626463.localdomain puppet-user[74559]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5
Feb 23 08:12:50 np0005626463.localdomain puppet-user[74559]:    (file: /etc/puppet/hiera.yaml)
Feb 23 08:12:50 np0005626463.localdomain puppet-user[74559]: Warning: Undefined variable '::deploy_config_name';
Feb 23 08:12:50 np0005626463.localdomain puppet-user[74559]:    (file & line not available)
Feb 23 08:12:50 np0005626463.localdomain puppet-user[74559]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html
Feb 23 08:12:50 np0005626463.localdomain puppet-user[74559]:    (file & line not available)
Feb 23 08:12:50 np0005626463.localdomain puppet-user[74559]: Warning: Unknown variable: '::deployment_type'. (file: /etc/puppet/modules/tripleo/manifests/profile/base/database/mysql/client.pp, line: 89, column: 8)
Feb 23 08:12:50 np0005626463.localdomain puppet-user[74559]: Warning: This method is deprecated, please use match expressions with Stdlib::Compat::String instead. They are described at https://docs.puppet.com/puppet/latest/reference/lang_data_type.html#match-expressions. at ["/etc/puppet/modules/snmp/manifests/params.pp", 310]:["/var/lib/tripleo-config/puppet_step_config.pp", 4]
Feb 23 08:12:50 np0005626463.localdomain puppet-user[74559]:    (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation')
Feb 23 08:12:50 np0005626463.localdomain puppet-user[74559]: Warning: This method is deprecated, please use the stdlib validate_legacy function,
Feb 23 08:12:50 np0005626463.localdomain puppet-user[74559]:                     with Stdlib::Compat::Bool. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 358]:["/var/lib/tripleo-config/puppet_step_config.pp", 4]
Feb 23 08:12:50 np0005626463.localdomain puppet-user[74559]:    (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation')
Feb 23 08:12:50 np0005626463.localdomain puppet-user[74559]: Warning: This method is deprecated, please use the stdlib validate_legacy function,
Feb 23 08:12:50 np0005626463.localdomain puppet-user[74559]:                     with Stdlib::Compat::Array. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 367]:["/var/lib/tripleo-config/puppet_step_config.pp", 4]
Feb 23 08:12:50 np0005626463.localdomain puppet-user[74559]:    (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation')
Feb 23 08:12:50 np0005626463.localdomain puppet-user[74559]: Warning: This method is deprecated, please use the stdlib validate_legacy function,
Feb 23 08:12:50 np0005626463.localdomain puppet-user[74559]:                     with Stdlib::Compat::String. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 382]:["/var/lib/tripleo-config/puppet_step_config.pp", 4]
Feb 23 08:12:50 np0005626463.localdomain puppet-user[74559]:    (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation')
Feb 23 08:12:50 np0005626463.localdomain puppet-user[74559]: Warning: This method is deprecated, please use the stdlib validate_legacy function,
Feb 23 08:12:50 np0005626463.localdomain puppet-user[74559]:                     with Stdlib::Compat::Numeric. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 388]:["/var/lib/tripleo-config/puppet_step_config.pp", 4]
Feb 23 08:12:50 np0005626463.localdomain puppet-user[74559]:    (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation')
Feb 23 08:12:50 np0005626463.localdomain puppet-user[74559]: Warning: This method is deprecated, please use the stdlib validate_legacy function,
Feb 23 08:12:50 np0005626463.localdomain puppet-user[74559]:                     with Pattern[]. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 393]:["/var/lib/tripleo-config/puppet_step_config.pp", 4]
Feb 23 08:12:50 np0005626463.localdomain puppet-user[74559]:    (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation')
Feb 23 08:12:50 np0005626463.localdomain puppet-user[74559]: Warning: Unknown variable: '::deployment_type'. (file: /etc/puppet/modules/tripleo/manifests/packages.pp, line: 39, column: 69)
Feb 23 08:12:50 np0005626463.localdomain puppet-user[74559]: Notice: Compiled catalog for np0005626463.localdomain in environment production in 0.21 seconds
Feb 23 08:12:51 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.
Feb 23 08:12:51 np0005626463.localdomain puppet-user[74559]: Notice: Applied catalog in 0.35 seconds
Feb 23 08:12:51 np0005626463.localdomain puppet-user[74559]: Application:
Feb 23 08:12:51 np0005626463.localdomain puppet-user[74559]:    Initial environment: production
Feb 23 08:12:51 np0005626463.localdomain puppet-user[74559]:    Converged environment: production
Feb 23 08:12:51 np0005626463.localdomain puppet-user[74559]:          Run mode: user
Feb 23 08:12:51 np0005626463.localdomain puppet-user[74559]: Changes:
Feb 23 08:12:51 np0005626463.localdomain puppet-user[74559]: Events:
Feb 23 08:12:51 np0005626463.localdomain puppet-user[74559]: Resources:
Feb 23 08:12:51 np0005626463.localdomain puppet-user[74559]:             Total: 19
Feb 23 08:12:51 np0005626463.localdomain puppet-user[74559]: Time:
Feb 23 08:12:51 np0005626463.localdomain puppet-user[74559]:           Package: 0.00
Feb 23 08:12:51 np0005626463.localdomain puppet-user[74559]:          Schedule: 0.00
Feb 23 08:12:51 np0005626463.localdomain puppet-user[74559]:            Augeas: 0.01
Feb 23 08:12:51 np0005626463.localdomain puppet-user[74559]:              Exec: 0.01
Feb 23 08:12:51 np0005626463.localdomain puppet-user[74559]:              File: 0.02
Feb 23 08:12:51 np0005626463.localdomain puppet-user[74559]:           Service: 0.09
Feb 23 08:12:51 np0005626463.localdomain puppet-user[74559]:    Config retrieval: 0.28
Feb 23 08:12:51 np0005626463.localdomain puppet-user[74559]:    Transaction evaluation: 0.34
Feb 23 08:12:51 np0005626463.localdomain puppet-user[74559]:    Catalog application: 0.35
Feb 23 08:12:51 np0005626463.localdomain puppet-user[74559]:          Last run: 1771834371
Feb 23 08:12:51 np0005626463.localdomain puppet-user[74559]:        Filebucket: 0.00
Feb 23 08:12:51 np0005626463.localdomain puppet-user[74559]:             Total: 0.35
Feb 23 08:12:51 np0005626463.localdomain puppet-user[74559]: Version:
Feb 23 08:12:51 np0005626463.localdomain puppet-user[74559]:            Config: 1771834370
Feb 23 08:12:51 np0005626463.localdomain puppet-user[74559]:            Puppet: 7.10.0
Feb 23 08:12:51 np0005626463.localdomain systemd[1]: tmp-crun.vMBbBH.mount: Deactivated successfully.
Feb 23 08:12:51 np0005626463.localdomain podman[74731]: 2026-02-23 08:12:51.256255614 +0000 UTC m=+0.094544202 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, batch=17.1_20260112.1, build-date=2026-01-12T22:10:14Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, container_name=metrics_qdr, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-qdrouterd, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, tcib_managed=true, distribution-scope=public, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 23 08:12:51 np0005626463.localdomain ansible-async_wrapper.py[74540]: Module complete (74540)
Feb 23 08:12:51 np0005626463.localdomain podman[74731]: 2026-02-23 08:12:51.469405413 +0000 UTC m=+0.307694031 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:10:14Z, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, vcs-type=git, container_name=metrics_qdr, batch=17.1_20260112.1, architecture=x86_64, release=1766032510, name=rhosp-rhel9/openstack-qdrouterd, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee)
Feb 23 08:12:51 np0005626463.localdomain systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully.
Feb 23 08:12:51 np0005626463.localdomain ansible-async_wrapper.py[74539]: Done in kid B.
Feb 23 08:12:53 np0005626463.localdomain sshd[74760]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:12:55 np0005626463.localdomain sshd[74760]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:12:57 np0005626463.localdomain sudo[74775]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jelcqcszahimkmnmqujzzbqigdlkfqzi ; /usr/bin/python3
Feb 23 08:12:57 np0005626463.localdomain sudo[74775]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:12:57 np0005626463.localdomain python3[74777]: ansible-ansible.legacy.async_status Invoked with jid=305274563872.74516 mode=status _async_dir=/tmp/.ansible_async
Feb 23 08:12:57 np0005626463.localdomain sudo[74775]: pam_unix(sudo:session): session closed for user root
Feb 23 08:12:57 np0005626463.localdomain sudo[74791]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-btjcbltsmdnjwrmfpygjfgyblilbrihm ; /usr/bin/python3
Feb 23 08:12:57 np0005626463.localdomain sudo[74791]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:12:58 np0005626463.localdomain python3[74793]: ansible-file Invoked with path=/var/lib/container-puppet/puppetlabs state=directory setype=svirt_sandbox_file_t selevel=s0 recurse=True force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None
Feb 23 08:12:58 np0005626463.localdomain sudo[74791]: pam_unix(sudo:session): session closed for user root
Feb 23 08:12:58 np0005626463.localdomain sudo[74807]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-drwqrgrkpscawayttckypkuwaqvvhbhn ; /usr/bin/python3
Feb 23 08:12:58 np0005626463.localdomain sudo[74807]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:12:58 np0005626463.localdomain python3[74809]: ansible-stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 23 08:12:58 np0005626463.localdomain sudo[74807]: pam_unix(sudo:session): session closed for user root
Feb 23 08:12:58 np0005626463.localdomain sudo[74857]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-syqtnhfqhfrfcwcsenzwefmyyffthwuk ; /usr/bin/python3
Feb 23 08:12:58 np0005626463.localdomain sudo[74857]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:12:59 np0005626463.localdomain python3[74859]: ansible-ansible.legacy.stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 08:12:59 np0005626463.localdomain sudo[74857]: pam_unix(sudo:session): session closed for user root
Feb 23 08:12:59 np0005626463.localdomain sudo[74875]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pkuradjsztasavivkbubeickckclpovn ; /usr/bin/python3
Feb 23 08:12:59 np0005626463.localdomain sudo[74875]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:12:59 np0005626463.localdomain python3[74877]: ansible-ansible.legacy.file Invoked with setype=svirt_sandbox_file_t selevel=s0 dest=/var/lib/container-puppet/puppetlabs/facter.conf _original_basename=tmpgcev01x7 recurse=False state=file path=/var/lib/container-puppet/puppetlabs/facter.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None
Feb 23 08:12:59 np0005626463.localdomain sudo[74875]: pam_unix(sudo:session): session closed for user root
Feb 23 08:12:59 np0005626463.localdomain sudo[74905]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qfdbzqawcgclxjryaqcqaaxoojfhnapd ; /usr/bin/python3
Feb 23 08:12:59 np0005626463.localdomain sudo[74905]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:12:59 np0005626463.localdomain python3[74907]: ansible-file Invoked with path=/opt/puppetlabs/facter state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 08:12:59 np0005626463.localdomain sudo[74905]: pam_unix(sudo:session): session closed for user root
Feb 23 08:13:00 np0005626463.localdomain sudo[74921]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lxlsjiekwtlfaotgzibbfnwsjnycrrtt ; /usr/bin/python3
Feb 23 08:13:00 np0005626463.localdomain sudo[74921]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:13:00 np0005626463.localdomain sudo[74921]: pam_unix(sudo:session): session closed for user root
Feb 23 08:13:00 np0005626463.localdomain sudo[75010]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-thzhgsgakizxlwefcfrlcebgdgmjgyvn ; /usr/bin/python3
Feb 23 08:13:00 np0005626463.localdomain sudo[75010]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:13:01 np0005626463.localdomain python3[75012]: ansible-ansible.posix.synchronize Invoked with src=/opt/puppetlabs/ dest=/var/lib/container-puppet/puppetlabs/ _local_rsync_path=rsync _local_rsync_password=NOT_LOGGING_PARAMETER rsync_path=None delete=False _substitute_controller=False archive=True checksum=False compress=True existing_only=False dirs=False copy_links=False set_remote_user=True rsync_timeout=0 rsync_opts=[] ssh_connection_multiplexing=False partial=False verify_host=False mode=push dest_port=None private_key=None recursive=None links=None perms=None times=None owner=None group=None ssh_args=None link_dest=None
Feb 23 08:13:01 np0005626463.localdomain sudo[75010]: pam_unix(sudo:session): session closed for user root
Feb 23 08:13:01 np0005626463.localdomain sudo[75029]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lauorltwlkvgblkdgrogpdhfbvumkviz ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 08:13:01 np0005626463.localdomain sudo[75029]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:13:01 np0005626463.localdomain python3[75031]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 08:13:01 np0005626463.localdomain sudo[75029]: pam_unix(sudo:session): session closed for user root
Feb 23 08:13:02 np0005626463.localdomain sudo[75045]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vxzrwrwzoulgpdnkeqthdfhdvgedklld ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 08:13:02 np0005626463.localdomain sudo[75045]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:13:02 np0005626463.localdomain sudo[75045]: pam_unix(sudo:session): session closed for user root
Feb 23 08:13:02 np0005626463.localdomain sudo[75061]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-intnovymoqnpdxpvjpgnoqudpjghhtki ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 08:13:02 np0005626463.localdomain sudo[75061]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:13:02 np0005626463.localdomain python3[75063]: ansible-stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 23 08:13:02 np0005626463.localdomain sudo[75061]: pam_unix(sudo:session): session closed for user root
Feb 23 08:13:03 np0005626463.localdomain sudo[75111]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wxuzqeiigqczrystwhnxatuvwaxpokps ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 08:13:03 np0005626463.localdomain sudo[75111]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:13:03 np0005626463.localdomain python3[75113]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-container-shutdown follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 08:13:03 np0005626463.localdomain sudo[75111]: pam_unix(sudo:session): session closed for user root
Feb 23 08:13:03 np0005626463.localdomain sudo[75129]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wnufyjoguxmcdkimhxplvgbkpahhshua ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 08:13:03 np0005626463.localdomain sudo[75129]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:13:03 np0005626463.localdomain python3[75131]: ansible-ansible.legacy.file Invoked with mode=0700 owner=root group=root dest=/usr/libexec/tripleo-container-shutdown _original_basename=tripleo-container-shutdown recurse=False state=file path=/usr/libexec/tripleo-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 08:13:03 np0005626463.localdomain sudo[75129]: pam_unix(sudo:session): session closed for user root
Feb 23 08:13:04 np0005626463.localdomain sudo[75191]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-anrwktuzykjsigmejlwwdmwlfzcijabk ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 08:13:04 np0005626463.localdomain sudo[75191]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:13:04 np0005626463.localdomain python3[75193]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-start-podman-container follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 08:13:04 np0005626463.localdomain sudo[75191]: pam_unix(sudo:session): session closed for user root
Feb 23 08:13:04 np0005626463.localdomain sudo[75209]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uzyvgdmsdtmjkcqiigczlmkhmslwdwai ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 08:13:04 np0005626463.localdomain sudo[75209]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:13:04 np0005626463.localdomain python3[75211]: ansible-ansible.legacy.file Invoked with mode=0700 owner=root group=root dest=/usr/libexec/tripleo-start-podman-container _original_basename=tripleo-start-podman-container recurse=False state=file path=/usr/libexec/tripleo-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 08:13:04 np0005626463.localdomain sudo[75209]: pam_unix(sudo:session): session closed for user root
Feb 23 08:13:04 np0005626463.localdomain sudo[75271]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wplctdmgrawupbxbxuwdwtrovrctfsqy ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 08:13:04 np0005626463.localdomain sudo[75271]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:13:05 np0005626463.localdomain python3[75273]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/tripleo-container-shutdown.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 08:13:05 np0005626463.localdomain sudo[75271]: pam_unix(sudo:session): session closed for user root
Feb 23 08:13:05 np0005626463.localdomain sudo[75289]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-anceqiiognpxvsaletosjrybatykaddi ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 08:13:05 np0005626463.localdomain sudo[75289]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:13:05 np0005626463.localdomain python3[75291]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system/tripleo-container-shutdown.service _original_basename=tripleo-container-shutdown-service recurse=False state=file path=/usr/lib/systemd/system/tripleo-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 08:13:05 np0005626463.localdomain sudo[75289]: pam_unix(sudo:session): session closed for user root
Feb 23 08:13:05 np0005626463.localdomain sudo[75351]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kssqeckeyjczdjyrobvyggkfxowuayzb ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 08:13:05 np0005626463.localdomain sudo[75351]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:13:05 np0005626463.localdomain python3[75353]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 08:13:05 np0005626463.localdomain sudo[75351]: pam_unix(sudo:session): session closed for user root
Feb 23 08:13:06 np0005626463.localdomain sudo[75369]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ztguduifcdnifrgxrmlaapnrotatyrzt ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 08:13:06 np0005626463.localdomain sudo[75369]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:13:06 np0005626463.localdomain python3[75371]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset _original_basename=91-tripleo-container-shutdown-preset recurse=False state=file path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 08:13:06 np0005626463.localdomain sudo[75369]: pam_unix(sudo:session): session closed for user root
Feb 23 08:13:06 np0005626463.localdomain sudo[75399]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hazlwpuolcfebaaddbzgrsmwnrjiwvlu ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 08:13:06 np0005626463.localdomain sudo[75399]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:13:06 np0005626463.localdomain python3[75401]: ansible-systemd Invoked with name=tripleo-container-shutdown state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 08:13:06 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 08:13:06 np0005626463.localdomain systemd-rc-local-generator[75425]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 08:13:06 np0005626463.localdomain systemd-sysv-generator[75431]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 08:13:06 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 08:13:08 np0005626463.localdomain sudo[75399]: pam_unix(sudo:session): session closed for user root
Feb 23 08:13:08 np0005626463.localdomain sudo[75485]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yhtfzghhgyprkkjvbcnxvffdvbaymemx ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 08:13:08 np0005626463.localdomain sudo[75485]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:13:08 np0005626463.localdomain python3[75487]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/netns-placeholder.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 08:13:08 np0005626463.localdomain sudo[75485]: pam_unix(sudo:session): session closed for user root
Feb 23 08:13:08 np0005626463.localdomain sudo[75503]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ujpnigowpvhmmvuslbtgycfapqnhjtzl ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 08:13:08 np0005626463.localdomain sudo[75503]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:13:08 np0005626463.localdomain python3[75505]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/usr/lib/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 08:13:08 np0005626463.localdomain sudo[75503]: pam_unix(sudo:session): session closed for user root
Feb 23 08:13:09 np0005626463.localdomain sudo[75565]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qjwduuxccfaytbpeniayvufdjrvsgtxk ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 08:13:09 np0005626463.localdomain sudo[75565]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:13:09 np0005626463.localdomain python3[75567]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 08:13:09 np0005626463.localdomain sudo[75565]: pam_unix(sudo:session): session closed for user root
Feb 23 08:13:09 np0005626463.localdomain sudo[75583]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ikxasvkjndaxmkeculcsakbybmfrzzzv ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 08:13:09 np0005626463.localdomain sudo[75583]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:13:09 np0005626463.localdomain python3[75585]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 08:13:09 np0005626463.localdomain sudo[75583]: pam_unix(sudo:session): session closed for user root
Feb 23 08:13:09 np0005626463.localdomain sudo[75613]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uimsqjktwcaboqhcnquwwfszlukdxglf ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 08:13:09 np0005626463.localdomain sudo[75613]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:13:10 np0005626463.localdomain python3[75615]: ansible-systemd Invoked with name=netns-placeholder state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 08:13:10 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 08:13:10 np0005626463.localdomain systemd-sysv-generator[75646]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 08:13:10 np0005626463.localdomain systemd-rc-local-generator[75640]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 08:13:10 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 08:13:10 np0005626463.localdomain systemd[1]: Starting Create netns directory...
Feb 23 08:13:10 np0005626463.localdomain systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Feb 23 08:13:10 np0005626463.localdomain systemd[1]: netns-placeholder.service: Deactivated successfully.
Feb 23 08:13:10 np0005626463.localdomain systemd[1]: Finished Create netns directory.
Feb 23 08:13:10 np0005626463.localdomain sudo[75613]: pam_unix(sudo:session): session closed for user root
Feb 23 08:13:11 np0005626463.localdomain sudo[75670]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mvemsrqhaxejlrpndxivdznhvxophlvr ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 08:13:11 np0005626463.localdomain sudo[75670]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:13:11 np0005626463.localdomain python3[75672]: ansible-container_puppet_config Invoked with update_config_hash_only=True no_archive=True check_mode=False config_vol_prefix=/var/lib/config-data debug=False net_host=True puppet_config= short_hostname= step=6
Feb 23 08:13:11 np0005626463.localdomain sudo[75670]: pam_unix(sudo:session): session closed for user root
Feb 23 08:13:11 np0005626463.localdomain sudo[75686]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qfjrvvfuielhdmpkgczrhbgdoaffnuyy ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 08:13:11 np0005626463.localdomain sudo[75686]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:13:12 np0005626463.localdomain sudo[75686]: pam_unix(sudo:session): session closed for user root
Feb 23 08:13:12 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.
Feb 23 08:13:12 np0005626463.localdomain systemd[1]: tmp-crun.OEajZb.mount: Deactivated successfully.
Feb 23 08:13:12 np0005626463.localdomain podman[75715]: 2026-02-23 08:13:12.940957638 +0000 UTC m=+0.111417055 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, config_id=tripleo_step3, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:10:15Z, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, com.redhat.component=openstack-collectd-container, io.buildah.version=1.41.5, batch=17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:10:15Z, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=collectd, distribution-scope=public)
Feb 23 08:13:12 np0005626463.localdomain podman[75715]: 2026-02-23 08:13:12.983482815 +0000 UTC m=+0.153942192 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_id=tripleo_step3, name=rhosp-rhel9/openstack-collectd, build-date=2026-01-12T22:10:15Z, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, tcib_managed=true, architecture=x86_64, url=https://www.redhat.com, io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., container_name=collectd, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-collectd-container, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.expose-services=)
Feb 23 08:13:12 np0005626463.localdomain systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully.
Feb 23 08:13:13 np0005626463.localdomain sudo[75747]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pgaxbeyppwdnromxvbqdlefaqsrnhouu ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 08:13:13 np0005626463.localdomain sudo[75747]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:13:13 np0005626463.localdomain python3[75749]: ansible-tripleo_container_manage Invoked with config_id=tripleo_step5 config_dir=/var/lib/tripleo-config/container-startup-config/step_5 config_patterns=*.json config_overrides={} concurrency=5 log_base_path=/var/log/containers/stdouts debug=False
Feb 23 08:13:14 np0005626463.localdomain podman[75790]: 2026-02-23 08:13:14.112424721 +0000 UTC m=+0.099928898 container create c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:32:04Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, batch=17.1_20260112.1, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-compute, config_id=tripleo_step5, org.opencontainers.image.created=2026-01-12T23:32:04Z, architecture=x86_64, container_name=nova_compute, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, vcs-type=git)
Feb 23 08:13:14 np0005626463.localdomain podman[75790]: 2026-02-23 08:13:14.061576235 +0000 UTC m=+0.049080412 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1
Feb 23 08:13:14 np0005626463.localdomain systemd[1]: Started libpod-conmon-c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.scope.
Feb 23 08:13:14 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 08:13:14 np0005626463.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Feb 23 08:13:14 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e9505ba141111701b3c5c0dc16acea1474e53d4b4405e45ed2eb48993537e49e/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Feb 23 08:13:14 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e9505ba141111701b3c5c0dc16acea1474e53d4b4405e45ed2eb48993537e49e/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Feb 23 08:13:14 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e9505ba141111701b3c5c0dc16acea1474e53d4b4405e45ed2eb48993537e49e/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Feb 23 08:13:14 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e9505ba141111701b3c5c0dc16acea1474e53d4b4405e45ed2eb48993537e49e/merged/var/log/nova supports timestamps until 2038 (0x7fffffff)
Feb 23 08:13:14 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e9505ba141111701b3c5c0dc16acea1474e53d4b4405e45ed2eb48993537e49e/merged/var/lib/kolla/config_files/src-ceph supports timestamps until 2038 (0x7fffffff)
Feb 23 08:13:14 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.
Feb 23 08:13:14 np0005626463.localdomain recover_tripleo_nova_virtqemud[75808]: 61982
Feb 23 08:13:14 np0005626463.localdomain podman[75790]: 2026-02-23 08:13:14.2330997 +0000 UTC m=+0.220603867 container init c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, name=rhosp-rhel9/openstack-nova-compute, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T23:32:04Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_compute, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:32:04Z, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.openshift.expose-services=, io.buildah.version=1.41.5, managed_by=tripleo_ansible, tcib_managed=true, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step5, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, release=1766032510, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, architecture=x86_64)
Feb 23 08:13:14 np0005626463.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Feb 23 08:13:14 np0005626463.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Feb 23 08:13:14 np0005626463.localdomain systemd[1]: tmp-crun.u0L254.mount: Deactivated successfully.
Feb 23 08:13:14 np0005626463.localdomain sudo[75812]:     nova : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Feb 23 08:13:14 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.
Feb 23 08:13:14 np0005626463.localdomain podman[75790]: 2026-02-23 08:13:14.283246755 +0000 UTC m=+0.270750922 container start c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.buildah.version=1.41.5, tcib_managed=true, architecture=x86_64, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, vcs-type=git, distribution-scope=public, release=1766032510, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp-rhel9/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, container_name=nova_compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 23 08:13:14 np0005626463.localdomain systemd-logind[759]: Existing logind session ID 28 used by new audit session, ignoring.
Feb 23 08:13:14 np0005626463.localdomain python3[75749]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_compute --conmon-pidfile /run/nova_compute.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env LIBGUESTFS_BACKEND=direct --env TRIPLEO_CONFIG_HASH=45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e --healthcheck-command /openstack/healthcheck 5672 --ipc host --label config_id=tripleo_step5 --label container_name=nova_compute --label managed_by=tripleo_ansible --label config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_compute.log --network host --privileged=True --ulimit nofile=131072 --ulimit memlock=67108864 --user nova --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/log/containers/nova:/var/log/nova --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro --volume /var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro --volume /var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z --volume /dev:/dev --volume /lib/modules:/lib/modules:ro --volume /run:/run --volume /run/nova:/run/nova:z --volume /var/lib/iscsi:/var/lib/iscsi:z --volume /var/lib/libvirt:/var/lib/libvirt:shared --volume /sys/class/net:/sys/class/net --volume /sys/bus/pci:/sys/bus/pci --volume /boot:/boot:ro --volume /var/lib/nova:/var/lib/nova:shared registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1
Feb 23 08:13:14 np0005626463.localdomain systemd[1]: Created slice User Slice of UID 0.
Feb 23 08:13:14 np0005626463.localdomain systemd[1]: Starting User Runtime Directory /run/user/0...
Feb 23 08:13:14 np0005626463.localdomain systemd[1]: Finished User Runtime Directory /run/user/0.
Feb 23 08:13:14 np0005626463.localdomain systemd[1]: Starting User Manager for UID 0...
Feb 23 08:13:14 np0005626463.localdomain systemd[75831]: pam_unix(systemd-user:session): session opened for user root(uid=0) by (uid=0)
Feb 23 08:13:14 np0005626463.localdomain podman[75813]: 2026-02-23 08:13:14.388097894 +0000 UTC m=+0.094658364 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=starting, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, vendor=Red Hat, Inc., url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, distribution-scope=public, vcs-type=git, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, build-date=2026-01-12T23:32:04Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, release=1766032510, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, version=17.1.13, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, config_id=tripleo_step5)
Feb 23 08:13:14 np0005626463.localdomain podman[75813]: 2026-02-23 08:13:14.446251516 +0000 UTC m=+0.152812016 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, version=17.1.13, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, build-date=2026-01-12T23:32:04Z, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, url=https://www.redhat.com, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, release=1766032510, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step5, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 
'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.5)
Feb 23 08:13:14 np0005626463.localdomain podman[75813]: unhealthy
Feb 23 08:13:14 np0005626463.localdomain systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Main process exited, code=exited, status=1/FAILURE
Feb 23 08:13:14 np0005626463.localdomain systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Failed with result 'exit-code'.
Feb 23 08:13:14 np0005626463.localdomain systemd[75831]: Queued start job for default target Main User Target.
Feb 23 08:13:14 np0005626463.localdomain systemd[75831]: Created slice User Application Slice.
Feb 23 08:13:14 np0005626463.localdomain systemd[75831]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Feb 23 08:13:14 np0005626463.localdomain systemd[75831]: Started Daily Cleanup of User's Temporary Directories.
Feb 23 08:13:14 np0005626463.localdomain systemd[75831]: Reached target Paths.
Feb 23 08:13:14 np0005626463.localdomain systemd[75831]: Reached target Timers.
Feb 23 08:13:14 np0005626463.localdomain systemd[75831]: Starting D-Bus User Message Bus Socket...
Feb 23 08:13:14 np0005626463.localdomain systemd[75831]: Starting Create User's Volatile Files and Directories...
Feb 23 08:13:14 np0005626463.localdomain systemd[75831]: Listening on D-Bus User Message Bus Socket.
Feb 23 08:13:14 np0005626463.localdomain systemd[75831]: Reached target Sockets.
Feb 23 08:13:14 np0005626463.localdomain systemd[75831]: Finished Create User's Volatile Files and Directories.
Feb 23 08:13:14 np0005626463.localdomain systemd[75831]: Reached target Basic System.
Feb 23 08:13:14 np0005626463.localdomain systemd[75831]: Reached target Main User Target.
Feb 23 08:13:14 np0005626463.localdomain systemd[75831]: Startup finished in 161ms.
Feb 23 08:13:14 np0005626463.localdomain systemd[1]: Started User Manager for UID 0.
Feb 23 08:13:14 np0005626463.localdomain systemd[1]: Started Session c10 of User root.
Feb 23 08:13:14 np0005626463.localdomain sudo[75812]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42436)
Feb 23 08:13:14 np0005626463.localdomain sudo[75812]: pam_unix(sudo:session): session closed for user root
Feb 23 08:13:14 np0005626463.localdomain systemd[1]: session-c10.scope: Deactivated successfully.
Feb 23 08:13:14 np0005626463.localdomain podman[75914]: 2026-02-23 08:13:14.792671682 +0000 UTC m=+0.081004562 container create e77fa343cc2d3fca32a3fb2fdb9f0766fbd1238a0f01ed862c66f4e1e684443a (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_wait_for_compute_service, config_data={'detach': False, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', '__OS_DEBUG': 'true', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'start_order': 4, 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/log/containers/nova:/var/log/nova', '/var/lib/container-config-scripts:/container-config-scripts']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, batch=17.1_20260112.1, io.buildah.version=1.41.5, tcib_managed=true, version=17.1.13, build-date=2026-01-12T23:32:04Z, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=nova_wait_for_compute_service, vcs-type=git, url=https://www.redhat.com, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, 
managed_by=tripleo_ansible, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, config_id=tripleo_step5, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T23:32:04Z)
Feb 23 08:13:14 np0005626463.localdomain systemd[1]: Started libpod-conmon-e77fa343cc2d3fca32a3fb2fdb9f0766fbd1238a0f01ed862c66f4e1e684443a.scope.
Feb 23 08:13:14 np0005626463.localdomain podman[75914]: 2026-02-23 08:13:14.742333642 +0000 UTC m=+0.030666502 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1
Feb 23 08:13:14 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 08:13:14 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/32124063214ed6a71bfdb162bed59d08d2309f70899d91e1af77aee73d927f16/merged/container-config-scripts supports timestamps until 2038 (0x7fffffff)
Feb 23 08:13:14 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/32124063214ed6a71bfdb162bed59d08d2309f70899d91e1af77aee73d927f16/merged/var/log/nova supports timestamps until 2038 (0x7fffffff)
Feb 23 08:13:14 np0005626463.localdomain podman[75914]: 2026-02-23 08:13:14.870216815 +0000 UTC m=+0.158549635 container init e77fa343cc2d3fca32a3fb2fdb9f0766fbd1238a0f01ed862c66f4e1e684443a (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_wait_for_compute_service, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, config_id=tripleo_step5, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=nova_wait_for_compute_service, vcs-type=git, version=17.1.13, name=rhosp-rhel9/openstack-nova-compute, release=1766032510, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, io.buildah.version=1.41.5, config_data={'detach': False, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', '__OS_DEBUG': 'true', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'start_order': 4, 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/log/containers/nova:/var/log/nova', '/var/lib/container-config-scripts:/container-config-scripts']}, tcib_managed=true, org.opencontainers.image.created=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, distribution-scope=public, build-date=2026-01-12T23:32:04Z, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 23 08:13:14 np0005626463.localdomain podman[75914]: 2026-02-23 08:13:14.881080062 +0000 UTC m=+0.169412902 container start e77fa343cc2d3fca32a3fb2fdb9f0766fbd1238a0f01ed862c66f4e1e684443a (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_wait_for_compute_service, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, name=rhosp-rhel9/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20260112.1, version=17.1.13, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., config_data={'detach': False, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', '__OS_DEBUG': 'true', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'start_order': 4, 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/log/containers/nova:/var/log/nova', '/var/lib/container-config-scripts:/container-config-scripts']}, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, config_id=tripleo_step5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, tcib_managed=true, architecture=x86_64, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, container_name=nova_wait_for_compute_service, vcs-type=git)
Feb 23 08:13:14 np0005626463.localdomain podman[75914]: 2026-02-23 08:13:14.881501485 +0000 UTC m=+0.169834325 container attach e77fa343cc2d3fca32a3fb2fdb9f0766fbd1238a0f01ed862c66f4e1e684443a (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_wait_for_compute_service, config_data={'detach': False, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', '__OS_DEBUG': 'true', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'start_order': 4, 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/log/containers/nova:/var/log/nova', '/var/lib/container-config-scripts:/container-config-scripts']}, tcib_managed=true, release=1766032510, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T23:32:04Z, config_id=tripleo_step5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, io.buildah.version=1.41.5, vcs-type=git, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, name=rhosp-rhel9/openstack-nova-compute, build-date=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, container_name=nova_wait_for_compute_service, description=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 23 08:13:14 np0005626463.localdomain sudo[75934]:     nova : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Feb 23 08:13:14 np0005626463.localdomain sudo[75934]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42436)
Feb 23 08:13:14 np0005626463.localdomain sudo[75934]: pam_unix(sudo:session): session closed for user root
Feb 23 08:13:15 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.
Feb 23 08:13:15 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.
Feb 23 08:13:15 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.
Feb 23 08:13:15 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.
Feb 23 08:13:15 np0005626463.localdomain podman[75938]: 2026-02-23 08:13:15.174169224 +0000 UTC m=+0.099154753 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.created=2026-01-12T23:07:47Z, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-type=git, tcib_managed=true, 
vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-compute-container, architecture=x86_64, name=rhosp-rhel9/openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=ceilometer_agent_compute, release=1766032510, io.buildah.version=1.41.5, version=17.1.13, batch=17.1_20260112.1, io.openshift.expose-services=, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:47Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Feb 23 08:13:15 np0005626463.localdomain podman[75938]: 2026-02-23 08:13:15.204292958 +0000 UTC m=+0.129278497 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, maintainer=OpenStack TripleO Team, architecture=x86_64, build-date=2026-01-12T23:07:47Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, managed_by=tripleo_ansible, container_name=ceilometer_agent_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, tcib_managed=true, release=1766032510, io.buildah.version=1.41.5, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, url=https://www.redhat.com, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, name=rhosp-rhel9/openstack-ceilometer-compute, batch=17.1_20260112.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, com.redhat.component=openstack-ceilometer-compute-container, version=17.1.13)
Feb 23 08:13:15 np0005626463.localdomain podman[75950]: 2026-02-23 08:13:15.241876672 +0000 UTC m=+0.090363831 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.openshift.expose-services=, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, container_name=logrotate_crond, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, org.opencontainers.image.created=2026-01-12T22:10:15Z, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20260112.1, vcs-type=git, io.buildah.version=1.41.5, architecture=x86_64, maintainer=OpenStack TripleO Team, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, 
config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., build-date=2026-01-12T22:10:15Z, com.redhat.component=openstack-cron-container, distribution-scope=public)
Feb 23 08:13:15 np0005626463.localdomain podman[75972]: 2026-02-23 08:13:15.253034378 +0000 UTC m=+0.069770133 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, container_name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-ceilometer-ipmi, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, build-date=2026-01-12T23:07:30Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, maintainer=OpenStack TripleO Team, 
batch=17.1_20260112.1, url=https://www.redhat.com, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T23:07:30Z, version=17.1.13, com.redhat.component=openstack-ceilometer-ipmi-container, release=1766032510, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.buildah.version=1.41.5, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4)
Feb 23 08:13:15 np0005626463.localdomain podman[75972]: 2026-02-23 08:13:15.278420254 +0000 UTC m=+0.095156009 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, tcib_managed=true, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-ipmi-container, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T23:07:30Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.5, 
org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, version=17.1.13, io.openshift.expose-services=, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git, release=1766032510, batch=17.1_20260112.1, architecture=x86_64, container_name=ceilometer_agent_ipmi, build-date=2026-01-12T23:07:30Z, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi)
Feb 23 08:13:15 np0005626463.localdomain systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully.
Feb 23 08:13:15 np0005626463.localdomain systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully.
Feb 23 08:13:15 np0005626463.localdomain podman[75950]: 2026-02-23 08:13:15.333513702 +0000 UTC m=+0.182000801 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., com.redhat.component=openstack-cron-container, description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, name=rhosp-rhel9/openstack-cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, tcib_managed=true, build-date=2026-01-12T22:10:15Z, managed_by=tripleo_ansible, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, architecture=x86_64, io.buildah.version=1.41.5, batch=17.1_20260112.1, container_name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 23 08:13:15 np0005626463.localdomain systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully.
Feb 23 08:13:15 np0005626463.localdomain podman[75949]: 2026-02-23 08:13:15.283998977 +0000 UTC m=+0.137398099 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, managed_by=tripleo_ansible, container_name=iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, version=17.1.13, com.redhat.component=openstack-iscsid-container, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, config_id=tripleo_step3, distribution-scope=public, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, vendor=Red 
Hat, Inc., batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, name=rhosp-rhel9/openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 iscsid, build-date=2026-01-12T22:34:43Z, org.opencontainers.image.created=2026-01-12T22:34:43Z)
Feb 23 08:13:15 np0005626463.localdomain podman[75949]: 2026-02-23 08:13:15.416532955 +0000 UTC m=+0.269932067 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, config_id=tripleo_step3, vcs-type=git, distribution-scope=public, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, org.opencontainers.image.created=2026-01-12T22:34:43Z, name=rhosp-rhel9/openstack-iscsid, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, vcs-ref=705339545363fec600102567c4e923938e0f43b3, batch=17.1_20260112.1, version=17.1.13, architecture=x86_64, container_name=iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, release=1766032510, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:34:43Z, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 23 08:13:15 np0005626463.localdomain systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully.
Feb 23 08:13:16 np0005626463.localdomain sshd[76030]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:13:16 np0005626463.localdomain sshd[76030]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:13:17 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.
Feb 23 08:13:17 np0005626463.localdomain podman[76032]: 2026-02-23 08:13:17.902775212 +0000 UTC m=+0.079017840 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, build-date=2026-01-12T23:32:04Z, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, io.openshift.expose-services=, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, url=https://www.redhat.com, config_id=tripleo_step4, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp 
osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, vendor=Red Hat, Inc., architecture=x86_64, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 23 08:13:18 np0005626463.localdomain podman[76032]: 2026-02-23 08:13:18.28312774 +0000 UTC m=+0.459370308 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, build-date=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, container_name=nova_migration_target, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, io.buildah.version=1.41.5, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, architecture=x86_64, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, version=17.1.13, config_id=tripleo_step4, batch=17.1_20260112.1)
Feb 23 08:13:18 np0005626463.localdomain systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully.
Feb 23 08:13:19 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.
Feb 23 08:13:19 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.
Feb 23 08:13:19 np0005626463.localdomain systemd[1]: tmp-crun.dgOSWX.mount: Deactivated successfully.
Feb 23 08:13:19 np0005626463.localdomain podman[76054]: 2026-02-23 08:13:19.929085088 +0000 UTC m=+0.101773325 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, build-date=2026-01-12T22:36:40Z, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, url=https://www.redhat.com, release=1766032510, vcs-type=git, version=17.1.13, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, name=rhosp-rhel9/openstack-ovn-controller, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, architecture=x86_64, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ovn-controller-container, container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team)
Feb 23 08:13:19 np0005626463.localdomain podman[76055]: 2026-02-23 08:13:19.974803295 +0000 UTC m=+0.144761868 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, architecture=x86_64, vendor=Red Hat, Inc., io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, url=https://www.redhat.com, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, maintainer=OpenStack TripleO Team, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, distribution-scope=public, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:56:19Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:56:19Z)
Feb 23 08:13:20 np0005626463.localdomain podman[76054]: 2026-02-23 08:13:20.010180782 +0000 UTC m=+0.182869019 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:36:40Z, name=rhosp-rhel9/openstack-ovn-controller, container_name=ovn_controller, vcs-type=git, distribution-scope=public, batch=17.1_20260112.1, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, build-date=2026-01-12T22:36:40Z, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vendor=Red Hat, Inc., org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, description=Red Hat OpenStack Platform 17.1 ovn-controller, 
io.buildah.version=1.41.5, managed_by=tripleo_ansible, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_id=tripleo_step4, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 23 08:13:20 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Deactivated successfully.
Feb 23 08:13:20 np0005626463.localdomain podman[76055]: 2026-02-23 08:13:20.04918768 +0000 UTC m=+0.219146283 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, release=1766032510, io.buildah.version=1.41.5, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, batch=17.1_20260112.1, version=17.1.13, distribution-scope=public, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, container_name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, architecture=x86_64, tcib_managed=true, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, build-date=2026-01-12T22:56:19Z, managed_by=tripleo_ansible, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Feb 23 08:13:20 np0005626463.localdomain systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Deactivated successfully.
Feb 23 08:13:21 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.
Feb 23 08:13:21 np0005626463.localdomain podman[76102]: 2026-02-23 08:13:21.918949904 +0000 UTC m=+0.089647789 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, vcs-type=git, config_id=tripleo_step1, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, url=https://www.redhat.com, tcib_managed=true, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, architecture=x86_64, build-date=2026-01-12T22:10:14Z, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, container_name=metrics_qdr, version=17.1.13)
Feb 23 08:13:22 np0005626463.localdomain podman[76102]: 2026-02-23 08:13:22.153510493 +0000 UTC m=+0.324208318 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, url=https://www.redhat.com, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., io.buildah.version=1.41.5, batch=17.1_20260112.1, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-qdrouterd, vcs-type=git, architecture=x86_64, config_id=tripleo_step1, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, container_name=metrics_qdr, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:10:14Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 23 08:13:22 np0005626463.localdomain systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully.
Feb 23 08:13:24 np0005626463.localdomain systemd[1]: Stopping User Manager for UID 0...
Feb 23 08:13:24 np0005626463.localdomain systemd[75831]: Activating special unit Exit the Session...
Feb 23 08:13:24 np0005626463.localdomain systemd[75831]: Stopped target Main User Target.
Feb 23 08:13:24 np0005626463.localdomain systemd[75831]: Stopped target Basic System.
Feb 23 08:13:24 np0005626463.localdomain systemd[75831]: Stopped target Paths.
Feb 23 08:13:24 np0005626463.localdomain systemd[75831]: Stopped target Sockets.
Feb 23 08:13:24 np0005626463.localdomain systemd[75831]: Stopped target Timers.
Feb 23 08:13:24 np0005626463.localdomain systemd[75831]: Stopped Daily Cleanup of User's Temporary Directories.
Feb 23 08:13:24 np0005626463.localdomain systemd[75831]: Closed D-Bus User Message Bus Socket.
Feb 23 08:13:24 np0005626463.localdomain systemd[75831]: Stopped Create User's Volatile Files and Directories.
Feb 23 08:13:24 np0005626463.localdomain systemd[75831]: Removed slice User Application Slice.
Feb 23 08:13:24 np0005626463.localdomain systemd[75831]: Reached target Shutdown.
Feb 23 08:13:24 np0005626463.localdomain systemd[75831]: Finished Exit the Session.
Feb 23 08:13:24 np0005626463.localdomain systemd[75831]: Reached target Exit the Session.
Feb 23 08:13:24 np0005626463.localdomain systemd[1]: user@0.service: Deactivated successfully.
Feb 23 08:13:24 np0005626463.localdomain systemd[1]: Stopped User Manager for UID 0.
Feb 23 08:13:24 np0005626463.localdomain systemd[1]: Stopping User Runtime Directory /run/user/0...
Feb 23 08:13:24 np0005626463.localdomain systemd[1]: run-user-0.mount: Deactivated successfully.
Feb 23 08:13:24 np0005626463.localdomain systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Feb 23 08:13:24 np0005626463.localdomain systemd[1]: Stopped User Runtime Directory /run/user/0.
Feb 23 08:13:24 np0005626463.localdomain systemd[1]: Removed slice User Slice of UID 0.
Feb 23 08:13:31 np0005626463.localdomain sudo[76133]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 08:13:31 np0005626463.localdomain sudo[76133]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:13:31 np0005626463.localdomain sudo[76133]: pam_unix(sudo:session): session closed for user root
Feb 23 08:13:31 np0005626463.localdomain sudo[76148]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 23 08:13:31 np0005626463.localdomain sudo[76148]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:13:31 np0005626463.localdomain sudo[76148]: pam_unix(sudo:session): session closed for user root
Feb 23 08:13:32 np0005626463.localdomain sudo[76195]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 08:13:32 np0005626463.localdomain sudo[76195]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:13:32 np0005626463.localdomain sudo[76195]: pam_unix(sudo:session): session closed for user root
Feb 23 08:13:42 np0005626463.localdomain sshd[76210]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:13:43 np0005626463.localdomain sshd[76210]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:13:43 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.
Feb 23 08:13:43 np0005626463.localdomain podman[76212]: 2026-02-23 08:13:43.482460649 +0000 UTC m=+0.094079757 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, org.opencontainers.image.created=2026-01-12T22:10:15Z, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, release=1766032510, com.redhat.component=openstack-collectd-container, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-collectd, io.openshift.expose-services=, version=17.1.13, batch=17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z, container_name=collectd, tcib_managed=true, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-type=git, vendor=Red Hat, Inc., io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']})
Feb 23 08:13:43 np0005626463.localdomain podman[76212]: 2026-02-23 08:13:43.497249988 +0000 UTC m=+0.108869146 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, build-date=2026-01-12T22:10:15Z, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, com.redhat.component=openstack-collectd-container, distribution-scope=public, vendor=Red Hat, Inc., io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, version=17.1.13, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, vcs-type=git, name=rhosp-rhel9/openstack-collectd)
Feb 23 08:13:43 np0005626463.localdomain systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully.
Feb 23 08:13:44 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.
Feb 23 08:13:44 np0005626463.localdomain podman[76233]: 2026-02-23 08:13:44.894897909 +0000 UTC m=+0.071980090 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=starting, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, vcs-type=git, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, batch=17.1_20260112.1, container_name=nova_compute, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 
3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 23 08:13:44 np0005626463.localdomain podman[76233]: 2026-02-23 08:13:44.95684316 +0000 UTC m=+0.133925341 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, vendor=Red Hat, Inc., vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public, container_name=nova_compute, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:32:04Z, architecture=x86_64, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, com.redhat.component=openstack-nova-compute-container, release=1766032510, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 
'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.5)
Feb 23 08:13:44 np0005626463.localdomain podman[76233]: unhealthy
Feb 23 08:13:44 np0005626463.localdomain systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Main process exited, code=exited, status=1/FAILURE
Feb 23 08:13:44 np0005626463.localdomain systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Failed with result 'exit-code'.
Feb 23 08:13:45 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.
Feb 23 08:13:45 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.
Feb 23 08:13:45 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.
Feb 23 08:13:45 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.
Feb 23 08:13:45 np0005626463.localdomain systemd[1]: tmp-crun.SUgLPN.mount: Deactivated successfully.
Feb 23 08:13:45 np0005626463.localdomain podman[76256]: 2026-02-23 08:13:45.916916891 +0000 UTC m=+0.090367791 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:34:43Z, name=rhosp-rhel9/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13, managed_by=tripleo_ansible, com.redhat.component=openstack-iscsid-container, distribution-scope=public, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, tcib_managed=true, vcs-ref=705339545363fec600102567c4e923938e0f43b3, url=https://www.redhat.com, batch=17.1_20260112.1, vcs-type=git, container_name=iscsid, release=1766032510)
Feb 23 08:13:45 np0005626463.localdomain podman[76256]: 2026-02-23 08:13:45.930291986 +0000 UTC m=+0.103742906 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., build-date=2026-01-12T22:34:43Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=iscsid, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=705339545363fec600102567c4e923938e0f43b3, url=https://www.redhat.com, vcs-type=git, distribution-scope=public, io.openshift.expose-services=, batch=17.1_20260112.1, tcib_managed=true, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, config_id=tripleo_step3, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, release=1766032510, architecture=x86_64, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid)
Feb 23 08:13:45 np0005626463.localdomain podman[76259]: 2026-02-23 08:13:45.974716143 +0000 UTC m=+0.140479535 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, batch=17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, name=rhosp-rhel9/openstack-cron, com.redhat.component=openstack-cron-container, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, io.buildah.version=1.41.5, release=1766032510, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-type=git, architecture=x86_64, container_name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, 
tcib_managed=true, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, version=17.1.13, build-date=2026-01-12T22:10:15Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 23 08:13:45 np0005626463.localdomain systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully.
Feb 23 08:13:46 np0005626463.localdomain podman[76258]: 2026-02-23 08:13:46.074207476 +0000 UTC m=+0.242362082 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, maintainer=OpenStack TripleO Team, build-date=2026-01-12T23:07:30Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:30Z, release=1766032510, vcs-type=git, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi, vendor=Red Hat, Inc., tcib_managed=true, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, name=rhosp-rhel9/openstack-ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.5, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-ipmi-container, architecture=x86_64)
Feb 23 08:13:46 np0005626463.localdomain podman[76259]: 2026-02-23 08:13:46.089892482 +0000 UTC m=+0.255655914 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:10:15Z, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, name=rhosp-rhel9/openstack-cron, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, build-date=2026-01-12T22:10:15Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, vendor=Red Hat, Inc., config_id=tripleo_step4, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.buildah.version=1.41.5, release=1766032510, container_name=logrotate_crond, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee)
Feb 23 08:13:46 np0005626463.localdomain podman[76257]: 2026-02-23 08:13:46.039226442 +0000 UTC m=+0.208173522 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, version=17.1.13, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, release=1766032510, url=https://www.redhat.com, name=rhosp-rhel9/openstack-ceilometer-compute, batch=17.1_20260112.1, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, architecture=x86_64, managed_by=tripleo_ansible, config_id=tripleo_step4, container_name=ceilometer_agent_compute, io.buildah.version=1.41.5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.created=2026-01-12T23:07:47Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2026-01-12T23:07:47Z)
Feb 23 08:13:46 np0005626463.localdomain podman[76257]: 2026-02-23 08:13:46.123345279 +0000 UTC m=+0.292292349 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:47Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.13, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, architecture=x86_64, name=rhosp-rhel9/openstack-ceilometer-compute, tcib_managed=true, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_id=tripleo_step4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2026-01-12T23:07:47Z, vendor=Red Hat, Inc., distribution-scope=public, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com)
Feb 23 08:13:46 np0005626463.localdomain podman[76258]: 2026-02-23 08:13:46.131584315 +0000 UTC m=+0.299738851 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.buildah.version=1.41.5, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-ceilometer-ipmi-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.created=2026-01-12T23:07:30Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, release=1766032510, url=https://www.redhat.com, config_id=tripleo_step4, vcs-type=git, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, container_name=ceilometer_agent_ipmi, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.openshift.expose-services=, vendor=Red Hat, Inc., version=17.1.13, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:30Z, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 23 08:13:46 np0005626463.localdomain systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully.
Feb 23 08:13:46 np0005626463.localdomain systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully.
Feb 23 08:13:46 np0005626463.localdomain systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully.
Feb 23 08:13:48 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.
Feb 23 08:13:48 np0005626463.localdomain podman[76350]: 2026-02-23 08:13:48.905336593 +0000 UTC m=+0.082415565 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, version=17.1.13, container_name=nova_migration_target, tcib_managed=true, architecture=x86_64, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, managed_by=tripleo_ansible, release=1766032510, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, vendor=Red Hat, Inc., 
distribution-scope=public, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, build-date=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 23 08:13:49 np0005626463.localdomain podman[76350]: 2026-02-23 08:13:49.294381029 +0000 UTC m=+0.471459971 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, build-date=2026-01-12T23:32:04Z, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp-rhel9/openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target, managed_by=tripleo_ansible, 
com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, distribution-scope=public, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, vendor=Red Hat, Inc., version=17.1.13, io.buildah.version=1.41.5, release=1766032510, org.opencontainers.image.created=2026-01-12T23:32:04Z)
Feb 23 08:13:49 np0005626463.localdomain systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully.
Feb 23 08:13:50 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.
Feb 23 08:13:50 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.
Feb 23 08:13:50 np0005626463.localdomain podman[76372]: 2026-02-23 08:13:50.90395137 +0000 UTC m=+0.079584888 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.openshift.expose-services=, build-date=2026-01-12T22:36:40Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:36:40Z, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, summary=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, tcib_managed=true, distribution-scope=public, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=ovn_controller, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ovn-controller-container, name=rhosp-rhel9/openstack-ovn-controller, batch=17.1_20260112.1, managed_by=tripleo_ansible, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': 
['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']})
Feb 23 08:13:50 np0005626463.localdomain podman[76372]: 2026-02-23 08:13:50.956125666 +0000 UTC m=+0.131759154 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, release=1766032510, config_id=tripleo_step4, architecture=x86_64, batch=17.1_20260112.1, url=https://www.redhat.com, com.redhat.component=openstack-ovn-controller-container, description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, tcib_managed=true, 
vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, build-date=2026-01-12T22:36:40Z, container_name=ovn_controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-ovn-controller)
Feb 23 08:13:50 np0005626463.localdomain podman[76373]: 2026-02-23 08:13:50.969386467 +0000 UTC m=+0.139092892 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, 
io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:56:19Z, build-date=2026-01-12T22:56:19Z, architecture=x86_64, version=17.1.13, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, container_name=ovn_metadata_agent, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, distribution-scope=public, tcib_managed=true, vcs-type=git, config_id=tripleo_step4, managed_by=tripleo_ansible, url=https://www.redhat.com, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 23 08:13:50 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Deactivated successfully.
Feb 23 08:13:51 np0005626463.localdomain podman[76373]: 2026-02-23 08:13:51.020919594 +0000 UTC m=+0.190626029 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, vcs-type=git, vendor=Red Hat, Inc., container_name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, distribution-scope=public, architecture=x86_64, version=17.1.13, io.openshift.expose-services=, tcib_managed=true, release=1766032510, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, org.opencontainers.image.created=2026-01-12T22:56:19Z, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 23 08:13:51 np0005626463.localdomain systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Deactivated successfully.
Feb 23 08:13:52 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.
Feb 23 08:13:52 np0005626463.localdomain podman[76418]: 2026-02-23 08:13:52.92076705 +0000 UTC m=+0.092253429 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, build-date=2026-01-12T22:10:14Z, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, release=1766032510, version=17.1.13, config_id=tripleo_step1, name=rhosp-rhel9/openstack-qdrouterd, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:10:14Z, vcs-type=git, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-qdrouterd-container, batch=17.1_20260112.1, vendor=Red Hat, Inc.)
Feb 23 08:13:53 np0005626463.localdomain podman[76418]: 2026-02-23 08:13:53.136410503 +0000 UTC m=+0.307896882 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, tcib_managed=true, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, 
vcs-type=git, build-date=2026-01-12T22:10:14Z, managed_by=tripleo_ansible, version=17.1.13, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:14Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=metrics_qdr, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 23 08:13:53 np0005626463.localdomain systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully.
Feb 23 08:13:57 np0005626463.localdomain sshd[76448]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:13:58 np0005626463.localdomain sshd[76448]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:14:01 np0005626463.localdomain sshd[35469]: Received disconnect from 192.168.122.100 port 36462:11: disconnected by user
Feb 23 08:14:01 np0005626463.localdomain sshd[35469]: Disconnected from user zuul 192.168.122.100 port 36462
Feb 23 08:14:01 np0005626463.localdomain sshd[35466]: pam_unix(sshd:session): session closed for user zuul
Feb 23 08:14:01 np0005626463.localdomain systemd[1]: session-27.scope: Deactivated successfully.
Feb 23 08:14:01 np0005626463.localdomain systemd[1]: session-27.scope: Consumed 3.078s CPU time.
Feb 23 08:14:01 np0005626463.localdomain systemd-logind[759]: Session 27 logged out. Waiting for processes to exit.
Feb 23 08:14:01 np0005626463.localdomain systemd-logind[759]: Removed session 27.
Feb 23 08:14:13 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.
Feb 23 08:14:13 np0005626463.localdomain podman[76450]: 2026-02-23 08:14:13.907922954 +0000 UTC m=+0.082668538 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, version=17.1.13, io.buildah.version=1.41.5, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, batch=17.1_20260112.1, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-collectd, release=1766032510, com.redhat.component=openstack-collectd-container, summary=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step3, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, architecture=x86_64, build-date=2026-01-12T22:10:15Z, container_name=collectd)
Feb 23 08:14:13 np0005626463.localdomain podman[76450]: 2026-02-23 08:14:13.924577969 +0000 UTC m=+0.099323523 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.openshift.expose-services=, version=17.1.13, managed_by=tripleo_ansible, architecture=x86_64, url=https://www.redhat.com, vcs-type=git, container_name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', 
'/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-collectd, distribution-scope=public, release=1766032510, build-date=2026-01-12T22:10:15Z, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, tcib_managed=true, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee)
Feb 23 08:14:13 np0005626463.localdomain systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully.
Feb 23 08:14:15 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.
Feb 23 08:14:15 np0005626463.localdomain podman[76470]: 2026-02-23 08:14:15.911006433 +0000 UTC m=+0.083731401 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=starting, url=https://www.redhat.com, vcs-type=git, vendor=Red Hat, Inc., version=17.1.13, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, io.openshift.expose-services=, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, architecture=x86_64, container_name=nova_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T23:32:04Z, batch=17.1_20260112.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp-rhel9/openstack-nova-compute, distribution-scope=public, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step5, build-date=2026-01-12T23:32:04Z, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510)
Feb 23 08:14:15 np0005626463.localdomain podman[76470]: 2026-02-23 08:14:15.978373696 +0000 UTC m=+0.151098634 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:32:04Z, batch=17.1_20260112.1, container_name=nova_compute, distribution-scope=public, version=17.1.13, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, architecture=x86_64, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step5, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.created=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-nova-compute, io.openshift.expose-services=, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 23 08:14:15 np0005626463.localdomain podman[76470]: unhealthy
Feb 23 08:14:15 np0005626463.localdomain systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Main process exited, code=exited, status=1/FAILURE
Feb 23 08:14:15 np0005626463.localdomain systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Failed with result 'exit-code'.
Feb 23 08:14:16 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.
Feb 23 08:14:16 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.
Feb 23 08:14:16 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.
Feb 23 08:14:16 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.
Feb 23 08:14:16 np0005626463.localdomain systemd[1]: tmp-crun.7wOPKA.mount: Deactivated successfully.
Feb 23 08:14:16 np0005626463.localdomain podman[76496]: 2026-02-23 08:14:16.984168835 +0000 UTC m=+0.150108034 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-cron, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, com.redhat.component=openstack-cron-container, 
vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, architecture=x86_64, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:10:15Z, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T22:10:15Z, vendor=Red Hat, Inc., container_name=logrotate_crond, summary=Red Hat OpenStack Platform 17.1 cron)
Feb 23 08:14:17 np0005626463.localdomain podman[76496]: 2026-02-23 08:14:17.017410683 +0000 UTC m=+0.183349842 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, container_name=logrotate_crond, build-date=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:10:15Z, url=https://www.redhat.com, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., config_id=tripleo_step4, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, release=1766032510, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, tcib_managed=true, name=rhosp-rhel9/openstack-cron, batch=17.1_20260112.1, distribution-scope=public)
Feb 23 08:14:17 np0005626463.localdomain systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully.
Feb 23 08:14:17 np0005626463.localdomain podman[76493]: 2026-02-23 08:14:17.057674454 +0000 UTC m=+0.231654857 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, container_name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.component=openstack-iscsid-container, version=17.1.13, batch=17.1_20260112.1, vcs-ref=705339545363fec600102567c4e923938e0f43b3, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:34:43Z, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, 
io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, tcib_managed=true, release=1766032510, io.openshift.expose-services=, build-date=2026-01-12T22:34:43Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 23 08:14:17 np0005626463.localdomain podman[76494]: 2026-02-23 08:14:17.022204319 +0000 UTC m=+0.191347435 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, name=rhosp-rhel9/openstack-ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, vcs-type=git, org.opencontainers.image.created=2026-01-12T23:07:47Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.5, build-date=2026-01-12T23:07:47Z, distribution-scope=public, com.redhat.component=openstack-ceilometer-compute-container, version=17.1.13, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, tcib_managed=true, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, batch=17.1_20260112.1, vendor=Red Hat, Inc., release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, url=https://www.redhat.com, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 23 08:14:17 np0005626463.localdomain podman[76493]: 2026-02-23 08:14:17.073505575 +0000 UTC m=+0.247486038 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, url=https://www.redhat.com, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, name=rhosp-rhel9/openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, tcib_managed=true, vcs-ref=705339545363fec600102567c4e923938e0f43b3, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, vcs-type=git, vendor=Red Hat, Inc., 
org.opencontainers.image.created=2026-01-12T22:34:43Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, distribution-scope=public, com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, build-date=2026-01-12T22:34:43Z, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 23 08:14:17 np0005626463.localdomain podman[76495]: 2026-02-23 08:14:16.935102287 +0000 UTC m=+0.102491580 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vendor=Red Hat, Inc., io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, com.redhat.component=openstack-ceilometer-ipmi-container, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, version=17.1.13, org.opencontainers.image.created=2026-01-12T23:07:30Z, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, config_id=tripleo_step4, io.openshift.expose-services=, name=rhosp-rhel9/openstack-ceilometer-ipmi, release=1766032510, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, url=https://www.redhat.com, architecture=x86_64, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 23 08:14:17 np0005626463.localdomain systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully.
Feb 23 08:14:17 np0005626463.localdomain podman[76495]: 2026-02-23 08:14:17.114783507 +0000 UTC m=+0.282172850 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.expose-services=, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, release=1766032510, version=17.1.13, container_name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-ipmi-container, vendor=Red Hat, Inc., io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, batch=17.1_20260112.1, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com)
Feb 23 08:14:17 np0005626463.localdomain systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully.
Feb 23 08:14:17 np0005626463.localdomain podman[76494]: 2026-02-23 08:14:17.158526403 +0000 UTC m=+0.327669519 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, build-date=2026-01-12T23:07:47Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, container_name=ceilometer_agent_compute, io.buildah.version=1.41.5, managed_by=tripleo_ansible, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T23:07:47Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, distribution-scope=public, name=rhosp-rhel9/openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 
vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-ceilometer-compute-container, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.13)
Feb 23 08:14:17 np0005626463.localdomain systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully.
Feb 23 08:14:17 np0005626463.localdomain systemd[1]: tmp-crun.bZvGKK.mount: Deactivated successfully.
Feb 23 08:14:19 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.
Feb 23 08:14:19 np0005626463.localdomain systemd[1]: tmp-crun.DIOh5Z.mount: Deactivated successfully.
Feb 23 08:14:19 np0005626463.localdomain podman[76586]: 2026-02-23 08:14:19.911016604 +0000 UTC m=+0.085589198 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, tcib_managed=true, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, managed_by=tripleo_ansible, 
org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, vendor=Red Hat, Inc., io.openshift.expose-services=, url=https://www.redhat.com, architecture=x86_64, vcs-type=git, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public, org.opencontainers.image.created=2026-01-12T23:32:04Z)
Feb 23 08:14:20 np0005626463.localdomain podman[76586]: 2026-02-23 08:14:20.256771361 +0000 UTC m=+0.431343915 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.buildah.version=1.41.5, build-date=2026-01-12T23:32:04Z, org.opencontainers.image.created=2026-01-12T23:32:04Z, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, managed_by=tripleo_ansible, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, config_id=tripleo_step4, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, distribution-scope=public, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., container_name=nova_migration_target, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-compute, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 23 08:14:20 np0005626463.localdomain systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully.
Feb 23 08:14:21 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.
Feb 23 08:14:21 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.
Feb 23 08:14:21 np0005626463.localdomain podman[76611]: 2026-02-23 08:14:21.911798753 +0000 UTC m=+0.082477752 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.expose-services=, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, build-date=2026-01-12T22:56:19Z, managed_by=tripleo_ansible, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', 
'/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, maintainer=OpenStack TripleO Team, release=1766032510, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:56:19Z, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Feb 23 08:14:21 np0005626463.localdomain podman[76610]: 2026-02-23 08:14:21.962106149 +0000 UTC m=+0.134472129 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, com.redhat.component=openstack-ovn-controller-container, build-date=2026-01-12T22:36:40Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:36:40Z, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, name=rhosp-rhel9/openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, batch=17.1_20260112.1, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, io.buildah.version=1.41.5, 
org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, url=https://www.redhat.com, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, config_id=tripleo_step4, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510)
Feb 23 08:14:21 np0005626463.localdomain podman[76611]: 2026-02-23 08:14:21.982035144 +0000 UTC m=+0.152714143 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, org.opencontainers.image.created=2026-01-12T22:56:19Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, url=https://www.redhat.com, io.buildah.version=1.41.5, tcib_managed=true, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, 
maintainer=OpenStack TripleO Team, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, architecture=x86_64, batch=17.1_20260112.1, distribution-scope=public, version=17.1.13, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=ovn_metadata_agent, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc.)
Feb 23 08:14:21 np0005626463.localdomain systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Deactivated successfully.
Feb 23 08:14:22 np0005626463.localdomain podman[76610]: 2026-02-23 08:14:22.013271971 +0000 UTC m=+0.185637961 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vcs-type=git, version=17.1.13, io.openshift.expose-services=, config_id=tripleo_step4, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:36:40Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, managed_by=tripleo_ansible, batch=17.1_20260112.1, 
org.opencontainers.image.created=2026-01-12T22:36:40Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., container_name=ovn_controller, maintainer=OpenStack TripleO Team, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c)
Feb 23 08:14:22 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Deactivated successfully.
Feb 23 08:14:23 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.
Feb 23 08:14:23 np0005626463.localdomain podman[76654]: 2026-02-23 08:14:23.908028105 +0000 UTC m=+0.083355189 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, build-date=2026-01-12T22:10:14Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public, io.openshift.expose-services=, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, container_name=metrics_qdr, org.opencontainers.image.created=2026-01-12T22:10:14Z, 
tcib_managed=true, name=rhosp-rhel9/openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, architecture=x86_64, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, batch=17.1_20260112.1, managed_by=tripleo_ansible, vendor=Red Hat, Inc., release=1766032510, vcs-type=git, url=https://www.redhat.com)
Feb 23 08:14:24 np0005626463.localdomain podman[76654]: 2026-02-23 08:14:24.111319731 +0000 UTC m=+0.286646805 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, name=rhosp-rhel9/openstack-qdrouterd, release=1766032510, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, architecture=x86_64, vendor=Red Hat, Inc., build-date=2026-01-12T22:10:14Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, container_name=metrics_qdr, tcib_managed=true, version=17.1.13, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com)
Feb 23 08:14:24 np0005626463.localdomain systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully.
Feb 23 08:14:32 np0005626463.localdomain sshd[76696]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:14:32 np0005626463.localdomain sudo[76683]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 08:14:32 np0005626463.localdomain sudo[76683]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:14:32 np0005626463.localdomain sudo[76683]: pam_unix(sudo:session): session closed for user root
Feb 23 08:14:32 np0005626463.localdomain sudo[76699]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 23 08:14:32 np0005626463.localdomain sudo[76699]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:14:33 np0005626463.localdomain sshd[76696]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:14:33 np0005626463.localdomain sudo[76699]: pam_unix(sudo:session): session closed for user root
Feb 23 08:14:34 np0005626463.localdomain sudo[76748]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 08:14:34 np0005626463.localdomain sudo[76748]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:14:34 np0005626463.localdomain sudo[76748]: pam_unix(sudo:session): session closed for user root
Feb 23 08:14:41 np0005626463.localdomain sshd[76763]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:14:41 np0005626463.localdomain sshd[76763]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:14:44 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.
Feb 23 08:14:44 np0005626463.localdomain podman[76765]: 2026-02-23 08:14:44.919274202 +0000 UTC m=+0.091173076 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.buildah.version=1.41.5, container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, build-date=2026-01-12T22:10:15Z, name=rhosp-rhel9/openstack-collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vendor=Red Hat, Inc., config_id=tripleo_step3, io.openshift.expose-services=, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-type=git, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, com.redhat.component=openstack-collectd-container, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 23 08:14:44 np0005626463.localdomain podman[76765]: 2026-02-23 08:14:44.957559093 +0000 UTC m=+0.129457967 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, release=1766032510, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-collectd-container, config_id=tripleo_step3, name=rhosp-rhel9/openstack-collectd, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, build-date=2026-01-12T22:10:15Z, distribution-scope=public, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, managed_by=tripleo_ansible, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20260112.1, vendor=Red Hat, Inc., io.openshift.expose-services=)
Feb 23 08:14:44 np0005626463.localdomain systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully.
Feb 23 08:14:46 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.
Feb 23 08:14:46 np0005626463.localdomain systemd[1]: tmp-crun.iFimzD.mount: Deactivated successfully.
Feb 23 08:14:46 np0005626463.localdomain podman[76785]: 2026-02-23 08:14:46.91734653 +0000 UTC m=+0.093799076 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=unhealthy, distribution-scope=public, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, io.openshift.expose-services=, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_compute, name=rhosp-rhel9/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, build-date=2026-01-12T23:32:04Z, config_id=tripleo_step5, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc.)
Feb 23 08:14:46 np0005626463.localdomain podman[76785]: 2026-02-23 08:14:46.976218885 +0000 UTC m=+0.152671471 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vendor=Red Hat, Inc., io.buildah.version=1.41.5, build-date=2026-01-12T23:32:04Z, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, config_id=tripleo_step5, distribution-scope=public, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=nova_compute, io.openshift.expose-services=, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.13)
Feb 23 08:14:46 np0005626463.localdomain podman[76785]: unhealthy
Feb 23 08:14:46 np0005626463.localdomain systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Main process exited, code=exited, status=1/FAILURE
Feb 23 08:14:46 np0005626463.localdomain systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Failed with result 'exit-code'.
Feb 23 08:14:47 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.
Feb 23 08:14:47 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.
Feb 23 08:14:47 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.
Feb 23 08:14:47 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.
Feb 23 08:14:47 np0005626463.localdomain podman[76808]: 2026-02-23 08:14:47.915450815 +0000 UTC m=+0.086469694 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.buildah.version=1.41.5, config_id=tripleo_step3, architecture=x86_64, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, vcs-ref=705339545363fec600102567c4e923938e0f43b3, org.opencontainers.image.created=2026-01-12T22:34:43Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13, distribution-scope=public, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, name=rhosp-rhel9/openstack-iscsid, url=https://www.redhat.com, build-date=2026-01-12T22:34:43Z, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, batch=17.1_20260112.1, container_name=iscsid, com.redhat.component=openstack-iscsid-container, tcib_managed=true, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid)
Feb 23 08:14:47 np0005626463.localdomain podman[76808]: 2026-02-23 08:14:47.953764707 +0000 UTC m=+0.124783576 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid, url=https://www.redhat.com, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, distribution-scope=public, vcs-ref=705339545363fec600102567c4e923938e0f43b3, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:34:43Z, release=1766032510, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, tcib_managed=true, build-date=2026-01-12T22:34:43Z, com.redhat.component=openstack-iscsid-container, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step3)
Feb 23 08:14:47 np0005626463.localdomain systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully.
Feb 23 08:14:47 np0005626463.localdomain podman[76809]: 2026-02-23 08:14:47.981309582 +0000 UTC m=+0.150499646 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.5, batch=17.1_20260112.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-compute-container, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1766032510, io.openshift.expose-services=, url=https://www.redhat.com, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, managed_by=tripleo_ansible, architecture=x86_64, 
description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, build-date=2026-01-12T23:07:47Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=ceilometer_agent_compute, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.created=2026-01-12T23:07:47Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_id=tripleo_step4, vendor=Red Hat, Inc., version=17.1.13, name=rhosp-rhel9/openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 23 08:14:48 np0005626463.localdomain systemd[1]: tmp-crun.rxzvrp.mount: Deactivated successfully.
Feb 23 08:14:48 np0005626463.localdomain podman[76811]: 2026-02-23 08:14:48.03166645 +0000 UTC m=+0.195475591 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.buildah.version=1.41.5, vcs-type=git, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 cron, name=rhosp-rhel9/openstack-cron, config_id=tripleo_step4, managed_by=tripleo_ansible, architecture=x86_64, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-cron-container, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, summary=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., batch=17.1_20260112.1, build-date=2026-01-12T22:10:15Z, release=1766032510, org.opencontainers.image.created=2026-01-12T22:10:15Z, 
url=https://www.redhat.com, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, container_name=logrotate_crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 cron)
Feb 23 08:14:48 np0005626463.localdomain podman[76809]: 2026-02-23 08:14:48.04024215 +0000 UTC m=+0.209432204 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.openshift.expose-services=, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-compute-container, managed_by=tripleo_ansible, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, 
org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vendor=Red Hat, Inc., architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20260112.1, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, build-date=2026-01-12T23:07:47Z, tcib_managed=true, config_id=tripleo_step4, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, release=1766032510, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Feb 23 08:14:48 np0005626463.localdomain systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully.
Feb 23 08:14:48 np0005626463.localdomain podman[76811]: 2026-02-23 08:14:48.071181548 +0000 UTC m=+0.234990699 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:10:15Z, name=rhosp-rhel9/openstack-cron, io.buildah.version=1.41.5, vcs-type=git, description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, distribution-scope=public, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, build-date=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, container_name=logrotate_crond, managed_by=tripleo_ansible, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, com.redhat.component=openstack-cron-container, tcib_managed=true)
Feb 23 08:14:48 np0005626463.localdomain systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully.
Feb 23 08:14:48 np0005626463.localdomain podman[76810]: 2026-02-23 08:14:48.12464962 +0000 UTC m=+0.289284825 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, managed_by=tripleo_ansible, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:30Z, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, url=https://www.redhat.com, name=rhosp-rhel9/openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat 
OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T23:07:30Z, container_name=ceilometer_agent_ipmi, release=1766032510, io.openshift.expose-services=, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_id=tripleo_step4, io.buildah.version=1.41.5, tcib_managed=true)
Feb 23 08:14:48 np0005626463.localdomain podman[76810]: 2026-02-23 08:14:48.154131475 +0000 UTC m=+0.318766700 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, org.opencontainers.image.created=2026-01-12T23:07:30Z, batch=17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64, io.buildah.version=1.41.5, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, build-date=2026-01-12T23:07:30Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-type=git, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, release=1766032510, name=rhosp-rhel9/openstack-ceilometer-ipmi, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, distribution-scope=public, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 23 08:14:48 np0005626463.localdomain systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully.
Feb 23 08:14:50 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.
Feb 23 08:14:50 np0005626463.localdomain podman[76896]: 2026-02-23 08:14:50.904993706 +0000 UTC m=+0.081637077 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, build-date=2026-01-12T23:32:04Z, distribution-scope=public, vendor=Red Hat, Inc., managed_by=tripleo_ansible, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.buildah.version=1.41.5, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, release=1766032510, tcib_managed=true, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-nova-compute)
Feb 23 08:14:51 np0005626463.localdomain podman[76896]: 2026-02-23 08:14:51.290385045 +0000 UTC m=+0.467028456 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step4, distribution-scope=public, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, 
org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, architecture=x86_64, version=17.1.13, io.openshift.expose-services=, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, url=https://www.redhat.com, container_name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 23 08:14:51 np0005626463.localdomain systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully.
Feb 23 08:14:52 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.
Feb 23 08:14:52 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.
Feb 23 08:14:52 np0005626463.localdomain podman[76919]: 2026-02-23 08:14:52.919507201 +0000 UTC m=+0.089709942 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, url=https://www.redhat.com, version=17.1.13, io.openshift.expose-services=, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, io.buildah.version=1.41.5, com.redhat.component=openstack-ovn-controller-container, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:36:40Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, name=rhosp-rhel9/openstack-ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.k8s.description=Red 
Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vcs-type=git, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, tcib_managed=true)
Feb 23 08:14:52 np0005626463.localdomain sshd[76955]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:14:52 np0005626463.localdomain podman[76919]: 2026-02-23 08:14:52.975277163 +0000 UTC m=+0.145479904 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vcs-type=git, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.component=openstack-ovn-controller-container, container_name=ovn_controller, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:36:40Z, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, tcib_managed=true, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-ovn-controller, 
vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, build-date=2026-01-12T22:36:40Z, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team)
Feb 23 08:14:52 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Deactivated successfully.
Feb 23 08:14:53 np0005626463.localdomain podman[76920]: 2026-02-23 08:14:52.978943754 +0000 UTC m=+0.148082023 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, architecture=x86_64, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.expose-services=, vcs-type=git, release=1766032510, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:56:19Z, managed_by=tripleo_ansible, tcib_managed=true, config_id=tripleo_step4, url=https://www.redhat.com, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:56:19Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public, io.buildah.version=1.41.5, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 23 08:14:53 np0005626463.localdomain podman[76920]: 2026-02-23 08:14:53.061224529 +0000 UTC m=+0.230362718 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, build-date=2026-01-12T22:56:19Z, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-type=git, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, version=17.1.13, distribution-scope=public, release=1766032510, container_name=ovn_metadata_agent, io.buildah.version=1.41.5, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 
1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, org.opencontainers.image.created=2026-01-12T22:56:19Z, architecture=x86_64)
Feb 23 08:14:53 np0005626463.localdomain systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Deactivated successfully.
Feb 23 08:14:54 np0005626463.localdomain sshd[76955]: Invalid user squid from 185.156.73.233 port 39494
Feb 23 08:14:54 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.
Feb 23 08:14:54 np0005626463.localdomain podman[76970]: 2026-02-23 08:14:54.523934208 +0000 UTC m=+0.089090634 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, release=1766032510, url=https://www.redhat.com, build-date=2026-01-12T22:10:14Z, config_id=tripleo_step1, vcs-type=git, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:14Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team)
Feb 23 08:14:54 np0005626463.localdomain sshd[76955]: Connection closed by invalid user squid 185.156.73.233 port 39494 [preauth]
Feb 23 08:14:54 np0005626463.localdomain podman[76970]: 2026-02-23 08:14:54.723320845 +0000 UTC m=+0.288477231 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:14Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, vcs-type=git, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, url=https://www.redhat.com, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step1, name=rhosp-rhel9/openstack-qdrouterd, build-date=2026-01-12T22:10:14Z, 
distribution-scope=public, architecture=x86_64, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, container_name=metrics_qdr, version=17.1.13, vendor=Red Hat, Inc., vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd)
Feb 23 08:14:54 np0005626463.localdomain systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully.
Feb 23 08:15:07 np0005626463.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Feb 23 08:15:07 np0005626463.localdomain recover_tripleo_nova_virtqemud[77000]: 61982
Feb 23 08:15:07 np0005626463.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Feb 23 08:15:07 np0005626463.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Feb 23 08:15:15 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.
Feb 23 08:15:15 np0005626463.localdomain systemd[1]: tmp-crun.OGEkLj.mount: Deactivated successfully.
Feb 23 08:15:15 np0005626463.localdomain podman[77001]: 2026-02-23 08:15:15.917404436 +0000 UTC m=+0.090120683 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, version=17.1.13, url=https://www.redhat.com, architecture=x86_64, batch=17.1_20260112.1, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, 
config_id=tripleo_step3, container_name=collectd, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, build-date=2026-01-12T22:10:15Z, com.redhat.component=openstack-collectd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, tcib_managed=true, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, io.openshift.expose-services=, vcs-type=git)
Feb 23 08:15:15 np0005626463.localdomain podman[77001]: 2026-02-23 08:15:15.931170788 +0000 UTC m=+0.103887055 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.openshift.expose-services=, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, distribution-scope=public, container_name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, build-date=2026-01-12T22:10:15Z, config_id=tripleo_step3, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, architecture=x86_64, batch=17.1_20260112.1, com.redhat.component=openstack-collectd-container, name=rhosp-rhel9/openstack-collectd, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 collectd)
Feb 23 08:15:15 np0005626463.localdomain systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully.
Feb 23 08:15:17 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.
Feb 23 08:15:17 np0005626463.localdomain systemd[1]: tmp-crun.yrKcAM.mount: Deactivated successfully.
Feb 23 08:15:17 np0005626463.localdomain podman[77021]: 2026-02-23 08:15:17.917906793 +0000 UTC m=+0.092155152 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=unhealthy, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, container_name=nova_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-nova-compute, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, config_id=tripleo_step5, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, architecture=x86_64, tcib_managed=true, io.buildah.version=1.41.5, build-date=2026-01-12T23:32:04Z, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 
'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 23 08:15:17 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.
Feb 23 08:15:18 np0005626463.localdomain podman[77021]: 2026-02-23 08:15:18.006297886 +0000 UTC m=+0.180546205 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, vcs-type=git, architecture=x86_64, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, release=1766032510, version=17.1.13, io.openshift.expose-services=, name=rhosp-rhel9/openstack-nova-compute, distribution-scope=public, url=https://www.redhat.com, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T23:32:04Z, container_name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 23 08:15:18 np0005626463.localdomain podman[77021]: unhealthy
Feb 23 08:15:18 np0005626463.localdomain systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Main process exited, code=exited, status=1/FAILURE
Feb 23 08:15:18 np0005626463.localdomain systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Failed with result 'exit-code'.
Feb 23 08:15:18 np0005626463.localdomain systemd[1]: tmp-crun.gVYIzh.mount: Deactivated successfully.
Feb 23 08:15:18 np0005626463.localdomain podman[77043]: 2026-02-23 08:15:18.107178002 +0000 UTC m=+0.095360546 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, summary=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, build-date=2026-01-12T22:34:43Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, distribution-scope=public, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=iscsid, com.redhat.component=openstack-iscsid-container, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:34:43Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=705339545363fec600102567c4e923938e0f43b3, vcs-type=git, architecture=x86_64, vendor=Red Hat, Inc., release=1766032510, version=17.1.13, io.openshift.expose-services=, tcib_managed=true)
Feb 23 08:15:18 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.
Feb 23 08:15:18 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.
Feb 23 08:15:18 np0005626463.localdomain podman[77043]: 2026-02-23 08:15:18.158636245 +0000 UTC m=+0.146818789 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, version=17.1.13, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=705339545363fec600102567c4e923938e0f43b3, vcs-type=git, architecture=x86_64, com.redhat.component=openstack-iscsid-container, io.buildah.version=1.41.5, container_name=iscsid, vendor=Red Hat, Inc., distribution-scope=public, release=1766032510, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, org.opencontainers.image.created=2026-01-12T22:34:43Z, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, tcib_managed=true, name=rhosp-rhel9/openstack-iscsid, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:34:43Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, description=Red Hat OpenStack Platform 17.1 iscsid)
Feb 23 08:15:18 np0005626463.localdomain systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully.
Feb 23 08:15:18 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.
Feb 23 08:15:18 np0005626463.localdomain podman[77063]: 2026-02-23 08:15:18.238394696 +0000 UTC m=+0.095581223 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp-rhel9/openstack-ceilometer-compute, build-date=2026-01-12T23:07:47Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack 
Platform 17.1 ceilometer-compute, release=1766032510, io.buildah.version=1.41.5, config_id=tripleo_step4, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, tcib_managed=true, io.openshift.expose-services=, managed_by=tripleo_ansible, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 23 08:15:18 np0005626463.localdomain podman[77063]: 2026-02-23 08:15:18.271484242 +0000 UTC m=+0.128670739 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-ceilometer-compute, url=https://www.redhat.com, vendor=Red Hat, Inc., vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, architecture=x86_64, distribution-scope=public, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2026-01-12T23:07:47Z, 
container_name=ceilometer_agent_compute, tcib_managed=true, com.redhat.component=openstack-ceilometer-compute-container, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.buildah.version=1.41.5, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06)
Feb 23 08:15:18 np0005626463.localdomain systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully.
Feb 23 08:15:18 np0005626463.localdomain podman[77064]: 2026-02-23 08:15:18.287755527 +0000 UTC m=+0.142557805 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, com.redhat.component=openstack-cron-container, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:10:15Z, vendor=Red Hat, Inc., config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, container_name=logrotate_crond, name=rhosp-rhel9/openstack-cron, managed_by=tripleo_ansible, version=17.1.13, io.buildah.version=1.41.5, distribution-scope=public, 
org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z)
Feb 23 08:15:18 np0005626463.localdomain podman[77064]: 2026-02-23 08:15:18.299341856 +0000 UTC m=+0.154144144 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.openshift.expose-services=, managed_by=tripleo_ansible, vcs-type=git, description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, build-date=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, name=rhosp-rhel9/openstack-cron, architecture=x86_64, container_name=logrotate_crond, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, tcib_managed=true, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-cron-container, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team)
Feb 23 08:15:18 np0005626463.localdomain systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully.
Feb 23 08:15:18 np0005626463.localdomain podman[77084]: 2026-02-23 08:15:18.390044925 +0000 UTC m=+0.188061774 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.openshift.expose-services=, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, release=1766032510, com.redhat.component=openstack-ceilometer-ipmi-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, tcib_managed=true, name=rhosp-rhel9/openstack-ceilometer-ipmi, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, vendor=Red Hat, Inc., org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:30Z, managed_by=tripleo_ansible, io.buildah.version=1.41.5)
Feb 23 08:15:18 np0005626463.localdomain podman[77084]: 2026-02-23 08:15:18.42033319 +0000 UTC m=+0.218350019 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:30Z, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, architecture=x86_64, release=1766032510, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, config_id=tripleo_step4, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:30Z, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, container_name=ceilometer_agent_ipmi)
Feb 23 08:15:18 np0005626463.localdomain systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully.
Feb 23 08:15:21 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.
Feb 23 08:15:21 np0005626463.localdomain podman[77136]: 2026-02-23 08:15:21.906586788 +0000 UTC m=+0.078839584 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, distribution-scope=public, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2026-01-12T23:32:04Z, config_id=tripleo_step4, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:32:04Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, 
cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, container_name=nova_migration_target, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute)
Feb 23 08:15:22 np0005626463.localdomain podman[77136]: 2026-02-23 08:15:22.317585304 +0000 UTC m=+0.489838100 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, architecture=x86_64, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:32:04Z, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, config_id=tripleo_step4, name=rhosp-rhel9/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, managed_by=tripleo_ansible, container_name=nova_migration_target, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vcs-type=git)
Feb 23 08:15:22 np0005626463.localdomain systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully.
Feb 23 08:15:23 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.
Feb 23 08:15:23 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.
Feb 23 08:15:23 np0005626463.localdomain podman[77160]: 2026-02-23 08:15:23.909382253 +0000 UTC m=+0.083429748 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, url=https://www.redhat.com, version=17.1.13, architecture=x86_64, tcib_managed=true, batch=17.1_20260112.1, vcs-type=git, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, distribution-scope=public, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, release=1766032510, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:36:40Z, build-date=2026-01-12T22:36:40Z, io.openshift.expose-services=, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, container_name=ovn_controller, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': 
['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']})
Feb 23 08:15:23 np0005626463.localdomain podman[77160]: 2026-02-23 08:15:23.935507507 +0000 UTC m=+0.109554982 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:36:40Z, distribution-scope=public, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, build-date=2026-01-12T22:36:40Z, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, version=17.1.13, tcib_managed=true, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=ovn_controller, 
org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.component=openstack-ovn-controller-container, release=1766032510, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 23 08:15:23 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Deactivated successfully.
Feb 23 08:15:24 np0005626463.localdomain podman[77161]: 2026-02-23 08:15:24.021826667 +0000 UTC m=+0.190017860 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:56:19Z, url=https://www.redhat.com, build-date=2026-01-12T22:56:19Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, batch=17.1_20260112.1, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_metadata_agent, io.buildah.version=1.41.5, io.openshift.expose-services=, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510)
Feb 23 08:15:24 np0005626463.localdomain podman[77161]: 2026-02-23 08:15:24.090217486 +0000 UTC m=+0.258408709 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, distribution-scope=public, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, release=1766032510, org.opencontainers.image.created=2026-01-12T22:56:19Z, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, container_name=ovn_metadata_agent, vcs-type=git, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, architecture=x86_64)
Feb 23 08:15:24 np0005626463.localdomain systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Deactivated successfully.
Feb 23 08:15:24 np0005626463.localdomain sshd[77209]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:15:24 np0005626463.localdomain sshd[77209]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:15:24 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.
Feb 23 08:15:24 np0005626463.localdomain systemd[1]: tmp-crun.yCvp0K.mount: Deactivated successfully.
Feb 23 08:15:24 np0005626463.localdomain podman[77211]: 2026-02-23 08:15:24.945441198 +0000 UTC m=+0.119048738 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, url=https://www.redhat.com, io.buildah.version=1.41.5, distribution-scope=public, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:10:14Z, container_name=metrics_qdr, managed_by=tripleo_ansible, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat 
OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, version=17.1.13, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step1, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 23 08:15:25 np0005626463.localdomain podman[77211]: 2026-02-23 08:15:25.138325602 +0000 UTC m=+0.311933172 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-qdrouterd, managed_by=tripleo_ansible, io.buildah.version=1.41.5, vcs-type=git, container_name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, url=https://www.redhat.com, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, build-date=2026-01-12T22:10:14Z, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, version=17.1.13, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.created=2026-01-12T22:10:14Z, vendor=Red Hat, Inc., vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd)
Feb 23 08:15:25 np0005626463.localdomain systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully.
Feb 23 08:15:25 np0005626463.localdomain sshd[77241]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:15:25 np0005626463.localdomain sshd[77241]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:15:34 np0005626463.localdomain sudo[77243]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 08:15:34 np0005626463.localdomain sudo[77243]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:15:34 np0005626463.localdomain sudo[77243]: pam_unix(sudo:session): session closed for user root
Feb 23 08:15:34 np0005626463.localdomain sudo[77258]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 23 08:15:34 np0005626463.localdomain sudo[77258]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:15:35 np0005626463.localdomain sudo[77258]: pam_unix(sudo:session): session closed for user root
Feb 23 08:15:35 np0005626463.localdomain sudo[77305]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 08:15:35 np0005626463.localdomain sudo[77305]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:15:35 np0005626463.localdomain sudo[77305]: pam_unix(sudo:session): session closed for user root
Feb 23 08:15:46 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.
Feb 23 08:15:46 np0005626463.localdomain podman[77320]: 2026-02-23 08:15:46.914738469 +0000 UTC m=+0.087442025 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, vendor=Red Hat, Inc., config_id=tripleo_step3, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, io.openshift.expose-services=, url=https://www.redhat.com, vcs-type=git, build-date=2026-01-12T22:10:15Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, architecture=x86_64, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20260112.1, managed_by=tripleo_ansible, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.component=openstack-collectd-container, tcib_managed=true, io.buildah.version=1.41.5)
Feb 23 08:15:46 np0005626463.localdomain podman[77320]: 2026-02-23 08:15:46.926808763 +0000 UTC m=+0.099512279 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, config_id=tripleo_step3, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=collectd, name=rhosp-rhel9/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, vcs-type=git, build-date=2026-01-12T22:10:15Z, vendor=Red Hat, Inc., batch=17.1_20260112.1, release=1766032510, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, tcib_managed=true, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 23 08:15:46 np0005626463.localdomain systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully.
Feb 23 08:15:48 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.
Feb 23 08:15:48 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.
Feb 23 08:15:48 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.
Feb 23 08:15:48 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.
Feb 23 08:15:48 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.
Feb 23 08:15:48 np0005626463.localdomain podman[77341]: 2026-02-23 08:15:48.929563776 +0000 UTC m=+0.103967209 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, release=1766032510, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_compute, vcs-type=git, name=rhosp-rhel9/openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, com.redhat.component=openstack-ceilometer-compute-container, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.openshift.expose-services=, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, version=17.1.13, tcib_managed=true, managed_by=tripleo_ansible, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2026-01-12T23:07:47Z)
Feb 23 08:15:49 np0005626463.localdomain podman[77342]: 2026-02-23 08:15:49.009438679 +0000 UTC m=+0.177647260 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T23:07:30Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-ipmi-container, build-date=2026-01-12T23:07:30Z, architecture=x86_64, name=rhosp-rhel9/openstack-ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, maintainer=OpenStack TripleO Team, vcs-type=git, config_id=tripleo_step4, vendor=Red Hat, Inc., io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, release=1766032510)
Feb 23 08:15:49 np0005626463.localdomain podman[77340]: 2026-02-23 08:15:48.97934468 +0000 UTC m=+0.154278328 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.created=2026-01-12T22:34:43Z, maintainer=OpenStack TripleO Team, architecture=x86_64, config_id=tripleo_step3, batch=17.1_20260112.1, tcib_managed=true, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, vcs-type=git, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, 
io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-iscsid-container, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=iscsid, build-date=2026-01-12T22:34:43Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid)
Feb 23 08:15:49 np0005626463.localdomain podman[77340]: 2026-02-23 08:15:49.05979705 +0000 UTC m=+0.234730678 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vendor=Red Hat, Inc., build-date=2026-01-12T22:34:43Z, config_id=tripleo_step3, vcs-type=git, name=rhosp-rhel9/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, batch=17.1_20260112.1, container_name=iscsid, com.redhat.component=openstack-iscsid-container, tcib_managed=true, distribution-scope=public, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, io.openshift.expose-services=, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team)
Feb 23 08:15:49 np0005626463.localdomain podman[77343]: 2026-02-23 08:15:49.075093816 +0000 UTC m=+0.242167934 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20260112.1, container_name=logrotate_crond, org.opencontainers.image.created=2026-01-12T22:10:15Z, release=1766032510, url=https://www.redhat.com, com.redhat.component=openstack-cron-container, vcs-type=git, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-cron, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, managed_by=tripleo_ansible, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, 
konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step4, build-date=2026-01-12T22:10:15Z, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true)
Feb 23 08:15:49 np0005626463.localdomain systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully.
Feb 23 08:15:49 np0005626463.localdomain podman[77341]: 2026-02-23 08:15:49.08511419 +0000 UTC m=+0.259517603 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1766032510, maintainer=OpenStack TripleO Team, version=17.1.13, batch=17.1_20260112.1, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64, distribution-scope=public, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T23:07:47Z, com.redhat.component=openstack-ceilometer-compute-container, config_id=tripleo_step4, io.buildah.version=1.41.5, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T23:07:47Z, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, name=rhosp-rhel9/openstack-ceilometer-compute)
Feb 23 08:15:49 np0005626463.localdomain systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully.
Feb 23 08:15:49 np0005626463.localdomain podman[77347]: 2026-02-23 08:15:49.16181851 +0000 UTC m=+0.330223557 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=unhealthy, com.redhat.component=openstack-nova-compute-container, container_name=nova_compute, name=rhosp-rhel9/openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, io.openshift.expose-services=, distribution-scope=public, version=17.1.13, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.5, vendor=Red Hat, Inc., config_id=tripleo_step5, build-date=2026-01-12T23:32:04Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 23 08:15:49 np0005626463.localdomain podman[77342]: 2026-02-23 08:15:49.188112909 +0000 UTC m=+0.356321460 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-ipmi-container, managed_by=tripleo_ansible, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_id=tripleo_step4, 
release=1766032510, org.opencontainers.image.created=2026-01-12T23:07:30Z, architecture=x86_64, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=ceilometer_agent_ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 23 08:15:49 np0005626463.localdomain systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully.
Feb 23 08:15:49 np0005626463.localdomain podman[77343]: 2026-02-23 08:15:49.240294182 +0000 UTC m=+0.407368300 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, managed_by=tripleo_ansible, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-cron-container, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, vendor=Red Hat, Inc., batch=17.1_20260112.1, build-date=2026-01-12T22:10:15Z, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, tcib_managed=true, io.k8s.description=Red Hat 
OpenStack Platform 17.1 cron, config_id=tripleo_step4, url=https://www.redhat.com, name=rhosp-rhel9/openstack-cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.5, distribution-scope=public, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, maintainer=OpenStack TripleO Team, container_name=logrotate_crond)
Feb 23 08:15:49 np0005626463.localdomain podman[77347]: 2026-02-23 08:15:49.248330837 +0000 UTC m=+0.416735884 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.openshift.expose-services=, architecture=x86_64, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, name=rhosp-rhel9/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T23:32:04Z, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, managed_by=tripleo_ansible, build-date=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13)
Feb 23 08:15:49 np0005626463.localdomain systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully.
Feb 23 08:15:49 np0005626463.localdomain podman[77347]: unhealthy
Feb 23 08:15:49 np0005626463.localdomain systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Main process exited, code=exited, status=1/FAILURE
Feb 23 08:15:49 np0005626463.localdomain systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Failed with result 'exit-code'.
Feb 23 08:15:52 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.
Feb 23 08:15:52 np0005626463.localdomain podman[77453]: 2026-02-23 08:15:52.912232035 +0000 UTC m=+0.085484137 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, managed_by=tripleo_ansible, build-date=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.expose-services=, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-nova-compute, vendor=Red Hat, Inc., config_id=tripleo_step4, container_name=nova_migration_target, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1)
Feb 23 08:15:53 np0005626463.localdomain podman[77453]: 2026-02-23 08:15:53.307429109 +0000 UTC m=+0.480681201 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, build-date=2026-01-12T23:32:04Z, name=rhosp-rhel9/openstack-nova-compute, url=https://www.redhat.com, container_name=nova_migration_target, org.opencontainers.image.created=2026-01-12T23:32:04Z, distribution-scope=public, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step4, release=1766032510, batch=17.1_20260112.1, architecture=x86_64)
Feb 23 08:15:53 np0005626463.localdomain systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully.
Feb 23 08:15:54 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.
Feb 23 08:15:54 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.
Feb 23 08:15:54 np0005626463.localdomain systemd[1]: tmp-crun.OB8mQY.mount: Deactivated successfully.
Feb 23 08:15:54 np0005626463.localdomain podman[77476]: 2026-02-23 08:15:54.928835883 +0000 UTC m=+0.101347231 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, managed_by=tripleo_ansible, version=17.1.13, description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, tcib_managed=true, url=https://www.redhat.com, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:36:40Z, architecture=x86_64, vcs-type=git, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, maintainer=OpenStack TripleO Team, 
io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-ovn-controller-container, vendor=Red Hat, Inc., io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:36:40Z)
Feb 23 08:15:54 np0005626463.localdomain podman[77476]: 2026-02-23 08:15:54.959224401 +0000 UTC m=+0.131735769 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.buildah.version=1.41.5, managed_by=tripleo_ansible, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, vcs-type=git, version=17.1.13, build-date=2026-01-12T22:36:40Z, io.openshift.expose-services=, container_name=ovn_controller, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, name=rhosp-rhel9/openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, release=1766032510, url=https://www.redhat.com, 
com.redhat.component=openstack-ovn-controller-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ovn-controller)
Feb 23 08:15:54 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Deactivated successfully.
Feb 23 08:15:54 np0005626463.localdomain podman[77477]: 2026-02-23 08:15:54.985663813 +0000 UTC m=+0.153674199 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, distribution-scope=public, build-date=2026-01-12T22:56:19Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, managed_by=tripleo_ansible, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, architecture=x86_64, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.expose-services=, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1766032510)
Feb 23 08:15:55 np0005626463.localdomain podman[77477]: 2026-02-23 08:15:55.034315055 +0000 UTC m=+0.202325451 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.buildah.version=1.41.5, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, vcs-type=git, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:56:19Z, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, container_name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, release=1766032510, distribution-scope=public, tcib_managed=true, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, vendor=Red Hat, Inc., version=17.1.13, build-date=2026-01-12T22:56:19Z)
Feb 23 08:15:55 np0005626463.localdomain systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Deactivated successfully.
Feb 23 08:15:55 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.
Feb 23 08:15:55 np0005626463.localdomain systemd[1]: tmp-crun.Bob56m.mount: Deactivated successfully.
Feb 23 08:15:55 np0005626463.localdomain podman[77524]: 2026-02-23 08:15:55.916295369 +0000 UTC m=+0.088768815 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, name=rhosp-rhel9/openstack-qdrouterd, version=17.1.13, managed_by=tripleo_ansible, io.buildah.version=1.41.5, config_id=tripleo_step1, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, tcib_managed=true, release=1766032510, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd)
Feb 23 08:15:56 np0005626463.localdomain podman[77524]: 2026-02-23 08:15:56.07655892 +0000 UTC m=+0.249032356 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.created=2026-01-12T22:10:14Z, vcs-type=git, container_name=metrics_qdr, vendor=Red Hat, Inc., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.41.5, release=1766032510, build-date=2026-01-12T22:10:14Z, managed_by=tripleo_ansible, architecture=x86_64, batch=17.1_20260112.1, tcib_managed=true, version=17.1.13, url=https://www.redhat.com, config_id=tripleo_step1, name=rhosp-rhel9/openstack-qdrouterd)
Feb 23 08:15:56 np0005626463.localdomain systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully.
Feb 23 08:16:06 np0005626463.localdomain sshd[77553]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:16:06 np0005626463.localdomain sshd[77553]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:16:14 np0005626463.localdomain sshd[77555]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:16:17 np0005626463.localdomain sshd[77555]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:16:17 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.
Feb 23 08:16:17 np0005626463.localdomain podman[77560]: 2026-02-23 08:16:17.49528998 +0000 UTC m=+0.095333677 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, managed_by=tripleo_ansible, distribution-scope=public, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, io.openshift.expose-services=, url=https://www.redhat.com, name=rhosp-rhel9/openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-collectd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee)
Feb 23 08:16:17 np0005626463.localdomain podman[77560]: 2026-02-23 08:16:17.505610901 +0000 UTC m=+0.105654618 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.5, com.redhat.component=openstack-collectd-container, distribution-scope=public, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, url=https://www.redhat.com, container_name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:10:15Z, release=1766032510, description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team)
Feb 23 08:16:17 np0005626463.localdomain systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully.
Feb 23 08:16:19 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.
Feb 23 08:16:19 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.
Feb 23 08:16:19 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.
Feb 23 08:16:19 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.
Feb 23 08:16:19 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.
Feb 23 08:16:19 np0005626463.localdomain podman[77648]: 2026-02-23 08:16:19.928232668 +0000 UTC m=+0.096238397 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, com.redhat.component=openstack-ceilometer-compute-container, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, batch=17.1_20260112.1, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_compute, build-date=2026-01-12T23:07:47Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.created=2026-01-12T23:07:47Z, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, tcib_managed=true, managed_by=tripleo_ansible, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, vcs-type=git)
Feb 23 08:16:19 np0005626463.localdomain podman[77648]: 2026-02-23 08:16:19.960314473 +0000 UTC m=+0.128320232 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.buildah.version=1.41.5, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-compute-container, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:07:47Z, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.created=2026-01-12T23:07:47Z, version=17.1.13, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1766032510, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, config_id=tripleo_step4, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-ceilometer-compute, architecture=x86_64, container_name=ceilometer_agent_compute)
Feb 23 08:16:19 np0005626463.localdomain systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully.
Feb 23 08:16:19 np0005626463.localdomain podman[77647]: 2026-02-23 08:16:19.97764677 +0000 UTC m=+0.147556537 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, version=17.1.13, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 iscsid, build-date=2026-01-12T22:34:43Z, vcs-ref=705339545363fec600102567c4e923938e0f43b3, architecture=x86_64, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, vcs-type=git, managed_by=tripleo_ansible, io.buildah.version=1.41.5, tcib_managed=true, name=rhosp-rhel9/openstack-iscsid, container_name=iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., com.redhat.component=openstack-iscsid-container, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, maintainer=OpenStack TripleO Team)
Feb 23 08:16:20 np0005626463.localdomain podman[77647]: 2026-02-23 08:16:20.015387537 +0000 UTC m=+0.185297314 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vendor=Red Hat, Inc., version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, vcs-type=git, io.openshift.expose-services=, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:34:43Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:34:43Z, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, tcib_managed=true, name=rhosp-rhel9/openstack-iscsid, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, distribution-scope=public, io.buildah.version=1.41.5, url=https://www.redhat.com, container_name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-iscsid-container, vcs-ref=705339545363fec600102567c4e923938e0f43b3)
Feb 23 08:16:20 np0005626463.localdomain systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully.
Feb 23 08:16:20 np0005626463.localdomain podman[77650]: 2026-02-23 08:16:20.046790972 +0000 UTC m=+0.207038296 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, maintainer=OpenStack TripleO Team, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.expose-services=, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=logrotate_crond, io.buildah.version=1.41.5, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-cron, architecture=x86_64, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, tcib_managed=true, url=https://www.redhat.com, build-date=2026-01-12T22:10:15Z, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:15Z, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510)
Feb 23 08:16:20 np0005626463.localdomain podman[77649]: 2026-02-23 08:16:20.083212139 +0000 UTC m=+0.247116974 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.5, managed_by=tripleo_ansible, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vcs-type=git, org.opencontainers.image.created=2026-01-12T23:07:30Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, distribution-scope=public, release=1766032510)
Feb 23 08:16:20 np0005626463.localdomain podman[77656]: 2026-02-23 08:16:20.14338753 +0000 UTC m=+0.301789298 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=unhealthy, org.opencontainers.image.created=2026-01-12T23:32:04Z, config_id=tripleo_step5, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, distribution-scope=public, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.13, vendor=Red Hat, Inc., container_name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, com.redhat.component=openstack-nova-compute-container, name=rhosp-rhel9/openstack-nova-compute, maintainer=OpenStack TripleO Team, architecture=x86_64, io.buildah.version=1.41.5)
Feb 23 08:16:20 np0005626463.localdomain podman[77649]: 2026-02-23 08:16:20.151141015 +0000 UTC m=+0.315045850 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-ipmi-container, config_id=tripleo_step4, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1766032510, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, managed_by=tripleo_ansible, io.openshift.expose-services=, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T23:07:30Z, build-date=2026-01-12T23:07:30Z, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.5, vcs-type=git, version=17.1.13, distribution-scope=public, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 23 08:16:20 np0005626463.localdomain podman[77650]: 2026-02-23 08:16:20.157846579 +0000 UTC m=+0.318093933 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, maintainer=OpenStack TripleO Team, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-cron-container, config_id=tripleo_step4, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, build-date=2026-01-12T22:10:15Z, batch=17.1_20260112.1, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, url=https://www.redhat.com, vendor=Red Hat, Inc., container_name=logrotate_crond)
Feb 23 08:16:20 np0005626463.localdomain systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully.
Feb 23 08:16:20 np0005626463.localdomain systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully.
Feb 23 08:16:20 np0005626463.localdomain podman[77656]: 2026-02-23 08:16:20.210302314 +0000 UTC m=+0.368704142 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, maintainer=OpenStack TripleO Team, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', 
'/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, release=1766032510, org.opencontainers.image.created=2026-01-12T23:32:04Z, build-date=2026-01-12T23:32:04Z, vcs-type=git, architecture=x86_64, name=rhosp-rhel9/openstack-nova-compute, io.openshift.expose-services=, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.5, tcib_managed=true, url=https://www.redhat.com, batch=17.1_20260112.1)
Feb 23 08:16:20 np0005626463.localdomain systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Deactivated successfully.
Feb 23 08:16:23 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.
Feb 23 08:16:23 np0005626463.localdomain systemd[1]: tmp-crun.iQNJWv.mount: Deactivated successfully.
Feb 23 08:16:23 np0005626463.localdomain podman[77782]: 2026-02-23 08:16:23.919348578 +0000 UTC m=+0.096794544 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, vendor=Red Hat, Inc., tcib_managed=true, build-date=2026-01-12T23:32:04Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step4, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, version=17.1.13, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:32:04Z, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe)
Feb 23 08:16:24 np0005626463.localdomain podman[77782]: 2026-02-23 08:16:24.303973612 +0000 UTC m=+0.481419578 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, config_id=tripleo_step4, container_name=nova_migration_target, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, vendor=Red Hat, Inc., io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, version=17.1.13, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, release=1766032510, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, build-date=2026-01-12T23:32:04Z)
Feb 23 08:16:24 np0005626463.localdomain systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully.
Feb 23 08:16:25 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.
Feb 23 08:16:25 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.
Feb 23 08:16:25 np0005626463.localdomain systemd[1]: tmp-crun.lN1p8B.mount: Deactivated successfully.
Feb 23 08:16:25 np0005626463.localdomain podman[77807]: 2026-02-23 08:16:25.923531015 +0000 UTC m=+0.093738341 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, config_id=tripleo_step4, build-date=2026-01-12T22:56:19Z, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:56:19Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, managed_by=tripleo_ansible, url=https://www.redhat.com, io.openshift.expose-services=, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-type=git, batch=17.1_20260112.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, vendor=Red Hat, Inc.)
Feb 23 08:16:25 np0005626463.localdomain podman[77806]: 2026-02-23 08:16:25.970159873 +0000 UTC m=+0.142164184 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.component=openstack-ovn-controller-container, managed_by=tripleo_ansible, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., container_name=ovn_controller, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, build-date=2026-01-12T22:36:40Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, maintainer=OpenStack TripleO Team, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.buildah.version=1.41.5, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, 
description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, url=https://www.redhat.com)
Feb 23 08:16:25 np0005626463.localdomain podman[77806]: 2026-02-23 08:16:25.996595506 +0000 UTC m=+0.168599817 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=ovn_controller, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2026-01-12T22:36:40Z, com.redhat.component=openstack-ovn-controller-container, vcs-type=git, io.buildah.version=1.41.5, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, version=17.1.13, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, architecture=x86_64, name=rhosp-rhel9/openstack-ovn-controller, distribution-scope=public)
Feb 23 08:16:26 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Deactivated successfully.
Feb 23 08:16:26 np0005626463.localdomain podman[77807]: 2026-02-23 08:16:26.057136087 +0000 UTC m=+0.227343393 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, container_name=ovn_metadata_agent, org.opencontainers.image.created=2026-01-12T22:56:19Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, tcib_managed=true, build-date=2026-01-12T22:56:19Z, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, managed_by=tripleo_ansible, distribution-scope=public, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, version=17.1.13, architecture=x86_64, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 23 08:16:26 np0005626463.localdomain systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Deactivated successfully.
Feb 23 08:16:26 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.
Feb 23 08:16:26 np0005626463.localdomain podman[77853]: 2026-02-23 08:16:26.916699673 +0000 UTC m=+0.089783041 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, build-date=2026-01-12T22:10:14Z, org.opencontainers.image.created=2026-01-12T22:10:14Z, version=17.1.13, io.openshift.expose-services=, container_name=metrics_qdr, release=1766032510, batch=17.1_20260112.1, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-qdrouterd, vcs-type=git, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, url=https://www.redhat.com, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.buildah.version=1.41.5)
Feb 23 08:16:27 np0005626463.localdomain podman[77853]: 2026-02-23 08:16:27.151507771 +0000 UTC m=+0.324591169 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:14Z, distribution-scope=public, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, batch=17.1_20260112.1, url=https://www.redhat.com, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-qdrouterd, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, version=17.1.13, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true)
Feb 23 08:16:27 np0005626463.localdomain systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully.
Feb 23 08:16:27 np0005626463.localdomain systemd[1]: libpod-e77fa343cc2d3fca32a3fb2fdb9f0766fbd1238a0f01ed862c66f4e1e684443a.scope: Deactivated successfully.
Feb 23 08:16:27 np0005626463.localdomain podman[77880]: 2026-02-23 08:16:27.810434867 +0000 UTC m=+0.066883185 container died e77fa343cc2d3fca32a3fb2fdb9f0766fbd1238a0f01ed862c66f4e1e684443a (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_wait_for_compute_service, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, name=rhosp-rhel9/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, maintainer=OpenStack TripleO Team, container_name=nova_wait_for_compute_service, batch=17.1_20260112.1, build-date=2026-01-12T23:32:04Z, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step5, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, vendor=Red Hat, Inc., architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, io.buildah.version=1.41.5, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'detach': False, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', '__OS_DEBUG': 'true', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'start_order': 4, 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/log/containers/nova:/var/log/nova', '/var/lib/container-config-scripts:/container-config-scripts']})
Feb 23 08:16:27 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e77fa343cc2d3fca32a3fb2fdb9f0766fbd1238a0f01ed862c66f4e1e684443a-userdata-shm.mount: Deactivated successfully.
Feb 23 08:16:27 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-32124063214ed6a71bfdb162bed59d08d2309f70899d91e1af77aee73d927f16-merged.mount: Deactivated successfully.
Feb 23 08:16:27 np0005626463.localdomain podman[77880]: 2026-02-23 08:16:27.852483865 +0000 UTC m=+0.108932153 container cleanup e77fa343cc2d3fca32a3fb2fdb9f0766fbd1238a0f01ed862c66f4e1e684443a (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_wait_for_compute_service, architecture=x86_64, release=1766032510, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_wait_for_compute_service, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'detach': False, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', '__OS_DEBUG': 'true', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'start_order': 4, 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/log/containers/nova:/var/log/nova', '/var/lib/container-config-scripts:/container-config-scripts']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, distribution-scope=public, vendor=Red Hat, Inc., config_id=tripleo_step5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, 
description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, name=rhosp-rhel9/openstack-nova-compute, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:32:04Z, url=https://www.redhat.com, build-date=2026-01-12T23:32:04Z, io.buildah.version=1.41.5, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true)
Feb 23 08:16:27 np0005626463.localdomain systemd[1]: libpod-conmon-e77fa343cc2d3fca32a3fb2fdb9f0766fbd1238a0f01ed862c66f4e1e684443a.scope: Deactivated successfully.
Feb 23 08:16:27 np0005626463.localdomain python3[75749]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_wait_for_compute_service --conmon-pidfile /run/nova_wait_for_compute_service.pid --detach=False --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env __OS_DEBUG=true --env TRIPLEO_CONFIG_HASH=b5f04eda8e5f004a5ff6ec948b25cc1e --label config_id=tripleo_step5 --label container_name=nova_wait_for_compute_service --label managed_by=tripleo_ansible --label config_data={'detach': False, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', '__OS_DEBUG': 'true', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'start_order': 4, 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/log/containers/nova:/var/log/nova', '/var/lib/container-config-scripts:/container-config-scripts']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_wait_for_compute_service.log --network host --user nova --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume 
/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /var/log/containers/nova:/var/log/nova --volume /var/lib/container-config-scripts:/container-config-scripts registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1
Feb 23 08:16:28 np0005626463.localdomain sudo[75747]: pam_unix(sudo:session): session closed for user root
Feb 23 08:16:28 np0005626463.localdomain sudo[77929]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eknvvxgrjgqnfscgfimhnogkgjxvtbwz ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 08:16:28 np0005626463.localdomain sudo[77929]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:16:28 np0005626463.localdomain python3[77931]: ansible-file Invoked with path=/etc/systemd/system/tripleo_nova_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 08:16:28 np0005626463.localdomain sudo[77929]: pam_unix(sudo:session): session closed for user root
Feb 23 08:16:28 np0005626463.localdomain sudo[77945]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dkmdqndmntxzrvytlkyvrreumyhfjyjh ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 08:16:28 np0005626463.localdomain sudo[77945]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:16:28 np0005626463.localdomain python3[77947]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_nova_compute_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 23 08:16:28 np0005626463.localdomain sudo[77945]: pam_unix(sudo:session): session closed for user root
Feb 23 08:16:29 np0005626463.localdomain sudo[78006]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pqarclpbncotuajvosfbusgyycewoeov ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 08:16:29 np0005626463.localdomain sudo[78006]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:16:29 np0005626463.localdomain python3[78008]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771834588.832133-118967-132864162348382/source dest=/etc/systemd/system/tripleo_nova_compute.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 08:16:29 np0005626463.localdomain sudo[78006]: pam_unix(sudo:session): session closed for user root
Feb 23 08:16:29 np0005626463.localdomain sudo[78022]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nqmintioifebpdjpleevjzrrnzesbnzl ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 08:16:29 np0005626463.localdomain sudo[78022]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:16:29 np0005626463.localdomain python3[78024]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 23 08:16:29 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 08:16:29 np0005626463.localdomain systemd-sysv-generator[78054]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 08:16:29 np0005626463.localdomain systemd-rc-local-generator[78049]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 08:16:30 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 08:16:30 np0005626463.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Feb 23 08:16:30 np0005626463.localdomain sudo[78022]: pam_unix(sudo:session): session closed for user root
Feb 23 08:16:30 np0005626463.localdomain recover_tripleo_nova_virtqemud[78062]: 61982
Feb 23 08:16:30 np0005626463.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Feb 23 08:16:30 np0005626463.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Feb 23 08:16:30 np0005626463.localdomain sudo[78076]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ooabyfsvjsfmulvcjpengcxsxclwxbax ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 23 08:16:30 np0005626463.localdomain sudo[78076]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:16:30 np0005626463.localdomain python3[78078]: ansible-systemd Invoked with state=restarted name=tripleo_nova_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 08:16:30 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 08:16:31 np0005626463.localdomain systemd-sysv-generator[78110]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 08:16:31 np0005626463.localdomain systemd-rc-local-generator[78105]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 08:16:31 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 08:16:31 np0005626463.localdomain systemd[1]: Starting nova_compute container...
Feb 23 08:16:31 np0005626463.localdomain tripleo-start-podman-container[78118]: Creating additional drop-in dependency for "nova_compute" (c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442)
Feb 23 08:16:31 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 08:16:31 np0005626463.localdomain systemd-rc-local-generator[78173]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 08:16:31 np0005626463.localdomain systemd-sysv-generator[78177]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 08:16:31 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 08:16:31 np0005626463.localdomain systemd[1]: Started nova_compute container.
Feb 23 08:16:31 np0005626463.localdomain sudo[78076]: pam_unix(sudo:session): session closed for user root
Feb 23 08:16:32 np0005626463.localdomain sudo[78213]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fstaznsbpogwqxogmelntprqecwalifa ; /usr/bin/python3
Feb 23 08:16:32 np0005626463.localdomain sudo[78213]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:16:32 np0005626463.localdomain python3[78215]: ansible-file Invoked with path=/var/lib/container-puppet/container-puppet-tasks5.json state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 08:16:32 np0005626463.localdomain sudo[78213]: pam_unix(sudo:session): session closed for user root
Feb 23 08:16:32 np0005626463.localdomain sudo[78261]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eomdganaxfrgmjjsqmgfqjymaxqczmbq ; /usr/bin/python3
Feb 23 08:16:32 np0005626463.localdomain sudo[78261]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:16:33 np0005626463.localdomain sudo[78261]: pam_unix(sudo:session): session closed for user root
Feb 23 08:16:33 np0005626463.localdomain sudo[78304]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-emcxctduofblrxxhxqkpqovtxprjdkdq ; /usr/bin/python3
Feb 23 08:16:33 np0005626463.localdomain sudo[78304]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:16:33 np0005626463.localdomain sudo[78304]: pam_unix(sudo:session): session closed for user root
Feb 23 08:16:33 np0005626463.localdomain sudo[78334]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rbfxqepgpcvhqzmkclmbyuazkwdhdapl ; /usr/bin/python3
Feb 23 08:16:33 np0005626463.localdomain sudo[78334]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:16:33 np0005626463.localdomain python3[78336]: ansible-container_puppet_config Invoked with check_mode=False config_vol_prefix=/var/lib/config-data debug=True net_host=True no_archive=True puppet_config=/var/lib/container-puppet/container-puppet-tasks5.json short_hostname=np0005626463 step=5 update_config_hash_only=False
Feb 23 08:16:33 np0005626463.localdomain sudo[78334]: pam_unix(sudo:session): session closed for user root
Feb 23 08:16:34 np0005626463.localdomain sudo[78350]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rwdwnfbjjzhnqfpoacwfvkvqvkvhbsae ; /usr/bin/python3
Feb 23 08:16:34 np0005626463.localdomain sudo[78350]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:16:34 np0005626463.localdomain python3[78352]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 08:16:34 np0005626463.localdomain sudo[78350]: pam_unix(sudo:session): session closed for user root
Feb 23 08:16:34 np0005626463.localdomain sudo[78366]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qgymiunhctwfpoyknrcinxazcliaocdc ; /usr/bin/python3
Feb 23 08:16:34 np0005626463.localdomain sudo[78366]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 23 08:16:34 np0005626463.localdomain python3[78368]: ansible-container_config_data Invoked with config_path=/var/lib/tripleo-config/container-puppet-config/step_5 config_pattern=container-puppet-*.json config_overrides={} debug=True
Feb 23 08:16:34 np0005626463.localdomain sudo[78366]: pam_unix(sudo:session): session closed for user root
Feb 23 08:16:35 np0005626463.localdomain sudo[78369]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 08:16:35 np0005626463.localdomain sudo[78369]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:16:35 np0005626463.localdomain sudo[78369]: pam_unix(sudo:session): session closed for user root
Feb 23 08:16:35 np0005626463.localdomain sudo[78384]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 23 08:16:35 np0005626463.localdomain sudo[78384]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:16:36 np0005626463.localdomain sudo[78384]: pam_unix(sudo:session): session closed for user root
Feb 23 08:16:37 np0005626463.localdomain sudo[78430]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 08:16:37 np0005626463.localdomain sudo[78430]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:16:37 np0005626463.localdomain sudo[78430]: pam_unix(sudo:session): session closed for user root
Feb 23 08:16:45 np0005626463.localdomain sshd[78445]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:16:46 np0005626463.localdomain sshd[78445]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:16:47 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.
Feb 23 08:16:47 np0005626463.localdomain systemd[1]: tmp-crun.0c1HGW.mount: Deactivated successfully.
Feb 23 08:16:47 np0005626463.localdomain podman[78447]: 2026-02-23 08:16:47.928572198 +0000 UTC m=+0.097199466 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, com.redhat.component=openstack-collectd-container, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:10:15Z, description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, distribution-scope=public, vcs-type=git, tcib_managed=true, architecture=x86_64, url=https://www.redhat.com, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, managed_by=tripleo_ansible, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., batch=17.1_20260112.1, io.openshift.expose-services=)
Feb 23 08:16:47 np0005626463.localdomain podman[78447]: 2026-02-23 08:16:47.967625226 +0000 UTC m=+0.136252514 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:10:15Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, name=rhosp-rhel9/openstack-collectd, 
org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=collectd, managed_by=tripleo_ansible, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, architecture=x86_64, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, version=17.1.13, com.redhat.component=openstack-collectd-container, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20260112.1, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:10:15Z, vendor=Red Hat, Inc.)
Feb 23 08:16:47 np0005626463.localdomain systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully.
Feb 23 08:16:50 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.
Feb 23 08:16:50 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.
Feb 23 08:16:50 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.
Feb 23 08:16:50 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.
Feb 23 08:16:50 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.
Feb 23 08:16:50 np0005626463.localdomain systemd[1]: tmp-crun.jz71Tp.mount: Deactivated successfully.
Feb 23 08:16:50 np0005626463.localdomain podman[78466]: 2026-02-23 08:16:50.93108544 +0000 UTC m=+0.098135374 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, architecture=x86_64, vendor=Red Hat, Inc., io.openshift.expose-services=, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, container_name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git, com.redhat.component=openstack-ceilometer-compute-container, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1766032510, config_id=tripleo_step4, io.k8s.display-name=Red 
Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-ceilometer-compute, version=17.1.13, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, build-date=2026-01-12T23:07:47Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true)
Feb 23 08:16:50 np0005626463.localdomain podman[78466]: 2026-02-23 08:16:50.959193985 +0000 UTC m=+0.126243969 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.openshift.expose-services=, release=1766032510, io.buildah.version=1.41.5, vendor=Red Hat, Inc., distribution-scope=public, build-date=2026-01-12T23:07:47Z, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, version=17.1.13, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64, url=https://www.redhat.com, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:47Z, managed_by=tripleo_ansible, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_compute)
Feb 23 08:16:50 np0005626463.localdomain systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully.
Feb 23 08:16:50 np0005626463.localdomain podman[78467]: 2026-02-23 08:16:50.978926585 +0000 UTC m=+0.142619238 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, build-date=2026-01-12T23:07:30Z, container_name=ceilometer_agent_ipmi, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-ipmi-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-ceilometer-ipmi, version=17.1.13, architecture=x86_64, release=1766032510, org.opencontainers.image.created=2026-01-12T23:07:30Z, config_id=tripleo_step4, vendor=Red Hat, Inc., distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible)
Feb 23 08:16:51 np0005626463.localdomain podman[78465]: 2026-02-23 08:16:51.02778556 +0000 UTC m=+0.197017501 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, release=1766032510, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, managed_by=tripleo_ansible, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.5, container_name=iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.openshift.expose-services=, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2026-01-12T22:34:43Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13, config_id=tripleo_step3, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, url=https://www.redhat.com, architecture=x86_64, name=rhosp-rhel9/openstack-iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 23 08:16:51 np0005626463.localdomain podman[78467]: 2026-02-23 08:16:51.032252776 +0000 UTC m=+0.195945369 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:30Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1766032510, tcib_managed=true, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-ipmi-container, container_name=ceilometer_agent_ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, architecture=x86_64, vendor=Red Hat, Inc., io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:07:30Z, version=17.1.13, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Feb 23 08:16:51 np0005626463.localdomain podman[78465]: 2026-02-23 08:16:51.041150596 +0000 UTC m=+0.210382487 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, tcib_managed=true, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-type=git, description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=iscsid, vendor=Red Hat, Inc., url=https://www.redhat.com, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, name=rhosp-rhel9/openstack-iscsid, distribution-scope=public, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:34:43Z, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, build-date=2026-01-12T22:34:43Z)
Feb 23 08:16:51 np0005626463.localdomain systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully.
Feb 23 08:16:51 np0005626463.localdomain systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully.
Feb 23 08:16:51 np0005626463.localdomain podman[78468]: 2026-02-23 08:16:51.088953 +0000 UTC m=+0.250689453 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, org.opencontainers.image.created=2026-01-12T22:10:15Z, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step4, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, 
io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, vcs-type=git, container_name=logrotate_crond, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, release=1766032510, managed_by=tripleo_ansible, build-date=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, name=rhosp-rhel9/openstack-cron)
Feb 23 08:16:51 np0005626463.localdomain podman[78470]: 2026-02-23 08:16:50.949920893 +0000 UTC m=+0.104754656 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, config_id=tripleo_step5, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', 
'/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, org.opencontainers.image.created=2026-01-12T23:32:04Z, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, architecture=x86_64, vcs-type=git, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, version=17.1.13, io.openshift.expose-services=, release=1766032510, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, container_name=nova_compute)
Feb 23 08:16:51 np0005626463.localdomain podman[78468]: 2026-02-23 08:16:51.125190912 +0000 UTC m=+0.286927295 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, container_name=logrotate_crond, com.redhat.component=openstack-cron-container, maintainer=OpenStack TripleO Team, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, url=https://www.redhat.com, config_id=tripleo_step4, name=rhosp-rhel9/openstack-cron, io.buildah.version=1.41.5, vcs-type=git, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, 
cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, build-date=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20260112.1, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.openshift.expose-services=, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee)
Feb 23 08:16:51 np0005626463.localdomain systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully.
Feb 23 08:16:51 np0005626463.localdomain podman[78470]: 2026-02-23 08:16:51.180530634 +0000 UTC m=+0.335364397 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, managed_by=tripleo_ansible, container_name=nova_compute, version=17.1.13, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:32:04Z, config_id=tripleo_step5, tcib_managed=true, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, build-date=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, vendor=Red Hat, Inc., org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, maintainer=OpenStack TripleO Team, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 23 08:16:51 np0005626463.localdomain systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Deactivated successfully.
Feb 23 08:16:54 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.
Feb 23 08:16:54 np0005626463.localdomain podman[78581]: 2026-02-23 08:16:54.903945395 +0000 UTC m=+0.079298221 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, release=1766032510, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., architecture=x86_64, vcs-type=git, 
vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, version=17.1.13, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, container_name=nova_migration_target, build-date=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z)
Feb 23 08:16:55 np0005626463.localdomain podman[78581]: 2026-02-23 08:16:55.291388745 +0000 UTC m=+0.466741561 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, version=17.1.13, tcib_managed=true, build-date=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, 
architecture=x86_64, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., vcs-type=git, distribution-scope=public, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T23:32:04Z, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=nova_migration_target, url=https://www.redhat.com, io.openshift.expose-services=, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe)
Feb 23 08:16:55 np0005626463.localdomain systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully.
Feb 23 08:16:56 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.
Feb 23 08:16:56 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.
Feb 23 08:16:56 np0005626463.localdomain systemd[1]: tmp-crun.1wTx0l.mount: Deactivated successfully.
Feb 23 08:16:56 np0005626463.localdomain podman[78604]: 2026-02-23 08:16:56.927951906 +0000 UTC m=+0.094802554 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, version=17.1.13, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, distribution-scope=public, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, architecture=x86_64, vcs-type=git, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, release=1766032510, io.buildah.version=1.41.5, vendor=Red Hat, Inc., com.redhat.component=openstack-ovn-controller-container, container_name=ovn_controller, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2026-01-12T22:36:40Z)
Feb 23 08:16:56 np0005626463.localdomain podman[78605]: 2026-02-23 08:16:56.977843042 +0000 UTC m=+0.144184424 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, architecture=x86_64, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, config_id=tripleo_step4, io.buildah.version=1.41.5, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:56:19Z, 
url=https://www.redhat.com, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1766032510, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, container_name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., batch=17.1_20260112.1, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:56:19Z)
Feb 23 08:16:57 np0005626463.localdomain podman[78604]: 2026-02-23 08:16:57.031387101 +0000 UTC m=+0.198237749 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, container_name=ovn_controller, maintainer=OpenStack TripleO Team, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, io.openshift.expose-services=, release=1766032510, summary=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-ovn-controller, version=17.1.13, batch=17.1_20260112.1, build-date=2026-01-12T22:36:40Z, org.opencontainers.image.created=2026-01-12T22:36:40Z, tcib_managed=true, managed_by=tripleo_ansible, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c)
Feb 23 08:16:57 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Deactivated successfully.
Feb 23 08:16:57 np0005626463.localdomain podman[78605]: 2026-02-23 08:16:57.082295528 +0000 UTC m=+0.248636920 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, url=https://www.redhat.com, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, container_name=ovn_metadata_agent, managed_by=tripleo_ansible, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1766032510, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-type=git, io.openshift.expose-services=, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:56:19Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1)
Feb 23 08:16:57 np0005626463.localdomain systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Deactivated successfully.
Feb 23 08:16:57 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.
Feb 23 08:16:57 np0005626463.localdomain podman[78652]: 2026-02-23 08:16:57.89373393 +0000 UTC m=+0.071585876 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc., org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-qdrouterd-container, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-qdrouterd, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 qdrouterd, 
vcs-type=git, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, release=1766032510, managed_by=tripleo_ansible, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, io.openshift.expose-services=, config_id=tripleo_step1)
Feb 23 08:16:58 np0005626463.localdomain podman[78652]: 2026-02-23 08:16:58.084381266 +0000 UTC m=+0.262233152 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., architecture=x86_64, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:10:14Z, vcs-type=git, build-date=2026-01-12T22:10:14Z, name=rhosp-rhel9/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, version=17.1.13, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, distribution-scope=public, io.openshift.expose-services=, config_id=tripleo_step1, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1)
Feb 23 08:16:58 np0005626463.localdomain systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully.
Feb 23 08:16:59 np0005626463.localdomain sshd[78681]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:16:59 np0005626463.localdomain sshd[78681]: Accepted publickey for zuul from 192.168.122.100 port 40568 ssh2: RSA SHA256:/ShS2J5Dq7o9P59e/NmgQORSAcJOBwu46Huo03HBdB4
Feb 23 08:16:59 np0005626463.localdomain systemd-logind[759]: New session 33 of user zuul.
Feb 23 08:16:59 np0005626463.localdomain systemd[1]: Started Session 33 of User zuul.
Feb 23 08:16:59 np0005626463.localdomain sshd[78681]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Feb 23 08:16:59 np0005626463.localdomain sudo[78788]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gtqnubfemsymonhzhkwifoxdosirolmy ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1771834619.3853106-41411-63611633644850/AnsiballZ_setup.py
Feb 23 08:16:59 np0005626463.localdomain sudo[78788]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 08:17:00 np0005626463.localdomain python3[78790]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 23 08:17:02 np0005626463.localdomain sudo[78788]: pam_unix(sudo:session): session closed for user root
Feb 23 08:17:04 np0005626463.localdomain sshd[78977]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:17:05 np0005626463.localdomain sshd[78977]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:17:07 np0005626463.localdomain sudo[79053]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-icbwmkfaizyqmwkycjyipvpydnwwkqrk ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1771834627.3193696-41502-132177914073346/AnsiballZ_dnf.py
Feb 23 08:17:07 np0005626463.localdomain sudo[79053]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 08:17:07 np0005626463.localdomain python3[79055]: ansible-ansible.legacy.dnf Invoked with name=['iptables'] allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None state=None
Feb 23 08:17:10 np0005626463.localdomain sudo[79053]: pam_unix(sudo:session): session closed for user root
Feb 23 08:17:15 np0005626463.localdomain sudo[79146]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wnxydvkazaonyxsuzzspmdmmbrvlczzy ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1771834634.8535433-41558-191285412412371/AnsiballZ_iptables.py
Feb 23 08:17:15 np0005626463.localdomain sudo[79146]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 08:17:15 np0005626463.localdomain python3[79148]: ansible-ansible.builtin.iptables Invoked with action=insert chain=INPUT comment=allow ssh access for zuul executor in_interface=eth0 jump=ACCEPT protocol=tcp source=38.102.83.114 table=filter state=present ip_version=ipv4 match=[] destination_ports=[] ctstate=[] syn=ignore flush=False chain_management=False numeric=False rule_num=None wait=None to_source=None destination=None to_destination=None tcp_flags=None gateway=None log_prefix=None log_level=None goto=None out_interface=None fragment=None set_counters=None source_port=None destination_port=None to_ports=None set_dscp_mark=None set_dscp_mark_class=None src_range=None dst_range=None match_set=None match_set_flags=None limit=None limit_burst=None uid_owner=None gid_owner=None reject_with=None icmp_type=None policy=None
Feb 23 08:17:15 np0005626463.localdomain kernel: Warning: Deprecated Driver is detected: nft_compat will not be maintained in a future major release and may be disabled
Feb 23 08:17:15 np0005626463.localdomain systemd-journald[47710]: Field hash table of /run/log/journal/c0212a8b024a111cfc61293864f36c87/system.journal has a fill level at 81.1 (270 of 333 items), suggesting rotation.
Feb 23 08:17:15 np0005626463.localdomain systemd-journald[47710]: /run/log/journal/c0212a8b024a111cfc61293864f36c87/system.journal: Journal header limits reached or header out-of-date, rotating.
Feb 23 08:17:15 np0005626463.localdomain rsyslogd[758]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Feb 23 08:17:15 np0005626463.localdomain rsyslogd[758]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Feb 23 08:17:15 np0005626463.localdomain sudo[79146]: pam_unix(sudo:session): session closed for user root
Feb 23 08:17:18 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.
Feb 23 08:17:18 np0005626463.localdomain systemd[1]: tmp-crun.RoIJ2P.mount: Deactivated successfully.
Feb 23 08:17:18 np0005626463.localdomain podman[79216]: 2026-02-23 08:17:18.923422639 +0000 UTC m=+0.089117700 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, batch=17.1_20260112.1, tcib_managed=true, container_name=collectd, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.41.5, io.openshift.expose-services=, com.redhat.component=openstack-collectd-container, version=17.1.13, description=Red Hat OpenStack Platform 17.1 collectd, build-date=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO 
Team, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, vendor=Red Hat, Inc., config_id=tripleo_step3, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, name=rhosp-rhel9/openstack-collectd)
Feb 23 08:17:18 np0005626463.localdomain podman[79216]: 2026-02-23 08:17:18.939301932 +0000 UTC m=+0.104996953 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, batch=17.1_20260112.1, vcs-type=git, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, name=rhosp-rhel9/openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, build-date=2026-01-12T22:10:15Z, managed_by=tripleo_ansible, release=1766032510, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-collectd-container, url=https://www.redhat.com, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, tcib_managed=true)
Feb 23 08:17:18 np0005626463.localdomain systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully.
Feb 23 08:17:21 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.
Feb 23 08:17:21 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.
Feb 23 08:17:21 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.
Feb 23 08:17:21 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.
Feb 23 08:17:21 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.
Feb 23 08:17:21 np0005626463.localdomain podman[79236]: 2026-02-23 08:17:21.932027776 +0000 UTC m=+0.104492568 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, container_name=iscsid, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:34:43Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-iscsid, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, version=17.1.13, distribution-scope=public, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack 
Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step3, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.5, com.redhat.component=openstack-iscsid-container, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, vcs-ref=705339545363fec600102567c4e923938e0f43b3, build-date=2026-01-12T22:34:43Z, summary=Red Hat OpenStack Platform 17.1 iscsid)
Feb 23 08:17:21 np0005626463.localdomain podman[79236]: 2026-02-23 08:17:21.970260509 +0000 UTC m=+0.142725341 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, tcib_managed=true, vcs-type=git, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_id=tripleo_step3, version=17.1.13, container_name=iscsid, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:34:43Z, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.openshift.expose-services=, release=1766032510, name=rhosp-rhel9/openstack-iscsid, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:34:43Z, com.redhat.component=openstack-iscsid-container)
Feb 23 08:17:21 np0005626463.localdomain podman[79239]: 2026-02-23 08:17:21.984494151 +0000 UTC m=+0.147267089 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, vendor=Red Hat, Inc., com.redhat.component=openstack-cron-container, io.openshift.expose-services=, build-date=2026-01-12T22:10:15Z, container_name=logrotate_crond, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:10:15Z, summary=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, name=rhosp-rhel9/openstack-cron, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, release=1766032510, url=https://www.redhat.com, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, config_id=tripleo_step4)
Feb 23 08:17:21 np0005626463.localdomain systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully.
Feb 23 08:17:22 np0005626463.localdomain podman[79239]: 2026-02-23 08:17:22.021352762 +0000 UTC m=+0.184125680 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:10:15Z, name=rhosp-rhel9/openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, release=1766032510, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, 
com.redhat.component=openstack-cron-container, vendor=Red Hat, Inc., distribution-scope=public, config_id=tripleo_step4, version=17.1.13, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, vcs-type=git, build-date=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, managed_by=tripleo_ansible, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team)
Feb 23 08:17:22 np0005626463.localdomain systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully.
Feb 23 08:17:22 np0005626463.localdomain podman[79237]: 2026-02-23 08:17:22.039330218 +0000 UTC m=+0.206893861 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., vcs-type=git, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-ceilometer-compute, distribution-scope=public, config_id=tripleo_step4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.5, managed_by=tripleo_ansible, url=https://www.redhat.com, container_name=ceilometer_agent_compute, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, release=1766032510, build-date=2026-01-12T23:07:47Z, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.created=2026-01-12T23:07:47Z, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container)
Feb 23 08:17:22 np0005626463.localdomain podman[79237]: 2026-02-23 08:17:22.076312373 +0000 UTC m=+0.243875996 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, tcib_managed=true, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git, url=https://www.redhat.com, io.openshift.expose-services=, container_name=ceilometer_agent_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.buildah.version=1.41.5, architecture=x86_64, com.redhat.component=openstack-ceilometer-compute-container, name=rhosp-rhel9/openstack-ceilometer-compute, distribution-scope=public, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:07:47Z, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, build-date=2026-01-12T23:07:47Z, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vendor=Red Hat, Inc., release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 
'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']})
Feb 23 08:17:22 np0005626463.localdomain podman[79238]: 2026-02-23 08:17:22.085891254 +0000 UTC m=+0.250280270 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, release=1766032510, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-ceilometer-ipmi, tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 
17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-type=git, batch=17.1_20260112.1, config_id=tripleo_step4, distribution-scope=public, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.openshift.expose-services=, url=https://www.redhat.com)
Feb 23 08:17:22 np0005626463.localdomain systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully.
Feb 23 08:17:22 np0005626463.localdomain podman[79238]: 2026-02-23 08:17:22.140941108 +0000 UTC m=+0.305330094 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.openshift.expose-services=, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, url=https://www.redhat.com, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, container_name=ceilometer_agent_ipmi, batch=17.1_20260112.1, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2026-01-12T23:07:30Z, name=rhosp-rhel9/openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, architecture=x86_64, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Feb 23 08:17:22 np0005626463.localdomain systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully.
Feb 23 08:17:22 np0005626463.localdomain podman[79245]: 2026-02-23 08:17:22.142717932 +0000 UTC m=+0.298694062 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, config_id=tripleo_step5, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.buildah.version=1.41.5, build-date=2026-01-12T23:32:04Z, io.openshift.expose-services=, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, vcs-type=git, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe)
Feb 23 08:17:22 np0005626463.localdomain podman[79245]: 2026-02-23 08:17:22.223646612 +0000 UTC m=+0.379622752 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, name=rhosp-rhel9/openstack-nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step5, io.buildah.version=1.41.5, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, container_name=nova_compute, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.openshift.expose-services=, vendor=Red Hat, Inc., vcs-type=git, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, batch=17.1_20260112.1)
Feb 23 08:17:22 np0005626463.localdomain systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Deactivated successfully.
Feb 23 08:17:24 np0005626463.localdomain sshd[79352]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:17:25 np0005626463.localdomain sshd[79352]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:17:25 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.
Feb 23 08:17:25 np0005626463.localdomain podman[79354]: 2026-02-23 08:17:25.943757131 +0000 UTC m=+0.087803402 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, tcib_managed=true, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.buildah.version=1.41.5, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, distribution-scope=public, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=nova_migration_target, io.openshift.expose-services=, name=rhosp-rhel9/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 23 08:17:26 np0005626463.localdomain podman[79354]: 2026-02-23 08:17:26.351383756 +0000 UTC m=+0.495430097 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, release=1766032510, url=https://www.redhat.com, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, managed_by=tripleo_ansible, build-date=2026-01-12T23:32:04Z, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, version=17.1.13, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=nova_migration_target, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:32:04Z, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, io.openshift.expose-services=, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20260112.1)
Feb 23 08:17:26 np0005626463.localdomain systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully.
Feb 23 08:17:27 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.
Feb 23 08:17:27 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.
Feb 23 08:17:27 np0005626463.localdomain podman[79377]: 2026-02-23 08:17:27.911733514 +0000 UTC m=+0.085745060 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, io.buildah.version=1.41.5, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, build-date=2026-01-12T22:36:40Z, name=rhosp-rhel9/openstack-ovn-controller, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, container_name=ovn_controller, managed_by=tripleo_ansible, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, distribution-scope=public, 
com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, batch=17.1_20260112.1)
Feb 23 08:17:27 np0005626463.localdomain podman[79378]: 2026-02-23 08:17:27.967683125 +0000 UTC m=+0.138008708 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, vcs-type=git, distribution-scope=public, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, build-date=2026-01-12T22:56:19Z, architecture=x86_64, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, release=1766032510, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.buildah.version=1.41.5, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc.)
Feb 23 08:17:27 np0005626463.localdomain podman[79377]: 2026-02-23 08:17:27.98864758 +0000 UTC m=+0.162659086 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, io.openshift.expose-services=, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.5, build-date=2026-01-12T22:36:40Z, container_name=ovn_controller, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-type=git, batch=17.1_20260112.1, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, 
org.opencontainers.image.created=2026-01-12T22:36:40Z, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 23 08:17:28 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Deactivated successfully.
Feb 23 08:17:28 np0005626463.localdomain podman[79378]: 2026-02-23 08:17:28.043564941 +0000 UTC m=+0.213890524 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.openshift.expose-services=, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, release=1766032510, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T22:56:19Z, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, tcib_managed=true, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, version=17.1.13, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, vcs-type=git)
Feb 23 08:17:28 np0005626463.localdomain systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Deactivated successfully.
Feb 23 08:17:28 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.
Feb 23 08:17:28 np0005626463.localdomain systemd[1]: tmp-crun.K8mHri.mount: Deactivated successfully.
Feb 23 08:17:28 np0005626463.localdomain podman[79426]: 2026-02-23 08:17:28.908978603 +0000 UTC m=+0.081931273 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, url=https://www.redhat.com, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:10:14Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, io.buildah.version=1.41.5, container_name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1766032510, batch=17.1_20260112.1, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, architecture=x86_64)
Feb 23 08:17:29 np0005626463.localdomain podman[79426]: 2026-02-23 08:17:29.116553821 +0000 UTC m=+0.289506401 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:14Z, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step1, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, 
managed_by=tripleo_ansible, container_name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, name=rhosp-rhel9/openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:10:14Z, release=1766032510, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, tcib_managed=true, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64)
Feb 23 08:17:29 np0005626463.localdomain systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully.
Feb 23 08:17:37 np0005626463.localdomain sudo[79456]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 08:17:37 np0005626463.localdomain sudo[79456]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:17:37 np0005626463.localdomain sudo[79456]: pam_unix(sudo:session): session closed for user root
Feb 23 08:17:37 np0005626463.localdomain sudo[79471]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 23 08:17:37 np0005626463.localdomain sudo[79471]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:17:38 np0005626463.localdomain sudo[79471]: pam_unix(sudo:session): session closed for user root
Feb 23 08:17:38 np0005626463.localdomain sudo[79518]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 08:17:38 np0005626463.localdomain sudo[79518]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:17:38 np0005626463.localdomain sudo[79518]: pam_unix(sudo:session): session closed for user root
Feb 23 08:17:47 np0005626463.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Feb 23 08:17:47 np0005626463.localdomain recover_tripleo_nova_virtqemud[79534]: 61982
Feb 23 08:17:47 np0005626463.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Feb 23 08:17:47 np0005626463.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Feb 23 08:17:49 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.
Feb 23 08:17:49 np0005626463.localdomain podman[79535]: 2026-02-23 08:17:49.925261486 +0000 UTC m=+0.094108247 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, architecture=x86_64, io.buildah.version=1.41.5, release=1766032510, vcs-type=git, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, build-date=2026-01-12T22:10:15Z, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, container_name=collectd, vendor=Red Hat, Inc., url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.openshift.expose-services=, com.redhat.component=openstack-collectd-container, version=17.1.13)
Feb 23 08:17:49 np0005626463.localdomain podman[79535]: 2026-02-23 08:17:49.96762928 +0000 UTC m=+0.136476111 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, build-date=2026-01-12T22:10:15Z, version=17.1.13, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, io.openshift.expose-services=, container_name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, name=rhosp-rhel9/openstack-collectd, tcib_managed=true, managed_by=tripleo_ansible, vcs-type=git, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 23 08:17:49 np0005626463.localdomain systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully.
Feb 23 08:17:52 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.
Feb 23 08:17:52 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.
Feb 23 08:17:52 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.
Feb 23 08:17:52 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.
Feb 23 08:17:52 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.
Feb 23 08:17:52 np0005626463.localdomain podman[79555]: 2026-02-23 08:17:52.925827276 +0000 UTC m=+0.096812900 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, distribution-scope=public, build-date=2026-01-12T22:34:43Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, 
vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.openshift.expose-services=, release=1766032510, architecture=x86_64, managed_by=tripleo_ansible, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.buildah.version=1.41.5, vcs-type=git, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, name=rhosp-rhel9/openstack-iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, com.redhat.component=openstack-iscsid-container, vendor=Red Hat, Inc.)
Feb 23 08:17:52 np0005626463.localdomain podman[79555]: 2026-02-23 08:17:52.963737172 +0000 UTC m=+0.134723016 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, tcib_managed=true, release=1766032510, org.opencontainers.image.created=2026-01-12T22:34:43Z, build-date=2026-01-12T22:34:43Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=705339545363fec600102567c4e923938e0f43b3, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 
17.1 iscsid, name=rhosp-rhel9/openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, vcs-type=git, container_name=iscsid, managed_by=tripleo_ansible, batch=17.1_20260112.1, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.openshift.expose-services=, architecture=x86_64, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13)
Feb 23 08:17:52 np0005626463.localdomain systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully.
Feb 23 08:17:52 np0005626463.localdomain podman[79558]: 2026-02-23 08:17:52.986237345 +0000 UTC m=+0.147877731 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, vendor=Red Hat, Inc., distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.component=openstack-cron-container, build-date=2026-01-12T22:10:15Z, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-cron, batch=17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 23 08:17:53 np0005626463.localdomain podman[79558]: 2026-02-23 08:17:53.021116839 +0000 UTC m=+0.182757255 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, version=17.1.13, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, vendor=Red Hat, Inc., vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:15Z, container_name=logrotate_crond, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, distribution-scope=public, name=rhosp-rhel9/openstack-cron, com.redhat.component=openstack-cron-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, description=Red Hat OpenStack Platform 17.1 cron, build-date=2026-01-12T22:10:15Z, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron)
Feb 23 08:17:53 np0005626463.localdomain systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully.
Feb 23 08:17:53 np0005626463.localdomain podman[79557]: 2026-02-23 08:17:53.035531172 +0000 UTC m=+0.198886971 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, release=1766032510, tcib_managed=true, build-date=2026-01-12T23:07:30Z, config_id=tripleo_step4, container_name=ceilometer_agent_ipmi, distribution-scope=public, name=rhosp-rhel9/openstack-ceilometer-ipmi, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13, batch=17.1_20260112.1, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-ceilometer-ipmi-container, cpe=cpe:/a:redhat:openstack:17.1::el9, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.expose-services=, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, vendor=Red Hat, Inc.)
Feb 23 08:17:53 np0005626463.localdomain podman[79557]: 2026-02-23 08:17:53.072333675 +0000 UTC m=+0.235689474 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, url=https://www.redhat.com, build-date=2026-01-12T23:07:30Z, architecture=x86_64, container_name=ceilometer_agent_ipmi, release=1766032510, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-ipmi-container, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.created=2026-01-12T23:07:30Z, vendor=Red Hat, Inc., config_id=tripleo_step4, name=rhosp-rhel9/openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.buildah.version=1.41.5, batch=17.1_20260112.1, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Feb 23 08:17:53 np0005626463.localdomain systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully.
Feb 23 08:17:53 np0005626463.localdomain podman[79564]: 2026-02-23 08:17:53.09069082 +0000 UTC m=+0.248716465 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, tcib_managed=true, batch=17.1_20260112.1, container_name=nova_compute, distribution-scope=public, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, version=17.1.13, url=https://www.redhat.com, io.openshift.expose-services=, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-nova-compute)
Feb 23 08:17:53 np0005626463.localdomain podman[79564]: 2026-02-23 08:17:53.122172329 +0000 UTC m=+0.280197964 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, container_name=nova_compute, distribution-scope=public, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-nova-compute, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, config_id=tripleo_step5, com.redhat.component=openstack-nova-compute-container, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 23 08:17:53 np0005626463.localdomain podman[79556]: 2026-02-23 08:17:53.133449896 +0000 UTC m=+0.298457407 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, build-date=2026-01-12T23:07:47Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_compute, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-compute-container, vcs-type=git, io.buildah.version=1.41.5, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T23:07:47Z, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, distribution-scope=public, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.13, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06)
Feb 23 08:17:53 np0005626463.localdomain systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Deactivated successfully.
Feb 23 08:17:53 np0005626463.localdomain podman[79556]: 2026-02-23 08:17:53.195571797 +0000 UTC m=+0.360579358 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vendor=Red Hat, Inc., io.buildah.version=1.41.5, com.redhat.component=openstack-ceilometer-compute-container, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13, container_name=ceilometer_agent_compute, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, batch=17.1_20260112.1, release=1766032510, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:07:47Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-type=git, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, distribution-scope=public, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 23 08:17:53 np0005626463.localdomain systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully.
Feb 23 08:17:56 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.
Feb 23 08:17:56 np0005626463.localdomain podman[79669]: 2026-02-23 08:17:56.900017098 +0000 UTC m=+0.077342782 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, name=rhosp-rhel9/openstack-nova-compute, version=17.1.13, tcib_managed=true, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.5, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, container_name=nova_migration_target, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:32:04Z, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, architecture=x86_64, vcs-type=git)
Feb 23 08:17:57 np0005626463.localdomain podman[79669]: 2026-02-23 08:17:57.286351166 +0000 UTC m=+0.463676860 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, container_name=nova_migration_target, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, io.openshift.expose-services=, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, distribution-scope=public, release=1766032510, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, batch=17.1_20260112.1, build-date=2026-01-12T23:32:04Z, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 23 08:17:57 np0005626463.localdomain systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully.
Feb 23 08:17:58 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.
Feb 23 08:17:58 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.
Feb 23 08:17:58 np0005626463.localdomain podman[79691]: 2026-02-23 08:17:58.916427551 +0000 UTC m=+0.084649156 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, build-date=2026-01-12T22:36:40Z, com.redhat.component=openstack-ovn-controller-container, tcib_managed=true, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, 
io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ovn-controller, release=1766032510, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, vendor=Red Hat, Inc., version=17.1.13)
Feb 23 08:17:58 np0005626463.localdomain podman[79691]: 2026-02-23 08:17:58.969464523 +0000 UTC m=+0.137686128 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, release=1766032510, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-ovn-controller, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2026-01-12T22:36:40Z, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, vcs-type=git, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, vendor=Red Hat, Inc., distribution-scope=public, url=https://www.redhat.com, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, 
io.openshift.expose-services=, container_name=ovn_controller, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller)
Feb 23 08:17:58 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Deactivated successfully.
Feb 23 08:17:59 np0005626463.localdomain sshd[79734]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:17:59 np0005626463.localdomain podman[79692]: 2026-02-23 08:17:58.973810387 +0000 UTC m=+0.137779720 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, architecture=x86_64, vendor=Red Hat, Inc., release=1766032510, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.openshift.expose-services=, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:56:19Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, distribution-scope=public)
Feb 23 08:17:59 np0005626463.localdomain podman[79692]: 2026-02-23 08:17:59.057224374 +0000 UTC m=+0.221193757 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:56:19Z, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, vendor=Red Hat, Inc., vcs-type=git, url=https://www.redhat.com, version=17.1.13, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, architecture=x86_64, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, batch=17.1_20260112.1, distribution-scope=public, container_name=ovn_metadata_agent, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 
'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']})
Feb 23 08:17:59 np0005626463.localdomain systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Deactivated successfully.
Feb 23 08:17:59 np0005626463.localdomain sshd[79734]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:17:59 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.
Feb 23 08:17:59 np0005626463.localdomain systemd[1]: tmp-crun.oCvP5e.mount: Deactivated successfully.
Feb 23 08:17:59 np0005626463.localdomain podman[79741]: 2026-02-23 08:17:59.445307207 +0000 UTC m=+0.097365998 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, tcib_managed=true, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:10:14Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step1, distribution-scope=public, vcs-type=git, url=https://www.redhat.com, architecture=x86_64, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:14Z, name=rhosp-rhel9/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, release=1766032510, io.openshift.expose-services=, container_name=metrics_qdr, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team)
Feb 23 08:17:59 np0005626463.localdomain podman[79741]: 2026-02-23 08:17:59.703730509 +0000 UTC m=+0.355789350 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., build-date=2026-01-12T22:10:14Z, org.opencontainers.image.created=2026-01-12T22:10:14Z, name=rhosp-rhel9/openstack-qdrouterd, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, release=1766032510, vcs-type=git, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_id=tripleo_step1, container_name=metrics_qdr, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true)
Feb 23 08:17:59 np0005626463.localdomain systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully.
Feb 23 08:18:03 np0005626463.localdomain sshd[79770]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:18:03 np0005626463.localdomain sshd[79770]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:18:15 np0005626463.localdomain sshd[78684]: Received disconnect from 192.168.122.100 port 40568:11: disconnected by user
Feb 23 08:18:15 np0005626463.localdomain sshd[78684]: Disconnected from user zuul 192.168.122.100 port 40568
Feb 23 08:18:15 np0005626463.localdomain sshd[78681]: pam_unix(sshd:session): session closed for user zuul
Feb 23 08:18:15 np0005626463.localdomain systemd[1]: session-33.scope: Deactivated successfully.
Feb 23 08:18:15 np0005626463.localdomain systemd[1]: session-33.scope: Consumed 6.062s CPU time.
Feb 23 08:18:15 np0005626463.localdomain systemd-logind[759]: Session 33 logged out. Waiting for processes to exit.
Feb 23 08:18:15 np0005626463.localdomain systemd-logind[759]: Removed session 33.
Feb 23 08:18:20 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.
Feb 23 08:18:20 np0005626463.localdomain podman[79817]: 2026-02-23 08:18:20.917999246 +0000 UTC m=+0.091415745 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, build-date=2026-01-12T22:10:15Z, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.buildah.version=1.41.5, config_id=tripleo_step3, vendor=Red Hat, Inc., 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, vcs-type=git, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, name=rhosp-rhel9/openstack-collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-collectd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, version=17.1.13, managed_by=tripleo_ansible)
Feb 23 08:18:20 np0005626463.localdomain podman[79817]: 2026-02-23 08:18:20.927774307 +0000 UTC m=+0.101190796 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.41.5, 
tcib_managed=true, vendor=Red Hat, Inc., config_id=tripleo_step3, managed_by=tripleo_ansible, release=1766032510, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:10:15Z, container_name=collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, com.redhat.component=openstack-collectd-container, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-collectd, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, architecture=x86_64, distribution-scope=public, build-date=2026-01-12T22:10:15Z)
Feb 23 08:18:20 np0005626463.localdomain systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully.
Feb 23 08:18:23 np0005626463.localdomain sshd[79837]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:18:23 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.
Feb 23 08:18:23 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.
Feb 23 08:18:23 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.
Feb 23 08:18:23 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.
Feb 23 08:18:23 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.
Feb 23 08:18:23 np0005626463.localdomain sshd[79837]: Accepted publickey for zuul from 38.102.83.114 port 34676 ssh2: RSA SHA256:/ShS2J5Dq7o9P59e/NmgQORSAcJOBwu46Huo03HBdB4
Feb 23 08:18:23 np0005626463.localdomain systemd-logind[759]: New session 34 of user zuul.
Feb 23 08:18:23 np0005626463.localdomain systemd[1]: Started Session 34 of User zuul.
Feb 23 08:18:23 np0005626463.localdomain sshd[79837]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Feb 23 08:18:23 np0005626463.localdomain systemd[1]: tmp-crun.N3d5pf.mount: Deactivated successfully.
Feb 23 08:18:23 np0005626463.localdomain podman[79840]: 2026-02-23 08:18:23.405054423 +0000 UTC m=+0.098515933 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, vcs-type=git, com.redhat.component=openstack-ceilometer-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, url=https://www.redhat.com, batch=17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.created=2026-01-12T23:07:47Z, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:07:47Z, io.openshift.expose-services=, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, tcib_managed=true, release=1766032510, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.5, config_id=tripleo_step4, container_name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 23 08:18:23 np0005626463.localdomain podman[79839]: 2026-02-23 08:18:23.468349911 +0000 UTC m=+0.163668648 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, batch=17.1_20260112.1, release=1766032510, architecture=x86_64, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=iscsid, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, build-date=2026-01-12T22:34:43Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, 
com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., vcs-ref=705339545363fec600102567c4e923938e0f43b3, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-iscsid, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 23 08:18:23 np0005626463.localdomain podman[79840]: 2026-02-23 08:18:23.489356727 +0000 UTC m=+0.182818237 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2026-01-12T23:07:47Z, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T23:07:47Z, version=17.1.13, 
konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, release=1766032510, batch=17.1_20260112.1, distribution-scope=public, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.expose-services=, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 23 08:18:23 np0005626463.localdomain sudo[79942]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wzapamdhmxlrbnbootbeertmvporqokp ; /usr/bin/python3
Feb 23 08:18:23 np0005626463.localdomain sudo[79942]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 23 08:18:23 np0005626463.localdomain systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully.
Feb 23 08:18:23 np0005626463.localdomain podman[79843]: 2026-02-23 08:18:23.569837224 +0000 UTC m=+0.254364259 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step5, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, tcib_managed=true, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, release=1766032510, org.opencontainers.image.created=2026-01-12T23:32:04Z, url=https://www.redhat.com, build-date=2026-01-12T23:32:04Z, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 23 08:18:23 np0005626463.localdomain podman[79839]: 2026-02-23 08:18:23.579584853 +0000 UTC m=+0.274903610 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, container_name=iscsid, com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, tcib_managed=true, architecture=x86_64, name=rhosp-rhel9/openstack-iscsid, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., build-date=2026-01-12T22:34:43Z, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, 
batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, config_id=tripleo_step3, io.openshift.expose-services=, url=https://www.redhat.com, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public)
Feb 23 08:18:23 np0005626463.localdomain podman[79841]: 2026-02-23 08:18:23.429232457 +0000 UTC m=+0.107826220 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.openshift.expose-services=, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T23:07:30Z, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-ipmi-container, distribution-scope=public, tcib_managed=true, vcs-type=git, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:30Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, release=1766032510, name=rhosp-rhel9/openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_ipmi, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 23 08:18:23 np0005626463.localdomain podman[79841]: 2026-02-23 08:18:23.613330092 +0000 UTC m=+0.291923905 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, architecture=x86_64, container_name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, batch=17.1_20260112.1, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, vendor=Red Hat, Inc., config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:07:30Z, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible)
Feb 23 08:18:23 np0005626463.localdomain podman[79842]: 2026-02-23 08:18:23.620356659 +0000 UTC m=+0.307442293 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, architecture=x86_64, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, org.opencontainers.image.created=2026-01-12T22:10:15Z, distribution-scope=public, vcs-type=git, tcib_managed=true, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 cron, name=rhosp-rhel9/openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers:/var/log/containers:z']}, build-date=2026-01-12T22:10:15Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, container_name=logrotate_crond, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 23 08:18:23 np0005626463.localdomain podman[79842]: 2026-02-23 08:18:23.624585058 +0000 UTC m=+0.311670672 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, name=rhosp-rhel9/openstack-cron, vcs-type=git, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, build-date=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, vendor=Red Hat, Inc., io.buildah.version=1.41.5, tcib_managed=true, release=1766032510, version=17.1.13, container_name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, url=https://www.redhat.com, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20260112.1, com.redhat.component=openstack-cron-container, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee)
Feb 23 08:18:23 np0005626463.localdomain systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully.
Feb 23 08:18:23 np0005626463.localdomain systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully.
Feb 23 08:18:23 np0005626463.localdomain systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully.
Feb 23 08:18:23 np0005626463.localdomain podman[79843]: 2026-02-23 08:18:23.675290348 +0000 UTC m=+0.359817383 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, distribution-scope=public, maintainer=OpenStack TripleO Team, vcs-type=git, vendor=Red Hat, Inc., org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, container_name=nova_compute, tcib_managed=true, managed_by=tripleo_ansible, url=https://www.redhat.com, build-date=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, version=17.1.13, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, config_id=tripleo_step5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': 
['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510)
Feb 23 08:18:23 np0005626463.localdomain systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Deactivated successfully.
Feb 23 08:18:23 np0005626463.localdomain python3[79949]: ansible-ansible.legacy.dnf Invoked with name=['systemd-container'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Feb 23 08:18:26 np0005626463.localdomain sudo[79942]: pam_unix(sudo:session): session closed for user root
Feb 23 08:18:27 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.
Feb 23 08:18:27 np0005626463.localdomain systemd[1]: tmp-crun.BsxNpl.mount: Deactivated successfully.
Feb 23 08:18:27 np0005626463.localdomain podman[79972]: 2026-02-23 08:18:27.921689185 +0000 UTC m=+0.093575827 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, build-date=2026-01-12T23:32:04Z, version=17.1.13, container_name=nova_migration_target, io.buildah.version=1.41.5, io.openshift.expose-services=, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-nova-compute, url=https://www.redhat.com, managed_by=tripleo_ansible, tcib_managed=true, batch=17.1_20260112.1, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 23 08:18:28 np0005626463.localdomain podman[79972]: 2026-02-23 08:18:28.305021034 +0000 UTC m=+0.476907686 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, vcs-type=git, version=17.1.13, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, container_name=nova_migration_target, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-nova-compute, build-date=2026-01-12T23:32:04Z, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, vendor=Red Hat, Inc.)
Feb 23 08:18:28 np0005626463.localdomain systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully.
Feb 23 08:18:29 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.
Feb 23 08:18:29 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.
Feb 23 08:18:29 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.
Feb 23 08:18:29 np0005626463.localdomain podman[79998]: 2026-02-23 08:18:29.94882503 +0000 UTC m=+0.120070433 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, release=1766032510, tcib_managed=true, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-qdrouterd, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack 
Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=metrics_qdr, url=https://www.redhat.com, vendor=Red Hat, Inc., build-date=2026-01-12T22:10:14Z, com.redhat.component=openstack-qdrouterd-container, version=17.1.13, architecture=x86_64, distribution-scope=public, config_id=tripleo_step1, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 23 08:18:30 np0005626463.localdomain systemd[1]: tmp-crun.qwNFhJ.mount: Deactivated successfully.
Feb 23 08:18:30 np0005626463.localdomain podman[79997]: 2026-02-23 08:18:30.052127992 +0000 UTC m=+0.223595743 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, tcib_managed=true, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-type=git, vendor=Red Hat, Inc., batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:56:19Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 
'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, release=1766032510, url=https://www.redhat.com, architecture=x86_64)
Feb 23 08:18:30 np0005626463.localdomain podman[79996]: 2026-02-23 08:18:30.015471447 +0000 UTC m=+0.189920857 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, com.redhat.component=openstack-ovn-controller-container, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:36:40Z, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:36:40Z, container_name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vendor=Red Hat, Inc., release=1766032510, tcib_managed=true, distribution-scope=public, architecture=x86_64, managed_by=tripleo_ansible, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, config_id=tripleo_step4, name=rhosp-rhel9/openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', 
'/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller)
Feb 23 08:18:30 np0005626463.localdomain podman[79996]: 2026-02-23 08:18:30.094658022 +0000 UTC m=+0.269107402 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, release=1766032510, com.redhat.component=openstack-ovn-controller-container, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, architecture=x86_64, io.openshift.expose-services=, vcs-type=git, config_id=tripleo_step4, name=rhosp-rhel9/openstack-ovn-controller, build-date=2026-01-12T22:36:40Z, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, url=https://www.redhat.com, distribution-scope=public, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, vendor=Red Hat, Inc., version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller)
Feb 23 08:18:30 np0005626463.localdomain podman[79997]: 2026-02-23 08:18:30.103426715 +0000 UTC m=+0.274894496 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.expose-services=, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, release=1766032510, version=17.1.13, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, config_id=tripleo_step4, vendor=Red Hat, Inc., config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, container_name=ovn_metadata_agent, architecture=x86_64, tcib_managed=true, build-date=2026-01-12T22:56:19Z)
Feb 23 08:18:30 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Deactivated successfully.
Feb 23 08:18:30 np0005626463.localdomain systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Deactivated successfully.
Feb 23 08:18:30 np0005626463.localdomain podman[79998]: 2026-02-23 08:18:30.172283046 +0000 UTC m=+0.343528409 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, release=1766032510, config_id=tripleo_step1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=metrics_qdr, io.buildah.version=1.41.5, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', 
'/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, tcib_managed=true, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, build-date=2026-01-12T22:10:14Z, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:10:14Z)
Feb 23 08:18:30 np0005626463.localdomain systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully.
Feb 23 08:18:39 np0005626463.localdomain sudo[80071]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 08:18:39 np0005626463.localdomain sudo[80071]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:18:39 np0005626463.localdomain sudo[80071]: pam_unix(sudo:session): session closed for user root
Feb 23 08:18:39 np0005626463.localdomain sudo[80086]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 23 08:18:39 np0005626463.localdomain sudo[80086]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:18:39 np0005626463.localdomain sudo[80086]: pam_unix(sudo:session): session closed for user root
Feb 23 08:18:41 np0005626463.localdomain sshd[80133]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:18:42 np0005626463.localdomain sshd[80133]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:18:42 np0005626463.localdomain sudo[80135]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 08:18:42 np0005626463.localdomain sudo[80135]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:18:42 np0005626463.localdomain sudo[80135]: pam_unix(sudo:session): session closed for user root
Feb 23 08:18:50 np0005626463.localdomain sshd[80150]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:18:50 np0005626463.localdomain sudo[80164]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ztmkdhqibntadjjvjcijkxllfyxxgfey ; /usr/bin/python3
Feb 23 08:18:50 np0005626463.localdomain sudo[80164]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 23 08:18:50 np0005626463.localdomain python3[80166]: ansible-ansible.legacy.dnf Invoked with name=['sos'] state=latest allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Feb 23 08:18:51 np0005626463.localdomain sshd[80150]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:18:51 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.
Feb 23 08:18:51 np0005626463.localdomain systemd[1]: tmp-crun.ya2VV0.mount: Deactivated successfully.
Feb 23 08:18:51 np0005626463.localdomain podman[80169]: 2026-02-23 08:18:51.739376371 +0000 UTC m=+0.092900465 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., batch=17.1_20260112.1, build-date=2026-01-12T22:10:15Z, version=17.1.13, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, url=https://www.redhat.com, container_name=collectd, name=rhosp-rhel9/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, config_id=tripleo_step3, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, vcs-type=git, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, com.redhat.component=openstack-collectd-container, tcib_managed=true)
Feb 23 08:18:51 np0005626463.localdomain podman[80169]: 2026-02-23 08:18:51.750445941 +0000 UTC m=+0.103970085 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, io.openshift.expose-services=, build-date=2026-01-12T22:10:15Z, io.buildah.version=1.41.5, tcib_managed=true, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:10:15Z, architecture=x86_64, container_name=collectd, name=rhosp-rhel9/openstack-collectd, batch=17.1_20260112.1, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-collectd-container, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd)
Feb 23 08:18:51 np0005626463.localdomain systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully.
Feb 23 08:18:53 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.
Feb 23 08:18:53 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.
Feb 23 08:18:53 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.
Feb 23 08:18:53 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.
Feb 23 08:18:53 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.
Feb 23 08:18:53 np0005626463.localdomain podman[80192]: 2026-02-23 08:18:53.93694345 +0000 UTC m=+0.095532283 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vendor=Red Hat, Inc., vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, version=17.1.13, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, release=1766032510, maintainer=OpenStack TripleO Team, build-date=2026-01-12T23:07:30Z, architecture=x86_64, distribution-scope=public, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, container_name=ceilometer_agent_ipmi)
Feb 23 08:18:53 np0005626463.localdomain podman[80190]: 2026-02-23 08:18:53.98781242 +0000 UTC m=+0.152759116 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.openshift.expose-services=, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, container_name=iscsid, version=17.1.13, com.redhat.component=openstack-iscsid-container, vcs-ref=705339545363fec600102567c4e923938e0f43b3, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, name=rhosp-rhel9/openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, batch=17.1_20260112.1, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, tcib_managed=true, url=https://www.redhat.com, config_id=tripleo_step3, release=1766032510, build-date=2026-01-12T22:34:43Z)
Feb 23 08:18:54 np0005626463.localdomain podman[80190]: 2026-02-23 08:18:54.000320677 +0000 UTC m=+0.165267343 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vendor=Red Hat, Inc., release=1766032510, version=17.1.13, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:34:43Z, description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, batch=17.1_20260112.1, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2026-01-12T22:34:43Z, container_name=iscsid, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, url=https://www.redhat.com, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, summary=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid)
Feb 23 08:18:54 np0005626463.localdomain podman[80192]: 2026-02-23 08:18:54.019084735 +0000 UTC m=+0.177673528 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, batch=17.1_20260112.1, build-date=2026-01-12T23:07:30Z, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T23:07:30Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1766032510, name=rhosp-rhel9/openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, tcib_managed=true, url=https://www.redhat.com, 
cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.5, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13, config_id=tripleo_step4)
Feb 23 08:18:54 np0005626463.localdomain systemd[1]: tmp-crun.Jzu0Ds.mount: Deactivated successfully.
Feb 23 08:18:54 np0005626463.localdomain systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully.
Feb 23 08:18:54 np0005626463.localdomain podman[80191]: 2026-02-23 08:18:54.041018498 +0000 UTC m=+0.202828949 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, batch=17.1_20260112.1, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.expose-services=, build-date=2026-01-12T23:07:47Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, distribution-scope=public, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1766032510, managed_by=tripleo_ansible, vcs-type=git, org.opencontainers.image.created=2026-01-12T23:07:47Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': 
False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true)
Feb 23 08:18:54 np0005626463.localdomain systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully.
Feb 23 08:18:54 np0005626463.localdomain podman[80199]: 2026-02-23 08:18:54.097854647 +0000 UTC m=+0.245507764 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.expose-services=, distribution-scope=public, io.buildah.version=1.41.5, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, build-date=2026-01-12T23:32:04Z, vendor=Red Hat, Inc., release=1766032510, batch=17.1_20260112.1, vcs-type=git, config_id=tripleo_step5, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, org.opencontainers.image.created=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, url=https://www.redhat.com)
Feb 23 08:18:54 np0005626463.localdomain podman[80191]: 2026-02-23 08:18:54.102360168 +0000 UTC m=+0.264170639 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, container_name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, build-date=2026-01-12T23:07:47Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, vcs-type=git, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, release=1766032510, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., batch=17.1_20260112.1, managed_by=tripleo_ansible, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, tcib_managed=true)
Feb 23 08:18:54 np0005626463.localdomain podman[80199]: 2026-02-23 08:18:54.129198654 +0000 UTC m=+0.276851751 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', 
'/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20260112.1, build-date=2026-01-12T23:32:04Z, container_name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, vendor=Red Hat, Inc., managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, name=rhosp-rhel9/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, config_id=tripleo_step5, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, version=17.1.13, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, distribution-scope=public, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 23 08:18:54 np0005626463.localdomain systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Deactivated successfully.
Feb 23 08:18:54 np0005626463.localdomain podman[80193]: 2026-02-23 08:18:54.146751511 +0000 UTC m=+0.302009032 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, name=rhosp-rhel9/openstack-cron, architecture=x86_64, container_name=logrotate_crond, build-date=2026-01-12T22:10:15Z, batch=17.1_20260112.1, config_id=tripleo_step4, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, tcib_managed=true, url=https://www.redhat.com, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, com.redhat.component=openstack-cron-container, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 cron)
Feb 23 08:18:54 np0005626463.localdomain systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully.
Feb 23 08:18:54 np0005626463.localdomain podman[80193]: 2026-02-23 08:18:54.208419792 +0000 UTC m=+0.363677293 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, name=rhosp-rhel9/openstack-cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, release=1766032510, distribution-scope=public, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, org.opencontainers.image.created=2026-01-12T22:10:15Z, batch=17.1_20260112.1, architecture=x86_64, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-cron-container, vcs-type=git, io.k8s.display-name=Red Hat OpenStack 
Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, container_name=logrotate_crond, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.openshift.expose-services=, build-date=2026-01-12T22:10:15Z)
Feb 23 08:18:54 np0005626463.localdomain systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully.
Feb 23 08:18:54 np0005626463.localdomain systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 23 08:18:54 np0005626463.localdomain systemd[1]: Starting man-db-cache-update.service...
Feb 23 08:18:54 np0005626463.localdomain systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 23 08:18:55 np0005626463.localdomain systemd[1]: man-db-cache-update.service: Deactivated successfully.
Feb 23 08:18:55 np0005626463.localdomain systemd[1]: Finished man-db-cache-update.service.
Feb 23 08:18:55 np0005626463.localdomain systemd[1]: run-r18d9633a04e94dbfa7cfbb53935b3b73.service: Deactivated successfully.
Feb 23 08:18:55 np0005626463.localdomain systemd[1]: run-rb9280c561b0b4d5d88eacf720b321e1b.service: Deactivated successfully.
Feb 23 08:18:55 np0005626463.localdomain sudo[80164]: pam_unix(sudo:session): session closed for user root
Feb 23 08:18:58 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.
Feb 23 08:18:58 np0005626463.localdomain podman[80452]: 2026-02-23 08:18:58.911618371 +0000 UTC m=+0.083759628 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, tcib_managed=true, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=nova_migration_target, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step4, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, maintainer=OpenStack TripleO 
Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.buildah.version=1.41.5, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, build-date=2026-01-12T23:32:04Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-compute, io.openshift.expose-services=, managed_by=tripleo_ansible)
Feb 23 08:18:59 np0005626463.localdomain podman[80452]: 2026-02-23 08:18:59.301261332 +0000 UTC m=+0.473402539 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, config_id=tripleo_step4, io.openshift.expose-services=, tcib_managed=true, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, vendor=Red Hat, Inc., org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, build-date=2026-01-12T23:32:04Z, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target, name=rhosp-rhel9/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T23:32:04Z, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, architecture=x86_64)
Feb 23 08:18:59 np0005626463.localdomain systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully.
Feb 23 08:19:00 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.
Feb 23 08:19:00 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.
Feb 23 08:19:00 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.
Feb 23 08:19:00 np0005626463.localdomain systemd[1]: tmp-crun.sEBIiq.mount: Deactivated successfully.
Feb 23 08:19:00 np0005626463.localdomain podman[80476]: 2026-02-23 08:19:00.924664656 +0000 UTC m=+0.093606099 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, url=https://www.redhat.com, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:56:19Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:56:19Z, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, release=1766032510, vendor=Red Hat, Inc., config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, tcib_managed=true, managed_by=tripleo_ansible, io.openshift.expose-services=)
Feb 23 08:19:00 np0005626463.localdomain podman[80475]: 2026-02-23 08:19:00.975476903 +0000 UTC m=+0.145164181 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, managed_by=tripleo_ansible, distribution-scope=public, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, io.buildah.version=1.41.5, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:36:40Z, architecture=x86_64, name=rhosp-rhel9/openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vendor=Red Hat, Inc., com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, url=https://www.redhat.com, 
config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=ovn_controller, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2026-01-12T22:36:40Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller)
Feb 23 08:19:01 np0005626463.localdomain podman[80475]: 2026-02-23 08:19:01.003267842 +0000 UTC m=+0.172955170 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, release=1766032510, config_id=tripleo_step4, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.5, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:36:40Z, com.redhat.component=openstack-ovn-controller-container, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, tcib_managed=true, vcs-type=git, maintainer=OpenStack TripleO Team, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, 
org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, architecture=x86_64, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, build-date=2026-01-12T22:36:40Z, container_name=ovn_controller)
Feb 23 08:19:01 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Deactivated successfully.
Feb 23 08:19:01 np0005626463.localdomain podman[80477]: 2026-02-23 08:19:01.032473859 +0000 UTC m=+0.195744072 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, release=1766032510, build-date=2026-01-12T22:10:14Z, container_name=metrics_qdr, tcib_managed=true, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.created=2026-01-12T22:10:14Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, batch=17.1_20260112.1)
Feb 23 08:19:01 np0005626463.localdomain podman[80476]: 2026-02-23 08:19:01.060822386 +0000 UTC m=+0.229763869 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, version=17.1.13, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:56:19Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, build-date=2026-01-12T22:56:19Z, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, distribution-scope=public, io.openshift.expose-services=, config_id=tripleo_step4, url=https://www.redhat.com, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, architecture=x86_64)
Feb 23 08:19:01 np0005626463.localdomain systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Deactivated successfully.
Feb 23 08:19:01 np0005626463.localdomain podman[80477]: 2026-02-23 08:19:01.223233972 +0000 UTC m=+0.386504155 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, io.buildah.version=1.41.5, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:10:14Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, batch=17.1_20260112.1, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, version=17.1.13, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, build-date=2026-01-12T22:10:14Z, container_name=metrics_qdr, architecture=x86_64, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-qdrouterd, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510)
Feb 23 08:19:01 np0005626463.localdomain systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully.
Feb 23 08:19:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb 23 08:19:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 2400.1 total, 600.0 interval
                                                          Cumulative writes: 4530 writes, 20K keys, 4530 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s
                                                          Cumulative WAL: 4530 writes, 464 syncs, 9.76 writes per sync, written: 0.02 GB, 0.01 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                          Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Feb 23 08:19:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb 23 08:19:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 2400.1 total, 600.0 interval
                                                          Cumulative writes: 5013 writes, 22K keys, 5013 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s
                                                          Cumulative WAL: 5013 writes, 561 syncs, 8.94 writes per sync, written: 0.02 GB, 0.01 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                          Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Feb 23 08:19:20 np0005626463.localdomain sshd[80573]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:19:21 np0005626463.localdomain sshd[80573]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:19:21 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.
Feb 23 08:19:21 np0005626463.localdomain podman[80598]: 2026-02-23 08:19:21.914476203 +0000 UTC m=+0.080955256 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, description=Red Hat OpenStack Platform 17.1 collectd, build-date=2026-01-12T22:10:15Z, com.redhat.component=openstack-collectd-container, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', 
'/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, architecture=x86_64, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., release=1766032510, vcs-type=git)
Feb 23 08:19:21 np0005626463.localdomain podman[80598]: 2026-02-23 08:19:21.929175124 +0000 UTC m=+0.095654187 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, vendor=Red Hat, Inc., vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, io.openshift.expose-services=, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, distribution-scope=public, build-date=2026-01-12T22:10:15Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, release=1766032510, maintainer=OpenStack TripleO Team, container_name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, io.buildah.version=1.41.5, config_id=tripleo_step3, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.component=openstack-collectd-container, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true)
Feb 23 08:19:21 np0005626463.localdomain systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully.
Feb 23 08:19:24 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.
Feb 23 08:19:24 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.
Feb 23 08:19:24 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.
Feb 23 08:19:24 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.
Feb 23 08:19:24 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.
Feb 23 08:19:24 np0005626463.localdomain systemd[1]: tmp-crun.aWIijc.mount: Deactivated successfully.
Feb 23 08:19:24 np0005626463.localdomain podman[80624]: 2026-02-23 08:19:24.940147701 +0000 UTC m=+0.098308136 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, vendor=Red Hat, Inc., url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, version=17.1.13, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step5, architecture=x86_64, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.buildah.version=1.41.5, com.redhat.component=openstack-nova-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, container_name=nova_compute, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, name=rhosp-rhel9/openstack-nova-compute, tcib_managed=true)
Feb 23 08:19:24 np0005626463.localdomain podman[80624]: 2026-02-23 08:19:24.974283982 +0000 UTC m=+0.132444447 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, config_id=tripleo_step5, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, version=17.1.13, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, io.buildah.version=1.41.5, com.redhat.component=openstack-nova-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 23 08:19:24 np0005626463.localdomain podman[80619]: 2026-02-23 08:19:24.988276589 +0000 UTC m=+0.155998154 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, com.redhat.component=openstack-iscsid-container, tcib_managed=true, vcs-type=git, vendor=Red Hat, Inc., config_id=tripleo_step3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, build-date=2026-01-12T22:34:43Z, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, distribution-scope=public, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, architecture=x86_64, container_name=iscsid, 
vcs-ref=705339545363fec600102567c4e923938e0f43b3, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:34:43Z, managed_by=tripleo_ansible)
Feb 23 08:19:24 np0005626463.localdomain systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Deactivated successfully.
Feb 23 08:19:25 np0005626463.localdomain podman[80619]: 2026-02-23 08:19:25.030408367 +0000 UTC m=+0.198129942 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., url=https://www.redhat.com, architecture=x86_64, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 iscsid, 
name=rhosp-rhel9/openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, vcs-type=git, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:34:43Z, config_id=tripleo_step3, container_name=iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.buildah.version=1.41.5, tcib_managed=true, version=17.1.13, managed_by=tripleo_ansible, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 iscsid)
Feb 23 08:19:25 np0005626463.localdomain systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully.
Feb 23 08:19:25 np0005626463.localdomain podman[80621]: 2026-02-23 08:19:25.03680071 +0000 UTC m=+0.201058459 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, name=rhosp-rhel9/openstack-ceilometer-ipmi, io.buildah.version=1.41.5, com.redhat.component=openstack-ceilometer-ipmi-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., config_id=tripleo_step4, managed_by=tripleo_ansible, build-date=2026-01-12T23:07:30Z, architecture=x86_64, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, version=17.1.13, 
vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, url=https://www.redhat.com, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.created=2026-01-12T23:07:30Z, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi)
Feb 23 08:19:25 np0005626463.localdomain podman[80620]: 2026-02-23 08:19:25.097463977 +0000 UTC m=+0.264261110 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, managed_by=tripleo_ansible, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:47Z, build-date=2026-01-12T23:07:47Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, name=rhosp-rhel9/openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, 
vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, tcib_managed=true, vcs-type=git, io.buildah.version=1.41.5, release=1766032510, url=https://www.redhat.com, version=17.1.13, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06)
Feb 23 08:19:25 np0005626463.localdomain podman[80622]: 2026-02-23 08:19:25.15471182 +0000 UTC m=+0.312808853 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, config_id=tripleo_step4, com.redhat.component=openstack-cron-container, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:10:15Z, description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, batch=17.1_20260112.1, build-date=2026-01-12T22:10:15Z, release=1766032510, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, name=rhosp-rhel9/openstack-cron, 
io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, container_name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, tcib_managed=true)
Feb 23 08:19:25 np0005626463.localdomain podman[80621]: 2026-02-23 08:19:25.171270503 +0000 UTC m=+0.335528272 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, name=rhosp-rhel9/openstack-ceilometer-ipmi, config_id=tripleo_step4, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, 
org.opencontainers.image.created=2026-01-12T23:07:30Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, build-date=2026-01-12T23:07:30Z, distribution-scope=public, vendor=Red Hat, Inc., vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64, container_name=ceilometer_agent_ipmi, io.openshift.expose-services=, batch=17.1_20260112.1)
Feb 23 08:19:25 np0005626463.localdomain podman[80620]: 2026-02-23 08:19:25.179929813 +0000 UTC m=+0.346726976 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, version=17.1.13, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-ceilometer-compute, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-compute-container, build-date=2026-01-12T23:07:47Z, container_name=ceilometer_agent_compute, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T23:07:47Z, distribution-scope=public, 
org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, release=1766032510, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.buildah.version=1.41.5, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute)
Feb 23 08:19:25 np0005626463.localdomain systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully.
Feb 23 08:19:25 np0005626463.localdomain systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully.
Feb 23 08:19:25 np0005626463.localdomain podman[80622]: 2026-02-23 08:19:25.193282969 +0000 UTC m=+0.351380022 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, managed_by=tripleo_ansible, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., batch=17.1_20260112.1, build-date=2026-01-12T22:10:15Z, version=17.1.13, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, release=1766032510, org.opencontainers.image.created=2026-01-12T22:10:15Z, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, io.openshift.expose-services=, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, container_name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 23 08:19:25 np0005626463.localdomain systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully.
Feb 23 08:19:29 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.
Feb 23 08:19:29 np0005626463.localdomain podman[80738]: 2026-02-23 08:19:29.915340951 +0000 UTC m=+0.090807416 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:32:04Z, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, url=https://www.redhat.com, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 
nova-compute, distribution-scope=public, name=rhosp-rhel9/openstack-nova-compute, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:32:04Z, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, release=1766032510, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, version=17.1.13, container_name=nova_migration_target)
Feb 23 08:19:30 np0005626463.localdomain sudo[80774]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gzktfyvhwqsamiiedblnqxjeduzezgzu ; /usr/bin/python3
Feb 23 08:19:30 np0005626463.localdomain sudo[80774]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 23 08:19:30 np0005626463.localdomain python3[80776]: ansible-ansible.legacy.command Invoked with _raw_params=subscription-manager repos --disable rhel-9-for-x86_64-baseos-eus-rpms --disable rhel-9-for-x86_64-appstream-eus-rpms --disable rhel-9-for-x86_64-highavailability-eus-rpms --disable openstack-17.1-for-rhel-9-x86_64-rpms --disable fast-datapath-for-rhel-9-x86_64-rpms _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 08:19:30 np0005626463.localdomain podman[80738]: 2026-02-23 08:19:30.356397408 +0000 UTC m=+0.531863873 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.13, build-date=2026-01-12T23:32:04Z, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, managed_by=tripleo_ansible, config_id=tripleo_step4, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git)
Feb 23 08:19:30 np0005626463.localdomain systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully.
Feb 23 08:19:31 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.
Feb 23 08:19:31 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.
Feb 23 08:19:31 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.
Feb 23 08:19:31 np0005626463.localdomain podman[80780]: 2026-02-23 08:19:31.926009506 +0000 UTC m=+0.093908380 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, config_id=tripleo_step4, build-date=2026-01-12T22:36:40Z, managed_by=tripleo_ansible, vcs-type=git, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, release=1766032510, com.redhat.component=openstack-ovn-controller-container, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, io.buildah.version=1.41.5, container_name=ovn_controller, distribution-scope=public, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': 
['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']})
Feb 23 08:19:31 np0005626463.localdomain systemd[1]: tmp-crun.7wcZPK.mount: Deactivated successfully.
Feb 23 08:19:31 np0005626463.localdomain podman[80781]: 2026-02-23 08:19:31.970942863 +0000 UTC m=+0.137474121 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, version=17.1.13, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, distribution-scope=public, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, tcib_managed=true, vendor=Red Hat, Inc., url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1766032510, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, build-date=2026-01-12T22:56:19Z, vcs-type=git, config_id=tripleo_step4, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:56:19Z, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team)
Feb 23 08:19:32 np0005626463.localdomain podman[80780]: 2026-02-23 08:19:32.024668313 +0000 UTC m=+0.192567237 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:36:40Z, tcib_managed=true, version=17.1.13, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, managed_by=tripleo_ansible, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=ovn_controller, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:36:40Z, distribution-scope=public, vcs-type=git, release=1766032510, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, name=rhosp-rhel9/openstack-ovn-controller, vendor=Red Hat, 
Inc., description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, architecture=x86_64)
Feb 23 08:19:32 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Deactivated successfully.
Feb 23 08:19:32 np0005626463.localdomain podman[80781]: 2026-02-23 08:19:32.052664963 +0000 UTC m=+0.219196221 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, release=1766032510, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:56:19Z, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, managed_by=tripleo_ansible, tcib_managed=true, vcs-type=git, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, container_name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, batch=17.1_20260112.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 
'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']})
Feb 23 08:19:32 np0005626463.localdomain systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Deactivated successfully.
Feb 23 08:19:32 np0005626463.localdomain podman[80782]: 2026-02-23 08:19:32.034372681 +0000 UTC m=+0.196926625 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, tcib_managed=true, release=1766032510, container_name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:14Z, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 
qdrouterd, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, io.buildah.version=1.41.5, architecture=x86_64, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:10:14Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git)
Feb 23 08:19:32 np0005626463.localdomain podman[80782]: 2026-02-23 08:19:32.257305795 +0000 UTC m=+0.419859659 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, distribution-scope=public, tcib_managed=true, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, config_id=tripleo_step1, vendor=Red Hat, Inc., managed_by=tripleo_ansible, vcs-type=git, name=rhosp-rhel9/openstack-qdrouterd, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, 
com.redhat.component=openstack-qdrouterd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, build-date=2026-01-12T22:10:14Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd)
Feb 23 08:19:32 np0005626463.localdomain systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully.
Feb 23 08:19:33 np0005626463.localdomain rhsm-service[6591]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Feb 23 08:19:33 np0005626463.localdomain rhsm-service[6591]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Feb 23 08:19:37 np0005626463.localdomain sshd[81040]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:19:37 np0005626463.localdomain sudo[80774]: pam_unix(sudo:session): session closed for user root
Feb 23 08:19:43 np0005626463.localdomain sudo[81042]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 08:19:43 np0005626463.localdomain sudo[81042]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:19:43 np0005626463.localdomain sudo[81042]: pam_unix(sudo:session): session closed for user root
Feb 23 08:19:43 np0005626463.localdomain sudo[81057]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 check-host
Feb 23 08:19:43 np0005626463.localdomain sudo[81057]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:19:43 np0005626463.localdomain sudo[81057]: pam_unix(sudo:session): session closed for user root
Feb 23 08:19:43 np0005626463.localdomain sudo[81094]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 08:19:43 np0005626463.localdomain sudo[81094]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:19:43 np0005626463.localdomain sudo[81094]: pam_unix(sudo:session): session closed for user root
Feb 23 08:19:43 np0005626463.localdomain sudo[81109]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 23 08:19:43 np0005626463.localdomain sudo[81109]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:19:44 np0005626463.localdomain sudo[81109]: pam_unix(sudo:session): session closed for user root
Feb 23 08:19:45 np0005626463.localdomain sudo[81155]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 08:19:45 np0005626463.localdomain sudo[81155]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:19:45 np0005626463.localdomain sudo[81155]: pam_unix(sudo:session): session closed for user root
Feb 23 08:19:47 np0005626463.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Feb 23 08:19:47 np0005626463.localdomain recover_tripleo_nova_virtqemud[81171]: 61982
Feb 23 08:19:47 np0005626463.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Feb 23 08:19:47 np0005626463.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Feb 23 08:19:50 np0005626463.localdomain sshd[81040]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:19:52 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.
Feb 23 08:19:52 np0005626463.localdomain systemd[1]: tmp-crun.KgZfbP.mount: Deactivated successfully.
Feb 23 08:19:52 np0005626463.localdomain podman[81172]: 2026-02-23 08:19:52.932519788 +0000 UTC m=+0.103587501 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, managed_by=tripleo_ansible, distribution-scope=public, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, config_id=tripleo_step3, url=https://www.redhat.com, build-date=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=collectd, architecture=x86_64, name=rhosp-rhel9/openstack-collectd, version=17.1.13, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:10:15Z, vendor=Red Hat, Inc.)
Feb 23 08:19:52 np0005626463.localdomain podman[81172]: 2026-02-23 08:19:52.943848854 +0000 UTC m=+0.114916537 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, container_name=collectd, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, summary=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-collectd, tcib_managed=true, version=17.1.13, config_id=tripleo_step3, vcs-type=git, build-date=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vendor=Red Hat, Inc., io.openshift.expose-services=, maintainer=OpenStack TripleO Team, distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container)
Feb 23 08:19:52 np0005626463.localdomain systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully.
Feb 23 08:19:55 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.
Feb 23 08:19:55 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.
Feb 23 08:19:55 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.
Feb 23 08:19:55 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.
Feb 23 08:19:55 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.
Feb 23 08:19:55 np0005626463.localdomain podman[81194]: 2026-02-23 08:19:55.893161205 +0000 UTC m=+0.071438838 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-iscsid, build-date=2026-01-12T22:34:43Z, com.redhat.component=openstack-iscsid-container, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, version=17.1.13, tcib_managed=true, architecture=x86_64, managed_by=tripleo_ansible, distribution-scope=public, vcs-ref=705339545363fec600102567c4e923938e0f43b3, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:34:43Z)
Feb 23 08:19:55 np0005626463.localdomain podman[81196]: 2026-02-23 08:19:55.953048216 +0000 UTC m=+0.125305232 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, url=https://www.redhat.com, build-date=2026-01-12T23:07:30Z, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.expose-services=, container_name=ceilometer_agent_ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.5, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, distribution-scope=public, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, tcib_managed=true, config_id=tripleo_step4, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13)
Feb 23 08:19:55 np0005626463.localdomain podman[81195]: 2026-02-23 08:19:55.935014495 +0000 UTC m=+0.104805089 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, url=https://www.redhat.com, distribution-scope=public, description=Red Hat OpenStack Platform 
17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, maintainer=OpenStack TripleO Team, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:47Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-ceilometer-compute, vendor=Red Hat, Inc., io.buildah.version=1.41.5, vcs-type=git, release=1766032510, tcib_managed=true, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1)
Feb 23 08:19:56 np0005626463.localdomain podman[81197]: 2026-02-23 08:19:56.026987236 +0000 UTC m=+0.195041699 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:15Z, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, tcib_managed=true, distribution-scope=public, batch=17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, version=17.1.13, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, vcs-type=git, description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, io.buildah.version=1.41.5, container_name=logrotate_crond, 
io.openshift.expose-services=, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, name=rhosp-rhel9/openstack-cron, build-date=2026-01-12T22:10:15Z, com.redhat.component=openstack-cron-container, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 cron)
Feb 23 08:19:56 np0005626463.localdomain podman[81194]: 2026-02-23 08:19:56.031553583 +0000 UTC m=+0.209831266 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., container_name=iscsid, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:34:43Z, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, vcs-ref=705339545363fec600102567c4e923938e0f43b3, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, tcib_managed=true, architecture=x86_64, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:34:43Z, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, release=1766032510, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, distribution-scope=public, name=rhosp-rhel9/openstack-iscsid)
Feb 23 08:19:56 np0005626463.localdomain systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully.
Feb 23 08:19:56 np0005626463.localdomain podman[81196]: 2026-02-23 08:19:56.054231215 +0000 UTC m=+0.226488261 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, tcib_managed=true, vcs-type=git, org.opencontainers.image.created=2026-01-12T23:07:30Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, io.openshift.expose-services=, name=rhosp-rhel9/openstack-ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, 
summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=ceilometer_agent_ipmi, batch=17.1_20260112.1, architecture=x86_64, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-ipmi-container, config_id=tripleo_step4)
Feb 23 08:19:56 np0005626463.localdomain systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully.
Feb 23 08:19:56 np0005626463.localdomain podman[81203]: 2026-02-23 08:19:56.00593439 +0000 UTC m=+0.173316025 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, distribution-scope=public, container_name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', 
'/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T23:32:04Z, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, io.openshift.expose-services=, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step5, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-nova-compute, vendor=Red Hat, Inc., version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, maintainer=OpenStack TripleO Team, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 23 08:19:56 np0005626463.localdomain podman[81197]: 2026-02-23 08:19:56.110176525 +0000 UTC m=+0.278231038 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, name=rhosp-rhel9/openstack-cron, tcib_managed=true, vendor=Red Hat, Inc., org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step4, build-date=2026-01-12T22:10:15Z, version=17.1.13, distribution-scope=public, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.buildah.version=1.41.5, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:10:15Z, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, container_name=logrotate_crond, com.redhat.component=openstack-cron-container, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 23 08:19:56 np0005626463.localdomain podman[81195]: 2026-02-23 08:19:56.115375708 +0000 UTC m=+0.285166252 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, batch=17.1_20260112.1, url=https://www.redhat.com, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, release=1766032510, io.buildah.version=1.41.5, architecture=x86_64, container_name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T23:07:47Z, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-ceilometer-compute-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, name=rhosp-rhel9/openstack-ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, maintainer=OpenStack TripleO Team, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, version=17.1.13, build-date=2026-01-12T23:07:47Z)
Feb 23 08:19:56 np0005626463.localdomain systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully.
Feb 23 08:19:56 np0005626463.localdomain podman[81203]: 2026-02-23 08:19:56.135626351 +0000 UTC m=+0.303008006 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, batch=17.1_20260112.1, managed_by=tripleo_ansible, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., version=17.1.13, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2026-01-12T23:32:04Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, container_name=nova_compute, architecture=x86_64, config_id=tripleo_step5, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T23:32:04Z)
Feb 23 08:19:56 np0005626463.localdomain systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Deactivated successfully.
Feb 23 08:19:56 np0005626463.localdomain systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully.
Feb 23 08:20:00 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.
Feb 23 08:20:00 np0005626463.localdomain systemd[1]: tmp-crun.r9jBjk.mount: Deactivated successfully.
Feb 23 08:20:00 np0005626463.localdomain podman[81309]: 2026-02-23 08:20:00.915586571 +0000 UTC m=+0.089782272 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, architecture=x86_64, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, container_name=nova_migration_target, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, org.opencontainers.image.created=2026-01-12T23:32:04Z, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, build-date=2026-01-12T23:32:04Z, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute)
Feb 23 08:20:01 np0005626463.localdomain sshd[81333]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:20:01 np0005626463.localdomain podman[81309]: 2026-02-23 08:20:01.321030166 +0000 UTC m=+0.495225837 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vcs-type=git, maintainer=OpenStack TripleO Team, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, config_id=tripleo_step4, io.openshift.expose-services=, vendor=Red Hat, Inc., batch=17.1_20260112.1, url=https://www.redhat.com, tcib_managed=true, org.opencontainers.image.created=2026-01-12T23:32:04Z, version=17.1.13, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target)
Feb 23 08:20:01 np0005626463.localdomain systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully.
Feb 23 08:20:01 np0005626463.localdomain sshd[81333]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:20:02 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.
Feb 23 08:20:02 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.
Feb 23 08:20:02 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.
Feb 23 08:20:02 np0005626463.localdomain systemd[1]: tmp-crun.sOrWRS.mount: Deactivated successfully.
Feb 23 08:20:02 np0005626463.localdomain podman[81335]: 2026-02-23 08:20:02.928673938 +0000 UTC m=+0.099824464 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, distribution-scope=public, url=https://www.redhat.com, managed_by=tripleo_ansible, com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, 
container_name=ovn_controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, build-date=2026-01-12T22:36:40Z, vcs-type=git, batch=17.1_20260112.1, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, config_id=tripleo_step4)
Feb 23 08:20:02 np0005626463.localdomain podman[81336]: 2026-02-23 08:20:02.981402865 +0000 UTC m=+0.149563139 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, architecture=x86_64, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-type=git, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:56:19Z, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, release=1766032510, config_id=tripleo_step4, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, build-date=2026-01-12T22:56:19Z, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, tcib_managed=true, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, version=17.1.13)
Feb 23 08:20:03 np0005626463.localdomain podman[81337]: 2026-02-23 08:20:03.027896547 +0000 UTC m=+0.193054497 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.41.5, batch=17.1_20260112.1, distribution-scope=public, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_id=tripleo_step1, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, architecture=x86_64, version=17.1.13, description=Red Hat 
OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., container_name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:10:14Z, managed_by=tripleo_ansible, release=1766032510, build-date=2026-01-12T22:10:14Z, io.openshift.expose-services=, name=rhosp-rhel9/openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 23 08:20:03 np0005626463.localdomain podman[81335]: 2026-02-23 08:20:03.034627273 +0000 UTC m=+0.205777799 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, container_name=ovn_controller, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.5, io.openshift.expose-services=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, version=17.1.13, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:36:40Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, url=https://www.redhat.com, batch=17.1_20260112.1, com.redhat.component=openstack-ovn-controller-container, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, 
release=1766032510, architecture=x86_64, name=rhosp-rhel9/openstack-ovn-controller, vendor=Red Hat, Inc., distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git)
Feb 23 08:20:03 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Deactivated successfully.
Feb 23 08:20:03 np0005626463.localdomain podman[81336]: 2026-02-23 08:20:03.05450781 +0000 UTC m=+0.222668154 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, release=1766032510, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.expose-services=, tcib_managed=true, version=17.1.13, 
container_name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, vcs-type=git, maintainer=OpenStack TripleO Team, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:56:19Z, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z)
Feb 23 08:20:03 np0005626463.localdomain systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Deactivated successfully.
Feb 23 08:20:03 np0005626463.localdomain podman[81337]: 2026-02-23 08:20:03.233324431 +0000 UTC m=+0.398482331 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, io.buildah.version=1.41.5, build-date=2026-01-12T22:10:14Z, io.openshift.expose-services=, tcib_managed=true, name=rhosp-rhel9/openstack-qdrouterd, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:14Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, container_name=metrics_qdr, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, version=17.1.13, config_id=tripleo_step1, batch=17.1_20260112.1, architecture=x86_64, release=1766032510, distribution-scope=public, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 23 08:20:03 np0005626463.localdomain systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully.
Feb 23 08:20:03 np0005626463.localdomain systemd[1]: tmp-crun.Vcr12f.mount: Deactivated successfully.
Feb 23 08:20:23 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.
Feb 23 08:20:23 np0005626463.localdomain podman[81453]: 2026-02-23 08:20:23.909174826 +0000 UTC m=+0.088140444 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.buildah.version=1.41.5, container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, version=17.1.13, distribution-scope=public, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', 
'/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, summary=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-collectd-container, org.opencontainers.image.created=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., vcs-type=git, config_id=tripleo_step3, tcib_managed=true, name=rhosp-rhel9/openstack-collectd, release=1766032510, url=https://www.redhat.com, build-date=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20260112.1)
Feb 23 08:20:23 np0005626463.localdomain podman[81453]: 2026-02-23 08:20:23.924343403 +0000 UTC m=+0.103308991 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, io.openshift.expose-services=, batch=17.1_20260112.1, managed_by=tripleo_ansible, url=https://www.redhat.com, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, name=rhosp-rhel9/openstack-collectd, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-collectd-container, container_name=collectd, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, build-date=2026-01-12T22:10:15Z, vcs-type=git, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:10:15Z, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, summary=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.5)
Feb 23 08:20:23 np0005626463.localdomain systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully.
Feb 23 08:20:26 np0005626463.localdomain sudo[81488]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wzupooynqvbbudgiwujbbnuiqmhatpkp ; /usr/bin/python3
Feb 23 08:20:26 np0005626463.localdomain sudo[81488]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 23 08:20:26 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.
Feb 23 08:20:26 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.
Feb 23 08:20:26 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.
Feb 23 08:20:26 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.
Feb 23 08:20:26 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.
Feb 23 08:20:26 np0005626463.localdomain podman[81493]: 2026-02-23 08:20:26.585113906 +0000 UTC m=+0.095905585 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, release=1766032510, build-date=2026-01-12T23:07:30Z, distribution-scope=public, container_name=ceilometer_agent_ipmi, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-ipmi-container, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, config_id=tripleo_step4, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, name=rhosp-rhel9/openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vendor=Red Hat, Inc., batch=17.1_20260112.1, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-type=git, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com)
Feb 23 08:20:26 np0005626463.localdomain podman[81491]: 2026-02-23 08:20:26.637541975 +0000 UTC m=+0.155368259 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, name=rhosp-rhel9/openstack-iscsid, batch=17.1_20260112.1, vcs-ref=705339545363fec600102567c4e923938e0f43b3, org.opencontainers.image.created=2026-01-12T22:34:43Z, com.redhat.component=openstack-iscsid-container, vcs-type=git, container_name=iscsid, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2026-01-12T22:34:43Z, cpe=cpe:/a:redhat:openstack:17.1::el9, 
summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, config_id=tripleo_step3, distribution-scope=public, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., architecture=x86_64, url=https://www.redhat.com, managed_by=tripleo_ansible)
Feb 23 08:20:26 np0005626463.localdomain podman[81491]: 2026-02-23 08:20:26.646078008 +0000 UTC m=+0.163904282 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, name=rhosp-rhel9/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, org.opencontainers.image.created=2026-01-12T22:34:43Z, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, architecture=x86_64, maintainer=OpenStack TripleO Team, container_name=iscsid, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, vcs-type=git, version=17.1.13, build-date=2026-01-12T22:34:43Z, io.openshift.expose-services=, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=705339545363fec600102567c4e923938e0f43b3, release=1766032510, tcib_managed=true, io.buildah.version=1.41.5)
Feb 23 08:20:26 np0005626463.localdomain systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully.
Feb 23 08:20:26 np0005626463.localdomain python3[81490]: ansible-ansible.legacy.command Invoked with _raw_params=subscription-manager repos --disable rhceph-7-tools-for-rhel-9-x86_64-rpms _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 08:20:26 np0005626463.localdomain podman[81494]: 2026-02-23 08:20:26.68794483 +0000 UTC m=+0.195269717 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, build-date=2026-01-12T22:10:15Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-cron-container, version=17.1.13, url=https://www.redhat.com, vcs-type=git, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, io.buildah.version=1.41.5, vendor=Red Hat, Inc., tcib_managed=true, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-cron, description=Red Hat OpenStack Platform 
17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, architecture=x86_64, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:10:15Z, batch=17.1_20260112.1, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 23 08:20:26 np0005626463.localdomain podman[81500]: 2026-02-23 08:20:26.737682704 +0000 UTC m=+0.243610857 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, architecture=x86_64, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, container_name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, vendor=Red Hat, Inc., managed_by=tripleo_ansible, build-date=2026-01-12T23:32:04Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, config_id=tripleo_step5, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:32:04Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, url=https://www.redhat.com, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=)
Feb 23 08:20:26 np0005626463.localdomain podman[81500]: 2026-02-23 08:20:26.796816728 +0000 UTC m=+0.302744941 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, io.buildah.version=1.41.5, batch=17.1_20260112.1, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, release=1766032510, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, config_id=tripleo_step5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 
'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']})
Feb 23 08:20:26 np0005626463.localdomain systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Deactivated successfully.
Feb 23 08:20:26 np0005626463.localdomain podman[81493]: 2026-02-23 08:20:26.817736829 +0000 UTC m=+0.328528478 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, vcs-type=git, build-date=2026-01-12T23:07:30Z, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, url=https://www.redhat.com, batch=17.1_20260112.1, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_ipmi, io.buildah.version=1.41.5, com.redhat.component=openstack-ceilometer-ipmi-container, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, release=1766032510, vendor=Red Hat, Inc., architecture=x86_64, managed_by=tripleo_ansible, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, tcib_managed=true, version=17.1.13)
Feb 23 08:20:26 np0005626463.localdomain systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully.
Feb 23 08:20:26 np0005626463.localdomain podman[81492]: 2026-02-23 08:20:26.799186624 +0000 UTC m=+0.314088764 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, io.buildah.version=1.41.5, tcib_managed=true, 
container_name=ceilometer_agent_compute, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, version=17.1.13, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2026-01-12T23:07:47Z, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64, vendor=Red Hat, Inc., vcs-type=git, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-compute-container)
Feb 23 08:20:26 np0005626463.localdomain podman[81494]: 2026-02-23 08:20:26.870174408 +0000 UTC m=+0.377499315 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, name=rhosp-rhel9/openstack-cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, distribution-scope=public, container_name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, build-date=2026-01-12T22:10:15Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, tcib_managed=true, vcs-type=git, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.buildah.version=1.41.5, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, com.redhat.component=openstack-cron-container)
Feb 23 08:20:26 np0005626463.localdomain systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully.
Feb 23 08:20:26 np0005626463.localdomain podman[81492]: 2026-02-23 08:20:26.883906438 +0000 UTC m=+0.398808588 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, url=https://www.redhat.com, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20260112.1, container_name=ceilometer_agent_compute, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-compute-container, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:07:47Z, org.opencontainers.image.created=2026-01-12T23:07:47Z, distribution-scope=public, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, managed_by=tripleo_ansible, io.openshift.expose-services=, release=1766032510, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, description=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Feb 23 08:20:26 np0005626463.localdomain systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully.
Feb 23 08:20:29 np0005626463.localdomain rhsm-service[6591]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Feb 23 08:20:29 np0005626463.localdomain rhsm-service[6591]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Feb 23 08:20:31 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.
Feb 23 08:20:31 np0005626463.localdomain systemd[1]: tmp-crun.oLuVa7.mount: Deactivated successfully.
Feb 23 08:20:31 np0005626463.localdomain podman[81736]: 2026-02-23 08:20:31.918932874 +0000 UTC m=+0.095018516 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, version=17.1.13, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, name=rhosp-rhel9/openstack-nova-compute, release=1766032510, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, batch=17.1_20260112.1, vendor=Red Hat, Inc., build-date=2026-01-12T23:32:04Z, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 23 08:20:32 np0005626463.localdomain podman[81736]: 2026-02-23 08:20:32.341359928 +0000 UTC m=+0.517445560 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_migration_target, maintainer=OpenStack TripleO Team, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, architecture=x86_64, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-nova-compute, vcs-type=git, io.buildah.version=1.41.5, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, version=17.1.13, build-date=2026-01-12T23:32:04Z, config_id=tripleo_step4, release=1766032510, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 23 08:20:32 np0005626463.localdomain systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully.
Feb 23 08:20:33 np0005626463.localdomain sudo[81488]: pam_unix(sudo:session): session closed for user root
Feb 23 08:20:33 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.
Feb 23 08:20:33 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.
Feb 23 08:20:33 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.
Feb 23 08:20:33 np0005626463.localdomain podman[81816]: 2026-02-23 08:20:33.919032258 +0000 UTC m=+0.090882133 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, build-date=2026-01-12T22:36:40Z, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.component=openstack-ovn-controller-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vendor=Red Hat, Inc., batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, io.buildah.version=1.41.5, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:36:40Z, name=rhosp-rhel9/openstack-ovn-controller, tcib_managed=true, architecture=x86_64, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, 
url=https://www.redhat.com, container_name=ovn_controller, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller)
Feb 23 08:20:33 np0005626463.localdomain podman[81817]: 2026-02-23 08:20:33.967753799 +0000 UTC m=+0.136687951 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, config_id=tripleo_step4, vcs-type=git, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, version=17.1.13, container_name=ovn_metadata_agent, tcib_managed=true, batch=17.1_20260112.1, build-date=2026-01-12T22:56:19Z, org.opencontainers.image.created=2026-01-12T22:56:19Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, distribution-scope=public, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., url=https://www.redhat.com, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 23 08:20:33 np0005626463.localdomain podman[81816]: 2026-02-23 08:20:33.974454093 +0000 UTC m=+0.146303978 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, version=17.1.13, distribution-scope=public, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, architecture=x86_64, io.buildah.version=1.41.5, build-date=2026-01-12T22:36:40Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, vendor=Red Hat, Inc., vcs-type=git, com.redhat.component=openstack-ovn-controller-container, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, url=https://www.redhat.com, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-ovn-controller, container_name=ovn_controller, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, summary=Red Hat OpenStack Platform 17.1 ovn-controller)
Feb 23 08:20:33 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Deactivated successfully.
Feb 23 08:20:34 np0005626463.localdomain podman[81818]: 2026-02-23 08:20:34.033975741 +0000 UTC m=+0.198792170 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, release=1766032510, config_id=tripleo_step1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.buildah.version=1.41.5, architecture=x86_64, managed_by=tripleo_ansible, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp-rhel9/openstack-qdrouterd, vendor=Red Hat, Inc., batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-qdrouterd-container, build-date=2026-01-12T22:10:14Z, distribution-scope=public, container_name=metrics_qdr, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:10:14Z)
Feb 23 08:20:34 np0005626463.localdomain podman[81817]: 2026-02-23 08:20:34.045580133 +0000 UTC m=+0.214514285 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, release=1766032510, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:56:19Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, vcs-type=git, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.41.5, distribution-scope=public, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 23 08:20:34 np0005626463.localdomain systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Deactivated successfully.
Feb 23 08:20:34 np0005626463.localdomain podman[81818]: 2026-02-23 08:20:34.23555144 +0000 UTC m=+0.400367859 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=metrics_qdr, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-qdrouterd-container, architecture=x86_64, managed_by=tripleo_ansible, url=https://www.redhat.com, version=17.1.13, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, tcib_managed=true, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, build-date=2026-01-12T22:10:14Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.buildah.version=1.41.5)
Feb 23 08:20:34 np0005626463.localdomain systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully.
Feb 23 08:20:35 np0005626463.localdomain sshd[81892]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:20:42 np0005626463.localdomain sshd[81894]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:20:42 np0005626463.localdomain sshd[81894]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:20:43 np0005626463.localdomain sshd[81892]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:20:45 np0005626463.localdomain sudo[81896]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 08:20:45 np0005626463.localdomain sudo[81896]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:20:45 np0005626463.localdomain sudo[81896]: pam_unix(sudo:session): session closed for user root
Feb 23 08:20:45 np0005626463.localdomain sudo[81911]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 23 08:20:45 np0005626463.localdomain sudo[81911]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:20:46 np0005626463.localdomain sudo[81911]: pam_unix(sudo:session): session closed for user root
Feb 23 08:20:46 np0005626463.localdomain sudo[81957]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 08:20:46 np0005626463.localdomain sudo[81957]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:20:46 np0005626463.localdomain sudo[81957]: pam_unix(sudo:session): session closed for user root
Feb 23 08:20:54 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.
Feb 23 08:20:54 np0005626463.localdomain podman[81972]: 2026-02-23 08:20:54.937378432 +0000 UTC m=+0.097345520 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, vendor=Red Hat, Inc., batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:15Z, architecture=x86_64, release=1766032510, distribution-scope=public, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-collectd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, version=17.1.13, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']})
Feb 23 08:20:54 np0005626463.localdomain podman[81972]: 2026-02-23 08:20:54.953135799 +0000 UTC m=+0.113102867 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, vcs-type=git, vendor=Red Hat, Inc., com.redhat.component=openstack-collectd-container, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, config_id=tripleo_step3, managed_by=tripleo_ansible, io.openshift.expose-services=, name=rhosp-rhel9/openstack-collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, version=17.1.13, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, release=1766032510, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, org.opencontainers.image.created=2026-01-12T22:10:15Z, container_name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, build-date=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd)
Feb 23 08:20:54 np0005626463.localdomain systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully.
Feb 23 08:20:56 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.
Feb 23 08:20:56 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.
Feb 23 08:20:56 np0005626463.localdomain podman[81993]: 2026-02-23 08:20:56.914133721 +0000 UTC m=+0.088500602 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=705339545363fec600102567c4e923938e0f43b3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, url=https://www.redhat.com, build-date=2026-01-12T22:34:43Z, com.redhat.component=openstack-iscsid-container, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, release=1766032510, architecture=x86_64, container_name=iscsid, name=rhosp-rhel9/openstack-iscsid, vendor=Red Hat, Inc., batch=17.1_20260112.1, io.openshift.expose-services=, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat 
OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid)
Feb 23 08:20:56 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.
Feb 23 08:20:56 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.
Feb 23 08:20:56 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.
Feb 23 08:20:56 np0005626463.localdomain podman[81993]: 2026-02-23 08:20:56.951950684 +0000 UTC m=+0.126317585 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.buildah.version=1.41.5, build-date=2026-01-12T22:34:43Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, name=rhosp-rhel9/openstack-iscsid, container_name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, com.redhat.component=openstack-iscsid-container, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, version=17.1.13, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, vcs-ref=705339545363fec600102567c4e923938e0f43b3, config_id=tripleo_step3, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, io.openshift.expose-services=, release=1766032510)
Feb 23 08:20:56 np0005626463.localdomain systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully.
Feb 23 08:20:56 np0005626463.localdomain podman[81994]: 2026-02-23 08:20:56.968481244 +0000 UTC m=+0.137338471 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', 
'/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, org.opencontainers.image.created=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, name=rhosp-rhel9/openstack-nova-compute, batch=17.1_20260112.1, container_name=nova_compute, managed_by=tripleo_ansible, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, io.buildah.version=1.41.5, tcib_managed=true, vendor=Red Hat, Inc., config_id=tripleo_step5, build-date=2026-01-12T23:32:04Z, release=1766032510, version=17.1.13, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 23 08:20:57 np0005626463.localdomain podman[81994]: 2026-02-23 08:20:57.003303903 +0000 UTC m=+0.172161120 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, com.redhat.component=openstack-nova-compute-container, name=rhosp-rhel9/openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_compute, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, managed_by=tripleo_ansible, build-date=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.13, release=1766032510, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, vcs-type=git, vendor=Red Hat, Inc., architecture=x86_64, config_id=tripleo_step5)
Feb 23 08:20:57 np0005626463.localdomain systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Deactivated successfully.
Feb 23 08:20:57 np0005626463.localdomain systemd[1]: tmp-crun.DUYInV.mount: Deactivated successfully.
Feb 23 08:20:57 np0005626463.localdomain podman[82025]: 2026-02-23 08:20:57.092244557 +0000 UTC m=+0.147453860 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, tcib_managed=true, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.5, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step4, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, name=rhosp-rhel9/openstack-cron, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-type=git, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 
17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:10:15Z, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, build-date=2026-01-12T22:10:15Z, container_name=logrotate_crond, managed_by=tripleo_ansible, url=https://www.redhat.com, com.redhat.component=openstack-cron-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=)
Feb 23 08:20:57 np0005626463.localdomain podman[82025]: 2026-02-23 08:20:57.126704673 +0000 UTC m=+0.181913946 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, org.opencontainers.image.created=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=logrotate_crond, summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20260112.1, io.openshift.expose-services=, name=rhosp-rhel9/openstack-cron, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step4, 
vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, build-date=2026-01-12T22:10:15Z, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, com.redhat.component=openstack-cron-container, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, url=https://www.redhat.com, version=17.1.13, managed_by=tripleo_ansible, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 23 08:20:57 np0005626463.localdomain systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully.
Feb 23 08:20:57 np0005626463.localdomain podman[82023]: 2026-02-23 08:20:57.152698413 +0000 UTC m=+0.213604266 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, version=17.1.13, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, release=1766032510, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, 
name=rhosp-rhel9/openstack-ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, container_name=ceilometer_agent_compute, build-date=2026-01-12T23:07:47Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.buildah.version=1.41.5, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 23 08:20:57 np0005626463.localdomain podman[82023]: 2026-02-23 08:20:57.188382488 +0000 UTC m=+0.249288371 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64, version=17.1.13, managed_by=tripleo_ansible, config_id=tripleo_step4, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, build-date=2026-01-12T23:07:47Z, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, vendor=Red Hat, Inc., container_name=ceilometer_agent_compute, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 23 08:20:57 np0005626463.localdomain systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully.
Feb 23 08:20:57 np0005626463.localdomain podman[82024]: 2026-02-23 08:20:57.191734134 +0000 UTC m=+0.249187228 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, org.opencontainers.image.created=2026-01-12T23:07:30Z, vendor=Red Hat, Inc., vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, name=rhosp-rhel9/openstack-ceilometer-ipmi, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, config_id=tripleo_step4, container_name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.openshift.expose-services=, batch=17.1_20260112.1, com.redhat.component=openstack-ceilometer-ipmi-container, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, build-date=2026-01-12T23:07:30Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, architecture=x86_64, vcs-type=git)
Feb 23 08:20:57 np0005626463.localdomain podman[82024]: 2026-02-23 08:20:57.277380075 +0000 UTC m=+0.334833149 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:30Z, name=rhosp-rhel9/openstack-ceilometer-ipmi, io.buildah.version=1.41.5, io.openshift.expose-services=, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, version=17.1.13, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible)
Feb 23 08:20:57 np0005626463.localdomain systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully.
Feb 23 08:21:01 np0005626463.localdomain python3[82120]: ansible-ansible.builtin.slurp Invoked with path=/home/zuul/ansible_hostname src=/home/zuul/ansible_hostname
Feb 23 08:21:02 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.
Feb 23 08:21:02 np0005626463.localdomain podman[82121]: 2026-02-23 08:21:02.918133129 +0000 UTC m=+0.086523309 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=nova_migration_target, version=17.1.13, distribution-scope=public, url=https://www.redhat.com, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, build-date=2026-01-12T23:32:04Z, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T23:32:04Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, name=rhosp-rhel9/openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 23 08:21:03 np0005626463.localdomain podman[82121]: 2026-02-23 08:21:03.310322785 +0000 UTC m=+0.478712965 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:32:04Z, container_name=nova_migration_target, build-date=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.13, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, url=https://www.redhat.com, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, vcs-type=git, tcib_managed=true, io.buildah.version=1.41.5, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']})
Feb 23 08:21:03 np0005626463.localdomain systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully.
Feb 23 08:21:04 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.
Feb 23 08:21:04 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.
Feb 23 08:21:04 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.
Feb 23 08:21:04 np0005626463.localdomain systemd[1]: tmp-crun.IbM3pm.mount: Deactivated successfully.
Feb 23 08:21:04 np0005626463.localdomain podman[82146]: 2026-02-23 08:21:04.920917259 +0000 UTC m=+0.093770417 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, managed_by=tripleo_ansible, tcib_managed=true, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', 
'/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, version=17.1.13, vcs-type=git, build-date=2026-01-12T22:56:19Z, url=https://www.redhat.com, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.5, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, container_name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, config_id=tripleo_step4, maintainer=OpenStack TripleO Team)
Feb 23 08:21:04 np0005626463.localdomain podman[82145]: 2026-02-23 08:21:04.969960536 +0000 UTC m=+0.144979892 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.openshift.expose-services=, batch=17.1_20260112.1, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vendor=Red Hat, Inc., config_id=tripleo_step4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, managed_by=tripleo_ansible, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, distribution-scope=public, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, com.redhat.component=openstack-ovn-controller-container, architecture=x86_64, io.buildah.version=1.41.5, build-date=2026-01-12T22:36:40Z, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 23 08:21:04 np0005626463.localdomain podman[82145]: 2026-02-23 08:21:04.99927443 +0000 UTC m=+0.174293826 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:36:40Z, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, managed_by=tripleo_ansible, io.openshift.expose-services=, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, maintainer=OpenStack TripleO Team, tcib_managed=true, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., url=https://www.redhat.com, vcs-type=git, 
build-date=2026-01-12T22:36:40Z, distribution-scope=public, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-ovn-controller, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ovn-controller)
Feb 23 08:21:05 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Deactivated successfully.
Feb 23 08:21:05 np0005626463.localdomain podman[82146]: 2026-02-23 08:21:05.050307099 +0000 UTC m=+0.223160257 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.5, managed_by=tripleo_ansible, url=https://www.redhat.com, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, container_name=ovn_metadata_agent, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, vendor=Red Hat, Inc., distribution-scope=public, release=1766032510, vcs-type=git, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, tcib_managed=true, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:56:19Z, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.13, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Feb 23 08:21:05 np0005626463.localdomain systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Deactivated successfully.
Feb 23 08:21:05 np0005626463.localdomain podman[82147]: 2026-02-23 08:21:05.128400041 +0000 UTC m=+0.297552232 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:14Z, name=rhosp-rhel9/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, com.redhat.component=openstack-qdrouterd-container, distribution-scope=public, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, batch=17.1_20260112.1, build-date=2026-01-12T22:10:14Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, container_name=metrics_qdr, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, managed_by=tripleo_ansible, vcs-type=git, url=https://www.redhat.com, io.buildah.version=1.41.5)
Feb 23 08:21:05 np0005626463.localdomain podman[82147]: 2026-02-23 08:21:05.358424464 +0000 UTC m=+0.527576675 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:10:14Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, url=https://www.redhat.com, name=rhosp-rhel9/openstack-qdrouterd, build-date=2026-01-12T22:10:14Z, container_name=metrics_qdr, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd)
Feb 23 08:21:05 np0005626463.localdomain systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully.
Feb 23 08:21:24 np0005626463.localdomain sshd[82241]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:21:24 np0005626463.localdomain sshd[82241]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:21:25 np0005626463.localdomain sshd[82268]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:21:25 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.
Feb 23 08:21:25 np0005626463.localdomain podman[82270]: 2026-02-23 08:21:25.908082505 +0000 UTC m=+0.081652526 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, config_id=tripleo_step3, io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:15Z, container_name=collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, architecture=x86_64, tcib_managed=true, build-date=2026-01-12T22:10:15Z, distribution-scope=public, name=rhosp-rhel9/openstack-collectd, vendor=Red Hat, Inc., io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, version=17.1.13, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 23 08:21:25 np0005626463.localdomain podman[82270]: 2026-02-23 08:21:25.920405653 +0000 UTC m=+0.093975664 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, com.redhat.component=openstack-collectd-container, name=rhosp-rhel9/openstack-collectd, version=17.1.13, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, release=1766032510, vendor=Red Hat, Inc., config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.41.5, 
config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, batch=17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:15Z, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=collectd, build-date=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd)
Feb 23 08:21:25 np0005626463.localdomain systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully.
Feb 23 08:21:26 np0005626463.localdomain sshd[82268]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:21:27 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.
Feb 23 08:21:27 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.
Feb 23 08:21:27 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.
Feb 23 08:21:27 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.
Feb 23 08:21:27 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.
Feb 23 08:21:27 np0005626463.localdomain systemd[1]: tmp-crun.WXLoHE.mount: Deactivated successfully.
Feb 23 08:21:27 np0005626463.localdomain podman[82289]: 2026-02-23 08:21:27.933750955 +0000 UTC m=+0.103590037 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, distribution-scope=public, com.redhat.component=openstack-ceilometer-compute-container, config_id=tripleo_step4, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, 
io.openshift.expose-services=, name=rhosp-rhel9/openstack-ceilometer-compute, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2026-01-12T23:07:47Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_compute, release=1766032510, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, tcib_managed=true, architecture=x86_64, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:07:47Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc.)
Feb 23 08:21:27 np0005626463.localdomain podman[82289]: 2026-02-23 08:21:27.965473655 +0000 UTC m=+0.135312737 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, managed_by=tripleo_ansible, architecture=x86_64, tcib_managed=true, build-date=2026-01-12T23:07:47Z, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, version=17.1.13, container_name=ceilometer_agent_compute, release=1766032510, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, url=https://www.redhat.com, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, distribution-scope=public, com.redhat.component=openstack-ceilometer-compute-container, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06)
Feb 23 08:21:27 np0005626463.localdomain systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully.
Feb 23 08:21:27 np0005626463.localdomain podman[82288]: 2026-02-23 08:21:27.978023602 +0000 UTC m=+0.152324315 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vendor=Red Hat, Inc., container_name=iscsid, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, summary=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, url=https://www.redhat.com, managed_by=tripleo_ansible, io.buildah.version=1.41.5, release=1766032510, description=Red Hat OpenStack Platform 17.1 iscsid, 
io.openshift.expose-services=, tcib_managed=true, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, com.redhat.component=openstack-iscsid-container, maintainer=OpenStack TripleO Team, vcs-type=git, vcs-ref=705339545363fec600102567c4e923938e0f43b3, build-date=2026-01-12T22:34:43Z, batch=17.1_20260112.1, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step3)
Feb 23 08:21:28 np0005626463.localdomain podman[82288]: 2026-02-23 08:21:28.016458193 +0000 UTC m=+0.190758846 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 iscsid, build-date=2026-01-12T22:34:43Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, release=1766032510, url=https://www.redhat.com, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, 
batch=17.1_20260112.1, vcs-type=git, container_name=iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step3, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-iscsid, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13, distribution-scope=public, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid)
Feb 23 08:21:28 np0005626463.localdomain systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully.
Feb 23 08:21:28 np0005626463.localdomain podman[82297]: 2026-02-23 08:21:28.039246312 +0000 UTC m=+0.200990779 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:32:04Z, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, build-date=2026-01-12T23:32:04Z, maintainer=OpenStack TripleO Team, release=1766032510, config_id=tripleo_step5, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., container_name=nova_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, vcs-type=git)
Feb 23 08:21:28 np0005626463.localdomain podman[82291]: 2026-02-23 08:21:28.085066436 +0000 UTC m=+0.249738895 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, vcs-type=git, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, build-date=2026-01-12T22:10:15Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, name=rhosp-rhel9/openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=logrotate_crond, distribution-scope=public, io.buildah.version=1.41.5, release=1766032510, org.opencontainers.image.created=2026-01-12T22:10:15Z, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, vendor=Red Hat, Inc., architecture=x86_64, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, com.redhat.component=openstack-cron-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']})
Feb 23 08:21:28 np0005626463.localdomain podman[82291]: 2026-02-23 08:21:28.096171577 +0000 UTC m=+0.260844036 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.buildah.version=1.41.5, version=17.1.13, url=https://www.redhat.com, container_name=logrotate_crond, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, vendor=Red Hat, Inc., build-date=2026-01-12T22:10:15Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, org.opencontainers.image.created=2026-01-12T22:10:15Z, name=rhosp-rhel9/openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.component=openstack-cron-container, config_id=tripleo_step4, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, distribution-scope=public)
Feb 23 08:21:28 np0005626463.localdomain systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully.
Feb 23 08:21:28 np0005626463.localdomain podman[82290]: 2026-02-23 08:21:28.193865437 +0000 UTC m=+0.362350547 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13, build-date=2026-01-12T23:07:30Z, url=https://www.redhat.com, name=rhosp-rhel9/openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, tcib_managed=true, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-ipmi-container, batch=17.1_20260112.1, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, maintainer=OpenStack TripleO Team, architecture=x86_64, config_id=tripleo_step4, vcs-type=git, org.opencontainers.image.created=2026-01-12T23:07:30Z)
Feb 23 08:21:28 np0005626463.localdomain podman[82297]: 2026-02-23 08:21:28.206154074 +0000 UTC m=+0.367898591 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, version=17.1.13, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:32:04Z, url=https://www.redhat.com, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step5, vcs-type=git, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, release=1766032510)
Feb 23 08:21:28 np0005626463.localdomain systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Deactivated successfully.
Feb 23 08:21:28 np0005626463.localdomain podman[82290]: 2026-02-23 08:21:28.233512237 +0000 UTC m=+0.401997397 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=ceilometer_agent_ipmi, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.5, url=https://www.redhat.com, name=rhosp-rhel9/openstack-ceilometer-ipmi, version=17.1.13, build-date=2026-01-12T23:07:30Z, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, managed_by=tripleo_ansible, description=Red Hat OpenStack 
Platform 17.1 ceilometer-ipmi, distribution-scope=public, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, architecture=x86_64, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.created=2026-01-12T23:07:30Z)
Feb 23 08:21:28 np0005626463.localdomain systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully.
Feb 23 08:21:33 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.
Feb 23 08:21:33 np0005626463.localdomain systemd[1]: tmp-crun.bV5F5G.mount: Deactivated successfully.
Feb 23 08:21:33 np0005626463.localdomain podman[82404]: 2026-02-23 08:21:33.91922999 +0000 UTC m=+0.091489636 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, name=rhosp-rhel9/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, url=https://www.redhat.com, version=17.1.13, com.redhat.component=openstack-nova-compute-container, batch=17.1_20260112.1, vcs-type=git, vendor=Red Hat, Inc., io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, distribution-scope=public, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, build-date=2026-01-12T23:32:04Z, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4)
Feb 23 08:21:34 np0005626463.localdomain podman[82404]: 2026-02-23 08:21:34.307407519 +0000 UTC m=+0.479667175 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, build-date=2026-01-12T23:32:04Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-nova-compute, distribution-scope=public, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, vcs-type=git, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, tcib_managed=true, org.opencontainers.image.created=2026-01-12T23:32:04Z, config_id=tripleo_step4, container_name=nova_migration_target, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 23 08:21:34 np0005626463.localdomain systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully.
Feb 23 08:21:35 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.
Feb 23 08:21:35 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.
Feb 23 08:21:35 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.
Feb 23 08:21:35 np0005626463.localdomain systemd[1]: tmp-crun.4enAOv.mount: Deactivated successfully.
Feb 23 08:21:35 np0005626463.localdomain podman[82429]: 2026-02-23 08:21:35.933086167 +0000 UTC m=+0.098729354 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vendor=Red Hat, Inc., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, io.buildah.version=1.41.5, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:10:14Z, version=17.1.13, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-qdrouterd, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, distribution-scope=public, container_name=metrics_qdr, config_id=tripleo_step1, maintainer=OpenStack TripleO Team, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container)
Feb 23 08:21:35 np0005626463.localdomain systemd[1]: tmp-crun.LyIpLx.mount: Deactivated successfully.
Feb 23 08:21:35 np0005626463.localdomain podman[82427]: 2026-02-23 08:21:35.977323302 +0000 UTC m=+0.145579711 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, build-date=2026-01-12T22:36:40Z, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, container_name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, url=https://www.redhat.com, io.openshift.expose-services=, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, architecture=x86_64, config_id=tripleo_step4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, 
org.opencontainers.image.created=2026-01-12T22:36:40Z, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, version=17.1.13)
Feb 23 08:21:36 np0005626463.localdomain podman[82427]: 2026-02-23 08:21:36.004256942 +0000 UTC m=+0.172513381 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vendor=Red Hat, Inc., version=17.1.13, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, release=1766032510, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, vcs-type=git, build-date=2026-01-12T22:36:40Z, org.opencontainers.image.created=2026-01-12T22:36:40Z, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.5, architecture=x86_64, 
org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, io.openshift.expose-services=, distribution-scope=public)
Feb 23 08:21:36 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Deactivated successfully.
Feb 23 08:21:36 np0005626463.localdomain podman[82428]: 2026-02-23 08:21:36.024700806 +0000 UTC m=+0.191979284 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, org.opencontainers.image.created=2026-01-12T22:56:19Z, tcib_managed=true, build-date=2026-01-12T22:56:19Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, batch=17.1_20260112.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, url=https://www.redhat.com, io.openshift.expose-services=, version=17.1.13, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, vendor=Red Hat, Inc., io.buildah.version=1.41.5, container_name=ovn_metadata_agent, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 23 08:21:36 np0005626463.localdomain podman[82428]: 2026-02-23 08:21:36.065490193 +0000 UTC m=+0.232768701 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:56:19Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, container_name=ovn_metadata_agent, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, release=1766032510, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, architecture=x86_64, url=https://www.redhat.com, distribution-scope=public, tcib_managed=true, config_id=tripleo_step4, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn)
Feb 23 08:21:36 np0005626463.localdomain systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Deactivated successfully.
Feb 23 08:21:36 np0005626463.localdomain podman[82429]: 2026-02-23 08:21:36.153488067 +0000 UTC m=+0.319131294 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, version=17.1.13, build-date=2026-01-12T22:10:14Z, release=1766032510, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-qdrouterd-container, 
container_name=metrics_qdr, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, io.openshift.expose-services=, io.buildah.version=1.41.5, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, distribution-scope=public, maintainer=OpenStack TripleO Team, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:10:14Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd)
Feb 23 08:21:36 np0005626463.localdomain systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully.
Feb 23 08:21:46 np0005626463.localdomain sudo[82501]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 08:21:46 np0005626463.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Feb 23 08:21:46 np0005626463.localdomain sudo[82501]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:21:46 np0005626463.localdomain sudo[82501]: pam_unix(sudo:session): session closed for user root
Feb 23 08:21:46 np0005626463.localdomain recover_tripleo_nova_virtqemud[82517]: 61982
Feb 23 08:21:46 np0005626463.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Feb 23 08:21:46 np0005626463.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Feb 23 08:21:46 np0005626463.localdomain sudo[82518]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Feb 23 08:21:46 np0005626463.localdomain sudo[82518]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:21:47 np0005626463.localdomain podman[82606]: 2026-02-23 08:21:47.844412647 +0000 UTC m=+0.139842875 container exec fdf07215f0388d0ebc44f1f3744080ba594441e647c300d0dade62ff5beba234 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-crash-np0005626463, io.openshift.expose-services=, vcs-type=git, release=1770267347, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.created=2026-02-09T10:25:24Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, ceph=True, version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, io.buildah.version=1.42.2, GIT_CLEAN=True, distribution-scope=public, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vendor=Red Hat, Inc., GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2026-02-09T10:25:24Z, RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers)
Feb 23 08:21:48 np0005626463.localdomain podman[82606]: 2026-02-23 08:21:48.009786215 +0000 UTC m=+0.305216453 container exec_died fdf07215f0388d0ebc44f1f3744080ba594441e647c300d0dade62ff5beba234 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-crash-np0005626463, version=7, CEPH_POINT_RELEASE=, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.created=2026-02-09T10:25:24Z, io.openshift.tags=rhceph ceph, distribution-scope=public, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-type=git, GIT_BRANCH=main, io.openshift.expose-services=, ceph=True, release=1770267347, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.42.2, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main)
Feb 23 08:21:48 np0005626463.localdomain sudo[82518]: pam_unix(sudo:session): session closed for user root
Feb 23 08:21:48 np0005626463.localdomain sudo[82675]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 08:21:48 np0005626463.localdomain sudo[82675]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:21:48 np0005626463.localdomain sudo[82675]: pam_unix(sudo:session): session closed for user root
Feb 23 08:21:48 np0005626463.localdomain sudo[82690]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 23 08:21:48 np0005626463.localdomain sudo[82690]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:21:49 np0005626463.localdomain sudo[82690]: pam_unix(sudo:session): session closed for user root
Feb 23 08:21:49 np0005626463.localdomain sudo[82736]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 08:21:49 np0005626463.localdomain sudo[82736]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:21:49 np0005626463.localdomain sudo[82736]: pam_unix(sudo:session): session closed for user root
Feb 23 08:21:56 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.
Feb 23 08:21:56 np0005626463.localdomain podman[82751]: 2026-02-23 08:21:56.937245674 +0000 UTC m=+0.099188332 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-collectd, version=17.1.13, tcib_managed=true, batch=17.1_20260112.1, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, vendor=Red Hat, Inc., distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.5, build-date=2026-01-12T22:10:15Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, release=1766032510, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-type=git, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, container_name=collectd, config_id=tripleo_step3, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee)
Feb 23 08:21:56 np0005626463.localdomain podman[82751]: 2026-02-23 08:21:56.949815765 +0000 UTC m=+0.111758403 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.5, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, container_name=collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, com.redhat.component=openstack-collectd-container, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, build-date=2026-01-12T22:10:15Z, batch=17.1_20260112.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, managed_by=tripleo_ansible, io.openshift.expose-services=, url=https://www.redhat.com, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, release=1766032510, version=17.1.13, tcib_managed=true, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z)
Feb 23 08:21:56 np0005626463.localdomain systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully.
Feb 23 08:21:58 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.
Feb 23 08:21:58 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.
Feb 23 08:21:58 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.
Feb 23 08:21:58 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.
Feb 23 08:21:58 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.
Feb 23 08:21:58 np0005626463.localdomain podman[82773]: 2026-02-23 08:21:58.944912958 +0000 UTC m=+0.101784223 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, batch=17.1_20260112.1, vcs-ref=705339545363fec600102567c4e923938e0f43b3, release=1766032510, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, name=rhosp-rhel9/openstack-iscsid, vcs-type=git, distribution-scope=public, managed_by=tripleo_ansible, architecture=x86_64, config_id=tripleo_step3, build-date=2026-01-12T22:34:43Z, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.component=openstack-iscsid-container, container_name=iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, url=https://www.redhat.com, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid)
Feb 23 08:21:59 np0005626463.localdomain systemd[1]: tmp-crun.7jXv00.mount: Deactivated successfully.
Feb 23 08:21:59 np0005626463.localdomain podman[82776]: 2026-02-23 08:21:59.003939169 +0000 UTC m=+0.152126398 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, com.redhat.component=openstack-cron-container, distribution-scope=public, version=17.1.13, description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, container_name=logrotate_crond, release=1766032510, batch=17.1_20260112.1, architecture=x86_64, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:10:15Z, name=rhosp-rhel9/openstack-cron, url=https://www.redhat.com, build-date=2026-01-12T22:10:15Z, managed_by=tripleo_ansible, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, io.openshift.expose-services=)
Feb 23 08:21:59 np0005626463.localdomain podman[82776]: 2026-02-23 08:21:59.041274882 +0000 UTC m=+0.189462051 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-cron-container, config_id=tripleo_step4, release=1766032510, build-date=2026-01-12T22:10:15Z, url=https://www.redhat.com, io.openshift.expose-services=, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, vendor=Red Hat, Inc., batch=17.1_20260112.1, distribution-scope=public, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, name=rhosp-rhel9/openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.buildah.version=1.41.5, container_name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git)
Feb 23 08:21:59 np0005626463.localdomain systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully.
Feb 23 08:21:59 np0005626463.localdomain podman[82782]: 2026-02-23 08:21:59.060852493 +0000 UTC m=+0.202126175 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, name=rhosp-rhel9/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, container_name=nova_compute, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step5, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public, release=1766032510, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': 
['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.5, version=17.1.13, build-date=2026-01-12T23:32:04Z)
Feb 23 08:21:59 np0005626463.localdomain podman[82782]: 2026-02-23 08:21:59.098528006 +0000 UTC m=+0.239801738 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, release=1766032510, vendor=Red Hat, Inc., vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.component=openstack-nova-compute-container, version=17.1.13, architecture=x86_64, vcs-type=git, container_name=nova_compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, io.openshift.expose-services=, batch=17.1_20260112.1, config_id=tripleo_step5, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, distribution-scope=public, build-date=2026-01-12T23:32:04Z)
Feb 23 08:21:59 np0005626463.localdomain systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Deactivated successfully.
Feb 23 08:21:59 np0005626463.localdomain podman[82774]: 2026-02-23 08:21:59.121375986 +0000 UTC m=+0.276925786 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:07:47Z, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, vcs-type=git, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, url=https://www.redhat.com, config_id=tripleo_step4, distribution-scope=public, container_name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, version=17.1.13, com.redhat.component=openstack-ceilometer-compute-container, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64, build-date=2026-01-12T23:07:47Z)
Feb 23 08:21:59 np0005626463.localdomain podman[82775]: 2026-02-23 08:21:59.162281765 +0000 UTC m=+0.314839863 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1766032510, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, architecture=x86_64, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, build-date=2026-01-12T23:07:30Z, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.13, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-ipmi-container, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, tcib_managed=true, io.openshift.expose-services=, managed_by=tripleo_ansible, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.created=2026-01-12T23:07:30Z)
Feb 23 08:21:59 np0005626463.localdomain podman[82773]: 2026-02-23 08:21:59.18344789 +0000 UTC m=+0.340319165 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, name=rhosp-rhel9/openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=iscsid, managed_by=tripleo_ansible, config_id=tripleo_step3, vcs-ref=705339545363fec600102567c4e923938e0f43b3, release=1766032510, vendor=Red Hat, Inc., architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-iscsid-container, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, distribution-scope=public, io.buildah.version=1.41.5, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:34:43Z, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, url=https://www.redhat.com, tcib_managed=true, version=17.1.13, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']})
Feb 23 08:21:59 np0005626463.localdomain systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully.
Feb 23 08:21:59 np0005626463.localdomain podman[82775]: 2026-02-23 08:21:59.219308703 +0000 UTC m=+0.371866831 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.buildah.version=1.41.5, com.redhat.component=openstack-ceilometer-ipmi-container, architecture=x86_64, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:30Z, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, release=1766032510, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, build-date=2026-01-12T23:07:30Z, io.openshift.expose-services=, container_name=ceilometer_agent_ipmi, distribution-scope=public, managed_by=tripleo_ansible)
Feb 23 08:21:59 np0005626463.localdomain systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully.
Feb 23 08:21:59 np0005626463.localdomain podman[82774]: 2026-02-23 08:21:59.233730144 +0000 UTC m=+0.389279924 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, release=1766032510, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=ceilometer_agent_compute, vendor=Red Hat, Inc., vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, name=rhosp-rhel9/openstack-ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, managed_by=tripleo_ansible, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T23:07:47Z, config_id=tripleo_step4, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.openshift.expose-services=, maintainer=OpenStack TripleO Team)
Feb 23 08:21:59 np0005626463.localdomain systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully.
Feb 23 08:22:01 np0005626463.localdomain sshd[79891]: Received disconnect from 38.102.83.114 port 34676:11: disconnected by user
Feb 23 08:22:01 np0005626463.localdomain sshd[79891]: Disconnected from user zuul 38.102.83.114 port 34676
Feb 23 08:22:01 np0005626463.localdomain sshd[79837]: pam_unix(sshd:session): session closed for user zuul
Feb 23 08:22:01 np0005626463.localdomain systemd[1]: session-34.scope: Deactivated successfully.
Feb 23 08:22:01 np0005626463.localdomain systemd[1]: session-34.scope: Consumed 20.336s CPU time.
Feb 23 08:22:01 np0005626463.localdomain systemd-logind[759]: Session 34 logged out. Waiting for processes to exit.
Feb 23 08:22:01 np0005626463.localdomain systemd-logind[759]: Removed session 34.
Feb 23 08:22:04 np0005626463.localdomain sshd[82887]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:22:04 np0005626463.localdomain sshd[82887]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:22:04 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.
Feb 23 08:22:05 np0005626463.localdomain podman[82889]: 2026-02-23 08:22:05.06798026 +0000 UTC m=+0.061838899 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, vcs-type=git, batch=17.1_20260112.1, container_name=nova_migration_target, build-date=2026-01-12T23:32:04Z, org.opencontainers.image.created=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, io.openshift.expose-services=, release=1766032510, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, version=17.1.13, name=rhosp-rhel9/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']})
Feb 23 08:22:05 np0005626463.localdomain podman[82889]: 2026-02-23 08:22:05.588919185 +0000 UTC m=+0.582777854 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, release=1766032510, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, version=17.1.13, io.buildah.version=1.41.5, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, distribution-scope=public, config_id=tripleo_step4, name=rhosp-rhel9/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, architecture=x86_64, build-date=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git)
Feb 23 08:22:05 np0005626463.localdomain systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully.
Feb 23 08:22:06 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.
Feb 23 08:22:06 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.
Feb 23 08:22:06 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.
Feb 23 08:22:06 np0005626463.localdomain podman[82912]: 2026-02-23 08:22:06.900974426 +0000 UTC m=+0.078062569 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:36:40Z, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, batch=17.1_20260112.1, url=https://www.redhat.com, build-date=2026-01-12T22:36:40Z, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller)
Feb 23 08:22:06 np0005626463.localdomain podman[82913]: 2026-02-23 08:22:06.957252713 +0000 UTC m=+0.131480629 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, version=17.1.13, vcs-type=git, config_id=tripleo_step4, build-date=2026-01-12T22:56:19Z, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:56:19Z, vendor=Red Hat, Inc., managed_by=tripleo_ansible, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, io.buildah.version=1.41.5, container_name=ovn_metadata_agent, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, url=https://www.redhat.com, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, maintainer=OpenStack TripleO Team)
Feb 23 08:22:07 np0005626463.localdomain podman[82914]: 2026-02-23 08:22:07.017545669 +0000 UTC m=+0.188860146 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, vendor=Red Hat, Inc., container_name=metrics_qdr, url=https://www.redhat.com, config_id=tripleo_step1, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, vcs-type=git, name=rhosp-rhel9/openstack-qdrouterd, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd)
Feb 23 08:22:07 np0005626463.localdomain podman[82912]: 2026-02-23 08:22:07.023560563 +0000 UTC m=+0.200648746 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:36:40Z, name=rhosp-rhel9/openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, container_name=ovn_controller, io.openshift.expose-services=, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, distribution-scope=public, vcs-type=git, managed_by=tripleo_ansible, release=1766032510, architecture=x86_64, build-date=2026-01-12T22:36:40Z)
Feb 23 08:22:07 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Deactivated successfully.
Feb 23 08:22:07 np0005626463.localdomain podman[82913]: 2026-02-23 08:22:07.038326693 +0000 UTC m=+0.212554599 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vendor=Red Hat, Inc., vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, distribution-scope=public, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, tcib_managed=true, io.buildah.version=1.41.5, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, build-date=2026-01-12T22:56:19Z, release=1766032510, org.opencontainers.image.created=2026-01-12T22:56:19Z, container_name=ovn_metadata_agent, version=17.1.13, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team)
Feb 23 08:22:07 np0005626463.localdomain systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Deactivated successfully.
Feb 23 08:22:07 np0005626463.localdomain podman[82914]: 2026-02-23 08:22:07.246283545 +0000 UTC m=+0.417598022 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, distribution-scope=public, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, version=17.1.13, tcib_managed=true, config_id=tripleo_step1, com.redhat.component=openstack-qdrouterd-container, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:10:14Z, vcs-type=git, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, 
container_name=metrics_qdr, url=https://www.redhat.com, managed_by=tripleo_ansible, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:10:14Z, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 23 08:22:07 np0005626463.localdomain systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully.
Feb 23 08:22:07 np0005626463.localdomain systemd[1]: tmp-crun.0y7OLF.mount: Deactivated successfully.
Feb 23 08:22:13 np0005626463.localdomain sshd[82990]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:22:15 np0005626463.localdomain sshd[82990]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:22:27 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.
Feb 23 08:22:27 np0005626463.localdomain podman[83037]: 2026-02-23 08:22:27.911003188 +0000 UTC m=+0.082654774 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, config_id=tripleo_step3, managed_by=tripleo_ansible, vcs-type=git, io.openshift.expose-services=, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, batch=17.1_20260112.1, release=1766032510, org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.component=openstack-collectd-container, name=rhosp-rhel9/openstack-collectd, distribution-scope=public, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', 
'/sys/fs/cgroup:/sys/fs/cgroup:ro']}, summary=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., build-date=2026-01-12T22:10:15Z, container_name=collectd, url=https://www.redhat.com)
Feb 23 08:22:27 np0005626463.localdomain podman[83037]: 2026-02-23 08:22:27.923156558 +0000 UTC m=+0.094808144 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, batch=17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, vendor=Red Hat, Inc., config_id=tripleo_step3, io.buildah.version=1.41.5, version=17.1.13, build-date=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-collectd-container, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.expose-services=, release=1766032510)
Feb 23 08:22:27 np0005626463.localdomain systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully.
Feb 23 08:22:29 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.
Feb 23 08:22:29 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.
Feb 23 08:22:29 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.
Feb 23 08:22:29 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.
Feb 23 08:22:29 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.
Feb 23 08:22:29 np0005626463.localdomain podman[83058]: 2026-02-23 08:22:29.928139021 +0000 UTC m=+0.096998384 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.buildah.version=1.41.5, tcib_managed=true, vcs-type=git, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, 
maintainer=OpenStack TripleO Team, release=1766032510, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, url=https://www.redhat.com, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T23:07:47Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, name=rhosp-rhel9/openstack-ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, architecture=x86_64, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute)
Feb 23 08:22:29 np0005626463.localdomain systemd[1]: tmp-crun.Rg73MD.mount: Deactivated successfully.
Feb 23 08:22:29 np0005626463.localdomain podman[83057]: 2026-02-23 08:22:29.97638694 +0000 UTC m=+0.145777347 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, url=https://www.redhat.com, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, build-date=2026-01-12T22:34:43Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:34:43Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_id=tripleo_step3, managed_by=tripleo_ansible, container_name=iscsid, release=1766032510, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.component=openstack-iscsid-container, architecture=x86_64, distribution-scope=public, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3)
Feb 23 08:22:29 np0005626463.localdomain podman[83058]: 2026-02-23 08:22:29.986278058 +0000 UTC m=+0.155137351 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, batch=17.1_20260112.1, com.redhat.component=openstack-ceilometer-compute-container, version=17.1.13, architecture=x86_64, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., io.openshift.expose-services=, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, build-date=2026-01-12T23:07:47Z, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, managed_by=tripleo_ansible, distribution-scope=public, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp-rhel9/openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team)
Feb 23 08:22:29 np0005626463.localdomain podman[83071]: 2026-02-23 08:22:29.942937802 +0000 UTC m=+0.098594176 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:32:04Z, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, container_name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, vcs-type=git, com.redhat.component=openstack-nova-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', 
'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, release=1766032510, config_id=tripleo_step5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public)
Feb 23 08:22:29 np0005626463.localdomain systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully.
Feb 23 08:22:29 np0005626463.localdomain podman[83057]: 2026-02-23 08:22:29.99925077 +0000 UTC m=+0.168641187 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, distribution-scope=public, name=rhosp-rhel9/openstack-iscsid, batch=17.1_20260112.1, container_name=iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, tcib_managed=true, com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, config_id=tripleo_step3, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.created=2026-01-12T22:34:43Z, build-date=2026-01-12T22:34:43Z, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 iscsid)
Feb 23 08:22:30 np0005626463.localdomain systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully.
Feb 23 08:22:30 np0005626463.localdomain podman[83071]: 2026-02-23 08:22:30.023977661 +0000 UTC m=+0.179634015 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, tcib_managed=true, vendor=Red Hat, Inc., vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_compute, name=rhosp-rhel9/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, org.opencontainers.image.created=2026-01-12T23:32:04Z, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.component=openstack-nova-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.5, io.openshift.expose-services=, version=17.1.13, batch=17.1_20260112.1, architecture=x86_64, vcs-type=git, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step5, summary=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 23 08:22:30 np0005626463.localdomain systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Deactivated successfully.
Feb 23 08:22:30 np0005626463.localdomain podman[83060]: 2026-02-23 08:22:30.09434852 +0000 UTC m=+0.253870019 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, batch=17.1_20260112.1, container_name=logrotate_crond, config_id=tripleo_step4, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:10:15Z, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, 
konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:10:15Z, tcib_managed=true, url=https://www.redhat.com, managed_by=tripleo_ansible, version=17.1.13, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, name=rhosp-rhel9/openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 23 08:22:30 np0005626463.localdomain podman[83060]: 2026-02-23 08:22:30.107184899 +0000 UTC m=+0.266706368 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, build-date=2026-01-12T22:10:15Z, org.opencontainers.image.created=2026-01-12T22:10:15Z, container_name=logrotate_crond, release=1766032510, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-cron, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-cron-container, url=https://www.redhat.com, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, batch=17.1_20260112.1, io.buildah.version=1.41.5, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc.)
Feb 23 08:22:30 np0005626463.localdomain systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully.
Feb 23 08:22:30 np0005626463.localdomain podman[83059]: 2026-02-23 08:22:30.004387399 +0000 UTC m=+0.166278312 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, tcib_managed=true, config_id=tripleo_step4, vendor=Red Hat, Inc., io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, container_name=ceilometer_agent_ipmi, build-date=2026-01-12T23:07:30Z, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, distribution-scope=public, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-ceilometer-ipmi, release=1766032510)
Feb 23 08:22:30 np0005626463.localdomain podman[83059]: 2026-02-23 08:22:30.19533144 +0000 UTC m=+0.357222383 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, build-date=2026-01-12T23:07:30Z, com.redhat.component=openstack-ceilometer-ipmi-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, tcib_managed=true, architecture=x86_64, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:30Z, distribution-scope=public, release=1766032510, name=rhosp-rhel9/openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, version=17.1.13, vcs-type=git, managed_by=tripleo_ansible, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Feb 23 08:22:30 np0005626463.localdomain systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully.
Feb 23 08:22:30 np0005626463.localdomain systemd[1]: tmp-crun.lRKrUR.mount: Deactivated successfully.
Feb 23 08:22:35 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.
Feb 23 08:22:35 np0005626463.localdomain systemd[1]: tmp-crun.9kG0e1.mount: Deactivated successfully.
Feb 23 08:22:35 np0005626463.localdomain podman[83173]: 2026-02-23 08:22:35.929464959 +0000 UTC m=+0.102168702 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, batch=17.1_20260112.1, distribution-scope=public, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, url=https://www.redhat.com, managed_by=tripleo_ansible, release=1766032510, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-compute, container_name=nova_migration_target)
Feb 23 08:22:36 np0005626463.localdomain podman[83173]: 2026-02-23 08:22:36.287006711 +0000 UTC m=+0.459710444 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.buildah.version=1.41.5, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2026-01-12T23:32:04Z, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, name=rhosp-rhel9/openstack-nova-compute, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 
nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, container_name=nova_migration_target, architecture=x86_64, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T23:32:04Z, version=17.1.13, com.redhat.component=openstack-nova-compute-container, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, managed_by=tripleo_ansible, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com)
Feb 23 08:22:36 np0005626463.localdomain systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully.
Feb 23 08:22:37 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.
Feb 23 08:22:37 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.
Feb 23 08:22:37 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.
Feb 23 08:22:37 np0005626463.localdomain podman[83197]: 2026-02-23 08:22:37.914181663 +0000 UTC m=+0.085476731 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, io.openshift.expose-services=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, name=rhosp-rhel9/openstack-ovn-controller, build-date=2026-01-12T22:36:40Z, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, batch=17.1_20260112.1, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, url=https://www.redhat.com, config_id=tripleo_step4, 
vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, distribution-scope=public, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, container_name=ovn_controller, tcib_managed=true, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c)
Feb 23 08:22:37 np0005626463.localdomain podman[83197]: 2026-02-23 08:22:37.945710008 +0000 UTC m=+0.117005126 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, container_name=ovn_controller, release=1766032510, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, batch=17.1_20260112.1, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_id=tripleo_step4, name=rhosp-rhel9/openstack-ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, build-date=2026-01-12T22:36:40Z, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, 
vendor=Red Hat, Inc., distribution-scope=public, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.41.5, io.openshift.expose-services=)
Feb 23 08:22:37 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Deactivated successfully.
Feb 23 08:22:37 np0005626463.localdomain podman[83199]: 2026-02-23 08:22:37.938249335 +0000 UTC m=+0.102452910 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, container_name=metrics_qdr, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:10:14Z, vcs-type=git, name=rhosp-rhel9/openstack-qdrouterd, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step1, io.buildah.version=1.41.5, 
release=1766032510, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:10:14Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd)
Feb 23 08:22:38 np0005626463.localdomain podman[83198]: 2026-02-23 08:22:38.019051798 +0000 UTC m=+0.186817760 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack TripleO Team, release=1766032510, vendor=Red Hat, Inc., io.openshift.expose-services=, build-date=2026-01-12T22:56:19Z, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:56:19Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, managed_by=tripleo_ansible, batch=17.1_20260112.1, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64)
Feb 23 08:22:38 np0005626463.localdomain podman[83198]: 2026-02-23 08:22:38.065373345 +0000 UTC m=+0.233139277 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, release=1766032510, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, distribution-scope=public, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, url=https://www.redhat.com, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:56:19Z, batch=17.1_20260112.1, container_name=ovn_metadata_agent, version=17.1.13, build-date=2026-01-12T22:56:19Z, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git)
Feb 23 08:22:38 np0005626463.localdomain systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Deactivated successfully.
Feb 23 08:22:38 np0005626463.localdomain podman[83199]: 2026-02-23 08:22:38.113328836 +0000 UTC m=+0.277532381 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-type=git, distribution-scope=public, io.openshift.expose-services=, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, release=1766032510, com.redhat.component=openstack-qdrouterd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step1, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, tcib_managed=true, build-date=2026-01-12T22:10:14Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee)
Feb 23 08:22:38 np0005626463.localdomain systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully.
Feb 23 08:22:43 np0005626463.localdomain sshd[83273]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:22:43 np0005626463.localdomain sshd[83273]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:22:49 np0005626463.localdomain sudo[83275]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 08:22:49 np0005626463.localdomain sudo[83275]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:22:49 np0005626463.localdomain sudo[83275]: pam_unix(sudo:session): session closed for user root
Feb 23 08:22:49 np0005626463.localdomain sudo[83290]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 23 08:22:49 np0005626463.localdomain sudo[83290]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:22:50 np0005626463.localdomain sudo[83290]: pam_unix(sudo:session): session closed for user root
Feb 23 08:22:51 np0005626463.localdomain sudo[83336]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 08:22:51 np0005626463.localdomain sudo[83336]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:22:51 np0005626463.localdomain sudo[83336]: pam_unix(sudo:session): session closed for user root
Feb 23 08:22:58 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.
Feb 23 08:22:59 np0005626463.localdomain systemd[1]: tmp-crun.Wl1Wew.mount: Deactivated successfully.
Feb 23 08:22:59 np0005626463.localdomain podman[83351]: 2026-02-23 08:22:59.019790598 +0000 UTC m=+0.099596292 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:10:15Z, release=1766032510, name=rhosp-rhel9/openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_id=tripleo_step3, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 collectd, build-date=2026-01-12T22:10:15Z, description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, container_name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.openshift.expose-services=, distribution-scope=public, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, com.redhat.component=openstack-collectd-container, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, architecture=x86_64)
Feb 23 08:22:59 np0005626463.localdomain podman[83351]: 2026-02-23 08:22:59.057417431 +0000 UTC m=+0.137223085 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, com.redhat.component=openstack-collectd-container, vcs-type=git, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:10:15Z, build-date=2026-01-12T22:10:15Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, release=1766032510, name=rhosp-rhel9/openstack-collectd, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, vendor=Red Hat, Inc., io.buildah.version=1.41.5, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, config_id=tripleo_step3, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible)
Feb 23 08:22:59 np0005626463.localdomain systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully.
Feb 23 08:23:00 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.
Feb 23 08:23:00 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.
Feb 23 08:23:00 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.
Feb 23 08:23:00 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.
Feb 23 08:23:00 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.
Feb 23 08:23:00 np0005626463.localdomain podman[83373]: 2026-02-23 08:23:00.950697298 +0000 UTC m=+0.110622844 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, build-date=2026-01-12T23:07:47Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_compute, release=1766032510, url=https://www.redhat.com, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64, name=rhosp-rhel9/openstack-ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, 
tcib_managed=true, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:07:47Z, com.redhat.component=openstack-ceilometer-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, distribution-scope=public, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=)
Feb 23 08:23:00 np0005626463.localdomain podman[83374]: 2026-02-23 08:23:00.9972465 +0000 UTC m=+0.154217060 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, release=1766032510, config_id=tripleo_step4, url=https://www.redhat.com, build-date=2026-01-12T23:07:30Z, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:07:30Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-ipmi-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., 
name=rhosp-rhel9/openstack-ceilometer-ipmi, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, batch=17.1_20260112.1, managed_by=tripleo_ansible, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, version=17.1.13, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 23 08:23:01 np0005626463.localdomain podman[83372]: 2026-02-23 08:23:01.046995694 +0000 UTC m=+0.210001279 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, tcib_managed=true, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, name=rhosp-rhel9/openstack-iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, com.redhat.component=openstack-iscsid-container, build-date=2026-01-12T22:34:43Z, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, config_id=tripleo_step3, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=705339545363fec600102567c4e923938e0f43b3, vendor=Red Hat, Inc., managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, url=https://www.redhat.com, container_name=iscsid, vcs-type=git, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, version=17.1.13)
Feb 23 08:23:01 np0005626463.localdomain podman[83372]: 2026-02-23 08:23:01.058324714 +0000 UTC m=+0.221330329 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-iscsid, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, release=1766032510, summary=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step3, build-date=2026-01-12T22:34:43Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, container_name=iscsid, com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, vcs-ref=705339545363fec600102567c4e923938e0f43b3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:34:43Z, vcs-type=git, version=17.1.13, io.buildah.version=1.41.5, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 iscsid)
Feb 23 08:23:01 np0005626463.localdomain systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully.
Feb 23 08:23:01 np0005626463.localdomain podman[83381]: 2026-02-23 08:23:01.10894533 +0000 UTC m=+0.259486395 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, release=1766032510, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, version=17.1.13, vendor=Red Hat, Inc., vcs-type=git, config_id=tripleo_step5, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, io.buildah.version=1.41.5, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, container_name=nova_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, build-date=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 23 08:23:01 np0005626463.localdomain podman[83374]: 2026-02-23 08:23:01.112547193 +0000 UTC m=+0.269517753 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, io.buildah.version=1.41.5, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1, com.redhat.component=openstack-ceilometer-ipmi-container, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:07:30Z, vcs-type=git, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1766032510, org.opencontainers.image.created=2026-01-12T23:07:30Z, container_name=ceilometer_agent_ipmi, io.openshift.expose-services=, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13)
Feb 23 08:23:01 np0005626463.localdomain systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully.
Feb 23 08:23:01 np0005626463.localdomain podman[83373]: 2026-02-23 08:23:01.163645201 +0000 UTC m=+0.323570747 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T23:07:47Z, name=rhosp-rhel9/openstack-ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, vcs-type=git, vendor=Red Hat, Inc., tcib_managed=true, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, batch=17.1_20260112.1, distribution-scope=public, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.buildah.version=1.41.5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1766032510, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, url=https://www.redhat.com)
Feb 23 08:23:01 np0005626463.localdomain podman[83381]: 2026-02-23 08:23:01.172019835 +0000 UTC m=+0.322560910 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, managed_by=tripleo_ansible, version=17.1.13, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, architecture=x86_64, build-date=2026-01-12T23:32:04Z, io.buildah.version=1.41.5, vcs-type=git, container_name=nova_compute, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, distribution-scope=public, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': 
['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe)
Feb 23 08:23:01 np0005626463.localdomain systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully.
Feb 23 08:23:01 np0005626463.localdomain systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Deactivated successfully.
Feb 23 08:23:01 np0005626463.localdomain podman[83375]: 2026-02-23 08:23:01.16793515 +0000 UTC m=+0.307357010 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-cron, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, build-date=2026-01-12T22:10:15Z, container_name=logrotate_crond, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, 
description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, release=1766032510, distribution-scope=public, managed_by=tripleo_ansible, com.redhat.component=openstack-cron-container, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5)
Feb 23 08:23:01 np0005626463.localdomain podman[83375]: 2026-02-23 08:23:01.248467082 +0000 UTC m=+0.387888902 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., container_name=logrotate_crond, vcs-type=git, architecture=x86_64, distribution-scope=public, 
org.opencontainers.image.created=2026-01-12T22:10:15Z, name=rhosp-rhel9/openstack-cron, batch=17.1_20260112.1, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 cron, build-date=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, tcib_managed=true, config_id=tripleo_step4, com.redhat.component=openstack-cron-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.openshift.expose-services=, managed_by=tripleo_ansible)
Feb 23 08:23:01 np0005626463.localdomain systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully.
Feb 23 08:23:03 np0005626463.localdomain sshd[83580]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:23:05 np0005626463.localdomain sshd[83580]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:23:06 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.
Feb 23 08:23:06 np0005626463.localdomain podman[83622]: 2026-02-23 08:23:06.903259652 +0000 UTC m=+0.079327503 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, build-date=2026-01-12T23:32:04Z, vcs-type=git, org.opencontainers.image.created=2026-01-12T23:32:04Z, container_name=nova_migration_target, io.openshift.expose-services=, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 23 08:23:07 np0005626463.localdomain podman[83622]: 2026-02-23 08:23:07.32267654 +0000 UTC m=+0.498744461 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, io.openshift.expose-services=, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T23:32:04Z, tcib_managed=true, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, container_name=nova_migration_target, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, 
summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, managed_by=tripleo_ansible, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, build-date=2026-01-12T23:32:04Z, release=1766032510)
Feb 23 08:23:07 np0005626463.localdomain systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully.
Feb 23 08:23:08 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.
Feb 23 08:23:08 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.
Feb 23 08:23:08 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.
Feb 23 08:23:08 np0005626463.localdomain podman[83872]: 2026-02-23 08:23:08.911099661 +0000 UTC m=+0.077190347 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, build-date=2026-01-12T22:56:19Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.expose-services=, container_name=ovn_metadata_agent, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', 
'/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, vcs-type=git, url=https://www.redhat.com, release=1766032510, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, version=17.1.13, distribution-scope=public, tcib_managed=true, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, architecture=x86_64, batch=17.1_20260112.1)
Feb 23 08:23:08 np0005626463.localdomain podman[83873]: 2026-02-23 08:23:08.970286647 +0000 UTC m=+0.133533290 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, managed_by=tripleo_ansible, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, vendor=Red Hat, Inc., container_name=metrics_qdr, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:10:14Z, config_id=tripleo_step1, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, url=https://www.redhat.com, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, build-date=2026-01-12T22:10:14Z, release=1766032510, maintainer=OpenStack TripleO Team, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, distribution-scope=public)
Feb 23 08:23:08 np0005626463.localdomain podman[83872]: 2026-02-23 08:23:08.980266072 +0000 UTC m=+0.146356698 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, release=1766032510, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, build-date=2026-01-12T22:56:19Z, org.opencontainers.image.created=2026-01-12T22:56:19Z, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, tcib_managed=true, vcs-type=git, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, architecture=x86_64, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, managed_by=tripleo_ansible, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, container_name=ovn_metadata_agent, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn)
Feb 23 08:23:08 np0005626463.localdomain systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Deactivated successfully.
Feb 23 08:23:09 np0005626463.localdomain podman[83871]: 2026-02-23 08:23:09.030007926 +0000 UTC m=+0.196403230 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2026-01-12T22:36:40Z, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, version=17.1.13, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-ovn-controller, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, vcs-type=git, 
batch=17.1_20260112.1, container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, release=1766032510, org.opencontainers.image.created=2026-01-12T22:36:40Z, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c)
Feb 23 08:23:09 np0005626463.localdomain podman[83871]: 2026-02-23 08:23:09.078345903 +0000 UTC m=+0.244741227 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, version=17.1.13, managed_by=tripleo_ansible, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, io.buildah.version=1.41.5, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, build-date=2026-01-12T22:36:40Z, batch=17.1_20260112.1, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:36:40Z, url=https://www.redhat.com, architecture=x86_64, container_name=ovn_controller, tcib_managed=true, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, 
com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, name=rhosp-rhel9/openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 23 08:23:09 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Deactivated successfully.
Feb 23 08:23:09 np0005626463.localdomain podman[83873]: 2026-02-23 08:23:09.153172279 +0000 UTC m=+0.316418902 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, version=17.1.13, container_name=metrics_qdr, url=https://www.redhat.com, config_id=tripleo_step1, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', 
'/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, batch=17.1_20260112.1, release=1766032510, architecture=x86_64, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, build-date=2026-01-12T22:10:14Z)
Feb 23 08:23:09 np0005626463.localdomain systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully.
Feb 23 08:23:09 np0005626463.localdomain sudo[83967]:     nova : PWD=/ ; USER=root ; COMMAND=/usr/bin/nova-rootwrap /etc/nova/rootwrap.conf privsep-helper --config-file /etc/nova/nova.conf --privsep_context vif_plug_ovs.privsep.vif_plug --privsep_sock_path /tmp/tmpoappegw_/privsep.sock
Feb 23 08:23:09 np0005626463.localdomain systemd-logind[759]: Existing logind session ID 28 used by new audit session, ignoring.
Feb 23 08:23:09 np0005626463.localdomain systemd[1]: Created slice User Slice of UID 0.
Feb 23 08:23:09 np0005626463.localdomain systemd[1]: Starting User Runtime Directory /run/user/0...
Feb 23 08:23:09 np0005626463.localdomain systemd[1]: Finished User Runtime Directory /run/user/0.
Feb 23 08:23:09 np0005626463.localdomain systemd[1]: Starting User Manager for UID 0...
Feb 23 08:23:09 np0005626463.localdomain systemd[83969]: pam_unix(systemd-user:session): session opened for user root(uid=0) by (uid=0)
Feb 23 08:23:09 np0005626463.localdomain systemd[83969]: Queued start job for default target Main User Target.
Feb 23 08:23:09 np0005626463.localdomain systemd[83969]: Created slice User Application Slice.
Feb 23 08:23:09 np0005626463.localdomain systemd[83969]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Feb 23 08:23:09 np0005626463.localdomain systemd[83969]: Started Daily Cleanup of User's Temporary Directories.
Feb 23 08:23:09 np0005626463.localdomain systemd[83969]: Reached target Paths.
Feb 23 08:23:09 np0005626463.localdomain systemd[83969]: Reached target Timers.
Feb 23 08:23:09 np0005626463.localdomain systemd[83969]: Starting D-Bus User Message Bus Socket...
Feb 23 08:23:09 np0005626463.localdomain systemd[83969]: Starting Create User's Volatile Files and Directories...
Feb 23 08:23:09 np0005626463.localdomain systemd[83969]: Listening on D-Bus User Message Bus Socket.
Feb 23 08:23:09 np0005626463.localdomain systemd[83969]: Finished Create User's Volatile Files and Directories.
Feb 23 08:23:09 np0005626463.localdomain systemd[83969]: Reached target Sockets.
Feb 23 08:23:09 np0005626463.localdomain systemd[83969]: Reached target Basic System.
Feb 23 08:23:09 np0005626463.localdomain systemd[1]: Started User Manager for UID 0.
Feb 23 08:23:09 np0005626463.localdomain systemd[83969]: Reached target Main User Target.
Feb 23 08:23:09 np0005626463.localdomain systemd[83969]: Startup finished in 165ms.
Feb 23 08:23:09 np0005626463.localdomain systemd[1]: Started Session c11 of User root.
Feb 23 08:23:09 np0005626463.localdomain sudo[83967]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42436)
Feb 23 08:23:09 np0005626463.localdomain systemd[1]: tmp-crun.1AQc6g.mount: Deactivated successfully.
Feb 23 08:23:10 np0005626463.localdomain sudo[83967]: pam_unix(sudo:session): session closed for user root
Feb 23 08:23:10 np0005626463.localdomain kernel: tun: Universal TUN/TAP device driver, 1.6
Feb 23 08:23:10 np0005626463.localdomain kernel: device tapa27e5011-20 entered promiscuous mode
Feb 23 08:23:10 np0005626463.localdomain NetworkManager[5974]: <info>  [1771834990.8260] manager: (tapa27e5011-20): new Tun device (/org/freedesktop/NetworkManager/Devices/13)
Feb 23 08:23:10 np0005626463.localdomain systemd-udevd[84004]: Network interface NamePolicy= disabled on kernel command line.
Feb 23 08:23:10 np0005626463.localdomain NetworkManager[5974]: <info>  [1771834990.8493] device (tapa27e5011-20): state change: unmanaged -> unavailable (reason 'connection-assumed', sys-iface-state: 'external')
Feb 23 08:23:10 np0005626463.localdomain NetworkManager[5974]: <info>  [1771834990.8535] device (tapa27e5011-20): state change: unavailable -> disconnected (reason 'none', sys-iface-state: 'external')
Feb 23 08:23:10 np0005626463.localdomain systemd[1]: Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Feb 23 08:23:10 np0005626463.localdomain systemd[1]: Starting Virtual Machine and Container Registration Service...
Feb 23 08:23:10 np0005626463.localdomain systemd[1]: Started Virtual Machine and Container Registration Service.
Feb 23 08:23:10 np0005626463.localdomain systemd-machined[84014]: New machine qemu-1-instance-00000003.
Feb 23 08:23:10 np0005626463.localdomain systemd[1]: Started Virtual Machine qemu-1-instance-00000003.
Feb 23 08:23:11 np0005626463.localdomain NetworkManager[5974]: <info>  [1771834991.1179] manager: (tap9da5b53d-30): new Veth device (/org/freedesktop/NetworkManager/Devices/14)
Feb 23 08:23:11 np0005626463.localdomain systemd-udevd[84003]: Network interface NamePolicy= disabled on kernel command line.
Feb 23 08:23:11 np0005626463.localdomain NetworkManager[5974]: <info>  [1771834991.1767] device (tap9da5b53d-30): carrier: link connected
Feb 23 08:23:11 np0005626463.localdomain kernel: IPv6: ADDRCONF(NETDEV_CHANGE): tap9da5b53d-31: link becomes ready
Feb 23 08:23:11 np0005626463.localdomain kernel: IPv6: ADDRCONF(NETDEV_CHANGE): tap9da5b53d-30: link becomes ready
Feb 23 08:23:11 np0005626463.localdomain kernel: device tap9da5b53d-30 entered promiscuous mode
Feb 23 08:23:12 np0005626463.localdomain systemd[1]: Starting SETroubleshoot daemon for processing new SELinux denial logs...
Feb 23 08:23:12 np0005626463.localdomain systemd[1]: Started SETroubleshoot daemon for processing new SELinux denial logs.
Feb 23 08:23:13 np0005626463.localdomain systemd[1]: Created slice Slice /system/dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged.
Feb 23 08:23:13 np0005626463.localdomain systemd[1]: Started dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service.
Feb 23 08:23:13 np0005626463.localdomain sudo[84125]:  neutron : PWD=/ ; USER=root ; COMMAND=/usr/bin/neutron-rootwrap /etc/neutron/rootwrap.conf ip netns exec ovnmeta-9da5b53d-3184-450f-9a5b-bdba1a6c9f6d haproxy -f /var/lib/neutron/ovn-metadata-proxy/9da5b53d-3184-450f-9a5b-bdba1a6c9f6d.conf
Feb 23 08:23:13 np0005626463.localdomain sudo[84125]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42435)
Feb 23 08:23:13 np0005626463.localdomain podman[84151]: 2026-02-23 08:23:13.828582553 +0000 UTC m=+0.104585439 container create f1c94b6d873a5ea7f04a27d272093dc9c80dc0bedf3bc6f8302f2ec7ba926e2b (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=neutron-haproxy-ovnmeta-9da5b53d-3184-450f-9a5b-bdba1a6c9f6d, io.buildah.version=1.41.5, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, vendor=Red Hat, Inc., url=https://www.redhat.com, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:56:19Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:56:19Z, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, io.openshift.expose-services=, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn)
Feb 23 08:23:13 np0005626463.localdomain podman[84151]: 2026-02-23 08:23:13.778388248 +0000 UTC m=+0.054391174 image pull  registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1
Feb 23 08:23:13 np0005626463.localdomain systemd[1]: Started libpod-conmon-f1c94b6d873a5ea7f04a27d272093dc9c80dc0bedf3bc6f8302f2ec7ba926e2b.scope.
Feb 23 08:23:13 np0005626463.localdomain systemd[1]: tmp-crun.AYFn2b.mount: Deactivated successfully.
Feb 23 08:23:13 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 08:23:13 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f3b968a22d6dac5274c225974669ffbf9fd10e196a31be0e89003b3aedfce825/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 23 08:23:13 np0005626463.localdomain podman[84151]: 2026-02-23 08:23:13.949288874 +0000 UTC m=+0.225291760 container init f1c94b6d873a5ea7f04a27d272093dc9c80dc0bedf3bc6f8302f2ec7ba926e2b (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=neutron-haproxy-ovnmeta-9da5b53d-3184-450f-9a5b-bdba1a6c9f6d, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.openshift.expose-services=, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, architecture=x86_64, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13)
Feb 23 08:23:13 np0005626463.localdomain podman[84151]: 2026-02-23 08:23:13.959536067 +0000 UTC m=+0.235538953 container start f1c94b6d873a5ea7f04a27d272093dc9c80dc0bedf3bc6f8302f2ec7ba926e2b (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=neutron-haproxy-ovnmeta-9da5b53d-3184-450f-9a5b-bdba1a6c9f6d, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, build-date=2026-01-12T22:56:19Z, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, io.buildah.version=1.41.5, io.openshift.expose-services=, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, distribution-scope=public, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn)
Feb 23 08:23:14 np0005626463.localdomain sudo[84125]: pam_unix(sudo:session): session closed for user root
Feb 23 08:23:14 np0005626463.localdomain setroubleshoot[84108]: SELinux is preventing /usr/libexec/qemu-kvm from read access on the file max_map_count. For complete SELinux messages run: sealert -l a176daef-12e3-44f0-9641-bedf749d0981
Feb 23 08:23:14 np0005626463.localdomain setroubleshoot[84108]: SELinux is preventing /usr/libexec/qemu-kvm from read access on the file max_map_count.
                                                                
                                                                *****  Plugin qemu_file_image (98.8 confidence) suggests   *******************
                                                                
                                                                If max_map_count is a virtualization target
                                                                Then you need to change the label on max_map_count'
                                                                Do
                                                                # semanage fcontext -a -t virt_image_t 'max_map_count'
                                                                # restorecon -v 'max_map_count'
                                                                
                                                                *****  Plugin catchall (2.13 confidence) suggests   **************************
                                                                
                                                                If you believe that qemu-kvm should be allowed read access on the max_map_count file by default.
                                                                Then you should report this as a bug.
                                                                You can generate a local policy module to allow this access.
                                                                Do
                                                                allow this access for now by executing:
                                                                # ausearch -c 'qemu-kvm' --raw | audit2allow -M my-qemukvm
                                                                # semodule -X 300 -i my-qemukvm.pp
                                                                
Feb 23 08:23:21 np0005626463.localdomain snmpd[67690]: empty variable list in _query
Feb 23 08:23:21 np0005626463.localdomain snmpd[67690]: empty variable list in _query
Feb 23 08:23:23 np0005626463.localdomain sshd[84177]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:23:23 np0005626463.localdomain systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Deactivated successfully.
Feb 23 08:23:23 np0005626463.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Feb 23 08:23:23 np0005626463.localdomain recover_tripleo_nova_virtqemud[84180]: 61982
Feb 23 08:23:23 np0005626463.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Feb 23 08:23:23 np0005626463.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Feb 23 08:23:23 np0005626463.localdomain sshd[84177]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:23:24 np0005626463.localdomain systemd[1]: setroubleshootd.service: Deactivated successfully.
Feb 23 08:23:29 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.
Feb 23 08:23:29 np0005626463.localdomain podman[84228]: 2026-02-23 08:23:29.925478903 +0000 UTC m=+0.095642700 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, release=1766032510, summary=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, vendor=Red Hat, Inc., com.redhat.component=openstack-collectd-container, build-date=2026-01-12T22:10:15Z, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step3, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.buildah.version=1.41.5, io.openshift.expose-services=, url=https://www.redhat.com, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, name=rhosp-rhel9/openstack-collectd, container_name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64)
Feb 23 08:23:29 np0005626463.localdomain podman[84228]: 2026-02-23 08:23:29.96640764 +0000 UTC m=+0.136571467 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, batch=17.1_20260112.1, container_name=collectd, version=17.1.13, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, distribution-scope=public, vcs-type=git, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, com.redhat.component=openstack-collectd-container, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:10:15Z, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, release=1766032510, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:10:15Z)
Feb 23 08:23:29 np0005626463.localdomain systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully.
Feb 23 08:23:31 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.
Feb 23 08:23:31 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.
Feb 23 08:23:31 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.
Feb 23 08:23:31 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.
Feb 23 08:23:31 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.
Feb 23 08:23:31 np0005626463.localdomain systemd[1]: tmp-crun.Mlj77o.mount: Deactivated successfully.
Feb 23 08:23:31 np0005626463.localdomain podman[84251]: 2026-02-23 08:23:31.929752462 +0000 UTC m=+0.096524842 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, build-date=2026-01-12T23:07:30Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:30Z, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=ceilometer_agent_ipmi, release=1766032510, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, vendor=Red Hat, Inc., managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Feb 23 08:23:32 np0005626463.localdomain podman[84251]: 2026-02-23 08:23:32.215289033 +0000 UTC m=+0.382061424 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T23:07:30Z, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1766032510, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:07:30Z, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_id=tripleo_step4, version=17.1.13, vendor=Red Hat, Inc., io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, batch=17.1_20260112.1, io.openshift.expose-services=, container_name=ceilometer_agent_ipmi, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container)
Feb 23 08:23:32 np0005626463.localdomain podman[84253]: 2026-02-23 08:23:32.218007743 +0000 UTC m=+0.379511828 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, build-date=2026-01-12T23:32:04Z, release=1766032510, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, distribution-scope=public, name=rhosp-rhel9/openstack-nova-compute, batch=17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, tcib_managed=true, architecture=x86_64, version=17.1.13, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step5, container_name=nova_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe)
Feb 23 08:23:32 np0005626463.localdomain podman[84250]: 2026-02-23 08:23:32.223784431 +0000 UTC m=+0.392148792 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, batch=17.1_20260112.1, config_id=tripleo_step4, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-compute-container, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, distribution-scope=public, name=rhosp-rhel9/openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, vcs-type=git, tcib_managed=true, io.buildah.version=1.41.5, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, org.opencontainers.image.created=2026-01-12T23:07:47Z, architecture=x86_64, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2026-01-12T23:07:47Z)
Feb 23 08:23:32 np0005626463.localdomain podman[84253]: 2026-02-23 08:23:32.246996846 +0000 UTC m=+0.408500911 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.openshift.expose-services=, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, batch=17.1_20260112.1, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, config_id=tripleo_step5, architecture=x86_64, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-nova-compute, distribution-scope=public, tcib_managed=true, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z, container_name=nova_compute)
Feb 23 08:23:32 np0005626463.localdomain systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Deactivated successfully.
Feb 23 08:23:32 np0005626463.localdomain podman[84250]: 2026-02-23 08:23:32.285240325 +0000 UTC m=+0.453604666 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, distribution-scope=public, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, build-date=2026-01-12T23:07:47Z, release=1766032510, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, config_id=tripleo_step4, vcs-type=git, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, url=https://www.redhat.com, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, io.buildah.version=1.41.5, tcib_managed=true, maintainer=OpenStack TripleO Team)
Feb 23 08:23:32 np0005626463.localdomain systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully.
Feb 23 08:23:32 np0005626463.localdomain systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully.
Feb 23 08:23:32 np0005626463.localdomain podman[84252]: 2026-02-23 08:23:32.219954603 +0000 UTC m=+0.383389808 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, tcib_managed=true, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, com.redhat.component=openstack-cron-container, org.opencontainers.image.created=2026-01-12T22:10:15Z, name=rhosp-rhel9/openstack-cron, managed_by=tripleo_ansible, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, architecture=x86_64, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, vcs-type=git, vendor=Red Hat, Inc., batch=17.1_20260112.1, distribution-scope=public, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:10:15Z)
Feb 23 08:23:32 np0005626463.localdomain podman[84249]: 2026-02-23 08:23:32.220285762 +0000 UTC m=+0.390002197 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, name=rhosp-rhel9/openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, release=1766032510, description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, 
distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:34:43Z, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=iscsid, version=17.1.13, managed_by=tripleo_ansible, com.redhat.component=openstack-iscsid-container, tcib_managed=true, config_id=tripleo_step3, org.opencontainers.image.created=2026-01-12T22:34:43Z)
Feb 23 08:23:32 np0005626463.localdomain podman[84249]: 2026-02-23 08:23:32.354223451 +0000 UTC m=+0.523939866 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, release=1766032510, vcs-ref=705339545363fec600102567c4e923938e0f43b3, distribution-scope=public, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:34:43Z, version=17.1.13, name=rhosp-rhel9/openstack-iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack 
Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:34:43Z, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, url=https://www.redhat.com, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, vendor=Red Hat, Inc., container_name=iscsid, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, io.buildah.version=1.41.5, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 23 08:23:32 np0005626463.localdomain systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully.
Feb 23 08:23:32 np0005626463.localdomain podman[84252]: 2026-02-23 08:23:32.403313478 +0000 UTC m=+0.566748763 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, name=rhosp-rhel9/openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=logrotate_crond, org.opencontainers.image.created=2026-01-12T22:10:15Z, distribution-scope=public, release=1766032510, build-date=2026-01-12T22:10:15Z, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-cron-container, maintainer=OpenStack TripleO Team, architecture=x86_64)
Feb 23 08:23:32 np0005626463.localdomain systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully.
Feb 23 08:23:34 np0005626463.localdomain haproxy-metadata-proxy-9da5b53d-3184-450f-9a5b-bdba1a6c9f6d[84172]: 192.168.0.12:40768 [23/Feb/2026:08:23:29.582] listener listener/metadata 0/0/0/4559/4559 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Feb 23 08:23:34 np0005626463.localdomain haproxy-metadata-proxy-9da5b53d-3184-450f-9a5b-bdba1a6c9f6d[84172]: 192.168.0.12:40784 [23/Feb/2026:08:23:34.244] listener listener/metadata 0/0/0/16/16 404 281 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys HTTP/1.1"
Feb 23 08:23:34 np0005626463.localdomain haproxy-metadata-proxy-9da5b53d-3184-450f-9a5b-bdba1a6c9f6d[84172]: 192.168.0.12:40786 [23/Feb/2026:08:23:34.299] listener listener/metadata 0/0/0/13/13 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Feb 23 08:23:34 np0005626463.localdomain haproxy-metadata-proxy-9da5b53d-3184-450f-9a5b-bdba1a6c9f6d[84172]: 192.168.0.12:40798 [23/Feb/2026:08:23:34.349] listener listener/metadata 0/0/0/11/11 200 120 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1"
Feb 23 08:23:34 np0005626463.localdomain haproxy-metadata-proxy-9da5b53d-3184-450f-9a5b-bdba1a6c9f6d[84172]: 192.168.0.12:40812 [23/Feb/2026:08:23:34.400] listener listener/metadata 0/0/0/11/11 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-type HTTP/1.1"
Feb 23 08:23:34 np0005626463.localdomain haproxy-metadata-proxy-9da5b53d-3184-450f-9a5b-bdba1a6c9f6d[84172]: 192.168.0.12:40828 [23/Feb/2026:08:23:34.448] listener listener/metadata 0/0/0/11/11 200 132 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1"
Feb 23 08:23:34 np0005626463.localdomain haproxy-metadata-proxy-9da5b53d-3184-450f-9a5b-bdba1a6c9f6d[84172]: 192.168.0.12:40832 [23/Feb/2026:08:23:34.496] listener listener/metadata 0/0/0/11/11 200 134 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1"
Feb 23 08:23:34 np0005626463.localdomain haproxy-metadata-proxy-9da5b53d-3184-450f-9a5b-bdba1a6c9f6d[84172]: 192.168.0.12:40836 [23/Feb/2026:08:23:34.543] listener listener/metadata 0/0/0/9/9 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/hostname HTTP/1.1"
Feb 23 08:23:34 np0005626463.localdomain haproxy-metadata-proxy-9da5b53d-3184-450f-9a5b-bdba1a6c9f6d[84172]: 192.168.0.12:40846 [23/Feb/2026:08:23:34.589] listener listener/metadata 0/0/0/9/9 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-hostname HTTP/1.1"
Feb 23 08:23:34 np0005626463.localdomain haproxy-metadata-proxy-9da5b53d-3184-450f-9a5b-bdba1a6c9f6d[84172]: 192.168.0.12:40850 [23/Feb/2026:08:23:34.636] listener listener/metadata 0/0/0/10/10 404 281 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/user-data HTTP/1.1"
Feb 23 08:23:34 np0005626463.localdomain haproxy-metadata-proxy-9da5b53d-3184-450f-9a5b-bdba1a6c9f6d[84172]: 192.168.0.12:40864 [23/Feb/2026:08:23:34.685] listener listener/metadata 0/0/0/10/10 200 139 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1"
Feb 23 08:23:34 np0005626463.localdomain haproxy-metadata-proxy-9da5b53d-3184-450f-9a5b-bdba1a6c9f6d[84172]: 192.168.0.12:40872 [23/Feb/2026:08:23:34.726] listener listener/metadata 0/0/0/12/12 200 122 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1"
Feb 23 08:23:34 np0005626463.localdomain haproxy-metadata-proxy-9da5b53d-3184-450f-9a5b-bdba1a6c9f6d[84172]: 192.168.0.12:40888 [23/Feb/2026:08:23:34.767] listener listener/metadata 0/0/0/11/11 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/ephemeral0 HTTP/1.1"
Feb 23 08:23:34 np0005626463.localdomain haproxy-metadata-proxy-9da5b53d-3184-450f-9a5b-bdba1a6c9f6d[84172]: 192.168.0.12:40900 [23/Feb/2026:08:23:34.807] listener listener/metadata 0/0/0/9/9 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1"
Feb 23 08:23:34 np0005626463.localdomain haproxy-metadata-proxy-9da5b53d-3184-450f-9a5b-bdba1a6c9f6d[84172]: 192.168.0.12:40906 [23/Feb/2026:08:23:34.856] listener listener/metadata 0/0/0/11/11 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-hostname HTTP/1.1"
Feb 23 08:23:34 np0005626463.localdomain haproxy-metadata-proxy-9da5b53d-3184-450f-9a5b-bdba1a6c9f6d[84172]: 192.168.0.12:40918 [23/Feb/2026:08:23:34.907] listener listener/metadata 0/0/0/10/10 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1"
Feb 23 08:23:37 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.
Feb 23 08:23:37 np0005626463.localdomain systemd[1]: tmp-crun.aVNruL.mount: Deactivated successfully.
Feb 23 08:23:37 np0005626463.localdomain podman[84366]: 2026-02-23 08:23:37.929238639 +0000 UTC m=+0.091068074 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, batch=17.1_20260112.1, distribution-scope=public, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/lib/nova:/var/lib/nova:shared']}, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, managed_by=tripleo_ansible, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-nova-compute, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T23:32:04Z, vcs-type=git)
Feb 23 08:23:38 np0005626463.localdomain podman[84366]: 2026-02-23 08:23:38.313324123 +0000 UTC m=+0.475153568 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, 
managed_by=tripleo_ansible, vcs-type=git, org.opencontainers.image.created=2026-01-12T23:32:04Z, url=https://www.redhat.com, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, container_name=nova_migration_target, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, batch=17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510)
Feb 23 08:23:38 np0005626463.localdomain systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully.
Feb 23 08:23:39 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.
Feb 23 08:23:39 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.
Feb 23 08:23:39 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.
Feb 23 08:23:39 np0005626463.localdomain systemd[1]: tmp-crun.wpvKUk.mount: Deactivated successfully.
Feb 23 08:23:39 np0005626463.localdomain podman[84389]: 2026-02-23 08:23:39.921307255 +0000 UTC m=+0.092132781 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, vendor=Red Hat, Inc., io.buildah.version=1.41.5, com.redhat.component=openstack-ovn-controller-container, description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, build-date=2026-01-12T22:36:40Z, io.openshift.expose-services=, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, 
container_name=ovn_controller, release=1766032510, maintainer=OpenStack TripleO Team, version=17.1.13, architecture=x86_64, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c)
Feb 23 08:23:39 np0005626463.localdomain podman[84390]: 2026-02-23 08:23:39.963356141 +0000 UTC m=+0.131339654 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:56:19Z, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, container_name=ovn_metadata_agent, tcib_managed=true, version=17.1.13, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, batch=17.1_20260112.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, architecture=x86_64, release=1766032510, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z)
Feb 23 08:23:40 np0005626463.localdomain podman[84390]: 2026-02-23 08:23:40.007097942 +0000 UTC m=+0.175081455 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, url=https://www.redhat.com, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, distribution-scope=public, build-date=2026-01-12T22:56:19Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, io.openshift.expose-services=, release=1766032510, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', 
'/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, vcs-type=git, vendor=Red Hat, Inc., config_id=tripleo_step4, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, version=17.1.13)
Feb 23 08:23:40 np0005626463.localdomain systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Deactivated successfully.
Feb 23 08:23:40 np0005626463.localdomain podman[84391]: 2026-02-23 08:23:40.021863139 +0000 UTC m=+0.187612824 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, build-date=2026-01-12T22:10:14Z, url=https://www.redhat.com, container_name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vendor=Red Hat, Inc., 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:10:14Z)
Feb 23 08:23:40 np0005626463.localdomain podman[84389]: 2026-02-23 08:23:40.041859941 +0000 UTC m=+0.212685477 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-ovn-controller, vcs-type=git, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:36:40Z, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2026-01-12T22:36:40Z, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, config_id=tripleo_step4, batch=17.1_20260112.1, io.buildah.version=1.41.5, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, 
io.openshift.expose-services=, distribution-scope=public, com.redhat.component=openstack-ovn-controller-container, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, architecture=x86_64)
Feb 23 08:23:40 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Deactivated successfully.
Feb 23 08:23:40 np0005626463.localdomain podman[84391]: 2026-02-23 08:23:40.184206846 +0000 UTC m=+0.349956511 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.41.5, release=1766032510, container_name=metrics_qdr, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, summary=Red Hat OpenStack 
Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, name=rhosp-rhel9/openstack-qdrouterd, batch=17.1_20260112.1, architecture=x86_64, build-date=2026-01-12T22:10:14Z, org.opencontainers.image.created=2026-01-12T22:10:14Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step1)
Feb 23 08:23:40 np0005626463.localdomain systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully.
Feb 23 08:23:46 np0005626463.localdomain sshd[84464]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:23:48 np0005626463.localdomain sshd[84464]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:23:51 np0005626463.localdomain sudo[84466]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 08:23:51 np0005626463.localdomain sudo[84466]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:23:51 np0005626463.localdomain sudo[84466]: pam_unix(sudo:session): session closed for user root
Feb 23 08:23:51 np0005626463.localdomain sudo[84481]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 23 08:23:51 np0005626463.localdomain sudo[84481]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:23:52 np0005626463.localdomain sudo[84481]: pam_unix(sudo:session): session closed for user root
Feb 23 08:23:52 np0005626463.localdomain sudo[84528]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 08:23:52 np0005626463.localdomain sudo[84528]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:23:52 np0005626463.localdomain sudo[84528]: pam_unix(sudo:session): session closed for user root
Feb 23 08:24:00 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.
Feb 23 08:24:00 np0005626463.localdomain podman[84543]: 2026-02-23 08:24:00.920965878 +0000 UTC m=+0.091761022 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, container_name=collectd, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:10:15Z, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, batch=17.1_20260112.1, architecture=x86_64, vendor=Red Hat, Inc., config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', 
'/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-collectd, url=https://www.redhat.com, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:10:15Z, vcs-type=git, tcib_managed=true, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-collectd-container, config_id=tripleo_step3, distribution-scope=public)
Feb 23 08:24:00 np0005626463.localdomain podman[84543]: 2026-02-23 08:24:00.931990546 +0000 UTC m=+0.102785660 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, distribution-scope=public, io.buildah.version=1.41.5, managed_by=tripleo_ansible, vendor=Red Hat, Inc., url=https://www.redhat.com, container_name=collectd, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, release=1766032510, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-collectd, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, maintainer=OpenStack TripleO Team)
Feb 23 08:24:00 np0005626463.localdomain systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully.
Feb 23 08:24:01 np0005626463.localdomain sshd[84563]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:24:01 np0005626463.localdomain sshd[84563]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:24:02 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.
Feb 23 08:24:02 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.
Feb 23 08:24:02 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.
Feb 23 08:24:02 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.
Feb 23 08:24:02 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.
Feb 23 08:24:02 np0005626463.localdomain systemd[1]: tmp-crun.mPJ1Lx.mount: Deactivated successfully.
Feb 23 08:24:02 np0005626463.localdomain podman[84567]: 2026-02-23 08:24:02.936091692 +0000 UTC m=+0.098522480 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., architecture=x86_64, release=1766032510, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.5, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git, io.openshift.expose-services=, managed_by=tripleo_ansible, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, version=17.1.13, build-date=2026-01-12T23:07:30Z, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi, distribution-scope=public, org.opencontainers.image.created=2026-01-12T23:07:30Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, name=rhosp-rhel9/openstack-ceilometer-ipmi, batch=17.1_20260112.1)
Feb 23 08:24:02 np0005626463.localdomain podman[84566]: 2026-02-23 08:24:02.982549785 +0000 UTC m=+0.147077237 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=ceilometer_agent_compute, io.buildah.version=1.41.5, release=1766032510, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:07:47Z, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vendor=Red Hat, Inc., batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, org.opencontainers.image.created=2026-01-12T23:07:47Z, url=https://www.redhat.com, io.openshift.expose-services=, name=rhosp-rhel9/openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, com.redhat.component=openstack-ceilometer-compute-container, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06)
Feb 23 08:24:03 np0005626463.localdomain podman[84565]: 2026-02-23 08:24:03.027607685 +0000 UTC m=+0.192957513 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.openshift.expose-services=, release=1766032510, name=rhosp-rhel9/openstack-iscsid, tcib_managed=true, com.redhat.component=openstack-iscsid-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, architecture=x86_64, vcs-ref=705339545363fec600102567c4e923938e0f43b3, summary=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, 
konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, vcs-type=git, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, org.opencontainers.image.created=2026-01-12T22:34:43Z, vendor=Red Hat, Inc., build-date=2026-01-12T22:34:43Z, distribution-scope=public, version=17.1.13, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3)
Feb 23 08:24:03 np0005626463.localdomain podman[84566]: 2026-02-23 08:24:03.042227773 +0000 UTC m=+0.206755195 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, release=1766032510, com.redhat.component=openstack-ceilometer-compute-container, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, build-date=2026-01-12T23:07:47Z, distribution-scope=public, tcib_managed=true, architecture=x86_64, version=17.1.13, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_id=tripleo_step4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:07:47Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com)
Feb 23 08:24:03 np0005626463.localdomain podman[84567]: 2026-02-23 08:24:03.050968422 +0000 UTC m=+0.213399250 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:07:30Z, distribution-scope=public, container_name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.buildah.version=1.41.5, build-date=2026-01-12T23:07:30Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, com.redhat.component=openstack-ceilometer-ipmi-container, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, tcib_managed=true, managed_by=tripleo_ansible)
Feb 23 08:24:03 np0005626463.localdomain systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully.
Feb 23 08:24:03 np0005626463.localdomain systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully.
Feb 23 08:24:03 np0005626463.localdomain podman[84568]: 2026-02-23 08:24:03.092620067 +0000 UTC m=+0.248838824 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.openshift.expose-services=, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 cron, build-date=2026-01-12T22:10:15Z, distribution-scope=public, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, vendor=Red Hat, Inc., version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, org.opencontainers.image.created=2026-01-12T22:10:15Z, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, container_name=logrotate_crond, name=rhosp-rhel9/openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, tcib_managed=true, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 23 08:24:03 np0005626463.localdomain podman[84568]: 2026-02-23 08:24:03.104250974 +0000 UTC m=+0.260469721 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, name=rhosp-rhel9/openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., com.redhat.component=openstack-cron-container, managed_by=tripleo_ansible, architecture=x86_64, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, 
build-date=2026-01-12T22:10:15Z, distribution-scope=public, io.buildah.version=1.41.5, container_name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, tcib_managed=true, vcs-type=git)
Feb 23 08:24:03 np0005626463.localdomain podman[84565]: 2026-02-23 08:24:03.113624621 +0000 UTC m=+0.278974479 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vendor=Red Hat, Inc., distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.buildah.version=1.41.5, vcs-ref=705339545363fec600102567c4e923938e0f43b3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-type=git, architecture=x86_64, batch=17.1_20260112.1, com.redhat.component=openstack-iscsid-container, tcib_managed=true, managed_by=tripleo_ansible, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, container_name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:34:43Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:34:43Z, name=rhosp-rhel9/openstack-iscsid)
Feb 23 08:24:03 np0005626463.localdomain systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully.
Feb 23 08:24:03 np0005626463.localdomain systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully.
Feb 23 08:24:03 np0005626463.localdomain podman[84569]: 2026-02-23 08:24:03.198587654 +0000 UTC m=+0.351746928 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, build-date=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, config_id=tripleo_step5, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, container_name=nova_compute, name=rhosp-rhel9/openstack-nova-compute, batch=17.1_20260112.1, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true)
Feb 23 08:24:03 np0005626463.localdomain podman[84569]: 2026-02-23 08:24:03.229967976 +0000 UTC m=+0.383127250 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, container_name=nova_compute, version=17.1.13, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, config_id=tripleo_step5, build-date=2026-01-12T23:32:04Z, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, release=1766032510, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:32:04Z, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.openshift.expose-services=, batch=17.1_20260112.1, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 23 08:24:03 np0005626463.localdomain systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Deactivated successfully.
Feb 23 08:24:08 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.
Feb 23 08:24:08 np0005626463.localdomain podman[84677]: 2026-02-23 08:24:08.918195183 +0000 UTC m=+0.091006579 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, version=17.1.13, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, release=1766032510, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, summary=Red Hat OpenStack Platform 
17.1 nova-compute, architecture=x86_64, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, container_name=nova_migration_target, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, vendor=Red Hat, Inc., build-date=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step4, url=https://www.redhat.com)
Feb 23 08:24:09 np0005626463.localdomain podman[84677]: 2026-02-23 08:24:09.30646847 +0000 UTC m=+0.479279866 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vendor=Red Hat, Inc., container_name=nova_migration_target, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, config_id=tripleo_step4, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.5, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.13, url=https://www.redhat.com, io.openshift.expose-services=, build-date=2026-01-12T23:32:04Z, release=1766032510, managed_by=tripleo_ansible)
Feb 23 08:24:09 np0005626463.localdomain systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully.
Feb 23 08:24:10 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.
Feb 23 08:24:10 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.
Feb 23 08:24:10 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.
Feb 23 08:24:10 np0005626463.localdomain systemd[1]: tmp-crun.zpSC71.mount: Deactivated successfully.
Feb 23 08:24:10 np0005626463.localdomain podman[84700]: 2026-02-23 08:24:10.93176851 +0000 UTC m=+0.100404038 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-qdrouterd-container, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, architecture=x86_64, config_id=tripleo_step1, vcs-type=git, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, maintainer=OpenStack TripleO Team, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, managed_by=tripleo_ansible, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:14Z, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, io.buildah.version=1.41.5)
Feb 23 08:24:10 np0005626463.localdomain podman[84699]: 2026-02-23 08:24:10.980596975 +0000 UTC m=+0.152094181 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:56:19Z, url=https://www.redhat.com, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, release=1766032510, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, build-date=2026-01-12T22:56:19Z, version=17.1.13, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, tcib_managed=true, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-type=git, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 
'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, config_id=tripleo_step4, vendor=Red Hat, Inc.)
Feb 23 08:24:11 np0005626463.localdomain podman[84698]: 2026-02-23 08:24:11.029426092 +0000 UTC m=+0.204407414 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, release=1766032510, build-date=2026-01-12T22:36:40Z, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, distribution-scope=public, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., com.redhat.component=openstack-ovn-controller-container, vcs-type=git, architecture=x86_64, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, maintainer=OpenStack TripleO Team, version=17.1.13, 
org.opencontainers.image.created=2026-01-12T22:36:40Z, name=rhosp-rhel9/openstack-ovn-controller, managed_by=tripleo_ansible, url=https://www.redhat.com, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller)
Feb 23 08:24:11 np0005626463.localdomain podman[84698]: 2026-02-23 08:24:11.060395371 +0000 UTC m=+0.235376733 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, version=17.1.13, architecture=x86_64, io.openshift.expose-services=, name=rhosp-rhel9/openstack-ovn-controller, config_id=tripleo_step4, build-date=2026-01-12T22:36:40Z, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:36:40Z, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=ovn_controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 ovn-controller, 
org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1)
Feb 23 08:24:11 np0005626463.localdomain podman[84699]: 2026-02-23 08:24:11.067527509 +0000 UTC m=+0.239024695 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, tcib_managed=true, io.buildah.version=1.41.5, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, release=1766032510, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, container_name=ovn_metadata_agent, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, org.opencontainers.image.created=2026-01-12T22:56:19Z, batch=17.1_20260112.1, io.openshift.expose-services=, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, vcs-type=git, build-date=2026-01-12T22:56:19Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible)
Feb 23 08:24:11 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Deactivated successfully.
Feb 23 08:24:11 np0005626463.localdomain systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Deactivated successfully.
Feb 23 08:24:11 np0005626463.localdomain podman[84700]: 2026-02-23 08:24:11.154368869 +0000 UTC m=+0.323004357 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, vcs-type=git, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, build-date=2026-01-12T22:10:14Z, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:14Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, release=1766032510, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, container_name=metrics_qdr, version=17.1.13, com.redhat.component=openstack-qdrouterd-container, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 23 08:24:11 np0005626463.localdomain systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully.
Feb 23 08:24:11 np0005626463.localdomain systemd[1]: tmp-crun.FWIFAv.mount: Deactivated successfully.
Feb 23 08:24:31 np0005626463.localdomain sshd[84817]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:24:31 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.
Feb 23 08:24:31 np0005626463.localdomain systemd[1]: tmp-crun.X4sZI7.mount: Deactivated successfully.
Feb 23 08:24:31 np0005626463.localdomain podman[84819]: 2026-02-23 08:24:31.924495444 +0000 UTC m=+0.095320391 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, url=https://www.redhat.com, config_id=tripleo_step3, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:10:15Z, distribution-scope=public, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, build-date=2026-01-12T22:10:15Z, batch=17.1_20260112.1, vendor=Red Hat, Inc., config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', 
'/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.component=openstack-collectd-container, managed_by=tripleo_ansible, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, name=rhosp-rhel9/openstack-collectd)
Feb 23 08:24:31 np0005626463.localdomain podman[84819]: 2026-02-23 08:24:31.932730257 +0000 UTC m=+0.103555224 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., release=1766032510, com.redhat.component=openstack-collectd-container, name=rhosp-rhel9/openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:15Z, distribution-scope=public, container_name=collectd, build-date=2026-01-12T22:10:15Z, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, io.openshift.expose-services=, version=17.1.13, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, managed_by=tripleo_ansible, url=https://www.redhat.com, config_id=tripleo_step3)
Feb 23 08:24:31 np0005626463.localdomain systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully.
Feb 23 08:24:31 np0005626463.localdomain sshd[84817]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:24:33 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.
Feb 23 08:24:33 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.
Feb 23 08:24:33 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.
Feb 23 08:24:33 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.
Feb 23 08:24:33 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.
Feb 23 08:24:33 np0005626463.localdomain systemd[1]: tmp-crun.AWbBme.mount: Deactivated successfully.
Feb 23 08:24:33 np0005626463.localdomain podman[84841]: 2026-02-23 08:24:33.938312188 +0000 UTC m=+0.101501531 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, release=1766032510, tcib_managed=true, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T23:07:30Z, version=17.1.13, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-ipmi-container, name=rhosp-rhel9/openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, distribution-scope=public, io.buildah.version=1.41.5, managed_by=tripleo_ansible, vendor=Red Hat, Inc.)
Feb 23 08:24:33 np0005626463.localdomain podman[84842]: 2026-02-23 08:24:33.97655212 +0000 UTC m=+0.136859135 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, tcib_managed=true, distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.5, architecture=x86_64, version=17.1.13, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, build-date=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-cron-container, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:15Z, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, config_id=tripleo_step4, release=1766032510, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vcs-type=git)
Feb 23 08:24:34 np0005626463.localdomain podman[84841]: 2026-02-23 08:24:34.021337342 +0000 UTC m=+0.184526645 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T23:07:30Z, config_id=tripleo_step4, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., architecture=x86_64, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-ceilometer-ipmi, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-ceilometer-ipmi, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1, release=1766032510, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 23 08:24:34 np0005626463.localdomain systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully.
Feb 23 08:24:34 np0005626463.localdomain podman[84844]: 2026-02-23 08:24:34.027321415 +0000 UTC m=+0.183809213 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, name=rhosp-rhel9/openstack-nova-compute, version=17.1.13, release=1766032510, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, config_id=tripleo_step5, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, vcs-type=git, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, org.opencontainers.image.created=2026-01-12T23:32:04Z, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 23 08:24:34 np0005626463.localdomain podman[84839]: 2026-02-23 08:24:34.092931655 +0000 UTC m=+0.260294705 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, maintainer=OpenStack TripleO Team, release=1766032510, summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, name=rhosp-rhel9/openstack-iscsid, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, com.redhat.component=openstack-iscsid-container, container_name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, batch=17.1_20260112.1, io.openshift.expose-services=, build-date=2026-01-12T22:34:43Z, managed_by=tripleo_ansible, config_id=tripleo_step3, org.opencontainers.image.created=2026-01-12T22:34:43Z, version=17.1.13)
Feb 23 08:24:34 np0005626463.localdomain podman[84839]: 2026-02-23 08:24:34.106407808 +0000 UTC m=+0.273770868 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:34:43Z, container_name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1, vcs-type=git, vcs-ref=705339545363fec600102567c4e923938e0f43b3, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step3, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2026-01-12T22:34:43Z, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, version=17.1.13, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, release=1766032510, summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, name=rhosp-rhel9/openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-iscsid-container)
Feb 23 08:24:34 np0005626463.localdomain systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully.
Feb 23 08:24:34 np0005626463.localdomain podman[84840]: 2026-02-23 08:24:33.956035932 +0000 UTC m=+0.121917337 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute, distribution-scope=public, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:07:47Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, release=1766032510, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.5, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:47Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20260112.1, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-ceilometer-compute, version=17.1.13, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true)
Feb 23 08:24:34 np0005626463.localdomain podman[84842]: 2026-02-23 08:24:34.162299791 +0000 UTC m=+0.322606816 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, batch=17.1_20260112.1, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, architecture=x86_64, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, com.redhat.component=openstack-cron-container, container_name=logrotate_crond, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-cron, io.buildah.version=1.41.5, io.openshift.expose-services=, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, vcs-type=git, build-date=2026-01-12T22:10:15Z)
Feb 23 08:24:34 np0005626463.localdomain systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully.
Feb 23 08:24:34 np0005626463.localdomain podman[84840]: 2026-02-23 08:24:34.192444645 +0000 UTC m=+0.358326040 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, com.redhat.component=openstack-ceilometer-compute-container, name=rhosp-rhel9/openstack-ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, 
url=https://www.redhat.com, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20260112.1, build-date=2026-01-12T23:07:47Z, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.5, container_name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, managed_by=tripleo_ansible, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, distribution-scope=public, maintainer=OpenStack TripleO Team, version=17.1.13, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 23 08:24:34 np0005626463.localdomain systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully.
Feb 23 08:24:34 np0005626463.localdomain podman[84844]: 2026-02-23 08:24:34.214196512 +0000 UTC m=+0.370684340 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, batch=17.1_20260112.1, container_name=nova_compute, release=1766032510, com.redhat.component=openstack-nova-compute-container, version=17.1.13, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, io.openshift.expose-services=, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 23 08:24:34 np0005626463.localdomain systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Deactivated successfully.
Feb 23 08:24:39 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.
Feb 23 08:24:39 np0005626463.localdomain podman[84953]: 2026-02-23 08:24:39.909373661 +0000 UTC m=+0.083589693 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, distribution-scope=public, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, name=rhosp-rhel9/openstack-nova-compute, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, managed_by=tripleo_ansible, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, vendor=Red Hat, Inc., url=https://www.redhat.com, build-date=2026-01-12T23:32:04Z, container_name=nova_migration_target, batch=17.1_20260112.1, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T23:32:04Z, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe)
Feb 23 08:24:40 np0005626463.localdomain sshd[84974]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:24:40 np0005626463.localdomain podman[84953]: 2026-02-23 08:24:40.288750074 +0000 UTC m=+0.462966106 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., config_id=tripleo_step4, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, version=17.1.13, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, container_name=nova_migration_target, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, 
release=1766032510, batch=17.1_20260112.1, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T23:32:04Z, name=rhosp-rhel9/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, build-date=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute)
Feb 23 08:24:40 np0005626463.localdomain systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully.
Feb 23 08:24:40 np0005626463.localdomain sshd[84974]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:24:41 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.
Feb 23 08:24:41 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.
Feb 23 08:24:41 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.
Feb 23 08:24:41 np0005626463.localdomain sshd[84977]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:24:41 np0005626463.localdomain systemd[1]: tmp-crun.LdO701.mount: Deactivated successfully.
Feb 23 08:24:41 np0005626463.localdomain podman[84979]: 2026-02-23 08:24:41.908698129 +0000 UTC m=+0.080189257 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, config_id=tripleo_step1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-qdrouterd, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=metrics_qdr, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:14Z, batch=17.1_20260112.1, url=https://www.redhat.com, managed_by=tripleo_ansible, distribution-scope=public, architecture=x86_64, build-date=2026-01-12T22:10:14Z, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']})
Feb 23 08:24:41 np0005626463.localdomain podman[84978]: 2026-02-23 08:24:41.929246569 +0000 UTC m=+0.101021186 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, url=https://www.redhat.com, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, version=17.1.13, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, distribution-scope=public, managed_by=tripleo_ansible, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, build-date=2026-01-12T22:56:19Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, release=1766032510, vcs-type=git, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Feb 23 08:24:42 np0005626463.localdomain podman[84976]: 2026-02-23 08:24:42.013616074 +0000 UTC m=+0.188138775 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, com.redhat.component=openstack-ovn-controller-container, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, name=rhosp-rhel9/openstack-ovn-controller, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, release=1766032510, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.buildah.version=1.41.5, 
version=17.1.13, managed_by=tripleo_ansible, tcib_managed=true, container_name=ovn_controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, vcs-type=git, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, build-date=2026-01-12T22:36:40Z)
Feb 23 08:24:42 np0005626463.localdomain podman[84976]: 2026-02-23 08:24:42.035592868 +0000 UTC m=+0.210115589 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, config_id=tripleo_step4, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, com.redhat.component=openstack-ovn-controller-container, name=rhosp-rhel9/openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, container_name=ovn_controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vendor=Red Hat, Inc., io.buildah.version=1.41.5, batch=17.1_20260112.1, io.openshift.expose-services=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, release=1766032510, build-date=2026-01-12T22:36:40Z, io.k8s.display-name=Red 
Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, version=17.1.13)
Feb 23 08:24:42 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Deactivated successfully.
Feb 23 08:24:42 np0005626463.localdomain podman[84978]: 2026-02-23 08:24:42.092556733 +0000 UTC m=+0.264331410 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.openshift.expose-services=, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1766032510, vendor=Red Hat, Inc., config_id=tripleo_step4, maintainer=OpenStack TripleO Team, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:56:19Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=ovn_metadata_agent, managed_by=tripleo_ansible, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, io.buildah.version=1.41.5, build-date=2026-01-12T22:56:19Z)
Feb 23 08:24:42 np0005626463.localdomain systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Deactivated successfully.
Feb 23 08:24:42 np0005626463.localdomain podman[84979]: 2026-02-23 08:24:42.151311713 +0000 UTC m=+0.322802821 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, build-date=2026-01-12T22:10:14Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, 
org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, release=1766032510, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vendor=Red Hat, Inc., version=17.1.13, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:10:14Z, config_id=tripleo_step1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, url=https://www.redhat.com, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd)
Feb 23 08:24:42 np0005626463.localdomain systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully.
Feb 23 08:24:42 np0005626463.localdomain systemd[1]: tmp-crun.6aubgE.mount: Deactivated successfully.
Feb 23 08:24:43 np0005626463.localdomain sshd[84977]: Invalid user guest from 185.156.73.233 port 43858
Feb 23 08:24:43 np0005626463.localdomain sshd[84977]: Connection closed by invalid user guest 185.156.73.233 port 43858 [preauth]
Feb 23 08:24:53 np0005626463.localdomain sudo[85057]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 08:24:53 np0005626463.localdomain sudo[85057]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:24:53 np0005626463.localdomain sudo[85057]: pam_unix(sudo:session): session closed for user root
Feb 23 08:24:53 np0005626463.localdomain sudo[85072]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 23 08:24:53 np0005626463.localdomain sudo[85072]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:24:53 np0005626463.localdomain sudo[85072]: pam_unix(sudo:session): session closed for user root
Feb 23 08:24:54 np0005626463.localdomain sudo[85119]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 08:24:54 np0005626463.localdomain sudo[85119]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:24:54 np0005626463.localdomain sudo[85119]: pam_unix(sudo:session): session closed for user root
Feb 23 08:25:02 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.
Feb 23 08:25:02 np0005626463.localdomain systemd[1]: tmp-crun.EeqVq1.mount: Deactivated successfully.
Feb 23 08:25:02 np0005626463.localdomain podman[85134]: 2026-02-23 08:25:02.936376231 +0000 UTC m=+0.106693343 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-collectd, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:15Z, version=17.1.13, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, io.buildah.version=1.41.5, architecture=x86_64, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, build-date=2026-01-12T22:10:15Z, vcs-type=git, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-collectd-container, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com)
Feb 23 08:25:02 np0005626463.localdomain podman[85134]: 2026-02-23 08:25:02.949276885 +0000 UTC m=+0.119594027 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, managed_by=tripleo_ansible, vendor=Red Hat, Inc., release=1766032510, url=https://www.redhat.com, build-date=2026-01-12T22:10:15Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, tcib_managed=true, batch=17.1_20260112.1, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-collectd-container, version=17.1.13, io.openshift.expose-services=, config_id=tripleo_step3, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-collectd, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd)
Feb 23 08:25:02 np0005626463.localdomain systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully.
Feb 23 08:25:04 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.
Feb 23 08:25:04 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.
Feb 23 08:25:04 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.
Feb 23 08:25:04 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.
Feb 23 08:25:04 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.
Feb 23 08:25:04 np0005626463.localdomain podman[85158]: 2026-02-23 08:25:04.932948768 +0000 UTC m=+0.087572617 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.buildah.version=1.41.5, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, build-date=2026-01-12T23:32:04Z, release=1766032510, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, batch=17.1_20260112.1, url=https://www.redhat.com, architecture=x86_64, container_name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, org.opencontainers.image.created=2026-01-12T23:32:04Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 23 08:25:04 np0005626463.localdomain podman[85157]: 2026-02-23 08:25:04.914814334 +0000 UTC m=+0.078557221 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.5, tcib_managed=true, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:15Z, summary=Red Hat OpenStack Platform 17.1 cron, 
konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, distribution-scope=public, com.redhat.component=openstack-cron-container, vcs-type=git, build-date=2026-01-12T22:10:15Z, vendor=Red Hat, Inc., container_name=logrotate_crond, managed_by=tripleo_ansible, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 23 08:25:04 np0005626463.localdomain podman[85155]: 2026-02-23 08:25:04.991629511 +0000 UTC m=+0.158844605 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, tcib_managed=true, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.5, managed_by=tripleo_ansible, vcs-type=git, name=rhosp-rhel9/openstack-ceilometer-compute, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=ceilometer_agent_compute, build-date=2026-01-12T23:07:47Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T23:07:47Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=)
Feb 23 08:25:05 np0005626463.localdomain podman[85157]: 2026-02-23 08:25:05.00043252 +0000 UTC m=+0.164175377 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.buildah.version=1.41.5, container_name=logrotate_crond, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, tcib_managed=true, vcs-type=git, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, com.redhat.component=openstack-cron-container, io.openshift.expose-services=, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, build-date=2026-01-12T22:10:15Z, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-cron, version=17.1.13, architecture=x86_64, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 cron, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron)
Feb 23 08:25:05 np0005626463.localdomain systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully.
Feb 23 08:25:05 np0005626463.localdomain podman[85156]: 2026-02-23 08:25:05.025040763 +0000 UTC m=+0.189765191 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, config_id=tripleo_step4, vendor=Red Hat, Inc., release=1766032510, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, description=Red Hat 
OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_ipmi, build-date=2026-01-12T23:07:30Z, com.redhat.component=openstack-ceilometer-ipmi-container, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, io.openshift.expose-services=, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T23:07:30Z, tcib_managed=true, distribution-scope=public)
Feb 23 08:25:05 np0005626463.localdomain podman[85155]: 2026-02-23 08:25:05.073443982 +0000 UTC m=+0.240659076 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-ceilometer-compute, url=https://www.redhat.com, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., 
managed_by=tripleo_ansible, tcib_managed=true, vcs-type=git, org.opencontainers.image.created=2026-01-12T23:07:47Z, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:07:47Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, release=1766032510, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, batch=17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public)
Feb 23 08:25:05 np0005626463.localdomain podman[85156]: 2026-02-23 08:25:05.081546529 +0000 UTC m=+0.246270997 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, build-date=2026-01-12T23:07:30Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, 
container_name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1766032510, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, architecture=x86_64, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, distribution-scope=public, batch=17.1_20260112.1, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 23 08:25:05 np0005626463.localdomain systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully.
Feb 23 08:25:05 np0005626463.localdomain podman[85158]: 2026-02-23 08:25:05.09397874 +0000 UTC m=+0.248602629 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, batch=17.1_20260112.1, distribution-scope=public, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, config_id=tripleo_step5, version=17.1.13, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, build-date=2026-01-12T23:32:04Z, io.openshift.expose-services=, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, name=rhosp-rhel9/openstack-nova-compute, maintainer=OpenStack TripleO Team, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 23 08:25:05 np0005626463.localdomain podman[85154]: 2026-02-23 08:25:05.093089172 +0000 UTC m=+0.260124210 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, vcs-type=git, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, com.redhat.component=openstack-iscsid-container, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step3, batch=17.1_20260112.1, io.buildah.version=1.41.5, container_name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, distribution-scope=public, managed_by=tripleo_ansible, release=1766032510, architecture=x86_64, name=rhosp-rhel9/openstack-iscsid, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:34:43Z)
Feb 23 08:25:05 np0005626463.localdomain systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully.
Feb 23 08:25:05 np0005626463.localdomain systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Deactivated successfully.
Feb 23 08:25:05 np0005626463.localdomain podman[85154]: 2026-02-23 08:25:05.17349888 +0000 UTC m=+0.340533958 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, release=1766032510, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 
iscsid, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, distribution-scope=public, version=17.1.13, architecture=x86_64, com.redhat.component=openstack-iscsid-container, org.opencontainers.image.created=2026-01-12T22:34:43Z, name=rhosp-rhel9/openstack-iscsid, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, build-date=2026-01-12T22:34:43Z, container_name=iscsid, config_id=tripleo_step3)
Feb 23 08:25:05 np0005626463.localdomain systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully.
Feb 23 08:25:05 np0005626463.localdomain systemd[1]: tmp-crun.eW6XgF.mount: Deactivated successfully.
Feb 23 08:25:05 np0005626463.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Feb 23 08:25:05 np0005626463.localdomain recover_tripleo_nova_virtqemud[85270]: 61982
Feb 23 08:25:05 np0005626463.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Feb 23 08:25:05 np0005626463.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Feb 23 08:25:10 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.
Feb 23 08:25:10 np0005626463.localdomain systemd[1]: tmp-crun.KY5cJo.mount: Deactivated successfully.
Feb 23 08:25:10 np0005626463.localdomain podman[85271]: 2026-02-23 08:25:10.917093253 +0000 UTC m=+0.088644069 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, distribution-scope=public, io.buildah.version=1.41.5, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, url=https://www.redhat.com, vendor=Red Hat, Inc., tcib_managed=true, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, config_id=tripleo_step4, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, release=1766032510, container_name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T23:32:04Z)
Feb 23 08:25:11 np0005626463.localdomain podman[85271]: 2026-02-23 08:25:11.300321365 +0000 UTC m=+0.471872241 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, container_name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:32:04Z, config_id=tripleo_step4, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, tcib_managed=true, io.openshift.expose-services=, architecture=x86_64, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, com.redhat.component=openstack-nova-compute-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe)
Feb 23 08:25:11 np0005626463.localdomain systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully.
Feb 23 08:25:12 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.
Feb 23 08:25:12 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.
Feb 23 08:25:12 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.
Feb 23 08:25:12 np0005626463.localdomain systemd[1]: tmp-crun.xVzjyM.mount: Deactivated successfully.
Feb 23 08:25:12 np0005626463.localdomain podman[85296]: 2026-02-23 08:25:12.930485185 +0000 UTC m=+0.091447256 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:10:14Z, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, 
vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, tcib_managed=true, release=1766032510, config_id=tripleo_step1, build-date=2026-01-12T22:10:14Z, distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-qdrouterd, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, vendor=Red Hat, Inc., architecture=x86_64)
Feb 23 08:25:12 np0005626463.localdomain podman[85295]: 2026-02-23 08:25:12.976272294 +0000 UTC m=+0.139310378 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:56:19Z, release=1766032510, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, vendor=Red Hat, Inc., vcs-type=git, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', 
'/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, version=17.1.13, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true)
Feb 23 08:25:13 np0005626463.localdomain podman[85295]: 2026-02-23 08:25:13.025341864 +0000 UTC m=+0.188379958 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, batch=17.1_20260112.1, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, version=17.1.13, config_id=tripleo_step4, distribution-scope=public, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:56:19Z, vcs-type=git, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, container_name=ovn_metadata_agent, build-date=2026-01-12T22:56:19Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 23 08:25:13 np0005626463.localdomain systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Deactivated successfully.
Feb 23 08:25:13 np0005626463.localdomain podman[85294]: 2026-02-23 08:25:13.029521912 +0000 UTC m=+0.194754373 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, url=https://www.redhat.com, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-ovn-controller, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, tcib_managed=true, build-date=2026-01-12T22:36:40Z, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, io.buildah.version=1.41.5, io.openshift.expose-services=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, architecture=x86_64, 
com.redhat.component=openstack-ovn-controller-container, config_id=tripleo_step4, container_name=ovn_controller, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, distribution-scope=public)
Feb 23 08:25:13 np0005626463.localdomain podman[85294]: 2026-02-23 08:25:13.113760637 +0000 UTC m=+0.278993108 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ovn-controller-container, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, url=https://www.redhat.com, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, batch=17.1_20260112.1, version=17.1.13, io.buildah.version=1.41.5, build-date=2026-01-12T22:36:40Z, tcib_managed=true, vcs-type=git, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:36:40Z, architecture=x86_64, release=1766032510, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller)
Feb 23 08:25:13 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Deactivated successfully.
Feb 23 08:25:13 np0005626463.localdomain podman[85296]: 2026-02-23 08:25:13.136311146 +0000 UTC m=+0.297273227 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-qdrouterd, build-date=2026-01-12T22:10:14Z, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, 
config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=metrics_qdr, maintainer=OpenStack TripleO Team, release=1766032510, tcib_managed=true, managed_by=tripleo_ansible, vendor=Red Hat, Inc., architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:10:14Z, url=https://www.redhat.com)
Feb 23 08:25:13 np0005626463.localdomain systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully.
Feb 23 08:25:15 np0005626463.localdomain sshd[85368]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:25:16 np0005626463.localdomain sshd[85368]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:25:19 np0005626463.localdomain sshd[85370]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:25:19 np0005626463.localdomain sshd[85370]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:25:33 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.
Feb 23 08:25:33 np0005626463.localdomain podman[85418]: 2026-02-23 08:25:33.916150491 +0000 UTC m=+0.089145766 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, container_name=collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, vcs-type=git, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-collectd, 
io.buildah.version=1.41.5, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:10:15Z, build-date=2026-01-12T22:10:15Z, description=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, summary=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, com.redhat.component=openstack-collectd-container, io.openshift.expose-services=, config_id=tripleo_step3, vendor=Red Hat, Inc., version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1)
Feb 23 08:25:33 np0005626463.localdomain podman[85418]: 2026-02-23 08:25:33.932271584 +0000 UTC m=+0.105266859 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, config_id=tripleo_step3, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, io.k8s.display-name=Red 
Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., build-date=2026-01-12T22:10:15Z, vcs-type=git, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, com.redhat.component=openstack-collectd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-collectd, version=17.1.13, maintainer=OpenStack TripleO Team, io.openshift.expose-services=)
Feb 23 08:25:33 np0005626463.localdomain systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully.
Feb 23 08:25:35 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.
Feb 23 08:25:35 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.
Feb 23 08:25:35 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.
Feb 23 08:25:35 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.
Feb 23 08:25:35 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.
Feb 23 08:25:35 np0005626463.localdomain systemd[1]: tmp-crun.hcSaya.mount: Deactivated successfully.
Feb 23 08:25:35 np0005626463.localdomain podman[85439]: 2026-02-23 08:25:35.931531365 +0000 UTC m=+0.098518123 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, build-date=2026-01-12T23:07:47Z, name=rhosp-rhel9/openstack-ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_compute, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, com.redhat.component=openstack-ceilometer-compute-container, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, tcib_managed=true, batch=17.1_20260112.1, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, config_id=tripleo_step4, release=1766032510, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.created=2026-01-12T23:07:47Z)
Feb 23 08:25:35 np0005626463.localdomain podman[85439]: 2026-02-23 08:25:35.963730418 +0000 UTC m=+0.130717136 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2026-01-12T23:07:47Z, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_compute, tcib_managed=true, url=https://www.redhat.com, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-ceilometer-compute, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-compute-container, org.opencontainers.image.created=2026-01-12T23:07:47Z, architecture=x86_64, release=1766032510, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 
'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.13)
Feb 23 08:25:35 np0005626463.localdomain systemd[1]: tmp-crun.uHWQP4.mount: Deactivated successfully.
Feb 23 08:25:35 np0005626463.localdomain podman[85448]: 2026-02-23 08:25:35.988777564 +0000 UTC m=+0.144443785 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, container_name=nova_compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, vcs-type=git, config_id=tripleo_step5, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.created=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-compute, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, vendor=Red Hat, Inc., url=https://www.redhat.com)
Feb 23 08:25:35 np0005626463.localdomain systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully.
Feb 23 08:25:36 np0005626463.localdomain podman[85448]: 2026-02-23 08:25:36.021269847 +0000 UTC m=+0.176936088 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, batch=17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, build-date=2026-01-12T23:32:04Z, container_name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.13, config_id=tripleo_step5, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true)
Feb 23 08:25:36 np0005626463.localdomain systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Deactivated successfully.
Feb 23 08:25:36 np0005626463.localdomain podman[85440]: 2026-02-23 08:25:36.045896989 +0000 UTC m=+0.210193925 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, architecture=x86_64, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-ipmi-container, release=1766032510, version=17.1.13, config_id=tripleo_step4, vendor=Red Hat, Inc., io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:07:30Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, tcib_managed=true, distribution-scope=public, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T23:07:30Z, container_name=ceilometer_agent_ipmi)
Feb 23 08:25:36 np0005626463.localdomain podman[85440]: 2026-02-23 08:25:36.084506349 +0000 UTC m=+0.248803295 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, io.buildah.version=1.41.5, com.redhat.component=openstack-ceilometer-ipmi-container, name=rhosp-rhel9/openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, release=1766032510, distribution-scope=public, architecture=x86_64, url=https://www.redhat.com, batch=17.1_20260112.1, build-date=2026-01-12T23:07:30Z, vendor=Red Hat, Inc., managed_by=tripleo_ansible, tcib_managed=true, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Feb 23 08:25:36 np0005626463.localdomain podman[85438]: 2026-02-23 08:25:36.096849987 +0000 UTC m=+0.266394212 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, org.opencontainers.image.created=2026-01-12T22:34:43Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2026-01-12T22:34:43Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, vcs-ref=705339545363fec600102567c4e923938e0f43b3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, name=rhosp-rhel9/openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', 
'/var/lib/iscsi:/var/lib/iscsi:z']}, version=17.1.13, vendor=Red Hat, Inc., vcs-type=git, architecture=x86_64, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-iscsid-container, release=1766032510, config_id=tripleo_step3, container_name=iscsid, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, batch=17.1_20260112.1, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible)
Feb 23 08:25:36 np0005626463.localdomain systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully.
Feb 23 08:25:36 np0005626463.localdomain podman[85441]: 2026-02-23 08:25:36.063708534 +0000 UTC m=+0.222946355 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, vcs-type=git, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, name=rhosp-rhel9/openstack-cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, build-date=2026-01-12T22:10:15Z, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, com.redhat.component=openstack-cron-container, io.openshift.expose-services=, container_name=logrotate_crond, org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20260112.1, config_id=tripleo_step4, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']})
Feb 23 08:25:36 np0005626463.localdomain podman[85438]: 2026-02-23 08:25:36.130266058 +0000 UTC m=+0.299810263 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:34:43Z, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, version=17.1.13, io.buildah.version=1.41.5, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:34:43Z, io.openshift.expose-services=, container_name=iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, com.redhat.component=openstack-iscsid-container, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, architecture=x86_64, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, release=1766032510, batch=17.1_20260112.1)
Feb 23 08:25:36 np0005626463.localdomain systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully.
Feb 23 08:25:36 np0005626463.localdomain podman[85441]: 2026-02-23 08:25:36.147196366 +0000 UTC m=+0.306434147 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:15Z, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, name=rhosp-rhel9/openstack-cron, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, vendor=Red Hat, Inc., container_name=logrotate_crond, summary=Red Hat OpenStack Platform 17.1 cron, build-date=2026-01-12T22:10:15Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, batch=17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, com.redhat.component=openstack-cron-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, architecture=x86_64, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, tcib_managed=true, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron)
Feb 23 08:25:36 np0005626463.localdomain systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully.
Feb 23 08:25:41 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.
Feb 23 08:25:41 np0005626463.localdomain systemd[1]: tmp-crun.m3jrMO.mount: Deactivated successfully.
Feb 23 08:25:41 np0005626463.localdomain podman[85554]: 2026-02-23 08:25:41.921178797 +0000 UTC m=+0.096320745 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, managed_by=tripleo_ansible, vcs-type=git, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, 
com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, container_name=nova_migration_target, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.13, architecture=x86_64, build-date=2026-01-12T23:32:04Z, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, release=1766032510, io.buildah.version=1.41.5, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-nova-compute, config_id=tripleo_step4)
Feb 23 08:25:42 np0005626463.localdomain podman[85554]: 2026-02-23 08:25:42.358245825 +0000 UTC m=+0.533387753 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, container_name=nova_migration_target, org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, build-date=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, distribution-scope=public, version=17.1.13, config_id=tripleo_step4, name=rhosp-rhel9/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 23 08:25:42 np0005626463.localdomain systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully.
Feb 23 08:25:43 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.
Feb 23 08:25:43 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.
Feb 23 08:25:43 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.
Feb 23 08:25:43 np0005626463.localdomain systemd[1]: tmp-crun.Hcmpjm.mount: Deactivated successfully.
Feb 23 08:25:43 np0005626463.localdomain podman[85577]: 2026-02-23 08:25:43.936784987 +0000 UTC m=+0.098031097 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, tcib_managed=true, release=1766032510, summary=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:36:40Z, io.buildah.version=1.41.5, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:36:40Z, version=17.1.13, container_name=ovn_controller, name=rhosp-rhel9/openstack-ovn-controller, 
com.redhat.component=openstack-ovn-controller-container, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, url=https://www.redhat.com, io.openshift.expose-services=, vcs-type=git, batch=17.1_20260112.1, architecture=x86_64)
Feb 23 08:25:43 np0005626463.localdomain podman[85577]: 2026-02-23 08:25:43.963321138 +0000 UTC m=+0.124567278 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, version=17.1.13, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=ovn_controller, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-ovn-controller, release=1766032510, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, 
com.redhat.component=openstack-ovn-controller-container, io.openshift.expose-services=, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:36:40Z, build-date=2026-01-12T22:36:40Z, distribution-scope=public, architecture=x86_64)
Feb 23 08:25:43 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Deactivated successfully.
Feb 23 08:25:43 np0005626463.localdomain podman[85579]: 2026-02-23 08:25:43.984158715 +0000 UTC m=+0.140655350 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, io.openshift.expose-services=, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, distribution-scope=public, url=https://www.redhat.com, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, release=1766032510, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, tcib_managed=true, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:10:14Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-qdrouterd)
Feb 23 08:25:44 np0005626463.localdomain podman[85578]: 2026-02-23 08:25:44.033927186 +0000 UTC m=+0.192876126 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, 
org.opencontainers.image.created=2026-01-12T22:56:19Z, container_name=ovn_metadata_agent, io.openshift.expose-services=, release=1766032510, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., io.buildah.version=1.41.5, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true)
Feb 23 08:25:44 np0005626463.localdomain podman[85578]: 2026-02-23 08:25:44.106313458 +0000 UTC m=+0.265262438 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:56:19Z, distribution-scope=public, config_id=tripleo_step4, version=17.1.13, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, architecture=x86_64, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, io.openshift.expose-services=, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', 
'/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, io.buildah.version=1.41.5, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, batch=17.1_20260112.1, build-date=2026-01-12T22:56:19Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, release=1766032510, tcib_managed=true, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Feb 23 08:25:44 np0005626463.localdomain systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Deactivated successfully.
Feb 23 08:25:44 np0005626463.localdomain podman[85579]: 2026-02-23 08:25:44.189368436 +0000 UTC m=+0.345865041 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-qdrouterd-container, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, batch=17.1_20260112.1, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1766032510, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-qdrouterd, managed_by=tripleo_ansible, 
io.buildah.version=1.41.5, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, config_id=tripleo_step1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:14Z, version=17.1.13, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z)
Feb 23 08:25:44 np0005626463.localdomain systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully.
Feb 23 08:25:54 np0005626463.localdomain sudo[85655]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 08:25:54 np0005626463.localdomain sudo[85655]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:25:54 np0005626463.localdomain sudo[85655]: pam_unix(sudo:session): session closed for user root
Feb 23 08:25:54 np0005626463.localdomain sudo[85670]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 23 08:25:54 np0005626463.localdomain sudo[85670]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:25:55 np0005626463.localdomain sudo[85670]: pam_unix(sudo:session): session closed for user root
Feb 23 08:25:55 np0005626463.localdomain sudo[85717]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 08:25:55 np0005626463.localdomain sudo[85717]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:25:55 np0005626463.localdomain sudo[85717]: pam_unix(sudo:session): session closed for user root
Feb 23 08:25:59 np0005626463.localdomain sshd[85732]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:25:59 np0005626463.localdomain sshd[85732]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:25:59 np0005626463.localdomain sshd[85734]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:26:01 np0005626463.localdomain sshd[85734]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:26:04 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.
Feb 23 08:26:04 np0005626463.localdomain podman[85736]: 2026-02-23 08:26:04.929514138 +0000 UTC m=+0.100764620 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_id=tripleo_step3, architecture=x86_64, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, name=rhosp-rhel9/openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-collectd-container, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20260112.1, container_name=collectd, release=1766032510, build-date=2026-01-12T22:10:15Z, tcib_managed=true, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 23 08:26:04 np0005626463.localdomain podman[85736]: 2026-02-23 08:26:04.945383946 +0000 UTC m=+0.116634438 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, container_name=collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, name=rhosp-rhel9/openstack-collectd, managed_by=tripleo_ansible, url=https://www.redhat.com, distribution-scope=public, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, build-date=2026-01-12T22:10:15Z, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, com.redhat.component=openstack-collectd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:15Z, vendor=Red Hat, Inc., config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, tcib_managed=true, maintainer=OpenStack TripleO Team, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']})
Feb 23 08:26:04 np0005626463.localdomain systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully.
Feb 23 08:26:06 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.
Feb 23 08:26:06 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.
Feb 23 08:26:06 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.
Feb 23 08:26:06 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.
Feb 23 08:26:06 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.
Feb 23 08:26:06 np0005626463.localdomain systemd[1]: tmp-crun.NUL0Xd.mount: Deactivated successfully.
Feb 23 08:26:06 np0005626463.localdomain podman[85756]: 2026-02-23 08:26:06.916546527 +0000 UTC m=+0.088963866 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, managed_by=tripleo_ansible, com.redhat.component=openstack-iscsid-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, name=rhosp-rhel9/openstack-iscsid, release=1766032510, description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, vcs-type=git, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=705339545363fec600102567c4e923938e0f43b3, config_id=tripleo_step3, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1, build-date=2026-01-12T22:34:43Z, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:34:43Z, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']})
Feb 23 08:26:06 np0005626463.localdomain systemd[1]: tmp-crun.o027uw.mount: Deactivated successfully.
Feb 23 08:26:06 np0005626463.localdomain podman[85759]: 2026-02-23 08:26:06.974786559 +0000 UTC m=+0.137503825 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, url=https://www.redhat.com, build-date=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, vcs-type=git, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.buildah.version=1.41.5, config_id=tripleo_step4, architecture=x86_64, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-cron, version=17.1.13, release=1766032510, description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, batch=17.1_20260112.1)
Feb 23 08:26:06 np0005626463.localdomain podman[85756]: 2026-02-23 08:26:06.981431298 +0000 UTC m=+0.153848587 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, version=17.1.13, tcib_managed=true, architecture=x86_64, container_name=iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, distribution-scope=public, release=1766032510, description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, vendor=Red Hat, Inc., 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, build-date=2026-01-12T22:34:43Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, name=rhosp-rhel9/openstack-iscsid, io.openshift.expose-services=, vcs-type=git, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:34:43Z, managed_by=tripleo_ansible, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid)
Feb 23 08:26:06 np0005626463.localdomain systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully.
Feb 23 08:26:07 np0005626463.localdomain podman[85758]: 2026-02-23 08:26:07.034651488 +0000 UTC m=+0.200127367 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, architecture=x86_64, build-date=2026-01-12T23:07:30Z, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vendor=Red Hat, Inc., version=17.1.13, vcs-type=git, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-ipmi-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_id=tripleo_step4, release=1766032510, tcib_managed=true, name=rhosp-rhel9/openstack-ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team)
Feb 23 08:26:07 np0005626463.localdomain podman[85757]: 2026-02-23 08:26:06.935814187 +0000 UTC m=+0.102657078 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, container_name=ceilometer_agent_compute, url=https://www.redhat.com, config_id=tripleo_step4, release=1766032510, tcib_managed=true, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, name=rhosp-rhel9/openstack-ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, version=17.1.13, org.opencontainers.image.created=2026-01-12T23:07:47Z, build-date=2026-01-12T23:07:47Z)
Feb 23 08:26:07 np0005626463.localdomain podman[85765]: 2026-02-23 08:26:06.957692955 +0000 UTC m=+0.115386540 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_compute, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp-rhel9/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, io.buildah.version=1.41.5, architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, build-date=2026-01-12T23:32:04Z, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public)
Feb 23 08:26:07 np0005626463.localdomain podman[85758]: 2026-02-23 08:26:07.067235828 +0000 UTC m=+0.232711657 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.openshift.expose-services=, config_id=tripleo_step4, url=https://www.redhat.com, vcs-type=git, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, vendor=Red Hat, Inc., build-date=2026-01-12T23:07:30Z, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, release=1766032510, distribution-scope=public, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06)
Feb 23 08:26:07 np0005626463.localdomain podman[85757]: 2026-02-23 08:26:07.067448644 +0000 UTC m=+0.234291545 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, release=1766032510, maintainer=OpenStack TripleO Team, vcs-type=git, distribution-scope=public, com.redhat.component=openstack-ceilometer-compute-container, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, build-date=2026-01-12T23:07:47Z, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, name=rhosp-rhel9/openstack-ceilometer-compute, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, tcib_managed=true, io.openshift.expose-services=, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.created=2026-01-12T23:07:47Z, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20260112.1, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, container_name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']})
Feb 23 08:26:07 np0005626463.localdomain systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully.
Feb 23 08:26:07 np0005626463.localdomain podman[85759]: 2026-02-23 08:26:07.087532778 +0000 UTC m=+0.250250034 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, url=https://www.redhat.com, version=17.1.13, description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=logrotate_crond, architecture=x86_64, io.buildah.version=1.41.5, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, release=1766032510, tcib_managed=true, build-date=2026-01-12T22:10:15Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, name=rhosp-rhel9/openstack-cron, io.openshift.expose-services=, com.redhat.component=openstack-cron-container, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:10:15Z, vendor=Red Hat, Inc.)
Feb 23 08:26:07 np0005626463.localdomain systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully.
Feb 23 08:26:07 np0005626463.localdomain systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully.
Feb 23 08:26:07 np0005626463.localdomain podman[85765]: 2026-02-23 08:26:07.141966645 +0000 UTC m=+0.299660310 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, config_id=tripleo_step5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20260112.1, container_name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, version=17.1.13, build-date=2026-01-12T23:32:04Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-nova-compute-container, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, distribution-scope=public, release=1766032510, tcib_managed=true, name=rhosp-rhel9/openstack-nova-compute, maintainer=OpenStack TripleO Team)
Feb 23 08:26:07 np0005626463.localdomain systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Deactivated successfully.
Feb 23 08:26:12 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.
Feb 23 08:26:12 np0005626463.localdomain podman[85873]: 2026-02-23 08:26:12.883578946 +0000 UTC m=+0.062794138 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp-rhel9/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, url=https://www.redhat.com, distribution-scope=public, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:32:04Z, build-date=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, version=17.1.13, vcs-type=git, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, io.openshift.expose-services=, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 23 08:26:13 np0005626463.localdomain podman[85873]: 2026-02-23 08:26:13.305366468 +0000 UTC m=+0.484581720 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, name=rhosp-rhel9/openstack-nova-compute, version=17.1.13, io.buildah.version=1.41.5, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:32:04Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, architecture=x86_64, io.openshift.expose-services=, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, release=1766032510, vcs-type=git, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=nova_migration_target, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe)
Feb 23 08:26:13 np0005626463.localdomain systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully.
Feb 23 08:26:14 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.
Feb 23 08:26:14 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.
Feb 23 08:26:14 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.
Feb 23 08:26:14 np0005626463.localdomain systemd[1]: tmp-crun.idEAVE.mount: Deactivated successfully.
Feb 23 08:26:14 np0005626463.localdomain podman[85896]: 2026-02-23 08:26:14.92460945 +0000 UTC m=+0.097025618 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2026-01-12T22:36:40Z, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.buildah.version=1.41.5, vendor=Red Hat, Inc., com.redhat.component=openstack-ovn-controller-container, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, distribution-scope=public, release=1766032510, architecture=x86_64, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:36:40Z)
Feb 23 08:26:14 np0005626463.localdomain podman[85896]: 2026-02-23 08:26:14.95622541 +0000 UTC m=+0.128641558 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, tcib_managed=true, container_name=ovn_controller, vcs-type=git, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, distribution-scope=public, batch=17.1_20260112.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, architecture=x86_64, managed_by=tripleo_ansible, version=17.1.13, release=1766032510, url=https://www.redhat.com, name=rhosp-rhel9/openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, vendor=Red Hat, Inc., build-date=2026-01-12T22:36:40Z, io.openshift.expose-services=)
Feb 23 08:26:14 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Deactivated successfully.
Feb 23 08:26:14 np0005626463.localdomain podman[85898]: 2026-02-23 08:26:14.97650498 +0000 UTC m=+0.141473644 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, build-date=2026-01-12T22:10:14Z, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, container_name=metrics_qdr, url=https://www.redhat.com, tcib_managed=true, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step1, name=rhosp-rhel9/openstack-qdrouterd, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1)
Feb 23 08:26:15 np0005626463.localdomain podman[85897]: 2026-02-23 08:26:15.025280687 +0000 UTC m=+0.193510009 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, version=17.1.13, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, architecture=x86_64, container_name=ovn_metadata_agent, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., io.buildah.version=1.41.5, io.openshift.expose-services=, config_id=tripleo_step4, release=1766032510, build-date=2026-01-12T22:56:19Z, org.opencontainers.image.created=2026-01-12T22:56:19Z, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 23 08:26:15 np0005626463.localdomain podman[85897]: 2026-02-23 08:26:15.07628891 +0000 UTC m=+0.244518282 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.13, vendor=Red Hat, Inc., org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, tcib_managed=true, io.buildah.version=1.41.5, build-date=2026-01-12T22:56:19Z, vcs-type=git, distribution-scope=public, batch=17.1_20260112.1)
Feb 23 08:26:15 np0005626463.localdomain systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Deactivated successfully.
Feb 23 08:26:15 np0005626463.localdomain podman[85898]: 2026-02-23 08:26:15.219477365 +0000 UTC m=+0.384446019 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:10:14Z, vcs-type=git, build-date=2026-01-12T22:10:14Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, name=rhosp-rhel9/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, vendor=Red Hat, Inc., managed_by=tripleo_ansible, container_name=metrics_qdr, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 qdrouterd)
Feb 23 08:26:15 np0005626463.localdomain systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully.
Feb 23 08:26:35 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.
Feb 23 08:26:35 np0005626463.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Feb 23 08:26:35 np0005626463.localdomain recover_tripleo_nova_virtqemud[86018]: 61982
Feb 23 08:26:35 np0005626463.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Feb 23 08:26:35 np0005626463.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Feb 23 08:26:35 np0005626463.localdomain podman[86015]: 2026-02-23 08:26:35.929185792 +0000 UTC m=+0.094962467 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', 
'/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, architecture=x86_64, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, vendor=Red Hat, Inc., container_name=collectd, com.redhat.component=openstack-collectd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, build-date=2026-01-12T22:10:15Z, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13)
Feb 23 08:26:35 np0005626463.localdomain podman[86015]: 2026-02-23 08:26:35.947598045 +0000 UTC m=+0.113374710 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:15Z, build-date=2026-01-12T22:10:15Z, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, release=1766032510, tcib_managed=true, version=17.1.13, vcs-type=git, managed_by=tripleo_ansible, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, url=https://www.redhat.com, name=rhosp-rhel9/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, container_name=collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-collectd-container, io.buildah.version=1.41.5)
Feb 23 08:26:35 np0005626463.localdomain systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully.
Feb 23 08:26:37 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.
Feb 23 08:26:37 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.
Feb 23 08:26:37 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.
Feb 23 08:26:37 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.
Feb 23 08:26:37 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.
Feb 23 08:26:37 np0005626463.localdomain podman[86039]: 2026-02-23 08:26:37.882662822 +0000 UTC m=+0.055053287 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, url=https://www.redhat.com, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.created=2026-01-12T23:07:30Z, architecture=x86_64, container_name=ceilometer_agent_ipmi, build-date=2026-01-12T23:07:30Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, vcs-type=git, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-ipmi-container, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, batch=17.1_20260112.1, io.buildah.version=1.41.5, release=1766032510, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13, distribution-scope=public)
Feb 23 08:26:37 np0005626463.localdomain podman[86038]: 2026-02-23 08:26:37.927695246 +0000 UTC m=+0.097012448 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T23:07:47Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp-rhel9/openstack-ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, 
vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, url=https://www.redhat.com, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T23:07:47Z, container_name=ceilometer_agent_compute, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, vendor=Red Hat, Inc., release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13)
Feb 23 08:26:37 np0005626463.localdomain podman[86039]: 2026-02-23 08:26:37.951284876 +0000 UTC m=+0.123675381 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, architecture=x86_64, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.openshift.expose-services=, container_name=ceilometer_agent_ipmi, build-date=2026-01-12T23:07:30Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, distribution-scope=public, batch=17.1_20260112.1, 
vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-ipmi-container, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:30Z, version=17.1.13, name=rhosp-rhel9/openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, vendor=Red Hat, Inc., release=1766032510)
Feb 23 08:26:37 np0005626463.localdomain podman[86037]: 2026-02-23 08:26:37.96906599 +0000 UTC m=+0.141299650 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, build-date=2026-01-12T22:34:43Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13, container_name=iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, release=1766032510, io.buildah.version=1.41.5, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.expose-services=, url=https://www.redhat.com, distribution-scope=public, architecture=x86_64, vcs-type=git, config_id=tripleo_step3, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.openshift.tags=rhosp osp openstack 
osp-17.1 openstack-iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 23 08:26:37 np0005626463.localdomain systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully.
Feb 23 08:26:37 np0005626463.localdomain podman[86041]: 2026-02-23 08:26:37.997993359 +0000 UTC m=+0.167777525 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, build-date=2026-01-12T23:32:04Z, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, io.buildah.version=1.41.5, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step5, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, architecture=x86_64, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:32:04Z, container_name=nova_compute, managed_by=tripleo_ansible, tcib_managed=true, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-compute)
Feb 23 08:26:38 np0005626463.localdomain podman[86040]: 2026-02-23 08:26:37.951076729 +0000 UTC m=+0.121604837 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, distribution-scope=public, vendor=Red Hat, Inc., build-date=2026-01-12T22:10:15Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.expose-services=, url=https://www.redhat.com, io.buildah.version=1.41.5, version=17.1.13, com.redhat.component=openstack-cron-container, batch=17.1_20260112.1, release=1766032510, name=rhosp-rhel9/openstack-cron, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, architecture=x86_64, vcs-type=git, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T22:10:15Z)
Feb 23 08:26:38 np0005626463.localdomain podman[86038]: 2026-02-23 08:26:38.017092123 +0000 UTC m=+0.186409305 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, managed_by=tripleo_ansible, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, url=https://www.redhat.com, build-date=2026-01-12T23:07:47Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, release=1766032510, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, batch=17.1_20260112.1, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T23:07:47Z, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, com.redhat.component=openstack-ceilometer-compute-container)
Feb 23 08:26:38 np0005626463.localdomain systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully.
Feb 23 08:26:38 np0005626463.localdomain podman[86040]: 2026-02-23 08:26:38.035380644 +0000 UTC m=+0.205908802 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, container_name=logrotate_crond, distribution-scope=public, architecture=x86_64, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., managed_by=tripleo_ansible, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, url=https://www.redhat.com, version=17.1.13, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:10:15Z, tcib_managed=true, name=rhosp-rhel9/openstack-cron, build-date=2026-01-12T22:10:15Z)
Feb 23 08:26:38 np0005626463.localdomain systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully.
Feb 23 08:26:38 np0005626463.localdomain podman[86037]: 2026-02-23 08:26:38.058654873 +0000 UTC m=+0.230888573 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, container_name=iscsid, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, com.redhat.component=openstack-iscsid-container, release=1766032510, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.buildah.version=1.41.5, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1, architecture=x86_64, name=rhosp-rhel9/openstack-iscsid, managed_by=tripleo_ansible, build-date=2026-01-12T22:34:43Z, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:34:43Z, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, url=https://www.redhat.com)
Feb 23 08:26:38 np0005626463.localdomain systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully.
Feb 23 08:26:38 np0005626463.localdomain podman[86041]: 2026-02-23 08:26:38.076476279 +0000 UTC m=+0.246260465 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, tcib_managed=true, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, container_name=nova_compute, managed_by=tripleo_ansible, config_id=tripleo_step5, build-date=2026-01-12T23:32:04Z, release=1766032510, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, vendor=Red Hat, Inc., io.openshift.expose-services=, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T23:32:04Z, name=rhosp-rhel9/openstack-nova-compute)
Feb 23 08:26:38 np0005626463.localdomain systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Deactivated successfully.
Feb 23 08:26:38 np0005626463.localdomain systemd[1]: tmp-crun.Rqs7lX.mount: Deactivated successfully.
Feb 23 08:26:40 np0005626463.localdomain sshd[86153]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:26:41 np0005626463.localdomain sshd[86153]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:26:43 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.
Feb 23 08:26:43 np0005626463.localdomain podman[86155]: 2026-02-23 08:26:43.899166468 +0000 UTC m=+0.070387017 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T23:32:04Z, version=17.1.13, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, container_name=nova_migration_target, config_id=tripleo_step4, url=https://www.redhat.com, vendor=Red Hat, Inc., architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=)
Feb 23 08:26:44 np0005626463.localdomain sshd[86175]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:26:44 np0005626463.localdomain podman[86155]: 2026-02-23 08:26:44.29237657 +0000 UTC m=+0.463597129 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, config_id=tripleo_step4, container_name=nova_migration_target, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.openshift.expose-services=, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, architecture=x86_64, version=17.1.13, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., build-date=2026-01-12T23:32:04Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T23:32:04Z, url=https://www.redhat.com)
Feb 23 08:26:44 np0005626463.localdomain systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully.
Feb 23 08:26:44 np0005626463.localdomain sshd[86175]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:26:45 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.
Feb 23 08:26:45 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.
Feb 23 08:26:45 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.
Feb 23 08:26:45 np0005626463.localdomain podman[86180]: 2026-02-23 08:26:45.913891401 +0000 UTC m=+0.080311416 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, version=17.1.13, url=https://www.redhat.com, io.openshift.expose-services=, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:14Z, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step1, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, distribution-scope=public, build-date=2026-01-12T22:10:14Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible)
Feb 23 08:26:45 np0005626463.localdomain systemd[1]: tmp-crun.2aanf6.mount: Deactivated successfully.
Feb 23 08:26:45 np0005626463.localdomain podman[86178]: 2026-02-23 08:26:45.977781722 +0000 UTC m=+0.151302390 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, managed_by=tripleo_ansible, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, distribution-scope=public, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, version=17.1.13, io.openshift.expose-services=, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, name=rhosp-rhel9/openstack-ovn-controller, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, build-date=2026-01-12T22:36:40Z, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, org.opencontainers.image.created=2026-01-12T22:36:40Z)
Feb 23 08:26:46 np0005626463.localdomain podman[86178]: 2026-02-23 08:26:46.028449055 +0000 UTC m=+0.201969743 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, build-date=2026-01-12T22:36:40Z, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, version=17.1.13, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=ovn_controller, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.component=openstack-ovn-controller-container, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-ovn-controller, architecture=x86_64, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, managed_by=tripleo_ansible, batch=17.1_20260112.1, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:36:40Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, tcib_managed=true)
Feb 23 08:26:46 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Deactivated successfully.
Feb 23 08:26:46 np0005626463.localdomain podman[86179]: 2026-02-23 08:26:46.115504913 +0000 UTC m=+0.284661640 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, config_id=tripleo_step4, vcs-type=git, url=https://www.redhat.com, managed_by=tripleo_ansible, distribution-scope=public, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, build-date=2026-01-12T22:56:19Z, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, release=1766032510, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, io.buildah.version=1.41.5, batch=17.1_20260112.1, io.openshift.expose-services=, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, vendor=Red Hat, Inc., version=17.1.13)
Feb 23 08:26:46 np0005626463.localdomain podman[86180]: 2026-02-23 08:26:46.142326109 +0000 UTC m=+0.308746084 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, batch=17.1_20260112.1, version=17.1.13, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, build-date=2026-01-12T22:10:14Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, tcib_managed=true, name=rhosp-rhel9/openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, config_id=tripleo_step1, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, url=https://www.redhat.com, vendor=Red Hat, Inc., release=1766032510, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, container_name=metrics_qdr, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 qdrouterd)
Feb 23 08:26:46 np0005626463.localdomain systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully.
Feb 23 08:26:46 np0005626463.localdomain podman[86179]: 2026-02-23 08:26:46.187649502 +0000 UTC m=+0.356806249 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, io.openshift.expose-services=, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, container_name=ovn_metadata_agent, build-date=2026-01-12T22:56:19Z, architecture=x86_64, release=1766032510, vendor=Red Hat, Inc., 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:56:19Z, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, version=17.1.13, distribution-scope=public, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, tcib_managed=true, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git)
Feb 23 08:26:46 np0005626463.localdomain systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Deactivated successfully.
Feb 23 08:26:46 np0005626463.localdomain systemd[1]: tmp-crun.GRHFdc.mount: Deactivated successfully.
Feb 23 08:26:56 np0005626463.localdomain sudo[86251]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 08:26:56 np0005626463.localdomain sudo[86251]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:26:56 np0005626463.localdomain sudo[86251]: pam_unix(sudo:session): session closed for user root
Feb 23 08:26:56 np0005626463.localdomain sudo[86266]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 23 08:26:56 np0005626463.localdomain sudo[86266]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:26:56 np0005626463.localdomain sudo[86266]: pam_unix(sudo:session): session closed for user root
Feb 23 08:26:57 np0005626463.localdomain sudo[86313]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 08:26:57 np0005626463.localdomain sudo[86313]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:26:57 np0005626463.localdomain sudo[86313]: pam_unix(sudo:session): session closed for user root
Feb 23 08:27:06 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.
Feb 23 08:27:06 np0005626463.localdomain systemd[1]: tmp-crun.tVOK4k.mount: Deactivated successfully.
Feb 23 08:27:06 np0005626463.localdomain podman[86328]: 2026-02-23 08:27:06.936941513 +0000 UTC m=+0.107714793 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:10:15Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, release=1766032510, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.5, 
com.redhat.component=openstack-collectd-container, container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, managed_by=tripleo_ansible, version=17.1.13, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:10:15Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee)
Feb 23 08:27:06 np0005626463.localdomain podman[86328]: 2026-02-23 08:27:06.948123346 +0000 UTC m=+0.118896656 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, container_name=collectd, build-date=2026-01-12T22:10:15Z, release=1766032510, io.buildah.version=1.41.5, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vendor=Red Hat, Inc., architecture=x86_64, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, org.opencontainers.image.created=2026-01-12T22:10:15Z, name=rhosp-rhel9/openstack-collectd, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible)
Feb 23 08:27:06 np0005626463.localdomain systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully.
Feb 23 08:27:08 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.
Feb 23 08:27:08 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.
Feb 23 08:27:08 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.
Feb 23 08:27:08 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.
Feb 23 08:27:08 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.
Feb 23 08:27:08 np0005626463.localdomain systemd[1]: tmp-crun.J6cW6y.mount: Deactivated successfully.
Feb 23 08:27:08 np0005626463.localdomain podman[86350]: 2026-02-23 08:27:08.934290808 +0000 UTC m=+0.097243199 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-ceilometer-ipmi, config_id=tripleo_step4, release=1766032510, vcs-type=git, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, io.openshift.expose-services=, managed_by=tripleo_ansible, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, version=17.1.13, container_name=ceilometer_agent_ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T23:07:30Z, build-date=2026-01-12T23:07:30Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, tcib_managed=true)
Feb 23 08:27:08 np0005626463.localdomain systemd[1]: tmp-crun.OK3cbs.mount: Deactivated successfully.
Feb 23 08:27:09 np0005626463.localdomain podman[86350]: 2026-02-23 08:27:09.013684369 +0000 UTC m=+0.176636820 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, batch=17.1_20260112.1, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-ipmi-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_ipmi, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T23:07:30Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T23:07:30Z, name=rhosp-rhel9/openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, vcs-type=git, managed_by=tripleo_ansible, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, version=17.1.13, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06)
Feb 23 08:27:09 np0005626463.localdomain systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully.
Feb 23 08:27:09 np0005626463.localdomain podman[86351]: 2026-02-23 08:27:08.994381319 +0000 UTC m=+0.152971949 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, container_name=logrotate_crond, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 cron, build-date=2026-01-12T22:10:15Z, version=17.1.13, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.component=openstack-cron-container, url=https://www.redhat.com, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.expose-services=, vcs-type=git, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, tcib_managed=true, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team)
Feb 23 08:27:09 np0005626463.localdomain podman[86349]: 2026-02-23 08:27:09.016537089 +0000 UTC m=+0.181956250 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, url=https://www.redhat.com, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T23:07:47Z, tcib_managed=true, config_id=tripleo_step4, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, release=1766032510, build-date=2026-01-12T23:07:47Z, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, container_name=ceilometer_agent_compute, version=17.1.13)
Feb 23 08:27:09 np0005626463.localdomain podman[86357]: 2026-02-23 08:27:09.023640957 +0000 UTC m=+0.180166989 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-nova-compute, url=https://www.redhat.com, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step5, distribution-scope=public, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, release=1766032510, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, container_name=nova_compute)
Feb 23 08:27:09 np0005626463.localdomain podman[86351]: 2026-02-23 08:27:09.079208631 +0000 UTC m=+0.237799281 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, managed_by=tripleo_ansible, io.openshift.expose-services=, release=1766032510, tcib_managed=true, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-cron-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, name=rhosp-rhel9/openstack-cron, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, version=17.1.13, container_name=logrotate_crond, org.opencontainers.image.created=2026-01-12T22:10:15Z, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee)
Feb 23 08:27:09 np0005626463.localdomain podman[86348]: 2026-02-23 08:27:09.086739612 +0000 UTC m=+0.256221867 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:34:43Z, vcs-type=git, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, vendor=Red Hat, Inc., batch=17.1_20260112.1, architecture=x86_64, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, container_name=iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, io.openshift.expose-services=, version=17.1.13, url=https://www.redhat.com, build-date=2026-01-12T22:34:43Z)
Feb 23 08:27:09 np0005626463.localdomain systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully.
Feb 23 08:27:09 np0005626463.localdomain podman[86349]: 2026-02-23 08:27:09.095657881 +0000 UTC m=+0.261077062 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, com.redhat.component=openstack-ceilometer-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2026-01-12T23:07:47Z, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, url=https://www.redhat.com, io.buildah.version=1.41.5, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, tcib_managed=true, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:47Z, container_name=ceilometer_agent_compute, version=17.1.13, managed_by=tripleo_ansible, batch=17.1_20260112.1, distribution-scope=public, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, release=1766032510)
Feb 23 08:27:09 np0005626463.localdomain podman[86357]: 2026-02-23 08:27:09.103856831 +0000 UTC m=+0.260382823 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp-rhel9/openstack-nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step5, version=17.1.13, managed_by=tripleo_ansible, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.expose-services=, release=1766032510, vendor=Red Hat, Inc., io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T23:32:04Z, maintainer=OpenStack TripleO Team, architecture=x86_64, vcs-type=git, container_name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T23:32:04Z)
Feb 23 08:27:09 np0005626463.localdomain systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully.
Feb 23 08:27:09 np0005626463.localdomain systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Deactivated successfully.
Feb 23 08:27:09 np0005626463.localdomain podman[86348]: 2026-02-23 08:27:09.147319005 +0000 UTC m=+0.316801250 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=705339545363fec600102567c4e923938e0f43b3, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.5, vcs-type=git, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, distribution-scope=public, release=1766032510, architecture=x86_64, build-date=2026-01-12T22:34:43Z, com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, container_name=iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, config_id=tripleo_step3, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.openshift.expose-services=, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 23 08:27:09 np0005626463.localdomain systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully.
Feb 23 08:27:14 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.
Feb 23 08:27:14 np0005626463.localdomain podman[86469]: 2026-02-23 08:27:14.904919857 +0000 UTC m=+0.080486542 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, distribution-scope=public, managed_by=tripleo_ansible, architecture=x86_64, container_name=nova_migration_target, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, 
name=rhosp-rhel9/openstack-nova-compute, release=1766032510, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T23:32:04Z, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., build-date=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.13, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5)
Feb 23 08:27:15 np0005626463.localdomain podman[86469]: 2026-02-23 08:27:15.314202622 +0000 UTC m=+0.489769307 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T23:32:04Z, name=rhosp-rhel9/openstack-nova-compute, config_id=tripleo_step4, io.buildah.version=1.41.5, release=1766032510, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, version=17.1.13, maintainer=OpenStack TripleO Team, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, io.openshift.expose-services=, summary=Red 
Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, vcs-type=git, distribution-scope=public, batch=17.1_20260112.1, container_name=nova_migration_target, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z)
Feb 23 08:27:15 np0005626463.localdomain systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully.
Feb 23 08:27:16 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.
Feb 23 08:27:16 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.
Feb 23 08:27:16 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.
Feb 23 08:27:16 np0005626463.localdomain systemd[1]: tmp-crun.TdFd3k.mount: Deactivated successfully.
Feb 23 08:27:16 np0005626463.localdomain podman[86493]: 2026-02-23 08:27:16.903377143 +0000 UTC m=+0.080046889 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:56:19Z, version=17.1.13, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T22:56:19Z, managed_by=tripleo_ansible, container_name=ovn_metadata_agent, architecture=x86_64, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, url=https://www.redhat.com, io.openshift.expose-services=, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-type=git, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, release=1766032510, distribution-scope=public)
Feb 23 08:27:16 np0005626463.localdomain podman[86492]: 2026-02-23 08:27:16.965991173 +0000 UTC m=+0.144208443 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, build-date=2026-01-12T22:36:40Z, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, io.openshift.expose-services=, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, managed_by=tripleo_ansible, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, distribution-scope=public, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', 
'/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, batch=17.1_20260112.1, com.redhat.component=openstack-ovn-controller-container, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:36:40Z)
Feb 23 08:27:17 np0005626463.localdomain podman[86494]: 2026-02-23 08:27:17.012108684 +0000 UTC m=+0.185757476 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, distribution-scope=public, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, name=rhosp-rhel9/openstack-qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, batch=17.1_20260112.1, com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, 
org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, io.buildah.version=1.41.5, container_name=metrics_qdr, config_id=tripleo_step1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, url=https://www.redhat.com, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, version=17.1.13, build-date=2026-01-12T22:10:14Z, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=)
Feb 23 08:27:17 np0005626463.localdomain podman[86492]: 2026-02-23 08:27:17.01734954 +0000 UTC m=+0.195566820 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, build-date=2026-01-12T22:36:40Z, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, container_name=ovn_controller, managed_by=tripleo_ansible, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, distribution-scope=public, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, config_id=tripleo_step4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', 
'/var/log/containers/openvswitch:/var/log/ovn:z']}, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, url=https://www.redhat.com, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller)
Feb 23 08:27:17 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Deactivated successfully.
Feb 23 08:27:17 np0005626463.localdomain podman[86493]: 2026-02-23 08:27:17.035645781 +0000 UTC m=+0.212315517 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, architecture=x86_64, build-date=2026-01-12T22:56:19Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, vendor=Red Hat, Inc., release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', 
'/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, batch=17.1_20260112.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, container_name=ovn_metadata_agent, distribution-scope=public, vcs-type=git, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:56:19Z, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Feb 23 08:27:17 np0005626463.localdomain systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Deactivated successfully.
Feb 23 08:27:17 np0005626463.localdomain podman[86494]: 2026-02-23 08:27:17.230296745 +0000 UTC m=+0.403945597 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, org.opencontainers.image.created=2026-01-12T22:10:14Z, com.redhat.component=openstack-qdrouterd-container, distribution-scope=public, version=17.1.13, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp 
osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, managed_by=tripleo_ansible, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, architecture=x86_64, io.openshift.expose-services=, name=rhosp-rhel9/openstack-qdrouterd, build-date=2026-01-12T22:10:14Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, maintainer=OpenStack TripleO Team, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd)
Feb 23 08:27:17 np0005626463.localdomain systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully.
Feb 23 08:27:17 np0005626463.localdomain systemd[1]: tmp-crun.cDjKsh.mount: Deactivated successfully.
Feb 23 08:27:24 np0005626463.localdomain sshd[86568]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:27:24 np0005626463.localdomain sshd[86568]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:27:34 np0005626463.localdomain sshd[86615]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:27:35 np0005626463.localdomain sshd[86615]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:27:37 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.
Feb 23 08:27:37 np0005626463.localdomain systemd[1]: tmp-crun.mjn1xo.mount: Deactivated successfully.
Feb 23 08:27:37 np0005626463.localdomain podman[86617]: 2026-02-23 08:27:37.918659059 +0000 UTC m=+0.090841031 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-collectd, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vendor=Red Hat, Inc., vcs-type=git, build-date=2026-01-12T22:10:15Z, tcib_managed=true, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:15Z, url=https://www.redhat.com, managed_by=tripleo_ansible, io.openshift.expose-services=, architecture=x86_64, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-collectd-container, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 23 08:27:37 np0005626463.localdomain podman[86617]: 2026-02-23 08:27:37.935618494 +0000 UTC m=+0.107800476 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vendor=Red Hat, Inc., vcs-type=git, distribution-scope=public, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step3, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 
17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, batch=17.1_20260112.1, build-date=2026-01-12T22:10:15Z, tcib_managed=true, url=https://www.redhat.com, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, container_name=collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-collectd-container, name=rhosp-rhel9/openstack-collectd, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd)
Feb 23 08:27:37 np0005626463.localdomain systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully.
Feb 23 08:27:40 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.
Feb 23 08:27:40 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.
Feb 23 08:27:40 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.
Feb 23 08:27:40 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.
Feb 23 08:27:40 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.
Feb 23 08:27:40 np0005626463.localdomain systemd[1]: tmp-crun.3DH3FI.mount: Deactivated successfully.
Feb 23 08:27:40 np0005626463.localdomain podman[86639]: 2026-02-23 08:27:40.342218952 +0000 UTC m=+0.218648225 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.buildah.version=1.41.5, io.openshift.expose-services=, tcib_managed=true, url=https://www.redhat.com, architecture=x86_64, version=17.1.13, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp-rhel9/openstack-ceilometer-ipmi, config_id=tripleo_step4, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, release=1766032510, com.redhat.component=openstack-ceilometer-ipmi-container)
Feb 23 08:27:40 np0005626463.localdomain podman[86638]: 2026-02-23 08:27:40.364820384 +0000 UTC m=+0.239928860 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=ceilometer_agent_compute, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T23:07:47Z, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2026-01-12T23:07:47Z, name=rhosp-rhel9/openstack-ceilometer-compute, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, batch=17.1_20260112.1, release=1766032510, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13)
Feb 23 08:27:40 np0005626463.localdomain podman[86637]: 2026-02-23 08:27:40.830140766 +0000 UTC m=+0.709677875 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:34:43Z, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_id=tripleo_step3, name=rhosp-rhel9/openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, 
container_name=iscsid, release=1766032510, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, distribution-scope=public, tcib_managed=true, vcs-ref=705339545363fec600102567c4e923938e0f43b3, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, build-date=2026-01-12T22:34:43Z, io.buildah.version=1.41.5, architecture=x86_64, io.openshift.expose-services=, com.redhat.component=openstack-iscsid-container, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 23 08:27:40 np0005626463.localdomain podman[86643]: 2026-02-23 08:27:40.379417992 +0000 UTC m=+0.251167513 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, name=rhosp-rhel9/openstack-nova-compute, release=1766032510, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, distribution-scope=public, build-date=2026-01-12T23:32:04Z, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, container_name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, managed_by=tripleo_ansible, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, config_id=tripleo_step5, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, tcib_managed=true)
Feb 23 08:27:40 np0005626463.localdomain podman[86637]: 2026-02-23 08:27:40.838210312 +0000 UTC m=+0.717747391 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, version=17.1.13, io.openshift.expose-services=, name=rhosp-rhel9/openstack-iscsid, tcib_managed=true, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, 
com.redhat.component=openstack-iscsid-container, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, architecture=x86_64, container_name=iscsid, build-date=2026-01-12T22:34:43Z, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:34:43Z, managed_by=tripleo_ansible, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510)
Feb 23 08:27:40 np0005626463.localdomain systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully.
Feb 23 08:27:40 np0005626463.localdomain podman[86638]: 2026-02-23 08:27:40.848659685 +0000 UTC m=+0.723768191 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, release=1766032510, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, build-date=2026-01-12T23:07:47Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git, io.openshift.expose-services=, architecture=x86_64, io.buildah.version=1.41.5, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.13, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-compute-container, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T23:07:47Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_id=tripleo_step4, container_name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 
'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']})
Feb 23 08:27:40 np0005626463.localdomain podman[86639]: 2026-02-23 08:27:40.865020103 +0000 UTC m=+0.741449386 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, url=https://www.redhat.com, distribution-scope=public, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., container_name=ceilometer_agent_ipmi, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-ipmi-container, managed_by=tripleo_ansible, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, maintainer=OpenStack TripleO Team, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.buildah.version=1.41.5, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.expose-services=, release=1766032510, build-date=2026-01-12T23:07:30Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Feb 23 08:27:40 np0005626463.localdomain systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully.
Feb 23 08:27:40 np0005626463.localdomain podman[86643]: 2026-02-23 08:27:40.867214803 +0000 UTC m=+0.738964324 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-nova-compute, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, vcs-type=git, com.redhat.component=openstack-nova-compute-container, container_name=nova_compute, distribution-scope=public, release=1766032510, tcib_managed=true, build-date=2026-01-12T23:32:04Z, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1)
Feb 23 08:27:40 np0005626463.localdomain systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Deactivated successfully.
Feb 23 08:27:40 np0005626463.localdomain podman[86640]: 2026-02-23 08:27:40.431920401 +0000 UTC m=+0.302473449 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, release=1766032510, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, vendor=Red Hat, Inc., vcs-type=git, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, build-date=2026-01-12T22:10:15Z, name=rhosp-rhel9/openstack-cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, io.openshift.expose-services=, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=logrotate_crond, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team)
Feb 23 08:27:40 np0005626463.localdomain systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully.
Feb 23 08:27:40 np0005626463.localdomain podman[86640]: 2026-02-23 08:27:40.936374708 +0000 UTC m=+0.806927766 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, tcib_managed=true, io.openshift.expose-services=, distribution-scope=public, com.redhat.component=openstack-cron-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, release=1766032510, summary=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_id=tripleo_step4, managed_by=tripleo_ansible, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, container_name=logrotate_crond, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z, name=rhosp-rhel9/openstack-cron, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, io.buildah.version=1.41.5)
Feb 23 08:27:40 np0005626463.localdomain systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully.
Feb 23 08:27:41 np0005626463.localdomain systemd[1]: tmp-crun.Dhd2Q6.mount: Deactivated successfully.
Feb 23 08:27:45 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.
Feb 23 08:27:45 np0005626463.localdomain podman[86754]: 2026-02-23 08:27:45.918742408 +0000 UTC m=+0.087561981 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, config_id=tripleo_step4, container_name=nova_migration_target, name=rhosp-rhel9/openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, 
batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, url=https://www.redhat.com, io.openshift.expose-services=, build-date=2026-01-12T23:32:04Z, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, tcib_managed=true, vendor=Red Hat, Inc., distribution-scope=public, managed_by=tripleo_ansible, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 23 08:27:46 np0005626463.localdomain podman[86754]: 2026-02-23 08:27:46.307356064 +0000 UTC m=+0.476175557 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, architecture=x86_64, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step4, container_name=nova_migration_target, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T23:32:04Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, url=https://www.redhat.com, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, maintainer=OpenStack TripleO Team, tcib_managed=true, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute)
Feb 23 08:27:46 np0005626463.localdomain systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully.
Feb 23 08:27:47 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.
Feb 23 08:27:47 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.
Feb 23 08:27:47 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.
Feb 23 08:27:47 np0005626463.localdomain systemd[1]: tmp-crun.qGTSAo.mount: Deactivated successfully.
Feb 23 08:27:47 np0005626463.localdomain podman[86778]: 2026-02-23 08:27:47.917198284 +0000 UTC m=+0.089976067 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, vendor=Red Hat, Inc., managed_by=tripleo_ansible, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.13, build-date=2026-01-12T22:56:19Z, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:56:19Z, tcib_managed=true, architecture=x86_64, url=https://www.redhat.com, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, release=1766032510, config_id=tripleo_step4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 
'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn)
Feb 23 08:27:47 np0005626463.localdomain podman[86777]: 2026-02-23 08:27:47.962023847 +0000 UTC m=+0.137328991 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, distribution-scope=public, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vendor=Red Hat, Inc., container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ovn-controller-container, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:36:40Z, release=1766032510, batch=17.1_20260112.1, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, url=https://www.redhat.com, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, vcs-type=git, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-ovn-controller, config_id=tripleo_step4, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=)
Feb 23 08:27:47 np0005626463.localdomain podman[86778]: 2026-02-23 08:27:47.978308303 +0000 UTC m=+0.151086056 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vendor=Red Hat, Inc., managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, batch=17.1_20260112.1, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.buildah.version=1.41.5, vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, container_name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:56:19Z, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:56:19Z, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, version=17.1.13, config_id=tripleo_step4, io.openshift.expose-services=)
Feb 23 08:27:47 np0005626463.localdomain podman[86777]: 2026-02-23 08:27:47.990391231 +0000 UTC m=+0.165696365 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, io.openshift.expose-services=, managed_by=tripleo_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, name=rhosp-rhel9/openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, io.buildah.version=1.41.5, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-ovn-controller, release=1766032510, version=17.1.13, com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.created=2026-01-12T22:36:40Z, build-date=2026-01-12T22:36:40Z, config_id=tripleo_step4, architecture=x86_64)
Feb 23 08:27:47 np0005626463.localdomain systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Deactivated successfully.
Feb 23 08:27:48 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Deactivated successfully.
Feb 23 08:27:48 np0005626463.localdomain podman[86779]: 2026-02-23 08:27:48.068026672 +0000 UTC m=+0.236903506 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vendor=Red Hat, Inc., batch=17.1_20260112.1, vcs-type=git, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, name=rhosp-rhel9/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, maintainer=OpenStack TripleO Team, config_id=tripleo_step1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:10:14Z, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, architecture=x86_64, container_name=metrics_qdr, distribution-scope=public, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, url=https://www.redhat.com)
Feb 23 08:27:48 np0005626463.localdomain podman[86779]: 2026-02-23 08:27:48.274261899 +0000 UTC m=+0.443138733 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, url=https://www.redhat.com, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:10:14Z, name=rhosp-rhel9/openstack-qdrouterd, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, 
version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, batch=17.1_20260112.1, vcs-type=git, maintainer=OpenStack TripleO Team, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:14Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, vendor=Red Hat, Inc., managed_by=tripleo_ansible, distribution-scope=public, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.5, release=1766032510)
Feb 23 08:27:48 np0005626463.localdomain systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully.
Feb 23 08:27:58 np0005626463.localdomain sudo[86853]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 08:27:58 np0005626463.localdomain sudo[86853]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:27:58 np0005626463.localdomain sudo[86853]: pam_unix(sudo:session): session closed for user root
Feb 23 08:27:58 np0005626463.localdomain sudo[86868]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 23 08:27:58 np0005626463.localdomain sudo[86868]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:27:58 np0005626463.localdomain sudo[86868]: pam_unix(sudo:session): session closed for user root
Feb 23 08:28:02 np0005626463.localdomain sudo[86915]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 08:28:02 np0005626463.localdomain sudo[86915]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:28:02 np0005626463.localdomain sudo[86915]: pam_unix(sudo:session): session closed for user root
Feb 23 08:28:07 np0005626463.localdomain sshd[86930]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:28:07 np0005626463.localdomain sshd[86930]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:28:08 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.
Feb 23 08:28:08 np0005626463.localdomain systemd[1]: tmp-crun.QRUx4A.mount: Deactivated successfully.
Feb 23 08:28:08 np0005626463.localdomain podman[86932]: 2026-02-23 08:28:08.935222043 +0000 UTC m=+0.108693977 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vendor=Red Hat, Inc., release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', 
'/sys/fs/cgroup:/sys/fs/cgroup:ro']}, distribution-scope=public, vcs-type=git, container_name=collectd, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-collectd-container, summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:10:15Z, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-collectd, build-date=2026-01-12T22:10:15Z, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5)
Feb 23 08:28:08 np0005626463.localdomain podman[86932]: 2026-02-23 08:28:08.95111733 +0000 UTC m=+0.124589234 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, url=https://www.redhat.com, container_name=collectd, com.redhat.component=openstack-collectd-container, description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, name=rhosp-rhel9/openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, io.openshift.expose-services=, batch=17.1_20260112.1, build-date=2026-01-12T22:10:15Z, release=1766032510, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team, version=17.1.13, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee)
Feb 23 08:28:08 np0005626463.localdomain systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully.
Feb 23 08:28:11 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.
Feb 23 08:28:11 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.
Feb 23 08:28:11 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.
Feb 23 08:28:11 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.
Feb 23 08:28:11 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.
Feb 23 08:28:11 np0005626463.localdomain systemd[83969]: Created slice User Background Tasks Slice.
Feb 23 08:28:11 np0005626463.localdomain podman[86954]: 2026-02-23 08:28:11.948028535 +0000 UTC m=+0.102181461 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2026-01-12T23:07:30Z, distribution-scope=public, com.redhat.component=openstack-ceilometer-ipmi-container, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., version=17.1.13, 
architecture=x86_64, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_ipmi, tcib_managed=true, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, url=https://www.redhat.com, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=)
Feb 23 08:28:11 np0005626463.localdomain systemd[83969]: Starting Cleanup of User's Temporary Files and Directories...
Feb 23 08:28:11 np0005626463.localdomain systemd[83969]: Finished Cleanup of User's Temporary Files and Directories.
Feb 23 08:28:11 np0005626463.localdomain podman[86954]: 2026-02-23 08:28:11.993298905 +0000 UTC m=+0.147451781 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64, vendor=Red Hat, Inc., version=17.1.13, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, container_name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, release=1766032510, 
name=rhosp-rhel9/openstack-ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1, io.buildah.version=1.41.5, build-date=2026-01-12T23:07:30Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T23:07:30Z, url=https://www.redhat.com, config_id=tripleo_step4, vcs-type=git, io.openshift.expose-services=)
Feb 23 08:28:11 np0005626463.localdomain podman[86955]: 2026-02-23 08:28:11.993524151 +0000 UTC m=+0.146268165 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.component=openstack-cron-container, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, io.buildah.version=1.41.5, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, build-date=2026-01-12T22:10:15Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, release=1766032510, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=logrotate_crond, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., tcib_managed=true, description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 cron)
Feb 23 08:28:12 np0005626463.localdomain podman[86952]: 2026-02-23 08:28:12.09165011 +0000 UTC m=+0.257100876 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, batch=17.1_20260112.1, vendor=Red Hat, Inc., config_id=tripleo_step3, org.opencontainers.image.created=2026-01-12T22:34:43Z, build-date=2026-01-12T22:34:43Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, container_name=iscsid, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, architecture=x86_64, version=17.1.13, distribution-scope=public, name=rhosp-rhel9/openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 iscsid)
Feb 23 08:28:12 np0005626463.localdomain systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully.
Feb 23 08:28:12 np0005626463.localdomain podman[86955]: 2026-02-23 08:28:12.121920709 +0000 UTC m=+0.274664653 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, release=1766032510, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:10:15Z, com.redhat.component=openstack-cron-container, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=logrotate_crond, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red 
Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., batch=17.1_20260112.1, architecture=x86_64, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 23 08:28:12 np0005626463.localdomain podman[86952]: 2026-02-23 08:28:12.131982962 +0000 UTC m=+0.297433728 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, distribution-scope=public, build-date=2026-01-12T22:34:43Z, container_name=iscsid, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, summary=Red 
Hat OpenStack Platform 17.1 iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, org.opencontainers.image.created=2026-01-12T22:34:43Z, vendor=Red Hat, Inc., url=https://www.redhat.com, com.redhat.component=openstack-iscsid-container, tcib_managed=true, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=)
Feb 23 08:28:12 np0005626463.localdomain systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully.
Feb 23 08:28:12 np0005626463.localdomain systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully.
Feb 23 08:28:12 np0005626463.localdomain podman[86961]: 2026-02-23 08:28:12.190751578 +0000 UTC m=+0.342499512 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', 
'/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, build-date=2026-01-12T23:32:04Z, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, vcs-type=git, batch=17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, architecture=x86_64, name=rhosp-rhel9/openstack-nova-compute, container_name=nova_compute, maintainer=OpenStack TripleO Team, distribution-scope=public, vendor=Red Hat, Inc., version=17.1.13, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step5, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 23 08:28:12 np0005626463.localdomain podman[86953]: 2026-02-23 08:28:12.104923449 +0000 UTC m=+0.266131187 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, vcs-type=git, org.opencontainers.image.created=2026-01-12T23:07:47Z, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, name=rhosp-rhel9/openstack-ceilometer-compute, vendor=Red Hat, Inc., tcib_managed=true, container_name=ceilometer_agent_compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2026-01-12T23:07:47Z, com.redhat.component=openstack-ceilometer-compute-container, url=https://www.redhat.com)
Feb 23 08:28:12 np0005626463.localdomain podman[86961]: 2026-02-23 08:28:12.222849062 +0000 UTC m=+0.374596996 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, architecture=x86_64, io.openshift.expose-services=, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1766032510, batch=17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., vcs-type=git, config_id=tripleo_step5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-nova-compute, distribution-scope=public, version=17.1.13, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:32:04Z, container_name=nova_compute, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, build-date=2026-01-12T23:32:04Z, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 23 08:28:12 np0005626463.localdomain systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Deactivated successfully.
Feb 23 08:28:12 np0005626463.localdomain podman[86953]: 2026-02-23 08:28:12.240257654 +0000 UTC m=+0.401465362 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1766032510, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2026-01-12T23:07:47Z, org.opencontainers.image.created=2026-01-12T23:07:47Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, name=rhosp-rhel9/openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, architecture=x86_64, batch=17.1_20260112.1, io.openshift.expose-services=, url=https://www.redhat.com, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, io.buildah.version=1.41.5, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-compute-container)
Feb 23 08:28:12 np0005626463.localdomain systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully.
Feb 23 08:28:12 np0005626463.localdomain systemd[1]: tmp-crun.i2Ju1f.mount: Deactivated successfully.
Feb 23 08:28:16 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.
Feb 23 08:28:16 np0005626463.localdomain podman[87064]: 2026-02-23 08:28:16.900483756 +0000 UTC m=+0.073939252 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, managed_by=tripleo_ansible, tcib_managed=true, maintainer=OpenStack TripleO Team, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, vcs-type=git, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, build-date=2026-01-12T23:32:04Z, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T23:32:04Z, version=17.1.13, container_name=nova_migration_target, architecture=x86_64, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-nova-compute, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step4, batch=17.1_20260112.1)
Feb 23 08:28:17 np0005626463.localdomain podman[87064]: 2026-02-23 08:28:17.258612366 +0000 UTC m=+0.432067902 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, vcs-type=git, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, vendor=Red Hat, Inc., io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, build-date=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_migration_target, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-nova-compute, batch=17.1_20260112.1, managed_by=tripleo_ansible, tcib_managed=true, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:32:04Z, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe)
Feb 23 08:28:17 np0005626463.localdomain systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully.
Feb 23 08:28:18 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.
Feb 23 08:28:18 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.
Feb 23 08:28:18 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.
Feb 23 08:28:18 np0005626463.localdomain podman[87090]: 2026-02-23 08:28:18.906823698 +0000 UTC m=+0.078053326 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, build-date=2026-01-12T22:56:19Z, url=https://www.redhat.com, architecture=x86_64, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:56:19Z, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, container_name=ovn_metadata_agent, config_id=tripleo_step4, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, tcib_managed=true, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1)
Feb 23 08:28:18 np0005626463.localdomain systemd[1]: tmp-crun.Bep8KR.mount: Deactivated successfully.
Feb 23 08:28:18 np0005626463.localdomain podman[87090]: 2026-02-23 08:28:18.970718178 +0000 UTC m=+0.141947836 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, build-date=2026-01-12T22:56:19Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, version=17.1.13, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, 
architecture=x86_64, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, managed_by=tripleo_ansible, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:56:19Z, vcs-type=git, container_name=ovn_metadata_agent, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 23 08:28:18 np0005626463.localdomain systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Deactivated successfully.
Feb 23 08:28:19 np0005626463.localdomain podman[87089]: 2026-02-23 08:28:18.976270655 +0000 UTC m=+0.149516363 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:36:40Z, architecture=x86_64, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vendor=Red Hat, Inc., config_id=tripleo_step4, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:36:40Z, url=https://www.redhat.com, distribution-scope=public, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, release=1766032510, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, io.openshift.expose-services=, container_name=ovn_controller, description=Red Hat 
OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, maintainer=OpenStack TripleO Team, tcib_managed=true, name=rhosp-rhel9/openstack-ovn-controller)
Feb 23 08:28:19 np0005626463.localdomain podman[87089]: 2026-02-23 08:28:19.056793804 +0000 UTC m=+0.230039522 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, name=rhosp-rhel9/openstack-ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:36:40Z, version=17.1.13, vendor=Red Hat, Inc., org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.5, container_name=ovn_controller, url=https://www.redhat.com, com.redhat.component=openstack-ovn-controller-container, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-ovn-controller, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2026-01-12T22:36:40Z, architecture=x86_64, release=1766032510, vcs-type=git)
Feb 23 08:28:19 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Deactivated successfully.
Feb 23 08:28:19 np0005626463.localdomain podman[87091]: 2026-02-23 08:28:19.120669604 +0000 UTC m=+0.291922503 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack 
Platform 17.1 qdrouterd, vendor=Red Hat, Inc., version=17.1.13, name=rhosp-rhel9/openstack-qdrouterd, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:10:14Z, architecture=x86_64, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:14Z, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, distribution-scope=public, maintainer=OpenStack TripleO Team, container_name=metrics_qdr, config_id=tripleo_step1, url=https://www.redhat.com)
Feb 23 08:28:19 np0005626463.localdomain podman[87091]: 2026-02-23 08:28:19.324478526 +0000 UTC m=+0.495731445 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, build-date=2026-01-12T22:10:14Z, config_id=tripleo_step1, version=17.1.13, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, 
name=rhosp-rhel9/openstack-qdrouterd, vcs-type=git, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:14Z, distribution-scope=public, vendor=Red Hat, Inc., vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd)
Feb 23 08:28:19 np0005626463.localdomain systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully.
Feb 23 08:28:23 np0005626463.localdomain sshd[87165]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:28:23 np0005626463.localdomain sshd[87165]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:28:23 np0005626463.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Feb 23 08:28:23 np0005626463.localdomain recover_tripleo_nova_virtqemud[87168]: 61982
Feb 23 08:28:23 np0005626463.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Feb 23 08:28:23 np0005626463.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Feb 23 08:28:40 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.
Feb 23 08:28:40 np0005626463.localdomain systemd[1]: tmp-crun.7ZdWkc.mount: Deactivated successfully.
Feb 23 08:28:40 np0005626463.localdomain podman[87214]: 2026-02-23 08:28:40.787920023 +0000 UTC m=+0.058351025 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, managed_by=tripleo_ansible, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, architecture=x86_64, build-date=2026-01-12T22:10:15Z, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-collectd-container, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, 
io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:10:15Z, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=collectd, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, name=rhosp-rhel9/openstack-collectd, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 23 08:28:40 np0005626463.localdomain podman[87214]: 2026-02-23 08:28:40.800118419 +0000 UTC m=+0.070549431 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, architecture=x86_64, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, release=1766032510, config_id=tripleo_step3, batch=17.1_20260112.1, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 collectd, build-date=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, name=rhosp-rhel9/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=collectd, version=17.1.13, com.redhat.component=openstack-collectd-container, managed_by=tripleo_ansible, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, distribution-scope=public)
Feb 23 08:28:40 np0005626463.localdomain systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully.
Feb 23 08:28:43 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.
Feb 23 08:28:43 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.
Feb 23 08:28:43 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.
Feb 23 08:28:43 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.
Feb 23 08:28:43 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.
Feb 23 08:28:43 np0005626463.localdomain podman[87235]: 2026-02-23 08:28:43.32293923 +0000 UTC m=+0.069610243 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, build-date=2026-01-12T23:07:30Z, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-ipmi-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.5, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, tcib_managed=true, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, vcs-type=git, url=https://www.redhat.com, batch=17.1_20260112.1)
Feb 23 08:28:43 np0005626463.localdomain podman[87235]: 2026-02-23 08:28:43.342022813 +0000 UTC m=+0.088693816 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vendor=Red Hat, Inc., io.buildah.version=1.41.5, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, release=1766032510, container_name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, org.opencontainers.image.created=2026-01-12T23:07:30Z, version=17.1.13, build-date=2026-01-12T23:07:30Z, url=https://www.redhat.com, tcib_managed=true, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 
vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, architecture=x86_64, maintainer=OpenStack TripleO Team)
Feb 23 08:28:43 np0005626463.localdomain podman[87234]: 2026-02-23 08:28:43.303936578 +0000 UTC m=+0.056606611 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-compute-container, version=17.1.13, container_name=ceilometer_agent_compute, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:47Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, name=rhosp-rhel9/openstack-ceilometer-compute, description=Red Hat OpenStack 
Platform 17.1 ceilometer-compute, io.buildah.version=1.41.5, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, build-date=2026-01-12T23:07:47Z, maintainer=OpenStack TripleO Team, tcib_managed=true, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., managed_by=tripleo_ansible, batch=17.1_20260112.1)
Feb 23 08:28:43 np0005626463.localdomain podman[87236]: 2026-02-23 08:28:43.361834458 +0000 UTC m=+0.110617095 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, org.opencontainers.image.created=2026-01-12T22:10:15Z, container_name=logrotate_crond, name=rhosp-rhel9/openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, version=17.1.13, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, 
org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-cron-container, managed_by=tripleo_ansible, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.expose-services=, vcs-type=git, build-date=2026-01-12T22:10:15Z)
Feb 23 08:28:43 np0005626463.localdomain systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully.
Feb 23 08:28:43 np0005626463.localdomain podman[87236]: 2026-02-23 08:28:43.396201051 +0000 UTC m=+0.144983738 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, container_name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.component=openstack-cron-container, version=17.1.13, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, vendor=Red Hat, Inc., batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, 
konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, io.buildah.version=1.41.5, build-date=2026-01-12T22:10:15Z, name=rhosp-rhel9/openstack-cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 23 08:28:43 np0005626463.localdomain systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully.
Feb 23 08:28:43 np0005626463.localdomain podman[87234]: 2026-02-23 08:28:43.436397598 +0000 UTC m=+0.189067641 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, build-date=2026-01-12T23:07:47Z, com.redhat.component=openstack-ceilometer-compute-container, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, managed_by=tripleo_ansible, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.created=2026-01-12T23:07:47Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=ceilometer_agent_compute, io.openshift.expose-services=, vcs-type=git, name=rhosp-rhel9/openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Feb 23 08:28:43 np0005626463.localdomain systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully.
Feb 23 08:28:43 np0005626463.localdomain podman[87233]: 2026-02-23 08:28:43.4700895 +0000 UTC m=+0.225055452 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, container_name=iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, com.redhat.component=openstack-iscsid-container, io.buildah.version=1.41.5, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, vcs-type=git, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, vendor=Red Hat, Inc., io.openshift.expose-services=, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-iscsid, distribution-scope=public, build-date=2026-01-12T22:34:43Z, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 23 08:28:43 np0005626463.localdomain podman[87233]: 2026-02-23 08:28:43.503892717 +0000 UTC m=+0.258858619 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, version=17.1.13, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:34:43Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step3, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-iscsid, distribution-scope=public, build-date=2026-01-12T22:34:43Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.buildah.version=1.41.5, container_name=iscsid, vendor=Red Hat, Inc.)
Feb 23 08:28:43 np0005626463.localdomain systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully.
Feb 23 08:28:43 np0005626463.localdomain podman[87242]: 2026-02-23 08:28:43.520087803 +0000 UTC m=+0.267308182 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, version=17.1.13, maintainer=OpenStack TripleO Team, build-date=2026-01-12T23:32:04Z, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.openshift.expose-services=, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, io.buildah.version=1.41.5, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, container_name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 23 08:28:43 np0005626463.localdomain podman[87242]: 2026-02-23 08:28:43.552160557 +0000 UTC m=+0.299380876 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., distribution-scope=public, org.opencontainers.image.created=2026-01-12T23:32:04Z, url=https://www.redhat.com, architecture=x86_64, config_id=tripleo_step5, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, name=rhosp-rhel9/openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, batch=17.1_20260112.1, io.buildah.version=1.41.5, build-date=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, container_name=nova_compute)
Feb 23 08:28:43 np0005626463.localdomain systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Deactivated successfully.
Feb 23 08:28:44 np0005626463.localdomain systemd[1]: tmp-crun.rCaPcr.mount: Deactivated successfully.
Feb 23 08:28:47 np0005626463.localdomain sshd[87343]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:28:47 np0005626463.localdomain sshd[87343]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:28:47 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.
Feb 23 08:28:47 np0005626463.localdomain systemd[1]: tmp-crun.aTjdAp.mount: Deactivated successfully.
Feb 23 08:28:47 np0005626463.localdomain podman[87345]: 2026-02-23 08:28:47.831166333 +0000 UTC m=+0.090490559 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, build-date=2026-01-12T23:32:04Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, vcs-type=git, io.openshift.expose-services=, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., batch=17.1_20260112.1, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, version=17.1.13, architecture=x86_64, managed_by=tripleo_ansible, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, config_id=tripleo_step4, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_migration_target, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-nova-compute)
Feb 23 08:28:48 np0005626463.localdomain podman[87345]: 2026-02-23 08:28:48.222242464 +0000 UTC m=+0.481566720 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, version=17.1.13, container_name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, konflux.additional-tags=17.1.13 
17.1_20260112.1, build-date=2026-01-12T23:32:04Z, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5)
Feb 23 08:28:48 np0005626463.localdomain systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully.
Feb 23 08:28:49 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.
Feb 23 08:28:49 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.
Feb 23 08:28:49 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.
Feb 23 08:28:49 np0005626463.localdomain systemd[1]: tmp-crun.JZSdKG.mount: Deactivated successfully.
Feb 23 08:28:49 np0005626463.localdomain podman[87368]: 2026-02-23 08:28:49.922377246 +0000 UTC m=+0.097539092 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vcs-type=git, version=17.1.13, build-date=2026-01-12T22:36:40Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, release=1766032510, url=https://www.redhat.com, distribution-scope=public, com.redhat.component=openstack-ovn-controller-container, description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, config_id=tripleo_step4, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, io.openshift.expose-services=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.buildah.version=1.41.5, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 23 08:28:49 np0005626463.localdomain podman[87368]: 2026-02-23 08:28:49.97644214 +0000 UTC m=+0.151603916 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.openshift.expose-services=, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, managed_by=tripleo_ansible, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, container_name=ovn_controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, summary=Red Hat OpenStack Platform 17.1 ovn-controller, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, build-date=2026-01-12T22:36:40Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, tcib_managed=true, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:36:40Z, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 23 08:28:49 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Deactivated successfully.
Feb 23 08:28:50 np0005626463.localdomain podman[87369]: 2026-02-23 08:28:49.977276895 +0000 UTC m=+0.147644176 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, tcib_managed=true, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, managed_by=tripleo_ansible, architecture=x86_64, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vcs-type=git, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, io.buildah.version=1.41.5, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, build-date=2026-01-12T22:56:19Z, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:56:19Z, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, release=1766032510, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Feb 23 08:28:50 np0005626463.localdomain podman[87369]: 2026-02-23 08:28:50.058704332 +0000 UTC m=+0.229071653 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, config_id=tripleo_step4, architecture=x86_64, batch=17.1_20260112.1, build-date=2026-01-12T22:56:19Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.buildah.version=1.41.5, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, managed_by=tripleo_ansible, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, version=17.1.13, url=https://www.redhat.com, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn)
Feb 23 08:28:50 np0005626463.localdomain podman[87370]: 2026-02-23 08:28:50.069690502 +0000 UTC m=+0.240088584 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, name=rhosp-rhel9/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step1, io.buildah.version=1.41.5, version=17.1.13, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, release=1766032510, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc., url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, build-date=2026-01-12T22:10:14Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1)
Feb 23 08:28:50 np0005626463.localdomain systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Deactivated successfully.
Feb 23 08:28:50 np0005626463.localdomain podman[87370]: 2026-02-23 08:28:50.272473775 +0000 UTC m=+0.442871857 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., io.buildah.version=1.41.5, io.openshift.expose-services=, architecture=x86_64, build-date=2026-01-12T22:10:14Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, managed_by=tripleo_ansible, config_id=tripleo_step1, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:10:14Z, version=17.1.13, batch=17.1_20260112.1, container_name=metrics_qdr, maintainer=OpenStack TripleO Team, release=1766032510, vcs-type=git, name=rhosp-rhel9/openstack-qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container)
Feb 23 08:28:50 np0005626463.localdomain systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully.
Feb 23 08:29:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb 23 08:29:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 3000.1 total, 600.0 interval
                                                          Cumulative writes: 5152 writes, 23K keys, 5152 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s
                                                          Cumulative WAL: 5152 writes, 679 syncs, 7.59 writes per sync, written: 0.02 GB, 0.01 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 622 writes, 2372 keys, 622 commit groups, 1.0 writes per commit group, ingest: 2.93 MB, 0.00 MB/s
                                                          Interval WAL: 622 writes, 215 syncs, 2.89 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Feb 23 08:29:03 np0005626463.localdomain sudo[87443]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 08:29:03 np0005626463.localdomain sudo[87443]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:29:03 np0005626463.localdomain sudo[87443]: pam_unix(sudo:session): session closed for user root
Feb 23 08:29:03 np0005626463.localdomain sudo[87458]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 23 08:29:03 np0005626463.localdomain sudo[87458]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:29:04 np0005626463.localdomain sudo[87458]: pam_unix(sudo:session): session closed for user root
Feb 23 08:29:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb 23 08:29:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 3000.1 total, 600.0 interval
                                                          Cumulative writes: 5421 writes, 24K keys, 5421 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s
                                                          Cumulative WAL: 5421 writes, 705 syncs, 7.69 writes per sync, written: 0.02 GB, 0.01 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 408 writes, 1665 keys, 408 commit groups, 1.0 writes per commit group, ingest: 2.08 MB, 0.00 MB/s
                                                          Interval WAL: 408 writes, 144 syncs, 2.83 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Feb 23 08:29:08 np0005626463.localdomain sudo[87506]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 08:29:08 np0005626463.localdomain sudo[87506]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:29:08 np0005626463.localdomain sudo[87506]: pam_unix(sudo:session): session closed for user root
Feb 23 08:29:11 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.
Feb 23 08:29:11 np0005626463.localdomain podman[87521]: 2026-02-23 08:29:11.891490515 +0000 UTC m=+0.062310873 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.openshift.expose-services=, batch=17.1_20260112.1, release=1766032510, com.redhat.component=openstack-collectd-container, build-date=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., config_id=tripleo_step3, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, name=rhosp-rhel9/openstack-collectd, container_name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, org.opencontainers.image.created=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, architecture=x86_64)
Feb 23 08:29:11 np0005626463.localdomain podman[87521]: 2026-02-23 08:29:11.900849956 +0000 UTC m=+0.071670284 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:10:15Z, summary=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, config_id=tripleo_step3, release=1766032510, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-collectd, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-collectd-container, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, architecture=x86_64, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.41.5, container_name=collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, io.openshift.expose-services=)
Feb 23 08:29:11 np0005626463.localdomain systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully.
Feb 23 08:29:13 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.
Feb 23 08:29:13 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.
Feb 23 08:29:13 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.
Feb 23 08:29:13 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.
Feb 23 08:29:13 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.
Feb 23 08:29:13 np0005626463.localdomain podman[87542]: 2026-02-23 08:29:13.925039477 +0000 UTC m=+0.098652956 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, release=1766032510, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, container_name=ceilometer_agent_compute, io.buildah.version=1.41.5, vcs-type=git, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-compute-container, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, distribution-scope=public, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, architecture=x86_64, vendor=Red Hat, Inc., tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 
'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2026-01-12T23:07:47Z)
Feb 23 08:29:13 np0005626463.localdomain podman[87541]: 2026-02-23 08:29:13.972442124 +0000 UTC m=+0.145525746 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, name=rhosp-rhel9/openstack-iscsid, maintainer=OpenStack TripleO Team, vcs-ref=705339545363fec600102567c4e923938e0f43b3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, config_id=tripleo_step3, com.redhat.component=openstack-iscsid-container, build-date=2026-01-12T22:34:43Z, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, url=https://www.redhat.com, 
architecture=x86_64, io.openshift.expose-services=, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, org.opencontainers.image.created=2026-01-12T22:34:43Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=iscsid, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, release=1766032510, version=17.1.13)
Feb 23 08:29:13 np0005626463.localdomain podman[87542]: 2026-02-23 08:29:13.988262693 +0000 UTC m=+0.161876122 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., managed_by=tripleo_ansible, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_id=tripleo_step4, vcs-type=git, build-date=2026-01-12T23:07:47Z, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, batch=17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_compute, version=17.1.13, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64)
Feb 23 08:29:13 np0005626463.localdomain systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully.
Feb 23 08:29:14 np0005626463.localdomain podman[87543]: 2026-02-23 08:29:14.027685577 +0000 UTC m=+0.199733657 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, config_id=tripleo_step4, distribution-scope=public, batch=17.1_20260112.1, container_name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, version=17.1.13, build-date=2026-01-12T23:07:30Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, release=1766032510, vendor=Red Hat, Inc., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:30Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.openshift.expose-services=)
Feb 23 08:29:14 np0005626463.localdomain podman[87541]: 2026-02-23 08:29:14.062341302 +0000 UTC m=+0.235424934 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-iscsid-container, url=https://www.redhat.com, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=705339545363fec600102567c4e923938e0f43b3, build-date=2026-01-12T22:34:43Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, batch=17.1_20260112.1, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, 
konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, maintainer=OpenStack TripleO Team, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:34:43Z, description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, io.openshift.expose-services=, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, release=1766032510, version=17.1.13, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-iscsid)
Feb 23 08:29:14 np0005626463.localdomain systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully.
Feb 23 08:29:14 np0005626463.localdomain podman[87545]: 2026-02-23 08:29:14.074129884 +0000 UTC m=+0.241472225 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, managed_by=tripleo_ansible, container_name=nova_compute, batch=17.1_20260112.1, build-date=2026-01-12T23:32:04Z, org.opencontainers.image.created=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', 
'/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, version=17.1.13, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, release=1766032510, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe)
Feb 23 08:29:14 np0005626463.localdomain podman[87543]: 2026-02-23 08:29:14.080363911 +0000 UTC m=+0.252412041 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, architecture=x86_64, name=rhosp-rhel9/openstack-ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13, com.redhat.component=openstack-ceilometer-ipmi-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, config_id=tripleo_step4, build-date=2026-01-12T23:07:30Z, release=1766032510, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:07:30Z)
Feb 23 08:29:14 np0005626463.localdomain systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully.
Feb 23 08:29:14 np0005626463.localdomain podman[87544]: 2026-02-23 08:29:14.123459372 +0000 UTC m=+0.290895686 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, release=1766032510, managed_by=tripleo_ansible, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20260112.1, 
konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, url=https://www.redhat.com, vcs-type=git, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, version=17.1.13, build-date=2026-01-12T22:10:15Z, name=rhosp-rhel9/openstack-cron)
Feb 23 08:29:14 np0005626463.localdomain podman[87544]: 2026-02-23 08:29:14.133246101 +0000 UTC m=+0.300682455 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.component=openstack-cron-container, build-date=2026-01-12T22:10:15Z, config_id=tripleo_step4, batch=17.1_20260112.1, vendor=Red Hat, Inc., release=1766032510, tcib_managed=true, io.openshift.expose-services=, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=logrotate_crond, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, name=rhosp-rhel9/openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 23 08:29:14 np0005626463.localdomain systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully.
Feb 23 08:29:14 np0005626463.localdomain podman[87545]: 2026-02-23 08:29:14.155661679 +0000 UTC m=+0.323004000 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:32:04Z, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., distribution-scope=public, architecture=x86_64, release=1766032510, com.redhat.component=openstack-nova-compute-container, container_name=nova_compute, name=rhosp-rhel9/openstack-nova-compute, io.buildah.version=1.41.5, io.openshift.expose-services=, vcs-type=git, build-date=2026-01-12T23:32:04Z, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 
'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']})
Feb 23 08:29:14 np0005626463.localdomain systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Deactivated successfully.
Feb 23 08:29:14 np0005626463.localdomain sshd[87659]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:29:15 np0005626463.localdomain sshd[87659]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:29:18 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.
Feb 23 08:29:18 np0005626463.localdomain podman[87661]: 2026-02-23 08:29:18.915044806 +0000 UTC m=+0.087039859 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, build-date=2026-01-12T23:32:04Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=nova_migration_target, version=17.1.13, url=https://www.redhat.com, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., release=1766032510, vcs-type=git, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.created=2026-01-12T23:32:04Z, name=rhosp-rhel9/openstack-nova-compute)
Feb 23 08:29:19 np0005626463.localdomain podman[87661]: 2026-02-23 08:29:19.332323181 +0000 UTC m=+0.504318244 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, release=1766032510, io.openshift.expose-services=, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, description=Red Hat 
OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, tcib_managed=true, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:32:04Z, container_name=nova_migration_target, managed_by=tripleo_ansible, batch=17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, architecture=x86_64, config_id=tripleo_step4)
Feb 23 08:29:19 np0005626463.localdomain systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully.
Feb 23 08:29:20 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.
Feb 23 08:29:20 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.
Feb 23 08:29:20 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.
Feb 23 08:29:20 np0005626463.localdomain systemd[1]: tmp-crun.X3irS5.mount: Deactivated successfully.
Feb 23 08:29:20 np0005626463.localdomain podman[87684]: 2026-02-23 08:29:20.957436463 +0000 UTC m=+0.134870929 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, com.redhat.component=openstack-ovn-controller-container, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vendor=Red Hat, Inc., batch=17.1_20260112.1, managed_by=tripleo_ansible, version=17.1.13, vcs-type=git, release=1766032510, summary=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_id=tripleo_step4, io.buildah.version=1.41.5, tcib_managed=true, container_name=ovn_controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, 
distribution-scope=public, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:36:40Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 23 08:29:21 np0005626463.localdomain podman[87686]: 2026-02-23 08:29:21.015447645 +0000 UTC m=+0.186688115 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, container_name=metrics_qdr, org.opencontainers.image.created=2026-01-12T22:10:14Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:10:14Z, version=17.1.13, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, vcs-type=git, release=1766032510, config_id=tripleo_step1, managed_by=tripleo_ansible, architecture=x86_64, name=rhosp-rhel9/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc.)
Feb 23 08:29:21 np0005626463.localdomain podman[87685]: 2026-02-23 08:29:20.981761252 +0000 UTC m=+0.155112169 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, managed_by=tripleo_ansible, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:56:19Z, release=1766032510, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, url=https://www.redhat.com, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, batch=17.1_20260112.1, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., distribution-scope=public, build-date=2026-01-12T22:56:19Z, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, container_name=ovn_metadata_agent, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 
1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-neutron-metadata-agent-ovn-container)
Feb 23 08:29:21 np0005626463.localdomain podman[87685]: 2026-02-23 08:29:21.067688144 +0000 UTC m=+0.241039021 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, org.opencontainers.image.created=2026-01-12T22:56:19Z, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, managed_by=tripleo_ansible, batch=17.1_20260112.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, build-date=2026-01-12T22:56:19Z, io.buildah.version=1.41.5, release=1766032510, url=https://www.redhat.com, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, distribution-scope=public, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, vcs-type=git)
Feb 23 08:29:21 np0005626463.localdomain systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Deactivated successfully.
Feb 23 08:29:21 np0005626463.localdomain podman[87684]: 2026-02-23 08:29:21.120205143 +0000 UTC m=+0.297639609 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, architecture=x86_64, container_name=ovn_controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, url=https://www.redhat.com, config_id=tripleo_step4, distribution-scope=public, tcib_managed=true, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:36:40Z, description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-type=git, release=1766032510, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c)
Feb 23 08:29:21 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Deactivated successfully.
Feb 23 08:29:21 np0005626463.localdomain podman[87686]: 2026-02-23 08:29:21.245356415 +0000 UTC m=+0.416596925 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-qdrouterd, url=https://www.redhat.com, io.buildah.version=1.41.5, distribution-scope=public, vcs-type=git, release=1766032510, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:14Z, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, managed_by=tripleo_ansible, build-date=2026-01-12T22:10:14Z, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd)
Feb 23 08:29:21 np0005626463.localdomain systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully.
Feb 23 08:29:21 np0005626463.localdomain systemd[1]: tmp-crun.YBg8Ml.mount: Deactivated successfully.
Feb 23 08:29:25 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:19:01:95 MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.104 DST=192.168.122.106 LEN=40 TOS=0x00 PREC=0x00 TTL=64 ID=0 DF PROTO=TCP SPT=5668 DPT=51706 SEQ=0 ACK=1003347813 WINDOW=0 RES=0x00 ACK RST URGP=0 
Feb 23 08:29:27 np0005626463.localdomain sshd[87761]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:29:28 np0005626463.localdomain sshd[87761]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:29:42 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.
Feb 23 08:29:42 np0005626463.localdomain systemd[1]: tmp-crun.3YIuMB.mount: Deactivated successfully.
Feb 23 08:29:42 np0005626463.localdomain podman[87810]: 2026-02-23 08:29:42.915917089 +0000 UTC m=+0.087111253 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-collectd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, batch=17.1_20260112.1, distribution-scope=public, version=17.1.13, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, architecture=x86_64, build-date=2026-01-12T22:10:15Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-type=git, config_id=tripleo_step3, container_name=collectd, url=https://www.redhat.com, tcib_managed=true)
Feb 23 08:29:42 np0005626463.localdomain podman[87810]: 2026-02-23 08:29:42.955284282 +0000 UTC m=+0.126478436 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, tcib_managed=true, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:10:15Z, vcs-type=git, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:10:15Z, vendor=Red Hat, Inc., io.openshift.expose-services=, name=rhosp-rhel9/openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, managed_by=tripleo_ansible, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, container_name=collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, version=17.1.13, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee)
Feb 23 08:29:42 np0005626463.localdomain systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully.
Feb 23 08:29:44 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.
Feb 23 08:29:44 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.
Feb 23 08:29:44 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.
Feb 23 08:29:44 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.
Feb 23 08:29:44 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.
Feb 23 08:29:45 np0005626463.localdomain systemd[1]: tmp-crun.x7ZaTW.mount: Deactivated successfully.
Feb 23 08:29:46 np0005626463.localdomain podman[87831]: 2026-02-23 08:29:45.999964856 +0000 UTC m=+1.173453992 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, build-date=2026-01-12T23:07:47Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.openshift.expose-services=, url=https://www.redhat.com, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=ceilometer_agent_compute, com.redhat.component=openstack-ceilometer-compute-container, org.opencontainers.image.created=2026-01-12T23:07:47Z, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, version=17.1.13, io.buildah.version=1.41.5, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-ceilometer-compute, release=1766032510, vendor=Red Hat, Inc., vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, distribution-scope=public)
Feb 23 08:29:46 np0005626463.localdomain podman[87833]: 2026-02-23 08:29:46.050390688 +0000 UTC m=+1.221187340 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, container_name=logrotate_crond, build-date=2026-01-12T22:10:15Z, version=17.1.13, name=rhosp-rhel9/openstack-cron, url=https://www.redhat.com, vendor=Red Hat, Inc., org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, batch=17.1_20260112.1, 
io.k8s.description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, com.redhat.component=openstack-cron-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, tcib_managed=true, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:15Z)
Feb 23 08:29:46 np0005626463.localdomain podman[87834]: 2026-02-23 08:29:46.090554797 +0000 UTC m=+1.259719707 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, org.opencontainers.image.created=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp-rhel9/openstack-nova-compute, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T23:32:04Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, config_id=tripleo_step5, vcs-type=git, io.buildah.version=1.41.5, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 
'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., container_name=nova_compute)
Feb 23 08:29:46 np0005626463.localdomain podman[87830]: 2026-02-23 08:29:46.098129905 +0000 UTC m=+1.270165925 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vendor=Red Hat, Inc., build-date=2026-01-12T22:34:43Z, distribution-scope=public, version=17.1.13, batch=17.1_20260112.1, vcs-ref=705339545363fec600102567c4e923938e0f43b3, tcib_managed=true, release=1766032510, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:34:43Z, vcs-type=git, architecture=x86_64, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-iscsid)
Feb 23 08:29:46 np0005626463.localdomain podman[87830]: 2026-02-23 08:29:46.108371909 +0000 UTC m=+1.280407919 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, release=1766032510, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:34:43Z, build-date=2026-01-12T22:34:43Z, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, vcs-type=git, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-iscsid, tcib_managed=true, io.buildah.version=1.41.5, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., distribution-scope=public, managed_by=tripleo_ansible, architecture=x86_64)
Feb 23 08:29:46 np0005626463.localdomain podman[87834]: 2026-02-23 08:29:46.116222567 +0000 UTC m=+1.285387477 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., version=17.1.13, container_name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step5, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, tcib_managed=true, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, release=1766032510, name=rhosp-rhel9/openstack-nova-compute, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 23 08:29:46 np0005626463.localdomain systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully.
Feb 23 08:29:46 np0005626463.localdomain podman[87831]: 2026-02-23 08:29:46.124463587 +0000 UTC m=+1.297952723 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-ceilometer-compute-container, version=17.1.13, org.opencontainers.image.created=2026-01-12T23:07:47Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., vcs-type=git, tcib_managed=true, name=rhosp-rhel9/openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, managed_by=tripleo_ansible, distribution-scope=public, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_compute, architecture=x86_64, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2026-01-12T23:07:47Z, config_id=tripleo_step4)
Feb 23 08:29:46 np0005626463.localdomain systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Deactivated successfully.
Feb 23 08:29:46 np0005626463.localdomain systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully.
Feb 23 08:29:46 np0005626463.localdomain podman[87833]: 2026-02-23 08:29:46.13628148 +0000 UTC m=+1.307078132 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, com.redhat.component=openstack-cron-container, batch=17.1_20260112.1, vendor=Red Hat, Inc., io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-cron, build-date=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=logrotate_crond, tcib_managed=true, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, version=17.1.13, architecture=x86_64, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers:/var/log/containers:z']}, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 cron)
Feb 23 08:29:46 np0005626463.localdomain systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully.
Feb 23 08:29:46 np0005626463.localdomain podman[87832]: 2026-02-23 08:29:46.19011052 +0000 UTC m=+1.363552034 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, tcib_managed=true, container_name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, io.openshift.expose-services=, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, url=https://www.redhat.com, release=1766032510, name=rhosp-rhel9/openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:07:30Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, batch=17.1_20260112.1, config_id=tripleo_step4, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T23:07:30Z, vcs-type=git)
Feb 23 08:29:46 np0005626463.localdomain podman[87832]: 2026-02-23 08:29:46.219213079 +0000 UTC m=+1.392654613 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-ipmi-container, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, managed_by=tripleo_ansible, io.openshift.expose-services=, distribution-scope=public, io.buildah.version=1.41.5, container_name=ceilometer_agent_ipmi, build-date=2026-01-12T23:07:30Z, 
name=rhosp-rhel9/openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, release=1766032510, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, tcib_managed=true)
Feb 23 08:29:46 np0005626463.localdomain systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully.
Feb 23 08:29:46 np0005626463.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Feb 23 08:29:46 np0005626463.localdomain recover_tripleo_nova_virtqemud[87950]: 61982
Feb 23 08:29:46 np0005626463.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Feb 23 08:29:46 np0005626463.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Feb 23 08:29:46 np0005626463.localdomain systemd[1]: tmp-crun.prKG6S.mount: Deactivated successfully.
Feb 23 08:29:49 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.
Feb 23 08:29:49 np0005626463.localdomain systemd[1]: tmp-crun.gGeak4.mount: Deactivated successfully.
Feb 23 08:29:49 np0005626463.localdomain podman[87951]: 2026-02-23 08:29:49.928068244 +0000 UTC m=+0.101451725 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vendor=Red Hat, Inc., release=1766032510, version=17.1.13, org.opencontainers.image.created=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2026-01-12T23:32:04Z, name=rhosp-rhel9/openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step4, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 23 08:29:50 np0005626463.localdomain podman[87951]: 2026-02-23 08:29:50.326302588 +0000 UTC m=+0.499686019 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, tcib_managed=true, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, version=17.1.13, build-date=2026-01-12T23:32:04Z, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, io.openshift.expose-services=, container_name=nova_migration_target, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp-rhel9/openstack-nova-compute, vcs-type=git, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1)
Feb 23 08:29:50 np0005626463.localdomain systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully.
Feb 23 08:29:51 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.
Feb 23 08:29:51 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.
Feb 23 08:29:51 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.
Feb 23 08:29:51 np0005626463.localdomain systemd[1]: tmp-crun.wuozqy.mount: Deactivated successfully.
Feb 23 08:29:51 np0005626463.localdomain podman[87975]: 2026-02-23 08:29:51.91596658 +0000 UTC m=+0.090163327 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, build-date=2026-01-12T22:36:40Z, org.opencontainers.image.created=2026-01-12T22:36:40Z, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-ovn-controller, distribution-scope=public, release=1766032510, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, config_id=tripleo_step4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, 
konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, container_name=ovn_controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 23 08:29:51 np0005626463.localdomain systemd[1]: tmp-crun.AWThvj.mount: Deactivated successfully.
Feb 23 08:29:51 np0005626463.localdomain podman[87976]: 2026-02-23 08:29:51.975205071 +0000 UTC m=+0.143956896 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:56:19Z, version=17.1.13, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.5, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', 
'/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, config_id=tripleo_step4, build-date=2026-01-12T22:56:19Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-type=git, container_name=ovn_metadata_agent, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, architecture=x86_64)
Feb 23 08:29:52 np0005626463.localdomain podman[87976]: 2026-02-23 08:29:52.01633114 +0000 UTC m=+0.185082965 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, url=https://www.redhat.com, version=17.1.13, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, distribution-scope=public, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., architecture=x86_64, io.buildah.version=1.41.5, release=1766032510, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, org.opencontainers.image.created=2026-01-12T22:56:19Z, container_name=ovn_metadata_agent)
Feb 23 08:29:52 np0005626463.localdomain systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Deactivated successfully.
Feb 23 08:29:52 np0005626463.localdomain podman[87977]: 2026-02-23 08:29:52.03216444 +0000 UTC m=+0.196541897 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, maintainer=OpenStack TripleO Team, architecture=x86_64, io.buildah.version=1.41.5, container_name=metrics_qdr, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step1, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, 
org.opencontainers.image.created=2026-01-12T22:10:14Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, io.openshift.expose-services=, build-date=2026-01-12T22:10:14Z, name=rhosp-rhel9/openstack-qdrouterd, vendor=Red Hat, Inc., com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 23 08:29:52 np0005626463.localdomain podman[87975]: 2026-02-23 08:29:52.042990712 +0000 UTC m=+0.217187439 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-type=git, io.openshift.expose-services=, com.redhat.component=openstack-ovn-controller-container, build-date=2026-01-12T22:36:40Z, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, managed_by=tripleo_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, maintainer=OpenStack TripleO Team, distribution-scope=public, release=1766032510, org.opencontainers.image.created=2026-01-12T22:36:40Z, batch=17.1_20260112.1, 
org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, name=rhosp-rhel9/openstack-ovn-controller, container_name=ovn_controller)
Feb 23 08:29:52 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Deactivated successfully.
Feb 23 08:29:52 np0005626463.localdomain podman[87977]: 2026-02-23 08:29:52.232154444 +0000 UTC m=+0.396531931 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, version=17.1.13, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, vcs-type=git, build-date=2026-01-12T22:10:14Z, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, container_name=metrics_qdr, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, io.buildah.version=1.41.5, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, batch=17.1_20260112.1)
Feb 23 08:29:52 np0005626463.localdomain systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully.
Feb 23 08:30:02 np0005626463.localdomain sshd[88053]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:30:03 np0005626463.localdomain sshd[88053]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:30:05 np0005626463.localdomain sshd[88055]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:30:05 np0005626463.localdomain sshd[88055]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:30:08 np0005626463.localdomain sudo[88057]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 08:30:08 np0005626463.localdomain sudo[88057]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:30:08 np0005626463.localdomain sudo[88057]: pam_unix(sudo:session): session closed for user root
Feb 23 08:30:08 np0005626463.localdomain sudo[88072]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 check-host
Feb 23 08:30:08 np0005626463.localdomain sudo[88072]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:30:09 np0005626463.localdomain sudo[88072]: pam_unix(sudo:session): session closed for user root
Feb 23 08:30:09 np0005626463.localdomain sudo[88108]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 08:30:09 np0005626463.localdomain sudo[88108]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:30:09 np0005626463.localdomain sudo[88108]: pam_unix(sudo:session): session closed for user root
Feb 23 08:30:09 np0005626463.localdomain sudo[88123]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 23 08:30:09 np0005626463.localdomain sudo[88123]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:30:10 np0005626463.localdomain sudo[88123]: pam_unix(sudo:session): session closed for user root
Feb 23 08:30:10 np0005626463.localdomain sudo[88169]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 08:30:10 np0005626463.localdomain sudo[88169]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:30:10 np0005626463.localdomain sudo[88169]: pam_unix(sudo:session): session closed for user root
Feb 23 08:30:13 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.
Feb 23 08:30:13 np0005626463.localdomain podman[88184]: 2026-02-23 08:30:13.925342542 +0000 UTC m=+0.094602658 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, config_id=tripleo_step3, container_name=collectd, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, com.redhat.component=openstack-collectd-container, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack 
Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, vcs-type=git, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:10:15Z, name=rhosp-rhel9/openstack-collectd, build-date=2026-01-12T22:10:15Z, release=1766032510, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, url=https://www.redhat.com)
Feb 23 08:30:13 np0005626463.localdomain podman[88184]: 2026-02-23 08:30:13.93828463 +0000 UTC m=+0.107544706 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, build-date=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, managed_by=tripleo_ansible, url=https://www.redhat.com, container_name=collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, vcs-type=git, name=rhosp-rhel9/openstack-collectd, config_id=tripleo_step3, version=17.1.13, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-collectd-container, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', 
'/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd)
Feb 23 08:30:13 np0005626463.localdomain systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully.
Feb 23 08:30:17 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:19:01:95 MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.104 DST=192.168.122.106 LEN=40 TOS=0x00 PREC=0x00 TTL=64 ID=0 DF PROTO=TCP SPT=5668 DPT=43100 SEQ=0 ACK=564930235 WINDOW=0 RES=0x00 ACK RST URGP=0 
Feb 23 08:30:17 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.
Feb 23 08:30:17 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.
Feb 23 08:30:17 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.
Feb 23 08:30:17 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.
Feb 23 08:30:17 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.
Feb 23 08:30:18 np0005626463.localdomain systemd[1]: tmp-crun.inRsQQ.mount: Deactivated successfully.
Feb 23 08:30:18 np0005626463.localdomain podman[88208]: 2026-02-23 08:30:18.037176902 +0000 UTC m=+0.071780278 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-ipmi-container, name=rhosp-rhel9/openstack-ceilometer-ipmi, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1766032510, config_id=tripleo_step4, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.buildah.version=1.41.5, vcs-type=git, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.created=2026-01-12T23:07:30Z, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, build-date=2026-01-12T23:07:30Z, container_name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 23 08:30:18 np0005626463.localdomain podman[88209]: 2026-02-23 08:30:18.068260223 +0000 UTC m=+0.101357822 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, url=https://www.redhat.com, name=rhosp-rhel9/openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:10:15Z, description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, release=1766032510, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, architecture=x86_64, build-date=2026-01-12T22:10:15Z, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, config_id=tripleo_step4, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vendor=Red Hat, Inc., vcs-type=git, com.redhat.component=openstack-cron-container, container_name=logrotate_crond, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee)
Feb 23 08:30:18 np0005626463.localdomain podman[88208]: 2026-02-23 08:30:18.082398999 +0000 UTC m=+0.117002375 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.openshift.expose-services=, version=17.1.13, batch=17.1_20260112.1, build-date=2026-01-12T23:07:30Z, architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, name=rhosp-rhel9/openstack-ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1766032510, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-ipmi-container, container_name=ceilometer_agent_ipmi, tcib_managed=true, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, vendor=Red Hat, Inc., io.buildah.version=1.41.5, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Feb 23 08:30:18 np0005626463.localdomain podman[88215]: 2026-02-23 08:30:18.087052406 +0000 UTC m=+0.118218674 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, build-date=2026-01-12T23:32:04Z, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, container_name=nova_compute, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, batch=17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.expose-services=, name=rhosp-rhel9/openstack-nova-compute, release=1766032510, tcib_managed=true, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, vendor=Red Hat, Inc., io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 23 08:30:18 np0005626463.localdomain systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully.
Feb 23 08:30:18 np0005626463.localdomain podman[88207]: 2026-02-23 08:30:18.358312251 +0000 UTC m=+0.395046345 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, release=1766032510, build-date=2026-01-12T23:07:47Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.5, container_name=ceilometer_agent_compute, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, managed_by=tripleo_ansible, distribution-scope=public, batch=17.1_20260112.1, config_id=tripleo_step4, name=rhosp-rhel9/openstack-ceilometer-compute, version=17.1.13, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 23 08:30:18 np0005626463.localdomain podman[88206]: 2026-02-23 08:30:18.362998199 +0000 UTC m=+0.403001866 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, name=rhosp-rhel9/openstack-iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:34:43Z, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, summary=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.component=openstack-iscsid-container, version=17.1.13, build-date=2026-01-12T22:34:43Z, managed_by=tripleo_ansible, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64)
Feb 23 08:30:18 np0005626463.localdomain podman[88206]: 2026-02-23 08:30:18.381276386 +0000 UTC m=+0.421280083 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, managed_by=tripleo_ansible, vcs-type=git, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, name=rhosp-rhel9/openstack-iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:34:43Z, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vendor=Red Hat, Inc., 
com.redhat.component=openstack-iscsid-container, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, url=https://www.redhat.com, io.buildah.version=1.41.5, build-date=2026-01-12T22:34:43Z, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=iscsid, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid)
Feb 23 08:30:18 np0005626463.localdomain systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully.
Feb 23 08:30:18 np0005626463.localdomain podman[88207]: 2026-02-23 08:30:18.408293419 +0000 UTC m=+0.445027493 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, tcib_managed=true, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, build-date=2026-01-12T23:07:47Z, org.opencontainers.image.created=2026-01-12T23:07:47Z, name=rhosp-rhel9/openstack-ceilometer-compute, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, version=17.1.13, config_id=tripleo_step4, vendor=Red Hat, Inc., vcs-type=git, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute)
Feb 23 08:30:18 np0005626463.localdomain systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully.
Feb 23 08:30:18 np0005626463.localdomain podman[88209]: 2026-02-23 08:30:18.426216875 +0000 UTC m=+0.459314444 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-type=git, name=rhosp-rhel9/openstack-cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-cron-container, build-date=2026-01-12T22:10:15Z, tcib_managed=true, distribution-scope=public, architecture=x86_64, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.openshift.expose-services=, release=1766032510, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, version=17.1.13, description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, batch=17.1_20260112.1)
Feb 23 08:30:18 np0005626463.localdomain podman[88215]: 2026-02-23 08:30:18.432226385 +0000 UTC m=+0.463392653 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, build-date=2026-01-12T23:32:04Z, distribution-scope=public, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, config_id=tripleo_step5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, io.openshift.expose-services=, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, release=1766032510, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', 
'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.created=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, container_name=nova_compute, name=rhosp-rhel9/openstack-nova-compute, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe)
Feb 23 08:30:18 np0005626463.localdomain systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully.
Feb 23 08:30:18 np0005626463.localdomain systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Deactivated successfully.
Feb 23 08:30:19 np0005626463.localdomain systemd[1]: tmp-crun.RPix3z.mount: Deactivated successfully.
Feb 23 08:30:20 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.
Feb 23 08:30:20 np0005626463.localdomain systemd[1]: tmp-crun.4Df6Fo.mount: Deactivated successfully.
Feb 23 08:30:20 np0005626463.localdomain podman[88320]: 2026-02-23 08:30:20.920797701 +0000 UTC m=+0.094621149 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, vcs-type=git, url=https://www.redhat.com, version=17.1.13, tcib_managed=true, managed_by=tripleo_ansible, batch=17.1_20260112.1, 
vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step4, build-date=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:32:04Z, name=rhosp-rhel9/openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1766032510, summary=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 23 08:30:21 np0005626463.localdomain podman[88320]: 2026-02-23 08:30:21.341312319 +0000 UTC m=+0.515135777 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, version=17.1.13, com.redhat.component=openstack-nova-compute-container, release=1766032510, io.openshift.expose-services=, container_name=nova_migration_target, architecture=x86_64, build-date=2026-01-12T23:32:04Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:32:04Z, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 23 08:30:21 np0005626463.localdomain systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully.
Feb 23 08:30:22 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.
Feb 23 08:30:22 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.
Feb 23 08:30:22 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.
Feb 23 08:30:22 np0005626463.localdomain podman[88346]: 2026-02-23 08:30:22.903521885 +0000 UTC m=+0.075035050 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:14Z, batch=17.1_20260112.1, com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible, config_id=tripleo_step1, build-date=2026-01-12T22:10:14Z, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, distribution-scope=public, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, architecture=x86_64, name=rhosp-rhel9/openstack-qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd)
Feb 23 08:30:22 np0005626463.localdomain systemd[1]: tmp-crun.dtQzxd.mount: Deactivated successfully.
Feb 23 08:30:22 np0005626463.localdomain podman[88345]: 2026-02-23 08:30:22.922798404 +0000 UTC m=+0.096204690 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:56:19Z, vendor=Red Hat, Inc., build-date=2026-01-12T22:56:19Z, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.expose-services=, architecture=x86_64, io.buildah.version=1.41.5, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, url=https://www.redhat.com, vcs-type=git, tcib_managed=true, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, managed_by=tripleo_ansible, release=1766032510, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 23 08:30:22 np0005626463.localdomain podman[88345]: 2026-02-23 08:30:22.991798982 +0000 UTC m=+0.165205328 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, build-date=2026-01-12T22:56:19Z, url=https://www.redhat.com, vcs-type=git, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.expose-services=, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, architecture=x86_64, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, container_name=ovn_metadata_agent, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:56:19Z, managed_by=tripleo_ansible, vendor=Red Hat, Inc., vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, batch=17.1_20260112.1, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team)
Feb 23 08:30:23 np0005626463.localdomain systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Deactivated successfully.
Feb 23 08:30:23 np0005626463.localdomain podman[88344]: 2026-02-23 08:30:22.997797451 +0000 UTC m=+0.172562239 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, name=rhosp-rhel9/openstack-ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, build-date=2026-01-12T22:36:40Z, architecture=x86_64, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, container_name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, release=1766032510, tcib_managed=true, config_id=tripleo_step4, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, version=17.1.13, url=https://www.redhat.com, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.buildah.version=1.41.5)
Feb 23 08:30:23 np0005626463.localdomain podman[88344]: 2026-02-23 08:30:23.082328171 +0000 UTC m=+0.257092949 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.buildah.version=1.41.5, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.component=openstack-ovn-controller-container, vendor=Red Hat, Inc., architecture=x86_64, container_name=ovn_controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, version=17.1.13, name=rhosp-rhel9/openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, build-date=2026-01-12T22:36:40Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, batch=17.1_20260112.1, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, url=https://www.redhat.com, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red 
Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller)
Feb 23 08:30:23 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Deactivated successfully.
Feb 23 08:30:23 np0005626463.localdomain podman[88346]: 2026-02-23 08:30:23.134478237 +0000 UTC m=+0.305991402 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, name=rhosp-rhel9/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:14Z, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, config_id=tripleo_step1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, version=17.1.13, distribution-scope=public, release=1766032510, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, 
container_name=metrics_qdr, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:10:14Z, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee)
Feb 23 08:30:23 np0005626463.localdomain systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully.
Feb 23 08:30:23 np0005626463.localdomain systemd[1]: tmp-crun.CtWVg0.mount: Deactivated successfully.
Feb 23 08:30:43 np0005626463.localdomain sshd[88464]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:30:43 np0005626463.localdomain sshd[88464]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:30:44 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.
Feb 23 08:30:44 np0005626463.localdomain systemd[1]: tmp-crun.VI8bHF.mount: Deactivated successfully.
Feb 23 08:30:44 np0005626463.localdomain podman[88466]: 2026-02-23 08:30:44.937312306 +0000 UTC m=+0.104726437 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 
collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z, vcs-type=git, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.expose-services=, managed_by=tripleo_ansible, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-collectd, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, com.redhat.component=openstack-collectd-container, config_id=tripleo_step3, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=collectd, architecture=x86_64)
Feb 23 08:30:44 np0005626463.localdomain podman[88466]: 2026-02-23 08:30:44.952355261 +0000 UTC m=+0.119769422 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, version=17.1.13, description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:10:15Z, container_name=collectd, architecture=x86_64, batch=17.1_20260112.1, com.redhat.component=openstack-collectd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, config_id=tripleo_step3, url=https://www.redhat.com, release=1766032510, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, build-date=2026-01-12T22:10:15Z)
Feb 23 08:30:44 np0005626463.localdomain systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully.
Feb 23 08:30:48 np0005626463.localdomain sshd[88485]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:30:48 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.
Feb 23 08:30:48 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.
Feb 23 08:30:48 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.
Feb 23 08:30:48 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.
Feb 23 08:30:48 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.
Feb 23 08:30:48 np0005626463.localdomain podman[88489]: 2026-02-23 08:30:48.925614817 +0000 UTC m=+0.088758994 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, version=17.1.13, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, release=1766032510, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:10:15Z, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, distribution-scope=public, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.buildah.version=1.41.5, container_name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, name=rhosp-rhel9/openstack-cron)
Feb 23 08:30:48 np0005626463.localdomain systemd[1]: tmp-crun.72JDQk.mount: Deactivated successfully.
Feb 23 08:30:48 np0005626463.localdomain podman[88487]: 2026-02-23 08:30:48.972029582 +0000 UTC m=+0.139877688 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, build-date=2026-01-12T23:07:47Z, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_id=tripleo_step4, vcs-type=git, container_name=ceilometer_agent_compute, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, release=1766032510, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., version=17.1.13, tcib_managed=true, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute)
Feb 23 08:30:49 np0005626463.localdomain podman[88489]: 2026-02-23 08:30:49.011379904 +0000 UTC m=+0.174524051 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vcs-type=git, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z, name=rhosp-rhel9/openstack-cron, io.openshift.expose-services=, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-cron-container, version=17.1.13, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, org.opencontainers.image.created=2026-01-12T22:10:15Z, description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 23 08:30:49 np0005626463.localdomain podman[88487]: 2026-02-23 08:30:49.020951107 +0000 UTC m=+0.188799213 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, version=17.1.13, name=rhosp-rhel9/openstack-ceilometer-compute, vcs-type=git, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:47Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, tcib_managed=true, config_id=tripleo_step4, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=ceilometer_agent_compute, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-compute-container, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:07:47Z, architecture=x86_64)
Feb 23 08:30:49 np0005626463.localdomain systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully.
Feb 23 08:30:49 np0005626463.localdomain podman[88486]: 2026-02-23 08:30:49.030701285 +0000 UTC m=+0.198894071 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, architecture=x86_64, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, managed_by=tripleo_ansible, vcs-ref=705339545363fec600102567c4e923938e0f43b3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, io.buildah.version=1.41.5, build-date=2026-01-12T22:34:43Z, name=rhosp-rhel9/openstack-iscsid, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, tcib_managed=true, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, version=17.1.13, container_name=iscsid, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, summary=Red Hat OpenStack Platform 17.1 iscsid)
Feb 23 08:30:49 np0005626463.localdomain systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully.
Feb 23 08:30:49 np0005626463.localdomain podman[88486]: 2026-02-23 08:30:49.067339671 +0000 UTC m=+0.235532427 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, managed_by=tripleo_ansible, distribution-scope=public, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, container_name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, com.redhat.component=openstack-iscsid-container, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., io.buildah.version=1.41.5, build-date=2026-01-12T22:34:43Z, tcib_managed=true, io.openshift.expose-services=, release=1766032510, vcs-ref=705339545363fec600102567c4e923938e0f43b3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, name=rhosp-rhel9/openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:34:43Z, version=17.1.13, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid)
Feb 23 08:30:49 np0005626463.localdomain systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully.
Feb 23 08:30:49 np0005626463.localdomain podman[88500]: 2026-02-23 08:30:48.950619507 +0000 UTC m=+0.103792248 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, distribution-scope=public, io.openshift.expose-services=, release=1766032510, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', 
'/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:32:04Z, version=17.1.13, com.redhat.component=openstack-nova-compute-container, build-date=2026-01-12T23:32:04Z, name=rhosp-rhel9/openstack-nova-compute, config_id=tripleo_step5, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, container_name=nova_compute, vcs-type=git, url=https://www.redhat.com, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 23 08:30:49 np0005626463.localdomain podman[88488]: 2026-02-23 08:30:49.071465272 +0000 UTC m=+0.235375743 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, version=17.1.13, batch=17.1_20260112.1, build-date=2026-01-12T23:07:30Z, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T23:07:30Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-ipmi-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.5, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, vcs-type=git)
Feb 23 08:30:49 np0005626463.localdomain podman[88500]: 2026-02-23 08:30:49.134575094 +0000 UTC m=+0.287747865 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', 
'/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., io.buildah.version=1.41.5, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-type=git, distribution-scope=public, build-date=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp-rhel9/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, container_name=nova_compute, architecture=x86_64, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, release=1766032510)
Feb 23 08:30:49 np0005626463.localdomain systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Deactivated successfully.
Feb 23 08:30:49 np0005626463.localdomain podman[88488]: 2026-02-23 08:30:49.155397782 +0000 UTC m=+0.319308233 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, url=https://www.redhat.com, architecture=x86_64, batch=17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, maintainer=OpenStack TripleO Team, version=17.1.13, name=rhosp-rhel9/openstack-ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, tcib_managed=true, vendor=Red Hat, Inc., release=1766032510, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T23:07:30Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, build-date=2026-01-12T23:07:30Z, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-ceilometer-ipmi-container)
Feb 23 08:30:49 np0005626463.localdomain systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully.
Feb 23 08:30:49 np0005626463.localdomain systemd[1]: tmp-crun.N2GObt.mount: Deactivated successfully.
Feb 23 08:30:50 np0005626463.localdomain sshd[88485]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:30:51 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.
Feb 23 08:30:51 np0005626463.localdomain systemd[1]: tmp-crun.cQOpZe.mount: Deactivated successfully.
Feb 23 08:30:51 np0005626463.localdomain podman[88603]: 2026-02-23 08:30:51.916312447 +0000 UTC m=+0.089384754 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp-rhel9/openstack-nova-compute, release=1766032510, vendor=Red Hat, Inc., url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, config_id=tripleo_step4, distribution-scope=public, architecture=x86_64, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, container_name=nova_migration_target, org.opencontainers.image.created=2026-01-12T23:32:04Z, version=17.1.13, io.openshift.expose-services=, build-date=2026-01-12T23:32:04Z, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 
17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 23 08:30:52 np0005626463.localdomain podman[88603]: 2026-02-23 08:30:52.297058759 +0000 UTC m=+0.470131086 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T23:32:04Z, build-date=2026-01-12T23:32:04Z, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, vcs-type=git, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, managed_by=tripleo_ansible, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, io.openshift.expose-services=, tcib_managed=true, vendor=Red Hat, Inc., release=1766032510)
Feb 23 08:30:52 np0005626463.localdomain systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully.
Feb 23 08:30:53 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.
Feb 23 08:30:53 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.
Feb 23 08:30:53 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.
Feb 23 08:30:53 np0005626463.localdomain podman[88626]: 2026-02-23 08:30:53.911649349 +0000 UTC m=+0.076653681 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, org.opencontainers.image.created=2026-01-12T22:36:40Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, vcs-type=git, config_id=tripleo_step4, build-date=2026-01-12T22:36:40Z, release=1766032510, summary=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, architecture=x86_64, distribution-scope=public, vendor=Red Hat, Inc., com.redhat.component=openstack-ovn-controller-container, url=https://www.redhat.com, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, tcib_managed=true, 
io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=)
Feb 23 08:30:53 np0005626463.localdomain podman[88626]: 2026-02-23 08:30:53.963301571 +0000 UTC m=+0.128305943 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, distribution-scope=public, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:36:40Z, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., release=1766032510, org.opencontainers.image.created=2026-01-12T22:36:40Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vcs-type=git, config_id=tripleo_step4, container_name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, architecture=x86_64, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.buildah.version=1.41.5, batch=17.1_20260112.1)
Feb 23 08:30:53 np0005626463.localdomain systemd[1]: tmp-crun.zrVwQE.mount: Deactivated successfully.
Feb 23 08:30:53 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Deactivated successfully.
Feb 23 08:30:53 np0005626463.localdomain podman[88628]: 2026-02-23 08:30:53.98513191 +0000 UTC m=+0.145465654 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, architecture=x86_64, tcib_managed=true, config_id=tripleo_step1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, container_name=metrics_qdr, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, 
distribution-scope=public, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, build-date=2026-01-12T22:10:14Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-qdrouterd, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, maintainer=OpenStack TripleO Team)
Feb 23 08:30:54 np0005626463.localdomain podman[88627]: 2026-02-23 08:30:54.02376985 +0000 UTC m=+0.187531032 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, config_id=tripleo_step4, batch=17.1_20260112.1, distribution-scope=public, 
org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, container_name=ovn_metadata_agent, vendor=Red Hat, Inc., release=1766032510, architecture=x86_64, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:56:19Z, io.openshift.expose-services=, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, url=https://www.redhat.com, managed_by=tripleo_ansible, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:56:19Z)
Feb 23 08:30:54 np0005626463.localdomain podman[88627]: 2026-02-23 08:30:54.072344234 +0000 UTC m=+0.236105406 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, url=https://www.redhat.com, release=1766032510, batch=17.1_20260112.1, distribution-scope=public, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, managed_by=tripleo_ansible, build-date=2026-01-12T22:56:19Z, config_id=tripleo_step4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.created=2026-01-12T22:56:19Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, vendor=Red Hat, Inc., architecture=x86_64, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 
'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']})
Feb 23 08:30:54 np0005626463.localdomain systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Deactivated successfully.
Feb 23 08:30:54 np0005626463.localdomain podman[88628]: 2026-02-23 08:30:54.199662003 +0000 UTC m=+0.359995717 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.expose-services=, build-date=2026-01-12T22:10:14Z, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., container_name=metrics_qdr, org.opencontainers.image.created=2026-01-12T22:10:14Z, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, release=1766032510, architecture=x86_64, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']})
Feb 23 08:30:54 np0005626463.localdomain systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully.
Feb 23 08:31:10 np0005626463.localdomain sudo[88701]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 08:31:10 np0005626463.localdomain sudo[88701]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:31:10 np0005626463.localdomain sudo[88701]: pam_unix(sudo:session): session closed for user root
Feb 23 08:31:10 np0005626463.localdomain sudo[88716]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 23 08:31:10 np0005626463.localdomain sudo[88716]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:31:11 np0005626463.localdomain sudo[88716]: pam_unix(sudo:session): session closed for user root
Feb 23 08:31:12 np0005626463.localdomain sudo[88762]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 08:31:12 np0005626463.localdomain sudo[88762]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:31:12 np0005626463.localdomain sudo[88762]: pam_unix(sudo:session): session closed for user root
Feb 23 08:31:15 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.
Feb 23 08:31:15 np0005626463.localdomain systemd[1]: tmp-crun.ZVfIIo.mount: Deactivated successfully.
Feb 23 08:31:15 np0005626463.localdomain podman[88777]: 2026-02-23 08:31:15.903531557 +0000 UTC m=+0.076209527 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, version=17.1.13, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, 
build-date=2026-01-12T22:10:15Z, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-collectd, maintainer=OpenStack TripleO Team, distribution-scope=public, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, architecture=x86_64, container_name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-collectd-container, config_id=tripleo_step3, url=https://www.redhat.com, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:15Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd)
Feb 23 08:31:15 np0005626463.localdomain podman[88777]: 2026-02-23 08:31:15.915238687 +0000 UTC m=+0.087916687 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, architecture=x86_64, tcib_managed=true, build-date=2026-01-12T22:10:15Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, maintainer=OpenStack TripleO Team, distribution-scope=public, url=https://www.redhat.com, name=rhosp-rhel9/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, com.redhat.component=openstack-collectd-container, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_id=tripleo_step3, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=collectd, io.openshift.expose-services=, version=17.1.13, description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd)
Feb 23 08:31:15 np0005626463.localdomain systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully.
Feb 23 08:31:19 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.
Feb 23 08:31:19 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.
Feb 23 08:31:19 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.
Feb 23 08:31:19 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.
Feb 23 08:31:19 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.
Feb 23 08:31:19 np0005626463.localdomain systemd[1]: tmp-crun.shRjf6.mount: Deactivated successfully.
Feb 23 08:31:19 np0005626463.localdomain podman[88801]: 2026-02-23 08:31:19.983041557 +0000 UTC m=+0.146214898 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, config_id=tripleo_step5, url=https://www.redhat.com, batch=17.1_20260112.1, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, build-date=2026-01-12T23:32:04Z, release=1766032510, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, io.openshift.expose-services=, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp-rhel9/openstack-nova-compute, version=17.1.13, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., container_name=nova_compute)
Feb 23 08:31:19 np0005626463.localdomain podman[88797]: 2026-02-23 08:31:19.934915847 +0000 UTC m=+0.110453729 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:34:43Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:34:43Z, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 
17.1_20260112.1, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, com.redhat.component=openstack-iscsid-container, release=1766032510, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., config_id=tripleo_step3, vcs-ref=705339545363fec600102567c4e923938e0f43b3, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-iscsid, tcib_managed=true, architecture=x86_64, url=https://www.redhat.com)
Feb 23 08:31:20 np0005626463.localdomain podman[88797]: 2026-02-23 08:31:20.014617563 +0000 UTC m=+0.190155375 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step3, tcib_managed=true, vcs-ref=705339545363fec600102567c4e923938e0f43b3, com.redhat.component=openstack-iscsid-container, architecture=x86_64, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, release=1766032510, build-date=2026-01-12T22:34:43Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', 
'/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.buildah.version=1.41.5, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, container_name=iscsid, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:34:43Z, name=rhosp-rhel9/openstack-iscsid, distribution-scope=public, version=17.1.13)
Feb 23 08:31:20 np0005626463.localdomain systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully.
Feb 23 08:31:20 np0005626463.localdomain podman[88799]: 2026-02-23 08:31:20.02715019 +0000 UTC m=+0.197916631 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, architecture=x86_64, name=rhosp-rhel9/openstack-ceilometer-ipmi, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, tcib_managed=true, org.opencontainers.image.created=2026-01-12T23:07:30Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1766032510, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, version=17.1.13, vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.5, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, build-date=2026-01-12T23:07:30Z, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 23 08:31:20 np0005626463.localdomain podman[88800]: 2026-02-23 08:31:20.043193886 +0000 UTC m=+0.208754412 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, build-date=2026-01-12T22:10:15Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, release=1766032510, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, version=17.1.13, url=https://www.redhat.com, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers:/var/log/containers:z']}, managed_by=tripleo_ansible, com.redhat.component=openstack-cron-container, vendor=Red Hat, Inc., tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-cron, vcs-type=git, io.buildah.version=1.41.5, io.openshift.expose-services=, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 cron)
Feb 23 08:31:20 np0005626463.localdomain podman[88799]: 2026-02-23 08:31:20.05851177 +0000 UTC m=+0.229278241 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, container_name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, distribution-scope=public, org.opencontainers.image.created=2026-01-12T23:07:30Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., io.buildah.version=1.41.5, version=17.1.13, 
name=rhosp-rhel9/openstack-ceilometer-ipmi, architecture=x86_64, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1766032510, com.redhat.component=openstack-ceilometer-ipmi-container, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, io.openshift.expose-services=, build-date=2026-01-12T23:07:30Z)
Feb 23 08:31:20 np0005626463.localdomain systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully.
Feb 23 08:31:20 np0005626463.localdomain podman[88798]: 2026-02-23 08:31:20.072076108 +0000 UTC m=+0.245961957 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, tcib_managed=true, build-date=2026-01-12T23:07:47Z, com.redhat.component=openstack-ceilometer-compute-container, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vendor=Red Hat, Inc., distribution-scope=public, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.expose-services=, architecture=x86_64, release=1766032510, org.opencontainers.image.created=2026-01-12T23:07:47Z, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20260112.1, container_name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 23 08:31:20 np0005626463.localdomain podman[88800]: 2026-02-23 08:31:20.126332001 +0000 UTC m=+0.291892517 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, release=1766032510, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:15Z, io.openshift.expose-services=, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, managed_by=tripleo_ansible, url=https://www.redhat.com, vcs-type=git, container_name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, build-date=2026-01-12T22:10:15Z, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, name=rhosp-rhel9/openstack-cron)
Feb 23 08:31:20 np0005626463.localdomain systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully.
Feb 23 08:31:20 np0005626463.localdomain podman[88801]: 2026-02-23 08:31:20.146133906 +0000 UTC m=+0.309307287 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2026-01-12T23:32:04Z, config_id=tripleo_step5, maintainer=OpenStack TripleO Team, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, vendor=Red Hat, Inc., io.openshift.expose-services=, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, container_name=nova_compute, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 
'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']})
Feb 23 08:31:20 np0005626463.localdomain systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Deactivated successfully.
Feb 23 08:31:20 np0005626463.localdomain podman[88798]: 2026-02-23 08:31:20.180545363 +0000 UTC m=+0.354431202 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20260112.1, managed_by=tripleo_ansible, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-compute-container, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, maintainer=OpenStack TripleO Team, 
summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, build-date=2026-01-12T23:07:47Z, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, tcib_managed=true, container_name=ceilometer_agent_compute, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T23:07:47Z, name=rhosp-rhel9/openstack-ceilometer-compute)
Feb 23 08:31:20 np0005626463.localdomain systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully.
Feb 23 08:31:20 np0005626463.localdomain sshd[88912]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:31:20 np0005626463.localdomain systemd[1]: tmp-crun.LaLP9m.mount: Deactivated successfully.
Feb 23 08:31:21 np0005626463.localdomain sshd[88912]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:31:22 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.
Feb 23 08:31:22 np0005626463.localdomain systemd[1]: tmp-crun.CrN1nI.mount: Deactivated successfully.
Feb 23 08:31:22 np0005626463.localdomain podman[88914]: 2026-02-23 08:31:22.929387748 +0000 UTC m=+0.099373890 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, release=1766032510, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, io.openshift.tags=rhosp osp openstack 
osp-17.1 openstack-nova-compute, vcs-type=git, url=https://www.redhat.com, architecture=x86_64, tcib_managed=true, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, vendor=Red Hat, Inc., build-date=2026-01-12T23:32:04Z, name=rhosp-rhel9/openstack-nova-compute, config_id=tripleo_step4, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 23 08:31:23 np0005626463.localdomain podman[88914]: 2026-02-23 08:31:23.309416843 +0000 UTC m=+0.479402965 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, io.buildah.version=1.41.5, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step4, container_name=nova_migration_target, managed_by=tripleo_ansible, tcib_managed=true, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']})
Feb 23 08:31:23 np0005626463.localdomain systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully.
Feb 23 08:31:24 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.
Feb 23 08:31:24 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.
Feb 23 08:31:24 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.
Feb 23 08:31:24 np0005626463.localdomain systemd[1]: tmp-crun.QoGgeI.mount: Deactivated successfully.
Feb 23 08:31:24 np0005626463.localdomain systemd[1]: tmp-crun.mgwdLU.mount: Deactivated successfully.
Feb 23 08:31:24 np0005626463.localdomain podman[88938]: 2026-02-23 08:31:24.975236067 +0000 UTC m=+0.150085011 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, release=1766032510, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, version=17.1.13, com.redhat.component=openstack-ovn-controller-container, batch=17.1_20260112.1, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:36:40Z, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_id=tripleo_step4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 
ovn-controller, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, io.buildah.version=1.41.5, container_name=ovn_controller, name=rhosp-rhel9/openstack-ovn-controller, io.openshift.expose-services=, build-date=2026-01-12T22:36:40Z)
Feb 23 08:31:25 np0005626463.localdomain podman[88940]: 2026-02-23 08:31:25.024756841 +0000 UTC m=+0.194518906 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, tcib_managed=true, vcs-type=git, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.openshift.tags=rhosp osp 
openstack osp-17.1 openstack-qdrouterd, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, version=17.1.13, batch=17.1_20260112.1, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 23 08:31:25 np0005626463.localdomain podman[88939]: 2026-02-23 08:31:24.940517647 +0000 UTC m=+0.115032290 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, url=https://www.redhat.com, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:56:19Z, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, vendor=Red Hat, Inc., io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, architecture=x86_64, distribution-scope=public, io.buildah.version=1.41.5, managed_by=tripleo_ansible, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, tcib_managed=true, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, build-date=2026-01-12T22:56:19Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, vcs-type=git)
Feb 23 08:31:25 np0005626463.localdomain podman[88938]: 2026-02-23 08:31:25.052806311 +0000 UTC m=+0.227655275 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-ovn-controller, build-date=2026-01-12T22:36:40Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.buildah.version=1.41.5, vcs-type=git, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, 
org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.component=openstack-ovn-controller-container, release=1766032510, version=17.1.13, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:36:40Z, config_id=tripleo_step4)
Feb 23 08:31:25 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Deactivated successfully.
Feb 23 08:31:25 np0005626463.localdomain podman[88939]: 2026-02-23 08:31:25.072257351 +0000 UTC m=+0.246771965 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, tcib_managed=true, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, org.opencontainers.image.created=2026-01-12T22:56:19Z, architecture=x86_64, release=1766032510, vcs-type=git, distribution-scope=public, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.buildah.version=1.41.5, build-date=2026-01-12T22:56:19Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=ovn_metadata_agent, batch=17.1_20260112.1)
Feb 23 08:31:25 np0005626463.localdomain systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Deactivated successfully.
Feb 23 08:31:25 np0005626463.localdomain podman[88940]: 2026-02-23 08:31:25.202326443 +0000 UTC m=+0.372088528 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, version=17.1.13, description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:10:14Z, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:10:14Z, vcs-type=git, config_id=tripleo_step1, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510)
Feb 23 08:31:25 np0005626463.localdomain systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully.
Feb 23 08:31:41 np0005626463.localdomain sshd[89035]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:31:41 np0005626463.localdomain sshd[89035]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:31:46 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.
Feb 23 08:31:46 np0005626463.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Feb 23 08:31:46 np0005626463.localdomain recover_tripleo_nova_virtqemud[89041]: 61982
Feb 23 08:31:46 np0005626463.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Feb 23 08:31:46 np0005626463.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Feb 23 08:31:46 np0005626463.localdomain podman[89037]: 2026-02-23 08:31:46.915262847 +0000 UTC m=+0.092619148 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, io.openshift.expose-services=, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step3, vcs-type=git, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, batch=17.1_20260112.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, managed_by=tripleo_ansible, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., build-date=2026-01-12T22:10:15Z, container_name=collectd, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-collectd-container)
Feb 23 08:31:46 np0005626463.localdomain podman[89037]: 2026-02-23 08:31:46.925277451 +0000 UTC m=+0.102633752 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-collectd, vendor=Red Hat, Inc., batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step3, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:10:15Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, com.redhat.component=openstack-collectd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, tcib_managed=true, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, release=1766032510, io.openshift.expose-services=, architecture=x86_64)
Feb 23 08:31:46 np0005626463.localdomain systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully.
Feb 23 08:31:50 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.
Feb 23 08:31:50 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.
Feb 23 08:31:50 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.
Feb 23 08:31:50 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.
Feb 23 08:31:50 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.
Feb 23 08:31:50 np0005626463.localdomain systemd[1]: tmp-crun.2b3ZlK.mount: Deactivated successfully.
Feb 23 08:31:50 np0005626463.localdomain podman[89060]: 2026-02-23 08:31:50.909839207 +0000 UTC m=+0.078042990 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, version=17.1.13, com.redhat.component=openstack-ceilometer-compute-container, release=1766032510, managed_by=tripleo_ansible, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, batch=17.1_20260112.1, architecture=x86_64, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, tcib_managed=true, container_name=ceilometer_agent_compute, build-date=2026-01-12T23:07:47Z, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, distribution-scope=public)
Feb 23 08:31:50 np0005626463.localdomain podman[89059]: 2026-02-23 08:31:50.96891301 +0000 UTC m=+0.136978909 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, release=1766032510, io.buildah.version=1.41.5, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-type=git, container_name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, vcs-ref=705339545363fec600102567c4e923938e0f43b3, 
architecture=x86_64, tcib_managed=true, batch=17.1_20260112.1, version=17.1.13, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2026-01-12T22:34:43Z, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, name=rhosp-rhel9/openstack-iscsid, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 23 08:31:50 np0005626463.localdomain podman[89059]: 2026-02-23 08:31:50.976715915 +0000 UTC m=+0.144781824 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, managed_by=tripleo_ansible, vendor=Red Hat, Inc., com.redhat.component=openstack-iscsid-container, org.opencontainers.image.created=2026-01-12T22:34:43Z, architecture=x86_64, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:34:43Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-ref=705339545363fec600102567c4e923938e0f43b3, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-iscsid, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, tcib_managed=true, release=1766032510, container_name=iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, config_id=tripleo_step3)
Feb 23 08:31:50 np0005626463.localdomain systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully.
Feb 23 08:31:50 np0005626463.localdomain podman[89060]: 2026-02-23 08:31:50.990409955 +0000 UTC m=+0.158613748 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., url=https://www.redhat.com, build-date=2026-01-12T23:07:47Z, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 
vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.5, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, release=1766032510, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Feb 23 08:31:50 np0005626463.localdomain podman[89062]: 2026-02-23 08:31:50.945498176 +0000 UTC m=+0.102873040 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, url=https://www.redhat.com, io.buildah.version=1.41.5, com.redhat.component=openstack-cron-container, build-date=2026-01-12T22:10:15Z, config_id=tripleo_step4, container_name=logrotate_crond, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, maintainer=OpenStack TripleO Team, release=1766032510, tcib_managed=true, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, vcs-type=git, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 23 08:31:51 np0005626463.localdomain systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully.
Feb 23 08:31:51 np0005626463.localdomain podman[89062]: 2026-02-23 08:31:51.02720181 +0000 UTC m=+0.184576684 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, build-date=2026-01-12T22:10:15Z, release=1766032510, container_name=logrotate_crond, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, batch=17.1_20260112.1, url=https://www.redhat.com, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, version=17.1.13, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, 
io.k8s.description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-cron-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-cron, vendor=Red Hat, Inc.)
Feb 23 08:31:51 np0005626463.localdomain systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully.
Feb 23 08:31:51 np0005626463.localdomain podman[89061]: 2026-02-23 08:31:51.076500086 +0000 UTC m=+0.237857305 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, url=https://www.redhat.com, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, com.redhat.component=openstack-ceilometer-ipmi-container, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_ipmi, io.buildah.version=1.41.5, build-date=2026-01-12T23:07:30Z, name=rhosp-rhel9/openstack-ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 23 08:31:51 np0005626463.localdomain podman[89061]: 2026-02-23 08:31:51.13239313 +0000 UTC m=+0.293750329 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, vcs-type=git, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:07:30Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, build-date=2026-01-12T23:07:30Z, version=17.1.13, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.expose-services=, tcib_managed=true, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-ceilometer-ipmi, config_id=tripleo_step4, container_name=ceilometer_agent_ipmi, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 
17.1 ceilometer-ipmi, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-ipmi-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Feb 23 08:31:51 np0005626463.localdomain podman[89073]: 2026-02-23 08:31:51.140795774 +0000 UTC m=+0.290553928 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:32:04Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, name=rhosp-rhel9/openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, version=17.1.13, build-date=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, container_name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, config_id=tripleo_step5, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, architecture=x86_64)
Feb 23 08:31:51 np0005626463.localdomain systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully.
Feb 23 08:31:51 np0005626463.localdomain podman[89073]: 2026-02-23 08:31:51.175299547 +0000 UTC m=+0.325057691 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, managed_by=tripleo_ansible, build-date=2026-01-12T23:32:04Z, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step5, container_name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, distribution-scope=public, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1766032510, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64)
Feb 23 08:31:51 np0005626463.localdomain systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Deactivated successfully.
Feb 23 08:31:53 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.
Feb 23 08:31:53 np0005626463.localdomain systemd[1]: tmp-crun.tLDK2e.mount: Deactivated successfully.
Feb 23 08:31:53 np0005626463.localdomain podman[89176]: 2026-02-23 08:31:53.920422827 +0000 UTC m=+0.094268408 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, org.opencontainers.image.created=2026-01-12T23:32:04Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, release=1766032510, container_name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, version=17.1.13, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, build-date=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, name=rhosp-rhel9/openstack-nova-compute)
Feb 23 08:31:54 np0005626463.localdomain podman[89176]: 2026-02-23 08:31:54.350210385 +0000 UTC m=+0.524056006 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, container_name=nova_migration_target, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, version=17.1.13, batch=17.1_20260112.1, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:32:04Z, build-date=2026-01-12T23:32:04Z)
Feb 23 08:31:54 np0005626463.localdomain systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully.
Feb 23 08:31:55 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.
Feb 23 08:31:55 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.
Feb 23 08:31:55 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.
Feb 23 08:31:55 np0005626463.localdomain podman[89200]: 2026-02-23 08:31:55.900894945 +0000 UTC m=+0.072889508 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, maintainer=OpenStack TripleO Team, distribution-scope=public, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, config_id=tripleo_step4, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.buildah.version=1.41.5, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, container_name=ovn_metadata_agent, build-date=2026-01-12T22:56:19Z, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, release=1766032510, vcs-type=git, batch=17.1_20260112.1, managed_by=tripleo_ansible, version=17.1.13, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 23 08:31:55 np0005626463.localdomain systemd[1]: tmp-crun.u4W6rI.mount: Deactivated successfully.
Feb 23 08:31:55 np0005626463.localdomain podman[89199]: 2026-02-23 08:31:55.970274553 +0000 UTC m=+0.144327120 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.openshift.expose-services=, io.buildah.version=1.41.5, vendor=Red Hat, Inc., managed_by=tripleo_ansible, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:36:40Z, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, container_name=ovn_controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, 
name=rhosp-rhel9/openstack-ovn-controller, vcs-type=git, tcib_managed=true, release=1766032510, config_id=tripleo_step4, batch=17.1_20260112.1, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2026-01-12T22:36:40Z)
Feb 23 08:31:55 np0005626463.localdomain podman[89200]: 2026-02-23 08:31:55.97336945 +0000 UTC m=+0.145364003 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:56:19Z, config_id=tripleo_step4, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, release=1766032510, tcib_managed=true, io.buildah.version=1.41.5, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, version=17.1.13, build-date=2026-01-12T22:56:19Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Feb 23 08:31:55 np0005626463.localdomain systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Deactivated successfully.
Feb 23 08:31:56 np0005626463.localdomain podman[89199]: 2026-02-23 08:31:56.019385894 +0000 UTC m=+0.193438431 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, managed_by=tripleo_ansible, container_name=ovn_controller, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, io.buildah.version=1.41.5, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:36:40Z, batch=17.1_20260112.1, com.redhat.component=openstack-ovn-controller-container, release=1766032510, tcib_managed=true, architecture=x86_64, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 
ovn-controller, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:36:40Z, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller)
Feb 23 08:31:56 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Deactivated successfully.
Feb 23 08:31:56 np0005626463.localdomain podman[89201]: 2026-02-23 08:31:56.067620807 +0000 UTC m=+0.235933845 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, vcs-type=git, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step1, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.buildah.version=1.41.5, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp-rhel9/openstack-qdrouterd, managed_by=tripleo_ansible, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, tcib_managed=true)
Feb 23 08:31:56 np0005626463.localdomain podman[89201]: 2026-02-23 08:31:56.285482094 +0000 UTC m=+0.453795151 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, container_name=metrics_qdr, io.buildah.version=1.41.5, url=https://www.redhat.com, batch=17.1_20260112.1, tcib_managed=true, config_id=tripleo_step1, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, build-date=2026-01-12T22:10:14Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1766032510, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd)
Feb 23 08:31:56 np0005626463.localdomain systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully.
Feb 23 08:31:59 np0005626463.localdomain sshd[89274]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:31:59 np0005626463.localdomain sshd[89274]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:32:12 np0005626463.localdomain sudo[89276]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 08:32:12 np0005626463.localdomain sudo[89276]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:32:12 np0005626463.localdomain sudo[89276]: pam_unix(sudo:session): session closed for user root
Feb 23 08:32:12 np0005626463.localdomain sudo[89291]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Feb 23 08:32:12 np0005626463.localdomain sudo[89291]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:32:13 np0005626463.localdomain podman[89379]: 2026-02-23 08:32:13.171164799 +0000 UTC m=+0.083711928 container exec fdf07215f0388d0ebc44f1f3744080ba594441e647c300d0dade62ff5beba234 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-crash-np0005626463, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, ceph=True, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.created=2026-02-09T10:25:24Z, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., io.buildah.version=1.42.2, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, CEPH_POINT_RELEASE=, build-date=2026-02-09T10:25:24Z, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, release=1770267347, com.redhat.component=rhceph-container, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, name=rhceph, version=7, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Feb 23 08:32:13 np0005626463.localdomain podman[89379]: 2026-02-23 08:32:13.308489688 +0000 UTC m=+0.221036867 container exec_died fdf07215f0388d0ebc44f1f3744080ba594441e647c300d0dade62ff5beba234 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-crash-np0005626463, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.component=rhceph-container, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, description=Red Hat Ceph Storage 7, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, version=7, io.openshift.expose-services=, org.opencontainers.image.created=2026-02-09T10:25:24Z, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, io.buildah.version=1.42.2, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, CEPH_POINT_RELEASE=, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, build-date=2026-02-09T10:25:24Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., vcs-type=git, release=1770267347)
Feb 23 08:32:13 np0005626463.localdomain sudo[89291]: pam_unix(sudo:session): session closed for user root
Feb 23 08:32:13 np0005626463.localdomain sudo[89445]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 08:32:13 np0005626463.localdomain sudo[89445]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:32:13 np0005626463.localdomain sudo[89445]: pam_unix(sudo:session): session closed for user root
Feb 23 08:32:13 np0005626463.localdomain sudo[89460]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 23 08:32:13 np0005626463.localdomain sudo[89460]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:32:14 np0005626463.localdomain sudo[89460]: pam_unix(sudo:session): session closed for user root
Feb 23 08:32:14 np0005626463.localdomain sudo[89507]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 08:32:14 np0005626463.localdomain sudo[89507]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:32:14 np0005626463.localdomain sudo[89507]: pam_unix(sudo:session): session closed for user root
Feb 23 08:32:17 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.
Feb 23 08:32:17 np0005626463.localdomain podman[89522]: 2026-02-23 08:32:17.92896796 +0000 UTC m=+0.099577286 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.5, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, batch=17.1_20260112.1, vendor=Red Hat, Inc., version=17.1.13, config_id=tripleo_step3, com.redhat.component=openstack-collectd-container, distribution-scope=public, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, build-date=2026-01-12T22:10:15Z, description=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, name=rhosp-rhel9/openstack-collectd)
Feb 23 08:32:17 np0005626463.localdomain podman[89522]: 2026-02-23 08:32:17.963805132 +0000 UTC m=+0.134414318 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, name=rhosp-rhel9/openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 collectd, 
distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, build-date=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, release=1766032510, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, vcs-type=git, url=https://www.redhat.com, managed_by=tripleo_ansible, com.redhat.component=openstack-collectd-container, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:15Z, io.buildah.version=1.41.5, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, container_name=collectd)
Feb 23 08:32:17 np0005626463.localdomain systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully.
Feb 23 08:32:21 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.
Feb 23 08:32:21 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.
Feb 23 08:32:21 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.
Feb 23 08:32:21 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.
Feb 23 08:32:21 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.
Feb 23 08:32:21 np0005626463.localdomain systemd[1]: tmp-crun.Ztkzj6.mount: Deactivated successfully.
Feb 23 08:32:21 np0005626463.localdomain podman[89545]: 2026-02-23 08:32:21.915431655 +0000 UTC m=+0.082388727 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.expose-services=, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, vendor=Red Hat, Inc., architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp-rhel9/openstack-ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, tcib_managed=true, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.5, distribution-scope=public, version=17.1.13, container_name=ceilometer_agent_ipmi, batch=17.1_20260112.1)
Feb 23 08:32:21 np0005626463.localdomain podman[89543]: 2026-02-23 08:32:21.964885626 +0000 UTC m=+0.140521381 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, container_name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-type=git, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, release=1766032510, summary=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, vcs-ref=705339545363fec600102567c4e923938e0f43b3, description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, version=17.1.13, name=rhosp-rhel9/openstack-iscsid, 
architecture=x86_64, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, build-date=2026-01-12T22:34:43Z, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, distribution-scope=public, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, io.buildah.version=1.41.5)
Feb 23 08:32:21 np0005626463.localdomain podman[89545]: 2026-02-23 08:32:21.969297865 +0000 UTC m=+0.136254957 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, vendor=Red Hat, Inc., config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, 
com.redhat.component=openstack-ceilometer-ipmi-container, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, url=https://www.redhat.com, architecture=x86_64, io.buildah.version=1.41.5, batch=17.1_20260112.1)
Feb 23 08:32:21 np0005626463.localdomain podman[89543]: 2026-02-23 08:32:21.978353368 +0000 UTC m=+0.153989143 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, container_name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, vendor=Red Hat, Inc., io.buildah.version=1.41.5, managed_by=tripleo_ansible, io.openshift.expose-services=, version=17.1.13, name=rhosp-rhel9/openstack-iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, release=1766032510, url=https://www.redhat.com, distribution-scope=public, vcs-ref=705339545363fec600102567c4e923938e0f43b3, architecture=x86_64, batch=17.1_20260112.1, config_id=tripleo_step3, org.opencontainers.image.created=2026-01-12T22:34:43Z, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2026-01-12T22:34:43Z, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 23 08:32:21 np0005626463.localdomain systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully.
Feb 23 08:32:21 np0005626463.localdomain systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully.
Feb 23 08:32:22 np0005626463.localdomain podman[89553]: 2026-02-23 08:32:22.021987788 +0000 UTC m=+0.180850856 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.buildah.version=1.41.5, config_id=tripleo_step5, container_name=nova_compute, tcib_managed=true, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, build-date=2026-01-12T23:32:04Z, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, name=rhosp-rhel9/openstack-nova-compute, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 23 08:32:22 np0005626463.localdomain podman[89553]: 2026-02-23 08:32:22.059966669 +0000 UTC m=+0.218829737 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:32:04Z, tcib_managed=true, release=1766032510, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, url=https://www.redhat.com, version=17.1.13, distribution-scope=public, container_name=nova_compute, vendor=Red Hat, Inc., config_id=tripleo_step5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, build-date=2026-01-12T23:32:04Z, vcs-type=git, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute)
Feb 23 08:32:22 np0005626463.localdomain systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Deactivated successfully.
Feb 23 08:32:22 np0005626463.localdomain podman[89544]: 2026-02-23 08:32:22.076094056 +0000 UTC m=+0.246297470 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, name=rhosp-rhel9/openstack-ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64, com.redhat.component=openstack-ceilometer-compute-container, version=17.1.13, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, config_id=tripleo_step4, 
url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T23:07:47Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:47Z, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, tcib_managed=true, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=ceilometer_agent_compute, distribution-scope=public)
Feb 23 08:32:22 np0005626463.localdomain podman[89544]: 2026-02-23 08:32:22.107179451 +0000 UTC m=+0.277382875 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20260112.1, container_name=ceilometer_agent_compute, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, build-date=2026-01-12T23:07:47Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 
io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, name=rhosp-rhel9/openstack-ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, release=1766032510, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.openshift.expose-services=, url=https://www.redhat.com, distribution-scope=public, io.buildah.version=1.41.5)
Feb 23 08:32:22 np0005626463.localdomain systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully.
Feb 23 08:32:22 np0005626463.localdomain podman[89551]: 2026-02-23 08:32:22.125815376 +0000 UTC m=+0.285879802 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, name=rhosp-rhel9/openstack-cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, build-date=2026-01-12T22:10:15Z, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, vendor=Red Hat, Inc., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_id=tripleo_step4, url=https://www.redhat.com, architecture=x86_64, distribution-scope=public, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-cron-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, maintainer=OpenStack TripleO Team, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1)
Feb 23 08:32:22 np0005626463.localdomain podman[89551]: 2026-02-23 08:32:22.131078752 +0000 UTC m=+0.291143168 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, name=rhosp-rhel9/openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:10:15Z, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, vcs-type=git, io.buildah.version=1.41.5, com.redhat.component=openstack-cron-container, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, architecture=x86_64, tcib_managed=true, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, release=1766032510, container_name=logrotate_crond, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']})
Feb 23 08:32:22 np0005626463.localdomain systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully.
Feb 23 08:32:24 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.
Feb 23 08:32:24 np0005626463.localdomain podman[89662]: 2026-02-23 08:32:24.894169548 +0000 UTC m=+0.068908714 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:32:04Z, name=rhosp-rhel9/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, distribution-scope=public, batch=17.1_20260112.1, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, io.buildah.version=1.41.5, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, container_name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, build-date=2026-01-12T23:32:04Z, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 23 08:32:25 np0005626463.localdomain podman[89662]: 2026-02-23 08:32:25.272999075 +0000 UTC m=+0.447738221 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, url=https://www.redhat.com, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, release=1766032510, config_id=tripleo_step4, 
cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., build-date=2026-01-12T23:32:04Z, org.opencontainers.image.created=2026-01-12T23:32:04Z, container_name=nova_migration_target, tcib_managed=true, version=17.1.13, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 23 08:32:25 np0005626463.localdomain systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully.
Feb 23 08:32:26 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.
Feb 23 08:32:26 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.
Feb 23 08:32:26 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.
Feb 23 08:32:26 np0005626463.localdomain podman[89684]: 2026-02-23 08:32:26.906834785 +0000 UTC m=+0.084909455 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.expose-services=, vendor=Red Hat, Inc., container_name=ovn_controller, distribution-scope=public, architecture=x86_64, tcib_managed=true, io.buildah.version=1.41.5, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, build-date=2026-01-12T22:36:40Z, 
org.opencontainers.image.created=2026-01-12T22:36:40Z, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 23 08:32:26 np0005626463.localdomain systemd[1]: tmp-crun.5EUDWj.mount: Deactivated successfully.
Feb 23 08:32:26 np0005626463.localdomain podman[89685]: 2026-02-23 08:32:26.965095433 +0000 UTC m=+0.138638361 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack TripleO Team, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, io.openshift.expose-services=, vcs-type=git, container_name=ovn_metadata_agent, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc.)
Feb 23 08:32:27 np0005626463.localdomain systemd[1]: tmp-crun.qNAicI.mount: Deactivated successfully.
Feb 23 08:32:27 np0005626463.localdomain podman[89686]: 2026-02-23 08:32:27.026935344 +0000 UTC m=+0.198880682 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:10:14Z, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, vcs-type=git, distribution-scope=public, com.redhat.component=openstack-qdrouterd-container, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-qdrouterd, release=1766032510, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, tcib_managed=true, config_id=tripleo_step1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, url=https://www.redhat.com, container_name=metrics_qdr)
Feb 23 08:32:27 np0005626463.localdomain podman[89684]: 2026-02-23 08:32:27.032493358 +0000 UTC m=+0.210568078 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, io.buildah.version=1.41.5, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, build-date=2026-01-12T22:36:40Z, distribution-scope=public, url=https://www.redhat.com, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, config_id=tripleo_step4, release=1766032510, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:36:40Z, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true)
Feb 23 08:32:27 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Deactivated successfully.
Feb 23 08:32:27 np0005626463.localdomain podman[89685]: 2026-02-23 08:32:27.053746456 +0000 UTC m=+0.227289374 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:56:19Z, version=17.1.13, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, build-date=2026-01-12T22:56:19Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, managed_by=tripleo_ansible, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vendor=Red Hat, Inc., distribution-scope=public, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, release=1766032510, container_name=ovn_metadata_agent, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5)
Feb 23 08:32:27 np0005626463.localdomain systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Deactivated successfully.
Feb 23 08:32:27 np0005626463.localdomain podman[89686]: 2026-02-23 08:32:27.217316758 +0000 UTC m=+0.389262066 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:10:14Z, batch=17.1_20260112.1, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:10:14Z, 
io.buildah.version=1.41.5, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step1, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, url=https://www.redhat.com, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=metrics_qdr, name=rhosp-rhel9/openstack-qdrouterd)
Feb 23 08:32:27 np0005626463.localdomain systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully.
Feb 23 08:32:32 np0005626463.localdomain sshd[89759]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:32:33 np0005626463.localdomain sshd[89759]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:32:40 np0005626463.localdomain sshd[89784]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:32:40 np0005626463.localdomain sshd[89784]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:32:48 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.
Feb 23 08:32:48 np0005626463.localdomain podman[89786]: 2026-02-23 08:32:48.922505075 +0000 UTC m=+0.086729951 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, vendor=Red Hat, Inc., tcib_managed=true, com.redhat.component=openstack-collectd-container, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=collectd, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, name=rhosp-rhel9/openstack-collectd, io.buildah.version=1.41.5, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:15Z, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:10:15Z, batch=17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3)
Feb 23 08:32:48 np0005626463.localdomain podman[89786]: 2026-02-23 08:32:48.933261744 +0000 UTC m=+0.097486610 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, release=1766032510, version=17.1.13, architecture=x86_64, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=collectd, com.redhat.component=openstack-collectd-container, build-date=2026-01-12T22:10:15Z, description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, io.openshift.expose-services=, vendor=Red Hat, Inc., config_id=tripleo_step3, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, maintainer=OpenStack TripleO Team)
Feb 23 08:32:48 np0005626463.localdomain systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully.
Feb 23 08:32:52 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.
Feb 23 08:32:52 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.
Feb 23 08:32:52 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.
Feb 23 08:32:52 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.
Feb 23 08:32:52 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.
Feb 23 08:32:52 np0005626463.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Feb 23 08:32:52 np0005626463.localdomain recover_tripleo_nova_virtqemud[89837]: 61982
Feb 23 08:32:52 np0005626463.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Feb 23 08:32:52 np0005626463.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Feb 23 08:32:52 np0005626463.localdomain systemd[1]: tmp-crun.PRM2Ir.mount: Deactivated successfully.
Feb 23 08:32:52 np0005626463.localdomain podman[89809]: 2026-02-23 08:32:52.930811154 +0000 UTC m=+0.097660726 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-cron-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, build-date=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers:/var/log/containers:z']}, description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, url=https://www.redhat.com, release=1766032510, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, vendor=Red Hat, Inc., architecture=x86_64, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:15Z, distribution-scope=public, io.buildah.version=1.41.5)
Feb 23 08:32:52 np0005626463.localdomain podman[89809]: 2026-02-23 08:32:52.946380934 +0000 UTC m=+0.113230486 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:10:15Z, vendor=Red Hat, Inc., distribution-scope=public, com.redhat.component=openstack-cron-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, release=1766032510, batch=17.1_20260112.1, 
vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z, version=17.1.13, name=rhosp-rhel9/openstack-cron, vcs-type=git, container_name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, managed_by=tripleo_ansible, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, io.buildah.version=1.41.5)
Feb 23 08:32:52 np0005626463.localdomain systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully.
Feb 23 08:32:52 np0005626463.localdomain podman[89806]: 2026-02-23 08:32:52.987337088 +0000 UTC m=+0.160252192 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, com.redhat.component=openstack-iscsid-container, version=17.1.13, tcib_managed=true, vendor=Red Hat, Inc., managed_by=tripleo_ansible, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-iscsid, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.openshift.expose-services=, config_id=tripleo_step3, build-date=2026-01-12T22:34:43Z, maintainer=OpenStack TripleO Team, release=1766032510, container_name=iscsid, url=https://www.redhat.com, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.buildah.version=1.41.5, vcs-ref=705339545363fec600102567c4e923938e0f43b3, summary=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 23 08:32:52 np0005626463.localdomain podman[89815]: 2026-02-23 08:32:52.942303257 +0000 UTC m=+0.101555845 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.buildah.version=1.41.5, vcs-type=git, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step5, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., batch=17.1_20260112.1, build-date=2026-01-12T23:32:04Z, release=1766032510, container_name=nova_compute, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:32:04Z, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, distribution-scope=public, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe)
Feb 23 08:32:53 np0005626463.localdomain podman[89806]: 2026-02-23 08:32:53.025212861 +0000 UTC m=+0.198127995 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:34:43Z, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, distribution-scope=public, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO Team, architecture=x86_64, com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, 
io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=705339545363fec600102567c4e923938e0f43b3, build-date=2026-01-12T22:34:43Z, summary=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step3, tcib_managed=true, io.buildah.version=1.41.5, managed_by=tripleo_ansible, version=17.1.13)
Feb 23 08:32:53 np0005626463.localdomain systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully.
Feb 23 08:32:53 np0005626463.localdomain podman[89815]: 2026-02-23 08:32:53.075412374 +0000 UTC m=+0.234665032 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, release=1766032510, version=17.1.13, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=nova_compute, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, url=https://www.redhat.com, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:32:04Z, config_id=tripleo_step5, io.openshift.expose-services=, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute)
Feb 23 08:32:53 np0005626463.localdomain systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Deactivated successfully.
Feb 23 08:32:53 np0005626463.localdomain podman[89807]: 2026-02-23 08:32:53.095590086 +0000 UTC m=+0.263932887 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, container_name=ceilometer_agent_compute, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.13, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.buildah.version=1.41.5, architecture=x86_64, build-date=2026-01-12T23:07:47Z, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, release=1766032510, 
konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-compute-container, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T23:07:47Z, config_id=tripleo_step4)
Feb 23 08:32:53 np0005626463.localdomain podman[89808]: 2026-02-23 08:32:53.026836944 +0000 UTC m=+0.194286695 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, url=https://www.redhat.com, io.buildah.version=1.41.5, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T23:07:30Z, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, batch=17.1_20260112.1, version=17.1.13, release=1766032510, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible)
Feb 23 08:32:53 np0005626463.localdomain podman[89808]: 2026-02-23 08:32:53.158037547 +0000 UTC m=+0.325487308 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, tcib_managed=true, maintainer=OpenStack TripleO Team, vcs-type=git, org.opencontainers.image.created=2026-01-12T23:07:30Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, managed_by=tripleo_ansible, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:07:30Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, container_name=ceilometer_agent_ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, batch=17.1_20260112.1, io.buildah.version=1.41.5, config_id=tripleo_step4)
Feb 23 08:32:53 np0005626463.localdomain systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully.
Feb 23 08:32:53 np0005626463.localdomain podman[89807]: 2026-02-23 08:32:53.180271568 +0000 UTC m=+0.348614419 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, build-date=2026-01-12T23:07:47Z, url=https://www.redhat.com, batch=17.1_20260112.1, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-ceilometer-compute, container_name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, release=1766032510, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-compute-container, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vendor=Red Hat, Inc., managed_by=tripleo_ansible, version=17.1.13)
Feb 23 08:32:53 np0005626463.localdomain systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully.
Feb 23 08:32:55 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.
Feb 23 08:32:55 np0005626463.localdomain systemd[1]: tmp-crun.dezNwA.mount: Deactivated successfully.
Feb 23 08:32:55 np0005626463.localdomain podman[89925]: 2026-02-23 08:32:55.923774413 +0000 UTC m=+0.093497687 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:32:04Z, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, vcs-type=git, io.openshift.expose-services=, release=1766032510, version=17.1.13, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, tcib_managed=true, container_name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-nova-compute, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.created=2026-01-12T23:32:04Z, config_id=tripleo_step4, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 23 08:32:56 np0005626463.localdomain podman[89925]: 2026-02-23 08:32:56.357409606 +0000 UTC m=+0.527132850 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.openshift.expose-services=, version=17.1.13, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, container_name=nova_migration_target, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, release=1766032510, tcib_managed=true, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, 
name=rhosp-rhel9/openstack-nova-compute, vcs-type=git, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.buildah.version=1.41.5, build-date=2026-01-12T23:32:04Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, managed_by=tripleo_ansible, vendor=Red Hat, Inc.)
Feb 23 08:32:56 np0005626463.localdomain systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully.
Feb 23 08:32:57 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.
Feb 23 08:32:57 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.
Feb 23 08:32:57 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.
Feb 23 08:32:57 np0005626463.localdomain systemd[1]: tmp-crun.WWaX8D.mount: Deactivated successfully.
Feb 23 08:32:57 np0005626463.localdomain podman[89950]: 2026-02-23 08:32:57.921644059 +0000 UTC m=+0.087008401 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:14Z, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-qdrouterd, tcib_managed=true, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:10:14Z, config_id=tripleo_step1, vendor=Red Hat, Inc., architecture=x86_64, managed_by=tripleo_ansible, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd)
Feb 23 08:32:57 np0005626463.localdomain podman[89949]: 2026-02-23 08:32:57.980334745 +0000 UTC m=+0.148794291 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:56:19Z, release=1766032510, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:56:19Z, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, container_name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, architecture=x86_64, tcib_managed=true)
Feb 23 08:32:58 np0005626463.localdomain podman[89949]: 2026-02-23 08:32:58.035334477 +0000 UTC m=+0.203794053 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, version=17.1.13, container_name=ovn_metadata_agent, managed_by=tripleo_ansible, config_id=tripleo_step4, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, vcs-type=git, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, tcib_managed=true, url=https://www.redhat.com, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vendor=Red Hat, Inc., build-date=2026-01-12T22:56:19Z, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Feb 23 08:32:58 np0005626463.localdomain podman[89948]: 2026-02-23 08:32:58.078809287 +0000 UTC m=+0.249156455 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vendor=Red Hat, Inc., io.openshift.expose-services=, architecture=x86_64, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=ovn_controller, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:36:40Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, release=1766032510, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:36:40Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ovn-controller-container, version=17.1.13, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-ovn-controller, vcs-type=git)
Feb 23 08:32:58 np0005626463.localdomain podman[89948]: 2026-02-23 08:32:58.104232534 +0000 UTC m=+0.274579692 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, tcib_managed=true, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, release=1766032510, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_id=tripleo_step4, distribution-scope=public, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:36:40Z, name=rhosp-rhel9/openstack-ovn-controller, vcs-type=git, io.openshift.expose-services=, version=17.1.13, build-date=2026-01-12T22:36:40Z, 
container_name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 23 08:32:58 np0005626463.localdomain systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Deactivated successfully.
Feb 23 08:32:58 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Deactivated successfully.
Feb 23 08:32:58 np0005626463.localdomain podman[89950]: 2026-02-23 08:32:58.128330117 +0000 UTC m=+0.293694379 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, config_id=tripleo_step1, name=rhosp-rhel9/openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, io.openshift.expose-services=, build-date=2026-01-12T22:10:14Z, tcib_managed=true, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., url=https://www.redhat.com, release=1766032510, container_name=metrics_qdr, batch=17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, architecture=x86_64, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']})
Feb 23 08:32:58 np0005626463.localdomain systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully.
Feb 23 08:32:58 np0005626463.localdomain systemd[1]: tmp-crun.RwNjmR.mount: Deactivated successfully.
Feb 23 08:33:00 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:19:01:95 MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.104 DST=192.168.122.106 LEN=40 TOS=0x00 PREC=0x00 TTL=64 ID=0 DF PROTO=TCP SPT=5668 DPT=46574 SEQ=0 ACK=574472157 WINDOW=0 RES=0x00 ACK RST URGP=0 
Feb 23 08:33:02 np0005626463.localdomain kernel: DROPPING: IN=eth0 OUT= MACSRC=c6:e7:bc:23:0b:06 MACDST=fa:16:3e:9a:b6:c6 MACPROTO=0800 SRC=167.248.133.126 DST=38.102.83.164 LEN=60 TOS=0x00 PREC=0x00 TTL=50 ID=0 PROTO=TCP SPT=25857 DPT=19885 SEQ=4029011176 ACK=0 WINDOW=65535 RES=0x00 SYN URGP=0 OPT (020405B40303070402080A699C10BE0000000000) 
Feb 23 08:33:15 np0005626463.localdomain sudo[90026]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 08:33:15 np0005626463.localdomain sudo[90026]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:33:15 np0005626463.localdomain sudo[90026]: pam_unix(sudo:session): session closed for user root
Feb 23 08:33:15 np0005626463.localdomain sudo[90041]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 23 08:33:15 np0005626463.localdomain sudo[90041]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:33:15 np0005626463.localdomain sudo[90041]: pam_unix(sudo:session): session closed for user root
Feb 23 08:33:16 np0005626463.localdomain sudo[90087]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 08:33:16 np0005626463.localdomain sudo[90087]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:33:16 np0005626463.localdomain sudo[90087]: pam_unix(sudo:session): session closed for user root
Feb 23 08:33:19 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.
Feb 23 08:33:20 np0005626463.localdomain systemd[1]: tmp-crun.JEmfzM.mount: Deactivated successfully.
Feb 23 08:33:20 np0005626463.localdomain podman[90102]: 2026-02-23 08:33:20.037486447 +0000 UTC m=+0.209583957 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, batch=17.1_20260112.1, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', 
'/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-collectd-container, version=17.1.13, tcib_managed=true, vcs-type=git, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.buildah.version=1.41.5, architecture=x86_64, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 23 08:33:20 np0005626463.localdomain podman[90102]: 2026-02-23 08:33:20.090294587 +0000 UTC m=+0.262392067 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z, release=1766032510, batch=17.1_20260112.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, config_id=tripleo_step3, name=rhosp-rhel9/openstack-collectd, vendor=Red Hat, Inc., version=17.1.13, 
maintainer=OpenStack TripleO Team, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, managed_by=tripleo_ansible, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd)
Feb 23 08:33:20 np0005626463.localdomain systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully.
Feb 23 08:33:21 np0005626463.localdomain sshd[90122]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:33:21 np0005626463.localdomain sshd[90122]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:33:23 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.
Feb 23 08:33:23 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.
Feb 23 08:33:23 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.
Feb 23 08:33:23 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.
Feb 23 08:33:23 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.
Feb 23 08:33:24 np0005626463.localdomain systemd[1]: tmp-crun.xALL3z.mount: Deactivated successfully.
Feb 23 08:33:24 np0005626463.localdomain podman[90124]: 2026-02-23 08:33:24.060797335 +0000 UTC m=+0.220678185 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, config_id=tripleo_step3, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, tcib_managed=true, com.redhat.component=openstack-iscsid-container, name=rhosp-rhel9/openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:34:43Z, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, url=https://www.redhat.com, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., io.openshift.expose-services=, build-date=2026-01-12T22:34:43Z, managed_by=tripleo_ansible, release=1766032510, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid)
Feb 23 08:33:24 np0005626463.localdomain podman[90125]: 2026-02-23 08:33:24.151998725 +0000 UTC m=+0.310090196 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.buildah.version=1.41.5, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13, url=https://www.redhat.com, managed_by=tripleo_ansible, io.openshift.expose-services=, architecture=x86_64, container_name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-ceilometer-compute-container, org.opencontainers.image.created=2026-01-12T23:07:47Z, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:07:47Z, vendor=Red Hat, Inc.)
Feb 23 08:33:24 np0005626463.localdomain podman[90127]: 2026-02-23 08:33:24.014047238 +0000 UTC m=+0.165321661 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-cron-container, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z, url=https://www.redhat.com, 
org.opencontainers.image.created=2026-01-12T22:10:15Z, release=1766032510, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, name=rhosp-rhel9/openstack-cron, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, managed_by=tripleo_ansible, version=17.1.13, tcib_managed=true, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step4, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc.)
Feb 23 08:33:24 np0005626463.localdomain podman[90125]: 2026-02-23 08:33:24.212492122 +0000 UTC m=+0.370583603 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, com.redhat.component=openstack-ceilometer-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, vendor=Red Hat, Inc., vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20260112.1, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.openshift.expose-services=, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:07:47Z, version=17.1.13, architecture=x86_64, build-date=2026-01-12T23:07:47Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_compute, name=rhosp-rhel9/openstack-ceilometer-compute, release=1766032510, url=https://www.redhat.com, tcib_managed=true)
Feb 23 08:33:24 np0005626463.localdomain systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully.
Feb 23 08:33:24 np0005626463.localdomain podman[90127]: 2026-02-23 08:33:24.300818895 +0000 UTC m=+0.452093338 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., build-date=2026-01-12T22:10:15Z, container_name=logrotate_crond, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:10:15Z, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, tcib_managed=true, release=1766032510, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.component=openstack-cron-container, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, io.openshift.expose-services=, name=rhosp-rhel9/openstack-cron, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron)
Feb 23 08:33:24 np0005626463.localdomain systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully.
Feb 23 08:33:24 np0005626463.localdomain podman[90126]: 2026-02-23 08:33:24.118443457 +0000 UTC m=+0.274142358 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, url=https://www.redhat.com, batch=17.1_20260112.1, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.13, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1766032510, container_name=ceilometer_agent_ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, vendor=Red Hat, Inc., config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-ipmi-container, architecture=x86_64, name=rhosp-rhel9/openstack-ceilometer-ipmi, 
org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:30Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, io.openshift.expose-services=, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 23 08:33:24 np0005626463.localdomain podman[90126]: 2026-02-23 08:33:24.413288233 +0000 UTC m=+0.568987084 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, container_name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, name=rhosp-rhel9/openstack-ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, architecture=x86_64, url=https://www.redhat.com, vcs-type=git, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, maintainer=OpenStack TripleO Team, build-date=2026-01-12T23:07:30Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vendor=Red Hat, Inc., batch=17.1_20260112.1, managed_by=tripleo_ansible)
Feb 23 08:33:24 np0005626463.localdomain systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully.
Feb 23 08:33:24 np0005626463.localdomain podman[90128]: 2026-02-23 08:33:23.919069662 +0000 UTC m=+0.073171819 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.buildah.version=1.41.5, release=1766032510, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', 
'/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_compute, vcs-type=git, managed_by=tripleo_ansible, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-nova-compute-container, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, maintainer=OpenStack TripleO Team, config_id=tripleo_step5)
Feb 23 08:33:24 np0005626463.localdomain podman[90124]: 2026-02-23 08:33:24.526486337 +0000 UTC m=+0.686367217 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1, container_name=iscsid, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.component=openstack-iscsid-container, description=Red Hat 
OpenStack Platform 17.1 iscsid, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:34:43Z, build-date=2026-01-12T22:34:43Z, name=rhosp-rhel9/openstack-iscsid, io.buildah.version=1.41.5, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.openshift.expose-services=, distribution-scope=public, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, tcib_managed=true, url=https://www.redhat.com, vcs-ref=705339545363fec600102567c4e923938e0f43b3)
Feb 23 08:33:24 np0005626463.localdomain systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully.
Feb 23 08:33:24 np0005626463.localdomain podman[90128]: 2026-02-23 08:33:24.586352721 +0000 UTC m=+0.740454918 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', 
'/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, distribution-scope=public, maintainer=OpenStack TripleO Team, version=17.1.13, name=rhosp-rhel9/openstack-nova-compute, container_name=nova_compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:32:04Z, url=https://www.redhat.com, batch=17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, architecture=x86_64, config_id=tripleo_step5, io.buildah.version=1.41.5, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc.)
Feb 23 08:33:24 np0005626463.localdomain systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Deactivated successfully.
Feb 23 08:33:26 np0005626463.localdomain sshd[90236]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:33:26 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.
Feb 23 08:33:26 np0005626463.localdomain podman[90238]: 2026-02-23 08:33:26.910927926 +0000 UTC m=+0.080296998 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.created=2026-01-12T23:32:04Z, batch=17.1_20260112.1, vcs-type=git, vendor=Red Hat, Inc., managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-nova-compute, version=17.1.13, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, 
build-date=2026-01-12T23:32:04Z, io.buildah.version=1.41.5, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, architecture=x86_64, url=https://www.redhat.com, container_name=nova_migration_target, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510)
Feb 23 08:33:27 np0005626463.localdomain podman[90238]: 2026-02-23 08:33:27.322558905 +0000 UTC m=+0.491927977 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:32:04Z, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.5, config_id=tripleo_step4, vcs-type=git, batch=17.1_20260112.1, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible)
Feb 23 08:33:27 np0005626463.localdomain systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully.
Feb 23 08:33:27 np0005626463.localdomain sshd[90236]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:33:28 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.
Feb 23 08:33:28 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.
Feb 23 08:33:28 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.
Feb 23 08:33:28 np0005626463.localdomain systemd[1]: tmp-crun.k5XMCe.mount: Deactivated successfully.
Feb 23 08:33:28 np0005626463.localdomain podman[90261]: 2026-02-23 08:33:28.92110285 +0000 UTC m=+0.097865773 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, architecture=x86_64, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.5, url=https://www.redhat.com, vendor=Red Hat, Inc., config_id=tripleo_step4, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:36:40Z, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ovn-controller-container, release=1766032510, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack 
Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.openshift.expose-services=, container_name=ovn_controller, name=rhosp-rhel9/openstack-ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z)
Feb 23 08:33:28 np0005626463.localdomain podman[90261]: 2026-02-23 08:33:28.946522007 +0000 UTC m=+0.123284900 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, url=https://www.redhat.com, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, summary=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, release=1766032510, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:36:40Z, managed_by=tripleo_ansible, build-date=2026-01-12T22:36:40Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, vcs-type=git, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, distribution-scope=public, config_id=tripleo_step4, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-ovn-controller-container)
Feb 23 08:33:28 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Deactivated successfully.
Feb 23 08:33:28 np0005626463.localdomain podman[90263]: 2026-02-23 08:33:28.971464298 +0000 UTC m=+0.142085715 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, release=1766032510, build-date=2026-01-12T22:10:14Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.openshift.expose-services=, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, name=rhosp-rhel9/openstack-qdrouterd, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, maintainer=OpenStack TripleO Team, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., config_id=tripleo_step1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, managed_by=tripleo_ansible)
Feb 23 08:33:29 np0005626463.localdomain podman[90262]: 2026-02-23 08:33:29.01469017 +0000 UTC m=+0.186685724 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, url=https://www.redhat.com, architecture=x86_64, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:56:19Z, tcib_managed=true, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., build-date=2026-01-12T22:56:19Z)
Feb 23 08:33:29 np0005626463.localdomain podman[90262]: 2026-02-23 08:33:29.054236437 +0000 UTC m=+0.226232001 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.expose-services=, distribution-scope=public, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, container_name=ovn_metadata_agent, release=1766032510, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team)
Feb 23 08:33:29 np0005626463.localdomain systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Deactivated successfully.
Feb 23 08:33:29 np0005626463.localdomain podman[90263]: 2026-02-23 08:33:29.175392615 +0000 UTC m=+0.346014092 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:14Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.created=2026-01-12T22:10:14Z, managed_by=tripleo_ansible, config_id=tripleo_step1, name=rhosp-rhel9/openstack-qdrouterd, tcib_managed=true, container_name=metrics_qdr, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, url=https://www.redhat.com, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd)
Feb 23 08:33:29 np0005626463.localdomain systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully.
Feb 23 08:33:50 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.
Feb 23 08:33:50 np0005626463.localdomain podman[90358]: 2026-02-23 08:33:50.907027475 +0000 UTC m=+0.076847513 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, com.redhat.component=openstack-collectd-container, description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z, io.openshift.expose-services=, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-collectd, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.buildah.version=1.41.5, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, distribution-scope=public, version=17.1.13, config_id=tripleo_step3, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, org.opencontainers.image.created=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, tcib_managed=true)
Feb 23 08:33:50 np0005626463.localdomain podman[90358]: 2026-02-23 08:33:50.916623555 +0000 UTC m=+0.086443573 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, distribution-scope=public, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=collectd, managed_by=tripleo_ansible, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:10:15Z, name=rhosp-rhel9/openstack-collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, build-date=2026-01-12T22:10:15Z, 
architecture=x86_64, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, vcs-type=git, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 23 08:33:50 np0005626463.localdomain systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully.
Feb 23 08:33:54 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.
Feb 23 08:33:54 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.
Feb 23 08:33:54 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.
Feb 23 08:33:54 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.
Feb 23 08:33:54 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.
Feb 23 08:33:54 np0005626463.localdomain systemd[1]: tmp-crun.lGuv9f.mount: Deactivated successfully.
Feb 23 08:33:54 np0005626463.localdomain podman[90381]: 2026-02-23 08:33:54.956860078 +0000 UTC m=+0.095027028 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, com.redhat.component=openstack-ceilometer-compute-container, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:47Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, container_name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, batch=17.1_20260112.1, managed_by=tripleo_ansible, url=https://www.redhat.com, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:07:47Z, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-ceilometer-compute, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 
'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., config_id=tripleo_step4)
Feb 23 08:33:54 np0005626463.localdomain podman[90381]: 2026-02-23 08:33:54.982313306 +0000 UTC m=+0.120480246 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1766032510, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, version=17.1.13, container_name=ceilometer_agent_compute, distribution-scope=public, url=https://www.redhat.com, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-compute-container, batch=17.1_20260112.1, 
org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.expose-services=, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:07:47Z, managed_by=tripleo_ansible, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T23:07:47Z, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute)
Feb 23 08:33:54 np0005626463.localdomain systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully.
Feb 23 08:33:55 np0005626463.localdomain podman[90383]: 2026-02-23 08:33:55.038126167 +0000 UTC m=+0.197785613 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, batch=17.1_20260112.1, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vcs-type=git, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, version=17.1.13, description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, build-date=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, 
url=https://www.redhat.com, vendor=Red Hat, Inc., config_id=tripleo_step4, container_name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, com.redhat.component=openstack-cron-container, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, tcib_managed=true, name=rhosp-rhel9/openstack-cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee)
Feb 23 08:33:55 np0005626463.localdomain podman[90383]: 2026-02-23 08:33:55.051216883 +0000 UTC m=+0.210876329 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, release=1766032510, vcs-type=git, build-date=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, managed_by=tripleo_ansible, architecture=x86_64, com.redhat.component=openstack-cron-container, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20260112.1, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, distribution-scope=public, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=logrotate_crond, name=rhosp-rhel9/openstack-cron, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 cron)
Feb 23 08:33:55 np0005626463.localdomain systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully.
Feb 23 08:33:55 np0005626463.localdomain podman[90380]: 2026-02-23 08:33:54.938392063 +0000 UTC m=+0.108811497 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, container_name=iscsid, build-date=2026-01-12T22:34:43Z, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, vcs-type=git, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-iscsid, io.openshift.expose-services=, vendor=Red Hat, Inc., 
org.opencontainers.image.created=2026-01-12T22:34:43Z, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, version=17.1.13, com.redhat.component=openstack-iscsid-container, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, vcs-ref=705339545363fec600102567c4e923938e0f43b3)
Feb 23 08:33:55 np0005626463.localdomain podman[90382]: 2026-02-23 08:33:55.097731333 +0000 UTC m=+0.261157015 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, container_name=ceilometer_agent_ipmi, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, architecture=x86_64, version=17.1.13, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-ipmi-container, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:07:30Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:07:30Z, url=https://www.redhat.com, batch=17.1_20260112.1, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510)
Feb 23 08:33:55 np0005626463.localdomain podman[90389]: 2026-02-23 08:33:55.150350887 +0000 UTC m=+0.309225487 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, build-date=2026-01-12T23:32:04Z, config_id=tripleo_step5, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, io.buildah.version=1.41.5, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, release=1766032510, tcib_managed=true)
Feb 23 08:33:55 np0005626463.localdomain podman[90382]: 2026-02-23 08:33:55.167350033 +0000 UTC m=+0.330775695 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.created=2026-01-12T23:07:30Z, com.redhat.component=openstack-ceilometer-ipmi-container, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, io.buildah.version=1.41.5, config_id=tripleo_step4, io.openshift.expose-services=, summary=Red Hat 
OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, container_name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, url=https://www.redhat.com, version=17.1.13, batch=17.1_20260112.1, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06)
Feb 23 08:33:55 np0005626463.localdomain podman[90380]: 2026-02-23 08:33:55.17474109 +0000 UTC m=+0.345160554 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vendor=Red Hat, Inc., vcs-ref=705339545363fec600102567c4e923938e0f43b3, batch=17.1_20260112.1, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, org.opencontainers.image.created=2026-01-12T22:34:43Z, release=1766032510, com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, name=rhosp-rhel9/openstack-iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, build-date=2026-01-12T22:34:43Z, io.buildah.version=1.41.5, io.openshift.expose-services=, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, version=17.1.13, managed_by=tripleo_ansible, url=https://www.redhat.com, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid)
Feb 23 08:33:55 np0005626463.localdomain systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully.
Feb 23 08:33:55 np0005626463.localdomain systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully.
Feb 23 08:33:55 np0005626463.localdomain podman[90389]: 2026-02-23 08:33:55.207341727 +0000 UTC m=+0.366216307 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, distribution-scope=public, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step5, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, name=rhosp-rhel9/openstack-nova-compute, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z, vcs-type=git, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, version=17.1.13, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 23 08:33:55 np0005626463.localdomain systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Deactivated successfully.
Feb 23 08:33:57 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.
Feb 23 08:33:57 np0005626463.localdomain podman[90494]: 2026-02-23 08:33:57.903013808 +0000 UTC m=+0.081315821 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.buildah.version=1.41.5, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, name=rhosp-rhel9/openstack-nova-compute, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-nova-compute-container, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.13, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, tcib_managed=true, container_name=nova_migration_target, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, architecture=x86_64, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, vendor=Red Hat, Inc.)
Feb 23 08:33:58 np0005626463.localdomain podman[90494]: 2026-02-23 08:33:58.371068767 +0000 UTC m=+0.549370750 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, build-date=2026-01-12T23:32:04Z, org.opencontainers.image.created=2026-01-12T23:32:04Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, name=rhosp-rhel9/openstack-nova-compute, io.openshift.expose-services=, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target, vcs-type=git, release=1766032510)
Feb 23 08:33:58 np0005626463.localdomain systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully.
Feb 23 08:33:59 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.
Feb 23 08:33:59 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.
Feb 23 08:33:59 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.
Feb 23 08:33:59 np0005626463.localdomain podman[90520]: 2026-02-23 08:33:59.946653758 +0000 UTC m=+0.122842944 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, build-date=2026-01-12T22:36:40Z, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, managed_by=tripleo_ansible, io.buildah.version=1.41.5, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., container_name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, batch=17.1_20260112.1, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ovn-controller-container, 
name=rhosp-rhel9/openstack-ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:36:40Z, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, url=https://www.redhat.com)
Feb 23 08:33:59 np0005626463.localdomain podman[90520]: 2026-02-23 08:33:59.969186599 +0000 UTC m=+0.145375805 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, tcib_managed=true, batch=17.1_20260112.1, url=https://www.redhat.com, build-date=2026-01-12T22:36:40Z, com.redhat.component=openstack-ovn-controller-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, release=1766032510, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, vendor=Red Hat, Inc., org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, maintainer=OpenStack TripleO Team, distribution-scope=public, name=rhosp-rhel9/openstack-ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, io.buildah.version=1.41.5)
Feb 23 08:33:59 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Deactivated successfully.
Feb 23 08:34:00 np0005626463.localdomain systemd[1]: tmp-crun.HB2RQl.mount: Deactivated successfully.
Feb 23 08:34:00 np0005626463.localdomain podman[90521]: 2026-02-23 08:34:00.061977032 +0000 UTC m=+0.235166299 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.13, io.openshift.expose-services=, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, vendor=Red Hat, Inc., vcs-type=git, distribution-scope=public, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, release=1766032510, build-date=2026-01-12T22:56:19Z, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=ovn_metadata_agent, maintainer=OpenStack TripleO Team)
Feb 23 08:34:00 np0005626463.localdomain podman[90521]: 2026-02-23 08:34:00.101667055 +0000 UTC m=+0.274856372 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, version=17.1.13, release=1766032510, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.5, build-date=2026-01-12T22:56:19Z, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_metadata_agent, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, tcib_managed=true, managed_by=tripleo_ansible, architecture=x86_64, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team)
Feb 23 08:34:00 np0005626463.localdomain podman[90522]: 2026-02-23 08:34:00.115211336 +0000 UTC m=+0.284127091 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, release=1766032510, batch=17.1_20260112.1, architecture=x86_64, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step1, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-qdrouterd-container, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, 
vendor=Red Hat, Inc., version=17.1.13, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, distribution-scope=public, url=https://www.redhat.com, build-date=2026-01-12T22:10:14Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr)
Feb 23 08:34:00 np0005626463.localdomain systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Deactivated successfully.
Feb 23 08:34:00 np0005626463.localdomain podman[90522]: 2026-02-23 08:34:00.31246108 +0000 UTC m=+0.481376815 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, vcs-type=git, url=https://www.redhat.com, vendor=Red Hat, Inc., build-date=2026-01-12T22:10:14Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, container_name=metrics_qdr, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, io.openshift.expose-services=, distribution-scope=public)
Feb 23 08:34:00 np0005626463.localdomain systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully.
Feb 23 08:34:02 np0005626463.localdomain sshd[90596]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:34:03 np0005626463.localdomain sshd[90596]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:34:16 np0005626463.localdomain sudo[90599]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 08:34:16 np0005626463.localdomain sudo[90599]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:34:16 np0005626463.localdomain sudo[90599]: pam_unix(sudo:session): session closed for user root
Feb 23 08:34:16 np0005626463.localdomain sudo[90614]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 23 08:34:16 np0005626463.localdomain sudo[90614]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:34:17 np0005626463.localdomain sudo[90614]: pam_unix(sudo:session): session closed for user root
Feb 23 08:34:18 np0005626463.localdomain sudo[90661]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 08:34:18 np0005626463.localdomain sudo[90661]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:34:18 np0005626463.localdomain sudo[90661]: pam_unix(sudo:session): session closed for user root
Feb 23 08:34:18 np0005626463.localdomain sshd[90676]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:34:20 np0005626463.localdomain sshd[90676]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:34:21 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.
Feb 23 08:34:21 np0005626463.localdomain podman[90678]: 2026-02-23 08:34:21.935009232 +0000 UTC m=+0.090936175 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, release=1766032510, container_name=collectd, architecture=x86_64, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, io.buildah.version=1.41.5, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 collectd, build-date=2026-01-12T22:10:15Z, tcib_managed=true, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vendor=Red Hat, Inc., batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-collectd-container, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-collectd, distribution-scope=public)
Feb 23 08:34:21 np0005626463.localdomain podman[90678]: 2026-02-23 08:34:21.947072384 +0000 UTC m=+0.102999297 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-type=git, description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, io.openshift.expose-services=, managed_by=tripleo_ansible, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, container_name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, architecture=x86_64, build-date=2026-01-12T22:10:15Z, config_id=tripleo_step3, vendor=Red Hat, Inc., distribution-scope=public, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, version=17.1.13, name=rhosp-rhel9/openstack-collectd, tcib_managed=true)
Feb 23 08:34:21 np0005626463.localdomain systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully.
Feb 23 08:34:22 np0005626463.localdomain sshd[90698]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:34:24 np0005626463.localdomain sshd[90698]: Invalid user sshadmin from 185.156.73.233 port 19758
Feb 23 08:34:24 np0005626463.localdomain sshd[90698]: Connection closed by invalid user sshadmin 185.156.73.233 port 19758 [preauth]
Feb 23 08:34:25 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.
Feb 23 08:34:25 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.
Feb 23 08:34:25 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.
Feb 23 08:34:25 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.
Feb 23 08:34:25 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.
Feb 23 08:34:25 np0005626463.localdomain podman[90700]: 2026-02-23 08:34:25.916319044 +0000 UTC m=+0.089527373 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, com.redhat.component=openstack-iscsid-container, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.buildah.version=1.41.5, managed_by=tripleo_ansible, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid, build-date=2026-01-12T22:34:43Z, vendor=Red Hat, Inc., version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, vcs-ref=705339545363fec600102567c4e923938e0f43b3, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.openshift.tags=rhosp 
osp openstack osp-17.1 openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, vcs-type=git, tcib_managed=true, maintainer=OpenStack TripleO Team, container_name=iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public)
Feb 23 08:34:25 np0005626463.localdomain podman[90708]: 2026-02-23 08:34:25.962348618 +0000 UTC m=+0.125982293 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, managed_by=tripleo_ansible, io.openshift.expose-services=, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, summary=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
org.opencontainers.image.created=2026-01-12T22:10:15Z, config_id=tripleo_step4, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z, com.redhat.component=openstack-cron-container, version=17.1.13, architecture=x86_64, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, container_name=logrotate_crond, name=rhosp-rhel9/openstack-cron, release=1766032510, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron)
Feb 23 08:34:25 np0005626463.localdomain systemd[1]: tmp-crun.9pf08y.mount: Deactivated successfully.
Feb 23 08:34:26 np0005626463.localdomain podman[90702]: 2026-02-23 08:34:26.020942159 +0000 UTC m=+0.185404229 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.created=2026-01-12T23:07:30Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1, io.openshift.expose-services=, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-ceilometer-ipmi, release=1766032510, container_name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, build-date=2026-01-12T23:07:30Z, vendor=Red Hat, Inc., config_id=tripleo_step4, managed_by=tripleo_ansible, architecture=x86_64, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 23 08:34:26 np0005626463.localdomain podman[90709]: 2026-02-23 08:34:25.985250262 +0000 UTC m=+0.141793149 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, url=https://www.redhat.com, architecture=x86_64, io.buildah.version=1.41.5, vcs-type=git, build-date=2026-01-12T23:32:04Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, org.opencontainers.image.created=2026-01-12T23:32:04Z, container_name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, version=17.1.13, config_id=tripleo_step5, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 23 08:34:26 np0005626463.localdomain podman[90708]: 2026-02-23 08:34:26.045381171 +0000 UTC m=+0.209014876 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, batch=17.1_20260112.1, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, name=rhosp-rhel9/openstack-cron, managed_by=tripleo_ansible, io.buildah.version=1.41.5, vcs-type=git, build-date=2026-01-12T22:10:15Z, summary=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., version=17.1.13, org.opencontainers.image.created=2026-01-12T22:10:15Z, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, tcib_managed=true, com.redhat.component=openstack-cron-container)
Feb 23 08:34:26 np0005626463.localdomain systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully.
Feb 23 08:34:26 np0005626463.localdomain podman[90709]: 2026-02-23 08:34:26.07332089 +0000 UTC m=+0.229863737 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, architecture=x86_64, io.buildah.version=1.41.5, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, vcs-type=git, build-date=2026-01-12T23:32:04Z, version=17.1.13, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp-rhel9/openstack-nova-compute, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute)
Feb 23 08:34:26 np0005626463.localdomain systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Deactivated successfully.
Feb 23 08:34:26 np0005626463.localdomain podman[90700]: 2026-02-23 08:34:26.099445392 +0000 UTC m=+0.272653731 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, maintainer=OpenStack TripleO Team, vcs-ref=705339545363fec600102567c4e923938e0f43b3, release=1766032510, tcib_managed=true, build-date=2026-01-12T22:34:43Z, batch=17.1_20260112.1, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, name=rhosp-rhel9/openstack-iscsid, vcs-type=git, io.buildah.version=1.41.5, container_name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vendor=Red Hat, Inc., org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, com.redhat.component=openstack-iscsid-container, managed_by=tripleo_ansible, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=)
Feb 23 08:34:26 np0005626463.localdomain systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully.
Feb 23 08:34:26 np0005626463.localdomain podman[90702]: 2026-02-23 08:34:26.129466575 +0000 UTC m=+0.293928645 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, architecture=x86_64, build-date=2026-01-12T23:07:30Z, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, version=17.1.13, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git, name=rhosp-rhel9/openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi, io.openshift.expose-services=, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, batch=17.1_20260112.1, config_id=tripleo_step4)
Feb 23 08:34:26 np0005626463.localdomain systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully.
Feb 23 08:34:26 np0005626463.localdomain podman[90701]: 2026-02-23 08:34:26.20153707 +0000 UTC m=+0.366428643 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.openshift.expose-services=, name=rhosp-rhel9/openstack-ceilometer-compute, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, tcib_managed=true, architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:07:47Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, description=Red Hat 
OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, com.redhat.component=openstack-ceilometer-compute-container, build-date=2026-01-12T23:07:47Z, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, config_id=tripleo_step4, vendor=Red Hat, Inc., io.buildah.version=1.41.5, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com)
Feb 23 08:34:26 np0005626463.localdomain podman[90701]: 2026-02-23 08:34:26.238347722 +0000 UTC m=+0.403239275 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-ceilometer-compute, managed_by=tripleo_ansible, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=ceilometer_agent_compute, build-date=2026-01-12T23:07:47Z, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, batch=17.1_20260112.1, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, tcib_managed=true, maintainer=OpenStack TripleO Team, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., url=https://www.redhat.com, io.buildah.version=1.41.5)
Feb 23 08:34:26 np0005626463.localdomain systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully.
Feb 23 08:34:28 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.
Feb 23 08:34:28 np0005626463.localdomain podman[90816]: 2026-02-23 08:34:28.9214352 +0000 UTC m=+0.095397063 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, name=rhosp-rhel9/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, release=1766032510, architecture=x86_64, 
vcs-type=git, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, managed_by=tripleo_ansible, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_migration_target, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2026-01-12T23:32:04Z)
Feb 23 08:34:29 np0005626463.localdomain podman[90816]: 2026-02-23 08:34:29.339520741 +0000 UTC m=+0.513482624 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team, distribution-scope=public, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, vendor=Red Hat, Inc., release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, container_name=nova_migration_target)
Feb 23 08:34:29 np0005626463.localdomain systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully.
Feb 23 08:34:30 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.
Feb 23 08:34:30 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.
Feb 23 08:34:30 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.
Feb 23 08:34:30 np0005626463.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Feb 23 08:34:30 np0005626463.localdomain recover_tripleo_nova_virtqemud[90860]: 61982
Feb 23 08:34:30 np0005626463.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Feb 23 08:34:30 np0005626463.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Feb 23 08:34:30 np0005626463.localdomain systemd[1]: tmp-crun.WrMliu.mount: Deactivated successfully.
Feb 23 08:34:30 np0005626463.localdomain podman[90843]: 2026-02-23 08:34:30.926027145 +0000 UTC m=+0.092969039 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, managed_by=tripleo_ansible, version=17.1.13, tcib_managed=true, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:10:14Z, distribution-scope=public, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:14Z, batch=17.1_20260112.1, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, io.buildah.version=1.41.5, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp-rhel9/openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, release=1766032510)
Feb 23 08:34:30 np0005626463.localdomain systemd[1]: tmp-crun.9CkvkQ.mount: Deactivated successfully.
Feb 23 08:34:30 np0005626463.localdomain podman[90842]: 2026-02-23 08:34:30.985175212 +0000 UTC m=+0.153808398 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, org.opencontainers.image.created=2026-01-12T22:56:19Z, vendor=Red Hat, Inc., vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, build-date=2026-01-12T22:56:19Z, io.openshift.expose-services=, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, batch=17.1_20260112.1, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, distribution-scope=public, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, architecture=x86_64, version=17.1.13, config_id=tripleo_step4, vcs-type=git, io.buildah.version=1.41.5, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 23 08:34:31 np0005626463.localdomain podman[90841]: 2026-02-23 08:34:31.031409134 +0000 UTC m=+0.203918919 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, name=rhosp-rhel9/openstack-ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, url=https://www.redhat.com, tcib_managed=true, io.buildah.version=1.41.5, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, release=1766032510, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., version=17.1.13, build-date=2026-01-12T22:36:40Z, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, config_id=tripleo_step4, io.openshift.expose-services=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, 
maintainer=OpenStack TripleO Team, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:36:40Z, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller)
Feb 23 08:34:31 np0005626463.localdomain podman[90842]: 2026-02-23 08:34:31.035096727 +0000 UTC m=+0.203729933 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, distribution-scope=public, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:56:19Z, vendor=Red Hat, Inc., version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', 
'/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, release=1766032510, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, batch=17.1_20260112.1, config_id=tripleo_step4, architecture=x86_64, io.openshift.expose-services=)
Feb 23 08:34:31 np0005626463.localdomain systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Deactivated successfully.
Feb 23 08:34:31 np0005626463.localdomain podman[90841]: 2026-02-23 08:34:31.066219714 +0000 UTC m=+0.238729479 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, name=rhosp-rhel9/openstack-ovn-controller, distribution-scope=public, tcib_managed=true, config_id=tripleo_step4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, io.openshift.expose-services=, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, container_name=ovn_controller, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, managed_by=tripleo_ansible, 
vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, build-date=2026-01-12T22:36:40Z, url=https://www.redhat.com, release=1766032510, vendor=Red Hat, Inc.)
Feb 23 08:34:31 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Deactivated successfully.
Feb 23 08:34:31 np0005626463.localdomain podman[90843]: 2026-02-23 08:34:31.130070006 +0000 UTC m=+0.297011810 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-qdrouterd, build-date=2026-01-12T22:10:14Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=metrics_qdr, url=https://www.redhat.com, version=17.1.13, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, vcs-type=git, batch=17.1_20260112.1, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=)
Feb 23 08:34:31 np0005626463.localdomain systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully.
Feb 23 08:34:44 np0005626463.localdomain sshd[90942]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:34:44 np0005626463.localdomain sshd[90942]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:34:52 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.
Feb 23 08:34:52 np0005626463.localdomain systemd[1]: tmp-crun.FfSYdT.mount: Deactivated successfully.
Feb 23 08:34:52 np0005626463.localdomain podman[90944]: 2026-02-23 08:34:52.918576575 +0000 UTC m=+0.088085251 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:10:15Z, tcib_managed=true, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, batch=17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, url=https://www.redhat.com, io.openshift.expose-services=, com.redhat.component=openstack-collectd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, distribution-scope=public, vcs-type=git, description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, name=rhosp-rhel9/openstack-collectd, container_name=collectd)
Feb 23 08:34:52 np0005626463.localdomain podman[90944]: 2026-02-23 08:34:52.932184279 +0000 UTC m=+0.101692925 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-collectd, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, build-date=2026-01-12T22:10:15Z, config_id=tripleo_step3, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, url=https://www.redhat.com, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, tcib_managed=true)
Feb 23 08:34:52 np0005626463.localdomain systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully.
Feb 23 08:34:56 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.
Feb 23 08:34:56 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.
Feb 23 08:34:56 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.
Feb 23 08:34:56 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.
Feb 23 08:34:56 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.
Feb 23 08:34:56 np0005626463.localdomain systemd[1]: tmp-crun.dXElnZ.mount: Deactivated successfully.
Feb 23 08:34:56 np0005626463.localdomain podman[90969]: 2026-02-23 08:34:56.930104653 +0000 UTC m=+0.094893628 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-cron, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, config_id=tripleo_step4, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, container_name=logrotate_crond, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:10:15Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:10:15Z, architecture=x86_64, com.redhat.component=openstack-cron-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, batch=17.1_20260112.1, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, version=17.1.13)
Feb 23 08:34:56 np0005626463.localdomain systemd[1]: tmp-crun.yn3lHb.mount: Deactivated successfully.
Feb 23 08:34:56 np0005626463.localdomain podman[90964]: 2026-02-23 08:34:56.962149786 +0000 UTC m=+0.134007177 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, config_id=tripleo_step3, build-date=2026-01-12T22:34:43Z, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, vcs-type=git, container_name=iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid, com.redhat.component=openstack-iscsid-container, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, tcib_managed=true, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, version=17.1.13, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, vendor=Red Hat, Inc.)
Feb 23 08:34:56 np0005626463.localdomain podman[90965]: 2026-02-23 08:34:56.938226672 +0000 UTC m=+0.102799520 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, config_id=tripleo_step4, version=17.1.13, container_name=ceilometer_agent_compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:47Z, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, name=rhosp-rhel9/openstack-ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1766032510, vcs-type=git, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, distribution-scope=public, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, batch=17.1_20260112.1, url=https://www.redhat.com, managed_by=tripleo_ansible, tcib_managed=true, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:47Z)
Feb 23 08:34:57 np0005626463.localdomain podman[90965]: 2026-02-23 08:34:57.02434187 +0000 UTC m=+0.188914738 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-ceilometer-compute, build-date=2026-01-12T23:07:47Z, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T23:07:47Z, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_id=tripleo_step4, container_name=ceilometer_agent_compute, managed_by=tripleo_ansible, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute)
Feb 23 08:34:57 np0005626463.localdomain podman[90966]: 2026-02-23 08:34:56.982712352 +0000 UTC m=+0.144224602 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vcs-type=git, build-date=2026-01-12T23:07:30Z, config_id=tripleo_step4, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, architecture=x86_64, vendor=Red Hat, Inc., io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, tcib_managed=true, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, name=rhosp-rhel9/openstack-ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, batch=17.1_20260112.1, container_name=ceilometer_agent_ipmi, distribution-scope=public, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.created=2026-01-12T23:07:30Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.expose-services=)
Feb 23 08:34:57 np0005626463.localdomain systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully.
Feb 23 08:34:57 np0005626463.localdomain podman[90973]: 2026-02-23 08:34:57.039962769 +0000 UTC m=+0.195610042 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, tcib_managed=true, config_id=tripleo_step5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', 
'/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, version=17.1.13, io.buildah.version=1.41.5, release=1766032510, org.opencontainers.image.created=2026-01-12T23:32:04Z, build-date=2026-01-12T23:32:04Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-nova-compute, container_name=nova_compute, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 23 08:34:57 np0005626463.localdomain podman[90969]: 2026-02-23 08:34:57.064547953 +0000 UTC m=+0.229336908 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, container_name=logrotate_crond, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, distribution-scope=public, vendor=Red Hat, Inc., version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:10:15Z, vcs-type=git, io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, batch=17.1_20260112.1, url=https://www.redhat.com, name=rhosp-rhel9/openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, config_id=tripleo_step4)
Feb 23 08:34:57 np0005626463.localdomain podman[90973]: 2026-02-23 08:34:57.06820122 +0000 UTC m=+0.223848513 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, vendor=Red Hat, Inc., release=1766032510, io.openshift.expose-services=, config_id=tripleo_step5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, build-date=2026-01-12T23:32:04Z, container_name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, distribution-scope=public, vcs-type=git, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 23 08:34:57 np0005626463.localdomain systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Deactivated successfully.
Feb 23 08:34:57 np0005626463.localdomain podman[90964]: 2026-02-23 08:34:57.093474626 +0000 UTC m=+0.265332017 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, name=rhosp-rhel9/openstack-iscsid, io.buildah.version=1.41.5, build-date=2026-01-12T22:34:43Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., architecture=x86_64, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, release=1766032510, container_name=iscsid, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-ref=705339545363fec600102567c4e923938e0f43b3, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, batch=17.1_20260112.1, url=https://www.redhat.com)
Feb 23 08:34:57 np0005626463.localdomain systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully.
Feb 23 08:34:57 np0005626463.localdomain systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully.
Feb 23 08:34:57 np0005626463.localdomain podman[90966]: 2026-02-23 08:34:57.169368077 +0000 UTC m=+0.330880307 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-ipmi-container, build-date=2026-01-12T23:07:30Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-ceilometer-ipmi, tcib_managed=true, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:30Z, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, release=1766032510, maintainer=OpenStack TripleO Team, vcs-type=git, vendor=Red Hat, Inc., vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13, batch=17.1_20260112.1)
Feb 23 08:34:57 np0005626463.localdomain systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully.
Feb 23 08:34:59 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.
Feb 23 08:34:59 np0005626463.localdomain podman[91085]: 2026-02-23 08:34:59.905900877 +0000 UTC m=+0.082202815 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, container_name=nova_migration_target, vcs-type=git, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, version=17.1.13, org.opencontainers.image.created=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, batch=17.1_20260112.1, build-date=2026-01-12T23:32:04Z, release=1766032510, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, name=rhosp-rhel9/openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']})
Feb 23 08:35:00 np0005626463.localdomain podman[91085]: 2026-02-23 08:35:00.289576798 +0000 UTC m=+0.465878726 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, name=rhosp-rhel9/openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:32:04Z, version=17.1.13, build-date=2026-01-12T23:32:04Z, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, release=1766032510, managed_by=tripleo_ansible, url=https://www.redhat.com, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, 
com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, vcs-type=git, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, container_name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute)
Feb 23 08:35:00 np0005626463.localdomain systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully.
Feb 23 08:35:01 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.
Feb 23 08:35:01 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.
Feb 23 08:35:01 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.
Feb 23 08:35:01 np0005626463.localdomain podman[91108]: 2026-02-23 08:35:01.917752565 +0000 UTC m=+0.096327215 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, tcib_managed=true, version=17.1.13, maintainer=OpenStack TripleO Team, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, distribution-scope=public, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:36:40Z, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, container_name=ovn_controller, build-date=2026-01-12T22:36:40Z, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.5, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vendor=Red Hat, Inc., io.openshift.expose-services=, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, config_id=tripleo_step4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, name=rhosp-rhel9/openstack-ovn-controller)
Feb 23 08:35:01 np0005626463.localdomain podman[91108]: 2026-02-23 08:35:01.96839479 +0000 UTC m=+0.146969440 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, container_name=ovn_controller, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:36:40Z, build-date=2026-01-12T22:36:40Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, tcib_managed=true, url=https://www.redhat.com, release=1766032510, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vendor=Red Hat, Inc., architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, 
io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible)
Feb 23 08:35:01 np0005626463.localdomain systemd[1]: tmp-crun.hGufWv.mount: Deactivated successfully.
Feb 23 08:35:01 np0005626463.localdomain podman[91109]: 2026-02-23 08:35:01.977587433 +0000 UTC m=+0.151230136 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, container_name=ovn_metadata_agent, managed_by=tripleo_ansible, io.buildah.version=1.41.5, url=https://www.redhat.com, build-date=2026-01-12T22:56:19Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vendor=Red Hat, Inc., distribution-scope=public, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, vcs-type=git, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1766032510, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, batch=17.1_20260112.1, config_id=tripleo_step4, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 23 08:35:01 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Deactivated successfully.
Feb 23 08:35:02 np0005626463.localdomain podman[91109]: 2026-02-23 08:35:02.025788741 +0000 UTC m=+0.199431404 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.buildah.version=1.41.5, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., 
description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, version=17.1.13, managed_by=tripleo_ansible, architecture=x86_64, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.expose-services=, build-date=2026-01-12T22:56:19Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, tcib_managed=true, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, config_id=tripleo_step4, batch=17.1_20260112.1, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, container_name=ovn_metadata_agent, release=1766032510)
Feb 23 08:35:02 np0005626463.localdomain podman[91110]: 2026-02-23 08:35:02.035266054 +0000 UTC m=+0.207208252 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., architecture=x86_64, io.buildah.version=1.41.5, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, url=https://www.redhat.com, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:10:14Z, maintainer=OpenStack TripleO Team, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:14Z, container_name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, release=1766032510, name=rhosp-rhel9/openstack-qdrouterd, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 23 08:35:02 np0005626463.localdomain systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Deactivated successfully.
Feb 23 08:35:02 np0005626463.localdomain kernel: DROPPING: IN=eth0 OUT= MACSRC=c6:e7:bc:23:0b:06 MACDST=fa:16:3e:9a:b6:c6 MACPROTO=0800 SRC=167.248.133.126 DST=38.102.83.164 LEN=60 TOS=0x00 PREC=0x00 TTL=50 ID=0 PROTO=TCP SPT=46677 DPT=19885 SEQ=3435006332 ACK=0 WINDOW=65535 RES=0x00 SYN URGP=0 OPT (020405B40303070402080A699C11360000000000) 
Feb 23 08:35:02 np0005626463.localdomain podman[91110]: 2026-02-23 08:35:02.250520482 +0000 UTC m=+0.422462610 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:10:14Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:14Z, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, config_id=tripleo_step1, release=1766032510, version=17.1.13, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, architecture=x86_64, managed_by=tripleo_ansible, url=https://www.redhat.com)
Feb 23 08:35:02 np0005626463.localdomain systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully.
Feb 23 08:35:07 np0005626463.localdomain sshd[91182]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:35:09 np0005626463.localdomain sshd[91182]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:35:18 np0005626463.localdomain sudo[91184]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 08:35:18 np0005626463.localdomain sudo[91184]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:35:18 np0005626463.localdomain sudo[91184]: pam_unix(sudo:session): session closed for user root
Feb 23 08:35:18 np0005626463.localdomain sudo[91199]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 23 08:35:18 np0005626463.localdomain sudo[91199]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:35:19 np0005626463.localdomain sudo[91199]: pam_unix(sudo:session): session closed for user root
Feb 23 08:35:19 np0005626463.localdomain sudo[91246]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 08:35:19 np0005626463.localdomain sudo[91246]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:35:19 np0005626463.localdomain sudo[91246]: pam_unix(sudo:session): session closed for user root
Feb 23 08:35:23 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.
Feb 23 08:35:23 np0005626463.localdomain podman[91261]: 2026-02-23 08:35:23.911055723 +0000 UTC m=+0.083465284 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, vcs-type=git, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:15Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, container_name=collectd, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, 
description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, version=17.1.13, config_id=tripleo_step3, com.redhat.component=openstack-collectd-container, architecture=x86_64, io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, build-date=2026-01-12T22:10:15Z, io.openshift.expose-services=, name=rhosp-rhel9/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 23 08:35:23 np0005626463.localdomain podman[91261]: 2026-02-23 08:35:23.952477444 +0000 UTC m=+0.124886995 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vcs-type=git, release=1766032510, container_name=collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, 
konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-collectd, vendor=Red Hat, Inc., architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, com.redhat.component=openstack-collectd-container, version=17.1.13, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_id=tripleo_step3, tcib_managed=true, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 collectd)
Feb 23 08:35:23 np0005626463.localdomain systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully.
Feb 23 08:35:24 np0005626463.localdomain sshd[91281]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:35:25 np0005626463.localdomain sshd[91281]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:35:27 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.
Feb 23 08:35:27 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.
Feb 23 08:35:27 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.
Feb 23 08:35:27 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.
Feb 23 08:35:27 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.
Feb 23 08:35:27 np0005626463.localdomain systemd[1]: tmp-crun.XvzbRx.mount: Deactivated successfully.
Feb 23 08:35:27 np0005626463.localdomain podman[91285]: 2026-02-23 08:35:27.929269146 +0000 UTC m=+0.091617025 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, version=17.1.13, distribution-scope=public, build-date=2026-01-12T23:07:30Z, com.redhat.component=openstack-ceilometer-ipmi-container, release=1766032510, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, vcs-type=git, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, vendor=Red Hat, Inc., config_id=tripleo_step4, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z)
Feb 23 08:35:27 np0005626463.localdomain podman[91285]: 2026-02-23 08:35:27.965223103 +0000 UTC m=+0.127570942 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-ipmi-container, name=rhosp-rhel9/openstack-ceilometer-ipmi, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, container_name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1766032510, tcib_managed=true, build-date=2026-01-12T23:07:30Z, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, vendor=Red Hat, Inc., distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20260112.1, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git)
Feb 23 08:35:27 np0005626463.localdomain systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully.
Feb 23 08:35:27 np0005626463.localdomain podman[91291]: 2026-02-23 08:35:27.987447442 +0000 UTC m=+0.145168883 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.13, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step5, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, architecture=x86_64, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-nova-compute, io.buildah.version=1.41.5, io.openshift.expose-services=, container_name=nova_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., tcib_managed=true, batch=17.1_20260112.1, build-date=2026-01-12T23:32:04Z)
Feb 23 08:35:28 np0005626463.localdomain podman[91291]: 2026-02-23 08:35:28.018266455 +0000 UTC m=+0.175987896 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', 
'/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step5, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, org.opencontainers.image.created=2026-01-12T23:32:04Z, batch=17.1_20260112.1, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, distribution-scope=public, release=1766032510, name=rhosp-rhel9/openstack-nova-compute, io.buildah.version=1.41.5, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=nova_compute)
Feb 23 08:35:28 np0005626463.localdomain systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Deactivated successfully.
Feb 23 08:35:28 np0005626463.localdomain podman[91284]: 2026-02-23 08:35:28.030539077 +0000 UTC m=+0.194211407 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20260112.1, url=https://www.redhat.com, vcs-type=git, build-date=2026-01-12T23:07:47Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, tcib_managed=true, name=rhosp-rhel9/openstack-ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 
ceilometer-compute, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, architecture=x86_64, config_id=tripleo_step4, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T23:07:47Z, container_name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-compute-container, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06)
Feb 23 08:35:28 np0005626463.localdomain podman[91283]: 2026-02-23 08:35:28.083470276 +0000 UTC m=+0.251865387 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, com.redhat.component=openstack-iscsid-container, org.opencontainers.image.created=2026-01-12T22:34:43Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, build-date=2026-01-12T22:34:43Z, 
vcs-ref=705339545363fec600102567c4e923938e0f43b3, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vendor=Red Hat, Inc., container_name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-iscsid, architecture=x86_64, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13, url=https://www.redhat.com, config_id=tripleo_step3)
Feb 23 08:35:28 np0005626463.localdomain podman[91284]: 2026-02-23 08:35:28.091357737 +0000 UTC m=+0.255030127 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:07:47Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20260112.1, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, name=rhosp-rhel9/openstack-ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-compute-container, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:07:47Z, config_id=tripleo_step4, release=1766032510, io.buildah.version=1.41.5, distribution-scope=public, architecture=x86_64, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, vcs-type=git)
Feb 23 08:35:28 np0005626463.localdomain systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully.
Feb 23 08:35:28 np0005626463.localdomain podman[91286]: 2026-02-23 08:35:28.143999527 +0000 UTC m=+0.301068617 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, release=1766032510, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., version=17.1.13, summary=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, container_name=logrotate_crond, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.5, distribution-scope=public)
Feb 23 08:35:28 np0005626463.localdomain podman[91286]: 2026-02-23 08:35:28.155177594 +0000 UTC m=+0.312246664 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.expose-services=, tcib_managed=true, vendor=Red Hat, Inc., distribution-scope=public, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=logrotate_crond, 
name=rhosp-rhel9/openstack-cron, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 cron, build-date=2026-01-12T22:10:15Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step4, managed_by=tripleo_ansible)
Feb 23 08:35:28 np0005626463.localdomain systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully.
Feb 23 08:35:28 np0005626463.localdomain podman[91283]: 2026-02-23 08:35:28.1751201 +0000 UTC m=+0.343515221 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-iscsid-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, url=https://www.redhat.com, vcs-ref=705339545363fec600102567c4e923938e0f43b3, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, io.openshift.expose-services=, managed_by=tripleo_ansible, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, name=rhosp-rhel9/openstack-iscsid, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, version=17.1.13, vcs-type=git, release=1766032510, container_name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, build-date=2026-01-12T22:34:43Z, vendor=Red Hat, Inc., distribution-scope=public)
Feb 23 08:35:28 np0005626463.localdomain systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully.
Feb 23 08:35:28 np0005626463.localdomain systemd[1]: tmp-crun.OGlHXQ.mount: Deactivated successfully.
Feb 23 08:35:30 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.
Feb 23 08:35:30 np0005626463.localdomain podman[91396]: 2026-02-23 08:35:30.897593642 +0000 UTC m=+0.075537131 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.buildah.version=1.41.5, version=17.1.13, tcib_managed=true, vcs-type=git, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:32:04Z, batch=17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-compute, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, io.openshift.expose-services=)
Feb 23 08:35:31 np0005626463.localdomain podman[91396]: 2026-02-23 08:35:31.253501307 +0000 UTC m=+0.431444806 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T23:32:04Z, maintainer=OpenStack TripleO Team, release=1766032510, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, url=https://www.redhat.com, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, managed_by=tripleo_ansible, build-date=2026-01-12T23:32:04Z, container_name=nova_migration_target, version=17.1.13, vendor=Red Hat, Inc., config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container)
Feb 23 08:35:31 np0005626463.localdomain systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully.
Feb 23 08:35:32 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.
Feb 23 08:35:32 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.
Feb 23 08:35:32 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.
Feb 23 08:35:32 np0005626463.localdomain podman[91420]: 2026-02-23 08:35:32.920139252 +0000 UTC m=+0.088354500 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, batch=17.1_20260112.1, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., build-date=2026-01-12T22:56:19Z, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, config_id=tripleo_step4, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1766032510, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:56:19Z, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, vcs-type=git, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, tcib_managed=true)
Feb 23 08:35:32 np0005626463.localdomain podman[91419]: 2026-02-23 08:35:32.903951895 +0000 UTC m=+0.077555615 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, build-date=2026-01-12T22:36:40Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, vcs-type=git, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_controller, io.buildah.version=1.41.5, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, distribution-scope=public, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-ovn-controller, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, vendor=Red Hat, Inc., com.redhat.component=openstack-ovn-controller-container, io.openshift.expose-services=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com)
Feb 23 08:35:32 np0005626463.localdomain podman[91420]: 2026-02-23 08:35:32.969198347 +0000 UTC m=+0.137413575 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, io.openshift.expose-services=, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:56:19Z, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, build-date=2026-01-12T22:56:19Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, release=1766032510, url=https://www.redhat.com, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, container_name=ovn_metadata_agent, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn)
Feb 23 08:35:32 np0005626463.localdomain systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Deactivated successfully.
Feb 23 08:35:32 np0005626463.localdomain podman[91419]: 2026-02-23 08:35:32.988264865 +0000 UTC m=+0.161868555 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:36:40Z, release=1766032510, tcib_managed=true, version=17.1.13, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-ovn-controller-container, url=https://www.redhat.com, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-ovn-controller, vcs-type=git, build-date=2026-01-12T22:36:40Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, distribution-scope=public)
Feb 23 08:35:33 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Deactivated successfully.
Feb 23 08:35:33 np0005626463.localdomain podman[91421]: 2026-02-23 08:35:32.972030067 +0000 UTC m=+0.136364232 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc., managed_by=tripleo_ansible, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.buildah.version=1.41.5, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, release=1766032510, io.openshift.expose-services=, vcs-type=git, name=rhosp-rhel9/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, architecture=x86_64, config_id=tripleo_step1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=metrics_qdr, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee)
Feb 23 08:35:33 np0005626463.localdomain podman[91421]: 2026-02-23 08:35:33.15326069 +0000 UTC m=+0.317594905 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, batch=17.1_20260112.1, url=https://www.redhat.com, managed_by=tripleo_ansible, distribution-scope=public, container_name=metrics_qdr, release=1766032510, build-date=2026-01-12T22:10:14Z, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, name=rhosp-rhel9/openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1)
Feb 23 08:35:33 np0005626463.localdomain systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully.
Feb 23 08:35:33 np0005626463.localdomain systemd[1]: tmp-crun.RmNRg3.mount: Deactivated successfully.
Feb 23 08:35:47 np0005626463.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Feb 23 08:35:47 np0005626463.localdomain recover_tripleo_nova_virtqemud[91515]: 61982
Feb 23 08:35:47 np0005626463.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Feb 23 08:35:47 np0005626463.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Feb 23 08:35:54 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.
Feb 23 08:35:54 np0005626463.localdomain podman[91516]: 2026-02-23 08:35:54.923569455 +0000 UTC m=+0.094481236 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-collectd, config_id=tripleo_step3, tcib_managed=true, io.openshift.expose-services=, container_name=collectd, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, release=1766032510, com.redhat.component=openstack-collectd-container, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, build-date=2026-01-12T22:10:15Z, batch=17.1_20260112.1, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, vcs-type=git)
Feb 23 08:35:54 np0005626463.localdomain podman[91516]: 2026-02-23 08:35:54.93439811 +0000 UTC m=+0.105309891 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:10:15Z, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, distribution-scope=public, name=rhosp-rhel9/openstack-collectd, batch=17.1_20260112.1, vcs-type=git, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, release=1766032510, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.component=openstack-collectd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, version=17.1.13)
Feb 23 08:35:54 np0005626463.localdomain systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully.
Feb 23 08:35:56 np0005626463.localdomain sshd[91536]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:35:56 np0005626463.localdomain sshd[91536]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:35:58 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.
Feb 23 08:35:58 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.
Feb 23 08:35:58 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.
Feb 23 08:35:58 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.
Feb 23 08:35:58 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.
Feb 23 08:35:58 np0005626463.localdomain systemd[1]: tmp-crun.lKJz9m.mount: Deactivated successfully.
Feb 23 08:35:58 np0005626463.localdomain podman[91539]: 2026-02-23 08:35:58.976106432 +0000 UTC m=+0.134667608 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, managed_by=tripleo_ansible, version=17.1.13, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, config_id=tripleo_step4, url=https://www.redhat.com, release=1766032510, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=ceilometer_agent_compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:47Z, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06)
Feb 23 08:35:58 np0005626463.localdomain podman[91538]: 2026-02-23 08:35:58.938938315 +0000 UTC m=+0.100176707 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.5, com.redhat.component=openstack-iscsid-container, architecture=x86_64, distribution-scope=public, maintainer=OpenStack TripleO Team, release=1766032510, name=rhosp-rhel9/openstack-iscsid, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, managed_by=tripleo_ansible, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:34:43Z, build-date=2026-01-12T22:34:43Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-ref=705339545363fec600102567c4e923938e0f43b3)
Feb 23 08:35:59 np0005626463.localdomain podman[91540]: 2026-02-23 08:35:59.061828026 +0000 UTC m=+0.216710285 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, tcib_managed=true, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, release=1766032510, managed_by=tripleo_ansible, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-type=git, build-date=2026-01-12T23:07:30Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-ceilometer-ipmi)
Feb 23 08:35:59 np0005626463.localdomain podman[91540]: 2026-02-23 08:35:59.090425348 +0000 UTC m=+0.245307667 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.created=2026-01-12T23:07:30Z, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-ceilometer-ipmi, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, io.openshift.expose-services=, 
io.buildah.version=1.41.5, url=https://www.redhat.com, distribution-scope=public, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, build-date=2026-01-12T23:07:30Z, container_name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1766032510, com.redhat.component=openstack-ceilometer-ipmi-container, tcib_managed=true, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13, batch=17.1_20260112.1)
Feb 23 08:35:59 np0005626463.localdomain podman[91538]: 2026-02-23 08:35:59.090816301 +0000 UTC m=+0.252054693 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vendor=Red Hat, Inc., io.openshift.expose-services=, vcs-type=git, io.buildah.version=1.41.5, batch=17.1_20260112.1, container_name=iscsid, release=1766032510, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-ref=705339545363fec600102567c4e923938e0f43b3, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack 
Platform 17.1 iscsid, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, build-date=2026-01-12T22:34:43Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:34:43Z)
Feb 23 08:35:59 np0005626463.localdomain podman[91543]: 2026-02-23 08:35:58.964950706 +0000 UTC m=+0.112983537 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-cron, io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., managed_by=tripleo_ansible, tcib_managed=true, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.description=Red Hat OpenStack Platform 
17.1 cron, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.component=openstack-cron-container, io.openshift.expose-services=, batch=17.1_20260112.1, build-date=2026-01-12T22:10:15Z, container_name=logrotate_crond, architecture=x86_64, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, version=17.1.13, vcs-type=git, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee)
Feb 23 08:35:59 np0005626463.localdomain podman[91543]: 2026-02-23 08:35:59.119731774 +0000 UTC m=+0.267764585 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, batch=17.1_20260112.1, build-date=2026-01-12T22:10:15Z, name=rhosp-rhel9/openstack-cron, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, release=1766032510, description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, container_name=logrotate_crond, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, version=17.1.13, config_id=tripleo_step4, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, architecture=x86_64, com.redhat.component=openstack-cron-container, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git)
Feb 23 08:35:59 np0005626463.localdomain systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully.
Feb 23 08:35:59 np0005626463.localdomain podman[91539]: 2026-02-23 08:35:59.175469502 +0000 UTC m=+0.334030728 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, config_id=tripleo_step4, distribution-scope=public, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, managed_by=tripleo_ansible, vcs-type=git, tcib_managed=true, org.opencontainers.image.created=2026-01-12T23:07:47Z, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-ceilometer-compute, batch=17.1_20260112.1, com.redhat.component=openstack-ceilometer-compute-container, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., version=17.1.13, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:47Z, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1766032510)
Feb 23 08:35:59 np0005626463.localdomain systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully.
Feb 23 08:35:59 np0005626463.localdomain systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully.
Feb 23 08:35:59 np0005626463.localdomain systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully.
Feb 23 08:35:59 np0005626463.localdomain podman[91552]: 2026-02-23 08:35:59.041101915 +0000 UTC m=+0.186242564 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, vcs-type=git, org.opencontainers.image.created=2026-01-12T23:32:04Z, build-date=2026-01-12T23:32:04Z, distribution-scope=public, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, container_name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.5, url=https://www.redhat.com, version=17.1.13, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute)
Feb 23 08:35:59 np0005626463.localdomain podman[91552]: 2026-02-23 08:35:59.327504162 +0000 UTC m=+0.472644781 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.5, version=17.1.13, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, org.opencontainers.image.created=2026-01-12T23:32:04Z, release=1766032510, config_id=tripleo_step5, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., managed_by=tripleo_ansible, tcib_managed=true, container_name=nova_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 23 08:35:59 np0005626463.localdomain systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Deactivated successfully.
Feb 23 08:36:01 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.
Feb 23 08:36:01 np0005626463.localdomain systemd[1]: tmp-crun.3mUafh.mount: Deactivated successfully.
Feb 23 08:36:01 np0005626463.localdomain podman[91653]: 2026-02-23 08:36:01.928492808 +0000 UTC m=+0.098567306 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, container_name=nova_migration_target, architecture=x86_64, 
build-date=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20260112.1, managed_by=tripleo_ansible, vcs-type=git, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, distribution-scope=public, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc.)
Feb 23 08:36:02 np0005626463.localdomain podman[91653]: 2026-02-23 08:36:02.310372442 +0000 UTC m=+0.480446950 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, container_name=nova_migration_target, managed_by=tripleo_ansible, release=1766032510, tcib_managed=true, vcs-type=git, build-date=2026-01-12T23:32:04Z, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, architecture=x86_64, io.buildah.version=1.41.5, io.openshift.expose-services=, version=17.1.13, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step4, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 23 08:36:02 np0005626463.localdomain systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully.
Feb 23 08:36:03 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.
Feb 23 08:36:03 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.
Feb 23 08:36:03 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.
Feb 23 08:36:03 np0005626463.localdomain podman[91680]: 2026-02-23 08:36:03.922091274 +0000 UTC m=+0.091608114 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, com.redhat.component=openstack-qdrouterd-container, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public, batch=17.1_20260112.1, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=metrics_qdr, config_id=tripleo_step1, org.opencontainers.image.created=2026-01-12T22:10:14Z, version=17.1.13, managed_by=tripleo_ansible, url=https://www.redhat.com, architecture=x86_64, name=rhosp-rhel9/openstack-qdrouterd, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., vcs-type=git, tcib_managed=true)
Feb 23 08:36:03 np0005626463.localdomain systemd[1]: tmp-crun.bkVTTI.mount: Deactivated successfully.
Feb 23 08:36:03 np0005626463.localdomain podman[91679]: 2026-02-23 08:36:03.982945556 +0000 UTC m=+0.157721413 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, url=https://www.redhat.com, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, tcib_managed=true, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, io.buildah.version=1.41.5, architecture=x86_64, 
com.redhat.component=openstack-neutron-metadata-agent-ovn-container, container_name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, batch=17.1_20260112.1, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., release=1766032510, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.openshift.expose-services=, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, distribution-scope=public)
Feb 23 08:36:04 np0005626463.localdomain podman[91678]: 2026-02-23 08:36:04.029572164 +0000 UTC m=+0.205473908 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:36:40Z, architecture=x86_64, url=https://www.redhat.com, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.openshift.expose-services=, name=rhosp-rhel9/openstack-ovn-controller, vcs-type=git, batch=17.1_20260112.1, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2026-01-12T22:36:40Z, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, container_name=ovn_controller, version=17.1.13, com.redhat.component=openstack-ovn-controller-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, managed_by=tripleo_ansible, vendor=Red Hat, Inc., 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, maintainer=OpenStack TripleO Team, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, release=1766032510)
Feb 23 08:36:04 np0005626463.localdomain podman[91678]: 2026-02-23 08:36:04.056170182 +0000 UTC m=+0.232071896 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, release=1766032510, build-date=2026-01-12T22:36:40Z, version=17.1.13, architecture=x86_64, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.openshift.expose-services=, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, url=https://www.redhat.com, batch=17.1_20260112.1, com.redhat.component=openstack-ovn-controller-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-ovn-controller, managed_by=tripleo_ansible, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T22:36:40Z)
Feb 23 08:36:04 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Deactivated successfully.
Feb 23 08:36:04 np0005626463.localdomain podman[91679]: 2026-02-23 08:36:04.081524051 +0000 UTC m=+0.256299948 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, build-date=2026-01-12T22:56:19Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:56:19Z, tcib_managed=true, vendor=Red Hat, Inc., distribution-scope=public, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, io.openshift.expose-services=, container_name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, version=17.1.13, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, release=1766032510, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Feb 23 08:36:04 np0005626463.localdomain systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Deactivated successfully.
Feb 23 08:36:04 np0005626463.localdomain podman[91680]: 2026-02-23 08:36:04.129480361 +0000 UTC m=+0.298997241 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, build-date=2026-01-12T22:10:14Z, distribution-scope=public, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, name=rhosp-rhel9/openstack-qdrouterd, tcib_managed=true, version=17.1.13, io.buildah.version=1.41.5, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:10:14Z, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git)
Feb 23 08:36:04 np0005626463.localdomain systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully.
Feb 23 08:36:04 np0005626463.localdomain sshd[91755]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:36:04 np0005626463.localdomain sshd[91755]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:36:19 np0005626463.localdomain sudo[91757]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 08:36:19 np0005626463.localdomain sudo[91757]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:36:19 np0005626463.localdomain sudo[91757]: pam_unix(sudo:session): session closed for user root
Feb 23 08:36:20 np0005626463.localdomain sudo[91772]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 23 08:36:20 np0005626463.localdomain sudo[91772]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:36:20 np0005626463.localdomain sudo[91772]: pam_unix(sudo:session): session closed for user root
Feb 23 08:36:22 np0005626463.localdomain sudo[91820]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 08:36:22 np0005626463.localdomain sudo[91820]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:36:22 np0005626463.localdomain sudo[91820]: pam_unix(sudo:session): session closed for user root
Feb 23 08:36:25 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.
Feb 23 08:36:25 np0005626463.localdomain systemd[1]: tmp-crun.PwhsjV.mount: Deactivated successfully.
Feb 23 08:36:25 np0005626463.localdomain podman[91835]: 2026-02-23 08:36:25.926086216 +0000 UTC m=+0.096081516 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, org.opencontainers.image.created=2026-01-12T22:10:15Z, batch=17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, io.openshift.expose-services=, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:10:15Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.component=openstack-collectd-container, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 collectd)
Feb 23 08:36:25 np0005626463.localdomain podman[91835]: 2026-02-23 08:36:25.941363473 +0000 UTC m=+0.111358783 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, container_name=collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step3, version=17.1.13, vcs-type=git, io.buildah.version=1.41.5, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', 
'/sys/fs/cgroup:/sys/fs/cgroup:ro']}, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, tcib_managed=true, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.expose-services=, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:10:15Z, release=1766032510, com.redhat.component=openstack-collectd-container)
Feb 23 08:36:25 np0005626463.localdomain systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully.
Feb 23 08:36:29 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.
Feb 23 08:36:29 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.
Feb 23 08:36:29 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.
Feb 23 08:36:29 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.
Feb 23 08:36:29 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.
Feb 23 08:36:29 np0005626463.localdomain podman[91856]: 2026-02-23 08:36:29.922253476 +0000 UTC m=+0.091255412 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, distribution-scope=public, io.buildah.version=1.41.5, config_id=tripleo_step3, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, name=rhosp-rhel9/openstack-iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, architecture=x86_64, com.redhat.component=openstack-iscsid-container, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, managed_by=tripleo_ansible, build-date=2026-01-12T22:34:43Z, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc.)
Feb 23 08:36:29 np0005626463.localdomain podman[91856]: 2026-02-23 08:36:29.935080995 +0000 UTC m=+0.104082901 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, config_id=tripleo_step3, name=rhosp-rhel9/openstack-iscsid, release=1766032510, build-date=2026-01-12T22:34:43Z, com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1, version=17.1.13, 
architecture=x86_64, description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, vendor=Red Hat, Inc., org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, container_name=iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, tcib_managed=true)
Feb 23 08:36:29 np0005626463.localdomain systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully.
Feb 23 08:36:29 np0005626463.localdomain podman[91865]: 2026-02-23 08:36:29.985444062 +0000 UTC m=+0.138099307 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_compute, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, tcib_managed=true, url=https://www.redhat.com, config_id=tripleo_step5, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, release=1766032510, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.5, architecture=x86_64, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe)
Feb 23 08:36:29 np0005626463.localdomain podman[91864]: 2026-02-23 08:36:29.938091731 +0000 UTC m=+0.092152801 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, version=17.1.13, name=rhosp-rhel9/openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, io.openshift.expose-services=, com.redhat.component=openstack-cron-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, maintainer=OpenStack TripleO Team, container_name=logrotate_crond, managed_by=tripleo_ansible, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, io.buildah.version=1.41.5, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:10:15Z, tcib_managed=true, build-date=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 23 08:36:30 np0005626463.localdomain podman[91865]: 2026-02-23 08:36:30.037195013 +0000 UTC m=+0.189850278 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2026-01-12T23:32:04Z, vcs-type=git, container_name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, version=17.1.13, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, release=1766032510, config_id=tripleo_step5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, vendor=Red Hat, Inc., io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-nova-compute, tcib_managed=true, batch=17.1_20260112.1, architecture=x86_64)
Feb 23 08:36:30 np0005626463.localdomain podman[91864]: 2026-02-23 08:36:30.071932752 +0000 UTC m=+0.225993812 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, tcib_managed=true, io.openshift.expose-services=, name=rhosp-rhel9/openstack-cron, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, config_id=tripleo_step4, vendor=Red Hat, Inc., io.buildah.version=1.41.5, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, release=1766032510, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.component=openstack-cron-container, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, build-date=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, container_name=logrotate_crond)
Feb 23 08:36:30 np0005626463.localdomain podman[91857]: 2026-02-23 08:36:30.080506075 +0000 UTC m=+0.243805099 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1766032510, url=https://www.redhat.com, vcs-type=git, distribution-scope=public, name=rhosp-rhel9/openstack-ceilometer-compute, tcib_managed=true, io.buildah.version=1.41.5, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2026-01-12T23:07:47Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 
'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, architecture=x86_64)
Feb 23 08:36:30 np0005626463.localdomain systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully.
Feb 23 08:36:30 np0005626463.localdomain systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Deactivated successfully.
Feb 23 08:36:30 np0005626463.localdomain podman[91858]: 2026-02-23 08:36:30.053473512 +0000 UTC m=+0.212725738 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, com.redhat.component=openstack-ceilometer-ipmi-container, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
tcib_managed=true, container_name=ceilometer_agent_ipmi, io.openshift.expose-services=, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:07:30Z, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, build-date=2026-01-12T23:07:30Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-ceilometer-ipmi)
Feb 23 08:36:30 np0005626463.localdomain podman[91858]: 2026-02-23 08:36:30.132772132 +0000 UTC m=+0.292024388 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.created=2026-01-12T23:07:30Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, batch=17.1_20260112.1, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, config_id=tripleo_step4, managed_by=tripleo_ansible, distribution-scope=public, tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, architecture=x86_64)
Feb 23 08:36:30 np0005626463.localdomain podman[91857]: 2026-02-23 08:36:30.140308003 +0000 UTC m=+0.303607007 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, build-date=2026-01-12T23:07:47Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, config_id=tripleo_step4, container_name=ceilometer_agent_compute, vcs-type=git, architecture=x86_64, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20260112.1, com.redhat.component=openstack-ceilometer-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.expose-services=, tcib_managed=true, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vendor=Red Hat, Inc., vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06)
Feb 23 08:36:30 np0005626463.localdomain systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully.
Feb 23 08:36:30 np0005626463.localdomain systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully.
Feb 23 08:36:32 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.
Feb 23 08:36:32 np0005626463.localdomain podman[91970]: 2026-02-23 08:36:32.924944116 +0000 UTC m=+0.091268962 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, name=rhosp-rhel9/openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., version=17.1.13, io.openshift.tags=rhosp osp 
openstack osp-17.1 openstack-nova-compute, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, vcs-type=git, managed_by=tripleo_ansible, build-date=2026-01-12T23:32:04Z, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step4, container_name=nova_migration_target, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 23 08:36:33 np0005626463.localdomain podman[91970]: 2026-02-23 08:36:33.336473056 +0000 UTC m=+0.502797912 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, container_name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, konflux.additional-tags=17.1.13 
17.1_20260112.1, version=17.1.13, config_id=tripleo_step4, managed_by=tripleo_ansible, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, org.opencontainers.image.created=2026-01-12T23:32:04Z, vendor=Red Hat, Inc., vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, build-date=2026-01-12T23:32:04Z, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, url=https://www.redhat.com)
Feb 23 08:36:33 np0005626463.localdomain systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully.
Feb 23 08:36:34 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.
Feb 23 08:36:34 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.
Feb 23 08:36:34 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.
Feb 23 08:36:34 np0005626463.localdomain systemd[1]: tmp-crun.3nluua.mount: Deactivated successfully.
Feb 23 08:36:34 np0005626463.localdomain podman[91993]: 2026-02-23 08:36:34.933162788 +0000 UTC m=+0.109020009 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, com.redhat.component=openstack-ovn-controller-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, release=1766032510, org.opencontainers.image.created=2026-01-12T22:36:40Z, distribution-scope=public, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, summary=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.5, tcib_managed=true, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, vcs-type=git, url=https://www.redhat.com, io.openshift.expose-services=, vendor=Red Hat, Inc., build-date=2026-01-12T22:36:40Z, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, 
name=rhosp-rhel9/openstack-ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c)
Feb 23 08:36:34 np0005626463.localdomain systemd[1]: tmp-crun.h1XlOs.mount: Deactivated successfully.
Feb 23 08:36:34 np0005626463.localdomain podman[91993]: 2026-02-23 08:36:34.982051049 +0000 UTC m=+0.157908260 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, name=rhosp-rhel9/openstack-ovn-controller, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=ovn_controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.component=openstack-ovn-controller-container, url=https://www.redhat.com, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.openshift.expose-services=, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2026-01-12T22:36:40Z, vendor=Red Hat, Inc., config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, maintainer=OpenStack TripleO Team, release=1766032510, io.buildah.version=1.41.5, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, tcib_managed=true, 
version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:36:40Z)
Feb 23 08:36:34 np0005626463.localdomain podman[91995]: 2026-02-23 08:36:34.992580454 +0000 UTC m=+0.158823238 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, build-date=2026-01-12T22:10:14Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, 
config_id=tripleo_step1, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-qdrouterd, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, architecture=x86_64, container_name=metrics_qdr, batch=17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:10:14Z, com.redhat.component=openstack-qdrouterd-container, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee)
Feb 23 08:36:34 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Deactivated successfully.
Feb 23 08:36:35 np0005626463.localdomain podman[91994]: 2026-02-23 08:36:35.031169095 +0000 UTC m=+0.202800391 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, build-date=2026-01-12T22:56:19Z, io.buildah.version=1.41.5, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, maintainer=OpenStack TripleO Team, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 
neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, release=1766032510, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, tcib_managed=true, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, org.opencontainers.image.created=2026-01-12T22:56:19Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, container_name=ovn_metadata_agent, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 23 08:36:35 np0005626463.localdomain podman[91994]: 2026-02-23 08:36:35.108642327 +0000 UTC m=+0.280273603 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, version=17.1.13, managed_by=tripleo_ansible, build-date=2026-01-12T22:56:19Z, batch=17.1_20260112.1, distribution-scope=public, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-type=git, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, tcib_managed=true, config_id=tripleo_step4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, container_name=ovn_metadata_agent, release=1766032510)
Feb 23 08:36:35 np0005626463.localdomain systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Deactivated successfully.
Feb 23 08:36:35 np0005626463.localdomain podman[91995]: 2026-02-23 08:36:35.23534935 +0000 UTC m=+0.401592074 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, org.opencontainers.image.created=2026-01-12T22:10:14Z, config_id=tripleo_step1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, vcs-type=git, tcib_managed=true, batch=17.1_20260112.1, architecture=x86_64, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vendor=Red Hat, Inc., org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, container_name=metrics_qdr, build-date=2026-01-12T22:10:14Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-qdrouterd, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']})
Feb 23 08:36:35 np0005626463.localdomain systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully.
Feb 23 08:36:42 np0005626463.localdomain sshd[92065]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:36:42 np0005626463.localdomain sshd[92066]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:36:43 np0005626463.localdomain sshd[92065]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:36:44 np0005626463.localdomain sshd[92066]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:36:56 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.
Feb 23 08:36:56 np0005626463.localdomain podman[92069]: 2026-02-23 08:36:56.926712389 +0000 UTC m=+0.096744168 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:10:15Z, summary=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, batch=17.1_20260112.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, url=https://www.redhat.com, architecture=x86_64, managed_by=tripleo_ansible, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 23 08:36:56 np0005626463.localdomain podman[92069]: 2026-02-23 08:36:56.938302548 +0000 UTC m=+0.108334277 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, version=17.1.13, vcs-type=git, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, distribution-scope=public, 
name=rhosp-rhel9/openstack-collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-collectd-container, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:10:15Z)
Feb 23 08:36:56 np0005626463.localdomain systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully.
Feb 23 08:37:00 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.
Feb 23 08:37:00 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.
Feb 23 08:37:00 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.
Feb 23 08:37:00 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.
Feb 23 08:37:00 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.
Feb 23 08:37:00 np0005626463.localdomain systemd[1]: tmp-crun.v4zxiY.mount: Deactivated successfully.
Feb 23 08:37:00 np0005626463.localdomain podman[92088]: 2026-02-23 08:37:00.923692047 +0000 UTC m=+0.089908226 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, architecture=x86_64, com.redhat.component=openstack-ceilometer-compute-container, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-ceilometer-compute, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, io.buildah.version=1.41.5, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, 
release=1766032510, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2026-01-12T23:07:47Z, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T23:07:47Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., container_name=ceilometer_agent_compute, version=17.1.13, batch=17.1_20260112.1)
Feb 23 08:37:00 np0005626463.localdomain podman[92090]: 2026-02-23 08:37:00.993964243 +0000 UTC m=+0.154400343 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 cron, release=1766032510, vcs-type=git, vendor=Red Hat, Inc., container_name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, architecture=x86_64, name=rhosp-rhel9/openstack-cron, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, 
cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, version=17.1.13, config_id=tripleo_step4, build-date=2026-01-12T22:10:15Z, distribution-scope=public, com.redhat.component=openstack-cron-container)
Feb 23 08:37:01 np0005626463.localdomain podman[92090]: 2026-02-23 08:37:01.009291298 +0000 UTC m=+0.169727438 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, container_name=logrotate_crond, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, release=1766032510, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, 
name=rhosp-rhel9/openstack-cron, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-cron-container, config_id=tripleo_step4, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:15Z)
Feb 23 08:37:01 np0005626463.localdomain systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully.
Feb 23 08:37:01 np0005626463.localdomain podman[92089]: 2026-02-23 08:37:00.969767733 +0000 UTC m=+0.131538805 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, build-date=2026-01-12T23:07:30Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, architecture=x86_64, url=https://www.redhat.com, batch=17.1_20260112.1, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:07:30Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, name=rhosp-rhel9/openstack-ceilometer-ipmi, io.buildah.version=1.41.5, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-ceilometer-ipmi-container, version=17.1.13, release=1766032510, vcs-type=git)
Feb 23 08:37:01 np0005626463.localdomain podman[92089]: 2026-02-23 08:37:01.049586206 +0000 UTC m=+0.211357308 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.created=2026-01-12T23:07:30Z, config_id=tripleo_step4, container_name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, 
vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vendor=Red Hat, Inc., build-date=2026-01-12T23:07:30Z, name=rhosp-rhel9/openstack-ceilometer-ipmi, io.buildah.version=1.41.5, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, version=17.1.13, distribution-scope=public, com.redhat.component=openstack-ceilometer-ipmi-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, architecture=x86_64)
Feb 23 08:37:01 np0005626463.localdomain systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully.
Feb 23 08:37:01 np0005626463.localdomain podman[92093]: 2026-02-23 08:37:01.102616669 +0000 UTC m=+0.256169866 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, release=1766032510, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', 
'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step5, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, url=https://www.redhat.com, name=rhosp-rhel9/openstack-nova-compute, architecture=x86_64, io.buildah.version=1.41.5, batch=17.1_20260112.1, container_name=nova_compute, managed_by=tripleo_ansible, build-date=2026-01-12T23:32:04Z)
Feb 23 08:37:01 np0005626463.localdomain podman[92088]: 2026-02-23 08:37:01.107108147 +0000 UTC m=+0.273324416 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git, release=1766032510, io.buildah.version=1.41.5, tcib_managed=true, com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2026-01-12T23:07:47Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.13, io.openshift.expose-services=, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 23 08:37:01 np0005626463.localdomain systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully.
Feb 23 08:37:01 np0005626463.localdomain podman[92087]: 2026-02-23 08:37:01.201183591 +0000 UTC m=+0.370090843 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, url=https://www.redhat.com, name=rhosp-rhel9/openstack-iscsid, vendor=Red Hat, Inc., org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.5, config_id=tripleo_step3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, architecture=x86_64, build-date=2026-01-12T22:34:43Z, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, container_name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13, release=1766032510, managed_by=tripleo_ansible, vcs-ref=705339545363fec600102567c4e923938e0f43b3, vcs-type=git, batch=17.1_20260112.1, tcib_managed=true, io.openshift.expose-services=)
Feb 23 08:37:01 np0005626463.localdomain podman[92087]: 2026-02-23 08:37:01.213358218 +0000 UTC m=+0.382265460 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, distribution-scope=public, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:34:43Z, config_id=tripleo_step3, version=17.1.13, release=1766032510, vcs-ref=705339545363fec600102567c4e923938e0f43b3, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-iscsid-container, io.buildah.version=1.41.5, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git)
Feb 23 08:37:01 np0005626463.localdomain systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully.
Feb 23 08:37:01 np0005626463.localdomain podman[92093]: 2026-02-23 08:37:01.227194917 +0000 UTC m=+0.380748124 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, container_name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', 
'/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step5, io.openshift.expose-services=, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T23:32:04Z, name=rhosp-rhel9/openstack-nova-compute, distribution-scope=public, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, release=1766032510)
Feb 23 08:37:01 np0005626463.localdomain systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Deactivated successfully.
Feb 23 08:37:03 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.
Feb 23 08:37:03 np0005626463.localdomain podman[92199]: 2026-02-23 08:37:03.906690479 +0000 UTC m=+0.080228416 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vcs-type=git, architecture=x86_64, name=rhosp-rhel9/openstack-nova-compute, build-date=2026-01-12T23:32:04Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, org.opencontainers.image.created=2026-01-12T23:32:04Z)
Feb 23 08:37:04 np0005626463.localdomain podman[92199]: 2026-02-23 08:37:04.328858354 +0000 UTC m=+0.502396241 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, url=https://www.redhat.com, distribution-scope=public, maintainer=OpenStack TripleO Team, vcs-type=git, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, name=rhosp-rhel9/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z, tcib_managed=true, io.buildah.version=1.41.5)
Feb 23 08:37:04 np0005626463.localdomain systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully.
Feb 23 08:37:05 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.
Feb 23 08:37:05 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.
Feb 23 08:37:05 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.
Feb 23 08:37:05 np0005626463.localdomain podman[92222]: 2026-02-23 08:37:05.911382119 +0000 UTC m=+0.085677954 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, distribution-scope=public, url=https://www.redhat.com, vcs-type=git, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vendor=Red Hat, Inc., com.redhat.component=openstack-ovn-controller-container, tcib_managed=true, release=1766032510, batch=17.1_20260112.1, config_id=tripleo_step4, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:36:40Z, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat 
OpenStack Platform 17.1 ovn-controller, version=17.1.13, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, summary=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c)
Feb 23 08:37:05 np0005626463.localdomain systemd[1]: tmp-crun.JgkojT.mount: Deactivated successfully.
Feb 23 08:37:05 np0005626463.localdomain podman[92223]: 2026-02-23 08:37:05.976528386 +0000 UTC m=+0.144757093 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, version=17.1.13, url=https://www.redhat.com, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1766032510, architecture=x86_64, container_name=ovn_metadata_agent, tcib_managed=true, build-date=2026-01-12T22:56:19Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-type=git, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, batch=17.1_20260112.1, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:56:19Z)
Feb 23 08:37:06 np0005626463.localdomain podman[92224]: 2026-02-23 08:37:06.022598414 +0000 UTC m=+0.189395208 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.buildah.version=1.41.5, container_name=metrics_qdr, vendor=Red Hat, Inc., url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_id=tripleo_step1, vcs-type=git, release=1766032510, architecture=x86_64, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, 
tcib_managed=true, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, build-date=2026-01-12T22:10:14Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public)
Feb 23 08:37:06 np0005626463.localdomain podman[92222]: 2026-02-23 08:37:06.044378219 +0000 UTC m=+0.218674044 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, architecture=x86_64, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, batch=17.1_20260112.1, vcs-type=git, io.buildah.version=1.41.5, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_id=tripleo_step4, distribution-scope=public, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, version=17.1.13, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, org.opencontainers.image.created=2026-01-12T22:36:40Z, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2026-01-12T22:36:40Z, io.openshift.expose-services=, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, container_name=ovn_controller, tcib_managed=true)
Feb 23 08:37:06 np0005626463.localdomain podman[92223]: 2026-02-23 08:37:06.053440309 +0000 UTC m=+0.221668976 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, build-date=2026-01-12T22:56:19Z, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, architecture=x86_64, tcib_managed=true, vcs-type=git, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, config_id=tripleo_step4, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, version=17.1.13, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, container_name=ovn_metadata_agent, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 23 08:37:06 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Deactivated successfully.
Feb 23 08:37:06 np0005626463.localdomain systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Deactivated successfully.
Feb 23 08:37:06 np0005626463.localdomain podman[92224]: 2026-02-23 08:37:06.318432086 +0000 UTC m=+0.485228880 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, version=17.1.13, com.redhat.component=openstack-qdrouterd-container, build-date=2026-01-12T22:10:14Z, config_id=tripleo_step1, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, container_name=metrics_qdr, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vcs-type=git, vendor=Red Hat, Inc., managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-qdrouterd, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee)
Feb 23 08:37:06 np0005626463.localdomain systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully.
Feb 23 08:37:21 np0005626463.localdomain sshd[92300]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:37:22 np0005626463.localdomain sshd[92300]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:37:22 np0005626463.localdomain sudo[92302]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 08:37:22 np0005626463.localdomain sudo[92302]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:37:22 np0005626463.localdomain sudo[92302]: pam_unix(sudo:session): session closed for user root
Feb 23 08:37:22 np0005626463.localdomain sudo[92317]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 23 08:37:22 np0005626463.localdomain sudo[92317]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:37:23 np0005626463.localdomain sudo[92317]: pam_unix(sudo:session): session closed for user root
Feb 23 08:37:27 np0005626463.localdomain sudo[92364]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 08:37:27 np0005626463.localdomain sudo[92364]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:37:27 np0005626463.localdomain sudo[92364]: pam_unix(sudo:session): session closed for user root
Feb 23 08:37:27 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.
Feb 23 08:37:27 np0005626463.localdomain systemd[1]: tmp-crun.D0wkt6.mount: Deactivated successfully.
Feb 23 08:37:27 np0005626463.localdomain podman[92379]: 2026-02-23 08:37:27.647226039 +0000 UTC m=+0.093835687 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20260112.1, tcib_managed=true, com.redhat.component=openstack-collectd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_id=tripleo_step3, version=17.1.13, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 collectd, build-date=2026-01-12T22:10:15Z, release=1766032510, org.opencontainers.image.created=2026-01-12T22:10:15Z, architecture=x86_64, managed_by=tripleo_ansible, io.buildah.version=1.41.5, distribution-scope=public, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, url=https://www.redhat.com, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=)
Feb 23 08:37:27 np0005626463.localdomain podman[92379]: 2026-02-23 08:37:27.660916253 +0000 UTC m=+0.107525931 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-type=git, managed_by=tripleo_ansible, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, tcib_managed=true, version=17.1.13, maintainer=OpenStack TripleO Team, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, architecture=x86_64, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 collectd, build-date=2026-01-12T22:10:15Z, io.openshift.expose-services=, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-collectd, container_name=collectd, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.component=openstack-collectd-container, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd)
Feb 23 08:37:27 np0005626463.localdomain systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully.
Feb 23 08:37:31 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.
Feb 23 08:37:31 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.
Feb 23 08:37:31 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.
Feb 23 08:37:31 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.
Feb 23 08:37:31 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.
Feb 23 08:37:31 np0005626463.localdomain systemd[1]: tmp-crun.befOUE.mount: Deactivated successfully.
Feb 23 08:37:31 np0005626463.localdomain podman[92401]: 2026-02-23 08:37:31.933317871 +0000 UTC m=+0.101002799 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vendor=Red Hat, Inc., managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:07:47Z, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-ceilometer-compute-container, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2026-01-12T23:07:47Z, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, io.buildah.version=1.41.5, container_name=ceilometer_agent_compute, config_id=tripleo_step4, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git, architecture=x86_64, name=rhosp-rhel9/openstack-ceilometer-compute, maintainer=OpenStack TripleO Team)
Feb 23 08:37:31 np0005626463.localdomain podman[92401]: 2026-02-23 08:37:31.967253582 +0000 UTC m=+0.134938510 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, batch=17.1_20260112.1, tcib_managed=true, vendor=Red Hat, Inc., distribution-scope=public, container_name=ceilometer_agent_compute, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-ceilometer-compute-container, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, build-date=2026-01-12T23:07:47Z, config_id=tripleo_step4, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64, url=https://www.redhat.com, release=1766032510, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06)
Feb 23 08:37:31 np0005626463.localdomain podman[92400]: 2026-02-23 08:37:31.982255008 +0000 UTC m=+0.149493902 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, config_id=tripleo_step3, vendor=Red Hat, Inc., org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, com.redhat.component=openstack-iscsid-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:34:43Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, release=1766032510, 
tcib_managed=true, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:34:43Z, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=iscsid, architecture=x86_64, vcs-ref=705339545363fec600102567c4e923938e0f43b3, name=rhosp-rhel9/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, version=17.1.13, distribution-scope=public)
Feb 23 08:37:31 np0005626463.localdomain systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully.
Feb 23 08:37:32 np0005626463.localdomain podman[92402]: 2026-02-23 08:37:32.025056573 +0000 UTC m=+0.188581892 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, name=rhosp-rhel9/openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, tcib_managed=true, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:30Z, release=1766032510, io.buildah.version=1.41.5, io.k8s.description=Red 
Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, container_name=ceilometer_agent_ipmi, distribution-scope=public, io.openshift.expose-services=, architecture=x86_64, config_id=tripleo_step4, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13, com.redhat.component=openstack-ceilometer-ipmi-container, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, url=https://www.redhat.com, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 23 08:37:32 np0005626463.localdomain podman[92400]: 2026-02-23 08:37:32.04855409 +0000 UTC m=+0.215792994 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, architecture=x86_64, name=rhosp-rhel9/openstack-iscsid, managed_by=tripleo_ansible, tcib_managed=true, version=17.1.13, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:34:43Z, vendor=Red Hat, Inc., com.redhat.component=openstack-iscsid-container, vcs-ref=705339545363fec600102567c4e923938e0f43b3, build-date=2026-01-12T22:34:43Z, io.buildah.version=1.41.5, container_name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, release=1766032510, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 23 08:37:32 np0005626463.localdomain podman[92403]: 2026-02-23 08:37:32.084823534 +0000 UTC m=+0.245721102 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.openshift.expose-services=, url=https://www.redhat.com, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, tcib_managed=true, com.redhat.component=openstack-cron-container, description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., batch=17.1_20260112.1, version=17.1.13, 
org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, distribution-scope=public, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, name=rhosp-rhel9/openstack-cron, build-date=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team)
Feb 23 08:37:32 np0005626463.localdomain podman[92402]: 2026-02-23 08:37:32.112677707 +0000 UTC m=+0.276203006 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vendor=Red Hat, Inc., io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, release=1766032510, batch=17.1_20260112.1, com.redhat.component=openstack-ceilometer-ipmi-container, version=17.1.13, io.openshift.expose-services=, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T23:07:30Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:30Z, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64, distribution-scope=public, name=rhosp-rhel9/openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Feb 23 08:37:32 np0005626463.localdomain podman[92403]: 2026-02-23 08:37:32.121308234 +0000 UTC m=+0.282205782 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, tcib_managed=true, config_id=tripleo_step4, batch=17.1_20260112.1, version=17.1.13, container_name=logrotate_crond, vendor=Red Hat, Inc., io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, distribution-scope=public, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, build-date=2026-01-12T22:10:15Z, vcs-type=git, release=1766032510, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, name=rhosp-rhel9/openstack-cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.component=openstack-cron-container)
Feb 23 08:37:32 np0005626463.localdomain systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully.
Feb 23 08:37:32 np0005626463.localdomain systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully.
Feb 23 08:37:32 np0005626463.localdomain podman[92404]: 2026-02-23 08:37:32.134526454 +0000 UTC m=+0.292230932 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T23:32:04Z, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team, build-date=2026-01-12T23:32:04Z, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, vendor=Red Hat, Inc., config_id=tripleo_step5, container_name=nova_compute, name=rhosp-rhel9/openstack-nova-compute, url=https://www.redhat.com, release=1766032510, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=)
Feb 23 08:37:32 np0005626463.localdomain systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully.
Feb 23 08:37:32 np0005626463.localdomain podman[92404]: 2026-02-23 08:37:32.169404003 +0000 UTC m=+0.327108491 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vcs-type=git, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, build-date=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step5, batch=17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, container_name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 
'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, release=1766032510, managed_by=tripleo_ansible, architecture=x86_64, vendor=Red Hat, Inc., tcib_managed=true)
Feb 23 08:37:32 np0005626463.localdomain systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Deactivated successfully.
Feb 23 08:37:32 np0005626463.localdomain systemd[1]: tmp-crun.DOzmyK.mount: Deactivated successfully.
Feb 23 08:37:34 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.
Feb 23 08:37:34 np0005626463.localdomain systemd[1]: tmp-crun.D3LYGv.mount: Deactivated successfully.
Feb 23 08:37:34 np0005626463.localdomain podman[92516]: 2026-02-23 08:37:34.920602415 +0000 UTC m=+0.093777115 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20260112.1, io.buildah.version=1.41.5, url=https://www.redhat.com, release=1766032510, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., container_name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, managed_by=tripleo_ansible, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, description=Red Hat OpenStack Platform 
17.1 nova-compute, build-date=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp-rhel9/openstack-nova-compute, distribution-scope=public, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, io.openshift.expose-services=, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 23 08:37:35 np0005626463.localdomain podman[92516]: 2026-02-23 08:37:35.333403611 +0000 UTC m=+0.506578381 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, container_name=nova_migration_target, managed_by=tripleo_ansible, release=1766032510, io.openshift.expose-services=, build-date=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp-rhel9/openstack-nova-compute, architecture=x86_64, batch=17.1_20260112.1, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.13, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git)
Feb 23 08:37:35 np0005626463.localdomain systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully.
Feb 23 08:37:35 np0005626463.localdomain sshd[92538]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:37:36 np0005626463.localdomain sshd[92538]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:37:36 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.
Feb 23 08:37:36 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.
Feb 23 08:37:36 np0005626463.localdomain systemd[1]: tmp-crun.St56Ni.mount: Deactivated successfully.
Feb 23 08:37:36 np0005626463.localdomain podman[92540]: 2026-02-23 08:37:36.422401451 +0000 UTC m=+0.085236762 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:36:40Z, build-date=2026-01-12T22:36:40Z, name=rhosp-rhel9/openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, maintainer=OpenStack TripleO Team, architecture=x86_64, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.openshift.expose-services=, url=https://www.redhat.com, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ovn-controller-container, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.5, distribution-scope=public, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, vendor=Red Hat, Inc., container_name=ovn_controller, release=1766032510, summary=Red Hat OpenStack Platform 17.1 ovn-controller, 
managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4)
Feb 23 08:37:36 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.
Feb 23 08:37:36 np0005626463.localdomain podman[92541]: 2026-02-23 08:37:36.443738841 +0000 UTC m=+0.100132242 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, config_id=tripleo_step4, container_name=ovn_metadata_agent, url=https://www.redhat.com, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, architecture=x86_64, vcs-type=git, release=1766032510, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, build-date=2026-01-12T22:56:19Z, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:56:19Z)
Feb 23 08:37:36 np0005626463.localdomain podman[92540]: 2026-02-23 08:37:36.453199545 +0000 UTC m=+0.116034806 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, version=17.1.13, container_name=ovn_controller, distribution-scope=public, release=1766032510, com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.41.5, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:36:40Z, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., config_id=tripleo_step4, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:36:40Z, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-type=git, tcib_managed=true, architecture=x86_64, 
io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, description=Red Hat OpenStack Platform 17.1 ovn-controller)
Feb 23 08:37:36 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Deactivated successfully.
Feb 23 08:37:36 np0005626463.localdomain podman[92541]: 2026-02-23 08:37:36.495223976 +0000 UTC m=+0.151617367 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, vcs-type=git, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, container_name=ovn_metadata_agent, url=https://www.redhat.com, tcib_managed=true, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.buildah.version=1.41.5, version=17.1.13, release=1766032510, batch=17.1_20260112.1, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, config_id=tripleo_step4, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Feb 23 08:37:36 np0005626463.localdomain systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Deactivated successfully.
Feb 23 08:37:36 np0005626463.localdomain podman[92573]: 2026-02-23 08:37:36.519420585 +0000 UTC m=+0.074602911 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public, io.openshift.expose-services=, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, org.opencontainers.image.created=2026-01-12T22:10:14Z, config_id=tripleo_step1, build-date=2026-01-12T22:10:14Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, 
konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, io.buildah.version=1.41.5, vendor=Red Hat, Inc.)
Feb 23 08:37:36 np0005626463.localdomain podman[92573]: 2026-02-23 08:37:36.741628718 +0000 UTC m=+0.296811024 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, managed_by=tripleo_ansible, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:10:14Z, vcs-type=git, maintainer=OpenStack TripleO Team, release=1766032510, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, url=https://www.redhat.com, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step1, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp-rhel9/openstack-qdrouterd)
Feb 23 08:37:36 np0005626463.localdomain systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully.
Feb 23 08:37:47 np0005626463.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Feb 23 08:37:47 np0005626463.localdomain recover_tripleo_nova_virtqemud[92618]: 61982
Feb 23 08:37:47 np0005626463.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Feb 23 08:37:47 np0005626463.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Feb 23 08:37:57 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.
Feb 23 08:37:57 np0005626463.localdomain podman[92619]: 2026-02-23 08:37:57.907682338 +0000 UTC m=+0.084040364 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, build-date=2026-01-12T22:10:15Z, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-collectd, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-collectd-container, version=17.1.13, container_name=collectd, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, distribution-scope=public, config_id=tripleo_step3, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:10:15Z)
Feb 23 08:37:57 np0005626463.localdomain podman[92619]: 2026-02-23 08:37:57.948371768 +0000 UTC m=+0.124729764 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.component=openstack-collectd-container, io.buildah.version=1.41.5, distribution-scope=public, vcs-type=git, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:15Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., release=1766032510, summary=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, config_id=tripleo_step3, tcib_managed=true, version=17.1.13, container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']})
Feb 23 08:37:57 np0005626463.localdomain systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully.
Feb 23 08:38:01 np0005626463.localdomain sshd[92640]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:38:01 np0005626463.localdomain sshd[92640]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:38:02 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.
Feb 23 08:38:02 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.
Feb 23 08:38:02 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.
Feb 23 08:38:02 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.
Feb 23 08:38:02 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.
Feb 23 08:38:02 np0005626463.localdomain systemd[1]: tmp-crun.Dt9uLk.mount: Deactivated successfully.
Feb 23 08:38:02 np0005626463.localdomain podman[92642]: 2026-02-23 08:38:02.924706708 +0000 UTC m=+0.096015464 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, batch=17.1_20260112.1, container_name=iscsid, vendor=Red Hat, Inc., io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, name=rhosp-rhel9/openstack-iscsid, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, com.redhat.component=openstack-iscsid-container, vcs-ref=705339545363fec600102567c4e923938e0f43b3, config_id=tripleo_step3, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, tcib_managed=true, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:34:43Z, description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', 
'/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:34:43Z)
Feb 23 08:38:02 np0005626463.localdomain podman[92642]: 2026-02-23 08:38:02.96318153 +0000 UTC m=+0.134490286 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, build-date=2026-01-12T22:34:43Z, config_id=tripleo_step3, tcib_managed=true, com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, summary=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, 
org.opencontainers.image.created=2026-01-12T22:34:43Z, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-iscsid, distribution-scope=public, container_name=iscsid, url=https://www.redhat.com, vendor=Red Hat, Inc., vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team)
Feb 23 08:38:02 np0005626463.localdomain systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully.
Feb 23 08:38:02 np0005626463.localdomain podman[92643]: 2026-02-23 08:38:02.987771861 +0000 UTC m=+0.154151975 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, container_name=ceilometer_agent_compute, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.buildah.version=1.41.5, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T23:07:47Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-compute-container, distribution-scope=public, tcib_managed=true, org.opencontainers.image.created=2026-01-12T23:07:47Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-ceilometer-compute, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Feb 23 08:38:03 np0005626463.localdomain podman[92650]: 2026-02-23 08:38:03.031127724 +0000 UTC m=+0.191002256 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step5, vendor=Red Hat, Inc., build-date=2026-01-12T23:32:04Z, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, url=https://www.redhat.com, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, managed_by=tripleo_ansible, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, distribution-scope=public, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe)
Feb 23 08:38:03 np0005626463.localdomain podman[92643]: 2026-02-23 08:38:03.046381916 +0000 UTC m=+0.212762040 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.expose-services=, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, release=1766032510, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, container_name=ceilometer_agent_compute, batch=17.1_20260112.1, io.buildah.version=1.41.5, build-date=2026-01-12T23:07:47Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, url=https://www.redhat.com, distribution-scope=public, name=rhosp-rhel9/openstack-ceilometer-compute, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc.)
Feb 23 08:38:03 np0005626463.localdomain systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully.
Feb 23 08:38:03 np0005626463.localdomain podman[92650]: 2026-02-23 08:38:03.070462893 +0000 UTC m=+0.230337415 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, org.opencontainers.image.created=2026-01-12T23:32:04Z, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, batch=17.1_20260112.1, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, container_name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, release=1766032510, name=rhosp-rhel9/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 
'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5)
Feb 23 08:38:03 np0005626463.localdomain systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Deactivated successfully.
Feb 23 08:38:03 np0005626463.localdomain podman[92644]: 2026-02-23 08:38:03.136666854 +0000 UTC m=+0.299671143 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.5, build-date=2026-01-12T23:07:30Z, vcs-type=git, architecture=x86_64, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, container_name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp-rhel9/openstack-ceilometer-ipmi, 
com.redhat.component=openstack-ceilometer-ipmi-container, batch=17.1_20260112.1, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, tcib_managed=true, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:07:30Z)
Feb 23 08:38:03 np0005626463.localdomain podman[92644]: 2026-02-23 08:38:03.195389842 +0000 UTC m=+0.358394121 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., config_id=tripleo_step4, release=1766032510, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, managed_by=tripleo_ansible, container_name=ceilometer_agent_ipmi, io.openshift.expose-services=, build-date=2026-01-12T23:07:30Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack 
osp-17.1 openstack-ceilometer-ipmi, tcib_managed=true, url=https://www.redhat.com, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T23:07:30Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, distribution-scope=public, com.redhat.component=openstack-ceilometer-ipmi-container, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06)
Feb 23 08:38:03 np0005626463.localdomain systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully.
Feb 23 08:38:03 np0005626463.localdomain podman[92645]: 2026-02-23 08:38:03.195934989 +0000 UTC m=+0.354959365 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, build-date=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, url=https://www.redhat.com, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:10:15Z, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.component=openstack-cron-container, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, io.openshift.expose-services=, name=rhosp-rhel9/openstack-cron, release=1766032510)
Feb 23 08:38:03 np0005626463.localdomain podman[92645]: 2026-02-23 08:38:03.283121979 +0000 UTC m=+0.442146345 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, url=https://www.redhat.com, container_name=logrotate_crond, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, 
konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-cron-container, description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:10:15Z, architecture=x86_64, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z, name=rhosp-rhel9/openstack-cron, release=1766032510, batch=17.1_20260112.1, io.openshift.expose-services=)
Feb 23 08:38:03 np0005626463.localdomain systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully.
Feb 23 08:38:03 np0005626463.localdomain systemd[1]: tmp-crun.x44AjN.mount: Deactivated successfully.
Feb 23 08:38:05 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.
Feb 23 08:38:05 np0005626463.localdomain podman[92758]: 2026-02-23 08:38:05.912066774 +0000 UTC m=+0.083527367 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, org.opencontainers.image.created=2026-01-12T23:32:04Z, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, vcs-type=git, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-nova-compute, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z, container_name=nova_migration_target, com.redhat.component=openstack-nova-compute-container, release=1766032510, vendor=Red Hat, Inc., managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 23 08:38:06 np0005626463.localdomain podman[92758]: 2026-02-23 08:38:06.259350251 +0000 UTC m=+0.430810854 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, org.opencontainers.image.created=2026-01-12T23:32:04Z, build-date=2026-01-12T23:32:04Z, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, url=https://www.redhat.com, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, release=1766032510, io.openshift.expose-services=, name=rhosp-rhel9/openstack-nova-compute, maintainer=OpenStack TripleO Team, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, container_name=nova_migration_target)
Feb 23 08:38:06 np0005626463.localdomain systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully.
Feb 23 08:38:06 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.
Feb 23 08:38:06 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.
Feb 23 08:38:06 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.
Feb 23 08:38:06 np0005626463.localdomain systemd[1]: tmp-crun.nsjypA.mount: Deactivated successfully.
Feb 23 08:38:06 np0005626463.localdomain podman[92783]: 2026-02-23 08:38:06.958985971 +0000 UTC m=+0.132007190 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., io.openshift.expose-services=, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T22:56:19Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, container_name=ovn_metadata_agent, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, tcib_managed=true, build-date=2026-01-12T22:56:19Z, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1766032510, batch=17.1_20260112.1, vcs-type=git)
Feb 23 08:38:07 np0005626463.localdomain podman[92784]: 2026-02-23 08:38:07.004358176 +0000 UTC m=+0.172522964 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp-rhel9/openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, version=17.1.13, config_id=tripleo_step1, container_name=metrics_qdr, build-date=2026-01-12T22:10:14Z, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, tcib_managed=true, url=https://www.redhat.com, 
org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, architecture=x86_64, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, description=Red Hat OpenStack Platform 17.1 qdrouterd)
Feb 23 08:38:07 np0005626463.localdomain podman[92782]: 2026-02-23 08:38:07.046827311 +0000 UTC m=+0.223289246 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.5, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vendor=Red Hat, Inc., build-date=2026-01-12T22:36:40Z, com.redhat.component=openstack-ovn-controller-container, release=1766032510, distribution-scope=public, url=https://www.redhat.com, io.openshift.expose-services=, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', 
'/var/log/containers/openvswitch:/var/log/ovn:z']}, tcib_managed=true, maintainer=OpenStack TripleO Team, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:36:40Z, container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller)
Feb 23 08:38:07 np0005626463.localdomain podman[92783]: 2026-02-23 08:38:07.070037511 +0000 UTC m=+0.243058720 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, config_id=tripleo_step4, distribution-scope=public, io.openshift.expose-services=, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:56:19Z, 
container_name=ovn_metadata_agent, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, vcs-type=git, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, version=17.1.13, release=1766032510, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, managed_by=tripleo_ansible, vendor=Red Hat, Inc., org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true)
Feb 23 08:38:07 np0005626463.localdomain systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Deactivated successfully.
Feb 23 08:38:07 np0005626463.localdomain podman[92782]: 2026-02-23 08:38:07.103399814 +0000 UTC m=+0.279861729 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.created=2026-01-12T22:36:40Z, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, release=1766032510, name=rhosp-rhel9/openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, tcib_managed=true, batch=17.1_20260112.1, container_name=ovn_controller, io.buildah.version=1.41.5, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, version=17.1.13, managed_by=tripleo_ansible, build-date=2026-01-12T22:36:40Z, 
summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git)
Feb 23 08:38:07 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Deactivated successfully.
Feb 23 08:38:07 np0005626463.localdomain podman[92784]: 2026-02-23 08:38:07.241260104 +0000 UTC m=+0.409424862 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-qdrouterd, url=https://www.redhat.com, build-date=2026-01-12T22:10:14Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:14Z, container_name=metrics_qdr, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, 
com.redhat.component=openstack-qdrouterd-container, batch=17.1_20260112.1, architecture=x86_64, config_id=tripleo_step1, maintainer=OpenStack TripleO Team, vcs-type=git, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, release=1766032510, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee)
Feb 23 08:38:07 np0005626463.localdomain systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully.
Feb 23 08:38:21 np0005626463.localdomain sshd[92858]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:38:21 np0005626463.localdomain sshd[92858]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:38:27 np0005626463.localdomain sudo[92860]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 08:38:27 np0005626463.localdomain sudo[92860]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:38:27 np0005626463.localdomain sudo[92860]: pam_unix(sudo:session): session closed for user root
Feb 23 08:38:27 np0005626463.localdomain sudo[92875]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 23 08:38:27 np0005626463.localdomain sudo[92875]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:38:28 np0005626463.localdomain sudo[92875]: pam_unix(sudo:session): session closed for user root
Feb 23 08:38:28 np0005626463.localdomain sudo[92923]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 08:38:28 np0005626463.localdomain sudo[92923]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:38:28 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.
Feb 23 08:38:28 np0005626463.localdomain sudo[92923]: pam_unix(sudo:session): session closed for user root
Feb 23 08:38:28 np0005626463.localdomain sudo[92939]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 list-networks
Feb 23 08:38:28 np0005626463.localdomain sudo[92939]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:38:28 np0005626463.localdomain podman[92938]: 2026-02-23 08:38:28.920648255 +0000 UTC m=+0.090247837 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, container_name=collectd, vendor=Red Hat, Inc., architecture=x86_64, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, build-date=2026-01-12T22:10:15Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, name=rhosp-rhel9/openstack-collectd, io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-collectd-container, batch=17.1_20260112.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, config_id=tripleo_step3, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 23 08:38:28 np0005626463.localdomain podman[92938]: 2026-02-23 08:38:28.937104314 +0000 UTC m=+0.106703876 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, container_name=collectd, managed_by=tripleo_ansible, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, version=17.1.13, name=rhosp-rhel9/openstack-collectd, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-collectd-container, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, build-date=2026-01-12T22:10:15Z, architecture=x86_64)
Feb 23 08:38:28 np0005626463.localdomain systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully.
Feb 23 08:38:29 np0005626463.localdomain sudo[92939]: pam_unix(sudo:session): session closed for user root
Feb 23 08:38:32 np0005626463.localdomain sudo[92991]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 08:38:32 np0005626463.localdomain sudo[92991]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:38:32 np0005626463.localdomain sudo[92991]: pam_unix(sudo:session): session closed for user root
Feb 23 08:38:33 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.
Feb 23 08:38:33 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.
Feb 23 08:38:33 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.
Feb 23 08:38:33 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.
Feb 23 08:38:33 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.
Feb 23 08:38:33 np0005626463.localdomain systemd[1]: tmp-crun.i1mqyw.mount: Deactivated successfully.
Feb 23 08:38:33 np0005626463.localdomain podman[93006]: 2026-02-23 08:38:33.925029725 +0000 UTC m=+0.099333327 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, com.redhat.component=openstack-iscsid-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-iscsid, managed_by=tripleo_ansible, io.openshift.expose-services=, vcs-type=git, tcib_managed=true, vcs-ref=705339545363fec600102567c4e923938e0f43b3, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13, release=1766032510, org.opencontainers.image.created=2026-01-12T22:34:43Z, container_name=iscsid, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vendor=Red Hat, Inc., batch=17.1_20260112.1, build-date=2026-01-12T22:34:43Z, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public)
Feb 23 08:38:33 np0005626463.localdomain podman[93006]: 2026-02-23 08:38:33.969535523 +0000 UTC m=+0.143839085 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.openshift.expose-services=, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vendor=Red Hat, Inc., distribution-scope=public, 
vcs-ref=705339545363fec600102567c4e923938e0f43b3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-iscsid-container, build-date=2026-01-12T22:34:43Z, container_name=iscsid, vcs-type=git, url=https://www.redhat.com, name=rhosp-rhel9/openstack-iscsid, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, tcib_managed=true, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:34:43Z)
Feb 23 08:38:33 np0005626463.localdomain systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully.
Feb 23 08:38:34 np0005626463.localdomain podman[93009]: 2026-02-23 08:38:33.968926415 +0000 UTC m=+0.135091316 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, architecture=x86_64, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, tcib_managed=true, version=17.1.13, container_name=logrotate_crond, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, name=rhosp-rhel9/openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-type=git, com.redhat.component=openstack-cron-container, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 cron, build-date=2026-01-12T22:10:15Z, vendor=Red Hat, Inc.)
Feb 23 08:38:34 np0005626463.localdomain podman[93008]: 2026-02-23 08:38:34.016905671 +0000 UTC m=+0.184948030 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T23:07:30Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, io.buildah.version=1.41.5, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, release=1766032510, io.openshift.expose-services=, architecture=x86_64, container_name=ceilometer_agent_ipmi, 
config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-type=git, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Feb 23 08:38:34 np0005626463.localdomain podman[93007]: 2026-02-23 08:38:34.085097952 +0000 UTC m=+0.254494473 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, release=1766032510, vcs-type=git, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vendor=Red Hat, Inc., org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, build-date=2026-01-12T23:07:47Z, container_name=ceilometer_agent_compute, architecture=x86_64, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, config_id=tripleo_step4, distribution-scope=public, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-ceilometer-compute, io.buildah.version=1.41.5)
Feb 23 08:38:34 np0005626463.localdomain podman[93014]: 2026-02-23 08:38:34.130831599 +0000 UTC m=+0.293111189 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T23:32:04Z, distribution-scope=public, version=17.1.13, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step5, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, architecture=x86_64, vcs-type=git, build-date=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_compute, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 
'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']})
Feb 23 08:38:34 np0005626463.localdomain podman[93007]: 2026-02-23 08:38:34.1405597 +0000 UTC m=+0.309956171 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, url=https://www.redhat.com, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.created=2026-01-12T23:07:47Z, container_name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, io.openshift.expose-services=, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.13, release=1766032510, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2026-01-12T23:07:47Z, architecture=x86_64, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, com.redhat.component=openstack-ceilometer-compute-container, tcib_managed=true, vendor=Red Hat, Inc.)
Feb 23 08:38:34 np0005626463.localdomain systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully.
Feb 23 08:38:34 np0005626463.localdomain podman[93008]: 2026-02-23 08:38:34.155328028 +0000 UTC m=+0.323370367 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, batch=17.1_20260112.1, release=1766032510, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, name=rhosp-rhel9/openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, container_name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, url=https://www.redhat.com, io.buildah.version=1.41.5, build-date=2026-01-12T23:07:30Z, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.created=2026-01-12T23:07:30Z, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, version=17.1.13, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.openshift.expose-services=, distribution-scope=public, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Feb 23 08:38:34 np0005626463.localdomain systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully.
Feb 23 08:38:34 np0005626463.localdomain podman[93014]: 2026-02-23 08:38:34.18900244 +0000 UTC m=+0.351281940 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, url=https://www.redhat.com, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T23:32:04Z, org.opencontainers.image.created=2026-01-12T23:32:04Z, name=rhosp-rhel9/openstack-nova-compute, maintainer=OpenStack TripleO Team, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, architecture=x86_64, config_id=tripleo_step5, container_name=nova_compute, vendor=Red Hat, Inc., io.buildah.version=1.41.5, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, managed_by=tripleo_ansible)
Feb 23 08:38:34 np0005626463.localdomain podman[93009]: 2026-02-23 08:38:34.210649261 +0000 UTC m=+0.376814192 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., container_name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, version=17.1.13, architecture=x86_64, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 cron, build-date=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, 
io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-cron-container, org.opencontainers.image.created=2026-01-12T22:10:15Z, description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, name=rhosp-rhel9/openstack-cron, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, managed_by=tripleo_ansible)
Feb 23 08:38:34 np0005626463.localdomain systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Deactivated successfully.
Feb 23 08:38:34 np0005626463.localdomain systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully.
Feb 23 08:38:36 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.
Feb 23 08:38:36 np0005626463.localdomain podman[93119]: 2026-02-23 08:38:36.897581613 +0000 UTC m=+0.072035203 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:32:04Z, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, vendor=Red Hat, Inc., vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, batch=17.1_20260112.1, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, name=rhosp-rhel9/openstack-nova-compute, managed_by=tripleo_ansible)
Feb 23 08:38:37 np0005626463.localdomain podman[93119]: 2026-02-23 08:38:37.261570246 +0000 UTC m=+0.436023826 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:32:04Z, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:32:04Z, vendor=Red Hat, Inc., release=1766032510, batch=17.1_20260112.1, architecture=x86_64, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp-rhel9/openstack-nova-compute, tcib_managed=true, url=https://www.redhat.com, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step4, io.openshift.expose-services=, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 23 08:38:37 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.
Feb 23 08:38:37 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.
Feb 23 08:38:37 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.
Feb 23 08:38:37 np0005626463.localdomain systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully.
Feb 23 08:38:37 np0005626463.localdomain systemd[1]: tmp-crun.LrRK2s.mount: Deactivated successfully.
Feb 23 08:38:37 np0005626463.localdomain podman[93140]: 2026-02-23 08:38:37.372369979 +0000 UTC m=+0.082590190 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, container_name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, architecture=x86_64, vcs-type=git, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2026-01-12T22:36:40Z, name=rhosp-rhel9/openstack-ovn-controller, maintainer=OpenStack TripleO Team, version=17.1.13, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_id=tripleo_step4, managed_by=tripleo_ansible, com.redhat.component=openstack-ovn-controller-container, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, org.opencontainers.image.created=2026-01-12T22:36:40Z, url=https://www.redhat.com, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, release=1766032510)
Feb 23 08:38:37 np0005626463.localdomain podman[93142]: 2026-02-23 08:38:37.494989016 +0000 UTC m=+0.206399363 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vcs-type=git, config_id=tripleo_step1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, org.opencontainers.image.created=2026-01-12T22:10:14Z, container_name=metrics_qdr, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-qdrouterd, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, tcib_managed=true, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:10:14Z, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., architecture=x86_64, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team)
Feb 23 08:38:37 np0005626463.localdomain podman[93140]: 2026-02-23 08:38:37.50060525 +0000 UTC m=+0.210825521 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.buildah.version=1.41.5, url=https://www.redhat.com, name=rhosp-rhel9/openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vcs-type=git, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, container_name=ovn_controller, build-date=2026-01-12T22:36:40Z, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, architecture=x86_64, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ovn-controller, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, tcib_managed=true, release=1766032510, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, config_id=tripleo_step4)
Feb 23 08:38:37 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Deactivated successfully.
Feb 23 08:38:37 np0005626463.localdomain podman[93141]: 2026-02-23 08:38:37.464630315 +0000 UTC m=+0.175993941 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, io.openshift.expose-services=, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:56:19Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.buildah.version=1.41.5, summary=Red 
Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, config_id=tripleo_step4, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., release=1766032510, version=17.1.13, managed_by=tripleo_ansible, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f)
Feb 23 08:38:37 np0005626463.localdomain podman[93141]: 2026-02-23 08:38:37.545972876 +0000 UTC m=+0.257336492 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vendor=Red Hat, Inc., vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, 
org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:56:19Z, batch=17.1_20260112.1, managed_by=tripleo_ansible, build-date=2026-01-12T22:56:19Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=ovn_metadata_agent, io.buildah.version=1.41.5, io.openshift.expose-services=, release=1766032510, version=17.1.13, tcib_managed=true, url=https://www.redhat.com, vcs-type=git, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn)
Feb 23 08:38:37 np0005626463.localdomain systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Deactivated successfully.
Feb 23 08:38:37 np0005626463.localdomain podman[93142]: 2026-02-23 08:38:37.718441197 +0000 UTC m=+0.429851604 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, vcs-type=git, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:10:14Z, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, release=1766032510, version=17.1.13, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, config_id=tripleo_step1, name=rhosp-rhel9/openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:14Z, architecture=x86_64, container_name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd)
Feb 23 08:38:37 np0005626463.localdomain systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully.
Feb 23 08:38:37 np0005626463.localdomain systemd[1]: tmp-crun.PyzsBH.mount: Deactivated successfully.
Feb 23 08:38:42 np0005626463.localdomain sshd[93218]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:38:43 np0005626463.localdomain sshd[93218]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:38:59 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.
Feb 23 08:38:59 np0005626463.localdomain podman[93220]: 2026-02-23 08:38:59.906377551 +0000 UTC m=+0.073228095 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, name=rhosp-rhel9/openstack-collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-collectd-container, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', 
'/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_id=tripleo_step3, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., tcib_managed=true, batch=17.1_20260112.1, build-date=2026-01-12T22:10:15Z, release=1766032510, io.buildah.version=1.41.5, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:10:15Z, managed_by=tripleo_ansible, vcs-type=git)
Feb 23 08:38:59 np0005626463.localdomain podman[93220]: 2026-02-23 08:38:59.920201946 +0000 UTC m=+0.087052430 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, managed_by=tripleo_ansible, config_id=tripleo_step3, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, io.openshift.expose-services=, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:10:15Z, build-date=2026-01-12T22:10:15Z, io.buildah.version=1.41.5)
Feb 23 08:38:59 np0005626463.localdomain systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully.
Feb 23 08:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb 23 08:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 3600.1 total, 600.0 interval
                                                          Cumulative writes: 5152 writes, 23K keys, 5152 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s
                                                          Cumulative WAL: 5152 writes, 679 syncs, 7.59 writes per sync, written: 0.02 GB, 0.01 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                          Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Feb 23 08:39:04 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.
Feb 23 08:39:04 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.
Feb 23 08:39:04 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.
Feb 23 08:39:04 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.
Feb 23 08:39:04 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.
Feb 23 08:39:04 np0005626463.localdomain systemd[1]: tmp-crun.fe7jZL.mount: Deactivated successfully.
Feb 23 08:39:04 np0005626463.localdomain podman[93247]: 2026-02-23 08:39:04.951405104 +0000 UTC m=+0.113463100 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-cron-container, vcs-type=git, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, distribution-scope=public, container_name=logrotate_crond, tcib_managed=true, config_id=tripleo_step4, build-date=2026-01-12T22:10:15Z, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-cron, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, release=1766032510, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron)
Feb 23 08:39:04 np0005626463.localdomain podman[93241]: 2026-02-23 08:39:04.96902828 +0000 UTC m=+0.138666435 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-ceilometer-compute-container, config_id=tripleo_step4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, release=1766032510, url=https://www.redhat.com, 
org.opencontainers.image.created=2026-01-12T23:07:47Z, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_compute, build-date=2026-01-12T23:07:47Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1, vcs-type=git, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, distribution-scope=public, vendor=Red Hat, Inc., architecture=x86_64, name=rhosp-rhel9/openstack-ceilometer-compute)
Feb 23 08:39:04 np0005626463.localdomain podman[93240]: 2026-02-23 08:39:04.981913494 +0000 UTC m=+0.156655970 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vcs-ref=705339545363fec600102567c4e923938e0f43b3, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, build-date=2026-01-12T22:34:43Z, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, version=17.1.13, container_name=iscsid, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, release=1766032510, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-iscsid, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, architecture=x86_64, com.redhat.component=openstack-iscsid-container, org.opencontainers.image.created=2026-01-12T22:34:43Z)
Feb 23 08:39:04 np0005626463.localdomain podman[93240]: 2026-02-23 08:39:04.98906316 +0000 UTC m=+0.163805666 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, container_name=iscsid, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=705339545363fec600102567c4e923938e0f43b3, summary=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, architecture=x86_64, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.openshift.expose-services=, 
build-date=2026-01-12T22:34:43Z, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step3, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, url=https://www.redhat.com, vcs-type=git, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, com.redhat.component=openstack-iscsid-container, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13, managed_by=tripleo_ansible, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5)
Feb 23 08:39:04 np0005626463.localdomain podman[93241]: 2026-02-23 08:39:04.999201128 +0000 UTC m=+0.168839303 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64, build-date=2026-01-12T23:07:47Z, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., io.buildah.version=1.41.5, release=1766032510, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:07:47Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-compute-container, name=rhosp-rhel9/openstack-ceilometer-compute, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_id=tripleo_step4)
Feb 23 08:39:05 np0005626463.localdomain systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully.
Feb 23 08:39:05 np0005626463.localdomain systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully.
Feb 23 08:39:05 np0005626463.localdomain podman[93247]: 2026-02-23 08:39:05.087209898 +0000 UTC m=+0.249267844 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, container_name=logrotate_crond, name=rhosp-rhel9/openstack-cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, config_id=tripleo_step4, io.buildah.version=1.41.5, tcib_managed=true, vendor=Red Hat, Inc., version=17.1.13, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, managed_by=tripleo_ansible, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, distribution-scope=public, build-date=2026-01-12T22:10:15Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.component=openstack-cron-container, url=https://www.redhat.com, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 23 08:39:05 np0005626463.localdomain systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully.
Feb 23 08:39:05 np0005626463.localdomain podman[93251]: 2026-02-23 08:39:05.138424589 +0000 UTC m=+0.298209143 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, build-date=2026-01-12T23:32:04Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_compute, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, version=17.1.13, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', 
'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, managed_by=tripleo_ansible, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, vcs-type=git, release=1766032510, name=rhosp-rhel9/openstack-nova-compute, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 23 08:39:05 np0005626463.localdomain podman[93242]: 2026-02-23 08:39:05.179257164 +0000 UTC m=+0.346179883 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, container_name=ceilometer_agent_ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, managed_by=tripleo_ansible, build-date=2026-01-12T23:07:30Z, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-ceilometer-ipmi, batch=17.1_20260112.1, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, io.openshift.expose-services=, version=17.1.13, architecture=x86_64, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_id=tripleo_step4, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.5)
Feb 23 08:39:05 np0005626463.localdomain podman[93251]: 2026-02-23 08:39:05.199299775 +0000 UTC m=+0.359084319 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-nova-compute, io.openshift.expose-services=, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, batch=17.1_20260112.1, tcib_managed=true, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_compute, build-date=2026-01-12T23:32:04Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, version=17.1.13, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step5, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5)
Feb 23 08:39:05 np0005626463.localdomain podman[93242]: 2026-02-23 08:39:05.209595678 +0000 UTC m=+0.376518397 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, architecture=x86_64, build-date=2026-01-12T23:07:30Z, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.created=2026-01-12T23:07:30Z, url=https://www.redhat.com, name=rhosp-rhel9/openstack-ceilometer-ipmi, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-ipmi-container, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.5, vendor=Red Hat, Inc., batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.13, container_name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, vcs-type=git, distribution-scope=public, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 23 08:39:05 np0005626463.localdomain systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Deactivated successfully.
Feb 23 08:39:05 np0005626463.localdomain systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully.
Feb 23 08:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb 23 08:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 3600.1 total, 600.0 interval
                                                          Cumulative writes: 5421 writes, 24K keys, 5421 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s
                                                          Cumulative WAL: 5421 writes, 705 syncs, 7.69 writes per sync, written: 0.02 GB, 0.01 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                          Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Feb 23 08:39:07 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.
Feb 23 08:39:07 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.
Feb 23 08:39:07 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.
Feb 23 08:39:07 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.
Feb 23 08:39:07 np0005626463.localdomain systemd[1]: tmp-crun.UtyNHQ.mount: Deactivated successfully.
Feb 23 08:39:07 np0005626463.localdomain podman[93357]: 2026-02-23 08:39:07.932539221 +0000 UTC m=+0.106218932 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, vendor=Red Hat, Inc., io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:32:04Z, vcs-type=git, name=rhosp-rhel9/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, maintainer=OpenStack TripleO Team, architecture=x86_64, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.13, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, container_name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com)
Feb 23 08:39:08 np0005626463.localdomain podman[93360]: 2026-02-23 08:39:08.030029758 +0000 UTC m=+0.195709608 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, architecture=x86_64, vcs-type=git, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, build-date=2026-01-12T22:10:14Z, container_name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, 
com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, io.openshift.expose-services=, tcib_managed=true, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-qdrouterd, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team)
Feb 23 08:39:08 np0005626463.localdomain podman[93359]: 2026-02-23 08:39:07.9913142 +0000 UTC m=+0.158254810 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, release=1766032510, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, io.buildah.version=1.41.5, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:56:19Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, url=https://www.redhat.com, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-type=git, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Feb 23 08:39:08 np0005626463.localdomain podman[93359]: 2026-02-23 08:39:08.080361222 +0000 UTC m=+0.247301802 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:56:19Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.41.5, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:56:19Z, vcs-type=git, container_name=ovn_metadata_agent, url=https://www.redhat.com, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, tcib_managed=true, io.openshift.expose-services=, release=1766032510, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, managed_by=tripleo_ansible, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, version=17.1.13)
Feb 23 08:39:08 np0005626463.localdomain systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Deactivated successfully.
Feb 23 08:39:08 np0005626463.localdomain podman[93358]: 2026-02-23 08:39:08.132924066 +0000 UTC m=+0.303246332 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, tcib_managed=true, distribution-scope=public, vendor=Red Hat, Inc., container_name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:36:40Z, batch=17.1_20260112.1, version=17.1.13, name=rhosp-rhel9/openstack-ovn-controller, config_id=tripleo_step4, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, summary=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.openshift.expose-services=, build-date=2026-01-12T22:36:40Z, konflux.additional-tags=17.1.13 17.1_20260112.1, 
com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 23 08:39:08 np0005626463.localdomain podman[93358]: 2026-02-23 08:39:08.159180752 +0000 UTC m=+0.329503008 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, container_name=ovn_controller, managed_by=tripleo_ansible, vcs-type=git, vendor=Red Hat, Inc., architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_id=tripleo_step4, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, org.opencontainers.image.created=2026-01-12T22:36:40Z, batch=17.1_20260112.1, io.buildah.version=1.41.5, 
com.redhat.component=openstack-ovn-controller-container, name=rhosp-rhel9/openstack-ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.openshift.expose-services=, build-date=2026-01-12T22:36:40Z, distribution-scope=public)
Feb 23 08:39:08 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Deactivated successfully.
Feb 23 08:39:08 np0005626463.localdomain podman[93360]: 2026-02-23 08:39:08.230229348 +0000 UTC m=+0.395909208 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_id=tripleo_step1, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, com.redhat.component=openstack-qdrouterd-container, url=https://www.redhat.com, build-date=2026-01-12T22:10:14Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:10:14Z, 
vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-qdrouterd, io.openshift.expose-services=, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd)
Feb 23 08:39:08 np0005626463.localdomain systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully.
Feb 23 08:39:08 np0005626463.localdomain podman[93357]: 2026-02-23 08:39:08.343371827 +0000 UTC m=+0.517051528 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack 
TripleO Team, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, container_name=nova_migration_target, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.openshift.expose-services=, release=1766032510, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public)
Feb 23 08:39:08 np0005626463.localdomain systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully.
Feb 23 08:39:08 np0005626463.localdomain systemd[1]: tmp-crun.dxyc8V.mount: Deactivated successfully.
Feb 23 08:39:09 np0005626463.localdomain sshd[93452]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:39:13 np0005626463.localdomain sshd[93452]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:39:23 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:19:01:95 MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.104 DST=192.168.122.106 LEN=40 TOS=0x00 PREC=0x00 TTL=64 ID=0 DF PROTO=TCP SPT=5668 DPT=45778 SEQ=0 ACK=2193559953 WINDOW=0 RES=0x00 ACK RST URGP=0 
Feb 23 08:39:25 np0005626463.localdomain sshd[93455]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:39:25 np0005626463.localdomain sshd[93457]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:39:25 np0005626463.localdomain sshd[93457]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:39:26 np0005626463.localdomain sshd[93455]: Connection closed by authenticating user root 116.255.155.36 port 50532 [preauth]
Feb 23 08:39:26 np0005626463.localdomain sshd[93459]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:39:27 np0005626463.localdomain sshd[93459]: Connection closed by authenticating user root 116.255.155.36 port 51828 [preauth]
Feb 23 08:39:27 np0005626463.localdomain sshd[93461]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:39:29 np0005626463.localdomain sshd[93461]: Connection closed by authenticating user root 116.255.155.36 port 52896 [preauth]
Feb 23 08:39:29 np0005626463.localdomain sshd[93463]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:39:30 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.
Feb 23 08:39:30 np0005626463.localdomain sshd[93463]: Connection closed by authenticating user root 116.255.155.36 port 53968 [preauth]
Feb 23 08:39:30 np0005626463.localdomain systemd[1]: tmp-crun.hw13LJ.mount: Deactivated successfully.
Feb 23 08:39:30 np0005626463.localdomain podman[93465]: 2026-02-23 08:39:30.932077708 +0000 UTC m=+0.102041992 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, build-date=2026-01-12T22:10:15Z, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, version=17.1.13, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, name=rhosp-rhel9/openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, maintainer=OpenStack TripleO Team, vcs-type=git, release=1766032510, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, com.redhat.component=openstack-collectd-container, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:15Z)
Feb 23 08:39:30 np0005626463.localdomain podman[93465]: 2026-02-23 08:39:30.947308307 +0000 UTC m=+0.117272671 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, build-date=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, tcib_managed=true, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.component=openstack-collectd-container, url=https://www.redhat.com, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-collectd, release=1766032510, batch=17.1_20260112.1, container_name=collectd, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step3, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.openshift.expose-services=, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd)
Feb 23 08:39:30 np0005626463.localdomain systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully.
Feb 23 08:39:31 np0005626463.localdomain sshd[93486]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:39:32 np0005626463.localdomain sshd[93486]: Connection closed by authenticating user root 116.255.155.36 port 55466 [preauth]
Feb 23 08:39:32 np0005626463.localdomain sshd[93488]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:39:33 np0005626463.localdomain sudo[93490]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 08:39:33 np0005626463.localdomain sudo[93490]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:39:33 np0005626463.localdomain sudo[93490]: pam_unix(sudo:session): session closed for user root
Feb 23 08:39:33 np0005626463.localdomain sudo[93505]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 23 08:39:33 np0005626463.localdomain sudo[93505]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:39:33 np0005626463.localdomain sudo[93505]: pam_unix(sudo:session): session closed for user root
Feb 23 08:39:33 np0005626463.localdomain sshd[93488]: Connection closed by authenticating user root 116.255.155.36 port 56644 [preauth]
Feb 23 08:39:34 np0005626463.localdomain sudo[93553]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 08:39:34 np0005626463.localdomain sudo[93553]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:39:34 np0005626463.localdomain sudo[93553]: pam_unix(sudo:session): session closed for user root
Feb 23 08:39:34 np0005626463.localdomain sshd[93568]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:39:34 np0005626463.localdomain sudo[93569]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ceph-volume --fsid f1fea371-cb69-578d-a3d0-b5c472a84b46 -- inventory --format=json-pretty --filter-for-batch
Feb 23 08:39:34 np0005626463.localdomain sudo[93569]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:39:34 np0005626463.localdomain podman[93626]: 
Feb 23 08:39:34 np0005626463.localdomain podman[93626]: 2026-02-23 08:39:34.873923742 +0000 UTC m=+0.080487053 container create 14fae1fcc3ca35c5316b2a7fece2747cb26fe4492594ba0a5075b7988fad06d3 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=quirky_lumiere, RELEASE=main, vcs-type=git, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, name=rhceph, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, GIT_BRANCH=main, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2026-02-09T10:25:24Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.42.2, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1770267347)
Feb 23 08:39:34 np0005626463.localdomain systemd[1]: Started libpod-conmon-14fae1fcc3ca35c5316b2a7fece2747cb26fe4492594ba0a5075b7988fad06d3.scope.
Feb 23 08:39:34 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 08:39:34 np0005626463.localdomain podman[93626]: 2026-02-23 08:39:34.840884243 +0000 UTC m=+0.047447614 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 23 08:39:34 np0005626463.localdomain podman[93626]: 2026-02-23 08:39:34.951979298 +0000 UTC m=+0.158542619 container init 14fae1fcc3ca35c5316b2a7fece2747cb26fe4492594ba0a5075b7988fad06d3 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=quirky_lumiere, com.redhat.component=rhceph-container, io.buildah.version=1.42.2, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, GIT_BRANCH=main, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, release=1770267347, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2026-02-09T10:25:24Z, org.opencontainers.image.created=2026-02-09T10:25:24Z, name=rhceph, GIT_CLEAN=True, version=7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7)
Feb 23 08:39:34 np0005626463.localdomain podman[93626]: 2026-02-23 08:39:34.969399256 +0000 UTC m=+0.175962567 container start 14fae1fcc3ca35c5316b2a7fece2747cb26fe4492594ba0a5075b7988fad06d3 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=quirky_lumiere, name=rhceph, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.42.2, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, ceph=True, release=1770267347, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, RELEASE=main, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, version=7, vendor=Red Hat, Inc., build-date=2026-02-09T10:25:24Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Feb 23 08:39:34 np0005626463.localdomain podman[93626]: 2026-02-23 08:39:34.969783928 +0000 UTC m=+0.176347259 container attach 14fae1fcc3ca35c5316b2a7fece2747cb26fe4492594ba0a5075b7988fad06d3 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=quirky_lumiere, io.buildah.version=1.42.2, GIT_CLEAN=True, build-date=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., ceph=True, name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, io.openshift.expose-services=, architecture=x86_64, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1770267347, CEPH_POINT_RELEASE=)
Feb 23 08:39:34 np0005626463.localdomain quirky_lumiere[93641]: 167 167
Feb 23 08:39:34 np0005626463.localdomain systemd[1]: libpod-14fae1fcc3ca35c5316b2a7fece2747cb26fe4492594ba0a5075b7988fad06d3.scope: Deactivated successfully.
Feb 23 08:39:34 np0005626463.localdomain podman[93626]: 2026-02-23 08:39:34.974995282 +0000 UTC m=+0.181558603 container died 14fae1fcc3ca35c5316b2a7fece2747cb26fe4492594ba0a5075b7988fad06d3 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=quirky_lumiere, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.created=2026-02-09T10:25:24Z, distribution-scope=public, CEPH_POINT_RELEASE=, build-date=2026-02-09T10:25:24Z, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, version=7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, name=rhceph, vcs-type=git, release=1770267347, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., GIT_CLEAN=True, architecture=x86_64, RELEASE=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, com.redhat.component=rhceph-container, io.buildah.version=1.42.2)
Feb 23 08:39:35 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.
Feb 23 08:39:35 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.
Feb 23 08:39:35 np0005626463.localdomain podman[93646]: 2026-02-23 08:39:35.084167088 +0000 UTC m=+0.097395566 container remove 14fae1fcc3ca35c5316b2a7fece2747cb26fe4492594ba0a5075b7988fad06d3 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=quirky_lumiere, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, vcs-type=git, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2026-02-09T10:25:24Z, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, ceph=True, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, distribution-scope=public, com.redhat.component=rhceph-container, version=7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.42.2, build-date=2026-02-09T10:25:24Z, release=1770267347, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 23 08:39:35 np0005626463.localdomain systemd[1]: libpod-conmon-14fae1fcc3ca35c5316b2a7fece2747cb26fe4492594ba0a5075b7988fad06d3.scope: Deactivated successfully.
Feb 23 08:39:35 np0005626463.localdomain podman[93660]: 2026-02-23 08:39:35.174227451 +0000 UTC m=+0.102572439 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, tcib_managed=true, vcs-type=git, distribution-scope=public, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, release=1766032510, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 
ceilometer-compute, batch=17.1_20260112.1, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:07:47Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, name=rhosp-rhel9/openstack-ceilometer-compute, container_name=ceilometer_agent_compute, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T23:07:47Z, com.redhat.component=openstack-ceilometer-compute-container, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute)
Feb 23 08:39:35 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.
Feb 23 08:39:35 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.
Feb 23 08:39:35 np0005626463.localdomain podman[93660]: 2026-02-23 08:39:35.23554063 +0000 UTC m=+0.163885538 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, url=https://www.redhat.com, version=17.1.13, name=rhosp-rhel9/openstack-ceilometer-compute, distribution-scope=public, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.created=2026-01-12T23:07:47Z, batch=17.1_20260112.1, release=1766032510, container_name=ceilometer_agent_compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2026-01-12T23:07:47Z, config_id=tripleo_step4, tcib_managed=true)
Feb 23 08:39:35 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.
Feb 23 08:39:35 np0005626463.localdomain systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully.
Feb 23 08:39:35 np0005626463.localdomain podman[93716]: 
Feb 23 08:39:35 np0005626463.localdomain podman[93706]: 2026-02-23 08:39:35.327944178 +0000 UTC m=+0.089930831 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, container_name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, tcib_managed=true, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, config_id=tripleo_step5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, managed_by=tripleo_ansible, io.openshift.expose-services=)
Feb 23 08:39:35 np0005626463.localdomain podman[93716]: 2026-02-23 08:39:35.338722316 +0000 UTC m=+0.087274477 container create d28267e237da207d15f8d7d0a908151398f2c09403693babac245c71652efb05 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=romantic_mendel, com.redhat.component=rhceph-container, build-date=2026-02-09T10:25:24Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1770267347, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, vendor=Red Hat, Inc., name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, version=7, io.buildah.version=1.42.2, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, distribution-scope=public, vcs-type=git)
Feb 23 08:39:35 np0005626463.localdomain systemd[1]: Started libpod-conmon-d28267e237da207d15f8d7d0a908151398f2c09403693babac245c71652efb05.scope.
Feb 23 08:39:35 np0005626463.localdomain podman[93716]: 2026-02-23 08:39:35.305693207 +0000 UTC m=+0.054245408 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 23 08:39:35 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 08:39:35 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6b724e3f641c64179c2edf4f868097ca71a3efcf7a4817c804ae548c6ae76a7f/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Feb 23 08:39:35 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6b724e3f641c64179c2edf4f868097ca71a3efcf7a4817c804ae548c6ae76a7f/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Feb 23 08:39:35 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6b724e3f641c64179c2edf4f868097ca71a3efcf7a4817c804ae548c6ae76a7f/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Feb 23 08:39:35 np0005626463.localdomain podman[93716]: 2026-02-23 08:39:35.424819535 +0000 UTC m=+0.173371676 container init d28267e237da207d15f8d7d0a908151398f2c09403693babac245c71652efb05 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=romantic_mendel, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=rhceph-container, name=rhceph, GIT_CLEAN=True, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.42.2, distribution-scope=public, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, RELEASE=main, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.created=2026-02-09T10:25:24Z, CEPH_POINT_RELEASE=, release=1770267347, build-date=2026-02-09T10:25:24Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7)
Feb 23 08:39:35 np0005626463.localdomain podman[93733]: 2026-02-23 08:39:35.427309993 +0000 UTC m=+0.153037596 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, release=1766032510, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13, url=https://www.redhat.com, architecture=x86_64, vendor=Red Hat, Inc., managed_by=tripleo_ansible, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, tcib_managed=true, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:07:30Z, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:30Z)
Feb 23 08:39:35 np0005626463.localdomain sshd[93568]: Connection closed by authenticating user root 116.255.155.36 port 57698 [preauth]
Feb 23 08:39:35 np0005626463.localdomain podman[93716]: 2026-02-23 08:39:35.437120103 +0000 UTC m=+0.185672244 container start d28267e237da207d15f8d7d0a908151398f2c09403693babac245c71652efb05 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=romantic_mendel, distribution-scope=public, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, ceph=True, RELEASE=main, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.42.2, com.redhat.component=rhceph-container, name=rhceph, CEPH_POINT_RELEASE=, version=7, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1770267347, vcs-type=git, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.created=2026-02-09T10:25:24Z, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, build-date=2026-02-09T10:25:24Z, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, architecture=x86_64)
Feb 23 08:39:35 np0005626463.localdomain podman[93716]: 2026-02-23 08:39:35.437973119 +0000 UTC m=+0.186525300 container attach d28267e237da207d15f8d7d0a908151398f2c09403693babac245c71652efb05 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=romantic_mendel, io.openshift.tags=rhceph ceph, RELEASE=main, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, version=7, build-date=2026-02-09T10:25:24Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, release=1770267347, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_CLEAN=True, io.openshift.expose-services=, architecture=x86_64, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, name=rhceph, CEPH_POINT_RELEASE=, org.opencontainers.image.created=2026-02-09T10:25:24Z, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.42.2, distribution-scope=public)
Feb 23 08:39:35 np0005626463.localdomain podman[93659]: 2026-02-23 08:39:35.237081828 +0000 UTC m=+0.166263082 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:34:43Z, tcib_managed=true, name=rhosp-rhel9/openstack-iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, container_name=iscsid, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, architecture=x86_64, batch=17.1_20260112.1, managed_by=tripleo_ansible, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, version=17.1.13, release=1766032510, config_id=tripleo_step3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-ref=705339545363fec600102567c4e923938e0f43b3, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, build-date=2026-01-12T22:34:43Z)
Feb 23 08:39:35 np0005626463.localdomain podman[93706]: 2026-02-23 08:39:35.443975798 +0000 UTC m=+0.205962461 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, release=1766032510, container_name=nova_compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-nova-compute, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, architecture=x86_64, version=17.1.13, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, build-date=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step5, url=https://www.redhat.com)
Feb 23 08:39:35 np0005626463.localdomain podman[93733]: 2026-02-23 08:39:35.456448351 +0000 UTC m=+0.182175954 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:07:30Z, batch=17.1_20260112.1, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 
vendor=Red Hat, Inc., io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, url=https://www.redhat.com, distribution-scope=public, io.buildah.version=1.41.5, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-ipmi-container, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, release=1766032510, container_name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 23 08:39:35 np0005626463.localdomain systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully.
Feb 23 08:39:35 np0005626463.localdomain podman[93659]: 2026-02-23 08:39:35.478407861 +0000 UTC m=+0.407589125 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-iscsid-container, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:34:43Z, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, container_name=iscsid, name=rhosp-rhel9/openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.expose-services=, batch=17.1_20260112.1, build-date=2026-01-12T22:34:43Z, 
maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13, io.buildah.version=1.41.5, architecture=x86_64, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, release=1766032510, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=705339545363fec600102567c4e923938e0f43b3, url=https://www.redhat.com)
Feb 23 08:39:35 np0005626463.localdomain systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully.
Feb 23 08:39:35 np0005626463.localdomain systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Deactivated successfully.
Feb 23 08:39:35 np0005626463.localdomain podman[93703]: 2026-02-23 08:39:35.396662869 +0000 UTC m=+0.163009880 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, url=https://www.redhat.com, release=1766032510, version=17.1.13, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:10:15Z, name=rhosp-rhel9/openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:10:15Z, description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, 
org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=logrotate_crond, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-cron-container, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.expose-services=, io.buildah.version=1.41.5)
Feb 23 08:39:35 np0005626463.localdomain podman[93703]: 2026-02-23 08:39:35.582414363 +0000 UTC m=+0.348761324 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, managed_by=tripleo_ansible, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, tcib_managed=true, batch=17.1_20260112.1, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, build-date=2026-01-12T22:10:15Z, description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=logrotate_crond, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_id=tripleo_step4, release=1766032510, name=rhosp-rhel9/openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee)
Feb 23 08:39:35 np0005626463.localdomain systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully.
Feb 23 08:39:35 np0005626463.localdomain sshd[93806]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:39:35 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-387aafea2448999ae2405fb00529bd6bf8cf6a12bf38716a9e8a06259075d983-merged.mount: Deactivated successfully.
Feb 23 08:39:36 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:19:01:95 MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.104 DST=192.168.122.106 LEN=40 TOS=0x00 PREC=0x00 TTL=64 ID=0 DF PROTO=TCP SPT=5668 DPT=58188 SEQ=0 ACK=2144845811 WINDOW=0 RES=0x00 ACK RST URGP=0 
Feb 23 08:39:36 np0005626463.localdomain romantic_mendel[93779]: [
Feb 23 08:39:36 np0005626463.localdomain romantic_mendel[93779]:     {
Feb 23 08:39:36 np0005626463.localdomain romantic_mendel[93779]:         "available": false,
Feb 23 08:39:36 np0005626463.localdomain romantic_mendel[93779]:         "ceph_device": false,
Feb 23 08:39:36 np0005626463.localdomain romantic_mendel[93779]:         "device_id": "QEMU_DVD-ROM_QM00001",
Feb 23 08:39:36 np0005626463.localdomain romantic_mendel[93779]:         "lsm_data": {},
Feb 23 08:39:36 np0005626463.localdomain romantic_mendel[93779]:         "lvs": [],
Feb 23 08:39:36 np0005626463.localdomain romantic_mendel[93779]:         "path": "/dev/sr0",
Feb 23 08:39:36 np0005626463.localdomain romantic_mendel[93779]:         "rejected_reasons": [
Feb 23 08:39:36 np0005626463.localdomain romantic_mendel[93779]:             "Insufficient space (<5GB)",
Feb 23 08:39:36 np0005626463.localdomain romantic_mendel[93779]:             "Has a FileSystem"
Feb 23 08:39:36 np0005626463.localdomain romantic_mendel[93779]:         ],
Feb 23 08:39:36 np0005626463.localdomain romantic_mendel[93779]:         "sys_api": {
Feb 23 08:39:36 np0005626463.localdomain romantic_mendel[93779]:             "actuators": null,
Feb 23 08:39:36 np0005626463.localdomain romantic_mendel[93779]:             "device_nodes": "sr0",
Feb 23 08:39:36 np0005626463.localdomain romantic_mendel[93779]:             "human_readable_size": "482.00 KB",
Feb 23 08:39:36 np0005626463.localdomain romantic_mendel[93779]:             "id_bus": "ata",
Feb 23 08:39:36 np0005626463.localdomain romantic_mendel[93779]:             "model": "QEMU DVD-ROM",
Feb 23 08:39:36 np0005626463.localdomain romantic_mendel[93779]:             "nr_requests": "2",
Feb 23 08:39:36 np0005626463.localdomain romantic_mendel[93779]:             "partitions": {},
Feb 23 08:39:36 np0005626463.localdomain romantic_mendel[93779]:             "path": "/dev/sr0",
Feb 23 08:39:36 np0005626463.localdomain romantic_mendel[93779]:             "removable": "1",
Feb 23 08:39:36 np0005626463.localdomain romantic_mendel[93779]:             "rev": "2.5+",
Feb 23 08:39:36 np0005626463.localdomain romantic_mendel[93779]:             "ro": "0",
Feb 23 08:39:36 np0005626463.localdomain romantic_mendel[93779]:             "rotational": "1",
Feb 23 08:39:36 np0005626463.localdomain romantic_mendel[93779]:             "sas_address": "",
Feb 23 08:39:36 np0005626463.localdomain romantic_mendel[93779]:             "sas_device_handle": "",
Feb 23 08:39:36 np0005626463.localdomain romantic_mendel[93779]:             "scheduler_mode": "mq-deadline",
Feb 23 08:39:36 np0005626463.localdomain romantic_mendel[93779]:             "sectors": 0,
Feb 23 08:39:36 np0005626463.localdomain romantic_mendel[93779]:             "sectorsize": "2048",
Feb 23 08:39:36 np0005626463.localdomain romantic_mendel[93779]:             "size": 493568.0,
Feb 23 08:39:36 np0005626463.localdomain romantic_mendel[93779]:             "support_discard": "0",
Feb 23 08:39:36 np0005626463.localdomain romantic_mendel[93779]:             "type": "disk",
Feb 23 08:39:36 np0005626463.localdomain romantic_mendel[93779]:             "vendor": "QEMU"
Feb 23 08:39:36 np0005626463.localdomain romantic_mendel[93779]:         }
Feb 23 08:39:36 np0005626463.localdomain romantic_mendel[93779]:     }
Feb 23 08:39:36 np0005626463.localdomain romantic_mendel[93779]: ]
Feb 23 08:39:36 np0005626463.localdomain systemd[1]: libpod-d28267e237da207d15f8d7d0a908151398f2c09403693babac245c71652efb05.scope: Deactivated successfully.
Feb 23 08:39:36 np0005626463.localdomain systemd[1]: libpod-d28267e237da207d15f8d7d0a908151398f2c09403693babac245c71652efb05.scope: Consumed 1.122s CPU time.
Feb 23 08:39:36 np0005626463.localdomain podman[95659]: 2026-02-23 08:39:36.620857877 +0000 UTC m=+0.056203230 container died d28267e237da207d15f8d7d0a908151398f2c09403693babac245c71652efb05 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=romantic_mendel, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2026-02-09T10:25:24Z, build-date=2026-02-09T10:25:24Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, RELEASE=main, ceph=True, io.openshift.expose-services=, architecture=x86_64, io.buildah.version=1.42.2, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, version=7, CEPH_POINT_RELEASE=, release=1770267347, GIT_CLEAN=True, distribution-scope=public, vcs-type=git, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Feb 23 08:39:36 np0005626463.localdomain systemd[1]: tmp-crun.q7i8qj.mount: Deactivated successfully.
Feb 23 08:39:36 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-6b724e3f641c64179c2edf4f868097ca71a3efcf7a4817c804ae548c6ae76a7f-merged.mount: Deactivated successfully.
Feb 23 08:39:36 np0005626463.localdomain podman[95659]: 2026-02-23 08:39:36.665169281 +0000 UTC m=+0.100514594 container remove d28267e237da207d15f8d7d0a908151398f2c09403693babac245c71652efb05 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=romantic_mendel, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main, description=Red Hat Ceph Storage 7, version=7, com.redhat.component=rhceph-container, io.buildah.version=1.42.2, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2026-02-09T10:25:24Z, release=1770267347, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_BRANCH=main, name=rhceph, distribution-scope=public, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, architecture=x86_64, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, org.opencontainers.image.created=2026-02-09T10:25:24Z)
Feb 23 08:39:36 np0005626463.localdomain systemd[1]: libpod-conmon-d28267e237da207d15f8d7d0a908151398f2c09403693babac245c71652efb05.scope: Deactivated successfully.
Feb 23 08:39:36 np0005626463.localdomain sudo[93569]: pam_unix(sudo:session): session closed for user root
Feb 23 08:39:36 np0005626463.localdomain sshd[93806]: Connection closed by authenticating user root 116.255.155.36 port 58964 [preauth]
Feb 23 08:39:37 np0005626463.localdomain sshd[95674]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:39:38 np0005626463.localdomain sshd[95674]: Connection closed by authenticating user root 116.255.155.36 port 60100 [preauth]
Feb 23 08:39:38 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.
Feb 23 08:39:38 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.
Feb 23 08:39:38 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.
Feb 23 08:39:38 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.
Feb 23 08:39:38 np0005626463.localdomain systemd[1]: tmp-crun.obe6u4.mount: Deactivated successfully.
Feb 23 08:39:38 np0005626463.localdomain podman[95676]: 2026-02-23 08:39:38.476996798 +0000 UTC m=+0.150478846 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.component=openstack-ovn-controller-container, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, release=1766032510, build-date=2026-01-12T22:36:40Z, container_name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, tcib_managed=true, name=rhosp-rhel9/openstack-ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, url=https://www.redhat.com, batch=17.1_20260112.1, 
io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., io.buildah.version=1.41.5)
Feb 23 08:39:38 np0005626463.localdomain systemd[1]: tmp-crun.VsUxHm.mount: Deactivated successfully.
Feb 23 08:39:38 np0005626463.localdomain podman[95676]: 2026-02-23 08:39:38.539220825 +0000 UTC m=+0.212702853 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, batch=17.1_20260112.1, build-date=2026-01-12T22:36:40Z, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, architecture=x86_64, io.buildah.version=1.41.5, vendor=Red Hat, Inc., version=17.1.13, description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 
ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:36:40Z)
Feb 23 08:39:38 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Deactivated successfully.
Feb 23 08:39:38 np0005626463.localdomain podman[95677]: 2026-02-23 08:39:38.523351496 +0000 UTC m=+0.196246326 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, vendor=Red Hat, Inc., release=1766032510, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, distribution-scope=public, io.openshift.expose-services=, container_name=ovn_metadata_agent, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:56:19Z, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:56:19Z, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git)
Feb 23 08:39:38 np0005626463.localdomain sshd[95774]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:39:38 np0005626463.localdomain podman[95678]: 2026-02-23 08:39:38.440544311 +0000 UTC m=+0.112016416 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, name=rhosp-rhel9/openstack-qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 qdrouterd, 
com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, vendor=Red Hat, Inc., version=17.1.13, url=https://www.redhat.com, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:14Z, build-date=2026-01-12T22:10:14Z, io.openshift.expose-services=, release=1766032510, vcs-type=git)
Feb 23 08:39:38 np0005626463.localdomain podman[95717]: 2026-02-23 08:39:38.541077254 +0000 UTC m=+0.094250747 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, release=1766032510, managed_by=tripleo_ansible, container_name=nova_migration_target, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step4, vcs-type=git, version=17.1.13, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true)
Feb 23 08:39:38 np0005626463.localdomain podman[95677]: 2026-02-23 08:39:38.603783477 +0000 UTC m=+0.276678307 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, batch=17.1_20260112.1, io.openshift.expose-services=, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:56:19Z, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.buildah.version=1.41.5, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', 
'/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, release=1766032510, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, architecture=x86_64, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, tcib_managed=true, config_id=tripleo_step4, container_name=ovn_metadata_agent, vendor=Red Hat, Inc.)
Feb 23 08:39:38 np0005626463.localdomain systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Deactivated successfully.
Feb 23 08:39:38 np0005626463.localdomain podman[95678]: 2026-02-23 08:39:38.702320337 +0000 UTC m=+0.373792372 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, release=1766032510, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:10:14Z, konflux.additional-tags=17.1.13 17.1_20260112.1, 
batch=17.1_20260112.1, distribution-scope=public, container_name=metrics_qdr, tcib_managed=true, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-qdrouterd, version=17.1.13, maintainer=OpenStack TripleO Team, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:10:14Z, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 23 08:39:38 np0005626463.localdomain systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully.
Feb 23 08:39:38 np0005626463.localdomain podman[95717]: 2026-02-23 08:39:38.954985397 +0000 UTC m=+0.508158940 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, managed_by=tripleo_ansible, batch=17.1_20260112.1, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, tcib_managed=true, name=rhosp-rhel9/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, container_name=nova_migration_target, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:32:04Z, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, build-date=2026-01-12T23:32:04Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']})
Feb 23 08:39:38 np0005626463.localdomain systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully.
Feb 23 08:39:39 np0005626463.localdomain sudo[95777]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 08:39:39 np0005626463.localdomain sudo[95777]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:39:39 np0005626463.localdomain sudo[95777]: pam_unix(sudo:session): session closed for user root
Feb 23 08:39:39 np0005626463.localdomain sshd[95774]: Connection closed by authenticating user root 116.255.155.36 port 32886 [preauth]
Feb 23 08:39:40 np0005626463.localdomain sshd[95792]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:39:41 np0005626463.localdomain sshd[95792]: Connection closed by authenticating user root 116.255.155.36 port 34198 [preauth]
Feb 23 08:39:41 np0005626463.localdomain sshd[95794]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:39:42 np0005626463.localdomain sshd[95794]: Connection closed by authenticating user root 116.255.155.36 port 35302 [preauth]
Feb 23 08:39:42 np0005626463.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Feb 23 08:39:42 np0005626463.localdomain recover_tripleo_nova_virtqemud[95798]: 61982
Feb 23 08:39:42 np0005626463.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Feb 23 08:39:42 np0005626463.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Feb 23 08:39:43 np0005626463.localdomain sshd[95799]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:39:44 np0005626463.localdomain sshd[95799]: Connection closed by authenticating user root 116.255.155.36 port 36482 [preauth]
Feb 23 08:39:44 np0005626463.localdomain sshd[95801]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:39:46 np0005626463.localdomain sshd[95801]: Connection closed by authenticating user root 116.255.155.36 port 37878 [preauth]
Feb 23 08:39:46 np0005626463.localdomain sshd[95803]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:39:47 np0005626463.localdomain sshd[95803]: Connection closed by authenticating user root 116.255.155.36 port 39066 [preauth]
Feb 23 08:39:47 np0005626463.localdomain sshd[95805]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:39:49 np0005626463.localdomain sshd[95805]: Connection closed by authenticating user root 116.255.155.36 port 40334 [preauth]
Feb 23 08:39:49 np0005626463.localdomain sshd[95807]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:39:51 np0005626463.localdomain sshd[95807]: Connection closed by authenticating user root 116.255.155.36 port 41634 [preauth]
Feb 23 08:39:51 np0005626463.localdomain sshd[95809]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:39:52 np0005626463.localdomain sshd[95809]: Connection closed by authenticating user root 116.255.155.36 port 43164 [preauth]
Feb 23 08:39:52 np0005626463.localdomain sshd[95811]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:39:54 np0005626463.localdomain sshd[95811]: Connection closed by authenticating user root 116.255.155.36 port 44340 [preauth]
Feb 23 08:39:54 np0005626463.localdomain sshd[95813]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:39:55 np0005626463.localdomain sshd[95813]: Connection closed by authenticating user root 116.255.155.36 port 45398 [preauth]
Feb 23 08:39:56 np0005626463.localdomain sshd[95815]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:39:57 np0005626463.localdomain sshd[95815]: Connection closed by authenticating user root 116.255.155.36 port 46766 [preauth]
Feb 23 08:39:57 np0005626463.localdomain sshd[95817]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:39:58 np0005626463.localdomain sshd[95817]: Connection closed by authenticating user root 116.255.155.36 port 47826 [preauth]
Feb 23 08:39:58 np0005626463.localdomain sshd[95819]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:39:59 np0005626463.localdomain sshd[95819]: Connection closed by authenticating user root 116.255.155.36 port 48872 [preauth]
Feb 23 08:39:59 np0005626463.localdomain sshd[95821]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:40:00 np0005626463.localdomain sshd[95822]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:40:01 np0005626463.localdomain sshd[95821]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:40:01 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.
Feb 23 08:40:01 np0005626463.localdomain systemd[1]: tmp-crun.chWVWC.mount: Deactivated successfully.
Feb 23 08:40:01 np0005626463.localdomain podman[95825]: 2026-02-23 08:40:01.287724095 +0000 UTC m=+0.092225633 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-collectd-container, vendor=Red Hat, Inc., version=17.1.13, config_id=tripleo_step3, build-date=2026-01-12T22:10:15Z, architecture=x86_64, tcib_managed=true, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, url=https://www.redhat.com, release=1766032510, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, container_name=collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.buildah.version=1.41.5, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, name=rhosp-rhel9/openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd)
Feb 23 08:40:01 np0005626463.localdomain podman[95825]: 2026-02-23 08:40:01.303288925 +0000 UTC m=+0.107790513 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vcs-type=git, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, build-date=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 
17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, container_name=collectd, name=rhosp-rhel9/openstack-collectd, distribution-scope=public, managed_by=tripleo_ansible, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, vendor=Red Hat, Inc., io.openshift.expose-services=, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 23 08:40:01 np0005626463.localdomain systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully.
Feb 23 08:40:01 np0005626463.localdomain sshd[95822]: Connection closed by authenticating user root 116.255.155.36 port 49954 [preauth]
Feb 23 08:40:02 np0005626463.localdomain sshd[95846]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:40:04 np0005626463.localdomain sshd[95846]: Connection closed by authenticating user root 116.255.155.36 port 51162 [preauth]
Feb 23 08:40:04 np0005626463.localdomain sshd[95848]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:40:05 np0005626463.localdomain sshd[95848]: Connection closed by authenticating user root 116.255.155.36 port 53256 [preauth]
Feb 23 08:40:05 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.
Feb 23 08:40:05 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.
Feb 23 08:40:05 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.
Feb 23 08:40:05 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.
Feb 23 08:40:05 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.
Feb 23 08:40:05 np0005626463.localdomain systemd[1]: tmp-crun.7hkJgD.mount: Deactivated successfully.
Feb 23 08:40:05 np0005626463.localdomain podman[95850]: 2026-02-23 08:40:05.831713384 +0000 UTC m=+0.102107243 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, managed_by=tripleo_ansible, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, vendor=Red Hat, Inc., release=1766032510, maintainer=OpenStack TripleO Team, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:34:43Z, batch=17.1_20260112.1, build-date=2026-01-12T22:34:43Z, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=705339545363fec600102567c4e923938e0f43b3, summary=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, name=rhosp-rhel9/openstack-iscsid, config_id=tripleo_step3, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']})
Feb 23 08:40:05 np0005626463.localdomain systemd[1]: tmp-crun.kRWCa3.mount: Deactivated successfully.
Feb 23 08:40:05 np0005626463.localdomain podman[95850]: 2026-02-23 08:40:05.869318078 +0000 UTC m=+0.139711957 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, com.redhat.component=openstack-iscsid-container, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, version=17.1.13, container_name=iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=705339545363fec600102567c4e923938e0f43b3, config_id=tripleo_step3, url=https://www.redhat.com, build-date=2026-01-12T22:34:43Z, tcib_managed=true, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.5, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:34:43Z, name=rhosp-rhel9/openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']})
Feb 23 08:40:05 np0005626463.localdomain systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully.
Feb 23 08:40:05 np0005626463.localdomain podman[95852]: 2026-02-23 08:40:05.927331204 +0000 UTC m=+0.191310471 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, 
com.redhat.component=openstack-ceilometer-ipmi-container, config_id=tripleo_step4, version=17.1.13, name=rhosp-rhel9/openstack-ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:30Z, tcib_managed=true, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, batch=17.1_20260112.1, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_ipmi, io.openshift.expose-services=, architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:07:30Z)
Feb 23 08:40:05 np0005626463.localdomain podman[95851]: 2026-02-23 08:40:05.893807049 +0000 UTC m=+0.162749063 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, architecture=x86_64, io.buildah.version=1.41.5, com.redhat.component=openstack-ceilometer-compute-container, vendor=Red Hat, Inc., batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:07:47Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, 
tcib_managed=true, url=https://www.redhat.com, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:07:47Z, name=rhosp-rhel9/openstack-ceilometer-compute, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=ceilometer_agent_compute, release=1766032510, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git)
Feb 23 08:40:05 np0005626463.localdomain podman[95857]: 2026-02-23 08:40:05.846900163 +0000 UTC m=+0.103402685 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, name=rhosp-rhel9/openstack-nova-compute, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:32:04Z, vcs-type=git, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, url=https://www.redhat.com, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step5, distribution-scope=public, batch=17.1_20260112.1, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, io.buildah.version=1.41.5, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-nova-compute-container, version=17.1.13, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64)
Feb 23 08:40:05 np0005626463.localdomain sshd[95958]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:40:05 np0005626463.localdomain podman[95851]: 2026-02-23 08:40:05.978305158 +0000 UTC m=+0.247247132 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, com.redhat.component=openstack-ceilometer-compute-container, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, tcib_managed=true, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2026-01-12T23:07:47Z, config_id=tripleo_step4, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, io.openshift.expose-services=, url=https://www.redhat.com, vcs-type=git, container_name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, architecture=x86_64, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.13)
Feb 23 08:40:05 np0005626463.localdomain podman[95857]: 2026-02-23 08:40:05.98631401 +0000 UTC m=+0.242816562 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step5, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, tcib_managed=true, release=1766032510, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, build-date=2026-01-12T23:32:04Z, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, io.buildah.version=1.41.5, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 
'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_compute)
Feb 23 08:40:05 np0005626463.localdomain systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully.
Feb 23 08:40:06 np0005626463.localdomain systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Deactivated successfully.
Feb 23 08:40:06 np0005626463.localdomain podman[95853]: 2026-02-23 08:40:06.039844924 +0000 UTC m=+0.301372654 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, name=rhosp-rhel9/openstack-cron, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, vcs-type=git, version=17.1.13, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 
17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, distribution-scope=public, managed_by=tripleo_ansible, release=1766032510, io.buildah.version=1.41.5, batch=17.1_20260112.1, build-date=2026-01-12T22:10:15Z, com.redhat.component=openstack-cron-container, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=logrotate_crond)
Feb 23 08:40:06 np0005626463.localdomain podman[95852]: 2026-02-23 08:40:06.048190236 +0000 UTC m=+0.312169523 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, name=rhosp-rhel9/openstack-ceilometer-ipmi, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, vcs-type=git, container_name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, release=1766032510, build-date=2026-01-12T23:07:30Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, architecture=x86_64, batch=17.1_20260112.1, com.redhat.component=openstack-ceilometer-ipmi-container, vendor=Red Hat, Inc., distribution-scope=public)
Feb 23 08:40:06 np0005626463.localdomain systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully.
Feb 23 08:40:06 np0005626463.localdomain podman[95853]: 2026-02-23 08:40:06.073282266 +0000 UTC m=+0.334810026 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:10:15Z, release=1766032510, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, io.openshift.expose-services=, tcib_managed=true, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, 
com.redhat.component=openstack-cron-container, container_name=logrotate_crond, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:10:15Z, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-cron, maintainer=OpenStack TripleO Team, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee)
Feb 23 08:40:06 np0005626463.localdomain systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully.
Feb 23 08:40:07 np0005626463.localdomain sshd[95958]: Connection closed by authenticating user root 116.255.155.36 port 54438 [preauth]
Feb 23 08:40:07 np0005626463.localdomain sshd[95973]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:40:07 np0005626463.localdomain sshd[95975]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:40:08 np0005626463.localdomain sshd[95975]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:40:08 np0005626463.localdomain sshd[95973]: Connection closed by authenticating user root 116.255.155.36 port 55700 [preauth]
Feb 23 08:40:08 np0005626463.localdomain sshd[95977]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:40:08 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.
Feb 23 08:40:08 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.
Feb 23 08:40:08 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.
Feb 23 08:40:08 np0005626463.localdomain systemd[1]: tmp-crun.Z2edxv.mount: Deactivated successfully.
Feb 23 08:40:08 np0005626463.localdomain podman[95980]: 2026-02-23 08:40:08.929625716 +0000 UTC m=+0.096053793 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, batch=17.1_20260112.1, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, version=17.1.13, url=https://www.redhat.com, release=1766032510, architecture=x86_64, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.expose-services=, vcs-type=git, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:56:19Z, distribution-scope=public, managed_by=tripleo_ansible, build-date=2026-01-12T22:56:19Z)
Feb 23 08:40:08 np0005626463.localdomain podman[95980]: 2026-02-23 08:40:08.991486162 +0000 UTC m=+0.157914279 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, org.opencontainers.image.created=2026-01-12T22:56:19Z, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, version=17.1.13, tcib_managed=true, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.expose-services=, vendor=Red 
Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, architecture=x86_64, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.buildah.version=1.41.5, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, release=1766032510, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:56:19Z)
Feb 23 08:40:09 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.
Feb 23 08:40:09 np0005626463.localdomain systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Deactivated successfully.
Feb 23 08:40:09 np0005626463.localdomain podman[95978]: 2026-02-23 08:40:09.008045503 +0000 UTC m=+0.176635678 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:36:40Z, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, version=17.1.13, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, build-date=2026-01-12T22:36:40Z, batch=17.1_20260112.1, url=https://www.redhat.com, io.buildah.version=1.41.5, config_id=tripleo_step4, container_name=ovn_controller, name=rhosp-rhel9/openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack 
Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ovn-controller, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-type=git)
Feb 23 08:40:09 np0005626463.localdomain podman[95978]: 2026-02-23 08:40:09.032002498 +0000 UTC m=+0.200592683 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, managed_by=tripleo_ansible, vendor=Red Hat, Inc., architecture=x86_64, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, container_name=ovn_controller, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, vcs-type=git, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:36:40Z, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, 
com.redhat.component=openstack-ovn-controller-container, build-date=2026-01-12T22:36:40Z, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-ovn-controller, tcib_managed=true)
Feb 23 08:40:09 np0005626463.localdomain podman[95978]: unhealthy
Feb 23 08:40:09 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Main process exited, code=exited, status=1/FAILURE
Feb 23 08:40:09 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Failed with result 'exit-code'.
Feb 23 08:40:09 np0005626463.localdomain podman[95981]: 2026-02-23 08:40:09.083051394 +0000 UTC m=+0.245993861 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:10:14Z, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:14Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', 
'/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.component=openstack-qdrouterd-container, tcib_managed=true, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step1, vendor=Red Hat, Inc., batch=17.1_20260112.1, version=17.1.13, description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, container_name=metrics_qdr, io.openshift.expose-services=, vcs-type=git, io.buildah.version=1.41.5, architecture=x86_64, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-qdrouterd)
Feb 23 08:40:09 np0005626463.localdomain podman[96032]: 2026-02-23 08:40:09.162194464 +0000 UTC m=+0.144713434 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.13, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, build-date=2026-01-12T23:32:04Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, batch=17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, container_name=nova_migration_target, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, config_id=tripleo_step4, url=https://www.redhat.com)
Feb 23 08:40:09 np0005626463.localdomain podman[95981]: 2026-02-23 08:40:09.262411327 +0000 UTC m=+0.425353594 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, io.openshift.expose-services=, config_id=tripleo_step1, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, batch=17.1_20260112.1, io.buildah.version=1.41.5, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, release=1766032510, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, name=rhosp-rhel9/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, build-date=2026-01-12T22:10:14Z, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd)
Feb 23 08:40:09 np0005626463.localdomain systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully.
Feb 23 08:40:09 np0005626463.localdomain podman[96032]: 2026-02-23 08:40:09.550256714 +0000 UTC m=+0.532775724 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, build-date=2026-01-12T23:32:04Z, io.buildah.version=1.41.5, config_id=tripleo_step4, vcs-type=git, version=17.1.13, name=rhosp-rhel9/openstack-nova-compute, release=1766032510, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, 
org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 23 08:40:09 np0005626463.localdomain systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully.
Feb 23 08:40:09 np0005626463.localdomain systemd[1]: tmp-crun.U8p10R.mount: Deactivated successfully.
Feb 23 08:40:10 np0005626463.localdomain sshd[95977]: Connection closed by authenticating user root 116.255.155.36 port 56664 [preauth]
Feb 23 08:40:10 np0005626463.localdomain sshd[96082]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:40:12 np0005626463.localdomain sshd[96082]: Connection closed by authenticating user root 116.255.155.36 port 58070 [preauth]
Feb 23 08:40:12 np0005626463.localdomain sshd[96084]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:40:13 np0005626463.localdomain sshd[96084]: Connection closed by authenticating user root 116.255.155.36 port 59436 [preauth]
Feb 23 08:40:13 np0005626463.localdomain sshd[96086]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:40:14 np0005626463.localdomain sshd[96086]: Connection closed by authenticating user root 116.255.155.36 port 60504 [preauth]
Feb 23 08:40:15 np0005626463.localdomain sshd[96088]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:40:16 np0005626463.localdomain sshd[96088]: Connection closed by authenticating user root 116.255.155.36 port 33350 [preauth]
Feb 23 08:40:16 np0005626463.localdomain sshd[96090]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:40:18 np0005626463.localdomain sshd[96090]: Connection closed by authenticating user root 116.255.155.36 port 34600 [preauth]
Feb 23 08:40:18 np0005626463.localdomain sshd[96092]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:40:19 np0005626463.localdomain sshd[96092]: Connection closed by authenticating user root 116.255.155.36 port 35846 [preauth]
Feb 23 08:40:20 np0005626463.localdomain sshd[96094]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:40:22 np0005626463.localdomain sshd[96094]: Connection closed by authenticating user root 116.255.155.36 port 36904 [preauth]
Feb 23 08:40:22 np0005626463.localdomain sshd[96096]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:40:23 np0005626463.localdomain sshd[96096]: Connection closed by authenticating user root 116.255.155.36 port 38884 [preauth]
Feb 23 08:40:24 np0005626463.localdomain sshd[96098]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:40:25 np0005626463.localdomain sshd[96098]: Connection closed by authenticating user root 116.255.155.36 port 40178 [preauth]
Feb 23 08:40:25 np0005626463.localdomain sshd[96100]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:40:26 np0005626463.localdomain sshd[96100]: Connection closed by authenticating user root 116.255.155.36 port 41334 [preauth]
Feb 23 08:40:27 np0005626463.localdomain sshd[96102]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:40:28 np0005626463.localdomain sshd[96102]: Connection closed by authenticating user root 116.255.155.36 port 42578 [preauth]
Feb 23 08:40:28 np0005626463.localdomain sshd[96104]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:40:29 np0005626463.localdomain sshd[96104]: Connection closed by authenticating user root 116.255.155.36 port 43730 [preauth]
Feb 23 08:40:29 np0005626463.localdomain sshd[96106]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:40:31 np0005626463.localdomain sshd[96106]: Connection closed by authenticating user root 116.255.155.36 port 44846 [preauth]
Feb 23 08:40:31 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.
Feb 23 08:40:31 np0005626463.localdomain podman[96108]: 2026-02-23 08:40:31.573298746 +0000 UTC m=+0.086187643 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, build-date=2026-01-12T22:10:15Z, io.openshift.expose-services=, vcs-type=git, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:10:15Z, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, com.redhat.component=openstack-collectd-container, container_name=collectd, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, vendor=Red Hat, Inc., distribution-scope=public, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee)
Feb 23 08:40:31 np0005626463.localdomain podman[96108]: 2026-02-23 08:40:31.583774216 +0000 UTC m=+0.096663123 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, release=1766032510, tcib_managed=true, build-date=2026-01-12T22:10:15Z, version=17.1.13, container_name=collectd, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, name=rhosp-rhel9/openstack-collectd, distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, batch=17.1_20260112.1, 
com.redhat.component=openstack-collectd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step3, managed_by=tripleo_ansible, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=)
Feb 23 08:40:31 np0005626463.localdomain systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully.
Feb 23 08:40:31 np0005626463.localdomain sshd[96129]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:40:33 np0005626463.localdomain sshd[96129]: Connection closed by authenticating user root 116.255.155.36 port 46284 [preauth]
Feb 23 08:40:33 np0005626463.localdomain sshd[96131]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:40:34 np0005626463.localdomain sshd[96131]: Connection closed by authenticating user root 116.255.155.36 port 47706 [preauth]
Feb 23 08:40:34 np0005626463.localdomain sshd[96133]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:40:36 np0005626463.localdomain sshd[96133]: Connection closed by authenticating user root 116.255.155.36 port 48810 [preauth]
Feb 23 08:40:36 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.
Feb 23 08:40:36 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.
Feb 23 08:40:36 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.
Feb 23 08:40:36 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.
Feb 23 08:40:36 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.
Feb 23 08:40:36 np0005626463.localdomain podman[96136]: 2026-02-23 08:40:36.433889357 +0000 UTC m=+0.093355108 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, tcib_managed=true, distribution-scope=public, container_name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1766032510, name=rhosp-rhel9/openstack-ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vendor=Red Hat, Inc., 
com.redhat.component=openstack-ceilometer-compute-container, managed_by=tripleo_ansible, architecture=x86_64, config_id=tripleo_step4, vcs-type=git, batch=17.1_20260112.1, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.created=2026-01-12T23:07:47Z, version=17.1.13, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, build-date=2026-01-12T23:07:47Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com)
Feb 23 08:40:36 np0005626463.localdomain podman[96138]: 2026-02-23 08:40:36.497248181 +0000 UTC m=+0.148896736 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.buildah.version=1.41.5, release=1766032510, description=Red Hat OpenStack Platform 17.1 cron, name=rhosp-rhel9/openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, version=17.1.13, maintainer=OpenStack TripleO Team, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, summary=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, batch=17.1_20260112.1, architecture=x86_64, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, managed_by=tripleo_ansible, build-date=2026-01-12T22:10:15Z)
Feb 23 08:40:36 np0005626463.localdomain podman[96136]: 2026-02-23 08:40:36.524342853 +0000 UTC m=+0.183808674 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:07:47Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, url=https://www.redhat.com, 
vcs-type=git, managed_by=tripleo_ansible, release=1766032510, io.openshift.expose-services=, build-date=2026-01-12T23:07:47Z, name=rhosp-rhel9/openstack-ceilometer-compute, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06)
Feb 23 08:40:36 np0005626463.localdomain podman[96135]: 2026-02-23 08:40:36.533477441 +0000 UTC m=+0.195154352 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, architecture=x86_64, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:34:43Z, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid, config_id=tripleo_step3, distribution-scope=public, release=1766032510, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-iscsid-container, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2026-01-12T22:34:43Z, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, version=17.1.13, vcs-ref=705339545363fec600102567c4e923938e0f43b3, vcs-type=git)
Feb 23 08:40:36 np0005626463.localdomain systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully.
Feb 23 08:40:36 np0005626463.localdomain sshd[96233]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:40:36 np0005626463.localdomain podman[96135]: 2026-02-23 08:40:36.550401593 +0000 UTC m=+0.212078494 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, vcs-ref=705339545363fec600102567c4e923938e0f43b3, tcib_managed=true, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, summary=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, 
com.redhat.component=openstack-iscsid-container, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-iscsid, container_name=iscsid, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13, build-date=2026-01-12T22:34:43Z, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.buildah.version=1.41.5, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, config_id=tripleo_step3)
Feb 23 08:40:36 np0005626463.localdomain podman[96144]: 2026-02-23 08:40:36.459791973 +0000 UTC m=+0.106033358 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, build-date=2026-01-12T23:32:04Z, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-nova-compute, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, architecture=x86_64, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.created=2026-01-12T23:32:04Z, release=1766032510, vendor=Red Hat, Inc., batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, container_name=nova_compute)
Feb 23 08:40:36 np0005626463.localdomain systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully.
Feb 23 08:40:36 np0005626463.localdomain podman[96138]: 2026-02-23 08:40:36.580740498 +0000 UTC m=+0.232389093 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, com.redhat.component=openstack-cron-container, architecture=x86_64, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:10:15Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, container_name=logrotate_crond, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.buildah.version=1.41.5, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 23 08:40:36 np0005626463.localdomain systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully.
Feb 23 08:40:36 np0005626463.localdomain podman[96144]: 2026-02-23 08:40:36.594555193 +0000 UTC m=+0.240796618 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vcs-type=git, distribution-scope=public, container_name=nova_compute, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step5, version=17.1.13, build-date=2026-01-12T23:32:04Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, name=rhosp-rhel9/openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe)
Feb 23 08:40:36 np0005626463.localdomain systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Deactivated successfully.
Feb 23 08:40:36 np0005626463.localdomain podman[96137]: 2026-02-23 08:40:36.65106936 +0000 UTC m=+0.306436084 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, vcs-type=git, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, distribution-scope=public, url=https://www.redhat.com, name=rhosp-rhel9/openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, config_id=tripleo_step4, build-date=2026-01-12T23:07:30Z, org.opencontainers.image.created=2026-01-12T23:07:30Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., container_name=ceilometer_agent_ipmi)
Feb 23 08:40:36 np0005626463.localdomain podman[96137]: 2026-02-23 08:40:36.711339786 +0000 UTC m=+0.366706479 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1766032510, version=17.1.13, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, 
io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, build-date=2026-01-12T23:07:30Z, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_ipmi, vcs-type=git, org.opencontainers.image.created=2026-01-12T23:07:30Z, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-ceilometer-ipmi, vendor=Red Hat, Inc., batch=17.1_20260112.1, architecture=x86_64)
Feb 23 08:40:36 np0005626463.localdomain systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully.
Feb 23 08:40:37 np0005626463.localdomain sshd[96233]: Connection closed by authenticating user root 116.255.155.36 port 50066 [preauth]
Feb 23 08:40:37 np0005626463.localdomain sshd[96255]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:40:39 np0005626463.localdomain sshd[96255]: Connection closed by authenticating user root 116.255.155.36 port 51262 [preauth]
Feb 23 08:40:39 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.
Feb 23 08:40:39 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.
Feb 23 08:40:39 np0005626463.localdomain podman[96257]: 2026-02-23 08:40:39.298084875 +0000 UTC m=+0.084464368 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:36:40Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_id=tripleo_step4, io.openshift.expose-services=, name=rhosp-rhel9/openstack-ovn-controller, release=1766032510, vcs-type=git, container_name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, 
distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, architecture=x86_64, build-date=2026-01-12T22:36:40Z, managed_by=tripleo_ansible, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller)
Feb 23 08:40:39 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.
Feb 23 08:40:39 np0005626463.localdomain podman[96258]: 2026-02-23 08:40:39.354797859 +0000 UTC m=+0.137738374 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, batch=17.1_20260112.1, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-type=git, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., release=1766032510, managed_by=tripleo_ansible, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:56:19Z, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, architecture=x86_64, url=https://www.redhat.com, io.openshift.expose-services=, build-date=2026-01-12T22:56:19Z, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, distribution-scope=public)
Feb 23 08:40:39 np0005626463.localdomain podman[96257]: 2026-02-23 08:40:39.360188419 +0000 UTC m=+0.146567872 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, distribution-scope=public, com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, maintainer=OpenStack TripleO Team, version=17.1.13, build-date=2026-01-12T22:36:40Z, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:36:40Z, description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, io.openshift.expose-services=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, release=1766032510, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, tcib_managed=true, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_id=tripleo_step4, io.buildah.version=1.41.5, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller)
Feb 23 08:40:39 np0005626463.localdomain podman[96257]: unhealthy
Feb 23 08:40:39 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Main process exited, code=exited, status=1/FAILURE
Feb 23 08:40:39 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Failed with result 'exit-code'.
Feb 23 08:40:39 np0005626463.localdomain podman[96258]: 2026-02-23 08:40:39.431281556 +0000 UTC m=+0.214222081 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-type=git, config_id=tripleo_step4, container_name=ovn_metadata_agent, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, release=1766032510, build-date=2026-01-12T22:56:19Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', 
'/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.41.5, distribution-scope=public, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:56:19Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, url=https://www.redhat.com, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Feb 23 08:40:39 np0005626463.localdomain sshd[96320]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:40:39 np0005626463.localdomain systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Deactivated successfully.
Feb 23 08:40:39 np0005626463.localdomain podman[96288]: 2026-02-23 08:40:39.470692045 +0000 UTC m=+0.155478522 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.component=openstack-qdrouterd-container, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, 
build-date=2026-01-12T22:10:14Z, version=17.1.13, io.buildah.version=1.41.5, tcib_managed=true, release=1766032510, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, vcs-type=git, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., batch=17.1_20260112.1)
Feb 23 08:40:39 np0005626463.localdomain podman[96288]: 2026-02-23 08:40:39.672357081 +0000 UTC m=+0.357143588 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, build-date=2026-01-12T22:10:14Z, release=1766032510, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.component=openstack-qdrouterd-container, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, version=17.1.13, vcs-type=git, io.openshift.expose-services=, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, config_id=tripleo_step1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, container_name=metrics_qdr, architecture=x86_64, batch=17.1_20260112.1, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc.)
Feb 23 08:40:39 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.
Feb 23 08:40:39 np0005626463.localdomain systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully.
Feb 23 08:40:39 np0005626463.localdomain sudo[96338]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 08:40:39 np0005626463.localdomain sudo[96338]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:40:39 np0005626463.localdomain sudo[96338]: pam_unix(sudo:session): session closed for user root
Feb 23 08:40:39 np0005626463.localdomain podman[96342]: 2026-02-23 08:40:39.76609181 +0000 UTC m=+0.068077443 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., architecture=x86_64, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-nova-compute, version=17.1.13, vcs-type=git, 
build-date=2026-01-12T23:32:04Z, url=https://www.redhat.com, batch=17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5)
Feb 23 08:40:39 np0005626463.localdomain sudo[96375]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 check-host
Feb 23 08:40:39 np0005626463.localdomain sudo[96375]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:40:40 np0005626463.localdomain podman[96342]: 2026-02-23 08:40:40.204276827 +0000 UTC m=+0.506262460 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., tcib_managed=true, maintainer=OpenStack TripleO Team, architecture=x86_64, url=https://www.redhat.com, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, release=1766032510, managed_by=tripleo_ansible, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, build-date=2026-01-12T23:32:04Z, container_name=nova_migration_target, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1)
Feb 23 08:40:40 np0005626463.localdomain systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully.
Feb 23 08:40:40 np0005626463.localdomain sudo[96375]: pam_unix(sudo:session): session closed for user root
Feb 23 08:40:40 np0005626463.localdomain systemd[1]: tmp-crun.qwGcsr.mount: Deactivated successfully.
Feb 23 08:40:40 np0005626463.localdomain sudo[96412]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 08:40:40 np0005626463.localdomain sudo[96412]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:40:40 np0005626463.localdomain sudo[96412]: pam_unix(sudo:session): session closed for user root
Feb 23 08:40:40 np0005626463.localdomain sudo[96427]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 23 08:40:40 np0005626463.localdomain sudo[96427]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:40:40 np0005626463.localdomain sshd[96320]: Connection closed by authenticating user root 116.255.155.36 port 52364 [preauth]
Feb 23 08:40:40 np0005626463.localdomain sshd[96457]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:40:41 np0005626463.localdomain sudo[96427]: pam_unix(sudo:session): session closed for user root
Feb 23 08:40:41 np0005626463.localdomain sudo[96476]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 08:40:41 np0005626463.localdomain sudo[96476]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:40:41 np0005626463.localdomain sudo[96476]: pam_unix(sudo:session): session closed for user root
Feb 23 08:40:42 np0005626463.localdomain sshd[96457]: Connection closed by authenticating user root 116.255.155.36 port 53640 [preauth]
Feb 23 08:40:42 np0005626463.localdomain sshd[96491]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:40:42 np0005626463.localdomain sshd[96493]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:40:43 np0005626463.localdomain sshd[96491]: Connection closed by authenticating user root 116.255.155.36 port 54838 [preauth]
Feb 23 08:40:43 np0005626463.localdomain sshd[96493]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:40:44 np0005626463.localdomain sshd[96495]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:40:45 np0005626463.localdomain sshd[96495]: Connection closed by authenticating user root 116.255.155.36 port 56010 [preauth]
Feb 23 08:40:45 np0005626463.localdomain sshd[96497]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:40:46 np0005626463.localdomain sshd[96497]: Connection closed by authenticating user root 116.255.155.36 port 57146 [preauth]
Feb 23 08:40:46 np0005626463.localdomain sshd[96499]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:40:48 np0005626463.localdomain sshd[96499]: Connection closed by authenticating user root 116.255.155.36 port 58292 [preauth]
Feb 23 08:40:48 np0005626463.localdomain sshd[96501]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:40:48 np0005626463.localdomain sshd[96502]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:40:48 np0005626463.localdomain sshd[96502]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:40:49 np0005626463.localdomain sshd[96501]: Connection closed by authenticating user root 116.255.155.36 port 59656 [preauth]
Feb 23 08:40:50 np0005626463.localdomain sshd[96505]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:40:51 np0005626463.localdomain sshd[96505]: Connection closed by authenticating user root 116.255.155.36 port 32900 [preauth]
Feb 23 08:40:51 np0005626463.localdomain sshd[96507]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:40:53 np0005626463.localdomain sshd[96507]: Connection closed by authenticating user root 116.255.155.36 port 34156 [preauth]
Feb 23 08:40:53 np0005626463.localdomain sshd[96509]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:40:54 np0005626463.localdomain sshd[96509]: Connection closed by authenticating user root 116.255.155.36 port 35436 [preauth]
Feb 23 08:40:54 np0005626463.localdomain sshd[96511]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:40:56 np0005626463.localdomain sshd[96511]: Connection closed by authenticating user root 116.255.155.36 port 36674 [preauth]
Feb 23 08:40:56 np0005626463.localdomain sshd[96513]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:40:57 np0005626463.localdomain sshd[96513]: Connection closed by authenticating user root 116.255.155.36 port 38028 [preauth]
Feb 23 08:40:58 np0005626463.localdomain sshd[96515]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:40:59 np0005626463.localdomain sshd[96515]: Connection closed by authenticating user root 116.255.155.36 port 39450 [preauth]
Feb 23 08:40:59 np0005626463.localdomain sshd[96517]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:41:00 np0005626463.localdomain sshd[96517]: Connection closed by authenticating user root 116.255.155.36 port 40760 [preauth]
Feb 23 08:41:00 np0005626463.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Feb 23 08:41:00 np0005626463.localdomain recover_tripleo_nova_virtqemud[96520]: 61982
Feb 23 08:41:00 np0005626463.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Feb 23 08:41:00 np0005626463.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Feb 23 08:41:01 np0005626463.localdomain sshd[96521]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:41:01 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.
Feb 23 08:41:01 np0005626463.localdomain systemd[1]: tmp-crun.79YQ1d.mount: Deactivated successfully.
Feb 23 08:41:01 np0005626463.localdomain podman[96523]: 2026-02-23 08:41:01.919446843 +0000 UTC m=+0.087725702 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.component=openstack-collectd-container, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, 
url=https://www.redhat.com, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., tcib_managed=true, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20260112.1, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_id=tripleo_step3, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, io.buildah.version=1.41.5, architecture=x86_64, build-date=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, managed_by=tripleo_ansible)
Feb 23 08:41:01 np0005626463.localdomain podman[96523]: 2026-02-23 08:41:01.930828181 +0000 UTC m=+0.099107020 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, tcib_managed=true, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-collectd-container, maintainer=OpenStack TripleO Team, vcs-type=git, distribution-scope=public, container_name=collectd, architecture=x86_64, version=17.1.13, batch=17.1_20260112.1, url=https://www.redhat.com, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:15Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, release=1766032510, config_id=tripleo_step3, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:10:15Z, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 23 08:41:01 np0005626463.localdomain systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully.
Feb 23 08:41:02 np0005626463.localdomain sshd[96521]: Connection closed by authenticating user root 116.255.155.36 port 41852 [preauth]
Feb 23 08:41:02 np0005626463.localdomain sshd[96544]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:41:04 np0005626463.localdomain sshd[96544]: Connection closed by authenticating user root 116.255.155.36 port 43114 [preauth]
Feb 23 08:41:04 np0005626463.localdomain sshd[96546]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:41:05 np0005626463.localdomain sshd[96546]: Connection closed by authenticating user root 116.255.155.36 port 44352 [preauth]
Feb 23 08:41:05 np0005626463.localdomain sshd[96548]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:41:06 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.
Feb 23 08:41:06 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.
Feb 23 08:41:06 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.
Feb 23 08:41:06 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.
Feb 23 08:41:06 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.
Feb 23 08:41:06 np0005626463.localdomain podman[96551]: 2026-02-23 08:41:06.935893764 +0000 UTC m=+0.104079493 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, version=17.1.13, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, url=https://www.redhat.com, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, com.redhat.component=openstack-ceilometer-compute-container, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T23:07:47Z, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, maintainer=OpenStack TripleO Team, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, build-date=2026-01-12T23:07:47Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-ceilometer-compute, managed_by=tripleo_ansible, container_name=ceilometer_agent_compute)
Feb 23 08:41:06 np0005626463.localdomain systemd[1]: tmp-crun.mEx6L6.mount: Deactivated successfully.
Feb 23 08:41:06 np0005626463.localdomain podman[96551]: 2026-02-23 08:41:06.983446952 +0000 UTC m=+0.151632711 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, url=https://www.redhat.com, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, batch=17.1_20260112.1, release=1766032510, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, distribution-scope=public, build-date=2026-01-12T23:07:47Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, 
konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-compute-container, architecture=x86_64, container_name=ceilometer_agent_compute, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-ceilometer-compute)
Feb 23 08:41:06 np0005626463.localdomain systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully.
Feb 23 08:41:07 np0005626463.localdomain podman[96550]: 2026-02-23 08:41:07.038644714 +0000 UTC m=+0.206812973 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, release=1766032510, io.openshift.expose-services=, container_name=iscsid, version=17.1.13, vcs-ref=705339545363fec600102567c4e923938e0f43b3, batch=17.1_20260112.1, io.buildah.version=1.41.5, config_id=tripleo_step3, architecture=x86_64, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:34:43Z, name=rhosp-rhel9/openstack-iscsid, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, managed_by=tripleo_ansible, com.redhat.component=openstack-iscsid-container)
Feb 23 08:41:07 np0005626463.localdomain podman[96553]: 2026-02-23 08:41:06.985410075 +0000 UTC m=+0.144642559 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.component=openstack-cron-container, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-type=git, tcib_managed=true, managed_by=tripleo_ansible, 
description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, name=rhosp-rhel9/openstack-cron, container_name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, build-date=2026-01-12T22:10:15Z, vendor=Red Hat, Inc., architecture=x86_64, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=)
Feb 23 08:41:07 np0005626463.localdomain podman[96552]: 2026-02-23 08:41:07.086999838 +0000 UTC m=+0.249342171 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, build-date=2026-01-12T23:07:30Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-ipmi-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.5, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, maintainer=OpenStack TripleO 
Team, vendor=Red Hat, Inc., tcib_managed=true, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, org.opencontainers.image.created=2026-01-12T23:07:30Z, name=rhosp-rhel9/openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1766032510, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, config_id=tripleo_step4, url=https://www.redhat.com)
Feb 23 08:41:07 np0005626463.localdomain podman[96550]: 2026-02-23 08:41:07.105079436 +0000 UTC m=+0.273247705 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, tcib_managed=true, config_id=tripleo_step3, distribution-scope=public, build-date=2026-01-12T22:34:43Z, container_name=iscsid, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, vcs-type=git, name=rhosp-rhel9/openstack-iscsid, 
com.redhat.component=openstack-iscsid-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, batch=17.1_20260112.1, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, release=1766032510, vcs-ref=705339545363fec600102567c4e923938e0f43b3)
Feb 23 08:41:07 np0005626463.localdomain podman[96552]: 2026-02-23 08:41:07.116313034 +0000 UTC m=+0.278655317 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:07:30Z, release=1766032510, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:30Z, version=17.1.13, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, distribution-scope=public)
Feb 23 08:41:07 np0005626463.localdomain systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully.
Feb 23 08:41:07 np0005626463.localdomain sshd[96548]: Connection closed by authenticating user root 116.255.155.36 port 45594 [preauth]
Feb 23 08:41:07 np0005626463.localdomain podman[96553]: 2026-02-23 08:41:07.132368367 +0000 UTC m=+0.291600841 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, org.opencontainers.image.created=2026-01-12T22:10:15Z, vendor=Red Hat, Inc., managed_by=tripleo_ansible, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, name=rhosp-rhel9/openstack-cron, maintainer=OpenStack TripleO Team, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, 
vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:10:15Z, architecture=x86_64, io.openshift.expose-services=, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.component=openstack-cron-container, summary=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, container_name=logrotate_crond, distribution-scope=public, release=1766032510)
Feb 23 08:41:07 np0005626463.localdomain systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully.
Feb 23 08:41:07 np0005626463.localdomain systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully.
Feb 23 08:41:07 np0005626463.localdomain podman[96564]: 2026-02-23 08:41:07.192766205 +0000 UTC m=+0.348723473 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', 
'/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T23:32:04Z, batch=17.1_20260112.1, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, url=https://www.redhat.com, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, release=1766032510, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-nova-compute, version=17.1.13, config_id=tripleo_step5, io.buildah.version=1.41.5, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_compute)
Feb 23 08:41:07 np0005626463.localdomain podman[96564]: 2026-02-23 08:41:07.253436322 +0000 UTC m=+0.409393570 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, distribution-scope=public, architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.5, tcib_managed=true, vcs-type=git, config_id=tripleo_step5, io.openshift.expose-services=, release=1766032510, build-date=2026-01-12T23:32:04Z)
Feb 23 08:41:07 np0005626463.localdomain systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Deactivated successfully.
Feb 23 08:41:07 np0005626463.localdomain sshd[96671]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:41:07 np0005626463.localdomain systemd[1]: tmp-crun.E3fNVO.mount: Deactivated successfully.
Feb 23 08:41:08 np0005626463.localdomain sshd[96671]: Connection closed by authenticating user root 116.255.155.36 port 46768 [preauth]
Feb 23 08:41:09 np0005626463.localdomain sshd[96673]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:41:09 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.
Feb 23 08:41:09 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.
Feb 23 08:41:09 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.
Feb 23 08:41:09 np0005626463.localdomain systemd[1]: tmp-crun.VsHCGf.mount: Deactivated successfully.
Feb 23 08:41:09 np0005626463.localdomain podman[96676]: 2026-02-23 08:41:09.930028689 +0000 UTC m=+0.100454118 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, release=1766032510, vcs-type=git, config_id=tripleo_step4, tcib_managed=true, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, vendor=Red Hat, Inc., io.buildah.version=1.41.5, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, org.opencontainers.image.created=2026-01-12T22:56:19Z, batch=17.1_20260112.1, architecture=x86_64, container_name=ovn_metadata_agent)
Feb 23 08:41:09 np0005626463.localdomain podman[96675]: 2026-02-23 08:41:09.976260186 +0000 UTC m=+0.146832770 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.openshift.expose-services=, distribution-scope=public, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, release=1766032510, container_name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, vendor=Red Hat, Inc., org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, managed_by=tripleo_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, 
konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:36:40Z, vcs-type=git, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, io.buildah.version=1.41.5, url=https://www.redhat.com, batch=17.1_20260112.1)
Feb 23 08:41:10 np0005626463.localdomain podman[96675]: 2026-02-23 08:41:10.016914393 +0000 UTC m=+0.187486977 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, architecture=x86_64, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-ovn-controller, build-date=2026-01-12T22:36:40Z, managed_by=tripleo_ansible, version=17.1.13, description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.buildah.version=1.41.5, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:36:40Z, 
release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=ovn_controller, batch=17.1_20260112.1, config_id=tripleo_step4, tcib_managed=true)
Feb 23 08:41:10 np0005626463.localdomain podman[96675]: unhealthy
Feb 23 08:41:10 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Main process exited, code=exited, status=1/FAILURE
Feb 23 08:41:10 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Failed with result 'exit-code'.
Feb 23 08:41:10 np0005626463.localdomain podman[96677]: 2026-02-23 08:41:10.033831243 +0000 UTC m=+0.201437232 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, version=17.1.13, release=1766032510, vcs-type=git, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:10:14Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, build-date=2026-01-12T22:10:14Z, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, managed_by=tripleo_ansible, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, architecture=x86_64, container_name=metrics_qdr, name=rhosp-rhel9/openstack-qdrouterd, config_id=tripleo_step1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']})
Feb 23 08:41:10 np0005626463.localdomain podman[96676]: 2026-02-23 08:41:10.052162049 +0000 UTC m=+0.222587448 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-neutron-metadata-agent-ovn, architecture=x86_64, config_id=tripleo_step4, io.openshift.expose-services=, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:56:19Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:56:19Z, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, tcib_managed=true, version=17.1.13, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, release=1766032510, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git)
Feb 23 08:41:10 np0005626463.localdomain podman[96676]: unhealthy
Feb 23 08:41:10 np0005626463.localdomain systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Main process exited, code=exited, status=1/FAILURE
Feb 23 08:41:10 np0005626463.localdomain systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Failed with result 'exit-code'.
Feb 23 08:41:10 np0005626463.localdomain podman[96677]: 2026-02-23 08:41:10.240895193 +0000 UTC m=+0.408501172 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-qdrouterd, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step1, org.opencontainers.image.created=2026-01-12T22:10:14Z, maintainer=OpenStack TripleO Team, distribution-scope=public, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=metrics_qdr, io.buildah.version=1.41.5, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, io.openshift.expose-services=, version=17.1.13, build-date=2026-01-12T22:10:14Z)
Feb 23 08:41:10 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.
Feb 23 08:41:10 np0005626463.localdomain systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully.
Feb 23 08:41:10 np0005626463.localdomain podman[96742]: 2026-02-23 08:41:10.35198156 +0000 UTC m=+0.082411593 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, managed_by=tripleo_ansible, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.buildah.version=1.41.5, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, build-date=2026-01-12T23:32:04Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.expose-services=, version=17.1.13, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-nova-compute, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T23:32:04Z, batch=17.1_20260112.1, container_name=nova_migration_target, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe)
Feb 23 08:41:10 np0005626463.localdomain sshd[96673]: Connection closed by authenticating user root 116.255.155.36 port 48028 [preauth]
Feb 23 08:41:10 np0005626463.localdomain sshd[96765]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:41:10 np0005626463.localdomain podman[96742]: 2026-02-23 08:41:10.747644291 +0000 UTC m=+0.478074324 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vcs-type=git, config_id=tripleo_step4, build-date=2026-01-12T23:32:04Z, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, url=https://www.redhat.com, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, container_name=nova_migration_target, org.opencontainers.image.created=2026-01-12T23:32:04Z, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp-rhel9/openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, version=17.1.13)
Feb 23 08:41:10 np0005626463.localdomain systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully.
Feb 23 08:41:12 np0005626463.localdomain sshd[96765]: Connection closed by authenticating user root 116.255.155.36 port 49274 [preauth]
Feb 23 08:41:12 np0005626463.localdomain sshd[96767]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:41:13 np0005626463.localdomain sshd[96767]: Connection closed by authenticating user root 116.255.155.36 port 50606 [preauth]
Feb 23 08:41:14 np0005626463.localdomain sshd[96769]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:41:15 np0005626463.localdomain sshd[96769]: Connection closed by authenticating user root 116.255.155.36 port 51986 [preauth]
Feb 23 08:41:18 np0005626463.localdomain sshd[96771]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:41:19 np0005626463.localdomain sshd[96771]: Connection closed by authenticating user root 116.255.155.36 port 53282 [preauth]
Feb 23 08:41:20 np0005626463.localdomain sshd[96773]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:41:21 np0005626463.localdomain sshd[96773]: Connection closed by authenticating user root 116.255.155.36 port 56966 [preauth]
Feb 23 08:41:21 np0005626463.localdomain sshd[96775]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:41:23 np0005626463.localdomain sshd[96775]: Connection closed by authenticating user root 116.255.155.36 port 58202 [preauth]
Feb 23 08:41:23 np0005626463.localdomain sshd[96777]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:41:24 np0005626463.localdomain sshd[96777]: Connection closed by authenticating user root 116.255.155.36 port 59420 [preauth]
Feb 23 08:41:24 np0005626463.localdomain sshd[96779]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:41:26 np0005626463.localdomain sshd[96779]: Connection closed by authenticating user root 116.255.155.36 port 60584 [preauth]
Feb 23 08:41:26 np0005626463.localdomain sshd[96781]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:41:27 np0005626463.localdomain sshd[96783]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:41:27 np0005626463.localdomain sshd[96781]: Connection closed by authenticating user root 116.255.155.36 port 33490 [preauth]
Feb 23 08:41:27 np0005626463.localdomain sshd[96785]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:41:28 np0005626463.localdomain sshd[96787]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:41:28 np0005626463.localdomain sshd[96787]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:41:29 np0005626463.localdomain sshd[96785]: Connection closed by authenticating user root 116.255.155.36 port 34726 [preauth]
Feb 23 08:41:29 np0005626463.localdomain sshd[96789]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:41:29 np0005626463.localdomain sshd[96783]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:41:30 np0005626463.localdomain sshd[96789]: Connection closed by authenticating user root 116.255.155.36 port 35878 [preauth]
Feb 23 08:41:30 np0005626463.localdomain sshd[96791]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:41:32 np0005626463.localdomain sshd[96791]: Connection closed by authenticating user root 116.255.155.36 port 37072 [preauth]
Feb 23 08:41:32 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.
Feb 23 08:41:32 np0005626463.localdomain systemd[1]: tmp-crun.Q8VtMk.mount: Deactivated successfully.
Feb 23 08:41:32 np0005626463.localdomain podman[96793]: 2026-02-23 08:41:32.190828963 +0000 UTC m=+0.097144312 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, config_id=tripleo_step3, com.redhat.component=openstack-collectd-container, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, name=rhosp-rhel9/openstack-collectd, managed_by=tripleo_ansible, url=https://www.redhat.com, io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, release=1766032510, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:15Z, build-date=2026-01-12T22:10:15Z, container_name=collectd)
Feb 23 08:41:32 np0005626463.localdomain podman[96793]: 2026-02-23 08:41:32.202226537 +0000 UTC m=+0.108541906 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, url=https://www.redhat.com, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-collectd, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, distribution-scope=public, architecture=x86_64, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, com.redhat.component=openstack-collectd-container, vendor=Red Hat, Inc., build-date=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, org.opencontainers.image.created=2026-01-12T22:10:15Z, release=1766032510, vcs-type=git, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 23 08:41:32 np0005626463.localdomain systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully.
Feb 23 08:41:32 np0005626463.localdomain sshd[96815]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:41:33 np0005626463.localdomain sshd[96815]: Connection closed by authenticating user root 116.255.155.36 port 38384 [preauth]
Feb 23 08:41:34 np0005626463.localdomain sshd[96817]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:41:35 np0005626463.localdomain sshd[96817]: Connection closed by authenticating user root 116.255.155.36 port 39650 [preauth]
Feb 23 08:41:36 np0005626463.localdomain sshd[96819]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:41:37 np0005626463.localdomain sshd[96819]: Connection closed by authenticating user root 116.255.155.36 port 40980 [preauth]
Feb 23 08:41:37 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.
Feb 23 08:41:37 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.
Feb 23 08:41:37 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.
Feb 23 08:41:37 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.
Feb 23 08:41:37 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.
Feb 23 08:41:37 np0005626463.localdomain podman[96821]: 2026-02-23 08:41:37.656177988 +0000 UTC m=+0.093575289 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, build-date=2026-01-12T22:34:43Z, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, batch=17.1_20260112.1, 
org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-iscsid, version=17.1.13, container_name=iscsid, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=705339545363fec600102567c4e923938e0f43b3, vcs-type=git)
Feb 23 08:41:37 np0005626463.localdomain systemd[1]: tmp-crun.jNbHDe.mount: Deactivated successfully.
Feb 23 08:41:37 np0005626463.localdomain podman[96821]: 2026-02-23 08:41:37.700405059 +0000 UTC m=+0.137802390 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, container_name=iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:34:43Z, vcs-ref=705339545363fec600102567c4e923938e0f43b3, vendor=Red Hat, Inc., build-date=2026-01-12T22:34:43Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-iscsid-container, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 iscsid, 
io.openshift.expose-services=, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, vcs-type=git, config_id=tripleo_step3, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid)
Feb 23 08:41:37 np0005626463.localdomain systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully.
Feb 23 08:41:37 np0005626463.localdomain podman[96822]: 2026-02-23 08:41:37.718397944 +0000 UTC m=+0.152875622 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, container_name=ceilometer_agent_compute, 
maintainer=OpenStack TripleO Team, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, managed_by=tripleo_ansible, io.buildah.version=1.41.5, com.redhat.component=openstack-ceilometer-compute-container, distribution-scope=public, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, name=rhosp-rhel9/openstack-ceilometer-compute, architecture=x86_64, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20260112.1, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, build-date=2026-01-12T23:07:47Z)
Feb 23 08:41:37 np0005626463.localdomain podman[96830]: 2026-02-23 08:41:37.680193024 +0000 UTC m=+0.102467752 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, config_id=tripleo_step5, url=https://www.redhat.com, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, vcs-type=git, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_compute, io.buildah.version=1.41.5, distribution-scope=public, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1766032510)
Feb 23 08:41:37 np0005626463.localdomain podman[96830]: 2026-02-23 08:41:37.759089493 +0000 UTC m=+0.181364241 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, architecture=x86_64, build-date=2026-01-12T23:32:04Z, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, io.buildah.version=1.41.5, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_compute, config_id=tripleo_step5, name=rhosp-rhel9/openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe)
Feb 23 08:41:37 np0005626463.localdomain systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Deactivated successfully.
Feb 23 08:41:37 np0005626463.localdomain podman[96824]: 2026-02-23 08:41:37.777386267 +0000 UTC m=+0.203539949 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, org.opencontainers.image.created=2026-01-12T22:10:15Z, build-date=2026-01-12T22:10:15Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, io.openshift.expose-services=, config_id=tripleo_step4, batch=17.1_20260112.1, com.redhat.component=openstack-cron-container, distribution-scope=public, 
vcs-type=git, name=rhosp-rhel9/openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=logrotate_crond, version=17.1.13)
Feb 23 08:41:37 np0005626463.localdomain podman[96822]: 2026-02-23 08:41:37.781429566 +0000 UTC m=+0.215907214 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, distribution-scope=public, vendor=Red Hat, Inc., managed_by=tripleo_ansible, tcib_managed=true, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-ceilometer-compute-container, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_id=tripleo_step4, container_name=ceilometer_agent_compute, io.buildah.version=1.41.5, build-date=2026-01-12T23:07:47Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:47Z, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Feb 23 08:41:37 np0005626463.localdomain systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully.
Feb 23 08:41:37 np0005626463.localdomain podman[96824]: 2026-02-23 08:41:37.817282081 +0000 UTC m=+0.243435813 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, release=1766032510, org.opencontainers.image.created=2026-01-12T22:10:15Z, managed_by=tripleo_ansible, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-cron-container, io.openshift.expose-services=, name=rhosp-rhel9/openstack-cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, summary=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, distribution-scope=public, config_id=tripleo_step4, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, build-date=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron)
Feb 23 08:41:37 np0005626463.localdomain systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully.
Feb 23 08:41:37 np0005626463.localdomain podman[96823]: 2026-02-23 08:41:37.871112069 +0000 UTC m=+0.301838787 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, batch=17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.expose-services=, distribution-scope=public, container_name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, vcs-type=git, release=1766032510, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, name=rhosp-rhel9/openstack-ceilometer-ipmi, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-ipmi-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, config_id=tripleo_step4, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc.)
Feb 23 08:41:37 np0005626463.localdomain podman[96823]: 2026-02-23 08:41:37.904313539 +0000 UTC m=+0.335040217 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, architecture=x86_64, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:07:30Z, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, batch=17.1_20260112.1, release=1766032510, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, container_name=ceilometer_agent_ipmi)
Feb 23 08:41:37 np0005626463.localdomain systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully.
Feb 23 08:41:38 np0005626463.localdomain sshd[96934]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:41:39 np0005626463.localdomain sshd[96934]: Invalid user user from 116.255.155.36 port 43312
Feb 23 08:41:39 np0005626463.localdomain sshd[96934]: Connection closed by invalid user user 116.255.155.36 port 43312 [preauth]
Feb 23 08:41:40 np0005626463.localdomain sshd[96936]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:41:40 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.
Feb 23 08:41:40 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.
Feb 23 08:41:40 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.
Feb 23 08:41:40 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.
Feb 23 08:41:40 np0005626463.localdomain podman[96940]: 2026-02-23 08:41:40.914968671 +0000 UTC m=+0.080427219 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, vendor=Red Hat, Inc., build-date=2026-01-12T22:56:19Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, url=https://www.redhat.com, architecture=x86_64, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack 
osp-17.1 openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.expose-services=, tcib_managed=true, config_id=tripleo_step4, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, version=17.1.13, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team)
Feb 23 08:41:40 np0005626463.localdomain podman[96939]: 2026-02-23 08:41:40.985257004 +0000 UTC m=+0.153478960 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, version=17.1.13, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:36:40Z, config_id=tripleo_step4, com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., tcib_managed=true, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, build-date=2026-01-12T22:36:40Z, release=1766032510, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, 
managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, io.openshift.expose-services=, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c)
Feb 23 08:41:41 np0005626463.localdomain podman[96939]: 2026-02-23 08:41:41.038078891 +0000 UTC m=+0.206300827 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.openshift.expose-services=, config_id=tripleo_step4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.buildah.version=1.41.5, build-date=2026-01-12T22:36:40Z, com.redhat.component=openstack-ovn-controller-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, architecture=x86_64, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-ovn-controller, release=1766032510, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp 
openstack osp-17.1 openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, container_name=ovn_controller, version=17.1.13, batch=17.1_20260112.1)
Feb 23 08:41:41 np0005626463.localdomain podman[96939]: unhealthy
Feb 23 08:41:41 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Main process exited, code=exited, status=1/FAILURE
Feb 23 08:41:41 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Failed with result 'exit-code'.
Feb 23 08:41:41 np0005626463.localdomain podman[96944]: 2026-02-23 08:41:41.040483388 +0000 UTC m=+0.199192591 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, build-date=2026-01-12T22:10:14Z, vendor=Red Hat, Inc., architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, 
config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:14Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, vcs-type=git, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd)
Feb 23 08:41:41 np0005626463.localdomain podman[96938]: 2026-02-23 08:41:41.107472885 +0000 UTC m=+0.279726990 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step4, batch=17.1_20260112.1, url=https://www.redhat.com, vcs-type=git, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, version=17.1.13, tcib_managed=true, name=rhosp-rhel9/openstack-nova-compute, build-date=2026-01-12T23:32:04Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=nova_migration_target, org.opencontainers.image.created=2026-01-12T23:32:04Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']})
Feb 23 08:41:41 np0005626463.localdomain podman[96940]: 2026-02-23 08:41:41.161352936 +0000 UTC m=+0.326811514 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:56:19Z, vendor=Red Hat, Inc., tcib_managed=true, vcs-type=git, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_metadata_agent, url=https://www.redhat.com, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, version=17.1.13, architecture=x86_64, release=1766032510, build-date=2026-01-12T22:56:19Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1)
Feb 23 08:41:41 np0005626463.localdomain podman[96940]: unhealthy
Feb 23 08:41:41 np0005626463.localdomain systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Main process exited, code=exited, status=1/FAILURE
Feb 23 08:41:41 np0005626463.localdomain systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Failed with result 'exit-code'.
Feb 23 08:41:41 np0005626463.localdomain podman[96944]: 2026-02-23 08:41:41.269537599 +0000 UTC m=+0.428246812 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, config_id=tripleo_step1, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:10:14Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-qdrouterd, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, build-date=2026-01-12T22:10:14Z)
Feb 23 08:41:41 np0005626463.localdomain systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully.
Feb 23 08:41:41 np0005626463.localdomain sshd[96936]: Invalid user user from 116.255.155.36 port 44764
Feb 23 08:41:41 np0005626463.localdomain podman[96938]: 2026-02-23 08:41:41.518331052 +0000 UTC m=+0.690585177 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp-rhel9/openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, container_name=nova_migration_target, build-date=2026-01-12T23:32:04Z, architecture=x86_64, distribution-scope=public, version=17.1.13, com.redhat.component=openstack-nova-compute-container, 
config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, vcs-type=git, url=https://www.redhat.com, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 23 08:41:41 np0005626463.localdomain systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully.
Feb 23 08:41:41 np0005626463.localdomain sshd[96936]: Connection closed by invalid user user 116.255.155.36 port 44764 [preauth]
Feb 23 08:41:41 np0005626463.localdomain sshd[97027]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:41:42 np0005626463.localdomain sudo[97029]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 08:41:42 np0005626463.localdomain sudo[97029]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:41:42 np0005626463.localdomain sudo[97029]: pam_unix(sudo:session): session closed for user root
Feb 23 08:41:42 np0005626463.localdomain sudo[97044]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 23 08:41:42 np0005626463.localdomain sudo[97044]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:41:42 np0005626463.localdomain sshd[97027]: Invalid user user from 116.255.155.36 port 46038
Feb 23 08:41:43 np0005626463.localdomain sudo[97044]: pam_unix(sudo:session): session closed for user root
Feb 23 08:41:43 np0005626463.localdomain sshd[97027]: Connection closed by invalid user user 116.255.155.36 port 46038 [preauth]
Feb 23 08:41:43 np0005626463.localdomain sshd[97090]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:41:43 np0005626463.localdomain sudo[97092]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 08:41:43 np0005626463.localdomain sudo[97092]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:41:43 np0005626463.localdomain sudo[97092]: pam_unix(sudo:session): session closed for user root
Feb 23 08:41:44 np0005626463.localdomain sshd[97090]: Invalid user user from 116.255.155.36 port 47222
Feb 23 08:41:44 np0005626463.localdomain sshd[97090]: Connection closed by invalid user user 116.255.155.36 port 47222 [preauth]
Feb 23 08:41:44 np0005626463.localdomain sshd[97107]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:41:45 np0005626463.localdomain sshd[97107]: Invalid user user from 116.255.155.36 port 48370
Feb 23 08:41:46 np0005626463.localdomain sshd[97107]: Connection closed by invalid user user 116.255.155.36 port 48370 [preauth]
Feb 23 08:41:46 np0005626463.localdomain sshd[97109]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:41:47 np0005626463.localdomain sshd[97109]: Invalid user user from 116.255.155.36 port 49602
Feb 23 08:41:47 np0005626463.localdomain sshd[97109]: Connection closed by invalid user user 116.255.155.36 port 49602 [preauth]
Feb 23 08:41:47 np0005626463.localdomain sshd[97111]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:41:48 np0005626463.localdomain sshd[97111]: Invalid user user from 116.255.155.36 port 50740
Feb 23 08:41:49 np0005626463.localdomain sshd[97111]: Connection closed by invalid user user 116.255.155.36 port 50740 [preauth]
Feb 23 08:41:49 np0005626463.localdomain sshd[97113]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:41:50 np0005626463.localdomain sshd[97113]: Invalid user user from 116.255.155.36 port 52246
Feb 23 08:41:50 np0005626463.localdomain sshd[97113]: Connection closed by invalid user user 116.255.155.36 port 52246 [preauth]
Feb 23 08:41:50 np0005626463.localdomain sshd[97115]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:41:51 np0005626463.localdomain sshd[97115]: Invalid user user from 116.255.155.36 port 53292
Feb 23 08:41:52 np0005626463.localdomain sshd[97115]: Connection closed by invalid user user 116.255.155.36 port 53292 [preauth]
Feb 23 08:41:52 np0005626463.localdomain sshd[97117]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:41:53 np0005626463.localdomain sshd[97117]: Invalid user user from 116.255.155.36 port 54526
Feb 23 08:41:53 np0005626463.localdomain sshd[97117]: Connection closed by invalid user user 116.255.155.36 port 54526 [preauth]
Feb 23 08:41:54 np0005626463.localdomain sshd[97119]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:41:55 np0005626463.localdomain sshd[97119]: Invalid user user from 116.255.155.36 port 55672
Feb 23 08:41:55 np0005626463.localdomain sshd[97119]: Connection closed by invalid user user 116.255.155.36 port 55672 [preauth]
Feb 23 08:41:55 np0005626463.localdomain sshd[97121]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:41:56 np0005626463.localdomain sshd[97121]: Invalid user user from 116.255.155.36 port 56922
Feb 23 08:41:56 np0005626463.localdomain sshd[97121]: Connection closed by invalid user user 116.255.155.36 port 56922 [preauth]
Feb 23 08:41:57 np0005626463.localdomain sshd[97123]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:41:57 np0005626463.localdomain sshd[97123]: Invalid user user from 116.255.155.36 port 57982
Feb 23 08:41:58 np0005626463.localdomain sshd[97123]: Connection closed by invalid user user 116.255.155.36 port 57982 [preauth]
Feb 23 08:41:58 np0005626463.localdomain sshd[97125]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:41:59 np0005626463.localdomain sshd[97125]: Invalid user user from 116.255.155.36 port 59114
Feb 23 08:41:59 np0005626463.localdomain sshd[97125]: Connection closed by invalid user user 116.255.155.36 port 59114 [preauth]
Feb 23 08:41:59 np0005626463.localdomain sshd[97127]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:42:00 np0005626463.localdomain sshd[97127]: Invalid user user from 116.255.155.36 port 60222
Feb 23 08:42:00 np0005626463.localdomain sshd[97127]: Connection closed by invalid user user 116.255.155.36 port 60222 [preauth]
Feb 23 08:42:01 np0005626463.localdomain sshd[97129]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:42:01 np0005626463.localdomain sshd[97129]: Invalid user user from 116.255.155.36 port 33064
Feb 23 08:42:02 np0005626463.localdomain sshd[97129]: Connection closed by invalid user user 116.255.155.36 port 33064 [preauth]
Feb 23 08:42:02 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.
Feb 23 08:42:02 np0005626463.localdomain podman[97131]: 2026-02-23 08:42:02.351822452 +0000 UTC m=+0.095460979 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, name=rhosp-rhel9/openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=collectd, build-date=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., managed_by=tripleo_ansible, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_id=tripleo_step3, tcib_managed=true, distribution-scope=public, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, com.redhat.component=openstack-collectd-container, summary=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.5, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, version=17.1.13)
Feb 23 08:42:02 np0005626463.localdomain podman[97131]: 2026-02-23 08:42:02.363145203 +0000 UTC m=+0.106783750 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, io.openshift.expose-services=, build-date=2026-01-12T22:10:15Z, tcib_managed=true, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, managed_by=tripleo_ansible, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=collectd, config_id=tripleo_step3, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-collectd-container, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd)
Feb 23 08:42:02 np0005626463.localdomain systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully.
Feb 23 08:42:02 np0005626463.localdomain sshd[97151]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:42:04 np0005626463.localdomain sshd[97151]: Invalid user user from 116.255.155.36 port 34220
Feb 23 08:42:04 np0005626463.localdomain sshd[97151]: Connection closed by invalid user user 116.255.155.36 port 34220 [preauth]
Feb 23 08:42:04 np0005626463.localdomain sshd[97153]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:42:05 np0005626463.localdomain sshd[97153]: Invalid user user from 116.255.155.36 port 35698
Feb 23 08:42:05 np0005626463.localdomain sshd[97153]: Connection closed by invalid user user 116.255.155.36 port 35698 [preauth]
Feb 23 08:42:05 np0005626463.localdomain sshd[97155]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:42:06 np0005626463.localdomain sshd[97155]: Invalid user user from 116.255.155.36 port 36812
Feb 23 08:42:07 np0005626463.localdomain sshd[97155]: Connection closed by invalid user user 116.255.155.36 port 36812 [preauth]
Feb 23 08:42:07 np0005626463.localdomain sshd[97157]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:42:07 np0005626463.localdomain sshd[97159]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:42:07 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.
Feb 23 08:42:07 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.
Feb 23 08:42:07 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.
Feb 23 08:42:07 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.
Feb 23 08:42:07 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.
Feb 23 08:42:07 np0005626463.localdomain podman[97162]: 2026-02-23 08:42:07.979389437 +0000 UTC m=+0.146778238 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.expose-services=, name=rhosp-rhel9/openstack-ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=ceilometer_agent_compute, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:07:47Z, build-date=2026-01-12T23:07:47Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.buildah.version=1.41.5, vcs-type=git, batch=17.1_20260112.1, release=1766032510, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, tcib_managed=true)
Feb 23 08:42:07 np0005626463.localdomain podman[97161]: 2026-02-23 08:42:07.93719709 +0000 UTC m=+0.110361765 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, name=rhosp-rhel9/openstack-iscsid, com.redhat.component=openstack-iscsid-container, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, batch=17.1_20260112.1, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, build-date=2026-01-12T22:34:43Z, container_name=iscsid, config_id=tripleo_step3, vcs-ref=705339545363fec600102567c4e923938e0f43b3, release=1766032510, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, vcs-type=git, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:34:43Z)
Feb 23 08:42:08 np0005626463.localdomain podman[97162]: 2026-02-23 08:42:08.035485218 +0000 UTC m=+0.202873969 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, tcib_managed=true, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, build-date=2026-01-12T23:07:47Z, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_compute, architecture=x86_64, io.openshift.expose-services=, url=https://www.redhat.com, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., release=1766032510, managed_by=tripleo_ansible, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Feb 23 08:42:08 np0005626463.localdomain podman[97161]: 2026-02-23 08:42:08.070007919 +0000 UTC m=+0.243172544 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, name=rhosp-rhel9/openstack-iscsid, com.redhat.component=openstack-iscsid-container, io.buildah.version=1.41.5, managed_by=tripleo_ansible, config_id=tripleo_step3, release=1766032510, build-date=2026-01-12T22:34:43Z, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.openshift.expose-services=, tcib_managed=true, version=17.1.13, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:34:43Z, vcs-ref=705339545363fec600102567c4e923938e0f43b3, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, container_name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3)
Feb 23 08:42:08 np0005626463.localdomain systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully.
Feb 23 08:42:08 np0005626463.localdomain podman[97198]: 2026-02-23 08:42:08.046691825 +0000 UTC m=+0.100470088 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-ceilometer-ipmi-container, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, container_name=ceilometer_agent_ipmi, batch=17.1_20260112.1, vcs-type=git, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, tcib_managed=true, 
managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:07:30Z, vendor=Red Hat, Inc., build-date=2026-01-12T23:07:30Z, name=rhosp-rhel9/openstack-ceilometer-ipmi, release=1766032510, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi)
Feb 23 08:42:08 np0005626463.localdomain podman[97198]: 2026-02-23 08:42:08.127118143 +0000 UTC m=+0.180896386 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, build-date=2026-01-12T23:07:30Z, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-ceilometer-ipmi, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T23:07:30Z, tcib_managed=true, url=https://www.redhat.com, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13, release=1766032510, config_id=tripleo_step4, distribution-scope=public, container_name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-ceilometer-ipmi-container, batch=17.1_20260112.1)
Feb 23 08:42:08 np0005626463.localdomain systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully.
Feb 23 08:42:08 np0005626463.localdomain podman[97199]: 2026-02-23 08:42:08.143131594 +0000 UTC m=+0.192572508 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20260112.1, url=https://www.redhat.com, architecture=x86_64, maintainer=OpenStack TripleO Team, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z, release=1766032510, version=17.1.13, tcib_managed=true, com.redhat.component=openstack-cron-container, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 cron, 
container_name=logrotate_crond, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-cron, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, config_id=tripleo_step4, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron)
Feb 23 08:42:08 np0005626463.localdomain systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully.
Feb 23 08:42:08 np0005626463.localdomain podman[97163]: 2026-02-23 08:42:08.078183011 +0000 UTC m=+0.243434983 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, container_name=nova_compute, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1766032510, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, build-date=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, tcib_managed=true, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, config_id=tripleo_step5, url=https://www.redhat.com, managed_by=tripleo_ansible)
Feb 23 08:42:08 np0005626463.localdomain podman[97199]: 2026-02-23 08:42:08.201977822 +0000 UTC m=+0.251418716 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, managed_by=tripleo_ansible, url=https://www.redhat.com, name=rhosp-rhel9/openstack-cron, config_id=tripleo_step4, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, batch=17.1_20260112.1, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, container_name=logrotate_crond, org.opencontainers.image.created=2026-01-12T22:10:15Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, 
com.redhat.component=openstack-cron-container, build-date=2026-01-12T22:10:15Z, version=17.1.13, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron)
Feb 23 08:42:08 np0005626463.localdomain podman[97163]: 2026-02-23 08:42:08.209565345 +0000 UTC m=+0.374817267 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.buildah.version=1.41.5, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, version=17.1.13, config_id=tripleo_step5, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, managed_by=tripleo_ansible, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T23:32:04Z, container_name=nova_compute, name=rhosp-rhel9/openstack-nova-compute, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:32:04Z)
Feb 23 08:42:08 np0005626463.localdomain systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully.
Feb 23 08:42:08 np0005626463.localdomain systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Deactivated successfully.
Feb 23 08:42:08 np0005626463.localdomain sshd[97157]: Invalid user user from 116.255.155.36 port 37868
Feb 23 08:42:08 np0005626463.localdomain sshd[97157]: Connection closed by invalid user user 116.255.155.36 port 37868 [preauth]
Feb 23 08:42:08 np0005626463.localdomain sshd[97159]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:42:08 np0005626463.localdomain sshd[97270]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:42:09 np0005626463.localdomain sshd[97270]: Invalid user user from 116.255.155.36 port 39290
Feb 23 08:42:10 np0005626463.localdomain sshd[97270]: Connection closed by invalid user user 116.255.155.36 port 39290 [preauth]
Feb 23 08:42:10 np0005626463.localdomain sshd[97272]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:42:11 np0005626463.localdomain sshd[97272]: Invalid user user from 116.255.155.36 port 40590
Feb 23 08:42:11 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.
Feb 23 08:42:11 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.
Feb 23 08:42:11 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.
Feb 23 08:42:11 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.
Feb 23 08:42:11 np0005626463.localdomain podman[97274]: 2026-02-23 08:42:11.720132334 +0000 UTC m=+0.091046317 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, architecture=x86_64, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.expose-services=, 
org.opencontainers.image.created=2026-01-12T23:32:04Z, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, tcib_managed=true, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z)
Feb 23 08:42:11 np0005626463.localdomain podman[97275]: 2026-02-23 08:42:11.781831464 +0000 UTC m=+0.146589651 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.component=openstack-ovn-controller-container, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2026-01-12T22:36:40Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, name=rhosp-rhel9/openstack-ovn-controller, version=17.1.13, io.openshift.expose-services=, release=1766032510, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-ovn-controller, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:36:40Z, managed_by=tripleo_ansible, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git)
Feb 23 08:42:11 np0005626463.localdomain podman[97276]: 2026-02-23 08:42:11.855301959 +0000 UTC m=+0.219535469 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, org.opencontainers.image.created=2026-01-12T22:56:19Z, build-date=2026-01-12T22:56:19Z, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, version=17.1.13, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, config_id=tripleo_step4, release=1766032510, tcib_managed=true, container_name=ovn_metadata_agent, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn)
Feb 23 08:42:11 np0005626463.localdomain sshd[97272]: Connection closed by invalid user user 116.255.155.36 port 40590 [preauth]
Feb 23 08:42:11 np0005626463.localdomain podman[97276]: 2026-02-23 08:42:11.893947103 +0000 UTC m=+0.258180613 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, container_name=ovn_metadata_agent, build-date=2026-01-12T22:56:19Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, tcib_managed=true, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, distribution-scope=public, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-type=git, release=1766032510, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 
'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, config_id=tripleo_step4, managed_by=tripleo_ansible, architecture=x86_64)
Feb 23 08:42:11 np0005626463.localdomain podman[97276]: unhealthy
Feb 23 08:42:11 np0005626463.localdomain systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Main process exited, code=exited, status=1/FAILURE
Feb 23 08:42:11 np0005626463.localdomain systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Failed with result 'exit-code'.
Feb 23 08:42:11 np0005626463.localdomain podman[97280]: 2026-02-23 08:42:11.945953314 +0000 UTC m=+0.304627137 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, tcib_managed=true, io.openshift.expose-services=, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step1, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, build-date=2026-01-12T22:10:14Z, vcs-type=git, architecture=x86_64, batch=17.1_20260112.1, com.redhat.component=openstack-qdrouterd-container, name=rhosp-rhel9/openstack-qdrouterd)
Feb 23 08:42:11 np0005626463.localdomain podman[97275]: 2026-02-23 08:42:11.964818735 +0000 UTC m=+0.329576852 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_controller, vendor=Red Hat, Inc., build-date=2026-01-12T22:36:40Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, distribution-scope=public, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:36:40Z, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.buildah.version=1.41.5, release=1766032510, config_id=tripleo_step4, io.openshift.expose-services=, architecture=x86_64, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, version=17.1.13, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, batch=17.1_20260112.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', 
'/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-ovn-controller)
Feb 23 08:42:11 np0005626463.localdomain podman[97275]: unhealthy
Feb 23 08:42:11 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Main process exited, code=exited, status=1/FAILURE
Feb 23 08:42:11 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Failed with result 'exit-code'.
Feb 23 08:42:12 np0005626463.localdomain podman[97274]: 2026-02-23 08:42:12.088199065 +0000 UTC m=+0.459113058 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, container_name=nova_migration_target, distribution-scope=public, build-date=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, config_id=tripleo_step4, io.openshift.expose-services=, tcib_managed=true, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-compute)
Feb 23 08:42:12 np0005626463.localdomain systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully.
Feb 23 08:42:12 np0005626463.localdomain podman[97280]: 2026-02-23 08:42:12.134999458 +0000 UTC m=+0.493673191 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-qdrouterd, build-date=2026-01-12T22:10:14Z, org.opencontainers.image.created=2026-01-12T22:10:14Z, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, batch=17.1_20260112.1, managed_by=tripleo_ansible, vcs-type=git, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step1, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, tcib_managed=true, version=17.1.13, com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc.)
Feb 23 08:42:12 np0005626463.localdomain sshd[97366]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:42:12 np0005626463.localdomain systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully.
Feb 23 08:42:12 np0005626463.localdomain systemd[1]: tmp-crun.jSyDl0.mount: Deactivated successfully.
Feb 23 08:42:13 np0005626463.localdomain sshd[97366]: Invalid user user from 116.255.155.36 port 41858
Feb 23 08:42:13 np0005626463.localdomain sshd[97366]: Connection closed by invalid user user 116.255.155.36 port 41858 [preauth]
Feb 23 08:42:13 np0005626463.localdomain sshd[97368]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:42:13 np0005626463.localdomain sshd[97370]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:42:13 np0005626463.localdomain sshd[97368]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:42:14 np0005626463.localdomain sshd[97370]: Invalid user user from 116.255.155.36 port 42968
Feb 23 08:42:15 np0005626463.localdomain sshd[97370]: Connection closed by invalid user user 116.255.155.36 port 42968 [preauth]
Feb 23 08:42:15 np0005626463.localdomain sshd[97372]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:42:16 np0005626463.localdomain sshd[97372]: Invalid user user from 116.255.155.36 port 44162
Feb 23 08:42:16 np0005626463.localdomain sshd[97372]: Connection closed by invalid user user 116.255.155.36 port 44162 [preauth]
Feb 23 08:42:16 np0005626463.localdomain sshd[97374]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:42:17 np0005626463.localdomain sshd[97374]: Invalid user user from 116.255.155.36 port 45258
Feb 23 08:42:18 np0005626463.localdomain sshd[97374]: Connection closed by invalid user user 116.255.155.36 port 45258 [preauth]
Feb 23 08:42:18 np0005626463.localdomain sshd[97376]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:42:19 np0005626463.localdomain sshd[97376]: Invalid user user from 116.255.155.36 port 46726
Feb 23 08:42:19 np0005626463.localdomain sshd[97376]: Connection closed by invalid user user 116.255.155.36 port 46726 [preauth]
Feb 23 08:42:20 np0005626463.localdomain sshd[97378]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:42:21 np0005626463.localdomain sshd[97378]: Invalid user user from 116.255.155.36 port 47844
Feb 23 08:42:22 np0005626463.localdomain sshd[97378]: Connection closed by invalid user user 116.255.155.36 port 47844 [preauth]
Feb 23 08:42:22 np0005626463.localdomain sshd[97380]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:42:23 np0005626463.localdomain sshd[97380]: Invalid user user from 116.255.155.36 port 50068
Feb 23 08:42:23 np0005626463.localdomain sshd[97380]: Connection closed by invalid user user 116.255.155.36 port 50068 [preauth]
Feb 23 08:42:24 np0005626463.localdomain sshd[97382]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:42:24 np0005626463.localdomain sshd[97382]: Invalid user user from 116.255.155.36 port 51446
Feb 23 08:42:25 np0005626463.localdomain sshd[97382]: Connection closed by invalid user user 116.255.155.36 port 51446 [preauth]
Feb 23 08:42:28 np0005626463.localdomain sshd[97384]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:42:29 np0005626463.localdomain sshd[97384]: Invalid user user from 116.255.155.36 port 52478
Feb 23 08:42:29 np0005626463.localdomain sshd[97384]: Connection closed by invalid user user 116.255.155.36 port 52478 [preauth]
Feb 23 08:42:32 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.
Feb 23 08:42:32 np0005626463.localdomain systemd[1]: tmp-crun.dH191E.mount: Deactivated successfully.
Feb 23 08:42:32 np0005626463.localdomain podman[97386]: 2026-02-23 08:42:32.925602078 +0000 UTC m=+0.097557775 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.openshift.expose-services=, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', 
'/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step3, url=https://www.redhat.com, io.buildah.version=1.41.5, distribution-scope=public, com.redhat.component=openstack-collectd-container, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, build-date=2026-01-12T22:10:15Z, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-collectd, container_name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, managed_by=tripleo_ansible, release=1766032510, description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true)
Feb 23 08:42:32 np0005626463.localdomain podman[97386]: 2026-02-23 08:42:32.936503997 +0000 UTC m=+0.108459654 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, release=1766032510, version=17.1.13, build-date=2026-01-12T22:10:15Z, vcs-type=git, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', 
'/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.component=openstack-collectd-container, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.openshift.expose-services=, url=https://www.redhat.com, name=rhosp-rhel9/openstack-collectd, architecture=x86_64, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5)
Feb 23 08:42:32 np0005626463.localdomain systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully.
Feb 23 08:42:37 np0005626463.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Feb 23 08:42:37 np0005626463.localdomain recover_tripleo_nova_virtqemud[97407]: 61982
Feb 23 08:42:37 np0005626463.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Feb 23 08:42:37 np0005626463.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Feb 23 08:42:38 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.
Feb 23 08:42:38 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.
Feb 23 08:42:38 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.
Feb 23 08:42:38 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.
Feb 23 08:42:38 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.
Feb 23 08:42:38 np0005626463.localdomain systemd[1]: tmp-crun.Pn5lmX.mount: Deactivated successfully.
Feb 23 08:42:38 np0005626463.localdomain podman[97408]: 2026-02-23 08:42:38.978034264 +0000 UTC m=+0.149217394 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, architecture=x86_64, release=1766032510, build-date=2026-01-12T22:34:43Z, url=https://www.redhat.com, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step3, org.opencontainers.image.created=2026-01-12T22:34:43Z, version=17.1.13, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid, batch=17.1_20260112.1, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, distribution-scope=public, tcib_managed=true, managed_by=tripleo_ansible)
Feb 23 08:42:38 np0005626463.localdomain podman[97410]: 2026-02-23 08:42:38.983419106 +0000 UTC m=+0.149903306 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vcs-type=git, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.expose-services=, batch=17.1_20260112.1, container_name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., tcib_managed=true, io.buildah.version=1.41.5, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-ipmi-container, url=https://www.redhat.com, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.created=2026-01-12T23:07:30Z)
Feb 23 08:42:38 np0005626463.localdomain podman[97408]: 2026-02-23 08:42:38.987996073 +0000 UTC m=+0.159179213 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, com.redhat.component=openstack-iscsid-container, container_name=iscsid, managed_by=tripleo_ansible, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_id=tripleo_step3, name=rhosp-rhel9/openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, build-date=2026-01-12T22:34:43Z, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, batch=17.1_20260112.1, tcib_managed=true, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:34:43Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 23 08:42:38 np0005626463.localdomain systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully.
Feb 23 08:42:39 np0005626463.localdomain podman[97409]: 2026-02-23 08:42:39.031542482 +0000 UTC m=+0.198110585 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, name=rhosp-rhel9/openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, vcs-type=git, build-date=2026-01-12T23:07:47Z, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.5, vendor=Red Hat, Inc., 
maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, architecture=x86_64, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.created=2026-01-12T23:07:47Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1, version=17.1.13)
Feb 23 08:42:39 np0005626463.localdomain podman[97411]: 2026-02-23 08:42:38.990302046 +0000 UTC m=+0.150187516 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, io.buildah.version=1.41.5, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, architecture=x86_64, batch=17.1_20260112.1, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, build-date=2026-01-12T22:10:15Z, container_name=logrotate_crond, vcs-type=git, description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 cron, name=rhosp-rhel9/openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.component=openstack-cron-container)
Feb 23 08:42:39 np0005626463.localdomain podman[97409]: 2026-02-23 08:42:39.056432787 +0000 UTC m=+0.223000860 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-ceilometer-compute, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, container_name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_id=tripleo_step4, io.buildah.version=1.41.5, build-date=2026-01-12T23:07:47Z, release=1766032510, url=https://www.redhat.com, io.openshift.expose-services=, architecture=x86_64, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git, batch=17.1_20260112.1)
Feb 23 08:42:39 np0005626463.localdomain podman[97410]: 2026-02-23 08:42:39.057099118 +0000 UTC m=+0.223583328 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, build-date=2026-01-12T23:07:30Z, managed_by=tripleo_ansible, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1766032510, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, 
name=rhosp-rhel9/openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:07:30Z, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, com.redhat.component=openstack-ceilometer-ipmi-container, url=https://www.redhat.com, version=17.1.13, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 23 08:42:39 np0005626463.localdomain systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully.
Feb 23 08:42:39 np0005626463.localdomain podman[97417]: 2026-02-23 08:42:39.073240114 +0000 UTC m=+0.232516894 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, version=17.1.13, architecture=x86_64, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-nova-compute, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5, release=1766032510, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, managed_by=tripleo_ansible, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, build-date=2026-01-12T23:32:04Z, container_name=nova_compute, com.redhat.component=openstack-nova-compute-container)
Feb 23 08:42:39 np0005626463.localdomain systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully.
Feb 23 08:42:39 np0005626463.localdomain podman[97411]: 2026-02-23 08:42:39.12072977 +0000 UTC m=+0.280615270 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.openshift.expose-services=, release=1766032510, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, container_name=logrotate_crond, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, 
vcs-type=git, managed_by=tripleo_ansible, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-cron-container, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 23 08:42:39 np0005626463.localdomain podman[97417]: 2026-02-23 08:42:39.131329398 +0000 UTC m=+0.290606188 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, vcs-type=git, url=https://www.redhat.com, config_id=tripleo_step5, version=17.1.13, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, build-date=2026-01-12T23:32:04Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, vendor=Red Hat, Inc., managed_by=tripleo_ansible, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=nova_compute, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, name=rhosp-rhel9/openstack-nova-compute, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 
'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 23 08:42:39 np0005626463.localdomain systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully.
Feb 23 08:42:39 np0005626463.localdomain systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Deactivated successfully.
Feb 23 08:42:39 np0005626463.localdomain systemd[1]: tmp-crun.nAg2iZ.mount: Deactivated successfully.
Feb 23 08:42:42 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.
Feb 23 08:42:42 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.
Feb 23 08:42:42 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.
Feb 23 08:42:42 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.
Feb 23 08:42:42 np0005626463.localdomain systemd[1]: tmp-crun.NbVNQc.mount: Deactivated successfully.
Feb 23 08:42:42 np0005626463.localdomain podman[97527]: 2026-02-23 08:42:42.988743714 +0000 UTC m=+0.147389047 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step1, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, batch=17.1_20260112.1, build-date=2026-01-12T22:10:14Z, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, name=rhosp-rhel9/openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=metrics_qdr, vendor=Red Hat, Inc., version=17.1.13, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:10:14Z, managed_by=tripleo_ansible)
Feb 23 08:42:43 np0005626463.localdomain podman[97524]: 2026-02-23 08:42:43.026572071 +0000 UTC m=+0.192493176 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, architecture=x86_64, io.openshift.expose-services=, name=rhosp-rhel9/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.created=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, build-date=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, batch=17.1_20260112.1, io.buildah.version=1.41.5, config_id=tripleo_step4, vcs-type=git)
Feb 23 08:42:43 np0005626463.localdomain podman[97525]: 2026-02-23 08:42:42.941995371 +0000 UTC m=+0.105875581 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, summary=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, container_name=ovn_controller, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, build-date=2026-01-12T22:36:40Z, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T22:36:40Z, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, tcib_managed=true, version=17.1.13, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': 
['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']})
Feb 23 08:42:43 np0005626463.localdomain podman[97526]: 2026-02-23 08:42:43.077097884 +0000 UTC m=+0.237946477 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, url=https://www.redhat.com, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, build-date=2026-01-12T22:56:19Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, vcs-type=git, batch=17.1_20260112.1, architecture=x86_64, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, org.opencontainers.image.created=2026-01-12T22:56:19Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, version=17.1.13, io.openshift.expose-services=)
Feb 23 08:42:43 np0005626463.localdomain podman[97526]: 2026-02-23 08:42:43.09577189 +0000 UTC m=+0.256620533 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, tcib_managed=true, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:56:19Z, batch=17.1_20260112.1, version=17.1.13, vendor=Red Hat, Inc., io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, container_name=ovn_metadata_agent, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, distribution-scope=public, architecture=x86_64)
Feb 23 08:42:43 np0005626463.localdomain podman[97526]: unhealthy
Feb 23 08:42:43 np0005626463.localdomain systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Main process exited, code=exited, status=1/FAILURE
Feb 23 08:42:43 np0005626463.localdomain systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Failed with result 'exit-code'.
Feb 23 08:42:43 np0005626463.localdomain podman[97525]: 2026-02-23 08:42:43.132285126 +0000 UTC m=+0.296165296 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, name=rhosp-rhel9/openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, build-date=2026-01-12T22:36:40Z, config_id=tripleo_step4, distribution-scope=public, com.redhat.component=openstack-ovn-controller-container, architecture=x86_64, release=1766032510, managed_by=tripleo_ansible, container_name=ovn_controller, maintainer=OpenStack TripleO Team, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.openshift.expose-services=, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git)
Feb 23 08:42:43 np0005626463.localdomain podman[97525]: unhealthy
Feb 23 08:42:43 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Main process exited, code=exited, status=1/FAILURE
Feb 23 08:42:43 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Failed with result 'exit-code'.
Feb 23 08:42:43 np0005626463.localdomain podman[97527]: 2026-02-23 08:42:43.195283407 +0000 UTC m=+0.353928730 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, container_name=metrics_qdr, io.openshift.expose-services=, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vendor=Red Hat, Inc., io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-qdrouterd, managed_by=tripleo_ansible, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:10:14Z, build-date=2026-01-12T22:10:14Z, distribution-scope=public, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd)
Feb 23 08:42:43 np0005626463.localdomain systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully.
Feb 23 08:42:43 np0005626463.localdomain podman[97524]: 2026-02-23 08:42:43.441491157 +0000 UTC m=+0.607412272 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, config_id=tripleo_step4, io.openshift.expose-services=, build-date=2026-01-12T23:32:04Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, vendor=Red Hat, Inc., batch=17.1_20260112.1, release=1766032510, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public, architecture=x86_64, container_name=nova_migration_target, 
io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, org.opencontainers.image.created=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, managed_by=tripleo_ansible, vcs-type=git)
Feb 23 08:42:43 np0005626463.localdomain systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully.
Feb 23 08:42:43 np0005626463.localdomain sudo[97620]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 08:42:43 np0005626463.localdomain sudo[97620]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:42:43 np0005626463.localdomain sudo[97620]: pam_unix(sudo:session): session closed for user root
Feb 23 08:42:43 np0005626463.localdomain systemd[1]: tmp-crun.wEwyWT.mount: Deactivated successfully.
Feb 23 08:42:43 np0005626463.localdomain sudo[97635]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Feb 23 08:42:43 np0005626463.localdomain sudo[97635]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:42:44 np0005626463.localdomain systemd[1]: tmp-crun.kziQbG.mount: Deactivated successfully.
Feb 23 08:42:44 np0005626463.localdomain podman[97722]: 2026-02-23 08:42:44.920478902 +0000 UTC m=+0.110163548 container exec fdf07215f0388d0ebc44f1f3744080ba594441e647c300d0dade62ff5beba234 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-crash-np0005626463, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, version=7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.expose-services=, name=rhceph, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.created=2026-02-09T10:25:24Z, vendor=Red Hat, Inc., io.buildah.version=1.42.2, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2026-02-09T10:25:24Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=rhceph-container, release=1770267347, distribution-scope=public, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True)
Feb 23 08:42:45 np0005626463.localdomain podman[97722]: 2026-02-23 08:42:45.034191142 +0000 UTC m=+0.223875818 container exec_died fdf07215f0388d0ebc44f1f3744080ba594441e647c300d0dade62ff5beba234 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-crash-np0005626463, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, vcs-type=git, release=1770267347, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, distribution-scope=public, com.redhat.component=rhceph-container, version=7, build-date=2026-02-09T10:25:24Z, description=Red Hat Ceph Storage 7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.buildah.version=1.42.2, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, org.opencontainers.image.created=2026-02-09T10:25:24Z, vendor=Red Hat, Inc., ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main)
Feb 23 08:42:45 np0005626463.localdomain sudo[97635]: pam_unix(sudo:session): session closed for user root
Feb 23 08:42:45 np0005626463.localdomain sudo[97791]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 08:42:45 np0005626463.localdomain sudo[97791]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:42:45 np0005626463.localdomain sudo[97791]: pam_unix(sudo:session): session closed for user root
Feb 23 08:42:45 np0005626463.localdomain sudo[97806]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 23 08:42:45 np0005626463.localdomain sudo[97806]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:42:45 np0005626463.localdomain sshd[97821]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:42:46 np0005626463.localdomain sshd[97821]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:42:46 np0005626463.localdomain sudo[97806]: pam_unix(sudo:session): session closed for user root
Feb 23 08:42:46 np0005626463.localdomain sudo[97855]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 08:42:46 np0005626463.localdomain sudo[97855]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:42:46 np0005626463.localdomain sudo[97855]: pam_unix(sudo:session): session closed for user root
Feb 23 08:42:54 np0005626463.localdomain sshd[97870]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:42:55 np0005626463.localdomain sshd[97870]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:43:03 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.
Feb 23 08:43:03 np0005626463.localdomain systemd[1]: tmp-crun.eTyoD0.mount: Deactivated successfully.
Feb 23 08:43:03 np0005626463.localdomain podman[97872]: 2026-02-23 08:43:03.941672655 +0000 UTC m=+0.106621334 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, name=rhosp-rhel9/openstack-collectd, tcib_managed=true, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, distribution-scope=public, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_id=tripleo_step3, vendor=Red Hat, Inc., container_name=collectd, version=17.1.13, build-date=2026-01-12T22:10:15Z)
Feb 23 08:43:03 np0005626463.localdomain podman[97872]: 2026-02-23 08:43:03.958390529 +0000 UTC m=+0.123339198 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, architecture=x86_64, vcs-type=git, vendor=Red Hat, Inc., version=17.1.13, summary=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-collectd, batch=17.1_20260112.1, distribution-scope=public, io.openshift.expose-services=, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.component=openstack-collectd-container, org.opencontainers.image.created=2026-01-12T22:10:15Z, container_name=collectd, build-date=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3)
Feb 23 08:43:03 np0005626463.localdomain systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully.
Feb 23 08:43:09 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.
Feb 23 08:43:09 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.
Feb 23 08:43:09 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.
Feb 23 08:43:09 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.
Feb 23 08:43:09 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.
Feb 23 08:43:09 np0005626463.localdomain podman[97893]: 2026-02-23 08:43:09.930996439 +0000 UTC m=+0.097821564 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-compute-container, name=rhosp-rhel9/openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, 
build-date=2026-01-12T23:07:47Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:47Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vendor=Red Hat, Inc., io.openshift.expose-services=, io.buildah.version=1.41.5, architecture=x86_64, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, release=1766032510)
Feb 23 08:43:09 np0005626463.localdomain podman[97893]: 2026-02-23 08:43:09.987228224 +0000 UTC m=+0.154053369 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:47Z, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, architecture=x86_64, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp-rhel9/openstack-ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.openshift.expose-services=, url=https://www.redhat.com, managed_by=tripleo_ansible, config_id=tripleo_step4, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, com.redhat.component=openstack-ceilometer-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, vcs-type=git)
Feb 23 08:43:10 np0005626463.localdomain systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully.
Feb 23 08:43:10 np0005626463.localdomain podman[97892]: 2026-02-23 08:43:09.989546528 +0000 UTC m=+0.160252087 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-type=git, io.buildah.version=1.41.5, url=https://www.redhat.com, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.openshift.expose-services=, vcs-ref=705339545363fec600102567c4e923938e0f43b3, container_name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:34:43Z, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., distribution-scope=public, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-iscsid-container, architecture=x86_64, release=1766032510, summary=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, tcib_managed=true, managed_by=tripleo_ansible, batch=17.1_20260112.1, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:34:43Z, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid)
Feb 23 08:43:10 np0005626463.localdomain podman[97894]: 2026-02-23 08:43:10.053053246 +0000 UTC m=+0.216963038 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, release=1766032510, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_id=tripleo_step4, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, io.openshift.expose-services=, version=17.1.13, com.redhat.component=openstack-ceilometer-ipmi-container, org.opencontainers.image.created=2026-01-12T23:07:30Z, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, architecture=x86_64, build-date=2026-01-12T23:07:30Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vendor=Red Hat, Inc., container_name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, distribution-scope=public, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 23 08:43:10 np0005626463.localdomain podman[97892]: 2026-02-23 08:43:10.082487745 +0000 UTC m=+0.253193334 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, release=1766032510, tcib_managed=true, vcs-ref=705339545363fec600102567c4e923938e0f43b3, container_name=iscsid, batch=17.1_20260112.1, url=https://www.redhat.com, distribution-scope=public, build-date=2026-01-12T22:34:43Z, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13)
Feb 23 08:43:10 np0005626463.localdomain podman[97900]: 2026-02-23 08:43:10.093735864 +0000 UTC m=+0.249354582 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, build-date=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-cron, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, com.redhat.component=openstack-cron-container, vendor=Red Hat, Inc., release=1766032510, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, distribution-scope=public, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, vcs-type=git, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true)
Feb 23 08:43:10 np0005626463.localdomain systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully.
Feb 23 08:43:10 np0005626463.localdomain podman[97900]: 2026-02-23 08:43:10.102316018 +0000 UTC m=+0.257934696 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, description=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-type=git, managed_by=tripleo_ansible, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:10:15Z, batch=17.1_20260112.1, architecture=x86_64, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, name=rhosp-rhel9/openstack-cron, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-cron-container, container_name=logrotate_crond)
Feb 23 08:43:10 np0005626463.localdomain podman[97894]: 2026-02-23 08:43:10.110099676 +0000 UTC m=+0.274009458 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, build-date=2026-01-12T23:07:30Z, io.openshift.expose-services=, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, config_id=tripleo_step4, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, name=rhosp-rhel9/openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.description=Red Hat OpenStack Platform 17.1 
ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., release=1766032510, org.opencontainers.image.created=2026-01-12T23:07:30Z, version=17.1.13, batch=17.1_20260112.1, container_name=ceilometer_agent_ipmi, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team)
Feb 23 08:43:10 np0005626463.localdomain systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully.
Feb 23 08:43:10 np0005626463.localdomain systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully.
Feb 23 08:43:10 np0005626463.localdomain podman[97906]: 2026-02-23 08:43:10.193956713 +0000 UTC m=+0.348539907 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, container_name=nova_compute, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, build-date=2026-01-12T23:32:04Z, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, name=rhosp-rhel9/openstack-nova-compute, vcs-type=git, release=1766032510, batch=17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.5)
Feb 23 08:43:10 np0005626463.localdomain podman[97906]: 2026-02-23 08:43:10.24864449 +0000 UTC m=+0.403227714 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-type=git, release=1766032510, distribution-scope=public, architecture=x86_64, name=rhosp-rhel9/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=nova_compute, io.buildah.version=1.41.5, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.13, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, managed_by=tripleo_ansible, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1)
Feb 23 08:43:10 np0005626463.localdomain systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Deactivated successfully.
Feb 23 08:43:10 np0005626463.localdomain systemd[1]: tmp-crun.zw2yvM.mount: Deactivated successfully.
Feb 23 08:43:13 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.
Feb 23 08:43:13 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.
Feb 23 08:43:13 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.
Feb 23 08:43:13 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.
Feb 23 08:43:13 np0005626463.localdomain podman[98012]: 2026-02-23 08:43:13.923994173 +0000 UTC m=+0.091520894 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, io.buildah.version=1.41.5, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, managed_by=tripleo_ansible, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-ovn-controller, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.openshift.expose-services=, build-date=2026-01-12T22:36:40Z, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, container_name=ovn_controller, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ovn-controller)
Feb 23 08:43:13 np0005626463.localdomain podman[98012]: 2026-02-23 08:43:13.939451644 +0000 UTC m=+0.106978395 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.openshift.expose-services=, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, name=rhosp-rhel9/openstack-ovn-controller, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.5, container_name=ovn_controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, build-date=2026-01-12T22:36:40Z, org.opencontainers.image.created=2026-01-12T22:36:40Z, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, vcs-type=git, distribution-scope=public, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', 
'/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1)
Feb 23 08:43:13 np0005626463.localdomain podman[98012]: unhealthy
Feb 23 08:43:13 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Main process exited, code=exited, status=1/FAILURE
Feb 23 08:43:13 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Failed with result 'exit-code'.
Feb 23 08:43:14 np0005626463.localdomain systemd[1]: tmp-crun.Yxb39x.mount: Deactivated successfully.
Feb 23 08:43:14 np0005626463.localdomain podman[98013]: 2026-02-23 08:43:14.032270566 +0000 UTC m=+0.198808496 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, build-date=2026-01-12T22:56:19Z, distribution-scope=public, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., io.openshift.expose-services=, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, tcib_managed=true, version=17.1.13, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:56:19Z, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, release=1766032510, architecture=x86_64, container_name=ovn_metadata_agent, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-type=git)
Feb 23 08:43:14 np0005626463.localdomain podman[98013]: 2026-02-23 08:43:14.077403411 +0000 UTC m=+0.243941391 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.openshift.expose-services=, version=17.1.13, vendor=Red Hat, Inc., container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:56:19Z, 
org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-type=git, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:56:19Z, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, architecture=x86_64, distribution-scope=public)
Feb 23 08:43:14 np0005626463.localdomain podman[98013]: unhealthy
Feb 23 08:43:14 np0005626463.localdomain systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Main process exited, code=exited, status=1/FAILURE
Feb 23 08:43:14 np0005626463.localdomain systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Failed with result 'exit-code'.
Feb 23 08:43:14 np0005626463.localdomain podman[98014]: 2026-02-23 08:43:14.099930769 +0000 UTC m=+0.258838725 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, version=17.1.13, vcs-type=git, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:14Z, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, vendor=Red Hat, Inc., io.openshift.expose-services=, 
architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:10:14Z, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, url=https://www.redhat.com, container_name=metrics_qdr, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, name=rhosp-rhel9/openstack-qdrouterd)
Feb 23 08:43:14 np0005626463.localdomain podman[98011]: 2026-02-23 08:43:14.134724045 +0000 UTC m=+0.305136178 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.openshift.expose-services=, name=rhosp-rhel9/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.5, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, vcs-type=git, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2026-01-12T23:32:04Z, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, release=1766032510, architecture=x86_64, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4)
Feb 23 08:43:14 np0005626463.localdomain podman[98014]: 2026-02-23 08:43:14.309327877 +0000 UTC m=+0.468235773 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, container_name=metrics_qdr, org.opencontainers.image.created=2026-01-12T22:10:14Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step1, release=1766032510, build-date=2026-01-12T22:10:14Z, url=https://www.redhat.com, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, architecture=x86_64, batch=17.1_20260112.1, vcs-type=git, 
org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, managed_by=tripleo_ansible, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5)
Feb 23 08:43:14 np0005626463.localdomain systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully.
Feb 23 08:43:14 np0005626463.localdomain podman[98011]: 2026-02-23 08:43:14.503385289 +0000 UTC m=+0.673797372 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, org.opencontainers.image.created=2026-01-12T23:32:04Z, distribution-scope=public, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, release=1766032510, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, url=https://www.redhat.com, tcib_managed=true, vcs-type=git, config_id=tripleo_step4, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2026-01-12T23:32:04Z, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, batch=17.1_20260112.1, io.buildah.version=1.41.5, container_name=nova_migration_target, managed_by=tripleo_ansible, vendor=Red Hat, Inc., org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute)
Feb 23 08:43:14 np0005626463.localdomain systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully.
Feb 23 08:43:14 np0005626463.localdomain systemd[1]: tmp-crun.CCby9S.mount: Deactivated successfully.
Feb 23 08:43:23 np0005626463.localdomain sshd[98104]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:43:23 np0005626463.localdomain sshd[98104]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:43:34 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.
Feb 23 08:43:34 np0005626463.localdomain systemd[1]: tmp-crun.GEDYu1.mount: Deactivated successfully.
Feb 23 08:43:34 np0005626463.localdomain podman[98106]: 2026-02-23 08:43:34.94295472 +0000 UTC m=+0.104173503 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, distribution-scope=public, release=1766032510, url=https://www.redhat.com, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, io.openshift.expose-services=, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, build-date=2026-01-12T22:10:15Z, summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, com.redhat.component=openstack-collectd-container, container_name=collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible)
Feb 23 08:43:34 np0005626463.localdomain podman[98106]: 2026-02-23 08:43:34.95144578 +0000 UTC m=+0.112664573 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, distribution-scope=public, com.redhat.component=openstack-collectd-container, container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, release=1766032510, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, managed_by=tripleo_ansible, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, architecture=x86_64, build-date=2026-01-12T22:10:15Z, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 23 08:43:34 np0005626463.localdomain systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully.
Feb 23 08:43:40 np0005626463.localdomain sshd[98127]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:43:40 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.
Feb 23 08:43:40 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.
Feb 23 08:43:40 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.
Feb 23 08:43:40 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.
Feb 23 08:43:40 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.
Feb 23 08:43:40 np0005626463.localdomain systemd[1]: tmp-crun.ZCXlBN.mount: Deactivated successfully.
Feb 23 08:43:40 np0005626463.localdomain podman[98129]: 2026-02-23 08:43:40.913626306 +0000 UTC m=+0.088761534 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, vendor=Red Hat, Inc., org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, distribution-scope=public, architecture=x86_64, maintainer=OpenStack TripleO Team, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.component=openstack-iscsid-container, name=rhosp-rhel9/openstack-iscsid, io.buildah.version=1.41.5, config_id=tripleo_step3, summary=Red 
Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:34:43Z, vcs-ref=705339545363fec600102567c4e923938e0f43b3, managed_by=tripleo_ansible, release=1766032510, version=17.1.13, url=https://www.redhat.com, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:34:43Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid)
Feb 23 08:43:40 np0005626463.localdomain podman[98132]: 2026-02-23 08:43:40.924998137 +0000 UTC m=+0.090242090 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, url=https://www.redhat.com, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, config_id=tripleo_step4, batch=17.1_20260112.1, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, build-date=2026-01-12T22:10:15Z, release=1766032510, description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, 
vcs-type=git, com.redhat.component=openstack-cron-container, managed_by=tripleo_ansible, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, maintainer=OpenStack TripleO Team, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:10:15Z, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-cron)
Feb 23 08:43:40 np0005626463.localdomain podman[98130]: 2026-02-23 08:43:40.964846674 +0000 UTC m=+0.136162030 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, url=https://www.redhat.com, version=17.1.13, managed_by=tripleo_ansible, build-date=2026-01-12T23:07:47Z, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, distribution-scope=public, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, org.opencontainers.image.created=2026-01-12T23:07:47Z, architecture=x86_64, release=1766032510, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5)
Feb 23 08:43:40 np0005626463.localdomain podman[98138]: 2026-02-23 08:43:40.976579738 +0000 UTC m=+0.138581048 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, config_id=tripleo_step5, vcs-type=git, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, tcib_managed=true, build-date=2026-01-12T23:32:04Z, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, container_name=nova_compute, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-nova-compute, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=)
Feb 23 08:43:41 np0005626463.localdomain podman[98138]: 2026-02-23 08:43:41.005293911 +0000 UTC m=+0.167295251 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.openshift.expose-services=, container_name=nova_compute, config_id=tripleo_step5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, url=https://www.redhat.com, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, name=rhosp-rhel9/openstack-nova-compute, build-date=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T23:32:04Z)
Feb 23 08:43:41 np0005626463.localdomain systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Deactivated successfully.
Feb 23 08:43:41 np0005626463.localdomain podman[98130]: 2026-02-23 08:43:41.021664312 +0000 UTC m=+0.192979688 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.openshift.expose-services=, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2026-01-12T23:07:47Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, com.redhat.component=openstack-ceilometer-compute-container, name=rhosp-rhel9/openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, release=1766032510, container_name=ceilometer_agent_compute, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T23:07:47Z, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 23 08:43:41 np0005626463.localdomain podman[98131]: 2026-02-23 08:43:41.021942241 +0000 UTC m=+0.191518753 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, tcib_managed=true, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, distribution-scope=public, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, container_name=ceilometer_agent_ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, vendor=Red Hat, 
Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T23:07:30Z, batch=17.1_20260112.1, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, release=1766032510, io.openshift.expose-services=, managed_by=tripleo_ansible)
Feb 23 08:43:41 np0005626463.localdomain podman[98132]: 2026-02-23 08:43:41.040268463 +0000 UTC m=+0.205512506 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, summary=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, io.openshift.expose-services=, release=1766032510, batch=17.1_20260112.1, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers:/var/log/containers:z']}, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-cron, config_id=tripleo_step4, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:15Z, vendor=Red Hat, Inc., build-date=2026-01-12T22:10:15Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron)
Feb 23 08:43:41 np0005626463.localdomain systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully.
Feb 23 08:43:41 np0005626463.localdomain podman[98129]: 2026-02-23 08:43:41.057368867 +0000 UTC m=+0.232504055 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, tcib_managed=true, vcs-ref=705339545363fec600102567c4e923938e0f43b3, version=17.1.13, description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, architecture=x86_64, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1, 
org.opencontainers.image.created=2026-01-12T22:34:43Z, build-date=2026-01-12T22:34:43Z, container_name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, release=1766032510, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.buildah.version=1.41.5, vcs-type=git, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=)
Feb 23 08:43:41 np0005626463.localdomain podman[98131]: 2026-02-23 08:43:41.072279971 +0000 UTC m=+0.241856463 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, container_name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:07:30Z, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, 
com.redhat.component=openstack-ceilometer-ipmi-container, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, release=1766032510, tcib_managed=true, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:30Z, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, io.openshift.expose-services=)
Feb 23 08:43:41 np0005626463.localdomain systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully.
Feb 23 08:43:41 np0005626463.localdomain systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully.
Feb 23 08:43:41 np0005626463.localdomain sshd[98127]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:43:41 np0005626463.localdomain systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully.
Feb 23 08:43:44 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.
Feb 23 08:43:44 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.
Feb 23 08:43:44 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.
Feb 23 08:43:44 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.
Feb 23 08:43:44 np0005626463.localdomain systemd[1]: tmp-crun.8foDq2.mount: Deactivated successfully.
Feb 23 08:43:44 np0005626463.localdomain podman[98244]: 2026-02-23 08:43:44.926222962 +0000 UTC m=+0.102068367 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, version=17.1.13, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_migration_target, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, managed_by=tripleo_ansible, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., build-date=2026-01-12T23:32:04Z, name=rhosp-rhel9/openstack-nova-compute)
Feb 23 08:43:44 np0005626463.localdomain podman[98246]: 2026-02-23 08:43:44.970034735 +0000 UTC m=+0.138408722 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, build-date=2026-01-12T22:56:19Z, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, managed_by=tripleo_ansible, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:56:19Z, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, container_name=ovn_metadata_agent, distribution-scope=public, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1766032510, vcs-type=git, io.openshift.expose-services=)
Feb 23 08:43:45 np0005626463.localdomain podman[98246]: 2026-02-23 08:43:45.013440436 +0000 UTC m=+0.181814443 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, managed_by=tripleo_ansible, vcs-type=git, io.buildah.version=1.41.5, build-date=2026-01-12T22:56:19Z, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, tcib_managed=true, container_name=ovn_metadata_agent, io.openshift.expose-services=, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:56:19Z, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.13, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f)
Feb 23 08:43:45 np0005626463.localdomain podman[98246]: unhealthy
Feb 23 08:43:45 np0005626463.localdomain podman[98245]: 2026-02-23 08:43:45.020591883 +0000 UTC m=+0.191237702 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, vcs-type=git, name=rhosp-rhel9/openstack-ovn-controller, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, org.opencontainers.image.created=2026-01-12T22:36:40Z, distribution-scope=public, io.openshift.expose-services=, version=17.1.13, release=1766032510, architecture=x86_64, build-date=2026-01-12T22:36:40Z, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.component=openstack-ovn-controller-container, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_id=tripleo_step4, url=https://www.redhat.com)
Feb 23 08:43:45 np0005626463.localdomain systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Main process exited, code=exited, status=1/FAILURE
Feb 23 08:43:45 np0005626463.localdomain systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Failed with result 'exit-code'.
Feb 23 08:43:45 np0005626463.localdomain podman[98245]: 2026-02-23 08:43:45.06324959 +0000 UTC m=+0.233895419 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, url=https://www.redhat.com, container_name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:36:40Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, org.opencontainers.image.created=2026-01-12T22:36:40Z, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, com.redhat.component=openstack-ovn-controller-container, architecture=x86_64, config_id=tripleo_step4, batch=17.1_20260112.1, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, distribution-scope=public, io.buildah.version=1.41.5, vendor=Red Hat, Inc., config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': 
['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']})
Feb 23 08:43:45 np0005626463.localdomain podman[98245]: unhealthy
Feb 23 08:43:45 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Main process exited, code=exited, status=1/FAILURE
Feb 23 08:43:45 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Failed with result 'exit-code'.
Feb 23 08:43:45 np0005626463.localdomain podman[98248]: 2026-02-23 08:43:45.083131652 +0000 UTC m=+0.247770050 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, container_name=metrics_qdr, org.opencontainers.image.created=2026-01-12T22:10:14Z, managed_by=tripleo_ansible, distribution-scope=public, io.buildah.version=1.41.5, version=17.1.13, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:14Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, release=1766032510, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 qdrouterd)
Feb 23 08:43:45 np0005626463.localdomain podman[98248]: 2026-02-23 08:43:45.273799385 +0000 UTC m=+0.438437783 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, managed_by=tripleo_ansible, io.buildah.version=1.41.5, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-qdrouterd, io.openshift.expose-services=, version=17.1.13, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, 
vcs-type=git, com.redhat.component=openstack-qdrouterd-container, tcib_managed=true, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, release=1766032510, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:10:14Z, vendor=Red Hat, Inc., org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:10:14Z, description=Red Hat OpenStack Platform 17.1 qdrouterd)
Feb 23 08:43:45 np0005626463.localdomain systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully.
Feb 23 08:43:45 np0005626463.localdomain podman[98244]: 2026-02-23 08:43:45.317212076 +0000 UTC m=+0.493057471 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step4, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, 
io.buildah.version=1.41.5, vcs-type=git, name=rhosp-rhel9/openstack-nova-compute, release=1766032510, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, managed_by=tripleo_ansible, vendor=Red Hat, Inc., batch=17.1_20260112.1, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_migration_target)
Feb 23 08:43:45 np0005626463.localdomain systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully.
Feb 23 08:43:46 np0005626463.localdomain sudo[98333]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 08:43:46 np0005626463.localdomain sudo[98333]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:43:46 np0005626463.localdomain sudo[98333]: pam_unix(sudo:session): session closed for user root
Feb 23 08:43:46 np0005626463.localdomain sudo[98348]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 23 08:43:46 np0005626463.localdomain sudo[98348]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:43:47 np0005626463.localdomain sudo[98348]: pam_unix(sudo:session): session closed for user root
Feb 23 08:43:48 np0005626463.localdomain sudo[98394]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 08:43:48 np0005626463.localdomain sudo[98394]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:43:48 np0005626463.localdomain sudo[98394]: pam_unix(sudo:session): session closed for user root
Feb 23 08:44:01 np0005626463.localdomain sshd[98409]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:44:01 np0005626463.localdomain sshd[98409]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:44:01 np0005626463.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Feb 23 08:44:01 np0005626463.localdomain recover_tripleo_nova_virtqemud[98412]: 61982
Feb 23 08:44:01 np0005626463.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Feb 23 08:44:01 np0005626463.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Feb 23 08:44:05 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.
Feb 23 08:44:05 np0005626463.localdomain systemd[1]: tmp-crun.QEqpTa.mount: Deactivated successfully.
Feb 23 08:44:05 np0005626463.localdomain podman[98413]: 2026-02-23 08:44:05.931608898 +0000 UTC m=+0.102392868 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, vcs-type=git, 
batch=17.1_20260112.1, io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., managed_by=tripleo_ansible, com.redhat.component=openstack-collectd-container, build-date=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step3, tcib_managed=true, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, io.openshift.expose-services=, container_name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team)
Feb 23 08:44:05 np0005626463.localdomain podman[98413]: 2026-02-23 08:44:05.946232183 +0000 UTC m=+0.117016183 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, tcib_managed=true, container_name=collectd, release=1766032510, config_id=tripleo_step3, vcs-type=git, architecture=x86_64, managed_by=tripleo_ansible, distribution-scope=public, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20260112.1, build-date=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-collectd-container, io.openshift.expose-services=, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee)
Feb 23 08:44:05 np0005626463.localdomain systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully.
Feb 23 08:44:11 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.
Feb 23 08:44:11 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.
Feb 23 08:44:11 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.
Feb 23 08:44:11 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.
Feb 23 08:44:11 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.
Feb 23 08:44:12 np0005626463.localdomain podman[98435]: 2026-02-23 08:44:11.951244872 +0000 UTC m=+0.116592388 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, com.redhat.component=openstack-ceilometer-compute-container, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, build-date=2026-01-12T23:07:47Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, org.opencontainers.image.created=2026-01-12T23:07:47Z, distribution-scope=public, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-ceilometer-compute, container_name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., url=https://www.redhat.com, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 
'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, version=17.1.13)
Feb 23 08:44:12 np0005626463.localdomain podman[98434]: 2026-02-23 08:44:12.003069351 +0000 UTC m=+0.170806743 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, architecture=x86_64, container_name=iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 iscsid, build-date=2026-01-12T22:34:43Z, name=rhosp-rhel9/openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, release=1766032510, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 
17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, version=17.1.13, url=https://www.redhat.com, vendor=Red Hat, Inc., tcib_managed=true, io.openshift.expose-services=, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, distribution-scope=public, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.buildah.version=1.41.5)
Feb 23 08:44:12 np0005626463.localdomain podman[98441]: 2026-02-23 08:44:12.012258913 +0000 UTC m=+0.163146959 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:32:04Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', 
'/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, batch=17.1_20260112.1, build-date=2026-01-12T23:32:04Z, name=rhosp-rhel9/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, distribution-scope=public, managed_by=tripleo_ansible, url=https://www.redhat.com, container_name=nova_compute, vcs-type=git, version=17.1.13)
Feb 23 08:44:12 np0005626463.localdomain podman[98434]: 2026-02-23 08:44:12.043748384 +0000 UTC m=+0.211485726 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, config_id=tripleo_step3, org.opencontainers.image.created=2026-01-12T22:34:43Z, summary=Red Hat OpenStack Platform 17.1 iscsid, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, io.buildah.version=1.41.5, com.redhat.component=openstack-iscsid-container, build-date=2026-01-12T22:34:43Z, release=1766032510, version=17.1.13, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, url=https://www.redhat.com, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, distribution-scope=public)
Feb 23 08:44:12 np0005626463.localdomain systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully.
Feb 23 08:44:12 np0005626463.localdomain podman[98436]: 2026-02-23 08:44:11.912407338 +0000 UTC m=+0.078580231 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, version=17.1.13, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1, vcs-type=git, url=https://www.redhat.com, config_id=tripleo_step4, distribution-scope=public, name=rhosp-rhel9/openstack-ceilometer-ipmi, managed_by=tripleo_ansible, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, tcib_managed=true, architecture=x86_64, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T23:07:30Z, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']})
Feb 23 08:44:12 np0005626463.localdomain podman[98435]: 2026-02-23 08:44:12.087363412 +0000 UTC m=+0.252710938 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, architecture=x86_64, com.redhat.component=openstack-ceilometer-compute-container, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, distribution-scope=public, vcs-type=git, release=1766032510, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, name=rhosp-rhel9/openstack-ceilometer-compute, batch=17.1_20260112.1, io.buildah.version=1.41.5, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T23:07:47Z, container_name=ceilometer_agent_compute, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.13, io.openshift.expose-services=)
Feb 23 08:44:12 np0005626463.localdomain systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully.
Feb 23 08:44:12 np0005626463.localdomain podman[98436]: 2026-02-23 08:44:12.101190211 +0000 UTC m=+0.267363124 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-ceilometer-ipmi, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, container_name=ceilometer_agent_ipmi, tcib_managed=true, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T23:07:30Z, com.redhat.component=openstack-ceilometer-ipmi-container, distribution-scope=public, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1766032510, maintainer=OpenStack TripleO Team, architecture=x86_64, build-date=2026-01-12T23:07:30Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, managed_by=tripleo_ansible)
Feb 23 08:44:12 np0005626463.localdomain systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully.
Feb 23 08:44:12 np0005626463.localdomain podman[98437]: 2026-02-23 08:44:12.151304165 +0000 UTC m=+0.307793729 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:10:15Z, summary=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-cron-container, container_name=logrotate_crond, name=rhosp-rhel9/openstack-cron, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., version=17.1.13, distribution-scope=public, build-date=2026-01-12T22:10:15Z, io.buildah.version=1.41.5, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 23 08:44:12 np0005626463.localdomain podman[98441]: 2026-02-23 08:44:12.169660268 +0000 UTC m=+0.320548284 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, version=17.1.13, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T23:32:04Z, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-nova-compute, distribution-scope=public, architecture=x86_64, build-date=2026-01-12T23:32:04Z, vendor=Red Hat, Inc., container_name=nova_compute, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, batch=17.1_20260112.1, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=)
Feb 23 08:44:12 np0005626463.localdomain systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Deactivated successfully.
Feb 23 08:44:12 np0005626463.localdomain podman[98437]: 2026-02-23 08:44:12.18730341 +0000 UTC m=+0.343793024 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, summary=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20260112.1, vcs-type=git, managed_by=tripleo_ansible, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, tcib_managed=true, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, 
config_id=tripleo_step4, release=1766032510, org.opencontainers.image.created=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.component=openstack-cron-container, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:10:15Z, distribution-scope=public, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, name=rhosp-rhel9/openstack-cron)
Feb 23 08:44:12 np0005626463.localdomain systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully.
Feb 23 08:44:12 np0005626463.localdomain systemd[1]: tmp-crun.cngTkO.mount: Deactivated successfully.
Feb 23 08:44:15 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.
Feb 23 08:44:15 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.
Feb 23 08:44:15 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.
Feb 23 08:44:15 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.
Feb 23 08:44:15 np0005626463.localdomain podman[98552]: 2026-02-23 08:44:15.930485719 +0000 UTC m=+0.098633357 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, vcs-type=git, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.openshift.expose-services=, vendor=Red Hat, Inc., config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:36:40Z, container_name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, name=rhosp-rhel9/openstack-ovn-controller, batch=17.1_20260112.1, url=https://www.redhat.com, release=1766032510, tcib_managed=true, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': 
['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']})
Feb 23 08:44:15 np0005626463.localdomain podman[98553]: 2026-02-23 08:44:15.905924548 +0000 UTC m=+0.074450609 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, io.openshift.expose-services=, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, distribution-scope=public, managed_by=tripleo_ansible, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, summary=Red Hat 
OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.13, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:56:19Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, vcs-type=git, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, tcib_managed=true, release=1766032510)
Feb 23 08:44:15 np0005626463.localdomain podman[98552]: 2026-02-23 08:44:15.99497143 +0000 UTC m=+0.163119058 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, container_name=ovn_controller, distribution-scope=public, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:36:40Z, io.openshift.expose-services=, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., batch=17.1_20260112.1, io.buildah.version=1.41.5, managed_by=tripleo_ansible, release=1766032510, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, name=rhosp-rhel9/openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 
ovn-controller, com.redhat.component=openstack-ovn-controller-container, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:36:40Z, vcs-type=git)
Feb 23 08:44:16 np0005626463.localdomain podman[98552]: unhealthy
Feb 23 08:44:16 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Main process exited, code=exited, status=1/FAILURE
Feb 23 08:44:16 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Failed with result 'exit-code'.
Feb 23 08:44:16 np0005626463.localdomain podman[98551]: 2026-02-23 08:44:16.017541487 +0000 UTC m=+0.189040953 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, version=17.1.13, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T23:32:04Z, build-date=2026-01-12T23:32:04Z, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=nova_migration_target, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, io.openshift.expose-services=, tcib_managed=true, distribution-scope=public, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-nova-compute)
Feb 23 08:44:16 np0005626463.localdomain podman[98554]: 2026-02-23 08:44:15.986064937 +0000 UTC m=+0.147631396 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, url=https://www.redhat.com, version=17.1.13, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, build-date=2026-01-12T22:10:14Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, config_id=tripleo_step1, tcib_managed=true, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:14Z, container_name=metrics_qdr, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd)
Feb 23 08:44:16 np0005626463.localdomain podman[98553]: 2026-02-23 08:44:16.08900802 +0000 UTC m=+0.257534091 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, distribution-scope=public, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, batch=17.1_20260112.1, vendor=Red Hat, Inc., release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.expose-services=, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:56:19Z, url=https://www.redhat.com, architecture=x86_64, container_name=ovn_metadata_agent, build-date=2026-01-12T22:56:19Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Feb 23 08:44:16 np0005626463.localdomain podman[98553]: unhealthy
Feb 23 08:44:16 np0005626463.localdomain systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Main process exited, code=exited, status=1/FAILURE
Feb 23 08:44:16 np0005626463.localdomain systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Failed with result 'exit-code'.
Feb 23 08:44:16 np0005626463.localdomain podman[98554]: 2026-02-23 08:44:16.188278077 +0000 UTC m=+0.349844536 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, build-date=2026-01-12T22:10:14Z, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.created=2026-01-12T22:10:14Z, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, container_name=metrics_qdr, io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., version=17.1.13, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, distribution-scope=public, config_id=tripleo_step1, batch=17.1_20260112.1, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']})
Feb 23 08:44:16 np0005626463.localdomain systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully.
Feb 23 08:44:16 np0005626463.localdomain podman[98551]: 2026-02-23 08:44:16.405168024 +0000 UTC m=+0.576667480 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, url=https://www.redhat.com, version=17.1.13, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, build-date=2026-01-12T23:32:04Z, org.opencontainers.image.created=2026-01-12T23:32:04Z, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, io.buildah.version=1.41.5, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.expose-services=, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510)
Feb 23 08:44:16 np0005626463.localdomain systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully.
Feb 23 08:44:24 np0005626463.localdomain sshd[98642]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:44:25 np0005626463.localdomain sshd[98642]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:44:36 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.
Feb 23 08:44:36 np0005626463.localdomain systemd[1]: tmp-crun.N18ZRg.mount: Deactivated successfully.
Feb 23 08:44:36 np0005626463.localdomain podman[98644]: 2026-02-23 08:44:36.924987958 +0000 UTC m=+0.099321079 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, batch=17.1_20260112.1, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat 
OpenStack Platform 17.1 collectd, io.openshift.expose-services=, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, tcib_managed=true, release=1766032510, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, build-date=2026-01-12T22:10:15Z, com.redhat.component=openstack-collectd-container, managed_by=tripleo_ansible, architecture=x86_64, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public)
Feb 23 08:44:36 np0005626463.localdomain podman[98644]: 2026-02-23 08:44:36.940211223 +0000 UTC m=+0.114544374 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=collectd, com.redhat.component=openstack-collectd-container, vendor=Red Hat, Inc., io.buildah.version=1.41.5, batch=17.1_20260112.1, url=https://www.redhat.com, build-date=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', 
'/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, version=17.1.13, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, maintainer=OpenStack TripleO Team, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 23 08:44:36 np0005626463.localdomain systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully.
Feb 23 08:44:38 np0005626463.localdomain sshd[98663]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:44:39 np0005626463.localdomain sshd[98663]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:44:42 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.
Feb 23 08:44:42 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.
Feb 23 08:44:42 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.
Feb 23 08:44:42 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.
Feb 23 08:44:42 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.
Feb 23 08:44:42 np0005626463.localdomain podman[98666]: 2026-02-23 08:44:42.932645113 +0000 UTC m=+0.098673610 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, name=rhosp-rhel9/openstack-ceilometer-compute, vendor=Red Hat, Inc., distribution-scope=public, com.redhat.component=openstack-ceilometer-compute-container, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T23:07:47Z, tcib_managed=true, org.opencontainers.image.created=2026-01-12T23:07:47Z, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1766032510, io.buildah.version=1.41.5, container_name=ceilometer_agent_compute, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.13, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-type=git)
Feb 23 08:44:42 np0005626463.localdomain podman[98666]: 2026-02-23 08:44:42.963308648 +0000 UTC m=+0.129337145 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.buildah.version=1.41.5, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:07:47Z, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-compute-container, vendor=Red Hat, Inc., vcs-type=git, io.openshift.expose-services=, container_name=ceilometer_agent_compute, name=rhosp-rhel9/openstack-ceilometer-compute, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20260112.1, version=17.1.13, org.opencontainers.image.created=2026-01-12T23:07:47Z, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, tcib_managed=true, maintainer=OpenStack TripleO Team)
Feb 23 08:44:42 np0005626463.localdomain systemd[1]: tmp-crun.tQqJRw.mount: Deactivated successfully.
Feb 23 08:44:42 np0005626463.localdomain systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully.
Feb 23 08:44:42 np0005626463.localdomain podman[98668]: 2026-02-23 08:44:42.996950588 +0000 UTC m=+0.158182252 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, vendor=Red Hat, Inc., url=https://www.redhat.com, architecture=x86_64, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, release=1766032510, version=17.1.13, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, name=rhosp-rhel9/openstack-cron, com.redhat.component=openstack-cron-container, io.openshift.expose-services=, tcib_managed=true, build-date=2026-01-12T22:10:15Z, managed_by=tripleo_ansible, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4)
Feb 23 08:44:43 np0005626463.localdomain podman[98665]: 2026-02-23 08:44:43.037199908 +0000 UTC m=+0.206572791 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, batch=17.1_20260112.1, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:34:43Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid, vendor=Red Hat, Inc., com.redhat.component=openstack-iscsid-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-type=git, version=17.1.13, container_name=iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, architecture=x86_64, config_id=tripleo_step3, build-date=2026-01-12T22:34:43Z)
Feb 23 08:44:43 np0005626463.localdomain podman[98665]: 2026-02-23 08:44:43.045479451 +0000 UTC m=+0.214852324 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, distribution-scope=public, release=1766032510, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-type=git, build-date=2026-01-12T22:34:43Z, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, managed_by=tripleo_ansible, 
org.opencontainers.image.created=2026-01-12T22:34:43Z, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-iscsid, io.openshift.expose-services=, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3)
Feb 23 08:44:43 np0005626463.localdomain podman[98668]: 2026-02-23 08:44:43.056944446 +0000 UTC m=+0.218176160 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vcs-type=git, com.redhat.component=openstack-cron-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, release=1766032510, url=https://www.redhat.com, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, build-date=2026-01-12T22:10:15Z, description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., io.openshift.expose-services=, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:15Z, container_name=logrotate_crond, summary=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, architecture=x86_64, name=rhosp-rhel9/openstack-cron, tcib_managed=true, version=17.1.13, config_id=tripleo_step4, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee)
Feb 23 08:44:43 np0005626463.localdomain systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully.
Feb 23 08:44:43 np0005626463.localdomain systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully.
Feb 23 08:44:43 np0005626463.localdomain podman[98667]: 2026-02-23 08:44:43.143340283 +0000 UTC m=+0.307702596 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, managed_by=tripleo_ansible, config_id=tripleo_step4, distribution-scope=public, url=https://www.redhat.com, io.openshift.expose-services=, version=17.1.13, build-date=2026-01-12T23:07:30Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.5, release=1766032510, vendor=Red Hat, Inc., tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, architecture=x86_64, batch=17.1_20260112.1, com.redhat.component=openstack-ceilometer-ipmi-container, name=rhosp-rhel9/openstack-ceilometer-ipmi)
Feb 23 08:44:43 np0005626463.localdomain podman[98671]: 2026-02-23 08:44:43.197490315 +0000 UTC m=+0.352062697 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, container_name=nova_compute, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.5, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., version=17.1.13, distribution-scope=public, name=rhosp-rhel9/openstack-nova-compute, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 
'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1)
Feb 23 08:44:43 np0005626463.localdomain podman[98667]: 2026-02-23 08:44:43.201189752 +0000 UTC m=+0.365552045 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, vendor=Red Hat, Inc., release=1766032510, build-date=2026-01-12T23:07:30Z, url=https://www.redhat.com, tcib_managed=true, architecture=x86_64, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13, io.buildah.version=1.41.5, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, managed_by=tripleo_ansible, 
container_name=ceilometer_agent_ipmi, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, name=rhosp-rhel9/openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container)
Feb 23 08:44:43 np0005626463.localdomain systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully.
Feb 23 08:44:43 np0005626463.localdomain podman[98671]: 2026-02-23 08:44:43.229860515 +0000 UTC m=+0.384432907 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, container_name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, io.openshift.expose-services=, config_id=tripleo_step5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, name=rhosp-rhel9/openstack-nova-compute, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, org.opencontainers.image.created=2026-01-12T23:32:04Z, build-date=2026-01-12T23:32:04Z, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe)
Feb 23 08:44:43 np0005626463.localdomain systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Deactivated successfully.
Feb 23 08:44:43 np0005626463.localdomain sshd[98782]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:44:44 np0005626463.localdomain sshd[98782]: Invalid user 12345 from 80.94.95.116 port 63664
Feb 23 08:44:45 np0005626463.localdomain sshd[98782]: Connection closed by invalid user 12345 80.94.95.116 port 63664 [preauth]
Feb 23 08:44:46 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.
Feb 23 08:44:46 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.
Feb 23 08:44:46 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.
Feb 23 08:44:46 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.
Feb 23 08:44:46 np0005626463.localdomain podman[98784]: 2026-02-23 08:44:46.92069784 +0000 UTC m=+0.090641293 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, url=https://www.redhat.com, config_id=tripleo_step4, 
container_name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, managed_by=tripleo_ansible, architecture=x86_64, io.buildah.version=1.41.5, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, batch=17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vcs-type=git, name=rhosp-rhel9/openstack-nova-compute, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 23 08:44:46 np0005626463.localdomain systemd[1]: tmp-crun.yY256z.mount: Deactivated successfully.
Feb 23 08:44:46 np0005626463.localdomain podman[98785]: 2026-02-23 08:44:46.98200868 +0000 UTC m=+0.147789361 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1766032510, managed_by=tripleo_ansible, version=17.1.13, vendor=Red Hat, Inc., io.buildah.version=1.41.5, url=https://www.redhat.com, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, name=rhosp-rhel9/openstack-ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, container_name=ovn_controller, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vcs-type=git, tcib_managed=true, io.openshift.expose-services=, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:36:40Z, architecture=x86_64, build-date=2026-01-12T22:36:40Z, io.k8s.display-name=Red Hat 
OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 23 08:44:47 np0005626463.localdomain podman[98787]: 2026-02-23 08:44:47.027222738 +0000 UTC m=+0.187280967 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, config_id=tripleo_step1, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, architecture=x86_64, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:14Z, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, name=rhosp-rhel9/openstack-qdrouterd, build-date=2026-01-12T22:10:14Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee)
Feb 23 08:44:47 np0005626463.localdomain podman[98785]: 2026-02-23 08:44:47.050409896 +0000 UTC m=+0.216190587 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, distribution-scope=public, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.component=openstack-ovn-controller-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, container_name=ovn_controller, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, release=1766032510, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:36:40Z, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, architecture=x86_64, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, config_id=tripleo_step4, build-date=2026-01-12T22:36:40Z, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., io.openshift.expose-services=)
Feb 23 08:44:47 np0005626463.localdomain podman[98785]: unhealthy
Feb 23 08:44:47 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Main process exited, code=exited, status=1/FAILURE
Feb 23 08:44:47 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Failed with result 'exit-code'.
Feb 23 08:44:47 np0005626463.localdomain podman[98786]: 2026-02-23 08:44:47.137906678 +0000 UTC m=+0.300771466 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, tcib_managed=true, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, managed_by=tripleo_ansible, batch=17.1_20260112.1, build-date=2026-01-12T22:56:19Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, 
org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=ovn_metadata_agent, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, architecture=x86_64, config_id=tripleo_step4, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, release=1766032510)
Feb 23 08:44:47 np0005626463.localdomain podman[98786]: 2026-02-23 08:44:47.156575701 +0000 UTC m=+0.319440519 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, version=17.1.13, tcib_managed=true, batch=17.1_20260112.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, vcs-type=git, container_name=ovn_metadata_agent, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:56:19Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.expose-services=, release=1766032510, url=https://www.redhat.com, managed_by=tripleo_ansible, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z)
Feb 23 08:44:47 np0005626463.localdomain podman[98786]: unhealthy
Feb 23 08:44:47 np0005626463.localdomain systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Main process exited, code=exited, status=1/FAILURE
Feb 23 08:44:47 np0005626463.localdomain systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Failed with result 'exit-code'.
Feb 23 08:44:47 np0005626463.localdomain podman[98787]: 2026-02-23 08:44:47.247245265 +0000 UTC m=+0.407303454 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, release=1766032510, managed_by=tripleo_ansible, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.created=2026-01-12T22:10:14Z, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, maintainer=OpenStack TripleO Team, vcs-type=git, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, batch=17.1_20260112.1, build-date=2026-01-12T22:10:14Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, url=https://www.redhat.com, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., config_id=tripleo_step1)
Feb 23 08:44:47 np0005626463.localdomain systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully.
Feb 23 08:44:47 np0005626463.localdomain podman[98784]: 2026-02-23 08:44:47.337433523 +0000 UTC m=+0.507376986 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step4, container_name=nova_migration_target, architecture=x86_64, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.13, batch=17.1_20260112.1, io.buildah.version=1.41.5, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team)
Feb 23 08:44:47 np0005626463.localdomain systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully.
Feb 23 08:44:47 np0005626463.localdomain systemd[1]: tmp-crun.Dhens2.mount: Deactivated successfully.
Feb 23 08:44:48 np0005626463.localdomain sudo[98876]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 08:44:48 np0005626463.localdomain sudo[98876]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:44:48 np0005626463.localdomain sudo[98876]: pam_unix(sudo:session): session closed for user root
Feb 23 08:44:48 np0005626463.localdomain sudo[98891]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 23 08:44:48 np0005626463.localdomain sudo[98891]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:44:49 np0005626463.localdomain sudo[98891]: pam_unix(sudo:session): session closed for user root
Feb 23 08:44:49 np0005626463.localdomain sudo[98939]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 08:44:49 np0005626463.localdomain sudo[98939]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:44:49 np0005626463.localdomain sudo[98939]: pam_unix(sudo:session): session closed for user root
Feb 23 08:45:07 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.
Feb 23 08:45:07 np0005626463.localdomain podman[98954]: 2026-02-23 08:45:07.938950875 +0000 UTC m=+0.106402305 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., build-date=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', 
'/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.41.5, batch=17.1_20260112.1, container_name=collectd, com.redhat.component=openstack-collectd-container, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, distribution-scope=public, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:10:15Z, release=1766032510, url=https://www.redhat.com, name=rhosp-rhel9/openstack-collectd, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, architecture=x86_64, config_id=tripleo_step3)
Feb 23 08:45:07 np0005626463.localdomain podman[98954]: 2026-02-23 08:45:07.975934691 +0000 UTC m=+0.143386121 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.buildah.version=1.41.5, container_name=collectd, batch=17.1_20260112.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:10:15Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, distribution-scope=public, 
name=rhosp-rhel9/openstack-collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, com.redhat.component=openstack-collectd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, vcs-type=git, build-date=2026-01-12T22:10:15Z, version=17.1.13, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd)
Feb 23 08:45:07 np0005626463.localdomain systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully.
Feb 23 08:45:10 np0005626463.localdomain sshd[98973]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:45:10 np0005626463.localdomain sshd[98973]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:45:13 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.
Feb 23 08:45:13 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.
Feb 23 08:45:13 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.
Feb 23 08:45:13 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.
Feb 23 08:45:13 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.
Feb 23 08:45:13 np0005626463.localdomain systemd[1]: tmp-crun.GZ1t3m.mount: Deactivated successfully.
Feb 23 08:45:13 np0005626463.localdomain podman[98976]: 2026-02-23 08:45:13.921905455 +0000 UTC m=+0.091980507 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, com.redhat.component=openstack-ceilometer-compute-container, org.opencontainers.image.created=2026-01-12T23:07:47Z, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:07:47Z, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, io.openshift.expose-services=, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_compute, architecture=x86_64, version=17.1.13, release=1766032510, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, name=rhosp-rhel9/openstack-ceilometer-compute, batch=17.1_20260112.1, config_id=tripleo_step4, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-type=git)
Feb 23 08:45:13 np0005626463.localdomain systemd[1]: tmp-crun.aY5ypL.mount: Deactivated successfully.
Feb 23 08:45:13 np0005626463.localdomain podman[98978]: 2026-02-23 08:45:13.950677119 +0000 UTC m=+0.110353810 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-cron-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, batch=17.1_20260112.1, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, build-date=2026-01-12T22:10:15Z, vendor=Red Hat, Inc., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-cron, 
vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 cron, release=1766032510, container_name=logrotate_crond, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, url=https://www.redhat.com, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 23 08:45:13 np0005626463.localdomain podman[98976]: 2026-02-23 08:45:13.980288752 +0000 UTC m=+0.150363794 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, name=rhosp-rhel9/openstack-ceilometer-compute, url=https://www.redhat.com, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64, release=1766032510, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_id=tripleo_step4, distribution-scope=public, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, container_name=ceilometer_agent_compute, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T23:07:47Z, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, version=17.1.13, io.openshift.expose-services=, tcib_managed=true, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:47Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible)
Feb 23 08:45:13 np0005626463.localdomain systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully.
Feb 23 08:45:14 np0005626463.localdomain podman[98975]: 2026-02-23 08:45:13.981175509 +0000 UTC m=+0.152146559 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.buildah.version=1.41.5, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, com.redhat.component=openstack-iscsid-container, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, config_id=tripleo_step3, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, tcib_managed=true, url=https://www.redhat.com, io.openshift.expose-services=, vcs-type=git, architecture=x86_64, 
batch=17.1_20260112.1, release=1766032510, name=rhosp-rhel9/openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:34:43Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, distribution-scope=public, container_name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc.)
Feb 23 08:45:14 np0005626463.localdomain podman[98977]: 2026-02-23 08:45:14.037350776 +0000 UTC m=+0.200225129 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.buildah.version=1.41.5, com.redhat.component=openstack-ceilometer-ipmi-container, build-date=2026-01-12T23:07:30Z, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1766032510, url=https://www.redhat.com, architecture=x86_64, container_name=ceilometer_agent_ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.expose-services=, distribution-scope=public, name=rhosp-rhel9/openstack-ceilometer-ipmi, managed_by=tripleo_ansible, version=17.1.13, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, tcib_managed=true, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06)
Feb 23 08:45:14 np0005626463.localdomain podman[98989]: 2026-02-23 08:45:14.09031844 +0000 UTC m=+0.242998659 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, managed_by=tripleo_ansible, config_id=tripleo_step5, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-compute, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.expose-services=, io.buildah.version=1.41.5, container_name=nova_compute, release=1766032510, build-date=2026-01-12T23:32:04Z, vendor=Red Hat, Inc., tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.created=2026-01-12T23:32:04Z, version=17.1.13)
Feb 23 08:45:14 np0005626463.localdomain podman[98975]: 2026-02-23 08:45:14.115836452 +0000 UTC m=+0.286807572 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, distribution-scope=public, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.openshift.expose-services=, batch=17.1_20260112.1, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, container_name=iscsid, vcs-type=git, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=705339545363fec600102567c4e923938e0f43b3, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, release=1766032510, build-date=2026-01-12T22:34:43Z, description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.component=openstack-iscsid-container, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 23 08:45:14 np0005626463.localdomain systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully.
Feb 23 08:45:14 np0005626463.localdomain podman[98978]: 2026-02-23 08:45:14.139299068 +0000 UTC m=+0.298975809 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.buildah.version=1.41.5, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-cron, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, tcib_managed=true, config_id=tripleo_step4, version=17.1.13, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=logrotate_crond, 
release=1766032510, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., com.redhat.component=openstack-cron-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, architecture=x86_64, build-date=2026-01-12T22:10:15Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 cron)
Feb 23 08:45:14 np0005626463.localdomain podman[98989]: 2026-02-23 08:45:14.150265326 +0000 UTC m=+0.302945505 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.13, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, distribution-scope=public, release=1766032510, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, container_name=nova_compute, architecture=x86_64, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.expose-services=, managed_by=tripleo_ansible, config_id=tripleo_step5, name=rhosp-rhel9/openstack-nova-compute)
Feb 23 08:45:14 np0005626463.localdomain systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully.
Feb 23 08:45:14 np0005626463.localdomain systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Deactivated successfully.
Feb 23 08:45:14 np0005626463.localdomain podman[98977]: 2026-02-23 08:45:14.170771779 +0000 UTC m=+0.333646152 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., batch=17.1_20260112.1, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.expose-services=, release=1766032510, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, 
build-date=2026-01-12T23:07:30Z, name=rhosp-rhel9/openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, distribution-scope=public, container_name=ceilometer_agent_ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4)
Feb 23 08:45:14 np0005626463.localdomain systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully.
Feb 23 08:45:17 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.
Feb 23 08:45:17 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.
Feb 23 08:45:17 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.
Feb 23 08:45:17 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.
Feb 23 08:45:17 np0005626463.localdomain systemd[1]: tmp-crun.PtZjaP.mount: Deactivated successfully.
Feb 23 08:45:17 np0005626463.localdomain podman[99093]: 2026-02-23 08:45:17.938032423 +0000 UTC m=+0.103635107 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, name=rhosp-rhel9/openstack-ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:36:40Z, description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, tcib_managed=true, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, config_id=tripleo_step4, com.redhat.component=openstack-ovn-controller-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, build-date=2026-01-12T22:36:40Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, io.openshift.expose-services=, distribution-scope=public, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': 
['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, release=1766032510)
Feb 23 08:45:18 np0005626463.localdomain podman[99094]: 2026-02-23 08:45:17.8989779 +0000 UTC m=+0.065856734 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, container_name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:56:19Z, config_id=tripleo_step4, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, io.openshift.expose-services=, build-date=2026-01-12T22:56:19Z, release=1766032510, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-type=git, vendor=Red Hat, Inc., vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5)
Feb 23 08:45:18 np0005626463.localdomain podman[99092]: 2026-02-23 08:45:18.00492655 +0000 UTC m=+0.174963775 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, build-date=2026-01-12T23:32:04Z, batch=17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, 
com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, name=rhosp-rhel9/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, io.buildah.version=1.41.5, tcib_managed=true, vendor=Red Hat, Inc., managed_by=tripleo_ansible, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team)
Feb 23 08:45:18 np0005626463.localdomain podman[99096]: 2026-02-23 08:45:18.012710028 +0000 UTC m=+0.173757827 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, tcib_managed=true, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, 
build-date=2026-01-12T22:10:14Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, vcs-type=git, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, config_id=tripleo_step1, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com)
Feb 23 08:45:18 np0005626463.localdomain podman[99093]: 2026-02-23 08:45:18.027186598 +0000 UTC m=+0.192789252 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, name=rhosp-rhel9/openstack-ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:36:40Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.component=openstack-ovn-controller-container, architecture=x86_64, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vendor=Red Hat, Inc., version=17.1.13, config_id=tripleo_step4, url=https://www.redhat.com, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, release=1766032510, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, io.k8s.display-name=Red Hat 
OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, container_name=ovn_controller, managed_by=tripleo_ansible)
Feb 23 08:45:18 np0005626463.localdomain podman[99093]: unhealthy
Feb 23 08:45:18 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Main process exited, code=exited, status=1/FAILURE
Feb 23 08:45:18 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Failed with result 'exit-code'.
Feb 23 08:45:18 np0005626463.localdomain podman[99094]: 2026-02-23 08:45:18.082853539 +0000 UTC m=+0.249732373 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, vendor=Red Hat, Inc., version=17.1.13, io.openshift.expose-services=, batch=17.1_20260112.1, build-date=2026-01-12T22:56:19Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, tcib_managed=true, container_name=ovn_metadata_agent, distribution-scope=public, config_id=tripleo_step4, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, release=1766032510, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f)
Feb 23 08:45:18 np0005626463.localdomain podman[99094]: unhealthy
Feb 23 08:45:18 np0005626463.localdomain systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Main process exited, code=exited, status=1/FAILURE
Feb 23 08:45:18 np0005626463.localdomain systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Failed with result 'exit-code'.
Feb 23 08:45:18 np0005626463.localdomain sshd[99178]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:45:18 np0005626463.localdomain podman[99096]: 2026-02-23 08:45:18.203266398 +0000 UTC m=+0.364314187 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, architecture=x86_64, version=17.1.13, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc., release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, 
org.opencontainers.image.created=2026-01-12T22:10:14Z, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, url=https://www.redhat.com, container_name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-qdrouterd, tcib_managed=true, batch=17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public)
Feb 23 08:45:18 np0005626463.localdomain systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully.
Feb 23 08:45:18 np0005626463.localdomain podman[99092]: 2026-02-23 08:45:18.369401351 +0000 UTC m=+0.539438606 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, tcib_managed=true, maintainer=OpenStack TripleO Team, version=17.1.13, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, release=1766032510, io.openshift.expose-services=, build-date=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-type=git, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-nova-compute, container_name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 23 08:45:18 np0005626463.localdomain systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully.
Feb 23 08:45:18 np0005626463.localdomain sshd[99178]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:45:37 np0005626463.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Feb 23 08:45:37 np0005626463.localdomain recover_tripleo_nova_virtqemud[99182]: 61982
Feb 23 08:45:37 np0005626463.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Feb 23 08:45:37 np0005626463.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Feb 23 08:45:38 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.
Feb 23 08:45:38 np0005626463.localdomain podman[99183]: 2026-02-23 08:45:38.916707629 +0000 UTC m=+0.090360130 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, distribution-scope=public, name=rhosp-rhel9/openstack-collectd, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, io.openshift.expose-services=, vendor=Red Hat, Inc., architecture=x86_64, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, container_name=collectd, 
vcs-type=git, tcib_managed=true, config_id=tripleo_step3, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, build-date=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, release=1766032510, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, summary=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 23 08:45:38 np0005626463.localdomain podman[99183]: 2026-02-23 08:45:38.926408493 +0000 UTC m=+0.100060964 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.openshift.expose-services=, vendor=Red Hat, Inc., managed_by=tripleo_ansible, com.redhat.component=openstack-collectd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, tcib_managed=true, release=1766032510, build-date=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, container_name=collectd, maintainer=OpenStack TripleO Team, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, config_id=tripleo_step3, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 23 08:45:38 np0005626463.localdomain systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully.
Feb 23 08:45:44 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.
Feb 23 08:45:44 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.
Feb 23 08:45:44 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.
Feb 23 08:45:44 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.
Feb 23 08:45:44 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.
Feb 23 08:45:44 np0005626463.localdomain systemd[1]: tmp-crun.LZgVpk.mount: Deactivated successfully.
Feb 23 08:45:44 np0005626463.localdomain podman[99204]: 2026-02-23 08:45:44.983643962 +0000 UTC m=+0.152399440 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, org.opencontainers.image.created=2026-01-12T22:34:43Z, release=1766032510, tcib_managed=true, vcs-ref=705339545363fec600102567c4e923938e0f43b3, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step3, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, 
container_name=iscsid, architecture=x86_64, version=17.1.13, description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2026-01-12T22:34:43Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.component=openstack-iscsid-container, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, vcs-type=git, batch=17.1_20260112.1, io.buildah.version=1.41.5)
Feb 23 08:45:45 np0005626463.localdomain podman[99205]: 2026-02-23 08:45:44.94984074 +0000 UTC m=+0.115913544 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, build-date=2026-01-12T23:07:47Z, name=rhosp-rhel9/openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, io.openshift.expose-services=, url=https://www.redhat.com, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-ceilometer-compute-container, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T23:07:47Z, batch=17.1_20260112.1, container_name=ceilometer_agent_compute, vendor=Red Hat, Inc.)
Feb 23 08:45:45 np0005626463.localdomain podman[99205]: 2026-02-23 08:45:45.028999257 +0000 UTC m=+0.195072021 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.5, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, architecture=x86_64, container_name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, batch=17.1_20260112.1, managed_by=tripleo_ansible, release=1766032510, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, tcib_managed=true, io.openshift.expose-services=, build-date=2026-01-12T23:07:47Z, name=rhosp-rhel9/openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13)
Feb 23 08:45:45 np0005626463.localdomain podman[99206]: 2026-02-23 08:45:45.036894435 +0000 UTC m=+0.198799657 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vendor=Red Hat, Inc., managed_by=tripleo_ansible, url=https://www.redhat.com, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1, io.buildah.version=1.41.5, tcib_managed=true, container_name=ceilometer_agent_ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64, name=rhosp-rhel9/openstack-ceilometer-ipmi, distribution-scope=public, release=1766032510, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, build-date=2026-01-12T23:07:30Z, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git)
Feb 23 08:45:45 np0005626463.localdomain systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully.
Feb 23 08:45:45 np0005626463.localdomain podman[99210]: 2026-02-23 08:45:45.092183161 +0000 UTC m=+0.248513378 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, summary=Red Hat OpenStack Platform 17.1 cron, name=rhosp-rhel9/openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:10:15Z, com.redhat.component=openstack-cron-container, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, 
distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, container_name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, config_id=tripleo_step4, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 23 08:45:45 np0005626463.localdomain podman[99206]: 2026-02-23 08:45:45.145760055 +0000 UTC m=+0.307665277 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, architecture=x86_64, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git, org.opencontainers.image.created=2026-01-12T23:07:30Z, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.5, com.redhat.component=openstack-ceilometer-ipmi-container, vendor=Red Hat, Inc., build-date=2026-01-12T23:07:30Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_ipmi, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20260112.1, io.openshift.expose-services=, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1766032510, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 23 08:45:45 np0005626463.localdomain systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully.
Feb 23 08:45:45 np0005626463.localdomain podman[99204]: 2026-02-23 08:45:45.172360301 +0000 UTC m=+0.341115809 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, release=1766032510, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.buildah.version=1.41.5, container_name=iscsid, architecture=x86_64, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, batch=17.1_20260112.1, managed_by=tripleo_ansible, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, com.redhat.component=openstack-iscsid-container, tcib_managed=true, build-date=2026-01-12T22:34:43Z, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid)
Feb 23 08:45:45 np0005626463.localdomain podman[99210]: 2026-02-23 08:45:45.181331362 +0000 UTC m=+0.337661589 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, url=https://www.redhat.com, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, vendor=Red Hat, Inc., config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers:/var/log/containers:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, build-date=2026-01-12T22:10:15Z, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, container_name=logrotate_crond, name=rhosp-rhel9/openstack-cron, vcs-type=git, com.redhat.component=openstack-cron-container, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 23 08:45:45 np0005626463.localdomain systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully.
Feb 23 08:45:45 np0005626463.localdomain systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully.
Feb 23 08:45:45 np0005626463.localdomain podman[99218]: 2026-02-23 08:45:45.151187055 +0000 UTC m=+0.302200625 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, container_name=nova_compute, maintainer=OpenStack TripleO Team, version=17.1.13, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, build-date=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-nova-compute, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, config_id=tripleo_step5, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': 
['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, tcib_managed=true, release=1766032510, architecture=x86_64)
Feb 23 08:45:45 np0005626463.localdomain podman[99218]: 2026-02-23 08:45:45.231030674 +0000 UTC m=+0.382044204 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', 
'/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., io.openshift.expose-services=, build-date=2026-01-12T23:32:04Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, io.buildah.version=1.41.5, version=17.1.13, vcs-type=git, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, release=1766032510, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-compute, url=https://www.redhat.com, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, container_name=nova_compute, tcib_managed=true)
Feb 23 08:45:45 np0005626463.localdomain systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Deactivated successfully.
Feb 23 08:45:45 np0005626463.localdomain systemd[1]: tmp-crun.0NbyUE.mount: Deactivated successfully.
Feb 23 08:45:48 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.
Feb 23 08:45:48 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.
Feb 23 08:45:48 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.
Feb 23 08:45:48 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.
Feb 23 08:45:48 np0005626463.localdomain systemd[1]: tmp-crun.GQeuwf.mount: Deactivated successfully.
Feb 23 08:45:48 np0005626463.localdomain podman[99318]: 2026-02-23 08:45:48.924009411 +0000 UTC m=+0.096418600 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, distribution-scope=public, url=https://www.redhat.com, release=1766032510, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, config_id=tripleo_step4, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:32:04Z, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, container_name=nova_migration_target, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, architecture=x86_64)
Feb 23 08:45:48 np0005626463.localdomain podman[99320]: 2026-02-23 08:45:48.974117575 +0000 UTC m=+0.141414644 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, distribution-scope=public, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, batch=17.1_20260112.1, build-date=2026-01-12T22:56:19Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, cpe=cpe:/a:redhat:openstack:17.1::el9, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, managed_by=tripleo_ansible, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.created=2026-01-12T22:56:19Z, config_id=tripleo_step4, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, container_name=ovn_metadata_agent, io.buildah.version=1.41.5, tcib_managed=true)
Feb 23 08:45:49 np0005626463.localdomain podman[99319]: 2026-02-23 08:45:49.019201181 +0000 UTC m=+0.187447880 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., config_id=tripleo_step4, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, distribution-scope=public, name=rhosp-rhel9/openstack-ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, 
io.openshift.expose-services=, maintainer=OpenStack TripleO Team, container_name=ovn_controller, io.buildah.version=1.41.5, managed_by=tripleo_ansible, batch=17.1_20260112.1, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, build-date=2026-01-12T22:36:40Z)
Feb 23 08:45:49 np0005626463.localdomain podman[99320]: 2026-02-23 08:45:49.044744034 +0000 UTC m=+0.212041143 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.buildah.version=1.41.5, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, release=1766032510, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, vcs-type=git, url=https://www.redhat.com, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, vendor=Red Hat, Inc., container_name=ovn_metadata_agent, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, build-date=2026-01-12T22:56:19Z, version=17.1.13)
Feb 23 08:45:49 np0005626463.localdomain podman[99320]: unhealthy
Feb 23 08:45:49 np0005626463.localdomain systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Main process exited, code=exited, status=1/FAILURE
Feb 23 08:45:49 np0005626463.localdomain systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Failed with result 'exit-code'.
Feb 23 08:45:49 np0005626463.localdomain podman[99319]: 2026-02-23 08:45:49.061294364 +0000 UTC m=+0.229541033 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_controller, vcs-type=git, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, build-date=2026-01-12T22:36:40Z, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, tcib_managed=true, 
org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.openshift.expose-services=, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:36:40Z, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-ovn-controller)
Feb 23 08:45:49 np0005626463.localdomain podman[99319]: unhealthy
Feb 23 08:45:49 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Main process exited, code=exited, status=1/FAILURE
Feb 23 08:45:49 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Failed with result 'exit-code'.
Feb 23 08:45:49 np0005626463.localdomain podman[99321]: 2026-02-23 08:45:49.130920381 +0000 UTC m=+0.294128362 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, container_name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, vcs-type=git, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, release=1766032510, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, name=rhosp-rhel9/openstack-qdrouterd, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:14Z, build-date=2026-01-12T22:10:14Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, url=https://www.redhat.com, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true)
Feb 23 08:45:49 np0005626463.localdomain podman[99321]: 2026-02-23 08:45:49.331317478 +0000 UTC m=+0.494525519 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, url=https://www.redhat.com, version=17.1.13, container_name=metrics_qdr, name=rhosp-rhel9/openstack-qdrouterd, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, tcib_managed=true, batch=17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:14Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step1, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']})
Feb 23 08:45:49 np0005626463.localdomain podman[99318]: 2026-02-23 08:45:49.341350443 +0000 UTC m=+0.513759662 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-nova-compute, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, url=https://www.redhat.com, release=1766032510, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:32:04Z, 
io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, distribution-scope=public, tcib_managed=true, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2026-01-12T23:32:04Z, architecture=x86_64, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 23 08:45:49 np0005626463.localdomain systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully.
Feb 23 08:45:49 np0005626463.localdomain systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully.
Feb 23 08:45:50 np0005626463.localdomain sudo[99405]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 08:45:50 np0005626463.localdomain sudo[99405]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:45:50 np0005626463.localdomain sudo[99405]: pam_unix(sudo:session): session closed for user root
Feb 23 08:45:50 np0005626463.localdomain sudo[99420]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 23 08:45:50 np0005626463.localdomain sudo[99420]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:45:50 np0005626463.localdomain sudo[99420]: pam_unix(sudo:session): session closed for user root
Feb 23 08:45:51 np0005626463.localdomain sudo[99466]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 08:45:51 np0005626463.localdomain sudo[99466]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:45:51 np0005626463.localdomain sudo[99466]: pam_unix(sudo:session): session closed for user root
Feb 23 08:45:53 np0005626463.localdomain sshd[99481]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:45:54 np0005626463.localdomain sshd[99481]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:45:58 np0005626463.localdomain sshd[99483]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:45:58 np0005626463.localdomain sshd[99483]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:46:09 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.
Feb 23 08:46:09 np0005626463.localdomain systemd[1]: tmp-crun.PxAnur.mount: Deactivated successfully.
Feb 23 08:46:09 np0005626463.localdomain podman[99485]: 2026-02-23 08:46:09.934636815 +0000 UTC m=+0.098085433 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, release=1766032510, build-date=2026-01-12T22:10:15Z, io.buildah.version=1.41.5, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20260112.1, config_id=tripleo_step3, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-collectd-container, vcs-type=git, version=17.1.13, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, name=rhosp-rhel9/openstack-collectd, architecture=x86_64, container_name=collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']})
Feb 23 08:46:09 np0005626463.localdomain podman[99485]: 2026-02-23 08:46:09.950239556 +0000 UTC m=+0.113688214 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-collectd, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, release=1766032510, version=17.1.13, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, url=https://www.redhat.com, vcs-type=git, config_id=tripleo_step3, vendor=Red Hat, Inc., io.openshift.expose-services=, com.redhat.component=openstack-collectd-container)
Feb 23 08:46:09 np0005626463.localdomain systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully.
Feb 23 08:46:15 np0005626463.localdomain rhsm-service[6591]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Feb 23 08:46:15 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.
Feb 23 08:46:15 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.
Feb 23 08:46:15 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.
Feb 23 08:46:15 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.
Feb 23 08:46:15 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.
Feb 23 08:46:15 np0005626463.localdomain podman[99684]: 2026-02-23 08:46:15.929608837 +0000 UTC m=+0.095586825 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, batch=17.1_20260112.1, container_name=ceilometer_agent_compute, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2026-01-12T23:07:47Z, io.buildah.version=1.41.5, release=1766032510, org.opencontainers.image.created=2026-01-12T23:07:47Z, url=https://www.redhat.com, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 23 08:46:15 np0005626463.localdomain podman[99684]: 2026-02-23 08:46:15.968044254 +0000 UTC m=+0.134022162 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, name=rhosp-rhel9/openstack-ceilometer-compute, release=1766032510, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T23:07:47Z, batch=17.1_20260112.1, container_name=ceilometer_agent_compute, io.buildah.version=1.41.5, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 
ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, url=https://www.redhat.com, distribution-scope=public, vendor=Red Hat, Inc., version=17.1.13, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2026-01-12T23:07:47Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_id=tripleo_step4)
Feb 23 08:46:15 np0005626463.localdomain podman[99686]: 2026-02-23 08:46:15.965480734 +0000 UTC m=+0.124735800 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., architecture=x86_64, version=17.1.13, io.openshift.expose-services=, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, build-date=2026-01-12T22:10:15Z, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, container_name=logrotate_crond, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 cron, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, name=rhosp-rhel9/openstack-cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, batch=17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee)
Feb 23 08:46:15 np0005626463.localdomain systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully.
Feb 23 08:46:16 np0005626463.localdomain podman[99683]: 2026-02-23 08:46:16.040940605 +0000 UTC m=+0.210157244 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, tcib_managed=true, managed_by=tripleo_ansible, io.openshift.expose-services=, url=https://www.redhat.com, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.buildah.version=1.41.5, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.component=openstack-iscsid-container, build-date=2026-01-12T22:34:43Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, release=1766032510, container_name=iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=705339545363fec600102567c4e923938e0f43b3, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']})
Feb 23 08:46:16 np0005626463.localdomain podman[99686]: 2026-02-23 08:46:16.055316566 +0000 UTC m=+0.214571652 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, name=rhosp-rhel9/openstack-cron, batch=17.1_20260112.1, distribution-scope=public, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, url=https://www.redhat.com, io.openshift.expose-services=, build-date=2026-01-12T22:10:15Z, container_name=logrotate_crond, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, release=1766032510, tcib_managed=true, com.redhat.component=openstack-cron-container)
Feb 23 08:46:16 np0005626463.localdomain systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully.
Feb 23 08:46:16 np0005626463.localdomain podman[99685]: 2026-02-23 08:46:16.133862474 +0000 UTC m=+0.297281431 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, config_id=tripleo_step4, tcib_managed=true, vcs-type=git, io.openshift.expose-services=, version=17.1.13, url=https://www.redhat.com, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, name=rhosp-rhel9/openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.created=2026-01-12T23:07:30Z, release=1766032510, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, distribution-scope=public, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi)
Feb 23 08:46:16 np0005626463.localdomain podman[99683]: 2026-02-23 08:46:16.153044487 +0000 UTC m=+0.322261116 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, build-date=2026-01-12T22:34:43Z, version=17.1.13, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid, tcib_managed=true, vcs-ref=705339545363fec600102567c4e923938e0f43b3, container_name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-iscsid-container, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, architecture=x86_64, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., managed_by=tripleo_ansible, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3)
Feb 23 08:46:16 np0005626463.localdomain systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully.
Feb 23 08:46:16 np0005626463.localdomain podman[99685]: 2026-02-23 08:46:16.197281157 +0000 UTC m=+0.360700084 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13, name=rhosp-rhel9/openstack-ceilometer-ipmi, config_id=tripleo_step4, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vendor=Red Hat, Inc., vcs-type=git, architecture=x86_64, com.redhat.component=openstack-ceilometer-ipmi-container, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, maintainer=OpenStack TripleO Team, release=1766032510, io.buildah.version=1.41.5, distribution-scope=public)
Feb 23 08:46:16 np0005626463.localdomain systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully.
Feb 23 08:46:16 np0005626463.localdomain podman[99692]: 2026-02-23 08:46:16.248911058 +0000 UTC m=+0.405729158 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, name=rhosp-rhel9/openstack-nova-compute, io.buildah.version=1.41.5, build-date=2026-01-12T23:32:04Z, release=1766032510, container_name=nova_compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step5, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, org.opencontainers.image.created=2026-01-12T23:32:04Z, tcib_managed=true, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, architecture=x86_64, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe)
Feb 23 08:46:16 np0005626463.localdomain podman[99692]: 2026-02-23 08:46:16.280293244 +0000 UTC m=+0.437111304 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-nova-compute, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, architecture=x86_64, batch=17.1_20260112.1, config_id=tripleo_step5, version=17.1.13, managed_by=tripleo_ansible, release=1766032510, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.5, container_name=nova_compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T23:32:04Z, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 23 08:46:16 np0005626463.localdomain systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Deactivated successfully.
Feb 23 08:46:16 np0005626463.localdomain systemd[1]: tmp-crun.R7bTyC.mount: Deactivated successfully.
Feb 23 08:46:20 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.
Feb 23 08:46:20 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.
Feb 23 08:46:20 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.
Feb 23 08:46:20 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.
Feb 23 08:46:20 np0005626463.localdomain systemd[1]: tmp-crun.zNZV86.mount: Deactivated successfully.
Feb 23 08:46:20 np0005626463.localdomain podman[99803]: 2026-02-23 08:46:20.210098063 +0000 UTC m=+0.053030530 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, vcs-type=git, container_name=ovn_controller, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-ovn-controller, distribution-scope=public, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1766032510, com.redhat.component=openstack-ovn-controller-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, build-date=2026-01-12T22:36:40Z, config_id=tripleo_step4, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, architecture=x86_64, io.openshift.expose-services=, tcib_managed=true)
Feb 23 08:46:20 np0005626463.localdomain podman[99803]: 2026-02-23 08:46:20.228032188 +0000 UTC m=+0.070964635 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:36:40Z, vcs-type=git, batch=17.1_20260112.1, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-ovn-controller, tcib_managed=true, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, architecture=x86_64, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, release=1766032510, container_name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 23 08:46:20 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Deactivated successfully.
Feb 23 08:46:20 np0005626463.localdomain systemd[1]: tmp-crun.p9Mrkl.mount: Deactivated successfully.
Feb 23 08:46:20 np0005626463.localdomain podman[99802]: 2026-02-23 08:46:20.261921384 +0000 UTC m=+0.104927153 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step4, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, vendor=Red Hat, Inc., managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T23:32:04Z, io.openshift.expose-services=, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, name=rhosp-rhel9/openstack-nova-compute, maintainer=OpenStack TripleO Team, vcs-type=git, distribution-scope=public, tcib_managed=true)
Feb 23 08:46:20 np0005626463.localdomain podman[99805]: 2026-02-23 08:46:20.312964621 +0000 UTC m=+0.150726365 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vcs-type=git, batch=17.1_20260112.1, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step1, name=rhosp-rhel9/openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, tcib_managed=true, io.buildah.version=1.41.5, build-date=2026-01-12T22:10:14Z, release=1766032510, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, architecture=x86_64, distribution-scope=public, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, url=https://www.redhat.com)
Feb 23 08:46:20 np0005626463.localdomain podman[99804]: 2026-02-23 08:46:20.363918314 +0000 UTC m=+0.203193225 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, release=1766032510, io.openshift.expose-services=, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:56:19Z, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, architecture=x86_64, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., version=17.1.13, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:56:19Z, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=ovn_metadata_agent, config_id=tripleo_step4, vcs-type=git, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn)
Feb 23 08:46:20 np0005626463.localdomain podman[99804]: 2026-02-23 08:46:20.421225088 +0000 UTC m=+0.260500009 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, architecture=x86_64, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, tcib_managed=true, config_id=tripleo_step4, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, managed_by=tripleo_ansible, batch=17.1_20260112.1, container_name=ovn_metadata_agent, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:56:19Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, vendor=Red Hat, Inc.)
Feb 23 08:46:20 np0005626463.localdomain systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Deactivated successfully.
Feb 23 08:46:20 np0005626463.localdomain podman[99805]: 2026-02-23 08:46:20.470184138 +0000 UTC m=+0.307945882 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, managed_by=tripleo_ansible, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, url=https://www.redhat.com, release=1766032510, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, io.openshift.expose-services=, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:10:14Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:10:14Z, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., com.redhat.component=openstack-qdrouterd-container, batch=17.1_20260112.1, config_id=tripleo_step1, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd)
Feb 23 08:46:20 np0005626463.localdomain systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully.
Feb 23 08:46:20 np0005626463.localdomain podman[99802]: 2026-02-23 08:46:20.609212754 +0000 UTC m=+0.452218523 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, io.openshift.expose-services=, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, name=rhosp-rhel9/openstack-nova-compute, container_name=nova_migration_target, build-date=2026-01-12T23:32:04Z, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, distribution-scope=public, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']})
Feb 23 08:46:20 np0005626463.localdomain systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully.
Feb 23 08:46:34 np0005626463.localdomain sshd[35794]: Received disconnect from 192.168.122.100 port 52882:11: disconnected by user
Feb 23 08:46:34 np0005626463.localdomain sshd[35794]: Disconnected from user tripleo-admin 192.168.122.100 port 52882
Feb 23 08:46:34 np0005626463.localdomain sshd[35774]: pam_unix(sshd:session): session closed for user tripleo-admin
Feb 23 08:46:34 np0005626463.localdomain systemd[1]: session-28.scope: Deactivated successfully.
Feb 23 08:46:34 np0005626463.localdomain systemd[1]: session-28.scope: Consumed 7min 16.051s CPU time.
Feb 23 08:46:34 np0005626463.localdomain systemd-logind[759]: Session 28 logged out. Waiting for processes to exit.
Feb 23 08:46:34 np0005626463.localdomain systemd-logind[759]: Removed session 28.
Feb 23 08:46:35 np0005626463.localdomain sshd[99902]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:46:39 np0005626463.localdomain sshd[99904]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:46:40 np0005626463.localdomain sshd[99904]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:46:40 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.
Feb 23 08:46:40 np0005626463.localdomain podman[99906]: 2026-02-23 08:46:40.537219425 +0000 UTC m=+0.100674334 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:10:15Z, url=https://www.redhat.com, version=17.1.13, com.redhat.component=openstack-collectd-container, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=collectd, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:10:15Z, release=1766032510, config_id=tripleo_step3, io.buildah.version=1.41.5, distribution-scope=public, batch=17.1_20260112.1, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vcs-type=git, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, name=rhosp-rhel9/openstack-collectd)
Feb 23 08:46:40 np0005626463.localdomain podman[99906]: 2026-02-23 08:46:40.578357878 +0000 UTC m=+0.141812767 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.expose-services=, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, batch=17.1_20260112.1, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, release=1766032510, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, container_name=collectd, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-collectd-container, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step3, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, build-date=2026-01-12T22:10:15Z, distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 collectd)
Feb 23 08:46:40 np0005626463.localdomain systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully.
Feb 23 08:46:42 np0005626463.localdomain sshd[99902]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:46:45 np0005626463.localdomain systemd[1]: Stopping User Manager for UID 1003...
Feb 23 08:46:45 np0005626463.localdomain systemd[35778]: Activating special unit Exit the Session...
Feb 23 08:46:45 np0005626463.localdomain systemd[35778]: Removed slice User Background Tasks Slice.
Feb 23 08:46:45 np0005626463.localdomain systemd[35778]: Stopped target Main User Target.
Feb 23 08:46:45 np0005626463.localdomain systemd[35778]: Stopped target Basic System.
Feb 23 08:46:45 np0005626463.localdomain systemd[35778]: Stopped target Paths.
Feb 23 08:46:45 np0005626463.localdomain systemd[35778]: Stopped target Sockets.
Feb 23 08:46:45 np0005626463.localdomain systemd[35778]: Stopped target Timers.
Feb 23 08:46:45 np0005626463.localdomain systemd[35778]: Stopped Mark boot as successful after the user session has run 2 minutes.
Feb 23 08:46:45 np0005626463.localdomain systemd[35778]: Stopped Daily Cleanup of User's Temporary Directories.
Feb 23 08:46:45 np0005626463.localdomain systemd[35778]: Closed D-Bus User Message Bus Socket.
Feb 23 08:46:45 np0005626463.localdomain systemd[35778]: Stopped Create User's Volatile Files and Directories.
Feb 23 08:46:45 np0005626463.localdomain systemd[35778]: Removed slice User Application Slice.
Feb 23 08:46:45 np0005626463.localdomain systemd[35778]: Reached target Shutdown.
Feb 23 08:46:45 np0005626463.localdomain systemd[35778]: Finished Exit the Session.
Feb 23 08:46:45 np0005626463.localdomain systemd[35778]: Reached target Exit the Session.
Feb 23 08:46:45 np0005626463.localdomain systemd[1]: user@1003.service: Deactivated successfully.
Feb 23 08:46:45 np0005626463.localdomain systemd[1]: Stopped User Manager for UID 1003.
Feb 23 08:46:45 np0005626463.localdomain systemd[1]: user@1003.service: Consumed 4.890s CPU time, read 0B from disk, written 7.0K to disk.
Feb 23 08:46:45 np0005626463.localdomain systemd[1]: Stopping User Runtime Directory /run/user/1003...
Feb 23 08:46:45 np0005626463.localdomain systemd[1]: run-user-1003.mount: Deactivated successfully.
Feb 23 08:46:45 np0005626463.localdomain systemd[1]: user-runtime-dir@1003.service: Deactivated successfully.
Feb 23 08:46:45 np0005626463.localdomain systemd[1]: Stopped User Runtime Directory /run/user/1003.
Feb 23 08:46:45 np0005626463.localdomain systemd[1]: Removed slice User Slice of UID 1003.
Feb 23 08:46:45 np0005626463.localdomain systemd[1]: user-1003.slice: Consumed 7min 20.971s CPU time.
Feb 23 08:46:46 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.
Feb 23 08:46:46 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.
Feb 23 08:46:46 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.
Feb 23 08:46:46 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.
Feb 23 08:46:46 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.
Feb 23 08:46:46 np0005626463.localdomain podman[99930]: 2026-02-23 08:46:46.946965056 +0000 UTC m=+0.110118881 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-compute-container, io.buildah.version=1.41.5, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T23:07:47Z, architecture=x86_64, vendor=Red Hat, Inc., tcib_managed=true, name=rhosp-rhel9/openstack-ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:07:47Z, version=17.1.13)
Feb 23 08:46:46 np0005626463.localdomain systemd[1]: tmp-crun.V5Z92e.mount: Deactivated successfully.
Feb 23 08:46:46 np0005626463.localdomain podman[99938]: 2026-02-23 08:46:46.994701116 +0000 UTC m=+0.148055502 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, name=rhosp-rhel9/openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:32:04Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, container_name=nova_compute, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, url=https://www.redhat.com, io.buildah.version=1.41.5, vcs-type=git, config_id=tripleo_step5, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe)
Feb 23 08:46:47 np0005626463.localdomain podman[99930]: 2026-02-23 08:46:47.000268951 +0000 UTC m=+0.163422746 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T23:07:47Z, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, container_name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, io.openshift.expose-services=, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, build-date=2026-01-12T23:07:47Z)
Feb 23 08:46:47 np0005626463.localdomain systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully.
Feb 23 08:46:47 np0005626463.localdomain podman[99938]: 2026-02-23 08:46:47.025829235 +0000 UTC m=+0.179183611 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, vcs-type=git, vendor=Red Hat, Inc., distribution-scope=public, batch=17.1_20260112.1, build-date=2026-01-12T23:32:04Z, version=17.1.13, maintainer=OpenStack TripleO Team, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.created=2026-01-12T23:32:04Z, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, name=rhosp-rhel9/openstack-nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=nova_compute, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step5, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 23 08:46:47 np0005626463.localdomain systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Deactivated successfully.
Feb 23 08:46:47 np0005626463.localdomain podman[99931]: 2026-02-23 08:46:47.043973674 +0000 UTC m=+0.204262298 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-ipmi-container, io.buildah.version=1.41.5, batch=17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, tcib_managed=true, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13, release=1766032510, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, name=rhosp-rhel9/openstack-ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 23 08:46:47 np0005626463.localdomain podman[99932]: 2026-02-23 08:46:47.098968892 +0000 UTC m=+0.255185218 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, release=1766032510, container_name=logrotate_crond, name=rhosp-rhel9/openstack-cron, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:15Z, tcib_managed=true, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., io.openshift.expose-services=, build-date=2026-01-12T22:10:15Z, managed_by=tripleo_ansible, vcs-type=git, architecture=x86_64, config_id=tripleo_step4)
Feb 23 08:46:47 np0005626463.localdomain podman[99932]: 2026-02-23 08:46:47.111431904 +0000 UTC m=+0.267648240 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, name=rhosp-rhel9/openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, vendor=Red Hat, Inc., io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-cron-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, io.openshift.expose-services=, release=1766032510, org.opencontainers.image.created=2026-01-12T22:10:15Z, architecture=x86_64, build-date=2026-01-12T22:10:15Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron)
Feb 23 08:46:47 np0005626463.localdomain systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully.
Feb 23 08:46:47 np0005626463.localdomain podman[99931]: 2026-02-23 08:46:47.127283342 +0000 UTC m=+0.287571956 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-ipmi-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, build-date=2026-01-12T23:07:30Z, container_name=ceilometer_agent_ipmi, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., url=https://www.redhat.com, maintainer=OpenStack TripleO Team, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, architecture=x86_64)
Feb 23 08:46:47 np0005626463.localdomain systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully.
Feb 23 08:46:47 np0005626463.localdomain podman[99929]: 2026-02-23 08:46:47.203776395 +0000 UTC m=+0.372616748 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, build-date=2026-01-12T22:34:43Z, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.openshift.expose-services=, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, version=17.1.13, batch=17.1_20260112.1, com.redhat.component=openstack-iscsid-container, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid, vcs-type=git, vcs-ref=705339545363fec600102567c4e923938e0f43b3, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=iscsid, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']})
Feb 23 08:46:47 np0005626463.localdomain podman[99929]: 2026-02-23 08:46:47.239368333 +0000 UTC m=+0.408208706 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, batch=17.1_20260112.1, vendor=Red Hat, Inc., url=https://www.redhat.com, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.expose-services=, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, distribution-scope=public, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, build-date=2026-01-12T22:34:43Z, 
com.redhat.component=openstack-iscsid-container, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, vcs-ref=705339545363fec600102567c4e923938e0f43b3, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, version=17.1.13, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:34:43Z)
Feb 23 08:46:47 np0005626463.localdomain systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully.
Feb 23 08:46:50 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.
Feb 23 08:46:50 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.
Feb 23 08:46:50 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.
Feb 23 08:46:50 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.
Feb 23 08:46:50 np0005626463.localdomain podman[100044]: 2026-02-23 08:46:50.923017988 +0000 UTC m=+0.093957003 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, batch=17.1_20260112.1, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.openshift.expose-services=, release=1766032510, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, vendor=Red Hat, Inc., io.buildah.version=1.41.5, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, url=https://www.redhat.com, name=rhosp-rhel9/openstack-ovn-controller, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, 
config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2026-01-12T22:36:40Z, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c)
Feb 23 08:46:50 np0005626463.localdomain podman[100044]: 2026-02-23 08:46:50.96733604 +0000 UTC m=+0.138275075 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:36:40Z, batch=17.1_20260112.1, tcib_managed=true, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, container_name=ovn_controller, vendor=Red Hat, Inc., version=17.1.13, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, distribution-scope=public, config_id=tripleo_step4, io.buildah.version=1.41.5, architecture=x86_64, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-ovn-controller, 
release=1766032510, build-date=2026-01-12T22:36:40Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 23 08:46:50 np0005626463.localdomain systemd[1]: tmp-crun.cqLtvL.mount: Deactivated successfully.
Feb 23 08:46:50 np0005626463.localdomain podman[100044]: unhealthy
Feb 23 08:46:50 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Main process exited, code=exited, status=1/FAILURE
Feb 23 08:46:50 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Failed with result 'exit-code'.
Feb 23 08:46:51 np0005626463.localdomain podman[100045]: 2026-02-23 08:46:51.025092125 +0000 UTC m=+0.192789709 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.13, config_id=tripleo_step4, build-date=2026-01-12T22:56:19Z, io.buildah.version=1.41.5, architecture=x86_64, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, release=1766032510, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, batch=17.1_20260112.1, vendor=Red Hat, Inc., vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:56:19Z, maintainer=OpenStack TripleO Team)
Feb 23 08:46:51 np0005626463.localdomain podman[100045]: 2026-02-23 08:46:51.039137066 +0000 UTC m=+0.206834650 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, release=1766032510, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, version=17.1.13, distribution-scope=public, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, architecture=x86_64, url=https://www.redhat.com, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, build-date=2026-01-12T22:56:19Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, io.openshift.expose-services=, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Feb 23 08:46:51 np0005626463.localdomain podman[100045]: unhealthy
Feb 23 08:46:51 np0005626463.localdomain systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Main process exited, code=exited, status=1/FAILURE
Feb 23 08:46:51 np0005626463.localdomain systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Failed with result 'exit-code'.
Feb 23 08:46:51 np0005626463.localdomain podman[100046]: 2026-02-23 08:46:50.992985626 +0000 UTC m=+0.157307243 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, name=rhosp-rhel9/openstack-qdrouterd, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=metrics_qdr, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, io.buildah.version=1.41.5, config_id=tripleo_step1, org.opencontainers.image.created=2026-01-12T22:10:14Z, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, release=1766032510, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, build-date=2026-01-12T22:10:14Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd)
Feb 23 08:46:51 np0005626463.localdomain podman[100043]: 2026-02-23 08:46:51.180705484 +0000 UTC m=+0.354160708 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, vcs-type=git, build-date=2026-01-12T23:32:04Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, io.buildah.version=1.41.5, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, org.opencontainers.image.created=2026-01-12T23:32:04Z, name=rhosp-rhel9/openstack-nova-compute, architecture=x86_64, container_name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, managed_by=tripleo_ansible, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 23 08:46:51 np0005626463.localdomain podman[100046]: 2026-02-23 08:46:51.224463088 +0000 UTC m=+0.388784715 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, com.redhat.component=openstack-qdrouterd-container, version=17.1.13, io.buildah.version=1.41.5, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, 
cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:10:14Z, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, build-date=2026-01-12T22:10:14Z, config_id=tripleo_step1, tcib_managed=true, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., io.openshift.expose-services=, container_name=metrics_qdr)
Feb 23 08:46:51 np0005626463.localdomain systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully.
Feb 23 08:46:51 np0005626463.localdomain podman[100043]: 2026-02-23 08:46:51.568545549 +0000 UTC m=+0.742000763 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, io.openshift.expose-services=, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, container_name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-type=git, 
maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, build-date=2026-01-12T23:32:04Z, url=https://www.redhat.com, batch=17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 23 08:46:51 np0005626463.localdomain systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully.
Feb 23 08:46:52 np0005626463.localdomain sudo[100133]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 08:46:52 np0005626463.localdomain sudo[100133]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:46:52 np0005626463.localdomain sudo[100133]: pam_unix(sudo:session): session closed for user root
Feb 23 08:46:52 np0005626463.localdomain sudo[100148]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 23 08:46:52 np0005626463.localdomain sudo[100148]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:46:52 np0005626463.localdomain sudo[100148]: pam_unix(sudo:session): session closed for user root
Feb 23 08:46:53 np0005626463.localdomain sudo[100195]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 08:46:53 np0005626463.localdomain sudo[100195]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:46:53 np0005626463.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Feb 23 08:46:53 np0005626463.localdomain sudo[100195]: pam_unix(sudo:session): session closed for user root
Feb 23 08:46:53 np0005626463.localdomain recover_tripleo_nova_virtqemud[100211]: 61982
Feb 23 08:46:53 np0005626463.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Feb 23 08:46:53 np0005626463.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Feb 23 08:46:56 np0005626463.localdomain sshd[100212]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:46:56 np0005626463.localdomain sshd[100212]: error: kex_exchange_identification: Connection closed by remote host
Feb 23 08:46:56 np0005626463.localdomain sshd[100212]: Connection closed by 64.89.160.135 port 58000
Feb 23 08:47:10 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.
Feb 23 08:47:10 np0005626463.localdomain systemd[1]: tmp-crun.i1k5MN.mount: Deactivated successfully.
Feb 23 08:47:10 np0005626463.localdomain podman[100213]: 2026-02-23 08:47:10.943530014 +0000 UTC m=+0.113775565 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, url=https://www.redhat.com, build-date=2026-01-12T22:10:15Z, release=1766032510, vendor=Red Hat, Inc., batch=17.1_20260112.1, container_name=collectd, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-collectd-container, version=17.1.13, architecture=x86_64, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, tcib_managed=true, name=rhosp-rhel9/openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step3, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z)
Feb 23 08:47:10 np0005626463.localdomain podman[100213]: 2026-02-23 08:47:10.956625316 +0000 UTC m=+0.126870827 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, tcib_managed=true, batch=17.1_20260112.1, container_name=collectd, 
org.opencontainers.image.created=2026-01-12T22:10:15Z, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, io.openshift.expose-services=, managed_by=tripleo_ansible, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, build-date=2026-01-12T22:10:15Z, com.redhat.component=openstack-collectd-container, vcs-type=git, name=rhosp-rhel9/openstack-collectd, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd)
Feb 23 08:47:10 np0005626463.localdomain systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully.
Feb 23 08:47:17 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.
Feb 23 08:47:17 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.
Feb 23 08:47:17 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.
Feb 23 08:47:17 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.
Feb 23 08:47:17 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.
Feb 23 08:47:17 np0005626463.localdomain podman[100235]: 2026-02-23 08:47:17.937790572 +0000 UTC m=+0.099420394 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 cron, release=1766032510, vendor=Red Hat, Inc., batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team, tcib_managed=true, distribution-scope=public, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, build-date=2026-01-12T22:10:15Z, com.redhat.component=openstack-cron-container, 
org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, io.openshift.expose-services=, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-cron)
Feb 23 08:47:17 np0005626463.localdomain systemd[1]: tmp-crun.r8Bz9o.mount: Deactivated successfully.
Feb 23 08:47:17 np0005626463.localdomain podman[100235]: 2026-02-23 08:47:17.977292813 +0000 UTC m=+0.138922665 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, distribution-scope=public, version=17.1.13, release=1766032510, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:10:15Z, com.redhat.component=openstack-cron-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, name=rhosp-rhel9/openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, container_name=logrotate_crond, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, vcs-type=git, managed_by=tripleo_ansible)
Feb 23 08:47:17 np0005626463.localdomain systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully.
Feb 23 08:47:18 np0005626463.localdomain systemd[1]: tmp-crun.nXqDK5.mount: Deactivated successfully.
Feb 23 08:47:18 np0005626463.localdomain podman[100233]: 2026-02-23 08:47:18.034209671 +0000 UTC m=+0.201362797 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, version=17.1.13, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, managed_by=tripleo_ansible, batch=17.1_20260112.1, tcib_managed=true, maintainer=OpenStack TripleO Team, 
name=rhosp-rhel9/openstack-ceilometer-compute, build-date=2026-01-12T23:07:47Z, vcs-type=git, distribution-scope=public, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.buildah.version=1.41.5, config_id=tripleo_step4, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, container_name=ceilometer_agent_compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, release=1766032510, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 23 08:47:18 np0005626463.localdomain podman[100241]: 2026-02-23 08:47:18.05104583 +0000 UTC m=+0.205704204 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, build-date=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, tcib_managed=true, url=https://www.redhat.com, io.openshift.expose-services=, release=1766032510, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.buildah.version=1.41.5, config_id=tripleo_step5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_compute, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 
'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, version=17.1.13, vcs-type=git)
Feb 23 08:47:18 np0005626463.localdomain podman[100233]: 2026-02-23 08:47:18.06536789 +0000 UTC m=+0.232521036 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-type=git, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:47Z, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-ceilometer-compute, version=17.1.13, 
com.redhat.component=openstack-ceilometer-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T23:07:47Z, tcib_managed=true, architecture=x86_64, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Feb 23 08:47:18 np0005626463.localdomain podman[100241]: 2026-02-23 08:47:18.075858759 +0000 UTC m=+0.230517203 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, io.buildah.version=1.41.5, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vcs-type=git, org.opencontainers.image.created=2026-01-12T23:32:04Z, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, distribution-scope=public, container_name=nova_compute, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, name=rhosp-rhel9/openstack-nova-compute, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc.)
Feb 23 08:47:18 np0005626463.localdomain systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully.
Feb 23 08:47:18 np0005626463.localdomain podman[100234]: 2026-02-23 08:47:17.983689863 +0000 UTC m=+0.147975729 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64, name=rhosp-rhel9/openstack-ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, build-date=2026-01-12T23:07:30Z, com.redhat.component=openstack-ceilometer-ipmi-container, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.expose-services=, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, io.buildah.version=1.41.5, container_name=ceilometer_agent_ipmi, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T23:07:30Z, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, release=1766032510, vcs-type=git)
Feb 23 08:47:18 np0005626463.localdomain systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Deactivated successfully.
Feb 23 08:47:18 np0005626463.localdomain podman[100234]: 2026-02-23 08:47:18.118299973 +0000 UTC m=+0.282585839 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, url=https://www.redhat.com, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.openshift.expose-services=, container_name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1766032510, com.redhat.component=openstack-ceilometer-ipmi-container, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, version=17.1.13, build-date=2026-01-12T23:07:30Z, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4)
Feb 23 08:47:18 np0005626463.localdomain podman[100232]: 2026-02-23 08:47:18.077340196 +0000 UTC m=+0.250409249 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, name=rhosp-rhel9/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.5, com.redhat.component=openstack-iscsid-container, managed_by=tripleo_ansible, architecture=x86_64, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:34:43Z, build-date=2026-01-12T22:34:43Z, release=1766032510, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=705339545363fec600102567c4e923938e0f43b3, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, version=17.1.13, batch=17.1_20260112.1, container_name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com)
Feb 23 08:47:18 np0005626463.localdomain systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully.
Feb 23 08:47:18 np0005626463.localdomain podman[100232]: 2026-02-23 08:47:18.163370669 +0000 UTC m=+0.336439732 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, com.redhat.component=openstack-iscsid-container, summary=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:34:43Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step3, vcs-ref=705339545363fec600102567c4e923938e0f43b3, vcs-type=git, architecture=x86_64, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, version=17.1.13, build-date=2026-01-12T22:34:43Z, tcib_managed=true, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., io.openshift.expose-services=, name=rhosp-rhel9/openstack-iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=iscsid, io.buildah.version=1.41.5)
Feb 23 08:47:18 np0005626463.localdomain systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully.
Feb 23 08:47:20 np0005626463.localdomain sshd[100353]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:47:20 np0005626463.localdomain sshd[100353]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:47:21 np0005626463.localdomain sshd[100355]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:47:21 np0005626463.localdomain sshd[100355]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:47:21 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.
Feb 23 08:47:21 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.
Feb 23 08:47:21 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.
Feb 23 08:47:21 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.
Feb 23 08:47:21 np0005626463.localdomain podman[100365]: 2026-02-23 08:47:21.92138498 +0000 UTC m=+0.077039861 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, version=17.1.13, com.redhat.component=openstack-qdrouterd-container, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, build-date=2026-01-12T22:10:14Z, managed_by=tripleo_ansible, vendor=Red Hat, Inc., distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, io.buildah.version=1.41.5, io.openshift.expose-services=, container_name=metrics_qdr, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-qdrouterd, config_id=tripleo_step1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, vcs-type=git)
Feb 23 08:47:21 np0005626463.localdomain systemd[1]: tmp-crun.K1XcpV.mount: Deactivated successfully.
Feb 23 08:47:21 np0005626463.localdomain podman[100357]: 2026-02-23 08:47:21.978251186 +0000 UTC m=+0.144389547 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, release=1766032510, container_name=nova_migration_target, version=17.1.13, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.openshift.expose-services=, vendor=Red Hat, Inc., architecture=x86_64, batch=17.1_20260112.1, build-date=2026-01-12T23:32:04Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, tcib_managed=true, vcs-type=git, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe)
Feb 23 08:47:22 np0005626463.localdomain podman[100358]: 2026-02-23 08:47:22.025965465 +0000 UTC m=+0.187178201 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, com.redhat.component=openstack-ovn-controller-container, container_name=ovn_controller, architecture=x86_64, distribution-scope=public, build-date=2026-01-12T22:36:40Z, org.opencontainers.image.created=2026-01-12T22:36:40Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, tcib_managed=true, vcs-type=git, url=https://www.redhat.com, version=17.1.13, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., batch=17.1_20260112.1, name=rhosp-rhel9/openstack-ovn-controller, io.buildah.version=1.41.5)
Feb 23 08:47:22 np0005626463.localdomain podman[100359]: 2026-02-23 08:47:21.95258989 +0000 UTC m=+0.108469289 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, distribution-scope=public, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, tcib_managed=true, container_name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:56:19Z, config_id=tripleo_step4, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, build-date=2026-01-12T22:56:19Z, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, release=1766032510, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Feb 23 08:47:22 np0005626463.localdomain podman[100358]: 2026-02-23 08:47:22.068105399 +0000 UTC m=+0.229318155 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.openshift.expose-services=, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, version=17.1.13, config_id=tripleo_step4, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, com.redhat.component=openstack-ovn-controller-container, build-date=2026-01-12T22:36:40Z, managed_by=tripleo_ansible, url=https://www.redhat.com, io.buildah.version=1.41.5, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:36:40Z, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., architecture=x86_64, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']})
Feb 23 08:47:22 np0005626463.localdomain podman[100358]: unhealthy
Feb 23 08:47:22 np0005626463.localdomain podman[100359]: 2026-02-23 08:47:22.086426265 +0000 UTC m=+0.242305674 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:56:19Z, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, managed_by=tripleo_ansible, release=1766032510, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, container_name=ovn_metadata_agent, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:56:19Z, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, io.buildah.version=1.41.5, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, version=17.1.13, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, tcib_managed=true, vendor=Red Hat, Inc., distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 23 08:47:22 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Main process exited, code=exited, status=1/FAILURE
Feb 23 08:47:22 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Failed with result 'exit-code'.
Feb 23 08:47:22 np0005626463.localdomain podman[100359]: unhealthy
Feb 23 08:47:22 np0005626463.localdomain systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Main process exited, code=exited, status=1/FAILURE
Feb 23 08:47:22 np0005626463.localdomain systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Failed with result 'exit-code'.
Feb 23 08:47:22 np0005626463.localdomain podman[100365]: 2026-02-23 08:47:22.147493004 +0000 UTC m=+0.303147945 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, architecture=x86_64, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, release=1766032510, managed_by=tripleo_ansible, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, build-date=2026-01-12T22:10:14Z, container_name=metrics_qdr, batch=17.1_20260112.1, vcs-type=git, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:10:14Z, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd)
Feb 23 08:47:22 np0005626463.localdomain systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully.
Feb 23 08:47:22 np0005626463.localdomain podman[100357]: 2026-02-23 08:47:22.363200101 +0000 UTC m=+0.529338502 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, name=rhosp-rhel9/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step4, vcs-type=git, managed_by=tripleo_ansible, tcib_managed=true, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, distribution-scope=public, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1766032510, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com)
Feb 23 08:47:22 np0005626463.localdomain systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully.
Feb 23 08:47:41 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.
Feb 23 08:47:41 np0005626463.localdomain podman[100445]: 2026-02-23 08:47:41.917131215 +0000 UTC m=+0.091014698 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, maintainer=OpenStack TripleO Team, vcs-type=git, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 collectd, build-date=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_id=tripleo_step3, container_name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-collectd, url=https://www.redhat.com, managed_by=tripleo_ansible, architecture=x86_64, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, release=1766032510, com.redhat.component=openstack-collectd-container, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13)
Feb 23 08:47:41 np0005626463.localdomain podman[100445]: 2026-02-23 08:47:41.95732987 +0000 UTC m=+0.131213323 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, name=rhosp-rhel9/openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:15Z, summary=Red Hat OpenStack Platform 17.1 collectd, build-date=2026-01-12T22:10:15Z, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, version=17.1.13, com.redhat.component=openstack-collectd-container, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, distribution-scope=public, vcs-type=git, batch=17.1_20260112.1, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, maintainer=OpenStack TripleO Team)
Feb 23 08:47:41 np0005626463.localdomain systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully.
Feb 23 08:47:48 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.
Feb 23 08:47:48 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.
Feb 23 08:47:48 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.
Feb 23 08:47:48 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.
Feb 23 08:47:48 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.
Feb 23 08:47:48 np0005626463.localdomain podman[100472]: 2026-02-23 08:47:48.936863091 +0000 UTC m=+0.097411960 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, tcib_managed=true, maintainer=OpenStack TripleO Team, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, architecture=x86_64, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., managed_by=tripleo_ansible, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-cron-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.openshift.expose-services=, name=rhosp-rhel9/openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git)
Feb 23 08:47:48 np0005626463.localdomain podman[100472]: 2026-02-23 08:47:48.970725677 +0000 UTC m=+0.131274556 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, url=https://www.redhat.com, batch=17.1_20260112.1, release=1766032510, container_name=logrotate_crond, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, build-date=2026-01-12T22:10:15Z, com.redhat.component=openstack-cron-container, maintainer=OpenStack TripleO Team, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.openshift.expose-services=, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, tcib_managed=true, config_id=tripleo_step4, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vendor=Red Hat, Inc., managed_by=tripleo_ansible, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron)
Feb 23 08:47:48 np0005626463.localdomain systemd[1]: tmp-crun.Z33k5t.mount: Deactivated successfully.
Feb 23 08:47:48 np0005626463.localdomain podman[100468]: 2026-02-23 08:47:48.983898661 +0000 UTC m=+0.148482177 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, build-date=2026-01-12T23:07:30Z, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, managed_by=tripleo_ansible, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 
url=https://www.redhat.com, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_ipmi, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-ipmi-container, batch=17.1_20260112.1, io.openshift.expose-services=, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13)
Feb 23 08:47:48 np0005626463.localdomain systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully.
Feb 23 08:47:49 np0005626463.localdomain podman[100466]: 2026-02-23 08:47:49.024374817 +0000 UTC m=+0.196162860 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, org.opencontainers.image.created=2026-01-12T22:34:43Z, version=17.1.13, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=iscsid, distribution-scope=public, managed_by=tripleo_ansible, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-iscsid-container, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, release=1766032510, io.buildah.version=1.41.5, vcs-type=git, url=https://www.redhat.com, vcs-ref=705339545363fec600102567c4e923938e0f43b3, name=rhosp-rhel9/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, build-date=2026-01-12T22:34:43Z, batch=17.1_20260112.1)
Feb 23 08:47:49 np0005626463.localdomain podman[100481]: 2026-02-23 08:47:48.988159496 +0000 UTC m=+0.144760401 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, distribution-scope=public, io.buildah.version=1.41.5, tcib_managed=true, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=nova_compute, vcs-type=git, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T23:32:04Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, config_id=tripleo_step5, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe)
Feb 23 08:47:49 np0005626463.localdomain podman[100466]: 2026-02-23 08:47:49.06227916 +0000 UTC m=+0.234067213 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, build-date=2026-01-12T22:34:43Z, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, batch=17.1_20260112.1, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, io.buildah.version=1.41.5, 
vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, url=https://www.redhat.com, version=17.1.13, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, vcs-type=git, distribution-scope=public)
Feb 23 08:47:49 np0005626463.localdomain podman[100481]: 2026-02-23 08:47:49.071279774 +0000 UTC m=+0.227880719 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, name=rhosp-rhel9/openstack-nova-compute, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:32:04Z, release=1766032510, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T23:32:04Z, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, container_name=nova_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., url=https://www.redhat.com, config_id=tripleo_step5, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, batch=17.1_20260112.1, version=17.1.13, architecture=x86_64, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 23 08:47:49 np0005626463.localdomain systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully.
Feb 23 08:47:49 np0005626463.localdomain podman[100467]: 2026-02-23 08:47:49.079502472 +0000 UTC m=+0.247981520 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, build-date=2026-01-12T23:07:47Z, com.redhat.component=openstack-ceilometer-compute-container, version=17.1.13, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_compute, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, distribution-scope=public, release=1766032510, io.openshift.expose-services=, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vendor=Red Hat, Inc.)
Feb 23 08:47:49 np0005626463.localdomain systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Deactivated successfully.
Feb 23 08:47:49 np0005626463.localdomain podman[100468]: 2026-02-23 08:47:49.090692045 +0000 UTC m=+0.255275631 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_ipmi, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:30Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_id=tripleo_step4, io.buildah.version=1.41.5, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:07:30Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, maintainer=OpenStack TripleO Team, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']})
Feb 23 08:47:49 np0005626463.localdomain systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully.
Feb 23 08:47:49 np0005626463.localdomain podman[100467]: 2026-02-23 08:47:49.135360231 +0000 UTC m=+0.303839299 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, release=1766032510, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, architecture=x86_64, io.buildah.version=1.41.5, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=ceilometer_agent_compute, config_id=tripleo_step4, build-date=2026-01-12T23:07:47Z, version=17.1.13, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp-rhel9/openstack-ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-compute-container, vcs-type=git, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., managed_by=tripleo_ansible)
Feb 23 08:47:49 np0005626463.localdomain systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully.
Feb 23 08:47:52 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.
Feb 23 08:47:52 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.
Feb 23 08:47:52 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.
Feb 23 08:47:52 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.
Feb 23 08:47:52 np0005626463.localdomain systemd[1]: tmp-crun.TDMUWJ.mount: Deactivated successfully.
Feb 23 08:47:52 np0005626463.localdomain podman[100579]: 2026-02-23 08:47:52.912990828 +0000 UTC m=+0.086737552 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, url=https://www.redhat.com, build-date=2026-01-12T23:32:04Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack 
Platform 17.1 nova-compute, batch=17.1_20260112.1, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=nova_migration_target, name=rhosp-rhel9/openstack-nova-compute, release=1766032510, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, distribution-scope=public, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 23 08:47:52 np0005626463.localdomain podman[100580]: 2026-02-23 08:47:52.984538991 +0000 UTC m=+0.156009684 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, release=1766032510, org.opencontainers.image.created=2026-01-12T22:36:40Z, vendor=Red Hat, Inc., distribution-scope=public, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:36:40Z, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.component=openstack-ovn-controller-container, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=ovn_controller, io.openshift.expose-services=, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, version=17.1.13, io.buildah.version=1.41.5, vcs-type=git, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', 
'/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 23 08:47:53 np0005626463.localdomain podman[100580]: 2026-02-23 08:47:53.027495425 +0000 UTC m=+0.198966088 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, config_id=tripleo_step4, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ovn-controller-container, managed_by=tripleo_ansible, version=17.1.13, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:36:40Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., 
vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, release=1766032510, build-date=2026-01-12T22:36:40Z, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 23 08:47:53 np0005626463.localdomain podman[100580]: unhealthy
Feb 23 08:47:53 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Main process exited, code=exited, status=1/FAILURE
Feb 23 08:47:53 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Failed with result 'exit-code'.
Feb 23 08:47:53 np0005626463.localdomain podman[100581]: 2026-02-23 08:47:52.945795702 +0000 UTC m=+0.114363273 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, architecture=x86_64, url=https://www.redhat.com, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, batch=17.1_20260112.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, io.buildah.version=1.41.5, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:56:19Z, io.openshift.expose-services=, container_name=ovn_metadata_agent, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:56:19Z, vendor=Red Hat, Inc., org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-type=git, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn)
Feb 23 08:47:53 np0005626463.localdomain podman[100582]: 2026-02-23 08:47:53.029023812 +0000 UTC m=+0.195190477 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, io.buildah.version=1.41.5, vcs-type=git, config_id=tripleo_step1, name=rhosp-rhel9/openstack-qdrouterd, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, 
org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, build-date=2026-01-12T22:10:14Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:10:14Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, url=https://www.redhat.com, container_name=metrics_qdr, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 23 08:47:53 np0005626463.localdomain podman[100581]: 2026-02-23 08:47:53.082522067 +0000 UTC m=+0.251089728 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, org.opencontainers.image.created=2026-01-12T22:56:19Z, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, io.buildah.version=1.41.5, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, build-date=2026-01-12T22:56:19Z, io.openshift.expose-services=, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vendor=Red Hat, Inc., vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, url=https://www.redhat.com)
Feb 23 08:47:53 np0005626463.localdomain podman[100581]: unhealthy
Feb 23 08:47:53 np0005626463.localdomain systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Main process exited, code=exited, status=1/FAILURE
Feb 23 08:47:53 np0005626463.localdomain systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Failed with result 'exit-code'.
Feb 23 08:47:53 np0005626463.localdomain podman[100582]: 2026-02-23 08:47:53.274428932 +0000 UTC m=+0.440595597 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:10:14Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, config_id=tripleo_step1, name=rhosp-rhel9/openstack-qdrouterd, vendor=Red Hat, Inc., io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', 
'/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:10:14Z, container_name=metrics_qdr, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1)
Feb 23 08:47:53 np0005626463.localdomain systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully.
Feb 23 08:47:53 np0005626463.localdomain podman[100579]: 2026-02-23 08:47:53.327123672 +0000 UTC m=+0.500870336 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, io.openshift.expose-services=, config_id=tripleo_step4, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20260112.1, version=17.1.13, release=1766032510, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-nova-compute, build-date=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, distribution-scope=public, org.opencontainers.image.created=2026-01-12T23:32:04Z)
Feb 23 08:47:53 np0005626463.localdomain systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully.
Feb 23 08:47:53 np0005626463.localdomain sudo[100671]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 08:47:53 np0005626463.localdomain sudo[100671]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:47:53 np0005626463.localdomain sudo[100671]: pam_unix(sudo:session): session closed for user root
Feb 23 08:47:53 np0005626463.localdomain sudo[100686]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 23 08:47:53 np0005626463.localdomain sudo[100686]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:47:54 np0005626463.localdomain sudo[100686]: pam_unix(sudo:session): session closed for user root
Feb 23 08:47:54 np0005626463.localdomain sudo[100732]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 08:47:54 np0005626463.localdomain sudo[100732]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:47:54 np0005626463.localdomain sudo[100732]: pam_unix(sudo:session): session closed for user root
Feb 23 08:48:00 np0005626463.localdomain sshd[100747]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:48:01 np0005626463.localdomain sshd[100747]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:48:06 np0005626463.localdomain sshd[100749]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:48:10 np0005626463.localdomain sshd[100749]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:48:12 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.
Feb 23 08:48:12 np0005626463.localdomain systemd[1]: tmp-crun.SoYhGy.mount: Deactivated successfully.
Feb 23 08:48:12 np0005626463.localdomain podman[100751]: 2026-02-23 08:48:12.943751967 +0000 UTC m=+0.108172758 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, container_name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, com.redhat.component=openstack-collectd-container, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.expose-services=, config_id=tripleo_step3, vcs-type=git, vendor=Red Hat, Inc., tcib_managed=true, build-date=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510)
Feb 23 08:48:12 np0005626463.localdomain podman[100751]: 2026-02-23 08:48:12.9594136 +0000 UTC m=+0.123834351 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, build-date=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, config_id=tripleo_step3, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:15Z, distribution-scope=public, version=17.1.13, tcib_managed=true, release=1766032510, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, url=https://www.redhat.com, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, io.openshift.expose-services=)
Feb 23 08:48:12 np0005626463.localdomain systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully.
Feb 23 08:48:19 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.
Feb 23 08:48:19 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.
Feb 23 08:48:19 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.
Feb 23 08:48:19 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.
Feb 23 08:48:19 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.
Feb 23 08:48:19 np0005626463.localdomain systemd[1]: tmp-crun.nPw427.mount: Deactivated successfully.
Feb 23 08:48:19 np0005626463.localdomain podman[100774]: 2026-02-23 08:48:19.933603242 +0000 UTC m=+0.092342530 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, name=rhosp-rhel9/openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, tcib_managed=true, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, 
container_name=logrotate_crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, vcs-type=git, distribution-scope=public, config_id=tripleo_step4, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:10:15Z, managed_by=tripleo_ansible, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, release=1766032510)
Feb 23 08:48:19 np0005626463.localdomain podman[100772]: 2026-02-23 08:48:19.98051963 +0000 UTC m=+0.148145487 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, name=rhosp-rhel9/openstack-ceilometer-compute, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-compute-container, batch=17.1_20260112.1, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, distribution-scope=public, build-date=2026-01-12T23:07:47Z, tcib_managed=true, version=17.1.13, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, release=1766032510, org.opencontainers.image.created=2026-01-12T23:07:47Z, vendor=Red Hat, Inc., io.openshift.expose-services=, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Feb 23 08:48:19 np0005626463.localdomain podman[100771]: 2026-02-23 08:48:19.996503893 +0000 UTC m=+0.165528494 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-iscsid, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=iscsid, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, url=https://www.redhat.com, config_id=tripleo_step3, build-date=2026-01-12T22:34:43Z, distribution-scope=public, vcs-ref=705339545363fec600102567c4e923938e0f43b3, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, vendor=Red Hat, Inc., version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:34:43Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, io.buildah.version=1.41.5, com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']})
Feb 23 08:48:20 np0005626463.localdomain podman[100787]: 2026-02-23 08:48:19.953882951 +0000 UTC m=+0.102943113 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, url=https://www.redhat.com, container_name=nova_compute, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.13, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., io.openshift.expose-services=, config_id=tripleo_step5, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T23:32:04Z, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, batch=17.1_20260112.1, release=1766032510, name=rhosp-rhel9/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 23 08:48:20 np0005626463.localdomain podman[100771]: 2026-02-23 08:48:20.008637355 +0000 UTC m=+0.177661946 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, tcib_managed=true, config_id=tripleo_step3, name=rhosp-rhel9/openstack-iscsid, release=1766032510, batch=17.1_20260112.1, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, version=17.1.13, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:34:43Z, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, 
vcs-ref=705339545363fec600102567c4e923938e0f43b3, vcs-type=git, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, build-date=2026-01-12T22:34:43Z, com.redhat.component=openstack-iscsid-container, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 23 08:48:20 np0005626463.localdomain systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully.
Feb 23 08:48:20 np0005626463.localdomain podman[100787]: 2026-02-23 08:48:20.036436951 +0000 UTC m=+0.185497133 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, container_name=nova_compute, build-date=2026-01-12T23:32:04Z, org.opencontainers.image.created=2026-01-12T23:32:04Z, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team, distribution-scope=public, managed_by=tripleo_ansible, config_id=tripleo_step5, version=17.1.13, release=1766032510, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-nova-compute)
Feb 23 08:48:20 np0005626463.localdomain systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Deactivated successfully.
Feb 23 08:48:20 np0005626463.localdomain podman[100773]: 2026-02-23 08:48:19.910473094 +0000 UTC m=+0.075459978 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:07:30Z, batch=17.1_20260112.1, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, container_name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, tcib_managed=true, release=1766032510, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, url=https://www.redhat.com)
Feb 23 08:48:20 np0005626463.localdomain podman[100774]: 2026-02-23 08:48:20.065224717 +0000 UTC m=+0.223964035 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., config_id=tripleo_step4, tcib_managed=true, com.redhat.component=openstack-cron-container, io.openshift.expose-services=, vcs-type=git, build-date=2026-01-12T22:10:15Z, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, batch=17.1_20260112.1, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, architecture=x86_64, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-cron, maintainer=OpenStack TripleO Team, container_name=logrotate_crond, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 23 08:48:20 np0005626463.localdomain systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully.
Feb 23 08:48:20 np0005626463.localdomain podman[100772]: 2026-02-23 08:48:20.084642769 +0000 UTC m=+0.252268636 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, build-date=2026-01-12T23:07:47Z, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.created=2026-01-12T23:07:47Z, batch=17.1_20260112.1, managed_by=tripleo_ansible, tcib_managed=true, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, 
distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, name=rhosp-rhel9/openstack-ceilometer-compute, config_id=tripleo_step4, version=17.1.13, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.buildah.version=1.41.5, vcs-type=git, container_name=ceilometer_agent_compute, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute)
Feb 23 08:48:20 np0005626463.localdomain podman[100773]: 2026-02-23 08:48:20.096370478 +0000 UTC m=+0.261357382 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T23:07:30Z, url=https://www.redhat.com, vendor=Red Hat, Inc., distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, managed_by=tripleo_ansible, config_id=tripleo_step4, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, tcib_managed=true, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, version=17.1.13, io.buildah.version=1.41.5, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi)
Feb 23 08:48:20 np0005626463.localdomain systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully.
Feb 23 08:48:20 np0005626463.localdomain systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully.
Feb 23 08:48:23 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.
Feb 23 08:48:23 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.
Feb 23 08:48:23 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.
Feb 23 08:48:23 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.
Feb 23 08:48:23 np0005626463.localdomain systemd[1]: tmp-crun.nKSRqe.mount: Deactivated successfully.
Feb 23 08:48:23 np0005626463.localdomain podman[100889]: 2026-02-23 08:48:23.947710257 +0000 UTC m=+0.099353200 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, io.buildah.version=1.41.5, io.openshift.expose-services=, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, release=1766032510, managed_by=tripleo_ansible, url=https://www.redhat.com, container_name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:56:19Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, vendor=Red Hat, Inc., org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, distribution-scope=public, vcs-type=git, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.13, build-date=2026-01-12T22:56:19Z, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 23 08:48:23 np0005626463.localdomain podman[100890]: 2026-02-23 08:48:23.991843737 +0000 UTC m=+0.140816346 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., batch=17.1_20260112.1, com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible, url=https://www.redhat.com, architecture=x86_64, distribution-scope=public, config_id=tripleo_step1, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, version=17.1.13, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:10:14Z, container_name=metrics_qdr)
Feb 23 08:48:24 np0005626463.localdomain podman[100889]: 2026-02-23 08:48:24.045498887 +0000 UTC m=+0.197141770 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, 
vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, version=17.1.13, io.openshift.expose-services=, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, managed_by=tripleo_ansible, build-date=2026-01-12T22:56:19Z, config_id=tripleo_step4, container_name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:56:19Z, release=1766032510, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, batch=17.1_20260112.1, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Feb 23 08:48:24 np0005626463.localdomain podman[100889]: unhealthy
Feb 23 08:48:24 np0005626463.localdomain systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Main process exited, code=exited, status=1/FAILURE
Feb 23 08:48:24 np0005626463.localdomain systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Failed with result 'exit-code'.
Feb 23 08:48:24 np0005626463.localdomain podman[100888]: 2026-02-23 08:48:24.046370354 +0000 UTC m=+0.202585461 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, tcib_managed=true, com.redhat.component=openstack-ovn-controller-container, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, org.opencontainers.image.created=2026-01-12T22:36:40Z, vendor=Red Hat, Inc., version=17.1.13, build-date=2026-01-12T22:36:40Z, maintainer=OpenStack TripleO Team, distribution-scope=public, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': 
['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, io.openshift.expose-services=)
Feb 23 08:48:24 np0005626463.localdomain podman[100887]: 2026-02-23 08:48:24.101791959 +0000 UTC m=+0.258236354 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, com.redhat.component=openstack-nova-compute-container, vcs-type=git, managed_by=tripleo_ansible, batch=17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp-rhel9/openstack-nova-compute, architecture=x86_64, url=https://www.redhat.com, release=1766032510, io.openshift.expose-services=, build-date=2026-01-12T23:32:04Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., distribution-scope=public, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:32:04Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 23 08:48:24 np0005626463.localdomain podman[100888]: 2026-02-23 08:48:24.126277921 +0000 UTC m=+0.282493058 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, vcs-type=git, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-ovn-controller, build-date=2026-01-12T22:36:40Z, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.component=openstack-ovn-controller-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vendor=Red Hat, Inc., url=https://www.redhat.com, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, batch=17.1_20260112.1, distribution-scope=public, version=17.1.13, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, 
description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1766032510, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible)
Feb 23 08:48:24 np0005626463.localdomain podman[100888]: unhealthy
Feb 23 08:48:24 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Main process exited, code=exited, status=1/FAILURE
Feb 23 08:48:24 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Failed with result 'exit-code'.
Feb 23 08:48:24 np0005626463.localdomain podman[100890]: 2026-02-23 08:48:24.206133115 +0000 UTC m=+0.355105724 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, release=1766032510, vcs-type=git, build-date=2026-01-12T22:10:14Z, container_name=metrics_qdr, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:10:14Z, vendor=Red Hat, Inc., url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp-rhel9/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible, architecture=x86_64, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true)
Feb 23 08:48:24 np0005626463.localdomain systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully.
Feb 23 08:48:24 np0005626463.localdomain podman[100887]: 2026-02-23 08:48:24.480356742 +0000 UTC m=+0.636801137 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, vcs-type=git, tcib_managed=true, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, io.openshift.expose-services=, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., build-date=2026-01-12T23:32:04Z, container_name=nova_migration_target, org.opencontainers.image.created=2026-01-12T23:32:04Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, name=rhosp-rhel9/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, release=1766032510)
Feb 23 08:48:24 np0005626463.localdomain systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully.
Feb 23 08:48:37 np0005626463.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Feb 23 08:48:37 np0005626463.localdomain recover_tripleo_nova_virtqemud[100981]: 61982
Feb 23 08:48:37 np0005626463.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Feb 23 08:48:37 np0005626463.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Feb 23 08:48:39 np0005626463.localdomain sshd[100982]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:48:40 np0005626463.localdomain sshd[100982]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:48:43 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.
Feb 23 08:48:43 np0005626463.localdomain systemd[1]: tmp-crun.9GoDjU.mount: Deactivated successfully.
Feb 23 08:48:43 np0005626463.localdomain podman[100984]: 2026-02-23 08:48:43.93763612 +0000 UTC m=+0.105332868 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, build-date=2026-01-12T22:10:15Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, 
vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-type=git, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-collectd, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, com.redhat.component=openstack-collectd-container, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, url=https://www.redhat.com, managed_by=tripleo_ansible, vendor=Red Hat, Inc., config_id=tripleo_step3, io.buildah.version=1.41.5, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true)
Feb 23 08:48:43 np0005626463.localdomain podman[100984]: 2026-02-23 08:48:43.987061717 +0000 UTC m=+0.154758495 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, distribution-scope=public, container_name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, release=1766032510, description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, managed_by=tripleo_ansible, vcs-type=git, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vendor=Red Hat, Inc., version=17.1.13, maintainer=OpenStack TripleO Team, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.component=openstack-collectd-container, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, build-date=2026-01-12T22:10:15Z, org.opencontainers.image.created=2026-01-12T22:10:15Z)
Feb 23 08:48:44 np0005626463.localdomain systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully.
Feb 23 08:48:50 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.
Feb 23 08:48:50 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.
Feb 23 08:48:50 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.
Feb 23 08:48:50 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.
Feb 23 08:48:50 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.
Feb 23 08:48:50 np0005626463.localdomain podman[101004]: 2026-02-23 08:48:50.921159687 +0000 UTC m=+0.095006433 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, org.opencontainers.image.created=2026-01-12T22:34:43Z, release=1766032510, batch=17.1_20260112.1, container_name=iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.openshift.expose-services=, version=17.1.13, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, io.buildah.version=1.41.5, vcs-type=git, tcib_managed=true, architecture=x86_64, vcs-ref=705339545363fec600102567c4e923938e0f43b3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, url=https://www.redhat.com, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2026-01-12T22:34:43Z, distribution-scope=public)
Feb 23 08:48:50 np0005626463.localdomain podman[101004]: 2026-02-23 08:48:50.937270084 +0000 UTC m=+0.111116860 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, container_name=iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, version=17.1.13, release=1766032510, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2026-01-12T22:34:43Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, vcs-ref=705339545363fec600102567c4e923938e0f43b3, vcs-type=git, tcib_managed=true, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, config_id=tripleo_step3, name=rhosp-rhel9/openstack-iscsid, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.component=openstack-iscsid-container, url=https://www.redhat.com, distribution-scope=public)
Feb 23 08:48:50 np0005626463.localdomain systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully.
Feb 23 08:48:50 np0005626463.localdomain podman[101007]: 2026-02-23 08:48:50.984553893 +0000 UTC m=+0.145110411 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, name=rhosp-rhel9/openstack-cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.component=openstack-cron-container, vcs-type=git, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, architecture=x86_64, container_name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, managed_by=tripleo_ansible, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, release=1766032510)
Feb 23 08:48:51 np0005626463.localdomain podman[101007]: 2026-02-23 08:48:51.028298441 +0000 UTC m=+0.188854959 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, release=1766032510, summary=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, name=rhosp-rhel9/openstack-cron, batch=17.1_20260112.1, build-date=2026-01-12T22:10:15Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-cron-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, managed_by=tripleo_ansible, vendor=Red Hat, Inc., 
container_name=logrotate_crond, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:15Z, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, url=https://www.redhat.com, io.openshift.expose-services=, maintainer=OpenStack TripleO Team)
Feb 23 08:48:51 np0005626463.localdomain systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully.
Feb 23 08:48:51 np0005626463.localdomain podman[101005]: 2026-02-23 08:48:51.081881908 +0000 UTC m=+0.250269902 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.openshift.expose-services=, managed_by=tripleo_ansible, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, tcib_managed=true, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T23:07:47Z, distribution-scope=public, release=1766032510, build-date=2026-01-12T23:07:47Z, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com)
Feb 23 08:48:51 np0005626463.localdomain podman[101006]: 2026-02-23 08:48:51.034357512 +0000 UTC m=+0.200175576 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, container_name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, tcib_managed=true, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13, architecture=x86_64, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, name=rhosp-rhel9/openstack-ceilometer-ipmi, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, release=1766032510, managed_by=tripleo_ansible, config_id=tripleo_step4, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Feb 23 08:48:51 np0005626463.localdomain podman[101006]: 2026-02-23 08:48:51.115327842 +0000 UTC m=+0.281145896 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.5, container_name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
org.opencontainers.image.created=2026-01-12T23:07:30Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:30Z, name=rhosp-rhel9/openstack-ceilometer-ipmi, io.openshift.expose-services=, distribution-scope=public, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-ipmi-container, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, architecture=x86_64, release=1766032510, batch=17.1_20260112.1, url=https://www.redhat.com)
Feb 23 08:48:51 np0005626463.localdomain systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully.
Feb 23 08:48:51 np0005626463.localdomain podman[101012]: 2026-02-23 08:48:51.134652871 +0000 UTC m=+0.294369122 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, container_name=nova_compute, url=https://www.redhat.com, name=rhosp-rhel9/openstack-nova-compute, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, batch=17.1_20260112.1, io.buildah.version=1.41.5, tcib_managed=true, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., vcs-type=git, config_id=tripleo_step5, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, distribution-scope=public, io.openshift.expose-services=, architecture=x86_64)
Feb 23 08:48:51 np0005626463.localdomain podman[101005]: 2026-02-23 08:48:51.164196632 +0000 UTC m=+0.332584596 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T23:07:47Z, container_name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, batch=17.1_20260112.1, config_id=tripleo_step4, vendor=Red Hat, Inc., managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:07:47Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, version=17.1.13, distribution-scope=public, release=1766032510, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, name=rhosp-rhel9/openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute)
Feb 23 08:48:51 np0005626463.localdomain systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully.
Feb 23 08:48:51 np0005626463.localdomain podman[101012]: 2026-02-23 08:48:51.217698806 +0000 UTC m=+0.377415017 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, container_name=nova_compute, vcs-type=git, release=1766032510, name=rhosp-rhel9/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, config_id=tripleo_step5, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:32:04Z, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, build-date=2026-01-12T23:32:04Z, io.buildah.version=1.41.5, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, version=17.1.13)
Feb 23 08:48:51 np0005626463.localdomain systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Deactivated successfully.
Feb 23 08:48:51 np0005626463.localdomain systemd[1]: tmp-crun.FfK6ZX.mount: Deactivated successfully.
Feb 23 08:48:53 np0005626463.localdomain sshd[101124]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:48:54 np0005626463.localdomain sshd[101124]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:48:54 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.
Feb 23 08:48:54 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.
Feb 23 08:48:54 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.
Feb 23 08:48:54 np0005626463.localdomain systemd[1]: tmp-crun.gGYpGU.mount: Deactivated successfully.
Feb 23 08:48:54 np0005626463.localdomain podman[101126]: 2026-02-23 08:48:54.446119155 +0000 UTC m=+0.087436735 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, vendor=Red Hat, Inc., distribution-scope=public, build-date=2026-01-12T22:36:40Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, release=1766032510, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.component=openstack-ovn-controller-container, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, managed_by=tripleo_ansible, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:36:40Z, url=https://www.redhat.com, vcs-type=git, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, 
io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, container_name=ovn_controller, io.openshift.expose-services=)
Feb 23 08:48:54 np0005626463.localdomain podman[101126]: 2026-02-23 08:48:54.463612285 +0000 UTC m=+0.104929945 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vcs-type=git, batch=17.1_20260112.1, config_id=tripleo_step4, com.redhat.component=openstack-ovn-controller-container, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, maintainer=OpenStack TripleO Team, version=17.1.13, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, container_name=ovn_controller, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, build-date=2026-01-12T22:36:40Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, distribution-scope=public, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, tcib_managed=true, vendor=Red Hat, Inc.)
Feb 23 08:48:54 np0005626463.localdomain systemd[1]: tmp-crun.V6bix7.mount: Deactivated successfully.
Feb 23 08:48:54 np0005626463.localdomain podman[101127]: 2026-02-23 08:48:54.506380263 +0000 UTC m=+0.144187603 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, container_name=ovn_metadata_agent, version=17.1.13, url=https://www.redhat.com, io.openshift.expose-services=, managed_by=tripleo_ansible, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.created=2026-01-12T22:56:19Z, distribution-scope=public, architecture=x86_64, vendor=Red Hat, Inc., config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f)
Feb 23 08:48:54 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.
Feb 23 08:48:54 np0005626463.localdomain podman[101128]: 2026-02-23 08:48:54.550244964 +0000 UTC m=+0.184403059 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:10:14Z, batch=17.1_20260112.1, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, architecture=x86_64, vendor=Red Hat, Inc., build-date=2026-01-12T22:10:14Z, vcs-type=git, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, 
tcib_managed=true, io.buildah.version=1.41.5, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, release=1766032510, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, distribution-scope=public)
Feb 23 08:48:54 np0005626463.localdomain podman[101126]: unhealthy
Feb 23 08:48:54 np0005626463.localdomain podman[101127]: 2026-02-23 08:48:54.574018263 +0000 UTC m=+0.211825553 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, tcib_managed=true, maintainer=OpenStack TripleO Team, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, io.openshift.expose-services=, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:56:19Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.41.5, version=17.1.13, container_name=ovn_metadata_agent, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vendor=Red Hat, Inc., vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f)
Feb 23 08:48:54 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Main process exited, code=exited, status=1/FAILURE
Feb 23 08:48:54 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Failed with result 'exit-code'.
Feb 23 08:48:54 np0005626463.localdomain podman[101127]: unhealthy
Feb 23 08:48:54 np0005626463.localdomain systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Main process exited, code=exited, status=1/FAILURE
Feb 23 08:48:54 np0005626463.localdomain systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Failed with result 'exit-code'.
Feb 23 08:48:54 np0005626463.localdomain podman[101177]: 2026-02-23 08:48:54.633491776 +0000 UTC m=+0.094739445 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., managed_by=tripleo_ansible, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step4, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, architecture=x86_64, build-date=2026-01-12T23:32:04Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.openshift.expose-services=)
Feb 23 08:48:54 np0005626463.localdomain podman[101128]: 2026-02-23 08:48:54.769423277 +0000 UTC m=+0.403581332 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vendor=Red Hat, Inc., io.openshift.expose-services=, maintainer=OpenStack TripleO Team, release=1766032510, container_name=metrics_qdr, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.created=2026-01-12T22:10:14Z, batch=17.1_20260112.1, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, architecture=x86_64, url=https://www.redhat.com, tcib_managed=true, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, name=rhosp-rhel9/openstack-qdrouterd)
Feb 23 08:48:54 np0005626463.localdomain systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully.
Feb 23 08:48:55 np0005626463.localdomain podman[101177]: 2026-02-23 08:48:55.007898798 +0000 UTC m=+0.469146527 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, version=17.1.13, maintainer=OpenStack TripleO Team, architecture=x86_64, tcib_managed=true, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 
17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=nova_migration_target, io.buildah.version=1.41.5, vendor=Red Hat, Inc., io.openshift.expose-services=, build-date=2026-01-12T23:32:04Z, org.opencontainers.image.created=2026-01-12T23:32:04Z, name=rhosp-rhel9/openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, batch=17.1_20260112.1, vcs-type=git)
Feb 23 08:48:55 np0005626463.localdomain systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully.
Feb 23 08:48:55 np0005626463.localdomain sudo[101221]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 08:48:55 np0005626463.localdomain sudo[101221]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:48:55 np0005626463.localdomain sudo[101221]: pam_unix(sudo:session): session closed for user root
Feb 23 08:48:55 np0005626463.localdomain sudo[101236]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 23 08:48:55 np0005626463.localdomain sudo[101236]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:48:56 np0005626463.localdomain sudo[101236]: pam_unix(sudo:session): session closed for user root
Feb 23 08:48:56 np0005626463.localdomain sudo[101282]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 08:48:56 np0005626463.localdomain sudo[101282]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:48:56 np0005626463.localdomain sudo[101282]: pam_unix(sudo:session): session closed for user root
Feb 23 08:49:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb 23 08:49:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 4200.1 total, 600.0 interval
                                                          Cumulative writes: 5152 writes, 23K keys, 5152 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s
                                                          Cumulative WAL: 5152 writes, 679 syncs, 7.59 writes per sync, written: 0.02 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                          Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Feb 23 08:49:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb 23 08:49:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 4200.1 total, 600.0 interval
                                                          Cumulative writes: 5421 writes, 24K keys, 5421 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s
                                                          Cumulative WAL: 5421 writes, 705 syncs, 7.69 writes per sync, written: 0.02 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                          Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Feb 23 08:49:14 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.
Feb 23 08:49:14 np0005626463.localdomain systemd[1]: tmp-crun.AUFeYf.mount: Deactivated successfully.
Feb 23 08:49:14 np0005626463.localdomain podman[101297]: 2026-02-23 08:49:14.929937253 +0000 UTC m=+0.104252715 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.component=openstack-collectd-container, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, vcs-type=git, managed_by=tripleo_ansible, config_id=tripleo_step3, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, release=1766032510, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, container_name=collectd, version=17.1.13, vendor=Red Hat, Inc., org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, build-date=2026-01-12T22:10:15Z)
Feb 23 08:49:14 np0005626463.localdomain podman[101297]: 2026-02-23 08:49:14.971359457 +0000 UTC m=+0.145674879 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.41.5, tcib_managed=true, version=17.1.13, architecture=x86_64, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-collectd-container, description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_id=tripleo_step3, container_name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, release=1766032510, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, url=https://www.redhat.com, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:15Z, summary=Red Hat OpenStack Platform 17.1 collectd, build-date=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-collectd, managed_by=tripleo_ansible)
Feb 23 08:49:14 np0005626463.localdomain systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully.
Feb 23 08:49:19 np0005626463.localdomain sshd[101317]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:49:19 np0005626463.localdomain sshd[101317]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:49:21 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.
Feb 23 08:49:21 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.
Feb 23 08:49:21 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.
Feb 23 08:49:21 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.
Feb 23 08:49:21 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.
Feb 23 08:49:21 np0005626463.localdomain systemd[1]: tmp-crun.pvymrP.mount: Deactivated successfully.
Feb 23 08:49:21 np0005626463.localdomain podman[101319]: 2026-02-23 08:49:21.916507085 +0000 UTC m=+0.090312456 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, architecture=x86_64, url=https://www.redhat.com, version=17.1.13, config_id=tripleo_step3, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vendor=Red Hat, Inc., batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-iscsid, distribution-scope=public, container_name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, build-date=2026-01-12T22:34:43Z, release=1766032510, managed_by=tripleo_ansible, vcs-ref=705339545363fec600102567c4e923938e0f43b3, tcib_managed=true, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z)
Feb 23 08:49:21 np0005626463.localdomain podman[101327]: 2026-02-23 08:49:21.983091862 +0000 UTC m=+0.138839974 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, distribution-scope=public, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.buildah.version=1.41.5, managed_by=tripleo_ansible, build-date=2026-01-12T22:10:15Z, container_name=logrotate_crond, batch=17.1_20260112.1, architecture=x86_64, io.openshift.expose-services=, url=https://www.redhat.com, config_id=tripleo_step4, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, name=rhosp-rhel9/openstack-cron, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-cron-container, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git)
Feb 23 08:49:22 np0005626463.localdomain podman[101328]: 2026-02-23 08:49:22.0459098 +0000 UTC m=+0.200721322 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', 
'/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, build-date=2026-01-12T23:32:04Z, distribution-scope=public, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step5, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, org.opencontainers.image.created=2026-01-12T23:32:04Z, batch=17.1_20260112.1, version=17.1.13, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, io.buildah.version=1.41.5)
Feb 23 08:49:22 np0005626463.localdomain podman[101319]: 2026-02-23 08:49:22.05256666 +0000 UTC m=+0.226372071 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.openshift.expose-services=, container_name=iscsid, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, tcib_managed=true, build-date=2026-01-12T22:34:43Z, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=705339545363fec600102567c4e923938e0f43b3, description=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, url=https://www.redhat.com, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:34:43Z, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, name=rhosp-rhel9/openstack-iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vendor=Red Hat, Inc., config_id=tripleo_step3, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, maintainer=OpenStack TripleO Team)
Feb 23 08:49:22 np0005626463.localdomain systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully.
Feb 23 08:49:22 np0005626463.localdomain podman[101328]: 2026-02-23 08:49:22.074379616 +0000 UTC m=+0.229191158 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, config_id=tripleo_step5, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, release=1766032510, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.component=openstack-nova-compute-container, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., vcs-type=git, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:32:04Z, org.opencontainers.image.created=2026-01-12T23:32:04Z, version=17.1.13)
Feb 23 08:49:22 np0005626463.localdomain systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Deactivated successfully.
Feb 23 08:49:22 np0005626463.localdomain podman[101320]: 2026-02-23 08:49:22.091363981 +0000 UTC m=+0.256825339 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, managed_by=tripleo_ansible, architecture=x86_64, io.openshift.expose-services=, release=1766032510, build-date=2026-01-12T23:07:47Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-ceilometer-compute, vendor=Red Hat, Inc., vcs-type=git, version=17.1.13, batch=17.1_20260112.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:47Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, io.buildah.version=1.41.5, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute)
Feb 23 08:49:22 np0005626463.localdomain podman[101321]: 2026-02-23 08:49:21.951687063 +0000 UTC m=+0.113270989 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, managed_by=tripleo_ansible, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, version=17.1.13, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, release=1766032510, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, name=rhosp-rhel9/openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-ceilometer-ipmi-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, vcs-type=git, distribution-scope=public, org.opencontainers.image.created=2026-01-12T23:07:30Z, batch=17.1_20260112.1, tcib_managed=true)
Feb 23 08:49:22 np0005626463.localdomain podman[101327]: 2026-02-23 08:49:22.116753262 +0000 UTC m=+0.272501394 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, summary=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-cron-container, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=logrotate_crond, org.opencontainers.image.created=2026-01-12T22:10:15Z, url=https://www.redhat.com, managed_by=tripleo_ansible, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vendor=Red Hat, Inc., batch=17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, name=rhosp-rhel9/openstack-cron, config_id=tripleo_step4, build-date=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, vcs-type=git)
Feb 23 08:49:22 np0005626463.localdomain systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully.
Feb 23 08:49:22 np0005626463.localdomain podman[101321]: 2026-02-23 08:49:22.135296125 +0000 UTC m=+0.296880001 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T23:07:30Z, config_id=tripleo_step4, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1766032510, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, managed_by=tripleo_ansible, batch=17.1_20260112.1, container_name=ceilometer_agent_ipmi, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, name=rhosp-rhel9/openstack-ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:30Z, url=https://www.redhat.com, vendor=Red Hat, Inc., version=17.1.13, maintainer=OpenStack TripleO Team)
Feb 23 08:49:22 np0005626463.localdomain systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully.
Feb 23 08:49:22 np0005626463.localdomain podman[101320]: 2026-02-23 08:49:22.151364942 +0000 UTC m=+0.316826270 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.buildah.version=1.41.5, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-ceilometer-compute-container, managed_by=tripleo_ansible, vendor=Red Hat, Inc., build-date=2026-01-12T23:07:47Z, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-type=git, config_id=tripleo_step4, batch=17.1_20260112.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_compute, architecture=x86_64, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T23:07:47Z, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, name=rhosp-rhel9/openstack-ceilometer-compute)
Feb 23 08:49:22 np0005626463.localdomain systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully.
Feb 23 08:49:24 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.
Feb 23 08:49:24 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.
Feb 23 08:49:24 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.
Feb 23 08:49:24 np0005626463.localdomain podman[101431]: 2026-02-23 08:49:24.917903713 +0000 UTC m=+0.092351260 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, io.buildah.version=1.41.5, io.openshift.expose-services=, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, maintainer=OpenStack TripleO Team, distribution-scope=public, release=1766032510, config_id=tripleo_step4, com.redhat.component=openstack-ovn-controller-container, container_name=ovn_controller, name=rhosp-rhel9/openstack-ovn-controller, batch=17.1_20260112.1, build-date=2026-01-12T22:36:40Z, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, 
tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ovn-controller)
Feb 23 08:49:24 np0005626463.localdomain systemd[1]: tmp-crun.tmrniE.mount: Deactivated successfully.
Feb 23 08:49:24 np0005626463.localdomain podman[101433]: 2026-02-23 08:49:24.979132061 +0000 UTC m=+0.148451036 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-qdrouterd, config_id=tripleo_step1, container_name=metrics_qdr, batch=17.1_20260112.1, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, managed_by=tripleo_ansible, version=17.1.13, vcs-type=git, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, architecture=x86_64, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 23 08:49:24 np0005626463.localdomain podman[101431]: 2026-02-23 08:49:24.986913677 +0000 UTC m=+0.161361114 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vendor=Red Hat, Inc., batch=17.1_20260112.1, version=17.1.13, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, distribution-scope=public, release=1766032510, build-date=2026-01-12T22:36:40Z, container_name=ovn_controller, name=rhosp-rhel9/openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, url=https://www.redhat.com, 
org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, config_id=tripleo_step4, vcs-type=git, managed_by=tripleo_ansible)
Feb 23 08:49:24 np0005626463.localdomain podman[101431]: unhealthy
Feb 23 08:49:24 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Main process exited, code=exited, status=1/FAILURE
Feb 23 08:49:24 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Failed with result 'exit-code'.
Feb 23 08:49:25 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.
Feb 23 08:49:25 np0005626463.localdomain podman[101432]: 2026-02-23 08:49:25.085208462 +0000 UTC m=+0.254056023 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, config_id=tripleo_step4, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=ovn_metadata_agent, tcib_managed=true, release=1766032510, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:56:19Z, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, batch=17.1_20260112.1, build-date=2026-01-12T22:56:19Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, managed_by=tripleo_ansible, io.buildah.version=1.41.5, architecture=x86_64, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, version=17.1.13, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team)
Feb 23 08:49:25 np0005626463.localdomain podman[101432]: 2026-02-23 08:49:25.130565551 +0000 UTC m=+0.299413102 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vendor=Red Hat, Inc., tcib_managed=true, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, summary=Red Hat OpenStack Platform 17.1 
neutron-metadata-agent-ovn, release=1766032510, container_name=ovn_metadata_agent, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, architecture=x86_64, config_id=tripleo_step4, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, distribution-scope=public, url=https://www.redhat.com, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1)
Feb 23 08:49:25 np0005626463.localdomain podman[101432]: unhealthy
Feb 23 08:49:25 np0005626463.localdomain systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Main process exited, code=exited, status=1/FAILURE
Feb 23 08:49:25 np0005626463.localdomain systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Failed with result 'exit-code'.
Feb 23 08:49:25 np0005626463.localdomain podman[101433]: 2026-02-23 08:49:25.169312651 +0000 UTC m=+0.338631636 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, build-date=2026-01-12T22:10:14Z, com.redhat.component=openstack-qdrouterd-container, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:10:14Z, url=https://www.redhat.com, name=rhosp-rhel9/openstack-qdrouterd, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vendor=Red Hat, Inc., tcib_managed=true, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 23 08:49:25 np0005626463.localdomain systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully.
Feb 23 08:49:25 np0005626463.localdomain podman[101500]: 2026-02-23 08:49:25.181091262 +0000 UTC m=+0.089497450 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.expose-services=, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, vcs-type=git, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:32:04Z, 
release=1766032510, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:32:04Z, architecture=x86_64, container_name=nova_migration_target)
Feb 23 08:49:25 np0005626463.localdomain podman[101500]: 2026-02-23 08:49:25.592496319 +0000 UTC m=+0.500902507 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vendor=Red Hat, Inc., version=17.1.13, url=https://www.redhat.com, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=nova_migration_target, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1766032510, config_id=tripleo_step4, name=rhosp-rhel9/openstack-nova-compute, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']})
Feb 23 08:49:25 np0005626463.localdomain systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully.
Feb 23 08:49:40 np0005626463.localdomain sshd[101525]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:49:41 np0005626463.localdomain sshd[101525]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:49:45 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.
Feb 23 08:49:45 np0005626463.localdomain systemd[1]: tmp-crun.NXyRKd.mount: Deactivated successfully.
Feb 23 08:49:45 np0005626463.localdomain podman[101527]: 2026-02-23 08:49:45.918407163 +0000 UTC m=+0.090100415 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, vcs-type=git, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, build-date=2026-01-12T22:10:15Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, config_id=tripleo_step3, tcib_managed=true, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:10:15Z, summary=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, name=rhosp-rhel9/openstack-collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.component=openstack-collectd-container, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, release=1766032510)
Feb 23 08:49:45 np0005626463.localdomain podman[101527]: 2026-02-23 08:49:45.928403216 +0000 UTC m=+0.100096498 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, maintainer=OpenStack TripleO Team, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step3, io.buildah.version=1.41.5, distribution-scope=public, architecture=x86_64, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, 
description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, managed_by=tripleo_ansible, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, com.redhat.component=openstack-collectd-container, release=1766032510, batch=17.1_20260112.1)
Feb 23 08:49:45 np0005626463.localdomain systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully.
Feb 23 08:49:52 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.
Feb 23 08:49:52 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.
Feb 23 08:49:52 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.
Feb 23 08:49:52 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.
Feb 23 08:49:52 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.
Feb 23 08:49:52 np0005626463.localdomain podman[101548]: 2026-02-23 08:49:52.945070046 +0000 UTC m=+0.117296108 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, container_name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.buildah.version=1.41.5, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, distribution-scope=public, vcs-ref=705339545363fec600102567c4e923938e0f43b3, 
build-date=2026-01-12T22:34:43Z, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:34:43Z, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, config_id=tripleo_step3, version=17.1.13, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, tcib_managed=true, batch=17.1_20260112.1)
Feb 23 08:49:52 np0005626463.localdomain podman[101548]: 2026-02-23 08:49:52.954079029 +0000 UTC m=+0.126305091 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vcs-type=git, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, tcib_managed=true, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, version=17.1.13, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, name=rhosp-rhel9/openstack-iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, 
org.opencontainers.image.created=2026-01-12T22:34:43Z, container_name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, com.redhat.component=openstack-iscsid-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 iscsid, build-date=2026-01-12T22:34:43Z, url=https://www.redhat.com)
Feb 23 08:49:52 np0005626463.localdomain podman[101549]: 2026-02-23 08:49:52.917127011 +0000 UTC m=+0.089126496 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, com.redhat.component=openstack-ceilometer-compute-container, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-type=git, io.buildah.version=1.41.5, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, distribution-scope=public, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1766032510, build-date=2026-01-12T23:07:47Z, version=17.1.13, config_id=tripleo_step4, tcib_managed=true, batch=17.1_20260112.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=)
Feb 23 08:49:52 np0005626463.localdomain systemd[1]: tmp-crun.nZ17D1.mount: Deactivated successfully.
Feb 23 08:49:52 np0005626463.localdomain podman[101551]: 2026-02-23 08:49:52.985801394 +0000 UTC m=+0.145034988 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, build-date=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, name=rhosp-rhel9/openstack-cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, release=1766032510, config_id=tripleo_step4, tcib_managed=true, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-cron-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, container_name=logrotate_crond, managed_by=tripleo_ansible, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, batch=17.1_20260112.1, url=https://www.redhat.com, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5)
Feb 23 08:49:52 np0005626463.localdomain podman[101551]: 2026-02-23 08:49:52.993889357 +0000 UTC m=+0.153122971 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, maintainer=OpenStack TripleO Team, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, vendor=Red Hat, Inc., architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 cron, 
batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, release=1766032510, com.redhat.component=openstack-cron-container, org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-cron, version=17.1.13, tcib_managed=true, distribution-scope=public, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4)
Feb 23 08:49:53 np0005626463.localdomain podman[101549]: 2026-02-23 08:49:53.003541939 +0000 UTC m=+0.175541474 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-compute-container, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, 
org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.created=2026-01-12T23:07:47Z, container_name=ceilometer_agent_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, build-date=2026-01-12T23:07:47Z, version=17.1.13, io.buildah.version=1.41.5, tcib_managed=true, name=rhosp-rhel9/openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team)
Feb 23 08:49:53 np0005626463.localdomain systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully.
Feb 23 08:49:53 np0005626463.localdomain systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully.
Feb 23 08:49:53 np0005626463.localdomain podman[101550]: 2026-02-23 08:49:53.085960813 +0000 UTC m=+0.251584187 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack 
TripleO Team, release=1766032510, distribution-scope=public, build-date=2026-01-12T23:07:30Z, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.openshift.expose-services=, url=https://www.redhat.com, tcib_managed=true, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible)
Feb 23 08:49:53 np0005626463.localdomain podman[101550]: 2026-02-23 08:49:53.122272411 +0000 UTC m=+0.287895825 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vendor=Red Hat, Inc., architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-ceilometer-ipmi-container, build-date=2026-01-12T23:07:30Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, managed_by=tripleo_ansible, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, name=rhosp-rhel9/openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, version=17.1.13, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, io.openshift.expose-services=, tcib_managed=true, io.buildah.version=1.41.5, distribution-scope=public)
Feb 23 08:49:53 np0005626463.localdomain systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully.
Feb 23 08:49:53 np0005626463.localdomain systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully.
Feb 23 08:49:53 np0005626463.localdomain podman[101562]: 2026-02-23 08:49:53.20072939 +0000 UTC m=+0.357446805 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-compute, batch=17.1_20260112.1, vcs-type=git, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2026-01-12T23:32:04Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, config_id=tripleo_step5, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, version=17.1.13, io.openshift.expose-services=, container_name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 23 08:49:53 np0005626463.localdomain podman[101562]: 2026-02-23 08:49:53.257440719 +0000 UTC m=+0.414158104 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., io.openshift.expose-services=, batch=17.1_20260112.1, version=17.1.13, description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.5, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, org.opencontainers.image.created=2026-01-12T23:32:04Z, config_id=tripleo_step5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public, build-date=2026-01-12T23:32:04Z, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510)
Feb 23 08:49:53 np0005626463.localdomain systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Deactivated successfully.
Feb 23 08:49:55 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.
Feb 23 08:49:55 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.
Feb 23 08:49:55 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.
Feb 23 08:49:55 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.
Feb 23 08:49:55 np0005626463.localdomain podman[101666]: 2026-02-23 08:49:55.92379729 +0000 UTC m=+0.090037804 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, summary=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, url=https://www.redhat.com, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, config_id=tripleo_step4, name=rhosp-rhel9/openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, io.buildah.version=1.41.5, container_name=ovn_controller, vcs-type=git, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, konflux.additional-tags=17.1.13 
17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:36:40Z, version=17.1.13, build-date=2026-01-12T22:36:40Z, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1766032510, managed_by=tripleo_ansible)
Feb 23 08:49:55 np0005626463.localdomain podman[101668]: 2026-02-23 08:49:55.973514598 +0000 UTC m=+0.132817914 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:10:14Z, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, name=rhosp-rhel9/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, url=https://www.redhat.com, build-date=2026-01-12T22:10:14Z, vendor=Red Hat, Inc., vcs-type=git, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, container_name=metrics_qdr, batch=17.1_20260112.1, tcib_managed=true, distribution-scope=public, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.buildah.version=1.41.5)
Feb 23 08:49:55 np0005626463.localdomain podman[101666]: 2026-02-23 08:49:55.992779702 +0000 UTC m=+0.159020246 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, distribution-scope=public, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, build-date=2026-01-12T22:36:40Z, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, 
release=1766032510, io.openshift.expose-services=, architecture=x86_64, url=https://www.redhat.com, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, managed_by=tripleo_ansible)
Feb 23 08:49:55 np0005626463.localdomain podman[101666]: unhealthy
Feb 23 08:49:56 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Main process exited, code=exited, status=1/FAILURE
Feb 23 08:49:56 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Failed with result 'exit-code'.
Feb 23 08:49:56 np0005626463.localdomain systemd[1]: tmp-crun.MWdEIR.mount: Deactivated successfully.
Feb 23 08:49:56 np0005626463.localdomain podman[101667]: 2026-02-23 08:49:56.051098471 +0000 UTC m=+0.213163603 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.expose-services=, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, url=https://www.redhat.com, io.buildah.version=1.41.5, summary=Red Hat 
OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, distribution-scope=public, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:56:19Z, version=17.1.13, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, build-date=2026-01-12T22:56:19Z, managed_by=tripleo_ansible, architecture=x86_64, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, release=1766032510, tcib_managed=true, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn)
Feb 23 08:49:56 np0005626463.localdomain podman[101667]: 2026-02-23 08:49:56.065251884 +0000 UTC m=+0.227317046 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-type=git, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, tcib_managed=true, version=17.1.13, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, managed_by=tripleo_ansible, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:56:19Z, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, batch=17.1_20260112.1)
Feb 23 08:49:56 np0005626463.localdomain podman[101667]: unhealthy
Feb 23 08:49:56 np0005626463.localdomain systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Main process exited, code=exited, status=1/FAILURE
Feb 23 08:49:56 np0005626463.localdomain systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Failed with result 'exit-code'.
Feb 23 08:49:56 np0005626463.localdomain podman[101665]: 2026-02-23 08:49:56.142265038 +0000 UTC m=+0.312001251 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, io.buildah.version=1.41.5, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, name=rhosp-rhel9/openstack-nova-compute, release=1766032510, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, 
org.opencontainers.image.created=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:32:04Z, architecture=x86_64, url=https://www.redhat.com, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, tcib_managed=true)
Feb 23 08:49:56 np0005626463.localdomain podman[101668]: 2026-02-23 08:49:56.200725101 +0000 UTC m=+0.360028417 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-qdrouterd, container_name=metrics_qdr, vcs-type=git, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, org.opencontainers.image.created=2026-01-12T22:10:14Z, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, build-date=2026-01-12T22:10:14Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, architecture=x86_64, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, distribution-scope=public)
Feb 23 08:49:56 np0005626463.localdomain systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully.
Feb 23 08:49:56 np0005626463.localdomain podman[101665]: 2026-02-23 08:49:56.554939585 +0000 UTC m=+0.724675768 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, config_id=tripleo_step4, distribution-scope=public, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, build-date=2026-01-12T23:32:04Z, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-nova-compute, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T23:32:04Z, tcib_managed=true, container_name=nova_migration_target, batch=17.1_20260112.1)
Feb 23 08:49:56 np0005626463.localdomain systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully.
Feb 23 08:49:56 np0005626463.localdomain sshd[101757]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:49:56 np0005626463.localdomain sudo[101758]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 08:49:56 np0005626463.localdomain sudo[101758]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:49:56 np0005626463.localdomain sudo[101758]: pam_unix(sudo:session): session closed for user root
Feb 23 08:49:56 np0005626463.localdomain sudo[101774]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 23 08:49:56 np0005626463.localdomain sudo[101774]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:49:57 np0005626463.localdomain sshd[101757]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:49:57 np0005626463.localdomain sudo[101774]: pam_unix(sudo:session): session closed for user root
Feb 23 08:50:00 np0005626463.localdomain sudo[101820]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 08:50:00 np0005626463.localdomain sudo[101820]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:50:00 np0005626463.localdomain sudo[101820]: pam_unix(sudo:session): session closed for user root
Feb 23 08:50:16 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.
Feb 23 08:50:16 np0005626463.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Feb 23 08:50:16 np0005626463.localdomain recover_tripleo_nova_virtqemud[101837]: 61982
Feb 23 08:50:16 np0005626463.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Feb 23 08:50:16 np0005626463.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Feb 23 08:50:16 np0005626463.localdomain podman[101835]: 2026-02-23 08:50:16.934187549 +0000 UTC m=+0.100742629 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, url=https://www.redhat.com, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, distribution-scope=public, io.buildah.version=1.41.5, container_name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z, vcs-type=git, architecture=x86_64, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.component=openstack-collectd-container, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vendor=Red Hat, Inc., io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd)
Feb 23 08:50:16 np0005626463.localdomain podman[101835]: 2026-02-23 08:50:16.974305907 +0000 UTC m=+0.140860957 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, release=1766032510, description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, managed_by=tripleo_ansible, com.redhat.component=openstack-collectd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, version=17.1.13, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., build-date=2026-01-12T22:10:15Z, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, config_id=tripleo_step3, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, io.buildah.version=1.41.5, container_name=collectd, tcib_managed=true, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com)
Feb 23 08:50:16 np0005626463.localdomain systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully.
Feb 23 08:50:23 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.
Feb 23 08:50:23 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.
Feb 23 08:50:23 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.
Feb 23 08:50:23 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.
Feb 23 08:50:23 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.
Feb 23 08:50:23 np0005626463.localdomain podman[101860]: 2026-02-23 08:50:23.933059082 +0000 UTC m=+0.093002876 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 cron, name=rhosp-rhel9/openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, tcib_managed=true, build-date=2026-01-12T22:10:15Z, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, distribution-scope=public, batch=17.1_20260112.1, container_name=logrotate_crond, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, 
io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, io.buildah.version=1.41.5, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vendor=Red Hat, Inc., com.redhat.component=openstack-cron-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, architecture=x86_64, url=https://www.redhat.com)
Feb 23 08:50:23 np0005626463.localdomain podman[101860]: 2026-02-23 08:50:23.944601624 +0000 UTC m=+0.104545458 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20260112.1, io.openshift.expose-services=, com.redhat.component=openstack-cron-container, distribution-scope=public, build-date=2026-01-12T22:10:15Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, name=rhosp-rhel9/openstack-cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, tcib_managed=true, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, version=17.1.13, 
io.buildah.version=1.41.5, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond)
Feb 23 08:50:23 np0005626463.localdomain systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully.
Feb 23 08:50:23 np0005626463.localdomain systemd[1]: tmp-crun.IvMKvH.mount: Deactivated successfully.
Feb 23 08:50:23 np0005626463.localdomain podman[101859]: 2026-02-23 08:50:23.991047629 +0000 UTC m=+0.155708361 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, url=https://www.redhat.com, vcs-type=git, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.created=2026-01-12T23:07:30Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T23:07:30Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, managed_by=tripleo_ansible, version=17.1.13, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-ipmi-container, tcib_managed=true, architecture=x86_64, container_name=ceilometer_agent_ipmi, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, io.openshift.expose-services=)
Feb 23 08:50:24 np0005626463.localdomain podman[101871]: 2026-02-23 08:50:24.025135309 +0000 UTC m=+0.180730547 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, tcib_managed=true, io.buildah.version=1.41.5, version=17.1.13, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, name=rhosp-rhel9/openstack-nova-compute, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, config_id=tripleo_step5, url=https://www.redhat.com, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, batch=17.1_20260112.1, vcs-type=git, build-date=2026-01-12T23:32:04Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 
'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 23 08:50:24 np0005626463.localdomain podman[101857]: 2026-02-23 08:50:24.04496211 +0000 UTC m=+0.215113874 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, org.opencontainers.image.created=2026-01-12T22:34:43Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 iscsid, build-date=2026-01-12T22:34:43Z, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.openshift.expose-services=, managed_by=tripleo_ansible, container_name=iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, release=1766032510, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.5, architecture=x86_64, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, tcib_managed=true, name=rhosp-rhel9/openstack-iscsid, version=17.1.13)
Feb 23 08:50:24 np0005626463.localdomain podman[101859]: 2026-02-23 08:50:24.052713772 +0000 UTC m=+0.217374524 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, release=1766032510, name=rhosp-rhel9/openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, version=17.1.13, io.buildah.version=1.41.5, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-ipmi-container, url=https://www.redhat.com, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, config_id=tripleo_step4, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T23:07:30Z, vcs-type=git, architecture=x86_64, container_name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public)
Feb 23 08:50:24 np0005626463.localdomain systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully.
Feb 23 08:50:24 np0005626463.localdomain podman[101857]: 2026-02-23 08:50:24.081595138 +0000 UTC m=+0.251746912 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vcs-type=git, tcib_managed=true, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-iscsid-container, org.opencontainers.image.created=2026-01-12T22:34:43Z, vcs-ref=705339545363fec600102567c4e923938e0f43b3, config_id=tripleo_step3, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-iscsid, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, version=17.1.13, name=rhosp-rhel9/openstack-iscsid, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, build-date=2026-01-12T22:34:43Z, container_name=iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510)
Feb 23 08:50:24 np0005626463.localdomain systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully.
Feb 23 08:50:24 np0005626463.localdomain podman[101871]: 2026-02-23 08:50:24.134846908 +0000 UTC m=+0.290442146 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, container_name=nova_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, tcib_managed=true, batch=17.1_20260112.1, distribution-scope=public, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, config_id=tripleo_step5, build-date=2026-01-12T23:32:04Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, release=1766032510, url=https://www.redhat.com)
Feb 23 08:50:24 np0005626463.localdomain systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Deactivated successfully.
Feb 23 08:50:24 np0005626463.localdomain podman[101858]: 2026-02-23 08:50:24.1406728 +0000 UTC m=+0.308098089 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_compute, config_id=tripleo_step4, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, batch=17.1_20260112.1, io.buildah.version=1.41.5, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, build-date=2026-01-12T23:07:47Z, 
org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, release=1766032510, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, name=rhosp-rhel9/openstack-ceilometer-compute, io.openshift.expose-services=, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:07:47Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 23 08:50:24 np0005626463.localdomain podman[101858]: 2026-02-23 08:50:24.221987419 +0000 UTC m=+0.389412678 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, architecture=x86_64, build-date=2026-01-12T23:07:47Z, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, 
konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., tcib_managed=true, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1766032510, container_name=ceilometer_agent_compute, name=rhosp-rhel9/openstack-ceilometer-compute, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_id=tripleo_step4)
Feb 23 08:50:24 np0005626463.localdomain systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully.
Feb 23 08:50:26 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.
Feb 23 08:50:26 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.
Feb 23 08:50:26 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.
Feb 23 08:50:26 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.
Feb 23 08:50:26 np0005626463.localdomain podman[101973]: 2026-02-23 08:50:26.919168378 +0000 UTC m=+0.090229241 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, config_id=tripleo_step4, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, tcib_managed=true, url=https://www.redhat.com, release=1766032510, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, distribution-scope=public, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:36:40Z, container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-ovn-controller-container, vendor=Red Hat, Inc., batch=17.1_20260112.1, managed_by=tripleo_ansible, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-ovn-controller, architecture=x86_64, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 23 08:50:26 np0005626463.localdomain podman[101973]: 2026-02-23 08:50:26.96422981 +0000 UTC m=+0.135290693 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, managed_by=tripleo_ansible, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, tcib_managed=true, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ovn-controller-container, config_id=tripleo_step4, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, summary=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, io.buildah.version=1.41.5, release=1766032510, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, batch=17.1_20260112.1, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ovn-controller, 
cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-ovn-controller, build-date=2026-01-12T22:36:40Z, container_name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, io.openshift.expose-services=)
Feb 23 08:50:26 np0005626463.localdomain podman[101973]: unhealthy
Feb 23 08:50:26 np0005626463.localdomain systemd[1]: tmp-crun.IDSr7w.mount: Deactivated successfully.
Feb 23 08:50:26 np0005626463.localdomain podman[101974]: 2026-02-23 08:50:26.980546772 +0000 UTC m=+0.149252361 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:56:19Z, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, batch=17.1_20260112.1, managed_by=tripleo_ansible, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, container_name=ovn_metadata_agent, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, distribution-scope=public, build-date=2026-01-12T22:56:19Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, version=17.1.13, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, release=1766032510, vendor=Red Hat, Inc.)
Feb 23 08:50:26 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Main process exited, code=exited, status=1/FAILURE
Feb 23 08:50:26 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Failed with result 'exit-code'.
Feb 23 08:50:27 np0005626463.localdomain podman[101974]: 2026-02-23 08:50:27.019963707 +0000 UTC m=+0.188669276 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, build-date=2026-01-12T22:56:19Z, container_name=ovn_metadata_agent, config_id=tripleo_step4, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, managed_by=tripleo_ansible, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.5, vcs-type=git, io.openshift.expose-services=, url=https://www.redhat.com, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, release=1766032510, org.opencontainers.image.created=2026-01-12T22:56:19Z, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Feb 23 08:50:27 np0005626463.localdomain podman[101974]: unhealthy
Feb 23 08:50:27 np0005626463.localdomain systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Main process exited, code=exited, status=1/FAILURE
Feb 23 08:50:27 np0005626463.localdomain systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Failed with result 'exit-code'.
Feb 23 08:50:27 np0005626463.localdomain podman[101975]: 2026-02-23 08:50:27.034390539 +0000 UTC m=+0.198408680 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, name=rhosp-rhel9/openstack-qdrouterd, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.5, version=17.1.13, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, org.opencontainers.image.created=2026-01-12T22:10:14Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, container_name=metrics_qdr, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:10:14Z, config_id=tripleo_step1, vendor=Red Hat, Inc.)
Feb 23 08:50:27 np0005626463.localdomain podman[101972]: 2026-02-23 08:50:27.073498265 +0000 UTC m=+0.247004033 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, org.opencontainers.image.created=2026-01-12T23:32:04Z, batch=17.1_20260112.1, io.buildah.version=1.41.5, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, release=1766032510, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, vcs-type=git, container_name=nova_migration_target, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, build-date=2026-01-12T23:32:04Z, managed_by=tripleo_ansible)
Feb 23 08:50:27 np0005626463.localdomain podman[101975]: 2026-02-23 08:50:27.24943417 +0000 UTC m=+0.413452271 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, distribution-scope=public, tcib_managed=true, container_name=metrics_qdr, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, build-date=2026-01-12T22:10:14Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, com.redhat.component=openstack-qdrouterd-container, name=rhosp-rhel9/openstack-qdrouterd, url=https://www.redhat.com, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:10:14Z, architecture=x86_64, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=)
Feb 23 08:50:27 np0005626463.localdomain systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully.
Feb 23 08:50:27 np0005626463.localdomain podman[101972]: 2026-02-23 08:50:27.481443712 +0000 UTC m=+0.654949440 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, maintainer=OpenStack TripleO Team, container_name=nova_migration_target, distribution-scope=public, build-date=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, version=17.1.13, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, release=1766032510, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, io.buildah.version=1.41.5, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, name=rhosp-rhel9/openstack-nova-compute, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 23 08:50:27 np0005626463.localdomain systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully.
Feb 23 08:50:27 np0005626463.localdomain sshd[102065]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:50:28 np0005626463.localdomain sshd[102065]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:50:37 np0005626463.localdomain sshd[102067]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:50:37 np0005626463.localdomain sshd[102067]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:50:47 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.
Feb 23 08:50:47 np0005626463.localdomain podman[102069]: 2026-02-23 08:50:47.909553321 +0000 UTC m=+0.080595988 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, container_name=collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, url=https://www.redhat.com, tcib_managed=true, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-collectd-container, architecture=x86_64, name=rhosp-rhel9/openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, build-date=2026-01-12T22:10:15Z, batch=17.1_20260112.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, config_id=tripleo_step3, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee)
Feb 23 08:50:47 np0005626463.localdomain podman[102069]: 2026-02-23 08:50:47.949419 +0000 UTC m=+0.120461667 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, vcs-type=git, name=rhosp-rhel9/openstack-collectd, vendor=Red Hat, Inc., config_id=tripleo_step3, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, 
vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, managed_by=tripleo_ansible, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, build-date=2026-01-12T22:10:15Z, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:15Z, release=1766032510, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd)
Feb 23 08:50:47 np0005626463.localdomain systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully.
Feb 23 08:50:54 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.
Feb 23 08:50:54 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.
Feb 23 08:50:54 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.
Feb 23 08:50:54 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.
Feb 23 08:50:54 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.
Feb 23 08:50:54 np0005626463.localdomain podman[102092]: 2026-02-23 08:50:54.934985575 +0000 UTC m=+0.094866905 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, org.opencontainers.image.created=2026-01-12T22:10:15Z, name=rhosp-rhel9/openstack-cron, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, batch=17.1_20260112.1, com.redhat.component=openstack-cron-container, version=17.1.13, tcib_managed=true, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, distribution-scope=public, managed_by=tripleo_ansible, vendor=Red Hat, Inc.)
Feb 23 08:50:54 np0005626463.localdomain podman[102092]: 2026-02-23 08:50:54.970158378 +0000 UTC m=+0.130039728 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, managed_by=tripleo_ansible, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-cron, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, vendor=Red Hat, Inc., com.redhat.component=openstack-cron-container, vcs-type=git, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, container_name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, release=1766032510, architecture=x86_64, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, distribution-scope=public, config_id=tripleo_step4, build-date=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 23 08:50:55 np0005626463.localdomain systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully.
Feb 23 08:50:55 np0005626463.localdomain podman[102089]: 2026-02-23 08:50:55.021509627 +0000 UTC m=+0.191187504 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, distribution-scope=public, container_name=iscsid, io.openshift.expose-services=, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-iscsid, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, version=17.1.13, url=https://www.redhat.com, com.redhat.component=openstack-iscsid-container, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=705339545363fec600102567c4e923938e0f43b3, org.opencontainers.image.created=2026-01-12T22:34:43Z, tcib_managed=true, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, build-date=2026-01-12T22:34:43Z)
Feb 23 08:50:55 np0005626463.localdomain podman[102089]: 2026-02-23 08:50:55.036215818 +0000 UTC m=+0.205893755 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, distribution-scope=public, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, vendor=Red Hat, Inc., container_name=iscsid, vcs-type=git, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2026-01-12T22:34:43Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, summary=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, io.buildah.version=1.41.5, managed_by=tripleo_ansible, com.redhat.component=openstack-iscsid-container, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, name=rhosp-rhel9/openstack-iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:34:43Z)
Feb 23 08:50:55 np0005626463.localdomain systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully.
Feb 23 08:50:55 np0005626463.localdomain podman[102090]: 2026-02-23 08:50:55.036628102 +0000 UTC m=+0.202960794 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, name=rhosp-rhel9/openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, architecture=x86_64, config_id=tripleo_step4, io.openshift.expose-services=, release=1766032510, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T23:07:47Z, batch=17.1_20260112.1, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vendor=Red Hat, Inc., version=17.1.13, container_name=ceilometer_agent_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.5, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2026-01-12T23:07:47Z, maintainer=OpenStack TripleO Team)
Feb 23 08:50:55 np0005626463.localdomain podman[102091]: 2026-02-23 08:50:55.093536305 +0000 UTC m=+0.255520760 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, container_name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2026-01-12T23:07:30Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, url=https://www.redhat.com, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, 
vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, release=1766032510, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-ipmi-container, org.opencontainers.image.created=2026-01-12T23:07:30Z, version=17.1.13, batch=17.1_20260112.1, architecture=x86_64)
Feb 23 08:50:55 np0005626463.localdomain podman[102097]: 2026-02-23 08:50:55.145198204 +0000 UTC m=+0.301803591 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, io.buildah.version=1.41.5, vendor=Red Hat, Inc., managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-nova-compute, release=1766032510, batch=17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=nova_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.created=2026-01-12T23:32:04Z, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, tcib_managed=true, vcs-type=git)
Feb 23 08:50:55 np0005626463.localdomain podman[102091]: 2026-02-23 08:50:55.15559401 +0000 UTC m=+0.317578505 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1766032510, org.opencontainers.image.created=2026-01-12T23:07:30Z, version=17.1.13, batch=17.1_20260112.1, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 
ceilometer-ipmi, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vendor=Red Hat, Inc., architecture=x86_64, build-date=2026-01-12T23:07:30Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, name=rhosp-rhel9/openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4)
Feb 23 08:50:55 np0005626463.localdomain podman[102090]: 2026-02-23 08:50:55.167672679 +0000 UTC m=+0.334005331 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vendor=Red Hat, Inc., version=17.1.13, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:47Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1766032510, io.openshift.expose-services=, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, managed_by=tripleo_ansible, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-ceilometer-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1)
Feb 23 08:50:55 np0005626463.localdomain systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully.
Feb 23 08:50:55 np0005626463.localdomain systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully.
Feb 23 08:50:55 np0005626463.localdomain podman[102097]: 2026-02-23 08:50:55.181537043 +0000 UTC m=+0.338142470 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, io.buildah.version=1.41.5, vcs-type=git, batch=17.1_20260112.1, vendor=Red Hat, Inc., distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step5, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, name=rhosp-rhel9/openstack-nova-compute, io.openshift.expose-services=, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team, release=1766032510, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_compute, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container)
Feb 23 08:50:55 np0005626463.localdomain systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Deactivated successfully.
Feb 23 08:50:57 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.
Feb 23 08:50:57 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.
Feb 23 08:50:57 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.
Feb 23 08:50:57 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.
Feb 23 08:50:57 np0005626463.localdomain podman[102207]: 2026-02-23 08:50:57.928289976 +0000 UTC m=+0.100173152 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=nova_migration_target, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T23:32:04Z, config_id=tripleo_step4, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-nova-compute, release=1766032510, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, version=17.1.13, batch=17.1_20260112.1, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:32:04Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.5, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, maintainer=OpenStack TripleO Team)
Feb 23 08:50:58 np0005626463.localdomain podman[102210]: 2026-02-23 08:50:58.006946851 +0000 UTC m=+0.164598961 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, batch=17.1_20260112.1, version=17.1.13, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:14Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, container_name=metrics_qdr, url=https://www.redhat.com, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public)
Feb 23 08:50:58 np0005626463.localdomain podman[102209]: 2026-02-23 08:50:57.974397681 +0000 UTC m=+0.138731430 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, version=17.1.13, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.created=2026-01-12T22:56:19Z, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, vcs-type=git, container_name=ovn_metadata_agent, tcib_managed=true, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, vendor=Red Hat, Inc., distribution-scope=public, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, maintainer=OpenStack TripleO Team, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z)
Feb 23 08:50:58 np0005626463.localdomain podman[102208]: 2026-02-23 08:50:58.056368921 +0000 UTC m=+0.221016929 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:36:40Z, build-date=2026-01-12T22:36:40Z, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, name=rhosp-rhel9/openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, release=1766032510, tcib_managed=true, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-type=git, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, io.buildah.version=1.41.5, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, io.openshift.expose-services=, container_name=ovn_controller, config_id=tripleo_step4)
Feb 23 08:50:58 np0005626463.localdomain podman[102208]: 2026-02-23 08:50:58.099362478 +0000 UTC m=+0.264010456 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, build-date=2026-01-12T22:36:40Z, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, tcib_managed=true, vendor=Red Hat, Inc., batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, release=1766032510, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_controller, name=rhosp-rhel9/openstack-ovn-controller, config_id=tripleo_step4, io.buildah.version=1.41.5, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', 
'/var/log/containers/openvswitch:/var/log/ovn:z']}, maintainer=OpenStack TripleO Team, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:36:40Z)
Feb 23 08:50:58 np0005626463.localdomain podman[102208]: unhealthy
Feb 23 08:50:58 np0005626463.localdomain podman[102209]: 2026-02-23 08:50:58.108797984 +0000 UTC m=+0.273131723 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:56:19Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:56:19Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=ovn_metadata_agent, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, config_id=tripleo_step4, version=17.1.13, vcs-type=git, io.openshift.expose-services=)
Feb 23 08:50:58 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Main process exited, code=exited, status=1/FAILURE
Feb 23 08:50:58 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Failed with result 'exit-code'.
Feb 23 08:50:58 np0005626463.localdomain podman[102209]: unhealthy
Feb 23 08:50:58 np0005626463.localdomain systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Main process exited, code=exited, status=1/FAILURE
Feb 23 08:50:58 np0005626463.localdomain systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Failed with result 'exit-code'.
Feb 23 08:50:58 np0005626463.localdomain podman[102210]: 2026-02-23 08:50:58.214197428 +0000 UTC m=+0.371849488 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, vcs-type=git, build-date=2026-01-12T22:10:14Z, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, url=https://www.redhat.com, io.buildah.version=1.41.5, 
io.openshift.expose-services=, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., container_name=metrics_qdr, config_id=tripleo_step1, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, distribution-scope=public)
Feb 23 08:50:58 np0005626463.localdomain systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully.
Feb 23 08:50:58 np0005626463.localdomain podman[102207]: 2026-02-23 08:50:58.363544509 +0000 UTC m=+0.535427735 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, architecture=x86_64, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., release=1766032510, vcs-type=git, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-compute, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, build-date=2026-01-12T23:32:04Z, config_id=tripleo_step4, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:32:04Z, container_name=nova_migration_target)
Feb 23 08:50:58 np0005626463.localdomain systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully.
Feb 23 08:51:00 np0005626463.localdomain sudo[102296]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 08:51:00 np0005626463.localdomain sudo[102296]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:51:00 np0005626463.localdomain sudo[102296]: pam_unix(sudo:session): session closed for user root
Feb 23 08:51:00 np0005626463.localdomain sudo[102311]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 check-host
Feb 23 08:51:00 np0005626463.localdomain sudo[102311]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:51:01 np0005626463.localdomain sudo[102311]: pam_unix(sudo:session): session closed for user root
Feb 23 08:51:01 np0005626463.localdomain sudo[102347]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 08:51:01 np0005626463.localdomain sudo[102347]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:51:01 np0005626463.localdomain sudo[102347]: pam_unix(sudo:session): session closed for user root
Feb 23 08:51:01 np0005626463.localdomain sudo[102362]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 23 08:51:01 np0005626463.localdomain sudo[102362]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:51:02 np0005626463.localdomain sudo[102362]: pam_unix(sudo:session): session closed for user root
Feb 23 08:51:02 np0005626463.localdomain sudo[102408]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 08:51:02 np0005626463.localdomain sudo[102408]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:51:02 np0005626463.localdomain sudo[102408]: pam_unix(sudo:session): session closed for user root
Feb 23 08:51:17 np0005626463.localdomain sshd[102423]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:51:18 np0005626463.localdomain sshd[102423]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:51:18 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.
Feb 23 08:51:18 np0005626463.localdomain systemd[1]: tmp-crun.H0laqn.mount: Deactivated successfully.
Feb 23 08:51:18 np0005626463.localdomain podman[102425]: 2026-02-23 08:51:18.460461564 +0000 UTC m=+0.087945647 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_id=tripleo_step3, name=rhosp-rhel9/openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:10:15Z, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 collectd, vendor=Red 
Hat, Inc., description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, release=1766032510, io.buildah.version=1.41.5, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-collectd-container, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, container_name=collectd, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd)
Feb 23 08:51:18 np0005626463.localdomain podman[102425]: 2026-02-23 08:51:18.503468222 +0000 UTC m=+0.130952285 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, name=rhosp-rhel9/openstack-collectd, release=1766032510, io.openshift.expose-services=, config_id=tripleo_step3, url=https://www.redhat.com, version=17.1.13, managed_by=tripleo_ansible, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, distribution-scope=public, batch=17.1_20260112.1, build-date=2026-01-12T22:10:15Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, tcib_managed=true, com.redhat.component=openstack-collectd-container, description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee)
Feb 23 08:51:18 np0005626463.localdomain systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully.
Feb 23 08:51:19 np0005626463.localdomain sshd[102445]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:51:19 np0005626463.localdomain sshd[102445]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:51:25 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.
Feb 23 08:51:25 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.
Feb 23 08:51:25 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.
Feb 23 08:51:25 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.
Feb 23 08:51:25 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.
Feb 23 08:51:25 np0005626463.localdomain podman[102447]: 2026-02-23 08:51:25.983492609 +0000 UTC m=+0.149758826 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, url=https://www.redhat.com, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, version=17.1.13, container_name=iscsid, release=1766032510, managed_by=tripleo_ansible, vcs-ref=705339545363fec600102567c4e923938e0f43b3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, build-date=2026-01-12T22:34:43Z, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-iscsid, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, com.redhat.component=openstack-iscsid-container)
Feb 23 08:51:26 np0005626463.localdomain podman[102447]: 2026-02-23 08:51:26.01862956 +0000 UTC m=+0.184895737 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, container_name=iscsid, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vendor=Red Hat, Inc., tcib_managed=true, com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, vcs-ref=705339545363fec600102567c4e923938e0f43b3, build-date=2026-01-12T22:34:43Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, managed_by=tripleo_ansible, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, distribution-scope=public, release=1766032510, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, config_id=tripleo_step3, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:34:43Z, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']})
Feb 23 08:51:26 np0005626463.localdomain podman[102450]: 2026-02-23 08:51:26.032745142 +0000 UTC m=+0.191275297 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, release=1766032510, container_name=logrotate_crond, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, build-date=2026-01-12T22:10:15Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, name=rhosp-rhel9/openstack-cron, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, version=17.1.13, io.buildah.version=1.41.5, tcib_managed=true, batch=17.1_20260112.1, 
org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., url=https://www.redhat.com, com.redhat.component=openstack-cron-container, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:15Z, distribution-scope=public)
Feb 23 08:51:26 np0005626463.localdomain systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully.
Feb 23 08:51:26 np0005626463.localdomain podman[102448]: 2026-02-23 08:51:26.098594437 +0000 UTC m=+0.265427302 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, distribution-scope=public, config_id=tripleo_step4, name=rhosp-rhel9/openstack-ceilometer-compute, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vendor=Red Hat, Inc., managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:07:47Z, batch=17.1_20260112.1, io.buildah.version=1.41.5, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, url=https://www.redhat.com, build-date=2026-01-12T23:07:47Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64)
Feb 23 08:51:26 np0005626463.localdomain podman[102451]: 2026-02-23 08:51:25.952038762 +0000 UTC m=+0.106278743 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step5, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, batch=17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, org.opencontainers.image.created=2026-01-12T23:32:04Z, vendor=Red Hat, Inc., io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, distribution-scope=public, name=rhosp-rhel9/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 
'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true)
Feb 23 08:51:26 np0005626463.localdomain podman[102450]: 2026-02-23 08:51:26.117634803 +0000 UTC m=+0.276165008 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, build-date=2026-01-12T22:10:15Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, name=rhosp-rhel9/openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, url=https://www.redhat.com, container_name=logrotate_crond, com.redhat.component=openstack-cron-container, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 cron, 
managed_by=tripleo_ansible, io.openshift.expose-services=, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, tcib_managed=true, version=17.1.13, io.buildah.version=1.41.5)
Feb 23 08:51:26 np0005626463.localdomain systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully.
Feb 23 08:51:26 np0005626463.localdomain podman[102451]: 2026-02-23 08:51:26.13733025 +0000 UTC m=+0.291570211 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5, vendor=Red Hat, Inc., io.openshift.expose-services=, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:32:04Z, name=rhosp-rhel9/openstack-nova-compute, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1766032510, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, vcs-type=git, container_name=nova_compute, managed_by=tripleo_ansible, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:32:04Z, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, tcib_managed=true)
Feb 23 08:51:26 np0005626463.localdomain systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Deactivated successfully.
Feb 23 08:51:26 np0005626463.localdomain podman[102449]: 2026-02-23 08:51:26.188464154 +0000 UTC m=+0.351033326 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vendor=Red Hat, Inc., url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, build-date=2026-01-12T23:07:30Z, release=1766032510, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, name=rhosp-rhel9/openstack-ceilometer-ipmi, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:07:30Z, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-type=git, version=17.1.13, io.openshift.expose-services=, batch=17.1_20260112.1)
Feb 23 08:51:26 np0005626463.localdomain podman[102448]: 2026-02-23 08:51:26.211701332 +0000 UTC m=+0.378534217 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.13, config_id=tripleo_step4, release=1766032510, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-ceilometer-compute-container, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, name=rhosp-rhel9/openstack-ceilometer-compute, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., managed_by=tripleo_ansible, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_compute, io.buildah.version=1.41.5, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T23:07:47Z, architecture=x86_64, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2026-01-12T23:07:47Z)
Feb 23 08:51:26 np0005626463.localdomain systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully.
Feb 23 08:51:26 np0005626463.localdomain podman[102449]: 2026-02-23 08:51:26.244958554 +0000 UTC m=+0.407527716 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, org.opencontainers.image.created=2026-01-12T23:07:30Z, distribution-scope=public, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-type=git, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, url=https://www.redhat.com, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13, io.openshift.expose-services=, container_name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 23 08:51:26 np0005626463.localdomain systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully.
Feb 23 08:51:26 np0005626463.localdomain systemd[1]: tmp-crun.LiCk7Y.mount: Deactivated successfully.
Feb 23 08:51:28 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.
Feb 23 08:51:28 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.
Feb 23 08:51:28 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.
Feb 23 08:51:28 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.
Feb 23 08:51:28 np0005626463.localdomain podman[102570]: 2026-02-23 08:51:28.929126284 +0000 UTC m=+0.090617691 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, vendor=Red Hat, Inc., url=https://www.redhat.com, vcs-type=git, build-date=2026-01-12T22:10:14Z, container_name=metrics_qdr, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, release=1766032510, version=17.1.13, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, 
distribution-scope=public, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:10:14Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, name=rhosp-rhel9/openstack-qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd)
Feb 23 08:51:28 np0005626463.localdomain systemd[1]: tmp-crun.S2K9cQ.mount: Deactivated successfully.
Feb 23 08:51:28 np0005626463.localdomain podman[102568]: 2026-02-23 08:51:28.988898528 +0000 UTC m=+0.156042892 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, version=17.1.13, release=1766032510, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.buildah.version=1.41.5, build-date=2026-01-12T22:36:40Z, com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, architecture=x86_64, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, container_name=ovn_controller, distribution-scope=public, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': 
['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-type=git, tcib_managed=true, url=https://www.redhat.com)
Feb 23 08:51:29 np0005626463.localdomain podman[102569]: 2026-02-23 08:51:29.027034314 +0000 UTC m=+0.189260385 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:56:19Z, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, build-date=2026-01-12T22:56:19Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, release=1766032510, io.buildah.version=1.41.5, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, container_name=ovn_metadata_agent, url=https://www.redhat.com, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, managed_by=tripleo_ansible, distribution-scope=public, com.redhat.component=openstack-neutron-metadata-agent-ovn-container)
Feb 23 08:51:29 np0005626463.localdomain podman[102569]: 2026-02-23 08:51:29.042049284 +0000 UTC m=+0.204275335 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, version=17.1.13, batch=17.1_20260112.1, io.openshift.expose-services=, vendor=Red Hat, Inc., release=1766032510, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, org.opencontainers.image.created=2026-01-12T22:56:19Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, managed_by=tripleo_ansible, distribution-scope=public, io.buildah.version=1.41.5, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, config_id=tripleo_step4)
Feb 23 08:51:29 np0005626463.localdomain podman[102569]: unhealthy
Feb 23 08:51:29 np0005626463.localdomain systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Main process exited, code=exited, status=1/FAILURE
Feb 23 08:51:29 np0005626463.localdomain systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Failed with result 'exit-code'.
Feb 23 08:51:29 np0005626463.localdomain podman[102568]: 2026-02-23 08:51:29.057808928 +0000 UTC m=+0.224953272 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:36:40Z, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:36:40Z, architecture=x86_64, version=17.1.13, release=1766032510, url=https://www.redhat.com, io.buildah.version=1.41.5, container_name=ovn_controller, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 23 08:51:29 np0005626463.localdomain podman[102568]: unhealthy
Feb 23 08:51:29 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Main process exited, code=exited, status=1/FAILURE
Feb 23 08:51:29 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Failed with result 'exit-code'.
Feb 23 08:51:29 np0005626463.localdomain podman[102567]: 2026-02-23 08:51:29.129517696 +0000 UTC m=+0.299072246 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, version=17.1.13, distribution-scope=public, io.buildah.version=1.41.5, build-date=2026-01-12T23:32:04Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, url=https://www.redhat.com, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe)
Feb 23 08:51:29 np0005626463.localdomain podman[102570]: 2026-02-23 08:51:29.164707339 +0000 UTC m=+0.326198736 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, distribution-scope=public, io.openshift.expose-services=, architecture=x86_64, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp-rhel9/openstack-qdrouterd, vendor=Red Hat, Inc., container_name=metrics_qdr, maintainer=OpenStack TripleO Team, config_id=tripleo_step1, org.opencontainers.image.created=2026-01-12T22:10:14Z, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack 
Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, tcib_managed=true, url=https://www.redhat.com, io.buildah.version=1.41.5)
Feb 23 08:51:29 np0005626463.localdomain systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully.
Feb 23 08:51:29 np0005626463.localdomain podman[102567]: 2026-02-23 08:51:29.527136321 +0000 UTC m=+0.696690921 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, architecture=x86_64, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vcs-type=git, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, container_name=nova_migration_target, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, managed_by=tripleo_ansible, 
org.opencontainers.image.created=2026-01-12T23:32:04Z, config_id=tripleo_step4, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, release=1766032510, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z)
Feb 23 08:51:29 np0005626463.localdomain systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully.
Feb 23 08:51:48 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.
Feb 23 08:51:48 np0005626463.localdomain podman[102659]: 2026-02-23 08:51:48.921496783 +0000 UTC m=+0.092137779 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, batch=17.1_20260112.1, container_name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, org.opencontainers.image.created=2026-01-12T22:10:15Z, vendor=Red Hat, Inc., vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, vcs-type=git, io.openshift.expose-services=, release=1766032510, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, name=rhosp-rhel9/openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, version=17.1.13, build-date=2026-01-12T22:10:15Z, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container)
Feb 23 08:51:48 np0005626463.localdomain podman[102659]: 2026-02-23 08:51:48.960287278 +0000 UTC m=+0.130928264 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, com.redhat.component=openstack-collectd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, name=rhosp-rhel9/openstack-collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, batch=17.1_20260112.1, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z, architecture=x86_64, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd)
Feb 23 08:51:48 np0005626463.localdomain systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully.
Feb 23 08:51:56 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.
Feb 23 08:51:56 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.
Feb 23 08:51:56 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.
Feb 23 08:51:56 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.
Feb 23 08:51:56 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.
Feb 23 08:51:56 np0005626463.localdomain systemd[1]: Starting dnf makecache...
Feb 23 08:51:56 np0005626463.localdomain systemd[1]: tmp-crun.Ha1P52.mount: Deactivated successfully.
Feb 23 08:51:56 np0005626463.localdomain podman[102681]: 2026-02-23 08:51:56.93597607 +0000 UTC m=+0.102399692 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, architecture=x86_64, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.5, batch=17.1_20260112.1, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, tcib_managed=true, release=1766032510, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T23:07:47Z, url=https://www.redhat.com, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, build-date=2026-01-12T23:07:47Z, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 23 08:51:56 np0005626463.localdomain podman[102681]: 2026-02-23 08:51:56.987302144 +0000 UTC m=+0.153725826 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-ceilometer-compute, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, tcib_managed=true, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, release=1766032510, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, vcs-type=git, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, distribution-scope=public, architecture=x86_64, batch=17.1_20260112.1, build-date=2026-01-12T23:07:47Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06)
Feb 23 08:51:56 np0005626463.localdomain systemd[1]: tmp-crun.mYwtXG.mount: Deactivated successfully.
Feb 23 08:51:57 np0005626463.localdomain systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully.
Feb 23 08:51:57 np0005626463.localdomain podman[102680]: 2026-02-23 08:51:57.048308275 +0000 UTC m=+0.216203263 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, maintainer=OpenStack TripleO Team, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:34:43Z, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-iscsid, release=1766032510, vcs-type=git, config_id=tripleo_step3, com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=705339545363fec600102567c4e923938e0f43b3, distribution-scope=public, url=https://www.redhat.com, build-date=2026-01-12T22:34:43Z, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, io.buildah.version=1.41.5, container_name=iscsid)
Feb 23 08:51:57 np0005626463.localdomain podman[102683]: 2026-02-23 08:51:57.079737509 +0000 UTC m=+0.238398013 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, managed_by=tripleo_ansible, com.redhat.component=openstack-cron-container, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, release=1766032510, io.openshift.expose-services=, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, build-date=2026-01-12T22:10:15Z, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-cron, vcs-type=git, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, url=https://www.redhat.com, config_id=tripleo_step4, container_name=logrotate_crond, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']})
Feb 23 08:51:57 np0005626463.localdomain podman[102680]: 2026-02-23 08:51:57.087253219 +0000 UTC m=+0.255148177 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-iscsid-container, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, release=1766032510, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vcs-ref=705339545363fec600102567c4e923938e0f43b3, architecture=x86_64, distribution-scope=public, managed_by=tripleo_ansible, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:34:43Z, name=rhosp-rhel9/openstack-iscsid, batch=17.1_20260112.1, build-date=2026-01-12T22:34:43Z, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git)
Feb 23 08:51:57 np0005626463.localdomain systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully.
Feb 23 08:51:57 np0005626463.localdomain podman[102682]: 2026-02-23 08:51:57.000521389 +0000 UTC m=+0.161276927 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, version=17.1.13, build-date=2026-01-12T23:07:30Z, container_name=ceilometer_agent_ipmi, batch=17.1_20260112.1, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, io.openshift.expose-services=, vcs-type=git, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-ceilometer-ipmi, architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:07:30Z, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4)
Feb 23 08:51:57 np0005626463.localdomain podman[102683]: 2026-02-23 08:51:57.111075649 +0000 UTC m=+0.269736143 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:15Z, summary=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20260112.1, config_id=tripleo_step4, container_name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, architecture=x86_64, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, tcib_managed=true, distribution-scope=public, com.redhat.component=openstack-cron-container, io.openshift.expose-services=, build-date=2026-01-12T22:10:15Z, version=17.1.13)
Feb 23 08:51:57 np0005626463.localdomain systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully.
Feb 23 08:51:57 np0005626463.localdomain podman[102682]: 2026-02-23 08:51:57.135374795 +0000 UTC m=+0.296130343 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, release=1766032510, url=https://www.redhat.com, version=17.1.13, managed_by=tripleo_ansible, io.buildah.version=1.41.5, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.expose-services=, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.created=2026-01-12T23:07:30Z, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, distribution-scope=public, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, name=rhosp-rhel9/openstack-ceilometer-ipmi, architecture=x86_64)
Feb 23 08:51:57 np0005626463.localdomain systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully.
Feb 23 08:51:57 np0005626463.localdomain dnf[102695]: Updating Subscription Management repositories.
Feb 23 08:51:57 np0005626463.localdomain podman[102690]: 2026-02-23 08:51:57.091118037 +0000 UTC m=+0.245304084 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, vendor=Red Hat, Inc., architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:32:04Z, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, version=17.1.13, managed_by=tripleo_ansible, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, container_name=nova_compute, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step5, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-nova-compute, build-date=2026-01-12T23:32:04Z, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-nova-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 
'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe)
Feb 23 08:51:57 np0005626463.localdomain podman[102690]: 2026-02-23 08:51:57.224717635 +0000 UTC m=+0.378903612 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, name=rhosp-rhel9/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step5, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, batch=17.1_20260112.1, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T23:32:04Z, vendor=Red Hat, Inc., version=17.1.13, container_name=nova_compute, build-date=2026-01-12T23:32:04Z, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 
'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']})
Feb 23 08:51:57 np0005626463.localdomain systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Deactivated successfully.
Feb 23 08:51:58 np0005626463.localdomain dnf[102695]: Metadata cache refreshed recently.
Feb 23 08:51:59 np0005626463.localdomain systemd[1]: dnf-makecache.service: Deactivated successfully.
Feb 23 08:51:59 np0005626463.localdomain systemd[1]: Finished dnf makecache.
Feb 23 08:51:59 np0005626463.localdomain systemd[1]: dnf-makecache.service: Consumed 2.311s CPU time.
Feb 23 08:51:59 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.
Feb 23 08:51:59 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.
Feb 23 08:51:59 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.
Feb 23 08:51:59 np0005626463.localdomain podman[102795]: 2026-02-23 08:51:59.343489085 +0000 UTC m=+0.076124516 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.13, url=https://www.redhat.com, managed_by=tripleo_ansible, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.openshift.expose-services=, build-date=2026-01-12T22:56:19Z, tcib_managed=true, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, container_name=ovn_metadata_agent, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.created=2026-01-12T22:56:19Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1766032510, io.buildah.version=1.41.5, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 
'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, config_id=tripleo_step4)
Feb 23 08:51:59 np0005626463.localdomain podman[102794]: 2026-02-23 08:51:59.408294502 +0000 UTC m=+0.139923562 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:36:40Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, io.buildah.version=1.41.5, container_name=ovn_controller, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, architecture=x86_64, build-date=2026-01-12T22:36:40Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, vendor=Red Hat, Inc., config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', 
'/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, release=1766032510, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, managed_by=tripleo_ansible)
Feb 23 08:51:59 np0005626463.localdomain podman[102796]: 2026-02-23 08:51:59.373927418 +0000 UTC m=+0.100232935 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:10:14Z, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, 
name=rhosp-rhel9/openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, managed_by=tripleo_ansible, tcib_managed=true, build-date=2026-01-12T22:10:14Z, version=17.1.13, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., batch=17.1_20260112.1, architecture=x86_64, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 qdrouterd)
Feb 23 08:51:59 np0005626463.localdomain podman[102795]: 2026-02-23 08:51:59.430616437 +0000 UTC m=+0.163251788 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, config_id=tripleo_step4, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:56:19Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, build-date=2026-01-12T22:56:19Z, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, tcib_managed=true, batch=17.1_20260112.1, vendor=Red Hat, Inc., io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, version=17.1.13, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, url=https://www.redhat.com, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 
'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']})
Feb 23 08:51:59 np0005626463.localdomain podman[102795]: unhealthy
Feb 23 08:51:59 np0005626463.localdomain systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Main process exited, code=exited, status=1/FAILURE
Feb 23 08:51:59 np0005626463.localdomain systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Failed with result 'exit-code'.
Feb 23 08:51:59 np0005626463.localdomain podman[102794]: 2026-02-23 08:51:59.453275342 +0000 UTC m=+0.184904422 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, name=rhosp-rhel9/openstack-ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, vcs-type=git, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:36:40Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, com.redhat.component=openstack-ovn-controller-container, description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, version=17.1.13, 
io.openshift.expose-services=, distribution-scope=public, io.buildah.version=1.41.5, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ovn-controller)
Feb 23 08:51:59 np0005626463.localdomain podman[102794]: unhealthy
Feb 23 08:51:59 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Main process exited, code=exited, status=1/FAILURE
Feb 23 08:51:59 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Failed with result 'exit-code'.
Feb 23 08:51:59 np0005626463.localdomain podman[102796]: 2026-02-23 08:51:59.589419217 +0000 UTC m=+0.315724724 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., container_name=metrics_qdr, vcs-type=git, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:14Z, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, name=rhosp-rhel9/openstack-qdrouterd, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, architecture=x86_64, config_id=tripleo_step1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, build-date=2026-01-12T22:10:14Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, batch=17.1_20260112.1)
Feb 23 08:51:59 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.
Feb 23 08:51:59 np0005626463.localdomain systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully.
Feb 23 08:51:59 np0005626463.localdomain podman[102863]: 2026-02-23 08:51:59.68897998 +0000 UTC m=+0.070770281 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, 
cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, vcs-type=git, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, tcib_managed=true, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T23:32:04Z, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, io.openshift.expose-services=, batch=17.1_20260112.1)
Feb 23 08:52:00 np0005626463.localdomain podman[102863]: 2026-02-23 08:52:00.10536118 +0000 UTC m=+0.487151481 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, io.buildah.version=1.41.5, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, name=rhosp-rhel9/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, vendor=Red Hat, Inc., version=17.1.13, vcs-type=git, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public, build-date=2026-01-12T23:32:04Z, architecture=x86_64, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, container_name=nova_migration_target, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 23 08:52:00 np0005626463.localdomain systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully.
Feb 23 08:52:01 np0005626463.localdomain sshd[102886]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:52:02 np0005626463.localdomain sshd[102888]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:52:02 np0005626463.localdomain sshd[102888]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:52:02 np0005626463.localdomain sudo[102890]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 08:52:02 np0005626463.localdomain sudo[102890]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:52:02 np0005626463.localdomain sudo[102890]: pam_unix(sudo:session): session closed for user root
Feb 23 08:52:03 np0005626463.localdomain sudo[102905]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 23 08:52:03 np0005626463.localdomain sudo[102905]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:52:03 np0005626463.localdomain sudo[102905]: pam_unix(sudo:session): session closed for user root
Feb 23 08:52:04 np0005626463.localdomain sudo[102952]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 08:52:04 np0005626463.localdomain sudo[102952]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:52:04 np0005626463.localdomain sudo[102952]: pam_unix(sudo:session): session closed for user root
Feb 23 08:52:05 np0005626463.localdomain sshd[102886]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:52:17 np0005626463.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Feb 23 08:52:17 np0005626463.localdomain recover_tripleo_nova_virtqemud[102968]: 61982
Feb 23 08:52:17 np0005626463.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Feb 23 08:52:17 np0005626463.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Feb 23 08:52:19 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.
Feb 23 08:52:19 np0005626463.localdomain podman[102969]: 2026-02-23 08:52:19.926041376 +0000 UTC m=+0.095507080 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.41.5, config_id=tripleo_step3, architecture=x86_64, 
com.redhat.component=openstack-collectd-container, vendor=Red Hat, Inc., batch=17.1_20260112.1, build-date=2026-01-12T22:10:15Z, release=1766032510, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, container_name=collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, name=rhosp-rhel9/openstack-collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, managed_by=tripleo_ansible, distribution-scope=public, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 23 08:52:19 np0005626463.localdomain podman[102969]: 2026-02-23 08:52:19.936823737 +0000 UTC m=+0.106289401 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, maintainer=OpenStack TripleO Team, vcs-type=git, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., version=17.1.13, url=https://www.redhat.com, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, 
config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-collectd-container, summary=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-collectd, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:15Z)
Feb 23 08:52:19 np0005626463.localdomain systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully.
Feb 23 08:52:27 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.
Feb 23 08:52:27 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.
Feb 23 08:52:27 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.
Feb 23 08:52:27 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.
Feb 23 08:52:27 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.
Feb 23 08:52:27 np0005626463.localdomain podman[102989]: 2026-02-23 08:52:27.929172701 +0000 UTC m=+0.101341179 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:34:43Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, container_name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, release=1766032510, io.buildah.version=1.41.5, vendor=Red Hat, Inc., org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, url=https://www.redhat.com, 
batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-iscsid, tcib_managed=true, version=17.1.13, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, vcs-ref=705339545363fec600102567c4e923938e0f43b3, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step3, distribution-scope=public, managed_by=tripleo_ansible, build-date=2026-01-12T22:34:43Z)
Feb 23 08:52:27 np0005626463.localdomain podman[102989]: 2026-02-23 08:52:27.943287283 +0000 UTC m=+0.115455781 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, distribution-scope=public, io.openshift.expose-services=, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, release=1766032510, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13, com.redhat.component=openstack-iscsid-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, 
name=rhosp-rhel9/openstack-iscsid, vendor=Red Hat, Inc., architecture=x86_64, managed_by=tripleo_ansible, build-date=2026-01-12T22:34:43Z, summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, vcs-ref=705339545363fec600102567c4e923938e0f43b3, maintainer=OpenStack TripleO Team, container_name=iscsid, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid)
Feb 23 08:52:27 np0005626463.localdomain systemd[1]: tmp-crun.h4aXIi.mount: Deactivated successfully.
Feb 23 08:52:27 np0005626463.localdomain podman[102990]: 2026-02-23 08:52:27.989854762 +0000 UTC m=+0.158005447 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.openshift.expose-services=, container_name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, version=17.1.13, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, name=rhosp-rhel9/openstack-ceilometer-compute, vendor=Red Hat, Inc., url=https://www.redhat.com, build-date=2026-01-12T23:07:47Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, vcs-type=git, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-compute-container, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, architecture=x86_64, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z)
Feb 23 08:52:28 np0005626463.localdomain systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully.
Feb 23 08:52:28 np0005626463.localdomain podman[102998]: 2026-02-23 08:52:28.090777807 +0000 UTC m=+0.248670208 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, distribution-scope=public, release=1766032510, vendor=Red Hat, Inc., container_name=nova_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2026-01-12T23:32:04Z, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, batch=17.1_20260112.1, config_id=tripleo_step5, vcs-type=git, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 23 08:52:28 np0005626463.localdomain podman[102990]: 2026-02-23 08:52:28.095332996 +0000 UTC m=+0.263483681 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, version=17.1.13, url=https://www.redhat.com, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-ceilometer-compute-container, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-ceilometer-compute, build-date=2026-01-12T23:07:47Z, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
tcib_managed=true, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, release=1766032510, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, container_name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 23 08:52:28 np0005626463.localdomain podman[102992]: 2026-02-23 08:52:28.10588394 +0000 UTC m=+0.266633148 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, container_name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:10:15Z, description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20260112.1, release=1766032510, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.5, com.redhat.component=openstack-cron-container, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, distribution-scope=public, tcib_managed=true, version=17.1.13, vendor=Red Hat, Inc., io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron)
Feb 23 08:52:28 np0005626463.localdomain systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully.
Feb 23 08:52:28 np0005626463.localdomain podman[102992]: 2026-02-23 08:52:28.118340883 +0000 UTC m=+0.279090031 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, architecture=x86_64, batch=17.1_20260112.1, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, managed_by=tripleo_ansible, container_name=logrotate_crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, io.openshift.expose-services=, vendor=Red Hat, Inc., config_id=tripleo_step4, build-date=2026-01-12T22:10:15Z, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-cron, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-cron-container, maintainer=OpenStack TripleO Team, vcs-type=git, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron)
Feb 23 08:52:28 np0005626463.localdomain podman[102998]: 2026-02-23 08:52:28.127341868 +0000 UTC m=+0.285234239 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, release=1766032510, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', 
'/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, name=rhosp-rhel9/openstack-nova-compute, io.openshift.expose-services=, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step5, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, build-date=2026-01-12T23:32:04Z, container_name=nova_compute, distribution-scope=public, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com)
Feb 23 08:52:28 np0005626463.localdomain systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully.
Feb 23 08:52:28 np0005626463.localdomain systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Deactivated successfully.
Feb 23 08:52:28 np0005626463.localdomain podman[102991]: 2026-02-23 08:52:28.07330404 +0000 UTC m=+0.237324818 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, version=17.1.13, config_id=tripleo_step4, name=rhosp-rhel9/openstack-ceilometer-ipmi, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, com.redhat.component=openstack-ceilometer-ipmi-container, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, io.openshift.expose-services=, release=1766032510, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:07:30Z, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, tcib_managed=true)
Feb 23 08:52:28 np0005626463.localdomain podman[102991]: 2026-02-23 08:52:28.207481366 +0000 UTC m=+0.371502124 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, container_name=ceilometer_agent_ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, version=17.1.13, build-date=2026-01-12T23:07:30Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, url=https://www.redhat.com, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., release=1766032510, config_id=tripleo_step4, name=rhosp-rhel9/openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, vcs-type=git)
Feb 23 08:52:28 np0005626463.localdomain systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully.
Feb 23 08:52:28 np0005626463.localdomain systemd[1]: tmp-crun.dvkcaG.mount: Deactivated successfully.
Feb 23 08:52:29 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.
Feb 23 08:52:29 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.
Feb 23 08:52:29 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.
Feb 23 08:52:29 np0005626463.localdomain podman[103103]: 2026-02-23 08:52:29.90609306 +0000 UTC m=+0.078179499 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, batch=17.1_20260112.1, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, distribution-scope=public, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, url=https://www.redhat.com, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:56:19Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1766032510, build-date=2026-01-12T22:56:19Z, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, version=17.1.13, tcib_managed=true, container_name=ovn_metadata_agent)
Feb 23 08:52:29 np0005626463.localdomain podman[103102]: 2026-02-23 08:52:29.958991313 +0000 UTC m=+0.132962010 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, release=1766032510, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, version=17.1.13, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, tcib_managed=true, architecture=x86_64, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, managed_by=tripleo_ansible, 
org.opencontainers.image.created=2026-01-12T22:36:40Z, container_name=ovn_controller, vendor=Red Hat, Inc., com.redhat.component=openstack-ovn-controller-container, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:36:40Z, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 23 08:52:29 np0005626463.localdomain podman[103103]: 2026-02-23 08:52:29.953437122 +0000 UTC m=+0.125523521 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, batch=17.1_20260112.1, build-date=2026-01-12T22:56:19Z, vcs-type=git, url=https://www.redhat.com, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, release=1766032510, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, architecture=x86_64, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, config_id=tripleo_step4, container_name=ovn_metadata_agent, io.buildah.version=1.41.5, version=17.1.13, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.component=openstack-neutron-metadata-agent-ovn-container)
Feb 23 08:52:29 np0005626463.localdomain podman[103102]: 2026-02-23 08:52:29.980216413 +0000 UTC m=+0.154187060 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, release=1766032510, architecture=x86_64, container_name=ovn_controller, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, build-date=2026-01-12T22:36:40Z, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-ovn-controller, vcs-type=git, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, batch=17.1_20260112.1, config_id=tripleo_step4, io.openshift.expose-services=, tcib_managed=true, url=https://www.redhat.com, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 
17.1 ovn-controller, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, version=17.1.13, com.redhat.component=openstack-ovn-controller-container)
Feb 23 08:52:29 np0005626463.localdomain podman[103102]: unhealthy
Feb 23 08:52:29 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Main process exited, code=exited, status=1/FAILURE
Feb 23 08:52:29 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Failed with result 'exit-code'.
Feb 23 08:52:30 np0005626463.localdomain podman[103103]: unhealthy
Feb 23 08:52:30 np0005626463.localdomain systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Main process exited, code=exited, status=1/FAILURE
Feb 23 08:52:30 np0005626463.localdomain systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Failed with result 'exit-code'.
Feb 23 08:52:30 np0005626463.localdomain podman[103104]: 2026-02-23 08:52:30.081042235 +0000 UTC m=+0.246013986 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, build-date=2026-01-12T22:10:14Z, vcs-type=git, batch=17.1_20260112.1, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, distribution-scope=public, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, tcib_managed=true, config_id=tripleo_step1, container_name=metrics_qdr, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, managed_by=tripleo_ansible, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:10:14Z, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13)
Feb 23 08:52:30 np0005626463.localdomain podman[103104]: 2026-02-23 08:52:30.277415238 +0000 UTC m=+0.442386979 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, build-date=2026-01-12T22:10:14Z, org.opencontainers.image.created=2026-01-12T22:10:14Z, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.buildah.version=1.41.5, io.openshift.expose-services=, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, container_name=metrics_qdr, maintainer=OpenStack TripleO Team, release=1766032510, description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, managed_by=tripleo_ansible, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee)
Feb 23 08:52:30 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.
Feb 23 08:52:30 np0005626463.localdomain systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully.
Feb 23 08:52:30 np0005626463.localdomain podman[103169]: 2026-02-23 08:52:30.381719526 +0000 UTC m=+0.077614600 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, build-date=2026-01-12T23:32:04Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:32:04Z, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, com.redhat.component=openstack-nova-compute-container, release=1766032510, io.buildah.version=1.41.5, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, version=17.1.13, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step4, container_name=nova_migration_target, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com)
Feb 23 08:52:30 np0005626463.localdomain podman[103169]: 2026-02-23 08:52:30.780709233 +0000 UTC m=+0.476604297 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:32:04Z, version=17.1.13, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, name=rhosp-rhel9/openstack-nova-compute, distribution-scope=public, vendor=Red Hat, Inc., batch=17.1_20260112.1, managed_by=tripleo_ansible, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, container_name=nova_migration_target, vcs-type=git, build-date=2026-01-12T23:32:04Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 23 08:52:30 np0005626463.localdomain systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully.
Feb 23 08:52:43 np0005626463.localdomain sshd[103193]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:52:43 np0005626463.localdomain sshd[103193]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:52:49 np0005626463.localdomain sshd[103195]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:52:50 np0005626463.localdomain sshd[103195]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:52:50 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.
Feb 23 08:52:50 np0005626463.localdomain podman[103197]: 2026-02-23 08:52:50.536177946 +0000 UTC m=+0.097425289 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, build-date=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, com.redhat.component=openstack-collectd-container, container_name=collectd, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:10:15Z, release=1766032510, tcib_managed=true, batch=17.1_20260112.1, version=17.1.13, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, name=rhosp-rhel9/openstack-collectd, io.buildah.version=1.41.5, managed_by=tripleo_ansible, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vendor=Red Hat, Inc., config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 collectd)
Feb 23 08:52:50 np0005626463.localdomain podman[103197]: 2026-02-23 08:52:50.551385452 +0000 UTC m=+0.112632835 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, io.buildah.version=1.41.5, batch=17.1_20260112.1, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, architecture=x86_64, vcs-type=git, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, name=rhosp-rhel9/openstack-collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, version=17.1.13, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vendor=Red Hat, Inc., org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, build-date=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible)
Feb 23 08:52:50 np0005626463.localdomain systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully.
Feb 23 08:52:58 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.
Feb 23 08:52:58 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.
Feb 23 08:52:58 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.
Feb 23 08:52:58 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.
Feb 23 08:52:58 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.
Feb 23 08:52:58 np0005626463.localdomain systemd[1]: tmp-crun.VMa3gl.mount: Deactivated successfully.
Feb 23 08:52:58 np0005626463.localdomain podman[103218]: 2026-02-23 08:52:58.966936119 +0000 UTC m=+0.102733771 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, url=https://www.redhat.com, tcib_managed=true, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack 
Platform 17.1 ceilometer-compute, vcs-type=git, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1766032510, config_id=tripleo_step4, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, architecture=x86_64, com.redhat.component=openstack-ceilometer-compute-container, name=rhosp-rhel9/openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T23:07:47Z, version=17.1.13, build-date=2026-01-12T23:07:47Z, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 23 08:52:59 np0005626463.localdomain podman[103217]: 2026-02-23 08:52:59.017427638 +0000 UTC m=+0.155205551 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, version=17.1.13, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, vcs-type=git, url=https://www.redhat.com, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, vcs-ref=705339545363fec600102567c4e923938e0f43b3, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, vendor=Red Hat, Inc., com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, build-date=2026-01-12T22:34:43Z, container_name=iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z)
Feb 23 08:52:59 np0005626463.localdomain podman[103218]: 2026-02-23 08:52:59.018279754 +0000 UTC m=+0.154077396 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, batch=17.1_20260112.1, io.buildah.version=1.41.5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., version=17.1.13, com.redhat.component=openstack-ceilometer-compute-container, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, architecture=x86_64, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, build-date=2026-01-12T23:07:47Z, maintainer=OpenStack TripleO Team, tcib_managed=true, release=1766032510, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 23 08:52:59 np0005626463.localdomain systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully.
Feb 23 08:52:59 np0005626463.localdomain podman[103229]: 2026-02-23 08:52:59.075378595 +0000 UTC m=+0.196790716 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:32:04Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, io.openshift.expose-services=, url=https://www.redhat.com, architecture=x86_64, distribution-scope=public, build-date=2026-01-12T23:32:04Z, batch=17.1_20260112.1, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5)
Feb 23 08:52:59 np0005626463.localdomain podman[103220]: 2026-02-23 08:52:59.123448049 +0000 UTC m=+0.250426390 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, com.redhat.component=openstack-cron-container, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 cron, release=1766032510, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., managed_by=tripleo_ansible, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, container_name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, build-date=2026-01-12T22:10:15Z, vcs-type=git, tcib_managed=true, config_id=tripleo_step4, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-cron)
Feb 23 08:52:59 np0005626463.localdomain podman[103220]: 2026-02-23 08:52:59.138378978 +0000 UTC m=+0.265357369 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, org.opencontainers.image.created=2026-01-12T22:10:15Z, release=1766032510, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, distribution-scope=public, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, container_name=logrotate_crond, url=https://www.redhat.com, build-date=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, version=17.1.13, config_id=tripleo_step4, io.buildah.version=1.41.5, com.redhat.component=openstack-cron-container, vendor=Red Hat, Inc., vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, description=Red Hat OpenStack Platform 17.1 cron, name=rhosp-rhel9/openstack-cron, architecture=x86_64)
Feb 23 08:52:59 np0005626463.localdomain systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully.
Feb 23 08:52:59 np0005626463.localdomain podman[103217]: 2026-02-23 08:52:59.153751999 +0000 UTC m=+0.291529932 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-iscsid, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=iscsid, architecture=x86_64, distribution-scope=public, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:34:43Z, vcs-type=git, vcs-ref=705339545363fec600102567c4e923938e0f43b3, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:34:43Z, version=17.1.13, tcib_managed=true, vendor=Red Hat, Inc., io.buildah.version=1.41.5, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team)
Feb 23 08:52:59 np0005626463.localdomain systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully.
Feb 23 08:52:59 np0005626463.localdomain podman[103229]: 2026-02-23 08:52:59.180618763 +0000 UTC m=+0.302030834 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, tcib_managed=true, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, batch=17.1_20260112.1, container_name=nova_compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, io.buildah.version=1.41.5, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, vcs-type=git, version=17.1.13, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-nova-compute, build-date=2026-01-12T23:32:04Z)
Feb 23 08:52:59 np0005626463.localdomain systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Deactivated successfully.
Feb 23 08:52:59 np0005626463.localdomain podman[103219]: 2026-02-23 08:52:59.215997468 +0000 UTC m=+0.345889939 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, org.opencontainers.image.created=2026-01-12T23:07:30Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, build-date=2026-01-12T23:07:30Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
version=17.1.13, distribution-scope=public, release=1766032510, vcs-type=git, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-ceilometer-ipmi, batch=17.1_20260112.1, architecture=x86_64, com.redhat.component=openstack-ceilometer-ipmi-container, io.buildah.version=1.41.5, container_name=ceilometer_agent_ipmi, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, config_id=tripleo_step4)
Feb 23 08:52:59 np0005626463.localdomain podman[103219]: 2026-02-23 08:52:59.276436882 +0000 UTC m=+0.406329343 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, container_name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-type=git, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-ceilometer-ipmi, batch=17.1_20260112.1, distribution-scope=public, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T23:07:30Z, build-date=2026-01-12T23:07:30Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., io.buildah.version=1.41.5, config_id=tripleo_step4, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, url=https://www.redhat.com, version=17.1.13, com.redhat.component=openstack-ceilometer-ipmi-container)
Feb 23 08:52:59 np0005626463.localdomain systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully.
Feb 23 08:52:59 np0005626463.localdomain systemd[1]: tmp-crun.z7zB3M.mount: Deactivated successfully.
Feb 23 08:53:00 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.
Feb 23 08:53:00 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.
Feb 23 08:53:00 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.
Feb 23 08:53:00 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.
Feb 23 08:53:00 np0005626463.localdomain systemd[1]: tmp-crun.6q3YG5.mount: Deactivated successfully.
Feb 23 08:53:00 np0005626463.localdomain podman[103333]: 2026-02-23 08:53:00.972005832 +0000 UTC m=+0.143530082 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, release=1766032510, distribution-scope=public, org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, build-date=2026-01-12T23:32:04Z, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step4, architecture=x86_64, name=rhosp-rhel9/openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']})
Feb 23 08:53:00 np0005626463.localdomain systemd[1]: tmp-crun.aD7s0X.mount: Deactivated successfully.
Feb 23 08:53:01 np0005626463.localdomain podman[103335]: 2026-02-23 08:53:00.997257777 +0000 UTC m=+0.161416231 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, config_id=tripleo_step4, vendor=Red Hat, Inc., build-date=2026-01-12T22:56:19Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1766032510, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, container_name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, version=17.1.13, io.buildah.version=1.41.5, distribution-scope=public, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 
'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Feb 23 08:53:01 np0005626463.localdomain podman[103334]: 2026-02-23 08:53:01.038308576 +0000 UTC m=+0.208496876 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, architecture=x86_64, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.expose-services=, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, build-date=2026-01-12T22:36:40Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, config_id=tripleo_step4, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.41.5, container_name=ovn_controller, 
org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-ovn-controller, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ovn-controller)
Feb 23 08:53:01 np0005626463.localdomain podman[103334]: 2026-02-23 08:53:01.078021564 +0000 UTC m=+0.248209904 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:36:40Z, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-ovn-controller-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.5, version=17.1.13, vendor=Red Hat, Inc., vcs-type=git, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, 
io.openshift.expose-services=, tcib_managed=true, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, container_name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com)
Feb 23 08:53:01 np0005626463.localdomain podman[103334]: unhealthy
Feb 23 08:53:01 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Main process exited, code=exited, status=1/FAILURE
Feb 23 08:53:01 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Failed with result 'exit-code'.
Feb 23 08:53:01 np0005626463.localdomain podman[103336]: 2026-02-23 08:53:01.079487678 +0000 UTC m=+0.240828656 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=metrics_qdr, maintainer=OpenStack TripleO Team, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vendor=Red Hat, Inc., tcib_managed=true, config_id=tripleo_step1, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:10:14Z, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', 
'/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, managed_by=tripleo_ansible, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-qdrouterd, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, distribution-scope=public, com.redhat.component=openstack-qdrouterd-container, batch=17.1_20260112.1, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13)
Feb 23 08:53:01 np0005626463.localdomain podman[103335]: 2026-02-23 08:53:01.135527908 +0000 UTC m=+0.299686282 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, version=17.1.13, managed_by=tripleo_ansible, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, 
architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:56:19Z, build-date=2026-01-12T22:56:19Z, batch=17.1_20260112.1, distribution-scope=public, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=ovn_metadata_agent, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, tcib_managed=true)
Feb 23 08:53:01 np0005626463.localdomain podman[103335]: unhealthy
Feb 23 08:53:01 np0005626463.localdomain systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Main process exited, code=exited, status=1/FAILURE
Feb 23 08:53:01 np0005626463.localdomain systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Failed with result 'exit-code'.
Feb 23 08:53:01 np0005626463.localdomain podman[103336]: 2026-02-23 08:53:01.272346143 +0000 UTC m=+0.433687101 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, name=rhosp-rhel9/openstack-qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, architecture=x86_64, container_name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, release=1766032510, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:14Z, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, config_id=tripleo_step1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, tcib_managed=true, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z)
Feb 23 08:53:01 np0005626463.localdomain systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully.
Feb 23 08:53:01 np0005626463.localdomain podman[103333]: 2026-02-23 08:53:01.362298762 +0000 UTC m=+0.533822972 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.expose-services=, url=https://www.redhat.com, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., container_name=nova_migration_target, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, architecture=x86_64, tcib_managed=true, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-nova-compute, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, config_id=tripleo_step4, com.redhat.component=openstack-nova-compute-container)
Feb 23 08:53:01 np0005626463.localdomain systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully.
Feb 23 08:53:04 np0005626463.localdomain sudo[103419]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 08:53:04 np0005626463.localdomain sudo[103419]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:53:04 np0005626463.localdomain sudo[103419]: pam_unix(sudo:session): session closed for user root
Feb 23 08:53:04 np0005626463.localdomain sudo[103434]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Feb 23 08:53:04 np0005626463.localdomain sudo[103434]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:53:05 np0005626463.localdomain systemd[1]: tmp-crun.EyVt2I.mount: Deactivated successfully.
Feb 23 08:53:05 np0005626463.localdomain podman[103519]: 2026-02-23 08:53:05.500062032 +0000 UTC m=+0.103264997 container exec fdf07215f0388d0ebc44f1f3744080ba594441e647c300d0dade62ff5beba234 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-crash-np0005626463, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2026-02-09T10:25:24Z, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., version=7, io.buildah.version=1.42.2, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, vcs-type=git, GIT_CLEAN=True, release=1770267347, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, io.openshift.expose-services=, org.opencontainers.image.created=2026-02-09T10:25:24Z, architecture=x86_64, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, ceph=True)
Feb 23 08:53:05 np0005626463.localdomain podman[103519]: 2026-02-23 08:53:05.60431908 +0000 UTC m=+0.207522015 container exec_died fdf07215f0388d0ebc44f1f3744080ba594441e647c300d0dade62ff5beba234 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-crash-np0005626463, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.created=2026-02-09T10:25:24Z, release=1770267347, version=7, GIT_CLEAN=True, build-date=2026-02-09T10:25:24Z, distribution-scope=public, RELEASE=main, vendor=Red Hat, Inc., vcs-type=git, description=Red Hat Ceph Storage 7, io.buildah.version=1.42.2, io.openshift.expose-services=, GIT_BRANCH=main, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph)
Feb 23 08:53:05 np0005626463.localdomain sudo[103434]: pam_unix(sudo:session): session closed for user root
Feb 23 08:53:06 np0005626463.localdomain sudo[103588]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 08:53:06 np0005626463.localdomain sudo[103588]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:53:06 np0005626463.localdomain sudo[103588]: pam_unix(sudo:session): session closed for user root
Feb 23 08:53:06 np0005626463.localdomain sudo[103603]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 23 08:53:06 np0005626463.localdomain sudo[103603]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:53:06 np0005626463.localdomain sudo[103603]: pam_unix(sudo:session): session closed for user root
Feb 23 08:53:09 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:19:01:95 MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.104 DST=192.168.122.106 LEN=40 TOS=0x00 PREC=0x00 TTL=64 ID=0 DF PROTO=TCP SPT=5668 DPT=46644 SEQ=0 ACK=4038960630 WINDOW=0 RES=0x00 ACK RST URGP=0 
Feb 23 08:53:10 np0005626463.localdomain sudo[103649]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 08:53:10 np0005626463.localdomain sudo[103649]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:53:10 np0005626463.localdomain sudo[103649]: pam_unix(sudo:session): session closed for user root
Feb 23 08:53:20 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.
Feb 23 08:53:20 np0005626463.localdomain podman[103664]: 2026-02-23 08:53:20.92775361 +0000 UTC m=+0.096605274 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, name=rhosp-rhel9/openstack-collectd, config_id=tripleo_step3, com.redhat.component=openstack-collectd-container, vendor=Red Hat, Inc., url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, io.buildah.version=1.41.5, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:10:15Z, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, tcib_managed=true, batch=17.1_20260112.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, build-date=2026-01-12T22:10:15Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, architecture=x86_64)
Feb 23 08:53:20 np0005626463.localdomain podman[103664]: 2026-02-23 08:53:20.941051097 +0000 UTC m=+0.109902761 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, maintainer=OpenStack TripleO Team, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, vendor=Red Hat, 
Inc., name=rhosp-rhel9/openstack-collectd, tcib_managed=true, build-date=2026-01-12T22:10:15Z, distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, config_id=tripleo_step3, architecture=x86_64, com.redhat.component=openstack-collectd-container, io.buildah.version=1.41.5)
Feb 23 08:53:20 np0005626463.localdomain systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully.
Feb 23 08:53:24 np0005626463.localdomain sshd[103685]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:53:24 np0005626463.localdomain sshd[103685]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:53:29 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.
Feb 23 08:53:29 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.
Feb 23 08:53:29 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.
Feb 23 08:53:29 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.
Feb 23 08:53:29 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.
Feb 23 08:53:29 np0005626463.localdomain systemd[1]: tmp-crun.QhMp8p.mount: Deactivated successfully.
Feb 23 08:53:29 np0005626463.localdomain systemd[1]: tmp-crun.r0vSpG.mount: Deactivated successfully.
Feb 23 08:53:29 np0005626463.localdomain podman[103688]: 2026-02-23 08:53:29.932733232 +0000 UTC m=+0.100164253 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=ceilometer_agent_compute, version=17.1.13, distribution-scope=public, name=rhosp-rhel9/openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, managed_by=tripleo_ansible, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T23:07:47Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, vcs-type=git, io.buildah.version=1.41.5, config_id=tripleo_step4, vendor=Red Hat, Inc., vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:47Z, batch=17.1_20260112.1, architecture=x86_64)
Feb 23 08:53:29 np0005626463.localdomain podman[103696]: 2026-02-23 08:53:29.942791951 +0000 UTC m=+0.100616927 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.buildah.version=1.41.5, build-date=2026-01-12T23:32:04Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, config_id=tripleo_step5, org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, release=1766032510, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, batch=17.1_20260112.1, distribution-scope=public, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, container_name=nova_compute, tcib_managed=true)
Feb 23 08:53:29 np0005626463.localdomain podman[103688]: 2026-02-23 08:53:29.957182642 +0000 UTC m=+0.124613603 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:07:47Z, tcib_managed=true, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, 
vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, release=1766032510, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:07:47Z, distribution-scope=public, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, version=17.1.13, architecture=x86_64, container_name=ceilometer_agent_compute, managed_by=tripleo_ansible, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute)
Feb 23 08:53:29 np0005626463.localdomain systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully.
Feb 23 08:53:29 np0005626463.localdomain podman[103696]: 2026-02-23 08:53:29.9874356 +0000 UTC m=+0.145260596 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, managed_by=tripleo_ansible, architecture=x86_64, config_id=tripleo_step5, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-nova-compute, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, build-date=2026-01-12T23:32:04Z, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 
'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, org.opencontainers.image.created=2026-01-12T23:32:04Z, batch=17.1_20260112.1)
Feb 23 08:53:29 np0005626463.localdomain systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Deactivated successfully.
Feb 23 08:53:30 np0005626463.localdomain podman[103687]: 2026-02-23 08:53:29.911905333 +0000 UTC m=+0.083022767 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:34:43Z, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, build-date=2026-01-12T22:34:43Z, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, vcs-ref=705339545363fec600102567c4e923938e0f43b3, architecture=x86_64, name=rhosp-rhel9/openstack-iscsid, batch=17.1_20260112.1, release=1766032510, summary=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, vcs-type=git, container_name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, io.buildah.version=1.41.5)
Feb 23 08:53:30 np0005626463.localdomain podman[103690]: 2026-02-23 08:53:30.02362395 +0000 UTC m=+0.182897281 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, org.opencontainers.image.created=2026-01-12T22:10:15Z, container_name=logrotate_crond, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, build-date=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, vcs-type=git, config_id=tripleo_step4, release=1766032510, com.redhat.component=openstack-cron-container, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.openshift.expose-services=)
Feb 23 08:53:30 np0005626463.localdomain podman[103687]: 2026-02-23 08:53:30.044249922 +0000 UTC m=+0.215367376 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, release=1766032510, summary=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.openshift.expose-services=, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, url=https://www.redhat.com, config_id=tripleo_step3, org.opencontainers.image.created=2026-01-12T22:34:43Z, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:34:43Z, description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, vcs-ref=705339545363fec600102567c4e923938e0f43b3, architecture=x86_64, container_name=iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.buildah.version=1.41.5, com.redhat.component=openstack-iscsid-container, distribution-scope=public, version=17.1.13)
Feb 23 08:53:30 np0005626463.localdomain systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully.
Feb 23 08:53:30 np0005626463.localdomain podman[103689]: 2026-02-23 08:53:30.087449316 +0000 UTC m=+0.250628907 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13, build-date=2026-01-12T23:07:30Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
com.redhat.component=openstack-ceilometer-ipmi-container, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, vendor=Red Hat, Inc., io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, distribution-scope=public, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, container_name=ceilometer_agent_ipmi, architecture=x86_64, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, tcib_managed=true, config_id=tripleo_step4)
Feb 23 08:53:30 np0005626463.localdomain podman[103690]: 2026-02-23 08:53:30.109646047 +0000 UTC m=+0.268919438 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-cron, batch=17.1_20260112.1, distribution-scope=public, tcib_managed=true, architecture=x86_64, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, managed_by=tripleo_ansible, version=17.1.13, url=https://www.redhat.com, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, vendor=Red Hat, Inc., org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=logrotate_crond, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, org.opencontainers.image.created=2026-01-12T22:10:15Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:10:15Z, config_id=tripleo_step4, com.redhat.component=openstack-cron-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron)
Feb 23 08:53:30 np0005626463.localdomain systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully.
Feb 23 08:53:30 np0005626463.localdomain podman[103689]: 2026-02-23 08:53:30.137725468 +0000 UTC m=+0.300905059 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, tcib_managed=true, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-ipmi-container, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, config_id=tripleo_step4, summary=Red Hat OpenStack 
Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, architecture=x86_64, build-date=2026-01-12T23:07:30Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.created=2026-01-12T23:07:30Z, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, version=17.1.13, vendor=Red Hat, Inc., io.buildah.version=1.41.5)
Feb 23 08:53:30 np0005626463.localdomain systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully.
Feb 23 08:53:31 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.
Feb 23 08:53:31 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.
Feb 23 08:53:31 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.
Feb 23 08:53:31 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.
Feb 23 08:53:31 np0005626463.localdomain podman[103803]: 2026-02-23 08:53:31.932883283 +0000 UTC m=+0.095083857 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, org.opencontainers.image.created=2026-01-12T22:56:19Z, container_name=ovn_metadata_agent, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, tcib_managed=true, batch=17.1_20260112.1, io.openshift.expose-services=, architecture=x86_64, url=https://www.redhat.com, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, build-date=2026-01-12T22:56:19Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.41.5, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn)
Feb 23 08:53:31 np0005626463.localdomain podman[103803]: 2026-02-23 08:53:31.944712266 +0000 UTC m=+0.106912910 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, version=17.1.13, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, batch=17.1_20260112.1, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:56:19Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, url=https://www.redhat.com, release=1766032510, io.buildah.version=1.41.5, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, tcib_managed=true, managed_by=tripleo_ansible, com.redhat.component=openstack-neutron-metadata-agent-ovn-container)
Feb 23 08:53:31 np0005626463.localdomain podman[103803]: unhealthy
Feb 23 08:53:31 np0005626463.localdomain systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Main process exited, code=exited, status=1/FAILURE
Feb 23 08:53:31 np0005626463.localdomain systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Failed with result 'exit-code'.
Feb 23 08:53:31 np0005626463.localdomain podman[103802]: 2026-02-23 08:53:31.993182123 +0000 UTC m=+0.154921403 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vendor=Red Hat, Inc., managed_by=tripleo_ansible, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, summary=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, io.buildah.version=1.41.5, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, tcib_managed=true, version=17.1.13, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:36:40Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, 
release=1766032510, io.openshift.expose-services=, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:36:40Z, com.redhat.component=openstack-ovn-controller-container, url=https://www.redhat.com, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-ovn-controller)
Feb 23 08:53:32 np0005626463.localdomain podman[103802]: 2026-02-23 08:53:32.005795369 +0000 UTC m=+0.167534609 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, build-date=2026-01-12T22:36:40Z, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, url=https://www.redhat.com, vendor=Red Hat, Inc., tcib_managed=true, com.redhat.component=openstack-ovn-controller-container, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:36:40Z, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git)
Feb 23 08:53:32 np0005626463.localdomain podman[103802]: unhealthy
Feb 23 08:53:32 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Main process exited, code=exited, status=1/FAILURE
Feb 23 08:53:32 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Failed with result 'exit-code'.
Feb 23 08:53:32 np0005626463.localdomain podman[103801]: 2026-02-23 08:53:32.149224028 +0000 UTC m=+0.318101016 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-type=git, url=https://www.redhat.com, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.13, name=rhosp-rhel9/openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, container_name=nova_migration_target, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1)
Feb 23 08:53:32 np0005626463.localdomain podman[103805]: 2026-02-23 08:53:32.064808679 +0000 UTC m=+0.219956416 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, org.opencontainers.image.created=2026-01-12T22:10:14Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, io.buildah.version=1.41.5, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, build-date=2026-01-12T22:10:14Z, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_id=tripleo_step1, name=rhosp-rhel9/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc., org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, version=17.1.13, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, vcs-type=git)
Feb 23 08:53:32 np0005626463.localdomain podman[103805]: 2026-02-23 08:53:32.262374769 +0000 UTC m=+0.417522576 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., io.buildah.version=1.41.5, managed_by=tripleo_ansible, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:10:14Z, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, release=1766032510, org.opencontainers.image.created=2026-01-12T22:10:14Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, container_name=metrics_qdr, name=rhosp-rhel9/openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, tcib_managed=true, config_id=tripleo_step1)
Feb 23 08:53:32 np0005626463.localdomain systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully.
Feb 23 08:53:32 np0005626463.localdomain podman[103801]: 2026-02-23 08:53:32.538418154 +0000 UTC m=+0.707295172 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, managed_by=tripleo_ansible, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, container_name=nova_migration_target, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-compute, summary=Red Hat 
OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc.)
Feb 23 08:53:32 np0005626463.localdomain systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully.
Feb 23 08:53:32 np0005626463.localdomain systemd[1]: tmp-crun.Dd0QJg.mount: Deactivated successfully.
Feb 23 08:53:35 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:19:01:95 MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.104 DST=192.168.122.106 LEN=40 TOS=0x00 PREC=0x00 TTL=64 ID=0 DF PROTO=TCP SPT=5668 DPT=57016 SEQ=0 ACK=1742402586 WINDOW=0 RES=0x00 ACK RST URGP=0 
Feb 23 08:53:41 np0005626463.localdomain sshd[103893]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:53:41 np0005626463.localdomain sshd[103893]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:53:51 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.
Feb 23 08:53:51 np0005626463.localdomain systemd[1]: tmp-crun.yPevok.mount: Deactivated successfully.
Feb 23 08:53:51 np0005626463.localdomain podman[103895]: 2026-02-23 08:53:51.919358553 +0000 UTC m=+0.094907712 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, batch=17.1_20260112.1, version=17.1.13, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red 
Hat, Inc., distribution-scope=public, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, build-date=2026-01-12T22:10:15Z, io.buildah.version=1.41.5, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step3, name=rhosp-rhel9/openstack-collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, architecture=x86_64)
Feb 23 08:53:51 np0005626463.localdomain podman[103895]: 2026-02-23 08:53:51.934245449 +0000 UTC m=+0.109794648 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-collectd, vcs-type=git, maintainer=OpenStack TripleO Team, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd, build-date=2026-01-12T22:10:15Z, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, summary=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, 
tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step3, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, com.redhat.component=openstack-collectd-container, version=17.1.13, architecture=x86_64, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:15Z)
Feb 23 08:53:51 np0005626463.localdomain systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully.
Feb 23 08:53:57 np0005626463.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Feb 23 08:53:57 np0005626463.localdomain recover_tripleo_nova_virtqemud[103917]: 61982
Feb 23 08:53:57 np0005626463.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Feb 23 08:53:57 np0005626463.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Feb 23 08:54:00 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.
Feb 23 08:54:00 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.
Feb 23 08:54:00 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.
Feb 23 08:54:00 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.
Feb 23 08:54:00 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.
Feb 23 08:54:00 np0005626463.localdomain systemd[1]: tmp-crun.xYSrKE.mount: Deactivated successfully.
Feb 23 08:54:00 np0005626463.localdomain podman[103918]: 2026-02-23 08:54:00.92618063 +0000 UTC m=+0.099076020 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vcs-type=git, build-date=2026-01-12T22:34:43Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, batch=17.1_20260112.1, managed_by=tripleo_ansible, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.openshift.expose-services=, url=https://www.redhat.com, version=17.1.13, com.redhat.component=openstack-iscsid-container, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, architecture=x86_64, container_name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, name=rhosp-rhel9/openstack-iscsid, config_id=tripleo_step3)
Feb 23 08:54:00 np0005626463.localdomain podman[103918]: 2026-02-23 08:54:00.933910077 +0000 UTC m=+0.106805417 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, tcib_managed=true, container_name=iscsid, managed_by=tripleo_ansible, release=1766032510, name=rhosp-rhel9/openstack-iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-iscsid-container, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, batch=17.1_20260112.1, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, config_id=tripleo_step3, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, vcs-ref=705339545363fec600102567c4e923938e0f43b3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', 
'/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, build-date=2026-01-12T22:34:43Z, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc.)
Feb 23 08:54:00 np0005626463.localdomain podman[103919]: 2026-02-23 08:54:00.97836812 +0000 UTC m=+0.149110374 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1766032510, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, tcib_managed=true, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git, managed_by=tripleo_ansible, config_id=tripleo_step4, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, url=https://www.redhat.com, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, distribution-scope=public, vendor=Red Hat, Inc., container_name=ceilometer_agent_compute, name=rhosp-rhel9/openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.13, build-date=2026-01-12T23:07:47Z, batch=17.1_20260112.1)
Feb 23 08:54:01 np0005626463.localdomain systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully.
Feb 23 08:54:01 np0005626463.localdomain podman[103932]: 2026-02-23 08:54:01.026379723 +0000 UTC m=+0.183788307 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, config_id=tripleo_step5, org.opencontainers.image.created=2026-01-12T23:32:04Z, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, container_name=nova_compute, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, url=https://www.redhat.com, distribution-scope=public, tcib_managed=true, name=rhosp-rhel9/openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, release=1766032510, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe)
Feb 23 08:54:01 np0005626463.localdomain podman[103920]: 2026-02-23 08:54:00.954258711 +0000 UTC m=+0.118625469 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:07:30Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, distribution-scope=public, build-date=2026-01-12T23:07:30Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, io.openshift.expose-services=, io.buildah.version=1.41.5, config_id=tripleo_step4, release=1766032510, version=17.1.13, batch=17.1_20260112.1, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Feb 23 08:54:01 np0005626463.localdomain podman[103932]: 2026-02-23 08:54:01.494109627 +0000 UTC m=+0.651518271 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, managed_by=tripleo_ansible, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, container_name=nova_compute, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2026-01-12T23:32:04Z, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-nova-compute, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, config_id=tripleo_step5, release=1766032510, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe)
Feb 23 08:54:01 np0005626463.localdomain systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Deactivated successfully.
Feb 23 08:54:01 np0005626463.localdomain podman[103919]: 2026-02-23 08:54:01.541669456 +0000 UTC m=+0.712411700 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.expose-services=, tcib_managed=true, maintainer=OpenStack TripleO Team, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.5, 
org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:07:47Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, name=rhosp-rhel9/openstack-ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, container_name=ceilometer_agent_compute, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-compute-container)
Feb 23 08:54:01 np0005626463.localdomain systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully.
Feb 23 08:54:01 np0005626463.localdomain podman[103921]: 2026-02-23 08:54:01.629546362 +0000 UTC m=+0.794021114 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.component=openstack-cron-container, io.openshift.expose-services=, tcib_managed=true, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO 
Team, summary=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, url=https://www.redhat.com, distribution-scope=public, release=1766032510, name=rhosp-rhel9/openstack-cron, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, build-date=2026-01-12T22:10:15Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=logrotate_crond)
Feb 23 08:54:01 np0005626463.localdomain podman[103921]: 2026-02-23 08:54:01.639669331 +0000 UTC m=+0.804144123 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, tcib_managed=true, url=https://www.redhat.com, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-cron-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., container_name=logrotate_crond, io.openshift.expose-services=, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:15Z, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:10:15Z, release=1766032510, io.buildah.version=1.41.5)
Feb 23 08:54:01 np0005626463.localdomain podman[103920]: 2026-02-23 08:54:01.647255844 +0000 UTC m=+0.811622632 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-ceilometer-ipmi, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1766032510, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T23:07:30Z, com.redhat.component=openstack-ceilometer-ipmi-container, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_ipmi, io.openshift.expose-services=, vendor=Red Hat, Inc., tcib_managed=true, distribution-scope=public, org.opencontainers.image.created=2026-01-12T23:07:30Z, config_id=tripleo_step4, managed_by=tripleo_ansible, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Feb 23 08:54:01 np0005626463.localdomain systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully.
Feb 23 08:54:01 np0005626463.localdomain systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully.
Feb 23 08:54:01 np0005626463.localdomain systemd[1]: tmp-crun.KPbhum.mount: Deactivated successfully.
Feb 23 08:54:02 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.
Feb 23 08:54:02 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.
Feb 23 08:54:02 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.
Feb 23 08:54:02 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.
Feb 23 08:54:03 np0005626463.localdomain podman[104035]: 2026-02-23 08:54:03.059067063 +0000 UTC m=+0.066387947 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, version=17.1.13, managed_by=tripleo_ansible, batch=17.1_20260112.1, architecture=x86_64, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, io.openshift.expose-services=, container_name=nova_migration_target, org.opencontainers.image.created=2026-01-12T23:32:04Z, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-compute)
Feb 23 08:54:03 np0005626463.localdomain podman[104037]: 2026-02-23 08:54:03.041925857 +0000 UTC m=+0.051855021 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.expose-services=, managed_by=tripleo_ansible, url=https://www.redhat.com, config_id=tripleo_step4, release=1766032510, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:56:19Z, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.13, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, build-date=2026-01-12T22:56:19Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, architecture=x86_64, distribution-scope=public, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5)
Feb 23 08:54:03 np0005626463.localdomain podman[104036]: 2026-02-23 08:54:03.093262382 +0000 UTC m=+0.103293139 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ovn-controller-container, description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-ovn-controller, url=https://www.redhat.com, release=1766032510, architecture=x86_64, vendor=Red Hat, Inc., version=17.1.13, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', 
'/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.expose-services=, build-date=2026-01-12T22:36:40Z, vcs-type=git, distribution-scope=public, maintainer=OpenStack TripleO Team)
Feb 23 08:54:03 np0005626463.localdomain podman[104037]: 2026-02-23 08:54:03.120143706 +0000 UTC m=+0.130072850 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, batch=17.1_20260112.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 
neutron-metadata-agent-ovn, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, io.buildah.version=1.41.5, build-date=2026-01-12T22:56:19Z, org.opencontainers.image.created=2026-01-12T22:56:19Z, architecture=x86_64, release=1766032510, io.openshift.expose-services=, vcs-type=git, tcib_managed=true, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, container_name=ovn_metadata_agent, managed_by=tripleo_ansible, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f)
Feb 23 08:54:03 np0005626463.localdomain podman[104037]: unhealthy
Feb 23 08:54:03 np0005626463.localdomain systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Main process exited, code=exited, status=1/FAILURE
Feb 23 08:54:03 np0005626463.localdomain systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Failed with result 'exit-code'.
Feb 23 08:54:03 np0005626463.localdomain podman[104036]: 2026-02-23 08:54:03.176195135 +0000 UTC m=+0.186225902 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., batch=17.1_20260112.1, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, container_name=ovn_controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2026-01-12T22:36:40Z, com.redhat.component=openstack-ovn-controller-container, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, vcs-type=git, version=17.1.13, distribution-scope=public, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, config_id=tripleo_step4, name=rhosp-rhel9/openstack-ovn-controller, 
org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ovn-controller)
Feb 23 08:54:03 np0005626463.localdomain podman[104036]: unhealthy
Feb 23 08:54:03 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Main process exited, code=exited, status=1/FAILURE
Feb 23 08:54:03 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Failed with result 'exit-code'.
Feb 23 08:54:03 np0005626463.localdomain podman[104038]: 2026-02-23 08:54:03.219037179 +0000 UTC m=+0.222757802 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-qdrouterd, release=1766032510, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, io.openshift.expose-services=, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:14Z, distribution-scope=public, build-date=2026-01-12T22:10:14Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, vendor=Red Hat, Inc., version=17.1.13, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team)
Feb 23 08:54:03 np0005626463.localdomain podman[104038]: 2026-02-23 08:54:03.386206944 +0000 UTC m=+0.389927567 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, managed_by=tripleo_ansible, io.openshift.expose-services=, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-qdrouterd-container, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:10:14Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, name=rhosp-rhel9/openstack-qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:10:14Z, vcs-type=git, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.41.5, container_name=metrics_qdr, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step1, url=https://www.redhat.com, batch=17.1_20260112.1, tcib_managed=true, vendor=Red Hat, Inc.)
Feb 23 08:54:03 np0005626463.localdomain systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully.
Feb 23 08:54:03 np0005626463.localdomain podman[104035]: 2026-02-23 08:54:03.423141628 +0000 UTC m=+0.430462512 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, name=rhosp-rhel9/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.openshift.expose-services=, vendor=Red Hat, Inc., container_name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, build-date=2026-01-12T23:32:04Z, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:32:04Z, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe)
Feb 23 08:54:03 np0005626463.localdomain systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully.
Feb 23 08:54:09 np0005626463.localdomain sshd[104126]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:54:10 np0005626463.localdomain sshd[104126]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:54:10 np0005626463.localdomain sudo[104128]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 08:54:10 np0005626463.localdomain sudo[104128]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:54:10 np0005626463.localdomain sudo[104128]: pam_unix(sudo:session): session closed for user root
Feb 23 08:54:10 np0005626463.localdomain sudo[104143]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 23 08:54:10 np0005626463.localdomain sudo[104143]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:54:11 np0005626463.localdomain sudo[104143]: pam_unix(sudo:session): session closed for user root
Feb 23 08:54:13 np0005626463.localdomain sudo[104190]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 08:54:13 np0005626463.localdomain sudo[104190]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:54:13 np0005626463.localdomain sudo[104190]: pam_unix(sudo:session): session closed for user root
Feb 23 08:54:22 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.
Feb 23 08:54:22 np0005626463.localdomain podman[104206]: 2026-02-23 08:54:22.910100246 +0000 UTC m=+0.084617055 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.openshift.expose-services=, distribution-scope=public, container_name=collectd, config_id=tripleo_step3, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', 
'/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, name=rhosp-rhel9/openstack-collectd, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, com.redhat.component=openstack-collectd-container, vendor=Red Hat, Inc., org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, release=1766032510, managed_by=tripleo_ansible, vcs-type=git, build-date=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd)
Feb 23 08:54:22 np0005626463.localdomain podman[104206]: 2026-02-23 08:54:22.924555008 +0000 UTC m=+0.099071807 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, version=17.1.13, config_id=tripleo_step3, io.buildah.version=1.41.5, managed_by=tripleo_ansible, build-date=2026-01-12T22:10:15Z, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, name=rhosp-rhel9/openstack-collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 23 08:54:22 np0005626463.localdomain systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully.
Feb 23 08:54:29 np0005626463.localdomain sshd[104226]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:54:31 np0005626463.localdomain sshd[104226]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:54:31 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.
Feb 23 08:54:31 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.
Feb 23 08:54:31 np0005626463.localdomain systemd[1]: tmp-crun.TAUEkQ.mount: Deactivated successfully.
Feb 23 08:54:31 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.
Feb 23 08:54:31 np0005626463.localdomain podman[104228]: 2026-02-23 08:54:31.684286108 +0000 UTC m=+0.144593189 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-ref=705339545363fec600102567c4e923938e0f43b3, architecture=x86_64, build-date=2026-01-12T22:34:43Z, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, config_id=tripleo_step3, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 iscsid, 
release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:34:43Z, vcs-type=git, vendor=Red Hat, Inc., org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, tcib_managed=true, com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, name=rhosp-rhel9/openstack-iscsid)
Feb 23 08:54:31 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.
Feb 23 08:54:31 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.
Feb 23 08:54:31 np0005626463.localdomain podman[104229]: 2026-02-23 08:54:31.655737676 +0000 UTC m=+0.114550860 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, name=rhosp-rhel9/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, build-date=2026-01-12T23:32:04Z, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., managed_by=tripleo_ansible, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.openshift.expose-services=, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step5, distribution-scope=public, vcs-type=git, container_name=nova_compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.buildah.version=1.41.5)
Feb 23 08:54:31 np0005626463.localdomain podman[104228]: 2026-02-23 08:54:31.72085824 +0000 UTC m=+0.181165321 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vcs-type=git, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 iscsid, build-date=2026-01-12T22:34:43Z, org.opencontainers.image.created=2026-01-12T22:34:43Z, distribution-scope=public, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, container_name=iscsid, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_id=tripleo_step3, io.buildah.version=1.41.5, vendor=Red Hat, Inc., architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-ref=705339545363fec600102567c4e923938e0f43b3, name=rhosp-rhel9/openstack-iscsid)
Feb 23 08:54:31 np0005626463.localdomain systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully.
Feb 23 08:54:31 np0005626463.localdomain podman[104259]: 2026-02-23 08:54:31.774961152 +0000 UTC m=+0.106982995 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, config_id=tripleo_step4, tcib_managed=true, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, vcs-type=git, version=17.1.13, com.redhat.component=openstack-ceilometer-compute-container, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vendor=Red Hat, Inc., distribution-scope=public, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.openshift.expose-services=, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_compute, build-date=2026-01-12T23:07:47Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, url=https://www.redhat.com)
Feb 23 08:54:31 np0005626463.localdomain podman[104279]: 2026-02-23 08:54:31.790773335 +0000 UTC m=+0.085828812 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.openshift.expose-services=, url=https://www.redhat.com, vcs-type=git, build-date=2026-01-12T23:07:30Z, name=rhosp-rhel9/openstack-ceilometer-ipmi, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, batch=17.1_20260112.1, distribution-scope=public, maintainer=OpenStack TripleO Team, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, architecture=x86_64, release=1766032510, org.opencontainers.image.created=2026-01-12T23:07:30Z, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Feb 23 08:54:31 np0005626463.localdomain podman[104229]: 2026-02-23 08:54:31.794072758 +0000 UTC m=+0.252885962 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, maintainer=OpenStack TripleO Team, build-date=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', 
'/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20260112.1, tcib_managed=true, vendor=Red Hat, Inc., version=17.1.13, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step5, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, architecture=x86_64, io.openshift.expose-services=, container_name=nova_compute, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, release=1766032510)
Feb 23 08:54:31 np0005626463.localdomain systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Deactivated successfully.
Feb 23 08:54:31 np0005626463.localdomain podman[104279]: 2026-02-23 08:54:31.821678781 +0000 UTC m=+0.116737158 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:07:30Z, io.buildah.version=1.41.5, com.redhat.component=openstack-ceilometer-ipmi-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, 
org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_ipmi, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T23:07:30Z, vcs-type=git, release=1766032510, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-ceilometer-ipmi, batch=17.1_20260112.1)
Feb 23 08:54:31 np0005626463.localdomain systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully.
Feb 23 08:54:31 np0005626463.localdomain podman[104259]: 2026-02-23 08:54:31.833114608 +0000 UTC m=+0.165136391 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, name=rhosp-rhel9/openstack-ceilometer-compute, architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:07:47Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, config_id=tripleo_step4, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, version=17.1.13, io.openshift.expose-services=, container_name=ceilometer_agent_compute, vendor=Red Hat, Inc., build-date=2026-01-12T23:07:47Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, managed_by=tripleo_ansible)
Feb 23 08:54:31 np0005626463.localdomain systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully.
Feb 23 08:54:31 np0005626463.localdomain podman[104280]: 2026-02-23 08:54:31.878004121 +0000 UTC m=+0.172714757 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, build-date=2026-01-12T22:10:15Z, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.buildah.version=1.41.5, container_name=logrotate_crond, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, config_id=tripleo_step4, name=rhosp-rhel9/openstack-cron, release=1766032510, vcs-type=git, com.redhat.component=openstack-cron-container, vendor=Red Hat, 
Inc., summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, managed_by=tripleo_ansible, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:15Z)
Feb 23 08:54:31 np0005626463.localdomain podman[104280]: 2026-02-23 08:54:31.888283262 +0000 UTC m=+0.182993848 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-cron-container, vcs-type=git, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, name=rhosp-rhel9/openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, build-date=2026-01-12T22:10:15Z, managed_by=tripleo_ansible, container_name=logrotate_crond, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, io.openshift.expose-services=, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:15Z, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, config_id=tripleo_step4, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, tcib_managed=true)
Feb 23 08:54:31 np0005626463.localdomain systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully.
Feb 23 08:54:33 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.
Feb 23 08:54:33 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.
Feb 23 08:54:33 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.
Feb 23 08:54:33 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.
Feb 23 08:54:33 np0005626463.localdomain podman[104347]: 2026-02-23 08:54:33.930343969 +0000 UTC m=+0.099697116 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:56:19Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, version=17.1.13, build-date=2026-01-12T22:56:19Z, 
container_name=ovn_metadata_agent, vcs-type=git, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.5, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, architecture=x86_64, distribution-scope=public, release=1766032510, maintainer=OpenStack TripleO Team, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Feb 23 08:54:33 np0005626463.localdomain systemd[1]: tmp-crun.Dx24rU.mount: Deactivated successfully.
Feb 23 08:54:33 np0005626463.localdomain podman[104348]: 2026-02-23 08:54:33.98127204 +0000 UTC m=+0.145765795 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=metrics_qdr, build-date=2026-01-12T22:10:14Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, batch=17.1_20260112.1, vendor=Red Hat, Inc., architecture=x86_64, config_id=tripleo_step1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.buildah.version=1.41.5, url=https://www.redhat.com, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, name=rhosp-rhel9/openstack-qdrouterd, version=17.1.13)
Feb 23 08:54:34 np0005626463.localdomain podman[104347]: 2026-02-23 08:54:34.063624394 +0000 UTC m=+0.232977531 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.openshift.expose-services=, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, build-date=2026-01-12T22:56:19Z, org.opencontainers.image.created=2026-01-12T22:56:19Z, url=https://www.redhat.com, batch=17.1_20260112.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., config_id=tripleo_step4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, container_name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, distribution-scope=public, version=17.1.13, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, architecture=x86_64, vcs-type=git)
Feb 23 08:54:34 np0005626463.localdomain podman[104347]: unhealthy
Feb 23 08:54:34 np0005626463.localdomain systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Main process exited, code=exited, status=1/FAILURE
Feb 23 08:54:34 np0005626463.localdomain systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Failed with result 'exit-code'.
Feb 23 08:54:34 np0005626463.localdomain podman[104346]: 2026-02-23 08:54:34.079690895 +0000 UTC m=+0.250114236 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.openshift.expose-services=, url=https://www.redhat.com, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2026-01-12T22:36:40Z, com.redhat.component=openstack-ovn-controller-container, vcs-type=git, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:36:40Z, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, tcib_managed=true, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, 
name=rhosp-rhel9/openstack-ovn-controller, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, container_name=ovn_controller, managed_by=tripleo_ansible, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ovn-controller)
Feb 23 08:54:34 np0005626463.localdomain podman[104345]: 2026-02-23 08:54:34.035796715 +0000 UTC m=+0.208938660 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.openshift.expose-services=, tcib_managed=true, architecture=x86_64, url=https://www.redhat.com, name=rhosp-rhel9/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, build-date=2026-01-12T23:32:04Z, batch=17.1_20260112.1, version=17.1.13, distribution-scope=public, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., release=1766032510, config_id=tripleo_step4, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']})
Feb 23 08:54:34 np0005626463.localdomain podman[104346]: 2026-02-23 08:54:34.129579484 +0000 UTC m=+0.300002825 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, url=https://www.redhat.com, distribution-scope=public, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, name=rhosp-rhel9/openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, maintainer=OpenStack TripleO Team, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, container_name=ovn_controller, build-date=2026-01-12T22:36:40Z, release=1766032510, version=17.1.13, batch=17.1_20260112.1)
Feb 23 08:54:34 np0005626463.localdomain podman[104346]: unhealthy
Feb 23 08:54:34 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Main process exited, code=exited, status=1/FAILURE
Feb 23 08:54:34 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Failed with result 'exit-code'.
Feb 23 08:54:34 np0005626463.localdomain podman[104348]: 2026-02-23 08:54:34.185098469 +0000 UTC m=+0.349592204 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, vcs-type=git, tcib_managed=true, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, name=rhosp-rhel9/openstack-qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., config_id=tripleo_step1, version=17.1.13, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 23 08:54:34 np0005626463.localdomain systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully.
Feb 23 08:54:34 np0005626463.localdomain podman[104345]: 2026-02-23 08:54:34.418548004 +0000 UTC m=+0.591689919 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, release=1766032510, org.opencontainers.image.created=2026-01-12T23:32:04Z, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.expose-services=, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_migration_target, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., url=https://www.redhat.com, build-date=2026-01-12T23:32:04Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, distribution-scope=public, name=rhosp-rhel9/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, io.buildah.version=1.41.5, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe)
Feb 23 08:54:34 np0005626463.localdomain systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully.
Feb 23 08:54:34 np0005626463.localdomain systemd[1]: tmp-crun.PnlzZf.mount: Deactivated successfully.
Feb 23 08:54:42 np0005626463.localdomain sshd[104431]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:54:42 np0005626463.localdomain sshd[104431]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:54:53 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.
Feb 23 08:54:53 np0005626463.localdomain systemd[1]: tmp-crun.JhWjW3.mount: Deactivated successfully.
Feb 23 08:54:53 np0005626463.localdomain podman[104433]: 2026-02-23 08:54:53.927620494 +0000 UTC m=+0.098223301 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, name=rhosp-rhel9/openstack-collectd, batch=17.1_20260112.1, vendor=Red Hat, Inc., com.redhat.component=openstack-collectd-container, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=collectd, release=1766032510, build-date=2026-01-12T22:10:15Z, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-type=git, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, tcib_managed=true)
Feb 23 08:54:53 np0005626463.localdomain podman[104433]: 2026-02-23 08:54:53.96366932 +0000 UTC m=+0.134272127 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, build-date=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', 
'/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, vcs-type=git, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, url=https://www.redhat.com, com.redhat.component=openstack-collectd-container, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 23 08:54:53 np0005626463.localdomain systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully.
Feb 23 08:55:01 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.
Feb 23 08:55:01 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.
Feb 23 08:55:01 np0005626463.localdomain podman[104454]: 2026-02-23 08:55:01.910157671 +0000 UTC m=+0.078592537 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, config_id=tripleo_step5, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_compute, batch=17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, build-date=2026-01-12T23:32:04Z, architecture=x86_64, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:32:04Z, tcib_managed=true, com.redhat.component=openstack-nova-compute-container)
Feb 23 08:55:01 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.
Feb 23 08:55:01 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.
Feb 23 08:55:01 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.
Feb 23 08:55:01 np0005626463.localdomain podman[104454]: 2026-02-23 08:55:01.946657561 +0000 UTC m=+0.115092377 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', 
'/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, vendor=Red Hat, Inc., container_name=nova_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-compute, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, tcib_managed=true, vcs-type=git, batch=17.1_20260112.1, managed_by=tripleo_ansible, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1766032510, config_id=tripleo_step5)
Feb 23 08:55:01 np0005626463.localdomain podman[104453]: 2026-02-23 08:55:01.95780281 +0000 UTC m=+0.133472242 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, distribution-scope=public, release=1766032510, com.redhat.component=openstack-iscsid-container, tcib_managed=true, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-iscsid, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, url=https://www.redhat.com, vcs-type=git, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:34:43Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.5)
Feb 23 08:55:01 np0005626463.localdomain systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Deactivated successfully.
Feb 23 08:55:01 np0005626463.localdomain podman[104453]: 2026-02-23 08:55:01.996131056 +0000 UTC m=+0.171800458 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:34:43Z, org.opencontainers.image.created=2026-01-12T22:34:43Z, description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-iscsid, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, vcs-ref=705339545363fec600102567c4e923938e0f43b3, url=https://www.redhat.com, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.openshift.expose-services=, managed_by=tripleo_ansible, com.redhat.component=openstack-iscsid-container, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, distribution-scope=public, release=1766032510)
Feb 23 08:55:02 np0005626463.localdomain systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully.
Feb 23 08:55:02 np0005626463.localdomain podman[104487]: 2026-02-23 08:55:02.02149374 +0000 UTC m=+0.093761221 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, url=https://www.redhat.com, config_id=tripleo_step4, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, description=Red 
Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, architecture=x86_64, release=1766032510, org.opencontainers.image.created=2026-01-12T23:07:47Z, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-ceilometer-compute, version=17.1.13, container_name=ceilometer_agent_compute, tcib_managed=true, distribution-scope=public, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T23:07:47Z)
Feb 23 08:55:02 np0005626463.localdomain podman[104489]: 2026-02-23 08:55:02.07302326 +0000 UTC m=+0.138175508 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, maintainer=OpenStack TripleO Team, tcib_managed=true, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:10:15Z, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:10:15Z, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 cron, name=rhosp-rhel9/openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, batch=17.1_20260112.1, com.redhat.component=openstack-cron-container, distribution-scope=public, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']})
Feb 23 08:55:02 np0005626463.localdomain podman[104489]: 2026-02-23 08:55:02.087234534 +0000 UTC m=+0.152386712 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, build-date=2026-01-12T22:10:15Z, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, com.redhat.component=openstack-cron-container, tcib_managed=true, release=1766032510, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:10:15Z, version=17.1.13, io.buildah.version=1.41.5, io.openshift.expose-services=, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']})
Feb 23 08:55:02 np0005626463.localdomain systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully.
Feb 23 08:55:02 np0005626463.localdomain podman[104488]: 2026-02-23 08:55:02.135684717 +0000 UTC m=+0.202716224 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.created=2026-01-12T23:07:30Z, vendor=Red Hat, Inc., 
io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, name=rhosp-rhel9/openstack-ceilometer-ipmi, version=17.1.13, vcs-type=git, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, architecture=x86_64, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1766032510, build-date=2026-01-12T23:07:30Z, com.redhat.component=openstack-ceilometer-ipmi-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Feb 23 08:55:02 np0005626463.localdomain podman[104487]: 2026-02-23 08:55:02.153475344 +0000 UTC m=+0.225742855 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, managed_by=tripleo_ansible, io.openshift.expose-services=, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:47Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-compute-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, batch=17.1_20260112.1, build-date=2026-01-12T23:07:47Z, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, vcs-type=git, architecture=x86_64, release=1766032510, tcib_managed=true, io.buildah.version=1.41.5, container_name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 23 08:55:02 np0005626463.localdomain podman[104488]: 2026-02-23 08:55:02.167663157 +0000 UTC m=+0.234694704 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:30Z, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_id=tripleo_step4, container_name=ceilometer_agent_ipmi, url=https://www.redhat.com, vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, io.buildah.version=1.41.5, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, managed_by=tripleo_ansible, release=1766032510, org.opencontainers.image.created=2026-01-12T23:07:30Z)
Feb 23 08:55:02 np0005626463.localdomain systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully.
Feb 23 08:55:02 np0005626463.localdomain systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully.
Feb 23 08:55:02 np0005626463.localdomain sshd[104571]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:55:02 np0005626463.localdomain systemd[1]: tmp-crun.mjc5Nk.mount: Deactivated successfully.
Feb 23 08:55:04 np0005626463.localdomain sshd[104571]: Connection closed by authenticating user nobody 80.94.95.116 port 32774 [preauth]
Feb 23 08:55:04 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.
Feb 23 08:55:04 np0005626463.localdomain podman[104573]: 2026-02-23 08:55:04.214029939 +0000 UTC m=+0.082605602 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, architecture=x86_64, build-date=2026-01-12T22:56:19Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1766032510, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, batch=17.1_20260112.1, io.buildah.version=1.41.5, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, version=17.1.13, container_name=ovn_metadata_agent, config_id=tripleo_step4, distribution-scope=public, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn)
Feb 23 08:55:04 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.
Feb 23 08:55:04 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.
Feb 23 08:55:04 np0005626463.localdomain podman[104573]: 2026-02-23 08:55:04.263377561 +0000 UTC m=+0.131953224 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, distribution-scope=public, managed_by=tripleo_ansible, io.openshift.expose-services=, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, container_name=ovn_metadata_agent, batch=17.1_20260112.1, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 
'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, architecture=x86_64)
Feb 23 08:55:04 np0005626463.localdomain podman[104573]: unhealthy
Feb 23 08:55:04 np0005626463.localdomain systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Main process exited, code=exited, status=1/FAILURE
Feb 23 08:55:04 np0005626463.localdomain systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Failed with result 'exit-code'.
Feb 23 08:55:04 np0005626463.localdomain systemd[1]: tmp-crun.vNNAAI.mount: Deactivated successfully.
Feb 23 08:55:04 np0005626463.localdomain podman[104596]: 2026-02-23 08:55:04.357972957 +0000 UTC m=+0.119970970 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, architecture=x86_64, name=rhosp-rhel9/openstack-qdrouterd, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, build-date=2026-01-12T22:10:14Z, config_id=tripleo_step1, batch=17.1_20260112.1, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:10:14Z)
Feb 23 08:55:04 np0005626463.localdomain podman[104595]: 2026-02-23 08:55:04.32830595 +0000 UTC m=+0.089723945 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:36:40Z, com.redhat.component=openstack-ovn-controller-container, container_name=ovn_controller, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, distribution-scope=public, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, batch=17.1_20260112.1, managed_by=tripleo_ansible, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, release=1766032510, io.openshift.expose-services=, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack 
osp-17.1 openstack-ovn-controller, build-date=2026-01-12T22:36:40Z, config_id=tripleo_step4, name=rhosp-rhel9/openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, version=17.1.13)
Feb 23 08:55:04 np0005626463.localdomain podman[104595]: 2026-02-23 08:55:04.412406427 +0000 UTC m=+0.173824442 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:36:40Z, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.41.5, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_id=tripleo_step4, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, maintainer=OpenStack TripleO Team, architecture=x86_64, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, 
build-date=2026-01-12T22:36:40Z, name=rhosp-rhel9/openstack-ovn-controller, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.openshift.expose-services=)
Feb 23 08:55:04 np0005626463.localdomain podman[104595]: unhealthy
Feb 23 08:55:04 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Main process exited, code=exited, status=1/FAILURE
Feb 23 08:55:04 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Failed with result 'exit-code'.
Feb 23 08:55:04 np0005626463.localdomain podman[104596]: 2026-02-23 08:55:04.558261295 +0000 UTC m=+0.320259278 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, tcib_managed=true, build-date=2026-01-12T22:10:14Z, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, maintainer=OpenStack TripleO Team, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:14Z, config_id=tripleo_step1, distribution-scope=public, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, batch=17.1_20260112.1)
Feb 23 08:55:04 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.
Feb 23 08:55:04 np0005626463.localdomain systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully.
Feb 23 08:55:04 np0005626463.localdomain podman[104642]: 2026-02-23 08:55:04.671506294 +0000 UTC m=+0.085167053 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, version=17.1.13, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, batch=17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, managed_by=tripleo_ansible, vcs-type=git, name=rhosp-rhel9/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, container_name=nova_migration_target, maintainer=OpenStack TripleO Team, build-date=2026-01-12T23:32:04Z, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true)
Feb 23 08:55:05 np0005626463.localdomain podman[104642]: 2026-02-23 08:55:05.010198716 +0000 UTC m=+0.423859415 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, version=17.1.13, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, name=rhosp-rhel9/openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, 
build-date=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-type=git, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, tcib_managed=true, architecture=x86_64)
Feb 23 08:55:05 np0005626463.localdomain systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully.
Feb 23 08:55:13 np0005626463.localdomain sudo[104666]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 08:55:13 np0005626463.localdomain sudo[104666]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:55:13 np0005626463.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Feb 23 08:55:13 np0005626463.localdomain sudo[104666]: pam_unix(sudo:session): session closed for user root
Feb 23 08:55:14 np0005626463.localdomain recover_tripleo_nova_virtqemud[104682]: 61982
Feb 23 08:55:14 np0005626463.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Feb 23 08:55:14 np0005626463.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Feb 23 08:55:14 np0005626463.localdomain sudo[104683]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 23 08:55:14 np0005626463.localdomain sudo[104683]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:55:14 np0005626463.localdomain sudo[104683]: pam_unix(sudo:session): session closed for user root
Feb 23 08:55:15 np0005626463.localdomain sudo[104731]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 08:55:15 np0005626463.localdomain sudo[104731]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:55:15 np0005626463.localdomain sudo[104731]: pam_unix(sudo:session): session closed for user root
Feb 23 08:55:18 np0005626463.localdomain sshd[104746]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:55:18 np0005626463.localdomain sshd[104746]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:55:20 np0005626463.localdomain sshd[104748]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:55:21 np0005626463.localdomain sshd[104748]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:55:24 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.
Feb 23 08:55:24 np0005626463.localdomain podman[104750]: 2026-02-23 08:55:24.938182455 +0000 UTC m=+0.103779063 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, name=rhosp-rhel9/openstack-collectd, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, com.redhat.component=openstack-collectd-container, summary=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, 
build-date=2026-01-12T22:10:15Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, distribution-scope=public, version=17.1.13, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:15Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, tcib_managed=true)
Feb 23 08:55:24 np0005626463.localdomain podman[104750]: 2026-02-23 08:55:24.977256776 +0000 UTC m=+0.142853344 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, io.openshift.expose-services=, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-collectd, managed_by=tripleo_ansible, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, build-date=2026-01-12T22:10:15Z, container_name=collectd, vendor=Red Hat, Inc., distribution-scope=public, com.redhat.component=openstack-collectd-container, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:15Z, url=https://www.redhat.com, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, tcib_managed=true, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 23 08:55:24 np0005626463.localdomain systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully.
Feb 23 08:55:32 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.
Feb 23 08:55:32 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.
Feb 23 08:55:32 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.
Feb 23 08:55:32 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.
Feb 23 08:55:32 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.
Feb 23 08:55:32 np0005626463.localdomain podman[104771]: 2026-02-23 08:55:32.936437351 +0000 UTC m=+0.099445258 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, distribution-scope=public, name=rhosp-rhel9/openstack-ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute, release=1766032510, vcs-type=git, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, managed_by=tripleo_ansible, build-date=2026-01-12T23:07:47Z, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true)
Feb 23 08:55:32 np0005626463.localdomain podman[104771]: 2026-02-23 08:55:32.97480889 +0000 UTC m=+0.137816827 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2026-01-12T23:07:47Z, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, tcib_managed=true, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-compute-container, org.opencontainers.image.created=2026-01-12T23:07:47Z, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, config_id=tripleo_step4, url=https://www.redhat.com, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1, container_name=ceilometer_agent_compute, vendor=Red Hat, Inc., architecture=x86_64)
Feb 23 08:55:32 np0005626463.localdomain systemd[1]: tmp-crun.2sgnoy.mount: Deactivated successfully.
Feb 23 08:55:33 np0005626463.localdomain systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully.
Feb 23 08:55:33 np0005626463.localdomain podman[104772]: 2026-02-23 08:55:33.039661557 +0000 UTC m=+0.199399072 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:07:30Z, config_id=tripleo_step4, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi, release=1766032510, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-ipmi-container, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, url=https://www.redhat.com, name=rhosp-rhel9/openstack-ceilometer-ipmi, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, architecture=x86_64, batch=17.1_20260112.1, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2026-01-12T23:07:30Z)
Feb 23 08:55:33 np0005626463.localdomain podman[104770]: 2026-02-23 08:55:33.006184741 +0000 UTC m=+0.171937343 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, release=1766032510, com.redhat.component=openstack-iscsid-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.openshift.expose-services=, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, architecture=x86_64, build-date=2026-01-12T22:34:43Z, container_name=iscsid, 
io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1, io.buildah.version=1.41.5, vcs-type=git, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:34:43Z, config_id=tripleo_step3, name=rhosp-rhel9/openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=705339545363fec600102567c4e923938e0f43b3, vendor=Red Hat, Inc., version=17.1.13, description=Red Hat OpenStack Platform 17.1 iscsid)
Feb 23 08:55:33 np0005626463.localdomain podman[104770]: 2026-02-23 08:55:33.085775597 +0000 UTC m=+0.251528169 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, container_name=iscsid, release=1766032510, build-date=2026-01-12T22:34:43Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, distribution-scope=public, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, architecture=x86_64, vendor=Red Hat, Inc., 
url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:34:43Z, name=rhosp-rhel9/openstack-iscsid, version=17.1.13, com.redhat.component=openstack-iscsid-container, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team)
Feb 23 08:55:33 np0005626463.localdomain systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully.
Feb 23 08:55:33 np0005626463.localdomain podman[104772]: 2026-02-23 08:55:33.105786253 +0000 UTC m=+0.265523708 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., tcib_managed=true, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
com.redhat.component=openstack-ceilometer-ipmi-container, distribution-scope=public, architecture=x86_64, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, release=1766032510, version=17.1.13, batch=17.1_20260112.1, build-date=2026-01-12T23:07:30Z, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi)
Feb 23 08:55:33 np0005626463.localdomain systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully.
Feb 23 08:55:33 np0005626463.localdomain podman[104773]: 2026-02-23 08:55:33.106585368 +0000 UTC m=+0.262503614 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, release=1766032510, url=https://www.redhat.com, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, name=rhosp-rhel9/openstack-cron, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:15Z, description=Red Hat OpenStack Platform 17.1 cron, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, version=17.1.13, vcs-type=git, build-date=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., managed_by=tripleo_ansible, batch=17.1_20260112.1, distribution-scope=public)
Feb 23 08:55:33 np0005626463.localdomain podman[104773]: 2026-02-23 08:55:33.190431738 +0000 UTC m=+0.346349974 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, name=rhosp-rhel9/openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, release=1766032510, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.5, build-date=2026-01-12T22:10:15Z, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, version=17.1.13, batch=17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-cron-container, distribution-scope=public, managed_by=tripleo_ansible, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=logrotate_crond, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 23 08:55:33 np0005626463.localdomain systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully.
Feb 23 08:55:33 np0005626463.localdomain podman[104782]: 2026-02-23 08:55:33.195291069 +0000 UTC m=+0.348739787 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, name=rhosp-rhel9/openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, vcs-type=git, batch=17.1_20260112.1, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., config_id=tripleo_step5, distribution-scope=public, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, container_name=nova_compute, release=1766032510, build-date=2026-01-12T23:32:04Z, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 23 08:55:33 np0005626463.localdomain podman[104782]: 2026-02-23 08:55:33.275731773 +0000 UTC m=+0.429180451 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, container_name=nova_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, managed_by=tripleo_ansible, tcib_managed=true, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., build-date=2026-01-12T23:32:04Z, release=1766032510, com.redhat.component=openstack-nova-compute-container, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, url=https://www.redhat.com, config_id=tripleo_step5, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-nova-compute, io.buildah.version=1.41.5, io.openshift.expose-services=, distribution-scope=public)
Feb 23 08:55:33 np0005626463.localdomain systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Deactivated successfully.
Feb 23 08:55:33 np0005626463.localdomain systemd[1]: tmp-crun.MssDk5.mount: Deactivated successfully.
Feb 23 08:55:34 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.
Feb 23 08:55:34 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.
Feb 23 08:55:34 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.
Feb 23 08:55:34 np0005626463.localdomain podman[104890]: 2026-02-23 08:55:34.905333372 +0000 UTC m=+0.075942004 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, release=1766032510, build-date=2026-01-12T22:10:14Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=metrics_qdr, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step1, io.buildah.version=1.41.5, managed_by=tripleo_ansible, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:10:14Z, distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-qdrouterd, architecture=x86_64, io.openshift.expose-services=)
Feb 23 08:55:34 np0005626463.localdomain podman[104888]: 2026-02-23 08:55:34.96608569 +0000 UTC m=+0.140826732 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, name=rhosp-rhel9/openstack-ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, com.redhat.component=openstack-ovn-controller-container, io.openshift.expose-services=, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, version=17.1.13, release=1766032510, container_name=ovn_controller, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_id=tripleo_step4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:36:40Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, batch=17.1_20260112.1)
Feb 23 08:55:34 np0005626463.localdomain podman[104888]: 2026-02-23 08:55:34.980370557 +0000 UTC m=+0.155111569 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, summary=Red Hat OpenStack Platform 17.1 ovn-controller, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, name=rhosp-rhel9/openstack-ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, version=17.1.13, batch=17.1_20260112.1, distribution-scope=public, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, architecture=x86_64, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, config_id=tripleo_step4, managed_by=tripleo_ansible, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_controller, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:36:40Z, io.openshift.expose-services=)
Feb 23 08:55:34 np0005626463.localdomain podman[104888]: unhealthy
Feb 23 08:55:34 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Main process exited, code=exited, status=1/FAILURE
Feb 23 08:55:34 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Failed with result 'exit-code'.
Feb 23 08:55:35 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.
Feb 23 08:55:35 np0005626463.localdomain podman[104889]: 2026-02-23 08:55:35.081190007 +0000 UTC m=+0.253238175 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vcs-type=git, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, architecture=x86_64, url=https://www.redhat.com, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:56:19Z, vendor=Red Hat, Inc., org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.buildah.version=1.41.5, container_name=ovn_metadata_agent, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, version=17.1.13, distribution-scope=public, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, build-date=2026-01-12T22:56:19Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Feb 23 08:55:35 np0005626463.localdomain podman[104889]: 2026-02-23 08:55:35.098274781 +0000 UTC m=+0.270323019 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.5, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, managed_by=tripleo_ansible, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, build-date=2026-01-12T22:56:19Z, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, vendor=Red Hat, Inc., release=1766032510, url=https://www.redhat.com)
Feb 23 08:55:35 np0005626463.localdomain podman[104889]: unhealthy
Feb 23 08:55:35 np0005626463.localdomain podman[104890]: 2026-02-23 08:55:35.106541429 +0000 UTC m=+0.277150061 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:10:14Z, build-date=2026-01-12T22:10:14Z, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, tcib_managed=true, distribution-scope=public, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-qdrouterd, io.openshift.expose-services=, url=https://www.redhat.com, vcs-type=git, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee)
Feb 23 08:55:35 np0005626463.localdomain systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Main process exited, code=exited, status=1/FAILURE
Feb 23 08:55:35 np0005626463.localdomain systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Failed with result 'exit-code'.
Feb 23 08:55:35 np0005626463.localdomain systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully.
Feb 23 08:55:35 np0005626463.localdomain podman[104954]: 2026-02-23 08:55:35.176609378 +0000 UTC m=+0.082589732 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.openshift.expose-services=, managed_by=tripleo_ansible, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step4, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:32:04Z, name=rhosp-rhel9/openstack-nova-compute, vcs-type=git, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.buildah.version=1.41.5, io.k8s.description=Red Hat 
OpenStack Platform 17.1 nova-compute, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., container_name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1)
Feb 23 08:55:35 np0005626463.localdomain podman[104954]: 2026-02-23 08:55:35.573274592 +0000 UTC m=+0.479254926 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, container_name=nova_migration_target, vcs-type=git, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, build-date=2026-01-12T23:32:04Z, io.openshift.expose-services=, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, 
org.opencontainers.image.created=2026-01-12T23:32:04Z, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-nova-compute, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4)
Feb 23 08:55:35 np0005626463.localdomain systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully.
Feb 23 08:55:55 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.
Feb 23 08:55:55 np0005626463.localdomain systemd[1]: tmp-crun.xwTuyG.mount: Deactivated successfully.
Feb 23 08:55:55 np0005626463.localdomain podman[104979]: 2026-02-23 08:55:55.933967593 +0000 UTC m=+0.097037892 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, release=1766032510, io.buildah.version=1.41.5, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', 
'/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.component=openstack-collectd-container, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step3, container_name=collectd, vcs-type=git, batch=17.1_20260112.1, url=https://www.redhat.com, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, name=rhosp-rhel9/openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 23 08:55:55 np0005626463.localdomain podman[104979]: 2026-02-23 08:55:55.970390761 +0000 UTC m=+0.133461020 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, name=rhosp-rhel9/openstack-collectd, architecture=x86_64, build-date=2026-01-12T22:10:15Z, version=17.1.13, vcs-type=git, batch=17.1_20260112.1, tcib_managed=true, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.openshift.expose-services=, io.buildah.version=1.41.5, release=1766032510, distribution-scope=public, 
org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., container_name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, com.redhat.component=openstack-collectd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee)
Feb 23 08:55:55 np0005626463.localdomain systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully.
Feb 23 08:55:57 np0005626463.localdomain sshd[105000]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:55:58 np0005626463.localdomain sshd[105000]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:56:03 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.
Feb 23 08:56:03 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.
Feb 23 08:56:03 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.
Feb 23 08:56:03 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.
Feb 23 08:56:03 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.
Feb 23 08:56:03 np0005626463.localdomain podman[105003]: 2026-02-23 08:56:03.929961511 +0000 UTC m=+0.096854128 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, managed_by=tripleo_ansible, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.buildah.version=1.41.5, batch=17.1_20260112.1, container_name=ceilometer_agent_compute, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-ceilometer-compute, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-compute-container, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.13, build-date=2026-01-12T23:07:47Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64, release=1766032510, vcs-type=git, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 
'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 23 08:56:03 np0005626463.localdomain systemd[1]: tmp-crun.oz2MyW.mount: Deactivated successfully.
Feb 23 08:56:04 np0005626463.localdomain podman[105005]: 2026-02-23 08:56:03.999807333 +0000 UTC m=+0.157618347 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, build-date=2026-01-12T22:10:15Z, container_name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, url=https://www.redhat.com, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, version=17.1.13, batch=17.1_20260112.1, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vendor=Red Hat, Inc., io.buildah.version=1.41.5, distribution-scope=public, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 cron)
Feb 23 08:56:04 np0005626463.localdomain podman[105003]: 2026-02-23 08:56:04.015683558 +0000 UTC m=+0.182576215 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vcs-type=git, release=1766032510, container_name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vendor=Red Hat, Inc., org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.buildah.version=1.41.5, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, build-date=2026-01-12T23:07:47Z, io.openshift.expose-services=, version=17.1.13)
Feb 23 08:56:04 np0005626463.localdomain podman[105005]: 2026-02-23 08:56:04.036386596 +0000 UTC m=+0.194197550 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, container_name=logrotate_crond, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, build-date=2026-01-12T22:10:15Z, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, io.openshift.expose-services=, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:15Z, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, name=rhosp-rhel9/openstack-cron, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, vendor=Red Hat, Inc., version=17.1.13, com.redhat.component=openstack-cron-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']})
Feb 23 08:56:04 np0005626463.localdomain systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully.
Feb 23 08:56:04 np0005626463.localdomain podman[105011]: 2026-02-23 08:56:03.949513322 +0000 UTC m=+0.105001063 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', 
'/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-compute, batch=17.1_20260112.1, container_name=nova_compute, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, build-date=2026-01-12T23:32:04Z, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, maintainer=OpenStack TripleO Team, config_id=tripleo_step5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, tcib_managed=true, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 23 08:56:04 np0005626463.localdomain podman[105011]: 2026-02-23 08:56:04.07974611 +0000 UTC m=+0.235233851 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, batch=17.1_20260112.1, tcib_managed=true, org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, config_id=tripleo_step5, build-date=2026-01-12T23:32:04Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, architecture=x86_64, distribution-scope=public, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510)
Feb 23 08:56:04 np0005626463.localdomain systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully.
Feb 23 08:56:04 np0005626463.localdomain systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Deactivated successfully.
Feb 23 08:56:04 np0005626463.localdomain podman[105004]: 2026-02-23 08:56:04.040336129 +0000 UTC m=+0.202551260 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, url=https://www.redhat.com, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, build-date=2026-01-12T23:07:30Z, batch=17.1_20260112.1, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=ceilometer_agent_ipmi, tcib_managed=true, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, io.openshift.expose-services=, architecture=x86_64, config_id=tripleo_step4, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13, com.redhat.component=openstack-ceilometer-ipmi-container, name=rhosp-rhel9/openstack-ceilometer-ipmi)
Feb 23 08:56:04 np0005626463.localdomain podman[105002]: 2026-02-23 08:56:04.147185897 +0000 UTC m=+0.316728927 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vcs-type=git, version=17.1.13, name=rhosp-rhel9/openstack-iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.5, vendor=Red Hat, Inc., tcib_managed=true, io.openshift.expose-services=, com.redhat.component=openstack-iscsid-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, container_name=iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', 
'/var/lib/iscsi:/var/lib/iscsi:z']}, batch=17.1_20260112.1, config_id=tripleo_step3, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:34:43Z, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=705339545363fec600102567c4e923938e0f43b3, build-date=2026-01-12T22:34:43Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 23 08:56:04 np0005626463.localdomain podman[105002]: 2026-02-23 08:56:04.161391951 +0000 UTC m=+0.330935021 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=705339545363fec600102567c4e923938e0f43b3, build-date=2026-01-12T22:34:43Z, architecture=x86_64, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, distribution-scope=public, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, version=17.1.13, release=1766032510, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, org.opencontainers.image.created=2026-01-12T22:34:43Z, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-iscsid, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.openshift.expose-services=, com.redhat.component=openstack-iscsid-container, batch=17.1_20260112.1, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']})
Feb 23 08:56:04 np0005626463.localdomain systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully.
Feb 23 08:56:04 np0005626463.localdomain podman[105004]: 2026-02-23 08:56:04.17733346 +0000 UTC m=+0.339548571 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, config_id=tripleo_step4, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vendor=Red Hat, Inc., batch=17.1_20260112.1, vcs-type=git, io.buildah.version=1.41.5, build-date=2026-01-12T23:07:30Z, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-ipmi-container, version=17.1.13, maintainer=OpenStack 
TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, io.openshift.expose-services=, url=https://www.redhat.com, architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:07:30Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi)
Feb 23 08:56:04 np0005626463.localdomain systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully.
Feb 23 08:56:04 np0005626463.localdomain sshd[105118]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:56:04 np0005626463.localdomain sshd[105120]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:56:05 np0005626463.localdomain sshd[105122]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:56:05 np0005626463.localdomain sshd[105120]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:56:05 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.
Feb 23 08:56:05 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.
Feb 23 08:56:05 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.
Feb 23 08:56:05 np0005626463.localdomain podman[105123]: 2026-02-23 08:56:05.32371223 +0000 UTC m=+0.087104763 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, build-date=2026-01-12T22:36:40Z, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, maintainer=OpenStack TripleO Team, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-ovn-controller, io.openshift.expose-services=, distribution-scope=public, io.buildah.version=1.41.5, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, org.opencontainers.image.created=2026-01-12T22:36:40Z, tcib_managed=true, batch=17.1_20260112.1, managed_by=tripleo_ansible, config_id=tripleo_step4, url=https://www.redhat.com, version=17.1.13, container_name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': 
['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']})
Feb 23 08:56:05 np0005626463.localdomain podman[105124]: 2026-02-23 08:56:05.385287994 +0000 UTC m=+0.146296182 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, org.opencontainers.image.created=2026-01-12T22:56:19Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:56:19Z, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, tcib_managed=true, vcs-type=git, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, container_name=ovn_metadata_agent, io.buildah.version=1.41.5, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container)
Feb 23 08:56:05 np0005626463.localdomain podman[105123]: 2026-02-23 08:56:05.395331908 +0000 UTC m=+0.158724451 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, build-date=2026-01-12T22:36:40Z, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, release=1766032510, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.buildah.version=1.41.5, container_name=ovn_controller, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:36:40Z, config_id=tripleo_step4, name=rhosp-rhel9/openstack-ovn-controller, 
vcs-type=git, io.openshift.expose-services=, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, architecture=x86_64)
Feb 23 08:56:05 np0005626463.localdomain podman[105123]: unhealthy
Feb 23 08:56:05 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Main process exited, code=exited, status=1/FAILURE
Feb 23 08:56:05 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Failed with result 'exit-code'.
Feb 23 08:56:05 np0005626463.localdomain podman[105124]: 2026-02-23 08:56:05.421937819 +0000 UTC m=+0.182946007 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, batch=17.1_20260112.1, release=1766032510, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:56:19Z, build-date=2026-01-12T22:56:19Z, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, io.buildah.version=1.41.5, vcs-type=git, vendor=Red Hat, Inc., managed_by=tripleo_ansible, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, container_name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public)
Feb 23 08:56:05 np0005626463.localdomain podman[105124]: unhealthy
Feb 23 08:56:05 np0005626463.localdomain systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Main process exited, code=exited, status=1/FAILURE
Feb 23 08:56:05 np0005626463.localdomain systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Failed with result 'exit-code'.
Feb 23 08:56:05 np0005626463.localdomain sshd[105122]: error: kex_exchange_identification: client sent invalid protocol identifier "MGLNDD_38.102.83.164_22"
Feb 23 08:56:05 np0005626463.localdomain sshd[105122]: banner exchange: Connection from 40.124.175.86 port 59118: invalid format
Feb 23 08:56:05 np0005626463.localdomain podman[105125]: 2026-02-23 08:56:05.485533566 +0000 UTC m=+0.243565731 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, io.buildah.version=1.41.5, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, version=17.1.13, architecture=x86_64, io.openshift.expose-services=, name=rhosp-rhel9/openstack-qdrouterd, release=1766032510, build-date=2026-01-12T22:10:14Z, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, batch=17.1_20260112.1, url=https://www.redhat.com, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee)
Feb 23 08:56:05 np0005626463.localdomain podman[105125]: 2026-02-23 08:56:05.686424534 +0000 UTC m=+0.444456709 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:10:14Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, 
io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:10:14Z, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, architecture=x86_64, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.buildah.version=1.41.5, version=17.1.13, managed_by=tripleo_ansible)
Feb 23 08:56:05 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.
Feb 23 08:56:05 np0005626463.localdomain systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully.
Feb 23 08:56:05 np0005626463.localdomain podman[105191]: 2026-02-23 08:56:05.79608578 +0000 UTC m=+0.084197062 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, architecture=x86_64, container_name=nova_migration_target, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-nova-compute, managed_by=tripleo_ansible, release=1766032510, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red 
Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.buildah.version=1.41.5, url=https://www.redhat.com, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T23:32:04Z, distribution-scope=public)
Feb 23 08:56:06 np0005626463.localdomain podman[105191]: 2026-02-23 08:56:06.243238442 +0000 UTC m=+0.531349714 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, name=rhosp-rhel9/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, io.buildah.version=1.41.5, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, managed_by=tripleo_ansible, container_name=nova_migration_target, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, release=1766032510, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, build-date=2026-01-12T23:32:04Z, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, vcs-type=git, config_id=tripleo_step4, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:32:04Z)
Feb 23 08:56:06 np0005626463.localdomain systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully.
Feb 23 08:56:14 np0005626463.localdomain sshd[105118]: Connection closed by 40.124.175.86 port 59116 [preauth]
Feb 23 08:56:15 np0005626463.localdomain sudo[105215]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 08:56:15 np0005626463.localdomain sudo[105215]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:56:15 np0005626463.localdomain sudo[105215]: pam_unix(sudo:session): session closed for user root
Feb 23 08:56:15 np0005626463.localdomain sudo[105230]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 23 08:56:15 np0005626463.localdomain sudo[105230]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:56:16 np0005626463.localdomain sudo[105230]: pam_unix(sudo:session): session closed for user root
Feb 23 08:56:17 np0005626463.localdomain sudo[105276]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 08:56:17 np0005626463.localdomain sudo[105276]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:56:17 np0005626463.localdomain sudo[105276]: pam_unix(sudo:session): session closed for user root
Feb 23 08:56:26 np0005626463.localdomain kernel: DROPPING: IN=vlan20 OUT= MACSRC=8a:7d:df:61:60:7d MACDST=f6:9b:d3:a0:7a:ad MACPROTO=0800 SRC=172.17.0.104 DST=172.17.0.106 LEN=40 TOS=0x00 PREC=0xC0 TTL=64 ID=0 DF PROTO=TCP SPT=6642 DPT=40168 SEQ=0 ACK=3692768849 WINDOW=0 RES=0x00 ACK RST URGP=0 
Feb 23 08:56:26 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.
Feb 23 08:56:26 np0005626463.localdomain podman[105291]: 2026-02-23 08:56:26.921929115 +0000 UTC m=+0.091632761 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.buildah.version=1.41.5, vcs-type=git, architecture=x86_64, build-date=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, name=rhosp-rhel9/openstack-collectd, version=17.1.13, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:15Z, vendor=Red Hat, Inc., com.redhat.component=openstack-collectd-container, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, managed_by=tripleo_ansible, container_name=collectd, tcib_managed=true)
Feb 23 08:56:26 np0005626463.localdomain podman[105291]: 2026-02-23 08:56:26.936181786 +0000 UTC m=+0.105885472 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:10:15Z, container_name=collectd, release=1766032510, vcs-type=git, io.openshift.expose-services=, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, managed_by=tripleo_ansible, architecture=x86_64, config_id=tripleo_step3, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, batch=17.1_20260112.1, distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.buildah.version=1.41.5, vendor=Red Hat, Inc.)
Feb 23 08:56:26 np0005626463.localdomain systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully.
Feb 23 08:56:34 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.
Feb 23 08:56:34 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.
Feb 23 08:56:34 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.
Feb 23 08:56:34 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.
Feb 23 08:56:34 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.
Feb 23 08:56:34 np0005626463.localdomain podman[105312]: 2026-02-23 08:56:34.93317931 +0000 UTC m=+0.099769723 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, build-date=2026-01-12T23:07:47Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, managed_by=tripleo_ansible, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vendor=Red Hat, Inc., 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, release=1766032510, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-compute-container, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, version=17.1.13, url=https://www.redhat.com, architecture=x86_64, vcs-type=git, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06)
Feb 23 08:56:34 np0005626463.localdomain podman[105313]: 2026-02-23 08:56:34.978178364 +0000 UTC m=+0.140652858 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, container_name=ceilometer_agent_ipmi, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, batch=17.1_20260112.1, distribution-scope=public, tcib_managed=true, build-date=2026-01-12T23:07:30Z, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:07:30Z, architecture=x86_64, name=rhosp-rhel9/openstack-ceilometer-ipmi, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, version=17.1.13, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 23 08:56:35 np0005626463.localdomain podman[105311]: 2026-02-23 08:56:35.028858674 +0000 UTC m=+0.197565322 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step3, io.buildah.version=1.41.5, build-date=2026-01-12T22:34:43Z, summary=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid, vcs-type=git, batch=17.1_20260112.1, container_name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 
17.1_20260112.1, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, distribution-scope=public, managed_by=tripleo_ansible, com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:34:43Z, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, vcs-ref=705339545363fec600102567c4e923938e0f43b3, version=17.1.13, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid)
Feb 23 08:56:35 np0005626463.localdomain podman[105313]: 2026-02-23 08:56:35.035304454 +0000 UTC m=+0.197778908 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:30Z, name=rhosp-rhel9/openstack-ceilometer-ipmi, version=17.1.13, distribution-scope=public, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:30Z, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, vendor=Red Hat, Inc., vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64)
Feb 23 08:56:35 np0005626463.localdomain podman[105311]: 2026-02-23 08:56:35.042288091 +0000 UTC m=+0.210994709 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=705339545363fec600102567c4e923938e0f43b3, version=17.1.13, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, build-date=2026-01-12T22:34:43Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:34:43Z, 
managed_by=tripleo_ansible, vendor=Red Hat, Inc., url=https://www.redhat.com, release=1766032510, container_name=iscsid, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-iscsid, config_id=tripleo_step3, maintainer=OpenStack TripleO Team)
Feb 23 08:56:35 np0005626463.localdomain systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully.
Feb 23 08:56:35 np0005626463.localdomain systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully.
Feb 23 08:56:35 np0005626463.localdomain podman[105312]: 2026-02-23 08:56:35.09617033 +0000 UTC m=+0.262760743 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.buildah.version=1.41.5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, release=1766032510, 
com.redhat.component=openstack-ceilometer-compute-container, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vendor=Red Hat, Inc., io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2026-01-12T23:07:47Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-ceilometer-compute, version=17.1.13, batch=17.1_20260112.1, url=https://www.redhat.com, config_id=tripleo_step4)
Feb 23 08:56:35 np0005626463.localdomain systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully.
Feb 23 08:56:35 np0005626463.localdomain podman[105327]: 2026-02-23 08:56:35.108502092 +0000 UTC m=+0.260495432 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, batch=17.1_20260112.1, config_id=tripleo_step5, vcs-type=git, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-nova-compute, architecture=x86_64, url=https://www.redhat.com, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, build-date=2026-01-12T23:32:04Z, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.created=2026-01-12T23:32:04Z, container_name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 23 08:56:35 np0005626463.localdomain podman[105327]: 2026-02-23 08:56:35.18431431 +0000 UTC m=+0.336307650 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vcs-type=git, config_id=tripleo_step5, container_name=nova_compute, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, name=rhosp-rhel9/openstack-nova-compute, url=https://www.redhat.com, architecture=x86_64, distribution-scope=public, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, version=17.1.13, io.openshift.expose-services=, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, build-date=2026-01-12T23:32:04Z)
Feb 23 08:56:35 np0005626463.localdomain systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Deactivated successfully.
Feb 23 08:56:35 np0005626463.localdomain podman[105314]: 2026-02-23 08:56:35.198928664 +0000 UTC m=+0.354824205 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.5, vcs-type=git, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=logrotate_crond, com.redhat.component=openstack-cron-container, maintainer=OpenStack TripleO Team, architecture=x86_64, io.openshift.expose-services=, vendor=Red Hat, Inc., distribution-scope=public, config_id=tripleo_step4, name=rhosp-rhel9/openstack-cron, build-date=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, url=https://www.redhat.com)
Feb 23 08:56:35 np0005626463.localdomain podman[105314]: 2026-02-23 08:56:35.212150993 +0000 UTC m=+0.368046534 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, url=https://www.redhat.com, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, batch=17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, io.openshift.expose-services=, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-cron-container, architecture=x86_64, 
org.opencontainers.image.created=2026-01-12T22:10:15Z, io.buildah.version=1.41.5, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, container_name=logrotate_crond, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, name=rhosp-rhel9/openstack-cron, build-date=2026-01-12T22:10:15Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron)
Feb 23 08:56:35 np0005626463.localdomain systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully.
Feb 23 08:56:35 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.
Feb 23 08:56:35 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.
Feb 23 08:56:35 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.
Feb 23 08:56:35 np0005626463.localdomain podman[105432]: 2026-02-23 08:56:35.912282135 +0000 UTC m=+0.082461115 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, io.buildah.version=1.41.5, config_id=tripleo_step1, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:14Z, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, release=1766032510, container_name=metrics_qdr, architecture=x86_64, io.openshift.expose-services=, managed_by=tripleo_ansible, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:10:14Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.component=openstack-qdrouterd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 23 08:56:35 np0005626463.localdomain systemd[1]: tmp-crun.nofmvD.mount: Deactivated successfully.
Feb 23 08:56:35 np0005626463.localdomain podman[105431]: 2026-02-23 08:56:35.973990127 +0000 UTC m=+0.146440028 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, container_name=ovn_metadata_agent, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.openshift.expose-services=, managed_by=tripleo_ansible, version=17.1.13, io.buildah.version=1.41.5, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, architecture=x86_64, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, config_id=tripleo_step4, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, build-date=2026-01-12T22:56:19Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-type=git, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Feb 23 08:56:36 np0005626463.localdomain podman[105431]: 2026-02-23 08:56:36.021162269 +0000 UTC m=+0.193612140 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, container_name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, build-date=2026-01-12T22:56:19Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, distribution-scope=public, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, managed_by=tripleo_ansible, architecture=x86_64, url=https://www.redhat.com, release=1766032510, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.13)
Feb 23 08:56:36 np0005626463.localdomain podman[105431]: unhealthy
Feb 23 08:56:36 np0005626463.localdomain podman[105430]: 2026-02-23 08:56:36.028841027 +0000 UTC m=+0.203918599 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, com.redhat.component=openstack-ovn-controller-container, name=rhosp-rhel9/openstack-ovn-controller, io.openshift.expose-services=, distribution-scope=public, version=17.1.13, tcib_managed=true, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:36:40Z, container_name=ovn_controller, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:36:40Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, release=1766032510, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, 
org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 23 08:56:36 np0005626463.localdomain systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Main process exited, code=exited, status=1/FAILURE
Feb 23 08:56:36 np0005626463.localdomain systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Failed with result 'exit-code'.
Feb 23 08:56:36 np0005626463.localdomain podman[105430]: 2026-02-23 08:56:36.073285364 +0000 UTC m=+0.248362906 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, config_id=tripleo_step4, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, distribution-scope=public, com.redhat.component=openstack-ovn-controller-container, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, tcib_managed=true, build-date=2026-01-12T22:36:40Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-ovn-controller, 
org.opencontainers.image.created=2026-01-12T22:36:40Z, io.openshift.expose-services=, container_name=ovn_controller, version=17.1.13, vcs-type=git, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller)
Feb 23 08:56:36 np0005626463.localdomain podman[105430]: unhealthy
Feb 23 08:56:36 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Main process exited, code=exited, status=1/FAILURE
Feb 23 08:56:36 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Failed with result 'exit-code'.
Feb 23 08:56:36 np0005626463.localdomain sshd[105496]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:56:36 np0005626463.localdomain podman[105432]: 2026-02-23 08:56:36.116306157 +0000 UTC m=+0.286485177 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, version=17.1.13, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, container_name=metrics_qdr, vcs-type=git, build-date=2026-01-12T22:10:14Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, 
architecture=x86_64, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, tcib_managed=true, config_id=tripleo_step1, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:14Z, name=rhosp-rhel9/openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc.)
Feb 23 08:56:36 np0005626463.localdomain systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully.
Feb 23 08:56:36 np0005626463.localdomain sshd[105496]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:56:36 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.
Feb 23 08:56:36 np0005626463.localdomain podman[105498]: 2026-02-23 08:56:36.727888905 +0000 UTC m=+0.087273145 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-nova-compute, build-date=2026-01-12T23:32:04Z, architecture=x86_64, container_name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, url=https://www.redhat.com, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., release=1766032510, batch=17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 23 08:56:37 np0005626463.localdomain podman[105498]: 2026-02-23 08:56:37.115536536 +0000 UTC m=+0.474920786 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, tcib_managed=true, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, io.openshift.expose-services=, config_id=tripleo_step4, name=rhosp-rhel9/openstack-nova-compute, 
org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, container_name=nova_migration_target, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, managed_by=tripleo_ansible, vcs-type=git)
Feb 23 08:56:37 np0005626463.localdomain systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully.
Feb 23 08:56:48 np0005626463.localdomain sshd[105521]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:56:49 np0005626463.localdomain sshd[105521]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:56:57 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.
Feb 23 08:56:57 np0005626463.localdomain podman[105523]: 2026-02-23 08:56:57.942616036 +0000 UTC m=+0.085436998 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-collectd, distribution-scope=public, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, build-date=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, version=17.1.13, vcs-type=git, architecture=x86_64, io.buildah.version=1.41.5, com.redhat.component=openstack-collectd-container, container_name=collectd, tcib_managed=true, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.openshift.expose-services=)
Feb 23 08:56:57 np0005626463.localdomain podman[105523]: 2026-02-23 08:56:57.95046562 +0000 UTC m=+0.093286602 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, name=rhosp-rhel9/openstack-collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, batch=17.1_20260112.1, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, tcib_managed=true, vendor=Red Hat, Inc., build-date=2026-01-12T22:10:15Z, description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, release=1766032510, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:15Z, container_name=collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, vcs-type=git, version=17.1.13, com.redhat.component=openstack-collectd-container, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 23 08:56:57 np0005626463.localdomain systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully.
Feb 23 08:57:05 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.
Feb 23 08:57:05 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.
Feb 23 08:57:05 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.
Feb 23 08:57:05 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.
Feb 23 08:57:05 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.
Feb 23 08:57:05 np0005626463.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Feb 23 08:57:05 np0005626463.localdomain recover_tripleo_nova_virtqemud[105575]: 61982
Feb 23 08:57:05 np0005626463.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Feb 23 08:57:05 np0005626463.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Feb 23 08:57:05 np0005626463.localdomain systemd[1]: tmp-crun.GC6hPE.mount: Deactivated successfully.
Feb 23 08:57:05 np0005626463.localdomain podman[105543]: 2026-02-23 08:57:05.920044282 +0000 UTC m=+0.083893610 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_id=tripleo_step4, distribution-scope=public, tcib_managed=true, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, batch=17.1_20260112.1, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:07:47Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.created=2026-01-12T23:07:47Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, version=17.1.13, container_name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Feb 23 08:57:05 np0005626463.localdomain podman[105551]: 2026-02-23 08:57:05.99195034 +0000 UTC m=+0.144571880 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, distribution-scope=public, container_name=nova_compute, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, build-date=2026-01-12T23:32:04Z, vcs-type=git, org.opencontainers.image.created=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, io.openshift.expose-services=)
Feb 23 08:57:06 np0005626463.localdomain podman[105543]: 2026-02-23 08:57:06.0012923 +0000 UTC m=+0.165141638 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-ceilometer-compute, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, distribution-scope=public, io.openshift.expose-services=, container_name=ceilometer_agent_compute, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.description=Red Hat OpenStack 
Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-compute-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20260112.1, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., io.buildah.version=1.41.5, build-date=2026-01-12T23:07:47Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, release=1766032510, config_id=tripleo_step4, managed_by=tripleo_ansible)
Feb 23 08:57:06 np0005626463.localdomain podman[105545]: 2026-02-23 08:57:05.961182427 +0000 UTC m=+0.118984028 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:10:15Z, vcs-type=git, com.redhat.component=openstack-cron-container, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, release=1766032510, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.buildah.version=1.41.5, container_name=logrotate_crond, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, config_id=tripleo_step4, name=rhosp-rhel9/openstack-cron, managed_by=tripleo_ansible, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 cron)
Feb 23 08:57:06 np0005626463.localdomain podman[105544]: 2026-02-23 08:57:06.019475132 +0000 UTC m=+0.179708328 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, build-date=2026-01-12T23:07:30Z, distribution-scope=public, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, name=rhosp-rhel9/openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, io.openshift.tags=rhosp 
osp openstack osp-17.1 openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, config_id=tripleo_step4, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., release=1766032510, org.opencontainers.image.created=2026-01-12T23:07:30Z, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, container_name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Feb 23 08:57:06 np0005626463.localdomain podman[105545]: 2026-02-23 08:57:06.041901977 +0000 UTC m=+0.199703578 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, summary=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, architecture=x86_64, io.openshift.expose-services=, version=17.1.13, release=1766032510, name=rhosp-rhel9/openstack-cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-cron-container, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, vendor=Red Hat, Inc., url=https://www.redhat.com, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, build-date=2026-01-12T22:10:15Z, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee)
Feb 23 08:57:06 np0005626463.localdomain podman[105551]: 2026-02-23 08:57:06.048183263 +0000 UTC m=+0.200804773 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, distribution-scope=public, io.buildah.version=1.41.5, managed_by=tripleo_ansible, architecture=x86_64, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-nova-compute-container, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, version=17.1.13, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, config_id=tripleo_step5, vcs-type=git, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_compute, build-date=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=)
Feb 23 08:57:06 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.
Feb 23 08:57:06 np0005626463.localdomain systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully.
Feb 23 08:57:06 np0005626463.localdomain systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Deactivated successfully.
Feb 23 08:57:06 np0005626463.localdomain systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully.
Feb 23 08:57:06 np0005626463.localdomain podman[105542]: 2026-02-23 08:57:06.090940447 +0000 UTC m=+0.258619323 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, name=rhosp-rhel9/openstack-iscsid, vendor=Red Hat, Inc., vcs-ref=705339545363fec600102567c4e923938e0f43b3, architecture=x86_64, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, release=1766032510, batch=17.1_20260112.1, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.buildah.version=1.41.5, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, config_id=tripleo_step3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', 
'/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.expose-services=, com.redhat.component=openstack-iscsid-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, maintainer=OpenStack TripleO Team, vcs-type=git, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=iscsid, build-date=2026-01-12T22:34:43Z, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 23 08:57:06 np0005626463.localdomain podman[105544]: 2026-02-23 08:57:06.094982582 +0000 UTC m=+0.255215828 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, release=1766032510, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, version=17.1.13, batch=17.1_20260112.1, vcs-type=git, name=rhosp-rhel9/openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.openshift.expose-services=, io.buildah.version=1.41.5, distribution-scope=public, com.redhat.component=openstack-ceilometer-ipmi-container, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T23:07:30Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, url=https://www.redhat.com, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, maintainer=OpenStack TripleO Team, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vendor=Red Hat, Inc.)
Feb 23 08:57:06 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.
Feb 23 08:57:06 np0005626463.localdomain systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully.
Feb 23 08:57:06 np0005626463.localdomain podman[105542]: 2026-02-23 08:57:06.151810833 +0000 UTC m=+0.319489719 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, name=rhosp-rhel9/openstack-iscsid, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, build-date=2026-01-12T22:34:43Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.expose-services=, vcs-type=git, distribution-scope=public, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, config_id=tripleo_step3, container_name=iscsid, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red 
Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, url=https://www.redhat.com, vcs-ref=705339545363fec600102567c4e923938e0f43b3, architecture=x86_64, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:34:43Z)
Feb 23 08:57:06 np0005626463.localdomain podman[105649]: 2026-02-23 08:57:06.162812844 +0000 UTC m=+0.092957242 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:56:19Z, version=17.1.13, io.openshift.expose-services=, distribution-scope=public, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, architecture=x86_64, url=https://www.redhat.com, managed_by=tripleo_ansible, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, tcib_managed=true, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T22:56:19Z, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Feb 23 08:57:06 np0005626463.localdomain systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully.
Feb 23 08:57:06 np0005626463.localdomain podman[105649]: 2026-02-23 08:57:06.181177833 +0000 UTC m=+0.111322201 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, release=1766032510, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, managed_by=tripleo_ansible, container_name=ovn_metadata_agent, io.openshift.expose-services=, vcs-type=git, distribution-scope=public, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, build-date=2026-01-12T22:56:19Z, batch=17.1_20260112.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.13, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, tcib_managed=true)
Feb 23 08:57:06 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.
Feb 23 08:57:06 np0005626463.localdomain podman[105649]: unhealthy
Feb 23 08:57:06 np0005626463.localdomain systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Main process exited, code=exited, status=1/FAILURE
Feb 23 08:57:06 np0005626463.localdomain systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Failed with result 'exit-code'.
Feb 23 08:57:06 np0005626463.localdomain podman[105669]: 2026-02-23 08:57:06.247524049 +0000 UTC m=+0.129297467 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, vendor=Red Hat, Inc., org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.component=openstack-ovn-controller-container, summary=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.openshift.expose-services=, build-date=2026-01-12T22:36:40Z, org.opencontainers.image.created=2026-01-12T22:36:40Z, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, release=1766032510, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': 
['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, name=rhosp-rhel9/openstack-ovn-controller)
Feb 23 08:57:06 np0005626463.localdomain podman[105669]: 2026-02-23 08:57:06.26661345 +0000 UTC m=+0.148386858 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, url=https://www.redhat.com, build-date=2026-01-12T22:36:40Z, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, vcs-type=git, io.openshift.expose-services=, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.41.5, container_name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T22:36:40Z, vendor=Red Hat, Inc., 
org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, name=rhosp-rhel9/openstack-ovn-controller)
Feb 23 08:57:06 np0005626463.localdomain podman[105669]: unhealthy
Feb 23 08:57:06 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Main process exited, code=exited, status=1/FAILURE
Feb 23 08:57:06 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Failed with result 'exit-code'.
Feb 23 08:57:06 np0005626463.localdomain podman[105689]: 2026-02-23 08:57:06.342183552 +0000 UTC m=+0.143934171 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, release=1766032510, io.buildah.version=1.41.5, architecture=x86_64, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., build-date=2026-01-12T22:10:14Z, tcib_managed=true, io.openshift.expose-services=, name=rhosp-rhel9/openstack-qdrouterd, 
vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:10:14Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd)
Feb 23 08:57:06 np0005626463.localdomain podman[105689]: 2026-02-23 08:57:06.573469147 +0000 UTC m=+0.375219806 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vendor=Red Hat, Inc., vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:10:14Z, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, architecture=x86_64, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, config_id=tripleo_step1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', 
'/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, batch=17.1_20260112.1, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, release=1766032510, io.openshift.expose-services=, vcs-type=git, managed_by=tripleo_ansible, version=17.1.13, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 qdrouterd)
Feb 23 08:57:06 np0005626463.localdomain systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully.
Feb 23 08:57:06 np0005626463.localdomain systemd[1]: tmp-crun.v8tlsG.mount: Deactivated successfully.
Feb 23 08:57:07 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.
Feb 23 08:57:07 np0005626463.localdomain podman[105728]: 2026-02-23 08:57:07.938095217 +0000 UTC m=+0.108559253 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.buildah.version=1.41.5, managed_by=tripleo_ansible, tcib_managed=true, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, release=1766032510, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, build-date=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vcs-type=git, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, name=rhosp-rhel9/openstack-nova-compute, architecture=x86_64, distribution-scope=public, org.opencontainers.image.created=2026-01-12T23:32:04Z, container_name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 23 08:57:08 np0005626463.localdomain podman[105728]: 2026-02-23 08:57:08.354564671 +0000 UTC m=+0.525028657 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step4, architecture=x86_64, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, tcib_managed=true, release=1766032510, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, distribution-scope=public, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20260112.1, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-nova-compute)
Feb 23 08:57:08 np0005626463.localdomain systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully.
Feb 23 08:57:13 np0005626463.localdomain sshd[105751]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:57:14 np0005626463.localdomain sshd[105751]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:57:17 np0005626463.localdomain sudo[105753]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 08:57:17 np0005626463.localdomain sudo[105753]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:57:17 np0005626463.localdomain sudo[105753]: pam_unix(sudo:session): session closed for user root
Feb 23 08:57:17 np0005626463.localdomain sudo[105768]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 23 08:57:17 np0005626463.localdomain sudo[105768]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:57:18 np0005626463.localdomain sudo[105768]: pam_unix(sudo:session): session closed for user root
Feb 23 08:57:18 np0005626463.localdomain sudo[105814]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 08:57:18 np0005626463.localdomain sudo[105814]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:57:18 np0005626463.localdomain sudo[105814]: pam_unix(sudo:session): session closed for user root
Feb 23 08:57:28 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.
Feb 23 08:57:28 np0005626463.localdomain podman[105829]: 2026-02-23 08:57:28.924079807 +0000 UTC m=+0.096127610 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, name=rhosp-rhel9/openstack-collectd, io.buildah.version=1.41.5, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vcs-type=git, build-date=2026-01-12T22:10:15Z, com.redhat.component=openstack-collectd-container, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step3, version=17.1.13, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vendor=Red Hat, Inc., managed_by=tripleo_ansible)
Feb 23 08:57:28 np0005626463.localdomain podman[105829]: 2026-02-23 08:57:28.966316245 +0000 UTC m=+0.138363998 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, release=1766032510, io.openshift.expose-services=, build-date=2026-01-12T22:10:15Z, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, managed_by=tripleo_ansible, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, distribution-scope=public, container_name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5)
Feb 23 08:57:28 np0005626463.localdomain systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully.
Feb 23 08:57:31 np0005626463.localdomain sshd[105849]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:57:33 np0005626463.localdomain sshd[105849]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:57:36 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.
Feb 23 08:57:36 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.
Feb 23 08:57:36 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.
Feb 23 08:57:36 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.
Feb 23 08:57:36 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.
Feb 23 08:57:36 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.
Feb 23 08:57:36 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.
Feb 23 08:57:36 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.
Feb 23 08:57:36 np0005626463.localdomain systemd[1]: tmp-crun.fiXAnD.mount: Deactivated successfully.
Feb 23 08:57:36 np0005626463.localdomain podman[105852]: 2026-02-23 08:57:36.934510365 +0000 UTC m=+0.101080152 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, name=rhosp-rhel9/openstack-iscsid, vcs-type=git, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, build-date=2026-01-12T22:34:43Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., container_name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, managed_by=tripleo_ansible, batch=17.1_20260112.1, version=17.1.13, url=https://www.redhat.com, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step3, com.redhat.component=openstack-iscsid-container, vcs-ref=705339545363fec600102567c4e923938e0f43b3, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.buildah.version=1.41.5)
Feb 23 08:57:36 np0005626463.localdomain systemd[1]: tmp-crun.Ur6jRp.mount: Deactivated successfully.
Feb 23 08:57:36 np0005626463.localdomain podman[105866]: 2026-02-23 08:57:36.990378226 +0000 UTC m=+0.139897275 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:10:15Z, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-cron-container, io.openshift.expose-services=, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, container_name=logrotate_crond, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, architecture=x86_64, release=1766032510, build-date=2026-01-12T22:10:15Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, config_id=tripleo_step4, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-type=git, io.buildah.version=1.41.5, url=https://www.redhat.com, distribution-scope=public)
Feb 23 08:57:37 np0005626463.localdomain podman[105856]: 2026-02-23 08:57:37.010239352 +0000 UTC m=+0.156823740 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, build-date=2026-01-12T22:56:19Z, org.opencontainers.image.created=2026-01-12T22:56:19Z, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.13, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, container_name=ovn_metadata_agent, vendor=Red Hat, Inc., distribution-scope=public, release=1766032510, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, batch=17.1_20260112.1, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, tcib_managed=true, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']})
Feb 23 08:57:37 np0005626463.localdomain podman[105860]: 2026-02-23 08:57:36.964696831 +0000 UTC m=+0.117429210 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, org.opencontainers.image.created=2026-01-12T23:07:30Z, distribution-scope=public, name=rhosp-rhel9/openstack-ceilometer-ipmi, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., io.buildah.version=1.41.5, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:30Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
release=1766032510, tcib_managed=true, managed_by=tripleo_ansible, container_name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, version=17.1.13)
Feb 23 08:57:37 np0005626463.localdomain podman[105852]: 2026-02-23 08:57:37.018393124 +0000 UTC m=+0.184962941 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.created=2026-01-12T22:34:43Z, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, build-date=2026-01-12T22:34:43Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 
iscsid, vcs-type=git, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., io.buildah.version=1.41.5, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, container_name=iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, com.redhat.component=openstack-iscsid-container, tcib_managed=true, version=17.1.13, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, name=rhosp-rhel9/openstack-iscsid)
Feb 23 08:57:37 np0005626463.localdomain systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully.
Feb 23 08:57:37 np0005626463.localdomain podman[105851]: 2026-02-23 08:57:36.981275185 +0000 UTC m=+0.152571109 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, batch=17.1_20260112.1, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.component=openstack-ovn-controller-container, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., version=17.1.13, container_name=ovn_controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, architecture=x86_64, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, 
build-date=2026-01-12T22:36:40Z, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:36:40Z, vcs-type=git, name=rhosp-rhel9/openstack-ovn-controller, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4)
Feb 23 08:57:37 np0005626463.localdomain podman[105860]: 2026-02-23 08:57:37.04373863 +0000 UTC m=+0.196471089 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, version=17.1.13, config_id=tripleo_step4, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.expose-services=, managed_by=tripleo_ansible, container_name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, 
tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-type=git, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1766032510, org.opencontainers.image.created=2026-01-12T23:07:30Z, build-date=2026-01-12T23:07:30Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 23 08:57:37 np0005626463.localdomain systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully.
Feb 23 08:57:37 np0005626463.localdomain podman[105851]: 2026-02-23 08:57:37.060247311 +0000 UTC m=+0.231543225 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-ovn-controller, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, build-date=2026-01-12T22:36:40Z, org.opencontainers.image.created=2026-01-12T22:36:40Z, vendor=Red Hat, Inc., org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, maintainer=OpenStack TripleO Team, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, 
com.redhat.component=openstack-ovn-controller-container, description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1766032510, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, container_name=ovn_controller, distribution-scope=public, batch=17.1_20260112.1)
Feb 23 08:57:37 np0005626463.localdomain podman[105851]: unhealthy
Feb 23 08:57:37 np0005626463.localdomain podman[105866]: 2026-02-23 08:57:37.073551754 +0000 UTC m=+0.223070803 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=logrotate_crond, config_id=tripleo_step4, vendor=Red Hat, Inc., com.redhat.component=openstack-cron-container, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20260112.1, managed_by=tripleo_ansible, url=https://www.redhat.com, version=17.1.13, name=rhosp-rhel9/openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-type=git, release=1766032510, build-date=2026-01-12T22:10:15Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, architecture=x86_64, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public)
Feb 23 08:57:37 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Main process exited, code=exited, status=1/FAILURE
Feb 23 08:57:37 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Failed with result 'exit-code'.
Feb 23 08:57:37 np0005626463.localdomain systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully.
Feb 23 08:57:37 np0005626463.localdomain podman[105856]: 2026-02-23 08:57:37.098419094 +0000 UTC m=+0.245003532 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, tcib_managed=true, distribution-scope=public, architecture=x86_64, vendor=Red Hat, Inc., io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, container_name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, version=17.1.13, config_id=tripleo_step4, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:56:19Z, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, build-date=2026-01-12T22:56:19Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f)
Feb 23 08:57:37 np0005626463.localdomain podman[105856]: unhealthy
Feb 23 08:57:37 np0005626463.localdomain systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Main process exited, code=exited, status=1/FAILURE
Feb 23 08:57:37 np0005626463.localdomain systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Failed with result 'exit-code'.
Feb 23 08:57:37 np0005626463.localdomain podman[105873]: 2026-02-23 08:57:37.119842737 +0000 UTC m=+0.252710520 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, name=rhosp-rhel9/openstack-qdrouterd, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:10:14Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-qdrouterd-container, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, config_id=tripleo_step1, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, build-date=2026-01-12T22:10:14Z, io.buildah.version=1.41.5, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, io.openshift.expose-services=)
Feb 23 08:57:37 np0005626463.localdomain podman[105872]: 2026-02-23 08:57:37.142468618 +0000 UTC m=+0.294578357 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2026-01-12T23:32:04Z, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, config_id=tripleo_step5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, managed_by=tripleo_ansible, distribution-scope=public, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, container_name=nova_compute, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, version=17.1.13, org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe)
Feb 23 08:57:37 np0005626463.localdomain podman[105872]: 2026-02-23 08:57:37.201222459 +0000 UTC m=+0.353332248 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, maintainer=OpenStack TripleO Team, release=1766032510, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, build-date=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, batch=17.1_20260112.1, config_id=tripleo_step5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, container_name=nova_compute, tcib_managed=true, vcs-type=git, url=https://www.redhat.com)
Feb 23 08:57:37 np0005626463.localdomain systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Deactivated successfully.
Feb 23 08:57:37 np0005626463.localdomain podman[105853]: 2026-02-23 08:57:37.203795169 +0000 UTC m=+0.367737485 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp-rhel9/openstack-ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1, config_id=tripleo_step4, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_compute, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, release=1766032510, com.redhat.component=openstack-ceilometer-compute-container, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.5, build-date=2026-01-12T23:07:47Z, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, architecture=x86_64, version=17.1.13, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc.)
Feb 23 08:57:37 np0005626463.localdomain podman[105853]: 2026-02-23 08:57:37.282495867 +0000 UTC m=+0.446438153 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, name=rhosp-rhel9/openstack-ceilometer-compute, io.openshift.expose-services=, batch=17.1_20260112.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, build-date=2026-01-12T23:07:47Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, 
org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_compute, distribution-scope=public, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.5, config_id=tripleo_step4, release=1766032510, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, version=17.1.13, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, managed_by=tripleo_ansible, url=https://www.redhat.com, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 23 08:57:37 np0005626463.localdomain systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully.
Feb 23 08:57:37 np0005626463.localdomain podman[105873]: 2026-02-23 08:57:37.361237746 +0000 UTC m=+0.494105609 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vendor=Red Hat, Inc., config_id=tripleo_step1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, container_name=metrics_qdr, release=1766032510, tcib_managed=true, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, io.openshift.expose-services=, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:14Z, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, vcs-type=git, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, build-date=2026-01-12T22:10:14Z, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee)
Feb 23 08:57:37 np0005626463.localdomain systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully.
Feb 23 08:57:38 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.
Feb 23 08:57:38 np0005626463.localdomain podman[106029]: 2026-02-23 08:57:38.906483123 +0000 UTC m=+0.082293570 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:32:04Z, release=1766032510, batch=17.1_20260112.1, container_name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, io.openshift.expose-services=, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, vcs-type=git, build-date=2026-01-12T23:32:04Z, tcib_managed=true, distribution-scope=public, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, managed_by=tripleo_ansible, architecture=x86_64, name=rhosp-rhel9/openstack-nova-compute, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']})
Feb 23 08:57:39 np0005626463.localdomain podman[106029]: 2026-02-23 08:57:39.312833993 +0000 UTC m=+0.488644400 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, url=https://www.redhat.com, name=rhosp-rhel9/openstack-nova-compute, build-date=2026-01-12T23:32:04Z, vcs-type=git, io.openshift.expose-services=, distribution-scope=public, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, release=1766032510, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.5, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team)
Feb 23 08:57:39 np0005626463.localdomain systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully.
Feb 23 08:57:52 np0005626463.localdomain sshd[106053]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:57:52 np0005626463.localdomain sshd[106053]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:57:59 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.
Feb 23 08:57:59 np0005626463.localdomain podman[106055]: 2026-02-23 08:57:59.928612145 +0000 UTC m=+0.097602145 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, managed_by=tripleo_ansible, release=1766032510, container_name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20260112.1, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step3, build-date=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-collectd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.buildah.version=1.41.5, io.openshift.expose-services=, distribution-scope=public, name=rhosp-rhel9/openstack-collectd, version=17.1.13, architecture=x86_64, vendor=Red Hat, Inc.)
Feb 23 08:57:59 np0005626463.localdomain podman[106055]: 2026-02-23 08:57:59.941105172 +0000 UTC m=+0.110095152 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', 
'/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, summary=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, com.redhat.component=openstack-collectd-container, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, vendor=Red Hat, Inc., config_id=tripleo_step3, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:10:15Z, container_name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, managed_by=tripleo_ansible, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 23 08:57:59 np0005626463.localdomain systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully.
Feb 23 08:58:07 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.
Feb 23 08:58:07 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.
Feb 23 08:58:07 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.
Feb 23 08:58:07 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.
Feb 23 08:58:07 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.
Feb 23 08:58:07 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.
Feb 23 08:58:07 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.
Feb 23 08:58:07 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.
Feb 23 08:58:07 np0005626463.localdomain podman[106084]: 2026-02-23 08:58:07.951362475 +0000 UTC m=+0.108814622 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, managed_by=tripleo_ansible, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=ceilometer_agent_ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, 
com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, config_id=tripleo_step4, vcs-type=git, vendor=Red Hat, Inc., build-date=2026-01-12T23:07:30Z, io.buildah.version=1.41.5, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.openshift.expose-services=, name=rhosp-rhel9/openstack-ceilometer-ipmi)
Feb 23 08:58:07 np0005626463.localdomain podman[106076]: 2026-02-23 08:58:07.932826801 +0000 UTC m=+0.104117537 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, container_name=iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, name=rhosp-rhel9/openstack-iscsid, com.redhat.component=openstack-iscsid-container, io.buildah.version=1.41.5, vendor=Red Hat, Inc., build-date=2026-01-12T22:34:43Z, managed_by=tripleo_ansible, config_id=tripleo_step3, batch=17.1_20260112.1, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, tcib_managed=true, vcs-ref=705339545363fec600102567c4e923938e0f43b3, version=17.1.13, distribution-scope=public, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64)
Feb 23 08:58:07 np0005626463.localdomain podman[106077]: 2026-02-23 08:58:07.995489223 +0000 UTC m=+0.161828735 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-ceilometer-compute, tcib_managed=true, build-date=2026-01-12T23:07:47Z, io.openshift.expose-services=, distribution-scope=public, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, architecture=x86_64, release=1766032510, batch=17.1_20260112.1, vcs-type=git, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.created=2026-01-12T23:07:47Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 
'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']})
Feb 23 08:58:08 np0005626463.localdomain podman[106084]: 2026-02-23 08:58:08.004618515 +0000 UTC m=+0.162070652 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, managed_by=tripleo_ansible, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, release=1766032510, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, maintainer=OpenStack TripleO Team, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, vcs-type=git, vendor=Red 
Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, config_id=tripleo_step4, io.buildah.version=1.41.5, com.redhat.component=openstack-ceilometer-ipmi-container, container_name=ceilometer_agent_ipmi, batch=17.1_20260112.1, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T23:07:30Z, org.opencontainers.image.created=2026-01-12T23:07:30Z)
Feb 23 08:58:08 np0005626463.localdomain podman[106076]: 2026-02-23 08:58:08.011889551 +0000 UTC m=+0.183180267 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, config_id=tripleo_step3, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, url=https://www.redhat.com, architecture=x86_64, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., build-date=2026-01-12T22:34:43Z, vcs-ref=705339545363fec600102567c4e923938e0f43b3, container_name=iscsid, managed_by=tripleo_ansible, 
org.opencontainers.image.created=2026-01-12T22:34:43Z, io.openshift.expose-services=, vcs-type=git, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-iscsid-container, distribution-scope=public, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid)
Feb 23 08:58:08 np0005626463.localdomain systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully.
Feb 23 08:58:08 np0005626463.localdomain systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully.
Feb 23 08:58:08 np0005626463.localdomain podman[106090]: 2026-02-23 08:58:08.073230211 +0000 UTC m=+0.206275812 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, batch=17.1_20260112.1, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.5, container_name=logrotate_crond, managed_by=tripleo_ansible, com.redhat.component=openstack-cron-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, name=rhosp-rhel9/openstack-cron, distribution-scope=public, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, release=1766032510, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z, tcib_managed=true, version=17.1.13, config_id=tripleo_step4)
Feb 23 08:58:08 np0005626463.localdomain podman[106078]: 2026-02-23 08:58:08.084073527 +0000 UTC m=+0.245045793 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, vcs-type=git, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, 
io.buildah.version=1.41.5, tcib_managed=true, build-date=2026-01-12T22:56:19Z, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, container_name=ovn_metadata_agent, architecture=x86_64, vendor=Red Hat, Inc., release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, batch=17.1_20260112.1, distribution-scope=public, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 23 08:58:08 np0005626463.localdomain podman[106090]: 2026-02-23 08:58:08.088192695 +0000 UTC m=+0.221238256 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.buildah.version=1.41.5, vendor=Red Hat, Inc., batch=17.1_20260112.1, managed_by=tripleo_ansible, version=17.1.13, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, container_name=logrotate_crond, org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.component=openstack-cron-container, name=rhosp-rhel9/openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, build-date=2026-01-12T22:10:15Z, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, io.openshift.expose-services=, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 cron)
Feb 23 08:58:08 np0005626463.localdomain systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully.
Feb 23 08:58:08 np0005626463.localdomain podman[106078]: 2026-02-23 08:58:08.12935064 +0000 UTC m=+0.290322936 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, batch=17.1_20260112.1, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:56:19Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, architecture=x86_64, version=17.1.13, vcs-type=git, url=https://www.redhat.com, vendor=Red Hat, Inc., tcib_managed=true, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.buildah.version=1.41.5, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn)
Feb 23 08:58:08 np0005626463.localdomain podman[106078]: unhealthy
Feb 23 08:58:08 np0005626463.localdomain systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Main process exited, code=exited, status=1/FAILURE
Feb 23 08:58:08 np0005626463.localdomain systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Failed with result 'exit-code'.
Feb 23 08:58:08 np0005626463.localdomain podman[106075]: 2026-02-23 08:58:08.146807411 +0000 UTC m=+0.320255244 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, architecture=x86_64, io.openshift.expose-services=, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.5, release=1766032510, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, build-date=2026-01-12T22:36:40Z, url=https://www.redhat.com, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, name=rhosp-rhel9/openstack-ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:36:40Z, com.redhat.component=openstack-ovn-controller-container, 
managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, distribution-scope=public, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc.)
Feb 23 08:58:08 np0005626463.localdomain podman[106075]: 2026-02-23 08:58:08.189435672 +0000 UTC m=+0.362883505 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, managed_by=tripleo_ansible, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., tcib_managed=true, com.redhat.component=openstack-ovn-controller-container, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, batch=17.1_20260112.1, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, version=17.1.13, io.openshift.expose-services=, name=rhosp-rhel9/openstack-ovn-controller, config_id=tripleo_step4, build-date=2026-01-12T22:36:40Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:36:40Z, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64)
Feb 23 08:58:08 np0005626463.localdomain podman[106075]: unhealthy
Feb 23 08:58:08 np0005626463.localdomain podman[106098]: 2026-02-23 08:58:08.19908506 +0000 UTC m=+0.342259905 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step1, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, architecture=x86_64, distribution-scope=public, release=1766032510, io.openshift.expose-services=, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, build-date=2026-01-12T22:10:14Z, version=17.1.13, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']})
Feb 23 08:58:08 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Main process exited, code=exited, status=1/FAILURE
Feb 23 08:58:08 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Failed with result 'exit-code'.
Feb 23 08:58:08 np0005626463.localdomain podman[106096]: 2026-02-23 08:58:08.247821401 +0000 UTC m=+0.399889721 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', 
'/var/lib/nova:/var/lib/nova:shared']}, version=17.1.13, distribution-scope=public, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, container_name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, release=1766032510, url=https://www.redhat.com, build-date=2026-01-12T23:32:04Z, name=rhosp-rhel9/openstack-nova-compute, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, batch=17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe)
Feb 23 08:58:08 np0005626463.localdomain podman[106077]: 2026-02-23 08:58:08.276747347 +0000 UTC m=+0.443086849 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, managed_by=tripleo_ansible, distribution-scope=public, config_id=tripleo_step4, release=1766032510, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, tcib_managed=true, vendor=Red Hat, Inc., io.openshift.expose-services=, build-date=2026-01-12T23:07:47Z, com.redhat.component=openstack-ceilometer-compute-container, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_compute, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, vcs-type=git, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.13, org.opencontainers.image.created=2026-01-12T23:07:47Z, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Feb 23 08:58:08 np0005626463.localdomain systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully.
Feb 23 08:58:08 np0005626463.localdomain podman[106096]: 2026-02-23 08:58:08.297351155 +0000 UTC m=+0.449419505 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, build-date=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, io.buildah.version=1.41.5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', 
'/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, url=https://www.redhat.com, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, container_name=nova_compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, config_id=tripleo_step5, org.opencontainers.image.created=2026-01-12T23:32:04Z, name=rhosp-rhel9/openstack-nova-compute, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true)
Feb 23 08:58:08 np0005626463.localdomain podman[106096]: unhealthy
Feb 23 08:58:08 np0005626463.localdomain systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Main process exited, code=exited, status=1/FAILURE
Feb 23 08:58:08 np0005626463.localdomain systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Failed with result 'exit-code'.
Feb 23 08:58:08 np0005626463.localdomain podman[106098]: 2026-02-23 08:58:08.414426742 +0000 UTC m=+0.557601617 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, maintainer=OpenStack TripleO Team, container_name=metrics_qdr, org.opencontainers.image.created=2026-01-12T22:10:14Z, config_id=tripleo_step1, architecture=x86_64, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:14Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., vcs-type=git, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13)
Feb 23 08:58:08 np0005626463.localdomain systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully.
Feb 23 08:58:09 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.
Feb 23 08:58:09 np0005626463.localdomain podman[106256]: 2026-02-23 08:58:09.911529738 +0000 UTC m=+0.085566993 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, org.opencontainers.image.created=2026-01-12T23:32:04Z, build-date=2026-01-12T23:32:04Z, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, 
description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, vendor=Red Hat, Inc., managed_by=tripleo_ansible, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.5, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, version=17.1.13, config_id=tripleo_step4)
Feb 23 08:58:10 np0005626463.localdomain podman[106256]: 2026-02-23 08:58:10.329451026 +0000 UTC m=+0.503488221 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, url=https://www.redhat.com, build-date=2026-01-12T23:32:04Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, maintainer=OpenStack TripleO Team, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, container_name=nova_migration_target, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:32:04Z, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-nova-compute, version=17.1.13, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, release=1766032510, tcib_managed=true)
Feb 23 08:58:10 np0005626463.localdomain systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully.
Feb 23 08:58:17 np0005626463.localdomain sshd[106281]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:58:18 np0005626463.localdomain sudo[106283]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 08:58:18 np0005626463.localdomain sudo[106283]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:58:18 np0005626463.localdomain sudo[106283]: pam_unix(sudo:session): session closed for user root
Feb 23 08:58:18 np0005626463.localdomain sudo[106298]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 23 08:58:18 np0005626463.localdomain sudo[106298]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:58:19 np0005626463.localdomain sudo[106298]: pam_unix(sudo:session): session closed for user root
Feb 23 08:58:20 np0005626463.localdomain sudo[106344]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 08:58:20 np0005626463.localdomain sudo[106344]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:58:20 np0005626463.localdomain sudo[106344]: pam_unix(sudo:session): session closed for user root
Feb 23 08:58:20 np0005626463.localdomain sshd[106281]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:58:27 np0005626463.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Feb 23 08:58:27 np0005626463.localdomain recover_tripleo_nova_virtqemud[106360]: 61982
Feb 23 08:58:27 np0005626463.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Feb 23 08:58:27 np0005626463.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Feb 23 08:58:29 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62259 DF PROTO=TCP SPT=41348 DPT=9100 SEQ=1915400411 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BD814550000000001030307) 
Feb 23 08:58:30 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.
Feb 23 08:58:30 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62260 DF PROTO=TCP SPT=41348 DPT=9100 SEQ=1915400411 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BD818470000000001030307) 
Feb 23 08:58:30 np0005626463.localdomain systemd[1]: tmp-crun.tDBFJU.mount: Deactivated successfully.
Feb 23 08:58:30 np0005626463.localdomain podman[106361]: 2026-02-23 08:58:30.934922291 +0000 UTC m=+0.108981867 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, tcib_managed=true, version=17.1.13, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, container_name=collectd, config_id=tripleo_step3, name=rhosp-rhel9/openstack-collectd, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, build-date=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-collectd-container, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:10:15Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, io.openshift.expose-services=, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.41.5)
Feb 23 08:58:30 np0005626463.localdomain podman[106361]: 2026-02-23 08:58:30.94832204 +0000 UTC m=+0.122381606 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, name=rhosp-rhel9/openstack-collectd, managed_by=tripleo_ansible, batch=17.1_20260112.1, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, container_name=collectd, vendor=Red Hat, Inc., release=1766032510, description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:10:15Z, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, build-date=2026-01-12T22:10:15Z, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team)
Feb 23 08:58:30 np0005626463.localdomain systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully.
Feb 23 08:58:32 np0005626463.localdomain sshd[106381]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:58:32 np0005626463.localdomain sshd[106381]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:58:32 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62261 DF PROTO=TCP SPT=41348 DPT=9100 SEQ=1915400411 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BD820460000000001030307) 
Feb 23 08:58:33 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=428 DF PROTO=TCP SPT=53014 DPT=9101 SEQ=3564720691 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BD8213B0000000001030307) 
Feb 23 08:58:34 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=429 DF PROTO=TCP SPT=53014 DPT=9101 SEQ=3564720691 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BD825460000000001030307) 
Feb 23 08:58:36 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=430 DF PROTO=TCP SPT=53014 DPT=9101 SEQ=3564720691 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BD82D460000000001030307) 
Feb 23 08:58:36 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62262 DF PROTO=TCP SPT=41348 DPT=9100 SEQ=1915400411 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BD830070000000001030307) 
Feb 23 08:58:38 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.
Feb 23 08:58:38 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.
Feb 23 08:58:38 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.
Feb 23 08:58:38 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.
Feb 23 08:58:38 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.
Feb 23 08:58:38 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.
Feb 23 08:58:38 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.
Feb 23 08:58:38 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.
Feb 23 08:58:38 np0005626463.localdomain systemd[1]: tmp-crun.6daTq0.mount: Deactivated successfully.
Feb 23 08:58:38 np0005626463.localdomain podman[106384]: 2026-02-23 08:58:38.965639034 +0000 UTC m=+0.129440785 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, com.redhat.component=openstack-iscsid-container, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., architecture=x86_64, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:34:43Z, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, vcs-type=git, vcs-ref=705339545363fec600102567c4e923938e0f43b3, name=rhosp-rhel9/openstack-iscsid, container_name=iscsid, build-date=2026-01-12T22:34:43Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, batch=17.1_20260112.1, url=https://www.redhat.com)
Feb 23 08:58:39 np0005626463.localdomain systemd[1]: tmp-crun.KAmkbo.mount: Deactivated successfully.
Feb 23 08:58:39 np0005626463.localdomain podman[106405]: 2026-02-23 08:58:39.011669492 +0000 UTC m=+0.150003957 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.buildah.version=1.41.5, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, config_id=tripleo_step5, url=https://www.redhat.com, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, name=rhosp-rhel9/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, container_name=nova_compute, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, distribution-scope=public)
Feb 23 08:58:39 np0005626463.localdomain podman[106383]: 2026-02-23 08:58:39.035948201 +0000 UTC m=+0.200005180 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, org.opencontainers.image.created=2026-01-12T22:36:40Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2026-01-12T22:36:40Z, distribution-scope=public, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, managed_by=tripleo_ansible, container_name=ovn_controller, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-ovn-controller, vendor=Red Hat, Inc., release=1766032510, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, 
url=https://www.redhat.com, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c)
Feb 23 08:58:39 np0005626463.localdomain podman[106395]: 2026-02-23 08:58:39.052925932 +0000 UTC m=+0.191222066 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:07:30Z, distribution-scope=public, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-ipmi-container, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, config_id=tripleo_step4, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T23:07:30Z, managed_by=tripleo_ansible, container_name=ceilometer_agent_ipmi, io.buildah.version=1.41.5, architecture=x86_64, vcs-type=git, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team)
Feb 23 08:58:39 np0005626463.localdomain podman[106386]: 2026-02-23 08:58:39.096993869 +0000 UTC m=+0.249433115 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, release=1766032510, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, url=https://www.redhat.com, version=17.1.13, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, batch=17.1_20260112.1, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, io.buildah.version=1.41.5, container_name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:56:19Z, vcs-type=git, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 23 08:58:39 np0005626463.localdomain podman[106395]: 2026-02-23 08:58:39.09960138 +0000 UTC m=+0.237897484 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-ipmi-container, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.13, config_id=tripleo_step4, url=https://www.redhat.com, architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:07:30Z, 
org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.buildah.version=1.41.5, build-date=2026-01-12T23:07:30Z, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, distribution-scope=public, tcib_managed=true, vendor=Red Hat, Inc.)
Feb 23 08:58:39 np0005626463.localdomain systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully.
Feb 23 08:58:39 np0005626463.localdomain podman[106386]: 2026-02-23 08:58:39.114219367 +0000 UTC m=+0.266658623 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, build-date=2026-01-12T22:56:19Z, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, architecture=x86_64, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-type=git, url=https://www.redhat.com, version=17.1.13, container_name=ovn_metadata_agent, org.opencontainers.image.created=2026-01-12T22:56:19Z, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, tcib_managed=true)
Feb 23 08:58:39 np0005626463.localdomain podman[106386]: unhealthy
Feb 23 08:58:39 np0005626463.localdomain podman[106405]: 2026-02-23 08:58:39.125898912 +0000 UTC m=+0.264233397 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, container_name=nova_compute, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, config_id=tripleo_step5, name=rhosp-rhel9/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, batch=17.1_20260112.1, version=17.1.13, tcib_managed=true, architecture=x86_64, distribution-scope=public, io.buildah.version=1.41.5, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:32:04Z, build-date=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 
'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-nova-compute-container)
Feb 23 08:58:39 np0005626463.localdomain systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Main process exited, code=exited, status=1/FAILURE
Feb 23 08:58:39 np0005626463.localdomain systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Failed with result 'exit-code'.
Feb 23 08:58:39 np0005626463.localdomain podman[106405]: unhealthy
Feb 23 08:58:39 np0005626463.localdomain systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Main process exited, code=exited, status=1/FAILURE
Feb 23 08:58:39 np0005626463.localdomain systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Failed with result 'exit-code'.
Feb 23 08:58:39 np0005626463.localdomain podman[106404]: 2026-02-23 08:58:39.103976116 +0000 UTC m=+0.247531055 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-cron-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, architecture=x86_64, io.openshift.expose-services=, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, name=rhosp-rhel9/openstack-cron, url=https://www.redhat.com, 
build-date=2026-01-12T22:10:15Z, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, container_name=logrotate_crond, tcib_managed=true, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 23 08:58:39 np0005626463.localdomain podman[106383]: 2026-02-23 08:58:39.169603388 +0000 UTC m=+0.333660427 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.component=openstack-ovn-controller-container, build-date=2026-01-12T22:36:40Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, url=https://www.redhat.com, architecture=x86_64, release=1766032510, container_name=ovn_controller, name=rhosp-rhel9/openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, batch=17.1_20260112.1, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vendor=Red Hat, Inc., config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.openshift.expose-services=, vcs-type=git, version=17.1.13)
Feb 23 08:58:39 np0005626463.localdomain podman[106383]: unhealthy
Feb 23 08:58:39 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Main process exited, code=exited, status=1/FAILURE
Feb 23 08:58:39 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Failed with result 'exit-code'.
Feb 23 08:58:39 np0005626463.localdomain podman[106385]: 2026-02-23 08:58:39.21447974 +0000 UTC m=+0.376299000 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, name=rhosp-rhel9/openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, batch=17.1_20260112.1, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vendor=Red Hat, Inc., container_name=ceilometer_agent_compute, managed_by=tripleo_ansible, config_id=tripleo_step4, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, io.buildah.version=1.41.5, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T23:07:47Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, build-date=2026-01-12T23:07:47Z)
Feb 23 08:58:39 np0005626463.localdomain podman[106384]: 2026-02-23 08:58:39.225414052 +0000 UTC m=+0.389215823 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, name=rhosp-rhel9/openstack-iscsid, url=https://www.redhat.com, distribution-scope=public, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, summary=Red Hat OpenStack Platform 
17.1 iscsid, config_id=tripleo_step3, build-date=2026-01-12T22:34:43Z, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, vcs-ref=705339545363fec600102567c4e923938e0f43b3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., container_name=iscsid, managed_by=tripleo_ansible, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, version=17.1.13, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, release=1766032510)
Feb 23 08:58:39 np0005626463.localdomain systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully.
Feb 23 08:58:39 np0005626463.localdomain podman[106404]: 2026-02-23 08:58:39.235323721 +0000 UTC m=+0.378878630 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, config_id=tripleo_step4, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 cron, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, architecture=x86_64, tcib_managed=true, container_name=logrotate_crond, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, url=https://www.redhat.com, com.redhat.component=openstack-cron-container, org.opencontainers.image.created=2026-01-12T22:10:15Z, build-date=2026-01-12T22:10:15Z, vcs-type=git, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 cron)
Feb 23 08:58:39 np0005626463.localdomain podman[106385]: 2026-02-23 08:58:39.25032811 +0000 UTC m=+0.412147380 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vcs-type=git, org.opencontainers.image.created=2026-01-12T23:07:47Z, com.redhat.component=openstack-ceilometer-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, distribution-scope=public, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1766032510, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, name=rhosp-rhel9/openstack-ceilometer-compute, build-date=2026-01-12T23:07:47Z, config_id=tripleo_step4, tcib_managed=true, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.5, container_name=ceilometer_agent_compute, version=17.1.13, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06)
Feb 23 08:58:39 np0005626463.localdomain systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully.
Feb 23 08:58:39 np0005626463.localdomain systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully.
Feb 23 08:58:39 np0005626463.localdomain podman[106411]: 2026-02-23 08:58:39.309146628 +0000 UTC m=+0.442842878 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, architecture=x86_64, url=https://www.redhat.com, release=1766032510, vcs-type=git, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.created=2026-01-12T22:10:14Z, name=rhosp-rhel9/openstack-qdrouterd, vendor=Red Hat, Inc., batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, container_name=metrics_qdr)
Feb 23 08:58:39 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9319 DF PROTO=TCP SPT=43354 DPT=9882 SEQ=1288668895 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BD839920000000001030307) 
Feb 23 08:58:39 np0005626463.localdomain podman[106411]: 2026-02-23 08:58:39.529338588 +0000 UTC m=+0.663034858 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, version=17.1.13, distribution-scope=public, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, tcib_managed=true, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, config_id=tripleo_step1, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-qdrouterd, container_name=metrics_qdr, build-date=2026-01-12T22:10:14Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd)
Feb 23 08:58:39 np0005626463.localdomain systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully.
Feb 23 08:58:40 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=431 DF PROTO=TCP SPT=53014 DPT=9101 SEQ=3564720691 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BD83D060000000001030307) 
Feb 23 08:58:40 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9320 DF PROTO=TCP SPT=43354 DPT=9882 SEQ=1288668895 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BD83D860000000001030307) 
Feb 23 08:58:40 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.
Feb 23 08:58:40 np0005626463.localdomain podman[106557]: 2026-02-23 08:58:40.913091515 +0000 UTC m=+0.083106768 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, container_name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, version=17.1.13, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T23:32:04Z, vendor=Red Hat, Inc., release=1766032510, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, url=https://www.redhat.com, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, name=rhosp-rhel9/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 23 08:58:41 np0005626463.localdomain podman[106557]: 2026-02-23 08:58:41.330822398 +0000 UTC m=+0.500837661 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, distribution-scope=public, release=1766032510, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp-rhel9/openstack-nova-compute, build-date=2026-01-12T23:32:04Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T23:32:04Z, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, io.buildah.version=1.41.5, batch=17.1_20260112.1, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, maintainer=OpenStack TripleO Team)
Feb 23 08:58:41 np0005626463.localdomain systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully.
Feb 23 08:58:42 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9321 DF PROTO=TCP SPT=43354 DPT=9882 SEQ=1288668895 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BD845860000000001030307) 
Feb 23 08:58:43 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2999 DF PROTO=TCP SPT=50600 DPT=9105 SEQ=646796191 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BD849300000000001030307) 
Feb 23 08:58:44 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3000 DF PROTO=TCP SPT=50600 DPT=9105 SEQ=646796191 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BD84D460000000001030307) 
Feb 23 08:58:45 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62263 DF PROTO=TCP SPT=41348 DPT=9100 SEQ=1915400411 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BD850060000000001030307) 
Feb 23 08:58:46 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9322 DF PROTO=TCP SPT=43354 DPT=9882 SEQ=1288668895 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BD855470000000001030307) 
Feb 23 08:58:46 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3001 DF PROTO=TCP SPT=50600 DPT=9105 SEQ=646796191 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BD855470000000001030307) 
Feb 23 08:58:48 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=432 DF PROTO=TCP SPT=53014 DPT=9101 SEQ=3564720691 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BD85E060000000001030307) 
Feb 23 08:58:50 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3002 DF PROTO=TCP SPT=50600 DPT=9105 SEQ=646796191 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BD865060000000001030307) 
Feb 23 08:58:53 np0005626463.localdomain sshd[106581]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:58:53 np0005626463.localdomain sshd[106581]: Accepted publickey for zuul from 192.168.122.30 port 55972 ssh2: RSA SHA256:/ShS2J5Dq7o9P59e/NmgQORSAcJOBwu46Huo03HBdB4
Feb 23 08:58:53 np0005626463.localdomain systemd-logind[759]: New session 36 of user zuul.
Feb 23 08:58:53 np0005626463.localdomain systemd[1]: Started Session 36 of User zuul.
Feb 23 08:58:53 np0005626463.localdomain sshd[106581]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Feb 23 08:58:54 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27699 DF PROTO=TCP SPT=51194 DPT=9102 SEQ=3891827151 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BD8730F0000000001030307) 
Feb 23 08:58:54 np0005626463.localdomain sudo[106674]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zjbcbmexxxfbfomzvivrauwwrsunbcbm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837133.8644934-21-75707095863186/AnsiballZ_stat.py
Feb 23 08:58:54 np0005626463.localdomain sudo[106674]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 08:58:54 np0005626463.localdomain python3.9[106676]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova/nova.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 23 08:58:54 np0005626463.localdomain sudo[106674]: pam_unix(sudo:session): session closed for user root
Feb 23 08:58:54 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9323 DF PROTO=TCP SPT=43354 DPT=9882 SEQ=1288668895 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BD876060000000001030307) 
Feb 23 08:58:55 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27700 DF PROTO=TCP SPT=51194 DPT=9102 SEQ=3891827151 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BD877060000000001030307) 
Feb 23 08:58:55 np0005626463.localdomain sudo[106768]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cythwqfgvetioxpimayuvtbilmatkqos ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837134.719411-57-39596369821422/AnsiballZ_command.py
Feb 23 08:58:55 np0005626463.localdomain sudo[106768]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 08:58:55 np0005626463.localdomain python3.9[106770]: ansible-ansible.legacy.command Invoked with cmd=python3 -c "import configparser as c; p = c.ConfigParser(strict=False); p.read('/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova/nova.conf'); print(p['DEFAULT']['host'])"
                                                             _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 08:58:55 np0005626463.localdomain sudo[106768]: pam_unix(sudo:session): session closed for user root
Feb 23 08:58:55 np0005626463.localdomain sudo[106861]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rxdrmjeipkhjftjovsgmsuvcgdqezetv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837135.6463594-81-4493383421671/AnsiballZ_stat.py
Feb 23 08:58:55 np0005626463.localdomain sudo[106861]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 08:58:56 np0005626463.localdomain python3.9[106863]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated/neutron/etc/neutron/neutron.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 23 08:58:56 np0005626463.localdomain sudo[106861]: pam_unix(sudo:session): session closed for user root
Feb 23 08:58:56 np0005626463.localdomain sudo[106955]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lkbuvdyudfjbtkxqgptrmwisvclaltbf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837136.3026512-105-85709822808741/AnsiballZ_command.py
Feb 23 08:58:56 np0005626463.localdomain sudo[106955]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 08:58:56 np0005626463.localdomain python3.9[106957]: ansible-ansible.legacy.command Invoked with cmd=python3 -c "import configparser as c; p = c.ConfigParser(strict=False); p.read('/var/lib/config-data/puppet-generated/neutron/etc/neutron/neutron.conf'); print(p['DEFAULT']['host'])"
                                                             _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 08:58:56 np0005626463.localdomain sudo[106955]: pam_unix(sudo:session): session closed for user root
Feb 23 08:58:57 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27701 DF PROTO=TCP SPT=51194 DPT=9102 SEQ=3891827151 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BD87F060000000001030307) 
Feb 23 08:58:57 np0005626463.localdomain sudo[107048]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lzgyejygdficnpmkchgbgtedzmexaxrn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837137.0870852-132-141483365199252/AnsiballZ_command.py
Feb 23 08:58:57 np0005626463.localdomain sudo[107048]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 08:58:57 np0005626463.localdomain python3.9[107050]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 08:58:57 np0005626463.localdomain sudo[107048]: pam_unix(sudo:session): session closed for user root
Feb 23 08:58:58 np0005626463.localdomain python3.9[107141]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline
Feb 23 08:58:58 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3003 DF PROTO=TCP SPT=50600 DPT=9105 SEQ=646796191 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BD886060000000001030307) 
Feb 23 08:58:59 np0005626463.localdomain sshd[107156]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:58:59 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21908 DF PROTO=TCP SPT=35058 DPT=9100 SEQ=2710335491 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BD889850000000001030307) 
Feb 23 08:58:59 np0005626463.localdomain sshd[107156]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:59:00 np0005626463.localdomain python3.9[107233]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 23 08:59:00 np0005626463.localdomain python3.9[107325]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile
Feb 23 08:59:00 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21909 DF PROTO=TCP SPT=35058 DPT=9100 SEQ=2710335491 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BD88D860000000001030307) 
Feb 23 08:59:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb 23 08:59:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 4800.1 total, 600.0 interval
                                                          Cumulative writes: 5152 writes, 23K keys, 5152 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s
                                                          Cumulative WAL: 5152 writes, 679 syncs, 7.59 writes per sync, written: 0.02 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                          Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Feb 23 08:59:01 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.
Feb 23 08:59:01 np0005626463.localdomain python3.9[107415]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 23 08:59:01 np0005626463.localdomain systemd[1]: tmp-crun.xulUhf.mount: Deactivated successfully.
Feb 23 08:59:01 np0005626463.localdomain podman[107416]: 2026-02-23 08:59:01.937929132 +0000 UTC m=+0.109271096 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., version=17.1.13, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', 
'/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:10:15Z, url=https://www.redhat.com, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-collectd-container, architecture=x86_64, build-date=2026-01-12T22:10:15Z, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, distribution-scope=public, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, container_name=collectd, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-collectd, vcs-type=git)
Feb 23 08:59:01 np0005626463.localdomain podman[107416]: 2026-02-23 08:59:01.953357974 +0000 UTC m=+0.124699918 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, build-date=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 
17.1 collectd, tcib_managed=true, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-collectd-container, architecture=x86_64, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:15Z, description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, config_id=tripleo_step3, io.buildah.version=1.41.5, distribution-scope=public, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, managed_by=tripleo_ansible, vcs-type=git)
Feb 23 08:59:01 np0005626463.localdomain systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully.
Feb 23 08:59:02 np0005626463.localdomain python3.9[107483]: ansible-ansible.legacy.dnf Invoked with name=['systemd-container'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 23 08:59:02 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21910 DF PROTO=TCP SPT=35058 DPT=9100 SEQ=2710335491 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BD895860000000001030307) 
Feb 23 08:59:03 np0005626463.localdomain sshd[106581]: pam_unix(sshd:session): session closed for user zuul
Feb 23 08:59:03 np0005626463.localdomain systemd[1]: session-36.scope: Deactivated successfully.
Feb 23 08:59:03 np0005626463.localdomain systemd[1]: session-36.scope: Consumed 5.012s CPU time.
Feb 23 08:59:03 np0005626463.localdomain systemd-logind[759]: Session 36 logged out. Waiting for processes to exit.
Feb 23 08:59:03 np0005626463.localdomain systemd-logind[759]: Removed session 36.
Feb 23 08:59:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb 23 08:59:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 4800.1 total, 600.0 interval
                                                          Cumulative writes: 5421 writes, 24K keys, 5421 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s
                                                          Cumulative WAL: 5421 writes, 705 syncs, 7.69 writes per sync, written: 0.02 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                          Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Feb 23 08:59:06 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2465 DF PROTO=TCP SPT=58868 DPT=9101 SEQ=2655477868 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BD8A2860000000001030307) 
Feb 23 08:59:09 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20693 DF PROTO=TCP SPT=46774 DPT=9882 SEQ=522294137 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BD8AEC30000000001030307) 
Feb 23 08:59:09 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.
Feb 23 08:59:09 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.
Feb 23 08:59:09 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.
Feb 23 08:59:09 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.
Feb 23 08:59:09 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.
Feb 23 08:59:09 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.
Feb 23 08:59:09 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.
Feb 23 08:59:09 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.
Feb 23 08:59:09 np0005626463.localdomain podman[107501]: 2026-02-23 08:59:09.926308541 +0000 UTC m=+0.092877533 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, version=17.1.13, build-date=2026-01-12T23:07:47Z, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, url=https://www.redhat.com, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, io.openshift.expose-services=, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_id=tripleo_step4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 
release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, vendor=Red Hat, Inc., architecture=x86_64, container_name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:07:47Z, vcs-type=git, batch=17.1_20260112.1, com.redhat.component=openstack-ceilometer-compute-container, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Feb 23 08:59:09 np0005626463.localdomain systemd[1]: tmp-crun.rcgNOW.mount: Deactivated successfully.
Feb 23 08:59:09 np0005626463.localdomain podman[107528]: 2026-02-23 08:59:09.955140022 +0000 UTC m=+0.101449291 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, managed_by=tripleo_ansible, url=https://www.redhat.com, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:10:14Z, release=1766032510, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:14Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, vendor=Red Hat, Inc., architecture=x86_64, container_name=metrics_qdr, config_id=tripleo_step1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, com.redhat.component=openstack-qdrouterd-container, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, tcib_managed=true, name=rhosp-rhel9/openstack-qdrouterd)
Feb 23 08:59:09 np0005626463.localdomain systemd[1]: tmp-crun.GeTEWM.mount: Deactivated successfully.
Feb 23 08:59:09 np0005626463.localdomain podman[107526]: 2026-02-23 08:59:09.998811317 +0000 UTC m=+0.142834664 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, container_name=nova_compute, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-nova-compute, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, vendor=Red Hat, Inc., build-date=2026-01-12T23:32:04Z, io.buildah.version=1.41.5, batch=17.1_20260112.1, distribution-scope=public, io.openshift.expose-services=, architecture=x86_64, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, vcs-type=git, config_id=tripleo_step5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true)
Feb 23 08:59:10 np0005626463.localdomain podman[107499]: 2026-02-23 08:59:10.040035484 +0000 UTC m=+0.208419683 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, batch=17.1_20260112.1, build-date=2026-01-12T22:36:40Z, version=17.1.13, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=ovn_controller, name=rhosp-rhel9/openstack-ovn-controller, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, distribution-scope=public, config_id=tripleo_step4, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, 
release=1766032510, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 23 08:59:10 np0005626463.localdomain podman[107502]: 2026-02-23 08:59:09.974958311 +0000 UTC m=+0.135558956 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, release=1766032510, url=https://www.redhat.com, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, container_name=ovn_metadata_agent, build-date=2026-01-12T22:56:19Z, vcs-type=git, io.openshift.expose-services=)
Feb 23 08:59:10 np0005626463.localdomain podman[107500]: 2026-02-23 08:59:10.095286511 +0000 UTC m=+0.261645936 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, distribution-scope=public, name=rhosp-rhel9/openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., build-date=2026-01-12T22:34:43Z, managed_by=tripleo_ansible, url=https://www.redhat.com, release=1766032510, architecture=x86_64, vcs-type=git, vcs-ref=705339545363fec600102567c4e923938e0f43b3, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, version=17.1.13, io.buildah.version=1.41.5, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, container_name=iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']})
Feb 23 08:59:10 np0005626463.localdomain podman[107499]: 2026-02-23 08:59:10.107159342 +0000 UTC m=+0.275543601 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, vendor=Red Hat, Inc., architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, distribution-scope=public, batch=17.1_20260112.1, io.openshift.expose-services=, name=rhosp-rhel9/openstack-ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.component=openstack-ovn-controller-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:36:40Z, org.opencontainers.image.created=2026-01-12T22:36:40Z, url=https://www.redhat.com, io.buildah.version=1.41.5, managed_by=tripleo_ansible, tcib_managed=true, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller)
Feb 23 08:59:10 np0005626463.localdomain podman[107499]: unhealthy
Feb 23 08:59:10 np0005626463.localdomain podman[107519]: 2026-02-23 08:59:10.11602495 +0000 UTC m=+0.266242741 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-cron-container, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, architecture=x86_64, managed_by=tripleo_ansible, container_name=logrotate_crond, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, vcs-type=git, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-cron, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, build-date=2026-01-12T22:10:15Z, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, version=17.1.13, url=https://www.redhat.com)
Feb 23 08:59:10 np0005626463.localdomain podman[107526]: 2026-02-23 08:59:10.11798457 +0000 UTC m=+0.262007997 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', 
'/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, batch=17.1_20260112.1, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, vcs-type=git, com.redhat.component=openstack-nova-compute-container, build-date=2026-01-12T23:32:04Z, org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, architecture=x86_64, release=1766032510, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, io.buildah.version=1.41.5, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step5, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=nova_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 23 08:59:10 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Main process exited, code=exited, status=1/FAILURE
Feb 23 08:59:10 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Failed with result 'exit-code'.
Feb 23 08:59:10 np0005626463.localdomain podman[107526]: unhealthy
Feb 23 08:59:10 np0005626463.localdomain systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Main process exited, code=exited, status=1/FAILURE
Feb 23 08:59:10 np0005626463.localdomain systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Failed with result 'exit-code'.
Feb 23 08:59:10 np0005626463.localdomain podman[107528]: 2026-02-23 08:59:10.146230773 +0000 UTC m=+0.292540072 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vcs-type=git, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, build-date=2026-01-12T22:10:14Z, tcib_managed=true, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, config_id=tripleo_step1, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, 
name=rhosp-rhel9/openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, batch=17.1_20260112.1, container_name=metrics_qdr, distribution-scope=public, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, com.redhat.component=openstack-qdrouterd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:10:14Z)
Feb 23 08:59:10 np0005626463.localdomain podman[107500]: 2026-02-23 08:59:10.160612722 +0000 UTC m=+0.326972197 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:34:43Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, build-date=2026-01-12T22:34:43Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, com.redhat.component=openstack-iscsid-container, distribution-scope=public, release=1766032510, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, 
konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, vendor=Red Hat, Inc., container_name=iscsid, name=rhosp-rhel9/openstack-iscsid)
Feb 23 08:59:10 np0005626463.localdomain systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully.
Feb 23 08:59:10 np0005626463.localdomain systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully.
Feb 23 08:59:10 np0005626463.localdomain podman[107519]: 2026-02-23 08:59:10.206472925 +0000 UTC m=+0.356690757 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, name=rhosp-rhel9/openstack-cron, build-date=2026-01-12T22:10:15Z, vendor=Red Hat, Inc., distribution-scope=public, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.component=openstack-cron-container, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20260112.1, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=logrotate_crond, version=17.1.13, url=https://www.redhat.com, io.openshift.expose-services=, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, summary=Red Hat OpenStack Platform 17.1 cron)
Feb 23 08:59:10 np0005626463.localdomain systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully.
Feb 23 08:59:10 np0005626463.localdomain podman[107508]: 2026-02-23 08:59:10.255066234 +0000 UTC m=+0.407101502 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, name=rhosp-rhel9/openstack-ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, build-date=2026-01-12T23:07:30Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, 
managed_by=tripleo_ansible, batch=17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.expose-services=, version=17.1.13, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, vcs-type=git, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z)
Feb 23 08:59:10 np0005626463.localdomain podman[107502]: 2026-02-23 08:59:10.263200538 +0000 UTC m=+0.423801213 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, architecture=x86_64, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, 
url=https://www.redhat.com, config_id=tripleo_step4, managed_by=tripleo_ansible, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, build-date=2026-01-12T22:56:19Z, distribution-scope=public, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, release=1766032510, maintainer=OpenStack TripleO Team, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Feb 23 08:59:10 np0005626463.localdomain podman[107502]: unhealthy
Feb 23 08:59:10 np0005626463.localdomain systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Main process exited, code=exited, status=1/FAILURE
Feb 23 08:59:10 np0005626463.localdomain systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Failed with result 'exit-code'.
Feb 23 08:59:10 np0005626463.localdomain podman[107501]: 2026-02-23 08:59:10.3154287 +0000 UTC m=+0.481997832 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-compute-container, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, 
vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, name=rhosp-rhel9/openstack-ceilometer-compute, io.buildah.version=1.41.5, batch=17.1_20260112.1, release=1766032510, distribution-scope=public, container_name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, vendor=Red Hat, Inc., config_id=tripleo_step4, build-date=2026-01-12T23:07:47Z, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, url=https://www.redhat.com)
Feb 23 08:59:10 np0005626463.localdomain systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully.
Feb 23 08:59:10 np0005626463.localdomain podman[107508]: 2026-02-23 08:59:10.365927728 +0000 UTC m=+0.517963036 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, batch=17.1_20260112.1, io.buildah.version=1.41.5, build-date=2026-01-12T23:07:30Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, release=1766032510, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, architecture=x86_64, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-ceilometer-ipmi, tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container, container_name=ceilometer_agent_ipmi, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, vcs-type=git, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Feb 23 08:59:10 np0005626463.localdomain systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully.
Feb 23 08:59:11 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.
Feb 23 08:59:11 np0005626463.localdomain podman[107683]: 2026-02-23 08:59:11.933258412 +0000 UTC m=+0.109095170 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, build-date=2026-01-12T23:32:04Z, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public, vcs-type=git, vendor=Red Hat, Inc., io.buildah.version=1.41.5, tcib_managed=true, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, release=1766032510, version=17.1.13)
Feb 23 08:59:12 np0005626463.localdomain podman[107683]: 2026-02-23 08:59:12.385797632 +0000 UTC m=+0.561634280 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, org.opencontainers.image.created=2026-01-12T23:32:04Z, tcib_managed=true, managed_by=tripleo_ansible, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:32:04Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/lib/nova:/var/lib/nova:shared']}, name=rhosp-rhel9/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_migration_target, release=1766032510, version=17.1.13, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, distribution-scope=public, io.openshift.expose-services=, url=https://www.redhat.com)
Feb 23 08:59:12 np0005626463.localdomain systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully.
Feb 23 08:59:12 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20695 DF PROTO=TCP SPT=46774 DPT=9882 SEQ=522294137 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BD8BAC60000000001030307) 
Feb 23 08:59:12 np0005626463.localdomain sshd[107707]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:59:13 np0005626463.localdomain sshd[107707]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:59:15 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3004 DF PROTO=TCP SPT=50600 DPT=9105 SEQ=646796191 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BD8C6060000000001030307) 
Feb 23 08:59:16 np0005626463.localdomain sshd[107709]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:59:16 np0005626463.localdomain sshd[107709]: Accepted publickey for zuul from 192.168.122.30 port 54552 ssh2: RSA SHA256:/ShS2J5Dq7o9P59e/NmgQORSAcJOBwu46Huo03HBdB4
Feb 23 08:59:17 np0005626463.localdomain systemd-logind[759]: New session 37 of user zuul.
Feb 23 08:59:17 np0005626463.localdomain systemd[1]: Started Session 37 of User zuul.
Feb 23 08:59:17 np0005626463.localdomain sshd[107709]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Feb 23 08:59:17 np0005626463.localdomain sudo[107802]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bhbajzvjhdvlqcotssvwfcekapomunee ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837157.0896075-18-108283926194560/AnsiballZ_systemd_service.py
Feb 23 08:59:17 np0005626463.localdomain sudo[107802]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 08:59:18 np0005626463.localdomain python3.9[107804]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 23 08:59:18 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 08:59:18 np0005626463.localdomain systemd-rc-local-generator[107827]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 08:59:18 np0005626463.localdomain systemd-sysv-generator[107833]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 08:59:18 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 08:59:18 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2467 DF PROTO=TCP SPT=58868 DPT=9101 SEQ=2655477868 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BD8D2060000000001030307) 
Feb 23 08:59:18 np0005626463.localdomain sudo[107802]: pam_unix(sudo:session): session closed for user root
Feb 23 08:59:19 np0005626463.localdomain python3.9[107929]: ansible-ansible.builtin.service_facts Invoked
Feb 23 08:59:19 np0005626463.localdomain network[107946]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Feb 23 08:59:19 np0005626463.localdomain network[107947]: 'network-scripts' will be removed from distribution in near future.
Feb 23 08:59:19 np0005626463.localdomain network[107948]: It is advised to switch to 'NetworkManager' instead for network management.
Feb 23 08:59:20 np0005626463.localdomain sudo[107969]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 08:59:20 np0005626463.localdomain sudo[107969]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:59:20 np0005626463.localdomain sudo[107969]: pam_unix(sudo:session): session closed for user root
Feb 23 08:59:20 np0005626463.localdomain sudo[107988]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 23 08:59:20 np0005626463.localdomain sudo[107988]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:59:20 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 08:59:21 np0005626463.localdomain sudo[107988]: pam_unix(sudo:session): session closed for user root
Feb 23 08:59:21 np0005626463.localdomain sudo[108098]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 08:59:21 np0005626463.localdomain sudo[108098]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 08:59:21 np0005626463.localdomain sudo[108098]: pam_unix(sudo:session): session closed for user root
Feb 23 08:59:23 np0005626463.localdomain python3.9[108221]: ansible-ansible.builtin.service_facts Invoked
Feb 23 08:59:23 np0005626463.localdomain network[108238]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Feb 23 08:59:23 np0005626463.localdomain network[108239]: 'network-scripts' will be removed from distribution in near future.
Feb 23 08:59:23 np0005626463.localdomain network[108240]: It is advised to switch to 'NetworkManager' instead for network management.
Feb 23 08:59:24 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57791 DF PROTO=TCP SPT=39384 DPT=9102 SEQ=2834932690 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BD8E8400000000001030307) 
Feb 23 08:59:24 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20697 DF PROTO=TCP SPT=46774 DPT=9882 SEQ=522294137 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BD8EA070000000001030307) 
Feb 23 08:59:24 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 08:59:27 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57793 DF PROTO=TCP SPT=39384 DPT=9102 SEQ=2834932690 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BD8F4460000000001030307) 
Feb 23 08:59:29 np0005626463.localdomain sudo[108437]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-betxhhoyqjidoapbsylrcugtlvwykmol ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837168.8219957-108-54096271292734/AnsiballZ_systemd_service.py
Feb 23 08:59:29 np0005626463.localdomain sudo[108437]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 08:59:29 np0005626463.localdomain python3.9[108439]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_ceilometer_agent_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 08:59:29 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 08:59:29 np0005626463.localdomain systemd-sysv-generator[108468]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 08:59:29 np0005626463.localdomain systemd-rc-local-generator[108465]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 08:59:29 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 08:59:30 np0005626463.localdomain systemd[1]: Stopping ceilometer_agent_compute container...
Feb 23 08:59:30 np0005626463.localdomain systemd[1]: tmp-crun.9smelV.mount: Deactivated successfully.
Feb 23 08:59:30 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54041 DF PROTO=TCP SPT=51734 DPT=9100 SEQ=1973825206 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BD902C60000000001030307) 
Feb 23 08:59:32 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.
Feb 23 08:59:32 np0005626463.localdomain systemd[1]: tmp-crun.XhJ3kl.mount: Deactivated successfully.
Feb 23 08:59:32 np0005626463.localdomain podman[108494]: 2026-02-23 08:59:32.17192037 +0000 UTC m=+0.093625837 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=collectd, vcs-type=git, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-collectd, tcib_managed=true, build-date=2026-01-12T22:10:15Z, release=1766032510, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, managed_by=tripleo_ansible, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, vendor=Red Hat, Inc., architecture=x86_64, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step3, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-collectd-container)
Feb 23 08:59:32 np0005626463.localdomain podman[108494]: 2026-02-23 08:59:32.213647634 +0000 UTC m=+0.135353061 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, release=1766032510, batch=17.1_20260112.1, vendor=Red Hat, Inc., architecture=x86_64, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=collectd, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, config_id=tripleo_step3, vcs-type=git, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, summary=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, io.openshift.expose-services=, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-collectd-container, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, build-date=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 collectd)
Feb 23 08:59:32 np0005626463.localdomain systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully.
Feb 23 08:59:32 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54042 DF PROTO=TCP SPT=51734 DPT=9100 SEQ=1973825206 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BD90AC60000000001030307) 
Feb 23 08:59:36 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62890 DF PROTO=TCP SPT=41750 DPT=9101 SEQ=566802748 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BD917860000000001030307) 
Feb 23 08:59:39 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16383 DF PROTO=TCP SPT=39748 DPT=9882 SEQ=4206382286 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BD923F30000000001030307) 
Feb 23 08:59:40 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.
Feb 23 08:59:40 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.
Feb 23 08:59:40 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.
Feb 23 08:59:40 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.
Feb 23 08:59:40 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.
Feb 23 08:59:40 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.
Feb 23 08:59:40 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.
Feb 23 08:59:40 np0005626463.localdomain podman[108545]: Error: container 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 is not running
Feb 23 08:59:40 np0005626463.localdomain podman[108516]: 2026-02-23 08:59:40.423820355 +0000 UTC m=+0.093608905 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, batch=17.1_20260112.1, io.buildah.version=1.41.5, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, summary=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, version=17.1.13, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, tcib_managed=true, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-ovn-controller-container, 
org.opencontainers.image.created=2026-01-12T22:36:40Z, vcs-type=git, build-date=2026-01-12T22:36:40Z, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c)
Feb 23 08:59:40 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.
Feb 23 08:59:40 np0005626463.localdomain systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Main process exited, code=exited, status=125/n/a
Feb 23 08:59:40 np0005626463.localdomain systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Failed with result 'exit-code'.
Feb 23 08:59:40 np0005626463.localdomain systemd[1]: tmp-crun.avrdYm.mount: Deactivated successfully.
Feb 23 08:59:40 np0005626463.localdomain podman[108521]: 2026-02-23 08:59:40.475302085 +0000 UTC m=+0.135633810 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, version=17.1.13, io.buildah.version=1.41.5, build-date=2026-01-12T22:10:15Z, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-cron, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-cron-container, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, url=https://www.redhat.com, distribution-scope=public, batch=17.1_20260112.1, vcs-type=git, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 23 08:59:40 np0005626463.localdomain podman[108516]: 2026-02-23 08:59:40.490588312 +0000 UTC m=+0.160376862 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:36:40Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.component=openstack-ovn-controller-container, container_name=ovn_controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, tcib_managed=true, vendor=Red Hat, Inc., io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com)
Feb 23 08:59:40 np0005626463.localdomain podman[108516]: unhealthy
Feb 23 08:59:40 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Main process exited, code=exited, status=1/FAILURE
Feb 23 08:59:40 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Failed with result 'exit-code'.
Feb 23 08:59:40 np0005626463.localdomain podman[108530]: 2026-02-23 08:59:40.544967722 +0000 UTC m=+0.197352199 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, name=rhosp-rhel9/openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, config_id=tripleo_step1, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, batch=17.1_20260112.1, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:10:14Z, url=https://www.redhat.com, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64)
Feb 23 08:59:40 np0005626463.localdomain podman[108525]: 2026-02-23 08:59:40.502701191 +0000 UTC m=+0.154293853 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=unhealthy, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, batch=17.1_20260112.1, container_name=nova_compute, url=https://www.redhat.com, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.5, vendor=Red Hat, Inc., vcs-type=git, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, tcib_managed=true, build-date=2026-01-12T23:32:04Z, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 23 08:59:40 np0005626463.localdomain podman[108525]: 2026-02-23 08:59:40.585197918 +0000 UTC m=+0.236790540 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step5, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, container_name=nova_compute, io.openshift.expose-services=, release=1766032510, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, distribution-scope=public, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.13, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, io.buildah.version=1.41.5, build-date=2026-01-12T23:32:04Z, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team)
Feb 23 08:59:40 np0005626463.localdomain podman[108525]: unhealthy
Feb 23 08:59:40 np0005626463.localdomain podman[108518]: 2026-02-23 08:59:40.595955794 +0000 UTC m=+0.256261768 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, version=17.1.13, architecture=x86_64, config_id=tripleo_step4, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-type=git, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, cpe=cpe:/a:redhat:openstack:17.1::el9, 
release=1766032510, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, tcib_managed=true, url=https://www.redhat.com, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, io.openshift.expose-services=, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn)
Feb 23 08:59:40 np0005626463.localdomain systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Main process exited, code=exited, status=1/FAILURE
Feb 23 08:59:40 np0005626463.localdomain systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Failed with result 'exit-code'.
Feb 23 08:59:40 np0005626463.localdomain podman[108521]: 2026-02-23 08:59:40.605853073 +0000 UTC m=+0.266184818 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-cron-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, tcib_managed=true, version=17.1.13, io.buildah.version=1.41.5, architecture=x86_64, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, name=rhosp-rhel9/openstack-cron, io.k8s.description=Red Hat OpenStack 
Platform 17.1 cron, batch=17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, build-date=2026-01-12T22:10:15Z, url=https://www.redhat.com, container_name=logrotate_crond, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 cron)
Feb 23 08:59:40 np0005626463.localdomain podman[108518]: 2026-02-23 08:59:40.613225494 +0000 UTC m=+0.273531398 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, architecture=x86_64, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, container_name=ovn_metadata_agent, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 
neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, build-date=2026-01-12T22:56:19Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, version=17.1.13, release=1766032510, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, vcs-type=git, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, url=https://www.redhat.com)
Feb 23 08:59:40 np0005626463.localdomain systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully.
Feb 23 08:59:40 np0005626463.localdomain podman[108518]: unhealthy
Feb 23 08:59:40 np0005626463.localdomain systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Main process exited, code=exited, status=1/FAILURE
Feb 23 08:59:40 np0005626463.localdomain systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Failed with result 'exit-code'.
Feb 23 08:59:40 np0005626463.localdomain podman[108593]: 2026-02-23 08:59:40.563342225 +0000 UTC m=+0.123743937 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, org.opencontainers.image.created=2026-01-12T23:07:30Z, config_id=tripleo_step4, vendor=Red Hat, Inc., container_name=ceilometer_agent_ipmi, url=https://www.redhat.com, managed_by=tripleo_ansible, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2026-01-12T23:07:30Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1766032510, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, name=rhosp-rhel9/openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-ipmi-container, distribution-scope=public, io.buildah.version=1.41.5)
Feb 23 08:59:40 np0005626463.localdomain podman[108517]: 2026-02-23 08:59:40.673405294 +0000 UTC m=+0.338758465 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vcs-ref=705339545363fec600102567c4e923938e0f43b3, name=rhosp-rhel9/openstack-iscsid, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, com.redhat.component=openstack-iscsid-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.created=2026-01-12T22:34:43Z, container_name=iscsid, vendor=Red Hat, Inc., version=17.1.13, summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, release=1766032510, batch=17.1_20260112.1, build-date=2026-01-12T22:34:43Z, tcib_managed=true, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible)
Feb 23 08:59:40 np0005626463.localdomain podman[108517]: 2026-02-23 08:59:40.68736869 +0000 UTC m=+0.352721951 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, com.redhat.component=openstack-iscsid-container, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:34:43Z, build-date=2026-01-12T22:34:43Z, container_name=iscsid, io.buildah.version=1.41.5, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, tcib_managed=true, release=1766032510, distribution-scope=public, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-iscsid, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 23 08:59:40 np0005626463.localdomain systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully.
Feb 23 08:59:40 np0005626463.localdomain podman[108593]: 2026-02-23 08:59:40.742902046 +0000 UTC m=+0.303303838 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, container_name=ceilometer_agent_ipmi, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-ipmi-container, org.opencontainers.image.created=2026-01-12T23:07:30Z, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2026-01-12T23:07:30Z, name=rhosp-rhel9/openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack 
Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, url=https://www.redhat.com, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, release=1766032510, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, vendor=Red Hat, Inc., io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 23 08:59:40 np0005626463.localdomain systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully.
Feb 23 08:59:40 np0005626463.localdomain podman[108530]: 2026-02-23 08:59:40.780174901 +0000 UTC m=+0.432559388 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, build-date=2026-01-12T22:10:14Z, architecture=x86_64, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, version=17.1.13, config_id=tripleo_step1, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-qdrouterd, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 23 08:59:40 np0005626463.localdomain systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully.
Feb 23 08:59:42 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16385 DF PROTO=TCP SPT=39748 DPT=9882 SEQ=4206382286 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BD930060000000001030307) 
Feb 23 08:59:42 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.
Feb 23 08:59:42 np0005626463.localdomain podman[108682]: 2026-02-23 08:59:42.665071347 +0000 UTC m=+0.089810717 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, managed_by=tripleo_ansible, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp-rhel9/openstack-nova-compute, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:32:04Z, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, 
vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, architecture=x86_64, url=https://www.redhat.com, vcs-type=git, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5)
Feb 23 08:59:43 np0005626463.localdomain podman[108682]: 2026-02-23 08:59:43.078533036 +0000 UTC m=+0.503272366 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, distribution-scope=public, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, io.openshift.expose-services=, url=https://www.redhat.com, build-date=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, container_name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 23 08:59:43 np0005626463.localdomain systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully.
Feb 23 08:59:43 np0005626463.localdomain sshd[108705]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:59:44 np0005626463.localdomain sshd[108705]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:59:45 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18578 DF PROTO=TCP SPT=44066 DPT=9105 SEQ=27882988 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BD93A060000000001030307) 
Feb 23 08:59:48 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62892 DF PROTO=TCP SPT=41750 DPT=9101 SEQ=566802748 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BD948070000000001030307) 
Feb 23 08:59:53 np0005626463.localdomain sshd[108707]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 08:59:53 np0005626463.localdomain sshd[108707]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 08:59:53 np0005626463.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Feb 23 08:59:53 np0005626463.localdomain recover_tripleo_nova_virtqemud[108710]: 61982
Feb 23 08:59:53 np0005626463.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Feb 23 08:59:53 np0005626463.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Feb 23 08:59:54 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31920 DF PROTO=TCP SPT=38510 DPT=9102 SEQ=3709768302 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BD95D700000000001030307) 
Feb 23 08:59:54 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16387 DF PROTO=TCP SPT=39748 DPT=9882 SEQ=4206382286 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BD960070000000001030307) 
Feb 23 08:59:57 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31922 DF PROTO=TCP SPT=38510 DPT=9102 SEQ=3709768302 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BD969870000000001030307) 
Feb 23 09:00:00 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14393 DF PROTO=TCP SPT=53710 DPT=9100 SEQ=26865489 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BD978060000000001030307) 
Feb 23 09:00:01 np0005626463.localdomain CROND[108712]: (root) CMD (sleep `expr ${RANDOM} % 90`; /usr/sbin/logrotate -s /var/lib/logrotate/logrotate-crond.status /etc/logrotate-crond.conf 2>&1|logger -t logrotate-crond)
Feb 23 09:00:02 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.
Feb 23 09:00:02 np0005626463.localdomain podman[108715]: 2026-02-23 09:00:02.667557266 +0000 UTC m=+0.090628224 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, managed_by=tripleo_ansible, config_id=tripleo_step3, io.buildah.version=1.41.5, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, batch=17.1_20260112.1, com.redhat.component=openstack-collectd-container, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, name=rhosp-rhel9/openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, release=1766032510, build-date=2026-01-12T22:10:15Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:15Z, container_name=collectd)
Feb 23 09:00:02 np0005626463.localdomain podman[108715]: 2026-02-23 09:00:02.707334648 +0000 UTC m=+0.130405556 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, batch=17.1_20260112.1, release=1766032510, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, build-date=2026-01-12T22:10:15Z, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-collectd-container, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true)
Feb 23 09:00:02 np0005626463.localdomain systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully.
Feb 23 09:00:02 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14394 DF PROTO=TCP SPT=53710 DPT=9100 SEQ=26865489 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BD980060000000001030307) 
Feb 23 09:00:05 np0005626463.localdomain CROND[108711]: (root) CMDEND (sleep `expr ${RANDOM} % 90`; /usr/sbin/logrotate -s /var/lib/logrotate/logrotate-crond.status /etc/logrotate-crond.conf 2>&1|logger -t logrotate-crond)
Feb 23 09:00:06 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29966 DF PROTO=TCP SPT=54134 DPT=9101 SEQ=47187795 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BD98CC70000000001030307) 
Feb 23 09:00:09 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5816 DF PROTO=TCP SPT=50952 DPT=9882 SEQ=2048617207 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BD999230000000001030307) 
Feb 23 09:00:10 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.
Feb 23 09:00:10 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.
Feb 23 09:00:10 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.
Feb 23 09:00:10 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.
Feb 23 09:00:10 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.
Feb 23 09:00:10 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.
Feb 23 09:00:10 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.
Feb 23 09:00:10 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.
Feb 23 09:00:10 np0005626463.localdomain podman[108737]: 2026-02-23 09:00:10.937949098 +0000 UTC m=+0.105589901 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, managed_by=tripleo_ansible, batch=17.1_20260112.1, release=1766032510, version=17.1.13, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:36:40Z, name=rhosp-rhel9/openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., build-date=2026-01-12T22:36:40Z, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, vcs-type=git, io.buildah.version=1.41.5, architecture=x86_64, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ovn-controller)
Feb 23 09:00:10 np0005626463.localdomain podman[108738]: 2026-02-23 09:00:10.948105285 +0000 UTC m=+0.111448793 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:34:43Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:34:43Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=705339545363fec600102567c4e923938e0f43b3, version=17.1.13, io.buildah.version=1.41.5, container_name=iscsid, vcs-type=git, vendor=Red Hat, Inc., batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, managed_by=tripleo_ansible, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, name=rhosp-rhel9/openstack-iscsid, config_id=tripleo_step3, url=https://www.redhat.com, release=1766032510, description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public)
Feb 23 09:00:11 np0005626463.localdomain podman[108739]: Error: container 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 is not running
Feb 23 09:00:11 np0005626463.localdomain systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Main process exited, code=exited, status=125/n/a
Feb 23 09:00:11 np0005626463.localdomain systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Failed with result 'exit-code'.
Feb 23 09:00:11 np0005626463.localdomain podman[108758]: 2026-02-23 09:00:10.995273289 +0000 UTC m=+0.136923930 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=unhealthy, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, config_id=tripleo_step5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp-rhel9/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, distribution-scope=public, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, vcs-type=git, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1766032510, vendor=Red Hat, Inc., tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:32:04Z, container_name=nova_compute, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 23 09:00:11 np0005626463.localdomain podman[108763]: 2026-02-23 09:00:11.064535593 +0000 UTC m=+0.205756140 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-qdrouterd-container, release=1766032510, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, distribution-scope=public, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, config_id=tripleo_step1, version=17.1.13, batch=17.1_20260112.1, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64)
Feb 23 09:00:11 np0005626463.localdomain podman[108740]: 2026-02-23 09:00:11.000155141 +0000 UTC m=+0.155355634 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, io.buildah.version=1.41.5, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-type=git, container_name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, release=1766032510, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:56:19Z, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.expose-services=)
Feb 23 09:00:11 np0005626463.localdomain podman[108752]: 2026-02-23 09:00:11.121539144 +0000 UTC m=+0.267625623 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-cron-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, vcs-type=git, version=17.1.13, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, architecture=x86_64, io.buildah.version=1.41.5, 
url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, build-date=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, name=rhosp-rhel9/openstack-cron, distribution-scope=public, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., container_name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 23 09:00:11 np0005626463.localdomain podman[108752]: 2026-02-23 09:00:11.131354991 +0000 UTC m=+0.277441490 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, org.opencontainers.image.created=2026-01-12T22:10:15Z, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, com.redhat.component=openstack-cron-container, config_id=tripleo_step4, name=rhosp-rhel9/openstack-cron, managed_by=tripleo_ansible, url=https://www.redhat.com, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, vendor=Red Hat, Inc., org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, batch=17.1_20260112.1, io.openshift.expose-services=, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, build-date=2026-01-12T22:10:15Z, description=Red Hat OpenStack Platform 17.1 cron)
Feb 23 09:00:11 np0005626463.localdomain podman[108738]: 2026-02-23 09:00:11.138492623 +0000 UTC m=+0.301836112 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, release=1766032510, description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1, io.openshift.expose-services=, com.redhat.component=openstack-iscsid-container, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, container_name=iscsid, vcs-type=git, tcib_managed=true, 
vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, build-date=2026-01-12T22:34:43Z, distribution-scope=public, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:34:43Z, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid)
Feb 23 09:00:11 np0005626463.localdomain systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully.
Feb 23 09:00:11 np0005626463.localdomain systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully.
Feb 23 09:00:11 np0005626463.localdomain podman[108758]: 2026-02-23 09:00:11.182022864 +0000 UTC m=+0.323673515 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, org.opencontainers.image.created=2026-01-12T23:32:04Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.openshift.expose-services=, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T23:32:04Z, container_name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, name=rhosp-rhel9/openstack-nova-compute, managed_by=tripleo_ansible, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5, vcs-type=git, tcib_managed=true, io.buildah.version=1.41.5, architecture=x86_64, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510)
Feb 23 09:00:11 np0005626463.localdomain podman[108758]: unhealthy
Feb 23 09:00:11 np0005626463.localdomain podman[108740]: 2026-02-23 09:00:11.191111618 +0000 UTC m=+0.346312171 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, container_name=ovn_metadata_agent, build-date=2026-01-12T22:56:19Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-type=git, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, 
url=https://www.redhat.com, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, release=1766032510, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, version=17.1.13, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:56:19Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, distribution-scope=public)
Feb 23 09:00:11 np0005626463.localdomain systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Main process exited, code=exited, status=1/FAILURE
Feb 23 09:00:11 np0005626463.localdomain systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Failed with result 'exit-code'.
Feb 23 09:00:11 np0005626463.localdomain podman[108740]: unhealthy
Feb 23 09:00:11 np0005626463.localdomain systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Main process exited, code=exited, status=1/FAILURE
Feb 23 09:00:11 np0005626463.localdomain systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Failed with result 'exit-code'.
Feb 23 09:00:11 np0005626463.localdomain podman[108737]: 2026-02-23 09:00:11.235795084 +0000 UTC m=+0.403435887 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, url=https://www.redhat.com, com.redhat.component=openstack-ovn-controller-container, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, name=rhosp-rhel9/openstack-ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, version=17.1.13, container_name=ovn_controller, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:36:40Z, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:36:40Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': 
['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, managed_by=tripleo_ansible)
Feb 23 09:00:11 np0005626463.localdomain podman[108737]: unhealthy
Feb 23 09:00:11 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Main process exited, code=exited, status=1/FAILURE
Feb 23 09:00:11 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Failed with result 'exit-code'.
Feb 23 09:00:11 np0005626463.localdomain podman[108763]: 2026-02-23 09:00:11.273179723 +0000 UTC m=+0.414400290 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, org.opencontainers.image.created=2026-01-12T22:10:14Z, build-date=2026-01-12T22:10:14Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, url=https://www.redhat.com, version=17.1.13, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, config_id=tripleo_step1, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, io.buildah.version=1.41.5, architecture=x86_64)
Feb 23 09:00:11 np0005626463.localdomain systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully.
Feb 23 09:00:11 np0005626463.localdomain podman[108746]: 2026-02-23 09:00:11.025802593 +0000 UTC m=+0.166459132 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:07:30Z, release=1766032510, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, distribution-scope=public, io.buildah.version=1.41.5, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:07:30Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, com.redhat.component=openstack-ceilometer-ipmi-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, url=https://www.redhat.com, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi, vcs-type=git, config_id=tripleo_step4, io.openshift.expose-services=, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06)
Feb 23 09:00:11 np0005626463.localdomain podman[108746]: 2026-02-23 09:00:11.32016466 +0000 UTC m=+0.460821199 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, url=https://www.redhat.com, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:30Z, tcib_managed=true, version=17.1.13, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-type=git, release=1766032510, io.openshift.expose-services=, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, build-date=2026-01-12T23:07:30Z, com.redhat.component=openstack-ceilometer-ipmi-container, config_id=tripleo_step4, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 23 09:00:11 np0005626463.localdomain systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully.
Feb 23 09:00:11 np0005626463.localdomain systemd[1]: tmp-crun.N5WIVy.mount: Deactivated successfully.
Feb 23 09:00:12 np0005626463.localdomain podman[108480]: time="2026-02-23T09:00:12Z" level=warning msg="StopSignal SIGTERM failed to stop container ceilometer_agent_compute in 42 seconds, resorting to SIGKILL"
Feb 23 09:00:12 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:f0:86:db MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.110 DST=192.168.122.106 LEN=40 TOS=0x00 PREC=0x00 TTL=64 ID=0 DF PROTO=TCP SPT=6379 DPT=52448 SEQ=3894642472 ACK=0 WINDOW=0 RES=0x00 RST URGP=0 
Feb 23 09:00:12 np0005626463.localdomain systemd[1]: libpod-68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.scope: Deactivated successfully.
Feb 23 09:00:12 np0005626463.localdomain systemd[1]: libpod-68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.scope: Consumed 6.775s CPU time.
Feb 23 09:00:12 np0005626463.localdomain podman[108480]: 2026-02-23 09:00:12.135788725 +0000 UTC m=+42.109811059 container died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, release=1766032510, architecture=x86_64, com.redhat.component=openstack-ceilometer-compute-container, org.opencontainers.image.created=2026-01-12T23:07:47Z, batch=17.1_20260112.1, url=https://www.redhat.com, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.13, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.5, distribution-scope=public, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, build-date=2026-01-12T23:07:47Z, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 23 09:00:12 np0005626463.localdomain systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.timer: Deactivated successfully.
Feb 23 09:00:12 np0005626463.localdomain systemd[1]: Stopped /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.
Feb 23 09:00:12 np0005626463.localdomain systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Failed to open /run/systemd/transient/68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: No such file or directory
Feb 23 09:00:12 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9-userdata-shm.mount: Deactivated successfully.
Feb 23 09:00:12 np0005626463.localdomain podman[108480]: 2026-02-23 09:00:12.194740717 +0000 UTC m=+42.168763051 container cleanup 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20260112.1, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-compute-container, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1766032510, io.openshift.expose-services=, url=https://www.redhat.com, container_name=ceilometer_agent_compute, build-date=2026-01-12T23:07:47Z, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., version=17.1.13, org.opencontainers.image.created=2026-01-12T23:07:47Z, name=rhosp-rhel9/openstack-ceilometer-compute, config_id=tripleo_step4, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 
'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']})
Feb 23 09:00:12 np0005626463.localdomain podman[108480]: ceilometer_agent_compute
Feb 23 09:00:12 np0005626463.localdomain systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.timer: Failed to open /run/systemd/transient/68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.timer: No such file or directory
Feb 23 09:00:12 np0005626463.localdomain systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Failed to open /run/systemd/transient/68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: No such file or directory
Feb 23 09:00:12 np0005626463.localdomain podman[108897]: 2026-02-23 09:00:12.222131754 +0000 UTC m=+0.078957908 container cleanup 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, release=1766032510, io.openshift.expose-services=, name=rhosp-rhel9/openstack-ceilometer-compute, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, 
org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:07:47Z, version=17.1.13, com.redhat.component=openstack-ceilometer-compute-container, vendor=Red Hat, Inc., io.buildah.version=1.41.5, container_name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2026-01-12T23:07:47Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20260112.1, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 23 09:00:12 np0005626463.localdomain systemd[1]: libpod-conmon-68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.scope: Deactivated successfully.
Feb 23 09:00:12 np0005626463.localdomain systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.timer: Failed to open /run/systemd/transient/68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.timer: No such file or directory
Feb 23 09:00:12 np0005626463.localdomain systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Failed to open /run/systemd/transient/68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: No such file or directory
Feb 23 09:00:12 np0005626463.localdomain podman[108909]: 2026-02-23 09:00:12.330109388 +0000 UTC m=+0.073720715 container cleanup 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, release=1766032510, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-ceilometer-compute-container, 
managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T23:07:47Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, vcs-type=git, vendor=Red Hat, Inc., architecture=x86_64, tcib_managed=true, io.openshift.expose-services=, build-date=2026-01-12T23:07:47Z, distribution-scope=public, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 23 09:00:12 np0005626463.localdomain podman[108909]: ceilometer_agent_compute
Feb 23 09:00:12 np0005626463.localdomain systemd[1]: tripleo_ceilometer_agent_compute.service: Deactivated successfully.
Feb 23 09:00:12 np0005626463.localdomain systemd[1]: Stopped ceilometer_agent_compute container.
Feb 23 09:00:12 np0005626463.localdomain systemd[1]: tripleo_ceilometer_agent_compute.service: Consumed 1.112s CPU time, no IO.
Feb 23 09:00:12 np0005626463.localdomain sudo[108437]: pam_unix(sudo:session): session closed for user root
Feb 23 09:00:12 np0005626463.localdomain sudo[109011]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pkemyzuodamtqbcehtesyxhhsbjbgvjy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837212.5470066-108-135991555095632/AnsiballZ_systemd_service.py
Feb 23 09:00:12 np0005626463.localdomain sudo[109011]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:00:12 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-85b58d6db47c08b0dc415e7676af05b270f66bddea6d8ca4f2d3998d7b04080d-merged.mount: Deactivated successfully.
Feb 23 09:00:13 np0005626463.localdomain python3.9[109013]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_ceilometer_agent_ipmi.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 09:00:13 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.
Feb 23 09:00:13 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 09:00:13 np0005626463.localdomain podman[109015]: 2026-02-23 09:00:13.26594923 +0000 UTC m=+0.070052170 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=nova_migration_target, config_id=tripleo_step4, distribution-scope=public, io.buildah.version=1.41.5, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z, name=rhosp-rhel9/openstack-nova-compute, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T23:32:04Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.13, vcs-type=git, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com)
Feb 23 09:00:13 np0005626463.localdomain systemd-rc-local-generator[109064]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 09:00:13 np0005626463.localdomain systemd-sysv-generator[109067]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 09:00:13 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 09:00:13 np0005626463.localdomain systemd[1]: Stopping ceilometer_agent_ipmi container...
Feb 23 09:00:13 np0005626463.localdomain podman[109015]: 2026-02-23 09:00:13.696586076 +0000 UTC m=+0.500688986 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, vendor=Red Hat, Inc., batch=17.1_20260112.1, build-date=2026-01-12T23:32:04Z, name=rhosp-rhel9/openstack-nova-compute, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, container_name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, tcib_managed=true)
Feb 23 09:00:13 np0005626463.localdomain systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully.
Feb 23 09:00:15 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14396 DF PROTO=TCP SPT=53710 DPT=9100 SEQ=26865489 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BD9B0060000000001030307) 
Feb 23 09:00:18 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29968 DF PROTO=TCP SPT=54134 DPT=9101 SEQ=47187795 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BD9BC070000000001030307) 
Feb 23 09:00:21 np0005626463.localdomain sudo[109090]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 09:00:21 np0005626463.localdomain sudo[109090]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:00:21 np0005626463.localdomain sudo[109090]: pam_unix(sudo:session): session closed for user root
Feb 23 09:00:21 np0005626463.localdomain sudo[109105]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 23 09:00:21 np0005626463.localdomain sudo[109105]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:00:22 np0005626463.localdomain sudo[109105]: pam_unix(sudo:session): session closed for user root
Feb 23 09:00:24 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45875 DF PROTO=TCP SPT=33512 DPT=9102 SEQ=461622767 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BD9D2A00000000001030307) 
Feb 23 09:00:24 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5820 DF PROTO=TCP SPT=50952 DPT=9882 SEQ=2048617207 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BD9D6070000000001030307) 
Feb 23 09:00:25 np0005626463.localdomain sudo[109153]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 09:00:25 np0005626463.localdomain sudo[109153]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:00:25 np0005626463.localdomain sudo[109153]: pam_unix(sudo:session): session closed for user root
Feb 23 09:00:26 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:f0:86:db MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.110 DST=192.168.122.106 LEN=40 TOS=0x00 PREC=0x00 TTL=64 ID=0 DF PROTO=TCP SPT=6379 DPT=52448 SEQ=3894642472 ACK=0 WINDOW=0 RES=0x00 RST URGP=0 
Feb 23 09:00:30 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25464 DF PROTO=TCP SPT=55642 DPT=9100 SEQ=2139831579 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BD9ED070000000001030307) 
Feb 23 09:00:32 np0005626463.localdomain sshd[109168]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:00:32 np0005626463.localdomain sshd[109168]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:00:32 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.
Feb 23 09:00:32 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25465 DF PROTO=TCP SPT=55642 DPT=9100 SEQ=2139831579 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BD9F5060000000001030307) 
Feb 23 09:00:32 np0005626463.localdomain systemd[1]: tmp-crun.fq3DkZ.mount: Deactivated successfully.
Feb 23 09:00:32 np0005626463.localdomain podman[109170]: 2026-02-23 09:00:32.921328364 +0000 UTC m=+0.102323605 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, com.redhat.component=openstack-collectd-container, maintainer=OpenStack TripleO Team, architecture=x86_64, container_name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:10:15Z, distribution-scope=public, batch=17.1_20260112.1, io.buildah.version=1.41.5, managed_by=tripleo_ansible, release=1766032510, config_id=tripleo_step3, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, vcs-type=git, description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.expose-services=, build-date=2026-01-12T22:10:15Z)
Feb 23 09:00:32 np0005626463.localdomain podman[109170]: 2026-02-23 09:00:32.934532975 +0000 UTC m=+0.115528236 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, architecture=x86_64, release=1766032510, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, container_name=collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack 
osp-17.1 openstack-collectd, vendor=Red Hat, Inc., io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, vcs-type=git, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-collectd, build-date=2026-01-12T22:10:15Z, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, config_id=tripleo_step3)
Feb 23 09:00:32 np0005626463.localdomain sshd[109187]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:00:32 np0005626463.localdomain systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully.
Feb 23 09:00:35 np0005626463.localdomain sshd[109187]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:00:36 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39575 DF PROTO=TCP SPT=53718 DPT=9101 SEQ=1764346632 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDA02070000000001030307) 
Feb 23 09:00:39 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45879 DF PROTO=TCP SPT=33512 DPT=9102 SEQ=461622767 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDA0E060000000001030307) 
Feb 23 09:00:41 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.
Feb 23 09:00:41 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.
Feb 23 09:00:41 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.
Feb 23 09:00:41 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.
Feb 23 09:00:41 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.
Feb 23 09:00:41 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.
Feb 23 09:00:41 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.
Feb 23 09:00:41 np0005626463.localdomain podman[109194]: Error: container 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 is not running
Feb 23 09:00:41 np0005626463.localdomain systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Main process exited, code=exited, status=125/n/a
Feb 23 09:00:41 np0005626463.localdomain systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Failed with result 'exit-code'.
Feb 23 09:00:41 np0005626463.localdomain systemd[1]: tmp-crun.UHE6cs.mount: Deactivated successfully.
Feb 23 09:00:41 np0005626463.localdomain podman[109192]: 2026-02-23 09:00:41.697086221 +0000 UTC m=+0.105676839 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, distribution-scope=public, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=705339545363fec600102567c4e923938e0f43b3, build-date=2026-01-12T22:34:43Z, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, release=1766032510, vendor=Red Hat, Inc., config_id=tripleo_step3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, batch=17.1_20260112.1, vcs-type=git, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:34:43Z, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, com.redhat.component=openstack-iscsid-container, name=rhosp-rhel9/openstack-iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team)
Feb 23 09:00:41 np0005626463.localdomain podman[109192]: 2026-02-23 09:00:41.736325452 +0000 UTC m=+0.144916100 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, url=https://www.redhat.com, io.buildah.version=1.41.5, vcs-ref=705339545363fec600102567c4e923938e0f43b3, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, name=rhosp-rhel9/openstack-iscsid, vcs-type=git, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, container_name=iscsid, 
org.opencontainers.image.created=2026-01-12T22:34:43Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13, build-date=2026-01-12T22:34:43Z, managed_by=tripleo_ansible, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, com.redhat.component=openstack-iscsid-container, maintainer=OpenStack TripleO Team)
Feb 23 09:00:41 np0005626463.localdomain systemd[1]: tmp-crun.PUvHOt.mount: Deactivated successfully.
Feb 23 09:00:41 np0005626463.localdomain systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully.
Feb 23 09:00:41 np0005626463.localdomain podman[109191]: 2026-02-23 09:00:41.785581354 +0000 UTC m=+0.195568335 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, distribution-scope=public, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.buildah.version=1.41.5, vendor=Red Hat, Inc., managed_by=tripleo_ansible, config_id=tripleo_step4, batch=17.1_20260112.1, vcs-type=git, container_name=ovn_controller, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:36:40Z, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, name=rhosp-rhel9/openstack-ovn-controller, release=1766032510, url=https://www.redhat.com, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64)
Feb 23 09:00:41 np0005626463.localdomain podman[109195]: 2026-02-23 09:00:41.741808772 +0000 UTC m=+0.137340594 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.openshift.expose-services=, architecture=x86_64, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:15Z, container_name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, name=rhosp-rhel9/openstack-cron, vendor=Red Hat, Inc., managed_by=tripleo_ansible, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-cron-container, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 cron, build-date=2026-01-12T22:10:15Z, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.5, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com)
Feb 23 09:00:41 np0005626463.localdomain podman[109193]: 2026-02-23 09:00:41.806376232 +0000 UTC m=+0.201428529 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, architecture=x86_64, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, container_name=ovn_metadata_agent, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, build-date=2026-01-12T22:56:19Z, version=17.1.13, url=https://www.redhat.com, release=1766032510, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, vcs-type=git, distribution-scope=public, vendor=Red Hat, Inc.)
Feb 23 09:00:41 np0005626463.localdomain podman[109193]: 2026-02-23 09:00:41.847969596 +0000 UTC m=+0.243021893 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, 
container_name=ovn_metadata_agent, architecture=x86_64, distribution-scope=public, build-date=2026-01-12T22:56:19Z, batch=17.1_20260112.1, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, managed_by=tripleo_ansible, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, release=1766032510, tcib_managed=true)
Feb 23 09:00:41 np0005626463.localdomain podman[109193]: unhealthy
Feb 23 09:00:41 np0005626463.localdomain podman[109212]: 2026-02-23 09:00:41.859096512 +0000 UTC m=+0.250930249 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, name=rhosp-rhel9/openstack-qdrouterd, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, architecture=x86_64, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, version=17.1.13, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, config_id=tripleo_step1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:10:14Z, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, vendor=Red Hat, Inc., org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, distribution-scope=public)
Feb 23 09:00:41 np0005626463.localdomain systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Main process exited, code=exited, status=1/FAILURE
Feb 23 09:00:41 np0005626463.localdomain systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Failed with result 'exit-code'.
Feb 23 09:00:41 np0005626463.localdomain podman[109191]: 2026-02-23 09:00:41.86835578 +0000 UTC m=+0.278342751 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, version=17.1.13, description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., io.openshift.expose-services=, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-ovn-controller, vcs-type=git, release=1766032510, org.opencontainers.image.created=2026-01-12T22:36:40Z, tcib_managed=true, distribution-scope=public, build-date=2026-01-12T22:36:40Z, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.5, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, url=https://www.redhat.com, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, architecture=x86_64)
Feb 23 09:00:41 np0005626463.localdomain podman[109191]: unhealthy
Feb 23 09:00:41 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Main process exited, code=exited, status=1/FAILURE
Feb 23 09:00:41 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Failed with result 'exit-code'.
Feb 23 09:00:41 np0005626463.localdomain podman[109207]: 2026-02-23 09:00:41.912680979 +0000 UTC m=+0.298433807 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=unhealthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, io.openshift.expose-services=, name=rhosp-rhel9/openstack-nova-compute, url=https://www.redhat.com, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public, tcib_managed=true, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, container_name=nova_compute, managed_by=tripleo_ansible, config_id=tripleo_step5, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2026-01-12T23:32:04Z, org.opencontainers.image.created=2026-01-12T23:32:04Z)
Feb 23 09:00:41 np0005626463.localdomain podman[109195]: 2026-02-23 09:00:41.930120692 +0000 UTC m=+0.325652544 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, container_name=logrotate_crond, url=https://www.redhat.com, version=17.1.13, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-cron, vcs-type=git, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, build-date=2026-01-12T22:10:15Z, distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 cron)
Feb 23 09:00:41 np0005626463.localdomain podman[109207]: 2026-02-23 09:00:41.93745078 +0000 UTC m=+0.323203648 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, config_id=tripleo_step5, distribution-scope=public, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.created=2026-01-12T23:32:04Z, version=17.1.13, vendor=Red Hat, Inc., batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, build-date=2026-01-12T23:32:04Z, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.buildah.version=1.41.5, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, container_name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 23 09:00:41 np0005626463.localdomain systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully.
Feb 23 09:00:41 np0005626463.localdomain podman[109207]: unhealthy
Feb 23 09:00:41 np0005626463.localdomain systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Main process exited, code=exited, status=1/FAILURE
Feb 23 09:00:41 np0005626463.localdomain systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Failed with result 'exit-code'.
Feb 23 09:00:42 np0005626463.localdomain podman[109212]: 2026-02-23 09:00:42.118494713 +0000 UTC m=+0.510328430 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, name=rhosp-rhel9/openstack-qdrouterd, config_id=tripleo_step1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, url=https://www.redhat.com, architecture=x86_64, build-date=2026-01-12T22:10:14Z, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, version=17.1.13, release=1766032510, vendor=Red Hat, Inc., io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:10:14Z, distribution-scope=public, maintainer=OpenStack TripleO Team, io.openshift.expose-services=)
Feb 23 09:00:42 np0005626463.localdomain systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully.
Feb 23 09:00:42 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36339 DF PROTO=TCP SPT=49580 DPT=9882 SEQ=53511426 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDA1A460000000001030307) 
Feb 23 09:00:43 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.
Feb 23 09:00:43 np0005626463.localdomain systemd[1]: tmp-crun.VlgfwV.mount: Deactivated successfully.
Feb 23 09:00:43 np0005626463.localdomain podman[109326]: 2026-02-23 09:00:43.916860859 +0000 UTC m=+0.093885162 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.13, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, distribution-scope=public, vcs-type=git, architecture=x86_64, build-date=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, tcib_managed=true, url=https://www.redhat.com)
Feb 23 09:00:44 np0005626463.localdomain podman[109326]: 2026-02-23 09:00:44.306598735 +0000 UTC m=+0.483622998 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-type=git, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., distribution-scope=public, container_name=nova_migration_target, name=rhosp-rhel9/openstack-nova-compute, io.buildah.version=1.41.5, tcib_managed=true, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 23 09:00:44 np0005626463.localdomain systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully.
Feb 23 09:00:44 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59359 DF PROTO=TCP SPT=35460 DPT=9105 SEQ=2680018870 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDA24060000000001030307) 
Feb 23 09:00:48 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39577 DF PROTO=TCP SPT=53718 DPT=9101 SEQ=1764346632 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDA32060000000001030307) 
Feb 23 09:00:54 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11700 DF PROTO=TCP SPT=32960 DPT=9102 SEQ=195551821 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDA47D00000000001030307) 
Feb 23 09:00:54 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36341 DF PROTO=TCP SPT=49580 DPT=9882 SEQ=53511426 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDA4A070000000001030307) 
Feb 23 09:00:55 np0005626463.localdomain podman[109076]: time="2026-02-23T09:00:55Z" level=warning msg="StopSignal SIGTERM failed to stop container ceilometer_agent_ipmi in 42 seconds, resorting to SIGKILL"
Feb 23 09:00:55 np0005626463.localdomain systemd[1]: tmp-crun.bEolY8.mount: Deactivated successfully.
Feb 23 09:00:55 np0005626463.localdomain systemd[1]: libpod-9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.scope: Deactivated successfully.
Feb 23 09:00:55 np0005626463.localdomain systemd[1]: libpod-9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.scope: Consumed 7.261s CPU time.
Feb 23 09:00:55 np0005626463.localdomain podman[109076]: 2026-02-23 09:00:55.789167514 +0000 UTC m=+42.111801170 container died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, architecture=x86_64, com.redhat.component=openstack-ceilometer-ipmi-container, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13, build-date=2026-01-12T23:07:30Z, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vendor=Red Hat, Inc., tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=ceilometer_agent_ipmi, io.openshift.expose-services=, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, name=rhosp-rhel9/openstack-ceilometer-ipmi, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Feb 23 09:00:55 np0005626463.localdomain systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.timer: Deactivated successfully.
Feb 23 09:00:55 np0005626463.localdomain systemd[1]: Stopped /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.
Feb 23 09:00:55 np0005626463.localdomain systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Failed to open /run/systemd/transient/9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: No such file or directory
Feb 23 09:00:55 np0005626463.localdomain podman[109076]: 2026-02-23 09:00:55.846430526 +0000 UTC m=+42.169064182 container cleanup 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vcs-type=git, batch=17.1_20260112.1, tcib_managed=true, build-date=2026-01-12T23:07:30Z, url=https://www.redhat.com, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, config_id=tripleo_step4, container_name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, release=1766032510, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Feb 23 09:00:55 np0005626463.localdomain podman[109076]: ceilometer_agent_ipmi
Feb 23 09:00:55 np0005626463.localdomain systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.timer: Failed to open /run/systemd/transient/9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.timer: No such file or directory
Feb 23 09:00:55 np0005626463.localdomain systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Failed to open /run/systemd/transient/9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: No such file or directory
Feb 23 09:00:55 np0005626463.localdomain podman[109350]: 2026-02-23 09:00:55.890775575 +0000 UTC m=+0.082811687 container cleanup 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_ipmi, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1766032510, io.openshift.expose-services=, tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, io.buildah.version=1.41.5, url=https://www.redhat.com, build-date=2026-01-12T23:07:30Z, org.opencontainers.image.created=2026-01-12T23:07:30Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, version=17.1.13, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06)
Feb 23 09:00:55 np0005626463.localdomain systemd[1]: libpod-conmon-9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.scope: Deactivated successfully.
Feb 23 09:00:55 np0005626463.localdomain systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.timer: Failed to open /run/systemd/transient/9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.timer: No such file or directory
Feb 23 09:00:56 np0005626463.localdomain systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Failed to open /run/systemd/transient/9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: No such file or directory
Feb 23 09:00:56 np0005626463.localdomain podman[109364]: 2026-02-23 09:00:56.001056917 +0000 UTC m=+0.076627186 container cleanup 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-ceilometer-ipmi, vcs-type=git, version=17.1.13, build-date=2026-01-12T23:07:30Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_id=tripleo_step4, release=1766032510, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.buildah.version=1.41.5, batch=17.1_20260112.1, distribution-scope=public, url=https://www.redhat.com, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 23 09:00:56 np0005626463.localdomain podman[109364]: ceilometer_agent_ipmi
Feb 23 09:00:56 np0005626463.localdomain systemd[1]: tripleo_ceilometer_agent_ipmi.service: Deactivated successfully.
Feb 23 09:00:56 np0005626463.localdomain systemd[1]: Stopped ceilometer_agent_ipmi container.
Feb 23 09:00:56 np0005626463.localdomain sudo[109011]: pam_unix(sudo:session): session closed for user root
Feb 23 09:00:56 np0005626463.localdomain sudo[109464]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-voappsnubzvjrylsavsrzohlroloufzp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837256.1993139-108-190282453174439/AnsiballZ_systemd_service.py
Feb 23 09:00:56 np0005626463.localdomain sudo[109464]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:00:56 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-51915910ced93426f00f1704499e6c4900ce6f68bf275b1a1584b9abaa73dcbc-merged.mount: Deactivated successfully.
Feb 23 09:00:56 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950-userdata-shm.mount: Deactivated successfully.
Feb 23 09:00:56 np0005626463.localdomain python3.9[109466]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_collectd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 09:00:57 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11702 DF PROTO=TCP SPT=32960 DPT=9102 SEQ=195551821 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDA53C70000000001030307) 
Feb 23 09:00:57 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 09:00:58 np0005626463.localdomain systemd-rc-local-generator[109489]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 09:00:58 np0005626463.localdomain systemd-sysv-generator[109492]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 09:00:58 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 09:00:58 np0005626463.localdomain systemd[1]: Stopping collectd container...
Feb 23 09:01:00 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58829 DF PROTO=TCP SPT=59464 DPT=9100 SEQ=1379809827 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDA62460000000001030307) 
Feb 23 09:01:01 np0005626463.localdomain CROND[109521]: (root) CMD (run-parts /etc/cron.hourly)
Feb 23 09:01:01 np0005626463.localdomain run-parts[109524]: (/etc/cron.hourly) starting 0anacron
Feb 23 09:01:01 np0005626463.localdomain anacron[109532]: Anacron started on 2026-02-23
Feb 23 09:01:01 np0005626463.localdomain anacron[109532]: Will run job `cron.daily' in 30 min.
Feb 23 09:01:01 np0005626463.localdomain anacron[109532]: Will run job `cron.weekly' in 50 min.
Feb 23 09:01:01 np0005626463.localdomain anacron[109532]: Will run job `cron.monthly' in 70 min.
Feb 23 09:01:01 np0005626463.localdomain anacron[109532]: Jobs will be executed sequentially
Feb 23 09:01:01 np0005626463.localdomain run-parts[109534]: (/etc/cron.hourly) finished 0anacron
Feb 23 09:01:01 np0005626463.localdomain CROND[109520]: (root) CMDEND (run-parts /etc/cron.hourly)
Feb 23 09:01:01 np0005626463.localdomain CROND[109536]: (root) CMD (run-parts /etc/cron.hourly)
Feb 23 09:01:01 np0005626463.localdomain run-parts[109539]: (/etc/cron.hourly) starting 0anacron
Feb 23 09:01:01 np0005626463.localdomain run-parts[109545]: (/etc/cron.hourly) finished 0anacron
Feb 23 09:01:01 np0005626463.localdomain CROND[109535]: (root) CMDEND (run-parts /etc/cron.hourly)
Feb 23 09:01:02 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58830 DF PROTO=TCP SPT=59464 DPT=9100 SEQ=1379809827 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDA6A460000000001030307) 
Feb 23 09:01:03 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.
Feb 23 09:01:03 np0005626463.localdomain podman[109546]: Error: container 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 is not running
Feb 23 09:01:03 np0005626463.localdomain systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Main process exited, code=exited, status=125/n/a
Feb 23 09:01:03 np0005626463.localdomain systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Failed with result 'exit-code'.
Feb 23 09:01:06 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30645 DF PROTO=TCP SPT=36270 DPT=9101 SEQ=807460934 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDA77460000000001030307) 
Feb 23 09:01:07 np0005626463.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Feb 23 09:01:07 np0005626463.localdomain recover_tripleo_nova_virtqemud[109559]: 61982
Feb 23 09:01:07 np0005626463.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Feb 23 09:01:07 np0005626463.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Feb 23 09:01:08 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:f0:86:db MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.110 DST=192.168.122.106 LEN=40 TOS=0x00 PREC=0x00 TTL=64 ID=0 DF PROTO=TCP SPT=6379 DPT=52466 SEQ=3556684315 ACK=0 WINDOW=0 RES=0x00 RST URGP=0 
Feb 23 09:01:11 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.
Feb 23 09:01:11 np0005626463.localdomain systemd[1]: tmp-crun.d6dPC1.mount: Deactivated successfully.
Feb 23 09:01:11 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.
Feb 23 09:01:11 np0005626463.localdomain podman[109560]: 2026-02-23 09:01:11.937992683 +0000 UTC m=+0.108506968 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, release=1766032510, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-iscsid, tcib_managed=true, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, container_name=iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:34:43Z, com.redhat.component=openstack-iscsid-container, build-date=2026-01-12T22:34:43Z, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 iscsid)
Feb 23 09:01:11 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.
Feb 23 09:01:11 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.
Feb 23 09:01:11 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.
Feb 23 09:01:11 np0005626463.localdomain podman[109560]: 2026-02-23 09:01:11.983403015 +0000 UTC m=+0.153917240 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, name=rhosp-rhel9/openstack-iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, url=https://www.redhat.com, batch=17.1_20260112.1, build-date=2026-01-12T22:34:43Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, 
config_id=tripleo_step3, com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, release=1766032510, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:34:43Z, container_name=iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, version=17.1.13)
Feb 23 09:01:12 np0005626463.localdomain systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully.
Feb 23 09:01:12 np0005626463.localdomain podman[109580]: 2026-02-23 09:01:12.039380317 +0000 UTC m=+0.085694538 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, build-date=2026-01-12T22:56:19Z, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, managed_by=tripleo_ansible, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vendor=Red Hat, Inc., batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:56:19Z, 
org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, io.buildah.version=1.41.5, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, maintainer=OpenStack TripleO Team, vcs-type=git, io.openshift.expose-services=, version=17.1.13, container_name=ovn_metadata_agent, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Feb 23 09:01:12 np0005626463.localdomain podman[109580]: 2026-02-23 09:01:12.087595827 +0000 UTC m=+0.133910038 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.expose-services=, config_id=tripleo_step4, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, 
summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, tcib_managed=true, version=17.1.13, container_name=ovn_metadata_agent, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, distribution-scope=public, build-date=2026-01-12T22:56:19Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:56:19Z, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn)
Feb 23 09:01:12 np0005626463.localdomain podman[109580]: unhealthy
Feb 23 09:01:12 np0005626463.localdomain podman[109579]: 2026-02-23 09:01:12.096406281 +0000 UTC m=+0.146339694 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, summary=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-ovn-controller, vcs-type=git, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, managed_by=tripleo_ansible, url=https://www.redhat.com, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, build-date=2026-01-12T22:36:40Z, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, distribution-scope=public, com.redhat.component=openstack-ovn-controller-container, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:36:40Z, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, description=Red Hat OpenStack Platform 17.1 ovn-controller, 
io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13)
Feb 23 09:01:12 np0005626463.localdomain systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Main process exited, code=exited, status=1/FAILURE
Feb 23 09:01:12 np0005626463.localdomain systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Failed with result 'exit-code'.
Feb 23 09:01:12 np0005626463.localdomain podman[109579]: 2026-02-23 09:01:12.114445293 +0000 UTC m=+0.164378656 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, name=rhosp-rhel9/openstack-ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, architecture=x86_64, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, release=1766032510, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:36:40Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., vcs-type=git, io.openshift.expose-services=, managed_by=tripleo_ansible, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, build-date=2026-01-12T22:36:40Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ovn-controller, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, tcib_managed=true, config_id=tripleo_step4, container_name=ovn_controller, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13)
Feb 23 09:01:12 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.
Feb 23 09:01:12 np0005626463.localdomain podman[109587]: 2026-02-23 09:01:12.167038559 +0000 UTC m=+0.205073082 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, name=rhosp-rhel9/openstack-cron, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20260112.1, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:10:15Z, 
url=https://www.redhat.com, vcs-type=git, container_name=logrotate_crond, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, vendor=Red Hat, Inc., io.openshift.expose-services=, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, build-date=2026-01-12T22:10:15Z, release=1766032510)
Feb 23 09:01:12 np0005626463.localdomain podman[109579]: unhealthy
Feb 23 09:01:12 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Main process exited, code=exited, status=1/FAILURE
Feb 23 09:01:12 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Failed with result 'exit-code'.
Feb 23 09:01:12 np0005626463.localdomain podman[109587]: 2026-02-23 09:01:12.254452369 +0000 UTC m=+0.292486852 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, 
description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-cron-container, build-date=2026-01-12T22:10:15Z, name=rhosp-rhel9/openstack-cron, url=https://www.redhat.com, tcib_managed=true, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-type=git, architecture=x86_64, batch=17.1_20260112.1, config_id=tripleo_step4, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:15Z)
Feb 23 09:01:12 np0005626463.localdomain systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully.
Feb 23 09:01:12 np0005626463.localdomain podman[109593]: 2026-02-23 09:01:12.304249008 +0000 UTC m=+0.335580912 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=unhealthy, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step5, tcib_managed=true, io.openshift.expose-services=, architecture=x86_64, io.buildah.version=1.41.5, release=1766032510, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., url=https://www.redhat.com, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1)
Feb 23 09:01:12 np0005626463.localdomain podman[109593]: 2026-02-23 09:01:12.327308326 +0000 UTC m=+0.358640240 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, managed_by=tripleo_ansible, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step5, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.openshift.expose-services=, url=https://www.redhat.com, vcs-type=git, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, build-date=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 23 09:01:12 np0005626463.localdomain sshd[109686]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:01:12 np0005626463.localdomain podman[109593]: unhealthy
Feb 23 09:01:12 np0005626463.localdomain systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Main process exited, code=exited, status=1/FAILURE
Feb 23 09:01:12 np0005626463.localdomain systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Failed with result 'exit-code'.
Feb 23 09:01:12 np0005626463.localdomain podman[109647]: 2026-02-23 09:01:12.255743219 +0000 UTC m=+0.078028509 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, version=17.1.13, container_name=metrics_qdr, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1766032510, build-date=2026-01-12T22:10:14Z, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, 
managed_by=tripleo_ansible, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, distribution-scope=public, vendor=Red Hat, Inc., config_id=tripleo_step1)
Feb 23 09:01:12 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33856 DF PROTO=TCP SPT=34684 DPT=9882 SEQ=912446976 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDA8F860000000001030307) 
Feb 23 09:01:12 np0005626463.localdomain podman[109647]: 2026-02-23 09:01:12.52347397 +0000 UTC m=+0.345759270 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, name=rhosp-rhel9/openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, managed_by=tripleo_ansible, batch=17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.created=2026-01-12T22:10:14Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, distribution-scope=public, maintainer=OpenStack TripleO Team, release=1766032510, container_name=metrics_qdr)
Feb 23 09:01:12 np0005626463.localdomain systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully.
Feb 23 09:01:12 np0005626463.localdomain sshd[109686]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:01:12 np0005626463.localdomain sshd[109688]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:01:13 np0005626463.localdomain sshd[109688]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:01:14 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.
Feb 23 09:01:14 np0005626463.localdomain podman[109690]: 2026-02-23 09:01:14.920061509 +0000 UTC m=+0.087289707 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, version=17.1.13, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, io.openshift.expose-services=, 
container_name=nova_migration_target, name=rhosp-rhel9/openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:32:04Z, tcib_managed=true, release=1766032510, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com)
Feb 23 09:01:15 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53492 DF PROTO=TCP SPT=44140 DPT=9105 SEQ=3525360412 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDA9A060000000001030307) 
Feb 23 09:01:15 np0005626463.localdomain podman[109690]: 2026-02-23 09:01:15.341343938 +0000 UTC m=+0.508572156 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, url=https://www.redhat.com, container_name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., config_id=tripleo_step4, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.created=2026-01-12T23:32:04Z, architecture=x86_64, tcib_managed=true, 
maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-nova-compute, build-date=2026-01-12T23:32:04Z, vcs-type=git, version=17.1.13, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 23 09:01:15 np0005626463.localdomain systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully.
Feb 23 09:01:18 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30647 DF PROTO=TCP SPT=36270 DPT=9101 SEQ=807460934 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDAA8060000000001030307) 
Feb 23 09:01:22 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:f0:86:db MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.110 DST=192.168.122.106 LEN=40 TOS=0x00 PREC=0x00 TTL=64 ID=0 DF PROTO=TCP SPT=6379 DPT=52466 SEQ=3556684315 ACK=0 WINDOW=0 RES=0x00 RST URGP=0 
Feb 23 09:01:24 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2242 DF PROTO=TCP SPT=49472 DPT=9102 SEQ=2622482458 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDABD000000000001030307) 
Feb 23 09:01:25 np0005626463.localdomain sudo[109714]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 09:01:25 np0005626463.localdomain sudo[109714]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:01:25 np0005626463.localdomain sudo[109714]: pam_unix(sudo:session): session closed for user root
Feb 23 09:01:25 np0005626463.localdomain sudo[109729]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 check-host
Feb 23 09:01:25 np0005626463.localdomain sudo[109729]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:01:26 np0005626463.localdomain sudo[109729]: pam_unix(sudo:session): session closed for user root
Feb 23 09:01:26 np0005626463.localdomain sudo[109764]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 09:01:26 np0005626463.localdomain sudo[109764]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:01:26 np0005626463.localdomain sudo[109764]: pam_unix(sudo:session): session closed for user root
Feb 23 09:01:26 np0005626463.localdomain sudo[109779]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 23 09:01:26 np0005626463.localdomain sudo[109779]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:01:27 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2244 DF PROTO=TCP SPT=49472 DPT=9102 SEQ=2622482458 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDAC9060000000001030307) 
Feb 23 09:01:27 np0005626463.localdomain sudo[109779]: pam_unix(sudo:session): session closed for user root
Feb 23 09:01:27 np0005626463.localdomain sudo[109825]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 09:01:27 np0005626463.localdomain sudo[109825]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:01:27 np0005626463.localdomain sudo[109825]: pam_unix(sudo:session): session closed for user root
Feb 23 09:01:30 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56182 DF PROTO=TCP SPT=41342 DPT=9100 SEQ=2807125067 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDAD7870000000001030307) 
Feb 23 09:01:32 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56183 DF PROTO=TCP SPT=41342 DPT=9100 SEQ=2807125067 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDADF860000000001030307) 
Feb 23 09:01:33 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.
Feb 23 09:01:33 np0005626463.localdomain podman[109840]: Error: container 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 is not running
Feb 23 09:01:33 np0005626463.localdomain systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Main process exited, code=exited, status=125/n/a
Feb 23 09:01:33 np0005626463.localdomain systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Failed with result 'exit-code'.
Feb 23 09:01:36 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3111 DF PROTO=TCP SPT=52292 DPT=9101 SEQ=322994156 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDAEC460000000001030307) 
Feb 23 09:01:39 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1999 DF PROTO=TCP SPT=33644 DPT=9882 SEQ=191684158 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDAF8B30000000001030307) 
Feb 23 09:01:40 np0005626463.localdomain podman[109506]: time="2026-02-23T09:01:40Z" level=warning msg="StopSignal SIGTERM failed to stop container collectd in 42 seconds, resorting to SIGKILL"
Feb 23 09:01:40 np0005626463.localdomain podman[109506]: 2026-02-23 09:01:40.342344401 +0000 UTC m=+42.068250800 container stop 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, name=rhosp-rhel9/openstack-collectd, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, 
org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, release=1766032510, container_name=collectd, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z, tcib_managed=true, url=https://www.redhat.com, batch=17.1_20260112.1, io.openshift.expose-services=, vcs-type=git, io.buildah.version=1.41.5, distribution-scope=public, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 collectd)
Feb 23 09:01:40 np0005626463.localdomain systemd[1]: libpod-186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.scope: Deactivated successfully.
Feb 23 09:01:40 np0005626463.localdomain systemd[1]: libpod-186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.scope: Consumed 2.685s CPU time.
Feb 23 09:01:40 np0005626463.localdomain podman[109506]: 2026-02-23 09:01:40.374199011 +0000 UTC m=+42.100105380 container died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, name=rhosp-rhel9/openstack-collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, distribution-scope=public, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, release=1766032510, build-date=2026-01-12T22:10:15Z, version=17.1.13, batch=17.1_20260112.1, com.redhat.component=openstack-collectd-container, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git)
Feb 23 09:01:40 np0005626463.localdomain systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.timer: Deactivated successfully.
Feb 23 09:01:40 np0005626463.localdomain systemd[1]: Stopped /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.
Feb 23 09:01:40 np0005626463.localdomain systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Failed to open /run/systemd/transient/186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: No such file or directory
Feb 23 09:01:40 np0005626463.localdomain podman[109506]: 2026-02-23 09:01:40.480799629 +0000 UTC m=+42.206705978 container cleanup 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=collectd, distribution-scope=public, tcib_managed=true, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, 
release=1766032510, org.opencontainers.image.created=2026-01-12T22:10:15Z, url=https://www.redhat.com, name=rhosp-rhel9/openstack-collectd, batch=17.1_20260112.1, vcs-type=git, build-date=2026-01-12T22:10:15Z, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, vendor=Red Hat, Inc., managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, architecture=x86_64, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 23 09:01:40 np0005626463.localdomain podman[109506]: collectd
Feb 23 09:01:40 np0005626463.localdomain systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.timer: Failed to open /run/systemd/transient/186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.timer: No such file or directory
Feb 23 09:01:40 np0005626463.localdomain systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Failed to open /run/systemd/transient/186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: No such file or directory
Feb 23 09:01:40 np0005626463.localdomain podman[109853]: 2026-02-23 09:01:40.513359602 +0000 UTC m=+0.153980613 container cleanup 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, name=rhosp-rhel9/openstack-collectd, distribution-scope=public, container_name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, 
description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:10:15Z, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, build-date=2026-01-12T22:10:15Z, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step3, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, tcib_managed=true, io.openshift.expose-services=, io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc.)
Feb 23 09:01:40 np0005626463.localdomain systemd[1]: libpod-conmon-186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.scope: Deactivated successfully.
Feb 23 09:01:40 np0005626463.localdomain podman[109884]: error opening file `/run/crun/186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759/status`: No such file or directory
Feb 23 09:01:40 np0005626463.localdomain systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.timer: Failed to open /run/systemd/transient/186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.timer: No such file or directory
Feb 23 09:01:40 np0005626463.localdomain systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Failed to open /run/systemd/transient/186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: No such file or directory
Feb 23 09:01:40 np0005626463.localdomain podman[109871]: 2026-02-23 09:01:40.630692342 +0000 UTC m=+0.080443504 container cleanup 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, architecture=x86_64, io.buildah.version=1.41.5, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:15Z, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, distribution-scope=public, com.redhat.component=openstack-collectd-container, container_name=collectd, vendor=Red Hat, Inc., build-date=2026-01-12T22:10:15Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true)
Feb 23 09:01:40 np0005626463.localdomain podman[109871]: collectd
Feb 23 09:01:40 np0005626463.localdomain systemd[1]: tripleo_collectd.service: Deactivated successfully.
Feb 23 09:01:40 np0005626463.localdomain systemd[1]: Stopped collectd container.
Feb 23 09:01:40 np0005626463.localdomain sudo[109464]: pam_unix(sudo:session): session closed for user root
Feb 23 09:01:41 np0005626463.localdomain sudo[109975]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fblwjpnzlfmjntfsoglhupfyblbfcpdy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837300.803665-108-110829413275177/AnsiballZ_systemd_service.py
Feb 23 09:01:41 np0005626463.localdomain sudo[109975]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:01:41 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-70b7b3f393818c1da1a59fd13309ca9cf26b2dd139b3696bb046bf52c3291b46-merged.mount: Deactivated successfully.
Feb 23 09:01:41 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759-userdata-shm.mount: Deactivated successfully.
Feb 23 09:01:41 np0005626463.localdomain python3.9[109977]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_iscsid.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 09:01:41 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 09:01:41 np0005626463.localdomain systemd-sysv-generator[110009]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 09:01:41 np0005626463.localdomain systemd-rc-local-generator[110006]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 09:01:41 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 09:01:41 np0005626463.localdomain systemd[1]: Stopping iscsid container...
Feb 23 09:01:41 np0005626463.localdomain systemd[1]: libpod-40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.scope: Deactivated successfully.
Feb 23 09:01:41 np0005626463.localdomain systemd[1]: libpod-40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.scope: Consumed 1.258s CPU time.
Feb 23 09:01:41 np0005626463.localdomain podman[110017]: 2026-02-23 09:01:41.945646637 +0000 UTC m=+0.083385475 container died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, name=rhosp-rhel9/openstack-iscsid, config_id=tripleo_step3, container_name=iscsid, com.redhat.component=openstack-iscsid-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, batch=17.1_20260112.1, managed_by=tripleo_ansible, url=https://www.redhat.com, vendor=Red Hat, Inc., distribution-scope=public, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:34:43Z, io.buildah.version=1.41.5, vcs-ref=705339545363fec600102567c4e923938e0f43b3, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, tcib_managed=true, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']})
Feb 23 09:01:41 np0005626463.localdomain systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.timer: Deactivated successfully.
Feb 23 09:01:41 np0005626463.localdomain systemd[1]: Stopped /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.
Feb 23 09:01:41 np0005626463.localdomain systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Failed to open /run/systemd/transient/40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: No such file or directory
Feb 23 09:01:41 np0005626463.localdomain systemd[1]: tmp-crun.AIUzzI.mount: Deactivated successfully.
Feb 23 09:01:41 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f-userdata-shm.mount: Deactivated successfully.
Feb 23 09:01:41 np0005626463.localdomain podman[110017]: 2026-02-23 09:01:41.997511821 +0000 UTC m=+0.135250639 container cleanup 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step3, architecture=x86_64, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.created=2026-01-12T22:34:43Z, tcib_managed=true, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
com.redhat.component=openstack-iscsid-container, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, io.openshift.expose-services=, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, build-date=2026-01-12T22:34:43Z, vcs-ref=705339545363fec600102567c4e923938e0f43b3, url=https://www.redhat.com, vcs-type=git, container_name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid)
Feb 23 09:01:41 np0005626463.localdomain podman[110017]: iscsid
Feb 23 09:01:42 np0005626463.localdomain systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.timer: Failed to open /run/systemd/transient/40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.timer: No such file or directory
Feb 23 09:01:42 np0005626463.localdomain systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Failed to open /run/systemd/transient/40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: No such file or directory
Feb 23 09:01:42 np0005626463.localdomain podman[110031]: 2026-02-23 09:01:42.04248917 +0000 UTC m=+0.081099804 container cleanup 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, build-date=2026-01-12T22:34:43Z, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-iscsid-container, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, vcs-type=git, container_name=iscsid, 
org.opencontainers.image.created=2026-01-12T22:34:43Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.openshift.expose-services=, io.buildah.version=1.41.5, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, vendor=Red Hat, Inc., vcs-ref=705339545363fec600102567c4e923938e0f43b3, distribution-scope=public)
Feb 23 09:01:42 np0005626463.localdomain systemd[1]: libpod-conmon-40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.scope: Deactivated successfully.
Feb 23 09:01:42 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.
Feb 23 09:01:42 np0005626463.localdomain systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.timer: Failed to open /run/systemd/transient/40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.timer: No such file or directory
Feb 23 09:01:42 np0005626463.localdomain systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Failed to open /run/systemd/transient/40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: No such file or directory
Feb 23 09:01:42 np0005626463.localdomain podman[110046]: 2026-02-23 09:01:42.157158708 +0000 UTC m=+0.081601119 container cleanup 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, config_id=tripleo_step3, name=rhosp-rhel9/openstack-iscsid, managed_by=tripleo_ansible, build-date=2026-01-12T22:34:43Z, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=iscsid, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., batch=17.1_20260112.1, architecture=x86_64, maintainer=OpenStack TripleO Team, distribution-scope=public, 
org.opencontainers.image.created=2026-01-12T22:34:43Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, version=17.1.13, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, release=1766032510, vcs-ref=705339545363fec600102567c4e923938e0f43b3, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 23 09:01:42 np0005626463.localdomain podman[110046]: iscsid
Feb 23 09:01:42 np0005626463.localdomain systemd[1]: tripleo_iscsid.service: Deactivated successfully.
Feb 23 09:01:42 np0005626463.localdomain systemd[1]: Stopped iscsid container.
Feb 23 09:01:42 np0005626463.localdomain sudo[109975]: pam_unix(sudo:session): session closed for user root
Feb 23 09:01:42 np0005626463.localdomain podman[110058]: 2026-02-23 09:01:42.249843212 +0000 UTC m=+0.091991993 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, release=1766032510, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, container_name=ovn_metadata_agent, distribution-scope=public, build-date=2026-01-12T22:56:19Z, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, tcib_managed=true, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:56:19Z, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-type=git, config_id=tripleo_step4, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 23 09:01:42 np0005626463.localdomain podman[110058]: 2026-02-23 09:01:42.265954264 +0000 UTC m=+0.108103045 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, architecture=x86_64, tcib_managed=true, release=1766032510, 
cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, version=17.1.13, io.openshift.expose-services=, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, batch=17.1_20260112.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., build-date=2026-01-12T22:56:19Z, managed_by=tripleo_ansible, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, url=https://www.redhat.com)
Feb 23 09:01:42 np0005626463.localdomain podman[110058]: unhealthy
Feb 23 09:01:42 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.
Feb 23 09:01:42 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.
Feb 23 09:01:42 np0005626463.localdomain systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Main process exited, code=exited, status=1/FAILURE
Feb 23 09:01:42 np0005626463.localdomain systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Failed with result 'exit-code'.
Feb 23 09:01:42 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-52b3b14c7b87d61fbd3bfa894ff158a1c8322ab7dde44afc684a91162f67f067-merged.mount: Deactivated successfully.
Feb 23 09:01:42 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.
Feb 23 09:01:42 np0005626463.localdomain podman[110094]: 2026-02-23 09:01:42.463905052 +0000 UTC m=+0.165710017 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, tcib_managed=true, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, io.openshift.expose-services=, vcs-type=git, distribution-scope=public, build-date=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_id=tripleo_step4, architecture=x86_64, batch=17.1_20260112.1, com.redhat.component=openstack-cron-container, name=rhosp-rhel9/openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, container_name=logrotate_crond)
Feb 23 09:01:42 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2001 DF PROTO=TCP SPT=33644 DPT=9882 SEQ=191684158 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDB04C60000000001030307) 
Feb 23 09:01:42 np0005626463.localdomain podman[110092]: 2026-02-23 09:01:42.412649358 +0000 UTC m=+0.117856658 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2026-01-12T22:36:40Z, vcs-type=git, config_id=tripleo_step4, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, container_name=ovn_controller, version=17.1.13, tcib_managed=true, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', 
'/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, org.opencontainers.image.created=2026-01-12T22:36:40Z, name=rhosp-rhel9/openstack-ovn-controller, distribution-scope=public)
Feb 23 09:01:42 np0005626463.localdomain podman[110156]: 2026-02-23 09:01:42.532775976 +0000 UTC m=+0.111584524 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=unhealthy, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, io.buildah.version=1.41.5, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, release=1766032510, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-nova-compute, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, build-date=2026-01-12T23:32:04Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 
'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.13)
Feb 23 09:01:42 np0005626463.localdomain podman[110092]: 2026-02-23 09:01:42.546361558 +0000 UTC m=+0.251568828 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, build-date=2026-01-12T22:36:40Z, container_name=ovn_controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_id=tripleo_step4, name=rhosp-rhel9/openstack-ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', 
'/var/log/containers/openvswitch:/var/log/ovn:z']}, managed_by=tripleo_ansible, vendor=Red Hat, Inc., batch=17.1_20260112.1, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:36:40Z, maintainer=OpenStack TripleO Team, version=17.1.13, io.buildah.version=1.41.5)
Feb 23 09:01:42 np0005626463.localdomain podman[110092]: unhealthy
Feb 23 09:01:42 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.
Feb 23 09:01:42 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Main process exited, code=exited, status=1/FAILURE
Feb 23 09:01:42 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Failed with result 'exit-code'.
Feb 23 09:01:42 np0005626463.localdomain podman[110156]: 2026-02-23 09:01:42.585397983 +0000 UTC m=+0.164206581 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vcs-type=git, managed_by=tripleo_ansible, version=17.1.13, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., tcib_managed=true, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:32:04Z, io.buildah.version=1.41.5, release=1766032510, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:32:04Z, architecture=x86_64, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, container_name=nova_compute, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 23 09:01:42 np0005626463.localdomain podman[110156]: unhealthy
Feb 23 09:01:42 np0005626463.localdomain systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Main process exited, code=exited, status=1/FAILURE
Feb 23 09:01:42 np0005626463.localdomain systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Failed with result 'exit-code'.
Feb 23 09:01:42 np0005626463.localdomain podman[110094]: 2026-02-23 09:01:42.604171267 +0000 UTC m=+0.305976212 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, distribution-scope=public, io.openshift.expose-services=, name=rhosp-rhel9/openstack-cron, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 cron, build-date=2026-01-12T22:10:15Z, batch=17.1_20260112.1, com.redhat.component=openstack-cron-container, summary=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., tcib_managed=true, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, io.buildah.version=1.41.5, container_name=logrotate_crond, vcs-type=git)
Feb 23 09:01:42 np0005626463.localdomain systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully.
Feb 23 09:01:42 np0005626463.localdomain podman[110199]: 2026-02-23 09:01:42.688300345 +0000 UTC m=+0.106703892 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, distribution-scope=public, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:14Z, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:10:14Z, tcib_managed=true, config_id=tripleo_step1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, vendor=Red Hat, Inc., architecture=x86_64, name=rhosp-rhel9/openstack-qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, maintainer=OpenStack TripleO Team, url=https://www.redhat.com)
Feb 23 09:01:42 np0005626463.localdomain sudo[110257]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ldhvlzeyninrrwhnjkjzmzfkdnjdfmzp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837302.3496974-108-42353339766497/AnsiballZ_systemd_service.py
Feb 23 09:01:42 np0005626463.localdomain sudo[110257]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:01:42 np0005626463.localdomain podman[110199]: 2026-02-23 09:01:42.892545949 +0000 UTC m=+0.310949456 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, vcs-type=git, build-date=2026-01-12T22:10:14Z, version=17.1.13, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., release=1766032510, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, maintainer=OpenStack TripleO Team, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-qdrouterd, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd)
Feb 23 09:01:42 np0005626463.localdomain systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully.
Feb 23 09:01:43 np0005626463.localdomain python3.9[110260]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_logrotate_crond.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 09:01:43 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 09:01:43 np0005626463.localdomain systemd-sysv-generator[110292]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 09:01:43 np0005626463.localdomain systemd-rc-local-generator[110284]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 09:01:43 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 09:01:43 np0005626463.localdomain systemd[1]: Stopping logrotate_crond container...
Feb 23 09:01:43 np0005626463.localdomain crond[70659]: (CRON) INFO (Shutting down)
Feb 23 09:01:43 np0005626463.localdomain systemd[1]: libpod-b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.scope: Deactivated successfully.
Feb 23 09:01:43 np0005626463.localdomain systemd[1]: libpod-b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.scope: Consumed 1.175s CPU time.
Feb 23 09:01:43 np0005626463.localdomain podman[110300]: 2026-02-23 09:01:43.536991161 +0000 UTC m=+0.079844275 container died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, vendor=Red Hat, Inc., version=17.1.13, managed_by=tripleo_ansible, build-date=2026-01-12T22:10:15Z, com.redhat.component=openstack-cron-container, batch=17.1_20260112.1, tcib_managed=true, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, container_name=logrotate_crond, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:10:15Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.expose-services=, name=rhosp-rhel9/openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron)
Feb 23 09:01:43 np0005626463.localdomain systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.timer: Deactivated successfully.
Feb 23 09:01:43 np0005626463.localdomain systemd[1]: Stopped /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.
Feb 23 09:01:43 np0005626463.localdomain systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Failed to open /run/systemd/transient/b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: No such file or directory
Feb 23 09:01:43 np0005626463.localdomain podman[110300]: 2026-02-23 09:01:43.595507562 +0000 UTC m=+0.138360656 container cleanup b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, container_name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, com.redhat.component=openstack-cron-container, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-cron, url=https://www.redhat.com, vendor=Red Hat, Inc., build-date=2026-01-12T22:10:15Z, distribution-scope=public)
Feb 23 09:01:43 np0005626463.localdomain podman[110300]: logrotate_crond
Feb 23 09:01:43 np0005626463.localdomain systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.timer: Failed to open /run/systemd/transient/b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.timer: No such file or directory
Feb 23 09:01:43 np0005626463.localdomain systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Failed to open /run/systemd/transient/b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: No such file or directory
Feb 23 09:01:43 np0005626463.localdomain podman[110313]: 2026-02-23 09:01:43.639678557 +0000 UTC m=+0.085760289 container cleanup b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.openshift.expose-services=, com.redhat.component=openstack-cron-container, tcib_managed=true, name=rhosp-rhel9/openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, vcs-type=git, distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers:/var/log/containers:z']}, container_name=logrotate_crond, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:10:15Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:10:15Z, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible)
Feb 23 09:01:43 np0005626463.localdomain systemd[1]: libpod-conmon-b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.scope: Deactivated successfully.
Feb 23 09:01:43 np0005626463.localdomain podman[110342]: error opening file `/run/crun/b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5/status`: No such file or directory
Feb 23 09:01:43 np0005626463.localdomain systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.timer: Failed to open /run/systemd/transient/b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.timer: No such file or directory
Feb 23 09:01:43 np0005626463.localdomain systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Failed to open /run/systemd/transient/b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: No such file or directory
Feb 23 09:01:43 np0005626463.localdomain podman[110329]: 2026-02-23 09:01:43.760797965 +0000 UTC m=+0.082261310 container cleanup b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, config_id=tripleo_step4, io.buildah.version=1.41.5, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-cron-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, container_name=logrotate_crond, name=rhosp-rhel9/openstack-cron, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-type=git, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, build-date=2026-01-12T22:10:15Z, managed_by=tripleo_ansible, tcib_managed=true, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, summary=Red Hat OpenStack Platform 17.1 cron)
Feb 23 09:01:43 np0005626463.localdomain podman[110329]: logrotate_crond
Feb 23 09:01:43 np0005626463.localdomain systemd[1]: tripleo_logrotate_crond.service: Deactivated successfully.
Feb 23 09:01:43 np0005626463.localdomain systemd[1]: Stopped logrotate_crond container.
Feb 23 09:01:43 np0005626463.localdomain sudo[110257]: pam_unix(sudo:session): session closed for user root
Feb 23 09:01:44 np0005626463.localdomain sudo[110433]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zavjzptkrcgseblnqrcahdzhbfjnsriw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837303.95983-108-272295771683027/AnsiballZ_systemd_service.py
Feb 23 09:01:44 np0005626463.localdomain sudo[110433]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:01:44 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-b2c770567d2f47629c218ae90d489529d9f3e3ed2618072d59a3365c20854653-merged.mount: Deactivated successfully.
Feb 23 09:01:44 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5-userdata-shm.mount: Deactivated successfully.
Feb 23 09:01:44 np0005626463.localdomain python3.9[110435]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_metrics_qdr.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 09:01:44 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 09:01:44 np0005626463.localdomain systemd-rc-local-generator[110464]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 09:01:44 np0005626463.localdomain systemd-sysv-generator[110468]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 09:01:44 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 09:01:45 np0005626463.localdomain systemd[1]: Stopping metrics_qdr container...
Feb 23 09:01:45 np0005626463.localdomain kernel: qdrouterd[54599]: segfault at 0 ip 00007fd1a6c897cb sp 00007ffd4d99c650 error 4 in libc.so.6[7fd1a6c26000+175000]
Feb 23 09:01:45 np0005626463.localdomain kernel: Code: 0b 00 64 44 89 23 85 c0 75 d4 e9 2b ff ff ff e8 db a5 00 00 e9 fd fe ff ff e8 41 1d 0d 00 90 f3 0f 1e fa 41 54 55 48 89 fd 53 <8b> 07 f6 c4 20 0f 85 aa 00 00 00 89 c2 81 e2 00 80 00 00 0f 84 a9
Feb 23 09:01:45 np0005626463.localdomain systemd[1]: Created slice Slice /system/systemd-coredump.
Feb 23 09:01:45 np0005626463.localdomain systemd[1]: Started Process Core Dump (PID 110489/UID 0).
Feb 23 09:01:45 np0005626463.localdomain systemd-coredump[110490]: Resource limits disable core dumping for process 54599 (qdrouterd).
Feb 23 09:01:45 np0005626463.localdomain systemd-coredump[110490]: Process 54599 (qdrouterd) of user 42465 dumped core.
Feb 23 09:01:45 np0005626463.localdomain systemd[1]: systemd-coredump@0-110489-0.service: Deactivated successfully.
Feb 23 09:01:45 np0005626463.localdomain podman[110477]: 2026-02-23 09:01:45.28790388 +0000 UTC m=+0.244261241 container died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step1, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, url=https://www.redhat.com, distribution-scope=public, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, version=17.1.13, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:10:14Z, container_name=metrics_qdr, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, io.buildah.version=1.41.5)
Feb 23 09:01:45 np0005626463.localdomain systemd[1]: libpod-f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.scope: Deactivated successfully.
Feb 23 09:01:45 np0005626463.localdomain systemd[1]: libpod-f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.scope: Consumed 31.088s CPU time.
Feb 23 09:01:45 np0005626463.localdomain systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.timer: Deactivated successfully.
Feb 23 09:01:45 np0005626463.localdomain systemd[1]: Stopped /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.
Feb 23 09:01:45 np0005626463.localdomain systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Failed to open /run/systemd/transient/f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: No such file or directory
Feb 23 09:01:45 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f-userdata-shm.mount: Deactivated successfully.
Feb 23 09:01:45 np0005626463.localdomain podman[110477]: 2026-02-23 09:01:45.345200534 +0000 UTC m=+0.301557865 container cleanup f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:10:14Z, com.redhat.component=openstack-qdrouterd-container, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step1, tcib_managed=true, io.openshift.expose-services=, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, maintainer=OpenStack TripleO Team, release=1766032510, distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, io.buildah.version=1.41.5, batch=17.1_20260112.1, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:10:14Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 23 09:01:45 np0005626463.localdomain podman[110477]: metrics_qdr
Feb 23 09:01:45 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39641 DF PROTO=TCP SPT=49854 DPT=9105 SEQ=996332164 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDB10060000000001030307) 
Feb 23 09:01:45 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.
Feb 23 09:01:45 np0005626463.localdomain systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.timer: Failed to open /run/systemd/transient/f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.timer: No such file or directory
Feb 23 09:01:45 np0005626463.localdomain systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Failed to open /run/systemd/transient/f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: No such file or directory
Feb 23 09:01:45 np0005626463.localdomain podman[110494]: 2026-02-23 09:01:45.391343989 +0000 UTC m=+0.091723574 container cleanup f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.buildah.version=1.41.5, architecture=x86_64, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:10:14Z, build-date=2026-01-12T22:10:14Z, com.redhat.component=openstack-qdrouterd-container, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, batch=17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=metrics_qdr, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, managed_by=tripleo_ansible, vendor=Red Hat, Inc., version=17.1.13, name=rhosp-rhel9/openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd)
Feb 23 09:01:45 np0005626463.localdomain systemd[1]: tripleo_metrics_qdr.service: Main process exited, code=exited, status=139/n/a
Feb 23 09:01:45 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-72bff2249ea9ee03825bd3e8fa07150769abcfe162fde9078852b16a351c2e6d-merged.mount: Deactivated successfully.
Feb 23 09:01:45 np0005626463.localdomain systemd[1]: libpod-conmon-f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.scope: Deactivated successfully.
Feb 23 09:01:45 np0005626463.localdomain podman[110511]: 2026-02-23 09:01:45.481677391 +0000 UTC m=+0.094063119 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, version=17.1.13, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, url=https://www.redhat.com, build-date=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, name=rhosp-rhel9/openstack-nova-compute, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, vendor=Red Hat, Inc., vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vcs-type=git, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, container_name=nova_migration_target, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, maintainer=OpenStack TripleO Team)
Feb 23 09:01:45 np0005626463.localdomain systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.timer: Failed to open /run/systemd/transient/f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.timer: No such file or directory
Feb 23 09:01:45 np0005626463.localdomain systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Failed to open /run/systemd/transient/f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: No such file or directory
Feb 23 09:01:45 np0005626463.localdomain podman[110520]: 2026-02-23 09:01:45.501817317 +0000 UTC m=+0.069066300 container cleanup f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, url=https://www.redhat.com, vendor=Red Hat, Inc., build-date=2026-01-12T22:10:14Z, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, name=rhosp-rhel9/openstack-qdrouterd, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:10:14Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-qdrouterd-container, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, vcs-type=git)
Feb 23 09:01:45 np0005626463.localdomain podman[110520]: metrics_qdr
Feb 23 09:01:45 np0005626463.localdomain systemd[1]: tripleo_metrics_qdr.service: Failed with result 'exit-code'.
Feb 23 09:01:45 np0005626463.localdomain systemd[1]: Stopped metrics_qdr container.
Feb 23 09:01:45 np0005626463.localdomain sudo[110433]: pam_unix(sudo:session): session closed for user root
Feb 23 09:01:45 np0005626463.localdomain podman[110511]: 2026-02-23 09:01:45.88532933 +0000 UTC m=+0.497714998 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, io.openshift.expose-services=, distribution-scope=public, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-nova-compute, io.buildah.version=1.41.5, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, managed_by=tripleo_ansible, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.13, 
org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 23 09:01:45 np0005626463.localdomain sudo[110634]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gigxgluodxljdgtnpxitwpiqpbdxhycr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837305.6399996-108-231233252991027/AnsiballZ_systemd_service.py
Feb 23 09:01:45 np0005626463.localdomain sudo[110634]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:01:45 np0005626463.localdomain systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully.
Feb 23 09:01:46 np0005626463.localdomain python3.9[110636]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_neutron_dhcp.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 09:01:46 np0005626463.localdomain sudo[110634]: pam_unix(sudo:session): session closed for user root
Feb 23 09:01:46 np0005626463.localdomain sudo[110727]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-whkwwgcvxpbzzsokxdvcudfxziwpjubw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837306.3680382-108-85826261908965/AnsiballZ_systemd_service.py
Feb 23 09:01:46 np0005626463.localdomain sudo[110727]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:01:46 np0005626463.localdomain python3.9[110729]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_neutron_l3_agent.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 09:01:48 np0005626463.localdomain sudo[110727]: pam_unix(sudo:session): session closed for user root
Feb 23 09:01:48 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:f0:86:db MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.110 DST=192.168.122.106 LEN=40 TOS=0x00 PREC=0x00 TTL=64 ID=0 DF PROTO=TCP SPT=6379 DPT=52466 SEQ=3556684315 ACK=0 WINDOW=0 RES=0x00 RST URGP=0 
Feb 23 09:01:48 np0005626463.localdomain sudo[110820]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zlaxsdhqneexecmtnkonkgxeywpxklzn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837308.1479945-108-262976271021052/AnsiballZ_systemd_service.py
Feb 23 09:01:48 np0005626463.localdomain sudo[110820]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:01:48 np0005626463.localdomain python3.9[110822]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_neutron_ovs_agent.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 09:01:48 np0005626463.localdomain sudo[110820]: pam_unix(sudo:session): session closed for user root
Feb 23 09:01:49 np0005626463.localdomain sudo[110913]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hbsbuavbborlllbmtiqezcfdbssxxzrd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837308.8835797-108-143478373016556/AnsiballZ_systemd_service.py
Feb 23 09:01:49 np0005626463.localdomain sudo[110913]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:01:49 np0005626463.localdomain python3.9[110915]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 09:01:49 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 09:01:49 np0005626463.localdomain systemd-sysv-generator[110945]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 09:01:49 np0005626463.localdomain systemd-rc-local-generator[110938]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 09:01:49 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 09:01:49 np0005626463.localdomain systemd[1]: Stopping nova_compute container...
Feb 23 09:01:50 np0005626463.localdomain sshd[110967]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:01:51 np0005626463.localdomain sshd[110967]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:01:53 np0005626463.localdomain sshd[110969]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:01:54 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54195 DF PROTO=TCP SPT=42396 DPT=9102 SEQ=975616764 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDB32300000000001030307) 
Feb 23 09:01:54 np0005626463.localdomain sshd[110969]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:01:54 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2003 DF PROTO=TCP SPT=33644 DPT=9882 SEQ=191684158 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDB34060000000001030307) 
Feb 23 09:01:57 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54197 DF PROTO=TCP SPT=42396 DPT=9102 SEQ=975616764 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDB3E470000000001030307) 
Feb 23 09:02:00 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5157 DF PROTO=TCP SPT=42228 DPT=9100 SEQ=1852176895 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDB4CC60000000001030307) 
Feb 23 09:02:02 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5158 DF PROTO=TCP SPT=42228 DPT=9100 SEQ=1852176895 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDB54C60000000001030307) 
Feb 23 09:02:06 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59675 DF PROTO=TCP SPT=47758 DPT=9101 SEQ=3553128173 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDB61870000000001030307) 
Feb 23 09:02:09 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9885 DF PROTO=TCP SPT=46468 DPT=9882 SEQ=2010042467 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDB6DE30000000001030307) 
Feb 23 09:02:12 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9887 DF PROTO=TCP SPT=46468 DPT=9882 SEQ=2010042467 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDB7A060000000001030307) 
Feb 23 09:02:12 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.
Feb 23 09:02:12 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.
Feb 23 09:02:12 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.
Feb 23 09:02:12 np0005626463.localdomain podman[110972]: 2026-02-23 09:02:12.710279664 +0000 UTC m=+0.135426454 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, tcib_managed=true, build-date=2026-01-12T22:56:19Z, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, io.openshift.expose-services=, managed_by=tripleo_ansible, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:56:19Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, batch=17.1_20260112.1, release=1766032510, version=17.1.13, architecture=x86_64)
Feb 23 09:02:12 np0005626463.localdomain podman[110972]: 2026-02-23 09:02:12.726566481 +0000 UTC m=+0.151713321 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, maintainer=OpenStack TripleO Team, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, io.buildah.version=1.41.5, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:56:19Z, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, architecture=x86_64, build-date=2026-01-12T22:56:19Z, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, version=17.1.13, vendor=Red Hat, Inc.)
Feb 23 09:02:12 np0005626463.localdomain podman[110972]: unhealthy
Feb 23 09:02:12 np0005626463.localdomain systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Main process exited, code=exited, status=1/FAILURE
Feb 23 09:02:12 np0005626463.localdomain systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Failed with result 'exit-code'.
Feb 23 09:02:12 np0005626463.localdomain podman[110990]: 2026-02-23 09:02:12.817892693 +0000 UTC m=+0.099384534 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, managed_by=tripleo_ansible, container_name=ovn_controller, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, release=1766032510, vendor=Red Hat, Inc., version=17.1.13, url=https://www.redhat.com, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, build-date=2026-01-12T22:36:40Z, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, config_id=tripleo_step4, 
org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.41.5, io.openshift.expose-services=, tcib_managed=true)
Feb 23 09:02:12 np0005626463.localdomain podman[110991]: Error: container c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 is not running
Feb 23 09:02:12 np0005626463.localdomain systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Main process exited, code=exited, status=125/n/a
Feb 23 09:02:12 np0005626463.localdomain systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Failed with result 'exit-code'.
Feb 23 09:02:12 np0005626463.localdomain podman[110990]: 2026-02-23 09:02:12.861345345 +0000 UTC m=+0.142837196 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, container_name=ovn_controller, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, url=https://www.redhat.com, build-date=2026-01-12T22:36:40Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, name=rhosp-rhel9/openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, 
org.opencontainers.image.created=2026-01-12T22:36:40Z, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, version=17.1.13, io.openshift.expose-services=, vcs-type=git, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, architecture=x86_64)
Feb 23 09:02:12 np0005626463.localdomain podman[110990]: unhealthy
Feb 23 09:02:12 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Main process exited, code=exited, status=1/FAILURE
Feb 23 09:02:12 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Failed with result 'exit-code'.
Feb 23 09:02:15 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5160 DF PROTO=TCP SPT=42228 DPT=9100 SEQ=1852176895 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDB84060000000001030307) 
Feb 23 09:02:16 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.
Feb 23 09:02:16 np0005626463.localdomain podman[111026]: 2026-02-23 09:02:16.390701491 +0000 UTC m=+0.071143664 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, container_name=nova_migration_target, com.redhat.component=openstack-nova-compute-container, name=rhosp-rhel9/openstack-nova-compute, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, architecture=x86_64, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, version=17.1.13, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']})
Feb 23 09:02:16 np0005626463.localdomain podman[111026]: 2026-02-23 09:02:16.781290065 +0000 UTC m=+0.461732238 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
version=17.1.13, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, release=1766032510, name=rhosp-rhel9/openstack-nova-compute, config_id=tripleo_step4, tcib_managed=true, container_name=nova_migration_target, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, build-date=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T23:32:04Z)
Feb 23 09:02:16 np0005626463.localdomain systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully.
Feb 23 09:02:18 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59677 DF PROTO=TCP SPT=47758 DPT=9101 SEQ=3553128173 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDB92060000000001030307) 
Feb 23 09:02:24 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18675 DF PROTO=TCP SPT=36840 DPT=9102 SEQ=2096385036 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDBA7600000000001030307) 
Feb 23 09:02:24 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9889 DF PROTO=TCP SPT=46468 DPT=9882 SEQ=2010042467 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDBAA070000000001030307) 
Feb 23 09:02:27 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18677 DF PROTO=TCP SPT=36840 DPT=9102 SEQ=2096385036 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDBB3870000000001030307) 
Feb 23 09:02:27 np0005626463.localdomain sudo[111049]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 09:02:27 np0005626463.localdomain sudo[111049]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:02:27 np0005626463.localdomain sudo[111049]: pam_unix(sudo:session): session closed for user root
Feb 23 09:02:27 np0005626463.localdomain sudo[111064]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 23 09:02:28 np0005626463.localdomain sudo[111064]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:02:28 np0005626463.localdomain sudo[111064]: pam_unix(sudo:session): session closed for user root
Feb 23 09:02:29 np0005626463.localdomain sudo[111111]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 09:02:29 np0005626463.localdomain sudo[111111]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:02:29 np0005626463.localdomain sudo[111111]: pam_unix(sudo:session): session closed for user root
Feb 23 09:02:30 np0005626463.localdomain sshd[111126]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:02:30 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39510 DF PROTO=TCP SPT=35126 DPT=9100 SEQ=1814791593 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDBC1C60000000001030307) 
Feb 23 09:02:30 np0005626463.localdomain sshd[111126]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:02:32 np0005626463.localdomain podman[110955]: time="2026-02-23T09:02:32Z" level=warning msg="StopSignal SIGTERM failed to stop container nova_compute in 42 seconds, resorting to SIGKILL"
Feb 23 09:02:32 np0005626463.localdomain systemd[1]: session-c11.scope: Deactivated successfully.
Feb 23 09:02:32 np0005626463.localdomain systemd[1]: libpod-c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.scope: Deactivated successfully.
Feb 23 09:02:32 np0005626463.localdomain systemd[1]: libpod-c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.scope: Consumed 39.939s CPU time.
Feb 23 09:02:32 np0005626463.localdomain podman[110955]: 2026-02-23 09:02:32.067029038 +0000 UTC m=+42.109890836 container died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, batch=17.1_20260112.1, container_name=nova_compute, vcs-type=git, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2026-01-12T23:32:04Z, url=https://www.redhat.com, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step5, io.buildah.version=1.41.5, version=17.1.13, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 23 09:02:32 np0005626463.localdomain systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.timer: Deactivated successfully.
Feb 23 09:02:32 np0005626463.localdomain systemd[1]: Stopped /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.
Feb 23 09:02:32 np0005626463.localdomain systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Failed to open /run/systemd/transient/c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: No such file or directory
Feb 23 09:02:32 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-e9505ba141111701b3c5c0dc16acea1474e53d4b4405e45ed2eb48993537e49e-merged.mount: Deactivated successfully.
Feb 23 09:02:32 np0005626463.localdomain podman[110955]: 2026-02-23 09:02:32.137571653 +0000 UTC m=+42.180433371 container cleanup c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vcs-type=git, url=https://www.redhat.com, build-date=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, tcib_managed=true, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:32:04Z, architecture=x86_64, config_id=tripleo_step5, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, distribution-scope=public, name=rhosp-rhel9/openstack-nova-compute, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute)
Feb 23 09:02:32 np0005626463.localdomain podman[110955]: nova_compute
Feb 23 09:02:32 np0005626463.localdomain systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.timer: Failed to open /run/systemd/transient/c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.timer: No such file or directory
Feb 23 09:02:32 np0005626463.localdomain systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Failed to open /run/systemd/transient/c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: No such file or directory
Feb 23 09:02:32 np0005626463.localdomain podman[111129]: 2026-02-23 09:02:32.174534863 +0000 UTC m=+0.089215236 container cleanup c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, url=https://www.redhat.com, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vcs-type=git, name=rhosp-rhel9/openstack-nova-compute, architecture=x86_64, config_id=tripleo_step5, batch=17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, version=17.1.13, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc.)
Feb 23 09:02:32 np0005626463.localdomain systemd[1]: libpod-conmon-c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.scope: Deactivated successfully.
Feb 23 09:02:32 np0005626463.localdomain systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.timer: Failed to open /run/systemd/transient/c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.timer: No such file or directory
Feb 23 09:02:32 np0005626463.localdomain systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Failed to open /run/systemd/transient/c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: No such file or directory
Feb 23 09:02:32 np0005626463.localdomain podman[111144]: 2026-02-23 09:02:32.288430997 +0000 UTC m=+0.080350882 container cleanup c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, architecture=x86_64, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, vcs-type=git, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:32:04Z, io.openshift.expose-services=, tcib_managed=true, distribution-scope=public, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 
'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, release=1766032510, org.opencontainers.image.created=2026-01-12T23:32:04Z)
Feb 23 09:02:32 np0005626463.localdomain podman[111144]: nova_compute
Feb 23 09:02:32 np0005626463.localdomain systemd[1]: tripleo_nova_compute.service: Deactivated successfully.
Feb 23 09:02:32 np0005626463.localdomain systemd[1]: Stopped nova_compute container.
Feb 23 09:02:32 np0005626463.localdomain systemd[1]: tripleo_nova_compute.service: Consumed 1.184s CPU time, no IO.
Feb 23 09:02:32 np0005626463.localdomain sudo[110913]: pam_unix(sudo:session): session closed for user root
Feb 23 09:02:32 np0005626463.localdomain sudo[111246]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oxioyhwahwtrbduqdvzokchwolzjlaev ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837352.474678-108-74081270846167/AnsiballZ_systemd_service.py
Feb 23 09:02:32 np0005626463.localdomain sudo[111246]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:02:32 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39511 DF PROTO=TCP SPT=35126 DPT=9100 SEQ=1814791593 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDBC9E60000000001030307) 
Feb 23 09:02:33 np0005626463.localdomain python3.9[111248]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_migration_target.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 09:02:33 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 09:02:33 np0005626463.localdomain systemd-rc-local-generator[111273]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 09:02:33 np0005626463.localdomain systemd-sysv-generator[111280]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 09:02:33 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 09:02:33 np0005626463.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Feb 23 09:02:33 np0005626463.localdomain systemd[1]: Stopping nova_migration_target container...
Feb 23 09:02:33 np0005626463.localdomain recover_tripleo_nova_virtqemud[111291]: 61982
Feb 23 09:02:33 np0005626463.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Feb 23 09:02:33 np0005626463.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Feb 23 09:02:33 np0005626463.localdomain sshd[71013]: Received signal 15; terminating.
Feb 23 09:02:33 np0005626463.localdomain systemd[1]: libpod-0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.scope: Deactivated successfully.
Feb 23 09:02:33 np0005626463.localdomain systemd[1]: libpod-0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.scope: Consumed 38.936s CPU time.
Feb 23 09:02:33 np0005626463.localdomain podman[111290]: 2026-02-23 09:02:33.666493505 +0000 UTC m=+0.081848127 container died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, version=17.1.13, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, release=1766032510, org.opencontainers.image.created=2026-01-12T23:32:04Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, name=rhosp-rhel9/openstack-nova-compute, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, build-date=2026-01-12T23:32:04Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe)
Feb 23 09:02:33 np0005626463.localdomain systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.timer: Deactivated successfully.
Feb 23 09:02:33 np0005626463.localdomain systemd[1]: Stopped /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.
Feb 23 09:02:33 np0005626463.localdomain systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Failed to open /run/systemd/transient/0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: No such file or directory
Feb 23 09:02:33 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0-userdata-shm.mount: Deactivated successfully.
Feb 23 09:02:33 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-4752b195a0319d00ad3a4bd86f4312afcec268e914950a9934c95f1e8044f1fa-merged.mount: Deactivated successfully.
Feb 23 09:02:33 np0005626463.localdomain podman[111290]: 2026-02-23 09:02:33.713187978 +0000 UTC m=+0.128542530 container cleanup 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, build-date=2026-01-12T23:32:04Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, version=17.1.13, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, com.redhat.component=openstack-nova-compute-container, vcs-type=git, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, name=rhosp-rhel9/openstack-nova-compute, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=)
Feb 23 09:02:33 np0005626463.localdomain podman[111290]: nova_migration_target
Feb 23 09:02:33 np0005626463.localdomain systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.timer: Failed to open /run/systemd/transient/0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.timer: No such file or directory
Feb 23 09:02:33 np0005626463.localdomain systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Failed to open /run/systemd/transient/0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: No such file or directory
Feb 23 09:02:33 np0005626463.localdomain podman[111303]: 2026-02-23 09:02:33.753399029 +0000 UTC m=+0.073442036 container cleanup 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, container_name=nova_migration_target, version=17.1.13, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, vcs-type=git, tcib_managed=true, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:32:04Z, release=1766032510, name=rhosp-rhel9/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, build-date=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 23 09:02:33 np0005626463.localdomain systemd[1]: libpod-conmon-0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.scope: Deactivated successfully.
Feb 23 09:02:33 np0005626463.localdomain systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.timer: Failed to open /run/systemd/transient/0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.timer: No such file or directory
Feb 23 09:02:33 np0005626463.localdomain systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Failed to open /run/systemd/transient/0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: No such file or directory
Feb 23 09:02:33 np0005626463.localdomain podman[111319]: 2026-02-23 09:02:33.862204945 +0000 UTC m=+0.077746620 container cleanup 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:32:04Z, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.5, 
konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, architecture=x86_64, io.openshift.expose-services=, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, name=rhosp-rhel9/openstack-nova-compute, release=1766032510, container_name=nova_migration_target)
Feb 23 09:02:33 np0005626463.localdomain podman[111319]: nova_migration_target
Feb 23 09:02:33 np0005626463.localdomain systemd[1]: tripleo_nova_migration_target.service: Deactivated successfully.
Feb 23 09:02:33 np0005626463.localdomain systemd[1]: Stopped nova_migration_target container.
Feb 23 09:02:33 np0005626463.localdomain sudo[111246]: pam_unix(sudo:session): session closed for user root
Feb 23 09:02:34 np0005626463.localdomain sudo[111421]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mwlyejmaidluvrvbjvugiqmuomayejri ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837354.0547438-108-171484988958519/AnsiballZ_systemd_service.py
Feb 23 09:02:34 np0005626463.localdomain sudo[111421]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:02:34 np0005626463.localdomain python3.9[111423]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtlogd_wrapper.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 09:02:34 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 09:02:34 np0005626463.localdomain systemd-sysv-generator[111454]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 09:02:34 np0005626463.localdomain systemd-rc-local-generator[111448]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 09:02:34 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 09:02:35 np0005626463.localdomain systemd[1]: Stopping nova_virtlogd_wrapper container...
Feb 23 09:02:35 np0005626463.localdomain systemd[1]: libpod-215843e184902f90be3eb81ce66aa24806af3685bfe1167fb47a6baf7e2cdc6b.scope: Deactivated successfully.
Feb 23 09:02:35 np0005626463.localdomain podman[111464]: 2026-02-23 09:02:35.146176425 +0000 UTC m=+0.064858109 container died 215843e184902f90be3eb81ce66aa24806af3685bfe1167fb47a6baf7e2cdc6b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd_wrapper, url=https://www.redhat.com, config_id=tripleo_step3, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, vcs-type=git, org.opencontainers.image.created=2026-01-12T23:31:49Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-nova-libvirt, build-date=2026-01-12T23:31:49Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, container_name=nova_virtlogd_wrapper, description=Red Hat OpenStack Platform 17.1 nova-libvirt, distribution-scope=public, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-libvirt-container, io.openshift.expose-services=, architecture=x86_64, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']}, tcib_managed=true, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 23 09:02:35 np0005626463.localdomain systemd[1]: tmp-crun.h6pGis.mount: Deactivated successfully.
Feb 23 09:02:35 np0005626463.localdomain podman[111464]: 2026-02-23 09:02:35.194470857 +0000 UTC m=+0.113152131 container cleanup 215843e184902f90be3eb81ce66aa24806af3685bfe1167fb47a6baf7e2cdc6b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd_wrapper, build-date=2026-01-12T23:31:49Z, com.redhat.component=openstack-nova-libvirt-container, url=https://www.redhat.com, managed_by=tripleo_ansible, io.buildah.version=1.41.5, vcs-type=git, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']}, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-libvirt, release=1766032510, container_name=nova_virtlogd_wrapper, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:31:49Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-nova-libvirt, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, distribution-scope=public, config_id=tripleo_step3)
Feb 23 09:02:35 np0005626463.localdomain podman[111464]: nova_virtlogd_wrapper
Feb 23 09:02:35 np0005626463.localdomain podman[111477]: 2026-02-23 09:02:35.226113733 +0000 UTC m=+0.071469176 container cleanup 215843e184902f90be3eb81ce66aa24806af3685bfe1167fb47a6baf7e2cdc6b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd_wrapper, batch=17.1_20260112.1, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, distribution-scope=public, url=https://www.redhat.com, name=rhosp-rhel9/openstack-nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, config_id=tripleo_step3, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:31:49Z, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, build-date=2026-01-12T23:31:49Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']}, com.redhat.component=openstack-nova-libvirt-container, vcs-type=git, io.buildah.version=1.41.5, release=1766032510, container_name=nova_virtlogd_wrapper, version=17.1.13)
Feb 23 09:02:36 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-298dde95429fec5a28a160c47b2187ae70f7a465a9ef6c2faaa9c2f451a444ab-merged.mount: Deactivated successfully.
Feb 23 09:02:36 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-215843e184902f90be3eb81ce66aa24806af3685bfe1167fb47a6baf7e2cdc6b-userdata-shm.mount: Deactivated successfully.
Feb 23 09:02:36 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53887 DF PROTO=TCP SPT=35042 DPT=9101 SEQ=4242459794 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDBD6C60000000001030307) 
Feb 23 09:02:39 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43523 DF PROTO=TCP SPT=49306 DPT=9882 SEQ=474278600 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDBE3130000000001030307) 
Feb 23 09:02:41 np0005626463.localdomain sshd[111495]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:02:42 np0005626463.localdomain systemd[1]: Stopping User Manager for UID 0...
Feb 23 09:02:42 np0005626463.localdomain systemd[83969]: Activating special unit Exit the Session...
Feb 23 09:02:42 np0005626463.localdomain systemd[83969]: Removed slice User Background Tasks Slice.
Feb 23 09:02:42 np0005626463.localdomain systemd[83969]: Stopped target Main User Target.
Feb 23 09:02:42 np0005626463.localdomain systemd[83969]: Stopped target Basic System.
Feb 23 09:02:42 np0005626463.localdomain systemd[83969]: Stopped target Paths.
Feb 23 09:02:42 np0005626463.localdomain systemd[83969]: Stopped target Sockets.
Feb 23 09:02:42 np0005626463.localdomain systemd[83969]: Stopped target Timers.
Feb 23 09:02:42 np0005626463.localdomain systemd[83969]: Stopped Daily Cleanup of User's Temporary Directories.
Feb 23 09:02:42 np0005626463.localdomain systemd[83969]: Closed D-Bus User Message Bus Socket.
Feb 23 09:02:42 np0005626463.localdomain systemd[83969]: Stopped Create User's Volatile Files and Directories.
Feb 23 09:02:42 np0005626463.localdomain systemd[83969]: Removed slice User Application Slice.
Feb 23 09:02:42 np0005626463.localdomain systemd[83969]: Reached target Shutdown.
Feb 23 09:02:42 np0005626463.localdomain systemd[83969]: Finished Exit the Session.
Feb 23 09:02:42 np0005626463.localdomain systemd[83969]: Reached target Exit the Session.
Feb 23 09:02:42 np0005626463.localdomain systemd[1]: user@0.service: Deactivated successfully.
Feb 23 09:02:42 np0005626463.localdomain systemd[1]: Stopped User Manager for UID 0.
Feb 23 09:02:42 np0005626463.localdomain systemd[1]: user@0.service: Consumed 4.286s CPU time, no IO.
Feb 23 09:02:42 np0005626463.localdomain systemd[1]: Stopping User Runtime Directory /run/user/0...
Feb 23 09:02:42 np0005626463.localdomain systemd[1]: run-user-0.mount: Deactivated successfully.
Feb 23 09:02:42 np0005626463.localdomain systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Feb 23 09:02:42 np0005626463.localdomain systemd[1]: Stopped User Runtime Directory /run/user/0.
Feb 23 09:02:42 np0005626463.localdomain systemd[1]: Removed slice User Slice of UID 0.
Feb 23 09:02:42 np0005626463.localdomain systemd[1]: user-0.slice: Consumed 5.290s CPU time.
Feb 23 09:02:42 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43525 DF PROTO=TCP SPT=49306 DPT=9882 SEQ=474278600 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDBEF060000000001030307) 
Feb 23 09:02:42 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.
Feb 23 09:02:42 np0005626463.localdomain sshd[111495]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:02:42 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.
Feb 23 09:02:42 np0005626463.localdomain systemd[1]: tmp-crun.asB6Pc.mount: Deactivated successfully.
Feb 23 09:02:42 np0005626463.localdomain podman[111498]: 2026-02-23 09:02:42.958219546 +0000 UTC m=+0.108008414 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, architecture=x86_64, url=https://www.redhat.com, release=1766032510, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, tcib_managed=true, vcs-type=git, build-date=2026-01-12T22:56:19Z, batch=17.1_20260112.1, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', 
'/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:56:19Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, io.buildah.version=1.41.5, version=17.1.13, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent)
Feb 23 09:02:42 np0005626463.localdomain podman[111498]: 2026-02-23 09:02:42.980146393 +0000 UTC m=+0.129935291 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vendor=Red Hat, Inc., vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, batch=17.1_20260112.1, 
org.opencontainers.image.created=2026-01-12T22:56:19Z, version=17.1.13, config_id=tripleo_step4, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, build-date=2026-01-12T22:56:19Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, tcib_managed=true, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, release=1766032510, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible)
Feb 23 09:02:42 np0005626463.localdomain podman[111498]: unhealthy
Feb 23 09:02:42 np0005626463.localdomain systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Main process exited, code=exited, status=1/FAILURE
Feb 23 09:02:42 np0005626463.localdomain systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Failed with result 'exit-code'.
Feb 23 09:02:43 np0005626463.localdomain systemd[1]: tmp-crun.IJXuTJ.mount: Deactivated successfully.
Feb 23 09:02:43 np0005626463.localdomain podman[111516]: 2026-02-23 09:02:43.058166553 +0000 UTC m=+0.094228014 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, build-date=2026-01-12T22:36:40Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, batch=17.1_20260112.1, version=17.1.13, managed_by=tripleo_ansible, release=1766032510, vcs-type=git, vendor=Red Hat, Inc., url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:36:40Z, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-ovn-controller, container_name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': 
['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 23 09:02:43 np0005626463.localdomain podman[111516]: 2026-02-23 09:02:43.077188391 +0000 UTC m=+0.113249882 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-ovn-controller-container, summary=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:36:40Z, name=rhosp-rhel9/openstack-ovn-controller, config_id=tripleo_step4, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, build-date=2026-01-12T22:36:40Z, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, batch=17.1_20260112.1, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, 
url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, architecture=x86_64, vcs-type=git, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 23 09:02:43 np0005626463.localdomain podman[111516]: unhealthy
Feb 23 09:02:43 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Main process exited, code=exited, status=1/FAILURE
Feb 23 09:02:43 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Failed with result 'exit-code'.
Feb 23 09:02:45 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39513 DF PROTO=TCP SPT=35126 DPT=9100 SEQ=1814791593 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDBFA060000000001030307) 
Feb 23 09:02:48 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53889 DF PROTO=TCP SPT=35042 DPT=9101 SEQ=4242459794 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDC06060000000001030307) 
Feb 23 09:02:54 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7561 DF PROTO=TCP SPT=53886 DPT=9102 SEQ=1962421276 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDC1C900000000001030307) 
Feb 23 09:02:54 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43527 DF PROTO=TCP SPT=49306 DPT=9882 SEQ=474278600 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDC20060000000001030307) 
Feb 23 09:02:57 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7563 DF PROTO=TCP SPT=53886 DPT=9102 SEQ=1962421276 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDC28860000000001030307) 
Feb 23 09:03:00 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46947 DF PROTO=TCP SPT=37874 DPT=9100 SEQ=2996360404 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDC37060000000001030307) 
Feb 23 09:03:02 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46948 DF PROTO=TCP SPT=37874 DPT=9100 SEQ=2996360404 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDC3F060000000001030307) 
Feb 23 09:03:06 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43209 DF PROTO=TCP SPT=45530 DPT=9101 SEQ=900839925 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDC4C060000000001030307) 
Feb 23 09:03:09 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7565 DF PROTO=TCP SPT=53886 DPT=9102 SEQ=1962421276 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDC58060000000001030307) 
Feb 23 09:03:10 np0005626463.localdomain sshd[111539]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:03:11 np0005626463.localdomain sshd[111539]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:03:12 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45126 DF PROTO=TCP SPT=46666 DPT=9882 SEQ=3373547326 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDC64470000000001030307) 
Feb 23 09:03:13 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.
Feb 23 09:03:13 np0005626463.localdomain podman[111541]: 2026-02-23 09:03:13.156435408 +0000 UTC m=+0.081655943 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, build-date=2026-01-12T22:56:19Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, container_name=ovn_metadata_agent, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, config_id=tripleo_step4, io.buildah.version=1.41.5, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., release=1766032510, vcs-type=git, version=17.1.13, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, batch=17.1_20260112.1, distribution-scope=public, maintainer=OpenStack TripleO Team)
Feb 23 09:03:13 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.
Feb 23 09:03:13 np0005626463.localdomain podman[111541]: 2026-02-23 09:03:13.176351014 +0000 UTC m=+0.101571589 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, version=17.1.13, distribution-scope=public, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.buildah.version=1.41.5, release=1766032510, config_id=tripleo_step4, container_name=ovn_metadata_agent, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, batch=17.1_20260112.1, url=https://www.redhat.com)
Feb 23 09:03:13 np0005626463.localdomain podman[111541]: unhealthy
Feb 23 09:03:13 np0005626463.localdomain systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Main process exited, code=exited, status=1/FAILURE
Feb 23 09:03:13 np0005626463.localdomain systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Failed with result 'exit-code'.
Feb 23 09:03:13 np0005626463.localdomain podman[111560]: 2026-02-23 09:03:13.253047125 +0000 UTC m=+0.084446668 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:36:40Z, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, build-date=2026-01-12T22:36:40Z, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_id=tripleo_step4, batch=17.1_20260112.1, vendor=Red Hat, Inc., tcib_managed=true, architecture=x86_64, vcs-type=git, container_name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-ovn-controller, release=1766032510, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, com.redhat.component=openstack-ovn-controller-container, io.openshift.expose-services=)
Feb 23 09:03:13 np0005626463.localdomain podman[111560]: 2026-02-23 09:03:13.294267038 +0000 UTC m=+0.125666571 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, batch=17.1_20260112.1, io.openshift.expose-services=, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:36:40Z, name=rhosp-rhel9/openstack-ovn-controller, release=1766032510, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, url=https://www.redhat.com, build-date=2026-01-12T22:36:40Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-ovn-controller-container, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller)
Feb 23 09:03:13 np0005626463.localdomain podman[111560]: unhealthy
Feb 23 09:03:13 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Main process exited, code=exited, status=1/FAILURE
Feb 23 09:03:13 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Failed with result 'exit-code'.
Feb 23 09:03:14 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29707 DF PROTO=TCP SPT=44550 DPT=9105 SEQ=3695610875 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDC6E060000000001030307) 
Feb 23 09:03:18 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43211 DF PROTO=TCP SPT=45530 DPT=9101 SEQ=900839925 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDC7C070000000001030307) 
Feb 23 09:03:24 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39298 DF PROTO=TCP SPT=36766 DPT=9102 SEQ=969065005 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDC91C00000000001030307) 
Feb 23 09:03:24 np0005626463.localdomain sshd[111582]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:03:24 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45128 DF PROTO=TCP SPT=46666 DPT=9882 SEQ=3373547326 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDC94070000000001030307) 
Feb 23 09:03:25 np0005626463.localdomain sshd[111582]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:03:27 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39300 DF PROTO=TCP SPT=36766 DPT=9102 SEQ=969065005 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDC9DC60000000001030307) 
Feb 23 09:03:29 np0005626463.localdomain sudo[111584]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 09:03:29 np0005626463.localdomain sudo[111584]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:03:29 np0005626463.localdomain sudo[111584]: pam_unix(sudo:session): session closed for user root
Feb 23 09:03:29 np0005626463.localdomain sudo[111599]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Feb 23 09:03:29 np0005626463.localdomain sudo[111599]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:03:30 np0005626463.localdomain podman[111687]: 2026-02-23 09:03:30.444127607 +0000 UTC m=+0.096610537 container exec fdf07215f0388d0ebc44f1f3744080ba594441e647c300d0dade62ff5beba234 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-crash-np0005626463, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, release=1770267347, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, build-date=2026-02-09T10:25:24Z, vcs-type=git, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, version=7, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, GIT_CLEAN=True, org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, description=Red Hat Ceph Storage 7, io.buildah.version=1.42.2, CEPH_POINT_RELEASE=, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Feb 23 09:03:30 np0005626463.localdomain podman[111687]: 2026-02-23 09:03:30.521159358 +0000 UTC m=+0.173642338 container exec_died fdf07215f0388d0ebc44f1f3744080ba594441e647c300d0dade62ff5beba234 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-crash-np0005626463, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, GIT_CLEAN=True, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, GIT_BRANCH=main, build-date=2026-02-09T10:25:24Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.buildah.version=1.42.2, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, vendor=Red Hat, Inc., io.openshift.expose-services=, name=rhceph, description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, version=7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, release=1770267347, RELEASE=main, vcs-type=git, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 23 09:03:30 np0005626463.localdomain sudo[111599]: pam_unix(sudo:session): session closed for user root
Feb 23 09:03:32 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14039 DF PROTO=TCP SPT=60982 DPT=9100 SEQ=1058392723 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDCAC460000000001030307) 
Feb 23 09:03:32 np0005626463.localdomain sudo[111753]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 09:03:32 np0005626463.localdomain sudo[111753]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:03:32 np0005626463.localdomain sudo[111753]: pam_unix(sudo:session): session closed for user root
Feb 23 09:03:32 np0005626463.localdomain sudo[111768]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 23 09:03:32 np0005626463.localdomain sudo[111768]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:03:32 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14040 DF PROTO=TCP SPT=60982 DPT=9100 SEQ=1058392723 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDCB4460000000001030307) 
Feb 23 09:03:33 np0005626463.localdomain sudo[111768]: pam_unix(sudo:session): session closed for user root
Feb 23 09:03:34 np0005626463.localdomain sudo[111814]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 09:03:34 np0005626463.localdomain sudo[111814]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:03:34 np0005626463.localdomain sudo[111814]: pam_unix(sudo:session): session closed for user root
Feb 23 09:03:36 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21755 DF PROTO=TCP SPT=47746 DPT=9101 SEQ=3930008034 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDCC1060000000001030307) 
Feb 23 09:03:39 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36762 DF PROTO=TCP SPT=53008 DPT=9882 SEQ=2756459503 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDCCD730000000001030307) 
Feb 23 09:03:42 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36764 DF PROTO=TCP SPT=53008 DPT=9882 SEQ=2756459503 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDCD9860000000001030307) 
Feb 23 09:03:43 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.
Feb 23 09:03:43 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.
Feb 23 09:03:43 np0005626463.localdomain podman[111830]: 2026-02-23 09:03:43.4161829 +0000 UTC m=+0.085122787 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2026-01-12T22:36:40Z, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, architecture=x86_64, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, distribution-scope=public, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:36:40Z, version=17.1.13, description=Red Hat OpenStack Platform 
17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, container_name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, config_id=tripleo_step4)
Feb 23 09:03:43 np0005626463.localdomain podman[111830]: 2026-02-23 09:03:43.431207397 +0000 UTC m=+0.100147314 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, managed_by=tripleo_ansible, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-ovn-controller-container, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:36:40Z, build-date=2026-01-12T22:36:40Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, io.openshift.expose-services=, version=17.1.13, name=rhosp-rhel9/openstack-ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, release=1766032510, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, io.buildah.version=1.41.5, 
url=https://www.redhat.com, config_id=tripleo_step4, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller)
Feb 23 09:03:43 np0005626463.localdomain podman[111830]: unhealthy
Feb 23 09:03:43 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Main process exited, code=exited, status=1/FAILURE
Feb 23 09:03:43 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Failed with result 'exit-code'.
Feb 23 09:03:43 np0005626463.localdomain podman[111829]: 2026-02-23 09:03:43.515627223 +0000 UTC m=+0.187340295 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, io.buildah.version=1.41.5, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, io.openshift.expose-services=, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1766032510, org.opencontainers.image.created=2026-01-12T22:56:19Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, batch=17.1_20260112.1, distribution-scope=public, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, url=https://www.redhat.com, config_id=tripleo_step4, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13)
Feb 23 09:03:43 np0005626463.localdomain podman[111829]: 2026-02-23 09:03:43.533019741 +0000 UTC m=+0.204732783 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, io.openshift.expose-services=, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:56:19Z, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, release=1766032510, architecture=x86_64, version=17.1.13, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5)
Feb 23 09:03:43 np0005626463.localdomain podman[111829]: unhealthy
Feb 23 09:03:43 np0005626463.localdomain systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Main process exited, code=exited, status=1/FAILURE
Feb 23 09:03:43 np0005626463.localdomain systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Failed with result 'exit-code'.
Feb 23 09:03:45 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3698 DF PROTO=TCP SPT=38786 DPT=9105 SEQ=4223480551 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDCE4060000000001030307) 
Feb 23 09:03:46 np0005626463.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Feb 23 09:03:46 np0005626463.localdomain recover_tripleo_nova_virtqemud[111869]: 61982
Feb 23 09:03:46 np0005626463.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Feb 23 09:03:46 np0005626463.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Feb 23 09:03:48 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21757 DF PROTO=TCP SPT=47746 DPT=9101 SEQ=3930008034 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDCF2070000000001030307) 
Feb 23 09:03:51 np0005626463.localdomain sshd[111870]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:03:52 np0005626463.localdomain sshd[111870]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:03:54 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17609 DF PROTO=TCP SPT=50946 DPT=9102 SEQ=2500752756 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDD06F00000000001030307) 
Feb 23 09:03:54 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36766 DF PROTO=TCP SPT=53008 DPT=9882 SEQ=2756459503 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDD0A060000000001030307) 
Feb 23 09:03:57 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17611 DF PROTO=TCP SPT=50946 DPT=9102 SEQ=2500752756 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDD13060000000001030307) 
Feb 23 09:03:59 np0005626463.localdomain systemd[1]: tripleo_nova_virtlogd_wrapper.service: State 'stop-sigterm' timed out. Killing.
Feb 23 09:03:59 np0005626463.localdomain systemd[1]: tripleo_nova_virtlogd_wrapper.service: Killing process 61197 (conmon) with signal SIGKILL.
Feb 23 09:03:59 np0005626463.localdomain systemd[1]: tripleo_nova_virtlogd_wrapper.service: Main process exited, code=killed, status=9/KILL
Feb 23 09:03:59 np0005626463.localdomain systemd[1]: libpod-conmon-215843e184902f90be3eb81ce66aa24806af3685bfe1167fb47a6baf7e2cdc6b.scope: Deactivated successfully.
Feb 23 09:03:59 np0005626463.localdomain podman[111883]: error opening file `/run/crun/215843e184902f90be3eb81ce66aa24806af3685bfe1167fb47a6baf7e2cdc6b/status`: No such file or directory
Feb 23 09:03:59 np0005626463.localdomain podman[111872]: 2026-02-23 09:03:59.403571792 +0000 UTC m=+0.074187355 container cleanup 215843e184902f90be3eb81ce66aa24806af3685bfe1167fb47a6baf7e2cdc6b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd_wrapper, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 nova-libvirt, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']}, name=rhosp-rhel9/openstack-nova-libvirt, vendor=Red Hat, Inc., vcs-type=git, io.openshift.expose-services=, batch=17.1_20260112.1, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.13, release=1766032510, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.created=2026-01-12T23:31:49Z, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, architecture=x86_64, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2026-01-12T23:31:49Z, container_name=nova_virtlogd_wrapper, com.redhat.component=openstack-nova-libvirt-container, managed_by=tripleo_ansible, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe)
Feb 23 09:03:59 np0005626463.localdomain podman[111872]: nova_virtlogd_wrapper
Feb 23 09:03:59 np0005626463.localdomain systemd[1]: tripleo_nova_virtlogd_wrapper.service: Failed with result 'timeout'.
Feb 23 09:03:59 np0005626463.localdomain systemd[1]: Stopped nova_virtlogd_wrapper container.
Feb 23 09:03:59 np0005626463.localdomain sudo[111421]: pam_unix(sudo:session): session closed for user root
Feb 23 09:03:59 np0005626463.localdomain sudo[111974]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-krkbalhginomwnjnepaqlqiwurkxxhby ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837439.566548-108-29599340761515/AnsiballZ_systemd_service.py
Feb 23 09:03:59 np0005626463.localdomain sudo[111974]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:04:00 np0005626463.localdomain python3.9[111976]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtnodedevd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 09:04:00 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 09:04:00 np0005626463.localdomain systemd-rc-local-generator[112004]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 09:04:00 np0005626463.localdomain systemd-sysv-generator[112009]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 09:04:00 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 09:04:00 np0005626463.localdomain systemd[1]: Stopping nova_virtnodedevd container...
Feb 23 09:04:00 np0005626463.localdomain systemd[1]: libpod-930c6611f5962d33efb440404474ddbbb597649f3d0829a0c1f4e90bb33c2ea2.scope: Deactivated successfully.
Feb 23 09:04:00 np0005626463.localdomain systemd[1]: libpod-930c6611f5962d33efb440404474ddbbb597649f3d0829a0c1f4e90bb33c2ea2.scope: Consumed 1.792s CPU time.
Feb 23 09:04:00 np0005626463.localdomain podman[112017]: 2026-02-23 09:04:00.678908281 +0000 UTC m=+0.103532478 container died 930c6611f5962d33efb440404474ddbbb597649f3d0829a0c1f4e90bb33c2ea2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtnodedevd, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-nova-libvirt, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, com.redhat.component=openstack-nova-libvirt-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, container_name=nova_virtnodedevd, build-date=2026-01-12T23:31:49Z, distribution-scope=public, version=17.1.13, io.buildah.version=1.41.5, vendor=Red Hat, Inc., config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 2, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro']}, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:31:49Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt)
Feb 23 09:04:00 np0005626463.localdomain podman[112017]: 2026-02-23 09:04:00.715939907 +0000 UTC m=+0.140564044 container cleanup 930c6611f5962d33efb440404474ddbbb597649f3d0829a0c1f4e90bb33c2ea2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtnodedevd, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp-rhel9/openstack-nova-libvirt, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, architecture=x86_64, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:31:49Z, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, distribution-scope=public, release=1766032510, io.buildah.version=1.41.5, version=17.1.13, batch=17.1_20260112.1, url=https://www.redhat.com, com.redhat.component=openstack-nova-libvirt-container, description=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 2, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., container_name=nova_virtnodedevd, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:31:49Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe)
Feb 23 09:04:00 np0005626463.localdomain podman[112017]: nova_virtnodedevd
Feb 23 09:04:00 np0005626463.localdomain podman[112031]: 2026-02-23 09:04:00.774857428 +0000 UTC m=+0.079409345 container cleanup 930c6611f5962d33efb440404474ddbbb597649f3d0829a0c1f4e90bb33c2ea2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtnodedevd, architecture=x86_64, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, org.opencontainers.image.created=2026-01-12T23:31:49Z, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, description=Red Hat OpenStack Platform 17.1 nova-libvirt, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step3, container_name=nova_virtnodedevd, name=rhosp-rhel9/openstack-nova-libvirt, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 2, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro']}, batch=17.1_20260112.1, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., build-date=2026-01-12T23:31:49Z, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, tcib_managed=true, com.redhat.component=openstack-nova-libvirt-container, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe)
Feb 23 09:04:00 np0005626463.localdomain systemd[1]: libpod-conmon-930c6611f5962d33efb440404474ddbbb597649f3d0829a0c1f4e90bb33c2ea2.scope: Deactivated successfully.
Feb 23 09:04:00 np0005626463.localdomain podman[112059]: error opening file `/run/crun/930c6611f5962d33efb440404474ddbbb597649f3d0829a0c1f4e90bb33c2ea2/status`: No such file or directory
Feb 23 09:04:00 np0005626463.localdomain podman[112048]: 2026-02-23 09:04:00.888598414 +0000 UTC m=+0.074566417 container cleanup 930c6611f5962d33efb440404474ddbbb597649f3d0829a0c1f4e90bb33c2ea2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtnodedevd, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2026-01-12T23:31:49Z, config_id=tripleo_step3, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-nova-libvirt, container_name=nova_virtnodedevd, io.openshift.expose-services=, distribution-scope=public, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, release=1766032510, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 2, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro']}, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, org.opencontainers.image.created=2026-01-12T23:31:49Z, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, managed_by=tripleo_ansible)
Feb 23 09:04:00 np0005626463.localdomain podman[112048]: nova_virtnodedevd
Feb 23 09:04:00 np0005626463.localdomain systemd[1]: tripleo_nova_virtnodedevd.service: Deactivated successfully.
Feb 23 09:04:00 np0005626463.localdomain systemd[1]: Stopped nova_virtnodedevd container.
Feb 23 09:04:00 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62767 DF PROTO=TCP SPT=35560 DPT=9100 SEQ=267781514 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDD21870000000001030307) 
Feb 23 09:04:00 np0005626463.localdomain sudo[111974]: pam_unix(sudo:session): session closed for user root
Feb 23 09:04:01 np0005626463.localdomain sudo[112150]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qznnmfyrdkukkjzgcfcvnseifqhryzsm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837441.0578556-108-31902362225147/AnsiballZ_systemd_service.py
Feb 23 09:04:01 np0005626463.localdomain sudo[112150]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:04:01 np0005626463.localdomain python3.9[112152]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtproxyd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 09:04:01 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-49d558d6cd3227e44d6cf362abc8b50d968f0fb79b74496dd7e1499d728668a7-merged.mount: Deactivated successfully.
Feb 23 09:04:01 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-930c6611f5962d33efb440404474ddbbb597649f3d0829a0c1f4e90bb33c2ea2-userdata-shm.mount: Deactivated successfully.
Feb 23 09:04:01 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 09:04:01 np0005626463.localdomain systemd-rc-local-generator[112178]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 09:04:01 np0005626463.localdomain systemd-sysv-generator[112183]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 09:04:01 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 09:04:02 np0005626463.localdomain systemd[1]: Stopping nova_virtproxyd container...
Feb 23 09:04:02 np0005626463.localdomain systemd[1]: tmp-crun.OJD2jf.mount: Deactivated successfully.
Feb 23 09:04:02 np0005626463.localdomain systemd[1]: libpod-33b24c0b43815062cb55aef2cae21d9e0731c73ede479175c35b6a75b4421d9a.scope: Deactivated successfully.
Feb 23 09:04:02 np0005626463.localdomain podman[112193]: 2026-02-23 09:04:02.159946181 +0000 UTC m=+0.082770167 container died 33b24c0b43815062cb55aef2cae21d9e0731c73ede479175c35b6a75b4421d9a (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtproxyd, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, container_name=nova_virtproxyd, distribution-scope=public, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:31:49Z, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-libvirt, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 5, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro']}, config_id=tripleo_step3, name=rhosp-rhel9/openstack-nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.created=2026-01-12T23:31:49Z, vendor=Red Hat, Inc., io.openshift.expose-services=, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, architecture=x86_64, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, url=https://www.redhat.com, com.redhat.component=openstack-nova-libvirt-container)
Feb 23 09:04:02 np0005626463.localdomain podman[112193]: 2026-02-23 09:04:02.195573264 +0000 UTC m=+0.118397230 container cleanup 33b24c0b43815062cb55aef2cae21d9e0731c73ede479175c35b6a75b4421d9a (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtproxyd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, version=17.1.13, org.opencontainers.image.created=2026-01-12T23:31:49Z, tcib_managed=true, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step3, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-nova-libvirt-container, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 5, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro']}, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:31:49Z, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, container_name=nova_virtproxyd, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-nova-libvirt, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-libvirt)
Feb 23 09:04:02 np0005626463.localdomain podman[112193]: nova_virtproxyd
Feb 23 09:04:02 np0005626463.localdomain podman[112208]: 2026-02-23 09:04:02.247011737 +0000 UTC m=+0.073522945 container cleanup 33b24c0b43815062cb55aef2cae21d9e0731c73ede479175c35b6a75b4421d9a (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtproxyd, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20260112.1, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 5, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro']}, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-libvirt-container, name=rhosp-rhel9/openstack-nova-libvirt, distribution-scope=public, io.openshift.expose-services=, tcib_managed=true, vcs-type=git, vendor=Red Hat, Inc., url=https://www.redhat.com, config_id=tripleo_step3, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.buildah.version=1.41.5, build-date=2026-01-12T23:31:49Z, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:31:49Z, container_name=nova_virtproxyd)
Feb 23 09:04:02 np0005626463.localdomain systemd[1]: libpod-conmon-33b24c0b43815062cb55aef2cae21d9e0731c73ede479175c35b6a75b4421d9a.scope: Deactivated successfully.
Feb 23 09:04:02 np0005626463.localdomain podman[112238]: error opening file `/run/crun/33b24c0b43815062cb55aef2cae21d9e0731c73ede479175c35b6a75b4421d9a/status`: No such file or directory
Feb 23 09:04:02 np0005626463.localdomain podman[112226]: 2026-02-23 09:04:02.355824224 +0000 UTC m=+0.073340379 container cleanup 33b24c0b43815062cb55aef2cae21d9e0731c73ede479175c35b6a75b4421d9a (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtproxyd, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-libvirt-container, version=17.1.13, container_name=nova_virtproxyd, distribution-scope=public, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 5, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', 
'/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro']}, org.opencontainers.image.created=2026-01-12T23:31:49Z, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-nova-libvirt, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-libvirt, config_id=tripleo_step3, build-date=2026-01-12T23:31:49Z, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt)
Feb 23 09:04:02 np0005626463.localdomain podman[112226]: nova_virtproxyd
Feb 23 09:04:02 np0005626463.localdomain systemd[1]: tripleo_nova_virtproxyd.service: Deactivated successfully.
Feb 23 09:04:02 np0005626463.localdomain systemd[1]: Stopped nova_virtproxyd container.
Feb 23 09:04:02 np0005626463.localdomain sudo[112150]: pam_unix(sudo:session): session closed for user root
Feb 23 09:04:02 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-2be3d0bba76fb52fbeba06c336dea0a1698df79193676f245ce702f60a0a9fa3-merged.mount: Deactivated successfully.
Feb 23 09:04:02 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-33b24c0b43815062cb55aef2cae21d9e0731c73ede479175c35b6a75b4421d9a-userdata-shm.mount: Deactivated successfully.
Feb 23 09:04:02 np0005626463.localdomain sudo[112329]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fhtxupufouhhormniibavlhhhqarcqil ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837442.5323613-108-240830097474300/AnsiballZ_systemd_service.py
Feb 23 09:04:02 np0005626463.localdomain sudo[112329]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:04:02 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62768 DF PROTO=TCP SPT=35560 DPT=9100 SEQ=267781514 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDD29860000000001030307) 
Feb 23 09:04:03 np0005626463.localdomain python3.9[112331]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtqemud.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 09:04:03 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 09:04:03 np0005626463.localdomain systemd-rc-local-generator[112356]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 09:04:03 np0005626463.localdomain systemd-sysv-generator[112361]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 09:04:03 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 09:04:03 np0005626463.localdomain systemd[1]: tripleo_nova_virtqemud_recover.timer: Deactivated successfully.
Feb 23 09:04:03 np0005626463.localdomain systemd[1]: Stopped Check and recover tripleo_nova_virtqemud every 10m.
Feb 23 09:04:03 np0005626463.localdomain systemd[1]: Stopping nova_virtqemud container...
Feb 23 09:04:03 np0005626463.localdomain systemd[1]: libpod-ac0f39392edf3bb25f5f81a87a3332c629d29b63674f9d0e6bbbde82fcbac468.scope: Deactivated successfully.
Feb 23 09:04:03 np0005626463.localdomain systemd[1]: libpod-ac0f39392edf3bb25f5f81a87a3332c629d29b63674f9d0e6bbbde82fcbac468.scope: Consumed 3.232s CPU time.
Feb 23 09:04:03 np0005626463.localdomain podman[112372]: 2026-02-23 09:04:03.678057238 +0000 UTC m=+0.080160967 container stop ac0f39392edf3bb25f5f81a87a3332c629d29b63674f9d0e6bbbde82fcbac468 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud, build-date=2026-01-12T23:31:49Z, container_name=nova_virtqemud, io.buildah.version=1.41.5, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', '/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']}, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp-rhel9/openstack-nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, tcib_managed=true, org.opencontainers.image.created=2026-01-12T23:31:49Z, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, version=17.1.13, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, release=1766032510, managed_by=tripleo_ansible, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-libvirt)
Feb 23 09:04:03 np0005626463.localdomain podman[112372]: 2026-02-23 09:04:03.709178484 +0000 UTC m=+0.111282183 container died ac0f39392edf3bb25f5f81a87a3332c629d29b63674f9d0e6bbbde82fcbac468 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud, build-date=2026-01-12T23:31:49Z, version=17.1.13, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', '/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']}, release=1766032510, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, org.opencontainers.image.created=2026-01-12T23:31:49Z, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, com.redhat.component=openstack-nova-libvirt-container, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, container_name=nova_virtqemud, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, config_id=tripleo_step3, managed_by=tripleo_ansible, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, vcs-type=git, distribution-scope=public)
Feb 23 09:04:03 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ac0f39392edf3bb25f5f81a87a3332c629d29b63674f9d0e6bbbde82fcbac468-userdata-shm.mount: Deactivated successfully.
Feb 23 09:04:03 np0005626463.localdomain podman[112372]: 2026-02-23 09:04:03.740189376 +0000 UTC m=+0.142293075 container cleanup ac0f39392edf3bb25f5f81a87a3332c629d29b63674f9d0e6bbbde82fcbac468 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud, release=1766032510, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:31:49Z, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, tcib_managed=true, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, config_id=tripleo_step3, container_name=nova_virtqemud, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', '/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-nova-libvirt, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., io.buildah.version=1.41.5, architecture=x86_64, build-date=2026-01-12T23:31:49Z, version=17.1.13)
Feb 23 09:04:03 np0005626463.localdomain podman[112372]: nova_virtqemud
Feb 23 09:04:03 np0005626463.localdomain podman[112386]: 2026-02-23 09:04:03.778328936 +0000 UTC m=+0.084063976 container cleanup ac0f39392edf3bb25f5f81a87a3332c629d29b63674f9d0e6bbbde82fcbac468 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud, io.openshift.expose-services=, distribution-scope=public, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', '/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., io.buildah.version=1.41.5, build-date=2026-01-12T23:31:49Z, version=17.1.13, batch=17.1_20260112.1, tcib_managed=true, container_name=nova_virtqemud, org.opencontainers.image.created=2026-01-12T23:31:49Z, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-libvirt-container, description=Red Hat OpenStack Platform 17.1 nova-libvirt, config_id=tripleo_step3, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-nova-libvirt, release=1766032510, managed_by=tripleo_ansible, vcs-type=git, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 23 09:04:04 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-7e96575c95037d7184ed74bc1e793aa507f3bf187550011550ade6ddf34aa4ff-merged.mount: Deactivated successfully.
Feb 23 09:04:06 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35724 DF PROTO=TCP SPT=34998 DPT=9101 SEQ=1217985343 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDD36470000000001030307) 
Feb 23 09:04:09 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52401 DF PROTO=TCP SPT=39404 DPT=9882 SEQ=1983269621 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDD42A30000000001030307) 
Feb 23 09:04:11 np0005626463.localdomain sshd[112403]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:04:12 np0005626463.localdomain sshd[112403]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:04:12 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52403 DF PROTO=TCP SPT=39404 DPT=9882 SEQ=1983269621 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDD4EC70000000001030307) 
Feb 23 09:04:13 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.
Feb 23 09:04:13 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.
Feb 23 09:04:13 np0005626463.localdomain systemd[1]: tmp-crun.KvVvps.mount: Deactivated successfully.
Feb 23 09:04:13 np0005626463.localdomain podman[112406]: 2026-02-23 09:04:13.730964565 +0000 UTC m=+0.151726892 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, build-date=2026-01-12T22:56:19Z, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, release=1766032510, version=17.1.13, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', 
'/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, managed_by=tripleo_ansible, distribution-scope=public, url=https://www.redhat.com, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=ovn_metadata_agent, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, tcib_managed=true, vendor=Red Hat, Inc., config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git)
Feb 23 09:04:13 np0005626463.localdomain podman[112406]: 2026-02-23 09:04:13.748792558 +0000 UTC m=+0.169554845 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, container_name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, vcs-type=git, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:56:19Z, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:56:19Z, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, url=https://www.redhat.com, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, release=1766032510, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.buildah.version=1.41.5)
Feb 23 09:04:13 np0005626463.localdomain podman[112406]: unhealthy
Feb 23 09:04:13 np0005626463.localdomain systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Main process exited, code=exited, status=1/FAILURE
Feb 23 09:04:13 np0005626463.localdomain systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Failed with result 'exit-code'.
Feb 23 09:04:13 np0005626463.localdomain podman[112405]: 2026-02-23 09:04:13.684844794 +0000 UTC m=+0.106400935 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, vcs-type=git, architecture=x86_64, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:36:40Z, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, release=1766032510, com.redhat.component=openstack-ovn-controller-container, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, summary=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, build-date=2026-01-12T22:36:40Z, io.openshift.tags=rhosp osp 
openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, tcib_managed=true, distribution-scope=public, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vendor=Red Hat, Inc., managed_by=tripleo_ansible)
Feb 23 09:04:13 np0005626463.localdomain podman[112405]: 2026-02-23 09:04:13.820364162 +0000 UTC m=+0.241920293 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:36:40Z, version=17.1.13, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, vendor=Red Hat, Inc., build-date=2026-01-12T22:36:40Z, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, 
vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, url=https://www.redhat.com, managed_by=tripleo_ansible, tcib_managed=true, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 23 09:04:13 np0005626463.localdomain podman[112405]: unhealthy
Feb 23 09:04:13 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Main process exited, code=exited, status=1/FAILURE
Feb 23 09:04:13 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Failed with result 'exit-code'.
Feb 23 09:04:14 np0005626463.localdomain systemd[1]: tmp-crun.x8qa7a.mount: Deactivated successfully.
Feb 23 09:04:15 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62770 DF PROTO=TCP SPT=35560 DPT=9100 SEQ=267781514 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDD5A060000000001030307) 
Feb 23 09:04:18 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35726 DF PROTO=TCP SPT=34998 DPT=9101 SEQ=1217985343 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDD66060000000001030307) 
Feb 23 09:04:24 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51737 DF PROTO=TCP SPT=36170 DPT=9102 SEQ=3860819404 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDD7C200000000001030307) 
Feb 23 09:04:24 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52405 DF PROTO=TCP SPT=39404 DPT=9882 SEQ=1983269621 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDD7E070000000001030307) 
Feb 23 09:04:27 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51739 DF PROTO=TCP SPT=36170 DPT=9102 SEQ=3860819404 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDD88460000000001030307) 
Feb 23 09:04:30 np0005626463.localdomain sshd[112446]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:04:30 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21407 DF PROTO=TCP SPT=42254 DPT=9100 SEQ=2964484347 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDD96860000000001030307) 
Feb 23 09:04:31 np0005626463.localdomain sshd[112446]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:04:32 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21408 DF PROTO=TCP SPT=42254 DPT=9100 SEQ=2964484347 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDD9E860000000001030307) 
Feb 23 09:04:34 np0005626463.localdomain sudo[112448]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 09:04:34 np0005626463.localdomain sudo[112448]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:04:34 np0005626463.localdomain sudo[112448]: pam_unix(sudo:session): session closed for user root
Feb 23 09:04:34 np0005626463.localdomain sudo[112463]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 23 09:04:34 np0005626463.localdomain sudo[112463]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:04:35 np0005626463.localdomain sudo[112463]: pam_unix(sudo:session): session closed for user root
Feb 23 09:04:36 np0005626463.localdomain sudo[112510]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 09:04:36 np0005626463.localdomain sudo[112510]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:04:36 np0005626463.localdomain sudo[112510]: pam_unix(sudo:session): session closed for user root
Feb 23 09:04:36 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62864 DF PROTO=TCP SPT=33276 DPT=9101 SEQ=3462267381 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDDAB870000000001030307) 
Feb 23 09:04:39 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55635 DF PROTO=TCP SPT=53614 DPT=9882 SEQ=2399849443 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDDB7D30000000001030307) 
Feb 23 09:04:42 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55637 DF PROTO=TCP SPT=53614 DPT=9882 SEQ=2399849443 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDDC3C70000000001030307) 
Feb 23 09:04:43 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.
Feb 23 09:04:43 np0005626463.localdomain systemd[1]: tmp-crun.e7whMF.mount: Deactivated successfully.
Feb 23 09:04:43 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.
Feb 23 09:04:43 np0005626463.localdomain podman[112525]: 2026-02-23 09:04:43.944899666 +0000 UTC m=+0.115046977 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, build-date=2026-01-12T22:56:19Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, tcib_managed=true, release=1766032510, distribution-scope=public, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:56:19Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 23 09:04:43 np0005626463.localdomain podman[112525]: 2026-02-23 09:04:43.96113687 +0000 UTC m=+0.131284181 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, batch=17.1_20260112.1, io.buildah.version=1.41.5, tcib_managed=true, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, version=17.1.13, vcs-type=git, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=ovn_metadata_agent, release=1766032510, vendor=Red Hat, Inc., managed_by=tripleo_ansible, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, build-date=2026-01-12T22:56:19Z, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, architecture=x86_64, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 23 09:04:43 np0005626463.localdomain podman[112525]: unhealthy
Feb 23 09:04:43 np0005626463.localdomain systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Main process exited, code=exited, status=1/FAILURE
Feb 23 09:04:43 np0005626463.localdomain systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Failed with result 'exit-code'.
Feb 23 09:04:44 np0005626463.localdomain podman[112541]: 2026-02-23 09:04:44.035982633 +0000 UTC m=+0.079306220 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, release=1766032510, vcs-type=git, build-date=2026-01-12T22:36:40Z, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.openshift.expose-services=, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, container_name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, name=rhosp-rhel9/openstack-ovn-controller, vendor=Red Hat, Inc., io.buildah.version=1.41.5, url=https://www.redhat.com, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, summary=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-ovn-controller-container, description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:36:40Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, batch=17.1_20260112.1, tcib_managed=true)
Feb 23 09:04:44 np0005626463.localdomain podman[112541]: 2026-02-23 09:04:44.057291811 +0000 UTC m=+0.100615388 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, batch=17.1_20260112.1, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, container_name=ovn_controller, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, tcib_managed=true, build-date=2026-01-12T22:36:40Z, io.buildah.version=1.41.5, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-type=git, managed_by=tripleo_ansible, io.openshift.expose-services=, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T22:36:40Z, vendor=Red Hat, Inc.)
Feb 23 09:04:44 np0005626463.localdomain podman[112541]: unhealthy
Feb 23 09:04:44 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Main process exited, code=exited, status=1/FAILURE
Feb 23 09:04:44 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Failed with result 'exit-code'.
Feb 23 09:04:45 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21410 DF PROTO=TCP SPT=42254 DPT=9100 SEQ=2964484347 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDDCE060000000001030307) 
Feb 23 09:04:48 np0005626463.localdomain sshd[112563]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:04:48 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62866 DF PROTO=TCP SPT=33276 DPT=9101 SEQ=3462267381 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDDDC060000000001030307) 
Feb 23 09:04:49 np0005626463.localdomain sshd[112563]: Invalid user anonymous from 80.94.95.115 port 60072
Feb 23 09:04:50 np0005626463.localdomain sshd[112563]: Connection closed by invalid user anonymous 80.94.95.115 port 60072 [preauth]
Feb 23 09:04:54 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5031 DF PROTO=TCP SPT=51082 DPT=9102 SEQ=4270415838 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDDF1500000000001030307) 
Feb 23 09:04:54 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55639 DF PROTO=TCP SPT=53614 DPT=9882 SEQ=2399849443 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDDF4060000000001030307) 
Feb 23 09:04:54 np0005626463.localdomain sshd[112565]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:04:55 np0005626463.localdomain sshd[112565]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:04:57 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5033 DF PROTO=TCP SPT=51082 DPT=9102 SEQ=4270415838 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDDFD460000000001030307) 
Feb 23 09:05:00 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23663 DF PROTO=TCP SPT=38396 DPT=9100 SEQ=343615187 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDE0BC60000000001030307) 
Feb 23 09:05:02 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23664 DF PROTO=TCP SPT=38396 DPT=9100 SEQ=343615187 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDE13C60000000001030307) 
Feb 23 09:05:06 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14849 DF PROTO=TCP SPT=39344 DPT=9101 SEQ=3084715503 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDE20C60000000001030307) 
Feb 23 09:05:09 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2410 DF PROTO=TCP SPT=39234 DPT=9882 SEQ=2341706989 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDE2D030000000001030307) 
Feb 23 09:05:11 np0005626463.localdomain sshd[112567]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:05:11 np0005626463.localdomain sshd[112567]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:05:12 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2412 DF PROTO=TCP SPT=39234 DPT=9882 SEQ=2341706989 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDE39060000000001030307) 
Feb 23 09:05:14 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.
Feb 23 09:05:14 np0005626463.localdomain podman[112569]: 2026-02-23 09:05:14.160558619 +0000 UTC m=+0.085859161 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, architecture=x86_64, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, url=https://www.redhat.com, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, vcs-type=git, config_id=tripleo_step4, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.41.5, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, io.openshift.expose-services=, release=1766032510)
Feb 23 09:05:14 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.
Feb 23 09:05:14 np0005626463.localdomain podman[112569]: 2026-02-23 09:05:14.179409181 +0000 UTC m=+0.104709743 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, url=https://www.redhat.com, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.buildah.version=1.41.5, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, container_name=ovn_metadata_agent, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-type=git, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:56:19Z, tcib_managed=true, build-date=2026-01-12T22:56:19Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1766032510, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4)
Feb 23 09:05:14 np0005626463.localdomain podman[112569]: unhealthy
Feb 23 09:05:14 np0005626463.localdomain systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Main process exited, code=exited, status=1/FAILURE
Feb 23 09:05:14 np0005626463.localdomain systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Failed with result 'exit-code'.
Feb 23 09:05:14 np0005626463.localdomain podman[112589]: 2026-02-23 09:05:14.258821705 +0000 UTC m=+0.075174276 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, maintainer=OpenStack TripleO Team, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:36:40Z, vcs-type=git, distribution-scope=public, com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, url=https://www.redhat.com, version=17.1.13, vendor=Red Hat, Inc., architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.buildah.version=1.41.5)
Feb 23 09:05:14 np0005626463.localdomain podman[112589]: 2026-02-23 09:05:14.299714087 +0000 UTC m=+0.116066668 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, maintainer=OpenStack TripleO Team, version=17.1.13, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, tcib_managed=true, build-date=2026-01-12T22:36:40Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, name=rhosp-rhel9/openstack-ovn-controller, vendor=Red Hat, Inc., com.redhat.component=openstack-ovn-controller-container, container_name=ovn_controller, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, distribution-scope=public, release=1766032510, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller)
Feb 23 09:05:14 np0005626463.localdomain podman[112589]: unhealthy
Feb 23 09:05:14 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Main process exited, code=exited, status=1/FAILURE
Feb 23 09:05:14 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Failed with result 'exit-code'.
Feb 23 09:05:15 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10254 DF PROTO=TCP SPT=60970 DPT=9105 SEQ=4164321857 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDE44060000000001030307) 
Feb 23 09:05:18 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14851 DF PROTO=TCP SPT=39344 DPT=9101 SEQ=3084715503 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDE50060000000001030307) 
Feb 23 09:05:24 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42740 DF PROTO=TCP SPT=33396 DPT=9102 SEQ=2293908764 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDE66800000000001030307) 
Feb 23 09:05:24 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2414 DF PROTO=TCP SPT=39234 DPT=9882 SEQ=2341706989 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDE6A060000000001030307) 
Feb 23 09:05:27 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42742 DF PROTO=TCP SPT=33396 DPT=9102 SEQ=2293908764 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDE72860000000001030307) 
Feb 23 09:05:27 np0005626463.localdomain systemd[1]: tripleo_nova_virtqemud.service: State 'stop-sigterm' timed out. Killing.
Feb 23 09:05:27 np0005626463.localdomain systemd[1]: tripleo_nova_virtqemud.service: Killing process 61978 (conmon) with signal SIGKILL.
Feb 23 09:05:27 np0005626463.localdomain systemd[1]: tripleo_nova_virtqemud.service: Main process exited, code=killed, status=9/KILL
Feb 23 09:05:27 np0005626463.localdomain systemd[1]: libpod-conmon-ac0f39392edf3bb25f5f81a87a3332c629d29b63674f9d0e6bbbde82fcbac468.scope: Deactivated successfully.
Feb 23 09:05:27 np0005626463.localdomain podman[112620]: error opening file `/run/crun/ac0f39392edf3bb25f5f81a87a3332c629d29b63674f9d0e6bbbde82fcbac468/status`: No such file or directory
Feb 23 09:05:27 np0005626463.localdomain systemd[1]: tmp-crun.EtWDjp.mount: Deactivated successfully.
Feb 23 09:05:27 np0005626463.localdomain podman[112609]: 2026-02-23 09:05:27.910698368 +0000 UTC m=+0.083325413 container cleanup ac0f39392edf3bb25f5f81a87a3332c629d29b63674f9d0e6bbbde82fcbac468 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, org.opencontainers.image.created=2026-01-12T23:31:49Z, name=rhosp-rhel9/openstack-nova-libvirt, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., release=1766032510, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, distribution-scope=public, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, config_id=tripleo_step3, build-date=2026-01-12T23:31:49Z, container_name=nova_virtqemud, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', '/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']}, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible)
Feb 23 09:05:27 np0005626463.localdomain podman[112609]: nova_virtqemud
Feb 23 09:05:27 np0005626463.localdomain systemd[1]: tripleo_nova_virtqemud.service: Failed with result 'timeout'.
Feb 23 09:05:27 np0005626463.localdomain systemd[1]: Stopped nova_virtqemud container.
Feb 23 09:05:27 np0005626463.localdomain sudo[112329]: pam_unix(sudo:session): session closed for user root
Feb 23 09:05:28 np0005626463.localdomain sudo[112711]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zlhyeifrrwqywqzwvmiwfnffwfuvqica ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837528.0742595-108-255002812331412/AnsiballZ_systemd_service.py
Feb 23 09:05:28 np0005626463.localdomain sudo[112711]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:05:28 np0005626463.localdomain python3.9[112713]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtqemud_recover.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 09:05:28 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 09:05:28 np0005626463.localdomain systemd-rc-local-generator[112736]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 09:05:28 np0005626463.localdomain systemd-sysv-generator[112743]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 09:05:28 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 09:05:29 np0005626463.localdomain sudo[112711]: pam_unix(sudo:session): session closed for user root
Feb 23 09:05:29 np0005626463.localdomain sudo[112841]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xqqmllsslgrtvkjqdgmfglopqlfglvta ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837529.1771507-108-182669345944640/AnsiballZ_systemd_service.py
Feb 23 09:05:29 np0005626463.localdomain sudo[112841]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:05:29 np0005626463.localdomain python3.9[112843]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtsecretd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 09:05:29 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 09:05:29 np0005626463.localdomain systemd-rc-local-generator[112866]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 09:05:29 np0005626463.localdomain systemd-sysv-generator[112870]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 09:05:29 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 09:05:30 np0005626463.localdomain systemd[1]: Stopping nova_virtsecretd container...
Feb 23 09:05:30 np0005626463.localdomain systemd[1]: libpod-c5df1accada064756d125bdbb5695a73715fe028f05153add7eb5ec7749f050f.scope: Deactivated successfully.
Feb 23 09:05:30 np0005626463.localdomain podman[112883]: 2026-02-23 09:05:30.213219754 +0000 UTC m=+0.064081658 container died c5df1accada064756d125bdbb5695a73715fe028f05153add7eb5ec7749f050f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtsecretd, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, org.opencontainers.image.created=2026-01-12T23:31:49Z, description=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., config_id=tripleo_step3, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, io.openshift.expose-services=, name=rhosp-rhel9/openstack-nova-libvirt, batch=17.1_20260112.1, container_name=nova_virtsecretd, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, com.redhat.component=openstack-nova-libvirt-container, version=17.1.13, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 1, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro']}, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T23:31:49Z, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible)
Feb 23 09:05:30 np0005626463.localdomain podman[112883]: 2026-02-23 09:05:30.263111231 +0000 UTC m=+0.113973125 container cleanup c5df1accada064756d125bdbb5695a73715fe028f05153add7eb5ec7749f050f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtsecretd, managed_by=tripleo_ansible, vcs-type=git, tcib_managed=true, version=17.1.13, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-nova-libvirt-container, container_name=nova_virtsecretd, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 1, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', 
'/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro']}, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step3, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2026-01-12T23:31:49Z, org.opencontainers.image.created=2026-01-12T23:31:49Z, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 nova-libvirt)
Feb 23 09:05:30 np0005626463.localdomain podman[112883]: nova_virtsecretd
Feb 23 09:05:30 np0005626463.localdomain podman[112897]: 2026-02-23 09:05:30.284132409 +0000 UTC m=+0.059720405 container cleanup c5df1accada064756d125bdbb5695a73715fe028f05153add7eb5ec7749f050f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtsecretd, release=1766032510, managed_by=tripleo_ansible, vendor=Red Hat, Inc., vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 1, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro']}, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, build-date=2026-01-12T23:31:49Z, version=17.1.13, name=rhosp-rhel9/openstack-nova-libvirt, vcs-type=git, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T23:31:49Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, distribution-scope=public, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_virtsecretd, description=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, url=https://www.redhat.com, io.buildah.version=1.41.5)
Feb 23 09:05:30 np0005626463.localdomain systemd[1]: libpod-conmon-c5df1accada064756d125bdbb5695a73715fe028f05153add7eb5ec7749f050f.scope: Deactivated successfully.
Feb 23 09:05:30 np0005626463.localdomain podman[112926]: error opening file `/run/crun/c5df1accada064756d125bdbb5695a73715fe028f05153add7eb5ec7749f050f/status`: No such file or directory
Feb 23 09:05:30 np0005626463.localdomain podman[112914]: 2026-02-23 09:05:30.401002481 +0000 UTC m=+0.076642499 container cleanup c5df1accada064756d125bdbb5695a73715fe028f05153add7eb5ec7749f050f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtsecretd, release=1766032510, version=17.1.13, url=https://www.redhat.com, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:31:49Z, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2026-01-12T23:31:49Z, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 1, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', 
'/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro']}, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, container_name=nova_virtsecretd, name=rhosp-rhel9/openstack-nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, config_id=tripleo_step3, batch=17.1_20260112.1)
Feb 23 09:05:30 np0005626463.localdomain podman[112914]: nova_virtsecretd
Feb 23 09:05:30 np0005626463.localdomain systemd[1]: tripleo_nova_virtsecretd.service: Deactivated successfully.
Feb 23 09:05:30 np0005626463.localdomain systemd[1]: Stopped nova_virtsecretd container.
Feb 23 09:05:30 np0005626463.localdomain sudo[112841]: pam_unix(sudo:session): session closed for user root
Feb 23 09:05:30 np0005626463.localdomain sudo[113017]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-inmjjhsiwfixgczdvbwilscwowoejdpd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837530.591813-108-100556558474418/AnsiballZ_systemd_service.py
Feb 23 09:05:30 np0005626463.localdomain sudo[113017]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:05:30 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55035 DF PROTO=TCP SPT=59978 DPT=9100 SEQ=1291074508 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDE81060000000001030307) 
Feb 23 09:05:31 np0005626463.localdomain python3.9[113019]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtstoraged.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 09:05:31 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-fda7ddd4426914a36b65b3677210da7055750d28e58e5eb1d0839c5cab6710a1-merged.mount: Deactivated successfully.
Feb 23 09:05:31 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c5df1accada064756d125bdbb5695a73715fe028f05153add7eb5ec7749f050f-userdata-shm.mount: Deactivated successfully.
Feb 23 09:05:31 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 09:05:31 np0005626463.localdomain systemd-sysv-generator[113052]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 09:05:31 np0005626463.localdomain systemd-rc-local-generator[113047]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 09:05:31 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 09:05:31 np0005626463.localdomain systemd[1]: Stopping nova_virtstoraged container...
Feb 23 09:05:31 np0005626463.localdomain systemd[1]: libpod-5312b8f35cc3bb510b53656211ca10c6fa5b207da4896c4ba9d4afdec879f843.scope: Deactivated successfully.
Feb 23 09:05:31 np0005626463.localdomain podman[113060]: 2026-02-23 09:05:31.658293752 +0000 UTC m=+0.084386036 container died 5312b8f35cc3bb510b53656211ca10c6fa5b207da4896c4ba9d4afdec879f843 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtstoraged, url=https://www.redhat.com, batch=17.1_20260112.1, release=1766032510, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:31:49Z, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, org.opencontainers.image.created=2026-01-12T23:31:49Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-nova-libvirt, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible, vcs-type=git, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.13, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 3, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro']}, tcib_managed=true, com.redhat.component=openstack-nova-libvirt-container, architecture=x86_64, container_name=nova_virtstoraged)
Feb 23 09:05:31 np0005626463.localdomain systemd[1]: tmp-crun.awQY0i.mount: Deactivated successfully.
Feb 23 09:05:31 np0005626463.localdomain podman[113060]: 2026-02-23 09:05:31.708144237 +0000 UTC m=+0.134236491 container cleanup 5312b8f35cc3bb510b53656211ca10c6fa5b207da4896c4ba9d4afdec879f843 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtstoraged, io.openshift.expose-services=, architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:31:49Z, tcib_managed=true, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-libvirt-container, name=rhosp-rhel9/openstack-nova-libvirt, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, release=1766032510, vcs-type=git, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 3, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro']}, description=Red Hat OpenStack Platform 17.1 nova-libvirt, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, vendor=Red Hat, Inc., container_name=nova_virtstoraged, config_id=tripleo_step3, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, build-date=2026-01-12T23:31:49Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team)
Feb 23 09:05:31 np0005626463.localdomain podman[113060]: nova_virtstoraged
Feb 23 09:05:31 np0005626463.localdomain podman[113076]: 2026-02-23 09:05:31.756638971 +0000 UTC m=+0.086975475 container cleanup 5312b8f35cc3bb510b53656211ca10c6fa5b207da4896c4ba9d4afdec879f843 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtstoraged, description=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step3, build-date=2026-01-12T23:31:49Z, io.buildah.version=1.41.5, architecture=x86_64, batch=17.1_20260112.1, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 3, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro']}, vcs-type=git, version=17.1.13, container_name=nova_virtstoraged, com.redhat.component=openstack-nova-libvirt-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, name=rhosp-rhel9/openstack-nova-libvirt, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, tcib_managed=true, vendor=Red Hat, Inc., release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:31:49Z, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe)
Feb 23 09:05:31 np0005626463.localdomain systemd[1]: libpod-conmon-5312b8f35cc3bb510b53656211ca10c6fa5b207da4896c4ba9d4afdec879f843.scope: Deactivated successfully.
Feb 23 09:05:31 np0005626463.localdomain podman[113105]: error opening file `/run/crun/5312b8f35cc3bb510b53656211ca10c6fa5b207da4896c4ba9d4afdec879f843/status`: No such file or directory
Feb 23 09:05:31 np0005626463.localdomain podman[113093]: 2026-02-23 09:05:31.862479937 +0000 UTC m=+0.073775613 container cleanup 5312b8f35cc3bb510b53656211ca10c6fa5b207da4896c4ba9d4afdec879f843 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtstoraged, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.buildah.version=1.41.5, batch=17.1_20260112.1, url=https://www.redhat.com, release=1766032510, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 3, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', 
'/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro']}, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, tcib_managed=true, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:31:49Z, com.redhat.component=openstack-nova-libvirt-container, managed_by=tripleo_ansible, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, container_name=nova_virtstoraged, architecture=x86_64, vendor=Red Hat, Inc., org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-nova-libvirt, build-date=2026-01-12T23:31:49Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 23 09:05:31 np0005626463.localdomain podman[113093]: nova_virtstoraged
Feb 23 09:05:31 np0005626463.localdomain systemd[1]: tripleo_nova_virtstoraged.service: Deactivated successfully.
Feb 23 09:05:31 np0005626463.localdomain systemd[1]: Stopped nova_virtstoraged container.
Feb 23 09:05:31 np0005626463.localdomain sudo[113017]: pam_unix(sudo:session): session closed for user root
Feb 23 09:05:32 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-34d62c030d25095ae1697db07157c262435d04349696135717e45f6132a7e460-merged.mount: Deactivated successfully.
Feb 23 09:05:32 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5312b8f35cc3bb510b53656211ca10c6fa5b207da4896c4ba9d4afdec879f843-userdata-shm.mount: Deactivated successfully.
Feb 23 09:05:32 np0005626463.localdomain sudo[113196]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tiscqnddxetysvbazlamfdvdapkovbww ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837532.0387225-108-243331977702095/AnsiballZ_systemd_service.py
Feb 23 09:05:32 np0005626463.localdomain sudo[113196]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:05:32 np0005626463.localdomain python3.9[113198]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_ovn_controller.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 09:05:32 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 09:05:32 np0005626463.localdomain systemd-rc-local-generator[113221]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 09:05:32 np0005626463.localdomain systemd-sysv-generator[113226]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 09:05:32 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 09:05:32 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55036 DF PROTO=TCP SPT=59978 DPT=9100 SEQ=1291074508 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDE89060000000001030307) 
Feb 23 09:05:33 np0005626463.localdomain systemd[1]: Stopping ovn_controller container...
Feb 23 09:05:33 np0005626463.localdomain systemd[1]: tmp-crun.v0iYv9.mount: Deactivated successfully.
Feb 23 09:05:33 np0005626463.localdomain systemd[1]: libpod-1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.scope: Deactivated successfully.
Feb 23 09:05:33 np0005626463.localdomain systemd[1]: libpod-1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.scope: Consumed 3.367s CPU time.
Feb 23 09:05:33 np0005626463.localdomain podman[113239]: 2026-02-23 09:05:33.152157511 +0000 UTC m=+0.121616956 container died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, version=17.1.13, managed_by=tripleo_ansible, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:36:40Z, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:36:40Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, maintainer=OpenStack TripleO Team, container_name=ovn_controller, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, tcib_managed=true, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, name=rhosp-rhel9/openstack-ovn-controller, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 23 09:05:33 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.timer: Deactivated successfully.
Feb 23 09:05:33 np0005626463.localdomain systemd[1]: Stopped /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.
Feb 23 09:05:33 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Failed to open /run/systemd/transient/1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: No such file or directory
Feb 23 09:05:33 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e-userdata-shm.mount: Deactivated successfully.
Feb 23 09:05:33 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-4c9543ebc437e655c5d4bfa732d117b93efd0b526ada85fb52dfe4d58e51e764-merged.mount: Deactivated successfully.
Feb 23 09:05:33 np0005626463.localdomain podman[113239]: 2026-02-23 09:05:33.204911555 +0000 UTC m=+0.174370980 container cleanup 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.openshift.expose-services=, io.buildah.version=1.41.5, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vendor=Red Hat, Inc., container_name=ovn_controller, release=1766032510, architecture=x86_64, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:36:40Z, config_id=tripleo_step4, name=rhosp-rhel9/openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, batch=17.1_20260112.1, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:36:40Z, tcib_managed=true, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ovn-controller)
Feb 23 09:05:33 np0005626463.localdomain podman[113239]: ovn_controller
Feb 23 09:05:33 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.timer: Failed to open /run/systemd/transient/1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.timer: No such file or directory
Feb 23 09:05:33 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Failed to open /run/systemd/transient/1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: No such file or directory
Feb 23 09:05:33 np0005626463.localdomain podman[113252]: 2026-02-23 09:05:33.242470356 +0000 UTC m=+0.077771634 container cleanup 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, version=17.1.13, distribution-scope=public, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, batch=17.1_20260112.1, url=https://www.redhat.com, architecture=x86_64, name=rhosp-rhel9/openstack-ovn-controller, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, container_name=ovn_controller, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:36:40Z, io.openshift.expose-services=, com.redhat.component=openstack-ovn-controller-container, 
org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4)
Feb 23 09:05:33 np0005626463.localdomain systemd[1]: libpod-conmon-1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.scope: Deactivated successfully.
Feb 23 09:05:33 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.timer: Failed to open /run/systemd/transient/1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.timer: No such file or directory
Feb 23 09:05:33 np0005626463.localdomain systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Failed to open /run/systemd/transient/1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: No such file or directory
Feb 23 09:05:33 np0005626463.localdomain podman[113266]: 2026-02-23 09:05:33.345224019 +0000 UTC m=+0.071963148 container cleanup 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, container_name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ovn-controller-container, url=https://www.redhat.com, version=17.1.13, config_id=tripleo_step4, architecture=x86_64, vendor=Red Hat, Inc., batch=17.1_20260112.1, name=rhosp-rhel9/openstack-ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, release=1766032510, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, build-date=2026-01-12T22:36:40Z, maintainer=OpenStack TripleO Team, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, 
managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=)
Feb 23 09:05:33 np0005626463.localdomain podman[113266]: ovn_controller
Feb 23 09:05:33 np0005626463.localdomain systemd[1]: tripleo_ovn_controller.service: Deactivated successfully.
Feb 23 09:05:33 np0005626463.localdomain systemd[1]: Stopped ovn_controller container.
Feb 23 09:05:33 np0005626463.localdomain sudo[113196]: pam_unix(sudo:session): session closed for user root
Feb 23 09:05:33 np0005626463.localdomain sudo[113369]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-maymylsmhbitchxyoobxqkvvncntlmnk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837533.52766-108-107221331679500/AnsiballZ_systemd_service.py
Feb 23 09:05:33 np0005626463.localdomain sudo[113369]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:05:34 np0005626463.localdomain python3.9[113371]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_ovn_metadata_agent.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 09:05:35 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 09:05:35 np0005626463.localdomain systemd-rc-local-generator[113395]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 09:05:35 np0005626463.localdomain systemd-sysv-generator[113403]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 09:05:35 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 09:05:35 np0005626463.localdomain systemd[1]: Stopping ovn_metadata_agent container...
Feb 23 09:05:36 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17008 DF PROTO=TCP SPT=40828 DPT=9101 SEQ=3887796689 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDE96060000000001030307) 
Feb 23 09:05:36 np0005626463.localdomain sudo[113424]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 09:05:36 np0005626463.localdomain sudo[113424]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:05:36 np0005626463.localdomain sudo[113424]: pam_unix(sudo:session): session closed for user root
Feb 23 09:05:36 np0005626463.localdomain systemd[1]: libpod-9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.scope: Deactivated successfully.
Feb 23 09:05:36 np0005626463.localdomain systemd[1]: libpod-9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.scope: Consumed 13.160s CPU time.
Feb 23 09:05:36 np0005626463.localdomain podman[113412]: 2026-02-23 09:05:36.515765045 +0000 UTC m=+0.924393054 container died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, config_id=tripleo_step4, build-date=2026-01-12T22:56:19Z, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, release=1766032510, vendor=Red Hat, Inc., distribution-scope=public, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.13)
Feb 23 09:05:36 np0005626463.localdomain systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.timer: Deactivated successfully.
Feb 23 09:05:36 np0005626463.localdomain systemd[1]: Stopped /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.
Feb 23 09:05:36 np0005626463.localdomain systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Failed to open /run/systemd/transient/9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: No such file or directory
Feb 23 09:05:36 np0005626463.localdomain sudo[113439]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 23 09:05:36 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9-userdata-shm.mount: Deactivated successfully.
Feb 23 09:05:36 np0005626463.localdomain sudo[113439]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:05:36 np0005626463.localdomain podman[113412]: 2026-02-23 09:05:36.648330493 +0000 UTC m=+1.056958442 container cleanup 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, container_name=ovn_metadata_agent, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, tcib_managed=true, config_id=tripleo_step4, batch=17.1_20260112.1, io.buildah.version=1.41.5, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, build-date=2026-01-12T22:56:19Z, org.opencontainers.image.created=2026-01-12T22:56:19Z, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, release=1766032510)
Feb 23 09:05:36 np0005626463.localdomain podman[113412]: ovn_metadata_agent
Feb 23 09:05:36 np0005626463.localdomain sshd[113469]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:05:36 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-6ccd32925d3fd52f95d80d5d3005423a627f4aa5e2e72537587c6c2e01c55ed4-merged.mount: Deactivated successfully.
Feb 23 09:05:36 np0005626463.localdomain systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.timer: Failed to open /run/systemd/transient/9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.timer: No such file or directory
Feb 23 09:05:36 np0005626463.localdomain systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Failed to open /run/systemd/transient/9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: No such file or directory
Feb 23 09:05:36 np0005626463.localdomain podman[113450]: 2026-02-23 09:05:36.677795089 +0000 UTC m=+0.150162235 container cleanup 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, version=17.1.13, managed_by=tripleo_ansible, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', 
'/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, org.opencontainers.image.created=2026-01-12T22:56:19Z, tcib_managed=true, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, url=https://www.redhat.com, io.buildah.version=1.41.5, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4)
Feb 23 09:05:37 np0005626463.localdomain sudo[113439]: pam_unix(sudo:session): session closed for user root
Feb 23 09:05:37 np0005626463.localdomain sudo[113505]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 09:05:37 np0005626463.localdomain sudo[113505]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:05:37 np0005626463.localdomain sudo[113505]: pam_unix(sudo:session): session closed for user root
Feb 23 09:05:39 np0005626463.localdomain sshd[113469]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:05:39 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42744 DF PROTO=TCP SPT=33396 DPT=9102 SEQ=2293908764 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDEA2060000000001030307) 
Feb 23 09:05:42 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33114 DF PROTO=TCP SPT=58308 DPT=9882 SEQ=1548170034 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDEAE470000000001030307) 
Feb 23 09:05:44 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31859 DF PROTO=TCP SPT=46122 DPT=9105 SEQ=4028789398 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDEB8060000000001030307) 
Feb 23 09:05:48 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17010 DF PROTO=TCP SPT=40828 DPT=9101 SEQ=3887796689 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDEC6060000000001030307) 
Feb 23 09:05:51 np0005626463.localdomain sshd[113520]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:05:51 np0005626463.localdomain sshd[113520]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:05:54 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8423 DF PROTO=TCP SPT=54902 DPT=9102 SEQ=2603747387 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDEDBB00000000001030307) 
Feb 23 09:05:54 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33116 DF PROTO=TCP SPT=58308 DPT=9882 SEQ=1548170034 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDEDE060000000001030307) 
Feb 23 09:05:57 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8425 DF PROTO=TCP SPT=54902 DPT=9102 SEQ=2603747387 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDEE7C60000000001030307) 
Feb 23 09:06:00 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30169 DF PROTO=TCP SPT=60204 DPT=9100 SEQ=2391636339 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDEF6460000000001030307) 
Feb 23 09:06:02 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30170 DF PROTO=TCP SPT=60204 DPT=9100 SEQ=2391636339 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDEFE460000000001030307) 
Feb 23 09:06:06 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18536 DF PROTO=TCP SPT=35264 DPT=9101 SEQ=1628879478 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDF0B060000000001030307) 
Feb 23 09:06:09 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2540 DF PROTO=TCP SPT=57692 DPT=9882 SEQ=742997640 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDF17630000000001030307) 
Feb 23 09:06:12 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2542 DF PROTO=TCP SPT=57692 DPT=9882 SEQ=742997640 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDF23860000000001030307) 
Feb 23 09:06:15 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30172 DF PROTO=TCP SPT=60204 DPT=9100 SEQ=2391636339 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDF2E060000000001030307) 
Feb 23 09:06:15 np0005626463.localdomain sshd[113522]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:06:16 np0005626463.localdomain sshd[113522]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:06:18 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18538 DF PROTO=TCP SPT=35264 DPT=9101 SEQ=1628879478 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDF3C060000000001030307) 
Feb 23 09:06:24 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10826 DF PROTO=TCP SPT=56876 DPT=9102 SEQ=4214466874 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDF50E20000000001030307) 
Feb 23 09:06:24 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2544 DF PROTO=TCP SPT=57692 DPT=9882 SEQ=742997640 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDF54070000000001030307) 
Feb 23 09:06:27 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10828 DF PROTO=TCP SPT=56876 DPT=9102 SEQ=4214466874 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDF5D060000000001030307) 
Feb 23 09:06:30 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21261 DF PROTO=TCP SPT=58144 DPT=9100 SEQ=1841808314 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDF6B460000000001030307) 
Feb 23 09:06:31 np0005626463.localdomain sshd[113524]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:06:31 np0005626463.localdomain sshd[113524]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:06:32 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21262 DF PROTO=TCP SPT=58144 DPT=9100 SEQ=1841808314 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDF73460000000001030307) 
Feb 23 09:06:36 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16432 DF PROTO=TCP SPT=38864 DPT=9101 SEQ=1068486902 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDF80460000000001030307) 
Feb 23 09:06:37 np0005626463.localdomain sudo[113526]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 09:06:37 np0005626463.localdomain sudo[113526]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:06:37 np0005626463.localdomain sudo[113526]: pam_unix(sudo:session): session closed for user root
Feb 23 09:06:38 np0005626463.localdomain sudo[113541]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 23 09:06:38 np0005626463.localdomain sudo[113541]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:06:38 np0005626463.localdomain sudo[113541]: pam_unix(sudo:session): session closed for user root
Feb 23 09:06:39 np0005626463.localdomain sudo[113588]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 09:06:39 np0005626463.localdomain sudo[113588]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:06:39 np0005626463.localdomain sudo[113588]: pam_unix(sudo:session): session closed for user root
Feb 23 09:06:39 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58685 DF PROTO=TCP SPT=40918 DPT=9882 SEQ=3224193056 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDF8C930000000001030307) 
Feb 23 09:06:42 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58687 DF PROTO=TCP SPT=40918 DPT=9882 SEQ=3224193056 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDF98860000000001030307) 
Feb 23 09:06:45 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21264 DF PROTO=TCP SPT=58144 DPT=9100 SEQ=1841808314 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDFA4060000000001030307) 
Feb 23 09:06:48 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16434 DF PROTO=TCP SPT=38864 DPT=9101 SEQ=1068486902 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDFB0060000000001030307) 
Feb 23 09:06:54 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23993 DF PROTO=TCP SPT=55840 DPT=9102 SEQ=1594204587 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDFC6100000000001030307) 
Feb 23 09:06:54 np0005626463.localdomain sshd[113603]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:06:54 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58689 DF PROTO=TCP SPT=40918 DPT=9882 SEQ=3224193056 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDFC8060000000001030307) 
Feb 23 09:06:57 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23995 DF PROTO=TCP SPT=55840 DPT=9102 SEQ=1594204587 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDFD2060000000001030307) 
Feb 23 09:06:57 np0005626463.localdomain sshd[113603]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:07:00 np0005626463.localdomain systemd[1]: tripleo_ovn_metadata_agent.service: State 'stop-sigterm' timed out. Killing.
Feb 23 09:07:00 np0005626463.localdomain systemd[1]: tripleo_ovn_metadata_agent.service: Killing process 71305 (conmon) with signal SIGKILL.
Feb 23 09:07:00 np0005626463.localdomain systemd[1]: tripleo_ovn_metadata_agent.service: Main process exited, code=killed, status=9/KILL
Feb 23 09:07:00 np0005626463.localdomain systemd[1]: libpod-conmon-9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.scope: Deactivated successfully.
Feb 23 09:07:00 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2972 DF PROTO=TCP SPT=37642 DPT=9100 SEQ=742417238 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDFE0860000000001030307) 
Feb 23 09:07:00 np0005626463.localdomain podman[113617]: error opening file `/run/crun/9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9/status`: No such file or directory
Feb 23 09:07:00 np0005626463.localdomain systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.timer: Failed to open /run/systemd/transient/9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.timer: No such file or directory
Feb 23 09:07:00 np0005626463.localdomain systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Failed to open /run/systemd/transient/9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: No such file or directory
Feb 23 09:07:00 np0005626463.localdomain podman[113605]: 2026-02-23 09:07:00.93436007 +0000 UTC m=+0.095711453 container cleanup 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, version=17.1.13, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, build-date=2026-01-12T22:56:19Z, io.openshift.expose-services=, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:56:19Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, batch=17.1_20260112.1, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, url=https://www.redhat.com, release=1766032510, managed_by=tripleo_ansible, vcs-type=git, io.buildah.version=1.41.5, container_name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64)
Feb 23 09:07:00 np0005626463.localdomain podman[113605]: ovn_metadata_agent
Feb 23 09:07:00 np0005626463.localdomain systemd[1]: tripleo_ovn_metadata_agent.service: Failed with result 'timeout'.
Feb 23 09:07:00 np0005626463.localdomain systemd[1]: Stopped ovn_metadata_agent container.
Feb 23 09:07:00 np0005626463.localdomain sudo[113369]: pam_unix(sudo:session): session closed for user root
Feb 23 09:07:01 np0005626463.localdomain sudo[113708]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bqejdmxwhtwivbacipvfbtlmuajgtbja ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837621.111524-108-169235815835858/AnsiballZ_systemd_service.py
Feb 23 09:07:01 np0005626463.localdomain sudo[113708]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:07:01 np0005626463.localdomain python3.9[113710]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_rsyslog.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 09:07:01 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 09:07:01 np0005626463.localdomain systemd-sysv-generator[113738]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 09:07:01 np0005626463.localdomain systemd-rc-local-generator[113734]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 09:07:01 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 09:07:02 np0005626463.localdomain sudo[113708]: pam_unix(sudo:session): session closed for user root
Feb 23 09:07:02 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2973 DF PROTO=TCP SPT=37642 DPT=9100 SEQ=742417238 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDFE8860000000001030307) 
Feb 23 09:07:03 np0005626463.localdomain sudo[113838]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uopsisyyzgvdueityprlpmkcpgkxcuam ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837623.0137453-558-45523440732232/AnsiballZ_file.py
Feb 23 09:07:03 np0005626463.localdomain sudo[113838]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:07:03 np0005626463.localdomain python3.9[113840]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:07:03 np0005626463.localdomain sudo[113838]: pam_unix(sudo:session): session closed for user root
Feb 23 09:07:04 np0005626463.localdomain sudo[113930]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-famdrbbmqexxlewfexrtqdluzifiiimo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837623.8178654-558-99197305834596/AnsiballZ_file.py
Feb 23 09:07:04 np0005626463.localdomain sudo[113930]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:07:04 np0005626463.localdomain python3.9[113932]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_ceilometer_agent_ipmi.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:07:04 np0005626463.localdomain sudo[113930]: pam_unix(sudo:session): session closed for user root
Feb 23 09:07:04 np0005626463.localdomain sudo[114022]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-proiejmjpwzbrmpiybgomslqerondxxj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837624.433213-558-229660893730599/AnsiballZ_file.py
Feb 23 09:07:04 np0005626463.localdomain sudo[114022]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:07:04 np0005626463.localdomain python3.9[114024]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_collectd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:07:04 np0005626463.localdomain sudo[114022]: pam_unix(sudo:session): session closed for user root
Feb 23 09:07:05 np0005626463.localdomain sudo[114114]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cuqstqnzrjzhhkrxdpckurdelrwvmgsa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837625.067863-558-153302474608577/AnsiballZ_file.py
Feb 23 09:07:05 np0005626463.localdomain sudo[114114]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:07:05 np0005626463.localdomain python3.9[114116]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_iscsid.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:07:05 np0005626463.localdomain sudo[114114]: pam_unix(sudo:session): session closed for user root
Feb 23 09:07:05 np0005626463.localdomain sudo[114206]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iqorvgmgvlvkzdieglthkutthkxbzgef ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837625.6247928-558-255869446847009/AnsiballZ_file.py
Feb 23 09:07:05 np0005626463.localdomain sudo[114206]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:07:06 np0005626463.localdomain python3.9[114208]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_logrotate_crond.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:07:06 np0005626463.localdomain sudo[114206]: pam_unix(sudo:session): session closed for user root
Feb 23 09:07:06 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43779 DF PROTO=TCP SPT=34978 DPT=9101 SEQ=819520041 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDFF5870000000001030307) 
Feb 23 09:07:06 np0005626463.localdomain sudo[114298]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-myuuzfkmzddvdlcvpsnlnelzwanwfphr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837626.2266574-558-209250600946340/AnsiballZ_file.py
Feb 23 09:07:06 np0005626463.localdomain sudo[114298]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:07:06 np0005626463.localdomain python3.9[114300]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_metrics_qdr.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:07:06 np0005626463.localdomain sudo[114298]: pam_unix(sudo:session): session closed for user root
Feb 23 09:07:07 np0005626463.localdomain sudo[114390]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rsbhjynkclsxlnnmenowkdcmtgiviatx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837626.8547418-558-160750077262404/AnsiballZ_file.py
Feb 23 09:07:07 np0005626463.localdomain sudo[114390]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:07:07 np0005626463.localdomain python3.9[114392]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_neutron_dhcp.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:07:07 np0005626463.localdomain sudo[114390]: pam_unix(sudo:session): session closed for user root
Feb 23 09:07:07 np0005626463.localdomain sudo[114482]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mqtyjzcemynymhjhkegparcnonqskgul ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837627.4577465-558-12105846748552/AnsiballZ_file.py
Feb 23 09:07:07 np0005626463.localdomain sudo[114482]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:07:07 np0005626463.localdomain python3.9[114484]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_neutron_l3_agent.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:07:07 np0005626463.localdomain sudo[114482]: pam_unix(sudo:session): session closed for user root
Feb 23 09:07:08 np0005626463.localdomain sudo[114574]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xrnjebmswrsbbfogbjrhxvkfidxwfisn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837628.0811257-558-87517784728090/AnsiballZ_file.py
Feb 23 09:07:08 np0005626463.localdomain sudo[114574]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:07:08 np0005626463.localdomain python3.9[114576]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_neutron_ovs_agent.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:07:08 np0005626463.localdomain sudo[114574]: pam_unix(sudo:session): session closed for user root
Feb 23 09:07:08 np0005626463.localdomain sudo[114666]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mdwpxgwljktaxwngohfoeeaowmxdcvir ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837628.672287-558-139193239936670/AnsiballZ_file.py
Feb 23 09:07:08 np0005626463.localdomain sudo[114666]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:07:09 np0005626463.localdomain python3.9[114668]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:07:09 np0005626463.localdomain sudo[114666]: pam_unix(sudo:session): session closed for user root
Feb 23 09:07:09 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19751 DF PROTO=TCP SPT=60408 DPT=9882 SEQ=2142405796 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE001C30000000001030307) 
Feb 23 09:07:09 np0005626463.localdomain sudo[114758]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-owkrfheghngcofpgwhiqlepvlbxajbov ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837629.2916234-558-100360409834543/AnsiballZ_file.py
Feb 23 09:07:09 np0005626463.localdomain sudo[114758]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:07:09 np0005626463.localdomain python3.9[114760]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:07:09 np0005626463.localdomain sudo[114758]: pam_unix(sudo:session): session closed for user root
Feb 23 09:07:10 np0005626463.localdomain sshd[114852]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:07:10 np0005626463.localdomain sudo[114850]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ayhqynlqvrrimouimviocqtouhcibuzu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837629.872631-558-184166170829896/AnsiballZ_file.py
Feb 23 09:07:10 np0005626463.localdomain sudo[114850]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:07:10 np0005626463.localdomain python3.9[114854]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:07:10 np0005626463.localdomain sudo[114850]: pam_unix(sudo:session): session closed for user root
Feb 23 09:07:10 np0005626463.localdomain sshd[114852]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:07:10 np0005626463.localdomain sudo[114944]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eykkfyojeikggyvwhprlxpwlasffaqkj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837630.4807236-558-51723294019865/AnsiballZ_file.py
Feb 23 09:07:10 np0005626463.localdomain sudo[114944]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:07:10 np0005626463.localdomain python3.9[114946]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:07:10 np0005626463.localdomain sudo[114944]: pam_unix(sudo:session): session closed for user root
Feb 23 09:07:11 np0005626463.localdomain sudo[115036]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nrkztynawemugrmvlzdjgmjghdfbisxt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837631.083797-558-158580476476395/AnsiballZ_file.py
Feb 23 09:07:11 np0005626463.localdomain sudo[115036]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:07:11 np0005626463.localdomain python3.9[115038]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:07:11 np0005626463.localdomain sudo[115036]: pam_unix(sudo:session): session closed for user root
Feb 23 09:07:11 np0005626463.localdomain sudo[115128]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pyvcxgkzendeykaicpvvdiieipvbziqx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837631.6169322-558-147868090767427/AnsiballZ_file.py
Feb 23 09:07:11 np0005626463.localdomain sudo[115128]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:07:12 np0005626463.localdomain python3.9[115130]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:07:12 np0005626463.localdomain sudo[115128]: pam_unix(sudo:session): session closed for user root
Feb 23 09:07:12 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19753 DF PROTO=TCP SPT=60408 DPT=9882 SEQ=2142405796 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE00DC60000000001030307) 
Feb 23 09:07:12 np0005626463.localdomain sudo[115220]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-srnyucynfggbvogooxmjaompvtuqnbwb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837632.209873-558-55385802468045/AnsiballZ_file.py
Feb 23 09:07:12 np0005626463.localdomain sudo[115220]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:07:12 np0005626463.localdomain python3.9[115222]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtqemud_recover.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:07:12 np0005626463.localdomain sudo[115220]: pam_unix(sudo:session): session closed for user root
Feb 23 09:07:13 np0005626463.localdomain sudo[115312]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-norijiskgmlbdgrrfpcarkyhheoelmvb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837632.9067218-558-121333102398418/AnsiballZ_file.py
Feb 23 09:07:13 np0005626463.localdomain sudo[115312]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:07:13 np0005626463.localdomain python3.9[115314]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:07:13 np0005626463.localdomain sudo[115312]: pam_unix(sudo:session): session closed for user root
Feb 23 09:07:13 np0005626463.localdomain sudo[115404]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hrbuzxyqtanlowbamycwmpljepsbrpor ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837633.4740489-558-135850638704212/AnsiballZ_file.py
Feb 23 09:07:13 np0005626463.localdomain sudo[115404]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:07:13 np0005626463.localdomain python3.9[115406]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:07:13 np0005626463.localdomain sudo[115404]: pam_unix(sudo:session): session closed for user root
Feb 23 09:07:14 np0005626463.localdomain sudo[115496]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mxhajrbmnfgjegxnjscnqlykxqfzyyxj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837634.0682456-558-197755091133949/AnsiballZ_file.py
Feb 23 09:07:14 np0005626463.localdomain sudo[115496]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:07:14 np0005626463.localdomain python3.9[115498]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_ovn_controller.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:07:14 np0005626463.localdomain sudo[115496]: pam_unix(sudo:session): session closed for user root
Feb 23 09:07:14 np0005626463.localdomain sudo[115588]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nabgcqalmmxxcxzngpnlgjkweklssifd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837634.690246-558-219742553750355/AnsiballZ_file.py
Feb 23 09:07:14 np0005626463.localdomain sudo[115588]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:07:15 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38536 DF PROTO=TCP SPT=52248 DPT=9105 SEQ=1941285649 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE018060000000001030307) 
Feb 23 09:07:15 np0005626463.localdomain python3.9[115590]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_ovn_metadata_agent.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:07:15 np0005626463.localdomain sudo[115588]: pam_unix(sudo:session): session closed for user root
Feb 23 09:07:15 np0005626463.localdomain sudo[115680]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qjpsbocvkekzpnfuzexbghcvxdsoubyz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837635.277074-558-270315478501988/AnsiballZ_file.py
Feb 23 09:07:15 np0005626463.localdomain sudo[115680]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:07:15 np0005626463.localdomain python3.9[115682]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_rsyslog.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:07:15 np0005626463.localdomain sudo[115680]: pam_unix(sudo:session): session closed for user root
Feb 23 09:07:16 np0005626463.localdomain sudo[115772]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jvqfjjxvfecdznbotvphfdorstvsmcpf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837635.910064-1008-172866040903042/AnsiballZ_file.py
Feb 23 09:07:16 np0005626463.localdomain sudo[115772]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:07:16 np0005626463.localdomain python3.9[115774]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:07:16 np0005626463.localdomain sudo[115772]: pam_unix(sudo:session): session closed for user root
Feb 23 09:07:16 np0005626463.localdomain sudo[115864]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-skabeyvcojodymdjsirsosncwsxghadj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837636.5169427-1008-42920254746188/AnsiballZ_file.py
Feb 23 09:07:16 np0005626463.localdomain sudo[115864]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:07:16 np0005626463.localdomain python3.9[115866]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_ipmi.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:07:16 np0005626463.localdomain sudo[115864]: pam_unix(sudo:session): session closed for user root
Feb 23 09:07:17 np0005626463.localdomain sudo[115956]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-idsjtpnecvfrcearllugngeigkdzrxsx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837637.0838451-1008-131209460798901/AnsiballZ_file.py
Feb 23 09:07:17 np0005626463.localdomain sudo[115956]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:07:17 np0005626463.localdomain python3.9[115958]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_collectd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:07:17 np0005626463.localdomain sudo[115956]: pam_unix(sudo:session): session closed for user root
Feb 23 09:07:18 np0005626463.localdomain sudo[116048]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-osueqpjglgosulmggaklcdonjjrckmaw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837637.9094028-1008-157822258976027/AnsiballZ_file.py
Feb 23 09:07:18 np0005626463.localdomain sudo[116048]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:07:18 np0005626463.localdomain python3.9[116050]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_iscsid.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:07:18 np0005626463.localdomain sudo[116048]: pam_unix(sudo:session): session closed for user root
Feb 23 09:07:18 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43781 DF PROTO=TCP SPT=34978 DPT=9101 SEQ=819520041 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE026070000000001030307) 
Feb 23 09:07:18 np0005626463.localdomain sudo[116140]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ugelvecqllipnyqaclhlckjohwmtucbn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837638.480307-1008-153403726384677/AnsiballZ_file.py
Feb 23 09:07:18 np0005626463.localdomain sudo[116140]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:07:18 np0005626463.localdomain python3.9[116142]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_logrotate_crond.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:07:18 np0005626463.localdomain sudo[116140]: pam_unix(sudo:session): session closed for user root
Feb 23 09:07:19 np0005626463.localdomain sudo[116232]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zscrrdvonkmcjytoigrrpsjoapizwjvh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837639.0019567-1008-63546910973456/AnsiballZ_file.py
Feb 23 09:07:19 np0005626463.localdomain sudo[116232]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:07:19 np0005626463.localdomain python3.9[116234]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_metrics_qdr.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:07:19 np0005626463.localdomain sudo[116232]: pam_unix(sudo:session): session closed for user root
Feb 23 09:07:19 np0005626463.localdomain sudo[116324]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lhhmvpcqijyejbxarzjznslilbohrefr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837639.6382494-1008-153831614006816/AnsiballZ_file.py
Feb 23 09:07:19 np0005626463.localdomain sudo[116324]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:07:20 np0005626463.localdomain python3.9[116326]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_neutron_dhcp.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:07:20 np0005626463.localdomain sudo[116324]: pam_unix(sudo:session): session closed for user root
Feb 23 09:07:20 np0005626463.localdomain sudo[116416]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kkywdyhxgvbempmyrbtngahgebbnpiok ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837640.3633115-1008-72086589176625/AnsiballZ_file.py
Feb 23 09:07:20 np0005626463.localdomain sudo[116416]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:07:20 np0005626463.localdomain python3.9[116418]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_neutron_l3_agent.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:07:20 np0005626463.localdomain sudo[116416]: pam_unix(sudo:session): session closed for user root
Feb 23 09:07:21 np0005626463.localdomain sudo[116508]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pcgmnosxizxfxobwrytigltjgmcqjxck ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837640.955686-1008-234561102583836/AnsiballZ_file.py
Feb 23 09:07:21 np0005626463.localdomain sudo[116508]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:07:21 np0005626463.localdomain python3.9[116510]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_neutron_ovs_agent.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:07:21 np0005626463.localdomain sudo[116508]: pam_unix(sudo:session): session closed for user root
Feb 23 09:07:21 np0005626463.localdomain sudo[116600]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zezxwybpjqerbztdklynrempzktarbqt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837641.595523-1008-262826813548009/AnsiballZ_file.py
Feb 23 09:07:21 np0005626463.localdomain sudo[116600]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:07:22 np0005626463.localdomain python3.9[116602]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:07:22 np0005626463.localdomain sudo[116600]: pam_unix(sudo:session): session closed for user root
Feb 23 09:07:22 np0005626463.localdomain sudo[116692]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rmivaemucyebucfqroptyknsyssrljbk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837642.202291-1008-256527500763958/AnsiballZ_file.py
Feb 23 09:07:22 np0005626463.localdomain sudo[116692]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:07:22 np0005626463.localdomain python3.9[116694]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:07:22 np0005626463.localdomain sudo[116692]: pam_unix(sudo:session): session closed for user root
Feb 23 09:07:23 np0005626463.localdomain sudo[116784]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hmgsdwhzjqlyushmcyknjeqkrpchxqte ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837642.8835595-1008-253772648850524/AnsiballZ_file.py
Feb 23 09:07:23 np0005626463.localdomain sudo[116784]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:07:23 np0005626463.localdomain python3.9[116786]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:07:23 np0005626463.localdomain sudo[116784]: pam_unix(sudo:session): session closed for user root
Feb 23 09:07:24 np0005626463.localdomain sudo[116876]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fvxybksmiooidkvsntemdtqcdjirrxpm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837643.5428195-1008-2550465274543/AnsiballZ_file.py
Feb 23 09:07:24 np0005626463.localdomain sudo[116876]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:07:24 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36203 DF PROTO=TCP SPT=35118 DPT=9102 SEQ=663757124 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE03B3F0000000001030307) 
Feb 23 09:07:24 np0005626463.localdomain python3.9[116878]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:07:24 np0005626463.localdomain sudo[116876]: pam_unix(sudo:session): session closed for user root
Feb 23 09:07:24 np0005626463.localdomain sudo[116968]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fqxmbncizhoiopgqcxajwcmsazwayynx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837644.352066-1008-194611920236087/AnsiballZ_file.py
Feb 23 09:07:24 np0005626463.localdomain sudo[116968]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:07:24 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19755 DF PROTO=TCP SPT=60408 DPT=9882 SEQ=2142405796 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE03E060000000001030307) 
Feb 23 09:07:24 np0005626463.localdomain python3.9[116970]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:07:24 np0005626463.localdomain sudo[116968]: pam_unix(sudo:session): session closed for user root
Feb 23 09:07:25 np0005626463.localdomain sudo[117060]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yoxzylklwjtubnhstmmmoibylnmuuqre ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837645.0914564-1008-234377670744701/AnsiballZ_file.py
Feb 23 09:07:25 np0005626463.localdomain sudo[117060]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:07:25 np0005626463.localdomain python3.9[117062]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:07:25 np0005626463.localdomain sudo[117060]: pam_unix(sudo:session): session closed for user root
Feb 23 09:07:25 np0005626463.localdomain sudo[117152]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bywivunmqchzdeiyininylxxwcdjtemz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837645.7297184-1008-93760161748222/AnsiballZ_file.py
Feb 23 09:07:25 np0005626463.localdomain sudo[117152]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:07:26 np0005626463.localdomain python3.9[117154]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud_recover.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:07:26 np0005626463.localdomain sudo[117152]: pam_unix(sudo:session): session closed for user root
Feb 23 09:07:26 np0005626463.localdomain sudo[117244]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wxcaajvpdinllcgjqixojrzsxsgibspp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837646.2979-1008-248878444538474/AnsiballZ_file.py
Feb 23 09:07:26 np0005626463.localdomain sudo[117244]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:07:26 np0005626463.localdomain python3.9[117246]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:07:26 np0005626463.localdomain sudo[117244]: pam_unix(sudo:session): session closed for user root
Feb 23 09:07:27 np0005626463.localdomain sudo[117336]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mcbigqhtlfueworjcsybbzwurewkiyok ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837646.9092436-1008-24388253467718/AnsiballZ_file.py
Feb 23 09:07:27 np0005626463.localdomain sudo[117336]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:07:27 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36205 DF PROTO=TCP SPT=35118 DPT=9102 SEQ=663757124 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE047470000000001030307) 
Feb 23 09:07:27 np0005626463.localdomain python3.9[117338]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:07:27 np0005626463.localdomain sudo[117336]: pam_unix(sudo:session): session closed for user root
Feb 23 09:07:27 np0005626463.localdomain sudo[117428]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mtzivjwyihxgxxqenridoixfqhzcswkg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837647.5046957-1008-112275294823491/AnsiballZ_file.py
Feb 23 09:07:27 np0005626463.localdomain sudo[117428]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:07:27 np0005626463.localdomain python3.9[117430]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_ovn_controller.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:07:28 np0005626463.localdomain sudo[117428]: pam_unix(sudo:session): session closed for user root
Feb 23 09:07:28 np0005626463.localdomain sudo[117520]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bmdbtadjejypucthqqnbupcdxqaoyuqq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837648.116239-1008-205314570076265/AnsiballZ_file.py
Feb 23 09:07:28 np0005626463.localdomain sudo[117520]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:07:28 np0005626463.localdomain python3.9[117522]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_ovn_metadata_agent.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:07:28 np0005626463.localdomain sudo[117520]: pam_unix(sudo:session): session closed for user root
Feb 23 09:07:28 np0005626463.localdomain sudo[117612]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tnmcvopnblbeobsuskvibtrhtsbxdgdg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837648.7453017-1008-257511131605844/AnsiballZ_file.py
Feb 23 09:07:28 np0005626463.localdomain sudo[117612]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:07:29 np0005626463.localdomain python3.9[117614]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_rsyslog.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:07:29 np0005626463.localdomain sudo[117612]: pam_unix(sudo:session): session closed for user root
Feb 23 09:07:30 np0005626463.localdomain sudo[117704]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cfkvkeqkrpltafxsgiqtbvvgvkjqqjcs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837649.4882452-1455-196235427154897/AnsiballZ_command.py
Feb 23 09:07:30 np0005626463.localdomain sudo[117704]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:07:30 np0005626463.localdomain python3.9[117706]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                                              systemctl disable --now certmonger.service
                                                              test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                                            fi
                                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 09:07:30 np0005626463.localdomain sudo[117704]: pam_unix(sudo:session): session closed for user root
Feb 23 09:07:30 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38180 DF PROTO=TCP SPT=56580 DPT=9100 SEQ=3542714595 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE055C60000000001030307) 
Feb 23 09:07:31 np0005626463.localdomain python3.9[117798]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Feb 23 09:07:31 np0005626463.localdomain sudo[117888]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dryjadlerdxhzdqvfqtzirfakdnbvkgv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837651.294864-1509-98728506101348/AnsiballZ_systemd_service.py
Feb 23 09:07:31 np0005626463.localdomain sudo[117888]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:07:31 np0005626463.localdomain python3.9[117890]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 23 09:07:31 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 09:07:31 np0005626463.localdomain systemd-rc-local-generator[117916]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 09:07:31 np0005626463.localdomain systemd-sysv-generator[117919]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 09:07:32 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 09:07:32 np0005626463.localdomain sudo[117888]: pam_unix(sudo:session): session closed for user root
Feb 23 09:07:32 np0005626463.localdomain sshd[117953]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:07:32 np0005626463.localdomain sudo[118017]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hxbhtobyuahfehmmnvqrzhffvroajqze ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837652.5063562-1533-232310033774027/AnsiballZ_command.py
Feb 23 09:07:32 np0005626463.localdomain sudo[118017]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:07:32 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38181 DF PROTO=TCP SPT=56580 DPT=9100 SEQ=3542714595 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE05DC60000000001030307) 
Feb 23 09:07:33 np0005626463.localdomain python3.9[118019]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_ceilometer_agent_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 09:07:33 np0005626463.localdomain sudo[118017]: pam_unix(sudo:session): session closed for user root
Feb 23 09:07:33 np0005626463.localdomain sudo[118111]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-prnfzdbfgigxehmitxmoqmuovqcujtxb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837653.1666844-1533-75229873441934/AnsiballZ_command.py
Feb 23 09:07:33 np0005626463.localdomain sudo[118111]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:07:33 np0005626463.localdomain sshd[117953]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:07:33 np0005626463.localdomain python3.9[118113]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_ceilometer_agent_ipmi.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 09:07:33 np0005626463.localdomain sudo[118111]: pam_unix(sudo:session): session closed for user root
Feb 23 09:07:34 np0005626463.localdomain sudo[118204]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-trgynhebywhndhldrjbknccufldquoiu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837653.8225927-1533-244374634245704/AnsiballZ_command.py
Feb 23 09:07:34 np0005626463.localdomain sudo[118204]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:07:34 np0005626463.localdomain python3.9[118206]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_collectd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 09:07:34 np0005626463.localdomain sudo[118204]: pam_unix(sudo:session): session closed for user root
Feb 23 09:07:34 np0005626463.localdomain sudo[118297]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xxynthfpgaewdypmjstmpfpjoxpjznvy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837654.3925924-1533-161732395749342/AnsiballZ_command.py
Feb 23 09:07:34 np0005626463.localdomain sudo[118297]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:07:34 np0005626463.localdomain python3.9[118299]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_iscsid.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 09:07:34 np0005626463.localdomain sudo[118297]: pam_unix(sudo:session): session closed for user root
Feb 23 09:07:35 np0005626463.localdomain sudo[118390]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gxkxwyjgysddbjqskivxyxprfroyurbe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837654.9618685-1533-19626764527777/AnsiballZ_command.py
Feb 23 09:07:35 np0005626463.localdomain sudo[118390]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:07:35 np0005626463.localdomain python3.9[118392]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_logrotate_crond.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 09:07:35 np0005626463.localdomain sudo[118390]: pam_unix(sudo:session): session closed for user root
Feb 23 09:07:35 np0005626463.localdomain sudo[118483]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jrlvsqjbzqqpyokvnaucbzaihzedrttb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837655.7017176-1533-221367182984884/AnsiballZ_command.py
Feb 23 09:07:35 np0005626463.localdomain sudo[118483]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:07:36 np0005626463.localdomain python3.9[118485]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_metrics_qdr.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 09:07:36 np0005626463.localdomain sudo[118483]: pam_unix(sudo:session): session closed for user root
Feb 23 09:07:36 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=65507 DF PROTO=TCP SPT=41786 DPT=9101 SEQ=1812879918 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE06A860000000001030307) 
Feb 23 09:07:36 np0005626463.localdomain sudo[118576]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gyueodvsyqgaxpnyxqvvizbhzoyxkzlg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837656.2193415-1533-8052477529396/AnsiballZ_command.py
Feb 23 09:07:36 np0005626463.localdomain sudo[118576]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:07:36 np0005626463.localdomain python3.9[118578]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_neutron_dhcp.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 09:07:36 np0005626463.localdomain sudo[118576]: pam_unix(sudo:session): session closed for user root
Feb 23 09:07:36 np0005626463.localdomain sudo[118669]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-myeucqqroeppmlvzyuasrwyoddakzbar ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837656.7285645-1533-145872326967406/AnsiballZ_command.py
Feb 23 09:07:36 np0005626463.localdomain sudo[118669]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:07:37 np0005626463.localdomain python3.9[118671]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_neutron_l3_agent.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 09:07:37 np0005626463.localdomain sudo[118669]: pam_unix(sudo:session): session closed for user root
Feb 23 09:07:37 np0005626463.localdomain sudo[118762]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nimknkogwduwhnugfepbpgznlrvlruki ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837657.2836106-1533-52395247052706/AnsiballZ_command.py
Feb 23 09:07:37 np0005626463.localdomain sudo[118762]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:07:37 np0005626463.localdomain python3.9[118764]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_neutron_ovs_agent.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 09:07:37 np0005626463.localdomain sudo[118762]: pam_unix(sudo:session): session closed for user root
Feb 23 09:07:38 np0005626463.localdomain sudo[118855]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-imidtahkujfmakoyxwxcinpcacspekkq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837657.819314-1533-23822642111242/AnsiballZ_command.py
Feb 23 09:07:38 np0005626463.localdomain sudo[118855]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:07:38 np0005626463.localdomain python3.9[118857]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 09:07:38 np0005626463.localdomain sudo[118855]: pam_unix(sudo:session): session closed for user root
Feb 23 09:07:38 np0005626463.localdomain sudo[118948]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xyvmhsmdbmgqupimbeqrhlwygxfjqdjb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837658.361281-1533-185899383212249/AnsiballZ_command.py
Feb 23 09:07:38 np0005626463.localdomain sudo[118948]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:07:38 np0005626463.localdomain python3.9[118950]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_migration_target.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 09:07:38 np0005626463.localdomain sudo[118948]: pam_unix(sudo:session): session closed for user root
Feb 23 09:07:39 np0005626463.localdomain sudo[119041]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mpeuhkfxwosnovqewpcoulnsbnxtkpfo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837658.87117-1533-253713037306428/AnsiballZ_command.py
Feb 23 09:07:39 np0005626463.localdomain sudo[119041]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:07:39 np0005626463.localdomain python3.9[119043]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtlogd_wrapper.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 09:07:39 np0005626463.localdomain sudo[119041]: pam_unix(sudo:session): session closed for user root
Feb 23 09:07:39 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3526 DF PROTO=TCP SPT=57328 DPT=9882 SEQ=1680368305 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE076F30000000001030307) 
Feb 23 09:07:39 np0005626463.localdomain sudo[119059]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 09:07:39 np0005626463.localdomain sudo[119059]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:07:39 np0005626463.localdomain sudo[119059]: pam_unix(sudo:session): session closed for user root
Feb 23 09:07:39 np0005626463.localdomain sudo[119103]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 23 09:07:39 np0005626463.localdomain sudo[119103]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:07:39 np0005626463.localdomain sudo[119164]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pnkenbgstkolhlyhgqwefhwborvrvzyf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837659.4545057-1533-144406552823099/AnsiballZ_command.py
Feb 23 09:07:39 np0005626463.localdomain sudo[119164]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:07:39 np0005626463.localdomain python3.9[119166]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtnodedevd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 09:07:39 np0005626463.localdomain sudo[119164]: pam_unix(sudo:session): session closed for user root
Feb 23 09:07:40 np0005626463.localdomain sudo[119103]: pam_unix(sudo:session): session closed for user root
Feb 23 09:07:40 np0005626463.localdomain sudo[119290]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gzfrafmtmgiivkrihbzervykslbhpwxn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837660.0493524-1533-239008106000175/AnsiballZ_command.py
Feb 23 09:07:40 np0005626463.localdomain sudo[119290]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:07:40 np0005626463.localdomain python3.9[119292]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtproxyd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 09:07:40 np0005626463.localdomain sudo[119290]: pam_unix(sudo:session): session closed for user root
Feb 23 09:07:41 np0005626463.localdomain sudo[119383]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xlepyxtnpopgsbavwlbauyvywhumxtgx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837660.6223998-1533-245725982308066/AnsiballZ_command.py
Feb 23 09:07:41 np0005626463.localdomain sudo[119383]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:07:41 np0005626463.localdomain python3.9[119385]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtqemud.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 09:07:41 np0005626463.localdomain sudo[119383]: pam_unix(sudo:session): session closed for user root
Feb 23 09:07:41 np0005626463.localdomain sudo[119476]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-naxqcommxjvhpplggcypnksfsyhlvszi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837661.4361372-1533-42899178429631/AnsiballZ_command.py
Feb 23 09:07:41 np0005626463.localdomain sudo[119476]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:07:41 np0005626463.localdomain python3.9[119478]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtqemud_recover.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 09:07:41 np0005626463.localdomain sudo[119476]: pam_unix(sudo:session): session closed for user root
Feb 23 09:07:42 np0005626463.localdomain sudo[119569]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mkwimrsaxzkgjwmiwzoeseknmdtlgpiz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837662.0439987-1533-164820735892252/AnsiballZ_command.py
Feb 23 09:07:42 np0005626463.localdomain sudo[119569]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:07:42 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3528 DF PROTO=TCP SPT=57328 DPT=9882 SEQ=1680368305 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE083060000000001030307) 
Feb 23 09:07:42 np0005626463.localdomain python3.9[119571]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtsecretd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 09:07:42 np0005626463.localdomain sudo[119569]: pam_unix(sudo:session): session closed for user root
Feb 23 09:07:42 np0005626463.localdomain sudo[119662]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-emiafqkcvqqlsvwvhrqbcjckntocobxw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837662.6188722-1533-65812968171949/AnsiballZ_command.py
Feb 23 09:07:42 np0005626463.localdomain sudo[119662]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:07:43 np0005626463.localdomain python3.9[119664]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtstoraged.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 09:07:43 np0005626463.localdomain sudo[119662]: pam_unix(sudo:session): session closed for user root
Feb 23 09:07:43 np0005626463.localdomain sudo[119755]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-viiogmydsgtoapmtoqlcmowojsnopezf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837663.2389612-1533-85402298311575/AnsiballZ_command.py
Feb 23 09:07:43 np0005626463.localdomain sudo[119755]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:07:43 np0005626463.localdomain python3.9[119757]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_ovn_controller.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 09:07:43 np0005626463.localdomain sudo[119755]: pam_unix(sudo:session): session closed for user root
Feb 23 09:07:44 np0005626463.localdomain sudo[119818]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 09:07:44 np0005626463.localdomain sudo[119818]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:07:44 np0005626463.localdomain sudo[119818]: pam_unix(sudo:session): session closed for user root
Feb 23 09:07:44 np0005626463.localdomain sudo[119863]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-egcmactcdarkbvfywxiqhwowtfbxvkyc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837663.8607972-1533-151823699467038/AnsiballZ_command.py
Feb 23 09:07:44 np0005626463.localdomain sudo[119863]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:07:44 np0005626463.localdomain python3.9[119865]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_ovn_metadata_agent.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 09:07:44 np0005626463.localdomain sudo[119863]: pam_unix(sudo:session): session closed for user root
Feb 23 09:07:44 np0005626463.localdomain sudo[119956]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nwujdmsjkdvhvvcsgsejbpdfqmgtaiie ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837664.444437-1533-71032897184172/AnsiballZ_command.py
Feb 23 09:07:44 np0005626463.localdomain sudo[119956]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:07:44 np0005626463.localdomain python3.9[119958]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_rsyslog.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 09:07:44 np0005626463.localdomain sudo[119956]: pam_unix(sudo:session): session closed for user root
Feb 23 09:07:45 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38183 DF PROTO=TCP SPT=56580 DPT=9100 SEQ=3542714595 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE08E060000000001030307) 
Feb 23 09:07:46 np0005626463.localdomain sshd[107709]: pam_unix(sshd:session): session closed for user zuul
Feb 23 09:07:46 np0005626463.localdomain systemd[1]: session-37.scope: Deactivated successfully.
Feb 23 09:07:46 np0005626463.localdomain systemd[1]: session-37.scope: Consumed 50.506s CPU time.
Feb 23 09:07:46 np0005626463.localdomain systemd-logind[759]: Session 37 logged out. Waiting for processes to exit.
Feb 23 09:07:46 np0005626463.localdomain systemd-logind[759]: Removed session 37.
Feb 23 09:07:48 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=65509 DF PROTO=TCP SPT=41786 DPT=9101 SEQ=1812879918 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE09A060000000001030307) 
Feb 23 09:07:48 np0005626463.localdomain sshd[119974]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:07:48 np0005626463.localdomain sshd[119974]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:07:54 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17603 DF PROTO=TCP SPT=55448 DPT=9102 SEQ=3267799634 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE0B06F0000000001030307) 
Feb 23 09:07:55 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3530 DF PROTO=TCP SPT=57328 DPT=9882 SEQ=1680368305 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE0B4070000000001030307) 
Feb 23 09:07:57 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17605 DF PROTO=TCP SPT=55448 DPT=9102 SEQ=3267799634 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE0BC860000000001030307) 
Feb 23 09:08:00 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39098 DF PROTO=TCP SPT=56310 DPT=9100 SEQ=1508328302 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE0CB060000000001030307) 
Feb 23 09:08:02 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39099 DF PROTO=TCP SPT=56310 DPT=9100 SEQ=1508328302 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE0D3070000000001030307) 
Feb 23 09:08:06 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22098 DF PROTO=TCP SPT=46158 DPT=9101 SEQ=4033996518 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE0DFC70000000001030307) 
Feb 23 09:08:09 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17607 DF PROTO=TCP SPT=55448 DPT=9102 SEQ=3267799634 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE0EC060000000001030307) 
Feb 23 09:08:12 np0005626463.localdomain sshd[119976]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:08:12 np0005626463.localdomain sshd[119976]: Accepted publickey for zuul from 192.168.122.30 port 43648 ssh2: RSA SHA256:/ShS2J5Dq7o9P59e/NmgQORSAcJOBwu46Huo03HBdB4
Feb 23 09:08:12 np0005626463.localdomain systemd-logind[759]: New session 38 of user zuul.
Feb 23 09:08:12 np0005626463.localdomain systemd[1]: Started Session 38 of User zuul.
Feb 23 09:08:12 np0005626463.localdomain sshd[119976]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Feb 23 09:08:12 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22274 DF PROTO=TCP SPT=45906 DPT=9882 SEQ=817962013 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE0F8460000000001030307) 
Feb 23 09:08:12 np0005626463.localdomain python3.9[120069]: ansible-ansible.legacy.ping Invoked with data=pong
Feb 23 09:08:13 np0005626463.localdomain python3.9[120173]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 23 09:08:14 np0005626463.localdomain sudo[120263]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mzvhaatkqstlbaeyydeydmpudfqadxse ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837694.2852283-88-123165831819242/AnsiballZ_command.py
Feb 23 09:08:14 np0005626463.localdomain sudo[120263]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:08:14 np0005626463.localdomain python3.9[120265]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols
                                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 09:08:14 np0005626463.localdomain sudo[120263]: pam_unix(sudo:session): session closed for user root
Feb 23 09:08:14 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58633 DF PROTO=TCP SPT=55134 DPT=9105 SEQ=2418348777 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE102060000000001030307) 
Feb 23 09:08:15 np0005626463.localdomain sshd[120281]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:08:15 np0005626463.localdomain sshd[120281]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:08:15 np0005626463.localdomain sudo[120358]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qwwqvtzkzljhdijoatrbapeizwfisrwe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837695.331153-124-7192857000497/AnsiballZ_stat.py
Feb 23 09:08:15 np0005626463.localdomain sudo[120358]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:08:16 np0005626463.localdomain python3.9[120360]: ansible-ansible.builtin.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 23 09:08:16 np0005626463.localdomain sudo[120358]: pam_unix(sudo:session): session closed for user root
Feb 23 09:08:16 np0005626463.localdomain sudo[120450]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lkuzfhoqwjcpvthoqbiguhmpqiubtgqc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837696.294642-148-111167283575789/AnsiballZ_file.py
Feb 23 09:08:16 np0005626463.localdomain sudo[120450]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:08:16 np0005626463.localdomain python3.9[120452]: ansible-ansible.builtin.file Invoked with mode=755 path=/etc/ansible/facts.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:08:16 np0005626463.localdomain sudo[120450]: pam_unix(sudo:session): session closed for user root
Feb 23 09:08:17 np0005626463.localdomain sudo[120542]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-izahhlayxngydixscskirraesbvpnkzo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837697.1066446-172-217122759841424/AnsiballZ_stat.py
Feb 23 09:08:17 np0005626463.localdomain sudo[120542]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:08:17 np0005626463.localdomain python3.9[120544]: ansible-ansible.legacy.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:08:17 np0005626463.localdomain sudo[120542]: pam_unix(sudo:session): session closed for user root
Feb 23 09:08:18 np0005626463.localdomain sudo[120615]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fvjfsarymfangjycouopbbbeylbynwra ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837697.1066446-172-217122759841424/AnsiballZ_copy.py
Feb 23 09:08:18 np0005626463.localdomain sudo[120615]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:08:18 np0005626463.localdomain python3.9[120617]: ansible-ansible.legacy.copy Invoked with dest=/etc/ansible/facts.d/bootc.fact mode=755 src=/home/zuul/.ansible/tmp/ansible-tmp-1771837697.1066446-172-217122759841424/.source.fact _original_basename=bootc.fact follow=False checksum=eb4122ce7fc50a38407beb511c4ff8c178005b12 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:08:18 np0005626463.localdomain sudo[120615]: pam_unix(sudo:session): session closed for user root
Feb 23 09:08:18 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22100 DF PROTO=TCP SPT=46158 DPT=9101 SEQ=4033996518 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE110060000000001030307) 
Feb 23 09:08:18 np0005626463.localdomain sudo[120707]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jprkoycmxszaypgsvlvhckzejsmutvqa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837698.4556909-217-144815971394365/AnsiballZ_setup.py
Feb 23 09:08:18 np0005626463.localdomain sudo[120707]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:08:19 np0005626463.localdomain python3.9[120709]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 23 09:08:19 np0005626463.localdomain sudo[120707]: pam_unix(sudo:session): session closed for user root
Feb 23 09:08:19 np0005626463.localdomain sudo[120803]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mxmgovfizzddzphlqaymwxueszdxtbkz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837699.5121117-241-37416754822455/AnsiballZ_file.py
Feb 23 09:08:19 np0005626463.localdomain sudo[120803]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:08:20 np0005626463.localdomain python3.9[120805]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/log/journal setype=var_log_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 23 09:08:20 np0005626463.localdomain sudo[120803]: pam_unix(sudo:session): session closed for user root
Feb 23 09:08:20 np0005626463.localdomain sudo[120895]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eowftgotofblcrnbhngirbyaaablizhq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837700.2114415-268-85079251384407/AnsiballZ_file.py
Feb 23 09:08:20 np0005626463.localdomain sudo[120895]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:08:20 np0005626463.localdomain python3.9[120897]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/config-data/ansible-generated recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 23 09:08:20 np0005626463.localdomain sudo[120895]: pam_unix(sudo:session): session closed for user root
Feb 23 09:08:21 np0005626463.localdomain python3.9[120987]: ansible-ansible.builtin.service_facts Invoked
Feb 23 09:08:21 np0005626463.localdomain network[121004]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Feb 23 09:08:21 np0005626463.localdomain network[121005]: 'network-scripts' will be removed from distribution in near future.
Feb 23 09:08:21 np0005626463.localdomain network[121006]: It is advised to switch to 'NetworkManager' instead for network management.
Feb 23 09:08:22 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 09:08:24 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46956 DF PROTO=TCP SPT=33590 DPT=9102 SEQ=1229274110 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE125A00000000001030307) 
Feb 23 09:08:24 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22276 DF PROTO=TCP SPT=45906 DPT=9882 SEQ=817962013 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE128060000000001030307) 
Feb 23 09:08:25 np0005626463.localdomain python3.9[121203]: ansible-ansible.builtin.lineinfile Invoked with line=cloud-init=disabled path=/proc/cmdline state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:08:26 np0005626463.localdomain sshd[121294]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:08:26 np0005626463.localdomain python3.9[121293]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 23 09:08:26 np0005626463.localdomain sshd[121294]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:08:27 np0005626463.localdomain sudo[121389]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xaharywkoorlynylloiipklgwmfhmjgf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837706.9023304-370-52186637215016/AnsiballZ_command.py
Feb 23 09:08:27 np0005626463.localdomain sudo[121389]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:08:27 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46958 DF PROTO=TCP SPT=33590 DPT=9102 SEQ=1229274110 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE131C70000000001030307) 
Feb 23 09:08:27 np0005626463.localdomain python3.9[121391]: ansible-ansible.legacy.command Invoked with _raw_params=# This is a hack to deploy RDO Delorean repos to RHEL as if it were Centos 9 Stream
                                                            set -euxo pipefail
                                                            curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz
                                                            python3 -m venv ./venv
                                                            PBR_VERSION=0.0.0 ./venv/bin/pip install ./repo-setup-main
                                                            # This is required for FIPS enabled until trunk.rdoproject.org
                                                            # is not being served from a centos7 host, tracked by
                                                            # https://issues.redhat.com/browse/RHOSZUUL-1517
                                                            dnf -y install crypto-policies
                                                            update-crypto-policies --set FIPS:NO-ENFORCE-EMS
                                                            ./venv/bin/repo-setup current-podified -b antelope -d centos9 --stream
                                                            
                                                            # Exclude ceph-common-18.2.7 as it's pulling newer openssl not compatible
                                                            # with rhel 9.2 openssh
                                                            dnf config-manager --setopt centos9-storage.exclude="ceph-common-18.2.7" --save
                                                            dnf -y upgrade openstack-selinux
                                                            rm -f /run/virtlogd.pid
                                                            
                                                            rm -rf repo-setup-main
                                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 09:08:30 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5322 DF PROTO=TCP SPT=52164 DPT=9100 SEQ=733606790 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE140060000000001030307) 
Feb 23 09:08:32 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5323 DF PROTO=TCP SPT=52164 DPT=9100 SEQ=733606790 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE148060000000001030307) 
Feb 23 09:08:36 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21862 DF PROTO=TCP SPT=43206 DPT=9101 SEQ=3425392840 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE155060000000001030307) 
Feb 23 09:08:37 np0005626463.localdomain sshd[45514]: Received signal 15; terminating.
Feb 23 09:08:37 np0005626463.localdomain systemd[1]: Stopping OpenSSH server daemon...
Feb 23 09:08:37 np0005626463.localdomain systemd[1]: sshd.service: Deactivated successfully.
Feb 23 09:08:37 np0005626463.localdomain systemd[1]: Stopped OpenSSH server daemon.
Feb 23 09:08:37 np0005626463.localdomain systemd[1]: sshd.service: Consumed 13.222s CPU time.
Feb 23 09:08:37 np0005626463.localdomain systemd[1]: Stopped target sshd-keygen.target.
Feb 23 09:08:37 np0005626463.localdomain systemd[1]: Stopping sshd-keygen.target...
Feb 23 09:08:37 np0005626463.localdomain systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Feb 23 09:08:37 np0005626463.localdomain systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Feb 23 09:08:37 np0005626463.localdomain systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Feb 23 09:08:37 np0005626463.localdomain systemd[1]: Reached target sshd-keygen.target.
Feb 23 09:08:37 np0005626463.localdomain systemd[1]: Starting OpenSSH server daemon...
Feb 23 09:08:37 np0005626463.localdomain sshd[121434]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:08:37 np0005626463.localdomain sshd[121434]: Server listening on 0.0.0.0 port 22.
Feb 23 09:08:37 np0005626463.localdomain sshd[121434]: Server listening on :: port 22.
Feb 23 09:08:37 np0005626463.localdomain systemd[1]: Started OpenSSH server daemon.
Feb 23 09:08:37 np0005626463.localdomain systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 23 09:08:37 np0005626463.localdomain systemd[1]: Starting man-db-cache-update.service...
Feb 23 09:08:37 np0005626463.localdomain systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 23 09:08:38 np0005626463.localdomain systemd[1]: man-db-cache-update.service: Deactivated successfully.
Feb 23 09:08:38 np0005626463.localdomain systemd[1]: Finished man-db-cache-update.service.
Feb 23 09:08:38 np0005626463.localdomain systemd[1]: run-r565896e9ac224709961ce98512c68dcf.service: Deactivated successfully.
Feb 23 09:08:38 np0005626463.localdomain systemd[1]: run-r7bf820a03f9a4132b58b917aec5b33a4.service: Deactivated successfully.
Feb 23 09:08:38 np0005626463.localdomain systemd[1]: Stopping OpenSSH server daemon...
Feb 23 09:08:38 np0005626463.localdomain sshd[121434]: Received signal 15; terminating.
Feb 23 09:08:38 np0005626463.localdomain systemd[1]: sshd.service: Deactivated successfully.
Feb 23 09:08:38 np0005626463.localdomain systemd[1]: Stopped OpenSSH server daemon.
Feb 23 09:08:38 np0005626463.localdomain systemd[1]: Stopped target sshd-keygen.target.
Feb 23 09:08:38 np0005626463.localdomain systemd[1]: Stopping sshd-keygen.target...
Feb 23 09:08:38 np0005626463.localdomain systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Feb 23 09:08:38 np0005626463.localdomain systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Feb 23 09:08:38 np0005626463.localdomain systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Feb 23 09:08:38 np0005626463.localdomain systemd[1]: Reached target sshd-keygen.target.
Feb 23 09:08:38 np0005626463.localdomain systemd[1]: Starting OpenSSH server daemon...
Feb 23 09:08:38 np0005626463.localdomain sshd[122114]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:08:38 np0005626463.localdomain sshd[122114]: Server listening on 0.0.0.0 port 22.
Feb 23 09:08:38 np0005626463.localdomain sshd[122114]: Server listening on :: port 22.
Feb 23 09:08:38 np0005626463.localdomain systemd[1]: Started OpenSSH server daemon.
Feb 23 09:08:39 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37809 DF PROTO=TCP SPT=37182 DPT=9882 SEQ=1659102619 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE161560000000001030307) 
Feb 23 09:08:42 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37811 DF PROTO=TCP SPT=37182 DPT=9882 SEQ=1659102619 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE16D470000000001030307) 
Feb 23 09:08:44 np0005626463.localdomain sudo[122120]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 09:08:44 np0005626463.localdomain sudo[122120]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:08:44 np0005626463.localdomain sudo[122120]: pam_unix(sudo:session): session closed for user root
Feb 23 09:08:44 np0005626463.localdomain sudo[122135]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 23 09:08:44 np0005626463.localdomain sudo[122135]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:08:44 np0005626463.localdomain sudo[122135]: pam_unix(sudo:session): session closed for user root
Feb 23 09:08:45 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5325 DF PROTO=TCP SPT=52164 DPT=9100 SEQ=733606790 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE178060000000001030307) 
Feb 23 09:08:45 np0005626463.localdomain sudo[122181]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 09:08:45 np0005626463.localdomain sudo[122181]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:08:45 np0005626463.localdomain sudo[122181]: pam_unix(sudo:session): session closed for user root
Feb 23 09:08:45 np0005626463.localdomain sudo[122196]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 list-networks
Feb 23 09:08:45 np0005626463.localdomain sudo[122196]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:08:45 np0005626463.localdomain sudo[122196]: pam_unix(sudo:session): session closed for user root
Feb 23 09:08:48 np0005626463.localdomain sudo[122280]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 09:08:48 np0005626463.localdomain sudo[122280]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:08:48 np0005626463.localdomain sudo[122280]: pam_unix(sudo:session): session closed for user root
Feb 23 09:08:48 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21864 DF PROTO=TCP SPT=43206 DPT=9101 SEQ=3425392840 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE186060000000001030307) 
Feb 23 09:08:54 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42056 DF PROTO=TCP SPT=58116 DPT=9102 SEQ=1398528663 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE19AD00000000001030307) 
Feb 23 09:08:54 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37813 DF PROTO=TCP SPT=37182 DPT=9882 SEQ=1659102619 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE19E070000000001030307) 
Feb 23 09:08:57 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42058 DF PROTO=TCP SPT=58116 DPT=9102 SEQ=1398528663 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE1A6C60000000001030307) 
Feb 23 09:08:57 np0005626463.localdomain sshd[122339]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:09:00 np0005626463.localdomain sshd[122339]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:09:00 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23207 DF PROTO=TCP SPT=43006 DPT=9100 SEQ=3219387706 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE1B5470000000001030307) 
Feb 23 09:09:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb 23 09:09:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 5400.1 total, 600.0 interval
                                                          Cumulative writes: 5152 writes, 23K keys, 5152 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s
                                                          Cumulative WAL: 5152 writes, 679 syncs, 7.59 writes per sync, written: 0.02 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                          Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Feb 23 09:09:02 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23208 DF PROTO=TCP SPT=43006 DPT=9100 SEQ=3219387706 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE1BD460000000001030307) 
Feb 23 09:09:04 np0005626463.localdomain sshd[122357]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:09:04 np0005626463.localdomain sshd[122357]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:09:06 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7286 DF PROTO=TCP SPT=37554 DPT=9101 SEQ=2171848523 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE1CA470000000001030307) 
Feb 23 09:09:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb 23 09:09:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 5400.1 total, 600.0 interval
                                                          Cumulative writes: 5421 writes, 24K keys, 5421 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s
                                                          Cumulative WAL: 5421 writes, 705 syncs, 7.69 writes per sync, written: 0.02 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                          Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Feb 23 09:09:09 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42060 DF PROTO=TCP SPT=58116 DPT=9102 SEQ=1398528663 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE1D6060000000001030307) 
Feb 23 09:09:12 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34131 DF PROTO=TCP SPT=43926 DPT=9882 SEQ=210262696 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE1E2870000000001030307) 
Feb 23 09:09:15 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23210 DF PROTO=TCP SPT=43006 DPT=9100 SEQ=3219387706 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE1EE060000000001030307) 
Feb 23 09:09:18 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7288 DF PROTO=TCP SPT=37554 DPT=9101 SEQ=2171848523 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE1FA070000000001030307) 
Feb 23 09:09:24 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17243 DF PROTO=TCP SPT=48718 DPT=9102 SEQ=1691152045 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE210000000000001030307) 
Feb 23 09:09:24 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34133 DF PROTO=TCP SPT=43926 DPT=9882 SEQ=210262696 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE212060000000001030307) 
Feb 23 09:09:27 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17245 DF PROTO=TCP SPT=48718 DPT=9102 SEQ=1691152045 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE21C060000000001030307) 
Feb 23 09:09:30 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11673 DF PROTO=TCP SPT=60216 DPT=9100 SEQ=2551074236 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE22A860000000001030307) 
Feb 23 09:09:32 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11674 DF PROTO=TCP SPT=60216 DPT=9100 SEQ=2551074236 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE232870000000001030307) 
Feb 23 09:09:36 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45617 DF PROTO=TCP SPT=47056 DPT=9101 SEQ=2913212424 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE23F470000000001030307) 
Feb 23 09:09:39 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28896 DF PROTO=TCP SPT=53872 DPT=9882 SEQ=927108265 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE24BB30000000001030307) 
Feb 23 09:09:41 np0005626463.localdomain sshd[122580]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:09:42 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28898 DF PROTO=TCP SPT=53872 DPT=9882 SEQ=927108265 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE257C60000000001030307) 
Feb 23 09:09:43 np0005626463.localdomain sshd[122589]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:09:43 np0005626463.localdomain sshd[122589]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:09:45 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17225 DF PROTO=TCP SPT=46882 DPT=9105 SEQ=1727784005 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE262060000000001030307) 
Feb 23 09:09:45 np0005626463.localdomain sshd[122580]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:09:48 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45619 DF PROTO=TCP SPT=47056 DPT=9101 SEQ=2913212424 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE270060000000001030307) 
Feb 23 09:09:48 np0005626463.localdomain sudo[122616]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 09:09:48 np0005626463.localdomain sudo[122616]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:09:48 np0005626463.localdomain sudo[122616]: pam_unix(sudo:session): session closed for user root
Feb 23 09:09:49 np0005626463.localdomain sudo[122631]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 23 09:09:49 np0005626463.localdomain sudo[122631]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:09:49 np0005626463.localdomain sudo[122631]: pam_unix(sudo:session): session closed for user root
Feb 23 09:09:50 np0005626463.localdomain sudo[122678]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 09:09:50 np0005626463.localdomain sudo[122678]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:09:50 np0005626463.localdomain sudo[122678]: pam_unix(sudo:session): session closed for user root
Feb 23 09:09:50 np0005626463.localdomain sudo[122693]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ceph-volume --fsid f1fea371-cb69-578d-a3d0-b5c472a84b46 -- inventory --format=json-pretty --filter-for-batch
Feb 23 09:09:50 np0005626463.localdomain sudo[122693]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:09:50 np0005626463.localdomain podman[122746]: 
Feb 23 09:09:50 np0005626463.localdomain podman[122746]: 2026-02-23 09:09:50.782506241 +0000 UTC m=+0.080340562 container create 73570b630d4d3ac82fd7c20c5053de8abe9ee63d056ea907e28575209beed056 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=priceless_ptolemy, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, version=7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, release=1770267347, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, org.opencontainers.image.created=2026-02-09T10:25:24Z, RELEASE=main, distribution-scope=public, vcs-type=git, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, build-date=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vendor=Red Hat, Inc., GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.42.2, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, io.openshift.expose-services=)
Feb 23 09:09:50 np0005626463.localdomain dbus-broker-launch[754]: avc:  op=load_policy lsm=selinux seqno=15 res=1
Feb 23 09:09:50 np0005626463.localdomain systemd[1]: Started libpod-conmon-73570b630d4d3ac82fd7c20c5053de8abe9ee63d056ea907e28575209beed056.scope.
Feb 23 09:09:50 np0005626463.localdomain podman[122746]: 2026-02-23 09:09:50.750025796 +0000 UTC m=+0.047860137 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 23 09:09:50 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 09:09:50 np0005626463.localdomain podman[122746]: 2026-02-23 09:09:50.86771132 +0000 UTC m=+0.165545641 container init 73570b630d4d3ac82fd7c20c5053de8abe9ee63d056ea907e28575209beed056 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=priceless_ptolemy, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, version=7, io.buildah.version=1.42.2, org.opencontainers.image.created=2026-02-09T10:25:24Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-type=git, CEPH_POINT_RELEASE=, io.openshift.expose-services=, build-date=2026-02-09T10:25:24Z, GIT_BRANCH=main, distribution-scope=public, io.openshift.tags=rhceph ceph, release=1770267347, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.component=rhceph-container, ceph=True, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main)
Feb 23 09:09:50 np0005626463.localdomain podman[122746]: 2026-02-23 09:09:50.878286774 +0000 UTC m=+0.176121065 container start 73570b630d4d3ac82fd7c20c5053de8abe9ee63d056ea907e28575209beed056 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=priceless_ptolemy, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, build-date=2026-02-09T10:25:24Z, architecture=x86_64, CEPH_POINT_RELEASE=, io.buildah.version=1.42.2, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, ceph=True, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1770267347, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, org.opencontainers.image.created=2026-02-09T10:25:24Z)
Feb 23 09:09:50 np0005626463.localdomain podman[122746]: 2026-02-23 09:09:50.878632815 +0000 UTC m=+0.176467136 container attach 73570b630d4d3ac82fd7c20c5053de8abe9ee63d056ea907e28575209beed056 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=priceless_ptolemy, GIT_BRANCH=main, name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=rhceph-container, io.buildah.version=1.42.2, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, release=1770267347, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.tags=rhceph ceph, build-date=2026-02-09T10:25:24Z, description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, architecture=x86_64, org.opencontainers.image.created=2026-02-09T10:25:24Z, distribution-scope=public, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Feb 23 09:09:50 np0005626463.localdomain priceless_ptolemy[122761]: 167 167
Feb 23 09:09:50 np0005626463.localdomain systemd[1]: libpod-73570b630d4d3ac82fd7c20c5053de8abe9ee63d056ea907e28575209beed056.scope: Deactivated successfully.
Feb 23 09:09:50 np0005626463.localdomain podman[122746]: 2026-02-23 09:09:50.883955048 +0000 UTC m=+0.181789389 container died 73570b630d4d3ac82fd7c20c5053de8abe9ee63d056ea907e28575209beed056 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=priceless_ptolemy, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, com.redhat.component=rhceph-container, distribution-scope=public, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, io.buildah.version=1.42.2, io.k8s.description=Red Hat Ceph Storage 7, build-date=2026-02-09T10:25:24Z, description=Red Hat Ceph Storage 7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, ceph=True, release=1770267347)
Feb 23 09:09:50 np0005626463.localdomain podman[122767]: 2026-02-23 09:09:50.981168466 +0000 UTC m=+0.082152517 container remove 73570b630d4d3ac82fd7c20c5053de8abe9ee63d056ea907e28575209beed056 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=priceless_ptolemy, architecture=x86_64, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, description=Red Hat Ceph Storage 7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, build-date=2026-02-09T10:25:24Z, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, release=1770267347, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, version=7, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_BRANCH=main, io.buildah.version=1.42.2, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., GIT_CLEAN=True, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph)
Feb 23 09:09:50 np0005626463.localdomain systemd[1]: libpod-conmon-73570b630d4d3ac82fd7c20c5053de8abe9ee63d056ea907e28575209beed056.scope: Deactivated successfully.
Feb 23 09:09:51 np0005626463.localdomain podman[122789]: 
Feb 23 09:09:51 np0005626463.localdomain podman[122789]: 2026-02-23 09:09:51.207196449 +0000 UTC m=+0.070952745 container create 3990832ba815c78eb74db3f29d9757eea04043cb6180ec5e8034d7f78fa6e013 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eloquent_sutherland, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.expose-services=, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2026-02-09T10:25:24Z, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.42.2, distribution-scope=public, vcs-type=git, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1770267347, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2026-02-09T10:25:24Z, CEPH_POINT_RELEASE=, GIT_CLEAN=True, RELEASE=main)
Feb 23 09:09:51 np0005626463.localdomain systemd[1]: Started libpod-conmon-3990832ba815c78eb74db3f29d9757eea04043cb6180ec5e8034d7f78fa6e013.scope.
Feb 23 09:09:51 np0005626463.localdomain podman[122789]: 2026-02-23 09:09:51.164309504 +0000 UTC m=+0.028065830 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 23 09:09:51 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 09:09:51 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/64f78aea2c8e70a7ac0cbc19bd4a96fb4d1b27dd765b2224b3fd1e2e1d95dd9f/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Feb 23 09:09:51 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/64f78aea2c8e70a7ac0cbc19bd4a96fb4d1b27dd765b2224b3fd1e2e1d95dd9f/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Feb 23 09:09:51 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/64f78aea2c8e70a7ac0cbc19bd4a96fb4d1b27dd765b2224b3fd1e2e1d95dd9f/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Feb 23 09:09:51 np0005626463.localdomain podman[122789]: 2026-02-23 09:09:51.27648362 +0000 UTC m=+0.140239906 container init 3990832ba815c78eb74db3f29d9757eea04043cb6180ec5e8034d7f78fa6e013 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eloquent_sutherland, RELEASE=main, CEPH_POINT_RELEASE=, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.42.2, ceph=True, com.redhat.component=rhceph-container, version=7, GIT_CLEAN=True, build-date=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1770267347, org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Feb 23 09:09:51 np0005626463.localdomain podman[122789]: 2026-02-23 09:09:51.290780328 +0000 UTC m=+0.154536614 container start 3990832ba815c78eb74db3f29d9757eea04043cb6180ec5e8034d7f78fa6e013 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eloquent_sutherland, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vendor=Red Hat, Inc., io.buildah.version=1.42.2, description=Red Hat Ceph Storage 7, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, distribution-scope=public, org.opencontainers.image.created=2026-02-09T10:25:24Z, release=1770267347, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2026-02-09T10:25:24Z, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, io.openshift.expose-services=)
Feb 23 09:09:51 np0005626463.localdomain podman[122789]: 2026-02-23 09:09:51.29115519 +0000 UTC m=+0.154911476 container attach 3990832ba815c78eb74db3f29d9757eea04043cb6180ec5e8034d7f78fa6e013 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eloquent_sutherland, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, ceph=True, release=1770267347, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, name=rhceph, description=Red Hat Ceph Storage 7, io.buildah.version=1.42.2, CEPH_POINT_RELEASE=, RELEASE=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2026-02-09T10:25:24Z, com.redhat.component=rhceph-container, GIT_CLEAN=True)
Feb 23 09:09:51 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-be90ae7fae200ac1f3bd074b36960bb9ba70b1e78bbd96c6196b1bd2f718c3f9-merged.mount: Deactivated successfully.
Feb 23 09:09:52 np0005626463.localdomain eloquent_sutherland[122806]: [
Feb 23 09:09:52 np0005626463.localdomain eloquent_sutherland[122806]:     {
Feb 23 09:09:52 np0005626463.localdomain eloquent_sutherland[122806]:         "available": false,
Feb 23 09:09:52 np0005626463.localdomain eloquent_sutherland[122806]:         "ceph_device": false,
Feb 23 09:09:52 np0005626463.localdomain eloquent_sutherland[122806]:         "device_id": "QEMU_DVD-ROM_QM00001",
Feb 23 09:09:52 np0005626463.localdomain eloquent_sutherland[122806]:         "lsm_data": {},
Feb 23 09:09:52 np0005626463.localdomain eloquent_sutherland[122806]:         "lvs": [],
Feb 23 09:09:52 np0005626463.localdomain eloquent_sutherland[122806]:         "path": "/dev/sr0",
Feb 23 09:09:52 np0005626463.localdomain eloquent_sutherland[122806]:         "rejected_reasons": [
Feb 23 09:09:52 np0005626463.localdomain eloquent_sutherland[122806]:             "Insufficient space (<5GB)",
Feb 23 09:09:52 np0005626463.localdomain eloquent_sutherland[122806]:             "Has a FileSystem"
Feb 23 09:09:52 np0005626463.localdomain eloquent_sutherland[122806]:         ],
Feb 23 09:09:52 np0005626463.localdomain eloquent_sutherland[122806]:         "sys_api": {
Feb 23 09:09:52 np0005626463.localdomain eloquent_sutherland[122806]:             "actuators": null,
Feb 23 09:09:52 np0005626463.localdomain eloquent_sutherland[122806]:             "device_nodes": "sr0",
Feb 23 09:09:52 np0005626463.localdomain eloquent_sutherland[122806]:             "human_readable_size": "482.00 KB",
Feb 23 09:09:52 np0005626463.localdomain eloquent_sutherland[122806]:             "id_bus": "ata",
Feb 23 09:09:52 np0005626463.localdomain eloquent_sutherland[122806]:             "model": "QEMU DVD-ROM",
Feb 23 09:09:52 np0005626463.localdomain eloquent_sutherland[122806]:             "nr_requests": "2",
Feb 23 09:09:52 np0005626463.localdomain eloquent_sutherland[122806]:             "partitions": {},
Feb 23 09:09:52 np0005626463.localdomain eloquent_sutherland[122806]:             "path": "/dev/sr0",
Feb 23 09:09:52 np0005626463.localdomain eloquent_sutherland[122806]:             "removable": "1",
Feb 23 09:09:52 np0005626463.localdomain eloquent_sutherland[122806]:             "rev": "2.5+",
Feb 23 09:09:52 np0005626463.localdomain eloquent_sutherland[122806]:             "ro": "0",
Feb 23 09:09:52 np0005626463.localdomain eloquent_sutherland[122806]:             "rotational": "1",
Feb 23 09:09:52 np0005626463.localdomain eloquent_sutherland[122806]:             "sas_address": "",
Feb 23 09:09:52 np0005626463.localdomain eloquent_sutherland[122806]:             "sas_device_handle": "",
Feb 23 09:09:52 np0005626463.localdomain eloquent_sutherland[122806]:             "scheduler_mode": "mq-deadline",
Feb 23 09:09:52 np0005626463.localdomain eloquent_sutherland[122806]:             "sectors": 0,
Feb 23 09:09:52 np0005626463.localdomain eloquent_sutherland[122806]:             "sectorsize": "2048",
Feb 23 09:09:52 np0005626463.localdomain eloquent_sutherland[122806]:             "size": 493568.0,
Feb 23 09:09:52 np0005626463.localdomain eloquent_sutherland[122806]:             "support_discard": "0",
Feb 23 09:09:52 np0005626463.localdomain eloquent_sutherland[122806]:             "type": "disk",
Feb 23 09:09:52 np0005626463.localdomain eloquent_sutherland[122806]:             "vendor": "QEMU"
Feb 23 09:09:52 np0005626463.localdomain eloquent_sutherland[122806]:         }
Feb 23 09:09:52 np0005626463.localdomain eloquent_sutherland[122806]:     }
Feb 23 09:09:52 np0005626463.localdomain eloquent_sutherland[122806]: ]
Feb 23 09:09:52 np0005626463.localdomain systemd[1]: libpod-3990832ba815c78eb74db3f29d9757eea04043cb6180ec5e8034d7f78fa6e013.scope: Deactivated successfully.
Feb 23 09:09:52 np0005626463.localdomain dbus-broker-launch[754]: avc:  op=load_policy lsm=selinux seqno=16 res=1
Feb 23 09:09:52 np0005626463.localdomain podman[124397]: 2026-02-23 09:09:52.240448705 +0000 UTC m=+0.032409693 container died 3990832ba815c78eb74db3f29d9757eea04043cb6180ec5e8034d7f78fa6e013 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eloquent_sutherland, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.42.2, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.openshift.tags=rhceph ceph, version=7, com.redhat.component=rhceph-container, build-date=2026-02-09T10:25:24Z, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, release=1770267347, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, CEPH_POINT_RELEASE=, distribution-scope=public, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-type=git)
Feb 23 09:09:52 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-64f78aea2c8e70a7ac0cbc19bd4a96fb4d1b27dd765b2224b3fd1e2e1d95dd9f-merged.mount: Deactivated successfully.
Feb 23 09:09:52 np0005626463.localdomain podman[124397]: 2026-02-23 09:09:52.283216955 +0000 UTC m=+0.075177963 container remove 3990832ba815c78eb74db3f29d9757eea04043cb6180ec5e8034d7f78fa6e013 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eloquent_sutherland, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-02-09T10:25:24Z, version=7, ceph=True, architecture=x86_64, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, org.opencontainers.image.created=2026-02-09T10:25:24Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., name=rhceph, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, RELEASE=main, release=1770267347, GIT_CLEAN=True, io.buildah.version=1.42.2, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Feb 23 09:09:52 np0005626463.localdomain systemd[1]: libpod-conmon-3990832ba815c78eb74db3f29d9757eea04043cb6180ec5e8034d7f78fa6e013.scope: Deactivated successfully.
Feb 23 09:09:52 np0005626463.localdomain sudo[122693]: pam_unix(sudo:session): session closed for user root
Feb 23 09:09:52 np0005626463.localdomain kernel: SELinux:  Converting 2754 SID table entries...
Feb 23 09:09:52 np0005626463.localdomain kernel: SELinux:  policy capability network_peer_controls=1
Feb 23 09:09:52 np0005626463.localdomain kernel: SELinux:  policy capability open_perms=1
Feb 23 09:09:52 np0005626463.localdomain kernel: SELinux:  policy capability extended_socket_class=1
Feb 23 09:09:52 np0005626463.localdomain kernel: SELinux:  policy capability always_check_network=0
Feb 23 09:09:52 np0005626463.localdomain kernel: SELinux:  policy capability cgroup_seclabel=1
Feb 23 09:09:52 np0005626463.localdomain kernel: SELinux:  policy capability nnp_nosuid_transition=1
Feb 23 09:09:52 np0005626463.localdomain kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Feb 23 09:09:53 np0005626463.localdomain sudo[124506]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 09:09:53 np0005626463.localdomain dbus-broker-launch[754]: avc:  op=load_policy lsm=selinux seqno=17 res=1
Feb 23 09:09:53 np0005626463.localdomain sudo[124506]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:09:53 np0005626463.localdomain sudo[124506]: pam_unix(sudo:session): session closed for user root
Feb 23 09:09:54 np0005626463.localdomain sudo[121389]: pam_unix(sudo:session): session closed for user root
Feb 23 09:09:54 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34404 DF PROTO=TCP SPT=40142 DPT=9102 SEQ=420552745 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE285300000000001030307) 
Feb 23 09:09:54 np0005626463.localdomain sudo[124611]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xqluzrlsbftlydhzrohmilxgxsivmyyu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837794.3262365-397-57167252098049/AnsiballZ_file.py
Feb 23 09:09:54 np0005626463.localdomain sudo[124611]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:09:54 np0005626463.localdomain python3.9[124613]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/ansible/facts.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:09:54 np0005626463.localdomain sudo[124611]: pam_unix(sudo:session): session closed for user root
Feb 23 09:09:54 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28900 DF PROTO=TCP SPT=53872 DPT=9882 SEQ=927108265 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE288070000000001030307) 
Feb 23 09:09:55 np0005626463.localdomain sudo[124703]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gmkawgqoejifuephszhdjagmwunnblsd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837794.9916973-421-240879572633461/AnsiballZ_stat.py
Feb 23 09:09:55 np0005626463.localdomain sudo[124703]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:09:55 np0005626463.localdomain python3.9[124705]: ansible-ansible.legacy.stat Invoked with path=/etc/ansible/facts.d/edpm.fact follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:09:55 np0005626463.localdomain sudo[124703]: pam_unix(sudo:session): session closed for user root
Feb 23 09:09:56 np0005626463.localdomain sudo[124776]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ygmbjemwvyucgcofjwqkzxqydzqhddys ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837794.9916973-421-240879572633461/AnsiballZ_copy.py
Feb 23 09:09:56 np0005626463.localdomain sudo[124776]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:09:56 np0005626463.localdomain python3.9[124778]: ansible-ansible.legacy.copy Invoked with dest=/etc/ansible/facts.d/edpm.fact mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771837794.9916973-421-240879572633461/.source.fact _original_basename=.ytiar93n follow=False checksum=d686dccd4d8cd0883f3e3bc0a6f664c73290ba68 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:09:56 np0005626463.localdomain sudo[124776]: pam_unix(sudo:session): session closed for user root
Feb 23 09:09:57 np0005626463.localdomain python3.9[124868]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 23 09:09:57 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34406 DF PROTO=TCP SPT=40142 DPT=9102 SEQ=420552745 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE291460000000001030307) 
Feb 23 09:09:57 np0005626463.localdomain sudo[124964]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gfjdaejrdxribpxxqfltjcowilipxwmj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837797.5977242-496-70707127793854/AnsiballZ_setup.py
Feb 23 09:09:57 np0005626463.localdomain sudo[124964]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:09:58 np0005626463.localdomain python3.9[124966]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 23 09:09:58 np0005626463.localdomain sudo[124964]: pam_unix(sudo:session): session closed for user root
Feb 23 09:09:59 np0005626463.localdomain sudo[125018]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ijufpirzrburemzvhpgnehuukqftthvl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837797.5977242-496-70707127793854/AnsiballZ_dnf.py
Feb 23 09:09:59 np0005626463.localdomain sudo[125018]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:09:59 np0005626463.localdomain python3.9[125020]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 23 09:10:00 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54588 DF PROTO=TCP SPT=44146 DPT=9100 SEQ=1842244641 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE29FC60000000001030307) 
Feb 23 09:10:02 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54589 DF PROTO=TCP SPT=44146 DPT=9100 SEQ=1842244641 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE2A7C60000000001030307) 
Feb 23 09:10:03 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 09:10:03 np0005626463.localdomain systemd-rc-local-generator[125059]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 09:10:03 np0005626463.localdomain systemd-sysv-generator[125063]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 09:10:03 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 09:10:03 np0005626463.localdomain systemd[1]: Queuing reload/restart jobs for marked units…
Feb 23 09:10:04 np0005626463.localdomain sudo[125018]: pam_unix(sudo:session): session closed for user root
Feb 23 09:10:04 np0005626463.localdomain sudo[125159]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ulvpanvxvydfipdvizznwiaapapuagoy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837804.5475862-532-19101070892241/AnsiballZ_command.py
Feb 23 09:10:04 np0005626463.localdomain sudo[125159]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:10:04 np0005626463.localdomain python3.9[125161]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 09:10:05 np0005626463.localdomain sudo[125159]: pam_unix(sudo:session): session closed for user root
Feb 23 09:10:06 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58805 DF PROTO=TCP SPT=45086 DPT=9101 SEQ=608469731 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE2B4860000000001030307) 
Feb 23 09:10:06 np0005626463.localdomain sudo[125398]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mwaayxvkmmvlxfnjrohqamtsbnkiasog ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837806.0367112-556-173311213079/AnsiballZ_selinux.py
Feb 23 09:10:06 np0005626463.localdomain sudo[125398]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:10:06 np0005626463.localdomain python3.9[125400]: ansible-ansible.posix.selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config update_kernel_param=False
Feb 23 09:10:06 np0005626463.localdomain sudo[125398]: pam_unix(sudo:session): session closed for user root
Feb 23 09:10:07 np0005626463.localdomain sudo[125490]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-opcqixenmtslpcvfxdiwhiawvkltjjxa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837807.3167918-589-716539246968/AnsiballZ_command.py
Feb 23 09:10:07 np0005626463.localdomain sudo[125490]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:10:07 np0005626463.localdomain python3.9[125492]: ansible-ansible.legacy.command Invoked with cmd=dd if=/dev/zero of=/swap count=1024 bs=1M creates=/swap _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None removes=None stdin=None
Feb 23 09:10:08 np0005626463.localdomain sudo[125490]: pam_unix(sudo:session): session closed for user root
Feb 23 09:10:08 np0005626463.localdomain sudo[125583]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gdvdrzvdaysnlsazzntmvcnyloqfnavo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837808.461645-613-31949127641227/AnsiballZ_file.py
Feb 23 09:10:08 np0005626463.localdomain sudo[125583]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:10:08 np0005626463.localdomain python3.9[125585]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/swap recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:10:08 np0005626463.localdomain sudo[125583]: pam_unix(sudo:session): session closed for user root
Feb 23 09:10:09 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46372 DF PROTO=TCP SPT=52514 DPT=9882 SEQ=4025938738 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE2C0E20000000001030307) 
Feb 23 09:10:09 np0005626463.localdomain sudo[125675]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ijinxkrhjuzimnptznqbouvwqevraeyd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837809.1463206-637-17856187721520/AnsiballZ_mount.py
Feb 23 09:10:09 np0005626463.localdomain sudo[125675]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:10:09 np0005626463.localdomain python3.9[125677]: ansible-ansible.posix.mount Invoked with dump=0 fstype=swap name=none opts=sw passno=0 src=/swap state=present path=none boot=True opts_no_log=False backup=False fstab=None
Feb 23 09:10:09 np0005626463.localdomain sudo[125675]: pam_unix(sudo:session): session closed for user root
Feb 23 09:10:11 np0005626463.localdomain sudo[125767]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ofgegtczjkshncnnmmifzretybthusdj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837810.8873804-721-23268730276969/AnsiballZ_file.py
Feb 23 09:10:11 np0005626463.localdomain sudo[125767]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:10:11 np0005626463.localdomain python3.9[125769]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/ca-trust/source/anchors setype=cert_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 23 09:10:11 np0005626463.localdomain sudo[125767]: pam_unix(sudo:session): session closed for user root
Feb 23 09:10:11 np0005626463.localdomain sudo[125859]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hsmqdvfhnrcjcabspnvddnyqlgnaencw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837811.5468314-745-13721008793683/AnsiballZ_stat.py
Feb 23 09:10:11 np0005626463.localdomain sudo[125859]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:10:12 np0005626463.localdomain python3.9[125861]: ansible-ansible.legacy.stat Invoked with path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:10:12 np0005626463.localdomain sudo[125859]: pam_unix(sudo:session): session closed for user root
Feb 23 09:10:12 np0005626463.localdomain sudo[125932]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xtgqcybmokjlxbbjvctlghhqgzxtlnne ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837811.5468314-745-13721008793683/AnsiballZ_copy.py
Feb 23 09:10:12 np0005626463.localdomain sudo[125932]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:10:12 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46374 DF PROTO=TCP SPT=52514 DPT=9882 SEQ=4025938738 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE2CD070000000001030307) 
Feb 23 09:10:12 np0005626463.localdomain python3.9[125934]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771837811.5468314-745-13721008793683/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=704aa8bdbea515a6da96c2b63bce412faf6bceda backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:10:12 np0005626463.localdomain sudo[125932]: pam_unix(sudo:session): session closed for user root
Feb 23 09:10:13 np0005626463.localdomain sudo[126024]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qfazghjawtltkcelhpowuxcidbfzucel ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837813.3316834-817-120760090149485/AnsiballZ_stat.py
Feb 23 09:10:13 np0005626463.localdomain sudo[126024]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:10:13 np0005626463.localdomain python3.9[126026]: ansible-ansible.builtin.stat Invoked with path=/etc/lvm/devices/system.devices follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 23 09:10:13 np0005626463.localdomain sudo[126024]: pam_unix(sudo:session): session closed for user root
Feb 23 09:10:14 np0005626463.localdomain sudo[126118]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-udfxkooihuuceotzymjysrowuaogrdgx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837814.385538-856-11072026234379/AnsiballZ_getent.py
Feb 23 09:10:14 np0005626463.localdomain sudo[126118]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:10:15 np0005626463.localdomain python3.9[126120]: ansible-ansible.builtin.getent Invoked with database=passwd key=qemu fail_key=True service=None split=None
Feb 23 09:10:15 np0005626463.localdomain sudo[126118]: pam_unix(sudo:session): session closed for user root
Feb 23 09:10:15 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10160 DF PROTO=TCP SPT=52164 DPT=9105 SEQ=1394534134 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE2D8060000000001030307) 
Feb 23 09:10:15 np0005626463.localdomain sudo[126211]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xoicloodenhkhnzhczpvxtajasnbsocg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837815.470193-886-202627904624512/AnsiballZ_getent.py
Feb 23 09:10:15 np0005626463.localdomain sudo[126211]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:10:15 np0005626463.localdomain python3.9[126213]: ansible-ansible.builtin.getent Invoked with database=passwd key=hugetlbfs fail_key=True service=None split=None
Feb 23 09:10:15 np0005626463.localdomain sudo[126211]: pam_unix(sudo:session): session closed for user root
Feb 23 09:10:16 np0005626463.localdomain sudo[126304]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sejvgjrcfqjshuyfdzeyagbxthcchvsr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837816.1523669-910-53676419520014/AnsiballZ_group.py
Feb 23 09:10:16 np0005626463.localdomain sudo[126304]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:10:16 np0005626463.localdomain python3.9[126306]: ansible-ansible.builtin.group Invoked with gid=42477 name=hugetlbfs state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Feb 23 09:10:16 np0005626463.localdomain groupmod[126307]: group changed in /etc/group (group hugetlbfs/985, new gid: 42477)
Feb 23 09:10:16 np0005626463.localdomain groupmod[126307]: group changed in /etc/passwd (group hugetlbfs/985, new gid: 42477)
Feb 23 09:10:16 np0005626463.localdomain sudo[126304]: pam_unix(sudo:session): session closed for user root
Feb 23 09:10:17 np0005626463.localdomain sudo[126402]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hkvmtqzkojathwyilxqmgttainajujrv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837817.0912313-937-269591555366554/AnsiballZ_file.py
Feb 23 09:10:17 np0005626463.localdomain sudo[126402]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:10:17 np0005626463.localdomain python3.9[126404]: ansible-ansible.builtin.file Invoked with group=qemu mode=0755 owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None serole=None selevel=None attributes=None
Feb 23 09:10:17 np0005626463.localdomain sudo[126402]: pam_unix(sudo:session): session closed for user root
Feb 23 09:10:18 np0005626463.localdomain sudo[126494]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bnldiifxswxwjvynklzxbexhxazapzgs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837817.9582236-970-42514059212308/AnsiballZ_dnf.py
Feb 23 09:10:18 np0005626463.localdomain sudo[126494]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:10:18 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58807 DF PROTO=TCP SPT=45086 DPT=9101 SEQ=608469731 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE2E4060000000001030307) 
Feb 23 09:10:18 np0005626463.localdomain python3.9[126496]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 23 09:10:21 np0005626463.localdomain sudo[126494]: pam_unix(sudo:session): session closed for user root
Feb 23 09:10:22 np0005626463.localdomain sudo[126588]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rcklnukwtnsftelnxmryhbnprnfpofsz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837821.9011807-994-263393903559164/AnsiballZ_file.py
Feb 23 09:10:22 np0005626463.localdomain sudo[126588]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:10:22 np0005626463.localdomain python3.9[126590]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/modules-load.d setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 23 09:10:22 np0005626463.localdomain sudo[126588]: pam_unix(sudo:session): session closed for user root
Feb 23 09:10:23 np0005626463.localdomain sshd[126605]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:10:23 np0005626463.localdomain sshd[126607]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:10:23 np0005626463.localdomain sshd[126605]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:10:24 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6766 DF PROTO=TCP SPT=47330 DPT=9102 SEQ=4218432348 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE2FA600000000001030307) 
Feb 23 09:10:24 np0005626463.localdomain sshd[126607]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:10:25 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46376 DF PROTO=TCP SPT=52514 DPT=9882 SEQ=4025938738 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE2FE070000000001030307) 
Feb 23 09:10:27 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6768 DF PROTO=TCP SPT=47330 DPT=9102 SEQ=4218432348 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE306870000000001030307) 
Feb 23 09:10:27 np0005626463.localdomain sudo[126684]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kufdyyzocmvucpwckjgwtsqgsmpyqvfd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837827.3949254-1018-196136338400691/AnsiballZ_stat.py
Feb 23 09:10:27 np0005626463.localdomain sudo[126684]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:10:27 np0005626463.localdomain python3.9[126686]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:10:27 np0005626463.localdomain sudo[126684]: pam_unix(sudo:session): session closed for user root
Feb 23 09:10:28 np0005626463.localdomain sudo[126757]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tuqjalysjevusdxmgfafzowfdsxpwewb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837827.3949254-1018-196136338400691/AnsiballZ_copy.py
Feb 23 09:10:28 np0005626463.localdomain sudo[126757]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:10:28 np0005626463.localdomain python3.9[126759]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771837827.3949254-1018-196136338400691/.source.conf follow=False _original_basename=edpm-modprobe.conf.j2 checksum=8021efe01721d8fa8cab46b95c00ec1be6dbb9d0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Feb 23 09:10:28 np0005626463.localdomain sudo[126757]: pam_unix(sudo:session): session closed for user root
Feb 23 09:10:29 np0005626463.localdomain sudo[126849]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-amktuhrfclddjgtvhchkmgqnyxbnrzlp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837828.662337-1063-80688360803655/AnsiballZ_systemd.py
Feb 23 09:10:29 np0005626463.localdomain sudo[126849]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:10:29 np0005626463.localdomain python3.9[126851]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 23 09:10:29 np0005626463.localdomain systemd[1]: systemd-modules-load.service: Deactivated successfully.
Feb 23 09:10:29 np0005626463.localdomain systemd[1]: Stopped Load Kernel Modules.
Feb 23 09:10:29 np0005626463.localdomain systemd[1]: Stopping Load Kernel Modules...
Feb 23 09:10:29 np0005626463.localdomain systemd[1]: Starting Load Kernel Modules...
Feb 23 09:10:29 np0005626463.localdomain systemd-modules-load[126855]: Module 'msr' is built in
Feb 23 09:10:29 np0005626463.localdomain systemd[1]: Finished Load Kernel Modules.
Feb 23 09:10:29 np0005626463.localdomain sudo[126849]: pam_unix(sudo:session): session closed for user root
Feb 23 09:10:30 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27787 DF PROTO=TCP SPT=37474 DPT=9100 SEQ=3323489684 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE314C70000000001030307) 
Feb 23 09:10:32 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27788 DF PROTO=TCP SPT=37474 DPT=9100 SEQ=3323489684 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE31CC60000000001030307) 
Feb 23 09:10:33 np0005626463.localdomain sudo[126945]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pcvdfcxgeofdledfscsgkqucngavoody ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837833.33873-1087-220998321439589/AnsiballZ_stat.py
Feb 23 09:10:33 np0005626463.localdomain sudo[126945]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:10:33 np0005626463.localdomain python3.9[126947]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:10:33 np0005626463.localdomain sudo[126945]: pam_unix(sudo:session): session closed for user root
Feb 23 09:10:34 np0005626463.localdomain sudo[127018]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sojaddsbaarrlkelmfpvucpvxdonizzr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837833.33873-1087-220998321439589/AnsiballZ_copy.py
Feb 23 09:10:34 np0005626463.localdomain sudo[127018]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:10:34 np0005626463.localdomain python3.9[127020]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysctl.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771837833.33873-1087-220998321439589/.source.conf follow=False _original_basename=edpm-sysctl.conf.j2 checksum=2a366439721b855adcfe4d7f152babb68596a007 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Feb 23 09:10:34 np0005626463.localdomain sudo[127018]: pam_unix(sudo:session): session closed for user root
Feb 23 09:10:35 np0005626463.localdomain sudo[127110]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zzozadpcdhnbcakbsiietimqeexsrkcu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837834.9555755-1141-67879225092380/AnsiballZ_dnf.py
Feb 23 09:10:35 np0005626463.localdomain sudo[127110]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:10:35 np0005626463.localdomain python3.9[127112]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 23 09:10:36 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29067 DF PROTO=TCP SPT=41922 DPT=9101 SEQ=3475271719 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE329C60000000001030307) 
Feb 23 09:10:38 np0005626463.localdomain sudo[127110]: pam_unix(sudo:session): session closed for user root
Feb 23 09:10:39 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6770 DF PROTO=TCP SPT=47330 DPT=9102 SEQ=4218432348 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE336070000000001030307) 
Feb 23 09:10:39 np0005626463.localdomain python3.9[127204]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 23 09:10:40 np0005626463.localdomain python3.9[127296]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile
Feb 23 09:10:40 np0005626463.localdomain python3.9[127386]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 23 09:10:41 np0005626463.localdomain sudo[127476]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lnjumvjpisgxcnytvcptrepbelklllkx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837841.4889555-1264-148603474288251/AnsiballZ_systemd.py
Feb 23 09:10:41 np0005626463.localdomain sudo[127476]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:10:42 np0005626463.localdomain python3.9[127478]: ansible-ansible.builtin.systemd Invoked with enabled=True name=tuned state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 09:10:42 np0005626463.localdomain systemd[1]: Stopping Dynamic System Tuning Daemon...
Feb 23 09:10:42 np0005626463.localdomain systemd[1]: tuned.service: Deactivated successfully.
Feb 23 09:10:42 np0005626463.localdomain systemd[1]: Stopped Dynamic System Tuning Daemon.
Feb 23 09:10:42 np0005626463.localdomain systemd[1]: tuned.service: Consumed 2.026s CPU time, no IO.
Feb 23 09:10:42 np0005626463.localdomain systemd[1]: Starting Dynamic System Tuning Daemon...
Feb 23 09:10:42 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62012 DF PROTO=TCP SPT=39948 DPT=9882 SEQ=889338140 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE342060000000001030307) 
Feb 23 09:10:43 np0005626463.localdomain systemd[1]: Started Dynamic System Tuning Daemon.
Feb 23 09:10:43 np0005626463.localdomain sudo[127476]: pam_unix(sudo:session): session closed for user root
Feb 23 09:10:44 np0005626463.localdomain python3.9[127580]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline
Feb 23 09:10:44 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27790 DF PROTO=TCP SPT=37474 DPT=9100 SEQ=3323489684 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE34C060000000001030307) 
Feb 23 09:10:48 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29069 DF PROTO=TCP SPT=41922 DPT=9101 SEQ=3475271719 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE35A070000000001030307) 
Feb 23 09:10:51 np0005626463.localdomain sudo[127670]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yudfpfshhvazoapyedggvllxfiffxvjz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837851.252822-1435-6397032782623/AnsiballZ_systemd.py
Feb 23 09:10:51 np0005626463.localdomain sudo[127670]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:10:51 np0005626463.localdomain python3.9[127672]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksm.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 09:10:51 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 09:10:52 np0005626463.localdomain systemd-rc-local-generator[127701]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 09:10:52 np0005626463.localdomain systemd-sysv-generator[127705]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 09:10:52 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 09:10:52 np0005626463.localdomain sudo[127670]: pam_unix(sudo:session): session closed for user root
Feb 23 09:10:52 np0005626463.localdomain sudo[127800]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lmjzhohirvefrdybdvdxqagizcuafzbi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837852.3313403-1435-135971910728855/AnsiballZ_systemd.py
Feb 23 09:10:52 np0005626463.localdomain sudo[127800]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:10:52 np0005626463.localdomain python3.9[127802]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksmtuned.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 09:10:53 np0005626463.localdomain sudo[127804]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 09:10:53 np0005626463.localdomain sudo[127804]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:10:53 np0005626463.localdomain sudo[127804]: pam_unix(sudo:session): session closed for user root
Feb 23 09:10:53 np0005626463.localdomain sudo[127819]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 23 09:10:53 np0005626463.localdomain sudo[127819]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:10:53 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 09:10:54 np0005626463.localdomain systemd-rc-local-generator[127877]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 09:10:54 np0005626463.localdomain systemd-sysv-generator[127880]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 09:10:54 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33995 DF PROTO=TCP SPT=44786 DPT=9102 SEQ=2922352443 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE36F900000000001030307) 
Feb 23 09:10:54 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 09:10:54 np0005626463.localdomain sudo[127819]: pam_unix(sudo:session): session closed for user root
Feb 23 09:10:54 np0005626463.localdomain sudo[127800]: pam_unix(sudo:session): session closed for user root
Feb 23 09:10:54 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62014 DF PROTO=TCP SPT=39948 DPT=9882 SEQ=889338140 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE372070000000001030307) 
Feb 23 09:10:54 np0005626463.localdomain sudo[127989]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iyjklygsxqkfwdjluuwzkafipisaelag ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837854.585691-1483-4434976529289/AnsiballZ_command.py
Feb 23 09:10:54 np0005626463.localdomain sudo[127989]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:10:55 np0005626463.localdomain python3.9[127991]: ansible-ansible.legacy.command Invoked with _raw_params=mkswap "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 09:10:55 np0005626463.localdomain sudo[127989]: pam_unix(sudo:session): session closed for user root
Feb 23 09:10:55 np0005626463.localdomain sudo[128082]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yseygxtzmobrknispzrcsacqdwsczlru ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837855.232893-1507-111267127093995/AnsiballZ_command.py
Feb 23 09:10:55 np0005626463.localdomain sudo[128082]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:10:55 np0005626463.localdomain python3.9[128084]: ansible-ansible.legacy.command Invoked with _raw_params=swapon "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 09:10:55 np0005626463.localdomain kernel: Adding 1048572k swap on /swap.  Priority:-2 extents:1 across:1048572k FS
Feb 23 09:10:55 np0005626463.localdomain sudo[128082]: pam_unix(sudo:session): session closed for user root
Feb 23 09:10:56 np0005626463.localdomain sudo[128175]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-avkcqlbylnuxdrdpmkfhbcsclstjglat ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837856.4870245-1531-251171960470375/AnsiballZ_command.py
Feb 23 09:10:56 np0005626463.localdomain sudo[128175]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:10:56 np0005626463.localdomain sudo[128178]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 09:10:56 np0005626463.localdomain sudo[128178]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:10:56 np0005626463.localdomain sudo[128178]: pam_unix(sudo:session): session closed for user root
Feb 23 09:10:56 np0005626463.localdomain python3.9[128177]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/update-ca-trust _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 09:10:57 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33997 DF PROTO=TCP SPT=44786 DPT=9102 SEQ=2922352443 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE37B860000000001030307) 
Feb 23 09:10:58 np0005626463.localdomain sudo[128175]: pam_unix(sudo:session): session closed for user root
Feb 23 09:10:58 np0005626463.localdomain sudo[128289]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bsijpsbuhswmiecxaemfptjoanrcwmer ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837858.212067-1555-127126394242968/AnsiballZ_command.py
Feb 23 09:10:58 np0005626463.localdomain sudo[128289]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:10:58 np0005626463.localdomain python3.9[128291]: ansible-ansible.legacy.command Invoked with _raw_params=echo 2 >/sys/kernel/mm/ksm/run _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 09:10:58 np0005626463.localdomain sudo[128289]: pam_unix(sudo:session): session closed for user root
Feb 23 09:10:59 np0005626463.localdomain sudo[128382]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fucqnmpvniqczljuojgciagoowfqfboc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837858.9724135-1579-73667724428672/AnsiballZ_systemd.py
Feb 23 09:10:59 np0005626463.localdomain sudo[128382]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:10:59 np0005626463.localdomain python3.9[128384]: ansible-ansible.builtin.systemd Invoked with name=systemd-sysctl.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 23 09:10:59 np0005626463.localdomain systemd[1]: systemd-sysctl.service: Deactivated successfully.
Feb 23 09:10:59 np0005626463.localdomain systemd[1]: Stopped Apply Kernel Variables.
Feb 23 09:10:59 np0005626463.localdomain systemd[1]: Stopping Apply Kernel Variables...
Feb 23 09:10:59 np0005626463.localdomain systemd[1]: Starting Apply Kernel Variables...
Feb 23 09:10:59 np0005626463.localdomain systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Feb 23 09:10:59 np0005626463.localdomain systemd[1]: Finished Apply Kernel Variables.
Feb 23 09:10:59 np0005626463.localdomain sudo[128382]: pam_unix(sudo:session): session closed for user root
Feb 23 09:11:00 np0005626463.localdomain sshd[119976]: pam_unix(sshd:session): session closed for user zuul
Feb 23 09:11:00 np0005626463.localdomain systemd[1]: session-38.scope: Deactivated successfully.
Feb 23 09:11:00 np0005626463.localdomain systemd[1]: session-38.scope: Consumed 2min 2.658s CPU time.
Feb 23 09:11:00 np0005626463.localdomain systemd-logind[759]: Session 38 logged out. Waiting for processes to exit.
Feb 23 09:11:00 np0005626463.localdomain systemd-logind[759]: Removed session 38.
Feb 23 09:11:00 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45449 DF PROTO=TCP SPT=36540 DPT=9100 SEQ=287336381 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE38A060000000001030307) 
Feb 23 09:11:02 np0005626463.localdomain sshd[128404]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:11:02 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45450 DF PROTO=TCP SPT=36540 DPT=9100 SEQ=287336381 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE392060000000001030307) 
Feb 23 09:11:03 np0005626463.localdomain sshd[128404]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:11:04 np0005626463.localdomain sshd[128406]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:11:05 np0005626463.localdomain sshd[128406]: Accepted publickey for zuul from 192.168.122.30 port 54730 ssh2: RSA SHA256:/ShS2J5Dq7o9P59e/NmgQORSAcJOBwu46Huo03HBdB4
Feb 23 09:11:05 np0005626463.localdomain systemd-logind[759]: New session 39 of user zuul.
Feb 23 09:11:05 np0005626463.localdomain systemd[1]: Started Session 39 of User zuul.
Feb 23 09:11:05 np0005626463.localdomain sshd[128406]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Feb 23 09:11:06 np0005626463.localdomain python3.9[128499]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 23 09:11:06 np0005626463.localdomain sshd[128504]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:11:06 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28357 DF PROTO=TCP SPT=59532 DPT=9101 SEQ=1427864577 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE39F060000000001030307) 
Feb 23 09:11:07 np0005626463.localdomain python3.9[128594]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 23 09:11:08 np0005626463.localdomain sudo[128688]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hhhqjazlwhmpucjzfmctrpudehiswabs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837868.0199196-105-141999164997155/AnsiballZ_command.py
Feb 23 09:11:08 np0005626463.localdomain sudo[128688]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:11:08 np0005626463.localdomain python3.9[128690]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols
                                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 09:11:08 np0005626463.localdomain sudo[128688]: pam_unix(sudo:session): session closed for user root
Feb 23 09:11:09 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20817 DF PROTO=TCP SPT=46794 DPT=9882 SEQ=2409356902 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE3AB430000000001030307) 
Feb 23 09:11:09 np0005626463.localdomain python3.9[128781]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 23 09:11:10 np0005626463.localdomain sudo[128875]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-znyirnojptyipocmrcllyoehxzbcmvvt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837869.9964728-165-12524328648330/AnsiballZ_setup.py
Feb 23 09:11:10 np0005626463.localdomain sudo[128875]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:11:10 np0005626463.localdomain python3.9[128877]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 23 09:11:10 np0005626463.localdomain sudo[128875]: pam_unix(sudo:session): session closed for user root
Feb 23 09:11:11 np0005626463.localdomain sudo[128929]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gljzthhjtgewtahelzwzarutqtrpeexr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837869.9964728-165-12524328648330/AnsiballZ_dnf.py
Feb 23 09:11:11 np0005626463.localdomain sudo[128929]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:11:11 np0005626463.localdomain python3.9[128931]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 23 09:11:12 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20819 DF PROTO=TCP SPT=46794 DPT=9882 SEQ=2409356902 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE3B7460000000001030307) 
Feb 23 09:11:13 np0005626463.localdomain sshd[128504]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:11:15 np0005626463.localdomain sudo[128929]: pam_unix(sudo:session): session closed for user root
Feb 23 09:11:15 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45452 DF PROTO=TCP SPT=36540 DPT=9100 SEQ=287336381 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE3C2060000000001030307) 
Feb 23 09:11:15 np0005626463.localdomain sudo[129024]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xwfagxctdsvgetzflnpfoubevvplxjfn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837875.1730392-201-57756057276114/AnsiballZ_setup.py
Feb 23 09:11:15 np0005626463.localdomain sudo[129024]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:11:15 np0005626463.localdomain python3.9[129026]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 23 09:11:16 np0005626463.localdomain sudo[129024]: pam_unix(sudo:session): session closed for user root
Feb 23 09:11:16 np0005626463.localdomain sudo[129179]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-phbeizwdugapaqerbowzborhzdutblcz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837876.4164472-234-160733684173515/AnsiballZ_file.py
Feb 23 09:11:16 np0005626463.localdomain sudo[129179]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:11:17 np0005626463.localdomain python3.9[129181]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:11:17 np0005626463.localdomain sudo[129179]: pam_unix(sudo:session): session closed for user root
Feb 23 09:11:17 np0005626463.localdomain sudo[129271]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oprbleigvaptmivzzivirpffngzwnqdj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837877.168762-258-176664961479336/AnsiballZ_command.py
Feb 23 09:11:17 np0005626463.localdomain sudo[129271]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:11:17 np0005626463.localdomain python3.9[129273]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman
                                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 09:11:17 np0005626463.localdomain sudo[129271]: pam_unix(sudo:session): session closed for user root
Feb 23 09:11:18 np0005626463.localdomain sudo[129375]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nzojzhhqarxtsbzdlqevdxtpsphdgobx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837877.8676488-282-63389572499654/AnsiballZ_stat.py
Feb 23 09:11:18 np0005626463.localdomain sudo[129375]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:11:18 np0005626463.localdomain python3.9[129377]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:11:18 np0005626463.localdomain sudo[129375]: pam_unix(sudo:session): session closed for user root
Feb 23 09:11:18 np0005626463.localdomain sudo[129423]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iizyzxxhsphnzdptiamuzfabcpfvxfkj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837877.8676488-282-63389572499654/AnsiballZ_file.py
Feb 23 09:11:18 np0005626463.localdomain sudo[129423]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:11:18 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28359 DF PROTO=TCP SPT=59532 DPT=9101 SEQ=1427864577 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE3D0060000000001030307) 
Feb 23 09:11:18 np0005626463.localdomain python3.9[129425]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/containers/networks/podman.json _original_basename=podman_network_config.j2 recurse=False state=file path=/etc/containers/networks/podman.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:11:18 np0005626463.localdomain sudo[129423]: pam_unix(sudo:session): session closed for user root
Feb 23 09:11:19 np0005626463.localdomain sudo[129515]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fwautukdyfneiskssbpejhgkprytsadb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837879.155516-318-267515019269428/AnsiballZ_stat.py
Feb 23 09:11:19 np0005626463.localdomain sudo[129515]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:11:19 np0005626463.localdomain python3.9[129517]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:11:19 np0005626463.localdomain sudo[129515]: pam_unix(sudo:session): session closed for user root
Feb 23 09:11:20 np0005626463.localdomain sudo[129588]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-osdjsdofovzjkvsqhkrsrqawpiniserj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837879.155516-318-267515019269428/AnsiballZ_copy.py
Feb 23 09:11:20 np0005626463.localdomain sudo[129588]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:11:20 np0005626463.localdomain python3.9[129590]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771837879.155516-318-267515019269428/.source.conf follow=False _original_basename=registries.conf.j2 checksum=804a0d01b832e60d20f779a331306df708c87b02 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Feb 23 09:11:20 np0005626463.localdomain sudo[129588]: pam_unix(sudo:session): session closed for user root
Feb 23 09:11:20 np0005626463.localdomain sudo[129680]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iksqcfwmnyhhskaaimpljrvwagubbptm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837880.525868-366-33590148351965/AnsiballZ_ini_file.py
Feb 23 09:11:20 np0005626463.localdomain sudo[129680]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:11:21 np0005626463.localdomain python3.9[129682]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Feb 23 09:11:21 np0005626463.localdomain sudo[129680]: pam_unix(sudo:session): session closed for user root
Feb 23 09:11:21 np0005626463.localdomain systemd-journald[47710]: Field hash table of /run/log/journal/c0212a8b024a111cfc61293864f36c87/system.journal has a fill level at 75.4 (251 of 333 items), suggesting rotation.
Feb 23 09:11:21 np0005626463.localdomain systemd-journald[47710]: /run/log/journal/c0212a8b024a111cfc61293864f36c87/system.journal: Journal header limits reached or header out-of-date, rotating.
Feb 23 09:11:21 np0005626463.localdomain rsyslogd[758]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Feb 23 09:11:21 np0005626463.localdomain rsyslogd[758]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Feb 23 09:11:21 np0005626463.localdomain sudo[129773]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eqijqpettolzfcnhczpdzdznerpeqhfr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837881.2733557-366-274413351477081/AnsiballZ_ini_file.py
Feb 23 09:11:21 np0005626463.localdomain sudo[129773]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:11:21 np0005626463.localdomain python3.9[129775]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Feb 23 09:11:21 np0005626463.localdomain sudo[129773]: pam_unix(sudo:session): session closed for user root
Feb 23 09:11:22 np0005626463.localdomain sudo[129865]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lyycaqlwqyxgnxswoydjhrmankyvpqzm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837881.8859134-366-220346753997716/AnsiballZ_ini_file.py
Feb 23 09:11:22 np0005626463.localdomain sudo[129865]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:11:22 np0005626463.localdomain python3.9[129867]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Feb 23 09:11:22 np0005626463.localdomain sudo[129865]: pam_unix(sudo:session): session closed for user root
Feb 23 09:11:22 np0005626463.localdomain sudo[129957]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gzkoxenvlgrydlnbivryeqhicbmmhjqf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837882.5085926-366-106682353558319/AnsiballZ_ini_file.py
Feb 23 09:11:22 np0005626463.localdomain sudo[129957]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:11:23 np0005626463.localdomain python3.9[129959]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Feb 23 09:11:23 np0005626463.localdomain sudo[129957]: pam_unix(sudo:session): session closed for user root
Feb 23 09:11:23 np0005626463.localdomain python3.9[130049]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 23 09:11:24 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20149 DF PROTO=TCP SPT=36620 DPT=9102 SEQ=1249038669 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE3E4C10000000001030307) 
Feb 23 09:11:24 np0005626463.localdomain sudo[130141]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ptshqbcadabgircswqhigarlubetjlsv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837884.1496327-486-251800340846324/AnsiballZ_dnf.py
Feb 23 09:11:24 np0005626463.localdomain sudo[130141]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:11:24 np0005626463.localdomain python3.9[130143]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Feb 23 09:11:24 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20821 DF PROTO=TCP SPT=46794 DPT=9882 SEQ=2409356902 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE3E8070000000001030307) 
Feb 23 09:11:27 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20151 DF PROTO=TCP SPT=36620 DPT=9102 SEQ=1249038669 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE3F0C60000000001030307) 
Feb 23 09:11:28 np0005626463.localdomain sudo[130141]: pam_unix(sudo:session): session closed for user root
Feb 23 09:11:28 np0005626463.localdomain sudo[130235]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cyasdzdyvmvepjuapsnvwfrnqqclozmg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837888.224783-510-80688269724309/AnsiballZ_dnf.py
Feb 23 09:11:28 np0005626463.localdomain sudo[130235]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:11:28 np0005626463.localdomain python3.9[130237]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openstack-network-scripts'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Feb 23 09:11:30 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2677 DF PROTO=TCP SPT=59320 DPT=9100 SEQ=600894686 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE3FF460000000001030307) 
Feb 23 09:11:31 np0005626463.localdomain sudo[130235]: pam_unix(sudo:session): session closed for user root
Feb 23 09:11:32 np0005626463.localdomain sudo[130329]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uhhyqdxnwkmuswztpfpjeordtbkwefds ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837892.5266826-540-216265751917733/AnsiballZ_dnf.py
Feb 23 09:11:32 np0005626463.localdomain sudo[130329]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:11:32 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2678 DF PROTO=TCP SPT=59320 DPT=9100 SEQ=600894686 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE407460000000001030307) 
Feb 23 09:11:33 np0005626463.localdomain python3.9[130331]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['podman', 'buildah'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Feb 23 09:11:36 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32817 DF PROTO=TCP SPT=45238 DPT=9101 SEQ=2992531929 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE414060000000001030307) 
Feb 23 09:11:36 np0005626463.localdomain sudo[130329]: pam_unix(sudo:session): session closed for user root
Feb 23 09:11:37 np0005626463.localdomain sudo[130429]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xbwppqmozlkwtyugtfqgjhngihvdzxmo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837896.767083-567-59244621716223/AnsiballZ_dnf.py
Feb 23 09:11:37 np0005626463.localdomain sudo[130429]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:11:37 np0005626463.localdomain python3.9[130431]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['tuned', 'tuned-profiles-cpu-partitioning'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Feb 23 09:11:39 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20153 DF PROTO=TCP SPT=36620 DPT=9102 SEQ=1249038669 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE420070000000001030307) 
Feb 23 09:11:40 np0005626463.localdomain sudo[130429]: pam_unix(sudo:session): session closed for user root
Feb 23 09:11:41 np0005626463.localdomain sudo[130523]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-htkrzodtrbdzzduqgznhqejydwfwpnhe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837901.0340917-603-195398427854438/AnsiballZ_dnf.py
Feb 23 09:11:41 np0005626463.localdomain sudo[130523]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:11:41 np0005626463.localdomain python3.9[130525]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['os-net-config'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Feb 23 09:11:42 np0005626463.localdomain sshd[130528]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:11:42 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38399 DF PROTO=TCP SPT=58728 DPT=9882 SEQ=903369776 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE42C870000000001030307) 
Feb 23 09:11:42 np0005626463.localdomain sshd[130528]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:11:43 np0005626463.localdomain sshd[130530]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:11:44 np0005626463.localdomain sshd[130530]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:11:44 np0005626463.localdomain sudo[130523]: pam_unix(sudo:session): session closed for user root
Feb 23 09:11:45 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27552 DF PROTO=TCP SPT=41184 DPT=9105 SEQ=3035751851 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE438060000000001030307) 
Feb 23 09:11:45 np0005626463.localdomain sudo[130621]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jhlqwniqfyewagojjchgtgkasflquxxm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837905.5082366-630-110759959392533/AnsiballZ_dnf.py
Feb 23 09:11:45 np0005626463.localdomain sudo[130621]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:11:46 np0005626463.localdomain python3.9[130623]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openssh-server'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Feb 23 09:11:48 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32819 DF PROTO=TCP SPT=45238 DPT=9101 SEQ=2992531929 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE444060000000001030307) 
Feb 23 09:11:49 np0005626463.localdomain sudo[130621]: pam_unix(sudo:session): session closed for user root
Feb 23 09:11:49 np0005626463.localdomain sudo[130715]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wdlhxgjkyfodedofihzwtclcurlqtwmd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837909.5533338-657-195081171316219/AnsiballZ_dnf.py
Feb 23 09:11:49 np0005626463.localdomain sudo[130715]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:11:50 np0005626463.localdomain python3.9[130717]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Feb 23 09:11:54 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6599 DF PROTO=TCP SPT=52348 DPT=9102 SEQ=3188560332 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE459F00000000001030307) 
Feb 23 09:11:54 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38401 DF PROTO=TCP SPT=58728 DPT=9882 SEQ=903369776 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE45C060000000001030307) 
Feb 23 09:11:57 np0005626463.localdomain sudo[130729]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 09:11:57 np0005626463.localdomain sudo[130729]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:11:57 np0005626463.localdomain sudo[130729]: pam_unix(sudo:session): session closed for user root
Feb 23 09:11:57 np0005626463.localdomain sudo[130744]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 check-host
Feb 23 09:11:57 np0005626463.localdomain sudo[130744]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:11:57 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6601 DF PROTO=TCP SPT=52348 DPT=9102 SEQ=3188560332 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE466070000000001030307) 
Feb 23 09:11:57 np0005626463.localdomain sudo[130744]: pam_unix(sudo:session): session closed for user root
Feb 23 09:11:57 np0005626463.localdomain sudo[130780]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 09:11:57 np0005626463.localdomain sudo[130780]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:11:57 np0005626463.localdomain sudo[130780]: pam_unix(sudo:session): session closed for user root
Feb 23 09:11:57 np0005626463.localdomain sudo[130795]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 23 09:11:57 np0005626463.localdomain sudo[130795]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:11:58 np0005626463.localdomain sudo[130795]: pam_unix(sudo:session): session closed for user root
Feb 23 09:11:58 np0005626463.localdomain sudo[130844]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 09:11:58 np0005626463.localdomain sudo[130844]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:11:58 np0005626463.localdomain sudo[130844]: pam_unix(sudo:session): session closed for user root
Feb 23 09:12:00 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52695 DF PROTO=TCP SPT=54462 DPT=9100 SEQ=1675174803 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE474870000000001030307) 
Feb 23 09:12:02 np0005626463.localdomain sudo[130715]: pam_unix(sudo:session): session closed for user root
Feb 23 09:12:02 np0005626463.localdomain sudo[131012]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-utgzooiuujnwnsmxbhjmtpocglbepkxg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837922.578615-684-123406803717486/AnsiballZ_dnf.py
Feb 23 09:12:02 np0005626463.localdomain sudo[131012]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:12:02 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52696 DF PROTO=TCP SPT=54462 DPT=9100 SEQ=1675174803 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE47C860000000001030307) 
Feb 23 09:12:03 np0005626463.localdomain python3.9[131014]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['iscsi-initiator-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Feb 23 09:12:06 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30765 DF PROTO=TCP SPT=37558 DPT=9101 SEQ=67614132 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE489460000000001030307) 
Feb 23 09:12:06 np0005626463.localdomain sudo[131012]: pam_unix(sudo:session): session closed for user root
Feb 23 09:12:07 np0005626463.localdomain sudo[131107]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ozfvekmrebozogfeyrczyacraudjermd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837926.9369028-714-197102894289221/AnsiballZ_dnf.py
Feb 23 09:12:07 np0005626463.localdomain sudo[131107]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:12:07 np0005626463.localdomain python3.9[131109]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['device-mapper-multipath'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Feb 23 09:12:09 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32703 DF PROTO=TCP SPT=55290 DPT=9882 SEQ=3556716332 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE495A30000000001030307) 
Feb 23 09:12:10 np0005626463.localdomain sudo[131107]: pam_unix(sudo:session): session closed for user root
Feb 23 09:12:11 np0005626463.localdomain sudo[131205]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jzejspdjgluzxdrvpebqclkjejhfqnuh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837931.2803333-747-68534741660259/AnsiballZ_file.py
Feb 23 09:12:11 np0005626463.localdomain sudo[131205]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:12:11 np0005626463.localdomain python3.9[131207]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:12:11 np0005626463.localdomain sudo[131205]: pam_unix(sudo:session): session closed for user root
Feb 23 09:12:12 np0005626463.localdomain sudo[131310]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vobcrxmwewkgcsvyzmusapgwevswkidg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837931.9345686-771-41388446962518/AnsiballZ_stat.py
Feb 23 09:12:12 np0005626463.localdomain sudo[131310]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:12:12 np0005626463.localdomain python3.9[131312]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:12:12 np0005626463.localdomain sudo[131310]: pam_unix(sudo:session): session closed for user root
Feb 23 09:12:12 np0005626463.localdomain auditd[725]: Audit daemon rotating log files
Feb 23 09:12:12 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32705 DF PROTO=TCP SPT=55290 DPT=9882 SEQ=3556716332 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE4A1C60000000001030307) 
Feb 23 09:12:12 np0005626463.localdomain sudo[131383]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xotnzgjfzhdoxcrfukkwtgbtdsweteba ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837931.9345686-771-41388446962518/AnsiballZ_copy.py
Feb 23 09:12:12 np0005626463.localdomain sudo[131383]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:12:12 np0005626463.localdomain python3.9[131385]: ansible-ansible.legacy.copy Invoked with dest=/root/.config/containers/auth.json group=zuul mode=0660 owner=zuul src=/home/zuul/.ansible/tmp/ansible-tmp-1771837931.9345686-771-41388446962518/.source.json _original_basename=.dlqjd6bq follow=False checksum=bf21a9e8fbc5a3846fb05b4fa0859e0917b2202f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:12:12 np0005626463.localdomain sudo[131383]: pam_unix(sudo:session): session closed for user root
Feb 23 09:12:13 np0005626463.localdomain sudo[131475]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bezuzpasyhnpuriufzemiipbqrsmcsiz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837933.311815-825-12037193095791/AnsiballZ_podman_image.py
Feb 23 09:12:13 np0005626463.localdomain sudo[131475]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:12:13 np0005626463.localdomain python3.9[131477]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Feb 23 09:12:15 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52698 DF PROTO=TCP SPT=54462 DPT=9100 SEQ=1675174803 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE4AC060000000001030307) 
Feb 23 09:12:18 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30767 DF PROTO=TCP SPT=37558 DPT=9101 SEQ=67614132 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE4BA070000000001030307) 
Feb 23 09:12:20 np0005626463.localdomain podman[131491]: 2026-02-23 09:12:14.063904368 +0000 UTC m=+0.042549986 image pull  quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Feb 23 09:12:20 np0005626463.localdomain sudo[131475]: pam_unix(sudo:session): session closed for user root
Feb 23 09:12:21 np0005626463.localdomain sudo[131690]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oeosdqlxhnrhccjotreocvcosreditof ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837940.9598806-858-205601979021697/AnsiballZ_podman_image.py
Feb 23 09:12:21 np0005626463.localdomain sudo[131690]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:12:21 np0005626463.localdomain python3.9[131692]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Feb 23 09:12:21 np0005626463.localdomain sshd[131719]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:12:22 np0005626463.localdomain sshd[131719]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:12:24 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59992 DF PROTO=TCP SPT=57914 DPT=9102 SEQ=780876611 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE4CF1F0000000001030307) 
Feb 23 09:12:24 np0005626463.localdomain sshd[131734]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:12:24 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32707 DF PROTO=TCP SPT=55290 DPT=9882 SEQ=3556716332 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE4D2060000000001030307) 
Feb 23 09:12:25 np0005626463.localdomain sshd[131734]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:12:27 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59994 DF PROTO=TCP SPT=57914 DPT=9102 SEQ=780876611 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE4DB460000000001030307) 
Feb 23 09:12:29 np0005626463.localdomain podman[131706]: 2026-02-23 09:12:21.579995949 +0000 UTC m=+0.042864176 image pull  quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Feb 23 09:12:29 np0005626463.localdomain sudo[131690]: pam_unix(sudo:session): session closed for user root
Feb 23 09:12:30 np0005626463.localdomain sudo[131912]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jzqyxvosgssswsosjhqxbsewimuvzrmw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837950.1910074-888-281136832599133/AnsiballZ_podman_image.py
Feb 23 09:12:30 np0005626463.localdomain sudo[131912]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:12:30 np0005626463.localdomain python3.9[131914]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Feb 23 09:12:30 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12593 DF PROTO=TCP SPT=58046 DPT=9100 SEQ=385021854 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE4E9860000000001030307) 
Feb 23 09:12:32 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12594 DF PROTO=TCP SPT=58046 DPT=9100 SEQ=385021854 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE4F1860000000001030307) 
Feb 23 09:12:36 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=499 DF PROTO=TCP SPT=57266 DPT=9101 SEQ=434442640 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE4FE870000000001030307) 
Feb 23 09:12:39 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63204 DF PROTO=TCP SPT=40094 DPT=9882 SEQ=3854489123 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE50AD30000000001030307) 
Feb 23 09:12:42 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63206 DF PROTO=TCP SPT=40094 DPT=9882 SEQ=3854489123 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE516C60000000001030307) 
Feb 23 09:12:45 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14313 DF PROTO=TCP SPT=47896 DPT=9105 SEQ=588704374 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE522060000000001030307) 
Feb 23 09:12:47 np0005626463.localdomain podman[131928]: 2026-02-23 09:12:30.822706318 +0000 UTC m=+0.048114221 image pull  quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Feb 23 09:12:47 np0005626463.localdomain sudo[131912]: pam_unix(sudo:session): session closed for user root
Feb 23 09:12:48 np0005626463.localdomain sudo[132609]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ajyilkjuuvmhgypiythbxmkndjcjhwoc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837967.8118048-915-224461748913284/AnsiballZ_podman_image.py
Feb 23 09:12:48 np0005626463.localdomain sudo[132609]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:12:48 np0005626463.localdomain python3.9[132611]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Feb 23 09:12:48 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=501 DF PROTO=TCP SPT=57266 DPT=9101 SEQ=434442640 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE52E060000000001030307) 
Feb 23 09:12:49 np0005626463.localdomain podman[132625]: 2026-02-23 09:12:48.414726034 +0000 UTC m=+0.031213131 image pull  quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified
Feb 23 09:12:50 np0005626463.localdomain sudo[132609]: pam_unix(sudo:session): session closed for user root
Feb 23 09:12:50 np0005626463.localdomain sudo[132783]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aluqcewnlhrbbuyplsmbkjscqurqwrip ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837970.6510184-942-204149643978725/AnsiballZ_podman_image.py
Feb 23 09:12:50 np0005626463.localdomain sudo[132783]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:12:51 np0005626463.localdomain python3.9[132785]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Feb 23 09:12:52 np0005626463.localdomain podman[132797]: 2026-02-23 09:12:51.257906628 +0000 UTC m=+0.044995313 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 23 09:12:52 np0005626463.localdomain sudo[132783]: pam_unix(sudo:session): session closed for user root
Feb 23 09:12:53 np0005626463.localdomain sudo[132962]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wrddeqtuhqhthfxjjzvqsqmndntovwft ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837972.9320617-969-54067714720785/AnsiballZ_podman_image.py
Feb 23 09:12:53 np0005626463.localdomain sudo[132962]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:12:53 np0005626463.localdomain python3.9[132964]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Feb 23 09:12:54 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9095 DF PROTO=TCP SPT=51194 DPT=9102 SEQ=4131939506 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE544500000000001030307) 
Feb 23 09:12:54 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63208 DF PROTO=TCP SPT=40094 DPT=9882 SEQ=3854489123 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE546070000000001030307) 
Feb 23 09:12:57 np0005626463.localdomain podman[132977]: 2026-02-23 09:12:53.540990302 +0000 UTC m=+0.051730295 image pull  quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified
Feb 23 09:12:57 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9097 DF PROTO=TCP SPT=51194 DPT=9102 SEQ=4131939506 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE550460000000001030307) 
Feb 23 09:12:57 np0005626463.localdomain sudo[132962]: pam_unix(sudo:session): session closed for user root
Feb 23 09:12:58 np0005626463.localdomain sudo[133153]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wkewvewrqjyfgwkubfjhgsmhxkxgwibk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837977.6546843-969-134966065918311/AnsiballZ_podman_image.py
Feb 23 09:12:58 np0005626463.localdomain sudo[133153]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:12:58 np0005626463.localdomain python3.9[133155]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Feb 23 09:12:59 np0005626463.localdomain sudo[133180]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 09:12:59 np0005626463.localdomain sudo[133180]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:12:59 np0005626463.localdomain sudo[133180]: pam_unix(sudo:session): session closed for user root
Feb 23 09:12:59 np0005626463.localdomain sudo[133195]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 23 09:12:59 np0005626463.localdomain sudo[133195]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:12:59 np0005626463.localdomain podman[133167]: 2026-02-23 09:12:58.377557657 +0000 UTC m=+0.052666465 image pull  quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c
Feb 23 09:13:00 np0005626463.localdomain sudo[133195]: pam_unix(sudo:session): session closed for user root
Feb 23 09:13:00 np0005626463.localdomain sudo[133153]: pam_unix(sudo:session): session closed for user root
Feb 23 09:13:00 np0005626463.localdomain sudo[133337]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 09:13:00 np0005626463.localdomain sudo[133337]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:13:00 np0005626463.localdomain sudo[133337]: pam_unix(sudo:session): session closed for user root
Feb 23 09:13:00 np0005626463.localdomain sshd[128406]: pam_unix(sshd:session): session closed for user zuul
Feb 23 09:13:00 np0005626463.localdomain systemd-logind[759]: Session 39 logged out. Waiting for processes to exit.
Feb 23 09:13:00 np0005626463.localdomain systemd[1]: session-39.scope: Deactivated successfully.
Feb 23 09:13:00 np0005626463.localdomain systemd[1]: session-39.scope: Consumed 2min 7.471s CPU time.
Feb 23 09:13:00 np0005626463.localdomain systemd-logind[759]: Removed session 39.
Feb 23 09:13:00 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37093 DF PROTO=TCP SPT=50740 DPT=9100 SEQ=1221982567 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE55EC60000000001030307) 
Feb 23 09:13:01 np0005626463.localdomain sshd[133352]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:13:02 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37094 DF PROTO=TCP SPT=50740 DPT=9100 SEQ=1221982567 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE566C70000000001030307) 
Feb 23 09:13:03 np0005626463.localdomain sshd[133352]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:13:04 np0005626463.localdomain sshd[133354]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:13:05 np0005626463.localdomain sshd[133354]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:13:06 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42667 DF PROTO=TCP SPT=53756 DPT=9101 SEQ=1338459337 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE573C60000000001030307) 
Feb 23 09:13:06 np0005626463.localdomain sshd[133356]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:13:06 np0005626463.localdomain sshd[133356]: Accepted publickey for zuul from 192.168.122.30 port 49118 ssh2: RSA SHA256:/ShS2J5Dq7o9P59e/NmgQORSAcJOBwu46Huo03HBdB4
Feb 23 09:13:06 np0005626463.localdomain systemd-logind[759]: New session 40 of user zuul.
Feb 23 09:13:06 np0005626463.localdomain systemd[1]: Started Session 40 of User zuul.
Feb 23 09:13:06 np0005626463.localdomain sshd[133356]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Feb 23 09:13:07 np0005626463.localdomain python3.9[133449]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 23 09:13:08 np0005626463.localdomain sudo[133543]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-abpnpafthbmunhrskkpthzickwneoyts ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837988.2220445-66-158122778358328/AnsiballZ_getent.py
Feb 23 09:13:08 np0005626463.localdomain sudo[133543]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:13:09 np0005626463.localdomain python3.9[133546]: ansible-ansible.builtin.getent Invoked with database=passwd key=openvswitch fail_key=True service=None split=None
Feb 23 09:13:09 np0005626463.localdomain sudo[133543]: pam_unix(sudo:session): session closed for user root
Feb 23 09:13:09 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12366 DF PROTO=TCP SPT=58490 DPT=9882 SEQ=1431760514 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE580050000000001030307) 
Feb 23 09:13:10 np0005626463.localdomain sudo[133637]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ozbyqgspzwszfugsymyiddvzxwbdaylp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837990.0285938-102-157682267014847/AnsiballZ_setup.py
Feb 23 09:13:10 np0005626463.localdomain sudo[133637]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:13:10 np0005626463.localdomain python3.9[133639]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 23 09:13:10 np0005626463.localdomain sudo[133637]: pam_unix(sudo:session): session closed for user root
Feb 23 09:13:11 np0005626463.localdomain sudo[133691]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xjvpsxowecjgsjhtpgvflcdyjppkurpg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837990.0285938-102-157682267014847/AnsiballZ_dnf.py
Feb 23 09:13:11 np0005626463.localdomain sudo[133691]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:13:11 np0005626463.localdomain python3.9[133693]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openvswitch3.3'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Feb 23 09:13:12 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12368 DF PROTO=TCP SPT=58490 DPT=9882 SEQ=1431760514 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE58C060000000001030307) 
Feb 23 09:13:15 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55167 DF PROTO=TCP SPT=54562 DPT=9105 SEQ=2723027043 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE596060000000001030307) 
Feb 23 09:13:15 np0005626463.localdomain sudo[133691]: pam_unix(sudo:session): session closed for user root
Feb 23 09:13:16 np0005626463.localdomain sudo[133785]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lzvechthqqwqmlqscbdovxqbvwkzytvo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771837996.5089602-144-151775845599245/AnsiballZ_dnf.py
Feb 23 09:13:16 np0005626463.localdomain sudo[133785]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:13:16 np0005626463.localdomain python3.9[133787]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch3.3'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 23 09:13:18 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42669 DF PROTO=TCP SPT=53756 DPT=9101 SEQ=1338459337 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE5A4060000000001030307) 
Feb 23 09:13:21 np0005626463.localdomain sudo[133785]: pam_unix(sudo:session): session closed for user root
Feb 23 09:13:22 np0005626463.localdomain sudo[133973]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yyfooeodyzrwtnzdmciqmgminenqwhee ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838001.6455412-168-74807955644117/AnsiballZ_systemd.py
Feb 23 09:13:22 np0005626463.localdomain sudo[133973]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:13:22 np0005626463.localdomain python3.9[133975]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Feb 23 09:13:23 np0005626463.localdomain sudo[133973]: pam_unix(sudo:session): session closed for user root
Feb 23 09:13:24 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7876 DF PROTO=TCP SPT=47194 DPT=9102 SEQ=4261620880 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE5B97F0000000001030307) 
Feb 23 09:13:24 np0005626463.localdomain python3.9[134068]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 23 09:13:24 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12370 DF PROTO=TCP SPT=58490 DPT=9882 SEQ=1431760514 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE5BC070000000001030307) 
Feb 23 09:13:25 np0005626463.localdomain sudo[134158]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tmrcrcpjsqybxlujccaqtefwtbqvbkug ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838004.8214393-225-98247116517247/AnsiballZ_sefcontext.py
Feb 23 09:13:25 np0005626463.localdomain sudo[134158]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:13:25 np0005626463.localdomain python3.9[134160]: ansible-community.general.sefcontext Invoked with selevel=s0 setype=container_file_t state=present target=/var/lib/edpm-config(/.*)? ignore_selinux_state=False ftype=a reload=True substitute=None seuser=None
Feb 23 09:13:27 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7878 DF PROTO=TCP SPT=47194 DPT=9102 SEQ=4261620880 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE5C5870000000001030307) 
Feb 23 09:13:27 np0005626463.localdomain kernel: SELinux:  Converting 2756 SID table entries...
Feb 23 09:13:27 np0005626463.localdomain kernel: SELinux:  policy capability network_peer_controls=1
Feb 23 09:13:27 np0005626463.localdomain kernel: SELinux:  policy capability open_perms=1
Feb 23 09:13:27 np0005626463.localdomain kernel: SELinux:  policy capability extended_socket_class=1
Feb 23 09:13:27 np0005626463.localdomain kernel: SELinux:  policy capability always_check_network=0
Feb 23 09:13:27 np0005626463.localdomain kernel: SELinux:  policy capability cgroup_seclabel=1
Feb 23 09:13:27 np0005626463.localdomain kernel: SELinux:  policy capability nnp_nosuid_transition=1
Feb 23 09:13:27 np0005626463.localdomain kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Feb 23 09:13:27 np0005626463.localdomain sudo[134158]: pam_unix(sudo:session): session closed for user root
Feb 23 09:13:28 np0005626463.localdomain python3.9[134466]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 23 09:13:29 np0005626463.localdomain sudo[134562]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qernuxxdmivvchfivaoaaizarjournnq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838009.0297883-279-83762646834549/AnsiballZ_dnf.py
Feb 23 09:13:29 np0005626463.localdomain dbus-broker-launch[754]: avc:  op=load_policy lsm=selinux seqno=18 res=1
Feb 23 09:13:29 np0005626463.localdomain sudo[134562]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:13:29 np0005626463.localdomain python3.9[134564]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 23 09:13:30 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54570 DF PROTO=TCP SPT=36690 DPT=9100 SEQ=2296867033 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE5D4060000000001030307) 
Feb 23 09:13:32 np0005626463.localdomain sudo[134562]: pam_unix(sudo:session): session closed for user root
Feb 23 09:13:32 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54571 DF PROTO=TCP SPT=36690 DPT=9100 SEQ=2296867033 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE5DC070000000001030307) 
Feb 23 09:13:33 np0005626463.localdomain sudo[134656]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dfxorqnbaoybolcpddmgcltbapemxplv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838013.0971732-303-103086287873148/AnsiballZ_command.py
Feb 23 09:13:33 np0005626463.localdomain sudo[134656]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:13:33 np0005626463.localdomain python3.9[134658]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 09:13:34 np0005626463.localdomain sudo[134656]: pam_unix(sudo:session): session closed for user root
Feb 23 09:13:35 np0005626463.localdomain sudo[134901]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bfhppbzxtfpcsomakcaebkygurkmeydd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838014.6536236-327-223199017448325/AnsiballZ_file.py
Feb 23 09:13:35 np0005626463.localdomain sudo[134901]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:13:35 np0005626463.localdomain python3.9[134903]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None attributes=None
Feb 23 09:13:35 np0005626463.localdomain sudo[134901]: pam_unix(sudo:session): session closed for user root
Feb 23 09:13:36 np0005626463.localdomain python3.9[134993]: ansible-ansible.builtin.stat Invoked with path=/etc/cloud/cloud.cfg.d follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 23 09:13:36 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11740 DF PROTO=TCP SPT=35688 DPT=9101 SEQ=736404783 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE5E8C60000000001030307) 
Feb 23 09:13:36 np0005626463.localdomain sudo[135085]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-olajqjafntvidampgdxmzorkldbngdmo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838016.4584584-381-171578251315378/AnsiballZ_dnf.py
Feb 23 09:13:36 np0005626463.localdomain sudo[135085]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:13:36 np0005626463.localdomain python3.9[135087]: ansible-ansible.legacy.dnf Invoked with name=['os-net-config'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 23 09:13:39 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41875 DF PROTO=TCP SPT=45246 DPT=9882 SEQ=1921502582 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE5F5330000000001030307) 
Feb 23 09:13:40 np0005626463.localdomain sudo[135085]: pam_unix(sudo:session): session closed for user root
Feb 23 09:13:40 np0005626463.localdomain sudo[135179]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ialuefojoilfqogzfqkvgfctlrbgrzku ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838020.6382844-405-111719392339333/AnsiballZ_dnf.py
Feb 23 09:13:40 np0005626463.localdomain sudo[135179]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:13:41 np0005626463.localdomain python3.9[135181]: ansible-ansible.legacy.dnf Invoked with name=['openstack-network-scripts'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 23 09:13:41 np0005626463.localdomain sshd[135184]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:13:42 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41877 DF PROTO=TCP SPT=45246 DPT=9882 SEQ=1921502582 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE601470000000001030307) 
Feb 23 09:13:43 np0005626463.localdomain sshd[135184]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:13:44 np0005626463.localdomain sudo[135179]: pam_unix(sudo:session): session closed for user root
Feb 23 09:13:44 np0005626463.localdomain sshd[135200]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:13:44 np0005626463.localdomain sudo[135277]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rpdwtpbcbpxckpxabcldtqlgvpocrymn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838024.608075-429-82974337911834/AnsiballZ_systemd.py
Feb 23 09:13:44 np0005626463.localdomain sudo[135277]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:13:44 np0005626463.localdomain sshd[135200]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:13:45 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63096 DF PROTO=TCP SPT=49386 DPT=9105 SEQ=841001769 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE60C060000000001030307) 
Feb 23 09:13:45 np0005626463.localdomain python3.9[135279]: ansible-ansible.builtin.systemd Invoked with enabled=True name=network daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Feb 23 09:13:46 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 09:13:46 np0005626463.localdomain systemd-rc-local-generator[135305]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 09:13:46 np0005626463.localdomain systemd-sysv-generator[135308]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 09:13:46 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 09:13:46 np0005626463.localdomain sudo[135277]: pam_unix(sudo:session): session closed for user root
Feb 23 09:13:47 np0005626463.localdomain sudo[135409]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-grvkekxdkqnioakysgzotaarikefawhu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838026.9949126-459-172239100547064/AnsiballZ_stat.py
Feb 23 09:13:47 np0005626463.localdomain sudo[135409]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:13:47 np0005626463.localdomain python3.9[135411]: ansible-ansible.builtin.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 23 09:13:47 np0005626463.localdomain sudo[135409]: pam_unix(sudo:session): session closed for user root
Feb 23 09:13:48 np0005626463.localdomain sudo[135501]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aogkqyiyvbbzatsymkxykfncrcdyvivb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838027.7786837-486-276777284758061/AnsiballZ_ini_file.py
Feb 23 09:13:48 np0005626463.localdomain sudo[135501]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:13:48 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11742 DF PROTO=TCP SPT=35688 DPT=9101 SEQ=736404783 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE618060000000001030307) 
Feb 23 09:13:48 np0005626463.localdomain python3.9[135503]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=no-auto-default path=/etc/NetworkManager/NetworkManager.conf section=main state=present value=* exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:13:48 np0005626463.localdomain sudo[135501]: pam_unix(sudo:session): session closed for user root
Feb 23 09:13:48 np0005626463.localdomain sudo[135595]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zpnhqxnmntdsotzamffeegziolmoqexp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838028.637347-510-218124267858367/AnsiballZ_ini_file.py
Feb 23 09:13:48 np0005626463.localdomain sudo[135595]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:13:49 np0005626463.localdomain python3.9[135597]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/NetworkManager.conf section=main state=present value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:13:49 np0005626463.localdomain sudo[135595]: pam_unix(sudo:session): session closed for user root
Feb 23 09:13:49 np0005626463.localdomain sudo[135687]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ynniitczyfjggulgftilzzxjdyqakesv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838029.298962-534-162120891238930/AnsiballZ_ini_file.py
Feb 23 09:13:49 np0005626463.localdomain sudo[135687]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:13:49 np0005626463.localdomain python3.9[135689]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/NetworkManager.conf section=main state=present value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:13:49 np0005626463.localdomain sudo[135687]: pam_unix(sudo:session): session closed for user root
Feb 23 09:13:50 np0005626463.localdomain sudo[135779]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pxtrqjxotfxqgogqdiywyplmyyhbpuar ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838030.0737226-564-200397121333463/AnsiballZ_stat.py
Feb 23 09:13:50 np0005626463.localdomain sudo[135779]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:13:50 np0005626463.localdomain python3.9[135781]: ansible-ansible.legacy.stat Invoked with path=/etc/dhcp/dhclient-enter-hooks follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:13:50 np0005626463.localdomain sudo[135779]: pam_unix(sudo:session): session closed for user root
Feb 23 09:13:51 np0005626463.localdomain sudo[135852]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jcvtfgnmphbbqckjrqrxjnevjvixbjsg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838030.0737226-564-200397121333463/AnsiballZ_copy.py
Feb 23 09:13:51 np0005626463.localdomain sudo[135852]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:13:51 np0005626463.localdomain python3.9[135854]: ansible-ansible.legacy.copy Invoked with dest=/etc/dhcp/dhclient-enter-hooks mode=0755 src=/home/zuul/.ansible/tmp/ansible-tmp-1771838030.0737226-564-200397121333463/.source _original_basename=.rffe3863 follow=False checksum=f6278a40de79a9841f6ed1fc584538225566990c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:13:51 np0005626463.localdomain sudo[135852]: pam_unix(sudo:session): session closed for user root
Feb 23 09:13:52 np0005626463.localdomain sudo[135944]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hyenhggkdfmfuysguznplfuweevpdgzv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838031.8001516-609-2295018376626/AnsiballZ_file.py
Feb 23 09:13:52 np0005626463.localdomain sudo[135944]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:13:52 np0005626463.localdomain python3.9[135946]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/os-net-config state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:13:52 np0005626463.localdomain sudo[135944]: pam_unix(sudo:session): session closed for user root
Feb 23 09:13:52 np0005626463.localdomain sudo[136036]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wggmpcvglbcgpbkdafrqegjfbszubvbr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838032.4534662-633-72653541386607/AnsiballZ_edpm_os_net_config_mappings.py
Feb 23 09:13:52 np0005626463.localdomain sudo[136036]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:13:53 np0005626463.localdomain python3.9[136038]: ansible-edpm_os_net_config_mappings Invoked with net_config_data_lookup={}
Feb 23 09:13:53 np0005626463.localdomain sudo[136036]: pam_unix(sudo:session): session closed for user root
Feb 23 09:13:53 np0005626463.localdomain sudo[136128]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jwyurnhevfjnsmhrkooeipmauardvzkg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838033.2766123-660-139217122061321/AnsiballZ_file.py
Feb 23 09:13:53 np0005626463.localdomain sudo[136128]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:13:53 np0005626463.localdomain python3.9[136130]: ansible-ansible.builtin.file Invoked with path=/var/lib/edpm-config/scripts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:13:53 np0005626463.localdomain sudo[136128]: pam_unix(sudo:session): session closed for user root
Feb 23 09:13:54 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36092 DF PROTO=TCP SPT=37026 DPT=9102 SEQ=2924876512 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE62EB00000000001030307) 
Feb 23 09:13:54 np0005626463.localdomain sudo[136220]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zafwpdvcfcuddvrmnmkopjscvfklcpgj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838034.1024196-690-273551418196324/AnsiballZ_stat.py
Feb 23 09:13:54 np0005626463.localdomain sudo[136220]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:13:54 np0005626463.localdomain python3.9[136222]: ansible-ansible.legacy.stat Invoked with path=/etc/os-net-config/config.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:13:54 np0005626463.localdomain sudo[136220]: pam_unix(sudo:session): session closed for user root
Feb 23 09:13:54 np0005626463.localdomain sudo[136293]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mmdxxumzqyzjitycjmmnwkquwdpyswng ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838034.1024196-690-273551418196324/AnsiballZ_copy.py
Feb 23 09:13:54 np0005626463.localdomain sudo[136293]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:13:54 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41879 DF PROTO=TCP SPT=45246 DPT=9882 SEQ=1921502582 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE632060000000001030307) 
Feb 23 09:13:55 np0005626463.localdomain python3.9[136295]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/os-net-config/config.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771838034.1024196-690-273551418196324/.source.yaml _original_basename=.iim40c_8 follow=False checksum=0cadac3cfc033a4e07cfac59b43f6459e787700a force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:13:55 np0005626463.localdomain sudo[136293]: pam_unix(sudo:session): session closed for user root
Feb 23 09:13:55 np0005626463.localdomain sudo[136385]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lpnkbdplzmwzafdwzflqspbwlwuozkne ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838035.3730807-735-93985373502709/AnsiballZ_slurp.py
Feb 23 09:13:55 np0005626463.localdomain sudo[136385]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:13:55 np0005626463.localdomain python3.9[136387]: ansible-ansible.builtin.slurp Invoked with path=/etc/os-net-config/config.yaml src=/etc/os-net-config/config.yaml
Feb 23 09:13:55 np0005626463.localdomain sudo[136385]: pam_unix(sudo:session): session closed for user root
Feb 23 09:13:57 np0005626463.localdomain sudo[136490]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kcbdbpyxfhxpkhagzvetzdclgicomkwx ; ANSIBLE_ASYNC_DIR='~/.ansible_async' /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838036.3497324-762-23743892687750/async_wrapper.py j242000459938 300 /home/zuul/.ansible/tmp/ansible-tmp-1771838036.3497324-762-23743892687750/AnsiballZ_edpm_os_net_config.py _
Feb 23 09:13:57 np0005626463.localdomain sudo[136490]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:13:57 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36094 DF PROTO=TCP SPT=37026 DPT=9102 SEQ=2924876512 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE63AC60000000001030307) 
Feb 23 09:13:57 np0005626463.localdomain ansible-async_wrapper.py[136492]: Invoked with j242000459938 300 /home/zuul/.ansible/tmp/ansible-tmp-1771838036.3497324-762-23743892687750/AnsiballZ_edpm_os_net_config.py _
Feb 23 09:13:57 np0005626463.localdomain ansible-async_wrapper.py[136495]: Starting module and watcher
Feb 23 09:13:57 np0005626463.localdomain ansible-async_wrapper.py[136495]: Start watching 136496 (300)
Feb 23 09:13:57 np0005626463.localdomain ansible-async_wrapper.py[136496]: Start module (136496)
Feb 23 09:13:57 np0005626463.localdomain ansible-async_wrapper.py[136492]: Return async_wrapper task started.
Feb 23 09:13:57 np0005626463.localdomain sudo[136490]: pam_unix(sudo:session): session closed for user root
Feb 23 09:13:57 np0005626463.localdomain python3.9[136497]: ansible-edpm_os_net_config Invoked with cleanup=False config_file=/etc/os-net-config/config.yaml debug=True detailed_exit_codes=True remove_config=False safe_defaults=False use_nmstate=False purge_provider=
Feb 23 09:13:58 np0005626463.localdomain ansible-async_wrapper.py[136496]: Module complete (136496)
Feb 23 09:14:00 np0005626463.localdomain sudo[136512]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 09:14:00 np0005626463.localdomain sudo[136512]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:14:00 np0005626463.localdomain sudo[136512]: pam_unix(sudo:session): session closed for user root
Feb 23 09:14:01 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9324 DF PROTO=TCP SPT=40138 DPT=9100 SEQ=2835335867 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE649460000000001030307) 
Feb 23 09:14:01 np0005626463.localdomain sudo[136552]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Feb 23 09:14:01 np0005626463.localdomain sudo[136552]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:14:02 np0005626463.localdomain ansible-async_wrapper.py[136495]: Done in kid B.
Feb 23 09:14:02 np0005626463.localdomain podman[136658]: 2026-02-23 09:14:02.232297075 +0000 UTC m=+0.083676888 container exec fdf07215f0388d0ebc44f1f3744080ba594441e647c300d0dade62ff5beba234 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-crash-np0005626463, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, io.buildah.version=1.42.2, vcs-type=git, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, name=rhceph, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, release=1770267347, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, org.opencontainers.image.created=2026-02-09T10:25:24Z, build-date=2026-02-09T10:25:24Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Feb 23 09:14:02 np0005626463.localdomain podman[136658]: 2026-02-23 09:14:02.328560466 +0000 UTC m=+0.179940289 container exec_died fdf07215f0388d0ebc44f1f3744080ba594441e647c300d0dade62ff5beba234 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-crash-np0005626463, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, build-date=2026-02-09T10:25:24Z, ceph=True, org.opencontainers.image.created=2026-02-09T10:25:24Z, release=1770267347, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, GIT_CLEAN=True, io.buildah.version=1.42.2, architecture=x86_64, name=rhceph, RELEASE=main, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7)
Feb 23 09:14:02 np0005626463.localdomain sudo[136751]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fdoonxmsqpuzaweaqtbdexiawzjlyrei ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838041.4308765-762-208609872787106/AnsiballZ_async_status.py
Feb 23 09:14:02 np0005626463.localdomain sudo[136751]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:14:02 np0005626463.localdomain sudo[136552]: pam_unix(sudo:session): session closed for user root
Feb 23 09:14:02 np0005626463.localdomain python3.9[136762]: ansible-ansible.legacy.async_status Invoked with jid=j242000459938.136492 mode=status _async_dir=/root/.ansible_async
Feb 23 09:14:02 np0005626463.localdomain sudo[136751]: pam_unix(sudo:session): session closed for user root
Feb 23 09:14:02 np0005626463.localdomain sudo[136769]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 09:14:02 np0005626463.localdomain sudo[136769]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:14:02 np0005626463.localdomain sudo[136769]: pam_unix(sudo:session): session closed for user root
Feb 23 09:14:02 np0005626463.localdomain sudo[136810]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 23 09:14:02 np0005626463.localdomain sudo[136810]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:14:02 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9325 DF PROTO=TCP SPT=40138 DPT=9100 SEQ=2835335867 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE651460000000001030307) 
Feb 23 09:14:03 np0005626463.localdomain sudo[136855]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-onjxcylcyeliolruvefcxdpyizrhseni ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838041.4308765-762-208609872787106/AnsiballZ_async_status.py
Feb 23 09:14:03 np0005626463.localdomain sudo[136855]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:14:03 np0005626463.localdomain python3.9[136857]: ansible-ansible.legacy.async_status Invoked with jid=j242000459938.136492 mode=cleanup _async_dir=/root/.ansible_async
Feb 23 09:14:03 np0005626463.localdomain sudo[136855]: pam_unix(sudo:session): session closed for user root
Feb 23 09:14:03 np0005626463.localdomain sudo[136810]: pam_unix(sudo:session): session closed for user root
Feb 23 09:14:03 np0005626463.localdomain sudo[136978]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eowsypompqtajwgdimndxeaeqkniazsx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838043.4010186-828-258624170803688/AnsiballZ_stat.py
Feb 23 09:14:03 np0005626463.localdomain sudo[136978]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:14:03 np0005626463.localdomain python3.9[136980]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:14:03 np0005626463.localdomain sudo[136978]: pam_unix(sudo:session): session closed for user root
Feb 23 09:14:03 np0005626463.localdomain sudo[136981]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 09:14:03 np0005626463.localdomain sudo[136981]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:14:03 np0005626463.localdomain sudo[136981]: pam_unix(sudo:session): session closed for user root
Feb 23 09:14:04 np0005626463.localdomain sudo[137066]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ioktmuvzhhlnkvvebjnpfmpidltzeicy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838043.4010186-828-258624170803688/AnsiballZ_copy.py
Feb 23 09:14:04 np0005626463.localdomain sudo[137066]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:14:04 np0005626463.localdomain python3.9[137068]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/os-net-config.returncode mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771838043.4010186-828-258624170803688/.source.returncode _original_basename=.8p_ff5od follow=False checksum=b6589fc6ab0dc82cf12099d1c2d40ab994e8410c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:14:04 np0005626463.localdomain sudo[137066]: pam_unix(sudo:session): session closed for user root
Feb 23 09:14:04 np0005626463.localdomain sudo[137158]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-szwtrfscpuuxujarbwflhysshqjhoplv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838044.6354992-876-64034903805338/AnsiballZ_stat.py
Feb 23 09:14:04 np0005626463.localdomain sudo[137158]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:14:05 np0005626463.localdomain python3.9[137160]: ansible-ansible.legacy.stat Invoked with path=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:14:05 np0005626463.localdomain sudo[137158]: pam_unix(sudo:session): session closed for user root
Feb 23 09:14:05 np0005626463.localdomain sudo[137231]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-avgmotnalhdlgpsrkfcgqtbxqzymuzzc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838044.6354992-876-64034903805338/AnsiballZ_copy.py
Feb 23 09:14:05 np0005626463.localdomain sudo[137231]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:14:05 np0005626463.localdomain python3.9[137233]: ansible-ansible.legacy.copy Invoked with dest=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771838044.6354992-876-64034903805338/.source.cfg _original_basename=.xgh9r2sy follow=False checksum=f3c5952a9cd4c6c31b314b25eb897168971cc86e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:14:05 np0005626463.localdomain sudo[137231]: pam_unix(sudo:session): session closed for user root
Feb 23 09:14:06 np0005626463.localdomain sudo[137323]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ekkvrusrdpvtjilxbamccdtcchqyvcmu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838045.8724082-921-248712246169765/AnsiballZ_systemd.py
Feb 23 09:14:06 np0005626463.localdomain sudo[137323]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:14:06 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19951 DF PROTO=TCP SPT=41204 DPT=9101 SEQ=1234346066 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE65E070000000001030307) 
Feb 23 09:14:06 np0005626463.localdomain python3.9[137325]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 23 09:14:06 np0005626463.localdomain systemd[1]: Reloading Network Manager...
Feb 23 09:14:06 np0005626463.localdomain NetworkManager[5974]: <info>  [1771838046.5329] audit: op="reload" arg="0" pid=137329 uid=0 result="success"
Feb 23 09:14:06 np0005626463.localdomain NetworkManager[5974]: <info>  [1771838046.5337] config: signal: SIGHUP (no changes from disk)
Feb 23 09:14:06 np0005626463.localdomain systemd[1]: Reloaded Network Manager.
Feb 23 09:14:06 np0005626463.localdomain sudo[137323]: pam_unix(sudo:session): session closed for user root
Feb 23 09:14:07 np0005626463.localdomain sshd[133356]: pam_unix(sshd:session): session closed for user zuul
Feb 23 09:14:07 np0005626463.localdomain systemd[1]: session-40.scope: Deactivated successfully.
Feb 23 09:14:07 np0005626463.localdomain systemd[1]: session-40.scope: Consumed 38.299s CPU time.
Feb 23 09:14:07 np0005626463.localdomain systemd-logind[759]: Session 40 logged out. Waiting for processes to exit.
Feb 23 09:14:07 np0005626463.localdomain systemd-logind[759]: Removed session 40.
Feb 23 09:14:09 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36096 DF PROTO=TCP SPT=37026 DPT=9102 SEQ=2924876512 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE66A060000000001030307) 
Feb 23 09:14:11 np0005626463.localdomain sshd[137344]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:14:12 np0005626463.localdomain sshd[137344]: Accepted publickey for zuul from 192.168.122.30 port 57346 ssh2: RSA SHA256:/ShS2J5Dq7o9P59e/NmgQORSAcJOBwu46Huo03HBdB4
Feb 23 09:14:12 np0005626463.localdomain systemd-logind[759]: New session 41 of user zuul.
Feb 23 09:14:12 np0005626463.localdomain systemd[1]: Started Session 41 of User zuul.
Feb 23 09:14:12 np0005626463.localdomain sshd[137344]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Feb 23 09:14:12 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1061 DF PROTO=TCP SPT=49698 DPT=9882 SEQ=1864574002 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE676860000000001030307) 
Feb 23 09:14:13 np0005626463.localdomain python3.9[137437]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 23 09:14:14 np0005626463.localdomain python3.9[137531]: ansible-ansible.builtin.setup Invoked with filter=['ansible_default_ipv4'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 23 09:14:14 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33838 DF PROTO=TCP SPT=38584 DPT=9105 SEQ=3783356907 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE680060000000001030307) 
Feb 23 09:14:15 np0005626463.localdomain python3.9[137684]: ansible-ansible.legacy.command Invoked with _raw_params=hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 09:14:16 np0005626463.localdomain sshd[137344]: pam_unix(sshd:session): session closed for user zuul
Feb 23 09:14:16 np0005626463.localdomain systemd[1]: session-41.scope: Deactivated successfully.
Feb 23 09:14:16 np0005626463.localdomain systemd[1]: session-41.scope: Consumed 2.264s CPU time.
Feb 23 09:14:16 np0005626463.localdomain systemd-logind[759]: Session 41 logged out. Waiting for processes to exit.
Feb 23 09:14:16 np0005626463.localdomain systemd-logind[759]: Removed session 41.
Feb 23 09:14:18 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19953 DF PROTO=TCP SPT=41204 DPT=9101 SEQ=1234346066 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE68E070000000001030307) 
Feb 23 09:14:20 np0005626463.localdomain sshd[137700]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:14:21 np0005626463.localdomain sshd[137700]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:14:21 np0005626463.localdomain sshd[137702]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:14:21 np0005626463.localdomain sshd[137702]: Accepted publickey for zuul from 192.168.122.30 port 57354 ssh2: RSA SHA256:/ShS2J5Dq7o9P59e/NmgQORSAcJOBwu46Huo03HBdB4
Feb 23 09:14:21 np0005626463.localdomain systemd-logind[759]: New session 42 of user zuul.
Feb 23 09:14:21 np0005626463.localdomain systemd[1]: Started Session 42 of User zuul.
Feb 23 09:14:21 np0005626463.localdomain sshd[137702]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Feb 23 09:14:22 np0005626463.localdomain python3.9[137795]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 23 09:14:23 np0005626463.localdomain python3.9[137889]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 23 09:14:23 np0005626463.localdomain sudo[137983]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wunxacjxhwqzeswlisboranighjinzxy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838063.6916342-75-17621373903481/AnsiballZ_setup.py
Feb 23 09:14:24 np0005626463.localdomain sudo[137983]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:14:24 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41070 DF PROTO=TCP SPT=55522 DPT=9102 SEQ=3278640626 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE6A3E00000000001030307) 
Feb 23 09:14:24 np0005626463.localdomain python3.9[137985]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 23 09:14:24 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1063 DF PROTO=TCP SPT=49698 DPT=9882 SEQ=1864574002 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE6A6060000000001030307) 
Feb 23 09:14:24 np0005626463.localdomain sudo[137983]: pam_unix(sudo:session): session closed for user root
Feb 23 09:14:24 np0005626463.localdomain sshd[137994]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:14:24 np0005626463.localdomain sudo[138039]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pyqdefdssxuoqhauzetjolvrtvopioeb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838063.6916342-75-17621373903481/AnsiballZ_dnf.py
Feb 23 09:14:24 np0005626463.localdomain sudo[138039]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:14:25 np0005626463.localdomain sshd[137994]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:14:25 np0005626463.localdomain python3.9[138041]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 23 09:14:27 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41072 DF PROTO=TCP SPT=55522 DPT=9102 SEQ=3278640626 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE6B0070000000001030307) 
Feb 23 09:14:28 np0005626463.localdomain sudo[138039]: pam_unix(sudo:session): session closed for user root
Feb 23 09:14:28 np0005626463.localdomain sudo[138133]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hfrcsafmbiwbedipijijvevssqxlwmck ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838068.6369042-111-240466708423848/AnsiballZ_setup.py
Feb 23 09:14:28 np0005626463.localdomain sudo[138133]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:14:29 np0005626463.localdomain python3.9[138135]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 23 09:14:29 np0005626463.localdomain sudo[138133]: pam_unix(sudo:session): session closed for user root
Feb 23 09:14:30 np0005626463.localdomain sudo[138288]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qsygqkndqobfmbrxlezatbhqhqafklrh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838069.9206643-144-195819717196460/AnsiballZ_file.py
Feb 23 09:14:30 np0005626463.localdomain sudo[138288]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:14:30 np0005626463.localdomain python3.9[138290]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:14:30 np0005626463.localdomain sudo[138288]: pam_unix(sudo:session): session closed for user root
Feb 23 09:14:30 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13771 DF PROTO=TCP SPT=43006 DPT=9100 SEQ=4080666825 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE6BE460000000001030307) 
Feb 23 09:14:31 np0005626463.localdomain sudo[138380]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xbaelewosqavmrayqummoglshupqrken ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838070.7375212-168-121022466762537/AnsiballZ_command.py
Feb 23 09:14:31 np0005626463.localdomain sudo[138380]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:14:31 np0005626463.localdomain python3.9[138382]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 09:14:31 np0005626463.localdomain sudo[138380]: pam_unix(sudo:session): session closed for user root
Feb 23 09:14:32 np0005626463.localdomain sudo[138484]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lndcpnrguekxdkjxbojyxibjhzofmpon ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838071.6565552-192-192172080528793/AnsiballZ_stat.py
Feb 23 09:14:32 np0005626463.localdomain sudo[138484]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:14:32 np0005626463.localdomain python3.9[138486]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:14:32 np0005626463.localdomain sudo[138484]: pam_unix(sudo:session): session closed for user root
Feb 23 09:14:32 np0005626463.localdomain sudo[138532]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ccfiynkkpklocgafddqwuoerrzdxaimw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838071.6565552-192-192172080528793/AnsiballZ_file.py
Feb 23 09:14:32 np0005626463.localdomain sudo[138532]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:14:32 np0005626463.localdomain python3.9[138534]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/containers/networks/podman.json _original_basename=podman_network_config.j2 recurse=False state=file path=/etc/containers/networks/podman.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:14:32 np0005626463.localdomain sudo[138532]: pam_unix(sudo:session): session closed for user root
Feb 23 09:14:32 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13772 DF PROTO=TCP SPT=43006 DPT=9100 SEQ=4080666825 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE6C6460000000001030307) 
Feb 23 09:14:33 np0005626463.localdomain sudo[138624]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mloamdpahhrzlckbjtffscyaaimgbjoj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838072.89426-228-162723490204897/AnsiballZ_stat.py
Feb 23 09:14:33 np0005626463.localdomain sudo[138624]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:14:33 np0005626463.localdomain python3.9[138626]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:14:33 np0005626463.localdomain sudo[138624]: pam_unix(sudo:session): session closed for user root
Feb 23 09:14:33 np0005626463.localdomain sudo[138672]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pvghakowthhzwwwcoigctkhforgstzls ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838072.89426-228-162723490204897/AnsiballZ_file.py
Feb 23 09:14:33 np0005626463.localdomain sudo[138672]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:14:33 np0005626463.localdomain python3.9[138674]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf _original_basename=registries.conf.j2 recurse=False state=file path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 23 09:14:33 np0005626463.localdomain sudo[138672]: pam_unix(sudo:session): session closed for user root
Feb 23 09:14:34 np0005626463.localdomain sudo[138764]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tvduchxbvkdmrommlheipgudtidgjidw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838074.129476-267-84879595328090/AnsiballZ_ini_file.py
Feb 23 09:14:34 np0005626463.localdomain sudo[138764]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:14:34 np0005626463.localdomain python3.9[138766]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Feb 23 09:14:34 np0005626463.localdomain sudo[138764]: pam_unix(sudo:session): session closed for user root
Feb 23 09:14:35 np0005626463.localdomain sudo[138856]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fpgbmfbuxiirlmxvgfgwqisszqfcdvbs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838074.8978684-267-239353173476299/AnsiballZ_ini_file.py
Feb 23 09:14:35 np0005626463.localdomain sudo[138856]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:14:35 np0005626463.localdomain python3.9[138858]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Feb 23 09:14:35 np0005626463.localdomain sudo[138856]: pam_unix(sudo:session): session closed for user root
Feb 23 09:14:35 np0005626463.localdomain sudo[138948]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uuelmjpvgmiwrfygicauwngfgeykdffg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838075.4666982-267-157016356004856/AnsiballZ_ini_file.py
Feb 23 09:14:35 np0005626463.localdomain sudo[138948]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:14:35 np0005626463.localdomain python3.9[138950]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Feb 23 09:14:35 np0005626463.localdomain sudo[138948]: pam_unix(sudo:session): session closed for user root
Feb 23 09:14:36 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39239 DF PROTO=TCP SPT=37120 DPT=9101 SEQ=249785914 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE6D3460000000001030307) 
Feb 23 09:14:36 np0005626463.localdomain sudo[139040]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lsqxymzsmggsttutcqicvmpzclkwujeu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838076.06771-267-112684477708889/AnsiballZ_ini_file.py
Feb 23 09:14:36 np0005626463.localdomain sudo[139040]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:14:36 np0005626463.localdomain python3.9[139042]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Feb 23 09:14:36 np0005626463.localdomain sudo[139040]: pam_unix(sudo:session): session closed for user root
Feb 23 09:14:37 np0005626463.localdomain sudo[139132]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wuuutunxhngwgyllaegpldqctuyrwbdk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838076.7884722-360-12048955512852/AnsiballZ_dnf.py
Feb 23 09:14:37 np0005626463.localdomain sudo[139132]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:14:37 np0005626463.localdomain python3.9[139134]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 23 09:14:39 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64432 DF PROTO=TCP SPT=35190 DPT=9882 SEQ=2216662187 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE6DF930000000001030307) 
Feb 23 09:14:40 np0005626463.localdomain sudo[139132]: pam_unix(sudo:session): session closed for user root
Feb 23 09:14:41 np0005626463.localdomain sudo[139226]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fjueacqmlyvyovystgeldaulinsmbhzm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838080.9940364-393-278899996040951/AnsiballZ_setup.py
Feb 23 09:14:41 np0005626463.localdomain sudo[139226]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:14:41 np0005626463.localdomain python3.9[139228]: ansible-setup Invoked with gather_subset=['!all', '!min', 'distribution', 'distribution_major_version', 'distribution_version', 'os_family'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 23 09:14:41 np0005626463.localdomain sudo[139226]: pam_unix(sudo:session): session closed for user root
Feb 23 09:14:42 np0005626463.localdomain sudo[139320]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gubfjzbnxwekggjeifgsuvvoxzsgcckp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838081.756054-417-74941733512571/AnsiballZ_stat.py
Feb 23 09:14:42 np0005626463.localdomain sudo[139320]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:14:42 np0005626463.localdomain python3.9[139322]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 23 09:14:42 np0005626463.localdomain sudo[139320]: pam_unix(sudo:session): session closed for user root
Feb 23 09:14:42 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64434 DF PROTO=TCP SPT=35190 DPT=9882 SEQ=2216662187 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE6EB870000000001030307) 
Feb 23 09:14:42 np0005626463.localdomain sudo[139412]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sfdpatyrlfwxuxssvagjzcuobnzthvol ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838082.509287-444-6588739487044/AnsiballZ_stat.py
Feb 23 09:14:42 np0005626463.localdomain sudo[139412]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:14:43 np0005626463.localdomain python3.9[139414]: ansible-stat Invoked with path=/sbin/transactional-update follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 23 09:14:43 np0005626463.localdomain sudo[139412]: pam_unix(sudo:session): session closed for user root
Feb 23 09:14:43 np0005626463.localdomain sudo[139504]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pwhtfopgwrjhifcoorftdbbsaxwcubmb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838083.3232887-474-92327295665004/AnsiballZ_command.py
Feb 23 09:14:43 np0005626463.localdomain sudo[139504]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:14:43 np0005626463.localdomain python3.9[139506]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-system-running _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 09:14:43 np0005626463.localdomain sudo[139504]: pam_unix(sudo:session): session closed for user root
Feb 23 09:14:44 np0005626463.localdomain sudo[139597]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-asnxvpvutywafikvppavdmyimkqwaedp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838084.0874777-504-114183084294659/AnsiballZ_service_facts.py
Feb 23 09:14:44 np0005626463.localdomain sudo[139597]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:14:44 np0005626463.localdomain python3.9[139599]: ansible-service_facts Invoked
Feb 23 09:14:44 np0005626463.localdomain network[139616]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Feb 23 09:14:44 np0005626463.localdomain network[139617]: 'network-scripts' will be removed from distribution in near future.
Feb 23 09:14:44 np0005626463.localdomain network[139618]: It is advised to switch to 'NetworkManager' instead for network management.
Feb 23 09:14:45 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58914 DF PROTO=TCP SPT=58300 DPT=9105 SEQ=3528398000 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE6F6060000000001030307) 
Feb 23 09:14:45 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 09:14:46 np0005626463.localdomain sshd[139697]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:14:47 np0005626463.localdomain sudo[139597]: pam_unix(sudo:session): session closed for user root
Feb 23 09:14:48 np0005626463.localdomain sshd[139697]: Invalid user admin from 80.94.95.116 port 26950
Feb 23 09:14:48 np0005626463.localdomain sshd[139697]: Connection closed by invalid user admin 80.94.95.116 port 26950 [preauth]
Feb 23 09:14:48 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39241 DF PROTO=TCP SPT=37120 DPT=9101 SEQ=249785914 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE704070000000001030307) 
Feb 23 09:14:49 np0005626463.localdomain sudo[139832]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ceytfoaykuclugzxyunjfxxjrcirnqly ; /bin/bash /home/zuul/.ansible/tmp/ansible-tmp-1771838089.143044-549-256076533129774/AnsiballZ_timesync_provider.sh /home/zuul/.ansible/tmp/ansible-tmp-1771838089.143044-549-256076533129774/args
Feb 23 09:14:49 np0005626463.localdomain sudo[139832]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:14:49 np0005626463.localdomain sudo[139832]: pam_unix(sudo:session): session closed for user root
Feb 23 09:14:50 np0005626463.localdomain sudo[139939]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nxgwpkwodhzhcsngeqcibdannhamrmjp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838089.8381774-582-108987861242214/AnsiballZ_dnf.py
Feb 23 09:14:50 np0005626463.localdomain sudo[139939]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:14:50 np0005626463.localdomain python3.9[139941]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 23 09:14:53 np0005626463.localdomain sudo[139939]: pam_unix(sudo:session): session closed for user root
Feb 23 09:14:54 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43892 DF PROTO=TCP SPT=40478 DPT=9102 SEQ=428619448 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE7190F0000000001030307) 
Feb 23 09:14:54 np0005626463.localdomain sudo[140033]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ldvvaeyqpcjwcilirssljijbgyaqmjed ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838094.1735332-621-155781744388421/AnsiballZ_package_facts.py
Feb 23 09:14:54 np0005626463.localdomain sudo[140033]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:14:55 np0005626463.localdomain python3.9[140035]: ansible-package_facts Invoked with manager=['auto'] strategy=first
Feb 23 09:14:55 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64436 DF PROTO=TCP SPT=35190 DPT=9882 SEQ=2216662187 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE71CF90000000001030307) 
Feb 23 09:14:55 np0005626463.localdomain sudo[140033]: pam_unix(sudo:session): session closed for user root
Feb 23 09:14:56 np0005626463.localdomain sudo[140125]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rfegefpuftqfqotacdffzueuqpiyyozq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838096.005159-651-221120623885301/AnsiballZ_stat.py
Feb 23 09:14:56 np0005626463.localdomain sudo[140125]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:14:56 np0005626463.localdomain python3.9[140127]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:14:56 np0005626463.localdomain sudo[140125]: pam_unix(sudo:session): session closed for user root
Feb 23 09:14:57 np0005626463.localdomain sudo[140200]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bdxnpvrnndeuhtbquoupgouoqnvyzbul ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838096.005159-651-221120623885301/AnsiballZ_copy.py
Feb 23 09:14:57 np0005626463.localdomain sudo[140200]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:14:57 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43894 DF PROTO=TCP SPT=40478 DPT=9102 SEQ=428619448 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE725070000000001030307) 
Feb 23 09:14:57 np0005626463.localdomain python3.9[140202]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/chrony.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771838096.005159-651-221120623885301/.source.conf follow=False _original_basename=chrony.conf.j2 checksum=cfb003e56d02d0d2c65555452eb1a05073fecdad force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:14:57 np0005626463.localdomain sudo[140200]: pam_unix(sudo:session): session closed for user root
Feb 23 09:14:57 np0005626463.localdomain sudo[140294]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fheurswxhqjnkuurkqrenhcuftckkpmn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838097.5552204-696-107666057157854/AnsiballZ_stat.py
Feb 23 09:14:57 np0005626463.localdomain sudo[140294]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:14:58 np0005626463.localdomain python3.9[140296]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/chronyd follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:14:58 np0005626463.localdomain sudo[140294]: pam_unix(sudo:session): session closed for user root
Feb 23 09:14:58 np0005626463.localdomain sudo[140369]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rrehsxordefqncvkohftpusidiykcomq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838097.5552204-696-107666057157854/AnsiballZ_copy.py
Feb 23 09:14:58 np0005626463.localdomain sudo[140369]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:14:58 np0005626463.localdomain python3.9[140371]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/sysconfig/chronyd mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771838097.5552204-696-107666057157854/.source follow=False _original_basename=chronyd.sysconfig.j2 checksum=dd196b1ff1f915b23eebc37ec77405b5dd3df76c force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:14:58 np0005626463.localdomain sudo[140369]: pam_unix(sudo:session): session closed for user root
Feb 23 09:14:59 np0005626463.localdomain sudo[140463]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vtkpgzrnmaheubgssqvzcpfnykbwyqnc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838099.548676-759-156086611071181/AnsiballZ_lineinfile.py
Feb 23 09:14:59 np0005626463.localdomain sudo[140463]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:15:00 np0005626463.localdomain python3.9[140465]: ansible-lineinfile Invoked with backup=True create=True dest=/etc/sysconfig/network line=PEERNTP=no mode=0644 regexp=^PEERNTP= state=present path=/etc/sysconfig/network encoding=utf-8 backrefs=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:15:00 np0005626463.localdomain sudo[140463]: pam_unix(sudo:session): session closed for user root
Feb 23 09:15:00 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19694 DF PROTO=TCP SPT=38358 DPT=9100 SEQ=2257578880 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE733860000000001030307) 
Feb 23 09:15:01 np0005626463.localdomain sshd[140517]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:15:01 np0005626463.localdomain sudo[140558]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fpfqpkqhptbmcolgzwimtrganklnmqba ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838101.2963946-804-234703528855387/AnsiballZ_setup.py
Feb 23 09:15:01 np0005626463.localdomain sudo[140558]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:15:01 np0005626463.localdomain python3.9[140560]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 23 09:15:02 np0005626463.localdomain sudo[140558]: pam_unix(sudo:session): session closed for user root
Feb 23 09:15:02 np0005626463.localdomain sshd[140517]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:15:02 np0005626463.localdomain sudo[140613]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-myhafmgfodbanedhvibmiesfmqmvksxr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838101.2963946-804-234703528855387/AnsiballZ_systemd.py
Feb 23 09:15:02 np0005626463.localdomain sudo[140613]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:15:02 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19695 DF PROTO=TCP SPT=38358 DPT=9100 SEQ=2257578880 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE73B870000000001030307) 
Feb 23 09:15:03 np0005626463.localdomain python3.9[140615]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 09:15:04 np0005626463.localdomain sudo[140618]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 09:15:04 np0005626463.localdomain sudo[140618]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:15:04 np0005626463.localdomain sudo[140618]: pam_unix(sudo:session): session closed for user root
Feb 23 09:15:04 np0005626463.localdomain sudo[140613]: pam_unix(sudo:session): session closed for user root
Feb 23 09:15:04 np0005626463.localdomain sudo[140633]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 23 09:15:04 np0005626463.localdomain sudo[140633]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:15:04 np0005626463.localdomain sshd[140673]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:15:04 np0005626463.localdomain sudo[140633]: pam_unix(sudo:session): session closed for user root
Feb 23 09:15:05 np0005626463.localdomain sshd[140673]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:15:05 np0005626463.localdomain sudo[140771]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dixdozpcfwllvwybmgsonedhsfbcgmff ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838105.100869-852-209644876699907/AnsiballZ_setup.py
Feb 23 09:15:05 np0005626463.localdomain sudo[140771]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:15:05 np0005626463.localdomain sudo[140773]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 09:15:05 np0005626463.localdomain sudo[140773]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:15:05 np0005626463.localdomain sudo[140773]: pam_unix(sudo:session): session closed for user root
Feb 23 09:15:05 np0005626463.localdomain python3.9[140781]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 23 09:15:06 np0005626463.localdomain sudo[140771]: pam_unix(sudo:session): session closed for user root
Feb 23 09:15:06 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25763 DF PROTO=TCP SPT=33488 DPT=9101 SEQ=318659141 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE748860000000001030307) 
Feb 23 09:15:06 np0005626463.localdomain sudo[140840]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-idlkaylqbamkiatdtloaetlebkvvujry ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838105.100869-852-209644876699907/AnsiballZ_systemd.py
Feb 23 09:15:06 np0005626463.localdomain sudo[140840]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:15:06 np0005626463.localdomain python3.9[140842]: ansible-ansible.legacy.systemd Invoked with name=chronyd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 23 09:15:06 np0005626463.localdomain chronyd[25974]: chronyd exiting
Feb 23 09:15:06 np0005626463.localdomain systemd[1]: Stopping NTP client/server...
Feb 23 09:15:06 np0005626463.localdomain systemd[1]: chronyd.service: Deactivated successfully.
Feb 23 09:15:06 np0005626463.localdomain systemd[1]: Stopped NTP client/server.
Feb 23 09:15:06 np0005626463.localdomain systemd[1]: Starting NTP client/server...
Feb 23 09:15:06 np0005626463.localdomain chronyd[140850]: chronyd version 4.3 starting (+CMDMON +NTP +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +ASYNCDNS +NTS +SECHASH +IPV6 +DEBUG)
Feb 23 09:15:06 np0005626463.localdomain chronyd[140850]: Frequency -30.764 +/- 0.375 ppm read from /var/lib/chrony/drift
Feb 23 09:15:06 np0005626463.localdomain chronyd[140850]: Loaded seccomp filter (level 2)
Feb 23 09:15:06 np0005626463.localdomain systemd[1]: Started NTP client/server.
Feb 23 09:15:06 np0005626463.localdomain sudo[140840]: pam_unix(sudo:session): session closed for user root
Feb 23 09:15:07 np0005626463.localdomain sshd[137702]: pam_unix(sshd:session): session closed for user zuul
Feb 23 09:15:07 np0005626463.localdomain systemd[1]: session-42.scope: Deactivated successfully.
Feb 23 09:15:07 np0005626463.localdomain systemd[1]: session-42.scope: Consumed 29.434s CPU time.
Feb 23 09:15:07 np0005626463.localdomain systemd-logind[759]: Session 42 logged out. Waiting for processes to exit.
Feb 23 09:15:07 np0005626463.localdomain systemd-logind[759]: Removed session 42.
Feb 23 09:15:09 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47709 DF PROTO=TCP SPT=49170 DPT=9882 SEQ=3272426637 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE754C30000000001030307) 
Feb 23 09:15:12 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47711 DF PROTO=TCP SPT=49170 DPT=9882 SEQ=3272426637 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE760C60000000001030307) 
Feb 23 09:15:12 np0005626463.localdomain sshd[140866]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:15:12 np0005626463.localdomain sshd[140866]: Accepted publickey for zuul from 192.168.122.30 port 56436 ssh2: RSA SHA256:/ShS2J5Dq7o9P59e/NmgQORSAcJOBwu46Huo03HBdB4
Feb 23 09:15:12 np0005626463.localdomain systemd-logind[759]: New session 43 of user zuul.
Feb 23 09:15:12 np0005626463.localdomain systemd[1]: Started Session 43 of User zuul.
Feb 23 09:15:12 np0005626463.localdomain sshd[140866]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Feb 23 09:15:13 np0005626463.localdomain python3.9[140959]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 23 09:15:14 np0005626463.localdomain sudo[141053]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fzudwvqtnkmuxieqpunxszozfwagujra ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838114.406089-54-195371158053273/AnsiballZ_file.py
Feb 23 09:15:14 np0005626463.localdomain sudo[141053]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:15:15 np0005626463.localdomain python3.9[141055]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:15:15 np0005626463.localdomain sudo[141053]: pam_unix(sudo:session): session closed for user root
Feb 23 09:15:15 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13448 DF PROTO=TCP SPT=57840 DPT=9105 SEQ=1310932017 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE76C060000000001030307) 
Feb 23 09:15:15 np0005626463.localdomain sudo[141158]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jynedswlaurfolufgraistfxitbcalbw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838115.1753538-78-167313571206245/AnsiballZ_stat.py
Feb 23 09:15:15 np0005626463.localdomain sudo[141158]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:15:15 np0005626463.localdomain python3.9[141160]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:15:15 np0005626463.localdomain sudo[141158]: pam_unix(sudo:session): session closed for user root
Feb 23 09:15:16 np0005626463.localdomain sudo[141206]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ogmxzrmmxzojtdrvjmsqhjiqvuaabbht ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838115.1753538-78-167313571206245/AnsiballZ_file.py
Feb 23 09:15:16 np0005626463.localdomain sudo[141206]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:15:16 np0005626463.localdomain python3.9[141208]: ansible-ansible.legacy.file Invoked with group=zuul mode=0660 owner=zuul dest=/root/.config/containers/auth.json _original_basename=.bwt_xnhl recurse=False state=file path=/root/.config/containers/auth.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:15:16 np0005626463.localdomain sudo[141206]: pam_unix(sudo:session): session closed for user root
Feb 23 09:15:17 np0005626463.localdomain sudo[141298]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-touirazhbyikppffcbmlexconuckakzj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838116.8482292-138-210665163821427/AnsiballZ_stat.py
Feb 23 09:15:17 np0005626463.localdomain sudo[141298]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:15:17 np0005626463.localdomain python3.9[141300]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:15:17 np0005626463.localdomain sudo[141298]: pam_unix(sudo:session): session closed for user root
Feb 23 09:15:17 np0005626463.localdomain sudo[141373]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sdxyzfkipmwwykfjdprfozcyoucuuhfu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838116.8482292-138-210665163821427/AnsiballZ_copy.py
Feb 23 09:15:17 np0005626463.localdomain sudo[141373]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:15:18 np0005626463.localdomain python3.9[141375]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysconfig/podman_drop_in mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771838116.8482292-138-210665163821427/.source _original_basename=.n39d2bru follow=False checksum=125299ce8dea7711a76292961206447f0043248b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:15:18 np0005626463.localdomain sudo[141373]: pam_unix(sudo:session): session closed for user root
Feb 23 09:15:18 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25765 DF PROTO=TCP SPT=33488 DPT=9101 SEQ=318659141 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE778060000000001030307) 
Feb 23 09:15:18 np0005626463.localdomain sudo[141465]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ccnxztymjmsmcprcddvwhrbjbkwrzofc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838118.253512-186-136735103403441/AnsiballZ_file.py
Feb 23 09:15:18 np0005626463.localdomain sudo[141465]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:15:18 np0005626463.localdomain python3.9[141467]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 23 09:15:18 np0005626463.localdomain sudo[141465]: pam_unix(sudo:session): session closed for user root
Feb 23 09:15:19 np0005626463.localdomain sudo[141557]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gujcqbmzazofraxuvfmdcxznrtcrizid ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838118.8802204-210-248966440261134/AnsiballZ_stat.py
Feb 23 09:15:19 np0005626463.localdomain sudo[141557]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:15:19 np0005626463.localdomain python3.9[141559]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:15:19 np0005626463.localdomain sudo[141557]: pam_unix(sudo:session): session closed for user root
Feb 23 09:15:19 np0005626463.localdomain sudo[141630]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ddtsvvruyhbzmybnpwkolfhdskjueinn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838118.8802204-210-248966440261134/AnsiballZ_copy.py
Feb 23 09:15:19 np0005626463.localdomain sudo[141630]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:15:19 np0005626463.localdomain python3.9[141632]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-container-shutdown group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771838118.8802204-210-248966440261134/.source _original_basename=edpm-container-shutdown follow=False checksum=632c3792eb3dce4288b33ae7b265b71950d69f13 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Feb 23 09:15:19 np0005626463.localdomain sudo[141630]: pam_unix(sudo:session): session closed for user root
Feb 23 09:15:20 np0005626463.localdomain sudo[141722]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wxbzfnjweozscaukeveuacjgxvvguxii ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838120.0959597-210-252500718927505/AnsiballZ_stat.py
Feb 23 09:15:20 np0005626463.localdomain sudo[141722]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:15:20 np0005626463.localdomain python3.9[141724]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:15:20 np0005626463.localdomain sudo[141722]: pam_unix(sudo:session): session closed for user root
Feb 23 09:15:20 np0005626463.localdomain sudo[141795]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vhmeompjoeetfuupmdklpwizdalwgqhk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838120.0959597-210-252500718927505/AnsiballZ_copy.py
Feb 23 09:15:20 np0005626463.localdomain sudo[141795]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:15:21 np0005626463.localdomain python3.9[141797]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-start-podman-container group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771838120.0959597-210-252500718927505/.source _original_basename=edpm-start-podman-container follow=False checksum=b963c569d75a655c0ccae95d9bb4a2a9a4df27d1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Feb 23 09:15:21 np0005626463.localdomain sudo[141795]: pam_unix(sudo:session): session closed for user root
Feb 23 09:15:21 np0005626463.localdomain sudo[141887]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ncoknzgyyazxqtetyknwfjjlhxagdulp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838121.3064077-297-151792968644079/AnsiballZ_file.py
Feb 23 09:15:21 np0005626463.localdomain sudo[141887]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:15:21 np0005626463.localdomain python3.9[141889]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:15:21 np0005626463.localdomain sudo[141887]: pam_unix(sudo:session): session closed for user root
Feb 23 09:15:22 np0005626463.localdomain sudo[141979]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yqzbkxmcqaqoazbmmvrmakhyruslsspz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838121.9389346-321-262015738322171/AnsiballZ_stat.py
Feb 23 09:15:22 np0005626463.localdomain sudo[141979]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:15:22 np0005626463.localdomain python3.9[141981]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:15:22 np0005626463.localdomain sudo[141979]: pam_unix(sudo:session): session closed for user root
Feb 23 09:15:22 np0005626463.localdomain sudo[142052]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yedmzuvurgctyfqnsrpsqiqdheyxqfrd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838121.9389346-321-262015738322171/AnsiballZ_copy.py
Feb 23 09:15:22 np0005626463.localdomain sudo[142052]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:15:23 np0005626463.localdomain python3.9[142054]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm-container-shutdown.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771838121.9389346-321-262015738322171/.source.service _original_basename=edpm-container-shutdown-service follow=False checksum=6336835cb0f888670cc99de31e19c8c071444d33 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:15:23 np0005626463.localdomain sudo[142052]: pam_unix(sudo:session): session closed for user root
Feb 23 09:15:23 np0005626463.localdomain sudo[142144]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mtljzixwzmteiysosoociuladabcfqpl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838123.1881382-366-34374967328336/AnsiballZ_stat.py
Feb 23 09:15:23 np0005626463.localdomain sudo[142144]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:15:23 np0005626463.localdomain python3.9[142146]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:15:23 np0005626463.localdomain sudo[142144]: pam_unix(sudo:session): session closed for user root
Feb 23 09:15:24 np0005626463.localdomain sudo[142217]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-luwxgacbjcjxxdettssmyoappwssvkba ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838123.1881382-366-34374967328336/AnsiballZ_copy.py
Feb 23 09:15:24 np0005626463.localdomain sudo[142217]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:15:24 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15057 DF PROTO=TCP SPT=44814 DPT=9102 SEQ=461662555 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE78E400000000001030307) 
Feb 23 09:15:24 np0005626463.localdomain python3.9[142219]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771838123.1881382-366-34374967328336/.source.preset _original_basename=91-edpm-container-shutdown-preset follow=False checksum=b275e4375287528cb63464dd32f622c4f142a915 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:15:24 np0005626463.localdomain sudo[142217]: pam_unix(sudo:session): session closed for user root
Feb 23 09:15:24 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47713 DF PROTO=TCP SPT=49170 DPT=9882 SEQ=3272426637 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE790070000000001030307) 
Feb 23 09:15:25 np0005626463.localdomain sudo[142309]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-filoqgdeoktwbtzcxejvuxnrjdkdloqm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838124.3892546-411-162026080579480/AnsiballZ_systemd.py
Feb 23 09:15:25 np0005626463.localdomain sudo[142309]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:15:25 np0005626463.localdomain python3.9[142311]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 09:15:25 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 09:15:25 np0005626463.localdomain systemd-sysv-generator[142334]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 09:15:25 np0005626463.localdomain systemd-rc-local-generator[142328]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 09:15:25 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 09:15:25 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 09:15:25 np0005626463.localdomain systemd-rc-local-generator[142376]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 09:15:25 np0005626463.localdomain systemd-sysv-generator[142379]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 09:15:25 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 09:15:26 np0005626463.localdomain systemd[1]: Starting EDPM Container Shutdown...
Feb 23 09:15:26 np0005626463.localdomain systemd[1]: Finished EDPM Container Shutdown.
Feb 23 09:15:26 np0005626463.localdomain sudo[142309]: pam_unix(sudo:session): session closed for user root
Feb 23 09:15:26 np0005626463.localdomain sudo[142477]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dodienojjmgjdvqqbpkwglgdapccoqlb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838126.2247381-435-227566872375162/AnsiballZ_stat.py
Feb 23 09:15:26 np0005626463.localdomain sudo[142477]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:15:26 np0005626463.localdomain python3.9[142479]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:15:26 np0005626463.localdomain sudo[142477]: pam_unix(sudo:session): session closed for user root
Feb 23 09:15:27 np0005626463.localdomain sudo[142550]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-amttrmosrijubmsjlphsgcjbdlbcuflc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838126.2247381-435-227566872375162/AnsiballZ_copy.py
Feb 23 09:15:27 np0005626463.localdomain sudo[142550]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:15:27 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15059 DF PROTO=TCP SPT=44814 DPT=9102 SEQ=461662555 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE79A460000000001030307) 
Feb 23 09:15:27 np0005626463.localdomain python3.9[142552]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/netns-placeholder.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771838126.2247381-435-227566872375162/.source.service _original_basename=netns-placeholder-service follow=False checksum=b61b1b5918c20c877b8b226fbf34ff89a082d972 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:15:27 np0005626463.localdomain sudo[142550]: pam_unix(sudo:session): session closed for user root
Feb 23 09:15:27 np0005626463.localdomain sudo[142642]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zasfdnfiebjibgffrncewatzgisiyfjl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838127.4125059-480-110735526367212/AnsiballZ_stat.py
Feb 23 09:15:27 np0005626463.localdomain sudo[142642]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:15:27 np0005626463.localdomain python3.9[142644]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:15:27 np0005626463.localdomain sudo[142642]: pam_unix(sudo:session): session closed for user root
Feb 23 09:15:28 np0005626463.localdomain sudo[142715]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zxjkpgcvwautjpbmsykohmzmqmvnehfa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838127.4125059-480-110735526367212/AnsiballZ_copy.py
Feb 23 09:15:28 np0005626463.localdomain sudo[142715]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:15:28 np0005626463.localdomain python3.9[142717]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-netns-placeholder.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771838127.4125059-480-110735526367212/.source.preset _original_basename=91-netns-placeholder-preset follow=False checksum=28b7b9aa893525d134a1eeda8a0a48fb25b736b9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:15:28 np0005626463.localdomain sudo[142715]: pam_unix(sudo:session): session closed for user root
Feb 23 09:15:29 np0005626463.localdomain sudo[142807]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jwhbufemcynpyzfuwryeoqizjatxsjpb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838128.752381-525-153140051072432/AnsiballZ_systemd.py
Feb 23 09:15:29 np0005626463.localdomain sudo[142807]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:15:29 np0005626463.localdomain python3.9[142809]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 09:15:29 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 09:15:29 np0005626463.localdomain systemd-rc-local-generator[142829]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 09:15:29 np0005626463.localdomain systemd-sysv-generator[142834]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 09:15:29 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 09:15:29 np0005626463.localdomain systemd[1]: Starting Create netns directory...
Feb 23 09:15:29 np0005626463.localdomain systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Feb 23 09:15:29 np0005626463.localdomain systemd[1]: netns-placeholder.service: Deactivated successfully.
Feb 23 09:15:29 np0005626463.localdomain systemd[1]: Finished Create netns directory.
Feb 23 09:15:29 np0005626463.localdomain sudo[142807]: pam_unix(sudo:session): session closed for user root
Feb 23 09:15:30 np0005626463.localdomain python3.9[142940]: ansible-ansible.builtin.service_facts Invoked
Feb 23 09:15:30 np0005626463.localdomain network[142957]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Feb 23 09:15:30 np0005626463.localdomain network[142958]: 'network-scripts' will be removed from distribution in near future.
Feb 23 09:15:30 np0005626463.localdomain network[142959]: It is advised to switch to 'NetworkManager' instead for network management.
Feb 23 09:15:30 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36146 DF PROTO=TCP SPT=52040 DPT=9100 SEQ=1005657160 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE7A8C70000000001030307) 
Feb 23 09:15:32 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 09:15:32 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36147 DF PROTO=TCP SPT=52040 DPT=9100 SEQ=1005657160 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE7B0C60000000001030307) 
Feb 23 09:15:36 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1403 DF PROTO=TCP SPT=49826 DPT=9101 SEQ=901113687 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE7BD860000000001030307) 
Feb 23 09:15:36 np0005626463.localdomain sudo[143158]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zawjviwefxdkzhwnwlrmrurpzxsokdbb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838136.183265-603-107077308182534/AnsiballZ_stat.py
Feb 23 09:15:36 np0005626463.localdomain sudo[143158]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:15:36 np0005626463.localdomain python3.9[143160]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:15:36 np0005626463.localdomain sudo[143158]: pam_unix(sudo:session): session closed for user root
Feb 23 09:15:37 np0005626463.localdomain sudo[143233]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-akcbrgjczryzennrbarpcjifqabwtwyw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838136.183265-603-107077308182534/AnsiballZ_copy.py
Feb 23 09:15:37 np0005626463.localdomain sudo[143233]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:15:37 np0005626463.localdomain python3.9[143235]: ansible-ansible.legacy.copy Invoked with dest=/etc/ssh/sshd_config mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1771838136.183265-603-107077308182534/.source validate=/usr/sbin/sshd -T -f %s follow=False _original_basename=sshd_config_block.j2 checksum=6c79f4cb960ad444688fde322eeacb8402e22d79 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:15:37 np0005626463.localdomain sudo[143233]: pam_unix(sudo:session): session closed for user root
Feb 23 09:15:37 np0005626463.localdomain sshd[143259]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:15:37 np0005626463.localdomain sudo[143327]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tkoaxtwnduarftgmfkkvyqzjvhxpphiz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838137.4846272-648-224584566434328/AnsiballZ_systemd.py
Feb 23 09:15:37 np0005626463.localdomain sudo[143327]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:15:38 np0005626463.localdomain python3.9[143329]: ansible-ansible.builtin.systemd Invoked with name=sshd state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 23 09:15:38 np0005626463.localdomain systemd[1]: Reloading OpenSSH server daemon...
Feb 23 09:15:38 np0005626463.localdomain sshd[122114]: Received SIGHUP; restarting.
Feb 23 09:15:38 np0005626463.localdomain systemd[1]: Reloaded OpenSSH server daemon.
Feb 23 09:15:38 np0005626463.localdomain sshd[122114]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:15:38 np0005626463.localdomain sshd[122114]: Server listening on 0.0.0.0 port 22.
Feb 23 09:15:38 np0005626463.localdomain sshd[122114]: Server listening on :: port 22.
Feb 23 09:15:38 np0005626463.localdomain sudo[143327]: pam_unix(sudo:session): session closed for user root
Feb 23 09:15:38 np0005626463.localdomain sudo[143424]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kqmbhshvkjsrqosbecghtgcurqxwktsj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838138.6989288-672-200530722625121/AnsiballZ_file.py
Feb 23 09:15:38 np0005626463.localdomain sudo[143424]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:15:39 np0005626463.localdomain python3.9[143426]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:15:39 np0005626463.localdomain sudo[143424]: pam_unix(sudo:session): session closed for user root
Feb 23 09:15:39 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34233 DF PROTO=TCP SPT=37460 DPT=9882 SEQ=1628761315 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE7C9F30000000001030307) 
Feb 23 09:15:39 np0005626463.localdomain sudo[143516]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mnftehtlnhvtnxouzazwzxvcpdmgsoka ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838139.3459609-696-193298550226613/AnsiballZ_stat.py
Feb 23 09:15:39 np0005626463.localdomain sudo[143516]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:15:39 np0005626463.localdomain python3.9[143518]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/sshd-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:15:39 np0005626463.localdomain sudo[143516]: pam_unix(sudo:session): session closed for user root
Feb 23 09:15:40 np0005626463.localdomain sudo[143589]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dkorfepplixxxjrjcftnytiteomklbvu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838139.3459609-696-193298550226613/AnsiballZ_copy.py
Feb 23 09:15:40 np0005626463.localdomain sudo[143589]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:15:40 np0005626463.localdomain python3.9[143591]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/sshd-networks.yaml group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771838139.3459609-696-193298550226613/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=0bfc8440fd8f39002ab90252479fb794f51b5ae8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:15:40 np0005626463.localdomain sudo[143589]: pam_unix(sudo:session): session closed for user root
Feb 23 09:15:41 np0005626463.localdomain sshd[143259]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:15:41 np0005626463.localdomain sudo[143681]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uksgmksnpoqberqihqsliwwadgkdeaen ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838141.0504656-750-38632882681603/AnsiballZ_timezone.py
Feb 23 09:15:41 np0005626463.localdomain sudo[143681]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:15:41 np0005626463.localdomain python3.9[143683]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Feb 23 09:15:41 np0005626463.localdomain systemd[1]: Starting Time & Date Service...
Feb 23 09:15:41 np0005626463.localdomain systemd[1]: Started Time & Date Service.
Feb 23 09:15:41 np0005626463.localdomain sudo[143681]: pam_unix(sudo:session): session closed for user root
Feb 23 09:15:42 np0005626463.localdomain sudo[143777]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pzrwfebshpypvblrpguqxftqgjuliwzw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838142.1742477-777-60039961464850/AnsiballZ_file.py
Feb 23 09:15:42 np0005626463.localdomain sudo[143777]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:15:42 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34235 DF PROTO=TCP SPT=37460 DPT=9882 SEQ=1628761315 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE7D6060000000001030307) 
Feb 23 09:15:42 np0005626463.localdomain python3.9[143779]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:15:42 np0005626463.localdomain sudo[143777]: pam_unix(sudo:session): session closed for user root
Feb 23 09:15:43 np0005626463.localdomain sudo[143869]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bgctpuhtsawzrorsjvspgccwmsgncsmo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838142.8002887-801-231261638333350/AnsiballZ_stat.py
Feb 23 09:15:43 np0005626463.localdomain sudo[143869]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:15:43 np0005626463.localdomain python3.9[143871]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:15:43 np0005626463.localdomain sudo[143869]: pam_unix(sudo:session): session closed for user root
Feb 23 09:15:43 np0005626463.localdomain sshd[143942]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:15:43 np0005626463.localdomain sudo[143943]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gpywiprnfzwkxtuirywqdwkrhsxivbtg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838142.8002887-801-231261638333350/AnsiballZ_copy.py
Feb 23 09:15:43 np0005626463.localdomain sudo[143943]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:15:43 np0005626463.localdomain python3.9[143945]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771838142.8002887-801-231261638333350/.source.yaml follow=False _original_basename=base-rules.yaml.j2 checksum=450456afcafded6d4bdecceec7a02e806eebd8b3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:15:43 np0005626463.localdomain sudo[143943]: pam_unix(sudo:session): session closed for user root
Feb 23 09:15:44 np0005626463.localdomain sshd[143942]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:15:44 np0005626463.localdomain sudo[144036]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mggqfrzjfgffjgcjcitrlkwgfcvwqpfd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838143.9978013-846-148658485450048/AnsiballZ_stat.py
Feb 23 09:15:44 np0005626463.localdomain sudo[144036]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:15:44 np0005626463.localdomain python3.9[144038]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:15:44 np0005626463.localdomain sudo[144036]: pam_unix(sudo:session): session closed for user root
Feb 23 09:15:44 np0005626463.localdomain sudo[144109]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sralwldjumhyjjahrvjnsttaskvubpse ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838143.9978013-846-148658485450048/AnsiballZ_copy.py
Feb 23 09:15:44 np0005626463.localdomain sudo[144109]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:15:45 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8906 DF PROTO=TCP SPT=58462 DPT=9105 SEQ=299049209 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE7E0060000000001030307) 
Feb 23 09:15:45 np0005626463.localdomain python3.9[144111]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771838143.9978013-846-148658485450048/.source.yaml _original_basename=.1a7jpgsw follow=False checksum=97d170e1550eee4afc0af065b78cda302a97674c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:15:45 np0005626463.localdomain sudo[144109]: pam_unix(sudo:session): session closed for user root
Feb 23 09:15:45 np0005626463.localdomain sudo[144201]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qnuitepiwzxdffsrxaqasmgxhtsmroqy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838145.2218115-891-96936750585835/AnsiballZ_stat.py
Feb 23 09:15:45 np0005626463.localdomain sudo[144201]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:15:45 np0005626463.localdomain python3.9[144203]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:15:45 np0005626463.localdomain sudo[144201]: pam_unix(sudo:session): session closed for user root
Feb 23 09:15:46 np0005626463.localdomain sudo[144276]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mjblrhcevmcypilkpeckcttpynkyxzcl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838145.2218115-891-96936750585835/AnsiballZ_copy.py
Feb 23 09:15:46 np0005626463.localdomain sudo[144276]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:15:46 np0005626463.localdomain python3.9[144278]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/iptables.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771838145.2218115-891-96936750585835/.source.nft _original_basename=iptables.nft follow=False checksum=3e02df08f1f3ab4a513e94056dbd390e3d38fe30 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:15:46 np0005626463.localdomain sudo[144276]: pam_unix(sudo:session): session closed for user root
Feb 23 09:15:46 np0005626463.localdomain sudo[144368]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bmfwimjqqofvwcikxosszurlrpibrwyz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838146.3963206-936-8499921407781/AnsiballZ_command.py
Feb 23 09:15:46 np0005626463.localdomain sudo[144368]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:15:47 np0005626463.localdomain python3.9[144370]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/iptables.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 09:15:47 np0005626463.localdomain sudo[144368]: pam_unix(sudo:session): session closed for user root
Feb 23 09:15:47 np0005626463.localdomain sudo[144461]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-brxyxoabicwlcjusobzhgabygejbvmzt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838147.440685-960-53212457307010/AnsiballZ_command.py
Feb 23 09:15:47 np0005626463.localdomain sudo[144461]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:15:47 np0005626463.localdomain python3.9[144463]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 09:15:47 np0005626463.localdomain sudo[144461]: pam_unix(sudo:session): session closed for user root
Feb 23 09:15:48 np0005626463.localdomain sudo[144554]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bmvkqtkeeqgtamauqhwzhmqiaurfpwkz ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1771838148.0905674-984-208981892839005/AnsiballZ_edpm_nftables_from_files.py
Feb 23 09:15:48 np0005626463.localdomain sudo[144554]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:15:48 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1405 DF PROTO=TCP SPT=49826 DPT=9101 SEQ=901113687 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE7EE060000000001030307) 
Feb 23 09:15:48 np0005626463.localdomain python3[144556]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Feb 23 09:15:48 np0005626463.localdomain sudo[144554]: pam_unix(sudo:session): session closed for user root
Feb 23 09:15:49 np0005626463.localdomain sudo[144646]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mzvlxstydxcybytulqryjhatfhczison ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838148.9605737-1008-266884096942049/AnsiballZ_stat.py
Feb 23 09:15:49 np0005626463.localdomain sudo[144646]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:15:49 np0005626463.localdomain python3.9[144648]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:15:49 np0005626463.localdomain sudo[144646]: pam_unix(sudo:session): session closed for user root
Feb 23 09:15:49 np0005626463.localdomain sudo[144719]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qagmvvohkoezjmoojhfzmxpntqdgxrhc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838148.9605737-1008-266884096942049/AnsiballZ_copy.py
Feb 23 09:15:49 np0005626463.localdomain sudo[144719]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:15:50 np0005626463.localdomain python3.9[144721]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771838148.9605737-1008-266884096942049/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:15:50 np0005626463.localdomain sudo[144719]: pam_unix(sudo:session): session closed for user root
Feb 23 09:15:50 np0005626463.localdomain sudo[144811]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-idriajrgtccknfvczjhdwryzrsircaot ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838150.2007642-1053-79509395947350/AnsiballZ_stat.py
Feb 23 09:15:50 np0005626463.localdomain sudo[144811]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:15:50 np0005626463.localdomain python3.9[144813]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:15:50 np0005626463.localdomain sudo[144811]: pam_unix(sudo:session): session closed for user root
Feb 23 09:15:51 np0005626463.localdomain sudo[144884]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fscdbsiumfkbmpgcmgrsasndqrpgxyrn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838150.2007642-1053-79509395947350/AnsiballZ_copy.py
Feb 23 09:15:51 np0005626463.localdomain sudo[144884]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:15:51 np0005626463.localdomain python3.9[144886]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771838150.2007642-1053-79509395947350/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:15:51 np0005626463.localdomain sudo[144884]: pam_unix(sudo:session): session closed for user root
Feb 23 09:15:51 np0005626463.localdomain sudo[144976]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qaxzsscbrtevlymufjhzbshneexwitvb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838151.4395404-1098-24111126058740/AnsiballZ_stat.py
Feb 23 09:15:51 np0005626463.localdomain sudo[144976]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:15:51 np0005626463.localdomain python3.9[144978]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:15:51 np0005626463.localdomain sudo[144976]: pam_unix(sudo:session): session closed for user root
Feb 23 09:15:52 np0005626463.localdomain sudo[145049]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-isqtdpyhilxrsfqflnfbklqjqswymiwo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838151.4395404-1098-24111126058740/AnsiballZ_copy.py
Feb 23 09:15:52 np0005626463.localdomain sudo[145049]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:15:52 np0005626463.localdomain python3.9[145051]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771838151.4395404-1098-24111126058740/.source.nft follow=False _original_basename=flush-chain.j2 checksum=d16337256a56373421842284fe09e4e6c7df417e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:15:52 np0005626463.localdomain sudo[145049]: pam_unix(sudo:session): session closed for user root
Feb 23 09:15:52 np0005626463.localdomain sudo[145141]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gjdjcbpwaflblqkpfgoudvpazuywzsqf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838152.6633744-1143-228957053568640/AnsiballZ_stat.py
Feb 23 09:15:52 np0005626463.localdomain sudo[145141]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:15:53 np0005626463.localdomain python3.9[145143]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:15:53 np0005626463.localdomain sudo[145141]: pam_unix(sudo:session): session closed for user root
Feb 23 09:15:53 np0005626463.localdomain sudo[145214]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mgwnatdqocygackkjqinarbgolslprxx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838152.6633744-1143-228957053568640/AnsiballZ_copy.py
Feb 23 09:15:53 np0005626463.localdomain sudo[145214]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:15:53 np0005626463.localdomain python3.9[145216]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771838152.6633744-1143-228957053568640/.source.nft follow=False _original_basename=chains.j2 checksum=2079f3b60590a165d1d502e763170876fc8e2984 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:15:53 np0005626463.localdomain sudo[145214]: pam_unix(sudo:session): session closed for user root
Feb 23 09:15:54 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11804 DF PROTO=TCP SPT=42786 DPT=9102 SEQ=2336433841 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE803700000000001030307) 
Feb 23 09:15:54 np0005626463.localdomain sudo[145306]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tjufcltdvjofvmugcldunzrgriblzgzy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838153.8990192-1188-61025452854529/AnsiballZ_stat.py
Feb 23 09:15:54 np0005626463.localdomain sudo[145306]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:15:54 np0005626463.localdomain python3.9[145308]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:15:54 np0005626463.localdomain sudo[145306]: pam_unix(sudo:session): session closed for user root
Feb 23 09:15:54 np0005626463.localdomain sudo[145379]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ubgddvldbjgcbwedzghbevfjqhignfgg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838153.8990192-1188-61025452854529/AnsiballZ_copy.py
Feb 23 09:15:54 np0005626463.localdomain sudo[145379]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:15:54 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34237 DF PROTO=TCP SPT=37460 DPT=9882 SEQ=1628761315 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE806070000000001030307) 
Feb 23 09:15:54 np0005626463.localdomain python3.9[145381]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771838153.8990192-1188-61025452854529/.source.nft follow=False _original_basename=ruleset.j2 checksum=15a82a0dc61abfd6aa593407582b5b950437eb80 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:15:54 np0005626463.localdomain sudo[145379]: pam_unix(sudo:session): session closed for user root
Feb 23 09:15:55 np0005626463.localdomain sudo[145471]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kcrtpuxzahxbfftllvqqakxjflrmzcvt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838155.1792393-1233-212843993318186/AnsiballZ_file.py
Feb 23 09:15:55 np0005626463.localdomain sudo[145471]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:15:55 np0005626463.localdomain python3.9[145473]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:15:55 np0005626463.localdomain sudo[145471]: pam_unix(sudo:session): session closed for user root
Feb 23 09:15:56 np0005626463.localdomain sudo[145563]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-drdudxypufvizsclnebidqlxyjxsbets ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838156.0915127-1257-159897948698523/AnsiballZ_command.py
Feb 23 09:15:56 np0005626463.localdomain sudo[145563]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:15:56 np0005626463.localdomain python3.9[145565]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 09:15:56 np0005626463.localdomain sudo[145563]: pam_unix(sudo:session): session closed for user root
Feb 23 09:15:57 np0005626463.localdomain sudo[145658]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hrivetkgvsaqbmfikznylarwmdovudns ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838156.7746491-1281-122056332903171/AnsiballZ_blockinfile.py
Feb 23 09:15:57 np0005626463.localdomain sudo[145658]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:15:57 np0005626463.localdomain python3.9[145660]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                                            include "/etc/nftables/edpm-chains.nft"
                                                            include "/etc/nftables/edpm-rules.nft"
                                                            include "/etc/nftables/edpm-jumps.nft"
                                                             path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:15:57 np0005626463.localdomain sudo[145658]: pam_unix(sudo:session): session closed for user root
Feb 23 09:15:57 np0005626463.localdomain sudo[145751]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-boojgsoqahlljcvanncqopuqaxiccvxl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838157.7021575-1308-179824089842201/AnsiballZ_file.py
Feb 23 09:15:57 np0005626463.localdomain sudo[145751]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:15:58 np0005626463.localdomain python3.9[145753]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages1G state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:15:58 np0005626463.localdomain sudo[145751]: pam_unix(sudo:session): session closed for user root
Feb 23 09:15:58 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43898 DF PROTO=TCP SPT=40478 DPT=9102 SEQ=428619448 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE814070000000001030307) 
Feb 23 09:15:58 np0005626463.localdomain sudo[145843]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qqwgtbxpohoyjkisbnuoryxqojeomssf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838158.2987382-1308-200767934585739/AnsiballZ_file.py
Feb 23 09:15:58 np0005626463.localdomain sudo[145843]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:15:58 np0005626463.localdomain python3.9[145845]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages2M state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:15:58 np0005626463.localdomain sudo[145843]: pam_unix(sudo:session): session closed for user root
Feb 23 09:15:59 np0005626463.localdomain sudo[145935]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-svieqbwkyyhuvsncdlqkutvrlroomvar ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838158.9798074-1353-205270844353179/AnsiballZ_mount.py
Feb 23 09:15:59 np0005626463.localdomain sudo[145935]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:15:59 np0005626463.localdomain python3.9[145937]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=1G path=/dev/hugepages1G src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Feb 23 09:15:59 np0005626463.localdomain sudo[145935]: pam_unix(sudo:session): session closed for user root
Feb 23 09:16:00 np0005626463.localdomain sudo[146028]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-izkjznlbtwulkhbuhluypkizwitxwosr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838159.9628637-1353-156295551639816/AnsiballZ_mount.py
Feb 23 09:16:00 np0005626463.localdomain sudo[146028]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:16:00 np0005626463.localdomain python3.9[146030]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=2M path=/dev/hugepages2M src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Feb 23 09:16:00 np0005626463.localdomain sudo[146028]: pam_unix(sudo:session): session closed for user root
Feb 23 09:16:00 np0005626463.localdomain sshd[140866]: pam_unix(sshd:session): session closed for user zuul
Feb 23 09:16:00 np0005626463.localdomain systemd[1]: session-43.scope: Deactivated successfully.
Feb 23 09:16:00 np0005626463.localdomain systemd-logind[759]: Session 43 logged out. Waiting for processes to exit.
Feb 23 09:16:00 np0005626463.localdomain systemd[1]: session-43.scope: Consumed 30.023s CPU time.
Feb 23 09:16:00 np0005626463.localdomain systemd-logind[759]: Removed session 43.
Feb 23 09:16:01 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36150 DF PROTO=TCP SPT=52040 DPT=9100 SEQ=1005657160 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE820070000000001030307) 
Feb 23 09:16:03 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45855 DF PROTO=TCP SPT=45286 DPT=9101 SEQ=1933992049 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE826CC0000000001030307) 
Feb 23 09:16:05 np0005626463.localdomain sudo[146047]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 09:16:05 np0005626463.localdomain sudo[146047]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:16:05 np0005626463.localdomain sudo[146047]: pam_unix(sudo:session): session closed for user root
Feb 23 09:16:05 np0005626463.localdomain sudo[146062]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 23 09:16:05 np0005626463.localdomain sudo[146062]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:16:06 np0005626463.localdomain sudo[146062]: pam_unix(sudo:session): session closed for user root
Feb 23 09:16:06 np0005626463.localdomain sshd[146108]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:16:06 np0005626463.localdomain sshd[146108]: Accepted publickey for zuul from 192.168.122.30 port 47672 ssh2: RSA SHA256:/ShS2J5Dq7o9P59e/NmgQORSAcJOBwu46Huo03HBdB4
Feb 23 09:16:06 np0005626463.localdomain systemd-logind[759]: New session 44 of user zuul.
Feb 23 09:16:06 np0005626463.localdomain systemd[1]: Started Session 44 of User zuul.
Feb 23 09:16:06 np0005626463.localdomain sshd[146108]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Feb 23 09:16:06 np0005626463.localdomain sudo[146125]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 09:16:06 np0005626463.localdomain sudo[146125]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:16:06 np0005626463.localdomain sudo[146125]: pam_unix(sudo:session): session closed for user root
Feb 23 09:16:07 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25767 DF PROTO=TCP SPT=33488 DPT=9101 SEQ=318659141 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE836060000000001030307) 
Feb 23 09:16:07 np0005626463.localdomain sudo[146216]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jqmpyocyeztznuftwuouepehgjbktdrj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838166.8757694-21-82523222210445/AnsiballZ_tempfile.py
Feb 23 09:16:07 np0005626463.localdomain sudo[146216]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:16:07 np0005626463.localdomain python3.9[146218]: ansible-ansible.builtin.tempfile Invoked with state=file prefix=ansible. suffix= path=None
Feb 23 09:16:07 np0005626463.localdomain sudo[146216]: pam_unix(sudo:session): session closed for user root
Feb 23 09:16:08 np0005626463.localdomain sudo[146308]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zzbeyrdcmsekoarnjvmpyumejszfqlcr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838168.481634-93-113688304281658/AnsiballZ_stat.py
Feb 23 09:16:08 np0005626463.localdomain sudo[146308]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:16:09 np0005626463.localdomain python3.9[146310]: ansible-ansible.builtin.stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 23 09:16:09 np0005626463.localdomain sudo[146308]: pam_unix(sudo:session): session closed for user root
Feb 23 09:16:09 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29843 DF PROTO=TCP SPT=43632 DPT=9882 SEQ=163079574 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE83F230000000001030307) 
Feb 23 09:16:10 np0005626463.localdomain sudo[146402]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vqbgskqlncivuhhnpjydkvwlzzkhvoxn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838169.809688-141-214631956650321/AnsiballZ_slurp.py
Feb 23 09:16:10 np0005626463.localdomain sudo[146402]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:16:10 np0005626463.localdomain python3.9[146404]: ansible-ansible.builtin.slurp Invoked with src=/etc/ssh/ssh_known_hosts
Feb 23 09:16:10 np0005626463.localdomain sudo[146402]: pam_unix(sudo:session): session closed for user root
Feb 23 09:16:11 np0005626463.localdomain sudo[146494]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rzbyyrajynykdqoxmndzescrntdwqmbn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838171.1422215-189-233192297255079/AnsiballZ_stat.py
Feb 23 09:16:11 np0005626463.localdomain sudo[146494]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:16:11 np0005626463.localdomain python3.9[146496]: ansible-ansible.legacy.stat Invoked with path=/tmp/ansible.w2vcy2ch follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:16:11 np0005626463.localdomain sudo[146494]: pam_unix(sudo:session): session closed for user root
Feb 23 09:16:11 np0005626463.localdomain systemd[1]: systemd-timedated.service: Deactivated successfully.
Feb 23 09:16:12 np0005626463.localdomain sudo[146571]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-latzludrygfkphfwfukhhxdipkyypout ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838171.1422215-189-233192297255079/AnsiballZ_copy.py
Feb 23 09:16:12 np0005626463.localdomain sudo[146571]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:16:12 np0005626463.localdomain python3.9[146573]: ansible-ansible.legacy.copy Invoked with dest=/tmp/ansible.w2vcy2ch mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771838171.1422215-189-233192297255079/.source.w2vcy2ch _original_basename=.iiut_32_ follow=False checksum=d1d6d40786432d7ee1aec581e269930dfc2795e6 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:16:12 np0005626463.localdomain sudo[146571]: pam_unix(sudo:session): session closed for user root
Feb 23 09:16:13 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47715 DF PROTO=TCP SPT=49170 DPT=9882 SEQ=3272426637 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE84E070000000001030307) 
Feb 23 09:16:14 np0005626463.localdomain sudo[146663]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aajlmuoxwteqeeqwgpvgyglecdnzkmbf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838173.558852-279-242604252905193/AnsiballZ_setup.py
Feb 23 09:16:14 np0005626463.localdomain sudo[146663]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:16:14 np0005626463.localdomain python3.9[146665]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'ssh_host_key_rsa_public', 'ssh_host_key_ed25519_public', 'ssh_host_key_ecdsa_public'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 23 09:16:14 np0005626463.localdomain sudo[146663]: pam_unix(sudo:session): session closed for user root
Feb 23 09:16:15 np0005626463.localdomain sshd[146721]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:16:15 np0005626463.localdomain sudo[146757]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wtgjmeogijxuflcszxrewbsnnkdskvoe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838175.3131936-328-262655576708345/AnsiballZ_blockinfile.py
Feb 23 09:16:15 np0005626463.localdomain sudo[146757]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:16:15 np0005626463.localdomain python3.9[146759]: ansible-ansible.builtin.blockinfile Invoked with block=np0005626466.localdomain,192.168.122.108,np0005626466* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQD4dg5LfbOyIHJudQjfDyIcqYXRqMUeYQIpjQPmNS0Tl7/EpBaYixjqlNovKIWOwkS4E2n4hwPLSTGSihYb5BeUDw32T80RumycS2tjBCSLiuq93xpTOaL2X+7wykkOSfY5xya13qrTg0ROJip0B6PSSF+Rn28SAKLh91euCdRaxWTAMeOSTP9WeCA3d0gsgb4xSMMWZxR4o1BU2bixjAcJHAlKYDc1OGpKkirRoziu9Y4nq2lmbwTg5HiS8STVkqyGHba9k6IC0eF2ZmT6M2thoHatYVtjuUeEE9bSvaAFB8oSI9Np6+OaluvuoKJYjRA3dzEQOi4ft/wwUrJfvyypDAxKBkxo7lCWIDEBK5Zb9BVoo68psz2IVPNGNZJtKXiq58CAqZTR02l/wEq4wB1/hp7ZW+ZMnHQUq1FpGITIA89KZeL9xNlnHqYak58B2GCYgK6OdvWktr4WHN8nbEmwZvaTrijZvnww7h2FQG4BMcSlO6AWKAdjksJZlVDYLJs=
                                                            np0005626466.localdomain,192.168.122.108,np0005626466* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIIiaRdmYDJrMg8atO+fnuqzJdDL1JaVGt341/g0QTv04
                                                            np0005626466.localdomain,192.168.122.108,np0005626466* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBGMyxprJk2KMNU4/eWUo8EdX2W79HO4pGHl3Ze8LEhDdSbCzY8uy6KD6met+RL0bD767zsXbqEV/9peHg1x5qjM=
                                                            np0005626459.localdomain,192.168.122.103,np0005626459* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC9VsrIfV6Z4AiMtHfmjOpcBCt5sMsGmP0fOSak1UBP4r9lW4eYyoJY7Rtt1LDAcbGqdL3Nh3yc8ub0ekpXF6MA0vKucLb+jtjexv6t21W2grJ+ucwsvDhTDhDXmOUwD5G7A9Zj2WDqt/DN4DxeEqvQ6v1dSQaG+17BVPvM7mhgd5CSYOdUphCC81TPZgj3xyK31Q89biIS6pCBSKnsyN7qcU38bFGvRN0sTFaFt9KrIUfJJdcAZudw5Q/R775pmaaeHTSVPL05gE7dyz8RicEpenh6X0aZCOVt0+4VBnfXXSIL9QIwjrarPPKRdtmQY7dZ3dVNI1ZWA5YOl0y6R3fmxaRV5y1ZkDW6vG0463hYjKaAVqILAAPZGzhuzL7/1zxIv0guUB58tOUrCkkPIRzd6NQLL2j8L7RLIj3bZjG2xf0WiierxPsCEhl3wmdIVRUReE6jYalNGlscGUr1JWproKoaQqfck0OWhGy7jCCe8Gd8a/pr7jtg+X3bEMQ3HAc=
                                                            np0005626459.localdomain,192.168.122.103,np0005626459* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIB1G62+/VP1cWp/d17CbWxlG5w4IEqmUSSc9SyShSsKo
                                                            np0005626459.localdomain,192.168.122.103,np0005626459* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBF1G+CYZWMPROBz875F8bjcexPOeozjteUw/Fu+xHwwpYK4DPmCNq+JbW1AmCaltVkHRnMMPqLBom+3c+ekTh4E=
                                                            np0005626460.localdomain,192.168.122.104,np0005626460* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCeQmwl5IUCA7h6xphf+o3WARi0Xlj+0K08ltN/FCX7iF0EALCfDqtKOHz7wv5gS04Zx4aeNfcVHv9bHLRJxTPzliSNVutqA7vdFa0R/kRMdNzkqSOCuJ64sQ8GwSOHSrcFy7qC87BuP6xB9atSBjpAEB4NZOuXbvmSN/dCa/nNpUWoWNNg3eR5AalrExCptFYZ4E7YWvJ6HdZpr1QhcAJW0V1y4+u4FfzxHT2SQfGmua4TFHH1lUMiMrgAoELLe+pYdnWooEhRlkPulWy/wOyNz7aCCDP462XBhCc0CmiBDRwMBaJISck1pJCOIksvu8TYa6Fp8aayZqJvbUJYl5C1Z/o+zgHMTjeec0Th5GIuw9XUJkkx8TT5Fh7aWJvX9BbHlMaJjAqc+G/wiIImvKlsuIsovU6TH0P/XiysoWXeUWM7JqR8Y/05+yELy+xAMKT7PfEXE1fWOlGcCJsarLYGhh/7Jypwfh8Y/wOtYdKOGODxDnzq2f2VySsEiAf0EL0=
                                                            np0005626460.localdomain,192.168.122.104,np0005626460* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILDN/X/h1SJivdlJg6UrBmlF7YgESQ24kCjH//omBjn3
                                                            np0005626460.localdomain,192.168.122.104,np0005626460* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBLjVObKHLJCn+kOorv0tRLu5M/EwGgxQnczR69veoTwgXNRB/xCzi30v7fJ2uWbQGJXou02P5IiwAQmFSv1vKpE=
                                                            np0005626465.localdomain,192.168.122.107,np0005626465* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCUc8l2oYgfdO7xb3vN27co3Q/sFNU6Rw5wThiW1JMfeIzI90ZzS/L+BpsDsX8q2CW9QOHXrbUormpGsiNnix5j1P29Jc6e9A2mDlipXBrFSUiVZa8UOL03lFSz4nElapkASin2GCdHqy7//gGdQMKRP62VXpdhofb7i/N/gGoV5hSc8Q36KFDbWpvPkhD5H8nZtAfyxM99KwlC62D8jSN+gdoRtMRFPQTtyvyskyrgnXGC6xV71WTa6LJ6Meo7tfj4JlvDAWwlD+f9Ruu2ty2aHd2feVVKYvxZ4Z45iSfJnNxRFJvu1QOY0IU4Fj942leKwr6f0B5ogPFlTI7wRrAB1d9tri1WW2aL1AqYhdZscWi0VArYxLQr7BCVqz8KgFIzjbPoJ7uYnWcuDSiWlC1NJVO7Ij2natf8wZyvSyH+vydamkyoaNwxMnm4qs0/rvjwL49MdrHB79rXjHYJpt/JCBvn9a/rh5KqVH40P00DP35H71zyHPCSu1L20S/wY1k=
                                                            np0005626465.localdomain,192.168.122.107,np0005626465* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIMNM6I52u2PlIbUuPV1wF+vgd5UIhGpYLByAkJDxsiFm
                                                            np0005626465.localdomain,192.168.122.107,np0005626465* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOxvbePT9GQElB7TGQuLRzkjxtXeKA7IbYbWBmgWolf09tVtPZHcG12wdG6fePoATmwyX4PIJb5sC28KiqtOgIE=
                                                            np0005626461.localdomain,192.168.122.105,np0005626461* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDDBCzU24t9gA5R+exm4rHJ2VytHuq8uUoKuu6SZ07dskKR77n7TwlsZhsDjpzwsddHd+lvsfvOVmolxjJsCmq7LJRMGA/mczHXsGGb43YPZPKsiJ6KMPDORy5/ihhnqixBYVmBGtdPu/Hh/udGnymZgR/RYGltDDHoCfGGiEcHJSIuf/Bv2Uv4xFnxFjDrWQFrkJ5Grq1xC7cGXgC3gAiTCjGHkG9rb/oyTUjjM8LaaRYIjeoDQZu1/8y5pl6cnhW21VTA+u55SkSimb/g5oOuSmrv899iHFwb54uLINXvA4aTtduUnxNQBVRyFvWa3yCZXVJeYlcVP8Q9tljn9anN1aISnS311Jmay6zUY927bxnzrpkwaV7Ggwtvi6vlVy84ZvOJ/IJ2boDiMujh1ZpT3bxXG3Oy0EjfBVbpkS6r2MbGTPj/xWnosJ6JNVbb9LW7Ftfi3/NFfAb7PpTgY036DA8LYoYIfqxVJUhlo5fJjqqOLa/zbvZVwrFCG+Zm160=
                                                            np0005626461.localdomain,192.168.122.105,np0005626461* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAID3BKJ5iitZOMOyRmWwrIHEgrBaSUAXcN/yddsH5p67P
                                                            np0005626461.localdomain,192.168.122.105,np0005626461* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBCX+ELPnNre0Bl1NdaYE8R/rtodFHjWfK7n06TW2wvAyLhge/A+53E2vGTXA9jfYXEEH2g0XKcYHlkb3dM70CTQ=
                                                            np0005626463.localdomain,192.168.122.106,np0005626463* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC/Caj4zYKd24ctvaRU1Hf9nT058OF4bRnDJ3bHimmkyIL7cccXAxo3lx50wZHWRYBhF5Wes6TmqnUTTK1h5wVdI8f7YtQ9IyMIlfoEiTThF5PgODVuRYq+YGjFIy7MTPyBnB2428aT4dlYqHSuxK2gL6ALlCJHNyeh3RW3jCOG89veDoRmbqHGoaD+xPRnfsdHLoLFNfxT4UJiKRuqsEd5fNtc392ROSa5XM3PPIs3YTypYmpfFHs1B1j+y6oZV8Ha/QXqURpI7/aJmfnDzXLMsLWp4GRpkwzljvNp87S5HL+kJMo79n0Vmh2JdN1orNP/4A2t/TENckHbrZCm+YmPqUqvpHkAZfFfmvP62YZTPq/qOjBMMq6ulGSHd2I4XfE7NNZRKoS3G4HVlBb0ONS13PaWx9rrJCRlF64L1dHSt9zpKrvRbWkSdXA0PwwehrU5/OBo1IY4WsRlWmPeET1/dFWiIr1t9uGjp5vmACAx7rnC6G5qSEhQ3/k1Wa57k/k=
                                                            np0005626463.localdomain,192.168.122.106,np0005626463* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIPpIpPeSZdEjLEgb7zYHVhKnwBDipROOVgmUJe3QzecH
                                                            np0005626463.localdomain,192.168.122.106,np0005626463* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJUV/eK8X671P+PPyOxoifS2hhEKYup7ygc301iPJDoOs3TgLodw2jNy/egXEc0x3WdkTwXltmBlHqmWw5ro05Q=
                                                             create=True mode=0644 path=/tmp/ansible.w2vcy2ch state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:16:15 np0005626463.localdomain sudo[146757]: pam_unix(sudo:session): session closed for user root
Feb 23 09:16:17 np0005626463.localdomain sudo[146849]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ooqjbhlhbvfttnrjwljpbpxvnkrgumto ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838176.700781-376-181125007532911/AnsiballZ_command.py
Feb 23 09:16:17 np0005626463.localdomain sudo[146849]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:16:17 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8907 DF PROTO=TCP SPT=58462 DPT=9105 SEQ=299049209 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE85E060000000001030307) 
Feb 23 09:16:17 np0005626463.localdomain python3.9[146851]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.w2vcy2ch' > /etc/ssh/ssh_known_hosts _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 09:16:17 np0005626463.localdomain sudo[146849]: pam_unix(sudo:session): session closed for user root
Feb 23 09:16:17 np0005626463.localdomain sshd[146721]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:16:18 np0005626463.localdomain sudo[146943]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mieylcojxyllknjgjsabioiobdzshytp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838178.070629-424-6742241332385/AnsiballZ_file.py
Feb 23 09:16:18 np0005626463.localdomain sudo[146943]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:16:18 np0005626463.localdomain python3.9[146945]: ansible-ansible.builtin.file Invoked with path=/tmp/ansible.w2vcy2ch state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:16:18 np0005626463.localdomain sudo[146943]: pam_unix(sudo:session): session closed for user root
Feb 23 09:16:19 np0005626463.localdomain sshd[146108]: pam_unix(sshd:session): session closed for user zuul
Feb 23 09:16:19 np0005626463.localdomain systemd-logind[759]: Session 44 logged out. Waiting for processes to exit.
Feb 23 09:16:19 np0005626463.localdomain systemd[1]: session-44.scope: Deactivated successfully.
Feb 23 09:16:19 np0005626463.localdomain systemd[1]: session-44.scope: Consumed 4.375s CPU time.
Feb 23 09:16:19 np0005626463.localdomain systemd-logind[759]: Removed session 44.
Feb 23 09:16:22 np0005626463.localdomain sshd[146960]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:16:22 np0005626463.localdomain sshd[146960]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:16:24 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33432 DF PROTO=TCP SPT=57960 DPT=9102 SEQ=315792718 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE878A10000000001030307) 
Feb 23 09:16:24 np0005626463.localdomain sshd[146962]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:16:24 np0005626463.localdomain sshd[146962]: Accepted publickey for zuul from 192.168.122.30 port 43148 ssh2: RSA SHA256:/ShS2J5Dq7o9P59e/NmgQORSAcJOBwu46Huo03HBdB4
Feb 23 09:16:24 np0005626463.localdomain systemd-logind[759]: New session 45 of user zuul.
Feb 23 09:16:24 np0005626463.localdomain systemd[1]: Started Session 45 of User zuul.
Feb 23 09:16:24 np0005626463.localdomain sshd[146962]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Feb 23 09:16:25 np0005626463.localdomain python3.9[147055]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 23 09:16:26 np0005626463.localdomain sudo[147149]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gikhbjmoinccbfbjhxpqibkpxywhjlxn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838186.225625-51-23128697649947/AnsiballZ_systemd.py
Feb 23 09:16:26 np0005626463.localdomain sudo[147149]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:16:27 np0005626463.localdomain python3.9[147151]: ansible-ansible.builtin.systemd Invoked with enabled=True name=sshd daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Feb 23 09:16:28 np0005626463.localdomain sudo[147149]: pam_unix(sudo:session): session closed for user root
Feb 23 09:16:28 np0005626463.localdomain sudo[147243]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bnltmjfcacslxleqryrtryaymuguiowq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838188.270181-75-52924548535750/AnsiballZ_systemd.py
Feb 23 09:16:28 np0005626463.localdomain sudo[147243]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:16:28 np0005626463.localdomain python3.9[147245]: ansible-ansible.builtin.systemd Invoked with name=sshd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 23 09:16:28 np0005626463.localdomain sudo[147243]: pam_unix(sudo:session): session closed for user root
Feb 23 09:16:29 np0005626463.localdomain sudo[147336]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fotljqxxnymkjntaovmwodwlalilgwzs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838189.1525917-102-52530054002121/AnsiballZ_command.py
Feb 23 09:16:29 np0005626463.localdomain sudo[147336]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:16:29 np0005626463.localdomain python3.9[147338]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 09:16:29 np0005626463.localdomain sudo[147336]: pam_unix(sudo:session): session closed for user root
Feb 23 09:16:29 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45380 DF PROTO=TCP SPT=38814 DPT=9100 SEQ=2662443519 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE88F160000000001030307) 
Feb 23 09:16:30 np0005626463.localdomain sudo[147429]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vvljnvkuozwcynpilofdmmsizttdnhcr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838190.006582-126-46109588307947/AnsiballZ_stat.py
Feb 23 09:16:30 np0005626463.localdomain sudo[147429]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:16:30 np0005626463.localdomain python3.9[147431]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 23 09:16:30 np0005626463.localdomain sudo[147429]: pam_unix(sudo:session): session closed for user root
Feb 23 09:16:31 np0005626463.localdomain sudo[147523]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-twyuognjjekxnmthypagveijbdswwwqj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838190.7582037-150-7386822295942/AnsiballZ_command.py
Feb 23 09:16:31 np0005626463.localdomain sudo[147523]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:16:31 np0005626463.localdomain python3.9[147525]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 09:16:31 np0005626463.localdomain sudo[147523]: pam_unix(sudo:session): session closed for user root
Feb 23 09:16:32 np0005626463.localdomain sudo[147618]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nhidjlsxbrtmtnaijzvyiqwnpxchwghz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838191.6090357-174-118417520049453/AnsiballZ_file.py
Feb 23 09:16:32 np0005626463.localdomain sudo[147618]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:16:32 np0005626463.localdomain python3.9[147620]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:16:32 np0005626463.localdomain sudo[147618]: pam_unix(sudo:session): session closed for user root
Feb 23 09:16:32 np0005626463.localdomain sshd[146962]: pam_unix(sshd:session): session closed for user zuul
Feb 23 09:16:32 np0005626463.localdomain systemd[1]: session-45.scope: Deactivated successfully.
Feb 23 09:16:32 np0005626463.localdomain systemd[1]: session-45.scope: Consumed 3.890s CPU time.
Feb 23 09:16:32 np0005626463.localdomain systemd-logind[759]: Session 45 logged out. Waiting for processes to exit.
Feb 23 09:16:32 np0005626463.localdomain systemd-logind[759]: Removed session 45.
Feb 23 09:16:33 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51162 DF PROTO=TCP SPT=52072 DPT=9101 SEQ=2960822016 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE89BFB0000000001030307) 
Feb 23 09:16:34 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51163 DF PROTO=TCP SPT=52072 DPT=9101 SEQ=2960822016 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE8A0060000000001030307) 
Feb 23 09:16:36 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51164 DF PROTO=TCP SPT=52072 DPT=9101 SEQ=2960822016 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE8A8070000000001030307) 
Feb 23 09:16:37 np0005626463.localdomain sshd[147635]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:16:37 np0005626463.localdomain sshd[147635]: Accepted publickey for zuul from 192.168.122.30 port 49054 ssh2: RSA SHA256:/ShS2J5Dq7o9P59e/NmgQORSAcJOBwu46Huo03HBdB4
Feb 23 09:16:37 np0005626463.localdomain systemd-logind[759]: New session 46 of user zuul.
Feb 23 09:16:37 np0005626463.localdomain systemd[1]: Started Session 46 of User zuul.
Feb 23 09:16:37 np0005626463.localdomain sshd[147635]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Feb 23 09:16:38 np0005626463.localdomain python3.9[147728]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 23 09:16:39 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44083 DF PROTO=TCP SPT=51532 DPT=9882 SEQ=3768975670 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE8B4530000000001030307) 
Feb 23 09:16:39 np0005626463.localdomain sudo[147822]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ttfgqlcdxthwlzrwxjzyplzqmpdjnkgz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838199.3196387-57-161071980331705/AnsiballZ_setup.py
Feb 23 09:16:39 np0005626463.localdomain sudo[147822]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:16:39 np0005626463.localdomain python3.9[147824]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 23 09:16:40 np0005626463.localdomain sudo[147822]: pam_unix(sudo:session): session closed for user root
Feb 23 09:16:40 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51165 DF PROTO=TCP SPT=52072 DPT=9101 SEQ=2960822016 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE8B7C60000000001030307) 
Feb 23 09:16:40 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44084 DF PROTO=TCP SPT=51532 DPT=9882 SEQ=3768975670 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE8B8460000000001030307) 
Feb 23 09:16:40 np0005626463.localdomain sudo[147876]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mphviztkeiaoueabryezpdwjvwxjwfuy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838199.3196387-57-161071980331705/AnsiballZ_dnf.py
Feb 23 09:16:40 np0005626463.localdomain sudo[147876]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:16:40 np0005626463.localdomain python3.9[147878]: ansible-ansible.legacy.dnf Invoked with name=['yum-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Feb 23 09:16:42 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44085 DF PROTO=TCP SPT=51532 DPT=9882 SEQ=3768975670 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE8C0460000000001030307) 
Feb 23 09:16:44 np0005626463.localdomain sudo[147876]: pam_unix(sudo:session): session closed for user root
Feb 23 09:16:45 np0005626463.localdomain python3.9[147970]: ansible-ansible.legacy.command Invoked with _raw_params=needs-restarting -r _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 09:16:46 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3622 DF PROTO=TCP SPT=49526 DPT=9105 SEQ=1733835990 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE8D0060000000001030307) 
Feb 23 09:16:46 np0005626463.localdomain sudo[148061]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ejsauciapehodxtlgucqljhgiwlmcibh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838205.9677486-120-46222709697378/AnsiballZ_file.py
Feb 23 09:16:46 np0005626463.localdomain sudo[148061]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:16:46 np0005626463.localdomain python3.9[148063]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/reboot_required/ state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:16:46 np0005626463.localdomain sudo[148061]: pam_unix(sudo:session): session closed for user root
Feb 23 09:16:47 np0005626463.localdomain sudo[148153]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-caaunqpkmlbzjfdzlvmlcbfkqobnntnr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838207.352341-144-87502268236776/AnsiballZ_file.py
Feb 23 09:16:47 np0005626463.localdomain sudo[148153]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:16:47 np0005626463.localdomain python3.9[148155]: ansible-ansible.builtin.file Invoked with mode=0600 path=/var/lib/openstack/reboot_required/needs_restarting state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:16:47 np0005626463.localdomain sudo[148153]: pam_unix(sudo:session): session closed for user root
Feb 23 09:16:48 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51166 DF PROTO=TCP SPT=52072 DPT=9101 SEQ=2960822016 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE8D8070000000001030307) 
Feb 23 09:16:48 np0005626463.localdomain sudo[148245]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aunqikmbjkysqomnompnnfypbpoepsnr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838208.3130739-168-250547706403758/AnsiballZ_lineinfile.py
Feb 23 09:16:48 np0005626463.localdomain sudo[148245]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:16:48 np0005626463.localdomain python3.9[148247]: ansible-ansible.builtin.lineinfile Invoked with dest=/var/lib/openstack/reboot_required/needs_restarting line=Not root, Subscription Management repositories not updated
                                                            Core libraries or services have been updated since boot-up:
                                                              * systemd
                                                            
                                                            Reboot is required to fully utilize these updates.
                                                            More information: https://access.redhat.com/solutions/27943 path=/var/lib/openstack/reboot_required/needs_restarting state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:16:48 np0005626463.localdomain sudo[148245]: pam_unix(sudo:session): session closed for user root
Feb 23 09:16:49 np0005626463.localdomain python3.9[148337]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/reboot_required/'] patterns=[] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Feb 23 09:16:50 np0005626463.localdomain python3.9[148427]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 23 09:16:51 np0005626463.localdomain python3.9[148519]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 23 09:16:51 np0005626463.localdomain sshd[147635]: pam_unix(sshd:session): session closed for user zuul
Feb 23 09:16:51 np0005626463.localdomain systemd[1]: session-46.scope: Deactivated successfully.
Feb 23 09:16:51 np0005626463.localdomain systemd[1]: session-46.scope: Consumed 9.001s CPU time.
Feb 23 09:16:51 np0005626463.localdomain systemd-logind[759]: Session 46 logged out. Waiting for processes to exit.
Feb 23 09:16:51 np0005626463.localdomain systemd-logind[759]: Removed session 46.
Feb 23 09:16:54 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60006 DF PROTO=TCP SPT=52864 DPT=9102 SEQ=4217256930 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE8EDD10000000001030307) 
Feb 23 09:16:54 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44087 DF PROTO=TCP SPT=51532 DPT=9882 SEQ=3768975670 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE8F0060000000001030307) 
Feb 23 09:16:57 np0005626463.localdomain sshd[148536]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:16:57 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60008 DF PROTO=TCP SPT=52864 DPT=9102 SEQ=4217256930 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE8F9C70000000001030307) 
Feb 23 09:16:57 np0005626463.localdomain sshd[148537]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:16:57 np0005626463.localdomain sshd[148537]: Accepted publickey for zuul from 192.168.122.30 port 51228 ssh2: RSA SHA256:/ShS2J5Dq7o9P59e/NmgQORSAcJOBwu46Huo03HBdB4
Feb 23 09:16:57 np0005626463.localdomain systemd-logind[759]: New session 47 of user zuul.
Feb 23 09:16:57 np0005626463.localdomain systemd[1]: Started Session 47 of User zuul.
Feb 23 09:16:57 np0005626463.localdomain sshd[148537]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Feb 23 09:16:57 np0005626463.localdomain sshd[148536]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:16:58 np0005626463.localdomain python3.9[148631]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 23 09:17:00 np0005626463.localdomain sudo[148725]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pxgwyclblndkyyccwetwcuyfoxxhtbcl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838219.9612794-153-255823395639478/AnsiballZ_file.py
Feb 23 09:17:00 np0005626463.localdomain sudo[148725]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:17:00 np0005626463.localdomain python3.9[148727]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/telemetry setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 23 09:17:00 np0005626463.localdomain sudo[148725]: pam_unix(sudo:session): session closed for user root
Feb 23 09:17:00 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52483 DF PROTO=TCP SPT=55562 DPT=9100 SEQ=984419830 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE908460000000001030307) 
Feb 23 09:17:00 np0005626463.localdomain sshd[148787]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:17:01 np0005626463.localdomain sudo[148819]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dxrzhffqpanvrsypqzeorufcnmsnjovd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838220.7216723-177-270581568256455/AnsiballZ_stat.py
Feb 23 09:17:01 np0005626463.localdomain sudo[148819]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:17:01 np0005626463.localdomain python3.9[148821]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:17:01 np0005626463.localdomain sudo[148819]: pam_unix(sudo:session): session closed for user root
Feb 23 09:17:01 np0005626463.localdomain sshd[148787]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:17:01 np0005626463.localdomain sudo[148892]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aovfmfswkupmyaefnlvajgaiiskvrllv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838220.7216723-177-270581568256455/AnsiballZ_copy.py
Feb 23 09:17:01 np0005626463.localdomain sudo[148892]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:17:02 np0005626463.localdomain python3.9[148894]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771838220.7216723-177-270581568256455/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=704aa8bdbea515a6da96c2b63bce412faf6bceda backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:17:02 np0005626463.localdomain sudo[148892]: pam_unix(sudo:session): session closed for user root
Feb 23 09:17:02 np0005626463.localdomain sudo[148984]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zknndjqjfowrhxuckhutqqwcoljrnnui ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838222.2046838-226-67152738764900/AnsiballZ_file.py
Feb 23 09:17:02 np0005626463.localdomain sudo[148984]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:17:02 np0005626463.localdomain python3.9[148986]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/neutron-sriov setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 23 09:17:02 np0005626463.localdomain sudo[148984]: pam_unix(sudo:session): session closed for user root
Feb 23 09:17:02 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52484 DF PROTO=TCP SPT=55562 DPT=9100 SEQ=984419830 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE910460000000001030307) 
Feb 23 09:17:03 np0005626463.localdomain sudo[149076]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fojxixfquplrzsbgcchxuyjcbanqoezu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838222.8237774-250-43749575388980/AnsiballZ_stat.py
Feb 23 09:17:03 np0005626463.localdomain sudo[149076]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:17:03 np0005626463.localdomain python3.9[149078]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:17:03 np0005626463.localdomain sudo[149076]: pam_unix(sudo:session): session closed for user root
Feb 23 09:17:03 np0005626463.localdomain sudo[149149]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xhxlgdbcykcfpxgvsrqyqictodgivbld ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838222.8237774-250-43749575388980/AnsiballZ_copy.py
Feb 23 09:17:03 np0005626463.localdomain sudo[149149]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:17:03 np0005626463.localdomain python3.9[149151]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771838222.8237774-250-43749575388980/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=704aa8bdbea515a6da96c2b63bce412faf6bceda backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:17:03 np0005626463.localdomain sudo[149149]: pam_unix(sudo:session): session closed for user root
Feb 23 09:17:04 np0005626463.localdomain sudo[149241]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lpyaxvigjividppjvovpeugtndafidrw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838224.028974-299-106552338184343/AnsiballZ_file.py
Feb 23 09:17:04 np0005626463.localdomain sudo[149241]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:17:04 np0005626463.localdomain python3.9[149243]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/neutron-dhcp setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 23 09:17:04 np0005626463.localdomain sudo[149241]: pam_unix(sudo:session): session closed for user root
Feb 23 09:17:04 np0005626463.localdomain sudo[149333]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cvzknpjtqjvfbtpmrmnzlandkhedyown ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838224.682711-323-105024199746830/AnsiballZ_stat.py
Feb 23 09:17:04 np0005626463.localdomain sudo[149333]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:17:05 np0005626463.localdomain python3.9[149335]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:17:05 np0005626463.localdomain sudo[149333]: pam_unix(sudo:session): session closed for user root
Feb 23 09:17:05 np0005626463.localdomain sudo[149406]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rgrrwtkquccljvkwdmtzkualeoplfbiy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838224.682711-323-105024199746830/AnsiballZ_copy.py
Feb 23 09:17:05 np0005626463.localdomain sudo[149406]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:17:05 np0005626463.localdomain python3.9[149408]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771838224.682711-323-105024199746830/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=704aa8bdbea515a6da96c2b63bce412faf6bceda backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:17:05 np0005626463.localdomain sudo[149406]: pam_unix(sudo:session): session closed for user root
Feb 23 09:17:06 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61162 DF PROTO=TCP SPT=44394 DPT=9101 SEQ=2399873393 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE91D470000000001030307) 
Feb 23 09:17:06 np0005626463.localdomain sudo[149498]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zfplfiomqbjubwizlvldsuokfouvgyuj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838226.0225961-371-65191555014785/AnsiballZ_file.py
Feb 23 09:17:06 np0005626463.localdomain sudo[149498]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:17:06 np0005626463.localdomain python3.9[149500]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 23 09:17:06 np0005626463.localdomain sudo[149498]: pam_unix(sudo:session): session closed for user root
Feb 23 09:17:06 np0005626463.localdomain sudo[149590]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tlqewslkbrwctkfszrcqxrxtsvmqdsiu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838226.6899624-396-72216657166734/AnsiballZ_stat.py
Feb 23 09:17:06 np0005626463.localdomain sudo[149590]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:17:07 np0005626463.localdomain python3.9[149592]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:17:07 np0005626463.localdomain sudo[149590]: pam_unix(sudo:session): session closed for user root
Feb 23 09:17:07 np0005626463.localdomain sudo[149593]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 09:17:07 np0005626463.localdomain sudo[149593]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:17:07 np0005626463.localdomain sudo[149593]: pam_unix(sudo:session): session closed for user root
Feb 23 09:17:07 np0005626463.localdomain sudo[149621]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 23 09:17:07 np0005626463.localdomain sudo[149621]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:17:07 np0005626463.localdomain sudo[149693]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fvrrjcodacuuotfalsgnnpafmxnqgwnd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838226.6899624-396-72216657166734/AnsiballZ_copy.py
Feb 23 09:17:07 np0005626463.localdomain sudo[149693]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:17:07 np0005626463.localdomain python3.9[149695]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771838226.6899624-396-72216657166734/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=704aa8bdbea515a6da96c2b63bce412faf6bceda backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:17:07 np0005626463.localdomain sudo[149693]: pam_unix(sudo:session): session closed for user root
Feb 23 09:17:08 np0005626463.localdomain sudo[149621]: pam_unix(sudo:session): session closed for user root
Feb 23 09:17:08 np0005626463.localdomain sudo[149816]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zjsoqjtmehuayoorhyookskonbnjyhjl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838227.9512975-447-204289630565419/AnsiballZ_file.py
Feb 23 09:17:08 np0005626463.localdomain sudo[149816]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:17:08 np0005626463.localdomain python3.9[149818]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 23 09:17:08 np0005626463.localdomain sudo[149816]: pam_unix(sudo:session): session closed for user root
Feb 23 09:17:08 np0005626463.localdomain sudo[149878]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 09:17:08 np0005626463.localdomain sudo[149878]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:17:08 np0005626463.localdomain sudo[149878]: pam_unix(sudo:session): session closed for user root
Feb 23 09:17:08 np0005626463.localdomain sudo[149923]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iurroktbyzjoxzjcpsreylctudyizxgm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838228.5334349-470-123841446130468/AnsiballZ_stat.py
Feb 23 09:17:08 np0005626463.localdomain sudo[149923]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:17:08 np0005626463.localdomain python3.9[149925]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:17:09 np0005626463.localdomain sudo[149923]: pam_unix(sudo:session): session closed for user root
Feb 23 09:17:09 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28238 DF PROTO=TCP SPT=56366 DPT=9882 SEQ=1366371807 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE929820000000001030307) 
Feb 23 09:17:09 np0005626463.localdomain sudo[149996]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lzfsrwssxgwszjlavlddtgacshaqnleu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838228.5334349-470-123841446130468/AnsiballZ_copy.py
Feb 23 09:17:09 np0005626463.localdomain sudo[149996]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:17:09 np0005626463.localdomain python3.9[149998]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771838228.5334349-470-123841446130468/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=704aa8bdbea515a6da96c2b63bce412faf6bceda backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:17:09 np0005626463.localdomain sudo[149996]: pam_unix(sudo:session): session closed for user root
Feb 23 09:17:10 np0005626463.localdomain sudo[150088]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kbnvsorgntnpfhlnengkasvikpvuomfg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838229.798321-516-94490927019970/AnsiballZ_file.py
Feb 23 09:17:10 np0005626463.localdomain sudo[150088]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:17:10 np0005626463.localdomain python3.9[150090]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 23 09:17:10 np0005626463.localdomain sudo[150088]: pam_unix(sudo:session): session closed for user root
Feb 23 09:17:10 np0005626463.localdomain sudo[150180]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wjmhfnmdhstvtzjzevkshonnyetytlqa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838230.372813-536-237209744266713/AnsiballZ_stat.py
Feb 23 09:17:10 np0005626463.localdomain sudo[150180]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:17:10 np0005626463.localdomain python3.9[150182]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:17:10 np0005626463.localdomain sudo[150180]: pam_unix(sudo:session): session closed for user root
Feb 23 09:17:11 np0005626463.localdomain sudo[150253]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fujkmsrdvrfmqurkzjeycvealvyqsfsp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838230.372813-536-237209744266713/AnsiballZ_copy.py
Feb 23 09:17:11 np0005626463.localdomain sudo[150253]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:17:11 np0005626463.localdomain python3.9[150255]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771838230.372813-536-237209744266713/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=704aa8bdbea515a6da96c2b63bce412faf6bceda backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:17:11 np0005626463.localdomain sudo[150253]: pam_unix(sudo:session): session closed for user root
Feb 23 09:17:11 np0005626463.localdomain sudo[150345]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rcezpoclmgunruosjuwnoiardmvsnqwf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838231.5856771-586-254662878223888/AnsiballZ_file.py
Feb 23 09:17:11 np0005626463.localdomain sudo[150345]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:17:12 np0005626463.localdomain python3.9[150347]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/bootstrap setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 23 09:17:12 np0005626463.localdomain sudo[150345]: pam_unix(sudo:session): session closed for user root
Feb 23 09:17:12 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28240 DF PROTO=TCP SPT=56366 DPT=9882 SEQ=1366371807 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE935860000000001030307) 
Feb 23 09:17:12 np0005626463.localdomain sudo[150438]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bgktznljtijhbxudncvqdwtrvyabfjli ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838232.2018623-611-25720325986337/AnsiballZ_stat.py
Feb 23 09:17:12 np0005626463.localdomain sudo[150438]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:17:12 np0005626463.localdomain python3.9[150440]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:17:12 np0005626463.localdomain sudo[150438]: pam_unix(sudo:session): session closed for user root
Feb 23 09:17:12 np0005626463.localdomain sudo[150511]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-umrefmpezzwuajmtczywfhlivuwhewkq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838232.2018623-611-25720325986337/AnsiballZ_copy.py
Feb 23 09:17:12 np0005626463.localdomain sudo[150511]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:17:13 np0005626463.localdomain python3.9[150513]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771838232.2018623-611-25720325986337/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=704aa8bdbea515a6da96c2b63bce412faf6bceda backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:17:13 np0005626463.localdomain sudo[150511]: pam_unix(sudo:session): session closed for user root
Feb 23 09:17:13 np0005626463.localdomain sudo[150603]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kaqnfgvrjgnbekksuiexudyknepxqxcj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838233.3716319-658-42772289597222/AnsiballZ_file.py
Feb 23 09:17:13 np0005626463.localdomain sudo[150603]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:17:13 np0005626463.localdomain python3.9[150605]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/neutron-metadata setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 23 09:17:13 np0005626463.localdomain sudo[150603]: pam_unix(sudo:session): session closed for user root
Feb 23 09:17:14 np0005626463.localdomain sudo[150695]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cwxkmfminpidrejdhpaxodzdmzoaqwpp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838233.9995575-683-205098696455471/AnsiballZ_stat.py
Feb 23 09:17:14 np0005626463.localdomain sudo[150695]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:17:14 np0005626463.localdomain python3.9[150697]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:17:14 np0005626463.localdomain sudo[150695]: pam_unix(sudo:session): session closed for user root
Feb 23 09:17:14 np0005626463.localdomain sudo[150768]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pukmgltctyquhqtpuacnschkoxbmdxxo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838233.9995575-683-205098696455471/AnsiballZ_copy.py
Feb 23 09:17:14 np0005626463.localdomain sudo[150768]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:17:15 np0005626463.localdomain python3.9[150770]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771838233.9995575-683-205098696455471/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=704aa8bdbea515a6da96c2b63bce412faf6bceda backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:17:15 np0005626463.localdomain sudo[150768]: pam_unix(sudo:session): session closed for user root
Feb 23 09:17:15 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52486 DF PROTO=TCP SPT=55562 DPT=9100 SEQ=984419830 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE940060000000001030307) 
Feb 23 09:17:15 np0005626463.localdomain sshd[148537]: pam_unix(sshd:session): session closed for user zuul
Feb 23 09:17:15 np0005626463.localdomain systemd[1]: session-47.scope: Deactivated successfully.
Feb 23 09:17:15 np0005626463.localdomain systemd[1]: session-47.scope: Consumed 11.729s CPU time.
Feb 23 09:17:15 np0005626463.localdomain systemd-logind[759]: Session 47 logged out. Waiting for processes to exit.
Feb 23 09:17:15 np0005626463.localdomain systemd-logind[759]: Removed session 47.
Feb 23 09:17:17 np0005626463.localdomain chronyd[140850]: Selected source 167.160.187.12 (pool.ntp.org)
Feb 23 09:17:18 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61164 DF PROTO=TCP SPT=44394 DPT=9101 SEQ=2399873393 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE94E060000000001030307) 
Feb 23 09:17:21 np0005626463.localdomain sshd[150786]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:17:21 np0005626463.localdomain sshd[150786]: Accepted publickey for zuul from 192.168.122.31 port 36002 ssh2: RSA SHA256:/ShS2J5Dq7o9P59e/NmgQORSAcJOBwu46Huo03HBdB4
Feb 23 09:17:21 np0005626463.localdomain systemd-logind[759]: New session 48 of user zuul.
Feb 23 09:17:21 np0005626463.localdomain systemd[1]: Started Session 48 of User zuul.
Feb 23 09:17:21 np0005626463.localdomain sshd[150786]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Feb 23 09:17:21 np0005626463.localdomain sudo[150879]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oznttnhzpzqldcxrrxjpbmbcxxgilskl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838241.3573642-21-194399756493502/AnsiballZ_file.py
Feb 23 09:17:21 np0005626463.localdomain sudo[150879]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:17:22 np0005626463.localdomain python3.9[150881]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:17:22 np0005626463.localdomain sudo[150879]: pam_unix(sudo:session): session closed for user root
Feb 23 09:17:22 np0005626463.localdomain sudo[150971]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vrwlwgptfgbshhkhjnhonqpacweilfpm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838242.228574-57-235985671653714/AnsiballZ_stat.py
Feb 23 09:17:22 np0005626463.localdomain sudo[150971]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:17:22 np0005626463.localdomain python3.9[150973]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/ceph/ceph.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:17:22 np0005626463.localdomain sudo[150971]: pam_unix(sudo:session): session closed for user root
Feb 23 09:17:23 np0005626463.localdomain sudo[151044]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ccwtemitgqrawwvsytnucqqbatckswmx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838242.228574-57-235985671653714/AnsiballZ_copy.py
Feb 23 09:17:23 np0005626463.localdomain sudo[151044]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:17:23 np0005626463.localdomain python3.9[151046]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/ceph/ceph.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771838242.228574-57-235985671653714/.source.conf _original_basename=ceph.conf follow=False checksum=00be6682e39722cc7ebf9f74611435726ea0928d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:17:23 np0005626463.localdomain sudo[151044]: pam_unix(sudo:session): session closed for user root
Feb 23 09:17:24 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=4671 DF PROTO=TCP SPT=36964 DPT=9102 SEQ=817969717 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE962FF0000000001030307) 
Feb 23 09:17:24 np0005626463.localdomain sudo[151136]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nekignlnavnvslnkcojeuhjdlwxdkxfh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838243.686841-57-272677179248303/AnsiballZ_stat.py
Feb 23 09:17:24 np0005626463.localdomain sudo[151136]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:17:24 np0005626463.localdomain python3.9[151138]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/ceph/ceph.client.openstack.keyring follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:17:24 np0005626463.localdomain sudo[151136]: pam_unix(sudo:session): session closed for user root
Feb 23 09:17:24 np0005626463.localdomain sudo[151209]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nrwmeybzbouofthelfitgqznynhmglde ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838243.686841-57-272677179248303/AnsiballZ_copy.py
Feb 23 09:17:24 np0005626463.localdomain sudo[151209]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:17:24 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28242 DF PROTO=TCP SPT=56366 DPT=9882 SEQ=1366371807 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE966060000000001030307) 
Feb 23 09:17:25 np0005626463.localdomain python3.9[151211]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/ceph/ceph.client.openstack.keyring mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1771838243.686841-57-272677179248303/.source.keyring _original_basename=ceph.client.openstack.keyring follow=False checksum=bb97f2335ebfccbfb2bd8d50bbb589ce7e034c5d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:17:25 np0005626463.localdomain sudo[151209]: pam_unix(sudo:session): session closed for user root
Feb 23 09:17:25 np0005626463.localdomain sshd[150786]: pam_unix(sshd:session): session closed for user zuul
Feb 23 09:17:25 np0005626463.localdomain systemd[1]: session-48.scope: Deactivated successfully.
Feb 23 09:17:25 np0005626463.localdomain systemd[1]: session-48.scope: Consumed 2.323s CPU time.
Feb 23 09:17:25 np0005626463.localdomain systemd-logind[759]: Session 48 logged out. Waiting for processes to exit.
Feb 23 09:17:25 np0005626463.localdomain systemd-logind[759]: Removed session 48.
Feb 23 09:17:27 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=4673 DF PROTO=TCP SPT=36964 DPT=9102 SEQ=817969717 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE96F060000000001030307) 
Feb 23 09:17:30 np0005626463.localdomain sshd[151226]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:17:30 np0005626463.localdomain sshd[151226]: Accepted publickey for zuul from 192.168.122.31 port 38102 ssh2: RSA SHA256:/ShS2J5Dq7o9P59e/NmgQORSAcJOBwu46Huo03HBdB4
Feb 23 09:17:30 np0005626463.localdomain systemd-logind[759]: New session 49 of user zuul.
Feb 23 09:17:30 np0005626463.localdomain systemd[1]: Started Session 49 of User zuul.
Feb 23 09:17:30 np0005626463.localdomain sshd[151226]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Feb 23 09:17:30 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50481 DF PROTO=TCP SPT=60186 DPT=9100 SEQ=4212654414 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE97D860000000001030307) 
Feb 23 09:17:31 np0005626463.localdomain sshd[151320]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:17:31 np0005626463.localdomain python3.9[151319]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 23 09:17:31 np0005626463.localdomain sshd[151320]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:17:32 np0005626463.localdomain sudo[151415]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hiepwcyxwwmszkvtedbotkvubyuieunu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838252.2656496-57-73350699319022/AnsiballZ_file.py
Feb 23 09:17:32 np0005626463.localdomain sudo[151415]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:17:32 np0005626463.localdomain python3.9[151417]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 23 09:17:32 np0005626463.localdomain sudo[151415]: pam_unix(sudo:session): session closed for user root
Feb 23 09:17:32 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50482 DF PROTO=TCP SPT=60186 DPT=9100 SEQ=4212654414 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE985870000000001030307) 
Feb 23 09:17:33 np0005626463.localdomain sudo[151507]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ncnnvvrqybxivixvehiwnfgslnoywqzz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838253.0537908-57-146126528206303/AnsiballZ_file.py
Feb 23 09:17:33 np0005626463.localdomain sudo[151507]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:17:33 np0005626463.localdomain python3.9[151509]: ansible-ansible.builtin.file Invoked with group=openvswitch owner=openvswitch path=/var/lib/openvswitch/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Feb 23 09:17:33 np0005626463.localdomain sudo[151507]: pam_unix(sudo:session): session closed for user root
Feb 23 09:17:34 np0005626463.localdomain python3.9[151599]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 23 09:17:35 np0005626463.localdomain sudo[151689]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lypwxxfcdwgoddfjehvgzjrzdakhupjt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838254.57694-126-94977079785743/AnsiballZ_seboolean.py
Feb 23 09:17:35 np0005626463.localdomain sudo[151689]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:17:35 np0005626463.localdomain python3.9[151691]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Feb 23 09:17:35 np0005626463.localdomain sudo[151689]: pam_unix(sudo:session): session closed for user root
Feb 23 09:17:36 np0005626463.localdomain sudo[151781]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pihzruhoxztytshydzuansfifancxtiz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838255.750629-156-176194664489232/AnsiballZ_setup.py
Feb 23 09:17:36 np0005626463.localdomain sudo[151781]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:17:36 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2755 DF PROTO=TCP SPT=44236 DPT=9101 SEQ=3556866097 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE992470000000001030307) 
Feb 23 09:17:36 np0005626463.localdomain python3.9[151783]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 23 09:17:36 np0005626463.localdomain sudo[151781]: pam_unix(sudo:session): session closed for user root
Feb 23 09:17:37 np0005626463.localdomain sudo[151835]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jlyyilkerwvzrvchrvkjeffrpxgzqrsl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838255.750629-156-176194664489232/AnsiballZ_dnf.py
Feb 23 09:17:37 np0005626463.localdomain sudo[151835]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:17:37 np0005626463.localdomain python3.9[151837]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch3.3'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 23 09:17:39 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=931 DF PROTO=TCP SPT=51646 DPT=9882 SEQ=1751365345 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE99EB30000000001030307) 
Feb 23 09:17:40 np0005626463.localdomain sshd[151840]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:17:40 np0005626463.localdomain sudo[151835]: pam_unix(sudo:session): session closed for user root
Feb 23 09:17:40 np0005626463.localdomain sshd[151840]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:17:41 np0005626463.localdomain sudo[151931]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-itnbkkxsskkqqxqklnjcgxncoqccweiw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838260.9247177-192-5655875131598/AnsiballZ_systemd.py
Feb 23 09:17:41 np0005626463.localdomain sudo[151931]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:17:41 np0005626463.localdomain python3.9[151933]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Feb 23 09:17:41 np0005626463.localdomain sudo[151931]: pam_unix(sudo:session): session closed for user root
Feb 23 09:17:42 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=933 DF PROTO=TCP SPT=51646 DPT=9882 SEQ=1751365345 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE9AAC60000000001030307) 
Feb 23 09:17:43 np0005626463.localdomain sudo[152026]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-llxyotjxeiydwbdkjlzkwtfhwsgjeqel ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1771838263.1141648-216-154883604206811/AnsiballZ_edpm_nftables_snippet.py
Feb 23 09:17:43 np0005626463.localdomain sudo[152026]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:17:43 np0005626463.localdomain python3[152028]: ansible-osp.edpm.edpm_nftables_snippet Invoked with content=- rule_name: 118 neutron vxlan networks
                                                            rule:
                                                              proto: udp
                                                              dport: 4789
                                                          - rule_name: 119 neutron geneve networks
                                                            rule:
                                                              proto: udp
                                                              dport: 6081
                                                              state: ["UNTRACKED"]
                                                          - rule_name: 120 neutron geneve networks no conntrack
                                                            rule:
                                                              proto: udp
                                                              dport: 6081
                                                              table: raw
                                                              chain: OUTPUT
                                                              jump: NOTRACK
                                                              action: append
                                                              state: []
                                                          - rule_name: 121 neutron geneve networks no conntrack
                                                            rule:
                                                              proto: udp
                                                              dport: 6081
                                                              table: raw
                                                              chain: PREROUTING
                                                              jump: NOTRACK
                                                              action: append
                                                              state: []
                                                           dest=/var/lib/edpm-config/firewall/ovn.yaml state=present
Feb 23 09:17:43 np0005626463.localdomain sudo[152026]: pam_unix(sudo:session): session closed for user root
Feb 23 09:17:44 np0005626463.localdomain sudo[152118]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kwzwamlfbedzxddxpnqthjfjcxeqvlmn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838264.1217027-243-271791392018472/AnsiballZ_file.py
Feb 23 09:17:44 np0005626463.localdomain sudo[152118]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:17:44 np0005626463.localdomain python3.9[152120]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:17:44 np0005626463.localdomain sudo[152118]: pam_unix(sudo:session): session closed for user root
Feb 23 09:17:45 np0005626463.localdomain sudo[152210]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uvuiqnfxjwovdmpiewpqeqpslpcdfhsy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838264.7429872-267-105491541261295/AnsiballZ_stat.py
Feb 23 09:17:45 np0005626463.localdomain sudo[152210]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:17:45 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28152 DF PROTO=TCP SPT=58040 DPT=9105 SEQ=955673459 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE9B6060000000001030307) 
Feb 23 09:17:45 np0005626463.localdomain python3.9[152212]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:17:45 np0005626463.localdomain sudo[152210]: pam_unix(sudo:session): session closed for user root
Feb 23 09:17:45 np0005626463.localdomain sudo[152258]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rlqtnncmqqybnaxlancxvoeddodosdlk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838264.7429872-267-105491541261295/AnsiballZ_file.py
Feb 23 09:17:45 np0005626463.localdomain sudo[152258]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:17:45 np0005626463.localdomain python3.9[152260]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:17:45 np0005626463.localdomain sudo[152258]: pam_unix(sudo:session): session closed for user root
Feb 23 09:17:46 np0005626463.localdomain sudo[152350]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rtmkrvlzhfyzgxbgsiftcgacrtraweyy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838266.109824-303-279442193437133/AnsiballZ_stat.py
Feb 23 09:17:46 np0005626463.localdomain sudo[152350]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:17:46 np0005626463.localdomain python3.9[152352]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:17:46 np0005626463.localdomain sudo[152350]: pam_unix(sudo:session): session closed for user root
Feb 23 09:17:46 np0005626463.localdomain sudo[152398]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hfrawtvxmfmxepccyhwkylhtqgoamqsr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838266.109824-303-279442193437133/AnsiballZ_file.py
Feb 23 09:17:46 np0005626463.localdomain sudo[152398]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:17:47 np0005626463.localdomain python3.9[152400]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.e6yqn866 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:17:47 np0005626463.localdomain sudo[152398]: pam_unix(sudo:session): session closed for user root
Feb 23 09:17:47 np0005626463.localdomain sudo[152490]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nxfstnwqvzrigtrlwnvdejeqeyuaiyte ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838267.2103844-339-263956786854843/AnsiballZ_stat.py
Feb 23 09:17:47 np0005626463.localdomain sudo[152490]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:17:47 np0005626463.localdomain python3.9[152492]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:17:47 np0005626463.localdomain sudo[152490]: pam_unix(sudo:session): session closed for user root
Feb 23 09:17:47 np0005626463.localdomain sudo[152538]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-woesewbbcmojfzttmeakmkluavctsali ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838267.2103844-339-263956786854843/AnsiballZ_file.py
Feb 23 09:17:47 np0005626463.localdomain sudo[152538]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:17:48 np0005626463.localdomain python3.9[152540]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:17:48 np0005626463.localdomain sudo[152538]: pam_unix(sudo:session): session closed for user root
Feb 23 09:17:48 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2757 DF PROTO=TCP SPT=44236 DPT=9101 SEQ=3556866097 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE9C2060000000001030307) 
Feb 23 09:17:48 np0005626463.localdomain sudo[152630]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-alvetazqrdojmfzohhsygmmzdserqzkd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838268.3557298-378-150064260692621/AnsiballZ_command.py
Feb 23 09:17:48 np0005626463.localdomain sudo[152630]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:17:48 np0005626463.localdomain python3.9[152632]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 09:17:48 np0005626463.localdomain sudo[152630]: pam_unix(sudo:session): session closed for user root
Feb 23 09:17:49 np0005626463.localdomain sudo[152723]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nebbjwsjflozhwnyvepdyhqtateiguju ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1771838269.1672356-402-4982191512034/AnsiballZ_edpm_nftables_from_files.py
Feb 23 09:17:49 np0005626463.localdomain sudo[152723]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:17:49 np0005626463.localdomain python3[152725]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Feb 23 09:17:49 np0005626463.localdomain sudo[152723]: pam_unix(sudo:session): session closed for user root
Feb 23 09:17:51 np0005626463.localdomain sudo[152815]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cezbdtdmkmjmvsgvpjughphzaxxbtiez ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838270.7627308-426-46994912217770/AnsiballZ_stat.py
Feb 23 09:17:51 np0005626463.localdomain sudo[152815]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:17:51 np0005626463.localdomain python3.9[152817]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:17:51 np0005626463.localdomain sudo[152815]: pam_unix(sudo:session): session closed for user root
Feb 23 09:17:51 np0005626463.localdomain sudo[152890]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-srrctkrcwviuaodxoakwtezzlqantvdp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838270.7627308-426-46994912217770/AnsiballZ_copy.py
Feb 23 09:17:51 np0005626463.localdomain sudo[152890]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:17:51 np0005626463.localdomain python3.9[152892]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771838270.7627308-426-46994912217770/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:17:51 np0005626463.localdomain sudo[152890]: pam_unix(sudo:session): session closed for user root
Feb 23 09:17:53 np0005626463.localdomain sudo[152982]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-orwxgndjucrezeeuhcialtqmtijcbrff ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838272.660234-471-239901030407009/AnsiballZ_stat.py
Feb 23 09:17:53 np0005626463.localdomain sudo[152982]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:17:53 np0005626463.localdomain python3.9[152984]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:17:53 np0005626463.localdomain sudo[152982]: pam_unix(sudo:session): session closed for user root
Feb 23 09:17:53 np0005626463.localdomain sudo[153057]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ehqifsmbelozobtupohmxiwqtlagposc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838272.660234-471-239901030407009/AnsiballZ_copy.py
Feb 23 09:17:53 np0005626463.localdomain sudo[153057]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:17:53 np0005626463.localdomain python3.9[153059]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771838272.660234-471-239901030407009/.source.nft follow=False _original_basename=jump-chain.j2 checksum=ac8dea350c18f51f54d48dacc09613cda4c5540c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:17:53 np0005626463.localdomain sudo[153057]: pam_unix(sudo:session): session closed for user root
Feb 23 09:17:54 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14301 DF PROTO=TCP SPT=50292 DPT=9102 SEQ=3909814091 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE9D8300000000001030307) 
Feb 23 09:17:54 np0005626463.localdomain sudo[153149]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-htslahueviejlbaxzxivciuvjbqhhnxb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838273.916604-516-233012937398515/AnsiballZ_stat.py
Feb 23 09:17:54 np0005626463.localdomain sudo[153149]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:17:54 np0005626463.localdomain python3.9[153151]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:17:54 np0005626463.localdomain sudo[153149]: pam_unix(sudo:session): session closed for user root
Feb 23 09:17:54 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=935 DF PROTO=TCP SPT=51646 DPT=9882 SEQ=1751365345 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE9DA070000000001030307) 
Feb 23 09:17:54 np0005626463.localdomain sudo[153224]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zaottddbobrrckpykqdnnpcoqidiialv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838273.916604-516-233012937398515/AnsiballZ_copy.py
Feb 23 09:17:54 np0005626463.localdomain sudo[153224]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:17:55 np0005626463.localdomain python3.9[153226]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771838273.916604-516-233012937398515/.source.nft follow=False _original_basename=flush-chain.j2 checksum=4d3ffec49c8eb1a9b80d2f1e8cd64070063a87b4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:17:55 np0005626463.localdomain sudo[153224]: pam_unix(sudo:session): session closed for user root
Feb 23 09:17:55 np0005626463.localdomain sudo[153316]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-adfarhlthkvlrdhebarqbdpnjsfzocsr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838275.1686063-561-164450185047193/AnsiballZ_stat.py
Feb 23 09:17:55 np0005626463.localdomain sudo[153316]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:17:55 np0005626463.localdomain python3.9[153318]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:17:55 np0005626463.localdomain sudo[153316]: pam_unix(sudo:session): session closed for user root
Feb 23 09:17:56 np0005626463.localdomain sudo[153391]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-faqjjkgkbbiktelpvnblshqicekbuhaa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838275.1686063-561-164450185047193/AnsiballZ_copy.py
Feb 23 09:17:56 np0005626463.localdomain sudo[153391]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:17:56 np0005626463.localdomain python3.9[153393]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771838275.1686063-561-164450185047193/.source.nft follow=False _original_basename=chains.j2 checksum=298ada419730ec15df17ded0cc50c97a4014a591 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:17:56 np0005626463.localdomain sudo[153391]: pam_unix(sudo:session): session closed for user root
Feb 23 09:17:56 np0005626463.localdomain sudo[153483]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vrfuciuhipnwbjpyjavbrjybqovedpio ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838276.594574-606-50614896584458/AnsiballZ_stat.py
Feb 23 09:17:56 np0005626463.localdomain sudo[153483]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:17:57 np0005626463.localdomain python3.9[153485]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:17:57 np0005626463.localdomain sudo[153483]: pam_unix(sudo:session): session closed for user root
Feb 23 09:17:57 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14303 DF PROTO=TCP SPT=50292 DPT=9102 SEQ=3909814091 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE9E4460000000001030307) 
Feb 23 09:17:57 np0005626463.localdomain sudo[153558]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ncrkgylwospscnikfcblqtlzidzqsanh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838276.594574-606-50614896584458/AnsiballZ_copy.py
Feb 23 09:17:57 np0005626463.localdomain sudo[153558]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:17:57 np0005626463.localdomain python3.9[153560]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771838276.594574-606-50614896584458/.source.nft follow=False _original_basename=ruleset.j2 checksum=eb691bdb7d792c5f8ff0d719e807fe1c95b09438 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:17:57 np0005626463.localdomain sudo[153558]: pam_unix(sudo:session): session closed for user root
Feb 23 09:17:58 np0005626463.localdomain sudo[153650]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ylxcktgfkgazpqdsiglxupqtvbdfejpl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838277.8749444-651-167557726961263/AnsiballZ_file.py
Feb 23 09:17:58 np0005626463.localdomain sudo[153650]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:17:58 np0005626463.localdomain python3.9[153652]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:17:58 np0005626463.localdomain sudo[153650]: pam_unix(sudo:session): session closed for user root
Feb 23 09:17:58 np0005626463.localdomain sudo[153742]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gkjzwfnlfeaidqixzstfeeuctvytxrqo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838278.5430443-675-18299975168370/AnsiballZ_command.py
Feb 23 09:17:58 np0005626463.localdomain sudo[153742]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:17:59 np0005626463.localdomain python3.9[153744]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 09:17:59 np0005626463.localdomain sudo[153742]: pam_unix(sudo:session): session closed for user root
Feb 23 09:17:59 np0005626463.localdomain sudo[153837]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cdwhwrygbinfcgkkumsjopkpokqruyao ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838279.5298111-699-146799661030032/AnsiballZ_blockinfile.py
Feb 23 09:17:59 np0005626463.localdomain sudo[153837]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:18:00 np0005626463.localdomain python3.9[153839]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                                            include "/etc/nftables/edpm-chains.nft"
                                                            include "/etc/nftables/edpm-rules.nft"
                                                            include "/etc/nftables/edpm-jumps.nft"
                                                             path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:18:00 np0005626463.localdomain sudo[153837]: pam_unix(sudo:session): session closed for user root
Feb 23 09:18:00 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35684 DF PROTO=TCP SPT=60332 DPT=9100 SEQ=3859742910 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE9F2C60000000001030307) 
Feb 23 09:18:01 np0005626463.localdomain sudo[153929]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-axdqyumuylkzpidcduspkyercansjgfd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838280.3688686-726-175457024670161/AnsiballZ_command.py
Feb 23 09:18:01 np0005626463.localdomain sudo[153929]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:18:01 np0005626463.localdomain python3.9[153931]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 09:18:01 np0005626463.localdomain sudo[153929]: pam_unix(sudo:session): session closed for user root
Feb 23 09:18:02 np0005626463.localdomain sudo[154022]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tqmehmetankimqdqrqobdchrktsicuft ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838281.8097398-750-127763037509851/AnsiballZ_stat.py
Feb 23 09:18:02 np0005626463.localdomain sudo[154022]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:18:02 np0005626463.localdomain python3.9[154024]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 23 09:18:02 np0005626463.localdomain sudo[154022]: pam_unix(sudo:session): session closed for user root
Feb 23 09:18:02 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35685 DF PROTO=TCP SPT=60332 DPT=9100 SEQ=3859742910 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE9FAC60000000001030307) 
Feb 23 09:18:03 np0005626463.localdomain sudo[154116]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jdddikyjwemijidkuqczhcomjnmkhrev ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838283.266223-774-58255583987328/AnsiballZ_command.py
Feb 23 09:18:03 np0005626463.localdomain sudo[154116]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:18:03 np0005626463.localdomain python3.9[154118]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 09:18:03 np0005626463.localdomain sudo[154116]: pam_unix(sudo:session): session closed for user root
Feb 23 09:18:04 np0005626463.localdomain sudo[154211]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dkbnygjdganikqtshtpxpqsrslrlnjdn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838284.114224-798-197747937064718/AnsiballZ_file.py
Feb 23 09:18:04 np0005626463.localdomain sudo[154211]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:18:04 np0005626463.localdomain python3.9[154213]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:18:04 np0005626463.localdomain sudo[154211]: pam_unix(sudo:session): session closed for user root
Feb 23 09:18:05 np0005626463.localdomain python3.9[154303]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'machine'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 23 09:18:06 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34093 DF PROTO=TCP SPT=32832 DPT=9101 SEQ=1727785860 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEA07860000000001030307) 
Feb 23 09:18:06 np0005626463.localdomain sudo[154394]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yqgbqlethqowxvdntkrykvkdtlvmzcdk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838286.6359515-921-74220984205708/AnsiballZ_command.py
Feb 23 09:18:06 np0005626463.localdomain sudo[154394]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:18:07 np0005626463.localdomain python3.9[154396]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl set open . external_ids:hostname=np0005626463.localdomain external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings="datacentre:1e:0a:6e:1d:57:37" external_ids:ovn-encap-ip=172.19.0.106 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=tcp:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch 
                                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 09:18:07 np0005626463.localdomain ovs-vsctl[154397]: ovs|00001|vsctl|INFO|Called as ovs-vsctl set open . external_ids:hostname=np0005626463.localdomain external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings=datacentre:1e:0a:6e:1d:57:37 external_ids:ovn-encap-ip=172.19.0.106 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=tcp:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch
Feb 23 09:18:07 np0005626463.localdomain sudo[154394]: pam_unix(sudo:session): session closed for user root
Feb 23 09:18:07 np0005626463.localdomain sudo[154487]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ecltysutfcbrvprcowkwsdskbfqbwzdc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838287.3962235-948-276214975355623/AnsiballZ_command.py
Feb 23 09:18:07 np0005626463.localdomain sudo[154487]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:18:08 np0005626463.localdomain python3.9[154489]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail
                                                            ovs-vsctl show | grep -q "Manager"
                                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 09:18:08 np0005626463.localdomain sudo[154487]: pam_unix(sudo:session): session closed for user root
Feb 23 09:18:08 np0005626463.localdomain python3.9[154582]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 23 09:18:09 np0005626463.localdomain sudo[154599]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 09:18:09 np0005626463.localdomain sudo[154599]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:18:09 np0005626463.localdomain sudo[154599]: pam_unix(sudo:session): session closed for user root
Feb 23 09:18:09 np0005626463.localdomain sudo[154633]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 23 09:18:09 np0005626463.localdomain sudo[154633]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:18:09 np0005626463.localdomain sudo[154704]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zkbsidffufsmuuuuholjqxucssqrcoli ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838289.0650344-1002-4233424271977/AnsiballZ_file.py
Feb 23 09:18:09 np0005626463.localdomain sudo[154704]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:18:09 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62695 DF PROTO=TCP SPT=45470 DPT=9882 SEQ=1824332054 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEA13E30000000001030307) 
Feb 23 09:18:09 np0005626463.localdomain python3.9[154706]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 23 09:18:09 np0005626463.localdomain sudo[154704]: pam_unix(sudo:session): session closed for user root
Feb 23 09:18:09 np0005626463.localdomain sudo[154633]: pam_unix(sudo:session): session closed for user root
Feb 23 09:18:10 np0005626463.localdomain sudo[154828]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fuemvlnamnscaqsaobvevqzakhcszblx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838289.7439015-1026-65035643252020/AnsiballZ_stat.py
Feb 23 09:18:10 np0005626463.localdomain sudo[154828]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:18:10 np0005626463.localdomain python3.9[154830]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:18:10 np0005626463.localdomain sudo[154828]: pam_unix(sudo:session): session closed for user root
Feb 23 09:18:10 np0005626463.localdomain sudo[154833]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 09:18:10 np0005626463.localdomain sudo[154833]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:18:10 np0005626463.localdomain sudo[154833]: pam_unix(sudo:session): session closed for user root
Feb 23 09:18:10 np0005626463.localdomain sudo[154891]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ddmvgxzctxbqwsidtpvqpvitmfszjuoj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838289.7439015-1026-65035643252020/AnsiballZ_file.py
Feb 23 09:18:10 np0005626463.localdomain sudo[154891]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:18:10 np0005626463.localdomain python3.9[154893]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 23 09:18:10 np0005626463.localdomain sudo[154891]: pam_unix(sudo:session): session closed for user root
Feb 23 09:18:11 np0005626463.localdomain sudo[154983]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lnrnqurcdyfjxzlfbuptriovusejljtc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838290.9229841-1026-20867423785772/AnsiballZ_stat.py
Feb 23 09:18:11 np0005626463.localdomain sudo[154983]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:18:11 np0005626463.localdomain python3.9[154985]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:18:11 np0005626463.localdomain sudo[154983]: pam_unix(sudo:session): session closed for user root
Feb 23 09:18:11 np0005626463.localdomain sshd[154988]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:18:12 np0005626463.localdomain sudo[155033]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bufgixrvpyrrhbebifsnpsxgutwlkned ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838290.9229841-1026-20867423785772/AnsiballZ_file.py
Feb 23 09:18:12 np0005626463.localdomain sudo[155033]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:18:12 np0005626463.localdomain python3.9[155035]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 23 09:18:12 np0005626463.localdomain sudo[155033]: pam_unix(sudo:session): session closed for user root
Feb 23 09:18:12 np0005626463.localdomain sshd[154988]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:18:12 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62697 DF PROTO=TCP SPT=45470 DPT=9882 SEQ=1824332054 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEA20060000000001030307) 
Feb 23 09:18:12 np0005626463.localdomain sudo[155125]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vpyanncwcepxmbnddhsjtbaxkwjrdunq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838292.4796734-1095-86531334366645/AnsiballZ_file.py
Feb 23 09:18:12 np0005626463.localdomain sudo[155125]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:18:12 np0005626463.localdomain python3.9[155127]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:18:12 np0005626463.localdomain sudo[155125]: pam_unix(sudo:session): session closed for user root
Feb 23 09:18:13 np0005626463.localdomain sudo[155217]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nootsnwhvdcyuphoxpysbakvsfnrnbmf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838293.116759-1119-120286347193405/AnsiballZ_stat.py
Feb 23 09:18:13 np0005626463.localdomain sudo[155217]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:18:14 np0005626463.localdomain python3.9[155219]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:18:14 np0005626463.localdomain sudo[155217]: pam_unix(sudo:session): session closed for user root
Feb 23 09:18:14 np0005626463.localdomain sudo[155265]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xrmzaetovgulxzdkqdwbbcrxobzcgxro ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838293.116759-1119-120286347193405/AnsiballZ_file.py
Feb 23 09:18:14 np0005626463.localdomain sudo[155265]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:18:14 np0005626463.localdomain python3.9[155267]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:18:14 np0005626463.localdomain sudo[155265]: pam_unix(sudo:session): session closed for user root
Feb 23 09:18:15 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35687 DF PROTO=TCP SPT=60332 DPT=9100 SEQ=3859742910 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEA2A060000000001030307) 
Feb 23 09:18:15 np0005626463.localdomain sudo[155357]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vgteobwgsmiabbmibhdpdypfgiaucpsj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838294.8803594-1155-13815014528802/AnsiballZ_stat.py
Feb 23 09:18:15 np0005626463.localdomain sudo[155357]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:18:15 np0005626463.localdomain python3.9[155359]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:18:15 np0005626463.localdomain sudo[155357]: pam_unix(sudo:session): session closed for user root
Feb 23 09:18:15 np0005626463.localdomain sudo[155405]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ugkbfmcqvrxpcskmpevmhbvohqmabxtd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838294.8803594-1155-13815014528802/AnsiballZ_file.py
Feb 23 09:18:15 np0005626463.localdomain sudo[155405]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:18:15 np0005626463.localdomain python3.9[155407]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:18:15 np0005626463.localdomain sudo[155405]: pam_unix(sudo:session): session closed for user root
Feb 23 09:18:16 np0005626463.localdomain sudo[155497]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cvgkqkjanommkaujdixqcsbjwuglnzry ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838296.35491-1191-276685863762612/AnsiballZ_systemd.py
Feb 23 09:18:16 np0005626463.localdomain sudo[155497]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:18:16 np0005626463.localdomain python3.9[155499]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 09:18:16 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 09:18:17 np0005626463.localdomain systemd-rc-local-generator[155521]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 09:18:17 np0005626463.localdomain systemd-sysv-generator[155525]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 09:18:17 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 09:18:17 np0005626463.localdomain sudo[155497]: pam_unix(sudo:session): session closed for user root
Feb 23 09:18:17 np0005626463.localdomain sudo[155626]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iltsbiqxhxigjxnjvqzibmdykqcldjaq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838297.4368556-1215-53338668213672/AnsiballZ_stat.py
Feb 23 09:18:17 np0005626463.localdomain sudo[155626]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:18:17 np0005626463.localdomain python3.9[155628]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:18:17 np0005626463.localdomain sudo[155626]: pam_unix(sudo:session): session closed for user root
Feb 23 09:18:18 np0005626463.localdomain sudo[155674]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sobdgbcptgxbvahlajjlyfiwhrmzttjx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838297.4368556-1215-53338668213672/AnsiballZ_file.py
Feb 23 09:18:18 np0005626463.localdomain sudo[155674]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:18:18 np0005626463.localdomain python3.9[155676]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:18:18 np0005626463.localdomain sudo[155674]: pam_unix(sudo:session): session closed for user root
Feb 23 09:18:18 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34095 DF PROTO=TCP SPT=32832 DPT=9101 SEQ=1727785860 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEA38060000000001030307) 
Feb 23 09:18:18 np0005626463.localdomain sudo[155766]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oqxolsramkdhhlfadqochvziwqbhxiop ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838298.532806-1251-144677265487765/AnsiballZ_stat.py
Feb 23 09:18:18 np0005626463.localdomain sudo[155766]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:18:18 np0005626463.localdomain python3.9[155768]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:18:19 np0005626463.localdomain sudo[155766]: pam_unix(sudo:session): session closed for user root
Feb 23 09:18:19 np0005626463.localdomain sudo[155814]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xswxocivfcoxcfgvqtadnolnapqgqywa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838298.532806-1251-144677265487765/AnsiballZ_file.py
Feb 23 09:18:19 np0005626463.localdomain sudo[155814]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:18:19 np0005626463.localdomain python3.9[155816]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:18:19 np0005626463.localdomain sudo[155814]: pam_unix(sudo:session): session closed for user root
Feb 23 09:18:19 np0005626463.localdomain sshd[155831]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:18:20 np0005626463.localdomain sshd[155831]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:18:20 np0005626463.localdomain sudo[155908]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mlwpimgnfueixkxexrmbbnlozscwfyct ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838300.1992435-1287-29274112265296/AnsiballZ_systemd.py
Feb 23 09:18:20 np0005626463.localdomain sudo[155908]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:18:20 np0005626463.localdomain python3.9[155910]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 09:18:20 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 09:18:20 np0005626463.localdomain systemd-rc-local-generator[155933]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 09:18:20 np0005626463.localdomain systemd-sysv-generator[155940]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 09:18:20 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 09:18:21 np0005626463.localdomain systemd[1]: Starting Create netns directory...
Feb 23 09:18:21 np0005626463.localdomain systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Feb 23 09:18:21 np0005626463.localdomain systemd[1]: netns-placeholder.service: Deactivated successfully.
Feb 23 09:18:21 np0005626463.localdomain systemd[1]: Finished Create netns directory.
Feb 23 09:18:21 np0005626463.localdomain sudo[155908]: pam_unix(sudo:session): session closed for user root
Feb 23 09:18:21 np0005626463.localdomain sudo[156042]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gntnyxfmcewmnmznysuvlosmfhacextk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838301.5059965-1317-207775796809858/AnsiballZ_file.py
Feb 23 09:18:21 np0005626463.localdomain sudo[156042]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:18:22 np0005626463.localdomain python3.9[156044]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 23 09:18:22 np0005626463.localdomain sudo[156042]: pam_unix(sudo:session): session closed for user root
Feb 23 09:18:23 np0005626463.localdomain sudo[156134]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tbyugxfvruadsbhrqxujkfdfkqmkhwkf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838302.7522328-1341-27714097638205/AnsiballZ_stat.py
Feb 23 09:18:23 np0005626463.localdomain sudo[156134]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:18:23 np0005626463.localdomain python3.9[156136]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_controller/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:18:23 np0005626463.localdomain sudo[156134]: pam_unix(sudo:session): session closed for user root
Feb 23 09:18:23 np0005626463.localdomain sudo[156207]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wrmfoysxmktlvxyrktalavnhnmecomof ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838302.7522328-1341-27714097638205/AnsiballZ_copy.py
Feb 23 09:18:23 np0005626463.localdomain sudo[156207]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:18:23 np0005626463.localdomain python3.9[156209]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_controller/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771838302.7522328-1341-27714097638205/.source _original_basename=healthcheck follow=False checksum=4098dd010265fabdf5c26b97d169fc4e575ff457 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Feb 23 09:18:23 np0005626463.localdomain sudo[156207]: pam_unix(sudo:session): session closed for user root
Feb 23 09:18:24 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10955 DF PROTO=TCP SPT=51976 DPT=9102 SEQ=101057569 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEA4D600000000001030307) 
Feb 23 09:18:24 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62699 DF PROTO=TCP SPT=45470 DPT=9882 SEQ=1824332054 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEA50060000000001030307) 
Feb 23 09:18:25 np0005626463.localdomain sudo[156299]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-blhkkngvcbskkvysjrekuvwpqssleugl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838304.835696-1392-243864394357832/AnsiballZ_file.py
Feb 23 09:18:25 np0005626463.localdomain sudo[156299]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:18:25 np0005626463.localdomain python3.9[156301]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:18:25 np0005626463.localdomain sudo[156299]: pam_unix(sudo:session): session closed for user root
Feb 23 09:18:25 np0005626463.localdomain sudo[156391]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uluaypzcspzavizrkdzhmgffhtquxtgr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838305.6765814-1416-212975953410815/AnsiballZ_file.py
Feb 23 09:18:25 np0005626463.localdomain sudo[156391]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:18:26 np0005626463.localdomain python3.9[156393]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 23 09:18:26 np0005626463.localdomain sudo[156391]: pam_unix(sudo:session): session closed for user root
Feb 23 09:18:26 np0005626463.localdomain sudo[156483]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lqelrgamtrkgckebrzajuljdonztekld ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838306.3763897-1440-44078644384584/AnsiballZ_stat.py
Feb 23 09:18:26 np0005626463.localdomain sudo[156483]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:18:26 np0005626463.localdomain python3.9[156485]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_controller.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:18:26 np0005626463.localdomain sudo[156483]: pam_unix(sudo:session): session closed for user root
Feb 23 09:18:27 np0005626463.localdomain sudo[156558]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rqhouljhuhidnpkjjuacqivhijdacqri ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838306.3763897-1440-44078644384584/AnsiballZ_copy.py
Feb 23 09:18:27 np0005626463.localdomain sudo[156558]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:18:27 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10957 DF PROTO=TCP SPT=51976 DPT=9102 SEQ=101057569 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEA59870000000001030307) 
Feb 23 09:18:27 np0005626463.localdomain python3.9[156560]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_controller.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1771838306.3763897-1440-44078644384584/.source.json _original_basename=.kvvmbsge follow=False checksum=38f75f59f5c2ef6b5da12297bfd31cd1e97012ac backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:18:27 np0005626463.localdomain sudo[156558]: pam_unix(sudo:session): session closed for user root
Feb 23 09:18:27 np0005626463.localdomain python3.9[156650]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_controller state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:18:29 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52734 DF PROTO=TCP SPT=39394 DPT=9100 SEQ=1762409738 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEA63D60000000001030307) 
Feb 23 09:18:29 np0005626463.localdomain sudo[156901]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ioxhtexpfqbaywcwsyxtxqgqfqnvslaa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838309.4836721-1560-198927807865049/AnsiballZ_container_config_data.py
Feb 23 09:18:29 np0005626463.localdomain sudo[156901]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:18:30 np0005626463.localdomain python3.9[156903]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_controller config_pattern=*.json debug=False
Feb 23 09:18:30 np0005626463.localdomain sudo[156901]: pam_unix(sudo:session): session closed for user root
Feb 23 09:18:30 np0005626463.localdomain sudo[156993]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-masmftsetoowoszkxujwgtfnymftigzz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838310.5008352-1593-112240640729968/AnsiballZ_container_config_hash.py
Feb 23 09:18:30 np0005626463.localdomain sudo[156993]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:18:31 np0005626463.localdomain python3.9[156995]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Feb 23 09:18:31 np0005626463.localdomain sudo[156993]: pam_unix(sudo:session): session closed for user root
Feb 23 09:18:32 np0005626463.localdomain sudo[157085]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dexouraoxutpohbwqitpjaftaznzhfyd ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1771838311.6726375-1623-277374806289334/AnsiballZ_edpm_container_manage.py
Feb 23 09:18:32 np0005626463.localdomain sudo[157085]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:18:32 np0005626463.localdomain python3[157087]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_controller config_id=ovn_controller config_overrides={} config_patterns=*.json containers=['ovn_controller'] log_base_path=/var/log/containers/stdouts debug=False
Feb 23 09:18:32 np0005626463.localdomain python3[157087]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: [
                                                               {
                                                                    "Id": "bfb93be9d83c3121be0312d4d8c02944841d931c726f68b412221913286262d4",
                                                                    "Digest": "sha256:5a01d6902fcff84f31d264784a24433f1266e51e84e70ca3796953855fdec417",
                                                                    "RepoTags": [
                                                                         "quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified"
                                                                    ],
                                                                    "RepoDigests": [
                                                                         "quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:5a01d6902fcff84f31d264784a24433f1266e51e84e70ca3796953855fdec417"
                                                                    ],
                                                                    "Parent": "",
                                                                    "Comment": "",
                                                                    "Created": "2026-02-23T06:34:22.194153324Z",
                                                                    "Config": {
                                                                         "User": "root",
                                                                         "Env": [
                                                                              "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",
                                                                              "LANG=en_US.UTF-8",
                                                                              "TZ=UTC",
                                                                              "container=oci"
                                                                         ],
                                                                         "Entrypoint": [
                                                                              "dumb-init",
                                                                              "--single-child",
                                                                              "--"
                                                                         ],
                                                                         "Cmd": [
                                                                              "kolla_start"
                                                                         ],
                                                                         "Labels": {
                                                                              "io.buildah.version": "1.43.0",
                                                                              "maintainer": "OpenStack Kubernetes Operator team",
                                                                              "org.label-schema.build-date": "20260216",
                                                                              "org.label-schema.license": "GPLv2",
                                                                              "org.label-schema.name": "CentOS Stream 9 Base Image",
                                                                              "org.label-schema.schema-version": "1.0",
                                                                              "org.label-schema.vendor": "CentOS",
                                                                              "tcib_build_tag": "8419493e1fd846703d277695e03fc5eb",
                                                                              "tcib_managed": "true"
                                                                         },
                                                                         "StopSignal": "SIGTERM"
                                                                    },
                                                                    "Version": "",
                                                                    "Author": "",
                                                                    "Architecture": "amd64",
                                                                    "Os": "linux",
                                                                    "Size": 347092937,
                                                                    "VirtualSize": 347092937,
                                                                    "GraphDriver": {
                                                                         "Name": "overlay",
                                                                         "Data": {
                                                                              "LowerDir": "/var/lib/containers/storage/overlay/f3afd1cf5e6198a170887a65c5f10af446afae7f60b1c2348209fc3be458dddf/diff:/var/lib/containers/storage/overlay/882df85a0cf43e46bc799aafd5ff81035654b304c2fef5dbd26c9dd0c2e9fcc3/diff:/var/lib/containers/storage/overlay/d9f14c75a7289cf010d2e5175c554193dba109f864fe39fc418f3bc5b90efe9d/diff",
                                                                              "UpperDir": "/var/lib/containers/storage/overlay/9ebf51f80a46e835820a271b66c56bf3153d0ad4226e954d9a4e5952244e92d3/diff",
                                                                              "WorkDir": "/var/lib/containers/storage/overlay/9ebf51f80a46e835820a271b66c56bf3153d0ad4226e954d9a4e5952244e92d3/work"
                                                                         }
                                                                    },
                                                                    "RootFS": {
                                                                         "Type": "layers",
                                                                         "Layers": [
                                                                              "sha256:d9f14c75a7289cf010d2e5175c554193dba109f864fe39fc418f3bc5b90efe9d",
                                                                              "sha256:6eb5d45c6942983139aec78264b4b68bafe46465bb40e2bb4c09e78dad8ba6c0",
                                                                              "sha256:4488e457e941888ff222080c5c98fc98b827e2e0699d850c0a8b0f12f152d8f5",
                                                                              "sha256:bde1ac8945157434308ea323cfa7054085e8af54598c165ad28f8de2052547eb"
                                                                         ]
                                                                    },
                                                                    "Labels": {
                                                                         "io.buildah.version": "1.43.0",
                                                                         "maintainer": "OpenStack Kubernetes Operator team",
                                                                         "org.label-schema.build-date": "20260216",
                                                                         "org.label-schema.license": "GPLv2",
                                                                         "org.label-schema.name": "CentOS Stream 9 Base Image",
                                                                         "org.label-schema.schema-version": "1.0",
                                                                         "org.label-schema.vendor": "CentOS",
                                                                         "tcib_build_tag": "8419493e1fd846703d277695e03fc5eb",
                                                                         "tcib_managed": "true"
                                                                    },
                                                                    "Annotations": {},
                                                                    "ManifestType": "application/vnd.docker.distribution.manifest.v2+json",
                                                                    "User": "root",
                                                                    "History": [
                                                                         {
                                                                              "created": "2026-02-17T01:25:07.246646992Z",
                                                                              "created_by": "/bin/sh -c #(nop) ADD file:d064f128d9bf147a386d5c0e8c2e8a6f698c81fb4e2404e09afe5ef1e1d3b529 in / ",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-17T01:25:07.246739119Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL org.label-schema.schema-version=\"1.0\"     org.label-schema.name=\"CentOS Stream 9 Base Image\"     org.label-schema.vendor=\"CentOS\"     org.label-schema.license=\"GPLv2\"     org.label-schema.build-date=\"20260216\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-17T01:25:12.132997501Z",
                                                                              "created_by": "/bin/sh -c #(nop) CMD [\"/bin/bash\"]"
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:08:39.081651802Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL maintainer=\"OpenStack Kubernetes Operator team\"",
                                                                              "comment": "FROM quay.io/centos/centos:stream9",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:08:39.081666472Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL tcib_managed=true",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:08:39.081677733Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV LANG=\"en_US.UTF-8\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:08:39.081688343Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV TZ=\"UTC\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:08:39.081701553Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV container=\"oci\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:08:39.081710413Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:08:39.413481757Z",
                                                                              "created_by": "/bin/sh -c if [ -f \"/etc/yum.repos.d/ubi.repo\" ]; then rm -f /etc/yum.repos.d/ubi.repo && dnf clean all && rm -rf /var/cache/dnf; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:09:13.490649497Z",
                                                                              "created_by": "/bin/sh -c dnf install -y crudini && crudini --del /etc/dnf/dnf.conf main override_install_langs && crudini --set /etc/dnf/dnf.conf main clean_requirements_on_remove True && crudini --set /etc/dnf/dnf.conf main exactarch 1 && crudini --set /etc/dnf/dnf.conf main gpgcheck 1 && crudini --set /etc/dnf/dnf.conf main install_weak_deps False && if [ 'centos' == 'centos' ];then crudini --set /etc/dnf/dnf.conf main best False; fi && crudini --set /etc/dnf/dnf.conf main installonly_limit 0 && crudini --set /etc/dnf/dnf.conf main keepcache 0 && crudini --set /etc/dnf/dnf.conf main obsoletes 1 && crudini --set /etc/dnf/dnf.conf main plugins 1 && crudini --set /etc/dnf/dnf.conf main skip_missing_names_on_install False && crudini --set /etc/dnf/dnf.conf main tsflags nodocs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:09:16.454967918Z",
                                                                              "created_by": "/bin/sh -c dnf install -y ca-certificates dumb-init glibc-langpack-en procps-ng python3 sudo util-linux-user which python-tcib-containers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:09:16.773383448Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/uid_gid_manage.sh /usr/local/bin/uid_gid_manage",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:09:17.106005079Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/uid_gid_manage",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:09:17.70903377Z",
                                                                              "created_by": "/bin/sh -c bash /usr/local/bin/uid_gid_manage kolla hugetlbfs libvirt qemu",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:09:18.031262928Z",
                                                                              "created_by": "/bin/sh -c touch /usr/local/bin/kolla_extend_start && chmod 755 /usr/local/bin/kolla_extend_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:09:18.339397779Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/set_configs.py /usr/local/bin/kolla_set_configs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:09:18.685304171Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_set_configs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:09:18.995385131Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/start.sh /usr/local/bin/kolla_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:09:19.318437706Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:09:19.622355571Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/httpd_setup.sh /usr/local/bin/kolla_httpd_setup",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:09:19.942779192Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_httpd_setup",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:09:20.272959154Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/copy_cacerts.sh /usr/local/bin/kolla_copy_cacerts",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:09:20.574527009Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_copy_cacerts",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:09:20.904983206Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/sudoers /etc/sudoers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:09:21.231560784Z",
                                                                              "created_by": "/bin/sh -c chmod 440 /etc/sudoers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:09:21.544724487Z",
                                                                              "created_by": "/bin/sh -c sed -ri '/^(passwd:|group:)/ s/systemd//g' /etc/nsswitch.conf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:09:24.726828741Z",
                                                                              "created_by": "/bin/sh -c dnf -y reinstall which && rpm -e --nodeps tzdata && dnf -y install tzdata",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:09:25.052065401Z",
                                                                              "created_by": "/bin/sh -c if [ ! -f \"/etc/localtime\" ]; then ln -s /usr/share/zoneinfo/Etc/UTC /etc/localtime; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:09:25.374537445Z",
                                                                              "created_by": "/bin/sh -c mkdir -p /openstack",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:09:26.855611087Z",
                                                                              "created_by": "/bin/sh -c if [ 'centos' == 'centos' ];then if [ -n \"$(rpm -qa redhat-release)\" ];then rpm -e --nodeps redhat-release; fi ; dnf -y install centos-stream-release; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:09:28.628718632Z",
                                                                              "created_by": "/bin/sh -c dnf update --excludepkgs redhat-release -y && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:09:28.628779184Z",
                                                                              "created_by": "/bin/sh -c #(nop) STOPSIGNAL SIGTERM",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:09:28.628797064Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENTRYPOINT [\"dumb-init\", \"--single-child\", \"--\"]",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:09:28.628808854Z",
                                                                              "created_by": "/bin/sh -c #(nop) CMD [\"kolla_start\"]",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:09:29.517110337Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"8419493e1fd846703d277695e03fc5eb\""
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:11:22.017157713Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-base:8419493e1fd846703d277695e03fc5eb",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:12:17.588275113Z",
                                                                              "created_by": "/bin/sh -c dnf -y install openvswitch openvswitch-ovn-common python3-netifaces python3-openvswitch tcpdump && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:12:19.121893043Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"8419493e1fd846703d277695e03fc5eb\""
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:33:47.07749631Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-ovn-base:8419493e1fd846703d277695e03fc5eb",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:34:22.192813043Z",
                                                                              "created_by": "/bin/sh -c dnf -y install openvswitch-ovn-host && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:34:24.134905063Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"8419493e1fd846703d277695e03fc5eb\""
                                                                         }
                                                                    ],
                                                                    "NamesHistory": [
                                                                         "quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified"
                                                                    ]
                                                               }
                                                          ]
                                                          : quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Feb 23 09:18:32 np0005626463.localdomain podman[157138]: 2026-02-23 09:18:32.843551039 +0000 UTC m=+0.092819704 container remove 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:36:40Z, release=1766032510, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:36:40Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, name=rhosp-rhel9/openstack-ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, distribution-scope=public, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., version=17.1.13, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, com.redhat.component=openstack-ovn-controller-container, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c)
Feb 23 09:18:32 np0005626463.localdomain python3[157087]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman rm --force ovn_controller
Feb 23 09:18:32 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52736 DF PROTO=TCP SPT=39394 DPT=9100 SEQ=1762409738 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEA6FC70000000001030307) 
Feb 23 09:18:32 np0005626463.localdomain podman[157152]: 
Feb 23 09:18:32 np0005626463.localdomain podman[157152]: 2026-02-23 09:18:32.951392399 +0000 UTC m=+0.086312162 container create 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Feb 23 09:18:32 np0005626463.localdomain podman[157152]: 2026-02-23 09:18:32.911347734 +0000 UTC m=+0.046267517 image pull  quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Feb 23 09:18:32 np0005626463.localdomain python3[157087]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_controller --conmon-pidfile /run/ovn_controller.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838 --healthcheck-command /openstack/healthcheck --label config_id=ovn_controller --label container_name=ovn_controller --label managed_by=edpm_ansible --label config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --user root --volume /lib/modules:/lib/modules:ro --volume /run:/run --volume /var/lib/openvswitch/ovn:/run/ovn:shared,z --volume /var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Feb 23 09:18:33 np0005626463.localdomain sudo[157085]: pam_unix(sudo:session): session closed for user root
Feb 23 09:18:34 np0005626463.localdomain sudo[157278]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nmqfdkimxavcmxoyctqlbwucudapindy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838313.8987026-1647-75178543319738/AnsiballZ_stat.py
Feb 23 09:18:34 np0005626463.localdomain sudo[157278]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:18:34 np0005626463.localdomain python3.9[157280]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 23 09:18:34 np0005626463.localdomain sudo[157278]: pam_unix(sudo:session): session closed for user root
Feb 23 09:18:34 np0005626463.localdomain sudo[157372]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ldgsayquufgcrbjnksueeoeahrzryrrx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838314.6746202-1674-126470386906816/AnsiballZ_file.py
Feb 23 09:18:34 np0005626463.localdomain sudo[157372]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:18:35 np0005626463.localdomain python3.9[157374]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_controller.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:18:35 np0005626463.localdomain sudo[157372]: pam_unix(sudo:session): session closed for user root
Feb 23 09:18:35 np0005626463.localdomain sudo[157418]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zjgjthsnmrqcqhmthpudmyfekdjwjcfr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838314.6746202-1674-126470386906816/AnsiballZ_stat.py
Feb 23 09:18:35 np0005626463.localdomain sudo[157418]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:18:36 np0005626463.localdomain python3.9[157420]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_controller_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 23 09:18:36 np0005626463.localdomain sudo[157418]: pam_unix(sudo:session): session closed for user root
Feb 23 09:18:36 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32006 DF PROTO=TCP SPT=58964 DPT=9101 SEQ=2434591119 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEA7CC60000000001030307) 
Feb 23 09:18:36 np0005626463.localdomain sudo[157509]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-agxkzcncemyfreajstwcympdpezvwmqy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838316.128913-1674-140198899223835/AnsiballZ_copy.py
Feb 23 09:18:36 np0005626463.localdomain sudo[157509]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:18:36 np0005626463.localdomain python3.9[157511]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1771838316.128913-1674-140198899223835/source dest=/etc/systemd/system/edpm_ovn_controller.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:18:36 np0005626463.localdomain sudo[157509]: pam_unix(sudo:session): session closed for user root
Feb 23 09:18:37 np0005626463.localdomain sudo[157555]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wizognkakvlpsszljcjekgfrcnjecujf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838316.128913-1674-140198899223835/AnsiballZ_systemd.py
Feb 23 09:18:37 np0005626463.localdomain sudo[157555]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:18:37 np0005626463.localdomain python3.9[157557]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 23 09:18:37 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 09:18:37 np0005626463.localdomain systemd-rc-local-generator[157580]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 09:18:37 np0005626463.localdomain systemd-sysv-generator[157583]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 09:18:37 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 09:18:37 np0005626463.localdomain sudo[157555]: pam_unix(sudo:session): session closed for user root
Feb 23 09:18:37 np0005626463.localdomain sudo[157637]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rzfhtsugxrxmmlazggwzjdwcpnxkvqdl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838316.128913-1674-140198899223835/AnsiballZ_systemd.py
Feb 23 09:18:37 np0005626463.localdomain sudo[157637]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:18:38 np0005626463.localdomain python3.9[157639]: ansible-systemd Invoked with state=restarted name=edpm_ovn_controller.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 09:18:38 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 09:18:38 np0005626463.localdomain systemd-rc-local-generator[157664]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 09:18:38 np0005626463.localdomain systemd-sysv-generator[157669]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 09:18:38 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 09:18:38 np0005626463.localdomain systemd[1]: Starting ovn_controller container...
Feb 23 09:18:38 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 09:18:38 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e81b0c2abad543b5961626b9b2efedf2d0a2337c6f3b40a4800cdd043c8a8213/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Feb 23 09:18:38 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.
Feb 23 09:18:38 np0005626463.localdomain podman[157681]: 2026-02-23 09:18:38.809644232 +0000 UTC m=+0.160306519 container init 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20260216, container_name=ovn_controller)
Feb 23 09:18:38 np0005626463.localdomain ovn_controller[157695]: + sudo -E kolla_set_configs
Feb 23 09:18:38 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.
Feb 23 09:18:38 np0005626463.localdomain podman[157681]: 2026-02-23 09:18:38.8462758 +0000 UTC m=+0.196937987 container start 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260216, container_name=ovn_controller, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, tcib_managed=true)
Feb 23 09:18:38 np0005626463.localdomain edpm-start-podman-container[157681]: ovn_controller
Feb 23 09:18:38 np0005626463.localdomain systemd[1]: Created slice User Slice of UID 0.
Feb 23 09:18:38 np0005626463.localdomain systemd[1]: Starting User Runtime Directory /run/user/0...
Feb 23 09:18:38 np0005626463.localdomain systemd[1]: Finished User Runtime Directory /run/user/0.
Feb 23 09:18:38 np0005626463.localdomain systemd[1]: Starting User Manager for UID 0...
Feb 23 09:18:38 np0005626463.localdomain systemd[157729]: pam_unix(systemd-user:session): session opened for user root(uid=0) by (uid=0)
Feb 23 09:18:38 np0005626463.localdomain edpm-start-podman-container[157680]: Creating additional drop-in dependency for "ovn_controller" (83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc)
Feb 23 09:18:38 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 09:18:39 np0005626463.localdomain podman[157703]: 2026-02-23 09:18:39.01629944 +0000 UTC m=+0.163833979 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=starting, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.43.0)
Feb 23 09:18:39 np0005626463.localdomain systemd[157729]: Queued start job for default target Main User Target.
Feb 23 09:18:39 np0005626463.localdomain systemd[157729]: Created slice User Application Slice.
Feb 23 09:18:39 np0005626463.localdomain systemd[157729]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Feb 23 09:18:39 np0005626463.localdomain systemd[157729]: Started Daily Cleanup of User's Temporary Directories.
Feb 23 09:18:39 np0005626463.localdomain systemd[157729]: Reached target Paths.
Feb 23 09:18:39 np0005626463.localdomain systemd[157729]: Reached target Timers.
Feb 23 09:18:39 np0005626463.localdomain systemd[157729]: Starting D-Bus User Message Bus Socket...
Feb 23 09:18:39 np0005626463.localdomain systemd[157729]: Starting Create User's Volatile Files and Directories...
Feb 23 09:18:39 np0005626463.localdomain systemd-rc-local-generator[157782]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 09:18:39 np0005626463.localdomain podman[157703]: 2026-02-23 09:18:39.056246241 +0000 UTC m=+0.203780760 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260216, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team)
Feb 23 09:18:39 np0005626463.localdomain systemd[157729]: Finished Create User's Volatile Files and Directories.
Feb 23 09:18:39 np0005626463.localdomain systemd[157729]: Listening on D-Bus User Message Bus Socket.
Feb 23 09:18:39 np0005626463.localdomain systemd[157729]: Reached target Sockets.
Feb 23 09:18:39 np0005626463.localdomain systemd[157729]: Reached target Basic System.
Feb 23 09:18:39 np0005626463.localdomain systemd[157729]: Reached target Main User Target.
Feb 23 09:18:39 np0005626463.localdomain systemd[157729]: Startup finished in 118ms.
Feb 23 09:18:39 np0005626463.localdomain systemd-sysv-generator[157785]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 09:18:39 np0005626463.localdomain podman[157703]: unhealthy
Feb 23 09:18:39 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 09:18:39 np0005626463.localdomain systemd[1]: Started User Manager for UID 0.
Feb 23 09:18:39 np0005626463.localdomain systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Main process exited, code=exited, status=1/FAILURE
Feb 23 09:18:39 np0005626463.localdomain systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Failed with result 'exit-code'.
Feb 23 09:18:39 np0005626463.localdomain systemd-journald[47710]: Field hash table of /run/log/journal/c0212a8b024a111cfc61293864f36c87/system.journal has a fill level at 75.4 (251 of 333 items), suggesting rotation.
Feb 23 09:18:39 np0005626463.localdomain systemd-journald[47710]: /run/log/journal/c0212a8b024a111cfc61293864f36c87/system.journal: Journal header limits reached or header out-of-date, rotating.
Feb 23 09:18:39 np0005626463.localdomain systemd[1]: Started ovn_controller container.
Feb 23 09:18:39 np0005626463.localdomain rsyslogd[758]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Feb 23 09:18:39 np0005626463.localdomain rsyslogd[758]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Feb 23 09:18:39 np0005626463.localdomain sudo[157637]: pam_unix(sudo:session): session closed for user root
Feb 23 09:18:39 np0005626463.localdomain systemd[1]: Started Session c12 of User root.
Feb 23 09:18:39 np0005626463.localdomain ovn_controller[157695]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Feb 23 09:18:39 np0005626463.localdomain ovn_controller[157695]: INFO:__main__:Validating config file
Feb 23 09:18:39 np0005626463.localdomain ovn_controller[157695]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Feb 23 09:18:39 np0005626463.localdomain ovn_controller[157695]: INFO:__main__:Writing out command to execute
Feb 23 09:18:39 np0005626463.localdomain systemd[1]: session-c12.scope: Deactivated successfully.
Feb 23 09:18:39 np0005626463.localdomain ovn_controller[157695]: ++ cat /run_command
Feb 23 09:18:39 np0005626463.localdomain ovn_controller[157695]: + CMD='/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock '
Feb 23 09:18:39 np0005626463.localdomain ovn_controller[157695]: + ARGS=
Feb 23 09:18:39 np0005626463.localdomain ovn_controller[157695]: + sudo kolla_copy_cacerts
Feb 23 09:18:39 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21854 DF PROTO=TCP SPT=45638 DPT=9882 SEQ=3727925378 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEA89130000000001030307) 
Feb 23 09:18:39 np0005626463.localdomain systemd[1]: Started Session c13 of User root.
Feb 23 09:18:39 np0005626463.localdomain systemd[1]: session-c13.scope: Deactivated successfully.
Feb 23 09:18:39 np0005626463.localdomain ovn_controller[157695]: + [[ ! -n '' ]]
Feb 23 09:18:39 np0005626463.localdomain ovn_controller[157695]: + . kolla_extend_start
Feb 23 09:18:39 np0005626463.localdomain ovn_controller[157695]: Running command: '/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock '
Feb 23 09:18:39 np0005626463.localdomain ovn_controller[157695]: + echo 'Running command: '\''/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock '\'''
Feb 23 09:18:39 np0005626463.localdomain ovn_controller[157695]: + umask 0022
Feb 23 09:18:39 np0005626463.localdomain ovn_controller[157695]: + exec /usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock
Feb 23 09:18:39 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:18:39Z|00001|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Feb 23 09:18:39 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:18:39Z|00002|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Feb 23 09:18:39 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:18:39Z|00003|main|INFO|OVN internal version is : [24.03.8-20.33.0-76.8]
Feb 23 09:18:39 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:18:39Z|00004|main|INFO|OVS IDL reconnected, force recompute.
Feb 23 09:18:39 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:18:39Z|00005|reconnect|INFO|tcp:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 23 09:18:39 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:18:39Z|00006|main|INFO|OVNSB IDL reconnected, force recompute.
Feb 23 09:18:39 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:18:39Z|00007|reconnect|INFO|tcp:ovsdbserver-sb.openstack.svc:6642: connected
Feb 23 09:18:39 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:18:39Z|00008|features|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Feb 23 09:18:39 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:18:39Z|00009|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Feb 23 09:18:39 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:18:39Z|00010|features|INFO|OVS Feature: ct_zero_snat, state: supported
Feb 23 09:18:39 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:18:39Z|00011|features|INFO|OVS Feature: ct_flush, state: supported
Feb 23 09:18:39 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:18:39Z|00012|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Feb 23 09:18:39 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:18:39Z|00013|main|INFO|OVS feature set changed, force recompute.
Feb 23 09:18:39 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:18:39Z|00014|ofctrl|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Feb 23 09:18:39 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:18:39Z|00015|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Feb 23 09:18:39 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:18:39Z|00016|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Feb 23 09:18:39 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:18:39Z|00017|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Feb 23 09:18:39 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:18:39Z|00018|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Feb 23 09:18:39 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:18:39Z|00019|ofctrl|INFO|ofctrl-wait-before-clear is now 8000 ms (was 0 ms)
Feb 23 09:18:39 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:18:39Z|00020|main|INFO|OVS OpenFlow connection reconnected,force recompute.
Feb 23 09:18:39 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:18:39Z|00021|main|INFO|OVS feature set changed, force recompute.
Feb 23 09:18:39 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:18:39Z|00022|ovn_bfd|INFO|Disabled BFD on interface ovn-5b0126-0
Feb 23 09:18:39 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:18:39Z|00023|ovn_bfd|INFO|Disabled BFD on interface ovn-585d62-0
Feb 23 09:18:39 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:18:39Z|00024|ovn_bfd|INFO|Disabled BFD on interface ovn-b9c72d-0
Feb 23 09:18:39 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:18:39Z|00025|features|INFO|OVS DB schema supports 4 flow table prefixes, our IDL supports: 4
Feb 23 09:18:39 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:18:39Z|00026|binding|INFO|Releasing lport 4143c8ea-7577-4792-9744-bcff90eb20f2 from this chassis (sb_readonly=0)
Feb 23 09:18:39 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:18:39Z|00027|binding|INFO|Claiming lport a27e5011-2016-4b16-b5e8-04b555b30bc4 for this chassis.
Feb 23 09:18:39 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:18:39Z|00028|binding|INFO|a27e5011-2016-4b16-b5e8-04b555b30bc4: Claiming fa:16:3e:a0:9d:00 192.168.0.12
Feb 23 09:18:39 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:18:39Z|00029|binding|INFO|Removing lport a27e5011-2016-4b16-b5e8-04b555b30bc4 ovn-installed in OVS
Feb 23 09:18:39 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:18:39Z|00001|pinctrl(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Feb 23 09:18:39 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:18:39Z|00001|statctrl(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Feb 23 09:18:39 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:18:39Z|00002|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Feb 23 09:18:39 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:18:39Z|00002|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Feb 23 09:18:39 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:18:39Z|00003|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Feb 23 09:18:39 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:18:39Z|00003|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Feb 23 09:18:39 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:18:39Z|00030|binding|INFO|Releasing lport 4143c8ea-7577-4792-9744-bcff90eb20f2 from this chassis (sb_readonly=0)
Feb 23 09:18:39 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:18:39Z|00031|ovn_bfd|INFO|Enabled BFD on interface ovn-5b0126-0
Feb 23 09:18:39 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:18:39Z|00032|ovn_bfd|INFO|Enabled BFD on interface ovn-585d62-0
Feb 23 09:18:39 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:18:39Z|00033|ovn_bfd|INFO|Enabled BFD on interface ovn-b9c72d-0
Feb 23 09:18:39 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:18:39Z|00034|binding|INFO|Releasing lport 4143c8ea-7577-4792-9744-bcff90eb20f2 from this chassis (sb_readonly=0)
Feb 23 09:18:39 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:18:39Z|00035|binding|INFO|Releasing lport 4143c8ea-7577-4792-9744-bcff90eb20f2 from this chassis (sb_readonly=0)
Feb 23 09:18:39 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:18:39Z|00036|binding|INFO|Releasing lport 4143c8ea-7577-4792-9744-bcff90eb20f2 from this chassis (sb_readonly=0)
Feb 23 09:18:39 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:18:39Z|00037|binding|INFO|Releasing lport 4143c8ea-7577-4792-9744-bcff90eb20f2 from this chassis (sb_readonly=0)
Feb 23 09:18:39 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:18:39Z|00038|binding|INFO|Releasing lport 4143c8ea-7577-4792-9744-bcff90eb20f2 from this chassis (sb_readonly=0)
Feb 23 09:18:40 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:18:40Z|00039|binding|INFO|Releasing lport 4143c8ea-7577-4792-9744-bcff90eb20f2 from this chassis (sb_readonly=0)
Feb 23 09:18:40 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:18:40Z|00040|binding|INFO|Releasing lport 4143c8ea-7577-4792-9744-bcff90eb20f2 from this chassis (sb_readonly=0)
Feb 23 09:18:40 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:18:40Z|00041|binding|INFO|Releasing lport 4143c8ea-7577-4792-9744-bcff90eb20f2 from this chassis (sb_readonly=0)
Feb 23 09:18:40 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:18:40Z|00042|binding|INFO|Releasing lport 4143c8ea-7577-4792-9744-bcff90eb20f2 from this chassis (sb_readonly=0)
Feb 23 09:18:41 np0005626463.localdomain python3.9[157895]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Feb 23 09:18:41 np0005626463.localdomain sudo[157985]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-buvopkmhylkbdkuzwuawovlswjxsssls ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838321.7459009-1809-270329035118453/AnsiballZ_stat.py
Feb 23 09:18:42 np0005626463.localdomain sudo[157985]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:18:42 np0005626463.localdomain python3.9[157987]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:18:42 np0005626463.localdomain sudo[157985]: pam_unix(sudo:session): session closed for user root
Feb 23 09:18:42 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21856 DF PROTO=TCP SPT=45638 DPT=9882 SEQ=3727925378 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEA95060000000001030307) 
Feb 23 09:18:42 np0005626463.localdomain sudo[158058]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tjfvucjswloyiuvuysgmdkohlcosgnrv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838321.7459009-1809-270329035118453/AnsiballZ_copy.py
Feb 23 09:18:42 np0005626463.localdomain sudo[158058]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:18:42 np0005626463.localdomain python3.9[158060]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771838321.7459009-1809-270329035118453/.source.yaml _original_basename=.zyhhtvo1 follow=False checksum=181037f60084fed8e752a93376456c5747d0788c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:18:42 np0005626463.localdomain sudo[158058]: pam_unix(sudo:session): session closed for user root
Feb 23 09:18:42 np0005626463.localdomain sshd[158075]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:18:43 np0005626463.localdomain sudo[158151]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-katdyhubgqunyuvznzftocykmuqmcelg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838323.0840073-1854-189081326217272/AnsiballZ_command.py
Feb 23 09:18:43 np0005626463.localdomain sudo[158151]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:18:43 np0005626463.localdomain python3.9[158153]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove open . other_config hw-offload _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 09:18:43 np0005626463.localdomain ovs-vsctl[158155]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove open . other_config hw-offload
Feb 23 09:18:43 np0005626463.localdomain sudo[158151]: pam_unix(sudo:session): session closed for user root
Feb 23 09:18:44 np0005626463.localdomain sudo[158245]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gkepmqcrkaufwqkcwdoxfetuxykrkurg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838323.7635567-1878-272317981602922/AnsiballZ_command.py
Feb 23 09:18:44 np0005626463.localdomain sudo[158245]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:18:45 np0005626463.localdomain python3.9[158247]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl get Open_vSwitch . external_ids:ovn-cms-options | sed 's/\"//g' _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 09:18:45 np0005626463.localdomain ovs-vsctl[158249]: ovs|00001|db_ctl_base|ERR|no key "ovn-cms-options" in Open_vSwitch record "." column external_ids
Feb 23 09:18:45 np0005626463.localdomain sudo[158245]: pam_unix(sudo:session): session closed for user root
Feb 23 09:18:45 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52738 DF PROTO=TCP SPT=39394 DPT=9100 SEQ=1762409738 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEAA0060000000001030307) 
Feb 23 09:18:45 np0005626463.localdomain sudo[158340]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fkfnkrqzztfgtfmhidtqcrpalkpgcdqa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838325.5197177-1920-225122117485011/AnsiballZ_command.py
Feb 23 09:18:45 np0005626463.localdomain sudo[158340]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:18:45 np0005626463.localdomain sshd[158075]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:18:46 np0005626463.localdomain python3.9[158342]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 09:18:46 np0005626463.localdomain ovs-vsctl[158343]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options
Feb 23 09:18:46 np0005626463.localdomain sudo[158340]: pam_unix(sudo:session): session closed for user root
Feb 23 09:18:47 np0005626463.localdomain sshd[151226]: pam_unix(sshd:session): session closed for user zuul
Feb 23 09:18:47 np0005626463.localdomain systemd[1]: session-49.scope: Deactivated successfully.
Feb 23 09:18:47 np0005626463.localdomain systemd-logind[759]: Session 49 logged out. Waiting for processes to exit.
Feb 23 09:18:47 np0005626463.localdomain systemd[1]: session-49.scope: Consumed 42.965s CPU time.
Feb 23 09:18:47 np0005626463.localdomain systemd-logind[759]: Removed session 49.
Feb 23 09:18:47 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:18:47Z|00043|binding|INFO|Setting lport a27e5011-2016-4b16-b5e8-04b555b30bc4 ovn-installed in OVS
Feb 23 09:18:47 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:18:47Z|00044|binding|INFO|Setting lport a27e5011-2016-4b16-b5e8-04b555b30bc4 up in Southbound
Feb 23 09:18:48 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32008 DF PROTO=TCP SPT=58964 DPT=9101 SEQ=2434591119 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEAAC060000000001030307) 
Feb 23 09:18:49 np0005626463.localdomain systemd[1]: Stopping User Manager for UID 0...
Feb 23 09:18:49 np0005626463.localdomain systemd[157729]: Activating special unit Exit the Session...
Feb 23 09:18:49 np0005626463.localdomain systemd[157729]: Stopped target Main User Target.
Feb 23 09:18:49 np0005626463.localdomain systemd[157729]: Stopped target Basic System.
Feb 23 09:18:49 np0005626463.localdomain systemd[157729]: Stopped target Paths.
Feb 23 09:18:49 np0005626463.localdomain systemd[157729]: Stopped target Sockets.
Feb 23 09:18:49 np0005626463.localdomain systemd[157729]: Stopped target Timers.
Feb 23 09:18:49 np0005626463.localdomain systemd[157729]: Stopped Daily Cleanup of User's Temporary Directories.
Feb 23 09:18:49 np0005626463.localdomain systemd[157729]: Closed D-Bus User Message Bus Socket.
Feb 23 09:18:49 np0005626463.localdomain systemd[157729]: Stopped Create User's Volatile Files and Directories.
Feb 23 09:18:49 np0005626463.localdomain systemd[157729]: Removed slice User Application Slice.
Feb 23 09:18:49 np0005626463.localdomain systemd[157729]: Reached target Shutdown.
Feb 23 09:18:49 np0005626463.localdomain systemd[157729]: Finished Exit the Session.
Feb 23 09:18:49 np0005626463.localdomain systemd[157729]: Reached target Exit the Session.
Feb 23 09:18:49 np0005626463.localdomain systemd[1]: user@0.service: Deactivated successfully.
Feb 23 09:18:49 np0005626463.localdomain systemd[1]: Stopped User Manager for UID 0.
Feb 23 09:18:49 np0005626463.localdomain systemd[1]: Stopping User Runtime Directory /run/user/0...
Feb 23 09:18:49 np0005626463.localdomain systemd[1]: run-user-0.mount: Deactivated successfully.
Feb 23 09:18:49 np0005626463.localdomain systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Feb 23 09:18:49 np0005626463.localdomain systemd[1]: Stopped User Runtime Directory /run/user/0.
Feb 23 09:18:49 np0005626463.localdomain systemd[1]: Removed slice User Slice of UID 0.
Feb 23 09:18:52 np0005626463.localdomain sshd[158360]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:18:52 np0005626463.localdomain sshd[158360]: Accepted publickey for zuul from 192.168.122.30 port 49830 ssh2: RSA SHA256:/ShS2J5Dq7o9P59e/NmgQORSAcJOBwu46Huo03HBdB4
Feb 23 09:18:52 np0005626463.localdomain systemd-logind[759]: New session 51 of user zuul.
Feb 23 09:18:52 np0005626463.localdomain systemd[1]: Started Session 51 of User zuul.
Feb 23 09:18:52 np0005626463.localdomain sshd[158360]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Feb 23 09:18:53 np0005626463.localdomain python3.9[158453]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 23 09:18:54 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17606 DF PROTO=TCP SPT=50204 DPT=9102 SEQ=1687700845 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEAC2900000000001030307) 
Feb 23 09:18:54 np0005626463.localdomain sudo[158547]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qxgdcvyrqhoejdhitiyxthmxnechllex ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838333.8885257-57-114918697208855/AnsiballZ_file.py
Feb 23 09:18:54 np0005626463.localdomain sudo[158547]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:18:54 np0005626463.localdomain python3.9[158549]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/var/lib/openstack/neutron-ovn-metadata-agent setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Feb 23 09:18:54 np0005626463.localdomain sudo[158547]: pam_unix(sudo:session): session closed for user root
Feb 23 09:18:54 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21858 DF PROTO=TCP SPT=45638 DPT=9882 SEQ=3727925378 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEAC6070000000001030307) 
Feb 23 09:18:55 np0005626463.localdomain sudo[158639]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wzznzrbuivoroxcqdunypsmnpxyvavdt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838334.6436293-57-75272490190972/AnsiballZ_file.py
Feb 23 09:18:55 np0005626463.localdomain sudo[158639]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:18:55 np0005626463.localdomain python3.9[158641]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 23 09:18:55 np0005626463.localdomain sudo[158639]: pam_unix(sudo:session): session closed for user root
Feb 23 09:18:56 np0005626463.localdomain sudo[158731]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fpllcbplipduzveydpockscbgcpwhftv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838336.0886755-57-91482801500742/AnsiballZ_file.py
Feb 23 09:18:56 np0005626463.localdomain sudo[158731]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:18:56 np0005626463.localdomain python3.9[158733]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/kill_scripts setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 23 09:18:56 np0005626463.localdomain sudo[158731]: pam_unix(sudo:session): session closed for user root
Feb 23 09:18:57 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17608 DF PROTO=TCP SPT=50204 DPT=9102 SEQ=1687700845 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEACE860000000001030307) 
Feb 23 09:18:57 np0005626463.localdomain sudo[158823]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ztrfnibnepoccexgjelcltqufhhmmeau ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838336.6642194-57-134268553815707/AnsiballZ_file.py
Feb 23 09:18:57 np0005626463.localdomain sudo[158823]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:18:57 np0005626463.localdomain python3.9[158826]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/ovn-metadata-proxy setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 23 09:18:57 np0005626463.localdomain sudo[158823]: pam_unix(sudo:session): session closed for user root
Feb 23 09:18:58 np0005626463.localdomain sudo[158916]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cvwdzqqczopzwsopvwplrqsaoechjnxk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838337.8149583-57-168586832184007/AnsiballZ_file.py
Feb 23 09:18:58 np0005626463.localdomain sudo[158916]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:18:58 np0005626463.localdomain python3.9[158918]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/external/pids setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 23 09:18:58 np0005626463.localdomain sudo[158916]: pam_unix(sudo:session): session closed for user root
Feb 23 09:18:58 np0005626463.localdomain python3.9[159008]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 23 09:18:59 np0005626463.localdomain sudo[159098]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oyubrnizmasnejsruzwhrmfsjjtgttrg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838339.1712153-189-123487275467096/AnsiballZ_seboolean.py
Feb 23 09:18:59 np0005626463.localdomain sudo[159098]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:18:59 np0005626463.localdomain python3.9[159100]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Feb 23 09:18:59 np0005626463.localdomain sshd[159101]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:19:00 np0005626463.localdomain sudo[159098]: pam_unix(sudo:session): session closed for user root
Feb 23 09:19:00 np0005626463.localdomain sshd[159101]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:19:00 np0005626463.localdomain python3.9[159192]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/ovn_metadata_haproxy_wrapper follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:19:00 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29948 DF PROTO=TCP SPT=47350 DPT=9100 SEQ=1493365353 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEADD070000000001030307) 
Feb 23 09:19:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb 23 09:19:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 6000.1 total, 600.0 interval
                                                          Cumulative writes: 5152 writes, 23K keys, 5152 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s
                                                          Cumulative WAL: 5152 writes, 679 syncs, 7.59 writes per sync, written: 0.02 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                          Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          
                                                          ** Compaction Stats [default] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      2/0    2.61 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.005       0      0       0.0       0.0
                                                           Sum      2/0    2.61 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.005       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [default] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.2      0.01              0.00         1    0.005       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.1 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x557956c5c850#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 4.6e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [default] **
                                                          
                                                          ** Compaction Stats [m-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.1 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x557956c5c850#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 4.6e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-0] **
                                                          
                                                          ** Compaction Stats [m-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.1 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x557956c5c850#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 4.6e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-1] **
                                                          
                                                          ** Compaction Stats [m-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.1 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x557956c5c850#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 4.6e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-2] **
                                                          
                                                          ** Compaction Stats [p-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      1/0    1.57 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Sum      1/0    1.57 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.4      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.1 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x557956c5c850#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 4.6e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-0] **
                                                          
                                                          ** Compaction Stats [p-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.1 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x557956c5c850#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 4.6e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-1] **
                                                          
                                                          ** Compaction Stats [p-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.1 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x557956c5c850#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 4.6e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-2] **
                                                          
                                                          ** Compaction Stats [O-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.1 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x557956c5c2d0#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 2 last_secs: 1e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-0] **
                                                          
                                                          ** Compaction Stats [O-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.1 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x557956c5c2d0#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 2 last_secs: 1e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-1] **
                                                          
                                                          ** Compaction Stats [O-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      1/0    1.26 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0
                                                           Sum      1/0    1.26 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.1 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x557956c5c2d0#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 2 last_secs: 1e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-2] **
                                                          
                                                          ** Compaction Stats [L] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [L] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.1 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x557956c5c850#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 4.6e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [L] **
                                                          
                                                          ** Compaction Stats [P] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [P] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.1 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x557956c5c850#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 4.6e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [P] **
Feb 23 09:19:01 np0005626463.localdomain python3.9[159265]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/ovn_metadata_haproxy_wrapper mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771838340.1822007-213-231750768156571/.source follow=False _original_basename=haproxy.j2 checksum=a5072e7b19ca96a1f495d94f97f31903737cfd27 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 23 09:19:02 np0005626463.localdomain python3.9[159355]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/haproxy-kill follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:19:02 np0005626463.localdomain python3.9[159428]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/kill_scripts/haproxy-kill mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771838341.627566-258-219189195936682/.source follow=False _original_basename=kill-script.j2 checksum=2dfb5489f491f61b95691c3bf95fa1fe48ff3700 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 23 09:19:02 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29949 DF PROTO=TCP SPT=47350 DPT=9100 SEQ=1493365353 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEAE5060000000001030307) 
Feb 23 09:19:03 np0005626463.localdomain sudo[159518]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ctcewviqnoqwxrmekfxqtwnhfsbnufqn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838342.9233491-309-101383665018570/AnsiballZ_setup.py
Feb 23 09:19:03 np0005626463.localdomain sudo[159518]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:19:03 np0005626463.localdomain python3.9[159520]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 23 09:19:03 np0005626463.localdomain sudo[159518]: pam_unix(sudo:session): session closed for user root
Feb 23 09:19:04 np0005626463.localdomain sudo[159572]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-obyrqbldrbjifsasxsirjyfwcbtfogtj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838342.9233491-309-101383665018570/AnsiballZ_dnf.py
Feb 23 09:19:04 np0005626463.localdomain sudo[159572]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:19:04 np0005626463.localdomain python3.9[159574]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch3.3'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 23 09:19:06 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21914 DF PROTO=TCP SPT=36452 DPT=9101 SEQ=3026775186 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEAF2060000000001030307) 
Feb 23 09:19:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb 23 09:19:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 6000.1 total, 600.0 interval
                                                          Cumulative writes: 5421 writes, 24K keys, 5421 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s
                                                          Cumulative WAL: 5421 writes, 705 syncs, 7.69 writes per sync, written: 0.02 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                          Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          
                                                          ** Compaction Stats [default] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      2/0    2.61 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.007       0      0       0.0       0.0
                                                           Sum      2/0    2.61 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.007       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [default] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.2      0.01              0.00         1    0.007       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.1 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x564b561042d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 5.7e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [default] **
                                                          
                                                          ** Compaction Stats [m-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.1 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x564b561042d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 5.7e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-0] **
                                                          
                                                          ** Compaction Stats [m-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.1 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x564b561042d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 5.7e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-1] **
                                                          
                                                          ** Compaction Stats [m-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.1 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x564b561042d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 5.7e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-2] **
                                                          
                                                          ** Compaction Stats [p-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      1/0    1.57 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.01              0.00         1    0.005       0      0       0.0       0.0
                                                           Sum      1/0    1.57 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.01              0.00         1    0.005       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.3      0.01              0.00         1    0.005       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.1 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x564b561042d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 5.7e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-0] **
                                                          
                                                          ** Compaction Stats [p-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.1 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x564b561042d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 5.7e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-1] **
                                                          
                                                          ** Compaction Stats [p-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.1 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x564b561042d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 5.7e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-2] **
                                                          
                                                          ** Compaction Stats [O-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.1 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x564b56105610#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 2 last_secs: 1.1e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-0] **
                                                          
                                                          ** Compaction Stats [O-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.1 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x564b56105610#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 2 last_secs: 1.1e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-1] **
                                                          
                                                          ** Compaction Stats [O-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      1/0    1.26 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0
                                                           Sum      1/0    1.26 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.1 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x564b56105610#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 2 last_secs: 1.1e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-2] **
                                                          
                                                          ** Compaction Stats [L] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [L] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.1 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x564b561042d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 5.7e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [L] **
                                                          
                                                          ** Compaction Stats [P] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [P] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.1 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x564b561042d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 5.7e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [P] **
Feb 23 09:19:07 np0005626463.localdomain sudo[159572]: pam_unix(sudo:session): session closed for user root
Feb 23 09:19:08 np0005626463.localdomain sudo[159666]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zaqzozsrufxyqqlofbipqxjntobgbzkq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838347.964403-345-118060543465489/AnsiballZ_systemd.py
Feb 23 09:19:08 np0005626463.localdomain sudo[159666]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:19:09 np0005626463.localdomain python3.9[159668]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Feb 23 09:19:09 np0005626463.localdomain sudo[159666]: pam_unix(sudo:session): session closed for user root
Feb 23 09:19:09 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17610 DF PROTO=TCP SPT=50204 DPT=9102 SEQ=1687700845 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEAFE060000000001030307) 
Feb 23 09:19:09 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.
Feb 23 09:19:09 np0005626463.localdomain systemd[1]: tmp-crun.27XSTS.mount: Deactivated successfully.
Feb 23 09:19:09 np0005626463.localdomain podman[159686]: 2026-02-23 09:19:09.913972698 +0000 UTC m=+0.089621934 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=starting, container_name=ovn_controller, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Feb 23 09:19:09 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:19:09Z|00045|memory|INFO|18092 kB peak resident set size after 30.5 seconds
Feb 23 09:19:09 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:19:09Z|00046|memory|INFO|idl-cells-OVN_Southbound:4072 idl-cells-Open_vSwitch:1045 if_status_mgr_ifaces_state_usage-KB:1 if_status_mgr_ifaces_usage-KB:1 lflow-cache-entries-cache-expr:80 lflow-cache-entries-cache-matches:195 lflow-cache-size-KB:348 local_datapath_usage-KB:1 ofctrl_desired_flow_usage-KB:157 ofctrl_installed_flow_usage-KB:114 ofctrl_sb_flow_ref_usage-KB:68
Feb 23 09:19:09 np0005626463.localdomain podman[159686]: 2026-02-23 09:19:09.960539945 +0000 UTC m=+0.136189201 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, config_id=ovn_controller)
Feb 23 09:19:09 np0005626463.localdomain systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully.
Feb 23 09:19:10 np0005626463.localdomain sudo[159758]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 09:19:10 np0005626463.localdomain sudo[159758]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:19:10 np0005626463.localdomain sudo[159758]: pam_unix(sudo:session): session closed for user root
Feb 23 09:19:10 np0005626463.localdomain sudo[159801]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 23 09:19:10 np0005626463.localdomain sudo[159801]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:19:10 np0005626463.localdomain python3.9[159800]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/01-rootwrap.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:19:11 np0005626463.localdomain python3.9[159894]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/01-rootwrap.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771838350.370946-369-83221992807801/.source.conf follow=False _original_basename=rootwrap.conf.j2 checksum=11f2cfb4b7d97b2cef3c2c2d88089e6999cffe22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 23 09:19:11 np0005626463.localdomain sudo[159801]: pam_unix(sudo:session): session closed for user root
Feb 23 09:19:11 np0005626463.localdomain python3.9[160009]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:19:12 np0005626463.localdomain sudo[160050]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 09:19:12 np0005626463.localdomain sudo[160050]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:19:12 np0005626463.localdomain sudo[160050]: pam_unix(sudo:session): session closed for user root
Feb 23 09:19:12 np0005626463.localdomain python3.9[160095]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771838351.3380702-369-88403242977912/.source.conf follow=False _original_basename=neutron-ovn-metadata-agent.conf.j2 checksum=8bc979abbe81c2cf3993a225517a7e2483e20443 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 23 09:19:12 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56548 DF PROTO=TCP SPT=49832 DPT=9882 SEQ=196626078 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEB0A470000000001030307) 
Feb 23 09:19:13 np0005626463.localdomain python3.9[160185]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/10-neutron-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:19:13 np0005626463.localdomain python3.9[160256]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/10-neutron-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771838353.0235553-501-270861747460134/.source.conf _original_basename=10-neutron-metadata.conf follow=False checksum=aa9e89725fbcebf7a5c773d7b97083445b7b7759 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 23 09:19:14 np0005626463.localdomain python3.9[160346]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/05-nova-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:19:14 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31617 DF PROTO=TCP SPT=53314 DPT=9105 SEQ=1808066821 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEB14060000000001030307) 
Feb 23 09:19:15 np0005626463.localdomain python3.9[160417]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/05-nova-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771838354.1198308-501-137196997278846/.source.conf _original_basename=05-nova-metadata.conf follow=False checksum=979187b925479d81d0609f4188e5b95fe1f92c18 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 23 09:19:15 np0005626463.localdomain python3.9[160507]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 23 09:19:16 np0005626463.localdomain sudo[160600]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mejrqtavfiopyafwfesatuhhclrbdujg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838355.9962814-615-189362201157882/AnsiballZ_file.py
Feb 23 09:19:16 np0005626463.localdomain sudo[160600]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:19:16 np0005626463.localdomain python3.9[160602]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 23 09:19:16 np0005626463.localdomain sudo[160600]: pam_unix(sudo:session): session closed for user root
Feb 23 09:19:16 np0005626463.localdomain sudo[160692]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oqpbcbrvohpqeaselxgjxhlakpzbgmsb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838356.6728842-639-109019618330149/AnsiballZ_stat.py
Feb 23 09:19:16 np0005626463.localdomain sudo[160692]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:19:17 np0005626463.localdomain python3.9[160694]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:19:17 np0005626463.localdomain sudo[160692]: pam_unix(sudo:session): session closed for user root
Feb 23 09:19:17 np0005626463.localdomain sudo[160740]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qeatuukfeblaczedhzqoubsbuqjsvbvj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838356.6728842-639-109019618330149/AnsiballZ_file.py
Feb 23 09:19:17 np0005626463.localdomain sudo[160740]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:19:17 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:19:17Z|00047|memory_trim|INFO|Detected inactivity (last active 30003 ms ago): trimming memory
Feb 23 09:19:17 np0005626463.localdomain python3.9[160742]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 23 09:19:17 np0005626463.localdomain sudo[160740]: pam_unix(sudo:session): session closed for user root
Feb 23 09:19:17 np0005626463.localdomain sudo[160832]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kisficqcahmggyqivoyppikyzhqhrgyq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838357.7267609-639-230307019941772/AnsiballZ_stat.py
Feb 23 09:19:17 np0005626463.localdomain sudo[160832]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:19:18 np0005626463.localdomain python3.9[160834]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:19:18 np0005626463.localdomain sudo[160832]: pam_unix(sudo:session): session closed for user root
Feb 23 09:19:18 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21916 DF PROTO=TCP SPT=36452 DPT=9101 SEQ=3026775186 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEB22060000000001030307) 
Feb 23 09:19:18 np0005626463.localdomain sudo[160880]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xhdnolpkwoopashdcuysezobopdqrrrc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838357.7267609-639-230307019941772/AnsiballZ_file.py
Feb 23 09:19:18 np0005626463.localdomain sudo[160880]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:19:19 np0005626463.localdomain python3.9[160882]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 23 09:19:19 np0005626463.localdomain sudo[160880]: pam_unix(sudo:session): session closed for user root
Feb 23 09:19:19 np0005626463.localdomain sudo[160972]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hcvufuyvrrjsxybkyyrfjvahklmjzjnb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838359.4022539-708-46965214329123/AnsiballZ_file.py
Feb 23 09:19:19 np0005626463.localdomain sudo[160972]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:19:19 np0005626463.localdomain python3.9[160974]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:19:19 np0005626463.localdomain sudo[160972]: pam_unix(sudo:session): session closed for user root
Feb 23 09:19:21 np0005626463.localdomain sudo[161064]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oqmjwbzodcsqmqqsdxfsrexctqvvqjjy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838360.2841754-732-192661979539220/AnsiballZ_stat.py
Feb 23 09:19:21 np0005626463.localdomain sudo[161064]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:19:21 np0005626463.localdomain python3.9[161066]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:19:21 np0005626463.localdomain sudo[161064]: pam_unix(sudo:session): session closed for user root
Feb 23 09:19:22 np0005626463.localdomain sudo[161112]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dvgdwrpmbfqshpwhanmfwfbzqqsdlbee ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838360.2841754-732-192661979539220/AnsiballZ_file.py
Feb 23 09:19:22 np0005626463.localdomain sudo[161112]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:19:22 np0005626463.localdomain python3.9[161114]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:19:22 np0005626463.localdomain sudo[161112]: pam_unix(sudo:session): session closed for user root
Feb 23 09:19:22 np0005626463.localdomain sudo[161204]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sdundgasvcfxjntnhtzyvcbpntjumyaw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838362.4904807-768-15292858146963/AnsiballZ_stat.py
Feb 23 09:19:22 np0005626463.localdomain sudo[161204]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:19:22 np0005626463.localdomain python3.9[161206]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:19:22 np0005626463.localdomain sudo[161204]: pam_unix(sudo:session): session closed for user root
Feb 23 09:19:23 np0005626463.localdomain sshd[161236]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:19:23 np0005626463.localdomain sudo[161253]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ixdsjrltdmzfofyiharlmgluhjvicodk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838362.4904807-768-15292858146963/AnsiballZ_file.py
Feb 23 09:19:23 np0005626463.localdomain sudo[161253]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:19:23 np0005626463.localdomain python3.9[161255]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:19:23 np0005626463.localdomain sudo[161253]: pam_unix(sudo:session): session closed for user root
Feb 23 09:19:23 np0005626463.localdomain sshd[161236]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:19:23 np0005626463.localdomain sudo[161346]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tavpxkvhfgrzuymdfqtzlzqrzodxehig ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838363.6514945-804-17360140886241/AnsiballZ_systemd.py
Feb 23 09:19:23 np0005626463.localdomain sudo[161346]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:19:24 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59809 DF PROTO=TCP SPT=41464 DPT=9102 SEQ=2877943873 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEB37C00000000001030307) 
Feb 23 09:19:24 np0005626463.localdomain python3.9[161348]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 09:19:24 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 09:19:24 np0005626463.localdomain systemd-rc-local-generator[161373]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 09:19:24 np0005626463.localdomain systemd-sysv-generator[161377]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 09:19:24 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 09:19:24 np0005626463.localdomain sudo[161346]: pam_unix(sudo:session): session closed for user root
Feb 23 09:19:24 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56550 DF PROTO=TCP SPT=49832 DPT=9882 SEQ=196626078 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEB3A070000000001030307) 
Feb 23 09:19:26 np0005626463.localdomain sudo[161476]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sazpcynvuwhtdflquqtutjfwmegxzovv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838366.2444963-828-171553156727534/AnsiballZ_stat.py
Feb 23 09:19:26 np0005626463.localdomain sudo[161476]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:19:26 np0005626463.localdomain python3.9[161478]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:19:26 np0005626463.localdomain sudo[161476]: pam_unix(sudo:session): session closed for user root
Feb 23 09:19:26 np0005626463.localdomain sudo[161524]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qopqphgnahdggzaonfnmvmcrfuccajgs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838366.2444963-828-171553156727534/AnsiballZ_file.py
Feb 23 09:19:26 np0005626463.localdomain sudo[161524]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:19:27 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59811 DF PROTO=TCP SPT=41464 DPT=9102 SEQ=2877943873 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEB43C60000000001030307) 
Feb 23 09:19:27 np0005626463.localdomain python3.9[161526]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:19:27 np0005626463.localdomain sudo[161524]: pam_unix(sudo:session): session closed for user root
Feb 23 09:19:27 np0005626463.localdomain sudo[161616]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dddukpqevrqtqeddkbnvbsugncnxmlax ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838367.4188669-864-3183391029839/AnsiballZ_stat.py
Feb 23 09:19:27 np0005626463.localdomain sudo[161616]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:19:27 np0005626463.localdomain python3.9[161618]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:19:27 np0005626463.localdomain sudo[161616]: pam_unix(sudo:session): session closed for user root
Feb 23 09:19:28 np0005626463.localdomain sudo[161664]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cmnrosvuifyxnuvclyxzzgcvfdetpboo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838367.4188669-864-3183391029839/AnsiballZ_file.py
Feb 23 09:19:28 np0005626463.localdomain sudo[161664]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:19:28 np0005626463.localdomain python3.9[161666]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:19:28 np0005626463.localdomain sudo[161664]: pam_unix(sudo:session): session closed for user root
Feb 23 09:19:28 np0005626463.localdomain sudo[161756]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jhmjnqruopdvbevtoiylsvvotmrbokvd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838368.4851985-900-53000806810800/AnsiballZ_systemd.py
Feb 23 09:19:28 np0005626463.localdomain sudo[161756]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:19:29 np0005626463.localdomain python3.9[161758]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 09:19:29 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 09:19:29 np0005626463.localdomain systemd-rc-local-generator[161781]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 09:19:29 np0005626463.localdomain systemd-sysv-generator[161784]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 09:19:29 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 09:19:29 np0005626463.localdomain systemd[1]: Starting Create netns directory...
Feb 23 09:19:29 np0005626463.localdomain systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Feb 23 09:19:29 np0005626463.localdomain systemd[1]: netns-placeholder.service: Deactivated successfully.
Feb 23 09:19:29 np0005626463.localdomain systemd[1]: Finished Create netns directory.
Feb 23 09:19:29 np0005626463.localdomain sudo[161756]: pam_unix(sudo:session): session closed for user root
Feb 23 09:19:30 np0005626463.localdomain sudo[161889]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kkfcnzfgovpesjaxndpiiutxnuyyxnpv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838369.8407843-930-98503422483749/AnsiballZ_file.py
Feb 23 09:19:30 np0005626463.localdomain sudo[161889]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:19:30 np0005626463.localdomain python3.9[161891]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 23 09:19:30 np0005626463.localdomain sudo[161889]: pam_unix(sudo:session): session closed for user root
Feb 23 09:19:30 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20706 DF PROTO=TCP SPT=52912 DPT=9100 SEQ=365865694 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEB52460000000001030307) 
Feb 23 09:19:31 np0005626463.localdomain sudo[161981]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gwhovojjvmtzgfflcrkmqttdxfejzwgh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838371.558441-954-101878687577726/AnsiballZ_stat.py
Feb 23 09:19:31 np0005626463.localdomain sudo[161981]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:19:32 np0005626463.localdomain python3.9[161983]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_metadata_agent/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:19:32 np0005626463.localdomain sudo[161981]: pam_unix(sudo:session): session closed for user root
Feb 23 09:19:32 np0005626463.localdomain sudo[162054]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nodrfyxqxarhvcuzlykybythzvkivpvd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838371.558441-954-101878687577726/AnsiballZ_copy.py
Feb 23 09:19:32 np0005626463.localdomain sudo[162054]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:19:32 np0005626463.localdomain python3.9[162056]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_metadata_agent/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771838371.558441-954-101878687577726/.source _original_basename=healthcheck follow=False checksum=898a5a1fcd473cf731177fc866e3bd7ebf20a131 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Feb 23 09:19:32 np0005626463.localdomain sudo[162054]: pam_unix(sudo:session): session closed for user root
Feb 23 09:19:32 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20707 DF PROTO=TCP SPT=52912 DPT=9100 SEQ=365865694 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEB5A460000000001030307) 
Feb 23 09:19:33 np0005626463.localdomain sudo[162146]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vmixlhocklsshusnyaddjiglzvyiukcg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838372.9007816-1005-246851496680578/AnsiballZ_file.py
Feb 23 09:19:33 np0005626463.localdomain sudo[162146]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:19:34 np0005626463.localdomain python3.9[162148]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:19:34 np0005626463.localdomain sudo[162146]: pam_unix(sudo:session): session closed for user root
Feb 23 09:19:34 np0005626463.localdomain sudo[162238]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zheqlyzdmewuwheseszfnzfqxlnarxlk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838374.324383-1029-87685947110999/AnsiballZ_file.py
Feb 23 09:19:34 np0005626463.localdomain sudo[162238]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:19:34 np0005626463.localdomain python3.9[162240]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 23 09:19:34 np0005626463.localdomain sudo[162238]: pam_unix(sudo:session): session closed for user root
Feb 23 09:19:35 np0005626463.localdomain sudo[162330]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-olfigyqjolwsmennciobwzssonrbbyxm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838375.0409348-1053-130955197809679/AnsiballZ_stat.py
Feb 23 09:19:35 np0005626463.localdomain sudo[162330]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:19:35 np0005626463.localdomain python3.9[162332]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_metadata_agent.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:19:35 np0005626463.localdomain sudo[162330]: pam_unix(sudo:session): session closed for user root
Feb 23 09:19:35 np0005626463.localdomain sudo[162405]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ckdxyaklyzjajnukfzwylvyctostzvmw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838375.0409348-1053-130955197809679/AnsiballZ_copy.py
Feb 23 09:19:35 np0005626463.localdomain sudo[162405]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:19:36 np0005626463.localdomain python3.9[162407]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_metadata_agent.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1771838375.0409348-1053-130955197809679/.source.json _original_basename=.2fp_r1fz follow=False checksum=a908ef151ded3a33ae6c9ac8be72a35e5e33b9dc backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:19:36 np0005626463.localdomain sudo[162405]: pam_unix(sudo:session): session closed for user root
Feb 23 09:19:36 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13589 DF PROTO=TCP SPT=57966 DPT=9101 SEQ=2521930306 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEB67060000000001030307) 
Feb 23 09:19:36 np0005626463.localdomain python3.9[162497]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:19:38 np0005626463.localdomain sshd[162705]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:19:38 np0005626463.localdomain sudo[162750]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sxxnyuifgmjgqlyexjiitfnikymywkcy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838378.2632678-1173-83566593728847/AnsiballZ_container_config_data.py
Feb 23 09:19:38 np0005626463.localdomain sudo[162750]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:19:38 np0005626463.localdomain sshd[162705]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:19:38 np0005626463.localdomain python3.9[162752]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_pattern=*.json debug=False
Feb 23 09:19:38 np0005626463.localdomain sudo[162750]: pam_unix(sudo:session): session closed for user root
Feb 23 09:19:39 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20427 DF PROTO=TCP SPT=55950 DPT=9882 SEQ=638005839 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEB73730000000001030307) 
Feb 23 09:19:39 np0005626463.localdomain sudo[162842]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iixkocmbsrxfzvydvpcwavhwovzrwbry ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838379.3454392-1206-168817007713383/AnsiballZ_container_config_hash.py
Feb 23 09:19:39 np0005626463.localdomain sudo[162842]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:19:39 np0005626463.localdomain python3.9[162844]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Feb 23 09:19:39 np0005626463.localdomain sudo[162842]: pam_unix(sudo:session): session closed for user root
Feb 23 09:19:40 np0005626463.localdomain sudo[162934]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kyyossfzhfdxnjzgkpnethiadbkruiep ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1771838380.2708085-1236-237387100069352/AnsiballZ_edpm_container_manage.py
Feb 23 09:19:40 np0005626463.localdomain sudo[162934]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:19:40 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.
Feb 23 09:19:40 np0005626463.localdomain podman[162937]: 2026-02-23 09:19:40.831510052 +0000 UTC m=+0.093288544 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20260216, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Feb 23 09:19:40 np0005626463.localdomain podman[162937]: 2026-02-23 09:19:40.895304374 +0000 UTC m=+0.157082756 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20260216, tcib_build_tag=8419493e1fd846703d277695e03fc5eb)
Feb 23 09:19:40 np0005626463.localdomain systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully.
Feb 23 09:19:40 np0005626463.localdomain python3[162936]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_id=ovn_metadata_agent config_overrides={} config_patterns=*.json containers=['ovn_metadata_agent'] log_base_path=/var/log/containers/stdouts debug=False
Feb 23 09:19:41 np0005626463.localdomain python3[162936]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: [
                                                               {
                                                                    "Id": "76bd0f417d97eed33ec13822a6468fff2a43c066ff0ef717fb226d4a1fc97b17",
                                                                    "Digest": "sha256:0a8901bdd982c4ba62e40905edf375097daf8fd968b1839b56832f37354d5b07",
                                                                    "RepoTags": [
                                                                         "quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified"
                                                                    ],
                                                                    "RepoDigests": [
                                                                         "quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:0a8901bdd982c4ba62e40905edf375097daf8fd968b1839b56832f37354d5b07"
                                                                    ],
                                                                    "Parent": "",
                                                                    "Comment": "",
                                                                    "Created": "2026-02-23T06:26:05.098634295Z",
                                                                    "Config": {
                                                                         "User": "neutron",
                                                                         "Env": [
                                                                              "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",
                                                                              "LANG=en_US.UTF-8",
                                                                              "TZ=UTC",
                                                                              "container=oci"
                                                                         ],
                                                                         "Entrypoint": [
                                                                              "dumb-init",
                                                                              "--single-child",
                                                                              "--"
                                                                         ],
                                                                         "Cmd": [
                                                                              "kolla_start"
                                                                         ],
                                                                         "Labels": {
                                                                              "io.buildah.version": "1.43.0",
                                                                              "maintainer": "OpenStack Kubernetes Operator team",
                                                                              "org.label-schema.build-date": "20260216",
                                                                              "org.label-schema.license": "GPLv2",
                                                                              "org.label-schema.name": "CentOS Stream 9 Base Image",
                                                                              "org.label-schema.schema-version": "1.0",
                                                                              "org.label-schema.vendor": "CentOS",
                                                                              "tcib_build_tag": "8419493e1fd846703d277695e03fc5eb",
                                                                              "tcib_managed": "true"
                                                                         },
                                                                         "StopSignal": "SIGTERM"
                                                                    },
                                                                    "Version": "",
                                                                    "Author": "",
                                                                    "Architecture": "amd64",
                                                                    "Os": "linux",
                                                                    "Size": 784228609,
                                                                    "VirtualSize": 784228609,
                                                                    "GraphDriver": {
                                                                         "Name": "overlay",
                                                                         "Data": {
                                                                              "LowerDir": "/var/lib/containers/storage/overlay/4ca138c1babff33aa47b0f593cc672ab03770d4205069570de2d0e7691f07ed3/diff:/var/lib/containers/storage/overlay/7a6a75b4bc44910de031f240cbd770d29244a190eb01a1840ff2078eb2d894ad/diff:/var/lib/containers/storage/overlay/0455f1f13172510bfb03afa514ad1dc5f28a2039a4c0ae85e44e0cde63814ca4/diff:/var/lib/containers/storage/overlay/882df85a0cf43e46bc799aafd5ff81035654b304c2fef5dbd26c9dd0c2e9fcc3/diff:/var/lib/containers/storage/overlay/d9f14c75a7289cf010d2e5175c554193dba109f864fe39fc418f3bc5b90efe9d/diff",
                                                                              "UpperDir": "/var/lib/containers/storage/overlay/5bf4078070f41854870417452ad68470796913522011b663ed0d8d22a6f27928/diff",
                                                                              "WorkDir": "/var/lib/containers/storage/overlay/5bf4078070f41854870417452ad68470796913522011b663ed0d8d22a6f27928/work"
                                                                         }
                                                                    },
                                                                    "RootFS": {
                                                                         "Type": "layers",
                                                                         "Layers": [
                                                                              "sha256:d9f14c75a7289cf010d2e5175c554193dba109f864fe39fc418f3bc5b90efe9d",
                                                                              "sha256:6eb5d45c6942983139aec78264b4b68bafe46465bb40e2bb4c09e78dad8ba6c0",
                                                                              "sha256:9a59f9675e4fdfdb0eaa24dcce26bed374feef6430ea888b6f5ef1274a95bd90",
                                                                              "sha256:28e68e9ecec07805a02cd85d7efe631108e3186cd82263ab9cb109564a3435f5",
                                                                              "sha256:2c8b50875d9f0980f38972811e1dbbc8e64c448e40a8be21ff8837be00cf89ab",
                                                                              "sha256:2782735a76d8db3e6692125b10fd55ced9f8590ef8ae6abf986ddc10f33757f4"
                                                                         ]
                                                                    },
                                                                    "Labels": {
                                                                         "io.buildah.version": "1.43.0",
                                                                         "maintainer": "OpenStack Kubernetes Operator team",
                                                                         "org.label-schema.build-date": "20260216",
                                                                         "org.label-schema.license": "GPLv2",
                                                                         "org.label-schema.name": "CentOS Stream 9 Base Image",
                                                                         "org.label-schema.schema-version": "1.0",
                                                                         "org.label-schema.vendor": "CentOS",
                                                                         "tcib_build_tag": "8419493e1fd846703d277695e03fc5eb",
                                                                         "tcib_managed": "true"
                                                                    },
                                                                    "Annotations": {},
                                                                    "ManifestType": "application/vnd.docker.distribution.manifest.v2+json",
                                                                    "User": "neutron",
                                                                    "History": [
                                                                         {
                                                                              "created": "2026-02-17T01:25:07.246646992Z",
                                                                              "created_by": "/bin/sh -c #(nop) ADD file:d064f128d9bf147a386d5c0e8c2e8a6f698c81fb4e2404e09afe5ef1e1d3b529 in / ",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-17T01:25:07.246739119Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL org.label-schema.schema-version=\"1.0\"     org.label-schema.name=\"CentOS Stream 9 Base Image\"     org.label-schema.vendor=\"CentOS\"     org.label-schema.license=\"GPLv2\"     org.label-schema.build-date=\"20260216\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-17T01:25:12.132997501Z",
                                                                              "created_by": "/bin/sh -c #(nop) CMD [\"/bin/bash\"]"
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:08:39.081651802Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL maintainer=\"OpenStack Kubernetes Operator team\"",
                                                                              "comment": "FROM quay.io/centos/centos:stream9",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:08:39.081666472Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL tcib_managed=true",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:08:39.081677733Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV LANG=\"en_US.UTF-8\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:08:39.081688343Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV TZ=\"UTC\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:08:39.081701553Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV container=\"oci\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:08:39.081710413Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:08:39.413481757Z",
                                                                              "created_by": "/bin/sh -c if [ -f \"/etc/yum.repos.d/ubi.repo\" ]; then rm -f /etc/yum.repos.d/ubi.repo && dnf clean all && rm -rf /var/cache/dnf; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:09:13.490649497Z",
                                                                              "created_by": "/bin/sh -c dnf install -y crudini && crudini --del /etc/dnf/dnf.conf main override_install_langs && crudini --set /etc/dnf/dnf.conf main clean_requirements_on_remove True && crudini --set /etc/dnf/dnf.conf main exactarch 1 && crudini --set /etc/dnf/dnf.conf main gpgcheck 1 && crudini --set /etc/dnf/dnf.conf main install_weak_deps False && if [ 'centos' == 'centos' ];then crudini --set /etc/dnf/dnf.conf main best False; fi && crudini --set /etc/dnf/dnf.conf main installonly_limit 0 && crudini --set /etc/dnf/dnf.conf main keepcache 0 && crudini --set /etc/dnf/dnf.conf main obsoletes 1 && crudini --set /etc/dnf/dnf.conf main plugins 1 && crudini --set /etc/dnf/dnf.conf main skip_missing_names_on_install False && crudini --set /etc/dnf/dnf.conf main tsflags nodocs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:09:16.454967918Z",
                                                                              "created_by": "/bin/sh -c dnf install -y ca-certificates dumb-init glibc-langpack-en procps-ng python3 sudo util-linux-user which python-tcib-containers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:09:16.773383448Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/uid_gid_manage.sh /usr/local/bin/uid_gid_manage",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:09:17.106005079Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/uid_gid_manage",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:09:17.70903377Z",
                                                                              "created_by": "/bin/sh -c bash /usr/local/bin/uid_gid_manage kolla hugetlbfs libvirt qemu",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:09:18.031262928Z",
                                                                              "created_by": "/bin/sh -c touch /usr/local/bin/kolla_extend_start && chmod 755 /usr/local/bin/kolla_extend_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:09:18.339397779Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/set_configs.py /usr/local/bin/kolla_set_configs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:09:18.685304171Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_set_configs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:09:18.995385131Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/start.sh /usr/local/bin/kolla_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:09:19.318437706Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:09:19.622355571Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/httpd_setup.sh /usr/local/bin/kolla_httpd_setup",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:09:19.942779192Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_httpd_setup",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:09:20.272959154Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/copy_cacerts.sh /usr/local/bin/kolla_copy_cacerts",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:09:20.574527009Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_copy_cacerts",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:09:20.904983206Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/sudoers /etc/sudoers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:09:21.231560784Z",
                                                                              "created_by": "/bin/sh -c chmod 440 /etc/sudoers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:09:21.544724487Z",
                                                                              "created_by": "/bin/sh -c sed -ri '/^(passwd:|group:)/ s/systemd//g' /etc/nsswitch.conf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:09:24.726828741Z",
                                                                              "created_by": "/bin/sh -c dnf -y reinstall which && rpm -e --nodeps tzdata && dnf -y install tzdata",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:09:25.052065401Z",
                                                                              "created_by": "/bin/sh -c if [ ! -f \"/etc/localtime\" ]; then ln -s /usr/share/zoneinfo/Etc/UTC /etc/localtime; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:09:25.374537445Z",
                                                                              "created_by": "/bin/sh -c mkdir -p /openstack",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:09:26.855611087Z",
                                                                              "created_by": "/bin/sh -c if [ 'centos' == 'centos' ];then if [ -n \"$(rpm -qa redhat-release)\" ];then rpm -e --nodeps redhat-release; fi ; dnf -y install centos-stream-release; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:09:28.628718632Z",
                                                                              "created_by": "/bin/sh -c dnf update --excludepkgs redhat-release -y && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:09:28.628779184Z",
                                                                              "created_by": "/bin/sh -c #(nop) STOPSIGNAL SIGTERM",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:09:28.628797064Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENTRYPOINT [\"dumb-init\", \"--single-child\", \"--\"]",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:09:28.628808854Z",
                                                                              "created_by": "/bin/sh -c #(nop) CMD [\"kolla_start\"]",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:09:29.517110337Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"8419493e1fd846703d277695e03fc5eb\""
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:11:21.746093163Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-base:8419493e1fd846703d277695e03fc5eb",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:11:58.628150825Z",
                                                                              "created_by": "/bin/sh -c dnf install -y python3-barbicanclient python3-cinderclient python3-designateclient python3-glanceclient python3-ironicclient python3-keystoneclient python3-manilaclient python3-neutronclient python3-novaclient python3-observabilityclient python3-octaviaclient python3-openstackclient python3-swiftclient python3-pymemcache && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:12:01.105956567Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"8419493e1fd846703d277695e03fc5eb\""
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:16:03.256657678Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-os:8419493e1fd846703d277695e03fc5eb",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:16:14.397571992Z",
                                                                              "created_by": "/bin/sh -c bash /usr/local/bin/uid_gid_manage neutron",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:17:23.060324013Z",
                                                                              "created_by": "/bin/sh -c dnf -y install iputils net-tools openstack-neutron openstack-neutron-rpc-server openstack-neutron-ml2 openvswitch python3-networking-baremetal python3-openvswitch python3-unbound && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:17:23.891247095Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/neutron-base/neutron_sudoers /etc/sudoers.d/neutron_sudoers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:17:25.131981134Z",
                                                                              "created_by": "/bin/sh -c chmod 440 /etc/sudoers.d/neutron_sudoers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:17:35.376262471Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"8419493e1fd846703d277695e03fc5eb\""
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:24:28.011250524Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-neutron-base:8419493e1fd846703d277695e03fc5eb",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:25:23.458316985Z",
                                                                              "created_by": "/bin/sh -c dnf -y install libseccomp podman && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:25:26.6063361Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"8419493e1fd846703d277695e03fc5eb\""
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:25:29.529663125Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-neutron-agent-base:8419493e1fd846703d277695e03fc5eb",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:26:05.09680033Z",
                                                                              "created_by": "/bin/sh -c dnf -y install python3-networking-ovn-metadata-agent && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:26:05.096888041Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER neutron",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:26:07.480297847Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"8419493e1fd846703d277695e03fc5eb\""
                                                                         }
                                                                    ],
                                                                    "NamesHistory": [
                                                                         "quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified"
                                                                    ]
                                                               }
                                                          ]
                                                          : quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Feb 23 09:19:41 np0005626463.localdomain podman[163010]: 2026-02-23 09:19:41.35459389 +0000 UTC m=+0.070773118 container remove 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.13, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.expose-services=, container_name=ovn_metadata_agent, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, tcib_managed=true, build-date=2026-01-12T22:56:19Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, vcs-type=git, batch=17.1_20260112.1, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vendor=Red Hat, Inc., io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn)
Feb 23 09:19:41 np0005626463.localdomain python3[162936]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman rm --force ovn_metadata_agent
Feb 23 09:19:41 np0005626463.localdomain podman[163023]: 
Feb 23 09:19:41 np0005626463.localdomain podman[163023]: 2026-02-23 09:19:41.483412941 +0000 UTC m=+0.109285698 container create 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.43.0, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Feb 23 09:19:41 np0005626463.localdomain podman[163023]: 2026-02-23 09:19:41.418568147 +0000 UTC m=+0.044440964 image pull  quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Feb 23 09:19:41 np0005626463.localdomain python3[162936]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_metadata_agent --cgroupns=host --conmon-pidfile /run/ovn_metadata_agent.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311 --healthcheck-command /openstack/healthcheck --label config_id=ovn_metadata_agent --label container_name=ovn_metadata_agent --label managed_by=edpm_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']} --log-driver journald --log-level info --network host --pid host --privileged=True --user root --volume /run/openvswitch:/run/openvswitch:z --volume /var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z 
--volume /run/netns:/run/netns:shared --volume /var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro --volume /var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro --volume /var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Feb 23 09:19:41 np0005626463.localdomain sudo[162934]: pam_unix(sudo:session): session closed for user root
Feb 23 09:19:42 np0005626463.localdomain sudo[163150]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hhoklnoqoytzodairjkihumjkngnozig ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838381.8444314-1260-6228333252500/AnsiballZ_stat.py
Feb 23 09:19:42 np0005626463.localdomain sudo[163150]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:19:42 np0005626463.localdomain python3.9[163152]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 23 09:19:42 np0005626463.localdomain sudo[163150]: pam_unix(sudo:session): session closed for user root
Feb 23 09:19:42 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20429 DF PROTO=TCP SPT=55950 DPT=9882 SEQ=638005839 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEB7F860000000001030307) 
Feb 23 09:19:42 np0005626463.localdomain sudo[163244]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wsgjcofvtwxangsstuzcrtrdzmtlfxla ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838382.6010482-1287-74502441983914/AnsiballZ_file.py
Feb 23 09:19:42 np0005626463.localdomain sudo[163244]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:19:43 np0005626463.localdomain python3.9[163246]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:19:43 np0005626463.localdomain sudo[163244]: pam_unix(sudo:session): session closed for user root
Feb 23 09:19:43 np0005626463.localdomain sudo[163290]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-avzgzvqngsgxnrvglyprufznydrrmtdi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838382.6010482-1287-74502441983914/AnsiballZ_stat.py
Feb 23 09:19:43 np0005626463.localdomain sudo[163290]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:19:44 np0005626463.localdomain python3.9[163292]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 23 09:19:44 np0005626463.localdomain sudo[163290]: pam_unix(sudo:session): session closed for user root
Feb 23 09:19:44 np0005626463.localdomain sudo[163381]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qccpqqrchojreisnssvogwptbzldsxqt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838384.0895164-1287-153800842458946/AnsiballZ_copy.py
Feb 23 09:19:44 np0005626463.localdomain sudo[163381]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:19:44 np0005626463.localdomain python3.9[163383]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1771838384.0895164-1287-153800842458946/source dest=/etc/systemd/system/edpm_ovn_metadata_agent.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:19:44 np0005626463.localdomain sudo[163381]: pam_unix(sudo:session): session closed for user root
Feb 23 09:19:45 np0005626463.localdomain sudo[163427]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hsrtayrxuttzbzlpbqhmwbtnhtqmzlrs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838384.0895164-1287-153800842458946/AnsiballZ_systemd.py
Feb 23 09:19:45 np0005626463.localdomain sudo[163427]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:19:45 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20709 DF PROTO=TCP SPT=52912 DPT=9100 SEQ=365865694 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEB8A060000000001030307) 
Feb 23 09:19:45 np0005626463.localdomain python3.9[163429]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 23 09:19:45 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 09:19:45 np0005626463.localdomain systemd-rc-local-generator[163451]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 09:19:45 np0005626463.localdomain systemd-sysv-generator[163455]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 09:19:45 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 09:19:45 np0005626463.localdomain sudo[163427]: pam_unix(sudo:session): session closed for user root
Feb 23 09:19:45 np0005626463.localdomain sudo[163509]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-usvvmvhaurrvtxblpxypleehmefvodni ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838384.0895164-1287-153800842458946/AnsiballZ_systemd.py
Feb 23 09:19:45 np0005626463.localdomain sudo[163509]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:19:46 np0005626463.localdomain python3.9[163511]: ansible-systemd Invoked with state=restarted name=edpm_ovn_metadata_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 09:19:46 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 09:19:46 np0005626463.localdomain systemd-rc-local-generator[163539]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 09:19:46 np0005626463.localdomain systemd-sysv-generator[163542]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 09:19:46 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 09:19:46 np0005626463.localdomain systemd[1]: Starting ovn_metadata_agent container...
Feb 23 09:19:46 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 09:19:46 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ee1f92392988c110dc098475c8a3c73d1754101417a8a753bd7eeb8a3f0fc5f9/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff)
Feb 23 09:19:46 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ee1f92392988c110dc098475c8a3c73d1754101417a8a753bd7eeb8a3f0fc5f9/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 23 09:19:46 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.
Feb 23 09:19:46 np0005626463.localdomain podman[163553]: 2026-02-23 09:19:46.855566102 +0000 UTC m=+0.179036515 container init 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0)
Feb 23 09:19:46 np0005626463.localdomain ovn_metadata_agent[163567]: + sudo -E kolla_set_configs
Feb 23 09:19:46 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.
Feb 23 09:19:46 np0005626463.localdomain podman[163553]: 2026-02-23 09:19:46.914511433 +0000 UTC m=+0.237981856 container start 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Feb 23 09:19:46 np0005626463.localdomain edpm-start-podman-container[163553]: ovn_metadata_agent
Feb 23 09:19:46 np0005626463.localdomain ovn_metadata_agent[163567]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Feb 23 09:19:46 np0005626463.localdomain ovn_metadata_agent[163567]: INFO:__main__:Validating config file
Feb 23 09:19:46 np0005626463.localdomain ovn_metadata_agent[163567]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Feb 23 09:19:46 np0005626463.localdomain ovn_metadata_agent[163567]: INFO:__main__:Copying service configuration files
Feb 23 09:19:46 np0005626463.localdomain ovn_metadata_agent[163567]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf
Feb 23 09:19:46 np0005626463.localdomain ovn_metadata_agent[163567]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf
Feb 23 09:19:46 np0005626463.localdomain ovn_metadata_agent[163567]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf
Feb 23 09:19:46 np0005626463.localdomain ovn_metadata_agent[163567]: INFO:__main__:Writing out command to execute
Feb 23 09:19:46 np0005626463.localdomain ovn_metadata_agent[163567]: INFO:__main__:Setting permission for /var/lib/neutron
Feb 23 09:19:46 np0005626463.localdomain ovn_metadata_agent[163567]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts
Feb 23 09:19:46 np0005626463.localdomain ovn_metadata_agent[163567]: INFO:__main__:Setting permission for /var/lib/neutron/.cache
Feb 23 09:19:46 np0005626463.localdomain ovn_metadata_agent[163567]: INFO:__main__:Setting permission for /var/lib/neutron/external
Feb 23 09:19:46 np0005626463.localdomain ovn_metadata_agent[163567]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy
Feb 23 09:19:46 np0005626463.localdomain ovn_metadata_agent[163567]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper
Feb 23 09:19:46 np0005626463.localdomain ovn_metadata_agent[163567]: INFO:__main__:Setting permission for /var/lib/neutron/metadata_proxy
Feb 23 09:19:46 np0005626463.localdomain ovn_metadata_agent[163567]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill
Feb 23 09:19:46 np0005626463.localdomain ovn_metadata_agent[163567]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints
Feb 23 09:19:46 np0005626463.localdomain ovn_metadata_agent[163567]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints/b9146dd2a0dc3e0bc3fee7bb1b53fa22a55af280b3a177d7a47b63f92e7ebd29
Feb 23 09:19:46 np0005626463.localdomain ovn_metadata_agent[163567]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids
Feb 23 09:19:46 np0005626463.localdomain ovn_metadata_agent[163567]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids/9da5b53d-3184-450f-9a5b-bdba1a6c9f6d.pid.haproxy
Feb 23 09:19:46 np0005626463.localdomain ovn_metadata_agent[163567]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy/9da5b53d-3184-450f-9a5b-bdba1a6c9f6d.conf
Feb 23 09:19:46 np0005626463.localdomain ovn_metadata_agent[163567]: ++ cat /run_command
Feb 23 09:19:46 np0005626463.localdomain ovn_metadata_agent[163567]: + CMD=neutron-ovn-metadata-agent
Feb 23 09:19:46 np0005626463.localdomain ovn_metadata_agent[163567]: + ARGS=
Feb 23 09:19:46 np0005626463.localdomain ovn_metadata_agent[163567]: + sudo kolla_copy_cacerts
Feb 23 09:19:46 np0005626463.localdomain ovn_metadata_agent[163567]: + [[ ! -n '' ]]
Feb 23 09:19:46 np0005626463.localdomain ovn_metadata_agent[163567]: + . kolla_extend_start
Feb 23 09:19:46 np0005626463.localdomain ovn_metadata_agent[163567]: Running command: 'neutron-ovn-metadata-agent'
Feb 23 09:19:46 np0005626463.localdomain ovn_metadata_agent[163567]: + echo 'Running command: '\''neutron-ovn-metadata-agent'\'''
Feb 23 09:19:46 np0005626463.localdomain ovn_metadata_agent[163567]: + umask 0022
Feb 23 09:19:46 np0005626463.localdomain ovn_metadata_agent[163567]: + exec neutron-ovn-metadata-agent
Feb 23 09:19:47 np0005626463.localdomain podman[163575]: 2026-02-23 09:19:46.986049404 +0000 UTC m=+0.086230575 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=starting, org.label-schema.build-date=20260216, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Feb 23 09:19:47 np0005626463.localdomain edpm-start-podman-container[163552]: Creating additional drop-in dependency for "ovn_metadata_agent" (11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739)
Feb 23 09:19:47 np0005626463.localdomain podman[163575]: 2026-02-23 09:19:47.083208737 +0000 UTC m=+0.183389908 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260216, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Feb 23 09:19:47 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 09:19:47 np0005626463.localdomain systemd-rc-local-generator[163637]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 09:19:47 np0005626463.localdomain systemd-sysv-generator[163643]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 09:19:47 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 09:19:47 np0005626463.localdomain systemd[1]: 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully.
Feb 23 09:19:47 np0005626463.localdomain systemd[1]: Started ovn_metadata_agent container.
Feb 23 09:19:47 np0005626463.localdomain sudo[163509]: pam_unix(sudo:session): session closed for user root
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.486 163572 INFO neutron.common.config [-] Logging enabled!
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.486 163572 INFO neutron.common.config [-] /usr/bin/neutron-ovn-metadata-agent version 22.2.2.dev44
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.486 163572 DEBUG neutron.common.config [-] command line: /usr/bin/neutron-ovn-metadata-agent setup_logging /usr/lib/python3.9/site-packages/neutron/common/config.py:123
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.487 163572 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.487 163572 DEBUG neutron.agent.ovn.metadata_agent [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.487 163572 DEBUG neutron.agent.ovn.metadata_agent [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.487 163572 DEBUG neutron.agent.ovn.metadata_agent [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.487 163572 DEBUG neutron.agent.ovn.metadata_agent [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.487 163572 DEBUG neutron.agent.ovn.metadata_agent [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.487 163572 DEBUG neutron.agent.ovn.metadata_agent [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.487 163572 DEBUG neutron.agent.ovn.metadata_agent [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.488 163572 DEBUG neutron.agent.ovn.metadata_agent [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.488 163572 DEBUG neutron.agent.ovn.metadata_agent [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.488 163572 DEBUG neutron.agent.ovn.metadata_agent [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.488 163572 DEBUG neutron.agent.ovn.metadata_agent [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.488 163572 DEBUG neutron.agent.ovn.metadata_agent [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.488 163572 DEBUG neutron.agent.ovn.metadata_agent [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.488 163572 DEBUG neutron.agent.ovn.metadata_agent [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.488 163572 DEBUG neutron.agent.ovn.metadata_agent [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.488 163572 DEBUG neutron.agent.ovn.metadata_agent [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.488 163572 DEBUG neutron.agent.ovn.metadata_agent [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.489 163572 DEBUG neutron.agent.ovn.metadata_agent [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.489 163572 DEBUG neutron.agent.ovn.metadata_agent [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.489 163572 DEBUG neutron.agent.ovn.metadata_agent [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.489 163572 DEBUG neutron.agent.ovn.metadata_agent [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.489 163572 DEBUG neutron.agent.ovn.metadata_agent [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.489 163572 DEBUG neutron.agent.ovn.metadata_agent [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.489 163572 DEBUG neutron.agent.ovn.metadata_agent [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.489 163572 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.489 163572 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.489 163572 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.489 163572 DEBUG neutron.agent.ovn.metadata_agent [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.490 163572 DEBUG neutron.agent.ovn.metadata_agent [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.490 163572 DEBUG neutron.agent.ovn.metadata_agent [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.490 163572 DEBUG neutron.agent.ovn.metadata_agent [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.490 163572 DEBUG neutron.agent.ovn.metadata_agent [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.490 163572 DEBUG neutron.agent.ovn.metadata_agent [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.490 163572 DEBUG neutron.agent.ovn.metadata_agent [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.490 163572 DEBUG neutron.agent.ovn.metadata_agent [-] host                           = np0005626463.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.490 163572 DEBUG neutron.agent.ovn.metadata_agent [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.490 163572 DEBUG neutron.agent.ovn.metadata_agent [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.490 163572 DEBUG neutron.agent.ovn.metadata_agent [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.491 163572 DEBUG neutron.agent.ovn.metadata_agent [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.491 163572 DEBUG neutron.agent.ovn.metadata_agent [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.491 163572 DEBUG neutron.agent.ovn.metadata_agent [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.491 163572 DEBUG neutron.agent.ovn.metadata_agent [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.491 163572 DEBUG neutron.agent.ovn.metadata_agent [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.491 163572 DEBUG neutron.agent.ovn.metadata_agent [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.491 163572 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.491 163572 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.491 163572 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.491 163572 DEBUG neutron.agent.ovn.metadata_agent [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.491 163572 DEBUG neutron.agent.ovn.metadata_agent [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.491 163572 DEBUG neutron.agent.ovn.metadata_agent [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.492 163572 DEBUG neutron.agent.ovn.metadata_agent [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.492 163572 DEBUG neutron.agent.ovn.metadata_agent [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.492 163572 DEBUG neutron.agent.ovn.metadata_agent [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.492 163572 DEBUG neutron.agent.ovn.metadata_agent [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.492 163572 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.492 163572 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.492 163572 DEBUG neutron.agent.ovn.metadata_agent [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.492 163572 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.492 163572 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.492 163572 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.493 163572 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.493 163572 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.493 163572 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.493 163572 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.493 163572 DEBUG neutron.agent.ovn.metadata_agent [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.493 163572 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.493 163572 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.493 163572 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.493 163572 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.493 163572 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.494 163572 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.494 163572 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.494 163572 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_protocol         = http log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.494 163572 DEBUG neutron.agent.ovn.metadata_agent [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.494 163572 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.494 163572 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.494 163572 DEBUG neutron.agent.ovn.metadata_agent [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.494 163572 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.494 163572 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.494 163572 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.495 163572 DEBUG neutron.agent.ovn.metadata_agent [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.495 163572 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.495 163572 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.495 163572 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.495 163572 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.495 163572 DEBUG neutron.agent.ovn.metadata_agent [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.495 163572 DEBUG neutron.agent.ovn.metadata_agent [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.495 163572 DEBUG neutron.agent.ovn.metadata_agent [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.495 163572 DEBUG neutron.agent.ovn.metadata_agent [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.495 163572 DEBUG neutron.agent.ovn.metadata_agent [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.496 163572 DEBUG neutron.agent.ovn.metadata_agent [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.496 163572 DEBUG neutron.agent.ovn.metadata_agent [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.496 163572 DEBUG neutron.agent.ovn.metadata_agent [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.496 163572 DEBUG neutron.agent.ovn.metadata_agent [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.496 163572 DEBUG neutron.agent.ovn.metadata_agent [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.496 163572 DEBUG neutron.agent.ovn.metadata_agent [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.496 163572 DEBUG neutron.agent.ovn.metadata_agent [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.496 163572 DEBUG neutron.agent.ovn.metadata_agent [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.496 163572 DEBUG neutron.agent.ovn.metadata_agent [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.496 163572 DEBUG neutron.agent.ovn.metadata_agent [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.496 163572 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.497 163572 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.497 163572 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.497 163572 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.497 163572 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.497 163572 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.497 163572 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.497 163572 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.497 163572 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.497 163572 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.497 163572 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.498 163572 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.498 163572 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.498 163572 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.498 163572 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.498 163572 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.498 163572 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.498 163572 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.498 163572 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.498 163572 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.499 163572 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.499 163572 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.499 163572 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.499 163572 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.499 163572 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.499 163572 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.499 163572 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.499 163572 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.499 163572 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.499 163572 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.500 163572 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.500 163572 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.500 163572 DEBUG neutron.agent.ovn.metadata_agent [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.500 163572 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.500 163572 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.500 163572 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.500 163572 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.500 163572 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.500 163572 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.500 163572 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.501 163572 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.501 163572 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.501 163572 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.501 163572 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.501 163572 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.501 163572 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.501 163572 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.501 163572 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.502 163572 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.502 163572 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.502 163572 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.502 163572 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.502 163572 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.502 163572 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.502 163572 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.502 163572 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.502 163572 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.503 163572 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.503 163572 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.503 163572 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.503 163572 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.503 163572 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.503 163572 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.503 163572 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.503 163572 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.503 163572 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.503 163572 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.504 163572 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.504 163572 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.504 163572 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.504 163572 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.504 163572 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.504 163572 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.504 163572 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.504 163572 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.504 163572 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.504 163572 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.505 163572 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.505 163572 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.505 163572 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.505 163572 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.505 163572 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.505 163572 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.505 163572 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.505 163572 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.505 163572 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.506 163572 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.506 163572 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.506 163572 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.506 163572 DEBUG neutron.agent.ovn.metadata_agent [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.506 163572 DEBUG neutron.agent.ovn.metadata_agent [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.506 163572 DEBUG neutron.agent.ovn.metadata_agent [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.506 163572 DEBUG neutron.agent.ovn.metadata_agent [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.506 163572 DEBUG neutron.agent.ovn.metadata_agent [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.506 163572 DEBUG neutron.agent.ovn.metadata_agent [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.507 163572 DEBUG neutron.agent.ovn.metadata_agent [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.507 163572 DEBUG neutron.agent.ovn.metadata_agent [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.507 163572 DEBUG neutron.agent.ovn.metadata_agent [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.507 163572 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.507 163572 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.507 163572 DEBUG neutron.agent.ovn.metadata_agent [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.507 163572 DEBUG neutron.agent.ovn.metadata_agent [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.507 163572 DEBUG neutron.agent.ovn.metadata_agent [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.507 163572 DEBUG neutron.agent.ovn.metadata_agent [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.507 163572 DEBUG neutron.agent.ovn.metadata_agent [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.508 163572 DEBUG neutron.agent.ovn.metadata_agent [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.508 163572 DEBUG neutron.agent.ovn.metadata_agent [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.508 163572 DEBUG neutron.agent.ovn.metadata_agent [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.508 163572 DEBUG neutron.agent.ovn.metadata_agent [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.508 163572 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.508 163572 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.508 163572 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.508 163572 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.508 163572 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.509 163572 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.509 163572 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.509 163572 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.509 163572 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.509 163572 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.509 163572 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.509 163572 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.510 163572 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.510 163572 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.510 163572 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.510 163572 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.510 163572 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.510 163572 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.510 163572 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.510 163572 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.510 163572 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.510 163572 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.511 163572 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.511 163572 DEBUG neutron.agent.ovn.metadata_agent [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.511 163572 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.511 163572 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.511 163572 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.511 163572 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.511 163572 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.511 163572 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.511 163572 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.511 163572 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.512 163572 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.512 163572 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.512 163572 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.512 163572 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.512 163572 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.512 163572 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.512 163572 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.512 163572 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.512 163572 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.513 163572 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.513 163572 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_connection          = tcp:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.513 163572 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.513 163572 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.513 163572 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.513 163572 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.513 163572 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.513 163572 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.513 163572 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.513 163572 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.514 163572 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.514 163572 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.514 163572 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.514 163572 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.514 163572 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.514 163572 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.514 163572 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.514 163572 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.514 163572 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.515 163572 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.515 163572 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.515 163572 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.515 163572 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.515 163572 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.515 163572 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.515 163572 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.515 163572 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.515 163572 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.515 163572 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.516 163572 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.516 163572 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.516 163572 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.516 163572 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.516 163572 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.516 163572 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.516 163572 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.516 163572 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.516 163572 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.517 163572 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.517 163572 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.517 163572 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.517 163572 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.517 163572 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.517 163572 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.517 163572 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.517 163572 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.517 163572 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.517 163572 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.518 163572 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.518 163572 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.570 163572 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.570 163572 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.570 163572 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.571 163572 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connecting...
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.571 163572 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connected
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.586 163572 DEBUG neutron.agent.ovn.metadata.agent [-] Loaded chassis name 96b5bb93-7341-4ce6-9b93-6a5de566c711 (UUID: 96b5bb93-7341-4ce6-9b93-6a5de566c711) and ovn bridge br-int. _load_config /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:309
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.601 163572 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.602 163572 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.602 163572 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.602 163572 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Chassis_Private.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.604 163572 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.608 163572 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:ovsdbserver-sb.openstack.svc:6642: connected
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.615 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched CREATE: PortBindingCreateWithChassis(events=('create',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a0:9d:00 192.168.0.12'], port_security=['fa:16:3e:a0:9d:00 192.168.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005626463.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '192.168.0.12/24', 'neutron:device_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'neutron:device_owner': 'compute:nova', 'neutron:host_id': 'np0005626463.localdomain', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9da5b53d-3184-450f-9a5b-bdba1a6c9f6d', 'neutron:port_capabilities': '', 'neutron:port_fip': '192.168.122.20', 'neutron:port_name': '', 'neutron:project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'neutron:revision_number': '7', 'neutron:security_group_ids': '18508c14-7c5f-4fc2-8d9a-66df41a4ab8c ef2f14d6-40b1-49a6-83d1-89d52b525905', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e1694950-12d2-4254-85f1-37700098294d, chassis=[<ovs.db.idl.Row object at 0x7f808c075610>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f808c075610>], logical_port=a27e5011-2016-4b16-b5e8-04b555b30bc4) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.616 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched CREATE: ChassisPrivateCreateEvent(events=('create',), table='Chassis_Private', conditions=(('name', '=', '96b5bb93-7341-4ce6-9b93-6a5de566c711'),), old_conditions=None), priority=20 to row=Chassis_Private(chassis=[<ovs.db.idl.Row object at 0x7f808c075610>], external_ids={'neutron:ovn-metadata-id': '1e311e03-9c9b-56ee-88f3-50a2fe78fcac', 'neutron:ovn-metadata-sb-cfg': '2'}, name=96b5bb93-7341-4ce6-9b93-6a5de566c711, nb_cfg_timestamp=1771838328073, nb_cfg=5) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.616 163572 INFO neutron.agent.ovn.metadata.agent [-] Port a27e5011-2016-4b16-b5e8-04b555b30bc4 in datapath 9da5b53d-3184-450f-9a5b-bdba1a6c9f6d bound to our chassis on insert
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.617 163572 DEBUG neutron_lib.callbacks.manager [-] Subscribe: <bound method MetadataProxyHandler.post_fork_initialize of <neutron.agent.ovn.metadata.server.MetadataProxyHandler object at 0x7f808c024ca0>> process after_init 55550000, False subscribe /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:52
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.617 163572 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.617 163572 DEBUG oslo_concurrency.lockutils [-] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.618 163572 DEBUG oslo_concurrency.lockutils [-] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.618 163572 INFO oslo_service.service [-] Starting 1 workers
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.620 163572 DEBUG oslo_service.service [-] Started child 163670 _start_child /usr/lib/python3.9/site-packages/oslo_service/service.py:575
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.623 163572 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 9da5b53d-3184-450f-9a5b-bdba1a6c9f6d
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.624 163572 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.default', '--privsep_sock_path', '/tmp/tmpckuxnm6t/privsep.sock']
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.625 163670 DEBUG neutron_lib.callbacks.manager [-] Publish callbacks ['neutron.agent.ovn.metadata.server.MetadataProxyHandler.post_fork_initialize-184385'] for process (None), after_init _notify_loop /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:184
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.651 163670 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.652 163670 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.652 163670 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.657 163670 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.659 163670 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:ovsdbserver-sb.openstack.svc:6642: connected
Feb 23 09:19:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:48.670 163670 INFO eventlet.wsgi.server [-] (163670) wsgi starting up on http:/var/lib/neutron/metadata_proxy
Feb 23 09:19:48 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13591 DF PROTO=TCP SPT=57966 DPT=9101 SEQ=2521930306 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEB98060000000001030307) 
Feb 23 09:19:49 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:49.193 163572 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Feb 23 09:19:49 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:49.194 163572 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpckuxnm6t/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362
Feb 23 09:19:49 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:49.089 163675 INFO oslo.privsep.daemon [-] privsep daemon starting
Feb 23 09:19:49 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:49.094 163675 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Feb 23 09:19:49 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:49.097 163675 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/none
Feb 23 09:19:49 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:49.097 163675 INFO oslo.privsep.daemon [-] privsep daemon running as pid 163675
Feb 23 09:19:49 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:49.197 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[82ff1654-b311-4176-be73-37665b2ba583]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 09:19:49 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:49.688 163675 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 09:19:49 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:49.688 163675 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 09:19:49 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:49.688 163675 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 09:19:49 np0005626463.localdomain python3.9[163755]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Feb 23 09:19:50 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:50.145 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[74c77b82-e97c-4f54-afd9-f6f1283dd6a6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 09:19:50 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:50.147 163572 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.link_cmd', '--privsep_sock_path', '/tmp/tmprtzwo8ob/privsep.sock']
Feb 23 09:19:50 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:50.757 163572 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Feb 23 09:19:50 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:50.758 163572 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmprtzwo8ob/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362
Feb 23 09:19:50 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:50.643 163808 INFO oslo.privsep.daemon [-] privsep daemon starting
Feb 23 09:19:50 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:50.650 163808 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Feb 23 09:19:50 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:50.653 163808 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_NET_ADMIN|CAP_SYS_ADMIN/none
Feb 23 09:19:50 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:50.653 163808 INFO oslo.privsep.daemon [-] privsep daemon running as pid 163808
Feb 23 09:19:50 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:50.761 163808 DEBUG oslo.privsep.daemon [-] privsep: reply[9d6dcb06-42aa-415c-973e-1a05cd6ea9c4]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 09:19:50 np0005626463.localdomain sudo[163855]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-voajvurjmjkxajwpvpndfwivhuzujzbp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838390.5905259-1422-136039156239963/AnsiballZ_stat.py
Feb 23 09:19:50 np0005626463.localdomain sudo[163855]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:19:51 np0005626463.localdomain python3.9[163857]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:19:51 np0005626463.localdomain sudo[163855]: pam_unix(sudo:session): session closed for user root
Feb 23 09:19:51 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:51.220 163808 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 09:19:51 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:51.221 163808 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 09:19:51 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:51.221 163808 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 09:19:51 np0005626463.localdomain sudo[163931]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qvpsocghrfqebiquwgznrinulygprqaq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838390.5905259-1422-136039156239963/AnsiballZ_copy.py
Feb 23 09:19:51 np0005626463.localdomain sudo[163931]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:19:51 np0005626463.localdomain python3.9[163933]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771838390.5905259-1422-136039156239963/.source.yaml _original_basename=.982kglsr follow=False checksum=5e9d1f3425ea21486875902a84faa4fb54cf7178 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:19:51 np0005626463.localdomain sudo[163931]: pam_unix(sudo:session): session closed for user root
Feb 23 09:19:51 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:51.719 163808 DEBUG oslo.privsep.daemon [-] privsep: reply[e07ebd5b-468f-4b4f-aab0-9c28dd8d9383]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 09:19:51 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:51.722 163808 DEBUG oslo.privsep.daemon [-] privsep: reply[c8cf58e5-8fe7-456a-999d-9890843db5dd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 09:19:51 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:51.744 163808 DEBUG oslo.privsep.daemon [-] privsep: reply[6424b2f5-9605-43a7-80e5-98dd66d6f7de]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 09:19:51 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:51.763 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[ec2c6a0a-c4f3-46d6-ab52-39edfc07e4cd]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9da5b53d-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', 'fa:16:3e:c8:0e:6f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 104, 'tx_packets': 68, 'rx_bytes': 8926, 'tx_bytes': 7142, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 104, 'tx_packets': 68, 'rx_bytes': 8926, 'tx_bytes': 7142, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], 
['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 14], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483664], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 643129, 'reachable_time': 21618, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 17, 'outoctets': 1164, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 17, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 1164, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 17, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1400, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 163953, 'error': None, 'target': 'ovnmeta-9da5b53d-3184-450f-9a5b-bdba1a6c9f6d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 09:19:51 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:51.779 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[cf130fa9-b99d-40fb-bb8e-03a72ce06639]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap9da5b53d-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 643138, 'tstamp': 643138}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 163954, 'error': None, 'target': 'ovnmeta-9da5b53d-3184-450f-9a5b-bdba1a6c9f6d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 24, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '192.168.0.2'], ['IFA_LOCAL', '192.168.0.2'], ['IFA_BROADCAST', '192.168.0.255'], ['IFA_LABEL', 'tap9da5b53d-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 643140, 'tstamp': 643140}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 163954, 'error': None, 'target': 'ovnmeta-9da5b53d-3184-450f-9a5b-bdba1a6c9f6d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 10, 'prefixlen': 64, 'flags': 128, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::a9fe:a9fe'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 643136, 'tstamp': 643136}], ['IFA_FLAGS', 128]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 163954, 'error': None, 'target': 'ovnmeta-9da5b53d-3184-450f-9a5b-bdba1a6c9f6d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 10, 'prefixlen': 64, 'flags': 128, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fec8:e6f'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 643129, 'tstamp': 643129}], 
['IFA_FLAGS', 128]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 163954, 'error': None, 'target': 'ovnmeta-9da5b53d-3184-450f-9a5b-bdba1a6c9f6d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 09:19:51 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:51.835 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[82b4f5d2-e9a2-4b70-bb61-9e4f17c472cd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 09:19:51 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:51.837 163572 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9da5b53d-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 09:19:51 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:51.879 163572 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9da5b53d-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 09:19:51 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:51.880 163572 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 23 09:19:51 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:51.880 163572 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap9da5b53d-30, col_values=(('external_ids', {'iface-id': '4143c8ea-7577-4792-9744-bcff90eb20f2'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 09:19:51 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:51.881 163572 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 23 09:19:51 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:51.885 163572 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.namespace_cmd', '--privsep_sock_path', '/tmp/tmp8829ngjy/privsep.sock']
Feb 23 09:19:51 np0005626463.localdomain sshd[158360]: pam_unix(sshd:session): session closed for user zuul
Feb 23 09:19:51 np0005626463.localdomain systemd[1]: session-51.scope: Deactivated successfully.
Feb 23 09:19:52 np0005626463.localdomain systemd[1]: session-51.scope: Consumed 33.005s CPU time.
Feb 23 09:19:52 np0005626463.localdomain systemd-logind[759]: Session 51 logged out. Waiting for processes to exit.
Feb 23 09:19:52 np0005626463.localdomain systemd-logind[759]: Removed session 51.
Feb 23 09:19:52 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:52.517 163572 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Feb 23 09:19:52 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:52.518 163572 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmp8829ngjy/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362
Feb 23 09:19:52 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:52.415 163964 INFO oslo.privsep.daemon [-] privsep daemon starting
Feb 23 09:19:52 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:52.421 163964 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Feb 23 09:19:52 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:52.424 163964 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_SYS_ADMIN/CAP_SYS_ADMIN/none
Feb 23 09:19:52 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:52.425 163964 INFO oslo.privsep.daemon [-] privsep daemon running as pid 163964
Feb 23 09:19:52 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:52.522 163964 DEBUG oslo.privsep.daemon [-] privsep: reply[75176f92-edb7-421a-9017-c446b8c74dad]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 09:19:52 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:52.912 163964 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 09:19:52 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:52.913 163964 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 09:19:52 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:52.913 163964 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.359 163964 DEBUG oslo.privsep.daemon [-] privsep: reply[c8029fb2-f7c3-4ea1-ba1e-5468d89c8f5a]: (4, ['ovnmeta-9da5b53d-3184-450f-9a5b-bdba1a6c9f6d']) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.363 163572 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbAddCommand(_result=None, table=Chassis_Private, record=96b5bb93-7341-4ce6-9b93-6a5de566c711, column=external_ids, values=({'neutron:ovn-metadata-id': '1e311e03-9c9b-56ee-88f3-50a2fe78fcac'},)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.363 163572 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.364 163572 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=96b5bb93-7341-4ce6-9b93-6a5de566c711, col_values=(('external_ids', {'neutron:ovn-bridge': 'br-int'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.377 163572 DEBUG oslo_service.service [-] Full set of CONF: wait /usr/lib/python3.9/site-packages/oslo_service/service.py:649
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.377 163572 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.377 163572 DEBUG oslo_service.service [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.377 163572 DEBUG oslo_service.service [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.377 163572 DEBUG oslo_service.service [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.378 163572 DEBUG oslo_service.service [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.378 163572 DEBUG oslo_service.service [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.378 163572 DEBUG oslo_service.service [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.378 163572 DEBUG oslo_service.service [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.379 163572 DEBUG oslo_service.service [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.379 163572 DEBUG oslo_service.service [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.379 163572 DEBUG oslo_service.service [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.379 163572 DEBUG oslo_service.service [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.380 163572 DEBUG oslo_service.service [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.380 163572 DEBUG oslo_service.service [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.380 163572 DEBUG oslo_service.service [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.380 163572 DEBUG oslo_service.service [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.381 163572 DEBUG oslo_service.service [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.381 163572 DEBUG oslo_service.service [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.381 163572 DEBUG oslo_service.service [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.381 163572 DEBUG oslo_service.service [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.381 163572 DEBUG oslo_service.service [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.382 163572 DEBUG oslo_service.service [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.382 163572 DEBUG oslo_service.service [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.382 163572 DEBUG oslo_service.service [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.382 163572 DEBUG oslo_service.service [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.383 163572 DEBUG oslo_service.service [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.383 163572 DEBUG oslo_service.service [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.383 163572 DEBUG oslo_service.service [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.383 163572 DEBUG oslo_service.service [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.384 163572 DEBUG oslo_service.service [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.384 163572 DEBUG oslo_service.service [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.384 163572 DEBUG oslo_service.service [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.384 163572 DEBUG oslo_service.service [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.385 163572 DEBUG oslo_service.service [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.385 163572 DEBUG oslo_service.service [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.385 163572 DEBUG oslo_service.service [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.385 163572 DEBUG oslo_service.service [-] host                           = np0005626463.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.386 163572 DEBUG oslo_service.service [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.386 163572 DEBUG oslo_service.service [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.386 163572 DEBUG oslo_service.service [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.386 163572 DEBUG oslo_service.service [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.387 163572 DEBUG oslo_service.service [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.387 163572 DEBUG oslo_service.service [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.387 163572 DEBUG oslo_service.service [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.387 163572 DEBUG oslo_service.service [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.388 163572 DEBUG oslo_service.service [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.388 163572 DEBUG oslo_service.service [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.388 163572 DEBUG oslo_service.service [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.388 163572 DEBUG oslo_service.service [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.388 163572 DEBUG oslo_service.service [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.389 163572 DEBUG oslo_service.service [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.389 163572 DEBUG oslo_service.service [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.389 163572 DEBUG oslo_service.service [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.389 163572 DEBUG oslo_service.service [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.389 163572 DEBUG oslo_service.service [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.390 163572 DEBUG oslo_service.service [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.390 163572 DEBUG oslo_service.service [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.390 163572 DEBUG oslo_service.service [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.390 163572 DEBUG oslo_service.service [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.391 163572 DEBUG oslo_service.service [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.391 163572 DEBUG oslo_service.service [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.391 163572 DEBUG oslo_service.service [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.391 163572 DEBUG oslo_service.service [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.391 163572 DEBUG oslo_service.service [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.392 163572 DEBUG oslo_service.service [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.392 163572 DEBUG oslo_service.service [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.392 163572 DEBUG oslo_service.service [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.392 163572 DEBUG oslo_service.service [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.393 163572 DEBUG oslo_service.service [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.393 163572 DEBUG oslo_service.service [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.393 163572 DEBUG oslo_service.service [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.393 163572 DEBUG oslo_service.service [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.394 163572 DEBUG oslo_service.service [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.394 163572 DEBUG oslo_service.service [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.394 163572 DEBUG oslo_service.service [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.394 163572 DEBUG oslo_service.service [-] nova_metadata_protocol         = http log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.394 163572 DEBUG oslo_service.service [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.395 163572 DEBUG oslo_service.service [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.395 163572 DEBUG oslo_service.service [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.395 163572 DEBUG oslo_service.service [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.395 163572 DEBUG oslo_service.service [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.396 163572 DEBUG oslo_service.service [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.396 163572 DEBUG oslo_service.service [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.396 163572 DEBUG oslo_service.service [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.396 163572 DEBUG oslo_service.service [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.396 163572 DEBUG oslo_service.service [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.397 163572 DEBUG oslo_service.service [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.397 163572 DEBUG oslo_service.service [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.397 163572 DEBUG oslo_service.service [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.397 163572 DEBUG oslo_service.service [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.397 163572 DEBUG oslo_service.service [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.398 163572 DEBUG oslo_service.service [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.398 163572 DEBUG oslo_service.service [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.398 163572 DEBUG oslo_service.service [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.398 163572 DEBUG oslo_service.service [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.399 163572 DEBUG oslo_service.service [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.399 163572 DEBUG oslo_service.service [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.399 163572 DEBUG oslo_service.service [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.399 163572 DEBUG oslo_service.service [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.399 163572 DEBUG oslo_service.service [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.400 163572 DEBUG oslo_service.service [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.400 163572 DEBUG oslo_service.service [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.400 163572 DEBUG oslo_service.service [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.400 163572 DEBUG oslo_service.service [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.400 163572 DEBUG oslo_service.service [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.401 163572 DEBUG oslo_service.service [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.401 163572 DEBUG oslo_service.service [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.402 163572 DEBUG oslo_service.service [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.402 163572 DEBUG oslo_service.service [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.402 163572 DEBUG oslo_service.service [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.402 163572 DEBUG oslo_service.service [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.403 163572 DEBUG oslo_service.service [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.403 163572 DEBUG oslo_service.service [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.403 163572 DEBUG oslo_service.service [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.403 163572 DEBUG oslo_service.service [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.403 163572 DEBUG oslo_service.service [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.404 163572 DEBUG oslo_service.service [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.404 163572 DEBUG oslo_service.service [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.404 163572 DEBUG oslo_service.service [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.404 163572 DEBUG oslo_service.service [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.405 163572 DEBUG oslo_service.service [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.405 163572 DEBUG oslo_service.service [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.405 163572 DEBUG oslo_service.service [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.405 163572 DEBUG oslo_service.service [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.406 163572 DEBUG oslo_service.service [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.406 163572 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.406 163572 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.406 163572 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.407 163572 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.407 163572 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.407 163572 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.407 163572 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.408 163572 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.408 163572 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.408 163572 DEBUG oslo_service.service [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.408 163572 DEBUG oslo_service.service [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.409 163572 DEBUG oslo_service.service [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.409 163572 DEBUG oslo_service.service [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.409 163572 DEBUG oslo_service.service [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.409 163572 DEBUG oslo_service.service [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.409 163572 DEBUG oslo_service.service [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.410 163572 DEBUG oslo_service.service [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.410 163572 DEBUG oslo_service.service [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.410 163572 DEBUG oslo_service.service [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.410 163572 DEBUG oslo_service.service [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.411 163572 DEBUG oslo_service.service [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.411 163572 DEBUG oslo_service.service [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.411 163572 DEBUG oslo_service.service [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.411 163572 DEBUG oslo_service.service [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.411 163572 DEBUG oslo_service.service [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.412 163572 DEBUG oslo_service.service [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.412 163572 DEBUG oslo_service.service [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.412 163572 DEBUG oslo_service.service [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.412 163572 DEBUG oslo_service.service [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.412 163572 DEBUG oslo_service.service [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.413 163572 DEBUG oslo_service.service [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.413 163572 DEBUG oslo_service.service [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.413 163572 DEBUG oslo_service.service [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.413 163572 DEBUG oslo_service.service [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.413 163572 DEBUG oslo_service.service [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.414 163572 DEBUG oslo_service.service [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.414 163572 DEBUG oslo_service.service [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.414 163572 DEBUG oslo_service.service [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.414 163572 DEBUG oslo_service.service [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.414 163572 DEBUG oslo_service.service [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.415 163572 DEBUG oslo_service.service [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.415 163572 DEBUG oslo_service.service [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.415 163572 DEBUG oslo_service.service [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.415 163572 DEBUG oslo_service.service [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.415 163572 DEBUG oslo_service.service [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.416 163572 DEBUG oslo_service.service [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.416 163572 DEBUG oslo_service.service [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.416 163572 DEBUG oslo_service.service [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.416 163572 DEBUG oslo_service.service [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.416 163572 DEBUG oslo_service.service [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.417 163572 DEBUG oslo_service.service [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.417 163572 DEBUG oslo_service.service [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.417 163572 DEBUG oslo_service.service [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.417 163572 DEBUG oslo_service.service [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.417 163572 DEBUG oslo_service.service [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.418 163572 DEBUG oslo_service.service [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.418 163572 DEBUG oslo_service.service [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.418 163572 DEBUG oslo_service.service [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.418 163572 DEBUG oslo_service.service [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.419 163572 DEBUG oslo_service.service [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.419 163572 DEBUG oslo_service.service [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.419 163572 DEBUG oslo_service.service [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.419 163572 DEBUG oslo_service.service [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.419 163572 DEBUG oslo_service.service [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.420 163572 DEBUG oslo_service.service [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.420 163572 DEBUG oslo_service.service [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.420 163572 DEBUG oslo_service.service [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.420 163572 DEBUG oslo_service.service [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.420 163572 DEBUG oslo_service.service [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.421 163572 DEBUG oslo_service.service [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.421 163572 DEBUG oslo_service.service [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.421 163572 DEBUG oslo_service.service [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.421 163572 DEBUG oslo_service.service [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.422 163572 DEBUG oslo_service.service [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.422 163572 DEBUG oslo_service.service [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.422 163572 DEBUG oslo_service.service [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.422 163572 DEBUG oslo_service.service [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.422 163572 DEBUG oslo_service.service [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.423 163572 DEBUG oslo_service.service [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.423 163572 DEBUG oslo_service.service [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.423 163572 DEBUG oslo_service.service [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.423 163572 DEBUG oslo_service.service [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.423 163572 DEBUG oslo_service.service [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.424 163572 DEBUG oslo_service.service [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.424 163572 DEBUG oslo_service.service [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.424 163572 DEBUG oslo_service.service [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.424 163572 DEBUG oslo_service.service [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.424 163572 DEBUG oslo_service.service [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.424 163572 DEBUG oslo_service.service [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.424 163572 DEBUG oslo_service.service [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.424 163572 DEBUG oslo_service.service [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.425 163572 DEBUG oslo_service.service [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.425 163572 DEBUG oslo_service.service [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.425 163572 DEBUG oslo_service.service [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.425 163572 DEBUG oslo_service.service [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.425 163572 DEBUG oslo_service.service [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.425 163572 DEBUG oslo_service.service [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.425 163572 DEBUG oslo_service.service [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.426 163572 DEBUG oslo_service.service [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.426 163572 DEBUG oslo_service.service [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.426 163572 DEBUG oslo_service.service [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.426 163572 DEBUG oslo_service.service [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.426 163572 DEBUG oslo_service.service [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.426 163572 DEBUG oslo_service.service [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.426 163572 DEBUG oslo_service.service [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.427 163572 DEBUG oslo_service.service [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.427 163572 DEBUG oslo_service.service [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.427 163572 DEBUG oslo_service.service [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.427 163572 DEBUG oslo_service.service [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.427 163572 DEBUG oslo_service.service [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.427 163572 DEBUG oslo_service.service [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.427 163572 DEBUG oslo_service.service [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.427 163572 DEBUG oslo_service.service [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.428 163572 DEBUG oslo_service.service [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.428 163572 DEBUG oslo_service.service [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.428 163572 DEBUG oslo_service.service [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.428 163572 DEBUG oslo_service.service [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.428 163572 DEBUG oslo_service.service [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.428 163572 DEBUG oslo_service.service [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.428 163572 DEBUG oslo_service.service [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.429 163572 DEBUG oslo_service.service [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.429 163572 DEBUG oslo_service.service [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.429 163572 DEBUG oslo_service.service [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.429 163572 DEBUG oslo_service.service [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.429 163572 DEBUG oslo_service.service [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.429 163572 DEBUG oslo_service.service [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.429 163572 DEBUG oslo_service.service [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.429 163572 DEBUG oslo_service.service [-] ovn.ovn_sb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.430 163572 DEBUG oslo_service.service [-] ovn.ovn_sb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.430 163572 DEBUG oslo_service.service [-] ovn.ovn_sb_connection          = tcp:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.430 163572 DEBUG oslo_service.service [-] ovn.ovn_sb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.430 163572 DEBUG oslo_service.service [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.430 163572 DEBUG oslo_service.service [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.430 163572 DEBUG oslo_service.service [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.430 163572 DEBUG oslo_service.service [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.431 163572 DEBUG oslo_service.service [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.431 163572 DEBUG oslo_service.service [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.431 163572 DEBUG oslo_service.service [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.431 163572 DEBUG oslo_service.service [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.431 163572 DEBUG oslo_service.service [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.431 163572 DEBUG oslo_service.service [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.431 163572 DEBUG oslo_service.service [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.432 163572 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.432 163572 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.432 163572 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.432 163572 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.432 163572 DEBUG oslo_service.service [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.432 163572 DEBUG oslo_service.service [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.432 163572 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.433 163572 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.433 163572 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.433 163572 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.433 163572 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.433 163572 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.433 163572 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.433 163572 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.434 163572 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.434 163572 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.434 163572 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.434 163572 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.434 163572 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.434 163572 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.434 163572 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.434 163572 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.435 163572 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.435 163572 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.435 163572 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.435 163572 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.435 163572 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.435 163572 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.435 163572 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.436 163572 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.436 163572 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.436 163572 DEBUG oslo_service.service [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.436 163572 DEBUG oslo_service.service [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.436 163572 DEBUG oslo_service.service [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.436 163572 DEBUG oslo_service.service [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:19:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:19:53.437 163572 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Feb 23 09:19:54 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56263 DF PROTO=TCP SPT=50442 DPT=9102 SEQ=1965221192 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEBACF00000000001030307) 
Feb 23 09:19:54 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20431 DF PROTO=TCP SPT=55950 DPT=9882 SEQ=638005839 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEBB0060000000001030307) 
Feb 23 09:19:57 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56265 DF PROTO=TCP SPT=50442 DPT=9102 SEQ=1965221192 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEBB9460000000001030307) 
Feb 23 09:19:58 np0005626463.localdomain sshd[163969]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:19:58 np0005626463.localdomain sshd[163969]: Accepted publickey for zuul from 192.168.122.30 port 57966 ssh2: RSA SHA256:/ShS2J5Dq7o9P59e/NmgQORSAcJOBwu46Huo03HBdB4
Feb 23 09:19:58 np0005626463.localdomain systemd-logind[759]: New session 52 of user zuul.
Feb 23 09:19:58 np0005626463.localdomain systemd[1]: Started Session 52 of User zuul.
Feb 23 09:19:58 np0005626463.localdomain sshd[163969]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Feb 23 09:19:59 np0005626463.localdomain python3.9[164062]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 23 09:20:00 np0005626463.localdomain sudo[164156]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eqnahudnaecdyqjmfjpiqmnskukuvrpz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838400.015972-57-240173606434716/AnsiballZ_command.py
Feb 23 09:20:00 np0005626463.localdomain sudo[164156]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:20:00 np0005626463.localdomain python3.9[164158]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps -a --filter name=^nova_virtlogd$ --format \{\{.Names\}\} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 09:20:00 np0005626463.localdomain sudo[164156]: pam_unix(sudo:session): session closed for user root
Feb 23 09:20:00 np0005626463.localdomain sshd[164183]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:20:00 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37824 DF PROTO=TCP SPT=45092 DPT=9100 SEQ=961740688 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEBC7870000000001030307) 
Feb 23 09:20:01 np0005626463.localdomain sudo[164263]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bojtuftsbaoqmnhzxbyvtykyxpkbfpip ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838400.9713461-81-227423879961582/AnsiballZ_command.py
Feb 23 09:20:01 np0005626463.localdomain sudo[164263]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:20:01 np0005626463.localdomain python3.9[164265]: ansible-ansible.legacy.command Invoked with _raw_params=podman stop nova_virtlogd _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 09:20:01 np0005626463.localdomain systemd[1]: tmp-crun.x18woI.mount: Deactivated successfully.
Feb 23 09:20:01 np0005626463.localdomain systemd[1]: libpod-3df457bc42eb89aa6185b88a77d4c5934a55cee63ff7da200319cff5193cbe74.scope: Deactivated successfully.
Feb 23 09:20:01 np0005626463.localdomain podman[164266]: 2026-02-23 09:20:01.54765599 +0000 UTC m=+0.083597434 container died 3df457bc42eb89aa6185b88a77d4c5934a55cee63ff7da200319cff5193cbe74 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd, org.opencontainers.image.created=2026-01-12T23:31:49Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20260112.1, architecture=x86_64, distribution-scope=public, tcib_managed=true, name=rhosp-rhel9/openstack-nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, build-date=2026-01-12T23:31:49Z, maintainer=OpenStack TripleO Team, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, vendor=Red Hat, Inc., url=https://www.redhat.com, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13)
Feb 23 09:20:01 np0005626463.localdomain podman[164266]: 2026-02-23 09:20:01.59358603 +0000 UTC m=+0.129527474 container cleanup 3df457bc42eb89aa6185b88a77d4c5934a55cee63ff7da200319cff5193cbe74 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, name=rhosp-rhel9/openstack-nova-libvirt, release=1766032510, com.redhat.component=openstack-nova-libvirt-container, io.openshift.expose-services=, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.13, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, org.opencontainers.image.created=2026-01-12T23:31:49Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, architecture=x86_64, build-date=2026-01-12T23:31:49Z, distribution-scope=public, tcib_managed=true, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team)
Feb 23 09:20:01 np0005626463.localdomain sudo[164263]: pam_unix(sudo:session): session closed for user root
Feb 23 09:20:01 np0005626463.localdomain podman[164280]: 2026-02-23 09:20:01.64244578 +0000 UTC m=+0.087175506 container remove 3df457bc42eb89aa6185b88a77d4c5934a55cee63ff7da200319cff5193cbe74 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.buildah.version=1.41.5, distribution-scope=public, build-date=2026-01-12T23:31:49Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, name=rhosp-rhel9/openstack-nova-libvirt, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-libvirt-container, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:31:49Z, url=https://www.redhat.com, version=17.1.13, architecture=x86_64, io.openshift.expose-services=)
Feb 23 09:20:01 np0005626463.localdomain systemd[1]: libpod-conmon-3df457bc42eb89aa6185b88a77d4c5934a55cee63ff7da200319cff5193cbe74.scope: Deactivated successfully.
Feb 23 09:20:02 np0005626463.localdomain sshd[164183]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:20:02 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-0f5369acf1913ef4c00375204a8528c500963efed1d6cd27d7b10a2d16e203b5-merged.mount: Deactivated successfully.
Feb 23 09:20:02 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-3df457bc42eb89aa6185b88a77d4c5934a55cee63ff7da200319cff5193cbe74-userdata-shm.mount: Deactivated successfully.
Feb 23 09:20:02 np0005626463.localdomain sudo[164385]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ilhhmlxzgqqvxzkkfnctifcqoaotwitq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838401.9326777-111-131878527339509/AnsiballZ_systemd_service.py
Feb 23 09:20:02 np0005626463.localdomain sudo[164385]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:20:02 np0005626463.localdomain python3.9[164387]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 23 09:20:02 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 09:20:02 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37825 DF PROTO=TCP SPT=45092 DPT=9100 SEQ=961740688 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEBCF860000000001030307) 
Feb 23 09:20:02 np0005626463.localdomain systemd-rc-local-generator[164413]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 09:20:02 np0005626463.localdomain systemd-sysv-generator[164416]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 09:20:03 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 09:20:03 np0005626463.localdomain sudo[164385]: pam_unix(sudo:session): session closed for user root
Feb 23 09:20:03 np0005626463.localdomain python3.9[164512]: ansible-ansible.builtin.service_facts Invoked
Feb 23 09:20:03 np0005626463.localdomain network[164529]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Feb 23 09:20:03 np0005626463.localdomain network[164530]: 'network-scripts' will be removed from distribution in near future.
Feb 23 09:20:03 np0005626463.localdomain network[164531]: It is advised to switch to 'NetworkManager' instead for network management.
Feb 23 09:20:05 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 09:20:06 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15488 DF PROTO=TCP SPT=46708 DPT=9101 SEQ=3722656640 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEBDC460000000001030307) 
Feb 23 09:20:08 np0005626463.localdomain sudo[164730]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hkjliqllarzefwdtwukaiwhuyfjjlysd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838408.4637125-168-254233500837964/AnsiballZ_systemd_service.py
Feb 23 09:20:08 np0005626463.localdomain sudo[164730]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:20:09 np0005626463.localdomain python3.9[164732]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_libvirt.target state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 09:20:09 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 09:20:09 np0005626463.localdomain systemd-sysv-generator[164759]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 09:20:09 np0005626463.localdomain systemd-rc-local-generator[164751]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 09:20:09 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 09:20:09 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16153 DF PROTO=TCP SPT=49138 DPT=9882 SEQ=959314150 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEBE8A30000000001030307) 
Feb 23 09:20:09 np0005626463.localdomain systemd[1]: Stopped target tripleo_nova_libvirt.target.
Feb 23 09:20:09 np0005626463.localdomain sudo[164730]: pam_unix(sudo:session): session closed for user root
Feb 23 09:20:09 np0005626463.localdomain sudo[164861]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bmsjqqukkzdijwhtssggpsgdotylyxty ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838409.6346169-168-275487543533959/AnsiballZ_systemd_service.py
Feb 23 09:20:09 np0005626463.localdomain sudo[164861]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:20:10 np0005626463.localdomain python3.9[164863]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtlogd_wrapper.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 09:20:10 np0005626463.localdomain sudo[164861]: pam_unix(sudo:session): session closed for user root
Feb 23 09:20:11 np0005626463.localdomain sudo[164954]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vteahxphlhosxslniygjhwgscjysisqm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838411.2299688-168-132187852425443/AnsiballZ_systemd_service.py
Feb 23 09:20:11 np0005626463.localdomain sudo[164954]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:20:11 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.
Feb 23 09:20:11 np0005626463.localdomain podman[164957]: 2026-02-23 09:20:11.656303803 +0000 UTC m=+0.093948805 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20260216)
Feb 23 09:20:11 np0005626463.localdomain podman[164957]: 2026-02-23 09:20:11.694511813 +0000 UTC m=+0.132156845 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.43.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 23 09:20:11 np0005626463.localdomain systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully.
Feb 23 09:20:11 np0005626463.localdomain python3.9[164956]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtnodedevd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 09:20:11 np0005626463.localdomain sudo[164954]: pam_unix(sudo:session): session closed for user root
Feb 23 09:20:12 np0005626463.localdomain sudo[165040]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 09:20:12 np0005626463.localdomain sudo[165040]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:20:12 np0005626463.localdomain sudo[165040]: pam_unix(sudo:session): session closed for user root
Feb 23 09:20:12 np0005626463.localdomain sudo[165085]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rjjwophqkijumrjclkhohghtoyvtpjzp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838412.0135763-168-113212631688256/AnsiballZ_systemd_service.py
Feb 23 09:20:12 np0005626463.localdomain sudo[165085]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:20:12 np0005626463.localdomain sudo[165086]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 23 09:20:12 np0005626463.localdomain sudo[165086]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:20:12 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16155 DF PROTO=TCP SPT=49138 DPT=9882 SEQ=959314150 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEBF4C60000000001030307) 
Feb 23 09:20:12 np0005626463.localdomain python3.9[165100]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtproxyd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 09:20:12 np0005626463.localdomain sudo[165086]: pam_unix(sudo:session): session closed for user root
Feb 23 09:20:13 np0005626463.localdomain sudo[165135]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 09:20:13 np0005626463.localdomain sudo[165135]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:20:13 np0005626463.localdomain sudo[165135]: pam_unix(sudo:session): session closed for user root
Feb 23 09:20:13 np0005626463.localdomain sudo[165085]: pam_unix(sudo:session): session closed for user root
Feb 23 09:20:14 np0005626463.localdomain sudo[165239]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pqsvfhxpxukfsgmykznssupncheaxuee ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838413.7937112-168-61193631159446/AnsiballZ_systemd_service.py
Feb 23 09:20:14 np0005626463.localdomain sudo[165239]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:20:14 np0005626463.localdomain python3.9[165241]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtqemud.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 09:20:14 np0005626463.localdomain sudo[165239]: pam_unix(sudo:session): session closed for user root
Feb 23 09:20:14 np0005626463.localdomain sudo[165332]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jwxkzeezbtcnpbxsvsnhryzzitdsprfj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838414.503728-168-33686576616846/AnsiballZ_systemd_service.py
Feb 23 09:20:14 np0005626463.localdomain sudo[165332]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:20:15 np0005626463.localdomain python3.9[165334]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtsecretd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 09:20:15 np0005626463.localdomain sudo[165332]: pam_unix(sudo:session): session closed for user root
Feb 23 09:20:15 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21996 DF PROTO=TCP SPT=43090 DPT=9105 SEQ=1258698052 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEC00060000000001030307) 
Feb 23 09:20:15 np0005626463.localdomain sudo[165425]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cawcojvppbpyyhmgvvyvqwdcxzektece ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838415.2635987-168-193373561554981/AnsiballZ_systemd_service.py
Feb 23 09:20:15 np0005626463.localdomain sudo[165425]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:20:15 np0005626463.localdomain python3.9[165427]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtstoraged.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 09:20:15 np0005626463.localdomain sudo[165425]: pam_unix(sudo:session): session closed for user root
Feb 23 09:20:16 np0005626463.localdomain sshd[165443]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:20:17 np0005626463.localdomain sshd[165443]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:20:17 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.
Feb 23 09:20:17 np0005626463.localdomain podman[165477]: 2026-02-23 09:20:17.887164154 +0000 UTC m=+0.063868846 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Feb 23 09:20:17 np0005626463.localdomain podman[165477]: 2026-02-23 09:20:17.893270912 +0000 UTC m=+0.069975584 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.43.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Feb 23 09:20:17 np0005626463.localdomain systemd[1]: 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully.
Feb 23 09:20:18 np0005626463.localdomain sudo[165537]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hrmohmcatolvkxpzeutheuztpejckyzs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838417.619591-324-42113890714505/AnsiballZ_file.py
Feb 23 09:20:18 np0005626463.localdomain sudo[165537]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:20:18 np0005626463.localdomain python3.9[165539]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:20:18 np0005626463.localdomain sudo[165537]: pam_unix(sudo:session): session closed for user root
Feb 23 09:20:18 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15490 DF PROTO=TCP SPT=46708 DPT=9101 SEQ=3722656640 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEC0C070000000001030307) 
Feb 23 09:20:18 np0005626463.localdomain sudo[165629]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-toystqzrekonbddixfxfrdadhccxwmbz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838418.3904176-324-195703136615606/AnsiballZ_file.py
Feb 23 09:20:18 np0005626463.localdomain sudo[165629]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:20:18 np0005626463.localdomain python3.9[165631]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:20:18 np0005626463.localdomain sudo[165629]: pam_unix(sudo:session): session closed for user root
Feb 23 09:20:19 np0005626463.localdomain sudo[165721]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-haaprwjckrsvjrpioviqjnzxsrvpybqi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838418.9669392-324-70866291090080/AnsiballZ_file.py
Feb 23 09:20:19 np0005626463.localdomain sudo[165721]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:20:19 np0005626463.localdomain python3.9[165723]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:20:19 np0005626463.localdomain sudo[165721]: pam_unix(sudo:session): session closed for user root
Feb 23 09:20:19 np0005626463.localdomain sudo[165813]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ppphguatxlgzydyeqbzorahmakrswrja ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838419.576376-324-120556221422425/AnsiballZ_file.py
Feb 23 09:20:19 np0005626463.localdomain sudo[165813]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:20:20 np0005626463.localdomain python3.9[165815]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:20:20 np0005626463.localdomain sudo[165813]: pam_unix(sudo:session): session closed for user root
Feb 23 09:20:20 np0005626463.localdomain sudo[165905]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vmnlrkantorvhnrhruqvjqfzlifeedmv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838420.1477263-324-61780550893889/AnsiballZ_file.py
Feb 23 09:20:20 np0005626463.localdomain sudo[165905]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:20:20 np0005626463.localdomain python3.9[165907]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:20:20 np0005626463.localdomain sudo[165905]: pam_unix(sudo:session): session closed for user root
Feb 23 09:20:21 np0005626463.localdomain sudo[165997]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ztksvdibrzygxxkuvyrvpqleevkvlrdt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838420.7349164-324-93634672272209/AnsiballZ_file.py
Feb 23 09:20:21 np0005626463.localdomain sudo[165997]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:20:21 np0005626463.localdomain python3.9[165999]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:20:21 np0005626463.localdomain sudo[165997]: pam_unix(sudo:session): session closed for user root
Feb 23 09:20:22 np0005626463.localdomain sudo[166089]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qkwbfaclptbiyrnumifwnhosdmzhjufv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838421.9399524-324-87990306591720/AnsiballZ_file.py
Feb 23 09:20:22 np0005626463.localdomain sudo[166089]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:20:22 np0005626463.localdomain python3.9[166091]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:20:22 np0005626463.localdomain sudo[166089]: pam_unix(sudo:session): session closed for user root
Feb 23 09:20:22 np0005626463.localdomain sudo[166181]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tqnnbdcjxjdfzhduwfbarinchfdomcvm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838422.5835025-474-8792222365888/AnsiballZ_file.py
Feb 23 09:20:22 np0005626463.localdomain sudo[166181]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:20:23 np0005626463.localdomain python3.9[166183]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:20:23 np0005626463.localdomain sudo[166181]: pam_unix(sudo:session): session closed for user root
Feb 23 09:20:24 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58851 DF PROTO=TCP SPT=56256 DPT=9102 SEQ=1936941513 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEC22200000000001030307) 
Feb 23 09:20:24 np0005626463.localdomain sudo[166273]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lpdspyknyrlvkerphjewencghzrwhrhr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838423.7576537-474-231869175722162/AnsiballZ_file.py
Feb 23 09:20:24 np0005626463.localdomain sudo[166273]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:20:24 np0005626463.localdomain python3.9[166275]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:20:24 np0005626463.localdomain sudo[166273]: pam_unix(sudo:session): session closed for user root
Feb 23 09:20:24 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16157 DF PROTO=TCP SPT=49138 DPT=9882 SEQ=959314150 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEC24060000000001030307) 
Feb 23 09:20:24 np0005626463.localdomain sudo[166365]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-exhqklwwcspekyqndmxxnnzaxdctaqdl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838424.481617-474-197492220617111/AnsiballZ_file.py
Feb 23 09:20:24 np0005626463.localdomain sudo[166365]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:20:24 np0005626463.localdomain python3.9[166367]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:20:24 np0005626463.localdomain sudo[166365]: pam_unix(sudo:session): session closed for user root
Feb 23 09:20:25 np0005626463.localdomain sudo[166457]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dsxisksfymrluqnnbojqukjmzxegyrha ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838425.0869906-474-243239831117207/AnsiballZ_file.py
Feb 23 09:20:25 np0005626463.localdomain sudo[166457]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:20:25 np0005626463.localdomain python3.9[166459]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:20:25 np0005626463.localdomain sudo[166457]: pam_unix(sudo:session): session closed for user root
Feb 23 09:20:25 np0005626463.localdomain sudo[166549]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-doifspaqifaseumwiumdixjfzxvpnvax ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838425.716988-474-260511648914343/AnsiballZ_file.py
Feb 23 09:20:25 np0005626463.localdomain sudo[166549]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:20:26 np0005626463.localdomain python3.9[166551]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:20:26 np0005626463.localdomain sudo[166549]: pam_unix(sudo:session): session closed for user root
Feb 23 09:20:26 np0005626463.localdomain sudo[166641]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hncbuwkrfxjiwjrvkpympshldghydohx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838426.294059-474-194016298269935/AnsiballZ_file.py
Feb 23 09:20:26 np0005626463.localdomain sudo[166641]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:20:26 np0005626463.localdomain python3.9[166643]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:20:26 np0005626463.localdomain sudo[166641]: pam_unix(sudo:session): session closed for user root
Feb 23 09:20:27 np0005626463.localdomain sudo[166733]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-skulbmuqcpbfrwmkswbpgwdhtskrfmcr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838426.905438-474-48372776569521/AnsiballZ_file.py
Feb 23 09:20:27 np0005626463.localdomain sudo[166733]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:20:27 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58853 DF PROTO=TCP SPT=56256 DPT=9102 SEQ=1936941513 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEC2E470000000001030307) 
Feb 23 09:20:27 np0005626463.localdomain python3.9[166735]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:20:27 np0005626463.localdomain sudo[166733]: pam_unix(sudo:session): session closed for user root
Feb 23 09:20:27 np0005626463.localdomain sudo[166825]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wgfikckvxgwdhxjwtxpdwupgpmieghhs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838427.7250118-627-119794392740593/AnsiballZ_command.py
Feb 23 09:20:27 np0005626463.localdomain sudo[166825]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:20:28 np0005626463.localdomain python3.9[166827]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                                              systemctl disable --now certmonger.service
                                                              test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                                            fi
                                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 09:20:28 np0005626463.localdomain sudo[166825]: pam_unix(sudo:session): session closed for user root
Feb 23 09:20:29 np0005626463.localdomain python3.9[166919]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Feb 23 09:20:29 np0005626463.localdomain sudo[167009]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yfadrlavtupeirtbrclqmfcmiyjllvhu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838429.3226683-681-203106078437099/AnsiballZ_systemd_service.py
Feb 23 09:20:29 np0005626463.localdomain sudo[167009]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:20:29 np0005626463.localdomain python3.9[167011]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 23 09:20:29 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 09:20:30 np0005626463.localdomain systemd-sysv-generator[167037]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 09:20:30 np0005626463.localdomain systemd-rc-local-generator[167033]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 09:20:30 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 09:20:30 np0005626463.localdomain sudo[167009]: pam_unix(sudo:session): session closed for user root
Feb 23 09:20:30 np0005626463.localdomain sudo[167137]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ipkptfhegxyuecyotbuadtikfsuvqgky ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838430.4179459-705-174942510929946/AnsiballZ_command.py
Feb 23 09:20:30 np0005626463.localdomain sudo[167137]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:20:30 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53614 DF PROTO=TCP SPT=43070 DPT=9100 SEQ=426322492 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEC3C860000000001030307) 
Feb 23 09:20:30 np0005626463.localdomain python3.9[167139]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_libvirt.target _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 09:20:30 np0005626463.localdomain sudo[167137]: pam_unix(sudo:session): session closed for user root
Feb 23 09:20:31 np0005626463.localdomain sudo[167230]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sljetymwftddpvrtukcqdwujrnhxdxpl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838431.0127065-705-74988992600263/AnsiballZ_command.py
Feb 23 09:20:31 np0005626463.localdomain sudo[167230]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:20:31 np0005626463.localdomain python3.9[167232]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtlogd_wrapper.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 09:20:31 np0005626463.localdomain sudo[167230]: pam_unix(sudo:session): session closed for user root
Feb 23 09:20:31 np0005626463.localdomain sudo[167323]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lpvxdfndtuzjjotgjofkpphmasdatxsg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838431.6405663-705-172943000865591/AnsiballZ_command.py
Feb 23 09:20:31 np0005626463.localdomain sudo[167323]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:20:32 np0005626463.localdomain python3.9[167325]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtnodedevd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 09:20:32 np0005626463.localdomain sudo[167323]: pam_unix(sudo:session): session closed for user root
Feb 23 09:20:32 np0005626463.localdomain sudo[167416]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tayfrcvlkdafbhobyfevxwcedheodeyy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838432.2059574-705-87423823518564/AnsiballZ_command.py
Feb 23 09:20:32 np0005626463.localdomain sudo[167416]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:20:32 np0005626463.localdomain python3.9[167418]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtproxyd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 09:20:32 np0005626463.localdomain sudo[167416]: pam_unix(sudo:session): session closed for user root
Feb 23 09:20:32 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53615 DF PROTO=TCP SPT=43070 DPT=9100 SEQ=426322492 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEC44860000000001030307) 
Feb 23 09:20:33 np0005626463.localdomain sudo[167509]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kgigphyjlotdkdadcpmewokfdyvdkfzx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838432.7936468-705-159245971999974/AnsiballZ_command.py
Feb 23 09:20:33 np0005626463.localdomain sudo[167509]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:20:33 np0005626463.localdomain python3.9[167511]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtqemud.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 09:20:33 np0005626463.localdomain sudo[167509]: pam_unix(sudo:session): session closed for user root
Feb 23 09:20:34 np0005626463.localdomain sudo[167602]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cjnwboytssulvrzmshchvacaahcssgrz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838433.9313574-705-32596807020454/AnsiballZ_command.py
Feb 23 09:20:34 np0005626463.localdomain sudo[167602]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:20:34 np0005626463.localdomain python3.9[167604]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtsecretd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 09:20:35 np0005626463.localdomain sudo[167602]: pam_unix(sudo:session): session closed for user root
Feb 23 09:20:36 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25537 DF PROTO=TCP SPT=39442 DPT=9101 SEQ=990721497 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEC51860000000001030307) 
Feb 23 09:20:36 np0005626463.localdomain sshd[167620]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:20:36 np0005626463.localdomain sudo[167696]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hwjbxkgebdjrxwmkdqjwpcrvzrioaroq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838436.357801-705-80612854226059/AnsiballZ_command.py
Feb 23 09:20:36 np0005626463.localdomain sudo[167696]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:20:36 np0005626463.localdomain python3.9[167698]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtstoraged.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 09:20:36 np0005626463.localdomain sudo[167696]: pam_unix(sudo:session): session closed for user root
Feb 23 09:20:38 np0005626463.localdomain sudo[167790]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gayntjydcorpsfpxcudpqhslstbkguky ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838437.5876384-867-149512023535308/AnsiballZ_getent.py
Feb 23 09:20:38 np0005626463.localdomain sudo[167790]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:20:38 np0005626463.localdomain python3.9[167792]: ansible-ansible.builtin.getent Invoked with database=passwd key=libvirt fail_key=True service=None split=None
Feb 23 09:20:38 np0005626463.localdomain sudo[167790]: pam_unix(sudo:session): session closed for user root
Feb 23 09:20:38 np0005626463.localdomain sshd[167620]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:20:38 np0005626463.localdomain sudo[167883]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nlcxntberldypemnybjfwblhizpzikcz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838438.3977716-891-2401719900881/AnsiballZ_group.py
Feb 23 09:20:38 np0005626463.localdomain sudo[167883]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:20:38 np0005626463.localdomain python3.9[167885]: ansible-ansible.builtin.group Invoked with gid=42473 name=libvirt state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Feb 23 09:20:39 np0005626463.localdomain groupadd[167886]: group added to /etc/group: name=libvirt, GID=42473
Feb 23 09:20:39 np0005626463.localdomain groupadd[167886]: group added to /etc/gshadow: name=libvirt
Feb 23 09:20:39 np0005626463.localdomain groupadd[167886]: new group: name=libvirt, GID=42473
Feb 23 09:20:39 np0005626463.localdomain sudo[167883]: pam_unix(sudo:session): session closed for user root
Feb 23 09:20:39 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11812 DF PROTO=TCP SPT=51890 DPT=9882 SEQ=1220898106 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEC5DD30000000001030307) 
Feb 23 09:20:39 np0005626463.localdomain sshd[167938]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:20:39 np0005626463.localdomain sudo[167983]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rmbpnrwhqwmhggxhzvulxajhpagtnlvd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838439.284707-915-241241543072210/AnsiballZ_user.py
Feb 23 09:20:39 np0005626463.localdomain sudo[167983]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:20:39 np0005626463.localdomain python3.9[167985]: ansible-ansible.builtin.user Invoked with comment=libvirt user group=libvirt groups=[''] name=libvirt shell=/sbin/nologin state=present uid=42473 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005626463.localdomain update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Feb 23 09:20:40 np0005626463.localdomain useradd[167987]: new user: name=libvirt, UID=42473, GID=42473, home=/home/libvirt, shell=/sbin/nologin, from=/dev/pts/1
Feb 23 09:20:40 np0005626463.localdomain sudo[167983]: pam_unix(sudo:session): session closed for user root
Feb 23 09:20:40 np0005626463.localdomain sudo[168083]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-whauusblabzyfqaemhlvlsgfpzytpxit ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838440.553204-948-246939463704830/AnsiballZ_setup.py
Feb 23 09:20:40 np0005626463.localdomain sudo[168083]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:20:41 np0005626463.localdomain python3.9[168085]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 23 09:20:41 np0005626463.localdomain sudo[168083]: pam_unix(sudo:session): session closed for user root
Feb 23 09:20:41 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.
Feb 23 09:20:41 np0005626463.localdomain sudo[168146]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gcawovzkddbkdsrxaxmpyomijqxfamrz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838440.553204-948-246939463704830/AnsiballZ_dnf.py
Feb 23 09:20:41 np0005626463.localdomain sudo[168146]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:20:41 np0005626463.localdomain podman[168122]: 2026-02-23 09:20:41.913499081 +0000 UTC m=+0.081169869 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, container_name=ovn_controller, config_id=ovn_controller)
Feb 23 09:20:41 np0005626463.localdomain podman[168122]: 2026-02-23 09:20:41.951587621 +0000 UTC m=+0.119258469 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Feb 23 09:20:41 np0005626463.localdomain systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully.
Feb 23 09:20:42 np0005626463.localdomain python3.9[168152]: ansible-ansible.legacy.dnf Invoked with name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 23 09:20:42 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11814 DF PROTO=TCP SPT=51890 DPT=9882 SEQ=1220898106 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEC69C70000000001030307) 
Feb 23 09:20:45 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53617 DF PROTO=TCP SPT=43070 DPT=9100 SEQ=426322492 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEC74060000000001030307) 
Feb 23 09:20:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:20:48.520 163572 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 09:20:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:20:48.521 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 09:20:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:20:48.523 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 09:20:48 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25539 DF PROTO=TCP SPT=39442 DPT=9101 SEQ=990721497 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEC82060000000001030307) 
Feb 23 09:20:48 np0005626463.localdomain sshd[167938]: error: maximum authentication attempts exceeded for root from 108.87.129.34 port 47174 ssh2 [preauth]
Feb 23 09:20:48 np0005626463.localdomain sshd[167938]: Disconnecting authenticating user root 108.87.129.34 port 47174: Too many authentication failures [preauth]
Feb 23 09:20:48 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.
Feb 23 09:20:48 np0005626463.localdomain sshd[168242]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:20:48 np0005626463.localdomain podman[168231]: 2026-02-23 09:20:48.93708936 +0000 UTC m=+0.111240642 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, io.buildah.version=1.43.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Feb 23 09:20:48 np0005626463.localdomain podman[168231]: 2026-02-23 09:20:48.968656956 +0000 UTC m=+0.142808268 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, managed_by=edpm_ansible, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Feb 23 09:20:48 np0005626463.localdomain systemd[1]: 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully.
Feb 23 09:20:54 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17276 DF PROTO=TCP SPT=55316 DPT=9102 SEQ=1337298762 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEC974F0000000001030307) 
Feb 23 09:20:54 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11816 DF PROTO=TCP SPT=51890 DPT=9882 SEQ=1220898106 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEC9A070000000001030307) 
Feb 23 09:20:55 np0005626463.localdomain sshd[168255]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:20:55 np0005626463.localdomain sshd[168255]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:20:57 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17278 DF PROTO=TCP SPT=55316 DPT=9102 SEQ=1337298762 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BECA3470000000001030307) 
Feb 23 09:20:58 np0005626463.localdomain sshd[168242]: Connection closed by 108.87.129.34 port 47220 [preauth]
Feb 23 09:21:00 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43961 DF PROTO=TCP SPT=41620 DPT=9100 SEQ=1106125900 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BECB1C60000000001030307) 
Feb 23 09:21:02 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43962 DF PROTO=TCP SPT=41620 DPT=9100 SEQ=1106125900 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BECB9C60000000001030307) 
Feb 23 09:21:06 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8066 DF PROTO=TCP SPT=54296 DPT=9101 SEQ=3613526108 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BECC6C60000000001030307) 
Feb 23 09:21:08 np0005626463.localdomain kernel: SELinux:  Converting 2759 SID table entries...
Feb 23 09:21:08 np0005626463.localdomain kernel: SELinux:  Context system_u:object_r:insights_client_cache_t:s0 became invalid (unmapped).
Feb 23 09:21:08 np0005626463.localdomain kernel: SELinux:  policy capability network_peer_controls=1
Feb 23 09:21:08 np0005626463.localdomain kernel: SELinux:  policy capability open_perms=1
Feb 23 09:21:08 np0005626463.localdomain kernel: SELinux:  policy capability extended_socket_class=1
Feb 23 09:21:08 np0005626463.localdomain kernel: SELinux:  policy capability always_check_network=0
Feb 23 09:21:08 np0005626463.localdomain kernel: SELinux:  policy capability cgroup_seclabel=1
Feb 23 09:21:08 np0005626463.localdomain kernel: SELinux:  policy capability nnp_nosuid_transition=1
Feb 23 09:21:08 np0005626463.localdomain kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Feb 23 09:21:09 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56533 DF PROTO=TCP SPT=47668 DPT=9882 SEQ=948237871 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BECD3020000000001030307) 
Feb 23 09:21:12 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56535 DF PROTO=TCP SPT=47668 DPT=9882 SEQ=948237871 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BECDF060000000001030307) 
Feb 23 09:21:12 np0005626463.localdomain dbus-broker-launch[754]: avc:  op=load_policy lsm=selinux seqno=19 res=1
Feb 23 09:21:12 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.
Feb 23 09:21:12 np0005626463.localdomain systemd[1]: tmp-crun.jNkysz.mount: Deactivated successfully.
Feb 23 09:21:12 np0005626463.localdomain podman[169328]: 2026-02-23 09:21:12.929596836 +0000 UTC m=+0.094096593 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.43.0, container_name=ovn_controller, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Feb 23 09:21:12 np0005626463.localdomain podman[169328]: 2026-02-23 09:21:12.977298391 +0000 UTC m=+0.141798108 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20260216)
Feb 23 09:21:12 np0005626463.localdomain systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully.
Feb 23 09:21:13 np0005626463.localdomain sudo[169354]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 09:21:13 np0005626463.localdomain sudo[169354]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:21:13 np0005626463.localdomain sudo[169354]: pam_unix(sudo:session): session closed for user root
Feb 23 09:21:13 np0005626463.localdomain sudo[169372]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 23 09:21:13 np0005626463.localdomain sudo[169372]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:21:14 np0005626463.localdomain sudo[169372]: pam_unix(sudo:session): session closed for user root
Feb 23 09:21:15 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43964 DF PROTO=TCP SPT=41620 DPT=9100 SEQ=1106125900 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BECEA060000000001030307) 
Feb 23 09:21:15 np0005626463.localdomain sshd[169422]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:21:16 np0005626463.localdomain sshd[169422]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:21:17 np0005626463.localdomain sudo[169424]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 09:21:17 np0005626463.localdomain sudo[169424]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:21:17 np0005626463.localdomain sudo[169424]: pam_unix(sudo:session): session closed for user root
Feb 23 09:21:18 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8068 DF PROTO=TCP SPT=54296 DPT=9101 SEQ=3613526108 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BECF6060000000001030307) 
Feb 23 09:21:18 np0005626463.localdomain kernel: SELinux:  Converting 2762 SID table entries...
Feb 23 09:21:19 np0005626463.localdomain kernel: SELinux:  policy capability network_peer_controls=1
Feb 23 09:21:19 np0005626463.localdomain kernel: SELinux:  policy capability open_perms=1
Feb 23 09:21:19 np0005626463.localdomain kernel: SELinux:  policy capability extended_socket_class=1
Feb 23 09:21:19 np0005626463.localdomain kernel: SELinux:  policy capability always_check_network=0
Feb 23 09:21:19 np0005626463.localdomain kernel: SELinux:  policy capability cgroup_seclabel=1
Feb 23 09:21:19 np0005626463.localdomain kernel: SELinux:  policy capability nnp_nosuid_transition=1
Feb 23 09:21:19 np0005626463.localdomain kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Feb 23 09:21:19 np0005626463.localdomain dbus-broker-launch[754]: avc:  op=load_policy lsm=selinux seqno=20 res=1
Feb 23 09:21:19 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.
Feb 23 09:21:19 np0005626463.localdomain systemd[1]: tmp-crun.r0HF94.mount: Deactivated successfully.
Feb 23 09:21:19 np0005626463.localdomain podman[169449]: 2026-02-23 09:21:19.955531714 +0000 UTC m=+0.111532861 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, io.buildah.version=1.43.0, managed_by=edpm_ansible, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true)
Feb 23 09:21:19 np0005626463.localdomain podman[169449]: 2026-02-23 09:21:19.985443062 +0000 UTC m=+0.141444129 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.build-date=20260216, managed_by=edpm_ansible, tcib_managed=true)
Feb 23 09:21:19 np0005626463.localdomain systemd[1]: 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully.
Feb 23 09:21:24 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25007 DF PROTO=TCP SPT=48294 DPT=9102 SEQ=1925478040 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BED0C7F0000000001030307) 
Feb 23 09:21:24 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56537 DF PROTO=TCP SPT=47668 DPT=9882 SEQ=948237871 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BED10060000000001030307) 
Feb 23 09:21:27 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25009 DF PROTO=TCP SPT=48294 DPT=9102 SEQ=1925478040 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BED18860000000001030307) 
Feb 23 09:21:29 np0005626463.localdomain kernel: SELinux:  Converting 2765 SID table entries...
Feb 23 09:21:29 np0005626463.localdomain kernel: SELinux:  policy capability network_peer_controls=1
Feb 23 09:21:29 np0005626463.localdomain kernel: SELinux:  policy capability open_perms=1
Feb 23 09:21:29 np0005626463.localdomain kernel: SELinux:  policy capability extended_socket_class=1
Feb 23 09:21:29 np0005626463.localdomain kernel: SELinux:  policy capability always_check_network=0
Feb 23 09:21:29 np0005626463.localdomain kernel: SELinux:  policy capability cgroup_seclabel=1
Feb 23 09:21:29 np0005626463.localdomain kernel: SELinux:  policy capability nnp_nosuid_transition=1
Feb 23 09:21:29 np0005626463.localdomain kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Feb 23 09:21:30 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24876 DF PROTO=TCP SPT=43368 DPT=9100 SEQ=4035075529 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BED27060000000001030307) 
Feb 23 09:21:32 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24877 DF PROTO=TCP SPT=43368 DPT=9100 SEQ=4035075529 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BED2F060000000001030307) 
Feb 23 09:21:33 np0005626463.localdomain sshd[169480]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:21:33 np0005626463.localdomain sshd[169480]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:21:36 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3862 DF PROTO=TCP SPT=59578 DPT=9101 SEQ=3952865216 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BED3BC60000000001030307) 
Feb 23 09:21:37 np0005626463.localdomain kernel: SELinux:  Converting 2765 SID table entries...
Feb 23 09:21:37 np0005626463.localdomain kernel: SELinux:  policy capability network_peer_controls=1
Feb 23 09:21:37 np0005626463.localdomain kernel: SELinux:  policy capability open_perms=1
Feb 23 09:21:37 np0005626463.localdomain kernel: SELinux:  policy capability extended_socket_class=1
Feb 23 09:21:37 np0005626463.localdomain kernel: SELinux:  policy capability always_check_network=0
Feb 23 09:21:37 np0005626463.localdomain kernel: SELinux:  policy capability cgroup_seclabel=1
Feb 23 09:21:37 np0005626463.localdomain kernel: SELinux:  policy capability nnp_nosuid_transition=1
Feb 23 09:21:37 np0005626463.localdomain kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Feb 23 09:21:38 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 09:21:38 np0005626463.localdomain dbus-broker-launch[754]: avc:  op=load_policy lsm=selinux seqno=22 res=1
Feb 23 09:21:38 np0005626463.localdomain systemd-rc-local-generator[169509]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 09:21:38 np0005626463.localdomain systemd-sysv-generator[169518]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 09:21:38 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 09:21:38 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 09:21:38 np0005626463.localdomain systemd-rc-local-generator[169552]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 09:21:38 np0005626463.localdomain systemd-sysv-generator[169555]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 09:21:38 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 09:21:39 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25011 DF PROTO=TCP SPT=48294 DPT=9102 SEQ=1925478040 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BED48060000000001030307) 
Feb 23 09:21:42 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16364 DF PROTO=TCP SPT=34588 DPT=9882 SEQ=3469396630 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BED54460000000001030307) 
Feb 23 09:21:43 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.
Feb 23 09:21:43 np0005626463.localdomain podman[169570]: 2026-02-23 09:21:43.933144033 +0000 UTC m=+0.096526864 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller)
Feb 23 09:21:44 np0005626463.localdomain podman[169570]: 2026-02-23 09:21:44.014200787 +0000 UTC m=+0.177583618 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb)
Feb 23 09:21:44 np0005626463.localdomain systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully.
Feb 23 09:21:44 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22553 DF PROTO=TCP SPT=39002 DPT=9105 SEQ=3661566410 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BED5E060000000001030307) 
Feb 23 09:21:47 np0005626463.localdomain kernel: SELinux:  Converting 2766 SID table entries...
Feb 23 09:21:47 np0005626463.localdomain kernel: SELinux:  policy capability network_peer_controls=1
Feb 23 09:21:47 np0005626463.localdomain kernel: SELinux:  policy capability open_perms=1
Feb 23 09:21:47 np0005626463.localdomain kernel: SELinux:  policy capability extended_socket_class=1
Feb 23 09:21:47 np0005626463.localdomain kernel: SELinux:  policy capability always_check_network=0
Feb 23 09:21:47 np0005626463.localdomain kernel: SELinux:  policy capability cgroup_seclabel=1
Feb 23 09:21:47 np0005626463.localdomain kernel: SELinux:  policy capability nnp_nosuid_transition=1
Feb 23 09:21:47 np0005626463.localdomain kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Feb 23 09:21:48 np0005626463.localdomain groupadd[169605]: group added to /etc/group: name=clevis, GID=985
Feb 23 09:21:48 np0005626463.localdomain groupadd[169605]: group added to /etc/gshadow: name=clevis
Feb 23 09:21:48 np0005626463.localdomain groupadd[169605]: new group: name=clevis, GID=985
Feb 23 09:21:48 np0005626463.localdomain useradd[169612]: new user: name=clevis, UID=985, GID=985, home=/var/cache/clevis, shell=/usr/sbin/nologin, from=none
Feb 23 09:21:48 np0005626463.localdomain usermod[169622]: add 'clevis' to group 'tss'
Feb 23 09:21:48 np0005626463.localdomain usermod[169622]: add 'clevis' to shadow group 'tss'
Feb 23 09:21:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:21:48.522 163572 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 09:21:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:21:48.523 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 09:21:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:21:48.525 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 09:21:48 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3864 DF PROTO=TCP SPT=59578 DPT=9101 SEQ=3952865216 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BED6C060000000001030307) 
Feb 23 09:21:49 np0005626463.localdomain sshd[169643]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:21:50 np0005626463.localdomain sshd[169643]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:21:50 np0005626463.localdomain dbus-broker-launch[754]: avc:  op=load_policy lsm=selinux seqno=23 res=1
Feb 23 09:21:50 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.
Feb 23 09:21:50 np0005626463.localdomain podman[169647]: 2026-02-23 09:21:50.959545125 +0000 UTC m=+0.147809282 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Feb 23 09:21:50 np0005626463.localdomain podman[169647]: 2026-02-23 09:21:50.993383093 +0000 UTC m=+0.181647250 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, io.buildah.version=1.43.0, org.label-schema.build-date=20260216)
Feb 23 09:21:51 np0005626463.localdomain systemd[1]: 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully.
Feb 23 09:21:51 np0005626463.localdomain groupadd[169668]: group added to /etc/group: name=dnsmasq, GID=984
Feb 23 09:21:51 np0005626463.localdomain groupadd[169668]: group added to /etc/gshadow: name=dnsmasq
Feb 23 09:21:51 np0005626463.localdomain groupadd[169668]: new group: name=dnsmasq, GID=984
Feb 23 09:21:51 np0005626463.localdomain useradd[169675]: new user: name=dnsmasq, UID=984, GID=984, home=/var/lib/dnsmasq, shell=/usr/sbin/nologin, from=none
Feb 23 09:21:51 np0005626463.localdomain dbus-broker-launch[750]: Noticed file-system modification, trigger reload.
Feb 23 09:21:51 np0005626463.localdomain dbus-broker-launch[750]: Noticed file-system modification, trigger reload.
Feb 23 09:21:54 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10718 DF PROTO=TCP SPT=59078 DPT=9102 SEQ=2991786459 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BED81B00000000001030307) 
Feb 23 09:21:54 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16366 DF PROTO=TCP SPT=34588 DPT=9882 SEQ=3469396630 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BED84060000000001030307) 
Feb 23 09:21:57 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10720 DF PROTO=TCP SPT=59078 DPT=9102 SEQ=2991786459 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BED8DC60000000001030307) 
Feb 23 09:22:00 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2967 DF PROTO=TCP SPT=59764 DPT=9100 SEQ=3098478177 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BED9C460000000001030307) 
Feb 23 09:22:02 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2968 DF PROTO=TCP SPT=59764 DPT=9100 SEQ=3098478177 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEDA4460000000001030307) 
Feb 23 09:22:06 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10913 DF PROTO=TCP SPT=33190 DPT=9101 SEQ=2169928675 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEDB1060000000001030307) 
Feb 23 09:22:09 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11696 DF PROTO=TCP SPT=50478 DPT=9882 SEQ=3211331624 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEDBD630000000001030307) 
Feb 23 09:22:11 np0005626463.localdomain sshd[174582]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:22:11 np0005626463.localdomain sshd[174582]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:22:12 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11698 DF PROTO=TCP SPT=50478 DPT=9882 SEQ=3211331624 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEDC9860000000001030307) 
Feb 23 09:22:14 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.
Feb 23 09:22:14 np0005626463.localdomain podman[177260]: 2026-02-23 09:22:14.929427063 +0000 UTC m=+0.095252929 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, container_name=ovn_controller, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260216, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Feb 23 09:22:14 np0005626463.localdomain podman[177260]: 2026-02-23 09:22:14.994733321 +0000 UTC m=+0.160559177 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20260216, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, managed_by=edpm_ansible)
Feb 23 09:22:15 np0005626463.localdomain systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully.
Feb 23 09:22:15 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23919 DF PROTO=TCP SPT=59430 DPT=9105 SEQ=2478478734 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEDD4070000000001030307) 
Feb 23 09:22:17 np0005626463.localdomain sudo[178899]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 09:22:17 np0005626463.localdomain sudo[178899]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:22:17 np0005626463.localdomain sudo[178899]: pam_unix(sudo:session): session closed for user root
Feb 23 09:22:17 np0005626463.localdomain sudo[178987]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 check-host
Feb 23 09:22:17 np0005626463.localdomain sudo[178987]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:22:17 np0005626463.localdomain sudo[178987]: pam_unix(sudo:session): session closed for user root
Feb 23 09:22:18 np0005626463.localdomain sudo[179435]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 09:22:18 np0005626463.localdomain sudo[179435]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:22:18 np0005626463.localdomain sudo[179435]: pam_unix(sudo:session): session closed for user root
Feb 23 09:22:18 np0005626463.localdomain sudo[179508]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 23 09:22:18 np0005626463.localdomain sudo[179508]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:22:18 np0005626463.localdomain sudo[179508]: pam_unix(sudo:session): session closed for user root
Feb 23 09:22:18 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10915 DF PROTO=TCP SPT=33190 DPT=9101 SEQ=2169928675 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEDE2060000000001030307) 
Feb 23 09:22:19 np0005626463.localdomain sudo[180445]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 09:22:19 np0005626463.localdomain sudo[180445]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:22:19 np0005626463.localdomain sudo[180445]: pam_unix(sudo:session): session closed for user root
Feb 23 09:22:21 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.
Feb 23 09:22:21 np0005626463.localdomain podman[182164]: 2026-02-23 09:22:21.909821323 +0000 UTC m=+0.081858571 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team)
Feb 23 09:22:21 np0005626463.localdomain podman[182164]: 2026-02-23 09:22:21.941863924 +0000 UTC m=+0.113901192 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.43.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent)
Feb 23 09:22:22 np0005626463.localdomain systemd[1]: 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully.
Feb 23 09:22:24 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=65527 DF PROTO=TCP SPT=54138 DPT=9102 SEQ=3681896413 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEDF6E10000000001030307) 
Feb 23 09:22:24 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11700 DF PROTO=TCP SPT=50478 DPT=9882 SEQ=3211331624 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEDFA070000000001030307) 
Feb 23 09:22:27 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=65529 DF PROTO=TCP SPT=54138 DPT=9102 SEQ=3681896413 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEE03060000000001030307) 
Feb 23 09:22:28 np0005626463.localdomain sshd[186755]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:22:29 np0005626463.localdomain polkitd[1033]: Reloading rules
Feb 23 09:22:29 np0005626463.localdomain polkitd[1033]: Collecting garbage unconditionally...
Feb 23 09:22:29 np0005626463.localdomain polkitd[1033]: Loading rules from directory /etc/polkit-1/rules.d
Feb 23 09:22:29 np0005626463.localdomain polkitd[1033]: Loading rules from directory /usr/share/polkit-1/rules.d
Feb 23 09:22:29 np0005626463.localdomain polkitd[1033]: Finished loading, compiling and executing 5 rules
Feb 23 09:22:29 np0005626463.localdomain polkitd[1033]: Reloading rules
Feb 23 09:22:29 np0005626463.localdomain polkitd[1033]: Collecting garbage unconditionally...
Feb 23 09:22:29 np0005626463.localdomain polkitd[1033]: Loading rules from directory /etc/polkit-1/rules.d
Feb 23 09:22:29 np0005626463.localdomain polkitd[1033]: Loading rules from directory /usr/share/polkit-1/rules.d
Feb 23 09:22:29 np0005626463.localdomain polkitd[1033]: Finished loading, compiling and executing 5 rules
Feb 23 09:22:29 np0005626463.localdomain sshd[186755]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:22:30 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8466 DF PROTO=TCP SPT=39038 DPT=9100 SEQ=1332103946 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEE11460000000001030307) 
Feb 23 09:22:31 np0005626463.localdomain groupadd[186991]: group added to /etc/group: name=ceph, GID=167
Feb 23 09:22:31 np0005626463.localdomain groupadd[186991]: group added to /etc/gshadow: name=ceph
Feb 23 09:22:31 np0005626463.localdomain groupadd[186991]: new group: name=ceph, GID=167
Feb 23 09:22:31 np0005626463.localdomain useradd[186997]: new user: name=ceph, UID=167, GID=167, home=/var/lib/ceph, shell=/sbin/nologin, from=none
Feb 23 09:22:32 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8467 DF PROTO=TCP SPT=39038 DPT=9100 SEQ=1332103946 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEE19460000000001030307) 
Feb 23 09:22:34 np0005626463.localdomain systemd[1]: Stopping OpenSSH server daemon...
Feb 23 09:22:34 np0005626463.localdomain sshd[122114]: Received signal 15; terminating.
Feb 23 09:22:34 np0005626463.localdomain systemd[1]: sshd.service: Deactivated successfully.
Feb 23 09:22:34 np0005626463.localdomain systemd[1]: Stopped OpenSSH server daemon.
Feb 23 09:22:34 np0005626463.localdomain systemd[1]: sshd.service: Consumed 2.617s CPU time, read 32.0K from disk, written 0B to disk.
Feb 23 09:22:34 np0005626463.localdomain systemd[1]: Stopped target sshd-keygen.target.
Feb 23 09:22:34 np0005626463.localdomain systemd[1]: Stopping sshd-keygen.target...
Feb 23 09:22:34 np0005626463.localdomain systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Feb 23 09:22:34 np0005626463.localdomain systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Feb 23 09:22:34 np0005626463.localdomain systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Feb 23 09:22:34 np0005626463.localdomain systemd[1]: Reached target sshd-keygen.target.
Feb 23 09:22:34 np0005626463.localdomain systemd[1]: Starting OpenSSH server daemon...
Feb 23 09:22:34 np0005626463.localdomain sshd[187670]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:22:34 np0005626463.localdomain sshd[187670]: Server listening on 0.0.0.0 port 22.
Feb 23 09:22:34 np0005626463.localdomain sshd[187670]: Server listening on :: port 22.
Feb 23 09:22:34 np0005626463.localdomain systemd[1]: Started OpenSSH server daemon.
Feb 23 09:22:34 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 23 09:22:34 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/libvirtd.service:29: Failed to parse service type, ignoring: notify-reload
Feb 23 09:22:34 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 23 09:22:34 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:22:35 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:22:35 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:22:35 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:22:35 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:22:35 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:22:35 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 23 09:22:35 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:22:35 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:22:35 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:22:35 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:22:36 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:22:36 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:22:36 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55417 DF PROTO=TCP SPT=48002 DPT=9101 SEQ=1767193969 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEE26470000000001030307) 
Feb 23 09:22:36 np0005626463.localdomain systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 23 09:22:36 np0005626463.localdomain systemd[1]: Starting man-db-cache-update.service...
Feb 23 09:22:36 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 09:22:36 np0005626463.localdomain systemd-rc-local-generator[188133]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 09:22:36 np0005626463.localdomain systemd-sysv-generator[188139]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 09:22:36 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 23 09:22:36 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/libvirtd.service:29: Failed to parse service type, ignoring: notify-reload
Feb 23 09:22:36 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 09:22:37 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 23 09:22:37 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:22:37 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:22:37 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:22:37 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:22:37 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:22:37 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:22:37 np0005626463.localdomain systemd[1]: Queuing reload/restart jobs for marked units…
Feb 23 09:22:37 np0005626463.localdomain systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 23 09:22:39 np0005626463.localdomain sudo[168146]: pam_unix(sudo:session): session closed for user root
Feb 23 09:22:39 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21951 DF PROTO=TCP SPT=38008 DPT=9882 SEQ=4285250185 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEE32930000000001030307) 
Feb 23 09:22:40 np0005626463.localdomain sudo[192566]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mqnfldgetkyiobrhbxhxgkcqmybndrai ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838560.0636497-984-137075416405201/AnsiballZ_systemd.py
Feb 23 09:22:40 np0005626463.localdomain sudo[192566]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:22:40 np0005626463.localdomain python3.9[192593]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Feb 23 09:22:40 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 09:22:41 np0005626463.localdomain systemd-sysv-generator[192910]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 09:22:41 np0005626463.localdomain systemd-rc-local-generator[192906]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 09:22:41 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 23 09:22:41 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 09:22:41 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 23 09:22:41 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:22:41 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:22:41 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:22:41 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:22:41 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:22:41 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:22:41 np0005626463.localdomain sudo[192566]: pam_unix(sudo:session): session closed for user root
Feb 23 09:22:42 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21953 DF PROTO=TCP SPT=38008 DPT=9882 SEQ=4285250185 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEE3E860000000001030307) 
Feb 23 09:22:42 np0005626463.localdomain sudo[194051]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oujiohljepekfsuckrnenexrerybgzib ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838561.5335593-984-153239883074541/AnsiballZ_systemd.py
Feb 23 09:22:42 np0005626463.localdomain sudo[194051]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:22:42 np0005626463.localdomain python3.9[194073]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Feb 23 09:22:43 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 09:22:43 np0005626463.localdomain systemd-rc-local-generator[194165]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 09:22:43 np0005626463.localdomain systemd-sysv-generator[194171]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 09:22:43 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 23 09:22:43 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 09:22:43 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 23 09:22:43 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:22:43 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:22:43 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:22:43 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:22:43 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:22:43 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:22:43 np0005626463.localdomain sudo[194051]: pam_unix(sudo:session): session closed for user root
Feb 23 09:22:43 np0005626463.localdomain sudo[194542]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oxliksklaumwfsslidyjyjwatibjworh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838563.4916558-984-264650159302176/AnsiballZ_systemd.py
Feb 23 09:22:43 np0005626463.localdomain sudo[194542]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:22:44 np0005626463.localdomain python3.9[194559]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tls.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Feb 23 09:22:45 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.
Feb 23 09:22:45 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 09:22:45 np0005626463.localdomain podman[195150]: 2026-02-23 09:22:45.269435214 +0000 UTC m=+0.110032897 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.build-date=20260216, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, io.buildah.version=1.43.0, container_name=ovn_controller, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true)
Feb 23 09:22:45 np0005626463.localdomain systemd-rc-local-generator[195248]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 09:22:45 np0005626463.localdomain systemd-sysv-generator[195254]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 09:22:45 np0005626463.localdomain podman[195150]: 2026-02-23 09:22:45.336637348 +0000 UTC m=+0.177235031 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Feb 23 09:22:45 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 23 09:22:45 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 09:22:45 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 23 09:22:45 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:22:45 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:22:45 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:22:45 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:22:45 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:22:45 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:22:45 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8469 DF PROTO=TCP SPT=39038 DPT=9100 SEQ=1332103946 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEE4A060000000001030307) 
Feb 23 09:22:45 np0005626463.localdomain systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully.
Feb 23 09:22:45 np0005626463.localdomain sudo[194542]: pam_unix(sudo:session): session closed for user root
Feb 23 09:22:45 np0005626463.localdomain sudo[195644]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yvblskgfwnhoibqrspptjynhueysdwoq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838565.6724248-984-200936177237969/AnsiballZ_systemd.py
Feb 23 09:22:45 np0005626463.localdomain sudo[195644]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:22:46 np0005626463.localdomain python3.9[195663]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=virtproxyd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Feb 23 09:22:46 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 09:22:46 np0005626463.localdomain systemd-rc-local-generator[195929]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 09:22:46 np0005626463.localdomain systemd-sysv-generator[195935]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 09:22:46 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 23 09:22:46 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 09:22:46 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 23 09:22:46 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:22:46 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:22:46 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:22:46 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:22:46 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:22:46 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:22:46 np0005626463.localdomain sudo[195644]: pam_unix(sudo:session): session closed for user root
Feb 23 09:22:47 np0005626463.localdomain sudo[196341]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lzvsfnvbonjneqnhfmvrjfepzcyxbnvy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838566.9473228-1071-119656055084945/AnsiballZ_systemd.py
Feb 23 09:22:47 np0005626463.localdomain sudo[196341]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:22:47 np0005626463.localdomain python3.9[196360]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 23 09:22:47 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 09:22:47 np0005626463.localdomain systemd-rc-local-generator[196588]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 09:22:47 np0005626463.localdomain systemd-sysv-generator[196593]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 09:22:47 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 23 09:22:47 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 09:22:47 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 23 09:22:47 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:22:47 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:22:47 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:22:47 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:22:47 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:22:47 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:22:47 np0005626463.localdomain sudo[196341]: pam_unix(sudo:session): session closed for user root
Feb 23 09:22:48 np0005626463.localdomain sudo[196975]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yhkgrlevhvrxoxqsvonwreatofuztxmi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838568.0773714-1071-109090604144179/AnsiballZ_systemd.py
Feb 23 09:22:48 np0005626463.localdomain sudo[196975]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:22:48 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55419 DF PROTO=TCP SPT=48002 DPT=9101 SEQ=1767193969 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEE56060000000001030307) 
Feb 23 09:22:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:22:48.523 163572 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 09:22:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:22:48.524 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 09:22:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:22:48.526 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 09:22:48 np0005626463.localdomain python3.9[196997]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 23 09:22:49 np0005626463.localdomain sshd[197368]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:22:49 np0005626463.localdomain systemd[1]: man-db-cache-update.service: Deactivated successfully.
Feb 23 09:22:49 np0005626463.localdomain systemd[1]: Finished man-db-cache-update.service.
Feb 23 09:22:49 np0005626463.localdomain systemd[1]: man-db-cache-update.service: Consumed 15.645s CPU time.
Feb 23 09:22:49 np0005626463.localdomain systemd[1]: run-ra83127cecaa34d288cf51053ffd142d5.service: Deactivated successfully.
Feb 23 09:22:49 np0005626463.localdomain sshd[197368]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:22:49 np0005626463.localdomain systemd[1]: run-rd1915ba4ef78472e8e99bcebfa8e98ea.service: Deactivated successfully.
Feb 23 09:22:49 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 09:22:49 np0005626463.localdomain systemd-sysv-generator[197513]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 09:22:49 np0005626463.localdomain systemd-rc-local-generator[197508]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 09:22:49 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 23 09:22:49 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:22:49 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 09:22:49 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 23 09:22:49 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:22:49 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:22:49 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:22:49 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:22:49 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:22:50 np0005626463.localdomain sudo[196975]: pam_unix(sudo:session): session closed for user root
Feb 23 09:22:50 np0005626463.localdomain sudo[197630]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nuhwhfqgdueicatnemfmkyhndbtlwdjh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838570.205324-1071-148171829051356/AnsiballZ_systemd.py
Feb 23 09:22:50 np0005626463.localdomain sudo[197630]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:22:50 np0005626463.localdomain python3.9[197632]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 23 09:22:51 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 09:22:51 np0005626463.localdomain systemd-rc-local-generator[197656]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 09:22:51 np0005626463.localdomain systemd-sysv-generator[197661]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 09:22:52 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 23 09:22:52 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:22:52 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:22:52 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 09:22:52 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 23 09:22:52 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:22:52 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:22:52 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:22:52 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:22:52 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.
Feb 23 09:22:52 np0005626463.localdomain sudo[197630]: pam_unix(sudo:session): session closed for user root
Feb 23 09:22:52 np0005626463.localdomain podman[197672]: 2026-02-23 09:22:52.30053506 +0000 UTC m=+0.087245901 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0)
Feb 23 09:22:52 np0005626463.localdomain podman[197672]: 2026-02-23 09:22:52.310167372 +0000 UTC m=+0.096878183 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb)
Feb 23 09:22:52 np0005626463.localdomain systemd[1]: 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully.
Feb 23 09:22:53 np0005626463.localdomain sudo[197798]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vxnfwezvyqllcpagwqqygqrsmswpblch ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838573.0669847-1071-201019455365083/AnsiballZ_systemd.py
Feb 23 09:22:53 np0005626463.localdomain sudo[197798]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:22:53 np0005626463.localdomain python3.9[197800]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 23 09:22:53 np0005626463.localdomain sudo[197798]: pam_unix(sudo:session): session closed for user root
Feb 23 09:22:54 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60293 DF PROTO=TCP SPT=36928 DPT=9102 SEQ=261598781 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEE6C110000000001030307) 
Feb 23 09:22:54 np0005626463.localdomain sudo[197911]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-borfflytwughjylhdvfdorpbsqyqfxfy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838573.8819318-1071-203143611644949/AnsiballZ_systemd.py
Feb 23 09:22:54 np0005626463.localdomain sudo[197911]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:22:54 np0005626463.localdomain python3.9[197913]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 23 09:22:54 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 09:22:54 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21955 DF PROTO=TCP SPT=38008 DPT=9882 SEQ=4285250185 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEE6E060000000001030307) 
Feb 23 09:22:54 np0005626463.localdomain systemd-sysv-generator[197944]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 09:22:54 np0005626463.localdomain systemd-rc-local-generator[197937]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 09:22:54 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:22:54 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 23 09:22:54 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:22:54 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:22:54 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 09:22:54 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 23 09:22:54 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:22:54 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:22:54 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:22:54 np0005626463.localdomain sudo[197911]: pam_unix(sudo:session): session closed for user root
Feb 23 09:22:56 np0005626463.localdomain sudo[198059]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bkyyvzrbynwziqsfotqbqlkfiqpuqlwy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838576.2788455-1179-108239840041162/AnsiballZ_systemd.py
Feb 23 09:22:56 np0005626463.localdomain sudo[198059]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:22:56 np0005626463.localdomain python3.9[198061]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=virtproxyd-tls.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Feb 23 09:22:56 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 09:22:57 np0005626463.localdomain systemd-rc-local-generator[198091]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 09:22:57 np0005626463.localdomain systemd-sysv-generator[198094]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 09:22:57 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:22:57 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 23 09:22:57 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:22:57 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:22:57 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 09:22:57 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 23 09:22:57 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:22:57 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:22:57 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:22:57 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60295 DF PROTO=TCP SPT=36928 DPT=9102 SEQ=261598781 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEE78060000000001030307) 
Feb 23 09:22:57 np0005626463.localdomain sudo[198059]: pam_unix(sudo:session): session closed for user root
Feb 23 09:22:57 np0005626463.localdomain sudo[198208]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hdhoyfqfqyutxeapphyvubvamdhjgjdw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838577.4803932-1203-235430116128589/AnsiballZ_systemd.py
Feb 23 09:22:57 np0005626463.localdomain sudo[198208]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:22:58 np0005626463.localdomain python3.9[198210]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 23 09:22:58 np0005626463.localdomain sudo[198208]: pam_unix(sudo:session): session closed for user root
Feb 23 09:22:58 np0005626463.localdomain sudo[198321]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ezbgypopbwkianqeaiptusbgzvjcwubt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838578.291335-1203-280839928600759/AnsiballZ_systemd.py
Feb 23 09:22:58 np0005626463.localdomain sudo[198321]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:22:59 np0005626463.localdomain python3.9[198323]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 23 09:23:00 np0005626463.localdomain sudo[198321]: pam_unix(sudo:session): session closed for user root
Feb 23 09:23:00 np0005626463.localdomain sudo[198434]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zfjnrzmmofpoaoewdcitvnxnyfdlkkse ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838580.5016675-1203-229264806743630/AnsiballZ_systemd.py
Feb 23 09:23:00 np0005626463.localdomain sudo[198434]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:23:00 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49530 DF PROTO=TCP SPT=47062 DPT=9100 SEQ=2074974366 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEE86860000000001030307) 
Feb 23 09:23:01 np0005626463.localdomain python3.9[198436]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 23 09:23:01 np0005626463.localdomain sudo[198434]: pam_unix(sudo:session): session closed for user root
Feb 23 09:23:01 np0005626463.localdomain sudo[198547]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rljmahbzryktyjvxswerkldsjednrupq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838581.2522345-1203-190172079570340/AnsiballZ_systemd.py
Feb 23 09:23:01 np0005626463.localdomain sudo[198547]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:23:01 np0005626463.localdomain python3.9[198549]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 23 09:23:01 np0005626463.localdomain sudo[198547]: pam_unix(sudo:session): session closed for user root
Feb 23 09:23:02 np0005626463.localdomain sudo[198660]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mixkcoiajemdxikrofnskshkdpcccjwb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838582.0439084-1203-186106937206341/AnsiballZ_systemd.py
Feb 23 09:23:02 np0005626463.localdomain sudo[198660]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:23:02 np0005626463.localdomain python3.9[198662]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 23 09:23:02 np0005626463.localdomain sudo[198660]: pam_unix(sudo:session): session closed for user root
Feb 23 09:23:02 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49531 DF PROTO=TCP SPT=47062 DPT=9100 SEQ=2074974366 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEE8E860000000001030307) 
Feb 23 09:23:03 np0005626463.localdomain sudo[198773]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nkivtnyvdjkjlhswyhfchftshfyprhia ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838582.9047668-1203-252180012711532/AnsiballZ_systemd.py
Feb 23 09:23:03 np0005626463.localdomain sudo[198773]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:23:03 np0005626463.localdomain python3.9[198775]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 23 09:23:04 np0005626463.localdomain sudo[198773]: pam_unix(sudo:session): session closed for user root
Feb 23 09:23:04 np0005626463.localdomain sudo[198886]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zxfnidoxhpsoosaopmadfmonuazgnchl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838584.7121768-1203-141483190153747/AnsiballZ_systemd.py
Feb 23 09:23:04 np0005626463.localdomain sudo[198886]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:23:05 np0005626463.localdomain python3.9[198888]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 23 09:23:05 np0005626463.localdomain sudo[198886]: pam_unix(sudo:session): session closed for user root
Feb 23 09:23:06 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52918 DF PROTO=TCP SPT=52242 DPT=9101 SEQ=5356550 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEE9B860000000001030307) 
Feb 23 09:23:06 np0005626463.localdomain sudo[198999]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nhvplmvbnvpotcjawchlgmqjxusjsbwe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838585.537214-1203-185977789921871/AnsiballZ_systemd.py
Feb 23 09:23:06 np0005626463.localdomain sudo[198999]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:23:06 np0005626463.localdomain sshd[199002]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:23:06 np0005626463.localdomain python3.9[199001]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 23 09:23:06 np0005626463.localdomain sudo[198999]: pam_unix(sudo:session): session closed for user root
Feb 23 09:23:07 np0005626463.localdomain sshd[199002]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:23:07 np0005626463.localdomain sudo[199114]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lqrkjtvijrqlxhmqbdbhyeygzjnxtmzt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838586.9143798-1203-170825233922924/AnsiballZ_systemd.py
Feb 23 09:23:07 np0005626463.localdomain sudo[199114]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:23:07 np0005626463.localdomain python3.9[199116]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 23 09:23:07 np0005626463.localdomain sudo[199114]: pam_unix(sudo:session): session closed for user root
Feb 23 09:23:08 np0005626463.localdomain sudo[199227]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hncugxlgixpihuyebhfwjyvlhayrngwf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838587.6903415-1203-46783827981033/AnsiballZ_systemd.py
Feb 23 09:23:08 np0005626463.localdomain sudo[199227]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:23:08 np0005626463.localdomain python3.9[199229]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 23 09:23:09 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18811 DF PROTO=TCP SPT=56594 DPT=9882 SEQ=3746675600 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEEA7C30000000001030307) 
Feb 23 09:23:09 np0005626463.localdomain sudo[199227]: pam_unix(sudo:session): session closed for user root
Feb 23 09:23:09 np0005626463.localdomain sudo[199340]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hxjagiiojmxegzvnuhcsymapanwiayck ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838589.643889-1203-154652728463842/AnsiballZ_systemd.py
Feb 23 09:23:09 np0005626463.localdomain sudo[199340]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:23:10 np0005626463.localdomain python3.9[199342]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 23 09:23:11 np0005626463.localdomain sudo[199340]: pam_unix(sudo:session): session closed for user root
Feb 23 09:23:11 np0005626463.localdomain sudo[199453]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-akfanzxhhsfevyghqbelcgvfejyozpnk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838591.4435484-1203-14979983224909/AnsiballZ_systemd.py
Feb 23 09:23:11 np0005626463.localdomain sudo[199453]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:23:12 np0005626463.localdomain python3.9[199455]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 23 09:23:12 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18813 DF PROTO=TCP SPT=56594 DPT=9882 SEQ=3746675600 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEEB3C60000000001030307) 
Feb 23 09:23:13 np0005626463.localdomain sudo[199453]: pam_unix(sudo:session): session closed for user root
Feb 23 09:23:13 np0005626463.localdomain sudo[199566]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pyxqefcdxdjpjzigkjgilnfmjiomrzoh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838593.2352479-1203-117659910202967/AnsiballZ_systemd.py
Feb 23 09:23:13 np0005626463.localdomain sudo[199566]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:23:13 np0005626463.localdomain python3.9[199568]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 23 09:23:14 np0005626463.localdomain sudo[199566]: pam_unix(sudo:session): session closed for user root
Feb 23 09:23:15 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49533 DF PROTO=TCP SPT=47062 DPT=9100 SEQ=2074974366 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEEBE060000000001030307) 
Feb 23 09:23:15 np0005626463.localdomain sudo[199679]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-plxzmmtedmggbmhzjrdagtxjdqpegovz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838595.0160723-1203-145463628698263/AnsiballZ_systemd.py
Feb 23 09:23:15 np0005626463.localdomain sudo[199679]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:23:15 np0005626463.localdomain python3.9[199681]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 23 09:23:15 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.
Feb 23 09:23:15 np0005626463.localdomain sudo[199679]: pam_unix(sudo:session): session closed for user root
Feb 23 09:23:15 np0005626463.localdomain podman[199683]: 2026-02-23 09:23:15.716176991 +0000 UTC m=+0.089032893 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.43.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 23 09:23:15 np0005626463.localdomain podman[199683]: 2026-02-23 09:23:15.757248541 +0000 UTC m=+0.130104423 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Feb 23 09:23:15 np0005626463.localdomain systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully.
Feb 23 09:23:16 np0005626463.localdomain sudo[199818]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kklgslvmznhzpofikbfmeachrotkopzf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838596.075944-1509-140369764456139/AnsiballZ_file.py
Feb 23 09:23:16 np0005626463.localdomain sudo[199818]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:23:17 np0005626463.localdomain python3.9[199820]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/etc/tmpfiles.d/ setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Feb 23 09:23:17 np0005626463.localdomain sudo[199818]: pam_unix(sudo:session): session closed for user root
Feb 23 09:23:17 np0005626463.localdomain sudo[199928]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zimtathtpcjogsigoetahnprbgsozjwk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838597.1944242-1509-155262301361856/AnsiballZ_file.py
Feb 23 09:23:17 np0005626463.localdomain sudo[199928]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:23:17 np0005626463.localdomain python3.9[199930]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Feb 23 09:23:17 np0005626463.localdomain sudo[199928]: pam_unix(sudo:session): session closed for user root
Feb 23 09:23:18 np0005626463.localdomain sudo[200038]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qklcfljyivcfgrucleswqxygksfoqmdu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838597.8222873-1509-162281035858302/AnsiballZ_file.py
Feb 23 09:23:18 np0005626463.localdomain sudo[200038]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:23:18 np0005626463.localdomain python3.9[200040]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 23 09:23:18 np0005626463.localdomain sudo[200038]: pam_unix(sudo:session): session closed for user root
Feb 23 09:23:18 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52920 DF PROTO=TCP SPT=52242 DPT=9101 SEQ=5356550 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEECC060000000001030307) 
Feb 23 09:23:19 np0005626463.localdomain sudo[200148]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xmxbtbfmcrzfjlebpfcckjkxhcoxzdql ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838598.8563848-1509-7978220175637/AnsiballZ_file.py
Feb 23 09:23:19 np0005626463.localdomain sudo[200148]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:23:19 np0005626463.localdomain python3.9[200150]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt/private setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 23 09:23:19 np0005626463.localdomain sudo[200148]: pam_unix(sudo:session): session closed for user root
Feb 23 09:23:19 np0005626463.localdomain sudo[200151]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 09:23:19 np0005626463.localdomain sudo[200151]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:23:19 np0005626463.localdomain sudo[200151]: pam_unix(sudo:session): session closed for user root
Feb 23 09:23:19 np0005626463.localdomain sudo[200186]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 23 09:23:19 np0005626463.localdomain sudo[200186]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:23:19 np0005626463.localdomain sudo[200294]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aprrakedgwpylbsizpgnzulxnkfhaobq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838599.4742227-1509-193699846430206/AnsiballZ_file.py
Feb 23 09:23:19 np0005626463.localdomain sudo[200294]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:23:19 np0005626463.localdomain python3.9[200296]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/CA setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 23 09:23:19 np0005626463.localdomain sudo[200294]: pam_unix(sudo:session): session closed for user root
Feb 23 09:23:20 np0005626463.localdomain sudo[200186]: pam_unix(sudo:session): session closed for user root
Feb 23 09:23:20 np0005626463.localdomain sudo[200435]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xatkdpqjcrkvaupgcflnjgyewlkhmokd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838600.0918822-1509-101695107769879/AnsiballZ_file.py
Feb 23 09:23:20 np0005626463.localdomain sudo[200435]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:23:20 np0005626463.localdomain python3.9[200437]: ansible-ansible.builtin.file Invoked with group=qemu owner=root path=/etc/pki/qemu setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Feb 23 09:23:20 np0005626463.localdomain sudo[200435]: pam_unix(sudo:session): session closed for user root
Feb 23 09:23:20 np0005626463.localdomain sudo[200455]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 09:23:20 np0005626463.localdomain sudo[200455]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:23:20 np0005626463.localdomain sudo[200455]: pam_unix(sudo:session): session closed for user root
Feb 23 09:23:21 np0005626463.localdomain python3.9[200563]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 23 09:23:21 np0005626463.localdomain sudo[200671]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wlcamvffeliqnmizcvomosfhwedfeuld ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838601.5410128-1662-47737091287228/AnsiballZ_stat.py
Feb 23 09:23:21 np0005626463.localdomain sudo[200671]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:23:22 np0005626463.localdomain python3.9[200673]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtlogd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:23:22 np0005626463.localdomain sudo[200671]: pam_unix(sudo:session): session closed for user root
Feb 23 09:23:22 np0005626463.localdomain sudo[200761]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rfsryjpuegwiokbwhveryhncbhpgutkl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838601.5410128-1662-47737091287228/AnsiballZ_copy.py
Feb 23 09:23:22 np0005626463.localdomain sudo[200761]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:23:22 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.
Feb 23 09:23:22 np0005626463.localdomain systemd[1]: tmp-crun.kLUHDd.mount: Deactivated successfully.
Feb 23 09:23:22 np0005626463.localdomain podman[200764]: 2026-02-23 09:23:22.90405021 +0000 UTC m=+0.104294219 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.license=GPLv2, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Feb 23 09:23:22 np0005626463.localdomain podman[200764]: 2026-02-23 09:23:22.915233547 +0000 UTC m=+0.115477536 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0)
Feb 23 09:23:22 np0005626463.localdomain systemd[1]: 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully.
Feb 23 09:23:22 np0005626463.localdomain python3.9[200763]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtlogd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1771838601.5410128-1662-47737091287228/.source.conf follow=False _original_basename=virtlogd.conf checksum=d7a72ae92c2c205983b029473e05a6aa4c58ec24 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:23:23 np0005626463.localdomain sudo[200761]: pam_unix(sudo:session): session closed for user root
Feb 23 09:23:23 np0005626463.localdomain sudo[200887]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pluaekemurkytaqteoiksvqtwopiwrzo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838603.132028-1662-67851893940561/AnsiballZ_stat.py
Feb 23 09:23:23 np0005626463.localdomain sudo[200887]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:23:23 np0005626463.localdomain python3.9[200889]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtnodedevd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:23:23 np0005626463.localdomain sudo[200887]: pam_unix(sudo:session): session closed for user root
Feb 23 09:23:24 np0005626463.localdomain sudo[200977]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yxqzuezdapzfvwokvwlwogrjmxfyhhbl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838603.132028-1662-67851893940561/AnsiballZ_copy.py
Feb 23 09:23:24 np0005626463.localdomain sudo[200977]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:23:24 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44016 DF PROTO=TCP SPT=34410 DPT=9102 SEQ=368432506 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEEE1400000000001030307) 
Feb 23 09:23:24 np0005626463.localdomain python3.9[200979]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtnodedevd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1771838603.132028-1662-67851893940561/.source.conf follow=False _original_basename=virtnodedevd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:23:24 np0005626463.localdomain sudo[200977]: pam_unix(sudo:session): session closed for user root
Feb 23 09:23:24 np0005626463.localdomain sudo[201087]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bofofjmltybraqdpzmgmarbopeyjxqgc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838604.4266741-1662-114599274264969/AnsiballZ_stat.py
Feb 23 09:23:24 np0005626463.localdomain sudo[201087]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:23:24 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18815 DF PROTO=TCP SPT=56594 DPT=9882 SEQ=3746675600 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEEE4070000000001030307) 
Feb 23 09:23:24 np0005626463.localdomain python3.9[201089]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtproxyd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:23:24 np0005626463.localdomain sudo[201087]: pam_unix(sudo:session): session closed for user root
Feb 23 09:23:25 np0005626463.localdomain sudo[201177]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jxyzsduqwnldjbckfhpxxmggqeiqyymd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838604.4266741-1662-114599274264969/AnsiballZ_copy.py
Feb 23 09:23:25 np0005626463.localdomain sudo[201177]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:23:25 np0005626463.localdomain python3.9[201179]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtproxyd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1771838604.4266741-1662-114599274264969/.source.conf follow=False _original_basename=virtproxyd.conf checksum=28bc484b7c9988e03de49d4fcc0a088ea975f716 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:23:25 np0005626463.localdomain sudo[201177]: pam_unix(sudo:session): session closed for user root
Feb 23 09:23:26 np0005626463.localdomain sudo[201287]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oksekjrazvawqmcjdipvkgisxmqvidpp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838605.8342266-1662-280878966173408/AnsiballZ_stat.py
Feb 23 09:23:26 np0005626463.localdomain sudo[201287]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:23:26 np0005626463.localdomain python3.9[201289]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtqemud.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:23:26 np0005626463.localdomain sudo[201287]: pam_unix(sudo:session): session closed for user root
Feb 23 09:23:26 np0005626463.localdomain sudo[201377]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rehqfskngvdhgagxsrizoucznwxelofk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838605.8342266-1662-280878966173408/AnsiballZ_copy.py
Feb 23 09:23:26 np0005626463.localdomain sudo[201377]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:23:26 np0005626463.localdomain python3.9[201379]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtqemud.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1771838605.8342266-1662-280878966173408/.source.conf follow=False _original_basename=virtqemud.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:23:26 np0005626463.localdomain sudo[201377]: pam_unix(sudo:session): session closed for user root
Feb 23 09:23:27 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44018 DF PROTO=TCP SPT=34410 DPT=9102 SEQ=368432506 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEEED460000000001030307) 
Feb 23 09:23:27 np0005626463.localdomain sudo[201487]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qzwefoxalbrncxnezfatgrqknghbipcx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838607.0073864-1662-145197105874780/AnsiballZ_stat.py
Feb 23 09:23:27 np0005626463.localdomain sudo[201487]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:23:27 np0005626463.localdomain python3.9[201489]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/qemu.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:23:27 np0005626463.localdomain sudo[201487]: pam_unix(sudo:session): session closed for user root
Feb 23 09:23:27 np0005626463.localdomain sudo[201577]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ebjagumgqpzgkozglqeljgwwrifuepcy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838607.0073864-1662-145197105874780/AnsiballZ_copy.py
Feb 23 09:23:27 np0005626463.localdomain sudo[201577]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:23:27 np0005626463.localdomain sshd[201580]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:23:28 np0005626463.localdomain python3.9[201579]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/qemu.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1771838607.0073864-1662-145197105874780/.source.conf follow=False _original_basename=qemu.conf.j2 checksum=8d9b2057482987a531d808ceb2ac4bc7d43bf17c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:23:28 np0005626463.localdomain sudo[201577]: pam_unix(sudo:session): session closed for user root
Feb 23 09:23:28 np0005626463.localdomain sshd[201580]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:23:28 np0005626463.localdomain sudo[201689]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cggayqvzbbzycxpaquldytivwqbcwkee ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838608.6292882-1662-192434915117389/AnsiballZ_stat.py
Feb 23 09:23:28 np0005626463.localdomain sudo[201689]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:23:29 np0005626463.localdomain python3.9[201691]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtsecretd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:23:29 np0005626463.localdomain sudo[201689]: pam_unix(sudo:session): session closed for user root
Feb 23 09:23:29 np0005626463.localdomain sudo[201779]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qpavchejmyrmebgckaurulozxrxtqzso ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838608.6292882-1662-192434915117389/AnsiballZ_copy.py
Feb 23 09:23:29 np0005626463.localdomain sudo[201779]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:23:29 np0005626463.localdomain python3.9[201781]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtsecretd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1771838608.6292882-1662-192434915117389/.source.conf follow=False _original_basename=virtsecretd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:23:29 np0005626463.localdomain sudo[201779]: pam_unix(sudo:session): session closed for user root
Feb 23 09:23:30 np0005626463.localdomain sudo[201889]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qoykwdkaxwhqmebckjmqdxloblbpzjwb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838609.7980475-1662-49307972051331/AnsiballZ_stat.py
Feb 23 09:23:30 np0005626463.localdomain sudo[201889]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:23:30 np0005626463.localdomain python3.9[201891]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/auth.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:23:30 np0005626463.localdomain sudo[201889]: pam_unix(sudo:session): session closed for user root
Feb 23 09:23:30 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1707 DF PROTO=TCP SPT=37970 DPT=9100 SEQ=1028638576 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEEFBC60000000001030307) 
Feb 23 09:23:31 np0005626463.localdomain sudo[201977]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xixognyuntnmqxxjjdrvyhnujobmmmbo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838609.7980475-1662-49307972051331/AnsiballZ_copy.py
Feb 23 09:23:31 np0005626463.localdomain sudo[201977]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:23:31 np0005626463.localdomain python3.9[201979]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/auth.conf group=libvirt mode=0600 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1771838609.7980475-1662-49307972051331/.source.conf follow=False _original_basename=auth.conf checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:23:31 np0005626463.localdomain sudo[201977]: pam_unix(sudo:session): session closed for user root
Feb 23 09:23:31 np0005626463.localdomain sudo[202087]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vhvcptyrvwqylyakapqnwzmadlnbmglp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838611.3456676-1662-155137844101064/AnsiballZ_stat.py
Feb 23 09:23:31 np0005626463.localdomain sudo[202087]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:23:31 np0005626463.localdomain python3.9[202089]: ansible-ansible.legacy.stat Invoked with path=/etc/sasl2/libvirt.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:23:31 np0005626463.localdomain sudo[202087]: pam_unix(sudo:session): session closed for user root
Feb 23 09:23:32 np0005626463.localdomain sudo[202177]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ngnjbdwtjdcsljzfsnnsuvecjowfchsb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838611.3456676-1662-155137844101064/AnsiballZ_copy.py
Feb 23 09:23:32 np0005626463.localdomain sudo[202177]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:23:32 np0005626463.localdomain python3.9[202179]: ansible-ansible.legacy.copy Invoked with dest=/etc/sasl2/libvirt.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1771838611.3456676-1662-155137844101064/.source.conf follow=False _original_basename=sasl_libvirt.conf checksum=652e4d404bf79253d06956b8e9847c9364979d4a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:23:32 np0005626463.localdomain sudo[202177]: pam_unix(sudo:session): session closed for user root
Feb 23 09:23:32 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1708 DF PROTO=TCP SPT=37970 DPT=9100 SEQ=1028638576 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEF03C60000000001030307) 
Feb 23 09:23:33 np0005626463.localdomain sudo[202287]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ftreayptmbuoxpqbbxrhgbgidgrfomro ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838612.8113492-2004-268656522617184/AnsiballZ_file.py
Feb 23 09:23:33 np0005626463.localdomain sudo[202287]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:23:33 np0005626463.localdomain python3.9[202289]: ansible-ansible.builtin.file Invoked with path=/etc/libvirt/passwd.db state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:23:33 np0005626463.localdomain sudo[202287]: pam_unix(sudo:session): session closed for user root
Feb 23 09:23:33 np0005626463.localdomain sudo[202397]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iijaznrryilncenraueuoyzafrlwvseo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838613.4584873-2028-4096804949401/AnsiballZ_file.py
Feb 23 09:23:33 np0005626463.localdomain sudo[202397]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:23:33 np0005626463.localdomain python3.9[202399]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:23:33 np0005626463.localdomain sudo[202397]: pam_unix(sudo:session): session closed for user root
Feb 23 09:23:34 np0005626463.localdomain sudo[202507]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dbjqhxrbyiajhjdazrwmqclppbgdjwhz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838614.0739145-2028-190543486895660/AnsiballZ_file.py
Feb 23 09:23:34 np0005626463.localdomain sudo[202507]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:23:34 np0005626463.localdomain python3.9[202509]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:23:34 np0005626463.localdomain sudo[202507]: pam_unix(sudo:session): session closed for user root
Feb 23 09:23:34 np0005626463.localdomain sudo[202617]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vsszjxonbxhfhvjsofhusfhspdxubclq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838614.6835706-2028-49027079671926/AnsiballZ_file.py
Feb 23 09:23:34 np0005626463.localdomain sudo[202617]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:23:35 np0005626463.localdomain python3.9[202619]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:23:35 np0005626463.localdomain sudo[202617]: pam_unix(sudo:session): session closed for user root
Feb 23 09:23:35 np0005626463.localdomain sudo[202727]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nchhtrqgkmfndyvqcnddymozaeyomdxe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838615.274507-2028-234071827834598/AnsiballZ_file.py
Feb 23 09:23:35 np0005626463.localdomain sudo[202727]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:23:35 np0005626463.localdomain python3.9[202729]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:23:35 np0005626463.localdomain sudo[202727]: pam_unix(sudo:session): session closed for user root
Feb 23 09:23:36 np0005626463.localdomain sudo[202837]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gyqwkstieebocvkcwahbceobhbwjsvmc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838615.8687518-2028-25932960684357/AnsiballZ_file.py
Feb 23 09:23:36 np0005626463.localdomain sudo[202837]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:23:36 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38834 DF PROTO=TCP SPT=34776 DPT=9101 SEQ=1059110977 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEF10860000000001030307) 
Feb 23 09:23:36 np0005626463.localdomain python3.9[202839]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:23:36 np0005626463.localdomain sudo[202837]: pam_unix(sudo:session): session closed for user root
Feb 23 09:23:36 np0005626463.localdomain sudo[202947]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-buqthknrxasrchcwoexncrbcrrblrycq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838616.4624135-2028-147556018274630/AnsiballZ_file.py
Feb 23 09:23:36 np0005626463.localdomain sudo[202947]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:23:36 np0005626463.localdomain python3.9[202949]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:23:36 np0005626463.localdomain sudo[202947]: pam_unix(sudo:session): session closed for user root
Feb 23 09:23:37 np0005626463.localdomain sudo[203057]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qlcxvpramevppztloljkaktpbjpsdcra ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838617.0428822-2028-164462244155546/AnsiballZ_file.py
Feb 23 09:23:37 np0005626463.localdomain sudo[203057]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:23:37 np0005626463.localdomain python3.9[203059]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:23:37 np0005626463.localdomain sudo[203057]: pam_unix(sudo:session): session closed for user root
Feb 23 09:23:37 np0005626463.localdomain sudo[203167]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jgxcjzdlarqccgvvrjlcmpdzlptnrsiz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838617.660203-2028-207294951757117/AnsiballZ_file.py
Feb 23 09:23:37 np0005626463.localdomain sudo[203167]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:23:38 np0005626463.localdomain python3.9[203169]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:23:38 np0005626463.localdomain sudo[203167]: pam_unix(sudo:session): session closed for user root
Feb 23 09:23:38 np0005626463.localdomain sudo[203277]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-plbkmtbrgqpitqfsfishjjmodlmzncho ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838618.2708313-2028-259610280697729/AnsiballZ_file.py
Feb 23 09:23:38 np0005626463.localdomain sudo[203277]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:23:38 np0005626463.localdomain python3.9[203279]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:23:38 np0005626463.localdomain sudo[203277]: pam_unix(sudo:session): session closed for user root
Feb 23 09:23:39 np0005626463.localdomain sudo[203387]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lsznxnujrdlvmrynruwutlfwtilovskw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838618.8919373-2028-257996280148854/AnsiballZ_file.py
Feb 23 09:23:39 np0005626463.localdomain sudo[203387]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:23:39 np0005626463.localdomain python3.9[203389]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:23:39 np0005626463.localdomain sudo[203387]: pam_unix(sudo:session): session closed for user root
Feb 23 09:23:39 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30480 DF PROTO=TCP SPT=50418 DPT=9882 SEQ=3914433687 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEF1CF50000000001030307) 
Feb 23 09:23:39 np0005626463.localdomain sudo[203497]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aufokxvbkpncbcpaibnglbplwebfkfyq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838619.6656573-2028-273669073640321/AnsiballZ_file.py
Feb 23 09:23:39 np0005626463.localdomain sudo[203497]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:23:40 np0005626463.localdomain python3.9[203499]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:23:40 np0005626463.localdomain sudo[203497]: pam_unix(sudo:session): session closed for user root
Feb 23 09:23:40 np0005626463.localdomain sudo[203607]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qahkmaxjkjkogcckkxlxryzpsyemqfxo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838620.6453476-2028-220722581770310/AnsiballZ_file.py
Feb 23 09:23:40 np0005626463.localdomain sudo[203607]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:23:41 np0005626463.localdomain python3.9[203609]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:23:41 np0005626463.localdomain sudo[203607]: pam_unix(sudo:session): session closed for user root
Feb 23 09:23:41 np0005626463.localdomain sudo[203717]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tzjyuwrjwgcxhjnqrrxrffryoehxhcjl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838621.2233694-2028-155189346863218/AnsiballZ_file.py
Feb 23 09:23:41 np0005626463.localdomain sudo[203717]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:23:41 np0005626463.localdomain python3.9[203719]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:23:41 np0005626463.localdomain sudo[203717]: pam_unix(sudo:session): session closed for user root
Feb 23 09:23:42 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30482 DF PROTO=TCP SPT=50418 DPT=9882 SEQ=3914433687 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEF29060000000001030307) 
Feb 23 09:23:42 np0005626463.localdomain sudo[203827]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aacxbywoihxtzhwrcakznrigjwbsuwxa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838621.84653-2028-248578824230724/AnsiballZ_file.py
Feb 23 09:23:42 np0005626463.localdomain sudo[203827]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:23:43 np0005626463.localdomain python3.9[203829]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:23:43 np0005626463.localdomain sudo[203827]: pam_unix(sudo:session): session closed for user root
Feb 23 09:23:43 np0005626463.localdomain sudo[203937]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jtjzkepjiwzotbrbjmdapiwfoeiluptn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838623.462256-2325-214771761873346/AnsiballZ_stat.py
Feb 23 09:23:43 np0005626463.localdomain sudo[203937]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:23:43 np0005626463.localdomain python3.9[203939]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:23:43 np0005626463.localdomain sudo[203937]: pam_unix(sudo:session): session closed for user root
Feb 23 09:23:44 np0005626463.localdomain sudo[204025]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xfjgucykzgghoxhejrctchyzwmqlzprk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838623.462256-2325-214771761873346/AnsiballZ_copy.py
Feb 23 09:23:44 np0005626463.localdomain sudo[204025]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:23:44 np0005626463.localdomain python3.9[204027]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771838623.462256-2325-214771761873346/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:23:44 np0005626463.localdomain sudo[204025]: pam_unix(sudo:session): session closed for user root
Feb 23 09:23:44 np0005626463.localdomain sudo[204135]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-avgohxgqkvcuhmwlirxygetgpkgutfhk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838624.6299677-2325-10760413131730/AnsiballZ_stat.py
Feb 23 09:23:44 np0005626463.localdomain sudo[204135]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:23:44 np0005626463.localdomain sshd[204138]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:23:45 np0005626463.localdomain python3.9[204137]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:23:45 np0005626463.localdomain sudo[204135]: pam_unix(sudo:session): session closed for user root
Feb 23 09:23:45 np0005626463.localdomain sshd[204138]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:23:45 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1710 DF PROTO=TCP SPT=37970 DPT=9100 SEQ=1028638576 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEF34060000000001030307) 
Feb 23 09:23:45 np0005626463.localdomain sudo[204225]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vtexyhrxueynrdvobgcfsxlnlkgrhhrm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838624.6299677-2325-10760413131730/AnsiballZ_copy.py
Feb 23 09:23:45 np0005626463.localdomain sudo[204225]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:23:45 np0005626463.localdomain python3.9[204227]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771838624.6299677-2325-10760413131730/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:23:45 np0005626463.localdomain sudo[204225]: pam_unix(sudo:session): session closed for user root
Feb 23 09:23:45 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.
Feb 23 09:23:45 np0005626463.localdomain podman[204245]: 2026-02-23 09:23:45.917966369 +0000 UTC m=+0.086633982 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, container_name=ovn_controller, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Feb 23 09:23:45 np0005626463.localdomain systemd[1]: tmp-crun.48Y5Wu.mount: Deactivated successfully.
Feb 23 09:23:45 np0005626463.localdomain podman[204245]: 2026-02-23 09:23:45.965567931 +0000 UTC m=+0.134235524 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20260216)
Feb 23 09:23:45 np0005626463.localdomain systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully.
Feb 23 09:23:46 np0005626463.localdomain sudo[204360]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fytazampurjkacjnuiqsdtdtidsjnfjo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838625.8519723-2325-278676601107263/AnsiballZ_stat.py
Feb 23 09:23:46 np0005626463.localdomain sudo[204360]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:23:46 np0005626463.localdomain python3.9[204362]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:23:46 np0005626463.localdomain sudo[204360]: pam_unix(sudo:session): session closed for user root
Feb 23 09:23:46 np0005626463.localdomain sudo[204448]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bousptuvkpetdhwqkvfpujslccppzadi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838625.8519723-2325-278676601107263/AnsiballZ_copy.py
Feb 23 09:23:46 np0005626463.localdomain sudo[204448]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:23:46 np0005626463.localdomain python3.9[204450]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771838625.8519723-2325-278676601107263/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:23:46 np0005626463.localdomain sudo[204448]: pam_unix(sudo:session): session closed for user root
Feb 23 09:23:47 np0005626463.localdomain sudo[204558]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fxeeztvlrstnqcfmraoewjrftfvliulg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838626.9844291-2325-2824619806686/AnsiballZ_stat.py
Feb 23 09:23:47 np0005626463.localdomain sudo[204558]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:23:47 np0005626463.localdomain python3.9[204560]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:23:47 np0005626463.localdomain sudo[204558]: pam_unix(sudo:session): session closed for user root
Feb 23 09:23:47 np0005626463.localdomain sudo[204646]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-whbitgirfcbtlwtdddyuupfquzhmnxgw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838626.9844291-2325-2824619806686/AnsiballZ_copy.py
Feb 23 09:23:47 np0005626463.localdomain sudo[204646]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:23:47 np0005626463.localdomain python3.9[204648]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771838626.9844291-2325-2824619806686/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:23:47 np0005626463.localdomain sudo[204646]: pam_unix(sudo:session): session closed for user root
Feb 23 09:23:48 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38836 DF PROTO=TCP SPT=34776 DPT=9101 SEQ=1059110977 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEF40060000000001030307) 
Feb 23 09:23:48 np0005626463.localdomain sudo[204756]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-batxdqpevzefeuotrhjernjvpbazvxua ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838628.1109343-2325-279520773670372/AnsiballZ_stat.py
Feb 23 09:23:48 np0005626463.localdomain sudo[204756]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:23:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:23:48.523 163572 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 09:23:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:23:48.524 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 09:23:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:23:48.526 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 09:23:48 np0005626463.localdomain python3.9[204758]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:23:48 np0005626463.localdomain sudo[204756]: pam_unix(sudo:session): session closed for user root
Feb 23 09:23:48 np0005626463.localdomain sudo[204844]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ykvcavqceufxlqpwqcazedszqjpidewi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838628.1109343-2325-279520773670372/AnsiballZ_copy.py
Feb 23 09:23:48 np0005626463.localdomain sudo[204844]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:23:49 np0005626463.localdomain python3.9[204846]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771838628.1109343-2325-279520773670372/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:23:49 np0005626463.localdomain sudo[204844]: pam_unix(sudo:session): session closed for user root
Feb 23 09:23:49 np0005626463.localdomain sudo[204954]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gyiiihxypogzwuasojvnqddxhdigqhjq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838629.3342123-2325-71050521864360/AnsiballZ_stat.py
Feb 23 09:23:49 np0005626463.localdomain sudo[204954]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:23:49 np0005626463.localdomain python3.9[204956]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:23:49 np0005626463.localdomain sudo[204954]: pam_unix(sudo:session): session closed for user root
Feb 23 09:23:50 np0005626463.localdomain sudo[205042]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ujvromftztcvyhxgnruddlmhnukjwafc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838629.3342123-2325-71050521864360/AnsiballZ_copy.py
Feb 23 09:23:50 np0005626463.localdomain sudo[205042]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:23:50 np0005626463.localdomain python3.9[205044]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771838629.3342123-2325-71050521864360/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:23:50 np0005626463.localdomain sudo[205042]: pam_unix(sudo:session): session closed for user root
Feb 23 09:23:50 np0005626463.localdomain sudo[205152]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kozfqdphtxsdmyxjkgyfwsqestdxrzjf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838630.5485175-2325-277528593880385/AnsiballZ_stat.py
Feb 23 09:23:50 np0005626463.localdomain sudo[205152]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:23:51 np0005626463.localdomain python3.9[205154]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:23:51 np0005626463.localdomain sudo[205152]: pam_unix(sudo:session): session closed for user root
Feb 23 09:23:51 np0005626463.localdomain sudo[205240]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qourhusbhjwcvgyhlmzvirmzyfukcvnu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838630.5485175-2325-277528593880385/AnsiballZ_copy.py
Feb 23 09:23:51 np0005626463.localdomain sudo[205240]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:23:51 np0005626463.localdomain python3.9[205242]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771838630.5485175-2325-277528593880385/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:23:51 np0005626463.localdomain sudo[205240]: pam_unix(sudo:session): session closed for user root
Feb 23 09:23:52 np0005626463.localdomain sudo[205350]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ykvefmrilqaylnuqcctaoivhfnvtpmij ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838631.7737117-2325-91894078544770/AnsiballZ_stat.py
Feb 23 09:23:52 np0005626463.localdomain sudo[205350]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:23:52 np0005626463.localdomain python3.9[205352]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:23:52 np0005626463.localdomain sudo[205350]: pam_unix(sudo:session): session closed for user root
Feb 23 09:23:53 np0005626463.localdomain sudo[205438]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lppggcggkvbxjxpdmzhemavwuuwymzrv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838631.7737117-2325-91894078544770/AnsiballZ_copy.py
Feb 23 09:23:53 np0005626463.localdomain sudo[205438]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:23:53 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.
Feb 23 09:23:53 np0005626463.localdomain podman[205441]: 2026-02-23 09:23:53.381722553 +0000 UTC m=+0.098914323 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.build-date=20260216, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Feb 23 09:23:53 np0005626463.localdomain podman[205441]: 2026-02-23 09:23:53.393176784 +0000 UTC m=+0.110368544 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS)
Feb 23 09:23:53 np0005626463.localdomain systemd[1]: 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully.
Feb 23 09:23:53 np0005626463.localdomain python3.9[205440]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771838631.7737117-2325-91894078544770/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:23:53 np0005626463.localdomain sudo[205438]: pam_unix(sudo:session): session closed for user root
Feb 23 09:23:53 np0005626463.localdomain sudo[205565]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mxedohxycgaktzeqrkbnyvqbrpvbuujw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838633.6122322-2325-1913487218414/AnsiballZ_stat.py
Feb 23 09:23:53 np0005626463.localdomain sudo[205565]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:23:54 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25892 DF PROTO=TCP SPT=33876 DPT=9102 SEQ=3029223417 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEF566F0000000001030307) 
Feb 23 09:23:54 np0005626463.localdomain python3.9[205567]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:23:54 np0005626463.localdomain sudo[205565]: pam_unix(sudo:session): session closed for user root
Feb 23 09:23:55 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30484 DF PROTO=TCP SPT=50418 DPT=9882 SEQ=3914433687 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEF5A070000000001030307) 
Feb 23 09:23:55 np0005626463.localdomain sudo[205653]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gpeupujsedlsbiiozobjnwfyylwhsmrb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838633.6122322-2325-1913487218414/AnsiballZ_copy.py
Feb 23 09:23:55 np0005626463.localdomain sudo[205653]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:23:55 np0005626463.localdomain python3.9[205655]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771838633.6122322-2325-1913487218414/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:23:55 np0005626463.localdomain sudo[205653]: pam_unix(sudo:session): session closed for user root
Feb 23 09:23:55 np0005626463.localdomain sudo[205763]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lmfsnziivgekytnmbbzbgndbcpexougb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838635.4920456-2325-134004433001527/AnsiballZ_stat.py
Feb 23 09:23:55 np0005626463.localdomain sudo[205763]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:23:55 np0005626463.localdomain python3.9[205765]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:23:56 np0005626463.localdomain sudo[205763]: pam_unix(sudo:session): session closed for user root
Feb 23 09:23:56 np0005626463.localdomain sudo[205851]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bbiaraxuyjzsqhdmhjmgwisnptiitnjh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838635.4920456-2325-134004433001527/AnsiballZ_copy.py
Feb 23 09:23:56 np0005626463.localdomain sudo[205851]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:23:56 np0005626463.localdomain python3.9[205853]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771838635.4920456-2325-134004433001527/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:23:56 np0005626463.localdomain sudo[205851]: pam_unix(sudo:session): session closed for user root
Feb 23 09:23:56 np0005626463.localdomain sudo[205961]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cmngocvlivonwfxoodanheemqefhsjlo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838636.6617522-2325-205131813969326/AnsiballZ_stat.py
Feb 23 09:23:56 np0005626463.localdomain sudo[205961]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:23:57 np0005626463.localdomain python3.9[205963]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:23:57 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25894 DF PROTO=TCP SPT=33876 DPT=9102 SEQ=3029223417 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEF62860000000001030307) 
Feb 23 09:23:57 np0005626463.localdomain sudo[205961]: pam_unix(sudo:session): session closed for user root
Feb 23 09:23:57 np0005626463.localdomain sudo[206049]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lynoqgbqauznfvybhyebyxrfmumbunye ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838636.6617522-2325-205131813969326/AnsiballZ_copy.py
Feb 23 09:23:57 np0005626463.localdomain sudo[206049]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:23:57 np0005626463.localdomain python3.9[206051]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771838636.6617522-2325-205131813969326/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:23:57 np0005626463.localdomain sudo[206049]: pam_unix(sudo:session): session closed for user root
Feb 23 09:23:58 np0005626463.localdomain sudo[206159]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-znrfffioplrcoyeghyxjtusdbmvwuwve ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838637.865976-2325-59996288339509/AnsiballZ_stat.py
Feb 23 09:23:58 np0005626463.localdomain sudo[206159]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:23:58 np0005626463.localdomain python3.9[206161]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:23:58 np0005626463.localdomain sudo[206159]: pam_unix(sudo:session): session closed for user root
Feb 23 09:23:58 np0005626463.localdomain sudo[206247]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uquuqrdsoxmdoybqobhwaqmkusixcctx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838637.865976-2325-59996288339509/AnsiballZ_copy.py
Feb 23 09:23:58 np0005626463.localdomain sudo[206247]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:23:59 np0005626463.localdomain python3.9[206249]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771838637.865976-2325-59996288339509/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:23:59 np0005626463.localdomain sudo[206247]: pam_unix(sudo:session): session closed for user root
Feb 23 09:23:59 np0005626463.localdomain sudo[206357]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bmbtzbalpwkimofoczgaezqmheyqlchg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838639.2778425-2325-23532228047330/AnsiballZ_stat.py
Feb 23 09:23:59 np0005626463.localdomain sudo[206357]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:23:59 np0005626463.localdomain python3.9[206359]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:23:59 np0005626463.localdomain sudo[206357]: pam_unix(sudo:session): session closed for user root
Feb 23 09:24:00 np0005626463.localdomain sudo[206445]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wvskbnwvzetzbkbmhfpiykpsrvwuinrp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838639.2778425-2325-23532228047330/AnsiballZ_copy.py
Feb 23 09:24:00 np0005626463.localdomain sudo[206445]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:24:00 np0005626463.localdomain python3.9[206447]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771838639.2778425-2325-23532228047330/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:24:00 np0005626463.localdomain sudo[206445]: pam_unix(sudo:session): session closed for user root
Feb 23 09:24:00 np0005626463.localdomain sudo[206555]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-moekspogzsdjfadhtuwepkknfzcacprk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838640.3888333-2325-42499097915428/AnsiballZ_stat.py
Feb 23 09:24:00 np0005626463.localdomain sudo[206555]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:24:00 np0005626463.localdomain python3.9[206557]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:24:00 np0005626463.localdomain sudo[206555]: pam_unix(sudo:session): session closed for user root
Feb 23 09:24:00 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63804 DF PROTO=TCP SPT=49832 DPT=9100 SEQ=3024133794 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEF71060000000001030307) 
Feb 23 09:24:01 np0005626463.localdomain sudo[206643]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-srlbgdgdkpvuhttlsspngcxiqtkrnqzu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838640.3888333-2325-42499097915428/AnsiballZ_copy.py
Feb 23 09:24:01 np0005626463.localdomain sudo[206643]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:24:01 np0005626463.localdomain python3.9[206645]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771838640.3888333-2325-42499097915428/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:24:01 np0005626463.localdomain sudo[206643]: pam_unix(sudo:session): session closed for user root
Feb 23 09:24:02 np0005626463.localdomain python3.9[206753]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail
                                                            ls -lRZ /run/libvirt | grep -E ':container_\S+_t'
                                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 09:24:02 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63805 DF PROTO=TCP SPT=49832 DPT=9100 SEQ=3024133794 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEF79060000000001030307) 
Feb 23 09:24:02 np0005626463.localdomain sudo[206864]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tsjhakiusggmpwdxbrnzqprvjgmnpkgm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838642.4038086-2943-15241589913101/AnsiballZ_seboolean.py
Feb 23 09:24:02 np0005626463.localdomain sudo[206864]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:24:03 np0005626463.localdomain python3.9[206866]: ansible-ansible.posix.seboolean Invoked with name=os_enable_vtpm persistent=True state=True ignore_selinux_state=False
Feb 23 09:24:03 np0005626463.localdomain sudo[206864]: pam_unix(sudo:session): session closed for user root
Feb 23 09:24:04 np0005626463.localdomain sudo[206974]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ymhrppsydombvnmybwcudujplrowofse ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838643.8102608-2973-236421516187741/AnsiballZ_systemd.py
Feb 23 09:24:04 np0005626463.localdomain sudo[206974]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:24:04 np0005626463.localdomain python3.9[206976]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtlogd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 23 09:24:04 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 09:24:04 np0005626463.localdomain systemd-rc-local-generator[207001]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 09:24:04 np0005626463.localdomain systemd-sysv-generator[207006]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 09:24:04 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:24:04 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 23 09:24:04 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:24:04 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:24:04 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 09:24:04 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 23 09:24:04 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:24:04 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:24:04 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:24:04 np0005626463.localdomain systemd[1]: Starting libvirt logging daemon socket...
Feb 23 09:24:04 np0005626463.localdomain systemd[1]: Listening on libvirt logging daemon socket.
Feb 23 09:24:04 np0005626463.localdomain systemd[1]: Starting libvirt logging daemon admin socket...
Feb 23 09:24:04 np0005626463.localdomain systemd[1]: Listening on libvirt logging daemon admin socket.
Feb 23 09:24:04 np0005626463.localdomain systemd[1]: Starting libvirt logging daemon...
Feb 23 09:24:04 np0005626463.localdomain systemd[1]: Started libvirt logging daemon.
Feb 23 09:24:04 np0005626463.localdomain sudo[206974]: pam_unix(sudo:session): session closed for user root
Feb 23 09:24:05 np0005626463.localdomain sudo[207126]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iivdshqaukrkvcubnbanainulcyoskjt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838645.5230014-2973-170538711388126/AnsiballZ_systemd.py
Feb 23 09:24:05 np0005626463.localdomain sudo[207126]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:24:06 np0005626463.localdomain python3.9[207128]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtnodedevd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 23 09:24:06 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 09:24:06 np0005626463.localdomain systemd-rc-local-generator[207152]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 09:24:06 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43480 DF PROTO=TCP SPT=49158 DPT=9101 SEQ=3915859774 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEF85C60000000001030307) 
Feb 23 09:24:06 np0005626463.localdomain systemd-sysv-generator[207155]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 09:24:06 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:24:06 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 23 09:24:06 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:24:06 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:24:06 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 09:24:06 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 23 09:24:06 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:24:06 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:24:06 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:24:06 np0005626463.localdomain systemd[1]: Starting SETroubleshoot daemon for processing new SELinux denial logs...
Feb 23 09:24:06 np0005626463.localdomain systemd[1]: Starting libvirt nodedev daemon socket...
Feb 23 09:24:06 np0005626463.localdomain systemd[1]: Listening on libvirt nodedev daemon socket.
Feb 23 09:24:06 np0005626463.localdomain systemd[1]: Starting libvirt nodedev daemon admin socket...
Feb 23 09:24:06 np0005626463.localdomain systemd[1]: Starting libvirt nodedev daemon read-only socket...
Feb 23 09:24:06 np0005626463.localdomain systemd[1]: Listening on libvirt nodedev daemon admin socket.
Feb 23 09:24:06 np0005626463.localdomain systemd[1]: Listening on libvirt nodedev daemon read-only socket.
Feb 23 09:24:06 np0005626463.localdomain systemd[1]: Started libvirt nodedev daemon.
Feb 23 09:24:06 np0005626463.localdomain sudo[207126]: pam_unix(sudo:session): session closed for user root
Feb 23 09:24:06 np0005626463.localdomain systemd[1]: Started SETroubleshoot daemon for processing new SELinux denial logs.
Feb 23 09:24:06 np0005626463.localdomain setroubleshoot[207166]: Deleting alert a176daef-12e3-44f0-9641-bedf749d0981, it is allowed in current policy
Feb 23 09:24:06 np0005626463.localdomain systemd[1]: Started dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@1.service.
Feb 23 09:24:06 np0005626463.localdomain sudo[207309]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lurgejpnnxlhqdcmltmwfmwzmokpspxj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838646.6363971-2973-138466840097081/AnsiballZ_systemd.py
Feb 23 09:24:06 np0005626463.localdomain sudo[207309]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:24:07 np0005626463.localdomain python3.9[207311]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtproxyd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 23 09:24:07 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 09:24:07 np0005626463.localdomain systemd-sysv-generator[207343]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 09:24:07 np0005626463.localdomain systemd-rc-local-generator[207338]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 09:24:07 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:24:07 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 23 09:24:07 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:24:07 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:24:07 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 09:24:07 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 23 09:24:07 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:24:07 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:24:07 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:24:07 np0005626463.localdomain systemd[1]: Starting libvirt proxy daemon socket...
Feb 23 09:24:07 np0005626463.localdomain systemd[1]: Listening on libvirt proxy daemon socket.
Feb 23 09:24:07 np0005626463.localdomain systemd[1]: Starting libvirt proxy daemon admin socket...
Feb 23 09:24:07 np0005626463.localdomain systemd[1]: Starting libvirt proxy daemon read-only socket...
Feb 23 09:24:07 np0005626463.localdomain systemd[1]: Listening on libvirt proxy daemon admin socket.
Feb 23 09:24:07 np0005626463.localdomain systemd[1]: Listening on libvirt proxy daemon read-only socket.
Feb 23 09:24:07 np0005626463.localdomain systemd[1]: Started libvirt proxy daemon.
Feb 23 09:24:07 np0005626463.localdomain sudo[207309]: pam_unix(sudo:session): session closed for user root
Feb 23 09:24:07 np0005626463.localdomain setroubleshoot[207166]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l 217d9264-3142-4991-af77-108d418c8753
Feb 23 09:24:07 np0005626463.localdomain setroubleshoot[207166]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.
                                                                 
                                                                 *****  Plugin dac_override (91.4 confidence) suggests   **********************
                                                                 
                                                                 If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system
                                                                 Then turn on full auditing to get path information about the offending file and generate the error again.
                                                                 Do
                                                                 
                                                                 Turn on full auditing
                                                                 # auditctl -w /etc/shadow -p w
                                                                 Try to recreate AVC. Then execute
                                                                 # ausearch -m avc -ts recent
                                                                 If you see PATH record check ownership/permissions on file, and fix it,
                                                                 otherwise report as a bugzilla.
                                                                 
                                                                 *****  Plugin catchall (9.59 confidence) suggests   **************************
                                                                 
                                                                 If you believe that virtlogd should have the dac_read_search capability by default.
                                                                 Then you should report this as a bug.
                                                                 You can generate a local policy module to allow this access.
                                                                 Do
                                                                 allow this access for now by executing:
                                                                 # ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd
                                                                 # semodule -X 300 -i my-virtlogd.pp
                                                                 
Feb 23 09:24:07 np0005626463.localdomain setroubleshoot[207166]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l 217d9264-3142-4991-af77-108d418c8753
Feb 23 09:24:07 np0005626463.localdomain setroubleshoot[207166]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.
                                                                 
                                                                 *****  Plugin dac_override (91.4 confidence) suggests   **********************
                                                                 
                                                                 If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system
                                                                 Then turn on full auditing to get path information about the offending file and generate the error again.
                                                                 Do
                                                                 
                                                                 Turn on full auditing
                                                                 # auditctl -w /etc/shadow -p w
                                                                 Try to recreate AVC. Then execute
                                                                 # ausearch -m avc -ts recent
                                                                 If you see PATH record check ownership/permissions on file, and fix it,
                                                                 otherwise report as a bugzilla.
                                                                 
                                                                 *****  Plugin catchall (9.59 confidence) suggests   **************************
                                                                 
                                                                 If you believe that virtlogd should have the dac_read_search capability by default.
                                                                 Then you should report this as a bug.
                                                                 You can generate a local policy module to allow this access.
                                                                 Do
                                                                 allow this access for now by executing:
                                                                 # ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd
                                                                 # semodule -X 300 -i my-virtlogd.pp
                                                                 
Feb 23 09:24:07 np0005626463.localdomain sshd[207453]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:24:08 np0005626463.localdomain sudo[207485]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xxeseqbpbdghjaodzakkvdwjdsminzau ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838647.7480574-2973-174523200420253/AnsiballZ_systemd.py
Feb 23 09:24:08 np0005626463.localdomain sudo[207485]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:24:08 np0005626463.localdomain python3.9[207487]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtqemud.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 23 09:24:08 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 09:24:08 np0005626463.localdomain sshd[207453]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:24:08 np0005626463.localdomain systemd-sysv-generator[207517]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 09:24:08 np0005626463.localdomain systemd-rc-local-generator[207511]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 09:24:08 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:24:08 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 23 09:24:08 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:24:08 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:24:08 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 09:24:08 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 23 09:24:08 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:24:08 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:24:08 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:24:08 np0005626463.localdomain systemd[1]: Listening on libvirt locking daemon socket.
Feb 23 09:24:08 np0005626463.localdomain systemd[1]: Starting libvirt QEMU daemon socket...
Feb 23 09:24:08 np0005626463.localdomain systemd[1]: Listening on libvirt QEMU daemon socket.
Feb 23 09:24:08 np0005626463.localdomain systemd[1]: Starting libvirt QEMU daemon admin socket...
Feb 23 09:24:08 np0005626463.localdomain systemd[1]: Starting libvirt QEMU daemon read-only socket...
Feb 23 09:24:08 np0005626463.localdomain systemd[1]: Listening on libvirt QEMU daemon admin socket.
Feb 23 09:24:08 np0005626463.localdomain systemd[1]: Listening on libvirt QEMU daemon read-only socket.
Feb 23 09:24:08 np0005626463.localdomain systemd[1]: Started libvirt QEMU daemon.
Feb 23 09:24:08 np0005626463.localdomain sudo[207485]: pam_unix(sudo:session): session closed for user root
Feb 23 09:24:09 np0005626463.localdomain sudo[207667]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ivwznautxvzglwfzmlhzsxjggbjmmhxt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838648.8355963-2973-37769407803002/AnsiballZ_systemd.py
Feb 23 09:24:09 np0005626463.localdomain sudo[207667]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:24:09 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25896 DF PROTO=TCP SPT=33876 DPT=9102 SEQ=3029223417 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEF92070000000001030307) 
Feb 23 09:24:09 np0005626463.localdomain python3.9[207669]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtsecretd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 23 09:24:09 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 09:24:09 np0005626463.localdomain systemd-rc-local-generator[207702]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 09:24:09 np0005626463.localdomain systemd-sysv-generator[207709]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 09:24:09 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:24:09 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 23 09:24:09 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:24:09 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:24:09 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 09:24:09 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 23 09:24:09 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:24:09 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:24:09 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:24:09 np0005626463.localdomain systemd[1]: Starting libvirt secret daemon socket...
Feb 23 09:24:09 np0005626463.localdomain systemd[1]: Listening on libvirt secret daemon socket.
Feb 23 09:24:09 np0005626463.localdomain systemd[1]: Starting libvirt secret daemon admin socket...
Feb 23 09:24:09 np0005626463.localdomain systemd[1]: Starting libvirt secret daemon read-only socket...
Feb 23 09:24:09 np0005626463.localdomain systemd[1]: Listening on libvirt secret daemon admin socket.
Feb 23 09:24:09 np0005626463.localdomain systemd[1]: Listening on libvirt secret daemon read-only socket.
Feb 23 09:24:09 np0005626463.localdomain systemd[1]: Started libvirt secret daemon.
Feb 23 09:24:09 np0005626463.localdomain sudo[207667]: pam_unix(sudo:session): session closed for user root
Feb 23 09:24:10 np0005626463.localdomain sudo[207850]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cykiimknkweomjrodoiasrzykevitbdz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838650.649428-3084-23848099047849/AnsiballZ_file.py
Feb 23 09:24:10 np0005626463.localdomain sudo[207850]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:24:11 np0005626463.localdomain python3.9[207852]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:24:11 np0005626463.localdomain sudo[207850]: pam_unix(sudo:session): session closed for user root
Feb 23 09:24:11 np0005626463.localdomain sudo[207960]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wqffdygoozwuhnknemculelztycmxeve ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838651.3059254-3108-259802387567293/AnsiballZ_find.py
Feb 23 09:24:11 np0005626463.localdomain sudo[207960]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:24:11 np0005626463.localdomain python3.9[207962]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.conf'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Feb 23 09:24:11 np0005626463.localdomain sudo[207960]: pam_unix(sudo:session): session closed for user root
Feb 23 09:24:12 np0005626463.localdomain sudo[208070]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-khgfalyhswopaeglozjpbplbmojlcjku ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838651.9705153-3132-268865048801774/AnsiballZ_command.py
Feb 23 09:24:12 np0005626463.localdomain sudo[208070]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:24:12 np0005626463.localdomain python3.9[208072]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail;
                                                            echo ceph
                                                            awk -F '=' '/fsid/ {print $2}' /var/lib/openstack/config/ceph/ceph.conf | xargs
                                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 09:24:12 np0005626463.localdomain sudo[208070]: pam_unix(sudo:session): session closed for user root
Feb 23 09:24:12 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12732 DF PROTO=TCP SPT=41654 DPT=9882 SEQ=4265702725 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEF9E460000000001030307) 
Feb 23 09:24:13 np0005626463.localdomain python3.9[208184]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.keyring'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Feb 23 09:24:14 np0005626463.localdomain python3.9[208292]: ansible-ansible.legacy.stat Invoked with path=/tmp/secret.xml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:24:14 np0005626463.localdomain python3.9[208378]: ansible-ansible.legacy.copy Invoked with dest=/tmp/secret.xml mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1771838653.633911-3189-164760666317036/.source.xml follow=False _original_basename=secret.xml.j2 checksum=9110e86c46036bf6b9c9b3a9e049196c9a537971 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:24:14 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10338 DF PROTO=TCP SPT=40756 DPT=9105 SEQ=3078679058 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEFA8060000000001030307) 
Feb 23 09:24:15 np0005626463.localdomain sudo[208486]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vjpabmaclacngnutglubvxptgvphqprx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838654.785835-3234-104494493635148/AnsiballZ_command.py
Feb 23 09:24:15 np0005626463.localdomain sudo[208486]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:24:15 np0005626463.localdomain python3.9[208488]: ansible-ansible.legacy.command Invoked with _raw_params=virsh secret-undefine f1fea371-cb69-578d-a3d0-b5c472a84b46
                                                            virsh secret-define --file /tmp/secret.xml
                                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 09:24:15 np0005626463.localdomain polkitd[1033]: Registered Authentication Agent for unix-process:208490:1009545 (system bus name :1.2848 [pkttyagent --process 208490 --notify-fd 4 --fallback], object path /org/freedesktop/PolicyKit1/AuthenticationAgent, locale C.UTF-8)
Feb 23 09:24:15 np0005626463.localdomain polkitd[1033]: Unregistered Authentication Agent for unix-process:208490:1009545 (system bus name :1.2848, object path /org/freedesktop/PolicyKit1/AuthenticationAgent, locale C.UTF-8) (disconnected from bus)
Feb 23 09:24:15 np0005626463.localdomain polkitd[1033]: Registered Authentication Agent for unix-process:208489:1009544 (system bus name :1.2849 [pkttyagent --process 208489 --notify-fd 4 --fallback], object path /org/freedesktop/PolicyKit1/AuthenticationAgent, locale C.UTF-8)
Feb 23 09:24:15 np0005626463.localdomain polkitd[1033]: Unregistered Authentication Agent for unix-process:208489:1009544 (system bus name :1.2849, object path /org/freedesktop/PolicyKit1/AuthenticationAgent, locale C.UTF-8) (disconnected from bus)
Feb 23 09:24:15 np0005626463.localdomain sudo[208486]: pam_unix(sudo:session): session closed for user root
Feb 23 09:24:16 np0005626463.localdomain python3.9[208608]: ansible-ansible.builtin.file Invoked with path=/tmp/secret.xml state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:24:16 np0005626463.localdomain sudo[208716]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dgyiievjdtucszlethydszjumassrnvc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838656.2877345-3282-72873742804712/AnsiballZ_command.py
Feb 23 09:24:16 np0005626463.localdomain sudo[208716]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:24:16 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.
Feb 23 09:24:16 np0005626463.localdomain podman[208719]: 2026-02-23 09:24:16.672753632 +0000 UTC m=+0.089666714 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 23 09:24:16 np0005626463.localdomain podman[208719]: 2026-02-23 09:24:16.71022013 +0000 UTC m=+0.127133192 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, config_id=ovn_controller, org.label-schema.license=GPLv2)
Feb 23 09:24:16 np0005626463.localdomain systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully.
Feb 23 09:24:16 np0005626463.localdomain sudo[208716]: pam_unix(sudo:session): session closed for user root
Feb 23 09:24:17 np0005626463.localdomain sudo[208853]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-krupplcywdfdzdxhkzgiunjndhqdehvd ; FSID=f1fea371-cb69-578d-a3d0-b5c472a84b46 KEY=AQA8BJxpAAAAABAADfNKfItRio8Io0CCzlDCWQ== /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838656.9696527-3306-174615409757858/AnsiballZ_command.py
Feb 23 09:24:17 np0005626463.localdomain sudo[208853]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:24:17 np0005626463.localdomain polkitd[1033]: Registered Authentication Agent for unix-process:208856:1009760 (system bus name :1.2852 [pkttyagent --process 208856 --notify-fd 4 --fallback], object path /org/freedesktop/PolicyKit1/AuthenticationAgent, locale C.UTF-8)
Feb 23 09:24:17 np0005626463.localdomain polkitd[1033]: Unregistered Authentication Agent for unix-process:208856:1009760 (system bus name :1.2852, object path /org/freedesktop/PolicyKit1/AuthenticationAgent, locale C.UTF-8) (disconnected from bus)
Feb 23 09:24:17 np0005626463.localdomain sudo[208853]: pam_unix(sudo:session): session closed for user root
Feb 23 09:24:17 np0005626463.localdomain systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@1.service: Deactivated successfully.
Feb 23 09:24:17 np0005626463.localdomain systemd[1]: setroubleshootd.service: Deactivated successfully.
Feb 23 09:24:18 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43482 DF PROTO=TCP SPT=49158 DPT=9101 SEQ=3915859774 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEFB6060000000001030307) 
Feb 23 09:24:18 np0005626463.localdomain sudo[208969]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ruunranbfbnnetfuuewccgbxmuqbvsfu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838658.6565595-3330-51421973866140/AnsiballZ_copy.py
Feb 23 09:24:18 np0005626463.localdomain sudo[208969]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:24:19 np0005626463.localdomain python3.9[208971]: ansible-ansible.legacy.copy Invoked with dest=/etc/ceph/ceph.conf group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/config/ceph/ceph.conf backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:24:19 np0005626463.localdomain sudo[208969]: pam_unix(sudo:session): session closed for user root
Feb 23 09:24:19 np0005626463.localdomain sudo[209079]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mxazjkdohrjuxbolqqyioukvestdsibr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838659.4437988-3354-271272366115848/AnsiballZ_stat.py
Feb 23 09:24:19 np0005626463.localdomain sudo[209079]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:24:19 np0005626463.localdomain python3.9[209081]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/libvirt.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:24:19 np0005626463.localdomain sudo[209079]: pam_unix(sudo:session): session closed for user root
Feb 23 09:24:20 np0005626463.localdomain sudo[209167]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ojpfxbpwiaxbwvdmhpyvtdqidjptdjyt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838659.4437988-3354-271272366115848/AnsiballZ_copy.py
Feb 23 09:24:20 np0005626463.localdomain sudo[209167]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:24:20 np0005626463.localdomain python3.9[209169]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/libvirt.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1771838659.4437988-3354-271272366115848/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=dc5ee7162311c27a6084cbee4052b901d56cb1ba backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:24:20 np0005626463.localdomain sudo[209170]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 09:24:20 np0005626463.localdomain sudo[209170]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:24:20 np0005626463.localdomain sudo[209170]: pam_unix(sudo:session): session closed for user root
Feb 23 09:24:20 np0005626463.localdomain sudo[209167]: pam_unix(sudo:session): session closed for user root
Feb 23 09:24:21 np0005626463.localdomain sudo[209194]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Feb 23 09:24:21 np0005626463.localdomain sudo[209194]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:24:21 np0005626463.localdomain sshd[209223]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:24:21 np0005626463.localdomain sudo[209354]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nacojqttbvfdvqyhnobdgpwgiebnutqg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838661.2872503-3402-137325430780223/AnsiballZ_file.py
Feb 23 09:24:21 np0005626463.localdomain sudo[209354]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:24:21 np0005626463.localdomain sshd[209223]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:24:21 np0005626463.localdomain python3.9[209358]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:24:21 np0005626463.localdomain sudo[209354]: pam_unix(sudo:session): session closed for user root
Feb 23 09:24:21 np0005626463.localdomain systemd[1]: tmp-crun.iyeqxf.mount: Deactivated successfully.
Feb 23 09:24:21 np0005626463.localdomain podman[209389]: 2026-02-23 09:24:21.897135662 +0000 UTC m=+0.098747059 container exec fdf07215f0388d0ebc44f1f3744080ba594441e647c300d0dade62ff5beba234 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-crash-np0005626463, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, CEPH_POINT_RELEASE=, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-type=git, version=7, com.redhat.component=rhceph-container, distribution-scope=public, RELEASE=main, build-date=2026-02-09T10:25:24Z, org.opencontainers.image.created=2026-02-09T10:25:24Z, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, name=rhceph, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, release=1770267347, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.42.2)
Feb 23 09:24:22 np0005626463.localdomain podman[209389]: 2026-02-23 09:24:22.023433612 +0000 UTC m=+0.225044999 container exec_died fdf07215f0388d0ebc44f1f3744080ba594441e647c300d0dade62ff5beba234 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-crash-np0005626463, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, name=rhceph, release=1770267347, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2026-02-09T10:25:24Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.42.2, GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, ceph=True, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., architecture=x86_64, vcs-type=git, build-date=2026-02-09T10:25:24Z, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, io.openshift.expose-services=, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14)
Feb 23 09:24:22 np0005626463.localdomain sudo[209194]: pam_unix(sudo:session): session closed for user root
Feb 23 09:24:22 np0005626463.localdomain sudo[209564]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ggovfkjrnpzttjhxrdjtdwomijvebiol ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838662.0449011-3426-196838351008598/AnsiballZ_stat.py
Feb 23 09:24:22 np0005626463.localdomain sudo[209564]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:24:22 np0005626463.localdomain sudo[209567]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 09:24:22 np0005626463.localdomain sudo[209567]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:24:22 np0005626463.localdomain sudo[209567]: pam_unix(sudo:session): session closed for user root
Feb 23 09:24:22 np0005626463.localdomain sudo[209585]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 23 09:24:22 np0005626463.localdomain sudo[209585]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:24:22 np0005626463.localdomain python3.9[209566]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:24:22 np0005626463.localdomain sudo[209564]: pam_unix(sudo:session): session closed for user root
Feb 23 09:24:22 np0005626463.localdomain sudo[209672]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gmeqoklawiwqzjterpxirqsbthufzofp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838662.0449011-3426-196838351008598/AnsiballZ_file.py
Feb 23 09:24:22 np0005626463.localdomain sudo[209672]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:24:23 np0005626463.localdomain python3.9[209674]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:24:23 np0005626463.localdomain sudo[209672]: pam_unix(sudo:session): session closed for user root
Feb 23 09:24:23 np0005626463.localdomain sudo[209585]: pam_unix(sudo:session): session closed for user root
Feb 23 09:24:23 np0005626463.localdomain sudo[209799]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lokybpipeasciyiomlnvgpfvuxmnildr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838663.2350152-3462-50979802928269/AnsiballZ_stat.py
Feb 23 09:24:23 np0005626463.localdomain sudo[209799]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:24:23 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.
Feb 23 09:24:23 np0005626463.localdomain sudo[209803]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 09:24:23 np0005626463.localdomain sudo[209803]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:24:23 np0005626463.localdomain sudo[209803]: pam_unix(sudo:session): session closed for user root
Feb 23 09:24:23 np0005626463.localdomain podman[209802]: 2026-02-23 09:24:23.622263636 +0000 UTC m=+0.087095873 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, tcib_managed=true)
Feb 23 09:24:23 np0005626463.localdomain podman[209802]: 2026-02-23 09:24:23.628758819 +0000 UTC m=+0.093591126 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20260216, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.43.0, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Feb 23 09:24:23 np0005626463.localdomain systemd[1]: 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully.
Feb 23 09:24:23 np0005626463.localdomain python3.9[209801]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:24:23 np0005626463.localdomain sudo[209799]: pam_unix(sudo:session): session closed for user root
Feb 23 09:24:23 np0005626463.localdomain sudo[209891]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ptioifrnwgumbtwnrjwhqhhekijnvsee ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838663.2350152-3462-50979802928269/AnsiballZ_file.py
Feb 23 09:24:23 np0005626463.localdomain sudo[209891]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:24:24 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12312 DF PROTO=TCP SPT=41600 DPT=9102 SEQ=1440645661 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEFCBA00000000001030307) 
Feb 23 09:24:24 np0005626463.localdomain python3.9[209893]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.qssxrmrn recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:24:24 np0005626463.localdomain sudo[209891]: pam_unix(sudo:session): session closed for user root
Feb 23 09:24:24 np0005626463.localdomain sudo[210001]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xpfwalzqnzhylvcasdhczsjqaayzamge ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838664.3889768-3498-76742465902313/AnsiballZ_stat.py
Feb 23 09:24:24 np0005626463.localdomain sudo[210001]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:24:24 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12734 DF PROTO=TCP SPT=41654 DPT=9882 SEQ=4265702725 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEFCE060000000001030307) 
Feb 23 09:24:24 np0005626463.localdomain python3.9[210003]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:24:24 np0005626463.localdomain sudo[210001]: pam_unix(sudo:session): session closed for user root
Feb 23 09:24:25 np0005626463.localdomain sudo[210058]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bybdgwaxxouoxwsayxqvzhbeqlidtjld ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838664.3889768-3498-76742465902313/AnsiballZ_file.py
Feb 23 09:24:25 np0005626463.localdomain sudo[210058]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:24:25 np0005626463.localdomain python3.9[210060]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:24:25 np0005626463.localdomain sudo[210058]: pam_unix(sudo:session): session closed for user root
Feb 23 09:24:25 np0005626463.localdomain sudo[210168]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ktmakkrdbnpyixlqqacvchrjstftslxq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838665.5757468-3537-186658356501851/AnsiballZ_command.py
Feb 23 09:24:25 np0005626463.localdomain sudo[210168]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:24:26 np0005626463.localdomain python3.9[210170]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 09:24:26 np0005626463.localdomain sudo[210168]: pam_unix(sudo:session): session closed for user root
Feb 23 09:24:26 np0005626463.localdomain sudo[210279]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pfqbqvfpvcqhnejxwlqppogxbmqeoadv ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1771838666.3010476-3561-184430005784257/AnsiballZ_edpm_nftables_from_files.py
Feb 23 09:24:26 np0005626463.localdomain sudo[210279]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:24:26 np0005626463.localdomain python3[210281]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Feb 23 09:24:26 np0005626463.localdomain sudo[210279]: pam_unix(sudo:session): session closed for user root
Feb 23 09:24:27 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12314 DF PROTO=TCP SPT=41600 DPT=9102 SEQ=1440645661 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEFD7C70000000001030307) 
Feb 23 09:24:27 np0005626463.localdomain sudo[210389]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-srxfzrxfisyrpgabsdbyevoiqunvzazk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838667.1866446-3585-152488355889843/AnsiballZ_stat.py
Feb 23 09:24:27 np0005626463.localdomain sudo[210389]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:24:27 np0005626463.localdomain python3.9[210391]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:24:27 np0005626463.localdomain sudo[210389]: pam_unix(sudo:session): session closed for user root
Feb 23 09:24:27 np0005626463.localdomain sudo[210446]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tfkunlzlvztmteatzshddnddblomjdga ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838667.1866446-3585-152488355889843/AnsiballZ_file.py
Feb 23 09:24:27 np0005626463.localdomain sudo[210446]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:24:28 np0005626463.localdomain python3.9[210448]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:24:28 np0005626463.localdomain sudo[210446]: pam_unix(sudo:session): session closed for user root
Feb 23 09:24:28 np0005626463.localdomain sudo[210556]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ouuzrigbvmpostynvgztyyaqwhghtyde ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838668.3358357-3621-160336285168455/AnsiballZ_stat.py
Feb 23 09:24:28 np0005626463.localdomain sudo[210556]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:24:28 np0005626463.localdomain python3.9[210558]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:24:28 np0005626463.localdomain sudo[210556]: pam_unix(sudo:session): session closed for user root
Feb 23 09:24:29 np0005626463.localdomain sudo[210646]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hkhushaazmyyvodpnoorjivbxhdrjend ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838668.3358357-3621-160336285168455/AnsiballZ_copy.py
Feb 23 09:24:29 np0005626463.localdomain sudo[210646]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:24:29 np0005626463.localdomain python3.9[210648]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771838668.3358357-3621-160336285168455/.source.nft follow=False _original_basename=jump-chain.j2 checksum=3ce353c89bce3b135a0ed688d4e338b2efb15185 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:24:29 np0005626463.localdomain sudo[210646]: pam_unix(sudo:session): session closed for user root
Feb 23 09:24:29 np0005626463.localdomain sudo[210756]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nhntlxgsinwyoggypgnahvmozjpyjkmq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838669.6499789-3666-180406082458148/AnsiballZ_stat.py
Feb 23 09:24:29 np0005626463.localdomain sudo[210756]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:24:30 np0005626463.localdomain python3.9[210758]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:24:30 np0005626463.localdomain sudo[210756]: pam_unix(sudo:session): session closed for user root
Feb 23 09:24:30 np0005626463.localdomain sudo[210813]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aezupcrxvxzxhvkazpbwzgursbmckybp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838669.6499789-3666-180406082458148/AnsiballZ_file.py
Feb 23 09:24:30 np0005626463.localdomain sudo[210813]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:24:30 np0005626463.localdomain python3.9[210815]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:24:30 np0005626463.localdomain sudo[210813]: pam_unix(sudo:session): session closed for user root
Feb 23 09:24:30 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6740 DF PROTO=TCP SPT=39358 DPT=9100 SEQ=3465392032 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEFE6060000000001030307) 
Feb 23 09:24:31 np0005626463.localdomain sudo[210923]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xfrdmxdoydlvrvfkatswbqselrohvasz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838670.8364277-3702-156014311604915/AnsiballZ_stat.py
Feb 23 09:24:31 np0005626463.localdomain sudo[210923]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:24:31 np0005626463.localdomain python3.9[210925]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:24:31 np0005626463.localdomain sudo[210923]: pam_unix(sudo:session): session closed for user root
Feb 23 09:24:31 np0005626463.localdomain sudo[210980]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xescbuvyeppwihhafteaqdkwjzrfhttv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838670.8364277-3702-156014311604915/AnsiballZ_file.py
Feb 23 09:24:31 np0005626463.localdomain sudo[210980]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:24:31 np0005626463.localdomain python3.9[210982]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:24:31 np0005626463.localdomain sudo[210980]: pam_unix(sudo:session): session closed for user root
Feb 23 09:24:32 np0005626463.localdomain sudo[211090]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oercxvstdjujcuovtmbgjvlzmupbkuxt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838672.0535352-3738-176109576668582/AnsiballZ_stat.py
Feb 23 09:24:32 np0005626463.localdomain sudo[211090]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:24:32 np0005626463.localdomain python3.9[211092]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:24:32 np0005626463.localdomain sudo[211090]: pam_unix(sudo:session): session closed for user root
Feb 23 09:24:32 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6741 DF PROTO=TCP SPT=39358 DPT=9100 SEQ=3465392032 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEFEE060000000001030307) 
Feb 23 09:24:33 np0005626463.localdomain sudo[211180]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dehgnczygooffcqflmwgjxoxyfhkpltq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838672.0535352-3738-176109576668582/AnsiballZ_copy.py
Feb 23 09:24:33 np0005626463.localdomain sudo[211180]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:24:33 np0005626463.localdomain python3.9[211182]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771838672.0535352-3738-176109576668582/.source.nft follow=False _original_basename=ruleset.j2 checksum=e2e2635f27347d386f310e86d2b40c40289835bb backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:24:33 np0005626463.localdomain sudo[211180]: pam_unix(sudo:session): session closed for user root
Feb 23 09:24:33 np0005626463.localdomain sudo[211290]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-douhtpapwerwdcmxwrfvofbrmbqrogsz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838673.5399148-3783-156148424736890/AnsiballZ_file.py
Feb 23 09:24:33 np0005626463.localdomain sudo[211290]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:24:34 np0005626463.localdomain python3.9[211292]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:24:34 np0005626463.localdomain sudo[211290]: pam_unix(sudo:session): session closed for user root
Feb 23 09:24:34 np0005626463.localdomain sudo[211400]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yfnzthwsyyehjlwcyakxzcxxwpazmlox ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838674.1877213-3807-187273938510143/AnsiballZ_command.py
Feb 23 09:24:34 np0005626463.localdomain sudo[211400]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:24:34 np0005626463.localdomain python3.9[211402]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 09:24:34 np0005626463.localdomain sudo[211400]: pam_unix(sudo:session): session closed for user root
Feb 23 09:24:35 np0005626463.localdomain sudo[211513]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nstvqtmhkqfshfyqgpxafsuzwpkkhgtu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838674.874522-3831-84701795539367/AnsiballZ_blockinfile.py
Feb 23 09:24:35 np0005626463.localdomain sudo[211513]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:24:35 np0005626463.localdomain python3.9[211515]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                                            include "/etc/nftables/edpm-chains.nft"
                                                            include "/etc/nftables/edpm-rules.nft"
                                                            include "/etc/nftables/edpm-jumps.nft"
                                                             path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:24:35 np0005626463.localdomain sudo[211513]: pam_unix(sudo:session): session closed for user root
Feb 23 09:24:36 np0005626463.localdomain sudo[211623]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rwhvdxnadkelrduaoyusmyoiwwtbfnfj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838675.8463411-3858-241542188901763/AnsiballZ_command.py
Feb 23 09:24:36 np0005626463.localdomain sudo[211623]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:24:36 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18593 DF PROTO=TCP SPT=56764 DPT=9101 SEQ=2878013049 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEFFB060000000001030307) 
Feb 23 09:24:36 np0005626463.localdomain python3.9[211625]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 09:24:36 np0005626463.localdomain sudo[211623]: pam_unix(sudo:session): session closed for user root
Feb 23 09:24:36 np0005626463.localdomain sudo[211734]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aydrtxvybwmyfuitzosreuxphtomvrta ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838676.5193748-3882-4068033274197/AnsiballZ_stat.py
Feb 23 09:24:36 np0005626463.localdomain sudo[211734]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:24:36 np0005626463.localdomain python3.9[211736]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 23 09:24:37 np0005626463.localdomain sudo[211734]: pam_unix(sudo:session): session closed for user root
Feb 23 09:24:37 np0005626463.localdomain sudo[211846]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-slmyjycrvoqzcjlssrvsjrizfowyqegs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838677.2059119-3906-260833259808691/AnsiballZ_command.py
Feb 23 09:24:37 np0005626463.localdomain sudo[211846]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:24:37 np0005626463.localdomain python3.9[211848]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 09:24:37 np0005626463.localdomain sudo[211846]: pam_unix(sudo:session): session closed for user root
Feb 23 09:24:38 np0005626463.localdomain sudo[211959]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-apulkvpxwcjpkzolfeinblgqpveokvcf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838678.6506822-3930-80307391556598/AnsiballZ_file.py
Feb 23 09:24:38 np0005626463.localdomain sudo[211959]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:24:39 np0005626463.localdomain python3.9[211961]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:24:39 np0005626463.localdomain sudo[211959]: pam_unix(sudo:session): session closed for user root
Feb 23 09:24:39 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52456 DF PROTO=TCP SPT=50762 DPT=9882 SEQ=3167604538 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF0075B0000000001030307) 
Feb 23 09:24:39 np0005626463.localdomain sudo[212069]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vrpezmfwyrvbrubddatnbmsamashmefv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838679.3368304-3954-98544778554840/AnsiballZ_stat.py
Feb 23 09:24:39 np0005626463.localdomain sudo[212069]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:24:39 np0005626463.localdomain python3.9[212071]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:24:39 np0005626463.localdomain sudo[212069]: pam_unix(sudo:session): session closed for user root
Feb 23 09:24:40 np0005626463.localdomain sudo[212157]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kcmhvvqmfrkjjmztgjwgjtnguukgjzvc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838679.3368304-3954-98544778554840/AnsiballZ_copy.py
Feb 23 09:24:40 np0005626463.localdomain sudo[212157]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:24:40 np0005626463.localdomain python3.9[212159]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771838679.3368304-3954-98544778554840/.source.target follow=False _original_basename=edpm_libvirt.target checksum=13035a1aa0f414c677b14be9a5a363b6623d393c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:24:40 np0005626463.localdomain sudo[212157]: pam_unix(sudo:session): session closed for user root
Feb 23 09:24:41 np0005626463.localdomain sudo[212267]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ftkqbheoaqlpgqgrrrvoemyeyxfeekqm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838681.0290236-3999-281213604590977/AnsiballZ_stat.py
Feb 23 09:24:41 np0005626463.localdomain sudo[212267]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:24:41 np0005626463.localdomain python3.9[212269]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt_guests.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:24:41 np0005626463.localdomain sudo[212267]: pam_unix(sudo:session): session closed for user root
Feb 23 09:24:41 np0005626463.localdomain sudo[212355]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zigfdayljmrpygpypjpkczcourzjuykl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838681.0290236-3999-281213604590977/AnsiballZ_copy.py
Feb 23 09:24:41 np0005626463.localdomain sudo[212355]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:24:42 np0005626463.localdomain python3.9[212357]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt_guests.service mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771838681.0290236-3999-281213604590977/.source.service follow=False _original_basename=edpm_libvirt_guests.service checksum=db83430a42fc2ccfd6ed8b56ebf04f3dff9cd0cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:24:42 np0005626463.localdomain sudo[212355]: pam_unix(sudo:session): session closed for user root
Feb 23 09:24:42 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52458 DF PROTO=TCP SPT=50762 DPT=9882 SEQ=3167604538 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF013470000000001030307) 
Feb 23 09:24:42 np0005626463.localdomain sudo[212465]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bwqpsufbgxwqtpuxoxxlrhkoolwnumyc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838682.2423844-4044-43166436884493/AnsiballZ_stat.py
Feb 23 09:24:42 np0005626463.localdomain sudo[212465]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:24:42 np0005626463.localdomain python3.9[212467]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virt-guest-shutdown.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:24:42 np0005626463.localdomain sudo[212465]: pam_unix(sudo:session): session closed for user root
Feb 23 09:24:43 np0005626463.localdomain sudo[212553]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yiiolmtasnhxcxsijepadkefwbhxttrz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838682.2423844-4044-43166436884493/AnsiballZ_copy.py
Feb 23 09:24:43 np0005626463.localdomain sudo[212553]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:24:43 np0005626463.localdomain python3.9[212555]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virt-guest-shutdown.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771838682.2423844-4044-43166436884493/.source.target follow=False _original_basename=virt-guest-shutdown.target checksum=49ca149619c596cbba877418629d2cf8f7b0f5cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:24:43 np0005626463.localdomain sudo[212553]: pam_unix(sudo:session): session closed for user root
Feb 23 09:24:43 np0005626463.localdomain sudo[212663]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ijwxbllzgkqpfsndxoxcesxituimrhrn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838683.4894595-4089-115871355378443/AnsiballZ_systemd.py
Feb 23 09:24:43 np0005626463.localdomain sudo[212663]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:24:44 np0005626463.localdomain python3.9[212665]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt.target state=restarted daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 09:24:44 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 09:24:44 np0005626463.localdomain systemd-rc-local-generator[212694]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 09:24:44 np0005626463.localdomain systemd-sysv-generator[212697]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 09:24:44 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:24:44 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 23 09:24:44 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:24:44 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:24:44 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 09:24:44 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 23 09:24:44 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:24:44 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:24:44 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:24:44 np0005626463.localdomain systemd[1]: Reached target edpm_libvirt.target.
Feb 23 09:24:44 np0005626463.localdomain sudo[212663]: pam_unix(sudo:session): session closed for user root
Feb 23 09:24:45 np0005626463.localdomain sudo[212814]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-laacljedfpptjpcswcorpnqggelmhkvu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838684.790973-4113-123181284089147/AnsiballZ_systemd.py
Feb 23 09:24:45 np0005626463.localdomain sudo[212814]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:24:45 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8888 DF PROTO=TCP SPT=58522 DPT=9105 SEQ=2047497438 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF01E060000000001030307) 
Feb 23 09:24:45 np0005626463.localdomain python3.9[212816]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt_guests daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Feb 23 09:24:45 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 09:24:45 np0005626463.localdomain systemd-sysv-generator[212845]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 09:24:45 np0005626463.localdomain systemd-rc-local-generator[212837]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 09:24:45 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:24:45 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 23 09:24:45 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:24:45 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:24:45 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 09:24:45 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 23 09:24:45 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:24:45 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:24:45 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:24:45 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 09:24:45 np0005626463.localdomain systemd-rc-local-generator[212875]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 09:24:45 np0005626463.localdomain systemd-sysv-generator[212880]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 09:24:45 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:24:45 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 23 09:24:45 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:24:45 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:24:45 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 09:24:45 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 23 09:24:45 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:24:45 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:24:45 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:24:46 np0005626463.localdomain sudo[212814]: pam_unix(sudo:session): session closed for user root
Feb 23 09:24:46 np0005626463.localdomain sshd[163969]: pam_unix(sshd:session): session closed for user zuul
Feb 23 09:24:46 np0005626463.localdomain systemd[1]: session-52.scope: Deactivated successfully.
Feb 23 09:24:46 np0005626463.localdomain systemd[1]: session-52.scope: Consumed 3min 28.951s CPU time.
Feb 23 09:24:46 np0005626463.localdomain systemd-logind[759]: Session 52 logged out. Waiting for processes to exit.
Feb 23 09:24:46 np0005626463.localdomain systemd-logind[759]: Removed session 52.
Feb 23 09:24:46 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.
Feb 23 09:24:46 np0005626463.localdomain podman[212906]: 2026-02-23 09:24:46.923613021 +0000 UTC m=+0.093857648 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, container_name=ovn_controller, io.buildah.version=1.43.0, managed_by=edpm_ansible, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Feb 23 09:24:46 np0005626463.localdomain podman[212906]: 2026-02-23 09:24:46.968193387 +0000 UTC m=+0.138438034 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Feb 23 09:24:46 np0005626463.localdomain systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully.
Feb 23 09:24:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:24:48.524 163572 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 09:24:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:24:48.525 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 09:24:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:24:48.527 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 09:24:48 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18595 DF PROTO=TCP SPT=56764 DPT=9101 SEQ=2878013049 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF02C070000000001030307) 
Feb 23 09:24:48 np0005626463.localdomain sshd[212929]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:24:49 np0005626463.localdomain sshd[212929]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:24:52 np0005626463.localdomain sshd[212931]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:24:52 np0005626463.localdomain sshd[212931]: Accepted publickey for zuul from 192.168.122.30 port 56194 ssh2: RSA SHA256:/ShS2J5Dq7o9P59e/NmgQORSAcJOBwu46Huo03HBdB4
Feb 23 09:24:52 np0005626463.localdomain systemd-logind[759]: New session 53 of user zuul.
Feb 23 09:24:52 np0005626463.localdomain systemd[1]: Started Session 53 of User zuul.
Feb 23 09:24:52 np0005626463.localdomain sshd[212931]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Feb 23 09:24:53 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.
Feb 23 09:24:53 np0005626463.localdomain python3.9[213042]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 23 09:24:53 np0005626463.localdomain podman[213043]: 2026-02-23 09:24:53.91850168 +0000 UTC m=+0.084801320 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ovn_metadata_agent, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Feb 23 09:24:53 np0005626463.localdomain podman[213043]: 2026-02-23 09:24:53.958336823 +0000 UTC m=+0.124636433 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ovn_metadata_agent, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Feb 23 09:24:53 np0005626463.localdomain systemd[1]: 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully.
Feb 23 09:24:54 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46915 DF PROTO=TCP SPT=41742 DPT=9102 SEQ=1607567423 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF040D00000000001030307) 
Feb 23 09:24:54 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52460 DF PROTO=TCP SPT=50762 DPT=9882 SEQ=3167604538 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF044070000000001030307) 
Feb 23 09:24:55 np0005626463.localdomain python3.9[213172]: ansible-ansible.builtin.service_facts Invoked
Feb 23 09:24:55 np0005626463.localdomain network[213189]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Feb 23 09:24:55 np0005626463.localdomain network[213190]: 'network-scripts' will be removed from distribution in near future.
Feb 23 09:24:55 np0005626463.localdomain network[213191]: It is advised to switch to 'NetworkManager' instead for network management.
Feb 23 09:24:57 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 09:24:57 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46917 DF PROTO=TCP SPT=41742 DPT=9102 SEQ=1607567423 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF04CC60000000001030307) 
Feb 23 09:25:00 np0005626463.localdomain sudo[213421]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dzpqlgpytbwqybezdwvdqmblibbnmebl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838699.9408042-96-191887983160784/AnsiballZ_setup.py
Feb 23 09:25:00 np0005626463.localdomain sudo[213421]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:25:00 np0005626463.localdomain python3.9[213423]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 23 09:25:00 np0005626463.localdomain sudo[213421]: pam_unix(sudo:session): session closed for user root
Feb 23 09:25:00 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41264 DF PROTO=TCP SPT=35660 DPT=9100 SEQ=2940103577 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF05B460000000001030307) 
Feb 23 09:25:01 np0005626463.localdomain sudo[213484]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zkctujyvndxnnemcualtpkdlstxixyvn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838699.9408042-96-191887983160784/AnsiballZ_dnf.py
Feb 23 09:25:01 np0005626463.localdomain sudo[213484]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:25:01 np0005626463.localdomain python3.9[213486]: ansible-ansible.legacy.dnf Invoked with name=['iscsi-initiator-utils'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 23 09:25:02 np0005626463.localdomain sshd[213489]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:25:02 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41265 DF PROTO=TCP SPT=35660 DPT=9100 SEQ=2940103577 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF063460000000001030307) 
Feb 23 09:25:06 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11836 DF PROTO=TCP SPT=37330 DPT=9101 SEQ=2381435964 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF070460000000001030307) 
Feb 23 09:25:07 np0005626463.localdomain sshd[213491]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:25:08 np0005626463.localdomain sshd[213489]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:25:08 np0005626463.localdomain sudo[213484]: pam_unix(sudo:session): session closed for user root
Feb 23 09:25:09 np0005626463.localdomain sshd[213491]: Invalid user user from 80.94.95.116 port 55504
Feb 23 09:25:09 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46919 DF PROTO=TCP SPT=41742 DPT=9102 SEQ=1607567423 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF07C060000000001030307) 
Feb 23 09:25:09 np0005626463.localdomain sudo[213600]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tiriqeiyylmksxjbnvyzhfxhuiksvmlg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838708.9984035-132-23317857898708/AnsiballZ_stat.py
Feb 23 09:25:09 np0005626463.localdomain sudo[213600]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:25:09 np0005626463.localdomain python3.9[213602]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated/iscsid/etc/iscsi follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 23 09:25:09 np0005626463.localdomain sudo[213600]: pam_unix(sudo:session): session closed for user root
Feb 23 09:25:09 np0005626463.localdomain sshd[213491]: Connection closed by invalid user user 80.94.95.116 port 55504 [preauth]
Feb 23 09:25:10 np0005626463.localdomain sudo[213712]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lkbaukryjipioyksalemtpqvcefjvewp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838709.935874-156-249260417505651/AnsiballZ_copy.py
Feb 23 09:25:10 np0005626463.localdomain sudo[213712]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:25:10 np0005626463.localdomain python3.9[213714]: ansible-ansible.legacy.copy Invoked with dest=/etc/iscsi mode=preserve remote_src=True src=/var/lib/config-data/puppet-generated/iscsid/etc/iscsi/ backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:25:10 np0005626463.localdomain sudo[213712]: pam_unix(sudo:session): session closed for user root
Feb 23 09:25:11 np0005626463.localdomain sudo[213822]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kywmzeldtjcnpobjqbfjdqxihmhnddqx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838711.2125595-180-119920897907287/AnsiballZ_command.py
Feb 23 09:25:11 np0005626463.localdomain sudo[213822]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:25:11 np0005626463.localdomain python3.9[213824]: ansible-ansible.legacy.command Invoked with _raw_params=mv "/var/lib/config-data/puppet-generated/iscsid/etc/iscsi" "/var/lib/config-data/puppet-generated/iscsid/etc/iscsi.adopted" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 09:25:11 np0005626463.localdomain sudo[213822]: pam_unix(sudo:session): session closed for user root
Feb 23 09:25:12 np0005626463.localdomain sudo[213933]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gdmvfcomlpsrmvolszcmdijatqoxiqpk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838711.9948106-204-169812990113724/AnsiballZ_command.py
Feb 23 09:25:12 np0005626463.localdomain sudo[213933]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:25:12 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29761 DF PROTO=TCP SPT=35032 DPT=9882 SEQ=2357979186 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF088860000000001030307) 
Feb 23 09:25:12 np0005626463.localdomain python3.9[213935]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/iscsi /var/lib/iscsi _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 09:25:12 np0005626463.localdomain sudo[213933]: pam_unix(sudo:session): session closed for user root
Feb 23 09:25:12 np0005626463.localdomain sudo[214044]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jrpconbfymombhzyxbzzokebachvmvhu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838712.639146-228-279910808820654/AnsiballZ_command.py
Feb 23 09:25:12 np0005626463.localdomain sudo[214044]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:25:13 np0005626463.localdomain python3.9[214046]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -rF /etc/iscsi /var/lib/iscsi _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 09:25:13 np0005626463.localdomain sudo[214044]: pam_unix(sudo:session): session closed for user root
Feb 23 09:25:14 np0005626463.localdomain sudo[214155]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uevofzjgkehxuyfimgnklqvunbucrcwe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838714.0567217-255-237945882655501/AnsiballZ_stat.py
Feb 23 09:25:14 np0005626463.localdomain sudo[214155]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:25:14 np0005626463.localdomain python3.9[214157]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/.initiator_reset follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 23 09:25:14 np0005626463.localdomain sudo[214155]: pam_unix(sudo:session): session closed for user root
Feb 23 09:25:15 np0005626463.localdomain sudo[214267]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mvxvkwnhvltxsnodsnvjsgowowttwyiq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838714.8131177-288-253971772631235/AnsiballZ_lineinfile.py
Feb 23 09:25:15 np0005626463.localdomain sudo[214267]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:25:15 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41267 DF PROTO=TCP SPT=35660 DPT=9100 SEQ=2940103577 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF094060000000001030307) 
Feb 23 09:25:15 np0005626463.localdomain python3.9[214269]: ansible-ansible.builtin.lineinfile Invoked with insertafter=^#node.session.auth.chap.algs line=node.session.auth.chap_algs = SHA3-256,SHA256,SHA1,MD5 path=/etc/iscsi/iscsid.conf regexp=^node.session.auth.chap_algs state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:25:15 np0005626463.localdomain sudo[214267]: pam_unix(sudo:session): session closed for user root
Feb 23 09:25:16 np0005626463.localdomain sudo[214377]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gedvwawaghesoxxpweibecggvsvbldqt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838716.2524974-315-226456502206527/AnsiballZ_systemd_service.py
Feb 23 09:25:16 np0005626463.localdomain sudo[214377]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:25:17 np0005626463.localdomain python3.9[214379]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 09:25:17 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.
Feb 23 09:25:17 np0005626463.localdomain systemd[1]: Listening on Open-iSCSI iscsid Socket.
Feb 23 09:25:17 np0005626463.localdomain sudo[214377]: pam_unix(sudo:session): session closed for user root
Feb 23 09:25:17 np0005626463.localdomain podman[214381]: 2026-02-23 09:25:17.306717199 +0000 UTC m=+0.079761052 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.build-date=20260216, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.license=GPLv2)
Feb 23 09:25:17 np0005626463.localdomain podman[214381]: 2026-02-23 09:25:17.34423734 +0000 UTC m=+0.117281173 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, config_id=ovn_controller, org.label-schema.vendor=CentOS)
Feb 23 09:25:17 np0005626463.localdomain systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully.
Feb 23 09:25:17 np0005626463.localdomain sudo[214517]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fjngzovnxvjteznrsudonjhmpolkowsm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838717.4654326-339-47140970594232/AnsiballZ_systemd_service.py
Feb 23 09:25:17 np0005626463.localdomain sudo[214517]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:25:18 np0005626463.localdomain python3.9[214519]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 09:25:18 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 09:25:18 np0005626463.localdomain systemd-sysv-generator[214551]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 09:25:18 np0005626463.localdomain systemd-rc-local-generator[214546]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 09:25:18 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:25:18 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 23 09:25:18 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:25:18 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:25:18 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 09:25:18 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 23 09:25:18 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:25:18 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:25:18 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:25:18 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11838 DF PROTO=TCP SPT=37330 DPT=9101 SEQ=2381435964 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF0A0060000000001030307) 
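The kernel `DROPPING:` entries interleaved through this log look like output from a netfilter LOG rule with a `DROPPING: ` prefix (an assumption based on the format, not confirmed by this log); their key=value payload can be split mechanically, e.g.:

```python
import re

# One of the entries above, minus the syslog prefix.
LINE = ("DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 "
        "MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 "
        "PREC=0x00 TTL=62 ID=11838 DF PROTO=TCP SPT=37330 DPT=9101 "
        "SEQ=2381435964 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0")

def parse_drop(line: str) -> dict:
    """Split a netfilter LOG line into its KEY=value fields (bare flags like DF/SYN are skipped)."""
    return dict(re.findall(r"(\w+)=(\S*)", line))

fields = parse_drop(LINE)
print(f"{fields['SRC']} -> {fields['DST']}:{fields['DPT']} ({fields['PROTO']})")
# 192.168.122.10 -> 192.168.122.106:9101 (TCP)
```

The dropped SYNs in this log all target ports 9100-9102 and 9882, which suggests blocked scrapes of monitoring exporters, though that is an inference from the port numbers only.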
Feb 23 09:25:18 np0005626463.localdomain systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Feb 23 09:25:18 np0005626463.localdomain systemd[1]: Starting Open-iSCSI...
Feb 23 09:25:18 np0005626463.localdomain iscsid[214560]: iscsid: can't open InitiatorName configuration file /etc/iscsi/initiatorname.iscsi
Feb 23 09:25:18 np0005626463.localdomain iscsid[214560]: iscsid: Warning: InitiatorName file /etc/iscsi/initiatorname.iscsi does not exist or does not contain a properly formatted InitiatorName. If using software iscsi (iscsi_tcp or ib_iser) or partial offload (bnx2i or cxgbi iscsi), you may not be able to log into or discover targets. Please create a file /etc/iscsi/initiatorname.iscsi that contains a string with the format: InitiatorName=iqn.yyyy-mm.<reversed domain name>[:identifier].
Feb 23 09:25:18 np0005626463.localdomain iscsid[214560]: Example: InitiatorName=iqn.2001-04.com.redhat:fc6.
Feb 23 09:25:18 np0005626463.localdomain iscsid[214560]: If using hardware iscsi like qla4xxx this message can be ignored.
Feb 23 09:25:18 np0005626463.localdomain iscsid[214560]: iscsid: can't open InitiatorAlias configuration file /etc/iscsi/initiatorname.iscsi
Feb 23 09:25:18 np0005626463.localdomain iscsid[214560]: iscsid: can't open iscsid.safe_logout configuration file /etc/iscsi/iscsid.conf
Feb 23 09:25:18 np0005626463.localdomain iscsid[214560]: iscsid: can't open iscsid.ipc_auth_uid configuration file /etc/iscsi/iscsid.conf
Feb 23 09:25:18 np0005626463.localdomain systemd[1]: Started Open-iSCSI.
Feb 23 09:25:18 np0005626463.localdomain systemd[1]: Starting Logout of all iSCSI sessions on shutdown...
Feb 23 09:25:18 np0005626463.localdomain systemd[1]: Finished Logout of all iSCSI sessions on shutdown.
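The iscsid warnings above spell out the expected file format. A minimal sketch of creating and validating such a file (the IQN is illustrative, and a temporary path stands in for the real, root-owned /etc/iscsi/initiatorname.iscsi):

```python
import re
import tempfile
from pathlib import Path

# Illustrative IQN; on a real host write to /etc/iscsi/initiatorname.iscsi as root.
conf = Path(tempfile.mkdtemp()) / "initiatorname.iscsi"
conf.write_text("InitiatorName=iqn.2001-04.com.example:node1\n")

# Format from the warning: InitiatorName=iqn.yyyy-mm.<reversed domain name>[:identifier]
IQN_RE = re.compile(r"^InitiatorName=iqn\.\d{4}-\d{2}\.[\w.-]+(?::[\w.-]+)?$", re.M)
assert IQN_RE.search(conf.read_text()), "InitiatorName is not in iqn format"
```

With the file in place (and matching this pattern), the "can't open InitiatorName configuration file" errors at startup should no longer appear.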
Feb 23 09:25:18 np0005626463.localdomain sudo[214517]: pam_unix(sudo:session): session closed for user root
Feb 23 09:25:19 np0005626463.localdomain python3.9[214669]: ansible-ansible.builtin.service_facts Invoked
Feb 23 09:25:19 np0005626463.localdomain network[214686]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Feb 23 09:25:19 np0005626463.localdomain network[214687]: 'network-scripts' will be removed from distribution in near future.
Feb 23 09:25:19 np0005626463.localdomain network[214688]: It is advised to switch to 'NetworkManager' instead for network management.
Feb 23 09:25:20 np0005626463.localdomain systemd[1]: Starting SETroubleshoot daemon for processing new SELinux denial logs...
Feb 23 09:25:20 np0005626463.localdomain systemd[1]: Started SETroubleshoot daemon for processing new SELinux denial logs.
Feb 23 09:25:20 np0005626463.localdomain systemd[1]: Started dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@2.service.
Feb 23 09:25:20 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 09:25:21 np0005626463.localdomain setroubleshoot[214702]: SELinux is preventing /usr/sbin/iscsid from search access on the directory iscsi. For complete SELinux messages run: sealert -l a9a31abe-c7d7-40fe-a004-19f43a8d4e45
Feb 23 09:25:21 np0005626463.localdomain setroubleshoot[214702]: SELinux is preventing /usr/sbin/iscsid from search access on the directory iscsi.
                                                                 
                                                                 *****  Plugin catchall (100. confidence) suggests   **************************
                                                                 
                                                                 If you believe that iscsid should be allowed search access on the iscsi directory by default, then you should report this as a bug.
                                                                 You can generate a local policy module to allow this access.
                                                                 To allow this access for now, execute:
                                                                 # ausearch -c 'iscsid' --raw | audit2allow -M my-iscsid
                                                                 # semodule -X 300 -i my-iscsid.pp
                                                                 
Feb 23 09:25:23 np0005626463.localdomain sudo[214845]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 09:25:23 np0005626463.localdomain sudo[214845]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:25:23 np0005626463.localdomain sudo[214845]: pam_unix(sudo:session): session closed for user root
Feb 23 09:25:23 np0005626463.localdomain sudo[214863]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 23 09:25:24 np0005626463.localdomain sudo[214863]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:25:24 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.
Feb 23 09:25:24 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24904 DF PROTO=TCP SPT=50328 DPT=9102 SEQ=3199228888 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF0B6000000000001030307) 
Feb 23 09:25:24 np0005626463.localdomain systemd[1]: tmp-crun.h7q49F.mount: Deactivated successfully.
Feb 23 09:25:24 np0005626463.localdomain podman[214881]: 2026-02-23 09:25:24.126925504 +0000 UTC m=+0.105861044 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Feb 23 09:25:24 np0005626463.localdomain podman[214881]: 2026-02-23 09:25:24.137309851 +0000 UTC m=+0.116245351 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ovn_metadata_agent, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Feb 23 09:25:24 np0005626463.localdomain systemd[1]: 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully.
Feb 23 09:25:24 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29763 DF PROTO=TCP SPT=35032 DPT=9882 SEQ=2357979186 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF0B8070000000001030307) 
Feb 23 09:25:24 np0005626463.localdomain sudo[214863]: pam_unix(sudo:session): session closed for user root
Feb 23 09:25:24 np0005626463.localdomain sudo[215021]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rrbxfsszhpvhraxgrmbeibywajwgrmia ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838724.7331715-408-187988818792944/AnsiballZ_dnf.py
Feb 23 09:25:24 np0005626463.localdomain sudo[215021]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:25:25 np0005626463.localdomain sudo[215024]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 09:25:25 np0005626463.localdomain sudo[215024]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:25:25 np0005626463.localdomain sudo[215024]: pam_unix(sudo:session): session closed for user root
Feb 23 09:25:25 np0005626463.localdomain python3.9[215023]: ansible-ansible.legacy.dnf Invoked with name=['device-mapper-multipath'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 23 09:25:27 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24906 DF PROTO=TCP SPT=50328 DPT=9102 SEQ=3199228888 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF0C2060000000001030307) 
Feb 23 09:25:29 np0005626463.localdomain systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 23 09:25:29 np0005626463.localdomain sshd[215058]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:25:29 np0005626463.localdomain systemd[1]: Starting man-db-cache-update.service...
Feb 23 09:25:29 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 09:25:29 np0005626463.localdomain systemd-rc-local-generator[215084]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 09:25:29 np0005626463.localdomain systemd-sysv-generator[215089]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 09:25:29 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:25:29 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 23 09:25:29 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:25:29 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:25:29 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 09:25:29 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 23 09:25:29 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:25:29 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:25:29 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:25:29 np0005626463.localdomain systemd[1]: Queuing reload/restart jobs for marked units…
Feb 23 09:25:29 np0005626463.localdomain systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 23 09:25:29 np0005626463.localdomain sshd[215058]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:25:29 np0005626463.localdomain systemd[1]: man-db-cache-update.service: Deactivated successfully.
Feb 23 09:25:29 np0005626463.localdomain systemd[1]: Finished man-db-cache-update.service.
Feb 23 09:25:29 np0005626463.localdomain systemd[1]: run-r637ba52ec0f147179484851958911c07.service: Deactivated successfully.
Feb 23 09:25:29 np0005626463.localdomain systemd[1]: run-r4d04a739c1c1418bb00674c5d773d907.service: Deactivated successfully.
Feb 23 09:25:30 np0005626463.localdomain sudo[215021]: pam_unix(sudo:session): session closed for user root
Feb 23 09:25:30 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27134 DF PROTO=TCP SPT=41448 DPT=9100 SEQ=1097254809 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF0D0860000000001030307) 
Feb 23 09:25:31 np0005626463.localdomain sudo[215331]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tapscdnqjmyhmqjbgknsnyofgdzfbsru ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838730.8033051-435-279320660084286/AnsiballZ_file.py
Feb 23 09:25:31 np0005626463.localdomain sudo[215331]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:25:31 np0005626463.localdomain python3.9[215333]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Feb 23 09:25:31 np0005626463.localdomain sudo[215331]: pam_unix(sudo:session): session closed for user root
Feb 23 09:25:31 np0005626463.localdomain systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@2.service: Deactivated successfully.
Feb 23 09:25:31 np0005626463.localdomain systemd[1]: setroubleshootd.service: Deactivated successfully.
Feb 23 09:25:32 np0005626463.localdomain sudo[215441]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xugokzaqjbgryeuztfqrfpzkahjbixul ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838731.615533-459-20272608647089/AnsiballZ_modprobe.py
Feb 23 09:25:32 np0005626463.localdomain sudo[215441]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:25:32 np0005626463.localdomain python3.9[215443]: ansible-community.general.modprobe Invoked with name=dm-multipath state=present params= persistent=disabled
Feb 23 09:25:32 np0005626463.localdomain sudo[215441]: pam_unix(sudo:session): session closed for user root
Feb 23 09:25:32 np0005626463.localdomain sudo[215555]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dccohmqdcbczslcndvgbwtiolgwekzqi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838732.435389-483-233868046033386/AnsiballZ_stat.py
Feb 23 09:25:32 np0005626463.localdomain sudo[215555]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:25:32 np0005626463.localdomain python3.9[215557]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/dm-multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:25:32 np0005626463.localdomain sudo[215555]: pam_unix(sudo:session): session closed for user root
Feb 23 09:25:32 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27135 DF PROTO=TCP SPT=41448 DPT=9100 SEQ=1097254809 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF0D8860000000001030307) 
Feb 23 09:25:33 np0005626463.localdomain sudo[215643]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jigbntdapdguqcwqviygcwsbzvwoqlcm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838732.435389-483-233868046033386/AnsiballZ_copy.py
Feb 23 09:25:33 np0005626463.localdomain sudo[215643]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:25:33 np0005626463.localdomain python3.9[215645]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/dm-multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771838732.435389-483-233868046033386/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=065061c60917e4f67cecc70d12ce55e42f9d0b3f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:25:33 np0005626463.localdomain sudo[215643]: pam_unix(sudo:session): session closed for user root
Feb 23 09:25:33 np0005626463.localdomain sudo[215753]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nftvrjoherwuhfcecitpkwjjcxoqfccf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838733.6897726-531-163293722629716/AnsiballZ_lineinfile.py
Feb 23 09:25:33 np0005626463.localdomain sudo[215753]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:25:34 np0005626463.localdomain python3.9[215755]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=dm-multipath  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:25:34 np0005626463.localdomain sudo[215753]: pam_unix(sudo:session): session closed for user root
Feb 23 09:25:34 np0005626463.localdomain sudo[215863]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kjgncllkrfwbmvkixltvhacdqcdqeocp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838734.3437064-555-78937798016962/AnsiballZ_systemd.py
Feb 23 09:25:34 np0005626463.localdomain sudo[215863]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:25:35 np0005626463.localdomain python3.9[215865]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 23 09:25:35 np0005626463.localdomain systemd[1]: systemd-modules-load.service: Deactivated successfully.
Feb 23 09:25:35 np0005626463.localdomain systemd[1]: Stopped Load Kernel Modules.
Feb 23 09:25:35 np0005626463.localdomain systemd[1]: Stopping Load Kernel Modules...
Feb 23 09:25:35 np0005626463.localdomain systemd[1]: Starting Load Kernel Modules...
Feb 23 09:25:35 np0005626463.localdomain systemd-modules-load[215869]: Module 'msr' is built in
Feb 23 09:25:35 np0005626463.localdomain systemd[1]: Finished Load Kernel Modules.
Feb 23 09:25:35 np0005626463.localdomain sudo[215863]: pam_unix(sudo:session): session closed for user root
Feb 23 09:25:35 np0005626463.localdomain sudo[215977]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-faeughrmkgeoodbtpqdmxvtfptsqfreg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838735.5135427-579-114791696615311/AnsiballZ_command.py
Feb 23 09:25:35 np0005626463.localdomain sudo[215977]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:25:36 np0005626463.localdomain python3.9[215979]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/multipath _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 09:25:36 np0005626463.localdomain sudo[215977]: pam_unix(sudo:session): session closed for user root
Feb 23 09:25:36 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36623 DF PROTO=TCP SPT=52888 DPT=9101 SEQ=1907568863 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF0E5460000000001030307) 
Feb 23 09:25:36 np0005626463.localdomain sudo[216088]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vgaypcocyhjbaejoujmcgbtwmrowlujy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838736.3772938-609-126783379545032/AnsiballZ_stat.py
Feb 23 09:25:36 np0005626463.localdomain sudo[216088]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:25:36 np0005626463.localdomain python3.9[216090]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 23 09:25:36 np0005626463.localdomain sudo[216088]: pam_unix(sudo:session): session closed for user root
Feb 23 09:25:37 np0005626463.localdomain sudo[216198]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pmrkdnvuwgygoizxhlqnirolxmunqnxh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838737.1081872-636-103485383052129/AnsiballZ_stat.py
Feb 23 09:25:37 np0005626463.localdomain sudo[216198]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:25:37 np0005626463.localdomain python3.9[216200]: ansible-ansible.legacy.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:25:37 np0005626463.localdomain sudo[216198]: pam_unix(sudo:session): session closed for user root
Feb 23 09:25:37 np0005626463.localdomain sudo[216286]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ixepvddbjdiefsnkzxotujlbygjnpjap ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838737.1081872-636-103485383052129/AnsiballZ_copy.py
Feb 23 09:25:37 np0005626463.localdomain sudo[216286]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:25:38 np0005626463.localdomain python3.9[216288]: ansible-ansible.legacy.copy Invoked with dest=/etc/multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771838737.1081872-636-103485383052129/.source.conf _original_basename=multipath.conf follow=False checksum=bf02ab264d3d648048a81f3bacec8bc58db93162 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:25:38 np0005626463.localdomain sudo[216286]: pam_unix(sudo:session): session closed for user root
Feb 23 09:25:39 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62953 DF PROTO=TCP SPT=58364 DPT=9882 SEQ=3578246340 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF0F1B30000000001030307) 
Feb 23 09:25:39 np0005626463.localdomain sudo[216396]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-onggurwftvfubmftqweqzvqejjmkmxcd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838739.1556747-681-85238291150774/AnsiballZ_command.py
Feb 23 09:25:39 np0005626463.localdomain sudo[216396]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:25:39 np0005626463.localdomain python3.9[216398]: ansible-ansible.legacy.command Invoked with _raw_params=grep -q '^blacklist\s*{' /etc/multipath.conf _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 09:25:39 np0005626463.localdomain sudo[216396]: pam_unix(sudo:session): session closed for user root
Feb 23 09:25:40 np0005626463.localdomain sudo[216507]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yctzcraamjtqfhhqoqgkqyenkzccmpvn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838739.8127882-705-141949496238004/AnsiballZ_lineinfile.py
Feb 23 09:25:40 np0005626463.localdomain sudo[216507]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:25:40 np0005626463.localdomain python3.9[216509]: ansible-ansible.builtin.lineinfile Invoked with line=blacklist { path=/etc/multipath.conf state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:25:40 np0005626463.localdomain sudo[216507]: pam_unix(sudo:session): session closed for user root
Feb 23 09:25:40 np0005626463.localdomain sshd[216527]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:25:41 np0005626463.localdomain sudo[216619]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pudmrpsfrukjlkjbdacxfpdrnzycxumu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838741.2652142-729-212514674147015/AnsiballZ_replace.py
Feb 23 09:25:41 np0005626463.localdomain sudo[216619]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:25:41 np0005626463.localdomain python3.9[216621]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^(blacklist {) replace=\1\n} backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:25:41 np0005626463.localdomain sudo[216619]: pam_unix(sudo:session): session closed for user root
Feb 23 09:25:42 np0005626463.localdomain sudo[216729]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ltkaqpeuycznjmmaznxkmkwkgbsvhwlh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838742.1652331-753-73011482757499/AnsiballZ_replace.py
Feb 23 09:25:42 np0005626463.localdomain sudo[216729]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:25:42 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62955 DF PROTO=TCP SPT=58364 DPT=9882 SEQ=3578246340 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF0FDC60000000001030307) 
Feb 23 09:25:42 np0005626463.localdomain python3.9[216731]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^blacklist\s*{\n[\s]+devnode \"\.\*\" replace=blacklist { backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:25:42 np0005626463.localdomain sudo[216729]: pam_unix(sudo:session): session closed for user root
Feb 23 09:25:43 np0005626463.localdomain sudo[216839]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tnzystwrdufevfglcuofwntozajecsnj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838742.893962-780-66637815131079/AnsiballZ_lineinfile.py
Feb 23 09:25:43 np0005626463.localdomain sudo[216839]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:25:43 np0005626463.localdomain python3.9[216841]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        find_multipaths yes path=/etc/multipath.conf regexp=^\s+find_multipaths state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:25:43 np0005626463.localdomain sudo[216839]: pam_unix(sudo:session): session closed for user root
Feb 23 09:25:43 np0005626463.localdomain sudo[216949]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hqtwboeyjmrsztaiatbppdjzoylpkopy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838743.475859-780-266490082772331/AnsiballZ_lineinfile.py
Feb 23 09:25:43 np0005626463.localdomain sudo[216949]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:25:43 np0005626463.localdomain python3.9[216951]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        recheck_wwid yes path=/etc/multipath.conf regexp=^\s+recheck_wwid state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:25:43 np0005626463.localdomain sudo[216949]: pam_unix(sudo:session): session closed for user root
Feb 23 09:25:44 np0005626463.localdomain sudo[217059]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qezjvfynurjpdnzpcrqnpthxgdkffaoj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838744.0703251-780-41137594368524/AnsiballZ_lineinfile.py
Feb 23 09:25:44 np0005626463.localdomain sudo[217059]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:25:44 np0005626463.localdomain python3.9[217061]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        skip_kpartx yes path=/etc/multipath.conf regexp=^\s+skip_kpartx state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:25:44 np0005626463.localdomain sudo[217059]: pam_unix(sudo:session): session closed for user root
Feb 23 09:25:44 np0005626463.localdomain sudo[217169]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hhflrhucjircsoseqfllkichpoooxwyh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838744.681526-780-102002291493291/AnsiballZ_lineinfile.py
Feb 23 09:25:44 np0005626463.localdomain sudo[217169]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:25:45 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18967 DF PROTO=TCP SPT=46600 DPT=9105 SEQ=3736887681 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF108060000000001030307) 
Feb 23 09:25:45 np0005626463.localdomain python3.9[217171]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        user_friendly_names no path=/etc/multipath.conf regexp=^\s+user_friendly_names state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:25:45 np0005626463.localdomain sudo[217169]: pam_unix(sudo:session): session closed for user root
Feb 23 09:25:45 np0005626463.localdomain sudo[217279]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cvcbaerbnjgaobdewqqepluhwhymjifd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838745.3716989-867-245037721201155/AnsiballZ_stat.py
Feb 23 09:25:45 np0005626463.localdomain sudo[217279]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:25:45 np0005626463.localdomain python3.9[217281]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 23 09:25:45 np0005626463.localdomain sudo[217279]: pam_unix(sudo:session): session closed for user root
Feb 23 09:25:45 np0005626463.localdomain sshd[216527]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:25:46 np0005626463.localdomain sudo[217391]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-byyvqkxfcghqzsajvhjgvhgvulyisnhk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838746.0657108-891-240419426555363/AnsiballZ_command.py
Feb 23 09:25:46 np0005626463.localdomain sudo[217391]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:25:46 np0005626463.localdomain python3.9[217393]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/true _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 09:25:46 np0005626463.localdomain sudo[217391]: pam_unix(sudo:session): session closed for user root
Feb 23 09:25:47 np0005626463.localdomain sudo[217502]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wnsjccpkqbbyouhrkjcqkqteldjvljut ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838746.8656611-918-113879615386897/AnsiballZ_systemd_service.py
Feb 23 09:25:47 np0005626463.localdomain sudo[217502]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:25:47 np0005626463.localdomain python3.9[217504]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=multipathd.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 09:25:47 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.
Feb 23 09:25:47 np0005626463.localdomain systemd[1]: Listening on multipathd control socket.
Feb 23 09:25:47 np0005626463.localdomain sudo[217502]: pam_unix(sudo:session): session closed for user root
Feb 23 09:25:47 np0005626463.localdomain systemd[1]: tmp-crun.eEaAPC.mount: Deactivated successfully.
Feb 23 09:25:47 np0005626463.localdomain podman[217506]: 2026-02-23 09:25:47.543099143 +0000 UTC m=+0.080569687 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Feb 23 09:25:47 np0005626463.localdomain podman[217506]: 2026-02-23 09:25:47.656253415 +0000 UTC m=+0.193723939 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20260216, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.schema-version=1.0)
Feb 23 09:25:47 np0005626463.localdomain systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully.
Feb 23 09:25:48 np0005626463.localdomain sudo[217640]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yiexmngoksdbpkeysskdvkpddvcafxuj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838747.7457106-942-1957922249540/AnsiballZ_systemd_service.py
Feb 23 09:25:48 np0005626463.localdomain sudo[217640]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:25:48 np0005626463.localdomain python3.9[217642]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=multipathd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 09:25:48 np0005626463.localdomain systemd[1]: Starting Wait for udev To Complete Device Initialization...
Feb 23 09:25:48 np0005626463.localdomain udevadm[217647]: systemd-udev-settle.service is deprecated. Please fix multipathd.service not to pull it in.
Feb 23 09:25:48 np0005626463.localdomain systemd[1]: Finished Wait for udev To Complete Device Initialization.
Feb 23 09:25:48 np0005626463.localdomain systemd[1]: Starting Device-Mapper Multipath Device Controller...
Feb 23 09:25:48 np0005626463.localdomain multipathd[217650]: --------start up--------
Feb 23 09:25:48 np0005626463.localdomain multipathd[217650]: read /etc/multipath.conf
Feb 23 09:25:48 np0005626463.localdomain multipathd[217650]: path checkers start up
Feb 23 09:25:48 np0005626463.localdomain systemd[1]: Started Device-Mapper Multipath Device Controller.
Feb 23 09:25:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:25:48.525 163572 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 09:25:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:25:48.526 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 09:25:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:25:48.527 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 09:25:48 np0005626463.localdomain sudo[217640]: pam_unix(sudo:session): session closed for user root
Feb 23 09:25:48 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36625 DF PROTO=TCP SPT=52888 DPT=9101 SEQ=1907568863 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF116060000000001030307) 
Feb 23 09:25:49 np0005626463.localdomain sudo[217766]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rducelzjuarexukluxroxplarhqhnjhk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838749.0663986-978-265389780704871/AnsiballZ_file.py
Feb 23 09:25:49 np0005626463.localdomain sudo[217766]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:25:49 np0005626463.localdomain python3.9[217768]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Feb 23 09:25:49 np0005626463.localdomain sudo[217766]: pam_unix(sudo:session): session closed for user root
Feb 23 09:25:49 np0005626463.localdomain sudo[217876]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pfxoalkdpwbgymkjnlvtlbhqsnwrlabf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838749.741598-1002-117124111433128/AnsiballZ_modprobe.py
Feb 23 09:25:49 np0005626463.localdomain sudo[217876]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:25:50 np0005626463.localdomain python3.9[217878]: ansible-community.general.modprobe Invoked with name=nvme-fabrics state=present params= persistent=disabled
Feb 23 09:25:50 np0005626463.localdomain sudo[217876]: pam_unix(sudo:session): session closed for user root
Feb 23 09:25:50 np0005626463.localdomain sudo[217995]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-joezyxuqdtudwmrkarhzdeidabmdorej ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838750.4665916-1026-127961187653858/AnsiballZ_stat.py
Feb 23 09:25:50 np0005626463.localdomain sudo[217995]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:25:50 np0005626463.localdomain python3.9[217997]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/nvme-fabrics.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:25:50 np0005626463.localdomain sudo[217995]: pam_unix(sudo:session): session closed for user root
Feb 23 09:25:51 np0005626463.localdomain sudo[218083]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-whagmdzxgphyblvcivaesplzvxneypuv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838750.4665916-1026-127961187653858/AnsiballZ_copy.py
Feb 23 09:25:51 np0005626463.localdomain sudo[218083]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:25:51 np0005626463.localdomain python3.9[218085]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/nvme-fabrics.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771838750.4665916-1026-127961187653858/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=783c778f0c68cc414f35486f234cbb1cf3f9bbff backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:25:51 np0005626463.localdomain sudo[218083]: pam_unix(sudo:session): session closed for user root
Feb 23 09:25:53 np0005626463.localdomain sudo[218193]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eliynfaifhwnlxhkgayngkowjantvyxa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838752.7702715-1074-278933550785695/AnsiballZ_lineinfile.py
Feb 23 09:25:53 np0005626463.localdomain sudo[218193]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:25:53 np0005626463.localdomain python3.9[218195]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=nvme-fabrics  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:25:53 np0005626463.localdomain sudo[218193]: pam_unix(sudo:session): session closed for user root
Feb 23 09:25:53 np0005626463.localdomain sudo[218303]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-evjjrhvyvilrezxzjemzyqizotnqhpwz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838753.4306624-1098-94093110992228/AnsiballZ_systemd.py
Feb 23 09:25:53 np0005626463.localdomain sudo[218303]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:25:54 np0005626463.localdomain python3.9[218305]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 23 09:25:54 np0005626463.localdomain systemd[1]: systemd-modules-load.service: Deactivated successfully.
Feb 23 09:25:54 np0005626463.localdomain systemd[1]: Stopped Load Kernel Modules.
Feb 23 09:25:54 np0005626463.localdomain systemd[1]: Stopping Load Kernel Modules...
Feb 23 09:25:54 np0005626463.localdomain systemd[1]: Starting Load Kernel Modules...
Feb 23 09:25:54 np0005626463.localdomain systemd-modules-load[218309]: Module 'msr' is built in
Feb 23 09:25:54 np0005626463.localdomain systemd[1]: Finished Load Kernel Modules.
Feb 23 09:25:54 np0005626463.localdomain sudo[218303]: pam_unix(sudo:session): session closed for user root
Feb 23 09:25:54 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=4965 DF PROTO=TCP SPT=42684 DPT=9102 SEQ=3732269252 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF12B6A0000000001030307) 
Feb 23 09:25:54 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.
Feb 23 09:25:54 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62957 DF PROTO=TCP SPT=58364 DPT=9882 SEQ=3578246340 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF12E060000000001030307) 
Feb 23 09:25:54 np0005626463.localdomain podman[218327]: 2026-02-23 09:25:54.916236253 +0000 UTC m=+0.090919733 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Feb 23 09:25:54 np0005626463.localdomain podman[218327]: 2026-02-23 09:25:54.926272579 +0000 UTC m=+0.100956069 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260216, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team)
Feb 23 09:25:54 np0005626463.localdomain systemd[1]: 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully.
Feb 23 09:25:55 np0005626463.localdomain sudo[218434]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-skxclzszkzhamoltcufgfakcdsmdbqhd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838754.852926-1122-99597005491571/AnsiballZ_dnf.py
Feb 23 09:25:55 np0005626463.localdomain sudo[218434]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:25:55 np0005626463.localdomain python3.9[218436]: ansible-ansible.legacy.dnf Invoked with name=['nvme-cli'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 23 09:25:57 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=4967 DF PROTO=TCP SPT=42684 DPT=9102 SEQ=3732269252 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF137860000000001030307) 
Feb 23 09:25:59 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 09:25:59 np0005626463.localdomain systemd-rc-local-generator[218469]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 09:25:59 np0005626463.localdomain systemd-sysv-generator[218475]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 09:25:59 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:25:59 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 23 09:25:59 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:25:59 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:25:59 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 09:25:59 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 23 09:25:59 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:25:59 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:25:59 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:25:59 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 09:25:59 np0005626463.localdomain systemd-rc-local-generator[218505]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 09:25:59 np0005626463.localdomain systemd-sysv-generator[218508]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 09:25:59 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:25:59 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 23 09:25:59 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:25:59 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:25:59 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 09:25:59 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 23 09:25:59 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:25:59 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:25:59 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:26:00 np0005626463.localdomain systemd-logind[759]: Watching system buttons on /dev/input/event0 (Power Button)
Feb 23 09:26:00 np0005626463.localdomain systemd-logind[759]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Feb 23 09:26:00 np0005626463.localdomain lvm[218557]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Feb 23 09:26:00 np0005626463.localdomain lvm[218557]: VG ceph_vg0 finished
Feb 23 09:26:00 np0005626463.localdomain lvm[218558]: PV /dev/loop4 online, VG ceph_vg1 is complete.
Feb 23 09:26:00 np0005626463.localdomain lvm[218558]: VG ceph_vg1 finished
Feb 23 09:26:00 np0005626463.localdomain systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 23 09:26:00 np0005626463.localdomain systemd[1]: Starting man-db-cache-update.service...
Feb 23 09:26:00 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 09:26:00 np0005626463.localdomain systemd-rc-local-generator[218610]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 09:26:00 np0005626463.localdomain systemd-sysv-generator[218613]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 09:26:00 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:26:00 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 23 09:26:00 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:26:00 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:26:00 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 09:26:00 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 23 09:26:00 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:26:00 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:26:00 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:26:00 np0005626463.localdomain systemd[1]: Queuing reload/restart jobs for marked units…
Feb 23 09:26:00 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60195 DF PROTO=TCP SPT=50476 DPT=9100 SEQ=4137352120 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF145C60000000001030307) 
Feb 23 09:26:01 np0005626463.localdomain systemd[1]: man-db-cache-update.service: Deactivated successfully.
Feb 23 09:26:01 np0005626463.localdomain systemd[1]: Finished man-db-cache-update.service.
Feb 23 09:26:01 np0005626463.localdomain systemd[1]: man-db-cache-update.service: Consumed 1.320s CPU time.
Feb 23 09:26:01 np0005626463.localdomain systemd[1]: run-rf9480d67471c41c2ad5bf3fba9700aef.service: Deactivated successfully.
Feb 23 09:26:01 np0005626463.localdomain sudo[218434]: pam_unix(sudo:session): session closed for user root
Feb 23 09:26:02 np0005626463.localdomain sudo[219866]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cakirtgosmkgqkdictctihapfybndfxk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838762.6255066-1146-271765630086687/AnsiballZ_systemd_service.py
Feb 23 09:26:02 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60196 DF PROTO=TCP SPT=50476 DPT=9100 SEQ=4137352120 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF14DC60000000001030307) 
Feb 23 09:26:02 np0005626463.localdomain sudo[219866]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:26:03 np0005626463.localdomain python3.9[219868]: ansible-ansible.builtin.systemd_service Invoked with name=multipathd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 23 09:26:03 np0005626463.localdomain systemd[1]: Stopping Device-Mapper Multipath Device Controller...
Feb 23 09:26:03 np0005626463.localdomain multipathd[217650]: exit (signal)
Feb 23 09:26:03 np0005626463.localdomain multipathd[217650]: --------shut down-------
Feb 23 09:26:03 np0005626463.localdomain systemd[1]: multipathd.service: Deactivated successfully.
Feb 23 09:26:03 np0005626463.localdomain systemd[1]: Stopped Device-Mapper Multipath Device Controller.
Feb 23 09:26:03 np0005626463.localdomain systemd[1]: Starting Device-Mapper Multipath Device Controller...
Feb 23 09:26:03 np0005626463.localdomain multipathd[219874]: --------start up--------
Feb 23 09:26:03 np0005626463.localdomain multipathd[219874]: read /etc/multipath.conf
Feb 23 09:26:03 np0005626463.localdomain multipathd[219874]: path checkers start up
Feb 23 09:26:03 np0005626463.localdomain systemd[1]: Started Device-Mapper Multipath Device Controller.
Feb 23 09:26:03 np0005626463.localdomain sudo[219866]: pam_unix(sudo:session): session closed for user root
Feb 23 09:26:04 np0005626463.localdomain python3.9[219990]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 23 09:26:05 np0005626463.localdomain sudo[220102]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tkzfaakzmdhcjhoselpkspvuadaxmace ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838765.3274229-1198-200245024621628/AnsiballZ_file.py
Feb 23 09:26:05 np0005626463.localdomain sudo[220102]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:26:05 np0005626463.localdomain python3.9[220104]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/ssh/ssh_known_hosts state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:26:05 np0005626463.localdomain sudo[220102]: pam_unix(sudo:session): session closed for user root
Feb 23 09:26:06 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7311 DF PROTO=TCP SPT=57178 DPT=9101 SEQ=1163680637 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF15A860000000001030307) 
Feb 23 09:26:06 np0005626463.localdomain systemd[1]: virtnodedevd.service: Deactivated successfully.
Feb 23 09:26:07 np0005626463.localdomain sudo[220213]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pepdvsctpgywreyocyfmkyicomqvjqdq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838766.2891674-1231-27875424820032/AnsiballZ_systemd_service.py
Feb 23 09:26:07 np0005626463.localdomain sudo[220213]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:26:07 np0005626463.localdomain python3.9[220215]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 23 09:26:07 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 09:26:07 np0005626463.localdomain systemd-rc-local-generator[220240]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 09:26:07 np0005626463.localdomain systemd-sysv-generator[220244]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 09:26:07 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:26:07 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 23 09:26:07 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:26:07 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:26:07 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 09:26:07 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 23 09:26:07 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:26:07 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:26:07 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:26:07 np0005626463.localdomain systemd[1]: virtproxyd.service: Deactivated successfully.
Feb 23 09:26:07 np0005626463.localdomain sudo[220213]: pam_unix(sudo:session): session closed for user root
Feb 23 09:26:08 np0005626463.localdomain python3.9[220360]: ansible-ansible.builtin.service_facts Invoked
Feb 23 09:26:08 np0005626463.localdomain network[220377]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Feb 23 09:26:08 np0005626463.localdomain network[220378]: 'network-scripts' will be removed from distribution in near future.
Feb 23 09:26:08 np0005626463.localdomain network[220379]: It is advised to switch to 'NetworkManager' instead for network management.
Feb 23 09:26:09 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60317 DF PROTO=TCP SPT=43352 DPT=9882 SEQ=795698491 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF166E30000000001030307) 
Feb 23 09:26:09 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 09:26:11 np0005626463.localdomain sshd[220496]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:26:11 np0005626463.localdomain sshd[220496]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:26:12 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60319 DF PROTO=TCP SPT=43352 DPT=9882 SEQ=795698491 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF173060000000001030307) 
Feb 23 09:26:12 np0005626463.localdomain sudo[220612]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ycyofzswtumkeivinrdvbsccjrnmtuez ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838772.4823494-1288-203617483233135/AnsiballZ_systemd_service.py
Feb 23 09:26:12 np0005626463.localdomain sudo[220612]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:26:13 np0005626463.localdomain python3.9[220614]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 09:26:13 np0005626463.localdomain sudo[220612]: pam_unix(sudo:session): session closed for user root
Feb 23 09:26:13 np0005626463.localdomain sudo[220723]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bjmjxyfairnpvwycbtrkrctrjcatklbp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838773.270633-1288-120898506565574/AnsiballZ_systemd_service.py
Feb 23 09:26:13 np0005626463.localdomain sudo[220723]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:26:13 np0005626463.localdomain python3.9[220725]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_migration_target.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 09:26:13 np0005626463.localdomain sudo[220723]: pam_unix(sudo:session): session closed for user root
Feb 23 09:26:14 np0005626463.localdomain sudo[220834]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-geyytzbvrmmtqmeeibpvejsfhylzjyxn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838774.0756044-1288-152887981730107/AnsiballZ_systemd_service.py
Feb 23 09:26:14 np0005626463.localdomain sudo[220834]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:26:14 np0005626463.localdomain python3.9[220836]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api_cron.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 09:26:14 np0005626463.localdomain sudo[220834]: pam_unix(sudo:session): session closed for user root
Feb 23 09:26:15 np0005626463.localdomain sudo[220945]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ugceiohtnjopozearnsbtvcjyykwqgfe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838774.7841682-1288-90816496219775/AnsiballZ_systemd_service.py
Feb 23 09:26:15 np0005626463.localdomain sudo[220945]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:26:15 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=65463 DF PROTO=TCP SPT=38604 DPT=9105 SEQ=108143549 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF17E060000000001030307) 
Feb 23 09:26:15 np0005626463.localdomain python3.9[220947]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 09:26:15 np0005626463.localdomain sudo[220945]: pam_unix(sudo:session): session closed for user root
Feb 23 09:26:15 np0005626463.localdomain sudo[221056]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pltgjewwlydhbtvooguqgtdcsxlhbygk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838775.4602962-1288-120914158888816/AnsiballZ_systemd_service.py
Feb 23 09:26:15 np0005626463.localdomain sudo[221056]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:26:16 np0005626463.localdomain python3.9[221058]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_conductor.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 09:26:16 np0005626463.localdomain sudo[221056]: pam_unix(sudo:session): session closed for user root
Feb 23 09:26:16 np0005626463.localdomain sudo[221167]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xlnqlwfgqzzcneiudzcpgttsfhulztlz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838776.3663745-1288-226821552447645/AnsiballZ_systemd_service.py
Feb 23 09:26:16 np0005626463.localdomain sudo[221167]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:26:17 np0005626463.localdomain python3.9[221169]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_metadata.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 09:26:17 np0005626463.localdomain sudo[221167]: pam_unix(sudo:session): session closed for user root
Feb 23 09:26:17 np0005626463.localdomain systemd[1]: virtsecretd.service: Deactivated successfully.
Feb 23 09:26:17 np0005626463.localdomain sudo[221279]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qwzheoqrgoqgcbwkjdbbshaubfzaapfx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838777.4157188-1288-204117274383279/AnsiballZ_systemd_service.py
Feb 23 09:26:17 np0005626463.localdomain sudo[221279]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:26:17 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.
Feb 23 09:26:17 np0005626463.localdomain systemd[1]: tmp-crun.87wvyv.mount: Deactivated successfully.
Feb 23 09:26:17 np0005626463.localdomain podman[221282]: 2026-02-23 09:26:17.927948752 +0000 UTC m=+0.094215601 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team)
Feb 23 09:26:17 np0005626463.localdomain python3.9[221281]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_scheduler.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 09:26:17 np0005626463.localdomain sudo[221279]: pam_unix(sudo:session): session closed for user root
Feb 23 09:26:18 np0005626463.localdomain podman[221282]: 2026-02-23 09:26:18.006455876 +0000 UTC m=+0.172722745 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_controller, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Feb 23 09:26:18 np0005626463.localdomain systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully.
Feb 23 09:26:18 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7313 DF PROTO=TCP SPT=57178 DPT=9101 SEQ=1163680637 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF18A060000000001030307) 
Feb 23 09:26:19 np0005626463.localdomain sudo[221418]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kwrcrfzssotuiynpwkjtqwourlgldiyg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838778.1032999-1288-236558116268213/AnsiballZ_systemd_service.py
Feb 23 09:26:19 np0005626463.localdomain sudo[221418]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:26:19 np0005626463.localdomain python3.9[221420]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_vnc_proxy.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 09:26:19 np0005626463.localdomain sudo[221418]: pam_unix(sudo:session): session closed for user root
Feb 23 09:26:21 np0005626463.localdomain sshd[221439]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:26:22 np0005626463.localdomain sudo[221531]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xymquagppbrnjckjnssdvmbrjdwetbig ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838782.083699-1465-248992645409933/AnsiballZ_file.py
Feb 23 09:26:22 np0005626463.localdomain sudo[221531]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:26:22 np0005626463.localdomain python3.9[221533]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:26:22 np0005626463.localdomain sudo[221531]: pam_unix(sudo:session): session closed for user root
Feb 23 09:26:23 np0005626463.localdomain sudo[221641]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pbimuhkenpczxlqsdubbetbftwtugqrj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838782.688924-1465-12186539344207/AnsiballZ_file.py
Feb 23 09:26:23 np0005626463.localdomain sudo[221641]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:26:23 np0005626463.localdomain python3.9[221643]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:26:23 np0005626463.localdomain sudo[221641]: pam_unix(sudo:session): session closed for user root
Feb 23 09:26:23 np0005626463.localdomain sudo[221751]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nbtqswmnjkklaekixckxidjhnljfpzaj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838783.3355098-1465-194605165103149/AnsiballZ_file.py
Feb 23 09:26:23 np0005626463.localdomain sudo[221751]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:26:23 np0005626463.localdomain python3.9[221753]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:26:23 np0005626463.localdomain sudo[221751]: pam_unix(sudo:session): session closed for user root
Feb 23 09:26:24 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=713 DF PROTO=TCP SPT=49604 DPT=9102 SEQ=1882404235 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF1A0600000000001030307) 
Feb 23 09:26:24 np0005626463.localdomain sudo[221861]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-whbdhjcbdweuysbdkgpwlpzyohlxurnn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838783.9482286-1465-34100098929575/AnsiballZ_file.py
Feb 23 09:26:24 np0005626463.localdomain sudo[221861]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:26:24 np0005626463.localdomain python3.9[221863]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:26:24 np0005626463.localdomain sudo[221861]: pam_unix(sudo:session): session closed for user root
Feb 23 09:26:24 np0005626463.localdomain sudo[221971]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jhjrsgiomnxpezbjqtsznirdriowindj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838784.5230174-1465-135219351758605/AnsiballZ_file.py
Feb 23 09:26:24 np0005626463.localdomain sudo[221971]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:26:24 np0005626463.localdomain python3.9[221973]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:26:24 np0005626463.localdomain sudo[221971]: pam_unix(sudo:session): session closed for user root
Feb 23 09:26:25 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60321 DF PROTO=TCP SPT=43352 DPT=9882 SEQ=795698491 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF1A4070000000001030307) 
Feb 23 09:26:25 np0005626463.localdomain sshd[221439]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:26:25 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.
Feb 23 09:26:25 np0005626463.localdomain sudo[222081]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uldhuixbahzkcyxvukrxxtmllfzgtxjn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838785.105351-1465-10417463470255/AnsiballZ_file.py
Feb 23 09:26:25 np0005626463.localdomain sudo[222081]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:26:25 np0005626463.localdomain sudo[222092]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 09:26:25 np0005626463.localdomain sudo[222092]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:26:25 np0005626463.localdomain sudo[222092]: pam_unix(sudo:session): session closed for user root
Feb 23 09:26:25 np0005626463.localdomain podman[222083]: 2026-02-23 09:26:25.444274982 +0000 UTC m=+0.089731419 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true)
Feb 23 09:26:25 np0005626463.localdomain podman[222083]: 2026-02-23 09:26:25.48119374 +0000 UTC m=+0.126650177 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, managed_by=edpm_ansible, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Feb 23 09:26:25 np0005626463.localdomain systemd[1]: 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully.
Feb 23 09:26:25 np0005626463.localdomain sudo[222119]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 23 09:26:25 np0005626463.localdomain sudo[222119]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:26:25 np0005626463.localdomain python3.9[222089]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:26:25 np0005626463.localdomain sudo[222081]: pam_unix(sudo:session): session closed for user root
Feb 23 09:26:25 np0005626463.localdomain sudo[222258]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mgnksihtthnhgvgjgujzpcdljqerzbmp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838785.7032766-1465-173908207940914/AnsiballZ_file.py
Feb 23 09:26:25 np0005626463.localdomain sudo[222258]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:26:26 np0005626463.localdomain python3.9[222260]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:26:26 np0005626463.localdomain sudo[222258]: pam_unix(sudo:session): session closed for user root
Feb 23 09:26:26 np0005626463.localdomain sudo[222119]: pam_unix(sudo:session): session closed for user root
Feb 23 09:26:26 np0005626463.localdomain sudo[222365]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 09:26:26 np0005626463.localdomain sudo[222365]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:26:26 np0005626463.localdomain sudo[222365]: pam_unix(sudo:session): session closed for user root
Feb 23 09:26:26 np0005626463.localdomain sudo[222402]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ircxrjmxcfujuecmgsdlkhxxtadckapu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838786.3555653-1465-186577881300947/AnsiballZ_file.py
Feb 23 09:26:26 np0005626463.localdomain sudo[222402]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:26:27 np0005626463.localdomain python3.9[222405]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:26:27 np0005626463.localdomain sudo[222402]: pam_unix(sudo:session): session closed for user root
Feb 23 09:26:27 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=715 DF PROTO=TCP SPT=49604 DPT=9102 SEQ=1882404235 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF1AC860000000001030307) 
Feb 23 09:26:27 np0005626463.localdomain sudo[222513]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nvoduqqeoufcdwjuervbbggkjoujthtj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838787.1815631-1636-229864923294121/AnsiballZ_file.py
Feb 23 09:26:27 np0005626463.localdomain sudo[222513]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:26:27 np0005626463.localdomain python3.9[222515]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:26:27 np0005626463.localdomain sudo[222513]: pam_unix(sudo:session): session closed for user root
Feb 23 09:26:28 np0005626463.localdomain sudo[222623]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tekrncwsjfgqopszmwmgjaydcbslieqq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838788.218399-1636-197977396762368/AnsiballZ_file.py
Feb 23 09:26:28 np0005626463.localdomain sudo[222623]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:26:28 np0005626463.localdomain python3.9[222625]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:26:28 np0005626463.localdomain sudo[222623]: pam_unix(sudo:session): session closed for user root
Feb 23 09:26:29 np0005626463.localdomain sudo[222733]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xunnkkskcvhbgfhoxabautbyqnpfszzb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838788.8390512-1636-57305514262021/AnsiballZ_file.py
Feb 23 09:26:29 np0005626463.localdomain sudo[222733]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:26:29 np0005626463.localdomain python3.9[222735]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:26:29 np0005626463.localdomain sudo[222733]: pam_unix(sudo:session): session closed for user root
Feb 23 09:26:30 np0005626463.localdomain sudo[222843]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-izwsbvcvwzllvqshaaazdqrmzitpdnqr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838789.448471-1636-28701619766675/AnsiballZ_file.py
Feb 23 09:26:30 np0005626463.localdomain sudo[222843]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:26:30 np0005626463.localdomain python3.9[222845]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:26:30 np0005626463.localdomain sudo[222843]: pam_unix(sudo:session): session closed for user root
Feb 23 09:26:30 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14552 DF PROTO=TCP SPT=37658 DPT=9100 SEQ=446064436 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF1BAC60000000001030307) 
Feb 23 09:26:30 np0005626463.localdomain sudo[222953]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-idhpstwxhaicxprftzsgwgvnlmrzvbma ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838790.6932213-1636-124990381179267/AnsiballZ_file.py
Feb 23 09:26:30 np0005626463.localdomain sudo[222953]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:26:31 np0005626463.localdomain python3.9[222955]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:26:31 np0005626463.localdomain sudo[222953]: pam_unix(sudo:session): session closed for user root
Feb 23 09:26:31 np0005626463.localdomain sudo[223063]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xxwvjbgrmmvyccmewyobfagdclbzlxcb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838791.2708058-1636-42590437712907/AnsiballZ_file.py
Feb 23 09:26:31 np0005626463.localdomain sudo[223063]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:26:31 np0005626463.localdomain python3.9[223065]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:26:31 np0005626463.localdomain sudo[223063]: pam_unix(sudo:session): session closed for user root
Feb 23 09:26:32 np0005626463.localdomain sudo[223173]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pcpcvtqbyvlhowbqykybjywjfedrlvwj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838791.8452516-1636-69714446398154/AnsiballZ_file.py
Feb 23 09:26:32 np0005626463.localdomain sudo[223173]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:26:32 np0005626463.localdomain python3.9[223175]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:26:32 np0005626463.localdomain sudo[223173]: pam_unix(sudo:session): session closed for user root
Feb 23 09:26:32 np0005626463.localdomain sudo[223283]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cjpqzaamyaonzovsweghcbypncaebvww ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838792.4621902-1636-141858836927157/AnsiballZ_file.py
Feb 23 09:26:32 np0005626463.localdomain sudo[223283]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:26:32 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14553 DF PROTO=TCP SPT=37658 DPT=9100 SEQ=446064436 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF1C2C60000000001030307) 
Feb 23 09:26:33 np0005626463.localdomain python3.9[223285]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:26:33 np0005626463.localdomain sudo[223283]: pam_unix(sudo:session): session closed for user root
Feb 23 09:26:33 np0005626463.localdomain sudo[223393]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-enubevjgnkapfloxvbeujlcjdogsvldj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838793.5013366-1810-57826498119694/AnsiballZ_command.py
Feb 23 09:26:33 np0005626463.localdomain sudo[223393]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:26:33 np0005626463.localdomain python3.9[223395]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                                              systemctl disable --now certmonger.service
                                                              test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                                            fi
                                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 09:26:34 np0005626463.localdomain sudo[223393]: pam_unix(sudo:session): session closed for user root
Feb 23 09:26:34 np0005626463.localdomain python3.9[223505]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Feb 23 09:26:35 np0005626463.localdomain sudo[223613]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qcpxjxklxivcfsutjaawgpusnjwsaeew ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838795.0885627-1864-225821545123945/AnsiballZ_systemd_service.py
Feb 23 09:26:35 np0005626463.localdomain sudo[223613]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:26:35 np0005626463.localdomain python3.9[223615]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 23 09:26:35 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 09:26:35 np0005626463.localdomain systemd-rc-local-generator[223641]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 09:26:35 np0005626463.localdomain systemd-sysv-generator[223646]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 09:26:35 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:26:35 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 23 09:26:35 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:26:35 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:26:35 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 09:26:35 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 23 09:26:35 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:26:35 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:26:35 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:26:35 np0005626463.localdomain sudo[223613]: pam_unix(sudo:session): session closed for user root
Feb 23 09:26:36 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64713 DF PROTO=TCP SPT=51780 DPT=9101 SEQ=1252174237 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF1CFC60000000001030307) 
Feb 23 09:26:36 np0005626463.localdomain sudo[223759]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hztwlpvfrxkqmufgeyoebeugzeyrcplt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838796.237592-1888-137141263852329/AnsiballZ_command.py
Feb 23 09:26:36 np0005626463.localdomain sudo[223759]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:26:36 np0005626463.localdomain python3.9[223761]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 09:26:36 np0005626463.localdomain sudo[223759]: pam_unix(sudo:session): session closed for user root
Feb 23 09:26:37 np0005626463.localdomain sudo[223870]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qgprjznxqlfjokjtmlacjhxbfaaxpano ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838796.867126-1888-115580007033102/AnsiballZ_command.py
Feb 23 09:26:37 np0005626463.localdomain sudo[223870]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:26:37 np0005626463.localdomain python3.9[223872]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_migration_target.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 09:26:38 np0005626463.localdomain sudo[223870]: pam_unix(sudo:session): session closed for user root
Feb 23 09:26:39 np0005626463.localdomain sudo[223981]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wpmrwbtjfxfvkvnjpmqapnubnkhbuluw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838798.9422367-1888-217529659486967/AnsiballZ_command.py
Feb 23 09:26:39 np0005626463.localdomain sudo[223981]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:26:39 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=717 DF PROTO=TCP SPT=49604 DPT=9102 SEQ=1882404235 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF1DC060000000001030307) 
Feb 23 09:26:39 np0005626463.localdomain python3.9[223983]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api_cron.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 09:26:39 np0005626463.localdomain sudo[223981]: pam_unix(sudo:session): session closed for user root
Feb 23 09:26:39 np0005626463.localdomain sudo[224092]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tffgrbxurfjjezpuqauphqxgzzvchjkb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838799.5780728-1888-245767586213406/AnsiballZ_command.py
Feb 23 09:26:39 np0005626463.localdomain sudo[224092]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:26:40 np0005626463.localdomain python3.9[224094]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 09:26:40 np0005626463.localdomain sudo[224092]: pam_unix(sudo:session): session closed for user root
Feb 23 09:26:41 np0005626463.localdomain sudo[224203]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fzvocbepynfscprwolfyadenuxychclx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838800.1749156-1888-124093914224856/AnsiballZ_command.py
Feb 23 09:26:41 np0005626463.localdomain sudo[224203]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:26:41 np0005626463.localdomain python3.9[224205]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_conductor.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 09:26:41 np0005626463.localdomain sudo[224203]: pam_unix(sudo:session): session closed for user root
Feb 23 09:26:41 np0005626463.localdomain sudo[224314]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tappsxfsbhdwutgiyrvynepcxglvwqck ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838801.348983-1888-164925974162601/AnsiballZ_command.py
Feb 23 09:26:41 np0005626463.localdomain sudo[224314]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:26:41 np0005626463.localdomain python3.9[224316]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_metadata.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 09:26:41 np0005626463.localdomain sudo[224314]: pam_unix(sudo:session): session closed for user root
Feb 23 09:26:42 np0005626463.localdomain sudo[224425]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tkuybwtpjfyjxvgbgplnazykihjcdzex ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838801.972856-1888-136542084833951/AnsiballZ_command.py
Feb 23 09:26:42 np0005626463.localdomain sudo[224425]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:26:42 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7105 DF PROTO=TCP SPT=52182 DPT=9882 SEQ=3071906528 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF1E8070000000001030307) 
Feb 23 09:26:42 np0005626463.localdomain python3.9[224427]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_scheduler.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 09:26:42 np0005626463.localdomain sudo[224425]: pam_unix(sudo:session): session closed for user root
Feb 23 09:26:42 np0005626463.localdomain sudo[224536]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rkdgyhdgngrpqyoqkasndwttpjvcezue ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838802.5886955-1888-252795505314298/AnsiballZ_command.py
Feb 23 09:26:42 np0005626463.localdomain sudo[224536]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:26:43 np0005626463.localdomain python3.9[224538]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_vnc_proxy.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 09:26:43 np0005626463.localdomain sudo[224536]: pam_unix(sudo:session): session closed for user root
Feb 23 09:26:44 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14555 DF PROTO=TCP SPT=37658 DPT=9100 SEQ=446064436 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF1F2060000000001030307) 
Feb 23 09:26:46 np0005626463.localdomain sudo[224647]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gscdwrvfdbmlpnajogzidltbtxuahqft ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838806.489016-2095-39002795609908/AnsiballZ_file.py
Feb 23 09:26:46 np0005626463.localdomain sudo[224647]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:26:46 np0005626463.localdomain python3.9[224649]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 23 09:26:47 np0005626463.localdomain sudo[224647]: pam_unix(sudo:session): session closed for user root
Feb 23 09:26:47 np0005626463.localdomain sudo[224757]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eqypzrzmojpxaemkyorargbjdiptvchy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838807.142952-2095-223109909897871/AnsiballZ_file.py
Feb 23 09:26:47 np0005626463.localdomain sudo[224757]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:26:47 np0005626463.localdomain python3.9[224759]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/nova_nvme_cleaner setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 23 09:26:47 np0005626463.localdomain sudo[224757]: pam_unix(sudo:session): session closed for user root
Feb 23 09:26:48 np0005626463.localdomain sudo[224867]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yxrpkxhmjdpcpvojpxakupfmtdtbpzod ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838807.8276293-2140-196783264964901/AnsiballZ_file.py
Feb 23 09:26:48 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.
Feb 23 09:26:48 np0005626463.localdomain sudo[224867]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:26:48 np0005626463.localdomain systemd[1]: tmp-crun.Xe47G1.mount: Deactivated successfully.
Feb 23 09:26:48 np0005626463.localdomain podman[224869]: 2026-02-23 09:26:48.230532726 +0000 UTC m=+0.106095976 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible)
Feb 23 09:26:48 np0005626463.localdomain podman[224869]: 2026-02-23 09:26:48.30733741 +0000 UTC m=+0.182900640 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, container_name=ovn_controller, io.buildah.version=1.43.0)
Feb 23 09:26:48 np0005626463.localdomain systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully.
Feb 23 09:26:48 np0005626463.localdomain python3.9[224870]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 23 09:26:48 np0005626463.localdomain sudo[224867]: pam_unix(sudo:session): session closed for user root
Feb 23 09:26:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:26:48.526 163572 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 09:26:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:26:48.527 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 09:26:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:26:48.529 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 09:26:48 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64715 DF PROTO=TCP SPT=51780 DPT=9101 SEQ=1252174237 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF200060000000001030307) 
Feb 23 09:26:48 np0005626463.localdomain sudo[225001]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-chiyastwifjxyplpaxbqyglbeopcsrdy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838808.466085-2140-143138661046909/AnsiballZ_file.py
Feb 23 09:26:48 np0005626463.localdomain sudo[225001]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:26:48 np0005626463.localdomain python3.9[225003]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/_nova_secontext setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 23 09:26:48 np0005626463.localdomain sudo[225001]: pam_unix(sudo:session): session closed for user root
Feb 23 09:26:49 np0005626463.localdomain sudo[225111]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bqhfuiarpblgeerohiclpizsgpcqqydj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838809.098897-2140-250216652474874/AnsiballZ_file.py
Feb 23 09:26:49 np0005626463.localdomain sudo[225111]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:26:49 np0005626463.localdomain python3.9[225113]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova/instances setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 23 09:26:49 np0005626463.localdomain sudo[225111]: pam_unix(sudo:session): session closed for user root
Feb 23 09:26:50 np0005626463.localdomain sudo[225221]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-beavpnbhmfcjjoexzhxwaglpplhxfpka ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838810.3333724-2140-65131337918065/AnsiballZ_file.py
Feb 23 09:26:50 np0005626463.localdomain sudo[225221]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:26:50 np0005626463.localdomain python3.9[225223]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/etc/ceph setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 23 09:26:50 np0005626463.localdomain sudo[225221]: pam_unix(sudo:session): session closed for user root
Feb 23 09:26:51 np0005626463.localdomain sudo[225331]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qhqmxipsvtiywsjsvzvtotcnzcfpbaiu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838810.9660702-2140-41790697690972/AnsiballZ_file.py
Feb 23 09:26:51 np0005626463.localdomain sudo[225331]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:26:51 np0005626463.localdomain python3.9[225333]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Feb 23 09:26:51 np0005626463.localdomain sudo[225331]: pam_unix(sudo:session): session closed for user root
Feb 23 09:26:52 np0005626463.localdomain sshd[225389]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:26:52 np0005626463.localdomain sshd[225389]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:26:52 np0005626463.localdomain sudo[225443]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-chczmiwjzcbarjnnjoawqcjxhoezsdyx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838811.5657465-2140-266522219528083/AnsiballZ_file.py
Feb 23 09:26:52 np0005626463.localdomain sudo[225443]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:26:52 np0005626463.localdomain python3.9[225445]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/nvme setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Feb 23 09:26:53 np0005626463.localdomain sudo[225443]: pam_unix(sudo:session): session closed for user root
Feb 23 09:26:53 np0005626463.localdomain sudo[225553]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mxrefbfhasrgwbaexhqmgtxqlmcwocyz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838813.2988062-2140-106284323306616/AnsiballZ_file.py
Feb 23 09:26:53 np0005626463.localdomain sudo[225553]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:26:53 np0005626463.localdomain python3.9[225555]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/run/openvswitch setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Feb 23 09:26:53 np0005626463.localdomain sudo[225553]: pam_unix(sudo:session): session closed for user root
Feb 23 09:26:54 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57105 DF PROTO=TCP SPT=44898 DPT=9102 SEQ=3126667153 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF215900000000001030307) 
Feb 23 09:26:54 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7107 DF PROTO=TCP SPT=52182 DPT=9882 SEQ=3071906528 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF218060000000001030307) 
Feb 23 09:26:55 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.
Feb 23 09:26:55 np0005626463.localdomain podman[225573]: 2026-02-23 09:26:55.913916223 +0000 UTC m=+0.083457423 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ovn_metadata_agent)
Feb 23 09:26:55 np0005626463.localdomain podman[225573]: 2026-02-23 09:26:55.943072989 +0000 UTC m=+0.112614179 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Feb 23 09:26:55 np0005626463.localdomain systemd[1]: 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully.
Feb 23 09:26:57 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57107 DF PROTO=TCP SPT=44898 DPT=9102 SEQ=3126667153 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF221870000000001030307) 
Feb 23 09:26:59 np0005626463.localdomain sshd[225590]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:27:00 np0005626463.localdomain sshd[225590]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:27:00 np0005626463.localdomain sudo[225682]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ixstivtgrymrgamwartqysgkerotbgat ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838820.1226919-2505-174366134562143/AnsiballZ_getent.py
Feb 23 09:27:00 np0005626463.localdomain sudo[225682]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:27:00 np0005626463.localdomain python3.9[225684]: ansible-ansible.builtin.getent Invoked with database=passwd key=nova fail_key=True service=None split=None
Feb 23 09:27:00 np0005626463.localdomain sudo[225682]: pam_unix(sudo:session): session closed for user root
Feb 23 09:27:00 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43285 DF PROTO=TCP SPT=33892 DPT=9100 SEQ=2100312813 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF230060000000001030307) 
Feb 23 09:27:02 np0005626463.localdomain sudo[225793]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ahfbdpfwoavwwwetowxiydgjwpadskbz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838820.9104655-2529-251855057483579/AnsiballZ_group.py
Feb 23 09:27:02 np0005626463.localdomain sudo[225793]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:27:02 np0005626463.localdomain python3.9[225795]: ansible-ansible.builtin.group Invoked with gid=42436 name=nova state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Feb 23 09:27:02 np0005626463.localdomain groupadd[225796]: group added to /etc/group: name=nova, GID=42436
Feb 23 09:27:02 np0005626463.localdomain groupadd[225796]: group added to /etc/gshadow: name=nova
Feb 23 09:27:02 np0005626463.localdomain groupadd[225796]: new group: name=nova, GID=42436
Feb 23 09:27:02 np0005626463.localdomain sudo[225793]: pam_unix(sudo:session): session closed for user root
Feb 23 09:27:02 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43286 DF PROTO=TCP SPT=33892 DPT=9100 SEQ=2100312813 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF238060000000001030307) 
Feb 23 09:27:03 np0005626463.localdomain sudo[225909]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rxxoczqvhqmskutdevzufynkhlkanjax ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838822.6886096-2553-9428709399625/AnsiballZ_user.py
Feb 23 09:27:03 np0005626463.localdomain sudo[225909]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:27:03 np0005626463.localdomain python3.9[225911]: ansible-ansible.builtin.user Invoked with comment=nova user group=nova groups=['libvirt'] name=nova shell=/bin/sh state=present uid=42436 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005626463.localdomain update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Feb 23 09:27:03 np0005626463.localdomain useradd[225913]: new user: name=nova, UID=42436, GID=42436, home=/home/nova, shell=/bin/sh, from=/dev/pts/1
Feb 23 09:27:03 np0005626463.localdomain useradd[225913]: add 'nova' to group 'libvirt'
Feb 23 09:27:03 np0005626463.localdomain useradd[225913]: add 'nova' to shadow group 'libvirt'
Feb 23 09:27:03 np0005626463.localdomain sudo[225909]: pam_unix(sudo:session): session closed for user root
Feb 23 09:27:04 np0005626463.localdomain sshd[225937]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:27:04 np0005626463.localdomain sshd[225937]: Accepted publickey for zuul from 192.168.122.30 port 48612 ssh2: RSA SHA256:/ShS2J5Dq7o9P59e/NmgQORSAcJOBwu46Huo03HBdB4
Feb 23 09:27:04 np0005626463.localdomain systemd-logind[759]: New session 54 of user zuul.
Feb 23 09:27:04 np0005626463.localdomain systemd[1]: Started Session 54 of User zuul.
Feb 23 09:27:04 np0005626463.localdomain sshd[225937]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Feb 23 09:27:04 np0005626463.localdomain sshd[225940]: Received disconnect from 192.168.122.30 port 48612:11: disconnected by user
Feb 23 09:27:04 np0005626463.localdomain sshd[225940]: Disconnected from user zuul 192.168.122.30 port 48612
Feb 23 09:27:04 np0005626463.localdomain sshd[225937]: pam_unix(sshd:session): session closed for user zuul
Feb 23 09:27:04 np0005626463.localdomain systemd[1]: session-54.scope: Deactivated successfully.
Feb 23 09:27:04 np0005626463.localdomain systemd-logind[759]: Session 54 logged out. Waiting for processes to exit.
Feb 23 09:27:04 np0005626463.localdomain systemd-logind[759]: Removed session 54.
Feb 23 09:27:05 np0005626463.localdomain python3.9[226048]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/nova/nova-blank.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:27:05 np0005626463.localdomain python3.9[226103]: ansible-ansible.legacy.file Invoked with mode=0644 setype=container_file_t dest=/var/lib/openstack/nova/nova-blank.conf _original_basename=nova-blank.conf recurse=False state=file path=/var/lib/openstack/nova/nova-blank.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 23 09:27:06 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14873 DF PROTO=TCP SPT=55968 DPT=9101 SEQ=144703007 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF245060000000001030307) 
Feb 23 09:27:06 np0005626463.localdomain python3.9[226211]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/nova/ssh-config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:27:07 np0005626463.localdomain python3.9[226297]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/nova/ssh-config mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771838826.1049607-2628-140247797123933/.source _original_basename=ssh-config follow=False checksum=4297f735c41bdc1ff52d72e6f623a02242f37958 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 23 09:27:07 np0005626463.localdomain python3.9[226405]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/nova/nova_statedir_ownership.py follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:27:08 np0005626463.localdomain python3.9[226491]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/nova/nova_statedir_ownership.py mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771838827.1898627-2628-180094893805398/.source.py _original_basename=nova_statedir_ownership.py follow=False checksum=c6c8a3cfefa5efd60ceb1408c4e977becedb71e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 23 09:27:08 np0005626463.localdomain python3.9[226599]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/nova/run-on-host follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:27:09 np0005626463.localdomain python3.9[226685]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/nova/run-on-host mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771838828.2673264-2628-30575015824456/.source _original_basename=run-on-host follow=False checksum=93aba8edc83d5878604a66d37fea2f12b60bdea2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 23 09:27:09 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31258 DF PROTO=TCP SPT=40670 DPT=9882 SEQ=3324353408 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF251420000000001030307) 
Feb 23 09:27:09 np0005626463.localdomain python3.9[226793]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/nova/02-nova-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:27:10 np0005626463.localdomain python3.9[226879]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/nova/02-nova-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771838829.352772-2790-3128833922551/.source.conf follow=False _original_basename=02-nova-host-specific.conf.j2 checksum=3012482a375a6db0cadffa2656b647c3720d54e9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 23 09:27:10 np0005626463.localdomain sudo[226987]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-judlruirtljuomizonyjmftckqrnrwjo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838830.4848378-2835-244078872408628/AnsiballZ_file.py
Feb 23 09:27:10 np0005626463.localdomain sudo[226987]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:27:10 np0005626463.localdomain python3.9[226989]: ansible-ansible.builtin.file Invoked with group=nova mode=0700 owner=nova path=/home/nova/.ssh state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:27:10 np0005626463.localdomain sudo[226987]: pam_unix(sudo:session): session closed for user root
Feb 23 09:27:11 np0005626463.localdomain sudo[227097]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ujcujzgalmpuhzblbzzbrdutxmmwuzgm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838831.1366842-2859-163699966191795/AnsiballZ_copy.py
Feb 23 09:27:11 np0005626463.localdomain sudo[227097]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:27:11 np0005626463.localdomain python3.9[227099]: ansible-ansible.legacy.copy Invoked with dest=/home/nova/.ssh/authorized_keys group=nova mode=0600 owner=nova remote_src=True src=/var/lib/openstack/nova/ssh-publickey backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:27:11 np0005626463.localdomain sudo[227097]: pam_unix(sudo:session): session closed for user root
Feb 23 09:27:12 np0005626463.localdomain sudo[227207]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dpdrvoaxipzjdccthtpxixdazehuoieu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838831.794234-2883-279437286843174/AnsiballZ_stat.py
Feb 23 09:27:12 np0005626463.localdomain sudo[227207]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:27:12 np0005626463.localdomain python3.9[227209]: ansible-ansible.builtin.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 23 09:27:12 np0005626463.localdomain sudo[227207]: pam_unix(sudo:session): session closed for user root
Feb 23 09:27:12 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31260 DF PROTO=TCP SPT=40670 DPT=9882 SEQ=3324353408 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF25D460000000001030307) 
Feb 23 09:27:12 np0005626463.localdomain sudo[227319]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-umgfyfycisqunevfptylegjkugzflcpi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838832.5162535-2910-278175865704492/AnsiballZ_file.py
Feb 23 09:27:12 np0005626463.localdomain sudo[227319]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:27:12 np0005626463.localdomain python3.9[227321]: ansible-ansible.builtin.file Invoked with group=nova mode=0400 owner=nova path=/var/lib/nova/compute_id state=file recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:27:12 np0005626463.localdomain sudo[227319]: pam_unix(sudo:session): session closed for user root
Feb 23 09:27:14 np0005626463.localdomain python3.9[227429]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 23 09:27:14 np0005626463.localdomain sudo[227539]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qbmqmrcliljwlbltfucbdhnjlpyjnnfs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838834.6698914-2967-133509408893188/AnsiballZ_file.py
Feb 23 09:27:14 np0005626463.localdomain sudo[227539]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:27:15 np0005626463.localdomain python3.9[227541]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:27:15 np0005626463.localdomain sudo[227539]: pam_unix(sudo:session): session closed for user root
Feb 23 09:27:15 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43288 DF PROTO=TCP SPT=33892 DPT=9100 SEQ=2100312813 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF268070000000001030307) 
Feb 23 09:27:16 np0005626463.localdomain sudo[227649]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qbgmxsdflvveazdfkeaheswcxwqcjjfy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838835.8847587-2991-70360674513719/AnsiballZ_file.py
Feb 23 09:27:16 np0005626463.localdomain sudo[227649]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:27:16 np0005626463.localdomain python3.9[227651]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 23 09:27:16 np0005626463.localdomain sudo[227649]: pam_unix(sudo:session): session closed for user root
Feb 23 09:27:17 np0005626463.localdomain python3.9[227759]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/nova_compute_init state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:27:18 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14875 DF PROTO=TCP SPT=55968 DPT=9101 SEQ=144703007 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF276060000000001030307) 
Feb 23 09:27:18 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.
Feb 23 09:27:18 np0005626463.localdomain podman[227971]: 2026-02-23 09:27:18.916146061 +0000 UTC m=+0.089663169 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, config_id=ovn_controller, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible)
Feb 23 09:27:18 np0005626463.localdomain podman[227971]: 2026-02-23 09:27:18.994266624 +0000 UTC m=+0.167783732 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Feb 23 09:27:19 np0005626463.localdomain systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully.
Feb 23 09:27:19 np0005626463.localdomain sudo[228085]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yxmcrxtmilafordvoytzwtgownmitobi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838838.8722353-3093-92568157557175/AnsiballZ_container_config_data.py
Feb 23 09:27:19 np0005626463.localdomain sudo[228085]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:27:19 np0005626463.localdomain python3.9[228087]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/nova_compute_init config_pattern=*.json debug=False
Feb 23 09:27:19 np0005626463.localdomain sudo[228085]: pam_unix(sudo:session): session closed for user root
Feb 23 09:27:20 np0005626463.localdomain sudo[228195]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-epgasdtapgajdixlbgolknwnthdpfqjp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838839.9623597-3126-195874631273443/AnsiballZ_container_config_hash.py
Feb 23 09:27:20 np0005626463.localdomain sudo[228195]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:27:20 np0005626463.localdomain python3.9[228197]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Feb 23 09:27:20 np0005626463.localdomain sudo[228195]: pam_unix(sudo:session): session closed for user root
Feb 23 09:27:21 np0005626463.localdomain sudo[228305]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uuwbkuaakpxsihfpgyfjqjetnhaqjsyq ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1771838840.9500911-3156-159735603432334/AnsiballZ_edpm_container_manage.py
Feb 23 09:27:21 np0005626463.localdomain sudo[228305]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:27:21 np0005626463.localdomain python3[228307]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/nova_compute_init config_id=nova_compute_init config_overrides={} config_patterns=*.json containers=['nova_compute_init'] log_base_path=/var/log/containers/stdouts debug=False
Feb 23 09:27:21 np0005626463.localdomain podman[228344]: 
Feb 23 09:27:21 np0005626463.localdomain podman[228344]: 2026-02-23 09:27:21.948486131 +0000 UTC m=+0.088493823 container create 29fbf3e6d165ac37a2073e9d11df0954b5b34530c2d4564677cda92707f802aa (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, config_data={'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False, 'EDPM_CONFIG_HASH': '5f6dcf25a4eb712d7b55775b8e130167254d53f3f84c8303f8f39f30426e780b'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'none', 'privileged': False, 'restart': 'never', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, org.label-schema.license=GPLv2, io.buildah.version=1.43.0, container_name=nova_compute_init, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.build-date=20260216, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_id=nova_compute_init, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Feb 23 09:27:21 np0005626463.localdomain podman[228344]: 2026-02-23 09:27:21.903557049 +0000 UTC m=+0.043564761 image pull  quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Feb 23 09:27:21 np0005626463.localdomain python3[228307]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute_init --conmon-pidfile /run/nova_compute_init.pid --env NOVA_STATEDIR_OWNERSHIP_SKIP=/var/lib/nova/compute_id --env __OS_DEBUG=False --env EDPM_CONFIG_HASH=5f6dcf25a4eb712d7b55775b8e130167254d53f3f84c8303f8f39f30426e780b --label config_id=nova_compute_init --label container_name=nova_compute_init --label managed_by=edpm_ansible --label config_data={'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False, 'EDPM_CONFIG_HASH': '5f6dcf25a4eb712d7b55775b8e130167254d53f3f84c8303f8f39f30426e780b'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'none', 'privileged': False, 'restart': 'never', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']} --log-driver journald --log-level info --network none --privileged=False --security-opt label=disable --user root --volume /dev/log:/dev/log --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z --volume /var/lib/openstack/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init
Feb 23 09:27:22 np0005626463.localdomain sudo[228305]: pam_unix(sudo:session): session closed for user root
Feb 23 09:27:22 np0005626463.localdomain sudo[228487]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-amcbknkfephbdfwbhvgnyutxwbhvbzfz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838842.4857643-3180-161962018010605/AnsiballZ_stat.py
Feb 23 09:27:22 np0005626463.localdomain sudo[228487]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:27:22 np0005626463.localdomain python3.9[228489]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 23 09:27:22 np0005626463.localdomain sudo[228487]: pam_unix(sudo:session): session closed for user root
Feb 23 09:27:24 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9776 DF PROTO=TCP SPT=37540 DPT=9102 SEQ=1072854065 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF28AC00000000001030307) 
Feb 23 09:27:24 np0005626463.localdomain python3.9[228599]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Feb 23 09:27:24 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31262 DF PROTO=TCP SPT=40670 DPT=9882 SEQ=3324353408 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF28E070000000001030307) 
Feb 23 09:27:25 np0005626463.localdomain sudo[228707]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mxgaqdmbuscobgnzhyotpqqkrtglwvka ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838844.8507292-3261-26667132994929/AnsiballZ_stat.py
Feb 23 09:27:25 np0005626463.localdomain sudo[228707]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:27:25 np0005626463.localdomain python3.9[228709]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:27:25 np0005626463.localdomain sudo[228707]: pam_unix(sudo:session): session closed for user root
Feb 23 09:27:25 np0005626463.localdomain sudo[228797]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kzhacmpkqswckqdilbsjtcfxryhclcoq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838844.8507292-3261-26667132994929/AnsiballZ_copy.py
Feb 23 09:27:25 np0005626463.localdomain sudo[228797]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:27:25 np0005626463.localdomain python3.9[228799]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771838844.8507292-3261-26667132994929/.source.yaml _original_basename=.nmikqgrf follow=False checksum=dde8f4b0d63c380bd7f7596e7df827a8064c101b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:27:25 np0005626463.localdomain sudo[228797]: pam_unix(sudo:session): session closed for user root
Feb 23 09:27:26 np0005626463.localdomain sudo[228907]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cerqjcarklfobpyysathutwzaldapcoq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838846.3899336-3312-166357179594572/AnsiballZ_file.py
Feb 23 09:27:26 np0005626463.localdomain sudo[228907]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:27:26 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.
Feb 23 09:27:26 np0005626463.localdomain podman[228910]: 2026-02-23 09:27:26.764048404 +0000 UTC m=+0.077347370 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.43.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Feb 23 09:27:26 np0005626463.localdomain podman[228910]: 2026-02-23 09:27:26.795241287 +0000 UTC m=+0.108540233 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.43.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Feb 23 09:27:26 np0005626463.localdomain systemd[1]: 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully.
Feb 23 09:27:26 np0005626463.localdomain python3.9[228909]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:27:26 np0005626463.localdomain sudo[228907]: pam_unix(sudo:session): session closed for user root
Feb 23 09:27:27 np0005626463.localdomain sudo[228940]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 09:27:27 np0005626463.localdomain sudo[228940]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:27:27 np0005626463.localdomain sudo[228940]: pam_unix(sudo:session): session closed for user root
Feb 23 09:27:27 np0005626463.localdomain sudo[228977]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 23 09:27:27 np0005626463.localdomain sudo[228977]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:27:27 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9778 DF PROTO=TCP SPT=37540 DPT=9102 SEQ=1072854065 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF296C70000000001030307) 
Feb 23 09:27:27 np0005626463.localdomain sudo[229069]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kwnuixzlqfmymavgzweirxblqspzheuc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838847.048551-3336-60579750834291/AnsiballZ_file.py
Feb 23 09:27:27 np0005626463.localdomain sudo[229069]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:27:27 np0005626463.localdomain python3.9[229071]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 23 09:27:27 np0005626463.localdomain sudo[229069]: pam_unix(sudo:session): session closed for user root
Feb 23 09:27:27 np0005626463.localdomain sudo[228977]: pam_unix(sudo:session): session closed for user root
Feb 23 09:27:27 np0005626463.localdomain sudo[229211]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ybvrpqnxckvjdbgagkoznxqzcqlfelie ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838847.7595446-3360-270404393237033/AnsiballZ_stat.py
Feb 23 09:27:27 np0005626463.localdomain sudo[229211]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:27:28 np0005626463.localdomain python3.9[229213]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/nova_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:27:28 np0005626463.localdomain sudo[229211]: pam_unix(sudo:session): session closed for user root
Feb 23 09:27:28 np0005626463.localdomain sudo[229216]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 09:27:28 np0005626463.localdomain sudo[229216]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:27:28 np0005626463.localdomain sudo[229216]: pam_unix(sudo:session): session closed for user root
Feb 23 09:27:28 np0005626463.localdomain sudo[229319]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hzttmzftlngmmtthoaumoxqyzfwxdgik ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838847.7595446-3360-270404393237033/AnsiballZ_copy.py
Feb 23 09:27:28 np0005626463.localdomain sudo[229319]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:27:28 np0005626463.localdomain python3.9[229321]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/nova_compute.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1771838847.7595446-3360-270404393237033/.source.json _original_basename=.qlvxpjcy follow=False checksum=0018389a48392615f4a8869cad43008a907328ff backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:27:28 np0005626463.localdomain sudo[229319]: pam_unix(sudo:session): session closed for user root
Feb 23 09:27:29 np0005626463.localdomain python3.9[229429]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/nova_compute state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:27:30 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18776 DF PROTO=TCP SPT=39096 DPT=9100 SEQ=2040014006 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF2A5460000000001030307) 
Feb 23 09:27:32 np0005626463.localdomain sudo[229731]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iqohopmqbrvxcfrrwkpuzvgjdbjqadrg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838852.2509542-3480-228005009676306/AnsiballZ_container_config_data.py
Feb 23 09:27:32 np0005626463.localdomain sudo[229731]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:27:32 np0005626463.localdomain python3.9[229733]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/nova_compute config_pattern=*.json debug=False
Feb 23 09:27:32 np0005626463.localdomain sudo[229731]: pam_unix(sudo:session): session closed for user root
Feb 23 09:27:32 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18777 DF PROTO=TCP SPT=39096 DPT=9100 SEQ=2040014006 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF2AD470000000001030307) 
Feb 23 09:27:33 np0005626463.localdomain sshd[229751]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:27:33 np0005626463.localdomain sudo[229843]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-upqrbrjsmaqgvlxewajukfhqimivdbvi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838853.210696-3513-73095807175891/AnsiballZ_container_config_hash.py
Feb 23 09:27:33 np0005626463.localdomain sudo[229843]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:27:33 np0005626463.localdomain sshd[229751]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:27:33 np0005626463.localdomain python3.9[229845]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Feb 23 09:27:33 np0005626463.localdomain sudo[229843]: pam_unix(sudo:session): session closed for user root
Feb 23 09:27:34 np0005626463.localdomain sudo[229953]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ojrqnywqkkyxkqbhgijwnkofmofpobpq ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1771838854.2782528-3543-246817450031858/AnsiballZ_edpm_container_manage.py
Feb 23 09:27:34 np0005626463.localdomain sudo[229953]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:27:34 np0005626463.localdomain sshd[229956]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:27:34 np0005626463.localdomain python3[229955]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/nova_compute config_id=nova_compute config_overrides={} config_patterns=*.json containers=['nova_compute'] log_base_path=/var/log/containers/stdouts debug=False
Feb 23 09:27:35 np0005626463.localdomain python3[229955]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: [
                                                               {
                                                                    "Id": "72feed39d002da96e9458f5df3225bc8b72f1ae28f906a4ea01e253f86aab9e3",
                                                                    "Digest": "sha256:60339e5e0cd7bfe18718bee79174c18ef91b932586fd96f01b9799d5d120385d",
                                                                    "RepoTags": [
                                                                         "quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified"
                                                                    ],
                                                                    "RepoDigests": [
                                                                         "quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:60339e5e0cd7bfe18718bee79174c18ef91b932586fd96f01b9799d5d120385d"
                                                                    ],
                                                                    "Parent": "",
                                                                    "Comment": "",
                                                                    "Created": "2026-02-23T06:27:42.035349623Z",
                                                                    "Config": {
                                                                         "User": "nova",
                                                                         "Env": [
                                                                              "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",
                                                                              "LANG=en_US.UTF-8",
                                                                              "TZ=UTC",
                                                                              "container=oci"
                                                                         ],
                                                                         "Entrypoint": [
                                                                              "dumb-init",
                                                                              "--single-child",
                                                                              "--"
                                                                         ],
                                                                         "Cmd": [
                                                                              "kolla_start"
                                                                         ],
                                                                         "Labels": {
                                                                              "io.buildah.version": "1.43.0",
                                                                              "maintainer": "OpenStack Kubernetes Operator team",
                                                                              "org.label-schema.build-date": "20260216",
                                                                              "org.label-schema.license": "GPLv2",
                                                                              "org.label-schema.name": "CentOS Stream 9 Base Image",
                                                                              "org.label-schema.schema-version": "1.0",
                                                                              "org.label-schema.vendor": "CentOS",
                                                                              "tcib_build_tag": "8419493e1fd846703d277695e03fc5eb",
                                                                              "tcib_managed": "true"
                                                                         },
                                                                         "StopSignal": "SIGTERM"
                                                                    },
                                                                    "Version": "",
                                                                    "Author": "",
                                                                    "Architecture": "amd64",
                                                                    "Os": "linux",
                                                                    "Size": 1216089983,
                                                                    "VirtualSize": 1216089983,
                                                                    "GraphDriver": {
                                                                         "Name": "overlay",
                                                                         "Data": {
                                                                              "LowerDir": "/var/lib/containers/storage/overlay/239567307c66a10c4dd721df6a9263fcc38501437d275d2b4907c616b635d111/diff:/var/lib/containers/storage/overlay/0455f1f13172510bfb03afa514ad1dc5f28a2039a4c0ae85e44e0cde63814ca4/diff:/var/lib/containers/storage/overlay/882df85a0cf43e46bc799aafd5ff81035654b304c2fef5dbd26c9dd0c2e9fcc3/diff:/var/lib/containers/storage/overlay/d9f14c75a7289cf010d2e5175c554193dba109f864fe39fc418f3bc5b90efe9d/diff",
                                                                              "UpperDir": "/var/lib/containers/storage/overlay/7e5a6b3af0e35b266ef2f57ba1f524615772066427004655f3e99c4f9072865c/diff",
                                                                              "WorkDir": "/var/lib/containers/storage/overlay/7e5a6b3af0e35b266ef2f57ba1f524615772066427004655f3e99c4f9072865c/work"
                                                                         }
                                                                    },
                                                                    "RootFS": {
                                                                         "Type": "layers",
                                                                         "Layers": [
                                                                              "sha256:d9f14c75a7289cf010d2e5175c554193dba109f864fe39fc418f3bc5b90efe9d",
                                                                              "sha256:6eb5d45c6942983139aec78264b4b68bafe46465bb40e2bb4c09e78dad8ba6c0",
                                                                              "sha256:9a59f9675e4fdfdb0eaa24dcce26bed374feef6430ea888b6f5ef1274a95bd90",
                                                                              "sha256:5511acb0625eca242fd47549a8bafd7826358a029c48a9158ddd6fa2b7e0b86d",
                                                                              "sha256:1f1e90f8b2058c74071fe0298f6d20f4d1edbde3bdd940d26fcd35c036f677a8"
                                                                         ]
                                                                    },
                                                                    "Labels": {
                                                                         "io.buildah.version": "1.43.0",
                                                                         "maintainer": "OpenStack Kubernetes Operator team",
                                                                         "org.label-schema.build-date": "20260216",
                                                                         "org.label-schema.license": "GPLv2",
                                                                         "org.label-schema.name": "CentOS Stream 9 Base Image",
                                                                         "org.label-schema.schema-version": "1.0",
                                                                         "org.label-schema.vendor": "CentOS",
                                                                         "tcib_build_tag": "8419493e1fd846703d277695e03fc5eb",
                                                                         "tcib_managed": "true"
                                                                    },
                                                                    "Annotations": {},
                                                                    "ManifestType": "application/vnd.docker.distribution.manifest.v2+json",
                                                                    "User": "nova",
                                                                    "History": [
                                                                         {
                                                                              "created": "2026-02-17T01:25:07.246646992Z",
                                                                              "created_by": "/bin/sh -c #(nop) ADD file:d064f128d9bf147a386d5c0e8c2e8a6f698c81fb4e2404e09afe5ef1e1d3b529 in / ",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-17T01:25:07.246739119Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL org.label-schema.schema-version=\"1.0\"     org.label-schema.name=\"CentOS Stream 9 Base Image\"     org.label-schema.vendor=\"CentOS\"     org.label-schema.license=\"GPLv2\"     org.label-schema.build-date=\"20260216\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-17T01:25:12.132997501Z",
                                                                              "created_by": "/bin/sh -c #(nop) CMD [\"/bin/bash\"]"
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:08:39.081651802Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL maintainer=\"OpenStack Kubernetes Operator team\"",
                                                                              "comment": "FROM quay.io/centos/centos:stream9",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:08:39.081666472Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL tcib_managed=true",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:08:39.081677733Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV LANG=\"en_US.UTF-8\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:08:39.081688343Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV TZ=\"UTC\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:08:39.081701553Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV container=\"oci\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:08:39.081710413Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:08:39.413481757Z",
                                                                              "created_by": "/bin/sh -c if [ -f \"/etc/yum.repos.d/ubi.repo\" ]; then rm -f /etc/yum.repos.d/ubi.repo && dnf clean all && rm -rf /var/cache/dnf; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:09:13.490649497Z",
                                                                              "created_by": "/bin/sh -c dnf install -y crudini && crudini --del /etc/dnf/dnf.conf main override_install_langs && crudini --set /etc/dnf/dnf.conf main clean_requirements_on_remove True && crudini --set /etc/dnf/dnf.conf main exactarch 1 && crudini --set /etc/dnf/dnf.conf main gpgcheck 1 && crudini --set /etc/dnf/dnf.conf main install_weak_deps False && if [ 'centos' == 'centos' ];then crudini --set /etc/dnf/dnf.conf main best False; fi && crudini --set /etc/dnf/dnf.conf main installonly_limit 0 && crudini --set /etc/dnf/dnf.conf main keepcache 0 && crudini --set /etc/dnf/dnf.conf main obsoletes 1 && crudini --set /etc/dnf/dnf.conf main plugins 1 && crudini --set /etc/dnf/dnf.conf main skip_missing_names_on_install False && crudini --set /etc/dnf/dnf.conf main tsflags nodocs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:09:16.454967918Z",
                                                                              "created_by": "/bin/sh -c dnf install -y ca-certificates dumb-init glibc-langpack-en procps-ng python3 sudo util-linux-user which python-tcib-containers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:09:16.773383448Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/uid_gid_manage.sh /usr/local/bin/uid_gid_manage",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:09:17.106005079Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/uid_gid_manage",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:09:17.70903377Z",
                                                                              "created_by": "/bin/sh -c bash /usr/local/bin/uid_gid_manage kolla hugetlbfs libvirt qemu",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:09:18.031262928Z",
                                                                              "created_by": "/bin/sh -c touch /usr/local/bin/kolla_extend_start && chmod 755 /usr/local/bin/kolla_extend_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:09:18.339397779Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/set_configs.py /usr/local/bin/kolla_set_configs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:09:18.685304171Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_set_configs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:09:18.995385131Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/start.sh /usr/local/bin/kolla_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:09:19.318437706Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:09:19.622355571Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/httpd_setup.sh /usr/local/bin/kolla_httpd_setup",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:09:19.942779192Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_httpd_setup",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:09:20.272959154Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/copy_cacerts.sh /usr/local/bin/kolla_copy_cacerts",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:09:20.574527009Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_copy_cacerts",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:09:20.904983206Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/sudoers /etc/sudoers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:09:21.231560784Z",
                                                                              "created_by": "/bin/sh -c chmod 440 /etc/sudoers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:09:21.544724487Z",
                                                                              "created_by": "/bin/sh -c sed -ri '/^(passwd:|group:)/ s/systemd//g' /etc/nsswitch.conf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:09:24.726828741Z",
                                                                              "created_by": "/bin/sh -c dnf -y reinstall which && rpm -e --nodeps tzdata && dnf -y install tzdata",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:09:25.052065401Z",
                                                                              "created_by": "/bin/sh -c if [ ! -f \"/etc/localtime\" ]; then ln -s /usr/share/zoneinfo/Etc/UTC /etc/localtime; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:09:25.374537445Z",
                                                                              "created_by": "/bin/sh -c mkdir -p /openstack",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:09:26.855611087Z",
                                                                              "created_by": "/bin/sh -c if [ 'centos' == 'centos' ];then if [ -n \"$(rpm -qa redhat-release)\" ];then rpm -e --nodeps redhat-release; fi ; dnf -y install centos-stream-release; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:09:28.628718632Z",
                                                                              "created_by": "/bin/sh -c dnf update --excludepkgs redhat-release -y && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:09:28.628779184Z",
                                                                              "created_by": "/bin/sh -c #(nop) STOPSIGNAL SIGTERM",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:09:28.628797064Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENTRYPOINT [\"dumb-init\", \"--single-child\", \"--\"]",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:09:28.628808854Z",
                                                                              "created_by": "/bin/sh -c #(nop) CMD [\"kolla_start\"]",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:09:29.517110337Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"8419493e1fd846703d277695e03fc5eb\""
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:11:21.746093163Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-base:8419493e1fd846703d277695e03fc5eb",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:11:58.628150825Z",
                                                                              "created_by": "/bin/sh -c dnf install -y python3-barbicanclient python3-cinderclient python3-designateclient python3-glanceclient python3-ironicclient python3-keystoneclient python3-manilaclient python3-neutronclient python3-novaclient python3-observabilityclient python3-octaviaclient python3-openstackclient python3-swiftclient python3-pymemcache && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:12:01.105956567Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"8419493e1fd846703d277695e03fc5eb\""
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:16:14.411074144Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-os:8419493e1fd846703d277695e03fc5eb",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:16:16.679986066Z",
                                                                              "created_by": "/bin/sh -c bash /usr/local/bin/uid_gid_manage nova",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:16:17.108151361Z",
                                                                              "created_by": "/bin/sh -c mkdir -p /etc/ssh && touch /etc/ssh/ssh_known_host",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:17:25.131733428Z",
                                                                              "created_by": "/bin/sh -c dnf install -y openstack-nova-common && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:17:29.831104887Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"8419493e1fd846703d277695e03fc5eb\""
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:26:11.726944348Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-nova-base:8419493e1fd846703d277695e03fc5eb",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:27:41.361948209Z",
                                                                              "created_by": "/bin/sh -c dnf -y install e2fsprogs xfsprogs xorriso iscsi-initiator-utils nfs-utils targetcli nvme-cli device-mapper-multipath ceph-common openssh-clients openstack-nova-compute openvswitch swtpm swtpm-tools && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:27:41.720772563Z",
                                                                              "created_by": "/bin/sh -c bash /usr/local/bin/uid_gid_manage nova",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:27:42.031893078Z",
                                                                              "created_by": "/bin/sh -c rm -f /etc/machine-id",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:27:42.03195279Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER nova",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:27:47.46658157Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"8419493e1fd846703d277695e03fc5eb\""
                                                                         }
                                                                    ],
                                                                    "NamesHistory": [
                                                                         "quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified"
                                                                    ]
                                                               }
                                                          ]
                                                          : quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Feb 23 09:27:35 np0005626463.localdomain podman[230005]: 2026-02-23 09:27:35.204022524 +0000 UTC m=+0.096243385 container remove c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, architecture=x86_64, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-nova-compute-container, vcs-type=git, container_name=nova_compute, distribution-scope=public, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, io.openshift.expose-services=, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:32:04Z, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., build-date=2026-01-12T23:32:04Z, name=rhosp-rhel9/openstack-nova-compute, url=https://www.redhat.com, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step5, version=17.1.13)
Feb 23 09:27:35 np0005626463.localdomain python3[229955]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman rm --force nova_compute
Feb 23 09:27:35 np0005626463.localdomain podman[230019]: 
Feb 23 09:27:35 np0005626463.localdomain podman[230019]: 2026-02-23 09:27:35.315530165 +0000 UTC m=+0.090207595 container create 8d27414cca68da82346fae1fc6c4ecb36a1a0e33cd4571c19621f2f476697e2d (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, io.buildah.version=1.43.0, config_id=nova_compute, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, managed_by=edpm_ansible, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-9420ebc64adcbe31526e40e2b0b78b2e1b7d41ddf41dd4e3243f19db7c97ac09-5f6dcf25a4eb712d7b55775b8e130167254d53f3f84c8303f8f39f30426e780b'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'nova', 'volumes': ['/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.license=GPLv2, tcib_managed=true, container_name=nova_compute, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 23 09:27:35 np0005626463.localdomain podman[230019]: 2026-02-23 09:27:35.274637763 +0000 UTC m=+0.049315243 image pull  quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Feb 23 09:27:35 np0005626463.localdomain python3[229955]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute --conmon-pidfile /run/nova_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-9420ebc64adcbe31526e40e2b0b78b2e1b7d41ddf41dd4e3243f19db7c97ac09-5f6dcf25a4eb712d7b55775b8e130167254d53f3f84c8303f8f39f30426e780b --label config_id=nova_compute --label container_name=nova_compute --label managed_by=edpm_ansible --label config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-9420ebc64adcbe31526e40e2b0b78b2e1b7d41ddf41dd4e3243f19db7c97ac09-5f6dcf25a4eb712d7b55775b8e130167254d53f3f84c8303f8f39f30426e780b'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'nova', 'volumes': ['/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']} --log-driver journald --log-level info --network host --pid host --privileged=True --user nova --volume /var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro --volume /var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /etc/localtime:/etc/localtime:ro --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /var/lib/libvirt:/var/lib/libvirt --volume /run/libvirt:/run/libvirt:shared --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/iscsi:/var/lib/iscsi --volume /etc/multipath:/etc/multipath --volume /etc/multipath.conf:/etc/multipath.conf:ro,Z --volume /etc/iscsi:/etc/iscsi:ro --volume /etc/nvme:/etc/nvme --volume /var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified kolla_start
Feb 23 09:27:35 np0005626463.localdomain sudo[229953]: pam_unix(sudo:session): session closed for user root
Feb 23 09:27:35 np0005626463.localdomain sshd[229956]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:27:35 np0005626463.localdomain sudo[230164]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wgqxxotrqxtbszrogttcchgnpcaiwhva ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838855.6648526-3567-76865406935786/AnsiballZ_stat.py
Feb 23 09:27:35 np0005626463.localdomain sudo[230164]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:27:36 np0005626463.localdomain python3.9[230166]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 23 09:27:36 np0005626463.localdomain sudo[230164]: pam_unix(sudo:session): session closed for user root
Feb 23 09:27:36 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29997 DF PROTO=TCP SPT=51930 DPT=9101 SEQ=2185825708 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF2BA070000000001030307) 
Feb 23 09:27:36 np0005626463.localdomain sudo[230276]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-obzgqnqhnrgpyklfagbolyldwxdlcdjx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838856.4587324-3594-117568781437862/AnsiballZ_file.py
Feb 23 09:27:36 np0005626463.localdomain sudo[230276]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:27:36 np0005626463.localdomain python3.9[230278]: ansible-file Invoked with path=/etc/systemd/system/edpm_nova_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:27:36 np0005626463.localdomain sudo[230276]: pam_unix(sudo:session): session closed for user root
Feb 23 09:27:37 np0005626463.localdomain sudo[230331]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-frbyiepdguxlruymmqaiwmqaatcrnncc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838856.4587324-3594-117568781437862/AnsiballZ_stat.py
Feb 23 09:27:37 np0005626463.localdomain sudo[230331]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:27:37 np0005626463.localdomain python3.9[230333]: ansible-stat Invoked with path=/etc/systemd/system/edpm_nova_compute_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 23 09:27:37 np0005626463.localdomain sudo[230331]: pam_unix(sudo:session): session closed for user root
Feb 23 09:27:37 np0005626463.localdomain sudo[230440]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qxcwfplsgfvzjztfzocyivmgxtnsydiq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838857.405786-3594-3872314354872/AnsiballZ_copy.py
Feb 23 09:27:37 np0005626463.localdomain sudo[230440]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:27:37 np0005626463.localdomain python3.9[230442]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1771838857.405786-3594-3872314354872/source dest=/etc/systemd/system/edpm_nova_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:27:38 np0005626463.localdomain sudo[230440]: pam_unix(sudo:session): session closed for user root
Feb 23 09:27:38 np0005626463.localdomain sudo[230495]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rvgmozjqddtfkcplmmeqpjvkpxamfcea ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838857.405786-3594-3872314354872/AnsiballZ_systemd.py
Feb 23 09:27:38 np0005626463.localdomain sudo[230495]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:27:38 np0005626463.localdomain python3.9[230497]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 23 09:27:38 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 09:27:38 np0005626463.localdomain systemd-rc-local-generator[230523]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 09:27:38 np0005626463.localdomain systemd-sysv-generator[230527]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 09:27:38 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:27:38 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 23 09:27:38 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:27:38 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:27:38 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 09:27:38 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 23 09:27:38 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:27:38 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:27:38 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:27:38 np0005626463.localdomain sudo[230495]: pam_unix(sudo:session): session closed for user root
Feb 23 09:27:39 np0005626463.localdomain sudo[230587]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-olyxleraxgkuxuliktlpmpmsfizmkzno ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838857.405786-3594-3872314354872/AnsiballZ_systemd.py
Feb 23 09:27:39 np0005626463.localdomain sudo[230587]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:27:39 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9780 DF PROTO=TCP SPT=37540 DPT=9102 SEQ=1072854065 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF2C6070000000001030307) 
Feb 23 09:27:39 np0005626463.localdomain python3.9[230589]: ansible-systemd Invoked with state=restarted name=edpm_nova_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 09:27:40 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 09:27:40 np0005626463.localdomain systemd-sysv-generator[230621]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 09:27:40 np0005626463.localdomain systemd-rc-local-generator[230616]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 09:27:40 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:27:40 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 23 09:27:40 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:27:40 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:27:40 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 09:27:40 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 23 09:27:40 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:27:40 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:27:40 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:27:40 np0005626463.localdomain systemd[1]: Starting nova_compute container...
Feb 23 09:27:41 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 09:27:41 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/81751aaa607236c95e65d1a8738b503c5aac60039da98c3685192990b6f247b4/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Feb 23 09:27:41 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/81751aaa607236c95e65d1a8738b503c5aac60039da98c3685192990b6f247b4/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Feb 23 09:27:41 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/81751aaa607236c95e65d1a8738b503c5aac60039da98c3685192990b6f247b4/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Feb 23 09:27:41 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/81751aaa607236c95e65d1a8738b503c5aac60039da98c3685192990b6f247b4/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Feb 23 09:27:41 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/81751aaa607236c95e65d1a8738b503c5aac60039da98c3685192990b6f247b4/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Feb 23 09:27:41 np0005626463.localdomain podman[230630]: 2026-02-23 09:27:41.073056103 +0000 UTC m=+0.133058675 container init 8d27414cca68da82346fae1fc6c4ecb36a1a0e33cd4571c19621f2f476697e2d (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-9420ebc64adcbe31526e40e2b0b78b2e1b7d41ddf41dd4e3243f19db7c97ac09-5f6dcf25a4eb712d7b55775b8e130167254d53f3f84c8303f8f39f30426e780b'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'nova', 'volumes': ['/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, managed_by=edpm_ansible, config_id=nova_compute, container_name=nova_compute, org.label-schema.license=GPLv2, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb)
Feb 23 09:27:41 np0005626463.localdomain podman[230630]: 2026-02-23 09:27:41.083109403 +0000 UTC m=+0.143111975 container start 8d27414cca68da82346fae1fc6c4ecb36a1a0e33cd4571c19621f2f476697e2d (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=nova_compute, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-9420ebc64adcbe31526e40e2b0b78b2e1b7d41ddf41dd4e3243f19db7c97ac09-5f6dcf25a4eb712d7b55775b8e130167254d53f3f84c8303f8f39f30426e780b'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'nova', 'volumes': ['/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, managed_by=edpm_ansible, container_name=nova_compute, maintainer=OpenStack Kubernetes Operator team)
Feb 23 09:27:41 np0005626463.localdomain podman[230630]: nova_compute
Feb 23 09:27:41 np0005626463.localdomain nova_compute[230643]: + sudo -E kolla_set_configs
Feb 23 09:27:41 np0005626463.localdomain systemd[1]: Started nova_compute container.
Feb 23 09:27:41 np0005626463.localdomain sudo[230587]: pam_unix(sudo:session): session closed for user root
Feb 23 09:27:41 np0005626463.localdomain nova_compute[230643]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Feb 23 09:27:41 np0005626463.localdomain nova_compute[230643]: INFO:__main__:Validating config file
Feb 23 09:27:41 np0005626463.localdomain nova_compute[230643]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Feb 23 09:27:41 np0005626463.localdomain nova_compute[230643]: INFO:__main__:Copying service configuration files
Feb 23 09:27:41 np0005626463.localdomain nova_compute[230643]: INFO:__main__:Deleting /etc/nova/nova.conf
Feb 23 09:27:41 np0005626463.localdomain nova_compute[230643]: INFO:__main__:Copying /var/lib/kolla/config_files/src/nova-blank.conf to /etc/nova/nova.conf
Feb 23 09:27:41 np0005626463.localdomain nova_compute[230643]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Feb 23 09:27:41 np0005626463.localdomain nova_compute[230643]: INFO:__main__:Copying /var/lib/kolla/config_files/src/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Feb 23 09:27:41 np0005626463.localdomain nova_compute[230643]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Feb 23 09:27:41 np0005626463.localdomain nova_compute[230643]: INFO:__main__:Copying /var/lib/kolla/config_files/src/03-ceph-nova.conf to /etc/nova/nova.conf.d/03-ceph-nova.conf
Feb 23 09:27:41 np0005626463.localdomain nova_compute[230643]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/03-ceph-nova.conf
Feb 23 09:27:41 np0005626463.localdomain nova_compute[230643]: INFO:__main__:Copying /var/lib/kolla/config_files/src/99-nova-compute-cells-workarounds.conf to /etc/nova/nova.conf.d/99-nova-compute-cells-workarounds.conf
Feb 23 09:27:41 np0005626463.localdomain nova_compute[230643]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/99-nova-compute-cells-workarounds.conf
Feb 23 09:27:41 np0005626463.localdomain nova_compute[230643]: INFO:__main__:Copying /var/lib/kolla/config_files/src/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Feb 23 09:27:41 np0005626463.localdomain nova_compute[230643]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Feb 23 09:27:41 np0005626463.localdomain nova_compute[230643]: INFO:__main__:Copying /var/lib/kolla/config_files/src/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Feb 23 09:27:41 np0005626463.localdomain nova_compute[230643]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Feb 23 09:27:41 np0005626463.localdomain nova_compute[230643]: INFO:__main__:Deleting /etc/ceph
Feb 23 09:27:41 np0005626463.localdomain nova_compute[230643]: INFO:__main__:Creating directory /etc/ceph
Feb 23 09:27:41 np0005626463.localdomain nova_compute[230643]: INFO:__main__:Setting permission for /etc/ceph
Feb 23 09:27:41 np0005626463.localdomain nova_compute[230643]: INFO:__main__:Copying /var/lib/kolla/config_files/src/ceph/ceph.conf to /etc/ceph/ceph.conf
Feb 23 09:27:41 np0005626463.localdomain nova_compute[230643]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Feb 23 09:27:41 np0005626463.localdomain nova_compute[230643]: INFO:__main__:Copying /var/lib/kolla/config_files/src/ceph/ceph.client.openstack.keyring to /etc/ceph/ceph.client.openstack.keyring
Feb 23 09:27:41 np0005626463.localdomain nova_compute[230643]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Feb 23 09:27:41 np0005626463.localdomain nova_compute[230643]: INFO:__main__:Copying /var/lib/kolla/config_files/src/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Feb 23 09:27:41 np0005626463.localdomain nova_compute[230643]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Feb 23 09:27:41 np0005626463.localdomain nova_compute[230643]: INFO:__main__:Deleting /var/lib/nova/.ssh/config
Feb 23 09:27:41 np0005626463.localdomain nova_compute[230643]: INFO:__main__:Copying /var/lib/kolla/config_files/src/ssh-config to /var/lib/nova/.ssh/config
Feb 23 09:27:41 np0005626463.localdomain nova_compute[230643]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Feb 23 09:27:41 np0005626463.localdomain nova_compute[230643]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Feb 23 09:27:41 np0005626463.localdomain nova_compute[230643]: INFO:__main__:Copying /var/lib/kolla/config_files/src/run-on-host to /usr/sbin/iscsiadm
Feb 23 09:27:41 np0005626463.localdomain nova_compute[230643]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Feb 23 09:27:41 np0005626463.localdomain nova_compute[230643]: INFO:__main__:Writing out command to execute
Feb 23 09:27:41 np0005626463.localdomain nova_compute[230643]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Feb 23 09:27:41 np0005626463.localdomain nova_compute[230643]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Feb 23 09:27:41 np0005626463.localdomain nova_compute[230643]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Feb 23 09:27:41 np0005626463.localdomain nova_compute[230643]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Feb 23 09:27:41 np0005626463.localdomain nova_compute[230643]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Feb 23 09:27:41 np0005626463.localdomain nova_compute[230643]: ++ cat /run_command
Feb 23 09:27:41 np0005626463.localdomain nova_compute[230643]: + CMD=nova-compute
Feb 23 09:27:41 np0005626463.localdomain nova_compute[230643]: + ARGS=
Feb 23 09:27:41 np0005626463.localdomain nova_compute[230643]: + sudo kolla_copy_cacerts
Feb 23 09:27:41 np0005626463.localdomain nova_compute[230643]: + [[ ! -n '' ]]
Feb 23 09:27:41 np0005626463.localdomain nova_compute[230643]: + . kolla_extend_start
Feb 23 09:27:41 np0005626463.localdomain nova_compute[230643]: + echo 'Running command: '\''nova-compute'\'''
Feb 23 09:27:41 np0005626463.localdomain nova_compute[230643]: Running command: 'nova-compute'
Feb 23 09:27:41 np0005626463.localdomain nova_compute[230643]: + umask 0022
Feb 23 09:27:41 np0005626463.localdomain nova_compute[230643]: + exec nova-compute
Feb 23 09:27:41 np0005626463.localdomain python3.9[230762]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Feb 23 09:27:42 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29410 DF PROTO=TCP SPT=39054 DPT=9882 SEQ=3387491264 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF2D2860000000001030307) 
Feb 23 09:27:42 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:42.875 230647 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Feb 23 09:27:42 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:42.876 230647 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Feb 23 09:27:42 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:42.876 230647 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Feb 23 09:27:42 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:42.876 230647 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.001 230647 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.024 230647 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.023s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.025 230647 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473
Feb 23 09:27:43 np0005626463.localdomain sudo[230875]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-izijhvxjkfmjjoeorlyuqtrhclblseaf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838863.1308303-3729-25537200188804/AnsiballZ_stat.py
Feb 23 09:27:43 np0005626463.localdomain sudo[230875]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.514 230647 INFO nova.virt.driver [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.629 230647 INFO nova.compute.provider_config [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
Feb 23 09:27:43 np0005626463.localdomain python3.9[230877]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.644 230647 WARNING nova.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] Current Nova version does not support computes older than Yoga but the minimum compute service level in your cell is 57 and the oldest supported service level is 61.: nova.exception.TooOldComputeService: Current Nova version does not support computes older than Yoga but the minimum compute service level in your cell is 57 and the oldest supported service level is 61.
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.645 230647 DEBUG oslo_concurrency.lockutils [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.645 230647 DEBUG oslo_concurrency.lockutils [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.645 230647 DEBUG oslo_concurrency.lockutils [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.645 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.646 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.646 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.646 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.646 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.646 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.646 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.646 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.646 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.647 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.647 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.647 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.647 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.647 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.647 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.647 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.648 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.648 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.648 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.648 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] console_host                   = np0005626463.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.648 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.648 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.648 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.649 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.649 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.649 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.649 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.649 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.649 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.649 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.650 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.650 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.650 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.650 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.650 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.650 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.650 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.651 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.651 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] host                           = np0005626463.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.651 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.651 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.651 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.651 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.652 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.652 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.652 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.652 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.652 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.652 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.652 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.653 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.653 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.653 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.653 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.653 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.653 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.653 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.654 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.654 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.654 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.654 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.654 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.654 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.654 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.655 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.655 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.655 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.655 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.655 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.655 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.655 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.655 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.656 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.656 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.656 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.656 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.656 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.656 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.656 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.657 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.657 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] my_block_storage_ip            = 192.168.122.106 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.657 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] my_ip                          = 192.168.122.106 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.657 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.657 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.657 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.657 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.658 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.658 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.658 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.658 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.658 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.658 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.658 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.659 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.659 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.659 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.659 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.659 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.659 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.659 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.660 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.660 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.660 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.660 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.660 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.660 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.660 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.661 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.661 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.661 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.661 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.661 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.661 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.661 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.661 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.662 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.662 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.662 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.662 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.662 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.662 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.662 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.663 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.663 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.663 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.663 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:43 np0005626463.localdomain sudo[230875]: pam_unix(sudo:session): session closed for user root
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.665 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.665 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.666 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.666 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.666 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.667 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.667 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.667 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.668 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.668 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.668 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.669 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.669 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.669 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.669 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.670 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.670 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.670 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.671 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.671 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.671 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.671 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.672 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.672 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.672 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.673 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.673 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.673 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.674 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.674 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.674 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.675 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.675 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.675 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.675 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.676 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.676 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.676 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.677 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.677 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.677 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.677 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.678 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.678 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.678 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.679 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.679 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.679 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.680 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.680 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.680 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.680 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.681 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.681 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.681 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.682 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.682 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.682 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.683 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.683 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.683 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.684 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.684 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.684 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.684 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.685 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.685 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.685 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.686 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.686 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.686 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.686 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.687 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.687 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.687 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.688 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.688 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.688 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.688 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.689 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.689 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.689 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.690 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.690 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.690 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.691 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.691 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.691 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.692 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.692 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.692 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.693 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.693 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.693 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.693 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.694 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.694 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] cinder.os_region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.694 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.695 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.695 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.695 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.695 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.696 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.696 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.696 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.697 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.697 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.697 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.698 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.698 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.698 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.699 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.699 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.699 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.700 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.700 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.700 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.700 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.701 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.701 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.702 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.702 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.702 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.702 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.703 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.703 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.703 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.703 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.703 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.704 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.704 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.704 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.704 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.704 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.705 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.705 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.705 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.705 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.705 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.706 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.706 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.706 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.706 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.706 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.706 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.707 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.707 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.707 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.707 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.707 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.708 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.708 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.708 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.708 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.708 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.708 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.709 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.709 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.709 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.709 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.709 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.710 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.710 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.710 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.710 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.710 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.710 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.711 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.711 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.711 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.711 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.711 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.712 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.712 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.712 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.712 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.712 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.712 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.713 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.713 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.713 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.713 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.713 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.714 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.714 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.714 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.714 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.714 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.715 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.715 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.715 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.715 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.715 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.715 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.716 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.716 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.716 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.716 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.716 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.717 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.717 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.717 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.717 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.717 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.717 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.718 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.718 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.718 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.718 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.718 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.719 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.719 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.719 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.719 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.719 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.719 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.720 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.720 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.720 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.720 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.720 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.721 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.721 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.721 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.721 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.721 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.722 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.722 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.722 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.722 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.723 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.723 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.723 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.723 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.723 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.724 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.724 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.724 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.724 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.724 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.724 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.725 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.725 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.725 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.725 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.725 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.726 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.726 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.726 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.726 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.726 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.727 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.727 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.727 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.727 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.727 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.727 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.728 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.728 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.728 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.728 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.728 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.729 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.729 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.729 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.729 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.729 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.730 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.730 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] barbican.barbican_region_name  = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.730 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.730 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.730 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.731 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.731 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.731 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.731 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.731 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.731 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.732 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.732 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.732 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.732 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.732 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.733 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.733 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.733 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.733 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.733 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.734 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.734 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.734 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.734 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.734 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.734 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.735 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.735 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.735 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.735 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.735 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.735 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.735 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.736 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.736 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.736 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.736 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.736 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.736 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.736 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.736 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.737 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.737 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.737 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.737 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.737 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.737 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.737 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.737 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.738 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.738 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.738 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.738 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.738 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.738 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.738 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.739 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.739 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.739 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.cpu_mode               = host-model log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.739 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.739 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.cpu_models             = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.739 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.739 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.740 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.740 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.740 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.740 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.740 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.740 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.740 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.740 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.741 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.741 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.741 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.741 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.images_rbd_ceph_conf   = /etc/ceph/ceph.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.741 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.741 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.741 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.images_rbd_glance_store_name = default_backend log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.742 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.images_rbd_pool        = vms log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.742 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.images_type            = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.742 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.742 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.742 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.742 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.742 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.743 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.743 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.743 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.743 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.743 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.743 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.743 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.743 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.744 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.744 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.744 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.744 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.744 230647 WARNING oslo_config.cfg [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: live_migration_uri is deprecated for removal in favor of two other options that
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: and ``live_migration_inbound_addr`` respectively.
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: ).  Its value may be silently ignored in the future.
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.744 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.live_migration_uri     = qemu+ssh://nova@%s/system?keyfile=/var/lib/nova/.ssh/ssh-privatekey log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.745 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.live_migration_with_native_tls = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.745 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.745 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.745 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.745 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.745 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.745 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.746 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.746 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.746 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.746 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.746 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.746 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.746 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.747 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.747 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.747 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.747 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.rbd_secret_uuid        = f1fea371-cb69-578d-a3d0-b5c472a84b46 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.747 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.rbd_user               = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.747 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.747 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.748 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.748 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.748 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.748 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.748 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.748 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.749 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.749 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.749 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.749 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.749 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.749 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.749 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.750 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.750 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.750 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.750 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.750 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.750 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.750 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.751 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.751 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.751 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.751 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.751 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.751 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.751 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.752 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.752 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.752 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.752 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.752 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.752 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.752 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.753 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.753 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.753 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.753 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.753 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.753 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.753 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.754 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.754 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.754 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.754 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.754 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.754 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.754 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.755 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.755 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.755 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.755 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.755 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.755 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.755 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.756 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.756 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.756 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.756 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.756 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.756 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.756 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.757 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.757 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.757 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.757 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.757 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.757 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.757 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] placement.auth_url             = http://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.758 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.758 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.758 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.758 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.758 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.758 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.758 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.758 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.759 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.759 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.759 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.759 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.759 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.759 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.759 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.760 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.760 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.760 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.760 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.760 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.760 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.760 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.760 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.761 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.761 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.761 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.761 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.761 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.761 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.761 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.762 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.762 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.762 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.762 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.762 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.762 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.762 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.763 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.763 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.763 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.763 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.763 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.763 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.763 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.764 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.764 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.764 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.764 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.764 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.764 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.764 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.765 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.765 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.765 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.765 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.765 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.765 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.765 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.766 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.766 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.766 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.766 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.766 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.766 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.766 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.767 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.767 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.767 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.767 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.767 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.767 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.767 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.767 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.768 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.768 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.768 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.768 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.768 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.768 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.768 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.769 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.769 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.769 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.769 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.769 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.769 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.769 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.770 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.770 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.770 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.770 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.770 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.770 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.770 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.771 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.771 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.771 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.771 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.771 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.771 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.771 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.772 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.772 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.772 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.772 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.772 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.772 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.772 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.773 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.773 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.773 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.773 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.773 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.773 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.773 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.774 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.774 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.774 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.774 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.774 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.774 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.774 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.775 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.775 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.775 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.775 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.775 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.775 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.775 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.775 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.776 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.776 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.776 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.776 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.776 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.776 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.776 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.777 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.777 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.777 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.777 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.777 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.777 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.777 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.777 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.778 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.778 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.778 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.778 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.778 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.778 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.778 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.779 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.779 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.779 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.779 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.779 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.779 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] vnc.novncproxy_base_url        = http://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.780 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.780 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.780 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.780 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] vnc.server_proxyclient_address = 192.168.122.106 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.780 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.780 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.781 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.781 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] workarounds.disable_compute_service_check_for_ffu = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.781 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.781 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.781 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.781 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.781 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.781 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.782 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.782 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.782 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.782 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.782 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.782 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.782 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.783 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.783 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.783 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.783 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.783 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.783 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.783 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.784 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.784 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.784 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.784 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.784 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.784 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.784 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.785 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.785 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.785 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.785 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.785 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.785 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.785 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.786 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.786 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.786 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.786 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.786 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.786 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.786 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.787 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.787 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.787 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.787 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.787 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.787 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.787 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.788 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.788 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.788 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.788 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.788 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.788 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.789 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.789 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.789 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.789 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.789 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.789 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.789 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.790 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.790 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.790 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.790 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.790 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.790 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.791 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.791 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.791 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.791 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.791 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.791 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.791 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.791 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.792 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.792 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.792 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.792 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.792 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.792 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.793 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.793 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.793 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.793 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.793 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.793 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_limit.auth_url            = http://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.793 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.794 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.794 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.794 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.794 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.794 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.794 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.794 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.795 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.795 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.795 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.795 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.795 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.795 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.795 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.795 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.796 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.796 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.796 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.796 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.796 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.796 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.796 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.796 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.797 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.797 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.797 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.797 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.797 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.797 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.797 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.798 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.798 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.798 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.798 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.798 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.798 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.798 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.799 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.799 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.799 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.799 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.799 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.799 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.799 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.800 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.800 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.800 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.800 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.800 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.800 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.800 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.800 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.801 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.801 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.801 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.801 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.801 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.801 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.802 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.802 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.802 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.802 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.802 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.802 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.802 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.803 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.803 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.803 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.803 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.803 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.803 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.803 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.803 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.804 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.804 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.804 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.804 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.804 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.804 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.805 230647 INFO nova.service [-] Starting compute node (version 27.5.2-0.20260220085704.5cfeecb.el9)
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.836 230647 INFO nova.virt.node [None req-b3cc632c-4226-49a1-abb9-cf70f5f05fbd - - - - - -] Determined node identity be63d86c-a403-4ec9-a515-07ea2962cb4d from /var/lib/nova/compute_id
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.837 230647 DEBUG nova.virt.libvirt.host [None req-b3cc632c-4226-49a1-abb9-cf70f5f05fbd - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.838 230647 DEBUG nova.virt.libvirt.host [None req-b3cc632c-4226-49a1-abb9-cf70f5f05fbd - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.838 230647 DEBUG nova.virt.libvirt.host [None req-b3cc632c-4226-49a1-abb9-cf70f5f05fbd - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.838 230647 DEBUG nova.virt.libvirt.host [None req-b3cc632c-4226-49a1-abb9-cf70f5f05fbd - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.848 230647 DEBUG nova.virt.libvirt.host [None req-b3cc632c-4226-49a1-abb9-cf70f5f05fbd - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7f76a68f1220> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.850 230647 DEBUG nova.virt.libvirt.host [None req-b3cc632c-4226-49a1-abb9-cf70f5f05fbd - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7f76a68f1220> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.851 230647 INFO nova.virt.libvirt.driver [None req-b3cc632c-4226-49a1-abb9-cf70f5f05fbd - - - - - -] Connection event '1' reason 'None'
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.864 230647 DEBUG nova.virt.libvirt.volume.mount [None req-b3cc632c-4226-49a1-abb9-cf70f5f05fbd - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.873 230647 INFO nova.virt.libvirt.host [None req-b3cc632c-4226-49a1-abb9-cf70f5f05fbd - - - - - -] Libvirt host capabilities <capabilities>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:   <host>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:     <uuid>bdcaa433-cfc7-450a-99ab-f0985ab59447</uuid>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:     <cpu>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <arch>x86_64</arch>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model>EPYC-Rome-v4</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <vendor>AMD</vendor>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <microcode version='16777317'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <signature family='23' model='49' stepping='0'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <maxphysaddr mode='emulate' bits='40'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <feature name='x2apic'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <feature name='tsc-deadline'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <feature name='osxsave'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <feature name='hypervisor'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <feature name='tsc_adjust'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <feature name='spec-ctrl'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <feature name='stibp'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <feature name='arch-capabilities'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <feature name='ssbd'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <feature name='cmp_legacy'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <feature name='topoext'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <feature name='virt-ssbd'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <feature name='lbrv'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <feature name='tsc-scale'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <feature name='vmcb-clean'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <feature name='pause-filter'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <feature name='pfthreshold'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <feature name='svme-addr-chk'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <feature name='rdctl-no'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <feature name='skip-l1dfl-vmentry'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <feature name='mds-no'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <feature name='pschange-mc-no'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <pages unit='KiB' size='4'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <pages unit='KiB' size='2048'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <pages unit='KiB' size='1048576'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:     </cpu>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:     <power_management>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <suspend_mem/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <suspend_disk/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <suspend_hybrid/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:     </power_management>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:     <iommu support='no'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:     <migration_features>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <live/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <uri_transports>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <uri_transport>tcp</uri_transport>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <uri_transport>rdma</uri_transport>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </uri_transports>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:     </migration_features>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:     <topology>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <cells num='1'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <cell id='0'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:           <memory unit='KiB'>16116612</memory>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:           <pages unit='KiB' size='4'>4029153</pages>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:           <pages unit='KiB' size='2048'>0</pages>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:           <pages unit='KiB' size='1048576'>0</pages>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:           <distances>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:             <sibling id='0' value='10'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:           </distances>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:           <cpus num='8'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:             <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:             <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:             <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:             <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:             <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:             <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:             <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:             <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:           </cpus>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         </cell>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </cells>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:     </topology>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:     <cache>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:     </cache>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:     <secmodel>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model>selinux</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <doi>0</doi>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:     </secmodel>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:     <secmodel>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model>dac</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <doi>0</doi>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <baselabel type='kvm'>+107:+107</baselabel>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <baselabel type='qemu'>+107:+107</baselabel>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:     </secmodel>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:   </host>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:   <guest>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:     <os_type>hvm</os_type>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:     <arch name='i686'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <wordsize>32</wordsize>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <domain type='qemu'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <domain type='kvm'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:     </arch>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:     <features>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <pae/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <nonpae/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <acpi default='on' toggle='yes'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <apic default='on' toggle='no'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <cpuselection/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <deviceboot/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <disksnapshot default='on' toggle='no'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <externalSnapshot/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:     </features>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:   </guest>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:   <guest>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:     <os_type>hvm</os_type>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:     <arch name='x86_64'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <wordsize>64</wordsize>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <domain type='qemu'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <domain type='kvm'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:     </arch>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:     <features>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <acpi default='on' toggle='yes'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <apic default='on' toggle='no'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <cpuselection/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <deviceboot/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <disksnapshot default='on' toggle='no'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <externalSnapshot/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:     </features>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:   </guest>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: </capabilities>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.878 230647 DEBUG nova.virt.libvirt.host [None req-b3cc632c-4226-49a1-abb9-cf70f5f05fbd - - - - - -] Getting domain capabilities for i686 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.895 230647 DEBUG nova.virt.libvirt.host [None req-b3cc632c-4226-49a1-abb9-cf70f5f05fbd - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: <domainCapabilities>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:   <path>/usr/libexec/qemu-kvm</path>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:   <domain>kvm</domain>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:   <machine>pc-q35-rhel9.8.0</machine>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:   <arch>i686</arch>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:   <vcpu max='1024'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:   <iothreads supported='yes'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:   <os supported='yes'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:     <enum name='firmware'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:     <loader supported='yes'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <enum name='type'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <value>rom</value>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <value>pflash</value>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </enum>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <enum name='readonly'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <value>yes</value>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <value>no</value>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </enum>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <enum name='secure'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <value>no</value>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </enum>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:     </loader>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:   </os>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:   <cpu>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:     <mode name='host-passthrough' supported='yes'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <enum name='hostPassthroughMigratable'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <value>on</value>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <value>off</value>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </enum>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:     </mode>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:     <mode name='maximum' supported='yes'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <enum name='maximumMigratable'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <value>on</value>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <value>off</value>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </enum>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:     </mode>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:     <mode name='host-model' supported='yes'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model fallback='forbid'>EPYC-Rome</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <vendor>AMD</vendor>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <maxphysaddr mode='passthrough' limit='40'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <feature policy='require' name='x2apic'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <feature policy='require' name='tsc-deadline'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <feature policy='require' name='hypervisor'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <feature policy='require' name='tsc_adjust'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <feature policy='require' name='spec-ctrl'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <feature policy='require' name='stibp'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <feature policy='require' name='ssbd'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <feature policy='require' name='cmp_legacy'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <feature policy='require' name='overflow-recov'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <feature policy='require' name='succor'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <feature policy='require' name='ibrs'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <feature policy='require' name='amd-ssbd'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <feature policy='require' name='virt-ssbd'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <feature policy='require' name='lbrv'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <feature policy='require' name='tsc-scale'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <feature policy='require' name='vmcb-clean'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <feature policy='require' name='pause-filter'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <feature policy='require' name='pfthreshold'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <feature policy='require' name='svme-addr-chk'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <feature policy='require' name='lfence-always-serializing'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <feature policy='disable' name='xsaves'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:     </mode>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:     <mode name='custom' supported='yes'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='Broadwell'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='hle'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='rtm'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='Broadwell-IBRS'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='hle'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='rtm'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='Broadwell-noTSX'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='Broadwell-noTSX-IBRS'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='Broadwell-v1'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='hle'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='rtm'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='Broadwell-v2'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='Broadwell-v3'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='hle'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='rtm'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='Broadwell-v4'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='Cascadelake-Server'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vnni'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='hle'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='rtm'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='Cascadelake-Server-noTSX'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vnni'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='ibrs-all'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='Cascadelake-Server-v1'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vnni'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='hle'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='rtm'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='Cascadelake-Server-v2'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vnni'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='hle'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='ibrs-all'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='rtm'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='Cascadelake-Server-v3'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vnni'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='ibrs-all'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='Cascadelake-Server-v4'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vnni'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='ibrs-all'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='Cascadelake-Server-v5'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vnni'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='ibrs-all'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='ClearwaterForest'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx-ifma'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx-ne-convert'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx-vnni'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx-vnni-int16'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx-vnni-int8'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='bhi-ctrl'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='bhi-no'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='bus-lock-detect'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='cldemote'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='cmpccxadd'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='ddpd-u'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='fbsdp-no'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrm'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrs'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='gds-no'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='gfni'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='ibrs-all'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='intel-psfd'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='ipred-ctrl'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='lam'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='mcdt-no'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='movdir64b'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='movdiri'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pbrsb-no'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='prefetchiti'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='psdp-no'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='rfds-no'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='rrsba-ctrl'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='serialize'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='sha512'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='sm3'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='sm4'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='ss'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='vaes'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='ClearwaterForest-v1'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx-ifma'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx-ne-convert'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx-vnni'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx-vnni-int16'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx-vnni-int8'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='bhi-ctrl'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='bhi-no'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='bus-lock-detect'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='cldemote'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='cmpccxadd'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='ddpd-u'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='fbsdp-no'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrm'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrs'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='gds-no'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='gfni'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='ibrs-all'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='intel-psfd'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='ipred-ctrl'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='lam'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='mcdt-no'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='movdir64b'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='movdiri'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pbrsb-no'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='prefetchiti'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='psdp-no'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='rfds-no'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='rrsba-ctrl'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='serialize'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='sha512'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='sm3'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='sm4'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='ss'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='vaes'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='Cooperlake'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-bf16'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vnni'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='hle'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='ibrs-all'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='rtm'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='taa-no'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='Cooperlake-v1'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-bf16'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vnni'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='hle'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='ibrs-all'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='rtm'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='taa-no'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='Cooperlake-v2'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-bf16'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vnni'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='hle'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='ibrs-all'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='rtm'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='taa-no'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='Denverton'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='mpx'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='Denverton-v1'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='mpx'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='Denverton-v2'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='Denverton-v3'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='Dhyana-v2'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='EPYC-Genoa'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='amd-psfd'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='auto-ibrs'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-bf16'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bitalg'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512ifma'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vnni'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrm'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='gfni'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='la57'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='no-nested-data-bp'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='null-sel-clr-base'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='stibp-always-on'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='vaes'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='EPYC-Genoa-v1'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='amd-psfd'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='auto-ibrs'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-bf16'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bitalg'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512ifma'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vnni'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrm'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='gfni'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='la57'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='no-nested-data-bp'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='null-sel-clr-base'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='stibp-always-on'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='vaes'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='EPYC-Genoa-v2'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='amd-psfd'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='auto-ibrs'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-bf16'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bitalg'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512ifma'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vnni'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='fs-gs-base-ns'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrm'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='gfni'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='la57'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='no-nested-data-bp'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='null-sel-clr-base'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='perfmon-v2'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='stibp-always-on'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='vaes'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='EPYC-Milan'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrm'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='EPYC-Milan-v1'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrm'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='EPYC-Milan-v2'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='amd-psfd'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrm'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='no-nested-data-bp'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='null-sel-clr-base'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='stibp-always-on'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='vaes'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='EPYC-Milan-v3'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='amd-psfd'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrm'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='no-nested-data-bp'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='null-sel-clr-base'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='stibp-always-on'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='vaes'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='EPYC-Rome'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='EPYC-Rome-v1'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='EPYC-Rome-v2'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='EPYC-Rome-v3'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='EPYC-Turin'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='amd-psfd'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='auto-ibrs'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx-vnni'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-bf16'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-vp2intersect'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bitalg'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512ifma'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vnni'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='fs-gs-base-ns'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrm'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='gfni'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='ibpb-brtype'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='la57'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='movdir64b'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='movdiri'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='no-nested-data-bp'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='null-sel-clr-base'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='perfmon-v2'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='prefetchi'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='sbpb'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='srso-user-kernel-no'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='stibp-always-on'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='vaes'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='EPYC-Turin-v1'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='amd-psfd'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='auto-ibrs'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx-vnni'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-bf16'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-vp2intersect'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bitalg'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512ifma'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vnni'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='fs-gs-base-ns'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrm'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='gfni'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='ibpb-brtype'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='la57'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='movdir64b'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='movdiri'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='no-nested-data-bp'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='null-sel-clr-base'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='perfmon-v2'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='prefetchi'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='sbpb'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='srso-user-kernel-no'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='stibp-always-on'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='vaes'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='EPYC-v3'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='EPYC-v4'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='EPYC-v5'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='GraniteRapids'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='amx-bf16'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='amx-fp16'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='amx-int8'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='amx-tile'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx-vnni'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-bf16'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-fp16'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bitalg'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512ifma'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vnni'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='bus-lock-detect'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='fbsdp-no'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrc'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrm'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrs'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='fzrm'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='gfni'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='hle'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='ibrs-all'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='la57'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='mcdt-no'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pbrsb-no'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='prefetchiti'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='psdp-no'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='rtm'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='serialize'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='taa-no'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='tsx-ldtrk'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='vaes'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='xfd'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='GraniteRapids-v1'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='amx-bf16'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='amx-fp16'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='amx-int8'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='amx-tile'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx-vnni'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-bf16'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-fp16'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bitalg'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512ifma'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vnni'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='bus-lock-detect'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='fbsdp-no'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrc'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrm'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrs'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='fzrm'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='gfni'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='hle'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='ibrs-all'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='la57'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='mcdt-no'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pbrsb-no'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='prefetchiti'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='psdp-no'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='rtm'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='serialize'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='taa-no'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='tsx-ldtrk'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='vaes'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='xfd'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='GraniteRapids-v2'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='amx-bf16'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='amx-fp16'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='amx-int8'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='amx-tile'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx-vnni'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx10'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx10-128'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx10-256'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx10-512'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-bf16'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-fp16'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bitalg'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512ifma'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vnni'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='bus-lock-detect'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='cldemote'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='fbsdp-no'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrc'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrm'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrs'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='fzrm'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='gfni'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='hle'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='ibrs-all'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='la57'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='mcdt-no'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='movdir64b'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='movdiri'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pbrsb-no'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='prefetchiti'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='psdp-no'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='rtm'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='serialize'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='ss'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='taa-no'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='tsx-ldtrk'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='vaes'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='xfd'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='GraniteRapids-v3'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='amx-bf16'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='amx-fp16'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='amx-int8'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='amx-tile'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx-vnni'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx10'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx10-128'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx10-256'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx10-512'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-bf16'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-fp16'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bitalg'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512ifma'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vnni'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='bus-lock-detect'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='cldemote'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='fbsdp-no'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrc'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrm'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrs'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='fzrm'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='gfni'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='hle'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='ibrs-all'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='la57'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='mcdt-no'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='movdir64b'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='movdiri'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pbrsb-no'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='prefetchiti'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='psdp-no'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='rtm'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='serialize'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='ss'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='taa-no'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='tsx-ldtrk'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='vaes'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='xfd'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='Haswell'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='hle'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='rtm'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='Haswell-IBRS'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='hle'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='rtm'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='Haswell-noTSX'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='Haswell-noTSX-IBRS'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='Haswell-v1'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='hle'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='rtm'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='Haswell-v2'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='Haswell-v3'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='hle'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='rtm'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='Haswell-v4'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='Icelake-Server'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bitalg'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vnni'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='gfni'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='hle'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='la57'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='rtm'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='vaes'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='Icelake-Server-noTSX'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bitalg'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vnni'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='gfni'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='la57'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='vaes'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='Icelake-Server-v1'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bitalg'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vnni'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='gfni'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='hle'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='la57'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='rtm'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='vaes'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='Icelake-Server-v2'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bitalg'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vnni'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='gfni'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='la57'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='vaes'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='Icelake-Server-v3'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bitalg'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vnni'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='gfni'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='ibrs-all'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='la57'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='taa-no'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='vaes'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='Icelake-Server-v4'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bitalg'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512ifma'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vnni'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrm'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='gfni'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='ibrs-all'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='la57'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='taa-no'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='vaes'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='Icelake-Server-v5'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bitalg'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512ifma'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vnni'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrm'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='gfni'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='ibrs-all'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='la57'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='taa-no'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='vaes'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='Icelake-Server-v6'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bitalg'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512ifma'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vnni'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrm'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='gfni'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='ibrs-all'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='la57'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='taa-no'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='vaes'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='Icelake-Server-v7'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bitalg'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512ifma'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vnni'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrm'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='gfni'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='hle'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='ibrs-all'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='la57'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='rtm'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='taa-no'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='vaes'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='IvyBridge'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='IvyBridge-IBRS'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='IvyBridge-v1'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='IvyBridge-v2'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='KnightsMill'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-4fmaps'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-4vnniw'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512er'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512pf'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='ss'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='KnightsMill-v1'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-4fmaps'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-4vnniw'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512er'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512pf'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='ss'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='Opteron_G4'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='fma4'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='xop'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='Opteron_G4-v1'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='fma4'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='xop'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='Opteron_G5'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='fma4'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='tbm'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='xop'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='Opteron_G5-v1'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='fma4'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='tbm'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='xop'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='SapphireRapids'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='amx-bf16'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='amx-int8'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='amx-tile'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx-vnni'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-bf16'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-fp16'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bitalg'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512ifma'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vnni'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='bus-lock-detect'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrc'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrm'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrs'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='fzrm'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='gfni'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='hle'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='ibrs-all'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='la57'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='rtm'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='serialize'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='taa-no'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='tsx-ldtrk'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='vaes'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='xfd'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='SapphireRapids-v1'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='amx-bf16'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='amx-int8'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='amx-tile'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx-vnni'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-bf16'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-fp16'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bitalg'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512ifma'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vnni'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='bus-lock-detect'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrc'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrm'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrs'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='fzrm'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='gfni'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='hle'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='ibrs-all'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='la57'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='rtm'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='serialize'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='taa-no'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='tsx-ldtrk'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='vaes'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='xfd'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='SapphireRapids-v2'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='amx-bf16'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='amx-int8'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='amx-tile'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx-vnni'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-bf16'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-fp16'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bitalg'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512ifma'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vnni'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='bus-lock-detect'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='fbsdp-no'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrc'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrm'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrs'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='fzrm'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='gfni'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='hle'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='ibrs-all'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='la57'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='psdp-no'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='rtm'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='serialize'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='taa-no'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='tsx-ldtrk'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='vaes'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='xfd'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='SapphireRapids-v3'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='amx-bf16'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='amx-int8'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='amx-tile'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx-vnni'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-bf16'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-fp16'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bitalg'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512ifma'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vnni'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='bus-lock-detect'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='cldemote'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='fbsdp-no'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrc'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrm'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrs'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='fzrm'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='gfni'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='hle'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='ibrs-all'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='la57'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='movdir64b'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='movdiri'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='psdp-no'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='rtm'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='serialize'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='ss'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='taa-no'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='tsx-ldtrk'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='vaes'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='xfd'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='SapphireRapids-v4'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='amx-bf16'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='amx-int8'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='amx-tile'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx-vnni'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-bf16'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-fp16'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bitalg'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512ifma'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vnni'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='bus-lock-detect'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='cldemote'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='fbsdp-no'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrc'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrm'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrs'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='fzrm'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='gfni'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='hle'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='ibrs-all'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='la57'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='movdir64b'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='movdiri'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='psdp-no'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='rtm'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='serialize'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='ss'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='taa-no'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='tsx-ldtrk'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='vaes'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='xfd'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='SierraForest'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx-ifma'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx-ne-convert'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx-vnni'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx-vnni-int8'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='bus-lock-detect'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='cmpccxadd'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='fbsdp-no'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrm'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrs'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='gfni'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='ibrs-all'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='mcdt-no'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pbrsb-no'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='psdp-no'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='serialize'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='vaes'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='SierraForest-v1'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx-ifma'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx-ne-convert'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx-vnni'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx-vnni-int8'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='bus-lock-detect'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='cmpccxadd'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='fbsdp-no'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrm'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrs'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='gfni'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='ibrs-all'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='mcdt-no'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pbrsb-no'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='psdp-no'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='serialize'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='vaes'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='SierraForest-v2'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx-ifma'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx-ne-convert'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx-vnni'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx-vnni-int8'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='bhi-ctrl'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='bus-lock-detect'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='cldemote'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='cmpccxadd'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='fbsdp-no'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrm'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrs'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='gds-no'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='gfni'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='ibrs-all'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='intel-psfd'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='ipred-ctrl'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='lam'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='mcdt-no'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='movdir64b'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='movdiri'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pbrsb-no'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='psdp-no'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='rfds-no'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='rrsba-ctrl'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='serialize'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='ss'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='vaes'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='SierraForest-v3'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx-ifma'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx-ne-convert'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx-vnni'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx-vnni-int8'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='bhi-ctrl'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='bus-lock-detect'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='cldemote'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='cmpccxadd'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='fbsdp-no'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrm'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrs'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='gds-no'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='gfni'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='ibrs-all'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='intel-psfd'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='ipred-ctrl'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='lam'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='mcdt-no'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='movdir64b'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='movdiri'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pbrsb-no'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='psdp-no'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='rfds-no'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='rrsba-ctrl'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='serialize'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='ss'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='vaes'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='Skylake-Client'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='hle'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='rtm'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='Skylake-Client-IBRS'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='hle'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='rtm'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='Skylake-Client-v1'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='hle'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='rtm'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='Skylake-Client-v2'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='hle'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='rtm'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='Skylake-Client-v3'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='Skylake-Client-v4'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='Skylake-Server'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='hle'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='rtm'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='Skylake-Server-IBRS'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='hle'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='rtm'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='Skylake-Server-v1'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='hle'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='rtm'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='Skylake-Server-v2'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='hle'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='rtm'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='Skylake-Server-v3'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='Skylake-Server-v4'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='Skylake-Server-v5'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='Snowridge'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='cldemote'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='core-capability'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='gfni'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='movdir64b'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='movdiri'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='mpx'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='split-lock-detect'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='Snowridge-v1'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='cldemote'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='core-capability'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='gfni'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='movdir64b'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='movdiri'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='mpx'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='split-lock-detect'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='Snowridge-v2'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='cldemote'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='core-capability'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='gfni'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='movdir64b'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='movdiri'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='split-lock-detect'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='Snowridge-v3'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='cldemote'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='core-capability'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='gfni'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='movdir64b'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='movdiri'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='split-lock-detect'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='Snowridge-v4'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='cldemote'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='gfni'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='movdir64b'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='movdiri'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='athlon'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='3dnow'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='3dnowext'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='athlon-v1'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='3dnow'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='3dnowext'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='core2duo'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='ss'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='core2duo-v1'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='ss'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='coreduo'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='ss'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='coreduo-v1'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='ss'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='n270'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='ss'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='n270-v1'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='ss'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='phenom'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='3dnow'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='3dnowext'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='phenom-v1'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='3dnow'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='3dnowext'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:     </mode>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:   </cpu>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:   <memoryBacking supported='yes'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:     <enum name='sourceType'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <value>file</value>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <value>anonymous</value>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <value>memfd</value>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:     </enum>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:   </memoryBacking>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:   <devices>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:     <disk supported='yes'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <enum name='diskDevice'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <value>disk</value>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <value>cdrom</value>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <value>floppy</value>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <value>lun</value>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </enum>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <enum name='bus'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <value>fdc</value>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <value>scsi</value>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <value>virtio</value>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <value>usb</value>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <value>sata</value>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </enum>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <enum name='model'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <value>virtio</value>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <value>virtio-transitional</value>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <value>virtio-non-transitional</value>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </enum>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:     </disk>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:     <graphics supported='yes'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <enum name='type'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <value>vnc</value>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <value>egl-headless</value>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <value>dbus</value>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </enum>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:     </graphics>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:     <video supported='yes'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <enum name='modelType'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <value>vga</value>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <value>cirrus</value>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <value>virtio</value>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <value>none</value>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <value>bochs</value>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <value>ramfb</value>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </enum>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:     </video>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:     <hostdev supported='yes'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <enum name='mode'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <value>subsystem</value>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </enum>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <enum name='startupPolicy'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <value>default</value>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <value>mandatory</value>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <value>requisite</value>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <value>optional</value>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </enum>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <enum name='subsysType'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <value>usb</value>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <value>pci</value>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <value>scsi</value>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </enum>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <enum name='capsType'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <enum name='pciBackend'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:     </hostdev>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:     <rng supported='yes'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <enum name='model'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <value>virtio</value>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <value>virtio-transitional</value>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <value>virtio-non-transitional</value>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </enum>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <enum name='backendModel'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <value>random</value>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <value>egd</value>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <value>builtin</value>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </enum>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:     </rng>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:     <filesystem supported='yes'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <enum name='driverType'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <value>path</value>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <value>handle</value>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <value>virtiofs</value>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </enum>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:     </filesystem>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:     <tpm supported='yes'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <enum name='model'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <value>tpm-tis</value>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <value>tpm-crb</value>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </enum>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <enum name='backendModel'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <value>emulator</value>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <value>external</value>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </enum>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <enum name='backendVersion'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <value>2.0</value>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </enum>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:     </tpm>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:     <redirdev supported='yes'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <enum name='bus'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <value>usb</value>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </enum>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:     </redirdev>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:     <channel supported='yes'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <enum name='type'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <value>pty</value>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <value>unix</value>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </enum>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:     </channel>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:     <crypto supported='yes'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <enum name='model'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <enum name='type'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <value>qemu</value>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </enum>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <enum name='backendModel'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <value>builtin</value>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </enum>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:     </crypto>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:     <interface supported='yes'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <enum name='backendType'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <value>default</value>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <value>passt</value>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </enum>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:     </interface>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:     <panic supported='yes'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <enum name='model'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <value>isa</value>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <value>hyperv</value>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </enum>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:     </panic>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:     <console supported='yes'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <enum name='type'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <value>null</value>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <value>vc</value>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <value>pty</value>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <value>dev</value>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <value>file</value>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <value>pipe</value>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <value>stdio</value>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <value>udp</value>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <value>tcp</value>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <value>unix</value>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <value>qemu-vdagent</value>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <value>dbus</value>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </enum>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:     </console>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:   </devices>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:   <features>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:     <gic supported='no'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:     <vmcoreinfo supported='yes'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:     <genid supported='yes'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:     <backingStoreInput supported='yes'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:     <backup supported='yes'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:     <async-teardown supported='yes'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:     <s390-pv supported='no'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:     <ps2 supported='yes'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:     <tdx supported='no'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:     <sev supported='no'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:     <sgx supported='no'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:     <hyperv supported='yes'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <enum name='features'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <value>relaxed</value>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <value>vapic</value>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <value>spinlocks</value>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <value>vpindex</value>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <value>runtime</value>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <value>synic</value>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <value>stimer</value>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <value>reset</value>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <value>vendor_id</value>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <value>frequencies</value>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <value>reenlightenment</value>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <value>tlbflush</value>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <value>ipi</value>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <value>avic</value>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <value>emsr_bitmap</value>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <value>xmm_input</value>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </enum>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <defaults>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <spinlocks>4095</spinlocks>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <stimer_direct>on</stimer_direct>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <tlbflush_direct>off</tlbflush_direct>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <tlbflush_extended>off</tlbflush_extended>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <vendor_id>Linux KVM Hv</vendor_id>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </defaults>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:     </hyperv>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:     <launchSecurity supported='no'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:   </features>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: </domainCapabilities>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.901 230647 DEBUG nova.virt.libvirt.host [None req-b3cc632c-4226-49a1-abb9-cf70f5f05fbd - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]: <domainCapabilities>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:   <path>/usr/libexec/qemu-kvm</path>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:   <domain>kvm</domain>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:   <machine>pc-i440fx-rhel7.6.0</machine>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:   <arch>i686</arch>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:   <vcpu max='240'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:   <iothreads supported='yes'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:   <os supported='yes'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:     <enum name='firmware'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:     <loader supported='yes'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <enum name='type'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <value>rom</value>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <value>pflash</value>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </enum>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <enum name='readonly'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <value>yes</value>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <value>no</value>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </enum>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <enum name='secure'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <value>no</value>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </enum>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:     </loader>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:   </os>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:   <cpu>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:     <mode name='host-passthrough' supported='yes'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <enum name='hostPassthroughMigratable'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <value>on</value>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <value>off</value>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </enum>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:     </mode>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:     <mode name='maximum' supported='yes'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <enum name='maximumMigratable'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <value>on</value>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <value>off</value>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </enum>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:     </mode>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:     <mode name='host-model' supported='yes'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model fallback='forbid'>EPYC-Rome</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <vendor>AMD</vendor>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <maxphysaddr mode='passthrough' limit='40'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <feature policy='require' name='x2apic'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <feature policy='require' name='tsc-deadline'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <feature policy='require' name='hypervisor'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <feature policy='require' name='tsc_adjust'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <feature policy='require' name='spec-ctrl'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <feature policy='require' name='stibp'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <feature policy='require' name='ssbd'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <feature policy='require' name='cmp_legacy'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <feature policy='require' name='overflow-recov'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <feature policy='require' name='succor'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <feature policy='require' name='ibrs'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <feature policy='require' name='amd-ssbd'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <feature policy='require' name='virt-ssbd'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <feature policy='require' name='lbrv'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <feature policy='require' name='tsc-scale'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <feature policy='require' name='vmcb-clean'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <feature policy='require' name='pause-filter'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <feature policy='require' name='pfthreshold'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <feature policy='require' name='svme-addr-chk'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <feature policy='require' name='lfence-always-serializing'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <feature policy='disable' name='xsaves'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:     </mode>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:     <mode name='custom' supported='yes'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='Broadwell'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='hle'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='rtm'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='Broadwell-IBRS'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='hle'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='rtm'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='Broadwell-noTSX'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='Broadwell-noTSX-IBRS'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='Broadwell-v1'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='hle'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='rtm'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='Broadwell-v2'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='Broadwell-v3'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='hle'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='rtm'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='Broadwell-v4'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='Cascadelake-Server'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vnni'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='hle'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='rtm'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='Cascadelake-Server-noTSX'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vnni'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='ibrs-all'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='Cascadelake-Server-v1'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vnni'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='hle'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='rtm'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='Cascadelake-Server-v2'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vnni'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='hle'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='ibrs-all'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='rtm'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='Cascadelake-Server-v3'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vnni'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='ibrs-all'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='Cascadelake-Server-v4'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vnni'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='ibrs-all'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='Cascadelake-Server-v5'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vnni'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='ibrs-all'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='ClearwaterForest'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx-ifma'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx-ne-convert'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx-vnni'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx-vnni-int16'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx-vnni-int8'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='bhi-ctrl'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='bhi-no'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='bus-lock-detect'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='cldemote'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='cmpccxadd'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='ddpd-u'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='fbsdp-no'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrm'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrs'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='gds-no'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='gfni'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='ibrs-all'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='intel-psfd'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='ipred-ctrl'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='lam'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='mcdt-no'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='movdir64b'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='movdiri'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pbrsb-no'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='prefetchiti'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='psdp-no'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='rfds-no'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='rrsba-ctrl'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='serialize'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='sha512'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='sm3'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='sm4'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='ss'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='vaes'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='ClearwaterForest-v1'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx-ifma'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx-ne-convert'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx-vnni'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx-vnni-int16'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx-vnni-int8'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='bhi-ctrl'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='bhi-no'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='bus-lock-detect'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='cldemote'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='cmpccxadd'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='ddpd-u'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='fbsdp-no'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrm'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrs'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='gds-no'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='gfni'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='ibrs-all'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='intel-psfd'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='ipred-ctrl'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='lam'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='mcdt-no'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='movdir64b'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='movdiri'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pbrsb-no'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='prefetchiti'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='psdp-no'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='rfds-no'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='rrsba-ctrl'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='serialize'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='sha512'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='sm3'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='sm4'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='ss'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='vaes'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='Cooperlake'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-bf16'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vnni'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='hle'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='ibrs-all'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='rtm'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='taa-no'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='Cooperlake-v1'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-bf16'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vnni'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='hle'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='ibrs-all'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='rtm'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='taa-no'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='Cooperlake-v2'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-bf16'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vnni'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='hle'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='ibrs-all'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='rtm'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='taa-no'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='Denverton'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='mpx'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='Denverton-v1'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='mpx'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='Denverton-v2'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='Denverton-v3'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='Dhyana-v2'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='EPYC-Genoa'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='amd-psfd'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='auto-ibrs'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-bf16'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bitalg'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512ifma'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vnni'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrm'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='gfni'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='la57'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='no-nested-data-bp'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='null-sel-clr-base'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='stibp-always-on'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='vaes'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='EPYC-Genoa-v1'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='amd-psfd'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='auto-ibrs'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-bf16'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bitalg'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512ifma'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vnni'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrm'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='gfni'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='la57'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='no-nested-data-bp'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='null-sel-clr-base'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='stibp-always-on'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='vaes'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='EPYC-Genoa-v2'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='amd-psfd'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='auto-ibrs'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-bf16'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bitalg'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512ifma'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vnni'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='fs-gs-base-ns'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrm'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='gfni'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='la57'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='no-nested-data-bp'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='null-sel-clr-base'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='perfmon-v2'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='stibp-always-on'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='vaes'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='EPYC-Milan'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrm'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='EPYC-Milan-v1'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrm'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='EPYC-Milan-v2'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='amd-psfd'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrm'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='no-nested-data-bp'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='null-sel-clr-base'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='stibp-always-on'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='vaes'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='EPYC-Milan-v3'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='amd-psfd'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrm'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='no-nested-data-bp'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='null-sel-clr-base'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='stibp-always-on'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='vaes'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='EPYC-Rome'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='EPYC-Rome-v1'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='EPYC-Rome-v2'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='EPYC-Rome-v3'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='EPYC-Turin'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='amd-psfd'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='auto-ibrs'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx-vnni'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-bf16'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-vp2intersect'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bitalg'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512ifma'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vnni'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='fs-gs-base-ns'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrm'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='gfni'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='ibpb-brtype'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='la57'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='movdir64b'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='movdiri'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='no-nested-data-bp'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='null-sel-clr-base'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='perfmon-v2'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='prefetchi'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='sbpb'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='srso-user-kernel-no'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='stibp-always-on'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='vaes'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='EPYC-Turin-v1'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='amd-psfd'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='auto-ibrs'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx-vnni'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-bf16'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-vp2intersect'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bitalg'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512ifma'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vnni'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='fs-gs-base-ns'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrm'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='gfni'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='ibpb-brtype'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='la57'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='movdir64b'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='movdiri'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='no-nested-data-bp'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='null-sel-clr-base'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='perfmon-v2'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='prefetchi'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='sbpb'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='srso-user-kernel-no'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='stibp-always-on'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='vaes'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='EPYC-v3'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='EPYC-v4'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='EPYC-v5'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='GraniteRapids'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='amx-bf16'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='amx-fp16'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='amx-int8'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='amx-tile'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx-vnni'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-bf16'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-fp16'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bitalg'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512ifma'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vnni'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='bus-lock-detect'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='fbsdp-no'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrc'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrm'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrs'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='fzrm'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='gfni'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='hle'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='ibrs-all'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='la57'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='mcdt-no'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pbrsb-no'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='prefetchiti'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='psdp-no'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='rtm'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='serialize'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='taa-no'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='tsx-ldtrk'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='vaes'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='xfd'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='GraniteRapids-v1'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='amx-bf16'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='amx-fp16'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='amx-int8'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='amx-tile'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx-vnni'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-bf16'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-fp16'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bitalg'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512ifma'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vnni'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='bus-lock-detect'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='fbsdp-no'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrc'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrm'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrs'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='fzrm'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='gfni'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='hle'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='ibrs-all'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='la57'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='mcdt-no'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pbrsb-no'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='prefetchiti'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='psdp-no'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='rtm'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='serialize'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='taa-no'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='tsx-ldtrk'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='vaes'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='xfd'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='GraniteRapids-v2'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='amx-bf16'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='amx-fp16'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='amx-int8'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='amx-tile'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx-vnni'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx10'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx10-128'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx10-256'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx10-512'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-bf16'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-fp16'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bitalg'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512ifma'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vnni'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='bus-lock-detect'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='cldemote'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='fbsdp-no'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrc'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrm'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrs'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='fzrm'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='gfni'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='hle'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='ibrs-all'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='la57'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='mcdt-no'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='movdir64b'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='movdiri'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pbrsb-no'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='prefetchiti'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='psdp-no'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='rtm'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='serialize'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='ss'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='taa-no'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='tsx-ldtrk'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='vaes'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='xfd'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='GraniteRapids-v3'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='amx-bf16'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='amx-fp16'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='amx-int8'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='amx-tile'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx-vnni'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx10'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx10-128'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx10-256'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx10-512'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-bf16'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-fp16'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bitalg'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512ifma'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vnni'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='bus-lock-detect'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='cldemote'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='fbsdp-no'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrc'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrm'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrs'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='fzrm'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='gfni'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='hle'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='ibrs-all'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='la57'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='mcdt-no'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='movdir64b'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='movdiri'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pbrsb-no'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='prefetchiti'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='psdp-no'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='rtm'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='serialize'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='ss'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='taa-no'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='tsx-ldtrk'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='vaes'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='xfd'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='Haswell'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='hle'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='rtm'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='Haswell-IBRS'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='hle'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='rtm'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='Haswell-noTSX'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='Haswell-noTSX-IBRS'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='Haswell-v1'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='hle'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='rtm'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='Haswell-v2'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='Haswell-v3'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='hle'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='rtm'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='Haswell-v4'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='Icelake-Server'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bitalg'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vnni'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='gfni'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='hle'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='la57'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='rtm'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='vaes'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='Icelake-Server-noTSX'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bitalg'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vnni'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='gfni'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='la57'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='vaes'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain sudo[230990]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hplxbqnnqffqrxdedxiflqkmiootontb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838863.1308303-3729-25537200188804/AnsiballZ_copy.py
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='Icelake-Server-v1'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bitalg'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vnni'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='gfni'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='hle'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='la57'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='rtm'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='vaes'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='Icelake-Server-v2'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bitalg'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vnni'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='gfni'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='la57'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='vaes'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='Icelake-Server-v3'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bitalg'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vnni'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='gfni'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='ibrs-all'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='la57'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='taa-no'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='vaes'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='Icelake-Server-v4'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bitalg'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512ifma'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vnni'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrm'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='gfni'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='ibrs-all'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='la57'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='taa-no'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='vaes'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain sudo[230990]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='Icelake-Server-v5'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bitalg'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512ifma'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vnni'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrm'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='gfni'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='ibrs-all'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='la57'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='taa-no'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='vaes'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='Icelake-Server-v6'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bitalg'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512ifma'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vnni'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrm'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='gfni'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='ibrs-all'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='la57'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='taa-no'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='vaes'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='Icelake-Server-v7'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bitalg'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512ifma'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vnni'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrm'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='gfni'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='hle'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='ibrs-all'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='la57'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='rtm'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='taa-no'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='vaes'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='IvyBridge'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='IvyBridge-IBRS'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='IvyBridge-v1'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='IvyBridge-v2'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='KnightsMill'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-4fmaps'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-4vnniw'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512er'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512pf'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='ss'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <blockers model='KnightsMill-v1'>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-4fmaps'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-4vnniw'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512er'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512pf'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:         <feature name='ss'/>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Feb 23 09:27:43 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='Opteron_G4'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fma4'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='xop'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='Opteron_G4-v1'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fma4'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='xop'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='Opteron_G5'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fma4'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='tbm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='xop'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='Opteron_G5-v1'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fma4'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='tbm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='xop'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='SapphireRapids'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='amx-bf16'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='amx-int8'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='amx-tile'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx-vnni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-bf16'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-fp16'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bitalg'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512ifma'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vnni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='bus-lock-detect'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrc'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrs'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fzrm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='gfni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='hle'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='ibrs-all'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='la57'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='rtm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='serialize'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='taa-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='tsx-ldtrk'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='vaes'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='xfd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='SapphireRapids-v1'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='amx-bf16'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='amx-int8'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='amx-tile'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx-vnni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-bf16'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-fp16'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bitalg'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512ifma'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vnni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='bus-lock-detect'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrc'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrs'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fzrm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='gfni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='hle'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='ibrs-all'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='la57'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='rtm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='serialize'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='taa-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='tsx-ldtrk'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='vaes'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='xfd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='SapphireRapids-v2'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='amx-bf16'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='amx-int8'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='amx-tile'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx-vnni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-bf16'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-fp16'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bitalg'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512ifma'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vnni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='bus-lock-detect'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fbsdp-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrc'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrs'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fzrm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='gfni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='hle'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='ibrs-all'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='la57'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='psdp-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='rtm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='serialize'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='taa-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='tsx-ldtrk'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='vaes'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='xfd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='SapphireRapids-v3'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='amx-bf16'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='amx-int8'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='amx-tile'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx-vnni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-bf16'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-fp16'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bitalg'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512ifma'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vnni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='bus-lock-detect'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='cldemote'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fbsdp-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrc'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrs'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fzrm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='gfni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='hle'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='ibrs-all'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='la57'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='movdir64b'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='movdiri'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='psdp-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='rtm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='serialize'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='ss'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='taa-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='tsx-ldtrk'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='vaes'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='xfd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='SapphireRapids-v4'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='amx-bf16'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='amx-int8'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='amx-tile'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx-vnni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-bf16'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-fp16'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bitalg'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512ifma'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vnni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='bus-lock-detect'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='cldemote'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fbsdp-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrc'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrs'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fzrm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='gfni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='hle'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='ibrs-all'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='la57'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='movdir64b'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='movdiri'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='psdp-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='rtm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='serialize'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='ss'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='taa-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='tsx-ldtrk'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='vaes'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='xfd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='SierraForest'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx-ifma'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx-ne-convert'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx-vnni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx-vnni-int8'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='bus-lock-detect'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='cmpccxadd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fbsdp-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrs'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='gfni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='ibrs-all'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='mcdt-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pbrsb-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='psdp-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='serialize'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='vaes'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='SierraForest-v1'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx-ifma'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx-ne-convert'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx-vnni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx-vnni-int8'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='bus-lock-detect'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='cmpccxadd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fbsdp-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrs'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='gfni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='ibrs-all'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='mcdt-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pbrsb-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='psdp-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='serialize'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='vaes'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='SierraForest-v2'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx-ifma'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx-ne-convert'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx-vnni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx-vnni-int8'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='bhi-ctrl'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='bus-lock-detect'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='cldemote'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='cmpccxadd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fbsdp-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrs'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='gds-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='gfni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='ibrs-all'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='intel-psfd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='ipred-ctrl'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='lam'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='mcdt-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='movdir64b'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='movdiri'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pbrsb-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='psdp-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='rfds-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='rrsba-ctrl'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='serialize'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='ss'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='vaes'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='SierraForest-v3'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx-ifma'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx-ne-convert'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx-vnni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx-vnni-int8'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='bhi-ctrl'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='bus-lock-detect'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='cldemote'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='cmpccxadd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fbsdp-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrs'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='gds-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='gfni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='ibrs-all'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='intel-psfd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='ipred-ctrl'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='lam'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='mcdt-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='movdir64b'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='movdiri'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pbrsb-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='psdp-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='rfds-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='rrsba-ctrl'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='serialize'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='ss'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='vaes'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='Skylake-Client'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='hle'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='rtm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='Skylake-Client-IBRS'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='hle'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='rtm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='Skylake-Client-v1'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='hle'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='rtm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='Skylake-Client-v2'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='hle'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='rtm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='Skylake-Client-v3'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='Skylake-Client-v4'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='Skylake-Server'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='hle'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='rtm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='Skylake-Server-IBRS'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='hle'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='rtm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='Skylake-Server-v1'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='hle'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='rtm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='Skylake-Server-v2'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='hle'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='rtm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='Skylake-Server-v3'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='Skylake-Server-v4'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='Skylake-Server-v5'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='Snowridge'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='cldemote'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='core-capability'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='gfni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='movdir64b'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='movdiri'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='mpx'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='split-lock-detect'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='Snowridge-v1'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='cldemote'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='core-capability'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='gfni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='movdir64b'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='movdiri'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='mpx'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='split-lock-detect'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='Snowridge-v2'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='cldemote'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='core-capability'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='gfni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='movdir64b'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='movdiri'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='split-lock-detect'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='Snowridge-v3'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='cldemote'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='core-capability'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='gfni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='movdir64b'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='movdiri'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='split-lock-detect'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='Snowridge-v4'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='cldemote'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='gfni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='movdir64b'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='movdiri'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='athlon'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='3dnow'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='3dnowext'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='athlon-v1'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='3dnow'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='3dnowext'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='core2duo'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='ss'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='core2duo-v1'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='ss'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='coreduo'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='ss'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='coreduo-v1'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='ss'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='n270'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='ss'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='n270-v1'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='ss'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='phenom'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='3dnow'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='3dnowext'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='phenom-v1'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='3dnow'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='3dnowext'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:     </mode>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:   </cpu>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:   <memoryBacking supported='yes'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:     <enum name='sourceType'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <value>file</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <value>anonymous</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <value>memfd</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:     </enum>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:   </memoryBacking>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:   <devices>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:     <disk supported='yes'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <enum name='diskDevice'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>disk</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>cdrom</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>floppy</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>lun</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </enum>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <enum name='bus'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>ide</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>fdc</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>scsi</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>virtio</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>usb</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>sata</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </enum>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <enum name='model'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>virtio</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>virtio-transitional</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>virtio-non-transitional</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </enum>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:     </disk>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:     <graphics supported='yes'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <enum name='type'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>vnc</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>egl-headless</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>dbus</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </enum>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:     </graphics>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:     <video supported='yes'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <enum name='modelType'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>vga</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>cirrus</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>virtio</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>none</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>bochs</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>ramfb</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </enum>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:     </video>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:     <hostdev supported='yes'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <enum name='mode'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>subsystem</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </enum>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <enum name='startupPolicy'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>default</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>mandatory</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>requisite</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>optional</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </enum>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <enum name='subsysType'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>usb</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>pci</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>scsi</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </enum>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <enum name='capsType'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <enum name='pciBackend'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:     </hostdev>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:     <rng supported='yes'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <enum name='model'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>virtio</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>virtio-transitional</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>virtio-non-transitional</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </enum>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <enum name='backendModel'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>random</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>egd</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>builtin</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </enum>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:     </rng>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:     <filesystem supported='yes'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <enum name='driverType'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>path</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>handle</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>virtiofs</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </enum>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:     </filesystem>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:     <tpm supported='yes'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <enum name='model'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>tpm-tis</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>tpm-crb</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </enum>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <enum name='backendModel'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>emulator</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>external</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </enum>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <enum name='backendVersion'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>2.0</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </enum>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:     </tpm>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:     <redirdev supported='yes'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <enum name='bus'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>usb</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </enum>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:     </redirdev>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:     <channel supported='yes'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <enum name='type'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>pty</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>unix</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </enum>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:     </channel>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:     <crypto supported='yes'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <enum name='model'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <enum name='type'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>qemu</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </enum>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <enum name='backendModel'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>builtin</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </enum>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:     </crypto>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:     <interface supported='yes'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <enum name='backendType'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>default</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>passt</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </enum>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:     </interface>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:     <panic supported='yes'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <enum name='model'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>isa</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>hyperv</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </enum>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:     </panic>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:     <console supported='yes'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <enum name='type'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>null</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>vc</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>pty</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>dev</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>file</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>pipe</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>stdio</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>udp</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>tcp</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>unix</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>qemu-vdagent</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>dbus</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </enum>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:     </console>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:   </devices>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:   <features>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:     <gic supported='no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:     <vmcoreinfo supported='yes'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:     <genid supported='yes'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:     <backingStoreInput supported='yes'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:     <backup supported='yes'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:     <async-teardown supported='yes'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:     <s390-pv supported='no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:     <ps2 supported='yes'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:     <tdx supported='no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:     <sev supported='no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:     <sgx supported='no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:     <hyperv supported='yes'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <enum name='features'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>relaxed</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>vapic</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>spinlocks</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>vpindex</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>runtime</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>synic</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>stimer</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>reset</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>vendor_id</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>frequencies</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>reenlightenment</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>tlbflush</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>ipi</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>avic</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>emsr_bitmap</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>xmm_input</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </enum>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <defaults>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <spinlocks>4095</spinlocks>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <stimer_direct>on</stimer_direct>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <tlbflush_direct>off</tlbflush_direct>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <tlbflush_extended>off</tlbflush_extended>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <vendor_id>Linux KVM Hv</vendor_id>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </defaults>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:     </hyperv>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:     <launchSecurity supported='no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:   </features>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]: </domainCapabilities>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.962 230647 DEBUG nova.virt.libvirt.host [None req-b3cc632c-4226-49a1-abb9-cf70f5f05fbd - - - - - -] Getting domain capabilities for x86_64 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:43.968 230647 DEBUG nova.virt.libvirt.host [None req-b3cc632c-4226-49a1-abb9-cf70f5f05fbd - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]: <domainCapabilities>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:   <path>/usr/libexec/qemu-kvm</path>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:   <domain>kvm</domain>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:   <machine>pc-q35-rhel9.8.0</machine>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:   <arch>x86_64</arch>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:   <vcpu max='1024'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:   <iothreads supported='yes'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:   <os supported='yes'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:     <enum name='firmware'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <value>efi</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:     </enum>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:     <loader supported='yes'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <enum name='type'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>rom</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>pflash</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </enum>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <enum name='readonly'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>yes</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>no</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </enum>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <enum name='secure'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>yes</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>no</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </enum>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:     </loader>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:   </os>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:   <cpu>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:     <mode name='host-passthrough' supported='yes'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <enum name='hostPassthroughMigratable'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>on</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>off</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </enum>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:     </mode>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:     <mode name='maximum' supported='yes'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <enum name='maximumMigratable'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>on</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>off</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </enum>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:     </mode>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:     <mode name='host-model' supported='yes'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model fallback='forbid'>EPYC-Rome</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <vendor>AMD</vendor>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <maxphysaddr mode='passthrough' limit='40'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <feature policy='require' name='x2apic'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <feature policy='require' name='tsc-deadline'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <feature policy='require' name='hypervisor'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <feature policy='require' name='tsc_adjust'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <feature policy='require' name='spec-ctrl'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <feature policy='require' name='stibp'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <feature policy='require' name='ssbd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <feature policy='require' name='cmp_legacy'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <feature policy='require' name='overflow-recov'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <feature policy='require' name='succor'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <feature policy='require' name='ibrs'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <feature policy='require' name='amd-ssbd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <feature policy='require' name='virt-ssbd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <feature policy='require' name='lbrv'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <feature policy='require' name='tsc-scale'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <feature policy='require' name='vmcb-clean'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <feature policy='require' name='pause-filter'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <feature policy='require' name='pfthreshold'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <feature policy='require' name='svme-addr-chk'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <feature policy='require' name='lfence-always-serializing'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <feature policy='disable' name='xsaves'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:     </mode>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:     <mode name='custom' supported='yes'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='Broadwell'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='hle'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='rtm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='Broadwell-IBRS'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='hle'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='rtm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='Broadwell-noTSX'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='Broadwell-noTSX-IBRS'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='Broadwell-v1'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='hle'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='rtm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='Broadwell-v2'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='Broadwell-v3'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='hle'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='rtm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='Broadwell-v4'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='Cascadelake-Server'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vnni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='hle'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='rtm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='Cascadelake-Server-noTSX'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vnni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='ibrs-all'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='Cascadelake-Server-v1'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vnni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='hle'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='rtm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='Cascadelake-Server-v2'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vnni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='hle'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='ibrs-all'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='rtm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='Cascadelake-Server-v3'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vnni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='ibrs-all'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='Cascadelake-Server-v4'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vnni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='ibrs-all'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='Cascadelake-Server-v5'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vnni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='ibrs-all'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='ClearwaterForest'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx-ifma'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx-ne-convert'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx-vnni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx-vnni-int16'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx-vnni-int8'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='bhi-ctrl'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='bhi-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='bus-lock-detect'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='cldemote'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='cmpccxadd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='ddpd-u'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fbsdp-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrs'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='gds-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='gfni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='ibrs-all'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='intel-psfd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='ipred-ctrl'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='lam'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='mcdt-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='movdir64b'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='movdiri'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pbrsb-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='prefetchiti'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='psdp-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='rfds-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='rrsba-ctrl'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='serialize'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='sha512'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='sm3'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='sm4'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='ss'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='vaes'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='ClearwaterForest-v1'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx-ifma'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx-ne-convert'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx-vnni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx-vnni-int16'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx-vnni-int8'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='bhi-ctrl'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='bhi-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='bus-lock-detect'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='cldemote'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='cmpccxadd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='ddpd-u'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fbsdp-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrs'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='gds-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='gfni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='ibrs-all'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='intel-psfd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='ipred-ctrl'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='lam'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='mcdt-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='movdir64b'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='movdiri'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pbrsb-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='prefetchiti'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='psdp-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='rfds-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='rrsba-ctrl'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='serialize'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='sha512'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='sm3'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='sm4'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='ss'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='vaes'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='Cooperlake'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-bf16'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vnni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='hle'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='ibrs-all'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='rtm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='taa-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='Cooperlake-v1'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-bf16'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vnni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='hle'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='ibrs-all'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='rtm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='taa-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='Cooperlake-v2'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-bf16'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vnni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='hle'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='ibrs-all'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='rtm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='taa-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='Denverton'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='mpx'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='Denverton-v1'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='mpx'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='Denverton-v2'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='Denverton-v3'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='Dhyana-v2'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='EPYC-Genoa'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='amd-psfd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='auto-ibrs'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-bf16'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bitalg'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512ifma'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vnni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='gfni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='la57'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='no-nested-data-bp'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='null-sel-clr-base'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='stibp-always-on'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='vaes'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='EPYC-Genoa-v1'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='amd-psfd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='auto-ibrs'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-bf16'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bitalg'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512ifma'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vnni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='gfni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='la57'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='no-nested-data-bp'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='null-sel-clr-base'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='stibp-always-on'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='vaes'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='EPYC-Genoa-v2'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='amd-psfd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='auto-ibrs'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-bf16'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bitalg'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512ifma'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vnni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fs-gs-base-ns'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='gfni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='la57'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='no-nested-data-bp'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='null-sel-clr-base'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='perfmon-v2'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='stibp-always-on'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='vaes'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='EPYC-Milan'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='EPYC-Milan-v1'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='EPYC-Milan-v2'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='amd-psfd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='no-nested-data-bp'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='null-sel-clr-base'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='stibp-always-on'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='vaes'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='EPYC-Milan-v3'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='amd-psfd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='no-nested-data-bp'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='null-sel-clr-base'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='stibp-always-on'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='vaes'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='EPYC-Rome'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='EPYC-Rome-v1'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='EPYC-Rome-v2'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='EPYC-Rome-v3'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='EPYC-Turin'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='amd-psfd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='auto-ibrs'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx-vnni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-bf16'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-vp2intersect'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bitalg'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512ifma'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vnni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fs-gs-base-ns'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='gfni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='ibpb-brtype'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='la57'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='movdir64b'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='movdiri'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='no-nested-data-bp'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='null-sel-clr-base'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='perfmon-v2'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='prefetchi'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='sbpb'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='srso-user-kernel-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='stibp-always-on'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='vaes'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='EPYC-Turin-v1'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='amd-psfd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='auto-ibrs'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx-vnni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-bf16'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-vp2intersect'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bitalg'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512ifma'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vnni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fs-gs-base-ns'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='gfni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='ibpb-brtype'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='la57'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='movdir64b'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='movdiri'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='no-nested-data-bp'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='null-sel-clr-base'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='perfmon-v2'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='prefetchi'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='sbpb'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='srso-user-kernel-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='stibp-always-on'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='vaes'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='EPYC-v3'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='EPYC-v4'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='EPYC-v5'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='GraniteRapids'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='amx-bf16'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='amx-fp16'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='amx-int8'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='amx-tile'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx-vnni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-bf16'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-fp16'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bitalg'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512ifma'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vnni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='bus-lock-detect'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fbsdp-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrc'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrs'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fzrm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='gfni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='hle'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='ibrs-all'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='la57'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='mcdt-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pbrsb-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='prefetchiti'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='psdp-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='rtm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='serialize'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='taa-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='tsx-ldtrk'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='vaes'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='xfd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='GraniteRapids-v1'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='amx-bf16'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='amx-fp16'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='amx-int8'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='amx-tile'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx-vnni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-bf16'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-fp16'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bitalg'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512ifma'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vnni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='bus-lock-detect'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fbsdp-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrc'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrs'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fzrm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='gfni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='hle'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='ibrs-all'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='la57'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='mcdt-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pbrsb-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='prefetchiti'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='psdp-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='rtm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='serialize'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='taa-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='tsx-ldtrk'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='vaes'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='xfd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='GraniteRapids-v2'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='amx-bf16'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='amx-fp16'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='amx-int8'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='amx-tile'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx-vnni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx10'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx10-128'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx10-256'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx10-512'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-bf16'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-fp16'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bitalg'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512ifma'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vnni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='bus-lock-detect'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='cldemote'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fbsdp-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrc'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrs'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fzrm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='gfni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='hle'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='ibrs-all'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='la57'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='mcdt-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='movdir64b'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='movdiri'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pbrsb-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='prefetchiti'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='psdp-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='rtm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='serialize'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='ss'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='taa-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='tsx-ldtrk'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='vaes'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='xfd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='GraniteRapids-v3'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='amx-bf16'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='amx-fp16'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='amx-int8'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='amx-tile'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx-vnni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx10'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx10-128'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx10-256'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx10-512'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-bf16'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-fp16'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bitalg'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512ifma'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vnni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='bus-lock-detect'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='cldemote'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fbsdp-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrc'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrs'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fzrm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='gfni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='hle'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='ibrs-all'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='la57'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='mcdt-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='movdir64b'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='movdiri'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pbrsb-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='prefetchiti'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='psdp-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='rtm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='serialize'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='ss'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='taa-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='tsx-ldtrk'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='vaes'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='xfd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='Haswell'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='hle'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='rtm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='Haswell-IBRS'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='hle'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='rtm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='Haswell-noTSX'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='Haswell-noTSX-IBRS'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='Haswell-v1'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='hle'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='rtm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='Haswell-v2'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='Haswell-v3'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='hle'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='rtm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='Haswell-v4'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='Icelake-Server'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bitalg'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vnni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='gfni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='hle'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='la57'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='rtm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='vaes'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='Icelake-Server-noTSX'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bitalg'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vnni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='gfni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='la57'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='vaes'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='Icelake-Server-v1'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bitalg'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vnni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='gfni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='hle'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='la57'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='rtm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='vaes'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='Icelake-Server-v2'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bitalg'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vnni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='gfni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='la57'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='vaes'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='Icelake-Server-v3'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bitalg'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vnni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='gfni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='ibrs-all'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='la57'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='taa-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='vaes'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='Icelake-Server-v4'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bitalg'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512ifma'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vnni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='gfni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='ibrs-all'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='la57'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='taa-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='vaes'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='Icelake-Server-v5'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bitalg'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512ifma'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vnni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='gfni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='ibrs-all'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='la57'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='taa-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='vaes'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='Icelake-Server-v6'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bitalg'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512ifma'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vnni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='gfni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='ibrs-all'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='la57'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='taa-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='vaes'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='Icelake-Server-v7'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bitalg'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512ifma'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vnni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='gfni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='hle'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='ibrs-all'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='la57'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='rtm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='taa-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='vaes'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='IvyBridge'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='IvyBridge-IBRS'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='IvyBridge-v1'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='IvyBridge-v2'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='KnightsMill'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-4fmaps'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-4vnniw'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512er'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512pf'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='ss'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='KnightsMill-v1'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-4fmaps'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-4vnniw'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512er'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512pf'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='ss'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='Opteron_G4'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fma4'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='xop'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='Opteron_G4-v1'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fma4'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='xop'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='Opteron_G5'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fma4'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='tbm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='xop'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='Opteron_G5-v1'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fma4'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='tbm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='xop'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='SapphireRapids'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='amx-bf16'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='amx-int8'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='amx-tile'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx-vnni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-bf16'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-fp16'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bitalg'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512ifma'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vnni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='bus-lock-detect'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrc'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrs'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fzrm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='gfni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='hle'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='ibrs-all'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='la57'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='rtm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='serialize'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='taa-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='tsx-ldtrk'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='vaes'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='xfd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='SapphireRapids-v1'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='amx-bf16'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='amx-int8'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='amx-tile'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx-vnni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-bf16'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-fp16'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bitalg'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512ifma'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vnni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='bus-lock-detect'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrc'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrs'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fzrm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='gfni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='hle'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='ibrs-all'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='la57'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='rtm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='serialize'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='taa-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='tsx-ldtrk'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='vaes'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='xfd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='SapphireRapids-v2'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='amx-bf16'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='amx-int8'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='amx-tile'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx-vnni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-bf16'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-fp16'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bitalg'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512ifma'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vnni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='bus-lock-detect'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fbsdp-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrc'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrs'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fzrm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='gfni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='hle'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='ibrs-all'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='la57'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='psdp-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='rtm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='serialize'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='taa-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='tsx-ldtrk'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='vaes'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='xfd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='SapphireRapids-v3'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='amx-bf16'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='amx-int8'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='amx-tile'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx-vnni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-bf16'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-fp16'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bitalg'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512ifma'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vnni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='bus-lock-detect'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='cldemote'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fbsdp-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrc'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrs'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fzrm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='gfni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='hle'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='ibrs-all'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='la57'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='movdir64b'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='movdiri'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='psdp-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='rtm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='serialize'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='ss'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='taa-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='tsx-ldtrk'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='vaes'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='xfd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='SapphireRapids-v4'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='amx-bf16'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='amx-int8'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='amx-tile'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx-vnni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-bf16'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-fp16'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bitalg'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512ifma'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vnni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='bus-lock-detect'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='cldemote'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fbsdp-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrc'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrs'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fzrm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='gfni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='hle'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='ibrs-all'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='la57'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='movdir64b'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='movdiri'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='psdp-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='rtm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='serialize'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='ss'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='taa-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='tsx-ldtrk'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='vaes'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='xfd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='SierraForest'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx-ifma'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx-ne-convert'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx-vnni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx-vnni-int8'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='bus-lock-detect'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='cmpccxadd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fbsdp-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrs'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='gfni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='ibrs-all'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='mcdt-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pbrsb-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='psdp-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='serialize'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='vaes'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='SierraForest-v1'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx-ifma'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx-ne-convert'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx-vnni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx-vnni-int8'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='bus-lock-detect'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='cmpccxadd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fbsdp-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrs'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='gfni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='ibrs-all'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='mcdt-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pbrsb-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='psdp-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='serialize'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='vaes'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='SierraForest-v2'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx-ifma'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx-ne-convert'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx-vnni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx-vnni-int8'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='bhi-ctrl'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='bus-lock-detect'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='cldemote'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='cmpccxadd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fbsdp-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrs'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='gds-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='gfni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='ibrs-all'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='intel-psfd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='ipred-ctrl'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='lam'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='mcdt-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='movdir64b'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='movdiri'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pbrsb-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='psdp-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='rfds-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='rrsba-ctrl'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='serialize'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='ss'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='vaes'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='SierraForest-v3'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx-ifma'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx-ne-convert'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx-vnni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx-vnni-int8'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='bhi-ctrl'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='bus-lock-detect'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='cldemote'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='cmpccxadd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fbsdp-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrs'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='gds-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='gfni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='ibrs-all'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='intel-psfd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='ipred-ctrl'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='lam'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='mcdt-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='movdir64b'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='movdiri'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pbrsb-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='psdp-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='rfds-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='rrsba-ctrl'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='serialize'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='ss'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='vaes'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='Skylake-Client'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='hle'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='rtm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='Skylake-Client-IBRS'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='hle'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='rtm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='Skylake-Client-v1'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='hle'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='rtm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='Skylake-Client-v2'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='hle'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='rtm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='Skylake-Client-v3'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='Skylake-Client-v4'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='Skylake-Server'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='hle'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='rtm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='Skylake-Server-IBRS'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='hle'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='rtm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='Skylake-Server-v1'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='hle'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='rtm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='Skylake-Server-v2'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='hle'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='rtm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='Skylake-Server-v3'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='Skylake-Server-v4'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='Skylake-Server-v5'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='Snowridge'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='cldemote'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='core-capability'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='gfni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='movdir64b'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='movdiri'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='mpx'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='split-lock-detect'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='Snowridge-v1'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='cldemote'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='core-capability'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='gfni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='movdir64b'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='movdiri'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='mpx'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='split-lock-detect'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='Snowridge-v2'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='cldemote'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='core-capability'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='gfni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='movdir64b'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='movdiri'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='split-lock-detect'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='Snowridge-v3'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='cldemote'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='core-capability'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='gfni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='movdir64b'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='movdiri'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='split-lock-detect'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='Snowridge-v4'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='cldemote'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='gfni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='movdir64b'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='movdiri'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='athlon'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='3dnow'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='3dnowext'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='athlon-v1'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='3dnow'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='3dnowext'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='core2duo'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='ss'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='core2duo-v1'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='ss'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='coreduo'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='ss'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='coreduo-v1'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='ss'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='n270'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='ss'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='n270-v1'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='ss'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='phenom'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='3dnow'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='3dnowext'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='phenom-v1'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='3dnow'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='3dnowext'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:     </mode>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:   </cpu>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:   <memoryBacking supported='yes'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:     <enum name='sourceType'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <value>file</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <value>anonymous</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <value>memfd</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:     </enum>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:   </memoryBacking>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:   <devices>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:     <disk supported='yes'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <enum name='diskDevice'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>disk</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>cdrom</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>floppy</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>lun</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </enum>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <enum name='bus'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>fdc</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>scsi</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>virtio</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>usb</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>sata</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </enum>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <enum name='model'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>virtio</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>virtio-transitional</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>virtio-non-transitional</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </enum>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:     </disk>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:     <graphics supported='yes'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <enum name='type'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>vnc</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>egl-headless</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>dbus</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </enum>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:     </graphics>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:     <video supported='yes'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <enum name='modelType'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>vga</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>cirrus</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>virtio</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>none</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>bochs</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>ramfb</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </enum>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:     </video>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:     <hostdev supported='yes'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <enum name='mode'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>subsystem</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </enum>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <enum name='startupPolicy'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>default</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>mandatory</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>requisite</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>optional</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </enum>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <enum name='subsysType'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>usb</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>pci</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>scsi</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </enum>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <enum name='capsType'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <enum name='pciBackend'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:     </hostdev>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:     <rng supported='yes'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <enum name='model'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>virtio</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>virtio-transitional</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>virtio-non-transitional</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </enum>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <enum name='backendModel'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>random</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>egd</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>builtin</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </enum>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:     </rng>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:     <filesystem supported='yes'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <enum name='driverType'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>path</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>handle</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>virtiofs</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </enum>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:     </filesystem>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:     <tpm supported='yes'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <enum name='model'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>tpm-tis</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>tpm-crb</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </enum>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <enum name='backendModel'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>emulator</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>external</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </enum>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <enum name='backendVersion'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>2.0</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </enum>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:     </tpm>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:     <redirdev supported='yes'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <enum name='bus'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>usb</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </enum>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:     </redirdev>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:     <channel supported='yes'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <enum name='type'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>pty</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>unix</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </enum>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:     </channel>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:     <crypto supported='yes'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <enum name='model'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <enum name='type'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>qemu</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </enum>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <enum name='backendModel'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>builtin</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </enum>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:     </crypto>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:     <interface supported='yes'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <enum name='backendType'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>default</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>passt</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </enum>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:     </interface>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:     <panic supported='yes'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <enum name='model'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>isa</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>hyperv</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </enum>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:     </panic>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:     <console supported='yes'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <enum name='type'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>null</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>vc</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>pty</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>dev</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>file</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>pipe</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>stdio</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>udp</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>tcp</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>unix</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>qemu-vdagent</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>dbus</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </enum>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:     </console>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:   </devices>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:   <features>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:     <gic supported='no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:     <vmcoreinfo supported='yes'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:     <genid supported='yes'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:     <backingStoreInput supported='yes'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:     <backup supported='yes'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:     <async-teardown supported='yes'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:     <s390-pv supported='no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:     <ps2 supported='yes'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:     <tdx supported='no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:     <sev supported='no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:     <sgx supported='no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:     <hyperv supported='yes'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <enum name='features'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>relaxed</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>vapic</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>spinlocks</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>vpindex</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>runtime</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>synic</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>stimer</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>reset</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>vendor_id</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>frequencies</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>reenlightenment</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>tlbflush</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>ipi</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>avic</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>emsr_bitmap</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>xmm_input</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </enum>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <defaults>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <spinlocks>4095</spinlocks>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <stimer_direct>on</stimer_direct>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <tlbflush_direct>off</tlbflush_direct>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <tlbflush_extended>off</tlbflush_extended>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <vendor_id>Linux KVM Hv</vendor_id>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </defaults>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:     </hyperv>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:     <launchSecurity supported='no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:   </features>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]: </domainCapabilities>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:44.042 230647 DEBUG nova.virt.libvirt.host [None req-b3cc632c-4226-49a1-abb9-cf70f5f05fbd - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]: <domainCapabilities>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:   <path>/usr/libexec/qemu-kvm</path>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:   <domain>kvm</domain>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:   <machine>pc-i440fx-rhel7.6.0</machine>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:   <arch>x86_64</arch>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:   <vcpu max='240'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:   <iothreads supported='yes'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:   <os supported='yes'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:     <enum name='firmware'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:     <loader supported='yes'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <enum name='type'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>rom</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>pflash</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </enum>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <enum name='readonly'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>yes</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>no</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </enum>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <enum name='secure'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>no</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </enum>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:     </loader>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:   </os>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:   <cpu>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:     <mode name='host-passthrough' supported='yes'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <enum name='hostPassthroughMigratable'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>on</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>off</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </enum>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:     </mode>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:     <mode name='maximum' supported='yes'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <enum name='maximumMigratable'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>on</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>off</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </enum>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:     </mode>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:     <mode name='host-model' supported='yes'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model fallback='forbid'>EPYC-Rome</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <vendor>AMD</vendor>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <maxphysaddr mode='passthrough' limit='40'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <feature policy='require' name='x2apic'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <feature policy='require' name='tsc-deadline'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <feature policy='require' name='hypervisor'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <feature policy='require' name='tsc_adjust'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <feature policy='require' name='spec-ctrl'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <feature policy='require' name='stibp'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <feature policy='require' name='ssbd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <feature policy='require' name='cmp_legacy'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <feature policy='require' name='overflow-recov'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <feature policy='require' name='succor'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <feature policy='require' name='ibrs'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <feature policy='require' name='amd-ssbd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <feature policy='require' name='virt-ssbd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <feature policy='require' name='lbrv'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <feature policy='require' name='tsc-scale'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <feature policy='require' name='vmcb-clean'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <feature policy='require' name='pause-filter'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <feature policy='require' name='pfthreshold'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <feature policy='require' name='svme-addr-chk'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <feature policy='require' name='lfence-always-serializing'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <feature policy='disable' name='xsaves'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:     </mode>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:     <mode name='custom' supported='yes'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='Broadwell'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='hle'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='rtm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='Broadwell-IBRS'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='hle'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='rtm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='Broadwell-noTSX'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='Broadwell-noTSX-IBRS'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='Broadwell-v1'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='hle'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='rtm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='Broadwell-v2'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='Broadwell-v3'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='hle'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='rtm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='Broadwell-v4'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='Cascadelake-Server'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vnni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='hle'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='rtm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='Cascadelake-Server-noTSX'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vnni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='ibrs-all'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='Cascadelake-Server-v1'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vnni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='hle'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='rtm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='Cascadelake-Server-v2'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vnni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='hle'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='ibrs-all'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='rtm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='Cascadelake-Server-v3'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vnni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='ibrs-all'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='Cascadelake-Server-v4'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vnni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='ibrs-all'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='Cascadelake-Server-v5'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vnni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='ibrs-all'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='ClearwaterForest'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx-ifma'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx-ne-convert'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx-vnni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx-vnni-int16'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx-vnni-int8'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='bhi-ctrl'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='bhi-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='bus-lock-detect'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='cldemote'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='cmpccxadd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='ddpd-u'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fbsdp-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrs'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='gds-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='gfni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='ibrs-all'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='intel-psfd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='ipred-ctrl'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='lam'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='mcdt-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='movdir64b'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='movdiri'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pbrsb-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='prefetchiti'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='psdp-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='rfds-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='rrsba-ctrl'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='serialize'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='sha512'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='sm3'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='sm4'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='ss'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='vaes'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='ClearwaterForest-v1'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx-ifma'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx-ne-convert'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx-vnni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx-vnni-int16'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx-vnni-int8'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='bhi-ctrl'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='bhi-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='bus-lock-detect'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='cldemote'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='cmpccxadd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='ddpd-u'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fbsdp-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrs'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='gds-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='gfni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='ibrs-all'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='intel-psfd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='ipred-ctrl'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='lam'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='mcdt-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='movdir64b'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='movdiri'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pbrsb-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='prefetchiti'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='psdp-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='rfds-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='rrsba-ctrl'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='serialize'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='sha512'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='sm3'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='sm4'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='ss'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='vaes'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='Cooperlake'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-bf16'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vnni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='hle'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='ibrs-all'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='rtm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='taa-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='Cooperlake-v1'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-bf16'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vnni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='hle'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='ibrs-all'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='rtm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='taa-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='Cooperlake-v2'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-bf16'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vnni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='hle'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='ibrs-all'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='rtm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='taa-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='Denverton'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='mpx'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='Denverton-v1'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='mpx'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='Denverton-v2'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='Denverton-v3'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='Dhyana-v2'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='EPYC-Genoa'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='amd-psfd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='auto-ibrs'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-bf16'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bitalg'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512ifma'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vnni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='gfni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='la57'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='no-nested-data-bp'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='null-sel-clr-base'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='stibp-always-on'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='vaes'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='EPYC-Genoa-v1'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='amd-psfd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='auto-ibrs'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-bf16'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bitalg'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512ifma'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vnni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='gfni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='la57'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='no-nested-data-bp'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='null-sel-clr-base'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='stibp-always-on'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='vaes'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='EPYC-Genoa-v2'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='amd-psfd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='auto-ibrs'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-bf16'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bitalg'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512ifma'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vnni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fs-gs-base-ns'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='gfni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='la57'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='no-nested-data-bp'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='null-sel-clr-base'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='perfmon-v2'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='stibp-always-on'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='vaes'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='EPYC-Milan'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='EPYC-Milan-v1'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='EPYC-Milan-v2'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='amd-psfd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='no-nested-data-bp'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='null-sel-clr-base'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='stibp-always-on'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='vaes'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='EPYC-Milan-v3'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='amd-psfd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='no-nested-data-bp'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='null-sel-clr-base'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='stibp-always-on'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='vaes'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='EPYC-Rome'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='EPYC-Rome-v1'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='EPYC-Rome-v2'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='EPYC-Rome-v3'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='EPYC-Turin'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='amd-psfd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='auto-ibrs'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx-vnni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-bf16'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-vp2intersect'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bitalg'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512ifma'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vnni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fs-gs-base-ns'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='gfni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='ibpb-brtype'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='la57'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='movdir64b'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='movdiri'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='no-nested-data-bp'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='null-sel-clr-base'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='perfmon-v2'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='prefetchi'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='sbpb'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='srso-user-kernel-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='stibp-always-on'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='vaes'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='EPYC-Turin-v1'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='amd-psfd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='auto-ibrs'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx-vnni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-bf16'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-vp2intersect'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bitalg'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512ifma'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vnni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fs-gs-base-ns'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='gfni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='ibpb-brtype'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='la57'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='movdir64b'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='movdiri'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='no-nested-data-bp'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='null-sel-clr-base'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='perfmon-v2'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='prefetchi'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='sbpb'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='srso-user-kernel-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='stibp-always-on'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='vaes'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='EPYC-v3'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='EPYC-v4'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='EPYC-v5'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='GraniteRapids'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='amx-bf16'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='amx-fp16'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='amx-int8'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='amx-tile'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx-vnni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-bf16'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-fp16'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bitalg'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512ifma'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vnni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='bus-lock-detect'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fbsdp-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrc'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrs'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fzrm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='gfni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='hle'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='ibrs-all'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='la57'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='mcdt-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pbrsb-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='prefetchiti'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='psdp-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='rtm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='serialize'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='taa-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='tsx-ldtrk'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='vaes'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='xfd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='GraniteRapids-v1'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='amx-bf16'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='amx-fp16'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='amx-int8'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='amx-tile'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx-vnni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-bf16'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-fp16'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bitalg'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512ifma'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vnni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='bus-lock-detect'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fbsdp-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrc'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrs'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fzrm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='gfni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='hle'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='ibrs-all'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='la57'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='mcdt-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pbrsb-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='prefetchiti'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='psdp-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='rtm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='serialize'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='taa-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='tsx-ldtrk'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='vaes'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='xfd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='GraniteRapids-v2'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='amx-bf16'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='amx-fp16'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='amx-int8'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='amx-tile'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx-vnni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx10'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx10-128'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx10-256'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx10-512'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-bf16'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-fp16'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bitalg'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512ifma'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vnni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='bus-lock-detect'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='cldemote'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fbsdp-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrc'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrs'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fzrm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='gfni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='hle'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='ibrs-all'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='la57'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='mcdt-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='movdir64b'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='movdiri'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pbrsb-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='prefetchiti'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='psdp-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='rtm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='serialize'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='ss'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='taa-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='tsx-ldtrk'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='vaes'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='xfd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='GraniteRapids-v3'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='amx-bf16'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='amx-fp16'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='amx-int8'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='amx-tile'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx-vnni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx10'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx10-128'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx10-256'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx10-512'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-bf16'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-fp16'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bitalg'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512ifma'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vnni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='bus-lock-detect'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='cldemote'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fbsdp-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrc'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrs'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fzrm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='gfni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='hle'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='ibrs-all'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='la57'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='mcdt-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='movdir64b'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='movdiri'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pbrsb-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='prefetchiti'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='psdp-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='rtm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='serialize'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='ss'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='taa-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='tsx-ldtrk'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='vaes'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='xfd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='Haswell'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='hle'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='rtm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='Haswell-IBRS'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='hle'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='rtm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='Haswell-noTSX'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='Haswell-noTSX-IBRS'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='Haswell-v1'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='hle'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='rtm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='Haswell-v2'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='Haswell-v3'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='hle'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='rtm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='Haswell-v4'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='Icelake-Server'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bitalg'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vnni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='gfni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='hle'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='la57'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='rtm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='vaes'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='Icelake-Server-noTSX'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bitalg'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vnni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='gfni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='la57'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='vaes'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='Icelake-Server-v1'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bitalg'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vnni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='gfni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='hle'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='la57'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='rtm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='vaes'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='Icelake-Server-v2'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bitalg'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vnni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='gfni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='la57'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='vaes'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='Icelake-Server-v3'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bitalg'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vnni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='gfni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='ibrs-all'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='la57'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='taa-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='vaes'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='Icelake-Server-v4'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bitalg'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512ifma'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vnni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='gfni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='ibrs-all'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='la57'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='taa-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='vaes'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='Icelake-Server-v5'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bitalg'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512ifma'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vnni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='gfni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='ibrs-all'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='la57'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='taa-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='vaes'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='Icelake-Server-v6'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bitalg'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512ifma'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vnni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='gfni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='ibrs-all'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='la57'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='taa-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='vaes'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='Icelake-Server-v7'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bitalg'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512ifma'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vnni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='gfni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='hle'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='ibrs-all'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='la57'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='rtm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='taa-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='vaes'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='IvyBridge'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='IvyBridge-IBRS'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='IvyBridge-v1'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='IvyBridge-v2'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='KnightsMill'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-4fmaps'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-4vnniw'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512er'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512pf'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='ss'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='KnightsMill-v1'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-4fmaps'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-4vnniw'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512er'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512pf'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='ss'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='Opteron_G4'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fma4'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='xop'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='Opteron_G4-v1'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fma4'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='xop'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='Opteron_G5'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fma4'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='tbm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='xop'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='Opteron_G5-v1'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fma4'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='tbm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='xop'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='SapphireRapids'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='amx-bf16'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='amx-int8'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='amx-tile'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx-vnni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-bf16'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-fp16'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bitalg'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512ifma'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vnni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='bus-lock-detect'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrc'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrs'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fzrm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='gfni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='hle'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='ibrs-all'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='la57'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='rtm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='serialize'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='taa-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='tsx-ldtrk'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='vaes'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='xfd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='SapphireRapids-v1'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='amx-bf16'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='amx-int8'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='amx-tile'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx-vnni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-bf16'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-fp16'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bitalg'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512ifma'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vnni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='bus-lock-detect'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrc'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrs'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fzrm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='gfni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='hle'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='ibrs-all'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='la57'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='rtm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='serialize'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='taa-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='tsx-ldtrk'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='vaes'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='xfd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='SapphireRapids-v2'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='amx-bf16'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='amx-int8'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='amx-tile'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx-vnni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-bf16'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-fp16'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bitalg'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512ifma'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vnni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='bus-lock-detect'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fbsdp-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrc'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrs'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fzrm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='gfni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='hle'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='ibrs-all'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='la57'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='psdp-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='rtm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='serialize'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='taa-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='tsx-ldtrk'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='vaes'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='xfd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='SapphireRapids-v3'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='amx-bf16'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='amx-int8'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='amx-tile'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx-vnni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-bf16'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-fp16'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bitalg'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512ifma'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vnni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='bus-lock-detect'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='cldemote'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fbsdp-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrc'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrs'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fzrm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='gfni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='hle'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='ibrs-all'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='la57'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='movdir64b'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='movdiri'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='psdp-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='rtm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='serialize'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='ss'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='taa-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='tsx-ldtrk'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='vaes'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='xfd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='SapphireRapids-v4'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='amx-bf16'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='amx-int8'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='amx-tile'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx-vnni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-bf16'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-fp16'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bitalg'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512ifma'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vnni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='bus-lock-detect'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='cldemote'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fbsdp-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrc'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrs'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fzrm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='gfni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='hle'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='ibrs-all'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='la57'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='movdir64b'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='movdiri'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='psdp-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='rtm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='serialize'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='ss'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='taa-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='tsx-ldtrk'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='vaes'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='xfd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='SierraForest'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx-ifma'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx-ne-convert'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx-vnni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx-vnni-int8'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='bus-lock-detect'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='cmpccxadd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fbsdp-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrs'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='gfni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='ibrs-all'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='mcdt-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pbrsb-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='psdp-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='serialize'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='vaes'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='SierraForest-v1'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx-ifma'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx-ne-convert'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx-vnni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx-vnni-int8'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='bus-lock-detect'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='cmpccxadd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fbsdp-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrs'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='gfni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='ibrs-all'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='mcdt-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pbrsb-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='psdp-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='serialize'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='vaes'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='SierraForest-v2'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx-ifma'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx-ne-convert'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx-vnni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx-vnni-int8'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='bhi-ctrl'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='bus-lock-detect'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='cldemote'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='cmpccxadd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fbsdp-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrs'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='gds-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='gfni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='ibrs-all'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='intel-psfd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='ipred-ctrl'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='lam'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='mcdt-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='movdir64b'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='movdiri'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pbrsb-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='psdp-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='rfds-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='rrsba-ctrl'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='serialize'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='ss'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='vaes'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='SierraForest-v3'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx-ifma'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx-ne-convert'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx-vnni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx-vnni-int8'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='bhi-ctrl'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='bus-lock-detect'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='cldemote'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='cmpccxadd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fbsdp-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='fsrs'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='gds-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='gfni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='ibrs-all'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='intel-psfd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='ipred-ctrl'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='lam'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='mcdt-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='movdir64b'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='movdiri'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pbrsb-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='psdp-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='rfds-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='rrsba-ctrl'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='serialize'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='ss'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='vaes'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='Skylake-Client'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='hle'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='rtm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='Skylake-Client-IBRS'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='hle'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='rtm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='Skylake-Client-v1'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='hle'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='rtm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='Skylake-Client-v2'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='hle'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='rtm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='Skylake-Client-v3'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='Skylake-Client-v4'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='Skylake-Server'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='hle'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='rtm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='Skylake-Server-IBRS'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='hle'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='rtm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='Skylake-Server-v1'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='hle'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='rtm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='Skylake-Server-v2'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='hle'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='rtm'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='Skylake-Server-v3'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='Skylake-Server-v4'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='Skylake-Server-v5'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512bw'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512cd'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512dq'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512f'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='avx512vl'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='invpcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pcid'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='pku'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='Snowridge'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='cldemote'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='core-capability'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='gfni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='movdir64b'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='movdiri'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='mpx'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='split-lock-detect'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='Snowridge-v1'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='cldemote'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='core-capability'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='gfni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='movdir64b'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='movdiri'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='mpx'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='split-lock-detect'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='Snowridge-v2'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='cldemote'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='core-capability'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='gfni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='movdir64b'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='movdiri'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='split-lock-detect'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='Snowridge-v3'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='cldemote'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='core-capability'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='gfni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='movdir64b'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='movdiri'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='split-lock-detect'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='Snowridge-v4'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='cldemote'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='erms'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='gfni'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='movdir64b'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='movdiri'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='xsaves'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='athlon'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='3dnow'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='3dnowext'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='athlon-v1'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='3dnow'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='3dnowext'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='core2duo'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='ss'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='core2duo-v1'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='ss'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='coreduo'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='ss'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='coreduo-v1'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='ss'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='n270'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='ss'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='n270-v1'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='ss'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='phenom'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='3dnow'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='3dnowext'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <blockers model='phenom-v1'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='3dnow'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <feature name='3dnowext'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </blockers>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:     </mode>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:   </cpu>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:   <memoryBacking supported='yes'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:     <enum name='sourceType'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <value>file</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <value>anonymous</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <value>memfd</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:     </enum>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:   </memoryBacking>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:   <devices>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:     <disk supported='yes'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <enum name='diskDevice'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>disk</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>cdrom</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>floppy</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>lun</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </enum>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <enum name='bus'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>ide</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>fdc</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>scsi</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>virtio</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>usb</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>sata</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </enum>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <enum name='model'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>virtio</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>virtio-transitional</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>virtio-non-transitional</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </enum>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:     </disk>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:     <graphics supported='yes'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <enum name='type'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>vnc</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>egl-headless</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>dbus</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </enum>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:     </graphics>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:     <video supported='yes'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <enum name='modelType'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>vga</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>cirrus</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>virtio</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>none</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>bochs</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>ramfb</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </enum>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:     </video>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:     <hostdev supported='yes'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <enum name='mode'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>subsystem</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </enum>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <enum name='startupPolicy'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>default</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>mandatory</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>requisite</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>optional</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </enum>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <enum name='subsysType'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>usb</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>pci</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>scsi</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </enum>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <enum name='capsType'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <enum name='pciBackend'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:     </hostdev>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:     <rng supported='yes'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <enum name='model'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>virtio</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>virtio-transitional</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>virtio-non-transitional</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </enum>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <enum name='backendModel'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>random</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>egd</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>builtin</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </enum>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:     </rng>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:     <filesystem supported='yes'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <enum name='driverType'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>path</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>handle</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>virtiofs</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </enum>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:     </filesystem>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:     <tpm supported='yes'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <enum name='model'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>tpm-tis</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>tpm-crb</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </enum>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <enum name='backendModel'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>emulator</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>external</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </enum>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <enum name='backendVersion'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>2.0</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </enum>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:     </tpm>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:     <redirdev supported='yes'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <enum name='bus'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>usb</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </enum>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:     </redirdev>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:     <channel supported='yes'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <enum name='type'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>pty</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>unix</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </enum>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:     </channel>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:     <crypto supported='yes'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <enum name='model'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <enum name='type'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>qemu</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </enum>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <enum name='backendModel'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>builtin</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </enum>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:     </crypto>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:     <interface supported='yes'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <enum name='backendType'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>default</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>passt</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </enum>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:     </interface>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:     <panic supported='yes'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <enum name='model'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>isa</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>hyperv</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </enum>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:     </panic>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:     <console supported='yes'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <enum name='type'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>null</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>vc</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>pty</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>dev</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>file</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>pipe</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>stdio</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>udp</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>tcp</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>unix</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>qemu-vdagent</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>dbus</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </enum>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:     </console>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:   </devices>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:   <features>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:     <gic supported='no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:     <vmcoreinfo supported='yes'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:     <genid supported='yes'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:     <backingStoreInput supported='yes'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:     <backup supported='yes'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:     <async-teardown supported='yes'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:     <s390-pv supported='no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:     <ps2 supported='yes'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:     <tdx supported='no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:     <sev supported='no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:     <sgx supported='no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:     <hyperv supported='yes'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <enum name='features'>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>relaxed</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>vapic</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>spinlocks</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>vpindex</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>runtime</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>synic</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>stimer</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>reset</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>vendor_id</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>frequencies</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>reenlightenment</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>tlbflush</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>ipi</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>avic</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>emsr_bitmap</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <value>xmm_input</value>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </enum>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       <defaults>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <spinlocks>4095</spinlocks>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <stimer_direct>on</stimer_direct>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <tlbflush_direct>off</tlbflush_direct>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <tlbflush_extended>off</tlbflush_extended>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:         <vendor_id>Linux KVM Hv</vendor_id>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:       </defaults>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:     </hyperv>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:     <launchSecurity supported='no'/>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:   </features>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]: </domainCapabilities>
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:44.106 230647 DEBUG nova.virt.libvirt.host [None req-b3cc632c-4226-49a1-abb9-cf70f5f05fbd - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:44.107 230647 INFO nova.virt.libvirt.host [None req-b3cc632c-4226-49a1-abb9-cf70f5f05fbd - - - - - -] Secure Boot support detected
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:44.108 230647 INFO nova.virt.libvirt.driver [None req-b3cc632c-4226-49a1-abb9-cf70f5f05fbd - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:44.109 230647 INFO nova.virt.libvirt.driver [None req-b3cc632c-4226-49a1-abb9-cf70f5f05fbd - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:44.122 230647 DEBUG nova.virt.libvirt.driver [None req-b3cc632c-4226-49a1-abb9-cf70f5f05fbd - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:44.172 230647 INFO nova.virt.node [None req-b3cc632c-4226-49a1-abb9-cf70f5f05fbd - - - - - -] Determined node identity be63d86c-a403-4ec9-a515-07ea2962cb4d from /var/lib/nova/compute_id
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:44.200 230647 DEBUG nova.compute.manager [None req-b3cc632c-4226-49a1-abb9-cf70f5f05fbd - - - - - -] Verified node be63d86c-a403-4ec9-a515-07ea2962cb4d matches my host np0005626463.localdomain _check_for_host_rename /usr/lib/python3.9/site-packages/nova/compute/manager.py:1568
Feb 23 09:27:44 np0005626463.localdomain python3.9[230992]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771838863.1308303-3729-25537200188804/.source.yaml _original_basename=.b5f3od9l follow=False checksum=c0274b4e8da702f77c15fa25b71400f0f9b8a680 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:27:44 np0005626463.localdomain sudo[230990]: pam_unix(sudo:session): session closed for user root
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:44.250 230647 DEBUG nova.compute.manager [None req-b3cc632c-4226-49a1-abb9-cf70f5f05fbd - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:44.254 230647 DEBUG nova.virt.libvirt.vif [None req-b3cc632c-4226-49a1-abb9-cf70f5f05fbd - - - - - -] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-23T08:22:59Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='test',display_name='test',ec2_ids=<?>,ephemeral_gb=1,ephemeral_key_uuid=None,fault=<?>,flavor=<?>,hidden=False,host='np0005626463.localdomain',hostname='test',id=3,image_ref='a9204248-210d-45b5-ab0a-d1ec08a73a4f',info_cache=InstanceInfoCache,instance_type_id=3,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-23T08:23:11Z,launched_on='np0005626463.localdomain',locked=False,locked_by=None,memory_mb=512,metadata={},migration_context=<?>,new_flavor=<?>,node='np0005626463.localdomain',numa_topology=None,old_flavor=<?>,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='37b8098efb0d4ecc90b451a2db0e966f',ramdisk_id='',reservation_id='r-90tij075',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata=<?>,tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-02-23T08:23:11Z,user_data=None,user_id='cb6895487918456aa599ca2f76872d00',uuid=c2a7d92b-952f-46a7-8a6a-3322a48fcf4b,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "address": "fa:16:3e:a0:9d:00", "network": {"id": "9da5b53d-3184-450f-9a5b-bdba1a6c9f6d", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.0.1"}}], "meta": {"injected": false, "tenant_id": "37b8098efb0d4ecc90b451a2db0e966f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system"}, "devname": "tapa27e5011-20", "ovs_interfaceid": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:44.254 230647 DEBUG nova.network.os_vif_util [None req-b3cc632c-4226-49a1-abb9-cf70f5f05fbd - - - - - -] Converting VIF {"id": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "address": "fa:16:3e:a0:9d:00", "network": {"id": "9da5b53d-3184-450f-9a5b-bdba1a6c9f6d", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.0.1"}}], "meta": {"injected": false, "tenant_id": "37b8098efb0d4ecc90b451a2db0e966f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system"}, "devname": "tapa27e5011-20", "ovs_interfaceid": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:44.255 230647 DEBUG nova.network.os_vif_util [None req-b3cc632c-4226-49a1-abb9-cf70f5f05fbd - - - - - -] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:a0:9d:00,bridge_name='br-int',has_traffic_filtering=True,id=a27e5011-2016-4b16-b5e8-04b555b30bc4,network=Network(9da5b53d-3184-450f-9a5b-bdba1a6c9f6d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa27e5011-20') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:44.255 230647 DEBUG os_vif [None req-b3cc632c-4226-49a1-abb9-cf70f5f05fbd - - - - - -] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:a0:9d:00,bridge_name='br-int',has_traffic_filtering=True,id=a27e5011-2016-4b16-b5e8-04b555b30bc4,network=Network(9da5b53d-3184-450f-9a5b-bdba1a6c9f6d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa27e5011-20') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:44.314 230647 DEBUG ovsdbapp.backend.ovs_idl [None req-b3cc632c-4226-49a1-abb9-cf70f5f05fbd - - - - - -] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:44.314 230647 DEBUG ovsdbapp.backend.ovs_idl [None req-b3cc632c-4226-49a1-abb9-cf70f5f05fbd - - - - - -] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:44.314 230647 DEBUG ovsdbapp.backend.ovs_idl [None req-b3cc632c-4226-49a1-abb9-cf70f5f05fbd - - - - - -] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:44.314 230647 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-b3cc632c-4226-49a1-abb9-cf70f5f05fbd - - - - - -] tcp:127.0.0.1:6640: entering CONNECTING _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:44.315 230647 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-b3cc632c-4226-49a1-abb9-cf70f5f05fbd - - - - - -] [POLLOUT] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:44.315 230647 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-b3cc632c-4226-49a1-abb9-cf70f5f05fbd - - - - - -] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:44.315 230647 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-b3cc632c-4226-49a1-abb9-cf70f5f05fbd - - - - - -] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:44.316 230647 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-b3cc632c-4226-49a1-abb9-cf70f5f05fbd - - - - - -] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:44.318 230647 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-b3cc632c-4226-49a1-abb9-cf70f5f05fbd - - - - - -] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:44.329 230647 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:44.329 230647 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:44.329 230647 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:44.330 230647 INFO oslo.privsep.daemon [None req-b3cc632c-4226-49a1-abb9-cf70f5f05fbd - - - - - -] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'vif_plug_ovs.privsep.vif_plug', '--privsep_sock_path', '/tmp/tmpj4hijz_z/privsep.sock']
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:44.948 230647 INFO oslo.privsep.daemon [None req-b3cc632c-4226-49a1-abb9-cf70f5f05fbd - - - - - -] Spawned new privsep daemon via rootwrap
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:44.827 231014 INFO oslo.privsep.daemon [-] privsep daemon starting
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:44.832 231014 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:44.835 231014 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_NET_ADMIN/CAP_DAC_OVERRIDE|CAP_NET_ADMIN/none
Feb 23 09:27:44 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:44.835 231014 INFO oslo.privsep.daemon [-] privsep daemon running as pid 231014
Feb 23 09:27:45 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:45.231 230647 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:27:45 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:45.232 230647 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa27e5011-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 09:27:45 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:45.232 230647 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapa27e5011-20, col_values=(('external_ids', {'iface-id': 'a27e5011-2016-4b16-b5e8-04b555b30bc4', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:a0:9d:00', 'vm-uuid': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 09:27:45 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:45.233 230647 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 23 09:27:45 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:45.234 230647 INFO os_vif [None req-b3cc632c-4226-49a1-abb9-cf70f5f05fbd - - - - - -] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:a0:9d:00,bridge_name='br-int',has_traffic_filtering=True,id=a27e5011-2016-4b16-b5e8-04b555b30bc4,network=Network(9da5b53d-3184-450f-9a5b-bdba1a6c9f6d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa27e5011-20')
Feb 23 09:27:45 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:45.234 230647 DEBUG nova.compute.manager [None req-b3cc632c-4226-49a1-abb9-cf70f5f05fbd - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 23 09:27:45 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:45.238 230647 DEBUG nova.compute.manager [None req-b3cc632c-4226-49a1-abb9-cf70f5f05fbd - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Current state is 1, state in DB is 1. _init_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:1304
Feb 23 09:27:45 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:45.239 230647 INFO nova.compute.manager [None req-b3cc632c-4226-49a1-abb9-cf70f5f05fbd - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host
Feb 23 09:27:45 np0005626463.localdomain python3.9[231108]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner_healthcheck.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 23 09:27:45 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38480 DF PROTO=TCP SPT=41330 DPT=9105 SEQ=392615369 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF2DE060000000001030307) 
Feb 23 09:27:45 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:45.818 230647 INFO nova.service [None req-b3cc632c-4226-49a1-abb9-cf70f5f05fbd - - - - - -] Updating service version for nova-compute on np0005626463.localdomain from 57 to 66
Feb 23 09:27:45 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:45.864 230647 DEBUG oslo_concurrency.lockutils [None req-b3cc632c-4226-49a1-abb9-cf70f5f05fbd - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 09:27:45 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:45.864 230647 DEBUG oslo_concurrency.lockutils [None req-b3cc632c-4226-49a1-abb9-cf70f5f05fbd - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 09:27:45 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:45.864 230647 DEBUG oslo_concurrency.lockutils [None req-b3cc632c-4226-49a1-abb9-cf70f5f05fbd - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 09:27:45 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:45.864 230647 DEBUG nova.compute.resource_tracker [None req-b3cc632c-4226-49a1-abb9-cf70f5f05fbd - - - - - -] Auditing locally available compute resources for np0005626463.localdomain (node: np0005626463.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 23 09:27:45 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:45.865 230647 DEBUG oslo_concurrency.processutils [None req-b3cc632c-4226-49a1-abb9-cf70f5f05fbd - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 09:27:46 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:46.262 230647 DEBUG oslo_concurrency.processutils [None req-b3cc632c-4226-49a1-abb9-cf70f5f05fbd - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.397s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 09:27:46 np0005626463.localdomain python3.9[231236]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 23 09:27:46 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:46.339 230647 DEBUG nova.virt.libvirt.driver [None req-b3cc632c-4226-49a1-abb9-cf70f5f05fbd - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 23 09:27:46 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:46.339 230647 DEBUG nova.virt.libvirt.driver [None req-b3cc632c-4226-49a1-abb9-cf70f5f05fbd - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 23 09:27:46 np0005626463.localdomain systemd[1]: Started libvirt nodedev daemon.
Feb 23 09:27:46 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:46.680 230647 WARNING nova.virt.libvirt.driver [None req-b3cc632c-4226-49a1-abb9-cf70f5f05fbd - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 23 09:27:46 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:46.682 230647 DEBUG nova.compute.resource_tracker [None req-b3cc632c-4226-49a1-abb9-cf70f5f05fbd - - - - - -] Hypervisor/Node resource view: name=np0005626463.localdomain free_ram=12922MB free_disk=41.83688735961914GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 23 09:27:46 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:46.682 230647 DEBUG oslo_concurrency.lockutils [None req-b3cc632c-4226-49a1-abb9-cf70f5f05fbd - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 09:27:46 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:46.682 230647 DEBUG oslo_concurrency.lockutils [None req-b3cc632c-4226-49a1-abb9-cf70f5f05fbd - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 09:27:46 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:46.819 230647 DEBUG nova.compute.resource_tracker [None req-b3cc632c-4226-49a1-abb9-cf70f5f05fbd - - - - - -] Instance c2a7d92b-952f-46a7-8a6a-3322a48fcf4b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 23 09:27:46 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:46.819 230647 DEBUG nova.compute.resource_tracker [None req-b3cc632c-4226-49a1-abb9-cf70f5f05fbd - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 23 09:27:46 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:46.820 230647 DEBUG nova.compute.resource_tracker [None req-b3cc632c-4226-49a1-abb9-cf70f5f05fbd - - - - - -] Final resource view: name=np0005626463.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 23 09:27:46 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:46.836 230647 DEBUG nova.scheduler.client.report [None req-b3cc632c-4226-49a1-abb9-cf70f5f05fbd - - - - - -] Refreshing inventories for resource provider be63d86c-a403-4ec9-a515-07ea2962cb4d _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Feb 23 09:27:46 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:46.907 230647 DEBUG nova.scheduler.client.report [None req-b3cc632c-4226-49a1-abb9-cf70f5f05fbd - - - - - -] Updating ProviderTree inventory for provider be63d86c-a403-4ec9-a515-07ea2962cb4d from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Feb 23 09:27:46 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:46.907 230647 DEBUG nova.compute.provider_tree [None req-b3cc632c-4226-49a1-abb9-cf70f5f05fbd - - - - - -] Updating inventory in ProviderTree for provider be63d86c-a403-4ec9-a515-07ea2962cb4d with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Feb 23 09:27:46 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:46.927 230647 DEBUG nova.scheduler.client.report [None req-b3cc632c-4226-49a1-abb9-cf70f5f05fbd - - - - - -] Refreshing aggregate associations for resource provider be63d86c-a403-4ec9-a515-07ea2962cb4d, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Feb 23 09:27:46 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:46.950 230647 DEBUG nova.scheduler.client.report [None req-b3cc632c-4226-49a1-abb9-cf70f5f05fbd - - - - - -] Refreshing trait associations for resource provider be63d86c-a403-4ec9-a515-07ea2962cb4d, traits: HW_CPU_X86_SSSE3,HW_CPU_X86_SSE2,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_AVX,COMPUTE_ACCELERATORS,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_BMI2,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_MMX,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_DEVICE_TAGGING,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_BMI,HW_CPU_X86_SSE,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_AMD_SVM,HW_CPU_X86_ABM,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_AVX2,COMPUTE_GRAPHICS_MODEL_CIRRUS,HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_FMA3,HW_CPU_X86_F16C,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_CLMUL,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SSE4A,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_SVM,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_VIF_MODEL_LAN9118,HW_CPU_X86_AESNI,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_SHA _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Feb 23 09:27:46 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:46.988 230647 DEBUG oslo_concurrency.processutils [None req-b3cc632c-4226-49a1-abb9-cf70f5f05fbd - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 09:27:47 np0005626463.localdomain python3.9[231369]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service.requires follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 23 09:27:47 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:47.467 230647 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:27:47 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:47.494 230647 DEBUG oslo_concurrency.processutils [None req-b3cc632c-4226-49a1-abb9-cf70f5f05fbd - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.506s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 09:27:47 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:47.500 230647 DEBUG nova.virt.libvirt.host [None req-b3cc632c-4226-49a1-abb9-cf70f5f05fbd - - - - - -] /sys/module/kvm_amd/parameters/sev contains [N
Feb 23 09:27:47 np0005626463.localdomain nova_compute[230643]: ] _kernel_supports_amd_sev /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1803
Feb 23 09:27:47 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:47.500 230647 INFO nova.virt.libvirt.host [None req-b3cc632c-4226-49a1-abb9-cf70f5f05fbd - - - - - -] kernel doesn't support AMD SEV
Feb 23 09:27:47 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:47.502 230647 DEBUG nova.compute.provider_tree [None req-b3cc632c-4226-49a1-abb9-cf70f5f05fbd - - - - - -] Updating inventory in ProviderTree for provider be63d86c-a403-4ec9-a515-07ea2962cb4d with inventory: {'MEMORY_MB': {'total': 15738, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0, 'reserved': 0}, 'DISK_GB': {'total': 41, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 1}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Feb 23 09:27:47 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:47.503 230647 DEBUG nova.virt.libvirt.driver [None req-b3cc632c-4226-49a1-abb9-cf70f5f05fbd - - - - - -] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Feb 23 09:27:47 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:47.564 230647 DEBUG nova.scheduler.client.report [None req-b3cc632c-4226-49a1-abb9-cf70f5f05fbd - - - - - -] Updated inventory for provider be63d86c-a403-4ec9-a515-07ea2962cb4d with generation 3 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 15738, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0, 'reserved': 0}, 'DISK_GB': {'total': 41, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 1}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957
Feb 23 09:27:47 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:47.565 230647 DEBUG nova.compute.provider_tree [None req-b3cc632c-4226-49a1-abb9-cf70f5f05fbd - - - - - -] Updating resource provider be63d86c-a403-4ec9-a515-07ea2962cb4d generation from 3 to 4 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Feb 23 09:27:47 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:47.565 230647 DEBUG nova.compute.provider_tree [None req-b3cc632c-4226-49a1-abb9-cf70f5f05fbd - - - - - -] Updating inventory in ProviderTree for provider be63d86c-a403-4ec9-a515-07ea2962cb4d with inventory: {'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Feb 23 09:27:47 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:47.645 230647 DEBUG nova.compute.provider_tree [None req-b3cc632c-4226-49a1-abb9-cf70f5f05fbd - - - - - -] Updating resource provider be63d86c-a403-4ec9-a515-07ea2962cb4d generation from 4 to 5 during operation: update_traits _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Feb 23 09:27:47 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:47.728 230647 DEBUG nova.compute.resource_tracker [None req-b3cc632c-4226-49a1-abb9-cf70f5f05fbd - - - - - -] Compute_service record updated for np0005626463.localdomain:np0005626463.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 23 09:27:47 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:47.728 230647 DEBUG oslo_concurrency.lockutils [None req-b3cc632c-4226-49a1-abb9-cf70f5f05fbd - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.046s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 09:27:47 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:47.729 230647 DEBUG nova.service [None req-b3cc632c-4226-49a1-abb9-cf70f5f05fbd - - - - - -] Creating RPC server for service compute start /usr/lib/python3.9/site-packages/nova/service.py:182
Feb 23 09:27:47 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:47.826 230647 DEBUG nova.service [None req-b3cc632c-4226-49a1-abb9-cf70f5f05fbd - - - - - -] Join ServiceGroup membership for this service compute start /usr/lib/python3.9/site-packages/nova/service.py:199
Feb 23 09:27:47 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:47.827 230647 DEBUG nova.servicegroup.drivers.db [None req-b3cc632c-4226-49a1-abb9-cf70f5f05fbd - - - - - -] DB_Driver: join new ServiceGroup member np0005626463.localdomain to the compute group, service = <Service: host=np0005626463.localdomain, binary=nova-compute, manager_class_name=nova.compute.manager.ComputeManager> join /usr/lib/python3.9/site-packages/nova/servicegroup/drivers/db.py:44
Feb 23 09:27:47 np0005626463.localdomain sudo[231499]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qzktpqjrrcsorvfslgowsoouyfytdvuc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838867.3872862-3879-195640983937143/AnsiballZ_podman_container.py
Feb 23 09:27:47 np0005626463.localdomain sudo[231499]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:27:48 np0005626463.localdomain python3.9[231501]: ansible-containers.podman.podman_container Invoked with name=nova_nvme_cleaner state=absent executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Feb 23 09:27:48 np0005626463.localdomain sudo[231499]: pam_unix(sudo:session): session closed for user root
Feb 23 09:27:48 np0005626463.localdomain systemd-journald[47710]: Field hash table of /run/log/journal/c0212a8b024a111cfc61293864f36c87/system.journal has a fill level at 121.6 (405 of 333 items), suggesting rotation.
Feb 23 09:27:48 np0005626463.localdomain systemd-journald[47710]: /run/log/journal/c0212a8b024a111cfc61293864f36c87/system.journal: Journal header limits reached or header out-of-date, rotating.
Feb 23 09:27:48 np0005626463.localdomain rsyslogd[758]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Feb 23 09:27:48 np0005626463.localdomain rsyslogd[758]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Feb 23 09:27:48 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29999 DF PROTO=TCP SPT=51930 DPT=9101 SEQ=2185825708 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF2EA070000000001030307) 
Feb 23 09:27:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:27:48.527 163572 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 09:27:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:27:48.527 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 09:27:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:27:48.529 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 09:27:48 np0005626463.localdomain sudo[231632]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fgpvjenkmypduinvxyexzbcnmodovydx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838868.467452-3903-1926531162821/AnsiballZ_systemd.py
Feb 23 09:27:48 np0005626463.localdomain sudo[231632]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:27:49 np0005626463.localdomain python3.9[231634]: ansible-ansible.builtin.systemd Invoked with name=edpm_nova_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 23 09:27:49 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.
Feb 23 09:27:49 np0005626463.localdomain systemd[1]: Stopping nova_compute container...
Feb 23 09:27:49 np0005626463.localdomain systemd[1]: tmp-crun.NXrBbM.mount: Deactivated successfully.
Feb 23 09:27:49 np0005626463.localdomain podman[231636]: 2026-02-23 09:27:49.202970542 +0000 UTC m=+0.097130801 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, container_name=ovn_controller, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Feb 23 09:27:49 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:49.272 230647 DEBUG oslo_privsep.comm [-] EOF on privsep read channel _reader_main /usr/lib/python3.9/site-packages/oslo_privsep/comm.py:170
Feb 23 09:27:49 np0005626463.localdomain podman[231636]: 2026-02-23 09:27:49.307384431 +0000 UTC m=+0.201544730 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20260216, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, container_name=ovn_controller, io.buildah.version=1.43.0, org.label-schema.license=GPLv2)
Feb 23 09:27:49 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:49.318 230647 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:27:49 np0005626463.localdomain systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully.
Feb 23 09:27:49 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:49.904 230647 WARNING amqp [-] Received method (60, 30) during closing channel 1. This method will be ignored
Feb 23 09:27:49 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:49.906 230647 DEBUG oslo_concurrency.lockutils [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 23 09:27:49 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:49.907 230647 DEBUG oslo_concurrency.lockutils [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 23 09:27:49 np0005626463.localdomain nova_compute[230643]: 2026-02-23 09:27:49.907 230647 DEBUG oslo_concurrency.lockutils [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 23 09:27:50 np0005626463.localdomain virtqemud[207530]: libvirt version: 11.10.0, package: 4.el9 (builder@centos.org, 2026-01-29-15:25:17, )
Feb 23 09:27:50 np0005626463.localdomain virtqemud[207530]: hostname: np0005626463.localdomain
Feb 23 09:27:50 np0005626463.localdomain virtqemud[207530]: End of file while reading data: Input/output error
Feb 23 09:27:50 np0005626463.localdomain systemd[1]: libpod-8d27414cca68da82346fae1fc6c4ecb36a1a0e33cd4571c19621f2f476697e2d.scope: Deactivated successfully.
Feb 23 09:27:50 np0005626463.localdomain systemd[1]: libpod-8d27414cca68da82346fae1fc6c4ecb36a1a0e33cd4571c19621f2f476697e2d.scope: Consumed 4.822s CPU time.
Feb 23 09:27:50 np0005626463.localdomain podman[231644]: 2026-02-23 09:27:50.360825066 +0000 UTC m=+1.229195845 container died 8d27414cca68da82346fae1fc6c4ecb36a1a0e33cd4571c19621f2f476697e2d (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-9420ebc64adcbe31526e40e2b0b78b2e1b7d41ddf41dd4e3243f19db7c97ac09-5f6dcf25a4eb712d7b55775b8e130167254d53f3f84c8303f8f39f30426e780b'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'nova', 'volumes': ['/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=nova_compute, io.buildah.version=1.43.0, tcib_managed=true, container_name=nova_compute)
Feb 23 09:27:50 np0005626463.localdomain systemd[1]: tmp-crun.zKRho6.mount: Deactivated successfully.
Feb 23 09:27:50 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8d27414cca68da82346fae1fc6c4ecb36a1a0e33cd4571c19621f2f476697e2d-userdata-shm.mount: Deactivated successfully.
Feb 23 09:27:50 np0005626463.localdomain podman[231644]: 2026-02-23 09:27:50.416756727 +0000 UTC m=+1.285127506 container cleanup 8d27414cca68da82346fae1fc6c4ecb36a1a0e33cd4571c19621f2f476697e2d (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, config_id=nova_compute, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-9420ebc64adcbe31526e40e2b0b78b2e1b7d41ddf41dd4e3243f19db7c97ac09-5f6dcf25a4eb712d7b55775b8e130167254d53f3f84c8303f8f39f30426e780b'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'nova', 'volumes': ['/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, container_name=nova_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Feb 23 09:27:50 np0005626463.localdomain podman[231644]: nova_compute
Feb 23 09:27:50 np0005626463.localdomain podman[231703]: error opening file `/run/crun/8d27414cca68da82346fae1fc6c4ecb36a1a0e33cd4571c19621f2f476697e2d/status`: No such file or directory
Feb 23 09:27:50 np0005626463.localdomain podman[231692]: 2026-02-23 09:27:50.518076153 +0000 UTC m=+0.066902480 container cleanup 8d27414cca68da82346fae1fc6c4ecb36a1a0e33cd4571c19621f2f476697e2d (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, config_id=nova_compute, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-9420ebc64adcbe31526e40e2b0b78b2e1b7d41ddf41dd4e3243f19db7c97ac09-5f6dcf25a4eb712d7b55775b8e130167254d53f3f84c8303f8f39f30426e780b'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'nova', 'volumes': ['/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, tcib_managed=true, org.label-schema.schema-version=1.0, container_name=nova_compute)
Feb 23 09:27:50 np0005626463.localdomain podman[231692]: nova_compute
Feb 23 09:27:50 np0005626463.localdomain systemd[1]: edpm_nova_compute.service: Deactivated successfully.
Feb 23 09:27:50 np0005626463.localdomain systemd[1]: Stopped nova_compute container.
Feb 23 09:27:50 np0005626463.localdomain systemd[1]: Starting nova_compute container...
Feb 23 09:27:50 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 09:27:50 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/81751aaa607236c95e65d1a8738b503c5aac60039da98c3685192990b6f247b4/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Feb 23 09:27:50 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/81751aaa607236c95e65d1a8738b503c5aac60039da98c3685192990b6f247b4/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Feb 23 09:27:50 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/81751aaa607236c95e65d1a8738b503c5aac60039da98c3685192990b6f247b4/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Feb 23 09:27:50 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/81751aaa607236c95e65d1a8738b503c5aac60039da98c3685192990b6f247b4/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Feb 23 09:27:50 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/81751aaa607236c95e65d1a8738b503c5aac60039da98c3685192990b6f247b4/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Feb 23 09:27:50 np0005626463.localdomain podman[231705]: 2026-02-23 09:27:50.661089124 +0000 UTC m=+0.115333525 container init 8d27414cca68da82346fae1fc6c4ecb36a1a0e33cd4571c19621f2f476697e2d (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, config_id=nova_compute, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=nova_compute, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-9420ebc64adcbe31526e40e2b0b78b2e1b7d41ddf41dd4e3243f19db7c97ac09-5f6dcf25a4eb712d7b55775b8e130167254d53f3f84c8303f8f39f30426e780b'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'nova', 'volumes': ['/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Feb 23 09:27:50 np0005626463.localdomain podman[231705]: 2026-02-23 09:27:50.671609339 +0000 UTC m=+0.125853740 container start 8d27414cca68da82346fae1fc6c4ecb36a1a0e33cd4571c19621f2f476697e2d (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=nova_compute, io.buildah.version=1.43.0, container_name=nova_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20260216, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-9420ebc64adcbe31526e40e2b0b78b2e1b7d41ddf41dd4e3243f19db7c97ac09-5f6dcf25a4eb712d7b55775b8e130167254d53f3f84c8303f8f39f30426e780b'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'nova', 'volumes': ['/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']})
Feb 23 09:27:50 np0005626463.localdomain podman[231705]: nova_compute
Feb 23 09:27:50 np0005626463.localdomain nova_compute[231721]: + sudo -E kolla_set_configs
Feb 23 09:27:50 np0005626463.localdomain systemd[1]: Started nova_compute container.
Feb 23 09:27:50 np0005626463.localdomain sudo[231632]: pam_unix(sudo:session): session closed for user root
Feb 23 09:27:50 np0005626463.localdomain nova_compute[231721]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Feb 23 09:27:50 np0005626463.localdomain nova_compute[231721]: INFO:__main__:Validating config file
Feb 23 09:27:50 np0005626463.localdomain nova_compute[231721]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Feb 23 09:27:50 np0005626463.localdomain nova_compute[231721]: INFO:__main__:Copying service configuration files
Feb 23 09:27:50 np0005626463.localdomain nova_compute[231721]: INFO:__main__:Deleting /etc/nova/nova.conf
Feb 23 09:27:50 np0005626463.localdomain nova_compute[231721]: INFO:__main__:Copying /var/lib/kolla/config_files/src/nova-blank.conf to /etc/nova/nova.conf
Feb 23 09:27:50 np0005626463.localdomain nova_compute[231721]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Feb 23 09:27:50 np0005626463.localdomain nova_compute[231721]: INFO:__main__:Deleting /etc/nova/nova.conf.d/01-nova.conf
Feb 23 09:27:50 np0005626463.localdomain nova_compute[231721]: INFO:__main__:Copying /var/lib/kolla/config_files/src/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Feb 23 09:27:50 np0005626463.localdomain nova_compute[231721]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Feb 23 09:27:50 np0005626463.localdomain nova_compute[231721]: INFO:__main__:Deleting /etc/nova/nova.conf.d/03-ceph-nova.conf
Feb 23 09:27:50 np0005626463.localdomain nova_compute[231721]: INFO:__main__:Copying /var/lib/kolla/config_files/src/03-ceph-nova.conf to /etc/nova/nova.conf.d/03-ceph-nova.conf
Feb 23 09:27:50 np0005626463.localdomain nova_compute[231721]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/03-ceph-nova.conf
Feb 23 09:27:50 np0005626463.localdomain nova_compute[231721]: INFO:__main__:Deleting /etc/nova/nova.conf.d/99-nova-compute-cells-workarounds.conf
Feb 23 09:27:50 np0005626463.localdomain nova_compute[231721]: INFO:__main__:Copying /var/lib/kolla/config_files/src/99-nova-compute-cells-workarounds.conf to /etc/nova/nova.conf.d/99-nova-compute-cells-workarounds.conf
Feb 23 09:27:50 np0005626463.localdomain nova_compute[231721]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/99-nova-compute-cells-workarounds.conf
Feb 23 09:27:50 np0005626463.localdomain nova_compute[231721]: INFO:__main__:Deleting /etc/nova/nova.conf.d/nova-blank.conf
Feb 23 09:27:50 np0005626463.localdomain nova_compute[231721]: INFO:__main__:Copying /var/lib/kolla/config_files/src/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Feb 23 09:27:50 np0005626463.localdomain nova_compute[231721]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Feb 23 09:27:50 np0005626463.localdomain nova_compute[231721]: INFO:__main__:Deleting /etc/nova/nova.conf.d/02-nova-host-specific.conf
Feb 23 09:27:50 np0005626463.localdomain nova_compute[231721]: INFO:__main__:Copying /var/lib/kolla/config_files/src/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Feb 23 09:27:50 np0005626463.localdomain nova_compute[231721]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Feb 23 09:27:50 np0005626463.localdomain nova_compute[231721]: INFO:__main__:Deleting /etc/ceph
Feb 23 09:27:50 np0005626463.localdomain nova_compute[231721]: INFO:__main__:Creating directory /etc/ceph
Feb 23 09:27:50 np0005626463.localdomain nova_compute[231721]: INFO:__main__:Setting permission for /etc/ceph
Feb 23 09:27:50 np0005626463.localdomain nova_compute[231721]: INFO:__main__:Copying /var/lib/kolla/config_files/src/ceph/ceph.conf to /etc/ceph/ceph.conf
Feb 23 09:27:50 np0005626463.localdomain nova_compute[231721]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Feb 23 09:27:50 np0005626463.localdomain nova_compute[231721]: INFO:__main__:Copying /var/lib/kolla/config_files/src/ceph/ceph.client.openstack.keyring to /etc/ceph/ceph.client.openstack.keyring
Feb 23 09:27:50 np0005626463.localdomain nova_compute[231721]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Feb 23 09:27:50 np0005626463.localdomain nova_compute[231721]: INFO:__main__:Deleting /var/lib/nova/.ssh/ssh-privatekey
Feb 23 09:27:50 np0005626463.localdomain nova_compute[231721]: INFO:__main__:Copying /var/lib/kolla/config_files/src/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Feb 23 09:27:50 np0005626463.localdomain nova_compute[231721]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Feb 23 09:27:50 np0005626463.localdomain nova_compute[231721]: INFO:__main__:Deleting /var/lib/nova/.ssh/config
Feb 23 09:27:50 np0005626463.localdomain nova_compute[231721]: INFO:__main__:Copying /var/lib/kolla/config_files/src/ssh-config to /var/lib/nova/.ssh/config
Feb 23 09:27:50 np0005626463.localdomain nova_compute[231721]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Feb 23 09:27:50 np0005626463.localdomain nova_compute[231721]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Feb 23 09:27:50 np0005626463.localdomain nova_compute[231721]: INFO:__main__:Copying /var/lib/kolla/config_files/src/run-on-host to /usr/sbin/iscsiadm
Feb 23 09:27:50 np0005626463.localdomain nova_compute[231721]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Feb 23 09:27:50 np0005626463.localdomain nova_compute[231721]: INFO:__main__:Writing out command to execute
Feb 23 09:27:50 np0005626463.localdomain nova_compute[231721]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Feb 23 09:27:50 np0005626463.localdomain nova_compute[231721]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Feb 23 09:27:50 np0005626463.localdomain nova_compute[231721]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Feb 23 09:27:50 np0005626463.localdomain nova_compute[231721]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Feb 23 09:27:50 np0005626463.localdomain nova_compute[231721]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Feb 23 09:27:50 np0005626463.localdomain nova_compute[231721]: ++ cat /run_command
Feb 23 09:27:50 np0005626463.localdomain nova_compute[231721]: + CMD=nova-compute
Feb 23 09:27:50 np0005626463.localdomain nova_compute[231721]: + ARGS=
Feb 23 09:27:50 np0005626463.localdomain nova_compute[231721]: + sudo kolla_copy_cacerts
Feb 23 09:27:50 np0005626463.localdomain nova_compute[231721]: + [[ ! -n '' ]]
Feb 23 09:27:50 np0005626463.localdomain nova_compute[231721]: + . kolla_extend_start
Feb 23 09:27:50 np0005626463.localdomain nova_compute[231721]: Running command: 'nova-compute'
Feb 23 09:27:50 np0005626463.localdomain nova_compute[231721]: + echo 'Running command: '\''nova-compute'\'''
Feb 23 09:27:50 np0005626463.localdomain nova_compute[231721]: + umask 0022
Feb 23 09:27:50 np0005626463.localdomain nova_compute[231721]: + exec nova-compute
Feb 23 09:27:51 np0005626463.localdomain sudo[231841]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ascewgepwjpqfigewguxapjkmkrilemd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838871.5658872-3930-278133699281124/AnsiballZ_podman_container.py
Feb 23 09:27:51 np0005626463.localdomain sudo[231841]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:27:52 np0005626463.localdomain python3.9[231843]: ansible-containers.podman.podman_container Invoked with name=nova_compute_init state=started executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Feb 23 09:27:52 np0005626463.localdomain systemd[1]: Started libpod-conmon-29fbf3e6d165ac37a2073e9d11df0954b5b34530c2d4564677cda92707f802aa.scope.
Feb 23 09:27:52 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 09:27:52 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f44d9e7d68ca1accba5abc072a966a93a3cfaed75061df003916b61d6be8a5d6/merged/usr/sbin/nova_statedir_ownership.py supports timestamps until 2038 (0x7fffffff)
Feb 23 09:27:52 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f44d9e7d68ca1accba5abc072a966a93a3cfaed75061df003916b61d6be8a5d6/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Feb 23 09:27:52 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f44d9e7d68ca1accba5abc072a966a93a3cfaed75061df003916b61d6be8a5d6/merged/var/lib/_nova_secontext supports timestamps until 2038 (0x7fffffff)
Feb 23 09:27:52 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:52.380 231725 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Feb 23 09:27:52 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:52.381 231725 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Feb 23 09:27:52 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:52.381 231725 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Feb 23 09:27:52 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:52.381 231725 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
Feb 23 09:27:52 np0005626463.localdomain podman[231864]: 2026-02-23 09:27:52.38883004 +0000 UTC m=+0.170688110 container init 29fbf3e6d165ac37a2073e9d11df0954b5b34530c2d4564677cda92707f802aa (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, managed_by=edpm_ansible, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=nova_compute_init, container_name=nova_compute_init, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False, 'EDPM_CONFIG_HASH': '5f6dcf25a4eb712d7b55775b8e130167254d53f3f84c8303f8f39f30426e780b'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'none', 'privileged': False, 'restart': 'never', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, io.buildah.version=1.43.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0)
Feb 23 09:27:52 np0005626463.localdomain podman[231864]: 2026-02-23 09:27:52.397083756 +0000 UTC m=+0.178941826 container start 29fbf3e6d165ac37a2073e9d11df0954b5b34530c2d4564677cda92707f802aa (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, config_data={'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False, 'EDPM_CONFIG_HASH': '5f6dcf25a4eb712d7b55775b8e130167254d53f3f84c8303f8f39f30426e780b'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'none', 'privileged': False, 'restart': 'never', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=nova_compute_init, container_name=nova_compute_init, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 23 09:27:52 np0005626463.localdomain python3.9[231843]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: podman start nova_compute_init
Feb 23 09:27:52 np0005626463.localdomain nova_compute_init[231887]: INFO:nova_statedir:Applying nova statedir ownership
Feb 23 09:27:52 np0005626463.localdomain nova_compute_init[231887]: INFO:nova_statedir:Target ownership for /var/lib/nova: 42436:42436
Feb 23 09:27:52 np0005626463.localdomain nova_compute_init[231887]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/
Feb 23 09:27:52 np0005626463.localdomain nova_compute_init[231887]: INFO:nova_statedir:Changing ownership of /var/lib/nova from 1000:1000 to 42436:42436
Feb 23 09:27:52 np0005626463.localdomain nova_compute_init[231887]: INFO:nova_statedir:Setting selinux context of /var/lib/nova to system_u:object_r:container_file_t:s0
Feb 23 09:27:52 np0005626463.localdomain nova_compute_init[231887]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/instances/
Feb 23 09:27:52 np0005626463.localdomain nova_compute_init[231887]: INFO:nova_statedir:Changing ownership of /var/lib/nova/instances from 1000:1000 to 42436:42436
Feb 23 09:27:52 np0005626463.localdomain nova_compute_init[231887]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances to system_u:object_r:container_file_t:s0
Feb 23 09:27:52 np0005626463.localdomain nova_compute_init[231887]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/
Feb 23 09:27:52 np0005626463.localdomain nova_compute_init[231887]: INFO:nova_statedir:Ownership of /var/lib/nova/instances/c2a7d92b-952f-46a7-8a6a-3322a48fcf4b already 42436:42436
Feb 23 09:27:52 np0005626463.localdomain nova_compute_init[231887]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances/c2a7d92b-952f-46a7-8a6a-3322a48fcf4b to system_u:object_r:container_file_t:s0
Feb 23 09:27:52 np0005626463.localdomain nova_compute_init[231887]: INFO:nova_statedir:Checking uid: 0 gid: 0 path: /var/lib/nova/instances/c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/console.log
Feb 23 09:27:52 np0005626463.localdomain nova_compute_init[231887]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/_base/
Feb 23 09:27:52 np0005626463.localdomain nova_compute_init[231887]: INFO:nova_statedir:Ownership of /var/lib/nova/instances/_base already 42436:42436
Feb 23 09:27:52 np0005626463.localdomain nova_compute_init[231887]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances/_base to system_u:object_r:container_file_t:s0
Feb 23 09:27:52 np0005626463.localdomain nova_compute_init[231887]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/_base/b81db1e2a8e54083d8c4b030cc59287a706969ae
Feb 23 09:27:52 np0005626463.localdomain nova_compute_init[231887]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/_base/ephemeral_1_0706d66
Feb 23 09:27:52 np0005626463.localdomain nova_compute_init[231887]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/locks/
Feb 23 09:27:52 np0005626463.localdomain nova_compute_init[231887]: INFO:nova_statedir:Ownership of /var/lib/nova/instances/locks already 42436:42436
Feb 23 09:27:52 np0005626463.localdomain nova_compute_init[231887]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances/locks to system_u:object_r:container_file_t:s0
Feb 23 09:27:52 np0005626463.localdomain nova_compute_init[231887]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/locks/nova-b81db1e2a8e54083d8c4b030cc59287a706969ae
Feb 23 09:27:52 np0005626463.localdomain nova_compute_init[231887]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/locks/nova-ephemeral_1_0706d66
Feb 23 09:27:52 np0005626463.localdomain nova_compute_init[231887]: INFO:nova_statedir:Checking uid: 0 gid: 0 path: /var/lib/nova/delay-nova-compute
Feb 23 09:27:52 np0005626463.localdomain nova_compute_init[231887]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/
Feb 23 09:27:52 np0005626463.localdomain nova_compute_init[231887]: INFO:nova_statedir:Ownership of /var/lib/nova/.ssh already 42436:42436
Feb 23 09:27:52 np0005626463.localdomain nova_compute_init[231887]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.ssh to system_u:object_r:container_file_t:s0
Feb 23 09:27:52 np0005626463.localdomain nova_compute_init[231887]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/ssh-privatekey
Feb 23 09:27:52 np0005626463.localdomain nova_compute_init[231887]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/config
Feb 23 09:27:52 np0005626463.localdomain nova_compute_init[231887]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.cache/
Feb 23 09:27:52 np0005626463.localdomain nova_compute_init[231887]: INFO:nova_statedir:Ownership of /var/lib/nova/.cache already 42436:42436
Feb 23 09:27:52 np0005626463.localdomain nova_compute_init[231887]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.cache to system_u:object_r:container_file_t:s0
Feb 23 09:27:52 np0005626463.localdomain nova_compute_init[231887]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.cache/python-entrypoints/
Feb 23 09:27:52 np0005626463.localdomain nova_compute_init[231887]: INFO:nova_statedir:Ownership of /var/lib/nova/.cache/python-entrypoints already 42436:42436
Feb 23 09:27:52 np0005626463.localdomain nova_compute_init[231887]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.cache/python-entrypoints to system_u:object_r:container_file_t:s0
Feb 23 09:27:52 np0005626463.localdomain nova_compute_init[231887]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.cache/python-entrypoints/fc52238ffcbdcb325c6bf3fe6412477fc4bdb6cd9151f39289b74f25e08e0db9
Feb 23 09:27:52 np0005626463.localdomain nova_compute_init[231887]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.cache/python-entrypoints/f23138a46bc477ec40b895db4322b27384fbb01ccd8da7395c9877132dfb82af
Feb 23 09:27:52 np0005626463.localdomain nova_compute_init[231887]: INFO:nova_statedir:Nova statedir ownership complete
Feb 23 09:27:52 np0005626463.localdomain systemd[1]: libpod-29fbf3e6d165ac37a2073e9d11df0954b5b34530c2d4564677cda92707f802aa.scope: Deactivated successfully.
Feb 23 09:27:52 np0005626463.localdomain podman[231888]: 2026-02-23 09:27:52.468033155 +0000 UTC m=+0.053111067 container died 29fbf3e6d165ac37a2073e9d11df0954b5b34530c2d4564677cda92707f802aa (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=nova_compute_init, container_name=nova_compute_init, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False, 'EDPM_CONFIG_HASH': '5f6dcf25a4eb712d7b55775b8e130167254d53f3f84c8303f8f39f30426e780b'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'none', 'privileged': False, 'restart': 'never', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, io.buildah.version=1.43.0)
Feb 23 09:27:52 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:52.501 231725 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 09:27:52 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:52.531 231725 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.031s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 09:27:52 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:52.532 231725 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473
Feb 23 09:27:52 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-29fbf3e6d165ac37a2073e9d11df0954b5b34530c2d4564677cda92707f802aa-userdata-shm.mount: Deactivated successfully.
Feb 23 09:27:52 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-f44d9e7d68ca1accba5abc072a966a93a3cfaed75061df003916b61d6be8a5d6-merged.mount: Deactivated successfully.
Feb 23 09:27:52 np0005626463.localdomain podman[231899]: 2026-02-23 09:27:52.583069601 +0000 UTC m=+0.112912613 container cleanup 29fbf3e6d165ac37a2073e9d11df0954b5b34530c2d4564677cda92707f802aa (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.vendor=CentOS, config_data={'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False, 'EDPM_CONFIG_HASH': '5f6dcf25a4eb712d7b55775b8e130167254d53f3f84c8303f8f39f30426e780b'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'none', 'privileged': False, 'restart': 'never', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, container_name=nova_compute_init, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=nova_compute_init, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 23 09:27:52 np0005626463.localdomain systemd[1]: libpod-conmon-29fbf3e6d165ac37a2073e9d11df0954b5b34530c2d4564677cda92707f802aa.scope: Deactivated successfully.
Feb 23 09:27:52 np0005626463.localdomain sudo[231841]: pam_unix(sudo:session): session closed for user root
Feb 23 09:27:52 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:52.924 231725 INFO nova.virt.driver [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.053 231725 INFO nova.compute.provider_config [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.060 231725 WARNING nova.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] Current Nova version does not support computes older than Yoga but the minimum compute service level in your cell is 57 and the oldest supported service level is 61.: nova.exception.TooOldComputeService: Current Nova version does not support computes older than Yoga but the minimum compute service level in your cell is 57 and the oldest supported service level is 61.
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.061 231725 DEBUG oslo_concurrency.lockutils [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.061 231725 DEBUG oslo_concurrency.lockutils [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.061 231725 DEBUG oslo_concurrency.lockutils [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.061 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.061 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.062 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.062 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.062 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.062 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.062 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.062 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.062 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.062 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.063 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.063 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.063 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.063 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.063 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.063 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.063 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.064 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.064 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.064 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] console_host                   = np0005626463.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.064 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.064 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.064 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.064 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.065 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.065 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.065 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.065 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.065 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.065 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.065 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.066 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.066 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.066 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.066 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.066 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.066 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.066 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.067 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] host                           = np0005626463.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.067 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.067 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.067 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.067 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.067 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.068 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.068 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.068 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.068 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.068 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.068 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.069 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.069 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.069 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.069 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.069 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.069 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.069 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.070 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.070 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.070 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.070 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.070 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.070 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.070 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.070 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.071 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.071 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.071 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.071 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.071 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.071 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.071 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.071 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.072 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.072 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.072 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.072 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.072 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.072 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.072 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.073 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] my_block_storage_ip            = 192.168.122.106 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.073 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] my_ip                          = 192.168.122.106 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.073 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.073 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.073 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.073 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.073 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.074 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.074 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.074 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.074 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.074 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.074 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.074 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.075 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.075 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.075 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.075 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.075 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.075 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.075 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.075 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.076 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.076 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.076 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.076 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.076 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.076 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.076 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.077 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.077 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.077 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.077 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.077 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.077 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.077 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.077 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.078 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.078 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.078 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.078 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.078 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.078 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.078 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.079 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.079 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.079 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.079 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.079 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.079 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.079 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.080 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.080 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.080 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.080 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.080 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.080 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.080 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.081 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.081 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.081 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.081 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.081 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.081 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.081 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.081 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.082 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.082 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.082 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.082 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.082 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.082 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.082 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.083 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.083 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.083 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.083 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.083 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.083 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.083 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.084 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.084 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.084 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.084 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.084 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.084 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.084 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.085 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.085 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.085 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.085 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.085 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.085 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.085 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.086 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.086 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.086 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.086 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.086 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.086 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.086 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.087 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.087 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.087 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.087 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.087 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.087 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.087 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.088 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.088 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.088 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.088 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.088 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.088 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.088 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.089 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.089 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.089 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.089 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.089 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.089 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.089 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.090 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.090 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.090 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.090 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.090 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.090 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.090 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.090 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.091 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.091 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.091 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.091 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.091 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.091 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.091 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.092 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.092 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.092 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.092 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.092 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.092 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] cinder.os_region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.092 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.093 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.093 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.093 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.093 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.093 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.093 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.093 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.094 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.094 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.094 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.094 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.094 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.094 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.094 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.094 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.095 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.095 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.095 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.095 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.095 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.095 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.095 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.096 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.096 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.096 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.096 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.096 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.096 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.096 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.097 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.097 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.097 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.097 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.097 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.097 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.097 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.098 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.098 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.098 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.098 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.098 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.098 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.098 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.098 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.099 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.099 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.099 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.099 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.099 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.099 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.099 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.100 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.100 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.100 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.100 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.100 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.100 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.100 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.101 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.101 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.101 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.101 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.101 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.101 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.101 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.102 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.102 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.102 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.102 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.102 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.102 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.102 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.103 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.103 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.103 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.103 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.103 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.103 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.103 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.104 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.104 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.104 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.104 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.104 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.104 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.104 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.105 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.105 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.105 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.105 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.105 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.105 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.105 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.105 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.106 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.106 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.106 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.106 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.106 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.106 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.106 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.107 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.107 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.107 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.107 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.107 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.107 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.107 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.108 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.108 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.108 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.108 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.108 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.108 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.108 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.109 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.109 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.109 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.109 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.109 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.109 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.109 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.109 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.110 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.110 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.110 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.110 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.110 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.110 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.110 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.111 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.111 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.111 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.111 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.111 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.111 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.112 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.112 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.112 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.112 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.113 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.113 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.113 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.113 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.113 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.114 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.114 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.114 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.114 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.114 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.114 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.114 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.114 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.115 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.115 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.115 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.115 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.115 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.115 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.115 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.116 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.116 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.116 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.116 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.116 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.116 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.116 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.117 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.117 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.117 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.117 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] barbican.barbican_region_name  = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.117 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.117 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.117 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.118 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.118 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.118 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.118 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.118 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.118 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.118 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.118 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.119 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.119 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.119 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.119 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.119 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.119 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.119 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.120 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.120 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.120 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.120 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.120 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.120 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.120 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.121 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.121 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.121 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.121 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.121 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.121 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.121 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.121 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.122 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.122 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.122 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.122 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.122 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.122 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.122 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.123 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.123 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.123 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.123 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.123 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.123 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.123 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.123 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.124 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.124 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.124 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.124 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.124 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.124 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.124 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.125 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.125 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.125 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.cpu_mode               = host-model log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.125 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.125 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.cpu_models             = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.125 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.125 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.126 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.126 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.126 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.126 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.126 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.126 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.126 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.126 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.127 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.127 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.127 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.127 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.images_rbd_ceph_conf   = /etc/ceph/ceph.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.127 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.127 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.127 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.images_rbd_glance_store_name = default_backend log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.128 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.images_rbd_pool        = vms log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.128 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.images_type            = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.128 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.128 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.128 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.128 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.128 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.129 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.129 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.129 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.129 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.129 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.129 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.129 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.130 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.130 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.130 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.130 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.130 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.130 231725 WARNING oslo_config.cfg [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: live_migration_uri is deprecated for removal in favor of two other options that
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: and ``live_migration_inbound_addr`` respectively.
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: ).  Its value may be silently ignored in the future.
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.131 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.live_migration_uri     = qemu+ssh://nova@%s/system?keyfile=/var/lib/nova/.ssh/ssh-privatekey log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.131 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.live_migration_with_native_tls = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.131 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.131 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.131 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.131 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.132 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.132 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.132 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.132 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.132 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.132 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.132 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.133 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.133 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.133 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.133 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.133 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.133 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.rbd_secret_uuid        = f1fea371-cb69-578d-a3d0-b5c472a84b46 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.133 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.rbd_user               = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.134 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.134 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.134 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.134 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.134 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.134 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.134 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.135 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.135 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.135 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.135 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.135 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.135 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.135 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.136 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.136 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.136 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.136 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.136 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.136 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.136 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.137 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.137 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.137 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.137 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.137 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.137 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.137 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.138 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.138 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.138 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.138 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.138 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.138 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.138 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.139 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.139 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.139 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.139 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.139 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.139 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.139 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.139 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.140 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.140 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.140 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.140 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.140 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.140 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.140 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.141 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.141 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.141 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.141 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.141 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.141 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.141 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.141 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.142 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.142 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.142 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.142 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.142 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.142 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.142 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.143 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.143 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.143 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.143 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.143 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.143 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] placement.auth_url             = http://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.143 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.144 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.144 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.144 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.144 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.144 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.144 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.144 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.145 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.145 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.145 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.145 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.145 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.145 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.145 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.145 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.146 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.146 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.146 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.146 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.146 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.146 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.146 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.147 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.147 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.147 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.147 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.147 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.147 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.147 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.147 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.148 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.148 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.148 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.148 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.148 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.148 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.148 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.149 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.149 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.149 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.149 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.149 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.149 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.149 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.150 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.150 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.150 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.150 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.150 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.150 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.151 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.151 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.151 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.151 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.151 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.151 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.151 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.152 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.152 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.152 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.152 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.152 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.152 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.153 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.153 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.153 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.153 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.153 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.153 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.154 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.154 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.154 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.154 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.154 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.154 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.154 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.155 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.155 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.155 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.155 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.155 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.155 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.155 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.155 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.156 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.156 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.156 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.156 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.156 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.156 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.157 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.157 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.157 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.157 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.157 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.157 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.157 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.158 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.158 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.158 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.158 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.158 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.158 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.158 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.159 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.159 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.159 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.159 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.159 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.159 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.159 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.160 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.160 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.160 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.160 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.160 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.160 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.160 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.161 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.161 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.161 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.161 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.161 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.161 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.161 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.161 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.162 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.162 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.162 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.162 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.162 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.162 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.162 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.163 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.163 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.163 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.163 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.163 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.163 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.163 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.164 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.164 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.164 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.164 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.164 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.164 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.164 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.165 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.165 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.165 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.165 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.165 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.165 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.165 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.165 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.166 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.166 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] vnc.novncproxy_base_url        = http://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.166 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.166 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.166 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.167 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] vnc.server_proxyclient_address = 192.168.122.106 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.167 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.167 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.167 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.167 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] workarounds.disable_compute_service_check_for_ffu = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.167 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.167 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.167 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.168 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.168 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.168 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.168 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.168 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.168 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.168 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.169 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.169 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.169 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.169 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.169 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.169 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.169 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.169 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.170 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.170 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.170 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.170 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.170 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.170 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.170 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.171 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.171 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.171 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.171 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.171 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.171 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.171 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.171 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.172 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.172 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.172 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.172 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.172 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.172 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.172 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.173 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.173 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.173 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.173 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.173 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.173 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.173 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.173 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.174 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.174 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.174 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.174 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.174 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.174 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.174 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.175 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.175 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.175 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.175 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.175 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.175 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.175 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.176 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.176 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.176 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.176 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.176 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.176 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.176 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.176 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.177 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.177 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.177 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.177 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.177 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.177 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.177 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.177 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.178 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.178 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.178 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.178 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.178 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.178 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.178 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.179 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.179 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_limit.auth_url            = http://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.179 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.179 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.179 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.179 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.179 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.180 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.180 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.180 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.180 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.180 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.180 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.180 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.181 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.181 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.181 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.181 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.181 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.181 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.181 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.181 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.182 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.182 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.182 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.182 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.182 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.182 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.182 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.182 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.183 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.183 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.183 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.183 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.183 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.183 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.183 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.184 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.184 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.184 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.184 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.184 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.184 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.184 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.184 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.185 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.185 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.185 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.185 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.185 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.185 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.185 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.186 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.186 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.186 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.186 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.186 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.186 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.187 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.187 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.187 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.187 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.187 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.187 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.187 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.187 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.188 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.188 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.188 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.188 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.188 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.188 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.188 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.189 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.189 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.189 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.189 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.189 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.189 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.189 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.190 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.190 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.191 231725 INFO nova.service [-] Starting compute node (version 27.5.2-0.20260220085704.5cfeecb.el9)
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.209 231725 INFO nova.virt.node [None req-a9e5d3d9-9357-4ba9-96c4-7ac54478153f - - - - - -] Determined node identity be63d86c-a403-4ec9-a515-07ea2962cb4d from /var/lib/nova/compute_id
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.210 231725 DEBUG nova.virt.libvirt.host [None req-a9e5d3d9-9357-4ba9-96c4-7ac54478153f - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.210 231725 DEBUG nova.virt.libvirt.host [None req-a9e5d3d9-9357-4ba9-96c4-7ac54478153f - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.210 231725 DEBUG nova.virt.libvirt.host [None req-a9e5d3d9-9357-4ba9-96c4-7ac54478153f - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.211 231725 DEBUG nova.virt.libvirt.host [None req-a9e5d3d9-9357-4ba9-96c4-7ac54478153f - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.225 231725 DEBUG nova.virt.libvirt.host [None req-a9e5d3d9-9357-4ba9-96c4-7ac54478153f - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7f02b3b79fa0> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.227 231725 DEBUG nova.virt.libvirt.host [None req-a9e5d3d9-9357-4ba9-96c4-7ac54478153f - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7f02b3b79fa0> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.228 231725 INFO nova.virt.libvirt.driver [None req-a9e5d3d9-9357-4ba9-96c4-7ac54478153f - - - - - -] Connection event '1' reason 'None'
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.235 231725 INFO nova.virt.libvirt.host [None req-a9e5d3d9-9357-4ba9-96c4-7ac54478153f - - - - - -] Libvirt host capabilities <capabilities>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:   <host>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     <uuid>bdcaa433-cfc7-450a-99ab-f0985ab59447</uuid>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     <cpu>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <arch>x86_64</arch>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model>EPYC-Rome-v4</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <vendor>AMD</vendor>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <microcode version='16777317'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <signature family='23' model='49' stepping='0'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <maxphysaddr mode='emulate' bits='40'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <feature name='x2apic'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <feature name='tsc-deadline'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <feature name='osxsave'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <feature name='hypervisor'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <feature name='tsc_adjust'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <feature name='spec-ctrl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <feature name='stibp'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <feature name='arch-capabilities'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <feature name='ssbd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <feature name='cmp_legacy'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <feature name='topoext'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <feature name='virt-ssbd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <feature name='lbrv'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <feature name='tsc-scale'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <feature name='vmcb-clean'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <feature name='pause-filter'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <feature name='pfthreshold'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <feature name='svme-addr-chk'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <feature name='rdctl-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <feature name='skip-l1dfl-vmentry'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <feature name='mds-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <feature name='pschange-mc-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <pages unit='KiB' size='4'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <pages unit='KiB' size='2048'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <pages unit='KiB' size='1048576'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     </cpu>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     <power_management>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <suspend_mem/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <suspend_disk/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <suspend_hybrid/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     </power_management>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     <iommu support='no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     <migration_features>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <live/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <uri_transports>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <uri_transport>tcp</uri_transport>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <uri_transport>rdma</uri_transport>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </uri_transports>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     </migration_features>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     <topology>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <cells num='1'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <cell id='0'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:           <memory unit='KiB'>16116612</memory>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:           <pages unit='KiB' size='4'>4029153</pages>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:           <pages unit='KiB' size='2048'>0</pages>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:           <pages unit='KiB' size='1048576'>0</pages>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:           <distances>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:             <sibling id='0' value='10'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:           </distances>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:           <cpus num='8'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:             <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:             <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:             <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:             <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:             <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:             <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:             <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:             <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:           </cpus>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         </cell>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </cells>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     </topology>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     <cache>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     </cache>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     <secmodel>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model>selinux</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <doi>0</doi>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     </secmodel>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     <secmodel>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model>dac</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <doi>0</doi>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <baselabel type='kvm'>+107:+107</baselabel>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <baselabel type='qemu'>+107:+107</baselabel>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     </secmodel>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:   </host>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:   <guest>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     <os_type>hvm</os_type>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     <arch name='i686'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <wordsize>32</wordsize>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <domain type='qemu'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <domain type='kvm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     </arch>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     <features>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <pae/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <nonpae/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <acpi default='on' toggle='yes'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <apic default='on' toggle='no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <cpuselection/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <deviceboot/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <disksnapshot default='on' toggle='no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <externalSnapshot/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     </features>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:   </guest>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:   <guest>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     <os_type>hvm</os_type>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     <arch name='x86_64'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <wordsize>64</wordsize>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <domain type='qemu'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <domain type='kvm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     </arch>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     <features>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <acpi default='on' toggle='yes'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <apic default='on' toggle='no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <cpuselection/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <deviceboot/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <disksnapshot default='on' toggle='no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <externalSnapshot/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     </features>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:   </guest>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: </capabilities>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.246 231725 DEBUG nova.virt.libvirt.volume.mount [None req-a9e5d3d9-9357-4ba9-96c4-7ac54478153f - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.247 231725 DEBUG nova.virt.libvirt.host [None req-a9e5d3d9-9357-4ba9-96c4-7ac54478153f - - - - - -] Getting domain capabilities for i686 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.253 231725 DEBUG nova.virt.libvirt.host [None req-a9e5d3d9-9357-4ba9-96c4-7ac54478153f - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: <domainCapabilities>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:   <path>/usr/libexec/qemu-kvm</path>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:   <domain>kvm</domain>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:   <machine>pc-i440fx-rhel7.6.0</machine>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:   <arch>i686</arch>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:   <vcpu max='240'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:   <iothreads supported='yes'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:   <os supported='yes'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     <enum name='firmware'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     <loader supported='yes'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <enum name='type'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>rom</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>pflash</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </enum>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <enum name='readonly'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>yes</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>no</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </enum>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <enum name='secure'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>no</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </enum>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     </loader>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:   </os>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:   <cpu>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     <mode name='host-passthrough' supported='yes'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <enum name='hostPassthroughMigratable'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>on</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>off</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </enum>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     </mode>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     <mode name='maximum' supported='yes'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <enum name='maximumMigratable'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>on</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>off</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </enum>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     </mode>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     <mode name='host-model' supported='yes'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model fallback='forbid'>EPYC-Rome</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <vendor>AMD</vendor>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <maxphysaddr mode='passthrough' limit='40'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <feature policy='require' name='x2apic'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <feature policy='require' name='tsc-deadline'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <feature policy='require' name='hypervisor'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <feature policy='require' name='tsc_adjust'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <feature policy='require' name='spec-ctrl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <feature policy='require' name='stibp'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <feature policy='require' name='ssbd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <feature policy='require' name='cmp_legacy'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <feature policy='require' name='overflow-recov'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <feature policy='require' name='succor'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <feature policy='require' name='ibrs'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <feature policy='require' name='amd-ssbd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <feature policy='require' name='virt-ssbd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <feature policy='require' name='lbrv'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <feature policy='require' name='tsc-scale'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <feature policy='require' name='vmcb-clean'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <feature policy='require' name='pause-filter'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <feature policy='require' name='pfthreshold'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <feature policy='require' name='svme-addr-chk'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <feature policy='require' name='lfence-always-serializing'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <feature policy='disable' name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     </mode>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     <mode name='custom' supported='yes'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Broadwell'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='hle'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rtm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Broadwell-IBRS'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='hle'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rtm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Broadwell-noTSX'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Broadwell-noTSX-IBRS'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Broadwell-v1'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='hle'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rtm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Broadwell-v2'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Broadwell-v3'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='hle'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rtm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Broadwell-v4'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Cascadelake-Server'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='hle'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rtm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Cascadelake-Server-noTSX'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ibrs-all'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Cascadelake-Server-v1'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='hle'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rtm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Cascadelake-Server-v2'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='hle'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ibrs-all'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rtm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Cascadelake-Server-v3'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ibrs-all'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Cascadelake-Server-v4'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ibrs-all'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Cascadelake-Server-v5'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ibrs-all'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='ClearwaterForest'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx-ifma'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx-ne-convert'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx-vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx-vnni-int16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx-vnni-int8'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='bhi-ctrl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='bhi-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='bus-lock-detect'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='cldemote'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='cmpccxadd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ddpd-u'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fbsdp-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrs'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='gds-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='gfni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ibrs-all'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='intel-psfd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ipred-ctrl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='lam'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='mcdt-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='movdir64b'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='movdiri'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pbrsb-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='prefetchiti'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='psdp-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rfds-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rrsba-ctrl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='serialize'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='sha512'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='sm3'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='sm4'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ss'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vaes'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='ClearwaterForest-v1'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx-ifma'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx-ne-convert'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx-vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx-vnni-int16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx-vnni-int8'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='bhi-ctrl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='bhi-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='bus-lock-detect'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='cldemote'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='cmpccxadd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ddpd-u'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fbsdp-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrs'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='gds-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='gfni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ibrs-all'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='intel-psfd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ipred-ctrl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='lam'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='mcdt-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='movdir64b'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='movdiri'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pbrsb-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='prefetchiti'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='psdp-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rfds-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rrsba-ctrl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='serialize'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='sha512'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='sm3'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='sm4'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ss'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vaes'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Cooperlake'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-bf16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='hle'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ibrs-all'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rtm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='taa-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Cooperlake-v1'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-bf16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='hle'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ibrs-all'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rtm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='taa-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Cooperlake-v2'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-bf16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='hle'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ibrs-all'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rtm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='taa-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Denverton'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='mpx'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Denverton-v1'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='mpx'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Denverton-v2'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Denverton-v3'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Dhyana-v2'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='EPYC-Genoa'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='amd-psfd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='auto-ibrs'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-bf16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bitalg'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512ifma'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='gfni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='la57'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='no-nested-data-bp'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='null-sel-clr-base'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='stibp-always-on'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vaes'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='EPYC-Genoa-v1'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='amd-psfd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='auto-ibrs'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-bf16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bitalg'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512ifma'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='gfni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='la57'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='no-nested-data-bp'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='null-sel-clr-base'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='stibp-always-on'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vaes'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='EPYC-Genoa-v2'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='amd-psfd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='auto-ibrs'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-bf16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bitalg'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512ifma'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fs-gs-base-ns'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='gfni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='la57'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='no-nested-data-bp'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='null-sel-clr-base'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='perfmon-v2'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='stibp-always-on'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vaes'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='EPYC-Milan'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='EPYC-Milan-v1'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='EPYC-Milan-v2'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='amd-psfd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='no-nested-data-bp'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='null-sel-clr-base'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='stibp-always-on'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vaes'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='EPYC-Milan-v3'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='amd-psfd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='no-nested-data-bp'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='null-sel-clr-base'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='stibp-always-on'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vaes'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='EPYC-Rome'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='EPYC-Rome-v1'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='EPYC-Rome-v2'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='EPYC-Rome-v3'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='EPYC-Turin'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='amd-psfd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='auto-ibrs'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx-vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-bf16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-vp2intersect'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bitalg'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512ifma'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fs-gs-base-ns'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='gfni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ibpb-brtype'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='la57'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='movdir64b'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='movdiri'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='no-nested-data-bp'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='null-sel-clr-base'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='perfmon-v2'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='prefetchi'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='sbpb'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='srso-user-kernel-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='stibp-always-on'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vaes'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='EPYC-Turin-v1'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='amd-psfd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='auto-ibrs'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx-vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-bf16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-vp2intersect'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bitalg'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512ifma'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fs-gs-base-ns'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='gfni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ibpb-brtype'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='la57'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='movdir64b'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='movdiri'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='no-nested-data-bp'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='null-sel-clr-base'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='perfmon-v2'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='prefetchi'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='sbpb'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='srso-user-kernel-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='stibp-always-on'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vaes'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='EPYC-v3'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='EPYC-v4'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='EPYC-v5'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='GraniteRapids'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='amx-bf16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='amx-fp16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='amx-int8'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='amx-tile'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx-vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-bf16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-fp16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bitalg'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512ifma'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='bus-lock-detect'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fbsdp-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrc'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrs'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fzrm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='gfni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='hle'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ibrs-all'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='la57'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='mcdt-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pbrsb-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='prefetchiti'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='psdp-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rtm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='serialize'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='taa-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='tsx-ldtrk'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vaes'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xfd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='GraniteRapids-v1'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='amx-bf16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='amx-fp16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='amx-int8'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='amx-tile'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx-vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-bf16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-fp16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bitalg'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512ifma'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='bus-lock-detect'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fbsdp-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrc'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrs'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fzrm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='gfni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='hle'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ibrs-all'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='la57'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='mcdt-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pbrsb-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='prefetchiti'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='psdp-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rtm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='serialize'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='taa-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='tsx-ldtrk'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vaes'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xfd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='GraniteRapids-v2'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='amx-bf16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='amx-fp16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='amx-int8'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='amx-tile'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx-vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx10'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx10-128'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx10-256'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx10-512'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-bf16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-fp16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bitalg'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512ifma'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='bus-lock-detect'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='cldemote'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fbsdp-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrc'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrs'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fzrm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='gfni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='hle'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ibrs-all'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='la57'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='mcdt-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='movdir64b'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='movdiri'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pbrsb-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='prefetchiti'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='psdp-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rtm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='serialize'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ss'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='taa-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='tsx-ldtrk'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vaes'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xfd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='GraniteRapids-v3'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='amx-bf16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='amx-fp16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='amx-int8'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='amx-tile'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx-vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx10'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx10-128'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx10-256'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx10-512'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-bf16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-fp16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bitalg'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512ifma'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='bus-lock-detect'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='cldemote'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fbsdp-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrc'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrs'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fzrm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='gfni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='hle'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ibrs-all'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='la57'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='mcdt-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='movdir64b'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='movdiri'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pbrsb-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='prefetchiti'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='psdp-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rtm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='serialize'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ss'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='taa-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='tsx-ldtrk'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vaes'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xfd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Haswell'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='hle'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rtm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Haswell-IBRS'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='hle'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rtm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Haswell-noTSX'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Haswell-noTSX-IBRS'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Haswell-v1'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='hle'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rtm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Haswell-v2'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Haswell-v3'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='hle'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rtm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Haswell-v4'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Icelake-Server'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bitalg'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='gfni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='hle'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='la57'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rtm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vaes'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Icelake-Server-noTSX'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bitalg'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='gfni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='la57'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vaes'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Icelake-Server-v1'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bitalg'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='gfni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='hle'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='la57'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rtm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vaes'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Icelake-Server-v2'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bitalg'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='gfni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='la57'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vaes'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Icelake-Server-v3'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bitalg'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='gfni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ibrs-all'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='la57'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='taa-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vaes'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Icelake-Server-v4'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bitalg'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512ifma'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='gfni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ibrs-all'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='la57'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='taa-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vaes'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Icelake-Server-v5'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bitalg'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512ifma'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='gfni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ibrs-all'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='la57'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='taa-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vaes'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Icelake-Server-v6'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bitalg'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512ifma'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='gfni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ibrs-all'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='la57'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='taa-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vaes'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Icelake-Server-v7'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bitalg'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512ifma'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='gfni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='hle'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ibrs-all'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='la57'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rtm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='taa-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vaes'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='IvyBridge'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='IvyBridge-IBRS'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='IvyBridge-v1'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='IvyBridge-v2'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='KnightsMill'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-4fmaps'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-4vnniw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512er'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512pf'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ss'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='KnightsMill-v1'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-4fmaps'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-4vnniw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512er'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512pf'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ss'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Opteron_G4'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fma4'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xop'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Opteron_G4-v1'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fma4'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xop'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Opteron_G5'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fma4'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='tbm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xop'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Opteron_G5-v1'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fma4'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='tbm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xop'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='SapphireRapids'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='amx-bf16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='amx-int8'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='amx-tile'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx-vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-bf16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-fp16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bitalg'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512ifma'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='bus-lock-detect'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrc'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrs'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fzrm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='gfni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='hle'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ibrs-all'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='la57'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rtm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='serialize'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='taa-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='tsx-ldtrk'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vaes'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xfd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='SapphireRapids-v1'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='amx-bf16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='amx-int8'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='amx-tile'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx-vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-bf16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-fp16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bitalg'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512ifma'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='bus-lock-detect'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrc'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrs'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fzrm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='gfni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='hle'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ibrs-all'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='la57'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rtm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='serialize'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='taa-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='tsx-ldtrk'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vaes'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xfd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='SapphireRapids-v2'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='amx-bf16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='amx-int8'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='amx-tile'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx-vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-bf16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-fp16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bitalg'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512ifma'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='bus-lock-detect'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fbsdp-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrc'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrs'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fzrm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='gfni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='hle'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ibrs-all'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='la57'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='psdp-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rtm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='serialize'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='taa-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='tsx-ldtrk'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vaes'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xfd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='SapphireRapids-v3'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='amx-bf16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='amx-int8'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='amx-tile'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx-vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-bf16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-fp16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bitalg'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512ifma'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='bus-lock-detect'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='cldemote'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fbsdp-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrc'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrs'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fzrm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='gfni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='hle'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ibrs-all'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='la57'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='movdir64b'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='movdiri'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='psdp-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rtm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='serialize'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ss'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='taa-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='tsx-ldtrk'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vaes'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xfd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='SapphireRapids-v4'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='amx-bf16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='amx-int8'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='amx-tile'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx-vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-bf16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-fp16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bitalg'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512ifma'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='bus-lock-detect'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='cldemote'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fbsdp-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrc'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrs'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fzrm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='gfni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='hle'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ibrs-all'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='la57'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='movdir64b'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='movdiri'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='psdp-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rtm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='serialize'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ss'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='taa-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='tsx-ldtrk'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vaes'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xfd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='SierraForest'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx-ifma'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx-ne-convert'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx-vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx-vnni-int8'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='bus-lock-detect'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='cmpccxadd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fbsdp-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrs'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='gfni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ibrs-all'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='mcdt-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pbrsb-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='psdp-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='serialize'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vaes'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='SierraForest-v1'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx-ifma'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx-ne-convert'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx-vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx-vnni-int8'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='bus-lock-detect'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='cmpccxadd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fbsdp-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrs'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='gfni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ibrs-all'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='mcdt-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pbrsb-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='psdp-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='serialize'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vaes'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='SierraForest-v2'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx-ifma'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx-ne-convert'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx-vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx-vnni-int8'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='bhi-ctrl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='bus-lock-detect'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='cldemote'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='cmpccxadd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fbsdp-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrs'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='gds-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='gfni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ibrs-all'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='intel-psfd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ipred-ctrl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='lam'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='mcdt-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='movdir64b'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='movdiri'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pbrsb-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='psdp-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rfds-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rrsba-ctrl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='serialize'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ss'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vaes'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='SierraForest-v3'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx-ifma'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx-ne-convert'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx-vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx-vnni-int8'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='bhi-ctrl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='bus-lock-detect'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='cldemote'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='cmpccxadd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fbsdp-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrs'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='gds-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='gfni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ibrs-all'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='intel-psfd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ipred-ctrl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='lam'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='mcdt-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='movdir64b'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='movdiri'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pbrsb-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='psdp-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rfds-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rrsba-ctrl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='serialize'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ss'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vaes'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Skylake-Client'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='hle'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rtm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Skylake-Client-IBRS'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='hle'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rtm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Skylake-Client-v1'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='hle'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rtm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Skylake-Client-v2'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='hle'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rtm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Skylake-Client-v3'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Skylake-Client-v4'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Skylake-Server'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='hle'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rtm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Skylake-Server-IBRS'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='hle'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rtm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Skylake-Server-v1'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='hle'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rtm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Skylake-Server-v2'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='hle'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rtm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Skylake-Server-v3'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Skylake-Server-v4'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Skylake-Server-v5'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Snowridge'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='cldemote'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='core-capability'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='gfni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='movdir64b'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='movdiri'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='mpx'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='split-lock-detect'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Snowridge-v1'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='cldemote'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='core-capability'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='gfni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='movdir64b'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='movdiri'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='mpx'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='split-lock-detect'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Snowridge-v2'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='cldemote'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='core-capability'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='gfni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='movdir64b'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='movdiri'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='split-lock-detect'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Snowridge-v3'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='cldemote'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='core-capability'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='gfni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='movdir64b'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='movdiri'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='split-lock-detect'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Snowridge-v4'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='cldemote'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='gfni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='movdir64b'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='movdiri'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='athlon'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='3dnow'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='3dnowext'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='athlon-v1'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='3dnow'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='3dnowext'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='core2duo'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ss'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='core2duo-v1'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ss'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='coreduo'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ss'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='coreduo-v1'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ss'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='n270'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ss'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='n270-v1'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ss'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='phenom'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='3dnow'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='3dnowext'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='phenom-v1'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='3dnow'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='3dnowext'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     </mode>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:   </cpu>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:   <memoryBacking supported='yes'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     <enum name='sourceType'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <value>file</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <value>anonymous</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <value>memfd</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     </enum>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:   </memoryBacking>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:   <devices>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     <disk supported='yes'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <enum name='diskDevice'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>disk</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>cdrom</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>floppy</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>lun</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </enum>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <enum name='bus'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>ide</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>fdc</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>scsi</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>virtio</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>usb</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>sata</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </enum>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <enum name='model'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>virtio</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>virtio-transitional</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>virtio-non-transitional</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </enum>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     </disk>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     <graphics supported='yes'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <enum name='type'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>vnc</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>egl-headless</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>dbus</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </enum>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     </graphics>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     <video supported='yes'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <enum name='modelType'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>vga</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>cirrus</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>virtio</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>none</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>bochs</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>ramfb</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </enum>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     </video>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     <hostdev supported='yes'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <enum name='mode'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>subsystem</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </enum>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <enum name='startupPolicy'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>default</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>mandatory</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>requisite</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>optional</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </enum>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <enum name='subsysType'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>usb</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>pci</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>scsi</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </enum>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <enum name='capsType'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <enum name='pciBackend'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     </hostdev>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     <rng supported='yes'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <enum name='model'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>virtio</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>virtio-transitional</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>virtio-non-transitional</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </enum>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <enum name='backendModel'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>random</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>egd</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>builtin</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </enum>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     </rng>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     <filesystem supported='yes'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <enum name='driverType'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>path</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>handle</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>virtiofs</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </enum>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     </filesystem>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     <tpm supported='yes'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <enum name='model'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>tpm-tis</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>tpm-crb</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </enum>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <enum name='backendModel'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>emulator</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>external</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </enum>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <enum name='backendVersion'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>2.0</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </enum>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     </tpm>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     <redirdev supported='yes'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <enum name='bus'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>usb</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </enum>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     </redirdev>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     <channel supported='yes'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <enum name='type'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>pty</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>unix</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </enum>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     </channel>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     <crypto supported='yes'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <enum name='model'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <enum name='type'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>qemu</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </enum>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <enum name='backendModel'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>builtin</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </enum>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     </crypto>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     <interface supported='yes'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <enum name='backendType'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>default</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>passt</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </enum>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     </interface>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     <panic supported='yes'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <enum name='model'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>isa</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>hyperv</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </enum>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     </panic>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     <console supported='yes'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <enum name='type'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>null</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>vc</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>pty</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>dev</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>file</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>pipe</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>stdio</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>udp</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>tcp</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>unix</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>qemu-vdagent</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>dbus</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </enum>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     </console>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:   </devices>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:   <features>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     <gic supported='no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     <vmcoreinfo supported='yes'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     <genid supported='yes'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     <backingStoreInput supported='yes'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     <backup supported='yes'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     <async-teardown supported='yes'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     <s390-pv supported='no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     <ps2 supported='yes'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     <tdx supported='no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     <sev supported='no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     <sgx supported='no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     <hyperv supported='yes'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <enum name='features'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>relaxed</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>vapic</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>spinlocks</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>vpindex</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>runtime</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>synic</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>stimer</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>reset</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>vendor_id</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>frequencies</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>reenlightenment</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>tlbflush</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>ipi</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>avic</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>emsr_bitmap</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>xmm_input</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </enum>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <defaults>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <spinlocks>4095</spinlocks>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <stimer_direct>on</stimer_direct>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <tlbflush_direct>off</tlbflush_direct>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <tlbflush_extended>off</tlbflush_extended>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <vendor_id>Linux KVM Hv</vendor_id>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </defaults>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     </hyperv>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     <launchSecurity supported='no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:   </features>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: </domainCapabilities>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.264 231725 DEBUG nova.virt.libvirt.host [None req-a9e5d3d9-9357-4ba9-96c4-7ac54478153f - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: <domainCapabilities>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:   <path>/usr/libexec/qemu-kvm</path>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:   <domain>kvm</domain>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:   <machine>pc-q35-rhel9.8.0</machine>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:   <arch>i686</arch>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:   <vcpu max='1024'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:   <iothreads supported='yes'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:   <os supported='yes'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     <enum name='firmware'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     <loader supported='yes'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <enum name='type'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>rom</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>pflash</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </enum>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <enum name='readonly'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>yes</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>no</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </enum>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <enum name='secure'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>no</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </enum>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     </loader>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:   </os>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:   <cpu>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     <mode name='host-passthrough' supported='yes'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <enum name='hostPassthroughMigratable'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>on</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>off</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </enum>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     </mode>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     <mode name='maximum' supported='yes'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <enum name='maximumMigratable'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>on</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>off</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </enum>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     </mode>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     <mode name='host-model' supported='yes'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model fallback='forbid'>EPYC-Rome</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <vendor>AMD</vendor>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <maxphysaddr mode='passthrough' limit='40'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <feature policy='require' name='x2apic'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <feature policy='require' name='tsc-deadline'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <feature policy='require' name='hypervisor'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <feature policy='require' name='tsc_adjust'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <feature policy='require' name='spec-ctrl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <feature policy='require' name='stibp'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <feature policy='require' name='ssbd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <feature policy='require' name='cmp_legacy'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <feature policy='require' name='overflow-recov'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <feature policy='require' name='succor'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <feature policy='require' name='ibrs'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <feature policy='require' name='amd-ssbd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <feature policy='require' name='virt-ssbd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <feature policy='require' name='lbrv'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <feature policy='require' name='tsc-scale'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <feature policy='require' name='vmcb-clean'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <feature policy='require' name='pause-filter'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <feature policy='require' name='pfthreshold'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <feature policy='require' name='svme-addr-chk'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <feature policy='require' name='lfence-always-serializing'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <feature policy='disable' name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     </mode>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     <mode name='custom' supported='yes'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Broadwell'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='hle'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rtm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Broadwell-IBRS'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='hle'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rtm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Broadwell-noTSX'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Broadwell-noTSX-IBRS'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Broadwell-v1'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='hle'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rtm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Broadwell-v2'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Broadwell-v3'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='hle'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rtm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Broadwell-v4'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Cascadelake-Server'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='hle'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rtm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Cascadelake-Server-noTSX'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ibrs-all'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Cascadelake-Server-v1'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='hle'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rtm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Cascadelake-Server-v2'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='hle'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ibrs-all'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rtm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Cascadelake-Server-v3'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ibrs-all'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Cascadelake-Server-v4'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ibrs-all'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Cascadelake-Server-v5'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ibrs-all'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='ClearwaterForest'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx-ifma'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx-ne-convert'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx-vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx-vnni-int16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx-vnni-int8'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='bhi-ctrl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='bhi-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='bus-lock-detect'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='cldemote'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='cmpccxadd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ddpd-u'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fbsdp-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrs'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='gds-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='gfni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ibrs-all'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='intel-psfd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ipred-ctrl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='lam'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='mcdt-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='movdir64b'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='movdiri'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pbrsb-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='prefetchiti'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='psdp-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rfds-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rrsba-ctrl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='serialize'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='sha512'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='sm3'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='sm4'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ss'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vaes'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='ClearwaterForest-v1'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx-ifma'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx-ne-convert'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx-vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx-vnni-int16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx-vnni-int8'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='bhi-ctrl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='bhi-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='bus-lock-detect'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='cldemote'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='cmpccxadd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ddpd-u'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fbsdp-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrs'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='gds-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='gfni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ibrs-all'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='intel-psfd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ipred-ctrl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='lam'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='mcdt-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='movdir64b'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='movdiri'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pbrsb-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='prefetchiti'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='psdp-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rfds-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rrsba-ctrl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='serialize'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='sha512'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='sm3'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='sm4'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ss'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vaes'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Cooperlake'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-bf16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='hle'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ibrs-all'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rtm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='taa-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Cooperlake-v1'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-bf16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='hle'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ibrs-all'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rtm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='taa-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Cooperlake-v2'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-bf16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='hle'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ibrs-all'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rtm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='taa-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Denverton'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='mpx'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Denverton-v1'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='mpx'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Denverton-v2'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Denverton-v3'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Dhyana-v2'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='EPYC-Genoa'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='amd-psfd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='auto-ibrs'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-bf16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bitalg'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512ifma'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='gfni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='la57'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='no-nested-data-bp'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='null-sel-clr-base'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='stibp-always-on'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vaes'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='EPYC-Genoa-v1'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='amd-psfd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='auto-ibrs'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-bf16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bitalg'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512ifma'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='gfni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='la57'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='no-nested-data-bp'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='null-sel-clr-base'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='stibp-always-on'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vaes'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='EPYC-Genoa-v2'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='amd-psfd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='auto-ibrs'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-bf16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bitalg'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512ifma'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fs-gs-base-ns'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='gfni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='la57'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='no-nested-data-bp'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='null-sel-clr-base'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='perfmon-v2'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='stibp-always-on'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vaes'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='EPYC-Milan'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='EPYC-Milan-v1'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='EPYC-Milan-v2'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='amd-psfd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='no-nested-data-bp'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='null-sel-clr-base'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='stibp-always-on'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vaes'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='EPYC-Milan-v3'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='amd-psfd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='no-nested-data-bp'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='null-sel-clr-base'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='stibp-always-on'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vaes'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='EPYC-Rome'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='EPYC-Rome-v1'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='EPYC-Rome-v2'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='EPYC-Rome-v3'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='EPYC-Turin'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='amd-psfd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='auto-ibrs'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx-vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-bf16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-vp2intersect'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bitalg'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512ifma'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fs-gs-base-ns'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='gfni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ibpb-brtype'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='la57'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='movdir64b'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='movdiri'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='no-nested-data-bp'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='null-sel-clr-base'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='perfmon-v2'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='prefetchi'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='sbpb'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='srso-user-kernel-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='stibp-always-on'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vaes'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='EPYC-Turin-v1'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='amd-psfd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='auto-ibrs'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx-vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-bf16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-vp2intersect'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bitalg'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512ifma'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fs-gs-base-ns'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='gfni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ibpb-brtype'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='la57'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='movdir64b'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='movdiri'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='no-nested-data-bp'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='null-sel-clr-base'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='perfmon-v2'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='prefetchi'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='sbpb'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='srso-user-kernel-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='stibp-always-on'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vaes'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='EPYC-v3'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='EPYC-v4'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='EPYC-v5'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='GraniteRapids'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='amx-bf16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='amx-fp16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='amx-int8'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='amx-tile'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx-vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-bf16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-fp16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bitalg'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512ifma'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='bus-lock-detect'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fbsdp-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrc'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrs'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fzrm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='gfni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='hle'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ibrs-all'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='la57'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='mcdt-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pbrsb-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='prefetchiti'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='psdp-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rtm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='serialize'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='taa-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='tsx-ldtrk'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vaes'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xfd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='GraniteRapids-v1'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='amx-bf16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='amx-fp16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='amx-int8'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='amx-tile'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx-vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-bf16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-fp16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bitalg'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512ifma'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='bus-lock-detect'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fbsdp-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrc'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrs'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fzrm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='gfni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='hle'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ibrs-all'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='la57'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='mcdt-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pbrsb-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='prefetchiti'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='psdp-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rtm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='serialize'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='taa-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='tsx-ldtrk'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vaes'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xfd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='GraniteRapids-v2'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='amx-bf16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='amx-fp16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='amx-int8'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='amx-tile'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx-vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx10'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx10-128'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx10-256'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx10-512'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-bf16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-fp16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bitalg'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512ifma'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='bus-lock-detect'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='cldemote'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fbsdp-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrc'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrs'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fzrm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='gfni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='hle'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ibrs-all'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='la57'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='mcdt-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='movdir64b'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='movdiri'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pbrsb-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='prefetchiti'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='psdp-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rtm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='serialize'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ss'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='taa-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='tsx-ldtrk'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vaes'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xfd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='GraniteRapids-v3'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='amx-bf16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='amx-fp16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='amx-int8'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='amx-tile'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx-vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx10'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx10-128'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx10-256'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx10-512'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-bf16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-fp16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bitalg'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512ifma'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='bus-lock-detect'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='cldemote'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fbsdp-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrc'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrs'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fzrm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='gfni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='hle'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ibrs-all'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='la57'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='mcdt-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='movdir64b'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='movdiri'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pbrsb-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='prefetchiti'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='psdp-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rtm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='serialize'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ss'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='taa-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='tsx-ldtrk'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vaes'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xfd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Haswell'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='hle'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rtm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Haswell-IBRS'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='hle'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rtm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Haswell-noTSX'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Haswell-noTSX-IBRS'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Haswell-v1'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='hle'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rtm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Haswell-v2'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Haswell-v3'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='hle'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rtm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Haswell-v4'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Icelake-Server'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bitalg'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='gfni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='hle'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='la57'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rtm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vaes'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Icelake-Server-noTSX'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bitalg'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='gfni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='la57'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vaes'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Icelake-Server-v1'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bitalg'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='gfni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='hle'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='la57'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rtm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vaes'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Icelake-Server-v2'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bitalg'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='gfni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='la57'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vaes'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Icelake-Server-v3'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bitalg'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='gfni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ibrs-all'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='la57'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='taa-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vaes'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Icelake-Server-v4'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bitalg'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512ifma'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='gfni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ibrs-all'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='la57'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='taa-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vaes'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Icelake-Server-v5'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bitalg'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512ifma'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='gfni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ibrs-all'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='la57'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='taa-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vaes'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Icelake-Server-v6'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bitalg'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512ifma'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='gfni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ibrs-all'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='la57'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='taa-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vaes'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Icelake-Server-v7'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bitalg'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512ifma'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='gfni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='hle'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ibrs-all'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='la57'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rtm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='taa-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vaes'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='IvyBridge'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='IvyBridge-IBRS'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='IvyBridge-v1'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='IvyBridge-v2'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='KnightsMill'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-4fmaps'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-4vnniw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512er'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512pf'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ss'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='KnightsMill-v1'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-4fmaps'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-4vnniw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512er'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512pf'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ss'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Opteron_G4'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fma4'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xop'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Opteron_G4-v1'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fma4'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xop'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Opteron_G5'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fma4'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='tbm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xop'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Opteron_G5-v1'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fma4'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='tbm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xop'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='SapphireRapids'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='amx-bf16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='amx-int8'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='amx-tile'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx-vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-bf16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-fp16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bitalg'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512ifma'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='bus-lock-detect'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrc'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrs'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fzrm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='gfni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='hle'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ibrs-all'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='la57'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rtm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='serialize'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='taa-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='tsx-ldtrk'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vaes'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xfd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='SapphireRapids-v1'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='amx-bf16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='amx-int8'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='amx-tile'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx-vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-bf16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-fp16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bitalg'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512ifma'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='bus-lock-detect'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrc'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrs'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fzrm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='gfni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='hle'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ibrs-all'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='la57'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rtm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='serialize'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='taa-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='tsx-ldtrk'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vaes'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xfd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='SapphireRapids-v2'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='amx-bf16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='amx-int8'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='amx-tile'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx-vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-bf16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-fp16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bitalg'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512ifma'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='bus-lock-detect'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fbsdp-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrc'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrs'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fzrm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='gfni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='hle'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ibrs-all'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='la57'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='psdp-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rtm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='serialize'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='taa-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='tsx-ldtrk'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vaes'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xfd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='SapphireRapids-v3'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='amx-bf16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='amx-int8'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='amx-tile'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx-vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-bf16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-fp16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bitalg'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512ifma'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='bus-lock-detect'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='cldemote'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fbsdp-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrc'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrs'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fzrm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='gfni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='hle'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ibrs-all'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='la57'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='movdir64b'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='movdiri'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='psdp-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rtm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='serialize'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ss'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='taa-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='tsx-ldtrk'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vaes'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xfd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='SapphireRapids-v4'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='amx-bf16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='amx-int8'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='amx-tile'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx-vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-bf16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-fp16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bitalg'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512ifma'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='bus-lock-detect'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='cldemote'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fbsdp-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrc'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrs'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fzrm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='gfni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='hle'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ibrs-all'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='la57'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='movdir64b'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='movdiri'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='psdp-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rtm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='serialize'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ss'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='taa-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='tsx-ldtrk'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vaes'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xfd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='SierraForest'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx-ifma'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx-ne-convert'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx-vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx-vnni-int8'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='bus-lock-detect'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='cmpccxadd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fbsdp-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrs'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='gfni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ibrs-all'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='mcdt-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pbrsb-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='psdp-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='serialize'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vaes'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='SierraForest-v1'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx-ifma'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx-ne-convert'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx-vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx-vnni-int8'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='bus-lock-detect'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='cmpccxadd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fbsdp-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrs'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='gfni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ibrs-all'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='mcdt-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pbrsb-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='psdp-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='serialize'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vaes'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='SierraForest-v2'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx-ifma'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx-ne-convert'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx-vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx-vnni-int8'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='bhi-ctrl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='bus-lock-detect'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='cldemote'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='cmpccxadd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fbsdp-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrs'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='gds-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='gfni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ibrs-all'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='intel-psfd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ipred-ctrl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='lam'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='mcdt-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='movdir64b'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='movdiri'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pbrsb-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='psdp-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rfds-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rrsba-ctrl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='serialize'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ss'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vaes'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='SierraForest-v3'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx-ifma'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx-ne-convert'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx-vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx-vnni-int8'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='bhi-ctrl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='bus-lock-detect'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='cldemote'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='cmpccxadd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fbsdp-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrs'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='gds-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='gfni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ibrs-all'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='intel-psfd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ipred-ctrl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='lam'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='mcdt-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='movdir64b'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='movdiri'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pbrsb-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='psdp-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rfds-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rrsba-ctrl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='serialize'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ss'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vaes'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Skylake-Client'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='hle'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rtm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Skylake-Client-IBRS'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='hle'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rtm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Skylake-Client-v1'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='hle'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rtm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Skylake-Client-v2'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='hle'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rtm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Skylake-Client-v3'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Skylake-Client-v4'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Skylake-Server'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='hle'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rtm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Skylake-Server-IBRS'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='hle'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rtm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Skylake-Server-v1'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='hle'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rtm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Skylake-Server-v2'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='hle'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rtm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Skylake-Server-v3'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Skylake-Server-v4'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Skylake-Server-v5'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Snowridge'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='cldemote'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='core-capability'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='gfni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='movdir64b'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='movdiri'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='mpx'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='split-lock-detect'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Snowridge-v1'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='cldemote'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='core-capability'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='gfni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='movdir64b'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='movdiri'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='mpx'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='split-lock-detect'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Snowridge-v2'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='cldemote'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='core-capability'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='gfni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='movdir64b'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='movdiri'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='split-lock-detect'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Snowridge-v3'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='cldemote'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='core-capability'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='gfni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='movdir64b'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='movdiri'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='split-lock-detect'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Snowridge-v4'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='cldemote'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='gfni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='movdir64b'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='movdiri'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='athlon'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='3dnow'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='3dnowext'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='athlon-v1'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='3dnow'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='3dnowext'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='core2duo'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ss'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='core2duo-v1'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ss'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='coreduo'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ss'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='coreduo-v1'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ss'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='n270'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ss'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='n270-v1'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ss'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='phenom'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='3dnow'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='3dnowext'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='phenom-v1'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='3dnow'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='3dnowext'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     </mode>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:   </cpu>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:   <memoryBacking supported='yes'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     <enum name='sourceType'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <value>file</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <value>anonymous</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <value>memfd</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     </enum>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:   </memoryBacking>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:   <devices>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     <disk supported='yes'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <enum name='diskDevice'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>disk</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>cdrom</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>floppy</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>lun</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </enum>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <enum name='bus'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>fdc</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>scsi</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>virtio</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>usb</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>sata</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </enum>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <enum name='model'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>virtio</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>virtio-transitional</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>virtio-non-transitional</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </enum>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     </disk>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     <graphics supported='yes'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <enum name='type'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>vnc</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>egl-headless</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>dbus</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </enum>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     </graphics>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     <video supported='yes'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <enum name='modelType'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>vga</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>cirrus</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>virtio</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>none</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>bochs</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>ramfb</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </enum>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     </video>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     <hostdev supported='yes'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <enum name='mode'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>subsystem</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </enum>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <enum name='startupPolicy'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>default</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>mandatory</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>requisite</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>optional</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </enum>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <enum name='subsysType'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>usb</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>pci</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>scsi</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </enum>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <enum name='capsType'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <enum name='pciBackend'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     </hostdev>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     <rng supported='yes'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <enum name='model'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>virtio</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>virtio-transitional</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>virtio-non-transitional</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </enum>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <enum name='backendModel'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>random</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>egd</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>builtin</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </enum>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     </rng>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     <filesystem supported='yes'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <enum name='driverType'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>path</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>handle</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>virtiofs</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </enum>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     </filesystem>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     <tpm supported='yes'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <enum name='model'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>tpm-tis</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>tpm-crb</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </enum>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <enum name='backendModel'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>emulator</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>external</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </enum>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <enum name='backendVersion'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>2.0</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </enum>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     </tpm>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     <redirdev supported='yes'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <enum name='bus'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>usb</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </enum>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     </redirdev>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     <channel supported='yes'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <enum name='type'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>pty</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>unix</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </enum>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     </channel>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     <crypto supported='yes'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <enum name='model'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <enum name='type'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>qemu</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </enum>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <enum name='backendModel'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>builtin</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </enum>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     </crypto>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     <interface supported='yes'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <enum name='backendType'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>default</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>passt</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </enum>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     </interface>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     <panic supported='yes'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <enum name='model'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>isa</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>hyperv</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </enum>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     </panic>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     <console supported='yes'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <enum name='type'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>null</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>vc</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>pty</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>dev</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>file</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>pipe</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>stdio</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>udp</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>tcp</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>unix</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>qemu-vdagent</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>dbus</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </enum>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     </console>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:   </devices>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:   <features>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     <gic supported='no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     <vmcoreinfo supported='yes'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     <genid supported='yes'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     <backingStoreInput supported='yes'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     <backup supported='yes'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     <async-teardown supported='yes'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     <s390-pv supported='no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     <ps2 supported='yes'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     <tdx supported='no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     <sev supported='no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     <sgx supported='no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     <hyperv supported='yes'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <enum name='features'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>relaxed</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>vapic</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>spinlocks</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>vpindex</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>runtime</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>synic</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>stimer</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>reset</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>vendor_id</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>frequencies</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>reenlightenment</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>tlbflush</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>ipi</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>avic</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>emsr_bitmap</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>xmm_input</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </enum>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <defaults>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <spinlocks>4095</spinlocks>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <stimer_direct>on</stimer_direct>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <tlbflush_direct>off</tlbflush_direct>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <tlbflush_extended>off</tlbflush_extended>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <vendor_id>Linux KVM Hv</vendor_id>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </defaults>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     </hyperv>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     <launchSecurity supported='no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:   </features>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: </domainCapabilities>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.318 231725 DEBUG nova.virt.libvirt.host [None req-a9e5d3d9-9357-4ba9-96c4-7ac54478153f - - - - - -] Getting domain capabilities for x86_64 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.325 231725 DEBUG nova.virt.libvirt.host [None req-a9e5d3d9-9357-4ba9-96c4-7ac54478153f - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: <domainCapabilities>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:   <path>/usr/libexec/qemu-kvm</path>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:   <domain>kvm</domain>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:   <machine>pc-i440fx-rhel7.6.0</machine>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:   <arch>x86_64</arch>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:   <vcpu max='240'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:   <iothreads supported='yes'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:   <os supported='yes'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     <enum name='firmware'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     <loader supported='yes'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <enum name='type'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>rom</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>pflash</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </enum>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <enum name='readonly'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>yes</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>no</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </enum>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <enum name='secure'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>no</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </enum>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     </loader>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:   </os>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:   <cpu>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     <mode name='host-passthrough' supported='yes'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <enum name='hostPassthroughMigratable'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>on</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>off</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </enum>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     </mode>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     <mode name='maximum' supported='yes'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <enum name='maximumMigratable'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>on</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>off</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </enum>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     </mode>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     <mode name='host-model' supported='yes'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model fallback='forbid'>EPYC-Rome</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <vendor>AMD</vendor>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <maxphysaddr mode='passthrough' limit='40'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <feature policy='require' name='x2apic'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <feature policy='require' name='tsc-deadline'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <feature policy='require' name='hypervisor'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <feature policy='require' name='tsc_adjust'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <feature policy='require' name='spec-ctrl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <feature policy='require' name='stibp'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <feature policy='require' name='ssbd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <feature policy='require' name='cmp_legacy'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <feature policy='require' name='overflow-recov'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <feature policy='require' name='succor'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <feature policy='require' name='ibrs'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <feature policy='require' name='amd-ssbd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <feature policy='require' name='virt-ssbd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <feature policy='require' name='lbrv'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <feature policy='require' name='tsc-scale'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <feature policy='require' name='vmcb-clean'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <feature policy='require' name='pause-filter'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <feature policy='require' name='pfthreshold'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <feature policy='require' name='svme-addr-chk'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <feature policy='require' name='lfence-always-serializing'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <feature policy='disable' name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     </mode>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     <mode name='custom' supported='yes'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Broadwell'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='hle'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rtm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Broadwell-IBRS'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='hle'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rtm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Broadwell-noTSX'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Broadwell-noTSX-IBRS'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Broadwell-v1'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='hle'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rtm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Broadwell-v2'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Broadwell-v3'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='hle'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rtm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Broadwell-v4'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Cascadelake-Server'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='hle'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rtm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Cascadelake-Server-noTSX'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ibrs-all'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Cascadelake-Server-v1'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='hle'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rtm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Cascadelake-Server-v2'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='hle'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ibrs-all'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rtm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Cascadelake-Server-v3'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ibrs-all'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Cascadelake-Server-v4'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ibrs-all'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Cascadelake-Server-v5'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ibrs-all'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='ClearwaterForest'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx-ifma'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx-ne-convert'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx-vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx-vnni-int16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx-vnni-int8'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='bhi-ctrl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='bhi-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='bus-lock-detect'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='cldemote'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='cmpccxadd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ddpd-u'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fbsdp-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrs'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='gds-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='gfni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ibrs-all'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='intel-psfd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ipred-ctrl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='lam'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='mcdt-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='movdir64b'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='movdiri'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pbrsb-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='prefetchiti'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='psdp-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rfds-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rrsba-ctrl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='serialize'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='sha512'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='sm3'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='sm4'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ss'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vaes'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='ClearwaterForest-v1'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx-ifma'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx-ne-convert'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx-vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx-vnni-int16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx-vnni-int8'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='bhi-ctrl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='bhi-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='bus-lock-detect'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='cldemote'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='cmpccxadd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ddpd-u'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fbsdp-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrs'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='gds-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='gfni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ibrs-all'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='intel-psfd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ipred-ctrl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='lam'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='mcdt-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='movdir64b'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='movdiri'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pbrsb-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='prefetchiti'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='psdp-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rfds-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rrsba-ctrl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='serialize'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='sha512'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='sm3'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='sm4'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ss'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vaes'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Cooperlake'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-bf16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='hle'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ibrs-all'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rtm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='taa-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Cooperlake-v1'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-bf16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='hle'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ibrs-all'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rtm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='taa-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Cooperlake-v2'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-bf16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='hle'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ibrs-all'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rtm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='taa-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Denverton'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='mpx'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Denverton-v1'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='mpx'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Denverton-v2'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Denverton-v3'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Dhyana-v2'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='EPYC-Genoa'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='amd-psfd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='auto-ibrs'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-bf16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bitalg'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512ifma'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='gfni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='la57'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='no-nested-data-bp'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='null-sel-clr-base'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='stibp-always-on'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vaes'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='EPYC-Genoa-v1'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='amd-psfd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='auto-ibrs'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-bf16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bitalg'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512ifma'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='gfni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='la57'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='no-nested-data-bp'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='null-sel-clr-base'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='stibp-always-on'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vaes'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='EPYC-Genoa-v2'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='amd-psfd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='auto-ibrs'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-bf16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bitalg'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512ifma'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fs-gs-base-ns'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='gfni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='la57'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='no-nested-data-bp'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='null-sel-clr-base'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='perfmon-v2'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='stibp-always-on'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vaes'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='EPYC-Milan'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='EPYC-Milan-v1'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='EPYC-Milan-v2'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='amd-psfd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='no-nested-data-bp'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='null-sel-clr-base'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='stibp-always-on'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vaes'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='EPYC-Milan-v3'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='amd-psfd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='no-nested-data-bp'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='null-sel-clr-base'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='stibp-always-on'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vaes'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='EPYC-Rome'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='EPYC-Rome-v1'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='EPYC-Rome-v2'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='EPYC-Rome-v3'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='EPYC-Turin'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='amd-psfd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='auto-ibrs'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx-vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-bf16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-vp2intersect'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bitalg'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512ifma'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fs-gs-base-ns'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='gfni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ibpb-brtype'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='la57'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='movdir64b'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='movdiri'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='no-nested-data-bp'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='null-sel-clr-base'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='perfmon-v2'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='prefetchi'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='sbpb'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='srso-user-kernel-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='stibp-always-on'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vaes'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='EPYC-Turin-v1'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='amd-psfd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='auto-ibrs'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx-vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-bf16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-vp2intersect'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bitalg'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512ifma'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fs-gs-base-ns'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='gfni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ibpb-brtype'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='la57'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='movdir64b'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='movdiri'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='no-nested-data-bp'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='null-sel-clr-base'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='perfmon-v2'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='prefetchi'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='sbpb'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='srso-user-kernel-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='stibp-always-on'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vaes'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='EPYC-v3'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='EPYC-v4'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='EPYC-v5'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='GraniteRapids'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='amx-bf16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='amx-fp16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='amx-int8'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='amx-tile'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx-vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-bf16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-fp16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bitalg'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512ifma'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='bus-lock-detect'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fbsdp-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrc'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrs'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fzrm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='gfni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='hle'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ibrs-all'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='la57'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='mcdt-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pbrsb-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='prefetchiti'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='psdp-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rtm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='serialize'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='taa-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='tsx-ldtrk'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vaes'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xfd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='GraniteRapids-v1'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='amx-bf16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='amx-fp16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='amx-int8'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='amx-tile'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx-vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-bf16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-fp16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bitalg'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512ifma'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='bus-lock-detect'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fbsdp-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrc'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrs'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fzrm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='gfni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='hle'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ibrs-all'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='la57'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='mcdt-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pbrsb-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='prefetchiti'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='psdp-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rtm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='serialize'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='taa-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='tsx-ldtrk'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vaes'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xfd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='GraniteRapids-v2'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='amx-bf16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='amx-fp16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='amx-int8'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='amx-tile'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx-vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx10'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx10-128'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx10-256'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx10-512'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-bf16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-fp16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bitalg'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512ifma'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='bus-lock-detect'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='cldemote'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fbsdp-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrc'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrs'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fzrm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='gfni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='hle'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ibrs-all'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='la57'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='mcdt-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='movdir64b'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='movdiri'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pbrsb-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='prefetchiti'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='psdp-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rtm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='serialize'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ss'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='taa-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='tsx-ldtrk'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vaes'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xfd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='GraniteRapids-v3'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='amx-bf16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='amx-fp16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='amx-int8'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='amx-tile'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx-vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx10'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx10-128'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx10-256'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx10-512'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-bf16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-fp16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bitalg'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512ifma'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='bus-lock-detect'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='cldemote'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fbsdp-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrc'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrs'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fzrm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='gfni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='hle'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ibrs-all'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='la57'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='mcdt-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='movdir64b'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='movdiri'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pbrsb-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='prefetchiti'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='psdp-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rtm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='serialize'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ss'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='taa-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='tsx-ldtrk'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vaes'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xfd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Haswell'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='hle'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rtm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Haswell-IBRS'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='hle'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rtm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Haswell-noTSX'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Haswell-noTSX-IBRS'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Haswell-v1'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='hle'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rtm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Haswell-v2'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Haswell-v3'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='hle'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rtm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Haswell-v4'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Icelake-Server'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bitalg'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='gfni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='hle'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='la57'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rtm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vaes'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Icelake-Server-noTSX'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bitalg'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='gfni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='la57'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vaes'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Icelake-Server-v1'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bitalg'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='gfni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='hle'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='la57'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rtm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vaes'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Icelake-Server-v2'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bitalg'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='gfni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='la57'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vaes'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Icelake-Server-v3'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bitalg'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='gfni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ibrs-all'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='la57'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='taa-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vaes'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Icelake-Server-v4'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bitalg'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512ifma'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='gfni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ibrs-all'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='la57'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='taa-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vaes'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Icelake-Server-v5'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bitalg'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512ifma'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='gfni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ibrs-all'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='la57'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='taa-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vaes'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Icelake-Server-v6'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bitalg'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512ifma'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='gfni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ibrs-all'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='la57'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='taa-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vaes'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Icelake-Server-v7'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bitalg'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512ifma'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='gfni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='hle'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ibrs-all'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='la57'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rtm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='taa-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vaes'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='IvyBridge'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='IvyBridge-IBRS'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='IvyBridge-v1'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='IvyBridge-v2'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='KnightsMill'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-4fmaps'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-4vnniw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512er'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512pf'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ss'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='KnightsMill-v1'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-4fmaps'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-4vnniw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512er'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512pf'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ss'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Opteron_G4'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fma4'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xop'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Opteron_G4-v1'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fma4'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xop'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Opteron_G5'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fma4'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='tbm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xop'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Opteron_G5-v1'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fma4'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='tbm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xop'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='SapphireRapids'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='amx-bf16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='amx-int8'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='amx-tile'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx-vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-bf16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-fp16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bitalg'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512ifma'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='bus-lock-detect'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrc'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrs'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fzrm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='gfni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='hle'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ibrs-all'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='la57'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rtm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='serialize'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='taa-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='tsx-ldtrk'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vaes'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xfd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='SapphireRapids-v1'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='amx-bf16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='amx-int8'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='amx-tile'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx-vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-bf16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-fp16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bitalg'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512ifma'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='bus-lock-detect'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrc'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrs'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fzrm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='gfni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='hle'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ibrs-all'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='la57'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rtm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='serialize'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='taa-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='tsx-ldtrk'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vaes'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xfd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='SapphireRapids-v2'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='amx-bf16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='amx-int8'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='amx-tile'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx-vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-bf16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-fp16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bitalg'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512ifma'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='bus-lock-detect'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fbsdp-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrc'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrs'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fzrm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='gfni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='hle'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ibrs-all'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='la57'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='psdp-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rtm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='serialize'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='taa-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='tsx-ldtrk'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vaes'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xfd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='SapphireRapids-v3'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='amx-bf16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='amx-int8'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='amx-tile'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx-vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-bf16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-fp16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bitalg'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512ifma'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='bus-lock-detect'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='cldemote'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fbsdp-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrc'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrs'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fzrm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='gfni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='hle'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ibrs-all'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='la57'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='movdir64b'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='movdiri'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='psdp-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rtm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='serialize'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ss'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='taa-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='tsx-ldtrk'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vaes'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xfd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='SapphireRapids-v4'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='amx-bf16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='amx-int8'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='amx-tile'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx-vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-bf16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-fp16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bitalg'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512ifma'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='bus-lock-detect'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='cldemote'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fbsdp-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrc'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrs'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fzrm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='gfni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='hle'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ibrs-all'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='la57'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='movdir64b'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='movdiri'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='psdp-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rtm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='serialize'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ss'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='taa-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='tsx-ldtrk'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vaes'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xfd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='SierraForest'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx-ifma'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx-ne-convert'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx-vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx-vnni-int8'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='bus-lock-detect'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='cmpccxadd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fbsdp-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrs'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='gfni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ibrs-all'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='mcdt-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pbrsb-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='psdp-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='serialize'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vaes'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='SierraForest-v1'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx-ifma'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx-ne-convert'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx-vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx-vnni-int8'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='bus-lock-detect'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='cmpccxadd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fbsdp-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrs'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='gfni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ibrs-all'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='mcdt-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pbrsb-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='psdp-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='serialize'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vaes'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='SierraForest-v2'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx-ifma'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx-ne-convert'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx-vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx-vnni-int8'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='bhi-ctrl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='bus-lock-detect'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='cldemote'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='cmpccxadd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fbsdp-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrs'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='gds-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='gfni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ibrs-all'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='intel-psfd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ipred-ctrl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='lam'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='mcdt-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='movdir64b'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='movdiri'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pbrsb-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='psdp-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rfds-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rrsba-ctrl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='serialize'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ss'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vaes'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='SierraForest-v3'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx-ifma'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx-ne-convert'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx-vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx-vnni-int8'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='bhi-ctrl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='bus-lock-detect'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='cldemote'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='cmpccxadd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fbsdp-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrs'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='gds-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='gfni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ibrs-all'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='intel-psfd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ipred-ctrl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='lam'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='mcdt-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='movdir64b'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='movdiri'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pbrsb-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='psdp-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rfds-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rrsba-ctrl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='serialize'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ss'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vaes'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Skylake-Client'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='hle'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rtm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Skylake-Client-IBRS'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='hle'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rtm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Skylake-Client-v1'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='hle'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rtm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Skylake-Client-v2'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='hle'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rtm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Skylake-Client-v3'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Skylake-Client-v4'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Skylake-Server'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='hle'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rtm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Skylake-Server-IBRS'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='hle'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rtm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Skylake-Server-v1'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='hle'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rtm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Skylake-Server-v2'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='hle'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rtm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Skylake-Server-v3'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Skylake-Server-v4'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Skylake-Server-v5'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Snowridge'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='cldemote'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='core-capability'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='gfni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='movdir64b'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='movdiri'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='mpx'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='split-lock-detect'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Snowridge-v1'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='cldemote'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='core-capability'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='gfni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='movdir64b'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='movdiri'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='mpx'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='split-lock-detect'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Snowridge-v2'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='cldemote'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='core-capability'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='gfni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='movdir64b'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='movdiri'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='split-lock-detect'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Snowridge-v3'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='cldemote'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='core-capability'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='gfni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='movdir64b'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='movdiri'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='split-lock-detect'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Snowridge-v4'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='cldemote'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='gfni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='movdir64b'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='movdiri'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='athlon'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='3dnow'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='3dnowext'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='athlon-v1'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='3dnow'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='3dnowext'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='core2duo'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ss'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='core2duo-v1'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ss'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='coreduo'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ss'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='coreduo-v1'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ss'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='n270'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ss'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='n270-v1'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ss'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='phenom'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='3dnow'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='3dnowext'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='phenom-v1'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='3dnow'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='3dnowext'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     </mode>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:   </cpu>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:   <memoryBacking supported='yes'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     <enum name='sourceType'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <value>file</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <value>anonymous</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <value>memfd</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     </enum>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:   </memoryBacking>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:   <devices>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     <disk supported='yes'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <enum name='diskDevice'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>disk</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>cdrom</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>floppy</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>lun</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </enum>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <enum name='bus'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>ide</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>fdc</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>scsi</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>virtio</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>usb</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>sata</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </enum>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <enum name='model'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>virtio</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>virtio-transitional</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>virtio-non-transitional</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </enum>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     </disk>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     <graphics supported='yes'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <enum name='type'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>vnc</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>egl-headless</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>dbus</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </enum>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     </graphics>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     <video supported='yes'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <enum name='modelType'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>vga</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>cirrus</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>virtio</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>none</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>bochs</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>ramfb</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </enum>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     </video>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     <hostdev supported='yes'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <enum name='mode'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>subsystem</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </enum>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <enum name='startupPolicy'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>default</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>mandatory</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>requisite</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>optional</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </enum>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <enum name='subsysType'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>usb</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>pci</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>scsi</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </enum>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <enum name='capsType'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <enum name='pciBackend'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     </hostdev>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     <rng supported='yes'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <enum name='model'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>virtio</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>virtio-transitional</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>virtio-non-transitional</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </enum>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <enum name='backendModel'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>random</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>egd</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>builtin</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </enum>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     </rng>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     <filesystem supported='yes'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <enum name='driverType'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>path</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>handle</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>virtiofs</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </enum>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     </filesystem>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     <tpm supported='yes'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <enum name='model'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>tpm-tis</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>tpm-crb</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </enum>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <enum name='backendModel'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>emulator</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>external</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </enum>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <enum name='backendVersion'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>2.0</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </enum>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     </tpm>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     <redirdev supported='yes'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <enum name='bus'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>usb</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </enum>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     </redirdev>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     <channel supported='yes'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <enum name='type'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>pty</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>unix</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </enum>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     </channel>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     <crypto supported='yes'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <enum name='model'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <enum name='type'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>qemu</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </enum>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <enum name='backendModel'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>builtin</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </enum>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     </crypto>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     <interface supported='yes'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <enum name='backendType'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>default</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>passt</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </enum>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     </interface>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     <panic supported='yes'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <enum name='model'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>isa</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>hyperv</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </enum>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     </panic>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     <console supported='yes'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <enum name='type'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>null</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>vc</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>pty</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>dev</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>file</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>pipe</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>stdio</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>udp</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>tcp</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>unix</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>qemu-vdagent</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>dbus</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </enum>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     </console>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:   </devices>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:   <features>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     <gic supported='no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     <vmcoreinfo supported='yes'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     <genid supported='yes'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     <backingStoreInput supported='yes'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     <backup supported='yes'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     <async-teardown supported='yes'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     <s390-pv supported='no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     <ps2 supported='yes'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     <tdx supported='no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     <sev supported='no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     <sgx supported='no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     <hyperv supported='yes'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <enum name='features'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>relaxed</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>vapic</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>spinlocks</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>vpindex</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>runtime</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>synic</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>stimer</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>reset</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>vendor_id</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>frequencies</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>reenlightenment</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>tlbflush</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>ipi</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>avic</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>emsr_bitmap</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>xmm_input</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </enum>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <defaults>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <spinlocks>4095</spinlocks>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <stimer_direct>on</stimer_direct>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <tlbflush_direct>off</tlbflush_direct>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <tlbflush_extended>off</tlbflush_extended>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <vendor_id>Linux KVM Hv</vendor_id>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </defaults>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     </hyperv>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     <launchSecurity supported='no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:   </features>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: </domainCapabilities>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.399 231725 DEBUG nova.virt.libvirt.host [None req-a9e5d3d9-9357-4ba9-96c4-7ac54478153f - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: <domainCapabilities>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:   <path>/usr/libexec/qemu-kvm</path>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:   <domain>kvm</domain>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:   <machine>pc-q35-rhel9.8.0</machine>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:   <arch>x86_64</arch>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:   <vcpu max='1024'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:   <iothreads supported='yes'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:   <os supported='yes'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     <enum name='firmware'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <value>efi</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     </enum>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     <loader supported='yes'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <enum name='type'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>rom</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>pflash</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </enum>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <enum name='readonly'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>yes</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>no</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </enum>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <enum name='secure'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>yes</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>no</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </enum>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     </loader>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:   </os>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:   <cpu>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     <mode name='host-passthrough' supported='yes'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <enum name='hostPassthroughMigratable'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>on</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>off</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </enum>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     </mode>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     <mode name='maximum' supported='yes'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <enum name='maximumMigratable'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>on</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>off</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </enum>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     </mode>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     <mode name='host-model' supported='yes'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model fallback='forbid'>EPYC-Rome</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <vendor>AMD</vendor>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <maxphysaddr mode='passthrough' limit='40'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <feature policy='require' name='x2apic'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <feature policy='require' name='tsc-deadline'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <feature policy='require' name='hypervisor'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <feature policy='require' name='tsc_adjust'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <feature policy='require' name='spec-ctrl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <feature policy='require' name='stibp'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <feature policy='require' name='ssbd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <feature policy='require' name='cmp_legacy'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <feature policy='require' name='overflow-recov'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <feature policy='require' name='succor'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <feature policy='require' name='ibrs'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <feature policy='require' name='amd-ssbd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <feature policy='require' name='virt-ssbd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <feature policy='require' name='lbrv'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <feature policy='require' name='tsc-scale'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <feature policy='require' name='vmcb-clean'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <feature policy='require' name='pause-filter'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <feature policy='require' name='pfthreshold'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <feature policy='require' name='svme-addr-chk'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <feature policy='require' name='lfence-always-serializing'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <feature policy='disable' name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     </mode>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     <mode name='custom' supported='yes'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Broadwell'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='hle'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rtm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Broadwell-IBRS'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='hle'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rtm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Broadwell-noTSX'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Broadwell-noTSX-IBRS'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Broadwell-v1'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='hle'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rtm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Broadwell-v2'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Broadwell-v3'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='hle'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rtm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Broadwell-v4'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Cascadelake-Server'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='hle'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rtm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Cascadelake-Server-noTSX'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ibrs-all'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Cascadelake-Server-v1'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='hle'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rtm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Cascadelake-Server-v2'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='hle'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ibrs-all'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rtm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Cascadelake-Server-v3'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ibrs-all'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Cascadelake-Server-v4'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ibrs-all'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Cascadelake-Server-v5'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ibrs-all'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='ClearwaterForest'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx-ifma'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx-ne-convert'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx-vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx-vnni-int16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx-vnni-int8'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='bhi-ctrl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='bhi-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='bus-lock-detect'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='cldemote'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='cmpccxadd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ddpd-u'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fbsdp-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrs'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='gds-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='gfni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ibrs-all'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='intel-psfd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ipred-ctrl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='lam'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='mcdt-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='movdir64b'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='movdiri'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pbrsb-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='prefetchiti'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='psdp-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rfds-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rrsba-ctrl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='serialize'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='sha512'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='sm3'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='sm4'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ss'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vaes'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='ClearwaterForest-v1'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx-ifma'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx-ne-convert'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx-vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx-vnni-int16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx-vnni-int8'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='bhi-ctrl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='bhi-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='bus-lock-detect'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='cldemote'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='cmpccxadd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ddpd-u'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fbsdp-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrs'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='gds-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='gfni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ibrs-all'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='intel-psfd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ipred-ctrl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='lam'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='mcdt-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='movdir64b'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='movdiri'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pbrsb-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='prefetchiti'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='psdp-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rfds-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rrsba-ctrl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='serialize'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='sha512'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='sm3'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='sm4'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ss'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vaes'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Cooperlake'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-bf16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='hle'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ibrs-all'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rtm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='taa-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Cooperlake-v1'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-bf16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='hle'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ibrs-all'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rtm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='taa-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Cooperlake-v2'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-bf16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='hle'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ibrs-all'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rtm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='taa-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Denverton'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='mpx'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Denverton-v1'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='mpx'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Denverton-v2'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Denverton-v3'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Dhyana-v2'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='EPYC-Genoa'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='amd-psfd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='auto-ibrs'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-bf16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bitalg'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512ifma'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='gfni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='la57'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='no-nested-data-bp'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='null-sel-clr-base'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='stibp-always-on'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vaes'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='EPYC-Genoa-v1'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='amd-psfd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='auto-ibrs'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-bf16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bitalg'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512ifma'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='gfni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='la57'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='no-nested-data-bp'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='null-sel-clr-base'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='stibp-always-on'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vaes'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='EPYC-Genoa-v2'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='amd-psfd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='auto-ibrs'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-bf16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bitalg'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512ifma'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fs-gs-base-ns'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='gfni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='la57'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='no-nested-data-bp'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='null-sel-clr-base'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='perfmon-v2'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='stibp-always-on'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vaes'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='EPYC-Milan'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='EPYC-Milan-v1'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='EPYC-Milan-v2'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='amd-psfd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='no-nested-data-bp'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='null-sel-clr-base'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='stibp-always-on'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vaes'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='EPYC-Milan-v3'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='amd-psfd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='no-nested-data-bp'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='null-sel-clr-base'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='stibp-always-on'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vaes'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='EPYC-Rome'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='EPYC-Rome-v1'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='EPYC-Rome-v2'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='EPYC-Rome-v3'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='EPYC-Turin'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='amd-psfd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='auto-ibrs'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx-vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-bf16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-vp2intersect'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bitalg'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512ifma'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fs-gs-base-ns'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='gfni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ibpb-brtype'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='la57'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='movdir64b'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='movdiri'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='no-nested-data-bp'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='null-sel-clr-base'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='perfmon-v2'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='prefetchi'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='sbpb'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='srso-user-kernel-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='stibp-always-on'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vaes'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='EPYC-Turin-v1'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='amd-psfd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='auto-ibrs'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx-vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-bf16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-vp2intersect'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bitalg'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512ifma'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fs-gs-base-ns'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='gfni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ibpb-brtype'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='la57'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='movdir64b'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='movdiri'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='no-nested-data-bp'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='null-sel-clr-base'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='perfmon-v2'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='prefetchi'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='sbpb'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='srso-user-kernel-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='stibp-always-on'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vaes'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='EPYC-v3'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='EPYC-v4'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='EPYC-v5'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='GraniteRapids'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='amx-bf16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='amx-fp16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='amx-int8'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='amx-tile'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx-vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-bf16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-fp16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bitalg'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512ifma'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='bus-lock-detect'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fbsdp-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrc'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrs'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fzrm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='gfni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='hle'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ibrs-all'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='la57'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='mcdt-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pbrsb-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='prefetchiti'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='psdp-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rtm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='serialize'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='taa-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='tsx-ldtrk'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vaes'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xfd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='GraniteRapids-v1'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='amx-bf16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='amx-fp16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='amx-int8'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='amx-tile'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx-vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-bf16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-fp16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bitalg'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512ifma'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='bus-lock-detect'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fbsdp-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrc'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrs'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fzrm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='gfni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='hle'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ibrs-all'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='la57'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='mcdt-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pbrsb-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='prefetchiti'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='psdp-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rtm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='serialize'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='taa-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='tsx-ldtrk'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vaes'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xfd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='GraniteRapids-v2'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='amx-bf16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='amx-fp16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='amx-int8'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='amx-tile'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx-vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx10'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx10-128'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx10-256'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx10-512'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-bf16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-fp16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bitalg'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512ifma'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='bus-lock-detect'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='cldemote'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fbsdp-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrc'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrs'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fzrm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='gfni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='hle'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ibrs-all'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='la57'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='mcdt-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='movdir64b'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='movdiri'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pbrsb-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='prefetchiti'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='psdp-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rtm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='serialize'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ss'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='taa-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='tsx-ldtrk'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vaes'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xfd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='GraniteRapids-v3'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='amx-bf16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='amx-fp16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='amx-int8'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='amx-tile'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx-vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx10'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx10-128'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx10-256'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx10-512'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-bf16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-fp16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bitalg'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512ifma'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='bus-lock-detect'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='cldemote'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fbsdp-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrc'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrs'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fzrm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='gfni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='hle'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ibrs-all'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='la57'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='mcdt-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='movdir64b'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='movdiri'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pbrsb-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='prefetchiti'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='psdp-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rtm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='serialize'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ss'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='taa-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='tsx-ldtrk'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vaes'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xfd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Haswell'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='hle'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rtm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Haswell-IBRS'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='hle'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rtm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Haswell-noTSX'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Haswell-noTSX-IBRS'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Haswell-v1'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='hle'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rtm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Haswell-v2'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Haswell-v3'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='hle'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rtm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Haswell-v4'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Icelake-Server'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bitalg'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='gfni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='hle'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='la57'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rtm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vaes'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Icelake-Server-noTSX'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bitalg'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='gfni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='la57'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vaes'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Icelake-Server-v1'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bitalg'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='gfni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='hle'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='la57'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rtm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vaes'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Icelake-Server-v2'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bitalg'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='gfni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='la57'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vaes'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Icelake-Server-v3'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bitalg'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='gfni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ibrs-all'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='la57'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='taa-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vaes'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Icelake-Server-v4'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bitalg'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512ifma'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='gfni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ibrs-all'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='la57'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='taa-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vaes'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Icelake-Server-v5'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bitalg'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512ifma'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='gfni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ibrs-all'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='la57'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='taa-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vaes'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Icelake-Server-v6'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bitalg'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512ifma'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='gfni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ibrs-all'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='la57'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='taa-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vaes'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Icelake-Server-v7'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bitalg'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512ifma'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='gfni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='hle'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ibrs-all'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='la57'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rtm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='taa-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vaes'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='IvyBridge'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='IvyBridge-IBRS'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='IvyBridge-v1'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='IvyBridge-v2'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='KnightsMill'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-4fmaps'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-4vnniw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512er'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512pf'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ss'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='KnightsMill-v1'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-4fmaps'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-4vnniw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512er'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512pf'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ss'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Opteron_G4'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fma4'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xop'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Opteron_G4-v1'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fma4'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xop'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Opteron_G5'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fma4'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='tbm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xop'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Opteron_G5-v1'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fma4'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='tbm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xop'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='SapphireRapids'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='amx-bf16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='amx-int8'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='amx-tile'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx-vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-bf16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-fp16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bitalg'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512ifma'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='bus-lock-detect'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrc'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrs'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fzrm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='gfni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='hle'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ibrs-all'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='la57'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rtm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='serialize'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='taa-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='tsx-ldtrk'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vaes'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xfd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='SapphireRapids-v1'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='amx-bf16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='amx-int8'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='amx-tile'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx-vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-bf16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-fp16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bitalg'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512ifma'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='bus-lock-detect'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrc'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrs'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fzrm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='gfni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='hle'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ibrs-all'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='la57'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rtm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='serialize'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='taa-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='tsx-ldtrk'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vaes'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xfd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='SapphireRapids-v2'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='amx-bf16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='amx-int8'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='amx-tile'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx-vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-bf16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-fp16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bitalg'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512ifma'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='bus-lock-detect'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fbsdp-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrc'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrs'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fzrm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='gfni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='hle'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ibrs-all'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='la57'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='psdp-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rtm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='serialize'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='taa-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='tsx-ldtrk'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vaes'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xfd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='SapphireRapids-v3'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='amx-bf16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='amx-int8'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='amx-tile'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx-vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-bf16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-fp16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bitalg'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512ifma'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='bus-lock-detect'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='cldemote'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fbsdp-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrc'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrs'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fzrm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='gfni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='hle'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ibrs-all'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='la57'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='movdir64b'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='movdiri'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='psdp-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rtm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='serialize'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ss'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='taa-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='tsx-ldtrk'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vaes'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xfd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='SapphireRapids-v4'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='amx-bf16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='amx-int8'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='amx-tile'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx-vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-bf16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-fp16'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bitalg'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512ifma'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vbmi2'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='bus-lock-detect'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='cldemote'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fbsdp-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrc'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrs'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fzrm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='gfni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='hle'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ibrs-all'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='la57'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='movdir64b'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='movdiri'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='psdp-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rtm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='serialize'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ss'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='taa-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='tsx-ldtrk'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vaes'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xfd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='SierraForest'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx-ifma'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx-ne-convert'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx-vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx-vnni-int8'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='bus-lock-detect'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='cmpccxadd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fbsdp-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrs'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='gfni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ibrs-all'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='mcdt-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pbrsb-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='psdp-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='serialize'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vaes'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='SierraForest-v1'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx-ifma'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx-ne-convert'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx-vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx-vnni-int8'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='bus-lock-detect'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='cmpccxadd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fbsdp-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrs'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='gfni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ibrs-all'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='mcdt-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pbrsb-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='psdp-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='serialize'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vaes'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='SierraForest-v2'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx-ifma'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx-ne-convert'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx-vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx-vnni-int8'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='bhi-ctrl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='bus-lock-detect'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='cldemote'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='cmpccxadd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fbsdp-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrs'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='gds-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='gfni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ibrs-all'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='intel-psfd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ipred-ctrl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='lam'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='mcdt-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='movdir64b'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='movdiri'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pbrsb-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='psdp-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rfds-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rrsba-ctrl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='serialize'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ss'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vaes'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='SierraForest-v3'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx-ifma'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx-ne-convert'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx-vnni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx-vnni-int8'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='bhi-ctrl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='bus-lock-detect'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='cldemote'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='cmpccxadd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fbsdp-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='fsrs'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='gds-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='gfni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ibrs-all'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='intel-psfd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ipred-ctrl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='lam'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='mcdt-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='movdir64b'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='movdiri'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pbrsb-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='psdp-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rfds-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rrsba-ctrl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='serialize'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ss'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vaes'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='vpclmulqdq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Skylake-Client'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='hle'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rtm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Skylake-Client-IBRS'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='hle'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rtm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Skylake-Client-v1'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='hle'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rtm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Skylake-Client-v2'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='hle'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rtm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Skylake-Client-v3'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Skylake-Client-v4'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Skylake-Server'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='hle'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rtm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Skylake-Server-IBRS'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='hle'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rtm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Skylake-Server-v1'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='hle'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rtm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Skylake-Server-v2'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='hle'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='rtm'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Skylake-Server-v3'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Skylake-Server-v4'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Skylake-Server-v5'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512bw'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512cd'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512dq'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512f'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='avx512vl'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='invpcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pcid'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='pku'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Snowridge'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='cldemote'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='core-capability'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='gfni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='movdir64b'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='movdiri'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='mpx'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='split-lock-detect'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Snowridge-v1'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='cldemote'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='core-capability'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='gfni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='movdir64b'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='movdiri'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='mpx'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='split-lock-detect'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Snowridge-v2'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='cldemote'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='core-capability'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='gfni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='movdir64b'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='movdiri'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='split-lock-detect'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Snowridge-v3'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='cldemote'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='core-capability'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='gfni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='movdir64b'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='movdiri'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='split-lock-detect'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='Snowridge-v4'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='cldemote'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='erms'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='gfni'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='movdir64b'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='movdiri'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='xsaves'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='athlon'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='3dnow'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='3dnowext'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='athlon-v1'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='3dnow'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='3dnowext'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='core2duo'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ss'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='core2duo-v1'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ss'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='coreduo'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ss'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='coreduo-v1'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ss'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='n270'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ss'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='n270-v1'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='ss'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='phenom'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='3dnow'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='3dnowext'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <blockers model='phenom-v1'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='3dnow'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <feature name='3dnowext'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </blockers>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     </mode>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:   </cpu>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:   <memoryBacking supported='yes'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     <enum name='sourceType'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <value>file</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <value>anonymous</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <value>memfd</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     </enum>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:   </memoryBacking>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:   <devices>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     <disk supported='yes'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <enum name='diskDevice'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>disk</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>cdrom</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>floppy</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>lun</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </enum>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <enum name='bus'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>fdc</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>scsi</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>virtio</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>usb</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>sata</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </enum>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <enum name='model'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>virtio</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>virtio-transitional</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>virtio-non-transitional</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </enum>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     </disk>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     <graphics supported='yes'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <enum name='type'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>vnc</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>egl-headless</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>dbus</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </enum>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     </graphics>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     <video supported='yes'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <enum name='modelType'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>vga</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>cirrus</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>virtio</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>none</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>bochs</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>ramfb</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </enum>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     </video>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     <hostdev supported='yes'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <enum name='mode'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>subsystem</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </enum>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <enum name='startupPolicy'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>default</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>mandatory</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>requisite</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>optional</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </enum>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <enum name='subsysType'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>usb</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>pci</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>scsi</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </enum>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <enum name='capsType'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <enum name='pciBackend'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     </hostdev>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     <rng supported='yes'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <enum name='model'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>virtio</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>virtio-transitional</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>virtio-non-transitional</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </enum>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <enum name='backendModel'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>random</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>egd</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>builtin</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </enum>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     </rng>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     <filesystem supported='yes'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <enum name='driverType'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>path</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>handle</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>virtiofs</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </enum>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     </filesystem>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     <tpm supported='yes'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <enum name='model'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>tpm-tis</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>tpm-crb</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </enum>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <enum name='backendModel'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>emulator</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>external</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </enum>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <enum name='backendVersion'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>2.0</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </enum>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     </tpm>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     <redirdev supported='yes'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <enum name='bus'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>usb</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </enum>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     </redirdev>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     <channel supported='yes'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <enum name='type'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>pty</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>unix</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </enum>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     </channel>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     <crypto supported='yes'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <enum name='model'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <enum name='type'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>qemu</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </enum>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <enum name='backendModel'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>builtin</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </enum>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     </crypto>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     <interface supported='yes'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <enum name='backendType'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>default</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>passt</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </enum>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     </interface>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     <panic supported='yes'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <enum name='model'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>isa</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>hyperv</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </enum>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     </panic>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     <console supported='yes'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <enum name='type'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>null</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>vc</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>pty</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>dev</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>file</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>pipe</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>stdio</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>udp</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>tcp</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>unix</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>qemu-vdagent</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>dbus</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </enum>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     </console>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:   </devices>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:   <features>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     <gic supported='no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     <vmcoreinfo supported='yes'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     <genid supported='yes'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     <backingStoreInput supported='yes'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     <backup supported='yes'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     <async-teardown supported='yes'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     <s390-pv supported='no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     <ps2 supported='yes'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     <tdx supported='no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     <sev supported='no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     <sgx supported='no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     <hyperv supported='yes'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <enum name='features'>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>relaxed</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>vapic</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>spinlocks</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>vpindex</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>runtime</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>synic</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>stimer</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>reset</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>vendor_id</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>frequencies</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>reenlightenment</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>tlbflush</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>ipi</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>avic</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>emsr_bitmap</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <value>xmm_input</value>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </enum>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       <defaults>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <spinlocks>4095</spinlocks>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <stimer_direct>on</stimer_direct>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <tlbflush_direct>off</tlbflush_direct>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <tlbflush_extended>off</tlbflush_extended>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:         <vendor_id>Linux KVM Hv</vendor_id>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:       </defaults>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     </hyperv>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:     <launchSecurity supported='no'/>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:   </features>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: </domainCapabilities>
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.472 231725 DEBUG nova.virt.libvirt.host [None req-a9e5d3d9-9357-4ba9-96c4-7ac54478153f - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.472 231725 DEBUG nova.virt.libvirt.host [None req-a9e5d3d9-9357-4ba9-96c4-7ac54478153f - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.476 231725 DEBUG nova.virt.libvirt.host [None req-a9e5d3d9-9357-4ba9-96c4-7ac54478153f - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.476 231725 INFO nova.virt.libvirt.host [None req-a9e5d3d9-9357-4ba9-96c4-7ac54478153f - - - - - -] Secure Boot support detected
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.479 231725 INFO nova.virt.libvirt.driver [None req-a9e5d3d9-9357-4ba9-96c4-7ac54478153f - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.479 231725 INFO nova.virt.libvirt.driver [None req-a9e5d3d9-9357-4ba9-96c4-7ac54478153f - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.495 231725 DEBUG nova.virt.libvirt.driver [None req-a9e5d3d9-9357-4ba9-96c4-7ac54478153f - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.525 231725 INFO nova.virt.node [None req-a9e5d3d9-9357-4ba9-96c4-7ac54478153f - - - - - -] Determined node identity be63d86c-a403-4ec9-a515-07ea2962cb4d from /var/lib/nova/compute_id
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.537 231725 DEBUG nova.compute.manager [None req-a9e5d3d9-9357-4ba9-96c4-7ac54478153f - - - - - -] Verified node be63d86c-a403-4ec9-a515-07ea2962cb4d matches my host np0005626463.localdomain _check_for_host_rename /usr/lib/python3.9/site-packages/nova/compute/manager.py:1568
Feb 23 09:27:53 np0005626463.localdomain sshd[212931]: pam_unix(sshd:session): session closed for user zuul
Feb 23 09:27:53 np0005626463.localdomain systemd[1]: session-53.scope: Deactivated successfully.
Feb 23 09:27:53 np0005626463.localdomain systemd[1]: session-53.scope: Consumed 1min 38.792s CPU time.
Feb 23 09:27:53 np0005626463.localdomain systemd-logind[759]: Session 53 logged out. Waiting for processes to exit.
Feb 23 09:27:53 np0005626463.localdomain systemd-logind[759]: Removed session 53.
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.678 231725 DEBUG nova.compute.manager [None req-a9e5d3d9-9357-4ba9-96c4-7ac54478153f - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.685 231725 DEBUG nova.virt.libvirt.vif [None req-a9e5d3d9-9357-4ba9-96c4-7ac54478153f - - - - - -] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-23T08:22:59Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='test',display_name='test',ec2_ids=<?>,ephemeral_gb=1,ephemeral_key_uuid=None,fault=<?>,flavor=<?>,hidden=False,host='np0005626463.localdomain',hostname='test',id=3,image_ref='a9204248-210d-45b5-ab0a-d1ec08a73a4f',info_cache=InstanceInfoCache,instance_type_id=3,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-23T08:23:11Z,launched_on='np0005626463.localdomain',locked=False,locked_by=None,memory_mb=512,metadata={},migration_context=<?>,new_flavor=<?>,node='np0005626463.localdomain',numa_topology=None,old_flavor=<?>,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='37b8098efb0d4ecc90b451a2db0e966f',ramdisk_id='',reservation_id='r-90tij075',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata=<?>,tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-02-23T08:23:11Z,user_data=None,user_id='cb6895487918456aa599ca2f76872d00',uuid=c2a7d92b-952f-46a7-8a6a-3322a48fcf4b,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "address": "fa:16:3e:a0:9d:00", "network": {"id": "9da5b53d-3184-450f-9a5b-bdba1a6c9f6d", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.12", "type": 
"fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.0.1"}}], "meta": {"injected": false, "tenant_id": "37b8098efb0d4ecc90b451a2db0e966f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system"}, "devname": "tapa27e5011-20", "ovs_interfaceid": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.685 231725 DEBUG nova.network.os_vif_util [None req-a9e5d3d9-9357-4ba9-96c4-7ac54478153f - - - - - -] Converting VIF {"id": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "address": "fa:16:3e:a0:9d:00", "network": {"id": "9da5b53d-3184-450f-9a5b-bdba1a6c9f6d", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.0.1"}}], "meta": {"injected": false, "tenant_id": "37b8098efb0d4ecc90b451a2db0e966f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system"}, "devname": "tapa27e5011-20", "ovs_interfaceid": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.687 231725 DEBUG nova.network.os_vif_util [None req-a9e5d3d9-9357-4ba9-96c4-7ac54478153f - - - - - -] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:a0:9d:00,bridge_name='br-int',has_traffic_filtering=True,id=a27e5011-2016-4b16-b5e8-04b555b30bc4,network=Network(9da5b53d-3184-450f-9a5b-bdba1a6c9f6d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa27e5011-20') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.688 231725 DEBUG os_vif [None req-a9e5d3d9-9357-4ba9-96c4-7ac54478153f - - - - - -] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:a0:9d:00,bridge_name='br-int',has_traffic_filtering=True,id=a27e5011-2016-4b16-b5e8-04b555b30bc4,network=Network(9da5b53d-3184-450f-9a5b-bdba1a6c9f6d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa27e5011-20') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.754 231725 DEBUG ovsdbapp.backend.ovs_idl [None req-a9e5d3d9-9357-4ba9-96c4-7ac54478153f - - - - - -] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.754 231725 DEBUG ovsdbapp.backend.ovs_idl [None req-a9e5d3d9-9357-4ba9-96c4-7ac54478153f - - - - - -] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.754 231725 DEBUG ovsdbapp.backend.ovs_idl [None req-a9e5d3d9-9357-4ba9-96c4-7ac54478153f - - - - - -] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.755 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-a9e5d3d9-9357-4ba9-96c4-7ac54478153f - - - - - -] tcp:127.0.0.1:6640: entering CONNECTING _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.755 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-a9e5d3d9-9357-4ba9-96c4-7ac54478153f - - - - - -] [POLLOUT] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.755 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-a9e5d3d9-9357-4ba9-96c4-7ac54478153f - - - - - -] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.756 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-a9e5d3d9-9357-4ba9-96c4-7ac54478153f - - - - - -] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.757 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-a9e5d3d9-9357-4ba9-96c4-7ac54478153f - - - - - -] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.760 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-a9e5d3d9-9357-4ba9-96c4-7ac54478153f - - - - - -] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.776 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.776 231725 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.776 231725 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 23 09:27:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:53.777 231725 INFO oslo.privsep.daemon [None req-a9e5d3d9-9357-4ba9-96c4-7ac54478153f - - - - - -] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'vif_plug_ovs.privsep.vif_plug', '--privsep_sock_path', '/tmp/tmp47ip6z1r/privsep.sock']
Feb 23 09:27:53 np0005626463.localdomain rsyslogd[758]: imjournal from <localhost:nova_compute>: begin to drop messages due to rate-limiting
Feb 23 09:27:54 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2221 DF PROTO=TCP SPT=51988 DPT=9102 SEQ=3794212809 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF2FFF00000000001030307) 
Feb 23 09:27:54 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:54.401 231725 INFO oslo.privsep.daemon [None req-a9e5d3d9-9357-4ba9-96c4-7ac54478153f - - - - - -] Spawned new privsep daemon via rootwrap
Feb 23 09:27:54 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:54.286 231967 INFO oslo.privsep.daemon [-] privsep daemon starting
Feb 23 09:27:54 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:54.291 231967 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Feb 23 09:27:54 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:54.294 231967 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_NET_ADMIN/CAP_DAC_OVERRIDE|CAP_NET_ADMIN/none
Feb 23 09:27:54 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:54.294 231967 INFO oslo.privsep.daemon [-] privsep daemon running as pid 231967
Feb 23 09:27:54 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29412 DF PROTO=TCP SPT=39054 DPT=9882 SEQ=3387491264 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF302070000000001030307) 
Feb 23 09:27:54 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:54.655 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:27:54 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:54.656 231725 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa27e5011-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 09:27:54 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:54.657 231725 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapa27e5011-20, col_values=(('external_ids', {'iface-id': 'a27e5011-2016-4b16-b5e8-04b555b30bc4', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:a0:9d:00', 'vm-uuid': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 09:27:54 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:54.658 231725 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 23 09:27:54 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:54.658 231725 INFO os_vif [None req-a9e5d3d9-9357-4ba9-96c4-7ac54478153f - - - - - -] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:a0:9d:00,bridge_name='br-int',has_traffic_filtering=True,id=a27e5011-2016-4b16-b5e8-04b555b30bc4,network=Network(9da5b53d-3184-450f-9a5b-bdba1a6c9f6d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa27e5011-20')
Feb 23 09:27:54 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:54.659 231725 DEBUG nova.compute.manager [None req-a9e5d3d9-9357-4ba9-96c4-7ac54478153f - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 23 09:27:54 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:54.663 231725 DEBUG nova.compute.manager [None req-a9e5d3d9-9357-4ba9-96c4-7ac54478153f - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Current state is 1, state in DB is 1. _init_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:1304
Feb 23 09:27:54 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:54.663 231725 INFO nova.compute.manager [None req-a9e5d3d9-9357-4ba9-96c4-7ac54478153f - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host
Feb 23 09:27:54 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:54.765 231725 DEBUG oslo_concurrency.lockutils [None req-a9e5d3d9-9357-4ba9-96c4-7ac54478153f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 09:27:54 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:54.765 231725 DEBUG oslo_concurrency.lockutils [None req-a9e5d3d9-9357-4ba9-96c4-7ac54478153f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 09:27:54 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:54.765 231725 DEBUG oslo_concurrency.lockutils [None req-a9e5d3d9-9357-4ba9-96c4-7ac54478153f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 09:27:54 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:54.766 231725 DEBUG nova.compute.resource_tracker [None req-a9e5d3d9-9357-4ba9-96c4-7ac54478153f - - - - - -] Auditing locally available compute resources for np0005626463.localdomain (node: np0005626463.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 23 09:27:54 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:54.767 231725 DEBUG oslo_concurrency.processutils [None req-a9e5d3d9-9357-4ba9-96c4-7ac54478153f - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 09:27:55 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:55.264 231725 DEBUG oslo_concurrency.processutils [None req-a9e5d3d9-9357-4ba9-96c4-7ac54478153f - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.498s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 09:27:55 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:55.341 231725 DEBUG nova.virt.libvirt.driver [None req-a9e5d3d9-9357-4ba9-96c4-7ac54478153f - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 23 09:27:55 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:55.341 231725 DEBUG nova.virt.libvirt.driver [None req-a9e5d3d9-9357-4ba9-96c4-7ac54478153f - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 23 09:27:55 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:55.547 231725 WARNING nova.virt.libvirt.driver [None req-a9e5d3d9-9357-4ba9-96c4-7ac54478153f - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 23 09:27:55 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:55.549 231725 DEBUG nova.compute.resource_tracker [None req-a9e5d3d9-9357-4ba9-96c4-7ac54478153f - - - - - -] Hypervisor/Node resource view: name=np0005626463.localdomain free_ram=12929MB free_disk=41.83688735961914GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": 
null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 23 09:27:55 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:55.549 231725 DEBUG oslo_concurrency.lockutils [None req-a9e5d3d9-9357-4ba9-96c4-7ac54478153f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 09:27:55 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:55.549 231725 DEBUG oslo_concurrency.lockutils [None req-a9e5d3d9-9357-4ba9-96c4-7ac54478153f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 09:27:55 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:55.731 231725 DEBUG nova.compute.resource_tracker [None req-a9e5d3d9-9357-4ba9-96c4-7ac54478153f - - - - - -] Instance c2a7d92b-952f-46a7-8a6a-3322a48fcf4b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 23 09:27:55 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:55.732 231725 DEBUG nova.compute.resource_tracker [None req-a9e5d3d9-9357-4ba9-96c4-7ac54478153f - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 23 09:27:55 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:55.732 231725 DEBUG nova.compute.resource_tracker [None req-a9e5d3d9-9357-4ba9-96c4-7ac54478153f - - - - - -] Final resource view: name=np0005626463.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 23 09:27:55 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:55.748 231725 DEBUG nova.scheduler.client.report [None req-a9e5d3d9-9357-4ba9-96c4-7ac54478153f - - - - - -] Refreshing inventories for resource provider be63d86c-a403-4ec9-a515-07ea2962cb4d _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Feb 23 09:27:55 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:55.822 231725 DEBUG nova.scheduler.client.report [None req-a9e5d3d9-9357-4ba9-96c4-7ac54478153f - - - - - -] Updating ProviderTree inventory for provider be63d86c-a403-4ec9-a515-07ea2962cb4d from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Feb 23 09:27:55 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:55.823 231725 DEBUG nova.compute.provider_tree [None req-a9e5d3d9-9357-4ba9-96c4-7ac54478153f - - - - - -] Updating inventory in ProviderTree for provider be63d86c-a403-4ec9-a515-07ea2962cb4d with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Feb 23 09:27:55 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:55.841 231725 DEBUG nova.scheduler.client.report [None req-a9e5d3d9-9357-4ba9-96c4-7ac54478153f - - - - - -] Refreshing aggregate associations for resource provider be63d86c-a403-4ec9-a515-07ea2962cb4d, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Feb 23 09:27:55 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:55.867 231725 DEBUG nova.scheduler.client.report [None req-a9e5d3d9-9357-4ba9-96c4-7ac54478153f - - - - - -] Refreshing trait associations for resource provider be63d86c-a403-4ec9-a515-07ea2962cb4d, traits: HW_CPU_X86_AMD_SVM,HW_CPU_X86_BMI,HW_CPU_X86_AESNI,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SVM,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_ABM,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_ACCELERATORS,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_F16C,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VOLUME_EXTEND,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_AVX2,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_TRUSTED_CERTS,COMPUTE_STORAGE_BUS_USB,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE2,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_AVX,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSE42,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_SSE4A,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_BMI2,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_FMA3,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_CLMUL,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NODE,HW_CPU_X86_SSSE3,HW_CPU_X86_SHA,HW_CPU_X86_SSE,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_MMX,HW_CPU_X86_SSE41,COMPUTE_VOLUME_ATTACH_WITH_TAG _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Feb 23 09:27:55 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:55.924 231725 DEBUG oslo_concurrency.processutils [None req-a9e5d3d9-9357-4ba9-96c4-7ac54478153f - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 09:27:56 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:56.385 231725 DEBUG oslo_concurrency.processutils [None req-a9e5d3d9-9357-4ba9-96c4-7ac54478153f - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.461s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 09:27:56 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:56.392 231725 DEBUG nova.virt.libvirt.host [None req-a9e5d3d9-9357-4ba9-96c4-7ac54478153f - - - - - -] /sys/module/kvm_amd/parameters/sev contains [N
Feb 23 09:27:56 np0005626463.localdomain nova_compute[231721]: ] _kernel_supports_amd_sev /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1803
Feb 23 09:27:56 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:56.393 231725 INFO nova.virt.libvirt.host [None req-a9e5d3d9-9357-4ba9-96c4-7ac54478153f - - - - - -] kernel doesn't support AMD SEV
Feb 23 09:27:56 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:56.394 231725 DEBUG nova.compute.provider_tree [None req-a9e5d3d9-9357-4ba9-96c4-7ac54478153f - - - - - -] Inventory has not changed in ProviderTree for provider: be63d86c-a403-4ec9-a515-07ea2962cb4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 23 09:27:56 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:56.395 231725 DEBUG nova.virt.libvirt.driver [None req-a9e5d3d9-9357-4ba9-96c4-7ac54478153f - - - - - -] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Feb 23 09:27:56 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:56.596 231725 DEBUG nova.scheduler.client.report [None req-a9e5d3d9-9357-4ba9-96c4-7ac54478153f - - - - - -] Inventory has not changed for provider be63d86c-a403-4ec9-a515-07ea2962cb4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 23 09:27:56 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:56.626 231725 DEBUG nova.compute.resource_tracker [None req-a9e5d3d9-9357-4ba9-96c4-7ac54478153f - - - - - -] Compute_service record updated for np0005626463.localdomain:np0005626463.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 23 09:27:56 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:56.627 231725 DEBUG oslo_concurrency.lockutils [None req-a9e5d3d9-9357-4ba9-96c4-7ac54478153f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.077s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 09:27:56 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:56.627 231725 DEBUG nova.service [None req-a9e5d3d9-9357-4ba9-96c4-7ac54478153f - - - - - -] Creating RPC server for service compute start /usr/lib/python3.9/site-packages/nova/service.py:182
Feb 23 09:27:56 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:56.663 231725 DEBUG nova.service [None req-a9e5d3d9-9357-4ba9-96c4-7ac54478153f - - - - - -] Join ServiceGroup membership for this service compute start /usr/lib/python3.9/site-packages/nova/service.py:199
Feb 23 09:27:56 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:56.664 231725 DEBUG nova.servicegroup.drivers.db [None req-a9e5d3d9-9357-4ba9-96c4-7ac54478153f - - - - - -] DB_Driver: join new ServiceGroup member np0005626463.localdomain to the compute group, service = <Service: host=np0005626463.localdomain, binary=nova-compute, manager_class_name=nova.compute.manager.ComputeManager> join /usr/lib/python3.9/site-packages/nova/servicegroup/drivers/db.py:44
Feb 23 09:27:57 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2223 DF PROTO=TCP SPT=51988 DPT=9102 SEQ=3794212809 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF30C060000000001030307) 
Feb 23 09:27:57 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:57.501 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:27:57 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.
Feb 23 09:27:57 np0005626463.localdomain podman[232015]: 2026-02-23 09:27:57.915214934 +0000 UTC m=+0.088240297 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.43.0)
Feb 23 09:27:57 np0005626463.localdomain podman[232015]: 2026-02-23 09:27:57.94824482 +0000 UTC m=+0.121270163 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Feb 23 09:27:57 np0005626463.localdomain systemd[1]: 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully.
Feb 23 09:27:58 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:27:58.789 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:27:58 np0005626463.localdomain sshd[232034]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:27:59 np0005626463.localdomain sshd[232034]: Accepted publickey for zuul from 192.168.122.30 port 46748 ssh2: RSA SHA256:/ShS2J5Dq7o9P59e/NmgQORSAcJOBwu46Huo03HBdB4
Feb 23 09:27:59 np0005626463.localdomain systemd-logind[759]: New session 55 of user zuul.
Feb 23 09:27:59 np0005626463.localdomain systemd[1]: Started Session 55 of User zuul.
Feb 23 09:27:59 np0005626463.localdomain sshd[232034]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Feb 23 09:27:59 np0005626463.localdomain python3.9[232145]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 23 09:28:00 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:28:00Z|00048|binding|INFO|Releasing lport 4143c8ea-7577-4792-9744-bcff90eb20f2 from this chassis (sb_readonly=0)
Feb 23 09:28:00 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:28:00.113 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:28:00 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44208 DF PROTO=TCP SPT=38014 DPT=9100 SEQ=2101912107 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF31A860000000001030307) 
Feb 23 09:28:00 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:28:00Z|00049|binding|INFO|Releasing lport 4143c8ea-7577-4792-9744-bcff90eb20f2 from this chassis (sb_readonly=0)
Feb 23 09:28:00 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:28:00.962 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:28:01 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:28:01.842 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:28:01 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:28:01Z|00050|binding|INFO|Releasing lport 4143c8ea-7577-4792-9744-bcff90eb20f2 from this chassis (sb_readonly=0)
Feb 23 09:28:02 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:28:02.503 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:28:02 np0005626463.localdomain sudo[232258]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-upduqkstftocchujdbutjypuhdwaijli ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838881.711256-63-165291041329529/AnsiballZ_systemd_service.py
Feb 23 09:28:02 np0005626463.localdomain sudo[232258]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:28:02 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44209 DF PROTO=TCP SPT=38014 DPT=9100 SEQ=2101912107 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF322860000000001030307) 
Feb 23 09:28:03 np0005626463.localdomain python3.9[232260]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 23 09:28:03 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 09:28:03 np0005626463.localdomain systemd-rc-local-generator[232282]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 09:28:03 np0005626463.localdomain systemd-sysv-generator[232286]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 09:28:03 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:28:03 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 23 09:28:03 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:28:03 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:28:03 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 09:28:03 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 23 09:28:03 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:28:03 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:28:03 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:28:03 np0005626463.localdomain sudo[232258]: pam_unix(sudo:session): session closed for user root
Feb 23 09:28:03 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:28:03.823 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:28:04 np0005626463.localdomain python3.9[232403]: ansible-ansible.builtin.service_facts Invoked
Feb 23 09:28:04 np0005626463.localdomain network[232420]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Feb 23 09:28:04 np0005626463.localdomain network[232421]: 'network-scripts' will be removed from distribution in near future.
Feb 23 09:28:04 np0005626463.localdomain network[232422]: It is advised to switch to 'NetworkManager' instead for network management.
Feb 23 09:28:05 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 09:28:06 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48630 DF PROTO=TCP SPT=55754 DPT=9101 SEQ=351446676 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF32F460000000001030307) 
Feb 23 09:28:07 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:28:07.530 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:28:08 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:28:08.873 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:28:09 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=596 DF PROTO=TCP SPT=40388 DPT=9882 SEQ=4204293524 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF33BA20000000001030307) 
Feb 23 09:28:09 np0005626463.localdomain sudo[232653]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-smekvskezqncqrexmndnmdceorzegngh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838889.1583195-120-49972452145051/AnsiballZ_systemd_service.py
Feb 23 09:28:09 np0005626463.localdomain sudo[232653]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:28:09 np0005626463.localdomain python3.9[232655]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_ceilometer_agent_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 09:28:09 np0005626463.localdomain sudo[232653]: pam_unix(sudo:session): session closed for user root
Feb 23 09:28:10 np0005626463.localdomain sudo[232764]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hvfzfenaztqexwllnmjzffbdnehqqpmy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838890.1274793-150-28339990179991/AnsiballZ_file.py
Feb 23 09:28:10 np0005626463.localdomain sudo[232764]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:28:10 np0005626463.localdomain python3.9[232766]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:28:10 np0005626463.localdomain sudo[232764]: pam_unix(sudo:session): session closed for user root
Feb 23 09:28:10 np0005626463.localdomain systemd-journald[47710]: Field hash table of /run/log/journal/c0212a8b024a111cfc61293864f36c87/system.journal has a fill level at 76.3 (254 of 333 items), suggesting rotation.
Feb 23 09:28:10 np0005626463.localdomain systemd-journald[47710]: /run/log/journal/c0212a8b024a111cfc61293864f36c87/system.journal: Journal header limits reached or header out-of-date, rotating.
Feb 23 09:28:10 np0005626463.localdomain rsyslogd[758]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Feb 23 09:28:10 np0005626463.localdomain rsyslogd[758]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Feb 23 09:28:11 np0005626463.localdomain sudo[232875]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-szzegjzjfeszggxjpatujohsxacfdgag ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838890.9561317-174-241799943368636/AnsiballZ_file.py
Feb 23 09:28:11 np0005626463.localdomain sudo[232875]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:28:11 np0005626463.localdomain sshd[232878]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:28:11 np0005626463.localdomain python3.9[232877]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:28:11 np0005626463.localdomain sudo[232875]: pam_unix(sudo:session): session closed for user root
Feb 23 09:28:11 np0005626463.localdomain sshd[232878]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:28:12 np0005626463.localdomain sudo[232987]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nqacbtwjnxvytkpoctbhmnxwsbvvaeni ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838891.7116032-201-168563310307294/AnsiballZ_command.py
Feb 23 09:28:12 np0005626463.localdomain sudo[232987]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:28:12 np0005626463.localdomain python3.9[232989]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                                              systemctl disable --now certmonger.service
                                                              test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                                            fi
                                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 09:28:12 np0005626463.localdomain sudo[232987]: pam_unix(sudo:session): session closed for user root
Feb 23 09:28:12 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=598 DF PROTO=TCP SPT=40388 DPT=9882 SEQ=4204293524 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF347C60000000001030307) 
Feb 23 09:28:12 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:28:12.554 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:28:13 np0005626463.localdomain python3.9[233099]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Feb 23 09:28:13 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:28:13.914 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:28:13 np0005626463.localdomain sudo[233207]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eeuyoxdydzybhpzcxnvcuskqagtleefc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838893.6397233-255-36656168994146/AnsiballZ_systemd_service.py
Feb 23 09:28:13 np0005626463.localdomain sudo[233207]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:28:14 np0005626463.localdomain sshd[233210]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:28:14 np0005626463.localdomain python3.9[233209]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 23 09:28:14 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 09:28:14 np0005626463.localdomain systemd-rc-local-generator[233238]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 09:28:14 np0005626463.localdomain systemd-sysv-generator[233241]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 09:28:14 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:28:14 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 23 09:28:14 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:28:14 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:28:14 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 09:28:14 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 23 09:28:14 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:28:14 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:28:14 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:28:14 np0005626463.localdomain sudo[233207]: pam_unix(sudo:session): session closed for user root
Feb 23 09:28:14 np0005626463.localdomain sshd[233210]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:28:15 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15276 DF PROTO=TCP SPT=44880 DPT=9105 SEQ=1569690799 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF352060000000001030307) 
Feb 23 09:28:15 np0005626463.localdomain sudo[233356]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hfklvduriadsjndxwiybatwvsovzllwi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838895.012153-279-71400684130251/AnsiballZ_command.py
Feb 23 09:28:15 np0005626463.localdomain sudo[233356]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:28:15 np0005626463.localdomain python3.9[233358]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_ceilometer_agent_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 09:28:15 np0005626463.localdomain sudo[233356]: pam_unix(sudo:session): session closed for user root
Feb 23 09:28:17 np0005626463.localdomain sudo[233468]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ebwvvmczutfvxqyxcxhogpqjudsdryhf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838896.7456152-306-138738687850819/AnsiballZ_file.py
Feb 23 09:28:17 np0005626463.localdomain sudo[233468]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:28:17 np0005626463.localdomain python3.9[233470]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/openstack/telemetry recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 23 09:28:17 np0005626463.localdomain sudo[233468]: pam_unix(sudo:session): session closed for user root
Feb 23 09:28:17 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:28:17.594 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:28:17 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:28:17.666 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:28:17 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:28:17.691 231725 DEBUG nova.compute.manager [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Triggering sync for uuid c2a7d92b-952f-46a7-8a6a-3322a48fcf4b _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Feb 23 09:28:17 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:28:17.691 231725 DEBUG oslo_concurrency.lockutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Acquiring lock "c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 09:28:17 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:28:17.691 231725 DEBUG oslo_concurrency.lockutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Lock "c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 09:28:17 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:28:17.692 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:28:17 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:28:17.757 231725 DEBUG oslo_concurrency.lockutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Lock "c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.065s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 09:28:18 np0005626463.localdomain python3.9[233578]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 23 09:28:18 np0005626463.localdomain sudo[233688]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ehcllihflvxjjwbdgllnybdrycugobyc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838898.2145486-354-262697721777828/AnsiballZ_group.py
Feb 23 09:28:18 np0005626463.localdomain sudo[233688]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:28:18 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48632 DF PROTO=TCP SPT=55754 DPT=9101 SEQ=351446676 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF360060000000001030307) 
Feb 23 09:28:18 np0005626463.localdomain python3.9[233690]: ansible-ansible.builtin.group Invoked with name=libvirt state=present force=False system=False local=False non_unique=False gid=None gid_min=None gid_max=None
Feb 23 09:28:18 np0005626463.localdomain sudo[233688]: pam_unix(sudo:session): session closed for user root
Feb 23 09:28:18 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:28:18.952 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:28:19 np0005626463.localdomain sudo[233798]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-voapbpjijwqruvtnplwexgobqxwahaoe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838899.2437282-387-140021232818125/AnsiballZ_getent.py
Feb 23 09:28:19 np0005626463.localdomain sudo[233798]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:28:19 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.
Feb 23 09:28:19 np0005626463.localdomain podman[233801]: 2026-02-23 09:28:19.780900592 +0000 UTC m=+0.088610490 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, tcib_managed=true, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 23 09:28:19 np0005626463.localdomain podman[233801]: 2026-02-23 09:28:19.851267995 +0000 UTC m=+0.158977863 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260216, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true)
Feb 23 09:28:19 np0005626463.localdomain python3.9[233800]: ansible-ansible.builtin.getent Invoked with database=passwd key=ceilometer fail_key=True service=None split=None
Feb 23 09:28:19 np0005626463.localdomain systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully.
Feb 23 09:28:19 np0005626463.localdomain sudo[233798]: pam_unix(sudo:session): session closed for user root
Feb 23 09:28:20 np0005626463.localdomain sudo[233933]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ydyllzyfdhwofmuawovtqbhzzuxvttnw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838900.0256484-411-130409917740144/AnsiballZ_group.py
Feb 23 09:28:20 np0005626463.localdomain sudo[233933]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:28:20 np0005626463.localdomain python3.9[233935]: ansible-ansible.builtin.group Invoked with gid=42405 name=ceilometer state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Feb 23 09:28:20 np0005626463.localdomain groupadd[233936]: group added to /etc/group: name=ceilometer, GID=42405
Feb 23 09:28:20 np0005626463.localdomain groupadd[233936]: group added to /etc/gshadow: name=ceilometer
Feb 23 09:28:20 np0005626463.localdomain groupadd[233936]: new group: name=ceilometer, GID=42405
Feb 23 09:28:20 np0005626463.localdomain sudo[233933]: pam_unix(sudo:session): session closed for user root
Feb 23 09:28:21 np0005626463.localdomain sudo[234049]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aeuqpnnxmstbqobgeqqtdlcwgderjhpn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838900.7873752-435-53631910692736/AnsiballZ_user.py
Feb 23 09:28:21 np0005626463.localdomain sudo[234049]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:28:21 np0005626463.localdomain python3.9[234051]: ansible-ansible.builtin.user Invoked with comment=ceilometer user group=ceilometer groups=['libvirt'] name=ceilometer shell=/sbin/nologin state=present uid=42405 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005626463.localdomain update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Feb 23 09:28:21 np0005626463.localdomain useradd[234053]: new user: name=ceilometer, UID=42405, GID=42405, home=/home/ceilometer, shell=/sbin/nologin, from=/dev/pts/1
Feb 23 09:28:21 np0005626463.localdomain useradd[234053]: add 'ceilometer' to group 'libvirt'
Feb 23 09:28:21 np0005626463.localdomain useradd[234053]: add 'ceilometer' to shadow group 'libvirt'
Feb 23 09:28:21 np0005626463.localdomain sudo[234049]: pam_unix(sudo:session): session closed for user root
Feb 23 09:28:22 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:28:22.623 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:28:22 np0005626463.localdomain python3.9[234167]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/ceilometer.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:28:23 np0005626463.localdomain python3.9[234253]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/ceilometer.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1771838902.4156823-513-112485582236387/.source.conf _original_basename=ceilometer.conf follow=False checksum=4b0a838cd69b15ea29a51dcd9d2e92127205926b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:28:23 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:28:23.984 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:28:24 np0005626463.localdomain python3.9[234361]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/polling.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:28:24 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48856 DF PROTO=TCP SPT=50904 DPT=9102 SEQ=884516032 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF375200000000001030307) 
Feb 23 09:28:24 np0005626463.localdomain python3.9[234447]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/polling.yaml mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1771838903.6621501-513-26662183082544/.source.yaml _original_basename=polling.yaml follow=False checksum=6c8680a286285f2e0ef9fa528ca754765e5ed0e5 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:28:24 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=600 DF PROTO=TCP SPT=40388 DPT=9882 SEQ=4204293524 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF378060000000001030307) 
Feb 23 09:28:25 np0005626463.localdomain python3.9[234555]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/custom.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:28:25 np0005626463.localdomain python3.9[234641]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/custom.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1771838904.7053463-513-37826051991779/.source.conf _original_basename=custom.conf follow=False checksum=838b8b0a7d7f72e55ab67d39f32e3cb3eca2139b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:28:26 np0005626463.localdomain python3.9[234749]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 23 09:28:26 np0005626463.localdomain python3.9[234857]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 23 09:28:27 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48858 DF PROTO=TCP SPT=50904 DPT=9102 SEQ=884516032 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF381470000000001030307) 
Feb 23 09:28:27 np0005626463.localdomain python3.9[234965]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/ceilometer-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:28:27 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:28:27.666 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:28:27 np0005626463.localdomain python3.9[235051]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/ceilometer-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771838907.0256228-690-77024440632349/.source.conf follow=False _original_basename=ceilometer-host-specific.conf.j2 checksum=1b486c8889fd20026a215b81ea19419a850aff23 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 23 09:28:28 np0005626463.localdomain sudo[235129]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 09:28:28 np0005626463.localdomain sudo[235129]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:28:28 np0005626463.localdomain sudo[235129]: pam_unix(sudo:session): session closed for user root
Feb 23 09:28:28 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.
Feb 23 09:28:28 np0005626463.localdomain sudo[235184]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 23 09:28:28 np0005626463.localdomain sudo[235184]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:28:28 np0005626463.localdomain systemd[1]: tmp-crun.bxfv8m.mount: Deactivated successfully.
Feb 23 09:28:28 np0005626463.localdomain podman[235178]: 2026-02-23 09:28:28.669703974 +0000 UTC m=+0.109016719 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, managed_by=edpm_ansible)
Feb 23 09:28:28 np0005626463.localdomain podman[235178]: 2026-02-23 09:28:28.704275217 +0000 UTC m=+0.143587942 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Feb 23 09:28:28 np0005626463.localdomain systemd[1]: 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully.
Feb 23 09:28:28 np0005626463.localdomain python3.9[235176]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/openstack_network_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:28:29 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:28:29.027 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:28:29 np0005626463.localdomain python3.9[235315]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/openstack_network_exporter.yaml mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771838908.2792506-690-48172773206248/.source.yaml follow=False _original_basename=openstack_network_exporter.yaml.j2 checksum=e2858327749c09c7b8ca5fc97985d7885b95bd4b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 23 09:28:29 np0005626463.localdomain sudo[235184]: pam_unix(sudo:session): session closed for user root
Feb 23 09:28:29 np0005626463.localdomain python3.9[235440]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/firewall.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:28:29 np0005626463.localdomain sudo[235441]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 09:28:29 np0005626463.localdomain sudo[235441]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:28:29 np0005626463.localdomain sudo[235441]: pam_unix(sudo:session): session closed for user root
Feb 23 09:28:30 np0005626463.localdomain python3.9[235544]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/firewall.yaml mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771838909.4354804-777-48419553079185/.source.yaml _original_basename=firewall.yaml follow=False checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 23 09:28:30 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41126 DF PROTO=TCP SPT=46792 DPT=9100 SEQ=2175295943 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF38F860000000001030307) 
Feb 23 09:28:31 np0005626463.localdomain python3.9[235652]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 23 09:28:32 np0005626463.localdomain python3.9[235762]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 23 09:28:32 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:28:32.709 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:28:32 np0005626463.localdomain python3.9[235870]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 23 09:28:32 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41127 DF PROTO=TCP SPT=46792 DPT=9100 SEQ=2175295943 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF397860000000001030307) 
Feb 23 09:28:33 np0005626463.localdomain sudo[235978]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wkhsbfgoxxydyqnoevmirpyflitdbxke ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838912.8782892-903-206554696027649/AnsiballZ_file.py
Feb 23 09:28:33 np0005626463.localdomain sudo[235978]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:28:33 np0005626463.localdomain python3.9[235980]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 23 09:28:33 np0005626463.localdomain sudo[235978]: pam_unix(sudo:session): session closed for user root
Feb 23 09:28:34 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:28:34.068 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:28:34 np0005626463.localdomain sudo[236088]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dvhcqpdmbnqjrmggexmgnnlgkajsissy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838913.945548-927-280051418548183/AnsiballZ_systemd_service.py
Feb 23 09:28:34 np0005626463.localdomain sudo[236088]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:28:34 np0005626463.localdomain python3.9[236090]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=podman.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 09:28:34 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 09:28:34 np0005626463.localdomain systemd-sysv-generator[236118]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 09:28:34 np0005626463.localdomain systemd-rc-local-generator[236115]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 09:28:34 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:28:34 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 23 09:28:34 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:28:34 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:28:34 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 09:28:34 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 23 09:28:34 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:28:34 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:28:34 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:28:34 np0005626463.localdomain systemd[1]: Listening on Podman API Socket.
Feb 23 09:28:34 np0005626463.localdomain sudo[236088]: pam_unix(sudo:session): session closed for user root
Feb 23 09:28:36 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37542 DF PROTO=TCP SPT=52326 DPT=9101 SEQ=852903233 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF3A4870000000001030307) 
Feb 23 09:28:36 np0005626463.localdomain sudo[236238]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wbbmkwdcauspasygjnwfrjapdnsrjnjq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838916.2092702-954-139400533907952/AnsiballZ_stat.py
Feb 23 09:28:36 np0005626463.localdomain sudo[236238]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:28:36 np0005626463.localdomain python3.9[236240]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ceilometer_agent_compute/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:28:36 np0005626463.localdomain sudo[236238]: pam_unix(sudo:session): session closed for user root
Feb 23 09:28:37 np0005626463.localdomain sudo[236326]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-odxpiicbzyosqiowvarruylhunjfjrbx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838916.2092702-954-139400533907952/AnsiballZ_copy.py
Feb 23 09:28:37 np0005626463.localdomain sudo[236326]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:28:37 np0005626463.localdomain python3.9[236328]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ceilometer_agent_compute/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771838916.2092702-954-139400533907952/.source _original_basename=healthcheck follow=False checksum=ebb343c21fce35a02591a9351660cb7035a47d42 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Feb 23 09:28:37 np0005626463.localdomain sudo[236326]: pam_unix(sudo:session): session closed for user root
Feb 23 09:28:37 np0005626463.localdomain sudo[236381]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gztbhayelkjgonbqpqezmpkcyitdgdxb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838916.2092702-954-139400533907952/AnsiballZ_stat.py
Feb 23 09:28:37 np0005626463.localdomain sudo[236381]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:28:37 np0005626463.localdomain python3.9[236383]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ceilometer_agent_compute/healthcheck.future follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:28:37 np0005626463.localdomain sudo[236381]: pam_unix(sudo:session): session closed for user root
Feb 23 09:28:37 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:28:37.744 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:28:38 np0005626463.localdomain sudo[236469]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xfdwmbpehpmvtejqfvecgjvjukhiadfs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838916.2092702-954-139400533907952/AnsiballZ_copy.py
Feb 23 09:28:38 np0005626463.localdomain sudo[236469]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:28:38 np0005626463.localdomain python3.9[236471]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ceilometer_agent_compute/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771838916.2092702-954-139400533907952/.source.future _original_basename=healthcheck.future follow=False checksum=d500a98192f4ddd70b4dfdc059e2d81aed36a294 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Feb 23 09:28:38 np0005626463.localdomain sudo[236469]: pam_unix(sudo:session): session closed for user root
Feb 23 09:28:39 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:28:39.113 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:28:39 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45652 DF PROTO=TCP SPT=52602 DPT=9882 SEQ=2966039936 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF3B0D30000000001030307) 
Feb 23 09:28:39 np0005626463.localdomain sudo[236579]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lzbwvzuccepjdudljdljdrlzybqcxhbt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838919.2156916-1050-252795328586353/AnsiballZ_file.py
Feb 23 09:28:39 np0005626463.localdomain sudo[236579]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:28:39 np0005626463.localdomain python3.9[236581]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:28:39 np0005626463.localdomain sudo[236579]: pam_unix(sudo:session): session closed for user root
Feb 23 09:28:40 np0005626463.localdomain sudo[236689]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cwxloowkywviidavepwmwdqlsqawfrgt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838919.8580852-1074-112318655936271/AnsiballZ_file.py
Feb 23 09:28:40 np0005626463.localdomain sudo[236689]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:28:40 np0005626463.localdomain python3.9[236691]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 23 09:28:40 np0005626463.localdomain sudo[236689]: pam_unix(sudo:session): session closed for user root
Feb 23 09:28:40 np0005626463.localdomain sudo[236799]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-btdelamljmoymmkuwekvbitsbdqzkxkd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838920.571894-1098-93525735856434/AnsiballZ_stat.py
Feb 23 09:28:40 np0005626463.localdomain sudo[236799]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:28:41 np0005626463.localdomain python3.9[236801]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ceilometer_agent_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:28:41 np0005626463.localdomain sudo[236799]: pam_unix(sudo:session): session closed for user root
Feb 23 09:28:41 np0005626463.localdomain sudo[236889]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xpsucresbiidvwafsvgewovgbhwyovxp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838920.571894-1098-93525735856434/AnsiballZ_copy.py
Feb 23 09:28:41 np0005626463.localdomain sudo[236889]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:28:41 np0005626463.localdomain python3.9[236891]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ceilometer_agent_compute.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1771838920.571894-1098-93525735856434/.source.json _original_basename=.o2l7xxkb follow=False checksum=ce2b0c83293a970bafffa087afa083dd7c93a79c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:28:41 np0005626463.localdomain sudo[236889]: pam_unix(sudo:session): session closed for user root
Feb 23 09:28:42 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45654 DF PROTO=TCP SPT=52602 DPT=9882 SEQ=2966039936 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF3BCC60000000001030307) 
Feb 23 09:28:42 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:28:42.778 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:28:43 np0005626463.localdomain python3.9[236999]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ceilometer_agent_compute state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:28:44 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:28:44.156 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:28:45 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45910 DF PROTO=TCP SPT=38372 DPT=9105 SEQ=2005889756 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF3C8070000000001030307) 
Feb 23 09:28:45 np0005626463.localdomain sudo[237301]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wcrvdiwlbmwhbypwkdkcptmgaokyekka ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838925.1199503-1218-274881871649987/AnsiballZ_container_config_data.py
Feb 23 09:28:45 np0005626463.localdomain sudo[237301]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:28:45 np0005626463.localdomain python3.9[237303]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ceilometer_agent_compute config_pattern=*.json debug=False
Feb 23 09:28:45 np0005626463.localdomain sudo[237301]: pam_unix(sudo:session): session closed for user root
Feb 23 09:28:46 np0005626463.localdomain sudo[237411]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ywatlysrztxfjttoizxvevtrwliotqvk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838926.156188-1251-143597581756786/AnsiballZ_container_config_hash.py
Feb 23 09:28:46 np0005626463.localdomain sudo[237411]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:28:46 np0005626463.localdomain python3.9[237413]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Feb 23 09:28:46 np0005626463.localdomain sudo[237411]: pam_unix(sudo:session): session closed for user root
Feb 23 09:28:47 np0005626463.localdomain sudo[237521]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dlggosemntjkjxiarzdifzbyhcbsywlv ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1771838927.1266956-1281-227045184013362/AnsiballZ_edpm_container_manage.py
Feb 23 09:28:47 np0005626463.localdomain sudo[237521]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:28:47 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:28:47.816 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:28:47 np0005626463.localdomain python3[237523]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ceilometer_agent_compute config_id=ceilometer_agent_compute config_overrides={} config_patterns=*.json containers=['ceilometer_agent_compute'] log_base_path=/var/log/containers/stdouts debug=False
Feb 23 09:28:48 np0005626463.localdomain python3[237523]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: [
                                                               {
                                                                    "Id": "06e96a8544ce5b1764a2938311eff3a4e150b0db6e81ca441c51cfb1ef2d06f7",
                                                                    "Digest": "sha256:6b9e3ee61e70553173f5cfad8288b7db7633aba0bb89dd3706a98e001e847744",
                                                                    "RepoTags": [
                                                                         "quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified"
                                                                    ],
                                                                    "RepoDigests": [
                                                                         "quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:6b9e3ee61e70553173f5cfad8288b7db7633aba0bb89dd3706a98e001e847744"
                                                                    ],
                                                                    "Parent": "",
                                                                    "Comment": "",
                                                                    "Created": "2026-02-23T06:19:44.775576019Z",
                                                                    "Config": {
                                                                         "User": "root",
                                                                         "Env": [
                                                                              "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",
                                                                              "LANG=en_US.UTF-8",
                                                                              "TZ=UTC",
                                                                              "container=oci"
                                                                         ],
                                                                         "Entrypoint": [
                                                                              "dumb-init",
                                                                              "--single-child",
                                                                              "--"
                                                                         ],
                                                                         "Cmd": [
                                                                              "kolla_start"
                                                                         ],
                                                                         "Labels": {
                                                                              "io.buildah.version": "1.43.0",
                                                                              "maintainer": "OpenStack Kubernetes Operator team",
                                                                              "org.label-schema.build-date": "20260216",
                                                                              "org.label-schema.license": "GPLv2",
                                                                              "org.label-schema.name": "CentOS Stream 9 Base Image",
                                                                              "org.label-schema.schema-version": "1.0",
                                                                              "org.label-schema.vendor": "CentOS",
                                                                              "tcib_build_tag": "8419493e1fd846703d277695e03fc5eb",
                                                                              "tcib_managed": "true"
                                                                         },
                                                                         "StopSignal": "SIGTERM"
                                                                    },
                                                                    "Version": "",
                                                                    "Author": "",
                                                                    "Architecture": "amd64",
                                                                    "Os": "linux",
                                                                    "Size": 507008258,
                                                                    "VirtualSize": 507008258,
                                                                    "GraphDriver": {
                                                                         "Name": "overlay",
                                                                         "Data": {
                                                                              "LowerDir": "/var/lib/containers/storage/overlay/4e7c8cf8be5e28661f08c7ae9ca08b0a811b1f296a0663a493871b4299da2d4e/diff:/var/lib/containers/storage/overlay/0455f1f13172510bfb03afa514ad1dc5f28a2039a4c0ae85e44e0cde63814ca4/diff:/var/lib/containers/storage/overlay/882df85a0cf43e46bc799aafd5ff81035654b304c2fef5dbd26c9dd0c2e9fcc3/diff:/var/lib/containers/storage/overlay/d9f14c75a7289cf010d2e5175c554193dba109f864fe39fc418f3bc5b90efe9d/diff",
                                                                              "UpperDir": "/var/lib/containers/storage/overlay/1962cc6363cc9ac3ab3c2a513bdaec43a309cd0406c08aa2e9112851ab244998/diff",
                                                                              "WorkDir": "/var/lib/containers/storage/overlay/1962cc6363cc9ac3ab3c2a513bdaec43a309cd0406c08aa2e9112851ab244998/work"
                                                                         }
                                                                    },
                                                                    "RootFS": {
                                                                         "Type": "layers",
                                                                         "Layers": [
                                                                              "sha256:d9f14c75a7289cf010d2e5175c554193dba109f864fe39fc418f3bc5b90efe9d",
                                                                              "sha256:6eb5d45c6942983139aec78264b4b68bafe46465bb40e2bb4c09e78dad8ba6c0",
                                                                              "sha256:9a59f9675e4fdfdb0eaa24dcce26bed374feef6430ea888b6f5ef1274a95bd90",
                                                                              "sha256:e987eedfc47d5bbc741f79f8e7da7344301a57efb72290ce8f9e21557f91af78",
                                                                              "sha256:aa35e8d9bc7b5398fbfd3575b78fe9a16122508eef3848f5956f0550eed3f37b"
                                                                         ]
                                                                    },
                                                                    "Labels": {
                                                                         "io.buildah.version": "1.43.0",
                                                                         "maintainer": "OpenStack Kubernetes Operator team",
                                                                         "org.label-schema.build-date": "20260216",
                                                                         "org.label-schema.license": "GPLv2",
                                                                         "org.label-schema.name": "CentOS Stream 9 Base Image",
                                                                         "org.label-schema.schema-version": "1.0",
                                                                         "org.label-schema.vendor": "CentOS",
                                                                         "tcib_build_tag": "8419493e1fd846703d277695e03fc5eb",
                                                                         "tcib_managed": "true"
                                                                    },
                                                                    "Annotations": {},
                                                                    "ManifestType": "application/vnd.docker.distribution.manifest.v2+json",
                                                                    "User": "root",
                                                                    "History": [
                                                                         {
                                                                              "created": "2026-02-17T01:25:07.246646992Z",
                                                                              "created_by": "/bin/sh -c #(nop) ADD file:d064f128d9bf147a386d5c0e8c2e8a6f698c81fb4e2404e09afe5ef1e1d3b529 in / ",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-17T01:25:07.246739119Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL org.label-schema.schema-version=\"1.0\"     org.label-schema.name=\"CentOS Stream 9 Base Image\"     org.label-schema.vendor=\"CentOS\"     org.label-schema.license=\"GPLv2\"     org.label-schema.build-date=\"20260216\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-17T01:25:12.132997501Z",
                                                                              "created_by": "/bin/sh -c #(nop) CMD [\"/bin/bash\"]"
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:08:39.081651802Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL maintainer=\"OpenStack Kubernetes Operator team\"",
                                                                              "comment": "FROM quay.io/centos/centos:stream9",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:08:39.081666472Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL tcib_managed=true",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:08:39.081677733Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV LANG=\"en_US.UTF-8\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:08:39.081688343Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV TZ=\"UTC\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:08:39.081701553Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV container=\"oci\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:08:39.081710413Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:08:39.413481757Z",
                                                                              "created_by": "/bin/sh -c if [ -f \"/etc/yum.repos.d/ubi.repo\" ]; then rm -f /etc/yum.repos.d/ubi.repo && dnf clean all && rm -rf /var/cache/dnf; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:09:13.490649497Z",
                                                                              "created_by": "/bin/sh -c dnf install -y crudini && crudini --del /etc/dnf/dnf.conf main override_install_langs && crudini --set /etc/dnf/dnf.conf main clean_requirements_on_remove True && crudini --set /etc/dnf/dnf.conf main exactarch 1 && crudini --set /etc/dnf/dnf.conf main gpgcheck 1 && crudini --set /etc/dnf/dnf.conf main install_weak_deps False && if [ 'centos' == 'centos' ];then crudini --set /etc/dnf/dnf.conf main best False; fi && crudini --set /etc/dnf/dnf.conf main installonly_limit 0 && crudini --set /etc/dnf/dnf.conf main keepcache 0 && crudini --set /etc/dnf/dnf.conf main obsoletes 1 && crudini --set /etc/dnf/dnf.conf main plugins 1 && crudini --set /etc/dnf/dnf.conf main skip_missing_names_on_install False && crudini --set /etc/dnf/dnf.conf main tsflags nodocs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:09:16.454967918Z",
                                                                              "created_by": "/bin/sh -c dnf install -y ca-certificates dumb-init glibc-langpack-en procps-ng python3 sudo util-linux-user which python-tcib-containers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:09:16.773383448Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/uid_gid_manage.sh /usr/local/bin/uid_gid_manage",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:09:17.106005079Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/uid_gid_manage",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:09:17.70903377Z",
                                                                              "created_by": "/bin/sh -c bash /usr/local/bin/uid_gid_manage kolla hugetlbfs libvirt qemu",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:09:18.031262928Z",
                                                                              "created_by": "/bin/sh -c touch /usr/local/bin/kolla_extend_start && chmod 755 /usr/local/bin/kolla_extend_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:09:18.339397779Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/set_configs.py /usr/local/bin/kolla_set_configs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:09:18.685304171Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_set_configs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:09:18.995385131Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/start.sh /usr/local/bin/kolla_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:09:19.318437706Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:09:19.622355571Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/httpd_setup.sh /usr/local/bin/kolla_httpd_setup",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:09:19.942779192Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_httpd_setup",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:09:20.272959154Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/copy_cacerts.sh /usr/local/bin/kolla_copy_cacerts",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:09:20.574527009Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_copy_cacerts",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:09:20.904983206Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/sudoers /etc/sudoers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:09:21.231560784Z",
                                                                              "created_by": "/bin/sh -c chmod 440 /etc/sudoers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:09:21.544724487Z",
                                                                              "created_by": "/bin/sh -c sed -ri '/^(passwd:|group:)/ s/systemd//g' /etc/nsswitch.conf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:09:24.726828741Z",
                                                                              "created_by": "/bin/sh -c dnf -y reinstall which && rpm -e --nodeps tzdata && dnf -y install tzdata",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:09:25.052065401Z",
                                                                              "created_by": "/bin/sh -c if [ ! -f \"/etc/localtime\" ]; then ln -s /usr/share/zoneinfo/Etc/UTC /etc/localtime; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:09:25.374537445Z",
                                                                              "created_by": "/bin/sh -c mkdir -p /openstack",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:09:26.855611087Z",
                                                                              "created_by": "/bin/sh -c if [ 'centos' == 'centos' ];then if [ -n \"$(rpm -qa redhat-release)\" ];then rpm -e --nodeps redhat-release; fi ; dnf -y install centos-stream-release; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:09:28.628718632Z",
                                                                              "created_by": "/bin/sh -c dnf update --excludepkgs redhat-release -y && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:09:28.628779184Z",
                                                                              "created_by": "/bin/sh -c #(nop) STOPSIGNAL SIGTERM",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:09:28.628797064Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENTRYPOINT [\"dumb-init\", \"--single-child\", \"--\"]",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:09:28.628808854Z",
                                                                              "created_by": "/bin/sh -c #(nop) CMD [\"kolla_start\"]",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:09:29.517110337Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"8419493e1fd846703d277695e03fc5eb\""
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:11:21.746093163Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-base:8419493e1fd846703d277695e03fc5eb",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:11:58.628150825Z",
                                                                              "created_by": "/bin/sh -c dnf install -y python3-barbicanclient python3-cinderclient python3-designateclient python3-glanceclient python3-ironicclient python3-keystoneclient python3-manilaclient python3-neutronclient python3-novaclient python3-observabilityclient python3-octaviaclient python3-openstackclient python3-swiftclient python3-pymemcache && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:12:01.105956567Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"8419493e1fd846703d277695e03fc5eb\""
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:13:29.248957086Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-os:8419493e1fd846703d277695e03fc5eb",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:13:30.166975367Z",
                                                                              "created_by": "/bin/sh -c bash /usr/local/bin/uid_gid_manage ceilometer",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:14:33.291090294Z",
                                                                              "created_by": "/bin/sh -c dnf -y install openstack-ceilometer-common && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:14:38.958489474Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"8419493e1fd846703d277695e03fc5eb\""
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:19:09.32005903Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-ceilometer-base:8419493e1fd846703d277695e03fc5eb",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:19:44.773972611Z",
                                                                              "created_by": "/bin/sh -c dnf -y install openstack-ceilometer-compute && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:19:46.486024786Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"8419493e1fd846703d277695e03fc5eb\""
                                                                         }
                                                                    ],
                                                                    "NamesHistory": [
                                                                         "quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified"
                                                                    ]
                                                               }
                                                          ]
                                                          : quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified
Feb 23 09:28:48 np0005626463.localdomain podman[237573]: 2026-02-23 09:28:48.214415074 +0000 UTC m=+0.096045581 container remove 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, name=rhosp-rhel9/openstack-ceilometer-compute, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, batch=17.1_20260112.1, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_compute, 
url=https://www.redhat.com, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T23:07:47Z, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-ceilometer-compute-container, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git, build-date=2026-01-12T23:07:47Z, io.buildah.version=1.41.5, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.openshift.expose-services=)
Feb 23 09:28:48 np0005626463.localdomain python3[237523]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman rm --force ceilometer_agent_compute
Feb 23 09:28:48 np0005626463.localdomain podman[237588]: 
Feb 23 09:28:48 np0005626463.localdomain podman[237588]: 2026-02-23 09:28:48.322229767 +0000 UTC m=+0.087152846 container create be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, config_id=ceilometer_agent_compute, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20260216, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, 
org.label-schema.license=GPLv2)
Feb 23 09:28:48 np0005626463.localdomain podman[237588]: 2026-02-23 09:28:48.281239501 +0000 UTC m=+0.046162630 image pull  quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified
Feb 23 09:28:48 np0005626463.localdomain python3[237523]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ceilometer_agent_compute --conmon-pidfile /run/ceilometer_agent_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env OS_ENDPOINT_TYPE=internal --env EDPM_CONFIG_HASH=a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0 --healthcheck-command /openstack/healthcheck compute --label config_id=ceilometer_agent_compute --label container_name=ceilometer_agent_compute --label managed_by=edpm_ansible --label config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']} --log-driver journald --log-level info --network host --security-opt label:type:ceilometer_polling_t --user ceilometer 
--volume /var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z --volume /var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z --volume /run/libvirt:/run/libvirt:shared,ro --volume /etc/hosts:/etc/hosts:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /dev/log:/dev/log --volume /var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified kolla_start
Feb 23 09:28:48 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37544 DF PROTO=TCP SPT=52326 DPT=9101 SEQ=852903233 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF3D4060000000001030307) 
Feb 23 09:28:48 np0005626463.localdomain sudo[237521]: pam_unix(sudo:session): session closed for user root
Feb 23 09:28:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:28:48.527 163572 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 09:28:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:28:48.528 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 09:28:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:28:48.529 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 09:28:48 np0005626463.localdomain sudo[237733]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zaaqcahmkyqaqwimehljbvwgojnieegs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838928.6900811-1305-223231118684553/AnsiballZ_stat.py
Feb 23 09:28:48 np0005626463.localdomain sudo[237733]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:28:49 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:28:49.158 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:28:49 np0005626463.localdomain python3.9[237735]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 23 09:28:49 np0005626463.localdomain sudo[237733]: pam_unix(sudo:session): session closed for user root
Feb 23 09:28:49 np0005626463.localdomain sshd[237756]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:28:49 np0005626463.localdomain sudo[237847]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ssaixcktutujbhnvxktlvuncgcebjzob ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838929.5452757-1332-112063455879843/AnsiballZ_file.py
Feb 23 09:28:49 np0005626463.localdomain sudo[237847]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:28:49 np0005626463.localdomain sshd[237850]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:28:49 np0005626463.localdomain sshd[237756]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:28:49 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.
Feb 23 09:28:50 np0005626463.localdomain python3.9[237849]: ansible-file Invoked with path=/etc/systemd/system/edpm_ceilometer_agent_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:28:50 np0005626463.localdomain sudo[237847]: pam_unix(sudo:session): session closed for user root
Feb 23 09:28:50 np0005626463.localdomain systemd[1]: tmp-crun.jrKssa.mount: Deactivated successfully.
Feb 23 09:28:50 np0005626463.localdomain podman[237852]: 2026-02-23 09:28:50.100852886 +0000 UTC m=+0.111456533 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Feb 23 09:28:50 np0005626463.localdomain sshd[237850]: Invalid user  from 129.146.81.203 port 34954
Feb 23 09:28:50 np0005626463.localdomain podman[237852]: 2026-02-23 09:28:50.171910679 +0000 UTC m=+0.182514276 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Feb 23 09:28:50 np0005626463.localdomain systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully.
Feb 23 09:28:50 np0005626463.localdomain sudo[237929]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cmoueaatqhhizsbctdplupdygalrytur ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838929.5452757-1332-112063455879843/AnsiballZ_stat.py
Feb 23 09:28:50 np0005626463.localdomain sudo[237929]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:28:50 np0005626463.localdomain python3.9[237931]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ceilometer_agent_compute_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 23 09:28:50 np0005626463.localdomain sudo[237929]: pam_unix(sudo:session): session closed for user root
Feb 23 09:28:50 np0005626463.localdomain sshd[238002]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:28:50 np0005626463.localdomain sudo[238039]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xpuasdzlwadwxyhcptgihcejqekcnbcr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838930.5341384-1332-104463590051381/AnsiballZ_copy.py
Feb 23 09:28:50 np0005626463.localdomain sudo[238039]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:28:51 np0005626463.localdomain python3.9[238041]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1771838930.5341384-1332-104463590051381/source dest=/etc/systemd/system/edpm_ceilometer_agent_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:28:51 np0005626463.localdomain sudo[238039]: pam_unix(sudo:session): session closed for user root
Feb 23 09:28:51 np0005626463.localdomain sudo[238095]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vswxpqzxulrptxbclsfjrieistkrvnus ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838930.5341384-1332-104463590051381/AnsiballZ_systemd.py
Feb 23 09:28:51 np0005626463.localdomain sudo[238095]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:28:51 np0005626463.localdomain sshd[238002]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:28:51 np0005626463.localdomain python3.9[238097]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 23 09:28:52 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 09:28:52 np0005626463.localdomain systemd-rc-local-generator[238119]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 09:28:52 np0005626463.localdomain systemd-sysv-generator[238125]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 09:28:52 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:28:52 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 23 09:28:52 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:28:52 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:28:52 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 09:28:52 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 23 09:28:52 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:28:52 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:28:52 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:28:52 np0005626463.localdomain sudo[238095]: pam_unix(sudo:session): session closed for user root
Feb 23 09:28:52 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:28:52.567 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:28:52 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:28:52.568 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:28:52 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:28:52.568 231725 DEBUG nova.compute.manager [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 23 09:28:52 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:28:52.569 231725 DEBUG nova.compute.manager [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 23 09:28:52 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:28:52.847 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:28:53 np0005626463.localdomain sudo[238186]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sxhqahzyzvlunlilgqfzyxsqttaszgns ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838930.5341384-1332-104463590051381/AnsiballZ_systemd.py
Feb 23 09:28:53 np0005626463.localdomain sudo[238186]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:28:53 np0005626463.localdomain python3.9[238188]: ansible-systemd Invoked with state=restarted name=edpm_ceilometer_agent_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 09:28:54 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33098 DF PROTO=TCP SPT=38972 DPT=9102 SEQ=2893062852 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF3EA4F0000000001030307) 
Feb 23 09:28:54 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:28:54.155 231725 DEBUG oslo_concurrency.lockutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Acquiring lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 23 09:28:54 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:28:54.156 231725 DEBUG oslo_concurrency.lockutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Acquired lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 23 09:28:54 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:28:54.157 231725 DEBUG nova.network.neutron [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 23 09:28:54 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:28:54.157 231725 DEBUG nova.objects.instance [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c2a7d92b-952f-46a7-8a6a-3322a48fcf4b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 23 09:28:54 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:28:54.201 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:28:54 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 09:28:54 np0005626463.localdomain systemd-rc-local-generator[238216]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 09:28:54 np0005626463.localdomain systemd-sysv-generator[238221]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 09:28:54 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45656 DF PROTO=TCP SPT=52602 DPT=9882 SEQ=2966039936 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF3EC060000000001030307) 
Feb 23 09:28:54 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:28:54 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 23 09:28:54 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:28:54 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:28:54 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 09:28:54 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 23 09:28:54 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:28:54 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:28:54 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:28:54 np0005626463.localdomain systemd[1]: Starting ceilometer_agent_compute container...
Feb 23 09:28:54 np0005626463.localdomain systemd[1]: tmp-crun.NSQ7oL.mount: Deactivated successfully.
Feb 23 09:28:54 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 09:28:54 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/98c605f16099a73d17612c0de493d249ffc00e6e9776a3ef56a0fc291082227a/merged/var/lib/kolla/config_files/config.json supports timestamps until 2038 (0x7fffffff)
Feb 23 09:28:54 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/98c605f16099a73d17612c0de493d249ffc00e6e9776a3ef56a0fc291082227a/merged/var/lib/kolla/config_files/src supports timestamps until 2038 (0x7fffffff)
Feb 23 09:28:54 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:28:54.896 231725 DEBUG nova.network.neutron [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Updating instance_info_cache with network_info: [{"id": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "address": "fa:16:3e:a0:9d:00", "network": {"id": "9da5b53d-3184-450f-9a5b-bdba1a6c9f6d", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "37b8098efb0d4ecc90b451a2db0e966f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa27e5011-20", "ovs_interfaceid": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 23 09:28:54 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:28:54.919 231725 DEBUG oslo_concurrency.lockutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Releasing lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 23 09:28:54 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:28:54.919 231725 DEBUG nova.compute.manager [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 23 09:28:54 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.
Feb 23 09:28:54 np0005626463.localdomain podman[238229]: 2026-02-23 09:28:54.922205813 +0000 UTC m=+0.168914870 container init be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.43.0, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Feb 23 09:28:54 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:28:54.920 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:28:54 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:28:54.920 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:28:54 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:28:54.921 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:28:54 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:28:54.921 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:28:54 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:28:54.921 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:28:54 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:28:54.922 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:28:54 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:28:54.922 231725 DEBUG nova.compute.manager [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 23 09:28:54 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:28:54.922 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:28:54 np0005626463.localdomain ceilometer_agent_compute[238244]: + sudo -E kolla_set_configs
Feb 23 09:28:54 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:28:54.943 231725 DEBUG oslo_concurrency.lockutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 09:28:54 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:28:54.944 231725 DEBUG oslo_concurrency.lockutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 09:28:54 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:28:54.944 231725 DEBUG oslo_concurrency.lockutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 09:28:54 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:28:54.944 231725 DEBUG nova.compute.resource_tracker [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Auditing locally available compute resources for np0005626463.localdomain (node: np0005626463.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 23 09:28:54 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:28:54.945 231725 DEBUG oslo_concurrency.processutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 09:28:54 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.
Feb 23 09:28:54 np0005626463.localdomain sudo[238250]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Feb 23 09:28:54 np0005626463.localdomain ceilometer_agent_compute[238244]: sudo: unable to send audit message: Operation not permitted
Feb 23 09:28:54 np0005626463.localdomain sudo[238250]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Feb 23 09:28:54 np0005626463.localdomain sudo[238250]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Feb 23 09:28:54 np0005626463.localdomain podman[238229]: 2026-02-23 09:28:54.958951051 +0000 UTC m=+0.205660058 container start be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, io.buildah.version=1.43.0)
Feb 23 09:28:54 np0005626463.localdomain podman[238229]: ceilometer_agent_compute
Feb 23 09:28:54 np0005626463.localdomain systemd[1]: Started ceilometer_agent_compute container.
Feb 23 09:28:55 np0005626463.localdomain sudo[238186]: pam_unix(sudo:session): session closed for user root
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: INFO:__main__:Validating config file
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: INFO:__main__:Copying service configuration files
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: INFO:__main__:Copying /var/lib/kolla/config_files/src/ceilometer.conf to /etc/ceilometer/ceilometer.conf
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: INFO:__main__:Deleting /etc/ceilometer/polling.yaml
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: INFO:__main__:Copying /var/lib/kolla/config_files/src/polling.yaml to /etc/ceilometer/polling.yaml
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: INFO:__main__:Setting permission for /etc/ceilometer/polling.yaml
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: INFO:__main__:Copying /var/lib/kolla/config_files/src/custom.conf to /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: INFO:__main__:Copying /var/lib/kolla/config_files/src/ceilometer-host-specific.conf to /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: INFO:__main__:Writing out command to execute
Feb 23 09:28:55 np0005626463.localdomain sudo[238250]: pam_unix(sudo:session): session closed for user root
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: ++ cat /run_command
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: + CMD='/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: + ARGS=
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: + sudo kolla_copy_cacerts
Feb 23 09:28:55 np0005626463.localdomain sudo[238277]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_copy_cacerts
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: sudo: unable to send audit message: Operation not permitted
Feb 23 09:28:55 np0005626463.localdomain sudo[238277]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Feb 23 09:28:55 np0005626463.localdomain sudo[238277]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Feb 23 09:28:55 np0005626463.localdomain sudo[238277]: pam_unix(sudo:session): session closed for user root
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: + [[ ! -n '' ]]
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: + . kolla_extend_start
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: + echo 'Running command: '\''/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'\'''
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: Running command: '/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: + umask 0022
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: + exec /usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout
Feb 23 09:28:55 np0005626463.localdomain podman[238253]: 2026-02-23 09:28:55.076039131 +0000 UTC m=+0.107707771 container health_status be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=starting, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.43.0, managed_by=edpm_ansible, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260216)
Feb 23 09:28:55 np0005626463.localdomain podman[238253]: 2026-02-23 09:28:55.108338975 +0000 UTC m=+0.140007615 container exec_died be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260216, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0)
Feb 23 09:28:55 np0005626463.localdomain podman[238253]: unhealthy
Feb 23 09:28:55 np0005626463.localdomain systemd[1]: be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.service: Main process exited, code=exited, status=1/FAILURE
Feb 23 09:28:55 np0005626463.localdomain systemd[1]: be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.service: Failed with result 'exit-code'.
Feb 23 09:28:55 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:28:55.443 231725 DEBUG oslo_concurrency.processutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.498s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 09:28:55 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:28:55.513 231725 DEBUG nova.virt.libvirt.driver [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 23 09:28:55 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:28:55.514 231725 DEBUG nova.virt.libvirt.driver [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 23 09:28:55 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:28:55.747 231725 WARNING nova.virt.libvirt.driver [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 23 09:28:55 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:28:55.749 231725 DEBUG nova.compute.resource_tracker [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Hypervisor/Node resource view: name=np0005626463.localdomain free_ram=12891MB free_disk=41.83688735961914GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": 
null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 23 09:28:55 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:28:55.750 231725 DEBUG oslo_concurrency.lockutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 09:28:55 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:28:55.751 231725 DEBUG oslo_concurrency.lockutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 09:28:55 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:28:55.809 231725 DEBUG nova.compute.resource_tracker [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Instance c2a7d92b-952f-46a7-8a6a-3322a48fcf4b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 23 09:28:55 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:28:55.810 231725 DEBUG nova.compute.resource_tracker [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 23 09:28:55 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:28:55.811 231725 DEBUG nova.compute.resource_tracker [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Final resource view: name=np0005626463.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 23 09:28:55 np0005626463.localdomain systemd[1]: tmp-crun.RWJjAi.mount: Deactivated successfully.
Feb 23 09:28:55 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:28:55.851 231725 DEBUG oslo_concurrency.processutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.898 2 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_manager_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:40
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.899 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.899 2 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.899 2 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.899 2 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.899 2 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.899 2 DEBUG cotyledon.oslo_config_glue [-] batch_size                     = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.899 2 DEBUG cotyledon.oslo_config_glue [-] cfg_file                       = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.899 2 DEBUG cotyledon.oslo_config_glue [-] config_dir                     = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.899 2 DEBUG cotyledon.oslo_config_glue [-] config_file                    = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.899 2 DEBUG cotyledon.oslo_config_glue [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.899 2 DEBUG cotyledon.oslo_config_glue [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.900 2 DEBUG cotyledon.oslo_config_glue [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.900 2 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file        = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.900 2 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.900 2 DEBUG cotyledon.oslo_config_glue [-] host                           = np0005626463.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.900 2 DEBUG cotyledon.oslo_config_glue [-] http_timeout                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.900 2 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector           = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.900 2 DEBUG cotyledon.oslo_config_glue [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.900 2 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.900 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_type                   = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.900 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri                    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.901 2 DEBUG cotyledon.oslo_config_glue [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.901 2 DEBUG cotyledon.oslo_config_glue [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.901 2 DEBUG cotyledon.oslo_config_glue [-] log_dir                        = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.901 2 DEBUG cotyledon.oslo_config_glue [-] log_file                       = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.901 2 DEBUG cotyledon.oslo_config_glue [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.901 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.901 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.901 2 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.901 2 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.901 2 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.901 2 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.901 2 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.901 2 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.902 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.902 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.902 2 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests          = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.902 2 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.902 2 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file              = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.902 2 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces             = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.902 2 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs     = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.902 2 DEBUG cotyledon.oslo_config_glue [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.902 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.902 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.902 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.902 2 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix                = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.902 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.903 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length       = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.903 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace    = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.903 2 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config                = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.903 2 DEBUG cotyledon.oslo_config_glue [-] sample_source                  = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.903 2 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.903 2 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.903 2 DEBUG cotyledon.oslo_config_glue [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.903 2 DEBUG cotyledon.oslo_config_glue [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.903 2 DEBUG cotyledon.oslo_config_glue [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.903 2 DEBUG cotyledon.oslo_config_glue [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.903 2 DEBUG cotyledon.oslo_config_glue [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.903 2 DEBUG cotyledon.oslo_config_glue [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.904 2 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.904 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry  = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.904 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.904 2 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url       = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.904 2 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file     = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.904 2 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.904 2 DEBUG cotyledon.oslo_config_glue [-] event.store_raw                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.904 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.904 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.904 2 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs   = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.904 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.904 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path           = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.905 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.905 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.905 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count            = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.905 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries      = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.905 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.905 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.905 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout          = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.905 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.905 2 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.905 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries     = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.905 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.905 2 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version      = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.906 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.906 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.906 2 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.906 2 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.906 2 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.906 2 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.906 2 DEBUG cotyledon.oslo_config_glue [-] monasca.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.906 2 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.906 2 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings       = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.906 2 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.906 2 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.906 2 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.906 2 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.907 2 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.907 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size        = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.907 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.907 2 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls    = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.907 2 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.907 2 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines         = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.907 2 DEBUG cotyledon.oslo_config_glue [-] notification.workers           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.907 2 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size             = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.907 2 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file               = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.907 2 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.908 2 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.908 2 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.908 2 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret     = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.908 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.908 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.908 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.908 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.908 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.908 2 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.908 2 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder           = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.908 2 DEBUG cotyledon.oslo_config_glue [-] service_types.glance           = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.908 2 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron          = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.909 2 DEBUG cotyledon.oslo_config_glue [-] service_types.nova             = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.909 2 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.909 2 DEBUG cotyledon.oslo_config_glue [-] service_types.swift            = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.909 2 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.909 2 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.909 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip                 = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.909 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.909 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.909 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.909 2 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.909 2 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.909 2 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.910 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.910 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type  = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.910 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.910 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.910 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.910 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.910 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface  = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.910 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.910 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.910 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.910 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.911 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.911 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.911 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.911 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.911 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.911 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.911 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.911 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.911 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.911 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.911 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.911 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section             = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.912 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.912 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.912 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.912 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.912 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.912 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface                = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.912 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.912 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.912 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.912 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.912 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.929 12 INFO ceilometer.polling.manager [-] Looking for dynamic pollsters configurations at [['/etc/ceilometer/pollsters.d']].
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.931 12 INFO ceilometer.polling.manager [-] No dynamic pollsters found in folder [/etc/ceilometer/pollsters.d].
Feb 23 09:28:55 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:55.932 12 INFO ceilometer.polling.manager [-] No dynamic pollsters file found in dirs [['/etc/ceilometer/pollsters.d']].
Feb 23 09:28:56 np0005626463.localdomain python3.9[238401]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.038 12 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/utils.py:93
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.096 12 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:48
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.097 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.097 12 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.097 12 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.097 12 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.097 12 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.097 12 DEBUG cotyledon.oslo_config_glue [-] batch_size                     = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.097 12 DEBUG cotyledon.oslo_config_glue [-] cfg_file                       = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.097 12 DEBUG cotyledon.oslo_config_glue [-] config_dir                     = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.097 12 DEBUG cotyledon.oslo_config_glue [-] config_file                    = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.097 12 DEBUG cotyledon.oslo_config_glue [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.097 12 DEBUG cotyledon.oslo_config_glue [-] control_exchange               = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.097 12 DEBUG cotyledon.oslo_config_glue [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.098 12 DEBUG cotyledon.oslo_config_glue [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.098 12 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file        = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.098 12 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.098 12 DEBUG cotyledon.oslo_config_glue [-] host                           = np0005626463.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.098 12 DEBUG cotyledon.oslo_config_glue [-] http_timeout                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.098 12 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector           = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.098 12 DEBUG cotyledon.oslo_config_glue [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.098 12 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.098 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_type                   = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.098 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri                    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.099 12 DEBUG cotyledon.oslo_config_glue [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.099 12 DEBUG cotyledon.oslo_config_glue [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.099 12 DEBUG cotyledon.oslo_config_glue [-] log_dir                        = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.099 12 DEBUG cotyledon.oslo_config_glue [-] log_file                       = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.099 12 DEBUG cotyledon.oslo_config_glue [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.099 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.099 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.099 12 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.099 12 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.099 12 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.099 12 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.099 12 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.099 12 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.100 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.100 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.100 12 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests          = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.100 12 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.100 12 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file              = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.100 12 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces             = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.100 12 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs     = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.100 12 DEBUG cotyledon.oslo_config_glue [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.100 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.100 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.100 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.100 12 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix                = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.100 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.101 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length       = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.101 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace    = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.101 12 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config                = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.101 12 DEBUG cotyledon.oslo_config_glue [-] sample_source                  = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.101 12 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.101 12 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.101 12 DEBUG cotyledon.oslo_config_glue [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.101 12 DEBUG cotyledon.oslo_config_glue [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.101 12 DEBUG cotyledon.oslo_config_glue [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.101 12 DEBUG cotyledon.oslo_config_glue [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.101 12 DEBUG cotyledon.oslo_config_glue [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.101 12 DEBUG cotyledon.oslo_config_glue [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.101 12 DEBUG cotyledon.oslo_config_glue [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.102 12 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.102 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry  = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.102 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.102 12 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url       = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.102 12 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file     = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.102 12 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.102 12 DEBUG cotyledon.oslo_config_glue [-] event.store_raw                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.102 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.102 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.102 12 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs   = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.102 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.102 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path           = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.103 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.103 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.103 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count            = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.103 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries      = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.103 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.103 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.103 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout          = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.103 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.103 12 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.103 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries     = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.103 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.103 12 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version      = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.104 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.104 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.104 12 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.104 12 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.104 12 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.104 12 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.104 12 DEBUG cotyledon.oslo_config_glue [-] monasca.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.104 12 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.104 12 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings       = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.104 12 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.104 12 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.104 12 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.104 12 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.105 12 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.105 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size        = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.105 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.105 12 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls    = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.105 12 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.105 12 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines         = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.105 12 DEBUG cotyledon.oslo_config_glue [-] notification.workers           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.105 12 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size             = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.105 12 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file               = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.105 12 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.105 12 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.106 12 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.106 12 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret     = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.106 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.106 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.106 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.106 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.106 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.106 12 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.106 12 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder           = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.106 12 DEBUG cotyledon.oslo_config_glue [-] service_types.glance           = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.106 12 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron          = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.106 12 DEBUG cotyledon.oslo_config_glue [-] service_types.nova             = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.107 12 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.107 12 DEBUG cotyledon.oslo_config_glue [-] service_types.swift            = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.107 12 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.107 12 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.107 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip                 = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.107 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.107 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.107 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.107 12 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.107 12 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.107 12 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.107 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.107 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type  = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.108 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_url   = http://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.108 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.108 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.108 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.108 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.108 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.108 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_id  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.108 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.108 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.108 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface  = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.108 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.108 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.password   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.108 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.109 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.109 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.109 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_name = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.109 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.109 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.109 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.system_scope = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.109 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.109 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.trust_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.109 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.109 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.109 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.109 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.username   = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.109 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.110 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.110 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.110 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.110 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.110 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.110 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.110 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.110 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.110 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.110 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.110 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section             = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.110 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.110 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.111 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.111 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.111 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.111 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface                = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.111 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.111 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.111 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.111 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.111 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.111 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.111 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.111 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.111 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.112 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.112 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.112 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.112 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.112 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.112 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.112 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.112 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.112 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.112 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.112 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.112 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.112 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.113 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.113 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.113 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.113 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.113 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.113 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.113 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.113 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.113 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.113 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.113 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.113 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.114 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.114 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.114 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.114 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.114 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.114 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.114 12 DEBUG cotyledon._service [-] Run service AgentManager(0) [12] wait_forever /usr/lib/python3.9/site-packages/cotyledon/_service.py:241
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.117 12 DEBUG ceilometer.agent [-] Config file: {'sources': [{'name': 'pollsters', 'interval': 120, 'meters': ['power.state', 'cpu', 'memory.usage', 'disk.*', 'network.*']}]} load_config /usr/lib/python3.9/site-packages/ceilometer/agent.py:64
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.126 12 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/utils.py:93
Feb 23 09:28:56 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:28:56.332 231725 DEBUG oslo_concurrency.processutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.481s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 09:28:56 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:28:56.340 231725 DEBUG nova.compute.provider_tree [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Inventory has not changed in ProviderTree for provider: be63d86c-a403-4ec9-a515-07ea2962cb4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 23 09:28:56 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:28:56.361 231725 DEBUG nova.scheduler.client.report [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Inventory has not changed for provider be63d86c-a403-4ec9-a515-07ea2962cb4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 23 09:28:56 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:28:56.363 231725 DEBUG nova.compute.resource_tracker [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Compute_service record updated for np0005626463.localdomain:np0005626463.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 23 09:28:56 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:28:56.364 231725 DEBUG oslo_concurrency.lockutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.613s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.445 12 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET http://nova-internal.openstack.svc:8774/v2.1/flavors?is_public=None -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}71b902e33c3724a255efa786304ad2a66f06dcaf173357e60edcc456b777981e" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:519
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.540 12 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 327 Content-Type: application/json Date: Mon, 23 Feb 2026 09:28:56 GMT Keep-Alive: timeout=5, max=100 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-f8d9a3ac-3f21-4bef-b3b9-ff529d2dccde x-openstack-request-id: req-f8d9a3ac-3f21-4bef-b3b9-ff529d2dccde _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:550
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.540 12 DEBUG novaclient.v2.client [-] RESP BODY: {"flavors": [{"id": "c13b1f72-534e-4f1d-8659-0e8f3a2c7d53", "name": "m1.small", "links": [{"rel": "self", "href": "http://nova-internal.openstack.svc:8774/v2.1/flavors/c13b1f72-534e-4f1d-8659-0e8f3a2c7d53"}, {"rel": "bookmark", "href": "http://nova-internal.openstack.svc:8774/flavors/c13b1f72-534e-4f1d-8659-0e8f3a2c7d53"}]}]} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:582
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.540 12 DEBUG novaclient.v2.client [-] GET call to compute for http://nova-internal.openstack.svc:8774/v2.1/flavors?is_public=None used request id req-f8d9a3ac-3f21-4bef-b3b9-ff529d2dccde request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:954
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.542 12 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET http://nova-internal.openstack.svc:8774/v2.1/flavors/c13b1f72-534e-4f1d-8659-0e8f3a2c7d53 -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}71b902e33c3724a255efa786304ad2a66f06dcaf173357e60edcc456b777981e" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:519
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.577 12 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 494 Content-Type: application/json Date: Mon, 23 Feb 2026 09:28:56 GMT Keep-Alive: timeout=5, max=99 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-456ee6ab-6653-4450-8dba-69fa1d7f9451 x-openstack-request-id: req-456ee6ab-6653-4450-8dba-69fa1d7f9451 _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:550
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.577 12 DEBUG novaclient.v2.client [-] RESP BODY: {"flavor": {"id": "c13b1f72-534e-4f1d-8659-0e8f3a2c7d53", "name": "m1.small", "ram": 512, "disk": 1, "swap": "", "OS-FLV-EXT-DATA:ephemeral": 1, "OS-FLV-DISABLED:disabled": false, "vcpus": 1, "os-flavor-access:is_public": true, "rxtx_factor": 1.0, "links": [{"rel": "self", "href": "http://nova-internal.openstack.svc:8774/v2.1/flavors/c13b1f72-534e-4f1d-8659-0e8f3a2c7d53"}, {"rel": "bookmark", "href": "http://nova-internal.openstack.svc:8774/flavors/c13b1f72-534e-4f1d-8659-0e8f3a2c7d53"}]}} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:582
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.577 12 DEBUG novaclient.v2.client [-] GET call to compute for http://nova-internal.openstack.svc:8774/v2.1/flavors/c13b1f72-534e-4f1d-8659-0e8f3a2c7d53 used request id req-456ee6ab-6653-4450-8dba-69fa1d7f9451 request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:954
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.579 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'name': 'test', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000003', 'OS-EXT-SRV-ATTR:host': 'np0005626463.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '37b8098efb0d4ecc90b451a2db0e966f', 'user_id': 'cb6895487918456aa599ca2f76872d00', 'hostId': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.580 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.584 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for c2a7d92b-952f-46a7-8a6a-3322a48fcf4b / tapa27e5011-20 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.585 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.592 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3166483f-b5ae-4869-a31c-09cfb8d39d5d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:28:56.580262', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '13a624fc-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10376.769868374, 'message_signature': '608bd9394af428f59823cf4611c86afade365068ec5acea26d57ffa6ecf412cb'}]}, 'timestamp': '2026-02-23 09:28:56.586388', '_unique_id': '14b8e8c22d5742809e0d7ac9fda605e3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.592 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.592 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.592 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.592 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.592 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.592 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.592 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.592 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.592 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.592 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.592 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.592 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.592 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.592 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.592 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.592 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.592 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.592 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.592 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.592 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.592 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.592 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.592 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.592 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.592 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.592 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.592 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.592 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.592 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.592 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.592 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.596 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.596 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.598 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ee1d4655-e9c3-442b-ba46-9462cb199d33', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:28:56.596638', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '13a7d0a4-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10376.769868374, 'message_signature': '681ca05176c523362b2d7635c791e3b006aee9e02be8e2a498b5b636319b0f2a'}]}, 'timestamp': '2026-02-23 09:28:56.597174', '_unique_id': '29321bbe5ffa4cb9aff258f49e20df88'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.598 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.598 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.598 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.598 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.598 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.598 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.598 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.598 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.598 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.598 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.598 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.598 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.598 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.598 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.598 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.598 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.598 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.598 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.598 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.598 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.598 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.598 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.598 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.598 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.598 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.598 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.598 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.598 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.598 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.598 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.598 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.599 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.642 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.latency volume: 260974500 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.643 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.latency volume: 24478467 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.645 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1164fa8a-92ee-4db0-a401-e86133a81bbf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 260974500, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:28:56.599388', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '13aedf3e-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10376.789014896, 'message_signature': 'f2ec3b009e15af851331cb084ad2a418a418b38354083fc5a7bae5db385a06fb'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 24478467, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 
'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:28:56.599388', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '13aef1d6-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10376.789014896, 'message_signature': 'e2fd614387d45ff82b00c8abcfc97d40269090402f621bc0ebabb4d4eafd8d30'}]}, 'timestamp': '2026-02-23 09:28:56.643856', '_unique_id': '1c61d687232e42d39894582bd57b85c7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.645 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.645 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.645 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.645 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.645 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.645 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.645 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.645 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.645 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.645 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.645 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.645 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.645 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.645 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.645 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.645 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.645 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.645 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.645 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.645 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.645 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.645 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.645 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.645 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.645 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.645 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.645 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.645 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.645 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.645 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.645 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.646 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.646 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.646 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: test>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: test>]
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.647 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.671 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/memory.usage volume: 52.38671875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.673 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f9890a5f-4dfa-4698-8cd5-a301ec761229', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 52.38671875, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'timestamp': '2026-02-23T09:28:56.647447', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '13b349f2-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10376.860875283, 'message_signature': '75cda7424380223544d9217981b48c343ca80666bb83ebb1f2dce439286b8660'}]}, 'timestamp': '2026-02-23 09:28:56.672335', '_unique_id': '833c41c590de400f85db56ff38b73929'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.673 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.673 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.673 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.673 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.673 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.673 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.673 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.673 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.673 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.673 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.673 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.673 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.673 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.673 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.673 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.673 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.673 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.673 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.673 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.673 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.673 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.673 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.673 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.673 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.673 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.673 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.673 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.673 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.673 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.673 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.673 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.674 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.674 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.latency volume: 1234377028 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.675 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.latency volume: 170393160 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.676 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1e5bc2e9-9c87-4ef2-bb10-31689f76cd79', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1234377028, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:28:56.674683', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '13b3b8ba-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10376.789014896, 'message_signature': '26bf72353169f5ec886517afe98614d11e3124ceb5987d0a799df4bab706b469'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 170393160, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:28:56.674683', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '13b3c9a4-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10376.789014896, 'message_signature': '341903462e1627ba0edf7da9504d23b24c538e3cf035026a123e33fce62900c6'}]}, 'timestamp': '2026-02-23 09:28:56.675582', '_unique_id': 'daf247e452eb4e8b83b984cce0c884d1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.676 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.676 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.676 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.676 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.676 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.676 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.676 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.676 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.676 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.676 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.676 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.676 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.676 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.676 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.676 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.676 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.676 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.676 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.676 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.676 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.676 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.676 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.676 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.676 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.676 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.676 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.676 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.676 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.676 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.676 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.676 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.677 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.691 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.691 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.693 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9ec7cc7d-57a7-44b1-b4c7-b03b9ba34518', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:28:56.677820', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '13b6438c-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10376.86744948, 'message_signature': '459f8a0ef0533b70a5e73fe007e2e90fb3b87c916d51dad5c2f797cd167f3649'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:28:56.677820', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '13b656b0-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10376.86744948, 'message_signature': '321b8a8c947381a21674f7f562c30a80bc0b236334a7ac3bf84b04d362c38262'}]}, 'timestamp': '2026-02-23 09:28:56.692304', '_unique_id': 'fcbd2b10d5fe40dca70e7489d4be820d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.693 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.693 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.693 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.693 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.693 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.693 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.693 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.693 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.693 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.693 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.693 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.693 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.693 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.693 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.693 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.693 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.693 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.693 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.693 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.693 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.693 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.693 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.693 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.693 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.693 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.693 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.693 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.693 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.693 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.693 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.693 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.694 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.694 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.696 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6d73b9b5-600e-423a-8ec9-31013c7b8a58', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:28:56.694652', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '13b6c55a-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10376.769868374, 'message_signature': '0fe3ce617d31a54ab84bc4fdd459da809ed860dd0dfda2deb22356287031b640'}]}, 'timestamp': '2026-02-23 09:28:56.695169', '_unique_id': 'd21f168f3d304d3c8287d3655b959f90'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.696 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.696 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.696 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.696 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.696 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.696 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.696 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.696 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.696 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.696 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.696 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.696 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.696 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.696 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.696 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.696 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.696 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.696 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.696 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.696 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.696 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.696 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.696 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.696 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.696 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.696 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.696 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.696 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.696 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.696 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.696 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.697 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.697 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/cpu volume: 52680000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.699 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '30ac61c1-2026-478d-a809-72e9e3df443e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 52680000000, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'timestamp': '2026-02-23T09:28:56.697662', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '13b73a58-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10376.860875283, 'message_signature': '584974cc9e872b7c3d5fa8e2bbd8da6abe555c193a59ad32d822e03036710cc9'}]}, 'timestamp': '2026-02-23 09:28:56.698151', '_unique_id': '14683dafb81d4513bfc5f14bbc12b0b1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.699 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.699 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.699 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.699 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.699 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.699 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.699 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.699 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.699 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.699 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.699 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.699 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.699 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.699 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.699 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.699 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.699 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.699 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.699 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.699 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.699 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.699 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.699 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.699 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.699 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.699 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.699 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.699 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.699 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.699 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.699 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.699 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.699 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.699 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.699 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.699 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.699 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.699 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.699 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.699 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.699 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.699 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.699 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.699 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.699 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.699 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.699 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.699 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.699 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.699 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.699 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.699 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.699 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.699 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.700 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.700 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.700 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: test>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: test>]
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.700 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.700 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.701 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: test>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: test>]
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.701 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.701 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.requests volume: 1064 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.701 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.requests volume: 222 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.703 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b98f2ce0-3227-4647-8170-6c323c7f9e38', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1064, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:28:56.701494', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '13b7cedc-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10376.789014896, 'message_signature': '23ec3df8220c00c3d6a99f9307cf8350292a98436f1a22d6600c3e62a8389fc5'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 222, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 
'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:28:56.701494', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '13b7e17e-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10376.789014896, 'message_signature': '250f5468640ec4c715dcf177023a9c36b44ce90642d8fde285a159ed76b82e84'}]}, 'timestamp': '2026-02-23 09:28:56.702409', '_unique_id': 'b595f8344e1f44fcaf5fe0958ab69017'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.703 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.703 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.703 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.703 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.703 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.703 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.703 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.703 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.703 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.703 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.703 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.703 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.703 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.703 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.703 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.703 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.703 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.703 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.703 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.703 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.703 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.703 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.703 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.703 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.703 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.703 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.703 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.703 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.703 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.703 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.703 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.704 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.704 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.705 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.allocation volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.706 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4b832c26-6b58-423f-ba44-59ad331d0c72', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:28:56.704610', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '13b84880-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10376.86744948, 'message_signature': 'c56ee824a98a215554125655eb5ba92c22b0ccbc78db560d6fcf03d5429b8060'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 
'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:28:56.704610', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '13b85abe-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10376.86744948, 'message_signature': '9169c6624ee2c91b7262cea2558a724b5a766682c4f1ad7746f73b33b0d710de'}]}, 'timestamp': '2026-02-23 09:28:56.705500', '_unique_id': 'f4f1ed5c84804cffb1cbd26c8ae73bf2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.706 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.706 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.706 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.706 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.706 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.706 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.706 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.706 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.706 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.706 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.706 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.706 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.706 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.706 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.706 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.706 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.706 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.706 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.706 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.706 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.706 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.706 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.706 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.706 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.706 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.706 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.706 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.706 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.706 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.706 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.706 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.707 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.707 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.708 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.709 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '242d67a2-868b-45a4-978d-481149eb8e9e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:28:56.707672', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '13b8c206-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10376.86744948, 'message_signature': '92abc52085239afe4f8feba54fb0bf64ab10131f36da5956f9a1bd6d1ba713c3'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 
'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:28:56.707672', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '13b8d296-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10376.86744948, 'message_signature': '83e679fad57baa180ea53c4b798c197627a85ae839658bc8095c6d446dee8751'}]}, 'timestamp': '2026-02-23 09:28:56.708568', '_unique_id': '97ae3d6b61344cedad371b64ab4c7e41'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.709 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.709 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.709 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.709 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.709 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.709 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.709 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.709 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.709 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.709 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.709 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.709 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.709 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.709 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.709 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.709 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.709 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.709 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.709 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.709 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.709 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.709 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.709 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.709 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.709 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.709 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.709 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.709 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.709 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.709 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.709 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.710 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.710 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.712 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f7e41b3f-b3e8-4f15-aa8c-ed6222cb7093', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:28:56.710736', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '13b93966-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10376.769868374, 'message_signature': '70ce9ebf3f820d982eef188355fa5044bb466e29d04c536b5e793d3831f76c14'}]}, 'timestamp': '2026-02-23 09:28:56.711233', '_unique_id': 'a7eb01aae3ca4425a4e4c93ba3814359'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.712 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.712 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.712 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.712 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.712 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.712 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.712 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.712 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.712 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.712 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.712 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.712 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.712 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.712 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.712 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.712 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.712 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.712 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.712 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.712 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.712 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.712 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.712 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.712 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.712 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.712 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.712 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.712 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.712 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.712 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.712 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.712 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.712 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.712 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.712 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.712 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.712 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.712 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.712 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.712 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.712 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.712 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.712 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.712 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.712 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.712 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.712 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.712 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.712 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.712 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.712 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.712 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.712 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.712 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.713 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.713 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.bytes volume: 29130240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.713 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.bytes volume: 4300800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.715 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ed73ba6a-697d-4f8e-a65c-062bf055e647', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29130240, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:28:56.713462', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '13b9a220-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10376.789014896, 'message_signature': '5659da8ed42fcb1123802e2e24e7b39b1268605f6e68409fc83446c8c4713af3'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4300800, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 
'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:28:56.713462', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '13b9b3dc-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10376.789014896, 'message_signature': 'fa26fcb8c4281312b10b7de2e06ee02cb5a60f9f193b4ea92d71ec3aa46f29b6'}]}, 'timestamp': '2026-02-23 09:28:56.714336', '_unique_id': '985968ac393840758315629c38da35c0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.715 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.715 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.715 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.715 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.715 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.715 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.715 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.715 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.715 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.715 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.715 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.715 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.715 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.715 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.715 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.715 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.715 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.715 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.715 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.715 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.715 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.715 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.715 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.715 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.715 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.715 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.715 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.715 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.715 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.715 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.715 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.716 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.716 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.bytes volume: 9216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.717 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b70ae0ab-afa0-4da2-acb7-ad73769261be', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9216, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:28:56.716510', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '13ba1980-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10376.769868374, 'message_signature': '0cb1a4e661e1222a7c0b1444ccdc2e46623b71111cf0047064a06fbb13bbf513'}]}, 'timestamp': '2026-02-23 09:28:56.717008', '_unique_id': '98d23abc3d6742e6be615c60abd6a41a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.717 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.717 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.717 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.717 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.717 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.717 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.717 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.717 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.717 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.717 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.717 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.717 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.717 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.717 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.717 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.717 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.717 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.717 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.717 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.717 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.717 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.717 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.717 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.717 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.717 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.717 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.717 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.717 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.717 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.717 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.717 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.717 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.717 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.717 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.717 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.717 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.717 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.717 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.717 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.717 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.717 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.717 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.717 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.717 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.717 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.717 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.717 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.717 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.717 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.717 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.717 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.717 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.717 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.717 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.718 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.719 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.bytes volume: 74063872 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.719 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.720 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4b6d7c25-3133-4aef-b00d-a82c7f25a007', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 74063872, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:28:56.719132', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '13ba7fa6-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10376.789014896, 'message_signature': '2556778c8e6f89e09ba81860e92fd00f24e99e37eb521a8cb2ed9a10962cc70c'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 
'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:28:56.719132', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '13ba8f82-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10376.789014896, 'message_signature': '051db7e77cdda0b8db69c6334c1dd058bf9167eeafdf7ca0cc7061cd1e1650fa'}]}, 'timestamp': '2026-02-23 09:28:56.719991', '_unique_id': '47c530056c96479a99162deb6ba1d0a6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.720 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.720 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.720 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.720 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.720 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.720 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.720 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.720 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.720 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.720 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.720 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.720 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.720 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.720 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.720 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.720 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.720 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.720 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.720 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.720 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.720 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.720 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.720 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.720 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.720 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.720 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.720 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.720 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.720 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.720 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.720 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.720 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.722 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.722 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.723 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8b39a99a-9353-4618-9c1a-38bd2cd02edc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:28:56.722131', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '13baf56c-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10376.769868374, 'message_signature': '4fc3625e9ddac3d81b50246c218c375a15954672682aa68187b72ab89c45c50d'}]}, 'timestamp': '2026-02-23 09:28:56.722595', '_unique_id': '5c2166e3b28f4bfc9ccb1159578646ee'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.723 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.723 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.723 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.723 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.723 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.723 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.723 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.723 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.723 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.723 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.723 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.723 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.723 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.723 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.723 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.723 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.723 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.723 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.723 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.723 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.723 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.723 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.723 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.723 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.723 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.723 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.723 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.723 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.723 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.723 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.723 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.724 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.724 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.packets volume: 145 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.725 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '78bbe880-a584-4e45-b513-bdb92074c8f5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 145, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:28:56.724457', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '13bb4c92-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10376.769868374, 'message_signature': 'ab2a65d7e4ef6944f0e7b94da42684b3d488a4a40464b94df47936453be7233c'}]}, 'timestamp': '2026-02-23 09:28:56.724741', '_unique_id': '82a872ce6cdf415a90013e9d0cbc2a7f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.725 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.725 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.725 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.725 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.725 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.725 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.725 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.725 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.725 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.725 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.725 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.725 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.725 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.725 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.725 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.725 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.725 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.725 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.725 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.725 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.725 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.725 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.725 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.725 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.725 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.725 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.725 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.725 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.725 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.725 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.725 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.725 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.726 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.packets volume: 87 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.726 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3a8ae280-41c1-4e54-a75f-888105360d53', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 87, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:28:56.726071', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '13bb8b8a-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10376.769868374, 'message_signature': '9cde2a1c2398877605f6e12b5bd351bfd0b02b1bd95722f7c3409895d799dcf5'}]}, 'timestamp': '2026-02-23 09:28:56.726353', '_unique_id': 'c6fdd828b90c442ab950c13b43aa141f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.726 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.726 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.726 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.726 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.726 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.726 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.726 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.726 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.726 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.726 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.726 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.726 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.726 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.726 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.726 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.726 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.726 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.726 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.726 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.726 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.726 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.726 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.726 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.726 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.726 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.726 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.726 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.726 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.726 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.726 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.726 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.727 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.727 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.727 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: test>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: test>]
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.727 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.728 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.requests volume: 577 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.728 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.729 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '49f2c215-5605-4e1f-92e2-e9377e4fa843', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 577, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:28:56.728062', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '13bbd964-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10376.789014896, 'message_signature': '56eb32f5329eb9f1d28ae0d6b93d4841cdcaee2e06a914fc7040fe6c4ae7b810'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 
'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:28:56.728062', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '13bbe378-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10376.789014896, 'message_signature': 'a3075a5471eb4584ab0ebf3a224c71099dda3096b425673ce5ee0fc45530ddd3'}]}, 'timestamp': '2026-02-23 09:28:56.728585', '_unique_id': '980a1984a0e54391944520db90599328'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.729 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.729 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.729 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.729 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.729 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.729 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.729 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.729 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.729 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.729 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.729 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.729 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.729 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.729 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.729 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.729 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.729 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.729 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.729 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.729 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.729 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.729 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.729 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.729 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.729 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.729 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.729 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.729 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.729 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.729 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.729 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.729 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.729 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.730 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '234c1ea5-8dbb-450a-85dc-9a622cc8ce64', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:28:56.729934', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '13bc2298-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10376.769868374, 'message_signature': '09ca23a367877ed586f223c08b3ba4d56f32db66a149c7c2626b1a7a9a95c977'}]}, 'timestamp': '2026-02-23 09:28:56.730219', '_unique_id': 'ff5bdd1c72e8442f9172b790e4a0cb1b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.730 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.730 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.730 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.730 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.730 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.730 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.730 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.730 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.730 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.730 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.730 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.730 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.730 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.730 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.730 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.730 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.730 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.730 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.730 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.730 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.730 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.730 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.730 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.730 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.730 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.730 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.730 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.730 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.730 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.730 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.730 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.730 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.730 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.730 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.730 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.730 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.730 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.730 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.730 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.730 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.730 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.730 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.730 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.730 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.730 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.730 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.730 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.730 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.730 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.730 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.730 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.730 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.730 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.730 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.731 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.731 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.bytes volume: 12784 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.732 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8377d93c-cc0d-414d-a2ea-f443b62c2218', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 12784, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:28:56.731537', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '13bc6104-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10376.769868374, 'message_signature': 'e55d1ac0c06a8836d65e69e5e9a3d7136929bc5a6d1a43cac85c685b70f905ff'}]}, 'timestamp': '2026-02-23 09:28:56.731817', '_unique_id': 'c88c58c9b1e84a90976342c0eb2b44c6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.732 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.732 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.732 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.732 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.732 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.732 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.732 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.732 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.732 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.732 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.732 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.732 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.732 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.732 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.732 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.732 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.732 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.732 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.732 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.732 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.732 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.732 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.732 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.732 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.732 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.732 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.732 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.732 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.732 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.732 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:28:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:28:56.732 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:28:56 np0005626463.localdomain sudo[238537]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fmvgdfmlfcjaheingxcqcjchwkalhsta ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838936.699553-1467-48248085248429/AnsiballZ_stat.py
Feb 23 09:28:56 np0005626463.localdomain sudo[238537]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:28:57 np0005626463.localdomain python3.9[238539]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:28:57 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33100 DF PROTO=TCP SPT=38972 DPT=9102 SEQ=2893062852 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF3F6460000000001030307) 
Feb 23 09:28:57 np0005626463.localdomain sudo[238537]: pam_unix(sudo:session): session closed for user root
Feb 23 09:28:57 np0005626463.localdomain sudo[238627]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-inoviocizqrmzqomvqvqhtbzqxifqqlm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838936.699553-1467-48248085248429/AnsiballZ_copy.py
Feb 23 09:28:57 np0005626463.localdomain sudo[238627]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:28:57 np0005626463.localdomain python3.9[238629]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771838936.699553-1467-48248085248429/.source.yaml _original_basename=.l3_0fr1w follow=False checksum=63ce2475c27d5c875d97f5aae3f36400853f0360 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:28:57 np0005626463.localdomain sudo[238627]: pam_unix(sudo:session): session closed for user root
Feb 23 09:28:57 np0005626463.localdomain sshd[237850]: Connection closed by invalid user  129.146.81.203 port 34954 [preauth]
Feb 23 09:28:57 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:28:57.887 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:28:58 np0005626463.localdomain sudo[238737]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nyexjuvxqzyqlgzdquivelqfncoyzdxl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838937.9542227-1512-192441174760616/AnsiballZ_stat.py
Feb 23 09:28:58 np0005626463.localdomain sudo[238737]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:28:58 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.
Feb 23 09:28:58 np0005626463.localdomain podman[238740]: 2026-02-23 09:28:58.969454355 +0000 UTC m=+0.066658694 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS)
Feb 23 09:28:58 np0005626463.localdomain podman[238740]: 2026-02-23 09:28:58.973142855 +0000 UTC m=+0.070347194 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, container_name=ovn_metadata_agent)
Feb 23 09:28:58 np0005626463.localdomain systemd[1]: 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully.
Feb 23 09:28:59 np0005626463.localdomain python3.9[238739]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/node_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:28:59 np0005626463.localdomain sudo[238737]: pam_unix(sudo:session): session closed for user root
Feb 23 09:28:59 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:28:59.203 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:28:59 np0005626463.localdomain sudo[238843]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pdvulrrcbepecntinvrfizpjwlwldgea ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838937.9542227-1512-192441174760616/AnsiballZ_copy.py
Feb 23 09:28:59 np0005626463.localdomain sudo[238843]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:28:59 np0005626463.localdomain python3.9[238845]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/node_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771838937.9542227-1512-192441174760616/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Feb 23 09:28:59 np0005626463.localdomain sudo[238843]: pam_unix(sudo:session): session closed for user root
Feb 23 09:29:00 np0005626463.localdomain sudo[238953]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jsbjmrvxcbwlhzskckplxvvcnavsnrgu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838940.3234959-1575-67450347588424/AnsiballZ_file.py
Feb 23 09:29:00 np0005626463.localdomain sudo[238953]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:29:00 np0005626463.localdomain python3.9[238955]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:29:00 np0005626463.localdomain sudo[238953]: pam_unix(sudo:session): session closed for user root
Feb 23 09:29:00 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44486 DF PROTO=TCP SPT=40738 DPT=9100 SEQ=714550716 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF404C60000000001030307) 
Feb 23 09:29:01 np0005626463.localdomain sudo[239063]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-latoevuhuidgwdrbrjwiydjixenbopiv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838941.0323205-1599-202253425529676/AnsiballZ_file.py
Feb 23 09:29:01 np0005626463.localdomain sudo[239063]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:29:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb 23 09:29:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 6600.1 total, 600.0 interval
                                                          Cumulative writes: 5152 writes, 23K keys, 5152 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s
                                                          Cumulative WAL: 5152 writes, 679 syncs, 7.59 writes per sync, written: 0.02 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                          Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Feb 23 09:29:01 np0005626463.localdomain python3.9[239065]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 23 09:29:01 np0005626463.localdomain sudo[239063]: pam_unix(sudo:session): session closed for user root
Feb 23 09:29:01 np0005626463.localdomain sudo[239173]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rzclhaqvjhixadhtfmkqdnecxgpvbecj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838941.7146008-1623-145898452256477/AnsiballZ_stat.py
Feb 23 09:29:01 np0005626463.localdomain sudo[239173]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:29:02 np0005626463.localdomain python3.9[239175]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ceilometer_agent_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:29:02 np0005626463.localdomain sudo[239173]: pam_unix(sudo:session): session closed for user root
Feb 23 09:29:02 np0005626463.localdomain sudo[239230]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yasgwailwtcjkcgqyghxnzdkxkirigsh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838941.7146008-1623-145898452256477/AnsiballZ_file.py
Feb 23 09:29:02 np0005626463.localdomain sudo[239230]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:29:02 np0005626463.localdomain python3.9[239232]: ansible-ansible.legacy.file Invoked with mode=0600 dest=/var/lib/kolla/config_files/ceilometer_agent_compute.json _original_basename=.polrb3kl recurse=False state=file path=/var/lib/kolla/config_files/ceilometer_agent_compute.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:29:02 np0005626463.localdomain sudo[239230]: pam_unix(sudo:session): session closed for user root
Feb 23 09:29:02 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44487 DF PROTO=TCP SPT=40738 DPT=9100 SEQ=714550716 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF40CC60000000001030307) 
Feb 23 09:29:02 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:29:02.924 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:29:03 np0005626463.localdomain python3.9[239340]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/node_exporter state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:29:04 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:29:04.240 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:29:05 np0005626463.localdomain sudo[239642]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jdzurecvsbmcmfaedmunauzchoryawsb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838945.4958198-1734-185158239659259/AnsiballZ_container_config_data.py
Feb 23 09:29:05 np0005626463.localdomain sudo[239642]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:29:05 np0005626463.localdomain python3.9[239644]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/node_exporter config_pattern=*.json debug=False
Feb 23 09:29:05 np0005626463.localdomain sudo[239642]: pam_unix(sudo:session): session closed for user root
Feb 23 09:29:06 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24792 DF PROTO=TCP SPT=52914 DPT=9101 SEQ=3809097868 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF419C70000000001030307) 
Feb 23 09:29:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb 23 09:29:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 6600.1 total, 600.0 interval
                                                          Cumulative writes: 5421 writes, 24K keys, 5421 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s
                                                          Cumulative WAL: 5421 writes, 705 syncs, 7.69 writes per sync, written: 0.02 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                          Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Feb 23 09:29:07 np0005626463.localdomain sudo[239752]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ctnfsqfluuboisfppacxjmhopoujpyqi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838946.7745955-1767-98436083638896/AnsiballZ_container_config_hash.py
Feb 23 09:29:07 np0005626463.localdomain sudo[239752]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:29:07 np0005626463.localdomain python3.9[239754]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Feb 23 09:29:07 np0005626463.localdomain sudo[239752]: pam_unix(sudo:session): session closed for user root
Feb 23 09:29:07 np0005626463.localdomain sudo[239862]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jhjbtblqrlenomsbwqlowuuzmlzyewdd ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1771838947.6078718-1797-17496599411001/AnsiballZ_edpm_container_manage.py
Feb 23 09:29:07 np0005626463.localdomain sudo[239862]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:29:07 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:29:07.960 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:29:08 np0005626463.localdomain python3[239864]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/node_exporter config_id=node_exporter config_overrides={} config_patterns=*.json containers=['node_exporter'] log_base_path=/var/log/containers/stdouts debug=False
Feb 23 09:29:08 np0005626463.localdomain podman[239901]: 
Feb 23 09:29:08 np0005626463.localdomain podman[239901]: 2026-02-23 09:29:08.392755252 +0000 UTC m=+0.080314651 container create bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=node_exporter, container_name=node_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Feb 23 09:29:08 np0005626463.localdomain podman[239901]: 2026-02-23 09:29:08.349397117 +0000 UTC m=+0.036956546 image pull  quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c
Feb 23 09:29:08 np0005626463.localdomain python3[239864]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name node_exporter --conmon-pidfile /run/node_exporter.pid --env OS_ENDPOINT_TYPE=internal --healthcheck-command /openstack/healthcheck node_exporter --label config_id=node_exporter --label container_name=node_exporter --label managed_by=edpm_ansible --label config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9100:9100 --user root --volume /var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw --volume /:/rootfs:ro --volume /var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c --web.disable-exporter-metrics --collector.systemd 
--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service --no-collector.dmi --no-collector.entropy --no-collector.thermal_zone --no-collector.time --no-collector.timex --no-collector.uname --no-collector.stat --no-collector.hwmon --no-collector.os --no-collector.selinux --no-collector.textfile --no-collector.powersupplyclass --no-collector.pressure --no-collector.rapl --path.rootfs=/rootfs
Feb 23 09:29:08 np0005626463.localdomain sudo[239862]: pam_unix(sudo:session): session closed for user root
Feb 23 09:29:09 np0005626463.localdomain sudo[240047]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uaudewfygqmvdfeytybrpvmfylttocpt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838948.8409653-1821-174614965748299/AnsiballZ_stat.py
Feb 23 09:29:09 np0005626463.localdomain sudo[240047]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:29:09 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:29:09.281 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:29:09 np0005626463.localdomain python3.9[240049]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 23 09:29:09 np0005626463.localdomain sudo[240047]: pam_unix(sudo:session): session closed for user root
Feb 23 09:29:09 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27815 DF PROTO=TCP SPT=60706 DPT=9882 SEQ=3236042040 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF426020000000001030307) 
Feb 23 09:29:09 np0005626463.localdomain sudo[240159]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fzegwqywxzovztsrglhcvmprtnruhzaq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838949.5804758-1848-271770087052223/AnsiballZ_file.py
Feb 23 09:29:09 np0005626463.localdomain sudo[240159]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:29:10 np0005626463.localdomain python3.9[240161]: ansible-file Invoked with path=/etc/systemd/system/edpm_node_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:29:10 np0005626463.localdomain sudo[240159]: pam_unix(sudo:session): session closed for user root
Feb 23 09:29:10 np0005626463.localdomain sudo[240214]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qoxdxpzivqrvcdejaivvjcgilknxylky ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838949.5804758-1848-271770087052223/AnsiballZ_stat.py
Feb 23 09:29:10 np0005626463.localdomain sudo[240214]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:29:10 np0005626463.localdomain python3.9[240216]: ansible-stat Invoked with path=/etc/systemd/system/edpm_node_exporter_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 23 09:29:10 np0005626463.localdomain sudo[240214]: pam_unix(sudo:session): session closed for user root
Feb 23 09:29:11 np0005626463.localdomain sudo[240323]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fvxyvgkcduvtucowxozyeokwexhrvhyd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838950.6267228-1848-144026878533540/AnsiballZ_copy.py
Feb 23 09:29:11 np0005626463.localdomain sudo[240323]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:29:11 np0005626463.localdomain python3.9[240325]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1771838950.6267228-1848-144026878533540/source dest=/etc/systemd/system/edpm_node_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:29:11 np0005626463.localdomain sudo[240323]: pam_unix(sudo:session): session closed for user root
Feb 23 09:29:11 np0005626463.localdomain sudo[240378]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fqvknskqlmqwnswqucdcvyldreijagpr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838950.6267228-1848-144026878533540/AnsiballZ_systemd.py
Feb 23 09:29:11 np0005626463.localdomain sudo[240378]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:29:11 np0005626463.localdomain python3.9[240380]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 23 09:29:11 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 09:29:11 np0005626463.localdomain systemd-rc-local-generator[240407]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 09:29:11 np0005626463.localdomain systemd-sysv-generator[240411]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 09:29:11 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:29:11 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 23 09:29:11 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:29:11 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:29:11 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 09:29:11 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 23 09:29:11 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:29:11 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:29:11 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:29:12 np0005626463.localdomain sudo[240378]: pam_unix(sudo:session): session closed for user root
Feb 23 09:29:12 np0005626463.localdomain sudo[240469]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ktoqrrsmqcyooynzskqzmqteuezttuyj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838950.6267228-1848-144026878533540/AnsiballZ_systemd.py
Feb 23 09:29:12 np0005626463.localdomain sudo[240469]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:29:12 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27817 DF PROTO=TCP SPT=60706 DPT=9882 SEQ=3236042040 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF432070000000001030307) 
Feb 23 09:29:12 np0005626463.localdomain python3.9[240471]: ansible-systemd Invoked with state=restarted name=edpm_node_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 09:29:12 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:29:12.989 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:29:13 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 09:29:13 np0005626463.localdomain systemd-rc-local-generator[240497]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 09:29:13 np0005626463.localdomain systemd-sysv-generator[240501]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 09:29:13 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:29:13 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 23 09:29:13 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:29:13 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:29:13 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 09:29:14 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 23 09:29:14 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:29:14 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:29:14 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:29:14 np0005626463.localdomain systemd[1]: Starting node_exporter container...
Feb 23 09:29:14 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 09:29:14 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.
Feb 23 09:29:14 np0005626463.localdomain podman[240512]: 2026-02-23 09:29:14.304361008 +0000 UTC m=+0.142589295 container init bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 23 09:29:14 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:29:14.331 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:29:14 np0005626463.localdomain node_exporter[240526]: ts=2026-02-23T09:29:14.344Z caller=node_exporter.go:180 level=info msg="Starting node_exporter" version="(version=1.5.0, branch=HEAD, revision=1b48970ffcf5630534fb00bb0687d73c66d1c959)"
Feb 23 09:29:14 np0005626463.localdomain node_exporter[240526]: ts=2026-02-23T09:29:14.344Z caller=node_exporter.go:181 level=info msg="Build context" build_context="(go=go1.19.3, user=root@6e7732a7b81b, date=20221129-18:59:09)"
Feb 23 09:29:14 np0005626463.localdomain node_exporter[240526]: ts=2026-02-23T09:29:14.344Z caller=node_exporter.go:183 level=warn msg="Node Exporter is running as root user. This exporter is designed to run as unprivileged user, root is not required."
Feb 23 09:29:14 np0005626463.localdomain node_exporter[240526]: ts=2026-02-23T09:29:14.344Z caller=filesystem_common.go:111 level=info collector=filesystem msg="Parsed flag --collector.filesystem.mount-points-exclude" flag=^/(dev|proc|run/credentials/.+|sys|var/lib/docker/.+|var/lib/containers/storage/.+)($|/)
Feb 23 09:29:14 np0005626463.localdomain node_exporter[240526]: ts=2026-02-23T09:29:14.345Z caller=filesystem_common.go:113 level=info collector=filesystem msg="Parsed flag --collector.filesystem.fs-types-exclude" flag=^(autofs|binfmt_misc|bpf|cgroup2?|configfs|debugfs|devpts|devtmpfs|fusectl|hugetlbfs|iso9660|mqueue|nsfs|overlay|proc|procfs|pstore|rpc_pipefs|securityfs|selinuxfs|squashfs|sysfs|tracefs)$
Feb 23 09:29:14 np0005626463.localdomain node_exporter[240526]: ts=2026-02-23T09:29:14.345Z caller=diskstats_common.go:111 level=info collector=diskstats msg="Parsed flag --collector.diskstats.device-exclude" flag=^(ram|loop|fd|(h|s|v|xv)d[a-z]|nvme\d+n\d+p)\d+$
Feb 23 09:29:14 np0005626463.localdomain node_exporter[240526]: ts=2026-02-23T09:29:14.345Z caller=diskstats_linux.go:264 level=error collector=diskstats msg="Failed to open directory, disabling udev device properties" path=/run/udev/data
Feb 23 09:29:14 np0005626463.localdomain node_exporter[240526]: ts=2026-02-23T09:29:14.345Z caller=systemd_linux.go:152 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-include" flag=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service
Feb 23 09:29:14 np0005626463.localdomain node_exporter[240526]: ts=2026-02-23T09:29:14.345Z caller=systemd_linux.go:154 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-exclude" flag=.+\.(automount|device|mount|scope|slice)
Feb 23 09:29:14 np0005626463.localdomain node_exporter[240526]: ts=2026-02-23T09:29:14.345Z caller=node_exporter.go:110 level=info msg="Enabled collectors"
Feb 23 09:29:14 np0005626463.localdomain node_exporter[240526]: ts=2026-02-23T09:29:14.345Z caller=node_exporter.go:117 level=info collector=arp
Feb 23 09:29:14 np0005626463.localdomain node_exporter[240526]: ts=2026-02-23T09:29:14.345Z caller=node_exporter.go:117 level=info collector=bcache
Feb 23 09:29:14 np0005626463.localdomain node_exporter[240526]: ts=2026-02-23T09:29:14.345Z caller=node_exporter.go:117 level=info collector=bonding
Feb 23 09:29:14 np0005626463.localdomain node_exporter[240526]: ts=2026-02-23T09:29:14.345Z caller=node_exporter.go:117 level=info collector=btrfs
Feb 23 09:29:14 np0005626463.localdomain node_exporter[240526]: ts=2026-02-23T09:29:14.346Z caller=node_exporter.go:117 level=info collector=conntrack
Feb 23 09:29:14 np0005626463.localdomain node_exporter[240526]: ts=2026-02-23T09:29:14.346Z caller=node_exporter.go:117 level=info collector=cpu
Feb 23 09:29:14 np0005626463.localdomain node_exporter[240526]: ts=2026-02-23T09:29:14.346Z caller=node_exporter.go:117 level=info collector=cpufreq
Feb 23 09:29:14 np0005626463.localdomain node_exporter[240526]: ts=2026-02-23T09:29:14.346Z caller=node_exporter.go:117 level=info collector=diskstats
Feb 23 09:29:14 np0005626463.localdomain node_exporter[240526]: ts=2026-02-23T09:29:14.346Z caller=node_exporter.go:117 level=info collector=edac
Feb 23 09:29:14 np0005626463.localdomain node_exporter[240526]: ts=2026-02-23T09:29:14.346Z caller=node_exporter.go:117 level=info collector=fibrechannel
Feb 23 09:29:14 np0005626463.localdomain node_exporter[240526]: ts=2026-02-23T09:29:14.346Z caller=node_exporter.go:117 level=info collector=filefd
Feb 23 09:29:14 np0005626463.localdomain node_exporter[240526]: ts=2026-02-23T09:29:14.346Z caller=node_exporter.go:117 level=info collector=filesystem
Feb 23 09:29:14 np0005626463.localdomain node_exporter[240526]: ts=2026-02-23T09:29:14.346Z caller=node_exporter.go:117 level=info collector=infiniband
Feb 23 09:29:14 np0005626463.localdomain node_exporter[240526]: ts=2026-02-23T09:29:14.346Z caller=node_exporter.go:117 level=info collector=ipvs
Feb 23 09:29:14 np0005626463.localdomain node_exporter[240526]: ts=2026-02-23T09:29:14.346Z caller=node_exporter.go:117 level=info collector=loadavg
Feb 23 09:29:14 np0005626463.localdomain node_exporter[240526]: ts=2026-02-23T09:29:14.346Z caller=node_exporter.go:117 level=info collector=mdadm
Feb 23 09:29:14 np0005626463.localdomain node_exporter[240526]: ts=2026-02-23T09:29:14.346Z caller=node_exporter.go:117 level=info collector=meminfo
Feb 23 09:29:14 np0005626463.localdomain node_exporter[240526]: ts=2026-02-23T09:29:14.346Z caller=node_exporter.go:117 level=info collector=netclass
Feb 23 09:29:14 np0005626463.localdomain node_exporter[240526]: ts=2026-02-23T09:29:14.346Z caller=node_exporter.go:117 level=info collector=netdev
Feb 23 09:29:14 np0005626463.localdomain node_exporter[240526]: ts=2026-02-23T09:29:14.346Z caller=node_exporter.go:117 level=info collector=netstat
Feb 23 09:29:14 np0005626463.localdomain node_exporter[240526]: ts=2026-02-23T09:29:14.346Z caller=node_exporter.go:117 level=info collector=nfs
Feb 23 09:29:14 np0005626463.localdomain node_exporter[240526]: ts=2026-02-23T09:29:14.346Z caller=node_exporter.go:117 level=info collector=nfsd
Feb 23 09:29:14 np0005626463.localdomain node_exporter[240526]: ts=2026-02-23T09:29:14.346Z caller=node_exporter.go:117 level=info collector=nvme
Feb 23 09:29:14 np0005626463.localdomain node_exporter[240526]: ts=2026-02-23T09:29:14.346Z caller=node_exporter.go:117 level=info collector=schedstat
Feb 23 09:29:14 np0005626463.localdomain node_exporter[240526]: ts=2026-02-23T09:29:14.346Z caller=node_exporter.go:117 level=info collector=sockstat
Feb 23 09:29:14 np0005626463.localdomain node_exporter[240526]: ts=2026-02-23T09:29:14.346Z caller=node_exporter.go:117 level=info collector=softnet
Feb 23 09:29:14 np0005626463.localdomain node_exporter[240526]: ts=2026-02-23T09:29:14.346Z caller=node_exporter.go:117 level=info collector=systemd
Feb 23 09:29:14 np0005626463.localdomain node_exporter[240526]: ts=2026-02-23T09:29:14.346Z caller=node_exporter.go:117 level=info collector=tapestats
Feb 23 09:29:14 np0005626463.localdomain node_exporter[240526]: ts=2026-02-23T09:29:14.346Z caller=node_exporter.go:117 level=info collector=udp_queues
Feb 23 09:29:14 np0005626463.localdomain node_exporter[240526]: ts=2026-02-23T09:29:14.346Z caller=node_exporter.go:117 level=info collector=vmstat
Feb 23 09:29:14 np0005626463.localdomain node_exporter[240526]: ts=2026-02-23T09:29:14.346Z caller=node_exporter.go:117 level=info collector=xfs
Feb 23 09:29:14 np0005626463.localdomain node_exporter[240526]: ts=2026-02-23T09:29:14.346Z caller=node_exporter.go:117 level=info collector=zfs
Feb 23 09:29:14 np0005626463.localdomain node_exporter[240526]: ts=2026-02-23T09:29:14.347Z caller=tls_config.go:232 level=info msg="Listening on" address=[::]:9100
Feb 23 09:29:14 np0005626463.localdomain node_exporter[240526]: ts=2026-02-23T09:29:14.347Z caller=tls_config.go:235 level=info msg="TLS is disabled." http2=false address=[::]:9100
Feb 23 09:29:14 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.
Feb 23 09:29:14 np0005626463.localdomain podman[240512]: 2026-02-23 09:29:14.37349162 +0000 UTC m=+0.211719907 container start bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 23 09:29:14 np0005626463.localdomain podman[240512]: node_exporter
Feb 23 09:29:14 np0005626463.localdomain systemd[1]: Started node_exporter container.
Feb 23 09:29:14 np0005626463.localdomain sudo[240469]: pam_unix(sudo:session): session closed for user root
Feb 23 09:29:14 np0005626463.localdomain podman[240536]: 2026-02-23 09:29:14.473599391 +0000 UTC m=+0.092217535 container health_status bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=starting, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 23 09:29:14 np0005626463.localdomain podman[240536]: 2026-02-23 09:29:14.487307092 +0000 UTC m=+0.105925246 container exec_died bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Feb 23 09:29:14 np0005626463.localdomain systemd[1]: bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.service: Deactivated successfully.
Feb 23 09:29:15 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44489 DF PROTO=TCP SPT=40738 DPT=9100 SEQ=714550716 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF43C060000000001030307) 
Feb 23 09:29:15 np0005626463.localdomain python3.9[240666]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Feb 23 09:29:16 np0005626463.localdomain sudo[240774]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vnpifeqjfhsihmtzlpbwybdqvcauiuxb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838955.941403-1983-110736278921785/AnsiballZ_stat.py
Feb 23 09:29:16 np0005626463.localdomain sudo[240774]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:29:16 np0005626463.localdomain python3.9[240776]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:29:16 np0005626463.localdomain sudo[240774]: pam_unix(sudo:session): session closed for user root
Feb 23 09:29:17 np0005626463.localdomain sudo[240864]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yrodokzppplkcahqujlyujfzjngailkn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838955.941403-1983-110736278921785/AnsiballZ_copy.py
Feb 23 09:29:17 np0005626463.localdomain sudo[240864]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:29:17 np0005626463.localdomain python3.9[240866]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771838955.941403-1983-110736278921785/.source.yaml _original_basename=.x1oa5z5o follow=False checksum=e950c53af4ac297c7865600773d3d75e3099f8f1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:29:17 np0005626463.localdomain sudo[240864]: pam_unix(sudo:session): session closed for user root
Feb 23 09:29:18 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:29:18.019 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:29:18 np0005626463.localdomain sudo[240974]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pytnrpjybsidkvbexcbuiullqbeqejjn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838957.8187354-2028-28762546510244/AnsiballZ_stat.py
Feb 23 09:29:18 np0005626463.localdomain sudo[240974]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:29:18 np0005626463.localdomain python3.9[240976]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/podman_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:29:18 np0005626463.localdomain sudo[240974]: pam_unix(sudo:session): session closed for user root
Feb 23 09:29:18 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24794 DF PROTO=TCP SPT=52914 DPT=9101 SEQ=3809097868 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF44A060000000001030307) 
Feb 23 09:29:18 np0005626463.localdomain sudo[241062]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xrnpxlykphzyliblbjoilkftbntfoqwr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838957.8187354-2028-28762546510244/AnsiballZ_copy.py
Feb 23 09:29:18 np0005626463.localdomain sudo[241062]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:29:18 np0005626463.localdomain python3.9[241064]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/podman_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771838957.8187354-2028-28762546510244/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Feb 23 09:29:18 np0005626463.localdomain sudo[241062]: pam_unix(sudo:session): session closed for user root
Feb 23 09:29:19 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:29:19.378 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:29:20 np0005626463.localdomain sudo[241172]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dyteooudcefaxinyellrlgtwubzkwkym ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838960.1762414-2091-141211046791721/AnsiballZ_file.py
Feb 23 09:29:20 np0005626463.localdomain sudo[241172]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:29:20 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.
Feb 23 09:29:20 np0005626463.localdomain systemd[1]: tmp-crun.s3vuPm.mount: Deactivated successfully.
Feb 23 09:29:20 np0005626463.localdomain podman[241175]: 2026-02-23 09:29:20.574159298 +0000 UTC m=+0.093615018 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, container_name=ovn_controller, io.buildah.version=1.43.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Feb 23 09:29:20 np0005626463.localdomain podman[241175]: 2026-02-23 09:29:20.61826423 +0000 UTC m=+0.137719970 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.43.0)
Feb 23 09:29:20 np0005626463.localdomain systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully.
Feb 23 09:29:20 np0005626463.localdomain python3.9[241174]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:29:20 np0005626463.localdomain sudo[241172]: pam_unix(sudo:session): session closed for user root
Feb 23 09:29:21 np0005626463.localdomain sudo[241307]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xyxercaakiabcbwlbuhxwjbmlluhseba ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838960.8458385-2115-280474820450199/AnsiballZ_file.py
Feb 23 09:29:21 np0005626463.localdomain sudo[241307]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:29:21 np0005626463.localdomain python3.9[241309]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 23 09:29:21 np0005626463.localdomain sudo[241307]: pam_unix(sudo:session): session closed for user root
Feb 23 09:29:21 np0005626463.localdomain sudo[241417]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kbhajpnxovahfcuapntpfilfmauzgidm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838961.5048475-2139-4261433679126/AnsiballZ_stat.py
Feb 23 09:29:21 np0005626463.localdomain sudo[241417]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:29:21 np0005626463.localdomain python3.9[241419]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ceilometer_agent_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:29:22 np0005626463.localdomain sudo[241417]: pam_unix(sudo:session): session closed for user root
Feb 23 09:29:22 np0005626463.localdomain sudo[241474]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hmibdemsckujeiktfstvhvcjjxiukkhs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838961.5048475-2139-4261433679126/AnsiballZ_file.py
Feb 23 09:29:22 np0005626463.localdomain sudo[241474]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:29:22 np0005626463.localdomain python3.9[241476]: ansible-ansible.legacy.file Invoked with mode=0600 dest=/var/lib/kolla/config_files/ceilometer_agent_compute.json _original_basename=.mggvs97u recurse=False state=file path=/var/lib/kolla/config_files/ceilometer_agent_compute.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:29:22 np0005626463.localdomain sudo[241474]: pam_unix(sudo:session): session closed for user root
Feb 23 09:29:23 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:29:23.053 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:29:23 np0005626463.localdomain python3.9[241584]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/podman_exporter state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:29:24 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31750 DF PROTO=TCP SPT=51606 DPT=9102 SEQ=2276711423 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF45F7F0000000001030307) 
Feb 23 09:29:24 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:29:24.426 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:29:24 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27819 DF PROTO=TCP SPT=60706 DPT=9882 SEQ=3236042040 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF462060000000001030307) 
Feb 23 09:29:24 np0005626463.localdomain sshd[241842]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:29:25 np0005626463.localdomain sudo[241887]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oosuavkeplsossffazyudgplzriulhxn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838964.8374467-2250-144148308295757/AnsiballZ_container_config_data.py
Feb 23 09:29:25 np0005626463.localdomain sudo[241887]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:29:25 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.
Feb 23 09:29:25 np0005626463.localdomain systemd[1]: tmp-crun.Za0bFW.mount: Deactivated successfully.
Feb 23 09:29:25 np0005626463.localdomain podman[241891]: 2026-02-23 09:29:25.237334354 +0000 UTC m=+0.090029500 container health_status be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=starting, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, tcib_managed=true, managed_by=edpm_ansible, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Feb 23 09:29:25 np0005626463.localdomain podman[241891]: 2026-02-23 09:29:25.27322847 +0000 UTC m=+0.125923586 container exec_died be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, 
tcib_build_tag=8419493e1fd846703d277695e03fc5eb)
Feb 23 09:29:25 np0005626463.localdomain podman[241891]: unhealthy
Feb 23 09:29:25 np0005626463.localdomain systemd[1]: be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.service: Main process exited, code=exited, status=1/FAILURE
Feb 23 09:29:25 np0005626463.localdomain systemd[1]: be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.service: Failed with result 'exit-code'.
Feb 23 09:29:25 np0005626463.localdomain python3.9[241890]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/podman_exporter config_pattern=*.json debug=False
Feb 23 09:29:25 np0005626463.localdomain sudo[241887]: pam_unix(sudo:session): session closed for user root
Feb 23 09:29:26 np0005626463.localdomain sudo[242016]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ehugzrhibkvqxvgwkhfrohfjwgvdvtms ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838965.7902067-2283-226733114677142/AnsiballZ_container_config_hash.py
Feb 23 09:29:26 np0005626463.localdomain sudo[242016]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:29:26 np0005626463.localdomain sshd[241842]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:29:26 np0005626463.localdomain python3.9[242018]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Feb 23 09:29:26 np0005626463.localdomain sudo[242016]: pam_unix(sudo:session): session closed for user root
Feb 23 09:29:26 np0005626463.localdomain sudo[242126]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ichkcnhtonlyzmrlpcjtjtqwaomxqgqz ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1771838966.6818964-2313-26881818869341/AnsiballZ_edpm_container_manage.py
Feb 23 09:29:26 np0005626463.localdomain sudo[242126]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:29:27 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31752 DF PROTO=TCP SPT=51606 DPT=9102 SEQ=2276711423 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF46B860000000001030307) 
Feb 23 09:29:27 np0005626463.localdomain python3[242128]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/podman_exporter config_id=podman_exporter config_overrides={} config_patterns=*.json containers=['podman_exporter'] log_base_path=/var/log/containers/stdouts debug=False
Feb 23 09:29:27 np0005626463.localdomain sshd[242156]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:29:28 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:29:28.097 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:29:28 np0005626463.localdomain sshd[242156]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:29:28 np0005626463.localdomain podman[242143]: 2026-02-23 09:29:27.381263647 +0000 UTC m=+0.047797964 image pull  quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd
Feb 23 09:29:29 np0005626463.localdomain podman[242215]: 
Feb 23 09:29:29 np0005626463.localdomain podman[242215]: 2026-02-23 09:29:29.11495443 +0000 UTC m=+0.088746431 container create da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=podman_exporter, container_name=podman_exporter, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 23 09:29:29 np0005626463.localdomain podman[242215]: 2026-02-23 09:29:29.075638611 +0000 UTC m=+0.049430612 image pull  quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd
Feb 23 09:29:29 np0005626463.localdomain python3[242128]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name podman_exporter --conmon-pidfile /run/podman_exporter.pid --env CONTAINER_HOST=unix:///run/podman/podman.sock --env OS_ENDPOINT_TYPE=internal --healthcheck-command /openstack/healthcheck podman_exporter --label config_id=podman_exporter --label container_name=podman_exporter --label managed_by=edpm_ansible --label config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9882:9882 --user root --volume /run/podman/podman.sock:/run/podman/podman.sock:rw,z --volume /var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd
Feb 23 09:29:29 np0005626463.localdomain sudo[242126]: pam_unix(sudo:session): session closed for user root
Feb 23 09:29:29 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:29:29.468 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:29:29 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.
Feb 23 09:29:29 np0005626463.localdomain podman[242268]: 2026-02-23 09:29:29.945418596 +0000 UTC m=+0.101567975 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Feb 23 09:29:29 np0005626463.localdomain podman[242268]: 2026-02-23 09:29:29.975584261 +0000 UTC m=+0.131733590 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, tcib_managed=true)
Feb 23 09:29:29 np0005626463.localdomain systemd[1]: 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully.
Feb 23 09:29:30 np0005626463.localdomain sudo[242348]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 09:29:30 np0005626463.localdomain sudo[242348]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:29:30 np0005626463.localdomain sudo[242348]: pam_unix(sudo:session): session closed for user root
Feb 23 09:29:30 np0005626463.localdomain sudo[242393]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jfnsrojbmwihentbnunvohdcutbafqju ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838969.875531-2337-203023369736825/AnsiballZ_stat.py
Feb 23 09:29:30 np0005626463.localdomain sudo[242393]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:29:30 np0005626463.localdomain sudo[242397]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 23 09:29:30 np0005626463.localdomain sudo[242397]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:29:30 np0005626463.localdomain python3.9[242396]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 23 09:29:30 np0005626463.localdomain sudo[242393]: pam_unix(sudo:session): session closed for user root
Feb 23 09:29:30 np0005626463.localdomain sudo[242397]: pam_unix(sudo:session): session closed for user root
Feb 23 09:29:30 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9215 DF PROTO=TCP SPT=58504 DPT=9100 SEQ=2130913447 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF47A060000000001030307) 
Feb 23 09:29:30 np0005626463.localdomain sudo[242557]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ucergrjkctbjxpkarbrcklddcgkmkful ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838970.6774035-2364-195807580846378/AnsiballZ_file.py
Feb 23 09:29:30 np0005626463.localdomain sudo[242557]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:29:31 np0005626463.localdomain python3.9[242559]: ansible-file Invoked with path=/etc/systemd/system/edpm_podman_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:29:31 np0005626463.localdomain sudo[242557]: pam_unix(sudo:session): session closed for user root
Feb 23 09:29:31 np0005626463.localdomain sudo[242560]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 09:29:31 np0005626463.localdomain sudo[242560]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:29:31 np0005626463.localdomain sudo[242560]: pam_unix(sudo:session): session closed for user root
Feb 23 09:29:32 np0005626463.localdomain sudo[242630]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wdekkdyujwfbhutdswxlkrqlvlzxhvdu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838970.6774035-2364-195807580846378/AnsiballZ_stat.py
Feb 23 09:29:32 np0005626463.localdomain sudo[242630]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:29:32 np0005626463.localdomain python3.9[242632]: ansible-stat Invoked with path=/etc/systemd/system/edpm_podman_exporter_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 23 09:29:32 np0005626463.localdomain sudo[242630]: pam_unix(sudo:session): session closed for user root
Feb 23 09:29:32 np0005626463.localdomain sudo[242739]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-odjrhbwssbgiibzqyzqgfocwszkxbunf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838972.2414804-2364-64432907261382/AnsiballZ_copy.py
Feb 23 09:29:32 np0005626463.localdomain sudo[242739]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:29:32 np0005626463.localdomain python3.9[242741]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1771838972.2414804-2364-64432907261382/source dest=/etc/systemd/system/edpm_podman_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:29:32 np0005626463.localdomain sudo[242739]: pam_unix(sudo:session): session closed for user root
Feb 23 09:29:32 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9216 DF PROTO=TCP SPT=58504 DPT=9100 SEQ=2130913447 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF482070000000001030307) 
Feb 23 09:29:33 np0005626463.localdomain sudo[242794]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bhzrrhwyhvllrpygmyatodezwbcvbemw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838972.2414804-2364-64432907261382/AnsiballZ_systemd.py
Feb 23 09:29:33 np0005626463.localdomain sudo[242794]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:29:33 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:29:33.128 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:29:33 np0005626463.localdomain python3.9[242796]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 23 09:29:33 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 09:29:33 np0005626463.localdomain systemd-sysv-generator[242824]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 09:29:33 np0005626463.localdomain systemd-rc-local-generator[242821]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 09:29:33 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:29:33 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 23 09:29:33 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:29:33 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:29:33 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 09:29:33 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 23 09:29:33 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:29:33 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:29:33 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:29:33 np0005626463.localdomain sudo[242794]: pam_unix(sudo:session): session closed for user root
Feb 23 09:29:33 np0005626463.localdomain sudo[242885]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-frfxzhrikwdiboszgoaypcxwaqkxtgaf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838972.2414804-2364-64432907261382/AnsiballZ_systemd.py
Feb 23 09:29:33 np0005626463.localdomain sudo[242885]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:29:34 np0005626463.localdomain python3.9[242887]: ansible-systemd Invoked with state=restarted name=edpm_podman_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 09:29:34 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 09:29:34 np0005626463.localdomain systemd-sysv-generator[242915]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 09:29:34 np0005626463.localdomain systemd-rc-local-generator[242912]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 09:29:34 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:29:34.524 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:29:34 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:29:34 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 23 09:29:34 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:29:34 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:29:34 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 09:29:34 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 23 09:29:34 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:29:34 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:29:34 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:29:34 np0005626463.localdomain systemd[1]: Starting podman_exporter container...
Feb 23 09:29:34 np0005626463.localdomain systemd[1]: tmp-crun.QqyHSr.mount: Deactivated successfully.
Feb 23 09:29:34 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 09:29:34 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.
Feb 23 09:29:34 np0005626463.localdomain podman[242928]: 2026-02-23 09:29:34.913596555 +0000 UTC m=+0.151260235 container init da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 23 09:29:34 np0005626463.localdomain podman_exporter[242941]: ts=2026-02-23T09:29:34.929Z caller=exporter.go:68 level=info msg="Starting podman-prometheus-exporter" version="(version=1.10.1, branch=HEAD, revision=1)"
Feb 23 09:29:34 np0005626463.localdomain podman_exporter[242941]: ts=2026-02-23T09:29:34.929Z caller=exporter.go:69 level=info msg=metrics enhanced=false
Feb 23 09:29:34 np0005626463.localdomain podman_exporter[242941]: ts=2026-02-23T09:29:34.929Z caller=handler.go:94 level=info msg="enabled collectors"
Feb 23 09:29:34 np0005626463.localdomain podman_exporter[242941]: ts=2026-02-23T09:29:34.929Z caller=handler.go:105 level=info collector=container
Feb 23 09:29:34 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.
Feb 23 09:29:34 np0005626463.localdomain systemd[1]: Starting Podman API Service...
Feb 23 09:29:34 np0005626463.localdomain systemd[1]: Started Podman API Service.
Feb 23 09:29:34 np0005626463.localdomain podman[242928]: 2026-02-23 09:29:34.951593665 +0000 UTC m=+0.189257315 container start da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 23 09:29:34 np0005626463.localdomain podman[242928]: podman_exporter
Feb 23 09:29:34 np0005626463.localdomain systemd[1]: Started podman_exporter container.
Feb 23 09:29:34 np0005626463.localdomain sudo[242885]: pam_unix(sudo:session): session closed for user root
Feb 23 09:29:35 np0005626463.localdomain podman[242954]: time="2026-02-23T09:29:35Z" level=info msg="/usr/bin/podman filtering at log level info"
Feb 23 09:29:35 np0005626463.localdomain podman[242954]: time="2026-02-23T09:29:35Z" level=info msg="Not using native diff for overlay, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled"
Feb 23 09:29:35 np0005626463.localdomain podman[242954]: time="2026-02-23T09:29:35Z" level=info msg="Setting parallel job count to 25"
Feb 23 09:29:35 np0005626463.localdomain podman[242954]: time="2026-02-23T09:29:35Z" level=info msg="Using systemd socket activation to determine API endpoint"
Feb 23 09:29:35 np0005626463.localdomain podman[242954]: time="2026-02-23T09:29:35Z" level=info msg="API service listening on \"/run/podman/podman.sock\". URI: \"/run/podman/podman.sock\""
Feb 23 09:29:35 np0005626463.localdomain podman[242954]: @ - - [23/Feb/2026:09:29:35 +0000] "GET /v4.9.3/libpod/_ping HTTP/1.1" 200 2 "" "Go-http-client/1.1"
Feb 23 09:29:35 np0005626463.localdomain podman[242954]: time="2026-02-23T09:29:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 23 09:29:35 np0005626463.localdomain podman[242953]: 2026-02-23 09:29:35.045873832 +0000 UTC m=+0.090571267 container health_status da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=starting, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 23 09:29:35 np0005626463.localdomain podman[242953]: 2026-02-23 09:29:35.060243742 +0000 UTC m=+0.104940967 container exec_died da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 23 09:29:35 np0005626463.localdomain podman[242953]: unhealthy
Feb 23 09:29:35 np0005626463.localdomain systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Main process exited, code=exited, status=1/FAILURE
Feb 23 09:29:35 np0005626463.localdomain systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Failed with result 'exit-code'.
Feb 23 09:29:35 np0005626463.localdomain python3.9[243097]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Feb 23 09:29:36 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30019 DF PROTO=TCP SPT=56858 DPT=9101 SEQ=385358694 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF48EC60000000001030307) 
Feb 23 09:29:36 np0005626463.localdomain sudo[243205]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aedsehwvafcyetecbpjuynctjujdruqq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838976.613145-2499-28458553918831/AnsiballZ_stat.py
Feb 23 09:29:36 np0005626463.localdomain sudo[243205]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:29:37 np0005626463.localdomain python3.9[243207]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:29:37 np0005626463.localdomain sudo[243205]: pam_unix(sudo:session): session closed for user root
Feb 23 09:29:37 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully.
Feb 23 09:29:37 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-0e4bbbcc3a308b062cd809f5d981a575292a522b3c6697e4ac7d70789e33f207-merged.mount: Deactivated successfully.
Feb 23 09:29:37 np0005626463.localdomain sudo[243295]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-abnluokjahequrmoqdwfjjmxstvpptko ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838976.613145-2499-28458553918831/AnsiballZ_copy.py
Feb 23 09:29:37 np0005626463.localdomain sudo[243295]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:29:37 np0005626463.localdomain python3.9[243297]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771838976.613145-2499-28458553918831/.source.yaml _original_basename=.0ipudr_h follow=False checksum=0ca1b20c04b18e79edff8267384a35d09f9eb392 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:29:37 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-0e4bbbcc3a308b062cd809f5d981a575292a522b3c6697e4ac7d70789e33f207-merged.mount: Deactivated successfully.
Feb 23 09:29:37 np0005626463.localdomain sudo[243295]: pam_unix(sudo:session): session closed for user root
Feb 23 09:29:38 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:29:38.167 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:29:38 np0005626463.localdomain sudo[243405]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yksljpxxdedivexueriuuiumasdlkhpb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838977.9560502-2544-51284212069796/AnsiballZ_stat.py
Feb 23 09:29:38 np0005626463.localdomain sudo[243405]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:29:38 np0005626463.localdomain python3.9[243407]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/openstack_network_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:29:38 np0005626463.localdomain sudo[243405]: pam_unix(sudo:session): session closed for user root
Feb 23 09:29:38 np0005626463.localdomain sudo[243493]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-grzonzdipqexewgdbkbjovhyekibzuyo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838977.9560502-2544-51284212069796/AnsiballZ_copy.py
Feb 23 09:29:38 np0005626463.localdomain sudo[243493]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:29:38 np0005626463.localdomain python3.9[243495]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/openstack_network_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771838977.9560502-2544-51284212069796/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Feb 23 09:29:38 np0005626463.localdomain sudo[243493]: pam_unix(sudo:session): session closed for user root
Feb 23 09:29:39 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 23 09:29:39 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60158 DF PROTO=TCP SPT=52436 DPT=9882 SEQ=812001102 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF49B320000000001030307) 
Feb 23 09:29:39 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully.
Feb 23 09:29:39 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:29:39.566 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:29:39 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully.
Feb 23 09:29:40 np0005626463.localdomain sudo[243603]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cyvjybxxoprvozoxhskrgcgofooorkhp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838980.1557324-2607-153087725082806/AnsiballZ_file.py
Feb 23 09:29:40 np0005626463.localdomain sudo[243603]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:29:40 np0005626463.localdomain python3.9[243605]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:29:40 np0005626463.localdomain sudo[243603]: pam_unix(sudo:session): session closed for user root
Feb 23 09:29:40 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 23 09:29:40 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 23 09:29:40 np0005626463.localdomain auditd[725]: Audit daemon rotating log files
Feb 23 09:29:40 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 23 09:29:41 np0005626463.localdomain sudo[243713]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-apaegivfjxyghrlkuqwkcldjeeetrtgc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838980.842372-2631-120984387831398/AnsiballZ_file.py
Feb 23 09:29:41 np0005626463.localdomain sudo[243713]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:29:41 np0005626463.localdomain python3.9[243715]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 23 09:29:41 np0005626463.localdomain sudo[243713]: pam_unix(sudo:session): session closed for user root
Feb 23 09:29:41 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 23 09:29:41 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 23 09:29:41 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 23 09:29:42 np0005626463.localdomain sudo[243823]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ndrucdnnznzulkimgjjrljeppubccuhs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838981.5166972-2655-61928556765979/AnsiballZ_stat.py
Feb 23 09:29:42 np0005626463.localdomain sudo[243823]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:29:42 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60160 DF PROTO=TCP SPT=52436 DPT=9882 SEQ=812001102 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF4A7470000000001030307) 
Feb 23 09:29:42 np0005626463.localdomain python3.9[243825]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ceilometer_agent_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:29:42 np0005626463.localdomain sudo[243823]: pam_unix(sudo:session): session closed for user root
Feb 23 09:29:42 np0005626463.localdomain sudo[243880]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ctdocroemfltxptmdvocfychvnhlhrjm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838981.5166972-2655-61928556765979/AnsiballZ_file.py
Feb 23 09:29:42 np0005626463.localdomain sudo[243880]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:29:43 np0005626463.localdomain python3.9[243882]: ansible-ansible.legacy.file Invoked with mode=0600 dest=/var/lib/kolla/config_files/ceilometer_agent_compute.json _original_basename=.ncnn_4ka recurse=False state=file path=/var/lib/kolla/config_files/ceilometer_agent_compute.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:29:43 np0005626463.localdomain sudo[243880]: pam_unix(sudo:session): session closed for user root
Feb 23 09:29:43 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:29:43.205 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:29:43 np0005626463.localdomain python3.9[243990]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/openstack_network_exporter state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:29:44 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-0e4bbbcc3a308b062cd809f5d981a575292a522b3c6697e4ac7d70789e33f207-merged.mount: Deactivated successfully.
Feb 23 09:29:44 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-4752b195a0319d00ad3a4bd86f4312afcec268e914950a9934c95f1e8044f1fa-merged.mount: Deactivated successfully.
Feb 23 09:29:44 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:29:44.604 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:29:44 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.
Feb 23 09:29:44 np0005626463.localdomain podman[244176]: 2026-02-23 09:29:44.699729381 +0000 UTC m=+0.068062442 container health_status bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Feb 23 09:29:44 np0005626463.localdomain podman[244176]: 2026-02-23 09:29:44.732295267 +0000 UTC m=+0.100628288 container exec_died bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Feb 23 09:29:45 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9218 DF PROTO=TCP SPT=58504 DPT=9100 SEQ=2130913447 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF4B2060000000001030307) 
Feb 23 09:29:45 np0005626463.localdomain sudo[244314]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pkhywagdslxoiihyfiqxpnoislhfwkqa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838985.4038408-2766-78954537585634/AnsiballZ_container_config_data.py
Feb 23 09:29:45 np0005626463.localdomain sudo[244314]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:29:45 np0005626463.localdomain python3.9[244316]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/openstack_network_exporter config_pattern=*.json debug=False
Feb 23 09:29:45 np0005626463.localdomain sudo[244314]: pam_unix(sudo:session): session closed for user root
Feb 23 09:29:46 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-4ca138c1babff33aa47b0f593cc672ab03770d4205069570de2d0e7691f07ed3-merged.mount: Deactivated successfully.
Feb 23 09:29:46 np0005626463.localdomain sudo[244424]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uchhbwfnjdnqoahjkndunfsqazobgqrs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771838986.3325372-2799-90394878853964/AnsiballZ_container_config_hash.py
Feb 23 09:29:46 np0005626463.localdomain sudo[244424]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:29:46 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-5bf4078070f41854870417452ad68470796913522011b663ed0d8d22a6f27928-merged.mount: Deactivated successfully.
Feb 23 09:29:46 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-5bf4078070f41854870417452ad68470796913522011b663ed0d8d22a6f27928-merged.mount: Deactivated successfully.
Feb 23 09:29:46 np0005626463.localdomain systemd[1]: bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.service: Deactivated successfully.
Feb 23 09:29:46 np0005626463.localdomain python3.9[244426]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Feb 23 09:29:46 np0005626463.localdomain sudo[244424]: pam_unix(sudo:session): session closed for user root
Feb 23 09:29:47 np0005626463.localdomain sudo[244535]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qmsvfusqixjokrxnwylhyizrdnqpwimz ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1771838987.1989658-2829-166133516252688/AnsiballZ_edpm_container_manage.py
Feb 23 09:29:47 np0005626463.localdomain sudo[244535]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:29:47 np0005626463.localdomain python3[244537]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/openstack_network_exporter config_id=openstack_network_exporter config_overrides={} config_patterns=*.json containers=['openstack_network_exporter'] log_base_path=/var/log/containers/stdouts debug=False
Feb 23 09:29:48 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:29:48.235 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:29:48 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30021 DF PROTO=TCP SPT=56858 DPT=9101 SEQ=385358694 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF4BE060000000001030307) 
Feb 23 09:29:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:29:48.528 163572 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 09:29:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:29:48.529 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 09:29:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:29:48.531 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 09:29:48 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-7a6a75b4bc44910de031f240cbd770d29244a190eb01a1840ff2078eb2d894ad-merged.mount: Deactivated successfully.
Feb 23 09:29:48 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-4ca138c1babff33aa47b0f593cc672ab03770d4205069570de2d0e7691f07ed3-merged.mount: Deactivated successfully.
Feb 23 09:29:48 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-4ca138c1babff33aa47b0f593cc672ab03770d4205069570de2d0e7691f07ed3-merged.mount: Deactivated successfully.
Feb 23 09:29:49 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:29:49.655 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:29:50 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-0455f1f13172510bfb03afa514ad1dc5f28a2039a4c0ae85e44e0cde63814ca4-merged.mount: Deactivated successfully.
Feb 23 09:29:50 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-7a6a75b4bc44910de031f240cbd770d29244a190eb01a1840ff2078eb2d894ad-merged.mount: Deactivated successfully.
Feb 23 09:29:50 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-7a6a75b4bc44910de031f240cbd770d29244a190eb01a1840ff2078eb2d894ad-merged.mount: Deactivated successfully.
Feb 23 09:29:50 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.
Feb 23 09:29:50 np0005626463.localdomain systemd[1]: tmp-crun.pu5Yfz.mount: Deactivated successfully.
Feb 23 09:29:50 np0005626463.localdomain podman[244562]: 2026-02-23 09:29:50.951405458 +0000 UTC m=+0.119844803 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb)
Feb 23 09:29:50 np0005626463.localdomain podman[244562]: 2026-02-23 09:29:50.996201011 +0000 UTC m=+0.164640376 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.build-date=20260216, config_id=ovn_controller, container_name=ovn_controller, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Feb 23 09:29:51 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-882df85a0cf43e46bc799aafd5ff81035654b304c2fef5dbd26c9dd0c2e9fcc3-merged.mount: Deactivated successfully.
Feb 23 09:29:51 np0005626463.localdomain systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully.
Feb 23 09:29:51 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-0455f1f13172510bfb03afa514ad1dc5f28a2039a4c0ae85e44e0cde63814ca4-merged.mount: Deactivated successfully.
Feb 23 09:29:52 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-d9f14c75a7289cf010d2e5175c554193dba109f864fe39fc418f3bc5b90efe9d-merged.mount: Deactivated successfully.
Feb 23 09:29:52 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-882df85a0cf43e46bc799aafd5ff81035654b304c2fef5dbd26c9dd0c2e9fcc3-merged.mount: Deactivated successfully.
Feb 23 09:29:52 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-882df85a0cf43e46bc799aafd5ff81035654b304c2fef5dbd26c9dd0c2e9fcc3-merged.mount: Deactivated successfully.
Feb 23 09:29:52 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-d9f14c75a7289cf010d2e5175c554193dba109f864fe39fc418f3bc5b90efe9d-merged.mount: Deactivated successfully.
Feb 23 09:29:52 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-d9f14c75a7289cf010d2e5175c554193dba109f864fe39fc418f3bc5b90efe9d-merged.mount: Deactivated successfully.
Feb 23 09:29:52 np0005626463.localdomain podman[244549]: 2026-02-23 09:29:48.845242608 +0000 UTC m=+0.045431033 image pull  quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c
Feb 23 09:29:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:29:53.285 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:29:54 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26770 DF PROTO=TCP SPT=35412 DPT=9102 SEQ=1579434002 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF4D4B00000000001030307) 
Feb 23 09:29:54 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:29:54.695 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:29:54 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-5bf4078070f41854870417452ad68470796913522011b663ed0d8d22a6f27928-merged.mount: Deactivated successfully.
Feb 23 09:29:54 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60162 DF PROTO=TCP SPT=52436 DPT=9882 SEQ=812001102 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF4D8070000000001030307) 
Feb 23 09:29:55 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 23 09:29:55 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.
Feb 23 09:29:55 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-e812fa34defdc78ec5fb2b77011468829f7a2881cc57f804b1a422dc9e19278a-merged.mount: Deactivated successfully.
Feb 23 09:29:55 np0005626463.localdomain podman[244621]: 2026-02-23 09:29:55.554733341 +0000 UTC m=+0.090465704 container health_status be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=starting, config_id=ceilometer_agent_compute, tcib_managed=true, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, container_name=ceilometer_agent_compute, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260216, maintainer=OpenStack Kubernetes Operator team, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Feb 23 09:29:55 np0005626463.localdomain podman[244621]: 2026-02-23 09:29:55.594253666 +0000 UTC m=+0.129985989 container exec_died be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team)
Feb 23 09:29:55 np0005626463.localdomain podman[244621]: unhealthy
Feb 23 09:29:56 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 23 09:29:56 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 23 09:29:56 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 23 09:29:56 np0005626463.localdomain systemd[1]: be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.service: Main process exited, code=exited, status=1/FAILURE
Feb 23 09:29:56 np0005626463.localdomain systemd[1]: be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.service: Failed with result 'exit-code'.
Feb 23 09:29:56 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:29:56.333 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:29:56 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:29:56.333 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:29:56 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:29:56.349 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:29:56 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:29:56.350 231725 DEBUG nova.compute.manager [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 23 09:29:56 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:29:56.350 231725 DEBUG nova.compute.manager [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 23 09:29:56 np0005626463.localdomain podman[244648]: 2026-02-23 09:29:56.171531291 +0000 UTC m=+0.594747361 image pull  quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c
Feb 23 09:29:56 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:29:56.783 231725 DEBUG oslo_concurrency.lockutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Acquiring lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 23 09:29:56 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:29:56.783 231725 DEBUG oslo_concurrency.lockutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Acquired lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 23 09:29:56 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:29:56.783 231725 DEBUG nova.network.neutron [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 23 09:29:56 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:29:56.784 231725 DEBUG nova.objects.instance [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c2a7d92b-952f-46a7-8a6a-3322a48fcf4b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 23 09:29:57 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:29:57.161 231725 DEBUG nova.network.neutron [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Updating instance_info_cache with network_info: [{"id": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "address": "fa:16:3e:a0:9d:00", "network": {"id": "9da5b53d-3184-450f-9a5b-bdba1a6c9f6d", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "37b8098efb0d4ecc90b451a2db0e966f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa27e5011-20", "ovs_interfaceid": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 23 09:29:57 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:29:57.180 231725 DEBUG oslo_concurrency.lockutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Releasing lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 23 09:29:57 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:29:57.180 231725 DEBUG nova.compute.manager [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 23 09:29:57 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:29:57.180 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:29:57 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:29:57.180 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:29:57 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:29:57.181 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:29:57 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:29:57.181 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:29:57 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:29:57.181 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:29:57 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:29:57.181 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:29:57 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:29:57.181 231725 DEBUG nova.compute.manager [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 23 09:29:57 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:29:57.181 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:29:57 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26772 DF PROTO=TCP SPT=35412 DPT=9102 SEQ=1579434002 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF4E0C70000000001030307) 
Feb 23 09:29:57 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:29:57.197 231725 DEBUG oslo_concurrency.lockutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 09:29:57 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:29:57.197 231725 DEBUG oslo_concurrency.lockutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 09:29:57 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:29:57.198 231725 DEBUG oslo_concurrency.lockutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 09:29:57 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:29:57.198 231725 DEBUG nova.compute.resource_tracker [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Auditing locally available compute resources for np0005626463.localdomain (node: np0005626463.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 23 09:29:57 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:29:57.198 231725 DEBUG oslo_concurrency.processutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 09:29:57 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-e812fa34defdc78ec5fb2b77011468829f7a2881cc57f804b1a422dc9e19278a-merged.mount: Deactivated successfully.
Feb 23 09:29:57 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-eb85f8b1f9f1bb7644ed891399fb297bad9a6f983f4d7e10e6f8474d89d107e3-merged.mount: Deactivated successfully.
Feb 23 09:29:57 np0005626463.localdomain podman[244648]: 
Feb 23 09:29:57 np0005626463.localdomain podman[244648]: 2026-02-23 09:29:57.375271507 +0000 UTC m=+1.798487557 container create 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, release=1770267347, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, container_name=openstack_network_exporter, org.opencontainers.image.created=2026-02-05T04:57:10Z, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.7, architecture=x86_64, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., build-date=2026-02-05T04:57:10Z, config_id=openstack_network_exporter, io.openshift.expose-services=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7)
Feb 23 09:29:57 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:29:57.596 231725 DEBUG oslo_concurrency.processutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.398s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 09:29:57 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:29:57.672 231725 DEBUG nova.virt.libvirt.driver [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 23 09:29:57 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:29:57.674 231725 DEBUG nova.virt.libvirt.driver [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 23 09:29:57 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:29:57.853 231725 WARNING nova.virt.libvirt.driver [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 23 09:29:57 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:29:57.854 231725 DEBUG nova.compute.resource_tracker [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Hypervisor/Node resource view: name=np0005626463.localdomain free_ram=12600MB free_disk=41.83688735961914GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": 
null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 23 09:29:57 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:29:57.854 231725 DEBUG oslo_concurrency.lockutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 09:29:57 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:29:57.854 231725 DEBUG oslo_concurrency.lockutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 09:29:57 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:29:57.916 231725 DEBUG nova.compute.resource_tracker [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Instance c2a7d92b-952f-46a7-8a6a-3322a48fcf4b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 23 09:29:57 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:29:57.916 231725 DEBUG nova.compute.resource_tracker [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 23 09:29:57 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:29:57.917 231725 DEBUG nova.compute.resource_tracker [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Final resource view: name=np0005626463.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 23 09:29:57 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:29:57.951 231725 DEBUG oslo_concurrency.processutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 09:29:58 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 23 09:29:58 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-530e571f2fdc2c9cc9ab61d58bf266b4766d3c3aa17392b07069c5b092adeb06-merged.mount: Deactivated successfully.
Feb 23 09:29:58 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:29:58.352 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:29:58 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-530e571f2fdc2c9cc9ab61d58bf266b4766d3c3aa17392b07069c5b092adeb06-merged.mount: Deactivated successfully.
Feb 23 09:29:58 np0005626463.localdomain python3[244537]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name openstack_network_exporter --conmon-pidfile /run/openstack_network_exporter.pid --env OPENSTACK_NETWORK_EXPORTER_YAML=/etc/openstack_network_exporter/openstack_network_exporter.yaml --env OS_ENDPOINT_TYPE=internal --env EDPM_CONFIG_HASH=69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0 --healthcheck-command /openstack/healthcheck openstack-netwo --label config_id=openstack_network_exporter --label container_name=openstack_network_exporter --label managed_by=edpm_ansible --label config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9105:9105 --volume /var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z --volume /var/run/openvswitch:/run/openvswitch:rw,z --volume /var/lib/openvswitch/ovn:/run/ovn:rw,z --volume /proc:/host/proc:ro --volume 
/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c
Feb 23 09:29:58 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:29:58.405 231725 DEBUG oslo_concurrency.processutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.454s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 09:29:58 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:29:58.411 231725 DEBUG nova.compute.provider_tree [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Inventory has not changed in ProviderTree for provider: be63d86c-a403-4ec9-a515-07ea2962cb4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 23 09:29:58 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:29:58.430 231725 DEBUG nova.scheduler.client.report [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Inventory has not changed for provider be63d86c-a403-4ec9-a515-07ea2962cb4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 23 09:29:58 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:29:58.433 231725 DEBUG nova.compute.resource_tracker [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Compute_service record updated for np0005626463.localdomain:np0005626463.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 23 09:29:58 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:29:58.434 231725 DEBUG oslo_concurrency.lockutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.580s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 09:29:58 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 23 09:29:58 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 23 09:29:59 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 23 09:29:59 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:29:59.730 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:30:00 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-530e571f2fdc2c9cc9ab61d58bf266b4766d3c3aa17392b07069c5b092adeb06-merged.mount: Deactivated successfully.
Feb 23 09:30:00 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.
Feb 23 09:30:00 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-70b7b3f393818c1da1a59fd13309ca9cf26b2dd139b3696bb046bf52c3291b46-merged.mount: Deactivated successfully.
Feb 23 09:30:00 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-70b7b3f393818c1da1a59fd13309ca9cf26b2dd139b3696bb046bf52c3291b46-merged.mount: Deactivated successfully.
Feb 23 09:30:00 np0005626463.localdomain sudo[244535]: pam_unix(sudo:session): session closed for user root
Feb 23 09:30:00 np0005626463.localdomain podman[244728]: 2026-02-23 09:30:00.244512234 +0000 UTC m=+0.113779652 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.43.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Feb 23 09:30:00 np0005626463.localdomain podman[244728]: 2026-02-23 09:30:00.249209974 +0000 UTC m=+0.118477432 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true)
Feb 23 09:30:00 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47231 DF PROTO=TCP SPT=47228 DPT=9100 SEQ=106673070 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF4EF470000000001030307) 
Feb 23 09:30:02 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully.
Feb 23 09:30:02 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-52bb44324f3eb9002a3bf4ee7b8544bc72e25676c81bb6c59a692125c71221e1-merged.mount: Deactivated successfully.
Feb 23 09:30:02 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47232 DF PROTO=TCP SPT=47228 DPT=9100 SEQ=106673070 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF4F7470000000001030307) 
Feb 23 09:30:03 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-52bb44324f3eb9002a3bf4ee7b8544bc72e25676c81bb6c59a692125c71221e1-merged.mount: Deactivated successfully.
Feb 23 09:30:03 np0005626463.localdomain systemd[1]: 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully.
Feb 23 09:30:03 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:30:03.372 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:30:03 np0005626463.localdomain sshd[244763]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:30:04 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 23 09:30:04 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully.
Feb 23 09:30:04 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:30:04.769 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:30:05 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully.
Feb 23 09:30:05 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.
Feb 23 09:30:05 np0005626463.localdomain podman[244765]: 2026-02-23 09:30:05.20486666 +0000 UTC m=+0.082604627 container health_status da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=starting, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 23 09:30:05 np0005626463.localdomain podman[244765]: 2026-02-23 09:30:05.213180369 +0000 UTC m=+0.090918326 container exec_died da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 23 09:30:05 np0005626463.localdomain podman[244765]: unhealthy
Feb 23 09:30:05 np0005626463.localdomain sshd[244763]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:30:06 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 23 09:30:06 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 23 09:30:06 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33199 DF PROTO=TCP SPT=41354 DPT=9101 SEQ=4027743947 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF504060000000001030307) 
Feb 23 09:30:06 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 23 09:30:06 np0005626463.localdomain systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Main process exited, code=exited, status=1/FAILURE
Feb 23 09:30:06 np0005626463.localdomain systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Failed with result 'exit-code'.
Feb 23 09:30:07 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 23 09:30:07 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 23 09:30:07 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 23 09:30:07 np0005626463.localdomain sshd[244789]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:30:08 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:30:08.406 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:30:08 np0005626463.localdomain sshd[244789]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:30:09 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26774 DF PROTO=TCP SPT=35412 DPT=9102 SEQ=1579434002 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF510060000000001030307) 
Feb 23 09:30:09 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-52bb44324f3eb9002a3bf4ee7b8544bc72e25676c81bb6c59a692125c71221e1-merged.mount: Deactivated successfully.
Feb 23 09:30:09 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:30:09.812 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:30:09 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-298dde95429fec5a28a160c47b2187ae70f7a465a9ef6c2faaa9c2f451a444ab-merged.mount: Deactivated successfully.
Feb 23 09:30:11 np0005626463.localdomain sudo[244881]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kpwfnthvjhuylqgnjjbfgnhrogsgprtx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839011.0179622-2853-144123176043367/AnsiballZ_stat.py
Feb 23 09:30:11 np0005626463.localdomain sudo[244881]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:30:11 np0005626463.localdomain python3.9[244883]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 23 09:30:11 np0005626463.localdomain sudo[244881]: pam_unix(sudo:session): session closed for user root
Feb 23 09:30:11 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-239567307c66a10c4dd721df6a9263fcc38501437d275d2b4907c616b635d111-merged.mount: Deactivated successfully.
Feb 23 09:30:12 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-7e5a6b3af0e35b266ef2f57ba1f524615772066427004655f3e99c4f9072865c-merged.mount: Deactivated successfully.
Feb 23 09:30:12 np0005626463.localdomain sudo[244993]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qtpiyiudgdpmxrqhezwkfvhbsyadyjot ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839011.8487685-2880-258605873782875/AnsiballZ_file.py
Feb 23 09:30:12 np0005626463.localdomain sudo[244993]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:30:12 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-7e5a6b3af0e35b266ef2f57ba1f524615772066427004655f3e99c4f9072865c-merged.mount: Deactivated successfully.
Feb 23 09:30:12 np0005626463.localdomain python3.9[244995]: ansible-file Invoked with path=/etc/systemd/system/edpm_openstack_network_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:30:12 np0005626463.localdomain sudo[244993]: pam_unix(sudo:session): session closed for user root
Feb 23 09:30:12 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18680 DF PROTO=TCP SPT=52014 DPT=9882 SEQ=1849154630 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF51C870000000001030307) 
Feb 23 09:30:12 np0005626463.localdomain sudo[245048]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xdtbmijhppjkszszspqcvricfyffqind ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839011.8487685-2880-258605873782875/AnsiballZ_stat.py
Feb 23 09:30:12 np0005626463.localdomain sudo[245048]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:30:12 np0005626463.localdomain python3.9[245050]: ansible-stat Invoked with path=/etc/systemd/system/edpm_openstack_network_exporter_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 23 09:30:12 np0005626463.localdomain sudo[245048]: pam_unix(sudo:session): session closed for user root
Feb 23 09:30:13 np0005626463.localdomain sudo[245157]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nuizzdtvvspurpwpkxfhryczecamjbuy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839012.8323994-2880-257287486795214/AnsiballZ_copy.py
Feb 23 09:30:13 np0005626463.localdomain sudo[245157]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:30:13 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:30:13.445 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:30:13 np0005626463.localdomain python3.9[245159]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1771839012.8323994-2880-257287486795214/source dest=/etc/systemd/system/edpm_openstack_network_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:30:13 np0005626463.localdomain sudo[245157]: pam_unix(sudo:session): session closed for user root
Feb 23 09:30:13 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-0455f1f13172510bfb03afa514ad1dc5f28a2039a4c0ae85e44e0cde63814ca4-merged.mount: Deactivated successfully.
Feb 23 09:30:13 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-239567307c66a10c4dd721df6a9263fcc38501437d275d2b4907c616b635d111-merged.mount: Deactivated successfully.
Feb 23 09:30:13 np0005626463.localdomain sudo[245212]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jpwdaktchwtdpgwsfgwlgtoufwlgihvi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839012.8323994-2880-257287486795214/AnsiballZ_systemd.py
Feb 23 09:30:13 np0005626463.localdomain sudo[245212]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:30:13 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-239567307c66a10c4dd721df6a9263fcc38501437d275d2b4907c616b635d111-merged.mount: Deactivated successfully.
Feb 23 09:30:14 np0005626463.localdomain python3.9[245214]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 23 09:30:14 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 09:30:14 np0005626463.localdomain systemd-rc-local-generator[245234]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 09:30:14 np0005626463.localdomain systemd-sysv-generator[245242]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 09:30:14 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:30:14 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 23 09:30:14 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:30:14 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:30:14 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 09:30:14 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 23 09:30:14 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:30:14 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:30:14 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:30:14 np0005626463.localdomain sudo[245212]: pam_unix(sudo:session): session closed for user root
Feb 23 09:30:14 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-882df85a0cf43e46bc799aafd5ff81035654b304c2fef5dbd26c9dd0c2e9fcc3-merged.mount: Deactivated successfully.
Feb 23 09:30:14 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-0455f1f13172510bfb03afa514ad1dc5f28a2039a4c0ae85e44e0cde63814ca4-merged.mount: Deactivated successfully.
Feb 23 09:30:14 np0005626463.localdomain sudo[245302]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-odxezpohnzcqezqbhpgoziwvwygjtpkv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839012.8323994-2880-257287486795214/AnsiballZ_systemd.py
Feb 23 09:30:14 np0005626463.localdomain sudo[245302]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:30:14 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-0455f1f13172510bfb03afa514ad1dc5f28a2039a4c0ae85e44e0cde63814ca4-merged.mount: Deactivated successfully.
Feb 23 09:30:14 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:30:14.850 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:30:14 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18086 DF PROTO=TCP SPT=37020 DPT=9105 SEQ=113061989 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF526070000000001030307) 
Feb 23 09:30:15 np0005626463.localdomain python3.9[245304]: ansible-systemd Invoked with state=restarted name=edpm_openstack_network_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 09:30:15 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 09:30:15 np0005626463.localdomain systemd-sysv-generator[245334]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 09:30:15 np0005626463.localdomain systemd-rc-local-generator[245331]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 09:30:15 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:30:15 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 23 09:30:15 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:30:15 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:30:15 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 09:30:15 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 23 09:30:15 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:30:15 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:30:15 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:30:15 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-d9f14c75a7289cf010d2e5175c554193dba109f864fe39fc418f3bc5b90efe9d-merged.mount: Deactivated successfully.
Feb 23 09:30:15 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-882df85a0cf43e46bc799aafd5ff81035654b304c2fef5dbd26c9dd0c2e9fcc3-merged.mount: Deactivated successfully.
Feb 23 09:30:15 np0005626463.localdomain systemd[1]: Starting openstack_network_exporter container...
Feb 23 09:30:15 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-d9f14c75a7289cf010d2e5175c554193dba109f864fe39fc418f3bc5b90efe9d-merged.mount: Deactivated successfully.
Feb 23 09:30:15 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 09:30:15 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fe6413339e1e502e752da5dfe5a434980dadc4fc02085f3f061ec29cf6ed9ebc/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Feb 23 09:30:15 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fe6413339e1e502e752da5dfe5a434980dadc4fc02085f3f061ec29cf6ed9ebc/merged/etc/openstack_network_exporter/openstack_network_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Feb 23 09:30:15 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.
Feb 23 09:30:15 np0005626463.localdomain podman[245344]: 2026-02-23 09:30:15.602122514 +0000 UTC m=+0.166220117 container init 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, org.opencontainers.image.created=2026-02-05T04:57:10Z, release=1770267347, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=openstack_network_exporter, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-02-05T04:57:10Z, version=9.7, container_name=openstack_network_exporter, architecture=x86_64, distribution-scope=public, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, name=ubi9/ubi-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Feb 23 09:30:15 np0005626463.localdomain openstack_network_exporter[245358]: INFO    09:30:15 main.go:48: registering *bridge.Collector
Feb 23 09:30:15 np0005626463.localdomain openstack_network_exporter[245358]: INFO    09:30:15 main.go:48: registering *coverage.Collector
Feb 23 09:30:15 np0005626463.localdomain openstack_network_exporter[245358]: INFO    09:30:15 main.go:48: registering *datapath.Collector
Feb 23 09:30:15 np0005626463.localdomain openstack_network_exporter[245358]: INFO    09:30:15 main.go:48: registering *iface.Collector
Feb 23 09:30:15 np0005626463.localdomain openstack_network_exporter[245358]: INFO    09:30:15 main.go:48: registering *memory.Collector
Feb 23 09:30:15 np0005626463.localdomain openstack_network_exporter[245358]: INFO    09:30:15 main.go:55: *ovnnorthd.Collector not registered, metric set not enabled
Feb 23 09:30:15 np0005626463.localdomain openstack_network_exporter[245358]: INFO    09:30:15 main.go:48: registering *ovn.Collector
Feb 23 09:30:15 np0005626463.localdomain openstack_network_exporter[245358]: INFO    09:30:15 main.go:55: *ovsdbserver.Collector not registered, metric set not enabled
Feb 23 09:30:15 np0005626463.localdomain openstack_network_exporter[245358]: INFO    09:30:15 main.go:48: registering *pmd_perf.Collector
Feb 23 09:30:15 np0005626463.localdomain openstack_network_exporter[245358]: INFO    09:30:15 main.go:48: registering *pmd_rxq.Collector
Feb 23 09:30:15 np0005626463.localdomain openstack_network_exporter[245358]: INFO    09:30:15 main.go:48: registering *vswitch.Collector
Feb 23 09:30:15 np0005626463.localdomain openstack_network_exporter[245358]: NOTICE  09:30:15 main.go:82: listening on http://:9105/metrics
Feb 23 09:30:15 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.
Feb 23 09:30:15 np0005626463.localdomain podman[245344]: 2026-02-23 09:30:15.635709087 +0000 UTC m=+0.199806690 container start 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, org.opencontainers.image.created=2026-02-05T04:57:10Z, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, config_id=openstack_network_exporter, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.7, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, build-date=2026-02-05T04:57:10Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Red Hat, Inc., distribution-scope=public, architecture=x86_64, container_name=openstack_network_exporter, io.buildah.version=1.33.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9/ubi-minimal, release=1770267347, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c)
Feb 23 09:30:15 np0005626463.localdomain podman[245344]: openstack_network_exporter
Feb 23 09:30:15 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-d9f14c75a7289cf010d2e5175c554193dba109f864fe39fc418f3bc5b90efe9d-merged.mount: Deactivated successfully.
Feb 23 09:30:16 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.
Feb 23 09:30:17 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-7e5a6b3af0e35b266ef2f57ba1f524615772066427004655f3e99c4f9072865c-merged.mount: Deactivated successfully.
Feb 23 09:30:17 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-f44d9e7d68ca1accba5abc072a966a93a3cfaed75061df003916b61d6be8a5d6-merged.mount: Deactivated successfully.
Feb 23 09:30:17 np0005626463.localdomain systemd[1]: Started openstack_network_exporter container.
Feb 23 09:30:17 np0005626463.localdomain podman[245368]: 2026-02-23 09:30:17.585034634 +0000 UTC m=+1.943194617 container health_status 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=starting, architecture=x86_64, release=1770267347, container_name=openstack_network_exporter, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, com.redhat.component=ubi9-minimal-container, org.opencontainers.image.created=2026-02-05T04:57:10Z, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, 
org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, distribution-scope=public, io.buildah.version=1.33.7, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=9.7, maintainer=Red Hat, Inc., build-date=2026-02-05T04:57:10Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9/ubi-minimal)
Feb 23 09:30:17 np0005626463.localdomain sudo[245302]: pam_unix(sudo:session): session closed for user root
Feb 23 09:30:17 np0005626463.localdomain podman[245380]: 2026-02-23 09:30:17.641212839 +0000 UTC m=+0.814234624 container health_status bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Feb 23 09:30:17 np0005626463.localdomain podman[245368]: 2026-02-23 09:30:17.671055226 +0000 UTC m=+2.029215209 container exec_died 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.openshift.expose-services=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, org.opencontainers.image.created=2026-02-05T04:57:10Z, architecture=x86_64, com.redhat.component=ubi9-minimal-container, release=1770267347, build-date=2026-02-05T04:57:10Z, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, managed_by=edpm_ansible, name=ubi9/ubi-minimal, version=9.7, config_id=openstack_network_exporter, vendor=Red Hat, Inc., io.buildah.version=1.33.7)
Feb 23 09:30:17 np0005626463.localdomain podman[245380]: 2026-02-23 09:30:17.680241302 +0000 UTC m=+0.853263127 container exec_died bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Feb 23 09:30:18 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:30:18.468 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:30:18 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33201 DF PROTO=TCP SPT=41354 DPT=9101 SEQ=4027743947 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF534060000000001030307) 
Feb 23 09:30:19 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:30:19.885 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:30:19 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully.
Feb 23 09:30:20 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-52bb44324f3eb9002a3bf4ee7b8544bc72e25676c81bb6c59a692125c71221e1-merged.mount: Deactivated successfully.
Feb 23 09:30:20 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-52bb44324f3eb9002a3bf4ee7b8544bc72e25676c81bb6c59a692125c71221e1-merged.mount: Deactivated successfully.
Feb 23 09:30:20 np0005626463.localdomain systemd[1]: 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.service: Deactivated successfully.
Feb 23 09:30:20 np0005626463.localdomain systemd[1]: bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.service: Deactivated successfully.
Feb 23 09:30:21 np0005626463.localdomain python3.9[245518]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Feb 23 09:30:21 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.
Feb 23 09:30:21 np0005626463.localdomain podman[245536]: 2026-02-23 09:30:21.891084785 +0000 UTC m=+0.067302582 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.license=GPLv2, io.buildah.version=1.43.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, tcib_managed=true)
Feb 23 09:30:21 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 23 09:30:21 np0005626463.localdomain podman[245536]: 2026-02-23 09:30:21.948320333 +0000 UTC m=+0.124538120 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.license=GPLv2, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20260216, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.43.0)
Feb 23 09:30:21 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully.
Feb 23 09:30:22 np0005626463.localdomain sudo[245651]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hfbezpqvqficnfzynubojdvmvjgsktbg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839022.0011315-3015-23659875367626/AnsiballZ_stat.py
Feb 23 09:30:22 np0005626463.localdomain sudo[245651]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:30:22 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully.
Feb 23 09:30:22 np0005626463.localdomain systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully.
Feb 23 09:30:22 np0005626463.localdomain python3.9[245653]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:30:22 np0005626463.localdomain sudo[245651]: pam_unix(sudo:session): session closed for user root
Feb 23 09:30:22 np0005626463.localdomain sudo[245741]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lbozawmiikzomagfrngjmqkvnhzvaltv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839022.0011315-3015-23659875367626/AnsiballZ_copy.py
Feb 23 09:30:22 np0005626463.localdomain sudo[245741]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:30:23 np0005626463.localdomain python3.9[245743]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771839022.0011315-3015-23659875367626/.source.yaml _original_basename=.9kvq4_jy follow=False checksum=73079f9d8e3889e6f8afa55b7f38f8c9e8f4e33b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:30:23 np0005626463.localdomain sudo[245741]: pam_unix(sudo:session): session closed for user root
Feb 23 09:30:23 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 23 09:30:23 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 23 09:30:23 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 23 09:30:23 np0005626463.localdomain sudo[245851]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tcrxfkxnkskafqortaioqaxdnndwuxth ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839023.2219198-3060-23318236458748/AnsiballZ_find.py
Feb 23 09:30:23 np0005626463.localdomain sudo[245851]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:30:23 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:30:23.509 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:30:23 np0005626463.localdomain python3.9[245853]: ansible-ansible.builtin.find Invoked with file_type=directory paths=['/var/lib/openstack/healthchecks/'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Feb 23 09:30:23 np0005626463.localdomain sudo[245851]: pam_unix(sudo:session): session closed for user root
Feb 23 09:30:23 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 23 09:30:23 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 23 09:30:24 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 23 09:30:24 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58080 DF PROTO=TCP SPT=49322 DPT=9102 SEQ=976543268 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF549DF0000000001030307) 
Feb 23 09:30:24 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18682 DF PROTO=TCP SPT=52014 DPT=9882 SEQ=1849154630 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF54C070000000001030307) 
Feb 23 09:30:24 np0005626463.localdomain sudo[245961]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ronxfbuxipummtvovwgccweeaigyktds ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839024.2942402-3088-229767609088546/AnsiballZ_podman_container_info.py
Feb 23 09:30:24 np0005626463.localdomain sudo[245961]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:30:24 np0005626463.localdomain python3.9[245963]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_controller'] executable=podman
Feb 23 09:30:24 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:30:24.925 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:30:26 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-52bb44324f3eb9002a3bf4ee7b8544bc72e25676c81bb6c59a692125c71221e1-merged.mount: Deactivated successfully.
Feb 23 09:30:26 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.
Feb 23 09:30:26 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-2be3d0bba76fb52fbeba06c336dea0a1698df79193676f245ce702f60a0a9fa3-merged.mount: Deactivated successfully.
Feb 23 09:30:27 np0005626463.localdomain podman[245975]: 2026-02-23 09:30:27.130642146 +0000 UTC m=+0.353426054 container health_status be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=unhealthy, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20260216, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_id=ceilometer_agent_compute)
Feb 23 09:30:27 np0005626463.localdomain podman[245975]: 2026-02-23 09:30:27.138411948 +0000 UTC m=+0.361195816 container exec_died be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 23 09:30:27 np0005626463.localdomain podman[245975]: unhealthy
Feb 23 09:30:27 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58082 DF PROTO=TCP SPT=49322 DPT=9102 SEQ=976543268 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF556060000000001030307) 
Feb 23 09:30:28 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:30:28.545 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:30:29 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-eae537b18cb4af6ef1d611e84802ac12d948a1ed622870af6f76704805834c9a-merged.mount: Deactivated successfully.
Feb 23 09:30:29 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-14738252526b4ecc3e5658790c785cb46cd573b0c30a58499169cca3263ae65c-merged.mount: Deactivated successfully.
Feb 23 09:30:29 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-14738252526b4ecc3e5658790c785cb46cd573b0c30a58499169cca3263ae65c-merged.mount: Deactivated successfully.
Feb 23 09:30:29 np0005626463.localdomain systemd[1]: be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.service: Main process exited, code=exited, status=1/FAILURE
Feb 23 09:30:29 np0005626463.localdomain systemd[1]: be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.service: Failed with result 'exit-code'.
Feb 23 09:30:29 np0005626463.localdomain sudo[245961]: pam_unix(sudo:session): session closed for user root
Feb 23 09:30:29 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:30:29.974 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:30:30 np0005626463.localdomain sudo[246102]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uwemdpbkkdjhmmpwmwqckzpwccrnjhni ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839029.8846495-3096-81814989916621/AnsiballZ_podman_container_exec.py
Feb 23 09:30:30 np0005626463.localdomain sudo[246102]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:30:30 np0005626463.localdomain python3.9[246104]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Feb 23 09:30:30 np0005626463.localdomain systemd[1]: Started libpod-conmon-83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.scope.
Feb 23 09:30:30 np0005626463.localdomain podman[246105]: 2026-02-23 09:30:30.729827002 +0000 UTC m=+0.115972255 container exec 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Feb 23 09:30:30 np0005626463.localdomain podman[246105]: 2026-02-23 09:30:30.761202107 +0000 UTC m=+0.147347380 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.43.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, org.label-schema.build-date=20260216)
Feb 23 09:30:30 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22937 DF PROTO=TCP SPT=54104 DPT=9100 SEQ=3673543175 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF564460000000001030307) 
Feb 23 09:30:31 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 23 09:30:31 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-eae537b18cb4af6ef1d611e84802ac12d948a1ed622870af6f76704805834c9a-merged.mount: Deactivated successfully.
Feb 23 09:30:31 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-eae537b18cb4af6ef1d611e84802ac12d948a1ed622870af6f76704805834c9a-merged.mount: Deactivated successfully.
Feb 23 09:30:31 np0005626463.localdomain sudo[246136]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 09:30:31 np0005626463.localdomain sudo[246136]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:30:31 np0005626463.localdomain sudo[246136]: pam_unix(sudo:session): session closed for user root
Feb 23 09:30:31 np0005626463.localdomain sudo[246102]: pam_unix(sudo:session): session closed for user root
Feb 23 09:30:31 np0005626463.localdomain sudo[246154]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 23 09:30:31 np0005626463.localdomain sudo[246154]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:30:32 np0005626463.localdomain sudo[246291]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oguuolaczamzdzkjtuoyfiyofkfpirxy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839031.9374442-3104-124084241389852/AnsiballZ_podman_container_exec.py
Feb 23 09:30:32 np0005626463.localdomain sudo[246291]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:30:32 np0005626463.localdomain python3.9[246293]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Feb 23 09:30:32 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 23 09:30:32 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 23 09:30:32 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 23 09:30:32 np0005626463.localdomain systemd[1]: libpod-conmon-83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.scope: Deactivated successfully.
Feb 23 09:30:32 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22938 DF PROTO=TCP SPT=54104 DPT=9100 SEQ=3673543175 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF56C460000000001030307) 
Feb 23 09:30:32 np0005626463.localdomain systemd[1]: Started libpod-conmon-83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.scope.
Feb 23 09:30:32 np0005626463.localdomain podman[246294]: 2026-02-23 09:30:32.939071564 +0000 UTC m=+0.520485294 container exec 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.43.0)
Feb 23 09:30:32 np0005626463.localdomain podman[246294]: 2026-02-23 09:30:32.968549341 +0000 UTC m=+0.549963031 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20260216, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb)
Feb 23 09:30:33 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:30:33.585 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:30:33 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.
Feb 23 09:30:33 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 23 09:30:33 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 23 09:30:33 np0005626463.localdomain sudo[246291]: pam_unix(sudo:session): session closed for user root
Feb 23 09:30:33 np0005626463.localdomain podman[246327]: 2026-02-23 09:30:33.796771979 +0000 UTC m=+0.187834228 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, org.label-schema.build-date=20260216, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Feb 23 09:30:33 np0005626463.localdomain podman[246327]: 2026-02-23 09:30:33.804011003 +0000 UTC m=+0.195073262 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Feb 23 09:30:33 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 23 09:30:34 np0005626463.localdomain sudo[246154]: pam_unix(sudo:session): session closed for user root
Feb 23 09:30:34 np0005626463.localdomain systemd[1]: libpod-conmon-83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.scope: Deactivated successfully.
Feb 23 09:30:34 np0005626463.localdomain systemd[1]: 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully.
Feb 23 09:30:34 np0005626463.localdomain sudo[246472]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eywxapmakbpirxkpijxvshqyyugemuyx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839033.9249141-3112-253834211664378/AnsiballZ_file.py
Feb 23 09:30:34 np0005626463.localdomain sudo[246472]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:30:34 np0005626463.localdomain python3.9[246474]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/ovn_controller recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:30:34 np0005626463.localdomain sudo[246472]: pam_unix(sudo:session): session closed for user root
Feb 23 09:30:34 np0005626463.localdomain sudo[246582]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rnxgodxrsfmkwbdkesnkpwpzifacaplb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839034.6996055-3121-265324635360380/AnsiballZ_podman_container_info.py
Feb 23 09:30:34 np0005626463.localdomain sudo[246582]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:30:35 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:30:35.005 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:30:35 np0005626463.localdomain python3.9[246584]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_metadata_agent'] executable=podman
Feb 23 09:30:36 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20946 DF PROTO=TCP SPT=44306 DPT=9101 SEQ=2700701582 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF579470000000001030307) 
Feb 23 09:30:36 np0005626463.localdomain sudo[246597]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 09:30:36 np0005626463.localdomain sudo[246597]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:30:36 np0005626463.localdomain sudo[246597]: pam_unix(sudo:session): session closed for user root
Feb 23 09:30:36 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.
Feb 23 09:30:36 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-14738252526b4ecc3e5658790c785cb46cd573b0c30a58499169cca3263ae65c-merged.mount: Deactivated successfully.
Feb 23 09:30:36 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-40ec17e1ee8e7a0751fc146049cedcb53f05d4808bfead6a438b854c73d49686-merged.mount: Deactivated successfully.
Feb 23 09:30:36 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-40ec17e1ee8e7a0751fc146049cedcb53f05d4808bfead6a438b854c73d49686-merged.mount: Deactivated successfully.
Feb 23 09:30:36 np0005626463.localdomain podman[246615]: 2026-02-23 09:30:36.960672807 +0000 UTC m=+0.471293106 container health_status da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=starting, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 23 09:30:36 np0005626463.localdomain podman[246615]: 2026-02-23 09:30:36.994644673 +0000 UTC m=+0.505264972 container exec_died da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 23 09:30:37 np0005626463.localdomain podman[246615]: unhealthy
Feb 23 09:30:38 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 23 09:30:38 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-0336e79261e1f534d091cad94b9980aafc6b329c3b01bda2d50fcc505860ff11-merged.mount: Deactivated successfully.
Feb 23 09:30:38 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-0336e79261e1f534d091cad94b9980aafc6b329c3b01bda2d50fcc505860ff11-merged.mount: Deactivated successfully.
Feb 23 09:30:38 np0005626463.localdomain systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Main process exited, code=exited, status=1/FAILURE
Feb 23 09:30:38 np0005626463.localdomain systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Failed with result 'exit-code'.
Feb 23 09:30:38 np0005626463.localdomain sudo[246582]: pam_unix(sudo:session): session closed for user root
Feb 23 09:30:38 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:30:38.634 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:30:39 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 23 09:30:39 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 23 09:30:39 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62332 DF PROTO=TCP SPT=40480 DPT=9882 SEQ=2939822891 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF585930000000001030307) 
Feb 23 09:30:39 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 23 09:30:39 np0005626463.localdomain sudo[246746]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ookufpkwdfpdiintcwarugdsovihkvfy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839038.532636-3129-147145722412048/AnsiballZ_podman_container_exec.py
Feb 23 09:30:39 np0005626463.localdomain sudo[246746]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:30:39 np0005626463.localdomain python3.9[246748]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ovn_metadata_agent detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Feb 23 09:30:39 np0005626463.localdomain systemd[1]: Started libpod-conmon-11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.scope.
Feb 23 09:30:40 np0005626463.localdomain podman[246749]: 2026-02-23 09:30:40.004962 +0000 UTC m=+0.107476062 container exec 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_metadata_agent)
Feb 23 09:30:40 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 23 09:30:40 np0005626463.localdomain podman[246749]: 2026-02-23 09:30:40.039448891 +0000 UTC m=+0.141963003 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20260216, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Feb 23 09:30:40 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:30:40.058 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:30:40 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 23 09:30:40 np0005626463.localdomain sudo[246746]: pam_unix(sudo:session): session closed for user root
Feb 23 09:30:40 np0005626463.localdomain systemd[1]: libpod-conmon-11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.scope: Deactivated successfully.
Feb 23 09:30:40 np0005626463.localdomain sudo[246885]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gvsgronbepwqywdnymtuvqgifnmuvedh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839040.3777158-3137-37029312271905/AnsiballZ_podman_container_exec.py
Feb 23 09:30:40 np0005626463.localdomain sudo[246885]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:30:40 np0005626463.localdomain python3.9[246887]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ovn_metadata_agent detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Feb 23 09:30:40 np0005626463.localdomain systemd[1]: Started libpod-conmon-11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.scope.
Feb 23 09:30:40 np0005626463.localdomain podman[246888]: 2026-02-23 09:30:40.973731725 +0000 UTC m=+0.101651091 container exec 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.43.0, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Feb 23 09:30:40 np0005626463.localdomain podman[246888]: 2026-02-23 09:30:40.981211726 +0000 UTC m=+0.109131082 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260216, container_name=ovn_metadata_agent)
Feb 23 09:30:41 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-0336e79261e1f534d091cad94b9980aafc6b329c3b01bda2d50fcc505860ff11-merged.mount: Deactivated successfully.
Feb 23 09:30:41 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-52b3b14c7b87d61fbd3bfa894ff158a1c8322ab7dde44afc684a91162f67f067-merged.mount: Deactivated successfully.
Feb 23 09:30:41 np0005626463.localdomain sudo[246885]: pam_unix(sudo:session): session closed for user root
Feb 23 09:30:42 np0005626463.localdomain sudo[247026]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xposmgtjdhuanghxcymyczaannmwunrd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839041.9856853-3145-154552174228987/AnsiballZ_file.py
Feb 23 09:30:42 np0005626463.localdomain sudo[247026]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:30:42 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62334 DF PROTO=TCP SPT=40480 DPT=9882 SEQ=2939822891 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF591860000000001030307) 
Feb 23 09:30:42 np0005626463.localdomain python3.9[247028]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/ovn_metadata_agent recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:30:42 np0005626463.localdomain sudo[247026]: pam_unix(sudo:session): session closed for user root
Feb 23 09:30:42 np0005626463.localdomain sudo[247136]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jpjdyncpbbacllatziqtvqhfbrwgqhgm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839042.6828117-3154-169618199207807/AnsiballZ_podman_container_info.py
Feb 23 09:30:42 np0005626463.localdomain sudo[247136]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:30:43 np0005626463.localdomain python3.9[247138]: ansible-containers.podman.podman_container_info Invoked with name=['ceilometer_agent_compute'] executable=podman
Feb 23 09:30:43 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:30:43.681 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:30:43 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-eae537b18cb4af6ef1d611e84802ac12d948a1ed622870af6f76704805834c9a-merged.mount: Deactivated successfully.
Feb 23 09:30:43 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-40d13af751dd0e47fc8bb889a91a6d655bc2617bd5ab127ac97d8b2c392f6c58-merged.mount: Deactivated successfully.
Feb 23 09:30:44 np0005626463.localdomain systemd[1]: libpod-conmon-11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.scope: Deactivated successfully.
Feb 23 09:30:45 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:30:45.102 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:30:45 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22940 DF PROTO=TCP SPT=54104 DPT=9100 SEQ=3673543175 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF59C060000000001030307) 
Feb 23 09:30:45 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 23 09:30:45 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-eae537b18cb4af6ef1d611e84802ac12d948a1ed622870af6f76704805834c9a-merged.mount: Deactivated successfully.
Feb 23 09:30:45 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-eae537b18cb4af6ef1d611e84802ac12d948a1ed622870af6f76704805834c9a-merged.mount: Deactivated successfully.
Feb 23 09:30:46 np0005626463.localdomain sshd[247150]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:30:46 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 23 09:30:46 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 23 09:30:47 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 23 09:30:47 np0005626463.localdomain sudo[247136]: pam_unix(sudo:session): session closed for user root
Feb 23 09:30:47 np0005626463.localdomain sshd[247223]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:30:47 np0005626463.localdomain sudo[247261]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eslzqvouaqgfdaupbbqwstwtzpvrvtso ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839047.2809787-3162-6490781576992/AnsiballZ_podman_container_exec.py
Feb 23 09:30:47 np0005626463.localdomain sudo[247261]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:30:47 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 23 09:30:47 np0005626463.localdomain python3.9[247263]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ceilometer_agent_compute detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Feb 23 09:30:47 np0005626463.localdomain sshd[247223]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:30:47 np0005626463.localdomain systemd[1]: Started libpod-conmon-be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.scope.
Feb 23 09:30:47 np0005626463.localdomain podman[247264]: 2026-02-23 09:30:47.900193136 +0000 UTC m=+0.116967116 container exec be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, 
tcib_managed=true)
Feb 23 09:30:47 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 23 09:30:47 np0005626463.localdomain podman[247264]: 2026-02-23 09:30:47.936433712 +0000 UTC m=+0.153207662 container exec_died be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, 
io.buildah.version=1.43.0)
Feb 23 09:30:47 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 23 09:30:48 np0005626463.localdomain sshd[247150]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:30:48 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 23 09:30:48 np0005626463.localdomain sudo[247261]: pam_unix(sudo:session): session closed for user root
Feb 23 09:30:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:30:48.530 163572 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 09:30:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:30:48.532 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 09:30:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:30:48.533 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 09:30:48 np0005626463.localdomain sudo[247399]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wetoxkigjjkjclunyiruhxqvgitlruxh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839048.3597112-3170-62617236833210/AnsiballZ_podman_container_exec.py
Feb 23 09:30:48 np0005626463.localdomain sudo[247399]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:30:48 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:30:48.705 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:30:48 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20948 DF PROTO=TCP SPT=44306 DPT=9101 SEQ=2700701582 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF5AA060000000001030307) 
Feb 23 09:30:48 np0005626463.localdomain python3.9[247401]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ceilometer_agent_compute detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Feb 23 09:30:49 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-40d13af751dd0e47fc8bb889a91a6d655bc2617bd5ab127ac97d8b2c392f6c58-merged.mount: Deactivated successfully.
Feb 23 09:30:50 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-6c9a1aacc166b15a11e7e7c477a5bec4f993243887d4de4680eae8258483d960-merged.mount: Deactivated successfully.
Feb 23 09:30:50 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:30:50.137 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:30:50 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-6c9a1aacc166b15a11e7e7c477a5bec4f993243887d4de4680eae8258483d960-merged.mount: Deactivated successfully.
Feb 23 09:30:50 np0005626463.localdomain systemd[1]: libpod-conmon-be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.scope: Deactivated successfully.
Feb 23 09:30:50 np0005626463.localdomain systemd[1]: Started libpod-conmon-be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.scope.
Feb 23 09:30:50 np0005626463.localdomain podman[247402]: 2026-02-23 09:30:50.197935469 +0000 UTC m=+1.366944239 container exec be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, container_name=ceilometer_agent_compute, io.buildah.version=1.43.0, 
tcib_managed=true)
Feb 23 09:30:50 np0005626463.localdomain podman[247402]: 2026-02-23 09:30:50.23045781 +0000 UTC m=+1.399466660 container exec_died be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, container_name=ceilometer_agent_compute, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, 
tcib_managed=true)
Feb 23 09:30:50 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.
Feb 23 09:30:50 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.
Feb 23 09:30:52 np0005626463.localdomain sshd[247455]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:30:52 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully.
Feb 23 09:30:52 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.
Feb 23 09:30:52 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-52bb44324f3eb9002a3bf4ee7b8544bc72e25676c81bb6c59a692125c71221e1-merged.mount: Deactivated successfully.
Feb 23 09:30:52 np0005626463.localdomain sshd[247455]: error: kex_exchange_identification: Connection closed by remote host
Feb 23 09:30:52 np0005626463.localdomain sshd[247455]: Connection closed by 170.64.235.159 port 53674
Feb 23 09:30:52 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-52bb44324f3eb9002a3bf4ee7b8544bc72e25676c81bb6c59a692125c71221e1-merged.mount: Deactivated successfully.
Feb 23 09:30:52 np0005626463.localdomain sudo[247399]: pam_unix(sudo:session): session closed for user root
Feb 23 09:30:52 np0005626463.localdomain podman[247433]: 2026-02-23 09:30:52.756251479 +0000 UTC m=+1.919243302 container health_status 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image 
Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, maintainer=Red Hat, Inc., name=ubi9/ubi-minimal, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, distribution-scope=public, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, version=9.7, release=1770267347, io.buildah.version=1.33.7, build-date=2026-02-05T04:57:10Z, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=minimal rhel9, org.opencontainers.image.created=2026-02-05T04:57:10Z, architecture=x86_64)
Feb 23 09:30:52 np0005626463.localdomain podman[247433]: 2026-02-23 09:30:52.77524939 +0000 UTC m=+1.938241203 container exec_died 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., config_id=openstack_network_exporter, distribution-scope=public, io.openshift.expose-services=, managed_by=edpm_ansible, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, build-date=2026-02-05T04:57:10Z, architecture=x86_64, release=1770267347, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., 
org.opencontainers.image.created=2026-02-05T04:57:10Z, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=ubi9-minimal-container, vcs-type=git, version=9.7)
Feb 23 09:30:52 np0005626463.localdomain podman[247456]: 2026-02-23 09:30:52.868038303 +0000 UTC m=+0.485443396 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.schema-version=1.0)
Feb 23 09:30:52 np0005626463.localdomain podman[247434]: 2026-02-23 09:30:52.943738436 +0000 UTC m=+2.107019888 container health_status bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 23 09:30:52 np0005626463.localdomain podman[247434]: 2026-02-23 09:30:52.955304566 +0000 UTC m=+2.118586038 container exec_died bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Feb 23 09:30:53 np0005626463.localdomain podman[247456]: 2026-02-23 09:30:53.006182507 +0000 UTC m=+0.623587610 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2)
Feb 23 09:30:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:30:53.704 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:30:53 np0005626463.localdomain sudo[247606]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xudbxhzvjpfzbhzjbtsumnwdtekiaebz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839052.856692-3178-10097014043913/AnsiballZ_file.py
Feb 23 09:30:53 np0005626463.localdomain sudo[247606]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:30:53 np0005626463.localdomain python3.9[247608]: ansible-ansible.builtin.file Invoked with group=42405 mode=0700 owner=42405 path=/var/lib/openstack/healthchecks/ceilometer_agent_compute recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:30:53 np0005626463.localdomain sudo[247606]: pam_unix(sudo:session): session closed for user root
Feb 23 09:30:54 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38986 DF PROTO=TCP SPT=40222 DPT=9102 SEQ=2067140171 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF5BF100000000001030307) 
Feb 23 09:30:54 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 23 09:30:54 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully.
Feb 23 09:30:54 np0005626463.localdomain sudo[247716]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nricopcwfnvtbrradzsrtwsazvgjihtr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839054.13151-3187-268000873982889/AnsiballZ_podman_container_info.py
Feb 23 09:30:54 np0005626463.localdomain sudo[247716]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:30:54 np0005626463.localdomain python3.9[247718]: ansible-containers.podman.podman_container_info Invoked with name=['node_exporter'] executable=podman
Feb 23 09:30:54 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully.
Feb 23 09:30:54 np0005626463.localdomain systemd[1]: libpod-conmon-be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.scope: Deactivated successfully.
Feb 23 09:30:54 np0005626463.localdomain systemd[1]: 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.service: Deactivated successfully.
Feb 23 09:30:54 np0005626463.localdomain systemd[1]: bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.service: Deactivated successfully.
Feb 23 09:30:54 np0005626463.localdomain systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully.
Feb 23 09:30:54 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62336 DF PROTO=TCP SPT=40480 DPT=9882 SEQ=2939822891 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF5C2060000000001030307) 
Feb 23 09:30:55 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:30:55.142 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:30:55 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 23 09:30:55 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 23 09:30:55 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.156 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'name': 'test', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000003', 'OS-EXT-SRV-ATTR:host': 'np0005626463.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '37b8098efb0d4ecc90b451a2db0e966f', 'user_id': 'cb6895487918456aa599ca2f76872d00', 'hostId': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.157 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.157 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.197 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.latency volume: 260974500 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.198 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.latency volume: 24478467 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.200 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4abf2c3d-878e-4ddf-8ac3-186a11fbb6ca', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 260974500, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:30:56.157767', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '5af18a0e-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10496.347262516, 'message_signature': 'e85aff95d9349bc7c61a66b13de890cc52d3941cabf14641fac4486d719b5190'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 24478467, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:30:56.157767', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '5af1a2aa-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10496.347262516, 'message_signature': '8f796aee5c7df35c24bc8736f7db21519d4ed8e9014c2a8395eee2eb835595e9'}]}, 'timestamp': '2026-02-23 09:30:56.199259', '_unique_id': '55d6162609d84361865922138a21aa09'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.200 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.200 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.200 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.200 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.200 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.200 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.200 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.200 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.200 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.200 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.200 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.200 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.200 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.200 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.200 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.200 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.200 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.200 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.200 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.200 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.200 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.200 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.200 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.200 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.200 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.200 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.200 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.200 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.200 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.200 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.200 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.202 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.202 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.207 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.209 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2bab7481-65c1-465f-b1e5-04108a5a307c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:30:56.202491', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '5af3088e-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10496.392003176, 'message_signature': '332ebb428ed5a72eb5b1c93aea5da200328c468b70710594b3e7d0c446c895c0'}]}, 'timestamp': '2026-02-23 09:30:56.208352', '_unique_id': '1e1fdde16f8c4d169fd45b247ce9eb9d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.209 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.209 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.209 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.209 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.209 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.209 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.209 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.209 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.209 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.209 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.209 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.209 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.209 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.209 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.209 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.209 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.209 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.209 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.209 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.209 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.209 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.209 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.209 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.209 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.209 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.209 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.209 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.209 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.209 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.209 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.209 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.210 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.210 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.212 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '189de067-eeb1-4618-898b-c938a6d8e1f0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:30:56.210944', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '5af381b0-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10496.392003176, 'message_signature': 'b3a28e1757d5e0e0a7035b134c694b7b3fff0fe9d4799d92460129fa3e67848e'}]}, 'timestamp': '2026-02-23 09:30:56.211435', '_unique_id': 'b2bd596ed28a4cda9bcef478a4398201'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.212 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.212 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.212 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.212 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.212 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.212 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.212 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.212 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.212 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.212 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.212 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.212 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.212 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.212 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.212 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.212 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.212 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.212 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.212 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.212 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.212 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.212 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.212 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.212 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.212 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.212 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.212 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.212 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.212 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.212 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.212 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.213 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.214 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.packets volume: 145 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.215 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a8ca607e-c8db-4e0f-8c3d-e5fbcb55d59d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 145, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:30:56.213974', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '5af3f82a-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10496.392003176, 'message_signature': '8e551e2022cb5132ad29b1fded7f030faa68c681b374977015de1951ab76a0d0'}]}, 'timestamp': '2026-02-23 09:30:56.214470', '_unique_id': '84b406e89b734d7ab9f55c14e6f9095e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.215 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.215 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.215 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.215 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.215 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.215 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.215 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.215 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.215 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.215 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.215 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.215 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.215 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.215 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.215 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.215 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.215 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.215 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.215 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.215 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.215 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.215 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.215 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.215 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.215 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.215 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.215 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.215 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.215 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.215 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.215 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.216 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.216 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.243 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/memory.usage volume: 52.38671875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.245 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd3bce2f1-7f98-47af-8fa6-67c32fcc85b6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 52.38671875, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'timestamp': '2026-02-23T09:30:56.216921', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '5af888c2-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10496.432968649, 'message_signature': '772bedafea64251e1c9dbc97c17f383e047788ebd59e0ceaf76d985a6f5d86b1'}]}, 'timestamp': '2026-02-23 09:30:56.244447', '_unique_id': '059714058c2f46fe9884a681f698fd3e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.245 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.245 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.245 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.245 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.245 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.245 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.245 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.245 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.245 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.245 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.245 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.245 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.245 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.245 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.245 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.245 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.245 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.245 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.245 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.245 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.245 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.245 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.245 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.245 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.245 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.245 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.245 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.245 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.245 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.245 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.245 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.247 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.247 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.bytes volume: 9216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.248 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ff08c002-dc5e-430e-8a30-d492171a99cd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9216, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:30:56.247287', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '5af90e32-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10496.392003176, 'message_signature': '8cca035ea2d8908c1bbc64a2103eb45f4e0e7c5d06fd0685f67a00dea4138684'}]}, 'timestamp': '2026-02-23 09:30:56.247845', '_unique_id': '0c7e5dbf3da84cb2b5d1e4cf1495e656'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.248 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.248 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.248 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.248 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.248 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.248 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.248 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.248 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.248 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.248 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.248 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.248 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.248 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.248 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.248 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.248 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.248 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.248 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.248 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.248 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.248 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.248 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.248 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.248 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.248 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.248 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.248 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.248 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.248 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.248 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.248 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.250 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.250 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.251 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3183a255-6f3a-4b2a-84b5-47d17c84fc89', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:30:56.250442', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '5af988b2-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10496.392003176, 'message_signature': '2e6116897f69f03e1e9718ed65746cd475c3c453ec0e3214e1e90f02622598f9'}]}, 'timestamp': '2026-02-23 09:30:56.250978', '_unique_id': 'b68685448090435b86d37f8d6c986a0b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.251 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.251 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.251 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.251 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.251 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.251 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.251 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.251 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.251 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.251 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.251 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.251 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.251 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.251 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.251 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.251 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.251 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.251 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.251 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.251 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.251 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.251 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.251 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.251 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.251 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.251 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.251 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.251 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.251 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.251 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.251 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.253 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.253 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.bytes volume: 12784 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.254 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bfdc7aeb-0b08-41f3-bcda-08d8d00eeb57', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 12784, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:30:56.253216', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '5af9f482-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10496.392003176, 'message_signature': 'be4996df4dc532cc04c46e1306bcee617de278cc704a5849165b1b43f4880543'}]}, 'timestamp': '2026-02-23 09:30:56.253728', '_unique_id': 'ea415e1b6c864ff9a6557413d0cf40dd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.254 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.254 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.254 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.254 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.254 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.254 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.254 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.254 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.254 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.254 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.254 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.254 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.254 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.254 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.254 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.254 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.254 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.254 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.254 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.254 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.254 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.254 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.254 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.254 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.254 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.254 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.254 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.254 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.254 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.254 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.254 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.256 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.256 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.257 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '15eed780-3efe-4429-bab9-1e49de3baf5e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:30:56.256156', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '5afa6746-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10496.392003176, 'message_signature': '45604327ace1015647cda1571ad2d92545c09634a5724ce7c0072bc46a83c798'}]}, 'timestamp': '2026-02-23 09:30:56.256631', '_unique_id': 'f72f7c6da4d140d794c66ed0afe2d3f1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.257 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.257 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.257 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.257 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.257 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.257 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.257 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.257 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.257 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.257 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.257 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.257 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.257 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.257 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.257 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.257 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.257 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.257 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.257 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.257 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.257 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.257 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.257 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.257 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.257 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.257 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.257 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.257 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.257 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.257 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.257 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.258 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.259 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.bytes volume: 74063872 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.259 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.261 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '030e4630-5136-4da8-9fca-c5e6fc462cb8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 74063872, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:30:56.259040', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '5afadad2-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10496.347262516, 'message_signature': '675995b52ddb76a1b6e2c2f76b21efc87b48e6ed9a26749b02d547018ac1e12e'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:30:56.259040', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '5afaf274-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10496.347262516, 'message_signature': '6338e442ce689acb76dbeddec6d828af65280528f2f6d9083e8415c1b590d41f'}]}, 'timestamp': '2026-02-23 09:30:56.260171', '_unique_id': '3f6117f199f04be993a30413f337a900'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.261 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.261 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.261 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.261 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.261 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.261 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.261 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.261 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.261 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.261 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.261 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.261 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.261 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.261 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.261 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.261 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.261 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.261 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.261 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.261 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.261 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.261 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.261 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.261 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.261 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.261 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.261 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.261 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.261 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.261 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.261 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.262 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.275 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.276 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.277 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a89c880d-85a3-4bc1-b389-137c1745ff2e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:30:56.262503', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '5afd6d42-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10496.45199431, 'message_signature': 'a0d082da41c4c81a64a9fa8caa8a11dd57af4230d196ccddf301a877aa3e3963'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:30:56.262503', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '5afd7eea-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10496.45199431, 'message_signature': 'e9fcd4cfee75970be88e1a5ff7b9beb5cbd62dd3f0cb575453971af6f671f946'}]}, 'timestamp': '2026-02-23 09:30:56.276896', '_unique_id': '712d41d2c6354f01a2216bc92f402980'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.277 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.277 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.277 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.277 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.277 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.277 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.277 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.277 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.277 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.277 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.277 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.277 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.277 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.277 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.277 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.277 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.277 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.277 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.277 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.277 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.277 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.277 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.277 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.277 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.277 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.277 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.277 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.277 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.277 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.277 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.277 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.279 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.279 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.requests volume: 577 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.279 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.281 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '46bb56ab-fd11-42d6-9665-93309f6bcdd8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 577, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:30:56.279249', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '5afded76-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10496.347262516, 'message_signature': '566afd2562d593b4f7c2c48e6b9920648224ae9c94cc38c72d50a0790396a07e'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:30:56.279249', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '5afdff8c-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10496.347262516, 'message_signature': '79297d97a9e6b2aa76ae6140c29f42002b0857ea42c3fdd64a64dafa11f2f473'}]}, 'timestamp': '2026-02-23 09:30:56.280156', '_unique_id': 'e579d1fc212f4baa849780bd79591537'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.281 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.281 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.281 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.281 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.281 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.281 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.281 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.281 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.281 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.281 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.281 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.281 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.281 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.281 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.281 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.281 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.281 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.281 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.281 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.281 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.281 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.281 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.281 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.281 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.281 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.281 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.281 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.281 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.281 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.281 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.281 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.282 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.282 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.282 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.latency volume: 1234377028 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.283 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.latency volume: 170393160 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.284 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '28cefd68-aec7-453d-8908-9e64b358066b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1234377028, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:30:56.282544', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '5afe6e04-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10496.347262516, 'message_signature': 'fd9e89a263f611612553c7fae1cd0e4219c7bdffa3310fb0d794f02f1763198b'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 170393160, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 
'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:30:56.282544', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '5afe7fde-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10496.347262516, 'message_signature': '0d48df9199518b75900fb6059529c30d3210fff76beb824a365573153d77be4e'}]}, 'timestamp': '2026-02-23 09:30:56.283442', '_unique_id': 'fd46947d6d4f4b9fa5f111d6690135d3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.284 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.284 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.284 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.284 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.284 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.284 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.284 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.284 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.284 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.284 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.284 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.284 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.284 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.284 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.284 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.284 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.284 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.284 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.284 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.284 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.284 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.284 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.284 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.284 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.284 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.284 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.284 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.284 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.284 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.284 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.284 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.285 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.285 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.287 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c5db4ad0-d8c6-4b31-a4b3-0e4b73a7fa7e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:30:56.285632', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '5afee87a-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10496.392003176, 'message_signature': '4eeb42e9e933f3993bab74451444cb179c72e56bd8785251920fb67a182e8cbd'}]}, 'timestamp': '2026-02-23 09:30:56.286149', '_unique_id': '4023a53080b54613a18f68dd031dfba1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.287 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.287 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.287 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.287 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.287 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.287 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.287 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.287 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.287 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.287 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.287 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.287 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.287 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.287 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.287 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.287 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.287 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.287 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.287 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.287 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.287 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.287 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.287 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.287 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.287 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.287 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.287 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.287 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.287 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.287 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.287 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.288 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.288 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.packets volume: 87 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.289 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5c6b323e-6d72-4152-8a7a-05547cbd170b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 87, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:30:56.288274', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '5aff4d7e-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10496.392003176, 'message_signature': 'fa3360f9b63557875723edb1921b583d1ab4a28ba9fef6dd89f491aebb2cfaf0'}]}, 'timestamp': '2026-02-23 09:30:56.288733', '_unique_id': '683914fe22304df9a0aef0b6a667828f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.289 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.289 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.289 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.289 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.289 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.289 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.289 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.289 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.289 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.289 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.289 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.289 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.289 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.289 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.289 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.289 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.289 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.289 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.289 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.289 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.289 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.289 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.289 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.289 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.289 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.289 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.289 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.289 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.289 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.289 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.289 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.290 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.290 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/cpu volume: 53790000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.291 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c1355dfa-a773-4962-8549-6066658c85dd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 53790000000, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'timestamp': '2026-02-23T09:30:56.290858', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '5affafee-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10496.432968649, 'message_signature': 'e17f3b48f2a6f58df66dfc420e5cd3f713d59fd28efd6782f2071e51e44ff079'}]}, 'timestamp': '2026-02-23 09:30:56.291157', '_unique_id': '1587a0220a444d86b45620574594caea'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.291 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.291 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.291 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.291 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.291 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.291 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.291 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.291 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.291 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.291 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.291 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.291 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.291 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.291 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.291 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.291 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.291 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.291 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.291 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.291 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.291 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.291 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.291 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.291 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.291 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.291 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.291 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.291 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.291 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.291 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.291 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.292 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.292 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.requests volume: 1064 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.292 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.requests volume: 222 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.293 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6111e27e-50c6-427e-ad1e-f1a53294ed30', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1064, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:30:56.292501', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '5affef86-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10496.347262516, 'message_signature': '28df1d6342b6817e6c4eb26ed422f3aa485c8033cc1fadfe3223e328cdf249e0'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 222, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 
'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:30:56.292501', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '5afffaee-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10496.347262516, 'message_signature': 'f780863138d1dea62c8e8ca355b8a38e512a27190a45ee889b5cd0fe7357e941'}]}, 'timestamp': '2026-02-23 09:30:56.293069', '_unique_id': 'd5a2192a893e4b0f8f158963e56d167f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.293 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.293 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.293 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.293 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.293 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.293 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.293 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.293 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.293 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.293 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.293 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.293 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.293 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.293 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.293 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.293 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.293 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.293 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.293 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.293 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.293 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.293 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.293 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.293 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.293 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.293 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.293 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.293 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.293 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.293 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.293 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.294 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.294 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.295 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '976651ea-6b09-4467-9303-b37ce2bb0d28', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:30:56.294458', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '5b003c16-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10496.392003176, 'message_signature': '987b210ba53d30d773ca79a529ae59cdaec00dacdce53ac05618da0f6a931ec0'}]}, 'timestamp': '2026-02-23 09:30:56.294756', '_unique_id': '25d33b1486ad4bbbabc8403d3be724a4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.295 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.295 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.295 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.295 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.295 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.295 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.295 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.295 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.295 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.295 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.295 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.295 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.295 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.295 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.295 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.295 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.295 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.295 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.295 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.295 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.295 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.295 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.295 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.295 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.295 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.295 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.295 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.295 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.295 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.295 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.295 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.296 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.296 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.296 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.allocation volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.297 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8d943818-0775-41a5-92cf-e64ab8f16407', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:30:56.296111', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '5b007cb2-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10496.45199431, 'message_signature': '99de0aef48a7db65b08b2f6fcfd8a054a4c795e67f63b0b377b2ddf03f741585'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 
'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:30:56.296111', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '5b008892-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10496.45199431, 'message_signature': 'a0f6185313234f808dfb833aa368bb167aa1b2bb46a57650e175249496a66fab'}]}, 'timestamp': '2026-02-23 09:30:56.296695', '_unique_id': 'a78b0b05018a4cf7b2ec804cf44a7743'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.297 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.297 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.297 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.297 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.297 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.297 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.297 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.297 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.297 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.297 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.297 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.297 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.297 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.297 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.297 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.297 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.297 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.297 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.297 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.297 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.297 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.297 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.297 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.297 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.297 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.297 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.297 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.297 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.297 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.297 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.297 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.298 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.298 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.298 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.299 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1be39d66-3927-47d4-828f-4851948da90d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:30:56.298117', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '5b00cb2c-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10496.45199431, 'message_signature': '46438223162792f4b3c8ba90dae2c2fcb4ef2a19ae4766be6cb8a486957e2603'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 
'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:30:56.298117', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '5b00d61c-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10496.45199431, 'message_signature': '2a08b5ce9cb4f4ccf22fbbca69a527cb2796fffd4a62fab48b8c7d37ae5ad583'}]}, 'timestamp': '2026-02-23 09:30:56.298687', '_unique_id': 'a49a53a42b454885ad7ee63bc4e9033f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.299 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.299 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.299 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.299 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.299 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.299 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.299 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.299 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.299 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.299 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.299 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.299 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.299 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.299 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.299 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.299 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.299 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.299 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.299 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.299 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.299 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.299 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.299 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.299 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.299 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.299 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.299 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.299 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.299 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.299 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.299 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.299 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.299 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.299 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.299 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.299 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.299 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.299 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.299 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.299 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.299 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.299 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.299 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.299 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.299 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.299 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.299 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.299 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.299 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.299 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.299 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.299 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.299 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.299 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.299 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.300 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.bytes volume: 29130240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.300 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.bytes volume: 4300800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.301 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2128cf6b-25ce-4c1e-883c-509abb640ef3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29130240, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:30:56.300085', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '5b0117c6-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10496.347262516, 'message_signature': 'bab10623e29f80cae9ca92ec51ff112e888e393143ab8a1bb855477eb470e241'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4300800, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 
'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:30:56.300085', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '5b01227a-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10496.347262516, 'message_signature': '1a2e329dd95a1a7eb2479b477dec9e602605e05ddc04055256ce6815a1bb0b22'}]}, 'timestamp': '2026-02-23 09:30:56.300636', '_unique_id': '5c068adaca164090b02c9fc2fffce291'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.301 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.301 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.301 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.301 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.301 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.301 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.301 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.301 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.301 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.301 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.301 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.301 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.301 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.301 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.301 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.301 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.301 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.301 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.301 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.301 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.301 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.301 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.301 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.301 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.301 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.301 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.301 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.301 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.301 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.301 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:30:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:30:56.301 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:30:56 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 23 09:30:56 np0005626463.localdomain sudo[247716]: pam_unix(sudo:session): session closed for user root
Feb 23 09:30:56 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 23 09:30:56 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 23 09:30:57 np0005626463.localdomain sudo[247838]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hqbvdzyofsbxqlmzjiiyektuxjsueqwj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839056.803694-3195-121458152449732/AnsiballZ_podman_container_exec.py
Feb 23 09:30:57 np0005626463.localdomain sudo[247838]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:30:57 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38988 DF PROTO=TCP SPT=40222 DPT=9102 SEQ=2067140171 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF5CB060000000001030307) 
Feb 23 09:30:57 np0005626463.localdomain python3.9[247840]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=node_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Feb 23 09:30:57 np0005626463.localdomain systemd[1]: Started libpod-conmon-bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.scope.
Feb 23 09:30:57 np0005626463.localdomain podman[247841]: 2026-02-23 09:30:57.424768726 +0000 UTC m=+0.107172012 container exec bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 23 09:30:57 np0005626463.localdomain podman[247841]: 2026-02-23 09:30:57.458315128 +0000 UTC m=+0.140718464 container exec_died bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 23 09:30:58 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:30:58.436 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:30:58 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:30:58.436 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:30:58 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:30:58.437 231725 DEBUG nova.compute.manager [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 23 09:30:58 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:30:58.437 231725 DEBUG nova.compute.manager [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 23 09:30:58 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:30:58.745 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:30:59 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:30:59.189 231725 DEBUG oslo_concurrency.lockutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Acquiring lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 23 09:30:59 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:30:59.189 231725 DEBUG oslo_concurrency.lockutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Acquired lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 23 09:30:59 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:30:59.189 231725 DEBUG nova.network.neutron [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 23 09:30:59 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:30:59.190 231725 DEBUG nova.objects.instance [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c2a7d92b-952f-46a7-8a6a-3322a48fcf4b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 23 09:30:59 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-52bb44324f3eb9002a3bf4ee7b8544bc72e25676c81bb6c59a692125c71221e1-merged.mount: Deactivated successfully.
Feb 23 09:30:59 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-34d62c030d25095ae1697db07157c262435d04349696135717e45f6132a7e460-merged.mount: Deactivated successfully.
Feb 23 09:30:59 np0005626463.localdomain sudo[247838]: pam_unix(sudo:session): session closed for user root
Feb 23 09:30:59 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.
Feb 23 09:31:00 np0005626463.localdomain sudo[247988]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gisrzwwpaxjeezcsdoxgyfgiilbsmbye ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839059.8298423-3203-198313559639062/AnsiballZ_podman_container_exec.py
Feb 23 09:31:00 np0005626463.localdomain sudo[247988]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:31:00 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:31:00.183 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:31:00 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:31:00.278 231725 DEBUG nova.network.neutron [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Updating instance_info_cache with network_info: [{"id": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "address": "fa:16:3e:a0:9d:00", "network": {"id": "9da5b53d-3184-450f-9a5b-bdba1a6c9f6d", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "37b8098efb0d4ecc90b451a2db0e966f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa27e5011-20", "ovs_interfaceid": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 23 09:31:00 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:31:00.293 231725 DEBUG oslo_concurrency.lockutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Releasing lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 23 09:31:00 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:31:00.294 231725 DEBUG nova.compute.manager [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 23 09:31:00 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:31:00.295 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:31:00 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:31:00.295 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:31:00 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:31:00.296 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:31:00 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:31:00.296 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:31:00 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:31:00.296 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:31:00 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:31:00.297 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:31:00 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:31:00.298 231725 DEBUG nova.compute.manager [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 23 09:31:00 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:31:00.298 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:31:00 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:31:00.315 231725 DEBUG oslo_concurrency.lockutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 09:31:00 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:31:00.315 231725 DEBUG oslo_concurrency.lockutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 09:31:00 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:31:00.316 231725 DEBUG oslo_concurrency.lockutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 09:31:00 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:31:00.316 231725 DEBUG nova.compute.resource_tracker [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Auditing locally available compute resources for np0005626463.localdomain (node: np0005626463.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 23 09:31:00 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:31:00.316 231725 DEBUG oslo_concurrency.processutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 09:31:00 np0005626463.localdomain python3.9[247990]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=node_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Feb 23 09:31:00 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-c3c2fee87fe7e8303aaac2829f1b7d26d779101a77d8fd6a9f6bec71602d9a66-merged.mount: Deactivated successfully.
Feb 23 09:31:00 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-4c416128fe28816a81362614e1a7f9e853b273ba662e28de61a85f5c6446ec2c-merged.mount: Deactivated successfully.
Feb 23 09:31:00 np0005626463.localdomain systemd[1]: libpod-conmon-bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.scope: Deactivated successfully.
Feb 23 09:31:00 np0005626463.localdomain podman[247902]: 2026-02-23 09:31:00.545026699 +0000 UTC m=+0.716455966 container health_status be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=unhealthy, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ceilometer_agent_compute, config_id=ceilometer_agent_compute, io.buildah.version=1.43.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 23 09:31:00 np0005626463.localdomain podman[247902]: 2026-02-23 09:31:00.555643939 +0000 UTC m=+0.727073236 container exec_died be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.license=GPLv2, config_id=ceilometer_agent_compute, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260216, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 23 09:31:00 np0005626463.localdomain systemd[1]: Started libpod-conmon-bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.scope.
Feb 23 09:31:00 np0005626463.localdomain podman[247992]: 2026-02-23 09:31:00.678459035 +0000 UTC m=+0.338829431 container exec bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Feb 23 09:31:00 np0005626463.localdomain podman[247992]: 2026-02-23 09:31:00.708748336 +0000 UTC m=+0.369118722 container exec_died bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Feb 23 09:31:00 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:31:00.802 231725 DEBUG oslo_concurrency.processutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.486s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 09:31:00 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8254 DF PROTO=TCP SPT=53622 DPT=9100 SEQ=2225582548 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF5D9860000000001030307) 
Feb 23 09:31:00 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:31:00.881 231725 DEBUG nova.virt.libvirt.driver [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 23 09:31:00 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:31:00.881 231725 DEBUG nova.virt.libvirt.driver [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 23 09:31:01 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:31:01.047 231725 WARNING nova.virt.libvirt.driver [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 23 09:31:01 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:31:01.048 231725 DEBUG nova.compute.resource_tracker [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Hypervisor/Node resource view: name=np0005626463.localdomain free_ram=12519MB free_disk=41.83688735961914GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 23 09:31:01 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:31:01.048 231725 DEBUG oslo_concurrency.lockutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 09:31:01 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:31:01.048 231725 DEBUG oslo_concurrency.lockutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 09:31:01 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:31:01.108 231725 DEBUG nova.compute.resource_tracker [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Instance c2a7d92b-952f-46a7-8a6a-3322a48fcf4b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 23 09:31:01 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:31:01.108 231725 DEBUG nova.compute.resource_tracker [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 23 09:31:01 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:31:01.108 231725 DEBUG nova.compute.resource_tracker [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Final resource view: name=np0005626463.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 23 09:31:01 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:31:01.166 231725 DEBUG oslo_concurrency.processutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 09:31:01 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-c3c2fee87fe7e8303aaac2829f1b7d26d779101a77d8fd6a9f6bec71602d9a66-merged.mount: Deactivated successfully.
Feb 23 09:31:01 np0005626463.localdomain systemd[1]: be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.service: Deactivated successfully.
Feb 23 09:31:01 np0005626463.localdomain sudo[247988]: pam_unix(sudo:session): session closed for user root
Feb 23 09:31:01 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:31:01.685 231725 DEBUG oslo_concurrency.processutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.519s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 09:31:01 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:31:01.692 231725 DEBUG nova.compute.provider_tree [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Inventory has not changed in ProviderTree for provider: be63d86c-a403-4ec9-a515-07ea2962cb4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 23 09:31:01 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:31:01.708 231725 DEBUG nova.scheduler.client.report [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Inventory has not changed for provider be63d86c-a403-4ec9-a515-07ea2962cb4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 23 09:31:01 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:31:01.710 231725 DEBUG nova.compute.resource_tracker [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Compute_service record updated for np0005626463.localdomain:np0005626463.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 23 09:31:01 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:31:01.710 231725 DEBUG oslo_concurrency.lockutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.662s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 09:31:01 np0005626463.localdomain sudo[248179]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uscquyfomgkhdwvtfacicnqfxfhupeiy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839061.5816755-3211-167967063790882/AnsiballZ_file.py
Feb 23 09:31:01 np0005626463.localdomain sudo[248179]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:31:01 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 23 09:31:02 np0005626463.localdomain systemd[1]: libpod-conmon-bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.scope: Deactivated successfully.
Feb 23 09:31:02 np0005626463.localdomain rsyslogd[758]: imjournal: 5038 messages lost due to rate-limiting (20000 allowed within 600 seconds)
Feb 23 09:31:02 np0005626463.localdomain python3.9[248181]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/node_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:31:02 np0005626463.localdomain sudo[248179]: pam_unix(sudo:session): session closed for user root
Feb 23 09:31:02 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 23 09:31:02 np0005626463.localdomain sudo[248289]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ycvxcoxptbravlrkxaiizfpfflmwrekv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839062.277927-3220-141206199581939/AnsiballZ_podman_container_info.py
Feb 23 09:31:02 np0005626463.localdomain sudo[248289]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:31:02 np0005626463.localdomain python3.9[248291]: ansible-containers.podman.podman_container_info Invoked with name=['podman_exporter'] executable=podman
Feb 23 09:31:02 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8255 DF PROTO=TCP SPT=53622 DPT=9100 SEQ=2225582548 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF5E1860000000001030307) 
Feb 23 09:31:03 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-4c416128fe28816a81362614e1a7f9e853b273ba662e28de61a85f5c6446ec2c-merged.mount: Deactivated successfully.
Feb 23 09:31:03 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-025f13926bcfeedaf7e085dde432a5009541a3e067b7031c72f0d516a81ad107-merged.mount: Deactivated successfully.
Feb 23 09:31:03 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:31:03.779 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:31:04 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-93889cfe8a1eaf916da420177b6c00eab0b6f1d6521b96229ee8963de2bbdb6f-merged.mount: Deactivated successfully.
Feb 23 09:31:04 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.
Feb 23 09:31:04 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-2830780bc6d16943969f9158fd5036df60ccc26823e26fa259ba0accaa537c16-merged.mount: Deactivated successfully.
Feb 23 09:31:04 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-2830780bc6d16943969f9158fd5036df60ccc26823e26fa259ba0accaa537c16-merged.mount: Deactivated successfully.
Feb 23 09:31:04 np0005626463.localdomain sudo[248289]: pam_unix(sudo:session): session closed for user root
Feb 23 09:31:04 np0005626463.localdomain podman[248303]: 2026-02-23 09:31:04.799807878 +0000 UTC m=+0.159450016 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Feb 23 09:31:04 np0005626463.localdomain podman[248303]: 2026-02-23 09:31:04.835257119 +0000 UTC m=+0.194899257 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2)
Feb 23 09:31:05 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:31:05.227 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:31:05 np0005626463.localdomain sudo[248427]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cekzcmpbofkzukuwrjssankeyorlrfdu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839064.9722352-3228-102881035372050/AnsiballZ_podman_container_exec.py
Feb 23 09:31:05 np0005626463.localdomain sudo[248427]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:31:05 np0005626463.localdomain python3.9[248429]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Feb 23 09:31:06 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 23 09:31:06 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-93889cfe8a1eaf916da420177b6c00eab0b6f1d6521b96229ee8963de2bbdb6f-merged.mount: Deactivated successfully.
Feb 23 09:31:06 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13504 DF PROTO=TCP SPT=40444 DPT=9101 SEQ=3526832977 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF5EE860000000001030307) 
Feb 23 09:31:06 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-93889cfe8a1eaf916da420177b6c00eab0b6f1d6521b96229ee8963de2bbdb6f-merged.mount: Deactivated successfully.
Feb 23 09:31:06 np0005626463.localdomain systemd[1]: 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully.
Feb 23 09:31:06 np0005626463.localdomain systemd[1]: Started libpod-conmon-da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.scope.
Feb 23 09:31:06 np0005626463.localdomain podman[248430]: 2026-02-23 09:31:06.403691428 +0000 UTC m=+0.895375364 container exec da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 23 09:31:06 np0005626463.localdomain podman[248430]: 2026-02-23 09:31:06.432896566 +0000 UTC m=+0.924580512 container exec_died da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 23 09:31:07 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 23 09:31:07 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 23 09:31:07 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 23 09:31:07 np0005626463.localdomain sudo[248427]: pam_unix(sudo:session): session closed for user root
Feb 23 09:31:07 np0005626463.localdomain sudo[248566]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wdehshtgnotnfnywprufiofotltreudy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839067.6513605-3236-146715341199868/AnsiballZ_podman_container_exec.py
Feb 23 09:31:07 np0005626463.localdomain sudo[248566]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:31:08 np0005626463.localdomain python3.9[248568]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Feb 23 09:31:08 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 23 09:31:08 np0005626463.localdomain systemd[1]: libpod-conmon-da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.scope: Deactivated successfully.
Feb 23 09:31:08 np0005626463.localdomain systemd[1]: Started libpod-conmon-da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.scope.
Feb 23 09:31:08 np0005626463.localdomain podman[248569]: 2026-02-23 09:31:08.292528934 +0000 UTC m=+0.102106103 container exec da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 23 09:31:08 np0005626463.localdomain podman[248569]: 2026-02-23 09:31:08.326329974 +0000 UTC m=+0.135907133 container exec_died da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 23 09:31:08 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.
Feb 23 09:31:08 np0005626463.localdomain sudo[248566]: pam_unix(sudo:session): session closed for user root
Feb 23 09:31:08 np0005626463.localdomain podman[248598]: 2026-02-23 09:31:08.616312356 +0000 UTC m=+0.189423757 container health_status da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=unhealthy, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 23 09:31:08 np0005626463.localdomain podman[248598]: 2026-02-23 09:31:08.628712841 +0000 UTC m=+0.201824292 container exec_died da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 23 09:31:08 np0005626463.localdomain podman[248598]: unhealthy
Feb 23 09:31:08 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:31:08.810 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:31:09 np0005626463.localdomain sudo[248727]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-howwzitcftxzdiwhfrigmmuiilcpzdcw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839068.7072601-3244-66123410898351/AnsiballZ_file.py
Feb 23 09:31:09 np0005626463.localdomain sudo[248727]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:31:09 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 23 09:31:09 np0005626463.localdomain python3.9[248729]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/podman_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:31:09 np0005626463.localdomain sudo[248727]: pam_unix(sudo:session): session closed for user root
Feb 23 09:31:09 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60658 DF PROTO=TCP SPT=56642 DPT=9882 SEQ=2548069702 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF5FAC30000000001030307) 
Feb 23 09:31:09 np0005626463.localdomain sudo[248837]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xoqjupsixbcltibnxzhxvlyxhvumdkfa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839069.4778447-3253-223779885905370/AnsiballZ_podman_container_info.py
Feb 23 09:31:09 np0005626463.localdomain sudo[248837]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:31:09 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-2830780bc6d16943969f9158fd5036df60ccc26823e26fa259ba0accaa537c16-merged.mount: Deactivated successfully.
Feb 23 09:31:09 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-af4f3c6c37d07fdae63d07a90ecbe84180ed2951bbcccf4f59b147e3f8b29057-merged.mount: Deactivated successfully.
Feb 23 09:31:09 np0005626463.localdomain systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Main process exited, code=exited, status=1/FAILURE
Feb 23 09:31:09 np0005626463.localdomain systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Failed with result 'exit-code'.
Feb 23 09:31:09 np0005626463.localdomain systemd[1]: libpod-conmon-da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.scope: Deactivated successfully.
Feb 23 09:31:09 np0005626463.localdomain python3.9[248839]: ansible-containers.podman.podman_container_info Invoked with name=['openstack_network_exporter'] executable=podman
Feb 23 09:31:10 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:31:10.265 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:31:10 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-e0c81d46f937f1f84faf68fb71f862e4bd868921a7d16384d44790308d98719f-merged.mount: Deactivated successfully.
Feb 23 09:31:10 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-8194fb55613f783d5a49e05926ed565eee5321a07b6e20f485946b5c4f31cab4-merged.mount: Deactivated successfully.
Feb 23 09:31:10 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-8194fb55613f783d5a49e05926ed565eee5321a07b6e20f485946b5c4f31cab4-merged.mount: Deactivated successfully.
Feb 23 09:31:11 np0005626463.localdomain sudo[248837]: pam_unix(sudo:session): session closed for user root
Feb 23 09:31:11 np0005626463.localdomain sudo[248960]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ejwijejznfgqvnqgnnimtyialukjqvlf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839071.2414377-3261-232733182722925/AnsiballZ_podman_container_exec.py
Feb 23 09:31:11 np0005626463.localdomain sudo[248960]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:31:11 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-e0c81d46f937f1f84faf68fb71f862e4bd868921a7d16384d44790308d98719f-merged.mount: Deactivated successfully.
Feb 23 09:31:11 np0005626463.localdomain python3.9[248962]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Feb 23 09:31:11 np0005626463.localdomain systemd[1]: Started libpod-conmon-6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.scope.
Feb 23 09:31:11 np0005626463.localdomain podman[248963]: 2026-02-23 09:31:11.796600364 +0000 UTC m=+0.095305962 container exec 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, build-date=2026-02-05T04:57:10Z, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.7, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, name=ubi9/ubi-minimal, org.opencontainers.image.created=2026-02-05T04:57:10Z, config_id=openstack_network_exporter, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vendor=Red Hat, Inc., release=1770267347)
Feb 23 09:31:11 np0005626463.localdomain podman[248963]: 2026-02-23 09:31:11.826510594 +0000 UTC m=+0.125216212 container exec_died 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, build-date=2026-02-05T04:57:10Z, io.buildah.version=1.33.7, org.opencontainers.image.created=2026-02-05T04:57:10Z, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, name=ubi9/ubi-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, io.openshift.expose-services=, config_id=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', 
'/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.7, release=1770267347, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers)
Feb 23 09:31:12 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-8194fb55613f783d5a49e05926ed565eee5321a07b6e20f485946b5c4f31cab4-merged.mount: Deactivated successfully.
Feb 23 09:31:12 np0005626463.localdomain sudo[248960]: pam_unix(sudo:session): session closed for user root
Feb 23 09:31:12 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60660 DF PROTO=TCP SPT=56642 DPT=9882 SEQ=2548069702 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF606C60000000001030307) 
Feb 23 09:31:12 np0005626463.localdomain sudo[249099]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bofjlqzojzrbpcvygjecknofmotiwtiw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839072.448076-3269-258251854621609/AnsiballZ_podman_container_exec.py
Feb 23 09:31:12 np0005626463.localdomain sudo[249099]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:31:12 np0005626463.localdomain python3.9[249101]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Feb 23 09:31:13 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-93889cfe8a1eaf916da420177b6c00eab0b6f1d6521b96229ee8963de2bbdb6f-merged.mount: Deactivated successfully.
Feb 23 09:31:13 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-67ff2fcf098d662f72898d504b273725d460bc3ee224388c566fda6c94421648-merged.mount: Deactivated successfully.
Feb 23 09:31:13 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-67ff2fcf098d662f72898d504b273725d460bc3ee224388c566fda6c94421648-merged.mount: Deactivated successfully.
Feb 23 09:31:13 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:31:13.864 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:31:13 np0005626463.localdomain systemd[1]: libpod-conmon-6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.scope: Deactivated successfully.
Feb 23 09:31:13 np0005626463.localdomain systemd[1]: Started libpod-conmon-6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.scope.
Feb 23 09:31:13 np0005626463.localdomain podman[249102]: 2026-02-23 09:31:13.937984039 +0000 UTC m=+1.016423487 container exec 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, build-date=2026-02-05T04:57:10Z, org.opencontainers.image.created=2026-02-05T04:57:10Z, managed_by=edpm_ansible, release=1770267347, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, 
config_id=openstack_network_exporter, vcs-type=git, distribution-scope=public, version=9.7, name=ubi9/ubi-minimal, maintainer=Red Hat, Inc., io.openshift.expose-services=, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.component=ubi9-minimal-container, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Feb 23 09:31:13 np0005626463.localdomain podman[249102]: 2026-02-23 09:31:13.970187769 +0000 UTC m=+1.048627237 container exec_died 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, io.buildah.version=1.33.7, version=9.7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1770267347, managed_by=edpm_ansible, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-02-05T04:57:10Z, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, vcs-type=git, build-date=2026-02-05T04:57:10Z, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., config_id=openstack_network_exporter, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public)
Feb 23 09:31:15 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 23 09:31:15 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-93889cfe8a1eaf916da420177b6c00eab0b6f1d6521b96229ee8963de2bbdb6f-merged.mount: Deactivated successfully.
Feb 23 09:31:15 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:31:15.306 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:31:15 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22202 DF PROTO=TCP SPT=35286 DPT=9105 SEQ=2453412937 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF612060000000001030307) 
Feb 23 09:31:15 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-93889cfe8a1eaf916da420177b6c00eab0b6f1d6521b96229ee8963de2bbdb6f-merged.mount: Deactivated successfully.
Feb 23 09:31:15 np0005626463.localdomain sudo[249099]: pam_unix(sudo:session): session closed for user root
Feb 23 09:31:15 np0005626463.localdomain sudo[249237]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tcusqwxmlibhgsevcgaoeofcrjjqvwds ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839075.5575156-3277-222408598029335/AnsiballZ_file.py
Feb 23 09:31:15 np0005626463.localdomain sudo[249237]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:31:16 np0005626463.localdomain python3.9[249239]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/openstack_network_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:31:16 np0005626463.localdomain sudo[249237]: pam_unix(sudo:session): session closed for user root
Feb 23 09:31:16 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 23 09:31:16 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 23 09:31:16 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 23 09:31:16 np0005626463.localdomain systemd[1]: libpod-conmon-6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.scope: Deactivated successfully.
Feb 23 09:31:17 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 23 09:31:17 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 23 09:31:17 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 23 09:31:18 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13506 DF PROTO=TCP SPT=40444 DPT=9101 SEQ=3526832977 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF61E070000000001030307) 
Feb 23 09:31:18 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-67ff2fcf098d662f72898d504b273725d460bc3ee224388c566fda6c94421648-merged.mount: Deactivated successfully.
Feb 23 09:31:18 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-3e3fe691531a0d3ed4e0bd844aee95e09028b37c75ae9985cc1386696cb9ad2a-merged.mount: Deactivated successfully.
Feb 23 09:31:18 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:31:18.914 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:31:19 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 23 09:31:19 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-ba9090487930ea4ca9efbb869a950be47d0c5c3f7a5f6eb919ee0be5f322c2ce-merged.mount: Deactivated successfully.
Feb 23 09:31:20 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:31:20.352 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:31:20 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 23 09:31:20 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 23 09:31:20 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 23 09:31:20 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 23 09:31:21 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-ba9090487930ea4ca9efbb869a950be47d0c5c3f7a5f6eb919ee0be5f322c2ce-merged.mount: Deactivated successfully.
Feb 23 09:31:21 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-912153276d119d62292ee43dc157a09b9029351ec12b42dcebd4c826260b5572-merged.mount: Deactivated successfully.
Feb 23 09:31:22 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-f3afd1cf5e6198a170887a65c5f10af446afae7f60b1c2348209fc3be458dddf-merged.mount: Deactivated successfully.
Feb 23 09:31:22 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-9ebf51f80a46e835820a271b66c56bf3153d0ad4226e954d9a4e5952244e92d3-merged.mount: Deactivated successfully.
Feb 23 09:31:22 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-9ebf51f80a46e835820a271b66c56bf3153d0ad4226e954d9a4e5952244e92d3-merged.mount: Deactivated successfully.
Feb 23 09:31:22 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-882df85a0cf43e46bc799aafd5ff81035654b304c2fef5dbd26c9dd0c2e9fcc3-merged.mount: Deactivated successfully.
Feb 23 09:31:22 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-f3afd1cf5e6198a170887a65c5f10af446afae7f60b1c2348209fc3be458dddf-merged.mount: Deactivated successfully.
Feb 23 09:31:23 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-d9f14c75a7289cf010d2e5175c554193dba109f864fe39fc418f3bc5b90efe9d-merged.mount: Deactivated successfully.
Feb 23 09:31:23 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-882df85a0cf43e46bc799aafd5ff81035654b304c2fef5dbd26c9dd0c2e9fcc3-merged.mount: Deactivated successfully.
Feb 23 09:31:23 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:31:23.942 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:31:24 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23613 DF PROTO=TCP SPT=39128 DPT=9102 SEQ=3298852228 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF634400000000001030307) 
Feb 23 09:31:24 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-9ebf51f80a46e835820a271b66c56bf3153d0ad4226e954d9a4e5952244e92d3-merged.mount: Deactivated successfully.
Feb 23 09:31:24 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60662 DF PROTO=TCP SPT=56642 DPT=9882 SEQ=2548069702 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF636060000000001030307) 
Feb 23 09:31:24 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.
Feb 23 09:31:24 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.
Feb 23 09:31:24 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.
Feb 23 09:31:24 np0005626463.localdomain podman[249258]: 2026-02-23 09:31:24.912749016 +0000 UTC m=+0.082689570 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ovn_controller, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 23 09:31:24 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-e0c81d46f937f1f84faf68fb71f862e4bd868921a7d16384d44790308d98719f-merged.mount: Deactivated successfully.
Feb 23 09:31:24 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-8194fb55613f783d5a49e05926ed565eee5321a07b6e20f485946b5c4f31cab4-merged.mount: Deactivated successfully.
Feb 23 09:31:24 np0005626463.localdomain podman[249258]: 2026-02-23 09:31:24.966265714 +0000 UTC m=+0.136206268 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20260216, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, config_id=ovn_controller)
Feb 23 09:31:25 np0005626463.localdomain podman[249257]: 2026-02-23 09:31:24.96612806 +0000 UTC m=+0.137152698 container health_status 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, config_id=openstack_network_exporter, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, vcs-type=git, io.openshift.expose-services=, vendor=Red Hat, Inc., version=9.7, url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9/ubi-minimal, build-date=2026-02-05T04:57:10Z, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, managed_by=edpm_ansible, container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, org.opencontainers.image.created=2026-02-05T04:57:10Z, release=1770267347)
Feb 23 09:31:25 np0005626463.localdomain podman[249257]: 2026-02-23 09:31:25.059242394 +0000 UTC m=+0.230267042 container exec_died 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, release=1770267347, architecture=x86_64, distribution-scope=public, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, maintainer=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2026-02-05T04:57:10Z, container_name=openstack_network_exporter, vendor=Red Hat, Inc., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.openshift.expose-services=, name=ubi9/ubi-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-02-05T04:57:10Z, version=9.7, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container)
Feb 23 09:31:25 np0005626463.localdomain systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully.
Feb 23 09:31:25 np0005626463.localdomain systemd[1]: 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.service: Deactivated successfully.
Feb 23 09:31:25 np0005626463.localdomain podman[249259]: 2026-02-23 09:31:25.301733836 +0000 UTC m=+0.467982609 container health_status bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Feb 23 09:31:25 np0005626463.localdomain podman[249259]: 2026-02-23 09:31:25.31178698 +0000 UTC m=+0.478035773 container exec_died bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 23 09:31:25 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:31:25.354 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:31:25 np0005626463.localdomain systemd[1]: bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.service: Deactivated successfully.
Feb 23 09:31:26 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-8194fb55613f783d5a49e05926ed565eee5321a07b6e20f485946b5c4f31cab4-merged.mount: Deactivated successfully.
Feb 23 09:31:27 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23615 DF PROTO=TCP SPT=39128 DPT=9102 SEQ=3298852228 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF640460000000001030307) 
Feb 23 09:31:27 np0005626463.localdomain sshd[249323]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:31:27 np0005626463.localdomain sshd[249323]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:31:28 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-239567307c66a10c4dd721df6a9263fcc38501437d275d2b4907c616b635d111-merged.mount: Deactivated successfully.
Feb 23 09:31:28 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-7e5a6b3af0e35b266ef2f57ba1f524615772066427004655f3e99c4f9072865c-merged.mount: Deactivated successfully.
Feb 23 09:31:28 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:31:28.994 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:31:30 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:31:30.389 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:31:30 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-0455f1f13172510bfb03afa514ad1dc5f28a2039a4c0ae85e44e0cde63814ca4-merged.mount: Deactivated successfully.
Feb 23 09:31:30 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-239567307c66a10c4dd721df6a9263fcc38501437d275d2b4907c616b635d111-merged.mount: Deactivated successfully.
Feb 23 09:31:30 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50002 DF PROTO=TCP SPT=40690 DPT=9100 SEQ=4076589671 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF64EC60000000001030307) 
Feb 23 09:31:30 np0005626463.localdomain sshd[249325]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:31:31 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-882df85a0cf43e46bc799aafd5ff81035654b304c2fef5dbd26c9dd0c2e9fcc3-merged.mount: Deactivated successfully.
Feb 23 09:31:31 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.
Feb 23 09:31:31 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-0455f1f13172510bfb03afa514ad1dc5f28a2039a4c0ae85e44e0cde63814ca4-merged.mount: Deactivated successfully.
Feb 23 09:31:31 np0005626463.localdomain podman[249326]: 2026-02-23 09:31:31.561847008 +0000 UTC m=+0.084181016 container health_status be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, 
org.label-schema.name=CentOS Stream 9 Base Image)
Feb 23 09:31:31 np0005626463.localdomain podman[249326]: 2026-02-23 09:31:31.575210161 +0000 UTC m=+0.097544179 container exec_died be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_managed=true, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, 
managed_by=edpm_ansible)
Feb 23 09:31:31 np0005626463.localdomain systemd[1]: be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.service: Deactivated successfully.
Feb 23 09:31:32 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-d9f14c75a7289cf010d2e5175c554193dba109f864fe39fc418f3bc5b90efe9d-merged.mount: Deactivated successfully.
Feb 23 09:31:32 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-882df85a0cf43e46bc799aafd5ff81035654b304c2fef5dbd26c9dd0c2e9fcc3-merged.mount: Deactivated successfully.
Feb 23 09:31:32 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50003 DF PROTO=TCP SPT=40690 DPT=9100 SEQ=4076589671 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF656C60000000001030307) 
Feb 23 09:31:32 np0005626463.localdomain sshd[249325]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:31:33 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:31:33.995 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:31:34 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-7e5a6b3af0e35b266ef2f57ba1f524615772066427004655f3e99c4f9072865c-merged.mount: Deactivated successfully.
Feb 23 09:31:35 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:31:35.421 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:31:36 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28380 DF PROTO=TCP SPT=60904 DPT=9101 SEQ=1954125877 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF663860000000001030307) 
Feb 23 09:31:36 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully.
Feb 23 09:31:36 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.
Feb 23 09:31:36 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-52bb44324f3eb9002a3bf4ee7b8544bc72e25676c81bb6c59a692125c71221e1-merged.mount: Deactivated successfully.
Feb 23 09:31:36 np0005626463.localdomain podman[249346]: 2026-02-23 09:31:36.736848226 +0000 UTC m=+0.084705442 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible)
Feb 23 09:31:36 np0005626463.localdomain sudo[249354]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 09:31:36 np0005626463.localdomain sudo[249354]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:31:36 np0005626463.localdomain sudo[249354]: pam_unix(sudo:session): session closed for user root
Feb 23 09:31:36 np0005626463.localdomain podman[249346]: 2026-02-23 09:31:36.771310167 +0000 UTC m=+0.119167363 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ovn_metadata_agent)
Feb 23 09:31:36 np0005626463.localdomain sudo[249382]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 23 09:31:36 np0005626463.localdomain sudo[249382]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:31:36 np0005626463.localdomain systemd[1]: 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully.
Feb 23 09:31:37 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-52bb44324f3eb9002a3bf4ee7b8544bc72e25676c81bb6c59a692125c71221e1-merged.mount: Deactivated successfully.
Feb 23 09:31:38 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 23 09:31:38 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully.
Feb 23 09:31:39 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:31:39.011 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:31:39 np0005626463.localdomain sudo[249382]: pam_unix(sudo:session): session closed for user root
Feb 23 09:31:39 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35897 DF PROTO=TCP SPT=54996 DPT=9882 SEQ=2209706486 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF66FF20000000001030307) 
Feb 23 09:31:39 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 23 09:31:39 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 23 09:31:40 np0005626463.localdomain sudo[249431]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 09:31:40 np0005626463.localdomain sudo[249431]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:31:40 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.
Feb 23 09:31:40 np0005626463.localdomain sudo[249431]: pam_unix(sudo:session): session closed for user root
Feb 23 09:31:40 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:31:40.425 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:31:40 np0005626463.localdomain podman[249449]: 2026-02-23 09:31:40.449600427 +0000 UTC m=+0.083696951 container health_status da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=unhealthy, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 23 09:31:40 np0005626463.localdomain podman[249449]: 2026-02-23 09:31:40.493762243 +0000 UTC m=+0.127858787 container exec_died da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 23 09:31:40 np0005626463.localdomain podman[249449]: unhealthy
Feb 23 09:31:40 np0005626463.localdomain systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Main process exited, code=exited, status=1/FAILURE
Feb 23 09:31:40 np0005626463.localdomain systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Failed with result 'exit-code'.
Feb 23 09:31:40 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 23 09:31:40 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 23 09:31:42 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35899 DF PROTO=TCP SPT=54996 DPT=9882 SEQ=2209706486 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF67C060000000001030307) 
Feb 23 09:31:43 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-52bb44324f3eb9002a3bf4ee7b8544bc72e25676c81bb6c59a692125c71221e1-merged.mount: Deactivated successfully.
Feb 23 09:31:43 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-49d558d6cd3227e44d6cf362abc8b50d968f0fb79b74496dd7e1499d728668a7-merged.mount: Deactivated successfully.
Feb 23 09:31:44 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:31:44.044 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:31:44 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-ac04412f5c5a43e8c61c2b8d6c1acf66f67fc19f0d028526d9bdbd1ed0352faf-merged.mount: Deactivated successfully.
Feb 23 09:31:44 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 23 09:31:44 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 23 09:31:45 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50005 DF PROTO=TCP SPT=40690 DPT=9100 SEQ=4076589671 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF686060000000001030307) 
Feb 23 09:31:45 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:31:45.471 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:31:45 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-ac04412f5c5a43e8c61c2b8d6c1acf66f67fc19f0d028526d9bdbd1ed0352faf-merged.mount: Deactivated successfully.
Feb 23 09:31:45 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-abedacbc22b2fa6d036e7ebf1118c866641128a14eaf11021e9356d60564993d-merged.mount: Deactivated successfully.
Feb 23 09:31:46 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 23 09:31:46 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-e812fa34defdc78ec5fb2b77011468829f7a2881cc57f804b1a422dc9e19278a-merged.mount: Deactivated successfully.
Feb 23 09:31:47 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 23 09:31:47 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 23 09:31:48 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-e812fa34defdc78ec5fb2b77011468829f7a2881cc57f804b1a422dc9e19278a-merged.mount: Deactivated successfully.
Feb 23 09:31:48 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-6de1ea423bd6d005b3c98cfa65644155837a90a885b901ec0c789a8fa4573360-merged.mount: Deactivated successfully.
Feb 23 09:31:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:31:48.532 163572 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 09:31:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:31:48.533 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 09:31:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:31:48.534 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 09:31:48 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28382 DF PROTO=TCP SPT=60904 DPT=9101 SEQ=1954125877 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF694060000000001030307) 
Feb 23 09:31:49 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:31:49.084 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:31:49 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-c3c2fee87fe7e8303aaac2829f1b7d26d779101a77d8fd6a9f6bec71602d9a66-merged.mount: Deactivated successfully.
Feb 23 09:31:49 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-4c416128fe28816a81362614e1a7f9e853b273ba662e28de61a85f5c6446ec2c-merged.mount: Deactivated successfully.
Feb 23 09:31:49 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-4c416128fe28816a81362614e1a7f9e853b273ba662e28de61a85f5c6446ec2c-merged.mount: Deactivated successfully.
Feb 23 09:31:50 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 23 09:31:50 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-c3c2fee87fe7e8303aaac2829f1b7d26d779101a77d8fd6a9f6bec71602d9a66-merged.mount: Deactivated successfully.
Feb 23 09:31:50 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-c3c2fee87fe7e8303aaac2829f1b7d26d779101a77d8fd6a9f6bec71602d9a66-merged.mount: Deactivated successfully.
Feb 23 09:31:50 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:31:50.515 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:31:50 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 23 09:31:50 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 23 09:31:50 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 23 09:31:51 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-4c416128fe28816a81362614e1a7f9e853b273ba662e28de61a85f5c6446ec2c-merged.mount: Deactivated successfully.
Feb 23 09:31:51 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-4603fc849c2ecb1a2dd39fe5f99a90015995e0b99d1b206aafaed4ee8a276f7b-merged.mount: Deactivated successfully.
Feb 23 09:31:52 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-4603fc849c2ecb1a2dd39fe5f99a90015995e0b99d1b206aafaed4ee8a276f7b-merged.mount: Deactivated successfully.
Feb 23 09:31:54 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:31:54.122 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:31:54 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33650 DF PROTO=TCP SPT=54694 DPT=9102 SEQ=3463304677 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF6A99F0000000001030307) 
Feb 23 09:31:54 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully.
Feb 23 09:31:54 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-52bb44324f3eb9002a3bf4ee7b8544bc72e25676c81bb6c59a692125c71221e1-merged.mount: Deactivated successfully.
Feb 23 09:31:54 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-52bb44324f3eb9002a3bf4ee7b8544bc72e25676c81bb6c59a692125c71221e1-merged.mount: Deactivated successfully.
Feb 23 09:31:54 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35901 DF PROTO=TCP SPT=54996 DPT=9882 SEQ=2209706486 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF6AC060000000001030307) 
Feb 23 09:31:55 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:31:55.560 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:31:55 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.
Feb 23 09:31:55 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.
Feb 23 09:31:55 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.
Feb 23 09:31:55 np0005626463.localdomain systemd[1]: tmp-crun.LvZYfc.mount: Deactivated successfully.
Feb 23 09:31:55 np0005626463.localdomain podman[249472]: 2026-02-23 09:31:55.929137465 +0000 UTC m=+0.102093567 container health_status 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, version=9.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, org.opencontainers.image.created=2026-02-05T04:57:10Z, maintainer=Red Hat, Inc., distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1770267347, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, build-date=2026-02-05T04:57:10Z, vcs-type=git, vendor=Red Hat, Inc., config_id=openstack_network_exporter, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Feb 23 09:31:55 np0005626463.localdomain podman[249472]: 2026-02-23 09:31:55.970134565 +0000 UTC m=+0.143090697 container exec_died 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.buildah.version=1.33.7, release=1770267347, org.opencontainers.image.created=2026-02-05T04:57:10Z, name=ubi9/ubi-minimal, distribution-scope=public, io.openshift.expose-services=, version=9.7, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, build-date=2026-02-05T04:57:10Z, vendor=Red Hat, Inc., config_id=openstack_network_exporter, managed_by=edpm_ansible, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-type=git)
Feb 23 09:31:55 np0005626463.localdomain podman[249474]: 2026-02-23 09:31:55.984782977 +0000 UTC m=+0.148413377 container health_status bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Feb 23 09:31:55 np0005626463.localdomain podman[249474]: 2026-02-23 09:31:55.995185492 +0000 UTC m=+0.158815902 container exec_died bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Feb 23 09:31:56 np0005626463.localdomain systemd[1]: tmp-crun.nd1vyX.mount: Deactivated successfully.
Feb 23 09:31:56 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 23 09:31:56 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully.
Feb 23 09:31:56 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully.
Feb 23 09:31:56 np0005626463.localdomain systemd[1]: 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.service: Deactivated successfully.
Feb 23 09:31:56 np0005626463.localdomain systemd[1]: bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.service: Deactivated successfully.
Feb 23 09:31:56 np0005626463.localdomain podman[249473]: 2026-02-23 09:31:56.57765929 +0000 UTC m=+0.746875259 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.build-date=20260216, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 23 09:31:56 np0005626463.localdomain podman[249473]: 2026-02-23 09:31:56.659213307 +0000 UTC m=+0.828429246 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS)
Feb 23 09:31:57 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33652 DF PROTO=TCP SPT=54694 DPT=9102 SEQ=3463304677 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF6B5C60000000001030307) 
Feb 23 09:31:57 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 23 09:31:57 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 23 09:31:57 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 23 09:31:57 np0005626463.localdomain systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully.
Feb 23 09:31:58 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 23 09:31:58 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 23 09:31:58 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 23 09:31:59 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:31:59.170 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:31:59 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:31:59.810 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:31:59 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:31:59.810 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:31:59 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:31:59.833 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:31:59 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:31:59.833 231725 DEBUG nova.compute.manager [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 23 09:31:59 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:31:59.833 231725 DEBUG nova.compute.manager [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 23 09:32:00 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:32:00.247 231725 DEBUG oslo_concurrency.lockutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Acquiring lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 23 09:32:00 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:32:00.248 231725 DEBUG oslo_concurrency.lockutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Acquired lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 23 09:32:00 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:32:00.248 231725 DEBUG nova.network.neutron [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 23 09:32:00 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:32:00.248 231725 DEBUG nova.objects.instance [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c2a7d92b-952f-46a7-8a6a-3322a48fcf4b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 23 09:32:00 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:32:00.598 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:32:00 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:32:00.641 231725 DEBUG nova.network.neutron [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Updating instance_info_cache with network_info: [{"id": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "address": "fa:16:3e:a0:9d:00", "network": {"id": "9da5b53d-3184-450f-9a5b-bdba1a6c9f6d", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "37b8098efb0d4ecc90b451a2db0e966f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa27e5011-20", "ovs_interfaceid": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 23 09:32:00 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:32:00.655 231725 DEBUG oslo_concurrency.lockutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Releasing lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 23 09:32:00 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:32:00.655 231725 DEBUG nova.compute.manager [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 23 09:32:00 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:32:00.655 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:32:00 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:32:00.656 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:32:00 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:32:00.656 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:32:00 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:32:00.657 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:32:00 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:32:00.657 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:32:00 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:32:00.657 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:32:00 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:32:00.657 231725 DEBUG nova.compute.manager [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 23 09:32:00 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:32:00.658 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:32:00 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:32:00.672 231725 DEBUG oslo_concurrency.lockutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 09:32:00 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:32:00.672 231725 DEBUG oslo_concurrency.lockutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 09:32:00 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:32:00.673 231725 DEBUG oslo_concurrency.lockutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 09:32:00 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:32:00.673 231725 DEBUG nova.compute.resource_tracker [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Auditing locally available compute resources for np0005626463.localdomain (node: np0005626463.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 23 09:32:00 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:32:00.674 231725 DEBUG oslo_concurrency.processutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 09:32:00 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7066 DF PROTO=TCP SPT=54430 DPT=9100 SEQ=2103503740 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF6C4060000000001030307) 
Feb 23 09:32:01 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:32:01.122 231725 DEBUG oslo_concurrency.processutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.448s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 09:32:01 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:32:01.194 231725 DEBUG nova.virt.libvirt.driver [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 23 09:32:01 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:32:01.194 231725 DEBUG nova.virt.libvirt.driver [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 23 09:32:01 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:32:01.392 231725 WARNING nova.virt.libvirt.driver [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 23 09:32:01 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:32:01.393 231725 DEBUG nova.compute.resource_tracker [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Hypervisor/Node resource view: name=np0005626463.localdomain free_ram=12414MB free_disk=41.83688735961914GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 23 09:32:01 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:32:01.394 231725 DEBUG oslo_concurrency.lockutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 09:32:01 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:32:01.394 231725 DEBUG oslo_concurrency.lockutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 09:32:01 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:32:01.470 231725 DEBUG nova.compute.resource_tracker [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Instance c2a7d92b-952f-46a7-8a6a-3322a48fcf4b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 23 09:32:01 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:32:01.471 231725 DEBUG nova.compute.resource_tracker [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 23 09:32:01 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:32:01.471 231725 DEBUG nova.compute.resource_tracker [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Final resource view: name=np0005626463.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 23 09:32:01 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:32:01.499 231725 DEBUG oslo_concurrency.processutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 09:32:01 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-52bb44324f3eb9002a3bf4ee7b8544bc72e25676c81bb6c59a692125c71221e1-merged.mount: Deactivated successfully.
Feb 23 09:32:01 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-c3d90fa516ab23da2bd722ed78a874566a7daf8e8d3d852895f80962cb5a1d59-merged.mount: Deactivated successfully.
Feb 23 09:32:01 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.
Feb 23 09:32:01 np0005626463.localdomain systemd[1]: tmp-crun.tCDynR.mount: Deactivated successfully.
Feb 23 09:32:01 np0005626463.localdomain podman[249582]: 2026-02-23 09:32:01.81588939 +0000 UTC m=+0.085423133 container health_status be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, 
container_name=ceilometer_agent_compute)
Feb 23 09:32:01 np0005626463.localdomain podman[249582]: 2026-02-23 09:32:01.853305802 +0000 UTC m=+0.122839595 container exec_died be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20260216, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ceilometer_agent_compute, io.buildah.version=1.43.0, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.schema-version=1.0)
Feb 23 09:32:01 np0005626463.localdomain systemd[1]: be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.service: Deactivated successfully.
Feb 23 09:32:01 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:32:01.965 231725 DEBUG oslo_concurrency.processutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.466s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 09:32:01 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:32:01.972 231725 DEBUG nova.compute.provider_tree [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Inventory has not changed in ProviderTree for provider: be63d86c-a403-4ec9-a515-07ea2962cb4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 23 09:32:01 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:32:01.994 231725 DEBUG nova.scheduler.client.report [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Inventory has not changed for provider be63d86c-a403-4ec9-a515-07ea2962cb4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 23 09:32:01 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:32:01.996 231725 DEBUG nova.compute.resource_tracker [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Compute_service record updated for np0005626463.localdomain:np0005626463.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 23 09:32:01 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:32:01.997 231725 DEBUG oslo_concurrency.lockutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.603s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 09:32:02 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7067 DF PROTO=TCP SPT=54430 DPT=9100 SEQ=2103503740 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF6CC060000000001030307) 
Feb 23 09:32:03 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 23 09:32:03 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-0336e79261e1f534d091cad94b9980aafc6b329c3b01bda2d50fcc505860ff11-merged.mount: Deactivated successfully.
Feb 23 09:32:03 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-0336e79261e1f534d091cad94b9980aafc6b329c3b01bda2d50fcc505860ff11-merged.mount: Deactivated successfully.
Feb 23 09:32:04 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:32:04.177 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:32:04 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 23 09:32:04 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 23 09:32:04 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 23 09:32:05 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 23 09:32:05 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 23 09:32:05 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 23 09:32:05 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 23 09:32:05 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:32:05.645 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:32:06 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22691 DF PROTO=TCP SPT=36906 DPT=9101 SEQ=2180461920 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF6D8C60000000001030307) 
Feb 23 09:32:06 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-0336e79261e1f534d091cad94b9980aafc6b329c3b01bda2d50fcc505860ff11-merged.mount: Deactivated successfully.
Feb 23 09:32:06 np0005626463.localdomain sshd[249603]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:32:06 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-bf00c918822c143f438250923a86f39afe39c46ee0adc9fcf99ac7bc5e8117c1-merged.mount: Deactivated successfully.
Feb 23 09:32:06 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-bf00c918822c143f438250923a86f39afe39c46ee0adc9fcf99ac7bc5e8117c1-merged.mount: Deactivated successfully.
Feb 23 09:32:06 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.
Feb 23 09:32:07 np0005626463.localdomain podman[249605]: 2026-02-23 09:32:07.064473582 +0000 UTC m=+0.082085322 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, managed_by=edpm_ansible, io.buildah.version=1.43.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2)
Feb 23 09:32:07 np0005626463.localdomain podman[249605]: 2026-02-23 09:32:07.073290489 +0000 UTC m=+0.090902229 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent)
Feb 23 09:32:07 np0005626463.localdomain sshd[249603]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:32:08 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-93889cfe8a1eaf916da420177b6c00eab0b6f1d6521b96229ee8963de2bbdb6f-merged.mount: Deactivated successfully.
Feb 23 09:32:08 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-67ff2fcf098d662f72898d504b273725d460bc3ee224388c566fda6c94421648-merged.mount: Deactivated successfully.
Feb 23 09:32:08 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-67ff2fcf098d662f72898d504b273725d460bc3ee224388c566fda6c94421648-merged.mount: Deactivated successfully.
Feb 23 09:32:08 np0005626463.localdomain systemd[1]: 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully.
Feb 23 09:32:09 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:32:09.215 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:32:09 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27808 DF PROTO=TCP SPT=52920 DPT=9882 SEQ=132694475 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF6E5230000000001030307) 
Feb 23 09:32:10 np0005626463.localdomain sshd[249623]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:32:10 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 23 09:32:10 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-93889cfe8a1eaf916da420177b6c00eab0b6f1d6521b96229ee8963de2bbdb6f-merged.mount: Deactivated successfully.
Feb 23 09:32:10 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-93889cfe8a1eaf916da420177b6c00eab0b6f1d6521b96229ee8963de2bbdb6f-merged.mount: Deactivated successfully.
Feb 23 09:32:10 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.
Feb 23 09:32:10 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:32:10.679 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:32:10 np0005626463.localdomain podman[249625]: 2026-02-23 09:32:10.724281763 +0000 UTC m=+0.105672955 container health_status da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=unhealthy, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 23 09:32:10 np0005626463.localdomain podman[249625]: 2026-02-23 09:32:10.76021701 +0000 UTC m=+0.141608192 container exec_died da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 23 09:32:10 np0005626463.localdomain podman[249625]: unhealthy
Feb 23 09:32:11 np0005626463.localdomain sshd[249623]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:32:11 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 23 09:32:11 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 23 09:32:11 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 23 09:32:11 np0005626463.localdomain systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Main process exited, code=exited, status=1/FAILURE
Feb 23 09:32:11 np0005626463.localdomain systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Failed with result 'exit-code'.
Feb 23 09:32:12 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27810 DF PROTO=TCP SPT=52920 DPT=9882 SEQ=132694475 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF6F1460000000001030307) 
Feb 23 09:32:12 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 23 09:32:12 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 23 09:32:12 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 23 09:32:14 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:32:14.262 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:32:14 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-67ff2fcf098d662f72898d504b273725d460bc3ee224388c566fda6c94421648-merged.mount: Deactivated successfully.
Feb 23 09:32:14 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-51915910ced93426f00f1704499e6c4900ce6f68bf275b1a1584b9abaa73dcbc-merged.mount: Deactivated successfully.
Feb 23 09:32:15 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7069 DF PROTO=TCP SPT=54430 DPT=9100 SEQ=2103503740 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF6FC060000000001030307) 
Feb 23 09:32:15 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:32:15.720 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:32:16 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully.
Feb 23 09:32:17 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-52bb44324f3eb9002a3bf4ee7b8544bc72e25676c81bb6c59a692125c71221e1-merged.mount: Deactivated successfully.
Feb 23 09:32:17 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-52bb44324f3eb9002a3bf4ee7b8544bc72e25676c81bb6c59a692125c71221e1-merged.mount: Deactivated successfully.
Feb 23 09:32:18 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22693 DF PROTO=TCP SPT=36906 DPT=9101 SEQ=2180461920 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF708060000000001030307) 
Feb 23 09:32:19 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 23 09:32:19 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully.
Feb 23 09:32:19 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:32:19.298 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:32:19 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully.
Feb 23 09:32:20 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 23 09:32:20 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 23 09:32:20 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 23 09:32:20 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:32:20.767 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:32:21 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 23 09:32:21 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 23 09:32:21 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 23 09:32:21 np0005626463.localdomain kernel: overlayfs: upperdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Feb 23 09:32:21 np0005626463.localdomain kernel: overlayfs: workdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Feb 23 09:32:24 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21394 DF PROTO=TCP SPT=34830 DPT=9102 SEQ=3082614443 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF71EA00000000001030307) 
Feb 23 09:32:24 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:32:24.333 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:32:24 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-52bb44324f3eb9002a3bf4ee7b8544bc72e25676c81bb6c59a692125c71221e1-merged.mount: Deactivated successfully.
Feb 23 09:32:24 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-7e96575c95037d7184ed74bc1e793aa507f3bf187550011550ade6ddf34aa4ff-merged.mount: Deactivated successfully.
Feb 23 09:32:24 np0005626463.localdomain kernel: overlayfs: upperdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Feb 23 09:32:24 np0005626463.localdomain kernel: overlayfs: workdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Feb 23 09:32:24 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27812 DF PROTO=TCP SPT=52920 DPT=9882 SEQ=132694475 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF722070000000001030307) 
Feb 23 09:32:25 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:32:25.802 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:32:26 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.
Feb 23 09:32:26 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.
Feb 23 09:32:26 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully.
Feb 23 09:32:26 np0005626463.localdomain systemd[1]: tmp-crun.7IT0kB.mount: Deactivated successfully.
Feb 23 09:32:26 np0005626463.localdomain podman[249649]: 2026-02-23 09:32:26.961394259 +0000 UTC m=+0.123475270 container health_status bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 23 09:32:26 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-52bb44324f3eb9002a3bf4ee7b8544bc72e25676c81bb6c59a692125c71221e1-merged.mount: Deactivated successfully.
Feb 23 09:32:27 np0005626463.localdomain podman[249648]: 2026-02-23 09:32:27.006719497 +0000 UTC m=+0.179703371 container health_status 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1770267347, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, name=ubi9/ubi-minimal, build-date=2026-02-05T04:57:10Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.7, org.opencontainers.image.created=2026-02-05T04:57:10Z, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., managed_by=edpm_ansible, distribution-scope=public, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter)
Feb 23 09:32:27 np0005626463.localdomain podman[249648]: 2026-02-23 09:32:27.017891609 +0000 UTC m=+0.190875503 container exec_died 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2026-02-05T04:57:10Z, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9/ubi-minimal, org.opencontainers.image.created=2026-02-05T04:57:10Z, io.openshift.expose-services=, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.openshift.tags=minimal rhel9, vcs-type=git, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, release=1770267347, version=9.7, architecture=x86_64, maintainer=Red Hat, Inc.)
Feb 23 09:32:27 np0005626463.localdomain podman[249649]: 2026-02-23 09:32:27.10131409 +0000 UTC m=+0.263395091 container exec_died bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Feb 23 09:32:27 np0005626463.localdomain systemd[1]: 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.service: Deactivated successfully.
Feb 23 09:32:27 np0005626463.localdomain systemd[1]: bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.service: Deactivated successfully.
Feb 23 09:32:27 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21396 DF PROTO=TCP SPT=34830 DPT=9102 SEQ=3082614443 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF72AC60000000001030307) 
Feb 23 09:32:27 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.
Feb 23 09:32:27 np0005626463.localdomain podman[249689]: 2026-02-23 09:32:27.907982074 +0000 UTC m=+0.079566877 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Feb 23 09:32:28 np0005626463.localdomain podman[249689]: 2026-02-23 09:32:28.00541041 +0000 UTC m=+0.176995233 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Feb 23 09:32:29 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 23 09:32:29 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully.
Feb 23 09:32:29 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:32:29.377 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:32:29 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully.
Feb 23 09:32:29 np0005626463.localdomain systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully.
Feb 23 09:32:30 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 23 09:32:30 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 23 09:32:30 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:32:30.853 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:32:30 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41882 DF PROTO=TCP SPT=49340 DPT=9100 SEQ=2011130389 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF739070000000001030307) 
Feb 23 09:32:30 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 23 09:32:31 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 23 09:32:31 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 23 09:32:31 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 23 09:32:32 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.
Feb 23 09:32:32 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 23 09:32:32 np0005626463.localdomain podman[249712]: 2026-02-23 09:32:32.166234179 +0000 UTC m=+0.092610470 container health_status be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ceilometer_agent_compute, 
org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0)
Feb 23 09:32:32 np0005626463.localdomain podman[249712]: 2026-02-23 09:32:32.178547318 +0000 UTC m=+0.104923659 container exec_died be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, config_id=ceilometer_agent_compute, 
container_name=ceilometer_agent_compute)
Feb 23 09:32:32 np0005626463.localdomain systemd[1]: be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.service: Deactivated successfully.
Feb 23 09:32:32 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41883 DF PROTO=TCP SPT=49340 DPT=9100 SEQ=2011130389 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF741060000000001030307) 
Feb 23 09:32:33 np0005626463.localdomain systemd[1]: tmp-crun.u3EDEU.mount: Deactivated successfully.
Feb 23 09:32:33 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 23 09:32:34 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:32:34.415 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:32:34 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-52bb44324f3eb9002a3bf4ee7b8544bc72e25676c81bb6c59a692125c71221e1-merged.mount: Deactivated successfully.
Feb 23 09:32:34 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-f1c03aa8e256d3d38d275b9e911c2e9e69db76da13bfae548890815046fc902e-merged.mount: Deactivated successfully.
Feb 23 09:32:34 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-f1c03aa8e256d3d38d275b9e911c2e9e69db76da13bfae548890815046fc902e-merged.mount: Deactivated successfully.
Feb 23 09:32:35 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 23 09:32:35 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-ba9090487930ea4ca9efbb869a950be47d0c5c3f7a5f6eb919ee0be5f322c2ce-merged.mount: Deactivated successfully.
Feb 23 09:32:35 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-ba9090487930ea4ca9efbb869a950be47d0c5c3f7a5f6eb919ee0be5f322c2ce-merged.mount: Deactivated successfully.
Feb 23 09:32:35 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:32:35.883 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:32:36 np0005626463.localdomain sshd[232037]: Received disconnect from 192.168.122.30 port 46748:11: disconnected by user
Feb 23 09:32:36 np0005626463.localdomain sshd[232037]: Disconnected from user zuul 192.168.122.30 port 46748
Feb 23 09:32:36 np0005626463.localdomain sshd[232034]: pam_unix(sshd:session): session closed for user zuul
Feb 23 09:32:36 np0005626463.localdomain systemd[1]: session-55.scope: Deactivated successfully.
Feb 23 09:32:36 np0005626463.localdomain systemd[1]: session-55.scope: Consumed 1min 21.309s CPU time.
Feb 23 09:32:36 np0005626463.localdomain systemd-logind[759]: Session 55 logged out. Waiting for processes to exit.
Feb 23 09:32:36 np0005626463.localdomain systemd-logind[759]: Removed session 55.
Feb 23 09:32:36 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43571 DF PROTO=TCP SPT=57646 DPT=9101 SEQ=956909870 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF74E060000000001030307) 
Feb 23 09:32:36 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 23 09:32:36 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 23 09:32:36 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 23 09:32:36 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 23 09:32:37 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-ba9090487930ea4ca9efbb869a950be47d0c5c3f7a5f6eb919ee0be5f322c2ce-merged.mount: Deactivated successfully.
Feb 23 09:32:37 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-b2c770567d2f47629c218ae90d489529d9f3e3ed2618072d59a3365c20854653-merged.mount: Deactivated successfully.
Feb 23 09:32:37 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-b2c770567d2f47629c218ae90d489529d9f3e3ed2618072d59a3365c20854653-merged.mount: Deactivated successfully.
Feb 23 09:32:38 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-ac04412f5c5a43e8c61c2b8d6c1acf66f67fc19f0d028526d9bdbd1ed0352faf-merged.mount: Deactivated successfully.
Feb 23 09:32:39 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 23 09:32:39 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.
Feb 23 09:32:39 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 23 09:32:39 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 23 09:32:39 np0005626463.localdomain podman[249731]: 2026-02-23 09:32:39.109628964 +0000 UTC m=+0.075818847 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.43.0, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0)
Feb 23 09:32:39 np0005626463.localdomain podman[249731]: 2026-02-23 09:32:39.118097028 +0000 UTC m=+0.084286921 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, tcib_managed=true)
Feb 23 09:32:39 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21398 DF PROTO=TCP SPT=34830 DPT=9102 SEQ=3082614443 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF75A070000000001030307) 
Feb 23 09:32:39 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:32:39.455 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:32:39 np0005626463.localdomain systemd[1]: 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully.
Feb 23 09:32:40 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-ac04412f5c5a43e8c61c2b8d6c1acf66f67fc19f0d028526d9bdbd1ed0352faf-merged.mount: Deactivated successfully.
Feb 23 09:32:40 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-df719217e40f9ffd193139cfaeeaaebcf46866a6b616db04d8d1f0793e86d521-merged.mount: Deactivated successfully.
Feb 23 09:32:40 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-df719217e40f9ffd193139cfaeeaaebcf46866a6b616db04d8d1f0793e86d521-merged.mount: Deactivated successfully.
Feb 23 09:32:40 np0005626463.localdomain sudo[249750]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 09:32:40 np0005626463.localdomain sudo[249750]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:32:40 np0005626463.localdomain sudo[249750]: pam_unix(sudo:session): session closed for user root
Feb 23 09:32:40 np0005626463.localdomain sudo[249768]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 check-host
Feb 23 09:32:40 np0005626463.localdomain sudo[249768]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:32:40 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:32:40.926 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:32:42 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully.
Feb 23 09:32:42 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.
Feb 23 09:32:42 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47585 DF PROTO=TCP SPT=37598 DPT=9882 SEQ=2607443554 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF766460000000001030307) 
Feb 23 09:32:42 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-0e4bbbcc3a308b062cd809f5d981a575292a522b3c6697e4ac7d70789e33f207-merged.mount: Deactivated successfully.
Feb 23 09:32:42 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-0e4bbbcc3a308b062cd809f5d981a575292a522b3c6697e4ac7d70789e33f207-merged.mount: Deactivated successfully.
Feb 23 09:32:42 np0005626463.localdomain podman[249798]: 2026-02-23 09:32:42.756333854 +0000 UTC m=+0.315619003 container health_status da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=unhealthy, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 23 09:32:42 np0005626463.localdomain sudo[249768]: pam_unix(sudo:session): session closed for user root
Feb 23 09:32:42 np0005626463.localdomain podman[249798]: 2026-02-23 09:32:42.809463854 +0000 UTC m=+0.368748993 container exec_died da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 23 09:32:42 np0005626463.localdomain podman[249798]: unhealthy
Feb 23 09:32:42 np0005626463.localdomain sudo[249829]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 09:32:42 np0005626463.localdomain sudo[249829]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:32:42 np0005626463.localdomain sudo[249829]: pam_unix(sudo:session): session closed for user root
Feb 23 09:32:42 np0005626463.localdomain sudo[249847]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 23 09:32:42 np0005626463.localdomain sudo[249847]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:32:44 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 23 09:32:44 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully.
Feb 23 09:32:44 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:32:44.500 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:32:44 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully.
Feb 23 09:32:44 np0005626463.localdomain systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Main process exited, code=exited, status=1/FAILURE
Feb 23 09:32:44 np0005626463.localdomain systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Failed with result 'exit-code'.
Feb 23 09:32:44 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63532 DF PROTO=TCP SPT=45642 DPT=9105 SEQ=2572149185 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF770060000000001030307) 
Feb 23 09:32:45 np0005626463.localdomain sshd[249878]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:32:45 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 23 09:32:45 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 23 09:32:45 np0005626463.localdomain sshd[249878]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:32:45 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 23 09:32:45 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:32:45.971 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:32:46 np0005626463.localdomain sudo[249847]: pam_unix(sudo:session): session closed for user root
Feb 23 09:32:46 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 23 09:32:46 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 23 09:32:46 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 23 09:32:47 np0005626463.localdomain sudo[249899]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 09:32:47 np0005626463.localdomain sudo[249899]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:32:47 np0005626463.localdomain sudo[249899]: pam_unix(sudo:session): session closed for user root
Feb 23 09:32:48 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43573 DF PROTO=TCP SPT=57646 DPT=9101 SEQ=956909870 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF77E060000000001030307) 
Feb 23 09:32:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:32:48.533 163572 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 09:32:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:32:48.533 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 09:32:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:32:48.534 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 09:32:49 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-0e4bbbcc3a308b062cd809f5d981a575292a522b3c6697e4ac7d70789e33f207-merged.mount: Deactivated successfully.
Feb 23 09:32:49 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-f56bdf141506d099102e067531f1fbfb1e40a67a70799f11577fc9c27fb9f83a-merged.mount: Deactivated successfully.
Feb 23 09:32:49 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:32:49.529 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:32:50 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-4e7c8cf8be5e28661f08c7ae9ca08b0a811b1f296a0663a493871b4299da2d4e-merged.mount: Deactivated successfully.
Feb 23 09:32:51 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:32:51.023 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:32:51 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-1962cc6363cc9ac3ab3c2a513bdaec43a309cd0406c08aa2e9112851ab244998-merged.mount: Deactivated successfully.
Feb 23 09:32:51 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-1962cc6363cc9ac3ab3c2a513bdaec43a309cd0406c08aa2e9112851ab244998-merged.mount: Deactivated successfully.
Feb 23 09:32:52 np0005626463.localdomain sshd[249917]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:32:52 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-0455f1f13172510bfb03afa514ad1dc5f28a2039a4c0ae85e44e0cde63814ca4-merged.mount: Deactivated successfully.
Feb 23 09:32:52 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-4e7c8cf8be5e28661f08c7ae9ca08b0a811b1f296a0663a493871b4299da2d4e-merged.mount: Deactivated successfully.
Feb 23 09:32:52 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-4e7c8cf8be5e28661f08c7ae9ca08b0a811b1f296a0663a493871b4299da2d4e-merged.mount: Deactivated successfully.
Feb 23 09:32:52 np0005626463.localdomain sshd[249917]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:32:52 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:32:52.541 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:32:52 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:32:52.541 231725 DEBUG nova.compute.manager [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Feb 23 09:32:52 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:32:52.582 231725 DEBUG nova.compute.manager [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Feb 23 09:32:52 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:32:52.582 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:32:52 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:32:52.583 231725 DEBUG nova.compute.manager [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Feb 23 09:32:52 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:32:52.597 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:32:53 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-882df85a0cf43e46bc799aafd5ff81035654b304c2fef5dbd26c9dd0c2e9fcc3-merged.mount: Deactivated successfully.
Feb 23 09:32:53 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-0455f1f13172510bfb03afa514ad1dc5f28a2039a4c0ae85e44e0cde63814ca4-merged.mount: Deactivated successfully.
Feb 23 09:32:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:32:53.606 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:32:53 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-d9f14c75a7289cf010d2e5175c554193dba109f864fe39fc418f3bc5b90efe9d-merged.mount: Deactivated successfully.
Feb 23 09:32:54 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11511 DF PROTO=TCP SPT=48682 DPT=9102 SEQ=41584801 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF793D00000000001030307) 
Feb 23 09:32:54 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-882df85a0cf43e46bc799aafd5ff81035654b304c2fef5dbd26c9dd0c2e9fcc3-merged.mount: Deactivated successfully.
Feb 23 09:32:54 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:32:54.568 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:32:54 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47587 DF PROTO=TCP SPT=37598 DPT=9882 SEQ=2607443554 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF796060000000001030307) 
Feb 23 09:32:55 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-1962cc6363cc9ac3ab3c2a513bdaec43a309cd0406c08aa2e9112851ab244998-merged.mount: Deactivated successfully.
Feb 23 09:32:55 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-a1185e7325783fe8cba63270bc6e59299386d7c73e4bc34c560a1fbc9e6d7e2c-merged.mount: Deactivated successfully.
Feb 23 09:32:55 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-2cd9444c84550fbd551e3826a8110fcc009757858b99e84f1119041f2325189b-merged.mount: Deactivated successfully.
Feb 23 09:32:55 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:32:55.540 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:32:55 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:32:55.541 231725 DEBUG nova.compute.manager [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 23 09:32:56 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:32:56.057 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.130 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'name': 'test', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000003', 'OS-EXT-SRV-ATTR:host': 'np0005626463.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '37b8098efb0d4ecc90b451a2db0e966f', 'user_id': 'cb6895487918456aa599ca2f76872d00', 'hostId': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.131 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.135 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.packets volume: 145 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.137 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '82509c2f-e805-495c-861d-404e51ac72ee', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 145, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:32:56.131570', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': 'a26e928c-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10616.321053996, 'message_signature': 'c47dba8b8c5f859c6255640b65821a2663ccad4f99bb5db509c377f5c9b7d3d6'}]}, 'timestamp': '2026-02-23 09:32:56.136255', '_unique_id': '63494af84f674ee8a5e3207e97cdffdf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.137 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.137 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.137 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.137 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.137 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.137 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.137 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.137 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.137 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.137 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.137 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.137 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.137 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.137 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.137 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.137 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.137 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.137 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.137 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.137 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.137 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.137 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.137 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.137 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.137 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.137 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.137 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.137 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.137 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.137 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.137 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.138 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.160 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/memory.usage volume: 52.38671875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.162 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '48089054-0a97-46f0-bfdf-d950556e8e39', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 52.38671875, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'timestamp': '2026-02-23T09:32:56.139143', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': 'a2725ce6-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10616.349778366, 'message_signature': '5514cbcb4d670468e4180ee8b2cbec6c102b44ca5f8eb73f9e84e26a0fd0c611'}]}, 'timestamp': '2026-02-23 09:32:56.161085', '_unique_id': '835e2340c5814c3888d860587da24f11'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.162 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.162 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.162 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.162 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.162 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.162 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.162 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.162 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.162 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.162 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.162 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.162 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.162 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.162 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.162 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.162 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.162 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.162 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.162 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.162 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.162 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.162 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.162 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.162 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.162 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.162 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.162 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.162 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.162 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.162 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.162 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.163 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.201 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.requests volume: 577 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.202 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.204 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '65f28275-b33a-4d1f-9643-955c89427800', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 577, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:32:56.163636', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a278aa42-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10616.353123156, 'message_signature': 'e4a6bf597e5f2b7cc0e7a95e30556a0e19fcba22d2a34c57bc54b7caa8c67f5f'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:32:56.163636', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a278bd34-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10616.353123156, 'message_signature': '659b51142d252d643b231d2765c78311a316e6f0941c156127315111cd100f94'}]}, 'timestamp': '2026-02-23 09:32:56.202806', '_unique_id': 'd242a90df7844e3c88985212737276ef'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.204 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.204 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.204 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.204 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.204 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.204 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.204 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.204 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.204 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.204 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.204 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.204 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.204 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.204 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.204 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.204 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.204 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.204 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.204 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.204 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.204 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.204 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.204 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.204 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.204 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.204 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.204 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.204 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.204 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.204 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.204 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.205 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.205 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.206 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3cc7f603-dc42-48cf-be05-5b4f1b446a1b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:32:56.205457', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': 'a279378c-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10616.321053996, 'message_signature': 'c83ac948f7216d8740034a359478e12e04342e27d098550790f0f65a08832960'}]}, 'timestamp': '2026-02-23 09:32:56.205983', '_unique_id': 'd0261587b5c34adaab474e946814affc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.206 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.206 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.206 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.206 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.206 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.206 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.206 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.206 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.206 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.206 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.206 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.206 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.206 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.206 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.206 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.206 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.206 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.206 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.206 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.206 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.206 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.206 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.206 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.206 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.206 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.206 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.206 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.206 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.206 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.206 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.206 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.208 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.208 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.209 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1a4a0421-61b0-4f49-9313-9f38aa7ef032', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:32:56.208173', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': 'a279a122-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10616.321053996, 'message_signature': 'f77a18813211db7418c8142bd8d57bc97bee09f4662c7714b319de99364d0f6f'}]}, 'timestamp': '2026-02-23 09:32:56.208649', '_unique_id': '9b36e81f8d9e489ebb7ae90211c8ee26'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.209 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.209 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.209 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.209 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.209 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.209 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.209 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.209 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.209 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.209 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.209 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.209 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.209 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.209 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.209 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.209 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.209 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.209 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.209 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.209 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.209 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.209 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.209 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.209 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.209 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.209 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.209 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.209 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.209 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.209 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.209 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.210 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.210 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.212 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '98de4735-6ab7-441f-88ff-44f9fd59004f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:32:56.210897', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': 'a27a0b8a-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10616.321053996, 'message_signature': '38c1d96aad64be8eff48d9e23ee943647c70fdbb584a4cb7ffe490029721a46e'}]}, 'timestamp': '2026-02-23 09:32:56.211374', '_unique_id': 'c0c4e640ed8045a184185de67f36946b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.212 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.212 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.212 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.212 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.212 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.212 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.212 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.212 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.212 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.212 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.212 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.212 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.212 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.212 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.212 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.212 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.212 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.212 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.212 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.212 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.212 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.212 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.212 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.212 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.212 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.212 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.212 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.212 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.212 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.212 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.212 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.214 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.215 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.latency volume: 1234377028 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.215 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.latency volume: 170393160 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.216 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2d9facf0-8fc0-47cc-95cf-5dc0bdf10a57', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1234377028, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:32:56.215007', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a27aaba8-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10616.353123156, 'message_signature': 'b139823ebf670036d8325d49339fd15f039caf6dc9dcbfc34e0e71d486861f50'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 170393160, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:32:56.215007', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a27abc4c-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10616.353123156, 'message_signature': 'ec90a83b9348925c91498b67313d1246074c0d28b7f123d2bdb387f7330ece75'}]}, 'timestamp': '2026-02-23 09:32:56.215972', '_unique_id': '0f2221d6624840f587c38c4acaa19f34'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.216 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.216 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.216 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.216 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.216 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.216 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.216 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.216 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.216 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.216 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.216 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.216 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.216 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.216 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.216 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.216 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.216 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.216 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.216 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.216 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.216 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.216 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.216 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.216 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.216 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.216 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.216 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.216 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.216 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.216 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.216 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.218 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.218 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.219 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f5d472a7-3797-4892-870b-f2a41f1dbbf7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:32:56.218238', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': 'a27b2a24-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10616.321053996, 'message_signature': '5327072ff96b050139a8f2e4045b069461798c5395a10ae6190be45304bc8325'}]}, 'timestamp': '2026-02-23 09:32:56.218710', '_unique_id': '82d0e217d9c04e7cb3d3661eb9177fd4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.219 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.219 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.219 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.219 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.219 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.219 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.219 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.219 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.219 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.219 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.219 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.219 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.219 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.219 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.219 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.219 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.219 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.219 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.219 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.219 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.219 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.219 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.219 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.219 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.219 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.219 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.219 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.219 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.219 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.219 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.219 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.220 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.220 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/cpu volume: 54850000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.222 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '46fe0a86-5686-48c1-8726-e9251a11e9fb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 54850000000, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'timestamp': '2026-02-23T09:32:56.220850', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': 'a27b90fe-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10616.349778366, 'message_signature': '4d982532e70ad717e08cf7161ad76e9a3dcb928f516ad548084f0c5f1222625f'}]}, 'timestamp': '2026-02-23 09:32:56.221387', '_unique_id': '7b25c7938bb04691a862e26955dbd069'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.222 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.222 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.222 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.222 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.222 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.222 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.222 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.222 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.222 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.222 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.222 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.222 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.222 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.222 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.222 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.222 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.222 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.222 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.222 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.222 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.222 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.222 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.222 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.222 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.222 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.222 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.222 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.222 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.222 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.222 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.222 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.222 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.222 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.latency volume: 260974500 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.223 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.latency volume: 24478467 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.224 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8fed521d-8876-4e23-b4d9-702b95028ba5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 260974500, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:32:56.222819', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a27bda32-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10616.353123156, 'message_signature': '9370494d269c337a833696da33e53f2316a4414631f51d323f481e7db5f0054c'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 24478467, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:32:56.222819', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a27be482-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10616.353123156, 'message_signature': '71a49b91119e58ff3dcb93d6441573d6b43c1445f9e52e2e39d0bc341771979a'}]}, 'timestamp': '2026-02-23 09:32:56.223373', '_unique_id': '250e6282ebb643c9b08f093e440e117b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.224 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.224 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.224 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.224 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.224 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.224 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.224 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.224 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.224 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.224 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.224 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.224 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.224 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.224 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.224 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.224 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.224 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.224 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.224 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.224 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.224 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.224 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.224 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.224 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.224 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.224 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.224 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.224 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.224 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.224 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.224 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.224 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.224 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.225 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.225 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fac153be-d491-4c6a-87c6-9c530827c254', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:32:56.225025', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': 'a27c3022-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10616.321053996, 'message_signature': '8c59fc56120c36a93fc7af745a1b57b3f4d31e5e1893d388e8269be0156b332b'}]}, 'timestamp': '2026-02-23 09:32:56.225346', '_unique_id': '72837f4c423d431ab39d1de48a05e1ed'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.225 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.225 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.225 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.225 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.225 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.225 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.225 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.225 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.225 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.225 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.225 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.225 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.225 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.225 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.225 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.225 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.225 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.225 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.225 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.225 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.225 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.225 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.225 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.225 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.225 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.225 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.225 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.225 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.225 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.225 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.225 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.226 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.226 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.bytes volume: 74063872 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.227 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.227 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3518902b-941f-4268-b996-2dc983a73687', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 74063872, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:32:56.226721', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a27c71ae-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10616.353123156, 'message_signature': '0a7f005c30e1c3823a3f6f5f3a4aff25b4c93d4c1a3f47a268652f3788dbdf5a'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 
'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:32:56.226721', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a27c7f00-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10616.353123156, 'message_signature': '27554210fe8367c1f7e87fe4fab29b4c2cf2400df01efd0bedfe25fc65b8b99f'}]}, 'timestamp': '2026-02-23 09:32:56.227331', '_unique_id': '58802dacd1d847459381d3df8b842f43'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.227 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.227 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.227 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.227 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.227 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.227 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.227 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.227 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.227 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.227 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.227 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.227 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.227 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.227 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.227 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.227 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.227 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.227 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.227 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.227 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.227 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.227 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.227 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.227 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.227 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.227 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.227 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.227 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.227 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.227 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.227 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.228 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.228 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.229 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '02ab430f-bbc8-4a3e-a05e-75993a7f3ba1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:32:56.228893', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': 'a27cc712-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10616.321053996, 'message_signature': '08bea3778b74c74d39939389afe848543247de210a974f913bcf3d68e8921cf9'}]}, 'timestamp': '2026-02-23 09:32:56.229244', '_unique_id': '729f9553263046168c035e5e7fa51893'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.229 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.229 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.229 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.229 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.229 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.229 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.229 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.229 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.229 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.229 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.229 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.229 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.229 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.229 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.229 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.229 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.229 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.229 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.229 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.229 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.229 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.229 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.229 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.229 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.229 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.229 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.229 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.229 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.229 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.229 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.229 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.230 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.241 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.241 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.242 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c79dd5a8-3c65-489e-a26c-0e49333f99da', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:32:56.230629', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a27eb9b4-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10616.420105705, 'message_signature': '9c04206439ca020ec4362d22f4dff51e20818590131a0e1aeff1f24b8f96216d'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:32:56.230629', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a27ec77e-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10616.420105705, 'message_signature': '100cb7115210309e076305fd67a0a0dbd066c58b55f5a6e131b55250a16ae8d8'}]}, 'timestamp': '2026-02-23 09:32:56.242298', '_unique_id': '2ac369556d6c4213b3cd70a7b6932231'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.242 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.242 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.242 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.242 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.242 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.242 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.242 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.242 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.242 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.242 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.242 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.242 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.242 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.242 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.242 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.242 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.242 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.242 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.242 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.242 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.242 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.242 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.242 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.242 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.242 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.242 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.242 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.242 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.242 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.242 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.242 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.243 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.243 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.bytes volume: 12784 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.244 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '318e6f0e-b0c1-4d2f-bb26-a1f4fa608f5b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 12784, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:32:56.243786', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': 'a27f0d60-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10616.321053996, 'message_signature': 'f927672e800aed80538a7c9d9a1f9acab5ffe8a1a20af62aeef349a8d27cf99b'}]}, 'timestamp': '2026-02-23 09:32:56.244152', '_unique_id': 'aa9b7731cd3547c79f3b982859a5f1b2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.244 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.244 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.244 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.244 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.244 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.244 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.244 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.244 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.244 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.244 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.244 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.244 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.244 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.244 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.244 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.244 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.244 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.244 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.244 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.244 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.244 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.244 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.244 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.244 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.244 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.244 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.244 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.244 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.244 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.244 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.244 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.245 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.245 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.packets volume: 87 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.246 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4726e68f-19a2-4c55-932d-36e28fb4c465', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 87, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:32:56.245566', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': 'a27f51ee-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10616.321053996, 'message_signature': '6d0e3797b058f605dc23aeaf82ab79e60f7351331d2ffa836df5fe7cfec3bdd9'}]}, 'timestamp': '2026-02-23 09:32:56.245854', '_unique_id': 'c87de322572844efb22414db4d6baa4f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.246 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.246 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.246 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.246 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.246 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.246 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.246 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.246 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.246 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.246 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.246 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.246 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.246 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.246 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.246 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.246 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.246 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.246 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.246 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.246 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.246 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.246 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.246 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.246 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.246 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.246 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.246 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.246 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.246 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.246 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.246 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.247 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.247 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.bytes volume: 29130240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.247 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.bytes volume: 4300800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.248 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '91508e55-2e52-4700-be7c-4120bd10b950', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29130240, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:32:56.247302', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a27f95be-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10616.353123156, 'message_signature': '78946f3cf075e985d5253d0f882d6a39446fe52966df0a15ca584b22d8e0d471'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4300800, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 
'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:32:56.247302', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a27f9ff0-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10616.353123156, 'message_signature': '0e5fec729ccc81281612885db55eafe81772eae24529bebd390514a09ba1b04c'}]}, 'timestamp': '2026-02-23 09:32:56.247837', '_unique_id': '5db49799673e42f6a3caa4e407b1a6c1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.248 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.248 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.248 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.248 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.248 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.248 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.248 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.248 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.248 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.248 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.248 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.248 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.248 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.248 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.248 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.248 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.248 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.248 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.248 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.248 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.248 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.248 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.248 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.248 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.248 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.248 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.248 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.248 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.248 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.248 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.248 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.249 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.249 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.249 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.249 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.allocation volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.250 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6c12a4ca-dc99-4396-be35-991695851a84', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:32:56.249426', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a27fe898-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10616.420105705, 'message_signature': '542d2452a1ad0e2c06d66692588e3d75e17426ce3665022292db4fa65c56da1f'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 
'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:32:56.249426', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a27ff2de-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10616.420105705, 'message_signature': '4e28dc28a09f491716ce0e67b3037c681589cfa4fbec4c34c6dd0c8192a690c7'}]}, 'timestamp': '2026-02-23 09:32:56.249972', '_unique_id': 'eee84fb3880d4b6fa777d98a2c5f0451'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.250 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.250 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.250 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.250 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.250 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.250 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.250 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.250 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.250 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.250 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.250 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.250 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.250 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.250 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.250 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.250 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.250 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.250 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.250 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.250 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.250 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.250 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.250 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.250 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.250 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.250 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.250 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.250 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.250 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.250 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.250 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.251 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.251 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.bytes volume: 9216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.252 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bd4dcd49-4f6f-4b8d-8b9f-0e816bbe595f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9216, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:32:56.251420', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': 'a28036ae-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10616.321053996, 'message_signature': '759dfeda47d57c7488cfc1a1934aa76726d12f4e881979c3d3747dbb88d5f38a'}]}, 'timestamp': '2026-02-23 09:32:56.251709', '_unique_id': '3bc5765445cd474684408787244f6a8b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.252 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.252 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.252 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.252 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.252 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.252 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.252 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.252 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.252 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.252 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.252 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.252 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.252 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.252 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.252 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.252 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.252 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.252 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.252 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.252 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.252 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.252 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.252 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.252 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.252 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.252 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.252 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.252 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.252 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.252 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.252 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.253 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.253 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.253 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.253 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.requests volume: 1064 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.253 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.requests volume: 222 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.254 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '86c57273-93f4-45bc-ab8c-5150032ab6eb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1064, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:32:56.253267', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a2808186-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10616.353123156, 'message_signature': '7488b9ed292b5f6a6af0f712ed2ad658a0f5c61a203ebb65800386e710b63119'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 222, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 
'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:32:56.253267', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a2808bb8-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10616.353123156, 'message_signature': '53942686608a4787b8f768d66bed00ba36cc45c52219b3b0ba1a6db6c3e3c664'}]}, 'timestamp': '2026-02-23 09:32:56.253883', '_unique_id': 'cb94a85990194bbdb98bebbba820c892'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.254 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.254 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.254 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.254 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.254 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.254 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.254 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.254 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.254 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.254 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.254 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.254 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.254 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.254 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.254 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.254 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.254 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.254 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.254 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.254 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.254 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.254 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.254 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.254 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.254 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.254 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.254 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.254 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.254 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.254 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.254 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.255 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.255 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.255 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.256 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7aec2b2b-8ae6-45c1-a5c8-3fed6884de57', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:32:56.255285', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a280cfba-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10616.420105705, 'message_signature': '61b24fe090710890b0d007468b77dee1dd3ded04371e1295c75e10046c36b22f'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 
'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:32:56.255285', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a280da28-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10616.420105705, 'message_signature': '20ca10a6e14d59b59e9b513e0b895df4e319bb058940124da8204ab2cdce2db5'}]}, 'timestamp': '2026-02-23 09:32:56.255897', '_unique_id': '2d993f22ced94bd28e6f4720b7c53dfb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.256 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.256 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.256 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.256 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.256 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.256 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.256 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.256 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.256 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.256 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.256 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.256 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.256 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.256 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.256 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.256 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.256 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.256 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.256 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.256 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.256 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.256 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.256 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.256 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.256 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.256 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.256 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.256 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.256 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.256 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:32:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.256 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:32:56 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-0438ade5aeea533b00cd75095bec75fbc2b307bace4c89bb39b75d428637bcd8-merged.mount: Deactivated successfully.
Feb 23 09:32:56 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-a1185e7325783fe8cba63270bc6e59299386d7c73e4bc34c560a1fbc9e6d7e2c-merged.mount: Deactivated successfully.
Feb 23 09:32:56 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:32:56.540 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:32:56 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:32:56.541 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:32:57 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11513 DF PROTO=TCP SPT=48682 DPT=9102 SEQ=41584801 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF79FC70000000001030307) 
Feb 23 09:32:57 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-eae537b18cb4af6ef1d611e84802ac12d948a1ed622870af6f76704805834c9a-merged.mount: Deactivated successfully.
Feb 23 09:32:57 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:32:57.539 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:32:57 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:32:57.540 231725 DEBUG nova.compute.manager [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 23 09:32:57 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:32:57.540 231725 DEBUG nova.compute.manager [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 23 09:32:57 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.
Feb 23 09:32:57 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.
Feb 23 09:32:57 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-40d13af751dd0e47fc8bb889a91a6d655bc2617bd5ab127ac97d8b2c392f6c58-merged.mount: Deactivated successfully.
Feb 23 09:32:57 np0005626463.localdomain podman[249919]: 2026-02-23 09:32:57.644250353 +0000 UTC m=+0.086703949 container health_status 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, io.openshift.tags=minimal rhel9, architecture=x86_64, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, container_name=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, managed_by=edpm_ansible, version=9.7, com.redhat.component=ubi9-minimal-container, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, vcs-type=git, build-date=2026-02-05T04:57:10Z, release=1770267347, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., org.opencontainers.image.created=2026-02-05T04:57:10Z, maintainer=Red Hat, Inc., name=ubi9/ubi-minimal)
Feb 23 09:32:57 np0005626463.localdomain podman[249919]: 2026-02-23 09:32:57.65711673 +0000 UTC m=+0.099570246 container exec_died 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, architecture=x86_64, build-date=2026-02-05T04:57:10Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2026-02-05T04:57:10Z, container_name=openstack_network_exporter, distribution-scope=public, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, maintainer=Red Hat, Inc., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.7, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1770267347, name=ubi9/ubi-minimal)
Feb 23 09:32:57 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-40d13af751dd0e47fc8bb889a91a6d655bc2617bd5ab127ac97d8b2c392f6c58-merged.mount: Deactivated successfully.
Feb 23 09:32:57 np0005626463.localdomain systemd[1]: 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.service: Deactivated successfully.
Feb 23 09:32:57 np0005626463.localdomain podman[249920]: 2026-02-23 09:32:57.750013578 +0000 UTC m=+0.191416599 container health_status bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Feb 23 09:32:57 np0005626463.localdomain podman[249920]: 2026-02-23 09:32:57.757322184 +0000 UTC m=+0.198725265 container exec_died bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 23 09:32:58 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:32:58.391 231725 DEBUG oslo_concurrency.lockutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Acquiring lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 23 09:32:58 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:32:58.392 231725 DEBUG oslo_concurrency.lockutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Acquired lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 23 09:32:58 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:32:58.392 231725 DEBUG nova.network.neutron [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 23 09:32:58 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:32:58.392 231725 DEBUG nova.objects.instance [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c2a7d92b-952f-46a7-8a6a-3322a48fcf4b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 23 09:32:59 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:32:59.418 231725 DEBUG nova.network.neutron [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Updating instance_info_cache with network_info: [{"id": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "address": "fa:16:3e:a0:9d:00", "network": {"id": "9da5b53d-3184-450f-9a5b-bdba1a6c9f6d", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "37b8098efb0d4ecc90b451a2db0e966f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa27e5011-20", "ovs_interfaceid": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 23 09:32:59 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:32:59.442 231725 DEBUG oslo_concurrency.lockutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Releasing lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 23 09:32:59 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:32:59.443 231725 DEBUG nova.compute.manager [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 23 09:32:59 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:32:59.443 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:32:59 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:32:59.443 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:32:59 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 23 09:32:59 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-eae537b18cb4af6ef1d611e84802ac12d948a1ed622870af6f76704805834c9a-merged.mount: Deactivated successfully.
Feb 23 09:32:59 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:32:59.595 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:32:59 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.
Feb 23 09:32:59 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-eae537b18cb4af6ef1d611e84802ac12d948a1ed622870af6f76704805834c9a-merged.mount: Deactivated successfully.
Feb 23 09:32:59 np0005626463.localdomain systemd[1]: bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.service: Deactivated successfully.
Feb 23 09:32:59 np0005626463.localdomain podman[249960]: 2026-02-23 09:32:59.918246467 +0000 UTC m=+0.241020376 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb)
Feb 23 09:32:59 np0005626463.localdomain podman[249960]: 2026-02-23 09:32:59.983659045 +0000 UTC m=+0.306432914 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb)
Feb 23 09:33:00 np0005626463.localdomain sshd[249984]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:33:00 np0005626463.localdomain sshd[249984]: Accepted publickey for zuul from 192.168.122.30 port 57694 ssh2: RSA SHA256:/ShS2J5Dq7o9P59e/NmgQORSAcJOBwu46Huo03HBdB4
Feb 23 09:33:00 np0005626463.localdomain systemd-logind[759]: New session 56 of user zuul.
Feb 23 09:33:00 np0005626463.localdomain systemd[1]: Started Session 56 of User zuul.
Feb 23 09:33:00 np0005626463.localdomain sshd[249984]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Feb 23 09:33:00 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:33:00.439 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:33:00 np0005626463.localdomain sudo[250078]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wslmihwuqmdyvhhnxnyukhanagqsoawo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839180.2317283-3687-81237400079641/AnsiballZ_file.py
Feb 23 09:33:00 np0005626463.localdomain sudo[250078]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:33:00 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:33:00.539 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:33:00 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:33:00.565 231725 DEBUG oslo_concurrency.lockutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 09:33:00 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:33:00.565 231725 DEBUG oslo_concurrency.lockutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 09:33:00 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:33:00.565 231725 DEBUG oslo_concurrency.lockutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 09:33:00 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:33:00.566 231725 DEBUG nova.compute.resource_tracker [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Auditing locally available compute resources for np0005626463.localdomain (node: np0005626463.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 23 09:33:00 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:33:00.566 231725 DEBUG oslo_concurrency.processutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 09:33:00 np0005626463.localdomain python3.9[250080]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall/ state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:33:00 np0005626463.localdomain sudo[250078]: pam_unix(sudo:session): session closed for user root
Feb 23 09:33:00 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 23 09:33:00 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 23 09:33:00 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45735 DF PROTO=TCP SPT=60906 DPT=9100 SEQ=2260014654 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF7AE460000000001030307) 
Feb 23 09:33:00 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 23 09:33:00 np0005626463.localdomain systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully.
Feb 23 09:33:01 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:33:01.100 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:33:01 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:33:01.119 231725 DEBUG oslo_concurrency.processutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.553s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 09:33:01 np0005626463.localdomain sudo[250210]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-okcwznzfexhgobyojymmdhomrtadhtwt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839180.8847115-3714-46584600479295/AnsiballZ_stat.py
Feb 23 09:33:01 np0005626463.localdomain sudo[250210]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:33:01 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:33:01.182 231725 DEBUG nova.virt.libvirt.driver [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 23 09:33:01 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:33:01.183 231725 DEBUG nova.virt.libvirt.driver [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 23 09:33:01 np0005626463.localdomain python3.9[250212]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/telemetry.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:33:01 np0005626463.localdomain sudo[250210]: pam_unix(sudo:session): session closed for user root
Feb 23 09:33:01 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:33:01.364 231725 WARNING nova.virt.libvirt.driver [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 23 09:33:01 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:33:01.366 231725 DEBUG nova.compute.resource_tracker [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Hypervisor/Node resource view: name=np0005626463.localdomain free_ram=12483MB free_disk=41.83688735961914GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 23 09:33:01 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:33:01.367 231725 DEBUG oslo_concurrency.lockutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 09:33:01 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:33:01.368 231725 DEBUG oslo_concurrency.lockutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 09:33:01 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:33:01.501 231725 DEBUG nova.compute.resource_tracker [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Instance c2a7d92b-952f-46a7-8a6a-3322a48fcf4b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 23 09:33:01 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:33:01.501 231725 DEBUG nova.compute.resource_tracker [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 23 09:33:01 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:33:01.502 231725 DEBUG nova.compute.resource_tracker [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Final resource view: name=np0005626463.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 23 09:33:01 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:33:01.553 231725 DEBUG nova.scheduler.client.report [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Refreshing inventories for resource provider be63d86c-a403-4ec9-a515-07ea2962cb4d _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Feb 23 09:33:01 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:33:01.621 231725 DEBUG nova.scheduler.client.report [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Updating ProviderTree inventory for provider be63d86c-a403-4ec9-a515-07ea2962cb4d from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Feb 23 09:33:01 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:33:01.621 231725 DEBUG nova.compute.provider_tree [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Updating inventory in ProviderTree for provider be63d86c-a403-4ec9-a515-07ea2962cb4d with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Feb 23 09:33:01 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:33:01.641 231725 DEBUG nova.scheduler.client.report [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Refreshing aggregate associations for resource provider be63d86c-a403-4ec9-a515-07ea2962cb4d, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Feb 23 09:33:01 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:33:01.665 231725 DEBUG nova.scheduler.client.report [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Refreshing trait associations for resource provider be63d86c-a403-4ec9-a515-07ea2962cb4d, traits: HW_CPU_X86_AMD_SVM,HW_CPU_X86_BMI,HW_CPU_X86_AESNI,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SVM,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_ABM,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_ACCELERATORS,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_F16C,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VOLUME_EXTEND,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_AVX2,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_TRUSTED_CERTS,COMPUTE_STORAGE_BUS_USB,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE2,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_AVX,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSE42,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_SSE4A,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_BMI2,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_FMA3,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_CLMUL,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NODE,HW_CPU_X86_SSSE3,HW_CPU_X86_SHA,HW_CPU_X86_SSE,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_MMX,HW_CPU_X86_SSE41,COMPUTE_VOLUME_ATTACH_WITH_TAG _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Feb 23 09:33:01 np0005626463.localdomain sudo[250298]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eqmshixxogvljwwzxloxsxhykqbcpxpf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839180.8847115-3714-46584600479295/AnsiballZ_copy.py
Feb 23 09:33:01 np0005626463.localdomain sudo[250298]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:33:01 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 23 09:33:01 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:33:01.714 231725 DEBUG oslo_concurrency.processutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 09:33:01 np0005626463.localdomain python3.9[250300]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/telemetry.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1771839180.8847115-3714-46584600479295/.source.yaml _original_basename=firewall.yaml follow=False checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:33:01 np0005626463.localdomain sudo[250298]: pam_unix(sudo:session): session closed for user root
Feb 23 09:33:01 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 23 09:33:02 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:33:02.148 231725 DEBUG oslo_concurrency.processutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.434s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 09:33:02 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:33:02.151 231725 DEBUG nova.compute.provider_tree [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Inventory has not changed in ProviderTree for provider: be63d86c-a403-4ec9-a515-07ea2962cb4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 23 09:33:02 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:33:02.169 231725 DEBUG nova.scheduler.client.report [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Inventory has not changed for provider be63d86c-a403-4ec9-a515-07ea2962cb4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 23 09:33:02 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:33:02.172 231725 DEBUG nova.compute.resource_tracker [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Compute_service record updated for np0005626463.localdomain:np0005626463.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 23 09:33:02 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:33:02.173 231725 DEBUG oslo_concurrency.lockutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.805s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 09:33:02 np0005626463.localdomain sudo[250430]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qfbruzqgqhcnlwuzmstgipoiwarzcyme ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839182.2719655-3762-196333737094511/AnsiballZ_file.py
Feb 23 09:33:02 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.
Feb 23 09:33:02 np0005626463.localdomain sudo[250430]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:33:02 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 23 09:33:02 np0005626463.localdomain podman[250432]: 2026-02-23 09:33:02.66217478 +0000 UTC m=+0.095026459 container health_status be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, managed_by=edpm_ansible)
Feb 23 09:33:02 np0005626463.localdomain podman[250432]: 2026-02-23 09:33:02.699098685 +0000 UTC m=+0.131950334 container exec_died be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_managed=true, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0)
Feb 23 09:33:02 np0005626463.localdomain python3.9[250433]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:33:02 np0005626463.localdomain sudo[250430]: pam_unix(sudo:session): session closed for user root
Feb 23 09:33:02 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45736 DF PROTO=TCP SPT=60906 DPT=9100 SEQ=2260014654 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF7B6460000000001030307) 
Feb 23 09:33:03 np0005626463.localdomain sudo[250559]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-joiqogvbfvowooibgznvodfcssexkvns ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839183.0216036-3786-100665540874067/AnsiballZ_stat.py
Feb 23 09:33:03 np0005626463.localdomain sudo[250559]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:33:03 np0005626463.localdomain python3.9[250561]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:33:03 np0005626463.localdomain sudo[250559]: pam_unix(sudo:session): session closed for user root
Feb 23 09:33:03 np0005626463.localdomain sudo[250616]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hekjumkighourjxfuxvaomsusqkysmbj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839183.0216036-3786-100665540874067/AnsiballZ_file.py
Feb 23 09:33:03 np0005626463.localdomain sudo[250616]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:33:03 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-40d13af751dd0e47fc8bb889a91a6d655bc2617bd5ab127ac97d8b2c392f6c58-merged.mount: Deactivated successfully.
Feb 23 09:33:04 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-eaf5c828e8984d86d81a6eee5a482e70c553115148192fac48b0718754776f54-merged.mount: Deactivated successfully.
Feb 23 09:33:04 np0005626463.localdomain python3.9[250618]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:33:04 np0005626463.localdomain sudo[250616]: pam_unix(sudo:session): session closed for user root
Feb 23 09:33:04 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-eaf5c828e8984d86d81a6eee5a482e70c553115148192fac48b0718754776f54-merged.mount: Deactivated successfully.
Feb 23 09:33:04 np0005626463.localdomain systemd[1]: be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.service: Deactivated successfully.
Feb 23 09:33:04 np0005626463.localdomain sudo[250726]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uchtgzfodhylwjctqftgbtscdyuifzgh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839184.3473322-3822-173021205140458/AnsiballZ_stat.py
Feb 23 09:33:04 np0005626463.localdomain sudo[250726]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:33:04 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:33:04.632 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:33:04 np0005626463.localdomain python3.9[250728]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:33:04 np0005626463.localdomain sudo[250726]: pam_unix(sudo:session): session closed for user root
Feb 23 09:33:05 np0005626463.localdomain sudo[250783]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ohsardswtqtahbzcvtfvhjvupchvoeyr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839184.3473322-3822-173021205140458/AnsiballZ_file.py
Feb 23 09:33:05 np0005626463.localdomain sudo[250783]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:33:05 np0005626463.localdomain python3.9[250785]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.0asdu0sn recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:33:05 np0005626463.localdomain sudo[250783]: pam_unix(sudo:session): session closed for user root
Feb 23 09:33:06 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:33:06.144 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:33:06 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29526 DF PROTO=TCP SPT=35476 DPT=9101 SEQ=1186504611 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF7C3460000000001030307) 
Feb 23 09:33:06 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully.
Feb 23 09:33:06 np0005626463.localdomain sudo[250893]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bbhvodbycttipthrzshmeojcnmzhnhcf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839185.4943533-3858-115865597506556/AnsiballZ_stat.py
Feb 23 09:33:06 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-52bb44324f3eb9002a3bf4ee7b8544bc72e25676c81bb6c59a692125c71221e1-merged.mount: Deactivated successfully.
Feb 23 09:33:06 np0005626463.localdomain sudo[250893]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:33:06 np0005626463.localdomain python3.9[250895]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:33:06 np0005626463.localdomain sudo[250893]: pam_unix(sudo:session): session closed for user root
Feb 23 09:33:06 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-52bb44324f3eb9002a3bf4ee7b8544bc72e25676c81bb6c59a692125c71221e1-merged.mount: Deactivated successfully.
Feb 23 09:33:07 np0005626463.localdomain sudo[250950]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ifihbhrzuihgqwrrtpstiqvsenwuxlez ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839185.4943533-3858-115865597506556/AnsiballZ_file.py
Feb 23 09:33:07 np0005626463.localdomain sudo[250950]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:33:07 np0005626463.localdomain python3.9[250952]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:33:07 np0005626463.localdomain sudo[250950]: pam_unix(sudo:session): session closed for user root
Feb 23 09:33:07 np0005626463.localdomain sudo[251060]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jkesctwycnmjzrmhododoplrlmqoqwlu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839187.556368-3897-114768097892953/AnsiballZ_command.py
Feb 23 09:33:07 np0005626463.localdomain sudo[251060]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:33:08 np0005626463.localdomain python3.9[251062]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 09:33:08 np0005626463.localdomain sudo[251060]: pam_unix(sudo:session): session closed for user root
Feb 23 09:33:08 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 23 09:33:08 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully.
Feb 23 09:33:08 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully.
Feb 23 09:33:09 np0005626463.localdomain sudo[251171]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lmvdztlhtuiihxgzdflkifnzdwvrmofo ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1771839188.760952-3921-90920752978164/AnsiballZ_edpm_nftables_from_files.py
Feb 23 09:33:09 np0005626463.localdomain sudo[251171]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:33:09 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37973 DF PROTO=TCP SPT=54942 DPT=9882 SEQ=2228898416 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF7CF830000000001030307) 
Feb 23 09:33:09 np0005626463.localdomain python3[251173]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Feb 23 09:33:09 np0005626463.localdomain sudo[251171]: pam_unix(sudo:session): session closed for user root
Feb 23 09:33:09 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.
Feb 23 09:33:09 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:33:09.662 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:33:09 np0005626463.localdomain systemd[1]: tmp-crun.W7nxyE.mount: Deactivated successfully.
Feb 23 09:33:09 np0005626463.localdomain podman[251191]: 2026-02-23 09:33:09.682857747 +0000 UTC m=+0.109344953 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, managed_by=edpm_ansible)
Feb 23 09:33:09 np0005626463.localdomain podman[251191]: 2026-02-23 09:33:09.693383928 +0000 UTC m=+0.119871174 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, container_name=ovn_metadata_agent, io.buildah.version=1.43.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Feb 23 09:33:09 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 23 09:33:10 np0005626463.localdomain sudo[251298]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ldcvapgkqjbdmwntawtfgjdvmgbowodo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839189.83892-3945-18710003047891/AnsiballZ_stat.py
Feb 23 09:33:10 np0005626463.localdomain sudo[251298]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:33:10 np0005626463.localdomain systemd[1]: 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully.
Feb 23 09:33:10 np0005626463.localdomain python3.9[251300]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:33:10 np0005626463.localdomain sudo[251298]: pam_unix(sudo:session): session closed for user root
Feb 23 09:33:10 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 23 09:33:10 np0005626463.localdomain sudo[251355]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dvexvjsezsbgiuevhcwtlyuahdpzzhlm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839189.83892-3945-18710003047891/AnsiballZ_file.py
Feb 23 09:33:10 np0005626463.localdomain sudo[251355]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:33:10 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 23 09:33:10 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 23 09:33:10 np0005626463.localdomain python3.9[251357]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:33:10 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 23 09:33:10 np0005626463.localdomain sudo[251355]: pam_unix(sudo:session): session closed for user root
Feb 23 09:33:11 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:33:11.192 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:33:11 np0005626463.localdomain sudo[251465]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qhkgkrwkbmheluhnsvsnpxaucqrnaypf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839191.029198-3981-38888061266288/AnsiballZ_stat.py
Feb 23 09:33:11 np0005626463.localdomain sudo[251465]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:33:11 np0005626463.localdomain python3.9[251467]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:33:11 np0005626463.localdomain sudo[251465]: pam_unix(sudo:session): session closed for user root
Feb 23 09:33:11 np0005626463.localdomain sudo[251522]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wvujgmmapsflkdjfotbxqqjlmowagvlj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839191.029198-3981-38888061266288/AnsiballZ_file.py
Feb 23 09:33:11 np0005626463.localdomain sudo[251522]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:33:12 np0005626463.localdomain python3.9[251524]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:33:12 np0005626463.localdomain sudo[251522]: pam_unix(sudo:session): session closed for user root
Feb 23 09:33:12 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37975 DF PROTO=TCP SPT=54942 DPT=9882 SEQ=2228898416 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF7DB860000000001030307) 
Feb 23 09:33:12 np0005626463.localdomain sudo[251632]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fnsbxokihevdzovqwwpwikghqqjbbmms ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839192.267134-4017-95493868439888/AnsiballZ_stat.py
Feb 23 09:33:12 np0005626463.localdomain sudo[251632]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:33:12 np0005626463.localdomain python3.9[251634]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:33:12 np0005626463.localdomain sudo[251632]: pam_unix(sudo:session): session closed for user root
Feb 23 09:33:12 np0005626463.localdomain sudo[251689]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-luxsnbxncznmujkqtahznvefzwqgicuw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839192.267134-4017-95493868439888/AnsiballZ_file.py
Feb 23 09:33:12 np0005626463.localdomain sudo[251689]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:33:13 np0005626463.localdomain python3.9[251691]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:33:13 np0005626463.localdomain sudo[251689]: pam_unix(sudo:session): session closed for user root
Feb 23 09:33:13 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-52bb44324f3eb9002a3bf4ee7b8544bc72e25676c81bb6c59a692125c71221e1-merged.mount: Deactivated successfully.
Feb 23 09:33:13 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-fda7ddd4426914a36b65b3677210da7055750d28e58e5eb1d0839c5cab6710a1-merged.mount: Deactivated successfully.
Feb 23 09:33:13 np0005626463.localdomain sudo[251799]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hvzldxgvvzkfqrdwndaocczndoftyrlc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839193.4483774-4053-60709891337311/AnsiballZ_stat.py
Feb 23 09:33:13 np0005626463.localdomain sudo[251799]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:33:13 np0005626463.localdomain python3.9[251801]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:33:13 np0005626463.localdomain sudo[251799]: pam_unix(sudo:session): session closed for user root
Feb 23 09:33:14 np0005626463.localdomain sudo[251856]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zcicrlmyzoexrrqykdgvzmnsqysgptrc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839193.4483774-4053-60709891337311/AnsiballZ_file.py
Feb 23 09:33:14 np0005626463.localdomain sudo[251856]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:33:14 np0005626463.localdomain python3.9[251858]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:33:14 np0005626463.localdomain sudo[251856]: pam_unix(sudo:session): session closed for user root
Feb 23 09:33:14 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:33:14.692 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:33:14 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.
Feb 23 09:33:14 np0005626463.localdomain sudo[251975]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kykejqpvsbdowwbdbqjfcrugtjdveukg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839194.5352535-4089-178288335353392/AnsiballZ_stat.py
Feb 23 09:33:14 np0005626463.localdomain sudo[251975]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:33:14 np0005626463.localdomain podman[251951]: 2026-02-23 09:33:14.904638526 +0000 UTC m=+0.075673222 container health_status da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=unhealthy, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 23 09:33:14 np0005626463.localdomain podman[251951]: 2026-02-23 09:33:14.912539182 +0000 UTC m=+0.083573848 container exec_died da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 23 09:33:14 np0005626463.localdomain podman[251951]: unhealthy
Feb 23 09:33:15 np0005626463.localdomain python3.9[251979]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:33:15 np0005626463.localdomain sudo[251975]: pam_unix(sudo:session): session closed for user root
Feb 23 09:33:15 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45738 DF PROTO=TCP SPT=60906 DPT=9100 SEQ=2260014654 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF7E6060000000001030307) 
Feb 23 09:33:15 np0005626463.localdomain sudo[252077]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jrdhymqtuzahyllitvaxnwvwquqtokrg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839194.5352535-4089-178288335353392/AnsiballZ_copy.py
Feb 23 09:33:15 np0005626463.localdomain sudo[252077]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:33:15 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully.
Feb 23 09:33:15 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-0e4bbbcc3a308b062cd809f5d981a575292a522b3c6697e4ac7d70789e33f207-merged.mount: Deactivated successfully.
Feb 23 09:33:15 np0005626463.localdomain python3.9[252079]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771839194.5352535-4089-178288335353392/.source.nft follow=False _original_basename=ruleset.j2 checksum=953266ca5f7d82d2777a0a437bd7feceb9259ee8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:33:15 np0005626463.localdomain sudo[252077]: pam_unix(sudo:session): session closed for user root
Feb 23 09:33:16 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-0e4bbbcc3a308b062cd809f5d981a575292a522b3c6697e4ac7d70789e33f207-merged.mount: Deactivated successfully.
Feb 23 09:33:16 np0005626463.localdomain systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Main process exited, code=exited, status=1/FAILURE
Feb 23 09:33:16 np0005626463.localdomain systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Failed with result 'exit-code'.
Feb 23 09:33:16 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:33:16.234 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:33:16 np0005626463.localdomain sudo[252187]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fbznulfqbqviiviwhjcsdnqwokmhbygu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839196.1064074-4134-138008097583835/AnsiballZ_file.py
Feb 23 09:33:16 np0005626463.localdomain sudo[252187]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:33:16 np0005626463.localdomain python3.9[252189]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:33:16 np0005626463.localdomain sudo[252187]: pam_unix(sudo:session): session closed for user root
Feb 23 09:33:17 np0005626463.localdomain sudo[252297]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-juoksizjoltqnnxkquqtcoissbpatmft ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839196.8097231-4158-241380650722915/AnsiballZ_command.py
Feb 23 09:33:17 np0005626463.localdomain sudo[252297]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:33:17 np0005626463.localdomain python3.9[252299]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 09:33:17 np0005626463.localdomain sudo[252297]: pam_unix(sudo:session): session closed for user root
Feb 23 09:33:17 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 23 09:33:17 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully.
Feb 23 09:33:17 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully.
Feb 23 09:33:18 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29528 DF PROTO=TCP SPT=35476 DPT=9101 SEQ=1186504611 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF7F4070000000001030307) 
Feb 23 09:33:18 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 23 09:33:18 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 23 09:33:19 np0005626463.localdomain sudo[252410]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xnpjpslzzuxwttxwyujwzspqmzckwzah ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839198.651641-4182-144104188225248/AnsiballZ_blockinfile.py
Feb 23 09:33:19 np0005626463.localdomain sudo[252410]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:33:19 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 23 09:33:19 np0005626463.localdomain python3.9[252412]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                                            include "/etc/nftables/edpm-chains.nft"
                                                            include "/etc/nftables/edpm-rules.nft"
                                                            include "/etc/nftables/edpm-jumps.nft"
                                                             path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:33:19 np0005626463.localdomain sudo[252410]: pam_unix(sudo:session): session closed for user root
Feb 23 09:33:19 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:33:19.736 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:33:19 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 23 09:33:19 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 23 09:33:19 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 23 09:33:20 np0005626463.localdomain sudo[252520]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zyfuqvwedgxajnsxclmagzkvtdpjiwgv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839199.573503-4209-183797761490081/AnsiballZ_command.py
Feb 23 09:33:20 np0005626463.localdomain sudo[252520]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:33:20 np0005626463.localdomain python3.9[252522]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 09:33:20 np0005626463.localdomain sudo[252520]: pam_unix(sudo:session): session closed for user root
Feb 23 09:33:21 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:33:21.270 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:33:21 np0005626463.localdomain sudo[252631]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gpvtwrznoykbyasfoglffpttqupgrwku ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839201.3018034-4233-177547259898221/AnsiballZ_stat.py
Feb 23 09:33:21 np0005626463.localdomain sudo[252631]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:33:21 np0005626463.localdomain python3.9[252633]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 23 09:33:21 np0005626463.localdomain sudo[252631]: pam_unix(sudo:session): session closed for user root
Feb 23 09:33:22 np0005626463.localdomain sudo[252743]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ugmuhmgykwlczzluiauxbysxjefchfmg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839201.9538095-4257-39036717730436/AnsiballZ_command.py
Feb 23 09:33:22 np0005626463.localdomain sudo[252743]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:33:22 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-0e4bbbcc3a308b062cd809f5d981a575292a522b3c6697e4ac7d70789e33f207-merged.mount: Deactivated successfully.
Feb 23 09:33:22 np0005626463.localdomain python3.9[252745]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 09:33:22 np0005626463.localdomain sudo[252743]: pam_unix(sudo:session): session closed for user root
Feb 23 09:33:22 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-d7608d8b6f0c641f6e65bb7bd3e1d2a7040712e7934c1516102890a576b77876-merged.mount: Deactivated successfully.
Feb 23 09:33:22 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-d7608d8b6f0c641f6e65bb7bd3e1d2a7040712e7934c1516102890a576b77876-merged.mount: Deactivated successfully.
Feb 23 09:33:22 np0005626463.localdomain sudo[252856]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dihgqubdodrsrqamjzmtkjggcnnznqaq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839202.6606019-4281-271747356330182/AnsiballZ_file.py
Feb 23 09:33:22 np0005626463.localdomain sudo[252856]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:33:23 np0005626463.localdomain python3.9[252858]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:33:23 np0005626463.localdomain sudo[252856]: pam_unix(sudo:session): session closed for user root
Feb 23 09:33:23 np0005626463.localdomain sshd[249984]: pam_unix(sshd:session): session closed for user zuul
Feb 23 09:33:23 np0005626463.localdomain systemd[1]: session-56.scope: Deactivated successfully.
Feb 23 09:33:23 np0005626463.localdomain systemd[1]: session-56.scope: Consumed 12.634s CPU time.
Feb 23 09:33:23 np0005626463.localdomain systemd-logind[759]: Session 56 logged out. Waiting for processes to exit.
Feb 23 09:33:23 np0005626463.localdomain systemd-logind[759]: Removed session 56.
Feb 23 09:33:24 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49492 DF PROTO=TCP SPT=41760 DPT=9102 SEQ=1430148475 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF809010000000001030307) 
Feb 23 09:33:24 np0005626463.localdomain sshd[252876]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:33:24 np0005626463.localdomain sshd[252876]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:33:24 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:33:24.779 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:33:24 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully.
Feb 23 09:33:24 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-52bb44324f3eb9002a3bf4ee7b8544bc72e25676c81bb6c59a692125c71221e1-merged.mount: Deactivated successfully.
Feb 23 09:33:25 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49493 DF PROTO=TCP SPT=41760 DPT=9102 SEQ=1430148475 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF80D060000000001030307) 
Feb 23 09:33:25 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-52bb44324f3eb9002a3bf4ee7b8544bc72e25676c81bb6c59a692125c71221e1-merged.mount: Deactivated successfully.
Feb 23 09:33:26 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:33:26.328 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:33:26 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 23 09:33:27 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully.
Feb 23 09:33:27 np0005626463.localdomain sshd[252878]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:33:27 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49494 DF PROTO=TCP SPT=41760 DPT=9102 SEQ=1430148475 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF815060000000001030307) 
Feb 23 09:33:27 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully.
Feb 23 09:33:27 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.
Feb 23 09:33:27 np0005626463.localdomain podman[252880]: 2026-02-23 09:33:27.917640536 +0000 UTC m=+0.090084730 container health_status 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.7, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, summary=Provides the latest release of the minimal 
Red Hat Universal Base Image 9., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, managed_by=edpm_ansible, vcs-type=git, container_name=openstack_network_exporter, release=1770267347, org.opencontainers.image.created=2026-02-05T04:57:10Z, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2026-02-05T04:57:10Z, io.buildah.version=1.33.7, config_id=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, vendor=Red Hat, Inc., io.openshift.expose-services=, name=ubi9/ubi-minimal, url=https://catalog.redhat.com/en/search?searchType=containers)
Feb 23 09:33:27 np0005626463.localdomain podman[252880]: 2026-02-23 09:33:27.932134005 +0000 UTC m=+0.104578239 container exec_died 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9/ubi-minimal, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, org.opencontainers.image.created=2026-02-05T04:57:10Z, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1770267347, version=9.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.openshift.tags=minimal rhel9, config_id=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.33.7, container_name=openstack_network_exporter, io.openshift.expose-services=, maintainer=Red Hat, Inc., build-date=2026-02-05T04:57:10Z)
Feb 23 09:33:27 np0005626463.localdomain sshd[252878]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:33:28 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 23 09:33:28 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 23 09:33:28 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 23 09:33:28 np0005626463.localdomain systemd[1]: 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.service: Deactivated successfully.
Feb 23 09:33:28 np0005626463.localdomain sshd[252900]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:33:28 np0005626463.localdomain sshd[252900]: Accepted publickey for zuul from 192.168.122.30 port 38912 ssh2: RSA SHA256:/ShS2J5Dq7o9P59e/NmgQORSAcJOBwu46Huo03HBdB4
Feb 23 09:33:28 np0005626463.localdomain systemd-logind[759]: New session 57 of user zuul.
Feb 23 09:33:28 np0005626463.localdomain systemd[1]: Started Session 57 of User zuul.
Feb 23 09:33:28 np0005626463.localdomain sshd[252900]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Feb 23 09:33:29 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 23 09:33:29 np0005626463.localdomain sudo[253011]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eosdomddxihugmmreseyxurcetyhanep ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839209.0754712-21-175698621673973/AnsiballZ_file.py
Feb 23 09:33:29 np0005626463.localdomain sudo[253011]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:33:29 np0005626463.localdomain python3.9[253013]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config/container-startup-config/neutron-sriov-agent setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 23 09:33:29 np0005626463.localdomain sudo[253011]: pam_unix(sudo:session): session closed for user root
Feb 23 09:33:29 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:33:29.782 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:33:29 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.
Feb 23 09:33:29 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 23 09:33:30 np0005626463.localdomain systemd[1]: tmp-crun.77ZSRe.mount: Deactivated successfully.
Feb 23 09:33:30 np0005626463.localdomain podman[253068]: 2026-02-23 09:33:30.023073809 +0000 UTC m=+0.095454945 container health_status bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 23 09:33:30 np0005626463.localdomain podman[253068]: 2026-02-23 09:33:30.037993667 +0000 UTC m=+0.110374833 container exec_died bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 23 09:33:30 np0005626463.localdomain sudo[253144]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eexgswdxqukvlnxczfdvypywuhsxazhy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839209.8759353-21-105397707042584/AnsiballZ_file.py
Feb 23 09:33:30 np0005626463.localdomain sudo[253144]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:33:30 np0005626463.localdomain python3.9[253146]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 23 09:33:30 np0005626463.localdomain sudo[253144]: pam_unix(sudo:session): session closed for user root
Feb 23 09:33:30 np0005626463.localdomain sudo[253254]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rarihyamnecztkxmcbmpsluhxdxcfnaw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839210.5048015-21-252911499323452/AnsiballZ_file.py
Feb 23 09:33:30 np0005626463.localdomain sudo[253254]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:33:30 np0005626463.localdomain python3.9[253256]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/neutron-sriov-agent setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 23 09:33:30 np0005626463.localdomain sudo[253254]: pam_unix(sudo:session): session closed for user root
Feb 23 09:33:31 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49495 DF PROTO=TCP SPT=41760 DPT=9102 SEQ=1430148475 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF824C60000000001030307) 
Feb 23 09:33:31 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:33:31.367 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:33:31 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-52bb44324f3eb9002a3bf4ee7b8544bc72e25676c81bb6c59a692125c71221e1-merged.mount: Deactivated successfully.
Feb 23 09:33:31 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.
Feb 23 09:33:31 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-e7e11ab6b4147a24f37e25dab2cf55bde3a4412e647a5968367e3a7c4331cac7-merged.mount: Deactivated successfully.
Feb 23 09:33:31 np0005626463.localdomain python3.9[253364]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-sriov-agent/01-neutron.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:33:31 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-e7e11ab6b4147a24f37e25dab2cf55bde3a4412e647a5968367e3a7c4331cac7-merged.mount: Deactivated successfully.
Feb 23 09:33:31 np0005626463.localdomain systemd[1]: bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.service: Deactivated successfully.
Feb 23 09:33:31 np0005626463.localdomain podman[253365]: 2026-02-23 09:33:31.950242857 +0000 UTC m=+0.245837157 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, io.buildah.version=1.43.0, config_id=ovn_controller, container_name=ovn_controller)
Feb 23 09:33:31 np0005626463.localdomain podman[253365]: 2026-02-23 09:33:31.990265798 +0000 UTC m=+0.285860078 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.43.0)
Feb 23 09:33:32 np0005626463.localdomain systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully.
Feb 23 09:33:32 np0005626463.localdomain python3.9[253475]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-sriov-agent/01-neutron.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771839211.1572425-99-148851394594912/.source.conf follow=False _original_basename=neutron.conf.j2 checksum=24e013b64eb8be4a13596c6ffccbd94df7442bd2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 23 09:33:32 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-b4f761d90eeb5a4c1ea51e856783cf8398e02a6caf306b90498250a43e5bbae1-merged.mount: Deactivated successfully.
Feb 23 09:33:32 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-e1fac4507a16e359f79966290a44e975bb0ed717e8b6cc0e34b61e8c96e0a1a3-merged.mount: Deactivated successfully.
Feb 23 09:33:33 np0005626463.localdomain python3.9[253583]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-sriov-agent/01-rootwrap.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:33:33 np0005626463.localdomain python3.9[253669]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-sriov-agent/01-rootwrap.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771839212.6122553-99-114132814644498/.source.conf follow=False _original_basename=rootwrap.conf.j2 checksum=11f2cfb4b7d97b2cef3c2c2d88089e6999cffe22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 23 09:33:34 np0005626463.localdomain python3.9[253777]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-sriov-agent/01-neutron-sriov-agent.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:33:34 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully.
Feb 23 09:33:34 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.
Feb 23 09:33:34 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-52bb44324f3eb9002a3bf4ee7b8544bc72e25676c81bb6c59a692125c71221e1-merged.mount: Deactivated successfully.
Feb 23 09:33:34 np0005626463.localdomain podman[253847]: 2026-02-23 09:33:34.431113574 +0000 UTC m=+0.082869073 container health_status be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute)
Feb 23 09:33:34 np0005626463.localdomain podman[253847]: 2026-02-23 09:33:34.441278389 +0000 UTC m=+0.093033908 container exec_died be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20260216)
Feb 23 09:33:34 np0005626463.localdomain python3.9[253871]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-sriov-agent/01-neutron-sriov-agent.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771839213.7100668-99-105472800931003/.source.conf follow=False _original_basename=neutron-sriov-agent.conf.j2 checksum=840467536a035a46ebab3aa34cac1ebe80e50e31 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 23 09:33:34 np0005626463.localdomain systemd[1]: be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.service: Deactivated successfully.
Feb 23 09:33:34 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:33:34.822 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:33:35 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-52bb44324f3eb9002a3bf4ee7b8544bc72e25676c81bb6c59a692125c71221e1-merged.mount: Deactivated successfully.
Feb 23 09:33:35 np0005626463.localdomain python3.9[253991]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-sriov-agent/10-neutron-sriov.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:33:36 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 23 09:33:36 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully.
Feb 23 09:33:36 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:33:36.409 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:33:36 np0005626463.localdomain python3.9[254077]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-sriov-agent/10-neutron-sriov.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771839215.3850489-273-115786416178742/.source.conf _original_basename=10-neutron-sriov.conf follow=False checksum=d10c6f671263070bdc94fed977552f121764373c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 23 09:33:36 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully.
Feb 23 09:33:37 np0005626463.localdomain python3.9[254185]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 23 09:33:37 np0005626463.localdomain sudo[254295]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-izpnvutxwiwkedeezrgebznfuqhfpcnc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839217.3029413-345-139222052203296/AnsiballZ_file.py
Feb 23 09:33:37 np0005626463.localdomain sudo[254295]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:33:37 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 23 09:33:37 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 23 09:33:37 np0005626463.localdomain python3.9[254297]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 23 09:33:37 np0005626463.localdomain sudo[254295]: pam_unix(sudo:session): session closed for user root
Feb 23 09:33:37 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 23 09:33:38 np0005626463.localdomain sudo[254405]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hfzmzfkxfrqgdczqhbnkjyrenheahxuj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839218.008342-369-255516309191567/AnsiballZ_stat.py
Feb 23 09:33:38 np0005626463.localdomain sudo[254405]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:33:38 np0005626463.localdomain python3.9[254407]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:33:38 np0005626463.localdomain sudo[254405]: pam_unix(sudo:session): session closed for user root
Feb 23 09:33:38 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 23 09:33:38 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 23 09:33:38 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 23 09:33:39 np0005626463.localdomain sudo[254462]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qwjpdcfsamegycvaqmgpapidduuixbtd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839218.008342-369-255516309191567/AnsiballZ_file.py
Feb 23 09:33:39 np0005626463.localdomain sudo[254462]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:33:39 np0005626463.localdomain python3.9[254464]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 23 09:33:39 np0005626463.localdomain sudo[254462]: pam_unix(sudo:session): session closed for user root
Feb 23 09:33:39 np0005626463.localdomain sudo[254572]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mdmsieqtpdgkngtdgpntwghmsjwnyqal ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839219.4128668-369-239560753155269/AnsiballZ_stat.py
Feb 23 09:33:39 np0005626463.localdomain sudo[254572]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:33:39 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49496 DF PROTO=TCP SPT=41760 DPT=9102 SEQ=1430148475 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF846060000000001030307) 
Feb 23 09:33:39 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:33:39.860 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:33:39 np0005626463.localdomain python3.9[254574]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:33:39 np0005626463.localdomain sudo[254572]: pam_unix(sudo:session): session closed for user root
Feb 23 09:33:40 np0005626463.localdomain sudo[254629]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sfxjiygbszwtwtrktvsbtliqwqcstbgh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839219.4128668-369-239560753155269/AnsiballZ_file.py
Feb 23 09:33:40 np0005626463.localdomain sudo[254629]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:33:40 np0005626463.localdomain python3.9[254631]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 23 09:33:40 np0005626463.localdomain sudo[254629]: pam_unix(sudo:session): session closed for user root
Feb 23 09:33:40 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.
Feb 23 09:33:40 np0005626463.localdomain systemd[1]: tmp-crun.4aYnew.mount: Deactivated successfully.
Feb 23 09:33:40 np0005626463.localdomain podman[254632]: 2026-02-23 09:33:40.918849797 +0000 UTC m=+0.098030878 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, container_name=ovn_metadata_agent)
Feb 23 09:33:40 np0005626463.localdomain podman[254632]: 2026-02-23 09:33:40.951330456 +0000 UTC m=+0.130511547 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, io.buildah.version=1.43.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, container_name=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 23 09:33:41 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-52bb44324f3eb9002a3bf4ee7b8544bc72e25676c81bb6c59a692125c71221e1-merged.mount: Deactivated successfully.
Feb 23 09:33:41 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-e701559fdd80af17422acb214daf2f2ee3f38cde2d9b282e59bb97f69f05cdde-merged.mount: Deactivated successfully.
Feb 23 09:33:41 np0005626463.localdomain systemd[1]: 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully.
Feb 23 09:33:41 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:33:41.452 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:33:41 np0005626463.localdomain sudo[254757]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-snwqpqjqvttgeyvcymdjimyvqjqdhtmc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839221.3974335-438-274840035913066/AnsiballZ_file.py
Feb 23 09:33:41 np0005626463.localdomain sudo[254757]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:33:41 np0005626463.localdomain python3.9[254759]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:33:41 np0005626463.localdomain sudo[254757]: pam_unix(sudo:session): session closed for user root
Feb 23 09:33:42 np0005626463.localdomain sudo[254867]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kbveripzurtbyliewvvgupmfhbonprbp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839222.0898871-462-127067948096854/AnsiballZ_stat.py
Feb 23 09:33:42 np0005626463.localdomain sudo[254867]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:33:42 np0005626463.localdomain python3.9[254869]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:33:42 np0005626463.localdomain sudo[254867]: pam_unix(sudo:session): session closed for user root
Feb 23 09:33:42 np0005626463.localdomain sudo[254924]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-seojempagynprhjloobdbpjqhvgpqmvn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839222.0898871-462-127067948096854/AnsiballZ_file.py
Feb 23 09:33:42 np0005626463.localdomain sudo[254924]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:33:42 np0005626463.localdomain python3.9[254926]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:33:43 np0005626463.localdomain sudo[254924]: pam_unix(sudo:session): session closed for user root
Feb 23 09:33:43 np0005626463.localdomain openstack_network_exporter[245358]: ERROR   09:33:43 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 23 09:33:43 np0005626463.localdomain openstack_network_exporter[245358]: 
Feb 23 09:33:43 np0005626463.localdomain openstack_network_exporter[245358]: ERROR   09:33:43 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 23 09:33:43 np0005626463.localdomain openstack_network_exporter[245358]: 
Feb 23 09:33:43 np0005626463.localdomain sudo[255038]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hwnjbvfxcsacnxttehpslmthjgighoqv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839223.2083073-498-36769149517229/AnsiballZ_stat.py
Feb 23 09:33:43 np0005626463.localdomain sudo[255038]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:33:43 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully.
Feb 23 09:33:43 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-0e4bbbcc3a308b062cd809f5d981a575292a522b3c6697e4ac7d70789e33f207-merged.mount: Deactivated successfully.
Feb 23 09:33:43 np0005626463.localdomain python3.9[255040]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:33:43 np0005626463.localdomain sudo[255038]: pam_unix(sudo:session): session closed for user root
Feb 23 09:33:43 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-0e4bbbcc3a308b062cd809f5d981a575292a522b3c6697e4ac7d70789e33f207-merged.mount: Deactivated successfully.
Feb 23 09:33:44 np0005626463.localdomain sudo[255095]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cdtcnrgnxrklpwcpmmbprbhlzindswnr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839223.2083073-498-36769149517229/AnsiballZ_file.py
Feb 23 09:33:44 np0005626463.localdomain sudo[255095]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:33:44 np0005626463.localdomain python3.9[255097]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:33:44 np0005626463.localdomain sudo[255095]: pam_unix(sudo:session): session closed for user root
Feb 23 09:33:44 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:33:44.905 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:33:44 np0005626463.localdomain sudo[255205]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bojbjmeuwtuhxozstwqqxyipcpduikvo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839224.3921359-534-262716788351616/AnsiballZ_systemd.py
Feb 23 09:33:44 np0005626463.localdomain sudo[255205]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:33:45 np0005626463.localdomain python3.9[255207]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 09:33:45 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 09:33:45 np0005626463.localdomain systemd-sysv-generator[255236]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 09:33:45 np0005626463.localdomain systemd-rc-local-generator[255230]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 09:33:45 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:33:45 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 23 09:33:45 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:33:45 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:33:45 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 09:33:45 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 23 09:33:45 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:33:45 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:33:45 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:33:45 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 23 09:33:45 np0005626463.localdomain sudo[255205]: pam_unix(sudo:session): session closed for user root
Feb 23 09:33:45 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully.
Feb 23 09:33:46 np0005626463.localdomain sudo[255352]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sygigeebcznfbmjhuiacuceytqffsqii ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839225.8840284-558-86869513990127/AnsiballZ_stat.py
Feb 23 09:33:46 np0005626463.localdomain sudo[255352]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:33:46 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.
Feb 23 09:33:46 np0005626463.localdomain podman[255354]: 2026-02-23 09:33:46.263889325 +0000 UTC m=+0.090285121 container health_status da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=unhealthy, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 23 09:33:46 np0005626463.localdomain podman[255354]: 2026-02-23 09:33:46.272783569 +0000 UTC m=+0.099179385 container exec_died da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 23 09:33:46 np0005626463.localdomain podman[255354]: unhealthy
Feb 23 09:33:46 np0005626463.localdomain python3.9[255355]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:33:46 np0005626463.localdomain sudo[255352]: pam_unix(sudo:session): session closed for user root
Feb 23 09:33:46 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:33:46.503 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:33:46 np0005626463.localdomain sudo[255433]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-edlvidfudqgdcmsziwagrznppgcnzrpt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839225.8840284-558-86869513990127/AnsiballZ_file.py
Feb 23 09:33:46 np0005626463.localdomain sudo[255433]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:33:46 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 23 09:33:46 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 23 09:33:46 np0005626463.localdomain python3.9[255435]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:33:46 np0005626463.localdomain sudo[255433]: pam_unix(sudo:session): session closed for user root
Feb 23 09:33:47 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 23 09:33:47 np0005626463.localdomain systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Main process exited, code=exited, status=1/FAILURE
Feb 23 09:33:47 np0005626463.localdomain systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Failed with result 'exit-code'.
Feb 23 09:33:47 np0005626463.localdomain sudo[255543]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fqtmbbheozxfgstbwoytfsihotvhkckp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839227.0965035-594-115948004360933/AnsiballZ_stat.py
Feb 23 09:33:47 np0005626463.localdomain sudo[255543]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:33:47 np0005626463.localdomain python3.9[255545]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:33:47 np0005626463.localdomain sudo[255543]: pam_unix(sudo:session): session closed for user root
Feb 23 09:33:47 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 23 09:33:47 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 23 09:33:47 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 23 09:33:47 np0005626463.localdomain sudo[255600]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xbwcifndnhtiulvogadpcikhezertkwa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839227.0965035-594-115948004360933/AnsiballZ_file.py
Feb 23 09:33:47 np0005626463.localdomain sudo[255600]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:33:47 np0005626463.localdomain sudo[255603]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 09:33:47 np0005626463.localdomain sudo[255603]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:33:47 np0005626463.localdomain sudo[255603]: pam_unix(sudo:session): session closed for user root
Feb 23 09:33:47 np0005626463.localdomain sudo[255621]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 23 09:33:48 np0005626463.localdomain sudo[255621]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:33:48 np0005626463.localdomain python3.9[255602]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:33:48 np0005626463.localdomain sudo[255600]: pam_unix(sudo:session): session closed for user root
Feb 23 09:33:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:33:48.533 163572 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 09:33:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:33:48.534 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 09:33:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:33:48.536 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 09:33:48 np0005626463.localdomain sudo[255756]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ocgfycmzzafbtzgssiqdqksfjkjwszhh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839228.2599256-630-221084040751697/AnsiballZ_systemd.py
Feb 23 09:33:48 np0005626463.localdomain sudo[255756]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:33:48 np0005626463.localdomain python3.9[255758]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 09:33:48 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 09:33:48 np0005626463.localdomain systemd-rc-local-generator[255781]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 09:33:48 np0005626463.localdomain systemd-sysv-generator[255786]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 09:33:48 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:33:48 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 23 09:33:49 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:33:49 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:33:49 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 09:33:49 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 23 09:33:49 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:33:49 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:33:49 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:33:49 np0005626463.localdomain systemd[1]: Starting Create netns directory...
Feb 23 09:33:49 np0005626463.localdomain systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Feb 23 09:33:49 np0005626463.localdomain systemd[1]: netns-placeholder.service: Deactivated successfully.
Feb 23 09:33:49 np0005626463.localdomain systemd[1]: Finished Create netns directory.
Feb 23 09:33:49 np0005626463.localdomain sudo[255756]: pam_unix(sudo:session): session closed for user root
Feb 23 09:33:49 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:33:49.964 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:33:50 np0005626463.localdomain sudo[255908]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yptoucbrbjpnrzxitmoakxappsmaulhj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839229.875344-660-119147184847892/AnsiballZ_file.py
Feb 23 09:33:50 np0005626463.localdomain sudo[255908]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:33:50 np0005626463.localdomain python3.9[255910]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:33:50 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-0e4bbbcc3a308b062cd809f5d981a575292a522b3c6697e4ac7d70789e33f207-merged.mount: Deactivated successfully.
Feb 23 09:33:50 np0005626463.localdomain sudo[255908]: pam_unix(sudo:session): session closed for user root
Feb 23 09:33:50 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-32124063214ed6a71bfdb162bed59d08d2309f70899d91e1af77aee73d927f16-merged.mount: Deactivated successfully.
Feb 23 09:33:50 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-32124063214ed6a71bfdb162bed59d08d2309f70899d91e1af77aee73d927f16-merged.mount: Deactivated successfully.
Feb 23 09:33:50 np0005626463.localdomain sudo[256020]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oyykchljxxvdbhzwvdvihsolqlpgeiko ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839230.5571303-684-212419491637393/AnsiballZ_file.py
Feb 23 09:33:50 np0005626463.localdomain sudo[256020]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:33:51 np0005626463.localdomain python3.9[256022]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 23 09:33:51 np0005626463.localdomain sudo[256020]: pam_unix(sudo:session): session closed for user root
Feb 23 09:33:51 np0005626463.localdomain sudo[255621]: pam_unix(sudo:session): session closed for user root
Feb 23 09:33:51 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:33:51.550 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:33:51 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 23 09:33:51 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-530e571f2fdc2c9cc9ab61d58bf266b4766d3c3aa17392b07069c5b092adeb06-merged.mount: Deactivated successfully.
Feb 23 09:33:51 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-530e571f2fdc2c9cc9ab61d58bf266b4766d3c3aa17392b07069c5b092adeb06-merged.mount: Deactivated successfully.
Feb 23 09:33:52 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 23 09:33:52 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 23 09:33:52 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 23 09:33:52 np0005626463.localdomain sudo[256147]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rbgozwjxqjrpktxrljznpkpqwkedngex ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839232.3000507-708-67095203957197/AnsiballZ_stat.py
Feb 23 09:33:52 np0005626463.localdomain sudo[256147]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:33:52 np0005626463.localdomain sudo[256150]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 09:33:52 np0005626463.localdomain sudo[256150]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:33:52 np0005626463.localdomain sudo[256150]: pam_unix(sudo:session): session closed for user root
Feb 23 09:33:52 np0005626463.localdomain python3.9[256149]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/neutron_sriov_agent.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:33:52 np0005626463.localdomain sudo[256147]: pam_unix(sudo:session): session closed for user root
Feb 23 09:33:53 np0005626463.localdomain sudo[256253]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-domkggxynzqogxcwhbifwvcfaesngorj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839232.3000507-708-67095203957197/AnsiballZ_copy.py
Feb 23 09:33:53 np0005626463.localdomain sudo[256253]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:33:53 np0005626463.localdomain python3.9[256255]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/neutron_sriov_agent.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1771839232.3000507-708-67095203957197/.source.json _original_basename=.w2yffn9v follow=False checksum=a32073fdba4733b9ffe872cfb91708eff83a585a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:33:53 np0005626463.localdomain sudo[256253]: pam_unix(sudo:session): session closed for user root
Feb 23 09:33:53 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-530e571f2fdc2c9cc9ab61d58bf266b4766d3c3aa17392b07069c5b092adeb06-merged.mount: Deactivated successfully.
Feb 23 09:33:53 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-1a309f49ba9a6af0e1193f20a6ae2dd065eaab6a23a55dd0b287fffd33cd7437-merged.mount: Deactivated successfully.
Feb 23 09:33:53 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-1a309f49ba9a6af0e1193f20a6ae2dd065eaab6a23a55dd0b287fffd33cd7437-merged.mount: Deactivated successfully.
Feb 23 09:33:54 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47049 DF PROTO=TCP SPT=40542 DPT=9102 SEQ=3702321504 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF87E300000000001030307) 
Feb 23 09:33:54 np0005626463.localdomain python3.9[256363]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/neutron_sriov_agent state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:33:55 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:33:55.010 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:33:55 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47050 DF PROTO=TCP SPT=40542 DPT=9102 SEQ=3702321504 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF882470000000001030307) 
Feb 23 09:33:55 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-eae537b18cb4af6ef1d611e84802ac12d948a1ed622870af6f76704805834c9a-merged.mount: Deactivated successfully.
Feb 23 09:33:55 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-40d13af751dd0e47fc8bb889a91a6d655bc2617bd5ab127ac97d8b2c392f6c58-merged.mount: Deactivated successfully.
Feb 23 09:33:55 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-40d13af751dd0e47fc8bb889a91a6d655bc2617bd5ab127ac97d8b2c392f6c58-merged.mount: Deactivated successfully.
Feb 23 09:33:56 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49497 DF PROTO=TCP SPT=41760 DPT=9102 SEQ=1430148475 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF886070000000001030307) 
Feb 23 09:33:56 np0005626463.localdomain sudo[256665]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mncmjpuulndisgukgaiivadyetqvodhc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839236.1145868-828-118176673392589/AnsiballZ_container_config_data.py
Feb 23 09:33:56 np0005626463.localdomain sudo[256665]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:33:56 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:33:56.596 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:33:56 np0005626463.localdomain python3.9[256667]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/neutron_sriov_agent config_pattern=*.json debug=False
Feb 23 09:33:56 np0005626463.localdomain sudo[256665]: pam_unix(sudo:session): session closed for user root
Feb 23 09:33:57 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:33:57.174 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:33:57 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:33:57.175 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:33:57 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:33:57.175 231725 DEBUG nova.compute.manager [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 23 09:33:57 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47051 DF PROTO=TCP SPT=40542 DPT=9102 SEQ=3702321504 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF88A460000000001030307) 
Feb 23 09:33:57 np0005626463.localdomain sudo[256775]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hboupmwbosvpgavvetwvsfikxgvhzlzy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839237.0851681-861-201410199905153/AnsiballZ_container_config_hash.py
Feb 23 09:33:57 np0005626463.localdomain sudo[256775]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:33:57 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 23 09:33:57 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:33:57.540 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:33:57 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:33:57.540 231725 DEBUG nova.compute.manager [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 23 09:33:57 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:33:57.541 231725 DEBUG nova.compute.manager [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 23 09:33:57 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-eae537b18cb4af6ef1d611e84802ac12d948a1ed622870af6f76704805834c9a-merged.mount: Deactivated successfully.
Feb 23 09:33:57 np0005626463.localdomain python3.9[256777]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Feb 23 09:33:57 np0005626463.localdomain sudo[256775]: pam_unix(sudo:session): session closed for user root
Feb 23 09:33:57 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-eae537b18cb4af6ef1d611e84802ac12d948a1ed622870af6f76704805834c9a-merged.mount: Deactivated successfully.
Feb 23 09:33:58 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11517 DF PROTO=TCP SPT=48682 DPT=9102 SEQ=41584801 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF88E060000000001030307) 
Feb 23 09:33:58 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:33:58.323 231725 DEBUG oslo_concurrency.lockutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Acquiring lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 23 09:33:58 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:33:58.324 231725 DEBUG oslo_concurrency.lockutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Acquired lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 23 09:33:58 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:33:58.324 231725 DEBUG nova.network.neutron [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 23 09:33:58 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:33:58.325 231725 DEBUG nova.objects.instance [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c2a7d92b-952f-46a7-8a6a-3322a48fcf4b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 23 09:33:58 np0005626463.localdomain sudo[256885]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qlkfwpcfwcoeqsaqhkbzfezisvwhnufd ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1771839238.0446644-891-55387178964654/AnsiballZ_edpm_container_manage.py
Feb 23 09:33:58 np0005626463.localdomain sudo[256885]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:33:58 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.
Feb 23 09:33:58 np0005626463.localdomain systemd[1]: tmp-crun.yOUHvk.mount: Deactivated successfully.
Feb 23 09:33:58 np0005626463.localdomain podman[256888]: 2026-02-23 09:33:58.692807751 +0000 UTC m=+0.108716020 container health_status 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, vcs-type=git, io.openshift.tags=minimal rhel9, build-date=2026-02-05T04:57:10Z, release=1770267347, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, managed_by=edpm_ansible, distribution-scope=public, maintainer=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9/ubi-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2026-02-05T04:57:10Z, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.7, io.openshift.expose-services=, config_id=openstack_network_exporter, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.component=ubi9-minimal-container)
Feb 23 09:33:58 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:33:58.730 231725 DEBUG nova.network.neutron [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Updating instance_info_cache with network_info: [{"id": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "address": "fa:16:3e:a0:9d:00", "network": {"id": "9da5b53d-3184-450f-9a5b-bdba1a6c9f6d", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "37b8098efb0d4ecc90b451a2db0e966f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa27e5011-20", "ovs_interfaceid": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 23 09:33:58 np0005626463.localdomain podman[256888]: 2026-02-23 09:33:58.733478043 +0000 UTC m=+0.149386312 container exec_died 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1770267347, distribution-scope=public, build-date=2026-02-05T04:57:10Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, vendor=Red Hat, Inc., name=ubi9/ubi-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.7, config_id=openstack_network_exporter, io.openshift.expose-services=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, org.opencontainers.image.created=2026-02-05T04:57:10Z, io.buildah.version=1.33.7, managed_by=edpm_ansible, vcs-type=git, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, maintainer=Red Hat, Inc.)
Feb 23 09:33:58 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:33:58.748 231725 DEBUG oslo_concurrency.lockutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Releasing lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 23 09:33:58 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:33:58.748 231725 DEBUG nova.compute.manager [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 23 09:33:58 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:33:58.748 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:33:58 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:33:58.748 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:33:58 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:33:58.749 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:33:58 np0005626463.localdomain python3[256887]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/neutron_sriov_agent config_id=neutron_sriov_agent config_overrides={} config_patterns=*.json containers=['neutron_sriov_agent'] log_base_path=/var/log/containers/stdouts debug=False
Feb 23 09:33:58 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 23 09:33:58 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 23 09:33:59 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 23 09:33:59 np0005626463.localdomain systemd[1]: 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.service: Deactivated successfully.
Feb 23 09:33:59 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:33:59.540 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:33:59 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:33:59.541 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:33:59 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:33:59.563 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:33:59 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 23 09:33:59 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 23 09:34:00 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:34:00.013 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:34:00 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 23 09:34:01 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47052 DF PROTO=TCP SPT=40542 DPT=9102 SEQ=3702321504 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF89A070000000001030307) 
Feb 23 09:34:01 np0005626463.localdomain sshd[256931]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:34:01 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:34:01.600 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:34:01 np0005626463.localdomain sshd[256931]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:34:02 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-40d13af751dd0e47fc8bb889a91a6d655bc2617bd5ab127ac97d8b2c392f6c58-merged.mount: Deactivated successfully.
Feb 23 09:34:02 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.
Feb 23 09:34:02 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.
Feb 23 09:34:02 np0005626463.localdomain podman[256934]: 2026-02-23 09:34:02.201601651 +0000 UTC m=+0.106128987 container health_status bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Feb 23 09:34:02 np0005626463.localdomain podman[256934]: 2026-02-23 09:34:02.211207758 +0000 UTC m=+0.115735114 container exec_died bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Feb 23 09:34:02 np0005626463.localdomain podman[256933]: 2026-02-23 09:34:02.2975156 +0000 UTC m=+0.204204356 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.vendor=CentOS)
Feb 23 09:34:02 np0005626463.localdomain podman[256933]: 2026-02-23 09:34:02.354786432 +0000 UTC m=+0.261475258 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Feb 23 09:34:02 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:34:02.540 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:34:02 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:34:02.555 231725 DEBUG oslo_concurrency.lockutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 09:34:02 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:34:02.556 231725 DEBUG oslo_concurrency.lockutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 09:34:02 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:34:02.556 231725 DEBUG oslo_concurrency.lockutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 09:34:02 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:34:02.556 231725 DEBUG nova.compute.resource_tracker [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Auditing locally available compute resources for np0005626463.localdomain (node: np0005626463.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 23 09:34:02 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:34:02.557 231725 DEBUG oslo_concurrency.processutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 09:34:02 np0005626463.localdomain systemd[1]: bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.service: Deactivated successfully.
Feb 23 09:34:02 np0005626463.localdomain podman[256975]: 
Feb 23 09:34:02 np0005626463.localdomain systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully.
Feb 23 09:34:02 np0005626463.localdomain podman[256975]: 2026-02-23 09:34:02.951086463 +0000 UTC m=+0.722801850 container create 8f2ea6310cd353f33c1478e7503e6f9c52ea7620eec6612361cd1c39bc0392ae (image=quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified, name=neutron_sriov_agent, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_id=neutron_sriov_agent, container_name=neutron_sriov_agent, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.43.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-b5f612145a1fc71d65d885476e8573b292a256521aff746056c2bc56d98839aa'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'neutron', 'volumes': ['/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/openstack/neutron-sriov-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Feb 23 09:34:02 np0005626463.localdomain podman[256975]: 2026-02-23 09:34:02.922494548 +0000 UTC m=+0.694209955 image pull  quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified
Feb 23 09:34:03 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:34:03.009 231725 DEBUG oslo_concurrency.processutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.452s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 09:34:03 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-e812fa34defdc78ec5fb2b77011468829f7a2881cc57f804b1a422dc9e19278a-merged.mount: Deactivated successfully.
Feb 23 09:34:03 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:34:03.070 231725 DEBUG nova.virt.libvirt.driver [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 23 09:34:03 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:34:03.071 231725 DEBUG nova.virt.libvirt.driver [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 23 09:34:03 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:34:03.248 231725 WARNING nova.virt.libvirt.driver [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 23 09:34:03 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:34:03.249 231725 DEBUG nova.compute.resource_tracker [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Hypervisor/Node resource view: name=np0005626463.localdomain free_ram=12370MB free_disk=41.83688735961914GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 23 09:34:03 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:34:03.249 231725 DEBUG oslo_concurrency.lockutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 09:34:03 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:34:03.250 231725 DEBUG oslo_concurrency.lockutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 09:34:03 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:34:03.323 231725 DEBUG nova.compute.resource_tracker [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Instance c2a7d92b-952f-46a7-8a6a-3322a48fcf4b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 23 09:34:03 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:34:03.323 231725 DEBUG nova.compute.resource_tracker [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 23 09:34:03 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:34:03.323 231725 DEBUG nova.compute.resource_tracker [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Final resource view: name=np0005626463.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 23 09:34:03 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:34:03.358 231725 DEBUG oslo_concurrency.processutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 09:34:03 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 23 09:34:03 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 23 09:34:03 np0005626463.localdomain sshd[257042]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:34:03 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 23 09:34:03 np0005626463.localdomain python3[256887]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name neutron_sriov_agent --conmon-pidfile /run/neutron_sriov_agent.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-b5f612145a1fc71d65d885476e8573b292a256521aff746056c2bc56d98839aa --label config_id=neutron_sriov_agent --label container_name=neutron_sriov_agent --label managed_by=edpm_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-b5f612145a1fc71d65d885476e8573b292a256521aff746056c2bc56d98839aa'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'neutron', 'volumes': ['/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/openstack/neutron-sriov-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']} --log-driver journald --log-level info --network host --privileged=True --user neutron --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /var/lib/openstack/neutron-sriov-agent:/etc/neutron.conf.d:z --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified
Feb 23 09:34:03 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:34:03.809 231725 DEBUG oslo_concurrency.processutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.452s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 09:34:03 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:34:03.814 231725 DEBUG nova.compute.provider_tree [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Inventory has not changed in ProviderTree for provider: be63d86c-a403-4ec9-a515-07ea2962cb4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 23 09:34:03 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:34:03.836 231725 DEBUG nova.scheduler.client.report [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Inventory has not changed for provider be63d86c-a403-4ec9-a515-07ea2962cb4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 23 09:34:03 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:34:03.839 231725 DEBUG nova.compute.resource_tracker [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Compute_service record updated for np0005626463.localdomain:np0005626463.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 23 09:34:03 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:34:03.839 231725 DEBUG oslo_concurrency.lockutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.589s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 09:34:03 np0005626463.localdomain sshd[257042]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:34:04 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-e812fa34defdc78ec5fb2b77011468829f7a2881cc57f804b1a422dc9e19278a-merged.mount: Deactivated successfully.
Feb 23 09:34:04 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.
Feb 23 09:34:04 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-72bff2249ea9ee03825bd3e8fa07150769abcfe162fde9078852b16a351c2e6d-merged.mount: Deactivated successfully.
Feb 23 09:34:04 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-72bff2249ea9ee03825bd3e8fa07150769abcfe162fde9078852b16a351c2e6d-merged.mount: Deactivated successfully.
Feb 23 09:34:04 np0005626463.localdomain podman[257059]: 2026-02-23 09:34:04.842829928 +0000 UTC m=+0.104991001 container health_status be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, 
tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ceilometer_agent_compute)
Feb 23 09:34:04 np0005626463.localdomain podman[257059]: 2026-02-23 09:34:04.879475891 +0000 UTC m=+0.141636944 container exec_died be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20260216, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Feb 23 09:34:05 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:34:05.043 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:34:05 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-e0c81d46f937f1f84faf68fb71f862e4bd868921a7d16384d44790308d98719f-merged.mount: Deactivated successfully.
Feb 23 09:34:05 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-8194fb55613f783d5a49e05926ed565eee5321a07b6e20f485946b5c4f31cab4-merged.mount: Deactivated successfully.
Feb 23 09:34:05 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-8194fb55613f783d5a49e05926ed565eee5321a07b6e20f485946b5c4f31cab4-merged.mount: Deactivated successfully.
Feb 23 09:34:05 np0005626463.localdomain systemd[1]: be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.service: Deactivated successfully.
Feb 23 09:34:05 np0005626463.localdomain sudo[256885]: pam_unix(sudo:session): session closed for user root
Feb 23 09:34:06 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:34:06.634 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:34:07 np0005626463.localdomain sudo[257196]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nbdandvmsyorxspkssnuuyidwolfnngq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839246.7654698-915-145642927860738/AnsiballZ_stat.py
Feb 23 09:34:07 np0005626463.localdomain sudo[257196]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:34:07 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-8194fb55613f783d5a49e05926ed565eee5321a07b6e20f485946b5c4f31cab4-merged.mount: Deactivated successfully.
Feb 23 09:34:07 np0005626463.localdomain podman[242954]: @ - - [23/Feb/2026:09:29:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=true&sync=false HTTP/1.1" 200 143381 "" "Go-http-client/1.1"
Feb 23 09:34:07 np0005626463.localdomain podman_exporter[242941]: ts=2026-02-23T09:34:07.254Z caller=exporter.go:96 level=info msg="Listening on" address=:9882
Feb 23 09:34:07 np0005626463.localdomain podman_exporter[242941]: ts=2026-02-23T09:34:07.255Z caller=tls_config.go:313 level=info msg="Listening on" address=[::]:9882
Feb 23 09:34:07 np0005626463.localdomain podman_exporter[242941]: ts=2026-02-23T09:34:07.255Z caller=tls_config.go:316 level=info msg="TLS is disabled." http2=false address=[::]:9882
Feb 23 09:34:07 np0005626463.localdomain python3.9[257198]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 23 09:34:07 np0005626463.localdomain sudo[257196]: pam_unix(sudo:session): session closed for user root
Feb 23 09:34:08 np0005626463.localdomain sudo[257309]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dzwdmcidpbcczhyzehkndvpwttklhjsx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839248.1148958-942-259908730405758/AnsiballZ_file.py
Feb 23 09:34:08 np0005626463.localdomain sudo[257309]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:34:08 np0005626463.localdomain python3.9[257311]: ansible-file Invoked with path=/etc/systemd/system/edpm_neutron_sriov_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:34:08 np0005626463.localdomain sudo[257309]: pam_unix(sudo:session): session closed for user root
Feb 23 09:34:08 np0005626463.localdomain sudo[257364]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uwzoohcdyrvcqfqcbaymenswkzfmsljz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839248.1148958-942-259908730405758/AnsiballZ_stat.py
Feb 23 09:34:08 np0005626463.localdomain sudo[257364]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:34:09 np0005626463.localdomain python3.9[257366]: ansible-stat Invoked with path=/etc/systemd/system/edpm_neutron_sriov_agent_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 23 09:34:09 np0005626463.localdomain sudo[257364]: pam_unix(sudo:session): session closed for user root
Feb 23 09:34:09 np0005626463.localdomain podman[242954]: time="2026-02-23T09:34:09Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 23 09:34:09 np0005626463.localdomain podman[242954]: @ - - [23/Feb/2026:09:34:09 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 147334 "" "Go-http-client/1.1"
Feb 23 09:34:09 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47053 DF PROTO=TCP SPT=40542 DPT=9102 SEQ=3702321504 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF8BA060000000001030307) 
Feb 23 09:34:09 np0005626463.localdomain podman[242954]: @ - - [23/Feb/2026:09:34:09 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16227 "" "Go-http-client/1.1"
Feb 23 09:34:09 np0005626463.localdomain sudo[257476]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hqjnbvtehtdwblqrrwaafnlephsozcmo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839249.0685031-942-199366317834695/AnsiballZ_copy.py
Feb 23 09:34:09 np0005626463.localdomain sudo[257476]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:34:09 np0005626463.localdomain python3.9[257478]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1771839249.0685031-942-199366317834695/source dest=/etc/systemd/system/edpm_neutron_sriov_agent.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:34:09 np0005626463.localdomain sudo[257476]: pam_unix(sudo:session): session closed for user root
Feb 23 09:34:09 np0005626463.localdomain sudo[257531]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rmiybrnhzshwscibboialdtxxpilzawv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839249.0685031-942-199366317834695/AnsiballZ_systemd.py
Feb 23 09:34:09 np0005626463.localdomain sudo[257531]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:34:10 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:34:10.073 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:34:10 np0005626463.localdomain python3.9[257533]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 23 09:34:10 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 09:34:10 np0005626463.localdomain systemd-rc-local-generator[257560]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 09:34:10 np0005626463.localdomain systemd-sysv-generator[257564]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 09:34:10 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:34:10 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 23 09:34:10 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:34:10 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:34:10 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 09:34:10 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 23 09:34:10 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:34:10 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:34:10 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:34:10 np0005626463.localdomain sudo[257531]: pam_unix(sudo:session): session closed for user root
Feb 23 09:34:10 np0005626463.localdomain sudo[257622]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wtqjcwbioqysahzhzrcibggwltmyxpbv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839249.0685031-942-199366317834695/AnsiballZ_systemd.py
Feb 23 09:34:10 np0005626463.localdomain sudo[257622]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:34:11 np0005626463.localdomain python3.9[257624]: ansible-systemd Invoked with state=restarted name=edpm_neutron_sriov_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 09:34:11 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 09:34:11 np0005626463.localdomain systemd-sysv-generator[257656]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 09:34:11 np0005626463.localdomain systemd-rc-local-generator[257653]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 09:34:11 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:34:11 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 23 09:34:11 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:34:11 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:34:11 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 09:34:11 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 23 09:34:11 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:34:11 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:34:11 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:34:11 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:34:11.668 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:34:11 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.
Feb 23 09:34:11 np0005626463.localdomain systemd[1]: Starting neutron_sriov_agent container...
Feb 23 09:34:11 np0005626463.localdomain podman[257663]: 2026-02-23 09:34:11.798702621 +0000 UTC m=+0.084607479 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent)
Feb 23 09:34:11 np0005626463.localdomain podman[257663]: 2026-02-23 09:34:11.807225934 +0000 UTC m=+0.093130792 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Feb 23 09:34:11 np0005626463.localdomain systemd[1]: 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully.
Feb 23 09:34:11 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 09:34:11 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/51924fe3f0717b37533a02cc6477f7f6558c10a39f81b159d32c6a1d48606037/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff)
Feb 23 09:34:11 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/51924fe3f0717b37533a02cc6477f7f6558c10a39f81b159d32c6a1d48606037/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 23 09:34:11 np0005626463.localdomain podman[257670]: 2026-02-23 09:34:11.901315364 +0000 UTC m=+0.176227060 container init 8f2ea6310cd353f33c1478e7503e6f9c52ea7620eec6612361cd1c39bc0392ae (image=quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified, name=neutron_sriov_agent, config_id=neutron_sriov_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, container_name=neutron_sriov_agent, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-b5f612145a1fc71d65d885476e8573b292a256521aff746056c2bc56d98839aa'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'neutron', 'volumes': ['/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/openstack/neutron-sriov-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.build-date=20260216)
Feb 23 09:34:11 np0005626463.localdomain podman[257670]: 2026-02-23 09:34:11.911114048 +0000 UTC m=+0.186025744 container start 8f2ea6310cd353f33c1478e7503e6f9c52ea7620eec6612361cd1c39bc0392ae (image=quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified, name=neutron_sriov_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-b5f612145a1fc71d65d885476e8573b292a256521aff746056c2bc56d98839aa'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'neutron', 'volumes': ['/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/openstack/neutron-sriov-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, io.buildah.version=1.43.0, container_name=neutron_sriov_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=neutron_sriov_agent, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS)
Feb 23 09:34:11 np0005626463.localdomain podman[257670]: neutron_sriov_agent
Feb 23 09:34:11 np0005626463.localdomain neutron_sriov_agent[257694]: + sudo -E kolla_set_configs
Feb 23 09:34:11 np0005626463.localdomain systemd[1]: Started neutron_sriov_agent container.
Feb 23 09:34:11 np0005626463.localdomain sudo[257622]: pam_unix(sudo:session): session closed for user root
Feb 23 09:34:11 np0005626463.localdomain neutron_sriov_agent[257694]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Feb 23 09:34:11 np0005626463.localdomain neutron_sriov_agent[257694]: INFO:__main__:Validating config file
Feb 23 09:34:11 np0005626463.localdomain neutron_sriov_agent[257694]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Feb 23 09:34:11 np0005626463.localdomain neutron_sriov_agent[257694]: INFO:__main__:Copying service configuration files
Feb 23 09:34:11 np0005626463.localdomain neutron_sriov_agent[257694]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf
Feb 23 09:34:11 np0005626463.localdomain neutron_sriov_agent[257694]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf
Feb 23 09:34:11 np0005626463.localdomain neutron_sriov_agent[257694]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf
Feb 23 09:34:11 np0005626463.localdomain neutron_sriov_agent[257694]: INFO:__main__:Writing out command to execute
Feb 23 09:34:11 np0005626463.localdomain neutron_sriov_agent[257694]: INFO:__main__:Setting permission for /var/lib/neutron
Feb 23 09:34:11 np0005626463.localdomain neutron_sriov_agent[257694]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts
Feb 23 09:34:11 np0005626463.localdomain neutron_sriov_agent[257694]: INFO:__main__:Setting permission for /var/lib/neutron/.cache
Feb 23 09:34:11 np0005626463.localdomain neutron_sriov_agent[257694]: INFO:__main__:Setting permission for /var/lib/neutron/external
Feb 23 09:34:11 np0005626463.localdomain neutron_sriov_agent[257694]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy
Feb 23 09:34:11 np0005626463.localdomain neutron_sriov_agent[257694]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper
Feb 23 09:34:11 np0005626463.localdomain neutron_sriov_agent[257694]: INFO:__main__:Setting permission for /var/lib/neutron/metadata_proxy
Feb 23 09:34:11 np0005626463.localdomain neutron_sriov_agent[257694]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill
Feb 23 09:34:11 np0005626463.localdomain neutron_sriov_agent[257694]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints
Feb 23 09:34:11 np0005626463.localdomain neutron_sriov_agent[257694]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints/b9146dd2a0dc3e0bc3fee7bb1b53fa22a55af280b3a177d7a47b63f92e7ebd29
Feb 23 09:34:11 np0005626463.localdomain neutron_sriov_agent[257694]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids
Feb 23 09:34:11 np0005626463.localdomain neutron_sriov_agent[257694]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids/9da5b53d-3184-450f-9a5b-bdba1a6c9f6d.pid.haproxy
Feb 23 09:34:11 np0005626463.localdomain neutron_sriov_agent[257694]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy/9da5b53d-3184-450f-9a5b-bdba1a6c9f6d.conf
Feb 23 09:34:12 np0005626463.localdomain neutron_sriov_agent[257694]: ++ cat /run_command
Feb 23 09:34:12 np0005626463.localdomain neutron_sriov_agent[257694]: + CMD=/usr/bin/neutron-sriov-nic-agent
Feb 23 09:34:12 np0005626463.localdomain neutron_sriov_agent[257694]: + ARGS=
Feb 23 09:34:12 np0005626463.localdomain neutron_sriov_agent[257694]: + sudo kolla_copy_cacerts
Feb 23 09:34:12 np0005626463.localdomain neutron_sriov_agent[257694]: + [[ ! -n '' ]]
Feb 23 09:34:12 np0005626463.localdomain neutron_sriov_agent[257694]: + . kolla_extend_start
Feb 23 09:34:12 np0005626463.localdomain neutron_sriov_agent[257694]: Running command: '/usr/bin/neutron-sriov-nic-agent'
Feb 23 09:34:12 np0005626463.localdomain neutron_sriov_agent[257694]: + echo 'Running command: '\''/usr/bin/neutron-sriov-nic-agent'\'''
Feb 23 09:34:12 np0005626463.localdomain neutron_sriov_agent[257694]: + umask 0022
Feb 23 09:34:12 np0005626463.localdomain neutron_sriov_agent[257694]: + exec /usr/bin/neutron-sriov-nic-agent
Feb 23 09:34:13 np0005626463.localdomain openstack_network_exporter[245358]: ERROR   09:34:13 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 23 09:34:13 np0005626463.localdomain openstack_network_exporter[245358]: 
Feb 23 09:34:13 np0005626463.localdomain openstack_network_exporter[245358]: ERROR   09:34:13 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 23 09:34:13 np0005626463.localdomain openstack_network_exporter[245358]: 
Feb 23 09:34:13 np0005626463.localdomain neutron_sriov_agent[257694]: 2026-02-23 09:34:13.518 2 INFO neutron.common.config [-] Logging enabled!
Feb 23 09:34:13 np0005626463.localdomain neutron_sriov_agent[257694]: 2026-02-23 09:34:13.519 2 INFO neutron.common.config [-] /usr/bin/neutron-sriov-nic-agent version 22.2.2.dev44
Feb 23 09:34:13 np0005626463.localdomain neutron_sriov_agent[257694]: 2026-02-23 09:34:13.519 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [-] Physical Devices mappings: {'dummy_sriov_net': ['dummy-dev']}
Feb 23 09:34:13 np0005626463.localdomain neutron_sriov_agent[257694]: 2026-02-23 09:34:13.519 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [-] Exclude Devices: {}
Feb 23 09:34:13 np0005626463.localdomain neutron_sriov_agent[257694]: 2026-02-23 09:34:13.519 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [-] Resource provider bandwidths: {}
Feb 23 09:34:13 np0005626463.localdomain neutron_sriov_agent[257694]: 2026-02-23 09:34:13.519 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [-] Resource provider inventory defaults: {'allocation_ratio': 1.0, 'min_unit': 1, 'step_size': 1, 'reserved': 0}
Feb 23 09:34:13 np0005626463.localdomain neutron_sriov_agent[257694]: 2026-02-23 09:34:13.519 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [-] Resource provider hypervisors: {'dummy-dev': 'np0005626463.localdomain'}
Feb 23 09:34:13 np0005626463.localdomain neutron_sriov_agent[257694]: 2026-02-23 09:34:13.520 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [None req-eaa4e402-ca61-4ef3-bc4c-a08552cd5290 - - - - - -] RPC agent_id: nic-switch-agent.np0005626463.localdomain
Feb 23 09:34:13 np0005626463.localdomain neutron_sriov_agent[257694]: 2026-02-23 09:34:13.524 2 INFO neutron.agent.agent_extensions_manager [None req-eaa4e402-ca61-4ef3-bc4c-a08552cd5290 - - - - - -] Loaded agent extensions: ['qos']
Feb 23 09:34:13 np0005626463.localdomain neutron_sriov_agent[257694]: 2026-02-23 09:34:13.524 2 INFO neutron.agent.agent_extensions_manager [None req-eaa4e402-ca61-4ef3-bc4c-a08552cd5290 - - - - - -] Initializing agent extension 'qos'
Feb 23 09:34:13 np0005626463.localdomain neutron_sriov_agent[257694]: 2026-02-23 09:34:13.939 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [None req-eaa4e402-ca61-4ef3-bc4c-a08552cd5290 - - - - - -] Agent initialized successfully, now running... 
Feb 23 09:34:13 np0005626463.localdomain neutron_sriov_agent[257694]: 2026-02-23 09:34:13.939 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [None req-eaa4e402-ca61-4ef3-bc4c-a08552cd5290 - - - - - -] SRIOV NIC Agent RPC Daemon Started!
Feb 23 09:34:13 np0005626463.localdomain neutron_sriov_agent[257694]: 2026-02-23 09:34:13.940 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [None req-eaa4e402-ca61-4ef3-bc4c-a08552cd5290 - - - - - -] Agent out of sync with plugin!
Feb 23 09:34:15 np0005626463.localdomain python3.9[257817]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Feb 23 09:34:15 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:34:15.095 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:34:16 np0005626463.localdomain sudo[257925]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zffybhrtvpawjqpxlvuuzeptsjcmwmkz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839255.7592146-1077-153905536064629/AnsiballZ_stat.py
Feb 23 09:34:16 np0005626463.localdomain sudo[257925]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:34:16 np0005626463.localdomain python3.9[257927]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:34:16 np0005626463.localdomain sudo[257925]: pam_unix(sudo:session): session closed for user root
Feb 23 09:34:16 np0005626463.localdomain sudo[258015]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qjwrkbuqjkelbkhyrzfrulajbcdedevm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839255.7592146-1077-153905536064629/AnsiballZ_copy.py
Feb 23 09:34:16 np0005626463.localdomain sudo[258015]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:34:16 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:34:16.711 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:34:16 np0005626463.localdomain python3.9[258017]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771839255.7592146-1077-153905536064629/.source.yaml _original_basename=.ft4p9v8o follow=False checksum=b7da0e0729778df3fa2e9c064ac77c610bc22800 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:34:16 np0005626463.localdomain sudo[258015]: pam_unix(sudo:session): session closed for user root
Feb 23 09:34:17 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.
Feb 23 09:34:17 np0005626463.localdomain podman[258035]: 2026-02-23 09:34:17.919585805 +0000 UTC m=+0.097307955 container health_status da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=unhealthy, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 23 09:34:17 np0005626463.localdomain podman[258035]: 2026-02-23 09:34:17.928121828 +0000 UTC m=+0.105843928 container exec_died da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 23 09:34:17 np0005626463.localdomain systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Deactivated successfully.
Feb 23 09:34:18 np0005626463.localdomain sudo[258148]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dfxqecgttthkhrreembywdvckneumzwc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839258.3810909-1122-82046731664623/AnsiballZ_systemd.py
Feb 23 09:34:18 np0005626463.localdomain sudo[258148]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:34:18 np0005626463.localdomain python3.9[258150]: ansible-ansible.builtin.systemd Invoked with name=edpm_neutron_sriov_agent.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 23 09:34:19 np0005626463.localdomain systemd[1]: Stopping neutron_sriov_agent container...
Feb 23 09:34:19 np0005626463.localdomain systemd[1]: libpod-8f2ea6310cd353f33c1478e7503e6f9c52ea7620eec6612361cd1c39bc0392ae.scope: Deactivated successfully.
Feb 23 09:34:19 np0005626463.localdomain podman[258154]: 2026-02-23 09:34:19.106942249 +0000 UTC m=+0.074936258 container died 8f2ea6310cd353f33c1478e7503e6f9c52ea7620eec6612361cd1c39bc0392ae (image=quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified, name=neutron_sriov_agent, config_id=neutron_sriov_agent, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-b5f612145a1fc71d65d885476e8573b292a256521aff746056c2bc56d98839aa'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'neutron', 'volumes': ['/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/openstack/neutron-sriov-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, maintainer=OpenStack Kubernetes Operator team, container_name=neutron_sriov_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Feb 23 09:34:19 np0005626463.localdomain systemd[1]: libpod-8f2ea6310cd353f33c1478e7503e6f9c52ea7620eec6612361cd1c39bc0392ae.scope: Consumed 1.708s CPU time.
Feb 23 09:34:19 np0005626463.localdomain systemd[1]: tmp-crun.2E4u2c.mount: Deactivated successfully.
Feb 23 09:34:19 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8f2ea6310cd353f33c1478e7503e6f9c52ea7620eec6612361cd1c39bc0392ae-userdata-shm.mount: Deactivated successfully.
Feb 23 09:34:19 np0005626463.localdomain podman[258154]: 2026-02-23 09:34:19.172994103 +0000 UTC m=+0.140988102 container cleanup 8f2ea6310cd353f33c1478e7503e6f9c52ea7620eec6612361cd1c39bc0392ae (image=quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified, name=neutron_sriov_agent, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, container_name=neutron_sriov_agent, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-b5f612145a1fc71d65d885476e8573b292a256521aff746056c2bc56d98839aa'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'neutron', 'volumes': ['/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/openstack/neutron-sriov-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=neutron_sriov_agent, managed_by=edpm_ansible)
Feb 23 09:34:19 np0005626463.localdomain podman[258154]: neutron_sriov_agent
Feb 23 09:34:19 np0005626463.localdomain podman[258180]: 2026-02-23 09:34:19.26259455 +0000 UTC m=+0.053898465 container cleanup 8f2ea6310cd353f33c1478e7503e6f9c52ea7620eec6612361cd1c39bc0392ae (image=quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified, name=neutron_sriov_agent, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-b5f612145a1fc71d65d885476e8573b292a256521aff746056c2bc56d98839aa'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'neutron', 'volumes': ['/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/openstack/neutron-sriov-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=neutron_sriov_agent, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20260216, config_id=neutron_sriov_agent, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Feb 23 09:34:19 np0005626463.localdomain podman[258180]: neutron_sriov_agent
Feb 23 09:34:19 np0005626463.localdomain systemd[1]: edpm_neutron_sriov_agent.service: Deactivated successfully.
Feb 23 09:34:19 np0005626463.localdomain systemd[1]: Stopped neutron_sriov_agent container.
Feb 23 09:34:19 np0005626463.localdomain systemd[1]: Starting neutron_sriov_agent container...
Feb 23 09:34:19 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 09:34:19 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/51924fe3f0717b37533a02cc6477f7f6558c10a39f81b159d32c6a1d48606037/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff)
Feb 23 09:34:19 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/51924fe3f0717b37533a02cc6477f7f6558c10a39f81b159d32c6a1d48606037/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 23 09:34:19 np0005626463.localdomain podman[258192]: 2026-02-23 09:34:19.397432815 +0000 UTC m=+0.106237161 container init 8f2ea6310cd353f33c1478e7503e6f9c52ea7620eec6612361cd1c39bc0392ae (image=quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified, name=neutron_sriov_agent, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=neutron_sriov_agent, config_id=neutron_sriov_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-b5f612145a1fc71d65d885476e8573b292a256521aff746056c2bc56d98839aa'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'neutron', 'volumes': ['/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/openstack/neutron-sriov-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']})
Feb 23 09:34:19 np0005626463.localdomain podman[258192]: 2026-02-23 09:34:19.406234456 +0000 UTC m=+0.115038802 container start 8f2ea6310cd353f33c1478e7503e6f9c52ea7620eec6612361cd1c39bc0392ae (image=quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified, name=neutron_sriov_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.43.0, container_name=neutron_sriov_agent, config_id=neutron_sriov_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-b5f612145a1fc71d65d885476e8573b292a256521aff746056c2bc56d98839aa'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'neutron', 'volumes': ['/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/openstack/neutron-sriov-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, org.label-schema.build-date=20260216, tcib_managed=true)
Feb 23 09:34:19 np0005626463.localdomain podman[258192]: neutron_sriov_agent
Feb 23 09:34:19 np0005626463.localdomain neutron_sriov_agent[258207]: + sudo -E kolla_set_configs
Feb 23 09:34:19 np0005626463.localdomain systemd[1]: Started neutron_sriov_agent container.
Feb 23 09:34:19 np0005626463.localdomain sudo[258148]: pam_unix(sudo:session): session closed for user root
Feb 23 09:34:19 np0005626463.localdomain neutron_sriov_agent[258207]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Feb 23 09:34:19 np0005626463.localdomain neutron_sriov_agent[258207]: INFO:__main__:Validating config file
Feb 23 09:34:19 np0005626463.localdomain neutron_sriov_agent[258207]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Feb 23 09:34:19 np0005626463.localdomain neutron_sriov_agent[258207]: INFO:__main__:Copying service configuration files
Feb 23 09:34:19 np0005626463.localdomain neutron_sriov_agent[258207]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf
Feb 23 09:34:19 np0005626463.localdomain neutron_sriov_agent[258207]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf
Feb 23 09:34:19 np0005626463.localdomain neutron_sriov_agent[258207]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf
Feb 23 09:34:19 np0005626463.localdomain neutron_sriov_agent[258207]: INFO:__main__:Writing out command to execute
Feb 23 09:34:19 np0005626463.localdomain neutron_sriov_agent[258207]: INFO:__main__:Setting permission for /var/lib/neutron
Feb 23 09:34:19 np0005626463.localdomain neutron_sriov_agent[258207]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts
Feb 23 09:34:19 np0005626463.localdomain neutron_sriov_agent[258207]: INFO:__main__:Setting permission for /var/lib/neutron/.cache
Feb 23 09:34:19 np0005626463.localdomain neutron_sriov_agent[258207]: INFO:__main__:Setting permission for /var/lib/neutron/external
Feb 23 09:34:19 np0005626463.localdomain neutron_sriov_agent[258207]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy
Feb 23 09:34:19 np0005626463.localdomain neutron_sriov_agent[258207]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper
Feb 23 09:34:19 np0005626463.localdomain neutron_sriov_agent[258207]: INFO:__main__:Setting permission for /var/lib/neutron/metadata_proxy
Feb 23 09:34:19 np0005626463.localdomain neutron_sriov_agent[258207]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill
Feb 23 09:34:19 np0005626463.localdomain neutron_sriov_agent[258207]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints
Feb 23 09:34:19 np0005626463.localdomain neutron_sriov_agent[258207]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints/b9146dd2a0dc3e0bc3fee7bb1b53fa22a55af280b3a177d7a47b63f92e7ebd29
Feb 23 09:34:19 np0005626463.localdomain neutron_sriov_agent[258207]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints/c4d69ddbf6f4a149b7e6d31d28f2dc1fe1c08d98a601f027e6d63209aefe8011
Feb 23 09:34:19 np0005626463.localdomain neutron_sriov_agent[258207]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids
Feb 23 09:34:19 np0005626463.localdomain neutron_sriov_agent[258207]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids/9da5b53d-3184-450f-9a5b-bdba1a6c9f6d.pid.haproxy
Feb 23 09:34:19 np0005626463.localdomain neutron_sriov_agent[258207]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy/9da5b53d-3184-450f-9a5b-bdba1a6c9f6d.conf
Feb 23 09:34:19 np0005626463.localdomain neutron_sriov_agent[258207]: ++ cat /run_command
Feb 23 09:34:19 np0005626463.localdomain neutron_sriov_agent[258207]: + CMD=/usr/bin/neutron-sriov-nic-agent
Feb 23 09:34:19 np0005626463.localdomain neutron_sriov_agent[258207]: + ARGS=
Feb 23 09:34:19 np0005626463.localdomain neutron_sriov_agent[258207]: + sudo kolla_copy_cacerts
Feb 23 09:34:19 np0005626463.localdomain neutron_sriov_agent[258207]: + [[ ! -n '' ]]
Feb 23 09:34:19 np0005626463.localdomain neutron_sriov_agent[258207]: + . kolla_extend_start
Feb 23 09:34:19 np0005626463.localdomain neutron_sriov_agent[258207]: Running command: '/usr/bin/neutron-sriov-nic-agent'
Feb 23 09:34:19 np0005626463.localdomain neutron_sriov_agent[258207]: + echo 'Running command: '\''/usr/bin/neutron-sriov-nic-agent'\'''
Feb 23 09:34:19 np0005626463.localdomain neutron_sriov_agent[258207]: + umask 0022
Feb 23 09:34:19 np0005626463.localdomain neutron_sriov_agent[258207]: + exec /usr/bin/neutron-sriov-nic-agent
Feb 23 09:34:20 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:34:20.128 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:34:20 np0005626463.localdomain sshd[252900]: pam_unix(sshd:session): session closed for user zuul
Feb 23 09:34:20 np0005626463.localdomain systemd-logind[759]: Session 57 logged out. Waiting for processes to exit.
Feb 23 09:34:20 np0005626463.localdomain systemd[1]: session-57.scope: Deactivated successfully.
Feb 23 09:34:20 np0005626463.localdomain systemd[1]: session-57.scope: Consumed 22.603s CPU time.
Feb 23 09:34:20 np0005626463.localdomain systemd-logind[759]: Removed session 57.
Feb 23 09:34:21 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:34:21.011 2 INFO neutron.common.config [-] Logging enabled!
Feb 23 09:34:21 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:34:21.011 2 INFO neutron.common.config [-] /usr/bin/neutron-sriov-nic-agent version 22.2.2.dev44
Feb 23 09:34:21 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:34:21.012 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [-] Physical Devices mappings: {'dummy_sriov_net': ['dummy-dev']}
Feb 23 09:34:21 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:34:21.012 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [-] Exclude Devices: {}
Feb 23 09:34:21 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:34:21.012 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [-] Resource provider bandwidths: {}
Feb 23 09:34:21 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:34:21.012 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [-] Resource provider inventory defaults: {'allocation_ratio': 1.0, 'min_unit': 1, 'step_size': 1, 'reserved': 0}
Feb 23 09:34:21 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:34:21.012 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [-] Resource provider hypervisors: {'dummy-dev': 'np0005626463.localdomain'}
Feb 23 09:34:21 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:34:21.013 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [None req-3d259088-55cf-41f4-aa43-82356d20e4c0 - - - - - -] RPC agent_id: nic-switch-agent.np0005626463.localdomain
Feb 23 09:34:21 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:34:21.017 2 INFO neutron.agent.agent_extensions_manager [None req-3d259088-55cf-41f4-aa43-82356d20e4c0 - - - - - -] Loaded agent extensions: ['qos']
Feb 23 09:34:21 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:34:21.017 2 INFO neutron.agent.agent_extensions_manager [None req-3d259088-55cf-41f4-aa43-82356d20e4c0 - - - - - -] Initializing agent extension 'qos'
Feb 23 09:34:21 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:34:21.143 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [None req-3d259088-55cf-41f4-aa43-82356d20e4c0 - - - - - -] Agent initialized successfully, now running... 
Feb 23 09:34:21 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:34:21.144 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [None req-3d259088-55cf-41f4-aa43-82356d20e4c0 - - - - - -] SRIOV NIC Agent RPC Daemon Started!
Feb 23 09:34:21 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:34:21.144 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [None req-3d259088-55cf-41f4-aa43-82356d20e4c0 - - - - - -] Agent out of sync with plugin!
Feb 23 09:34:21 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:34:21.746 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:34:24 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11435 DF PROTO=TCP SPT=54726 DPT=9102 SEQ=3837831284 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF8F3600000000001030307) 
Feb 23 09:34:25 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11436 DF PROTO=TCP SPT=54726 DPT=9102 SEQ=3837831284 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF8F7860000000001030307) 
Feb 23 09:34:25 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:34:25.159 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:34:25 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47054 DF PROTO=TCP SPT=40542 DPT=9102 SEQ=3702321504 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF8FA060000000001030307) 
Feb 23 09:34:26 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:34:26.785 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:34:26 np0005626463.localdomain sshd[258240]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:34:26 np0005626463.localdomain sshd[258240]: Accepted publickey for zuul from 192.168.122.30 port 53908 ssh2: RSA SHA256:/ShS2J5Dq7o9P59e/NmgQORSAcJOBwu46Huo03HBdB4
Feb 23 09:34:27 np0005626463.localdomain systemd-logind[759]: New session 58 of user zuul.
Feb 23 09:34:27 np0005626463.localdomain systemd[1]: Started Session 58 of User zuul.
Feb 23 09:34:27 np0005626463.localdomain sshd[258240]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Feb 23 09:34:27 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11437 DF PROTO=TCP SPT=54726 DPT=9102 SEQ=3837831284 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF8FF860000000001030307) 
Feb 23 09:34:28 np0005626463.localdomain python3.9[258351]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 23 09:34:28 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49498 DF PROTO=TCP SPT=41760 DPT=9102 SEQ=1430148475 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF904070000000001030307) 
Feb 23 09:34:29 np0005626463.localdomain sudo[258463]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qdqaapanqmakggtmfrphiyvgpkubdbuo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839268.8459153-60-210843627002732/AnsiballZ_setup.py
Feb 23 09:34:29 np0005626463.localdomain sudo[258463]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:34:29 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.
Feb 23 09:34:29 np0005626463.localdomain podman[258466]: 2026-02-23 09:34:29.228305706 +0000 UTC m=+0.085836037 container health_status 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-02-05T04:57:10Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, build-date=2026-02-05T04:57:10Z, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, managed_by=edpm_ansible, distribution-scope=public, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., release=1770267347, io.buildah.version=1.33.7, version=9.7)
Feb 23 09:34:29 np0005626463.localdomain podman[258466]: 2026-02-23 09:34:29.245236108 +0000 UTC m=+0.102766459 container exec_died 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, name=ubi9/ubi-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-02-05T04:57:10Z, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, vcs-type=git, release=1770267347, build-date=2026-02-05T04:57:10Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., container_name=openstack_network_exporter, version=9.7, com.redhat.component=ubi9-minimal-container, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, config_id=openstack_network_exporter, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Feb 23 09:34:29 np0005626463.localdomain systemd[1]: 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.service: Deactivated successfully.
Feb 23 09:34:29 np0005626463.localdomain python3.9[258465]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 23 09:34:29 np0005626463.localdomain sudo[258463]: pam_unix(sudo:session): session closed for user root
Feb 23 09:34:30 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:34:30.162 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:34:30 np0005626463.localdomain sudo[258545]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-joqknbvfwptcstlrwivobhbozxteyouv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839268.8459153-60-210843627002732/AnsiballZ_dnf.py
Feb 23 09:34:30 np0005626463.localdomain sudo[258545]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:34:30 np0005626463.localdomain python3.9[258547]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch3.3'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 23 09:34:31 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11438 DF PROTO=TCP SPT=54726 DPT=9102 SEQ=3837831284 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF90F460000000001030307) 
Feb 23 09:34:31 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:34:31.826 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:34:33 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.
Feb 23 09:34:33 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.
Feb 23 09:34:33 np0005626463.localdomain systemd[1]: tmp-crun.fwAAHu.mount: Deactivated successfully.
Feb 23 09:34:33 np0005626463.localdomain podman[258550]: 2026-02-23 09:34:33.922921291 +0000 UTC m=+0.094563787 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20260216, tcib_managed=true, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb)
Feb 23 09:34:33 np0005626463.localdomain podman[258551]: 2026-02-23 09:34:33.986281889 +0000 UTC m=+0.157802761 container health_status bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Feb 23 09:34:34 np0005626463.localdomain podman[258551]: 2026-02-23 09:34:33.999536133 +0000 UTC m=+0.171057015 container exec_died bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Feb 23 09:34:34 np0005626463.localdomain systemd[1]: bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.service: Deactivated successfully.
Feb 23 09:34:34 np0005626463.localdomain podman[258550]: 2026-02-23 09:34:34.048378516 +0000 UTC m=+0.220020942 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, config_id=ovn_controller, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, managed_by=edpm_ansible, org.label-schema.build-date=20260216, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Feb 23 09:34:34 np0005626463.localdomain systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully.
Feb 23 09:34:34 np0005626463.localdomain sudo[258545]: pam_unix(sudo:session): session closed for user root
Feb 23 09:34:34 np0005626463.localdomain sudo[258704]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vyrvoarbxgdchhkvgmmcwtuiikqdrcss ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839274.270253-96-82178019592404/AnsiballZ_systemd.py
Feb 23 09:34:34 np0005626463.localdomain sudo[258704]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:34:35 np0005626463.localdomain python3.9[258706]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Feb 23 09:34:35 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:34:35.194 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:34:35 np0005626463.localdomain sudo[258704]: pam_unix(sudo:session): session closed for user root
Feb 23 09:34:36 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.
Feb 23 09:34:36 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:34:36.871 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:34:36 np0005626463.localdomain podman[258786]: 2026-02-23 09:34:36.933852738 +0000 UTC m=+0.107906606 container health_status be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, io.buildah.version=1.43.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute)
Feb 23 09:34:36 np0005626463.localdomain sudo[258829]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mfxypeqwrqtigqgorbnqwurmlmvjzplr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839276.530838-123-101547068006507/AnsiballZ_file.py
Feb 23 09:34:36 np0005626463.localdomain podman[258786]: 2026-02-23 09:34:36.94922639 +0000 UTC m=+0.123280268 container exec_died be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, managed_by=edpm_ansible, io.buildah.version=1.43.0, container_name=ceilometer_agent_compute, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Feb 23 09:34:36 np0005626463.localdomain sudo[258829]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:34:36 np0005626463.localdomain systemd[1]: be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.service: Deactivated successfully.
Feb 23 09:34:37 np0005626463.localdomain python3.9[258838]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/edpm-config/container-startup-config setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 23 09:34:37 np0005626463.localdomain sudo[258829]: pam_unix(sudo:session): session closed for user root
Feb 23 09:34:37 np0005626463.localdomain sudo[258946]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bemmsklqudjqrfumxwjardvepayeozox ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839277.26996-123-237244886117095/AnsiballZ_file.py
Feb 23 09:34:37 np0005626463.localdomain sudo[258946]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:34:37 np0005626463.localdomain python3.9[258948]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 23 09:34:37 np0005626463.localdomain sudo[258946]: pam_unix(sudo:session): session closed for user root
Feb 23 09:34:38 np0005626463.localdomain sshd[259024]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:34:38 np0005626463.localdomain sudo[259058]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fcqfkxjbggbzshilgmweimeatbbpphua ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839277.892524-123-257195668532798/AnsiballZ_file.py
Feb 23 09:34:38 np0005626463.localdomain sudo[259058]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:34:38 np0005626463.localdomain python3.9[259060]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/neutron-dhcp-agent setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 23 09:34:38 np0005626463.localdomain sudo[259058]: pam_unix(sudo:session): session closed for user root
Feb 23 09:34:38 np0005626463.localdomain sshd[259024]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:34:38 np0005626463.localdomain sudo[259168]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bgddcpahpfhxgvposwmtfdrevwrmbzve ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839278.4993494-123-251607678088662/AnsiballZ_file.py
Feb 23 09:34:38 np0005626463.localdomain sudo[259168]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:34:38 np0005626463.localdomain python3.9[259170]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 23 09:34:38 np0005626463.localdomain sudo[259168]: pam_unix(sudo:session): session closed for user root
Feb 23 09:34:39 np0005626463.localdomain sudo[259278]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aqguhawyvlejfmsndirgqkchnjtqepqv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839279.0664408-123-203525927524036/AnsiballZ_file.py
Feb 23 09:34:39 np0005626463.localdomain sudo[259278]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:34:39 np0005626463.localdomain podman[242954]: time="2026-02-23T09:34:39Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 23 09:34:39 np0005626463.localdomain podman[242954]: @ - - [23/Feb/2026:09:34:39 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 147333 "" "Go-http-client/1.1"
Feb 23 09:34:39 np0005626463.localdomain podman[242954]: @ - - [23/Feb/2026:09:34:39 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16352 "" "Go-http-client/1.1"
Feb 23 09:34:39 np0005626463.localdomain python3.9[259280]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/kill_scripts setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 23 09:34:39 np0005626463.localdomain sudo[259278]: pam_unix(sudo:session): session closed for user root
Feb 23 09:34:39 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11439 DF PROTO=TCP SPT=54726 DPT=9102 SEQ=3837831284 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF930060000000001030307) 
Feb 23 09:34:39 np0005626463.localdomain sudo[259388]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tooshtibabpkdefebffumtqnmdjjkgxj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839279.7034466-123-106318200912708/AnsiballZ_file.py
Feb 23 09:34:39 np0005626463.localdomain sudo[259388]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:34:40 np0005626463.localdomain python3.9[259390]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/ns-metadata-proxy setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 23 09:34:40 np0005626463.localdomain sudo[259388]: pam_unix(sudo:session): session closed for user root
Feb 23 09:34:40 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:34:40.228 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:34:40 np0005626463.localdomain sudo[259498]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eoltitxilajmxlqwlmsffmllzwgodwve ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839280.34817-123-241791863654007/AnsiballZ_file.py
Feb 23 09:34:40 np0005626463.localdomain sudo[259498]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:34:40 np0005626463.localdomain python3.9[259500]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/external/pids setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 23 09:34:40 np0005626463.localdomain sudo[259498]: pam_unix(sudo:session): session closed for user root
Feb 23 09:34:41 np0005626463.localdomain sudo[259608]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zvxzpxzosjnfnzhgtqfnyknqbrkaarug ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839281.0231736-273-176270633770242/AnsiballZ_stat.py
Feb 23 09:34:41 np0005626463.localdomain sudo[259608]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:34:41 np0005626463.localdomain python3.9[259610]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/container-startup-config/neutron_dhcp_agent.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:34:41 np0005626463.localdomain sudo[259608]: pam_unix(sudo:session): session closed for user root
Feb 23 09:34:41 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:34:41.906 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:34:42 np0005626463.localdomain sudo[259696]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rdjjimdllpruuxlxepbxrkdrqnzisphq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839281.0231736-273-176270633770242/AnsiballZ_copy.py
Feb 23 09:34:42 np0005626463.localdomain sudo[259696]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:34:42 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.
Feb 23 09:34:42 np0005626463.localdomain systemd[1]: tmp-crun.0KUEqe.mount: Deactivated successfully.
Feb 23 09:34:42 np0005626463.localdomain podman[259699]: 2026-02-23 09:34:42.292829225 +0000 UTC m=+0.086283602 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_managed=true, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, container_name=ovn_metadata_agent, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Feb 23 09:34:42 np0005626463.localdomain podman[259699]: 2026-02-23 09:34:42.327238201 +0000 UTC m=+0.120692609 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.43.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Feb 23 09:34:42 np0005626463.localdomain systemd[1]: 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully.
Feb 23 09:34:42 np0005626463.localdomain python3.9[259698]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/container-startup-config/neutron_dhcp_agent.yaml mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771839281.0231736-273-176270633770242/.source.yaml follow=False _original_basename=neutron_dhcp_agent.yaml.j2 checksum=472c5e922ae22c8bdcaef73d1ca73ce5597b440e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 23 09:34:42 np0005626463.localdomain sudo[259696]: pam_unix(sudo:session): session closed for user root
Feb 23 09:34:43 np0005626463.localdomain python3.9[259824]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-dhcp-agent/01-neutron.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:34:43 np0005626463.localdomain openstack_network_exporter[245358]: ERROR   09:34:43 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 23 09:34:43 np0005626463.localdomain openstack_network_exporter[245358]: 
Feb 23 09:34:43 np0005626463.localdomain openstack_network_exporter[245358]: ERROR   09:34:43 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 23 09:34:43 np0005626463.localdomain openstack_network_exporter[245358]: 
Feb 23 09:34:43 np0005626463.localdomain python3.9[259910]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-dhcp-agent/01-neutron.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771839282.5792007-318-270059151978062/.source.conf follow=False _original_basename=neutron.conf.j2 checksum=24e013b64eb8be4a13596c6ffccbd94df7442bd2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 23 09:34:43 np0005626463.localdomain sshd[260020]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:34:44 np0005626463.localdomain python3.9[260019]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-dhcp-agent/01-rootwrap.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:34:44 np0005626463.localdomain sshd[260020]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:34:44 np0005626463.localdomain python3.9[260107]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-dhcp-agent/01-rootwrap.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771839283.673694-318-82219066607471/.source.conf follow=False _original_basename=rootwrap.conf.j2 checksum=11f2cfb4b7d97b2cef3c2c2d88089e6999cffe22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 23 09:34:45 np0005626463.localdomain python3.9[260215]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-dhcp-agent/01-neutron-dhcp-agent.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:34:45 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:34:45.276 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:34:45 np0005626463.localdomain python3.9[260301]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-dhcp-agent/01-neutron-dhcp-agent.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771839284.675606-318-142287225081878/.source.conf follow=False _original_basename=neutron-dhcp-agent.conf.j2 checksum=cb6fb0641ad99e101e98bdc096471f5e2f31a05c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 23 09:34:46 np0005626463.localdomain python3.9[260409]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-dhcp-agent/10-neutron-dhcp.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:34:46 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:34:46.937 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:34:47 np0005626463.localdomain python3.9[260495]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-dhcp-agent/10-neutron-dhcp.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771839286.3943536-492-61306009228428/.source.conf _original_basename=10-neutron-dhcp.conf follow=False checksum=d10c6f671263070bdc94fed977552f121764373c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 23 09:34:47 np0005626463.localdomain python3.9[260603]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/dhcp_agent_haproxy_wrapper follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:34:48 np0005626463.localdomain python3.9[260689]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/dhcp_agent_haproxy_wrapper mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771839287.510439-537-72688536006185/.source follow=False _original_basename=haproxy.j2 checksum=eddfecb822bb60e7241db0fd719c7552d2d25452 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 23 09:34:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:34:48.534 163572 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 09:34:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:34:48.535 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 09:34:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:34:48.536 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 09:34:48 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.
Feb 23 09:34:48 np0005626463.localdomain podman[260778]: 2026-02-23 09:34:48.90964953 +0000 UTC m=+0.080933657 container health_status da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 23 09:34:48 np0005626463.localdomain podman[260778]: 2026-02-23 09:34:48.917913184 +0000 UTC m=+0.089197331 container exec_died da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 23 09:34:48 np0005626463.localdomain systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Deactivated successfully.
Feb 23 09:34:49 np0005626463.localdomain python3.9[260808]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/dhcp_agent_dnsmasq_wrapper follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:34:49 np0005626463.localdomain python3.9[260906]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/dhcp_agent_dnsmasq_wrapper mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771839288.6223347-537-565126377649/.source follow=False _original_basename=dnsmasq.j2 checksum=a6b8b2fb47e7419d250eaee9e3565b13fff8f42e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 23 09:34:50 np0005626463.localdomain python3.9[261014]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/haproxy-kill follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:34:50 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:34:50.315 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:34:50 np0005626463.localdomain python3.9[261069]: ansible-ansible.legacy.file Invoked with mode=0755 setype=container_file_t dest=/var/lib/neutron/kill_scripts/haproxy-kill _original_basename=kill-script.j2 recurse=False state=file path=/var/lib/neutron/kill_scripts/haproxy-kill force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 23 09:34:51 np0005626463.localdomain python3.9[261177]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/dnsmasq-kill follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:34:51 np0005626463.localdomain python3.9[261263]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/kill_scripts/dnsmasq-kill mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771839290.8308282-624-65563057521069/.source follow=False _original_basename=kill-script.j2 checksum=2dfb5489f491f61b95691c3bf95fa1fe48ff3700 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 23 09:34:51 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:34:51.976 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:34:52 np0005626463.localdomain sudo[261372]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 09:34:52 np0005626463.localdomain sudo[261372]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:34:52 np0005626463.localdomain sudo[261372]: pam_unix(sudo:session): session closed for user root
Feb 23 09:34:52 np0005626463.localdomain python3.9[261371]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 23 09:34:53 np0005626463.localdomain sudo[261390]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Feb 23 09:34:53 np0005626463.localdomain sudo[261390]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:34:53 np0005626463.localdomain sudo[261556]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-llyehbtlakicaudoaoeevmdnswgtadhj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839293.2811635-729-138803506461649/AnsiballZ_file.py
Feb 23 09:34:53 np0005626463.localdomain sudo[261556]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:34:53 np0005626463.localdomain python3.9[261562]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 23 09:34:53 np0005626463.localdomain sudo[261556]: pam_unix(sudo:session): session closed for user root
Feb 23 09:34:53 np0005626463.localdomain podman[261591]: 2026-02-23 09:34:53.781230774 +0000 UTC m=+0.085955562 container exec fdf07215f0388d0ebc44f1f3744080ba594441e647c300d0dade62ff5beba234 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-crash-np0005626463, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, io.openshift.expose-services=, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, GIT_BRANCH=main, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7, GIT_CLEAN=True, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.component=rhceph-container, ceph=True, release=1770267347, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2026-02-09T10:25:24Z, io.buildah.version=1.42.2, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., architecture=x86_64)
Feb 23 09:34:53 np0005626463.localdomain podman[261591]: 2026-02-23 09:34:53.897157624 +0000 UTC m=+0.201882352 container exec_died fdf07215f0388d0ebc44f1f3744080ba594441e647c300d0dade62ff5beba234 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-crash-np0005626463, org.opencontainers.image.created=2026-02-09T10:25:24Z, build-date=2026-02-09T10:25:24Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, name=rhceph, io.openshift.expose-services=, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vendor=Red Hat, Inc., org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, version=7, vcs-type=git, io.openshift.tags=rhceph ceph, architecture=x86_64, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.42.2, com.redhat.component=rhceph-container, GIT_BRANCH=main, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, release=1770267347, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Feb 23 09:34:54 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1607 DF PROTO=TCP SPT=45600 DPT=9102 SEQ=1071701713 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF968900000000001030307) 
Feb 23 09:34:54 np0005626463.localdomain sudo[261390]: pam_unix(sudo:session): session closed for user root
Feb 23 09:34:54 np0005626463.localdomain sudo[261711]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 09:34:54 np0005626463.localdomain sudo[261711]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:34:54 np0005626463.localdomain sudo[261711]: pam_unix(sudo:session): session closed for user root
Feb 23 09:34:54 np0005626463.localdomain sudo[261745]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 23 09:34:54 np0005626463.localdomain sudo[261745]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:34:54 np0005626463.localdomain sudo[261799]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-unqmanouaetqbmiurovechoxvdbhghdz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839294.1399086-753-177962544129192/AnsiballZ_stat.py
Feb 23 09:34:54 np0005626463.localdomain sudo[261799]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:34:54 np0005626463.localdomain sshd[261802]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:34:54 np0005626463.localdomain python3.9[261801]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:34:54 np0005626463.localdomain sudo[261799]: pam_unix(sudo:session): session closed for user root
Feb 23 09:34:54 np0005626463.localdomain sudo[261875]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jksgpsxnmnksxdcczthyrpdvxxoahdpi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839294.1399086-753-177962544129192/AnsiballZ_file.py
Feb 23 09:34:54 np0005626463.localdomain sudo[261875]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:34:54 np0005626463.localdomain sudo[261745]: pam_unix(sudo:session): session closed for user root
Feb 23 09:34:55 np0005626463.localdomain python3.9[261877]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 23 09:34:55 np0005626463.localdomain sudo[261875]: pam_unix(sudo:session): session closed for user root
Feb 23 09:34:55 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1608 DF PROTO=TCP SPT=45600 DPT=9102 SEQ=1071701713 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF96C860000000001030307) 
Feb 23 09:34:55 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:34:55.361 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:34:55 np0005626463.localdomain sudo[262011]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ifjqiwhbsfoieofrvjqkcndbvpxrrqfd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839295.152661-753-255265410866044/AnsiballZ_stat.py
Feb 23 09:34:55 np0005626463.localdomain sudo[262011]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:34:55 np0005626463.localdomain sudo[261990]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 09:34:55 np0005626463.localdomain sudo[261990]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:34:55 np0005626463.localdomain sudo[261990]: pam_unix(sudo:session): session closed for user root
Feb 23 09:34:55 np0005626463.localdomain sshd[261802]: Connection closed by authenticating user root 185.156.73.233 port 41376 [preauth]
Feb 23 09:34:55 np0005626463.localdomain python3.9[262019]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:34:55 np0005626463.localdomain sudo[262011]: pam_unix(sudo:session): session closed for user root
Feb 23 09:34:55 np0005626463.localdomain sudo[262075]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qtnhwdfieivlxmrmqvujcyrpwtvbpuyz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839295.152661-753-255265410866044/AnsiballZ_file.py
Feb 23 09:34:55 np0005626463.localdomain sudo[262075]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:34:56 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11440 DF PROTO=TCP SPT=54726 DPT=9102 SEQ=3837831284 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF970060000000001030307) 
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.130 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'name': 'test', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000003', 'OS-EXT-SRV-ATTR:host': 'np0005626463.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '37b8098efb0d4ecc90b451a2db0e966f', 'user_id': 'cb6895487918456aa599ca2f76872d00', 'hostId': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.131 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.131 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.150 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/cpu volume: 55800000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.153 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'adc72d6a-d082-41ee-a29b-2423f69e5a9e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 55800000000, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'timestamp': '2026-02-23T09:34:56.132006', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': 'e9f77934-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10736.340275174, 'message_signature': '46c3c9c904b885d8401956e1a210f6b16501c6c05b7e8d6a9f58953d5827b628'}]}, 'timestamp': '2026-02-23 09:34:56.151659', '_unique_id': 'fa9020d37890426db489d6d4a1fc1931'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.153 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.153 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.153 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.153 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.153 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.153 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.153 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.153 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.153 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.153 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.153 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.153 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.153 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.153 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.153 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.153 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.153 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.153 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.153 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.153 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.153 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.153 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.153 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.153 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.153 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.153 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.153 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.153 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.153 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.153 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.153 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.154 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.156 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.158 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1c8491be-13c9-4312-9787-aac2c47f8203', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:34:56.154704', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': 'e9f85c8c-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10736.344198894, 'message_signature': '91a00fe089596d72b867cd20a1d20b97d4c614ddf705adae92364e28c1fe08af'}]}, 'timestamp': '2026-02-23 09:34:56.157455', '_unique_id': 'ea83552615cc4ec986599488f5837f41'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.158 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.158 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.158 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.158 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.158 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.158 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.158 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.158 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.158 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.158 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.158 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.158 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.158 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.158 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.158 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.158 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.158 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.158 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.158 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.158 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.158 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.158 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.158 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.158 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.158 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.158 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.158 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.158 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.158 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.158 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.158 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.159 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.159 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.161 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '52a8fafe-df1e-45c2-8ce1-fafc1ee93355', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:34:56.159794', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': 'e9f8cd2a-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10736.344198894, 'message_signature': 'dcde5105fe79e4a6b3e1eaf5c5352c4b400cf3a2da9026eccd550339061a6e28'}]}, 'timestamp': '2026-02-23 09:34:56.160320', '_unique_id': '345e726edcd546488f54236a880aadbd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.161 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.161 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.161 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.161 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.161 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.161 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.161 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.161 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.161 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.161 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.161 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.161 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.161 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.161 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.161 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.161 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.161 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.161 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.161 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.161 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.161 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.161 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.161 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.161 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.161 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.161 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.161 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.161 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.161 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.161 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.161 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.162 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.162 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.164 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7b06f8c7-c1e0-4e2f-8ed9-dad6d67629c9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:34:56.162551', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': 'e9f9377e-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10736.344198894, 'message_signature': 'c8e4bfa84e008e7e9530c9429af1d47a8e54b50c4604a6a89463b0de3ef48c18'}]}, 'timestamp': '2026-02-23 09:34:56.163071', '_unique_id': 'f952036d854647e8b347a62d135b460e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.164 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.164 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.164 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.164 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.164 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.164 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.164 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.164 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.164 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.164 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.164 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.164 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.164 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.164 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.164 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.164 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.164 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.164 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.164 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.164 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.164 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.164 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.164 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.164 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.164 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.164 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.164 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.164 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.164 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.164 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.164 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.165 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Feb 23 09:34:56 np0005626463.localdomain python3.9[262077]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.197 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.requests volume: 1064 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.198 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.requests volume: 222 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.200 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9aea7f6e-f5f0-4637-ba6c-27e7f40e07f3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1064, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:34:56.165399', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'e9fe9750-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10736.354886762, 'message_signature': '3c96611165d55c822a12bc9d5003ec80689bf5bd55ddbd7f376fab9e7d1f404f'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 222, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:34:56.165399', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'e9feabbe-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10736.354886762, 'message_signature': '416e9daaae04fb16c0451852af84685f0672d41e4c28f5f9275bcde8c6640e22'}]}, 'timestamp': '2026-02-23 09:34:56.198777', '_unique_id': 'b200d0f07d2a413e8e22c190a45b6b8b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.200 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.200 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.200 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.200 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.200 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.200 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.200 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.200 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.200 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.200 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.200 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.200 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.200 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.200 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.200 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.200 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.200 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.200 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.200 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.200 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.200 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.200 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.200 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.200 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.200 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.200 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.200 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.200 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.200 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.200 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.200 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.201 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.201 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.bytes volume: 74063872 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.202 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.203 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '73d5e210-0a91-48ef-a34d-6b3c6d05acfc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 74063872, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:34:56.201633', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'e9ff2f62-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10736.354886762, 'message_signature': '682b36a2ad1bd8bd598ca62f2ab4b61ca74174613bf1680c190d7a0067443653'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:34:56.201633', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'e9ff40f6-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10736.354886762, 'message_signature': 'f35e2be863eee16b5e109164f430d135c119729b535984482730951b1322d85f'}]}, 'timestamp': '2026-02-23 09:34:56.202581', '_unique_id': '648f54a974154782b58680c796048655'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.203 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.203 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.203 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.203 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.203 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.203 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.203 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.203 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.203 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.203 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.203 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.203 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.203 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.203 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.203 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.203 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.203 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.203 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.203 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.203 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.203 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.203 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.203 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.203 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.203 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.203 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.203 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.203 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.203 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.203 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.203 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.205 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.205 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.bytes volume: 29130240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.205 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.bytes volume: 4300800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.207 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9d19a4dc-f6f8-4dbb-9f89-9be247e8ee32', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29130240, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:34:56.205305', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'e9ffbd9c-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10736.354886762, 'message_signature': '3e45c434dbed7d7541801a695e619509e3c5a86c2a359125b5fc7b1872f52da2'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4300800, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 
'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:34:56.205305', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'e9ffd052-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10736.354886762, 'message_signature': 'b741c3aa70c6fb4e4213053d2110c182dc64797f611b46e3081ee0b6abc2d5f2'}]}, 'timestamp': '2026-02-23 09:34:56.206255', '_unique_id': '0c842361abe04d4e96b8fb919eed6586'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.207 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.207 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.207 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.207 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.207 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.207 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.207 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.207 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.207 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.207 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.207 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.207 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.207 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.207 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.207 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.207 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.207 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.207 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.207 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.207 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.207 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.207 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.207 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.207 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.207 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.207 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.207 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.207 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.207 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.207 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.207 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.208 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.208 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.210 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7e193fd5-2dbd-492f-9c89-68d3257a9c60', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:34:56.208603', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': 'ea003e8e-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10736.344198894, 'message_signature': 'f0ba8526d9f1d00442adce0a3f388a780b4549d64745e2b81595badc71e049c8'}]}, 'timestamp': '2026-02-23 09:34:56.209173', '_unique_id': '72d61fd59d3447cf8231496d15f1b2b5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.210 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.210 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.210 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.210 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.210 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.210 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.210 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.210 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.210 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:34:56 np0005626463.localdomain sudo[262075]: pam_unix(sudo:session): session closed for user root
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.210 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.210 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.210 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.210 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.210 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.210 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.210 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.210 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.210 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.210 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.210 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.210 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.210 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.210 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.210 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.210 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.210 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.210 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.210 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.210 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.210 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.210 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.211 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.211 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.bytes volume: 12784 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.212 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'df6079ee-084c-4846-9c4e-9cce4dd02e83', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 12784, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:34:56.211443', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': 'ea00ad42-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10736.344198894, 'message_signature': '462196be923fa0169a7eae174d41ab92b795eedc2585593a0e2f0f623b44705e'}]}, 'timestamp': '2026-02-23 09:34:56.211972', '_unique_id': '2fce48a7a2ca452e96ad2143af2cb246'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.212 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.212 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.212 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.212 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.212 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.212 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.212 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.212 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.212 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.212 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.212 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.212 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.212 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.212 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.212 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.212 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.212 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.212 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.212 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.212 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.212 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.212 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.212 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.212 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.212 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.212 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.212 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.212 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.212 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.212 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.212 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.214 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.214 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.bytes volume: 9216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.215 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e90603f3-96af-4b40-b8aa-c0af4d413cb7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9216, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:34:56.214248', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': 'ea011aca-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10736.344198894, 'message_signature': 'd9a9e30b09db5b5daf32d195d74c86a4fc708bb36602b59907101d6387a1dcf6'}]}, 'timestamp': '2026-02-23 09:34:56.214745', '_unique_id': '4c47e6387cda4d29acabd13fe610e61b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.215 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.215 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.215 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.215 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.215 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.215 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.215 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.215 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.215 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.215 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.215 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.215 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.215 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.215 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.215 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.215 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.215 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.215 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.215 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.215 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.215 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.215 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.215 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.215 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.215 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.215 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.215 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.215 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.215 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.215 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.215 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.217 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.217 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.requests volume: 577 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.218 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.219 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '01468b17-e61c-42a1-9ab0-70377e217073', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 577, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:34:56.217548', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'ea019bb2-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10736.354886762, 'message_signature': '814b3d2e210177cef4c5e26e9dc2e27c525cf0dfec95a24975b79c7169c7200b'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:34:56.217548', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'ea01ae5e-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10736.354886762, 'message_signature': '49d6354e987d0bada8fded9803d0770bce939104ce3dfc0d08a8d4d4acf43f28'}]}, 'timestamp': '2026-02-23 09:34:56.218494', '_unique_id': 'a4a86eb6dd564fa2bc534dff2cbd9e8b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.219 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.219 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.219 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.219 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.219 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.219 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.219 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.219 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.219 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.219 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.219 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.219 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.219 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.219 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.219 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.219 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.219 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.219 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.219 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.219 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.219 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.219 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.219 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.219 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.219 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.219 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.219 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.219 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.219 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.219 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.219 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.220 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.220 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.packets volume: 87 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.222 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '427303af-2c21-46d4-9c77-0a7b8f1b2c31', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 87, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:34:56.220756', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': 'ea021ba0-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10736.344198894, 'message_signature': '8d37f606d015c31749718bb6f9f35c465eaafbcc0808be3467cedd50bbe45591'}]}, 'timestamp': '2026-02-23 09:34:56.221320', '_unique_id': '34189cd17bdb4795b7be01ac3e13bb38'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.222 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.222 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.222 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.222 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.222 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.222 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.222 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.222 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.222 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.222 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.222 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.222 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.222 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.222 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.222 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.222 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.222 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.222 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.222 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.222 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.222 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.222 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.222 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.222 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.222 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.222 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.222 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.222 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.222 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.222 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.222 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.223 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.223 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/memory.usage volume: 52.38671875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.225 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '800c2297-7595-48ed-bf71-b8cb7af2a64f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 52.38671875, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'timestamp': '2026-02-23T09:34:56.223520', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': 'ea0284b4-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10736.340275174, 'message_signature': 'fe3d294713f1b9ebcb1d1b82c0bb8288efa42172cdcc5696317a081171cf7cc7'}]}, 'timestamp': '2026-02-23 09:34:56.224044', '_unique_id': '06657e34ea1846988925282f76d60184'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.225 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.225 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.225 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.225 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.225 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.225 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.225 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.225 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.225 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.225 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.225 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.225 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.225 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.225 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.225 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.225 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.225 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.225 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.225 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.225 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.225 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.225 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.225 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.225 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.225 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.225 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.225 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.225 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.225 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.225 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.225 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.226 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.226 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.226 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.227 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2509cf08-d092-43dc-aaf6-f2ebf654c724', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:34:56.226428', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': 'ea02f656-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10736.344198894, 'message_signature': '67b7a2106eb29cf9b06fcb635174ba201ff977722dc28b5e999d8338e472d6f4'}]}, 'timestamp': '2026-02-23 09:34:56.226951', '_unique_id': 'c8b9f3fd642e4b77af7166bff6412f01'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.227 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.227 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.227 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.227 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.227 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.227 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.227 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.227 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.227 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.227 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.227 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.227 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.227 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.227 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.227 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.227 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.227 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.227 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.227 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.227 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.227 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.227 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.227 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.227 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.227 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.227 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.227 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.227 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.227 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.227 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.227 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.229 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.240 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.240 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.allocation volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.241 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f1d17d67-692a-41b4-b157-caeb5fd7d07a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:34:56.229186', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'ea051d0a-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10736.418696643, 'message_signature': 'f5340bfc36bc0fd585598d311a74a7a6fb70910318f1054121483e7bb69e5e95'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 
'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:34:56.229186', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'ea052980-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10736.418696643, 'message_signature': 'badb59891334d73b82d573ea992639b614302e99b1350cca1efcf768dbc2bb5d'}]}, 'timestamp': '2026-02-23 09:34:56.241217', '_unique_id': '441e7112ccd34d5d98790c9a2878bd7d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.241 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.241 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.241 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.241 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.241 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.241 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.241 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.241 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.241 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.241 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.241 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.241 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.241 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.241 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.241 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.241 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.241 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.241 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.241 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.241 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.241 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.241 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.241 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.241 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.241 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.241 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.241 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.241 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.241 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.241 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.241 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.242 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.242 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.latency volume: 1234377028 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.242 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.latency volume: 170393160 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.243 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ca41c10c-87f8-484f-b7eb-0854c52abe00', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1234377028, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:34:56.242656', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'ea056c6a-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10736.354886762, 'message_signature': '531bba6cd0ca96f5def37bd38c05e4520a543c883be8b458c1d891c3d967fe0c'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 170393160, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 
'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:34:56.242656', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'ea057796-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10736.354886762, 'message_signature': '0bafd189d51526a1ba82ac0b686ff0408639be521d3e80ccc93a0d6a19bc2a8a'}]}, 'timestamp': '2026-02-23 09:34:56.243213', '_unique_id': '568c42c9c28e478b92f6389c52cb31a8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.243 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.243 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.243 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.243 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.243 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.243 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.243 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.243 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.243 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.243 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.243 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.243 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.243 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.243 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.243 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.243 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.243 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.243 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.243 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.243 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.243 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.243 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.243 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.243 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.243 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.243 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.243 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.243 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.243 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.243 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.243 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.244 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.244 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.244 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.244 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.245 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '49e0ed73-b464-48d4-bade-5a8c76d99dcd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:34:56.244611', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'ea05b85a-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10736.418696643, 'message_signature': 'bccc1fad628d016cb1e99b1a461d4b2a77e5eca6718ede932ae815260ed03a39'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 
'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:34:56.244611', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'ea05c3cc-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10736.418696643, 'message_signature': '032df660b0a71ed0d1725fe82df0f873acd44155988eebfafff607f9603c157e'}]}, 'timestamp': '2026-02-23 09:34:56.245164', '_unique_id': '73853cad79ef4949899837acae2bf228'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.245 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.245 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.245 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.245 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.245 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.245 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.245 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.245 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.245 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.245 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.245 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.245 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.245 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.245 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.245 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.245 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.245 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.245 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.245 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.245 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.245 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.245 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.245 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.245 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.245 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.245 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.245 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.245 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.245 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.245 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.245 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.246 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.246 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.247 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '220027b5-b5e9-4298-b7a6-00a7cfd6977c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:34:56.246472', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': 'ea060148-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10736.344198894, 'message_signature': 'c1b1184af8a56c8b061abc614ccd34f593b024b137fda56673c74ffc1a027f9d'}]}, 'timestamp': '2026-02-23 09:34:56.246755', '_unique_id': '798dde2eb9aa448fbaf1138c8dd4b0cb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.247 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.247 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.247 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.247 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.247 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.247 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.247 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.247 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.247 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.247 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.247 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.247 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.247 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.247 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.247 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.247 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.247 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.247 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.247 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.247 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.247 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.247 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.247 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.247 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.247 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.247 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.247 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.247 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.247 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.247 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.247 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.247 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.248 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.latency volume: 260974500 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.248 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.latency volume: 24478467 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.249 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '53c6fe9b-554d-4c28-b317-409f4852a9e3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 260974500, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:34:56.248056', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'ea063f00-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10736.354886762, 'message_signature': '9716e99ec14b09d009ad5f3a5f66d37c1c69b54b9abc404280528dc3f87214d8'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 24478467, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:34:56.248056', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'ea06489c-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10736.354886762, 'message_signature': 'f0038ed9b882beb0155675ba0dff0cbc1c548efba7711e8d93da2dafff7b56b9'}]}, 'timestamp': '2026-02-23 09:34:56.248561', '_unique_id': '414bd1202e1b4d23acb9e6586b4b614b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.249 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.249 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.249 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.249 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.249 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.249 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.249 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.249 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.249 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.249 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.249 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.249 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.249 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.249 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.249 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.249 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.249 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.249 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.249 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.249 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.249 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.249 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.249 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.249 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.249 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.249 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.249 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.249 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.249 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.249 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.249 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.249 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.249 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.250 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.250 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.251 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0cfbee12-84d5-4d4f-bbd4-b932816d1a0b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:34:56.250058', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'ea069324-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10736.418696643, 'message_signature': '9570e4604484cf4cf85e2b5dd500fe534c16032b8e31022354f10e28a6402f73'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 
'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:34:56.250058', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'ea069d4c-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10736.418696643, 'message_signature': 'e9b6ec69729fd7f1234a7ee677dcc4ab3f67c97b07d549af26de9711e9a9f275'}]}, 'timestamp': '2026-02-23 09:34:56.250733', '_unique_id': '5c5ddf8711e44c08a34188f244626d4a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.251 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.251 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.251 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.251 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.251 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.251 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.251 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.251 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.251 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.251 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.251 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.251 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.251 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.251 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.251 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.251 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.251 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.251 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.251 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.251 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.251 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.251 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.251 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.251 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.251 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.251 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.251 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.251 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.251 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.251 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.251 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.252 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.252 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.packets volume: 145 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.253 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f9a92e77-d891-4aa3-b39b-aade3e2bd94f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 145, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:34:56.252139', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': 'ea06dea6-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10736.344198894, 'message_signature': '76a987dcc1cecdac7c68a68adf4833f335296709573f0d4229fd05978677584c'}]}, 'timestamp': '2026-02-23 09:34:56.252423', '_unique_id': 'ed4b0e4673294ac7a8019f9bcd0a79af'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.253 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.253 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.253 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.253 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.253 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.253 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.253 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.253 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.253 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.253 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.253 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.253 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.253 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.253 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.253 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.253 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.253 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.253 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.253 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.253 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.253 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.253 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.253 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.253 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.253 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.253 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.253 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.253 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.253 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.253 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:34:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.253 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:34:56 np0005626463.localdomain sudo[262185]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xrrflkjsgndmghmdexzsyokllsoqhwjc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839296.351888-822-86433147571278/AnsiballZ_file.py
Feb 23 09:34:56 np0005626463.localdomain sudo[262185]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:34:56 np0005626463.localdomain python3.9[262187]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:34:56 np0005626463.localdomain sudo[262185]: pam_unix(sudo:session): session closed for user root
Feb 23 09:34:57 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:34:57.018 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:34:57 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1609 DF PROTO=TCP SPT=45600 DPT=9102 SEQ=1071701713 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF974860000000001030307) 
Feb 23 09:34:57 np0005626463.localdomain sudo[262295]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-krgcmcpdykqzajmvzljqidvadmhnvdpc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839297.048972-846-179064624266554/AnsiballZ_stat.py
Feb 23 09:34:57 np0005626463.localdomain sudo[262295]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:34:57 np0005626463.localdomain python3.9[262297]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:34:57 np0005626463.localdomain sudo[262295]: pam_unix(sudo:session): session closed for user root
Feb 23 09:34:57 np0005626463.localdomain sudo[262352]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zdakhnjoknzexctqilnjiinsuetxkkcc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839297.048972-846-179064624266554/AnsiballZ_file.py
Feb 23 09:34:57 np0005626463.localdomain sudo[262352]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:34:57 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:34:57.840 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:34:57 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:34:57.841 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:34:57 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:34:57.841 231725 DEBUG nova.compute.manager [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 23 09:34:57 np0005626463.localdomain python3.9[262354]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:34:57 np0005626463.localdomain sudo[262352]: pam_unix(sudo:session): session closed for user root
Feb 23 09:34:58 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47055 DF PROTO=TCP SPT=40542 DPT=9102 SEQ=3702321504 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF978060000000001030307) 
Feb 23 09:34:58 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:34:58.540 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:34:58 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:34:58.541 231725 DEBUG nova.compute.manager [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 23 09:34:58 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:34:58.541 231725 DEBUG nova.compute.manager [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 23 09:34:58 np0005626463.localdomain sudo[262462]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dknlhlmwkcbpsasskrngnvmqchqktlnj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839298.562014-882-53882355133745/AnsiballZ_stat.py
Feb 23 09:34:58 np0005626463.localdomain sudo[262462]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:34:59 np0005626463.localdomain python3.9[262464]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:34:59 np0005626463.localdomain sudo[262462]: pam_unix(sudo:session): session closed for user root
Feb 23 09:34:59 np0005626463.localdomain sudo[262519]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vextqrazvsrhcmxliwvntgyxzbhbuilq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839298.562014-882-53882355133745/AnsiballZ_file.py
Feb 23 09:34:59 np0005626463.localdomain sudo[262519]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:34:59 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.
Feb 23 09:34:59 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:34:59.356 231725 DEBUG oslo_concurrency.lockutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Acquiring lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 23 09:34:59 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:34:59.356 231725 DEBUG oslo_concurrency.lockutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Acquired lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 23 09:34:59 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:34:59.356 231725 DEBUG nova.network.neutron [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 23 09:34:59 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:34:59.357 231725 DEBUG nova.objects.instance [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c2a7d92b-952f-46a7-8a6a-3322a48fcf4b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 23 09:34:59 np0005626463.localdomain systemd[1]: tmp-crun.f7p3VO.mount: Deactivated successfully.
Feb 23 09:34:59 np0005626463.localdomain podman[262522]: 2026-02-23 09:34:59.443386243 +0000 UTC m=+0.089033447 container health_status 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-02-05T04:57:10Z, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vendor=Red Hat, Inc., distribution-scope=public, vcs-type=git, container_name=openstack_network_exporter, build-date=2026-02-05T04:57:10Z, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.7, io.buildah.version=1.33.7, architecture=x86_64, io.openshift.expose-services=, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1770267347, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9/ubi-minimal)
Feb 23 09:34:59 np0005626463.localdomain podman[262522]: 2026-02-23 09:34:59.457038042 +0000 UTC m=+0.102685216 container exec_died 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, io.buildah.version=1.33.7, architecture=x86_64, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., container_name=openstack_network_exporter, io.openshift.expose-services=, org.opencontainers.image.created=2026-02-05T04:57:10Z, distribution-scope=public, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9/ubi-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, release=1770267347, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2026-02-05T04:57:10Z, version=9.7, maintainer=Red Hat, Inc.)
Feb 23 09:34:59 np0005626463.localdomain systemd[1]: 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.service: Deactivated successfully.
Feb 23 09:34:59 np0005626463.localdomain python3.9[262521]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:34:59 np0005626463.localdomain sudo[262519]: pam_unix(sudo:session): session closed for user root
Feb 23 09:35:00 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:35:00.361 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:35:00 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:35:00.456 231725 DEBUG nova.network.neutron [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Updating instance_info_cache with network_info: [{"id": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "address": "fa:16:3e:a0:9d:00", "network": {"id": "9da5b53d-3184-450f-9a5b-bdba1a6c9f6d", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "37b8098efb0d4ecc90b451a2db0e966f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa27e5011-20", "ovs_interfaceid": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 23 09:35:00 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:35:00.478 231725 DEBUG oslo_concurrency.lockutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Releasing lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 23 09:35:00 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:35:00.479 231725 DEBUG nova.compute.manager [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 23 09:35:00 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:35:00.480 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:35:00 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:35:00.480 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:35:00 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:35:00.480 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:35:00 np0005626463.localdomain sudo[262649]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tyduxcgcupvmxsnhkyvsxneactiamfvz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839299.7160487-918-64881777467906/AnsiballZ_systemd.py
Feb 23 09:35:00 np0005626463.localdomain sudo[262649]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:35:01 np0005626463.localdomain python3.9[262651]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 09:35:01 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 09:35:01 np0005626463.localdomain systemd-rc-local-generator[262679]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 09:35:01 np0005626463.localdomain systemd-sysv-generator[262682]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 09:35:01 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1610 DF PROTO=TCP SPT=45600 DPT=9102 SEQ=1071701713 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF984460000000001030307) 
Feb 23 09:35:01 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:35:01 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 23 09:35:01 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:35:01 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:35:01 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 09:35:01 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 23 09:35:01 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:35:01 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:35:01 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:35:01 np0005626463.localdomain sudo[262649]: pam_unix(sudo:session): session closed for user root
Feb 23 09:35:01 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:35:01.476 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:35:01 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:35:01.539 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:35:01 np0005626463.localdomain sudo[262797]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rckibczkyujjjbbgxtctheiwrgrbecgv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839301.6145036-942-177637575340785/AnsiballZ_stat.py
Feb 23 09:35:01 np0005626463.localdomain sudo[262797]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:35:02 np0005626463.localdomain python3.9[262799]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:35:02 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:35:02.068 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:35:02 np0005626463.localdomain sudo[262797]: pam_unix(sudo:session): session closed for user root
Feb 23 09:35:02 np0005626463.localdomain sudo[262854]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vgoihzisgmlujoasnbmrzxvifxvfdavg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839301.6145036-942-177637575340785/AnsiballZ_file.py
Feb 23 09:35:02 np0005626463.localdomain sudo[262854]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:35:02 np0005626463.localdomain python3.9[262856]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:35:02 np0005626463.localdomain sudo[262854]: pam_unix(sudo:session): session closed for user root
Feb 23 09:35:03 np0005626463.localdomain sudo[262964]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ctnsseumxfchzeuzrrmraegkyivkpneg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839302.7725263-978-76147105238575/AnsiballZ_stat.py
Feb 23 09:35:03 np0005626463.localdomain sudo[262964]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:35:03 np0005626463.localdomain python3.9[262966]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:35:03 np0005626463.localdomain sudo[262964]: pam_unix(sudo:session): session closed for user root
Feb 23 09:35:03 np0005626463.localdomain sudo[263021]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uurymsfdqcghmyctkbzyiyhjpksunten ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839302.7725263-978-76147105238575/AnsiballZ_file.py
Feb 23 09:35:03 np0005626463.localdomain sudo[263021]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:35:03 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:35:03.540 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:35:03 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:35:03.566 231725 DEBUG oslo_concurrency.lockutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 09:35:03 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:35:03.567 231725 DEBUG oslo_concurrency.lockutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 09:35:03 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:35:03.567 231725 DEBUG oslo_concurrency.lockutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 09:35:03 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:35:03.568 231725 DEBUG nova.compute.resource_tracker [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Auditing locally available compute resources for np0005626463.localdomain (node: np0005626463.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 23 09:35:03 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:35:03.568 231725 DEBUG oslo_concurrency.processutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 09:35:03 np0005626463.localdomain python3.9[263023]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:35:03 np0005626463.localdomain sudo[263021]: pam_unix(sudo:session): session closed for user root
Feb 23 09:35:04 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:35:04.016 231725 DEBUG oslo_concurrency.processutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.448s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 09:35:04 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:35:04.088 231725 DEBUG nova.virt.libvirt.driver [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 23 09:35:04 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:35:04.089 231725 DEBUG nova.virt.libvirt.driver [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 23 09:35:04 np0005626463.localdomain sudo[263153]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dtjmfyowxdqgygdbzidlwsfsvbvqxlau ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839303.9562488-1014-8646968573708/AnsiballZ_systemd.py
Feb 23 09:35:04 np0005626463.localdomain sudo[263153]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:35:04 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.
Feb 23 09:35:04 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.
Feb 23 09:35:04 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:35:04.326 231725 WARNING nova.virt.libvirt.driver [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 23 09:35:04 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:35:04.328 231725 DEBUG nova.compute.resource_tracker [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Hypervisor/Node resource view: name=np0005626463.localdomain free_ram=12244MB free_disk=41.83688735961914GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 23 09:35:04 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:35:04.328 231725 DEBUG oslo_concurrency.lockutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 09:35:04 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:35:04.329 231725 DEBUG oslo_concurrency.lockutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 09:35:04 np0005626463.localdomain systemd[1]: tmp-crun.L6KTKV.mount: Deactivated successfully.
Feb 23 09:35:04 np0005626463.localdomain podman[263155]: 2026-02-23 09:35:04.393046886 +0000 UTC m=+0.094643178 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_managed=true, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb)
Feb 23 09:35:04 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:35:04.413 231725 DEBUG nova.compute.resource_tracker [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Instance c2a7d92b-952f-46a7-8a6a-3322a48fcf4b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 23 09:35:04 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:35:04.414 231725 DEBUG nova.compute.resource_tracker [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 23 09:35:04 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:35:04.414 231725 DEBUG nova.compute.resource_tracker [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Final resource view: name=np0005626463.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 23 09:35:04 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:35:04.456 231725 DEBUG oslo_concurrency.processutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 09:35:04 np0005626463.localdomain podman[263157]: 2026-02-23 09:35:04.464089608 +0000 UTC m=+0.164984439 container health_status bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 23 09:35:04 np0005626463.localdomain podman[263157]: 2026-02-23 09:35:04.476208811 +0000 UTC m=+0.177103632 container exec_died bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Feb 23 09:35:04 np0005626463.localdomain systemd[1]: bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.service: Deactivated successfully.
Feb 23 09:35:04 np0005626463.localdomain podman[263155]: 2026-02-23 09:35:04.499328321 +0000 UTC m=+0.200924573 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Feb 23 09:35:04 np0005626463.localdomain systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully.
Feb 23 09:35:04 np0005626463.localdomain python3.9[263156]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 09:35:04 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 09:35:04 np0005626463.localdomain systemd-rc-local-generator[263249]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 09:35:04 np0005626463.localdomain systemd-sysv-generator[263252]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 09:35:04 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:35:04 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 23 09:35:04 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:35:04 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:35:04 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 09:35:04 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 23 09:35:04 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:35:04 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:35:04 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:35:04 np0005626463.localdomain systemd[1]: Starting Create netns directory...
Feb 23 09:35:04 np0005626463.localdomain systemd[1]: netns-placeholder.service: Deactivated successfully.
Feb 23 09:35:04 np0005626463.localdomain systemd[1]: Finished Create netns directory.
Feb 23 09:35:04 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:35:04.957 231725 DEBUG oslo_concurrency.processutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.501s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 09:35:04 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:35:04.964 231725 DEBUG nova.compute.provider_tree [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Inventory has not changed in ProviderTree for provider: be63d86c-a403-4ec9-a515-07ea2962cb4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 23 09:35:04 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:35:04.982 231725 DEBUG nova.scheduler.client.report [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Inventory has not changed for provider be63d86c-a403-4ec9-a515-07ea2962cb4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 23 09:35:04 np0005626463.localdomain sudo[263153]: pam_unix(sudo:session): session closed for user root
Feb 23 09:35:04 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:35:04.985 231725 DEBUG nova.compute.resource_tracker [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Compute_service record updated for np0005626463.localdomain:np0005626463.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 23 09:35:04 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:35:04.986 231725 DEBUG oslo_concurrency.lockutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.657s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 09:35:05 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:35:05.363 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:35:05 np0005626463.localdomain systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Feb 23 09:35:06 np0005626463.localdomain sudo[263373]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-edaijikwiendgjrxnecegtnvxmxrkydj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839306.4646149-1044-54844320922289/AnsiballZ_file.py
Feb 23 09:35:06 np0005626463.localdomain sudo[263373]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:35:06 np0005626463.localdomain python3.9[263375]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:35:06 np0005626463.localdomain sudo[263373]: pam_unix(sudo:session): session closed for user root
Feb 23 09:35:07 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:35:07.111 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:35:07 np0005626463.localdomain sudo[263483]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-izadbewlrjpmgfgtsfzrlmjopuhhpulk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839307.1255882-1068-172636218438349/AnsiballZ_file.py
Feb 23 09:35:07 np0005626463.localdomain sudo[263483]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:35:07 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.
Feb 23 09:35:07 np0005626463.localdomain systemd[1]: tmp-crun.MlsUM6.mount: Deactivated successfully.
Feb 23 09:35:07 np0005626463.localdomain podman[263486]: 2026-02-23 09:35:07.488691838 +0000 UTC m=+0.097180517 container health_status be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Feb 23 09:35:07 np0005626463.localdomain podman[263486]: 2026-02-23 09:35:07.502278345 +0000 UTC m=+0.110767024 container exec_died be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true)
Feb 23 09:35:07 np0005626463.localdomain systemd[1]: be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.service: Deactivated successfully.
Feb 23 09:35:07 np0005626463.localdomain python3.9[263485]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 23 09:35:07 np0005626463.localdomain sudo[263483]: pam_unix(sudo:session): session closed for user root
Feb 23 09:35:08 np0005626463.localdomain sudo[263611]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vhydwzsctcukrmksystviwcbckjliptm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839307.8117497-1092-18518264836399/AnsiballZ_stat.py
Feb 23 09:35:08 np0005626463.localdomain sudo[263611]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:35:08 np0005626463.localdomain python3.9[263613]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/neutron_dhcp_agent.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:35:08 np0005626463.localdomain sudo[263611]: pam_unix(sudo:session): session closed for user root
Feb 23 09:35:08 np0005626463.localdomain sudo[263699]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xeuanzrikbgrjvxbtvbrvaaouecamnnv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839307.8117497-1092-18518264836399/AnsiballZ_copy.py
Feb 23 09:35:08 np0005626463.localdomain sudo[263699]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:35:08 np0005626463.localdomain python3.9[263701]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/neutron_dhcp_agent.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1771839307.8117497-1092-18518264836399/.source.json _original_basename=.w8nqsbxi follow=False checksum=c62829c98c0f9e788d62f52aa71fba276cd98270 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:35:08 np0005626463.localdomain sudo[263699]: pam_unix(sudo:session): session closed for user root
Feb 23 09:35:09 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1611 DF PROTO=TCP SPT=45600 DPT=9102 SEQ=1071701713 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF9A4060000000001030307) 
Feb 23 09:35:09 np0005626463.localdomain podman[242954]: time="2026-02-23T09:35:09Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 23 09:35:09 np0005626463.localdomain podman[242954]: @ - - [23/Feb/2026:09:35:09 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 147333 "" "Go-http-client/1.1"
Feb 23 09:35:09 np0005626463.localdomain podman[242954]: @ - - [23/Feb/2026:09:35:09 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16348 "" "Go-http-client/1.1"
Feb 23 09:35:09 np0005626463.localdomain python3.9[263809]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/neutron_dhcp state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:35:10 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:35:10.382 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:35:12 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:35:12.162 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:35:12 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.
Feb 23 09:35:12 np0005626463.localdomain sudo[264111]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qmtjfzlrndkxwztjtjdiqzuytmvvcaqp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839311.703984-1212-113361114144214/AnsiballZ_container_config_data.py
Feb 23 09:35:12 np0005626463.localdomain sudo[264111]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:35:12 np0005626463.localdomain podman[264113]: 2026-02-23 09:35:12.90872267 +0000 UTC m=+0.083973521 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.43.0)
Feb 23 09:35:12 np0005626463.localdomain podman[264113]: 2026-02-23 09:35:12.91491262 +0000 UTC m=+0.090163501 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.build-date=20260216, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2)
Feb 23 09:35:12 np0005626463.localdomain systemd[1]: 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully.
Feb 23 09:35:13 np0005626463.localdomain python3.9[264119]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/neutron_dhcp config_pattern=*.json debug=False
Feb 23 09:35:13 np0005626463.localdomain sudo[264111]: pam_unix(sudo:session): session closed for user root
Feb 23 09:35:13 np0005626463.localdomain openstack_network_exporter[245358]: ERROR   09:35:13 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 23 09:35:13 np0005626463.localdomain openstack_network_exporter[245358]: 
Feb 23 09:35:13 np0005626463.localdomain openstack_network_exporter[245358]: ERROR   09:35:13 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 23 09:35:13 np0005626463.localdomain openstack_network_exporter[245358]: 
Feb 23 09:35:14 np0005626463.localdomain sudo[264238]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jvzrdnumlcwcbwaybjjlxaxpoamhgmgf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839314.5884447-1245-269376512401012/AnsiballZ_container_config_hash.py
Feb 23 09:35:14 np0005626463.localdomain sudo[264238]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:35:15 np0005626463.localdomain python3.9[264240]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Feb 23 09:35:15 np0005626463.localdomain sudo[264238]: pam_unix(sudo:session): session closed for user root
Feb 23 09:35:15 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:35:15.416 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:35:16 np0005626463.localdomain sudo[264348]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xlnmurjuhhdncnpavfgoawswzynebicb ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1771839315.5823689-1275-68540145760056/AnsiballZ_edpm_container_manage.py
Feb 23 09:35:16 np0005626463.localdomain sudo[264348]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:35:16 np0005626463.localdomain sshd[264351]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:35:16 np0005626463.localdomain python3[264350]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/neutron_dhcp config_id=neutron_dhcp config_overrides={} config_patterns=*.json containers=['neutron_dhcp_agent'] log_base_path=/var/log/containers/stdouts debug=False
Feb 23 09:35:16 np0005626463.localdomain podman[264389]: 
Feb 23 09:35:16 np0005626463.localdomain sshd[264351]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:35:16 np0005626463.localdomain podman[264389]: 2026-02-23 09:35:16.654860193 +0000 UTC m=+0.083482526 container create 1b2c23eafd230afc8d091ee7b7cedbb3afbedca0796f62e8cc47ca513d5981f8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron_dhcp_agent, io.buildah.version=1.43.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20260216, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=neutron_dhcp, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-1f52e72ae9640a6a5ab74adc25dc50b25fb3048e41c16c538b8ad4759be27c31'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/openstack/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, container_name=neutron_dhcp_agent, tcib_managed=true, org.label-schema.schema-version=1.0)
Feb 23 09:35:16 np0005626463.localdomain podman[264389]: 2026-02-23 09:35:16.610531011 +0000 UTC m=+0.039153374 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 23 09:35:16 np0005626463.localdomain python3[264350]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name neutron_dhcp_agent --cgroupns=host --conmon-pidfile /run/neutron_dhcp_agent.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-1f52e72ae9640a6a5ab74adc25dc50b25fb3048e41c16c538b8ad4759be27c31 --label config_id=neutron_dhcp --label container_name=neutron_dhcp_agent --label managed_by=edpm_ansible --label config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-1f52e72ae9640a6a5ab74adc25dc50b25fb3048e41c16c538b8ad4759be27c31'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/openstack/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']} --log-driver journald --log-level info --network host --pid host --privileged=True --user root --volume /run/netns:/run/netns:shared --volume /var/lib/openstack/neutron-dhcp-agent:/etc/neutron.conf.d:z --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /run/openvswitch:/run/openvswitch:shared,z --volume 
/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro --volume /var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro --volume /var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro --volume /var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 23 09:35:16 np0005626463.localdomain sudo[264348]: pam_unix(sudo:session): session closed for user root
Feb 23 09:35:17 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:35:17.208 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:35:17 np0005626463.localdomain sudo[264535]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-njmoewyfappepqwmiklaecjnyurupklo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839317.0203927-1299-224231896979558/AnsiballZ_stat.py
Feb 23 09:35:17 np0005626463.localdomain sudo[264535]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:35:17 np0005626463.localdomain python3.9[264537]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 23 09:35:17 np0005626463.localdomain sudo[264535]: pam_unix(sudo:session): session closed for user root
Feb 23 09:35:18 np0005626463.localdomain sudo[264647]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qrhfqgzayfktwrgfclixiyiyqoamynxg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839317.850267-1326-257380492238693/AnsiballZ_file.py
Feb 23 09:35:18 np0005626463.localdomain sudo[264647]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:35:18 np0005626463.localdomain python3.9[264649]: ansible-file Invoked with path=/etc/systemd/system/edpm_neutron_dhcp_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:35:18 np0005626463.localdomain sudo[264647]: pam_unix(sudo:session): session closed for user root
Feb 23 09:35:18 np0005626463.localdomain sudo[264702]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-utvuhjkfvpqmrftotcjwwobzbsvvakmk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839317.850267-1326-257380492238693/AnsiballZ_stat.py
Feb 23 09:35:18 np0005626463.localdomain sudo[264702]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:35:18 np0005626463.localdomain sshd[264705]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:35:18 np0005626463.localdomain python3.9[264704]: ansible-stat Invoked with path=/etc/systemd/system/edpm_neutron_dhcp_agent_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 23 09:35:18 np0005626463.localdomain sudo[264702]: pam_unix(sudo:session): session closed for user root
Feb 23 09:35:19 np0005626463.localdomain sshd[264705]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:35:19 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.
Feb 23 09:35:19 np0005626463.localdomain systemd[1]: tmp-crun.GIdblx.mount: Deactivated successfully.
Feb 23 09:35:19 np0005626463.localdomain podman[264761]: 2026-02-23 09:35:19.133414569 +0000 UTC m=+0.092620816 container health_status da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 23 09:35:19 np0005626463.localdomain podman[264761]: 2026-02-23 09:35:19.168329201 +0000 UTC m=+0.127535468 container exec_died da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 23 09:35:19 np0005626463.localdomain systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Deactivated successfully.
Feb 23 09:35:19 np0005626463.localdomain sudo[264836]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pdybcqguvldjwyjpobifcwrrztnzlvui ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839318.806457-1326-216739614844479/AnsiballZ_copy.py
Feb 23 09:35:19 np0005626463.localdomain sudo[264836]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:35:19 np0005626463.localdomain python3.9[264838]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1771839318.806457-1326-216739614844479/source dest=/etc/systemd/system/edpm_neutron_dhcp_agent.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:35:19 np0005626463.localdomain sudo[264836]: pam_unix(sudo:session): session closed for user root
Feb 23 09:35:19 np0005626463.localdomain sudo[264891]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mnvigmscvvneqwvrtyfxdjmqhzjqyeyc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839318.806457-1326-216739614844479/AnsiballZ_systemd.py
Feb 23 09:35:19 np0005626463.localdomain sudo[264891]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:35:19 np0005626463.localdomain python3.9[264893]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 23 09:35:20 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 09:35:20 np0005626463.localdomain systemd-rc-local-generator[264919]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 09:35:20 np0005626463.localdomain systemd-sysv-generator[264922]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 09:35:20 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:35:20 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 23 09:35:20 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:35:20 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:35:20 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 09:35:20 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 23 09:35:20 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:35:20 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:35:20 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:35:20 np0005626463.localdomain sudo[264891]: pam_unix(sudo:session): session closed for user root
Feb 23 09:35:20 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:35:20.448 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:35:20 np0005626463.localdomain sudo[264982]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lisxwgfqihobykwrtvfgvrxqudupgamx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839318.806457-1326-216739614844479/AnsiballZ_systemd.py
Feb 23 09:35:20 np0005626463.localdomain sudo[264982]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:35:20 np0005626463.localdomain sshd[264985]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:35:21 np0005626463.localdomain python3.9[264984]: ansible-systemd Invoked with state=restarted name=edpm_neutron_dhcp_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 09:35:21 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 09:35:21 np0005626463.localdomain systemd-rc-local-generator[265011]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 09:35:21 np0005626463.localdomain systemd-sysv-generator[265017]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 09:35:21 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:35:21 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 23 09:35:21 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:35:21 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:35:21 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 09:35:21 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 23 09:35:21 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:35:21 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:35:21 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:35:21 np0005626463.localdomain systemd[1]: Starting neutron_dhcp_agent container...
Feb 23 09:35:21 np0005626463.localdomain sshd[264985]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:35:21 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 09:35:21 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e6f78b67cd40916655edbb88a4af3cf02793db12429e4ca7ee55e69e47b3b14c/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff)
Feb 23 09:35:21 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e6f78b67cd40916655edbb88a4af3cf02793db12429e4ca7ee55e69e47b3b14c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 23 09:35:21 np0005626463.localdomain podman[265027]: 2026-02-23 09:35:21.589719672 +0000 UTC m=+0.130415948 container init 1b2c23eafd230afc8d091ee7b7cedbb3afbedca0796f62e8cc47ca513d5981f8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron_dhcp_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-1f52e72ae9640a6a5ab74adc25dc50b25fb3048e41c16c538b8ad4759be27c31'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/openstack/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, container_name=neutron_dhcp_agent, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=neutron_dhcp)
Feb 23 09:35:21 np0005626463.localdomain podman[265027]: 2026-02-23 09:35:21.605755924 +0000 UTC m=+0.146452210 container start 1b2c23eafd230afc8d091ee7b7cedbb3afbedca0796f62e8cc47ca513d5981f8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron_dhcp_agent, container_name=neutron_dhcp_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=neutron_dhcp, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-1f52e72ae9640a6a5ab74adc25dc50b25fb3048e41c16c538b8ad4759be27c31'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/openstack/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0)
Feb 23 09:35:21 np0005626463.localdomain podman[265027]: neutron_dhcp_agent
Feb 23 09:35:21 np0005626463.localdomain neutron_dhcp_agent[265040]: + sudo -E kolla_set_configs
Feb 23 09:35:21 np0005626463.localdomain systemd[1]: Started neutron_dhcp_agent container.
Feb 23 09:35:21 np0005626463.localdomain sudo[264982]: pam_unix(sudo:session): session closed for user root
Feb 23 09:35:21 np0005626463.localdomain neutron_dhcp_agent[265040]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Feb 23 09:35:21 np0005626463.localdomain neutron_dhcp_agent[265040]: INFO:__main__:Validating config file
Feb 23 09:35:21 np0005626463.localdomain neutron_dhcp_agent[265040]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Feb 23 09:35:21 np0005626463.localdomain neutron_dhcp_agent[265040]: INFO:__main__:Copying service configuration files
Feb 23 09:35:21 np0005626463.localdomain neutron_dhcp_agent[265040]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf
Feb 23 09:35:21 np0005626463.localdomain neutron_dhcp_agent[265040]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf
Feb 23 09:35:21 np0005626463.localdomain neutron_dhcp_agent[265040]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf
Feb 23 09:35:21 np0005626463.localdomain neutron_dhcp_agent[265040]: INFO:__main__:Writing out command to execute
Feb 23 09:35:21 np0005626463.localdomain neutron_dhcp_agent[265040]: INFO:__main__:Setting permission for /var/lib/neutron
Feb 23 09:35:21 np0005626463.localdomain neutron_dhcp_agent[265040]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts
Feb 23 09:35:21 np0005626463.localdomain neutron_dhcp_agent[265040]: INFO:__main__:Setting permission for /var/lib/neutron/.cache
Feb 23 09:35:21 np0005626463.localdomain neutron_dhcp_agent[265040]: INFO:__main__:Setting permission for /var/lib/neutron/external
Feb 23 09:35:21 np0005626463.localdomain neutron_dhcp_agent[265040]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy
Feb 23 09:35:21 np0005626463.localdomain neutron_dhcp_agent[265040]: INFO:__main__:Setting permission for /var/lib/neutron/ns-metadata-proxy
Feb 23 09:35:21 np0005626463.localdomain neutron_dhcp_agent[265040]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper
Feb 23 09:35:21 np0005626463.localdomain neutron_dhcp_agent[265040]: INFO:__main__:Setting permission for /var/lib/neutron/metadata_proxy
Feb 23 09:35:21 np0005626463.localdomain neutron_dhcp_agent[265040]: INFO:__main__:Setting permission for /var/lib/neutron/dhcp_agent_haproxy_wrapper
Feb 23 09:35:21 np0005626463.localdomain neutron_dhcp_agent[265040]: INFO:__main__:Setting permission for /var/lib/neutron/dhcp_agent_dnsmasq_wrapper
Feb 23 09:35:21 np0005626463.localdomain neutron_dhcp_agent[265040]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill
Feb 23 09:35:21 np0005626463.localdomain neutron_dhcp_agent[265040]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/dnsmasq-kill
Feb 23 09:35:21 np0005626463.localdomain neutron_dhcp_agent[265040]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints
Feb 23 09:35:21 np0005626463.localdomain neutron_dhcp_agent[265040]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints/b9146dd2a0dc3e0bc3fee7bb1b53fa22a55af280b3a177d7a47b63f92e7ebd29
Feb 23 09:35:21 np0005626463.localdomain neutron_dhcp_agent[265040]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints/c4d69ddbf6f4a149b7e6d31d28f2dc1fe1c08d98a601f027e6d63209aefe8011
Feb 23 09:35:21 np0005626463.localdomain neutron_dhcp_agent[265040]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids
Feb 23 09:35:21 np0005626463.localdomain neutron_dhcp_agent[265040]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids/9da5b53d-3184-450f-9a5b-bdba1a6c9f6d.pid.haproxy
Feb 23 09:35:21 np0005626463.localdomain neutron_dhcp_agent[265040]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy/9da5b53d-3184-450f-9a5b-bdba1a6c9f6d.conf
Feb 23 09:35:21 np0005626463.localdomain neutron_dhcp_agent[265040]: ++ cat /run_command
Feb 23 09:35:21 np0005626463.localdomain neutron_dhcp_agent[265040]: + CMD=/usr/bin/neutron-dhcp-agent
Feb 23 09:35:21 np0005626463.localdomain neutron_dhcp_agent[265040]: + ARGS=
Feb 23 09:35:21 np0005626463.localdomain neutron_dhcp_agent[265040]: + sudo kolla_copy_cacerts
Feb 23 09:35:21 np0005626463.localdomain neutron_dhcp_agent[265040]: + [[ ! -n '' ]]
Feb 23 09:35:21 np0005626463.localdomain neutron_dhcp_agent[265040]: + . kolla_extend_start
Feb 23 09:35:21 np0005626463.localdomain neutron_dhcp_agent[265040]: Running command: '/usr/bin/neutron-dhcp-agent'
Feb 23 09:35:21 np0005626463.localdomain neutron_dhcp_agent[265040]: + echo 'Running command: '\''/usr/bin/neutron-dhcp-agent'\'''
Feb 23 09:35:21 np0005626463.localdomain neutron_dhcp_agent[265040]: + umask 0022
Feb 23 09:35:21 np0005626463.localdomain neutron_dhcp_agent[265040]: + exec /usr/bin/neutron-dhcp-agent
Feb 23 09:35:22 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:35:22.247 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:35:22 np0005626463.localdomain python3.9[265164]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Feb 23 09:35:22 np0005626463.localdomain systemd[1]: tmp-crun.VRZ4Vs.mount: Deactivated successfully.
Feb 23 09:35:22 np0005626463.localdomain neutron_dhcp_agent[265040]: 2026-02-23 09:35:22.879 265044 INFO neutron.common.config [-] Logging enabled!
Feb 23 09:35:22 np0005626463.localdomain neutron_dhcp_agent[265040]: 2026-02-23 09:35:22.879 265044 INFO neutron.common.config [-] /usr/bin/neutron-dhcp-agent version 22.2.2.dev44
Feb 23 09:35:23 np0005626463.localdomain neutron_dhcp_agent[265040]: 2026-02-23 09:35:23.241 265044 INFO neutron.agent.dhcp.agent [-] Synchronizing state
Feb 23 09:35:23 np0005626463.localdomain sudo[265273]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ubztsnypgpuqhvigshfurkloewqkajic ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839323.1285071-1461-223820587238363/AnsiballZ_stat.py
Feb 23 09:35:23 np0005626463.localdomain sudo[265273]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:35:23 np0005626463.localdomain neutron_dhcp_agent[265040]: 2026-02-23 09:35:23.472 265044 INFO neutron.agent.dhcp.agent [None req-2b0d4c40-8213-4918-933a-7a481075b885 - - - - - -] All active networks have been fetched through RPC.
Feb 23 09:35:23 np0005626463.localdomain neutron_dhcp_agent[265040]: 2026-02-23 09:35:23.472 265044 INFO neutron.agent.dhcp.agent [None req-2b0d4c40-8213-4918-933a-7a481075b885 - - - - - -] Synchronizing state complete
Feb 23 09:35:23 np0005626463.localdomain neutron_dhcp_agent[265040]: 2026-02-23 09:35:23.546 265044 INFO neutron.agent.dhcp.agent [None req-2b0d4c40-8213-4918-933a-7a481075b885 - - - - - -] DHCP agent started
Feb 23 09:35:23 np0005626463.localdomain python3.9[265275]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:35:23 np0005626463.localdomain sudo[265273]: pam_unix(sudo:session): session closed for user root
Feb 23 09:35:23 np0005626463.localdomain sudo[265363]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ufexlcpqkusrvmmizprsndrkypvpuclc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839323.1285071-1461-223820587238363/AnsiballZ_copy.py
Feb 23 09:35:23 np0005626463.localdomain sudo[265363]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:35:24 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24706 DF PROTO=TCP SPT=42036 DPT=9102 SEQ=2437173652 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF9DDC00000000001030307) 
Feb 23 09:35:24 np0005626463.localdomain python3.9[265365]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771839323.1285071-1461-223820587238363/.source.yaml _original_basename=.v8c6l_mw follow=False checksum=032f1f7e8199faa0c01f5405a803b0de94087c3f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:35:24 np0005626463.localdomain sudo[265363]: pam_unix(sudo:session): session closed for user root
Feb 23 09:35:24 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:35:24.288 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=6, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '22:68:bc', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'c6:19:65:94:49:af'}, ipsec=False) old=SB_Global(nb_cfg=5) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 09:35:24 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:35:24.289 163572 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 23 09:35:24 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:35:24.291 163572 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=96b5bb93-7341-4ce6-9b93-6a5de566c711, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '6'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 09:35:24 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:35:24.325 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:35:24 np0005626463.localdomain sudo[265473]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rpvpzvwskjvrsjimsfsyjunabeqnzqxy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839324.3905177-1506-57931790900733/AnsiballZ_systemd.py
Feb 23 09:35:24 np0005626463.localdomain sudo[265473]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:35:24 np0005626463.localdomain python3.9[265475]: ansible-ansible.builtin.systemd Invoked with name=edpm_neutron_dhcp_agent.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 23 09:35:25 np0005626463.localdomain systemd[1]: Stopping neutron_dhcp_agent container...
Feb 23 09:35:25 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24707 DF PROTO=TCP SPT=42036 DPT=9102 SEQ=2437173652 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF9E1C60000000001030307) 
Feb 23 09:35:25 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:35:25.486 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:35:25 np0005626463.localdomain neutron_dhcp_agent[265040]: 2026-02-23 09:35:25.620 265044 WARNING amqp [-] Received method (60, 30) during closing channel 1. This method will be ignored
Feb 23 09:35:25 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1612 DF PROTO=TCP SPT=45600 DPT=9102 SEQ=1071701713 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF9E4060000000001030307) 
Feb 23 09:35:25 np0005626463.localdomain systemd[1]: libpod-1b2c23eafd230afc8d091ee7b7cedbb3afbedca0796f62e8cc47ca513d5981f8.scope: Deactivated successfully.
Feb 23 09:35:25 np0005626463.localdomain systemd[1]: libpod-1b2c23eafd230afc8d091ee7b7cedbb3afbedca0796f62e8cc47ca513d5981f8.scope: Consumed 1.973s CPU time.
Feb 23 09:35:25 np0005626463.localdomain podman[265479]: 2026-02-23 09:35:25.936937598 +0000 UTC m=+0.890465724 container died 1b2c23eafd230afc8d091ee7b7cedbb3afbedca0796f62e8cc47ca513d5981f8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron_dhcp_agent, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-1f52e72ae9640a6a5ab74adc25dc50b25fb3048e41c16c538b8ad4759be27c31'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/openstack/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260216, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, config_id=neutron_dhcp, org.label-schema.license=GPLv2, tcib_managed=true, container_name=neutron_dhcp_agent)
Feb 23 09:35:25 np0005626463.localdomain podman[265479]: 2026-02-23 09:35:25.988477312 +0000 UTC m=+0.942005338 container cleanup 1b2c23eafd230afc8d091ee7b7cedbb3afbedca0796f62e8cc47ca513d5981f8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron_dhcp_agent, managed_by=edpm_ansible, tcib_managed=true, container_name=neutron_dhcp_agent, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, config_id=neutron_dhcp, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-1f52e72ae9640a6a5ab74adc25dc50b25fb3048e41c16c538b8ad4759be27c31'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/openstack/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']})
Feb 23 09:35:25 np0005626463.localdomain podman[265479]: neutron_dhcp_agent
Feb 23 09:35:26 np0005626463.localdomain podman[265519]: error opening file `/run/crun/1b2c23eafd230afc8d091ee7b7cedbb3afbedca0796f62e8cc47ca513d5981f8/status`: No such file or directory
Feb 23 09:35:26 np0005626463.localdomain podman[265507]: 2026-02-23 09:35:26.086921406 +0000 UTC m=+0.068602839 container cleanup 1b2c23eafd230afc8d091ee7b7cedbb3afbedca0796f62e8cc47ca513d5981f8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron_dhcp_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-1f52e72ae9640a6a5ab74adc25dc50b25fb3048e41c16c538b8ad4759be27c31'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/openstack/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, container_name=neutron_dhcp_agent, config_id=neutron_dhcp, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2)
Feb 23 09:35:26 np0005626463.localdomain podman[265507]: neutron_dhcp_agent
Feb 23 09:35:26 np0005626463.localdomain systemd[1]: edpm_neutron_dhcp_agent.service: Deactivated successfully.
Feb 23 09:35:26 np0005626463.localdomain systemd[1]: Stopped neutron_dhcp_agent container.
Feb 23 09:35:26 np0005626463.localdomain systemd[1]: Starting neutron_dhcp_agent container...
Feb 23 09:35:26 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-e6f78b67cd40916655edbb88a4af3cf02793db12429e4ca7ee55e69e47b3b14c-merged.mount: Deactivated successfully.
Feb 23 09:35:26 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-1b2c23eafd230afc8d091ee7b7cedbb3afbedca0796f62e8cc47ca513d5981f8-userdata-shm.mount: Deactivated successfully.
Feb 23 09:35:26 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 09:35:26 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e6f78b67cd40916655edbb88a4af3cf02793db12429e4ca7ee55e69e47b3b14c/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff)
Feb 23 09:35:26 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e6f78b67cd40916655edbb88a4af3cf02793db12429e4ca7ee55e69e47b3b14c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 23 09:35:26 np0005626463.localdomain podman[265521]: 2026-02-23 09:35:26.24233993 +0000 UTC m=+0.121380589 container init 1b2c23eafd230afc8d091ee7b7cedbb3afbedca0796f62e8cc47ca513d5981f8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron_dhcp_agent, container_name=neutron_dhcp_agent, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-1f52e72ae9640a6a5ab74adc25dc50b25fb3048e41c16c538b8ad4759be27c31'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/openstack/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.43.0, managed_by=edpm_ansible, config_id=neutron_dhcp, org.label-schema.schema-version=1.0, tcib_managed=true)
Feb 23 09:35:26 np0005626463.localdomain podman[265521]: 2026-02-23 09:35:26.251096579 +0000 UTC m=+0.130137238 container start 1b2c23eafd230afc8d091ee7b7cedbb3afbedca0796f62e8cc47ca513d5981f8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron_dhcp_agent, config_id=neutron_dhcp, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=neutron_dhcp_agent, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-1f52e72ae9640a6a5ab74adc25dc50b25fb3048e41c16c538b8ad4759be27c31'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/openstack/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team)
Feb 23 09:35:26 np0005626463.localdomain podman[265521]: neutron_dhcp_agent
Feb 23 09:35:26 np0005626463.localdomain neutron_dhcp_agent[265537]: + sudo -E kolla_set_configs
Feb 23 09:35:26 np0005626463.localdomain systemd[1]: Started neutron_dhcp_agent container.
Feb 23 09:35:26 np0005626463.localdomain sudo[265473]: pam_unix(sudo:session): session closed for user root
Feb 23 09:35:26 np0005626463.localdomain neutron_dhcp_agent[265537]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Feb 23 09:35:26 np0005626463.localdomain neutron_dhcp_agent[265537]: INFO:__main__:Validating config file
Feb 23 09:35:26 np0005626463.localdomain neutron_dhcp_agent[265537]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Feb 23 09:35:26 np0005626463.localdomain neutron_dhcp_agent[265537]: INFO:__main__:Copying service configuration files
Feb 23 09:35:26 np0005626463.localdomain neutron_dhcp_agent[265537]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf
Feb 23 09:35:26 np0005626463.localdomain neutron_dhcp_agent[265537]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf
Feb 23 09:35:26 np0005626463.localdomain neutron_dhcp_agent[265537]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf
Feb 23 09:35:26 np0005626463.localdomain neutron_dhcp_agent[265537]: INFO:__main__:Writing out command to execute
Feb 23 09:35:26 np0005626463.localdomain neutron_dhcp_agent[265537]: INFO:__main__:Setting permission for /var/lib/neutron
Feb 23 09:35:26 np0005626463.localdomain neutron_dhcp_agent[265537]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts
Feb 23 09:35:26 np0005626463.localdomain neutron_dhcp_agent[265537]: INFO:__main__:Setting permission for /var/lib/neutron/.cache
Feb 23 09:35:26 np0005626463.localdomain neutron_dhcp_agent[265537]: INFO:__main__:Setting permission for /var/lib/neutron/external
Feb 23 09:35:26 np0005626463.localdomain neutron_dhcp_agent[265537]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy
Feb 23 09:35:26 np0005626463.localdomain neutron_dhcp_agent[265537]: INFO:__main__:Setting permission for /var/lib/neutron/ns-metadata-proxy
Feb 23 09:35:26 np0005626463.localdomain neutron_dhcp_agent[265537]: INFO:__main__:Setting permission for /var/lib/neutron/dhcp
Feb 23 09:35:26 np0005626463.localdomain neutron_dhcp_agent[265537]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper
Feb 23 09:35:26 np0005626463.localdomain neutron_dhcp_agent[265537]: INFO:__main__:Setting permission for /var/lib/neutron/metadata_proxy
Feb 23 09:35:26 np0005626463.localdomain neutron_dhcp_agent[265537]: INFO:__main__:Setting permission for /var/lib/neutron/dhcp_agent_haproxy_wrapper
Feb 23 09:35:26 np0005626463.localdomain neutron_dhcp_agent[265537]: INFO:__main__:Setting permission for /var/lib/neutron/dhcp_agent_dnsmasq_wrapper
Feb 23 09:35:26 np0005626463.localdomain neutron_dhcp_agent[265537]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill
Feb 23 09:35:26 np0005626463.localdomain neutron_dhcp_agent[265537]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/dnsmasq-kill
Feb 23 09:35:26 np0005626463.localdomain neutron_dhcp_agent[265537]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints
Feb 23 09:35:26 np0005626463.localdomain neutron_dhcp_agent[265537]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints/b9146dd2a0dc3e0bc3fee7bb1b53fa22a55af280b3a177d7a47b63f92e7ebd29
Feb 23 09:35:26 np0005626463.localdomain neutron_dhcp_agent[265537]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints/c4d69ddbf6f4a149b7e6d31d28f2dc1fe1c08d98a601f027e6d63209aefe8011
Feb 23 09:35:26 np0005626463.localdomain neutron_dhcp_agent[265537]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids
Feb 23 09:35:26 np0005626463.localdomain neutron_dhcp_agent[265537]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids/9da5b53d-3184-450f-9a5b-bdba1a6c9f6d.pid.haproxy
Feb 23 09:35:26 np0005626463.localdomain neutron_dhcp_agent[265537]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy/9da5b53d-3184-450f-9a5b-bdba1a6c9f6d.conf
Feb 23 09:35:26 np0005626463.localdomain neutron_dhcp_agent[265537]: ++ cat /run_command
Feb 23 09:35:26 np0005626463.localdomain neutron_dhcp_agent[265537]: + CMD=/usr/bin/neutron-dhcp-agent
Feb 23 09:35:26 np0005626463.localdomain neutron_dhcp_agent[265537]: + ARGS=
Feb 23 09:35:26 np0005626463.localdomain neutron_dhcp_agent[265537]: + sudo kolla_copy_cacerts
Feb 23 09:35:26 np0005626463.localdomain neutron_dhcp_agent[265537]: + [[ ! -n '' ]]
Feb 23 09:35:26 np0005626463.localdomain neutron_dhcp_agent[265537]: + . kolla_extend_start
Feb 23 09:35:26 np0005626463.localdomain neutron_dhcp_agent[265537]: Running command: '/usr/bin/neutron-dhcp-agent'
Feb 23 09:35:26 np0005626463.localdomain neutron_dhcp_agent[265537]: + echo 'Running command: '\''/usr/bin/neutron-dhcp-agent'\'''
Feb 23 09:35:26 np0005626463.localdomain neutron_dhcp_agent[265537]: + umask 0022
Feb 23 09:35:26 np0005626463.localdomain neutron_dhcp_agent[265537]: + exec /usr/bin/neutron-dhcp-agent
Feb 23 09:35:27 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24708 DF PROTO=TCP SPT=42036 DPT=9102 SEQ=2437173652 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF9E9C60000000001030307) 
Feb 23 09:35:27 np0005626463.localdomain sshd[258240]: pam_unix(sshd:session): session closed for user zuul
Feb 23 09:35:27 np0005626463.localdomain systemd-logind[759]: Session 58 logged out. Waiting for processes to exit.
Feb 23 09:35:27 np0005626463.localdomain systemd[1]: session-58.scope: Deactivated successfully.
Feb 23 09:35:27 np0005626463.localdomain systemd[1]: session-58.scope: Consumed 34.994s CPU time.
Feb 23 09:35:27 np0005626463.localdomain systemd-logind[759]: Removed session 58.
Feb 23 09:35:27 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:35:27.294 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:35:27 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:35:27.484 265541 INFO neutron.common.config [-] Logging enabled!
Feb 23 09:35:27 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:35:27.484 265541 INFO neutron.common.config [-] /usr/bin/neutron-dhcp-agent version 22.2.2.dev44
Feb 23 09:35:27 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:35:27.847 265541 INFO neutron.agent.dhcp.agent [-] Synchronizing state
Feb 23 09:35:28 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11441 DF PROTO=TCP SPT=54726 DPT=9102 SEQ=3837831284 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF9EE060000000001030307) 
Feb 23 09:35:28 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:35:28.443 265541 INFO neutron.agent.dhcp.agent [None req-99d25339-5e44-4afc-804a-4e9abec353a2 - - - - - -] All active networks have been fetched through RPC.
Feb 23 09:35:28 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:35:28.443 265541 INFO neutron.agent.dhcp.agent [None req-99d25339-5e44-4afc-804a-4e9abec353a2 - - - - - -] Synchronizing state complete
Feb 23 09:35:28 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:35:28.471 265541 INFO neutron.agent.dhcp.agent [None req-99d25339-5e44-4afc-804a-4e9abec353a2 - - - - - -] DHCP agent started
Feb 23 09:35:29 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.
Feb 23 09:35:29 np0005626463.localdomain podman[265570]: 2026-02-23 09:35:29.917968818 +0000 UTC m=+0.088281862 container health_status 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, name=ubi9/ubi-minimal, vcs-type=git, architecture=x86_64, com.redhat.component=ubi9-minimal-container, version=9.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-02-05T04:57:10Z, config_id=openstack_network_exporter, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.tags=minimal rhel9, org.opencontainers.image.created=2026-02-05T04:57:10Z, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, release=1770267347, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c)
Feb 23 09:35:29 np0005626463.localdomain podman[265570]: 2026-02-23 09:35:29.957319487 +0000 UTC m=+0.127632511 container exec_died 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, name=ubi9/ubi-minimal, version=9.7, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, architecture=x86_64, build-date=2026-02-05T04:57:10Z, org.opencontainers.image.created=2026-02-05T04:57:10Z, config_id=openstack_network_exporter, container_name=openstack_network_exporter, io.openshift.expose-services=, release=1770267347, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c)
Feb 23 09:35:29 np0005626463.localdomain systemd[1]: 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.service: Deactivated successfully.
Feb 23 09:35:30 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:35:30.488 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:35:31 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24709 DF PROTO=TCP SPT=42036 DPT=9102 SEQ=2437173652 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF9F9860000000001030307) 
Feb 23 09:35:32 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:35:32.325 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:35:34 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.
Feb 23 09:35:34 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.
Feb 23 09:35:34 np0005626463.localdomain systemd[1]: tmp-crun.1D7M7H.mount: Deactivated successfully.
Feb 23 09:35:34 np0005626463.localdomain podman[265592]: 2026-02-23 09:35:34.902285957 +0000 UTC m=+0.073554582 container health_status bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 23 09:35:34 np0005626463.localdomain podman[265592]: 2026-02-23 09:35:34.915385278 +0000 UTC m=+0.086653913 container exec_died bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Feb 23 09:35:34 np0005626463.localdomain systemd[1]: bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.service: Deactivated successfully.
Feb 23 09:35:34 np0005626463.localdomain podman[265591]: 2026-02-23 09:35:34.967916312 +0000 UTC m=+0.140088344 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, container_name=ovn_controller)
Feb 23 09:35:35 np0005626463.localdomain podman[265591]: 2026-02-23 09:35:35.004218207 +0000 UTC m=+0.176390239 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20260216, config_id=ovn_controller, managed_by=edpm_ansible)
Feb 23 09:35:35 np0005626463.localdomain systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully.
Feb 23 09:35:35 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:35:35.492 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:35:37 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:35:37.372 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:35:37 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.
Feb 23 09:35:37 np0005626463.localdomain podman[265639]: 2026-02-23 09:35:37.899766401 +0000 UTC m=+0.078017277 container health_status be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ceilometer_agent_compute, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, managed_by=edpm_ansible)
Feb 23 09:35:37 np0005626463.localdomain podman[265639]: 2026-02-23 09:35:37.911837632 +0000 UTC m=+0.090088508 container exec_died be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.build-date=20260216, tcib_managed=true, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Feb 23 09:35:37 np0005626463.localdomain systemd[1]: be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.service: Deactivated successfully.
Feb 23 09:35:39 np0005626463.localdomain podman[242954]: time="2026-02-23T09:35:39Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 23 09:35:39 np0005626463.localdomain podman[242954]: @ - - [23/Feb/2026:09:35:39 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 149683 "" "Go-http-client/1.1"
Feb 23 09:35:39 np0005626463.localdomain podman[242954]: @ - - [23/Feb/2026:09:35:39 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16791 "" "Go-http-client/1.1"
Feb 23 09:35:39 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24710 DF PROTO=TCP SPT=42036 DPT=9102 SEQ=2437173652 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BFA1A060000000001030307) 
Feb 23 09:35:40 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:35:40.495 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:35:42 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:35:42.419 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:35:43 np0005626463.localdomain openstack_network_exporter[245358]: ERROR   09:35:43 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 23 09:35:43 np0005626463.localdomain openstack_network_exporter[245358]: 
Feb 23 09:35:43 np0005626463.localdomain openstack_network_exporter[245358]: ERROR   09:35:43 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 23 09:35:43 np0005626463.localdomain openstack_network_exporter[245358]: 
Feb 23 09:35:43 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.
Feb 23 09:35:43 np0005626463.localdomain podman[265657]: 2026-02-23 09:35:43.904020519 +0000 UTC m=+0.081642468 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260216, managed_by=edpm_ansible, tcib_managed=true)
Feb 23 09:35:43 np0005626463.localdomain podman[265657]: 2026-02-23 09:35:43.937237355 +0000 UTC m=+0.114859264 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, container_name=ovn_metadata_agent, io.buildah.version=1.43.0, tcib_managed=true)
Feb 23 09:35:43 np0005626463.localdomain systemd[1]: 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully.
Feb 23 09:35:45 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:35:45.511 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:35:47 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:35:47.453 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:35:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:35:48.535 163572 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 09:35:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:35:48.536 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 09:35:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:35:48.537 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 09:35:49 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.
Feb 23 09:35:49 np0005626463.localdomain podman[265675]: 2026-02-23 09:35:49.902516802 +0000 UTC m=+0.078417299 container health_status da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 23 09:35:49 np0005626463.localdomain podman[265675]: 2026-02-23 09:35:49.91421779 +0000 UTC m=+0.090118287 container exec_died da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 23 09:35:49 np0005626463.localdomain systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Deactivated successfully.
Feb 23 09:35:50 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:35:50.536 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:35:52 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:35:52.507 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:35:54 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55599 DF PROTO=TCP SPT=54658 DPT=9102 SEQ=846702305 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BFA52F00000000001030307) 
Feb 23 09:35:54 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:35:54Z|00051|memory_trim|INFO|Detected inactivity (last active 30012 ms ago): trimming memory
Feb 23 09:35:54 np0005626463.localdomain sshd[265698]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:35:55 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55600 DF PROTO=TCP SPT=54658 DPT=9102 SEQ=846702305 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BFA57060000000001030307) 
Feb 23 09:35:55 np0005626463.localdomain sshd[265698]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:35:55 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:35:55.575 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:35:55 np0005626463.localdomain sudo[265700]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 09:35:55 np0005626463.localdomain sudo[265700]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:35:55 np0005626463.localdomain sudo[265700]: pam_unix(sudo:session): session closed for user root
Feb 23 09:35:55 np0005626463.localdomain sudo[265718]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 23 09:35:55 np0005626463.localdomain sudo[265718]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:35:55 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24711 DF PROTO=TCP SPT=42036 DPT=9102 SEQ=2437173652 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BFA5A060000000001030307) 
Feb 23 09:35:56 np0005626463.localdomain sudo[265718]: pam_unix(sudo:session): session closed for user root
Feb 23 09:35:57 np0005626463.localdomain sudo[265768]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 09:35:57 np0005626463.localdomain sudo[265768]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:35:57 np0005626463.localdomain sudo[265768]: pam_unix(sudo:session): session closed for user root
Feb 23 09:35:57 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55601 DF PROTO=TCP SPT=54658 DPT=9102 SEQ=846702305 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BFA5F060000000001030307) 
Feb 23 09:35:57 np0005626463.localdomain sshd[265786]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:35:57 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:35:57.542 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:35:57 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1613 DF PROTO=TCP SPT=45600 DPT=9102 SEQ=1071701713 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BFA62060000000001030307) 
Feb 23 09:35:57 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:35:57.987 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:35:57 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:35:57.987 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:35:57 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:35:57.988 231725 DEBUG nova.compute.manager [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 23 09:35:59 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:35:59.541 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:35:59 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:35:59.542 231725 DEBUG nova.compute.manager [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 23 09:35:59 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:35:59.542 231725 DEBUG nova.compute.manager [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 23 09:36:00 np0005626463.localdomain sshd[265786]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:36:00 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.
Feb 23 09:36:00 np0005626463.localdomain podman[265788]: 2026-02-23 09:36:00.196153001 +0000 UTC m=+0.088141496 container health_status 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, distribution-scope=public, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, build-date=2026-02-05T04:57:10Z, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1770267347, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., io.openshift.expose-services=, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-type=git, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9/ubi-minimal, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, org.opencontainers.image.created=2026-02-05T04:57:10Z, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, io.buildah.version=1.33.7)
Feb 23 09:36:00 np0005626463.localdomain podman[265788]: 2026-02-23 09:36:00.208678634 +0000 UTC m=+0.100667119 container exec_died 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9/ubi-minimal, distribution-scope=public, maintainer=Red Hat, Inc., version=9.7, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, container_name=openstack_network_exporter, build-date=2026-02-05T04:57:10Z, release=1770267347, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-02-05T04:57:10Z, managed_by=edpm_ansible, vcs-type=git)
Feb 23 09:36:00 np0005626463.localdomain systemd[1]: 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.service: Deactivated successfully.
Feb 23 09:36:00 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:36:00.365 231725 DEBUG oslo_concurrency.lockutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Acquiring lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 23 09:36:00 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:36:00.365 231725 DEBUG oslo_concurrency.lockutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Acquired lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 23 09:36:00 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:36:00.365 231725 DEBUG nova.network.neutron [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 23 09:36:00 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:36:00.366 231725 DEBUG nova.objects.instance [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c2a7d92b-952f-46a7-8a6a-3322a48fcf4b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 23 09:36:00 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:36:00.579 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:36:01 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55602 DF PROTO=TCP SPT=54658 DPT=9102 SEQ=846702305 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BFA6EC60000000001030307) 
Feb 23 09:36:01 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:36:01.594 231725 DEBUG nova.network.neutron [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Updating instance_info_cache with network_info: [{"id": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "address": "fa:16:3e:a0:9d:00", "network": {"id": "9da5b53d-3184-450f-9a5b-bdba1a6c9f6d", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "37b8098efb0d4ecc90b451a2db0e966f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa27e5011-20", "ovs_interfaceid": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 23 09:36:01 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:36:01.615 231725 DEBUG oslo_concurrency.lockutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Releasing lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 23 09:36:01 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:36:01.616 231725 DEBUG nova.compute.manager [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 23 09:36:01 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:36:01.616 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:36:01 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:36:01.617 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:36:01 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:36:01.617 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:36:02 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:36:02.578 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:36:02 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:36:02.612 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:36:02 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:36:02.612 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:36:03 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:36:03.540 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:36:05 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:36:05.541 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:36:05 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:36:05.562 231725 DEBUG oslo_concurrency.lockutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 09:36:05 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:36:05.563 231725 DEBUG oslo_concurrency.lockutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 09:36:05 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:36:05.563 231725 DEBUG oslo_concurrency.lockutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 09:36:05 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:36:05.563 231725 DEBUG nova.compute.resource_tracker [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Auditing locally available compute resources for np0005626463.localdomain (node: np0005626463.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 23 09:36:05 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:36:05.564 231725 DEBUG oslo_concurrency.processutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 09:36:05 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:36:05.625 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:36:05 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.
Feb 23 09:36:05 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.
Feb 23 09:36:05 np0005626463.localdomain podman[265828]: 2026-02-23 09:36:05.909665841 +0000 UTC m=+0.074131547 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 23 09:36:05 np0005626463.localdomain podman[265829]: 2026-02-23 09:36:05.967583813 +0000 UTC m=+0.128346616 container health_status bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Feb 23 09:36:05 np0005626463.localdomain podman[265828]: 2026-02-23 09:36:05.975378602 +0000 UTC m=+0.139844388 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, container_name=ovn_controller, config_id=ovn_controller, io.buildah.version=1.43.0, org.label-schema.build-date=20260216)
Feb 23 09:36:05 np0005626463.localdomain systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully.
Feb 23 09:36:06 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:36:06.029 231725 DEBUG oslo_concurrency.processutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.465s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 09:36:06 np0005626463.localdomain podman[265829]: 2026-02-23 09:36:06.030310021 +0000 UTC m=+0.191072824 container exec_died bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 23 09:36:06 np0005626463.localdomain systemd[1]: bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.service: Deactivated successfully.
Feb 23 09:36:06 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:36:06.095 231725 DEBUG nova.virt.libvirt.driver [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 23 09:36:06 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:36:06.096 231725 DEBUG nova.virt.libvirt.driver [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 23 09:36:06 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:36:06.317 231725 WARNING nova.virt.libvirt.driver [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 23 09:36:06 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:36:06.319 231725 DEBUG nova.compute.resource_tracker [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Hypervisor/Node resource view: name=np0005626463.localdomain free_ram=12160MB free_disk=41.83688735961914GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 23 09:36:06 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:36:06.319 231725 DEBUG oslo_concurrency.lockutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 09:36:06 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:36:06.320 231725 DEBUG oslo_concurrency.lockutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 09:36:06 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:36:06.402 231725 DEBUG nova.compute.resource_tracker [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Instance c2a7d92b-952f-46a7-8a6a-3322a48fcf4b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 23 09:36:06 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:36:06.403 231725 DEBUG nova.compute.resource_tracker [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 23 09:36:06 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:36:06.403 231725 DEBUG nova.compute.resource_tracker [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Final resource view: name=np0005626463.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 23 09:36:06 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:36:06.439 231725 DEBUG oslo_concurrency.processutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 09:36:06 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:36:06.892 231725 DEBUG oslo_concurrency.processutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.453s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 09:36:06 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:36:06.899 231725 DEBUG nova.compute.provider_tree [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Inventory has not changed in ProviderTree for provider: be63d86c-a403-4ec9-a515-07ea2962cb4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 23 09:36:06 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:36:06.917 231725 DEBUG nova.scheduler.client.report [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Inventory has not changed for provider be63d86c-a403-4ec9-a515-07ea2962cb4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 23 09:36:06 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:36:06.920 231725 DEBUG nova.compute.resource_tracker [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Compute_service record updated for np0005626463.localdomain:np0005626463.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 23 09:36:06 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:36:06.921 231725 DEBUG oslo_concurrency.lockutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.601s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 09:36:07 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:36:07.623 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:36:08 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.
Feb 23 09:36:08 np0005626463.localdomain podman[265900]: 2026-02-23 09:36:08.911638138 +0000 UTC m=+0.083903617 container health_status be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0)
Feb 23 09:36:08 np0005626463.localdomain podman[265900]: 2026-02-23 09:36:08.921334584 +0000 UTC m=+0.093600073 container exec_died be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.43.0)
Feb 23 09:36:08 np0005626463.localdomain systemd[1]: be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.service: Deactivated successfully.
Feb 23 09:36:09 np0005626463.localdomain podman[242954]: time="2026-02-23T09:36:09Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 23 09:36:09 np0005626463.localdomain podman[242954]: @ - - [23/Feb/2026:09:36:09 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 149683 "" "Go-http-client/1.1"
Feb 23 09:36:09 np0005626463.localdomain podman[242954]: @ - - [23/Feb/2026:09:36:09 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16792 "" "Go-http-client/1.1"
Feb 23 09:36:09 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55603 DF PROTO=TCP SPT=54658 DPT=9102 SEQ=846702305 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BFA90060000000001030307) 
Feb 23 09:36:10 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:36:10.657 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:36:12 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:36:12.665 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:36:13 np0005626463.localdomain openstack_network_exporter[245358]: ERROR   09:36:13 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 23 09:36:13 np0005626463.localdomain openstack_network_exporter[245358]: 
Feb 23 09:36:13 np0005626463.localdomain openstack_network_exporter[245358]: ERROR   09:36:13 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 23 09:36:13 np0005626463.localdomain openstack_network_exporter[245358]: 
Feb 23 09:36:14 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.
Feb 23 09:36:14 np0005626463.localdomain podman[265919]: 2026-02-23 09:36:14.906788232 +0000 UTC m=+0.082364570 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent)
Feb 23 09:36:14 np0005626463.localdomain podman[265919]: 2026-02-23 09:36:14.916240021 +0000 UTC m=+0.091816349 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Feb 23 09:36:14 np0005626463.localdomain systemd[1]: 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully.
Feb 23 09:36:15 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:36:15.704 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:36:17 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:36:17.705 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:36:17 np0005626463.localdomain sshd[265937]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:36:18 np0005626463.localdomain sshd[265937]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:36:19 np0005626463.localdomain sshd[265939]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:36:19 np0005626463.localdomain sshd[265939]: Accepted publickey for zuul from 192.168.122.30 port 47432 ssh2: RSA SHA256:/ShS2J5Dq7o9P59e/NmgQORSAcJOBwu46Huo03HBdB4
Feb 23 09:36:19 np0005626463.localdomain systemd-logind[759]: New session 59 of user zuul.
Feb 23 09:36:19 np0005626463.localdomain systemd[1]: Started Session 59 of User zuul.
Feb 23 09:36:19 np0005626463.localdomain sshd[265939]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Feb 23 09:36:20 np0005626463.localdomain python3.9[266050]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 23 09:36:20 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:36:20.745 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:36:20 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.
Feb 23 09:36:20 np0005626463.localdomain systemd[1]: tmp-crun.G2FYdZ.mount: Deactivated successfully.
Feb 23 09:36:20 np0005626463.localdomain podman[266055]: 2026-02-23 09:36:20.929018984 +0000 UTC m=+0.091867290 container health_status da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 23 09:36:20 np0005626463.localdomain podman[266055]: 2026-02-23 09:36:20.938230436 +0000 UTC m=+0.101078752 container exec_died da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 23 09:36:20 np0005626463.localdomain systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Deactivated successfully.
Feb 23 09:36:22 np0005626463.localdomain python3.9[266185]: ansible-ansible.builtin.service_facts Invoked
Feb 23 09:36:22 np0005626463.localdomain network[266202]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Feb 23 09:36:22 np0005626463.localdomain network[266203]: 'network-scripts' will be removed from distribution in near future.
Feb 23 09:36:22 np0005626463.localdomain network[266204]: It is advised to switch to 'NetworkManager' instead for network management.
Feb 23 09:36:22 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:36:22.755 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:36:23 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 09:36:24 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54901 DF PROTO=TCP SPT=45788 DPT=9102 SEQ=998281806 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BFAC8200000000001030307) 
Feb 23 09:36:25 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54902 DF PROTO=TCP SPT=45788 DPT=9102 SEQ=998281806 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BFACC460000000001030307) 
Feb 23 09:36:25 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:36:25.771 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:36:26 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55604 DF PROTO=TCP SPT=54658 DPT=9102 SEQ=846702305 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BFAD0060000000001030307) 
Feb 23 09:36:27 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54903 DF PROTO=TCP SPT=45788 DPT=9102 SEQ=998281806 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BFAD4460000000001030307) 
Feb 23 09:36:27 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:36:27.803 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:36:28 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24712 DF PROTO=TCP SPT=42036 DPT=9102 SEQ=2437173652 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BFAD8060000000001030307) 
Feb 23 09:36:28 np0005626463.localdomain sudo[266434]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-opvtzjikmtqoxqhbcsvxgjxvjykkudqf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839388.0062222-96-205466963969864/AnsiballZ_setup.py
Feb 23 09:36:28 np0005626463.localdomain sudo[266434]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:36:28 np0005626463.localdomain python3.9[266436]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 23 09:36:28 np0005626463.localdomain sudo[266434]: pam_unix(sudo:session): session closed for user root
Feb 23 09:36:29 np0005626463.localdomain sudo[266497]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yqprtitsrnihiuqlxxxdewlujeijdmni ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839388.0062222-96-205466963969864/AnsiballZ_dnf.py
Feb 23 09:36:29 np0005626463.localdomain sudo[266497]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:36:29 np0005626463.localdomain python3.9[266499]: ansible-ansible.legacy.dnf Invoked with name=['iscsi-initiator-utils'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 23 09:36:30 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:36:30.810 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:36:30 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.
Feb 23 09:36:30 np0005626463.localdomain podman[266502]: 2026-02-23 09:36:30.894038185 +0000 UTC m=+0.066485134 container health_status 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, release=1770267347, vendor=Red Hat, Inc., name=ubi9/ubi-minimal, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-02-05T04:57:10Z, io.openshift.tags=minimal rhel9, version=9.7, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, container_name=openstack_network_exporter, architecture=x86_64, org.opencontainers.image.created=2026-02-05T04:57:10Z, maintainer=Red Hat, Inc., managed_by=edpm_ansible, io.buildah.version=1.33.7, config_id=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public)
Feb 23 09:36:30 np0005626463.localdomain podman[266502]: 2026-02-23 09:36:30.906199847 +0000 UTC m=+0.078646826 container exec_died 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.buildah.version=1.33.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-02-05T04:57:10Z, build-date=2026-02-05T04:57:10Z, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=openstack_network_exporter, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.component=ubi9-minimal-container, distribution-scope=public, vcs-type=git, maintainer=Red Hat, Inc., managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., name=ubi9/ubi-minimal, io.openshift.tags=minimal rhel9, version=9.7, release=1770267347, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Feb 23 09:36:30 np0005626463.localdomain systemd[1]: 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.service: Deactivated successfully.
Feb 23 09:36:31 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54904 DF PROTO=TCP SPT=45788 DPT=9102 SEQ=998281806 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BFAE4060000000001030307) 
Feb 23 09:36:32 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:36:32.837 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:36:33 np0005626463.localdomain sudo[266497]: pam_unix(sudo:session): session closed for user root
Feb 23 09:36:33 np0005626463.localdomain sudo[266629]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mkpojphdprwhfyclifvzxfdvjpaxyejl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839393.284833-132-225670402326537/AnsiballZ_stat.py
Feb 23 09:36:33 np0005626463.localdomain sudo[266629]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:36:33 np0005626463.localdomain sshd[266632]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:36:33 np0005626463.localdomain python3.9[266631]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated/iscsid/etc/iscsi follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 23 09:36:33 np0005626463.localdomain sudo[266629]: pam_unix(sudo:session): session closed for user root
Feb 23 09:36:33 np0005626463.localdomain sshd[266651]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:36:34 np0005626463.localdomain sshd[266632]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:36:34 np0005626463.localdomain sudo[266742]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qoddxexmcdyjjgkxfqemlvsowcpggvti ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839394.1832175-162-105873923140360/AnsiballZ_command.py
Feb 23 09:36:34 np0005626463.localdomain sudo[266742]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:36:34 np0005626463.localdomain python3.9[266744]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/iscsi /var/lib/iscsi _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 09:36:34 np0005626463.localdomain sudo[266742]: pam_unix(sudo:session): session closed for user root
Feb 23 09:36:35 np0005626463.localdomain sudo[266854]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sbtbpvtgmkuusgmjvtwoesytzkdsupez ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839395.1193433-192-123674007836915/AnsiballZ_stat.py
Feb 23 09:36:35 np0005626463.localdomain sudo[266854]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:36:35 np0005626463.localdomain python3.9[266856]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/.initiator_reset follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 23 09:36:35 np0005626463.localdomain sudo[266854]: pam_unix(sudo:session): session closed for user root
Feb 23 09:36:35 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:36:35.844 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:36:36 np0005626463.localdomain sshd[266651]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:36:36 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.
Feb 23 09:36:36 np0005626463.localdomain systemd[1]: tmp-crun.9azg4N.mount: Deactivated successfully.
Feb 23 09:36:36 np0005626463.localdomain podman[266896]: 2026-02-23 09:36:36.125224225 +0000 UTC m=+0.083772343 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, managed_by=edpm_ansible, org.label-schema.build-date=20260216, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0)
Feb 23 09:36:36 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.
Feb 23 09:36:36 np0005626463.localdomain podman[266896]: 2026-02-23 09:36:36.187021295 +0000 UTC m=+0.145569383 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.43.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20260216, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 23 09:36:36 np0005626463.localdomain systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully.
Feb 23 09:36:36 np0005626463.localdomain podman[266932]: 2026-02-23 09:36:36.273279173 +0000 UTC m=+0.127890202 container health_status bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Feb 23 09:36:36 np0005626463.localdomain podman[266932]: 2026-02-23 09:36:36.306939102 +0000 UTC m=+0.161550111 container exec_died bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 23 09:36:36 np0005626463.localdomain systemd[1]: bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.service: Deactivated successfully.
Feb 23 09:36:36 np0005626463.localdomain sudo[267012]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nigbxwlaqkhsfumngjgtbcpgmrzsivww ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839396.0126865-225-245561460933278/AnsiballZ_lineinfile.py
Feb 23 09:36:36 np0005626463.localdomain sudo[267012]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:36:36 np0005626463.localdomain python3.9[267014]: ansible-ansible.builtin.lineinfile Invoked with insertafter=^#node.session.auth.chap.algs line=node.session.auth.chap_algs = SHA3-256,SHA256,SHA1,MD5 path=/etc/iscsi/iscsid.conf regexp=^node.session.auth.chap_algs state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:36:36 np0005626463.localdomain sudo[267012]: pam_unix(sudo:session): session closed for user root
Feb 23 09:36:37 np0005626463.localdomain sudo[267122]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dggitnrkfopbzruesykounthcgoatgut ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839396.9254878-252-956185119876/AnsiballZ_systemd_service.py
Feb 23 09:36:37 np0005626463.localdomain sudo[267122]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:36:37 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:36:37.896 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:36:37 np0005626463.localdomain python3.9[267124]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 09:36:38 np0005626463.localdomain sudo[267122]: pam_unix(sudo:session): session closed for user root
Feb 23 09:36:38 np0005626463.localdomain sudo[267234]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ecibdqwvwfxdaytbnuugfvjjcjjnmetq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839398.2397363-276-263957615967827/AnsiballZ_systemd_service.py
Feb 23 09:36:38 np0005626463.localdomain sudo[267234]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:36:38 np0005626463.localdomain python3.9[267236]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 09:36:38 np0005626463.localdomain sudo[267234]: pam_unix(sudo:session): session closed for user root
Feb 23 09:36:39 np0005626463.localdomain podman[242954]: time="2026-02-23T09:36:39Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 23 09:36:39 np0005626463.localdomain podman[242954]: @ - - [23/Feb/2026:09:36:39 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 149683 "" "Go-http-client/1.1"
Feb 23 09:36:39 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54905 DF PROTO=TCP SPT=45788 DPT=9102 SEQ=998281806 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BFB04070000000001030307) 
Feb 23 09:36:39 np0005626463.localdomain podman[242954]: @ - - [23/Feb/2026:09:36:39 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16788 "" "Go-http-client/1.1"
Feb 23 09:36:39 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.
Feb 23 09:36:39 np0005626463.localdomain podman[267347]: 2026-02-23 09:36:39.908818894 +0000 UTC m=+0.080216484 container health_status be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ceilometer_agent_compute, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ceilometer_agent_compute)
Feb 23 09:36:39 np0005626463.localdomain podman[267347]: 2026-02-23 09:36:39.916172099 +0000 UTC m=+0.087569739 container exec_died be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Feb 23 09:36:39 np0005626463.localdomain systemd[1]: be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.service: Deactivated successfully.
Feb 23 09:36:40 np0005626463.localdomain python3.9[267346]: ansible-ansible.builtin.service_facts Invoked
Feb 23 09:36:40 np0005626463.localdomain network[267383]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Feb 23 09:36:40 np0005626463.localdomain network[267384]: 'network-scripts' will be removed from distribution in near future.
Feb 23 09:36:40 np0005626463.localdomain network[267385]: It is advised to switch to 'NetworkManager' instead for network management.
Feb 23 09:36:40 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:36:40.885 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:36:42 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 09:36:42 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:36:42.899 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:36:43 np0005626463.localdomain openstack_network_exporter[245358]: ERROR   09:36:43 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 23 09:36:43 np0005626463.localdomain openstack_network_exporter[245358]: 
Feb 23 09:36:43 np0005626463.localdomain openstack_network_exporter[245358]: ERROR   09:36:43 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 23 09:36:43 np0005626463.localdomain openstack_network_exporter[245358]: 
Feb 23 09:36:45 np0005626463.localdomain sudo[267615]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zgsstkycqgrikiyxuoxjosigfxtlhpil ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839405.396236-345-257891390581097/AnsiballZ_dnf.py
Feb 23 09:36:45 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.
Feb 23 09:36:45 np0005626463.localdomain sudo[267615]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:36:45 np0005626463.localdomain podman[267617]: 2026-02-23 09:36:45.802415741 +0000 UTC m=+0.081099612 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20260216, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Feb 23 09:36:45 np0005626463.localdomain podman[267617]: 2026-02-23 09:36:45.807834365 +0000 UTC m=+0.086518236 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 23 09:36:45 np0005626463.localdomain systemd[1]: 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully.
Feb 23 09:36:45 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:36:45.921 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:36:45 np0005626463.localdomain python3.9[267618]: ansible-ansible.legacy.dnf Invoked with name=['device-mapper-multipath'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 23 09:36:47 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:36:47.938 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:36:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:36:48.537 163572 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 09:36:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:36:48.537 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 09:36:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:36:48.539 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 09:36:49 np0005626463.localdomain sudo[267615]: pam_unix(sudo:session): session closed for user root
Feb 23 09:36:49 np0005626463.localdomain sudo[267746]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aagkacusmqxocvwsvxhrgxjeqmoqhsfl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839409.3613052-372-15843418903574/AnsiballZ_file.py
Feb 23 09:36:49 np0005626463.localdomain sudo[267746]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:36:50 np0005626463.localdomain python3.9[267748]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Feb 23 09:36:50 np0005626463.localdomain sudo[267746]: pam_unix(sudo:session): session closed for user root
Feb 23 09:36:50 np0005626463.localdomain sudo[267856]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zwvmjpwkyuisoswysherlaxecbqrlaeq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839410.241934-396-16442609454726/AnsiballZ_modprobe.py
Feb 23 09:36:50 np0005626463.localdomain sudo[267856]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:36:50 np0005626463.localdomain python3.9[267858]: ansible-community.general.modprobe Invoked with name=dm-multipath state=present params= persistent=disabled
Feb 23 09:36:50 np0005626463.localdomain sudo[267856]: pam_unix(sudo:session): session closed for user root
Feb 23 09:36:50 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:36:50.954 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:36:51 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.
Feb 23 09:36:51 np0005626463.localdomain podman[267876]: 2026-02-23 09:36:51.908509876 +0000 UTC m=+0.083287389 container health_status da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 23 09:36:51 np0005626463.localdomain podman[267876]: 2026-02-23 09:36:51.945217748 +0000 UTC m=+0.119995251 container exec_died da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 23 09:36:51 np0005626463.localdomain systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Deactivated successfully.
Feb 23 09:36:52 np0005626463.localdomain sudo[267987]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zjjorwvkssjyjzgqsodyzqdbngmjhbrq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839412.1188815-420-165810918318039/AnsiballZ_stat.py
Feb 23 09:36:52 np0005626463.localdomain sudo[267987]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:36:52 np0005626463.localdomain python3.9[267989]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/dm-multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:36:52 np0005626463.localdomain sudo[267987]: pam_unix(sudo:session): session closed for user root
Feb 23 09:36:52 np0005626463.localdomain sudo[268044]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bpxftvgfxlxnvakaoptvdljqtlbiztqx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839412.1188815-420-165810918318039/AnsiballZ_file.py
Feb 23 09:36:52 np0005626463.localdomain sudo[268044]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:36:52 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:36:52.980 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:36:52 np0005626463.localdomain python3.9[268046]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/etc/modules-load.d/dm-multipath.conf _original_basename=module-load.conf.j2 recurse=False state=file path=/etc/modules-load.d/dm-multipath.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:36:53 np0005626463.localdomain sudo[268044]: pam_unix(sudo:session): session closed for user root
Feb 23 09:36:53 np0005626463.localdomain sudo[268154]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vznsnrvnxifdosopdumfcgcjiogkwqxh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839413.376559-459-107140990976026/AnsiballZ_lineinfile.py
Feb 23 09:36:53 np0005626463.localdomain sudo[268154]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:36:53 np0005626463.localdomain python3.9[268156]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=dm-multipath  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:36:53 np0005626463.localdomain sudo[268154]: pam_unix(sudo:session): session closed for user root
Feb 23 09:36:54 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30275 DF PROTO=TCP SPT=45566 DPT=9102 SEQ=244674139 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BFB3D500000000001030307) 
Feb 23 09:36:54 np0005626463.localdomain sudo[268264]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ekrbezzfrfqhaemjcsxdmbmqcuzugidq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839414.5362597-486-130663130661330/AnsiballZ_command.py
Feb 23 09:36:54 np0005626463.localdomain sudo[268264]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:36:55 np0005626463.localdomain python3.9[268266]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/multipath _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 09:36:55 np0005626463.localdomain sudo[268264]: pam_unix(sudo:session): session closed for user root
Feb 23 09:36:55 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30276 DF PROTO=TCP SPT=45566 DPT=9102 SEQ=244674139 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BFB41460000000001030307) 
Feb 23 09:36:55 np0005626463.localdomain sudo[268375]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jskrlgujjmnjsgteurgaakfffqmkcxgh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839415.2074938-510-247425909728555/AnsiballZ_command.py
Feb 23 09:36:55 np0005626463.localdomain sudo[268375]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:36:55 np0005626463.localdomain python3.9[268377]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -rF /etc/multipath _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 09:36:55 np0005626463.localdomain sudo[268375]: pam_unix(sudo:session): session closed for user root
Feb 23 09:36:55 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54906 DF PROTO=TCP SPT=45788 DPT=9102 SEQ=998281806 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BFB44090000000001030307) 
Feb 23 09:36:56 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:36:55.999 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.130 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'name': 'test', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000003', 'OS-EXT-SRV-ATTR:host': 'np0005626463.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '37b8098efb0d4ecc90b451a2db0e966f', 'user_id': 'cb6895487918456aa599ca2f76872d00', 'hostId': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.131 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.135 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.136 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1fe22e74-0508-42eb-ad8e-0a07554970cd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:36:56.131564', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '317ba12c-109b-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10856.321025666, 'message_signature': 'fb630a78feef953a4ba356e4d7faf26d2010e253846eaadba2358a5b7506847a'}]}, 'timestamp': '2026-02-23 09:36:56.135927', '_unique_id': 'cc23c1c32f82485bb69337d6a2196cb6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.136 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.136 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.136 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.136 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.136 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.136 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.136 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.136 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.136 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.136 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.136 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.136 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.136 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.136 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.136 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.136 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.136 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.136 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.136 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.136 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.136 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.136 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.136 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.136 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.136 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.136 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.136 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.136 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.136 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.136 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.136 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.137 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.156 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/cpu volume: 56720000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.157 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f0d8ff29-4b95-4d8b-87cf-66da369dde2b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 56720000000, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'timestamp': '2026-02-23T09:36:56.137917', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '317ec47e-109b-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10856.345447972, 'message_signature': '5ac2c43f68c02900123c6c629c3d47e258cb1182feed293c338325cb9a94dec1'}]}, 'timestamp': '2026-02-23 09:36:56.156501', '_unique_id': 'bb27c4c44556485e92f3004f08973ef0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.157 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.157 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.157 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.157 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.157 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.157 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.157 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.157 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.157 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.157 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.157 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.157 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.157 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.157 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.157 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.157 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.157 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.157 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.157 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.157 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.157 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.157 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.157 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.157 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.157 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.157 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.157 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.157 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.157 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.157 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.157 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.158 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.158 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/memory.usage volume: 52.38671875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.159 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '226a184a-41d4-48fb-bd46-3cc026909eee', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 52.38671875, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'timestamp': '2026-02-23T09:36:56.158179', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '317f14ec-109b-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10856.345447972, 'message_signature': '173dcc04f6638a6fccb6cd4c2fad42b79757ae87e31d04b20681ef12f9902439'}]}, 'timestamp': '2026-02-23 09:36:56.158476', '_unique_id': '97a6b95d8dde455482fd8dc4b7632f98'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.159 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.159 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.159 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.159 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.159 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.159 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.159 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.159 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.159 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.159 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.159 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.159 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.159 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.159 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.159 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.159 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.159 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.159 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.159 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.159 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.159 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.159 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.159 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.159 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.159 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.159 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.159 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.159 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.159 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.159 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.159 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.159 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.159 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.bytes volume: 9662 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.160 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a6efae7f-da51-4507-b947-20c1e3cea2a6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9662, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:36:56.159857', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '317f5808-109b-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10856.321025666, 'message_signature': 'a24695fb0c128c648c73ede0e5f93cc9a482eecc3041589485f1a9667915cb2e'}]}, 'timestamp': '2026-02-23 09:36:56.160206', '_unique_id': '43cc0b16e18d4c54b383ce44a63a2b22'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.160 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.160 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.160 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.160 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.160 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.160 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.160 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.160 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.160 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.160 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.160 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.160 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.160 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.160 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.160 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.160 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.160 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.160 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.160 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.160 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.160 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.160 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.160 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.160 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.160 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.160 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.160 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.160 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.160 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.160 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.160 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.161 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.173 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.173 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.174 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ac28fa4a-32c7-4470-9fce-197f36a89acc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:36:56.161572', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '31815e28-109b-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10856.351031403, 'message_signature': '0280e6b6ac3c35500924e80018f526705465bb67a664841269297b7c02fe7f35'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 
'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:36:56.161572', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '31816b02-109b-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10856.351031403, 'message_signature': '0a6443ccc114e2e682604b91a51c8941113946a941ff1d412e832a6b4e9cf4fc'}]}, 'timestamp': '2026-02-23 09:36:56.173780', '_unique_id': '7c068d6c3496411ba6adcc155fa47a9d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.174 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.174 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.174 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.174 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.174 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.174 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.174 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.174 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.174 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.174 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.174 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.174 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.174 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.174 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.174 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.174 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.174 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.174 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.174 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.174 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.174 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.174 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.174 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.174 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.174 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.174 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.174 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.174 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.174 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.174 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.174 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.175 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.175 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.175 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.allocation volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.176 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '21c74457-8456-4de2-a053-8b26bb1b166d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:36:56.175280', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '3181b094-109b-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10856.351031403, 'message_signature': '43975f29dafb21da3f6d86ec1bb9f329dd47467025c160e290851478384c3215'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 
'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:36:56.175280', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '3181baf8-109b-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10856.351031403, 'message_signature': '2ffecd826bd9a56fde96f6bd09282b7ef260e6e6d78374221ae94c6b595af5c1'}]}, 'timestamp': '2026-02-23 09:36:56.175820', '_unique_id': '26c08b6162f94b83b89d0795d19bad6c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.176 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.176 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.176 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.176 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.176 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.176 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.176 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.176 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.176 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.176 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.176 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.176 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.176 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.176 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.176 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.176 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.176 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.176 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.176 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.176 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.176 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.176 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.176 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.176 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.176 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.176 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.176 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.176 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.176 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.176 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.176 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.177 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.219 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.requests volume: 1064 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.219 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.requests volume: 222 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.221 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2922c7ca-f3b1-4b92-902a-e8b2bef52a63', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1064, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:36:56.177411', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '31886ba0-109b-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10856.366874777, 'message_signature': 'b0fe6b8e261325fe0dd84e42baadffda3914e1845ee0f3e19d9047112205141a'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 222, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 
'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:36:56.177411', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '318884d2-109b-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10856.366874777, 'message_signature': '047027b5330487789cd3e21e63927ed84252428d9dc8a4039bb0142e7c792baf'}]}, 'timestamp': '2026-02-23 09:36:56.220444', '_unique_id': '95ba58919d08417fb41a5abcd7a44cf6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.221 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.221 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.221 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.221 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.221 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.221 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.221 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.221 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.221 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.221 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.221 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.221 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.221 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.221 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.221 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.221 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.221 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.221 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.221 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.221 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.221 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.221 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.221 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.221 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.221 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.221 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.221 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.221 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.221 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.221 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.221 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.223 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.223 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.224 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.latency volume: 260974500 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.224 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.latency volume: 24478467 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.226 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'df77c528-cfce-4d25-aae7-5f118ad85d49', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 260974500, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:36:56.224035', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '318929be-109b-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10856.366874777, 'message_signature': '75c6c27252c919b88f04cd25541b0239b341589566f50bc2f8cc81edb428d38e'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 24478467, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 
'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:36:56.224035', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '31894110-109b-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10856.366874777, 'message_signature': '7e6a761191f0965ece0c0a9167eeaf8030c8af5a0bffb36758732f1a98f041c8'}]}, 'timestamp': '2026-02-23 09:36:56.225246', '_unique_id': '601809edf277496ca4f8c4105d881e70'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.226 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.226 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.226 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.226 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.226 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.226 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.226 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.226 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.226 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.226 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.226 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.226 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.226 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.226 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.226 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.226 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.226 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.226 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.226 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.226 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.226 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.226 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.226 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.226 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.226 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.226 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.226 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.226 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.226 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.226 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.226 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.227 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.227 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.229 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7902fe61-2462-4e0c-bc50-8297523882dd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:36:56.227924', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '3189c1bc-109b-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10856.321025666, 'message_signature': '9b606888046d2a0c9b3c25c5c13c26650dbcb213fd88e79aafde48a8f01c54e9'}]}, 'timestamp': '2026-02-23 09:36:56.228585', '_unique_id': '60278f4e1bbc48878851076353cccb59'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.229 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.229 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.229 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.229 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.229 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.229 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.229 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.229 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.229 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.229 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.229 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.229 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.229 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.229 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.229 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.229 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.229 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.229 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.229 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.229 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.229 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.229 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.229 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.229 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.229 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.229 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.229 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.229 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.229 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.229 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.229 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.230 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.231 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.231 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.latency volume: 1234377028 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.232 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.latency volume: 170393160 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.234 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4f54dd6f-5291-43d8-8ddf-053f326b2ea5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1234377028, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:36:56.231438', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '318a4a38-109b-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10856.366874777, 'message_signature': '36c7c79255d5c542c51026ead07a870f932a623c9086e91cf86a7a581d0413f5'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 170393160, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 
'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:36:56.231438', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '318a7080-109b-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10856.366874777, 'message_signature': 'e05eb23970af9eb18e69e0bc8f07600b598952bb8a6f6bda9c4fd5b16cca6f56'}]}, 'timestamp': '2026-02-23 09:36:56.233064', '_unique_id': 'fc14d2b0fc144d3e94038d1486d30876'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.234 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.234 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.234 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.234 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.234 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.234 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.234 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.234 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.234 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.234 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.234 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.234 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.234 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.234 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.234 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.234 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.234 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.234 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.234 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.234 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.234 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.234 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.234 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.234 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.234 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.234 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.234 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.234 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.234 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.234 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.234 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.235 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.235 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.237 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a7c3fc40-e31c-411f-b12e-6387c6a42168', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:36:56.235627', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '318aeef2-109b-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10856.321025666, 'message_signature': '6cc4920a3fe26ff32b15b6f63d26912a6342f392641ed377f80b504d886c024c'}]}, 'timestamp': '2026-02-23 09:36:56.236298', '_unique_id': '47b569c1f10b4206aca46079ef81fe4d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.237 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.237 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.237 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.237 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.237 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.237 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.237 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.237 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.237 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.237 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.237 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.237 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.237 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.237 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.237 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.237 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.237 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.237 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.237 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.237 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.237 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.237 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.237 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.237 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.237 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.237 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.237 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.237 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.237 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.237 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.237 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.238 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.238 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.bytes volume: 29130240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.239 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.bytes volume: 4300800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.241 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fa262652-8e53-467e-bef3-0ce2f344ded9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29130240, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:36:56.238812', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '318b6b98-109b-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10856.366874777, 'message_signature': '3c74dc0837d2938b893d1a8e104f2371e4407c2ffce6532efeb565f3feee78f1'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4300800, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 
'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:36:56.238812', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '318b824a-109b-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10856.366874777, 'message_signature': 'cd783402eb565a80a5d0b47d1adfe563fd6672fab3946e055d0bbd3c846a9e30'}]}, 'timestamp': '2026-02-23 09:36:56.240068', '_unique_id': 'bad394cd48bd4a51932cb0a6a3163507'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.241 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.241 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.241 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.241 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.241 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.241 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.241 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.241 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.241 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.241 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.241 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.241 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.241 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.241 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.241 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.241 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.241 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.241 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.241 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.241 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.241 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.241 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.241 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.241 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.241 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.241 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.241 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.241 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.241 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.241 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.241 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.242 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.242 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.242 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.242 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.243 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c63b1f37-3016-443d-ba41-0649b0e00eea', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:36:56.242344', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '318bee56-109b-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10856.351031403, 'message_signature': '9428253c37a852e4953ef05e8fc314533deb1ddc3d06f019b664cd6e2a7180bc'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 
'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:36:56.242344', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '318bf9dc-109b-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10856.351031403, 'message_signature': 'be27f77adda33f5d3ccaf9502ad0bea85a595e78993caa8b0e993a6cc69c28e8'}]}, 'timestamp': '2026-02-23 09:36:56.243009', '_unique_id': '4f829810c73d4490ad68e333a2d4b6f7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.243 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.243 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.243 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.243 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.243 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.243 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.243 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.243 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.243 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.243 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.243 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.243 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.243 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.243 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.243 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.243 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.243 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.243 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.243 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.243 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.243 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.243 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.243 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.243 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.243 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.243 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.243 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.243 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.243 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.243 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.243 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.244 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.244 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.245 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '432e7c8f-6c13-48c9-91c8-e804cbc3a987', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:36:56.244460', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '318c4130-109b-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10856.321025666, 'message_signature': '83ada7970a2993bf8649c5223ab2450db320ec59982c63b4c9fe678db2f94597'}]}, 'timestamp': '2026-02-23 09:36:56.244838', '_unique_id': '8d8e1f9c01ef4f5fad00be085cac3a2b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.245 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.245 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.245 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.245 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.245 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.245 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.245 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.245 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.245 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.245 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.245 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.245 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.245 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.245 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.245 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.245 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.245 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.245 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.245 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.245 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.245 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.245 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.245 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.245 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.245 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.245 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.245 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.245 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.245 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.245 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.245 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.246 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.246 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.bytes volume: 12784 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.247 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f9de104e-cc4e-43b1-8dfb-940af9885796', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 12784, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:36:56.246398', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '318c8ce4-109b-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10856.321025666, 'message_signature': 'df4e76b5a0ab4d2ecea552d432a9586d17c6608c6e6100230fe37ef37cdb10b0'}]}, 'timestamp': '2026-02-23 09:36:56.246789', '_unique_id': '9671f14319f94c6588f94ff7fbabfe34'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.247 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.247 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.247 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.247 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.247 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.247 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.247 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.247 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.247 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.247 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.247 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.247 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.247 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.247 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.247 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.247 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.247 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.247 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.247 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.247 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.247 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.247 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.247 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.247 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.247 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.247 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.247 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.247 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.247 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.247 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.247 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.248 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.248 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.249 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd32bf8b6-5923-4bf5-92ae-072e47e52b3d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:36:56.248420', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '318cdbf4-109b-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10856.321025666, 'message_signature': '0c90fb3dd1279ddb3cd12e92550cc34ece2200f4ed195722bcfcb097895f225c'}]}, 'timestamp': '2026-02-23 09:36:56.248840', '_unique_id': 'fa95ed79b77448368e76daf190e8049e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.249 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.249 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.249 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.249 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.249 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.249 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.249 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.249 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.249 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.249 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.249 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.249 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.249 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.249 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.249 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.249 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.249 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.249 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.249 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.249 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.249 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.249 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.249 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.249 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.249 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.249 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.249 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.249 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.249 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.249 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.249 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.250 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.250 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.requests volume: 577 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.251 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.252 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5f86618d-3f1a-41cb-b28d-a43dd677034d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 577, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:36:56.250663', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '318d34aa-109b-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10856.366874777, 'message_signature': 'ffcd6335e1717779e745919d695d92f070a69fcea0f840818412cb22b74af5b5'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:36:56.250663', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '318d4382-109b-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10856.366874777, 'message_signature': '1e0370029bdad6b1fe409f1f7c7541232c088fe4c6b4344150a929911d01aa47'}]}, 'timestamp': '2026-02-23 09:36:56.251462', '_unique_id': 'c0488c43a37f484788e9518f1d3fac46'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.252 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.252 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.252 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.252 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.252 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.252 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.252 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.252 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.252 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.252 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.252 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.252 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.252 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.252 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.252 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.252 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.252 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.252 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.252 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.252 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.252 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.252 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.252 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.252 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.252 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.252 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.252 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.252 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.252 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.252 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.252 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.253 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.253 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.packets volume: 92 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.254 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0994a540-78b4-4256-96e2-cb73d3f75f9c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 92, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:36:56.253333', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '318d9ce2-109b-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10856.321025666, 'message_signature': '40cb207035d8fa98d3f89ad61957ddff35c0d463c7c97e675efd6b9101d0e819'}]}, 'timestamp': '2026-02-23 09:36:56.253779', '_unique_id': '29ce9f392f7c4bc584ff07ba7e43d5b3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.254 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.254 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.254 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.254 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.254 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.254 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.254 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.254 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.254 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.254 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.254 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.254 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.254 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.254 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.254 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.254 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.254 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.254 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.254 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.254 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.254 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.254 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.254 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.254 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.254 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.254 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.254 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.254 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.254 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.254 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.254 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.255 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.255 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.packets volume: 145 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.256 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ac8668db-7d11-425a-a30e-328e24837146', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 145, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:36:56.255599', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '318df44e-109b-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10856.321025666, 'message_signature': '75eff5a9e9d8945a7261784ccf0e373a4044123ca66ef40822fa00e35010759f'}]}, 'timestamp': '2026-02-23 09:36:56.256047', '_unique_id': '48cd00eb5a9143e9a0a0d0ffe12a6aba'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.256 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.256 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.256 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.256 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.256 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.256 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.256 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.256 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.256 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.256 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.256 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.256 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.256 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.256 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.256 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.256 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.256 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.256 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.256 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.256 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.256 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.256 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.256 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.256 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.256 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.256 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.256 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.256 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.256 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.256 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.256 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.257 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.257 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.bytes volume: 74063872 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.258 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.259 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cef4dceb-c1f4-4b82-855b-0c8f47b1bc66', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 74063872, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:36:56.257829', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '318e4c8c-109b-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10856.366874777, 'message_signature': '85a65c02cdf63765e8ae780a0d8cd0862a94c1a970f7614fd14c024852459599'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 
'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:36:56.257829', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '318e5b0a-109b-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10856.366874777, 'message_signature': '034a3935a7f6a3f0d62760107d3cc0aafab8f67d82244b0075e5f5cfabd582c9'}]}, 'timestamp': '2026-02-23 09:36:56.258623', '_unique_id': '4dc6e4a167a347708528b5cf671f819f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.259 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.259 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.259 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.259 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.259 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.259 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.259 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.259 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.259 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.259 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.259 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.259 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.259 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.259 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.259 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.259 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.259 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.259 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.259 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.259 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.259 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.259 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.259 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.259 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.259 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.259 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.259 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.259 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.259 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.259 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.259 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.260 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.260 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.260 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.bytes.delta volume: 446 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.261 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '41573a41-d2ee-4cad-8844-c41dbfe0b498', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 446, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:36:56.260526', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '318eb4ce-109b-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10856.321025666, 'message_signature': '37c97b33f8c6a2f06be72f011c28294474f7ce7afa4e3cbc5b518c4893eb1c91'}]}, 'timestamp': '2026-02-23 09:36:56.260968', '_unique_id': 'f70031d580944536b5d9b90ec0b164db'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.261 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.261 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.261 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.261 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.261 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.261 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.261 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.261 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.261 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.261 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.261 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.261 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.261 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.261 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.261 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.261 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.261 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.261 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.261 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.261 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.261 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.261 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.261 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.261 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.261 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.261 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.261 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.261 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.261 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.261 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:36:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.261 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:36:56 np0005626463.localdomain sudo[268486]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jicwlmsyrqzyygqiolaqyeyqsbecwsco ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839416.0327952-537-54865265794609/AnsiballZ_stat.py
Feb 23 09:36:56 np0005626463.localdomain sudo[268486]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:36:56 np0005626463.localdomain python3.9[268488]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 23 09:36:56 np0005626463.localdomain sudo[268486]: pam_unix(sudo:session): session closed for user root
Feb 23 09:36:57 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30277 DF PROTO=TCP SPT=45566 DPT=9102 SEQ=244674139 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BFB49470000000001030307) 
Feb 23 09:36:57 np0005626463.localdomain sudo[268491]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 09:36:57 np0005626463.localdomain sudo[268491]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:36:57 np0005626463.localdomain sudo[268491]: pam_unix(sudo:session): session closed for user root
Feb 23 09:36:57 np0005626463.localdomain sudo[268512]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 23 09:36:57 np0005626463.localdomain sudo[268512]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:36:57 np0005626463.localdomain sudo[268653]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qbbtpawneqvnzaoaqlozvrcebmtjmtsf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839417.5668926-567-212567737447121/AnsiballZ_command.py
Feb 23 09:36:57 np0005626463.localdomain sudo[268653]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:36:57 np0005626463.localdomain sudo[268512]: pam_unix(sudo:session): session closed for user root
Feb 23 09:36:58 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:36:58.022 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:36:58 np0005626463.localdomain python3.9[268656]: ansible-ansible.legacy.command Invoked with _raw_params=grep -q '^blacklist\s*{' /etc/multipath.conf _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 09:36:58 np0005626463.localdomain sudo[268653]: pam_unix(sudo:session): session closed for user root
Feb 23 09:36:58 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55605 DF PROTO=TCP SPT=54658 DPT=9102 SEQ=846702305 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BFB4E060000000001030307) 
Feb 23 09:36:58 np0005626463.localdomain sudo[268726]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 09:36:58 np0005626463.localdomain sudo[268726]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:36:58 np0005626463.localdomain sudo[268726]: pam_unix(sudo:session): session closed for user root
Feb 23 09:36:59 np0005626463.localdomain sudo[268796]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vvgiefgihgfreblohjpryyeaysfefhkv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839418.4154673-597-106123439474966/AnsiballZ_replace.py
Feb 23 09:36:59 np0005626463.localdomain sudo[268796]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:36:59 np0005626463.localdomain python3.9[268798]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^blacklist\s*{\n[\s]+devnode \"\.\*\" replace=blacklist { backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:36:59 np0005626463.localdomain sudo[268796]: pam_unix(sudo:session): session closed for user root
Feb 23 09:36:59 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:36:59.921 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:36:59 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:36:59.922 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:36:59 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:36:59.922 231725 DEBUG nova.compute.manager [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 23 09:37:00 np0005626463.localdomain sudo[268906]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hualpgqbuveslsewtqqdirnspjiikzvz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839420.021816-624-44721208616705/AnsiballZ_lineinfile.py
Feb 23 09:37:00 np0005626463.localdomain sudo[268906]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:37:00 np0005626463.localdomain python3.9[268908]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        find_multipaths yes path=/etc/multipath.conf regexp=^\s+find_multipaths state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:37:00 np0005626463.localdomain sudo[268906]: pam_unix(sudo:session): session closed for user root
Feb 23 09:37:00 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:37:00.540 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:37:00 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:37:00.540 231725 DEBUG nova.compute.manager [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 23 09:37:00 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:37:00.540 231725 DEBUG nova.compute.manager [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 23 09:37:00 np0005626463.localdomain sudo[269016]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wzrackjggwzjylfydfoiyawaijfwbwnd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839420.5778053-624-266394158490399/AnsiballZ_lineinfile.py
Feb 23 09:37:00 np0005626463.localdomain sudo[269016]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:37:01 np0005626463.localdomain python3.9[269018]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        recheck_wwid yes path=/etc/multipath.conf regexp=^\s+recheck_wwid state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:37:01 np0005626463.localdomain sudo[269016]: pam_unix(sudo:session): session closed for user root
Feb 23 09:37:01 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:37:01.042 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:37:01 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30278 DF PROTO=TCP SPT=45566 DPT=9102 SEQ=244674139 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BFB59060000000001030307) 
Feb 23 09:37:01 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:37:01.423 231725 DEBUG oslo_concurrency.lockutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Acquiring lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 23 09:37:01 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:37:01.424 231725 DEBUG oslo_concurrency.lockutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Acquired lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 23 09:37:01 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:37:01.424 231725 DEBUG nova.network.neutron [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 23 09:37:01 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:37:01.424 231725 DEBUG nova.objects.instance [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c2a7d92b-952f-46a7-8a6a-3322a48fcf4b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 23 09:37:01 np0005626463.localdomain sudo[269126]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fqyladubwpvwysjadrstbbonlnevaotq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839421.1823926-624-199792077832147/AnsiballZ_lineinfile.py
Feb 23 09:37:01 np0005626463.localdomain sudo[269126]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:37:01 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.
Feb 23 09:37:01 np0005626463.localdomain podman[269129]: 2026-02-23 09:37:01.596433269 +0000 UTC m=+0.087059513 container health_status 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.7, name=ubi9/ubi-minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., managed_by=edpm_ansible, config_id=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, architecture=x86_64, org.opencontainers.image.created=2026-02-05T04:57:10Z, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, release=1770267347, build-date=2026-02-05T04:57:10Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c)
Feb 23 09:37:01 np0005626463.localdomain podman[269129]: 2026-02-23 09:37:01.634662198 +0000 UTC m=+0.125288422 container exec_died 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, build-date=2026-02-05T04:57:10Z, org.opencontainers.image.created=2026-02-05T04:57:10Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., name=ubi9/ubi-minimal, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=openstack_network_exporter, release=1770267347, io.openshift.expose-services=, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, version=9.7, distribution-scope=public, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible)
Feb 23 09:37:01 np0005626463.localdomain systemd[1]: 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.service: Deactivated successfully.
Feb 23 09:37:01 np0005626463.localdomain python3.9[269128]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        skip_kpartx yes path=/etc/multipath.conf regexp=^\s+skip_kpartx state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:37:01 np0005626463.localdomain sudo[269126]: pam_unix(sudo:session): session closed for user root
Feb 23 09:37:02 np0005626463.localdomain sudo[269257]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rkgggfdrcsoekkfnsxutnikewswkywdv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839421.827565-624-80804305677864/AnsiballZ_lineinfile.py
Feb 23 09:37:02 np0005626463.localdomain sudo[269257]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:37:02 np0005626463.localdomain python3.9[269259]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        user_friendly_names no path=/etc/multipath.conf regexp=^\s+user_friendly_names state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:37:02 np0005626463.localdomain sudo[269257]: pam_unix(sudo:session): session closed for user root
Feb 23 09:37:02 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:37:02.614 231725 DEBUG nova.network.neutron [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Updating instance_info_cache with network_info: [{"id": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "address": "fa:16:3e:a0:9d:00", "network": {"id": "9da5b53d-3184-450f-9a5b-bdba1a6c9f6d", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "37b8098efb0d4ecc90b451a2db0e966f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa27e5011-20", "ovs_interfaceid": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 23 09:37:02 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:37:02.635 231725 DEBUG oslo_concurrency.lockutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Releasing lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 23 09:37:02 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:37:02.636 231725 DEBUG nova.compute.manager [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 23 09:37:02 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:37:02.636 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:37:02 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:37:02.637 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:37:03 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:37:03.063 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:37:03 np0005626463.localdomain sudo[269367]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zlbcaxqomucmatsjflklbveybbmnphds ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839422.6711578-711-245891666387625/AnsiballZ_stat.py
Feb 23 09:37:03 np0005626463.localdomain sudo[269367]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:37:03 np0005626463.localdomain python3.9[269369]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 23 09:37:03 np0005626463.localdomain sudo[269367]: pam_unix(sudo:session): session closed for user root
Feb 23 09:37:03 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:37:03.540 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:37:03 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:37:03.541 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:37:04 np0005626463.localdomain sudo[269479]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-erlcecdfpkvhmsyepylfcgjaonfqivhd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839423.7722664-741-54400977894806/AnsiballZ_systemd_service.py
Feb 23 09:37:04 np0005626463.localdomain sudo[269479]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:37:04 np0005626463.localdomain python3.9[269481]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=multipathd.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 09:37:04 np0005626463.localdomain sudo[269479]: pam_unix(sudo:session): session closed for user root
Feb 23 09:37:04 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:37:04.540 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:37:05 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:37:05.540 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:37:05 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:37:05.587 231725 DEBUG oslo_concurrency.lockutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 09:37:05 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:37:05.587 231725 DEBUG oslo_concurrency.lockutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 09:37:05 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:37:05.587 231725 DEBUG oslo_concurrency.lockutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 09:37:05 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:37:05.588 231725 DEBUG nova.compute.resource_tracker [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Auditing locally available compute resources for np0005626463.localdomain (node: np0005626463.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 23 09:37:05 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:37:05.588 231725 DEBUG oslo_concurrency.processutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 09:37:05 np0005626463.localdomain sudo[269611]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-spajyoazvqvkdqckdkqgqiuznqtfmwea ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839425.6308749-765-66705733151190/AnsiballZ_systemd_service.py
Feb 23 09:37:05 np0005626463.localdomain sudo[269611]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:37:06 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:37:06.006 231725 DEBUG oslo_concurrency.processutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.418s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 09:37:06 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:37:06.044 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:37:06 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:37:06.075 231725 DEBUG nova.virt.libvirt.driver [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 23 09:37:06 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:37:06.076 231725 DEBUG nova.virt.libvirt.driver [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 23 09:37:06 np0005626463.localdomain python3.9[269613]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=multipathd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 09:37:06 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.
Feb 23 09:37:06 np0005626463.localdomain sudo[269611]: pam_unix(sudo:session): session closed for user root
Feb 23 09:37:06 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:37:06.279 231725 WARNING nova.virt.libvirt.driver [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 23 09:37:06 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:37:06.281 231725 DEBUG nova.compute.resource_tracker [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Hypervisor/Node resource view: name=np0005626463.localdomain free_ram=12159MB free_disk=41.83688735961914GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 23 09:37:06 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:37:06.282 231725 DEBUG oslo_concurrency.lockutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 09:37:06 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:37:06.282 231725 DEBUG oslo_concurrency.lockutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 09:37:06 np0005626463.localdomain systemd[1]: tmp-crun.jLeBKs.mount: Deactivated successfully.
Feb 23 09:37:06 np0005626463.localdomain podman[269617]: 2026-02-23 09:37:06.341249355 +0000 UTC m=+0.083918557 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.43.0, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Feb 23 09:37:06 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:37:06.345 231725 DEBUG nova.compute.resource_tracker [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Instance c2a7d92b-952f-46a7-8a6a-3322a48fcf4b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 23 09:37:06 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:37:06.346 231725 DEBUG nova.compute.resource_tracker [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 23 09:37:06 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:37:06.346 231725 DEBUG nova.compute.resource_tracker [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Final resource view: name=np0005626463.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 23 09:37:06 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.
Feb 23 09:37:06 np0005626463.localdomain podman[269617]: 2026-02-23 09:37:06.38030767 +0000 UTC m=+0.122976912 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ovn_controller, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, container_name=ovn_controller)
Feb 23 09:37:06 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:37:06.380 231725 DEBUG oslo_concurrency.processutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 09:37:06 np0005626463.localdomain systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully.
Feb 23 09:37:06 np0005626463.localdomain podman[269654]: 2026-02-23 09:37:06.438991624 +0000 UTC m=+0.071812597 container health_status bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 23 09:37:06 np0005626463.localdomain podman[269654]: 2026-02-23 09:37:06.451085344 +0000 UTC m=+0.083906327 container exec_died bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Feb 23 09:37:06 np0005626463.localdomain systemd[1]: bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.service: Deactivated successfully.
Feb 23 09:37:06 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:37:06.812 231725 DEBUG oslo_concurrency.processutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.432s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 09:37:06 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:37:06.819 231725 DEBUG nova.compute.provider_tree [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Inventory has not changed in ProviderTree for provider: be63d86c-a403-4ec9-a515-07ea2962cb4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 23 09:37:06 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:37:06.836 231725 DEBUG nova.scheduler.client.report [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Inventory has not changed for provider be63d86c-a403-4ec9-a515-07ea2962cb4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 23 09:37:06 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:37:06.839 231725 DEBUG nova.compute.resource_tracker [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Compute_service record updated for np0005626463.localdomain:np0005626463.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 23 09:37:06 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:37:06.839 231725 DEBUG oslo_concurrency.lockutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.557s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 09:37:07 np0005626463.localdomain sudo[269791]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gusbkgtarqfzzcrwehebgoprtcbjenit ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839426.7989688-801-141369712080792/AnsiballZ_file.py
Feb 23 09:37:07 np0005626463.localdomain sudo[269791]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:37:07 np0005626463.localdomain python3.9[269793]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Feb 23 09:37:07 np0005626463.localdomain sudo[269791]: pam_unix(sudo:session): session closed for user root
Feb 23 09:37:07 np0005626463.localdomain sudo[269901]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nvjdbafbjftjmzfvlsgzrhkvozcqrstb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839427.412456-825-251106771138406/AnsiballZ_modprobe.py
Feb 23 09:37:07 np0005626463.localdomain sudo[269901]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:37:07 np0005626463.localdomain sshd[269904]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:37:07 np0005626463.localdomain python3.9[269903]: ansible-community.general.modprobe Invoked with name=nvme-fabrics state=present params= persistent=disabled
Feb 23 09:37:07 np0005626463.localdomain sudo[269901]: pam_unix(sudo:session): session closed for user root
Feb 23 09:37:08 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:37:08.099 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:37:08 np0005626463.localdomain sshd[269904]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:37:08 np0005626463.localdomain sudo[270013]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ijonfkbzrqlbatxugbipqkcmqtzheicq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839428.06977-849-91355390511568/AnsiballZ_stat.py
Feb 23 09:37:08 np0005626463.localdomain sudo[270013]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:37:08 np0005626463.localdomain python3.9[270015]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/nvme-fabrics.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:37:08 np0005626463.localdomain sudo[270013]: pam_unix(sudo:session): session closed for user root
Feb 23 09:37:08 np0005626463.localdomain sudo[270070]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jaunsrtfessgywoxpxvqjvjikfyjhree ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839428.06977-849-91355390511568/AnsiballZ_file.py
Feb 23 09:37:08 np0005626463.localdomain sudo[270070]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:37:09 np0005626463.localdomain python3.9[270072]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/etc/modules-load.d/nvme-fabrics.conf _original_basename=module-load.conf.j2 recurse=False state=file path=/etc/modules-load.d/nvme-fabrics.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:37:09 np0005626463.localdomain sudo[270070]: pam_unix(sudo:session): session closed for user root
Feb 23 09:37:09 np0005626463.localdomain podman[242954]: time="2026-02-23T09:37:09Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 23 09:37:09 np0005626463.localdomain podman[242954]: @ - - [23/Feb/2026:09:37:09 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 149683 "" "Go-http-client/1.1"
Feb 23 09:37:09 np0005626463.localdomain podman[242954]: @ - - [23/Feb/2026:09:37:09 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16791 "" "Go-http-client/1.1"
Feb 23 09:37:09 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30279 DF PROTO=TCP SPT=45566 DPT=9102 SEQ=244674139 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BFB7A060000000001030307) 
Feb 23 09:37:09 np0005626463.localdomain sudo[270180]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tcfsjnhwsdxjidjknimomhhpenfgsjpi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839429.3754091-888-61376463628930/AnsiballZ_lineinfile.py
Feb 23 09:37:09 np0005626463.localdomain sudo[270180]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:37:09 np0005626463.localdomain python3.9[270182]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=nvme-fabrics  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:37:09 np0005626463.localdomain sudo[270180]: pam_unix(sudo:session): session closed for user root
Feb 23 09:37:10 np0005626463.localdomain sudo[270290]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uqbksrbnmwgibgckqaqufpqmiqqqtilz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839430.196586-915-219251019216363/AnsiballZ_dnf.py
Feb 23 09:37:10 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.
Feb 23 09:37:10 np0005626463.localdomain sudo[270290]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:37:10 np0005626463.localdomain podman[270292]: 2026-02-23 09:37:10.589240898 +0000 UTC m=+0.081803143 container health_status be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20260216, container_name=ceilometer_agent_compute, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 23 09:37:10 np0005626463.localdomain podman[270292]: 2026-02-23 09:37:10.627969172 +0000 UTC m=+0.120531357 container exec_died be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 23 09:37:10 np0005626463.localdomain systemd[1]: be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.service: Deactivated successfully.
Feb 23 09:37:10 np0005626463.localdomain python3.9[270293]: ansible-ansible.legacy.dnf Invoked with name=['nvme-cli'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 23 09:37:11 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:37:11.080 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:37:13 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:37:13.132 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:37:13 np0005626463.localdomain openstack_network_exporter[245358]: ERROR   09:37:13 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 23 09:37:13 np0005626463.localdomain openstack_network_exporter[245358]: 
Feb 23 09:37:13 np0005626463.localdomain openstack_network_exporter[245358]: ERROR   09:37:13 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 23 09:37:13 np0005626463.localdomain openstack_network_exporter[245358]: 
Feb 23 09:37:14 np0005626463.localdomain sudo[270290]: pam_unix(sudo:session): session closed for user root
Feb 23 09:37:14 np0005626463.localdomain sshd[270386]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:37:14 np0005626463.localdomain python3.9[270424]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 23 09:37:14 np0005626463.localdomain sshd[270386]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:37:15 np0005626463.localdomain sudo[270536]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dnhzlagzaxmmmimxdowdzmhvjcosqgqf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839435.3318915-967-236905789197193/AnsiballZ_file.py
Feb 23 09:37:15 np0005626463.localdomain sudo[270536]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:37:15 np0005626463.localdomain python3.9[270538]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/ssh/ssh_known_hosts state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:37:15 np0005626463.localdomain sudo[270536]: pam_unix(sudo:session): session closed for user root
Feb 23 09:37:16 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:37:16.117 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:37:16 np0005626463.localdomain sshd[270630]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:37:16 np0005626463.localdomain sudo[270647]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nktbpiibbzcdqufgxazgrrhftwuqxrts ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839436.3645551-1000-102275590932030/AnsiballZ_systemd_service.py
Feb 23 09:37:16 np0005626463.localdomain sudo[270647]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:37:16 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.
Feb 23 09:37:16 np0005626463.localdomain podman[270651]: 2026-02-23 09:37:16.736571584 +0000 UTC m=+0.087172157 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.43.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.build-date=20260216, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Feb 23 09:37:16 np0005626463.localdomain podman[270651]: 2026-02-23 09:37:16.768083187 +0000 UTC m=+0.118683740 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ovn_metadata_agent, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, io.buildah.version=1.43.0)
Feb 23 09:37:16 np0005626463.localdomain systemd[1]: 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully.
Feb 23 09:37:16 np0005626463.localdomain python3.9[270650]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 23 09:37:16 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 09:37:17 np0005626463.localdomain systemd-rc-local-generator[270693]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 09:37:17 np0005626463.localdomain systemd-sysv-generator[270698]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 09:37:17 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:37:17 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 23 09:37:17 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:37:17 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:37:17 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 09:37:17 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 23 09:37:17 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:37:17 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:37:17 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:37:17 np0005626463.localdomain sudo[270647]: pam_unix(sudo:session): session closed for user root
Feb 23 09:37:17 np0005626463.localdomain sshd[270630]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:37:17 np0005626463.localdomain python3.9[270812]: ansible-ansible.builtin.service_facts Invoked
Feb 23 09:37:17 np0005626463.localdomain network[270829]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Feb 23 09:37:17 np0005626463.localdomain network[270830]: 'network-scripts' will be removed from distribution in near future.
Feb 23 09:37:17 np0005626463.localdomain network[270831]: It is advised to switch to 'NetworkManager' instead for network management.
Feb 23 09:37:18 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:37:18.177 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:37:19 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 09:37:21 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:37:21.155 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:37:22 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.
Feb 23 09:37:22 np0005626463.localdomain podman[270971]: 2026-02-23 09:37:22.911067172 +0000 UTC m=+0.082826864 container health_status da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 23 09:37:22 np0005626463.localdomain podman[270971]: 2026-02-23 09:37:22.923831073 +0000 UTC m=+0.095590825 container exec_died da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 23 09:37:22 np0005626463.localdomain systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Deactivated successfully.
Feb 23 09:37:23 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:37:23.206 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:37:23 np0005626463.localdomain sudo[271082]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lrvambtmqjkqpfmyrpembpmjbenwioyz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839443.66988-1057-237629272479601/AnsiballZ_systemd_service.py
Feb 23 09:37:23 np0005626463.localdomain sudo[271082]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:37:24 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29761 DF PROTO=TCP SPT=56496 DPT=9102 SEQ=2296097628 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BFBB2800000000001030307) 
Feb 23 09:37:24 np0005626463.localdomain python3.9[271084]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 09:37:24 np0005626463.localdomain sudo[271082]: pam_unix(sudo:session): session closed for user root
Feb 23 09:37:24 np0005626463.localdomain sudo[271193]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cgkodqxsunplicoudyjsvukbwszltkta ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839444.361965-1057-129939273726920/AnsiballZ_systemd_service.py
Feb 23 09:37:24 np0005626463.localdomain sudo[271193]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:37:24 np0005626463.localdomain python3.9[271195]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_migration_target.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 09:37:24 np0005626463.localdomain sudo[271193]: pam_unix(sudo:session): session closed for user root
Feb 23 09:37:25 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29762 DF PROTO=TCP SPT=56496 DPT=9102 SEQ=2296097628 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BFBB6860000000001030307) 
Feb 23 09:37:26 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30280 DF PROTO=TCP SPT=45566 DPT=9102 SEQ=244674139 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BFBBA060000000001030307) 
Feb 23 09:37:26 np0005626463.localdomain sudo[271304]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uckkjpnjdjvumukjalnclfhzibkfwmoh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839445.8084247-1057-87877282440823/AnsiballZ_systemd_service.py
Feb 23 09:37:26 np0005626463.localdomain sudo[271304]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:37:26 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:37:26.185 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:37:26 np0005626463.localdomain python3.9[271306]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api_cron.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 09:37:26 np0005626463.localdomain sudo[271304]: pam_unix(sudo:session): session closed for user root
Feb 23 09:37:26 np0005626463.localdomain sudo[271415]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hwpzmpareewsbcbqtbpiqjmryjyajlxq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839446.4739153-1057-41275807434856/AnsiballZ_systemd_service.py
Feb 23 09:37:26 np0005626463.localdomain sudo[271415]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:37:27 np0005626463.localdomain python3.9[271417]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 09:37:27 np0005626463.localdomain sudo[271415]: pam_unix(sudo:session): session closed for user root
Feb 23 09:37:27 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29763 DF PROTO=TCP SPT=56496 DPT=9102 SEQ=2296097628 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BFBBE870000000001030307) 
Feb 23 09:37:27 np0005626463.localdomain sudo[271526]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dirmknxyzqucyjfcalhmlxbgdphyhscn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839447.1816113-1057-242052740376340/AnsiballZ_systemd_service.py
Feb 23 09:37:27 np0005626463.localdomain sudo[271526]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:37:27 np0005626463.localdomain python3.9[271528]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_conductor.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 09:37:27 np0005626463.localdomain sudo[271526]: pam_unix(sudo:session): session closed for user root
Feb 23 09:37:28 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54907 DF PROTO=TCP SPT=45788 DPT=9102 SEQ=998281806 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BFBC2060000000001030307) 
Feb 23 09:37:28 np0005626463.localdomain sudo[271637]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qazlyixaczmzodmdcgeasawqgqjqihzu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839447.9012496-1057-47975179279846/AnsiballZ_systemd_service.py
Feb 23 09:37:28 np0005626463.localdomain sudo[271637]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:37:28 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:37:28.243 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:37:28 np0005626463.localdomain python3.9[271639]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_metadata.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 09:37:28 np0005626463.localdomain sudo[271637]: pam_unix(sudo:session): session closed for user root
Feb 23 09:37:28 np0005626463.localdomain sudo[271748]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nfwpwgrmwksfhoxunqhatucsxjqhdfjg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839448.614189-1057-244211883287932/AnsiballZ_systemd_service.py
Feb 23 09:37:28 np0005626463.localdomain sudo[271748]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:37:29 np0005626463.localdomain python3.9[271750]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_scheduler.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 09:37:29 np0005626463.localdomain sudo[271748]: pam_unix(sudo:session): session closed for user root
Feb 23 09:37:29 np0005626463.localdomain sudo[271859]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mxtwmrjalaqbnlfzlkyddvrkefxlyjgo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839449.3847864-1057-3413665528999/AnsiballZ_systemd_service.py
Feb 23 09:37:29 np0005626463.localdomain sudo[271859]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:37:29 np0005626463.localdomain python3.9[271861]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_vnc_proxy.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 09:37:31 np0005626463.localdomain sudo[271859]: pam_unix(sudo:session): session closed for user root
Feb 23 09:37:31 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29764 DF PROTO=TCP SPT=56496 DPT=9102 SEQ=2296097628 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BFBCE460000000001030307) 
Feb 23 09:37:31 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:37:31.229 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:37:31 np0005626463.localdomain sudo[271970]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aynlivkgvdvbbvumajqbknngqgtpgfun ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839451.4803216-1234-84947027563736/AnsiballZ_file.py
Feb 23 09:37:31 np0005626463.localdomain sudo[271970]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:37:31 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.
Feb 23 09:37:31 np0005626463.localdomain podman[271973]: 2026-02-23 09:37:31.856896522 +0000 UTC m=+0.086381352 container health_status 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, architecture=x86_64, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9/ubi-minimal, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, version=9.7, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., org.opencontainers.image.created=2026-02-05T04:57:10Z, release=1770267347, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, build-date=2026-02-05T04:57:10Z, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, config_id=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter)
Feb 23 09:37:31 np0005626463.localdomain podman[271973]: 2026-02-23 09:37:31.872279703 +0000 UTC m=+0.101764553 container exec_died 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, name=ubi9/ubi-minimal, io.openshift.tags=minimal rhel9, org.opencontainers.image.created=2026-02-05T04:57:10Z, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, managed_by=edpm_ansible, release=1770267347, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, build-date=2026-02-05T04:57:10Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, version=9.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers)
Feb 23 09:37:31 np0005626463.localdomain systemd[1]: 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.service: Deactivated successfully.
Feb 23 09:37:31 np0005626463.localdomain python3.9[271972]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:37:31 np0005626463.localdomain sudo[271970]: pam_unix(sudo:session): session closed for user root
Feb 23 09:37:32 np0005626463.localdomain sudo[272100]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vnzutwyhnoteigkannrndatawrzlswdz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839452.238081-1234-35167786571091/AnsiballZ_file.py
Feb 23 09:37:32 np0005626463.localdomain sudo[272100]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:37:32 np0005626463.localdomain python3.9[272102]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:37:32 np0005626463.localdomain sudo[272100]: pam_unix(sudo:session): session closed for user root
Feb 23 09:37:32 np0005626463.localdomain sudo[272210]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xgejegomaqakxwvpbylzrasturpvveki ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839452.7516596-1234-281461623314510/AnsiballZ_file.py
Feb 23 09:37:32 np0005626463.localdomain sudo[272210]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:37:33 np0005626463.localdomain python3.9[272212]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:37:33 np0005626463.localdomain sudo[272210]: pam_unix(sudo:session): session closed for user root
Feb 23 09:37:33 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:37:33.281 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:37:33 np0005626463.localdomain sudo[272320]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rcwotopbitjfwzfmzwexkuxuaobhtzac ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839453.35795-1234-63580082787114/AnsiballZ_file.py
Feb 23 09:37:33 np0005626463.localdomain sudo[272320]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:37:33 np0005626463.localdomain python3.9[272322]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:37:33 np0005626463.localdomain sudo[272320]: pam_unix(sudo:session): session closed for user root
Feb 23 09:37:34 np0005626463.localdomain sudo[272430]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-airwhvpkcyvitcpnhtqanlryihogidpn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839454.6762037-1234-185937702822845/AnsiballZ_file.py
Feb 23 09:37:34 np0005626463.localdomain sudo[272430]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:37:35 np0005626463.localdomain python3.9[272432]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:37:35 np0005626463.localdomain sudo[272430]: pam_unix(sudo:session): session closed for user root
Feb 23 09:37:35 np0005626463.localdomain sudo[272540]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fqbpebuaxelmqacelmlhujgdmeemxiqg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839455.2643332-1234-106635662528902/AnsiballZ_file.py
Feb 23 09:37:35 np0005626463.localdomain sudo[272540]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:37:35 np0005626463.localdomain python3.9[272542]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:37:35 np0005626463.localdomain sudo[272540]: pam_unix(sudo:session): session closed for user root
Feb 23 09:37:36 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:37:36.229 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:37:36 np0005626463.localdomain sudo[272650]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rdqtzznmnqqfrjuhgaluraoumsyatrxk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839455.8657146-1234-34282648295898/AnsiballZ_file.py
Feb 23 09:37:36 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.
Feb 23 09:37:36 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.
Feb 23 09:37:36 np0005626463.localdomain sudo[272650]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:37:36 np0005626463.localdomain podman[272653]: 2026-02-23 09:37:36.896904315 +0000 UTC m=+0.078294134 container health_status bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Feb 23 09:37:36 np0005626463.localdomain podman[272653]: 2026-02-23 09:37:36.910313345 +0000 UTC m=+0.091703224 container exec_died bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Feb 23 09:37:36 np0005626463.localdomain systemd[1]: bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.service: Deactivated successfully.
Feb 23 09:37:37 np0005626463.localdomain podman[272652]: 2026-02-23 09:37:37.001807144 +0000 UTC m=+0.186995120 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.43.0, config_id=ovn_controller, container_name=ovn_controller, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Feb 23 09:37:37 np0005626463.localdomain python3.9[272654]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:37:37 np0005626463.localdomain sudo[272650]: pam_unix(sudo:session): session closed for user root
Feb 23 09:37:37 np0005626463.localdomain podman[272652]: 2026-02-23 09:37:37.073346141 +0000 UTC m=+0.258534177 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, config_id=ovn_controller, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.43.0)
Feb 23 09:37:37 np0005626463.localdomain systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully.
Feb 23 09:37:37 np0005626463.localdomain sudo[272809]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-atcrjcwdjpopxhtttulsysqpwcjiyzxx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839457.1428509-1234-92517442031615/AnsiballZ_file.py
Feb 23 09:37:37 np0005626463.localdomain sudo[272809]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:37:37 np0005626463.localdomain python3.9[272811]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:37:37 np0005626463.localdomain sudo[272809]: pam_unix(sudo:session): session closed for user root
Feb 23 09:37:38 np0005626463.localdomain sudo[272919]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mneyeucruxvtovqoeyaluxtolhnwpzfi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839458.0377605-1405-219464026451130/AnsiballZ_file.py
Feb 23 09:37:38 np0005626463.localdomain sudo[272919]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:37:38 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:37:38.311 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:37:38 np0005626463.localdomain python3.9[272921]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:37:38 np0005626463.localdomain sudo[272919]: pam_unix(sudo:session): session closed for user root
Feb 23 09:37:38 np0005626463.localdomain sudo[273029]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tuaikypghbcrrebmqiainahemvbpibmk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839458.6056645-1405-170487688626335/AnsiballZ_file.py
Feb 23 09:37:38 np0005626463.localdomain sudo[273029]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:37:39 np0005626463.localdomain python3.9[273031]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:37:39 np0005626463.localdomain sudo[273029]: pam_unix(sudo:session): session closed for user root
Feb 23 09:37:39 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29765 DF PROTO=TCP SPT=56496 DPT=9102 SEQ=2296097628 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BFBEE060000000001030307) 
Feb 23 09:37:39 np0005626463.localdomain sudo[273139]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zwuwvjsacazhedwwhtamiezgxhxhwtcs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839459.1590056-1405-190608493760005/AnsiballZ_file.py
Feb 23 09:37:39 np0005626463.localdomain sudo[273139]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:37:39 np0005626463.localdomain podman[242954]: time="2026-02-23T09:37:39Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 23 09:37:39 np0005626463.localdomain podman[242954]: @ - - [23/Feb/2026:09:37:39 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 149683 "" "Go-http-client/1.1"
Feb 23 09:37:39 np0005626463.localdomain podman[242954]: @ - - [23/Feb/2026:09:37:39 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16793 "" "Go-http-client/1.1"
Feb 23 09:37:39 np0005626463.localdomain python3.9[273141]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:37:39 np0005626463.localdomain sudo[273139]: pam_unix(sudo:session): session closed for user root
Feb 23 09:37:40 np0005626463.localdomain sudo[273249]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tljwhuherxlwyerzcdxqfpkhhntoooha ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839459.7340026-1405-249799634554312/AnsiballZ_file.py
Feb 23 09:37:40 np0005626463.localdomain sudo[273249]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:37:40 np0005626463.localdomain python3.9[273251]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:37:40 np0005626463.localdomain sudo[273249]: pam_unix(sudo:session): session closed for user root
Feb 23 09:37:40 np0005626463.localdomain sudo[273359]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-igetjtbgvwdozvjorgedxxheqtdinawf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839460.3208952-1405-79445822316276/AnsiballZ_file.py
Feb 23 09:37:40 np0005626463.localdomain sudo[273359]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:37:40 np0005626463.localdomain python3.9[273361]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:37:40 np0005626463.localdomain sudo[273359]: pam_unix(sudo:session): session closed for user root
Feb 23 09:37:40 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.
Feb 23 09:37:40 np0005626463.localdomain podman[273379]: 2026-02-23 09:37:40.909602672 +0000 UTC m=+0.085369253 container health_status be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.build-date=20260216, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, io.buildah.version=1.43.0, managed_by=edpm_ansible)
Feb 23 09:37:40 np0005626463.localdomain podman[273379]: 2026-02-23 09:37:40.925327872 +0000 UTC m=+0.101094433 container exec_died be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, io.buildah.version=1.43.0, managed_by=edpm_ansible, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb)
Feb 23 09:37:40 np0005626463.localdomain systemd[1]: be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.service: Deactivated successfully.
Feb 23 09:37:41 np0005626463.localdomain sudo[273486]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wcztuyutiejbwheeyofqkflvhfwdltef ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839460.8478649-1405-117265615614317/AnsiballZ_file.py
Feb 23 09:37:41 np0005626463.localdomain sudo[273486]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:37:41 np0005626463.localdomain python3.9[273488]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:37:41 np0005626463.localdomain sudo[273486]: pam_unix(sudo:session): session closed for user root
Feb 23 09:37:41 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:37:41.264 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:37:41 np0005626463.localdomain sudo[273596]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fmbejjptlylnuttnzscofjjesbsapzff ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839461.3706553-1405-227484760255741/AnsiballZ_file.py
Feb 23 09:37:41 np0005626463.localdomain sudo[273596]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:37:41 np0005626463.localdomain python3.9[273598]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:37:41 np0005626463.localdomain sudo[273596]: pam_unix(sudo:session): session closed for user root
Feb 23 09:37:42 np0005626463.localdomain sudo[273706]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zkwxxvmmggmykoxatuarnnefipozmqyz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839461.9147475-1405-200088561670729/AnsiballZ_file.py
Feb 23 09:37:42 np0005626463.localdomain sudo[273706]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:37:42 np0005626463.localdomain python3.9[273708]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:37:42 np0005626463.localdomain sudo[273706]: pam_unix(sudo:session): session closed for user root
Feb 23 09:37:43 np0005626463.localdomain sudo[273816]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ugzgmpjtjwxxviemebserykfcbdyhiyd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839462.977877-1579-150553507700819/AnsiballZ_command.py
Feb 23 09:37:43 np0005626463.localdomain sudo[273816]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:37:43 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:37:43.343 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:37:43 np0005626463.localdomain openstack_network_exporter[245358]: ERROR   09:37:43 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 23 09:37:43 np0005626463.localdomain openstack_network_exporter[245358]: 
Feb 23 09:37:43 np0005626463.localdomain openstack_network_exporter[245358]: ERROR   09:37:43 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 23 09:37:43 np0005626463.localdomain openstack_network_exporter[245358]: 
Feb 23 09:37:43 np0005626463.localdomain python3.9[273818]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                                              systemctl disable --now certmonger.service
                                                              test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                                            fi
                                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 09:37:43 np0005626463.localdomain sudo[273816]: pam_unix(sudo:session): session closed for user root
Feb 23 09:37:43 np0005626463.localdomain sshd[273876]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:37:44 np0005626463.localdomain python3.9[273930]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Feb 23 09:37:45 np0005626463.localdomain sudo[274038]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hgsxtdeydxocjhhgrwkaxfhffoolsohj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839465.67827-1633-73931212219433/AnsiballZ_systemd_service.py
Feb 23 09:37:45 np0005626463.localdomain sudo[274038]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:37:46 np0005626463.localdomain python3.9[274040]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 23 09:37:46 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 09:37:46 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:37:46.294 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:37:46 np0005626463.localdomain systemd-rc-local-generator[274065]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 09:37:46 np0005626463.localdomain systemd-sysv-generator[274070]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 09:37:46 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:37:46 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 23 09:37:46 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:37:46 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:37:46 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 09:37:46 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 23 09:37:46 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:37:46 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:37:46 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:37:46 np0005626463.localdomain sshd[273876]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:37:46 np0005626463.localdomain sudo[274038]: pam_unix(sudo:session): session closed for user root
Feb 23 09:37:46 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.
Feb 23 09:37:46 np0005626463.localdomain podman[274094]: 2026-02-23 09:37:46.910333106 +0000 UTC m=+0.087368513 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2)
Feb 23 09:37:46 np0005626463.localdomain podman[274094]: 2026-02-23 09:37:46.948196963 +0000 UTC m=+0.125232330 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260216, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent)
Feb 23 09:37:46 np0005626463.localdomain systemd[1]: 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully.
Feb 23 09:37:47 np0005626463.localdomain sudo[274201]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-twakavgtvvepvjwluvspruepqsbamolh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839466.9134772-1657-94370850184324/AnsiballZ_command.py
Feb 23 09:37:47 np0005626463.localdomain sudo[274201]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:37:47 np0005626463.localdomain python3.9[274203]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 09:37:47 np0005626463.localdomain sudo[274201]: pam_unix(sudo:session): session closed for user root
Feb 23 09:37:48 np0005626463.localdomain sudo[274312]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ddxsjezketbrvvxtwaswtidnghqfqyte ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839467.9266157-1657-5436117280615/AnsiballZ_command.py
Feb 23 09:37:48 np0005626463.localdomain sudo[274312]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:37:48 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:37:48.379 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:37:48 np0005626463.localdomain python3.9[274314]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_migration_target.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 09:37:48 np0005626463.localdomain sudo[274312]: pam_unix(sudo:session): session closed for user root
Feb 23 09:37:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:37:48.538 163572 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 09:37:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:37:48.539 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 09:37:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:37:48.540 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 09:37:48 np0005626463.localdomain sudo[274423]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-locpkerinekgzztdsuwddkrnwysmoqrl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839468.525687-1657-178881074188053/AnsiballZ_command.py
Feb 23 09:37:48 np0005626463.localdomain sudo[274423]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:37:48 np0005626463.localdomain python3.9[274425]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api_cron.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 09:37:49 np0005626463.localdomain sudo[274423]: pam_unix(sudo:session): session closed for user root
Feb 23 09:37:49 np0005626463.localdomain sudo[274534]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fdfxmksmkbbkmducmtpwyrhabzawwuwk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839469.154536-1657-125062439859341/AnsiballZ_command.py
Feb 23 09:37:49 np0005626463.localdomain sudo[274534]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:37:49 np0005626463.localdomain python3.9[274536]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 09:37:49 np0005626463.localdomain sudo[274534]: pam_unix(sudo:session): session closed for user root
Feb 23 09:37:50 np0005626463.localdomain sudo[274645]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zleczinzikafrbnprskuzxdapoyoidjg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839469.8966608-1657-263454204504605/AnsiballZ_command.py
Feb 23 09:37:50 np0005626463.localdomain sudo[274645]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:37:50 np0005626463.localdomain python3.9[274647]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_conductor.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 09:37:50 np0005626463.localdomain sudo[274645]: pam_unix(sudo:session): session closed for user root
Feb 23 09:37:50 np0005626463.localdomain sudo[274756]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mjapnibhstafzdtdldxzhowprzuibtpz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839470.5165694-1657-83741814126852/AnsiballZ_command.py
Feb 23 09:37:50 np0005626463.localdomain sudo[274756]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:37:50 np0005626463.localdomain python3.9[274758]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_metadata.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 09:37:51 np0005626463.localdomain sudo[274756]: pam_unix(sudo:session): session closed for user root
Feb 23 09:37:51 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:37:51.327 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:37:51 np0005626463.localdomain sudo[274867]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zdvsdodmtwtlgdudszowunqtinsrlycq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839471.1283703-1657-36531085673842/AnsiballZ_command.py
Feb 23 09:37:51 np0005626463.localdomain sudo[274867]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:37:51 np0005626463.localdomain python3.9[274869]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_scheduler.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 09:37:51 np0005626463.localdomain sudo[274867]: pam_unix(sudo:session): session closed for user root
Feb 23 09:37:51 np0005626463.localdomain sudo[274978]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oaehgynbgzwmrgvfinxqgmjysatfrcaf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839471.733721-1657-10623601571047/AnsiballZ_command.py
Feb 23 09:37:51 np0005626463.localdomain sudo[274978]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:37:52 np0005626463.localdomain python3.9[274980]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_vnc_proxy.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 09:37:52 np0005626463.localdomain sudo[274978]: pam_unix(sudo:session): session closed for user root
Feb 23 09:37:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:37:53.412 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:37:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:37:53.540 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:37:53 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:37:53.541 231725 DEBUG nova.compute.manager [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Feb 23 09:37:53 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.
Feb 23 09:37:53 np0005626463.localdomain podman[274999]: 2026-02-23 09:37:53.910255149 +0000 UTC m=+0.084176477 container health_status da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 23 09:37:53 np0005626463.localdomain podman[274999]: 2026-02-23 09:37:53.923235703 +0000 UTC m=+0.097157021 container exec_died da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 23 09:37:53 np0005626463.localdomain systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Deactivated successfully.
Feb 23 09:37:54 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64118 DF PROTO=TCP SPT=56114 DPT=9102 SEQ=276046667 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BFC27B00000000001030307) 
Feb 23 09:37:54 np0005626463.localdomain sudo[275110]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wxgnsglvqzpyktglnwhorkadwsgcdqzp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839474.4997077-1864-252989595754166/AnsiballZ_file.py
Feb 23 09:37:54 np0005626463.localdomain sudo[275110]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:37:54 np0005626463.localdomain python3.9[275112]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 23 09:37:54 np0005626463.localdomain sudo[275110]: pam_unix(sudo:session): session closed for user root
Feb 23 09:37:55 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64119 DF PROTO=TCP SPT=56114 DPT=9102 SEQ=276046667 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BFC2BC60000000001030307) 
Feb 23 09:37:55 np0005626463.localdomain sudo[275220]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wdusvbtpnzyngpmztlvjxubavelervzw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839475.0949419-1864-251446329984534/AnsiballZ_file.py
Feb 23 09:37:55 np0005626463.localdomain sudo[275220]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:37:55 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:37:55.574 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:37:55 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:37:55.574 231725 DEBUG nova.compute.manager [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Feb 23 09:37:55 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:37:55.592 231725 DEBUG nova.compute.manager [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Feb 23 09:37:55 np0005626463.localdomain python3.9[275222]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/nova_nvme_cleaner setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 23 09:37:55 np0005626463.localdomain sudo[275220]: pam_unix(sudo:session): session closed for user root
Feb 23 09:37:55 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29766 DF PROTO=TCP SPT=56496 DPT=9102 SEQ=2296097628 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BFC2E060000000001030307) 
Feb 23 09:37:56 np0005626463.localdomain sshd[275328]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:37:56 np0005626463.localdomain sudo[275331]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wkekikadctnkrnelnhkkemipzpdisrpo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839475.980727-1909-250011657117876/AnsiballZ_file.py
Feb 23 09:37:56 np0005626463.localdomain sudo[275331]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:37:56 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:37:56.357 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:37:56 np0005626463.localdomain python3.9[275334]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 23 09:37:56 np0005626463.localdomain sudo[275331]: pam_unix(sudo:session): session closed for user root
Feb 23 09:37:56 np0005626463.localdomain sshd[275328]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:37:56 np0005626463.localdomain sudo[275442]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mcjjezhrfcvzheekregcnpaihstprvub ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839476.567764-1909-118214278867413/AnsiballZ_file.py
Feb 23 09:37:56 np0005626463.localdomain sudo[275442]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:37:57 np0005626463.localdomain python3.9[275444]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/_nova_secontext setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 23 09:37:57 np0005626463.localdomain sudo[275442]: pam_unix(sudo:session): session closed for user root
Feb 23 09:37:57 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64120 DF PROTO=TCP SPT=56114 DPT=9102 SEQ=276046667 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BFC33C60000000001030307) 
Feb 23 09:37:57 np0005626463.localdomain sudo[275552]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zksdoqmfdtkhuqftycfwjepasdaidjzv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839477.6479917-1909-24731744501422/AnsiballZ_file.py
Feb 23 09:37:57 np0005626463.localdomain sudo[275552]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:37:58 np0005626463.localdomain python3.9[275554]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova/instances setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 23 09:37:58 np0005626463.localdomain sudo[275552]: pam_unix(sudo:session): session closed for user root
Feb 23 09:37:58 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30281 DF PROTO=TCP SPT=45566 DPT=9102 SEQ=244674139 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BFC38070000000001030307) 
Feb 23 09:37:58 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:37:58.450 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:37:58 np0005626463.localdomain sudo[275662]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zfqhmwokgmaeidmkwgvdnvpwtkkimjfi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839478.2250023-1909-161001932523288/AnsiballZ_file.py
Feb 23 09:37:58 np0005626463.localdomain sudo[275662]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:37:58 np0005626463.localdomain sudo[275665]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 09:37:58 np0005626463.localdomain sudo[275665]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:37:58 np0005626463.localdomain sudo[275665]: pam_unix(sudo:session): session closed for user root
Feb 23 09:37:59 np0005626463.localdomain python3.9[275664]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/etc/ceph setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 23 09:37:59 np0005626463.localdomain sudo[275662]: pam_unix(sudo:session): session closed for user root
Feb 23 09:37:59 np0005626463.localdomain sudo[275683]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 23 09:37:59 np0005626463.localdomain sudo[275683]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:37:59 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:37:59.559 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:37:59 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:37:59.559 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:37:59 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:37:59.560 231725 DEBUG nova.compute.manager [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 23 09:37:59 np0005626463.localdomain sudo[275683]: pam_unix(sudo:session): session closed for user root
Feb 23 09:37:59 np0005626463.localdomain sudo[275840]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cjcbjupdflwfnzbcdfviakjluyatzvre ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839479.6293201-1909-182596869678754/AnsiballZ_file.py
Feb 23 09:37:59 np0005626463.localdomain sudo[275840]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:38:00 np0005626463.localdomain python3.9[275842]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Feb 23 09:38:00 np0005626463.localdomain sudo[275840]: pam_unix(sudo:session): session closed for user root
Feb 23 09:38:00 np0005626463.localdomain sudo[275950]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-adkaysnvmglysnastxwhsbkobqksppig ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839480.20855-1909-209335994468904/AnsiballZ_file.py
Feb 23 09:38:00 np0005626463.localdomain sudo[275950]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:38:00 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:38:00.541 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:38:00 np0005626463.localdomain python3.9[275952]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/nvme setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Feb 23 09:38:00 np0005626463.localdomain sudo[275950]: pam_unix(sudo:session): session closed for user root
Feb 23 09:38:01 np0005626463.localdomain sudo[276060]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iheoyylsniznlvwapqhjxcgkruepcvkp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839480.7928998-1909-269751025678478/AnsiballZ_file.py
Feb 23 09:38:01 np0005626463.localdomain sudo[276060]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:38:01 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64121 DF PROTO=TCP SPT=56114 DPT=9102 SEQ=276046667 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BFC43870000000001030307) 
Feb 23 09:38:01 np0005626463.localdomain python3.9[276062]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/run/openvswitch setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Feb 23 09:38:01 np0005626463.localdomain sudo[276060]: pam_unix(sudo:session): session closed for user root
Feb 23 09:38:01 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:38:01.399 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:38:02 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:38:02.541 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:38:02 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:38:02.542 231725 DEBUG nova.compute.manager [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 23 09:38:02 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:38:02.542 231725 DEBUG nova.compute.manager [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 23 09:38:02 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:38:02.755 231725 DEBUG oslo_concurrency.lockutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Acquiring lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 23 09:38:02 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:38:02.755 231725 DEBUG oslo_concurrency.lockutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Acquired lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 23 09:38:02 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:38:02.755 231725 DEBUG nova.network.neutron [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 23 09:38:02 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:38:02.756 231725 DEBUG nova.objects.instance [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c2a7d92b-952f-46a7-8a6a-3322a48fcf4b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 23 09:38:02 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.
Feb 23 09:38:02 np0005626463.localdomain podman[276080]: 2026-02-23 09:38:02.918313634 +0000 UTC m=+0.092481377 container health_status 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, distribution-scope=public, architecture=x86_64, config_id=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., build-date=2026-02-05T04:57:10Z, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., io.openshift.expose-services=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.7, name=ubi9/ubi-minimal, org.opencontainers.image.created=2026-02-05T04:57:10Z, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, release=1770267347)
Feb 23 09:38:02 np0005626463.localdomain podman[276080]: 2026-02-23 09:38:02.955857555 +0000 UTC m=+0.130025328 container exec_died 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, maintainer=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, org.opencontainers.image.created=2026-02-05T04:57:10Z, url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9/ubi-minimal, distribution-scope=public, architecture=x86_64, io.openshift.expose-services=, container_name=openstack_network_exporter, release=1770267347, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2026-02-05T04:57:10Z, vendor=Red Hat, Inc., vcs-type=git, managed_by=edpm_ansible, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.7)
Feb 23 09:38:02 np0005626463.localdomain systemd[1]: 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.service: Deactivated successfully.
Feb 23 09:38:03 np0005626463.localdomain sudo[276100]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 09:38:03 np0005626463.localdomain sudo[276100]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:38:03 np0005626463.localdomain sudo[276100]: pam_unix(sudo:session): session closed for user root
Feb 23 09:38:03 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:38:03.490 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:38:04 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:38:04.661 231725 DEBUG nova.network.neutron [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Updating instance_info_cache with network_info: [{"id": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "address": "fa:16:3e:a0:9d:00", "network": {"id": "9da5b53d-3184-450f-9a5b-bdba1a6c9f6d", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "37b8098efb0d4ecc90b451a2db0e966f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa27e5011-20", "ovs_interfaceid": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 23 09:38:04 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:38:04.688 231725 DEBUG oslo_concurrency.lockutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Releasing lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 23 09:38:04 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:38:04.688 231725 DEBUG nova.compute.manager [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 23 09:38:04 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:38:04.689 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:38:04 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:38:04.690 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:38:04 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:38:04.690 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:38:05 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:38:05.555 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:38:05 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:38:05.556 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:38:05 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:38:05.582 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:38:05 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:38:05.582 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:38:05 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:38:05.604 231725 DEBUG oslo_concurrency.lockutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 09:38:05 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:38:05.604 231725 DEBUG oslo_concurrency.lockutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 09:38:05 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:38:05.605 231725 DEBUG oslo_concurrency.lockutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 09:38:05 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:38:05.605 231725 DEBUG nova.compute.resource_tracker [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Auditing locally available compute resources for np0005626463.localdomain (node: np0005626463.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 23 09:38:05 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:38:05.605 231725 DEBUG oslo_concurrency.processutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 09:38:06 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:38:06.058 231725 DEBUG oslo_concurrency.processutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.452s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 09:38:06 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:38:06.130 231725 DEBUG nova.virt.libvirt.driver [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 23 09:38:06 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:38:06.131 231725 DEBUG nova.virt.libvirt.driver [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 23 09:38:06 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:38:06.335 231725 WARNING nova.virt.libvirt.driver [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 23 09:38:06 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:38:06.336 231725 DEBUG nova.compute.resource_tracker [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Hypervisor/Node resource view: name=np0005626463.localdomain free_ram=12171MB free_disk=41.83688735961914GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 23 09:38:06 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:38:06.337 231725 DEBUG oslo_concurrency.lockutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 09:38:06 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:38:06.337 231725 DEBUG oslo_concurrency.lockutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 09:38:06 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:38:06.400 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:38:06 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:38:06.548 231725 DEBUG nova.compute.resource_tracker [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Instance c2a7d92b-952f-46a7-8a6a-3322a48fcf4b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 23 09:38:06 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:38:06.548 231725 DEBUG nova.compute.resource_tracker [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 23 09:38:06 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:38:06.549 231725 DEBUG nova.compute.resource_tracker [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Final resource view: name=np0005626463.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 23 09:38:06 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:38:06.620 231725 DEBUG nova.scheduler.client.report [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Refreshing inventories for resource provider be63d86c-a403-4ec9-a515-07ea2962cb4d _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Feb 23 09:38:06 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:38:06.693 231725 DEBUG nova.scheduler.client.report [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Updating ProviderTree inventory for provider be63d86c-a403-4ec9-a515-07ea2962cb4d from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Feb 23 09:38:06 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:38:06.693 231725 DEBUG nova.compute.provider_tree [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Updating inventory in ProviderTree for provider be63d86c-a403-4ec9-a515-07ea2962cb4d with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Feb 23 09:38:06 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:38:06.717 231725 DEBUG nova.scheduler.client.report [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Refreshing aggregate associations for resource provider be63d86c-a403-4ec9-a515-07ea2962cb4d, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Feb 23 09:38:06 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:38:06.744 231725 DEBUG nova.scheduler.client.report [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Refreshing trait associations for resource provider be63d86c-a403-4ec9-a515-07ea2962cb4d, traits: HW_CPU_X86_AMD_SVM,HW_CPU_X86_BMI,HW_CPU_X86_AESNI,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SVM,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_ABM,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_ACCELERATORS,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_F16C,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VOLUME_EXTEND,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_AVX2,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_TRUSTED_CERTS,COMPUTE_STORAGE_BUS_USB,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE2,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_AVX,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSE42,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_SSE4A,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_BMI2,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_FMA3,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_CLMUL,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NODE,HW_CPU_X86_SSSE3,HW_CPU_X86_SHA,HW_CPU_X86_SSE,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_MMX,HW_CPU_X86_SSE41,COMPUTE_VOLUME_ATTACH_WITH_TAG _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Feb 23 09:38:06 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:38:06.993 231725 DEBUG oslo_concurrency.processutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 09:38:07 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:38:07.440 231725 DEBUG oslo_concurrency.processutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.448s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 09:38:07 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:38:07.447 231725 DEBUG nova.compute.provider_tree [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Inventory has not changed in ProviderTree for provider: be63d86c-a403-4ec9-a515-07ea2962cb4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 23 09:38:07 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:38:07.475 231725 DEBUG nova.scheduler.client.report [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Inventory has not changed for provider be63d86c-a403-4ec9-a515-07ea2962cb4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 23 09:38:07 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:38:07.478 231725 DEBUG nova.compute.resource_tracker [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Compute_service record updated for np0005626463.localdomain:np0005626463.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 23 09:38:07 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:38:07.478 231725 DEBUG oslo_concurrency.lockutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.141s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 09:38:07 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.
Feb 23 09:38:07 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.
Feb 23 09:38:07 np0005626463.localdomain podman[276162]: 2026-02-23 09:38:07.907815412 +0000 UTC m=+0.079809570 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20260216, container_name=ovn_controller, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ovn_controller)
Feb 23 09:38:07 np0005626463.localdomain podman[276162]: 2026-02-23 09:38:07.97827991 +0000 UTC m=+0.150274078 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ovn_controller, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true)
Feb 23 09:38:07 np0005626463.localdomain podman[276163]: 2026-02-23 09:38:07.991825803 +0000 UTC m=+0.161025734 container health_status bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Feb 23 09:38:07 np0005626463.localdomain systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully.
Feb 23 09:38:07 np0005626463.localdomain podman[276163]: 2026-02-23 09:38:07.999298187 +0000 UTC m=+0.168498148 container exec_died bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Feb 23 09:38:08 np0005626463.localdomain systemd[1]: bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.service: Deactivated successfully.
Feb 23 09:38:08 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:38:08.492 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:38:09 np0005626463.localdomain podman[242954]: time="2026-02-23T09:38:09Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 23 09:38:09 np0005626463.localdomain podman[242954]: @ - - [23/Feb/2026:09:38:09 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 149683 "" "Go-http-client/1.1"
Feb 23 09:38:09 np0005626463.localdomain podman[242954]: @ - - [23/Feb/2026:09:38:09 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16792 "" "Go-http-client/1.1"
Feb 23 09:38:09 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64122 DF PROTO=TCP SPT=56114 DPT=9102 SEQ=276046667 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BFC64070000000001030307) 
Feb 23 09:38:09 np0005626463.localdomain sudo[276298]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rgtxkhcbjcgvhzdcbetfidmnbwpqwahy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839489.1287177-2274-141221957844630/AnsiballZ_getent.py
Feb 23 09:38:09 np0005626463.localdomain sudo[276298]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:38:09 np0005626463.localdomain python3.9[276300]: ansible-ansible.builtin.getent Invoked with database=passwd key=nova fail_key=True service=None split=None
Feb 23 09:38:09 np0005626463.localdomain sudo[276298]: pam_unix(sudo:session): session closed for user root
Feb 23 09:38:11 np0005626463.localdomain sshd[276319]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:38:11 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.
Feb 23 09:38:11 np0005626463.localdomain sshd[276319]: Accepted publickey for zuul from 192.168.122.30 port 49248 ssh2: RSA SHA256:/ShS2J5Dq7o9P59e/NmgQORSAcJOBwu46Huo03HBdB4
Feb 23 09:38:11 np0005626463.localdomain systemd-logind[759]: New session 60 of user zuul.
Feb 23 09:38:11 np0005626463.localdomain systemd[1]: Started Session 60 of User zuul.
Feb 23 09:38:11 np0005626463.localdomain sshd[276319]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Feb 23 09:38:11 np0005626463.localdomain podman[276321]: 2026-02-23 09:38:11.266961958 +0000 UTC m=+0.095633174 container health_status be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team)
Feb 23 09:38:11 np0005626463.localdomain podman[276321]: 2026-02-23 09:38:11.307432341 +0000 UTC m=+0.136103527 container exec_died be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Feb 23 09:38:11 np0005626463.localdomain systemd[1]: be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.service: Deactivated successfully.
Feb 23 09:38:11 np0005626463.localdomain sshd[276334]: Received disconnect from 192.168.122.30 port 49248:11: disconnected by user
Feb 23 09:38:11 np0005626463.localdomain sshd[276334]: Disconnected from user zuul 192.168.122.30 port 49248
Feb 23 09:38:11 np0005626463.localdomain sshd[276319]: pam_unix(sshd:session): session closed for user zuul
Feb 23 09:38:11 np0005626463.localdomain systemd[1]: session-60.scope: Deactivated successfully.
Feb 23 09:38:11 np0005626463.localdomain systemd-logind[759]: Session 60 logged out. Waiting for processes to exit.
Feb 23 09:38:11 np0005626463.localdomain systemd-logind[759]: Removed session 60.
Feb 23 09:38:11 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:38:11.440 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:38:12 np0005626463.localdomain python3.9[276450]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/nova/nova-blank.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:38:12 np0005626463.localdomain python3.9[276505]: ansible-ansible.legacy.file Invoked with mode=0644 setype=container_file_t dest=/var/lib/openstack/nova/nova-blank.conf _original_basename=nova-blank.conf recurse=False state=file path=/var/lib/openstack/nova/nova-blank.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 23 09:38:12 np0005626463.localdomain python3.9[276613]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/nova/ssh-config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:38:13 np0005626463.localdomain openstack_network_exporter[245358]: ERROR   09:38:13 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 23 09:38:13 np0005626463.localdomain openstack_network_exporter[245358]: 
Feb 23 09:38:13 np0005626463.localdomain openstack_network_exporter[245358]: ERROR   09:38:13 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 23 09:38:13 np0005626463.localdomain openstack_network_exporter[245358]: 
Feb 23 09:38:13 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:38:13.494 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:38:13 np0005626463.localdomain python3.9[276699]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/nova/ssh-config mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771839492.5864537-2355-125696160685702/.source _original_basename=ssh-config follow=False checksum=4297f735c41bdc1ff52d72e6f623a02242f37958 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 23 09:38:14 np0005626463.localdomain python3.9[276808]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/nova/nova_statedir_ownership.py follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:38:14 np0005626463.localdomain sshd[276809]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:38:14 np0005626463.localdomain sshd[276809]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:38:14 np0005626463.localdomain python3.9[276896]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/nova/nova_statedir_ownership.py mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771839493.672255-2355-194442543401136/.source.py _original_basename=nova_statedir_ownership.py follow=False checksum=c6c8a3cfefa5efd60ceb1408c4e977becedb71e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 23 09:38:15 np0005626463.localdomain python3.9[277004]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/nova/run-on-host follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:38:15 np0005626463.localdomain python3.9[277090]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/nova/run-on-host mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771839494.7546844-2355-278485605091449/.source _original_basename=run-on-host follow=False checksum=93aba8edc83d5878604a66d37fea2f12b60bdea2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 23 09:38:16 np0005626463.localdomain python3.9[277198]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/nova/02-nova-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:38:16 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:38:16.482 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:38:16 np0005626463.localdomain python3.9[277284]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/nova/02-nova-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771839496.018215-2517-249906156195919/.source.conf follow=False _original_basename=02-nova-host-specific.conf.j2 checksum=3012482a375a6db0cadffa2656b647c3720d54e9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 23 09:38:17 np0005626463.localdomain sudo[277392]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bkibbuuhvtczstlgiepslfhibyqnvvge ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839497.1710558-2562-75914800359763/AnsiballZ_file.py
Feb 23 09:38:17 np0005626463.localdomain sudo[277392]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:38:17 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.
Feb 23 09:38:17 np0005626463.localdomain systemd[1]: tmp-crun.gm4S5U.mount: Deactivated successfully.
Feb 23 09:38:17 np0005626463.localdomain podman[277395]: 2026-02-23 09:38:17.546964565 +0000 UTC m=+0.086525870 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Feb 23 09:38:17 np0005626463.localdomain podman[277395]: 2026-02-23 09:38:17.577050584 +0000 UTC m=+0.116611899 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, tcib_managed=true, container_name=ovn_metadata_agent)
Feb 23 09:38:17 np0005626463.localdomain systemd[1]: 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully.
Feb 23 09:38:17 np0005626463.localdomain python3.9[277394]: ansible-ansible.builtin.file Invoked with group=nova mode=0700 owner=nova path=/home/nova/.ssh state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:38:17 np0005626463.localdomain sudo[277392]: pam_unix(sudo:session): session closed for user root
Feb 23 09:38:18 np0005626463.localdomain sudo[277521]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ohvcfalapwxeafjxwtlyvxzatjvplrwz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839497.8189738-2586-13177297136248/AnsiballZ_copy.py
Feb 23 09:38:18 np0005626463.localdomain sudo[277521]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:38:18 np0005626463.localdomain python3.9[277523]: ansible-ansible.legacy.copy Invoked with dest=/home/nova/.ssh/authorized_keys group=nova mode=0600 owner=nova remote_src=True src=/var/lib/openstack/nova/ssh-publickey backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:38:18 np0005626463.localdomain sudo[277521]: pam_unix(sudo:session): session closed for user root
Feb 23 09:38:18 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:38:18.530 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:38:18 np0005626463.localdomain sshd[277557]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:38:19 np0005626463.localdomain sshd[277557]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:38:19 np0005626463.localdomain sudo[277633]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-suxithfjncaepmuhmwmjntwthrzobpnr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839498.5318587-2610-82055480933972/AnsiballZ_stat.py
Feb 23 09:38:19 np0005626463.localdomain sudo[277633]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:38:19 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:38:19.565 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:38:19 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:38:19.587 231725 DEBUG nova.compute.manager [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Triggering sync for uuid c2a7d92b-952f-46a7-8a6a-3322a48fcf4b _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Feb 23 09:38:19 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:38:19.588 231725 DEBUG oslo_concurrency.lockutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Acquiring lock "c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 09:38:19 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:38:19.588 231725 DEBUG oslo_concurrency.lockutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Lock "c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 09:38:19 np0005626463.localdomain python3.9[277635]: ansible-ansible.builtin.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 23 09:38:19 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:38:19.629 231725 DEBUG oslo_concurrency.lockutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Lock "c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.041s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 09:38:19 np0005626463.localdomain sudo[277633]: pam_unix(sudo:session): session closed for user root
Feb 23 09:38:20 np0005626463.localdomain sudo[277745]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-feizzwtbkkrwwmswkdnsmszolmomuabs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839499.856313-2637-169928617972227/AnsiballZ_file.py
Feb 23 09:38:20 np0005626463.localdomain sudo[277745]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:38:20 np0005626463.localdomain python3.9[277747]: ansible-ansible.builtin.file Invoked with group=nova mode=0400 owner=nova path=/var/lib/nova/compute_id state=file recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:38:20 np0005626463.localdomain sudo[277745]: pam_unix(sudo:session): session closed for user root
Feb 23 09:38:21 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:38:21.513 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:38:21 np0005626463.localdomain python3.9[277855]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 23 09:38:22 np0005626463.localdomain sudo[277965]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-khtnbmbywuvwttaxjygngvczxjiysftw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839502.2777128-2694-106707065608049/AnsiballZ_file.py
Feb 23 09:38:22 np0005626463.localdomain sudo[277965]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:38:22 np0005626463.localdomain python3.9[277967]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:38:22 np0005626463.localdomain sudo[277965]: pam_unix(sudo:session): session closed for user root
Feb 23 09:38:23 np0005626463.localdomain sudo[278075]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xhphsdvihnhjmgmbcqgcsgokxqtilgce ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839502.9655595-2718-280812735867352/AnsiballZ_file.py
Feb 23 09:38:23 np0005626463.localdomain sudo[278075]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:38:23 np0005626463.localdomain python3.9[278077]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 23 09:38:23 np0005626463.localdomain sudo[278075]: pam_unix(sudo:session): session closed for user root
Feb 23 09:38:23 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:38:23.569 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:38:24 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23718 DF PROTO=TCP SPT=54442 DPT=9102 SEQ=1737972057 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BFC9CE00000000001030307) 
Feb 23 09:38:24 np0005626463.localdomain python3.9[278185]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/nova_compute_init state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:38:24 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.
Feb 23 09:38:24 np0005626463.localdomain podman[278285]: 2026-02-23 09:38:24.908323699 +0000 UTC m=+0.079706419 container health_status da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 23 09:38:24 np0005626463.localdomain podman[278285]: 2026-02-23 09:38:24.918484735 +0000 UTC m=+0.089867425 container exec_died da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 23 09:38:24 np0005626463.localdomain systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Deactivated successfully.
Feb 23 09:38:25 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23719 DF PROTO=TCP SPT=54442 DPT=9102 SEQ=1737972057 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BFCA1060000000001030307) 
Feb 23 09:38:25 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64123 DF PROTO=TCP SPT=56114 DPT=9102 SEQ=276046667 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BFCA4060000000001030307) 
Feb 23 09:38:26 np0005626463.localdomain sudo[278512]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tqsyhwujovswxwbqqtqsepabkfvuajrm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839506.0534248-2820-225743860772966/AnsiballZ_container_config_data.py
Feb 23 09:38:26 np0005626463.localdomain sudo[278512]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:38:26 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:38:26.536 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:38:26 np0005626463.localdomain python3.9[278514]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/nova_compute_init config_pattern=*.json debug=False
Feb 23 09:38:26 np0005626463.localdomain sudo[278512]: pam_unix(sudo:session): session closed for user root
Feb 23 09:38:27 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23720 DF PROTO=TCP SPT=54442 DPT=9102 SEQ=1737972057 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BFCA9060000000001030307) 
Feb 23 09:38:27 np0005626463.localdomain sudo[278622]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pefllduyuhwlnkevtpfxvtrtgbeylded ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839507.0988991-2853-89514528043638/AnsiballZ_container_config_hash.py
Feb 23 09:38:27 np0005626463.localdomain sudo[278622]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:38:27 np0005626463.localdomain python3.9[278624]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Feb 23 09:38:27 np0005626463.localdomain sudo[278622]: pam_unix(sudo:session): session closed for user root
Feb 23 09:38:27 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29767 DF PROTO=TCP SPT=56496 DPT=9102 SEQ=2296097628 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BFCAC060000000001030307) 
Feb 23 09:38:28 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:38:28.620 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:38:28 np0005626463.localdomain sudo[278732]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qhnqgwhacqovrtrbalqdkloymnqdyucr ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1771839508.1894562-2883-144263212466000/AnsiballZ_edpm_container_manage.py
Feb 23 09:38:28 np0005626463.localdomain sudo[278732]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:38:28 np0005626463.localdomain python3[278734]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/nova_compute_init config_id=nova_compute_init config_overrides={} config_patterns=*.json containers=['nova_compute_init'] log_base_path=/var/log/containers/stdouts debug=False
Feb 23 09:38:29 np0005626463.localdomain python3[278734]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: [
                                                               {
                                                                    "Id": "72feed39d002da96e9458f5df3225bc8b72f1ae28f906a4ea01e253f86aab9e3",
                                                                    "Digest": "sha256:60339e5e0cd7bfe18718bee79174c18ef91b932586fd96f01b9799d5d120385d",
                                                                    "RepoTags": [
                                                                         "quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified"
                                                                    ],
                                                                    "RepoDigests": [
                                                                         "quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:60339e5e0cd7bfe18718bee79174c18ef91b932586fd96f01b9799d5d120385d"
                                                                    ],
                                                                    "Parent": "",
                                                                    "Comment": "",
                                                                    "Created": "2026-02-23T06:27:42.035349623Z",
                                                                    "Config": {
                                                                         "User": "nova",
                                                                         "Env": [
                                                                              "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",
                                                                              "LANG=en_US.UTF-8",
                                                                              "TZ=UTC",
                                                                              "container=oci"
                                                                         ],
                                                                         "Entrypoint": [
                                                                              "dumb-init",
                                                                              "--single-child",
                                                                              "--"
                                                                         ],
                                                                         "Cmd": [
                                                                              "kolla_start"
                                                                         ],
                                                                         "Labels": {
                                                                              "io.buildah.version": "1.43.0",
                                                                              "maintainer": "OpenStack Kubernetes Operator team",
                                                                              "org.label-schema.build-date": "20260216",
                                                                              "org.label-schema.license": "GPLv2",
                                                                              "org.label-schema.name": "CentOS Stream 9 Base Image",
                                                                              "org.label-schema.schema-version": "1.0",
                                                                              "org.label-schema.vendor": "CentOS",
                                                                              "tcib_build_tag": "8419493e1fd846703d277695e03fc5eb",
                                                                              "tcib_managed": "true"
                                                                         },
                                                                         "StopSignal": "SIGTERM"
                                                                    },
                                                                    "Version": "",
                                                                    "Author": "",
                                                                    "Architecture": "amd64",
                                                                    "Os": "linux",
                                                                    "Size": 1216089983,
                                                                    "VirtualSize": 1216089983,
                                                                    "GraphDriver": {
                                                                         "Name": "overlay",
                                                                         "Data": {
                                                                              "LowerDir": "/var/lib/containers/storage/overlay/239567307c66a10c4dd721df6a9263fcc38501437d275d2b4907c616b635d111/diff:/var/lib/containers/storage/overlay/0455f1f13172510bfb03afa514ad1dc5f28a2039a4c0ae85e44e0cde63814ca4/diff:/var/lib/containers/storage/overlay/882df85a0cf43e46bc799aafd5ff81035654b304c2fef5dbd26c9dd0c2e9fcc3/diff:/var/lib/containers/storage/overlay/d9f14c75a7289cf010d2e5175c554193dba109f864fe39fc418f3bc5b90efe9d/diff",
                                                                              "UpperDir": "/var/lib/containers/storage/overlay/7e5a6b3af0e35b266ef2f57ba1f524615772066427004655f3e99c4f9072865c/diff",
                                                                              "WorkDir": "/var/lib/containers/storage/overlay/7e5a6b3af0e35b266ef2f57ba1f524615772066427004655f3e99c4f9072865c/work"
                                                                         }
                                                                    },
                                                                    "RootFS": {
                                                                         "Type": "layers",
                                                                         "Layers": [
                                                                              "sha256:d9f14c75a7289cf010d2e5175c554193dba109f864fe39fc418f3bc5b90efe9d",
                                                                              "sha256:6eb5d45c6942983139aec78264b4b68bafe46465bb40e2bb4c09e78dad8ba6c0",
                                                                              "sha256:9a59f9675e4fdfdb0eaa24dcce26bed374feef6430ea888b6f5ef1274a95bd90",
                                                                              "sha256:5511acb0625eca242fd47549a8bafd7826358a029c48a9158ddd6fa2b7e0b86d",
                                                                              "sha256:1f1e90f8b2058c74071fe0298f6d20f4d1edbde3bdd940d26fcd35c036f677a8"
                                                                         ]
                                                                    },
                                                                    "Labels": {
                                                                         "io.buildah.version": "1.43.0",
                                                                         "maintainer": "OpenStack Kubernetes Operator team",
                                                                         "org.label-schema.build-date": "20260216",
                                                                         "org.label-schema.license": "GPLv2",
                                                                         "org.label-schema.name": "CentOS Stream 9 Base Image",
                                                                         "org.label-schema.schema-version": "1.0",
                                                                         "org.label-schema.vendor": "CentOS",
                                                                         "tcib_build_tag": "8419493e1fd846703d277695e03fc5eb",
                                                                         "tcib_managed": "true"
                                                                    },
                                                                    "Annotations": {},
                                                                    "ManifestType": "application/vnd.docker.distribution.manifest.v2+json",
                                                                    "User": "nova",
                                                                    "History": [
                                                                         {
                                                                              "created": "2026-02-17T01:25:07.246646992Z",
                                                                              "created_by": "/bin/sh -c #(nop) ADD file:d064f128d9bf147a386d5c0e8c2e8a6f698c81fb4e2404e09afe5ef1e1d3b529 in / ",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-17T01:25:07.246739119Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL org.label-schema.schema-version=\"1.0\"     org.label-schema.name=\"CentOS Stream 9 Base Image\"     org.label-schema.vendor=\"CentOS\"     org.label-schema.license=\"GPLv2\"     org.label-schema.build-date=\"20260216\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-17T01:25:12.132997501Z",
                                                                              "created_by": "/bin/sh -c #(nop) CMD [\"/bin/bash\"]"
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:08:39.081651802Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL maintainer=\"OpenStack Kubernetes Operator team\"",
                                                                              "comment": "FROM quay.io/centos/centos:stream9",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:08:39.081666472Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL tcib_managed=true",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:08:39.081677733Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV LANG=\"en_US.UTF-8\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:08:39.081688343Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV TZ=\"UTC\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:08:39.081701553Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV container=\"oci\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:08:39.081710413Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:08:39.413481757Z",
                                                                              "created_by": "/bin/sh -c if [ -f \"/etc/yum.repos.d/ubi.repo\" ]; then rm -f /etc/yum.repos.d/ubi.repo && dnf clean all && rm -rf /var/cache/dnf; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:09:13.490649497Z",
                                                                              "created_by": "/bin/sh -c dnf install -y crudini && crudini --del /etc/dnf/dnf.conf main override_install_langs && crudini --set /etc/dnf/dnf.conf main clean_requirements_on_remove True && crudini --set /etc/dnf/dnf.conf main exactarch 1 && crudini --set /etc/dnf/dnf.conf main gpgcheck 1 && crudini --set /etc/dnf/dnf.conf main install_weak_deps False && if [ 'centos' == 'centos' ];then crudini --set /etc/dnf/dnf.conf main best False; fi && crudini --set /etc/dnf/dnf.conf main installonly_limit 0 && crudini --set /etc/dnf/dnf.conf main keepcache 0 && crudini --set /etc/dnf/dnf.conf main obsoletes 1 && crudini --set /etc/dnf/dnf.conf main plugins 1 && crudini --set /etc/dnf/dnf.conf main skip_missing_names_on_install False && crudini --set /etc/dnf/dnf.conf main tsflags nodocs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:09:16.454967918Z",
                                                                              "created_by": "/bin/sh -c dnf install -y ca-certificates dumb-init glibc-langpack-en procps-ng python3 sudo util-linux-user which python-tcib-containers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:09:16.773383448Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/uid_gid_manage.sh /usr/local/bin/uid_gid_manage",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:09:17.106005079Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/uid_gid_manage",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:09:17.70903377Z",
                                                                              "created_by": "/bin/sh -c bash /usr/local/bin/uid_gid_manage kolla hugetlbfs libvirt qemu",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:09:18.031262928Z",
                                                                              "created_by": "/bin/sh -c touch /usr/local/bin/kolla_extend_start && chmod 755 /usr/local/bin/kolla_extend_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:09:18.339397779Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/set_configs.py /usr/local/bin/kolla_set_configs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:09:18.685304171Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_set_configs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:09:18.995385131Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/start.sh /usr/local/bin/kolla_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:09:19.318437706Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:09:19.622355571Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/httpd_setup.sh /usr/local/bin/kolla_httpd_setup",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:09:19.942779192Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_httpd_setup",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:09:20.272959154Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/copy_cacerts.sh /usr/local/bin/kolla_copy_cacerts",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:09:20.574527009Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_copy_cacerts",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:09:20.904983206Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/sudoers /etc/sudoers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:09:21.231560784Z",
                                                                              "created_by": "/bin/sh -c chmod 440 /etc/sudoers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:09:21.544724487Z",
                                                                              "created_by": "/bin/sh -c sed -ri '/^(passwd:|group:)/ s/systemd//g' /etc/nsswitch.conf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:09:24.726828741Z",
                                                                              "created_by": "/bin/sh -c dnf -y reinstall which && rpm -e --nodeps tzdata && dnf -y install tzdata",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:09:25.052065401Z",
                                                                              "created_by": "/bin/sh -c if [ ! -f \"/etc/localtime\" ]; then ln -s /usr/share/zoneinfo/Etc/UTC /etc/localtime; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:09:25.374537445Z",
                                                                              "created_by": "/bin/sh -c mkdir -p /openstack",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:09:26.855611087Z",
                                                                              "created_by": "/bin/sh -c if [ 'centos' == 'centos' ];then if [ -n \"$(rpm -qa redhat-release)\" ];then rpm -e --nodeps redhat-release; fi ; dnf -y install centos-stream-release; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:09:28.628718632Z",
                                                                              "created_by": "/bin/sh -c dnf update --excludepkgs redhat-release -y && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:09:28.628779184Z",
                                                                              "created_by": "/bin/sh -c #(nop) STOPSIGNAL SIGTERM",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:09:28.628797064Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENTRYPOINT [\"dumb-init\", \"--single-child\", \"--\"]",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:09:28.628808854Z",
                                                                              "created_by": "/bin/sh -c #(nop) CMD [\"kolla_start\"]",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:09:29.517110337Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"8419493e1fd846703d277695e03fc5eb\""
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:11:21.746093163Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-base:8419493e1fd846703d277695e03fc5eb",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:11:58.628150825Z",
                                                                              "created_by": "/bin/sh -c dnf install -y python3-barbicanclient python3-cinderclient python3-designateclient python3-glanceclient python3-ironicclient python3-keystoneclient python3-manilaclient python3-neutronclient python3-novaclient python3-observabilityclient python3-octaviaclient python3-openstackclient python3-swiftclient python3-pymemcache && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:12:01.105956567Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"8419493e1fd846703d277695e03fc5eb\""
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:16:14.411074144Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-os:8419493e1fd846703d277695e03fc5eb",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:16:16.679986066Z",
                                                                              "created_by": "/bin/sh -c bash /usr/local/bin/uid_gid_manage nova",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:16:17.108151361Z",
                                                                              "created_by": "/bin/sh -c mkdir -p /etc/ssh && touch /etc/ssh/ssh_known_host",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:17:25.131733428Z",
                                                                              "created_by": "/bin/sh -c dnf install -y openstack-nova-common && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:17:29.831104887Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"8419493e1fd846703d277695e03fc5eb\""
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:26:11.726944348Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-nova-base:8419493e1fd846703d277695e03fc5eb",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:27:41.361948209Z",
                                                                              "created_by": "/bin/sh -c dnf -y install e2fsprogs xfsprogs xorriso iscsi-initiator-utils nfs-utils targetcli nvme-cli device-mapper-multipath ceph-common openssh-clients openstack-nova-compute openvswitch swtpm swtpm-tools && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:27:41.720772563Z",
                                                                              "created_by": "/bin/sh -c bash /usr/local/bin/uid_gid_manage nova",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:27:42.031893078Z",
                                                                              "created_by": "/bin/sh -c rm -f /etc/machine-id",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:27:42.03195279Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER nova",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:27:47.46658157Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"8419493e1fd846703d277695e03fc5eb\""
                                                                         }
                                                                    ],
                                                                    "NamesHistory": [
                                                                         "quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified"
                                                                    ]
                                                               }
                                                          ]
                                                          : quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Feb 23 09:38:29 np0005626463.localdomain podman[278784]: 2026-02-23 09:38:29.275360598 +0000 UTC m=+0.068392265 container remove 29fbf3e6d165ac37a2073e9d11df0954b5b34530c2d4564677cda92707f802aa (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=nova_compute_init, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False, 'EDPM_CONFIG_HASH': '5f6dcf25a4eb712d7b55775b8e130167254d53f3f84c8303f8f39f30426e780b'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'none', 'privileged': False, 'restart': 'never', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, container_name=nova_compute_init, managed_by=edpm_ansible)
Feb 23 09:38:29 np0005626463.localdomain python3[278734]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman rm --force nova_compute_init
Feb 23 09:38:29 np0005626463.localdomain podman[278798]: 
Feb 23 09:38:29 np0005626463.localdomain podman[278798]: 2026-02-23 09:38:29.383198232 +0000 UTC m=+0.088859203 container create 0dcfdba09d286eb3abcbc0c0350212b6da0b7ea5d4e3ec4a127be329988c054e (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, container_name=nova_compute_init, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False, 'EDPM_CONFIG_HASH': '3424c3acd670eb930ca235b189e22cc85d2902412aa65355a03bde52550f3369'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'none', 'privileged': False, 'restart': 'never', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, config_id=nova_compute_init, tcib_managed=true, org.label-schema.build-date=20260216, io.buildah.version=1.43.0)
Feb 23 09:38:29 np0005626463.localdomain podman[278798]: 2026-02-23 09:38:29.340094487 +0000 UTC m=+0.045755458 image pull  quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Feb 23 09:38:29 np0005626463.localdomain python3[278734]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute_init --conmon-pidfile /run/nova_compute_init.pid --env NOVA_STATEDIR_OWNERSHIP_SKIP=/var/lib/nova/compute_id --env __OS_DEBUG=False --env EDPM_CONFIG_HASH=3424c3acd670eb930ca235b189e22cc85d2902412aa65355a03bde52550f3369 --label config_id=nova_compute_init --label container_name=nova_compute_init --label managed_by=edpm_ansible --label config_data={'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False, 'EDPM_CONFIG_HASH': '3424c3acd670eb930ca235b189e22cc85d2902412aa65355a03bde52550f3369'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'none', 'privileged': False, 'restart': 'never', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']} --log-driver journald --log-level info --network none --privileged=False --security-opt label=disable --user root --volume /dev/log:/dev/log --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z --volume /var/lib/openstack/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init
Feb 23 09:38:29 np0005626463.localdomain sudo[278732]: pam_unix(sudo:session): session closed for user root
Feb 23 09:38:30 np0005626463.localdomain sudo[278944]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pezkrzjrplmiadcjdczqihssrwiiqyer ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839509.8809001-2907-142095672413028/AnsiballZ_stat.py
Feb 23 09:38:30 np0005626463.localdomain sudo[278944]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:38:30 np0005626463.localdomain python3.9[278946]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 23 09:38:30 np0005626463.localdomain sudo[278944]: pam_unix(sudo:session): session closed for user root
Feb 23 09:38:31 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23721 DF PROTO=TCP SPT=54442 DPT=9102 SEQ=1737972057 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BFCB8C70000000001030307) 
Feb 23 09:38:31 np0005626463.localdomain python3.9[279056]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Feb 23 09:38:31 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:38:31.566 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:38:32 np0005626463.localdomain sudo[279164]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mzqetxybjbiiqkmymvziigyxmtkifzmm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839512.207842-2988-172180113568438/AnsiballZ_stat.py
Feb 23 09:38:32 np0005626463.localdomain sudo[279164]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:38:32 np0005626463.localdomain python3.9[279166]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:38:32 np0005626463.localdomain sudo[279164]: pam_unix(sudo:session): session closed for user root
Feb 23 09:38:33 np0005626463.localdomain sudo[279254]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fuumpwmkbgrkofllcyfpetbhuphmlrgv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839512.207842-2988-172180113568438/AnsiballZ_copy.py
Feb 23 09:38:33 np0005626463.localdomain sudo[279254]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:38:33 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.
Feb 23 09:38:33 np0005626463.localdomain systemd[1]: tmp-crun.SInQ7A.mount: Deactivated successfully.
Feb 23 09:38:33 np0005626463.localdomain podman[279257]: 2026-02-23 09:38:33.153421592 +0000 UTC m=+0.100590770 container health_status 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, architecture=x86_64, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter, distribution-scope=public, io.openshift.expose-services=, release=1770267347, name=ubi9/ubi-minimal, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2026-02-05T04:57:10Z, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-02-05T04:57:10Z, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=9.7, io.openshift.tags=minimal rhel9, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c)
Feb 23 09:38:33 np0005626463.localdomain podman[279257]: 2026-02-23 09:38:33.193527993 +0000 UTC m=+0.140697101 container exec_died 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1770267347, maintainer=Red Hat, Inc., build-date=2026-02-05T04:57:10Z, config_id=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, org.opencontainers.image.created=2026-02-05T04:57:10Z, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, version=9.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.buildah.version=1.33.7, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, container_name=openstack_network_exporter, managed_by=edpm_ansible, name=ubi9/ubi-minimal, io.openshift.expose-services=)
Feb 23 09:38:33 np0005626463.localdomain systemd[1]: 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.service: Deactivated successfully.
Feb 23 09:38:33 np0005626463.localdomain python3.9[279256]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771839512.207842-2988-172180113568438/.source.yaml _original_basename=.tfw04s0h follow=False checksum=f9aa9ce623bd0367523b1516d0fd40e0aad40b65 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:38:33 np0005626463.localdomain sudo[279254]: pam_unix(sudo:session): session closed for user root
Feb 23 09:38:33 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:38:33.666 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:38:34 np0005626463.localdomain sudo[279384]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qwovllmpsobjwkrmewoxrkzbsoptobfk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839513.8341827-3039-26924426565520/AnsiballZ_file.py
Feb 23 09:38:34 np0005626463.localdomain sudo[279384]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:38:34 np0005626463.localdomain python3.9[279386]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:38:34 np0005626463.localdomain sudo[279384]: pam_unix(sudo:session): session closed for user root
Feb 23 09:38:34 np0005626463.localdomain sudo[279494]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zyritatwcghresezhhnrsndkcsnzltxr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839514.6687212-3063-90685414217255/AnsiballZ_file.py
Feb 23 09:38:34 np0005626463.localdomain sudo[279494]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:38:35 np0005626463.localdomain python3.9[279496]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 23 09:38:35 np0005626463.localdomain sudo[279494]: pam_unix(sudo:session): session closed for user root
Feb 23 09:38:36 np0005626463.localdomain sudo[279604]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-prizehfotcazgbatxzhcmbprqsuceowg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839515.344682-3087-220327308554728/AnsiballZ_stat.py
Feb 23 09:38:36 np0005626463.localdomain sudo[279604]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:38:36 np0005626463.localdomain python3.9[279606]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/nova_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:38:36 np0005626463.localdomain sudo[279604]: pam_unix(sudo:session): session closed for user root
Feb 23 09:38:36 np0005626463.localdomain sudo[279661]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pjrvvtslfgzpiixdubdbzrpwiolutrgo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839515.344682-3087-220327308554728/AnsiballZ_file.py
Feb 23 09:38:36 np0005626463.localdomain sudo[279661]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:38:36 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:38:36.614 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:38:36 np0005626463.localdomain python3.9[279663]: ansible-ansible.legacy.file Invoked with mode=0600 dest=/var/lib/kolla/config_files/nova_compute.json _original_basename=.slfkserr recurse=False state=file path=/var/lib/kolla/config_files/nova_compute.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:38:36 np0005626463.localdomain sudo[279661]: pam_unix(sudo:session): session closed for user root
Feb 23 09:38:37 np0005626463.localdomain sshd[279697]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:38:37 np0005626463.localdomain sshd[279697]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:38:37 np0005626463.localdomain python3.9[279773]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/nova_compute state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:38:38 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:38:38.701 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:38:38 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.
Feb 23 09:38:38 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.
Feb 23 09:38:38 np0005626463.localdomain podman[279898]: 2026-02-23 09:38:38.920063184 +0000 UTC m=+0.081534495 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.vendor=CentOS, container_name=ovn_controller, managed_by=edpm_ansible, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Feb 23 09:38:38 np0005626463.localdomain podman[279898]: 2026-02-23 09:38:38.965328936 +0000 UTC m=+0.126800217 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.43.0, managed_by=edpm_ansible, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Feb 23 09:38:38 np0005626463.localdomain systemd[1]: tmp-crun.TiUc46.mount: Deactivated successfully.
Feb 23 09:38:38 np0005626463.localdomain systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully.
Feb 23 09:38:38 np0005626463.localdomain podman[279901]: 2026-02-23 09:38:38.987990563 +0000 UTC m=+0.145423648 container health_status bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 23 09:38:39 np0005626463.localdomain podman[279901]: 2026-02-23 09:38:39.001251447 +0000 UTC m=+0.158684542 container exec_died bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Feb 23 09:38:39 np0005626463.localdomain systemd[1]: bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.service: Deactivated successfully.
Feb 23 09:38:39 np0005626463.localdomain podman[242954]: time="2026-02-23T09:38:39Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 23 09:38:39 np0005626463.localdomain podman[242954]: @ - - [23/Feb/2026:09:38:39 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 149689 "" "Go-http-client/1.1"
Feb 23 09:38:39 np0005626463.localdomain podman[242954]: @ - - [23/Feb/2026:09:38:39 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16790 "" "Go-http-client/1.1"
Feb 23 09:38:39 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23722 DF PROTO=TCP SPT=54442 DPT=9102 SEQ=1737972057 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BFCDA070000000001030307) 
Feb 23 09:38:39 np0005626463.localdomain sudo[280124]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jdlsxkpdsguryvxsjkjaisgfxpfeclbp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839519.6786757-3198-183127164466552/AnsiballZ_container_config_data.py
Feb 23 09:38:39 np0005626463.localdomain sudo[280124]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:38:40 np0005626463.localdomain python3.9[280126]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/nova_compute config_pattern=*.json debug=False
Feb 23 09:38:40 np0005626463.localdomain sudo[280124]: pam_unix(sudo:session): session closed for user root
Feb 23 09:38:40 np0005626463.localdomain sudo[280234]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-whfuvtogchnlwuuogqhbzkhstiwguslz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839520.6364753-3231-225295295014970/AnsiballZ_container_config_hash.py
Feb 23 09:38:40 np0005626463.localdomain sudo[280234]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:38:41 np0005626463.localdomain python3.9[280236]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Feb 23 09:38:41 np0005626463.localdomain sudo[280234]: pam_unix(sudo:session): session closed for user root
Feb 23 09:38:41 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:38:41.668 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:38:41 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.
Feb 23 09:38:41 np0005626463.localdomain sudo[280355]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nxcdsmmngmfqmxgxafxsouiclzjzhvlt ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1771839521.5914857-3261-71405112514734/AnsiballZ_edpm_container_manage.py
Feb 23 09:38:41 np0005626463.localdomain sudo[280355]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:38:41 np0005626463.localdomain systemd[1]: tmp-crun.Ejy2ld.mount: Deactivated successfully.
Feb 23 09:38:41 np0005626463.localdomain podman[280325]: 2026-02-23 09:38:41.921935893 +0000 UTC m=+0.098008489 container health_status be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ceilometer_agent_compute, tcib_build_tag=8419493e1fd846703d277695e03fc5eb)
Feb 23 09:38:41 np0005626463.localdomain podman[280325]: 2026-02-23 09:38:41.958487543 +0000 UTC m=+0.134560179 container exec_died be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute)
Feb 23 09:38:41 np0005626463.localdomain systemd[1]: be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.service: Deactivated successfully.
Feb 23 09:38:42 np0005626463.localdomain python3[280360]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/nova_compute config_id=nova_compute config_overrides={} config_patterns=*.json containers=['nova_compute'] log_base_path=/var/log/containers/stdouts debug=False
Feb 23 09:38:42 np0005626463.localdomain python3[280360]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: [
                                                               {
                                                                    "Id": "72feed39d002da96e9458f5df3225bc8b72f1ae28f906a4ea01e253f86aab9e3",
                                                                    "Digest": "sha256:60339e5e0cd7bfe18718bee79174c18ef91b932586fd96f01b9799d5d120385d",
                                                                    "RepoTags": [
                                                                         "quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified"
                                                                    ],
                                                                    "RepoDigests": [
                                                                         "quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:60339e5e0cd7bfe18718bee79174c18ef91b932586fd96f01b9799d5d120385d"
                                                                    ],
                                                                    "Parent": "",
                                                                    "Comment": "",
                                                                    "Created": "2026-02-23T06:27:42.035349623Z",
                                                                    "Config": {
                                                                         "User": "nova",
                                                                         "Env": [
                                                                              "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",
                                                                              "LANG=en_US.UTF-8",
                                                                              "TZ=UTC",
                                                                              "container=oci"
                                                                         ],
                                                                         "Entrypoint": [
                                                                              "dumb-init",
                                                                              "--single-child",
                                                                              "--"
                                                                         ],
                                                                         "Cmd": [
                                                                              "kolla_start"
                                                                         ],
                                                                         "Labels": {
                                                                              "io.buildah.version": "1.43.0",
                                                                              "maintainer": "OpenStack Kubernetes Operator team",
                                                                              "org.label-schema.build-date": "20260216",
                                                                              "org.label-schema.license": "GPLv2",
                                                                              "org.label-schema.name": "CentOS Stream 9 Base Image",
                                                                              "org.label-schema.schema-version": "1.0",
                                                                              "org.label-schema.vendor": "CentOS",
                                                                              "tcib_build_tag": "8419493e1fd846703d277695e03fc5eb",
                                                                              "tcib_managed": "true"
                                                                         },
                                                                         "StopSignal": "SIGTERM"
                                                                    },
                                                                    "Version": "",
                                                                    "Author": "",
                                                                    "Architecture": "amd64",
                                                                    "Os": "linux",
                                                                    "Size": 1216089983,
                                                                    "VirtualSize": 1216089983,
                                                                    "GraphDriver": {
                                                                         "Name": "overlay",
                                                                         "Data": {
                                                                              "LowerDir": "/var/lib/containers/storage/overlay/239567307c66a10c4dd721df6a9263fcc38501437d275d2b4907c616b635d111/diff:/var/lib/containers/storage/overlay/0455f1f13172510bfb03afa514ad1dc5f28a2039a4c0ae85e44e0cde63814ca4/diff:/var/lib/containers/storage/overlay/882df85a0cf43e46bc799aafd5ff81035654b304c2fef5dbd26c9dd0c2e9fcc3/diff:/var/lib/containers/storage/overlay/d9f14c75a7289cf010d2e5175c554193dba109f864fe39fc418f3bc5b90efe9d/diff",
                                                                              "UpperDir": "/var/lib/containers/storage/overlay/7e5a6b3af0e35b266ef2f57ba1f524615772066427004655f3e99c4f9072865c/diff",
                                                                              "WorkDir": "/var/lib/containers/storage/overlay/7e5a6b3af0e35b266ef2f57ba1f524615772066427004655f3e99c4f9072865c/work"
                                                                         }
                                                                    },
                                                                    "RootFS": {
                                                                         "Type": "layers",
                                                                         "Layers": [
                                                                              "sha256:d9f14c75a7289cf010d2e5175c554193dba109f864fe39fc418f3bc5b90efe9d",
                                                                              "sha256:6eb5d45c6942983139aec78264b4b68bafe46465bb40e2bb4c09e78dad8ba6c0",
                                                                              "sha256:9a59f9675e4fdfdb0eaa24dcce26bed374feef6430ea888b6f5ef1274a95bd90",
                                                                              "sha256:5511acb0625eca242fd47549a8bafd7826358a029c48a9158ddd6fa2b7e0b86d",
                                                                              "sha256:1f1e90f8b2058c74071fe0298f6d20f4d1edbde3bdd940d26fcd35c036f677a8"
                                                                         ]
                                                                    },
                                                                    "Labels": {
                                                                         "io.buildah.version": "1.43.0",
                                                                         "maintainer": "OpenStack Kubernetes Operator team",
                                                                         "org.label-schema.build-date": "20260216",
                                                                         "org.label-schema.license": "GPLv2",
                                                                         "org.label-schema.name": "CentOS Stream 9 Base Image",
                                                                         "org.label-schema.schema-version": "1.0",
                                                                         "org.label-schema.vendor": "CentOS",
                                                                         "tcib_build_tag": "8419493e1fd846703d277695e03fc5eb",
                                                                         "tcib_managed": "true"
                                                                    },
                                                                    "Annotations": {},
                                                                    "ManifestType": "application/vnd.docker.distribution.manifest.v2+json",
                                                                    "User": "nova",
                                                                    "History": [
                                                                         {
                                                                              "created": "2026-02-17T01:25:07.246646992Z",
                                                                              "created_by": "/bin/sh -c #(nop) ADD file:d064f128d9bf147a386d5c0e8c2e8a6f698c81fb4e2404e09afe5ef1e1d3b529 in / ",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-17T01:25:07.246739119Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL org.label-schema.schema-version=\"1.0\"     org.label-schema.name=\"CentOS Stream 9 Base Image\"     org.label-schema.vendor=\"CentOS\"     org.label-schema.license=\"GPLv2\"     org.label-schema.build-date=\"20260216\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-17T01:25:12.132997501Z",
                                                                              "created_by": "/bin/sh -c #(nop) CMD [\"/bin/bash\"]"
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:08:39.081651802Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL maintainer=\"OpenStack Kubernetes Operator team\"",
                                                                              "comment": "FROM quay.io/centos/centos:stream9",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:08:39.081666472Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL tcib_managed=true",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:08:39.081677733Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV LANG=\"en_US.UTF-8\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:08:39.081688343Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV TZ=\"UTC\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:08:39.081701553Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV container=\"oci\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:08:39.081710413Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:08:39.413481757Z",
                                                                              "created_by": "/bin/sh -c if [ -f \"/etc/yum.repos.d/ubi.repo\" ]; then rm -f /etc/yum.repos.d/ubi.repo && dnf clean all && rm -rf /var/cache/dnf; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:09:13.490649497Z",
                                                                              "created_by": "/bin/sh -c dnf install -y crudini && crudini --del /etc/dnf/dnf.conf main override_install_langs && crudini --set /etc/dnf/dnf.conf main clean_requirements_on_remove True && crudini --set /etc/dnf/dnf.conf main exactarch 1 && crudini --set /etc/dnf/dnf.conf main gpgcheck 1 && crudini --set /etc/dnf/dnf.conf main install_weak_deps False && if [ 'centos' == 'centos' ];then crudini --set /etc/dnf/dnf.conf main best False; fi && crudini --set /etc/dnf/dnf.conf main installonly_limit 0 && crudini --set /etc/dnf/dnf.conf main keepcache 0 && crudini --set /etc/dnf/dnf.conf main obsoletes 1 && crudini --set /etc/dnf/dnf.conf main plugins 1 && crudini --set /etc/dnf/dnf.conf main skip_missing_names_on_install False && crudini --set /etc/dnf/dnf.conf main tsflags nodocs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:09:16.454967918Z",
                                                                              "created_by": "/bin/sh -c dnf install -y ca-certificates dumb-init glibc-langpack-en procps-ng python3 sudo util-linux-user which python-tcib-containers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:09:16.773383448Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/uid_gid_manage.sh /usr/local/bin/uid_gid_manage",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:09:17.106005079Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/uid_gid_manage",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:09:17.70903377Z",
                                                                              "created_by": "/bin/sh -c bash /usr/local/bin/uid_gid_manage kolla hugetlbfs libvirt qemu",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:09:18.031262928Z",
                                                                              "created_by": "/bin/sh -c touch /usr/local/bin/kolla_extend_start && chmod 755 /usr/local/bin/kolla_extend_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:09:18.339397779Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/set_configs.py /usr/local/bin/kolla_set_configs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:09:18.685304171Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_set_configs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:09:18.995385131Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/start.sh /usr/local/bin/kolla_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:09:19.318437706Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:09:19.622355571Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/httpd_setup.sh /usr/local/bin/kolla_httpd_setup",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:09:19.942779192Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_httpd_setup",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:09:20.272959154Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/copy_cacerts.sh /usr/local/bin/kolla_copy_cacerts",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:09:20.574527009Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_copy_cacerts",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:09:20.904983206Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/sudoers /etc/sudoers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:09:21.231560784Z",
                                                                              "created_by": "/bin/sh -c chmod 440 /etc/sudoers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:09:21.544724487Z",
                                                                              "created_by": "/bin/sh -c sed -ri '/^(passwd:|group:)/ s/systemd//g' /etc/nsswitch.conf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:09:24.726828741Z",
                                                                              "created_by": "/bin/sh -c dnf -y reinstall which && rpm -e --nodeps tzdata && dnf -y install tzdata",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:09:25.052065401Z",
                                                                              "created_by": "/bin/sh -c if [ ! -f \"/etc/localtime\" ]; then ln -s /usr/share/zoneinfo/Etc/UTC /etc/localtime; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:09:25.374537445Z",
                                                                              "created_by": "/bin/sh -c mkdir -p /openstack",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:09:26.855611087Z",
                                                                              "created_by": "/bin/sh -c if [ 'centos' == 'centos' ];then if [ -n \"$(rpm -qa redhat-release)\" ];then rpm -e --nodeps redhat-release; fi ; dnf -y install centos-stream-release; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:09:28.628718632Z",
                                                                              "created_by": "/bin/sh -c dnf update --excludepkgs redhat-release -y && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:09:28.628779184Z",
                                                                              "created_by": "/bin/sh -c #(nop) STOPSIGNAL SIGTERM",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:09:28.628797064Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENTRYPOINT [\"dumb-init\", \"--single-child\", \"--\"]",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:09:28.628808854Z",
                                                                              "created_by": "/bin/sh -c #(nop) CMD [\"kolla_start\"]",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:09:29.517110337Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"8419493e1fd846703d277695e03fc5eb\""
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:11:21.746093163Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-base:8419493e1fd846703d277695e03fc5eb",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:11:58.628150825Z",
                                                                              "created_by": "/bin/sh -c dnf install -y python3-barbicanclient python3-cinderclient python3-designateclient python3-glanceclient python3-ironicclient python3-keystoneclient python3-manilaclient python3-neutronclient python3-novaclient python3-observabilityclient python3-octaviaclient python3-openstackclient python3-swiftclient python3-pymemcache && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:12:01.105956567Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"8419493e1fd846703d277695e03fc5eb\""
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:16:14.411074144Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-os:8419493e1fd846703d277695e03fc5eb",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:16:16.679986066Z",
                                                                              "created_by": "/bin/sh -c bash /usr/local/bin/uid_gid_manage nova",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:16:17.108151361Z",
                                                                              "created_by": "/bin/sh -c mkdir -p /etc/ssh && touch /etc/ssh/ssh_known_host",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:17:25.131733428Z",
                                                                              "created_by": "/bin/sh -c dnf install -y openstack-nova-common && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:17:29.831104887Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"8419493e1fd846703d277695e03fc5eb\""
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:26:11.726944348Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-nova-base:8419493e1fd846703d277695e03fc5eb",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:27:41.361948209Z",
                                                                              "created_by": "/bin/sh -c dnf -y install e2fsprogs xfsprogs xorriso iscsi-initiator-utils nfs-utils targetcli nvme-cli device-mapper-multipath ceph-common openssh-clients openstack-nova-compute openvswitch swtpm swtpm-tools && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:27:41.720772563Z",
                                                                              "created_by": "/bin/sh -c bash /usr/local/bin/uid_gid_manage nova",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:27:42.031893078Z",
                                                                              "created_by": "/bin/sh -c rm -f /etc/machine-id",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:27:42.03195279Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER nova",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-02-23T06:27:47.46658157Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"8419493e1fd846703d277695e03fc5eb\""
                                                                         }
                                                                    ],
                                                                    "NamesHistory": [
                                                                         "quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified"
                                                                    ]
                                                               }
                                                          ]
                                                          : quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Feb 23 09:38:42 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:38:42.576 231725 DEBUG oslo_privsep.comm [-] EOF on privsep read channel _reader_main /usr/lib/python3.9/site-packages/oslo_privsep/comm.py:170
Feb 23 09:38:43 np0005626463.localdomain openstack_network_exporter[245358]: ERROR   09:38:43 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 23 09:38:43 np0005626463.localdomain openstack_network_exporter[245358]: 
Feb 23 09:38:43 np0005626463.localdomain openstack_network_exporter[245358]: ERROR   09:38:43 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 23 09:38:43 np0005626463.localdomain openstack_network_exporter[245358]: 
Feb 23 09:38:43 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:38:43.702 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:38:46 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:38:46.670 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:38:46 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:38:46.681 231725 WARNING amqp [-] Received method (60, 30) during closing channel 1. This method will be ignored
Feb 23 09:38:46 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:38:46.683 231725 DEBUG oslo_concurrency.lockutils [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 23 09:38:46 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:38:46.684 231725 DEBUG oslo_concurrency.lockutils [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 23 09:38:46 np0005626463.localdomain nova_compute[231721]: 2026-02-23 09:38:46.684 231725 DEBUG oslo_concurrency.lockutils [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 23 09:38:47 np0005626463.localdomain virtqemud[207530]: End of file while reading data: Input/output error
Feb 23 09:38:47 np0005626463.localdomain systemd[1]: libpod-8d27414cca68da82346fae1fc6c4ecb36a1a0e33cd4571c19621f2f476697e2d.scope: Deactivated successfully.
Feb 23 09:38:47 np0005626463.localdomain systemd[1]: libpod-8d27414cca68da82346fae1fc6c4ecb36a1a0e33cd4571c19621f2f476697e2d.scope: Consumed 18.106s CPU time.
Feb 23 09:38:47 np0005626463.localdomain podman[280415]: 2026-02-23 09:38:47.06498717 +0000 UTC m=+4.555172228 container died 8d27414cca68da82346fae1fc6c4ecb36a1a0e33cd4571c19621f2f476697e2d (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-9420ebc64adcbe31526e40e2b0b78b2e1b7d41ddf41dd4e3243f19db7c97ac09-5f6dcf25a4eb712d7b55775b8e130167254d53f3f84c8303f8f39f30426e780b'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'nova', 'volumes': ['/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=nova_compute, container_name=nova_compute, io.buildah.version=1.43.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb)
Feb 23 09:38:47 np0005626463.localdomain systemd[1]: tmp-crun.iUVFug.mount: Deactivated successfully.
Feb 23 09:38:47 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8d27414cca68da82346fae1fc6c4ecb36a1a0e33cd4571c19621f2f476697e2d-userdata-shm.mount: Deactivated successfully.
Feb 23 09:38:47 np0005626463.localdomain podman[280415]: 2026-02-23 09:38:47.203825291 +0000 UTC m=+4.694010349 container cleanup 8d27414cca68da82346fae1fc6c4ecb36a1a0e33cd4571c19621f2f476697e2d (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, container_name=nova_compute, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, config_id=nova_compute, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-9420ebc64adcbe31526e40e2b0b78b2e1b7d41ddf41dd4e3243f19db7c97ac09-5f6dcf25a4eb712d7b55775b8e130167254d53f3f84c8303f8f39f30426e780b'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'nova', 'volumes': ['/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']})
Feb 23 09:38:47 np0005626463.localdomain python3[280360]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman stop nova_compute
Feb 23 09:38:47 np0005626463.localdomain podman[280429]: 2026-02-23 09:38:47.21884758 +0000 UTC m=+0.141282549 container cleanup 8d27414cca68da82346fae1fc6c4ecb36a1a0e33cd4571c19621f2f476697e2d (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260216, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=nova_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=nova_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-9420ebc64adcbe31526e40e2b0b78b2e1b7d41ddf41dd4e3243f19db7c97ac09-5f6dcf25a4eb712d7b55775b8e130167254d53f3f84c8303f8f39f30426e780b'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'nova', 'volumes': ['/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, io.buildah.version=1.43.0)
Feb 23 09:38:47 np0005626463.localdomain podman[280444]: 2026-02-23 09:38:47.317978482 +0000 UTC m=+0.091675150 container remove 8d27414cca68da82346fae1fc6c4ecb36a1a0e33cd4571c19621f2f476697e2d (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-9420ebc64adcbe31526e40e2b0b78b2e1b7d41ddf41dd4e3243f19db7c97ac09-5f6dcf25a4eb712d7b55775b8e130167254d53f3f84c8303f8f39f30426e780b'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'nova', 'volumes': ['/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=nova_compute, container_name=nova_compute, managed_by=edpm_ansible, org.label-schema.build-date=20260216, maintainer=OpenStack Kubernetes Operator team)
Feb 23 09:38:47 np0005626463.localdomain podman[280450]: Error: no container with name or ID "nova_compute" found: no such container
Feb 23 09:38:47 np0005626463.localdomain systemd[1]: edpm_nova_compute.service: Control process exited, code=exited, status=125/n/a
Feb 23 09:38:47 np0005626463.localdomain python3[280360]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman rm --force nova_compute
Feb 23 09:38:47 np0005626463.localdomain podman[280471]: Error: no container with name or ID "nova_compute" found: no such container
Feb 23 09:38:47 np0005626463.localdomain podman[280472]: 
Feb 23 09:38:47 np0005626463.localdomain systemd[1]: edpm_nova_compute.service: Control process exited, code=exited, status=125/n/a
Feb 23 09:38:47 np0005626463.localdomain systemd[1]: edpm_nova_compute.service: Failed with result 'exit-code'.
Feb 23 09:38:47 np0005626463.localdomain podman[280472]: 2026-02-23 09:38:47.415546626 +0000 UTC m=+0.068094265 container create 2129e353a5d4171f586ce5c762891941b44f7611471f84d84a42a19d9df53b3b (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-9420ebc64adcbe31526e40e2b0b78b2e1b7d41ddf41dd4e3243f19db7c97ac09-3424c3acd670eb930ca235b189e22cc85d2902412aa65355a03bde52550f3369'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'nova', 'volumes': ['/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.build-date=20260216, container_name=nova_compute, config_id=nova_compute, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, managed_by=edpm_ansible, tcib_managed=true)
Feb 23 09:38:47 np0005626463.localdomain podman[280472]: 2026-02-23 09:38:47.37818045 +0000 UTC m=+0.030728069 image pull  quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Feb 23 09:38:47 np0005626463.localdomain python3[280360]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute --conmon-pidfile /run/nova_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-9420ebc64adcbe31526e40e2b0b78b2e1b7d41ddf41dd4e3243f19db7c97ac09-3424c3acd670eb930ca235b189e22cc85d2902412aa65355a03bde52550f3369 --label config_id=nova_compute --label container_name=nova_compute --label managed_by=edpm_ansible --label config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-9420ebc64adcbe31526e40e2b0b78b2e1b7d41ddf41dd4e3243f19db7c97ac09-3424c3acd670eb930ca235b189e22cc85d2902412aa65355a03bde52550f3369'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'nova', 'volumes': ['/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']} --log-driver journald --log-level info --network host --pid host --privileged=True --user nova --volume /var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro 
--volume /var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro --volume /var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /etc/localtime:/etc/localtime:ro --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /var/lib/libvirt:/var/lib/libvirt --volume /run/libvirt:/run/libvirt:shared --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/iscsi:/var/lib/iscsi --volume /etc/multipath:/etc/multipath --volume /etc/multipath.conf:/etc/multipath.conf:ro,Z --volume /etc/iscsi:/etc/iscsi:ro --volume /etc/nvme:/etc/nvme --volume /var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified kolla_start
Feb 23 09:38:47 np0005626463.localdomain systemd[1]: edpm_nova_compute.service: Scheduled restart job, restart counter is at 1.
Feb 23 09:38:47 np0005626463.localdomain systemd[1]: Started libpod-conmon-2129e353a5d4171f586ce5c762891941b44f7611471f84d84a42a19d9df53b3b.scope.
Feb 23 09:38:47 np0005626463.localdomain systemd[1]: Stopped nova_compute container.
Feb 23 09:38:47 np0005626463.localdomain systemd[1]: Starting nova_compute container...
Feb 23 09:38:47 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 09:38:47 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1f24e11ebd9f7a5cf882dc695a3004cbcd963f9ee45c416d30f0f099030bac26/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Feb 23 09:38:47 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1f24e11ebd9f7a5cf882dc695a3004cbcd963f9ee45c416d30f0f099030bac26/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Feb 23 09:38:47 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1f24e11ebd9f7a5cf882dc695a3004cbcd963f9ee45c416d30f0f099030bac26/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Feb 23 09:38:47 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1f24e11ebd9f7a5cf882dc695a3004cbcd963f9ee45c416d30f0f099030bac26/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Feb 23 09:38:47 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1f24e11ebd9f7a5cf882dc695a3004cbcd963f9ee45c416d30f0f099030bac26/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Feb 23 09:38:47 np0005626463.localdomain podman[280497]: 2026-02-23 09:38:47.570983996 +0000 UTC m=+0.137793991 container init 2129e353a5d4171f586ce5c762891941b44f7611471f84d84a42a19d9df53b3b (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_id=nova_compute, container_name=nova_compute, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, io.buildah.version=1.43.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-9420ebc64adcbe31526e40e2b0b78b2e1b7d41ddf41dd4e3243f19db7c97ac09-3424c3acd670eb930ca235b189e22cc85d2902412aa65355a03bde52550f3369'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'nova', 'volumes': ['/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Feb 23 09:38:47 np0005626463.localdomain podman[280497]: 2026-02-23 09:38:47.584673533 +0000 UTC m=+0.151483518 container start 2129e353a5d4171f586ce5c762891941b44f7611471f84d84a42a19d9df53b3b (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, container_name=nova_compute, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-9420ebc64adcbe31526e40e2b0b78b2e1b7d41ddf41dd4e3243f19db7c97ac09-3424c3acd670eb930ca235b189e22cc85d2902412aa65355a03bde52550f3369'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'nova', 'volumes': ['/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=nova_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260216)
Feb 23 09:38:47 np0005626463.localdomain nova_compute[280512]: + sudo -E kolla_set_configs
Feb 23 09:38:47 np0005626463.localdomain python3[280360]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman start nova_compute
Feb 23 09:38:47 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.
Feb 23 09:38:47 np0005626463.localdomain systemd[1]: Started nova_compute container.
Feb 23 09:38:47 np0005626463.localdomain nova_compute[280512]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Feb 23 09:38:47 np0005626463.localdomain nova_compute[280512]: INFO:__main__:Validating config file
Feb 23 09:38:47 np0005626463.localdomain nova_compute[280512]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Feb 23 09:38:47 np0005626463.localdomain nova_compute[280512]: INFO:__main__:Copying service configuration files
Feb 23 09:38:47 np0005626463.localdomain nova_compute[280512]: INFO:__main__:Deleting /etc/nova/nova.conf
Feb 23 09:38:47 np0005626463.localdomain nova_compute[280512]: INFO:__main__:Copying /var/lib/kolla/config_files/src/nova-blank.conf to /etc/nova/nova.conf
Feb 23 09:38:47 np0005626463.localdomain nova_compute[280512]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Feb 23 09:38:47 np0005626463.localdomain nova_compute[280512]: INFO:__main__:Copying /var/lib/kolla/config_files/src/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Feb 23 09:38:47 np0005626463.localdomain nova_compute[280512]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Feb 23 09:38:47 np0005626463.localdomain nova_compute[280512]: INFO:__main__:Copying /var/lib/kolla/config_files/src/03-ceph-nova.conf to /etc/nova/nova.conf.d/03-ceph-nova.conf
Feb 23 09:38:47 np0005626463.localdomain nova_compute[280512]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/03-ceph-nova.conf
Feb 23 09:38:47 np0005626463.localdomain nova_compute[280512]: INFO:__main__:Copying /var/lib/kolla/config_files/src/99-nova-compute-cells-workarounds.conf to /etc/nova/nova.conf.d/99-nova-compute-cells-workarounds.conf
Feb 23 09:38:47 np0005626463.localdomain nova_compute[280512]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/99-nova-compute-cells-workarounds.conf
Feb 23 09:38:47 np0005626463.localdomain nova_compute[280512]: INFO:__main__:Copying /var/lib/kolla/config_files/src/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Feb 23 09:38:47 np0005626463.localdomain nova_compute[280512]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Feb 23 09:38:47 np0005626463.localdomain nova_compute[280512]: INFO:__main__:Copying /var/lib/kolla/config_files/src/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Feb 23 09:38:47 np0005626463.localdomain nova_compute[280512]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Feb 23 09:38:47 np0005626463.localdomain nova_compute[280512]: INFO:__main__:Deleting /etc/ceph
Feb 23 09:38:47 np0005626463.localdomain nova_compute[280512]: INFO:__main__:Creating directory /etc/ceph
Feb 23 09:38:47 np0005626463.localdomain nova_compute[280512]: INFO:__main__:Setting permission for /etc/ceph
Feb 23 09:38:47 np0005626463.localdomain nova_compute[280512]: INFO:__main__:Copying /var/lib/kolla/config_files/src/ceph/ceph.conf to /etc/ceph/ceph.conf
Feb 23 09:38:47 np0005626463.localdomain nova_compute[280512]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Feb 23 09:38:47 np0005626463.localdomain nova_compute[280512]: INFO:__main__:Copying /var/lib/kolla/config_files/src/ceph/ceph.client.openstack.keyring to /etc/ceph/ceph.client.openstack.keyring
Feb 23 09:38:47 np0005626463.localdomain nova_compute[280512]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Feb 23 09:38:47 np0005626463.localdomain nova_compute[280512]: INFO:__main__:Deleting /var/lib/nova/.ssh/ssh-privatekey
Feb 23 09:38:47 np0005626463.localdomain nova_compute[280512]: INFO:__main__:Copying /var/lib/kolla/config_files/src/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Feb 23 09:38:47 np0005626463.localdomain nova_compute[280512]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Feb 23 09:38:47 np0005626463.localdomain nova_compute[280512]: INFO:__main__:Deleting /var/lib/nova/.ssh/config
Feb 23 09:38:47 np0005626463.localdomain nova_compute[280512]: INFO:__main__:Copying /var/lib/kolla/config_files/src/ssh-config to /var/lib/nova/.ssh/config
Feb 23 09:38:47 np0005626463.localdomain nova_compute[280512]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Feb 23 09:38:47 np0005626463.localdomain nova_compute[280512]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Feb 23 09:38:47 np0005626463.localdomain nova_compute[280512]: INFO:__main__:Copying /var/lib/kolla/config_files/src/run-on-host to /usr/sbin/iscsiadm
Feb 23 09:38:47 np0005626463.localdomain nova_compute[280512]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Feb 23 09:38:47 np0005626463.localdomain nova_compute[280512]: INFO:__main__:Writing out command to execute
Feb 23 09:38:47 np0005626463.localdomain nova_compute[280512]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Feb 23 09:38:47 np0005626463.localdomain nova_compute[280512]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Feb 23 09:38:47 np0005626463.localdomain nova_compute[280512]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Feb 23 09:38:47 np0005626463.localdomain nova_compute[280512]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Feb 23 09:38:47 np0005626463.localdomain nova_compute[280512]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Feb 23 09:38:47 np0005626463.localdomain nova_compute[280512]: ++ cat /run_command
Feb 23 09:38:47 np0005626463.localdomain nova_compute[280512]: + CMD=nova-compute
Feb 23 09:38:47 np0005626463.localdomain nova_compute[280512]: + ARGS=
Feb 23 09:38:47 np0005626463.localdomain nova_compute[280512]: + sudo kolla_copy_cacerts
Feb 23 09:38:47 np0005626463.localdomain nova_compute[280512]: + [[ ! -n '' ]]
Feb 23 09:38:47 np0005626463.localdomain nova_compute[280512]: + . kolla_extend_start
Feb 23 09:38:47 np0005626463.localdomain nova_compute[280512]: + echo 'Running command: '\''nova-compute'\'''
Feb 23 09:38:47 np0005626463.localdomain nova_compute[280512]: Running command: 'nova-compute'
Feb 23 09:38:47 np0005626463.localdomain nova_compute[280512]: + umask 0022
Feb 23 09:38:47 np0005626463.localdomain nova_compute[280512]: + exec nova-compute
Feb 23 09:38:47 np0005626463.localdomain sudo[280355]: pam_unix(sudo:session): session closed for user root
Feb 23 09:38:47 np0005626463.localdomain podman[280541]: 2026-02-23 09:38:47.735658342 +0000 UTC m=+0.086670965 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ovn_metadata_agent, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Feb 23 09:38:47 np0005626463.localdomain podman[280541]: 2026-02-23 09:38:47.817132574 +0000 UTC m=+0.168145217 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 23 09:38:47 np0005626463.localdomain systemd[1]: 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully.
Feb 23 09:38:48 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-81751aaa607236c95e65d1a8738b503c5aac60039da98c3685192990b6f247b4-merged.mount: Deactivated successfully.
Feb 23 09:38:48 np0005626463.localdomain sudo[280684]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sftdumxaddfonczfdzvrjcdbqgqlcpyq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839527.9743674-3285-257001990457773/AnsiballZ_stat.py
Feb 23 09:38:48 np0005626463.localdomain sudo[280684]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:38:48 np0005626463.localdomain python3.9[280686]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 23 09:38:48 np0005626463.localdomain sudo[280684]: pam_unix(sudo:session): session closed for user root
Feb 23 09:38:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:38:48.539 163572 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 09:38:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:38:48.541 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 09:38:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:38:48.542 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 09:38:49 np0005626463.localdomain sudo[280797]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tfgkhmlughklutjooiiggoilpxuyaxlz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839528.7755067-3312-166445684787386/AnsiballZ_file.py
Feb 23 09:38:49 np0005626463.localdomain sudo[280797]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:38:49 np0005626463.localdomain python3.9[280799]: ansible-file Invoked with path=/etc/systemd/system/edpm_nova_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:38:49 np0005626463.localdomain sudo[280797]: pam_unix(sudo:session): session closed for user root
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.292 280526 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.293 280526 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.293 280526 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.293 280526 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.412 280526 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.435 280526 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.023s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.435 280526 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473
Feb 23 09:38:49 np0005626463.localdomain sudo[280856]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zsqiewycymwdzxnlqebfwnonsohgfoyd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839528.7755067-3312-166445684787386/AnsiballZ_stat.py
Feb 23 09:38:49 np0005626463.localdomain sudo[280856]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:38:49 np0005626463.localdomain python3.9[280858]: ansible-stat Invoked with path=/etc/systemd/system/edpm_nova_compute_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 23 09:38:49 np0005626463.localdomain sudo[280856]: pam_unix(sudo:session): session closed for user root
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.820 280526 INFO nova.virt.driver [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.933 280526 INFO nova.compute.provider_config [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.943 280526 DEBUG oslo_concurrency.lockutils [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.944 280526 DEBUG oslo_concurrency.lockutils [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.944 280526 DEBUG oslo_concurrency.lockutils [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.944 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.945 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.945 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.945 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.945 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.945 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.945 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.945 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.945 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.946 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.946 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.946 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.946 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.946 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.946 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.946 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.946 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.947 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.947 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.947 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] console_host                   = np0005626463.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.947 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.947 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.947 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.947 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.948 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.948 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.948 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.948 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.948 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.948 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.948 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.949 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.949 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.949 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.949 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.949 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.949 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.949 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.950 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] host                           = np0005626463.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.950 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.950 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.950 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.950 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.950 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.950 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.951 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.951 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.951 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.951 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.951 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.951 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.952 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.952 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.952 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.952 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.952 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.952 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.952 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.952 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.953 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.953 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.953 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.953 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.953 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.953 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.953 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.953 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.954 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.954 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.954 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.954 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.954 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.954 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.954 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.955 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.955 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.955 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.955 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.955 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.955 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.955 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] my_block_storage_ip            = 192.168.122.106 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.956 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] my_ip                          = 192.168.122.106 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.956 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.956 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.956 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.956 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.956 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.956 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.957 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.957 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.957 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.957 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.957 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.957 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.957 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.957 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.958 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.958 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.958 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.958 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.958 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.958 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.958 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.959 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.959 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.959 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.959 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.959 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.959 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.959 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.960 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.960 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.960 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.960 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.960 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.960 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.960 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.961 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.961 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.961 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.961 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.961 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.961 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.961 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.961 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.962 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.962 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.962 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.962 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.962 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.962 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.962 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.963 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.963 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.963 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.963 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.963 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.963 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.963 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.963 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.964 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.964 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.964 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.964 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.964 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.964 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.965 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.965 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.965 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.965 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.965 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.966 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.966 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.966 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.966 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.966 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.967 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.967 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.967 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.967 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.967 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.968 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.968 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.968 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.968 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.968 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.968 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.969 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.969 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.969 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.969 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.969 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.969 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.970 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.970 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.970 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.970 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.970 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.970 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.971 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.971 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.971 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.971 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.971 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.971 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.971 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.972 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.972 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.972 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.972 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.972 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.972 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.972 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.973 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.973 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.973 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.973 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.974 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.974 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.974 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.974 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.975 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.975 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.975 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.975 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.975 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.975 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.975 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.976 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.976 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.976 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.976 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.976 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.976 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.976 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.976 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.977 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.977 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.977 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.977 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.977 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.977 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.977 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] cinder.os_region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.978 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.978 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.978 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.978 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.978 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.978 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.978 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.978 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.979 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.979 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.979 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.979 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.979 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.979 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.979 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.980 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.980 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.980 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.980 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.980 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.980 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.980 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.980 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.981 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.981 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.981 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.981 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.981 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.981 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.981 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.981 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.982 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.982 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.982 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.982 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.982 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.982 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.982 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.983 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.983 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.983 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.983 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.983 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.983 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.983 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.983 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.984 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.984 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.984 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.984 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.984 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.984 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.984 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.985 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.985 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.985 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.985 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.985 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.985 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.985 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.985 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.986 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.986 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.986 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.986 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.986 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.986 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.986 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.987 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.987 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.987 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.987 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.987 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.987 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.987 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.987 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.988 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.988 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.988 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.988 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.988 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.988 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.988 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.989 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.989 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.989 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.989 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.989 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.989 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.989 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.989 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.990 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.990 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.990 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.990 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.990 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.990 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.990 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.990 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.991 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.991 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.991 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.991 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.991 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.991 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.991 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.992 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.992 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.992 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.992 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.992 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.992 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.992 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.992 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.993 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.993 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.993 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.993 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.993 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.993 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.993 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.994 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.994 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.994 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.994 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.994 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.994 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.994 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.994 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.995 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.995 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.995 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.995 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.995 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.995 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.996 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.996 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.996 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.996 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.996 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.996 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.996 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.997 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.997 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.997 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.997 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.997 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.997 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.997 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.997 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.998 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.998 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.998 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.998 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.998 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.998 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.998 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.999 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.999 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.999 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.999 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.999 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.999 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:49 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.999 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:49.999 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.000 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.000 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.000 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.000 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.000 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.000 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.000 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] barbican.barbican_region_name  = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.001 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.001 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.001 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.001 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.001 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.001 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.001 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.001 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.002 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.002 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.002 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.002 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.002 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.002 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.002 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.002 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.003 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.003 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.003 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.003 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.003 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.003 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.003 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.004 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.004 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.004 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.004 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.004 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.004 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.004 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.004 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.005 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.005 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.005 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.005 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.005 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.005 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.005 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.006 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.006 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.006 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.006 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.006 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.006 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.006 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.006 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.007 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.007 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.007 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.007 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.007 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.007 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.007 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.008 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.008 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.008 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.008 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.008 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.cpu_mode               = host-model log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.008 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.008 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.cpu_models             = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.009 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.009 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.009 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.009 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.009 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.009 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.009 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.009 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.010 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.010 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.010 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.010 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.010 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.010 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.images_rbd_ceph_conf   = /etc/ceph/ceph.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.010 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.011 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.011 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.images_rbd_glance_store_name = default_backend log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.011 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.images_rbd_pool        = vms log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.011 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.images_type            = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.011 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.011 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.011 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.012 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.012 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.012 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.012 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.012 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.012 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.012 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.012 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.013 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.013 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.013 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.013 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.013 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.013 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.013 280526 WARNING oslo_config.cfg [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: live_migration_uri is deprecated for removal in favor of two other options that
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: and ``live_migration_inbound_addr`` respectively.
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: ).  Its value may be silently ignored in the future.
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.014 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.live_migration_uri     = qemu+ssh://nova@%s/system?keyfile=/var/lib/nova/.ssh/ssh-privatekey log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.014 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.live_migration_with_native_tls = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.014 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.014 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.014 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.014 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.015 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.015 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.015 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.015 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.015 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.015 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.015 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.016 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.016 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.016 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.016 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.016 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.016 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.rbd_secret_uuid        = f1fea371-cb69-578d-a3d0-b5c472a84b46 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.016 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.rbd_user               = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.016 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.017 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.017 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.017 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.017 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.017 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.017 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.017 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.018 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.018 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.018 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.018 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.018 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.018 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.018 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.019 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.019 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.019 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.019 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.019 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.019 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.019 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.020 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.020 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.020 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.020 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.020 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.020 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.020 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.020 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.021 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.021 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.021 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.021 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.021 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.021 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.021 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.022 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.022 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.022 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.022 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.022 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.022 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.022 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.022 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.023 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.023 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.023 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.023 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.023 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.023 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.023 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.024 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.024 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.024 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.024 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.024 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.024 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.024 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.024 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.025 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.025 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.025 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.025 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.025 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.025 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.025 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.026 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.026 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.026 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.026 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] placement.auth_url             = http://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.026 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.026 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.026 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.026 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.027 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.027 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.027 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.027 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.027 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.027 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.027 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.028 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.028 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.028 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.028 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.028 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.028 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.028 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.028 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.029 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.029 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.029 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.029 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.029 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.029 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.029 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.029 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.030 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.030 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.030 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.030 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.030 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.030 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.030 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.031 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.031 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.031 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.031 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.031 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.031 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.031 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.032 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.032 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.032 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.032 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.032 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.032 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.032 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.033 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.033 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.033 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.033 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.033 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.033 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.033 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.033 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.034 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.034 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.034 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.034 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.034 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.034 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.034 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.035 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.035 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.035 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.035 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.035 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.035 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.035 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.036 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.036 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.036 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.036 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.036 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.036 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.036 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.036 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.037 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.037 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.037 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.037 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.037 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.037 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.037 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.037 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.038 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.038 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.038 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.038 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.038 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.038 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.038 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.039 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.039 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.039 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.039 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.039 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.039 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.039 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.040 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.040 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.040 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.040 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.040 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.040 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.040 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.041 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.041 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.041 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.041 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.041 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.041 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.041 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.041 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.042 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.042 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.042 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.042 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.042 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.042 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.042 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.043 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.043 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.043 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.043 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.043 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.043 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.043 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.043 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.044 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.044 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.044 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.044 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.044 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.044 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.044 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.044 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.045 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.045 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.045 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.045 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.045 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.045 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.045 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.046 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.046 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.046 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.046 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.046 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.046 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.046 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.047 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.047 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.047 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.047 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.047 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.047 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] vnc.novncproxy_base_url        = http://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.048 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.048 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.048 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.048 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] vnc.server_proxyclient_address = 192.168.122.106 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.048 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.048 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.048 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.049 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.049 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.049 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.049 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.049 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.049 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.049 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.049 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.050 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.050 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.050 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.050 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.050 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.050 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.050 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.051 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.051 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.051 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.051 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.051 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.051 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.051 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.051 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.052 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.052 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.052 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.052 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.052 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.052 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.052 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.053 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.053 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.053 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.053 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.053 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.053 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.053 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.053 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.054 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.054 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.054 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.054 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.054 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.054 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.054 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.055 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.055 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.055 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.055 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.055 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.055 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.055 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.056 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.056 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.056 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.056 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.056 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.056 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.056 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.056 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.057 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.057 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.057 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.057 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.057 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.057 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.057 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.057 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.058 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.058 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.058 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.058 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.058 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.058 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.058 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.059 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.059 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.059 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.059 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.059 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.059 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.059 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.060 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.060 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.060 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.060 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.060 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.060 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_limit.auth_url            = http://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.060 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.060 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.061 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.061 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.061 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.061 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.061 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.061 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.061 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.061 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.062 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.062 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.062 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.062 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.062 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.062 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.062 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.062 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.063 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.063 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.063 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.063 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.063 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.063 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.063 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.064 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.064 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.064 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.064 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.064 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.064 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.064 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.064 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.065 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.065 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.065 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.065 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.065 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.065 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.065 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.066 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.066 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.066 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.066 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.066 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.066 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.066 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.066 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.067 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.067 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.067 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.067 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.067 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.067 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.068 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.068 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.068 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.068 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.068 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.068 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.068 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.068 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.069 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.069 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.069 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.069 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.069 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.069 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.069 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.070 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.070 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.070 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.070 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.070 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.070 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.070 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.070 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.071 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.071 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.071 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.072 280526 INFO nova.service [-] Starting compute node (version 27.5.2-0.20260220085704.5cfeecb.el9)
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.089 280526 INFO nova.virt.node [None req-b8927e34-219e-43f3-a84d-184de0519090 - - - - - -] Determined node identity be63d86c-a403-4ec9-a515-07ea2962cb4d from /var/lib/nova/compute_id
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.089 280526 DEBUG nova.virt.libvirt.host [None req-b8927e34-219e-43f3-a84d-184de0519090 - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.090 280526 DEBUG nova.virt.libvirt.host [None req-b8927e34-219e-43f3-a84d-184de0519090 - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.090 280526 DEBUG nova.virt.libvirt.host [None req-b8927e34-219e-43f3-a84d-184de0519090 - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.090 280526 DEBUG nova.virt.libvirt.host [None req-b8927e34-219e-43f3-a84d-184de0519090 - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.101 280526 DEBUG nova.virt.libvirt.host [None req-b8927e34-219e-43f3-a84d-184de0519090 - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7f48be99bfa0> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.103 280526 DEBUG nova.virt.libvirt.host [None req-b8927e34-219e-43f3-a84d-184de0519090 - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7f48be99bfa0> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.104 280526 INFO nova.virt.libvirt.driver [None req-b8927e34-219e-43f3-a84d-184de0519090 - - - - - -] Connection event '1' reason 'None'
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.110 280526 INFO nova.virt.libvirt.host [None req-b8927e34-219e-43f3-a84d-184de0519090 - - - - - -] Libvirt host capabilities <capabilities>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:   <host>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     <uuid>bdcaa433-cfc7-450a-99ab-f0985ab59447</uuid>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     <cpu>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <arch>x86_64</arch>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model>EPYC-Rome-v4</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <vendor>AMD</vendor>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <microcode version='16777317'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <signature family='23' model='49' stepping='0'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <maxphysaddr mode='emulate' bits='40'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <feature name='x2apic'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <feature name='tsc-deadline'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <feature name='osxsave'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <feature name='hypervisor'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <feature name='tsc_adjust'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <feature name='spec-ctrl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <feature name='stibp'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <feature name='arch-capabilities'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <feature name='ssbd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <feature name='cmp_legacy'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <feature name='topoext'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <feature name='virt-ssbd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <feature name='lbrv'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <feature name='tsc-scale'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <feature name='vmcb-clean'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <feature name='pause-filter'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <feature name='pfthreshold'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <feature name='svme-addr-chk'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <feature name='rdctl-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <feature name='skip-l1dfl-vmentry'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <feature name='mds-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <feature name='pschange-mc-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <pages unit='KiB' size='4'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <pages unit='KiB' size='2048'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <pages unit='KiB' size='1048576'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     </cpu>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     <power_management>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <suspend_mem/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <suspend_disk/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <suspend_hybrid/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     </power_management>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     <iommu support='no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     <migration_features>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <live/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <uri_transports>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <uri_transport>tcp</uri_transport>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <uri_transport>rdma</uri_transport>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </uri_transports>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     </migration_features>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     <topology>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <cells num='1'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <cell id='0'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:           <memory unit='KiB'>16116612</memory>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:           <pages unit='KiB' size='4'>4029153</pages>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:           <pages unit='KiB' size='2048'>0</pages>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:           <pages unit='KiB' size='1048576'>0</pages>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:           <distances>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:             <sibling id='0' value='10'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:           </distances>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:           <cpus num='8'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:             <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:             <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:             <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:             <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:             <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:             <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:             <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:             <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:           </cpus>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         </cell>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </cells>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     </topology>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     <cache>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     </cache>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     <secmodel>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model>selinux</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <doi>0</doi>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     </secmodel>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     <secmodel>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model>dac</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <doi>0</doi>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <baselabel type='kvm'>+107:+107</baselabel>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <baselabel type='qemu'>+107:+107</baselabel>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     </secmodel>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:   </host>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:   <guest>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     <os_type>hvm</os_type>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     <arch name='i686'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <wordsize>32</wordsize>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <domain type='qemu'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <domain type='kvm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     </arch>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     <features>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <pae/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <nonpae/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <acpi default='on' toggle='yes'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <apic default='on' toggle='no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <cpuselection/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <deviceboot/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <disksnapshot default='on' toggle='no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <externalSnapshot/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     </features>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:   </guest>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:   <guest>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     <os_type>hvm</os_type>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     <arch name='x86_64'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <wordsize>64</wordsize>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <domain type='qemu'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <domain type='kvm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     </arch>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     <features>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <acpi default='on' toggle='yes'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <apic default='on' toggle='no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <cpuselection/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <deviceboot/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <disksnapshot default='on' toggle='no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <externalSnapshot/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     </features>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:   </guest>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: </capabilities>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.118 280526 DEBUG nova.virt.libvirt.host [None req-b8927e34-219e-43f3-a84d-184de0519090 - - - - - -] Getting domain capabilities for i686 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.123 280526 DEBUG nova.virt.libvirt.host [None req-b8927e34-219e-43f3-a84d-184de0519090 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: <domainCapabilities>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:   <path>/usr/libexec/qemu-kvm</path>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:   <domain>kvm</domain>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:   <machine>pc-q35-rhel9.8.0</machine>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:   <arch>i686</arch>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:   <vcpu max='1024'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:   <iothreads supported='yes'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:   <os supported='yes'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     <enum name='firmware'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     <loader supported='yes'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <enum name='type'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>rom</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>pflash</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </enum>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <enum name='readonly'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>yes</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>no</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </enum>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <enum name='secure'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>no</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </enum>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     </loader>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:   </os>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:   <cpu>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     <mode name='host-passthrough' supported='yes'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <enum name='hostPassthroughMigratable'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>on</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>off</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </enum>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     </mode>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     <mode name='maximum' supported='yes'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <enum name='maximumMigratable'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>on</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>off</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </enum>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     </mode>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     <mode name='host-model' supported='yes'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model fallback='forbid'>EPYC-Rome</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <vendor>AMD</vendor>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <maxphysaddr mode='passthrough' limit='40'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <feature policy='require' name='x2apic'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <feature policy='require' name='tsc-deadline'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <feature policy='require' name='hypervisor'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <feature policy='require' name='tsc_adjust'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <feature policy='require' name='spec-ctrl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <feature policy='require' name='stibp'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <feature policy='require' name='ssbd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <feature policy='require' name='cmp_legacy'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <feature policy='require' name='overflow-recov'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <feature policy='require' name='succor'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <feature policy='require' name='ibrs'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <feature policy='require' name='amd-ssbd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <feature policy='require' name='virt-ssbd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <feature policy='require' name='lbrv'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <feature policy='require' name='tsc-scale'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <feature policy='require' name='vmcb-clean'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <feature policy='require' name='pause-filter'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <feature policy='require' name='pfthreshold'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <feature policy='require' name='svme-addr-chk'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <feature policy='require' name='lfence-always-serializing'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <feature policy='disable' name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     </mode>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     <mode name='custom' supported='yes'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Broadwell'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='hle'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rtm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Broadwell-IBRS'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='hle'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rtm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Broadwell-noTSX'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Broadwell-noTSX-IBRS'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Broadwell-v1'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='hle'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rtm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Broadwell-v2'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Broadwell-v3'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='hle'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rtm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Broadwell-v4'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Cascadelake-Server'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='hle'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rtm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Cascadelake-Server-noTSX'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ibrs-all'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Cascadelake-Server-v1'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='hle'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rtm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Cascadelake-Server-v2'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='hle'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ibrs-all'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rtm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Cascadelake-Server-v3'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ibrs-all'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Cascadelake-Server-v4'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ibrs-all'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Cascadelake-Server-v5'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ibrs-all'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='ClearwaterForest'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx-ifma'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx-ne-convert'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx-vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx-vnni-int16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx-vnni-int8'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='bhi-ctrl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='bhi-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='bus-lock-detect'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='cldemote'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='cmpccxadd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ddpd-u'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fbsdp-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrs'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='gds-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='gfni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ibrs-all'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='intel-psfd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ipred-ctrl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='lam'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='mcdt-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='movdir64b'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='movdiri'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pbrsb-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='prefetchiti'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='psdp-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rfds-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rrsba-ctrl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='serialize'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='sha512'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='sm3'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='sm4'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ss'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vaes'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vpclmulqdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='ClearwaterForest-v1'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx-ifma'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx-ne-convert'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx-vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx-vnni-int16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx-vnni-int8'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='bhi-ctrl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='bhi-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='bus-lock-detect'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='cldemote'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='cmpccxadd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ddpd-u'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fbsdp-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrs'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='gds-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='gfni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ibrs-all'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='intel-psfd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ipred-ctrl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='lam'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='mcdt-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='movdir64b'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='movdiri'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pbrsb-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='prefetchiti'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='psdp-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rfds-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rrsba-ctrl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='serialize'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='sha512'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='sm3'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='sm4'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ss'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vaes'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vpclmulqdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Cooperlake'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-bf16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='hle'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ibrs-all'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rtm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='taa-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Cooperlake-v1'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-bf16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='hle'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ibrs-all'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rtm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='taa-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Cooperlake-v2'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-bf16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='hle'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ibrs-all'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rtm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='taa-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Denverton'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='mpx'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Denverton-v1'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='mpx'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Denverton-v2'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Denverton-v3'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Dhyana-v2'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='EPYC-Genoa'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='amd-psfd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='auto-ibrs'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-bf16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bitalg'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512ifma'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi2'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='gfni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='la57'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='no-nested-data-bp'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='null-sel-clr-base'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='stibp-always-on'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vaes'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vpclmulqdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='EPYC-Genoa-v1'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='amd-psfd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='auto-ibrs'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-bf16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bitalg'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512ifma'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi2'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='gfni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='la57'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='no-nested-data-bp'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='null-sel-clr-base'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='stibp-always-on'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vaes'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vpclmulqdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='EPYC-Genoa-v2'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='amd-psfd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='auto-ibrs'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-bf16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bitalg'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512ifma'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi2'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fs-gs-base-ns'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='gfni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='la57'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='no-nested-data-bp'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='null-sel-clr-base'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='perfmon-v2'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='stibp-always-on'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vaes'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vpclmulqdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='EPYC-Milan'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='EPYC-Milan-v1'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='EPYC-Milan-v2'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='amd-psfd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='no-nested-data-bp'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='null-sel-clr-base'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='stibp-always-on'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vaes'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vpclmulqdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='EPYC-Milan-v3'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='amd-psfd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='no-nested-data-bp'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='null-sel-clr-base'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='stibp-always-on'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vaes'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vpclmulqdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='EPYC-Rome'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='EPYC-Rome-v1'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='EPYC-Rome-v2'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='EPYC-Rome-v3'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='EPYC-Turin'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='amd-psfd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='auto-ibrs'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx-vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-bf16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-vp2intersect'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bitalg'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512ifma'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi2'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fs-gs-base-ns'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='gfni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ibpb-brtype'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='la57'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='movdir64b'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='movdiri'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='no-nested-data-bp'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='null-sel-clr-base'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='perfmon-v2'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='prefetchi'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='sbpb'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='srso-user-kernel-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='stibp-always-on'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vaes'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vpclmulqdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='EPYC-Turin-v1'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='amd-psfd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='auto-ibrs'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx-vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-bf16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-vp2intersect'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bitalg'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512ifma'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi2'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fs-gs-base-ns'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='gfni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ibpb-brtype'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='la57'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='movdir64b'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='movdiri'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='no-nested-data-bp'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='null-sel-clr-base'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='perfmon-v2'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='prefetchi'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='sbpb'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='srso-user-kernel-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='stibp-always-on'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vaes'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vpclmulqdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='EPYC-v3'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='EPYC-v4'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='EPYC-v5'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='GraniteRapids'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='amx-bf16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='amx-fp16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='amx-int8'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='amx-tile'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx-vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-bf16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-fp16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bitalg'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512ifma'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi2'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='bus-lock-detect'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fbsdp-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrc'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrs'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fzrm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='gfni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='hle'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ibrs-all'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='la57'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='mcdt-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pbrsb-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='prefetchiti'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='psdp-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rtm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='serialize'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='taa-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='tsx-ldtrk'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vaes'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vpclmulqdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xfd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='GraniteRapids-v1'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='amx-bf16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='amx-fp16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='amx-int8'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='amx-tile'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx-vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-bf16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-fp16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bitalg'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512ifma'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi2'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='bus-lock-detect'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fbsdp-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrc'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrs'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fzrm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='gfni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='hle'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ibrs-all'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='la57'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='mcdt-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pbrsb-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='prefetchiti'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='psdp-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rtm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='serialize'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='taa-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='tsx-ldtrk'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vaes'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vpclmulqdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xfd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='GraniteRapids-v2'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='amx-bf16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='amx-fp16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='amx-int8'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='amx-tile'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx-vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx10'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx10-128'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx10-256'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx10-512'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-bf16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-fp16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bitalg'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512ifma'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi2'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='bus-lock-detect'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='cldemote'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fbsdp-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrc'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrs'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fzrm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='gfni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='hle'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ibrs-all'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='la57'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='mcdt-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='movdir64b'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='movdiri'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pbrsb-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='prefetchiti'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='psdp-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rtm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='serialize'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ss'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='taa-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='tsx-ldtrk'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vaes'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vpclmulqdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xfd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='GraniteRapids-v3'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='amx-bf16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='amx-fp16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='amx-int8'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='amx-tile'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx-vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx10'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx10-128'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx10-256'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx10-512'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-bf16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-fp16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bitalg'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512ifma'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi2'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='bus-lock-detect'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='cldemote'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fbsdp-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrc'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrs'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fzrm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='gfni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='hle'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ibrs-all'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='la57'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='mcdt-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='movdir64b'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='movdiri'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pbrsb-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='prefetchiti'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='psdp-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rtm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='serialize'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ss'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='taa-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='tsx-ldtrk'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vaes'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vpclmulqdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xfd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Haswell'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='hle'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rtm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Haswell-IBRS'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='hle'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rtm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Haswell-noTSX'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Haswell-noTSX-IBRS'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Haswell-v1'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='hle'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rtm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Haswell-v2'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Haswell-v3'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='hle'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rtm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Haswell-v4'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Icelake-Server'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bitalg'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi2'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='gfni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='hle'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='la57'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rtm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vaes'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vpclmulqdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Icelake-Server-noTSX'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bitalg'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi2'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='gfni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='la57'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vaes'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vpclmulqdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Icelake-Server-v1'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bitalg'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi2'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='gfni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='hle'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='la57'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rtm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vaes'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vpclmulqdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Icelake-Server-v2'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bitalg'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi2'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='gfni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='la57'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vaes'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vpclmulqdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Icelake-Server-v3'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bitalg'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi2'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='gfni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ibrs-all'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='la57'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='taa-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vaes'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vpclmulqdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Icelake-Server-v4'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bitalg'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512ifma'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi2'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='gfni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ibrs-all'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='la57'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='taa-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vaes'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vpclmulqdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Icelake-Server-v5'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bitalg'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512ifma'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi2'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='gfni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ibrs-all'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='la57'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='taa-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vaes'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vpclmulqdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Icelake-Server-v6'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bitalg'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512ifma'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi2'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='gfni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ibrs-all'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='la57'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='taa-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vaes'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vpclmulqdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Icelake-Server-v7'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bitalg'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512ifma'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi2'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='gfni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='hle'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ibrs-all'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='la57'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rtm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='taa-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vaes'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vpclmulqdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='IvyBridge'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='IvyBridge-IBRS'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='IvyBridge-v1'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='IvyBridge-v2'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='KnightsMill'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-4fmaps'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-4vnniw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512er'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512pf'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ss'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='KnightsMill-v1'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-4fmaps'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-4vnniw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512er'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512pf'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ss'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Opteron_G4'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fma4'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xop'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Opteron_G4-v1'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fma4'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xop'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Opteron_G5'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fma4'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='tbm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xop'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Opteron_G5-v1'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fma4'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='tbm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xop'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='SapphireRapids'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='amx-bf16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='amx-int8'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='amx-tile'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx-vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-bf16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-fp16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bitalg'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512ifma'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi2'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='bus-lock-detect'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrc'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrs'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fzrm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='gfni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='hle'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ibrs-all'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='la57'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rtm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='serialize'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='taa-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='tsx-ldtrk'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vaes'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vpclmulqdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xfd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='SapphireRapids-v1'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='amx-bf16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='amx-int8'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='amx-tile'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx-vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-bf16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-fp16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bitalg'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512ifma'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi2'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='bus-lock-detect'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrc'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrs'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fzrm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='gfni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='hle'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ibrs-all'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='la57'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rtm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='serialize'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='taa-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='tsx-ldtrk'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vaes'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vpclmulqdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xfd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='SapphireRapids-v2'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='amx-bf16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='amx-int8'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='amx-tile'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx-vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-bf16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-fp16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bitalg'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512ifma'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi2'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='bus-lock-detect'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fbsdp-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrc'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrs'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fzrm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='gfni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='hle'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ibrs-all'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='la57'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='psdp-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rtm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='serialize'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='taa-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='tsx-ldtrk'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vaes'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vpclmulqdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xfd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='SapphireRapids-v3'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='amx-bf16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='amx-int8'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='amx-tile'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx-vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-bf16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-fp16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bitalg'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512ifma'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi2'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='bus-lock-detect'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='cldemote'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fbsdp-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrc'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrs'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fzrm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='gfni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='hle'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ibrs-all'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='la57'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='movdir64b'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='movdiri'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='psdp-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rtm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='serialize'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ss'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='taa-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='tsx-ldtrk'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vaes'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vpclmulqdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xfd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='SapphireRapids-v4'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='amx-bf16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='amx-int8'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='amx-tile'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx-vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-bf16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-fp16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bitalg'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512ifma'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi2'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='bus-lock-detect'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='cldemote'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fbsdp-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrc'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrs'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fzrm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='gfni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='hle'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ibrs-all'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='la57'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='movdir64b'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='movdiri'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='psdp-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rtm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='serialize'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ss'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='taa-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='tsx-ldtrk'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vaes'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vpclmulqdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xfd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='SierraForest'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx-ifma'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx-ne-convert'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx-vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx-vnni-int8'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='bus-lock-detect'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='cmpccxadd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fbsdp-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrs'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='gfni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ibrs-all'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='mcdt-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pbrsb-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='psdp-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='serialize'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vaes'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vpclmulqdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='SierraForest-v1'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx-ifma'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx-ne-convert'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx-vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx-vnni-int8'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='bus-lock-detect'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='cmpccxadd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fbsdp-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrs'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='gfni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ibrs-all'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='mcdt-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pbrsb-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='psdp-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='serialize'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vaes'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vpclmulqdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='SierraForest-v2'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx-ifma'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx-ne-convert'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx-vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx-vnni-int8'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='bhi-ctrl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='bus-lock-detect'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='cldemote'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='cmpccxadd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fbsdp-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrs'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='gds-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='gfni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ibrs-all'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='intel-psfd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ipred-ctrl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='lam'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='mcdt-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='movdir64b'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='movdiri'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pbrsb-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='psdp-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rfds-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rrsba-ctrl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='serialize'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ss'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vaes'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vpclmulqdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='SierraForest-v3'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx-ifma'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx-ne-convert'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx-vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx-vnni-int8'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='bhi-ctrl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='bus-lock-detect'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='cldemote'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='cmpccxadd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fbsdp-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrs'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='gds-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='gfni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ibrs-all'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='intel-psfd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ipred-ctrl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='lam'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='mcdt-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='movdir64b'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='movdiri'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pbrsb-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='psdp-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rfds-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rrsba-ctrl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='serialize'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ss'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vaes'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vpclmulqdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Skylake-Client'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='hle'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rtm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Skylake-Client-IBRS'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='hle'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rtm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Skylake-Client-v1'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='hle'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rtm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Skylake-Client-v2'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='hle'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rtm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Skylake-Client-v3'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Skylake-Client-v4'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Skylake-Server'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='hle'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rtm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Skylake-Server-IBRS'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='hle'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rtm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Skylake-Server-v1'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='hle'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rtm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Skylake-Server-v2'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='hle'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rtm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Skylake-Server-v3'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Skylake-Server-v4'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Skylake-Server-v5'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Snowridge'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='cldemote'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='core-capability'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='gfni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='movdir64b'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='movdiri'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='mpx'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='split-lock-detect'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Snowridge-v1'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='cldemote'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='core-capability'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='gfni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='movdir64b'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='movdiri'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='mpx'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='split-lock-detect'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Snowridge-v2'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='cldemote'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='core-capability'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='gfni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='movdir64b'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='movdiri'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='split-lock-detect'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Snowridge-v3'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='cldemote'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='core-capability'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='gfni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='movdir64b'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='movdiri'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='split-lock-detect'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Snowridge-v4'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='cldemote'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='gfni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='movdir64b'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='movdiri'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='athlon'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='3dnow'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='3dnowext'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='athlon-v1'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='3dnow'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='3dnowext'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='core2duo'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ss'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='core2duo-v1'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ss'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='coreduo'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ss'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='coreduo-v1'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ss'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='n270'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ss'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='n270-v1'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ss'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='phenom'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='3dnow'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='3dnowext'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='phenom-v1'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='3dnow'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='3dnowext'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     </mode>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:   </cpu>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:   <memoryBacking supported='yes'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     <enum name='sourceType'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <value>file</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <value>anonymous</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <value>memfd</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     </enum>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:   </memoryBacking>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:   <devices>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     <disk supported='yes'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <enum name='diskDevice'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>disk</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>cdrom</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>floppy</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>lun</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </enum>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <enum name='bus'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>fdc</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>scsi</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>virtio</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>usb</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>sata</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </enum>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <enum name='model'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>virtio</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>virtio-transitional</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>virtio-non-transitional</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </enum>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     </disk>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     <graphics supported='yes'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <enum name='type'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>vnc</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>egl-headless</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>dbus</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </enum>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     </graphics>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     <video supported='yes'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <enum name='modelType'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>vga</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>cirrus</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>virtio</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>none</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>bochs</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>ramfb</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </enum>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     </video>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     <hostdev supported='yes'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <enum name='mode'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>subsystem</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </enum>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <enum name='startupPolicy'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>default</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>mandatory</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>requisite</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>optional</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </enum>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <enum name='subsysType'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>usb</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>pci</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>scsi</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </enum>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <enum name='capsType'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <enum name='pciBackend'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     </hostdev>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     <rng supported='yes'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <enum name='model'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>virtio</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>virtio-transitional</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>virtio-non-transitional</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </enum>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <enum name='backendModel'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>random</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>egd</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>builtin</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </enum>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     </rng>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     <filesystem supported='yes'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <enum name='driverType'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>path</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>handle</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>virtiofs</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </enum>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     </filesystem>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     <tpm supported='yes'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <enum name='model'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>tpm-tis</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>tpm-crb</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </enum>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <enum name='backendModel'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>emulator</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>external</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </enum>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <enum name='backendVersion'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>2.0</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </enum>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     </tpm>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     <redirdev supported='yes'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <enum name='bus'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>usb</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </enum>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     </redirdev>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     <channel supported='yes'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <enum name='type'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>pty</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>unix</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </enum>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     </channel>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     <crypto supported='yes'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <enum name='model'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <enum name='type'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>qemu</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </enum>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <enum name='backendModel'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>builtin</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </enum>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     </crypto>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     <interface supported='yes'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <enum name='backendType'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>default</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>passt</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </enum>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     </interface>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     <panic supported='yes'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <enum name='model'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>isa</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>hyperv</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </enum>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     </panic>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     <console supported='yes'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <enum name='type'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>null</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>vc</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>pty</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>dev</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>file</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>pipe</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>stdio</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>udp</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>tcp</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>unix</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>qemu-vdagent</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>dbus</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </enum>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     </console>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:   </devices>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:   <features>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     <gic supported='no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     <vmcoreinfo supported='yes'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     <genid supported='yes'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     <backingStoreInput supported='yes'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     <backup supported='yes'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     <async-teardown supported='yes'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     <s390-pv supported='no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     <ps2 supported='yes'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     <tdx supported='no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     <sev supported='no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     <sgx supported='no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     <hyperv supported='yes'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <enum name='features'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>relaxed</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>vapic</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>spinlocks</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>vpindex</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>runtime</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>synic</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>stimer</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>reset</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>vendor_id</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>frequencies</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>reenlightenment</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>tlbflush</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>ipi</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>avic</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>emsr_bitmap</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>xmm_input</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </enum>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <defaults>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <spinlocks>4095</spinlocks>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <stimer_direct>on</stimer_direct>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <tlbflush_direct>off</tlbflush_direct>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <tlbflush_extended>off</tlbflush_extended>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <vendor_id>Linux KVM Hv</vendor_id>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </defaults>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     </hyperv>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     <launchSecurity supported='no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:   </features>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: </domainCapabilities>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.127 280526 DEBUG nova.virt.libvirt.volume.mount [None req-b8927e34-219e-43f3-a84d-184de0519090 - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.132 280526 DEBUG nova.virt.libvirt.host [None req-b8927e34-219e-43f3-a84d-184de0519090 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: <domainCapabilities>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:   <path>/usr/libexec/qemu-kvm</path>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:   <domain>kvm</domain>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:   <machine>pc-i440fx-rhel7.6.0</machine>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:   <arch>i686</arch>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:   <vcpu max='240'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:   <iothreads supported='yes'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:   <os supported='yes'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     <enum name='firmware'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     <loader supported='yes'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <enum name='type'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>rom</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>pflash</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </enum>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <enum name='readonly'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>yes</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>no</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </enum>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <enum name='secure'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>no</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </enum>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     </loader>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:   </os>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:   <cpu>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     <mode name='host-passthrough' supported='yes'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <enum name='hostPassthroughMigratable'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>on</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>off</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </enum>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     </mode>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     <mode name='maximum' supported='yes'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <enum name='maximumMigratable'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>on</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>off</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </enum>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     </mode>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     <mode name='host-model' supported='yes'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model fallback='forbid'>EPYC-Rome</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <vendor>AMD</vendor>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <maxphysaddr mode='passthrough' limit='40'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <feature policy='require' name='x2apic'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <feature policy='require' name='tsc-deadline'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <feature policy='require' name='hypervisor'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <feature policy='require' name='tsc_adjust'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <feature policy='require' name='spec-ctrl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <feature policy='require' name='stibp'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <feature policy='require' name='ssbd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <feature policy='require' name='cmp_legacy'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <feature policy='require' name='overflow-recov'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <feature policy='require' name='succor'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <feature policy='require' name='ibrs'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <feature policy='require' name='amd-ssbd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <feature policy='require' name='virt-ssbd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <feature policy='require' name='lbrv'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <feature policy='require' name='tsc-scale'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <feature policy='require' name='vmcb-clean'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <feature policy='require' name='pause-filter'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <feature policy='require' name='pfthreshold'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <feature policy='require' name='svme-addr-chk'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <feature policy='require' name='lfence-always-serializing'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <feature policy='disable' name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     </mode>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     <mode name='custom' supported='yes'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Broadwell'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='hle'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rtm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Broadwell-IBRS'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='hle'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rtm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Broadwell-noTSX'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Broadwell-noTSX-IBRS'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Broadwell-v1'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='hle'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rtm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Broadwell-v2'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Broadwell-v3'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='hle'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rtm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Broadwell-v4'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Cascadelake-Server'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='hle'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rtm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Cascadelake-Server-noTSX'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ibrs-all'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Cascadelake-Server-v1'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='hle'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rtm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Cascadelake-Server-v2'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='hle'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ibrs-all'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rtm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Cascadelake-Server-v3'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ibrs-all'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Cascadelake-Server-v4'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ibrs-all'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Cascadelake-Server-v5'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ibrs-all'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='ClearwaterForest'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx-ifma'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx-ne-convert'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx-vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx-vnni-int16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx-vnni-int8'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='bhi-ctrl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='bhi-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='bus-lock-detect'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='cldemote'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='cmpccxadd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ddpd-u'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fbsdp-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrs'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='gds-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='gfni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ibrs-all'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='intel-psfd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ipred-ctrl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='lam'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='mcdt-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='movdir64b'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='movdiri'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pbrsb-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='prefetchiti'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='psdp-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rfds-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rrsba-ctrl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='serialize'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='sha512'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='sm3'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='sm4'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ss'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vaes'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vpclmulqdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='ClearwaterForest-v1'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx-ifma'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx-ne-convert'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx-vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx-vnni-int16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx-vnni-int8'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='bhi-ctrl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='bhi-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='bus-lock-detect'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='cldemote'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='cmpccxadd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ddpd-u'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fbsdp-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrs'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='gds-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='gfni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ibrs-all'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='intel-psfd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ipred-ctrl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='lam'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='mcdt-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='movdir64b'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='movdiri'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pbrsb-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='prefetchiti'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='psdp-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rfds-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rrsba-ctrl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='serialize'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='sha512'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='sm3'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='sm4'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ss'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vaes'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vpclmulqdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Cooperlake'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-bf16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='hle'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ibrs-all'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rtm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='taa-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Cooperlake-v1'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-bf16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='hle'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ibrs-all'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rtm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='taa-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Cooperlake-v2'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-bf16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='hle'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ibrs-all'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rtm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='taa-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Denverton'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='mpx'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Denverton-v1'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='mpx'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Denverton-v2'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Denverton-v3'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Dhyana-v2'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='EPYC-Genoa'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='amd-psfd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='auto-ibrs'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-bf16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bitalg'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512ifma'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi2'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='gfni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='la57'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='no-nested-data-bp'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='null-sel-clr-base'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='stibp-always-on'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vaes'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vpclmulqdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='EPYC-Genoa-v1'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='amd-psfd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='auto-ibrs'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-bf16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bitalg'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512ifma'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi2'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='gfni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='la57'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='no-nested-data-bp'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='null-sel-clr-base'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='stibp-always-on'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vaes'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vpclmulqdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='EPYC-Genoa-v2'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='amd-psfd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='auto-ibrs'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-bf16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bitalg'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512ifma'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi2'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fs-gs-base-ns'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='gfni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='la57'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='no-nested-data-bp'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='null-sel-clr-base'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='perfmon-v2'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='stibp-always-on'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vaes'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vpclmulqdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='EPYC-Milan'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='EPYC-Milan-v1'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='EPYC-Milan-v2'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='amd-psfd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='no-nested-data-bp'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='null-sel-clr-base'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='stibp-always-on'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vaes'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vpclmulqdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='EPYC-Milan-v3'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='amd-psfd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='no-nested-data-bp'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='null-sel-clr-base'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='stibp-always-on'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vaes'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vpclmulqdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='EPYC-Rome'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='EPYC-Rome-v1'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='EPYC-Rome-v2'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='EPYC-Rome-v3'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='EPYC-Turin'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='amd-psfd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='auto-ibrs'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx-vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-bf16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-vp2intersect'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bitalg'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512ifma'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi2'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fs-gs-base-ns'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='gfni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ibpb-brtype'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='la57'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='movdir64b'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='movdiri'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='no-nested-data-bp'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='null-sel-clr-base'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='perfmon-v2'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='prefetchi'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='sbpb'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='srso-user-kernel-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='stibp-always-on'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vaes'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vpclmulqdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='EPYC-Turin-v1'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='amd-psfd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='auto-ibrs'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx-vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-bf16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-vp2intersect'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bitalg'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512ifma'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi2'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fs-gs-base-ns'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='gfni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ibpb-brtype'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='la57'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='movdir64b'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='movdiri'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='no-nested-data-bp'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='null-sel-clr-base'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='perfmon-v2'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='prefetchi'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='sbpb'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='srso-user-kernel-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='stibp-always-on'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vaes'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vpclmulqdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='EPYC-v3'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='EPYC-v4'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='EPYC-v5'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='GraniteRapids'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='amx-bf16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='amx-fp16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='amx-int8'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='amx-tile'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx-vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-bf16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-fp16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bitalg'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512ifma'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi2'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='bus-lock-detect'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fbsdp-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrc'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrs'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fzrm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='gfni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='hle'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ibrs-all'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='la57'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='mcdt-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pbrsb-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='prefetchiti'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='psdp-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rtm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='serialize'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='taa-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='tsx-ldtrk'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vaes'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vpclmulqdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xfd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='GraniteRapids-v1'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='amx-bf16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='amx-fp16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='amx-int8'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='amx-tile'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx-vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-bf16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-fp16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bitalg'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512ifma'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi2'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='bus-lock-detect'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fbsdp-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrc'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrs'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fzrm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='gfni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='hle'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ibrs-all'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='la57'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='mcdt-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pbrsb-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='prefetchiti'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='psdp-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rtm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='serialize'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='taa-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='tsx-ldtrk'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vaes'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vpclmulqdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xfd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='GraniteRapids-v2'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='amx-bf16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='amx-fp16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='amx-int8'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='amx-tile'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx-vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx10'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx10-128'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx10-256'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx10-512'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-bf16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-fp16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bitalg'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512ifma'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi2'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='bus-lock-detect'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='cldemote'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fbsdp-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrc'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrs'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fzrm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='gfni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='hle'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ibrs-all'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='la57'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='mcdt-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='movdir64b'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='movdiri'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pbrsb-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='prefetchiti'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='psdp-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rtm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='serialize'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ss'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='taa-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='tsx-ldtrk'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vaes'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vpclmulqdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xfd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='GraniteRapids-v3'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='amx-bf16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='amx-fp16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='amx-int8'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='amx-tile'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx-vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx10'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx10-128'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx10-256'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx10-512'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-bf16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-fp16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bitalg'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512ifma'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi2'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='bus-lock-detect'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='cldemote'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fbsdp-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrc'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrs'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fzrm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='gfni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='hle'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ibrs-all'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='la57'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='mcdt-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='movdir64b'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='movdiri'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pbrsb-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='prefetchiti'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='psdp-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rtm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='serialize'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ss'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='taa-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='tsx-ldtrk'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vaes'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vpclmulqdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xfd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Haswell'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='hle'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rtm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Haswell-IBRS'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='hle'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rtm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Haswell-noTSX'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Haswell-noTSX-IBRS'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Haswell-v1'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='hle'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rtm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Haswell-v2'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Haswell-v3'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='hle'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rtm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Haswell-v4'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Icelake-Server'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bitalg'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi2'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='gfni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='hle'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='la57'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rtm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vaes'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vpclmulqdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Icelake-Server-noTSX'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bitalg'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi2'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='gfni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='la57'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vaes'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vpclmulqdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Icelake-Server-v1'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bitalg'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi2'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='gfni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='hle'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='la57'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rtm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vaes'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vpclmulqdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Icelake-Server-v2'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bitalg'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi2'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='gfni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='la57'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vaes'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vpclmulqdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Icelake-Server-v3'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bitalg'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi2'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='gfni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ibrs-all'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='la57'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='taa-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vaes'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vpclmulqdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Icelake-Server-v4'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bitalg'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512ifma'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi2'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='gfni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ibrs-all'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='la57'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='taa-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vaes'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vpclmulqdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Icelake-Server-v5'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bitalg'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512ifma'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi2'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='gfni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ibrs-all'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='la57'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='taa-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vaes'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vpclmulqdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Icelake-Server-v6'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bitalg'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512ifma'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi2'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='gfni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ibrs-all'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='la57'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='taa-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vaes'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vpclmulqdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Icelake-Server-v7'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bitalg'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512ifma'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi2'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='gfni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='hle'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ibrs-all'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='la57'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rtm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='taa-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vaes'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vpclmulqdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='IvyBridge'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='IvyBridge-IBRS'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='IvyBridge-v1'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='IvyBridge-v2'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='KnightsMill'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-4fmaps'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-4vnniw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512er'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512pf'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ss'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='KnightsMill-v1'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-4fmaps'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-4vnniw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512er'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512pf'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ss'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Opteron_G4'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fma4'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xop'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Opteron_G4-v1'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fma4'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xop'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Opteron_G5'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fma4'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='tbm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xop'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Opteron_G5-v1'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fma4'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='tbm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xop'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='SapphireRapids'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='amx-bf16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='amx-int8'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='amx-tile'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx-vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-bf16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-fp16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bitalg'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512ifma'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi2'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='bus-lock-detect'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrc'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrs'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fzrm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='gfni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='hle'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ibrs-all'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='la57'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rtm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='serialize'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='taa-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='tsx-ldtrk'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vaes'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vpclmulqdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xfd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='SapphireRapids-v1'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='amx-bf16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='amx-int8'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='amx-tile'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx-vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-bf16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-fp16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bitalg'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512ifma'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi2'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='bus-lock-detect'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrc'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrs'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fzrm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='gfni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='hle'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ibrs-all'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='la57'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rtm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='serialize'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='taa-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='tsx-ldtrk'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vaes'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vpclmulqdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xfd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='SapphireRapids-v2'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='amx-bf16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='amx-int8'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='amx-tile'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx-vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-bf16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-fp16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bitalg'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512ifma'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi2'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='bus-lock-detect'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fbsdp-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrc'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrs'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fzrm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='gfni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='hle'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ibrs-all'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='la57'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='psdp-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rtm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='serialize'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='taa-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='tsx-ldtrk'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vaes'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vpclmulqdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xfd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='SapphireRapids-v3'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='amx-bf16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='amx-int8'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='amx-tile'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx-vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-bf16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-fp16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bitalg'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512ifma'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi2'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='bus-lock-detect'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='cldemote'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fbsdp-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrc'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrs'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fzrm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='gfni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='hle'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ibrs-all'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='la57'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='movdir64b'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='movdiri'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='psdp-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rtm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='serialize'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ss'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='taa-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='tsx-ldtrk'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vaes'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vpclmulqdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xfd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='SapphireRapids-v4'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='amx-bf16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='amx-int8'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='amx-tile'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx-vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-bf16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-fp16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bitalg'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512ifma'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi2'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='bus-lock-detect'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='cldemote'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fbsdp-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrc'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrs'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fzrm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='gfni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='hle'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ibrs-all'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='la57'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='movdir64b'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='movdiri'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='psdp-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rtm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='serialize'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ss'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='taa-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='tsx-ldtrk'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vaes'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vpclmulqdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xfd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='SierraForest'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx-ifma'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx-ne-convert'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx-vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx-vnni-int8'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='bus-lock-detect'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='cmpccxadd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fbsdp-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrs'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='gfni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ibrs-all'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='mcdt-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pbrsb-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='psdp-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='serialize'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vaes'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vpclmulqdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='SierraForest-v1'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx-ifma'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx-ne-convert'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx-vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx-vnni-int8'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='bus-lock-detect'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='cmpccxadd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fbsdp-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrs'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='gfni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ibrs-all'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='mcdt-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pbrsb-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='psdp-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='serialize'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vaes'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vpclmulqdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='SierraForest-v2'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx-ifma'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx-ne-convert'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx-vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx-vnni-int8'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='bhi-ctrl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='bus-lock-detect'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='cldemote'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='cmpccxadd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fbsdp-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrs'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='gds-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='gfni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ibrs-all'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='intel-psfd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ipred-ctrl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='lam'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='mcdt-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='movdir64b'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='movdiri'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pbrsb-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='psdp-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rfds-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rrsba-ctrl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='serialize'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ss'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vaes'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vpclmulqdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='SierraForest-v3'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx-ifma'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx-ne-convert'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx-vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx-vnni-int8'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='bhi-ctrl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='bus-lock-detect'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='cldemote'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='cmpccxadd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fbsdp-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrs'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='gds-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='gfni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ibrs-all'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='intel-psfd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ipred-ctrl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='lam'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='mcdt-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='movdir64b'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='movdiri'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pbrsb-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='psdp-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rfds-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rrsba-ctrl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='serialize'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ss'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vaes'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vpclmulqdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Skylake-Client'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='hle'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rtm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Skylake-Client-IBRS'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='hle'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rtm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Skylake-Client-v1'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='hle'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rtm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Skylake-Client-v2'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='hle'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rtm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Skylake-Client-v3'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Skylake-Client-v4'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Skylake-Server'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='hle'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rtm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Skylake-Server-IBRS'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='hle'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rtm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Skylake-Server-v1'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='hle'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rtm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Skylake-Server-v2'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='hle'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rtm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Skylake-Server-v3'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Skylake-Server-v4'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Skylake-Server-v5'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Snowridge'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='cldemote'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='core-capability'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='gfni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='movdir64b'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='movdiri'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='mpx'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='split-lock-detect'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Snowridge-v1'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='cldemote'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='core-capability'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='gfni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='movdir64b'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='movdiri'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='mpx'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='split-lock-detect'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Snowridge-v2'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='cldemote'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='core-capability'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='gfni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='movdir64b'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='movdiri'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='split-lock-detect'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Snowridge-v3'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='cldemote'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='core-capability'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='gfni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='movdir64b'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='movdiri'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='split-lock-detect'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Snowridge-v4'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='cldemote'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='gfni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='movdir64b'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='movdiri'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='athlon'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='3dnow'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='3dnowext'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='athlon-v1'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='3dnow'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='3dnowext'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='core2duo'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ss'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='core2duo-v1'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ss'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='coreduo'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ss'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='coreduo-v1'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ss'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='n270'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ss'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='n270-v1'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ss'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='phenom'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='3dnow'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='3dnowext'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='phenom-v1'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='3dnow'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='3dnowext'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     </mode>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:   </cpu>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:   <memoryBacking supported='yes'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     <enum name='sourceType'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <value>file</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <value>anonymous</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <value>memfd</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     </enum>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:   </memoryBacking>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:   <devices>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     <disk supported='yes'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <enum name='diskDevice'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>disk</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>cdrom</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>floppy</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>lun</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </enum>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <enum name='bus'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>ide</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>fdc</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>scsi</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>virtio</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>usb</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>sata</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </enum>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <enum name='model'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>virtio</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>virtio-transitional</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>virtio-non-transitional</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </enum>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     </disk>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     <graphics supported='yes'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <enum name='type'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>vnc</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>egl-headless</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>dbus</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </enum>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     </graphics>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     <video supported='yes'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <enum name='modelType'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>vga</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>cirrus</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>virtio</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>none</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>bochs</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>ramfb</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </enum>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     </video>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     <hostdev supported='yes'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <enum name='mode'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>subsystem</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </enum>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <enum name='startupPolicy'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>default</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>mandatory</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>requisite</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>optional</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </enum>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <enum name='subsysType'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>usb</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>pci</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>scsi</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </enum>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <enum name='capsType'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <enum name='pciBackend'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     </hostdev>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     <rng supported='yes'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <enum name='model'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>virtio</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>virtio-transitional</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>virtio-non-transitional</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </enum>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <enum name='backendModel'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>random</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>egd</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>builtin</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </enum>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     </rng>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     <filesystem supported='yes'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <enum name='driverType'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>path</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>handle</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>virtiofs</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </enum>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     </filesystem>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     <tpm supported='yes'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <enum name='model'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>tpm-tis</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>tpm-crb</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </enum>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <enum name='backendModel'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>emulator</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>external</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </enum>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <enum name='backendVersion'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>2.0</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </enum>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     </tpm>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     <redirdev supported='yes'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <enum name='bus'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>usb</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </enum>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     </redirdev>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     <channel supported='yes'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <enum name='type'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>pty</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>unix</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </enum>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     </channel>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     <crypto supported='yes'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <enum name='model'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <enum name='type'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>qemu</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </enum>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <enum name='backendModel'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>builtin</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </enum>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     </crypto>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     <interface supported='yes'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <enum name='backendType'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>default</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>passt</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </enum>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     </interface>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     <panic supported='yes'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <enum name='model'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>isa</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>hyperv</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </enum>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     </panic>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     <console supported='yes'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <enum name='type'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>null</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>vc</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>pty</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>dev</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>file</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>pipe</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>stdio</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>udp</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>tcp</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>unix</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>qemu-vdagent</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>dbus</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </enum>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     </console>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:   </devices>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:   <features>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     <gic supported='no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     <vmcoreinfo supported='yes'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     <genid supported='yes'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     <backingStoreInput supported='yes'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     <backup supported='yes'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     <async-teardown supported='yes'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     <s390-pv supported='no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     <ps2 supported='yes'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     <tdx supported='no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     <sev supported='no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     <sgx supported='no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     <hyperv supported='yes'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <enum name='features'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>relaxed</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>vapic</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>spinlocks</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>vpindex</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>runtime</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>synic</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>stimer</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>reset</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>vendor_id</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>frequencies</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>reenlightenment</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>tlbflush</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>ipi</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>avic</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>emsr_bitmap</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>xmm_input</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </enum>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <defaults>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <spinlocks>4095</spinlocks>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <stimer_direct>on</stimer_direct>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <tlbflush_direct>off</tlbflush_direct>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <tlbflush_extended>off</tlbflush_extended>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <vendor_id>Linux KVM Hv</vendor_id>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </defaults>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     </hyperv>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     <launchSecurity supported='no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:   </features>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: </domainCapabilities>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.184 280526 DEBUG nova.virt.libvirt.host [None req-b8927e34-219e-43f3-a84d-184de0519090 - - - - - -] Getting domain capabilities for x86_64 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.192 280526 DEBUG nova.virt.libvirt.host [None req-b8927e34-219e-43f3-a84d-184de0519090 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: <domainCapabilities>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:   <path>/usr/libexec/qemu-kvm</path>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:   <domain>kvm</domain>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:   <machine>pc-q35-rhel9.8.0</machine>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:   <arch>x86_64</arch>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:   <vcpu max='1024'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:   <iothreads supported='yes'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:   <os supported='yes'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     <enum name='firmware'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <value>efi</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     </enum>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     <loader supported='yes'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <enum name='type'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>rom</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>pflash</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </enum>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <enum name='readonly'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>yes</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>no</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </enum>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <enum name='secure'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>yes</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>no</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </enum>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     </loader>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:   </os>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:   <cpu>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     <mode name='host-passthrough' supported='yes'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <enum name='hostPassthroughMigratable'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>on</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>off</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </enum>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     </mode>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     <mode name='maximum' supported='yes'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <enum name='maximumMigratable'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>on</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>off</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </enum>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     </mode>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     <mode name='host-model' supported='yes'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model fallback='forbid'>EPYC-Rome</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <vendor>AMD</vendor>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <maxphysaddr mode='passthrough' limit='40'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <feature policy='require' name='x2apic'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <feature policy='require' name='tsc-deadline'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <feature policy='require' name='hypervisor'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <feature policy='require' name='tsc_adjust'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <feature policy='require' name='spec-ctrl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <feature policy='require' name='stibp'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <feature policy='require' name='ssbd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <feature policy='require' name='cmp_legacy'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <feature policy='require' name='overflow-recov'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <feature policy='require' name='succor'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <feature policy='require' name='ibrs'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <feature policy='require' name='amd-ssbd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <feature policy='require' name='virt-ssbd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <feature policy='require' name='lbrv'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <feature policy='require' name='tsc-scale'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <feature policy='require' name='vmcb-clean'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <feature policy='require' name='pause-filter'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <feature policy='require' name='pfthreshold'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <feature policy='require' name='svme-addr-chk'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <feature policy='require' name='lfence-always-serializing'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <feature policy='disable' name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     </mode>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     <mode name='custom' supported='yes'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Broadwell'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='hle'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rtm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Broadwell-IBRS'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='hle'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rtm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Broadwell-noTSX'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Broadwell-noTSX-IBRS'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Broadwell-v1'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='hle'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rtm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Broadwell-v2'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Broadwell-v3'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='hle'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rtm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Broadwell-v4'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Cascadelake-Server'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='hle'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rtm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Cascadelake-Server-noTSX'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ibrs-all'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Cascadelake-Server-v1'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='hle'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rtm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Cascadelake-Server-v2'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='hle'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ibrs-all'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rtm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Cascadelake-Server-v3'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ibrs-all'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Cascadelake-Server-v4'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ibrs-all'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Cascadelake-Server-v5'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ibrs-all'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='ClearwaterForest'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx-ifma'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx-ne-convert'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx-vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx-vnni-int16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx-vnni-int8'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='bhi-ctrl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='bhi-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='bus-lock-detect'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='cldemote'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='cmpccxadd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ddpd-u'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fbsdp-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrs'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='gds-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='gfni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ibrs-all'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='intel-psfd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ipred-ctrl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='lam'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='mcdt-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='movdir64b'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='movdiri'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pbrsb-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='prefetchiti'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='psdp-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rfds-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rrsba-ctrl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='serialize'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='sha512'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='sm3'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='sm4'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ss'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vaes'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vpclmulqdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='ClearwaterForest-v1'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx-ifma'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx-ne-convert'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx-vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx-vnni-int16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx-vnni-int8'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='bhi-ctrl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='bhi-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='bus-lock-detect'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='cldemote'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='cmpccxadd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ddpd-u'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fbsdp-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrs'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='gds-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='gfni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ibrs-all'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='intel-psfd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ipred-ctrl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='lam'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='mcdt-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='movdir64b'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='movdiri'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pbrsb-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='prefetchiti'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='psdp-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rfds-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rrsba-ctrl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='serialize'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='sha512'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='sm3'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='sm4'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ss'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vaes'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vpclmulqdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Cooperlake'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-bf16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='hle'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ibrs-all'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rtm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='taa-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Cooperlake-v1'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-bf16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='hle'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ibrs-all'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rtm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='taa-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Cooperlake-v2'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-bf16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='hle'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ibrs-all'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rtm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='taa-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Denverton'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='mpx'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Denverton-v1'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='mpx'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Denverton-v2'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Denverton-v3'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Dhyana-v2'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='EPYC-Genoa'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='amd-psfd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='auto-ibrs'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-bf16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bitalg'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512ifma'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi2'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='gfni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='la57'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='no-nested-data-bp'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='null-sel-clr-base'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='stibp-always-on'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vaes'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vpclmulqdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='EPYC-Genoa-v1'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='amd-psfd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='auto-ibrs'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-bf16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bitalg'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512ifma'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi2'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='gfni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='la57'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='no-nested-data-bp'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='null-sel-clr-base'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='stibp-always-on'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vaes'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vpclmulqdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='EPYC-Genoa-v2'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='amd-psfd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='auto-ibrs'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-bf16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bitalg'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512ifma'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi2'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fs-gs-base-ns'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='gfni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='la57'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='no-nested-data-bp'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='null-sel-clr-base'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='perfmon-v2'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='stibp-always-on'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vaes'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vpclmulqdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='EPYC-Milan'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='EPYC-Milan-v1'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='EPYC-Milan-v2'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='amd-psfd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='no-nested-data-bp'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='null-sel-clr-base'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='stibp-always-on'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vaes'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vpclmulqdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='EPYC-Milan-v3'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='amd-psfd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='no-nested-data-bp'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='null-sel-clr-base'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='stibp-always-on'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vaes'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vpclmulqdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='EPYC-Rome'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='EPYC-Rome-v1'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='EPYC-Rome-v2'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='EPYC-Rome-v3'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='EPYC-Turin'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='amd-psfd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='auto-ibrs'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx-vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-bf16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-vp2intersect'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bitalg'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512ifma'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi2'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fs-gs-base-ns'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='gfni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ibpb-brtype'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='la57'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='movdir64b'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='movdiri'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='no-nested-data-bp'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='null-sel-clr-base'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='perfmon-v2'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='prefetchi'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='sbpb'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='srso-user-kernel-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='stibp-always-on'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vaes'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vpclmulqdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='EPYC-Turin-v1'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='amd-psfd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='auto-ibrs'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx-vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-bf16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-vp2intersect'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bitalg'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512ifma'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi2'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fs-gs-base-ns'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='gfni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ibpb-brtype'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='la57'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='movdir64b'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='movdiri'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='no-nested-data-bp'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='null-sel-clr-base'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='perfmon-v2'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='prefetchi'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='sbpb'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='srso-user-kernel-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='stibp-always-on'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vaes'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vpclmulqdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='EPYC-v3'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='EPYC-v4'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='EPYC-v5'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='GraniteRapids'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='amx-bf16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='amx-fp16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='amx-int8'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='amx-tile'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx-vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-bf16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-fp16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bitalg'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512ifma'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi2'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='bus-lock-detect'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fbsdp-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrc'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrs'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fzrm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='gfni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='hle'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ibrs-all'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='la57'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='mcdt-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pbrsb-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='prefetchiti'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='psdp-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rtm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='serialize'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='taa-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='tsx-ldtrk'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vaes'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vpclmulqdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xfd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='GraniteRapids-v1'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='amx-bf16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='amx-fp16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='amx-int8'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='amx-tile'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx-vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-bf16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-fp16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bitalg'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512ifma'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi2'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='bus-lock-detect'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fbsdp-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrc'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrs'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fzrm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='gfni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='hle'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ibrs-all'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='la57'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='mcdt-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pbrsb-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='prefetchiti'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='psdp-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rtm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='serialize'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='taa-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='tsx-ldtrk'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vaes'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vpclmulqdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xfd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='GraniteRapids-v2'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='amx-bf16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='amx-fp16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='amx-int8'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='amx-tile'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx-vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx10'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx10-128'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx10-256'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx10-512'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-bf16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-fp16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bitalg'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512ifma'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi2'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='bus-lock-detect'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='cldemote'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fbsdp-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrc'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrs'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fzrm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='gfni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='hle'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ibrs-all'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='la57'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='mcdt-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='movdir64b'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='movdiri'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pbrsb-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='prefetchiti'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='psdp-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rtm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='serialize'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ss'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='taa-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='tsx-ldtrk'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vaes'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vpclmulqdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xfd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='GraniteRapids-v3'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='amx-bf16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='amx-fp16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='amx-int8'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='amx-tile'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx-vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx10'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx10-128'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx10-256'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx10-512'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-bf16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-fp16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bitalg'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512ifma'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi2'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='bus-lock-detect'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='cldemote'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fbsdp-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrc'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrs'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fzrm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='gfni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='hle'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ibrs-all'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='la57'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='mcdt-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='movdir64b'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='movdiri'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pbrsb-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='prefetchiti'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='psdp-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rtm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='serialize'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ss'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='taa-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='tsx-ldtrk'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vaes'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vpclmulqdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xfd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Haswell'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='hle'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rtm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Haswell-IBRS'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='hle'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rtm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Haswell-noTSX'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Haswell-noTSX-IBRS'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Haswell-v1'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='hle'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rtm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Haswell-v2'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Haswell-v3'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='hle'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rtm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Haswell-v4'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Icelake-Server'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bitalg'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi2'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='gfni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='hle'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='la57'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rtm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vaes'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vpclmulqdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Icelake-Server-noTSX'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bitalg'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi2'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='gfni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='la57'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vaes'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vpclmulqdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Icelake-Server-v1'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bitalg'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi2'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='gfni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='hle'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='la57'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rtm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vaes'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vpclmulqdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Icelake-Server-v2'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bitalg'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi2'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='gfni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='la57'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vaes'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vpclmulqdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Icelake-Server-v3'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bitalg'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi2'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='gfni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ibrs-all'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='la57'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='taa-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vaes'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vpclmulqdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Icelake-Server-v4'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bitalg'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512ifma'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi2'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='gfni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ibrs-all'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='la57'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='taa-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vaes'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vpclmulqdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Icelake-Server-v5'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bitalg'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512ifma'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi2'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='gfni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ibrs-all'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='la57'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='taa-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vaes'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vpclmulqdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Icelake-Server-v6'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bitalg'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512ifma'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi2'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='gfni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ibrs-all'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='la57'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='taa-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vaes'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vpclmulqdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Icelake-Server-v7'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bitalg'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512ifma'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi2'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='gfni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='hle'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ibrs-all'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='la57'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rtm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='taa-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vaes'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vpclmulqdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='IvyBridge'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='IvyBridge-IBRS'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='IvyBridge-v1'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='IvyBridge-v2'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='KnightsMill'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-4fmaps'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-4vnniw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512er'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512pf'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ss'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='KnightsMill-v1'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-4fmaps'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-4vnniw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512er'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512pf'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ss'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Opteron_G4'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fma4'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xop'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Opteron_G4-v1'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fma4'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xop'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Opteron_G5'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fma4'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='tbm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xop'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Opteron_G5-v1'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fma4'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='tbm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xop'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='SapphireRapids'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='amx-bf16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='amx-int8'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='amx-tile'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx-vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-bf16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-fp16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bitalg'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512ifma'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi2'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='bus-lock-detect'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrc'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrs'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fzrm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='gfni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='hle'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ibrs-all'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='la57'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rtm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='serialize'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='taa-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='tsx-ldtrk'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vaes'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vpclmulqdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xfd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='SapphireRapids-v1'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='amx-bf16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='amx-int8'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='amx-tile'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx-vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-bf16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-fp16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bitalg'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512ifma'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi2'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='bus-lock-detect'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrc'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrs'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fzrm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='gfni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='hle'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ibrs-all'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='la57'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rtm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='serialize'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='taa-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='tsx-ldtrk'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vaes'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vpclmulqdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xfd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='SapphireRapids-v2'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='amx-bf16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='amx-int8'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='amx-tile'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx-vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-bf16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-fp16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bitalg'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512ifma'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi2'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='bus-lock-detect'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fbsdp-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrc'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrs'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fzrm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='gfni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='hle'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ibrs-all'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='la57'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='psdp-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rtm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='serialize'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='taa-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='tsx-ldtrk'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vaes'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vpclmulqdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xfd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='SapphireRapids-v3'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='amx-bf16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='amx-int8'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='amx-tile'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx-vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-bf16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-fp16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bitalg'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512ifma'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi2'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='bus-lock-detect'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='cldemote'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fbsdp-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrc'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrs'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fzrm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='gfni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='hle'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ibrs-all'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='la57'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='movdir64b'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='movdiri'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='psdp-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rtm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='serialize'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ss'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='taa-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='tsx-ldtrk'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vaes'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vpclmulqdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xfd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='SapphireRapids-v4'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='amx-bf16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='amx-int8'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='amx-tile'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx-vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-bf16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-fp16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bitalg'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512ifma'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi2'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='bus-lock-detect'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='cldemote'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fbsdp-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrc'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrs'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fzrm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='gfni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='hle'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ibrs-all'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='la57'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='movdir64b'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='movdiri'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='psdp-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rtm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='serialize'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ss'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='taa-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='tsx-ldtrk'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vaes'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vpclmulqdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xfd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='SierraForest'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx-ifma'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx-ne-convert'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx-vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx-vnni-int8'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='bus-lock-detect'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='cmpccxadd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fbsdp-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrs'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='gfni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ibrs-all'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='mcdt-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pbrsb-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='psdp-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='serialize'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vaes'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vpclmulqdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='SierraForest-v1'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx-ifma'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx-ne-convert'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx-vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx-vnni-int8'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='bus-lock-detect'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='cmpccxadd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fbsdp-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrs'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='gfni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ibrs-all'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='mcdt-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pbrsb-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='psdp-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='serialize'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vaes'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vpclmulqdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='SierraForest-v2'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx-ifma'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx-ne-convert'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx-vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx-vnni-int8'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='bhi-ctrl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='bus-lock-detect'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='cldemote'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='cmpccxadd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fbsdp-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrs'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='gds-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='gfni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ibrs-all'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='intel-psfd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ipred-ctrl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='lam'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='mcdt-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='movdir64b'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='movdiri'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pbrsb-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='psdp-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rfds-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rrsba-ctrl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='serialize'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ss'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vaes'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vpclmulqdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='SierraForest-v3'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx-ifma'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx-ne-convert'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx-vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx-vnni-int8'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='bhi-ctrl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='bus-lock-detect'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='cldemote'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='cmpccxadd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fbsdp-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrs'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='gds-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='gfni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ibrs-all'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='intel-psfd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ipred-ctrl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='lam'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='mcdt-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='movdir64b'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='movdiri'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pbrsb-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='psdp-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rfds-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rrsba-ctrl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='serialize'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ss'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vaes'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vpclmulqdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Skylake-Client'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='hle'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rtm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Skylake-Client-IBRS'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='hle'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rtm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Skylake-Client-v1'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='hle'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rtm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Skylake-Client-v2'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='hle'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rtm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Skylake-Client-v3'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Skylake-Client-v4'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Skylake-Server'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='hle'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rtm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Skylake-Server-IBRS'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='hle'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rtm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Skylake-Server-v1'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='hle'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rtm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Skylake-Server-v2'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='hle'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rtm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Skylake-Server-v3'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Skylake-Server-v4'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Skylake-Server-v5'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Snowridge'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='cldemote'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='core-capability'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='gfni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='movdir64b'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='movdiri'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='mpx'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='split-lock-detect'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Snowridge-v1'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='cldemote'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='core-capability'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='gfni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='movdir64b'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='movdiri'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='mpx'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='split-lock-detect'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Snowridge-v2'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='cldemote'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='core-capability'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='gfni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='movdir64b'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='movdiri'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='split-lock-detect'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Snowridge-v3'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='cldemote'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='core-capability'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='gfni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='movdir64b'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='movdiri'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='split-lock-detect'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Snowridge-v4'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='cldemote'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='gfni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='movdir64b'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='movdiri'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='athlon'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='3dnow'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='3dnowext'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='athlon-v1'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='3dnow'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='3dnowext'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='core2duo'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ss'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='core2duo-v1'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ss'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='coreduo'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ss'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='coreduo-v1'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ss'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='n270'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ss'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='n270-v1'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ss'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='phenom'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='3dnow'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='3dnowext'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='phenom-v1'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='3dnow'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='3dnowext'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     </mode>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:   </cpu>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:   <memoryBacking supported='yes'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     <enum name='sourceType'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <value>file</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <value>anonymous</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <value>memfd</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     </enum>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:   </memoryBacking>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:   <devices>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     <disk supported='yes'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <enum name='diskDevice'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>disk</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>cdrom</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>floppy</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>lun</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </enum>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <enum name='bus'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>fdc</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>scsi</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>virtio</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>usb</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>sata</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </enum>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <enum name='model'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>virtio</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>virtio-transitional</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>virtio-non-transitional</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </enum>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     </disk>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     <graphics supported='yes'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <enum name='type'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>vnc</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>egl-headless</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>dbus</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </enum>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     </graphics>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     <video supported='yes'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <enum name='modelType'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>vga</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>cirrus</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>virtio</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>none</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>bochs</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>ramfb</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </enum>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     </video>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     <hostdev supported='yes'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <enum name='mode'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>subsystem</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </enum>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <enum name='startupPolicy'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>default</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>mandatory</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>requisite</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>optional</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </enum>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <enum name='subsysType'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>usb</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>pci</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>scsi</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </enum>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <enum name='capsType'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <enum name='pciBackend'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     </hostdev>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     <rng supported='yes'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <enum name='model'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>virtio</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>virtio-transitional</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>virtio-non-transitional</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </enum>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <enum name='backendModel'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>random</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>egd</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>builtin</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </enum>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     </rng>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     <filesystem supported='yes'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <enum name='driverType'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>path</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>handle</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>virtiofs</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </enum>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     </filesystem>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     <tpm supported='yes'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <enum name='model'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>tpm-tis</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>tpm-crb</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </enum>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <enum name='backendModel'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>emulator</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>external</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </enum>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <enum name='backendVersion'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>2.0</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </enum>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     </tpm>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     <redirdev supported='yes'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <enum name='bus'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>usb</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </enum>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     </redirdev>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     <channel supported='yes'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <enum name='type'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>pty</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>unix</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </enum>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     </channel>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     <crypto supported='yes'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <enum name='model'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <enum name='type'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>qemu</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </enum>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <enum name='backendModel'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>builtin</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </enum>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     </crypto>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     <interface supported='yes'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <enum name='backendType'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>default</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>passt</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </enum>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     </interface>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     <panic supported='yes'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <enum name='model'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>isa</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>hyperv</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </enum>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     </panic>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     <console supported='yes'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <enum name='type'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>null</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>vc</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>pty</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>dev</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>file</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>pipe</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>stdio</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>udp</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>tcp</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>unix</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>qemu-vdagent</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>dbus</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </enum>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     </console>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:   </devices>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:   <features>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     <gic supported='no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     <vmcoreinfo supported='yes'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     <genid supported='yes'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     <backingStoreInput supported='yes'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     <backup supported='yes'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     <async-teardown supported='yes'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     <s390-pv supported='no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     <ps2 supported='yes'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     <tdx supported='no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     <sev supported='no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     <sgx supported='no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     <hyperv supported='yes'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <enum name='features'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>relaxed</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>vapic</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>spinlocks</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>vpindex</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>runtime</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>synic</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>stimer</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>reset</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>vendor_id</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>frequencies</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>reenlightenment</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>tlbflush</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>ipi</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>avic</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>emsr_bitmap</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>xmm_input</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </enum>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <defaults>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <spinlocks>4095</spinlocks>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <stimer_direct>on</stimer_direct>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <tlbflush_direct>off</tlbflush_direct>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <tlbflush_extended>off</tlbflush_extended>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <vendor_id>Linux KVM Hv</vendor_id>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </defaults>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     </hyperv>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     <launchSecurity supported='no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:   </features>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: </domainCapabilities>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.246 280526 DEBUG nova.virt.libvirt.host [None req-b8927e34-219e-43f3-a84d-184de0519090 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: <domainCapabilities>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:   <path>/usr/libexec/qemu-kvm</path>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:   <domain>kvm</domain>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:   <machine>pc-i440fx-rhel7.6.0</machine>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:   <arch>x86_64</arch>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:   <vcpu max='240'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:   <iothreads supported='yes'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:   <os supported='yes'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     <enum name='firmware'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     <loader supported='yes'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <enum name='type'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>rom</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>pflash</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </enum>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <enum name='readonly'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>yes</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>no</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </enum>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <enum name='secure'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>no</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </enum>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     </loader>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:   </os>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:   <cpu>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     <mode name='host-passthrough' supported='yes'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <enum name='hostPassthroughMigratable'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>on</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>off</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </enum>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     </mode>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     <mode name='maximum' supported='yes'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <enum name='maximumMigratable'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>on</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>off</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </enum>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     </mode>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     <mode name='host-model' supported='yes'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model fallback='forbid'>EPYC-Rome</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <vendor>AMD</vendor>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <maxphysaddr mode='passthrough' limit='40'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <feature policy='require' name='x2apic'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <feature policy='require' name='tsc-deadline'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <feature policy='require' name='hypervisor'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <feature policy='require' name='tsc_adjust'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <feature policy='require' name='spec-ctrl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <feature policy='require' name='stibp'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <feature policy='require' name='ssbd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <feature policy='require' name='cmp_legacy'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <feature policy='require' name='overflow-recov'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <feature policy='require' name='succor'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <feature policy='require' name='ibrs'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <feature policy='require' name='amd-ssbd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <feature policy='require' name='virt-ssbd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <feature policy='require' name='lbrv'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <feature policy='require' name='tsc-scale'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <feature policy='require' name='vmcb-clean'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <feature policy='require' name='pause-filter'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <feature policy='require' name='pfthreshold'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <feature policy='require' name='svme-addr-chk'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <feature policy='require' name='lfence-always-serializing'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <feature policy='disable' name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     </mode>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     <mode name='custom' supported='yes'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Broadwell'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='hle'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rtm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Broadwell-IBRS'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='hle'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rtm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Broadwell-noTSX'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Broadwell-noTSX-IBRS'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Broadwell-v1'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='hle'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rtm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Broadwell-v2'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Broadwell-v3'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='hle'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rtm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Broadwell-v4'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Cascadelake-Server'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='hle'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rtm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Cascadelake-Server-noTSX'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ibrs-all'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Cascadelake-Server-v1'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='hle'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rtm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Cascadelake-Server-v2'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='hle'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ibrs-all'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rtm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Cascadelake-Server-v3'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ibrs-all'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Cascadelake-Server-v4'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ibrs-all'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Cascadelake-Server-v5'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ibrs-all'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='ClearwaterForest'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx-ifma'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx-ne-convert'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx-vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx-vnni-int16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx-vnni-int8'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='bhi-ctrl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='bhi-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='bus-lock-detect'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='cldemote'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='cmpccxadd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ddpd-u'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fbsdp-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrs'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='gds-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='gfni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ibrs-all'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='intel-psfd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ipred-ctrl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='lam'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='mcdt-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='movdir64b'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='movdiri'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pbrsb-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='prefetchiti'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='psdp-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rfds-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rrsba-ctrl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='serialize'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='sha512'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='sm3'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='sm4'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ss'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vaes'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vpclmulqdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='ClearwaterForest-v1'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx-ifma'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx-ne-convert'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx-vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx-vnni-int16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx-vnni-int8'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='bhi-ctrl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='bhi-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='bus-lock-detect'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='cldemote'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='cmpccxadd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ddpd-u'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fbsdp-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrs'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='gds-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='gfni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ibrs-all'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='intel-psfd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ipred-ctrl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='lam'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='mcdt-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='movdir64b'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='movdiri'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pbrsb-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='prefetchiti'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='psdp-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rfds-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rrsba-ctrl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='serialize'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='sha512'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='sm3'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='sm4'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ss'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vaes'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vpclmulqdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Cooperlake'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-bf16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='hle'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ibrs-all'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rtm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='taa-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Cooperlake-v1'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-bf16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='hle'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ibrs-all'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rtm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='taa-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Cooperlake-v2'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-bf16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='hle'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ibrs-all'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rtm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='taa-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Denverton'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='mpx'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Denverton-v1'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='mpx'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Denverton-v2'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Denverton-v3'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Dhyana-v2'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='EPYC-Genoa'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='amd-psfd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='auto-ibrs'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-bf16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bitalg'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512ifma'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi2'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='gfni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='la57'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='no-nested-data-bp'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='null-sel-clr-base'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='stibp-always-on'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vaes'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vpclmulqdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='EPYC-Genoa-v1'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='amd-psfd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='auto-ibrs'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-bf16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bitalg'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512ifma'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi2'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='gfni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='la57'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='no-nested-data-bp'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='null-sel-clr-base'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='stibp-always-on'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vaes'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vpclmulqdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='EPYC-Genoa-v2'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='amd-psfd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='auto-ibrs'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-bf16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bitalg'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512ifma'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi2'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fs-gs-base-ns'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='gfni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='la57'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='no-nested-data-bp'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='null-sel-clr-base'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='perfmon-v2'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='stibp-always-on'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vaes'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vpclmulqdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='EPYC-Milan'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='EPYC-Milan-v1'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='EPYC-Milan-v2'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='amd-psfd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='no-nested-data-bp'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='null-sel-clr-base'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='stibp-always-on'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vaes'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vpclmulqdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='EPYC-Milan-v3'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='amd-psfd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='no-nested-data-bp'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='null-sel-clr-base'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='stibp-always-on'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vaes'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vpclmulqdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='EPYC-Rome'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='EPYC-Rome-v1'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='EPYC-Rome-v2'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='EPYC-Rome-v3'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='EPYC-Turin'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='amd-psfd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='auto-ibrs'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx-vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-bf16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-vp2intersect'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bitalg'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512ifma'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi2'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fs-gs-base-ns'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='gfni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ibpb-brtype'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='la57'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='movdir64b'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='movdiri'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='no-nested-data-bp'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='null-sel-clr-base'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='perfmon-v2'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='prefetchi'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='sbpb'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='srso-user-kernel-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='stibp-always-on'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vaes'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vpclmulqdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='EPYC-Turin-v1'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='amd-psfd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='auto-ibrs'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx-vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-bf16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-vp2intersect'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bitalg'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512ifma'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi2'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fs-gs-base-ns'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='gfni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ibpb-brtype'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='la57'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='movdir64b'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='movdiri'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='no-nested-data-bp'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='null-sel-clr-base'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='perfmon-v2'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='prefetchi'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='sbpb'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='srso-user-kernel-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='stibp-always-on'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vaes'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vpclmulqdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='EPYC-v3'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='EPYC-v4'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='EPYC-v5'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='GraniteRapids'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='amx-bf16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='amx-fp16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='amx-int8'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='amx-tile'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx-vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-bf16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-fp16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bitalg'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512ifma'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi2'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='bus-lock-detect'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fbsdp-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrc'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrs'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fzrm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='gfni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='hle'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ibrs-all'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='la57'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='mcdt-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pbrsb-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='prefetchiti'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='psdp-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rtm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='serialize'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='taa-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='tsx-ldtrk'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vaes'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vpclmulqdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xfd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='GraniteRapids-v1'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='amx-bf16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='amx-fp16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='amx-int8'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='amx-tile'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx-vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-bf16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-fp16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bitalg'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512ifma'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi2'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='bus-lock-detect'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fbsdp-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrc'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrs'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fzrm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='gfni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='hle'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ibrs-all'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='la57'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='mcdt-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pbrsb-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='prefetchiti'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='psdp-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rtm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='serialize'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='taa-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='tsx-ldtrk'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vaes'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vpclmulqdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xfd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='GraniteRapids-v2'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='amx-bf16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='amx-fp16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='amx-int8'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='amx-tile'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx-vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx10'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx10-128'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx10-256'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx10-512'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-bf16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-fp16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bitalg'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512ifma'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi2'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='bus-lock-detect'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='cldemote'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fbsdp-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrc'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrs'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fzrm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='gfni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='hle'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ibrs-all'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='la57'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='mcdt-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='movdir64b'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='movdiri'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pbrsb-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='prefetchiti'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='psdp-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rtm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='serialize'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ss'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='taa-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='tsx-ldtrk'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vaes'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vpclmulqdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xfd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='GraniteRapids-v3'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='amx-bf16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='amx-fp16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='amx-int8'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='amx-tile'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx-vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx10'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx10-128'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx10-256'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx10-512'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-bf16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-fp16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bitalg'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512ifma'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi2'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='bus-lock-detect'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='cldemote'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fbsdp-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrc'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrs'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fzrm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='gfni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='hle'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ibrs-all'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='la57'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='mcdt-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='movdir64b'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='movdiri'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pbrsb-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='prefetchiti'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='psdp-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rtm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='serialize'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ss'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='taa-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='tsx-ldtrk'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vaes'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vpclmulqdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xfd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Haswell'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='hle'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rtm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Haswell-IBRS'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='hle'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rtm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Haswell-noTSX'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Haswell-noTSX-IBRS'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Haswell-v1'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='hle'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rtm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Haswell-v2'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Haswell-v3'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='hle'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rtm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Haswell-v4'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Icelake-Server'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bitalg'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi2'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='gfni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='hle'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='la57'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rtm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vaes'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vpclmulqdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Icelake-Server-noTSX'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bitalg'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi2'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='gfni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='la57'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vaes'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vpclmulqdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Icelake-Server-v1'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bitalg'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi2'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='gfni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='hle'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='la57'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rtm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vaes'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vpclmulqdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Icelake-Server-v2'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bitalg'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi2'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='gfni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='la57'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vaes'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vpclmulqdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Icelake-Server-v3'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bitalg'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi2'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='gfni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ibrs-all'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='la57'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='taa-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vaes'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vpclmulqdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Icelake-Server-v4'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bitalg'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512ifma'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi2'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='gfni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ibrs-all'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='la57'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='taa-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vaes'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vpclmulqdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Icelake-Server-v5'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bitalg'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512ifma'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi2'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='gfni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ibrs-all'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='la57'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='taa-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vaes'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vpclmulqdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Icelake-Server-v6'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bitalg'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512ifma'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi2'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='gfni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ibrs-all'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='la57'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='taa-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vaes'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vpclmulqdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Icelake-Server-v7'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bitalg'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512ifma'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi2'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='gfni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='hle'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ibrs-all'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='la57'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rtm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='taa-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vaes'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vpclmulqdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='IvyBridge'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='IvyBridge-IBRS'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='IvyBridge-v1'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='IvyBridge-v2'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='KnightsMill'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-4fmaps'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-4vnniw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512er'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512pf'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ss'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='KnightsMill-v1'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-4fmaps'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-4vnniw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512er'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512pf'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ss'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Opteron_G4'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fma4'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xop'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Opteron_G4-v1'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fma4'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xop'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Opteron_G5'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fma4'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='tbm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xop'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Opteron_G5-v1'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fma4'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='tbm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xop'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='SapphireRapids'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='amx-bf16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='amx-int8'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='amx-tile'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx-vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-bf16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-fp16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bitalg'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512ifma'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi2'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='bus-lock-detect'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrc'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrs'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fzrm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='gfni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='hle'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ibrs-all'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='la57'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rtm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='serialize'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='taa-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='tsx-ldtrk'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vaes'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vpclmulqdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xfd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='SapphireRapids-v1'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='amx-bf16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='amx-int8'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='amx-tile'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx-vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-bf16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-fp16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bitalg'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512ifma'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi2'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='bus-lock-detect'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrc'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrs'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fzrm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='gfni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='hle'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ibrs-all'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='la57'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rtm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='serialize'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='taa-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='tsx-ldtrk'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vaes'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vpclmulqdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xfd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='SapphireRapids-v2'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='amx-bf16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='amx-int8'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='amx-tile'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx-vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-bf16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-fp16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bitalg'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512ifma'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi2'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='bus-lock-detect'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fbsdp-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrc'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrs'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fzrm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='gfni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='hle'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ibrs-all'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='la57'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='psdp-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rtm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='serialize'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='taa-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='tsx-ldtrk'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vaes'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vpclmulqdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xfd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='SapphireRapids-v3'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='amx-bf16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='amx-int8'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='amx-tile'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx-vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-bf16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-fp16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bitalg'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512ifma'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi2'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='bus-lock-detect'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='cldemote'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fbsdp-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrc'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrs'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fzrm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='gfni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='hle'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ibrs-all'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='la57'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='movdir64b'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='movdiri'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='psdp-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rtm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='serialize'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ss'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='taa-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='tsx-ldtrk'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vaes'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vpclmulqdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xfd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='SapphireRapids-v4'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='amx-bf16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='amx-int8'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='amx-tile'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx-vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-bf16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-fp16'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bitalg'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512ifma'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vbmi2'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='bus-lock-detect'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='cldemote'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fbsdp-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrc'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrs'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fzrm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='gfni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='hle'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ibrs-all'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='la57'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='movdir64b'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='movdiri'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='psdp-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rtm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='serialize'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ss'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='taa-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='tsx-ldtrk'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vaes'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vpclmulqdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xfd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='SierraForest'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx-ifma'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx-ne-convert'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx-vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx-vnni-int8'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='bus-lock-detect'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='cmpccxadd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fbsdp-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrs'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='gfni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ibrs-all'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='mcdt-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pbrsb-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='psdp-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='serialize'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vaes'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vpclmulqdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='SierraForest-v1'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx-ifma'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx-ne-convert'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx-vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx-vnni-int8'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='bus-lock-detect'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='cmpccxadd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fbsdp-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrs'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='gfni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ibrs-all'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='mcdt-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pbrsb-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='psdp-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='serialize'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vaes'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vpclmulqdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='SierraForest-v2'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx-ifma'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx-ne-convert'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx-vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx-vnni-int8'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='bhi-ctrl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='bus-lock-detect'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='cldemote'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='cmpccxadd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fbsdp-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrs'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='gds-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='gfni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ibrs-all'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='intel-psfd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ipred-ctrl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='lam'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='mcdt-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='movdir64b'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='movdiri'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pbrsb-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='psdp-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rfds-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rrsba-ctrl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='serialize'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ss'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vaes'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vpclmulqdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='SierraForest-v3'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx-ifma'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx-ne-convert'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx-vnni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx-vnni-int8'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='bhi-ctrl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='bus-lock-detect'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='cldemote'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='cmpccxadd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fbsdp-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='fsrs'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='gds-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='gfni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ibrs-all'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='intel-psfd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ipred-ctrl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='lam'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='mcdt-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='movdir64b'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='movdiri'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pbrsb-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='psdp-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rfds-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rrsba-ctrl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='serialize'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ss'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vaes'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='vpclmulqdq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Skylake-Client'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='hle'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rtm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Skylake-Client-IBRS'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='hle'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rtm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Skylake-Client-v1'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='hle'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rtm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Skylake-Client-v2'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='hle'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rtm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Skylake-Client-v3'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Skylake-Client-v4'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Skylake-Server'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='hle'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rtm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Skylake-Server-IBRS'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='hle'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rtm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Skylake-Server-v1'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='hle'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rtm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Skylake-Server-v2'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='hle'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='rtm'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Skylake-Server-v3'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Skylake-Server-v4'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Skylake-Server-v5'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512bw'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512cd'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512dq'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512f'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='avx512vl'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='invpcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pcid'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='pku'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Snowridge'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='cldemote'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='core-capability'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='gfni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='movdir64b'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='movdiri'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='mpx'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='split-lock-detect'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Snowridge-v1'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='cldemote'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='core-capability'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='gfni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='movdir64b'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='movdiri'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='mpx'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='split-lock-detect'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Snowridge-v2'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='cldemote'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='core-capability'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='gfni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='movdir64b'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='movdiri'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='split-lock-detect'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Snowridge-v3'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='cldemote'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='core-capability'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='gfni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='movdir64b'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='movdiri'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='split-lock-detect'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='Snowridge-v4'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='cldemote'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='erms'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='gfni'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='movdir64b'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='movdiri'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='xsaves'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='athlon'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='3dnow'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='3dnowext'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='athlon-v1'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='3dnow'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='3dnowext'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='core2duo'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ss'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='core2duo-v1'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ss'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='coreduo'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ss'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='coreduo-v1'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ss'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='n270'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ss'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='n270-v1'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='ss'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='phenom'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='3dnow'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='3dnowext'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <blockers model='phenom-v1'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='3dnow'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <feature name='3dnowext'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </blockers>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     </mode>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:   </cpu>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:   <memoryBacking supported='yes'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     <enum name='sourceType'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <value>file</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <value>anonymous</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <value>memfd</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     </enum>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:   </memoryBacking>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:   <devices>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     <disk supported='yes'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <enum name='diskDevice'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>disk</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>cdrom</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>floppy</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>lun</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </enum>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <enum name='bus'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>ide</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>fdc</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>scsi</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>virtio</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>usb</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>sata</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </enum>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <enum name='model'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>virtio</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>virtio-transitional</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>virtio-non-transitional</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </enum>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     </disk>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     <graphics supported='yes'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <enum name='type'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>vnc</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>egl-headless</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>dbus</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </enum>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     </graphics>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     <video supported='yes'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <enum name='modelType'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>vga</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>cirrus</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>virtio</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>none</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>bochs</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>ramfb</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </enum>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     </video>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     <hostdev supported='yes'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <enum name='mode'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>subsystem</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </enum>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <enum name='startupPolicy'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>default</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>mandatory</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>requisite</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>optional</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </enum>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <enum name='subsysType'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>usb</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>pci</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>scsi</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </enum>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <enum name='capsType'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <enum name='pciBackend'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     </hostdev>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     <rng supported='yes'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <enum name='model'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>virtio</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>virtio-transitional</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>virtio-non-transitional</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </enum>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <enum name='backendModel'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>random</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>egd</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>builtin</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </enum>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     </rng>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     <filesystem supported='yes'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <enum name='driverType'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>path</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>handle</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>virtiofs</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </enum>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     </filesystem>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     <tpm supported='yes'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <enum name='model'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>tpm-tis</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>tpm-crb</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </enum>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <enum name='backendModel'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>emulator</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>external</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </enum>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <enum name='backendVersion'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>2.0</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </enum>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     </tpm>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     <redirdev supported='yes'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <enum name='bus'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>usb</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </enum>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     </redirdev>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     <channel supported='yes'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <enum name='type'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>pty</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>unix</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </enum>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     </channel>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     <crypto supported='yes'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <enum name='model'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <enum name='type'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>qemu</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </enum>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <enum name='backendModel'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>builtin</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </enum>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     </crypto>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     <interface supported='yes'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <enum name='backendType'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>default</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>passt</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </enum>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     </interface>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     <panic supported='yes'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <enum name='model'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>isa</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>hyperv</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </enum>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     </panic>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     <console supported='yes'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <enum name='type'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>null</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>vc</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>pty</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>dev</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>file</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>pipe</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>stdio</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>udp</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>tcp</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>unix</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>qemu-vdagent</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>dbus</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </enum>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     </console>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:   </devices>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:   <features>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     <gic supported='no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     <vmcoreinfo supported='yes'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     <genid supported='yes'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     <backingStoreInput supported='yes'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     <backup supported='yes'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     <async-teardown supported='yes'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     <s390-pv supported='no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     <ps2 supported='yes'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     <tdx supported='no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     <sev supported='no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     <sgx supported='no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     <hyperv supported='yes'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <enum name='features'>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>relaxed</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>vapic</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>spinlocks</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>vpindex</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>runtime</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>synic</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>stimer</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>reset</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>vendor_id</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>frequencies</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>reenlightenment</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>tlbflush</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>ipi</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>avic</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>emsr_bitmap</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <value>xmm_input</value>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </enum>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       <defaults>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <spinlocks>4095</spinlocks>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <stimer_direct>on</stimer_direct>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <tlbflush_direct>off</tlbflush_direct>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <tlbflush_extended>off</tlbflush_extended>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:         <vendor_id>Linux KVM Hv</vendor_id>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:       </defaults>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     </hyperv>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:     <launchSecurity supported='no'/>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:   </features>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: </domainCapabilities>
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.297 280526 DEBUG nova.virt.libvirt.host [None req-b8927e34-219e-43f3-a84d-184de0519090 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.298 280526 INFO nova.virt.libvirt.host [None req-b8927e34-219e-43f3-a84d-184de0519090 - - - - - -] Secure Boot support detected
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.301 280526 INFO nova.virt.libvirt.driver [None req-b8927e34-219e-43f3-a84d-184de0519090 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.302 280526 INFO nova.virt.libvirt.driver [None req-b8927e34-219e-43f3-a84d-184de0519090 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.317 280526 DEBUG nova.virt.libvirt.driver [None req-b8927e34-219e-43f3-a84d-184de0519090 - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.343 280526 INFO nova.virt.node [None req-b8927e34-219e-43f3-a84d-184de0519090 - - - - - -] Determined node identity be63d86c-a403-4ec9-a515-07ea2962cb4d from /var/lib/nova/compute_id
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.357 280526 DEBUG nova.compute.manager [None req-b8927e34-219e-43f3-a84d-184de0519090 - - - - - -] Verified node be63d86c-a403-4ec9-a515-07ea2962cb4d matches my host np0005626463.localdomain _check_for_host_rename /usr/lib/python3.9/site-packages/nova/compute/manager.py:1568
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.387 280526 DEBUG nova.compute.manager [None req-b8927e34-219e-43f3-a84d-184de0519090 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.392 280526 DEBUG nova.virt.libvirt.vif [None req-b8927e34-219e-43f3-a84d-184de0519090 - - - - - -] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-23T08:22:59Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='test',display_name='test',ec2_ids=<?>,ephemeral_gb=1,ephemeral_key_uuid=None,fault=<?>,flavor=<?>,hidden=False,host='np0005626463.localdomain',hostname='test',id=3,image_ref='a9204248-210d-45b5-ab0a-d1ec08a73a4f',info_cache=InstanceInfoCache,instance_type_id=3,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-23T08:23:11Z,launched_on='np0005626463.localdomain',locked=False,locked_by=None,memory_mb=512,metadata={},migration_context=<?>,new_flavor=<?>,node='np0005626463.localdomain',numa_topology=None,old_flavor=<?>,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='37b8098efb0d4ecc90b451a2db0e966f',ramdisk_id='',reservation_id='r-90tij075',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata=<?>,tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-02-23T08:23:11Z,user_data=None,user_id='cb6895487918456aa599ca2f76872d00',uuid=c2a7d92b-952f-46a7-8a6a-3322a48fcf4b,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "address": "fa:16:3e:a0:9d:00", "network": {"id": "9da5b53d-3184-450f-9a5b-bdba1a6c9f6d", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.12", "type": 
"fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "37b8098efb0d4ecc90b451a2db0e966f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa27e5011-20", "ovs_interfaceid": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.392 280526 DEBUG nova.network.os_vif_util [None req-b8927e34-219e-43f3-a84d-184de0519090 - - - - - -] Converting VIF {"id": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "address": "fa:16:3e:a0:9d:00", "network": {"id": "9da5b53d-3184-450f-9a5b-bdba1a6c9f6d", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "37b8098efb0d4ecc90b451a2db0e966f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa27e5011-20", "ovs_interfaceid": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.393 280526 DEBUG nova.network.os_vif_util [None req-b8927e34-219e-43f3-a84d-184de0519090 - - - - - -] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:a0:9d:00,bridge_name='br-int',has_traffic_filtering=True,id=a27e5011-2016-4b16-b5e8-04b555b30bc4,network=Network(9da5b53d-3184-450f-9a5b-bdba1a6c9f6d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa27e5011-20') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.394 280526 DEBUG os_vif [None req-b8927e34-219e-43f3-a84d-184de0519090 - - - - - -] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:a0:9d:00,bridge_name='br-int',has_traffic_filtering=True,id=a27e5011-2016-4b16-b5e8-04b555b30bc4,network=Network(9da5b53d-3184-450f-9a5b-bdba1a6c9f6d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa27e5011-20') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.489 280526 DEBUG ovsdbapp.backend.ovs_idl [None req-b8927e34-219e-43f3-a84d-184de0519090 - - - - - -] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.490 280526 DEBUG ovsdbapp.backend.ovs_idl [None req-b8927e34-219e-43f3-a84d-184de0519090 - - - - - -] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.490 280526 DEBUG ovsdbapp.backend.ovs_idl [None req-b8927e34-219e-43f3-a84d-184de0519090 - - - - - -] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.490 280526 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-b8927e34-219e-43f3-a84d-184de0519090 - - - - - -] tcp:127.0.0.1:6640: entering CONNECTING _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.491 280526 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-b8927e34-219e-43f3-a84d-184de0519090 - - - - - -] [POLLOUT] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.491 280526 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-b8927e34-219e-43f3-a84d-184de0519090 - - - - - -] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.491 280526 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-b8927e34-219e-43f3-a84d-184de0519090 - - - - - -] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.493 280526 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-b8927e34-219e-43f3-a84d-184de0519090 - - - - - -] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.496 280526 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-b8927e34-219e-43f3-a84d-184de0519090 - - - - - -] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.510 280526 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.511 280526 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.511 280526 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 23 09:38:50 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:50.512 280526 INFO oslo.privsep.daemon [None req-b8927e34-219e-43f3-a84d-184de0519090 - - - - - -] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'vif_plug_ovs.privsep.vif_plug', '--privsep_sock_path', '/tmp/tmpc7cdv_y2/privsep.sock']
Feb 23 09:38:50 np0005626463.localdomain sudo[280986]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-advggfydmrvrdidgzflaltmqlwkbtwic ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839530.1622736-3312-130469988177729/AnsiballZ_copy.py
Feb 23 09:38:50 np0005626463.localdomain sudo[280986]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:38:50 np0005626463.localdomain python3.9[280990]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1771839530.1622736-3312-130469988177729/source dest=/etc/systemd/system/edpm_nova_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:38:50 np0005626463.localdomain sudo[280986]: pam_unix(sudo:session): session closed for user root
Feb 23 09:38:51 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:51.189 280526 INFO oslo.privsep.daemon [None req-b8927e34-219e-43f3-a84d-184de0519090 - - - - - -] Spawned new privsep daemon via rootwrap
Feb 23 09:38:51 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:51.063 280993 INFO oslo.privsep.daemon [-] privsep daemon starting
Feb 23 09:38:51 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:51.068 280993 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Feb 23 09:38:51 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:51.071 280993 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_NET_ADMIN/CAP_DAC_OVERRIDE|CAP_NET_ADMIN/none
Feb 23 09:38:51 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:51.072 280993 INFO oslo.privsep.daemon [-] privsep daemon running as pid 280993
Feb 23 09:38:51 np0005626463.localdomain sudo[281049]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-krtqpjcgpyboriyussqzffsniismiueh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839530.1622736-3312-130469988177729/AnsiballZ_systemd.py
Feb 23 09:38:51 np0005626463.localdomain sudo[281049]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:38:51 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:51.441 280526 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:38:51 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:51.442 280526 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa27e5011-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 09:38:51 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:51.443 280526 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapa27e5011-20, col_values=(('external_ids', {'iface-id': 'a27e5011-2016-4b16-b5e8-04b555b30bc4', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:a0:9d:00', 'vm-uuid': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 09:38:51 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:51.444 280526 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 23 09:38:51 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:51.445 280526 INFO os_vif [None req-b8927e34-219e-43f3-a84d-184de0519090 - - - - - -] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:a0:9d:00,bridge_name='br-int',has_traffic_filtering=True,id=a27e5011-2016-4b16-b5e8-04b555b30bc4,network=Network(9da5b53d-3184-450f-9a5b-bdba1a6c9f6d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa27e5011-20')
Feb 23 09:38:51 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:51.445 280526 DEBUG nova.compute.manager [None req-b8927e34-219e-43f3-a84d-184de0519090 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 23 09:38:51 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:51.450 280526 DEBUG nova.compute.manager [None req-b8927e34-219e-43f3-a84d-184de0519090 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Current state is 1, state in DB is 1. _init_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:1304
Feb 23 09:38:51 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:51.450 280526 INFO nova.compute.manager [None req-b8927e34-219e-43f3-a84d-184de0519090 - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host
Feb 23 09:38:51 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:51.525 280526 DEBUG oslo_concurrency.lockutils [None req-b8927e34-219e-43f3-a84d-184de0519090 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 09:38:51 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:51.525 280526 DEBUG oslo_concurrency.lockutils [None req-b8927e34-219e-43f3-a84d-184de0519090 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 09:38:51 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:51.526 280526 DEBUG oslo_concurrency.lockutils [None req-b8927e34-219e-43f3-a84d-184de0519090 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 09:38:51 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:51.526 280526 DEBUG nova.compute.resource_tracker [None req-b8927e34-219e-43f3-a84d-184de0519090 - - - - - -] Auditing locally available compute resources for np0005626463.localdomain (node: np0005626463.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 23 09:38:51 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:51.527 280526 DEBUG oslo_concurrency.processutils [None req-b8927e34-219e-43f3-a84d-184de0519090 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 09:38:51 np0005626463.localdomain python3.9[281051]: ansible-systemd Invoked with state=started name=edpm_nova_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 09:38:51 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:51.709 280526 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:38:51 np0005626463.localdomain sudo[281049]: pam_unix(sudo:session): session closed for user root
Feb 23 09:38:51 np0005626463.localdomain sshd[281091]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:38:52 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:52.027 280526 DEBUG oslo_concurrency.processutils [None req-b8927e34-219e-43f3-a84d-184de0519090 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.500s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 09:38:52 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:52.091 280526 DEBUG nova.virt.libvirt.driver [None req-b8927e34-219e-43f3-a84d-184de0519090 - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 23 09:38:52 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:52.091 280526 DEBUG nova.virt.libvirt.driver [None req-b8927e34-219e-43f3-a84d-184de0519090 - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 23 09:38:52 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:52.308 280526 WARNING nova.virt.libvirt.driver [None req-b8927e34-219e-43f3-a84d-184de0519090 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 23 09:38:52 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:52.310 280526 DEBUG nova.compute.resource_tracker [None req-b8927e34-219e-43f3-a84d-184de0519090 - - - - - -] Hypervisor/Node resource view: name=np0005626463.localdomain free_ram=12192MB free_disk=41.83688735961914GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": 
null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 23 09:38:52 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:52.310 280526 DEBUG oslo_concurrency.lockutils [None req-b8927e34-219e-43f3-a84d-184de0519090 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 09:38:52 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:52.311 280526 DEBUG oslo_concurrency.lockutils [None req-b8927e34-219e-43f3-a84d-184de0519090 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 09:38:52 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:52.448 280526 DEBUG nova.compute.resource_tracker [None req-b8927e34-219e-43f3-a84d-184de0519090 - - - - - -] Instance c2a7d92b-952f-46a7-8a6a-3322a48fcf4b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 23 09:38:52 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:52.448 280526 DEBUG nova.compute.resource_tracker [None req-b8927e34-219e-43f3-a84d-184de0519090 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 23 09:38:52 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:52.449 280526 DEBUG nova.compute.resource_tracker [None req-b8927e34-219e-43f3-a84d-184de0519090 - - - - - -] Final resource view: name=np0005626463.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 23 09:38:52 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:52.496 280526 DEBUG nova.scheduler.client.report [None req-b8927e34-219e-43f3-a84d-184de0519090 - - - - - -] Refreshing inventories for resource provider be63d86c-a403-4ec9-a515-07ea2962cb4d _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Feb 23 09:38:52 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:52.517 280526 DEBUG nova.scheduler.client.report [None req-b8927e34-219e-43f3-a84d-184de0519090 - - - - - -] Updating ProviderTree inventory for provider be63d86c-a403-4ec9-a515-07ea2962cb4d from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Feb 23 09:38:52 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:52.517 280526 DEBUG nova.compute.provider_tree [None req-b8927e34-219e-43f3-a84d-184de0519090 - - - - - -] Updating inventory in ProviderTree for provider be63d86c-a403-4ec9-a515-07ea2962cb4d with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Feb 23 09:38:52 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:52.535 280526 DEBUG nova.scheduler.client.report [None req-b8927e34-219e-43f3-a84d-184de0519090 - - - - - -] Refreshing aggregate associations for resource provider be63d86c-a403-4ec9-a515-07ea2962cb4d, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Feb 23 09:38:52 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:52.556 280526 DEBUG nova.scheduler.client.report [None req-b8927e34-219e-43f3-a84d-184de0519090 - - - - - -] Refreshing trait associations for resource provider be63d86c-a403-4ec9-a515-07ea2962cb4d, traits: HW_CPU_X86_SSE4A,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_ABM,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_F16C,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_AVX2,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_CLMUL,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_STORAGE_BUS_FDC,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_MMX,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_VIRTIO,HW_CPU_X86_SSE42,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_SSE41,COMPUTE_TRUSTED_CERTS,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_BMI,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_ACCELERATORS,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_AMD_SVM,HW_CPU_X86_SVM,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_FMA3,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_AESNI,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SHA,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SSSE3,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_USB,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_BMI2,HW_CPU_X86_AVX,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SSE2,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_VIF_MODEL_E1000E _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Feb 23 09:38:52 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:52.592 280526 DEBUG oslo_concurrency.processutils [None req-b8927e34-219e-43f3-a84d-184de0519090 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 09:38:53 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:53.053 280526 DEBUG oslo_concurrency.processutils [None req-b8927e34-219e-43f3-a84d-184de0519090 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.460s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 09:38:53 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:53.059 280526 DEBUG nova.virt.libvirt.host [None req-b8927e34-219e-43f3-a84d-184de0519090 - - - - - -] /sys/module/kvm_amd/parameters/sev contains [N
Feb 23 09:38:53 np0005626463.localdomain nova_compute[280512]: ] _kernel_supports_amd_sev /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1803
Feb 23 09:38:53 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:53.059 280526 INFO nova.virt.libvirt.host [None req-b8927e34-219e-43f3-a84d-184de0519090 - - - - - -] kernel doesn't support AMD SEV
Feb 23 09:38:53 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:53.061 280526 DEBUG nova.compute.provider_tree [None req-b8927e34-219e-43f3-a84d-184de0519090 - - - - - -] Inventory has not changed in ProviderTree for provider: be63d86c-a403-4ec9-a515-07ea2962cb4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 23 09:38:53 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:53.062 280526 DEBUG nova.virt.libvirt.driver [None req-b8927e34-219e-43f3-a84d-184de0519090 - - - - - -] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Feb 23 09:38:53 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:53.084 280526 DEBUG nova.scheduler.client.report [None req-b8927e34-219e-43f3-a84d-184de0519090 - - - - - -] Inventory has not changed for provider be63d86c-a403-4ec9-a515-07ea2962cb4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 23 09:38:53 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:53.111 280526 DEBUG nova.compute.resource_tracker [None req-b8927e34-219e-43f3-a84d-184de0519090 - - - - - -] Compute_service record updated for np0005626463.localdomain:np0005626463.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 23 09:38:53 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:53.112 280526 DEBUG oslo_concurrency.lockutils [None req-b8927e34-219e-43f3-a84d-184de0519090 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.801s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 09:38:53 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:53.112 280526 DEBUG nova.service [None req-b8927e34-219e-43f3-a84d-184de0519090 - - - - - -] Creating RPC server for service compute start /usr/lib/python3.9/site-packages/nova/service.py:182
Feb 23 09:38:53 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:53.139 280526 DEBUG nova.service [None req-b8927e34-219e-43f3-a84d-184de0519090 - - - - - -] Join ServiceGroup membership for this service compute start /usr/lib/python3.9/site-packages/nova/service.py:199
Feb 23 09:38:53 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:53.140 280526 DEBUG nova.servicegroup.drivers.db [None req-b8927e34-219e-43f3-a84d-184de0519090 - - - - - -] DB_Driver: join new ServiceGroup member np0005626463.localdomain to the compute group, service = <Service: host=np0005626463.localdomain, binary=nova-compute, manager_class_name=nova.compute.manager.ComputeManager> join /usr/lib/python3.9/site-packages/nova/servicegroup/drivers/db.py:44
Feb 23 09:38:53 np0005626463.localdomain python3.9[281207]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Feb 23 09:38:54 np0005626463.localdomain sshd[281091]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:38:54 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57420 DF PROTO=TCP SPT=47026 DPT=9102 SEQ=1491910498 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BFD120F0000000001030307) 
Feb 23 09:38:54 np0005626463.localdomain sudo[281315]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wvdjyazikhhmwvpecjnoomzjjrnmvcit ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839534.3600266-3435-280990999344300/AnsiballZ_stat.py
Feb 23 09:38:54 np0005626463.localdomain sudo[281315]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:38:54 np0005626463.localdomain python3.9[281317]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 09:38:54 np0005626463.localdomain sudo[281315]: pam_unix(sudo:session): session closed for user root
Feb 23 09:38:55 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57421 DF PROTO=TCP SPT=47026 DPT=9102 SEQ=1491910498 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BFD16060000000001030307) 
Feb 23 09:38:55 np0005626463.localdomain sudo[281405]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pcuswzuvoayknkjxkdjgygdmgvlcgmws ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839534.3600266-3435-280990999344300/AnsiballZ_copy.py
Feb 23 09:38:55 np0005626463.localdomain sudo[281405]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:38:55 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.
Feb 23 09:38:55 np0005626463.localdomain podman[281408]: 2026-02-23 09:38:55.260479466 +0000 UTC m=+0.087268795 container health_status da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 23 09:38:55 np0005626463.localdomain podman[281408]: 2026-02-23 09:38:55.274281465 +0000 UTC m=+0.101070784 container exec_died da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 23 09:38:55 np0005626463.localdomain systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Deactivated successfully.
Feb 23 09:38:55 np0005626463.localdomain python3.9[281407]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771839534.3600266-3435-280990999344300/.source.yaml _original_basename=.h_bgzpl6 follow=False checksum=4185f12b535f7417c8eab31aeeb8094a78600762 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:38:55 np0005626463.localdomain sudo[281405]: pam_unix(sudo:session): session closed for user root
Feb 23 09:38:55 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:55.522 280526 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.132 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'name': 'test', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000003', 'OS-EXT-SRV-ATTR:host': 'np0005626463.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '37b8098efb0d4ecc90b451a2db0e966f', 'user_id': 'cb6895487918456aa599ca2f76872d00', 'hostId': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.133 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Feb 23 09:38:56 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23723 DF PROTO=TCP SPT=54442 DPT=9102 SEQ=1737972057 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BFD1A060000000001030307) 
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.137 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.packets volume: 145 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.139 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'efddab9c-4c17-4c4f-ab69-f617ce278b85', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 145, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:38:56.133811', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '79027796-109b-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10976.323322544, 'message_signature': 'b0ab021e1ec2ae764e04f5b1bca6022742a3a3043f6f0e3d6efcba988dff7345'}]}, 'timestamp': '2026-02-23 09:38:56.137973', '_unique_id': 'e894ffb1be2f4e9ba8e42b66eed85c22'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.139 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.139 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.139 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.139 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.139 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.139 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.139 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.139 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.139 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.139 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.139 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.139 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.139 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.139 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.139 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.139 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.139 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.139 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.139 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.139 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.139 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.139 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.139 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.139 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.139 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.139 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.139 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.139 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.139 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.139 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.139 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.140 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.141 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.141 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.bytes volume: 9662 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.142 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '511ec253-929e-4ce5-904b-34c38c37a49f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9662, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:38:56.141189', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '79030c6a-109b-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10976.323322544, 'message_signature': '06e696cec07968443c4857ce6128b3f5a04b0bed4103b1c9a9abcb57601add4f'}]}, 'timestamp': '2026-02-23 09:38:56.141664', '_unique_id': '9f36e6ca9e494ff7be73a879419784ca'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.142 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.142 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.142 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.142 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.142 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.142 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.142 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.142 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.142 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.142 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.142 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.142 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.142 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.142 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.142 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.142 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.142 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.142 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.142 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.142 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.142 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.142 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.142 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.142 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.142 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.142 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.142 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.142 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.142 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.142 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.142 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.143 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.180 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.bytes volume: 29130240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.180 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.bytes volume: 4300800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.182 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '39f01e91-43c5-4201-9c4d-d46da94bf8f6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29130240, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:38:56.143906', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '790906f6-109b-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10976.333390578, 'message_signature': '1ccce793b329504a41ab8727b527d66253fb2b574fe5f601f501e992401ac678'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4300800, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:38:56.143906', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '79091c72-109b-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10976.333390578, 'message_signature': 'a75fb76b25d0a86c472073f6f45131f0b5292310728e052c0f1c4b44bcc70bae'}]}, 'timestamp': '2026-02-23 09:38:56.181377', '_unique_id': '4d5b2d7dee174fa7b919036c42ab0d75'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.182 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.182 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.182 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.182 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.182 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.182 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.182 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.182 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.182 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.182 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.182 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.182 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.182 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.182 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.182 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.182 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.182 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.182 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.182 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.182 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.182 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.182 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.182 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.182 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.182 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.182 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.182 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.182 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.182 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.182 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.182 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.183 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.184 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.185 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1fab113f-9e64-4927-b6e3-a5a3d737365d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:38:56.184153', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '79099b3e-109b-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10976.323322544, 'message_signature': '948efb18233c794d576f5092b1665ccfc4aa047bc8b01cfffb953334892d7424'}]}, 'timestamp': '2026-02-23 09:38:56.184641', '_unique_id': '8426793022e34c56983764bb19535bb4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.185 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.185 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.185 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.185 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.185 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.185 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.185 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.185 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.185 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.185 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.185 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.185 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.185 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.185 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.185 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.185 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.185 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.185 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.185 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.185 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.185 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.185 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.185 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.185 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.185 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.185 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.185 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.185 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.185 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.185 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.185 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.186 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.187 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.187 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.latency volume: 1234377028 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.187 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.latency volume: 170393160 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.189 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7854faaa-c32c-4e0c-aa87-e77b7857bafb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1234377028, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:38:56.187330', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '790a1712-109b-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10976.333390578, 'message_signature': 'af8e89a001952766683a83fc4089d4862a8d9a83869be6a0bede107a08813c6d'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 170393160, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:38:56.187330', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '790a2928-109b-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10976.333390578, 'message_signature': 'e55ef8a1302a6f9f24650bcc423e4677bcaf0300f9bbc79937c834a4e9f5046f'}]}, 'timestamp': '2026-02-23 09:38:56.188299', '_unique_id': 'a463e0e5eba145a7b9aa526ef461a043'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.189 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.189 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.189 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.189 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.189 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.189 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.189 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.189 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.189 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.189 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.189 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.189 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.189 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.189 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.189 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.189 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.189 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.189 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.189 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.189 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.189 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.189 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.189 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.189 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.189 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.189 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.189 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.189 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.189 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.189 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.189 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.190 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.190 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.190 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.requests volume: 577 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.191 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.193 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bd2bc3ca-2d42-4af9-a72f-af0bd76f1dc1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 577, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:38:56.190793', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '790aa006-109b-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10976.333390578, 'message_signature': '02711a052d2aaf8b14df220a3c54b4741987efdf009bbbfe51e73c30516b51f3'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 
'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:38:56.190793', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '790ab1e0-109b-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10976.333390578, 'message_signature': '40bdea69a80c696e4dfa96f09aaf886a823048681014f89a5e815ae598eeb622'}]}, 'timestamp': '2026-02-23 09:38:56.191757', '_unique_id': '4837a049601948918b5b5b887ca2ecfb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.193 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.193 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.193 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.193 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.193 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.193 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.193 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.193 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.193 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.193 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.193 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.193 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.193 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.193 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.193 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.193 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.193 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.193 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.193 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.193 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.193 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.193 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.193 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.193 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.193 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.193 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.193 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.193 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.193 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.193 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.193 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.195 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.208 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.208 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.210 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '74a0e0cf-a670-4a46-bd95-0365a98f4839', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:38:56.195399', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '790d40a4-109b-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10976.384902945, 'message_signature': 'cad3c6c13edd14ac17e3a6506c617f32fea3dbb36bb5c6a08ee80ecf75221976'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 
'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:38:56.195399', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '790d522e-109b-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10976.384902945, 'message_signature': 'adbe40316c205c6403f06f28a6eb6cb1e50c1d8711d8edc4e98ba27b8f5e6700'}]}, 'timestamp': '2026-02-23 09:38:56.208999', '_unique_id': '42399bf38b8648d29bd1b5d66a39345b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.210 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.210 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.210 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.210 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.210 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.210 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.210 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.210 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.210 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.210 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.210 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.210 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.210 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.210 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.210 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.210 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.210 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.210 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.210 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.210 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.210 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.210 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.210 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.210 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.210 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.210 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.210 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.210 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.210 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.210 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.210 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.211 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.231 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/cpu volume: 57690000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.233 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '695c3b76-de03-40d9-b246-f6c41322010a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 57690000000, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'timestamp': '2026-02-23T09:38:56.211345', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '7910d39a-109b-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10976.420805905, 'message_signature': 'c136ca2c6a36d7adef824c7f10e3988a9a8abb0a15bfa801dc8ae622e687d836'}]}, 'timestamp': '2026-02-23 09:38:56.232021', '_unique_id': '6ca4e688f18946b3affbe370d76f8262'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.233 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.233 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.233 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.233 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.233 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.233 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.233 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.233 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.233 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.233 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.233 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.233 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.233 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.233 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.233 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.233 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.233 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.233 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.233 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.233 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.233 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.233 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.233 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.233 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.233 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.233 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.233 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.233 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.233 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.233 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.233 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.234 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.235 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.238 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b1d7ff8a-a343-4a7a-be59-0b01ff48b764', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:38:56.235021', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '7911847a-109b-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10976.323322544, 'message_signature': 'e5c6ba4e0d5fb81403b1b56d7d9fcde550c420e5048103c40164807489d37bb1'}]}, 'timestamp': '2026-02-23 09:38:56.236610', '_unique_id': '90f957af34734f3ca30cbdff386f971f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.238 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.238 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.238 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.238 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.238 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.238 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.238 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.238 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.238 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.238 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.238 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.238 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.238 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.238 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.238 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.238 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.238 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.238 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.238 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.238 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.238 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.238 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.238 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.238 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.238 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.238 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.238 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.238 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.238 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.238 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.238 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.239 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.239 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.241 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '65d0621f-56ed-42b1-9db8-c6e71de1a74f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:38:56.239766', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '79121980-109b-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10976.323322544, 'message_signature': '0c272eff581eaf41d48e29784df155e6c9eb0c3fd18432f270591c6b57f2784a'}]}, 'timestamp': '2026-02-23 09:38:56.240308', '_unique_id': 'cf9ce7374e094c59b3f4a5dbc33ef5b7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.241 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.241 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.241 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.241 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.241 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.241 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.241 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.241 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.241 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.241 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.241 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.241 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.241 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.241 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.241 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.241 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.241 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.241 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.241 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.241 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.241 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.241 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.241 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.241 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.241 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.241 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.241 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.241 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.241 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.241 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.241 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.242 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.242 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.243 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.allocation volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.244 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2233a0c2-36b4-4d7c-b402-2687ec1b21d1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:38:56.242509', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '79128230-109b-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10976.384902945, 'message_signature': '5a4b9b91d704146817fb0c7b1676a86edf6611f8cd78ae747d607eca4b40e5a6'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 
'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:38:56.242509', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '7912954a-109b-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10976.384902945, 'message_signature': '2b60673eedf69fea8483cc521278372f3c586b6c1b68749b73e19639c607b3e7'}]}, 'timestamp': '2026-02-23 09:38:56.243442', '_unique_id': 'eaf746eaed5243dfb2f9bb6cb5e02503'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.244 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.244 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.244 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.244 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.244 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.244 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.244 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.244 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.244 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.244 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.244 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.244 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.244 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.244 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.244 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.244 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.244 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.244 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.244 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.244 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.244 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.244 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.244 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.244 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.244 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.244 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.244 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.244 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.244 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.244 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.244 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.245 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.245 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.latency volume: 260974500 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.246 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.latency volume: 24478467 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.247 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8689e10c-d450-4b10-94b8-c646d0993c5c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 260974500, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:38:56.245674', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '7912feb8-109b-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10976.333390578, 'message_signature': 'f8b49c146d1fd2c729a2bd95fd5fbeea0a7c29b56890f21dac8e9d484826264a'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 24478467, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 
'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:38:56.245674', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '79130f52-109b-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10976.333390578, 'message_signature': '94bb849f413b01337f2209bd83c27e86438c20308f07b256c2bd4fb16a0c76eb'}]}, 'timestamp': '2026-02-23 09:38:56.246560', '_unique_id': '9990de3a6744486a82e3db31f7c3b4c1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.247 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.247 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.247 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.247 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.247 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.247 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.247 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.247 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.247 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.247 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.247 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.247 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.247 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.247 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.247 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.247 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.247 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.247 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.247 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.247 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.247 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.247 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.247 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.247 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.247 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.247 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.247 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.247 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.247 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.247 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.247 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.248 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.248 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.bytes volume: 12784 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.250 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5484fe47-9f26-4214-a341-9005e0d5158e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 12784, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:38:56.248784', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '7913788e-109b-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10976.323322544, 'message_signature': '7becb80498e0b68928d1cf0cafa8f5af23c555f59e18ccadc7914f2fe13a6859'}]}, 'timestamp': '2026-02-23 09:38:56.249284', '_unique_id': 'd38b7029a3144e34937cd9bf5572dc58'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.250 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.250 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.250 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.250 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.250 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.250 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.250 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.250 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.250 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.250 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.250 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.250 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.250 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.250 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.250 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.250 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.250 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.250 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.250 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.250 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.250 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.250 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.250 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.250 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.250 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.250 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.250 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.250 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.250 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.250 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.250 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.251 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.251 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.251 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.packets volume: 92 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.253 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e30d0d94-a1a8-44a7-be3f-7c32a7158d7a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 92, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:38:56.251592', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '7913e486-109b-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10976.323322544, 'message_signature': '4f093aaa89a051055c1b9a6b4e557a334db0ddb1c5dd6251acd3e93682bb3ee1'}]}, 'timestamp': '2026-02-23 09:38:56.252080', '_unique_id': '1d8c888f00c34cf596fe80d8fdee4293'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.253 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.253 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.253 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.253 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.253 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.253 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.253 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.253 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.253 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.253 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.253 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.253 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.253 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.253 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.253 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.253 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.253 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.253 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.253 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.253 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.253 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.253 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.253 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.253 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.253 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.253 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.253 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.253 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.253 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.253 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.253 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.254 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.254 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.254 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.256 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e9508448-c634-49a6-94f5-70589cf47ac2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:38:56.254221', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '79144b56-109b-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10976.384902945, 'message_signature': '1b60d3f9b78f7701330298ac19b6dd51461514c9b6e26de11b0444eceb5b148a'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:38:56.254221', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '79145d4e-109b-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10976.384902945, 'message_signature': '1a8bd01dcc55e9e2f2ce3c5c7b4801d870a47b83c8a5dae028be78fb65cfba3b'}]}, 'timestamp': '2026-02-23 09:38:56.255116', '_unique_id': '9e1ff4cdcb00447e97499b5cdfa9a529'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.256 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.256 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.256 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.256 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.256 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.256 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.256 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.256 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.256 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.256 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.256 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.256 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.256 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.256 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.256 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.256 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.256 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.256 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.256 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.256 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.256 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.256 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.256 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.256 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.256 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.256 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.256 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.256 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.256 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.256 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.256 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.257 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.257 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.requests volume: 1064 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.257 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.requests volume: 222 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.259 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '752d258c-5281-4ae1-98c0-e96dffda710e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1064, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:38:56.257296', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '7914c388-109b-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10976.333390578, 'message_signature': '3068665d16d6e88f332638fc2b4b18d356dc75d809d4b05b1780cc1fdd792c7b'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 222, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:38:56.257296', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '7914d512-109b-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10976.333390578, 'message_signature': 'd54e0e8ab4fc501daf60de18fad0c8e998dda6d5f27996051595f54472427a97'}]}, 'timestamp': '2026-02-23 09:38:56.258236', '_unique_id': '53ad012b92af4fc2ac818ba34506a439'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.259 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.259 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.259 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.259 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.259 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.259 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.259 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.259 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.259 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.259 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.259 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.259 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.259 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.259 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.259 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.259 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.259 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.259 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.259 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.259 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.259 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.259 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.259 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.259 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.259 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.259 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.259 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.259 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.259 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.259 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.259 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.260 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.260 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/memory.usage volume: 52.38671875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.261 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c51393ce-7992-42fb-9fd3-6a007e64a3d6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 52.38671875, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'timestamp': '2026-02-23T09:38:56.260447', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '79153ebc-109b-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10976.420805905, 'message_signature': '9f4879bccf9264921efee80beb7361a94e05365865bcd2275fc25029cf625cba'}]}, 'timestamp': '2026-02-23 09:38:56.260928', '_unique_id': 'f599fd60021947568f57e6987525903b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.261 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.261 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.261 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.261 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.261 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.261 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.261 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.261 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.261 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.261 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.261 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.261 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.261 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.261 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.261 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.261 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.261 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.261 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.261 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.261 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.261 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.261 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.261 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.261 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.261 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.261 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.261 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.261 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.261 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.261 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.261 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.262 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.263 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.bytes volume: 74063872 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.263 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.264 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3b987e48-1684-44d2-b72c-e418c1da0899', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 74063872, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:38:56.263081', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '7915a56e-109b-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10976.333390578, 'message_signature': 'bb970f494c9a2a7a31bc793ba8906e76f291ceef48fa5a6e969710b3f6096526'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 
'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:38:56.263081', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '7915b5c2-109b-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10976.333390578, 'message_signature': '7edb54985b6773a651d6cc82831ff7cb504cc7b9cbe5429c75abe3f0de005198'}]}, 'timestamp': '2026-02-23 09:38:56.263959', '_unique_id': 'b2421ff1a75e4af0b367812bc800200f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.264 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.264 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.264 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.264 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.264 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.264 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.264 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.264 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.264 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.264 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.264 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.264 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.264 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.264 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.264 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.264 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.264 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.264 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.264 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.264 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.264 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.264 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.264 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.264 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.264 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.264 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.264 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.264 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.264 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.264 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.264 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.266 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.266 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.267 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '97cdaff5-a1bb-4cd5-a433-867f9037cc4d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:38:56.266142', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '79161d32-109b-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10976.323322544, 'message_signature': '5bb0ed5ffa5be6b83519ce0d4966e2fc40a329dd35939af26025fb6369a4e7ba'}]}, 'timestamp': '2026-02-23 09:38:56.266603', '_unique_id': '806a0785a80c40749c1f20b8b4f4e3dc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.267 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.267 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.267 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.267 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.267 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.267 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.267 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.267 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.267 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.267 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.267 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.267 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.267 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.267 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.267 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.267 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.267 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.267 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.267 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.267 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.267 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.267 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.267 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.267 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.267 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.267 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.267 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.267 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.267 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.267 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.267 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.268 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.268 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.270 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd2f43810-d966-4c22-b6c3-5cd450d38bc1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:38:56.268766', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '791684d4-109b-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10976.323322544, 'message_signature': 'ceefa987dfa5af62cfe815cb3803c9daa1846de97fd73d339262879aa00ad697'}]}, 'timestamp': '2026-02-23 09:38:56.269260', '_unique_id': '0201d35d6b234b4fa09920c54f77e86c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.270 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.270 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.270 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.270 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.270 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.270 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.270 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.270 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.270 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.270 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.270 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.270 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.270 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.270 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.270 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.270 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.270 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.270 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.270 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.270 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.270 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.270 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.270 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.270 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.270 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.270 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.270 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.270 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.270 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.270 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.270 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.271 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.271 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.272 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'db0b8c95-3d98-473c-9966-332d09277754', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:38:56.271466', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '7916ed3e-109b-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10976.323322544, 'message_signature': 'd45b069e669a9a710c9a53f778c68b0f9bb35ece266d55b9373dfeff239ce2df'}]}, 'timestamp': '2026-02-23 09:38:56.271965', '_unique_id': '99f6959aa9924bf6899ac337dbd9e522'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.272 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.272 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.272 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.272 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.272 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.272 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.272 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.272 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.272 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.272 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.272 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.272 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.272 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.272 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.272 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.272 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.272 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.272 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.272 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.272 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.272 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.272 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.272 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.272 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.272 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.272 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.272 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.272 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.272 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.272 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:38:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.272 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:38:56 np0005626463.localdomain python3.9[281537]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner_healthcheck.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 23 09:38:56 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:38:56.734 280526 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:38:57 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57422 DF PROTO=TCP SPT=47026 DPT=9102 SEQ=1491910498 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BFD1E070000000001030307) 
Feb 23 09:38:57 np0005626463.localdomain python3.9[281645]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 23 09:38:58 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64124 DF PROTO=TCP SPT=56114 DPT=9102 SEQ=276046667 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BFD22060000000001030307) 
Feb 23 09:38:58 np0005626463.localdomain python3.9[281753]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service.requires follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 23 09:39:00 np0005626463.localdomain sudo[281861]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yghcllptrbcnmmfioqgijozocsebnuxy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839539.8819938-3585-223350105172929/AnsiballZ_podman_container.py
Feb 23 09:39:00 np0005626463.localdomain sudo[281861]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:39:00 np0005626463.localdomain python3.9[281863]: ansible-containers.podman.podman_container Invoked with name=nova_nvme_cleaner state=absent executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None 
preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Feb 23 09:39:00 np0005626463.localdomain sudo[281861]: pam_unix(sudo:session): session closed for user root
Feb 23 09:39:00 np0005626463.localdomain systemd-journald[47710]: Field hash table of /run/log/journal/c0212a8b024a111cfc61293864f36c87/system.journal has a fill level at 120.4 (401 of 333 items), suggesting rotation.
Feb 23 09:39:00 np0005626463.localdomain systemd-journald[47710]: /run/log/journal/c0212a8b024a111cfc61293864f36c87/system.journal: Journal header limits reached or header out-of-date, rotating.
Feb 23 09:39:00 np0005626463.localdomain rsyslogd[758]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Feb 23 09:39:00 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:39:00.566 280526 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:39:00 np0005626463.localdomain rsyslogd[758]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Feb 23 09:39:01 np0005626463.localdomain sudo[281996]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zwhicqbwhljksclzqhhifcuxnndeiuql ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839540.7703652-3609-24016980640401/AnsiballZ_systemd.py
Feb 23 09:39:01 np0005626463.localdomain sudo[281996]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:39:01 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57423 DF PROTO=TCP SPT=47026 DPT=9102 SEQ=1491910498 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BFD2DC60000000001030307) 
Feb 23 09:39:01 np0005626463.localdomain python3.9[281998]: ansible-ansible.builtin.systemd Invoked with name=edpm_nova_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 23 09:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb 23 09:39:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 7200.1 total, 600.0 interval
                                                          Cumulative writes: 5152 writes, 23K keys, 5152 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s
                                                          Cumulative WAL: 5152 writes, 679 syncs, 7.59 writes per sync, written: 0.02 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                          Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Feb 23 09:39:01 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:39:01.787 280526 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:39:02 np0005626463.localdomain systemd[1]: Stopping nova_compute container...
Feb 23 09:39:02 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:39:02.492 280526 DEBUG oslo_privsep.comm [-] EOF on privsep read channel _reader_main /usr/lib/python3.9/site-packages/oslo_privsep/comm.py:170
Feb 23 09:39:03 np0005626463.localdomain sudo[282016]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 09:39:03 np0005626463.localdomain sudo[282016]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:39:03 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.
Feb 23 09:39:03 np0005626463.localdomain sudo[282016]: pam_unix(sudo:session): session closed for user root
Feb 23 09:39:03 np0005626463.localdomain sudo[282035]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 23 09:39:03 np0005626463.localdomain sudo[282035]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:39:03 np0005626463.localdomain podman[282034]: 2026-02-23 09:39:03.715652203 +0000 UTC m=+0.088817382 container health_status 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., name=ubi9/ubi-minimal, release=1770267347, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, container_name=openstack_network_exporter, io.openshift.expose-services=, build-date=2026-02-05T04:57:10Z, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, org.opencontainers.image.created=2026-02-05T04:57:10Z, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, architecture=x86_64, version=9.7)
Feb 23 09:39:03 np0005626463.localdomain podman[282034]: 2026-02-23 09:39:03.7290309 +0000 UTC m=+0.102196089 container exec_died 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, io.openshift.expose-services=, architecture=x86_64, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, build-date=2026-02-05T04:57:10Z, version=9.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9/ubi-minimal, com.redhat.component=ubi9-minimal-container, org.opencontainers.image.created=2026-02-05T04:57:10Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, config_id=openstack_network_exporter, maintainer=Red Hat, Inc., release=1770267347)
Feb 23 09:39:03 np0005626463.localdomain systemd[1]: 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.service: Deactivated successfully.
Feb 23 09:39:04 np0005626463.localdomain sudo[282035]: pam_unix(sudo:session): session closed for user root
Feb 23 09:39:04 np0005626463.localdomain sudo[282105]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 09:39:04 np0005626463.localdomain sudo[282105]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:39:04 np0005626463.localdomain sudo[282105]: pam_unix(sudo:session): session closed for user root
Feb 23 09:39:04 np0005626463.localdomain sudo[282123]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 list-networks
Feb 23 09:39:04 np0005626463.localdomain sudo[282123]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:39:05 np0005626463.localdomain sudo[282123]: pam_unix(sudo:session): session closed for user root
Feb 23 09:39:05 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:39:05.603 280526 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb 23 09:39:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 7200.1 total, 600.0 interval
                                                          Cumulative writes: 5421 writes, 24K keys, 5421 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s
                                                          Cumulative WAL: 5421 writes, 705 syncs, 7.69 writes per sync, written: 0.02 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                          Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Feb 23 09:39:06 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:39:06.371 280526 WARNING amqp [-] Received method (60, 30) during closing channel 1. This method will be ignored
Feb 23 09:39:06 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:39:06.373 280526 DEBUG oslo_concurrency.lockutils [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 23 09:39:06 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:39:06.374 280526 DEBUG oslo_concurrency.lockutils [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 23 09:39:06 np0005626463.localdomain nova_compute[280512]: 2026-02-23 09:39:06.374 280526 DEBUG oslo_concurrency.lockutils [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 23 09:39:06 np0005626463.localdomain virtqemud[207530]: End of file while reading data: Input/output error
Feb 23 09:39:06 np0005626463.localdomain systemd[1]: libpod-2129e353a5d4171f586ce5c762891941b44f7611471f84d84a42a19d9df53b3b.scope: Deactivated successfully.
Feb 23 09:39:06 np0005626463.localdomain systemd[1]: libpod-2129e353a5d4171f586ce5c762891941b44f7611471f84d84a42a19d9df53b3b.scope: Consumed 4.815s CPU time.
Feb 23 09:39:06 np0005626463.localdomain podman[282002]: 2026-02-23 09:39:06.801732229 +0000 UTC m=+4.378312432 container died 2129e353a5d4171f586ce5c762891941b44f7611471f84d84a42a19d9df53b3b (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, config_id=nova_compute, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=nova_compute, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-9420ebc64adcbe31526e40e2b0b78b2e1b7d41ddf41dd4e3243f19db7c97ac09-3424c3acd670eb930ca235b189e22cc85d2902412aa65355a03bde52550f3369'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'nova', 'volumes': ['/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, maintainer=OpenStack Kubernetes Operator team)
Feb 23 09:39:06 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-2129e353a5d4171f586ce5c762891941b44f7611471f84d84a42a19d9df53b3b-userdata-shm.mount: Deactivated successfully.
Feb 23 09:39:06 np0005626463.localdomain podman[282002]: 2026-02-23 09:39:06.872506068 +0000 UTC m=+4.449086281 container cleanup 2129e353a5d4171f586ce5c762891941b44f7611471f84d84a42a19d9df53b3b (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, config_id=nova_compute, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=nova_compute, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-9420ebc64adcbe31526e40e2b0b78b2e1b7d41ddf41dd4e3243f19db7c97ac09-3424c3acd670eb930ca235b189e22cc85d2902412aa65355a03bde52550f3369'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'nova', 'volumes': ['/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb)
Feb 23 09:39:06 np0005626463.localdomain podman[282002]: nova_compute
Feb 23 09:39:06 np0005626463.localdomain podman[282160]: 2026-02-23 09:39:06.886616038 +0000 UTC m=+0.078509050 container cleanup 2129e353a5d4171f586ce5c762891941b44f7611471f84d84a42a19d9df53b3b (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=nova_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-9420ebc64adcbe31526e40e2b0b78b2e1b7d41ddf41dd4e3243f19db7c97ac09-3424c3acd670eb930ca235b189e22cc85d2902412aa65355a03bde52550f3369'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'nova', 'volumes': ['/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, container_name=nova_compute, org.label-schema.license=GPLv2, tcib_managed=true)
Feb 23 09:39:06 np0005626463.localdomain systemd[1]: libpod-conmon-2129e353a5d4171f586ce5c762891941b44f7611471f84d84a42a19d9df53b3b.scope: Deactivated successfully.
Feb 23 09:39:06 np0005626463.localdomain podman[282188]: error opening file `/run/crun/2129e353a5d4171f586ce5c762891941b44f7611471f84d84a42a19d9df53b3b/status`: No such file or directory
Feb 23 09:39:06 np0005626463.localdomain podman[282177]: 2026-02-23 09:39:06.988815156 +0000 UTC m=+0.067323181 container cleanup 2129e353a5d4171f586ce5c762891941b44f7611471f84d84a42a19d9df53b3b (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-9420ebc64adcbe31526e40e2b0b78b2e1b7d41ddf41dd4e3243f19db7c97ac09-3424c3acd670eb930ca235b189e22cc85d2902412aa65355a03bde52550f3369'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'nova', 'volumes': ['/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, container_name=nova_compute, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=nova_compute, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Feb 23 09:39:06 np0005626463.localdomain podman[282177]: nova_compute
Feb 23 09:39:06 np0005626463.localdomain systemd[1]: edpm_nova_compute.service: Deactivated successfully.
Feb 23 09:39:06 np0005626463.localdomain systemd[1]: Stopped nova_compute container.
Feb 23 09:39:07 np0005626463.localdomain systemd[1]: Starting nova_compute container...
Feb 23 09:39:07 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 09:39:07 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1f24e11ebd9f7a5cf882dc695a3004cbcd963f9ee45c416d30f0f099030bac26/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Feb 23 09:39:07 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1f24e11ebd9f7a5cf882dc695a3004cbcd963f9ee45c416d30f0f099030bac26/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Feb 23 09:39:07 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1f24e11ebd9f7a5cf882dc695a3004cbcd963f9ee45c416d30f0f099030bac26/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Feb 23 09:39:07 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1f24e11ebd9f7a5cf882dc695a3004cbcd963f9ee45c416d30f0f099030bac26/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Feb 23 09:39:07 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1f24e11ebd9f7a5cf882dc695a3004cbcd963f9ee45c416d30f0f099030bac26/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Feb 23 09:39:07 np0005626463.localdomain podman[282192]: 2026-02-23 09:39:07.148760186 +0000 UTC m=+0.119400696 container init 2129e353a5d4171f586ce5c762891941b44f7611471f84d84a42a19d9df53b3b (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-9420ebc64adcbe31526e40e2b0b78b2e1b7d41ddf41dd4e3243f19db7c97ac09-3424c3acd670eb930ca235b189e22cc85d2902412aa65355a03bde52550f3369'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'nova', 'volumes': ['/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, container_name=nova_compute, maintainer=OpenStack Kubernetes Operator team, config_id=nova_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb)
Feb 23 09:39:07 np0005626463.localdomain podman[282192]: 2026-02-23 09:39:07.157492218 +0000 UTC m=+0.128132728 container start 2129e353a5d4171f586ce5c762891941b44f7611471f84d84a42a19d9df53b3b (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-9420ebc64adcbe31526e40e2b0b78b2e1b7d41ddf41dd4e3243f19db7c97ac09-3424c3acd670eb930ca235b189e22cc85d2902412aa65355a03bde52550f3369'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'nova', 'volumes': ['/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, container_name=nova_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=nova_compute, org.label-schema.license=GPLv2, tcib_managed=true)
Feb 23 09:39:07 np0005626463.localdomain podman[282192]: nova_compute
Feb 23 09:39:07 np0005626463.localdomain nova_compute[282206]: + sudo -E kolla_set_configs
Feb 23 09:39:07 np0005626463.localdomain systemd[1]: Started nova_compute container.
Feb 23 09:39:07 np0005626463.localdomain sudo[281996]: pam_unix(sudo:session): session closed for user root
Feb 23 09:39:07 np0005626463.localdomain nova_compute[282206]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Feb 23 09:39:07 np0005626463.localdomain nova_compute[282206]: INFO:__main__:Validating config file
Feb 23 09:39:07 np0005626463.localdomain nova_compute[282206]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Feb 23 09:39:07 np0005626463.localdomain nova_compute[282206]: INFO:__main__:Copying service configuration files
Feb 23 09:39:07 np0005626463.localdomain nova_compute[282206]: INFO:__main__:Deleting /etc/nova/nova.conf
Feb 23 09:39:07 np0005626463.localdomain nova_compute[282206]: INFO:__main__:Copying /var/lib/kolla/config_files/src/nova-blank.conf to /etc/nova/nova.conf
Feb 23 09:39:07 np0005626463.localdomain nova_compute[282206]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Feb 23 09:39:07 np0005626463.localdomain nova_compute[282206]: INFO:__main__:Deleting /etc/nova/nova.conf.d/01-nova.conf
Feb 23 09:39:07 np0005626463.localdomain nova_compute[282206]: INFO:__main__:Copying /var/lib/kolla/config_files/src/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Feb 23 09:39:07 np0005626463.localdomain nova_compute[282206]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Feb 23 09:39:07 np0005626463.localdomain nova_compute[282206]: INFO:__main__:Deleting /etc/nova/nova.conf.d/03-ceph-nova.conf
Feb 23 09:39:07 np0005626463.localdomain nova_compute[282206]: INFO:__main__:Copying /var/lib/kolla/config_files/src/03-ceph-nova.conf to /etc/nova/nova.conf.d/03-ceph-nova.conf
Feb 23 09:39:07 np0005626463.localdomain nova_compute[282206]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/03-ceph-nova.conf
Feb 23 09:39:07 np0005626463.localdomain nova_compute[282206]: INFO:__main__:Deleting /etc/nova/nova.conf.d/99-nova-compute-cells-workarounds.conf
Feb 23 09:39:07 np0005626463.localdomain nova_compute[282206]: INFO:__main__:Copying /var/lib/kolla/config_files/src/99-nova-compute-cells-workarounds.conf to /etc/nova/nova.conf.d/99-nova-compute-cells-workarounds.conf
Feb 23 09:39:07 np0005626463.localdomain nova_compute[282206]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/99-nova-compute-cells-workarounds.conf
Feb 23 09:39:07 np0005626463.localdomain nova_compute[282206]: INFO:__main__:Deleting /etc/nova/nova.conf.d/nova-blank.conf
Feb 23 09:39:07 np0005626463.localdomain nova_compute[282206]: INFO:__main__:Copying /var/lib/kolla/config_files/src/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Feb 23 09:39:07 np0005626463.localdomain nova_compute[282206]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Feb 23 09:39:07 np0005626463.localdomain nova_compute[282206]: INFO:__main__:Deleting /etc/nova/nova.conf.d/02-nova-host-specific.conf
Feb 23 09:39:07 np0005626463.localdomain nova_compute[282206]: INFO:__main__:Copying /var/lib/kolla/config_files/src/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Feb 23 09:39:07 np0005626463.localdomain nova_compute[282206]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Feb 23 09:39:07 np0005626463.localdomain nova_compute[282206]: INFO:__main__:Deleting /etc/ceph
Feb 23 09:39:07 np0005626463.localdomain nova_compute[282206]: INFO:__main__:Creating directory /etc/ceph
Feb 23 09:39:07 np0005626463.localdomain nova_compute[282206]: INFO:__main__:Setting permission for /etc/ceph
Feb 23 09:39:07 np0005626463.localdomain nova_compute[282206]: INFO:__main__:Copying /var/lib/kolla/config_files/src/ceph/ceph.conf to /etc/ceph/ceph.conf
Feb 23 09:39:07 np0005626463.localdomain nova_compute[282206]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Feb 23 09:39:07 np0005626463.localdomain nova_compute[282206]: INFO:__main__:Copying /var/lib/kolla/config_files/src/ceph/ceph.client.openstack.keyring to /etc/ceph/ceph.client.openstack.keyring
Feb 23 09:39:07 np0005626463.localdomain nova_compute[282206]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Feb 23 09:39:07 np0005626463.localdomain nova_compute[282206]: INFO:__main__:Deleting /var/lib/nova/.ssh/ssh-privatekey
Feb 23 09:39:07 np0005626463.localdomain nova_compute[282206]: INFO:__main__:Copying /var/lib/kolla/config_files/src/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Feb 23 09:39:07 np0005626463.localdomain nova_compute[282206]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Feb 23 09:39:07 np0005626463.localdomain nova_compute[282206]: INFO:__main__:Deleting /var/lib/nova/.ssh/config
Feb 23 09:39:07 np0005626463.localdomain nova_compute[282206]: INFO:__main__:Copying /var/lib/kolla/config_files/src/ssh-config to /var/lib/nova/.ssh/config
Feb 23 09:39:07 np0005626463.localdomain nova_compute[282206]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Feb 23 09:39:07 np0005626463.localdomain nova_compute[282206]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Feb 23 09:39:07 np0005626463.localdomain nova_compute[282206]: INFO:__main__:Copying /var/lib/kolla/config_files/src/run-on-host to /usr/sbin/iscsiadm
Feb 23 09:39:07 np0005626463.localdomain nova_compute[282206]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Feb 23 09:39:07 np0005626463.localdomain nova_compute[282206]: INFO:__main__:Writing out command to execute
Feb 23 09:39:07 np0005626463.localdomain nova_compute[282206]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Feb 23 09:39:07 np0005626463.localdomain nova_compute[282206]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Feb 23 09:39:07 np0005626463.localdomain nova_compute[282206]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Feb 23 09:39:07 np0005626463.localdomain nova_compute[282206]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Feb 23 09:39:07 np0005626463.localdomain nova_compute[282206]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Feb 23 09:39:07 np0005626463.localdomain nova_compute[282206]: ++ cat /run_command
Feb 23 09:39:07 np0005626463.localdomain nova_compute[282206]: + CMD=nova-compute
Feb 23 09:39:07 np0005626463.localdomain nova_compute[282206]: + ARGS=
Feb 23 09:39:07 np0005626463.localdomain nova_compute[282206]: + sudo kolla_copy_cacerts
Feb 23 09:39:07 np0005626463.localdomain nova_compute[282206]: + [[ ! -n '' ]]
Feb 23 09:39:07 np0005626463.localdomain nova_compute[282206]: + . kolla_extend_start
Feb 23 09:39:07 np0005626463.localdomain nova_compute[282206]: Running command: 'nova-compute'
Feb 23 09:39:07 np0005626463.localdomain nova_compute[282206]: + echo 'Running command: '\''nova-compute'\'''
Feb 23 09:39:07 np0005626463.localdomain nova_compute[282206]: + umask 0022
Feb 23 09:39:07 np0005626463.localdomain nova_compute[282206]: + exec nova-compute
Feb 23 09:39:07 np0005626463.localdomain sudo[282326]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-izylksfcmojllmbuimaxllxvbxrufils ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771839547.4223402-3636-192869072712863/AnsiballZ_podman_container.py
Feb 23 09:39:07 np0005626463.localdomain sudo[282326]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 23 09:39:07 np0005626463.localdomain python3.9[282328]: ansible-containers.podman.podman_container Invoked with name=nova_compute_init state=started executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Feb 23 09:39:08 np0005626463.localdomain systemd[1]: Started libpod-conmon-0dcfdba09d286eb3abcbc0c0350212b6da0b7ea5d4e3ec4a127be329988c054e.scope.
Feb 23 09:39:08 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 09:39:08 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a57f867c5b8445456b5e18d0ba5d2cc6c62032606bf7ac5da73f5ad9f9752ac4/merged/usr/sbin/nova_statedir_ownership.py supports timestamps until 2038 (0x7fffffff)
Feb 23 09:39:08 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a57f867c5b8445456b5e18d0ba5d2cc6c62032606bf7ac5da73f5ad9f9752ac4/merged/var/lib/_nova_secontext supports timestamps until 2038 (0x7fffffff)
Feb 23 09:39:08 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a57f867c5b8445456b5e18d0ba5d2cc6c62032606bf7ac5da73f5ad9f9752ac4/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Feb 23 09:39:08 np0005626463.localdomain podman[282353]: 2026-02-23 09:39:08.227142658 +0000 UTC m=+0.144002243 container init 0dcfdba09d286eb3abcbc0c0350212b6da0b7ea5d4e3ec4a127be329988c054e (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, config_id=nova_compute_init, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False, 'EDPM_CONFIG_HASH': '3424c3acd670eb930ca235b189e22cc85d2902412aa65355a03bde52550f3369'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'none', 'privileged': False, 'restart': 'never', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, container_name=nova_compute_init, io.buildah.version=1.43.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 23 09:39:08 np0005626463.localdomain podman[282353]: 2026-02-23 09:39:08.238835693 +0000 UTC m=+0.155695278 container start 0dcfdba09d286eb3abcbc0c0350212b6da0b7ea5d4e3ec4a127be329988c054e (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.build-date=20260216, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.license=GPLv2, config_id=nova_compute_init, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=nova_compute_init, io.buildah.version=1.43.0, tcib_managed=true, config_data={'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False, 'EDPM_CONFIG_HASH': '3424c3acd670eb930ca235b189e22cc85d2902412aa65355a03bde52550f3369'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'none', 'privileged': False, 'restart': 'never', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 23 09:39:08 np0005626463.localdomain python3.9[282328]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: podman start nova_compute_init
Feb 23 09:39:08 np0005626463.localdomain sudo[282368]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 09:39:08 np0005626463.localdomain sudo[282368]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:39:08 np0005626463.localdomain sudo[282368]: pam_unix(sudo:session): session closed for user root
Feb 23 09:39:08 np0005626463.localdomain nova_compute_init[282391]: INFO:nova_statedir:Applying nova statedir ownership
Feb 23 09:39:08 np0005626463.localdomain nova_compute_init[282391]: INFO:nova_statedir:Target ownership for /var/lib/nova: 42436:42436
Feb 23 09:39:08 np0005626463.localdomain nova_compute_init[282391]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/
Feb 23 09:39:08 np0005626463.localdomain nova_compute_init[282391]: INFO:nova_statedir:Changing ownership of /var/lib/nova from 1000:1000 to 42436:42436
Feb 23 09:39:08 np0005626463.localdomain nova_compute_init[282391]: INFO:nova_statedir:Setting selinux context of /var/lib/nova to system_u:object_r:container_file_t:s0
Feb 23 09:39:08 np0005626463.localdomain nova_compute_init[282391]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/instances/
Feb 23 09:39:08 np0005626463.localdomain nova_compute_init[282391]: INFO:nova_statedir:Changing ownership of /var/lib/nova/instances from 1000:1000 to 42436:42436
Feb 23 09:39:08 np0005626463.localdomain nova_compute_init[282391]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances to system_u:object_r:container_file_t:s0
Feb 23 09:39:08 np0005626463.localdomain nova_compute_init[282391]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/
Feb 23 09:39:08 np0005626463.localdomain nova_compute_init[282391]: INFO:nova_statedir:Ownership of /var/lib/nova/instances/c2a7d92b-952f-46a7-8a6a-3322a48fcf4b already 42436:42436
Feb 23 09:39:08 np0005626463.localdomain nova_compute_init[282391]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances/c2a7d92b-952f-46a7-8a6a-3322a48fcf4b to system_u:object_r:container_file_t:s0
Feb 23 09:39:08 np0005626463.localdomain nova_compute_init[282391]: INFO:nova_statedir:Checking uid: 0 gid: 0 path: /var/lib/nova/instances/c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/console.log
Feb 23 09:39:08 np0005626463.localdomain nova_compute_init[282391]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/_base/
Feb 23 09:39:08 np0005626463.localdomain nova_compute_init[282391]: INFO:nova_statedir:Ownership of /var/lib/nova/instances/_base already 42436:42436
Feb 23 09:39:08 np0005626463.localdomain nova_compute_init[282391]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances/_base to system_u:object_r:container_file_t:s0
Feb 23 09:39:08 np0005626463.localdomain nova_compute_init[282391]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/_base/b81db1e2a8e54083d8c4b030cc59287a706969ae
Feb 23 09:39:08 np0005626463.localdomain nova_compute_init[282391]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/_base/ephemeral_1_0706d66
Feb 23 09:39:08 np0005626463.localdomain nova_compute_init[282391]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/locks/
Feb 23 09:39:08 np0005626463.localdomain nova_compute_init[282391]: INFO:nova_statedir:Ownership of /var/lib/nova/instances/locks already 42436:42436
Feb 23 09:39:08 np0005626463.localdomain nova_compute_init[282391]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances/locks to system_u:object_r:container_file_t:s0
Feb 23 09:39:08 np0005626463.localdomain nova_compute_init[282391]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/locks/nova-b81db1e2a8e54083d8c4b030cc59287a706969ae
Feb 23 09:39:08 np0005626463.localdomain nova_compute_init[282391]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/locks/nova-ephemeral_1_0706d66
Feb 23 09:39:08 np0005626463.localdomain nova_compute_init[282391]: INFO:nova_statedir:Checking uid: 0 gid: 0 path: /var/lib/nova/delay-nova-compute
Feb 23 09:39:08 np0005626463.localdomain nova_compute_init[282391]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/
Feb 23 09:39:08 np0005626463.localdomain nova_compute_init[282391]: INFO:nova_statedir:Ownership of /var/lib/nova/.ssh already 42436:42436
Feb 23 09:39:08 np0005626463.localdomain nova_compute_init[282391]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.ssh to system_u:object_r:container_file_t:s0
Feb 23 09:39:08 np0005626463.localdomain nova_compute_init[282391]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/ssh-privatekey
Feb 23 09:39:08 np0005626463.localdomain nova_compute_init[282391]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/config
Feb 23 09:39:08 np0005626463.localdomain nova_compute_init[282391]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.cache/
Feb 23 09:39:08 np0005626463.localdomain nova_compute_init[282391]: INFO:nova_statedir:Ownership of /var/lib/nova/.cache already 42436:42436
Feb 23 09:39:08 np0005626463.localdomain nova_compute_init[282391]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.cache to system_u:object_r:container_file_t:s0
Feb 23 09:39:08 np0005626463.localdomain nova_compute_init[282391]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.cache/python-entrypoints/
Feb 23 09:39:08 np0005626463.localdomain nova_compute_init[282391]: INFO:nova_statedir:Ownership of /var/lib/nova/.cache/python-entrypoints already 42436:42436
Feb 23 09:39:08 np0005626463.localdomain nova_compute_init[282391]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.cache/python-entrypoints to system_u:object_r:container_file_t:s0
Feb 23 09:39:08 np0005626463.localdomain nova_compute_init[282391]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.cache/python-entrypoints/fc52238ffcbdcb325c6bf3fe6412477fc4bdb6cd9151f39289b74f25e08e0db9
Feb 23 09:39:08 np0005626463.localdomain nova_compute_init[282391]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.cache/python-entrypoints/f23138a46bc477ec40b895db4322b27384fbb01ccd8da7395c9877132dfb82af
Feb 23 09:39:08 np0005626463.localdomain nova_compute_init[282391]: INFO:nova_statedir:Nova statedir ownership complete
Feb 23 09:39:08 np0005626463.localdomain systemd[1]: libpod-0dcfdba09d286eb3abcbc0c0350212b6da0b7ea5d4e3ec4a127be329988c054e.scope: Deactivated successfully.
Feb 23 09:39:08 np0005626463.localdomain podman[282392]: 2026-02-23 09:39:08.312463711 +0000 UTC m=+0.054979057 container died 0dcfdba09d286eb3abcbc0c0350212b6da0b7ea5d4e3ec4a127be329988c054e (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, io.buildah.version=1.43.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=nova_compute_init, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, config_data={'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False, 'EDPM_CONFIG_HASH': '3424c3acd670eb930ca235b189e22cc85d2902412aa65355a03bde52550f3369'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'none', 'privileged': False, 'restart': 'never', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, config_id=nova_compute_init, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team)
Feb 23 09:39:08 np0005626463.localdomain podman[282404]: 2026-02-23 09:39:08.391367122 +0000 UTC m=+0.075510207 container cleanup 0dcfdba09d286eb3abcbc0c0350212b6da0b7ea5d4e3ec4a127be329988c054e (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=nova_compute_init, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False, 'EDPM_CONFIG_HASH': '3424c3acd670eb930ca235b189e22cc85d2902412aa65355a03bde52550f3369'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'none', 'privileged': False, 'restart': 'never', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, org.label-schema.build-date=20260216, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=nova_compute_init, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Feb 23 09:39:08 np0005626463.localdomain systemd[1]: libpod-conmon-0dcfdba09d286eb3abcbc0c0350212b6da0b7ea5d4e3ec4a127be329988c054e.scope: Deactivated successfully.
Feb 23 09:39:08 np0005626463.localdomain sudo[282326]: pam_unix(sudo:session): session closed for user root
Feb 23 09:39:08 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-a57f867c5b8445456b5e18d0ba5d2cc6c62032606bf7ac5da73f5ad9f9752ac4-merged.mount: Deactivated successfully.
Feb 23 09:39:08 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-0dcfdba09d286eb3abcbc0c0350212b6da0b7ea5d4e3ec4a127be329988c054e-userdata-shm.mount: Deactivated successfully.
Feb 23 09:39:08 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:08.877 282211 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Feb 23 09:39:08 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:08.878 282211 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Feb 23 09:39:08 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:08.878 282211 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Feb 23 09:39:08 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:08.879 282211 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
Feb 23 09:39:08 np0005626463.localdomain sshd[265939]: pam_unix(sshd:session): session closed for user zuul
Feb 23 09:39:08 np0005626463.localdomain systemd[1]: session-59.scope: Deactivated successfully.
Feb 23 09:39:08 np0005626463.localdomain systemd[1]: session-59.scope: Consumed 1min 22.025s CPU time.
Feb 23 09:39:08 np0005626463.localdomain systemd-logind[759]: Session 59 logged out. Waiting for processes to exit.
Feb 23 09:39:08 np0005626463.localdomain systemd-logind[759]: Removed session 59.
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.024 282211 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.046 282211 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.022s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.046 282211 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473
Feb 23 09:39:09 np0005626463.localdomain podman[242954]: time="2026-02-23T09:39:09Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 23 09:39:09 np0005626463.localdomain podman[242954]: @ - - [23/Feb/2026:09:39:09 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 149682 "" "Go-http-client/1.1"
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.429 282211 INFO nova.virt.driver [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'
Feb 23 09:39:09 np0005626463.localdomain podman[242954]: @ - - [23/Feb/2026:09:39:09 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16786 "" "Go-http-client/1.1"
Feb 23 09:39:09 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57424 DF PROTO=TCP SPT=47026 DPT=9102 SEQ=1491910498 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BFD4E060000000001030307) 
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.541 282211 INFO nova.compute.provider_config [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.549 282211 DEBUG oslo_concurrency.lockutils [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.549 282211 DEBUG oslo_concurrency.lockutils [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.549 282211 DEBUG oslo_concurrency.lockutils [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.550 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.550 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.550 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.550 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.550 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.551 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.551 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.551 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.551 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.551 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.551 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.551 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.552 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.552 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.552 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.552 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.552 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.552 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.552 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.552 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] console_host                   = np0005626463.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.553 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.553 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.553 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.553 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.553 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.553 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.553 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.554 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.554 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.554 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.554 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.554 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.554 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.554 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.555 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.555 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.555 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.555 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.555 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] host                           = np0005626463.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.555 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.556 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.556 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.556 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.556 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.556 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.556 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.556 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.557 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.557 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.557 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.557 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.557 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.557 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.557 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.558 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.558 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.558 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.558 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.558 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.558 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.558 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.558 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.559 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.559 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.559 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.559 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.559 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.559 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.559 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.559 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.560 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.560 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.560 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.560 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.560 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.560 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.560 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.561 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.561 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.561 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.561 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] my_block_storage_ip            = 192.168.122.106 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.561 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] my_ip                          = 192.168.122.106 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.561 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.561 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.562 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.562 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.562 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.562 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.562 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.562 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.562 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.563 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.563 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.563 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.563 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.563 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.563 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.563 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.563 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.564 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.564 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.564 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.564 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.564 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.564 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.564 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.565 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.565 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.565 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.565 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.565 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.565 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.565 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.566 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.566 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.566 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.566 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.566 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.566 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.566 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.566 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.567 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.567 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.567 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.567 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.567 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.567 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.567 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.568 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.568 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.568 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.568 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.568 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.568 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.568 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.568 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.569 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.569 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.569 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.569 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.569 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.569 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.569 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.570 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.570 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.570 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.570 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.570 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.570 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.570 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.571 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.571 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.571 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.571 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.571 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.571 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.571 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.572 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.572 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.572 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.572 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.572 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.572 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.572 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.573 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.573 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.573 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.573 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.573 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.573 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.573 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.574 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.574 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.574 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.574 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.574 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.574 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.574 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.575 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.575 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.575 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.575 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.575 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.575 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.575 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.576 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.576 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.576 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.576 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.576 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.576 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.576 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.577 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.577 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.577 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.577 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.577 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.577 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.577 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.578 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.578 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.578 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.578 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.578 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.578 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.578 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.579 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.579 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.579 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.579 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.579 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.579 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.579 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.580 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.580 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.580 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.580 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.580 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.580 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.580 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.581 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.581 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.581 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] cinder.os_region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.581 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.581 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.581 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.581 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.582 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.582 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.582 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.582 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.582 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.582 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.582 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.583 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.583 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.583 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.583 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.583 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.583 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.583 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.584 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.584 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.584 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.584 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.584 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.584 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.584 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.585 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.585 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.585 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.585 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.585 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.585 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.585 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.585 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.586 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.586 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.586 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.586 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.586 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.586 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.586 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.587 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.587 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.587 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.587 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.587 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.587 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.587 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.588 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.588 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.588 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.588 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.588 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.588 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.588 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.589 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.589 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.589 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.589 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.589 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.589 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.589 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.590 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.590 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.590 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.590 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.590 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.590 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.590 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.591 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.591 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.591 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.591 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.591 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.591 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.591 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.592 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.592 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.592 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.592 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.592 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.592 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.592 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.593 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.593 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.593 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.593 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.593 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.593 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.593 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.594 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.594 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.594 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.594 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.594 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.594 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.594 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.595 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.595 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.595 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.595 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.595 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.595 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.595 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.596 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.596 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.596 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.596 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.596 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.596 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.596 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.597 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.597 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.597 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.597 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.597 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.597 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.597 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.598 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.598 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.598 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.598 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.598 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.598 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.598 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.598 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.599 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.599 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.599 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.599 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.599 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.599 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.599 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.600 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.600 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.600 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.600 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.600 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.601 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.601 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.601 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.601 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.601 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.601 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.601 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.601 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.602 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.602 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.602 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.602 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.602 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.602 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.602 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.603 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.603 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.603 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.603 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.603 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.603 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.603 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.604 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.604 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.604 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.604 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.604 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.604 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.604 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.605 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.605 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.605 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.605 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.605 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.605 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] barbican.barbican_region_name  = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.605 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.606 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.606 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.606 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.606 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.606 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.606 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.606 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.607 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.607 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.607 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.607 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.607 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.607 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.607 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.608 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.608 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.608 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.608 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.608 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.608 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.608 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.608 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.609 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.609 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.609 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.609 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.609 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.609 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.609 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.610 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.610 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.610 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.610 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.610 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.610 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.610 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.611 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.611 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.611 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.611 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.611 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.611 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.611 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.612 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.612 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.612 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.612 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.612 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.612 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.612 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.612 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.613 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.613 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.613 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.613 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.613 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.613 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.cpu_mode               = host-model log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.613 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.614 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.cpu_models             = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.614 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.614 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.614 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.614 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.614 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.614 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.615 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.615 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.615 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.615 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.615 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.615 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.615 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.616 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.images_rbd_ceph_conf   = /etc/ceph/ceph.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.616 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.616 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.616 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.images_rbd_glance_store_name = default_backend log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.616 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.images_rbd_pool        = vms log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.616 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.images_type            = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.616 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.617 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.617 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.617 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.617 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.617 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.617 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.617 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.618 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.618 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.618 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.618 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.618 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.618 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.618 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.618 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.619 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.619 282211 WARNING oslo_config.cfg [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: live_migration_uri is deprecated for removal in favor of two other options that
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: and ``live_migration_inbound_addr`` respectively.
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: ).  Its value may be silently ignored in the future.
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.619 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.live_migration_uri     = qemu+ssh://nova@%s/system?keyfile=/var/lib/nova/.ssh/ssh-privatekey log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.619 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.live_migration_with_native_tls = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.619 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.619 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.620 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.620 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.620 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.620 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.620 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.620 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.620 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.621 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.621 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.621 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.621 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.621 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.621 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.622 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.622 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.rbd_secret_uuid        = f1fea371-cb69-578d-a3d0-b5c472a84b46 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.622 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.rbd_user               = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.622 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.622 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.622 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.622 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.623 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.624 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.625 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.625 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.625 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.626 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.626 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.627 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.627 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.627 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.628 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.628 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.628 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.629 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.629 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.629 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.630 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.630 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.630 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.630 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.631 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.631 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.631 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.632 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.632 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.632 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.633 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.633 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.633 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.634 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.634 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.634 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.635 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.635 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.635 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.636 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.636 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.636 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.637 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.637 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.637 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.638 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.638 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.638 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.639 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.639 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.639 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.640 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.640 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.640 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.641 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.641 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.641 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.641 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.642 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.642 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.642 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.643 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.643 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.644 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.644 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.644 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.644 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.645 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.645 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.645 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.646 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] placement.auth_url             = http://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.646 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.647 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.647 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.647 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.648 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.648 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.648 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.649 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.649 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.649 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.650 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.650 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.650 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.651 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.651 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.651 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.652 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.652 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.652 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.653 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.653 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.653 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.654 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.654 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.654 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.655 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.655 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.655 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.656 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.656 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.656 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.656 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.657 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.657 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.657 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.658 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.658 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.658 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.659 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.659 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.659 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.660 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.660 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.660 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.661 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.661 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.662 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.662 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.663 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.663 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.663 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.664 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.664 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.664 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.665 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.665 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.665 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.666 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.666 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.666 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.667 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.667 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.667 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.668 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.668 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.668 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.669 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.669 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.669 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.670 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.670 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.670 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.671 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.671 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.672 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.672 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.672 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.673 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.673 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.673 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.674 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.674 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.674 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.675 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.675 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.675 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.676 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.676 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.676 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.676 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.677 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.677 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.677 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.677 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.677 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.678 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.678 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.678 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.678 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.678 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.679 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.679 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.679 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.679 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.679 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.680 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.680 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.680 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.680 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.680 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.681 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.681 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.681 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.681 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.681 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.682 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.682 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.682 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.682 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.683 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.683 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.683 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.683 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.684 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.684 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.684 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.684 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.684 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.685 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.685 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.685 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.685 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.685 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.686 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.686 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.686 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.686 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.687 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.687 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.687 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.687 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.687 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.688 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.688 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.688 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.688 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.688 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.689 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.689 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.689 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.689 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.689 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.690 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.690 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.690 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.690 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.690 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.691 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] vnc.novncproxy_base_url        = http://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.691 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.691 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.691 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.692 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] vnc.server_proxyclient_address = 192.168.122.106 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.692 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.692 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.692 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.692 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.693 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.693 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.693 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.693 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.693 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.693 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.694 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.694 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.694 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.694 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.694 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.695 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.695 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.695 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.695 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.695 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.696 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.696 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.696 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.696 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.696 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.697 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.697 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.697 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.697 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.697 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.698 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.698 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.698 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.698 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.698 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.699 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.699 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.699 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.699 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.699 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.700 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.700 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.700 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.700 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.700 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.701 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.701 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.701 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.701 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.701 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.702 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.702 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.702 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.702 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.702 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.703 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.703 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.703 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.703 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.703 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.704 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.704 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.704 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.704 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.704 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.705 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.705 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.705 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.705 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.706 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.706 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.706 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.706 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.707 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.707 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.707 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.707 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.707 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.707 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.708 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.708 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.708 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.708 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.708 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.709 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.709 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.709 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.709 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.710 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.710 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.710 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_limit.auth_url            = http://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.710 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.710 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.710 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.711 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.711 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.711 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.711 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.711 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.712 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.712 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.712 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.712 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.712 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.712 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.712 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.713 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.713 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.713 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.713 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.713 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.713 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.713 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.713 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.714 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.714 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.714 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.714 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.714 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.714 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.715 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.715 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.715 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.715 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.715 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.715 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.715 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.716 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.716 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.716 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.716 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.716 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.716 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.717 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.717 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.717 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.717 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.717 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.717 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.717 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.718 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.718 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.718 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.718 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.718 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.718 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.718 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.719 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.719 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.719 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.719 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.719 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.719 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.719 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.720 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.720 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.720 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.720 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.720 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.720 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.720 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.720 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.721 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.721 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.721 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.721 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.721 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.721 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.721 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.722 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.722 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.722 282211 INFO nova.service [-] Starting compute node (version 27.5.2-0.20260220085704.5cfeecb.el9)
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.749 282211 INFO nova.virt.node [None req-a3e77674-ebee-4da0-8221-dce4e3ab889b - - - - - -] Determined node identity be63d86c-a403-4ec9-a515-07ea2962cb4d from /var/lib/nova/compute_id
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.750 282211 DEBUG nova.virt.libvirt.host [None req-a3e77674-ebee-4da0-8221-dce4e3ab889b - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.751 282211 DEBUG nova.virt.libvirt.host [None req-a3e77674-ebee-4da0-8221-dce4e3ab889b - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.752 282211 DEBUG nova.virt.libvirt.host [None req-a3e77674-ebee-4da0-8221-dce4e3ab889b - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.752 282211 DEBUG nova.virt.libvirt.host [None req-a3e77674-ebee-4da0-8221-dce4e3ab889b - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.763 282211 DEBUG nova.virt.libvirt.host [None req-a3e77674-ebee-4da0-8221-dce4e3ab889b - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7f454eaea5e0> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.766 282211 DEBUG nova.virt.libvirt.host [None req-a3e77674-ebee-4da0-8221-dce4e3ab889b - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7f454eaea5e0> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.768 282211 INFO nova.virt.libvirt.driver [None req-a3e77674-ebee-4da0-8221-dce4e3ab889b - - - - - -] Connection event '1' reason 'None'
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.771 282211 INFO nova.virt.libvirt.host [None req-a3e77674-ebee-4da0-8221-dce4e3ab889b - - - - - -] Libvirt host capabilities <capabilities>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:   <host>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:     <uuid>bdcaa433-cfc7-450a-99ab-f0985ab59447</uuid>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:     <cpu>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <arch>x86_64</arch>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model>EPYC-Rome-v4</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <vendor>AMD</vendor>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <microcode version='16777317'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <signature family='23' model='49' stepping='0'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <maxphysaddr mode='emulate' bits='40'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <feature name='x2apic'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <feature name='tsc-deadline'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <feature name='osxsave'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <feature name='hypervisor'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <feature name='tsc_adjust'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <feature name='spec-ctrl'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <feature name='stibp'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <feature name='arch-capabilities'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <feature name='ssbd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <feature name='cmp_legacy'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <feature name='topoext'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <feature name='virt-ssbd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <feature name='lbrv'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <feature name='tsc-scale'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <feature name='vmcb-clean'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <feature name='pause-filter'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <feature name='pfthreshold'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <feature name='svme-addr-chk'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <feature name='rdctl-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <feature name='skip-l1dfl-vmentry'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <feature name='mds-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <feature name='pschange-mc-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <pages unit='KiB' size='4'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <pages unit='KiB' size='2048'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <pages unit='KiB' size='1048576'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:     </cpu>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:     <power_management>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <suspend_mem/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <suspend_disk/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <suspend_hybrid/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:     </power_management>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:     <iommu support='no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:     <migration_features>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <live/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <uri_transports>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <uri_transport>tcp</uri_transport>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <uri_transport>rdma</uri_transport>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </uri_transports>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:     </migration_features>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:     <topology>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <cells num='1'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <cell id='0'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:           <memory unit='KiB'>16116612</memory>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:           <pages unit='KiB' size='4'>4029153</pages>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:           <pages unit='KiB' size='2048'>0</pages>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:           <pages unit='KiB' size='1048576'>0</pages>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:           <distances>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:             <sibling id='0' value='10'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:           </distances>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:           <cpus num='8'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:             <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:             <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:             <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:             <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:             <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:             <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:             <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:             <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:           </cpus>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         </cell>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </cells>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:     </topology>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:     <cache>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:     </cache>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:     <secmodel>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model>selinux</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <doi>0</doi>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:     </secmodel>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:     <secmodel>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model>dac</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <doi>0</doi>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <baselabel type='kvm'>+107:+107</baselabel>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <baselabel type='qemu'>+107:+107</baselabel>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:     </secmodel>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:   </host>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:   <guest>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:     <os_type>hvm</os_type>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:     <arch name='i686'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <wordsize>32</wordsize>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <domain type='qemu'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <domain type='kvm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:     </arch>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:     <features>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <pae/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <nonpae/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <acpi default='on' toggle='yes'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <apic default='on' toggle='no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <cpuselection/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <deviceboot/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <disksnapshot default='on' toggle='no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <externalSnapshot/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:     </features>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:   </guest>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:   <guest>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:     <os_type>hvm</os_type>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:     <arch name='x86_64'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <wordsize>64</wordsize>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <domain type='qemu'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <domain type='kvm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:     </arch>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:     <features>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <acpi default='on' toggle='yes'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <apic default='on' toggle='no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <cpuselection/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <deviceboot/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <disksnapshot default='on' toggle='no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <externalSnapshot/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:     </features>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:   </guest>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: </capabilities>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.783 282211 DEBUG nova.virt.libvirt.host [None req-a3e77674-ebee-4da0-8221-dce4e3ab889b - - - - - -] Getting domain capabilities for i686 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.786 282211 DEBUG nova.virt.libvirt.volume.mount [None req-a3e77674-ebee-4da0-8221-dce4e3ab889b - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.788 282211 DEBUG nova.virt.libvirt.host [None req-a3e77674-ebee-4da0-8221-dce4e3ab889b - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: <domainCapabilities>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:   <path>/usr/libexec/qemu-kvm</path>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:   <domain>kvm</domain>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:   <machine>pc-i440fx-rhel7.6.0</machine>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:   <arch>i686</arch>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:   <vcpu max='240'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:   <iothreads supported='yes'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:   <os supported='yes'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:     <enum name='firmware'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:     <loader supported='yes'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <enum name='type'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>rom</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>pflash</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </enum>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <enum name='readonly'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>yes</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>no</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </enum>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <enum name='secure'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>no</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </enum>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:     </loader>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:   </os>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:   <cpu>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:     <mode name='host-passthrough' supported='yes'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <enum name='hostPassthroughMigratable'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>on</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>off</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </enum>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:     </mode>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:     <mode name='maximum' supported='yes'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <enum name='maximumMigratable'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>on</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>off</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </enum>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:     </mode>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:     <mode name='host-model' supported='yes'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model fallback='forbid'>EPYC-Rome</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <vendor>AMD</vendor>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <maxphysaddr mode='passthrough' limit='40'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <feature policy='require' name='x2apic'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <feature policy='require' name='tsc-deadline'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <feature policy='require' name='hypervisor'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <feature policy='require' name='tsc_adjust'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <feature policy='require' name='spec-ctrl'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <feature policy='require' name='stibp'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <feature policy='require' name='ssbd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <feature policy='require' name='cmp_legacy'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <feature policy='require' name='overflow-recov'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <feature policy='require' name='succor'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <feature policy='require' name='ibrs'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <feature policy='require' name='amd-ssbd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <feature policy='require' name='virt-ssbd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <feature policy='require' name='lbrv'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <feature policy='require' name='tsc-scale'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <feature policy='require' name='vmcb-clean'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <feature policy='require' name='pause-filter'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <feature policy='require' name='pfthreshold'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <feature policy='require' name='svme-addr-chk'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <feature policy='require' name='lfence-always-serializing'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <feature policy='disable' name='xsaves'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:     </mode>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:     <mode name='custom' supported='yes'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Broadwell'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='hle'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='rtm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Broadwell-IBRS'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='hle'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='rtm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Broadwell-noTSX'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Broadwell-noTSX-IBRS'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Broadwell-v1'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='hle'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='rtm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Broadwell-v2'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Broadwell-v3'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='hle'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='rtm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Broadwell-v4'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Cascadelake-Server'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vnni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='hle'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='rtm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Cascadelake-Server-noTSX'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vnni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='ibrs-all'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Cascadelake-Server-v1'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vnni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='hle'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='rtm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Cascadelake-Server-v2'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vnni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='hle'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='ibrs-all'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='rtm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Cascadelake-Server-v3'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vnni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='ibrs-all'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Cascadelake-Server-v4'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vnni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='ibrs-all'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Cascadelake-Server-v5'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vnni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='ibrs-all'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='ClearwaterForest'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx-ifma'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx-ne-convert'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx-vnni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx-vnni-int16'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx-vnni-int8'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='bhi-ctrl'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='bhi-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='bus-lock-detect'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='cldemote'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='cmpccxadd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='ddpd-u'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fbsdp-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrs'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='gds-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='gfni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='ibrs-all'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='intel-psfd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='ipred-ctrl'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='lam'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='mcdt-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='movdir64b'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='movdiri'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pbrsb-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='prefetchiti'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='psdp-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='rfds-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='rrsba-ctrl'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='serialize'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='sha512'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='sm3'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='sm4'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='ss'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vaes'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vpclmulqdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='ClearwaterForest-v1'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx-ifma'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx-ne-convert'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx-vnni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx-vnni-int16'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx-vnni-int8'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='bhi-ctrl'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='bhi-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='bus-lock-detect'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='cldemote'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='cmpccxadd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='ddpd-u'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fbsdp-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrs'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='gds-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='gfni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='ibrs-all'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='intel-psfd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='ipred-ctrl'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='lam'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='mcdt-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='movdir64b'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='movdiri'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pbrsb-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='prefetchiti'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='psdp-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='rfds-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='rrsba-ctrl'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='serialize'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='sha512'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='sm3'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='sm4'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='ss'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vaes'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vpclmulqdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Cooperlake'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-bf16'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vnni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='hle'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='ibrs-all'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='rtm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='taa-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Cooperlake-v1'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-bf16'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vnni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='hle'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='ibrs-all'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='rtm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='taa-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Cooperlake-v2'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-bf16'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vnni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='hle'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='ibrs-all'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='rtm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='taa-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Denverton'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='mpx'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Denverton-v1'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='mpx'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Denverton-v2'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Denverton-v3'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Dhyana-v2'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='EPYC-Genoa'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='amd-psfd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='auto-ibrs'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-bf16'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bitalg'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512ifma'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi2'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vnni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='gfni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='la57'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='no-nested-data-bp'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='null-sel-clr-base'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='stibp-always-on'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vaes'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vpclmulqdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='EPYC-Genoa-v1'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='amd-psfd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='auto-ibrs'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-bf16'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bitalg'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512ifma'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi2'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vnni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='gfni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='la57'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='no-nested-data-bp'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='null-sel-clr-base'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='stibp-always-on'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vaes'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vpclmulqdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='EPYC-Genoa-v2'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='amd-psfd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='auto-ibrs'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-bf16'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bitalg'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512ifma'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi2'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vnni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fs-gs-base-ns'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='gfni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='la57'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='no-nested-data-bp'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='null-sel-clr-base'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='perfmon-v2'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='stibp-always-on'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vaes'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vpclmulqdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='EPYC-Milan'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='EPYC-Milan-v1'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='EPYC-Milan-v2'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='amd-psfd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='no-nested-data-bp'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='null-sel-clr-base'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='stibp-always-on'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vaes'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vpclmulqdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='EPYC-Milan-v3'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='amd-psfd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='no-nested-data-bp'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='null-sel-clr-base'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='stibp-always-on'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vaes'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vpclmulqdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='EPYC-Rome'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='EPYC-Rome-v1'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='EPYC-Rome-v2'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='EPYC-Rome-v3'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='EPYC-Turin'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='amd-psfd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='auto-ibrs'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx-vnni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-bf16'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-vp2intersect'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bitalg'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512ifma'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi2'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vnni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fs-gs-base-ns'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='gfni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='ibpb-brtype'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='la57'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='movdir64b'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='movdiri'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='no-nested-data-bp'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='null-sel-clr-base'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='perfmon-v2'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='prefetchi'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='sbpb'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='srso-user-kernel-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='stibp-always-on'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vaes'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vpclmulqdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='EPYC-Turin-v1'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='amd-psfd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='auto-ibrs'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx-vnni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-bf16'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-vp2intersect'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bitalg'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512ifma'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi2'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vnni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fs-gs-base-ns'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='gfni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='ibpb-brtype'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='la57'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='movdir64b'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='movdiri'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='no-nested-data-bp'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='null-sel-clr-base'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='perfmon-v2'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='prefetchi'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='sbpb'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='srso-user-kernel-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='stibp-always-on'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vaes'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vpclmulqdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='EPYC-v3'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='EPYC-v4'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='EPYC-v5'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='GraniteRapids'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='amx-bf16'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='amx-fp16'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='amx-int8'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='amx-tile'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx-vnni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-bf16'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-fp16'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bitalg'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512ifma'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi2'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vnni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='bus-lock-detect'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fbsdp-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrc'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrs'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fzrm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='gfni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='hle'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='ibrs-all'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='la57'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='mcdt-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pbrsb-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='prefetchiti'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='psdp-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='rtm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='serialize'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='taa-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='tsx-ldtrk'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vaes'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vpclmulqdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='xfd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='GraniteRapids-v1'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='amx-bf16'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='amx-fp16'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='amx-int8'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='amx-tile'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx-vnni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-bf16'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-fp16'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bitalg'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512ifma'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi2'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vnni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='bus-lock-detect'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fbsdp-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrc'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrs'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fzrm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='gfni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='hle'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='ibrs-all'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='la57'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='mcdt-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pbrsb-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='prefetchiti'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='psdp-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='rtm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='serialize'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='taa-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='tsx-ldtrk'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vaes'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vpclmulqdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='xfd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='GraniteRapids-v2'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='amx-bf16'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='amx-fp16'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='amx-int8'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='amx-tile'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx-vnni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx10'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx10-128'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx10-256'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx10-512'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-bf16'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-fp16'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bitalg'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512ifma'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi2'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vnni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='bus-lock-detect'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='cldemote'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fbsdp-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrc'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrs'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fzrm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='gfni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='hle'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='ibrs-all'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='la57'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='mcdt-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='movdir64b'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='movdiri'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pbrsb-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='prefetchiti'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='psdp-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='rtm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='serialize'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='ss'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='taa-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='tsx-ldtrk'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vaes'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vpclmulqdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='xfd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='GraniteRapids-v3'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='amx-bf16'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='amx-fp16'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='amx-int8'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='amx-tile'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx-vnni'/>
Feb 23 09:39:09 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx10'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx10-128'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx10-256'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx10-512'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-bf16'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-fp16'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bitalg'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512ifma'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi2'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vnni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='bus-lock-detect'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='cldemote'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fbsdp-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrc'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrs'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fzrm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='gfni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='hle'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='ibrs-all'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='la57'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='mcdt-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='movdir64b'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='movdiri'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pbrsb-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='prefetchiti'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='psdp-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='rtm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='serialize'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='ss'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='taa-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='tsx-ldtrk'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vaes'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vpclmulqdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='xfd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Haswell'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='hle'/>
Feb 23 09:39:09 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='rtm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Haswell-IBRS'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='hle'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='rtm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Haswell-noTSX'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Haswell-noTSX-IBRS'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Haswell-v1'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='hle'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='rtm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Haswell-v2'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Haswell-v3'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='hle'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='rtm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Haswell-v4'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Icelake-Server'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bitalg'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi2'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vnni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='gfni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='hle'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='la57'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='rtm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vaes'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vpclmulqdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Icelake-Server-noTSX'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bitalg'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi2'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vnni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='gfni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='la57'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vaes'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vpclmulqdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Icelake-Server-v1'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bitalg'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi2'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vnni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='gfni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='hle'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='la57'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='rtm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vaes'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vpclmulqdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Icelake-Server-v2'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bitalg'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi2'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vnni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='gfni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='la57'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vaes'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vpclmulqdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Icelake-Server-v3'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bitalg'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi2'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vnni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='gfni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='ibrs-all'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='la57'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='taa-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vaes'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vpclmulqdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Icelake-Server-v4'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bitalg'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512ifma'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi2'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vnni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='gfni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='ibrs-all'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='la57'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='taa-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vaes'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vpclmulqdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Icelake-Server-v5'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bitalg'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512ifma'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi2'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vnni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='gfni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='ibrs-all'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='la57'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='taa-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vaes'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vpclmulqdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Icelake-Server-v6'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bitalg'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512ifma'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi2'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vnni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='gfni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='ibrs-all'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='la57'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='taa-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vaes'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vpclmulqdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Icelake-Server-v7'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bitalg'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512ifma'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi2'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vnni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='gfni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='hle'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='ibrs-all'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='la57'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='rtm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='taa-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vaes'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vpclmulqdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='IvyBridge'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='IvyBridge-IBRS'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='IvyBridge-v1'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='IvyBridge-v2'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='KnightsMill'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-4fmaps'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-4vnniw'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512er'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512pf'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='ss'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='KnightsMill-v1'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-4fmaps'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-4vnniw'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512er'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512pf'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='ss'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Opteron_G4'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fma4'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='xop'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Opteron_G4-v1'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fma4'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='xop'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Opteron_G5'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fma4'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='tbm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='xop'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Opteron_G5-v1'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fma4'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='tbm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='xop'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='SapphireRapids'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='amx-bf16'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='amx-int8'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='amx-tile'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx-vnni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-bf16'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-fp16'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bitalg'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512ifma'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi2'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vnni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='bus-lock-detect'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrc'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrs'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fzrm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='gfni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='hle'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='ibrs-all'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='la57'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='rtm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='serialize'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='taa-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='tsx-ldtrk'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vaes'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vpclmulqdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='xfd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='SapphireRapids-v1'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='amx-bf16'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='amx-int8'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='amx-tile'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx-vnni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-bf16'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-fp16'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bitalg'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512ifma'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi2'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vnni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='bus-lock-detect'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrc'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrs'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fzrm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='gfni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='hle'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='ibrs-all'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='la57'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='rtm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='serialize'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='taa-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='tsx-ldtrk'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vaes'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vpclmulqdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='xfd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='SapphireRapids-v2'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='amx-bf16'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='amx-int8'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='amx-tile'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx-vnni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-bf16'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-fp16'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bitalg'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512ifma'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi2'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vnni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='bus-lock-detect'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fbsdp-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrc'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrs'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fzrm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='gfni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='hle'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='ibrs-all'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='la57'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='psdp-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='rtm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='serialize'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='taa-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='tsx-ldtrk'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vaes'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vpclmulqdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='xfd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='SapphireRapids-v3'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='amx-bf16'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='amx-int8'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='amx-tile'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx-vnni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-bf16'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-fp16'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bitalg'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512ifma'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi2'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vnni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='bus-lock-detect'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='cldemote'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fbsdp-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrc'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrs'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fzrm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='gfni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='hle'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='ibrs-all'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='la57'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='movdir64b'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='movdiri'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='psdp-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='rtm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='serialize'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='ss'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='taa-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='tsx-ldtrk'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vaes'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vpclmulqdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='xfd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='SapphireRapids-v4'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='amx-bf16'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='amx-int8'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='amx-tile'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx-vnni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-bf16'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-fp16'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bitalg'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512ifma'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi2'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vnni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='bus-lock-detect'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='cldemote'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fbsdp-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrc'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrs'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fzrm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='gfni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='hle'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='ibrs-all'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='la57'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='movdir64b'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='movdiri'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='psdp-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='rtm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='serialize'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='ss'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='taa-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='tsx-ldtrk'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vaes'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vpclmulqdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='xfd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='SierraForest'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx-ifma'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx-ne-convert'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx-vnni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx-vnni-int8'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='bus-lock-detect'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='cmpccxadd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fbsdp-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrs'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='gfni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='ibrs-all'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='mcdt-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pbrsb-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='psdp-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='serialize'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vaes'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vpclmulqdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='SierraForest-v1'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx-ifma'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx-ne-convert'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx-vnni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx-vnni-int8'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='bus-lock-detect'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='cmpccxadd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fbsdp-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrs'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='gfni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='ibrs-all'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='mcdt-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pbrsb-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='psdp-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='serialize'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vaes'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vpclmulqdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='SierraForest-v2'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx-ifma'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx-ne-convert'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx-vnni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx-vnni-int8'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='bhi-ctrl'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='bus-lock-detect'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='cldemote'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='cmpccxadd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fbsdp-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrs'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='gds-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='gfni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='ibrs-all'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='intel-psfd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='ipred-ctrl'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='lam'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='mcdt-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='movdir64b'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='movdiri'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pbrsb-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='psdp-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='rfds-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='rrsba-ctrl'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='serialize'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='ss'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vaes'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vpclmulqdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='SierraForest-v3'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx-ifma'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx-ne-convert'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx-vnni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx-vnni-int8'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='bhi-ctrl'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='bus-lock-detect'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='cldemote'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='cmpccxadd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fbsdp-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrs'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='gds-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='gfni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='ibrs-all'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='intel-psfd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='ipred-ctrl'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='lam'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='mcdt-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='movdir64b'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='movdiri'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pbrsb-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='psdp-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='rfds-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='rrsba-ctrl'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='serialize'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='ss'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vaes'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vpclmulqdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Skylake-Client'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='hle'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='rtm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Skylake-Client-IBRS'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='hle'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='rtm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Skylake-Client-v1'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='hle'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='rtm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Skylake-Client-v2'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='hle'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='rtm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Skylake-Client-v3'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Skylake-Client-v4'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Skylake-Server'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='hle'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='rtm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Skylake-Server-IBRS'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='hle'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='rtm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Skylake-Server-v1'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='hle'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='rtm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Skylake-Server-v2'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='hle'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='rtm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Skylake-Server-v3'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Skylake-Server-v4'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Skylake-Server-v5'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Snowridge'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='cldemote'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='core-capability'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='gfni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='movdir64b'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='movdiri'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='mpx'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='split-lock-detect'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Snowridge-v1'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='cldemote'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='core-capability'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='gfni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='movdir64b'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='movdiri'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='mpx'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='split-lock-detect'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Snowridge-v2'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='cldemote'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='core-capability'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='gfni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='movdir64b'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='movdiri'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='split-lock-detect'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Snowridge-v3'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='cldemote'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='core-capability'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='gfni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='movdir64b'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='movdiri'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='split-lock-detect'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Snowridge-v4'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='cldemote'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='gfni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='movdir64b'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='movdiri'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='athlon'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='3dnow'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='3dnowext'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='athlon-v1'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='3dnow'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='3dnowext'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='core2duo'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='ss'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='core2duo-v1'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='ss'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='coreduo'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='ss'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='coreduo-v1'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='ss'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='n270'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='ss'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='n270-v1'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='ss'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='phenom'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='3dnow'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='3dnowext'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='phenom-v1'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='3dnow'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='3dnowext'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:     </mode>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:   </cpu>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:   <memoryBacking supported='yes'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:     <enum name='sourceType'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <value>file</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <value>anonymous</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <value>memfd</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:     </enum>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:   </memoryBacking>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:   <devices>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:     <disk supported='yes'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <enum name='diskDevice'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>disk</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>cdrom</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>floppy</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>lun</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </enum>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <enum name='bus'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>ide</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>fdc</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>scsi</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>virtio</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>usb</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>sata</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </enum>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <enum name='model'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>virtio</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>virtio-transitional</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>virtio-non-transitional</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </enum>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:     </disk>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:     <graphics supported='yes'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <enum name='type'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>vnc</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>egl-headless</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>dbus</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </enum>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:     </graphics>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:     <video supported='yes'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <enum name='modelType'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>vga</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>cirrus</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>virtio</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>none</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>bochs</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>ramfb</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </enum>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:     </video>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:     <hostdev supported='yes'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <enum name='mode'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>subsystem</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </enum>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <enum name='startupPolicy'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>default</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>mandatory</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>requisite</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>optional</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </enum>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <enum name='subsysType'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>usb</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>pci</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>scsi</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </enum>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <enum name='capsType'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <enum name='pciBackend'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:     </hostdev>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:     <rng supported='yes'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <enum name='model'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>virtio</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>virtio-transitional</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>virtio-non-transitional</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </enum>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <enum name='backendModel'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>random</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>egd</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>builtin</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </enum>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:     </rng>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:     <filesystem supported='yes'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <enum name='driverType'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>path</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>handle</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>virtiofs</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </enum>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:     </filesystem>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:     <tpm supported='yes'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <enum name='model'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>tpm-tis</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>tpm-crb</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </enum>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <enum name='backendModel'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>emulator</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>external</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </enum>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <enum name='backendVersion'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>2.0</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </enum>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:     </tpm>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:     <redirdev supported='yes'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <enum name='bus'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>usb</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </enum>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:     </redirdev>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:     <channel supported='yes'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <enum name='type'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>pty</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>unix</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </enum>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:     </channel>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:     <crypto supported='yes'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <enum name='model'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <enum name='type'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>qemu</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </enum>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <enum name='backendModel'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>builtin</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </enum>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:     </crypto>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:     <interface supported='yes'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <enum name='backendType'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>default</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>passt</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </enum>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:     </interface>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:     <panic supported='yes'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <enum name='model'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>isa</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>hyperv</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </enum>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:     </panic>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:     <console supported='yes'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <enum name='type'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>null</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>vc</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>pty</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>dev</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>file</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>pipe</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>stdio</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>udp</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>tcp</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>unix</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>qemu-vdagent</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>dbus</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </enum>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:     </console>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:   </devices>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:   <features>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:     <gic supported='no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:     <vmcoreinfo supported='yes'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:     <genid supported='yes'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:     <backingStoreInput supported='yes'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:     <backup supported='yes'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:     <async-teardown supported='yes'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:     <s390-pv supported='no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:     <ps2 supported='yes'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:     <tdx supported='no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:     <sev supported='no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:     <sgx supported='no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:     <hyperv supported='yes'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <enum name='features'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>relaxed</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>vapic</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>spinlocks</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>vpindex</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>runtime</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>synic</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>stimer</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>reset</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>vendor_id</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>frequencies</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>reenlightenment</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>tlbflush</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>ipi</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>avic</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>emsr_bitmap</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>xmm_input</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </enum>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <defaults>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <spinlocks>4095</spinlocks>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <stimer_direct>on</stimer_direct>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <tlbflush_direct>off</tlbflush_direct>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <tlbflush_extended>off</tlbflush_extended>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <vendor_id>Linux KVM Hv</vendor_id>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </defaults>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:     </hyperv>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:     <launchSecurity supported='no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:   </features>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: </domainCapabilities>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.799 282211 DEBUG nova.virt.libvirt.host [None req-a3e77674-ebee-4da0-8221-dce4e3ab889b - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: <domainCapabilities>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:   <path>/usr/libexec/qemu-kvm</path>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:   <domain>kvm</domain>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:   <machine>pc-q35-rhel9.8.0</machine>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:   <arch>i686</arch>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:   <vcpu max='1024'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:   <iothreads supported='yes'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:   <os supported='yes'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:     <enum name='firmware'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:     <loader supported='yes'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <enum name='type'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>rom</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>pflash</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </enum>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <enum name='readonly'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>yes</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>no</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </enum>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <enum name='secure'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>no</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </enum>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:     </loader>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:   </os>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:   <cpu>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:     <mode name='host-passthrough' supported='yes'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <enum name='hostPassthroughMigratable'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>on</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>off</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </enum>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:     </mode>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:     <mode name='maximum' supported='yes'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <enum name='maximumMigratable'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>on</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>off</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </enum>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:     </mode>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:     <mode name='host-model' supported='yes'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model fallback='forbid'>EPYC-Rome</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <vendor>AMD</vendor>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <maxphysaddr mode='passthrough' limit='40'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <feature policy='require' name='x2apic'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <feature policy='require' name='tsc-deadline'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <feature policy='require' name='hypervisor'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <feature policy='require' name='tsc_adjust'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <feature policy='require' name='spec-ctrl'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <feature policy='require' name='stibp'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <feature policy='require' name='ssbd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <feature policy='require' name='cmp_legacy'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <feature policy='require' name='overflow-recov'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <feature policy='require' name='succor'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <feature policy='require' name='ibrs'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <feature policy='require' name='amd-ssbd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <feature policy='require' name='virt-ssbd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <feature policy='require' name='lbrv'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <feature policy='require' name='tsc-scale'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <feature policy='require' name='vmcb-clean'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <feature policy='require' name='pause-filter'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <feature policy='require' name='pfthreshold'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <feature policy='require' name='svme-addr-chk'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <feature policy='require' name='lfence-always-serializing'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <feature policy='disable' name='xsaves'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:     </mode>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:     <mode name='custom' supported='yes'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Broadwell'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='hle'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='rtm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Broadwell-IBRS'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='hle'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='rtm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Broadwell-noTSX'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Broadwell-noTSX-IBRS'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Broadwell-v1'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='hle'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='rtm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Broadwell-v2'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Broadwell-v3'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='hle'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='rtm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Broadwell-v4'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Cascadelake-Server'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vnni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='hle'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='rtm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Cascadelake-Server-noTSX'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vnni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='ibrs-all'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Cascadelake-Server-v1'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vnni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='hle'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='rtm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Cascadelake-Server-v2'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vnni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='hle'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='ibrs-all'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='rtm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Cascadelake-Server-v3'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vnni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='ibrs-all'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Cascadelake-Server-v4'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vnni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='ibrs-all'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Cascadelake-Server-v5'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vnni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='ibrs-all'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='ClearwaterForest'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx-ifma'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx-ne-convert'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx-vnni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx-vnni-int16'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx-vnni-int8'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='bhi-ctrl'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='bhi-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='bus-lock-detect'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='cldemote'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='cmpccxadd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='ddpd-u'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fbsdp-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrs'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='gds-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='gfni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='ibrs-all'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='intel-psfd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='ipred-ctrl'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='lam'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='mcdt-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='movdir64b'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='movdiri'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pbrsb-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='prefetchiti'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='psdp-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='rfds-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='rrsba-ctrl'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='serialize'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='sha512'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='sm3'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='sm4'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='ss'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vaes'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vpclmulqdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='ClearwaterForest-v1'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx-ifma'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx-ne-convert'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx-vnni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx-vnni-int16'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx-vnni-int8'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='bhi-ctrl'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='bhi-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='bus-lock-detect'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='cldemote'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='cmpccxadd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='ddpd-u'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fbsdp-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrs'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='gds-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='gfni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='ibrs-all'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='intel-psfd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='ipred-ctrl'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='lam'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='mcdt-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='movdir64b'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='movdiri'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pbrsb-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='prefetchiti'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='psdp-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='rfds-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='rrsba-ctrl'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='serialize'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='sha512'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='sm3'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='sm4'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='ss'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vaes'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vpclmulqdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Cooperlake'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-bf16'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vnni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='hle'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='ibrs-all'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='rtm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='taa-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Cooperlake-v1'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-bf16'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vnni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='hle'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='ibrs-all'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='rtm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='taa-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Cooperlake-v2'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-bf16'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vnni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='hle'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='ibrs-all'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='rtm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='taa-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Denverton'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='mpx'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Denverton-v1'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='mpx'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Denverton-v2'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Denverton-v3'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Dhyana-v2'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='EPYC-Genoa'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='amd-psfd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='auto-ibrs'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-bf16'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bitalg'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512ifma'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi2'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vnni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='gfni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='la57'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='no-nested-data-bp'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='null-sel-clr-base'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='stibp-always-on'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vaes'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vpclmulqdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='EPYC-Genoa-v1'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='amd-psfd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='auto-ibrs'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-bf16'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bitalg'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512ifma'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi2'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vnni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='gfni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='la57'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='no-nested-data-bp'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='null-sel-clr-base'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='stibp-always-on'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vaes'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vpclmulqdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='EPYC-Genoa-v2'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='amd-psfd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='auto-ibrs'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-bf16'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bitalg'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512ifma'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi2'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vnni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fs-gs-base-ns'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='gfni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='la57'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='no-nested-data-bp'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='null-sel-clr-base'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='perfmon-v2'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='stibp-always-on'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vaes'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vpclmulqdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='EPYC-Milan'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='EPYC-Milan-v1'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='EPYC-Milan-v2'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='amd-psfd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='no-nested-data-bp'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='null-sel-clr-base'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='stibp-always-on'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vaes'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vpclmulqdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='EPYC-Milan-v3'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='amd-psfd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='no-nested-data-bp'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='null-sel-clr-base'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='stibp-always-on'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vaes'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vpclmulqdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='EPYC-Rome'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='EPYC-Rome-v1'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='EPYC-Rome-v2'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='EPYC-Rome-v3'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='EPYC-Turin'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='amd-psfd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='auto-ibrs'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx-vnni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-bf16'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-vp2intersect'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bitalg'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512ifma'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi2'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vnni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fs-gs-base-ns'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='gfni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='ibpb-brtype'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='la57'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='movdir64b'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='movdiri'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='no-nested-data-bp'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='null-sel-clr-base'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='perfmon-v2'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='prefetchi'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='sbpb'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='srso-user-kernel-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='stibp-always-on'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vaes'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vpclmulqdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='EPYC-Turin-v1'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='amd-psfd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='auto-ibrs'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx-vnni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-bf16'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-vp2intersect'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bitalg'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512ifma'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi2'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vnni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fs-gs-base-ns'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='gfni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='ibpb-brtype'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='la57'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='movdir64b'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='movdiri'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='no-nested-data-bp'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='null-sel-clr-base'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='perfmon-v2'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='prefetchi'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='sbpb'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='srso-user-kernel-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='stibp-always-on'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vaes'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vpclmulqdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='EPYC-v3'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='EPYC-v4'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='EPYC-v5'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='GraniteRapids'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='amx-bf16'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='amx-fp16'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='amx-int8'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='amx-tile'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx-vnni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-bf16'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-fp16'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bitalg'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512ifma'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi2'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vnni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='bus-lock-detect'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fbsdp-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrc'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrs'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fzrm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='gfni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='hle'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='ibrs-all'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='la57'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='mcdt-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pbrsb-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='prefetchiti'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='psdp-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='rtm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='serialize'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='taa-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='tsx-ldtrk'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vaes'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vpclmulqdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='xfd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='GraniteRapids-v1'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='amx-bf16'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='amx-fp16'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='amx-int8'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='amx-tile'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx-vnni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-bf16'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-fp16'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bitalg'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512ifma'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi2'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vnni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='bus-lock-detect'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fbsdp-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrc'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrs'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fzrm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='gfni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='hle'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='ibrs-all'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='la57'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='mcdt-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pbrsb-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='prefetchiti'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='psdp-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='rtm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='serialize'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='taa-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='tsx-ldtrk'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vaes'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vpclmulqdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='xfd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='GraniteRapids-v2'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='amx-bf16'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='amx-fp16'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='amx-int8'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='amx-tile'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx-vnni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx10'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx10-128'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx10-256'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx10-512'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-bf16'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-fp16'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bitalg'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512ifma'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi2'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vnni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='bus-lock-detect'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='cldemote'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fbsdp-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrc'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrs'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fzrm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='gfni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='hle'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='ibrs-all'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='la57'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='mcdt-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='movdir64b'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='movdiri'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pbrsb-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='prefetchiti'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='psdp-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='rtm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='serialize'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='ss'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='taa-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='tsx-ldtrk'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vaes'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vpclmulqdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='xfd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='GraniteRapids-v3'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='amx-bf16'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='amx-fp16'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='amx-int8'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='amx-tile'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx-vnni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx10'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx10-128'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx10-256'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx10-512'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-bf16'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-fp16'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bitalg'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512ifma'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi2'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vnni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='bus-lock-detect'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='cldemote'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fbsdp-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrc'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrs'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fzrm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='gfni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='hle'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='ibrs-all'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='la57'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='mcdt-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='movdir64b'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='movdiri'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pbrsb-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='prefetchiti'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='psdp-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='rtm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='serialize'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='ss'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='taa-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='tsx-ldtrk'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vaes'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vpclmulqdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='xfd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Haswell'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='hle'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='rtm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Haswell-IBRS'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='hle'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='rtm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Haswell-noTSX'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Haswell-noTSX-IBRS'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Haswell-v1'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='hle'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='rtm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Haswell-v2'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Haswell-v3'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='hle'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='rtm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Haswell-v4'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Icelake-Server'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bitalg'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi2'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vnni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='gfni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='hle'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='la57'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='rtm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vaes'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vpclmulqdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Icelake-Server-noTSX'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bitalg'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi2'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vnni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='gfni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='la57'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vaes'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vpclmulqdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Icelake-Server-v1'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bitalg'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi2'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vnni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='gfni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='hle'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='la57'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='rtm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vaes'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vpclmulqdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Icelake-Server-v2'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bitalg'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi2'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vnni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='gfni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='la57'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vaes'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vpclmulqdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Icelake-Server-v3'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bitalg'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi2'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vnni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='gfni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='ibrs-all'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='la57'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='taa-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vaes'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vpclmulqdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Icelake-Server-v4'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bitalg'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512ifma'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi2'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vnni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='gfni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='ibrs-all'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='la57'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='taa-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vaes'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vpclmulqdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Icelake-Server-v5'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bitalg'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512ifma'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi2'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vnni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='gfni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='ibrs-all'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='la57'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='taa-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vaes'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vpclmulqdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Icelake-Server-v6'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bitalg'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512ifma'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi2'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vnni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='gfni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='ibrs-all'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='la57'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='taa-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vaes'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vpclmulqdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Icelake-Server-v7'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bitalg'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512ifma'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi2'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vnni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='gfni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='hle'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='ibrs-all'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='la57'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='rtm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='taa-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vaes'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vpclmulqdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='IvyBridge'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='IvyBridge-IBRS'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='IvyBridge-v1'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='IvyBridge-v2'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='KnightsMill'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-4fmaps'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-4vnniw'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512er'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512pf'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='ss'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='KnightsMill-v1'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-4fmaps'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-4vnniw'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512er'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512pf'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='ss'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Opteron_G4'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fma4'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='xop'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Opteron_G4-v1'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fma4'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='xop'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Opteron_G5'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fma4'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='tbm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='xop'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Opteron_G5-v1'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fma4'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='tbm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='xop'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='SapphireRapids'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='amx-bf16'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='amx-int8'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='amx-tile'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx-vnni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-bf16'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-fp16'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bitalg'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512ifma'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi2'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vnni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='bus-lock-detect'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrc'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrs'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fzrm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='gfni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='hle'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='ibrs-all'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='la57'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='rtm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='serialize'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='taa-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='tsx-ldtrk'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vaes'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vpclmulqdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='xfd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='SapphireRapids-v1'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='amx-bf16'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='amx-int8'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='amx-tile'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx-vnni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-bf16'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-fp16'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bitalg'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512ifma'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi2'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vnni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='bus-lock-detect'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrc'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrs'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fzrm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='gfni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='hle'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='ibrs-all'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='la57'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='rtm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='serialize'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='taa-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='tsx-ldtrk'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vaes'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vpclmulqdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='xfd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='SapphireRapids-v2'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='amx-bf16'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='amx-int8'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='amx-tile'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx-vnni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-bf16'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-fp16'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bitalg'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512ifma'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi2'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vnni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='bus-lock-detect'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fbsdp-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrc'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrs'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fzrm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='gfni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='hle'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='ibrs-all'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='la57'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='psdp-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='rtm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='serialize'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='taa-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='tsx-ldtrk'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vaes'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vpclmulqdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='xfd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='SapphireRapids-v3'>
Feb 23 09:39:09 np0005626463.localdomain systemd[1]: tmp-crun.rA8lPC.mount: Deactivated successfully.
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='amx-bf16'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='amx-int8'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='amx-tile'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx-vnni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-bf16'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-fp16'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bitalg'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512ifma'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi2'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vnni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='bus-lock-detect'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='cldemote'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fbsdp-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrc'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrs'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fzrm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='gfni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='hle'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='ibrs-all'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='la57'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='movdir64b'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='movdiri'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='psdp-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='rtm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='serialize'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='ss'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='taa-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='tsx-ldtrk'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vaes'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vpclmulqdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='xfd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='SapphireRapids-v4'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='amx-bf16'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='amx-int8'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='amx-tile'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx-vnni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-bf16'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-fp16'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bitalg'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512ifma'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi2'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vnni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='bus-lock-detect'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='cldemote'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fbsdp-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrc'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrs'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fzrm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='gfni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='hle'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='ibrs-all'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='la57'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='movdir64b'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='movdiri'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='psdp-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='rtm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='serialize'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='ss'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='taa-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='tsx-ldtrk'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vaes'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vpclmulqdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='xfd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='SierraForest'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx-ifma'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx-ne-convert'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx-vnni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx-vnni-int8'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='bus-lock-detect'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='cmpccxadd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fbsdp-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrs'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='gfni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='ibrs-all'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='mcdt-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pbrsb-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='psdp-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='serialize'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vaes'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vpclmulqdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='SierraForest-v1'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx-ifma'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx-ne-convert'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx-vnni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx-vnni-int8'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='bus-lock-detect'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='cmpccxadd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fbsdp-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrs'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='gfni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='ibrs-all'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='mcdt-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pbrsb-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='psdp-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='serialize'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vaes'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vpclmulqdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='SierraForest-v2'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx-ifma'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx-ne-convert'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx-vnni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx-vnni-int8'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='bhi-ctrl'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='bus-lock-detect'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='cldemote'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='cmpccxadd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fbsdp-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrs'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='gds-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='gfni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='ibrs-all'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='intel-psfd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='ipred-ctrl'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='lam'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='mcdt-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='movdir64b'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='movdiri'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pbrsb-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='psdp-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='rfds-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='rrsba-ctrl'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='serialize'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='ss'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vaes'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vpclmulqdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='SierraForest-v3'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx-ifma'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx-ne-convert'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx-vnni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx-vnni-int8'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='bhi-ctrl'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='bus-lock-detect'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='cldemote'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='cmpccxadd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fbsdp-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrs'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='gds-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='gfni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='ibrs-all'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='intel-psfd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='ipred-ctrl'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='lam'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='mcdt-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='movdir64b'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='movdiri'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pbrsb-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='psdp-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='rfds-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='rrsba-ctrl'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='serialize'/>
Feb 23 09:39:09 np0005626463.localdomain podman[282472]: 2026-02-23 09:39:09.921423845 +0000 UTC m=+0.101880739 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_controller, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, tcib_build_tag=8419493e1fd846703d277695e03fc5eb)
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='ss'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vaes'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vpclmulqdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Skylake-Client'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='hle'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='rtm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Skylake-Client-IBRS'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='hle'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='rtm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Skylake-Client-v1'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='hle'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='rtm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Skylake-Client-v2'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='hle'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='rtm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Skylake-Client-v3'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Skylake-Client-v4'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Skylake-Server'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='hle'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='rtm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Skylake-Server-IBRS'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='hle'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='rtm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Skylake-Server-v1'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='hle'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='rtm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Skylake-Server-v2'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='hle'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='rtm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Skylake-Server-v3'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Skylake-Server-v4'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Skylake-Server-v5'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Snowridge'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='cldemote'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='core-capability'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='gfni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='movdir64b'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='movdiri'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='mpx'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='split-lock-detect'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Snowridge-v1'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='cldemote'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='core-capability'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='gfni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='movdir64b'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='movdiri'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='mpx'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='split-lock-detect'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Snowridge-v2'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='cldemote'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='core-capability'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='gfni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='movdir64b'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='movdiri'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='split-lock-detect'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Snowridge-v3'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='cldemote'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='core-capability'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='gfni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='movdir64b'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='movdiri'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='split-lock-detect'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Snowridge-v4'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='cldemote'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='gfni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='movdir64b'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='movdiri'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='athlon'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='3dnow'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='3dnowext'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='athlon-v1'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='3dnow'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='3dnowext'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='core2duo'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='ss'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='core2duo-v1'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='ss'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='coreduo'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='ss'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='coreduo-v1'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='ss'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='n270'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='ss'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='n270-v1'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='ss'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='phenom'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='3dnow'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='3dnowext'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='phenom-v1'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='3dnow'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='3dnowext'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:     </mode>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:   </cpu>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:   <memoryBacking supported='yes'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:     <enum name='sourceType'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <value>file</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <value>anonymous</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <value>memfd</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:     </enum>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:   </memoryBacking>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:   <devices>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:     <disk supported='yes'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <enum name='diskDevice'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>disk</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>cdrom</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>floppy</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>lun</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </enum>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <enum name='bus'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>fdc</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>scsi</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>virtio</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>usb</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>sata</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </enum>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <enum name='model'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>virtio</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>virtio-transitional</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>virtio-non-transitional</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </enum>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:     </disk>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:     <graphics supported='yes'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <enum name='type'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>vnc</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>egl-headless</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>dbus</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </enum>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:     </graphics>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:     <video supported='yes'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <enum name='modelType'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>vga</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>cirrus</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>virtio</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>none</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>bochs</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>ramfb</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </enum>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:     </video>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:     <hostdev supported='yes'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <enum name='mode'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>subsystem</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </enum>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <enum name='startupPolicy'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>default</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>mandatory</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>requisite</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>optional</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </enum>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <enum name='subsysType'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>usb</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>pci</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>scsi</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </enum>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <enum name='capsType'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <enum name='pciBackend'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:     </hostdev>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:     <rng supported='yes'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <enum name='model'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>virtio</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>virtio-transitional</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>virtio-non-transitional</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </enum>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <enum name='backendModel'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>random</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>egd</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>builtin</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </enum>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:     </rng>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:     <filesystem supported='yes'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <enum name='driverType'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>path</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>handle</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>virtiofs</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </enum>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:     </filesystem>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:     <tpm supported='yes'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <enum name='model'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>tpm-tis</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>tpm-crb</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </enum>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <enum name='backendModel'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>emulator</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>external</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </enum>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <enum name='backendVersion'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>2.0</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </enum>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:     </tpm>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:     <redirdev supported='yes'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <enum name='bus'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>usb</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </enum>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:     </redirdev>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:     <channel supported='yes'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <enum name='type'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>pty</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>unix</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </enum>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:     </channel>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:     <crypto supported='yes'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <enum name='model'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <enum name='type'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>qemu</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </enum>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <enum name='backendModel'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>builtin</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </enum>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:     </crypto>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:     <interface supported='yes'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <enum name='backendType'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>default</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>passt</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </enum>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:     </interface>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:     <panic supported='yes'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <enum name='model'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>isa</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>hyperv</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </enum>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:     </panic>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:     <console supported='yes'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <enum name='type'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>null</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>vc</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>pty</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>dev</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>file</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>pipe</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>stdio</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>udp</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>tcp</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>unix</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>qemu-vdagent</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>dbus</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </enum>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:     </console>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:   </devices>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:   <features>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:     <gic supported='no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:     <vmcoreinfo supported='yes'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:     <genid supported='yes'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:     <backingStoreInput supported='yes'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:     <backup supported='yes'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:     <async-teardown supported='yes'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:     <s390-pv supported='no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:     <ps2 supported='yes'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:     <tdx supported='no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:     <sev supported='no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:     <sgx supported='no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:     <hyperv supported='yes'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <enum name='features'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>relaxed</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>vapic</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>spinlocks</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>vpindex</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>runtime</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>synic</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>stimer</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>reset</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>vendor_id</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>frequencies</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>reenlightenment</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>tlbflush</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>ipi</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>avic</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>emsr_bitmap</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>xmm_input</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </enum>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <defaults>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <spinlocks>4095</spinlocks>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <stimer_direct>on</stimer_direct>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <tlbflush_direct>off</tlbflush_direct>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <tlbflush_extended>off</tlbflush_extended>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <vendor_id>Linux KVM Hv</vendor_id>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </defaults>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:     </hyperv>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:     <launchSecurity supported='no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:   </features>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: </domainCapabilities>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.865 282211 DEBUG nova.virt.libvirt.host [None req-a3e77674-ebee-4da0-8221-dce4e3ab889b - - - - - -] Getting domain capabilities for x86_64 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.872 282211 DEBUG nova.virt.libvirt.host [None req-a3e77674-ebee-4da0-8221-dce4e3ab889b - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]: <domainCapabilities>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:   <path>/usr/libexec/qemu-kvm</path>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:   <domain>kvm</domain>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:   <machine>pc-i440fx-rhel7.6.0</machine>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:   <arch>x86_64</arch>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:   <vcpu max='240'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:   <iothreads supported='yes'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:   <os supported='yes'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:     <enum name='firmware'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:     <loader supported='yes'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <enum name='type'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>rom</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>pflash</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </enum>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <enum name='readonly'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>yes</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>no</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </enum>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <enum name='secure'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>no</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </enum>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:     </loader>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:   </os>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:   <cpu>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:     <mode name='host-passthrough' supported='yes'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <enum name='hostPassthroughMigratable'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>on</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>off</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </enum>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:     </mode>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:     <mode name='maximum' supported='yes'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <enum name='maximumMigratable'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>on</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <value>off</value>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </enum>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:     </mode>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:     <mode name='host-model' supported='yes'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model fallback='forbid'>EPYC-Rome</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <vendor>AMD</vendor>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <maxphysaddr mode='passthrough' limit='40'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <feature policy='require' name='x2apic'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <feature policy='require' name='tsc-deadline'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <feature policy='require' name='hypervisor'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <feature policy='require' name='tsc_adjust'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <feature policy='require' name='spec-ctrl'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <feature policy='require' name='stibp'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <feature policy='require' name='ssbd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <feature policy='require' name='cmp_legacy'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <feature policy='require' name='overflow-recov'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <feature policy='require' name='succor'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <feature policy='require' name='ibrs'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <feature policy='require' name='amd-ssbd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <feature policy='require' name='virt-ssbd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <feature policy='require' name='lbrv'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <feature policy='require' name='tsc-scale'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <feature policy='require' name='vmcb-clean'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <feature policy='require' name='pause-filter'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <feature policy='require' name='pfthreshold'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <feature policy='require' name='svme-addr-chk'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <feature policy='require' name='lfence-always-serializing'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <feature policy='disable' name='xsaves'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:     </mode>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:     <mode name='custom' supported='yes'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Broadwell'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='hle'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='rtm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Broadwell-IBRS'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='hle'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='rtm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Broadwell-noTSX'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Broadwell-noTSX-IBRS'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Broadwell-v1'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='hle'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='rtm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Broadwell-v2'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Broadwell-v3'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='hle'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='rtm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Broadwell-v4'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Cascadelake-Server'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vnni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='hle'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='rtm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Cascadelake-Server-noTSX'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vnni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='ibrs-all'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Cascadelake-Server-v1'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vnni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='hle'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='rtm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Cascadelake-Server-v2'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vnni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='hle'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='ibrs-all'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='rtm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Cascadelake-Server-v3'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vnni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='ibrs-all'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Cascadelake-Server-v4'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vnni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='ibrs-all'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Cascadelake-Server-v5'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vnni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='ibrs-all'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='ClearwaterForest'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx-ifma'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx-ne-convert'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx-vnni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx-vnni-int16'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx-vnni-int8'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='bhi-ctrl'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='bhi-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='bus-lock-detect'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='cldemote'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='cmpccxadd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='ddpd-u'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fbsdp-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrs'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='gds-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='gfni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='ibrs-all'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='intel-psfd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='ipred-ctrl'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='lam'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='mcdt-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='movdir64b'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='movdiri'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pbrsb-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='prefetchiti'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='psdp-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='rfds-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='rrsba-ctrl'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='serialize'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='sha512'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='sm3'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='sm4'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='ss'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vaes'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vpclmulqdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='ClearwaterForest-v1'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx-ifma'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx-ne-convert'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx-vnni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx-vnni-int16'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx-vnni-int8'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='bhi-ctrl'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='bhi-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='bus-lock-detect'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='cldemote'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='cmpccxadd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='ddpd-u'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fbsdp-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrs'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='gds-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='gfni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='ibrs-all'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='intel-psfd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='ipred-ctrl'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='lam'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='mcdt-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='movdir64b'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='movdiri'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pbrsb-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='prefetchiti'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='psdp-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='rfds-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='rrsba-ctrl'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='serialize'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='sha512'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='sm3'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='sm4'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='ss'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vaes'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vpclmulqdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Cooperlake'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-bf16'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vnni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='hle'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='ibrs-all'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='rtm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='taa-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Cooperlake-v1'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-bf16'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vnni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='hle'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='ibrs-all'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='rtm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='taa-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Cooperlake-v2'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-bf16'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vnni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='hle'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='ibrs-all'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='rtm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='taa-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Denverton'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='mpx'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Denverton-v1'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='mpx'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Denverton-v2'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Denverton-v3'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Dhyana-v2'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='EPYC-Genoa'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='amd-psfd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='auto-ibrs'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-bf16'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bitalg'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512ifma'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi2'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vnni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='gfni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='la57'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='no-nested-data-bp'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='null-sel-clr-base'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='stibp-always-on'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vaes'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vpclmulqdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='EPYC-Genoa-v1'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='amd-psfd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='auto-ibrs'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-bf16'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bitalg'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512ifma'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi2'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vnni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='gfni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='la57'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='no-nested-data-bp'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='null-sel-clr-base'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='stibp-always-on'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vaes'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vpclmulqdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='EPYC-Genoa-v2'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='amd-psfd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='auto-ibrs'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-bf16'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bitalg'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512ifma'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi2'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vnni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fs-gs-base-ns'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='gfni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='la57'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='no-nested-data-bp'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='null-sel-clr-base'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='perfmon-v2'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='stibp-always-on'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vaes'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vpclmulqdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='EPYC-Milan'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='EPYC-Milan-v1'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='EPYC-Milan-v2'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='amd-psfd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='no-nested-data-bp'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='null-sel-clr-base'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='stibp-always-on'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vaes'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vpclmulqdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='EPYC-Milan-v3'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='amd-psfd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='no-nested-data-bp'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='null-sel-clr-base'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='stibp-always-on'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vaes'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vpclmulqdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='EPYC-Rome'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='EPYC-Rome-v1'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='EPYC-Rome-v2'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='EPYC-Rome-v3'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='EPYC-Turin'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='amd-psfd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='auto-ibrs'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx-vnni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-bf16'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-vp2intersect'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bitalg'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512ifma'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi2'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vnni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fs-gs-base-ns'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='gfni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='ibpb-brtype'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='la57'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='movdir64b'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='movdiri'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='no-nested-data-bp'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='null-sel-clr-base'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='perfmon-v2'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='prefetchi'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='sbpb'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='srso-user-kernel-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='stibp-always-on'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vaes'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vpclmulqdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='EPYC-Turin-v1'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='amd-psfd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='auto-ibrs'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx-vnni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-bf16'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-vp2intersect'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bitalg'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512ifma'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi2'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vnni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fs-gs-base-ns'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='gfni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='ibpb-brtype'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='la57'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='movdir64b'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='movdiri'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='no-nested-data-bp'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='null-sel-clr-base'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='perfmon-v2'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='prefetchi'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='sbpb'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='srso-user-kernel-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='stibp-always-on'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vaes'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vpclmulqdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='EPYC-v3'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='EPYC-v4'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='EPYC-v5'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='GraniteRapids'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='amx-bf16'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='amx-fp16'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='amx-int8'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='amx-tile'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx-vnni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-bf16'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-fp16'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bitalg'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512ifma'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi2'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vnni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='bus-lock-detect'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fbsdp-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrc'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrs'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fzrm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='gfni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='hle'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='ibrs-all'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='la57'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='mcdt-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pbrsb-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='prefetchiti'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='psdp-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='rtm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='serialize'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='taa-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='tsx-ldtrk'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vaes'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vpclmulqdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='xfd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='GraniteRapids-v1'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='amx-bf16'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='amx-fp16'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='amx-int8'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='amx-tile'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx-vnni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-bf16'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-fp16'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bitalg'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512ifma'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi2'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vnni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='bus-lock-detect'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fbsdp-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrc'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrs'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fzrm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='gfni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='hle'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='ibrs-all'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='la57'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='mcdt-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pbrsb-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='prefetchiti'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='psdp-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='rtm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='serialize'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='taa-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='tsx-ldtrk'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vaes'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vpclmulqdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='xfd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='GraniteRapids-v2'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='amx-bf16'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='amx-fp16'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='amx-int8'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='amx-tile'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx-vnni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx10'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx10-128'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx10-256'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx10-512'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-bf16'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-fp16'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bitalg'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512ifma'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi2'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vnni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='bus-lock-detect'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='cldemote'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fbsdp-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrc'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrs'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fzrm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='gfni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='hle'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='ibrs-all'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='la57'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='mcdt-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='movdir64b'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='movdiri'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pbrsb-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='prefetchiti'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='psdp-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='rtm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='serialize'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='ss'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='taa-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='tsx-ldtrk'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vaes'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vpclmulqdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='xfd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='GraniteRapids-v3'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='amx-bf16'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='amx-fp16'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='amx-int8'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='amx-tile'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx-vnni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx10'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx10-128'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx10-256'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx10-512'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-bf16'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-fp16'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bitalg'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512ifma'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi2'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vnni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='bus-lock-detect'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='cldemote'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fbsdp-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrc'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrs'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fzrm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='gfni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='hle'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='ibrs-all'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='la57'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='mcdt-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='movdir64b'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='movdiri'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pbrsb-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='prefetchiti'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='psdp-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='rtm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='serialize'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='ss'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='taa-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='tsx-ldtrk'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vaes'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vpclmulqdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='xfd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Haswell'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='hle'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='rtm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Haswell-IBRS'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='hle'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='rtm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Haswell-noTSX'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Haswell-noTSX-IBRS'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Haswell-v1'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='hle'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='rtm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Haswell-v2'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Haswell-v3'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='hle'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='rtm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Haswell-v4'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Icelake-Server'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bitalg'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi2'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vnni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='gfni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='hle'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='la57'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='rtm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vaes'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vpclmulqdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Icelake-Server-noTSX'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bitalg'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi2'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vnni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='gfni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='la57'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vaes'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vpclmulqdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Icelake-Server-v1'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bitalg'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi2'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vnni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='gfni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='hle'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='la57'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='rtm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vaes'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vpclmulqdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Icelake-Server-v2'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bitalg'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi2'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vnni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='gfni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='la57'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vaes'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vpclmulqdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Icelake-Server-v3'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bitalg'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi2'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vnni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='gfni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='ibrs-all'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='la57'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='taa-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vaes'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vpclmulqdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Icelake-Server-v4'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bitalg'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512ifma'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi2'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vnni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='gfni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='ibrs-all'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='la57'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='taa-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vaes'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vpclmulqdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Icelake-Server-v5'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bitalg'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512ifma'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi2'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vnni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='gfni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='ibrs-all'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='la57'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='taa-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vaes'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vpclmulqdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Icelake-Server-v6'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bitalg'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512ifma'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi2'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vnni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='gfni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='ibrs-all'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='la57'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='taa-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vaes'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vpclmulqdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Icelake-Server-v7'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bitalg'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512ifma'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi2'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vnni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='gfni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='hle'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='ibrs-all'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='la57'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='rtm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='taa-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vaes'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vpclmulqdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='IvyBridge'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='IvyBridge-IBRS'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='IvyBridge-v1'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='IvyBridge-v2'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='KnightsMill'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-4fmaps'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-4vnniw'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512er'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512pf'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='ss'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='KnightsMill-v1'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-4fmaps'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-4vnniw'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512er'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512pf'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='ss'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Opteron_G4'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fma4'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='xop'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Opteron_G4-v1'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fma4'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='xop'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Opteron_G5'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fma4'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='tbm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='xop'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Opteron_G5-v1'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fma4'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='tbm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='xop'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='SapphireRapids'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='amx-bf16'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='amx-int8'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='amx-tile'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx-vnni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-bf16'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-fp16'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bitalg'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512ifma'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi2'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vnni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='bus-lock-detect'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrc'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrs'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fzrm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='gfni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='hle'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='ibrs-all'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='la57'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='rtm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='serialize'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='taa-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='tsx-ldtrk'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vaes'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vpclmulqdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='xfd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='SapphireRapids-v1'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='amx-bf16'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='amx-int8'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='amx-tile'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx-vnni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-bf16'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-fp16'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bitalg'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512ifma'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi2'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vnni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='bus-lock-detect'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrc'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrs'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fzrm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='gfni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='hle'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='ibrs-all'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='la57'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='rtm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='serialize'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='taa-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='tsx-ldtrk'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vaes'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vpclmulqdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='xfd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='SapphireRapids-v2'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='amx-bf16'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='amx-int8'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='amx-tile'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx-vnni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-bf16'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-fp16'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bitalg'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512ifma'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi2'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vnni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='bus-lock-detect'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fbsdp-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrc'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrs'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fzrm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='gfni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='hle'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='ibrs-all'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='la57'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='psdp-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='rtm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='serialize'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='taa-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='tsx-ldtrk'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vaes'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vpclmulqdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='xfd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='SapphireRapids-v3'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='amx-bf16'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='amx-int8'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='amx-tile'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx-vnni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-bf16'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-fp16'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bitalg'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512ifma'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi2'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vnni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='bus-lock-detect'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='cldemote'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fbsdp-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrc'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrs'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fzrm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='gfni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='hle'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='ibrs-all'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='la57'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='movdir64b'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='movdiri'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='psdp-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='rtm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='serialize'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='ss'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='taa-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='tsx-ldtrk'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vaes'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vpclmulqdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='xfd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='SapphireRapids-v4'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='amx-bf16'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='amx-int8'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='amx-tile'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx-vnni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-bf16'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-fp16'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bitalg'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512ifma'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi2'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vnni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='bus-lock-detect'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='cldemote'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fbsdp-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrc'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrs'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fzrm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='gfni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='hle'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='ibrs-all'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='la57'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='movdir64b'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='movdiri'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='psdp-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='rtm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='serialize'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='ss'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='taa-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='tsx-ldtrk'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vaes'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vpclmulqdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='xfd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='SierraForest'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx-ifma'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx-ne-convert'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx-vnni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx-vnni-int8'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='bus-lock-detect'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='cmpccxadd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fbsdp-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrs'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='gfni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='ibrs-all'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='mcdt-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pbrsb-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='psdp-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='serialize'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vaes'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vpclmulqdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='SierraForest-v1'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx-ifma'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx-ne-convert'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx-vnni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx-vnni-int8'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='bus-lock-detect'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='cmpccxadd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fbsdp-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrs'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='gfni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='ibrs-all'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='mcdt-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pbrsb-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='psdp-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='serialize'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vaes'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vpclmulqdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='SierraForest-v2'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx-ifma'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx-ne-convert'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx-vnni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx-vnni-int8'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='bhi-ctrl'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='bus-lock-detect'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='cldemote'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='cmpccxadd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fbsdp-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrs'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='gds-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='gfni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='ibrs-all'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='intel-psfd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='ipred-ctrl'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='lam'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='mcdt-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='movdir64b'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='movdiri'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pbrsb-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='psdp-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='rfds-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='rrsba-ctrl'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='serialize'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='ss'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vaes'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vpclmulqdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='SierraForest-v3'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx-ifma'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx-ne-convert'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx-vnni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx-vnni-int8'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='bhi-ctrl'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='bus-lock-detect'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='cldemote'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='cmpccxadd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fbsdp-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrs'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='gds-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='gfni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='ibrs-all'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='intel-psfd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='ipred-ctrl'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='lam'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='mcdt-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='movdir64b'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='movdiri'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pbrsb-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='psdp-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='rfds-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='rrsba-ctrl'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='serialize'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='ss'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vaes'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='vpclmulqdq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Skylake-Client'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='hle'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='rtm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Skylake-Client-IBRS'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='hle'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='rtm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Skylake-Client-v1'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='hle'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='rtm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Skylake-Client-v2'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='hle'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='rtm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Skylake-Client-v3'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain podman[282473]: 2026-02-23 09:39:09.994612008 +0000 UTC m=+0.170389727 container health_status bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Skylake-Client-v4'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Skylake-Server'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='hle'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='rtm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Skylake-Server-IBRS'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='hle'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='rtm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Skylake-Server-v1'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='hle'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='rtm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Skylake-Server-v2'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='hle'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='rtm'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Skylake-Server-v3'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Skylake-Server-v4'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Skylake-Server-v5'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Snowridge'>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='cldemote'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='core-capability'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='gfni'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='movdir64b'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='movdiri'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='mpx'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:         <feature name='split-lock-detect'/>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Feb 23 09:39:09 np0005626463.localdomain nova_compute[282206]:       <blockers model='Snowridge-v1'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='cldemote'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='core-capability'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='gfni'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='movdir64b'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='movdiri'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='mpx'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='split-lock-detect'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <blockers model='Snowridge-v2'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='cldemote'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='core-capability'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='gfni'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='movdir64b'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='movdiri'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='split-lock-detect'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <blockers model='Snowridge-v3'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='cldemote'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='core-capability'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='gfni'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='movdir64b'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='movdiri'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='split-lock-detect'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <blockers model='Snowridge-v4'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='cldemote'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='gfni'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='movdir64b'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='movdiri'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <blockers model='athlon'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='3dnow'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='3dnowext'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <blockers model='athlon-v1'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='3dnow'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='3dnowext'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <blockers model='core2duo'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='ss'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <blockers model='core2duo-v1'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='ss'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <blockers model='coreduo'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='ss'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <blockers model='coreduo-v1'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='ss'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <blockers model='n270'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='ss'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <blockers model='n270-v1'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='ss'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <blockers model='phenom'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='3dnow'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='3dnowext'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <blockers model='phenom-v1'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='3dnow'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='3dnowext'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:     </mode>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:   </cpu>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:   <memoryBacking supported='yes'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:     <enum name='sourceType'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <value>file</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <value>anonymous</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <value>memfd</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:     </enum>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:   </memoryBacking>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:   <devices>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:     <disk supported='yes'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <enum name='diskDevice'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>disk</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>cdrom</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>floppy</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>lun</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </enum>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <enum name='bus'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>ide</value>
Feb 23 09:39:10 np0005626463.localdomain podman[282473]: 2026-02-23 09:39:10.006133778 +0000 UTC m=+0.181911507 container exec_died bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>fdc</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>scsi</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>virtio</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>usb</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>sata</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </enum>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <enum name='model'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>virtio</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>virtio-transitional</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>virtio-non-transitional</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </enum>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:     </disk>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:     <graphics supported='yes'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <enum name='type'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>vnc</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>egl-headless</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>dbus</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </enum>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:     </graphics>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:     <video supported='yes'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <enum name='modelType'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>vga</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>cirrus</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>virtio</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>none</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>bochs</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>ramfb</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </enum>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:     </video>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:     <hostdev supported='yes'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <enum name='mode'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>subsystem</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </enum>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <enum name='startupPolicy'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>default</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>mandatory</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>requisite</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>optional</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </enum>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <enum name='subsysType'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>usb</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>pci</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>scsi</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </enum>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <enum name='capsType'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <enum name='pciBackend'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:     </hostdev>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:     <rng supported='yes'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <enum name='model'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>virtio</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>virtio-transitional</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>virtio-non-transitional</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </enum>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <enum name='backendModel'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>random</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>egd</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>builtin</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </enum>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:     </rng>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:     <filesystem supported='yes'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <enum name='driverType'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>path</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>handle</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>virtiofs</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </enum>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:     </filesystem>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:     <tpm supported='yes'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <enum name='model'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>tpm-tis</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>tpm-crb</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </enum>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <enum name='backendModel'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>emulator</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>external</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </enum>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <enum name='backendVersion'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>2.0</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </enum>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:     </tpm>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:     <redirdev supported='yes'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <enum name='bus'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>usb</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </enum>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:     </redirdev>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:     <channel supported='yes'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <enum name='type'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>pty</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>unix</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </enum>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:     </channel>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:     <crypto supported='yes'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <enum name='model'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <enum name='type'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>qemu</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </enum>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <enum name='backendModel'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>builtin</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </enum>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:     </crypto>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:     <interface supported='yes'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <enum name='backendType'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>default</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>passt</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </enum>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:     </interface>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:     <panic supported='yes'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <enum name='model'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>isa</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>hyperv</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </enum>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:     </panic>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:     <console supported='yes'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <enum name='type'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>null</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>vc</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>pty</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>dev</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>file</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>pipe</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>stdio</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>udp</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>tcp</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>unix</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>qemu-vdagent</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>dbus</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </enum>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:     </console>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:   </devices>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:   <features>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:     <gic supported='no'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:     <vmcoreinfo supported='yes'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:     <genid supported='yes'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:     <backingStoreInput supported='yes'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:     <backup supported='yes'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:     <async-teardown supported='yes'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:     <s390-pv supported='no'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:     <ps2 supported='yes'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:     <tdx supported='no'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:     <sev supported='no'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:     <sgx supported='no'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:     <hyperv supported='yes'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <enum name='features'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>relaxed</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>vapic</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>spinlocks</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>vpindex</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>runtime</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>synic</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>stimer</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>reset</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>vendor_id</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>frequencies</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>reenlightenment</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>tlbflush</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>ipi</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>avic</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>emsr_bitmap</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>xmm_input</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </enum>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <defaults>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <spinlocks>4095</spinlocks>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <stimer_direct>on</stimer_direct>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <tlbflush_direct>off</tlbflush_direct>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <tlbflush_extended>off</tlbflush_extended>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <vendor_id>Linux KVM Hv</vendor_id>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </defaults>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:     </hyperv>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:     <launchSecurity supported='no'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:   </features>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]: </domainCapabilities>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:09.945 282211 DEBUG nova.virt.libvirt.host [None req-a3e77674-ebee-4da0-8221-dce4e3ab889b - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]: <domainCapabilities>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:   <path>/usr/libexec/qemu-kvm</path>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:   <domain>kvm</domain>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:   <machine>pc-q35-rhel9.8.0</machine>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:   <arch>x86_64</arch>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:   <vcpu max='1024'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:   <iothreads supported='yes'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:   <os supported='yes'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:     <enum name='firmware'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <value>efi</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:     </enum>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:     <loader supported='yes'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <enum name='type'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>rom</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>pflash</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </enum>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <enum name='readonly'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>yes</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>no</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </enum>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <enum name='secure'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>yes</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>no</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </enum>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:     </loader>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:   </os>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:   <cpu>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:     <mode name='host-passthrough' supported='yes'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <enum name='hostPassthroughMigratable'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>on</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>off</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </enum>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:     </mode>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:     <mode name='maximum' supported='yes'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <enum name='maximumMigratable'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>on</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>off</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </enum>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:     </mode>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:     <mode name='host-model' supported='yes'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model fallback='forbid'>EPYC-Rome</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <vendor>AMD</vendor>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <maxphysaddr mode='passthrough' limit='40'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <feature policy='require' name='x2apic'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <feature policy='require' name='tsc-deadline'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <feature policy='require' name='hypervisor'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <feature policy='require' name='tsc_adjust'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <feature policy='require' name='spec-ctrl'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <feature policy='require' name='stibp'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <feature policy='require' name='ssbd'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <feature policy='require' name='cmp_legacy'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <feature policy='require' name='overflow-recov'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <feature policy='require' name='succor'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <feature policy='require' name='ibrs'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <feature policy='require' name='amd-ssbd'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <feature policy='require' name='virt-ssbd'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <feature policy='require' name='lbrv'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <feature policy='require' name='tsc-scale'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <feature policy='require' name='vmcb-clean'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <feature policy='require' name='pause-filter'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <feature policy='require' name='pfthreshold'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <feature policy='require' name='svme-addr-chk'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <feature policy='require' name='lfence-always-serializing'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <feature policy='disable' name='xsaves'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:     </mode>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:     <mode name='custom' supported='yes'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <blockers model='Broadwell'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='hle'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='rtm'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <blockers model='Broadwell-IBRS'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='hle'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='rtm'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <blockers model='Broadwell-noTSX'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <blockers model='Broadwell-noTSX-IBRS'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:10 np0005626463.localdomain systemd[1]: bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.service: Deactivated successfully.
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <blockers model='Broadwell-v1'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='hle'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='rtm'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <blockers model='Broadwell-v2'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <blockers model='Broadwell-v3'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='hle'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='rtm'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <blockers model='Broadwell-v4'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <blockers model='Cascadelake-Server'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vnni'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='hle'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='rtm'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <blockers model='Cascadelake-Server-noTSX'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vnni'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='ibrs-all'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <blockers model='Cascadelake-Server-v1'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vnni'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='hle'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='rtm'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <blockers model='Cascadelake-Server-v2'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vnni'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='hle'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='ibrs-all'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='rtm'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <blockers model='Cascadelake-Server-v3'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vnni'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='ibrs-all'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <blockers model='Cascadelake-Server-v4'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vnni'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='ibrs-all'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <blockers model='Cascadelake-Server-v5'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vnni'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='ibrs-all'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <blockers model='ClearwaterForest'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx-ifma'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx-ne-convert'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx-vnni'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx-vnni-int16'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx-vnni-int8'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='bhi-ctrl'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='bhi-no'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='bus-lock-detect'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='cldemote'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='cmpccxadd'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='ddpd-u'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='fbsdp-no'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrm'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrs'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='gds-no'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='gfni'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='ibrs-all'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='intel-psfd'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='ipred-ctrl'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='lam'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='mcdt-no'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='movdir64b'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='movdiri'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='pbrsb-no'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='prefetchiti'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='psdp-no'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='rfds-no'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='rrsba-ctrl'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='serialize'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='sha512'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='sm3'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='sm4'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='ss'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='vaes'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='vpclmulqdq'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <blockers model='ClearwaterForest-v1'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx-ifma'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx-ne-convert'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx-vnni'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx-vnni-int16'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx-vnni-int8'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='bhi-ctrl'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='bhi-no'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='bus-lock-detect'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='cldemote'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='cmpccxadd'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='ddpd-u'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='fbsdp-no'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrm'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrs'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='gds-no'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='gfni'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='ibrs-all'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='intel-psfd'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='ipred-ctrl'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='lam'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='mcdt-no'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='movdir64b'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='movdiri'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='pbrsb-no'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='prefetchiti'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='psdp-no'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='rfds-no'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='rrsba-ctrl'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='serialize'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='sha512'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='sm3'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='sm4'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='ss'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='vaes'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='vpclmulqdq'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <blockers model='Cooperlake'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-bf16'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vnni'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='hle'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='ibrs-all'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='rtm'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='taa-no'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <blockers model='Cooperlake-v1'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-bf16'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vnni'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='hle'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='ibrs-all'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='rtm'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='taa-no'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <blockers model='Cooperlake-v2'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-bf16'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vnni'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='hle'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='ibrs-all'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='rtm'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='taa-no'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <blockers model='Denverton'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='mpx'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <blockers model='Denverton-v1'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='mpx'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <blockers model='Denverton-v2'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <blockers model='Denverton-v3'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <blockers model='Dhyana-v2'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <blockers model='EPYC-Genoa'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='amd-psfd'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='auto-ibrs'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-bf16'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bitalg'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512ifma'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi2'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vnni'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrm'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='gfni'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='la57'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='no-nested-data-bp'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='null-sel-clr-base'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='stibp-always-on'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='vaes'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='vpclmulqdq'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <blockers model='EPYC-Genoa-v1'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='amd-psfd'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='auto-ibrs'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-bf16'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bitalg'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512ifma'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi2'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vnni'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrm'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='gfni'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='la57'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='no-nested-data-bp'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='null-sel-clr-base'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='stibp-always-on'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='vaes'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='vpclmulqdq'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <blockers model='EPYC-Genoa-v2'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='amd-psfd'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='auto-ibrs'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-bf16'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bitalg'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512ifma'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi2'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vnni'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='fs-gs-base-ns'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrm'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='gfni'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='la57'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='no-nested-data-bp'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='null-sel-clr-base'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='perfmon-v2'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='stibp-always-on'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='vaes'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='vpclmulqdq'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <blockers model='EPYC-Milan'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrm'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <blockers model='EPYC-Milan-v1'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrm'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <blockers model='EPYC-Milan-v2'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='amd-psfd'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrm'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='no-nested-data-bp'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='null-sel-clr-base'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='stibp-always-on'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='vaes'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='vpclmulqdq'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <blockers model='EPYC-Milan-v3'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='amd-psfd'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrm'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='no-nested-data-bp'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='null-sel-clr-base'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='stibp-always-on'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='vaes'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='vpclmulqdq'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <blockers model='EPYC-Rome'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <blockers model='EPYC-Rome-v1'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <blockers model='EPYC-Rome-v2'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <blockers model='EPYC-Rome-v3'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <blockers model='EPYC-Turin'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='amd-psfd'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='auto-ibrs'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx-vnni'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-bf16'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-vp2intersect'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bitalg'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512ifma'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi2'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vnni'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='fs-gs-base-ns'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrm'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='gfni'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='ibpb-brtype'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='la57'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='movdir64b'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='movdiri'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='no-nested-data-bp'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='null-sel-clr-base'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='perfmon-v2'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='prefetchi'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='sbpb'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='srso-user-kernel-no'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='stibp-always-on'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='vaes'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='vpclmulqdq'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <blockers model='EPYC-Turin-v1'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='amd-psfd'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='auto-ibrs'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx-vnni'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-bf16'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-vp2intersect'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bitalg'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512ifma'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi2'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vnni'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='fs-gs-base-ns'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrm'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='gfni'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='ibpb-brtype'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='la57'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='movdir64b'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='movdiri'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='no-nested-data-bp'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='null-sel-clr-base'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='perfmon-v2'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='prefetchi'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='sbpb'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='srso-user-kernel-no'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='stibp-always-on'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='vaes'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='vpclmulqdq'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <blockers model='EPYC-v3'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <blockers model='EPYC-v4'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <blockers model='EPYC-v5'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <blockers model='GraniteRapids'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='amx-bf16'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='amx-fp16'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='amx-int8'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='amx-tile'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx-vnni'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-bf16'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-fp16'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bitalg'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512ifma'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi2'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vnni'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='bus-lock-detect'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='fbsdp-no'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrc'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrm'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrs'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='fzrm'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='gfni'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='hle'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='ibrs-all'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='la57'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='mcdt-no'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='pbrsb-no'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='prefetchiti'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='psdp-no'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='rtm'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='serialize'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='taa-no'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='tsx-ldtrk'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='vaes'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='vpclmulqdq'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='xfd'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <blockers model='GraniteRapids-v1'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='amx-bf16'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='amx-fp16'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='amx-int8'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='amx-tile'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx-vnni'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-bf16'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-fp16'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bitalg'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512ifma'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi2'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vnni'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='bus-lock-detect'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='fbsdp-no'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrc'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrm'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrs'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='fzrm'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='gfni'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='hle'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='ibrs-all'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='la57'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='mcdt-no'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='pbrsb-no'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='prefetchiti'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='psdp-no'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='rtm'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='serialize'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='taa-no'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='tsx-ldtrk'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='vaes'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='vpclmulqdq'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='xfd'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <blockers model='GraniteRapids-v2'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='amx-bf16'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='amx-fp16'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='amx-int8'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='amx-tile'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx-vnni'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx10'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx10-128'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx10-256'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx10-512'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-bf16'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-fp16'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bitalg'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512ifma'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi2'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vnni'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='bus-lock-detect'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='cldemote'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='fbsdp-no'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrc'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrm'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrs'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='fzrm'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='gfni'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='hle'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='ibrs-all'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='la57'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='mcdt-no'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='movdir64b'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='movdiri'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='pbrsb-no'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='prefetchiti'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='psdp-no'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='rtm'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='serialize'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='ss'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='taa-no'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='tsx-ldtrk'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='vaes'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='vpclmulqdq'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='xfd'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <blockers model='GraniteRapids-v3'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='amx-bf16'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='amx-fp16'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='amx-int8'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='amx-tile'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx-vnni'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx10'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx10-128'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx10-256'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx10-512'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-bf16'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-fp16'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bitalg'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512ifma'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi2'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vnni'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='bus-lock-detect'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='cldemote'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='fbsdp-no'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrc'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrm'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrs'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='fzrm'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='gfni'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='hle'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='ibrs-all'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='la57'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='mcdt-no'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='movdir64b'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='movdiri'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='pbrsb-no'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='prefetchiti'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='psdp-no'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='rtm'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='serialize'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='ss'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='taa-no'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='tsx-ldtrk'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='vaes'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='vpclmulqdq'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='xfd'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <blockers model='Haswell'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='hle'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='rtm'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <blockers model='Haswell-IBRS'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='hle'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='rtm'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <blockers model='Haswell-noTSX'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <blockers model='Haswell-noTSX-IBRS'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <blockers model='Haswell-v1'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='hle'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='rtm'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <blockers model='Haswell-v2'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <blockers model='Haswell-v3'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='hle'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='rtm'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <blockers model='Haswell-v4'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <blockers model='Icelake-Server'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bitalg'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi2'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vnni'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='gfni'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='hle'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='la57'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='rtm'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='vaes'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='vpclmulqdq'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <blockers model='Icelake-Server-noTSX'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bitalg'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi2'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vnni'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='gfni'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='la57'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='vaes'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='vpclmulqdq'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <blockers model='Icelake-Server-v1'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bitalg'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi2'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vnni'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='gfni'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='hle'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='la57'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='rtm'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='vaes'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='vpclmulqdq'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <blockers model='Icelake-Server-v2'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bitalg'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi2'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vnni'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='gfni'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='la57'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='vaes'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='vpclmulqdq'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <blockers model='Icelake-Server-v3'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bitalg'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi2'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vnni'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='gfni'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='ibrs-all'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='la57'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='taa-no'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='vaes'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='vpclmulqdq'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <blockers model='Icelake-Server-v4'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bitalg'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512ifma'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi2'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vnni'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrm'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='gfni'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='ibrs-all'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='la57'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='taa-no'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='vaes'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='vpclmulqdq'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <blockers model='Icelake-Server-v5'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bitalg'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512ifma'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi2'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vnni'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrm'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='gfni'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='ibrs-all'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='la57'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='taa-no'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='vaes'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='vpclmulqdq'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <blockers model='Icelake-Server-v6'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bitalg'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512ifma'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi2'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vnni'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:10 np0005626463.localdomain podman[282472]: 2026-02-23 09:39:10.047273031 +0000 UTC m=+0.227729965 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller)
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrm'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='gfni'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='ibrs-all'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='la57'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='taa-no'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='vaes'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='vpclmulqdq'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <blockers model='Icelake-Server-v7'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bitalg'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512ifma'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi2'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vnni'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrm'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='gfni'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='hle'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='ibrs-all'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='la57'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='rtm'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='taa-no'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='vaes'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='vpclmulqdq'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <blockers model='IvyBridge'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <blockers model='IvyBridge-IBRS'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <blockers model='IvyBridge-v1'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <blockers model='IvyBridge-v2'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <blockers model='KnightsMill'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-4fmaps'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-4vnniw'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512er'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512pf'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='ss'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <blockers model='KnightsMill-v1'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-4fmaps'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-4vnniw'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512er'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512pf'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='ss'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <blockers model='Opteron_G4'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='fma4'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='xop'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <blockers model='Opteron_G4-v1'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='fma4'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='xop'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <blockers model='Opteron_G5'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='fma4'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='tbm'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='xop'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <blockers model='Opteron_G5-v1'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='fma4'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='tbm'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='xop'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <blockers model='SapphireRapids'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='amx-bf16'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='amx-int8'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='amx-tile'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx-vnni'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-bf16'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-fp16'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bitalg'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512ifma'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi2'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vnni'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='bus-lock-detect'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrc'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrm'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrs'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='fzrm'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='gfni'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='hle'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='ibrs-all'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='la57'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='rtm'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='serialize'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='taa-no'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='tsx-ldtrk'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='vaes'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='vpclmulqdq'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='xfd'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <blockers model='SapphireRapids-v1'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='amx-bf16'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='amx-int8'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='amx-tile'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx-vnni'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-bf16'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-fp16'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bitalg'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512ifma'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi2'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vnni'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='bus-lock-detect'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrc'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrm'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrs'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='fzrm'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='gfni'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='hle'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='ibrs-all'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='la57'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='rtm'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='serialize'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='taa-no'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='tsx-ldtrk'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='vaes'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='vpclmulqdq'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='xfd'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <blockers model='SapphireRapids-v2'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='amx-bf16'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='amx-int8'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='amx-tile'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx-vnni'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-bf16'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-fp16'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bitalg'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512ifma'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi2'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vnni'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='bus-lock-detect'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='fbsdp-no'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrc'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrm'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrs'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='fzrm'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='gfni'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='hle'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='ibrs-all'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='la57'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='psdp-no'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='rtm'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='serialize'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='taa-no'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='tsx-ldtrk'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='vaes'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='vpclmulqdq'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='xfd'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <blockers model='SapphireRapids-v3'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='amx-bf16'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='amx-int8'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='amx-tile'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx-vnni'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-bf16'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-fp16'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bitalg'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512ifma'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi2'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vnni'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='bus-lock-detect'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='cldemote'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='fbsdp-no'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrc'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrm'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrs'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='fzrm'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='gfni'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='hle'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='ibrs-all'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='la57'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='movdir64b'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='movdiri'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='psdp-no'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='rtm'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='serialize'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='ss'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='taa-no'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='tsx-ldtrk'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='vaes'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='vpclmulqdq'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='xfd'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <blockers model='SapphireRapids-v4'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='amx-bf16'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='amx-int8'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='amx-tile'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx-vnni'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-bf16'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-fp16'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512-vpopcntdq'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bitalg'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512ifma'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vbmi2'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vnni'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='bus-lock-detect'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='cldemote'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='fbsdp-no'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrc'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrm'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrs'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='fzrm'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='gfni'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='hle'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='ibrs-all'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='la57'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='movdir64b'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='movdiri'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='psdp-no'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='rtm'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='serialize'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='ss'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='taa-no'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='tsx-ldtrk'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='vaes'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='vpclmulqdq'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='xfd'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <blockers model='SierraForest'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx-ifma'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx-ne-convert'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx-vnni'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx-vnni-int8'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='bus-lock-detect'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='cmpccxadd'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='fbsdp-no'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrm'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrs'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='gfni'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='ibrs-all'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='mcdt-no'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='pbrsb-no'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='psdp-no'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='serialize'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='vaes'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='vpclmulqdq'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <blockers model='SierraForest-v1'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx-ifma'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx-ne-convert'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx-vnni'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx-vnni-int8'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='bus-lock-detect'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='cmpccxadd'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='fbsdp-no'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrm'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrs'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='gfni'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='ibrs-all'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='mcdt-no'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='pbrsb-no'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='psdp-no'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='serialize'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='vaes'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='vpclmulqdq'/>
Feb 23 09:39:10 np0005626463.localdomain systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully.
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <blockers model='SierraForest-v2'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx-ifma'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx-ne-convert'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx-vnni'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx-vnni-int8'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='bhi-ctrl'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='bus-lock-detect'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='cldemote'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='cmpccxadd'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='fbsdp-no'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrm'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrs'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='gds-no'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='gfni'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='ibrs-all'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='intel-psfd'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='ipred-ctrl'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='lam'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='mcdt-no'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='movdir64b'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='movdiri'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='pbrsb-no'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='psdp-no'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='rfds-no'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='rrsba-ctrl'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='serialize'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='ss'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='vaes'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='vpclmulqdq'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <blockers model='SierraForest-v3'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx-ifma'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx-ne-convert'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx-vnni'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx-vnni-int8'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='bhi-ctrl'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='bus-lock-detect'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='cldemote'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='cmpccxadd'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='fbsdp-no'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrm'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='fsrs'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='gds-no'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='gfni'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='ibrs-all'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='intel-psfd'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='ipred-ctrl'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='lam'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='mcdt-no'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='movdir64b'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='movdiri'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='pbrsb-no'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='psdp-no'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='rfds-no'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='rrsba-ctrl'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='sbdr-ssdp-no'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='serialize'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='ss'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='vaes'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='vpclmulqdq'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <blockers model='Skylake-Client'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='hle'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='rtm'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <blockers model='Skylake-Client-IBRS'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='hle'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='rtm'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <blockers model='Skylake-Client-v1'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='hle'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='rtm'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <blockers model='Skylake-Client-v2'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='hle'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='rtm'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <blockers model='Skylake-Client-v3'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <blockers model='Skylake-Client-v4'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <blockers model='Skylake-Server'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='hle'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='rtm'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <blockers model='Skylake-Server-IBRS'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='hle'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='rtm'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <blockers model='Skylake-Server-v1'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='hle'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='rtm'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <blockers model='Skylake-Server-v2'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='hle'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='rtm'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <blockers model='Skylake-Server-v3'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <blockers model='Skylake-Server-v4'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <blockers model='Skylake-Server-v5'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512bw'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512cd'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512dq'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512f'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='avx512vl'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='invpcid'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='pcid'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='pku'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <blockers model='Snowridge'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='cldemote'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='core-capability'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='gfni'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='movdir64b'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='movdiri'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='mpx'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='split-lock-detect'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <blockers model='Snowridge-v1'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='cldemote'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='core-capability'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='gfni'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='movdir64b'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='movdiri'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='mpx'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='split-lock-detect'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <blockers model='Snowridge-v2'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='cldemote'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='core-capability'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='gfni'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='movdir64b'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='movdiri'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='split-lock-detect'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <blockers model='Snowridge-v3'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='cldemote'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='core-capability'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='gfni'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='movdir64b'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='movdiri'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='split-lock-detect'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <blockers model='Snowridge-v4'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='cldemote'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='erms'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='gfni'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='movdir64b'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='movdiri'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='xsaves'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <blockers model='athlon'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='3dnow'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='3dnowext'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <blockers model='athlon-v1'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='3dnow'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='3dnowext'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <blockers model='core2duo'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='ss'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <blockers model='core2duo-v1'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='ss'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <blockers model='coreduo'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='ss'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <blockers model='coreduo-v1'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='ss'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <blockers model='n270'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='ss'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <blockers model='n270-v1'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='ss'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <blockers model='phenom'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='3dnow'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='3dnowext'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <blockers model='phenom-v1'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='3dnow'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <feature name='3dnowext'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </blockers>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:     </mode>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:   </cpu>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:   <memoryBacking supported='yes'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:     <enum name='sourceType'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <value>file</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <value>anonymous</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <value>memfd</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:     </enum>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:   </memoryBacking>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:   <devices>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:     <disk supported='yes'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <enum name='diskDevice'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>disk</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>cdrom</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>floppy</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>lun</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </enum>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <enum name='bus'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>fdc</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>scsi</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>virtio</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>usb</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>sata</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </enum>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <enum name='model'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>virtio</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>virtio-transitional</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>virtio-non-transitional</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </enum>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:     </disk>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:     <graphics supported='yes'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <enum name='type'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>vnc</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>egl-headless</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>dbus</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </enum>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:     </graphics>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:     <video supported='yes'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <enum name='modelType'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>vga</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>cirrus</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>virtio</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>none</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>bochs</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>ramfb</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </enum>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:     </video>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:     <hostdev supported='yes'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <enum name='mode'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>subsystem</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </enum>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <enum name='startupPolicy'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>default</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>mandatory</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>requisite</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>optional</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </enum>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <enum name='subsysType'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>usb</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>pci</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>scsi</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </enum>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <enum name='capsType'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <enum name='pciBackend'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:     </hostdev>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:     <rng supported='yes'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <enum name='model'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>virtio</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>virtio-transitional</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>virtio-non-transitional</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </enum>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <enum name='backendModel'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>random</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>egd</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>builtin</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </enum>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:     </rng>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:     <filesystem supported='yes'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <enum name='driverType'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>path</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>handle</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>virtiofs</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </enum>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:     </filesystem>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:     <tpm supported='yes'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <enum name='model'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>tpm-tis</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>tpm-crb</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </enum>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <enum name='backendModel'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>emulator</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>external</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </enum>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <enum name='backendVersion'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>2.0</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </enum>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:     </tpm>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:     <redirdev supported='yes'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <enum name='bus'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>usb</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </enum>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:     </redirdev>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:     <channel supported='yes'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <enum name='type'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>pty</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>unix</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </enum>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:     </channel>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:     <crypto supported='yes'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <enum name='model'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <enum name='type'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>qemu</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </enum>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <enum name='backendModel'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>builtin</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </enum>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:     </crypto>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:     <interface supported='yes'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <enum name='backendType'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>default</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>passt</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </enum>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:     </interface>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:     <panic supported='yes'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <enum name='model'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>isa</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>hyperv</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </enum>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:     </panic>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:     <console supported='yes'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <enum name='type'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>null</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>vc</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>pty</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>dev</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>file</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>pipe</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>stdio</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>udp</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>tcp</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>unix</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>qemu-vdagent</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>dbus</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </enum>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:     </console>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:   </devices>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:   <features>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:     <gic supported='no'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:     <vmcoreinfo supported='yes'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:     <genid supported='yes'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:     <backingStoreInput supported='yes'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:     <backup supported='yes'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:     <async-teardown supported='yes'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:     <s390-pv supported='no'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:     <ps2 supported='yes'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:     <tdx supported='no'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:     <sev supported='no'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:     <sgx supported='no'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:     <hyperv supported='yes'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <enum name='features'>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>relaxed</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>vapic</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>spinlocks</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>vpindex</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>runtime</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>synic</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>stimer</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>reset</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>vendor_id</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>frequencies</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>reenlightenment</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>tlbflush</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>ipi</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>avic</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>emsr_bitmap</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <value>xmm_input</value>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </enum>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       <defaults>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <spinlocks>4095</spinlocks>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <stimer_direct>on</stimer_direct>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <tlbflush_direct>off</tlbflush_direct>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <tlbflush_extended>off</tlbflush_extended>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:         <vendor_id>Linux KVM Hv</vendor_id>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:       </defaults>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:     </hyperv>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:     <launchSecurity supported='no'/>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:   </features>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]: </domainCapabilities>
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:10.017 282211 DEBUG nova.virt.libvirt.host [None req-a3e77674-ebee-4da0-8221-dce4e3ab889b - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:10.018 282211 DEBUG nova.virt.libvirt.host [None req-a3e77674-ebee-4da0-8221-dce4e3ab889b - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:10.021 282211 DEBUG nova.virt.libvirt.host [None req-a3e77674-ebee-4da0-8221-dce4e3ab889b - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:10.021 282211 INFO nova.virt.libvirt.host [None req-a3e77674-ebee-4da0-8221-dce4e3ab889b - - - - - -] Secure Boot support detected
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:10.024 282211 INFO nova.virt.libvirt.driver [None req-a3e77674-ebee-4da0-8221-dce4e3ab889b - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:10.024 282211 INFO nova.virt.libvirt.driver [None req-a3e77674-ebee-4da0-8221-dce4e3ab889b - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:10.038 282211 DEBUG nova.virt.libvirt.driver [None req-a3e77674-ebee-4da0-8221-dce4e3ab889b - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:10.076 282211 INFO nova.virt.node [None req-a3e77674-ebee-4da0-8221-dce4e3ab889b - - - - - -] Determined node identity be63d86c-a403-4ec9-a515-07ea2962cb4d from /var/lib/nova/compute_id
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:10.096 282211 DEBUG nova.compute.manager [None req-a3e77674-ebee-4da0-8221-dce4e3ab889b - - - - - -] Verified node be63d86c-a403-4ec9-a515-07ea2962cb4d matches my host np0005626463.localdomain _check_for_host_rename /usr/lib/python3.9/site-packages/nova/compute/manager.py:1568
Feb 23 09:39:10 np0005626463.localdomain rsyslogd[758]: imjournal from <localhost:nova_compute>: begin to drop messages due to rate-limiting
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:10.131 282211 DEBUG nova.compute.manager [None req-a3e77674-ebee-4da0-8221-dce4e3ab889b - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:10.136 282211 DEBUG nova.virt.libvirt.vif [None req-a3e77674-ebee-4da0-8221-dce4e3ab889b - - - - - -] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-23T08:22:59Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='test',display_name='test',ec2_ids=<?>,ephemeral_gb=1,ephemeral_key_uuid=None,fault=<?>,flavor=<?>,hidden=False,host='np0005626463.localdomain',hostname='test',id=3,image_ref='a9204248-210d-45b5-ab0a-d1ec08a73a4f',info_cache=InstanceInfoCache,instance_type_id=3,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-23T08:23:11Z,launched_on='np0005626463.localdomain',locked=False,locked_by=None,memory_mb=512,metadata={},migration_context=<?>,new_flavor=<?>,node='np0005626463.localdomain',numa_topology=None,old_flavor=<?>,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='37b8098efb0d4ecc90b451a2db0e966f',ramdisk_id='',reservation_id='r-90tij075',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata=<?>,tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-02-23T08:23:11Z,user_data=None,user_id='cb6895487918456aa599ca2f76872d00',uuid=c2a7d92b-952f-46a7-8a6a-3322a48fcf4b,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "address": "fa:16:3e:a0:9d:00", "network": {"id": "9da5b53d-3184-450f-9a5b-bdba1a6c9f6d", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "37b8098efb0d4ecc90b451a2db0e966f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa27e5011-20", "ovs_interfaceid": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:10.136 282211 DEBUG nova.network.os_vif_util [None req-a3e77674-ebee-4da0-8221-dce4e3ab889b - - - - - -] Converting VIF {"id": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "address": "fa:16:3e:a0:9d:00", "network": {"id": "9da5b53d-3184-450f-9a5b-bdba1a6c9f6d", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "37b8098efb0d4ecc90b451a2db0e966f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa27e5011-20", "ovs_interfaceid": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:10.137 282211 DEBUG nova.network.os_vif_util [None req-a3e77674-ebee-4da0-8221-dce4e3ab889b - - - - - -] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:a0:9d:00,bridge_name='br-int',has_traffic_filtering=True,id=a27e5011-2016-4b16-b5e8-04b555b30bc4,network=Network(9da5b53d-3184-450f-9a5b-bdba1a6c9f6d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa27e5011-20') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:10.138 282211 DEBUG os_vif [None req-a3e77674-ebee-4da0-8221-dce4e3ab889b - - - - - -] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:a0:9d:00,bridge_name='br-int',has_traffic_filtering=True,id=a27e5011-2016-4b16-b5e8-04b555b30bc4,network=Network(9da5b53d-3184-450f-9a5b-bdba1a6c9f6d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa27e5011-20') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:10.181 282211 DEBUG ovsdbapp.backend.ovs_idl [None req-a3e77674-ebee-4da0-8221-dce4e3ab889b - - - - - -] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:10.181 282211 DEBUG ovsdbapp.backend.ovs_idl [None req-a3e77674-ebee-4da0-8221-dce4e3ab889b - - - - - -] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:10.182 282211 DEBUG ovsdbapp.backend.ovs_idl [None req-a3e77674-ebee-4da0-8221-dce4e3ab889b - - - - - -] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:10.182 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-a3e77674-ebee-4da0-8221-dce4e3ab889b - - - - - -] tcp:127.0.0.1:6640: entering CONNECTING _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:10.182 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-a3e77674-ebee-4da0-8221-dce4e3ab889b - - - - - -] [POLLOUT] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:10.183 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-a3e77674-ebee-4da0-8221-dce4e3ab889b - - - - - -] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:10.184 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-a3e77674-ebee-4da0-8221-dce4e3ab889b - - - - - -] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:10.185 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-a3e77674-ebee-4da0-8221-dce4e3ab889b - - - - - -] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:10.188 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-a3e77674-ebee-4da0-8221-dce4e3ab889b - - - - - -] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:10.203 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:10.204 282211 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:10.204 282211 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:10.205 282211 INFO oslo.privsep.daemon [None req-a3e77674-ebee-4da0-8221-dce4e3ab889b - - - - - -] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'vif_plug_ovs.privsep.vif_plug', '--privsep_sock_path', '/tmp/tmp5o6kqvot/privsep.sock']
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:10.808 282211 INFO oslo.privsep.daemon [None req-a3e77674-ebee-4da0-8221-dce4e3ab889b - - - - - -] Spawned new privsep daemon via rootwrap
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:10.703 282522 INFO oslo.privsep.daemon [-] privsep daemon starting
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:10.708 282522 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:10.711 282522 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_NET_ADMIN/CAP_DAC_OVERRIDE|CAP_NET_ADMIN/none
Feb 23 09:39:10 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:10.711 282522 INFO oslo.privsep.daemon [-] privsep daemon running as pid 282522
Feb 23 09:39:10 np0005626463.localdomain systemd[1]: tmp-crun.Pu52VL.mount: Deactivated successfully.
Feb 23 09:39:11 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:11.092 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:39:11 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:11.092 282211 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa27e5011-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 09:39:11 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:11.093 282211 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapa27e5011-20, col_values=(('external_ids', {'iface-id': 'a27e5011-2016-4b16-b5e8-04b555b30bc4', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:a0:9d:00', 'vm-uuid': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 09:39:11 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:11.094 282211 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 23 09:39:11 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:11.095 282211 INFO os_vif [None req-a3e77674-ebee-4da0-8221-dce4e3ab889b - - - - - -] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:a0:9d:00,bridge_name='br-int',has_traffic_filtering=True,id=a27e5011-2016-4b16-b5e8-04b555b30bc4,network=Network(9da5b53d-3184-450f-9a5b-bdba1a6c9f6d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa27e5011-20')
Feb 23 09:39:11 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:11.095 282211 DEBUG nova.compute.manager [None req-a3e77674-ebee-4da0-8221-dce4e3ab889b - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 23 09:39:11 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:11.099 282211 DEBUG nova.compute.manager [None req-a3e77674-ebee-4da0-8221-dce4e3ab889b - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Current state is 1, state in DB is 1. _init_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:1304
Feb 23 09:39:11 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:11.099 282211 INFO nova.compute.manager [None req-a3e77674-ebee-4da0-8221-dce4e3ab889b - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host
Feb 23 09:39:11 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:11.176 282211 DEBUG oslo_concurrency.lockutils [None req-a3e77674-ebee-4da0-8221-dce4e3ab889b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 09:39:11 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:11.176 282211 DEBUG oslo_concurrency.lockutils [None req-a3e77674-ebee-4da0-8221-dce4e3ab889b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 09:39:11 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:11.177 282211 DEBUG oslo_concurrency.lockutils [None req-a3e77674-ebee-4da0-8221-dce4e3ab889b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 09:39:11 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:11.177 282211 DEBUG nova.compute.resource_tracker [None req-a3e77674-ebee-4da0-8221-dce4e3ab889b - - - - - -] Auditing locally available compute resources for np0005626463.localdomain (node: np0005626463.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 23 09:39:11 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:11.178 282211 DEBUG oslo_concurrency.processutils [None req-a3e77674-ebee-4da0-8221-dce4e3ab889b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 09:39:11 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:11.632 282211 DEBUG oslo_concurrency.processutils [None req-a3e77674-ebee-4da0-8221-dce4e3ab889b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.454s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 09:39:11 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:11.676 282211 DEBUG nova.virt.libvirt.driver [None req-a3e77674-ebee-4da0-8221-dce4e3ab889b - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 23 09:39:11 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:11.676 282211 DEBUG nova.virt.libvirt.driver [None req-a3e77674-ebee-4da0-8221-dce4e3ab889b - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 23 09:39:11 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:11.824 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:39:11 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:11.934 282211 WARNING nova.virt.libvirt.driver [None req-a3e77674-ebee-4da0-8221-dce4e3ab889b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 23 09:39:11 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:11.936 282211 DEBUG nova.compute.resource_tracker [None req-a3e77674-ebee-4da0-8221-dce4e3ab889b - - - - - -] Hypervisor/Node resource view: name=np0005626463.localdomain free_ram=12185MB free_disk=41.83688735961914GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 23 09:39:11 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:11.937 282211 DEBUG oslo_concurrency.lockutils [None req-a3e77674-ebee-4da0-8221-dce4e3ab889b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 09:39:11 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:11.937 282211 DEBUG oslo_concurrency.lockutils [None req-a3e77674-ebee-4da0-8221-dce4e3ab889b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 09:39:12 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:12.104 282211 DEBUG nova.compute.resource_tracker [None req-a3e77674-ebee-4da0-8221-dce4e3ab889b - - - - - -] Instance c2a7d92b-952f-46a7-8a6a-3322a48fcf4b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 23 09:39:12 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:12.105 282211 DEBUG nova.compute.resource_tracker [None req-a3e77674-ebee-4da0-8221-dce4e3ab889b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 23 09:39:12 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:12.105 282211 DEBUG nova.compute.resource_tracker [None req-a3e77674-ebee-4da0-8221-dce4e3ab889b - - - - - -] Final resource view: name=np0005626463.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 23 09:39:12 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:12.152 282211 DEBUG nova.scheduler.client.report [None req-a3e77674-ebee-4da0-8221-dce4e3ab889b - - - - - -] Refreshing inventories for resource provider be63d86c-a403-4ec9-a515-07ea2962cb4d _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Feb 23 09:39:12 np0005626463.localdomain sshd[282548]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:39:12 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:12.177 282211 DEBUG nova.scheduler.client.report [None req-a3e77674-ebee-4da0-8221-dce4e3ab889b - - - - - -] Updating ProviderTree inventory for provider be63d86c-a403-4ec9-a515-07ea2962cb4d from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Feb 23 09:39:12 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:12.178 282211 DEBUG nova.compute.provider_tree [None req-a3e77674-ebee-4da0-8221-dce4e3ab889b - - - - - -] Updating inventory in ProviderTree for provider be63d86c-a403-4ec9-a515-07ea2962cb4d with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Feb 23 09:39:12 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:12.196 282211 DEBUG nova.scheduler.client.report [None req-a3e77674-ebee-4da0-8221-dce4e3ab889b - - - - - -] Refreshing aggregate associations for resource provider be63d86c-a403-4ec9-a515-07ea2962cb4d, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Feb 23 09:39:12 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:12.222 282211 DEBUG nova.scheduler.client.report [None req-a3e77674-ebee-4da0-8221-dce4e3ab889b - - - - - -] Refreshing trait associations for resource provider be63d86c-a403-4ec9-a515-07ea2962cb4d, traits: HW_CPU_X86_AVX2,COMPUTE_ACCELERATORS,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_MMX,HW_CPU_X86_SSE41,HW_CPU_X86_SSE2,HW_CPU_X86_SVM,COMPUTE_TRUSTED_CERTS,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_SECURITY_TPM_2_0,COMPUTE_STORAGE_BUS_FDC,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_E1000,HW_CPU_X86_BMI2,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_RESCUE_BFV,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_CLMUL,HW_CPU_X86_SHA,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_AESNI,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_DEVICE_TAGGING,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_ABM,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NODE,COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_CIRRUS,HW_CPU_X86_SSE4A,HW_CPU_X86_BMI,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_F16C,COMPUTE_STORAGE_BUS_IDE,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_SSSE3,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_FMA3,HW_CPU_X86_AMD_SVM,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_AVX,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_SSE42 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Feb 23 09:39:12 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:12.260 282211 DEBUG oslo_concurrency.processutils [None req-a3e77674-ebee-4da0-8221-dce4e3ab889b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 09:39:12 np0005626463.localdomain sshd[282548]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:39:12 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.
Feb 23 09:39:12 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:12.688 282211 DEBUG oslo_concurrency.processutils [None req-a3e77674-ebee-4da0-8221-dce4e3ab889b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.428s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 09:39:12 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:12.694 282211 DEBUG nova.virt.libvirt.host [None req-a3e77674-ebee-4da0-8221-dce4e3ab889b - - - - - -] /sys/module/kvm_amd/parameters/sev contains [N
Feb 23 09:39:12 np0005626463.localdomain nova_compute[282206]: ] _kernel_supports_amd_sev /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1803
Feb 23 09:39:12 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:12.694 282211 INFO nova.virt.libvirt.host [None req-a3e77674-ebee-4da0-8221-dce4e3ab889b - - - - - -] kernel doesn't support AMD SEV
Feb 23 09:39:12 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:12.696 282211 DEBUG nova.compute.provider_tree [None req-a3e77674-ebee-4da0-8221-dce4e3ab889b - - - - - -] Inventory has not changed in ProviderTree for provider: be63d86c-a403-4ec9-a515-07ea2962cb4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 23 09:39:12 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:12.697 282211 DEBUG nova.virt.libvirt.driver [None req-a3e77674-ebee-4da0-8221-dce4e3ab889b - - - - - -] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Feb 23 09:39:12 np0005626463.localdomain podman[282570]: 2026-02-23 09:39:12.709234007 +0000 UTC m=+0.080879854 container health_status be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_managed=true, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, 
container_name=ceilometer_agent_compute, io.buildah.version=1.43.0)
Feb 23 09:39:12 np0005626463.localdomain podman[282570]: 2026-02-23 09:39:12.720484668 +0000 UTC m=+0.092130565 container exec_died be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ceilometer_agent_compute, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, 
org.label-schema.license=GPLv2)
Feb 23 09:39:12 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:12.721 282211 DEBUG nova.scheduler.client.report [None req-a3e77674-ebee-4da0-8221-dce4e3ab889b - - - - - -] Inventory has not changed for provider be63d86c-a403-4ec9-a515-07ea2962cb4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 23 09:39:12 np0005626463.localdomain systemd[1]: be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.service: Deactivated successfully.
Feb 23 09:39:12 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:12.760 282211 DEBUG nova.compute.resource_tracker [None req-a3e77674-ebee-4da0-8221-dce4e3ab889b - - - - - -] Compute_service record updated for np0005626463.localdomain:np0005626463.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 23 09:39:12 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:12.761 282211 DEBUG oslo_concurrency.lockutils [None req-a3e77674-ebee-4da0-8221-dce4e3ab889b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.824s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 09:39:12 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:12.761 282211 DEBUG nova.service [None req-a3e77674-ebee-4da0-8221-dce4e3ab889b - - - - - -] Creating RPC server for service compute start /usr/lib/python3.9/site-packages/nova/service.py:182
Feb 23 09:39:12 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:12.788 282211 DEBUG nova.service [None req-a3e77674-ebee-4da0-8221-dce4e3ab889b - - - - - -] Join ServiceGroup membership for this service compute start /usr/lib/python3.9/site-packages/nova/service.py:199
Feb 23 09:39:12 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:12.789 282211 DEBUG nova.servicegroup.drivers.db [None req-a3e77674-ebee-4da0-8221-dce4e3ab889b - - - - - -] DB_Driver: join new ServiceGroup member np0005626463.localdomain to the compute group, service = <Service: host=np0005626463.localdomain, binary=nova-compute, manager_class_name=nova.compute.manager.ComputeManager> join /usr/lib/python3.9/site-packages/nova/servicegroup/drivers/db.py:44
Feb 23 09:39:13 np0005626463.localdomain openstack_network_exporter[245358]: ERROR   09:39:13 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 23 09:39:13 np0005626463.localdomain openstack_network_exporter[245358]: 
Feb 23 09:39:13 np0005626463.localdomain openstack_network_exporter[245358]: ERROR   09:39:13 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 23 09:39:13 np0005626463.localdomain openstack_network_exporter[245358]: 
Feb 23 09:39:15 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:15.217 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:39:16 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:16.861 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:39:17 np0005626463.localdomain sshd[282590]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:39:18 np0005626463.localdomain sshd[282590]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:39:18 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.
Feb 23 09:39:18 np0005626463.localdomain podman[282592]: 2026-02-23 09:39:18.492816579 +0000 UTC m=+0.073168234 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.43.0, managed_by=edpm_ansible, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260216)
Feb 23 09:39:18 np0005626463.localdomain podman[282592]: 2026-02-23 09:39:18.527303614 +0000 UTC m=+0.107655269 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, org.label-schema.build-date=20260216, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.vendor=CentOS)
Feb 23 09:39:18 np0005626463.localdomain systemd[1]: 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully.
Feb 23 09:39:19 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:19.791 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:39:19 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:19.814 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Triggering sync for uuid c2a7d92b-952f-46a7-8a6a-3322a48fcf4b _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Feb 23 09:39:19 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:19.815 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 09:39:19 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:19.815 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 09:39:19 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:19.816 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:39:19 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:19.846 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.030s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 09:39:20 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:20.244 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:39:21 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:21.892 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:39:23 np0005626463.localdomain sshd[282610]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:39:24 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49152 DF PROTO=TCP SPT=34996 DPT=9102 SEQ=3629566089 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BFD87400000000001030307) 
Feb 23 09:39:24 np0005626463.localdomain sshd[282610]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:39:24 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:39:24.559 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=7, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '22:68:bc', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'c6:19:65:94:49:af'}, ipsec=False) old=SB_Global(nb_cfg=6) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 09:39:24 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:24.560 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:39:24 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:39:24.560 163572 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 23 09:39:25 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49153 DF PROTO=TCP SPT=34996 DPT=9102 SEQ=3629566089 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BFD8B460000000001030307) 
Feb 23 09:39:25 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:25.264 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:39:25 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.
Feb 23 09:39:25 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57425 DF PROTO=TCP SPT=47026 DPT=9102 SEQ=1491910498 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BFD8E060000000001030307) 
Feb 23 09:39:25 np0005626463.localdomain podman[282612]: 2026-02-23 09:39:25.911540394 +0000 UTC m=+0.079977326 container health_status da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 23 09:39:25 np0005626463.localdomain podman[282612]: 2026-02-23 09:39:25.921313929 +0000 UTC m=+0.089750901 container exec_died da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 23 09:39:25 np0005626463.localdomain systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Deactivated successfully.
Feb 23 09:39:26 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:26.913 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:39:27 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49154 DF PROTO=TCP SPT=34996 DPT=9102 SEQ=3629566089 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BFD93460000000001030307) 
Feb 23 09:39:28 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23724 DF PROTO=TCP SPT=54442 DPT=9102 SEQ=1737972057 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BFD98060000000001030307) 
Feb 23 09:39:30 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:30.309 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:39:31 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49155 DF PROTO=TCP SPT=34996 DPT=9102 SEQ=3629566089 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BFDA3060000000001030307) 
Feb 23 09:39:31 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:31.958 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:39:33 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:39:33.563 163572 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=96b5bb93-7341-4ce6-9b93-6a5de566c711, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '7'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 09:39:33 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.
Feb 23 09:39:33 np0005626463.localdomain podman[282635]: 2026-02-23 09:39:33.903945746 +0000 UTC m=+0.079923295 container health_status 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, org.opencontainers.image.created=2026-02-05T04:57:10Z, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., architecture=x86_64, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., config_id=openstack_network_exporter, version=9.7, name=ubi9/ubi-minimal, distribution-scope=public, build-date=2026-02-05T04:57:10Z, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, release=1770267347, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, managed_by=edpm_ansible, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-type=git)
Feb 23 09:39:33 np0005626463.localdomain podman[282635]: 2026-02-23 09:39:33.915224278 +0000 UTC m=+0.091201837 container exec_died 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, container_name=openstack_network_exporter, distribution-scope=public, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, version=9.7, architecture=x86_64, config_id=openstack_network_exporter, org.opencontainers.image.created=2026-02-05T04:57:10Z, release=1770267347, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2026-02-05T04:57:10Z, vcs-type=git, name=ubi9/ubi-minimal, io.buildah.version=1.33.7, managed_by=edpm_ansible, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers)
Feb 23 09:39:33 np0005626463.localdomain systemd[1]: 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.service: Deactivated successfully.
Feb 23 09:39:35 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:35.350 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:39:36 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:36.984 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:39:39 np0005626463.localdomain podman[242954]: time="2026-02-23T09:39:39Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 23 09:39:39 np0005626463.localdomain podman[242954]: @ - - [23/Feb/2026:09:39:39 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 149682 "" "Go-http-client/1.1"
Feb 23 09:39:39 np0005626463.localdomain podman[242954]: @ - - [23/Feb/2026:09:39:39 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16787 "" "Go-http-client/1.1"
Feb 23 09:39:39 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49156 DF PROTO=TCP SPT=34996 DPT=9102 SEQ=3629566089 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BFDC4060000000001030307) 
Feb 23 09:39:40 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:40.257 282211 DEBUG nova.compute.manager [None req-1ddbaaa3-8e2d-4138-b94f-ee4dd68d62ad cb6895487918456aa599ca2f76872d00 37b8098efb0d4ecc90b451a2db0e966f - - default default] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 23 09:39:40 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:40.263 282211 INFO nova.compute.manager [None req-1ddbaaa3-8e2d-4138-b94f-ee4dd68d62ad cb6895487918456aa599ca2f76872d00 37b8098efb0d4ecc90b451a2db0e966f - - default default] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Retrieving diagnostics
Feb 23 09:39:40 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:40.352 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:39:40 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.
Feb 23 09:39:40 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.
Feb 23 09:39:40 np0005626463.localdomain podman[282656]: 2026-02-23 09:39:40.921352078 +0000 UTC m=+0.086667195 container health_status bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Feb 23 09:39:40 np0005626463.localdomain podman[282656]: 2026-02-23 09:39:40.960221081 +0000 UTC m=+0.125536178 container exec_died bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Feb 23 09:39:40 np0005626463.localdomain systemd[1]: bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.service: Deactivated successfully.
Feb 23 09:39:41 np0005626463.localdomain podman[282655]: 2026-02-23 09:39:41.014406221 +0000 UTC m=+0.184460226 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20260216, tcib_managed=true, io.buildah.version=1.43.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Feb 23 09:39:41 np0005626463.localdomain podman[282655]: 2026-02-23 09:39:41.109821837 +0000 UTC m=+0.279875872 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.vendor=CentOS, container_name=ovn_controller, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20260216, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Feb 23 09:39:41 np0005626463.localdomain systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully.
Feb 23 09:39:41 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:41.987 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:39:42 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.
Feb 23 09:39:42 np0005626463.localdomain podman[282704]: 2026-02-23 09:39:42.902649268 +0000 UTC m=+0.079984656 container health_status be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, 
maintainer=OpenStack Kubernetes Operator team)
Feb 23 09:39:42 np0005626463.localdomain podman[282704]: 2026-02-23 09:39:42.918234305 +0000 UTC m=+0.095569713 container exec_died be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Feb 23 09:39:42 np0005626463.localdomain systemd[1]: be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.service: Deactivated successfully.
Feb 23 09:39:43 np0005626463.localdomain openstack_network_exporter[245358]: ERROR   09:39:43 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 23 09:39:43 np0005626463.localdomain openstack_network_exporter[245358]: 
Feb 23 09:39:43 np0005626463.localdomain openstack_network_exporter[245358]: ERROR   09:39:43 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 23 09:39:43 np0005626463.localdomain openstack_network_exporter[245358]: 
Feb 23 09:39:45 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:45.354 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:39:47 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:47.000 282211 DEBUG oslo_concurrency.lockutils [None req-b4e74f4d-621b-491a-aee3-bf41adceaf3a cb6895487918456aa599ca2f76872d00 37b8098efb0d4ecc90b451a2db0e966f - - default default] Acquiring lock "c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 09:39:47 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:47.000 282211 DEBUG oslo_concurrency.lockutils [None req-b4e74f4d-621b-491a-aee3-bf41adceaf3a cb6895487918456aa599ca2f76872d00 37b8098efb0d4ecc90b451a2db0e966f - - default default] Lock "c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" acquired by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 09:39:47 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:47.001 282211 DEBUG nova.compute.manager [None req-b4e74f4d-621b-491a-aee3-bf41adceaf3a cb6895487918456aa599ca2f76872d00 37b8098efb0d4ecc90b451a2db0e966f - - default default] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 23 09:39:47 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:47.005 282211 DEBUG nova.compute.manager [None req-b4e74f4d-621b-491a-aee3-bf41adceaf3a cb6895487918456aa599ca2f76872d00 37b8098efb0d4ecc90b451a2db0e966f - - default default] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Stopping instance; current vm_state: active, current task_state: powering-off, current DB power_state: 1, current VM power_state: 1 do_stop_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3338
Feb 23 09:39:47 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:47.010 282211 DEBUG nova.objects.instance [None req-b4e74f4d-621b-491a-aee3-bf41adceaf3a cb6895487918456aa599ca2f76872d00 37b8098efb0d4ecc90b451a2db0e966f - - default default] Lazy-loading 'flavor' on Instance uuid c2a7d92b-952f-46a7-8a6a-3322a48fcf4b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 23 09:39:47 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:47.018 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:39:47 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:47.052 282211 DEBUG nova.virt.libvirt.driver [None req-b4e74f4d-621b-491a-aee3-bf41adceaf3a cb6895487918456aa599ca2f76872d00 37b8098efb0d4ecc90b451a2db0e966f - - default default] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Feb 23 09:39:47 np0005626463.localdomain ceph-osd[31633]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #43. Immutable memtables: 0.
Feb 23 09:39:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:39:48.541 163572 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 09:39:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:39:48.541 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 09:39:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:39:48.543 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 09:39:48 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.
Feb 23 09:39:48 np0005626463.localdomain systemd[1]: tmp-crun.XnOqTe.mount: Deactivated successfully.
Feb 23 09:39:48 np0005626463.localdomain podman[282723]: 2026-02-23 09:39:48.954756204 +0000 UTC m=+0.127308233 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, tcib_managed=true)
Feb 23 09:39:48 np0005626463.localdomain podman[282723]: 2026-02-23 09:39:48.960036188 +0000 UTC m=+0.132588247 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Feb 23 09:39:48 np0005626463.localdomain systemd[1]: 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully.
Feb 23 09:39:49 np0005626463.localdomain kernel: device tapa27e5011-20 left promiscuous mode
Feb 23 09:39:49 np0005626463.localdomain NetworkManager[5974]: <info>  [1771839589.5997] device (tapa27e5011-20): state change: disconnected -> unmanaged (reason 'unmanaged', sys-iface-state: 'removed')
Feb 23 09:39:49 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:39:49Z|00052|binding|INFO|Releasing lport a27e5011-2016-4b16-b5e8-04b555b30bc4 from this chassis (sb_readonly=0)
Feb 23 09:39:49 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:39:49Z|00053|binding|INFO|Setting lport a27e5011-2016-4b16-b5e8-04b555b30bc4 down in Southbound
Feb 23 09:39:49 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:39:49Z|00054|binding|INFO|Removing iface tapa27e5011-20 ovn-installed in OVS
Feb 23 09:39:49 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:49.614 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:39:49 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:39:49Z|00055|ovn_bfd|INFO|Disabled BFD on interface ovn-5b0126-0
Feb 23 09:39:49 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:39:49Z|00056|ovn_bfd|INFO|Disabled BFD on interface ovn-585d62-0
Feb 23 09:39:49 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:39:49Z|00057|ovn_bfd|INFO|Disabled BFD on interface ovn-b9c72d-0
Feb 23 09:39:49 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:39:49.619 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a0:9d:00 192.168.0.12'], port_security=['fa:16:3e:a0:9d:00 192.168.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005626463.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '192.168.0.12/24', 'neutron:device_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'neutron:device_owner': 'compute:nova', 'neutron:host_id': 'np0005626463.localdomain', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9da5b53d-3184-450f-9a5b-bdba1a6c9f6d', 'neutron:port_capabilities': '', 'neutron:port_fip': '192.168.122.20', 'neutron:port_name': '', 'neutron:project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'neutron:revision_number': '7', 'neutron:security_group_ids': '18508c14-7c5f-4fc2-8d9a-66df41a4ab8c ef2f14d6-40b1-49a6-83d1-89d52b525905', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e1694950-12d2-4254-85f1-37700098294d, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f808c075610>], logical_port=a27e5011-2016-4b16-b5e8-04b555b30bc4) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f808c075610>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 09:39:49 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:39:49.621 163572 INFO neutron.agent.ovn.metadata.agent [-] Port a27e5011-2016-4b16-b5e8-04b555b30bc4 in datapath 9da5b53d-3184-450f-9a5b-bdba1a6c9f6d unbound from our chassis
Feb 23 09:39:49 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:39:49.623 163572 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 9da5b53d-3184-450f-9a5b-bdba1a6c9f6d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 23 09:39:49 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:49.626 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:39:49 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:39:49Z|00058|binding|INFO|Releasing lport 4143c8ea-7577-4792-9744-bcff90eb20f2 from this chassis (sb_readonly=0)
Feb 23 09:39:49 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:39:49.629 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[fbae2b3b-779d-439f-8f13-4678862ff373]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 09:39:49 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:39:49.629 163572 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-9da5b53d-3184-450f-9a5b-bdba1a6c9f6d namespace which is not needed anymore
Feb 23 09:39:49 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:49.632 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:39:49 np0005626463.localdomain systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000003.scope: Deactivated successfully.
Feb 23 09:39:49 np0005626463.localdomain systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000003.scope: Consumed 3min 52.695s CPU time.
Feb 23 09:39:49 np0005626463.localdomain systemd-machined[84014]: Machine qemu-1-instance-00000003 terminated.
Feb 23 09:39:49 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:49.682 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:39:49 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:39:49Z|00059|binding|INFO|Releasing lport 4143c8ea-7577-4792-9744-bcff90eb20f2 from this chassis (sb_readonly=0)
Feb 23 09:39:49 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:49.687 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:39:49 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:49.827 282211 DEBUG nova.compute.manager [req-74044e2f-843c-452b-b512-27edc7cc69ef req-d51fe0ea-7a20-4877-b3d3-f57527cde7fb 422ccb105b4e4d80bb6030ca202e94d2 d351b5d019cd497ab1d84160f10b653c - - default default] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Received event network-vif-unplugged-a27e5011-2016-4b16-b5e8-04b555b30bc4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 23 09:39:49 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:49.830 282211 DEBUG oslo_concurrency.lockutils [req-74044e2f-843c-452b-b512-27edc7cc69ef req-d51fe0ea-7a20-4877-b3d3-f57527cde7fb 422ccb105b4e4d80bb6030ca202e94d2 d351b5d019cd497ab1d84160f10b653c - - default default] Acquiring lock "c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 09:39:49 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:49.830 282211 DEBUG oslo_concurrency.lockutils [req-74044e2f-843c-452b-b512-27edc7cc69ef req-d51fe0ea-7a20-4877-b3d3-f57527cde7fb 422ccb105b4e4d80bb6030ca202e94d2 d351b5d019cd497ab1d84160f10b653c - - default default] Lock "c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 09:39:49 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:49.830 282211 DEBUG oslo_concurrency.lockutils [req-74044e2f-843c-452b-b512-27edc7cc69ef req-d51fe0ea-7a20-4877-b3d3-f57527cde7fb 422ccb105b4e4d80bb6030ca202e94d2 d351b5d019cd497ab1d84160f10b653c - - default default] Lock "c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 09:39:49 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:49.830 282211 DEBUG nova.compute.manager [req-74044e2f-843c-452b-b512-27edc7cc69ef req-d51fe0ea-7a20-4877-b3d3-f57527cde7fb 422ccb105b4e4d80bb6030ca202e94d2 d351b5d019cd497ab1d84160f10b653c - - default default] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] No waiting events found dispatching network-vif-unplugged-a27e5011-2016-4b16-b5e8-04b555b30bc4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 23 09:39:49 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:49.831 282211 WARNING nova.compute.manager [req-74044e2f-843c-452b-b512-27edc7cc69ef req-d51fe0ea-7a20-4877-b3d3-f57527cde7fb 422ccb105b4e4d80bb6030ca202e94d2 d351b5d019cd497ab1d84160f10b653c - - default default] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Received unexpected event network-vif-unplugged-a27e5011-2016-4b16-b5e8-04b555b30bc4 for instance with vm_state active and task_state powering-off.
Feb 23 09:39:50 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:50.070 282211 INFO nova.virt.libvirt.driver [None req-b4e74f4d-621b-491a-aee3-bf41adceaf3a cb6895487918456aa599ca2f76872d00 37b8098efb0d4ecc90b451a2db0e966f - - default default] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Instance shutdown successfully after 3 seconds.
Feb 23 09:39:50 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:50.076 282211 INFO nova.virt.libvirt.driver [-] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Instance destroyed successfully.
Feb 23 09:39:50 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:50.077 282211 DEBUG nova.objects.instance [None req-b4e74f4d-621b-491a-aee3-bf41adceaf3a cb6895487918456aa599ca2f76872d00 37b8098efb0d4ecc90b451a2db0e966f - - default default] Lazy-loading 'numa_topology' on Instance uuid c2a7d92b-952f-46a7-8a6a-3322a48fcf4b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 23 09:39:50 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:50.097 282211 DEBUG nova.compute.manager [None req-b4e74f4d-621b-491a-aee3-bf41adceaf3a cb6895487918456aa599ca2f76872d00 37b8098efb0d4ecc90b451a2db0e966f - - default default] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 23 09:39:50 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:50.210 282211 DEBUG oslo_concurrency.lockutils [None req-b4e74f4d-621b-491a-aee3-bf41adceaf3a cb6895487918456aa599ca2f76872d00 37b8098efb0d4ecc90b451a2db0e966f - - default default] Lock "c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" "released" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: held 3.209s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 09:39:50 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:50.356 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:39:51 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:51.854 282211 DEBUG nova.compute.manager [req-9655e876-19ac-44ea-8475-623b98103bfd req-e00d5ff0-7787-4cf5-a0c9-4f2e0c6fe8dd 422ccb105b4e4d80bb6030ca202e94d2 d351b5d019cd497ab1d84160f10b653c - - default default] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Received event network-vif-plugged-a27e5011-2016-4b16-b5e8-04b555b30bc4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 23 09:39:51 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:51.855 282211 DEBUG oslo_concurrency.lockutils [req-9655e876-19ac-44ea-8475-623b98103bfd req-e00d5ff0-7787-4cf5-a0c9-4f2e0c6fe8dd 422ccb105b4e4d80bb6030ca202e94d2 d351b5d019cd497ab1d84160f10b653c - - default default] Acquiring lock "c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 09:39:51 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:51.855 282211 DEBUG oslo_concurrency.lockutils [req-9655e876-19ac-44ea-8475-623b98103bfd req-e00d5ff0-7787-4cf5-a0c9-4f2e0c6fe8dd 422ccb105b4e4d80bb6030ca202e94d2 d351b5d019cd497ab1d84160f10b653c - - default default] Lock "c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 09:39:51 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:51.856 282211 DEBUG oslo_concurrency.lockutils [req-9655e876-19ac-44ea-8475-623b98103bfd req-e00d5ff0-7787-4cf5-a0c9-4f2e0c6fe8dd 422ccb105b4e4d80bb6030ca202e94d2 d351b5d019cd497ab1d84160f10b653c - - default default] Lock "c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 09:39:51 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:51.856 282211 DEBUG nova.compute.manager [req-9655e876-19ac-44ea-8475-623b98103bfd req-e00d5ff0-7787-4cf5-a0c9-4f2e0c6fe8dd 422ccb105b4e4d80bb6030ca202e94d2 d351b5d019cd497ab1d84160f10b653c - - default default] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] No waiting events found dispatching network-vif-plugged-a27e5011-2016-4b16-b5e8-04b555b30bc4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 23 09:39:51 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:51.857 282211 WARNING nova.compute.manager [req-9655e876-19ac-44ea-8475-623b98103bfd req-e00d5ff0-7787-4cf5-a0c9-4f2e0c6fe8dd 422ccb105b4e4d80bb6030ca202e94d2 d351b5d019cd497ab1d84160f10b653c - - default default] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Received unexpected event network-vif-plugged-a27e5011-2016-4b16-b5e8-04b555b30bc4 for instance with vm_state stopped and task_state None.
Feb 23 09:39:52 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:52.056 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:39:52 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:52.896 282211 DEBUG nova.compute.manager [None req-34d0ee8e-9658-4e92-b294-b911ed421ce1 cb6895487918456aa599ca2f76872d00 37b8098efb0d4ecc90b451a2db0e966f - - default default] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 23 09:39:52 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:52.919 282211 ERROR oslo_messaging.rpc.server [None req-34d0ee8e-9658-4e92-b294-b911ed421ce1 cb6895487918456aa599ca2f76872d00 37b8098efb0d4ecc90b451a2db0e966f - - default default] Exception during message handling: nova.exception.InstanceInvalidState: Instance c2a7d92b-952f-46a7-8a6a-3322a48fcf4b in power state shutdown. Cannot get_diagnostics while the instance is in this state.
Feb 23 09:39:52 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:52.919 282211 ERROR oslo_messaging.rpc.server Traceback (most recent call last):
Feb 23 09:39:52 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:52.919 282211 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/server.py", line 165, in _process_incoming
Feb 23 09:39:52 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:52.919 282211 ERROR oslo_messaging.rpc.server     res = self.dispatcher.dispatch(message)
Feb 23 09:39:52 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:52.919 282211 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/dispatcher.py", line 309, in dispatch
Feb 23 09:39:52 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:52.919 282211 ERROR oslo_messaging.rpc.server     return self._do_dispatch(endpoint, method, ctxt, args)
Feb 23 09:39:52 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:52.919 282211 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/dispatcher.py", line 229, in _do_dispatch
Feb 23 09:39:52 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:52.919 282211 ERROR oslo_messaging.rpc.server     result = func(ctxt, **new_args)
Feb 23 09:39:52 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:52.919 282211 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/nova/exception_wrapper.py", line 71, in wrapped
Feb 23 09:39:52 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:52.919 282211 ERROR oslo_messaging.rpc.server     _emit_versioned_exception_notification(
Feb 23 09:39:52 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:52.919 282211 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 227, in __exit__
Feb 23 09:39:52 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:52.919 282211 ERROR oslo_messaging.rpc.server     self.force_reraise()
Feb 23 09:39:52 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:52.919 282211 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
Feb 23 09:39:52 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:52.919 282211 ERROR oslo_messaging.rpc.server     raise self.value
Feb 23 09:39:52 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:52.919 282211 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/nova/exception_wrapper.py", line 63, in wrapped
Feb 23 09:39:52 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:52.919 282211 ERROR oslo_messaging.rpc.server     return f(self, context, *args, **kw)
Feb 23 09:39:52 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:52.919 282211 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 214, in decorated_function
Feb 23 09:39:52 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:52.919 282211 ERROR oslo_messaging.rpc.server     compute_utils.add_instance_fault_from_exc(context,
Feb 23 09:39:52 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:52.919 282211 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 227, in __exit__
Feb 23 09:39:52 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:52.919 282211 ERROR oslo_messaging.rpc.server     self.force_reraise()
Feb 23 09:39:52 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:52.919 282211 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
Feb 23 09:39:52 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:52.919 282211 ERROR oslo_messaging.rpc.server     raise self.value
Feb 23 09:39:52 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:52.919 282211 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 203, in decorated_function
Feb 23 09:39:52 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:52.919 282211 ERROR oslo_messaging.rpc.server     return function(self, context, *args, **kwargs)
Feb 23 09:39:52 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:52.919 282211 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 6739, in get_instance_diagnostics
Feb 23 09:39:52 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:52.919 282211 ERROR oslo_messaging.rpc.server     raise exception.InstanceInvalidState(
Feb 23 09:39:52 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:52.919 282211 ERROR oslo_messaging.rpc.server nova.exception.InstanceInvalidState: Instance c2a7d92b-952f-46a7-8a6a-3322a48fcf4b in power state shutdown. Cannot get_diagnostics while the instance is in this state.
Feb 23 09:39:52 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:52.919 282211 ERROR oslo_messaging.rpc.server 
Feb 23 09:39:54 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29801 DF PROTO=TCP SPT=59864 DPT=9102 SEQ=1035056940 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BFDFC700000000001030307) 
Feb 23 09:39:55 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29802 DF PROTO=TCP SPT=59864 DPT=9102 SEQ=1035056940 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BFE00860000000001030307) 
Feb 23 09:39:55 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:55.400 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:39:56 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49157 DF PROTO=TCP SPT=34996 DPT=9102 SEQ=3629566089 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BFE04060000000001030307) 
Feb 23 09:39:56 np0005626463.localdomain sshd[282793]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:39:56 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.
Feb 23 09:39:56 np0005626463.localdomain podman[282795]: 2026-02-23 09:39:56.916721265 +0000 UTC m=+0.088025638 container health_status da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 23 09:39:56 np0005626463.localdomain podman[282795]: 2026-02-23 09:39:56.925304423 +0000 UTC m=+0.096608826 container exec_died da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 23 09:39:56 np0005626463.localdomain systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Deactivated successfully.
Feb 23 09:39:57 np0005626463.localdomain sshd[282793]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:39:57 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:39:57.101 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:39:57 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29803 DF PROTO=TCP SPT=59864 DPT=9102 SEQ=1035056940 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BFE08860000000001030307) 
Feb 23 09:39:57 np0005626463.localdomain sshd[282819]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:39:58 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57426 DF PROTO=TCP SPT=47026 DPT=9102 SEQ=1491910498 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BFE0C070000000001030307) 
Feb 23 09:39:58 np0005626463.localdomain sshd[282819]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:39:59 np0005626463.localdomain systemd[1]: libpod-f1c94b6d873a5ea7f04a27d272093dc9c80dc0bedf3bc6f8302f2ec7ba926e2b.scope: Deactivated successfully.
Feb 23 09:39:59 np0005626463.localdomain podman[282769]: 2026-02-23 09:39:59.879305549 +0000 UTC m=+10.098954800 container died f1c94b6d873a5ea7f04a27d272093dc9c80dc0bedf3bc6f8302f2ec7ba926e2b (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=neutron-haproxy-ovnmeta-9da5b53d-3184-450f-9a5b-bdba1a6c9f6d, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, build-date=2026-01-12T22:56:19Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, tcib_managed=true, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container)
Feb 23 09:40:00 np0005626463.localdomain podman[282769]: 2026-02-23 09:40:00.056007033 +0000 UTC m=+10.275656224 container cleanup f1c94b6d873a5ea7f04a27d272093dc9c80dc0bedf3bc6f8302f2ec7ba926e2b (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=neutron-haproxy-ovnmeta-9da5b53d-3184-450f-9a5b-bdba1a6c9f6d, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, io.openshift.expose-services=, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, architecture=x86_64, maintainer=OpenStack TripleO Team, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, version=17.1.13, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:56:19Z, vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, build-date=2026-01-12T22:56:19Z, vendor=Red Hat, Inc., url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, release=1766032510)
Feb 23 09:40:00 np0005626463.localdomain podman[282822]: 2026-02-23 09:40:00.070064179 +0000 UTC m=+0.178665806 container cleanup f1c94b6d873a5ea7f04a27d272093dc9c80dc0bedf3bc6f8302f2ec7ba926e2b (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=neutron-haproxy-ovnmeta-9da5b53d-3184-450f-9a5b-bdba1a6c9f6d, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.5, distribution-scope=public, build-date=2026-01-12T22:56:19Z, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, maintainer=OpenStack TripleO Team, version=17.1.13, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, architecture=x86_64, batch=17.1_20260112.1, vendor=Red Hat, Inc., vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, release=1766032510, url=https://www.redhat.com, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Feb 23 09:40:00 np0005626463.localdomain systemd[1]: libpod-conmon-f1c94b6d873a5ea7f04a27d272093dc9c80dc0bedf3bc6f8302f2ec7ba926e2b.scope: Deactivated successfully.
Feb 23 09:40:00 np0005626463.localdomain podman[282840]: 2026-02-23 09:40:00.158981919 +0000 UTC m=+0.075474123 container remove f1c94b6d873a5ea7f04a27d272093dc9c80dc0bedf3bc6f8302f2ec7ba926e2b (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=neutron-haproxy-ovnmeta-9da5b53d-3184-450f-9a5b-bdba1a6c9f6d, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, org.opencontainers.image.created=2026-01-12T22:56:19Z, build-date=2026-01-12T22:56:19Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, vcs-type=git, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, release=1766032510, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-neutron-metadata-agent-ovn-container)
Feb 23 09:40:00 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:40:00.168 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[d82fc3c7-9e37-43a3-8838-d6bedcc436ae]: (4, ('Mon Feb 23 09:39:49 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-9da5b53d-3184-450f-9a5b-bdba1a6c9f6d (f1c94b6d873a5ea7f04a27d272093dc9c80dc0bedf3bc6f8302f2ec7ba926e2b)\nf1c94b6d873a5ea7f04a27d272093dc9c80dc0bedf3bc6f8302f2ec7ba926e2b\nMon Feb 23 09:40:00 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-9da5b53d-3184-450f-9a5b-bdba1a6c9f6d (f1c94b6d873a5ea7f04a27d272093dc9c80dc0bedf3bc6f8302f2ec7ba926e2b)\nf1c94b6d873a5ea7f04a27d272093dc9c80dc0bedf3bc6f8302f2ec7ba926e2b\n', 'time="2026-02-23T09:39:59Z" level=warning msg="StopSignal SIGTERM failed to stop container neutron-haproxy-ovnmeta-9da5b53d-3184-450f-9a5b-bdba1a6c9f6d in 10 seconds, resorting to SIGKILL"\n', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 09:40:00 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:40:00.172 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[0133f8eb-567c-4a4d-a4e3-fa7a6242341d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 09:40:00 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:40:00.173 163572 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9da5b53d-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 09:40:00 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:00.206 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:40:00 np0005626463.localdomain kernel: device tap9da5b53d-30 left promiscuous mode
Feb 23 09:40:00 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:00.219 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:40:00 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:40:00.221 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[5e765676-701f-4eb0-8253-02568fd0cac3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 09:40:00 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:40:00.233 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[15190387-dd96-46cc-a24f-b6c0e0421bfb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 09:40:00 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:40:00.234 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[88d8a29f-f1de-4940-b742-4f8237da21c9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 09:40:00 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:40:00.244 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[c7d850be-bb7d-4764-a4e2-51b394055d16]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': 
[['IFLA_XDP_ATTACHED', None]]}], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 643121, 'reachable_time': 22230, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 
'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1356, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 282861, 'error': None, 'target': 'ovnmeta-9da5b53d-3184-450f-9a5b-bdba1a6c9f6d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 09:40:00 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:40:00.261 163964 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-9da5b53d-3184-450f-9a5b-bdba1a6c9f6d deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Feb 23 09:40:00 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:40:00.262 163964 DEBUG oslo.privsep.daemon [-] privsep: reply[1d71c8bd-21d1-4546-8284-2bb177c177f9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 09:40:00 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:00.402 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:40:00 np0005626463.localdomain systemd[1]: tmp-crun.VfRO0y.mount: Deactivated successfully.
Feb 23 09:40:00 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-f3b968a22d6dac5274c225974669ffbf9fd10e196a31be0e89003b3aedfce825-merged.mount: Deactivated successfully.
Feb 23 09:40:00 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f1c94b6d873a5ea7f04a27d272093dc9c80dc0bedf3bc6f8302f2ec7ba926e2b-userdata-shm.mount: Deactivated successfully.
Feb 23 09:40:00 np0005626463.localdomain systemd[1]: run-netns-ovnmeta\x2d9da5b53d\x2d3184\x2d450f\x2d9a5b\x2dbdba1a6c9f6d.mount: Deactivated successfully.
Feb 23 09:40:01 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29804 DF PROTO=TCP SPT=59864 DPT=9102 SEQ=1035056940 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BFE18470000000001030307) 
Feb 23 09:40:02 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:02.139 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:40:02 np0005626463.localdomain sshd[282864]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:40:03 np0005626463.localdomain sshd[282864]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:40:04 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.
Feb 23 09:40:04 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:04.839 282211 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1771839589.8384318, c2a7d92b-952f-46a7-8a6a-3322a48fcf4b => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 23 09:40:04 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:04.840 282211 INFO nova.compute.manager [-] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] VM Stopped (Lifecycle Event)
Feb 23 09:40:04 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:04.860 282211 DEBUG nova.compute.manager [None req-9130ae8a-ae34-45dd-9227-660ec3fa69b7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 23 09:40:04 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:04.865 282211 DEBUG nova.compute.manager [None req-9130ae8a-ae34-45dd-9227-660ec3fa69b7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Synchronizing instance power state after lifecycle event "Stopped"; current vm_state: stopped, current task_state: None, current DB power_state: 4, VM power_state: 4 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 23 09:40:04 np0005626463.localdomain systemd[1]: tmp-crun.YdUlzp.mount: Deactivated successfully.
Feb 23 09:40:04 np0005626463.localdomain podman[282866]: 2026-02-23 09:40:04.913238588 +0000 UTC m=+0.086036861 container health_status 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9/ubi-minimal, release=1770267347, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, 
io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, version=9.7, config_id=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, org.opencontainers.image.created=2026-02-05T04:57:10Z, maintainer=Red Hat, Inc., build-date=2026-02-05T04:57:10Z, distribution-scope=public)
Feb 23 09:40:04 np0005626463.localdomain podman[282866]: 2026-02-23 09:40:04.929349448 +0000 UTC m=+0.102147781 container exec_died 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, distribution-scope=public, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, managed_by=edpm_ansible, io.buildah.version=1.33.7, config_id=openstack_network_exporter, name=ubi9/ubi-minimal, release=1770267347, org.opencontainers.image.created=2026-02-05T04:57:10Z, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.7, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vendor=Red Hat, Inc., build-date=2026-02-05T04:57:10Z, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c)
Feb 23 09:40:04 np0005626463.localdomain systemd[1]: 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.service: Deactivated successfully.
Feb 23 09:40:05 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:05.405 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:40:07 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:07.177 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:40:08 np0005626463.localdomain sudo[282886]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 09:40:08 np0005626463.localdomain sudo[282886]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:40:08 np0005626463.localdomain sudo[282886]: pam_unix(sudo:session): session closed for user root
Feb 23 09:40:08 np0005626463.localdomain sudo[282904]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 23 09:40:08 np0005626463.localdomain sudo[282904]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:40:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:09.106 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:40:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:09.106 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:40:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:09.107 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 23 09:40:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:09.107 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 23 09:40:09 np0005626463.localdomain sudo[282904]: pam_unix(sudo:session): session closed for user root
Feb 23 09:40:09 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29805 DF PROTO=TCP SPT=59864 DPT=9102 SEQ=1035056940 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BFE38060000000001030307) 
Feb 23 09:40:09 np0005626463.localdomain podman[242954]: time="2026-02-23T09:40:09Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 23 09:40:09 np0005626463.localdomain podman[242954]: @ - - [23/Feb/2026:09:40:09 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 147348 "" "Go-http-client/1.1"
Feb 23 09:40:09 np0005626463.localdomain podman[242954]: @ - - [23/Feb/2026:09:40:09 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16312 "" "Go-http-client/1.1"
Feb 23 09:40:09 np0005626463.localdomain sudo[282953]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 09:40:09 np0005626463.localdomain sudo[282953]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:40:09 np0005626463.localdomain sudo[282953]: pam_unix(sudo:session): session closed for user root
Feb 23 09:40:09 np0005626463.localdomain sudo[282971]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ceph-volume --fsid f1fea371-cb69-578d-a3d0-b5c472a84b46 -- inventory --format=json-pretty --filter-for-batch
Feb 23 09:40:09 np0005626463.localdomain sudo[282971]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:40:10 np0005626463.localdomain podman[283029]: 
Feb 23 09:40:10 np0005626463.localdomain podman[283029]: 2026-02-23 09:40:10.211899902 +0000 UTC m=+0.076973340 container create 88ba0a31e4645b8f92dcaa25d561e540c7f7a21f474e78e567e52f715c2fc895 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=jolly_khorana, GIT_CLEAN=True, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, description=Red Hat Ceph Storage 7, RELEASE=main, distribution-scope=public, CEPH_POINT_RELEASE=, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.42.2, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-type=git, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, build-date=2026-02-09T10:25:24Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1770267347, ceph=True, com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Feb 23 09:40:10 np0005626463.localdomain podman[283029]: 2026-02-23 09:40:10.181151308 +0000 UTC m=+0.046224816 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 23 09:40:10 np0005626463.localdomain systemd[1]: Started libpod-conmon-88ba0a31e4645b8f92dcaa25d561e540c7f7a21f474e78e567e52f715c2fc895.scope.
Feb 23 09:40:10 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 09:40:10 np0005626463.localdomain podman[283029]: 2026-02-23 09:40:10.322444922 +0000 UTC m=+0.187518380 container init 88ba0a31e4645b8f92dcaa25d561e540c7f7a21f474e78e567e52f715c2fc895 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=jolly_khorana, com.redhat.component=rhceph-container, RELEASE=main, io.buildah.version=1.42.2, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.created=2026-02-09T10:25:24Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, ceph=True, name=rhceph, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vendor=Red Hat, Inc., io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2026-02-09T10:25:24Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1770267347, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, vcs-type=git, CEPH_POINT_RELEASE=, architecture=x86_64, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14)
Feb 23 09:40:10 np0005626463.localdomain podman[283029]: 2026-02-23 09:40:10.331248195 +0000 UTC m=+0.196321633 container start 88ba0a31e4645b8f92dcaa25d561e540c7f7a21f474e78e567e52f715c2fc895 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=jolly_khorana, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, ceph=True, name=rhceph, release=1770267347, com.redhat.component=rhceph-container, vcs-type=git, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-02-09T10:25:24Z, build-date=2026-02-09T10:25:24Z, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.42.2, GIT_CLEAN=True, CEPH_POINT_RELEASE=, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Feb 23 09:40:10 np0005626463.localdomain podman[283029]: 2026-02-23 09:40:10.33173443 +0000 UTC m=+0.196807868 container attach 88ba0a31e4645b8f92dcaa25d561e540c7f7a21f474e78e567e52f715c2fc895 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=jolly_khorana, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, RELEASE=main, build-date=2026-02-09T10:25:24Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, GIT_BRANCH=main, name=rhceph, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1770267347, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.tags=rhceph ceph, distribution-scope=public, io.buildah.version=1.42.2, architecture=x86_64, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, vcs-type=git, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=)
Feb 23 09:40:10 np0005626463.localdomain jolly_khorana[283044]: 167 167
Feb 23 09:40:10 np0005626463.localdomain systemd[1]: libpod-88ba0a31e4645b8f92dcaa25d561e540c7f7a21f474e78e567e52f715c2fc895.scope: Deactivated successfully.
Feb 23 09:40:10 np0005626463.localdomain podman[283029]: 2026-02-23 09:40:10.336739466 +0000 UTC m=+0.201812894 container died 88ba0a31e4645b8f92dcaa25d561e540c7f7a21f474e78e567e52f715c2fc895 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=jolly_khorana, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, build-date=2026-02-09T10:25:24Z, name=rhceph, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.buildah.version=1.42.2, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, architecture=x86_64, release=1770267347, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_BRANCH=main, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Feb 23 09:40:10 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:10.406 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:40:10 np0005626463.localdomain podman[283049]: 2026-02-23 09:40:10.429853215 +0000 UTC m=+0.084361359 container remove 88ba0a31e4645b8f92dcaa25d561e540c7f7a21f474e78e567e52f715c2fc895 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=jolly_khorana, RELEASE=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, architecture=x86_64, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vendor=Red Hat, Inc., release=1770267347, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2026-02-09T10:25:24Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, build-date=2026-02-09T10:25:24Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, io.openshift.tags=rhceph ceph, ceph=True, io.buildah.version=1.42.2, CEPH_POINT_RELEASE=, name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Feb 23 09:40:10 np0005626463.localdomain systemd[1]: libpod-conmon-88ba0a31e4645b8f92dcaa25d561e540c7f7a21f474e78e567e52f715c2fc895.scope: Deactivated successfully.
Feb 23 09:40:10 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:10.597 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 23 09:40:10 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:10.598 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquired lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 23 09:40:10 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:10.598 282211 DEBUG nova.network.neutron [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 23 09:40:10 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:10.599 282211 DEBUG nova.objects.instance [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c2a7d92b-952f-46a7-8a6a-3322a48fcf4b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 23 09:40:10 np0005626463.localdomain podman[283071]: 
Feb 23 09:40:10 np0005626463.localdomain podman[283071]: 2026-02-23 09:40:10.624349702 +0000 UTC m=+0.076165655 container create fbd159bebd3e63ea983491a72ffbb9e153cbd038e24060d66a16bd725f7cfb65 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hungry_wescoff, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, ceph=True, GIT_BRANCH=main, build-date=2026-02-09T10:25:24Z, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, version=7, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1770267347, description=Red Hat Ceph Storage 7, io.buildah.version=1.42.2, GIT_CLEAN=True, RELEASE=main, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., org.opencontainers.image.created=2026-02-09T10:25:24Z, com.redhat.component=rhceph-container, vcs-type=git, io.openshift.expose-services=)
Feb 23 09:40:10 np0005626463.localdomain systemd[1]: Started libpod-conmon-fbd159bebd3e63ea983491a72ffbb9e153cbd038e24060d66a16bd725f7cfb65.scope.
Feb 23 09:40:10 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 09:40:10 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b4938d3353c8d67acbee5155144060c6cb600aad0564e87bd4775f1129450abd/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Feb 23 09:40:10 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b4938d3353c8d67acbee5155144060c6cb600aad0564e87bd4775f1129450abd/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Feb 23 09:40:10 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b4938d3353c8d67acbee5155144060c6cb600aad0564e87bd4775f1129450abd/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Feb 23 09:40:10 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b4938d3353c8d67acbee5155144060c6cb600aad0564e87bd4775f1129450abd/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Feb 23 09:40:10 np0005626463.localdomain podman[283071]: 2026-02-23 09:40:10.688262026 +0000 UTC m=+0.140077979 container init fbd159bebd3e63ea983491a72ffbb9e153cbd038e24060d66a16bd725f7cfb65 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hungry_wescoff, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.42.2, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, architecture=x86_64, release=1770267347, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, io.openshift.expose-services=, com.redhat.component=rhceph-container, RELEASE=main, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, build-date=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, CEPH_POINT_RELEASE=)
Feb 23 09:40:10 np0005626463.localdomain podman[283071]: 2026-02-23 09:40:10.595951661 +0000 UTC m=+0.047767614 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 23 09:40:10 np0005626463.localdomain podman[283071]: 2026-02-23 09:40:10.69842047 +0000 UTC m=+0.150236423 container start fbd159bebd3e63ea983491a72ffbb9e153cbd038e24060d66a16bd725f7cfb65 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hungry_wescoff, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, build-date=2026-02-09T10:25:24Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, io.buildah.version=1.42.2, RELEASE=main, org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, version=7, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, distribution-scope=public, release=1770267347, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, vendor=Red Hat, Inc.)
Feb 23 09:40:10 np0005626463.localdomain podman[283071]: 2026-02-23 09:40:10.698695129 +0000 UTC m=+0.150511122 container attach fbd159bebd3e63ea983491a72ffbb9e153cbd038e24060d66a16bd725f7cfb65 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hungry_wescoff, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, name=rhceph, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-02-09T10:25:24Z, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, io.openshift.expose-services=, distribution-scope=public, CEPH_POINT_RELEASE=, io.buildah.version=1.42.2, com.redhat.component=rhceph-container, version=7, url=https://catalog.redhat.com/en/search?searchType=containers, release=1770267347, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, build-date=2026-02-09T10:25:24Z, ceph=True)
Feb 23 09:40:10 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:10.969 282211 DEBUG nova.network.neutron [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Updating instance_info_cache with network_info: [{"id": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "address": "fa:16:3e:a0:9d:00", "network": {"id": "9da5b53d-3184-450f-9a5b-bdba1a6c9f6d", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "37b8098efb0d4ecc90b451a2db0e966f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa27e5011-20", "ovs_interfaceid": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 23 09:40:10 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:10.997 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Releasing lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 23 09:40:10 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:10.998 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 23 09:40:10 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:10.999 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:40:10 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:10.999 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:40:11 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:11.000 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:40:11 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:11.000 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:40:11 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:11.000 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:40:11 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:11.001 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:40:11 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:11.002 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 23 09:40:11 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:11.002 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:40:11 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:11.028 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 09:40:11 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:11.029 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 09:40:11 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:11.029 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 09:40:11 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:11.029 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Auditing locally available compute resources for np0005626463.localdomain (node: np0005626463.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 23 09:40:11 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:11.030 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 09:40:11 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.
Feb 23 09:40:11 np0005626463.localdomain podman[283101]: 2026-02-23 09:40:11.181201894 +0000 UTC m=+0.094996909 container health_status bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 23 09:40:11 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.
Feb 23 09:40:11 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-700b97ceb2cbb356b789ac9537ae2c72b3f967a1d6a727b059b40411a7d5bd36-merged.mount: Deactivated successfully.
Feb 23 09:40:11 np0005626463.localdomain podman[283101]: 2026-02-23 09:40:11.222781314 +0000 UTC m=+0.136576369 container exec_died bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Feb 23 09:40:11 np0005626463.localdomain systemd[1]: bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.service: Deactivated successfully.
Feb 23 09:40:11 np0005626463.localdomain podman[283153]: 2026-02-23 09:40:11.306036358 +0000 UTC m=+0.082745149 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Feb 23 09:40:11 np0005626463.localdomain podman[283153]: 2026-02-23 09:40:11.402349997 +0000 UTC m=+0.179058748 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.43.0, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=8419493e1fd846703d277695e03fc5eb)
Feb 23 09:40:11 np0005626463.localdomain systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully.
Feb 23 09:40:11 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:11.494 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.464s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 09:40:11 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:11.545 282211 DEBUG nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 23 09:40:11 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:11.546 282211 DEBUG nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 23 09:40:11 np0005626463.localdomain hungry_wescoff[283086]: [
Feb 23 09:40:11 np0005626463.localdomain hungry_wescoff[283086]:     {
Feb 23 09:40:11 np0005626463.localdomain hungry_wescoff[283086]:         "available": false,
Feb 23 09:40:11 np0005626463.localdomain hungry_wescoff[283086]:         "ceph_device": false,
Feb 23 09:40:11 np0005626463.localdomain hungry_wescoff[283086]:         "device_id": "QEMU_DVD-ROM_QM00001",
Feb 23 09:40:11 np0005626463.localdomain hungry_wescoff[283086]:         "lsm_data": {},
Feb 23 09:40:11 np0005626463.localdomain hungry_wescoff[283086]:         "lvs": [],
Feb 23 09:40:11 np0005626463.localdomain hungry_wescoff[283086]:         "path": "/dev/sr0",
Feb 23 09:40:11 np0005626463.localdomain hungry_wescoff[283086]:         "rejected_reasons": [
Feb 23 09:40:11 np0005626463.localdomain hungry_wescoff[283086]:             "Insufficient space (<5GB)",
Feb 23 09:40:11 np0005626463.localdomain hungry_wescoff[283086]:             "Has a FileSystem"
Feb 23 09:40:11 np0005626463.localdomain hungry_wescoff[283086]:         ],
Feb 23 09:40:11 np0005626463.localdomain hungry_wescoff[283086]:         "sys_api": {
Feb 23 09:40:11 np0005626463.localdomain hungry_wescoff[283086]:             "actuators": null,
Feb 23 09:40:11 np0005626463.localdomain hungry_wescoff[283086]:             "device_nodes": "sr0",
Feb 23 09:40:11 np0005626463.localdomain hungry_wescoff[283086]:             "human_readable_size": "482.00 KB",
Feb 23 09:40:11 np0005626463.localdomain hungry_wescoff[283086]:             "id_bus": "ata",
Feb 23 09:40:11 np0005626463.localdomain hungry_wescoff[283086]:             "model": "QEMU DVD-ROM",
Feb 23 09:40:11 np0005626463.localdomain hungry_wescoff[283086]:             "nr_requests": "2",
Feb 23 09:40:11 np0005626463.localdomain hungry_wescoff[283086]:             "partitions": {},
Feb 23 09:40:11 np0005626463.localdomain hungry_wescoff[283086]:             "path": "/dev/sr0",
Feb 23 09:40:11 np0005626463.localdomain hungry_wescoff[283086]:             "removable": "1",
Feb 23 09:40:11 np0005626463.localdomain hungry_wescoff[283086]:             "rev": "2.5+",
Feb 23 09:40:11 np0005626463.localdomain hungry_wescoff[283086]:             "ro": "0",
Feb 23 09:40:11 np0005626463.localdomain hungry_wescoff[283086]:             "rotational": "1",
Feb 23 09:40:11 np0005626463.localdomain hungry_wescoff[283086]:             "sas_address": "",
Feb 23 09:40:11 np0005626463.localdomain hungry_wescoff[283086]:             "sas_device_handle": "",
Feb 23 09:40:11 np0005626463.localdomain hungry_wescoff[283086]:             "scheduler_mode": "mq-deadline",
Feb 23 09:40:11 np0005626463.localdomain hungry_wescoff[283086]:             "sectors": 0,
Feb 23 09:40:11 np0005626463.localdomain hungry_wescoff[283086]:             "sectorsize": "2048",
Feb 23 09:40:11 np0005626463.localdomain hungry_wescoff[283086]:             "size": 493568.0,
Feb 23 09:40:11 np0005626463.localdomain hungry_wescoff[283086]:             "support_discard": "0",
Feb 23 09:40:11 np0005626463.localdomain hungry_wescoff[283086]:             "type": "disk",
Feb 23 09:40:11 np0005626463.localdomain hungry_wescoff[283086]:             "vendor": "QEMU"
Feb 23 09:40:11 np0005626463.localdomain hungry_wescoff[283086]:         }
Feb 23 09:40:11 np0005626463.localdomain hungry_wescoff[283086]:     }
Feb 23 09:40:11 np0005626463.localdomain hungry_wescoff[283086]: ]
Feb 23 09:40:11 np0005626463.localdomain systemd[1]: libpod-fbd159bebd3e63ea983491a72ffbb9e153cbd038e24060d66a16bd725f7cfb65.scope: Deactivated successfully.
Feb 23 09:40:11 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:11.703 282211 WARNING nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 23 09:40:11 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:11.704 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Hypervisor/Node resource view: name=np0005626463.localdomain free_ram=12525MB free_disk=41.836727142333984GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 23 09:40:11 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:11.704 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 09:40:11 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:11.704 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 09:40:11 np0005626463.localdomain podman[284893]: 2026-02-23 09:40:11.720194791 +0000 UTC m=+0.056652479 container died fbd159bebd3e63ea983491a72ffbb9e153cbd038e24060d66a16bd725f7cfb65 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hungry_wescoff, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, io.buildah.version=1.42.2, org.opencontainers.image.created=2026-02-09T10:25:24Z, distribution-scope=public, io.openshift.tags=rhceph ceph, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, build-date=2026-02-09T10:25:24Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, RELEASE=main, release=1770267347, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, version=7, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container)
Feb 23 09:40:11 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-b4938d3353c8d67acbee5155144060c6cb600aad0564e87bd4775f1129450abd-merged.mount: Deactivated successfully.
Feb 23 09:40:11 np0005626463.localdomain podman[284893]: 2026-02-23 09:40:11.757988014 +0000 UTC m=+0.094445672 container remove fbd159bebd3e63ea983491a72ffbb9e153cbd038e24060d66a16bd725f7cfb65 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hungry_wescoff, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, ceph=True, build-date=2026-02-09T10:25:24Z, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.buildah.version=1.42.2, RELEASE=main, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, release=1770267347, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, CEPH_POINT_RELEASE=, vcs-type=git)
Feb 23 09:40:11 np0005626463.localdomain systemd[1]: libpod-conmon-fbd159bebd3e63ea983491a72ffbb9e153cbd038e24060d66a16bd725f7cfb65.scope: Deactivated successfully.
Feb 23 09:40:11 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:11.769 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Instance c2a7d92b-952f-46a7-8a6a-3322a48fcf4b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 23 09:40:11 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:11.770 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 23 09:40:11 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:11.770 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Final resource view: name=np0005626463.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 23 09:40:11 np0005626463.localdomain sudo[282971]: pam_unix(sudo:session): session closed for user root
Feb 23 09:40:11 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:11.813 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 09:40:12 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:12.178 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:40:12 np0005626463.localdomain sudo[284929]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 09:40:12 np0005626463.localdomain sudo[284929]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:40:12 np0005626463.localdomain sudo[284929]: pam_unix(sudo:session): session closed for user root
Feb 23 09:40:12 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:12.262 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.448s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 09:40:12 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:12.269 282211 DEBUG nova.compute.provider_tree [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Inventory has not changed in ProviderTree for provider: be63d86c-a403-4ec9-a515-07ea2962cb4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 23 09:40:12 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:12.288 282211 DEBUG nova.scheduler.client.report [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Inventory has not changed for provider be63d86c-a403-4ec9-a515-07ea2962cb4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 23 09:40:12 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:12.314 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Compute_service record updated for np0005626463.localdomain:np0005626463.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 23 09:40:12 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:12.314 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.610s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 09:40:13 np0005626463.localdomain openstack_network_exporter[245358]: ERROR   09:40:13 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 23 09:40:13 np0005626463.localdomain openstack_network_exporter[245358]: 
Feb 23 09:40:13 np0005626463.localdomain openstack_network_exporter[245358]: ERROR   09:40:13 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 23 09:40:13 np0005626463.localdomain openstack_network_exporter[245358]: 
Feb 23 09:40:13 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:13.728 282211 DEBUG nova.compute.manager [None req-9934a4b8-a3d9-46de-b80c-4d348a41d6ff cb6895487918456aa599ca2f76872d00 37b8098efb0d4ecc90b451a2db0e966f - - default default] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 23 09:40:13 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:13.767 282211 ERROR oslo_messaging.rpc.server [None req-9934a4b8-a3d9-46de-b80c-4d348a41d6ff cb6895487918456aa599ca2f76872d00 37b8098efb0d4ecc90b451a2db0e966f - - default default] Exception during message handling: nova.exception.InstanceInvalidState: Instance c2a7d92b-952f-46a7-8a6a-3322a48fcf4b in power state shutdown. Cannot get_diagnostics while the instance is in this state.
Feb 23 09:40:13 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:13.767 282211 ERROR oslo_messaging.rpc.server Traceback (most recent call last):
Feb 23 09:40:13 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:13.767 282211 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/server.py", line 165, in _process_incoming
Feb 23 09:40:13 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:13.767 282211 ERROR oslo_messaging.rpc.server     res = self.dispatcher.dispatch(message)
Feb 23 09:40:13 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:13.767 282211 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/dispatcher.py", line 309, in dispatch
Feb 23 09:40:13 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:13.767 282211 ERROR oslo_messaging.rpc.server     return self._do_dispatch(endpoint, method, ctxt, args)
Feb 23 09:40:13 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:13.767 282211 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/dispatcher.py", line 229, in _do_dispatch
Feb 23 09:40:13 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:13.767 282211 ERROR oslo_messaging.rpc.server     result = func(ctxt, **new_args)
Feb 23 09:40:13 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:13.767 282211 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/nova/exception_wrapper.py", line 71, in wrapped
Feb 23 09:40:13 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:13.767 282211 ERROR oslo_messaging.rpc.server     _emit_versioned_exception_notification(
Feb 23 09:40:13 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:13.767 282211 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 227, in __exit__
Feb 23 09:40:13 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:13.767 282211 ERROR oslo_messaging.rpc.server     self.force_reraise()
Feb 23 09:40:13 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:13.767 282211 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
Feb 23 09:40:13 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:13.767 282211 ERROR oslo_messaging.rpc.server     raise self.value
Feb 23 09:40:13 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:13.767 282211 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/nova/exception_wrapper.py", line 63, in wrapped
Feb 23 09:40:13 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:13.767 282211 ERROR oslo_messaging.rpc.server     return f(self, context, *args, **kw)
Feb 23 09:40:13 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:13.767 282211 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 214, in decorated_function
Feb 23 09:40:13 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:13.767 282211 ERROR oslo_messaging.rpc.server     compute_utils.add_instance_fault_from_exc(context,
Feb 23 09:40:13 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:13.767 282211 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 227, in __exit__
Feb 23 09:40:13 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:13.767 282211 ERROR oslo_messaging.rpc.server     self.force_reraise()
Feb 23 09:40:13 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:13.767 282211 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
Feb 23 09:40:13 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:13.767 282211 ERROR oslo_messaging.rpc.server     raise self.value
Feb 23 09:40:13 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:13.767 282211 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 203, in decorated_function
Feb 23 09:40:13 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:13.767 282211 ERROR oslo_messaging.rpc.server     return function(self, context, *args, **kwargs)
Feb 23 09:40:13 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:13.767 282211 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 6739, in get_instance_diagnostics
Feb 23 09:40:13 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:13.767 282211 ERROR oslo_messaging.rpc.server     raise exception.InstanceInvalidState(
Feb 23 09:40:13 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:13.767 282211 ERROR oslo_messaging.rpc.server nova.exception.InstanceInvalidState: Instance c2a7d92b-952f-46a7-8a6a-3322a48fcf4b in power state shutdown. Cannot get_diagnostics while the instance is in this state.
Feb 23 09:40:13 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:13.767 282211 ERROR oslo_messaging.rpc.server 
Feb 23 09:40:13 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.
Feb 23 09:40:13 np0005626463.localdomain systemd[1]: tmp-crun.Kn6QSX.mount: Deactivated successfully.
Feb 23 09:40:13 np0005626463.localdomain podman[284949]: 2026-02-23 09:40:13.924303556 +0000 UTC m=+0.097275020 container health_status be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ceilometer_agent_compute, io.buildah.version=1.43.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, container_name=ceilometer_agent_compute)
Feb 23 09:40:13 np0005626463.localdomain podman[284949]: 2026-02-23 09:40:13.938261939 +0000 UTC m=+0.111233373 container exec_died be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Feb 23 09:40:13 np0005626463.localdomain systemd[1]: be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.service: Deactivated successfully.
Feb 23 09:40:15 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:15.410 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:40:17 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:17.219 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:40:19 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:40:19Z|00060|memory_trim|INFO|Detected inactivity (last active 30008 ms ago): trimming memory
Feb 23 09:40:19 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.
Feb 23 09:40:19 np0005626463.localdomain podman[284970]: 2026-02-23 09:40:19.896345099 +0000 UTC m=+0.076335350 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 23 09:40:19 np0005626463.localdomain podman[284970]: 2026-02-23 09:40:19.927526717 +0000 UTC m=+0.107516938 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, managed_by=edpm_ansible, org.label-schema.build-date=20260216, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.43.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent)
Feb 23 09:40:19 np0005626463.localdomain systemd[1]: 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully.
Feb 23 09:40:20 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:20.412 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:40:21 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:21.177 282211 DEBUG nova.objects.instance [None req-48eeab65-47b9-4b45-ac13-90d08f01960e cb6895487918456aa599ca2f76872d00 37b8098efb0d4ecc90b451a2db0e966f - - default default] Lazy-loading 'flavor' on Instance uuid c2a7d92b-952f-46a7-8a6a-3322a48fcf4b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 23 09:40:21 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:21.200 282211 DEBUG oslo_concurrency.lockutils [None req-48eeab65-47b9-4b45-ac13-90d08f01960e cb6895487918456aa599ca2f76872d00 37b8098efb0d4ecc90b451a2db0e966f - - default default] Acquiring lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 23 09:40:21 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:21.200 282211 DEBUG oslo_concurrency.lockutils [None req-48eeab65-47b9-4b45-ac13-90d08f01960e cb6895487918456aa599ca2f76872d00 37b8098efb0d4ecc90b451a2db0e966f - - default default] Acquired lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 23 09:40:21 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:21.200 282211 DEBUG nova.network.neutron [None req-48eeab65-47b9-4b45-ac13-90d08f01960e cb6895487918456aa599ca2f76872d00 37b8098efb0d4ecc90b451a2db0e966f - - default default] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 23 09:40:21 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:21.201 282211 DEBUG nova.objects.instance [None req-48eeab65-47b9-4b45-ac13-90d08f01960e cb6895487918456aa599ca2f76872d00 37b8098efb0d4ecc90b451a2db0e966f - - default default] Lazy-loading 'info_cache' on Instance uuid c2a7d92b-952f-46a7-8a6a-3322a48fcf4b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 23 09:40:21 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:21.920 282211 DEBUG nova.network.neutron [None req-48eeab65-47b9-4b45-ac13-90d08f01960e cb6895487918456aa599ca2f76872d00 37b8098efb0d4ecc90b451a2db0e966f - - default default] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Updating instance_info_cache with network_info: [{"id": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "address": "fa:16:3e:a0:9d:00", "network": {"id": "9da5b53d-3184-450f-9a5b-bdba1a6c9f6d", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "37b8098efb0d4ecc90b451a2db0e966f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa27e5011-20", "ovs_interfaceid": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 23 09:40:21 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:21.934 282211 DEBUG oslo_concurrency.lockutils [None req-48eeab65-47b9-4b45-ac13-90d08f01960e cb6895487918456aa599ca2f76872d00 37b8098efb0d4ecc90b451a2db0e966f - - default default] Releasing lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 23 09:40:21 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:21.965 282211 INFO nova.virt.libvirt.driver [-] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Instance destroyed successfully.
Feb 23 09:40:21 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:21.966 282211 DEBUG nova.objects.instance [None req-48eeab65-47b9-4b45-ac13-90d08f01960e cb6895487918456aa599ca2f76872d00 37b8098efb0d4ecc90b451a2db0e966f - - default default] Lazy-loading 'numa_topology' on Instance uuid c2a7d92b-952f-46a7-8a6a-3322a48fcf4b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 23 09:40:21 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:21.978 282211 DEBUG nova.objects.instance [None req-48eeab65-47b9-4b45-ac13-90d08f01960e cb6895487918456aa599ca2f76872d00 37b8098efb0d4ecc90b451a2db0e966f - - default default] Lazy-loading 'resources' on Instance uuid c2a7d92b-952f-46a7-8a6a-3322a48fcf4b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 23 09:40:21 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:21.991 282211 DEBUG nova.virt.libvirt.vif [None req-48eeab65-47b9-4b45-ac13-90d08f01960e cb6895487918456aa599ca2f76872d00 37b8098efb0d4ecc90b451a2db0e966f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-23T08:22:59Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='test',display_name='test',ec2_ids=<?>,ephemeral_gb=1,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(3),hidden=False,host='np0005626463.localdomain',hostname='test',id=3,image_ref='a9204248-210d-45b5-ab0a-d1ec08a73a4f',info_cache=InstanceInfoCache,instance_type_id=3,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-23T08:23:11Z,launched_on='np0005626463.localdomain',locked=False,locked_by=None,memory_mb=512,metadata={},migration_context=<?>,new_flavor=None,node='np0005626463.localdomain',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='37b8098efb0d4ecc90b451a2db0e966f',ramdisk_id='',reservation_id='r-90tij075',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member,admin',image_base_image_ref='a9204248-210d-45b5-ab0a-d1ec08a73a4f',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='pc-q35-rhel9.0.0',image_hw_pointer_model='usbtablet',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros',image_owner_specified.openstack.sha256='',owner_project_name='admin',owner_user_name='admin'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-23T09:39:50Z,user_data=None,user_id='cb6895487918456aa599ca2f76872d00',uuid=c2a7d92b-952f-46a7-8a6a-3322a48fcf4b,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "address": "fa:16:3e:a0:9d:00", "network": {"id": "9da5b53d-3184-450f-9a5b-bdba1a6c9f6d", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "37b8098efb0d4ecc90b451a2db0e966f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa27e5011-20", "ovs_interfaceid": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Feb 23 09:40:21 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:21.992 282211 DEBUG nova.network.os_vif_util [None req-48eeab65-47b9-4b45-ac13-90d08f01960e cb6895487918456aa599ca2f76872d00 37b8098efb0d4ecc90b451a2db0e966f - - default default] Converting VIF {"id": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "address": "fa:16:3e:a0:9d:00", "network": {"id": "9da5b53d-3184-450f-9a5b-bdba1a6c9f6d", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "37b8098efb0d4ecc90b451a2db0e966f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa27e5011-20", "ovs_interfaceid": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 23 09:40:21 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:21.993 282211 DEBUG nova.network.os_vif_util [None req-48eeab65-47b9-4b45-ac13-90d08f01960e cb6895487918456aa599ca2f76872d00 37b8098efb0d4ecc90b451a2db0e966f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a0:9d:00,bridge_name='br-int',has_traffic_filtering=True,id=a27e5011-2016-4b16-b5e8-04b555b30bc4,network=Network(9da5b53d-3184-450f-9a5b-bdba1a6c9f6d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa27e5011-20') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 23 09:40:21 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:21.994 282211 DEBUG os_vif [None req-48eeab65-47b9-4b45-ac13-90d08f01960e cb6895487918456aa599ca2f76872d00 37b8098efb0d4ecc90b451a2db0e966f - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a0:9d:00,bridge_name='br-int',has_traffic_filtering=True,id=a27e5011-2016-4b16-b5e8-04b555b30bc4,network=Network(9da5b53d-3184-450f-9a5b-bdba1a6c9f6d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa27e5011-20') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Feb 23 09:40:21 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:21.997 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:40:21 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:21.997 282211 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa27e5011-20, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 09:40:22 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:22.040 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:40:22 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:22.041 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:40:22 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:22.044 282211 INFO os_vif [None req-48eeab65-47b9-4b45-ac13-90d08f01960e cb6895487918456aa599ca2f76872d00 37b8098efb0d4ecc90b451a2db0e966f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a0:9d:00,bridge_name='br-int',has_traffic_filtering=True,id=a27e5011-2016-4b16-b5e8-04b555b30bc4,network=Network(9da5b53d-3184-450f-9a5b-bdba1a6c9f6d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa27e5011-20')
Feb 23 09:40:22 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:22.047 282211 DEBUG nova.virt.libvirt.host [None req-48eeab65-47b9-4b45-ac13-90d08f01960e cb6895487918456aa599ca2f76872d00 37b8098efb0d4ecc90b451a2db0e966f - - default default] Checking UEFI support for host arch (x86_64) supports_uefi /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1754
Feb 23 09:40:22 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:22.047 282211 INFO nova.virt.libvirt.host [None req-48eeab65-47b9-4b45-ac13-90d08f01960e cb6895487918456aa599ca2f76872d00 37b8098efb0d4ecc90b451a2db0e966f - - default default] UEFI support detected
Feb 23 09:40:22 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:22.055 282211 DEBUG nova.virt.libvirt.driver [None req-48eeab65-47b9-4b45-ac13-90d08f01960e cb6895487918456aa599ca2f76872d00 37b8098efb0d4ecc90b451a2db0e966f - - default default] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Start _get_guest_xml network_info=[{"id": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "address": "fa:16:3e:a0:9d:00", "network": {"id": "9da5b53d-3184-450f-9a5b-bdba1a6c9f6d", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "37b8098efb0d4ecc90b451a2db0e966f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa27e5011-20", "ovs_interfaceid": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.eph0': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}}} image_meta=ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=a9204248-210d-45b5-ab0a-d1ec08a73a4f,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'size': 0, 'encryption_format': None, 'disk_bus': 'virtio', 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'boot_index': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'image_id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}], 'ephemerals': [{'encrypted': False, 'size': 1, 'encryption_format': None, 'disk_bus': 'virtio', 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'device_name': '/dev/vdb', 'encryption_secret_uuid': None}], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Feb 23 09:40:22 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:22.060 282211 WARNING nova.virt.libvirt.driver [None req-48eeab65-47b9-4b45-ac13-90d08f01960e cb6895487918456aa599ca2f76872d00 37b8098efb0d4ecc90b451a2db0e966f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 23 09:40:22 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:22.063 282211 DEBUG nova.virt.libvirt.host [None req-48eeab65-47b9-4b45-ac13-90d08f01960e cb6895487918456aa599ca2f76872d00 37b8098efb0d4ecc90b451a2db0e966f - - default default] Searching host: 'np0005626463.localdomain' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Feb 23 09:40:22 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:22.063 282211 DEBUG nova.virt.libvirt.host [None req-48eeab65-47b9-4b45-ac13-90d08f01960e cb6895487918456aa599ca2f76872d00 37b8098efb0d4ecc90b451a2db0e966f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Feb 23 09:40:22 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:22.066 282211 DEBUG nova.virt.libvirt.host [None req-48eeab65-47b9-4b45-ac13-90d08f01960e cb6895487918456aa599ca2f76872d00 37b8098efb0d4ecc90b451a2db0e966f - - default default] Searching host: 'np0005626463.localdomain' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Feb 23 09:40:22 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:22.067 282211 DEBUG nova.virt.libvirt.host [None req-48eeab65-47b9-4b45-ac13-90d08f01960e cb6895487918456aa599ca2f76872d00 37b8098efb0d4ecc90b451a2db0e966f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Feb 23 09:40:22 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:22.068 282211 DEBUG nova.virt.libvirt.driver [None req-48eeab65-47b9-4b45-ac13-90d08f01960e cb6895487918456aa599ca2f76872d00 37b8098efb0d4ecc90b451a2db0e966f - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Feb 23 09:40:22 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:22.068 282211 DEBUG nova.virt.hardware [None req-48eeab65-47b9-4b45-ac13-90d08f01960e cb6895487918456aa599ca2f76872d00 37b8098efb0d4ecc90b451a2db0e966f - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-02-23T08:22:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=1,extra_specs={},flavorid='c13b1f72-534e-4f1d-8659-0e8f3a2c7d53',id=3,is_public=True,memory_mb=512,name='m1.small',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=a9204248-210d-45b5-ab0a-d1ec08a73a4f,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Feb 23 09:40:22 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:22.069 282211 DEBUG nova.virt.hardware [None req-48eeab65-47b9-4b45-ac13-90d08f01960e cb6895487918456aa599ca2f76872d00 37b8098efb0d4ecc90b451a2db0e966f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Feb 23 09:40:22 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:22.070 282211 DEBUG nova.virt.hardware [None req-48eeab65-47b9-4b45-ac13-90d08f01960e cb6895487918456aa599ca2f76872d00 37b8098efb0d4ecc90b451a2db0e966f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Feb 23 09:40:22 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:22.070 282211 DEBUG nova.virt.hardware [None req-48eeab65-47b9-4b45-ac13-90d08f01960e cb6895487918456aa599ca2f76872d00 37b8098efb0d4ecc90b451a2db0e966f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Feb 23 09:40:22 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:22.071 282211 DEBUG nova.virt.hardware [None req-48eeab65-47b9-4b45-ac13-90d08f01960e cb6895487918456aa599ca2f76872d00 37b8098efb0d4ecc90b451a2db0e966f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Feb 23 09:40:22 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:22.071 282211 DEBUG nova.virt.hardware [None req-48eeab65-47b9-4b45-ac13-90d08f01960e cb6895487918456aa599ca2f76872d00 37b8098efb0d4ecc90b451a2db0e966f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Feb 23 09:40:22 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:22.072 282211 DEBUG nova.virt.hardware [None req-48eeab65-47b9-4b45-ac13-90d08f01960e cb6895487918456aa599ca2f76872d00 37b8098efb0d4ecc90b451a2db0e966f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Feb 23 09:40:22 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:22.072 282211 DEBUG nova.virt.hardware [None req-48eeab65-47b9-4b45-ac13-90d08f01960e cb6895487918456aa599ca2f76872d00 37b8098efb0d4ecc90b451a2db0e966f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Feb 23 09:40:22 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:22.073 282211 DEBUG nova.virt.hardware [None req-48eeab65-47b9-4b45-ac13-90d08f01960e cb6895487918456aa599ca2f76872d00 37b8098efb0d4ecc90b451a2db0e966f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Feb 23 09:40:22 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:22.073 282211 DEBUG nova.virt.hardware [None req-48eeab65-47b9-4b45-ac13-90d08f01960e cb6895487918456aa599ca2f76872d00 37b8098efb0d4ecc90b451a2db0e966f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Feb 23 09:40:22 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:22.074 282211 DEBUG nova.virt.hardware [None req-48eeab65-47b9-4b45-ac13-90d08f01960e cb6895487918456aa599ca2f76872d00 37b8098efb0d4ecc90b451a2db0e966f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Feb 23 09:40:22 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:22.074 282211 DEBUG nova.objects.instance [None req-48eeab65-47b9-4b45-ac13-90d08f01960e cb6895487918456aa599ca2f76872d00 37b8098efb0d4ecc90b451a2db0e966f - - default default] Lazy-loading 'vcpu_model' on Instance uuid c2a7d92b-952f-46a7-8a6a-3322a48fcf4b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 23 09:40:22 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:22.136 282211 DEBUG nova.privsep.utils [None req-48eeab65-47b9-4b45-ac13-90d08f01960e cb6895487918456aa599ca2f76872d00 37b8098efb0d4ecc90b451a2db0e966f - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Feb 23 09:40:22 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:22.137 282211 DEBUG oslo_concurrency.processutils [None req-48eeab65-47b9-4b45-ac13-90d08f01960e cb6895487918456aa599ca2f76872d00 37b8098efb0d4ecc90b451a2db0e966f - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 09:40:22 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:22.220 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:40:22 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:22.594 282211 DEBUG oslo_concurrency.processutils [None req-48eeab65-47b9-4b45-ac13-90d08f01960e cb6895487918456aa599ca2f76872d00 37b8098efb0d4ecc90b451a2db0e966f - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.457s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 09:40:22 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:22.595 282211 DEBUG oslo_concurrency.processutils [None req-48eeab65-47b9-4b45-ac13-90d08f01960e cb6895487918456aa599ca2f76872d00 37b8098efb0d4ecc90b451a2db0e966f - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:23.009 282211 DEBUG oslo_concurrency.processutils [None req-48eeab65-47b9-4b45-ac13-90d08f01960e cb6895487918456aa599ca2f76872d00 37b8098efb0d4ecc90b451a2db0e966f - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.413s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:23.014 282211 DEBUG nova.virt.libvirt.vif [None req-48eeab65-47b9-4b45-ac13-90d08f01960e cb6895487918456aa599ca2f76872d00 37b8098efb0d4ecc90b451a2db0e966f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-23T08:22:59Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='test',display_name='test',ec2_ids=<?>,ephemeral_gb=1,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(3),hidden=False,host='np0005626463.localdomain',hostname='test',id=3,image_ref='a9204248-210d-45b5-ab0a-d1ec08a73a4f',info_cache=InstanceInfoCache,instance_type_id=3,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-23T08:23:11Z,launched_on='np0005626463.localdomain',locked=False,locked_by=None,memory_mb=512,metadata={},migration_context=<?>,new_flavor=None,node='np0005626463.localdomain',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='37b8098efb0d4ecc90b451a2db0e966f',ramdisk_id='',reservation_id='r-90tij075',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member,admin',image_base_image_ref='a9204248-210d-45b5-ab0a-d1ec08a73a4f',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='pc-q35-rhel9.0.0',image_hw_pointer_model='usbtablet',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros',image_owner_specified.openstack.sha256='',owner_project_name='admin',owner_user_name='admin'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-23T09:39:50Z,user_data=None,user_id='cb6895487918456aa599ca2f76872d00',uuid=c2a7d92b-952f-46a7-8a6a-3322a48fcf4b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "address": "fa:16:3e:a0:9d:00", "network": {"id": "9da5b53d-3184-450f-9a5b-bdba1a6c9f6d", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "37b8098efb0d4ecc90b451a2db0e966f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa27e5011-20", "ovs_interfaceid": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:23.015 282211 DEBUG nova.network.os_vif_util [None req-48eeab65-47b9-4b45-ac13-90d08f01960e cb6895487918456aa599ca2f76872d00 37b8098efb0d4ecc90b451a2db0e966f - - default default] Converting VIF {"id": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "address": "fa:16:3e:a0:9d:00", "network": {"id": "9da5b53d-3184-450f-9a5b-bdba1a6c9f6d", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "37b8098efb0d4ecc90b451a2db0e966f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa27e5011-20", "ovs_interfaceid": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:23.017 282211 DEBUG nova.network.os_vif_util [None req-48eeab65-47b9-4b45-ac13-90d08f01960e cb6895487918456aa599ca2f76872d00 37b8098efb0d4ecc90b451a2db0e966f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a0:9d:00,bridge_name='br-int',has_traffic_filtering=True,id=a27e5011-2016-4b16-b5e8-04b555b30bc4,network=Network(9da5b53d-3184-450f-9a5b-bdba1a6c9f6d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa27e5011-20') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:23.020 282211 DEBUG nova.objects.instance [None req-48eeab65-47b9-4b45-ac13-90d08f01960e cb6895487918456aa599ca2f76872d00 37b8098efb0d4ecc90b451a2db0e966f - - default default] Lazy-loading 'pci_devices' on Instance uuid c2a7d92b-952f-46a7-8a6a-3322a48fcf4b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:23.042 282211 DEBUG nova.virt.libvirt.driver [None req-48eeab65-47b9-4b45-ac13-90d08f01960e cb6895487918456aa599ca2f76872d00 37b8098efb0d4ecc90b451a2db0e966f - - default default] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] End _get_guest_xml xml=<domain type="kvm">
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]:   <uuid>c2a7d92b-952f-46a7-8a6a-3322a48fcf4b</uuid>
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]:   <name>instance-00000003</name>
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]:   <memory>524288</memory>
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]:   <vcpu>1</vcpu>
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]:   <metadata>
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]:       <nova:package version="27.5.2-0.20260220085704.5cfeecb.el9"/>
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]:       <nova:name>test</nova:name>
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]:       <nova:creationTime>2026-02-23 09:40:22</nova:creationTime>
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]:       <nova:flavor name="m1.small">
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]:         <nova:memory>512</nova:memory>
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]:         <nova:disk>1</nova:disk>
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]:         <nova:swap>0</nova:swap>
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]:         <nova:ephemeral>1</nova:ephemeral>
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]:         <nova:vcpus>1</nova:vcpus>
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]:       </nova:flavor>
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]:       <nova:owner>
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]:         <nova:user uuid="cb6895487918456aa599ca2f76872d00">admin</nova:user>
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]:         <nova:project uuid="37b8098efb0d4ecc90b451a2db0e966f">admin</nova:project>
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]:       </nova:owner>
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]:       <nova:root type="image" uuid="a9204248-210d-45b5-ab0a-d1ec08a73a4f"/>
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]:       <nova:ports>
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]:         <nova:port uuid="a27e5011-2016-4b16-b5e8-04b555b30bc4">
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]:           <nova:ip type="fixed" address="192.168.0.12" ipVersion="4"/>
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]:         </nova:port>
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]:       </nova:ports>
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]:     </nova:instance>
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]:   </metadata>
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]:   <sysinfo type="smbios">
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]:     <system>
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]:       <entry name="manufacturer">RDO</entry>
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]:       <entry name="product">OpenStack Compute</entry>
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]:       <entry name="version">27.5.2-0.20260220085704.5cfeecb.el9</entry>
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]:       <entry name="serial">c2a7d92b-952f-46a7-8a6a-3322a48fcf4b</entry>
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]:       <entry name="uuid">c2a7d92b-952f-46a7-8a6a-3322a48fcf4b</entry>
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]:       <entry name="family">Virtual Machine</entry>
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]:     </system>
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]:   </sysinfo>
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]:   <os>
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]:     <type arch="x86_64" machine="pc-q35-rhel9.0.0">hvm</type>
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]:     <boot dev="hd"/>
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]:     <smbios mode="sysinfo"/>
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]:   </os>
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]:   <features>
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]:     <acpi/>
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]:     <apic/>
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]:     <vmcoreinfo/>
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]:   </features>
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]:   <clock offset="utc">
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]:     <timer name="pit" tickpolicy="delay"/>
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]:     <timer name="rtc" tickpolicy="catchup"/>
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]:     <timer name="hpet" present="no"/>
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]:   </clock>
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]:   <cpu mode="host-model" match="exact">
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]:     <topology sockets="1" cores="1" threads="1"/>
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]:   </cpu>
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]:   <devices>
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]:     <disk type="network" device="disk">
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]:       <driver type="raw" cache="none"/>
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]:       <source protocol="rbd" name="vms/c2a7d92b-952f-46a7-8a6a-3322a48fcf4b_disk">
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]:         <host name="172.18.0.103" port="6789"/>
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]:         <host name="172.18.0.105" port="6789"/>
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]:         <host name="172.18.0.104" port="6789"/>
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]:       </source>
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]:       <auth username="openstack">
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]:         <secret type="ceph" uuid="f1fea371-cb69-578d-a3d0-b5c472a84b46"/>
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]:       </auth>
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]:       <target dev="vda" bus="virtio"/>
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]:     </disk>
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]:     <disk type="network" device="disk">
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]:       <driver type="raw" cache="none"/>
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]:       <source protocol="rbd" name="vms/c2a7d92b-952f-46a7-8a6a-3322a48fcf4b_disk.eph0">
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]:         <host name="172.18.0.103" port="6789"/>
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]:         <host name="172.18.0.105" port="6789"/>
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]:         <host name="172.18.0.104" port="6789"/>
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]:       </source>
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]:       <auth username="openstack">
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]:         <secret type="ceph" uuid="f1fea371-cb69-578d-a3d0-b5c472a84b46"/>
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]:       </auth>
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]:       <target dev="vdb" bus="virtio"/>
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]:     </disk>
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]:     <interface type="ethernet">
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]:       <mac address="fa:16:3e:a0:9d:00"/>
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]:       <model type="virtio"/>
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]:       <driver name="vhost" rx_queue_size="512"/>
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]:       <mtu size="1292"/>
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]:       <target dev="tapa27e5011-20"/>
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]:     </interface>
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]:     <serial type="pty">
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]:       <log file="/var/lib/nova/instances/c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/console.log" append="off"/>
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]:     </serial>
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]:     <video>
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]:       <model type="virtio"/>
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]:     </video>
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]:     <input type="tablet" bus="usb"/>
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]:     <input type="keyboard" bus="usb"/>
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]:     <rng model="virtio">
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]:       <backend model="random">/dev/urandom</backend>
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]:     </rng>
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]:     <controller type="pci" model="pcie-root"/>
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]:     <controller type="pci" model="pcie-root-port"/>
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]:     <controller type="usb" index="0"/>
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]:     <memballoon model="virtio">
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]:       <stats period="10"/>
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]:     </memballoon>
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]:   </devices>
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]: </domain>
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:23.044 282211 DEBUG nova.virt.libvirt.driver [None req-48eeab65-47b9-4b45-ac13-90d08f01960e cb6895487918456aa599ca2f76872d00 37b8098efb0d4ecc90b451a2db0e966f - - default default] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:23.045 282211 DEBUG nova.virt.libvirt.driver [None req-48eeab65-47b9-4b45-ac13-90d08f01960e cb6895487918456aa599ca2f76872d00 37b8098efb0d4ecc90b451a2db0e966f - - default default] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:23.046 282211 DEBUG nova.virt.libvirt.vif [None req-48eeab65-47b9-4b45-ac13-90d08f01960e cb6895487918456aa599ca2f76872d00 37b8098efb0d4ecc90b451a2db0e966f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-23T08:22:59Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='test',display_name='test',ec2_ids=<?>,ephemeral_gb=1,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(3),hidden=False,host='np0005626463.localdomain',hostname='test',id=3,image_ref='a9204248-210d-45b5-ab0a-d1ec08a73a4f',info_cache=InstanceInfoCache,instance_type_id=3,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-23T08:23:11Z,launched_on='np0005626463.localdomain',locked=False,locked_by=None,memory_mb=512,metadata={},migration_context=<?>,new_flavor=None,node='np0005626463.localdomain',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=<?>,power_state=4,progress=0,project_id='37b8098efb0d4ecc90b451a2db0e966f',ramdisk_id='',reservation_id='r-90tij075',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member,admin',image_base_image_ref='a9204248-210d-45b5-ab0a-d1ec08a73a4f',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='pc-q35-rhel9.0.0',image_hw_pointer_model='usbtablet',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros',image_owner_specified.openstack.sha256='',owner_project_name='admin',owner_user_name='admin'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-23T09:39:50Z,user_data=None,user_id='cb6895487918456aa599ca2f76872d00',uuid=c2a7d92b-952f-46a7-8a6a-3322a48fcf4b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "address": "fa:16:3e:a0:9d:00", "network": {"id": "9da5b53d-3184-450f-9a5b-bdba1a6c9f6d", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "37b8098efb0d4ecc90b451a2db0e966f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa27e5011-20", "ovs_interfaceid": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:23.046 282211 DEBUG nova.network.os_vif_util [None req-48eeab65-47b9-4b45-ac13-90d08f01960e cb6895487918456aa599ca2f76872d00 37b8098efb0d4ecc90b451a2db0e966f - - default default] Converting VIF {"id": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "address": "fa:16:3e:a0:9d:00", "network": {"id": "9da5b53d-3184-450f-9a5b-bdba1a6c9f6d", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "37b8098efb0d4ecc90b451a2db0e966f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa27e5011-20", "ovs_interfaceid": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:23.046 282211 DEBUG nova.network.os_vif_util [None req-48eeab65-47b9-4b45-ac13-90d08f01960e cb6895487918456aa599ca2f76872d00 37b8098efb0d4ecc90b451a2db0e966f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a0:9d:00,bridge_name='br-int',has_traffic_filtering=True,id=a27e5011-2016-4b16-b5e8-04b555b30bc4,network=Network(9da5b53d-3184-450f-9a5b-bdba1a6c9f6d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa27e5011-20') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:23.047 282211 DEBUG os_vif [None req-48eeab65-47b9-4b45-ac13-90d08f01960e cb6895487918456aa599ca2f76872d00 37b8098efb0d4ecc90b451a2db0e966f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a0:9d:00,bridge_name='br-int',has_traffic_filtering=True,id=a27e5011-2016-4b16-b5e8-04b555b30bc4,network=Network(9da5b53d-3184-450f-9a5b-bdba1a6c9f6d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa27e5011-20') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:23.047 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:23.048 282211 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:23.048 282211 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:23.051 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:23.051 282211 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa27e5011-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:23.052 282211 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapa27e5011-20, col_values=(('external_ids', {'iface-id': 'a27e5011-2016-4b16-b5e8-04b555b30bc4', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:a0:9d:00', 'vm-uuid': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:23.053 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:23.056 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:23.058 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:23.059 282211 INFO os_vif [None req-48eeab65-47b9-4b45-ac13-90d08f01960e cb6895487918456aa599ca2f76872d00 37b8098efb0d4ecc90b451a2db0e966f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a0:9d:00,bridge_name='br-int',has_traffic_filtering=True,id=a27e5011-2016-4b16-b5e8-04b555b30bc4,network=Network(9da5b53d-3184-450f-9a5b-bdba1a6c9f6d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa27e5011-20')
Feb 23 09:40:23 np0005626463.localdomain systemd[1]: Started libvirt secret daemon.
Feb 23 09:40:23 np0005626463.localdomain kernel: device tapa27e5011-20 entered promiscuous mode
Feb 23 09:40:23 np0005626463.localdomain NetworkManager[5974]: <info>  [1771839623.1769] manager: (tapa27e5011-20): new Tun device (/org/freedesktop/NetworkManager/Devices/15)
Feb 23 09:40:23 np0005626463.localdomain systemd-udevd[285061]: Network interface NamePolicy= disabled on kernel command line.
Feb 23 09:40:23 np0005626463.localdomain NetworkManager[5974]: <info>  [1771839623.1932] device (tapa27e5011-20): state change: unmanaged -> unavailable (reason 'connection-assumed', sys-iface-state: 'external')
Feb 23 09:40:23 np0005626463.localdomain NetworkManager[5974]: <info>  [1771839623.1937] device (tapa27e5011-20): state change: unavailable -> disconnected (reason 'none', sys-iface-state: 'external')
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:23.200 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:23.202 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:40:23 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:40:23Z|00061|binding|INFO|Claiming lport a27e5011-2016-4b16-b5e8-04b555b30bc4 for this chassis.
Feb 23 09:40:23 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:40:23Z|00062|binding|INFO|a27e5011-2016-4b16-b5e8-04b555b30bc4: Claiming fa:16:3e:a0:9d:00 192.168.0.12
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:23.212 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:40:23 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:40:23.217 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a0:9d:00 192.168.0.12'], port_security=['fa:16:3e:a0:9d:00 192.168.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005626463.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '192.168.0.12/24', 'neutron:device_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9da5b53d-3184-450f-9a5b-bdba1a6c9f6d', 'neutron:port_capabilities': '', 'neutron:port_fip': '192.168.122.20', 'neutron:port_name': '', 'neutron:project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'neutron:revision_number': '8', 'neutron:security_group_ids': '18508c14-7c5f-4fc2-8d9a-66df41a4ab8c ef2f14d6-40b1-49a6-83d1-89d52b525905', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e1694950-12d2-4254-85f1-37700098294d, chassis=[<ovs.db.idl.Row object at 0x7f808c075610>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f808c075610>], logical_port=a27e5011-2016-4b16-b5e8-04b555b30bc4) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 09:40:23 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:40:23.218 163572 INFO neutron.agent.ovn.metadata.agent [-] Port a27e5011-2016-4b16-b5e8-04b555b30bc4 in datapath 9da5b53d-3184-450f-9a5b-bdba1a6c9f6d bound to our chassis
Feb 23 09:40:23 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:40:23.221 163572 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 9da5b53d-3184-450f-9a5b-bdba1a6c9f6d
Feb 23 09:40:23 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:40:23Z|00063|ovn_bfd|INFO|Enabled BFD on interface ovn-5b0126-0
Feb 23 09:40:23 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:40:23Z|00064|ovn_bfd|INFO|Enabled BFD on interface ovn-585d62-0
Feb 23 09:40:23 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:40:23Z|00065|ovn_bfd|INFO|Enabled BFD on interface ovn-b9c72d-0
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:23.224 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:40:23 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:40:23.230 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[0cd5ec27-1d8a-4b6e-b057-7eca73fde0c0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 09:40:23 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:40:23.231 163572 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap9da5b53d-31 in ovnmeta-9da5b53d-3184-450f-9a5b-bdba1a6c9f6d namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Feb 23 09:40:23 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:40:23.232 163675 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap9da5b53d-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Feb 23 09:40:23 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:40:23.232 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[dd407de2-b1d3-4769-bcd9-3cd036572101]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 09:40:23 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:40:23.233 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[36c25309-0e19-447a-8bb4-1f7758c0e253]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 09:40:23 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:40:23.249 163964 DEBUG oslo.privsep.daemon [-] privsep: reply[c2e9a845-adc1-4188-b350-2cd57dffc0cc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 09:40:23 np0005626463.localdomain systemd-machined[84014]: New machine qemu-2-instance-00000003.
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:23.259 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:23.260 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:40:23 np0005626463.localdomain systemd[1]: Started Virtual Machine qemu-2-instance-00000003.
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:23.271 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:40:23 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:40:23Z|00066|binding|INFO|Setting lport a27e5011-2016-4b16-b5e8-04b555b30bc4 ovn-installed in OVS
Feb 23 09:40:23 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:40:23Z|00067|binding|INFO|Setting lport a27e5011-2016-4b16-b5e8-04b555b30bc4 up in Southbound
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:23.275 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:23.283 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:40:23 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:40:23.285 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[b58da045-1ddc-48a7-9f46-e500e27e5fb7]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:23.297 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:40:23 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:40:23.308 163808 DEBUG oslo.privsep.daemon [-] privsep: reply[eac19b39-e20e-4460-83bc-e876cc021702]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 09:40:23 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:40:23.313 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[315de9cc-5985-4af4-9a78-7a0663ee4675]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 09:40:23 np0005626463.localdomain systemd-udevd[285064]: Network interface NamePolicy= disabled on kernel command line.
Feb 23 09:40:23 np0005626463.localdomain NetworkManager[5974]: <info>  [1771839623.3145] manager: (tap9da5b53d-30): new Veth device (/org/freedesktop/NetworkManager/Devices/16)
Feb 23 09:40:23 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:40:23.346 163808 DEBUG oslo.privsep.daemon [-] privsep: reply[a5e8ae65-e97b-4611-9f54-22bb22268de2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 09:40:23 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:40:23.349 163808 DEBUG oslo.privsep.daemon [-] privsep: reply[c1f0e825-8ea2-4780-8993-3765dc211d49]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 09:40:23 np0005626463.localdomain kernel: IPv6: ADDRCONF(NETDEV_CHANGE): tap9da5b53d-31: link becomes ready
Feb 23 09:40:23 np0005626463.localdomain kernel: IPv6: ADDRCONF(NETDEV_CHANGE): tap9da5b53d-30: link becomes ready
Feb 23 09:40:23 np0005626463.localdomain NetworkManager[5974]: <info>  [1771839623.3690] device (tap9da5b53d-30): carrier: link connected
Feb 23 09:40:23 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:40:23.373 163808 DEBUG oslo.privsep.daemon [-] privsep: reply[4c8817f4-f26b-41fc-9c25-d226723a0db1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 09:40:23 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:40:23.390 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[dde3ccd2-e0d7-4a9a-9522-7578c75f1ea8]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9da5b53d-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', 'fa:16:3e:c8:0e:6f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], 
['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 17], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1106348, 'reachable_time': 40113, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1400, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 285114, 'error': None, 'target': 'ovnmeta-9da5b53d-3184-450f-9a5b-bdba1a6c9f6d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 09:40:23 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:40:23.404 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[eaf2bfa3-aa39-4bc3-b013-42596b67f220]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fec8:e6f'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 1106348, 'tstamp': 1106348}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 285117, 'error': None, 'target': 'ovnmeta-9da5b53d-3184-450f-9a5b-bdba1a6c9f6d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 09:40:23 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:40:23.418 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[23f72374-874d-450c-9af3-a5b290c2881f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9da5b53d-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', 'fa:16:3e:c8:0e:6f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], 
['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 17], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1106348, 'reachable_time': 40113, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1400, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 285120, 'error': None, 'target': 'ovnmeta-9da5b53d-3184-450f-9a5b-bdba1a6c9f6d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:23.440 282211 DEBUG nova.compute.manager [req-22758e71-5da2-4d8d-91b9-02ab75d7b1ce req-d5ec0581-56e8-4768-b7de-d810a8334454 422ccb105b4e4d80bb6030ca202e94d2 d351b5d019cd497ab1d84160f10b653c - - default default] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Received event network-vif-plugged-a27e5011-2016-4b16-b5e8-04b555b30bc4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:23.441 282211 DEBUG oslo_concurrency.lockutils [req-22758e71-5da2-4d8d-91b9-02ab75d7b1ce req-d5ec0581-56e8-4768-b7de-d810a8334454 422ccb105b4e4d80bb6030ca202e94d2 d351b5d019cd497ab1d84160f10b653c - - default default] Acquiring lock "c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:23.442 282211 DEBUG oslo_concurrency.lockutils [req-22758e71-5da2-4d8d-91b9-02ab75d7b1ce req-d5ec0581-56e8-4768-b7de-d810a8334454 422ccb105b4e4d80bb6030ca202e94d2 d351b5d019cd497ab1d84160f10b653c - - default default] Lock "c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:23.442 282211 DEBUG oslo_concurrency.lockutils [req-22758e71-5da2-4d8d-91b9-02ab75d7b1ce req-d5ec0581-56e8-4768-b7de-d810a8334454 422ccb105b4e4d80bb6030ca202e94d2 d351b5d019cd497ab1d84160f10b653c - - default default] Lock "c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:23.442 282211 DEBUG nova.compute.manager [req-22758e71-5da2-4d8d-91b9-02ab75d7b1ce req-d5ec0581-56e8-4768-b7de-d810a8334454 422ccb105b4e4d80bb6030ca202e94d2 d351b5d019cd497ab1d84160f10b653c - - default default] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] No waiting events found dispatching network-vif-plugged-a27e5011-2016-4b16-b5e8-04b555b30bc4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:23.443 282211 WARNING nova.compute.manager [req-22758e71-5da2-4d8d-91b9-02ab75d7b1ce req-d5ec0581-56e8-4768-b7de-d810a8334454 422ccb105b4e4d80bb6030ca202e94d2 d351b5d019cd497ab1d84160f10b653c - - default default] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Received unexpected event network-vif-plugged-a27e5011-2016-4b16-b5e8-04b555b30bc4 for instance with vm_state stopped and task_state powering-on.
Feb 23 09:40:23 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:40:23.446 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[b9ad5dc2-709c-482a-9b69-0900cf242769]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 09:40:23 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:40:23.507 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[b10bf5fe-5d5c-4583-8293-2e8a3ac4d742]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 09:40:23 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:40:23.509 163572 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9da5b53d-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 09:40:23 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:40:23.509 163572 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 23 09:40:23 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:40:23.509 163572 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9da5b53d-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:23.511 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:40:23 np0005626463.localdomain kernel: device tap9da5b53d-30 entered promiscuous mode
Feb 23 09:40:23 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:40:23.514 163572 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap9da5b53d-30, col_values=(('external_ids', {'iface-id': '4143c8ea-7577-4792-9744-bcff90eb20f2'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 09:40:23 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:40:23Z|00068|binding|INFO|Releasing lport 4143c8ea-7577-4792-9744-bcff90eb20f2 from this chassis (sb_readonly=0)
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:23.524 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:40:23 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:40:23.525 163572 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/9da5b53d-3184-450f-9a5b-bdba1a6c9f6d.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/9da5b53d-3184-450f-9a5b-bdba1a6c9f6d.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Feb 23 09:40:23 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:40:23.526 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[1ccb4cbd-bfc6-4c26-b0fb-865a03d1ff42]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 09:40:23 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:40:23.527 163572 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Feb 23 09:40:23 np0005626463.localdomain ovn_metadata_agent[163567]: global
Feb 23 09:40:23 np0005626463.localdomain ovn_metadata_agent[163567]:     log         /dev/log local0 debug
Feb 23 09:40:23 np0005626463.localdomain ovn_metadata_agent[163567]:     log-tag     haproxy-metadata-proxy-9da5b53d-3184-450f-9a5b-bdba1a6c9f6d
Feb 23 09:40:23 np0005626463.localdomain ovn_metadata_agent[163567]:     user        root
Feb 23 09:40:23 np0005626463.localdomain ovn_metadata_agent[163567]:     group       root
Feb 23 09:40:23 np0005626463.localdomain ovn_metadata_agent[163567]:     maxconn     1024
Feb 23 09:40:23 np0005626463.localdomain ovn_metadata_agent[163567]:     pidfile     /var/lib/neutron/external/pids/9da5b53d-3184-450f-9a5b-bdba1a6c9f6d.pid.haproxy
Feb 23 09:40:23 np0005626463.localdomain ovn_metadata_agent[163567]:     daemon
Feb 23 09:40:23 np0005626463.localdomain ovn_metadata_agent[163567]: 
Feb 23 09:40:23 np0005626463.localdomain ovn_metadata_agent[163567]: defaults
Feb 23 09:40:23 np0005626463.localdomain ovn_metadata_agent[163567]:     log global
Feb 23 09:40:23 np0005626463.localdomain ovn_metadata_agent[163567]:     mode http
Feb 23 09:40:23 np0005626463.localdomain ovn_metadata_agent[163567]:     option httplog
Feb 23 09:40:23 np0005626463.localdomain ovn_metadata_agent[163567]:     option dontlognull
Feb 23 09:40:23 np0005626463.localdomain ovn_metadata_agent[163567]:     option http-server-close
Feb 23 09:40:23 np0005626463.localdomain ovn_metadata_agent[163567]:     option forwardfor
Feb 23 09:40:23 np0005626463.localdomain ovn_metadata_agent[163567]:     retries                 3
Feb 23 09:40:23 np0005626463.localdomain ovn_metadata_agent[163567]:     timeout http-request    30s
Feb 23 09:40:23 np0005626463.localdomain ovn_metadata_agent[163567]:     timeout connect         30s
Feb 23 09:40:23 np0005626463.localdomain ovn_metadata_agent[163567]:     timeout client          32s
Feb 23 09:40:23 np0005626463.localdomain ovn_metadata_agent[163567]:     timeout server          32s
Feb 23 09:40:23 np0005626463.localdomain ovn_metadata_agent[163567]:     timeout http-keep-alive 30s
Feb 23 09:40:23 np0005626463.localdomain ovn_metadata_agent[163567]: 
Feb 23 09:40:23 np0005626463.localdomain ovn_metadata_agent[163567]: 
Feb 23 09:40:23 np0005626463.localdomain ovn_metadata_agent[163567]: listen listener
Feb 23 09:40:23 np0005626463.localdomain ovn_metadata_agent[163567]:     bind 169.254.169.254:80
Feb 23 09:40:23 np0005626463.localdomain ovn_metadata_agent[163567]:     server metadata /var/lib/neutron/metadata_proxy
Feb 23 09:40:23 np0005626463.localdomain ovn_metadata_agent[163567]:     http-request add-header X-OVN-Network-ID 9da5b53d-3184-450f-9a5b-bdba1a6c9f6d
Feb 23 09:40:23 np0005626463.localdomain ovn_metadata_agent[163567]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Feb 23 09:40:23 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:40:23.528 163572 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-9da5b53d-3184-450f-9a5b-bdba1a6c9f6d', 'env', 'PROCESS_TAG=haproxy-9da5b53d-3184-450f-9a5b-bdba1a6c9f6d', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/9da5b53d-3184-450f-9a5b-bdba1a6c9f6d.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:23.593 282211 DEBUG nova.virt.driver [None req-a3e77674-ebee-4da0-8221-dce4e3ab889b - - - - - -] Emitting event <LifecycleEvent: 1771839623.591128, c2a7d92b-952f-46a7-8a6a-3322a48fcf4b => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:23.593 282211 INFO nova.compute.manager [None req-a3e77674-ebee-4da0-8221-dce4e3ab889b - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] VM Resumed (Lifecycle Event)
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:23.596 282211 DEBUG nova.compute.manager [None req-48eeab65-47b9-4b45-ac13-90d08f01960e cb6895487918456aa599ca2f76872d00 37b8098efb0d4ecc90b451a2db0e966f - - default default] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:23.600 282211 INFO nova.virt.libvirt.driver [-] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Instance rebooted successfully.
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:23.601 282211 DEBUG nova.compute.manager [None req-48eeab65-47b9-4b45-ac13-90d08f01960e cb6895487918456aa599ca2f76872d00 37b8098efb0d4ecc90b451a2db0e966f - - default default] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:23.672 282211 DEBUG nova.compute.manager [None req-a3e77674-ebee-4da0-8221-dce4e3ab889b - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:23.676 282211 DEBUG nova.compute.manager [None req-a3e77674-ebee-4da0-8221-dce4e3ab889b - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: stopped, current task_state: powering-on, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:23.721 282211 INFO nova.compute.manager [None req-a3e77674-ebee-4da0-8221-dce4e3ab889b - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] During sync_power_state the instance has a pending task (powering-on). Skip.
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:23.721 282211 DEBUG nova.virt.driver [None req-a3e77674-ebee-4da0-8221-dce4e3ab889b - - - - - -] Emitting event <LifecycleEvent: 1771839623.5924237, c2a7d92b-952f-46a7-8a6a-3322a48fcf4b => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:23.722 282211 INFO nova.compute.manager [None req-a3e77674-ebee-4da0-8221-dce4e3ab889b - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] VM Started (Lifecycle Event)
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:23.758 282211 DEBUG nova.compute.manager [None req-a3e77674-ebee-4da0-8221-dce4e3ab889b - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 23 09:40:23 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:23.762 282211 DEBUG nova.compute.manager [None req-a3e77674-ebee-4da0-8221-dce4e3ab889b - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 23 09:40:23 np0005626463.localdomain snmpd[67690]: IfIndex of an interface changed. Such interfaces will appear multiple times in IF-MIB.
Feb 23 09:40:23 np0005626463.localdomain podman[285177]: 
Feb 23 09:40:23 np0005626463.localdomain podman[285177]: 2026-02-23 09:40:23.966984501 +0000 UTC m=+0.102581535 container create 983cee9e3a46a32c62717c639a883cc183806af2504f520a888928b166a0a907 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9da5b53d-3184-450f-9a5b-bdba1a6c9f6d, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260216, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.43.0, tcib_managed=true, org.label-schema.license=GPLv2)
Feb 23 09:40:24 np0005626463.localdomain podman[285177]: 2026-02-23 09:40:23.916955928 +0000 UTC m=+0.052553002 image pull  quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Feb 23 09:40:24 np0005626463.localdomain systemd[1]: Started libpod-conmon-983cee9e3a46a32c62717c639a883cc183806af2504f520a888928b166a0a907.scope.
Feb 23 09:40:24 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 09:40:24 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/019f1d6ef1c9579eebcf889108ad4d8fd489aae5bb98f93096ed5b1aead5d346/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 23 09:40:24 np0005626463.localdomain podman[285177]: 2026-02-23 09:40:24.053816836 +0000 UTC m=+0.189413860 container init 983cee9e3a46a32c62717c639a883cc183806af2504f520a888928b166a0a907 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9da5b53d-3184-450f-9a5b-bdba1a6c9f6d, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260216, io.buildah.version=1.43.0, tcib_managed=true, org.label-schema.schema-version=1.0)
Feb 23 09:40:24 np0005626463.localdomain podman[285177]: 2026-02-23 09:40:24.063194466 +0000 UTC m=+0.198791490 container start 983cee9e3a46a32c62717c639a883cc183806af2504f520a888928b166a0a907 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9da5b53d-3184-450f-9a5b-bdba1a6c9f6d, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.license=GPLv2, org.label-schema.build-date=20260216, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Feb 23 09:40:24 np0005626463.localdomain neutron-haproxy-ovnmeta-9da5b53d-3184-450f-9a5b-bdba1a6c9f6d[285191]: [NOTICE]   (285195) : New worker (285197) forked
Feb 23 09:40:24 np0005626463.localdomain neutron-haproxy-ovnmeta-9da5b53d-3184-450f-9a5b-bdba1a6c9f6d[285191]: [NOTICE]   (285195) : Loading success.
Feb 23 09:40:24 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27317 DF PROTO=TCP SPT=36860 DPT=9102 SEQ=3021588974 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BFE71A00000000001030307) 
Feb 23 09:40:24 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:40:24Z|00069|binding|INFO|Releasing lport 4143c8ea-7577-4792-9744-bcff90eb20f2 from this chassis (sb_readonly=0)
Feb 23 09:40:24 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:24.105 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:40:24 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:40:24Z|00070|binding|INFO|Releasing lport 4143c8ea-7577-4792-9744-bcff90eb20f2 from this chassis (sb_readonly=0)
Feb 23 09:40:24 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:24.200 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:40:24 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:40:24Z|00071|binding|INFO|Releasing lport 4143c8ea-7577-4792-9744-bcff90eb20f2 from this chassis (sb_readonly=0)
Feb 23 09:40:24 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:24.297 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:40:25 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27318 DF PROTO=TCP SPT=36860 DPT=9102 SEQ=3021588974 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BFE75C70000000001030307) 
Feb 23 09:40:25 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:40:25Z|00072|binding|INFO|Releasing lport 4143c8ea-7577-4792-9744-bcff90eb20f2 from this chassis (sb_readonly=0)
Feb 23 09:40:25 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:25.200 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:40:25 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:25.485 282211 DEBUG nova.compute.manager [req-a7bb4b06-a6cb-4659-9d2f-0cb69cfae4fc req-4099e049-d38b-4b92-8545-ef9142fda2b5 422ccb105b4e4d80bb6030ca202e94d2 d351b5d019cd497ab1d84160f10b653c - - default default] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Received event network-vif-plugged-a27e5011-2016-4b16-b5e8-04b555b30bc4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 23 09:40:25 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:25.486 282211 DEBUG oslo_concurrency.lockutils [req-a7bb4b06-a6cb-4659-9d2f-0cb69cfae4fc req-4099e049-d38b-4b92-8545-ef9142fda2b5 422ccb105b4e4d80bb6030ca202e94d2 d351b5d019cd497ab1d84160f10b653c - - default default] Acquiring lock "c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 09:40:25 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:25.486 282211 DEBUG oslo_concurrency.lockutils [req-a7bb4b06-a6cb-4659-9d2f-0cb69cfae4fc req-4099e049-d38b-4b92-8545-ef9142fda2b5 422ccb105b4e4d80bb6030ca202e94d2 d351b5d019cd497ab1d84160f10b653c - - default default] Lock "c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 09:40:25 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:25.486 282211 DEBUG oslo_concurrency.lockutils [req-a7bb4b06-a6cb-4659-9d2f-0cb69cfae4fc req-4099e049-d38b-4b92-8545-ef9142fda2b5 422ccb105b4e4d80bb6030ca202e94d2 d351b5d019cd497ab1d84160f10b653c - - default default] Lock "c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 09:40:25 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:25.487 282211 DEBUG nova.compute.manager [req-a7bb4b06-a6cb-4659-9d2f-0cb69cfae4fc req-4099e049-d38b-4b92-8545-ef9142fda2b5 422ccb105b4e4d80bb6030ca202e94d2 d351b5d019cd497ab1d84160f10b653c - - default default] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] No waiting events found dispatching network-vif-plugged-a27e5011-2016-4b16-b5e8-04b555b30bc4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 23 09:40:25 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:25.487 282211 WARNING nova.compute.manager [req-a7bb4b06-a6cb-4659-9d2f-0cb69cfae4fc req-4099e049-d38b-4b92-8545-ef9142fda2b5 422ccb105b4e4d80bb6030ca202e94d2 d351b5d019cd497ab1d84160f10b653c - - default default] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Received unexpected event network-vif-plugged-a27e5011-2016-4b16-b5e8-04b555b30bc4 for instance with vm_state active and task_state None.
Feb 23 09:40:25 np0005626463.localdomain sshd[285206]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:40:25 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29806 DF PROTO=TCP SPT=59864 DPT=9102 SEQ=1035056940 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BFE78060000000001030307) 
Feb 23 09:40:26 np0005626463.localdomain sshd[285206]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:40:27 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27319 DF PROTO=TCP SPT=36860 DPT=9102 SEQ=3021588974 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BFE7DC60000000001030307) 
Feb 23 09:40:27 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:27.264 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:40:27 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.
Feb 23 09:40:27 np0005626463.localdomain systemd[1]: tmp-crun.eZa4eX.mount: Deactivated successfully.
Feb 23 09:40:27 np0005626463.localdomain podman[285208]: 2026-02-23 09:40:27.920085235 +0000 UTC m=+0.093167633 container health_status da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 23 09:40:27 np0005626463.localdomain podman[285208]: 2026-02-23 09:40:27.932907572 +0000 UTC m=+0.105989910 container exec_died da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 23 09:40:27 np0005626463.localdomain systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Deactivated successfully.
Feb 23 09:40:28 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:28.055 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:40:28 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49158 DF PROTO=TCP SPT=34996 DPT=9102 SEQ=3629566089 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BFE82060000000001030307) 
Feb 23 09:40:31 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27320 DF PROTO=TCP SPT=36860 DPT=9102 SEQ=3021588974 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BFE8D860000000001030307) 
Feb 23 09:40:32 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:32.309 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:40:33 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:33.057 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:40:35 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:40:35Z|00004|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:a0:9d:00 192.168.0.12
Feb 23 09:40:35 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.
Feb 23 09:40:35 np0005626463.localdomain podman[285230]: 2026-02-23 09:40:35.905094038 +0000 UTC m=+0.080264842 container health_status 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, config_id=openstack_network_exporter, org.opencontainers.image.created=2026-02-05T04:57:10Z, build-date=2026-02-05T04:57:10Z, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-type=git, io.openshift.expose-services=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9/ubi-minimal, maintainer=Red Hat, Inc., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, container_name=openstack_network_exporter, release=1770267347, version=9.7, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public)
Feb 23 09:40:35 np0005626463.localdomain podman[285230]: 2026-02-23 09:40:35.917664649 +0000 UTC m=+0.092835443 container exec_died 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, container_name=openstack_network_exporter, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, org.opencontainers.image.created=2026-02-05T04:57:10Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., version=9.7, io.openshift.expose-services=, build-date=2026-02-05T04:57:10Z, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, name=ubi9/ubi-minimal, distribution-scope=public, managed_by=edpm_ansible, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1770267347)
Feb 23 09:40:35 np0005626463.localdomain systemd[1]: 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.service: Deactivated successfully.
Feb 23 09:40:37 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:37.346 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:40:38 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:38.059 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:40:39 np0005626463.localdomain podman[242954]: time="2026-02-23T09:40:39Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 23 09:40:39 np0005626463.localdomain podman[242954]: @ - - [23/Feb/2026:09:40:39 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 148535 "" "Go-http-client/1.1"
Feb 23 09:40:39 np0005626463.localdomain podman[242954]: @ - - [23/Feb/2026:09:40:39 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16791 "" "Go-http-client/1.1"
Feb 23 09:40:39 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27321 DF PROTO=TCP SPT=36860 DPT=9102 SEQ=3021588974 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BFEAE060000000001030307) 
Feb 23 09:40:39 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:40:39.821 163670 DEBUG eventlet.wsgi.server [-] (163670) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Feb 23 09:40:39 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:40:39.823 163670 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Feb 23 09:40:39 np0005626463.localdomain ovn_metadata_agent[163567]: Accept: */*
Feb 23 09:40:39 np0005626463.localdomain ovn_metadata_agent[163567]: Connection: close
Feb 23 09:40:39 np0005626463.localdomain ovn_metadata_agent[163567]: Content-Type: text/plain
Feb 23 09:40:39 np0005626463.localdomain ovn_metadata_agent[163567]: Host: 169.254.169.254
Feb 23 09:40:39 np0005626463.localdomain ovn_metadata_agent[163567]: User-Agent: curl/7.84.0
Feb 23 09:40:39 np0005626463.localdomain ovn_metadata_agent[163567]: X-Forwarded-For: 192.168.0.12
Feb 23 09:40:39 np0005626463.localdomain ovn_metadata_agent[163567]: X-Ovn-Network-Id: 9da5b53d-3184-450f-9a5b-bdba1a6c9f6d __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Feb 23 09:40:41 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.
Feb 23 09:40:41 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.
Feb 23 09:40:41 np0005626463.localdomain podman[285249]: 2026-02-23 09:40:41.90937567 +0000 UTC m=+0.082551323 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ovn_controller, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, container_name=ovn_controller, tcib_managed=true)
Feb 23 09:40:41 np0005626463.localdomain podman[285250]: 2026-02-23 09:40:41.974543193 +0000 UTC m=+0.145344472 container health_status bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Feb 23 09:40:41 np0005626463.localdomain podman[285249]: 2026-02-23 09:40:41.975247494 +0000 UTC m=+0.148423137 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, container_name=ovn_controller, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Feb 23 09:40:41 np0005626463.localdomain systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully.
Feb 23 09:40:42 np0005626463.localdomain podman[285250]: 2026-02-23 09:40:42.058191009 +0000 UTC m=+0.228992288 container exec_died bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 23 09:40:42 np0005626463.localdomain systemd[1]: bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.service: Deactivated successfully.
Feb 23 09:40:42 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:42.353 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:40:42.620 163670 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:40:42.622 163670 INFO eventlet.wsgi.server [-] 192.168.0.12,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 2.7990437
Feb 23 09:40:42 np0005626463.localdomain haproxy-metadata-proxy-9da5b53d-3184-450f-9a5b-bdba1a6c9f6d[285197]: 192.168.0.12:51300 [23/Feb/2026:09:40:39.820] listener listener/metadata 0/0/0/2801/2801 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:40:42.637 163670 DEBUG eventlet.wsgi.server [-] (163670) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:40:42.638 163670 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys HTTP/1.0
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: Accept: */*
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: Connection: close
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: Content-Type: text/plain
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: Host: 169.254.169.254
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: User-Agent: curl/7.84.0
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: X-Forwarded-For: 192.168.0.12
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: X-Ovn-Network-Id: 9da5b53d-3184-450f-9a5b-bdba1a6c9f6d __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Feb 23 09:40:42 np0005626463.localdomain haproxy-metadata-proxy-9da5b53d-3184-450f-9a5b-bdba1a6c9f6d[285197]: 192.168.0.12:51316 [23/Feb/2026:09:40:42.636] listener listener/metadata 0/0/0/21/21 404 281 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys HTTP/1.1"
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:40:42.658 163670 INFO eventlet.wsgi.server [-] 192.168.0.12,<local> "GET /2009-04-04/meta-data/public-keys HTTP/1.1" status: 404  len: 297 time: 0.0205104
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:40:42.672 163670 DEBUG eventlet.wsgi.server [-] (163670) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:40:42.673 163670 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: Accept: */*
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: Connection: close
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: Content-Type: text/plain
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: Host: 169.254.169.254
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: User-Agent: curl/7.84.0
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: X-Forwarded-For: 192.168.0.12
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: X-Ovn-Network-Id: 9da5b53d-3184-450f-9a5b-bdba1a6c9f6d __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:40:42.686 163670 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Feb 23 09:40:42 np0005626463.localdomain haproxy-metadata-proxy-9da5b53d-3184-450f-9a5b-bdba1a6c9f6d[285197]: 192.168.0.12:51328 [23/Feb/2026:09:40:42.672] listener listener/metadata 0/0/0/14/14 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:40:42.687 163670 INFO eventlet.wsgi.server [-] 192.168.0.12,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 0.0139225
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:40:42.693 163670 DEBUG eventlet.wsgi.server [-] (163670) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:40:42.693 163670 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/ami-launch-index HTTP/1.0
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: Accept: */*
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: Connection: close
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: Content-Type: text/plain
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: Host: 169.254.169.254
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: User-Agent: curl/7.84.0
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: X-Forwarded-For: 192.168.0.12
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: X-Ovn-Network-Id: 9da5b53d-3184-450f-9a5b-bdba1a6c9f6d __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:40:42.705 163670 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:40:42.705 163670 INFO eventlet.wsgi.server [-] 192.168.0.12,<local> "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1" status: 200  len: 136 time: 0.0119290
Feb 23 09:40:42 np0005626463.localdomain haproxy-metadata-proxy-9da5b53d-3184-450f-9a5b-bdba1a6c9f6d[285197]: 192.168.0.12:51338 [23/Feb/2026:09:40:42.692] listener listener/metadata 0/0/0/12/12 200 120 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1"
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:40:42.711 163670 DEBUG eventlet.wsgi.server [-] (163670) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:40:42.712 163670 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-type HTTP/1.0
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: Accept: */*
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: Connection: close
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: Content-Type: text/plain
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: Host: 169.254.169.254
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: User-Agent: curl/7.84.0
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: X-Forwarded-For: 192.168.0.12
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: X-Ovn-Network-Id: 9da5b53d-3184-450f-9a5b-bdba1a6c9f6d __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:40:42.726 163670 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Feb 23 09:40:42 np0005626463.localdomain haproxy-metadata-proxy-9da5b53d-3184-450f-9a5b-bdba1a6c9f6d[285197]: 192.168.0.12:51340 [23/Feb/2026:09:40:42.711] listener listener/metadata 0/0/0/14/14 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-type HTTP/1.1"
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:40:42.726 163670 INFO eventlet.wsgi.server [-] 192.168.0.12,<local> "GET /2009-04-04/meta-data/instance-type HTTP/1.1" status: 200  len: 143 time: 0.0141530
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:40:42.732 163670 DEBUG eventlet.wsgi.server [-] (163670) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:40:42.733 163670 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-ipv4 HTTP/1.0
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: Accept: */*
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: Connection: close
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: Content-Type: text/plain
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: Host: 169.254.169.254
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: User-Agent: curl/7.84.0
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: X-Forwarded-For: 192.168.0.12
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: X-Ovn-Network-Id: 9da5b53d-3184-450f-9a5b-bdba1a6c9f6d __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:40:42.749 163670 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Feb 23 09:40:42 np0005626463.localdomain haproxy-metadata-proxy-9da5b53d-3184-450f-9a5b-bdba1a6c9f6d[285197]: 192.168.0.12:51342 [23/Feb/2026:09:40:42.732] listener listener/metadata 0/0/0/17/17 200 132 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1"
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:40:42.749 163670 INFO eventlet.wsgi.server [-] 192.168.0.12,<local> "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1" status: 200  len: 148 time: 0.0164003
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:40:42.756 163670 DEBUG eventlet.wsgi.server [-] (163670) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:40:42.757 163670 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-ipv4 HTTP/1.0
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: Accept: */*
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: Connection: close
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: Content-Type: text/plain
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: Host: 169.254.169.254
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: User-Agent: curl/7.84.0
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: X-Forwarded-For: 192.168.0.12
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: X-Ovn-Network-Id: 9da5b53d-3184-450f-9a5b-bdba1a6c9f6d __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:40:42.770 163670 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Feb 23 09:40:42 np0005626463.localdomain haproxy-metadata-proxy-9da5b53d-3184-450f-9a5b-bdba1a6c9f6d[285197]: 192.168.0.12:51354 [23/Feb/2026:09:40:42.755] listener listener/metadata 0/0/0/15/15 200 134 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1"
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:40:42.771 163670 INFO eventlet.wsgi.server [-] 192.168.0.12,<local> "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1" status: 200  len: 150 time: 0.0141754
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:40:42.777 163670 DEBUG eventlet.wsgi.server [-] (163670) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:40:42.778 163670 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/hostname HTTP/1.0
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: Accept: */*
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: Connection: close
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: Content-Type: text/plain
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: Host: 169.254.169.254
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: User-Agent: curl/7.84.0
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: X-Forwarded-For: 192.168.0.12
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: X-Ovn-Network-Id: 9da5b53d-3184-450f-9a5b-bdba1a6c9f6d __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:40:42.795 163670 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Feb 23 09:40:42 np0005626463.localdomain haproxy-metadata-proxy-9da5b53d-3184-450f-9a5b-bdba1a6c9f6d[285197]: 192.168.0.12:51360 [23/Feb/2026:09:40:42.777] listener listener/metadata 0/0/0/18/18 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/hostname HTTP/1.1"
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:40:42.795 163670 INFO eventlet.wsgi.server [-] 192.168.0.12,<local> "GET /2009-04-04/meta-data/hostname HTTP/1.1" status: 200  len: 139 time: 0.0178473
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:40:42.802 163670 DEBUG eventlet.wsgi.server [-] (163670) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:40:42.803 163670 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-hostname HTTP/1.0
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: Accept: */*
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: Connection: close
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: Content-Type: text/plain
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: Host: 169.254.169.254
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: User-Agent: curl/7.84.0
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: X-Forwarded-For: 192.168.0.12
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: X-Ovn-Network-Id: 9da5b53d-3184-450f-9a5b-bdba1a6c9f6d __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:40:42.822 163670 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Feb 23 09:40:42 np0005626463.localdomain haproxy-metadata-proxy-9da5b53d-3184-450f-9a5b-bdba1a6c9f6d[285197]: 192.168.0.12:51376 [23/Feb/2026:09:40:42.802] listener listener/metadata 0/0/0/20/20 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-hostname HTTP/1.1"
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:40:42.822 163670 INFO eventlet.wsgi.server [-] 192.168.0.12,<local> "GET /2009-04-04/meta-data/local-hostname HTTP/1.1" status: 200  len: 139 time: 0.0192134
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:40:42.829 163670 DEBUG eventlet.wsgi.server [-] (163670) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:40:42.829 163670 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/user-data HTTP/1.0
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: Accept: */*
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: Connection: close
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: Content-Type: text/plain
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: Host: 169.254.169.254
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: User-Agent: curl/7.84.0
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: X-Forwarded-For: 192.168.0.12
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: X-Ovn-Network-Id: 9da5b53d-3184-450f-9a5b-bdba1a6c9f6d __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Feb 23 09:40:42 np0005626463.localdomain haproxy-metadata-proxy-9da5b53d-3184-450f-9a5b-bdba1a6c9f6d[285197]: 192.168.0.12:51388 [23/Feb/2026:09:40:42.828] listener listener/metadata 0/0/0/17/17 404 281 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/user-data HTTP/1.1"
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:40:42.846 163670 INFO eventlet.wsgi.server [-] 192.168.0.12,<local> "GET /2009-04-04/user-data HTTP/1.1" status: 404  len: 297 time: 0.0165329
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:40:42.860 163670 DEBUG eventlet.wsgi.server [-] (163670) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:40:42.860 163670 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping HTTP/1.0
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: Accept: */*
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: Connection: close
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: Content-Type: text/plain
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: Host: 169.254.169.254
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: User-Agent: curl/7.84.0
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: X-Forwarded-For: 192.168.0.12
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: X-Ovn-Network-Id: 9da5b53d-3184-450f-9a5b-bdba1a6c9f6d __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:40:42.874 163670 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Feb 23 09:40:42 np0005626463.localdomain haproxy-metadata-proxy-9da5b53d-3184-450f-9a5b-bdba1a6c9f6d[285197]: 192.168.0.12:51394 [23/Feb/2026:09:40:42.859] listener listener/metadata 0/0/0/15/15 200 139 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1"
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:40:42.875 163670 INFO eventlet.wsgi.server [-] 192.168.0.12,<local> "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1" status: 200  len: 155 time: 0.0147550
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:40:42.880 163670 DEBUG eventlet.wsgi.server [-] (163670) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:40:42.880 163670 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.0
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: Accept: */*
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: Connection: close
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: Content-Type: text/plain
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: Host: 169.254.169.254
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: User-Agent: curl/7.84.0
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: X-Forwarded-For: 192.168.0.12
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: X-Ovn-Network-Id: 9da5b53d-3184-450f-9a5b-bdba1a6c9f6d __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:40:42.896 163670 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:40:42.896 163670 INFO eventlet.wsgi.server [-] 192.168.0.12,<local> "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1" status: 200  len: 138 time: 0.0154855
Feb 23 09:40:42 np0005626463.localdomain haproxy-metadata-proxy-9da5b53d-3184-450f-9a5b-bdba1a6c9f6d[285197]: 192.168.0.12:51402 [23/Feb/2026:09:40:42.879] listener listener/metadata 0/0/0/16/16 200 122 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1"
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:40:42.901 163670 DEBUG eventlet.wsgi.server [-] (163670) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:40:42.902 163670 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/ephemeral0 HTTP/1.0
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: Accept: */*
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: Connection: close
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: Content-Type: text/plain
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: Host: 169.254.169.254
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: User-Agent: curl/7.84.0
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: X-Forwarded-For: 192.168.0.12
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: X-Ovn-Network-Id: 9da5b53d-3184-450f-9a5b-bdba1a6c9f6d __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:40:42.917 163670 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Feb 23 09:40:42 np0005626463.localdomain haproxy-metadata-proxy-9da5b53d-3184-450f-9a5b-bdba1a6c9f6d[285197]: 192.168.0.12:51416 [23/Feb/2026:09:40:42.901] listener listener/metadata 0/0/0/16/16 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/ephemeral0 HTTP/1.1"
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:40:42.918 163670 INFO eventlet.wsgi.server [-] 192.168.0.12,<local> "GET /2009-04-04/meta-data/block-device-mapping/ephemeral0 HTTP/1.1" status: 200  len: 143 time: 0.0157170
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:40:42.923 163670 DEBUG eventlet.wsgi.server [-] (163670) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:40:42.924 163670 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.0
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: Accept: */*
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: Connection: close
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: Content-Type: text/plain
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: Host: 169.254.169.254
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: User-Agent: curl/7.84.0
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: X-Forwarded-For: 192.168.0.12
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: X-Ovn-Network-Id: 9da5b53d-3184-450f-9a5b-bdba1a6c9f6d __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:40:42.942 163670 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Feb 23 09:40:42 np0005626463.localdomain haproxy-metadata-proxy-9da5b53d-3184-450f-9a5b-bdba1a6c9f6d[285197]: 192.168.0.12:51432 [23/Feb/2026:09:40:42.922] listener listener/metadata 0/0/0/20/20 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1"
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:40:42.943 163670 INFO eventlet.wsgi.server [-] 192.168.0.12,<local> "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1" status: 200  len: 143 time: 0.0193837
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:40:42.950 163670 DEBUG eventlet.wsgi.server [-] (163670) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:40:42.951 163670 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-hostname HTTP/1.0
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: Accept: */*
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: Connection: close
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: Content-Type: text/plain
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: Host: 169.254.169.254
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: User-Agent: curl/7.84.0
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: X-Forwarded-For: 192.168.0.12
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: X-Ovn-Network-Id: 9da5b53d-3184-450f-9a5b-bdba1a6c9f6d __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:40:42.963 163670 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Feb 23 09:40:42 np0005626463.localdomain haproxy-metadata-proxy-9da5b53d-3184-450f-9a5b-bdba1a6c9f6d[285197]: 192.168.0.12:51436 [23/Feb/2026:09:40:42.950] listener listener/metadata 0/0/0/13/13 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-hostname HTTP/1.1"
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:40:42.963 163670 INFO eventlet.wsgi.server [-] 192.168.0.12,<local> "GET /2009-04-04/meta-data/public-hostname HTTP/1.1" status: 200  len: 139 time: 0.0125294
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:40:42.970 163670 DEBUG eventlet.wsgi.server [-] (163670) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:40:42.971 163670 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.0
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: Accept: */*
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: Connection: close
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: Content-Type: text/plain
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: Host: 169.254.169.254
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: User-Agent: curl/7.84.0
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: X-Forwarded-For: 192.168.0.12
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: X-Ovn-Network-Id: 9da5b53d-3184-450f-9a5b-bdba1a6c9f6d __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:40:42.992 163670 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Feb 23 09:40:42 np0005626463.localdomain haproxy-metadata-proxy-9da5b53d-3184-450f-9a5b-bdba1a6c9f6d[285197]: 192.168.0.12:51442 [23/Feb/2026:09:40:42.970] listener listener/metadata 0/0/0/23/23 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1"
Feb 23 09:40:42 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:40:42.993 163670 INFO eventlet.wsgi.server [-] 192.168.0.12,<local> "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1" status: 200  len: 139 time: 0.0222180
Feb 23 09:40:43 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:43.036 282211 DEBUG nova.compute.manager [None req-90ff1dcc-9d21-4e9f-8c83-edee2f85df28 cb6895487918456aa599ca2f76872d00 37b8098efb0d4ecc90b451a2db0e966f - - default default] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 23 09:40:43 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:43.041 282211 INFO nova.compute.manager [None req-90ff1dcc-9d21-4e9f-8c83-edee2f85df28 cb6895487918456aa599ca2f76872d00 37b8098efb0d4ecc90b451a2db0e966f - - default default] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Retrieving diagnostics
Feb 23 09:40:43 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:43.100 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:40:43 np0005626463.localdomain openstack_network_exporter[245358]: ERROR   09:40:43 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 23 09:40:43 np0005626463.localdomain openstack_network_exporter[245358]: 
Feb 23 09:40:43 np0005626463.localdomain openstack_network_exporter[245358]: ERROR   09:40:43 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 23 09:40:43 np0005626463.localdomain openstack_network_exporter[245358]: 
Feb 23 09:40:44 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.
Feb 23 09:40:44 np0005626463.localdomain podman[285297]: 2026-02-23 09:40:44.90605501 +0000 UTC m=+0.082854282 container health_status be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.build-date=20260216, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Feb 23 09:40:44 np0005626463.localdomain podman[285297]: 2026-02-23 09:40:44.921226771 +0000 UTC m=+0.098026023 container exec_died be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=ceilometer_agent_compute, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, container_name=ceilometer_agent_compute, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Feb 23 09:40:44 np0005626463.localdomain systemd[1]: be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.service: Deactivated successfully.
Feb 23 09:40:47 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:47.409 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:40:48 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:48.101 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:40:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:40:48.543 163572 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 09:40:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:40:48.544 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 09:40:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:40:48.545 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 09:40:50 np0005626463.localdomain sshd[285316]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:40:50 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.
Feb 23 09:40:50 np0005626463.localdomain systemd[1]: tmp-crun.2GKqCV.mount: Deactivated successfully.
Feb 23 09:40:50 np0005626463.localdomain podman[285318]: 2026-02-23 09:40:50.916210575 +0000 UTC m=+0.085230607 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Feb 23 09:40:50 np0005626463.localdomain podman[285318]: 2026-02-23 09:40:50.925215104 +0000 UTC m=+0.094235176 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Feb 23 09:40:50 np0005626463.localdomain systemd[1]: 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully.
Feb 23 09:40:51 np0005626463.localdomain sshd[285316]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:40:52 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:52.433 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:40:53 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:53.103 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:40:53 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:40:53Z|00073|memory_trim|INFO|Detected inactivity (last active 30010 ms ago): trimming memory
Feb 23 09:40:55 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8686 DF PROTO=TCP SPT=43934 DPT=9102 SEQ=1319889552 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BFEEBBE0000000001030307) 
Feb 23 09:40:55 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27322 DF PROTO=TCP SPT=36860 DPT=9102 SEQ=3021588974 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BFEEE060000000001030307) 
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.132 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'name': 'test', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000003', 'OS-EXT-SRV-ATTR:host': 'np0005626463.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '37b8098efb0d4ecc90b451a2db0e966f', 'user_id': 'cb6895487918456aa599ca2f76872d00', 'hostId': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.133 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.152 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/cpu volume: 9140000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.154 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a520f3ef-28aa-44d2-9bcf-e83bcaebf3d5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 9140000000, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'timestamp': '2026-02-23T09:40:56.133495', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': 'c08b487c-109b-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11096.341516923, 'message_signature': 'e7b6908bbec05c61228310b062f93540e2c054fabfc292bd9ee3184adb8d2b16'}]}, 'timestamp': '2026-02-23 09:40:56.152763', '_unique_id': 'b5c8e1049b2d4253a97b16abb6469168'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.154 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.154 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.154 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.154 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.154 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.154 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.154 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.154 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.154 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.154 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.154 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.154 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.154 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.154 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.154 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.154 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.154 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.154 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.154 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.154 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.154 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.154 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.154 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.154 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.154 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.154 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.154 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.154 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.154 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.154 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.154 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.155 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.158 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.bytes.delta volume: 6808 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.160 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9d7a1401-3c4f-48ff-844f-838114e2bec1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 6808, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:40:56.155617', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': 'c08c46c8-109b-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11096.345105514, 'message_signature': 'bdd6602598f626473e30ea136f4ac0ff2c6693a0fea535039426fdf72ace8952'}]}, 'timestamp': '2026-02-23 09:40:56.159247', '_unique_id': '7a2b765d839e40158491f281b5a36f9d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.160 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.160 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.160 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.160 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.160 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.160 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.160 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.160 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.160 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.160 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.160 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.160 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.160 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.160 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.160 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.160 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.160 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.160 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.160 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.160 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.160 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.160 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.160 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.160 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.160 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.160 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.160 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.160 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.160 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.160 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.160 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.161 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.172 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.173 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.174 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '688e453e-5bf4-4b75-a481-6c3d4c1c516e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:40:56.161494', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'c08e7128-109b-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11096.350976876, 'message_signature': 'a11b995935708d2e5b39aa87d0de3caaf137b5a4c6514cadf25700c698aefbdb'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:40:56.161494', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'c08e81d6-109b-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11096.350976876, 'message_signature': 'a43ac5f2d0fe66de82c7ecd48f45f0bbecb79b9aa15c0c7cb1dfa4e23bb042f2'}]}, 'timestamp': '2026-02-23 09:40:56.173817', '_unique_id': '5f756731789443c0922056df8d584ada'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.174 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.174 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.174 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.174 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.174 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.174 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.174 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.174 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.174 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.174 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.174 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.174 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.174 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.174 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.174 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.174 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.174 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.174 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.174 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.174 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.174 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.174 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.174 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.174 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.174 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.174 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.174 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.174 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.174 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.174 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.174 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.175 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.176 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.177 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ea1ea470-e031-4d00-9307-5ba487a4a0fd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:40:56.176099', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': 'c08eebb2-109b-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11096.345105514, 'message_signature': 'c2cf5a780cd0f819bceacfe8b14816e9fdd5e3329785850262a787ec95e06407'}]}, 'timestamp': '2026-02-23 09:40:56.176556', '_unique_id': 'aedaa2bf92a64843a4e7083614051493'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.177 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.177 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.177 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.177 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.177 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.177 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.177 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.177 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.177 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.177 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.177 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.177 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.177 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.177 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.177 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.177 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.177 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.177 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.177 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.177 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.177 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.177 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.177 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.177 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.177 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.177 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.177 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.177 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.177 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.177 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.177 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.178 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.178 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.180 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '80a2d8e3-34c9-4a40-bd29-364a2a7df099', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:40:56.178696', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': 'c08f5246-109b-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11096.345105514, 'message_signature': 'd3562d44cb53f29865d6ae537074b8e3d01b49d682720ac975140a29467ce8c9'}]}, 'timestamp': '2026-02-23 09:40:56.179182', '_unique_id': 'f1fbd55ccdb74ff4b3f84d606802882b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.180 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.180 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.180 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.180 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.180 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.180 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.180 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.180 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.180 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.180 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.180 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.180 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.180 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.180 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.180 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.180 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.180 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.180 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.180 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.180 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.180 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.180 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.180 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.180 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.180 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.180 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.180 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.180 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.180 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.180 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.180 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.181 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.207 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.latency volume: 1374424344 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.208 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.latency volume: 89322858 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.210 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ac7e85f4-840a-4600-864b-6ebc97f5c95a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1374424344, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:40:56.181296', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'c093c952-109b-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11096.3707775, 'message_signature': 'cac7774751ceba90da55ea7dec2a4ac8c283effafb37c7eb535f4a46256b7ddc'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 89322858, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:40:56.181296', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'c093e202-109b-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11096.3707775, 'message_signature': '0be314ad6587ae582f2210fcf8bcbcb256ad242f2554d455bcb3d6e0718926e4'}]}, 'timestamp': '2026-02-23 09:40:56.209085', '_unique_id': 'f90cb3089d334f09941955efddb1307c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.210 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.210 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.210 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.210 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.210 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.210 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.210 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.210 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.210 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.210 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.210 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.210 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.210 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.210 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.210 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.210 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.210 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.210 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.210 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.210 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.210 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.210 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.210 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.210 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.210 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.210 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.210 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.210 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.210 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.210 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.210 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.211 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.211 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.bytes volume: 35597312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.212 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.213 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5c898e5b-34b1-47f6-bf28-4fe1203d46fb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35597312, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:40:56.211592', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'c0945610-109b-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11096.3707775, 'message_signature': '5606a725295c9a4bb793862be25b7b2f93ecc9865ad4bd08aeda41ad8a1912e6'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:40:56.211592', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'c094679a-109b-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11096.3707775, 'message_signature': 'bc91277163aa89329328c38452ae97a193b75118e319d82a7c88509675da91f9'}]}, 'timestamp': '2026-02-23 09:40:56.212467', '_unique_id': 'ebed4f9e933b4acd893049244d1b7f0c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.213 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.213 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.213 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.213 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.213 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.213 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.213 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.213 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.213 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.213 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.213 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.213 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.213 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.213 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.213 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.213 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.213 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.213 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.213 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.213 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.213 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.213 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.213 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.213 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.213 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.213 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.213 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.213 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.213 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.213 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.213 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.214 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.214 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.215 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.216 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4990f2cf-108c-4c78-90ae-7f58fa94dbb6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:40:56.214739', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'c094d248-109b-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11096.350976876, 'message_signature': '6957b7791008f212ce889676c6f2fe916fa42e636f921039b8cac9e09b547bf3'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:40:56.214739', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'c094e260-109b-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11096.350976876, 'message_signature': '8055b70aede0f210c899d0d66253380532c216b70e56b5d5a1d49be9cc064902'}]}, 'timestamp': '2026-02-23 09:40:56.215606', '_unique_id': '944369e32b1d47b48a3acc8f844eb2e1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.216 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.216 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.216 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.216 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.216 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.216 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.216 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.216 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.216 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.216 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.216 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.216 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.216 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.216 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.216 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.216 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.216 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.216 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.216 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.216 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.216 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.216 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.216 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.216 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.216 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.216 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.216 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.216 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.216 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.216 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.216 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.217 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.217 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.219 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cebc0921-5feb-4cd4-aed2-28895b021db0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:40:56.217736', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': 'c095478c-109b-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11096.345105514, 'message_signature': '856396750a1a0ac5837eb1e695506810f2a256f50b717aa2e0cfa53333d06767'}]}, 'timestamp': '2026-02-23 09:40:56.218228', '_unique_id': '41954ab459f544c2bb24ed7c23d488de'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.219 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.219 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.219 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.219 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.219 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.219 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.219 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.219 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.219 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.219 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.219 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.219 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.219 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.219 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.219 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.219 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.219 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.219 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.219 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.219 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.219 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.219 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.219 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.219 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.219 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.219 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.219 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.219 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.219 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.219 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.219 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.220 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.220 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.requests volume: 1283 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.220 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.222 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7cf70b84-0d25-40cc-a2e1-767d232a33f2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1283, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:40:56.220331', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'c095ab14-109b-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11096.3707775, 'message_signature': 'b421791dc9e4f86b885df56545d949eb2de8464c7e315d96972ee53d5ca8d4df'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:40:56.220331', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'c095bbfe-109b-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11096.3707775, 'message_signature': '2e3b401aadb5d51a46f279cd48696b39f455bfb4bbaffb0e2656bcd8cedeffd3'}]}, 'timestamp': '2026-02-23 09:40:56.221180', '_unique_id': '15df9b7b7ec84924bf4d61f764fd4ef4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.222 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.222 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.222 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.222 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.222 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.222 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.222 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.222 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.222 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.222 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.222 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.222 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.222 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.222 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.222 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.222 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.222 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.222 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.222 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.222 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.222 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.222 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.222 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.222 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.222 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.222 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.222 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.222 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.222 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.222 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.222 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.223 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.223 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.224 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5d8aadcf-5c58-4594-9229-35000817e967', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:40:56.223353', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': 'c0962166-109b-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11096.345105514, 'message_signature': 'b36263fb5c22f57bc914cb549417d4a65882ac69217abfbd62d78c2cdcfc0d26'}]}, 'timestamp': '2026-02-23 09:40:56.223804', '_unique_id': 'fc7e87f94e4b401cb2f01898663d0240'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.224 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.224 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.224 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.224 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.224 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.224 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.224 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.224 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.224 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.224 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.224 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.224 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.224 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.224 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.224 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.224 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.224 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.224 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.224 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.224 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.224 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.224 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.224 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.224 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.224 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.224 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.224 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.224 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.224 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.224 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.224 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.225 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.225 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.226 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.226 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.227 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '20092932-e698-428e-a497-6cff72e436e3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:40:56.226206', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': 'c0969088-109b-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11096.345105514, 'message_signature': 'cc6af74ae1fd3fd45d98f24327f045bf731723829ecc890e666ec6db124dd8cb'}]}, 'timestamp': '2026-02-23 09:40:56.226647', '_unique_id': '13aeaa4da52848f589db142a73120565'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.227 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.227 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.227 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.227 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.227 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.227 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.227 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.227 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.227 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.227 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.227 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.227 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.227 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.227 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.227 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.227 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.227 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.227 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.227 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.227 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.227 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.227 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.227 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.227 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.227 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.227 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.227 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.227 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.227 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.227 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.227 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.228 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.228 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.bytes.delta volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.230 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3e4ab703-5ac1-4dcd-b0c2-7228a0c5217e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:40:56.228730', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': 'c096f4b0-109b-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11096.345105514, 'message_signature': 'f0028b4e9dd574f2896263fb2160114f895fab229cab8f64c27f0d169acbf535'}]}, 'timestamp': '2026-02-23 09:40:56.229213', '_unique_id': '4dc50d6beab24bce97aaba28f33d51c5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.230 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.230 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.230 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.230 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.230 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.230 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.230 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.230 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.230 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.230 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.230 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.230 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.230 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.230 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.230 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.230 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.230 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.230 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.230 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.230 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.230 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.230 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.230 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.230 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.230 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.230 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.230 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.230 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.230 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.230 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.230 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.231 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.231 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/memory.usage volume: 51.734375 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.232 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0d56d2e5-eea2-4003-a863-f2bfca6dfef8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.734375, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'timestamp': '2026-02-23T09:40:56.231277', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': 'c09756bc-109b-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11096.341516923, 'message_signature': 'fec2a6bafcfd1b5379571c9908f27814283140f1181877bfe1eaf2674ec29e0d'}]}, 'timestamp': '2026-02-23 09:40:56.231707', '_unique_id': '902ff22a81264d3a9e28345f985eed6d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.232 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.232 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.232 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.232 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.232 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.232 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.232 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.232 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.232 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.232 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.232 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.232 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.232 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.232 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.232 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.232 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.232 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.232 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.232 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.232 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.232 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.232 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.232 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.232 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.232 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.232 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.232 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.232 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.232 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.232 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.232 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.233 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.233 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.235 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3a0e6a10-f0a2-454f-a1af-a47e5bc8986c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:40:56.233811', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': 'c097bc6a-109b-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11096.345105514, 'message_signature': 'ba5c2ba2edfab4aecbc831ecf2bccf85e37ac6efebdf8507e1afdcf3f316dc4a'}]}, 'timestamp': '2026-02-23 09:40:56.234327', '_unique_id': '4cd077cef6b84cdd8f3aa8e1a58ad03b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.235 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.235 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.235 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.235 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.235 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.235 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.235 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.235 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.235 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.235 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.235 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.235 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.235 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.235 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.235 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.235 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.235 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.235 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.235 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.235 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.235 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.235 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.235 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.235 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.235 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.235 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.235 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.235 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.235 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.235 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.235 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.236 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.236 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.bytes volume: 323584 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.236 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.238 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '08cf4ca9-df5f-4a6e-ad05-242bca6e6857', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 323584, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:40:56.236381', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'c0981e1c-109b-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11096.3707775, 'message_signature': 'a0776e00ce98e0421ce18762e4ff7dc56fcd743c3c3bd589ff8d2649997bdeeb'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 
'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:40:56.236381', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'c0982f38-109b-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11096.3707775, 'message_signature': '7225d1b24e4e5ec6464b2c535e5c881f09d070da10d633cc89c2d3f261d4255c'}]}, 'timestamp': '2026-02-23 09:40:56.237261', '_unique_id': 'fcf7ae545aee450f86e409101092f8d4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.238 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.238 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.238 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.238 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.238 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.238 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.238 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.238 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.238 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.238 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.238 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.238 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.238 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.238 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.238 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.238 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.238 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.238 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.238 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.238 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.238 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.238 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.238 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.238 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.238 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.238 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.238 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.238 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.238 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.238 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.238 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.239 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.239 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.bytes volume: 6808 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.240 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '59734280-4fa8-4fa8-ab1c-2ce97173a4ea', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6808, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:40:56.239364', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': 'c09892d4-109b-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11096.345105514, 'message_signature': 'e3224dde3fa9ddb483def28f241ee3e92052cd191f93fc3d2d40c05ae1847de9'}]}, 'timestamp': '2026-02-23 09:40:56.239813', '_unique_id': 'fd9b11dfc10e42448612de2a59fe2f04'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.240 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.240 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.240 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.240 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.240 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.240 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.240 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.240 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.240 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.240 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.240 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.240 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.240 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.240 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.240 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.240 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.240 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.240 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.240 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.240 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.240 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.240 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.240 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.240 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.240 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.240 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.240 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.240 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.240 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.240 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.240 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.241 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.241 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.requests volume: 34 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.242 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.243 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5d8670c2-a117-464c-be68-2f4ff34912c3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 34, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:40:56.241913', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'c098f63e-109b-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11096.3707775, 'message_signature': 'fec79754a6d057997af6c21276d6e201c3b67093111aecc7a7d14583850b6887'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 
'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:40:56.241913', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'c099062e-109b-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11096.3707775, 'message_signature': '59092a01fad7ed81ee98034f87ad798625712c72516beacaf427eb82324a0a99'}]}, 'timestamp': '2026-02-23 09:40:56.242736', '_unique_id': 'fc5005dbdb5640c9b80222fa0046eacc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.243 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.243 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.243 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.243 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.243 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.243 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.243 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.243 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.243 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.243 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.243 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.243 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.243 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.243 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.243 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.243 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.243 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.243 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.243 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.243 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.243 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.243 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.243 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.243 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.243 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.243 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.243 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.243 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.243 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.243 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.243 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.244 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.244 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.245 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.246 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2be1b2f4-2352-4c10-9723-5aad99780166', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:40:56.244843', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'c09969e8-109b-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11096.350976876, 'message_signature': '5a333bf925ae7e80fc5a4e6acf58851d2ccd76e121b2b01404de758b6ee542f4'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 
'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:40:56.244843', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'c0997ac8-109b-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11096.350976876, 'message_signature': 'd7a9a12a43f7d0732ff1c02eee3a48f6c3c9535f995ba3e50bf5cc8b7f480c96'}]}, 'timestamp': '2026-02-23 09:40:56.245740', '_unique_id': 'dff6aa917a034a26a5b1d69852d3171e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.246 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.246 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.246 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.246 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.246 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.246 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.246 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.246 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.246 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.246 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.246 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.246 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.246 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.246 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.246 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.246 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.246 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.246 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.246 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.246 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.246 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.246 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.246 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.246 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.246 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.246 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.246 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.246 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.246 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.246 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.246 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.247 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.248 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.248 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.249 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4e74fdf3-6b6f-4a6b-8a8d-1937b620ec1c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:40:56.248186', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': 'c099ec06-109b-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11096.345105514, 'message_signature': '5d590aacedde745bad049ad14394510b5c141a0a9d05fc587a36d24557d65e85'}]}, 'timestamp': '2026-02-23 09:40:56.248665', '_unique_id': '7c63e075a8ab44f9b2c62055d01a97e1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.249 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.249 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.249 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.249 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.249 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.249 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.249 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.249 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.249 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.249 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.249 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.249 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.249 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.249 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.249 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.249 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.249 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.249 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.249 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.249 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.249 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.249 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.249 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.249 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.249 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.249 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.249 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.249 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.249 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.249 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.249 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.250 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.250 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.latency volume: 698220374 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.250 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.latency volume: 21338362 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.251 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '90c80e91-251d-4721-b629-a6314bb03116', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 698220374, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:40:56.250291', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'c09a3a58-109b-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11096.3707775, 'message_signature': 'e94c2778a868870cb9e853be46e948de46d0f580e8c0303e25f3802c61cb1d12'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 21338362, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 
'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:40:56.250291', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'c09a4444-109b-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11096.3707775, 'message_signature': '5f9c42a976f8da7a6a4ff739dce8627df2f5cbf56503b9f0f6511978607814d7'}]}, 'timestamp': '2026-02-23 09:40:56.250805', '_unique_id': '6f4f80db7c614762bdefe12396e5dd4e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.251 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.251 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.251 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.251 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.251 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.251 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.251 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.251 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.251 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.251 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.251 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.251 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.251 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.251 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.251 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.251 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.251 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.251 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.251 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.251 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.251 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.251 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.251 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.251 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.251 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.251 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.251 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.251 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.251 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.251 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.251 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:40:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:40:56.252 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 23 09:40:56 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8687 DF PROTO=TCP SPT=43934 DPT=9102 SEQ=1319889552 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BFEEFC60000000001030307) 
Feb 23 09:40:57 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:57.467 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:40:57 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29807 DF PROTO=TCP SPT=59864 DPT=9102 SEQ=1035056940 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BFEF6060000000001030307) 
Feb 23 09:40:58 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:40:58.106 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:40:58 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8688 DF PROTO=TCP SPT=43934 DPT=9102 SEQ=1319889552 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BFEF7C60000000001030307) 
Feb 23 09:40:58 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.
Feb 23 09:40:58 np0005626463.localdomain podman[285337]: 2026-02-23 09:40:58.918886738 +0000 UTC m=+0.083275965 container health_status da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 23 09:40:58 np0005626463.localdomain podman[285337]: 2026-02-23 09:40:58.926379281 +0000 UTC m=+0.090768438 container exec_died da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 23 09:40:58 np0005626463.localdomain systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Deactivated successfully.
Feb 23 09:40:59 np0005626463.localdomain sshd[285360]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:41:00 np0005626463.localdomain sshd[285360]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:41:02 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8689 DF PROTO=TCP SPT=43934 DPT=9102 SEQ=1319889552 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BFF07860000000001030307) 
Feb 23 09:41:02 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:41:02.502 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:41:03 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:41:03.107 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:41:03 np0005626463.localdomain rsyslogd[758]: imjournal: 8899 messages lost due to rate-limiting (20000 allowed within 600 seconds)
Feb 23 09:41:06 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.
Feb 23 09:41:06 np0005626463.localdomain systemd[1]: tmp-crun.aLEaEc.mount: Deactivated successfully.
Feb 23 09:41:06 np0005626463.localdomain podman[285362]: 2026-02-23 09:41:06.903921966 +0000 UTC m=+0.080223461 container health_status 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, com.redhat.component=ubi9-minimal-container, release=1770267347, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-02-05T04:57:10Z, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, name=ubi9/ubi-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=openstack_network_exporter, version=9.7, maintainer=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, architecture=x86_64, distribution-scope=public, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, build-date=2026-02-05T04:57:10Z, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vendor=Red Hat, Inc.)
Feb 23 09:41:06 np0005626463.localdomain podman[285362]: 2026-02-23 09:41:06.94531229 +0000 UTC m=+0.121613725 container exec_died 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, build-date=2026-02-05T04:57:10Z, name=ubi9/ubi-minimal, maintainer=Red Hat, Inc., io.openshift.expose-services=, release=1770267347, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., architecture=x86_64, container_name=openstack_network_exporter, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-02-05T04:57:10Z, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-type=git, distribution-scope=public, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, managed_by=edpm_ansible)
Feb 23 09:41:06 np0005626463.localdomain systemd[1]: 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.service: Deactivated successfully.
Feb 23 09:41:07 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:41:07.540 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:41:08 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:41:08.110 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:41:09 np0005626463.localdomain podman[242954]: time="2026-02-23T09:41:09Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 23 09:41:09 np0005626463.localdomain podman[242954]: @ - - [23/Feb/2026:09:41:09 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 148535 "" "Go-http-client/1.1"
Feb 23 09:41:09 np0005626463.localdomain podman[242954]: @ - - [23/Feb/2026:09:41:09 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16796 "" "Go-http-client/1.1"
Feb 23 09:41:10 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8690 DF PROTO=TCP SPT=43934 DPT=9102 SEQ=1319889552 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BFF28060000000001030307) 
Feb 23 09:41:12 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:41:12.258 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:41:12 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:41:12.259 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:41:12 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:41:12.285 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:41:12 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:41:12.285 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 23 09:41:12 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:41:12.285 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 23 09:41:12 np0005626463.localdomain sudo[285382]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 09:41:12 np0005626463.localdomain sudo[285382]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:41:12 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.
Feb 23 09:41:12 np0005626463.localdomain sudo[285382]: pam_unix(sudo:session): session closed for user root
Feb 23 09:41:12 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.
Feb 23 09:41:12 np0005626463.localdomain sudo[285412]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 23 09:41:12 np0005626463.localdomain sudo[285412]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:41:12 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:41:12.587 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:41:12 np0005626463.localdomain podman[285401]: 2026-02-23 09:41:12.610280432 +0000 UTC m=+0.125086834 container health_status bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Feb 23 09:41:12 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:41:12.612 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 23 09:41:12 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:41:12.613 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquired lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 23 09:41:12 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:41:12.613 282211 DEBUG nova.network.neutron [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 23 09:41:12 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:41:12.613 282211 DEBUG nova.objects.instance [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c2a7d92b-952f-46a7-8a6a-3322a48fcf4b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 23 09:41:12 np0005626463.localdomain podman[285401]: 2026-02-23 09:41:12.622250863 +0000 UTC m=+0.137057275 container exec_died bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Feb 23 09:41:12 np0005626463.localdomain systemd[1]: bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.service: Deactivated successfully.
Feb 23 09:41:12 np0005626463.localdomain podman[285399]: 2026-02-23 09:41:12.683341009 +0000 UTC m=+0.201224276 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_managed=true, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.43.0, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Feb 23 09:41:12 np0005626463.localdomain podman[285399]: 2026-02-23 09:41:12.716086795 +0000 UTC m=+0.233970092 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260216, config_id=ovn_controller, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_controller, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Feb 23 09:41:12 np0005626463.localdomain systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully.
Feb 23 09:41:12 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:41:12.961 282211 DEBUG nova.network.neutron [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Updating instance_info_cache with network_info: [{"id": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "address": "fa:16:3e:a0:9d:00", "network": {"id": "9da5b53d-3184-450f-9a5b-bdba1a6c9f6d", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "37b8098efb0d4ecc90b451a2db0e966f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa27e5011-20", "ovs_interfaceid": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 23 09:41:12 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:41:12.989 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Releasing lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 23 09:41:12 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:41:12.989 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 23 09:41:12 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:41:12.990 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:41:12 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:41:12.990 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:41:12 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:41:12.991 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:41:12 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:41:12.991 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:41:12 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:41:12.991 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:41:12 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:41:12.991 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:41:12 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:41:12.992 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 23 09:41:12 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:41:12.992 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:41:13 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:41:13.009 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 09:41:13 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:41:13.010 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 09:41:13 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:41:13.010 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 09:41:13 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:41:13.011 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Auditing locally available compute resources for np0005626463.localdomain (node: np0005626463.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 23 09:41:13 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:41:13.011 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 09:41:13 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:41:13.111 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:41:13 np0005626463.localdomain sudo[285412]: pam_unix(sudo:session): session closed for user root
Feb 23 09:41:13 np0005626463.localdomain openstack_network_exporter[245358]: ERROR   09:41:13 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 23 09:41:13 np0005626463.localdomain openstack_network_exporter[245358]: 
Feb 23 09:41:13 np0005626463.localdomain openstack_network_exporter[245358]: ERROR   09:41:13 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 23 09:41:13 np0005626463.localdomain openstack_network_exporter[245358]: 
Feb 23 09:41:13 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:41:13.472 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.461s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 09:41:13 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:41:13.538 282211 DEBUG nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 23 09:41:13 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:41:13.539 282211 DEBUG nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 23 09:41:13 np0005626463.localdomain sudo[285518]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 09:41:13 np0005626463.localdomain sudo[285518]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:41:13 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:41:13.778 282211 WARNING nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 23 09:41:13 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:41:13.781 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Hypervisor/Node resource view: name=np0005626463.localdomain free_ram=12330MB free_disk=41.8366584777832GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 23 09:41:13 np0005626463.localdomain sudo[285518]: pam_unix(sudo:session): session closed for user root
Feb 23 09:41:13 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:41:13.781 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 09:41:13 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:41:13.782 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 09:41:13 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:41:13.881 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Instance c2a7d92b-952f-46a7-8a6a-3322a48fcf4b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 23 09:41:13 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:41:13.881 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 23 09:41:13 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:41:13.882 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Final resource view: name=np0005626463.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 23 09:41:13 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:41:13.937 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 09:41:14 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:41:14.396 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.458s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 09:41:14 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:41:14.404 282211 DEBUG nova.compute.provider_tree [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Inventory has not changed in ProviderTree for provider: be63d86c-a403-4ec9-a515-07ea2962cb4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 23 09:41:14 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:41:14.426 282211 DEBUG nova.scheduler.client.report [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Inventory has not changed for provider be63d86c-a403-4ec9-a515-07ea2962cb4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 23 09:41:14 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:41:14.458 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Compute_service record updated for np0005626463.localdomain:np0005626463.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 23 09:41:14 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:41:14.459 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.677s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 09:41:15 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.
Feb 23 09:41:15 np0005626463.localdomain podman[285558]: 2026-02-23 09:41:15.908650516 +0000 UTC m=+0.082119479 container health_status be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, 
managed_by=edpm_ansible, tcib_managed=true)
Feb 23 09:41:15 np0005626463.localdomain podman[285558]: 2026-02-23 09:41:15.918533062 +0000 UTC m=+0.092002055 container exec_died be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, 
io.buildah.version=1.43.0)
Feb 23 09:41:15 np0005626463.localdomain systemd[1]: be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.service: Deactivated successfully.
Feb 23 09:41:17 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:41:17.626 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:41:18 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:41:18.115 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:41:21 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.
Feb 23 09:41:21 np0005626463.localdomain podman[285578]: 2026-02-23 09:41:21.905425167 +0000 UTC m=+0.081640116 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260216)
Feb 23 09:41:21 np0005626463.localdomain snmpd[67690]: empty variable list in _query
Feb 23 09:41:21 np0005626463.localdomain snmpd[67690]: empty variable list in _query
Feb 23 09:41:21 np0005626463.localdomain podman[285578]: 2026-02-23 09:41:21.914309232 +0000 UTC m=+0.090524171 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.43.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible)
Feb 23 09:41:21 np0005626463.localdomain systemd[1]: 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully.
Feb 23 09:41:22 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:41:22.651 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:41:23 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:41:23.117 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:41:24 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50868 DF PROTO=TCP SPT=41664 DPT=9102 SEQ=200366359 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BFF5BFF0000000001030307) 
Feb 23 09:41:25 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50869 DF PROTO=TCP SPT=41664 DPT=9102 SEQ=200366359 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BFF60060000000001030307) 
Feb 23 09:41:27 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8691 DF PROTO=TCP SPT=43934 DPT=9102 SEQ=1319889552 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BFF68060000000001030307) 
Feb 23 09:41:27 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50870 DF PROTO=TCP SPT=41664 DPT=9102 SEQ=200366359 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BFF68060000000001030307) 
Feb 23 09:41:27 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:41:27.693 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:41:28 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:41:28.118 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:41:28 np0005626463.localdomain sshd[285597]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:41:28 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27323 DF PROTO=TCP SPT=36860 DPT=9102 SEQ=3021588974 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BFF6C060000000001030307) 
Feb 23 09:41:29 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.
Feb 23 09:41:29 np0005626463.localdomain podman[285599]: 2026-02-23 09:41:29.91161492 +0000 UTC m=+0.086209217 container health_status da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 23 09:41:29 np0005626463.localdomain podman[285599]: 2026-02-23 09:41:29.928239975 +0000 UTC m=+0.102834262 container exec_died da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 23 09:41:29 np0005626463.localdomain systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Deactivated successfully.
Feb 23 09:41:30 np0005626463.localdomain sshd[285597]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:41:31 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50871 DF PROTO=TCP SPT=41664 DPT=9102 SEQ=200366359 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BFF77C60000000001030307) 
Feb 23 09:41:32 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:41:32.731 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:41:33 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:41:33.121 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:41:37 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:41:37.756 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:41:37 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.
Feb 23 09:41:37 np0005626463.localdomain podman[285622]: 2026-02-23 09:41:37.901147904 +0000 UTC m=+0.077466665 container health_status 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, build-date=2026-02-05T04:57:10Z, distribution-scope=public, managed_by=edpm_ansible, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image 
Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., vcs-type=git, org.opencontainers.image.created=2026-02-05T04:57:10Z, release=1770267347, maintainer=Red Hat, Inc., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, config_id=openstack_network_exporter, name=ubi9/ubi-minimal, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.component=ubi9-minimal-container, version=9.7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Feb 23 09:41:37 np0005626463.localdomain podman[285622]: 2026-02-23 09:41:37.91615287 +0000 UTC m=+0.092471641 container exec_died 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, config_id=openstack_network_exporter, org.opencontainers.image.created=2026-02-05T04:57:10Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., name=ubi9/ubi-minimal, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2026-02-05T04:57:10Z, version=9.7, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.openshift.expose-services=, release=1770267347)
Feb 23 09:41:37 np0005626463.localdomain systemd[1]: 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.service: Deactivated successfully.
Feb 23 09:41:38 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:41:38.122 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:41:38 np0005626463.localdomain sshd[285643]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:41:38 np0005626463.localdomain sshd[285643]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:41:39 np0005626463.localdomain podman[242954]: time="2026-02-23T09:41:39Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 23 09:41:39 np0005626463.localdomain podman[242954]: @ - - [23/Feb/2026:09:41:39 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 148535 "" "Go-http-client/1.1"
Feb 23 09:41:39 np0005626463.localdomain podman[242954]: @ - - [23/Feb/2026:09:41:39 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16795 "" "Go-http-client/1.1"
Feb 23 09:41:39 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50872 DF PROTO=TCP SPT=41664 DPT=9102 SEQ=200366359 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BFF98070000000001030307) 
Feb 23 09:41:42 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:41:42.790 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:41:42 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.
Feb 23 09:41:42 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.
Feb 23 09:41:42 np0005626463.localdomain podman[285645]: 2026-02-23 09:41:42.910761288 +0000 UTC m=+0.080973034 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=8419493e1fd846703d277695e03fc5eb)
Feb 23 09:41:42 np0005626463.localdomain podman[285646]: 2026-02-23 09:41:42.961553084 +0000 UTC m=+0.127934041 container health_status bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 23 09:41:42 np0005626463.localdomain podman[285645]: 2026-02-23 09:41:42.974289719 +0000 UTC m=+0.144501445 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Feb 23 09:41:42 np0005626463.localdomain systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully.
Feb 23 09:41:43 np0005626463.localdomain podman[285646]: 2026-02-23 09:41:43.025107357 +0000 UTC m=+0.191488304 container exec_died bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Feb 23 09:41:43 np0005626463.localdomain systemd[1]: bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.service: Deactivated successfully.
Feb 23 09:41:43 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:41:43.124 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:41:43 np0005626463.localdomain openstack_network_exporter[245358]: ERROR   09:41:43 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 23 09:41:43 np0005626463.localdomain openstack_network_exporter[245358]: 
Feb 23 09:41:43 np0005626463.localdomain openstack_network_exporter[245358]: ERROR   09:41:43 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 23 09:41:43 np0005626463.localdomain openstack_network_exporter[245358]: 
Feb 23 09:41:46 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.
Feb 23 09:41:46 np0005626463.localdomain podman[285693]: 2026-02-23 09:41:46.903719288 +0000 UTC m=+0.080052095 container health_status be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_managed=true, config_id=ceilometer_agent_compute, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20260216, 
org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Feb 23 09:41:46 np0005626463.localdomain podman[285693]: 2026-02-23 09:41:46.943254145 +0000 UTC m=+0.119587002 container exec_died be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, 
io.buildah.version=1.43.0)
Feb 23 09:41:46 np0005626463.localdomain systemd[1]: be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.service: Deactivated successfully.
Feb 23 09:41:47 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:41:47.817 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:41:48 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:41:48.126 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:41:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:41:48.544 163572 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 09:41:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:41:48.545 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 09:41:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:41:48.546 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 09:41:52 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.
Feb 23 09:41:52 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:41:52.820 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:41:52 np0005626463.localdomain podman[285712]: 2026-02-23 09:41:52.910041253 +0000 UTC m=+0.085008819 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ovn_metadata_agent)
Feb 23 09:41:52 np0005626463.localdomain podman[285712]: 2026-02-23 09:41:52.94376542 +0000 UTC m=+0.118732976 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.43.0, managed_by=edpm_ansible, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20260216, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Feb 23 09:41:52 np0005626463.localdomain systemd[1]: 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully.
Feb 23 09:41:53 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:41:53.129 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:41:53 np0005626463.localdomain sshd[285730]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:41:53 np0005626463.localdomain sshd[285730]: Accepted publickey for zuul from 38.102.83.114 port 32876 ssh2: RSA SHA256:/ShS2J5Dq7o9P59e/NmgQORSAcJOBwu46Huo03HBdB4
Feb 23 09:41:53 np0005626463.localdomain systemd-logind[759]: New session 61 of user zuul.
Feb 23 09:41:53 np0005626463.localdomain systemd[1]: Started Session 61 of User zuul.
Feb 23 09:41:53 np0005626463.localdomain sshd[285730]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Feb 23 09:41:54 np0005626463.localdomain sudo[285750]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gnrpmylxfwdhvlhluuhvevyzwxvewkju ; /usr/bin/python3
Feb 23 09:41:54 np0005626463.localdomain sudo[285750]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 23 09:41:54 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16173 DF PROTO=TCP SPT=43118 DPT=9102 SEQ=3047473331 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BFFD1310000000001030307) 
Feb 23 09:41:54 np0005626463.localdomain python3[285752]: ansible-ansible.legacy.command Invoked with _raw_params=subscription-manager unregister _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 09:41:54 np0005626463.localdomain subscription-manager[285753]: Unregistered machine with identity: 71d8a449-76d3-4525-90bb-1ec088bb454f
Feb 23 09:41:54 np0005626463.localdomain systemd-journald[47710]: Field hash table of /run/log/journal/c0212a8b024a111cfc61293864f36c87/system.journal has a fill level at 75.7 (252 of 333 items), suggesting rotation.
Feb 23 09:41:54 np0005626463.localdomain systemd-journald[47710]: /run/log/journal/c0212a8b024a111cfc61293864f36c87/system.journal: Journal header limits reached or header out-of-date, rotating.
Feb 23 09:41:54 np0005626463.localdomain rsyslogd[758]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Feb 23 09:41:54 np0005626463.localdomain rsyslogd[758]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Feb 23 09:41:54 np0005626463.localdomain rsyslogd[758]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Feb 23 09:41:54 np0005626463.localdomain sudo[285750]: pam_unix(sudo:session): session closed for user root
Feb 23 09:41:55 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16174 DF PROTO=TCP SPT=43118 DPT=9102 SEQ=3047473331 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BFFD5470000000001030307) 
Feb 23 09:41:55 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50873 DF PROTO=TCP SPT=41664 DPT=9102 SEQ=200366359 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BFFD8060000000001030307) 
Feb 23 09:41:57 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16175 DF PROTO=TCP SPT=43118 DPT=9102 SEQ=3047473331 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BFFDD470000000001030307) 
Feb 23 09:41:57 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:41:57.840 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:41:58 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:41:58.130 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:41:59 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8692 DF PROTO=TCP SPT=43934 DPT=9102 SEQ=1319889552 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BFFE6060000000001030307) 
Feb 23 09:42:00 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.
Feb 23 09:42:00 np0005626463.localdomain podman[285756]: 2026-02-23 09:42:00.909545969 +0000 UTC m=+0.083077650 container health_status da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 23 09:42:00 np0005626463.localdomain podman[285756]: 2026-02-23 09:42:00.917554398 +0000 UTC m=+0.091086109 container exec_died da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 23 09:42:00 np0005626463.localdomain systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Deactivated successfully.
Feb 23 09:42:01 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16176 DF PROTO=TCP SPT=43118 DPT=9102 SEQ=3047473331 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BFFED070000000001030307) 
Feb 23 09:42:02 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:42:02.877 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:42:03 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:42:03.131 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:42:03 np0005626463.localdomain sshd[285779]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:42:03 np0005626463.localdomain sshd[285779]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:42:07 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:42:07.926 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:42:08 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:42:08.132 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:42:08 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.
Feb 23 09:42:08 np0005626463.localdomain podman[285781]: 2026-02-23 09:42:08.900938167 +0000 UTC m=+0.076790338 container health_status 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, build-date=2026-02-05T04:57:10Z, release=1770267347, distribution-scope=public, version=9.7, io.openshift.expose-services=, name=ubi9/ubi-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-02-05T04:57:10Z, container_name=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 
'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, vcs-type=git)
Feb 23 09:42:08 np0005626463.localdomain podman[285781]: 2026-02-23 09:42:08.917294718 +0000 UTC m=+0.093146889 container exec_died 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, maintainer=Red Hat, Inc., vcs-type=git, release=1770267347, name=ubi9/ubi-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2026-02-05T04:57:10Z, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, 
org.opencontainers.image.created=2026-02-05T04:57:10Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, distribution-scope=public, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, io.buildah.version=1.33.7, managed_by=edpm_ansible, io.openshift.expose-services=, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, version=9.7)
Feb 23 09:42:08 np0005626463.localdomain systemd[1]: 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.service: Deactivated successfully.
Feb 23 09:42:09 np0005626463.localdomain podman[242954]: time="2026-02-23T09:42:09Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 23 09:42:09 np0005626463.localdomain podman[242954]: @ - - [23/Feb/2026:09:42:09 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 148535 "" "Go-http-client/1.1"
Feb 23 09:42:09 np0005626463.localdomain podman[242954]: @ - - [23/Feb/2026:09:42:09 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16796 "" "Go-http-client/1.1"
Feb 23 09:42:09 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16177 DF PROTO=TCP SPT=43118 DPT=9102 SEQ=3047473331 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3C000E060000000001030307) 
Feb 23 09:42:12 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:42:12.929 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:42:13 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:42:13.134 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:42:13 np0005626463.localdomain openstack_network_exporter[245358]: ERROR   09:42:13 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 23 09:42:13 np0005626463.localdomain openstack_network_exporter[245358]: 
Feb 23 09:42:13 np0005626463.localdomain openstack_network_exporter[245358]: ERROR   09:42:13 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 23 09:42:13 np0005626463.localdomain openstack_network_exporter[245358]: 
Feb 23 09:42:13 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.
Feb 23 09:42:13 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.
Feb 23 09:42:13 np0005626463.localdomain podman[285802]: 2026-02-23 09:42:13.904213193 +0000 UTC m=+0.077735597 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20260216, tcib_managed=true, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 23 09:42:13 np0005626463.localdomain podman[285803]: 2026-02-23 09:42:13.981986571 +0000 UTC m=+0.151035036 container health_status bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Feb 23 09:42:13 np0005626463.localdomain podman[285802]: 2026-02-23 09:42:13.989378462 +0000 UTC m=+0.162900886 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, container_name=ovn_controller, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20260216, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 23 09:42:14 np0005626463.localdomain systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully.
Feb 23 09:42:14 np0005626463.localdomain sudo[285837]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 09:42:14 np0005626463.localdomain podman[285803]: 2026-02-23 09:42:14.042530272 +0000 UTC m=+0.211578757 container exec_died bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 23 09:42:14 np0005626463.localdomain sudo[285837]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:42:14 np0005626463.localdomain sudo[285837]: pam_unix(sudo:session): session closed for user root
Feb 23 09:42:14 np0005626463.localdomain systemd[1]: bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.service: Deactivated successfully.
Feb 23 09:42:14 np0005626463.localdomain sudo[285869]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 23 09:42:14 np0005626463.localdomain sudo[285869]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:42:14 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:42:14.461 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:42:14 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:42:14.462 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:42:14 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:42:14.462 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 23 09:42:14 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:42:14.462 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 23 09:42:14 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:42:14.635 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 23 09:42:14 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:42:14.636 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquired lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 23 09:42:14 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:42:14.636 282211 DEBUG nova.network.neutron [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 23 09:42:14 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:42:14.636 282211 DEBUG nova.objects.instance [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c2a7d92b-952f-46a7-8a6a-3322a48fcf4b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 23 09:42:14 np0005626463.localdomain sudo[285869]: pam_unix(sudo:session): session closed for user root
Feb 23 09:42:15 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:42:15.015 282211 DEBUG nova.network.neutron [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Updating instance_info_cache with network_info: [{"id": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "address": "fa:16:3e:a0:9d:00", "network": {"id": "9da5b53d-3184-450f-9a5b-bdba1a6c9f6d", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "37b8098efb0d4ecc90b451a2db0e966f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa27e5011-20", "ovs_interfaceid": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 23 09:42:15 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:42:15.059 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Releasing lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 23 09:42:15 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:42:15.060 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 23 09:42:15 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:42:15.061 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:42:15 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:42:15.061 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:42:15 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:42:15.061 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:42:15 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:42:15.062 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:42:15 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:42:15.062 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:42:15 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:42:15.062 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:42:15 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:42:15.063 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 23 09:42:15 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:42:15.063 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:42:15 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:42:15.099 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 09:42:15 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:42:15.100 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 09:42:15 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:42:15.100 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 09:42:15 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:42:15.101 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Auditing locally available compute resources for np0005626463.localdomain (node: np0005626463.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 23 09:42:15 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:42:15.101 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 09:42:15 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:42:15.636 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.535s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 09:42:15 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:42:15.702 282211 DEBUG nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 23 09:42:15 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:42:15.703 282211 DEBUG nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 23 09:42:15 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:42:15.916 282211 WARNING nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 23 09:42:15 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:42:15.918 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Hypervisor/Node resource view: name=np0005626463.localdomain free_ram=12328MB free_disk=41.8366584777832GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 23 09:42:15 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:42:15.919 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 09:42:15 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:42:15.919 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 09:42:16 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:42:16.017 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Instance c2a7d92b-952f-46a7-8a6a-3322a48fcf4b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 23 09:42:16 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:42:16.017 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 23 09:42:16 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:42:16.018 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Final resource view: name=np0005626463.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 23 09:42:16 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:42:16.051 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 09:42:16 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:42:16.507 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.456s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 09:42:16 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:42:16.513 282211 DEBUG nova.compute.provider_tree [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Inventory has not changed in ProviderTree for provider: be63d86c-a403-4ec9-a515-07ea2962cb4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 23 09:42:16 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:42:16.535 282211 DEBUG nova.scheduler.client.report [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Inventory has not changed for provider be63d86c-a403-4ec9-a515-07ea2962cb4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 23 09:42:16 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:42:16.537 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Compute_service record updated for np0005626463.localdomain:np0005626463.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 23 09:42:16 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:42:16.538 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.619s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 09:42:17 np0005626463.localdomain sudo[285964]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 09:42:17 np0005626463.localdomain sudo[285964]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:42:17 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.
Feb 23 09:42:17 np0005626463.localdomain sudo[285964]: pam_unix(sudo:session): session closed for user root
Feb 23 09:42:17 np0005626463.localdomain podman[285982]: 2026-02-23 09:42:17.528275176 +0000 UTC m=+0.077429519 container health_status be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260216)
Feb 23 09:42:17 np0005626463.localdomain podman[285982]: 2026-02-23 09:42:17.539836777 +0000 UTC m=+0.088991170 container exec_died be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.license=GPLv2)
Feb 23 09:42:17 np0005626463.localdomain systemd[1]: be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.service: Deactivated successfully.
Feb 23 09:42:17 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:42:17.965 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:42:18 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:42:18.135 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:42:23 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:42:23.011 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:42:23 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:42:23.137 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:42:23 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.
Feb 23 09:42:23 np0005626463.localdomain systemd[1]: virtsecretd.service: Deactivated successfully.
Feb 23 09:42:23 np0005626463.localdomain podman[286002]: 2026-02-23 09:42:23.360605442 +0000 UTC m=+0.094314416 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ovn_metadata_agent, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS)
Feb 23 09:42:23 np0005626463.localdomain podman[286002]: 2026-02-23 09:42:23.365302009 +0000 UTC m=+0.099011023 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.43.0, managed_by=edpm_ansible, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Feb 23 09:42:23 np0005626463.localdomain systemd[1]: 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully.
Feb 23 09:42:23 np0005626463.localdomain sudo[286020]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 09:42:23 np0005626463.localdomain sudo[286020]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:42:23 np0005626463.localdomain sudo[286020]: pam_unix(sudo:session): session closed for user root
Feb 23 09:42:24 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10004 DF PROTO=TCP SPT=38402 DPT=9102 SEQ=3427797769 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3C0046600000000001030307) 
Feb 23 09:42:25 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10005 DF PROTO=TCP SPT=38402 DPT=9102 SEQ=3427797769 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3C004A860000000001030307) 
Feb 23 09:42:25 np0005626463.localdomain sudo[286038]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 09:42:25 np0005626463.localdomain sudo[286038]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:42:25 np0005626463.localdomain sudo[286038]: pam_unix(sudo:session): session closed for user root
Feb 23 09:42:26 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16178 DF PROTO=TCP SPT=43118 DPT=9102 SEQ=3047473331 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3C004E060000000001030307) 
Feb 23 09:42:26 np0005626463.localdomain sudo[286056]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 09:42:26 np0005626463.localdomain sudo[286056]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:42:26 np0005626463.localdomain sudo[286056]: pam_unix(sudo:session): session closed for user root
Feb 23 09:42:26 np0005626463.localdomain sshd[286074]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:42:26 np0005626463.localdomain sshd[286074]: Accepted publickey for tripleo-admin from 192.168.122.11 port 48530 ssh2: RSA SHA256:/ShS2J5Dq7o9P59e/NmgQORSAcJOBwu46Huo03HBdB4
Feb 23 09:42:26 np0005626463.localdomain systemd-logind[759]: New session 62 of user tripleo-admin.
Feb 23 09:42:26 np0005626463.localdomain systemd[1]: Created slice User Slice of UID 1003.
Feb 23 09:42:26 np0005626463.localdomain systemd[1]: Starting User Runtime Directory /run/user/1003...
Feb 23 09:42:26 np0005626463.localdomain systemd[1]: Finished User Runtime Directory /run/user/1003.
Feb 23 09:42:26 np0005626463.localdomain systemd[1]: Starting User Manager for UID 1003...
Feb 23 09:42:26 np0005626463.localdomain systemd[286078]: pam_unix(systemd-user:session): session opened for user tripleo-admin(uid=1003) by (uid=0)
Feb 23 09:42:26 np0005626463.localdomain systemd[286078]: Queued start job for default target Main User Target.
Feb 23 09:42:26 np0005626463.localdomain systemd[286078]: Created slice User Application Slice.
Feb 23 09:42:26 np0005626463.localdomain systemd[286078]: Started Mark boot as successful after the user session has run 2 minutes.
Feb 23 09:42:26 np0005626463.localdomain systemd[286078]: Started Daily Cleanup of User's Temporary Directories.
Feb 23 09:42:26 np0005626463.localdomain systemd[286078]: Reached target Paths.
Feb 23 09:42:26 np0005626463.localdomain systemd[286078]: Reached target Timers.
Feb 23 09:42:26 np0005626463.localdomain systemd[286078]: Starting D-Bus User Message Bus Socket...
Feb 23 09:42:26 np0005626463.localdomain systemd[286078]: Starting Create User's Volatile Files and Directories...
Feb 23 09:42:26 np0005626463.localdomain systemd[286078]: Listening on D-Bus User Message Bus Socket.
Feb 23 09:42:26 np0005626463.localdomain systemd[286078]: Reached target Sockets.
Feb 23 09:42:26 np0005626463.localdomain systemd[286078]: Finished Create User's Volatile Files and Directories.
Feb 23 09:42:26 np0005626463.localdomain systemd[286078]: Reached target Basic System.
Feb 23 09:42:26 np0005626463.localdomain systemd[286078]: Reached target Main User Target.
Feb 23 09:42:26 np0005626463.localdomain systemd[286078]: Startup finished in 150ms.
Feb 23 09:42:26 np0005626463.localdomain systemd[1]: Started User Manager for UID 1003.
Feb 23 09:42:26 np0005626463.localdomain systemd[1]: Started Session 62 of User tripleo-admin.
Feb 23 09:42:26 np0005626463.localdomain sshd[286074]: pam_unix(sshd:session): session opened for user tripleo-admin(uid=1003) by (uid=0)
Feb 23 09:42:27 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10006 DF PROTO=TCP SPT=38402 DPT=9102 SEQ=3427797769 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3C0052860000000001030307) 
Feb 23 09:42:27 np0005626463.localdomain sudo[286220]: tripleo-admin : TTY=pts/1 ; PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jzhlvbmlastzvreccfkoprrusutkqamr ; /usr/bin/python3 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1771839747.0022876-62308-111228369076647/AnsiballZ_blockinfile.py
Feb 23 09:42:27 np0005626463.localdomain sudo[286220]: pam_unix(sudo:session): session opened for user root(uid=0) by tripleo-admin(uid=1003)
Feb 23 09:42:27 np0005626463.localdomain python3[286222]: ansible-ansible.builtin.blockinfile Invoked with marker_begin=BEGIN ceph firewall rules marker_end=END ceph firewall rules path=/etc/nftables/edpm-rules.nft mode=0644 block=# 100 ceph_alertmanager (9093)
                                                          add rule inet filter EDPM_INPUT tcp dport { 9093 } ct state new counter accept comment "100 ceph_alertmanager"
                                                          # 100 ceph_dashboard (8443)
                                                          add rule inet filter EDPM_INPUT tcp dport { 8443 } ct state new counter accept comment "100 ceph_dashboard"
                                                          # 100 ceph_grafana (3100)
                                                          add rule inet filter EDPM_INPUT tcp dport { 3100 } ct state new counter accept comment "100 ceph_grafana"
                                                          # 100 ceph_prometheus (9092)
                                                          add rule inet filter EDPM_INPUT tcp dport { 9092 } ct state new counter accept comment "100 ceph_prometheus"
                                                          # 100 ceph_rgw (8080)
                                                          add rule inet filter EDPM_INPUT tcp dport { 8080 } ct state new counter accept comment "100 ceph_rgw"
                                                          # 110 ceph_mon (6789, 3300, 9100)
                                                          add rule inet filter EDPM_INPUT tcp dport { 6789,3300,9100 } ct state new counter accept comment "110 ceph_mon"
                                                          # 112 ceph_mds (6800-7300, 9100)
                                                          add rule inet filter EDPM_INPUT tcp dport { 6800-7300,9100 } ct state new counter accept comment "112 ceph_mds"
                                                          # 113 ceph_mgr (6800-7300, 8444)
                                                          add rule inet filter EDPM_INPUT tcp dport { 6800-7300,8444 } ct state new counter accept comment "113 ceph_mgr"
                                                          # 120 ceph_nfs (2049, 12049)
                                                          add rule inet filter EDPM_INPUT tcp dport { 2049,12049 } ct state new counter accept comment "120 ceph_nfs"
                                                          # 123 ceph_dashboard (9090, 9094, 9283)
                                                          add rule inet filter EDPM_INPUT tcp dport { 9090,9094,9283 } ct state new counter accept comment "123 ceph_dashboard"
                                                           insertbefore=^# Lock down INPUT chains state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False backup=False unsafe_writes=False insertafter=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:42:27 np0005626463.localdomain sudo[286220]: pam_unix(sudo:session): session closed for user root
Feb 23 09:42:28 np0005626463.localdomain sshd[286278]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:42:28 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:42:28.059 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:42:28 np0005626463.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50874 DF PROTO=TCP SPT=41664 DPT=9102 SEQ=200366359 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3C0056070000000001030307) 
Feb 23 09:42:28 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:42:28.138 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:42:28 np0005626463.localdomain sshd[286278]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:42:28 np0005626463.localdomain sudo[286366]: tripleo-admin : TTY=pts/1 ; PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-srapwekdkaxuqgaaqqpayodmtqdsbewv ; /usr/bin/python3 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1771839747.7931294-62322-26786782706662/AnsiballZ_systemd.py
Feb 23 09:42:28 np0005626463.localdomain sudo[286366]: pam_unix(sudo:session): session opened for user root(uid=0) by tripleo-admin(uid=1003)
Feb 23 09:42:28 np0005626463.localdomain python3[286368]: ansible-ansible.builtin.systemd Invoked with name=nftables state=restarted enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 09:42:28 np0005626463.localdomain systemd[1]: Stopping Netfilter Tables...
Feb 23 09:42:28 np0005626463.localdomain systemd[1]: nftables.service: Deactivated successfully.
Feb 23 09:42:28 np0005626463.localdomain systemd[1]: Stopped Netfilter Tables.
Feb 23 09:42:28 np0005626463.localdomain systemd[1]: Starting Netfilter Tables...
Feb 23 09:42:28 np0005626463.localdomain systemd[1]: Finished Netfilter Tables.
Feb 23 09:42:28 np0005626463.localdomain sudo[286366]: pam_unix(sudo:session): session closed for user root
Feb 23 09:42:31 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.
Feb 23 09:42:31 np0005626463.localdomain systemd[1]: tmp-crun.2S977Q.mount: Deactivated successfully.
Feb 23 09:42:31 np0005626463.localdomain podman[286393]: 2026-02-23 09:42:31.923484349 +0000 UTC m=+0.095956217 container health_status da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 23 09:42:31 np0005626463.localdomain podman[286393]: 2026-02-23 09:42:31.959290446 +0000 UTC m=+0.131762304 container exec_died da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 23 09:42:31 np0005626463.localdomain systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Deactivated successfully.
Feb 23 09:42:33 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:42:33.087 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:42:33 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:42:33.139 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:42:35 np0005626463.localdomain sudo[286417]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 09:42:35 np0005626463.localdomain sudo[286417]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:42:35 np0005626463.localdomain sudo[286417]: pam_unix(sudo:session): session closed for user root
Feb 23 09:42:36 np0005626463.localdomain sshd[286435]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:42:36 np0005626463.localdomain sudo[286437]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 09:42:36 np0005626463.localdomain sudo[286437]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:42:36 np0005626463.localdomain sudo[286437]: pam_unix(sudo:session): session closed for user root
Feb 23 09:42:37 np0005626463.localdomain sshd[286435]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:42:38 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:42:38.128 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:42:38 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:42:38.140 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:42:38 np0005626463.localdomain sudo[286455]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 09:42:38 np0005626463.localdomain sudo[286455]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:42:38 np0005626463.localdomain sudo[286455]: pam_unix(sudo:session): session closed for user root
Feb 23 09:42:39 np0005626463.localdomain sudo[286473]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 09:42:39 np0005626463.localdomain sudo[286473]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:42:39 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.
Feb 23 09:42:39 np0005626463.localdomain sudo[286473]: pam_unix(sudo:session): session closed for user root
Feb 23 09:42:39 np0005626463.localdomain podman[286491]: 2026-02-23 09:42:39.352442018 +0000 UTC m=+0.081263298 container health_status 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, managed_by=edpm_ansible, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, org.opencontainers.image.created=2026-02-05T04:57:10Z, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, 
io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9/ubi-minimal, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1770267347, config_id=openstack_network_exporter, maintainer=Red Hat, Inc., version=9.7, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.component=ubi9-minimal-container, build-date=2026-02-05T04:57:10Z)
Feb 23 09:42:39 np0005626463.localdomain podman[242954]: time="2026-02-23T09:42:39Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 23 09:42:39 np0005626463.localdomain podman[242954]: @ - - [23/Feb/2026:09:42:39 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 148535 "" "Go-http-client/1.1"
Feb 23 09:42:39 np0005626463.localdomain podman[286491]: 2026-02-23 09:42:39.447179166 +0000 UTC m=+0.176000436 container exec_died 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, build-date=2026-02-05T04:57:10Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vcs-type=git, io.openshift.tags=minimal rhel9, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, maintainer=Red Hat, Inc., org.opencontainers.image.created=2026-02-05T04:57:10Z, name=ubi9/ubi-minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.7, vendor=Red Hat, Inc., distribution-scope=public, release=1770267347, 
cpe=cpe:/a:redhat:enterprise_linux:9::appstream, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.openshift.expose-services=)
Feb 23 09:42:39 np0005626463.localdomain systemd[1]: 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.service: Deactivated successfully.
Feb 23 09:42:39 np0005626463.localdomain podman[242954]: @ - - [23/Feb/2026:09:42:39 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16804 "" "Go-http-client/1.1"
Feb 23 09:42:40 np0005626463.localdomain sudo[286511]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 09:42:40 np0005626463.localdomain sudo[286511]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:42:40 np0005626463.localdomain sudo[286511]: pam_unix(sudo:session): session closed for user root
Feb 23 09:42:42 np0005626463.localdomain sudo[286529]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 09:42:42 np0005626463.localdomain sudo[286529]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:42:42 np0005626463.localdomain sudo[286529]: pam_unix(sudo:session): session closed for user root
Feb 23 09:42:43 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:42:43.142 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:42:43 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:42:43.143 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:42:43 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:42:43.144 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 09:42:43 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:42:43.144 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:42:43 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:42:43.157 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:42:43 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:42:43.158 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:42:43 np0005626463.localdomain openstack_network_exporter[245358]: ERROR   09:42:43 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 23 09:42:43 np0005626463.localdomain openstack_network_exporter[245358]: 
Feb 23 09:42:43 np0005626463.localdomain openstack_network_exporter[245358]: ERROR   09:42:43 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 23 09:42:43 np0005626463.localdomain openstack_network_exporter[245358]: 
Feb 23 09:42:44 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.
Feb 23 09:42:44 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.
Feb 23 09:42:44 np0005626463.localdomain systemd[1]: tmp-crun.rmFztn.mount: Deactivated successfully.
Feb 23 09:42:44 np0005626463.localdomain podman[286548]: 2026-02-23 09:42:44.905634543 +0000 UTC m=+0.076230051 container health_status bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Feb 23 09:42:44 np0005626463.localdomain podman[286547]: 2026-02-23 09:42:44.878732583 +0000 UTC m=+0.056661690 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Feb 23 09:42:44 np0005626463.localdomain podman[286548]: 2026-02-23 09:42:44.940170881 +0000 UTC m=+0.110766369 container exec_died bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Feb 23 09:42:44 np0005626463.localdomain systemd[1]: bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.service: Deactivated successfully.
Feb 23 09:42:44 np0005626463.localdomain podman[286547]: 2026-02-23 09:42:44.962327293 +0000 UTC m=+0.140256410 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, managed_by=edpm_ansible)
Feb 23 09:42:44 np0005626463.localdomain systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully.
Feb 23 09:42:47 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.
Feb 23 09:42:47 np0005626463.localdomain sudo[286595]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 09:42:47 np0005626463.localdomain sudo[286595]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:42:47 np0005626463.localdomain sudo[286595]: pam_unix(sudo:session): session closed for user root
Feb 23 09:42:47 np0005626463.localdomain podman[286600]: 2026-02-23 09:42:47.923014906 +0000 UTC m=+0.093448629 container health_status be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, 
tcib_build_tag=8419493e1fd846703d277695e03fc5eb)
Feb 23 09:42:47 np0005626463.localdomain sudo[286624]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid f1fea371-cb69-578d-a3d0-b5c472a84b46
Feb 23 09:42:47 np0005626463.localdomain sudo[286624]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:42:47 np0005626463.localdomain podman[286600]: 2026-02-23 09:42:47.962274572 +0000 UTC m=+0.132708305 container exec_died be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.43.0)
Feb 23 09:42:47 np0005626463.localdomain systemd[1]: be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.service: Deactivated successfully.
Feb 23 09:42:48 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:42:48.159 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:42:48 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:42:48.161 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:42:48 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:42:48.161 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 09:42:48 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:42:48.161 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:42:48 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:42:48.194 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:42:48 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:42:48.195 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:42:48 np0005626463.localdomain podman[286692]: 
Feb 23 09:42:48 np0005626463.localdomain podman[286692]: 2026-02-23 09:42:48.51546771 +0000 UTC m=+0.072913577 container create 4ef5448866c00f05962e8baf2b844f36608497aac14600eb9a7a994d7ca525d7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elegant_spence, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, name=rhceph, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, io.buildah.version=1.42.2, com.redhat.component=rhceph-container, version=7, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, release=1770267347, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, ceph=True, architecture=x86_64, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2026-02-09T10:25:24Z, description=Red Hat Ceph Storage 7, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, io.openshift.expose-services=)
Feb 23 09:42:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:42:48.545 163572 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 09:42:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:42:48.545 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 09:42:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:42:48.546 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 09:42:48 np0005626463.localdomain systemd[1]: Started libpod-conmon-4ef5448866c00f05962e8baf2b844f36608497aac14600eb9a7a994d7ca525d7.scope.
Feb 23 09:42:48 np0005626463.localdomain podman[286692]: 2026-02-23 09:42:48.484994039 +0000 UTC m=+0.042439936 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 23 09:42:48 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 09:42:48 np0005626463.localdomain podman[286692]: 2026-02-23 09:42:48.601497836 +0000 UTC m=+0.158943703 container init 4ef5448866c00f05962e8baf2b844f36608497aac14600eb9a7a994d7ca525d7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elegant_spence, name=rhceph, ceph=True, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vendor=Red Hat, Inc., io.openshift.expose-services=, build-date=2026-02-09T10:25:24Z, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, io.buildah.version=1.42.2, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, RELEASE=main, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, version=7, architecture=x86_64, release=1770267347, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public)
Feb 23 09:42:48 np0005626463.localdomain podman[286692]: 2026-02-23 09:42:48.610460096 +0000 UTC m=+0.167905973 container start 4ef5448866c00f05962e8baf2b844f36608497aac14600eb9a7a994d7ca525d7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elegant_spence, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, version=7, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.expose-services=, build-date=2026-02-09T10:25:24Z, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, distribution-scope=public, io.buildah.version=1.42.2, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, architecture=x86_64, release=1770267347, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph)
Feb 23 09:42:48 np0005626463.localdomain podman[286692]: 2026-02-23 09:42:48.610734064 +0000 UTC m=+0.168179971 container attach 4ef5448866c00f05962e8baf2b844f36608497aac14600eb9a7a994d7ca525d7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elegant_spence, architecture=x86_64, ceph=True, version=7, build-date=2026-02-09T10:25:24Z, RELEASE=main, description=Red Hat Ceph Storage 7, name=rhceph, CEPH_POINT_RELEASE=, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, io.buildah.version=1.42.2, release=1770267347, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_BRANCH=main, vcs-type=git, GIT_CLEAN=True, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph)
Feb 23 09:42:48 np0005626463.localdomain elegant_spence[286707]: 167 167
Feb 23 09:42:48 np0005626463.localdomain systemd[1]: libpod-4ef5448866c00f05962e8baf2b844f36608497aac14600eb9a7a994d7ca525d7.scope: Deactivated successfully.
Feb 23 09:42:48 np0005626463.localdomain podman[286692]: 2026-02-23 09:42:48.615854484 +0000 UTC m=+0.173300351 container died 4ef5448866c00f05962e8baf2b844f36608497aac14600eb9a7a994d7ca525d7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elegant_spence, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2026-02-09T10:25:24Z, io.buildah.version=1.42.2, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.created=2026-02-09T10:25:24Z, distribution-scope=public, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, vcs-type=git, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, version=7, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1770267347, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True)
Feb 23 09:42:48 np0005626463.localdomain podman[286712]: 2026-02-23 09:42:48.707851616 +0000 UTC m=+0.080449642 container remove 4ef5448866c00f05962e8baf2b844f36608497aac14600eb9a7a994d7ca525d7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elegant_spence, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, io.buildah.version=1.42.2, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-02-09T10:25:24Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, distribution-scope=public, release=1770267347, io.openshift.expose-services=, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, io.openshift.tags=rhceph ceph, build-date=2026-02-09T10:25:24Z, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, ceph=True, com.redhat.component=rhceph-container, GIT_CLEAN=True, vcs-type=git, CEPH_POINT_RELEASE=)
Feb 23 09:42:48 np0005626463.localdomain systemd[1]: libpod-conmon-4ef5448866c00f05962e8baf2b844f36608497aac14600eb9a7a994d7ca525d7.scope: Deactivated successfully.
Feb 23 09:42:48 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 09:42:48 np0005626463.localdomain systemd-sysv-generator[286758]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 09:42:48 np0005626463.localdomain systemd-rc-local-generator[286753]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 09:42:48 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:42:48 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 23 09:42:48 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:42:48 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:42:48 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 09:42:48 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 23 09:42:48 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:42:48 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:42:48 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:42:49 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-90b778e891eff2f457563f8bd9e27a554e43b9b8578aabf5f635bd483d3c32b5-merged.mount: Deactivated successfully.
Feb 23 09:42:49 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 09:42:49 np0005626463.localdomain systemd-rc-local-generator[286794]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 09:42:49 np0005626463.localdomain systemd-sysv-generator[286799]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 09:42:49 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:42:49 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 23 09:42:49 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:42:49 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:42:49 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 09:42:49 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 23 09:42:49 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:42:49 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:42:49 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:42:49 np0005626463.localdomain systemd[1]: Starting Ceph mds.mds.np0005626463.qcthuc for f1fea371-cb69-578d-a3d0-b5c472a84b46...
Feb 23 09:42:49 np0005626463.localdomain podman[286858]: 
Feb 23 09:42:49 np0005626463.localdomain podman[286858]: 2026-02-23 09:42:49.814592385 +0000 UTC m=+0.063851764 container create 35c397f376b989389f5487b314924f02dd848f945c70656bc276f291652231c2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mds-mds-np0005626463-qcthuc, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, build-date=2026-02-09T10:25:24Z, io.buildah.version=1.42.2, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, GIT_BRANCH=main, RELEASE=main, release=1770267347, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, org.opencontainers.image.created=2026-02-09T10:25:24Z, name=rhceph, GIT_CLEAN=True, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git)
Feb 23 09:42:49 np0005626463.localdomain systemd[1]: tmp-crun.jiUAtI.mount: Deactivated successfully.
Feb 23 09:42:49 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d3db296b006fd64b095e1b104925bc0aecc9c34c3dc9a003a02759c089e28f2c/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Feb 23 09:42:49 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d3db296b006fd64b095e1b104925bc0aecc9c34c3dc9a003a02759c089e28f2c/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Feb 23 09:42:49 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d3db296b006fd64b095e1b104925bc0aecc9c34c3dc9a003a02759c089e28f2c/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Feb 23 09:42:49 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d3db296b006fd64b095e1b104925bc0aecc9c34c3dc9a003a02759c089e28f2c/merged/var/lib/ceph/mds/ceph-mds.np0005626463.qcthuc supports timestamps until 2038 (0x7fffffff)
Feb 23 09:42:49 np0005626463.localdomain podman[286858]: 2026-02-23 09:42:49.880009307 +0000 UTC m=+0.129268686 container init 35c397f376b989389f5487b314924f02dd848f945c70656bc276f291652231c2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mds-mds-np0005626463-qcthuc, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, RELEASE=main, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2026-02-09T10:25:24Z, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, vcs-type=git, vendor=Red Hat, Inc., GIT_BRANCH=main, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, name=rhceph, release=1770267347, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, build-date=2026-02-09T10:25:24Z, io.buildah.version=1.42.2, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers)
Feb 23 09:42:49 np0005626463.localdomain podman[286858]: 2026-02-23 09:42:49.783375731 +0000 UTC m=+0.032635160 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 23 09:42:49 np0005626463.localdomain podman[286858]: 2026-02-23 09:42:49.890474504 +0000 UTC m=+0.139733883 container start 35c397f376b989389f5487b314924f02dd848f945c70656bc276f291652231c2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mds-mds-np0005626463-qcthuc, io.buildah.version=1.42.2, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.tags=rhceph ceph, vcs-type=git, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.created=2026-02-09T10:25:24Z, release=1770267347, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, distribution-scope=public, vendor=Red Hat, Inc., architecture=x86_64, build-date=2026-02-09T10:25:24Z, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, io.openshift.expose-services=, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Feb 23 09:42:49 np0005626463.localdomain bash[286858]: 35c397f376b989389f5487b314924f02dd848f945c70656bc276f291652231c2
Feb 23 09:42:49 np0005626463.localdomain systemd[1]: Started Ceph mds.mds.np0005626463.qcthuc for f1fea371-cb69-578d-a3d0-b5c472a84b46.
Feb 23 09:42:49 np0005626463.localdomain sudo[286624]: pam_unix(sudo:session): session closed for user root
Feb 23 09:42:49 np0005626463.localdomain ceph-mds[286877]: set uid:gid to 167:167 (ceph:ceph)
Feb 23 09:42:49 np0005626463.localdomain ceph-mds[286877]: ceph version 18.2.1-381.el9cp (984f410e2a30899deb131725765b62212b1621db) reef (stable), process ceph-mds, pid 2
Feb 23 09:42:49 np0005626463.localdomain ceph-mds[286877]: main not setting numa affinity
Feb 23 09:42:49 np0005626463.localdomain ceph-mds[286877]: pidfile_write: ignore empty --pid-file
Feb 23 09:42:49 np0005626463.localdomain ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mds-mds-np0005626463-qcthuc[286873]: starting mds.mds.np0005626463.qcthuc at 
Feb 23 09:42:49 np0005626463.localdomain ceph-mds[286877]: mds.mds.np0005626463.qcthuc Updating MDS map to version 9 from mon.1
Feb 23 09:42:50 np0005626463.localdomain sudo[286896]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 09:42:50 np0005626463.localdomain sudo[286896]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:42:50 np0005626463.localdomain sudo[286896]: pam_unix(sudo:session): session closed for user root
Feb 23 09:42:50 np0005626463.localdomain sudo[286914]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 09:42:50 np0005626463.localdomain sudo[286914]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:42:50 np0005626463.localdomain sudo[286914]: pam_unix(sudo:session): session closed for user root
Feb 23 09:42:50 np0005626463.localdomain sudo[286932]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 check-host
Feb 23 09:42:50 np0005626463.localdomain sudo[286932]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:42:50 np0005626463.localdomain ceph-mds[286877]: mds.mds.np0005626463.qcthuc Updating MDS map to version 10 from mon.1
Feb 23 09:42:50 np0005626463.localdomain ceph-mds[286877]: mds.mds.np0005626463.qcthuc Monitors have assigned me to become a standby.
Feb 23 09:42:51 np0005626463.localdomain sudo[286932]: pam_unix(sudo:session): session closed for user root
Feb 23 09:42:51 np0005626463.localdomain sudo[286971]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 09:42:51 np0005626463.localdomain sudo[286971]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:42:51 np0005626463.localdomain sudo[286971]: pam_unix(sudo:session): session closed for user root
Feb 23 09:42:51 np0005626463.localdomain sudo[286989]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Feb 23 09:42:51 np0005626463.localdomain sudo[286989]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:42:52 np0005626463.localdomain podman[287077]: 2026-02-23 09:42:52.054568592 +0000 UTC m=+0.088890717 container exec fdf07215f0388d0ebc44f1f3744080ba594441e647c300d0dade62ff5beba234 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-crash-np0005626463, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, distribution-scope=public, description=Red Hat Ceph Storage 7, release=1770267347, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, vcs-type=git, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-02-09T10:25:24Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, ceph=True, io.buildah.version=1.42.2, GIT_CLEAN=True, architecture=x86_64, org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, name=rhceph, vendor=Red Hat, Inc.)
Feb 23 09:42:52 np0005626463.localdomain podman[287077]: 2026-02-23 09:42:52.162322585 +0000 UTC m=+0.196644740 container exec_died fdf07215f0388d0ebc44f1f3744080ba594441e647c300d0dade62ff5beba234 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-crash-np0005626463, CEPH_POINT_RELEASE=, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, release=1770267347, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, org.opencontainers.image.created=2026-02-09T10:25:24Z, vendor=Red Hat, Inc., org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, distribution-scope=public, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, name=rhceph, architecture=x86_64, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.42.2, build-date=2026-02-09T10:25:24Z, RELEASE=main, com.redhat.component=rhceph-container, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Feb 23 09:42:52 np0005626463.localdomain sudo[286989]: pam_unix(sudo:session): session closed for user root
Feb 23 09:42:52 np0005626463.localdomain sudo[287162]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 09:42:52 np0005626463.localdomain sudo[287162]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:42:52 np0005626463.localdomain sudo[287162]: pam_unix(sudo:session): session closed for user root
Feb 23 09:42:53 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:42:53.196 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:42:53 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:42:53.200 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:42:53 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:42:53.200 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5005 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 09:42:53 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:42:53.200 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:42:53 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:42:53.216 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:42:53 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:42:53.217 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:42:53 np0005626463.localdomain sudo[287180]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 09:42:53 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.
Feb 23 09:42:53 np0005626463.localdomain sudo[287180]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:42:53 np0005626463.localdomain sudo[287180]: pam_unix(sudo:session): session closed for user root
Feb 23 09:42:53 np0005626463.localdomain podman[287197]: 2026-02-23 09:42:53.585378568 +0000 UTC m=+0.092301572 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Feb 23 09:42:53 np0005626463.localdomain podman[287197]: 2026-02-23 09:42:53.615831979 +0000 UTC m=+0.122754973 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.43.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.build-date=20260216, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 23 09:42:53 np0005626463.localdomain systemd[1]: 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully.
Feb 23 09:42:54 np0005626463.localdomain sshd[285733]: Received disconnect from 38.102.83.114 port 32876:11: disconnected by user
Feb 23 09:42:54 np0005626463.localdomain sshd[285733]: Disconnected from user zuul 38.102.83.114 port 32876
Feb 23 09:42:54 np0005626463.localdomain sshd[285730]: pam_unix(sshd:session): session closed for user zuul
Feb 23 09:42:54 np0005626463.localdomain systemd[1]: session-61.scope: Deactivated successfully.
Feb 23 09:42:54 np0005626463.localdomain systemd-logind[759]: Session 61 logged out. Waiting for processes to exit.
Feb 23 09:42:54 np0005626463.localdomain systemd-logind[759]: Removed session 61.
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.135 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'name': 'test', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000003', 'OS-EXT-SRV-ATTR:host': 'np0005626463.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '37b8098efb0d4ecc90b451a2db0e966f', 'user_id': 'cb6895487918456aa599ca2f76872d00', 'hostId': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.136 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.142 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.bytes volume: 6808 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.144 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4aaf9f2e-74b2-4fb6-a1a5-a9a6d6f13a3e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6808, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:42:56.136299', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '081069de-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11216.325815073, 'message_signature': '5004d5749f8398215a45450249584f5b11d2cf334cfce85b07d22aa52eeeb41e'}]}, 'timestamp': '2026-02-23 09:42:56.143503', '_unique_id': '0a039fb2933c4daab25802401af3fada'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.144 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.144 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.144 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.144 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.144 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.144 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.144 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.144 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.144 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.144 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.144 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.144 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.144 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.144 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.144 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.144 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.144 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.144 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.144 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.144 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.144 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.144 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.144 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.144 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.144 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.144 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.144 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.144 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.144 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.144 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.144 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.146 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.146 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.178 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.requests volume: 1283 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.179 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.180 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c8713182-ca09-4b43-ac43-ff2572069154', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1283, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:42:56.146526', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '0815dd74-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11216.336008021, 'message_signature': 'f8da7f2f74754f91027cbe6d8d9a4b7d11b4755f261fad52ba81689d46fa29af'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:42:56.146526', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '0815eee0-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11216.336008021, 'message_signature': '9e762d2506229680f6babf1359f5a19f5e41fd7db9e2f039fa245e68d359f464'}]}, 'timestamp': '2026-02-23 09:42:56.179582', '_unique_id': 'a0b0444ba2a74038a8c55bd33ecd1194'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.180 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.180 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.180 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.180 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.180 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.180 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.180 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.180 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.180 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.180 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.180 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.180 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.180 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.180 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.180 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.180 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.180 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.180 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.180 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.180 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.180 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.180 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.180 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.180 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.180 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.180 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.180 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.180 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.180 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.180 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.180 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.181 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.181 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.182 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.183 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8e606598-91e4-4330-b974-a327d77ee553', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:42:56.182078', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '08166104-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11216.325815073, 'message_signature': '22c44ac3b6758376916dee428e35ffb14e91fcbf84e3b23578b550e9ccd625d4'}]}, 'timestamp': '2026-02-23 09:42:56.182531', '_unique_id': '905ca80190324939a3a2f41f350c54bb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.183 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.183 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.183 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.183 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.183 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.183 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.183 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.183 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.183 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.183 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.183 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.183 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.183 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.183 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.183 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.183 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.183 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.183 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.183 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.183 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.183 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.183 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.183 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.183 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.183 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.183 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.183 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.183 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.183 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.183 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.183 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.184 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.184 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.186 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '996d5aae-ff70-4878-88bf-b81e7f64d9d9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:42:56.184640', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '0816c784-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11216.325815073, 'message_signature': '85ec92f7e1ac08582c6603c4165e9c0a97fbc4c6dccb2082acc0146d109248f7'}]}, 'timestamp': '2026-02-23 09:42:56.185159', '_unique_id': 'dded05689609423497f47511a26a6736'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.186 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.186 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.186 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.186 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.186 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.186 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.186 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.186 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.186 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.186 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.186 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.186 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.186 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.186 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.186 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.186 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.186 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.186 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.186 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.186 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.186 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.186 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.186 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.186 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.186 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.186 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.186 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.186 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.186 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.186 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.186 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.187 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.207 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/cpu volume: 9700000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.208 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1d4e83ee-ca4b-4695-8951-0634518fe67c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 9700000000, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'timestamp': '2026-02-23T09:42:56.187252', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '081a3eaa-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11216.396759027, 'message_signature': '84c1b1f10e26d9c2d7f324d48d3955e3f5d92dba676e3c76fdbd8f59efbee1fa'}]}, 'timestamp': '2026-02-23 09:42:56.207857', '_unique_id': 'c8e28c1c2e8b4ccf95ea6ab5e1aa2706'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.208 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.208 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.208 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.208 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.208 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.208 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.208 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.208 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.208 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.208 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.208 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.208 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.208 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.208 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.208 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.208 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.208 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.208 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.208 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.208 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.208 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.208 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.208 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.208 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.208 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.208 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.208 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.208 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.208 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.208 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.208 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.209 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.221 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.222 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.223 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '24981149-fc73-432d-b042-6bb0a0d3e4ab', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:42:56.210026', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '081c6716-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11216.399504873, 'message_signature': '1818b50183424b042bbc0399f0698642d97c77c12f37ce3824cbe391e85e3729'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:42:56.210026', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '081c795e-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11216.399504873, 'message_signature': '8c96c267f3cfa2fbb40bc788cce16a437a1eedbf42b8de22cab9efdf63493183'}]}, 'timestamp': '2026-02-23 09:42:56.222446', '_unique_id': '0ce8cdaeacb047fda1e9f250543a2f76'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.223 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.223 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.223 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.223 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.223 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.223 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.223 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.223 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.223 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.223 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.223 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.223 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.223 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.223 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.223 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.223 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.223 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.223 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.223 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.223 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.223 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.223 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.223 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.223 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.223 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.223 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.223 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.223 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.223 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.223 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.223 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.224 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.224 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.226 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '45f95f31-45cf-45d1-82be-68773a8b9b6b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:42:56.224911', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '081cebdc-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11216.325815073, 'message_signature': '3df5b2a51dee9eac3b75e48feddb3ccc1d82570e14b93a9c026a60ed9b14a868'}]}, 'timestamp': '2026-02-23 09:42:56.225411', '_unique_id': 'eab912c8af8948e498505754776f9906'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.226 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.226 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.226 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.226 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.226 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.226 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.226 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.226 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.226 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.226 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.226 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.226 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.226 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.226 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.226 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.226 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.226 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.226 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.226 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.226 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.226 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.226 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.226 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.226 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.226 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.226 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.226 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.226 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.226 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.226 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.226 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.227 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.227 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.227 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.228 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.229 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b17ad395-4a94-4f80-9692-174ef6dd8139', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:42:56.227643', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '081d5662-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11216.336008021, 'message_signature': '649054c31e892406dcd07576fd8662fb6735d82ab8e8f58df3db825d22a333dd'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:42:56.227643', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '081d66a2-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11216.336008021, 'message_signature': 'fa9b248c7baaf607629b0d15a9ffafd526ad18c313951f52e1f12efb2817b370'}]}, 'timestamp': '2026-02-23 09:42:56.228517', '_unique_id': 'd33283c0969f46b198a466d95849daae'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.229 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.229 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.229 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.229 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.229 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.229 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.229 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.229 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.229 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.229 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.229 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.229 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.229 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.229 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.229 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.229 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.229 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.229 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.229 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.229 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.229 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.229 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.229 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.229 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.229 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.229 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.229 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.229 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.229 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.229 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.229 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.230 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.230 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/memory.usage volume: 51.72265625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.232 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dfdc84b7-e5fb-42e3-a010-403818eaf69f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.72265625, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'timestamp': '2026-02-23T09:42:56.230659', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '081dcc0a-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11216.396759027, 'message_signature': '75243a3bb81d2850059da707eea61dd7406b03c34ac9a7b760f00c41027de211'}]}, 'timestamp': '2026-02-23 09:42:56.231128', '_unique_id': '5eccb86279004ea6b41fe1d4ebd3d321'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.232 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.232 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.232 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.232 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.232 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.232 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.232 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.232 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.232 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.232 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.232 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.232 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.232 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.232 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.232 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.232 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.232 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.232 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.232 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.232 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.232 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.232 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.232 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.232 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.232 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.232 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.232 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.232 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.232 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.232 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.232 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.233 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.233 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.latency volume: 1054797520 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.233 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.latency volume: 21338362 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.235 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6564a07a-c4ee-418d-9fec-7a2da5189e32', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1054797520, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:42:56.233188', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '081e2fd8-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11216.336008021, 'message_signature': 'a409e70945339c934059cc9b4e4ab685a03e4d2d4ced2913f07c1e1a8d1a17a6'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 21338362, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 
'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:42:56.233188', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '081e40f4-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11216.336008021, 'message_signature': '1aa00b07a24c31ea886ebca43bfed9976d7a5e58f3185f6d0a348c3f514910ce'}]}, 'timestamp': '2026-02-23 09:42:56.234110', '_unique_id': 'bef1a1d958944020aee2777643b28a81'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.235 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.235 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.235 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.235 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.235 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.235 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.235 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.235 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.235 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.235 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.235 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.235 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.235 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.235 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.235 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.235 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.235 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.235 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.235 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.235 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.235 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.235 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.235 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.235 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.235 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.235 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.235 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.235 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.235 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.235 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.235 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.236 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.236 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.latency volume: 1374424344 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.236 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.latency volume: 89322858 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.238 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3105c122-33fb-4c3c-b6fa-45a76a662d12', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1374424344, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:42:56.236223', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '081ea3dc-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11216.336008021, 'message_signature': 'f16c11c1bcf09df6d13569d36d5cfeef06740c708a2348b39be2f547c186524a'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 89322858, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 
'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:42:56.236223', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '081eb39a-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11216.336008021, 'message_signature': 'e47025324de5ea8066e8ed106bcb86526eba788bacc773e7bef958594f9b63db'}]}, 'timestamp': '2026-02-23 09:42:56.237076', '_unique_id': 'f852bd1bef3340b8aa8c0fde8d3e99d5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.238 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.238 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.238 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.238 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.238 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.238 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.238 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.238 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.238 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.238 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.238 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.238 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.238 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.238 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.238 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.238 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.238 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.238 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.238 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.238 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.238 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.238 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.238 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.238 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.238 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.238 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.238 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.238 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.238 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.238 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.238 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.239 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.239 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.240 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '35b31cfd-049d-48c6-a268-e95fcf7f7366', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:42:56.239207', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '081f18a8-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11216.325815073, 'message_signature': '94b95e523751144288d6dfb24687e792d154330471d92ff1c81989d6dbaea508'}]}, 'timestamp': '2026-02-23 09:42:56.239656', '_unique_id': '9d8da660cbd74c5dbc2763325accba47'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.240 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.240 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.240 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.240 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.240 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.240 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.240 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.240 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.240 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.240 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.240 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.240 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.240 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.240 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.240 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.240 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.240 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.240 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.240 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.240 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.240 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.240 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.240 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.240 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.240 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.240 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.240 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.240 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.240 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.240 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.240 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.241 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.241 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.243 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2974971b-ae44-4118-bd0e-a4fb2cb025c8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:42:56.241744', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '081f7d16-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11216.325815073, 'message_signature': '837af4ea1562a7e1fbb3ce76ec9a0c79a6556e9a8847261605f6c3cd7cf097c4'}]}, 'timestamp': '2026-02-23 09:42:56.242225', '_unique_id': 'e4998c743ce84d5bb1b22153e6ea1ea2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.243 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.243 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.243 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.243 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.243 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.243 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.243 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.243 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.243 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.243 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.243 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.243 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.243 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.243 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.243 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.243 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.243 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.243 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.243 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.243 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.243 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.243 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.243 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.243 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.243 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.243 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.243 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.243 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.243 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.243 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.243 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.244 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.244 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.244 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.bytes volume: 35597312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.244 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.246 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '21d9aa53-0676-4c64-82e6-11d4b14804c1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35597312, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:42:56.244410', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '081fe36e-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11216.336008021, 'message_signature': '6de8348f7f9f15e59e7f8127929fddf2e29c6b0c821a7a2877c02664f5ccb9a7'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:42:56.244410', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '081ff53e-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11216.336008021, 'message_signature': 'd8cbab1a69c8b5e42d30545f7fc0c524eaa12f3015be398a30cae9958de8361b'}]}, 'timestamp': '2026-02-23 09:42:56.245276', '_unique_id': '364720da7ce44f13a2269ca3ae454997'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.246 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.246 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.246 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.246 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.246 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.246 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.246 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.246 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.246 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.246 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.246 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.246 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.246 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.246 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.246 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.246 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.246 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.246 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.246 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.246 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.246 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.246 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.246 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.246 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.246 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.246 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.246 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.246 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.246 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.246 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.246 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.247 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.247 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.248 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c1dc19f0-7ef1-4064-a7a2-2a60445bab81', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:42:56.247399', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '0820589e-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11216.325815073, 'message_signature': '5068d327d66850cf633d7ba5f87df6c40721509e17397978e662343c06074ede'}]}, 'timestamp': '2026-02-23 09:42:56.247845', '_unique_id': '46178c2f09334a138c88234d42441874'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.248 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.248 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.248 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.248 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.248 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.248 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.248 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.248 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.248 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.248 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.248 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.248 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.248 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.248 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.248 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.248 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.248 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.248 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.248 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.248 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.248 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.248 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.248 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.248 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.248 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.248 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.248 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.248 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.248 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.248 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.248 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.249 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.250 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.251 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5e2e6ae7-fae5-4daa-a07b-436c31a6c457', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:42:56.249960', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '0820bca8-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11216.325815073, 'message_signature': '0a1cd6259351da43a6defe76bbc4ae0042b87ce4d7d4ab81a7871884b1f22d34'}]}, 'timestamp': '2026-02-23 09:42:56.250407', '_unique_id': 'bc1da6c52bdf4acc83506dd3023e049a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.251 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.251 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.251 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.251 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.251 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.251 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.251 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.251 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.251 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.251 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.251 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.251 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.251 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.251 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.251 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.251 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.251 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.251 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.251 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.251 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.251 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.251 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.251 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.251 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.251 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.251 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.251 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.251 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.251 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.251 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.251 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.252 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.252 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.252 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.254 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6e0c9629-60e6-42e1-9753-4e3159bb4172', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:42:56.252455', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '08211dd8-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11216.399504873, 'message_signature': '241d8b71d35dbb5dd35066193dc97e815cb86f5bab3686700e70602d9cfa42bf'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:42:56.252455', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '08212f44-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11216.399504873, 'message_signature': '515cf0fc49c964b1f9bd4a4df54ade23cbecec062ac58368d233917a1a985adf'}]}, 'timestamp': '2026-02-23 09:42:56.253317', '_unique_id': '7a86fbc30d444bdd8f2184c4e71ab51d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.254 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.254 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.254 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.254 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.254 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.254 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.254 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.254 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.254 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.254 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.254 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.254 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.254 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.254 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.254 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.254 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.254 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.254 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.254 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.254 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.254 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.254 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.254 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.254 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.254 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.254 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.254 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.254 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.254 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.254 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.254 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.255 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.255 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.256 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'db47753a-1dbb-4f2c-8a4e-384cc5f22849', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:42:56.255443', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '082192c2-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11216.325815073, 'message_signature': 'aba981380a9062cb80cf93f22ce63c5582bab10078967c8fb4bba8ef89405724'}]}, 'timestamp': '2026-02-23 09:42:56.255924', '_unique_id': '3eeabad324444401bfbb2b2d54a83a51'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.256 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.256 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.256 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.256 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.256 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.256 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.256 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.256 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.256 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.256 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.256 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.256 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.256 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.256 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.256 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.256 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.256 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.256 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.256 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.256 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.256 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.256 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.256 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.256 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.256 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.256 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.256 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.256 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.256 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.256 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.256 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.257 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.258 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.bytes volume: 397312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.258 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.259 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '972313fa-3958-4b5d-b986-8e6a5ea34419', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 397312, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:42:56.257995', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '0821f64a-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11216.336008021, 'message_signature': 'd8938349b17c793e71986b3336ca0d6923875617edf70b1dfa1f5d04b1f8f5b7'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:42:56.257995', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '0822059a-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11216.336008021, 'message_signature': '796da8ad8633975a2e8ab53b5d949220e18792f45f3caae462985880ca30a234'}]}, 'timestamp': '2026-02-23 09:42:56.258801', '_unique_id': 'b1ca91d681c84d3c8c3669086444c1ba'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.259 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.259 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.259 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.259 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.259 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.259 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.259 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.259 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.259 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.259 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.259 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.259 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.259 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.259 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.259 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.259 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.259 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.259 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.259 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.259 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.259 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.259 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.259 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.259 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.259 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.259 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.259 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.259 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.259 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.259 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.259 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.260 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.261 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.261 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.262 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '60b5a15a-eacb-4de9-b6b4-d260e310abed', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:42:56.260960', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '08226a26-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11216.399504873, 'message_signature': '5ac16555c8ca7d90c6f9f6d2aa9f5ff4aaffdc3c5b449a5ff05dfb901e1711df'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:42:56.260960', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '08227ac0-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11216.399504873, 'message_signature': 'bef25ac0dc028d86cd061b5992586371595f07f9ed684d6ffabafb59792f97aa'}]}, 'timestamp': '2026-02-23 09:42:56.261801', '_unique_id': '194ec8f5545149678ce65ba0f0f79b81'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.262 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.262 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.262 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.262 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.262 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.262 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.262 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.262 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.262 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.262 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.262 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.262 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.262 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.262 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.262 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.262 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.262 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.262 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.262 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.262 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.262 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.262 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.262 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.262 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.262 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.262 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.262 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.262 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.262 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.262 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.262 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.263 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.263 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.264 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f4dc75e9-4a64-4933-9f88-dfa4075b36e2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:42:56.263333', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '0822c3e0-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11216.325815073, 'message_signature': '4e178897efedb771b8143adb00d239491f8f95092d8534012de6f787848e9975'}]}, 'timestamp': '2026-02-23 09:42:56.263613', '_unique_id': '2a8c24e048fa4ff9ba82b798a2758c7c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.264 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.264 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.264 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.264 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.264 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.264 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.264 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.264 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.264 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.264 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.264 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.264 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.264 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.264 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.264 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.264 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.264 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.264 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.264 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.264 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.264 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.264 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.264 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.264 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.264 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.264 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.264 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.264 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.264 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.264 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:42:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.264 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:42:58 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:42:58.218 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:42:58 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:42:58.239 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:42:58 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:42:58.240 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5023 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 09:42:58 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:42:58.240 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:42:58 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:42:58.241 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:42:58 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:42:58.242 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:43:02 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.
Feb 23 09:43:02 np0005626463.localdomain podman[287217]: 2026-02-23 09:43:02.902290362 +0000 UTC m=+0.076806418 container health_status da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 23 09:43:02 np0005626463.localdomain podman[287217]: 2026-02-23 09:43:02.915263878 +0000 UTC m=+0.089779914 container exec_died da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 23 09:43:02 np0005626463.localdomain systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Deactivated successfully.
Feb 23 09:43:03 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:43:03.243 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:43:03 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:43:03.245 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:43:03 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:43:03.245 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 09:43:03 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:43:03.246 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:43:03 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:43:03.279 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:43:03 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:43:03.280 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:43:07 np0005626463.localdomain sshd[287238]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:43:08 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:43:08.280 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:43:08 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:43:08.282 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:43:08 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:43:08.282 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 09:43:08 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:43:08.283 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:43:08 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:43:08.314 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:43:08 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:43:08.315 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:43:08 np0005626463.localdomain sshd[287238]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:43:09 np0005626463.localdomain podman[242954]: time="2026-02-23T09:43:09Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 23 09:43:09 np0005626463.localdomain podman[242954]: @ - - [23/Feb/2026:09:43:09 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 150741 "" "Go-http-client/1.1"
Feb 23 09:43:09 np0005626463.localdomain podman[242954]: @ - - [23/Feb/2026:09:43:09 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17288 "" "Go-http-client/1.1"
Feb 23 09:43:09 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.
Feb 23 09:43:09 np0005626463.localdomain podman[287240]: 2026-02-23 09:43:09.905424699 +0000 UTC m=+0.078625246 container health_status 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, vcs-type=git, distribution-scope=public, maintainer=Red Hat, Inc., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-02-05T04:57:10Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., release=1770267347, config_id=openstack_network_exporter, container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.7, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, org.opencontainers.image.created=2026-02-05T04:57:10Z, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c)
Feb 23 09:43:09 np0005626463.localdomain podman[287240]: 2026-02-23 09:43:09.914508142 +0000 UTC m=+0.087708739 container exec_died 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, build-date=2026-02-05T04:57:10Z, com.redhat.component=ubi9-minimal-container, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vendor=Red Hat, Inc., architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9/ubi-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, managed_by=edpm_ansible, maintainer=Red Hat, Inc., io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, distribution-scope=public, org.opencontainers.image.created=2026-02-05T04:57:10Z, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1770267347, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=openstack_network_exporter, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 23 09:43:09 np0005626463.localdomain systemd[1]: 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.service: Deactivated successfully.
Feb 23 09:43:12 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:43:12.126 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:43:12 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:43:12.145 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:43:12 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:43:12.146 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 23 09:43:12 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:43:12.146 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 23 09:43:12 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:43:12.677 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 23 09:43:12 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:43:12.678 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquired lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 23 09:43:12 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:43:12.678 282211 DEBUG nova.network.neutron [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 23 09:43:12 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:43:12.678 282211 DEBUG nova.objects.instance [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c2a7d92b-952f-46a7-8a6a-3322a48fcf4b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 23 09:43:13 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:43:13.058 282211 DEBUG nova.network.neutron [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Updating instance_info_cache with network_info: [{"id": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "address": "fa:16:3e:a0:9d:00", "network": {"id": "9da5b53d-3184-450f-9a5b-bdba1a6c9f6d", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "37b8098efb0d4ecc90b451a2db0e966f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa27e5011-20", "ovs_interfaceid": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 23 09:43:13 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:43:13.079 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Releasing lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 23 09:43:13 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:43:13.079 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 23 09:43:13 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:43:13.080 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:43:13 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:43:13.081 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:43:13 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:43:13.081 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:43:13 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:43:13.081 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:43:13 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:43:13.082 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:43:13 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:43:13.082 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 23 09:43:13 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:43:13.083 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:43:13 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:43:13.109 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 09:43:13 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:43:13.110 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 09:43:13 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:43:13.110 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 09:43:13 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:43:13.111 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Auditing locally available compute resources for np0005626463.localdomain (node: np0005626463.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 23 09:43:13 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:43:13.111 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 09:43:13 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:43:13.315 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:43:13 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:43:13.318 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:43:13 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:43:13.318 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 09:43:13 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:43:13.318 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:43:13 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:43:13.337 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:43:13 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:43:13.338 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:43:13 np0005626463.localdomain openstack_network_exporter[245358]: ERROR   09:43:13 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 23 09:43:13 np0005626463.localdomain openstack_network_exporter[245358]: 
Feb 23 09:43:13 np0005626463.localdomain openstack_network_exporter[245358]: ERROR   09:43:13 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 23 09:43:13 np0005626463.localdomain openstack_network_exporter[245358]: 
Feb 23 09:43:13 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:43:13.571 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.460s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 09:43:13 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:43:13.632 282211 DEBUG nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 23 09:43:13 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:43:13.632 282211 DEBUG nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 23 09:43:13 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:43:13.857 282211 WARNING nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 23 09:43:13 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:43:13.859 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Hypervisor/Node resource view: name=np0005626463.localdomain free_ram=12305MB free_disk=41.8366584777832GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 23 09:43:13 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:43:13.859 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 09:43:13 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:43:13.860 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 09:43:13 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:43:13.931 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Instance c2a7d92b-952f-46a7-8a6a-3322a48fcf4b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 23 09:43:13 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:43:13.932 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 23 09:43:13 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:43:13.932 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Final resource view: name=np0005626463.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 23 09:43:13 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:43:13.968 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 09:43:14 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:43:14.422 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.454s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 09:43:14 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:43:14.429 282211 DEBUG nova.compute.provider_tree [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Inventory has not changed in ProviderTree for provider: be63d86c-a403-4ec9-a515-07ea2962cb4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 23 09:43:14 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:43:14.445 282211 DEBUG nova.scheduler.client.report [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Inventory has not changed for provider be63d86c-a403-4ec9-a515-07ea2962cb4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 23 09:43:14 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:43:14.449 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Compute_service record updated for np0005626463.localdomain:np0005626463.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 23 09:43:14 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:43:14.450 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.590s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 09:43:15 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.
Feb 23 09:43:15 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.
Feb 23 09:43:15 np0005626463.localdomain podman[287304]: 2026-02-23 09:43:15.969647886 +0000 UTC m=+0.144920636 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, io.buildah.version=1.43.0, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Feb 23 09:43:15 np0005626463.localdomain podman[287304]: 2026-02-23 09:43:15.999357713 +0000 UTC m=+0.174630493 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.43.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible)
Feb 23 09:43:16 np0005626463.localdomain systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully.
Feb 23 09:43:16 np0005626463.localdomain podman[287305]: 2026-02-23 09:43:15.950692934 +0000 UTC m=+0.121638019 container health_status bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Feb 23 09:43:16 np0005626463.localdomain podman[287305]: 2026-02-23 09:43:16.083258332 +0000 UTC m=+0.254203467 container exec_died bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Feb 23 09:43:16 np0005626463.localdomain systemd[1]: bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.service: Deactivated successfully.
Feb 23 09:43:16 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:43:16.375 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:43:16 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:43:16.376 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:43:16 np0005626463.localdomain sudo[287352]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 09:43:16 np0005626463.localdomain sudo[287352]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:43:16 np0005626463.localdomain sudo[287352]: pam_unix(sudo:session): session closed for user root
Feb 23 09:43:16 np0005626463.localdomain sudo[287370]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 23 09:43:16 np0005626463.localdomain sudo[287370]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:43:17 np0005626463.localdomain sudo[287370]: pam_unix(sudo:session): session closed for user root
Feb 23 09:43:18 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:43:18.339 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:43:18 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:43:18.360 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:43:18 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:43:18.361 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5022 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 09:43:18 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:43:18.361 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:43:18 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:43:18.362 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:43:18 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:43:18.363 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:43:18 np0005626463.localdomain sshd[287420]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:43:18 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.
Feb 23 09:43:18 np0005626463.localdomain systemd[1]: tmp-crun.RHNRpw.mount: Deactivated successfully.
Feb 23 09:43:18 np0005626463.localdomain podman[287422]: 2026-02-23 09:43:18.913813653 +0000 UTC m=+0.087387809 container health_status be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.build-date=20260216, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, io.buildah.version=1.43.0, 
maintainer=OpenStack Kubernetes Operator team)
Feb 23 09:43:18 np0005626463.localdomain podman[287422]: 2026-02-23 09:43:18.929292777 +0000 UTC m=+0.102866923 container exec_died be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, config_id=ceilometer_agent_compute, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, io.buildah.version=1.43.0, tcib_managed=true, 
org.label-schema.schema-version=1.0)
Feb 23 09:43:18 np0005626463.localdomain sshd[287420]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:43:18 np0005626463.localdomain systemd[1]: be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.service: Deactivated successfully.
Feb 23 09:43:22 np0005626463.localdomain sudo[287441]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 09:43:22 np0005626463.localdomain sudo[287441]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:43:22 np0005626463.localdomain sudo[287441]: pam_unix(sudo:session): session closed for user root
Feb 23 09:43:23 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:43:23.363 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:43:23 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:43:23.388 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:43:23 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:43:23.388 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5025 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 09:43:23 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:43:23.389 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:43:23 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:43:23.390 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:43:23 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:43:23.395 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:43:23 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.
Feb 23 09:43:23 np0005626463.localdomain podman[287459]: 2026-02-23 09:43:23.908136901 +0000 UTC m=+0.081378271 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 23 09:43:23 np0005626463.localdomain podman[287459]: 2026-02-23 09:43:23.943195136 +0000 UTC m=+0.116436466 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.43.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS)
Feb 23 09:43:23 np0005626463.localdomain systemd[1]: 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully.
Feb 23 09:43:26 np0005626463.localdomain ceph-mds[286877]: mds.mds.np0005626463.qcthuc Updating MDS map to version 13 from mon.1
Feb 23 09:43:26 np0005626463.localdomain ceph-mds[286877]: mds.0.13 handle_mds_map i am now mds.0.13
Feb 23 09:43:26 np0005626463.localdomain ceph-mds[286877]: mds.0.13 handle_mds_map state change up:standby --> up:replay
Feb 23 09:43:26 np0005626463.localdomain ceph-mds[286877]: mds.0.13 replay_start
Feb 23 09:43:26 np0005626463.localdomain ceph-mds[286877]: mds.0.13  waiting for osdmap 79 (which blocklists prior instance)
Feb 23 09:43:26 np0005626463.localdomain ceph-mds[286877]: mds.0.cache creating system inode with ino:0x100
Feb 23 09:43:26 np0005626463.localdomain ceph-mds[286877]: mds.0.cache creating system inode with ino:0x1
Feb 23 09:43:26 np0005626463.localdomain ceph-mds[286877]: mds.0.13 Finished replaying journal
Feb 23 09:43:26 np0005626463.localdomain ceph-mds[286877]: mds.0.13 making mds journal writeable
Feb 23 09:43:27 np0005626463.localdomain ceph-mds[286877]: mds.mds.np0005626463.qcthuc Updating MDS map to version 14 from mon.1
Feb 23 09:43:27 np0005626463.localdomain ceph-mds[286877]: mds.0.13 handle_mds_map i am now mds.0.13
Feb 23 09:43:27 np0005626463.localdomain ceph-mds[286877]: mds.0.13 handle_mds_map state change up:replay --> up:reconnect
Feb 23 09:43:27 np0005626463.localdomain ceph-mds[286877]: mds.0.13 reconnect_start
Feb 23 09:43:27 np0005626463.localdomain ceph-mds[286877]: mds.0.13 reopen_log
Feb 23 09:43:27 np0005626463.localdomain ceph-mds[286877]: mds.0.13 reconnect_done
Feb 23 09:43:28 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:43:28.394 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:43:28 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:43:28.395 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:43:28 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:43:28.395 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 09:43:28 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:43:28.395 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:43:28 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:43:28.410 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:43:28 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:43:28.410 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:43:28 np0005626463.localdomain sshd[286095]: Received disconnect from 192.168.122.11 port 48530:11: disconnected by user
Feb 23 09:43:28 np0005626463.localdomain sshd[286095]: Disconnected from user tripleo-admin 192.168.122.11 port 48530
Feb 23 09:43:28 np0005626463.localdomain sshd[286074]: pam_unix(sshd:session): session closed for user tripleo-admin
Feb 23 09:43:28 np0005626463.localdomain systemd[1]: session-62.scope: Deactivated successfully.
Feb 23 09:43:28 np0005626463.localdomain systemd[1]: session-62.scope: Consumed 1.283s CPU time.
Feb 23 09:43:28 np0005626463.localdomain systemd-logind[759]: Session 62 logged out. Waiting for processes to exit.
Feb 23 09:43:28 np0005626463.localdomain systemd-logind[759]: Removed session 62.
Feb 23 09:43:28 np0005626463.localdomain ceph-mds[286877]: mds.mds.np0005626463.qcthuc Updating MDS map to version 15 from mon.1
Feb 23 09:43:28 np0005626463.localdomain ceph-mds[286877]: mds.0.13 handle_mds_map i am now mds.0.13
Feb 23 09:43:28 np0005626463.localdomain ceph-mds[286877]: mds.0.13 handle_mds_map state change up:reconnect --> up:rejoin
Feb 23 09:43:28 np0005626463.localdomain ceph-mds[286877]: mds.0.13 rejoin_start
Feb 23 09:43:28 np0005626463.localdomain ceph-mds[286877]: mds.0.13 rejoin_joint_start
Feb 23 09:43:28 np0005626463.localdomain ceph-mds[286877]: mds.0.13 rejoin_done
Feb 23 09:43:29 np0005626463.localdomain ceph-mds[286877]: mds.mds.np0005626463.qcthuc Updating MDS map to version 16 from mon.1
Feb 23 09:43:29 np0005626463.localdomain ceph-mds[286877]: mds.0.13 handle_mds_map i am now mds.0.13
Feb 23 09:43:29 np0005626463.localdomain ceph-mds[286877]: mds.0.13 handle_mds_map state change up:rejoin --> up:active
Feb 23 09:43:29 np0005626463.localdomain ceph-mds[286877]: mds.0.13 recovery_done -- successful recovery!
Feb 23 09:43:29 np0005626463.localdomain ceph-mds[286877]: mds.0.13 active_start
Feb 23 09:43:29 np0005626463.localdomain ceph-mds[286877]: mds.0.13 cluster recovered.
Feb 23 09:43:31 np0005626463.localdomain ceph-mds[286877]: mds.pinger is_rank_lagging: rank=0 was never sent ping request.
Feb 23 09:43:31 np0005626463.localdomain ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mds-mds-np0005626463-qcthuc[286873]: 2026-02-23T09:43:30.999+0000 7f2bb035b640 -1 mds.pinger is_rank_lagging: rank=0 was never sent ping request.
Feb 23 09:43:33 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:43:33.411 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:43:33 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:43:33.413 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:43:33 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:43:33.413 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 09:43:33 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:43:33.413 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:43:33 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:43:33.448 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:43:33 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:43:33.449 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:43:33 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.
Feb 23 09:43:33 np0005626463.localdomain podman[287491]: 2026-02-23 09:43:33.895324512 +0000 UTC m=+0.070294336 container health_status da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 23 09:43:33 np0005626463.localdomain podman[287491]: 2026-02-23 09:43:33.904346094 +0000 UTC m=+0.079315988 container exec_died da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 23 09:43:33 np0005626463.localdomain systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Deactivated successfully.
Feb 23 09:43:38 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:43:38.449 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:43:38 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:43:38.451 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:43:38 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:43:38.451 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 09:43:38 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:43:38.451 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:43:38 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:43:38.487 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:43:38 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:43:38.487 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:43:38 np0005626463.localdomain systemd[1]: Stopping User Manager for UID 1003...
Feb 23 09:43:38 np0005626463.localdomain systemd[286078]: Activating special unit Exit the Session...
Feb 23 09:43:38 np0005626463.localdomain systemd[286078]: Stopped target Main User Target.
Feb 23 09:43:38 np0005626463.localdomain systemd[286078]: Stopped target Basic System.
Feb 23 09:43:38 np0005626463.localdomain systemd[286078]: Stopped target Paths.
Feb 23 09:43:38 np0005626463.localdomain systemd[286078]: Stopped target Sockets.
Feb 23 09:43:38 np0005626463.localdomain systemd[286078]: Stopped target Timers.
Feb 23 09:43:38 np0005626463.localdomain systemd[286078]: Stopped Mark boot as successful after the user session has run 2 minutes.
Feb 23 09:43:38 np0005626463.localdomain systemd[286078]: Stopped Daily Cleanup of User's Temporary Directories.
Feb 23 09:43:38 np0005626463.localdomain systemd[286078]: Closed D-Bus User Message Bus Socket.
Feb 23 09:43:38 np0005626463.localdomain systemd[286078]: Stopped Create User's Volatile Files and Directories.
Feb 23 09:43:38 np0005626463.localdomain systemd[286078]: Removed slice User Application Slice.
Feb 23 09:43:38 np0005626463.localdomain systemd[286078]: Reached target Shutdown.
Feb 23 09:43:38 np0005626463.localdomain systemd[286078]: Finished Exit the Session.
Feb 23 09:43:38 np0005626463.localdomain systemd[286078]: Reached target Exit the Session.
Feb 23 09:43:38 np0005626463.localdomain systemd[1]: user@1003.service: Deactivated successfully.
Feb 23 09:43:38 np0005626463.localdomain systemd[1]: Stopped User Manager for UID 1003.
Feb 23 09:43:38 np0005626463.localdomain systemd[1]: Stopping User Runtime Directory /run/user/1003...
Feb 23 09:43:38 np0005626463.localdomain systemd[1]: run-user-1003.mount: Deactivated successfully.
Feb 23 09:43:38 np0005626463.localdomain systemd[1]: user-runtime-dir@1003.service: Deactivated successfully.
Feb 23 09:43:38 np0005626463.localdomain systemd[1]: Stopped User Runtime Directory /run/user/1003.
Feb 23 09:43:38 np0005626463.localdomain systemd[1]: Removed slice User Slice of UID 1003.
Feb 23 09:43:38 np0005626463.localdomain systemd[1]: user-1003.slice: Consumed 1.633s CPU time.
Feb 23 09:43:39 np0005626463.localdomain podman[242954]: time="2026-02-23T09:43:39Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 23 09:43:39 np0005626463.localdomain podman[242954]: @ - - [23/Feb/2026:09:43:39 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 150741 "" "Go-http-client/1.1"
Feb 23 09:43:39 np0005626463.localdomain podman[242954]: @ - - [23/Feb/2026:09:43:39 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17295 "" "Go-http-client/1.1"
Feb 23 09:43:40 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.
Feb 23 09:43:40 np0005626463.localdomain systemd[1]: tmp-crun.GxErOw.mount: Deactivated successfully.
Feb 23 09:43:40 np0005626463.localdomain podman[287516]: 2026-02-23 09:43:40.907092437 +0000 UTC m=+0.083046083 container health_status 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, vcs-type=git, vendor=Red Hat, Inc., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1770267347, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=9.7, build-date=2026-02-05T04:57:10Z, name=ubi9/ubi-minimal, maintainer=Red Hat, Inc., org.opencontainers.image.created=2026-02-05T04:57:10Z, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Feb 23 09:43:40 np0005626463.localdomain podman[287516]: 2026-02-23 09:43:40.94434157 +0000 UTC m=+0.120295186 container exec_died 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, org.opencontainers.image.created=2026-02-05T04:57:10Z, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, version=9.7, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9/ubi-minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, release=1770267347, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, build-date=2026-02-05T04:57:10Z, config_id=openstack_network_exporter)
Feb 23 09:43:40 np0005626463.localdomain systemd[1]: 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.service: Deactivated successfully.
Feb 23 09:43:42 np0005626463.localdomain sudo[287535]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 09:43:42 np0005626463.localdomain sudo[287535]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:43:42 np0005626463.localdomain sudo[287535]: pam_unix(sudo:session): session closed for user root
Feb 23 09:43:43 np0005626463.localdomain openstack_network_exporter[245358]: ERROR   09:43:43 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 23 09:43:43 np0005626463.localdomain openstack_network_exporter[245358]: 
Feb 23 09:43:43 np0005626463.localdomain openstack_network_exporter[245358]: ERROR   09:43:43 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 23 09:43:43 np0005626463.localdomain openstack_network_exporter[245358]: 
Feb 23 09:43:43 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:43:43.488 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:43:43 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:43:43.489 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:43:43 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:43:43.489 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 09:43:43 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:43:43.489 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:43:43 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:43:43.490 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:43:43 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:43:43.491 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:43:44 np0005626463.localdomain sudo[287553]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 09:43:44 np0005626463.localdomain sudo[287553]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:43:44 np0005626463.localdomain sudo[287553]: pam_unix(sudo:session): session closed for user root
Feb 23 09:43:44 np0005626463.localdomain sshd[287571]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:43:45 np0005626463.localdomain sudo[287572]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 09:43:45 np0005626463.localdomain sudo[287572]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:43:45 np0005626463.localdomain sudo[287572]: pam_unix(sudo:session): session closed for user root
Feb 23 09:43:45 np0005626463.localdomain sshd[287571]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:43:46 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.
Feb 23 09:43:46 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.
Feb 23 09:43:46 np0005626463.localdomain podman[287591]: 2026-02-23 09:43:46.909590767 +0000 UTC m=+0.084654794 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Feb 23 09:43:46 np0005626463.localdomain podman[287591]: 2026-02-23 09:43:46.944273639 +0000 UTC m=+0.119337686 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.43.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Feb 23 09:43:46 np0005626463.localdomain systemd[1]: tmp-crun.LGTarI.mount: Deactivated successfully.
Feb 23 09:43:46 np0005626463.localdomain podman[287592]: 2026-02-23 09:43:46.958934348 +0000 UTC m=+0.130926369 container health_status bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Feb 23 09:43:46 np0005626463.localdomain systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully.
Feb 23 09:43:46 np0005626463.localdomain podman[287592]: 2026-02-23 09:43:46.996524021 +0000 UTC m=+0.168516042 container exec_died bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Feb 23 09:43:47 np0005626463.localdomain systemd[1]: bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.service: Deactivated successfully.
Feb 23 09:43:48 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:43:48.493 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:43:48 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:43:48.495 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:43:48 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:43:48.495 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 09:43:48 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:43:48.495 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:43:48 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:43:48.524 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:43:48 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:43:48.526 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:43:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:43:48.545 163572 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 09:43:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:43:48.546 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 09:43:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:43:48.546 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 09:43:49 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.
Feb 23 09:43:49 np0005626463.localdomain podman[287640]: 2026-02-23 09:43:49.93308932 +0000 UTC m=+0.108661973 container health_status be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260216)
Feb 23 09:43:49 np0005626463.localdomain podman[287640]: 2026-02-23 09:43:49.946343234 +0000 UTC m=+0.121915837 container exec_died be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb)
Feb 23 09:43:49 np0005626463.localdomain systemd[1]: be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.service: Deactivated successfully.
Feb 23 09:43:53 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:43:53.527 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:43:53 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:43:53.529 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:43:53 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:43:53.529 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 09:43:53 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:43:53.529 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:43:53 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:43:53.556 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:43:53 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:43:53.557 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:43:54 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.
Feb 23 09:43:54 np0005626463.localdomain podman[287658]: 2026-02-23 09:43:54.945647739 +0000 UTC m=+0.116517099 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ovn_metadata_agent, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.43.0, managed_by=edpm_ansible)
Feb 23 09:43:54 np0005626463.localdomain podman[287658]: 2026-02-23 09:43:54.975245512 +0000 UTC m=+0.146114882 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true)
Feb 23 09:43:54 np0005626463.localdomain systemd[1]: 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully.
Feb 23 09:43:58 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:43:58.558 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:43:58 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:43:58.590 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:43:58 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:43:58.591 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5034 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 09:43:58 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:43:58.591 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:43:58 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:43:58.592 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:43:58 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:43:58.593 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:44:03 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:44:03.593 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:44:03 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:44:03.596 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:44:03 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:44:03.597 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 09:44:03 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:44:03.597 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:44:03 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:44:03.630 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:44:03 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:44:03.631 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:44:04 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.
Feb 23 09:44:04 np0005626463.localdomain podman[287675]: 2026-02-23 09:44:04.903389089 +0000 UTC m=+0.079740040 container health_status da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 23 09:44:04 np0005626463.localdomain podman[287675]: 2026-02-23 09:44:04.915131405 +0000 UTC m=+0.091482386 container exec_died da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 23 09:44:04 np0005626463.localdomain systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Deactivated successfully.
Feb 23 09:44:06 np0005626463.localdomain sudo[287698]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 09:44:06 np0005626463.localdomain sudo[287698]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:44:06 np0005626463.localdomain sudo[287698]: pam_unix(sudo:session): session closed for user root
Feb 23 09:44:08 np0005626463.localdomain sshd[287716]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:44:08 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:44:08.631 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:44:08 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:44:08.662 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:44:08 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:44:08.663 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5031 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 09:44:08 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:44:08.663 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:44:08 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:44:08.663 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:44:08 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:44:08.664 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:44:08 np0005626463.localdomain sudo[287718]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 09:44:08 np0005626463.localdomain sudo[287718]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:44:08 np0005626463.localdomain sudo[287718]: pam_unix(sudo:session): session closed for user root
Feb 23 09:44:08 np0005626463.localdomain sshd[287716]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:44:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:44:09.054 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:44:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:44:09.055 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 23 09:44:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:44:09.055 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 23 09:44:09 np0005626463.localdomain podman[242954]: time="2026-02-23T09:44:09Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 23 09:44:09 np0005626463.localdomain podman[242954]: @ - - [23/Feb/2026:09:44:09 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 150741 "" "Go-http-client/1.1"
Feb 23 09:44:09 np0005626463.localdomain podman[242954]: @ - - [23/Feb/2026:09:44:09 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17296 "" "Go-http-client/1.1"
Feb 23 09:44:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:44:09.705 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 23 09:44:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:44:09.705 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquired lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 23 09:44:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:44:09.706 282211 DEBUG nova.network.neutron [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 23 09:44:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:44:09.706 282211 DEBUG nova.objects.instance [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c2a7d92b-952f-46a7-8a6a-3322a48fcf4b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 23 09:44:09 np0005626463.localdomain sudo[287736]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 09:44:09 np0005626463.localdomain sudo[287736]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:44:09 np0005626463.localdomain sudo[287736]: pam_unix(sudo:session): session closed for user root
Feb 23 09:44:10 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:44:10.195 282211 DEBUG nova.network.neutron [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Updating instance_info_cache with network_info: [{"id": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "address": "fa:16:3e:a0:9d:00", "network": {"id": "9da5b53d-3184-450f-9a5b-bdba1a6c9f6d", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "37b8098efb0d4ecc90b451a2db0e966f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa27e5011-20", "ovs_interfaceid": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 23 09:44:10 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:44:10.216 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Releasing lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 23 09:44:10 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:44:10.217 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 23 09:44:10 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:44:10.218 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:44:10 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:44:10.218 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Feb 23 09:44:10 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:44:10.236 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Feb 23 09:44:10 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:44:10.236 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:44:10 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:44:10.237 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Feb 23 09:44:10 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:44:10.300 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:44:10 np0005626463.localdomain sudo[287754]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 09:44:10 np0005626463.localdomain sudo[287754]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:44:10 np0005626463.localdomain sudo[287754]: pam_unix(sudo:session): session closed for user root
Feb 23 09:44:11 np0005626463.localdomain sudo[287772]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid f1fea371-cb69-578d-a3d0-b5c472a84b46
Feb 23 09:44:11 np0005626463.localdomain sudo[287772]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:44:11 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.
Feb 23 09:44:11 np0005626463.localdomain systemd[1]: tmp-crun.snwVdA.mount: Deactivated successfully.
Feb 23 09:44:11 np0005626463.localdomain podman[287790]: 2026-02-23 09:44:11.131131989 +0000 UTC m=+0.072041140 container health_status 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, io.buildah.version=1.33.7, org.opencontainers.image.created=2026-02-05T04:57:10Z, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1770267347, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, vcs-type=git, name=ubi9/ubi-minimal, config_id=openstack_network_exporter, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, version=9.7, io.openshift.expose-services=, build-date=2026-02-05T04:57:10Z, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.openshift.tags=minimal rhel9, distribution-scope=public, maintainer=Red Hat, Inc., architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Feb 23 09:44:11 np0005626463.localdomain podman[287790]: 2026-02-23 09:44:11.140254663 +0000 UTC m=+0.081163834 container exec_died 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, vcs-type=git, release=1770267347, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, maintainer=Red Hat, Inc., build-date=2026-02-05T04:57:10Z, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, config_id=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, version=9.7, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-02-05T04:57:10Z, io.buildah.version=1.33.7, distribution-scope=public, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., name=ubi9/ubi-minimal, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 23 09:44:11 np0005626463.localdomain systemd[1]: 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.service: Deactivated successfully.
Feb 23 09:44:11 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:44:11.183 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:44:11 np0005626463.localdomain podman[287853]: 
Feb 23 09:44:11 np0005626463.localdomain podman[287853]: 2026-02-23 09:44:11.613412594 +0000 UTC m=+0.076421316 container create f476c343f6d9f7f505fb02d81d9861b282cea2723641b7844bdc2099792e4236 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=friendly_moser, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, release=1770267347, RELEASE=main, com.redhat.component=rhceph-container, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_CLEAN=True, architecture=x86_64, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2026-02-09T10:25:24Z, io.buildah.version=1.42.2, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, name=rhceph, description=Red Hat Ceph Storage 7, version=7, vcs-type=git, distribution-scope=public, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git)
Feb 23 09:44:11 np0005626463.localdomain systemd[1]: Started libpod-conmon-f476c343f6d9f7f505fb02d81d9861b282cea2723641b7844bdc2099792e4236.scope.
Feb 23 09:44:11 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 09:44:11 np0005626463.localdomain podman[287853]: 2026-02-23 09:44:11.583579953 +0000 UTC m=+0.046588705 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 23 09:44:11 np0005626463.localdomain podman[287853]: 2026-02-23 09:44:11.6908009 +0000 UTC m=+0.153809622 container init f476c343f6d9f7f505fb02d81d9861b282cea2723641b7844bdc2099792e4236 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=friendly_moser, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-type=git, release=1770267347, io.openshift.expose-services=, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, distribution-scope=public, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, io.openshift.tags=rhceph ceph, io.buildah.version=1.42.2, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, build-date=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7)
Feb 23 09:44:11 np0005626463.localdomain podman[287853]: 2026-02-23 09:44:11.701265427 +0000 UTC m=+0.164274149 container start f476c343f6d9f7f505fb02d81d9861b282cea2723641b7844bdc2099792e4236 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=friendly_moser, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, version=7, io.buildah.version=1.42.2, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.expose-services=, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., name=rhceph, ceph=True, build-date=2026-02-09T10:25:24Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, release=1770267347, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Feb 23 09:44:11 np0005626463.localdomain podman[287853]: 2026-02-23 09:44:11.701551246 +0000 UTC m=+0.164560048 container attach f476c343f6d9f7f505fb02d81d9861b282cea2723641b7844bdc2099792e4236 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=friendly_moser, vcs-type=git, CEPH_POINT_RELEASE=, RELEASE=main, ceph=True, build-date=2026-02-09T10:25:24Z, vendor=Red Hat, Inc., io.buildah.version=1.42.2, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, GIT_BRANCH=main, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1770267347, name=rhceph, architecture=x86_64, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=rhceph-container, version=7, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers)
Feb 23 09:44:11 np0005626463.localdomain friendly_moser[287868]: 167 167
Feb 23 09:44:11 np0005626463.localdomain systemd[1]: libpod-f476c343f6d9f7f505fb02d81d9861b282cea2723641b7844bdc2099792e4236.scope: Deactivated successfully.
Feb 23 09:44:11 np0005626463.localdomain podman[287853]: 2026-02-23 09:44:11.705941872 +0000 UTC m=+0.168950644 container died f476c343f6d9f7f505fb02d81d9861b282cea2723641b7844bdc2099792e4236 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=friendly_moser, build-date=2026-02-09T10:25:24Z, ceph=True, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1770267347, io.k8s.description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, RELEASE=main, name=rhceph, GIT_BRANCH=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.buildah.version=1.42.2, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., architecture=x86_64, org.opencontainers.image.created=2026-02-09T10:25:24Z, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7)
Feb 23 09:44:11 np0005626463.localdomain podman[287873]: 2026-02-23 09:44:11.825964969 +0000 UTC m=+0.107613760 container remove f476c343f6d9f7f505fb02d81d9861b282cea2723641b7844bdc2099792e4236 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=friendly_moser, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main, org.opencontainers.image.created=2026-02-09T10:25:24Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.42.2, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., distribution-scope=public, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, build-date=2026-02-09T10:25:24Z, description=Red Hat Ceph Storage 7, name=rhceph, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, GIT_CLEAN=True, CEPH_POINT_RELEASE=, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1770267347, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers)
Feb 23 09:44:11 np0005626463.localdomain systemd[1]: libpod-conmon-f476c343f6d9f7f505fb02d81d9861b282cea2723641b7844bdc2099792e4236.scope: Deactivated successfully.
Feb 23 09:44:11 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 09:44:11 np0005626463.localdomain systemd-rc-local-generator[287913]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 09:44:12 np0005626463.localdomain systemd-sysv-generator[287919]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 09:44:12 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:44:12 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 23 09:44:12 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:44:12 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:44:12 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:44:12.055 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:44:12 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:44:12.056 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:44:12 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:44:12.057 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:44:12 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:44:12.057 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 23 09:44:12 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 09:44:12 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 23 09:44:12 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:44:12 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:44:12 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:44:12 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-d05beb6bd41232dd856a5c5ba15720c8e82328032deb98cbf61da7e6a6c1578c-merged.mount: Deactivated successfully.
Feb 23 09:44:12 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 09:44:12 np0005626463.localdomain systemd-sysv-generator[287958]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 09:44:12 np0005626463.localdomain systemd-rc-local-generator[287954]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 09:44:12 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:44:12 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 23 09:44:12 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:44:12 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:44:12 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 09:44:12 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 23 09:44:12 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:44:12 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:44:12 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:44:12 np0005626463.localdomain systemd[1]: Starting Ceph mgr.np0005626463.wtksup for f1fea371-cb69-578d-a3d0-b5c472a84b46...
Feb 23 09:44:12 np0005626463.localdomain podman[288018]: 
Feb 23 09:44:12 np0005626463.localdomain podman[288018]: 2026-02-23 09:44:12.861437894 +0000 UTC m=+0.056254047 container create bb57b00fb4ffcc023092512e750b21cd585f41b4202d5c160e4cae01fa164fb4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mgr-np0005626463-wtksup, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, GIT_CLEAN=True, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.buildah.version=1.42.2, architecture=x86_64, build-date=2026-02-09T10:25:24Z, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, RELEASE=main, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1770267347, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, version=7, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vendor=Red Hat, Inc.)
Feb 23 09:44:12 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fda07ed9ae315dcbb981d4f298b8689abd147eab4ae29411d44ea165dd566889/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Feb 23 09:44:12 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fda07ed9ae315dcbb981d4f298b8689abd147eab4ae29411d44ea165dd566889/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Feb 23 09:44:12 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fda07ed9ae315dcbb981d4f298b8689abd147eab4ae29411d44ea165dd566889/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Feb 23 09:44:12 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fda07ed9ae315dcbb981d4f298b8689abd147eab4ae29411d44ea165dd566889/merged/var/lib/ceph/mgr/ceph-np0005626463.wtksup supports timestamps until 2038 (0x7fffffff)
Feb 23 09:44:12 np0005626463.localdomain podman[288018]: 2026-02-23 09:44:12.911888979 +0000 UTC m=+0.106705122 container init bb57b00fb4ffcc023092512e750b21cd585f41b4202d5c160e4cae01fa164fb4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mgr-np0005626463-wtksup, io.buildah.version=1.42.2, ceph=True, build-date=2026-02-09T10:25:24Z, name=rhceph, GIT_CLEAN=True, RELEASE=main, architecture=x86_64, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2026-02-09T10:25:24Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, GIT_BRANCH=main, release=1770267347, version=7, vendor=Red Hat, Inc., vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, CEPH_POINT_RELEASE=)
Feb 23 09:44:12 np0005626463.localdomain podman[288018]: 2026-02-23 09:44:12.920526249 +0000 UTC m=+0.115342402 container start bb57b00fb4ffcc023092512e750b21cd585f41b4202d5c160e4cae01fa164fb4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mgr-np0005626463-wtksup, GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, build-date=2026-02-09T10:25:24Z, name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.42.2, RELEASE=main, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, GIT_CLEAN=True, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, release=1770267347, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, version=7, io.openshift.tags=rhceph ceph)
Feb 23 09:44:12 np0005626463.localdomain bash[288018]: bb57b00fb4ffcc023092512e750b21cd585f41b4202d5c160e4cae01fa164fb4
Feb 23 09:44:12 np0005626463.localdomain podman[288018]: 2026-02-23 09:44:12.838969493 +0000 UTC m=+0.033785636 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 23 09:44:12 np0005626463.localdomain systemd[1]: Started Ceph mgr.np0005626463.wtksup for f1fea371-cb69-578d-a3d0-b5c472a84b46.
Feb 23 09:44:12 np0005626463.localdomain ceph-mgr[288036]: set uid:gid to 167:167 (ceph:ceph)
Feb 23 09:44:12 np0005626463.localdomain ceph-mgr[288036]: ceph version 18.2.1-381.el9cp (984f410e2a30899deb131725765b62212b1621db) reef (stable), process ceph-mgr, pid 2
Feb 23 09:44:12 np0005626463.localdomain ceph-mgr[288036]: pidfile_write: ignore empty --pid-file
Feb 23 09:44:12 np0005626463.localdomain sudo[287772]: pam_unix(sudo:session): session closed for user root
Feb 23 09:44:12 np0005626463.localdomain ceph-mgr[288036]: mgr[py] Loading python module 'alerts'
Feb 23 09:44:13 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:44:13.054 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:44:13 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:44:13.055 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:44:13 np0005626463.localdomain ceph-mgr[288036]: mgr[py] Module alerts has missing NOTIFY_TYPES member
Feb 23 09:44:13 np0005626463.localdomain ceph-mgr[288036]: mgr[py] Loading python module 'balancer'
Feb 23 09:44:13 np0005626463.localdomain ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mgr-np0005626463-wtksup[288032]: 2026-02-23T09:44:13.078+0000 7f4486b65140 -1 mgr[py] Module alerts has missing NOTIFY_TYPES member
Feb 23 09:44:13 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:44:13.085 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 09:44:13 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:44:13.085 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 09:44:13 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:44:13.085 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 09:44:13 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:44:13.086 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Auditing locally available compute resources for np0005626463.localdomain (node: np0005626463.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 23 09:44:13 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:44:13.086 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 09:44:13 np0005626463.localdomain ceph-mgr[288036]: mgr[py] Module balancer has missing NOTIFY_TYPES member
Feb 23 09:44:13 np0005626463.localdomain ceph-mgr[288036]: mgr[py] Loading python module 'cephadm'
Feb 23 09:44:13 np0005626463.localdomain ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mgr-np0005626463-wtksup[288032]: 2026-02-23T09:44:13.144+0000 7f4486b65140 -1 mgr[py] Module balancer has missing NOTIFY_TYPES member
Feb 23 09:44:13 np0005626463.localdomain openstack_network_exporter[245358]: ERROR   09:44:13 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 23 09:44:13 np0005626463.localdomain openstack_network_exporter[245358]: 
Feb 23 09:44:13 np0005626463.localdomain openstack_network_exporter[245358]: ERROR   09:44:13 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 23 09:44:13 np0005626463.localdomain openstack_network_exporter[245358]: 
Feb 23 09:44:13 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:44:13.584 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.498s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 09:44:13 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:44:13.647 282211 DEBUG nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 23 09:44:13 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:44:13.648 282211 DEBUG nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 23 09:44:13 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:44:13.664 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:44:13 np0005626463.localdomain ceph-mgr[288036]: mgr[py] Loading python module 'crash'
Feb 23 09:44:13 np0005626463.localdomain ceph-mgr[288036]: mgr[py] Module crash has missing NOTIFY_TYPES member
Feb 23 09:44:13 np0005626463.localdomain ceph-mgr[288036]: mgr[py] Loading python module 'dashboard'
Feb 23 09:44:13 np0005626463.localdomain ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mgr-np0005626463-wtksup[288032]: 2026-02-23T09:44:13.844+0000 7f4486b65140 -1 mgr[py] Module crash has missing NOTIFY_TYPES member
Feb 23 09:44:13 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:44:13.858 282211 WARNING nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 23 09:44:13 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:44:13.860 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Hypervisor/Node resource view: name=np0005626463.localdomain free_ram=12256MB free_disk=41.8366584777832GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 23 09:44:13 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:44:13.860 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 09:44:13 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:44:13.861 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 09:44:13 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:44:13.963 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Instance c2a7d92b-952f-46a7-8a6a-3322a48fcf4b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 23 09:44:13 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:44:13.964 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 23 09:44:13 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:44:13.964 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Final resource view: name=np0005626463.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 23 09:44:14 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:44:14.016 282211 DEBUG nova.scheduler.client.report [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Refreshing inventories for resource provider be63d86c-a403-4ec9-a515-07ea2962cb4d _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Feb 23 09:44:14 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:44:14.107 282211 DEBUG nova.scheduler.client.report [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Updating ProviderTree inventory for provider be63d86c-a403-4ec9-a515-07ea2962cb4d from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Feb 23 09:44:14 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:44:14.108 282211 DEBUG nova.compute.provider_tree [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Updating inventory in ProviderTree for provider be63d86c-a403-4ec9-a515-07ea2962cb4d with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Feb 23 09:44:14 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:44:14.131 282211 DEBUG nova.scheduler.client.report [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Refreshing aggregate associations for resource provider be63d86c-a403-4ec9-a515-07ea2962cb4d, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Feb 23 09:44:14 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:44:14.160 282211 DEBUG nova.scheduler.client.report [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Refreshing trait associations for resource provider be63d86c-a403-4ec9-a515-07ea2962cb4d, traits: HW_CPU_X86_AVX2,COMPUTE_ACCELERATORS,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_MMX,HW_CPU_X86_SSE41,HW_CPU_X86_SSE2,HW_CPU_X86_SVM,COMPUTE_TRUSTED_CERTS,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_SECURITY_TPM_2_0,COMPUTE_STORAGE_BUS_FDC,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_E1000,HW_CPU_X86_BMI2,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_RESCUE_BFV,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_CLMUL,HW_CPU_X86_SHA,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_AESNI,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_DEVICE_TAGGING,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_ABM,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NODE,COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_CIRRUS,HW_CPU_X86_SSE4A,HW_CPU_X86_BMI,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_F16C,COMPUTE_STORAGE_BUS_IDE,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_SSSE3,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_FMA3,HW_CPU_X86_AMD_SVM,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_AVX,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_SSE42 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Feb 23 09:44:14 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:44:14.209 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 09:44:14 np0005626463.localdomain ceph-mgr[288036]: mgr[py] Loading python module 'devicehealth'
Feb 23 09:44:14 np0005626463.localdomain ceph-mgr[288036]: mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Feb 23 09:44:14 np0005626463.localdomain ceph-mgr[288036]: mgr[py] Loading python module 'diskprediction_local'
Feb 23 09:44:14 np0005626463.localdomain ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mgr-np0005626463-wtksup[288032]: 2026-02-23T09:44:14.386+0000 7f4486b65140 -1 mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Feb 23 09:44:14 np0005626463.localdomain ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mgr-np0005626463-wtksup[288032]: /lib64/python3.9/site-packages/scipy/__init__.py:73: UserWarning: NumPy was imported from a Python sub-interpreter but NumPy does not properly support sub-interpreters. This will likely work for most users but might cause hard to track down issues or subtle bugs. A common user of the rare sub-interpreter feature is wsgi which also allows single-interpreter mode.
Feb 23 09:44:14 np0005626463.localdomain ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mgr-np0005626463-wtksup[288032]: Improvements in the case of bugs are welcome, but is not on the NumPy roadmap, and full support may require significant effort to achieve.
Feb 23 09:44:14 np0005626463.localdomain ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mgr-np0005626463-wtksup[288032]:   from numpy import show_config as show_numpy_config
Feb 23 09:44:14 np0005626463.localdomain ceph-mgr[288036]: mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Feb 23 09:44:14 np0005626463.localdomain ceph-mgr[288036]: mgr[py] Loading python module 'influx'
Feb 23 09:44:14 np0005626463.localdomain ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mgr-np0005626463-wtksup[288032]: 2026-02-23T09:44:14.528+0000 7f4486b65140 -1 mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Feb 23 09:44:14 np0005626463.localdomain ceph-mgr[288036]: mgr[py] Module influx has missing NOTIFY_TYPES member
Feb 23 09:44:14 np0005626463.localdomain ceph-mgr[288036]: mgr[py] Loading python module 'insights'
Feb 23 09:44:14 np0005626463.localdomain ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mgr-np0005626463-wtksup[288032]: 2026-02-23T09:44:14.587+0000 7f4486b65140 -1 mgr[py] Module influx has missing NOTIFY_TYPES member
Feb 23 09:44:14 np0005626463.localdomain ceph-mgr[288036]: mgr[py] Loading python module 'iostat'
Feb 23 09:44:14 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:44:14.661 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.452s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 09:44:14 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:44:14.669 282211 DEBUG nova.compute.provider_tree [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Inventory has not changed in ProviderTree for provider: be63d86c-a403-4ec9-a515-07ea2962cb4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 23 09:44:14 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:44:14.688 282211 DEBUG nova.scheduler.client.report [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Inventory has not changed for provider be63d86c-a403-4ec9-a515-07ea2962cb4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 23 09:44:14 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:44:14.690 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Compute_service record updated for np0005626463.localdomain:np0005626463.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 23 09:44:14 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:44:14.691 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.830s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 09:44:14 np0005626463.localdomain ceph-mgr[288036]: mgr[py] Module iostat has missing NOTIFY_TYPES member
Feb 23 09:44:14 np0005626463.localdomain ceph-mgr[288036]: mgr[py] Loading python module 'k8sevents'
Feb 23 09:44:14 np0005626463.localdomain ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mgr-np0005626463-wtksup[288032]: 2026-02-23T09:44:14.699+0000 7f4486b65140 -1 mgr[py] Module iostat has missing NOTIFY_TYPES member
Feb 23 09:44:14 np0005626463.localdomain ceph-mgr[288036]: mgr[py] Loading python module 'localpool'
Feb 23 09:44:15 np0005626463.localdomain ceph-mgr[288036]: mgr[py] Loading python module 'mds_autoscaler'
Feb 23 09:44:15 np0005626463.localdomain ceph-mgr[288036]: mgr[py] Loading python module 'mirroring'
Feb 23 09:44:15 np0005626463.localdomain ceph-mgr[288036]: mgr[py] Loading python module 'nfs'
Feb 23 09:44:15 np0005626463.localdomain ceph-mgr[288036]: mgr[py] Module nfs has missing NOTIFY_TYPES member
Feb 23 09:44:15 np0005626463.localdomain ceph-mgr[288036]: mgr[py] Loading python module 'orchestrator'
Feb 23 09:44:15 np0005626463.localdomain ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mgr-np0005626463-wtksup[288032]: 2026-02-23T09:44:15.399+0000 7f4486b65140 -1 mgr[py] Module nfs has missing NOTIFY_TYPES member
Feb 23 09:44:15 np0005626463.localdomain ceph-mgr[288036]: mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Feb 23 09:44:15 np0005626463.localdomain ceph-mgr[288036]: mgr[py] Loading python module 'osd_perf_query'
Feb 23 09:44:15 np0005626463.localdomain ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mgr-np0005626463-wtksup[288032]: 2026-02-23T09:44:15.538+0000 7f4486b65140 -1 mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Feb 23 09:44:15 np0005626463.localdomain ceph-mgr[288036]: mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Feb 23 09:44:15 np0005626463.localdomain ceph-mgr[288036]: mgr[py] Loading python module 'osd_support'
Feb 23 09:44:15 np0005626463.localdomain ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mgr-np0005626463-wtksup[288032]: 2026-02-23T09:44:15.601+0000 7f4486b65140 -1 mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Feb 23 09:44:15 np0005626463.localdomain ceph-mgr[288036]: mgr[py] Module osd_support has missing NOTIFY_TYPES member
Feb 23 09:44:15 np0005626463.localdomain ceph-mgr[288036]: mgr[py] Loading python module 'pg_autoscaler'
Feb 23 09:44:15 np0005626463.localdomain ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mgr-np0005626463-wtksup[288032]: 2026-02-23T09:44:15.655+0000 7f4486b65140 -1 mgr[py] Module osd_support has missing NOTIFY_TYPES member
Feb 23 09:44:15 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:44:15.688 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:44:15 np0005626463.localdomain ceph-mgr[288036]: mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Feb 23 09:44:15 np0005626463.localdomain ceph-mgr[288036]: mgr[py] Loading python module 'progress'
Feb 23 09:44:15 np0005626463.localdomain ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mgr-np0005626463-wtksup[288032]: 2026-02-23T09:44:15.720+0000 7f4486b65140 -1 mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Feb 23 09:44:15 np0005626463.localdomain ceph-mgr[288036]: mgr[py] Module progress has missing NOTIFY_TYPES member
Feb 23 09:44:15 np0005626463.localdomain ceph-mgr[288036]: mgr[py] Loading python module 'prometheus'
Feb 23 09:44:15 np0005626463.localdomain ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mgr-np0005626463-wtksup[288032]: 2026-02-23T09:44:15.777+0000 7f4486b65140 -1 mgr[py] Module progress has missing NOTIFY_TYPES member
Feb 23 09:44:16 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:44:16.054 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:44:16 np0005626463.localdomain ceph-mgr[288036]: mgr[py] Module prometheus has missing NOTIFY_TYPES member
Feb 23 09:44:16 np0005626463.localdomain ceph-mgr[288036]: mgr[py] Loading python module 'rbd_support'
Feb 23 09:44:16 np0005626463.localdomain ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mgr-np0005626463-wtksup[288032]: 2026-02-23T09:44:16.066+0000 7f4486b65140 -1 mgr[py] Module prometheus has missing NOTIFY_TYPES member
Feb 23 09:44:16 np0005626463.localdomain ceph-mgr[288036]: mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Feb 23 09:44:16 np0005626463.localdomain ceph-mgr[288036]: mgr[py] Loading python module 'restful'
Feb 23 09:44:16 np0005626463.localdomain ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mgr-np0005626463-wtksup[288032]: 2026-02-23T09:44:16.145+0000 7f4486b65140 -1 mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Feb 23 09:44:16 np0005626463.localdomain ceph-mgr[288036]: mgr[py] Loading python module 'rgw'
Feb 23 09:44:16 np0005626463.localdomain ceph-mgr[288036]: mgr[py] Module rgw has missing NOTIFY_TYPES member
Feb 23 09:44:16 np0005626463.localdomain ceph-mgr[288036]: mgr[py] Loading python module 'rook'
Feb 23 09:44:16 np0005626463.localdomain ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mgr-np0005626463-wtksup[288032]: 2026-02-23T09:44:16.503+0000 7f4486b65140 -1 mgr[py] Module rgw has missing NOTIFY_TYPES member
Feb 23 09:44:16 np0005626463.localdomain ceph-mgr[288036]: mgr[py] Module rook has missing NOTIFY_TYPES member
Feb 23 09:44:16 np0005626463.localdomain ceph-mgr[288036]: mgr[py] Loading python module 'selftest'
Feb 23 09:44:16 np0005626463.localdomain ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mgr-np0005626463-wtksup[288032]: 2026-02-23T09:44:16.908+0000 7f4486b65140 -1 mgr[py] Module rook has missing NOTIFY_TYPES member
Feb 23 09:44:16 np0005626463.localdomain ceph-mgr[288036]: mgr[py] Module selftest has missing NOTIFY_TYPES member
Feb 23 09:44:16 np0005626463.localdomain ceph-mgr[288036]: mgr[py] Loading python module 'snap_schedule'
Feb 23 09:44:16 np0005626463.localdomain ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mgr-np0005626463-wtksup[288032]: 2026-02-23T09:44:16.969+0000 7f4486b65140 -1 mgr[py] Module selftest has missing NOTIFY_TYPES member
Feb 23 09:44:17 np0005626463.localdomain ceph-mgr[288036]: mgr[py] Loading python module 'stats'
Feb 23 09:44:17 np0005626463.localdomain ceph-mgr[288036]: mgr[py] Loading python module 'status'
Feb 23 09:44:17 np0005626463.localdomain ceph-mgr[288036]: mgr[py] Module status has missing NOTIFY_TYPES member
Feb 23 09:44:17 np0005626463.localdomain ceph-mgr[288036]: mgr[py] Loading python module 'telegraf'
Feb 23 09:44:17 np0005626463.localdomain ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mgr-np0005626463-wtksup[288032]: 2026-02-23T09:44:17.160+0000 7f4486b65140 -1 mgr[py] Module status has missing NOTIFY_TYPES member
Feb 23 09:44:17 np0005626463.localdomain ceph-mgr[288036]: mgr[py] Module telegraf has missing NOTIFY_TYPES member
Feb 23 09:44:17 np0005626463.localdomain ceph-mgr[288036]: mgr[py] Loading python module 'telemetry'
Feb 23 09:44:17 np0005626463.localdomain ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mgr-np0005626463-wtksup[288032]: 2026-02-23T09:44:17.218+0000 7f4486b65140 -1 mgr[py] Module telegraf has missing NOTIFY_TYPES member
Feb 23 09:44:17 np0005626463.localdomain ceph-mgr[288036]: mgr[py] Module telemetry has missing NOTIFY_TYPES member
Feb 23 09:44:17 np0005626463.localdomain ceph-mgr[288036]: mgr[py] Loading python module 'test_orchestrator'
Feb 23 09:44:17 np0005626463.localdomain ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mgr-np0005626463-wtksup[288032]: 2026-02-23T09:44:17.351+0000 7f4486b65140 -1 mgr[py] Module telemetry has missing NOTIFY_TYPES member
Feb 23 09:44:17 np0005626463.localdomain ceph-mgr[288036]: mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Feb 23 09:44:17 np0005626463.localdomain ceph-mgr[288036]: mgr[py] Loading python module 'volumes'
Feb 23 09:44:17 np0005626463.localdomain ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mgr-np0005626463-wtksup[288032]: 2026-02-23T09:44:17.498+0000 7f4486b65140 -1 mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Feb 23 09:44:17 np0005626463.localdomain ceph-mgr[288036]: mgr[py] Module volumes has missing NOTIFY_TYPES member
Feb 23 09:44:17 np0005626463.localdomain ceph-mgr[288036]: mgr[py] Loading python module 'zabbix'
Feb 23 09:44:17 np0005626463.localdomain ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mgr-np0005626463-wtksup[288032]: 2026-02-23T09:44:17.683+0000 7f4486b65140 -1 mgr[py] Module volumes has missing NOTIFY_TYPES member
Feb 23 09:44:17 np0005626463.localdomain ceph-mgr[288036]: mgr[py] Module zabbix has missing NOTIFY_TYPES member
Feb 23 09:44:17 np0005626463.localdomain ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mgr-np0005626463-wtksup[288032]: 2026-02-23T09:44:17.743+0000 7f4486b65140 -1 mgr[py] Module zabbix has missing NOTIFY_TYPES member
Feb 23 09:44:17 np0005626463.localdomain ceph-mgr[288036]: ms_deliver_dispatch: unhandled message 0x55ea593e5600 mon_map magic: 0 from mon.1 v2:172.18.0.105:3300/0
Feb 23 09:44:17 np0005626463.localdomain ceph-mgr[288036]: client.0 ms_handle_reset on v2:172.18.0.103:6800/920472675
Feb 23 09:44:17 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.
Feb 23 09:44:17 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.
Feb 23 09:44:17 np0005626463.localdomain sudo[288110]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 09:44:17 np0005626463.localdomain sudo[288110]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:44:17 np0005626463.localdomain sudo[288110]: pam_unix(sudo:session): session closed for user root
Feb 23 09:44:17 np0005626463.localdomain podman[288126]: 2026-02-23 09:44:17.932100998 +0000 UTC m=+0.097114230 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, org.label-schema.build-date=20260216, config_id=ovn_controller, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb)
Feb 23 09:44:18 np0005626463.localdomain systemd[1]: tmp-crun.V3arwi.mount: Deactivated successfully.
Feb 23 09:44:18 np0005626463.localdomain podman[288127]: 2026-02-23 09:44:18.011561724 +0000 UTC m=+0.176929587 container health_status bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 23 09:44:18 np0005626463.localdomain podman[288126]: 2026-02-23 09:44:18.015226007 +0000 UTC m=+0.180239259 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.43.0, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260216, config_id=ovn_controller, org.label-schema.schema-version=1.0)
Feb 23 09:44:18 np0005626463.localdomain podman[288127]: 2026-02-23 09:44:18.023182251 +0000 UTC m=+0.188550074 container exec_died bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Feb 23 09:44:18 np0005626463.localdomain systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully.
Feb 23 09:44:18 np0005626463.localdomain systemd[1]: bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.service: Deactivated successfully.
Feb 23 09:44:18 np0005626463.localdomain sudo[288175]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 09:44:18 np0005626463.localdomain sudo[288175]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:44:18 np0005626463.localdomain sudo[288175]: pam_unix(sudo:session): session closed for user root
Feb 23 09:44:18 np0005626463.localdomain sudo[288193]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Feb 23 09:44:18 np0005626463.localdomain sudo[288193]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:44:18 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:44:18.667 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:44:19 np0005626463.localdomain systemd[1]: tmp-crun.tI7NCp.mount: Deactivated successfully.
Feb 23 09:44:19 np0005626463.localdomain podman[288281]: 2026-02-23 09:44:19.00982418 +0000 UTC m=+0.099975608 container exec fdf07215f0388d0ebc44f1f3744080ba594441e647c300d0dade62ff5beba234 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-crash-np0005626463, vcs-type=git, com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, io.openshift.tags=rhceph ceph, release=1770267347, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-02-09T10:25:24Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_CLEAN=True, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.42.2, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, build-date=2026-02-09T10:25:24Z, description=Red Hat Ceph Storage 7, GIT_BRANCH=main)
Feb 23 09:44:19 np0005626463.localdomain podman[288281]: 2026-02-23 09:44:19.109414273 +0000 UTC m=+0.199565711 container exec_died fdf07215f0388d0ebc44f1f3744080ba594441e647c300d0dade62ff5beba234 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-crash-np0005626463, build-date=2026-02-09T10:25:24Z, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vendor=Red Hat, Inc., vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, architecture=x86_64, GIT_CLEAN=True, io.buildah.version=1.42.2, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1770267347, distribution-scope=public)
Feb 23 09:44:19 np0005626463.localdomain sudo[288193]: pam_unix(sudo:session): session closed for user root
Feb 23 09:44:19 np0005626463.localdomain sudo[288383]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 09:44:19 np0005626463.localdomain sudo[288383]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:44:19 np0005626463.localdomain sudo[288383]: pam_unix(sudo:session): session closed for user root
Feb 23 09:44:19 np0005626463.localdomain sudo[288401]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 23 09:44:19 np0005626463.localdomain sudo[288401]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:44:20 np0005626463.localdomain sudo[288401]: pam_unix(sudo:session): session closed for user root
Feb 23 09:44:20 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.
Feb 23 09:44:20 np0005626463.localdomain sudo[288452]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 09:44:20 np0005626463.localdomain sudo[288452]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:44:20 np0005626463.localdomain sudo[288452]: pam_unix(sudo:session): session closed for user root
Feb 23 09:44:20 np0005626463.localdomain podman[288466]: 2026-02-23 09:44:20.914918665 +0000 UTC m=+0.086631858 container health_status be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Feb 23 09:44:20 np0005626463.localdomain podman[288466]: 2026-02-23 09:44:20.930370318 +0000 UTC m=+0.102083501 container exec_died be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20260216, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb)
Feb 23 09:44:20 np0005626463.localdomain systemd[1]: be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.service: Deactivated successfully.
Feb 23 09:44:21 np0005626463.localdomain sshd[288487]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:44:21 np0005626463.localdomain sudo[288488]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 09:44:21 np0005626463.localdomain sudo[288488]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:44:21 np0005626463.localdomain sudo[288488]: pam_unix(sudo:session): session closed for user root
Feb 23 09:44:21 np0005626463.localdomain sshd[288487]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:44:22 np0005626463.localdomain sudo[288507]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Feb 23 09:44:22 np0005626463.localdomain sudo[288507]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:44:22 np0005626463.localdomain sudo[288507]: pam_unix(sudo:session): session closed for user root
Feb 23 09:44:22 np0005626463.localdomain sudo[288525]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/etc/ceph
Feb 23 09:44:22 np0005626463.localdomain sudo[288525]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:44:22 np0005626463.localdomain sudo[288525]: pam_unix(sudo:session): session closed for user root
Feb 23 09:44:22 np0005626463.localdomain sudo[288543]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/etc/ceph/ceph.conf.new
Feb 23 09:44:22 np0005626463.localdomain sudo[288543]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:44:22 np0005626463.localdomain sudo[288543]: pam_unix(sudo:session): session closed for user root
Feb 23 09:44:22 np0005626463.localdomain sudo[288561]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46
Feb 23 09:44:22 np0005626463.localdomain sudo[288561]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:44:22 np0005626463.localdomain sudo[288561]: pam_unix(sudo:session): session closed for user root
Feb 23 09:44:22 np0005626463.localdomain sudo[288579]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/etc/ceph/ceph.conf.new
Feb 23 09:44:22 np0005626463.localdomain sudo[288579]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:44:22 np0005626463.localdomain sudo[288579]: pam_unix(sudo:session): session closed for user root
Feb 23 09:44:22 np0005626463.localdomain sudo[288613]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/etc/ceph/ceph.conf.new
Feb 23 09:44:22 np0005626463.localdomain sudo[288613]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:44:22 np0005626463.localdomain sudo[288613]: pam_unix(sudo:session): session closed for user root
Feb 23 09:44:22 np0005626463.localdomain sudo[288631]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/etc/ceph/ceph.conf.new
Feb 23 09:44:22 np0005626463.localdomain sudo[288631]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:44:22 np0005626463.localdomain sudo[288631]: pam_unix(sudo:session): session closed for user root
Feb 23 09:44:22 np0005626463.localdomain sudo[288649]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Feb 23 09:44:22 np0005626463.localdomain sudo[288649]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:44:22 np0005626463.localdomain sudo[288649]: pam_unix(sudo:session): session closed for user root
Feb 23 09:44:23 np0005626463.localdomain sudo[288667]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config
Feb 23 09:44:23 np0005626463.localdomain sudo[288667]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:44:23 np0005626463.localdomain sudo[288667]: pam_unix(sudo:session): session closed for user root
Feb 23 09:44:23 np0005626463.localdomain sudo[288685]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config
Feb 23 09:44:23 np0005626463.localdomain sudo[288685]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:44:23 np0005626463.localdomain sudo[288685]: pam_unix(sudo:session): session closed for user root
Feb 23 09:44:23 np0005626463.localdomain sudo[288703]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf.new
Feb 23 09:44:23 np0005626463.localdomain sudo[288703]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:44:23 np0005626463.localdomain sudo[288703]: pam_unix(sudo:session): session closed for user root
Feb 23 09:44:23 np0005626463.localdomain sudo[288721]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46
Feb 23 09:44:23 np0005626463.localdomain sudo[288721]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:44:23 np0005626463.localdomain sudo[288721]: pam_unix(sudo:session): session closed for user root
Feb 23 09:44:23 np0005626463.localdomain sudo[288739]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf.new
Feb 23 09:44:23 np0005626463.localdomain sudo[288739]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:44:23 np0005626463.localdomain sudo[288739]: pam_unix(sudo:session): session closed for user root
Feb 23 09:44:23 np0005626463.localdomain sudo[288773]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf.new
Feb 23 09:44:23 np0005626463.localdomain sudo[288773]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:44:23 np0005626463.localdomain sudo[288773]: pam_unix(sudo:session): session closed for user root
Feb 23 09:44:23 np0005626463.localdomain sudo[288791]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf.new
Feb 23 09:44:23 np0005626463.localdomain sudo[288791]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:44:23 np0005626463.localdomain sudo[288791]: pam_unix(sudo:session): session closed for user root
Feb 23 09:44:23 np0005626463.localdomain sudo[288809]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf.new /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf
Feb 23 09:44:23 np0005626463.localdomain sudo[288809]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:44:23 np0005626463.localdomain sudo[288809]: pam_unix(sudo:session): session closed for user root
Feb 23 09:44:23 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:44:23.672 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:44:23 np0005626463.localdomain sudo[288827]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Feb 23 09:44:23 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:44:23.674 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:44:23 np0005626463.localdomain sudo[288827]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:44:23 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:44:23.675 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 09:44:23 np0005626463.localdomain sudo[288827]: pam_unix(sudo:session): session closed for user root
Feb 23 09:44:23 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:44:23.675 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:44:23 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:44:23.716 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:44:23 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:44:23.717 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:44:23 np0005626463.localdomain sudo[288845]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/etc/ceph
Feb 23 09:44:23 np0005626463.localdomain sudo[288845]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:44:23 np0005626463.localdomain sudo[288845]: pam_unix(sudo:session): session closed for user root
Feb 23 09:44:23 np0005626463.localdomain sudo[288863]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/etc/ceph/ceph.client.admin.keyring.new
Feb 23 09:44:23 np0005626463.localdomain sudo[288863]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:44:23 np0005626463.localdomain sudo[288863]: pam_unix(sudo:session): session closed for user root
Feb 23 09:44:23 np0005626463.localdomain sudo[288881]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46
Feb 23 09:44:23 np0005626463.localdomain sudo[288881]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:44:23 np0005626463.localdomain sudo[288881]: pam_unix(sudo:session): session closed for user root
Feb 23 09:44:23 np0005626463.localdomain sudo[288899]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/etc/ceph/ceph.client.admin.keyring.new
Feb 23 09:44:23 np0005626463.localdomain sudo[288899]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:44:23 np0005626463.localdomain sudo[288899]: pam_unix(sudo:session): session closed for user root
Feb 23 09:44:24 np0005626463.localdomain sudo[288933]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/etc/ceph/ceph.client.admin.keyring.new
Feb 23 09:44:24 np0005626463.localdomain sudo[288933]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:44:24 np0005626463.localdomain sudo[288933]: pam_unix(sudo:session): session closed for user root
Feb 23 09:44:24 np0005626463.localdomain sudo[288951]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/etc/ceph/ceph.client.admin.keyring.new
Feb 23 09:44:24 np0005626463.localdomain sudo[288951]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:44:24 np0005626463.localdomain sudo[288951]: pam_unix(sudo:session): session closed for user root
Feb 23 09:44:24 np0005626463.localdomain sudo[288969]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/etc/ceph/ceph.client.admin.keyring.new /etc/ceph/ceph.client.admin.keyring
Feb 23 09:44:24 np0005626463.localdomain sudo[288969]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:44:24 np0005626463.localdomain sudo[288969]: pam_unix(sudo:session): session closed for user root
Feb 23 09:44:24 np0005626463.localdomain sudo[288987]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config
Feb 23 09:44:24 np0005626463.localdomain sudo[288987]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:44:24 np0005626463.localdomain sudo[288987]: pam_unix(sudo:session): session closed for user root
Feb 23 09:44:24 np0005626463.localdomain sudo[289005]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config
Feb 23 09:44:24 np0005626463.localdomain sudo[289005]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:44:24 np0005626463.localdomain sudo[289005]: pam_unix(sudo:session): session closed for user root
Feb 23 09:44:24 np0005626463.localdomain sudo[289023]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.client.admin.keyring.new
Feb 23 09:44:24 np0005626463.localdomain sudo[289023]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:44:24 np0005626463.localdomain sudo[289023]: pam_unix(sudo:session): session closed for user root
Feb 23 09:44:24 np0005626463.localdomain sudo[289041]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46
Feb 23 09:44:24 np0005626463.localdomain sudo[289041]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:44:24 np0005626463.localdomain sudo[289041]: pam_unix(sudo:session): session closed for user root
Feb 23 09:44:24 np0005626463.localdomain sudo[289059]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.client.admin.keyring.new
Feb 23 09:44:24 np0005626463.localdomain sudo[289059]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:44:24 np0005626463.localdomain sudo[289059]: pam_unix(sudo:session): session closed for user root
Feb 23 09:44:24 np0005626463.localdomain sudo[289093]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.client.admin.keyring.new
Feb 23 09:44:24 np0005626463.localdomain sudo[289093]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:44:24 np0005626463.localdomain sudo[289093]: pam_unix(sudo:session): session closed for user root
Feb 23 09:44:24 np0005626463.localdomain sudo[289111]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.client.admin.keyring.new
Feb 23 09:44:24 np0005626463.localdomain sudo[289111]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:44:24 np0005626463.localdomain sudo[289111]: pam_unix(sudo:session): session closed for user root
Feb 23 09:44:24 np0005626463.localdomain sudo[289129]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.client.admin.keyring.new /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.client.admin.keyring
Feb 23 09:44:24 np0005626463.localdomain sudo[289129]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:44:24 np0005626463.localdomain sudo[289129]: pam_unix(sudo:session): session closed for user root
Feb 23 09:44:25 np0005626463.localdomain sudo[289147]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 09:44:25 np0005626463.localdomain sudo[289147]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:44:25 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.
Feb 23 09:44:25 np0005626463.localdomain sudo[289147]: pam_unix(sudo:session): session closed for user root
Feb 23 09:44:25 np0005626463.localdomain podman[289164]: 2026-02-23 09:44:25.374033375 +0000 UTC m=+0.085474872 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0)
Feb 23 09:44:25 np0005626463.localdomain podman[289164]: 2026-02-23 09:44:25.379897256 +0000 UTC m=+0.091338773 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ovn_metadata_agent, org.label-schema.build-date=20260216, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Feb 23 09:44:25 np0005626463.localdomain systemd[1]: 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully.
Feb 23 09:44:28 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:44:28.718 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:44:28 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:44:28.720 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:44:28 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:44:28.720 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 09:44:28 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:44:28.720 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:44:28 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:44:28.762 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:44:28 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:44:28.763 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:44:31 np0005626463.localdomain ceph-mgr[288036]: ms_deliver_dispatch: unhandled message 0x55ea593e4f20 mon_map magic: 0 from mon.1 v2:172.18.0.105:3300/0
Feb 23 09:44:33 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:44:33.763 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:44:33 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:44:33.765 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:44:33 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:44:33.765 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 09:44:33 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:44:33.765 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:44:33 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:44:33.816 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:44:33 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:44:33.816 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:44:35 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.
Feb 23 09:44:35 np0005626463.localdomain podman[289184]: 2026-02-23 09:44:35.903443682 +0000 UTC m=+0.078870970 container health_status da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 23 09:44:35 np0005626463.localdomain podman[289184]: 2026-02-23 09:44:35.915150931 +0000 UTC m=+0.090578209 container exec_died da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 23 09:44:35 np0005626463.localdomain systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Deactivated successfully.
Feb 23 09:44:36 np0005626463.localdomain ceph-mds[286877]: mds.beacon.mds.np0005626463.qcthuc missed beacon ack from the monitors
Feb 23 09:44:37 np0005626463.localdomain sudo[289208]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 09:44:37 np0005626463.localdomain sudo[289208]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:44:37 np0005626463.localdomain sudo[289208]: pam_unix(sudo:session): session closed for user root
Feb 23 09:44:37 np0005626463.localdomain sudo[289226]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid f1fea371-cb69-578d-a3d0-b5c472a84b46
Feb 23 09:44:37 np0005626463.localdomain sudo[289226]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:44:37 np0005626463.localdomain podman[289287]: 
Feb 23 09:44:37 np0005626463.localdomain podman[289287]: 2026-02-23 09:44:37.598394322 +0000 UTC m=+0.072693001 container create 386b7497eb3c18ad0c0d129dc9fd4bb398dd9808ba346e5d17dce258d6f4ecd9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=compassionate_bhaskara, architecture=x86_64, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-02-09T10:25:24Z, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.42.2, url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, distribution-scope=public, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, version=7, release=1770267347, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.created=2026-02-09T10:25:24Z, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, ceph=True)
Feb 23 09:44:37 np0005626463.localdomain systemd[1]: Started libpod-conmon-386b7497eb3c18ad0c0d129dc9fd4bb398dd9808ba346e5d17dce258d6f4ecd9.scope.
Feb 23 09:44:37 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 09:44:37 np0005626463.localdomain podman[289287]: 2026-02-23 09:44:37.567078822 +0000 UTC m=+0.041377541 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 23 09:44:37 np0005626463.localdomain podman[289287]: 2026-02-23 09:44:37.676709434 +0000 UTC m=+0.151008113 container init 386b7497eb3c18ad0c0d129dc9fd4bb398dd9808ba346e5d17dce258d6f4ecd9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=compassionate_bhaskara, vcs-type=git, RELEASE=main, architecture=x86_64, io.openshift.expose-services=, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, release=1770267347, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_CLEAN=True, io.buildah.version=1.42.2, version=7, distribution-scope=public, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, name=rhceph, build-date=2026-02-09T10:25:24Z, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Feb 23 09:44:37 np0005626463.localdomain podman[289287]: 2026-02-23 09:44:37.68634638 +0000 UTC m=+0.160645079 container start 386b7497eb3c18ad0c0d129dc9fd4bb398dd9808ba346e5d17dce258d6f4ecd9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=compassionate_bhaskara, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, CEPH_POINT_RELEASE=, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.created=2026-02-09T10:25:24Z, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, vcs-type=git, version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.42.2, release=1770267347, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, ceph=True, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, RELEASE=main)
Feb 23 09:44:37 np0005626463.localdomain podman[289287]: 2026-02-23 09:44:37.686639589 +0000 UTC m=+0.160945148 container attach 386b7497eb3c18ad0c0d129dc9fd4bb398dd9808ba346e5d17dce258d6f4ecd9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=compassionate_bhaskara, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, version=7, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2026-02-09T10:25:24Z, ceph=True, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.42.2, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.created=2026-02-09T10:25:24Z, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, architecture=x86_64, vcs-type=git, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1770267347, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True)
Feb 23 09:44:37 np0005626463.localdomain compassionate_bhaskara[289302]: 167 167
Feb 23 09:44:37 np0005626463.localdomain systemd[1]: libpod-386b7497eb3c18ad0c0d129dc9fd4bb398dd9808ba346e5d17dce258d6f4ecd9.scope: Deactivated successfully.
Feb 23 09:44:37 np0005626463.localdomain podman[289307]: 2026-02-23 09:44:37.760838164 +0000 UTC m=+0.056363760 container died 386b7497eb3c18ad0c0d129dc9fd4bb398dd9808ba346e5d17dce258d6f4ecd9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=compassionate_bhaskara, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.42.2, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, ceph=True, release=1770267347, name=rhceph, RELEASE=main, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, org.opencontainers.image.created=2026-02-09T10:25:24Z, distribution-scope=public, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, architecture=x86_64, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_BRANCH=main, build-date=2026-02-09T10:25:24Z, version=7)
Feb 23 09:44:37 np0005626463.localdomain podman[289307]: 2026-02-23 09:44:37.795255959 +0000 UTC m=+0.090781495 container remove 386b7497eb3c18ad0c0d129dc9fd4bb398dd9808ba346e5d17dce258d6f4ecd9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=compassionate_bhaskara, architecture=x86_64, com.redhat.component=rhceph-container, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.created=2026-02-09T10:25:24Z, ceph=True, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, version=7, distribution-scope=public, vcs-type=git, RELEASE=main, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, release=1770267347, build-date=2026-02-09T10:25:24Z, name=rhceph, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.42.2)
Feb 23 09:44:37 np0005626463.localdomain systemd[1]: libpod-conmon-386b7497eb3c18ad0c0d129dc9fd4bb398dd9808ba346e5d17dce258d6f4ecd9.scope: Deactivated successfully.
Feb 23 09:44:37 np0005626463.localdomain podman[289324]: 
Feb 23 09:44:37 np0005626463.localdomain podman[289324]: 2026-02-23 09:44:37.89407008 +0000 UTC m=+0.068766620 container create a4a57159a55c33e21db6f47ec55e653b5f5903a5592f26551d3ad121f7f59f5f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=laughing_yalow, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, CEPH_POINT_RELEASE=, architecture=x86_64, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, name=rhceph, RELEASE=main, io.buildah.version=1.42.2, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, release=1770267347, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2026-02-09T10:25:24Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, version=7, com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, vendor=Red Hat, Inc.)
Feb 23 09:44:37 np0005626463.localdomain systemd[1]: Started libpod-conmon-a4a57159a55c33e21db6f47ec55e653b5f5903a5592f26551d3ad121f7f59f5f.scope.
Feb 23 09:44:37 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 09:44:37 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/149c9e966aad0d2944300a07deaa8ea741c20dedccfa29c7e8f1268d650045ce/merged/tmp/config supports timestamps until 2038 (0x7fffffff)
Feb 23 09:44:37 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/149c9e966aad0d2944300a07deaa8ea741c20dedccfa29c7e8f1268d650045ce/merged/tmp/keyring supports timestamps until 2038 (0x7fffffff)
Feb 23 09:44:37 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/149c9e966aad0d2944300a07deaa8ea741c20dedccfa29c7e8f1268d650045ce/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Feb 23 09:44:37 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/149c9e966aad0d2944300a07deaa8ea741c20dedccfa29c7e8f1268d650045ce/merged/var/lib/ceph/mon/ceph-np0005626463 supports timestamps until 2038 (0x7fffffff)
Feb 23 09:44:37 np0005626463.localdomain podman[289324]: 2026-02-23 09:44:37.948664184 +0000 UTC m=+0.123360724 container init a4a57159a55c33e21db6f47ec55e653b5f5903a5592f26551d3ad121f7f59f5f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=laughing_yalow, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, version=7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vendor=Red Hat, Inc., ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1770267347, architecture=x86_64, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, build-date=2026-02-09T10:25:24Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.buildah.version=1.42.2, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, GIT_CLEAN=True, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, distribution-scope=public)
Feb 23 09:44:37 np0005626463.localdomain podman[289324]: 2026-02-23 09:44:37.958276509 +0000 UTC m=+0.132973049 container start a4a57159a55c33e21db6f47ec55e653b5f5903a5592f26551d3ad121f7f59f5f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=laughing_yalow, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, io.buildah.version=1.42.2, CEPH_POINT_RELEASE=, ceph=True, io.openshift.tags=rhceph ceph, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, release=1770267347, build-date=2026-02-09T10:25:24Z, architecture=x86_64, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.openshift.expose-services=, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, GIT_BRANCH=main, distribution-scope=public, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, vendor=Red Hat, Inc.)
Feb 23 09:44:37 np0005626463.localdomain podman[289324]: 2026-02-23 09:44:37.958518456 +0000 UTC m=+0.133215026 container attach a4a57159a55c33e21db6f47ec55e653b5f5903a5592f26551d3ad121f7f59f5f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=laughing_yalow, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.created=2026-02-09T10:25:24Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, CEPH_POINT_RELEASE=, release=1770267347, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.42.2, io.openshift.tags=rhceph ceph, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, architecture=x86_64, com.redhat.component=rhceph-container, io.openshift.expose-services=, build-date=2026-02-09T10:25:24Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_BRANCH=main)
Feb 23 09:44:37 np0005626463.localdomain podman[289324]: 2026-02-23 09:44:37.869746074 +0000 UTC m=+0.044442614 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 23 09:44:38 np0005626463.localdomain systemd[1]: libpod-a4a57159a55c33e21db6f47ec55e653b5f5903a5592f26551d3ad121f7f59f5f.scope: Deactivated successfully.
Feb 23 09:44:38 np0005626463.localdomain podman[289324]: 2026-02-23 09:44:38.054656505 +0000 UTC m=+0.229353075 container died a4a57159a55c33e21db6f47ec55e653b5f5903a5592f26551d3ad121f7f59f5f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=laughing_yalow, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_CLEAN=True, distribution-scope=public, version=7, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, release=1770267347, build-date=2026-02-09T10:25:24Z, GIT_BRANCH=main, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., io.buildah.version=1.42.2, description=Red Hat Ceph Storage 7, architecture=x86_64)
Feb 23 09:44:38 np0005626463.localdomain podman[289366]: 2026-02-23 09:44:38.143722806 +0000 UTC m=+0.079869331 container remove a4a57159a55c33e21db6f47ec55e653b5f5903a5592f26551d3ad121f7f59f5f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=laughing_yalow, description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-type=git, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1770267347, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=7, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, io.buildah.version=1.42.2, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, build-date=2026-02-09T10:25:24Z, io.openshift.expose-services=, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, architecture=x86_64, name=rhceph, distribution-scope=public)
Feb 23 09:44:38 np0005626463.localdomain systemd[1]: libpod-conmon-a4a57159a55c33e21db6f47ec55e653b5f5903a5592f26551d3ad121f7f59f5f.scope: Deactivated successfully.
Feb 23 09:44:38 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 09:44:38 np0005626463.localdomain systemd-sysv-generator[289405]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 09:44:38 np0005626463.localdomain systemd-rc-local-generator[289402]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 09:44:38 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:44:38 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 23 09:44:38 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:44:38 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:44:38 np0005626463.localdomain ceph-mgr[288036]: ms_deliver_dispatch: unhandled message 0x55ea593e5080 mon_map magic: 0 from mon.1 v2:172.18.0.105:3300/0
Feb 23 09:44:38 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 09:44:38 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 23 09:44:38 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:44:38 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:44:38 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:44:38 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-43ba382414e8e5720309cffa5ee892596c9f325953822f0ed270cb7544612577-merged.mount: Deactivated successfully.
Feb 23 09:44:38 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 09:44:38 np0005626463.localdomain systemd-rc-local-generator[289445]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 09:44:38 np0005626463.localdomain systemd-sysv-generator[289452]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 09:44:38 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:44:38 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 23 09:44:38 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:44:38 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:44:38 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 09:44:38 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 23 09:44:38 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:44:38 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:44:38 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:44:38 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:44:38.868 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:44:38 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:44:38.872 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:44:38 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:44:38.872 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5055 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 09:44:38 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:44:38.872 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:44:38 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:44:38.873 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:44:38 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:44:38.873 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:44:38 np0005626463.localdomain systemd[1]: Starting Ceph mon.np0005626463 for f1fea371-cb69-578d-a3d0-b5c472a84b46...
Feb 23 09:44:39 np0005626463.localdomain podman[289510]: 
Feb 23 09:44:39 np0005626463.localdomain podman[289510]: 2026-02-23 09:44:39.231729273 +0000 UTC m=+0.073881807 container create 081a8332e685fb2a9081f96d40bdac777e22e1b2c9276d5513069feb8fb9f301 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mon-np0005626463, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, version=7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, distribution-scope=public, io.buildah.version=1.42.2, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, vcs-type=git, release=1770267347, RELEASE=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2026-02-09T10:25:24Z)
Feb 23 09:44:39 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e5d279caf730b4f32498bcb8a653e6f9b0d17888f70fd8780e2174e019b37e6c/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Feb 23 09:44:39 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e5d279caf730b4f32498bcb8a653e6f9b0d17888f70fd8780e2174e019b37e6c/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Feb 23 09:44:39 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e5d279caf730b4f32498bcb8a653e6f9b0d17888f70fd8780e2174e019b37e6c/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Feb 23 09:44:39 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e5d279caf730b4f32498bcb8a653e6f9b0d17888f70fd8780e2174e019b37e6c/merged/var/lib/ceph/mon/ceph-np0005626463 supports timestamps until 2038 (0x7fffffff)
Feb 23 09:44:39 np0005626463.localdomain podman[289510]: 2026-02-23 09:44:39.201465014 +0000 UTC m=+0.043617568 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 23 09:44:39 np0005626463.localdomain podman[289510]: 2026-02-23 09:44:39.303825544 +0000 UTC m=+0.145978078 container init 081a8332e685fb2a9081f96d40bdac777e22e1b2c9276d5513069feb8fb9f301 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mon-np0005626463, org.opencontainers.image.created=2026-02-09T10:25:24Z, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, architecture=x86_64, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, release=1770267347, RELEASE=main, io.openshift.expose-services=, vcs-type=git, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.42.2, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_CLEAN=True, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, build-date=2026-02-09T10:25:24Z, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, name=rhceph, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_BRANCH=main)
Feb 23 09:44:39 np0005626463.localdomain podman[289510]: 2026-02-23 09:44:39.315591305 +0000 UTC m=+0.157743849 container start 081a8332e685fb2a9081f96d40bdac777e22e1b2c9276d5513069feb8fb9f301 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mon-np0005626463, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, GIT_CLEAN=True, GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, io.openshift.expose-services=, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.tags=rhceph ceph, io.buildah.version=1.42.2, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, architecture=x86_64, release=1770267347, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, org.opencontainers.image.created=2026-02-09T10:25:24Z, build-date=2026-02-09T10:25:24Z)
Feb 23 09:44:39 np0005626463.localdomain bash[289510]: 081a8332e685fb2a9081f96d40bdac777e22e1b2c9276d5513069feb8fb9f301
Feb 23 09:44:39 np0005626463.localdomain systemd[1]: Started Ceph mon.np0005626463 for f1fea371-cb69-578d-a3d0-b5c472a84b46.
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: set uid:gid to 167:167 (ceph:ceph)
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: ceph version 18.2.1-381.el9cp (984f410e2a30899deb131725765b62212b1621db) reef (stable), process ceph-mon, pid 2
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: pidfile_write: ignore empty --pid-file
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: load: jerasure load: lrc 
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb: RocksDB version: 7.9.2
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb: Git sha 0
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb: Compile date 2026-02-06 00:00:00
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb: DB SUMMARY
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb: DB Session ID:  5E7LZX0BBD5RWYSEID7U
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb: CURRENT file:  CURRENT
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb: IDENTITY file:  IDENTITY
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb: MANIFEST file:  MANIFEST-000005 size: 59 Bytes
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb: SST files in /var/lib/ceph/mon/ceph-np0005626463/store.db dir, Total Num: 0, files: 
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb: Write Ahead Log file in /var/lib/ceph/mon/ceph-np0005626463/store.db: 000004.log size: 886 ; 
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:                         Options.error_if_exists: 0
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:                       Options.create_if_missing: 0
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:                         Options.paranoid_checks: 1
Feb 23 09:44:39 np0005626463.localdomain sudo[289226]: pam_unix(sudo:session): session closed for user root
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:             Options.flush_verify_memtable_count: 1
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:                                     Options.env: 0x5571fa965a20
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:                                      Options.fs: PosixFileSystem
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:                                Options.info_log: 0x5571fbc72d20
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:                Options.max_file_opening_threads: 16
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:                              Options.statistics: (nil)
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:                               Options.use_fsync: 0
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:                       Options.max_log_file_size: 0
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:                   Options.log_file_time_to_roll: 0
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:                       Options.keep_log_file_num: 1000
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:                    Options.recycle_log_file_num: 0
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:                         Options.allow_fallocate: 1
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:                        Options.allow_mmap_reads: 0
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:                       Options.allow_mmap_writes: 0
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:                        Options.use_direct_reads: 0
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:          Options.create_missing_column_families: 0
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:                              Options.db_log_dir: 
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:                                 Options.wal_dir: 
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:                Options.table_cache_numshardbits: 6
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:                         Options.WAL_ttl_seconds: 0
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:                       Options.WAL_size_limit_MB: 0
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:             Options.manifest_preallocation_size: 4194304
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:                     Options.is_fd_close_on_exec: 1
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:                   Options.advise_random_on_open: 1
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:                    Options.db_write_buffer_size: 0
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:                    Options.write_buffer_manager: 0x5571fbc83540
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:         Options.access_hint_on_compaction_start: 1
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:                      Options.use_adaptive_mutex: 0
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:                            Options.rate_limiter: (nil)
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:                       Options.wal_recovery_mode: 2
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:                  Options.enable_thread_tracking: 0
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:                  Options.enable_pipelined_write: 0
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:                  Options.unordered_write: 0
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:             Options.write_thread_max_yield_usec: 100
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:                               Options.row_cache: None
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:                              Options.wal_filter: None
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:             Options.avoid_flush_during_recovery: 0
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:             Options.allow_ingest_behind: 0
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:             Options.two_write_queues: 0
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:             Options.manual_wal_flush: 0
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:             Options.wal_compression: 0
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:             Options.atomic_flush: 0
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:                 Options.persist_stats_to_disk: 0
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:                 Options.write_dbid_to_manifest: 0
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:                 Options.log_readahead_size: 0
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:                 Options.best_efforts_recovery: 0
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:             Options.allow_data_in_errors: 0
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:             Options.db_host_id: __hostname__
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:             Options.enforce_single_del_contracts: true
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:             Options.max_background_jobs: 2
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:             Options.max_background_compactions: -1
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:             Options.max_subcompactions: 1
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:           Options.writable_file_max_buffer_size: 1048576
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:             Options.delayed_write_rate : 16777216
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:             Options.max_total_wal_size: 0
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:                   Options.stats_dump_period_sec: 600
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:                 Options.stats_persist_period_sec: 600
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:                          Options.max_open_files: -1
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:                          Options.bytes_per_sync: 0
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:                      Options.wal_bytes_per_sync: 0
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:                   Options.strict_bytes_per_sync: 0
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:       Options.compaction_readahead_size: 0
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:                  Options.max_background_flushes: -1
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb: Compression algorithms supported:
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:         kZSTD supported: 0
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:         kXpressCompression supported: 0
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:         kBZip2Compression supported: 0
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:         kZSTDNotFinalCompression supported: 0
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:         kLZ4Compression supported: 1
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:         kZlibCompression supported: 1
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:         kLZ4HCCompression supported: 1
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:         kSnappyCompression supported: 1
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb: Fast CRC32 supported: Supported on x86
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb: DMutex implementation: pthread_mutex_t
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: /var/lib/ceph/mon/ceph-np0005626463/store.db/MANIFEST-000005
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:           Options.merge_operator: 
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:        Options.compaction_filter: None
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:        Options.compaction_filter_factory: None
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:  Options.sst_partitioner_factory: None
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:            Options.table_factory: BlockBasedTable
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5571fbc72980)
                                                             cache_index_and_filter_blocks: 1
                                                             cache_index_and_filter_blocks_with_high_priority: 0
                                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                                             pin_top_level_index_and_filter: 1
                                                             index_type: 0
                                                             data_block_index_type: 0
                                                             index_shortening: 1
                                                             data_block_hash_table_util_ratio: 0.750000
                                                             checksum: 4
                                                             no_block_cache: 0
                                                             block_cache: 0x5571fbc6f350
                                                             block_cache_name: BinnedLRUCache
                                                             block_cache_options:
                                                               capacity : 536870912
                                                               num_shard_bits : 4
                                                               strict_capacity_limit : 0
                                                               high_pri_pool_ratio: 0.000
                                                             block_cache_compressed: (nil)
                                                             persistent_cache: (nil)
                                                             block_size: 4096
                                                             block_size_deviation: 10
                                                             block_restart_interval: 16
                                                             index_block_restart_interval: 1
                                                             metadata_block_size: 4096
                                                             partition_filters: 0
                                                             use_delta_encoding: 1
                                                             filter_policy: bloomfilter
                                                             whole_key_filtering: 1
                                                             verify_compression: 0
                                                             read_amp_bytes_per_bit: 0
                                                             format_version: 5
                                                             enable_index_compression: 1
                                                             block_align: 0
                                                             max_auto_readahead_size: 262144
                                                             prepopulate_block_cache: 0
                                                             initial_auto_readahead_size: 8192
                                                             num_file_reads_for_auto_readahead: 2
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:        Options.write_buffer_size: 33554432
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:  Options.max_write_buffer_number: 2
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:          Options.compression: NoCompression
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:                  Options.bottommost_compression: Disabled
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:       Options.prefix_extractor: nullptr
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:             Options.num_levels: 7
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:        Options.min_write_buffer_number_to_merge: 1
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:            Options.compression_opts.window_bits: -14
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:                  Options.compression_opts.level: 32767
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:               Options.compression_opts.strategy: 0
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:                  Options.compression_opts.enabled: false
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:      Options.level0_file_num_compaction_trigger: 4
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:                   Options.target_file_size_base: 67108864
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:             Options.target_file_size_multiplier: 1
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:                Options.max_bytes_for_level_base: 268435456
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb: Options.level_compaction_dynamic_level_bytes: 1
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:          Options.max_bytes_for_level_multiplier: 10.000000
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:                        Options.arena_block_size: 1048576
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:                Options.disable_auto_compactions: 0
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:                   Options.table_properties_collectors: 
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:                   Options.inplace_update_support: 0
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:   Options.memtable_huge_page_size: 0
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:                           Options.bloom_locality: 0
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:                    Options.max_successive_merges: 0
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:                Options.paranoid_file_checks: 0
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:                Options.force_consistency_checks: 1
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:                Options.report_bg_io_stats: 0
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:                               Options.ttl: 2592000
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:                       Options.enable_blob_files: false
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:                           Options.min_blob_size: 0
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:                          Options.blob_file_size: 268435456
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb:                Options.blob_file_starting_level: 0
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:/var/lib/ceph/mon/ceph-np0005626463/store.db/MANIFEST-000005 succeeded,manifest_file_number is 5, next_file_number is 7, last_sequence is 0, log_number is 0,prev_log_number is 0,max_column_family is 0,min_log_number_to_keep is 0
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 0
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: 3d1e4b58-ab15-4081-a9da-984e46fdc8b2
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771839879374784, "job": 1, "event": "recovery_started", "wal_files": [4]}
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #4 mode 2
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771839879377093, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 8, "file_size": 2012, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 1, "largest_seqno": 5, "table_properties": {"data_size": 898, "index_size": 31, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 115, "raw_average_key_size": 23, "raw_value_size": 776, "raw_average_value_size": 155, "num_data_blocks": 1, "num_entries": 5, "num_filter_entries": 5, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771839879, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "3d1e4b58-ab15-4081-a9da-984e46fdc8b2", "db_session_id": "5E7LZX0BBD5RWYSEID7U", "orig_file_number": 8, "seqno_to_time_mapping": "N/A"}}
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771839879377207, "job": 1, "event": "recovery_finished"}
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb: [db/version_set.cc:5047] Creating manifest 10
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005626463/store.db/000004.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x5571fbc96e00
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb: DB pointer 0x5571fbd8c000
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: mon.np0005626463 does not exist in monmap, will attempt to join an existing cluster
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                           ** DB Stats **
                                                           Uptime(secs): 0.0 total, 0.0 interval
                                                           Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s
                                                           Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                           Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                           Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                                           
                                                           ** Compaction Stats [default] **
                                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                             L0      1/0    1.96 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.8      0.00              0.00         1    0.002       0      0       0.0       0.0
                                                            Sum      1/0    1.96 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.8      0.00              0.00         1    0.002       0      0       0.0       0.0
                                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.8      0.00              0.00         1    0.002       0      0       0.0       0.0
                                                           
                                                           ** Compaction Stats [default] **
                                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.8      0.00              0.00         1    0.002       0      0       0.0       0.0
                                                           
                                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                           
                                                           Uptime(secs): 0.0 total, 0.0 interval
                                                           Flush(GB): cumulative 0.000, interval 0.000
                                                           AddFile(GB): cumulative 0.000, interval 0.000
                                                           AddFile(Total Files): cumulative 0, interval 0
                                                           AddFile(L0 Files): cumulative 0, interval 0
                                                           AddFile(Keys): cumulative 0, interval 0
                                                           Cumulative compaction: 0.00 GB write, 0.14 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                           Interval compaction: 0.00 GB write, 0.14 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                           Block cache BinnedLRUCache@0x5571fbc6f350#2 capacity: 512.00 MB usage: 1.30 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 0 last_secs: 3.4e-05 secs_since: 0
                                                           Block cache entry stats(count,size,portion): FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.11 KB,2.08616e-05%) Misc(2,1.08 KB,0.000205636%)
                                                           
                                                           ** File Read Latency Histogram By Level [default] **
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: using public_addr v2:172.18.0.106:0/0 -> [v2:172.18.0.106:3300/0,v1:172.18.0.106:6789/0]
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: starting mon.np0005626463 rank -1 at public addrs [v2:172.18.0.106:3300/0,v1:172.18.0.106:6789/0] at bind addrs [v2:172.18.0.106:3300/0,v1:172.18.0.106:6789/0] mon_data /var/lib/ceph/mon/ceph-np0005626463 fsid f1fea371-cb69-578d-a3d0-b5c472a84b46
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: mon.np0005626463@-1(???) e0 preinit fsid f1fea371-cb69-578d-a3d0-b5c472a84b46
Feb 23 09:44:39 np0005626463.localdomain podman[242954]: time="2026-02-23T09:44:39Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: mon.np0005626463@-1(synchronizing) e5 sync_obtain_latest_monmap
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: mon.np0005626463@-1(synchronizing) e5 sync_obtain_latest_monmap obtained monmap e5
Feb 23 09:44:39 np0005626463.localdomain podman[242954]: @ - - [23/Feb/2026:09:44:39 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155257 "" "Go-http-client/1.1"
Feb 23 09:44:39 np0005626463.localdomain podman[242954]: @ - - [23/Feb/2026:09:44:39 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18253 "" "Go-http-client/1.1"
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: mon.np0005626463@-1(synchronizing).mds e17 new map
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: mon.np0005626463@-1(synchronizing).mds e17 print_map
                                                           e17
                                                           enable_multiple, ever_enabled_multiple: 1,1
                                                           default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,12=quiesce subvolumes}
                                                           legacy client fscid: 1
                                                            
                                                           Filesystem 'cephfs' (1)
                                                           fs_name        cephfs
                                                           epoch        16
                                                           flags        12 joinable allow_snaps allow_multimds_snaps
                                                           created        2026-02-23T07:57:46.097663+0000
                                                           modified        2026-02-23T09:43:29.529267+0000
                                                           tableserver        0
                                                           root        0
                                                           session_timeout        60
                                                           session_autoclose        300
                                                           max_file_size        1099511627776
                                                           required_client_features        {}
                                                           last_failure        0
                                                           last_failure_osd_epoch        79
                                                           compat        compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,12=quiesce subvolumes}
                                                           max_mds        1
                                                           in        0
                                                           up        {0=26518}
                                                           failed        
                                                           damaged        
                                                           stopped        
                                                           data_pools        [6]
                                                           metadata_pool        7
                                                           inline_data        disabled
                                                           balancer        
                                                           bal_rank_mask        -1
                                                           standby_count_wanted        1
                                                           qdb_cluster        leader: 26518 members: 26518
                                                           [mds.mds.np0005626463.qcthuc{0:26518} state up:active seq 13 addr [v2:172.18.0.106:6808/2515508693,v1:172.18.0.106:6809/2515508693] compat {c=[1],r=[1],i=[17ff]}]
                                                            
                                                            
                                                           Standby daemons:
                                                            
                                                           [mds.mds.np0005626465.drvnoy{-1:26498} state up:standby seq 1 addr [v2:172.18.0.107:6808/2939113664,v1:172.18.0.107:6809/2939113664] compat {c=[1],r=[1],i=[17ff]}]
                                                           [mds.mds.np0005626466.vaywlp{-1:26506} state up:standby seq 1 addr [v2:172.18.0.108:6808/2035422599,v1:172.18.0.108:6809/2035422599] compat {c=[1],r=[1],i=[17ff]}]
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: mon.np0005626463@-1(synchronizing).osd e80 crush map has features 3314933000854323200, adjusting msgr requires
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: mon.np0005626463@-1(synchronizing).osd e80 crush map has features 432629239337189376, adjusting msgr requires
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: mon.np0005626463@-1(synchronizing).osd e80 crush map has features 432629239337189376, adjusting msgr requires
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: mon.np0005626463@-1(synchronizing).osd e80 crush map has features 432629239337189376, adjusting msgr requires
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: pgmap v3894: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: pgmap v3895: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: pgmap v3896: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: pgmap v3897: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: pgmap v3898: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: pgmap v3899: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: pgmap v3900: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: from='client.? 172.18.0.32:0/1992532605' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: from='client.? 172.18.0.32:0/1992532605' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: pgmap v3901: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: from='client.17007 -' entity='client.admin' cmd=[{"prefix": "orch host label add", "hostname": "np0005626463.localdomain", "label": "mgr", "target": ["mon-mgr", ""]}]: dispatch
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' 
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: Added label mgr to host np0005626463.localdomain
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: Adjusting osd_memory_target on np0005626463.localdomain to  3396M
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: Adjusting osd_memory_target on np0005626465.localdomain to  3396M
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' 
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: Adjusting osd_memory_target on np0005626466.localdomain to  3396M
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' 
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' 
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' 
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: pgmap v3902: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: from='client.17013 -' entity='client.admin' cmd=[{"prefix": "orch host label add", "hostname": "np0005626465.localdomain", "label": "mgr", "target": ["mon-mgr", ""]}]: dispatch
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' 
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' 
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: Added label mgr to host np0005626465.localdomain
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: pgmap v3903: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: from='client.17019 -' entity='client.admin' cmd=[{"prefix": "orch host label add", "hostname": "np0005626466.localdomain", "label": "mgr", "target": ["mon-mgr", ""]}]: dispatch
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' 
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: Added label mgr to host np0005626466.localdomain
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' 
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: from='client.? 172.18.0.107:0/2394799108' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: from='client.? 172.18.0.108:0/4249322941' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: from='client.17034 -' entity='client.admin' cmd=[{"prefix": "orch apply", "target": ["mon-mgr", ""]}]: dispatch
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: Saving service mgr spec with placement label:mgr
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' 
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' 
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626463.wtksup", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' cmd='[{"prefix": "auth get-or-create", "entity": "mgr.np0005626463.wtksup", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]': finished
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' cmd={"prefix": "mgr services"} : dispatch
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: Deploying daemon mgr.np0005626463.wtksup on np0005626463.localdomain
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: pgmap v3904: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: from='client.? 172.18.0.107:0/1981017370' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: from='client.? 172.18.0.108:0/374077977' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' 
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: from='client.17052 -' entity='client.admin' cmd=[{"prefix": "orch ps", "daemon_type": "mgr", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' 
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' 
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' 
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626465.hlpkwo", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' cmd='[{"prefix": "auth get-or-create", "entity": "mgr.np0005626465.hlpkwo", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]': finished
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' cmd={"prefix": "mgr services"} : dispatch
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: Deploying daemon mgr.np0005626465.hlpkwo on np0005626465.localdomain
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: pgmap v3905: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: from='client.? 172.18.0.106:0/198086823' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' 
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: from='client.17070 -' entity='client.admin' cmd=[{"prefix": "orch host label add", "hostname": "np0005626459.localdomain", "label": "mon", "target": ["mon-mgr", ""]}]: dispatch
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: Added label mon to host np0005626459.localdomain
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: from='client.? 172.18.0.106:0/3797074821' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' 
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: from='client.17082 -' entity='client.admin' cmd=[{"prefix": "orch host label add", "hostname": "np0005626459.localdomain", "label": "_admin", "target": ["mon-mgr", ""]}]: dispatch
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: Added label _admin to host np0005626459.localdomain
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: pgmap v3906: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' 
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' 
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' 
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626466.nisqfq", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' cmd='[{"prefix": "auth get-or-create", "entity": "mgr.np0005626466.nisqfq", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]': finished
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' cmd={"prefix": "mgr services"} : dispatch
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: Deploying daemon mgr.np0005626466.nisqfq on np0005626466.localdomain
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' 
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: from='client.17094 -' entity='client.admin' cmd=[{"prefix": "orch host label add", "hostname": "np0005626460.localdomain", "label": "mon", "target": ["mon-mgr", ""]}]: dispatch
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: Added label mon to host np0005626460.localdomain
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' 
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: pgmap v3907: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: from='client.17100 -' entity='client.admin' cmd=[{"prefix": "orch host label add", "hostname": "np0005626460.localdomain", "label": "_admin", "target": ["mon-mgr", ""]}]: dispatch
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: Added label _admin to host np0005626460.localdomain
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' 
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' 
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' 
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' 
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: Standby manager daemon np0005626463.wtksup started
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: from='client.17112 -' entity='client.admin' cmd=[{"prefix": "orch host label add", "hostname": "np0005626461.localdomain", "label": "mon", "target": ["mon-mgr", ""]}]: dispatch
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' 
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: Added label mon to host np0005626461.localdomain
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: mgrmap e12: np0005626459.pmtxxl(active, since 2h), standbys: np0005626461.lrfquh, np0005626460.fyrady, np0005626463.wtksup
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' cmd={"prefix": "mgr metadata", "who": "np0005626463.wtksup", "id": "np0005626463.wtksup"} : dispatch
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: pgmap v3908: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' 
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' 
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' 
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' 
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' 
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' 
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: from='client.17118 -' entity='client.admin' cmd=[{"prefix": "orch host label add", "hostname": "np0005626461.localdomain", "label": "_admin", "target": ["mon-mgr", ""]}]: dispatch
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' 
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: Added label _admin to host np0005626461.localdomain
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: Standby manager daemon np0005626465.hlpkwo started
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' 
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: mgrmap e13: np0005626459.pmtxxl(active, since 2h), standbys: np0005626461.lrfquh, np0005626460.fyrady, np0005626465.hlpkwo, np0005626463.wtksup
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' cmd={"prefix": "mgr metadata", "who": "np0005626465.hlpkwo", "id": "np0005626465.hlpkwo"} : dispatch
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: from='client.17124 -' entity='client.admin' cmd=[{"prefix": "orch host label add", "hostname": "np0005626463.localdomain", "label": "mon", "target": ["mon-mgr", ""]}]: dispatch
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' 
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: Added label mon to host np0005626463.localdomain
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' 
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: pgmap v3909: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' 
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: from='client.17130 -' entity='client.admin' cmd=[{"prefix": "orch host label add", "hostname": "np0005626463.localdomain", "label": "_admin", "target": ["mon-mgr", ""]}]: dispatch
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' 
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: Added label _admin to host np0005626463.localdomain
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: Standby manager daemon np0005626466.nisqfq started
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: Updating np0005626463.localdomain:/etc/ceph/ceph.conf
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: Updating np0005626463.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: pgmap v3910: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: mgrmap e14: np0005626459.pmtxxl(active, since 2h), standbys: np0005626461.lrfquh, np0005626460.fyrady, np0005626465.hlpkwo, np0005626463.wtksup, np0005626466.nisqfq
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' cmd={"prefix": "mgr metadata", "who": "np0005626466.nisqfq", "id": "np0005626466.nisqfq"} : dispatch
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: from='client.17136 -' entity='client.admin' cmd=[{"prefix": "orch host label add", "hostname": "np0005626465.localdomain", "label": "mon", "target": ["mon-mgr", ""]}]: dispatch
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' 
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: Added label mon to host np0005626465.localdomain
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: Updating np0005626463.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: Updating np0005626463.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.client.admin.keyring
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: from='client.17142 -' entity='client.admin' cmd=[{"prefix": "orch host label add", "hostname": "np0005626465.localdomain", "label": "_admin", "target": ["mon-mgr", ""]}]: dispatch
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' 
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: Added label _admin to host np0005626465.localdomain
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' 
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' 
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' 
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: pgmap v3911: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: from='client.17148 -' entity='client.admin' cmd=[{"prefix": "orch host label add", "hostname": "np0005626466.localdomain", "label": "mon", "target": ["mon-mgr", ""]}]: dispatch
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' 
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: Added label mon to host np0005626466.localdomain
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' 
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: Updating np0005626465.localdomain:/etc/ceph/ceph.conf
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: Updating np0005626465.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: from='client.17154 -' entity='client.admin' cmd=[{"prefix": "orch host label add", "hostname": "np0005626466.localdomain", "label": "_admin", "target": ["mon-mgr", ""]}]: dispatch
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' 
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: Added label _admin to host np0005626466.localdomain
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: pgmap v3912: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: Updating np0005626465.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: Updating np0005626465.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.client.admin.keyring
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' 
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' 
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' 
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' 
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: from='client.17160 -' entity='client.admin' cmd=[{"prefix": "orch apply", "target": ["mon-mgr", ""]}]: dispatch
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: Saving service mon spec with placement label:mon
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: Deploying daemon mon.np0005626466 on np0005626466.localdomain
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: pgmap v3913: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: from='client.17166 -' entity='client.admin' cmd=[{"prefix": "orch ps", "daemon_type": "mon", "daemon_id": "np0005626463", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: Deploying daemon mon.np0005626465 on np0005626465.localdomain
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' cmd={"prefix": "mon metadata", "id": "np0005626459"} : dispatch
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' cmd={"prefix": "mon metadata", "id": "np0005626460"} : dispatch
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' cmd={"prefix": "mon metadata", "id": "np0005626461"} : dispatch
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' cmd={"prefix": "mon metadata", "id": "np0005626466"} : dispatch
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: mon.np0005626461 calling monitor election
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: mon.np0005626459 calling monitor election
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: mon.np0005626460 calling monitor election
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' cmd={"prefix": "mon metadata", "id": "np0005626466"} : dispatch
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: pgmap v3915: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' cmd={"prefix": "mon metadata", "id": "np0005626466"} : dispatch
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: mon.np0005626466 calling monitor election
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' cmd={"prefix": "mon metadata", "id": "np0005626465"} : dispatch
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' cmd={"prefix": "mon metadata", "id": "np0005626466"} : dispatch
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: pgmap v3916: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' cmd={"prefix": "mon metadata", "id": "np0005626465"} : dispatch
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' cmd={"prefix": "mon metadata", "id": "np0005626466"} : dispatch
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' cmd={"prefix": "mon metadata", "id": "np0005626465"} : dispatch
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' cmd={"prefix": "mon metadata", "id": "np0005626466"} : dispatch
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: mon.np0005626459 is new leader, mons np0005626459,np0005626461,np0005626460,np0005626466 in quorum (ranks 0,1,2,3)
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: monmap epoch 4
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: fsid f1fea371-cb69-578d-a3d0-b5c472a84b46
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: last_changed 2026-02-23T09:44:31.774735+0000
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: created 2026-02-23T07:36:01.997603+0000
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: min_mon_release 18 (reef)
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: election_strategy: 1
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: 0: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005626459
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: 1: [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] mon.np0005626461
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: 2: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005626460
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: 3: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005626466
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: fsmap cephfs:1 {0=mds.np0005626463.qcthuc=up:active} 2 up:standby
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: osdmap e80: 6 total, 6 up, 6 in
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: mgrmap e14: np0005626459.pmtxxl(active, since 2h), standbys: np0005626461.lrfquh, np0005626460.fyrady, np0005626465.hlpkwo, np0005626463.wtksup, np0005626466.nisqfq
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: overall HEALTH_OK
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' 
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' 
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' 
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: Deploying daemon mon.np0005626463 on np0005626463.localdomain
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: pgmap v3917: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' cmd={"prefix": "mon metadata", "id": "np0005626465"} : dispatch
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' cmd={"prefix": "mon metadata", "id": "np0005626466"} : dispatch
Feb 23 09:44:39 np0005626463.localdomain ceph-mon[289530]: mon.np0005626463@-1(synchronizing).paxosservice(auth 1..34) refresh upgraded, format 0 -> 3
Feb 23 09:44:39 np0005626463.localdomain systemd[1]: tmp-crun.ITiuMp.mount: Deactivated successfully.
Feb 23 09:44:39 np0005626463.localdomain sshd[289569]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:44:41 np0005626463.localdomain sshd[289569]: Invalid user user from 185.156.73.233 port 60190
Feb 23 09:44:41 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.
Feb 23 09:44:41 np0005626463.localdomain systemd[1]: tmp-crun.uJOoUB.mount: Deactivated successfully.
Feb 23 09:44:41 np0005626463.localdomain podman[289571]: 2026-02-23 09:44:41.313258109 +0000 UTC m=+0.084441031 container health_status 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=ubi9/ubi-minimal, vendor=Red Hat, Inc., container_name=openstack_network_exporter, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1770267347, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2026-02-05T04:57:10Z, vcs-type=git, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, config_id=openstack_network_exporter, version=9.7, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.buildah.version=1.33.7, org.opencontainers.image.created=2026-02-05T04:57:10Z, managed_by=edpm_ansible, distribution-scope=public, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64)
Feb 23 09:44:41 np0005626463.localdomain podman[289571]: 2026-02-23 09:44:41.325459094 +0000 UTC m=+0.096642066 container exec_died 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, version=9.7, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, maintainer=Red Hat, Inc., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1770267347, org.opencontainers.image.created=2026-02-05T04:57:10Z, container_name=openstack_network_exporter, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=openstack_network_exporter, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, name=ubi9/ubi-minimal, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2026-02-05T04:57:10Z, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.openshift.expose-services=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Feb 23 09:44:41 np0005626463.localdomain systemd[1]: 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.service: Deactivated successfully.
Feb 23 09:44:41 np0005626463.localdomain sshd[289569]: Connection closed by invalid user user 185.156.73.233 port 60190 [preauth]
Feb 23 09:44:43 np0005626463.localdomain openstack_network_exporter[245358]: ERROR   09:44:43 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 23 09:44:43 np0005626463.localdomain openstack_network_exporter[245358]: 
Feb 23 09:44:43 np0005626463.localdomain openstack_network_exporter[245358]: ERROR   09:44:43 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 23 09:44:43 np0005626463.localdomain openstack_network_exporter[245358]: 
Feb 23 09:44:43 np0005626463.localdomain ceph-mgr[288036]: ms_deliver_dispatch: unhandled message 0x55ea593e51e0 mon_map magic: 0 from mon.1 v2:172.18.0.105:3300/0
Feb 23 09:44:43 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:44:43.873 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:44:43 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:44:43.877 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:44:43 np0005626463.localdomain sudo[289593]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 09:44:43 np0005626463.localdomain sudo[289593]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:44:43 np0005626463.localdomain sudo[289593]: pam_unix(sudo:session): session closed for user root
Feb 23 09:44:44 np0005626463.localdomain sudo[289611]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 09:44:44 np0005626463.localdomain sudo[289611]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:44:44 np0005626463.localdomain sudo[289611]: pam_unix(sudo:session): session closed for user root
Feb 23 09:44:44 np0005626463.localdomain sudo[289629]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Feb 23 09:44:44 np0005626463.localdomain sudo[289629]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:44:45 np0005626463.localdomain podman[289721]: 2026-02-23 09:44:45.021610317 +0000 UTC m=+0.073809685 container exec fdf07215f0388d0ebc44f1f3744080ba594441e647c300d0dade62ff5beba234 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-crash-np0005626463, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.created=2026-02-09T10:25:24Z, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, vendor=Red Hat, Inc., GIT_BRANCH=main, io.buildah.version=1.42.2, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, GIT_CLEAN=True, release=1770267347, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2026-02-09T10:25:24Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.component=rhceph-container, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, architecture=x86_64, ceph=True, name=rhceph)
Feb 23 09:44:45 np0005626463.localdomain podman[289721]: 2026-02-23 09:44:45.1074557 +0000 UTC m=+0.159655058 container exec_died fdf07215f0388d0ebc44f1f3744080ba594441e647c300d0dade62ff5beba234 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-crash-np0005626463, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.42.2, org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., GIT_BRANCH=main, name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, GIT_CLEAN=True, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, architecture=x86_64, ceph=True, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, build-date=2026-02-09T10:25:24Z, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, release=1770267347, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Feb 23 09:44:45 np0005626463.localdomain ceph-mon[289530]: mon.np0005626463@-1(probing) e6  my rank is now 5 (was -1)
Feb 23 09:44:45 np0005626463.localdomain ceph-mon[289530]: log_channel(cluster) log [INF] : mon.np0005626463 calling monitor election
Feb 23 09:44:45 np0005626463.localdomain ceph-mon[289530]: paxos.5).electionLogic(0) init, first boot, initializing epoch at 1 
Feb 23 09:44:45 np0005626463.localdomain ceph-mon[289530]: mon.np0005626463@5(electing) e6 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Feb 23 09:44:45 np0005626463.localdomain sudo[289629]: pam_unix(sudo:session): session closed for user root
Feb 23 09:44:48 np0005626463.localdomain ceph-mds[286877]: mds.beacon.mds.np0005626463.qcthuc missed beacon ack from the monitors
Feb 23 09:44:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:44:48.546 163572 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 09:44:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:44:48.546 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 09:44:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:44:48.547 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 09:44:48 np0005626463.localdomain ceph-mon[289530]: mon.np0005626463@5(electing) e6 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Feb 23 09:44:48 np0005626463.localdomain ceph-mon[289530]: mon.np0005626463@5(peon) e6 _apply_compatset_features enabling new quorum features: compat={},rocompat={},incompat={4=support erasure code pools,5=new-style osdmap encoding,6=support isa/lrc erasure code,7=support shec erasure code}
Feb 23 09:44:48 np0005626463.localdomain ceph-mon[289530]: mon.np0005626463@5(peon) e6 _apply_compatset_features enabling new quorum features: compat={},rocompat={},incompat={8=support monmap features,9=luminous ondisk layout,10=mimic ondisk layout,11=nautilus ondisk layout,12=octopus ondisk layout,13=pacific ondisk layout,14=quincy ondisk layout,15=reef ondisk layout}
Feb 23 09:44:48 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' cmd={"prefix": "mon metadata", "id": "np0005626459"} : dispatch
Feb 23 09:44:48 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' cmd={"prefix": "mon metadata", "id": "np0005626460"} : dispatch
Feb 23 09:44:48 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' cmd={"prefix": "mon metadata", "id": "np0005626461"} : dispatch
Feb 23 09:44:48 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' cmd={"prefix": "mon metadata", "id": "np0005626465"} : dispatch
Feb 23 09:44:48 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' cmd={"prefix": "mon metadata", "id": "np0005626466"} : dispatch
Feb 23 09:44:48 np0005626463.localdomain ceph-mon[289530]: mon.np0005626459 calling monitor election
Feb 23 09:44:48 np0005626463.localdomain ceph-mon[289530]: mon.np0005626460 calling monitor election
Feb 23 09:44:48 np0005626463.localdomain ceph-mon[289530]: mon.np0005626461 calling monitor election
Feb 23 09:44:48 np0005626463.localdomain ceph-mon[289530]: mon.np0005626466 calling monitor election
Feb 23 09:44:48 np0005626463.localdomain ceph-mon[289530]: pgmap v3918: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:44:48 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' cmd={"prefix": "mon metadata", "id": "np0005626465"} : dispatch
Feb 23 09:44:48 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' cmd={"prefix": "mon metadata", "id": "np0005626463"} : dispatch
Feb 23 09:44:48 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' cmd={"prefix": "mon metadata", "id": "np0005626465"} : dispatch
Feb 23 09:44:48 np0005626463.localdomain ceph-mon[289530]: mon.np0005626465 calling monitor election
Feb 23 09:44:48 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' cmd={"prefix": "mon metadata", "id": "np0005626463"} : dispatch
Feb 23 09:44:48 np0005626463.localdomain ceph-mon[289530]: pgmap v3919: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:44:48 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' cmd={"prefix": "mon metadata", "id": "np0005626465"} : dispatch
Feb 23 09:44:48 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' cmd={"prefix": "mon metadata", "id": "np0005626463"} : dispatch
Feb 23 09:44:48 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' cmd={"prefix": "mon metadata", "id": "np0005626465"} : dispatch
Feb 23 09:44:48 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' cmd={"prefix": "mon metadata", "id": "np0005626463"} : dispatch
Feb 23 09:44:48 np0005626463.localdomain ceph-mon[289530]: pgmap v3920: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:44:48 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' cmd={"prefix": "mon metadata", "id": "np0005626465"} : dispatch
Feb 23 09:44:48 np0005626463.localdomain ceph-mon[289530]: mon.np0005626459 is new leader, mons np0005626459,np0005626461,np0005626460,np0005626466,np0005626465 in quorum (ranks 0,1,2,3,4)
Feb 23 09:44:48 np0005626463.localdomain ceph-mon[289530]: monmap epoch 5
Feb 23 09:44:48 np0005626463.localdomain ceph-mon[289530]: fsid f1fea371-cb69-578d-a3d0-b5c472a84b46
Feb 23 09:44:48 np0005626463.localdomain ceph-mon[289530]: last_changed 2026-02-23T09:44:38.373552+0000
Feb 23 09:44:48 np0005626463.localdomain ceph-mon[289530]: created 2026-02-23T07:36:01.997603+0000
Feb 23 09:44:48 np0005626463.localdomain ceph-mon[289530]: min_mon_release 18 (reef)
Feb 23 09:44:48 np0005626463.localdomain ceph-mon[289530]: election_strategy: 1
Feb 23 09:44:48 np0005626463.localdomain ceph-mon[289530]: 0: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005626459
Feb 23 09:44:48 np0005626463.localdomain ceph-mon[289530]: 1: [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] mon.np0005626461
Feb 23 09:44:48 np0005626463.localdomain ceph-mon[289530]: 2: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005626460
Feb 23 09:44:48 np0005626463.localdomain ceph-mon[289530]: 3: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005626466
Feb 23 09:44:48 np0005626463.localdomain ceph-mon[289530]: 4: [v2:172.18.0.107:3300/0,v1:172.18.0.107:6789/0] mon.np0005626465
Feb 23 09:44:48 np0005626463.localdomain ceph-mon[289530]: fsmap cephfs:1 {0=mds.np0005626463.qcthuc=up:active} 2 up:standby
Feb 23 09:44:48 np0005626463.localdomain ceph-mon[289530]: osdmap e80: 6 total, 6 up, 6 in
Feb 23 09:44:48 np0005626463.localdomain ceph-mon[289530]: mgrmap e14: np0005626459.pmtxxl(active, since 2h), standbys: np0005626461.lrfquh, np0005626460.fyrady, np0005626465.hlpkwo, np0005626463.wtksup, np0005626466.nisqfq
Feb 23 09:44:48 np0005626463.localdomain ceph-mon[289530]: overall HEALTH_OK
Feb 23 09:44:48 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' 
Feb 23 09:44:48 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' 
Feb 23 09:44:48 np0005626463.localdomain ceph-mon[289530]: mon.np0005626463@5(peon) e6 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Feb 23 09:44:48 np0005626463.localdomain ceph-mon[289530]: mgrc update_daemon_metadata mon.np0005626463 metadata {addrs=[v2:172.18.0.106:3300/0,v1:172.18.0.106:6789/0],arch=x86_64,ceph_release=reef,ceph_version=ceph version 18.2.1-381.el9cp (984f410e2a30899deb131725765b62212b1621db) reef (stable),ceph_version_short=18.2.1-381.el9cp,compression_algorithms=none, snappy, zlib, zstd, lz4,container_hostname=np0005626463.localdomain,container_image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest,cpu=AMD EPYC-Rome Processor,device_ids=,device_paths=vda=/dev/disk/by-path/pci-0000:00:04.0,devices=vda,distro=rhel,distro_description=Red Hat Enterprise Linux 9.7 (Plow),distro_version=9.7,hostname=np0005626463.localdomain,kernel_description=#1 SMP PREEMPT_DYNAMIC Wed Apr 12 10:45:03 EDT 2023,kernel_version=5.14.0-284.11.1.el9_2.x86_64,mem_swap_kb=1048572,mem_total_kb=16116612,os=Linux}
Feb 23 09:44:48 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' cmd={"prefix": "mon metadata", "id": "np0005626459"} : dispatch
Feb 23 09:44:48 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' cmd={"prefix": "mon metadata", "id": "np0005626460"} : dispatch
Feb 23 09:44:48 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' cmd={"prefix": "mon metadata", "id": "np0005626461"} : dispatch
Feb 23 09:44:48 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' cmd={"prefix": "mon metadata", "id": "np0005626463"} : dispatch
Feb 23 09:44:48 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' cmd={"prefix": "mon metadata", "id": "np0005626465"} : dispatch
Feb 23 09:44:48 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' cmd={"prefix": "mon metadata", "id": "np0005626466"} : dispatch
Feb 23 09:44:48 np0005626463.localdomain ceph-mon[289530]: mon.np0005626459 calling monitor election
Feb 23 09:44:48 np0005626463.localdomain ceph-mon[289530]: mon.np0005626460 calling monitor election
Feb 23 09:44:48 np0005626463.localdomain ceph-mon[289530]: mon.np0005626461 calling monitor election
Feb 23 09:44:48 np0005626463.localdomain ceph-mon[289530]: mon.np0005626466 calling monitor election
Feb 23 09:44:48 np0005626463.localdomain ceph-mon[289530]: mon.np0005626465 calling monitor election
Feb 23 09:44:48 np0005626463.localdomain ceph-mon[289530]: from='client.17180 -' entity='client.admin' cmd=[{"prefix": "orch ps", "daemon_type": "mon", "daemon_id": "np0005626463", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Feb 23 09:44:48 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' cmd={"prefix": "mon metadata", "id": "np0005626463"} : dispatch
Feb 23 09:44:48 np0005626463.localdomain ceph-mon[289530]: pgmap v3921: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:44:48 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' cmd={"prefix": "mon metadata", "id": "np0005626463"} : dispatch
Feb 23 09:44:48 np0005626463.localdomain ceph-mon[289530]: mon.np0005626463 calling monitor election
Feb 23 09:44:48 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' cmd={"prefix": "mon metadata", "id": "np0005626463"} : dispatch
Feb 23 09:44:48 np0005626463.localdomain ceph-mon[289530]: pgmap v3922: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:44:48 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' cmd={"prefix": "mon metadata", "id": "np0005626463"} : dispatch
Feb 23 09:44:48 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' cmd={"prefix": "mon metadata", "id": "np0005626463"} : dispatch
Feb 23 09:44:48 np0005626463.localdomain ceph-mon[289530]: mon.np0005626459 is new leader, mons np0005626459,np0005626461,np0005626460,np0005626466,np0005626465,np0005626463 in quorum (ranks 0,1,2,3,4,5)
Feb 23 09:44:48 np0005626463.localdomain ceph-mon[289530]: monmap epoch 6
Feb 23 09:44:48 np0005626463.localdomain ceph-mon[289530]: fsid f1fea371-cb69-578d-a3d0-b5c472a84b46
Feb 23 09:44:48 np0005626463.localdomain ceph-mon[289530]: last_changed 2026-02-23T09:44:43.582605+0000
Feb 23 09:44:48 np0005626463.localdomain ceph-mon[289530]: created 2026-02-23T07:36:01.997603+0000
Feb 23 09:44:48 np0005626463.localdomain ceph-mon[289530]: min_mon_release 18 (reef)
Feb 23 09:44:48 np0005626463.localdomain ceph-mon[289530]: election_strategy: 1
Feb 23 09:44:48 np0005626463.localdomain ceph-mon[289530]: 0: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005626459
Feb 23 09:44:48 np0005626463.localdomain ceph-mon[289530]: 1: [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] mon.np0005626461
Feb 23 09:44:48 np0005626463.localdomain ceph-mon[289530]: 2: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005626460
Feb 23 09:44:48 np0005626463.localdomain ceph-mon[289530]: 3: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005626466
Feb 23 09:44:48 np0005626463.localdomain ceph-mon[289530]: 4: [v2:172.18.0.107:3300/0,v1:172.18.0.107:6789/0] mon.np0005626465
Feb 23 09:44:48 np0005626463.localdomain ceph-mon[289530]: 5: [v2:172.18.0.106:3300/0,v1:172.18.0.106:6789/0] mon.np0005626463
Feb 23 09:44:48 np0005626463.localdomain ceph-mon[289530]: fsmap cephfs:1 {0=mds.np0005626463.qcthuc=up:active} 2 up:standby
Feb 23 09:44:48 np0005626463.localdomain ceph-mon[289530]: osdmap e80: 6 total, 6 up, 6 in
Feb 23 09:44:48 np0005626463.localdomain ceph-mon[289530]: mgrmap e14: np0005626459.pmtxxl(active, since 2h), standbys: np0005626461.lrfquh, np0005626460.fyrady, np0005626465.hlpkwo, np0005626463.wtksup, np0005626466.nisqfq
Feb 23 09:44:48 np0005626463.localdomain ceph-mon[289530]: overall HEALTH_OK
Feb 23 09:44:48 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' 
Feb 23 09:44:48 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' 
Feb 23 09:44:48 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.
Feb 23 09:44:48 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.
Feb 23 09:44:48 np0005626463.localdomain sudo[289844]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Feb 23 09:44:48 np0005626463.localdomain sudo[289844]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:44:48 np0005626463.localdomain sudo[289844]: pam_unix(sudo:session): session closed for user root
Feb 23 09:44:48 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:44:48.878 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:44:48 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:44:48.880 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:44:48 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:44:48.880 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 09:44:48 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:44:48.880 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:44:48 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:44:48.909 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:44:48 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:44:48.910 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:44:48 np0005626463.localdomain podman[289848]: 2026-02-23 09:44:48.941697148 +0000 UTC m=+0.108784108 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, container_name=ovn_controller, config_id=ovn_controller, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260216)
Feb 23 09:44:48 np0005626463.localdomain sudo[289885]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/etc/ceph
Feb 23 09:44:48 np0005626463.localdomain sudo[289885]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:44:48 np0005626463.localdomain sudo[289885]: pam_unix(sudo:session): session closed for user root
Feb 23 09:44:48 np0005626463.localdomain podman[289848]: 2026-02-23 09:44:48.977167836 +0000 UTC m=+0.144254796 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260216, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true)
Feb 23 09:44:48 np0005626463.localdomain systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully.
Feb 23 09:44:48 np0005626463.localdomain podman[289850]: 2026-02-23 09:44:48.989121602 +0000 UTC m=+0.155694925 container health_status bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Feb 23 09:44:48 np0005626463.localdomain podman[289850]: 2026-02-23 09:44:48.997274362 +0000 UTC m=+0.163847715 container exec_died bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Feb 23 09:44:49 np0005626463.localdomain systemd[1]: bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.service: Deactivated successfully.
Feb 23 09:44:49 np0005626463.localdomain sudo[289919]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/etc/ceph/ceph.conf.new
Feb 23 09:44:49 np0005626463.localdomain sudo[289919]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:44:49 np0005626463.localdomain sudo[289919]: pam_unix(sudo:session): session closed for user root
Feb 23 09:44:49 np0005626463.localdomain sudo[289944]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46
Feb 23 09:44:49 np0005626463.localdomain sudo[289944]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:44:49 np0005626463.localdomain sudo[289944]: pam_unix(sudo:session): session closed for user root
Feb 23 09:44:49 np0005626463.localdomain sudo[289962]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/etc/ceph/ceph.conf.new
Feb 23 09:44:49 np0005626463.localdomain sudo[289962]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:44:49 np0005626463.localdomain sudo[289962]: pam_unix(sudo:session): session closed for user root
Feb 23 09:44:49 np0005626463.localdomain sudo[289996]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/etc/ceph/ceph.conf.new
Feb 23 09:44:49 np0005626463.localdomain sudo[289996]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:44:49 np0005626463.localdomain sudo[289996]: pam_unix(sudo:session): session closed for user root
Feb 23 09:44:49 np0005626463.localdomain sudo[290014]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/etc/ceph/ceph.conf.new
Feb 23 09:44:49 np0005626463.localdomain sudo[290014]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:44:49 np0005626463.localdomain sudo[290014]: pam_unix(sudo:session): session closed for user root
Feb 23 09:44:49 np0005626463.localdomain sudo[290032]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Feb 23 09:44:49 np0005626463.localdomain sudo[290032]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:44:49 np0005626463.localdomain sudo[290032]: pam_unix(sudo:session): session closed for user root
Feb 23 09:44:49 np0005626463.localdomain sudo[290050]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config
Feb 23 09:44:49 np0005626463.localdomain sudo[290050]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:44:49 np0005626463.localdomain sudo[290050]: pam_unix(sudo:session): session closed for user root
Feb 23 09:44:49 np0005626463.localdomain sudo[290068]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config
Feb 23 09:44:49 np0005626463.localdomain sudo[290068]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:44:49 np0005626463.localdomain sudo[290068]: pam_unix(sudo:session): session closed for user root
Feb 23 09:44:49 np0005626463.localdomain sudo[290086]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf.new
Feb 23 09:44:49 np0005626463.localdomain sudo[290086]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:44:49 np0005626463.localdomain sudo[290086]: pam_unix(sudo:session): session closed for user root
Feb 23 09:44:49 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' 
Feb 23 09:44:49 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' 
Feb 23 09:44:49 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' 
Feb 23 09:44:49 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' 
Feb 23 09:44:49 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' 
Feb 23 09:44:49 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:44:49 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 23 09:44:49 np0005626463.localdomain ceph-mon[289530]: Updating np0005626459.localdomain:/etc/ceph/ceph.conf
Feb 23 09:44:49 np0005626463.localdomain ceph-mon[289530]: Updating np0005626460.localdomain:/etc/ceph/ceph.conf
Feb 23 09:44:49 np0005626463.localdomain ceph-mon[289530]: Updating np0005626461.localdomain:/etc/ceph/ceph.conf
Feb 23 09:44:49 np0005626463.localdomain ceph-mon[289530]: Updating np0005626463.localdomain:/etc/ceph/ceph.conf
Feb 23 09:44:49 np0005626463.localdomain ceph-mon[289530]: Updating np0005626465.localdomain:/etc/ceph/ceph.conf
Feb 23 09:44:49 np0005626463.localdomain ceph-mon[289530]: Updating np0005626466.localdomain:/etc/ceph/ceph.conf
Feb 23 09:44:49 np0005626463.localdomain ceph-mon[289530]: pgmap v3923: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:44:49 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' cmd={"prefix": "mon metadata", "id": "np0005626463"} : dispatch
Feb 23 09:44:49 np0005626463.localdomain sudo[290104]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46
Feb 23 09:44:49 np0005626463.localdomain sudo[290104]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:44:49 np0005626463.localdomain sudo[290104]: pam_unix(sudo:session): session closed for user root
Feb 23 09:44:49 np0005626463.localdomain sudo[290122]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf.new
Feb 23 09:44:49 np0005626463.localdomain sudo[290122]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:44:49 np0005626463.localdomain sudo[290122]: pam_unix(sudo:session): session closed for user root
Feb 23 09:44:50 np0005626463.localdomain sudo[290156]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf.new
Feb 23 09:44:50 np0005626463.localdomain sudo[290156]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:44:50 np0005626463.localdomain sudo[290156]: pam_unix(sudo:session): session closed for user root
Feb 23 09:44:50 np0005626463.localdomain sudo[290174]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf.new
Feb 23 09:44:50 np0005626463.localdomain sudo[290174]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:44:50 np0005626463.localdomain sudo[290174]: pam_unix(sudo:session): session closed for user root
Feb 23 09:44:50 np0005626463.localdomain sudo[290192]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf.new /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf
Feb 23 09:44:50 np0005626463.localdomain sudo[290192]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:44:50 np0005626463.localdomain sudo[290192]: pam_unix(sudo:session): session closed for user root
Feb 23 09:44:51 np0005626463.localdomain ceph-mon[289530]: Updating np0005626459.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf
Feb 23 09:44:51 np0005626463.localdomain ceph-mon[289530]: Updating np0005626461.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf
Feb 23 09:44:51 np0005626463.localdomain ceph-mon[289530]: Updating np0005626460.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf
Feb 23 09:44:51 np0005626463.localdomain ceph-mon[289530]: Updating np0005626463.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf
Feb 23 09:44:51 np0005626463.localdomain ceph-mon[289530]: Updating np0005626466.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf
Feb 23 09:44:51 np0005626463.localdomain ceph-mon[289530]: Updating np0005626465.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf
Feb 23 09:44:51 np0005626463.localdomain ceph-mon[289530]: from='client.17190 -' entity='client.admin' cmd=[{"prefix": "orch ps", "daemon_type": "mon", "daemon_id": "np0005626463", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Feb 23 09:44:51 np0005626463.localdomain ceph-mon[289530]: Updating np0005626466.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 23 09:44:51 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' 
Feb 23 09:44:51 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' 
Feb 23 09:44:51 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' 
Feb 23 09:44:51 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' 
Feb 23 09:44:51 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' 
Feb 23 09:44:51 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' 
Feb 23 09:44:51 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' 
Feb 23 09:44:51 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' 
Feb 23 09:44:51 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' 
Feb 23 09:44:51 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' 
Feb 23 09:44:51 np0005626463.localdomain sudo[290210]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 09:44:51 np0005626463.localdomain sudo[290210]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:44:51 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.
Feb 23 09:44:51 np0005626463.localdomain sudo[290210]: pam_unix(sudo:session): session closed for user root
Feb 23 09:44:51 np0005626463.localdomain podman[290228]: 2026-02-23 09:44:51.750955532 +0000 UTC m=+0.078771307 container health_status be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20260216, tcib_managed=true, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute)
Feb 23 09:44:51 np0005626463.localdomain podman[290228]: 2026-02-23 09:44:51.787215844 +0000 UTC m=+0.115031649 container exec_died be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.43.0)
Feb 23 09:44:51 np0005626463.localdomain systemd[1]: be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.service: Deactivated successfully.
Feb 23 09:44:52 np0005626463.localdomain ceph-mon[289530]: Updating np0005626466.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.client.admin.keyring
Feb 23 09:44:52 np0005626463.localdomain ceph-mon[289530]: pgmap v3924: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:44:52 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' 
Feb 23 09:44:52 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' 
Feb 23 09:44:52 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' 
Feb 23 09:44:52 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 23 09:44:52 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Feb 23 09:44:52 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Feb 23 09:44:52 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:44:53 np0005626463.localdomain ceph-mon[289530]: from='client.34101 -' entity='client.admin' cmd=[{"prefix": "orch ps", "daemon_type": "mon", "daemon_id": "np0005626465", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Feb 23 09:44:53 np0005626463.localdomain ceph-mon[289530]: Reconfiguring mon.np0005626459 (monmap changed)...
Feb 23 09:44:53 np0005626463.localdomain ceph-mon[289530]: Reconfiguring daemon mon.np0005626459 on np0005626459.localdomain
Feb 23 09:44:53 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' 
Feb 23 09:44:53 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' 
Feb 23 09:44:53 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626459.pmtxxl", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 23 09:44:53 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' cmd={"prefix": "mgr services"} : dispatch
Feb 23 09:44:53 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:44:53 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:44:53.911 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:44:53 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:44:53.913 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:44:53 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:44:53.913 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 09:44:53 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:44:53.913 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:44:53 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:44:53.954 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:44:53 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:44:53.955 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:44:54 np0005626463.localdomain ceph-mon[289530]: from='client.17202 -' entity='client.admin' cmd=[{"prefix": "orch ps", "daemon_type": "mon", "daemon_id": "np0005626466", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Feb 23 09:44:54 np0005626463.localdomain ceph-mon[289530]: Reconfiguring mgr.np0005626459.pmtxxl (monmap changed)...
Feb 23 09:44:54 np0005626463.localdomain ceph-mon[289530]: Reconfiguring daemon mgr.np0005626459.pmtxxl on np0005626459.localdomain
Feb 23 09:44:54 np0005626463.localdomain ceph-mon[289530]: pgmap v3925: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:44:54 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' 
Feb 23 09:44:54 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' 
Feb 23 09:44:54 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626459", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 23 09:44:54 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:44:54 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' 
Feb 23 09:44:54 np0005626463.localdomain ceph-mon[289530]: from='client.? 172.18.0.103:0/3510382730' entity='client.admin' cmd={"prefix": "status", "format": "json"} : dispatch
Feb 23 09:44:54 np0005626463.localdomain sshd[290247]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:44:55 np0005626463.localdomain ceph-mon[289530]: Reconfiguring crash.np0005626459 (monmap changed)...
Feb 23 09:44:55 np0005626463.localdomain ceph-mon[289530]: Reconfiguring daemon crash.np0005626459 on np0005626459.localdomain
Feb 23 09:44:55 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' 
Feb 23 09:44:55 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' 
Feb 23 09:44:55 np0005626463.localdomain ceph-mon[289530]: Reconfiguring crash.np0005626460 (monmap changed)...
Feb 23 09:44:55 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626460", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 23 09:44:55 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:44:55 np0005626463.localdomain ceph-mon[289530]: Reconfiguring daemon crash.np0005626460 on np0005626460.localdomain
Feb 23 09:44:55 np0005626463.localdomain ceph-mon[289530]: pgmap v3926: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:44:55 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.
Feb 23 09:44:55 np0005626463.localdomain sshd[290247]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:44:55 np0005626463.localdomain systemd[1]: tmp-crun.JFkVLa.mount: Deactivated successfully.
Feb 23 09:44:55 np0005626463.localdomain podman[290249]: 2026-02-23 09:44:55.9191001 +0000 UTC m=+0.089533307 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb)
Feb 23 09:44:55 np0005626463.localdomain podman[290249]: 2026-02-23 09:44:55.924496376 +0000 UTC m=+0.094929573 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.build-date=20260216, tcib_managed=true)
Feb 23 09:44:55 np0005626463.localdomain systemd[1]: 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully.
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.134 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'name': 'test', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000003', 'OS-EXT-SRV-ATTR:host': 'np0005626463.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '37b8098efb0d4ecc90b451a2db0e966f', 'user_id': 'cb6895487918456aa599ca2f76872d00', 'hostId': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.134 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.139 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.141 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '027a236c-9661-4136-803c-165ba7fd8164', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:44:56.135122', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '4f967c12-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11336.324603247, 'message_signature': '6238137ca37243738933269c579d870477fda1ad7cc992ca17b67c91e24bfd1f'}]}, 'timestamp': '2026-02-23 09:44:56.140389', '_unique_id': 'f0ee13f090ed4be298b84af87e7ed130'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.141 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.141 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.141 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.141 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.141 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.141 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.141 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.141 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.141 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.141 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.141 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.141 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.141 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.141 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.141 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.141 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.141 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.141 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.141 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.141 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.141 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.141 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.141 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.141 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.141 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.141 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.141 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.141 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.141 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.141 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.141 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.143 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.155 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.155 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.157 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '89c3c41d-8857-4870-8de2-1fa858259751', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:44:56.143215', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '4f98cd8c-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11336.332700866, 'message_signature': '624d8d0631ebbfed8817452f34d8460c7fab70cc623556e03f4ae83c34b72957'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:44:56.143215', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '4f98e006-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11336.332700866, 'message_signature': 'ddf55727eafa165a094b3246dcf9c6ef07590f8c4d97a2347e9058eeb9efe3aa'}]}, 'timestamp': '2026-02-23 09:44:56.155998', '_unique_id': 'cd38012e6ec846738ecd402f9495cda9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.157 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.157 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.157 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.157 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.157 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.157 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.157 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.157 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.157 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.157 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.157 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.157 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.157 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.157 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.157 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.157 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.157 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.157 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.157 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.157 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.157 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.157 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.157 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.157 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.157 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.157 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.157 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.157 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.157 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.157 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.157 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.158 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.186 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.latency volume: 1374424344 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.186 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.latency volume: 89322858 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.188 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '21185ca3-c333-452f-aba3-0676b763c79d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1374424344, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:44:56.158259', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '4f9d93ee-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11336.347742767, 'message_signature': '7c828b2b36fde706adf33b45f1ad6ea48a16da44c97c87c418552fefced35fcb'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 89322858, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:44:56.158259', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '4f9da6a4-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11336.347742767, 'message_signature': '65482ee50f59f07e88b2b6a0fc2030779ec71f7274bd011e381838304b6f879c'}]}, 'timestamp': '2026-02-23 09:44:56.187254', '_unique_id': '08a4e5c1e4cf4378aec3d8b316ef66e4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.188 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.188 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.188 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.188 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.188 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.188 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.188 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.188 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.188 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.188 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.188 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.188 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.188 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.188 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.188 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.188 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.188 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.188 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.188 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.188 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.188 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.188 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.188 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.188 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.188 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.188 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.188 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.188 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.188 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.188 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.188 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.189 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.189 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.189 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.190 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.191 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6c1d193d-c9d9-429c-a251-24f3d81baeb6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:44:56.189660', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '4f9e1684-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11336.347742767, 'message_signature': '99de27b17fa88bab74b5b42cb53f560a0dc373dd88e58cc9fc5b4c38b10ab410'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:44:56.189660', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '4f9e26e2-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11336.347742767, 'message_signature': '11a56b19fc4d15a83c4b0570169d3f389c74f19bd7d0a5442931eedac1aa0412'}]}, 'timestamp': '2026-02-23 09:44:56.190532', '_unique_id': '284b783755604eb4abfdbb419de336d5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.191 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.191 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.191 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.191 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.191 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.191 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.191 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.191 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.191 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.191 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.191 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.191 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.191 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.191 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.191 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.191 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.191 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.191 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.191 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.191 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.191 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.191 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.191 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.191 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.191 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.191 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.191 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.191 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.191 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.191 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.191 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.192 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.192 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.bytes volume: 35597312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.193 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.194 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8a63f45a-a00c-46ab-ba35-9e02af07d3a9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35597312, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:44:56.192702', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '4f9e8d1c-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11336.347742767, 'message_signature': '7a93d3822e5bd45cc7c96920b1e01be2c8216e748203e8f1d4807371aada9de8'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:44:56.192702', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '4f9e9d34-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11336.347742767, 'message_signature': 'b3f1bbb0ab748e841eea288b36576f523f09f6a2e75828a2a208d2b48d7e517c'}]}, 'timestamp': '2026-02-23 09:44:56.193560', '_unique_id': '498f8f38041e47968df8b744381277f6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.194 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.194 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.194 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.194 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.194 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.194 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.194 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.194 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.194 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.194 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.194 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.194 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.194 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.194 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.194 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.194 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.194 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.194 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.194 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.194 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.194 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.194 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.194 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.194 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.194 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.194 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.194 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.194 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.194 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.194 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.194 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.195 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.195 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.197 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0198b8c6-ad21-4c95-8f7c-a7280ef24fa4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:44:56.195705', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '4f9f030a-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11336.324603247, 'message_signature': '7c00f5ccdcb3a3e33f73221aab28c600137c6ce32d5451fb13aee6560ac8a55d'}]}, 'timestamp': '2026-02-23 09:44:56.196198', '_unique_id': '64a17d6a76ed47bcb7d9e5a5df774704'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.197 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.197 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.197 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.197 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.197 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.197 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.197 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.197 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.197 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.197 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.197 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.197 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.197 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.197 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.197 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.197 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.197 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.197 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.197 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.197 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.197 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.197 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.197 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.197 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.197 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.197 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.197 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.197 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.197 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.197 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.197 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.198 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.215 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/memory.usage volume: 51.72265625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.217 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7e6f7d46-3abe-4fbb-a09c-8148e2b2a98b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.72265625, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'timestamp': '2026-02-23T09:44:56.198598', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '4fa21234-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11336.404869499, 'message_signature': 'a4c0699b1a6d3ab768c56849fe0ad0667edcb71d1e82737af91c4e261ca758d7'}]}, 'timestamp': '2026-02-23 09:44:56.216242', '_unique_id': '31d3d42863964401b40820a346bff034'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.217 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.217 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.217 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.217 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.217 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.217 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.217 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.217 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.217 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.217 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.217 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.217 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.217 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.217 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.217 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.217 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.217 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.217 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.217 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.217 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.217 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.217 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.217 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.217 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.217 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.217 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.217 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.217 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.217 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.217 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.217 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.218 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.218 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.219 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c8eff605-b157-48ac-a9ad-bb941c72addd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:44:56.218485', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '4fa27b3e-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11336.324603247, 'message_signature': 'f052104f01dff4a6dd833077e53ade9ecf4c4302749d3f9c421070bf57cd9a62'}]}, 'timestamp': '2026-02-23 09:44:56.218971', '_unique_id': '7d80f1f4ab434eb0a8a9a61d97b2df37'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.219 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.219 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.219 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.219 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.219 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.219 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.219 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.219 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.219 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.219 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.219 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.219 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.219 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.219 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.219 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.219 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.219 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.219 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.219 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.219 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.219 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.219 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.219 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.219 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.219 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.219 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.219 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.219 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.219 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.219 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.219 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.220 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.221 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.bytes volume: 397312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.221 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.222 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '071cd756-3ad6-46e7-9d93-daa89f98a00e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 397312, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:44:56.221062', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '4fa2dfac-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11336.347742767, 'message_signature': 'b5f864e15c7fcf651dd7bbefb7be72077e76bd32d1a34722f5ca728491a6bb48'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:44:56.221062', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '4fa2efa6-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11336.347742767, 'message_signature': 'b26cfa7273841f2695aa4995b1ae7ccc31c3ffaa9894a87190f2ec6d62a8f4f7'}]}, 'timestamp': '2026-02-23 09:44:56.221917', '_unique_id': '8ba661806dd74c4babafdc28500e588d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.222 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.222 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.222 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.222 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.222 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.222 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.222 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.222 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.222 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.222 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.222 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.222 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.222 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.222 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.222 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.222 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.222 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.222 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.222 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.222 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.222 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.222 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.222 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.222 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.222 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.222 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.222 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.222 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.222 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.222 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.222 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.223 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.224 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.224 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.225 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bfeeaa2b-03bd-4776-acb9-b215b28cd252', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:44:56.224053', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '4fa3546e-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11336.332700866, 'message_signature': 'ea5453b591eb4da59e0f16edcbcb7b54f57eebc488235700ebe0f445d79c63fd'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:44:56.224053', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '4fa3645e-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11336.332700866, 'message_signature': '619f70a5adb259d678c030a870f269e4c99ca602a769acf1e33801a0791d8184'}]}, 'timestamp': '2026-02-23 09:44:56.224901', '_unique_id': 'c61639a087b441f69baae7e0857f7aec'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.225 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.225 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.225 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.225 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.225 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.225 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.225 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.225 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.225 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.225 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.225 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.225 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.225 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.225 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.225 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.225 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.225 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.225 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.225 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.225 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.225 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.225 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.225 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.225 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.225 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.225 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.225 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.225 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.225 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.225 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.225 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.226 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.227 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.bytes volume: 6808 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.228 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd0ccd1ff-eac6-4c36-83c1-283874889efe', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6808, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:44:56.226986', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '4fa3c75a-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11336.324603247, 'message_signature': '3695ee5012151e6142cbd1efb81621dfebaa24a3faef6b1935d3f0475d8bb3ad'}]}, 'timestamp': '2026-02-23 09:44:56.227437', '_unique_id': '006f166cfb024fd69bcc1d3d338aebc5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.228 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.228 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.228 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.228 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.228 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.228 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.228 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.228 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.228 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.228 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.228 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.228 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.228 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.228 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.228 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.228 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.228 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.228 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.228 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.228 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.228 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.228 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.228 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.228 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.228 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.228 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.228 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.228 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.228 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.228 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.228 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.229 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.229 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.230 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'baf26159-88af-4897-923e-20de5f972f55', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:44:56.229522', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '4fa42a60-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11336.324603247, 'message_signature': 'af854dbb9a319ab7bd2fa3aaf0d0439eec8f3d0f178eb96b772fc9e819911aa8'}]}, 'timestamp': '2026-02-23 09:44:56.230003', '_unique_id': 'af2cafe9ad864e9e8f8b1307f6367468'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.230 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.230 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.230 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.230 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.230 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.230 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.230 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.230 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.230 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.230 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.230 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.230 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.230 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.230 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.230 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.230 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.230 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.230 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.230 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.230 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.230 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.230 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.230 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.230 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.230 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.230 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.230 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.230 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.230 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.230 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.230 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.231 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.232 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.latency volume: 1054797520 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.232 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.latency volume: 21338362 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.233 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '088bdd9c-2a80-4795-8375-2cb1889161da', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1054797520, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:44:56.232064', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '4fa48d5c-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11336.347742767, 'message_signature': '882661f88d1ea7ad5aeece9fe55a3416fc4528ad7f94d8e43a7ca4b0fb47b4fd'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 21338362, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 
'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:44:56.232064', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '4fa49d24-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11336.347742767, 'message_signature': 'f66ad6facf8cc35504be4777df2d4ec0ccbdf884d2f4e5aa2ce6e7b241a67620'}]}, 'timestamp': '2026-02-23 09:44:56.232913', '_unique_id': '32d4e88c4f1b44d7836272c2b89958e0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.233 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.233 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.233 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.233 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.233 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.233 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.233 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.233 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.233 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.233 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.233 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.233 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.233 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.233 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.233 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.233 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.233 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.233 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.233 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.233 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.233 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.233 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.233 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.233 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.233 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.233 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.233 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.233 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.233 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.233 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.233 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.234 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.235 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/cpu volume: 10260000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.236 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b841d89d-e494-4f06-861d-04bef9cf412d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 10260000000, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'timestamp': '2026-02-23T09:44:56.234995', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '4fa4ffe4-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11336.404869499, 'message_signature': '01b14bd954b541621d36ed2d6c9b40c86808d69097f884f1abdbaf800cc1a824'}]}, 'timestamp': '2026-02-23 09:44:56.235423', '_unique_id': '9eaebfa4f36040a7b180162657fd79f4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.236 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.236 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.236 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.236 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.236 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.236 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.236 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.236 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.236 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.236 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.236 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.236 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.236 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.236 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.236 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.236 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.236 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.236 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.236 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.236 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.236 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.236 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.236 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.236 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.236 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.236 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.236 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.236 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.236 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.236 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.236 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.237 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.237 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.238 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '13806fb5-0dc7-4db7-a502-9620350eef91', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:44:56.237479', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '4fa56100-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11336.324603247, 'message_signature': '22c75f3f1338fdd512b4e954242bcd3e466f22d420a15c091a61e442459ca36d'}]}, 'timestamp': '2026-02-23 09:44:56.237952', '_unique_id': '29bf2857f9c04e94a31aad2c612e9147'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.238 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.238 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.238 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.238 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.238 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.238 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.238 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.238 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.238 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.238 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.238 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.238 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.238 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.238 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.238 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.238 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.238 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.238 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.238 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.238 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.238 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.238 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.238 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.238 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.238 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.238 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.238 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.238 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.238 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.238 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.238 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.239 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.240 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.241 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '603307f6-7801-462b-83ba-2828d6437478', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:44:56.239999', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '4fa5c352-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11336.324603247, 'message_signature': 'aaea731d43c1fc7b1b91a6accab32e176264c6b11461b56182baf4873d6e9343'}]}, 'timestamp': '2026-02-23 09:44:56.240443', '_unique_id': 'd579a034a6504cb88b079be6b3f9eb63'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.241 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.241 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.241 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.241 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.241 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.241 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.241 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.241 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.241 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.241 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.241 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.241 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.241 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.241 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.241 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.241 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.241 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.241 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.241 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.241 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.241 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.241 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.241 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.241 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.241 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.241 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.241 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.241 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.241 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.241 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.241 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.242 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.242 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.243 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b87bb83c-2e4d-4299-a6c0-ab735509bfa8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:44:56.242480', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '4fa62450-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11336.324603247, 'message_signature': '338dcf86f374ea7de5109accf72dfa85bbf29768304adf31b4a5de4f7c9a5640'}]}, 'timestamp': '2026-02-23 09:44:56.242956', '_unique_id': 'f078e1a74846402dbece4a975be99fa2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.243 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.243 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.243 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.243 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.243 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.243 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.243 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.243 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.243 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.243 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.243 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.243 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.243 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.243 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.243 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.243 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.243 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.243 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.243 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.243 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.243 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.243 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.243 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.243 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.243 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.243 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.243 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.243 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.243 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.243 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.243 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.244 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.245 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.245 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.requests volume: 1283 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.245 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.247 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a0ad0527-7c79-48e2-8bb1-d8f59fe1be87', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1283, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:44:56.245257', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '4fa691a6-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11336.347742767, 'message_signature': '168ab5211f2898a00d165cbdb97a8b6c20edc6a0c2817f84f1edb29ac2b35ce6'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 
'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:44:56.245257', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '4fa6a394-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11336.347742767, 'message_signature': '23172a1295b73f18faf47a682f77db15d2193be86a01d112a61fe9eb05ca063a'}]}, 'timestamp': '2026-02-23 09:44:56.246171', '_unique_id': '22faf7c110844455b264b0fe1f60cb05'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.247 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.247 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.247 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.247 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.247 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.247 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.247 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.247 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.247 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.247 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.247 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.247 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.247 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.247 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.247 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.247 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.247 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.247 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.247 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.247 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.247 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.247 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.247 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.247 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.247 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.247 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.247 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.247 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.247 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.247 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.247 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.248 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.248 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.249 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd4786787-6780-4cb0-b626-defdbc977792', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:44:56.248441', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '4fa70db6-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11336.324603247, 'message_signature': '3b156e1a33d697ea648e8d79ea02ea96d887ec0180141353c60b97bceab80e2e'}]}, 'timestamp': '2026-02-23 09:44:56.248947', '_unique_id': '4fe362cd08534765bbca1c2479160ace'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.249 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.249 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.249 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.249 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.249 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.249 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.249 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.249 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.249 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.249 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.249 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.249 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.249 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.249 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.249 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.249 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.249 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.249 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.249 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.249 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.249 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.249 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.249 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.249 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.249 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.249 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.249 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.249 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.249 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.249 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.249 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.250 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.251 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.251 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.251 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.252 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.252 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e32d2a30-02d2-4ece-8c69-c216c5925185', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:44:56.251487', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '4fa78638-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11336.332700866, 'message_signature': '13ad19c7d864c3e05e181dbb032b62cb9e5e608358891752d5671ecd81b08a73'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:44:56.251487', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '4fa79ab0-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11336.332700866, 'message_signature': 'b26dae3c3dd782f1885c6cfdc6eac93d265da57c0d98fc9fc30aaadb92e86ed6'}]}, 'timestamp': '2026-02-23 09:44:56.252419', '_unique_id': 'ce5fede0dab448a1a099cf3a7e30f368'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.252 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.252 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.252 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.252 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.252 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.252 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.252 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.252 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.252 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.252 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.252 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.252 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.252 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.252 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.252 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.252 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.252 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.252 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.252 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.252 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.252 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.252 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.252 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.252 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.252 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.252 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.252 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.252 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.252 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.252 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.252 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.253 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.253 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.254 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '59bd7020-e53a-4d01-ad23-2dbf0c542091', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:44:56.253736', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '4fa7d99e-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11336.324603247, 'message_signature': '0a0f23ec5c015bc153119034254a2052a3b117af88dc391a4bcbe660341c2016'}]}, 'timestamp': '2026-02-23 09:44:56.254033', '_unique_id': '40ba977ea52b437a8700ed614633a571'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.254 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.254 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.254 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.254 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.254 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.254 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.254 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.254 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.254 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.254 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.254 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.254 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.254 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.254 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.254 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.254 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.254 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.254 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.254 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.254 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.254 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.254 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.254 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.254 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.254 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.254 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.254 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.254 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.254 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.254 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:44:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.254 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:44:56 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' 
Feb 23 09:44:56 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' 
Feb 23 09:44:56 np0005626463.localdomain ceph-mon[289530]: Reconfiguring mon.np0005626460 (monmap changed)...
Feb 23 09:44:56 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Feb 23 09:44:56 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Feb 23 09:44:56 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:44:56 np0005626463.localdomain ceph-mon[289530]: Reconfiguring daemon mon.np0005626460 on np0005626460.localdomain
Feb 23 09:44:56 np0005626463.localdomain ceph-mon[289530]: from='client.? 172.18.0.103:0/1035755966' entity='client.admin' cmd={"prefix": "mgr stat", "format": "json"} : dispatch
Feb 23 09:44:56 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' 
Feb 23 09:44:57 np0005626463.localdomain ceph-mon[289530]: mon.np0005626463@5(peon) e6 handle_command mon_command({"prefix": "mgr fail"} v 0)
Feb 23 09:44:57 np0005626463.localdomain ceph-mon[289530]: log_channel(audit) log [INF] : from='client.? 172.18.0.103:0/2046273284' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch
Feb 23 09:44:57 np0005626463.localdomain ceph-mon[289530]: mon.np0005626463@5(peon).osd e80 _set_cache_ratios kv ratio 0.25 inc ratio 0.375 full ratio 0.375
Feb 23 09:44:57 np0005626463.localdomain ceph-mon[289530]: mon.np0005626463@5(peon).osd e80 register_cache_with_pcm pcm target: 2147483648 pcm max: 1020054732 pcm min: 134217728 inc_osd_cache size: 1
Feb 23 09:44:57 np0005626463.localdomain ceph-mon[289530]: mon.np0005626463@5(peon).osd e81 e81: 6 total, 6 up, 6 in
Feb 23 09:44:57 np0005626463.localdomain ceph-mgr[288036]: client.0 ms_handle_reset on v2:172.18.0.103:6800/920472675
Feb 23 09:44:57 np0005626463.localdomain ceph-mgr[288036]: client.0 ms_handle_reset on v2:172.18.0.103:6800/920472675
Feb 23 09:44:57 np0005626463.localdomain sshd[26383]: pam_unix(sshd:session): session closed for user ceph-admin
Feb 23 09:44:57 np0005626463.localdomain sshd[26233]: pam_unix(sshd:session): session closed for user ceph-admin
Feb 23 09:44:57 np0005626463.localdomain systemd[1]: session-26.scope: Deactivated successfully.
Feb 23 09:44:57 np0005626463.localdomain systemd[1]: session-26.scope: Consumed 3min 34.340s CPU time.
Feb 23 09:44:57 np0005626463.localdomain sshd[26290]: pam_unix(sshd:session): session closed for user ceph-admin
Feb 23 09:44:57 np0005626463.localdomain systemd[1]: session-18.scope: Deactivated successfully.
Feb 23 09:44:57 np0005626463.localdomain systemd-logind[759]: Session 26 logged out. Waiting for processes to exit.
Feb 23 09:44:57 np0005626463.localdomain systemd[1]: session-21.scope: Deactivated successfully.
Feb 23 09:44:57 np0005626463.localdomain systemd-logind[759]: Session 18 logged out. Waiting for processes to exit.
Feb 23 09:44:57 np0005626463.localdomain systemd-logind[759]: Session 21 logged out. Waiting for processes to exit.
Feb 23 09:44:57 np0005626463.localdomain sshd[26271]: pam_unix(sshd:session): session closed for user ceph-admin
Feb 23 09:44:57 np0005626463.localdomain systemd[1]: session-20.scope: Deactivated successfully.
Feb 23 09:44:57 np0005626463.localdomain sshd[26192]: pam_unix(sshd:session): session closed for user ceph-admin
Feb 23 09:44:57 np0005626463.localdomain systemd-logind[759]: Session 20 logged out. Waiting for processes to exit.
Feb 23 09:44:57 np0005626463.localdomain systemd-logind[759]: Removed session 26.
Feb 23 09:44:57 np0005626463.localdomain sshd[26175]: pam_unix(sshd:session): session closed for user ceph-admin
Feb 23 09:44:57 np0005626463.localdomain systemd-logind[759]: Removed session 18.
Feb 23 09:44:57 np0005626463.localdomain systemd[1]: session-16.scope: Deactivated successfully.
Feb 23 09:44:57 np0005626463.localdomain sshd[26347]: pam_unix(sshd:session): session closed for user ceph-admin
Feb 23 09:44:57 np0005626463.localdomain systemd-logind[759]: Session 16 logged out. Waiting for processes to exit.
Feb 23 09:44:57 np0005626463.localdomain sshd[26309]: pam_unix(sshd:session): session closed for user ceph-admin
Feb 23 09:44:57 np0005626463.localdomain systemd[1]: session-24.scope: Deactivated successfully.
Feb 23 09:44:57 np0005626463.localdomain sshd[26214]: pam_unix(sshd:session): session closed for user ceph-admin
Feb 23 09:44:57 np0005626463.localdomain systemd-logind[759]: Session 24 logged out. Waiting for processes to exit.
Feb 23 09:44:57 np0005626463.localdomain systemd[1]: session-17.scope: Deactivated successfully.
Feb 23 09:44:57 np0005626463.localdomain systemd[1]: session-22.scope: Deactivated successfully.
Feb 23 09:44:57 np0005626463.localdomain sshd[26328]: pam_unix(sshd:session): session closed for user ceph-admin
Feb 23 09:44:57 np0005626463.localdomain systemd[1]: session-14.scope: Deactivated successfully.
Feb 23 09:44:57 np0005626463.localdomain systemd-logind[759]: Session 14 logged out. Waiting for processes to exit.
Feb 23 09:44:57 np0005626463.localdomain systemd-logind[759]: Session 17 logged out. Waiting for processes to exit.
Feb 23 09:44:57 np0005626463.localdomain systemd[1]: session-23.scope: Deactivated successfully.
Feb 23 09:44:57 np0005626463.localdomain systemd-logind[759]: Session 22 logged out. Waiting for processes to exit.
Feb 23 09:44:57 np0005626463.localdomain systemd-logind[759]: Session 23 logged out. Waiting for processes to exit.
Feb 23 09:44:57 np0005626463.localdomain systemd-logind[759]: Removed session 21.
Feb 23 09:44:57 np0005626463.localdomain sshd[26364]: pam_unix(sshd:session): session closed for user ceph-admin
Feb 23 09:44:57 np0005626463.localdomain sshd[26252]: pam_unix(sshd:session): session closed for user ceph-admin
Feb 23 09:44:57 np0005626463.localdomain systemd-logind[759]: Removed session 20.
Feb 23 09:44:57 np0005626463.localdomain systemd[1]: session-25.scope: Deactivated successfully.
Feb 23 09:44:57 np0005626463.localdomain systemd[1]: session-19.scope: Deactivated successfully.
Feb 23 09:44:57 np0005626463.localdomain systemd-logind[759]: Session 25 logged out. Waiting for processes to exit.
Feb 23 09:44:57 np0005626463.localdomain systemd-logind[759]: Session 19 logged out. Waiting for processes to exit.
Feb 23 09:44:57 np0005626463.localdomain systemd-logind[759]: Removed session 16.
Feb 23 09:44:57 np0005626463.localdomain systemd-logind[759]: Removed session 24.
Feb 23 09:44:57 np0005626463.localdomain systemd-logind[759]: Removed session 17.
Feb 23 09:44:57 np0005626463.localdomain systemd-logind[759]: Removed session 22.
Feb 23 09:44:57 np0005626463.localdomain systemd-logind[759]: Removed session 14.
Feb 23 09:44:57 np0005626463.localdomain systemd-logind[759]: Removed session 23.
Feb 23 09:44:57 np0005626463.localdomain systemd-logind[759]: Removed session 25.
Feb 23 09:44:57 np0005626463.localdomain systemd-logind[759]: Removed session 19.
Feb 23 09:44:57 np0005626463.localdomain ceph-osd[32575]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #43. Immutable memtables: 0.
Feb 23 09:44:57 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' 
Feb 23 09:44:57 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626460.fyrady", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 23 09:44:57 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' cmd={"prefix": "mgr services"} : dispatch
Feb 23 09:44:57 np0005626463.localdomain ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:44:57 np0005626463.localdomain ceph-mon[289530]: from='client.? 172.18.0.103:0/2046273284' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch
Feb 23 09:44:57 np0005626463.localdomain ceph-mon[289530]: from='client.? ' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch
Feb 23 09:44:57 np0005626463.localdomain ceph-mon[289530]: Activating manager daemon np0005626461.lrfquh
Feb 23 09:44:57 np0005626463.localdomain ceph-mon[289530]: osdmap e81: 6 total, 6 up, 6 in
Feb 23 09:44:57 np0005626463.localdomain ceph-mon[289530]: from='client.? ' entity='client.admin' cmd='[{"prefix": "mgr fail"}]': finished
Feb 23 09:44:57 np0005626463.localdomain ceph-mon[289530]: mgrmap e15: np0005626461.lrfquh(active, starting, since 0.0573291s), standbys: np0005626460.fyrady, np0005626465.hlpkwo, np0005626463.wtksup, np0005626466.nisqfq
Feb 23 09:44:57 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "mon metadata", "id": "np0005626459"} : dispatch
Feb 23 09:44:57 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "mon metadata", "id": "np0005626460"} : dispatch
Feb 23 09:44:57 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "mon metadata", "id": "np0005626461"} : dispatch
Feb 23 09:44:57 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "mon metadata", "id": "np0005626463"} : dispatch
Feb 23 09:44:57 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "mon metadata", "id": "np0005626465"} : dispatch
Feb 23 09:44:57 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "mon metadata", "id": "np0005626466"} : dispatch
Feb 23 09:44:57 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "mds metadata", "who": "mds.np0005626465.drvnoy"} : dispatch
Feb 23 09:44:57 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "mds metadata", "who": "mds.np0005626466.vaywlp"} : dispatch
Feb 23 09:44:57 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "mds metadata", "who": "mds.np0005626463.qcthuc"} : dispatch
Feb 23 09:44:57 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "mgr metadata", "who": "np0005626461.lrfquh", "id": "np0005626461.lrfquh"} : dispatch
Feb 23 09:44:57 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "mgr metadata", "who": "np0005626460.fyrady", "id": "np0005626460.fyrady"} : dispatch
Feb 23 09:44:57 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "mgr metadata", "who": "np0005626465.hlpkwo", "id": "np0005626465.hlpkwo"} : dispatch
Feb 23 09:44:57 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "mgr metadata", "who": "np0005626463.wtksup", "id": "np0005626463.wtksup"} : dispatch
Feb 23 09:44:57 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "mgr metadata", "who": "np0005626466.nisqfq", "id": "np0005626466.nisqfq"} : dispatch
Feb 23 09:44:57 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "osd metadata", "id": 0} : dispatch
Feb 23 09:44:57 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "osd metadata", "id": 1} : dispatch
Feb 23 09:44:57 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "osd metadata", "id": 2} : dispatch
Feb 23 09:44:57 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "osd metadata", "id": 3} : dispatch
Feb 23 09:44:57 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "osd metadata", "id": 4} : dispatch
Feb 23 09:44:57 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "osd metadata", "id": 5} : dispatch
Feb 23 09:44:57 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "mds metadata"} : dispatch
Feb 23 09:44:57 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "osd metadata"} : dispatch
Feb 23 09:44:57 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "mon metadata"} : dispatch
Feb 23 09:44:57 np0005626463.localdomain ceph-mon[289530]: Manager daemon np0005626461.lrfquh is now available
Feb 23 09:44:57 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005626461.lrfquh/mirror_snapshot_schedule"} : dispatch
Feb 23 09:44:57 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005626461.lrfquh/mirror_snapshot_schedule"} : dispatch
Feb 23 09:44:57 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005626461.lrfquh/trash_purge_schedule"} : dispatch
Feb 23 09:44:57 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005626461.lrfquh/trash_purge_schedule"} : dispatch
Feb 23 09:44:57 np0005626463.localdomain sshd[290268]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:44:57 np0005626463.localdomain sshd[290268]: Accepted publickey for ceph-admin from 192.168.122.105 port 51044 ssh2: RSA SHA256:Xa/VMkXtB77nHz5d33Gpc1SPjvrShbbTtqHwAtI7vJo
Feb 23 09:44:57 np0005626463.localdomain systemd-logind[759]: New session 64 of user ceph-admin.
Feb 23 09:44:57 np0005626463.localdomain systemd[1]: Started Session 64 of User ceph-admin.
Feb 23 09:44:57 np0005626463.localdomain sshd[290268]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Feb 23 09:44:57 np0005626463.localdomain sudo[290272]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 09:44:57 np0005626463.localdomain sudo[290272]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:44:57 np0005626463.localdomain sudo[290272]: pam_unix(sudo:session): session closed for user root
Feb 23 09:44:57 np0005626463.localdomain sudo[290290]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Feb 23 09:44:57 np0005626463.localdomain sudo[290290]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:44:58 np0005626463.localdomain sshd[290330]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:44:58 np0005626463.localdomain podman[290379]: 2026-02-23 09:44:58.577015373 +0000 UTC m=+0.087488793 container exec fdf07215f0388d0ebc44f1f3744080ba594441e647c300d0dade62ff5beba234 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-crash-np0005626463, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, release=1770267347, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2026-02-09T10:25:24Z, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.42.2, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, architecture=x86_64, RELEASE=main, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.tags=rhceph ceph, vcs-type=git, org.opencontainers.image.created=2026-02-09T10:25:24Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Feb 23 09:44:58 np0005626463.localdomain sshd[290330]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:44:58 np0005626463.localdomain podman[290379]: 2026-02-23 09:44:58.67702081 +0000 UTC m=+0.187494220 container exec_died fdf07215f0388d0ebc44f1f3744080ba594441e647c300d0dade62ff5beba234 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-crash-np0005626463, RELEASE=main, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2026-02-09T10:25:24Z, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, ceph=True, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, version=7, distribution-scope=public, CEPH_POINT_RELEASE=, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, io.openshift.tags=rhceph ceph, build-date=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, io.buildah.version=1.42.2, release=1770267347)
Feb 23 09:44:58 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:44:58.956 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:44:58 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:44:58.959 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:44:58 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:44:58.959 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 09:44:58 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:44:58.959 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:44:58 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:44:58.997 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:44:59 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:44:58.997 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:44:59 np0005626463.localdomain sudo[290290]: pam_unix(sudo:session): session closed for user root
Feb 23 09:44:59 np0005626463.localdomain ceph-mon[289530]: mon.np0005626463@5(peon).osd e81 _set_new_cache_sizes cache_size:1019728505 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:44:59 np0005626463.localdomain ceph-mon[289530]: mgrmap e16: np0005626461.lrfquh(active, since 1.08818s), standbys: np0005626460.fyrady, np0005626465.hlpkwo, np0005626463.wtksup, np0005626466.nisqfq
Feb 23 09:44:59 np0005626463.localdomain ceph-mon[289530]: pgmap v4: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:45:00 np0005626463.localdomain sudo[290497]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 09:45:00 np0005626463.localdomain sudo[290497]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:45:00 np0005626463.localdomain sudo[290497]: pam_unix(sudo:session): session closed for user root
Feb 23 09:45:00 np0005626463.localdomain sudo[290515]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 23 09:45:00 np0005626463.localdomain sudo[290515]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:45:00 np0005626463.localdomain sudo[290515]: pam_unix(sudo:session): session closed for user root
Feb 23 09:45:00 np0005626463.localdomain ceph-mon[289530]: [23/Feb/2026:09:44:59] ENGINE Bus STARTING
Feb 23 09:45:00 np0005626463.localdomain ceph-mon[289530]: [23/Feb/2026:09:44:59] ENGINE Serving on https://172.18.0.105:7150
Feb 23 09:45:00 np0005626463.localdomain ceph-mon[289530]: [23/Feb/2026:09:44:59] ENGINE Client ('172.18.0.105', 36526) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Feb 23 09:45:00 np0005626463.localdomain ceph-mon[289530]: [23/Feb/2026:09:44:59] ENGINE Serving on http://172.18.0.105:8765
Feb 23 09:45:00 np0005626463.localdomain ceph-mon[289530]: [23/Feb/2026:09:44:59] ENGINE Bus STARTED
Feb 23 09:45:00 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:45:00 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:45:00 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:45:00 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:45:00 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:45:00 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:45:00 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:45:00 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:45:00 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:45:00 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:45:00 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:45:00 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:45:01 np0005626463.localdomain sudo[290565]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 09:45:01 np0005626463.localdomain sudo[290565]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:45:01 np0005626463.localdomain sudo[290565]: pam_unix(sudo:session): session closed for user root
Feb 23 09:45:01 np0005626463.localdomain sudo[290583]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 list-networks
Feb 23 09:45:01 np0005626463.localdomain sudo[290583]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:45:01 np0005626463.localdomain sudo[290583]: pam_unix(sudo:session): session closed for user root
Feb 23 09:45:02 np0005626463.localdomain sudo[290619]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Feb 23 09:45:02 np0005626463.localdomain sudo[290619]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:45:02 np0005626463.localdomain sudo[290619]: pam_unix(sudo:session): session closed for user root
Feb 23 09:45:02 np0005626463.localdomain sudo[290637]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/etc/ceph
Feb 23 09:45:02 np0005626463.localdomain sudo[290637]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:45:02 np0005626463.localdomain sudo[290637]: pam_unix(sudo:session): session closed for user root
Feb 23 09:45:02 np0005626463.localdomain sudo[290655]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/etc/ceph/ceph.conf.new
Feb 23 09:45:02 np0005626463.localdomain sudo[290655]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:45:02 np0005626463.localdomain sudo[290655]: pam_unix(sudo:session): session closed for user root
Feb 23 09:45:02 np0005626463.localdomain sudo[290673]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46
Feb 23 09:45:02 np0005626463.localdomain sudo[290673]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:45:02 np0005626463.localdomain sudo[290673]: pam_unix(sudo:session): session closed for user root
Feb 23 09:45:02 np0005626463.localdomain sudo[290691]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/etc/ceph/ceph.conf.new
Feb 23 09:45:02 np0005626463.localdomain sudo[290691]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:45:02 np0005626463.localdomain sudo[290691]: pam_unix(sudo:session): session closed for user root
Feb 23 09:45:02 np0005626463.localdomain ceph-mon[289530]: pgmap v6: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:45:02 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:45:02 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:45:02 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:45:02 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:45:02 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:45:02 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:45:02 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "config rm", "who": "osd/host:np0005626460", "name": "osd_memory_target"} : dispatch
Feb 23 09:45:02 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:45:02 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "config rm", "who": "osd/host:np0005626460", "name": "osd_memory_target"} : dispatch
Feb 23 09:45:02 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "config rm", "who": "osd/host:np0005626459", "name": "osd_memory_target"} : dispatch
Feb 23 09:45:02 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:45:02 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "config rm", "who": "osd/host:np0005626459", "name": "osd_memory_target"} : dispatch
Feb 23 09:45:02 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Feb 23 09:45:02 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:45:02 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Feb 23 09:45:02 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Feb 23 09:45:02 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:45:02 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Feb 23 09:45:02 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Feb 23 09:45:02 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Feb 23 09:45:02 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Feb 23 09:45:02 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Feb 23 09:45:02 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Feb 23 09:45:02 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Feb 23 09:45:02 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Feb 23 09:45:02 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Feb 23 09:45:02 np0005626463.localdomain ceph-mon[289530]: Adjusting osd_memory_target on np0005626466.localdomain to 836.6M
Feb 23 09:45:02 np0005626463.localdomain ceph-mon[289530]: Adjusting osd_memory_target on np0005626463.localdomain to 836.6M
Feb 23 09:45:02 np0005626463.localdomain ceph-mon[289530]: Adjusting osd_memory_target on np0005626465.localdomain to 836.6M
Feb 23 09:45:02 np0005626463.localdomain ceph-mon[289530]: Unable to set osd_memory_target on np0005626466.localdomain to 877243801: error parsing value: Value '877243801' is below minimum 939524096
Feb 23 09:45:02 np0005626463.localdomain ceph-mon[289530]: Unable to set osd_memory_target on np0005626463.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Feb 23 09:45:02 np0005626463.localdomain ceph-mon[289530]: Unable to set osd_memory_target on np0005626465.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Feb 23 09:45:02 np0005626463.localdomain ceph-mon[289530]: mgrmap e17: np0005626461.lrfquh(active, since 4s), standbys: np0005626460.fyrady, np0005626465.hlpkwo, np0005626463.wtksup, np0005626466.nisqfq
Feb 23 09:45:02 np0005626463.localdomain ceph-mon[289530]: Standby manager daemon np0005626459.pmtxxl started
Feb 23 09:45:02 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:45:02 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:45:02 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "config rm", "who": "osd/host:np0005626461", "name": "osd_memory_target"} : dispatch
Feb 23 09:45:02 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "config rm", "who": "osd/host:np0005626461", "name": "osd_memory_target"} : dispatch
Feb 23 09:45:02 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:45:02 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 23 09:45:02 np0005626463.localdomain sudo[290725]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/etc/ceph/ceph.conf.new
Feb 23 09:45:02 np0005626463.localdomain sudo[290725]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:45:02 np0005626463.localdomain sudo[290725]: pam_unix(sudo:session): session closed for user root
Feb 23 09:45:02 np0005626463.localdomain sudo[290743]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/etc/ceph/ceph.conf.new
Feb 23 09:45:02 np0005626463.localdomain sudo[290743]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:45:02 np0005626463.localdomain sudo[290743]: pam_unix(sudo:session): session closed for user root
Feb 23 09:45:02 np0005626463.localdomain sudo[290762]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Feb 23 09:45:02 np0005626463.localdomain sudo[290762]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:45:02 np0005626463.localdomain sudo[290762]: pam_unix(sudo:session): session closed for user root
Feb 23 09:45:02 np0005626463.localdomain sudo[290780]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config
Feb 23 09:45:02 np0005626463.localdomain sudo[290780]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:45:02 np0005626463.localdomain sudo[290780]: pam_unix(sudo:session): session closed for user root
Feb 23 09:45:02 np0005626463.localdomain sudo[290798]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config
Feb 23 09:45:02 np0005626463.localdomain sudo[290798]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:45:02 np0005626463.localdomain sudo[290798]: pam_unix(sudo:session): session closed for user root
Feb 23 09:45:03 np0005626463.localdomain sudo[290816]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf.new
Feb 23 09:45:03 np0005626463.localdomain sudo[290816]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:45:03 np0005626463.localdomain sudo[290816]: pam_unix(sudo:session): session closed for user root
Feb 23 09:45:03 np0005626463.localdomain sudo[290834]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46
Feb 23 09:45:03 np0005626463.localdomain sudo[290834]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:45:03 np0005626463.localdomain sudo[290834]: pam_unix(sudo:session): session closed for user root
Feb 23 09:45:03 np0005626463.localdomain sudo[290852]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf.new
Feb 23 09:45:03 np0005626463.localdomain sudo[290852]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:45:03 np0005626463.localdomain sudo[290852]: pam_unix(sudo:session): session closed for user root
Feb 23 09:45:03 np0005626463.localdomain sudo[290886]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf.new
Feb 23 09:45:03 np0005626463.localdomain sudo[290886]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:45:03 np0005626463.localdomain sudo[290886]: pam_unix(sudo:session): session closed for user root
Feb 23 09:45:03 np0005626463.localdomain sudo[290904]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf.new
Feb 23 09:45:03 np0005626463.localdomain sudo[290904]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:45:03 np0005626463.localdomain sudo[290904]: pam_unix(sudo:session): session closed for user root
Feb 23 09:45:03 np0005626463.localdomain sudo[290923]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf.new /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf
Feb 23 09:45:03 np0005626463.localdomain sudo[290923]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:45:03 np0005626463.localdomain sudo[290923]: pam_unix(sudo:session): session closed for user root
Feb 23 09:45:03 np0005626463.localdomain sudo[290941]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Feb 23 09:45:03 np0005626463.localdomain sudo[290941]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:45:03 np0005626463.localdomain sudo[290941]: pam_unix(sudo:session): session closed for user root
Feb 23 09:45:03 np0005626463.localdomain ceph-mon[289530]: Updating np0005626459.localdomain:/etc/ceph/ceph.conf
Feb 23 09:45:03 np0005626463.localdomain ceph-mon[289530]: Updating np0005626460.localdomain:/etc/ceph/ceph.conf
Feb 23 09:45:03 np0005626463.localdomain ceph-mon[289530]: Updating np0005626461.localdomain:/etc/ceph/ceph.conf
Feb 23 09:45:03 np0005626463.localdomain ceph-mon[289530]: Updating np0005626463.localdomain:/etc/ceph/ceph.conf
Feb 23 09:45:03 np0005626463.localdomain ceph-mon[289530]: Updating np0005626465.localdomain:/etc/ceph/ceph.conf
Feb 23 09:45:03 np0005626463.localdomain ceph-mon[289530]: Updating np0005626466.localdomain:/etc/ceph/ceph.conf
Feb 23 09:45:03 np0005626463.localdomain ceph-mon[289530]: Updating np0005626465.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf
Feb 23 09:45:03 np0005626463.localdomain ceph-mon[289530]: Updating np0005626459.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf
Feb 23 09:45:03 np0005626463.localdomain ceph-mon[289530]: Updating np0005626460.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf
Feb 23 09:45:03 np0005626463.localdomain ceph-mon[289530]: Updating np0005626466.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf
Feb 23 09:45:03 np0005626463.localdomain ceph-mon[289530]: Updating np0005626463.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf
Feb 23 09:45:03 np0005626463.localdomain ceph-mon[289530]: mgrmap e18: np0005626461.lrfquh(active, since 5s), standbys: np0005626460.fyrady, np0005626465.hlpkwo, np0005626463.wtksup, np0005626466.nisqfq, np0005626459.pmtxxl
Feb 23 09:45:03 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "mgr metadata", "who": "np0005626459.pmtxxl", "id": "np0005626459.pmtxxl"} : dispatch
Feb 23 09:45:03 np0005626463.localdomain sudo[290959]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/etc/ceph
Feb 23 09:45:03 np0005626463.localdomain sudo[290959]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:45:03 np0005626463.localdomain sudo[290959]: pam_unix(sudo:session): session closed for user root
Feb 23 09:45:03 np0005626463.localdomain sudo[290977]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/etc/ceph/ceph.client.admin.keyring.new
Feb 23 09:45:03 np0005626463.localdomain sudo[290977]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:45:03 np0005626463.localdomain sudo[290977]: pam_unix(sudo:session): session closed for user root
Feb 23 09:45:03 np0005626463.localdomain sudo[290995]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46
Feb 23 09:45:03 np0005626463.localdomain sudo[290995]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:45:03 np0005626463.localdomain sudo[290995]: pam_unix(sudo:session): session closed for user root
Feb 23 09:45:03 np0005626463.localdomain sudo[291013]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/etc/ceph/ceph.client.admin.keyring.new
Feb 23 09:45:03 np0005626463.localdomain sudo[291013]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:45:03 np0005626463.localdomain sudo[291013]: pam_unix(sudo:session): session closed for user root
Feb 23 09:45:04 np0005626463.localdomain sudo[291047]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/etc/ceph/ceph.client.admin.keyring.new
Feb 23 09:45:04 np0005626463.localdomain sudo[291047]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:45:04 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:45:04.044 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:45:04 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:45:04.046 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:45:04 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:45:04.046 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5048 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 09:45:04 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:45:04.047 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:45:04 np0005626463.localdomain sudo[291047]: pam_unix(sudo:session): session closed for user root
Feb 23 09:45:04 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:45:04.049 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:45:04 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:45:04.050 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:45:04 np0005626463.localdomain sudo[291065]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/etc/ceph/ceph.client.admin.keyring.new
Feb 23 09:45:04 np0005626463.localdomain sudo[291065]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:45:04 np0005626463.localdomain sudo[291065]: pam_unix(sudo:session): session closed for user root
Feb 23 09:45:04 np0005626463.localdomain sudo[291083]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/etc/ceph/ceph.client.admin.keyring.new /etc/ceph/ceph.client.admin.keyring
Feb 23 09:45:04 np0005626463.localdomain sudo[291083]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:45:04 np0005626463.localdomain sudo[291083]: pam_unix(sudo:session): session closed for user root
Feb 23 09:45:04 np0005626463.localdomain sudo[291101]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config
Feb 23 09:45:04 np0005626463.localdomain sudo[291101]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:45:04 np0005626463.localdomain sudo[291101]: pam_unix(sudo:session): session closed for user root
Feb 23 09:45:04 np0005626463.localdomain sudo[291119]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config
Feb 23 09:45:04 np0005626463.localdomain sudo[291119]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:45:04 np0005626463.localdomain sudo[291119]: pam_unix(sudo:session): session closed for user root
Feb 23 09:45:04 np0005626463.localdomain ceph-mon[289530]: mon.np0005626463@5(peon).osd e81 _set_new_cache_sizes cache_size:1020048086 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:45:04 np0005626463.localdomain sudo[291137]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.client.admin.keyring.new
Feb 23 09:45:04 np0005626463.localdomain sudo[291137]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:45:04 np0005626463.localdomain sudo[291137]: pam_unix(sudo:session): session closed for user root
Feb 23 09:45:04 np0005626463.localdomain sudo[291155]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46
Feb 23 09:45:04 np0005626463.localdomain sudo[291155]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:45:04 np0005626463.localdomain sudo[291155]: pam_unix(sudo:session): session closed for user root
Feb 23 09:45:04 np0005626463.localdomain sudo[291173]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.client.admin.keyring.new
Feb 23 09:45:04 np0005626463.localdomain sudo[291173]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:45:04 np0005626463.localdomain sudo[291173]: pam_unix(sudo:session): session closed for user root
Feb 23 09:45:04 np0005626463.localdomain ceph-mon[289530]: Updating np0005626461.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf
Feb 23 09:45:04 np0005626463.localdomain ceph-mon[289530]: pgmap v7: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:45:04 np0005626463.localdomain ceph-mon[289530]: Updating np0005626460.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 23 09:45:04 np0005626463.localdomain ceph-mon[289530]: Updating np0005626465.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 23 09:45:04 np0005626463.localdomain ceph-mon[289530]: Updating np0005626463.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 23 09:45:04 np0005626463.localdomain ceph-mon[289530]: Updating np0005626466.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 23 09:45:04 np0005626463.localdomain ceph-mon[289530]: Updating np0005626459.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 23 09:45:04 np0005626463.localdomain ceph-mon[289530]: Updating np0005626461.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 23 09:45:04 np0005626463.localdomain ceph-mon[289530]: from='client.? 172.18.0.32:0/2675408206' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 23 09:45:04 np0005626463.localdomain ceph-mon[289530]: from='client.? 172.18.0.32:0/2675408206' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 23 09:45:04 np0005626463.localdomain sudo[291207]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.client.admin.keyring.new
Feb 23 09:45:04 np0005626463.localdomain sudo[291207]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:45:04 np0005626463.localdomain sudo[291207]: pam_unix(sudo:session): session closed for user root
Feb 23 09:45:04 np0005626463.localdomain sudo[291225]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.client.admin.keyring.new
Feb 23 09:45:04 np0005626463.localdomain sudo[291225]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:45:04 np0005626463.localdomain sudo[291225]: pam_unix(sudo:session): session closed for user root
Feb 23 09:45:04 np0005626463.localdomain sudo[291243]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.client.admin.keyring.new /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.client.admin.keyring
Feb 23 09:45:04 np0005626463.localdomain sudo[291243]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:45:04 np0005626463.localdomain sudo[291243]: pam_unix(sudo:session): session closed for user root
Feb 23 09:45:05 np0005626463.localdomain sudo[291261]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 09:45:05 np0005626463.localdomain sudo[291261]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:45:05 np0005626463.localdomain sudo[291261]: pam_unix(sudo:session): session closed for user root
Feb 23 09:45:05 np0005626463.localdomain ceph-mon[289530]: Updating np0005626465.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.client.admin.keyring
Feb 23 09:45:05 np0005626463.localdomain ceph-mon[289530]: Updating np0005626460.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.client.admin.keyring
Feb 23 09:45:05 np0005626463.localdomain ceph-mon[289530]: Updating np0005626459.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.client.admin.keyring
Feb 23 09:45:05 np0005626463.localdomain ceph-mon[289530]: Updating np0005626461.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.client.admin.keyring
Feb 23 09:45:05 np0005626463.localdomain ceph-mon[289530]: Updating np0005626466.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.client.admin.keyring
Feb 23 09:45:05 np0005626463.localdomain ceph-mon[289530]: Updating np0005626463.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.client.admin.keyring
Feb 23 09:45:05 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:45:05 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:45:05 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:45:05 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:45:05 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:45:05 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:45:05 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:45:05 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:45:05 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:45:05 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:45:05 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:45:05 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:45:05 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:45:05 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 23 09:45:05 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626460.fyrady", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 23 09:45:05 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626460.fyrady", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 23 09:45:05 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "mgr services"} : dispatch
Feb 23 09:45:05 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:45:06 np0005626463.localdomain ceph-mon[289530]: pgmap v8: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail; 30 KiB/s rd, 0 B/s wr, 16 op/s
Feb 23 09:45:06 np0005626463.localdomain ceph-mon[289530]: Reconfiguring mgr.np0005626460.fyrady (monmap changed)...
Feb 23 09:45:06 np0005626463.localdomain ceph-mon[289530]: Reconfiguring daemon mgr.np0005626460.fyrady on np0005626460.localdomain
Feb 23 09:45:06 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:45:06 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:45:06 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Feb 23 09:45:06 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Feb 23 09:45:06 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:45:06 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.
Feb 23 09:45:06 np0005626463.localdomain systemd[1]: tmp-crun.PTrCHb.mount: Deactivated successfully.
Feb 23 09:45:06 np0005626463.localdomain podman[291279]: 2026-02-23 09:45:06.925051829 +0000 UTC m=+0.098943736 container health_status da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 23 09:45:06 np0005626463.localdomain podman[291279]: 2026-02-23 09:45:06.936224652 +0000 UTC m=+0.110116559 container exec_died da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 23 09:45:06 np0005626463.localdomain systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Deactivated successfully.
Feb 23 09:45:07 np0005626463.localdomain ceph-mon[289530]: Reconfiguring mon.np0005626461 (monmap changed)...
Feb 23 09:45:07 np0005626463.localdomain ceph-mon[289530]: Reconfiguring daemon mon.np0005626461 on np0005626461.localdomain
Feb 23 09:45:07 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:45:07 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:45:07 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:45:07 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626461.lrfquh", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 23 09:45:07 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626461.lrfquh", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 23 09:45:07 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "mgr services"} : dispatch
Feb 23 09:45:07 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:45:08 np0005626463.localdomain ceph-mon[289530]: pgmap v9: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail; 23 KiB/s rd, 0 B/s wr, 12 op/s
Feb 23 09:45:08 np0005626463.localdomain ceph-mon[289530]: Reconfiguring mgr.np0005626461.lrfquh (monmap changed)...
Feb 23 09:45:08 np0005626463.localdomain ceph-mon[289530]: Reconfiguring daemon mgr.np0005626461.lrfquh on np0005626461.localdomain
Feb 23 09:45:08 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:45:08 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:45:08 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626461", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 23 09:45:08 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626461", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 23 09:45:08 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:45:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:45:09.053 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:45:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:45:09.055 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:45:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:45:09.055 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 09:45:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:45:09.055 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:45:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:45:09.091 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:45:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:45:09.091 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:45:09 np0005626463.localdomain podman[242954]: time="2026-02-23T09:45:09Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 23 09:45:09 np0005626463.localdomain ceph-mon[289530]: mon.np0005626463@5(peon).osd e81 _set_new_cache_sizes cache_size:1020054592 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:45:09 np0005626463.localdomain podman[242954]: @ - - [23/Feb/2026:09:45:09 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155257 "" "Go-http-client/1.1"
Feb 23 09:45:09 np0005626463.localdomain podman[242954]: @ - - [23/Feb/2026:09:45:09 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18260 "" "Go-http-client/1.1"
Feb 23 09:45:09 np0005626463.localdomain sudo[291302]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 09:45:09 np0005626463.localdomain sudo[291302]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:45:09 np0005626463.localdomain sudo[291302]: pam_unix(sudo:session): session closed for user root
Feb 23 09:45:09 np0005626463.localdomain sudo[291320]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid f1fea371-cb69-578d-a3d0-b5c472a84b46
Feb 23 09:45:09 np0005626463.localdomain sudo[291320]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:45:09 np0005626463.localdomain ceph-mon[289530]: Reconfiguring crash.np0005626461 (monmap changed)...
Feb 23 09:45:09 np0005626463.localdomain ceph-mon[289530]: Reconfiguring daemon crash.np0005626461 on np0005626461.localdomain
Feb 23 09:45:09 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:45:09 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:45:09 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626463", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 23 09:45:09 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626463", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 23 09:45:09 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:45:10 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:45:10.050 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:45:10 np0005626463.localdomain podman[291356]: 
Feb 23 09:45:10 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:45:10.077 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:45:10 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:45:10.078 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 23 09:45:10 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:45:10.078 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 23 09:45:10 np0005626463.localdomain podman[291356]: 2026-02-23 09:45:10.083762499 +0000 UTC m=+0.082847141 container create 46b07131f3f026008a26459a2d4fb8153d1d43b1ae1d9969925a8a067634e29b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vibrant_allen, GIT_CLEAN=True, GIT_BRANCH=main, io.openshift.expose-services=, architecture=x86_64, io.openshift.tags=rhceph ceph, io.buildah.version=1.42.2, distribution-scope=public, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2026-02-09T10:25:24Z, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, version=7, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1770267347, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-type=git, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Feb 23 09:45:10 np0005626463.localdomain systemd[1]: Started libpod-conmon-46b07131f3f026008a26459a2d4fb8153d1d43b1ae1d9969925a8a067634e29b.scope.
Feb 23 09:45:10 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 09:45:10 np0005626463.localdomain podman[291356]: 2026-02-23 09:45:10.047503317 +0000 UTC m=+0.046588059 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 23 09:45:10 np0005626463.localdomain podman[291356]: 2026-02-23 09:45:10.161189054 +0000 UTC m=+0.160273696 container init 46b07131f3f026008a26459a2d4fb8153d1d43b1ae1d9969925a8a067634e29b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vibrant_allen, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_CLEAN=True, name=rhceph, io.buildah.version=1.42.2, version=7, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, org.opencontainers.image.created=2026-02-09T10:25:24Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vcs-type=git, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, ceph=True, build-date=2026-02-09T10:25:24Z, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1770267347, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7)
Feb 23 09:45:10 np0005626463.localdomain podman[291356]: 2026-02-23 09:45:10.171031235 +0000 UTC m=+0.170115887 container start 46b07131f3f026008a26459a2d4fb8153d1d43b1ae1d9969925a8a067634e29b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vibrant_allen, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.42.2, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, version=7, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., release=1770267347, ceph=True, org.opencontainers.image.created=2026-02-09T10:25:24Z, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container, vcs-type=git, build-date=2026-02-09T10:25:24Z, GIT_BRANCH=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, architecture=x86_64, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7)
Feb 23 09:45:10 np0005626463.localdomain podman[291356]: 2026-02-23 09:45:10.171305684 +0000 UTC m=+0.170390326 container attach 46b07131f3f026008a26459a2d4fb8153d1d43b1ae1d9969925a8a067634e29b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vibrant_allen, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7, version=7, CEPH_POINT_RELEASE=, build-date=2026-02-09T10:25:24Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=rhceph-container, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.description=Red Hat Ceph Storage 7, release=1770267347, distribution-scope=public, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, name=rhceph, RELEASE=main, io.openshift.expose-services=, org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.42.2, GIT_CLEAN=True)
Feb 23 09:45:10 np0005626463.localdomain vibrant_allen[291371]: 167 167
Feb 23 09:45:10 np0005626463.localdomain systemd[1]: libpod-46b07131f3f026008a26459a2d4fb8153d1d43b1ae1d9969925a8a067634e29b.scope: Deactivated successfully.
Feb 23 09:45:10 np0005626463.localdomain podman[291356]: 2026-02-23 09:45:10.176042189 +0000 UTC m=+0.175126881 container died 46b07131f3f026008a26459a2d4fb8153d1d43b1ae1d9969925a8a067634e29b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vibrant_allen, version=7, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, ceph=True, vcs-type=git, io.buildah.version=1.42.2, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., GIT_CLEAN=True, name=rhceph, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2026-02-09T10:25:24Z, url=https://catalog.redhat.com/en/search?searchType=containers, release=1770267347, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2026-02-09T10:25:24Z, CEPH_POINT_RELEASE=)
Feb 23 09:45:10 np0005626463.localdomain podman[291376]: 2026-02-23 09:45:10.276206591 +0000 UTC m=+0.088684451 container remove 46b07131f3f026008a26459a2d4fb8153d1d43b1ae1d9969925a8a067634e29b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vibrant_allen, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=rhceph-container, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1770267347, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, org.opencontainers.image.created=2026-02-09T10:25:24Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, version=7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, build-date=2026-02-09T10:25:24Z, CEPH_POINT_RELEASE=, vcs-type=git, RELEASE=main, name=rhceph, GIT_CLEAN=True, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, io.buildah.version=1.42.2)
Feb 23 09:45:10 np0005626463.localdomain systemd[1]: libpod-conmon-46b07131f3f026008a26459a2d4fb8153d1d43b1ae1d9969925a8a067634e29b.scope: Deactivated successfully.
Feb 23 09:45:10 np0005626463.localdomain sudo[291320]: pam_unix(sudo:session): session closed for user root
Feb 23 09:45:10 np0005626463.localdomain sudo[291393]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 09:45:10 np0005626463.localdomain sudo[291393]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:45:10 np0005626463.localdomain sudo[291393]: pam_unix(sudo:session): session closed for user root
Feb 23 09:45:10 np0005626463.localdomain sudo[291411]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid f1fea371-cb69-578d-a3d0-b5c472a84b46
Feb 23 09:45:10 np0005626463.localdomain sudo[291411]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:45:10 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:45:10.718 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 23 09:45:10 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:45:10.720 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquired lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 23 09:45:10 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:45:10.720 282211 DEBUG nova.network.neutron [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 23 09:45:10 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:45:10.721 282211 DEBUG nova.objects.instance [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c2a7d92b-952f-46a7-8a6a-3322a48fcf4b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 23 09:45:10 np0005626463.localdomain ceph-mon[289530]: from='client.17280 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Feb 23 09:45:10 np0005626463.localdomain ceph-mon[289530]: pgmap v10: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail; 19 KiB/s rd, 0 B/s wr, 10 op/s
Feb 23 09:45:10 np0005626463.localdomain ceph-mon[289530]: Reconfiguring crash.np0005626463 (monmap changed)...
Feb 23 09:45:10 np0005626463.localdomain ceph-mon[289530]: Reconfiguring daemon crash.np0005626463 on np0005626463.localdomain
Feb 23 09:45:10 np0005626463.localdomain ceph-mon[289530]: from='client.? 172.18.0.107:0/3622865160' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 09:45:10 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:45:10 np0005626463.localdomain ceph-mon[289530]: from='client.? 172.18.0.108:0/4034397128' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 09:45:10 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:45:10 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch
Feb 23 09:45:10 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:45:10 np0005626463.localdomain ceph-mon[289530]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #13. Immutable memtables: 0.
Feb 23 09:45:10 np0005626463.localdomain ceph-mon[289530]: rocksdb: (Original Log Time 2026/02/23-09:45:10.857786) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Feb 23 09:45:10 np0005626463.localdomain ceph-mon[289530]: rocksdb: [db/flush_job.cc:856] [default] [JOB 3] Flushing memtable with next log file: 13
Feb 23 09:45:10 np0005626463.localdomain ceph-mon[289530]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771839910857922, "job": 3, "event": "flush_started", "num_memtables": 1, "num_entries": 10990, "num_deletes": 525, "total_data_size": 15118282, "memory_usage": 15768304, "flush_reason": "Manual Compaction"}
Feb 23 09:45:10 np0005626463.localdomain ceph-mon[289530]: rocksdb: [db/flush_job.cc:885] [default] [JOB 3] Level-0 flush table #14: started
Feb 23 09:45:10 np0005626463.localdomain ceph-mon[289530]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771839910931901, "cf_name": "default", "job": 3, "event": "table_file_creation", "file_number": 14, "file_size": 10764182, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 6, "largest_seqno": 10995, "table_properties": {"data_size": 10711520, "index_size": 27340, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 24261, "raw_key_size": 255183, "raw_average_key_size": 26, "raw_value_size": 10548182, "raw_average_value_size": 1087, "num_data_blocks": 1030, "num_entries": 9701, "num_filter_entries": 9701, "num_deletions": 524, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771839879, "oldest_key_time": 1771839879, "file_creation_time": 1771839910, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "3d1e4b58-ab15-4081-a9da-984e46fdc8b2", "db_session_id": "5E7LZX0BBD5RWYSEID7U", "orig_file_number": 14, "seqno_to_time_mapping": "N/A"}}
Feb 23 09:45:10 np0005626463.localdomain ceph-mon[289530]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 3] Flush lasted 74249 microseconds, and 23974 cpu microseconds.
Feb 23 09:45:10 np0005626463.localdomain ceph-mon[289530]: rocksdb: (Original Log Time 2026/02/23-09:45:10.932042) [db/flush_job.cc:967] [default] [JOB 3] Level-0 flush table #14: 10764182 bytes OK
Feb 23 09:45:10 np0005626463.localdomain ceph-mon[289530]: rocksdb: (Original Log Time 2026/02/23-09:45:10.932097) [db/memtable_list.cc:519] [default] Level-0 commit table #14 started
Feb 23 09:45:10 np0005626463.localdomain ceph-mon[289530]: rocksdb: (Original Log Time 2026/02/23-09:45:10.933971) [db/memtable_list.cc:722] [default] Level-0 commit table #14: memtable #1 done
Feb 23 09:45:10 np0005626463.localdomain ceph-mon[289530]: rocksdb: (Original Log Time 2026/02/23-09:45:10.934001) EVENT_LOG_v1 {"time_micros": 1771839910933992, "job": 3, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [2, 0, 0, 0, 0, 0, 0], "immutable_memtables": 0}
Feb 23 09:45:10 np0005626463.localdomain ceph-mon[289530]: rocksdb: (Original Log Time 2026/02/23-09:45:10.934025) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: files[2 0 0 0 0 0 0] max score 0.50
Feb 23 09:45:10 np0005626463.localdomain ceph-mon[289530]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 3] Try to delete WAL files size 15044043, prev total WAL file size 15044043, number of live WAL files 2.
Feb 23 09:45:10 np0005626463.localdomain ceph-mon[289530]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005626463/store.db/000009.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 23 09:45:10 np0005626463.localdomain ceph-mon[289530]: rocksdb: (Original Log Time 2026/02/23-09:45:10.937474) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003130303430' seq:72057594037927935, type:22 .. '7061786F73003130323932' seq:0, type:0; will stop at (end)
Feb 23 09:45:10 np0005626463.localdomain ceph-mon[289530]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 4] Compacting 2@0 files to L6, score -1.00
Feb 23 09:45:10 np0005626463.localdomain ceph-mon[289530]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 3 Base level 0, inputs: [14(10MB) 8(2012B)]
Feb 23 09:45:10 np0005626463.localdomain ceph-mon[289530]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771839910937576, "job": 4, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [14, 8], "score": -1, "input_data_size": 10766194, "oldest_snapshot_seqno": -1}
Feb 23 09:45:11 np0005626463.localdomain podman[291445]: 
Feb 23 09:45:11 np0005626463.localdomain podman[291445]: 2026-02-23 09:45:11.031823035 +0000 UTC m=+0.061767366 container create 23d0be7b244400625106b2a6dbba6bc439ee78f6ead2263061c28fb191ed25d9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=intelligent_nightingale, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, release=1770267347, io.buildah.version=1.42.2, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main, ceph=True, vendor=Red Hat, Inc., org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, build-date=2026-02-09T10:25:24Z, com.redhat.component=rhceph-container, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, version=7, name=rhceph, org.opencontainers.image.created=2026-02-09T10:25:24Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=)
Feb 23 09:45:11 np0005626463.localdomain ceph-mon[289530]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 4] Generated table #15: 9181 keys, 10756894 bytes, temperature: kUnknown
Feb 23 09:45:11 np0005626463.localdomain ceph-mon[289530]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771839911033698, "cf_name": "default", "job": 4, "event": "table_file_creation", "file_number": 15, "file_size": 10756894, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 10705533, "index_size": 27324, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 22981, "raw_key_size": 246612, "raw_average_key_size": 26, "raw_value_size": 10548837, "raw_average_value_size": 1148, "num_data_blocks": 1028, "num_entries": 9181, "num_filter_entries": 9181, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771839879, "oldest_key_time": 0, "file_creation_time": 1771839910, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "3d1e4b58-ab15-4081-a9da-984e46fdc8b2", "db_session_id": "5E7LZX0BBD5RWYSEID7U", "orig_file_number": 15, "seqno_to_time_mapping": "N/A"}}
Feb 23 09:45:11 np0005626463.localdomain ceph-mon[289530]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 23 09:45:11 np0005626463.localdomain ceph-mon[289530]: rocksdb: (Original Log Time 2026/02/23-09:45:11.034070) [db/compaction/compaction_job.cc:1663] [default] [JOB 4] Compacted 2@0 files to L6 => 10756894 bytes
Feb 23 09:45:11 np0005626463.localdomain ceph-mon[289530]: rocksdb: (Original Log Time 2026/02/23-09:45:11.036534) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 111.9 rd, 111.8 wr, level 6, files in(2, 0) out(1 +0 blob) MB in(10.3, 0.0 +0.0 blob) out(10.3 +0.0 blob), read-write-amplify(2.0) write-amplify(1.0) OK, records in: 9706, records dropped: 525 output_compression: NoCompression
Feb 23 09:45:11 np0005626463.localdomain ceph-mon[289530]: rocksdb: (Original Log Time 2026/02/23-09:45:11.036564) EVENT_LOG_v1 {"time_micros": 1771839911036552, "job": 4, "event": "compaction_finished", "compaction_time_micros": 96238, "compaction_time_cpu_micros": 30422, "output_level": 6, "num_output_files": 1, "total_output_size": 10756894, "num_input_records": 9706, "num_output_records": 9181, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Feb 23 09:45:11 np0005626463.localdomain ceph-mon[289530]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005626463/store.db/000014.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 23 09:45:11 np0005626463.localdomain ceph-mon[289530]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771839911038191, "job": 4, "event": "table_file_deletion", "file_number": 14}
Feb 23 09:45:11 np0005626463.localdomain ceph-mon[289530]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005626463/store.db/000008.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 23 09:45:11 np0005626463.localdomain ceph-mon[289530]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771839911038299, "job": 4, "event": "table_file_deletion", "file_number": 8}
Feb 23 09:45:11 np0005626463.localdomain ceph-mon[289530]: rocksdb: (Original Log Time 2026/02/23-09:45:10.937339) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 23 09:45:11 np0005626463.localdomain systemd[1]: Started libpod-conmon-23d0be7b244400625106b2a6dbba6bc439ee78f6ead2263061c28fb191ed25d9.scope.
Feb 23 09:45:11 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 09:45:11 np0005626463.localdomain podman[291445]: 2026-02-23 09:45:11.086390608 +0000 UTC m=+0.116334939 container init 23d0be7b244400625106b2a6dbba6bc439ee78f6ead2263061c28fb191ed25d9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=intelligent_nightingale, release=1770267347, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2026-02-09T10:25:24Z, RELEASE=main, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.42.2, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, build-date=2026-02-09T10:25:24Z, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, version=7, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, name=rhceph)
Feb 23 09:45:11 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-f41b365428462e5c4856888ece45a6a6a8335094acb2c9b1536847a1ebabc2b9-merged.mount: Deactivated successfully.
Feb 23 09:45:11 np0005626463.localdomain podman[291445]: 2026-02-23 09:45:10.996380097 +0000 UTC m=+0.026324528 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 23 09:45:11 np0005626463.localdomain podman[291445]: 2026-02-23 09:45:11.09917051 +0000 UTC m=+0.129114881 container start 23d0be7b244400625106b2a6dbba6bc439ee78f6ead2263061c28fb191ed25d9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=intelligent_nightingale, io.openshift.expose-services=, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, io.buildah.version=1.42.2, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, build-date=2026-02-09T10:25:24Z, GIT_CLEAN=True, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, org.opencontainers.image.created=2026-02-09T10:25:24Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, release=1770267347, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, description=Red Hat Ceph Storage 7, architecture=x86_64, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main)
Feb 23 09:45:11 np0005626463.localdomain podman[291445]: 2026-02-23 09:45:11.099481379 +0000 UTC m=+0.129425740 container attach 23d0be7b244400625106b2a6dbba6bc439ee78f6ead2263061c28fb191ed25d9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=intelligent_nightingale, version=7, GIT_CLEAN=True, vendor=Red Hat, Inc., architecture=x86_64, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.42.2, build-date=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-type=git, RELEASE=main, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=rhceph-container, org.opencontainers.image.created=2026-02-09T10:25:24Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1770267347, name=rhceph, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=)
Feb 23 09:45:11 np0005626463.localdomain intelligent_nightingale[291461]: 167 167
Feb 23 09:45:11 np0005626463.localdomain systemd[1]: libpod-23d0be7b244400625106b2a6dbba6bc439ee78f6ead2263061c28fb191ed25d9.scope: Deactivated successfully.
Feb 23 09:45:11 np0005626463.localdomain podman[291445]: 2026-02-23 09:45:11.102147061 +0000 UTC m=+0.132091472 container died 23d0be7b244400625106b2a6dbba6bc439ee78f6ead2263061c28fb191ed25d9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=intelligent_nightingale, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, io.buildah.version=1.42.2, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, version=7, RELEASE=main, GIT_CLEAN=True, GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, org.opencontainers.image.created=2026-02-09T10:25:24Z, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1770267347, name=rhceph, ceph=True, build-date=2026-02-09T10:25:24Z)
Feb 23 09:45:11 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:45:11.173 282211 DEBUG nova.network.neutron [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Updating instance_info_cache with network_info: [{"id": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "address": "fa:16:3e:a0:9d:00", "network": {"id": "9da5b53d-3184-450f-9a5b-bdba1a6c9f6d", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "37b8098efb0d4ecc90b451a2db0e966f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa27e5011-20", "ovs_interfaceid": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 23 09:45:11 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-621224875a7cb1321f8e4b66ef64452fcc1770bba9939697bb6eb427bba3927f-merged.mount: Deactivated successfully.
Feb 23 09:45:11 np0005626463.localdomain ceph-mon[289530]: mon.np0005626463@5(peon) e6 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 23 09:45:11 np0005626463.localdomain ceph-mon[289530]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/1266158925' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 09:45:11 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:45:11.188 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Releasing lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 23 09:45:11 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:45:11.188 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 23 09:45:11 np0005626463.localdomain podman[291466]: 2026-02-23 09:45:11.19536536 +0000 UTC m=+0.084351888 container remove 23d0be7b244400625106b2a6dbba6bc439ee78f6ead2263061c28fb191ed25d9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=intelligent_nightingale, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2026-02-09T10:25:24Z, vcs-type=git, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, version=7, io.buildah.version=1.42.2, GIT_BRANCH=main, ceph=True, release=1770267347, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, vendor=Red Hat, Inc.)
Feb 23 09:45:11 np0005626463.localdomain systemd[1]: libpod-conmon-23d0be7b244400625106b2a6dbba6bc439ee78f6ead2263061c28fb191ed25d9.scope: Deactivated successfully.
Feb 23 09:45:11 np0005626463.localdomain sudo[291411]: pam_unix(sudo:session): session closed for user root
Feb 23 09:45:11 np0005626463.localdomain sudo[291489]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 09:45:11 np0005626463.localdomain sudo[291489]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:45:11 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.
Feb 23 09:45:11 np0005626463.localdomain sudo[291489]: pam_unix(sudo:session): session closed for user root
Feb 23 09:45:11 np0005626463.localdomain sudo[291508]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid f1fea371-cb69-578d-a3d0-b5c472a84b46
Feb 23 09:45:11 np0005626463.localdomain sudo[291508]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:45:11 np0005626463.localdomain podman[291507]: 2026-02-23 09:45:11.593043186 +0000 UTC m=+0.089096214 container health_status 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, release=1770267347, vendor=Red Hat, Inc., container_name=openstack_network_exporter, version=9.7, managed_by=edpm_ansible, architecture=x86_64, distribution-scope=public, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9/ubi-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2026-02-05T04:57:10Z, io.openshift.tags=minimal rhel9, config_id=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2026-02-05T04:57:10Z)
Feb 23 09:45:11 np0005626463.localdomain podman[291507]: 2026-02-23 09:45:11.612281556 +0000 UTC m=+0.108334604 container exec_died 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, version=9.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, name=ubi9/ubi-minimal, io.openshift.expose-services=, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1770267347, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, vcs-type=git, distribution-scope=public, build-date=2026-02-05T04:57:10Z, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=openstack_network_exporter, container_name=openstack_network_exporter, org.opencontainers.image.created=2026-02-05T04:57:10Z, maintainer=Red Hat, Inc.)
Feb 23 09:45:11 np0005626463.localdomain systemd[1]: 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.service: Deactivated successfully.
Feb 23 09:45:11 np0005626463.localdomain ceph-mon[289530]: Reconfiguring osd.2 (monmap changed)...
Feb 23 09:45:11 np0005626463.localdomain ceph-mon[289530]: Reconfiguring daemon osd.2 on np0005626463.localdomain
Feb 23 09:45:11 np0005626463.localdomain ceph-mon[289530]: from='client.? 172.18.0.107:0/1059612398' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 09:45:11 np0005626463.localdomain ceph-mon[289530]: pgmap v11: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail; 17 KiB/s rd, 0 B/s wr, 9 op/s
Feb 23 09:45:11 np0005626463.localdomain ceph-mon[289530]: from='client.? 172.18.0.108:0/1266158925' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 09:45:11 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:45:11 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:45:11 np0005626463.localdomain ceph-mon[289530]: Reconfiguring osd.5 (monmap changed)...
Feb 23 09:45:11 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch
Feb 23 09:45:11 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:45:11 np0005626463.localdomain ceph-mon[289530]: Reconfiguring daemon osd.5 on np0005626463.localdomain
Feb 23 09:45:11 np0005626463.localdomain ceph-mon[289530]: from='client.17328 -' entity='client.admin' cmd=[{"prefix": "orch ps", "daemon_type": "mon", "daemon_id": "np0005626459", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Feb 23 09:45:11 np0005626463.localdomain podman[291560]: 
Feb 23 09:45:12 np0005626463.localdomain podman[291560]: 2026-02-23 09:45:12.001644367 +0000 UTC m=+0.069899625 container create 171e99f84cf07980589966b36cb6c9cc81b646d5d350d8090c92acf998e21350 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=reverent_mcclintock, org.opencontainers.image.created=2026-02-09T10:25:24Z, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, build-date=2026-02-09T10:25:24Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_BRANCH=main, release=1770267347, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, com.redhat.component=rhceph-container, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, CEPH_POINT_RELEASE=, io.buildah.version=1.42.2, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, vcs-type=git, RELEASE=main, vendor=Red Hat, Inc., io.openshift.expose-services=, distribution-scope=public, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7)
Feb 23 09:45:12 np0005626463.localdomain systemd[1]: Started libpod-conmon-171e99f84cf07980589966b36cb6c9cc81b646d5d350d8090c92acf998e21350.scope.
Feb 23 09:45:12 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:45:12.054 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:45:12 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 09:45:12 np0005626463.localdomain podman[291560]: 2026-02-23 09:45:11.968698986 +0000 UTC m=+0.036954284 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 23 09:45:12 np0005626463.localdomain podman[291560]: 2026-02-23 09:45:12.076021887 +0000 UTC m=+0.144277145 container init 171e99f84cf07980589966b36cb6c9cc81b646d5d350d8090c92acf998e21350 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=reverent_mcclintock, vendor=Red Hat, Inc., io.openshift.expose-services=, com.redhat.component=rhceph-container, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, io.openshift.tags=rhceph ceph, vcs-type=git, release=1770267347, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, build-date=2026-02-09T10:25:24Z, io.buildah.version=1.42.2, name=rhceph, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, org.opencontainers.image.created=2026-02-09T10:25:24Z, architecture=x86_64, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, RELEASE=main)
Feb 23 09:45:12 np0005626463.localdomain podman[291560]: 2026-02-23 09:45:12.083908189 +0000 UTC m=+0.152163437 container start 171e99f84cf07980589966b36cb6c9cc81b646d5d350d8090c92acf998e21350 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=reverent_mcclintock, org.opencontainers.image.created=2026-02-09T10:25:24Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2026-02-09T10:25:24Z, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, io.openshift.tags=rhceph ceph, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, GIT_CLEAN=True, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_BRANCH=main, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, architecture=x86_64, name=rhceph, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.buildah.version=1.42.2, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1770267347, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers)
Feb 23 09:45:12 np0005626463.localdomain podman[291560]: 2026-02-23 09:45:12.084141546 +0000 UTC m=+0.152396804 container attach 171e99f84cf07980589966b36cb6c9cc81b646d5d350d8090c92acf998e21350 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=reverent_mcclintock, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, distribution-scope=public, ceph=True, io.openshift.tags=rhceph ceph, RELEASE=main, io.buildah.version=1.42.2, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-02-09T10:25:24Z, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.created=2026-02-09T10:25:24Z, architecture=x86_64, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, release=1770267347, description=Red Hat Ceph Storage 7, name=rhceph, GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, CEPH_POINT_RELEASE=)
Feb 23 09:45:12 np0005626463.localdomain reverent_mcclintock[291575]: 167 167
Feb 23 09:45:12 np0005626463.localdomain systemd[1]: libpod-171e99f84cf07980589966b36cb6c9cc81b646d5d350d8090c92acf998e21350.scope: Deactivated successfully.
Feb 23 09:45:12 np0005626463.localdomain podman[291560]: 2026-02-23 09:45:12.088581173 +0000 UTC m=+0.156836471 container died 171e99f84cf07980589966b36cb6c9cc81b646d5d350d8090c92acf998e21350 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=reverent_mcclintock, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, CEPH_POINT_RELEASE=, build-date=2026-02-09T10:25:24Z, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, release=1770267347, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, io.buildah.version=1.42.2, RELEASE=main, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.openshift.expose-services=, version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, name=rhceph, GIT_BRANCH=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14)
Feb 23 09:45:12 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-1d79617f1932691bbe7c3efa34c0e333cb446b5094a5769143e962411d5d39a6-merged.mount: Deactivated successfully.
Feb 23 09:45:12 np0005626463.localdomain podman[291580]: 2026-02-23 09:45:12.194723778 +0000 UTC m=+0.092588130 container remove 171e99f84cf07980589966b36cb6c9cc81b646d5d350d8090c92acf998e21350 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=reverent_mcclintock, io.k8s.description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, CEPH_POINT_RELEASE=, org.opencontainers.image.created=2026-02-09T10:25:24Z, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, GIT_BRANCH=main, vcs-type=git, io.buildah.version=1.42.2, vendor=Red Hat, Inc., build-date=2026-02-09T10:25:24Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, name=rhceph, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1770267347, com.redhat.component=rhceph-container, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers)
Feb 23 09:45:12 np0005626463.localdomain systemd[1]: libpod-conmon-171e99f84cf07980589966b36cb6c9cc81b646d5d350d8090c92acf998e21350.scope: Deactivated successfully.
Feb 23 09:45:12 np0005626463.localdomain sudo[291508]: pam_unix(sudo:session): session closed for user root
Feb 23 09:45:12 np0005626463.localdomain sudo[291603]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 09:45:12 np0005626463.localdomain sudo[291603]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:45:12 np0005626463.localdomain sudo[291603]: pam_unix(sudo:session): session closed for user root
Feb 23 09:45:12 np0005626463.localdomain sudo[291621]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid f1fea371-cb69-578d-a3d0-b5c472a84b46
Feb 23 09:45:12 np0005626463.localdomain sudo[291621]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:45:12 np0005626463.localdomain ceph-mgr[288036]: ms_deliver_dispatch: unhandled message 0x55ea593e51e0 mon_map magic: 0 from mon.1 v2:172.18.0.105:3300/0
Feb 23 09:45:12 np0005626463.localdomain ceph-mgr[288036]: client.0 ms_handle_reset on v2:172.18.0.105:3300/0
Feb 23 09:45:12 np0005626463.localdomain ceph-mgr[288036]: client.0 ms_handle_reset on v2:172.18.0.105:3300/0
Feb 23 09:45:12 np0005626463.localdomain ceph-mon[289530]: mon.np0005626463@5(peon) e7  my rank is now 4 (was 5)
Feb 23 09:45:12 np0005626463.localdomain ceph-mgr[288036]: ms_deliver_dispatch: unhandled message 0x55ea593e5600 mon_map magic: 0 from mon.2 v2:172.18.0.108:3300/0
Feb 23 09:45:12 np0005626463.localdomain ceph-mon[289530]: log_channel(cluster) log [INF] : mon.np0005626463 calling monitor election
Feb 23 09:45:12 np0005626463.localdomain ceph-mon[289530]: paxos.4).electionLogic(26) init, last seen epoch 26
Feb 23 09:45:12 np0005626463.localdomain ceph-mon[289530]: mon.np0005626463@4(electing) e7 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Feb 23 09:45:12 np0005626463.localdomain ceph-mon[289530]: mon.np0005626463@4(electing) e7 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Feb 23 09:45:13 np0005626463.localdomain podman[291656]: 
Feb 23 09:45:13 np0005626463.localdomain podman[291656]: 2026-02-23 09:45:13.025171896 +0000 UTC m=+0.073232137 container create 5a151d06aa8018544854c43808ee3445256ba5cdc4844925c19f07aa8e54648d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=bold_agnesi, RELEASE=main, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, io.openshift.expose-services=, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, GIT_BRANCH=main, build-date=2026-02-09T10:25:24Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, vcs-type=git, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, CEPH_POINT_RELEASE=, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, release=1770267347, io.buildah.version=1.42.2, org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers)
Feb 23 09:45:13 np0005626463.localdomain ceph-mds[286877]: --2- [v2:172.18.0.106:6808/2515508693,v1:172.18.0.106:6809/2515508693] >> [v2:172.18.0.106:3300/0,v1:172.18.0.106:6789/0] conn(0x5634b1281c00 0x5634b1423600 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request get_initial_auth_request returned -2
Feb 23 09:45:13 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:45:13.054 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:45:13 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:45:13.055 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:45:13 np0005626463.localdomain systemd[1]: Started libpod-conmon-5a151d06aa8018544854c43808ee3445256ba5cdc4844925c19f07aa8e54648d.scope.
Feb 23 09:45:13 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 09:45:13 np0005626463.localdomain podman[291656]: 2026-02-23 09:45:13.091060267 +0000 UTC m=+0.139120518 container init 5a151d06aa8018544854c43808ee3445256ba5cdc4844925c19f07aa8e54648d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=bold_agnesi, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, CEPH_POINT_RELEASE=, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, release=1770267347, vendor=Red Hat, Inc., distribution-scope=public, description=Red Hat Ceph Storage 7, build-date=2026-02-09T10:25:24Z, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, GIT_BRANCH=main, io.buildah.version=1.42.2, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True)
Feb 23 09:45:13 np0005626463.localdomain podman[291656]: 2026-02-23 09:45:12.997184087 +0000 UTC m=+0.045244388 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 23 09:45:13 np0005626463.localdomain systemd[1]: tmp-crun.cWfQTV.mount: Deactivated successfully.
Feb 23 09:45:13 np0005626463.localdomain podman[291656]: 2026-02-23 09:45:13.104191049 +0000 UTC m=+0.152251260 container start 5a151d06aa8018544854c43808ee3445256ba5cdc4844925c19f07aa8e54648d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=bold_agnesi, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.42.2, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, io.openshift.expose-services=, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., GIT_CLEAN=True, description=Red Hat Ceph Storage 7, build-date=2026-02-09T10:25:24Z, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1770267347)
Feb 23 09:45:13 np0005626463.localdomain podman[291656]: 2026-02-23 09:45:13.104492029 +0000 UTC m=+0.152552310 container attach 5a151d06aa8018544854c43808ee3445256ba5cdc4844925c19f07aa8e54648d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=bold_agnesi, vendor=Red Hat, Inc., name=rhceph, release=1770267347, RELEASE=main, version=7, build-date=2026-02-09T10:25:24Z, CEPH_POINT_RELEASE=, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, org.opencontainers.image.created=2026-02-09T10:25:24Z, com.redhat.component=rhceph-container, GIT_BRANCH=main, io.buildah.version=1.42.2, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, architecture=x86_64, vcs-type=git, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Feb 23 09:45:13 np0005626463.localdomain bold_agnesi[291672]: 167 167
Feb 23 09:45:13 np0005626463.localdomain systemd[1]: libpod-5a151d06aa8018544854c43808ee3445256ba5cdc4844925c19f07aa8e54648d.scope: Deactivated successfully.
Feb 23 09:45:13 np0005626463.localdomain podman[291656]: 2026-02-23 09:45:13.107988496 +0000 UTC m=+0.156048797 container died 5a151d06aa8018544854c43808ee3445256ba5cdc4844925c19f07aa8e54648d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=bold_agnesi, release=1770267347, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, io.buildah.version=1.42.2, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, GIT_CLEAN=True, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vendor=Red Hat, Inc., version=7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, org.opencontainers.image.created=2026-02-09T10:25:24Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, architecture=x86_64, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, GIT_BRANCH=main, build-date=2026-02-09T10:25:24Z, distribution-scope=public)
Feb 23 09:45:13 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-e7744f9b1f0b5c7c8419c1aa6a71833428ba30aa2f214a925005d33bf6b896e4-merged.mount: Deactivated successfully.
Feb 23 09:45:13 np0005626463.localdomain podman[291677]: 2026-02-23 09:45:13.19484957 +0000 UTC m=+0.075041353 container remove 5a151d06aa8018544854c43808ee3445256ba5cdc4844925c19f07aa8e54648d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=bold_agnesi, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2026-02-09T10:25:24Z, ceph=True, GIT_CLEAN=True, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2026-02-09T10:25:24Z, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.42.2, RELEASE=main, version=7, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, release=1770267347, architecture=x86_64, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14)
Feb 23 09:45:13 np0005626463.localdomain systemd[1]: libpod-conmon-5a151d06aa8018544854c43808ee3445256ba5cdc4844925c19f07aa8e54648d.scope: Deactivated successfully.
Feb 23 09:45:13 np0005626463.localdomain sudo[291621]: pam_unix(sudo:session): session closed for user root
Feb 23 09:45:13 np0005626463.localdomain openstack_network_exporter[245358]: ERROR   09:45:13 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 23 09:45:13 np0005626463.localdomain openstack_network_exporter[245358]: 
Feb 23 09:45:13 np0005626463.localdomain openstack_network_exporter[245358]: ERROR   09:45:13 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 23 09:45:13 np0005626463.localdomain openstack_network_exporter[245358]: 
Feb 23 09:45:14 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:45:14.051 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:45:14 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:45:14.054 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:45:14 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:45:14.054 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:45:14 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:45:14.055 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 23 09:45:14 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:45:14.092 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:45:14 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:45:14.093 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:45:15 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:45:15.055 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:45:15 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:45:15.074 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 09:45:15 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:45:15.075 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 09:45:15 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:45:15.075 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 09:45:15 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:45:15.075 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Auditing locally available compute resources for np0005626463.localdomain (node: np0005626463.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 23 09:45:15 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:45:15.076 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 09:45:17 np0005626463.localdomain ceph-mon[289530]: mon.np0005626463@4(peon) e7 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Feb 23 09:45:17 np0005626463.localdomain ceph-mon[289530]: mon.np0005626466 calling monitor election
Feb 23 09:45:17 np0005626463.localdomain ceph-mon[289530]: mon.np0005626461 calling monitor election
Feb 23 09:45:17 np0005626463.localdomain ceph-mon[289530]: mon.np0005626460 calling monitor election
Feb 23 09:45:17 np0005626463.localdomain ceph-mon[289530]: mon.np0005626463 calling monitor election
Feb 23 09:45:17 np0005626463.localdomain ceph-mon[289530]: mon.np0005626461 is new leader, mons np0005626461,np0005626460,np0005626466,np0005626463 in quorum (ranks 0,1,2,4)
Feb 23 09:45:17 np0005626463.localdomain ceph-mon[289530]: monmap epoch 7
Feb 23 09:45:17 np0005626463.localdomain ceph-mon[289530]: fsid f1fea371-cb69-578d-a3d0-b5c472a84b46
Feb 23 09:45:17 np0005626463.localdomain ceph-mon[289530]: last_changed 2026-02-23T09:45:12.813169+0000
Feb 23 09:45:17 np0005626463.localdomain ceph-mon[289530]: created 2026-02-23T07:36:01.997603+0000
Feb 23 09:45:17 np0005626463.localdomain ceph-mon[289530]: min_mon_release 18 (reef)
Feb 23 09:45:17 np0005626463.localdomain ceph-mon[289530]: election_strategy: 1
Feb 23 09:45:17 np0005626463.localdomain ceph-mon[289530]: 0: [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] mon.np0005626461
Feb 23 09:45:17 np0005626463.localdomain ceph-mon[289530]: 1: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005626460
Feb 23 09:45:17 np0005626463.localdomain ceph-mon[289530]: 2: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005626466
Feb 23 09:45:17 np0005626463.localdomain ceph-mon[289530]: 3: [v2:172.18.0.107:3300/0,v1:172.18.0.107:6789/0] mon.np0005626465
Feb 23 09:45:17 np0005626463.localdomain ceph-mon[289530]: 4: [v2:172.18.0.106:3300/0,v1:172.18.0.106:6789/0] mon.np0005626463
Feb 23 09:45:17 np0005626463.localdomain ceph-mon[289530]: fsmap cephfs:1 {0=mds.np0005626463.qcthuc=up:active} 2 up:standby
Feb 23 09:45:17 np0005626463.localdomain ceph-mon[289530]: osdmap e81: 6 total, 6 up, 6 in
Feb 23 09:45:17 np0005626463.localdomain ceph-mon[289530]: mgrmap e18: np0005626461.lrfquh(active, since 20s), standbys: np0005626460.fyrady, np0005626465.hlpkwo, np0005626463.wtksup, np0005626466.nisqfq, np0005626459.pmtxxl
Feb 23 09:45:17 np0005626463.localdomain ceph-mon[289530]: Health check failed: 1/5 mons down, quorum np0005626461,np0005626460,np0005626466,np0005626463 (MON_DOWN)
Feb 23 09:45:17 np0005626463.localdomain ceph-mon[289530]: Health detail: HEALTH_WARN 1/5 mons down, quorum np0005626461,np0005626460,np0005626466,np0005626463
Feb 23 09:45:17 np0005626463.localdomain ceph-mon[289530]: [WRN] MON_DOWN: 1/5 mons down, quorum np0005626461,np0005626460,np0005626466,np0005626463
Feb 23 09:45:17 np0005626463.localdomain ceph-mon[289530]:     mon.np0005626465 (rank 3) addr [v2:172.18.0.107:3300/0,v1:172.18.0.107:6789/0] is down (out of quorum)
Feb 23 09:45:18 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:45:18.505 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 3.429s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 09:45:18 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:45:18.584 282211 DEBUG nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 23 09:45:18 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:45:18.585 282211 DEBUG nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 23 09:45:18 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:45:18.787 282211 WARNING nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 23 09:45:18 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:45:18.789 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Hypervisor/Node resource view: name=np0005626463.localdomain free_ram=11862MB free_disk=41.8366584777832GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 23 09:45:18 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:45:18.789 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 09:45:18 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:45:18.790 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 09:45:18 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:45:18.871 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Instance c2a7d92b-952f-46a7-8a6a-3322a48fcf4b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 23 09:45:18 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:45:18.872 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 23 09:45:18 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:45:18.872 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Final resource view: name=np0005626463.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 23 09:45:18 np0005626463.localdomain ceph-mon[289530]: mon.np0005626463@4(electing) e7 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Feb 23 09:45:18 np0005626463.localdomain ceph-mon[289530]: mon.np0005626463@4(electing) e7 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Feb 23 09:45:18 np0005626463.localdomain ceph-mon[289530]: mon.np0005626463@4(electing) e7 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Feb 23 09:45:18 np0005626463.localdomain ceph-mon[289530]: mon.np0005626463@4(peon) e7 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Feb 23 09:45:18 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:45:18.919 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 09:45:18 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005626463.qcthuc", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 23 09:45:18 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:45:18 np0005626463.localdomain ceph-mon[289530]: Reconfiguring daemon mds.mds.np0005626463.qcthuc on np0005626463.localdomain
Feb 23 09:45:18 np0005626463.localdomain ceph-mon[289530]: from='client.17334 -' entity='client.admin' cmd=[{"prefix": "orch daemon rm", "names": ["mon.np0005626459"], "force": true, "target": ["mon-mgr", ""]}]: dispatch
Feb 23 09:45:18 np0005626463.localdomain ceph-mon[289530]: Remove daemons mon.np0005626459
Feb 23 09:45:18 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "quorum_status"} : dispatch
Feb 23 09:45:18 np0005626463.localdomain ceph-mon[289530]: Safe to remove mon.np0005626459: new quorum should be ['np0005626461', 'np0005626460', 'np0005626466', 'np0005626465', 'np0005626463'] (from ['np0005626461', 'np0005626460', 'np0005626466', 'np0005626465', 'np0005626463'])
Feb 23 09:45:18 np0005626463.localdomain ceph-mon[289530]: Removing monitor np0005626459 from monmap...
Feb 23 09:45:18 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "mon rm", "name": "np0005626459"} : dispatch
Feb 23 09:45:18 np0005626463.localdomain ceph-mon[289530]: Removing daemon mon.np0005626459 from np0005626459.localdomain -- ports []
Feb 23 09:45:18 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "mon metadata", "id": "np0005626460"} : dispatch
Feb 23 09:45:18 np0005626463.localdomain ceph-mon[289530]: mon.np0005626465 calling monitor election
Feb 23 09:45:18 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "mon metadata", "id": "np0005626461"} : dispatch
Feb 23 09:45:18 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "mon metadata", "id": "np0005626463"} : dispatch
Feb 23 09:45:18 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "mon metadata", "id": "np0005626465"} : dispatch
Feb 23 09:45:18 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "mon metadata", "id": "np0005626466"} : dispatch
Feb 23 09:45:18 np0005626463.localdomain ceph-mon[289530]: pgmap v12: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail; 17 KiB/s rd, 0 B/s wr, 9 op/s
Feb 23 09:45:18 np0005626463.localdomain ceph-mon[289530]: pgmap v13: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail; 17 KiB/s rd, 0 B/s wr, 9 op/s
Feb 23 09:45:18 np0005626463.localdomain ceph-mon[289530]: pgmap v14: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:45:18 np0005626463.localdomain ceph-mon[289530]: mon.np0005626460 calling monitor election
Feb 23 09:45:18 np0005626463.localdomain ceph-mon[289530]: mon.np0005626461 calling monitor election
Feb 23 09:45:18 np0005626463.localdomain ceph-mon[289530]: mon.np0005626466 calling monitor election
Feb 23 09:45:18 np0005626463.localdomain ceph-mon[289530]: mon.np0005626461 is new leader, mons np0005626461,np0005626460,np0005626466,np0005626465,np0005626463 in quorum (ranks 0,1,2,3,4)
Feb 23 09:45:18 np0005626463.localdomain ceph-mon[289530]: monmap epoch 7
Feb 23 09:45:18 np0005626463.localdomain ceph-mon[289530]: fsid f1fea371-cb69-578d-a3d0-b5c472a84b46
Feb 23 09:45:18 np0005626463.localdomain ceph-mon[289530]: last_changed 2026-02-23T09:45:12.813169+0000
Feb 23 09:45:18 np0005626463.localdomain ceph-mon[289530]: created 2026-02-23T07:36:01.997603+0000
Feb 23 09:45:18 np0005626463.localdomain ceph-mon[289530]: min_mon_release 18 (reef)
Feb 23 09:45:18 np0005626463.localdomain ceph-mon[289530]: election_strategy: 1
Feb 23 09:45:18 np0005626463.localdomain ceph-mon[289530]: 0: [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] mon.np0005626461
Feb 23 09:45:18 np0005626463.localdomain ceph-mon[289530]: 1: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005626460
Feb 23 09:45:18 np0005626463.localdomain ceph-mon[289530]: 2: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005626466
Feb 23 09:45:18 np0005626463.localdomain ceph-mon[289530]: 3: [v2:172.18.0.107:3300/0,v1:172.18.0.107:6789/0] mon.np0005626465
Feb 23 09:45:18 np0005626463.localdomain ceph-mon[289530]: 4: [v2:172.18.0.106:3300/0,v1:172.18.0.106:6789/0] mon.np0005626463
Feb 23 09:45:18 np0005626463.localdomain ceph-mon[289530]: fsmap cephfs:1 {0=mds.np0005626463.qcthuc=up:active} 2 up:standby
Feb 23 09:45:18 np0005626463.localdomain ceph-mon[289530]: osdmap e81: 6 total, 6 up, 6 in
Feb 23 09:45:18 np0005626463.localdomain ceph-mon[289530]: mgrmap e18: np0005626461.lrfquh(active, since 21s), standbys: np0005626460.fyrady, np0005626465.hlpkwo, np0005626463.wtksup, np0005626466.nisqfq, np0005626459.pmtxxl
Feb 23 09:45:18 np0005626463.localdomain ceph-mon[289530]: Health check cleared: MON_DOWN (was: 1/5 mons down, quorum np0005626461,np0005626460,np0005626466,np0005626463)
Feb 23 09:45:18 np0005626463.localdomain ceph-mon[289530]: Cluster is now healthy
Feb 23 09:45:18 np0005626463.localdomain ceph-mon[289530]: overall HEALTH_OK
Feb 23 09:45:18 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:45:18 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:45:18 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626463.wtksup", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 23 09:45:18 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626463.wtksup", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 23 09:45:19 np0005626463.localdomain sudo[291717]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 09:45:19 np0005626463.localdomain sudo[291717]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:45:19 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.
Feb 23 09:45:19 np0005626463.localdomain sudo[291717]: pam_unix(sudo:session): session closed for user root
Feb 23 09:45:19 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.
Feb 23 09:45:19 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:45:19.094 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:45:19 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:45:19.097 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:45:19 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:45:19.097 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 09:45:19 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:45:19.097 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:45:19 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:45:19.140 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:45:19 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:45:19.140 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:45:19 np0005626463.localdomain systemd[1]: tmp-crun.dQZbS7.mount: Deactivated successfully.
Feb 23 09:45:19 np0005626463.localdomain sudo[291756]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid f1fea371-cb69-578d-a3d0-b5c472a84b46
Feb 23 09:45:19 np0005626463.localdomain sudo[291756]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:45:19 np0005626463.localdomain podman[291755]: 2026-02-23 09:45:19.189012678 +0000 UTC m=+0.097525362 container health_status bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 23 09:45:19 np0005626463.localdomain podman[291754]: 2026-02-23 09:45:19.238969229 +0000 UTC m=+0.147384180 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 23 09:45:19 np0005626463.localdomain podman[291754]: 2026-02-23 09:45:19.271087575 +0000 UTC m=+0.179502526 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20260216, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2)
Feb 23 09:45:19 np0005626463.localdomain systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully.
Feb 23 09:45:19 np0005626463.localdomain podman[291755]: 2026-02-23 09:45:19.323313756 +0000 UTC m=+0.231826470 container exec_died bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Feb 23 09:45:19 np0005626463.localdomain systemd[1]: bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.service: Deactivated successfully.
Feb 23 09:45:19 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:45:19.382 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.462s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 09:45:19 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:45:19.388 282211 DEBUG nova.compute.provider_tree [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Inventory has not changed in ProviderTree for provider: be63d86c-a403-4ec9-a515-07ea2962cb4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 23 09:45:19 np0005626463.localdomain ceph-mon[289530]: mon.np0005626463@4(peon).osd e81 _set_new_cache_sizes cache_size:1020054729 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:45:19 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:45:19.409 282211 DEBUG nova.scheduler.client.report [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Inventory has not changed for provider be63d86c-a403-4ec9-a515-07ea2962cb4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 23 09:45:19 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:45:19.412 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Compute_service record updated for np0005626463.localdomain:np0005626463.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 23 09:45:19 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:45:19.412 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.623s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 09:45:19 np0005626463.localdomain podman[291840]: 
Feb 23 09:45:19 np0005626463.localdomain podman[291840]: 2026-02-23 09:45:19.614492886 +0000 UTC m=+0.075939180 container create ece168ae9d19c0d16a7c7aa48ef68dc0c70ce57c2272934c9a449bd13f93ba59 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=inspiring_euler, distribution-scope=public, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, version=7, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2026-02-09T10:25:24Z, release=1770267347, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vcs-type=git, io.buildah.version=1.42.2)
Feb 23 09:45:19 np0005626463.localdomain systemd[1]: Started libpod-conmon-ece168ae9d19c0d16a7c7aa48ef68dc0c70ce57c2272934c9a449bd13f93ba59.scope.
Feb 23 09:45:19 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 09:45:19 np0005626463.localdomain podman[291840]: 2026-02-23 09:45:19.583490585 +0000 UTC m=+0.044936929 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 23 09:45:19 np0005626463.localdomain podman[291840]: 2026-02-23 09:45:19.684246995 +0000 UTC m=+0.145693289 container init ece168ae9d19c0d16a7c7aa48ef68dc0c70ce57c2272934c9a449bd13f93ba59 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=inspiring_euler, name=rhceph, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.openshift.tags=rhceph ceph, vcs-type=git, ceph=True, io.buildah.version=1.42.2, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, architecture=x86_64, RELEASE=main, GIT_BRANCH=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., release=1770267347, build-date=2026-02-09T10:25:24Z, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Feb 23 09:45:19 np0005626463.localdomain podman[291840]: 2026-02-23 09:45:19.694005725 +0000 UTC m=+0.155452029 container start ece168ae9d19c0d16a7c7aa48ef68dc0c70ce57c2272934c9a449bd13f93ba59 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=inspiring_euler, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, release=1770267347, org.opencontainers.image.created=2026-02-09T10:25:24Z, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., vcs-type=git, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, name=rhceph, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, io.buildah.version=1.42.2, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2026-02-09T10:25:24Z)
Feb 23 09:45:19 np0005626463.localdomain podman[291840]: 2026-02-23 09:45:19.694336115 +0000 UTC m=+0.155782409 container attach ece168ae9d19c0d16a7c7aa48ef68dc0c70ce57c2272934c9a449bd13f93ba59 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=inspiring_euler, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, name=rhceph, GIT_BRANCH=main, io.openshift.expose-services=, distribution-scope=public, release=1770267347, RELEASE=main, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2026-02-09T10:25:24Z, description=Red Hat Ceph Storage 7, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.42.2, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, version=7, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2026-02-09T10:25:24Z, CEPH_POINT_RELEASE=, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 23 09:45:19 np0005626463.localdomain inspiring_euler[291855]: 167 167
Feb 23 09:45:19 np0005626463.localdomain systemd[1]: libpod-ece168ae9d19c0d16a7c7aa48ef68dc0c70ce57c2272934c9a449bd13f93ba59.scope: Deactivated successfully.
Feb 23 09:45:19 np0005626463.localdomain podman[291840]: 2026-02-23 09:45:19.698116121 +0000 UTC m=+0.159562445 container died ece168ae9d19c0d16a7c7aa48ef68dc0c70ce57c2272934c9a449bd13f93ba59 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=inspiring_euler, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, vendor=Red Hat, Inc., build-date=2026-02-09T10:25:24Z, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=rhceph-container, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.buildah.version=1.42.2, name=rhceph, distribution-scope=public, release=1770267347)
Feb 23 09:45:19 np0005626463.localdomain podman[291860]: 2026-02-23 09:45:19.792711432 +0000 UTC m=+0.082932115 container remove ece168ae9d19c0d16a7c7aa48ef68dc0c70ce57c2272934c9a449bd13f93ba59 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=inspiring_euler, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, name=rhceph, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1770267347, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.42.2, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.openshift.expose-services=, ceph=True, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, architecture=x86_64, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, vcs-type=git, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, version=7, build-date=2026-02-09T10:25:24Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Feb 23 09:45:19 np0005626463.localdomain systemd[1]: libpod-conmon-ece168ae9d19c0d16a7c7aa48ef68dc0c70ce57c2272934c9a449bd13f93ba59.scope: Deactivated successfully.
Feb 23 09:45:19 np0005626463.localdomain sudo[291756]: pam_unix(sudo:session): session closed for user root
Feb 23 09:45:19 np0005626463.localdomain sudo[291877]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 09:45:19 np0005626463.localdomain sudo[291877]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:45:19 np0005626463.localdomain sudo[291877]: pam_unix(sudo:session): session closed for user root
Feb 23 09:45:19 np0005626463.localdomain ceph-mon[289530]: Reconfiguring mgr.np0005626463.wtksup (monmap changed)...
Feb 23 09:45:19 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "mgr services"} : dispatch
Feb 23 09:45:19 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:45:19 np0005626463.localdomain ceph-mon[289530]: Reconfiguring daemon mgr.np0005626463.wtksup on np0005626463.localdomain
Feb 23 09:45:19 np0005626463.localdomain ceph-mon[289530]: pgmap v15: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:45:19 np0005626463.localdomain ceph-mon[289530]: from='client.? 172.18.0.106:0/3246610862' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 09:45:19 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:45:19 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:45:19 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Feb 23 09:45:19 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Feb 23 09:45:19 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:45:20 np0005626463.localdomain sudo[291895]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid f1fea371-cb69-578d-a3d0-b5c472a84b46
Feb 23 09:45:20 np0005626463.localdomain sudo[291895]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:45:20 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-687799a5ff8a64bd19365c3174285223d5ee6279077a6b09507ea30c065cd6a7-merged.mount: Deactivated successfully.
Feb 23 09:45:20 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:45:20.412 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:45:20 np0005626463.localdomain podman[291930]: 
Feb 23 09:45:20 np0005626463.localdomain podman[291930]: 2026-02-23 09:45:20.507861514 +0000 UTC m=+0.074755103 container create bc62ecad100ea333a4c432185ae93a0a17d75811a22bf97cb7aa26b26a03a47f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=naughty_borg, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, io.buildah.version=1.42.2, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-02-09T10:25:24Z, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, version=7, release=1770267347, architecture=x86_64, description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=rhceph-container, ceph=True, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, io.openshift.expose-services=, CEPH_POINT_RELEASE=, build-date=2026-02-09T10:25:24Z, vcs-type=git, RELEASE=main, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7)
Feb 23 09:45:20 np0005626463.localdomain systemd[1]: Started libpod-conmon-bc62ecad100ea333a4c432185ae93a0a17d75811a22bf97cb7aa26b26a03a47f.scope.
Feb 23 09:45:20 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 09:45:20 np0005626463.localdomain podman[291930]: 2026-02-23 09:45:20.57032527 +0000 UTC m=+0.137218859 container init bc62ecad100ea333a4c432185ae93a0a17d75811a22bf97cb7aa26b26a03a47f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=naughty_borg, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, RELEASE=main, org.opencontainers.image.created=2026-02-09T10:25:24Z, CEPH_POINT_RELEASE=, vcs-type=git, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_CLEAN=True, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1770267347, name=rhceph, build-date=2026-02-09T10:25:24Z, io.openshift.expose-services=, io.buildah.version=1.42.2, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public)
Feb 23 09:45:20 np0005626463.localdomain podman[291930]: 2026-02-23 09:45:20.476980277 +0000 UTC m=+0.043873906 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 23 09:45:20 np0005626463.localdomain podman[291930]: 2026-02-23 09:45:20.581760131 +0000 UTC m=+0.148653720 container start bc62ecad100ea333a4c432185ae93a0a17d75811a22bf97cb7aa26b26a03a47f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=naughty_borg, io.buildah.version=1.42.2, name=rhceph, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, description=Red Hat Ceph Storage 7, RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=rhceph-container, distribution-scope=public, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, vendor=Red Hat, Inc., GIT_CLEAN=True, io.openshift.expose-services=, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, architecture=x86_64, GIT_BRANCH=main, org.opencontainers.image.created=2026-02-09T10:25:24Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2026-02-09T10:25:24Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1770267347, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7)
Feb 23 09:45:20 np0005626463.localdomain podman[291930]: 2026-02-23 09:45:20.582186494 +0000 UTC m=+0.149080083 container attach bc62ecad100ea333a4c432185ae93a0a17d75811a22bf97cb7aa26b26a03a47f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=naughty_borg, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1770267347, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, CEPH_POINT_RELEASE=, version=7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, distribution-scope=public, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, build-date=2026-02-09T10:25:24Z, org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-type=git, vendor=Red Hat, Inc., GIT_BRANCH=main, io.buildah.version=1.42.2, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, com.redhat.component=rhceph-container)
Feb 23 09:45:20 np0005626463.localdomain naughty_borg[291945]: 167 167
Feb 23 09:45:20 np0005626463.localdomain systemd[1]: libpod-bc62ecad100ea333a4c432185ae93a0a17d75811a22bf97cb7aa26b26a03a47f.scope: Deactivated successfully.
Feb 23 09:45:20 np0005626463.localdomain podman[291930]: 2026-02-23 09:45:20.586460644 +0000 UTC m=+0.153354303 container died bc62ecad100ea333a4c432185ae93a0a17d75811a22bf97cb7aa26b26a03a47f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=naughty_borg, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2026-02-09T10:25:24Z, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, RELEASE=main, name=rhceph, io.buildah.version=1.42.2, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, version=7, release=1770267347, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, ceph=True, distribution-scope=public, org.opencontainers.image.created=2026-02-09T10:25:24Z, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 23 09:45:20 np0005626463.localdomain podman[291950]: 2026-02-23 09:45:20.677204198 +0000 UTC m=+0.077079656 container remove bc62ecad100ea333a4c432185ae93a0a17d75811a22bf97cb7aa26b26a03a47f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=naughty_borg, RELEASE=main, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, description=Red Hat Ceph Storage 7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_BRANCH=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, release=1770267347, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, GIT_CLEAN=True, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.42.2, build-date=2026-02-09T10:25:24Z, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.created=2026-02-09T10:25:24Z, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, ceph=True, io.openshift.expose-services=)
Feb 23 09:45:20 np0005626463.localdomain systemd[1]: libpod-conmon-bc62ecad100ea333a4c432185ae93a0a17d75811a22bf97cb7aa26b26a03a47f.scope: Deactivated successfully.
Feb 23 09:45:20 np0005626463.localdomain sudo[291895]: pam_unix(sudo:session): session closed for user root
Feb 23 09:45:21 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-b20f6591a2a87ff9dfdddb000e9d30f7310c535f9d9755037bdd6534fcf26088-merged.mount: Deactivated successfully.
Feb 23 09:45:21 np0005626463.localdomain ceph-mon[289530]: Reconfiguring mon.np0005626463 (monmap changed)...
Feb 23 09:45:21 np0005626463.localdomain ceph-mon[289530]: Reconfiguring daemon mon.np0005626463 on np0005626463.localdomain
Feb 23 09:45:21 np0005626463.localdomain ceph-mon[289530]: from='client.26785 -' entity='client.admin' cmd=[{"prefix": "orch host label rm", "hostname": "np0005626459.localdomain", "label": "mon", "target": ["mon-mgr", ""]}]: dispatch
Feb 23 09:45:21 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:45:21 np0005626463.localdomain ceph-mon[289530]: Removed label mon from host np0005626459.localdomain
Feb 23 09:45:21 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:45:21 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:45:21 np0005626463.localdomain ceph-mon[289530]: Reconfiguring crash.np0005626465 (monmap changed)...
Feb 23 09:45:21 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626465", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 23 09:45:21 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626465", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 23 09:45:21 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:45:21 np0005626463.localdomain ceph-mon[289530]: Reconfiguring daemon crash.np0005626465 on np0005626465.localdomain
Feb 23 09:45:22 np0005626463.localdomain ceph-mon[289530]: pgmap v16: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:45:22 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:45:22 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:45:22 np0005626463.localdomain ceph-mon[289530]: Reconfiguring osd.0 (monmap changed)...
Feb 23 09:45:22 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch
Feb 23 09:45:22 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:45:22 np0005626463.localdomain ceph-mon[289530]: Reconfiguring daemon osd.0 on np0005626465.localdomain
Feb 23 09:45:22 np0005626463.localdomain ceph-mon[289530]: from='client.26731 -' entity='client.admin' cmd=[{"prefix": "orch host label rm", "hostname": "np0005626459.localdomain", "label": "mgr", "target": ["mon-mgr", ""]}]: dispatch
Feb 23 09:45:22 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:45:22 np0005626463.localdomain ceph-mon[289530]: Removed label mgr from host np0005626459.localdomain
Feb 23 09:45:22 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:45:22 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:45:22 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch
Feb 23 09:45:22 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:45:22 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.
Feb 23 09:45:22 np0005626463.localdomain podman[291966]: 2026-02-23 09:45:22.922395423 +0000 UTC m=+0.092600821 container health_status be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=ceilometer_agent_compute)
Feb 23 09:45:22 np0005626463.localdomain podman[291966]: 2026-02-23 09:45:22.938427675 +0000 UTC m=+0.108633093 container exec_died be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Feb 23 09:45:22 np0005626463.localdomain systemd[1]: be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.service: Deactivated successfully.
Feb 23 09:45:23 np0005626463.localdomain ceph-mon[289530]: Reconfiguring osd.3 (monmap changed)...
Feb 23 09:45:23 np0005626463.localdomain ceph-mon[289530]: Reconfiguring daemon osd.3 on np0005626465.localdomain
Feb 23 09:45:23 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:45:23 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:45:23 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:45:23 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005626465.drvnoy", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 23 09:45:23 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005626465.drvnoy", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 23 09:45:23 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:45:24 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:45:24.179 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:45:24 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:45:24.182 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:45:24 np0005626463.localdomain ceph-mon[289530]: mon.np0005626463@4(peon).osd e81 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:45:24 np0005626463.localdomain ceph-mon[289530]: from='client.34158 -' entity='client.admin' cmd=[{"prefix": "orch host label rm", "hostname": "np0005626459.localdomain", "label": "_admin", "target": ["mon-mgr", ""]}]: dispatch
Feb 23 09:45:24 np0005626463.localdomain ceph-mon[289530]: Removed label _admin from host np0005626459.localdomain
Feb 23 09:45:24 np0005626463.localdomain ceph-mon[289530]: pgmap v17: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:45:24 np0005626463.localdomain ceph-mon[289530]: Reconfiguring mds.mds.np0005626465.drvnoy (monmap changed)...
Feb 23 09:45:24 np0005626463.localdomain ceph-mon[289530]: Reconfiguring daemon mds.mds.np0005626465.drvnoy on np0005626465.localdomain
Feb 23 09:45:24 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:45:24 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:45:24 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626465.hlpkwo", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 23 09:45:24 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626465.hlpkwo", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 23 09:45:24 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "mgr services"} : dispatch
Feb 23 09:45:24 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:45:25 np0005626463.localdomain ceph-mon[289530]: Reconfiguring mgr.np0005626465.hlpkwo (monmap changed)...
Feb 23 09:45:25 np0005626463.localdomain ceph-mon[289530]: Reconfiguring daemon mgr.np0005626465.hlpkwo on np0005626465.localdomain
Feb 23 09:45:25 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:45:25 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:45:25 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Feb 23 09:45:25 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Feb 23 09:45:25 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:45:26 np0005626463.localdomain ceph-mon[289530]: pgmap v18: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:45:26 np0005626463.localdomain ceph-mon[289530]: Reconfiguring mon.np0005626465 (monmap changed)...
Feb 23 09:45:26 np0005626463.localdomain ceph-mon[289530]: Reconfiguring daemon mon.np0005626465 on np0005626465.localdomain
Feb 23 09:45:26 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:45:26 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:45:26 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626466", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 23 09:45:26 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626466", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 23 09:45:26 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:45:26 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.
Feb 23 09:45:26 np0005626463.localdomain podman[291985]: 2026-02-23 09:45:26.918826196 +0000 UTC m=+0.087410702 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, container_name=ovn_metadata_agent, io.buildah.version=1.43.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 23 09:45:26 np0005626463.localdomain podman[291985]: 2026-02-23 09:45:26.954391566 +0000 UTC m=+0.122976062 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216)
Feb 23 09:45:26 np0005626463.localdomain systemd[1]: 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully.
Feb 23 09:45:27 np0005626463.localdomain ceph-mon[289530]: Reconfiguring crash.np0005626466 (monmap changed)...
Feb 23 09:45:27 np0005626463.localdomain ceph-mon[289530]: Reconfiguring daemon crash.np0005626466 on np0005626466.localdomain
Feb 23 09:45:27 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:45:27 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:45:27 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch
Feb 23 09:45:27 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:45:28 np0005626463.localdomain sshd[292001]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:45:28 np0005626463.localdomain ceph-mon[289530]: Reconfiguring osd.1 (monmap changed)...
Feb 23 09:45:28 np0005626463.localdomain ceph-mon[289530]: Reconfiguring daemon osd.1 on np0005626466.localdomain
Feb 23 09:45:28 np0005626463.localdomain ceph-mon[289530]: pgmap v19: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:45:28 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:45:28 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:45:28 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch
Feb 23 09:45:28 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:45:29 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:45:29.184 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:45:29 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:45:29.186 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:45:29 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:45:29.186 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 09:45:29 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:45:29.186 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:45:29 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:45:29.206 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:45:29 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:45:29.207 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:45:29 np0005626463.localdomain ceph-mon[289530]: mon.np0005626463@4(peon).osd e81 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:45:29 np0005626463.localdomain ceph-mon[289530]: Reconfiguring osd.4 (monmap changed)...
Feb 23 09:45:29 np0005626463.localdomain ceph-mon[289530]: Reconfiguring daemon osd.4 on np0005626466.localdomain
Feb 23 09:45:29 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:45:29 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:45:29 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005626466.vaywlp", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 23 09:45:29 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005626466.vaywlp", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 23 09:45:29 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:45:30 np0005626463.localdomain sshd[292001]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:45:30 np0005626463.localdomain ceph-mon[289530]: Reconfiguring mds.mds.np0005626466.vaywlp (monmap changed)...
Feb 23 09:45:30 np0005626463.localdomain ceph-mon[289530]: Reconfiguring daemon mds.mds.np0005626466.vaywlp on np0005626466.localdomain
Feb 23 09:45:30 np0005626463.localdomain ceph-mon[289530]: pgmap v20: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:45:30 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:45:30 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:45:30 np0005626463.localdomain ceph-mon[289530]: Reconfiguring mgr.np0005626466.nisqfq (monmap changed)...
Feb 23 09:45:30 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626466.nisqfq", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 23 09:45:30 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626466.nisqfq", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 23 09:45:30 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "mgr services"} : dispatch
Feb 23 09:45:30 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:45:30 np0005626463.localdomain ceph-mon[289530]: Reconfiguring daemon mgr.np0005626466.nisqfq on np0005626466.localdomain
Feb 23 09:45:30 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:45:30 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:45:30 np0005626463.localdomain ceph-mon[289530]: Reconfiguring mon.np0005626466 (monmap changed)...
Feb 23 09:45:30 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Feb 23 09:45:30 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Feb 23 09:45:30 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:45:30 np0005626463.localdomain ceph-mon[289530]: Reconfiguring daemon mon.np0005626466 on np0005626466.localdomain
Feb 23 09:45:32 np0005626463.localdomain ceph-mon[289530]: pgmap v21: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:45:32 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:45:32 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:45:33 np0005626463.localdomain sudo[292003]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Feb 23 09:45:33 np0005626463.localdomain sudo[292003]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:45:33 np0005626463.localdomain sudo[292003]: pam_unix(sudo:session): session closed for user root
Feb 23 09:45:33 np0005626463.localdomain sudo[292021]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/etc/ceph
Feb 23 09:45:33 np0005626463.localdomain sudo[292021]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:45:33 np0005626463.localdomain sudo[292021]: pam_unix(sudo:session): session closed for user root
Feb 23 09:45:33 np0005626463.localdomain sudo[292039]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/etc/ceph/ceph.conf.new
Feb 23 09:45:33 np0005626463.localdomain sudo[292039]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:45:33 np0005626463.localdomain sudo[292039]: pam_unix(sudo:session): session closed for user root
Feb 23 09:45:33 np0005626463.localdomain sudo[292057]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46
Feb 23 09:45:33 np0005626463.localdomain sudo[292057]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:45:33 np0005626463.localdomain sudo[292057]: pam_unix(sudo:session): session closed for user root
Feb 23 09:45:33 np0005626463.localdomain sudo[292075]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/etc/ceph/ceph.conf.new
Feb 23 09:45:33 np0005626463.localdomain sudo[292075]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:45:33 np0005626463.localdomain sudo[292075]: pam_unix(sudo:session): session closed for user root
Feb 23 09:45:33 np0005626463.localdomain sudo[292109]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/etc/ceph/ceph.conf.new
Feb 23 09:45:33 np0005626463.localdomain sudo[292109]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:45:33 np0005626463.localdomain sudo[292109]: pam_unix(sudo:session): session closed for user root
Feb 23 09:45:33 np0005626463.localdomain sudo[292127]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/etc/ceph/ceph.conf.new
Feb 23 09:45:33 np0005626463.localdomain sudo[292127]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:45:33 np0005626463.localdomain sudo[292127]: pam_unix(sudo:session): session closed for user root
Feb 23 09:45:33 np0005626463.localdomain sudo[292145]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Feb 23 09:45:33 np0005626463.localdomain sudo[292145]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:45:33 np0005626463.localdomain sudo[292145]: pam_unix(sudo:session): session closed for user root
Feb 23 09:45:33 np0005626463.localdomain sudo[292163]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config
Feb 23 09:45:33 np0005626463.localdomain sudo[292163]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:45:33 np0005626463.localdomain sudo[292163]: pam_unix(sudo:session): session closed for user root
Feb 23 09:45:33 np0005626463.localdomain sudo[292181]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config
Feb 23 09:45:33 np0005626463.localdomain sudo[292181]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:45:33 np0005626463.localdomain sudo[292181]: pam_unix(sudo:session): session closed for user root
Feb 23 09:45:33 np0005626463.localdomain sudo[292199]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf.new
Feb 23 09:45:33 np0005626463.localdomain sudo[292199]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:45:33 np0005626463.localdomain sudo[292199]: pam_unix(sudo:session): session closed for user root
Feb 23 09:45:33 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:45:33 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:45:33 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:45:33 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 23 09:45:33 np0005626463.localdomain ceph-mon[289530]: Removing np0005626459.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf
Feb 23 09:45:33 np0005626463.localdomain ceph-mon[289530]: Updating np0005626460.localdomain:/etc/ceph/ceph.conf
Feb 23 09:45:33 np0005626463.localdomain ceph-mon[289530]: Updating np0005626461.localdomain:/etc/ceph/ceph.conf
Feb 23 09:45:33 np0005626463.localdomain ceph-mon[289530]: Updating np0005626463.localdomain:/etc/ceph/ceph.conf
Feb 23 09:45:33 np0005626463.localdomain ceph-mon[289530]: Updating np0005626465.localdomain:/etc/ceph/ceph.conf
Feb 23 09:45:33 np0005626463.localdomain ceph-mon[289530]: Updating np0005626466.localdomain:/etc/ceph/ceph.conf
Feb 23 09:45:33 np0005626463.localdomain ceph-mon[289530]: Removing np0005626459.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 23 09:45:33 np0005626463.localdomain ceph-mon[289530]: pgmap v22: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:45:33 np0005626463.localdomain ceph-mon[289530]: Removing np0005626459.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.client.admin.keyring
Feb 23 09:45:33 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:45:33 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:45:33 np0005626463.localdomain ceph-mon[289530]: Updating np0005626466.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf
Feb 23 09:45:33 np0005626463.localdomain ceph-mon[289530]: Updating np0005626465.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf
Feb 23 09:45:33 np0005626463.localdomain ceph-mon[289530]: Updating np0005626463.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf
Feb 23 09:45:33 np0005626463.localdomain ceph-mon[289530]: Updating np0005626461.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf
Feb 23 09:45:33 np0005626463.localdomain ceph-mon[289530]: Updating np0005626460.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf
Feb 23 09:45:34 np0005626463.localdomain sudo[292217]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46
Feb 23 09:45:34 np0005626463.localdomain sudo[292217]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:45:34 np0005626463.localdomain sudo[292217]: pam_unix(sudo:session): session closed for user root
Feb 23 09:45:34 np0005626463.localdomain sudo[292235]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf.new
Feb 23 09:45:34 np0005626463.localdomain sudo[292235]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:45:34 np0005626463.localdomain sudo[292235]: pam_unix(sudo:session): session closed for user root
Feb 23 09:45:34 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:45:34.208 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:45:34 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:45:34.210 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:45:34 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:45:34.210 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 09:45:34 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:45:34.210 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:45:34 np0005626463.localdomain sudo[292269]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf.new
Feb 23 09:45:34 np0005626463.localdomain sudo[292269]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:45:34 np0005626463.localdomain sudo[292269]: pam_unix(sudo:session): session closed for user root
Feb 23 09:45:34 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:45:34.235 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:45:34 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:45:34.236 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:45:34 np0005626463.localdomain sudo[292287]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf.new
Feb 23 09:45:34 np0005626463.localdomain sudo[292287]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:45:34 np0005626463.localdomain sudo[292287]: pam_unix(sudo:session): session closed for user root
Feb 23 09:45:34 np0005626463.localdomain sudo[292305]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf.new /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf
Feb 23 09:45:34 np0005626463.localdomain sudo[292305]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:45:34 np0005626463.localdomain sudo[292305]: pam_unix(sudo:session): session closed for user root
Feb 23 09:45:34 np0005626463.localdomain ceph-mon[289530]: mon.np0005626463@4(peon).osd e81 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:45:35 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:45:35 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:45:35 np0005626463.localdomain ceph-mon[289530]: from='client.26746 -' entity='client.admin' cmd=[{"prefix": "orch host drain", "hostname": "np0005626459.localdomain", "target": ["mon-mgr", ""]}]: dispatch
Feb 23 09:45:35 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:45:35 np0005626463.localdomain ceph-mon[289530]: Added label _no_schedule to host np0005626459.localdomain
Feb 23 09:45:35 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:45:35 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:45:35 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:45:35 np0005626463.localdomain ceph-mon[289530]: Added label SpecialHostLabels.DRAIN_CONF_KEYRING to host np0005626459.localdomain
Feb 23 09:45:35 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:45:35 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:45:35 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:45:35 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:45:35 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:45:35 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:45:35 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:45:35 np0005626463.localdomain ceph-mon[289530]: Removing daemon crash.np0005626459 from np0005626459.localdomain -- ports []
Feb 23 09:45:36 np0005626463.localdomain ceph-mon[289530]: pgmap v23: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:45:36 np0005626463.localdomain ceph-mon[289530]: from='client.34168 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "host_pattern": "np0005626459.localdomain", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Feb 23 09:45:36 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth rm", "entity": "client.crash.np0005626459"} : dispatch
Feb 23 09:45:36 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth rm", "entity": "client.crash.np0005626459"} : dispatch
Feb 23 09:45:36 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' cmd='[{"prefix": "auth rm", "entity": "client.crash.np0005626459"}]': finished
Feb 23 09:45:36 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:45:37 np0005626463.localdomain sudo[292323]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 09:45:37 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.
Feb 23 09:45:37 np0005626463.localdomain sudo[292323]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:45:37 np0005626463.localdomain sudo[292323]: pam_unix(sudo:session): session closed for user root
Feb 23 09:45:37 np0005626463.localdomain ceph-mon[289530]: Removing key for client.crash.np0005626459
Feb 23 09:45:37 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:45:37 np0005626463.localdomain ceph-mon[289530]: Removing daemon mgr.np0005626459.pmtxxl from np0005626459.localdomain -- ports [9283, 8765]
Feb 23 09:45:37 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:45:37 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005626459.localdomain"} : dispatch
Feb 23 09:45:37 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005626459.localdomain"} : dispatch
Feb 23 09:45:37 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005626459.localdomain"}]': finished
Feb 23 09:45:37 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth rm", "entity": "mgr.np0005626459.pmtxxl"} : dispatch
Feb 23 09:45:37 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth rm", "entity": "mgr.np0005626459.pmtxxl"} : dispatch
Feb 23 09:45:37 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' cmd='[{"prefix": "auth rm", "entity": "mgr.np0005626459.pmtxxl"}]': finished
Feb 23 09:45:37 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:45:37 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:45:37 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 23 09:45:37 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:45:37 np0005626463.localdomain podman[292340]: 2026-02-23 09:45:37.820985361 +0000 UTC m=+0.092214518 container health_status da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 23 09:45:37 np0005626463.localdomain podman[292340]: 2026-02-23 09:45:37.835657612 +0000 UTC m=+0.106886769 container exec_died da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 23 09:45:37 np0005626463.localdomain systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Deactivated successfully.
Feb 23 09:45:38 np0005626463.localdomain sudo[292363]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 09:45:38 np0005626463.localdomain sudo[292363]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:45:38 np0005626463.localdomain sudo[292363]: pam_unix(sudo:session): session closed for user root
Feb 23 09:45:38 np0005626463.localdomain ceph-mon[289530]: pgmap v24: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:45:38 np0005626463.localdomain ceph-mon[289530]: from='client.26810 -' entity='client.admin' cmd=[{"prefix": "orch host rm", "hostname": "np0005626459.localdomain", "force": true, "target": ["mon-mgr", ""]}]: dispatch
Feb 23 09:45:38 np0005626463.localdomain ceph-mon[289530]: Removed host np0005626459.localdomain
Feb 23 09:45:38 np0005626463.localdomain ceph-mon[289530]: Removing key for mgr.np0005626459.pmtxxl
Feb 23 09:45:38 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:45:38 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 23 09:45:38 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:45:38 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 23 09:45:38 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626460", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 23 09:45:38 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626460", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 23 09:45:38 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:45:39 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:45:39.237 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:45:39 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:45:39.239 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:45:39 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:45:39.239 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 09:45:39 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:45:39.239 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:45:39 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:45:39.261 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:45:39 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:45:39.261 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:45:39 np0005626463.localdomain sshd[292381]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:45:39 np0005626463.localdomain podman[242954]: time="2026-02-23T09:45:39Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 23 09:45:39 np0005626463.localdomain ceph-mon[289530]: mon.np0005626463@4(peon).osd e81 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:45:39 np0005626463.localdomain podman[242954]: @ - - [23/Feb/2026:09:45:39 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155257 "" "Go-http-client/1.1"
Feb 23 09:45:39 np0005626463.localdomain podman[242954]: @ - - [23/Feb/2026:09:45:39 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18267 "" "Go-http-client/1.1"
Feb 23 09:45:39 np0005626463.localdomain sshd[292381]: Accepted publickey for tripleo-admin from 192.168.122.11 port 50738 ssh2: RSA SHA256:/ShS2J5Dq7o9P59e/NmgQORSAcJOBwu46Huo03HBdB4
Feb 23 09:45:39 np0005626463.localdomain systemd[1]: Created slice User Slice of UID 1003.
Feb 23 09:45:39 np0005626463.localdomain systemd[1]: Starting User Runtime Directory /run/user/1003...
Feb 23 09:45:39 np0005626463.localdomain systemd-logind[759]: New session 65 of user tripleo-admin.
Feb 23 09:45:39 np0005626463.localdomain systemd[1]: Finished User Runtime Directory /run/user/1003.
Feb 23 09:45:39 np0005626463.localdomain systemd[1]: Starting User Manager for UID 1003...
Feb 23 09:45:39 np0005626463.localdomain systemd[292385]: pam_unix(systemd-user:session): session opened for user tripleo-admin(uid=1003) by (uid=0)
Feb 23 09:45:39 np0005626463.localdomain systemd[292385]: Queued start job for default target Main User Target.
Feb 23 09:45:39 np0005626463.localdomain systemd[292385]: Created slice User Application Slice.
Feb 23 09:45:39 np0005626463.localdomain systemd[292385]: Started Mark boot as successful after the user session has run 2 minutes.
Feb 23 09:45:39 np0005626463.localdomain systemd[292385]: Started Daily Cleanup of User's Temporary Directories.
Feb 23 09:45:39 np0005626463.localdomain systemd[292385]: Reached target Paths.
Feb 23 09:45:39 np0005626463.localdomain systemd[292385]: Reached target Timers.
Feb 23 09:45:39 np0005626463.localdomain systemd[292385]: Starting D-Bus User Message Bus Socket...
Feb 23 09:45:39 np0005626463.localdomain systemd[292385]: Starting Create User's Volatile Files and Directories...
Feb 23 09:45:39 np0005626463.localdomain systemd[292385]: Finished Create User's Volatile Files and Directories.
Feb 23 09:45:39 np0005626463.localdomain systemd[292385]: Listening on D-Bus User Message Bus Socket.
Feb 23 09:45:39 np0005626463.localdomain systemd[292385]: Reached target Sockets.
Feb 23 09:45:39 np0005626463.localdomain systemd[292385]: Reached target Basic System.
Feb 23 09:45:39 np0005626463.localdomain systemd[292385]: Reached target Main User Target.
Feb 23 09:45:39 np0005626463.localdomain systemd[292385]: Startup finished in 152ms.
Feb 23 09:45:39 np0005626463.localdomain systemd[1]: Started User Manager for UID 1003.
Feb 23 09:45:39 np0005626463.localdomain systemd[1]: Started Session 65 of User tripleo-admin.
Feb 23 09:45:39 np0005626463.localdomain sshd[292381]: pam_unix(sshd:session): session opened for user tripleo-admin(uid=1003) by (uid=0)
Feb 23 09:45:39 np0005626463.localdomain ceph-mon[289530]: Reconfiguring crash.np0005626460 (monmap changed)...
Feb 23 09:45:39 np0005626463.localdomain ceph-mon[289530]: Reconfiguring daemon crash.np0005626460 on np0005626460.localdomain
Feb 23 09:45:39 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:45:39 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:45:39 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Feb 23 09:45:39 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Feb 23 09:45:39 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:45:40 np0005626463.localdomain sudo[292525]: tripleo-admin : TTY=pts/1 ; PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qawpdtpyszehymlodyghraegegeubglg ; /usr/bin/python3 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1771839939.8123333-63110-206921215570669/AnsiballZ_lineinfile.py
Feb 23 09:45:40 np0005626463.localdomain sudo[292525]: pam_unix(sudo:session): session opened for user root(uid=0) by tripleo-admin(uid=1003)
Feb 23 09:45:40 np0005626463.localdomain python3[292527]: ansible-ansible.builtin.lineinfile Invoked with dest=/etc/os-net-config/tripleo_config.yaml insertafter=172.18.0 line=    - ip_netmask: 172.18.0.103/24 backup=True path=/etc/os-net-config/tripleo_config.yaml state=present backrefs=False create=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 09:45:40 np0005626463.localdomain sudo[292525]: pam_unix(sudo:session): session closed for user root
Feb 23 09:45:40 np0005626463.localdomain ceph-mon[289530]: pgmap v25: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:45:40 np0005626463.localdomain ceph-mon[289530]: Reconfiguring mon.np0005626460 (monmap changed)...
Feb 23 09:45:40 np0005626463.localdomain ceph-mon[289530]: Reconfiguring daemon mon.np0005626460 on np0005626460.localdomain
Feb 23 09:45:40 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:45:40 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:45:40 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626460.fyrady", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 23 09:45:40 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626460.fyrady", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 23 09:45:40 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "mgr services"} : dispatch
Feb 23 09:45:40 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:45:41 np0005626463.localdomain sudo[292671]: tripleo-admin : TTY=pts/1 ; PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vcwigqskruubrtfxcudxpdoyznoielyh ; /usr/bin/python3 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1771839940.6075943-63128-16713613036338/AnsiballZ_command.py
Feb 23 09:45:41 np0005626463.localdomain sudo[292671]: pam_unix(sudo:session): session opened for user root(uid=0) by tripleo-admin(uid=1003)
Feb 23 09:45:41 np0005626463.localdomain python3[292673]: ansible-ansible.legacy.command Invoked with _raw_params=ip a add 172.18.0.103/24 dev vlan21 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 09:45:41 np0005626463.localdomain sudo[292671]: pam_unix(sudo:session): session closed for user root
Feb 23 09:45:41 np0005626463.localdomain sudo[292816]: tripleo-admin : TTY=pts/1 ; PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hugqjeceikvwfrlegenlxyohmoapudtn ; /usr/bin/python3 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1771839941.3784208-63139-222343491968284/AnsiballZ_command.py
Feb 23 09:45:41 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.
Feb 23 09:45:41 np0005626463.localdomain sudo[292816]: pam_unix(sudo:session): session opened for user root(uid=0) by tripleo-admin(uid=1003)
Feb 23 09:45:41 np0005626463.localdomain ceph-mon[289530]: Reconfiguring mgr.np0005626460.fyrady (monmap changed)...
Feb 23 09:45:41 np0005626463.localdomain ceph-mon[289530]: Reconfiguring daemon mgr.np0005626460.fyrady on np0005626460.localdomain
Feb 23 09:45:41 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:45:41 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:45:41 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Feb 23 09:45:41 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Feb 23 09:45:41 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:45:41 np0005626463.localdomain systemd[1]: tmp-crun.aL7tzr.mount: Deactivated successfully.
Feb 23 09:45:41 np0005626463.localdomain podman[292818]: 2026-02-23 09:45:41.872082161 +0000 UTC m=+0.098835752 container health_status 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, build-date=2026-02-05T04:57:10Z, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, config_id=openstack_network_exporter, vcs-type=git, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.7, com.redhat.component=ubi9-minimal-container, org.opencontainers.image.created=2026-02-05T04:57:10Z, distribution-scope=public, io.buildah.version=1.33.7, architecture=x86_64, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9/ubi-minimal, maintainer=Red Hat, Inc., release=1770267347)
Feb 23 09:45:41 np0005626463.localdomain podman[292818]: 2026-02-23 09:45:41.886006659 +0000 UTC m=+0.112760220 container exec_died 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9/ubi-minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, build-date=2026-02-05T04:57:10Z, release=1770267347, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., org.opencontainers.image.created=2026-02-05T04:57:10Z, url=https://catalog.redhat.com/en/search?searchType=containers, 
config_id=openstack_network_exporter, io.buildah.version=1.33.7, version=9.7, io.openshift.tags=minimal rhel9, vcs-type=git, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Feb 23 09:45:41 np0005626463.localdomain systemd[1]: 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.service: Deactivated successfully.
Feb 23 09:45:41 np0005626463.localdomain python3[292819]: ansible-ansible.legacy.command Invoked with _raw_params=ping -W1 -c 3 172.18.0.103 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 09:45:42 np0005626463.localdomain ceph-mon[289530]: Reconfiguring mon.np0005626461 (monmap changed)...
Feb 23 09:45:42 np0005626463.localdomain ceph-mon[289530]: Reconfiguring daemon mon.np0005626461 on np0005626461.localdomain
Feb 23 09:45:42 np0005626463.localdomain ceph-mon[289530]: pgmap v26: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:45:42 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:45:42 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:45:42 np0005626463.localdomain ceph-mon[289530]: Reconfiguring mgr.np0005626461.lrfquh (monmap changed)...
Feb 23 09:45:42 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626461.lrfquh", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 23 09:45:42 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626461.lrfquh", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 23 09:45:42 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "mgr services"} : dispatch
Feb 23 09:45:42 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:45:42 np0005626463.localdomain ceph-mon[289530]: Reconfiguring daemon mgr.np0005626461.lrfquh on np0005626461.localdomain
Feb 23 09:45:42 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:45:43 np0005626463.localdomain openstack_network_exporter[245358]: ERROR   09:45:43 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 23 09:45:43 np0005626463.localdomain openstack_network_exporter[245358]: 
Feb 23 09:45:43 np0005626463.localdomain openstack_network_exporter[245358]: ERROR   09:45:43 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 23 09:45:43 np0005626463.localdomain openstack_network_exporter[245358]: 
Feb 23 09:45:43 np0005626463.localdomain sudo[292816]: pam_unix(sudo:session): session closed for user root
Feb 23 09:45:44 np0005626463.localdomain ceph-mon[289530]: pgmap v27: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:45:44 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:45:44 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:45:44 np0005626463.localdomain ceph-mon[289530]: Reconfiguring crash.np0005626461 (monmap changed)...
Feb 23 09:45:44 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626461", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 23 09:45:44 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626461", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 23 09:45:44 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:45:44 np0005626463.localdomain ceph-mon[289530]: Reconfiguring daemon crash.np0005626461 on np0005626461.localdomain
Feb 23 09:45:44 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:45:44.262 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:45:44 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:45:44.263 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:45:44 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:45:44.263 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 09:45:44 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:45:44.263 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:45:44 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:45:44.302 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:45:44 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:45:44.303 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:45:44 np0005626463.localdomain sudo[292860]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 09:45:44 np0005626463.localdomain sudo[292860]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:45:44 np0005626463.localdomain sudo[292860]: pam_unix(sudo:session): session closed for user root
Feb 23 09:45:44 np0005626463.localdomain ceph-mon[289530]: mon.np0005626463@4(peon).osd e81 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:45:44 np0005626463.localdomain sudo[292878]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid f1fea371-cb69-578d-a3d0-b5c472a84b46
Feb 23 09:45:44 np0005626463.localdomain sudo[292878]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:45:44 np0005626463.localdomain podman[292914]: 
Feb 23 09:45:44 np0005626463.localdomain podman[292914]: 2026-02-23 09:45:44.847499499 +0000 UTC m=+0.065518351 container create 416e2f1045834857bdfc425f4c31d00781e42ec02862a5befac76fb105b708a7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=peaceful_carson, org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, io.openshift.expose-services=, build-date=2026-02-09T10:25:24Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, RELEASE=main, distribution-scope=public, name=rhceph, version=7, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.42.2, release=1770267347)
Feb 23 09:45:44 np0005626463.localdomain systemd[1]: Started libpod-conmon-416e2f1045834857bdfc425f4c31d00781e42ec02862a5befac76fb105b708a7.scope.
Feb 23 09:45:44 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 09:45:44 np0005626463.localdomain podman[292914]: 2026-02-23 09:45:44.815839847 +0000 UTC m=+0.033858719 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 23 09:45:44 np0005626463.localdomain podman[292914]: 2026-02-23 09:45:44.924989174 +0000 UTC m=+0.143008036 container init 416e2f1045834857bdfc425f4c31d00781e42ec02862a5befac76fb105b708a7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=peaceful_carson, version=7, ceph=True, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.42.2, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, RELEASE=main, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, release=1770267347, distribution-scope=public, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.created=2026-02-09T10:25:24Z, build-date=2026-02-09T10:25:24Z, vendor=Red Hat, Inc., name=rhceph, GIT_BRANCH=main, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, CEPH_POINT_RELEASE=, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git)
Feb 23 09:45:44 np0005626463.localdomain peaceful_carson[292929]: 167 167
Feb 23 09:45:44 np0005626463.localdomain systemd[1]: libpod-416e2f1045834857bdfc425f4c31d00781e42ec02862a5befac76fb105b708a7.scope: Deactivated successfully.
Feb 23 09:45:44 np0005626463.localdomain podman[292914]: 2026-02-23 09:45:44.935696463 +0000 UTC m=+0.153715315 container start 416e2f1045834857bdfc425f4c31d00781e42ec02862a5befac76fb105b708a7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=peaceful_carson, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, release=1770267347, GIT_BRANCH=main, vendor=Red Hat, Inc., build-date=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.buildah.version=1.42.2, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, architecture=x86_64, vcs-type=git, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.created=2026-02-09T10:25:24Z, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, version=7)
Feb 23 09:45:44 np0005626463.localdomain podman[292914]: 2026-02-23 09:45:44.936049484 +0000 UTC m=+0.154068336 container attach 416e2f1045834857bdfc425f4c31d00781e42ec02862a5befac76fb105b708a7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=peaceful_carson, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.tags=rhceph ceph, org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.buildah.version=1.42.2, vcs-type=git, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, version=7, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, vendor=Red Hat, Inc., GIT_CLEAN=True, release=1770267347, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, ceph=True)
Feb 23 09:45:44 np0005626463.localdomain podman[292914]: 2026-02-23 09:45:44.937746346 +0000 UTC m=+0.155765248 container died 416e2f1045834857bdfc425f4c31d00781e42ec02862a5befac76fb105b708a7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=peaceful_carson, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.42.2, description=Red Hat Ceph Storage 7, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, build-date=2026-02-09T10:25:24Z, architecture=x86_64, release=1770267347, ceph=True, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, vendor=Red Hat, Inc.)
Feb 23 09:45:45 np0005626463.localdomain podman[292934]: 2026-02-23 09:45:45.031628905 +0000 UTC m=+0.083277674 container remove 416e2f1045834857bdfc425f4c31d00781e42ec02862a5befac76fb105b708a7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=peaceful_carson, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, version=7, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2026-02-09T10:25:24Z, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, RELEASE=main, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.42.2, GIT_CLEAN=True, ceph=True, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, release=1770267347, CEPH_POINT_RELEASE=)
Feb 23 09:45:45 np0005626463.localdomain systemd[1]: libpod-conmon-416e2f1045834857bdfc425f4c31d00781e42ec02862a5befac76fb105b708a7.scope: Deactivated successfully.
Feb 23 09:45:45 np0005626463.localdomain sudo[292878]: pam_unix(sudo:session): session closed for user root
Feb 23 09:45:45 np0005626463.localdomain sudo[292950]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 09:45:45 np0005626463.localdomain sudo[292950]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:45:45 np0005626463.localdomain sudo[292950]: pam_unix(sudo:session): session closed for user root
Feb 23 09:45:45 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:45:45 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:45:45 np0005626463.localdomain ceph-mon[289530]: Reconfiguring crash.np0005626463 (monmap changed)...
Feb 23 09:45:45 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626463", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 23 09:45:45 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626463", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 23 09:45:45 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:45:45 np0005626463.localdomain ceph-mon[289530]: Reconfiguring daemon crash.np0005626463 on np0005626463.localdomain
Feb 23 09:45:45 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:45:45 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:45:45 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch
Feb 23 09:45:45 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:45:45 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:45:45 np0005626463.localdomain sudo[292968]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid f1fea371-cb69-578d-a3d0-b5c472a84b46
Feb 23 09:45:45 np0005626463.localdomain sudo[292968]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:45:45 np0005626463.localdomain podman[293002]: 
Feb 23 09:45:45 np0005626463.localdomain podman[293002]: 2026-02-23 09:45:45.72862664 +0000 UTC m=+0.074382251 container create 70fd54a3f9e42f5f675090e6d59da929a66410e46db8f3233899c631e4e7c5d9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=dreamy_davinci, name=rhceph, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, ceph=True, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, release=1770267347, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, build-date=2026-02-09T10:25:24Z, org.opencontainers.image.created=2026-02-09T10:25:24Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.42.2, com.redhat.component=rhceph-container, distribution-scope=public, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_BRANCH=main, version=7, vendor=Red Hat, Inc.)
Feb 23 09:45:45 np0005626463.localdomain systemd[1]: Started libpod-conmon-70fd54a3f9e42f5f675090e6d59da929a66410e46db8f3233899c631e4e7c5d9.scope.
Feb 23 09:45:45 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 09:45:45 np0005626463.localdomain podman[293002]: 2026-02-23 09:45:45.787996081 +0000 UTC m=+0.133751692 container init 70fd54a3f9e42f5f675090e6d59da929a66410e46db8f3233899c631e4e7c5d9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=dreamy_davinci, version=7, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_CLEAN=True, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.buildah.version=1.42.2, build-date=2026-02-09T10:25:24Z, release=1770267347, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, RELEASE=main, vcs-type=git, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, ceph=True, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Feb 23 09:45:45 np0005626463.localdomain dreamy_davinci[293017]: 167 167
Feb 23 09:45:45 np0005626463.localdomain podman[293002]: 2026-02-23 09:45:45.799135173 +0000 UTC m=+0.144890774 container start 70fd54a3f9e42f5f675090e6d59da929a66410e46db8f3233899c631e4e7c5d9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=dreamy_davinci, name=rhceph, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, io.buildah.version=1.42.2, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, version=7, build-date=2026-02-09T10:25:24Z, com.redhat.component=rhceph-container, distribution-scope=public, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, release=1770267347, ceph=True, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 23 09:45:45 np0005626463.localdomain podman[293002]: 2026-02-23 09:45:45.700307542 +0000 UTC m=+0.046063193 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 23 09:45:45 np0005626463.localdomain podman[293002]: 2026-02-23 09:45:45.799628759 +0000 UTC m=+0.145384370 container attach 70fd54a3f9e42f5f675090e6d59da929a66410e46db8f3233899c631e4e7c5d9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=dreamy_davinci, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-type=git, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2026-02-09T10:25:24Z, distribution-scope=public, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, RELEASE=main, org.opencontainers.image.created=2026-02-09T10:25:24Z, release=1770267347, io.buildah.version=1.42.2, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, version=7, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Feb 23 09:45:45 np0005626463.localdomain systemd[1]: libpod-70fd54a3f9e42f5f675090e6d59da929a66410e46db8f3233899c631e4e7c5d9.scope: Deactivated successfully.
Feb 23 09:45:45 np0005626463.localdomain podman[293002]: 2026-02-23 09:45:45.801724542 +0000 UTC m=+0.147480213 container died 70fd54a3f9e42f5f675090e6d59da929a66410e46db8f3233899c631e4e7c5d9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=dreamy_davinci, architecture=x86_64, CEPH_POINT_RELEASE=, name=rhceph, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, RELEASE=main, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_CLEAN=True, version=7, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, description=Red Hat Ceph Storage 7, build-date=2026-02-09T10:25:24Z, io.openshift.expose-services=, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1770267347, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.42.2)
Feb 23 09:45:45 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-e05ef13dc817c465026bee3c6ea3b4ca4614671ade21d6fa8fa9378a61130c4f-merged.mount: Deactivated successfully.
Feb 23 09:45:45 np0005626463.localdomain systemd[1]: tmp-crun.OwJLBF.mount: Deactivated successfully.
Feb 23 09:45:45 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-c31e04b19edf851b7e7f63f783c207fb50d96460f5ab5c884bc0710e9594cc8d-merged.mount: Deactivated successfully.
Feb 23 09:45:45 np0005626463.localdomain podman[293022]: 2026-02-23 09:45:45.893106085 +0000 UTC m=+0.079532490 container remove 70fd54a3f9e42f5f675090e6d59da929a66410e46db8f3233899c631e4e7c5d9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=dreamy_davinci, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, org.opencontainers.image.created=2026-02-09T10:25:24Z, description=Red Hat Ceph Storage 7, vcs-type=git, name=rhceph, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2026-02-09T10:25:24Z, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, io.buildah.version=1.42.2, GIT_BRANCH=main, RELEASE=main, release=1770267347, vendor=Red Hat, Inc.)
Feb 23 09:45:45 np0005626463.localdomain systemd[1]: libpod-conmon-70fd54a3f9e42f5f675090e6d59da929a66410e46db8f3233899c631e4e7c5d9.scope: Deactivated successfully.
Feb 23 09:45:46 np0005626463.localdomain sudo[292968]: pam_unix(sudo:session): session closed for user root
Feb 23 09:45:46 np0005626463.localdomain sudo[293045]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 09:45:46 np0005626463.localdomain sudo[293045]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:45:46 np0005626463.localdomain sudo[293045]: pam_unix(sudo:session): session closed for user root
Feb 23 09:45:46 np0005626463.localdomain sudo[293063]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid f1fea371-cb69-578d-a3d0-b5c472a84b46
Feb 23 09:45:46 np0005626463.localdomain sudo[293063]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:45:46 np0005626463.localdomain podman[293098]: 
Feb 23 09:45:46 np0005626463.localdomain podman[293098]: 2026-02-23 09:45:46.696560775 +0000 UTC m=+0.057192675 container create c6fc7bd83c90b0e6ff2474893ebc0a5635d40c92774b06e0c3f389334c4b3525 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=lucid_mclaren, url=https://catalog.redhat.com/en/search?searchType=containers, release=1770267347, architecture=x86_64, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-02-09T10:25:24Z, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., io.buildah.version=1.42.2, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, io.openshift.tags=rhceph ceph, ceph=True, build-date=2026-02-09T10:25:24Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, CEPH_POINT_RELEASE=, version=7)
Feb 23 09:45:46 np0005626463.localdomain systemd[1]: Started libpod-conmon-c6fc7bd83c90b0e6ff2474893ebc0a5635d40c92774b06e0c3f389334c4b3525.scope.
Feb 23 09:45:46 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 09:45:46 np0005626463.localdomain ceph-mon[289530]: from='client.34183 -' entity='client.admin' cmd=[{"prefix": "orch apply", "target": ["mon-mgr", ""]}]: dispatch
Feb 23 09:45:46 np0005626463.localdomain ceph-mon[289530]: Saving service mon spec with placement label:mon
Feb 23 09:45:46 np0005626463.localdomain ceph-mon[289530]: pgmap v28: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:45:46 np0005626463.localdomain ceph-mon[289530]: Reconfiguring osd.2 (monmap changed)...
Feb 23 09:45:46 np0005626463.localdomain ceph-mon[289530]: Reconfiguring daemon osd.2 on np0005626463.localdomain
Feb 23 09:45:46 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:45:46 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:45:46 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch
Feb 23 09:45:46 np0005626463.localdomain ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:45:46 np0005626463.localdomain podman[293098]: 2026-02-23 09:45:46.754098589 +0000 UTC m=+0.114730499 container init c6fc7bd83c90b0e6ff2474893ebc0a5635d40c92774b06e0c3f389334c4b3525 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=lucid_mclaren, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.created=2026-02-09T10:25:24Z, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, vendor=Red Hat, Inc., io.buildah.version=1.42.2, RELEASE=main, CEPH_POINT_RELEASE=, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, GIT_BRANCH=main, release=1770267347, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, distribution-scope=public, architecture=x86_64, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2026-02-09T10:25:24Z, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.tags=rhceph ceph, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7)
Feb 23 09:45:46 np0005626463.localdomain podman[293098]: 2026-02-23 09:45:46.762740344 +0000 UTC m=+0.123372244 container start c6fc7bd83c90b0e6ff2474893ebc0a5635d40c92774b06e0c3f389334c4b3525 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=lucid_mclaren, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, name=rhceph, distribution-scope=public, GIT_CLEAN=True, vendor=Red Hat, Inc., io.openshift.expose-services=, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1770267347, architecture=x86_64, io.openshift.tags=rhceph ceph, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.created=2026-02-09T10:25:24Z, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.42.2, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-02-09T10:25:24Z)
Feb 23 09:45:46 np0005626463.localdomain podman[293098]: 2026-02-23 09:45:46.763000412 +0000 UTC m=+0.123632312 container attach c6fc7bd83c90b0e6ff2474893ebc0a5635d40c92774b06e0c3f389334c4b3525 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=lucid_mclaren, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.created=2026-02-09T10:25:24Z, distribution-scope=public, build-date=2026-02-09T10:25:24Z, vcs-type=git, version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1770267347, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, RELEASE=main, architecture=x86_64, io.openshift.tags=rhceph ceph, io.buildah.version=1.42.2, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 23 09:45:46 np0005626463.localdomain lucid_mclaren[293113]: 167 167
Feb 23 09:45:46 np0005626463.localdomain systemd[1]: libpod-c6fc7bd83c90b0e6ff2474893ebc0a5635d40c92774b06e0c3f389334c4b3525.scope: Deactivated successfully.
Feb 23 09:45:46 np0005626463.localdomain podman[293098]: 2026-02-23 09:45:46.76553675 +0000 UTC m=+0.126168680 container died c6fc7bd83c90b0e6ff2474893ebc0a5635d40c92774b06e0c3f389334c4b3525 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=lucid_mclaren, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, architecture=x86_64, io.openshift.expose-services=, RELEASE=main, CEPH_POINT_RELEASE=, release=1770267347, build-date=2026-02-09T10:25:24Z, org.opencontainers.image.created=2026-02-09T10:25:24Z, vendor=Red Hat, Inc., vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.42.2, ceph=True, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14)
Feb 23 09:45:46 np0005626463.localdomain podman[293098]: 2026-02-23 09:45:46.667740191 +0000 UTC m=+0.028372131 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 23 09:45:46 np0005626463.localdomain sshd[293129]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:45:46 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-3639d597589c16367452ce06da7a6a717d37521643186a666de3899ac3abb8d9-merged.mount: Deactivated successfully.
Feb 23 09:45:46 np0005626463.localdomain podman[293118]: 2026-02-23 09:45:46.864068993 +0000 UTC m=+0.086546636 container remove c6fc7bd83c90b0e6ff2474893ebc0a5635d40c92774b06e0c3f389334c4b3525 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=lucid_mclaren, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, ceph=True, RELEASE=main, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.buildah.version=1.42.2, GIT_CLEAN=True, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, distribution-scope=public, build-date=2026-02-09T10:25:24Z, io.openshift.expose-services=, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, release=1770267347)
Feb 23 09:45:46 np0005626463.localdomain systemd[1]: libpod-conmon-c6fc7bd83c90b0e6ff2474893ebc0a5635d40c92774b06e0c3f389334c4b3525.scope: Deactivated successfully.
Feb 23 09:45:47 np0005626463.localdomain sudo[293063]: pam_unix(sudo:session): session closed for user root
Feb 23 09:45:47 np0005626463.localdomain sudo[293141]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 09:45:47 np0005626463.localdomain sudo[293141]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:45:47 np0005626463.localdomain sudo[293141]: pam_unix(sudo:session): session closed for user root
Feb 23 09:45:47 np0005626463.localdomain sudo[293159]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid f1fea371-cb69-578d-a3d0-b5c472a84b46
Feb 23 09:45:47 np0005626463.localdomain sudo[293159]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:45:47 np0005626463.localdomain sshd[293129]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:45:47 np0005626463.localdomain podman[293194]: 
Feb 23 09:45:47 np0005626463.localdomain podman[293194]: 2026-02-23 09:45:47.716558636 +0000 UTC m=+0.072778663 container create 421054a75b61ec6e795f4ae3451283bbbee699cc8ccba12613986303cdcc9d73 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=silly_merkle, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, architecture=x86_64, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-02-09T10:25:24Z, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, version=7, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, release=1770267347, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-02-09T10:25:24Z, ceph=True, io.openshift.expose-services=, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, distribution-scope=public, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.buildah.version=1.42.2)
Feb 23 09:45:47 np0005626463.localdomain systemd[1]: Started libpod-conmon-421054a75b61ec6e795f4ae3451283bbbee699cc8ccba12613986303cdcc9d73.scope.
Feb 23 09:45:47 np0005626463.localdomain ceph-mgr[288036]: ms_deliver_dispatch: unhandled message 0x55ea593e4f20 mon_map magic: 0 from mon.2 v2:172.18.0.108:3300/0
Feb 23 09:45:47 np0005626463.localdomain ceph-mon[289530]: mon.np0005626463@4(peon) e8  removed from monmap, suicide.
Feb 23 09:45:47 np0005626463.localdomain podman[293194]: 2026-02-23 09:45:47.686494764 +0000 UTC m=+0.042714831 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 23 09:45:47 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 09:45:47 np0005626463.localdomain podman[293194]: 2026-02-23 09:45:47.799895952 +0000 UTC m=+0.156115979 container init 421054a75b61ec6e795f4ae3451283bbbee699cc8ccba12613986303cdcc9d73 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=silly_merkle, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, distribution-scope=public, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2026-02-09T10:25:24Z, release=1770267347, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, ceph=True, vendor=Red Hat, Inc., RELEASE=main, io.openshift.expose-services=, vcs-type=git, org.opencontainers.image.created=2026-02-09T10:25:24Z, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=rhceph-container, version=7, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.42.2)
Feb 23 09:45:47 np0005626463.localdomain podman[293194]: 2026-02-23 09:45:47.808834766 +0000 UTC m=+0.165054793 container start 421054a75b61ec6e795f4ae3451283bbbee699cc8ccba12613986303cdcc9d73 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=silly_merkle, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, vcs-type=git, architecture=x86_64, version=7, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1770267347, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, ceph=True, name=rhceph, build-date=2026-02-09T10:25:24Z, GIT_CLEAN=True, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, RELEASE=main, io.buildah.version=1.42.2, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7)
Feb 23 09:45:47 np0005626463.localdomain podman[293194]: 2026-02-23 09:45:47.809104324 +0000 UTC m=+0.165324361 container attach 421054a75b61ec6e795f4ae3451283bbbee699cc8ccba12613986303cdcc9d73 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=silly_merkle, vcs-type=git, org.opencontainers.image.created=2026-02-09T10:25:24Z, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.42.2, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., io.openshift.expose-services=, distribution-scope=public, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1770267347, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, version=7, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2026-02-09T10:25:24Z, name=rhceph, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Feb 23 09:45:47 np0005626463.localdomain sudo[293209]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 09:45:47 np0005626463.localdomain silly_merkle[293212]: 167 167
Feb 23 09:45:47 np0005626463.localdomain systemd[1]: libpod-421054a75b61ec6e795f4ae3451283bbbee699cc8ccba12613986303cdcc9d73.scope: Deactivated successfully.
Feb 23 09:45:47 np0005626463.localdomain sudo[293209]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:45:47 np0005626463.localdomain podman[293194]: 2026-02-23 09:45:47.819033308 +0000 UTC m=+0.175253385 container died 421054a75b61ec6e795f4ae3451283bbbee699cc8ccba12613986303cdcc9d73 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=silly_merkle, description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2026-02-09T10:25:24Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1770267347, vcs-type=git, distribution-scope=public, version=7, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, vendor=Red Hat, Inc., GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, name=rhceph, io.openshift.expose-services=, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.42.2)
Feb 23 09:45:47 np0005626463.localdomain sudo[293209]: pam_unix(sudo:session): session closed for user root
Feb 23 09:45:47 np0005626463.localdomain podman[293228]: 2026-02-23 09:45:47.857663263 +0000 UTC m=+0.065728526 container died 081a8332e685fb2a9081f96d40bdac777e22e1b2c9276d5513069feb8fb9f301 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mon-np0005626463, io.openshift.expose-services=, build-date=2026-02-09T10:25:24Z, name=rhceph, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, vcs-type=git, io.buildah.version=1.42.2, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, GIT_CLEAN=True, org.opencontainers.image.created=2026-02-09T10:25:24Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, release=1770267347, CEPH_POINT_RELEASE=, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main)
Feb 23 09:45:47 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-e5d279caf730b4f32498bcb8a653e6f9b0d17888f70fd8780e2174e019b37e6c-merged.mount: Deactivated successfully.
Feb 23 09:45:47 np0005626463.localdomain sudo[293247]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 rm-daemon --fsid f1fea371-cb69-578d-a3d0-b5c472a84b46 --name mon.np0005626463 --force
Feb 23 09:45:47 np0005626463.localdomain podman[293228]: 2026-02-23 09:45:47.894227494 +0000 UTC m=+0.102292717 container remove 081a8332e685fb2a9081f96d40bdac777e22e1b2c9276d5513069feb8fb9f301 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mon-np0005626463, build-date=2026-02-09T10:25:24Z, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7, ceph=True, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-type=git, CEPH_POINT_RELEASE=, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1770267347, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, org.opencontainers.image.created=2026-02-09T10:25:24Z, name=rhceph, version=7, io.buildah.version=1.42.2, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Feb 23 09:45:47 np0005626463.localdomain sudo[293247]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:45:47 np0005626463.localdomain systemd[1]: tmp-crun.b2stwl.mount: Deactivated successfully.
Feb 23 09:45:48 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-540c513eff830757a4769e809eb48cf9cd28d0b44842a9010f5253fd2d0aa49d-merged.mount: Deactivated successfully.
Feb 23 09:45:48 np0005626463.localdomain podman[293241]: 2026-02-23 09:45:48.035125496 +0000 UTC m=+0.206536255 container remove 421054a75b61ec6e795f4ae3451283bbbee699cc8ccba12613986303cdcc9d73 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=silly_merkle, RELEASE=main, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, com.redhat.component=rhceph-container, version=7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vendor=Red Hat, Inc., build-date=2026-02-09T10:25:24Z, CEPH_POINT_RELEASE=, name=rhceph, GIT_BRANCH=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, release=1770267347, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.42.2, org.opencontainers.image.created=2026-02-09T10:25:24Z, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public)
Feb 23 09:45:48 np0005626463.localdomain systemd[1]: libpod-conmon-421054a75b61ec6e795f4ae3451283bbbee699cc8ccba12613986303cdcc9d73.scope: Deactivated successfully.
Feb 23 09:45:48 np0005626463.localdomain sudo[293159]: pam_unix(sudo:session): session closed for user root
Feb 23 09:45:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:45:48.547 163572 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 09:45:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:45:48.549 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 09:45:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:45:48.550 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 09:45:48 np0005626463.localdomain systemd[1]: ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46@mon.np0005626463.service: Deactivated successfully.
Feb 23 09:45:48 np0005626463.localdomain systemd[1]: Stopped Ceph mon.np0005626463 for f1fea371-cb69-578d-a3d0-b5c472a84b46.
Feb 23 09:45:48 np0005626463.localdomain systemd[1]: ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46@mon.np0005626463.service: Consumed 3.836s CPU time.
Feb 23 09:45:48 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 09:45:48 np0005626463.localdomain systemd-sysv-generator[293410]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 09:45:48 np0005626463.localdomain systemd-rc-local-generator[293406]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 09:45:49 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:45:49 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 23 09:45:49 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:45:49 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:45:49 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 09:45:49 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 23 09:45:49 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:45:49 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:45:49 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:45:49 np0005626463.localdomain sudo[293247]: pam_unix(sudo:session): session closed for user root
Feb 23 09:45:49 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:45:49.304 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:45:49 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:45:49.307 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:45:49 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:45:49.307 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 09:45:49 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:45:49.307 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:45:49 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:45:49.337 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:45:49 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:45:49.337 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:45:49 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.
Feb 23 09:45:49 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.
Feb 23 09:45:49 np0005626463.localdomain podman[293416]: 2026-02-23 09:45:49.92521666 +0000 UTC m=+0.095718626 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, io.buildah.version=1.43.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 23 09:45:49 np0005626463.localdomain podman[293416]: 2026-02-23 09:45:49.966257769 +0000 UTC m=+0.136759745 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Feb 23 09:45:49 np0005626463.localdomain systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully.
Feb 23 09:45:50 np0005626463.localdomain podman[293417]: 2026-02-23 09:45:50.015356014 +0000 UTC m=+0.185160539 container health_status bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Feb 23 09:45:50 np0005626463.localdomain podman[293417]: 2026-02-23 09:45:50.04878704 +0000 UTC m=+0.218591525 container exec_died bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Feb 23 09:45:50 np0005626463.localdomain systemd[1]: bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.service: Deactivated successfully.
Feb 23 09:45:52 np0005626463.localdomain ceph-mds[286877]: mds.beacon.mds.np0005626463.qcthuc missed beacon ack from the monitors
Feb 23 09:45:53 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.
Feb 23 09:45:53 np0005626463.localdomain systemd[1]: tmp-crun.Iy5Xn6.mount: Deactivated successfully.
Feb 23 09:45:53 np0005626463.localdomain podman[293465]: 2026-02-23 09:45:53.921652333 +0000 UTC m=+0.093926922 container health_status be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.43.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20260216, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team)
Feb 23 09:45:53 np0005626463.localdomain podman[293465]: 2026-02-23 09:45:53.969383006 +0000 UTC m=+0.141657605 container exec_died be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, tcib_build_tag=8419493e1fd846703d277695e03fc5eb)
Feb 23 09:45:53 np0005626463.localdomain systemd[1]: be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.service: Deactivated successfully.
Feb 23 09:45:54 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:45:54.338 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:45:54 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:45:54.340 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:45:54 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:45:54.341 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 09:45:54 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:45:54.341 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:45:54 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:45:54.374 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:45:54 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:45:54.374 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:45:56 np0005626463.localdomain ceph-mds[286877]: mds.beacon.mds.np0005626463.qcthuc missed beacon ack from the monitors
Feb 23 09:45:56 np0005626463.localdomain sudo[293484]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 09:45:56 np0005626463.localdomain sudo[293484]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:45:56 np0005626463.localdomain sudo[293484]: pam_unix(sudo:session): session closed for user root
Feb 23 09:45:56 np0005626463.localdomain sudo[293502]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid f1fea371-cb69-578d-a3d0-b5c472a84b46
Feb 23 09:45:56 np0005626463.localdomain sudo[293502]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:45:57 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.
Feb 23 09:45:57 np0005626463.localdomain podman[293545]: 
Feb 23 09:45:57 np0005626463.localdomain podman[293545]: 2026-02-23 09:45:57.269083841 +0000 UTC m=+0.062606792 container create f8164d5b521c5ea5308103c738d39a2e3da0e7c5bf4b6f6d4fb016315b7318a0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eloquent_poincare, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, RELEASE=main, version=7, GIT_CLEAN=True, release=1770267347, architecture=x86_64, vendor=Red Hat, Inc., ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, io.buildah.version=1.42.2, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2026-02-09T10:25:24Z)
Feb 23 09:45:57 np0005626463.localdomain podman[293537]: 2026-02-23 09:45:57.277940052 +0000 UTC m=+0.086029369 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2)
Feb 23 09:45:57 np0005626463.localdomain systemd[1]: Started libpod-conmon-f8164d5b521c5ea5308103c738d39a2e3da0e7c5bf4b6f6d4fb016315b7318a0.scope.
Feb 23 09:45:57 np0005626463.localdomain podman[293537]: 2026-02-23 09:45:57.308236511 +0000 UTC m=+0.116325848 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260216, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.43.0)
Feb 23 09:45:57 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 09:45:57 np0005626463.localdomain systemd[1]: 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully.
Feb 23 09:45:57 np0005626463.localdomain podman[293545]: 2026-02-23 09:45:57.23940161 +0000 UTC m=+0.032924561 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 23 09:45:57 np0005626463.localdomain podman[293545]: 2026-02-23 09:45:57.349891399 +0000 UTC m=+0.143414350 container init f8164d5b521c5ea5308103c738d39a2e3da0e7c5bf4b6f6d4fb016315b7318a0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eloquent_poincare, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph, build-date=2026-02-09T10:25:24Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, io.buildah.version=1.42.2, version=7, RELEASE=main, GIT_BRANCH=main, CEPH_POINT_RELEASE=, org.opencontainers.image.created=2026-02-09T10:25:24Z, distribution-scope=public, ceph=True, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, release=1770267347, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Feb 23 09:45:57 np0005626463.localdomain podman[293545]: 2026-02-23 09:45:57.362507165 +0000 UTC m=+0.156030126 container start f8164d5b521c5ea5308103c738d39a2e3da0e7c5bf4b6f6d4fb016315b7318a0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eloquent_poincare, ceph=True, vcs-type=git, distribution-scope=public, io.openshift.expose-services=, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, release=1770267347, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, GIT_CLEAN=True, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.42.2, RELEASE=main, org.opencontainers.image.created=2026-02-09T10:25:24Z, build-date=2026-02-09T10:25:24Z, architecture=x86_64)
Feb 23 09:45:57 np0005626463.localdomain podman[293545]: 2026-02-23 09:45:57.362910018 +0000 UTC m=+0.156432979 container attach f8164d5b521c5ea5308103c738d39a2e3da0e7c5bf4b6f6d4fb016315b7318a0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eloquent_poincare, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, vendor=Red Hat, Inc., release=1770267347, architecture=x86_64, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, build-date=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, distribution-scope=public, name=rhceph, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.component=rhceph-container, org.opencontainers.image.created=2026-02-09T10:25:24Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.42.2, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, ceph=True, RELEASE=main, version=7, GIT_CLEAN=True, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Feb 23 09:45:57 np0005626463.localdomain eloquent_poincare[293571]: 167 167
Feb 23 09:45:57 np0005626463.localdomain systemd[1]: libpod-f8164d5b521c5ea5308103c738d39a2e3da0e7c5bf4b6f6d4fb016315b7318a0.scope: Deactivated successfully.
Feb 23 09:45:57 np0005626463.localdomain podman[293545]: 2026-02-23 09:45:57.367723806 +0000 UTC m=+0.161246837 container died f8164d5b521c5ea5308103c738d39a2e3da0e7c5bf4b6f6d4fb016315b7318a0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eloquent_poincare, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, ceph=True, vendor=Red Hat, Inc., org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, RELEASE=main, release=1770267347, io.openshift.expose-services=, distribution-scope=public, io.buildah.version=1.42.2, build-date=2026-02-09T10:25:24Z, url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=)
Feb 23 09:45:57 np0005626463.localdomain podman[293577]: 2026-02-23 09:45:57.45949785 +0000 UTC m=+0.082564353 container remove f8164d5b521c5ea5308103c738d39a2e3da0e7c5bf4b6f6d4fb016315b7318a0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eloquent_poincare, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, org.opencontainers.image.created=2026-02-09T10:25:24Z, architecture=x86_64, RELEASE=main, name=rhceph, io.buildah.version=1.42.2, version=7, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, release=1770267347, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2026-02-09T10:25:24Z, GIT_BRANCH=main, distribution-scope=public, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14)
Feb 23 09:45:57 np0005626463.localdomain systemd[1]: libpod-conmon-f8164d5b521c5ea5308103c738d39a2e3da0e7c5bf4b6f6d4fb016315b7318a0.scope: Deactivated successfully.
Feb 23 09:45:57 np0005626463.localdomain sudo[293502]: pam_unix(sudo:session): session closed for user root
Feb 23 09:45:58 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-bd2efb1e912333087bd01200a4454987b2f49d206f6677cded7970d3308eca30-merged.mount: Deactivated successfully.
Feb 23 09:45:59 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:45:59.375 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:45:59 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:45:59.377 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:45:59 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:45:59.378 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 09:45:59 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:45:59.378 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:45:59 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:45:59.416 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:45:59 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:45:59.417 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:46:00 np0005626463.localdomain sshd[293596]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:46:04 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:46:04.418 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:46:04 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:46:04.421 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:46:04 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:46:04.421 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 09:46:04 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:46:04.421 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:46:04 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:46:04.457 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:46:04 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:46:04.458 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:46:07 np0005626463.localdomain sshd[293596]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:46:08 np0005626463.localdomain sudo[293598]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 09:46:08 np0005626463.localdomain sudo[293598]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:46:08 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.
Feb 23 09:46:08 np0005626463.localdomain sudo[293598]: pam_unix(sudo:session): session closed for user root
Feb 23 09:46:08 np0005626463.localdomain sudo[293617]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid f1fea371-cb69-578d-a3d0-b5c472a84b46
Feb 23 09:46:08 np0005626463.localdomain sudo[293617]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:46:08 np0005626463.localdomain systemd[1]: tmp-crun.7bXG0j.mount: Deactivated successfully.
Feb 23 09:46:08 np0005626463.localdomain podman[293616]: 2026-02-23 09:46:08.497992088 +0000 UTC m=+0.111028456 container health_status da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 23 09:46:08 np0005626463.localdomain podman[293616]: 2026-02-23 09:46:08.511327177 +0000 UTC m=+0.124363595 container exec_died da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 23 09:46:08 np0005626463.localdomain systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Deactivated successfully.
Feb 23 09:46:08 np0005626463.localdomain sudo[293672]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 09:46:08 np0005626463.localdomain sudo[293672]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:46:08 np0005626463.localdomain sudo[293672]: pam_unix(sudo:session): session closed for user root
Feb 23 09:46:08 np0005626463.localdomain sudo[293707]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Feb 23 09:46:08 np0005626463.localdomain sudo[293707]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:46:08 np0005626463.localdomain podman[293734]: 
Feb 23 09:46:09 np0005626463.localdomain podman[293734]: 2026-02-23 09:46:09.004056218 +0000 UTC m=+0.062547350 container create 75829b065fc98c2860e1874c9fb5c5d58e2ff0983a73fb8bc44ded5a8ddcd2e3 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eager_torvalds, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.tags=rhceph ceph, version=7, distribution-scope=public, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, com.redhat.component=rhceph-container, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.42.2, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, release=1770267347, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, build-date=2026-02-09T10:25:24Z, ceph=True, name=rhceph, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 23 09:46:09 np0005626463.localdomain systemd[1]: Started libpod-conmon-75829b065fc98c2860e1874c9fb5c5d58e2ff0983a73fb8bc44ded5a8ddcd2e3.scope.
Feb 23 09:46:09 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 09:46:09 np0005626463.localdomain podman[293734]: 2026-02-23 09:46:09.072739744 +0000 UTC m=+0.131230906 container init 75829b065fc98c2860e1874c9fb5c5d58e2ff0983a73fb8bc44ded5a8ddcd2e3 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eager_torvalds, release=1770267347, com.redhat.component=rhceph-container, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2026-02-09T10:25:24Z, io.k8s.description=Red Hat Ceph Storage 7, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.42.2, org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-type=git, GIT_BRANCH=main, CEPH_POINT_RELEASE=, name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, architecture=x86_64, RELEASE=main)
Feb 23 09:46:09 np0005626463.localdomain podman[293734]: 2026-02-23 09:46:08.975590026 +0000 UTC m=+0.034081178 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 23 09:46:09 np0005626463.localdomain podman[293734]: 2026-02-23 09:46:09.083503774 +0000 UTC m=+0.141994936 container start 75829b065fc98c2860e1874c9fb5c5d58e2ff0983a73fb8bc44ded5a8ddcd2e3 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eager_torvalds, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.buildah.version=1.42.2, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1770267347, distribution-scope=public, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., GIT_BRANCH=main, ceph=True, name=rhceph, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.description=Red Hat Ceph Storage 7, version=7, RELEASE=main, build-date=2026-02-09T10:25:24Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, architecture=x86_64, vcs-type=git)
Feb 23 09:46:09 np0005626463.localdomain podman[293734]: 2026-02-23 09:46:09.083790083 +0000 UTC m=+0.142281275 container attach 75829b065fc98c2860e1874c9fb5c5d58e2ff0983a73fb8bc44ded5a8ddcd2e3 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eager_torvalds, name=rhceph, vendor=Red Hat, Inc., RELEASE=main, io.openshift.tags=rhceph ceph, distribution-scope=public, release=1770267347, com.redhat.component=rhceph-container, vcs-type=git, description=Red Hat Ceph Storage 7, io.buildah.version=1.42.2, ceph=True, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2026-02-09T10:25:24Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.created=2026-02-09T10:25:24Z, CEPH_POINT_RELEASE=)
Feb 23 09:46:09 np0005626463.localdomain eager_torvalds[293749]: 167 167
Feb 23 09:46:09 np0005626463.localdomain systemd[1]: libpod-75829b065fc98c2860e1874c9fb5c5d58e2ff0983a73fb8bc44ded5a8ddcd2e3.scope: Deactivated successfully.
Feb 23 09:46:09 np0005626463.localdomain podman[293734]: 2026-02-23 09:46:09.087699074 +0000 UTC m=+0.146190306 container died 75829b065fc98c2860e1874c9fb5c5d58e2ff0983a73fb8bc44ded5a8ddcd2e3 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eager_torvalds, io.openshift.expose-services=, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, name=rhceph, io.buildah.version=1.42.2, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7, build-date=2026-02-09T10:25:24Z, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, RELEASE=main, version=7, org.opencontainers.image.created=2026-02-09T10:25:24Z, release=1770267347, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, ceph=True, distribution-scope=public, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Feb 23 09:46:09 np0005626463.localdomain podman[293756]: 2026-02-23 09:46:09.170454821 +0000 UTC m=+0.073843015 container remove 75829b065fc98c2860e1874c9fb5c5d58e2ff0983a73fb8bc44ded5a8ddcd2e3 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eager_torvalds, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.buildah.version=1.42.2, vcs-type=git, distribution-scope=public, org.opencontainers.image.created=2026-02-09T10:25:24Z, architecture=x86_64, io.openshift.tags=rhceph ceph, name=rhceph, build-date=2026-02-09T10:25:24Z, release=1770267347, version=7, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., ceph=True, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_BRANCH=main)
Feb 23 09:46:09 np0005626463.localdomain systemd[1]: libpod-conmon-75829b065fc98c2860e1874c9fb5c5d58e2ff0983a73fb8bc44ded5a8ddcd2e3.scope: Deactivated successfully.
Feb 23 09:46:09 np0005626463.localdomain podman[293772]: 
Feb 23 09:46:09 np0005626463.localdomain podman[293772]: 2026-02-23 09:46:09.284023154 +0000 UTC m=+0.073508896 container create cffe5b2c38b80ff4aaf970f7d508b10080b63fec6f6dc8d0e0fb30efc6a0345f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=loving_lamport, release=1770267347, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, distribution-scope=public, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, architecture=x86_64, com.redhat.component=rhceph-container, GIT_CLEAN=True, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, name=rhceph, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2026-02-09T10:25:24Z, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.42.2)
Feb 23 09:46:09 np0005626463.localdomain systemd[1]: Started libpod-conmon-cffe5b2c38b80ff4aaf970f7d508b10080b63fec6f6dc8d0e0fb30efc6a0345f.scope.
Feb 23 09:46:09 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 09:46:09 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1fa2e38c3824c45d6dbbfdbfe881799379db94eb4530d05812357da5a79bbb7d/merged/tmp/config supports timestamps until 2038 (0x7fffffff)
Feb 23 09:46:09 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1fa2e38c3824c45d6dbbfdbfe881799379db94eb4530d05812357da5a79bbb7d/merged/tmp/keyring supports timestamps until 2038 (0x7fffffff)
Feb 23 09:46:09 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1fa2e38c3824c45d6dbbfdbfe881799379db94eb4530d05812357da5a79bbb7d/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Feb 23 09:46:09 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1fa2e38c3824c45d6dbbfdbfe881799379db94eb4530d05812357da5a79bbb7d/merged/var/lib/ceph/mon/ceph-np0005626463 supports timestamps until 2038 (0x7fffffff)
Feb 23 09:46:09 np0005626463.localdomain podman[293772]: 2026-02-23 09:46:09.329064296 +0000 UTC m=+0.118550028 container init cffe5b2c38b80ff4aaf970f7d508b10080b63fec6f6dc8d0e0fb30efc6a0345f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=loving_lamport, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, io.buildah.version=1.42.2, com.redhat.component=rhceph-container, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, build-date=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, RELEASE=main, vcs-type=git, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, release=1770267347)
Feb 23 09:46:09 np0005626463.localdomain podman[293772]: 2026-02-23 09:46:09.337142404 +0000 UTC m=+0.126628136 container start cffe5b2c38b80ff4aaf970f7d508b10080b63fec6f6dc8d0e0fb30efc6a0345f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=loving_lamport, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, GIT_BRANCH=main, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2026-02-09T10:25:24Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, architecture=x86_64, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, RELEASE=main, vcs-type=git, release=1770267347, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=7, distribution-scope=public, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.42.2, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7)
Feb 23 09:46:09 np0005626463.localdomain podman[293772]: 2026-02-23 09:46:09.337386381 +0000 UTC m=+0.126872123 container attach cffe5b2c38b80ff4aaf970f7d508b10080b63fec6f6dc8d0e0fb30efc6a0345f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=loving_lamport, ceph=True, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.created=2026-02-09T10:25:24Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, build-date=2026-02-09T10:25:24Z, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=7, distribution-scope=public, com.redhat.component=rhceph-container, GIT_BRANCH=main, release=1770267347, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, io.buildah.version=1.42.2, GIT_CLEAN=True)
Feb 23 09:46:09 np0005626463.localdomain podman[293772]: 2026-02-23 09:46:09.253539889 +0000 UTC m=+0.043025621 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 23 09:46:09 np0005626463.localdomain podman[242954]: time="2026-02-23T09:46:09Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 23 09:46:09 np0005626463.localdomain podman[242954]: @ - - [23/Feb/2026:09:46:09 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155222 "" "Go-http-client/1.1"
Feb 23 09:46:09 np0005626463.localdomain systemd[1]: libpod-cffe5b2c38b80ff4aaf970f7d508b10080b63fec6f6dc8d0e0fb30efc6a0345f.scope: Deactivated successfully.
Feb 23 09:46:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:46:09.459 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:46:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:46:09.462 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:46:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:46:09.462 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 09:46:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:46:09.463 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:46:09 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-8c3ef492b3e983313cf8b1437f5b28f51c387bcf24144573a1df52680e91fef4-merged.mount: Deactivated successfully.
Feb 23 09:46:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:46:09.515 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:46:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:46:09.516 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:46:09 np0005626463.localdomain podman[293772]: 2026-02-23 09:46:09.563512265 +0000 UTC m=+0.352997997 container died cffe5b2c38b80ff4aaf970f7d508b10080b63fec6f6dc8d0e0fb30efc6a0345f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=loving_lamport, build-date=2026-02-09T10:25:24Z, vcs-type=git, version=7, RELEASE=main, architecture=x86_64, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, GIT_BRANCH=main, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.42.2, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1770267347, io.openshift.tags=rhceph ceph, org.opencontainers.image.created=2026-02-09T10:25:24Z, CEPH_POINT_RELEASE=)
Feb 23 09:46:09 np0005626463.localdomain podman[242954]: @ - - [23/Feb/2026:09:46:09 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18085 "" "Go-http-client/1.1"
Feb 23 09:46:09 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-1fa2e38c3824c45d6dbbfdbfe881799379db94eb4530d05812357da5a79bbb7d-merged.mount: Deactivated successfully.
Feb 23 09:46:09 np0005626463.localdomain podman[293836]: 2026-02-23 09:46:09.702158708 +0000 UTC m=+0.272773317 container remove cffe5b2c38b80ff4aaf970f7d508b10080b63fec6f6dc8d0e0fb30efc6a0345f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=loving_lamport, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.42.2, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., distribution-scope=public, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, release=1770267347, GIT_BRANCH=main, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, version=7, org.opencontainers.image.created=2026-02-09T10:25:24Z, ceph=True)
Feb 23 09:46:09 np0005626463.localdomain systemd[1]: libpod-conmon-cffe5b2c38b80ff4aaf970f7d508b10080b63fec6f6dc8d0e0fb30efc6a0345f.scope: Deactivated successfully.
Feb 23 09:46:09 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 09:46:09 np0005626463.localdomain systemd-sysv-generator[293892]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 09:46:09 np0005626463.localdomain systemd-rc-local-generator[293888]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 09:46:09 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:46:09 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 23 09:46:09 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:46:09 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:46:09 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 09:46:09 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 23 09:46:09 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:46:09 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:46:09 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:46:10 np0005626463.localdomain systemd[1]: Reloading.
Feb 23 09:46:10 np0005626463.localdomain systemd-rc-local-generator[293957]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 09:46:10 np0005626463.localdomain systemd-sysv-generator[293964]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 09:46:10 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:46:10 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 23 09:46:10 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:46:10 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:46:10 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 09:46:10 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 23 09:46:10 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:46:10 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:46:10 np0005626463.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 09:46:10 np0005626463.localdomain podman[293968]: 2026-02-23 09:46:10.353352178 +0000 UTC m=+0.082442249 container exec fdf07215f0388d0ebc44f1f3744080ba594441e647c300d0dade62ff5beba234 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-crash-np0005626463, ceph=True, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, name=rhceph, release=1770267347, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph, org.opencontainers.image.created=2026-02-09T10:25:24Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_CLEAN=True, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, architecture=x86_64, io.buildah.version=1.42.2, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, CEPH_POINT_RELEASE=, build-date=2026-02-09T10:25:24Z, GIT_BRANCH=main, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7)
Feb 23 09:46:10 np0005626463.localdomain podman[293968]: 2026-02-23 09:46:10.458680208 +0000 UTC m=+0.187770299 container exec_died fdf07215f0388d0ebc44f1f3744080ba594441e647c300d0dade62ff5beba234 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-crash-np0005626463, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, vcs-type=git, release=1770267347, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.buildah.version=1.42.2, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-02-09T10:25:24Z, CEPH_POINT_RELEASE=, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., io.openshift.expose-services=, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, name=rhceph)
Feb 23 09:46:10 np0005626463.localdomain systemd[1]: Starting Ceph mon.np0005626463 for f1fea371-cb69-578d-a3d0-b5c472a84b46...
Feb 23 09:46:10 np0005626463.localdomain podman[294100]: 
Feb 23 09:46:10 np0005626463.localdomain podman[294100]: 2026-02-23 09:46:10.928706843 +0000 UTC m=+0.207946768 container create a517a74ed21c459483d3bfd4abd622efb0a723f6a4c7629a8c105935a56ca753 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mon-np0005626463, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2026-02-09T10:25:24Z, name=rhceph, release=1770267347, RELEASE=main, io.buildah.version=1.42.2, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, version=7, io.openshift.expose-services=, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., ceph=True, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, org.opencontainers.image.created=2026-02-09T10:25:24Z)
Feb 23 09:46:10 np0005626463.localdomain podman[294100]: 2026-02-23 09:46:10.849150943 +0000 UTC m=+0.128390858 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 23 09:46:10 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9d196356613b7f1b76af777d48c9c7870d1fb59f6e36b3b4e9bad5f0ed385b1e/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Feb 23 09:46:10 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9d196356613b7f1b76af777d48c9c7870d1fb59f6e36b3b4e9bad5f0ed385b1e/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Feb 23 09:46:10 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9d196356613b7f1b76af777d48c9c7870d1fb59f6e36b3b4e9bad5f0ed385b1e/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Feb 23 09:46:10 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9d196356613b7f1b76af777d48c9c7870d1fb59f6e36b3b4e9bad5f0ed385b1e/merged/var/lib/ceph/mon/ceph-np0005626463 supports timestamps until 2038 (0x7fffffff)
Feb 23 09:46:10 np0005626463.localdomain podman[294100]: 2026-02-23 09:46:10.98796053 +0000 UTC m=+0.267200445 container init a517a74ed21c459483d3bfd4abd622efb0a723f6a4c7629a8c105935a56ca753 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mon-np0005626463, distribution-scope=public, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, release=1770267347, build-date=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.buildah.version=1.42.2, name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, version=7, architecture=x86_64, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_BRANCH=main, RELEASE=main, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=)
Feb 23 09:46:10 np0005626463.localdomain podman[294100]: 2026-02-23 09:46:10.996653607 +0000 UTC m=+0.275893522 container start a517a74ed21c459483d3bfd4abd622efb0a723f6a4c7629a8c105935a56ca753 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mon-np0005626463, distribution-scope=public, build-date=2026-02-09T10:25:24Z, vcs-type=git, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, GIT_CLEAN=True, release=1770267347, io.buildah.version=1.42.2, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vendor=Red Hat, Inc., ceph=True, io.openshift.tags=rhceph ceph, name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main)
Feb 23 09:46:10 np0005626463.localdomain bash[294100]: a517a74ed21c459483d3bfd4abd622efb0a723f6a4c7629a8c105935a56ca753
Feb 23 09:46:11 np0005626463.localdomain systemd[1]: Started Ceph mon.np0005626463 for f1fea371-cb69-578d-a3d0-b5c472a84b46.
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: set uid:gid to 167:167 (ceph:ceph)
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: ceph version 18.2.1-381.el9cp (984f410e2a30899deb131725765b62212b1621db) reef (stable), process ceph-mon, pid 2
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: pidfile_write: ignore empty --pid-file
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: load: jerasure load: lrc 
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb: RocksDB version: 7.9.2
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb: Git sha 0
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb: Compile date 2026-02-06 00:00:00
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb: DB SUMMARY
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb: DB Session ID:  66DAQ76CBLV8DSGL8JC7
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb: CURRENT file:  CURRENT
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb: IDENTITY file:  IDENTITY
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb: MANIFEST file:  MANIFEST-000005 size: 59 Bytes
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb: SST files in /var/lib/ceph/mon/ceph-np0005626463/store.db dir, Total Num: 0, files: 
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb: Write Ahead Log file in /var/lib/ceph/mon/ceph-np0005626463/store.db: 000004.log size: 886 ; 
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:                         Options.error_if_exists: 0
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:                       Options.create_if_missing: 0
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:                         Options.paranoid_checks: 1
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:             Options.flush_verify_memtable_count: 1
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:                                     Options.env: 0x5609fa701a20
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:                                      Options.fs: PosixFileSystem
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:                                Options.info_log: 0x5609fbabcd20
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:                Options.max_file_opening_threads: 16
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:                              Options.statistics: (nil)
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:                               Options.use_fsync: 0
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:                       Options.max_log_file_size: 0
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:                   Options.log_file_time_to_roll: 0
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:                       Options.keep_log_file_num: 1000
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:                    Options.recycle_log_file_num: 0
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:                         Options.allow_fallocate: 1
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:                        Options.allow_mmap_reads: 0
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:                       Options.allow_mmap_writes: 0
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:                        Options.use_direct_reads: 0
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:          Options.create_missing_column_families: 0
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:                              Options.db_log_dir: 
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:                                 Options.wal_dir: 
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:                Options.table_cache_numshardbits: 6
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:                         Options.WAL_ttl_seconds: 0
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:                       Options.WAL_size_limit_MB: 0
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:             Options.manifest_preallocation_size: 4194304
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:                     Options.is_fd_close_on_exec: 1
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:                   Options.advise_random_on_open: 1
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:                    Options.db_write_buffer_size: 0
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:                    Options.write_buffer_manager: 0x5609fbacd540
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:         Options.access_hint_on_compaction_start: 1
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:                      Options.use_adaptive_mutex: 0
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:                            Options.rate_limiter: (nil)
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:                       Options.wal_recovery_mode: 2
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:                  Options.enable_thread_tracking: 0
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:                  Options.enable_pipelined_write: 0
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:                  Options.unordered_write: 0
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:             Options.write_thread_max_yield_usec: 100
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:                               Options.row_cache: None
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:                              Options.wal_filter: None
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:             Options.avoid_flush_during_recovery: 0
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:             Options.allow_ingest_behind: 0
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:             Options.two_write_queues: 0
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:             Options.manual_wal_flush: 0
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:             Options.wal_compression: 0
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:             Options.atomic_flush: 0
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:                 Options.persist_stats_to_disk: 0
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:                 Options.write_dbid_to_manifest: 0
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:                 Options.log_readahead_size: 0
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:                 Options.best_efforts_recovery: 0
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:             Options.allow_data_in_errors: 0
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:             Options.db_host_id: __hostname__
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:             Options.enforce_single_del_contracts: true
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:             Options.max_background_jobs: 2
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:             Options.max_background_compactions: -1
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:             Options.max_subcompactions: 1
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:           Options.writable_file_max_buffer_size: 1048576
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:             Options.delayed_write_rate : 16777216
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:             Options.max_total_wal_size: 0
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:                   Options.stats_dump_period_sec: 600
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:                 Options.stats_persist_period_sec: 600
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:                          Options.max_open_files: -1
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:                          Options.bytes_per_sync: 0
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:                      Options.wal_bytes_per_sync: 0
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:                   Options.strict_bytes_per_sync: 0
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:       Options.compaction_readahead_size: 0
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:                  Options.max_background_flushes: -1
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb: Compression algorithms supported:
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:         kZSTD supported: 0
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:         kXpressCompression supported: 0
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:         kBZip2Compression supported: 0
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:         kZSTDNotFinalCompression supported: 0
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:         kLZ4Compression supported: 1
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:         kZlibCompression supported: 1
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:         kLZ4HCCompression supported: 1
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:         kSnappyCompression supported: 1
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb: Fast CRC32 supported: Supported on x86
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb: DMutex implementation: pthread_mutex_t
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: /var/lib/ceph/mon/ceph-np0005626463/store.db/MANIFEST-000005
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:           Options.merge_operator: 
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:        Options.compaction_filter: None
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:        Options.compaction_filter_factory: None
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:  Options.sst_partitioner_factory: None
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:            Options.table_factory: BlockBasedTable
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5609fbabc980)
                                                             cache_index_and_filter_blocks: 1
                                                             cache_index_and_filter_blocks_with_high_priority: 0
                                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                                             pin_top_level_index_and_filter: 1
                                                             index_type: 0
                                                             data_block_index_type: 0
                                                             index_shortening: 1
                                                             data_block_hash_table_util_ratio: 0.750000
                                                             checksum: 4
                                                             no_block_cache: 0
                                                             block_cache: 0x5609fbab9350
                                                             block_cache_name: BinnedLRUCache
                                                             block_cache_options:
                                                               capacity : 536870912
                                                               num_shard_bits : 4
                                                               strict_capacity_limit : 0
                                                               high_pri_pool_ratio: 0.000
                                                             block_cache_compressed: (nil)
                                                             persistent_cache: (nil)
                                                             block_size: 4096
                                                             block_size_deviation: 10
                                                             block_restart_interval: 16
                                                             index_block_restart_interval: 1
                                                             metadata_block_size: 4096
                                                             partition_filters: 0
                                                             use_delta_encoding: 1
                                                             filter_policy: bloomfilter
                                                             whole_key_filtering: 1
                                                             verify_compression: 0
                                                             read_amp_bytes_per_bit: 0
                                                             format_version: 5
                                                             enable_index_compression: 1
                                                             block_align: 0
                                                             max_auto_readahead_size: 262144
                                                             prepopulate_block_cache: 0
                                                             initial_auto_readahead_size: 8192
                                                             num_file_reads_for_auto_readahead: 2
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:        Options.write_buffer_size: 33554432
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:  Options.max_write_buffer_number: 2
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:          Options.compression: NoCompression
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:                  Options.bottommost_compression: Disabled
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:       Options.prefix_extractor: nullptr
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:             Options.num_levels: 7
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:        Options.min_write_buffer_number_to_merge: 1
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:            Options.compression_opts.window_bits: -14
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:                  Options.compression_opts.level: 32767
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:               Options.compression_opts.strategy: 0
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:                  Options.compression_opts.enabled: false
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:      Options.level0_file_num_compaction_trigger: 4
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:                   Options.target_file_size_base: 67108864
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:             Options.target_file_size_multiplier: 1
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:                Options.max_bytes_for_level_base: 268435456
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb: Options.level_compaction_dynamic_level_bytes: 1
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:          Options.max_bytes_for_level_multiplier: 10.000000
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:                        Options.arena_block_size: 1048576
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:                Options.disable_auto_compactions: 0
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:                   Options.table_properties_collectors: 
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:                   Options.inplace_update_support: 0
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:   Options.memtable_huge_page_size: 0
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:                           Options.bloom_locality: 0
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:                    Options.max_successive_merges: 0
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:                Options.paranoid_file_checks: 0
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:                Options.force_consistency_checks: 1
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:                Options.report_bg_io_stats: 0
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:                               Options.ttl: 2592000
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:                       Options.enable_blob_files: false
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:                           Options.min_blob_size: 0
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:                          Options.blob_file_size: 268435456
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb:                Options.blob_file_starting_level: 0
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:/var/lib/ceph/mon/ceph-np0005626463/store.db/MANIFEST-000005 succeeded,manifest_file_number is 5, next_file_number is 7, last_sequence is 0, log_number is 0,prev_log_number is 0,max_column_family is 0,min_log_number_to_keep is 0
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 0
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: 4cfd6c8f-aafa-4003-b2f6-d22c49635dd4
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771839971046733, "job": 1, "event": "recovery_started", "wal_files": [4]}
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #4 mode 2
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771839971050071, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 8, "file_size": 2012, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 1, "largest_seqno": 5, "table_properties": {"data_size": 898, "index_size": 31, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 115, "raw_average_key_size": 23, "raw_value_size": 776, "raw_average_value_size": 155, "num_data_blocks": 1, "num_entries": 5, "num_filter_entries": 5, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771839971, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4cfd6c8f-aafa-4003-b2f6-d22c49635dd4", "db_session_id": "66DAQ76CBLV8DSGL8JC7", "orig_file_number": 8, "seqno_to_time_mapping": "N/A"}}
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771839971050180, "job": 1, "event": "recovery_finished"}
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/version_set.cc:5047] Creating manifest 10
Feb 23 09:46:11 np0005626463.localdomain sudo[293617]: pam_unix(sudo:session): session closed for user root
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005626463/store.db/000004.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x5609fbae0e00
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb: DB pointer 0x5609fbbd6000
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463 does not exist in monmap, will attempt to join an existing cluster
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                           ** DB Stats **
                                                           Uptime(secs): 0.0 total, 0.0 interval
                                                           Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s
                                                           Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                           Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                           Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                                           
                                                           ** Compaction Stats [default] **
                                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                             L0      1/0    1.96 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.6      0.00              0.00         1    0.003       0      0       0.0       0.0
                                                            Sum      1/0    1.96 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.6      0.00              0.00         1    0.003       0      0       0.0       0.0
                                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.6      0.00              0.00         1    0.003       0      0       0.0       0.0
                                                           
                                                           ** Compaction Stats [default] **
                                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.6      0.00              0.00         1    0.003       0      0       0.0       0.0
                                                           
                                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                           
                                                           Uptime(secs): 0.0 total, 0.0 interval
                                                           Flush(GB): cumulative 0.000, interval 0.000
                                                           AddFile(GB): cumulative 0.000, interval 0.000
                                                           AddFile(Total Files): cumulative 0, interval 0
                                                           AddFile(L0 Files): cumulative 0, interval 0
                                                           AddFile(Keys): cumulative 0, interval 0
                                                           Cumulative compaction: 0.00 GB write, 0.14 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                           Interval compaction: 0.00 GB write, 0.14 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                           Block cache BinnedLRUCache@0x5609fbab9350#2 capacity: 512.00 MB usage: 1.30 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 0 last_secs: 3.5e-05 secs_since: 0
                                                           Block cache entry stats(count,size,portion): DataBlock(1,1.08 KB,0.000205636%) FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.11 KB,2.08616e-05%) Misc(1,0.00 KB,0%)
                                                           
                                                           ** File Read Latency Histogram By Level [default] **
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: using public_addr v2:172.18.0.103:0/0 -> [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0]
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: starting mon.np0005626463 rank -1 at public addrs [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] at bind addrs [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon_data /var/lib/ceph/mon/ceph-np0005626463 fsid f1fea371-cb69-578d-a3d0-b5c472a84b46
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@-1(???) e0 preinit fsid f1fea371-cb69-578d-a3d0-b5c472a84b46
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@-1(synchronizing) e8 sync_obtain_latest_monmap
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@-1(synchronizing) e8 sync_obtain_latest_monmap obtained monmap e8
Feb 23 09:46:11 np0005626463.localdomain sudo[293707]: pam_unix(sudo:session): session closed for user root
Feb 23 09:46:11 np0005626463.localdomain sudo[294201]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 09:46:11 np0005626463.localdomain sudo[294201]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:46:11 np0005626463.localdomain sudo[294201]: pam_unix(sudo:session): session closed for user root
Feb 23 09:46:11 np0005626463.localdomain sudo[294219]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 23 09:46:11 np0005626463.localdomain sudo[294219]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@-1(synchronizing).mds e17 new map
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@-1(synchronizing).mds e17 print_map
                                                           e17
                                                           enable_multiple, ever_enabled_multiple: 1,1
                                                           default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,12=quiesce subvolumes}
                                                           legacy client fscid: 1
                                                            
                                                           Filesystem 'cephfs' (1)
                                                           fs_name        cephfs
                                                           epoch        16
                                                           flags        12 joinable allow_snaps allow_multimds_snaps
                                                           created        2026-02-23T07:57:46.097663+0000
                                                           modified        2026-02-23T09:43:29.529267+0000
                                                           tableserver        0
                                                           root        0
                                                           session_timeout        60
                                                           session_autoclose        300
                                                           max_file_size        1099511627776
                                                           required_client_features        {}
                                                           last_failure        0
                                                           last_failure_osd_epoch        79
                                                           compat        compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,12=quiesce subvolumes}
                                                           max_mds        1
                                                           in        0
                                                           up        {0=26518}
                                                           failed        
                                                           damaged        
                                                           stopped        
                                                           data_pools        [6]
                                                           metadata_pool        7
                                                           inline_data        disabled
                                                           balancer        
                                                           bal_rank_mask        -1
                                                           standby_count_wanted        1
                                                           qdb_cluster        leader: 26518 members: 26518
                                                           [mds.mds.np0005626463.qcthuc{0:26518} state up:active seq 13 addr [v2:172.18.0.106:6808/2515508693,v1:172.18.0.106:6809/2515508693] compat {c=[1],r=[1],i=[17ff]}]
                                                            
                                                            
                                                           Standby daemons:
                                                            
                                                           [mds.mds.np0005626465.drvnoy{-1:26498} state up:standby seq 1 addr [v2:172.18.0.107:6808/2939113664,v1:172.18.0.107:6809/2939113664] compat {c=[1],r=[1],i=[17ff]}]
                                                           [mds.mds.np0005626466.vaywlp{-1:26506} state up:standby seq 1 addr [v2:172.18.0.108:6808/2035422599,v1:172.18.0.108:6809/2035422599] compat {c=[1],r=[1],i=[17ff]}]
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@-1(synchronizing).osd e81 crush map has features 3314933000854323200, adjusting msgr requires
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@-1(synchronizing).osd e81 crush map has features 432629239337189376, adjusting msgr requires
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@-1(synchronizing).osd e81 crush map has features 432629239337189376, adjusting msgr requires
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@-1(synchronizing).osd e81 crush map has features 432629239337189376, adjusting msgr requires
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: Reconfiguring crash.np0005626461 (monmap changed)...
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: Reconfiguring daemon crash.np0005626461 on np0005626461.localdomain
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626463", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626463", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='client.17280 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: pgmap v10: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail; 19 KiB/s rd, 0 B/s wr, 10 op/s
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: Reconfiguring crash.np0005626463 (monmap changed)...
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: Reconfiguring daemon crash.np0005626463 on np0005626463.localdomain
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.107:0/3622865160' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.108:0/4034397128' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: Reconfiguring osd.2 (monmap changed)...
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: Reconfiguring daemon osd.2 on np0005626463.localdomain
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.107:0/1059612398' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: pgmap v11: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail; 17 KiB/s rd, 0 B/s wr, 9 op/s
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.108:0/1266158925' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: Reconfiguring osd.5 (monmap changed)...
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: Reconfiguring daemon osd.5 on np0005626463.localdomain
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='client.17328 -' entity='client.admin' cmd=[{"prefix": "orch ps", "daemon_type": "mon", "daemon_id": "np0005626459", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: mon.np0005626466 calling monitor election
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: mon.np0005626461 calling monitor election
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: mon.np0005626460 calling monitor election
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463 calling monitor election
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: mon.np0005626461 is new leader, mons np0005626461,np0005626460,np0005626466,np0005626463 in quorum (ranks 0,1,2,4)
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: monmap epoch 7
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: fsid f1fea371-cb69-578d-a3d0-b5c472a84b46
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: last_changed 2026-02-23T09:45:12.813169+0000
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: created 2026-02-23T07:36:01.997603+0000
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: min_mon_release 18 (reef)
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: election_strategy: 1
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: 0: [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] mon.np0005626461
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: 1: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005626460
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: 2: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005626466
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: 3: [v2:172.18.0.107:3300/0,v1:172.18.0.107:6789/0] mon.np0005626465
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: 4: [v2:172.18.0.106:3300/0,v1:172.18.0.106:6789/0] mon.np0005626463
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: fsmap cephfs:1 {0=mds.np0005626463.qcthuc=up:active} 2 up:standby
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: osdmap e81: 6 total, 6 up, 6 in
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: mgrmap e18: np0005626461.lrfquh(active, since 20s), standbys: np0005626460.fyrady, np0005626465.hlpkwo, np0005626463.wtksup, np0005626466.nisqfq, np0005626459.pmtxxl
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: Health check failed: 1/5 mons down, quorum np0005626461,np0005626460,np0005626466,np0005626463 (MON_DOWN)
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: Health detail: HEALTH_WARN 1/5 mons down, quorum np0005626461,np0005626460,np0005626466,np0005626463
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: [WRN] MON_DOWN: 1/5 mons down, quorum np0005626461,np0005626460,np0005626466,np0005626463
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]:     mon.np0005626465 (rank 3) addr [v2:172.18.0.107:3300/0,v1:172.18.0.107:6789/0] is down (out of quorum)
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005626463.qcthuc", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: Reconfiguring daemon mds.mds.np0005626463.qcthuc on np0005626463.localdomain
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='client.17334 -' entity='client.admin' cmd=[{"prefix": "orch daemon rm", "names": ["mon.np0005626459"], "force": true, "target": ["mon-mgr", ""]}]: dispatch
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: Remove daemons mon.np0005626459
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "quorum_status"} : dispatch
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: Safe to remove mon.np0005626459: new quorum should be ['np0005626461', 'np0005626460', 'np0005626466', 'np0005626465', 'np0005626463'] (from ['np0005626461', 'np0005626460', 'np0005626466', 'np0005626465', 'np0005626463'])
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: Removing monitor np0005626459 from monmap...
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "mon rm", "name": "np0005626459"} : dispatch
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: Removing daemon mon.np0005626459 from np0005626459.localdomain -- ports []
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "mon metadata", "id": "np0005626460"} : dispatch
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: mon.np0005626465 calling monitor election
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "mon metadata", "id": "np0005626461"} : dispatch
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "mon metadata", "id": "np0005626463"} : dispatch
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "mon metadata", "id": "np0005626465"} : dispatch
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "mon metadata", "id": "np0005626466"} : dispatch
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: pgmap v12: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail; 17 KiB/s rd, 0 B/s wr, 9 op/s
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: pgmap v13: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail; 17 KiB/s rd, 0 B/s wr, 9 op/s
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: pgmap v14: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: mon.np0005626460 calling monitor election
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: mon.np0005626461 calling monitor election
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: mon.np0005626466 calling monitor election
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: mon.np0005626461 is new leader, mons np0005626461,np0005626460,np0005626466,np0005626465,np0005626463 in quorum (ranks 0,1,2,3,4)
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: monmap epoch 7
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: fsid f1fea371-cb69-578d-a3d0-b5c472a84b46
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: last_changed 2026-02-23T09:45:12.813169+0000
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: created 2026-02-23T07:36:01.997603+0000
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: min_mon_release 18 (reef)
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: election_strategy: 1
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: 0: [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] mon.np0005626461
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: 1: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005626460
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: 2: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005626466
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: 3: [v2:172.18.0.107:3300/0,v1:172.18.0.107:6789/0] mon.np0005626465
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: 4: [v2:172.18.0.106:3300/0,v1:172.18.0.106:6789/0] mon.np0005626463
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: fsmap cephfs:1 {0=mds.np0005626463.qcthuc=up:active} 2 up:standby
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: osdmap e81: 6 total, 6 up, 6 in
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: mgrmap e18: np0005626461.lrfquh(active, since 21s), standbys: np0005626460.fyrady, np0005626465.hlpkwo, np0005626463.wtksup, np0005626466.nisqfq, np0005626459.pmtxxl
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: Health check cleared: MON_DOWN (was: 1/5 mons down, quorum np0005626461,np0005626460,np0005626466,np0005626463)
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: Cluster is now healthy
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: overall HEALTH_OK
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626463.wtksup", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626463.wtksup", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: Reconfiguring mgr.np0005626463.wtksup (monmap changed)...
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "mgr services"} : dispatch
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: Reconfiguring daemon mgr.np0005626463.wtksup on np0005626463.localdomain
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: pgmap v15: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.106:0/3246610862' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: Reconfiguring mon.np0005626463 (monmap changed)...
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: Reconfiguring daemon mon.np0005626463 on np0005626463.localdomain
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='client.26785 -' entity='client.admin' cmd=[{"prefix": "orch host label rm", "hostname": "np0005626459.localdomain", "label": "mon", "target": ["mon-mgr", ""]}]: dispatch
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: Removed label mon from host np0005626459.localdomain
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: Reconfiguring crash.np0005626465 (monmap changed)...
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626465", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626465", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: Reconfiguring daemon crash.np0005626465 on np0005626465.localdomain
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: pgmap v16: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: Reconfiguring osd.0 (monmap changed)...
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: Reconfiguring daemon osd.0 on np0005626465.localdomain
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='client.26731 -' entity='client.admin' cmd=[{"prefix": "orch host label rm", "hostname": "np0005626459.localdomain", "label": "mgr", "target": ["mon-mgr", ""]}]: dispatch
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: Removed label mgr from host np0005626459.localdomain
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: Reconfiguring osd.3 (monmap changed)...
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: Reconfiguring daemon osd.3 on np0005626465.localdomain
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005626465.drvnoy", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005626465.drvnoy", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='client.34158 -' entity='client.admin' cmd=[{"prefix": "orch host label rm", "hostname": "np0005626459.localdomain", "label": "_admin", "target": ["mon-mgr", ""]}]: dispatch
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: Removed label _admin from host np0005626459.localdomain
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: pgmap v17: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: Reconfiguring mds.mds.np0005626465.drvnoy (monmap changed)...
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: Reconfiguring daemon mds.mds.np0005626465.drvnoy on np0005626465.localdomain
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626465.hlpkwo", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626465.hlpkwo", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "mgr services"} : dispatch
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: Reconfiguring mgr.np0005626465.hlpkwo (monmap changed)...
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: Reconfiguring daemon mgr.np0005626465.hlpkwo on np0005626465.localdomain
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: pgmap v18: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: Reconfiguring mon.np0005626465 (monmap changed)...
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: Reconfiguring daemon mon.np0005626465 on np0005626465.localdomain
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626466", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626466", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: Reconfiguring crash.np0005626466 (monmap changed)...
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: Reconfiguring daemon crash.np0005626466 on np0005626466.localdomain
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: Reconfiguring osd.1 (monmap changed)...
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: Reconfiguring daemon osd.1 on np0005626466.localdomain
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: pgmap v19: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: Reconfiguring osd.4 (monmap changed)...
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: Reconfiguring daemon osd.4 on np0005626466.localdomain
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005626466.vaywlp", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005626466.vaywlp", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: Reconfiguring mds.mds.np0005626466.vaywlp (monmap changed)...
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: Reconfiguring daemon mds.mds.np0005626466.vaywlp on np0005626466.localdomain
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: pgmap v20: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: Reconfiguring mgr.np0005626466.nisqfq (monmap changed)...
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626466.nisqfq", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626466.nisqfq", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "mgr services"} : dispatch
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: Reconfiguring daemon mgr.np0005626466.nisqfq on np0005626466.localdomain
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: Reconfiguring mon.np0005626466 (monmap changed)...
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: Reconfiguring daemon mon.np0005626466 on np0005626466.localdomain
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: pgmap v21: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: Removing np0005626459.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: Updating np0005626460.localdomain:/etc/ceph/ceph.conf
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: Updating np0005626461.localdomain:/etc/ceph/ceph.conf
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: Updating np0005626463.localdomain:/etc/ceph/ceph.conf
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: Updating np0005626465.localdomain:/etc/ceph/ceph.conf
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: Updating np0005626466.localdomain:/etc/ceph/ceph.conf
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: Removing np0005626459.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: pgmap v22: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: Removing np0005626459.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.client.admin.keyring
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: Updating np0005626466.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: Updating np0005626465.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: Updating np0005626463.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: Updating np0005626461.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: Updating np0005626460.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='client.26746 -' entity='client.admin' cmd=[{"prefix": "orch host drain", "hostname": "np0005626459.localdomain", "target": ["mon-mgr", ""]}]: dispatch
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: Added label _no_schedule to host np0005626459.localdomain
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: Added label SpecialHostLabels.DRAIN_CONF_KEYRING to host np0005626459.localdomain
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: Removing daemon crash.np0005626459 from np0005626459.localdomain -- ports []
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: pgmap v23: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='client.34168 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "host_pattern": "np0005626459.localdomain", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth rm", "entity": "client.crash.np0005626459"} : dispatch
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth rm", "entity": "client.crash.np0005626459"} : dispatch
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' cmd='[{"prefix": "auth rm", "entity": "client.crash.np0005626459"}]': finished
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: Removing key for client.crash.np0005626459
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: Removing daemon mgr.np0005626459.pmtxxl from np0005626459.localdomain -- ports [9283, 8765]
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005626459.localdomain"} : dispatch
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005626459.localdomain"} : dispatch
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005626459.localdomain"}]': finished
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth rm", "entity": "mgr.np0005626459.pmtxxl"} : dispatch
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth rm", "entity": "mgr.np0005626459.pmtxxl"} : dispatch
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' cmd='[{"prefix": "auth rm", "entity": "mgr.np0005626459.pmtxxl"}]': finished
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: pgmap v24: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='client.26810 -' entity='client.admin' cmd=[{"prefix": "orch host rm", "hostname": "np0005626459.localdomain", "force": true, "target": ["mon-mgr", ""]}]: dispatch
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: Removed host np0005626459.localdomain
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: Removing key for mgr.np0005626459.pmtxxl
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626460", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626460", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: Reconfiguring crash.np0005626460 (monmap changed)...
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: Reconfiguring daemon crash.np0005626460 on np0005626460.localdomain
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: pgmap v25: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: Reconfiguring mon.np0005626460 (monmap changed)...
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: Reconfiguring daemon mon.np0005626460 on np0005626460.localdomain
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626460.fyrady", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626460.fyrady", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "mgr services"} : dispatch
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: Reconfiguring mgr.np0005626460.fyrady (monmap changed)...
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: Reconfiguring daemon mgr.np0005626460.fyrady on np0005626460.localdomain
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: Reconfiguring mon.np0005626461 (monmap changed)...
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: Reconfiguring daemon mon.np0005626461 on np0005626461.localdomain
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: pgmap v26: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: Reconfiguring mgr.np0005626461.lrfquh (monmap changed)...
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626461.lrfquh", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626461.lrfquh", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "mgr services"} : dispatch
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: Reconfiguring daemon mgr.np0005626461.lrfquh on np0005626461.localdomain
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: pgmap v27: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: Reconfiguring crash.np0005626461 (monmap changed)...
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626461", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626461", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: Reconfiguring daemon crash.np0005626461 on np0005626461.localdomain
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: Reconfiguring crash.np0005626463 (monmap changed)...
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626463", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626463", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: Reconfiguring daemon crash.np0005626463 on np0005626463.localdomain
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='client.34183 -' entity='client.admin' cmd=[{"prefix": "orch apply", "target": ["mon-mgr", ""]}]: dispatch
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: Saving service mon spec with placement label:mon
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: pgmap v28: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: Reconfiguring osd.2 (monmap changed)...
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: Reconfiguring daemon osd.2 on np0005626463.localdomain
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: Reconfiguring mds.mds.np0005626463.qcthuc (monmap changed)...
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: Reconfiguring daemon mds.mds.np0005626463.qcthuc on np0005626463.localdomain
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: pgmap v29: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='client.34198 -' entity='client.admin' cmd=[{"prefix": "orch daemon rm", "names": ["mon.np0005626463"], "force": true, "target": ["mon-mgr", ""]}]: dispatch
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: Remove daemons mon.np0005626463
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: Safe to remove mon.np0005626463: new quorum should be ['np0005626461', 'np0005626460', 'np0005626466', 'np0005626465'] (from ['np0005626461', 'np0005626460', 'np0005626466', 'np0005626465'])
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: Removing monitor np0005626463 from monmap...
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: Removing daemon mon.np0005626463 from np0005626463.localdomain -- ports []
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "mon metadata", "id": "np0005626460"} : dispatch
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "mon metadata", "id": "np0005626461"} : dispatch
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "mon metadata", "id": "np0005626465"} : dispatch
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "mon metadata", "id": "np0005626466"} : dispatch
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: mon.np0005626465 calling monitor election
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: mon.np0005626461 calling monitor election
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: mon.np0005626466 calling monitor election
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: mon.np0005626460 calling monitor election
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: pgmap v30: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: pgmap v31: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: mon.np0005626461 is new leader, mons np0005626461,np0005626466,np0005626465 in quorum (ranks 0,2,3)
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: overall HEALTH_OK
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: mon.np0005626461 calling monitor election
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: mon.np0005626461 is new leader, mons np0005626461,np0005626460,np0005626466,np0005626465 in quorum (ranks 0,1,2,3)
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: pgmap v32: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: pgmap v33: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: monmap epoch 8
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: fsid f1fea371-cb69-578d-a3d0-b5c472a84b46
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: last_changed 2026-02-23T09:45:47.745502+0000
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: created 2026-02-23T07:36:01.997603+0000
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: min_mon_release 18 (reef)
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: election_strategy: 1
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: 0: [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] mon.np0005626461
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: 1: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005626460
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: 2: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005626466
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: 3: [v2:172.18.0.107:3300/0,v1:172.18.0.107:6789/0] mon.np0005626465
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: fsmap cephfs:1 {0=mds.np0005626463.qcthuc=up:active} 2 up:standby
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: osdmap e81: 6 total, 6 up, 6 in
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: mgrmap e18: np0005626461.lrfquh(active, since 59s), standbys: np0005626460.fyrady, np0005626465.hlpkwo, np0005626463.wtksup, np0005626466.nisqfq, np0005626459.pmtxxl
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: overall HEALTH_OK
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: Reconfiguring mgr.np0005626463.wtksup (monmap changed)...
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626463.wtksup", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626463.wtksup", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "mgr services"} : dispatch
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: Reconfiguring daemon mgr.np0005626463.wtksup on np0005626463.localdomain
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626465", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626465", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: pgmap v34: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: Reconfiguring crash.np0005626465 (monmap changed)...
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: Reconfiguring daemon crash.np0005626465 on np0005626465.localdomain
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: Reconfiguring osd.0 (monmap changed)...
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: Reconfiguring daemon osd.0 on np0005626465.localdomain
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: pgmap v35: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: Reconfiguring osd.3 (monmap changed)...
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: Reconfiguring daemon osd.3 on np0005626465.localdomain
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: pgmap v36: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: Reconfiguring mds.mds.np0005626465.drvnoy (monmap changed)...
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005626465.drvnoy", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005626465.drvnoy", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: Reconfiguring daemon mds.mds.np0005626465.drvnoy on np0005626465.localdomain
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626465.hlpkwo", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626465.hlpkwo", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "mgr services"} : dispatch
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: Reconfiguring mgr.np0005626465.hlpkwo (monmap changed)...
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: Reconfiguring daemon mgr.np0005626465.hlpkwo on np0005626465.localdomain
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: pgmap v37: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626466", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626466", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/2077027104' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/2077027104' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: Reconfiguring crash.np0005626466 (monmap changed)...
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: Reconfiguring daemon crash.np0005626466 on np0005626466.localdomain
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: Reconfiguring osd.1 (monmap changed)...
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: Reconfiguring daemon osd.1 on np0005626466.localdomain
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: pgmap v38: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: Reconfiguring osd.4 (monmap changed)...
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: Reconfiguring daemon osd.4 on np0005626466.localdomain
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: Reconfiguring mds.mds.np0005626466.vaywlp (monmap changed)...
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005626466.vaywlp", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005626466.vaywlp", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: Reconfiguring daemon mds.mds.np0005626466.vaywlp on np0005626466.localdomain
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: pgmap v39: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626466.nisqfq", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626466.nisqfq", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "mgr services"} : dispatch
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: Reconfiguring mgr.np0005626466.nisqfq (monmap changed)...
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: Reconfiguring daemon mgr.np0005626466.nisqfq on np0005626466.localdomain
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='client.26777 -' entity='client.admin' cmd=[{"prefix": "orch daemon add", "daemon_type": "mon", "placement": "np0005626463.localdomain:172.18.0.103", "target": ["mon-mgr", ""]}]: dispatch
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: Deploying daemon mon.np0005626463 on np0005626463.localdomain
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: pgmap v40: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.108:0/1048189808' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 09:46:11 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@-1(synchronizing).paxosservice(auth 1..37) refresh upgraded, format 0 -> 3
Feb 23 09:46:11 np0005626463.localdomain ceph-mgr[288036]: ms_deliver_dispatch: unhandled message 0x55ea593e5600 mon_map magic: 0 from mon.2 v2:172.18.0.108:3300/0
Feb 23 09:46:11 np0005626463.localdomain sudo[294219]: pam_unix(sudo:session): session closed for user root
Feb 23 09:46:12 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:46:12.056 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:46:12 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:46:12.057 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 23 09:46:12 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:46:12.057 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 23 09:46:12 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@-1(probing) e8 handle_auth_request failed to assign global_id
Feb 23 09:46:12 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@-1(probing) e8 handle_auth_request failed to assign global_id
Feb 23 09:46:12 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@-1(probing) e8 handle_auth_request failed to assign global_id
Feb 23 09:46:12 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:46:12.743 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 23 09:46:12 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:46:12.743 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquired lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 23 09:46:12 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:46:12.744 282211 DEBUG nova.network.neutron [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 23 09:46:12 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:46:12.744 282211 DEBUG nova.objects.instance [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c2a7d92b-952f-46a7-8a6a-3322a48fcf4b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 23 09:46:12 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.
Feb 23 09:46:12 np0005626463.localdomain podman[294268]: 2026-02-23 09:46:12.906280132 +0000 UTC m=+0.074515397 container health_status 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.7, container_name=openstack_network_exporter, org.opencontainers.image.created=2026-02-05T04:57:10Z, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., build-date=2026-02-05T04:57:10Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., vcs-type=git, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, config_id=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': 
['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, release=1770267347, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, name=ubi9/ubi-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Feb 23 09:46:12 np0005626463.localdomain podman[294268]: 2026-02-23 09:46:12.920827128 +0000 UTC m=+0.089062453 container exec_died 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, container_name=openstack_network_exporter, org.opencontainers.image.created=2026-02-05T04:57:10Z, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.component=ubi9-minimal-container, build-date=2026-02-05T04:57:10Z, managed_by=edpm_ansible, name=ubi9/ubi-minimal, vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, vendor=Red Hat, Inc., io.buildah.version=1.33.7, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image 
Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, config_id=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1770267347, version=9.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c)
Feb 23 09:46:12 np0005626463.localdomain systemd[1]: 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.service: Deactivated successfully.
Feb 23 09:46:13 np0005626463.localdomain openstack_network_exporter[245358]: ERROR   09:46:13 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 23 09:46:13 np0005626463.localdomain openstack_network_exporter[245358]: 
Feb 23 09:46:13 np0005626463.localdomain openstack_network_exporter[245358]: ERROR   09:46:13 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 23 09:46:13 np0005626463.localdomain openstack_network_exporter[245358]: 
Feb 23 09:46:13 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@-1(probing) e9  my rank is now 4 (was -1)
Feb 23 09:46:13 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [INF] : mon.np0005626463 calling monitor election
Feb 23 09:46:13 np0005626463.localdomain ceph-mon[294160]: paxos.4).electionLogic(0) init, first boot, initializing epoch at 1 
Feb 23 09:46:13 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@4(electing) e9 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Feb 23 09:46:13 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@4(electing) e9 handle_auth_request failed to assign global_id
Feb 23 09:46:13 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:46:13.766 282211 DEBUG nova.network.neutron [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Updating instance_info_cache with network_info: [{"id": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "address": "fa:16:3e:a0:9d:00", "network": {"id": "9da5b53d-3184-450f-9a5b-bdba1a6c9f6d", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "37b8098efb0d4ecc90b451a2db0e966f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa27e5011-20", "ovs_interfaceid": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 23 09:46:13 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:46:13.791 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Releasing lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 23 09:46:13 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:46:13.792 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 23 09:46:14 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:46:14.054 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:46:14 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:46:14.055 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:46:14 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:46:14.055 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:46:14 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:46:14.517 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:46:14 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:46:14.545 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:46:14 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:46:14.545 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5029 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 09:46:14 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:46:14.545 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:46:14 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:46:14.546 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:46:14 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:46:14.547 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:46:15 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:46:15.055 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:46:15 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@4(electing) e9 handle_auth_request failed to assign global_id
Feb 23 09:46:15 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@4(electing) e9 handle_auth_request failed to assign global_id
Feb 23 09:46:15 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@4(electing) e9 handle_auth_request failed to assign global_id
Feb 23 09:46:16 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:46:16.050 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:46:16 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:46:16.053 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:46:16 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:46:16.053 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 23 09:46:16 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:46:16.054 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:46:16 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:46:16.073 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 09:46:16 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:46:16.074 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 09:46:16 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:46:16.074 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 09:46:16 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:46:16.075 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Auditing locally available compute resources for np0005626463.localdomain (node: np0005626463.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 23 09:46:16 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:46:16.075 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 09:46:16 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@4(electing) e9 handle_auth_request failed to assign global_id
Feb 23 09:46:16 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@4(electing) e9 handle_auth_request failed to assign global_id
Feb 23 09:46:16 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@4(electing) e9 handle_auth_request failed to assign global_id
Feb 23 09:46:16 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@4(electing) e9 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Feb 23 09:46:16 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@4(peon) e9 _apply_compatset_features enabling new quorum features: compat={},rocompat={},incompat={4=support erasure code pools,5=new-style osdmap encoding,6=support isa/lrc erasure code,7=support shec erasure code}
Feb 23 09:46:16 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@4(peon) e9 _apply_compatset_features enabling new quorum features: compat={},rocompat={},incompat={8=support monmap features,9=luminous ondisk layout,10=mimic ondisk layout,11=nautilus ondisk layout,12=octopus ondisk layout,13=pacific ondisk layout,14=quincy ondisk layout,15=reef ondisk layout}
Feb 23 09:46:16 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@4(peon) e9 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Feb 23 09:46:16 np0005626463.localdomain ceph-mon[294160]: mgrc update_daemon_metadata mon.np0005626463 metadata {addrs=[v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0],arch=x86_64,ceph_release=reef,ceph_version=ceph version 18.2.1-381.el9cp (984f410e2a30899deb131725765b62212b1621db) reef (stable),ceph_version_short=18.2.1-381.el9cp,compression_algorithms=none, snappy, zlib, zstd, lz4,container_hostname=np0005626463.localdomain,container_image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest,cpu=AMD EPYC-Rome Processor,device_ids=,device_paths=vda=/dev/disk/by-path/pci-0000:00:04.0,devices=vda,distro=rhel,distro_description=Red Hat Enterprise Linux 9.7 (Plow),distro_version=9.7,hostname=np0005626463.localdomain,kernel_description=#1 SMP PREEMPT_DYNAMIC Wed Apr 12 10:45:03 EDT 2023,kernel_version=5.14.0-284.11.1.el9_2.x86_64,mem_swap_kb=1048572,mem_total_kb=16116612,os=Linux}
Feb 23 09:46:16 np0005626463.localdomain ceph-mon[294160]: pgmap v41: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:46:16 np0005626463.localdomain ceph-mon[294160]: mon.np0005626461 calling monitor election
Feb 23 09:46:16 np0005626463.localdomain ceph-mon[294160]: mon.np0005626466 calling monitor election
Feb 23 09:46:16 np0005626463.localdomain ceph-mon[294160]: mon.np0005626460 calling monitor election
Feb 23 09:46:16 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "mon metadata", "id": "np0005626460"} : dispatch
Feb 23 09:46:16 np0005626463.localdomain ceph-mon[294160]: mon.np0005626465 calling monitor election
Feb 23 09:46:16 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "mon metadata", "id": "np0005626461"} : dispatch
Feb 23 09:46:16 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "mon metadata", "id": "np0005626463"} : dispatch
Feb 23 09:46:16 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "mon metadata", "id": "np0005626465"} : dispatch
Feb 23 09:46:16 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "mon metadata", "id": "np0005626466"} : dispatch
Feb 23 09:46:16 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:46:16 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "mon metadata", "id": "np0005626463"} : dispatch
Feb 23 09:46:16 np0005626463.localdomain ceph-mon[294160]: pgmap v42: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:46:16 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "mon metadata", "id": "np0005626463"} : dispatch
Feb 23 09:46:16 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463 calling monitor election
Feb 23 09:46:16 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "mon metadata", "id": "np0005626463"} : dispatch
Feb 23 09:46:16 np0005626463.localdomain ceph-mon[294160]: pgmap v43: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:46:16 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "mon metadata", "id": "np0005626463"} : dispatch
Feb 23 09:46:16 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "mon metadata", "id": "np0005626463"} : dispatch
Feb 23 09:46:16 np0005626463.localdomain ceph-mon[294160]: mon.np0005626461 is new leader, mons np0005626461,np0005626460,np0005626466,np0005626465,np0005626463 in quorum (ranks 0,1,2,3,4)
Feb 23 09:46:16 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 23 09:46:16 np0005626463.localdomain ceph-mon[294160]: monmap epoch 9
Feb 23 09:46:16 np0005626463.localdomain ceph-mon[294160]: fsid f1fea371-cb69-578d-a3d0-b5c472a84b46
Feb 23 09:46:16 np0005626463.localdomain ceph-mon[294160]: last_changed 2026-02-23T09:46:11.458117+0000
Feb 23 09:46:16 np0005626463.localdomain ceph-mon[294160]: created 2026-02-23T07:36:01.997603+0000
Feb 23 09:46:16 np0005626463.localdomain ceph-mon[294160]: min_mon_release 18 (reef)
Feb 23 09:46:16 np0005626463.localdomain ceph-mon[294160]: election_strategy: 1
Feb 23 09:46:16 np0005626463.localdomain ceph-mon[294160]: 0: [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] mon.np0005626461
Feb 23 09:46:16 np0005626463.localdomain ceph-mon[294160]: 1: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005626460
Feb 23 09:46:16 np0005626463.localdomain ceph-mon[294160]: 2: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005626466
Feb 23 09:46:16 np0005626463.localdomain ceph-mon[294160]: 3: [v2:172.18.0.107:3300/0,v1:172.18.0.107:6789/0] mon.np0005626465
Feb 23 09:46:16 np0005626463.localdomain ceph-mon[294160]: 4: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005626463
Feb 23 09:46:16 np0005626463.localdomain ceph-mon[294160]: fsmap cephfs:1 {0=mds.np0005626463.qcthuc=up:active} 2 up:standby
Feb 23 09:46:16 np0005626463.localdomain ceph-mon[294160]: osdmap e81: 6 total, 6 up, 6 in
Feb 23 09:46:16 np0005626463.localdomain ceph-mon[294160]: mgrmap e18: np0005626461.lrfquh(active, since 79s), standbys: np0005626460.fyrady, np0005626465.hlpkwo, np0005626463.wtksup, np0005626466.nisqfq, np0005626459.pmtxxl
Feb 23 09:46:16 np0005626463.localdomain ceph-mon[294160]: overall HEALTH_OK
Feb 23 09:46:16 np0005626463.localdomain sudo[294298]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Feb 23 09:46:16 np0005626463.localdomain sudo[294298]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:46:16 np0005626463.localdomain sudo[294298]: pam_unix(sudo:session): session closed for user root
Feb 23 09:46:16 np0005626463.localdomain sudo[294316]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/etc/ceph
Feb 23 09:46:16 np0005626463.localdomain sudo[294316]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:46:16 np0005626463.localdomain sudo[294316]: pam_unix(sudo:session): session closed for user root
Feb 23 09:46:16 np0005626463.localdomain sudo[294334]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/etc/ceph/ceph.conf.new
Feb 23 09:46:16 np0005626463.localdomain sudo[294334]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:46:16 np0005626463.localdomain sudo[294334]: pam_unix(sudo:session): session closed for user root
Feb 23 09:46:16 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@4(peon) e9 handle_auth_request failed to assign global_id
Feb 23 09:46:16 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@4(peon) e9 handle_auth_request failed to assign global_id
Feb 23 09:46:16 np0005626463.localdomain sudo[294352]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46
Feb 23 09:46:16 np0005626463.localdomain sudo[294352]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:46:16 np0005626463.localdomain sudo[294352]: pam_unix(sudo:session): session closed for user root
Feb 23 09:46:16 np0005626463.localdomain sudo[294379]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/etc/ceph/ceph.conf.new
Feb 23 09:46:16 np0005626463.localdomain sudo[294379]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:46:16 np0005626463.localdomain sudo[294379]: pam_unix(sudo:session): session closed for user root
Feb 23 09:46:17 np0005626463.localdomain sudo[294413]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/etc/ceph/ceph.conf.new
Feb 23 09:46:17 np0005626463.localdomain sudo[294413]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:46:17 np0005626463.localdomain sudo[294413]: pam_unix(sudo:session): session closed for user root
Feb 23 09:46:17 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:46:17.180 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.105s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 09:46:17 np0005626463.localdomain sudo[294431]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/etc/ceph/ceph.conf.new
Feb 23 09:46:17 np0005626463.localdomain sudo[294431]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:46:17 np0005626463.localdomain sudo[294431]: pam_unix(sudo:session): session closed for user root
Feb 23 09:46:17 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:46:17.243 282211 DEBUG nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 23 09:46:17 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:46:17.244 282211 DEBUG nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 23 09:46:17 np0005626463.localdomain sudo[294451]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Feb 23 09:46:17 np0005626463.localdomain sudo[294451]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:46:17 np0005626463.localdomain sudo[294451]: pam_unix(sudo:session): session closed for user root
Feb 23 09:46:17 np0005626463.localdomain sudo[294469]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config
Feb 23 09:46:17 np0005626463.localdomain sudo[294469]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:46:17 np0005626463.localdomain sudo[294469]: pam_unix(sudo:session): session closed for user root
Feb 23 09:46:17 np0005626463.localdomain sudo[294487]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config
Feb 23 09:46:17 np0005626463.localdomain sudo[294487]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:46:17 np0005626463.localdomain sudo[294487]: pam_unix(sudo:session): session closed for user root
Feb 23 09:46:17 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:46:17.479 282211 WARNING nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 23 09:46:17 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:46:17.481 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Hypervisor/Node resource view: name=np0005626463.localdomain free_ram=11836MB free_disk=41.8366584777832GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 23 09:46:17 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:46:17.481 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 09:46:17 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:46:17.482 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 09:46:17 np0005626463.localdomain sudo[294505]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf.new
Feb 23 09:46:17 np0005626463.localdomain sudo[294505]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:46:17 np0005626463.localdomain sudo[294505]: pam_unix(sudo:session): session closed for user root
Feb 23 09:46:17 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:46:17.592 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Instance c2a7d92b-952f-46a7-8a6a-3322a48fcf4b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 23 09:46:17 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:46:17.593 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 23 09:46:17 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:46:17.593 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Final resource view: name=np0005626463.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 23 09:46:17 np0005626463.localdomain sudo[294523]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46
Feb 23 09:46:17 np0005626463.localdomain sudo[294523]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:46:17 np0005626463.localdomain sudo[294523]: pam_unix(sudo:session): session closed for user root
Feb 23 09:46:17 np0005626463.localdomain ceph-mon[294160]: Updating np0005626460.localdomain:/etc/ceph/ceph.conf
Feb 23 09:46:17 np0005626463.localdomain ceph-mon[294160]: Updating np0005626461.localdomain:/etc/ceph/ceph.conf
Feb 23 09:46:17 np0005626463.localdomain ceph-mon[294160]: Updating np0005626463.localdomain:/etc/ceph/ceph.conf
Feb 23 09:46:17 np0005626463.localdomain ceph-mon[294160]: Updating np0005626465.localdomain:/etc/ceph/ceph.conf
Feb 23 09:46:17 np0005626463.localdomain ceph-mon[294160]: Updating np0005626466.localdomain:/etc/ceph/ceph.conf
Feb 23 09:46:17 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.106:0/3889230115' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 09:46:17 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "mon metadata", "id": "np0005626463"} : dispatch
Feb 23 09:46:17 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:46:17.677 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 09:46:17 np0005626463.localdomain sudo[294541]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf.new
Feb 23 09:46:17 np0005626463.localdomain sudo[294541]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:46:17 np0005626463.localdomain sudo[294541]: pam_unix(sudo:session): session closed for user root
Feb 23 09:46:17 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@4(peon) e9 handle_auth_request failed to assign global_id
Feb 23 09:46:17 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@4(peon) e9 handle_auth_request failed to assign global_id
Feb 23 09:46:17 np0005626463.localdomain sudo[294576]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf.new
Feb 23 09:46:17 np0005626463.localdomain sudo[294576]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:46:17 np0005626463.localdomain sudo[294576]: pam_unix(sudo:session): session closed for user root
Feb 23 09:46:17 np0005626463.localdomain sudo[294613]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf.new
Feb 23 09:46:17 np0005626463.localdomain sudo[294613]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:46:17 np0005626463.localdomain sudo[294613]: pam_unix(sudo:session): session closed for user root
Feb 23 09:46:17 np0005626463.localdomain sudo[294631]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf.new /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf
Feb 23 09:46:17 np0005626463.localdomain sudo[294631]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:46:17 np0005626463.localdomain sudo[294631]: pam_unix(sudo:session): session closed for user root
Feb 23 09:46:18 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:46:18.111 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.434s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 09:46:18 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:46:18.117 282211 DEBUG nova.compute.provider_tree [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Inventory has not changed in ProviderTree for provider: be63d86c-a403-4ec9-a515-07ea2962cb4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 23 09:46:18 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:46:18.138 282211 DEBUG nova.scheduler.client.report [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Inventory has not changed for provider be63d86c-a403-4ec9-a515-07ea2962cb4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 23 09:46:18 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:46:18.140 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Compute_service record updated for np0005626463.localdomain:np0005626463.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 23 09:46:18 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:46:18.141 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.659s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 09:46:18 np0005626463.localdomain sudo[294651]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 09:46:18 np0005626463.localdomain sudo[294651]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:46:18 np0005626463.localdomain sudo[294651]: pam_unix(sudo:session): session closed for user root
Feb 23 09:46:18 np0005626463.localdomain ceph-mon[294160]: pgmap v44: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:46:18 np0005626463.localdomain ceph-mon[294160]: Updating np0005626460.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf
Feb 23 09:46:18 np0005626463.localdomain ceph-mon[294160]: Updating np0005626466.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf
Feb 23 09:46:18 np0005626463.localdomain ceph-mon[294160]: Updating np0005626461.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf
Feb 23 09:46:18 np0005626463.localdomain ceph-mon[294160]: Updating np0005626465.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf
Feb 23 09:46:18 np0005626463.localdomain ceph-mon[294160]: Updating np0005626463.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf
Feb 23 09:46:18 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:46:18 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:46:18 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:46:18 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:46:18 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:46:18 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:46:18 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:46:18 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:46:18 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:46:18 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:46:18 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:46:18 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 23 09:46:18 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.106:0/1966853983' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 09:46:18 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.107:0/4291909963' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 09:46:18 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626460", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 23 09:46:18 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626460", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 23 09:46:18 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:46:19 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:46:19.142 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:46:19 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:46:19.548 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:46:19 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:46:19.585 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:46:19 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:46:19.585 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5038 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 09:46:19 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:46:19.586 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:46:19 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:46:19.586 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:46:19 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:46:19.587 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:46:19 np0005626463.localdomain ceph-mon[294160]: pgmap v45: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:46:19 np0005626463.localdomain ceph-mon[294160]: Reconfiguring crash.np0005626460 (monmap changed)...
Feb 23 09:46:19 np0005626463.localdomain ceph-mon[294160]: Reconfiguring daemon crash.np0005626460 on np0005626460.localdomain
Feb 23 09:46:19 np0005626463.localdomain ceph-mon[294160]: Health check failed: 1 failed cephadm daemon(s) (CEPHADM_FAILED_DAEMON)
Feb 23 09:46:19 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.107:0/3292769561' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 09:46:19 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:46:19 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:46:19 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626460.fyrady", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 23 09:46:19 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626460.fyrady", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 23 09:46:19 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "mgr services"} : dispatch
Feb 23 09:46:19 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:46:20 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.
Feb 23 09:46:20 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.
Feb 23 09:46:20 np0005626463.localdomain ceph-mon[294160]: Reconfiguring mgr.np0005626460.fyrady (monmap changed)...
Feb 23 09:46:20 np0005626463.localdomain ceph-mon[294160]: Reconfiguring daemon mgr.np0005626460.fyrady on np0005626460.localdomain
Feb 23 09:46:20 np0005626463.localdomain ceph-mon[294160]: pgmap v46: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:46:20 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:46:20 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:46:20 np0005626463.localdomain ceph-mon[294160]: Reconfiguring mgr.np0005626461.lrfquh (monmap changed)...
Feb 23 09:46:20 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626461.lrfquh", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 23 09:46:20 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626461.lrfquh", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 23 09:46:20 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "mgr services"} : dispatch
Feb 23 09:46:20 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:46:20 np0005626463.localdomain ceph-mon[294160]: Reconfiguring daemon mgr.np0005626461.lrfquh on np0005626461.localdomain
Feb 23 09:46:20 np0005626463.localdomain podman[294670]: 2026-02-23 09:46:20.955709059 +0000 UTC m=+0.089379022 container health_status bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 23 09:46:20 np0005626463.localdomain podman[294670]: 2026-02-23 09:46:20.964252631 +0000 UTC m=+0.097922634 container exec_died bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 23 09:46:20 np0005626463.localdomain systemd[1]: bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.service: Deactivated successfully.
Feb 23 09:46:21 np0005626463.localdomain podman[294669]: 2026-02-23 09:46:21.050799216 +0000 UTC m=+0.184703916 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, org.label-schema.build-date=20260216, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Feb 23 09:46:21 np0005626463.localdomain podman[294669]: 2026-02-23 09:46:21.092261438 +0000 UTC m=+0.226166118 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_managed=true)
Feb 23 09:46:21 np0005626463.localdomain systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully.
Feb 23 09:46:22 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:46:22 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:46:22 np0005626463.localdomain ceph-mon[294160]: Reconfiguring crash.np0005626461 (monmap changed)...
Feb 23 09:46:22 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626461", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 23 09:46:22 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626461", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 23 09:46:22 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:46:22 np0005626463.localdomain ceph-mon[294160]: Reconfiguring daemon crash.np0005626461 on np0005626461.localdomain
Feb 23 09:46:22 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:46:22 np0005626463.localdomain sudo[294716]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 09:46:22 np0005626463.localdomain sudo[294716]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:46:22 np0005626463.localdomain sudo[294716]: pam_unix(sudo:session): session closed for user root
Feb 23 09:46:22 np0005626463.localdomain sudo[294734]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid f1fea371-cb69-578d-a3d0-b5c472a84b46
Feb 23 09:46:22 np0005626463.localdomain sudo[294734]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:46:22 np0005626463.localdomain podman[294769]: 
Feb 23 09:46:22 np0005626463.localdomain podman[294769]: 2026-02-23 09:46:22.910648593 +0000 UTC m=+0.077072475 container create 2c587f2c7bca22e7b3c4c16a4fc85f338a97d4abd2a83060c49f6749bf53a8a5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=competent_mclaren, ceph=True, io.buildah.version=1.42.2, release=1770267347, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, org.opencontainers.image.created=2026-02-09T10:25:24Z, distribution-scope=public, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, vcs-type=git, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, architecture=x86_64, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_CLEAN=True, io.openshift.expose-services=, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, build-date=2026-02-09T10:25:24Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7)
Feb 23 09:46:22 np0005626463.localdomain systemd[1]: Started libpod-conmon-2c587f2c7bca22e7b3c4c16a4fc85f338a97d4abd2a83060c49f6749bf53a8a5.scope.
Feb 23 09:46:22 np0005626463.localdomain podman[294769]: 2026-02-23 09:46:22.879405605 +0000 UTC m=+0.045829517 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 23 09:46:22 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 09:46:22 np0005626463.localdomain podman[294769]: 2026-02-23 09:46:22.995219286 +0000 UTC m=+0.161643168 container init 2c587f2c7bca22e7b3c4c16a4fc85f338a97d4abd2a83060c49f6749bf53a8a5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=competent_mclaren, io.openshift.tags=rhceph ceph, version=7, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-02-09T10:25:24Z, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1770267347, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container, GIT_BRANCH=main, io.buildah.version=1.42.2, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, GIT_CLEAN=True, architecture=x86_64, build-date=2026-02-09T10:25:24Z, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, ceph=True, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, RELEASE=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14)
Feb 23 09:46:23 np0005626463.localdomain podman[294769]: 2026-02-23 09:46:23.00543047 +0000 UTC m=+0.171854322 container start 2c587f2c7bca22e7b3c4c16a4fc85f338a97d4abd2a83060c49f6749bf53a8a5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=competent_mclaren, GIT_CLEAN=True, GIT_BRANCH=main, version=7, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, vcs-type=git, io.openshift.expose-services=, org.opencontainers.image.created=2026-02-09T10:25:24Z, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2026-02-09T10:25:24Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.42.2, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1770267347)
Feb 23 09:46:23 np0005626463.localdomain podman[294769]: 2026-02-23 09:46:23.005592825 +0000 UTC m=+0.172016747 container attach 2c587f2c7bca22e7b3c4c16a4fc85f338a97d4abd2a83060c49f6749bf53a8a5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=competent_mclaren, vcs-type=git, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, version=7, name=rhceph, GIT_BRANCH=main, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, RELEASE=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.expose-services=, ceph=True, architecture=x86_64, io.buildah.version=1.42.2, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1770267347, build-date=2026-02-09T10:25:24Z, description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public)
Feb 23 09:46:23 np0005626463.localdomain competent_mclaren[294784]: 167 167
Feb 23 09:46:23 np0005626463.localdomain systemd[1]: libpod-2c587f2c7bca22e7b3c4c16a4fc85f338a97d4abd2a83060c49f6749bf53a8a5.scope: Deactivated successfully.
Feb 23 09:46:23 np0005626463.localdomain podman[294769]: 2026-02-23 09:46:23.010193525 +0000 UTC m=+0.176617417 container died 2c587f2c7bca22e7b3c4c16a4fc85f338a97d4abd2a83060c49f6749bf53a8a5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=competent_mclaren, build-date=2026-02-09T10:25:24Z, name=rhceph, com.redhat.component=rhceph-container, io.buildah.version=1.42.2, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., RELEASE=main, GIT_BRANCH=main, distribution-scope=public, vcs-type=git, GIT_CLEAN=True, io.openshift.expose-services=, CEPH_POINT_RELEASE=, release=1770267347, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Feb 23 09:46:23 np0005626463.localdomain podman[294789]: 2026-02-23 09:46:23.117385143 +0000 UTC m=+0.097024057 container remove 2c587f2c7bca22e7b3c4c16a4fc85f338a97d4abd2a83060c49f6749bf53a8a5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=competent_mclaren, architecture=x86_64, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, build-date=2026-02-09T10:25:24Z, distribution-scope=public, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_BRANCH=main, io.buildah.version=1.42.2, GIT_CLEAN=True, version=7, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, release=1770267347, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Feb 23 09:46:23 np0005626463.localdomain systemd[1]: libpod-conmon-2c587f2c7bca22e7b3c4c16a4fc85f338a97d4abd2a83060c49f6749bf53a8a5.scope: Deactivated successfully.
Feb 23 09:46:23 np0005626463.localdomain sudo[294734]: pam_unix(sudo:session): session closed for user root
Feb 23 09:46:23 np0005626463.localdomain sudo[294806]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 09:46:23 np0005626463.localdomain sudo[294806]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:46:23 np0005626463.localdomain sudo[294806]: pam_unix(sudo:session): session closed for user root
Feb 23 09:46:23 np0005626463.localdomain ceph-mon[294160]: pgmap v47: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:46:23 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:46:23 np0005626463.localdomain ceph-mon[294160]: Reconfiguring crash.np0005626463 (monmap changed)...
Feb 23 09:46:23 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626463", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 23 09:46:23 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626463", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 23 09:46:23 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:46:23 np0005626463.localdomain ceph-mon[294160]: Reconfiguring daemon crash.np0005626463 on np0005626463.localdomain
Feb 23 09:46:23 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.200:0/3475178154' entity='client.admin' cmd={"prefix": "status", "format": "json"} : dispatch
Feb 23 09:46:23 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:46:23 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:46:23 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:46:23 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch
Feb 23 09:46:23 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:46:23 np0005626463.localdomain sudo[294824]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid f1fea371-cb69-578d-a3d0-b5c472a84b46
Feb 23 09:46:23 np0005626463.localdomain sudo[294824]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:46:23 np0005626463.localdomain podman[294858]: 
Feb 23 09:46:23 np0005626463.localdomain podman[294858]: 2026-02-23 09:46:23.718604061 +0000 UTC m=+0.065750407 container create 925bbe810d926574da5e77207941d6f0ec02fde21bab3fda09f87b75b2e45e36 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nifty_cartwright, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, release=1770267347, RELEASE=main, distribution-scope=public, ceph=True, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, vcs-type=git, version=7, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-02-09T10:25:24Z, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, GIT_CLEAN=True, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, io.buildah.version=1.42.2, io.k8s.description=Red Hat Ceph Storage 7, build-date=2026-02-09T10:25:24Z)
Feb 23 09:46:23 np0005626463.localdomain systemd[1]: Started libpod-conmon-925bbe810d926574da5e77207941d6f0ec02fde21bab3fda09f87b75b2e45e36.scope.
Feb 23 09:46:23 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 09:46:23 np0005626463.localdomain podman[294858]: 2026-02-23 09:46:23.783700868 +0000 UTC m=+0.130847214 container init 925bbe810d926574da5e77207941d6f0ec02fde21bab3fda09f87b75b2e45e36 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nifty_cartwright, ceph=True, release=1770267347, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-02-09T10:25:24Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, version=7, io.openshift.expose-services=, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, RELEASE=main, GIT_CLEAN=True, name=rhceph, architecture=x86_64, description=Red Hat Ceph Storage 7, build-date=2026-02-09T10:25:24Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, distribution-scope=public, GIT_BRANCH=main, io.buildah.version=1.42.2)
Feb 23 09:46:23 np0005626463.localdomain podman[294858]: 2026-02-23 09:46:23.688422566 +0000 UTC m=+0.035568952 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 23 09:46:23 np0005626463.localdomain podman[294858]: 2026-02-23 09:46:23.793297372 +0000 UTC m=+0.140443718 container start 925bbe810d926574da5e77207941d6f0ec02fde21bab3fda09f87b75b2e45e36 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nifty_cartwright, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, ceph=True, architecture=x86_64, description=Red Hat Ceph Storage 7, version=7, GIT_BRANCH=main, org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, distribution-scope=public, name=rhceph, build-date=2026-02-09T10:25:24Z, io.openshift.expose-services=, release=1770267347, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., io.buildah.version=1.42.2, RELEASE=main, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7)
Feb 23 09:46:23 np0005626463.localdomain podman[294858]: 2026-02-23 09:46:23.7935646 +0000 UTC m=+0.140710986 container attach 925bbe810d926574da5e77207941d6f0ec02fde21bab3fda09f87b75b2e45e36 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nifty_cartwright, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.expose-services=, com.redhat.component=rhceph-container, io.buildah.version=1.42.2, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, build-date=2026-02-09T10:25:24Z, vendor=Red Hat, Inc., vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, RELEASE=main, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, release=1770267347, version=7, org.opencontainers.image.created=2026-02-09T10:25:24Z, ceph=True, architecture=x86_64, distribution-scope=public, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_REPO=https://github.com/ceph/ceph-container.git)
Feb 23 09:46:23 np0005626463.localdomain nifty_cartwright[294873]: 167 167
Feb 23 09:46:23 np0005626463.localdomain systemd[1]: libpod-925bbe810d926574da5e77207941d6f0ec02fde21bab3fda09f87b75b2e45e36.scope: Deactivated successfully.
Feb 23 09:46:23 np0005626463.localdomain podman[294858]: 2026-02-23 09:46:23.796247432 +0000 UTC m=+0.143393818 container died 925bbe810d926574da5e77207941d6f0ec02fde21bab3fda09f87b75b2e45e36 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nifty_cartwright, architecture=x86_64, release=1770267347, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, build-date=2026-02-09T10:25:24Z, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.created=2026-02-09T10:25:24Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.42.2, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, vendor=Red Hat, Inc., vcs-type=git, GIT_CLEAN=True, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_BRANCH=main)
Feb 23 09:46:23 np0005626463.localdomain podman[294878]: 2026-02-23 09:46:23.901872222 +0000 UTC m=+0.096819541 container remove 925bbe810d926574da5e77207941d6f0ec02fde21bab3fda09f87b75b2e45e36 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nifty_cartwright, GIT_BRANCH=main, vendor=Red Hat, Inc., release=1770267347, build-date=2026-02-09T10:25:24Z, architecture=x86_64, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.buildah.version=1.42.2, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, version=7, io.openshift.expose-services=, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, org.opencontainers.image.created=2026-02-09T10:25:24Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, name=rhceph, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git)
Feb 23 09:46:23 np0005626463.localdomain systemd[1]: libpod-conmon-925bbe810d926574da5e77207941d6f0ec02fde21bab3fda09f87b75b2e45e36.scope: Deactivated successfully.
Feb 23 09:46:23 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-26e4ca7a38c877c345ec4abd06c52a942e386c17a66ca1c6efc4467254a1c811-merged.mount: Deactivated successfully.
Feb 23 09:46:24 np0005626463.localdomain sudo[294824]: pam_unix(sudo:session): session closed for user root
Feb 23 09:46:24 np0005626463.localdomain sudo[294902]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 09:46:24 np0005626463.localdomain sudo[294902]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:46:24 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.
Feb 23 09:46:24 np0005626463.localdomain sudo[294902]: pam_unix(sudo:session): session closed for user root
Feb 23 09:46:24 np0005626463.localdomain sudo[294921]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid f1fea371-cb69-578d-a3d0-b5c472a84b46
Feb 23 09:46:24 np0005626463.localdomain sudo[294921]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:46:24 np0005626463.localdomain podman[294920]: 2026-02-23 09:46:24.394575602 +0000 UTC m=+0.099371989 container health_status be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, managed_by=edpm_ansible, config_id=ceilometer_agent_compute)
Feb 23 09:46:24 np0005626463.localdomain podman[294920]: 2026-02-23 09:46:24.410199371 +0000 UTC m=+0.114995738 container exec_died be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ceilometer_agent_compute, io.buildah.version=1.43.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb)
Feb 23 09:46:24 np0005626463.localdomain systemd[1]: be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.service: Deactivated successfully.
Feb 23 09:46:24 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:46:24.588 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:46:24 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:46:24.591 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:46:24 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:46:24.592 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 09:46:24 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:46:24.592 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:46:24 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:46:24.631 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:46:24 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:46:24.631 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:46:24 np0005626463.localdomain ceph-mon[294160]: Reconfiguring osd.2 (monmap changed)...
Feb 23 09:46:24 np0005626463.localdomain ceph-mon[294160]: Reconfiguring daemon osd.2 on np0005626463.localdomain
Feb 23 09:46:24 np0005626463.localdomain ceph-mon[294160]: from='client.44109 -' entity='client.admin' cmd=[{"prefix": "orch", "action": "reconfig", "service_name": "osd.default_drive_group", "target": ["mon-mgr", ""]}]: dispatch
Feb 23 09:46:24 np0005626463.localdomain ceph-mon[294160]: Reconfig service osd.default_drive_group
Feb 23 09:46:24 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:46:24 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:46:24 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:46:24 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:46:24 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:46:24 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:46:24 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:46:24 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:46:24 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:46:24 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:46:24 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:46:24 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:46:24 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:46:24 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:46:24 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:46:24 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:46:24 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch
Feb 23 09:46:24 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:46:24 np0005626463.localdomain podman[294975]: 
Feb 23 09:46:24 np0005626463.localdomain podman[294975]: 2026-02-23 09:46:24.802474741 +0000 UTC m=+0.074743223 container create ae2aad132b40befd8a2b3d9ed6de74e43b1dea9f69206921bd1beb7f9a38afca (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=quirky_vaughan, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.created=2026-02-09T10:25:24Z, io.buildah.version=1.42.2, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., GIT_BRANCH=main, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, description=Red Hat Ceph Storage 7, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, GIT_CLEAN=True, build-date=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, architecture=x86_64, ceph=True, vcs-type=git, distribution-scope=public, com.redhat.component=rhceph-container, release=1770267347)
Feb 23 09:46:24 np0005626463.localdomain systemd[1]: Started libpod-conmon-ae2aad132b40befd8a2b3d9ed6de74e43b1dea9f69206921bd1beb7f9a38afca.scope.
Feb 23 09:46:24 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 09:46:24 np0005626463.localdomain podman[294975]: 2026-02-23 09:46:24.772270714 +0000 UTC m=+0.044539236 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 23 09:46:24 np0005626463.localdomain podman[294975]: 2026-02-23 09:46:24.871606391 +0000 UTC m=+0.143874873 container init ae2aad132b40befd8a2b3d9ed6de74e43b1dea9f69206921bd1beb7f9a38afca (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=quirky_vaughan, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, release=1770267347, com.redhat.component=rhceph-container, architecture=x86_64, RELEASE=main, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.buildah.version=1.42.2, build-date=2026-02-09T10:25:24Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, description=Red Hat Ceph Storage 7, distribution-scope=public, GIT_CLEAN=True, io.openshift.expose-services=, ceph=True)
Feb 23 09:46:24 np0005626463.localdomain podman[294975]: 2026-02-23 09:46:24.88199026 +0000 UTC m=+0.154258792 container start ae2aad132b40befd8a2b3d9ed6de74e43b1dea9f69206921bd1beb7f9a38afca (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=quirky_vaughan, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1770267347, RELEASE=main, version=7, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, org.opencontainers.image.created=2026-02-09T10:25:24Z, name=rhceph, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, CEPH_POINT_RELEASE=, io.openshift.expose-services=, ceph=True, build-date=2026-02-09T10:25:24Z, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.42.2, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph)
Feb 23 09:46:24 np0005626463.localdomain podman[294975]: 2026-02-23 09:46:24.882437214 +0000 UTC m=+0.154705746 container attach ae2aad132b40befd8a2b3d9ed6de74e43b1dea9f69206921bd1beb7f9a38afca (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=quirky_vaughan, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2026-02-09T10:25:24Z, vendor=Red Hat, Inc., GIT_BRANCH=main, RELEASE=main, vcs-type=git, io.buildah.version=1.42.2, org.opencontainers.image.created=2026-02-09T10:25:24Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, ceph=True, version=7, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, architecture=x86_64, release=1770267347)
Feb 23 09:46:24 np0005626463.localdomain quirky_vaughan[294990]: 167 167
Feb 23 09:46:24 np0005626463.localdomain systemd[1]: libpod-ae2aad132b40befd8a2b3d9ed6de74e43b1dea9f69206921bd1beb7f9a38afca.scope: Deactivated successfully.
Feb 23 09:46:24 np0005626463.localdomain podman[294995]: 2026-02-23 09:46:24.955389501 +0000 UTC m=+0.051667266 container died ae2aad132b40befd8a2b3d9ed6de74e43b1dea9f69206921bd1beb7f9a38afca (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=quirky_vaughan, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, architecture=x86_64, com.redhat.component=rhceph-container, release=1770267347, CEPH_POINT_RELEASE=, vcs-type=git, io.openshift.expose-services=, RELEASE=main, build-date=2026-02-09T10:25:24Z, name=rhceph, org.opencontainers.image.created=2026-02-09T10:25:24Z, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, io.buildah.version=1.42.2, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph)
Feb 23 09:46:24 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-86e1915b5c9b3973cde7e615ce4bf05e45f2467ef3f3bda96317b67dac1cd812-merged.mount: Deactivated successfully.
Feb 23 09:46:24 np0005626463.localdomain podman[294995]: 2026-02-23 09:46:24.99580942 +0000 UTC m=+0.092087145 container remove ae2aad132b40befd8a2b3d9ed6de74e43b1dea9f69206921bd1beb7f9a38afca (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=quirky_vaughan, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, release=1770267347, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, RELEASE=main, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.created=2026-02-09T10:25:24Z, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, distribution-scope=public, build-date=2026-02-09T10:25:24Z, architecture=x86_64, CEPH_POINT_RELEASE=, version=7, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, io.buildah.version=1.42.2, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Feb 23 09:46:25 np0005626463.localdomain systemd[1]: libpod-conmon-ae2aad132b40befd8a2b3d9ed6de74e43b1dea9f69206921bd1beb7f9a38afca.scope: Deactivated successfully.
Feb 23 09:46:25 np0005626463.localdomain sudo[294921]: pam_unix(sudo:session): session closed for user root
Feb 23 09:46:25 np0005626463.localdomain sudo[295019]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 09:46:25 np0005626463.localdomain sudo[295019]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:46:25 np0005626463.localdomain sudo[295019]: pam_unix(sudo:session): session closed for user root
Feb 23 09:46:25 np0005626463.localdomain sudo[295037]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid f1fea371-cb69-578d-a3d0-b5c472a84b46
Feb 23 09:46:25 np0005626463.localdomain sudo[295037]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:46:25 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@4(peon) e9 handle_command mon_command({"prefix": "mgr fail"} v 0)
Feb 23 09:46:25 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='client.? 172.18.0.200:0/2634313896' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch
Feb 23 09:46:25 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@4(peon).osd e81 _set_cache_ratios kv ratio 0.25 inc ratio 0.375 full ratio 0.375
Feb 23 09:46:25 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@4(peon).osd e81 register_cache_with_pcm pcm target: 2147483648 pcm max: 1020054732 pcm min: 134217728 inc_osd_cache size: 1
Feb 23 09:46:25 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@4(peon).osd e82 e82: 6 total, 6 up, 6 in
Feb 23 09:46:25 np0005626463.localdomain sshd[290268]: pam_unix(sshd:session): session closed for user ceph-admin
Feb 23 09:46:25 np0005626463.localdomain systemd-logind[759]: Session 64 logged out. Waiting for processes to exit.
Feb 23 09:46:25 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #13. Immutable memtables: 0.
Feb 23 09:46:25 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:46:25.673673) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Feb 23 09:46:25 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/flush_job.cc:856] [default] [JOB 3] Flushing memtable with next log file: 13
Feb 23 09:46:25 np0005626463.localdomain ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771839985673763, "job": 3, "event": "flush_started", "num_memtables": 1, "num_entries": 10902, "num_deletes": 256, "total_data_size": 15180226, "memory_usage": 15533664, "flush_reason": "Manual Compaction"}
Feb 23 09:46:25 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/flush_job.cc:885] [default] [JOB 3] Level-0 flush table #14: started
Feb 23 09:46:25 np0005626463.localdomain ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771839985729908, "cf_name": "default", "job": 3, "event": "table_file_creation", "file_number": 14, "file_size": 13027101, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 6, "largest_seqno": 10907, "table_properties": {"data_size": 12968467, "index_size": 31904, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 25669, "raw_key_size": 271477, "raw_average_key_size": 26, "raw_value_size": 12792918, "raw_average_value_size": 1247, "num_data_blocks": 1226, "num_entries": 10255, "num_filter_entries": 10255, "num_deletions": 256, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771839971, "oldest_key_time": 1771839971, "file_creation_time": 1771839985, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4cfd6c8f-aafa-4003-b2f6-d22c49635dd4", "db_session_id": "66DAQ76CBLV8DSGL8JC7", "orig_file_number": 14, "seqno_to_time_mapping": "N/A"}}
Feb 23 09:46:25 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 3] Flush lasted 56428 microseconds, and 29493 cpu microseconds.
Feb 23 09:46:25 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:46:25.730101) [db/flush_job.cc:967] [default] [JOB 3] Level-0 flush table #14: 13027101 bytes OK
Feb 23 09:46:25 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:46:25.730157) [db/memtable_list.cc:519] [default] Level-0 commit table #14 started
Feb 23 09:46:25 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:46:25.732000) [db/memtable_list.cc:722] [default] Level-0 commit table #14: memtable #1 done
Feb 23 09:46:25 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:46:25.732028) EVENT_LOG_v1 {"time_micros": 1771839985732021, "job": 3, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [2, 0, 0, 0, 0, 0, 0], "immutable_memtables": 0}
Feb 23 09:46:25 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:46:25.732119) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: files[2 0 0 0 0 0 0] max score 0.50
Feb 23 09:46:25 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 3] Try to delete WAL files size 15105492, prev total WAL file size 15106296, number of live WAL files 2.
Feb 23 09:46:25 np0005626463.localdomain ceph-mon[294160]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005626463/store.db/000009.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 23 09:46:25 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:46:25.735191) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003130323931' seq:72057594037927935, type:22 .. '7061786F73003130353433' seq:0, type:0; will stop at (end)
Feb 23 09:46:25 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 4] Compacting 2@0 files to L6, score -1.00
Feb 23 09:46:25 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 3 Base level 0, inputs: [14(12MB) 8(2012B)]
Feb 23 09:46:25 np0005626463.localdomain ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771839985735311, "job": 4, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [14, 8], "score": -1, "input_data_size": 13029113, "oldest_snapshot_seqno": -1}
Feb 23 09:46:25 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 4] Generated table #15: 10003 keys, 13023826 bytes, temperature: kUnknown
Feb 23 09:46:25 np0005626463.localdomain ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771839985809913, "cf_name": "default", "job": 4, "event": "table_file_creation", "file_number": 15, "file_size": 13023826, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12965832, "index_size": 31909, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 25029, "raw_key_size": 266594, "raw_average_key_size": 26, "raw_value_size": 12793573, "raw_average_value_size": 1278, "num_data_blocks": 1225, "num_entries": 10003, "num_filter_entries": 10003, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771839971, "oldest_key_time": 0, "file_creation_time": 1771839985, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4cfd6c8f-aafa-4003-b2f6-d22c49635dd4", "db_session_id": "66DAQ76CBLV8DSGL8JC7", "orig_file_number": 15, "seqno_to_time_mapping": "N/A"}}
Feb 23 09:46:25 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 23 09:46:25 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:46:25.810292) [db/compaction/compaction_job.cc:1663] [default] [JOB 4] Compacted 2@0 files to L6 => 13023826 bytes
Feb 23 09:46:25 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:46:25.812188) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 174.4 rd, 174.3 wr, level 6, files in(2, 0) out(1 +0 blob) MB in(12.4, 0.0 +0.0 blob) out(12.4 +0.0 blob), read-write-amplify(2.0) write-amplify(1.0) OK, records in: 10260, records dropped: 257 output_compression: NoCompression
Feb 23 09:46:25 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:46:25.812227) EVENT_LOG_v1 {"time_micros": 1771839985812210, "job": 4, "event": "compaction_finished", "compaction_time_micros": 74713, "compaction_time_cpu_micros": 38717, "output_level": 6, "num_output_files": 1, "total_output_size": 13023826, "num_input_records": 10260, "num_output_records": 10003, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Feb 23 09:46:25 np0005626463.localdomain ceph-mon[294160]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005626463/store.db/000014.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 23 09:46:25 np0005626463.localdomain ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771839985814939, "job": 4, "event": "table_file_deletion", "file_number": 14}
Feb 23 09:46:25 np0005626463.localdomain ceph-mon[294160]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005626463/store.db/000008.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 23 09:46:25 np0005626463.localdomain ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771839985815033, "job": 4, "event": "table_file_deletion", "file_number": 8}
Feb 23 09:46:25 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:46:25.735078) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 23 09:46:25 np0005626463.localdomain ceph-mon[294160]: pgmap v48: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:46:25 np0005626463.localdomain ceph-mon[294160]: Reconfiguring osd.5 (monmap changed)...
Feb 23 09:46:25 np0005626463.localdomain ceph-mon[294160]: Reconfiguring daemon osd.5 on np0005626463.localdomain
Feb 23 09:46:25 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:46:25 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:46:25 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:46:25 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' 
Feb 23 09:46:25 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005626463.qcthuc", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 23 09:46:25 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005626463.qcthuc", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 23 09:46:25 np0005626463.localdomain ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:46:25 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.200:0/2634313896' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch
Feb 23 09:46:25 np0005626463.localdomain ceph-mon[294160]: from='client.? ' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch
Feb 23 09:46:25 np0005626463.localdomain ceph-mon[294160]: Activating manager daemon np0005626460.fyrady
Feb 23 09:46:25 np0005626463.localdomain ceph-mon[294160]: osdmap e82: 6 total, 6 up, 6 in
Feb 23 09:46:25 np0005626463.localdomain ceph-mon[294160]: from='client.? ' entity='client.admin' cmd='[{"prefix": "mgr fail"}]': finished
Feb 23 09:46:25 np0005626463.localdomain ceph-mon[294160]: mgrmap e19: np0005626460.fyrady(active, starting, since 0.11399s), standbys: np0005626465.hlpkwo, np0005626463.wtksup, np0005626466.nisqfq, np0005626459.pmtxxl
Feb 23 09:46:25 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 172.18.0.104:0/1738626939' entity='mgr.np0005626460.fyrady' cmd={"prefix": "mon metadata", "id": "np0005626460"} : dispatch
Feb 23 09:46:25 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 172.18.0.104:0/1738626939' entity='mgr.np0005626460.fyrady' cmd={"prefix": "mon metadata", "id": "np0005626461"} : dispatch
Feb 23 09:46:25 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 172.18.0.104:0/1738626939' entity='mgr.np0005626460.fyrady' cmd={"prefix": "mon metadata", "id": "np0005626463"} : dispatch
Feb 23 09:46:25 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 172.18.0.104:0/1738626939' entity='mgr.np0005626460.fyrady' cmd={"prefix": "mon metadata", "id": "np0005626465"} : dispatch
Feb 23 09:46:25 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 172.18.0.104:0/1738626939' entity='mgr.np0005626460.fyrady' cmd={"prefix": "mon metadata", "id": "np0005626466"} : dispatch
Feb 23 09:46:25 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 172.18.0.104:0/1738626939' entity='mgr.np0005626460.fyrady' cmd={"prefix": "mds metadata", "who": "mds.np0005626465.drvnoy"} : dispatch
Feb 23 09:46:25 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 172.18.0.104:0/1738626939' entity='mgr.np0005626460.fyrady' cmd={"prefix": "mds metadata", "who": "mds.np0005626466.vaywlp"} : dispatch
Feb 23 09:46:25 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 172.18.0.104:0/1738626939' entity='mgr.np0005626460.fyrady' cmd={"prefix": "mds metadata", "who": "mds.np0005626463.qcthuc"} : dispatch
Feb 23 09:46:25 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 172.18.0.104:0/1738626939' entity='mgr.np0005626460.fyrady' cmd={"prefix": "mgr metadata", "who": "np0005626460.fyrady", "id": "np0005626460.fyrady"} : dispatch
Feb 23 09:46:25 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 172.18.0.104:0/1738626939' entity='mgr.np0005626460.fyrady' cmd={"prefix": "mgr metadata", "who": "np0005626465.hlpkwo", "id": "np0005626465.hlpkwo"} : dispatch
Feb 23 09:46:25 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 172.18.0.104:0/1738626939' entity='mgr.np0005626460.fyrady' cmd={"prefix": "mgr metadata", "who": "np0005626463.wtksup", "id": "np0005626463.wtksup"} : dispatch
Feb 23 09:46:25 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 172.18.0.104:0/1738626939' entity='mgr.np0005626460.fyrady' cmd={"prefix": "mgr metadata", "who": "np0005626466.nisqfq", "id": "np0005626466.nisqfq"} : dispatch
Feb 23 09:46:25 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 172.18.0.104:0/1738626939' entity='mgr.np0005626460.fyrady' cmd={"prefix": "mgr metadata", "who": "np0005626459.pmtxxl", "id": "np0005626459.pmtxxl"} : dispatch
Feb 23 09:46:25 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 172.18.0.104:0/1738626939' entity='mgr.np0005626460.fyrady' cmd={"prefix": "osd metadata", "id": 0} : dispatch
Feb 23 09:46:25 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 172.18.0.104:0/1738626939' entity='mgr.np0005626460.fyrady' cmd={"prefix": "osd metadata", "id": 1} : dispatch
Feb 23 09:46:25 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 172.18.0.104:0/1738626939' entity='mgr.np0005626460.fyrady' cmd={"prefix": "osd metadata", "id": 2} : dispatch
Feb 23 09:46:25 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 172.18.0.104:0/1738626939' entity='mgr.np0005626460.fyrady' cmd={"prefix": "osd metadata", "id": 3} : dispatch
Feb 23 09:46:25 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 172.18.0.104:0/1738626939' entity='mgr.np0005626460.fyrady' cmd={"prefix": "osd metadata", "id": 4} : dispatch
Feb 23 09:46:25 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 172.18.0.104:0/1738626939' entity='mgr.np0005626460.fyrady' cmd={"prefix": "osd metadata", "id": 5} : dispatch
Feb 23 09:46:25 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 172.18.0.104:0/1738626939' entity='mgr.np0005626460.fyrady' cmd={"prefix": "mds metadata"} : dispatch
Feb 23 09:46:25 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 172.18.0.104:0/1738626939' entity='mgr.np0005626460.fyrady' cmd={"prefix": "osd metadata"} : dispatch
Feb 23 09:46:25 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 172.18.0.104:0/1738626939' entity='mgr.np0005626460.fyrady' cmd={"prefix": "mon metadata"} : dispatch
Feb 23 09:46:25 np0005626463.localdomain ceph-mon[294160]: Manager daemon np0005626460.fyrady is now available
Feb 23 09:46:25 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 172.18.0.104:0/1738626939' entity='mgr.np0005626460.fyrady' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005626459.localdomain.devices.0"} : dispatch
Feb 23 09:46:25 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005626459.localdomain.devices.0"} : dispatch
Feb 23 09:46:25 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005626459.localdomain.devices.0"}]': finished
Feb 23 09:46:25 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 172.18.0.104:0/1738626939' entity='mgr.np0005626460.fyrady' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005626459.localdomain.devices.0"} : dispatch
Feb 23 09:46:25 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005626459.localdomain.devices.0"} : dispatch
Feb 23 09:46:25 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005626459.localdomain.devices.0"}]': finished
Feb 23 09:46:25 np0005626463.localdomain podman[295073]: 
Feb 23 09:46:25 np0005626463.localdomain podman[295073]: 2026-02-23 09:46:25.861567301 +0000 UTC m=+0.081036706 container create df91ff37b17b5e6496cfe283e3fb46bb598f98104f82230101be61a8357f5525 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=affectionate_boyd, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, name=rhceph, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7, release=1770267347, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, io.buildah.version=1.42.2, GIT_CLEAN=True, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=rhceph-container, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, build-date=2026-02-09T10:25:24Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, io.openshift.tags=rhceph ceph, org.opencontainers.image.created=2026-02-09T10:25:24Z)
Feb 23 09:46:25 np0005626463.localdomain systemd[1]: Started libpod-conmon-df91ff37b17b5e6496cfe283e3fb46bb598f98104f82230101be61a8357f5525.scope.
Feb 23 09:46:25 np0005626463.localdomain podman[295073]: 2026-02-23 09:46:25.829491768 +0000 UTC m=+0.048961213 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 23 09:46:25 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 09:46:25 np0005626463.localdomain podman[295073]: 2026-02-23 09:46:25.947469146 +0000 UTC m=+0.166938531 container init df91ff37b17b5e6496cfe283e3fb46bb598f98104f82230101be61a8357f5525 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=affectionate_boyd, CEPH_POINT_RELEASE=, ceph=True, GIT_CLEAN=True, org.opencontainers.image.created=2026-02-09T10:25:24Z, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., io.openshift.expose-services=, io.buildah.version=1.42.2, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1770267347, RELEASE=main, io.openshift.tags=rhceph ceph, name=rhceph, architecture=x86_64, description=Red Hat Ceph Storage 7, version=7, vcs-type=git, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2026-02-09T10:25:24Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 23 09:46:25 np0005626463.localdomain sshd[295091]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:46:25 np0005626463.localdomain systemd[1]: libpod-df91ff37b17b5e6496cfe283e3fb46bb598f98104f82230101be61a8357f5525.scope: Deactivated successfully.
Feb 23 09:46:25 np0005626463.localdomain affectionate_boyd[295088]: 167 167
Feb 23 09:46:25 np0005626463.localdomain podman[295073]: 2026-02-23 09:46:25.960551517 +0000 UTC m=+0.180020932 container start df91ff37b17b5e6496cfe283e3fb46bb598f98104f82230101be61a8357f5525 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=affectionate_boyd, org.opencontainers.image.created=2026-02-09T10:25:24Z, version=7, CEPH_POINT_RELEASE=, RELEASE=main, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, description=Red Hat Ceph Storage 7, name=rhceph, release=1770267347, ceph=True, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, io.buildah.version=1.42.2, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, build-date=2026-02-09T10:25:24Z, io.openshift.expose-services=)
Feb 23 09:46:25 np0005626463.localdomain podman[295073]: 2026-02-23 09:46:25.960846546 +0000 UTC m=+0.180315971 container attach df91ff37b17b5e6496cfe283e3fb46bb598f98104f82230101be61a8357f5525 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=affectionate_boyd, com.redhat.component=rhceph-container, build-date=2026-02-09T10:25:24Z, name=rhceph, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, RELEASE=main, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, GIT_BRANCH=main, architecture=x86_64, CEPH_POINT_RELEASE=, io.buildah.version=1.42.2, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1770267347, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, ceph=True)
Feb 23 09:46:25 np0005626463.localdomain podman[295073]: 2026-02-23 09:46:25.963515628 +0000 UTC m=+0.182985063 container died df91ff37b17b5e6496cfe283e3fb46bb598f98104f82230101be61a8357f5525 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=affectionate_boyd, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, vcs-type=git, distribution-scope=public, GIT_BRANCH=main, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., io.buildah.version=1.42.2, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, release=1770267347, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-02-09T10:25:24Z, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, version=7, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.created=2026-02-09T10:25:24Z)
Feb 23 09:46:26 np0005626463.localdomain sshd[295091]: Accepted publickey for ceph-admin from 192.168.122.104 port 47372 ssh2: RSA SHA256:Xa/VMkXtB77nHz5d33Gpc1SPjvrShbbTtqHwAtI7vJo
Feb 23 09:46:26 np0005626463.localdomain systemd-logind[759]: New session 67 of user ceph-admin.
Feb 23 09:46:26 np0005626463.localdomain podman[295095]: 2026-02-23 09:46:26.05616274 +0000 UTC m=+0.085757342 container remove df91ff37b17b5e6496cfe283e3fb46bb598f98104f82230101be61a8357f5525 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=affectionate_boyd, GIT_BRANCH=main, build-date=2026-02-09T10:25:24Z, ceph=True, name=rhceph, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7, release=1770267347, GIT_CLEAN=True, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, vendor=Red Hat, Inc., io.buildah.version=1.42.2, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Feb 23 09:46:26 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@4(peon).osd e82 _set_new_cache_sizes cache_size:1019644620 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:46:26 np0005626463.localdomain systemd[1]: Started Session 67 of User ceph-admin.
Feb 23 09:46:26 np0005626463.localdomain systemd[1]: libpod-conmon-df91ff37b17b5e6496cfe283e3fb46bb598f98104f82230101be61a8357f5525.scope: Deactivated successfully.
Feb 23 09:46:26 np0005626463.localdomain sshd[295091]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Feb 23 09:46:26 np0005626463.localdomain sudo[295037]: pam_unix(sudo:session): session closed for user root
Feb 23 09:46:26 np0005626463.localdomain systemd[1]: session-64.scope: Deactivated successfully.
Feb 23 09:46:26 np0005626463.localdomain systemd[1]: session-64.scope: Consumed 24.553s CPU time.
Feb 23 09:46:26 np0005626463.localdomain systemd-logind[759]: Removed session 64.
Feb 23 09:46:26 np0005626463.localdomain sudo[295113]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 09:46:26 np0005626463.localdomain sudo[295113]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:46:26 np0005626463.localdomain sudo[295113]: pam_unix(sudo:session): session closed for user root
Feb 23 09:46:26 np0005626463.localdomain sudo[295131]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Feb 23 09:46:26 np0005626463.localdomain sudo[295131]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:46:26 np0005626463.localdomain ceph-mon[294160]: removing stray HostCache host record np0005626459.localdomain.devices.0
Feb 23 09:46:26 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 172.18.0.104:0/1738626939' entity='mgr.np0005626460.fyrady' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005626460.fyrady/mirror_snapshot_schedule"} : dispatch
Feb 23 09:46:26 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005626460.fyrady/mirror_snapshot_schedule"} : dispatch
Feb 23 09:46:26 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 172.18.0.104:0/1738626939' entity='mgr.np0005626460.fyrady' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005626460.fyrady/trash_purge_schedule"} : dispatch
Feb 23 09:46:26 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005626460.fyrady/trash_purge_schedule"} : dispatch
Feb 23 09:46:26 np0005626463.localdomain ceph-mon[294160]: mgrmap e20: np0005626460.fyrady(active, since 1.1267s), standbys: np0005626465.hlpkwo, np0005626463.wtksup, np0005626466.nisqfq, np0005626459.pmtxxl
Feb 23 09:46:26 np0005626463.localdomain ceph-mon[294160]: pgmap v4: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:46:26 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-a90b42d502dd51f8451e7a369fbadfad49a814ef823e8e68fce767d255c472e3-merged.mount: Deactivated successfully.
Feb 23 09:46:27 np0005626463.localdomain systemd[1]: tmp-crun.G4IdEQ.mount: Deactivated successfully.
Feb 23 09:46:27 np0005626463.localdomain podman[295220]: 2026-02-23 09:46:27.084478206 +0000 UTC m=+0.093892419 container exec fdf07215f0388d0ebc44f1f3744080ba594441e647c300d0dade62ff5beba234 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-crash-np0005626463, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_BRANCH=main, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, io.buildah.version=1.42.2, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, build-date=2026-02-09T10:25:24Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1770267347, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, version=7, vcs-type=git, io.openshift.tags=rhceph ceph, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, GIT_CLEAN=True)
Feb 23 09:46:27 np0005626463.localdomain podman[295220]: 2026-02-23 09:46:27.182715926 +0000 UTC m=+0.192130199 container exec_died fdf07215f0388d0ebc44f1f3744080ba594441e647c300d0dade62ff5beba234 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-crash-np0005626463, build-date=2026-02-09T10:25:24Z, description=Red Hat Ceph Storage 7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.tags=rhceph ceph, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, architecture=x86_64, org.opencontainers.image.created=2026-02-09T10:25:24Z, version=7, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, io.openshift.expose-services=, name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1770267347, GIT_BRANCH=main, vendor=Red Hat, Inc., io.buildah.version=1.42.2)
Feb 23 09:46:27 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.
Feb 23 09:46:27 np0005626463.localdomain podman[295272]: 2026-02-23 09:46:27.504594872 +0000 UTC m=+0.135474994 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.build-date=20260216, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Feb 23 09:46:27 np0005626463.localdomain podman[295272]: 2026-02-23 09:46:27.537250536 +0000 UTC m=+0.168130638 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, tcib_managed=true, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS)
Feb 23 09:46:27 np0005626463.localdomain systemd[1]: 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully.
Feb 23 09:46:27 np0005626463.localdomain ceph-mon[294160]: [23/Feb/2026:09:46:26] ENGINE Bus STARTING
Feb 23 09:46:27 np0005626463.localdomain ceph-mon[294160]: [23/Feb/2026:09:46:26] ENGINE Serving on http://172.18.0.104:8765
Feb 23 09:46:27 np0005626463.localdomain ceph-mon[294160]: [23/Feb/2026:09:46:27] ENGINE Serving on https://172.18.0.104:7150
Feb 23 09:46:27 np0005626463.localdomain ceph-mon[294160]: [23/Feb/2026:09:46:27] ENGINE Bus STARTED
Feb 23 09:46:27 np0005626463.localdomain ceph-mon[294160]: [23/Feb/2026:09:46:27] ENGINE Client ('172.18.0.104', 55702) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Feb 23 09:46:27 np0005626463.localdomain ceph-mon[294160]: Health check cleared: CEPHADM_FAILED_DAEMON (was: 1 failed cephadm daemon(s))
Feb 23 09:46:27 np0005626463.localdomain ceph-mon[294160]: Cluster is now healthy
Feb 23 09:46:27 np0005626463.localdomain ceph-mon[294160]: pgmap v5: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:46:27 np0005626463.localdomain ceph-mon[294160]: mgrmap e21: np0005626460.fyrady(active, since 2s), standbys: np0005626465.hlpkwo, np0005626463.wtksup, np0005626466.nisqfq, np0005626459.pmtxxl
Feb 23 09:46:27 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady' 
Feb 23 09:46:27 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady' 
Feb 23 09:46:27 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady' 
Feb 23 09:46:27 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady' 
Feb 23 09:46:27 np0005626463.localdomain sudo[295131]: pam_unix(sudo:session): session closed for user root
Feb 23 09:46:28 np0005626463.localdomain sudo[295358]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 09:46:28 np0005626463.localdomain sudo[295358]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:46:28 np0005626463.localdomain sudo[295358]: pam_unix(sudo:session): session closed for user root
Feb 23 09:46:28 np0005626463.localdomain sudo[295376]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 23 09:46:28 np0005626463.localdomain sudo[295376]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:46:28 np0005626463.localdomain sudo[295376]: pam_unix(sudo:session): session closed for user root
Feb 23 09:46:28 np0005626463.localdomain sudo[295427]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 09:46:28 np0005626463.localdomain sudo[295427]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:46:28 np0005626463.localdomain sudo[295427]: pam_unix(sudo:session): session closed for user root
Feb 23 09:46:28 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady' 
Feb 23 09:46:28 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady' 
Feb 23 09:46:28 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady' 
Feb 23 09:46:28 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady' 
Feb 23 09:46:28 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady' 
Feb 23 09:46:28 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady' 
Feb 23 09:46:28 np0005626463.localdomain sudo[295445]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 list-networks
Feb 23 09:46:28 np0005626463.localdomain sudo[295445]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:46:29 np0005626463.localdomain sudo[295445]: pam_unix(sudo:session): session closed for user root
Feb 23 09:46:29 np0005626463.localdomain sudo[295481]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Feb 23 09:46:29 np0005626463.localdomain sudo[295481]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:46:29 np0005626463.localdomain sudo[295481]: pam_unix(sudo:session): session closed for user root
Feb 23 09:46:29 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:46:29.632 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:46:29 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:46:29.635 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:46:29 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:46:29.636 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 09:46:29 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:46:29.636 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:46:29 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:46:29.637 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:46:29 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:46:29.638 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:46:29 np0005626463.localdomain sudo[295499]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/etc/ceph
Feb 23 09:46:29 np0005626463.localdomain sudo[295499]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:46:29 np0005626463.localdomain sudo[295499]: pam_unix(sudo:session): session closed for user root
Feb 23 09:46:29 np0005626463.localdomain sudo[295517]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/etc/ceph/ceph.conf.new
Feb 23 09:46:29 np0005626463.localdomain sudo[295517]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:46:29 np0005626463.localdomain sudo[295517]: pam_unix(sudo:session): session closed for user root
Feb 23 09:46:29 np0005626463.localdomain sudo[295535]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46
Feb 23 09:46:29 np0005626463.localdomain sudo[295535]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:46:29 np0005626463.localdomain sudo[295535]: pam_unix(sudo:session): session closed for user root
Feb 23 09:46:29 np0005626463.localdomain sudo[295553]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/etc/ceph/ceph.conf.new
Feb 23 09:46:29 np0005626463.localdomain sudo[295553]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:46:29 np0005626463.localdomain sudo[295553]: pam_unix(sudo:session): session closed for user root
Feb 23 09:46:30 np0005626463.localdomain sudo[295587]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/etc/ceph/ceph.conf.new
Feb 23 09:46:30 np0005626463.localdomain sudo[295587]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:46:30 np0005626463.localdomain sudo[295587]: pam_unix(sudo:session): session closed for user root
Feb 23 09:46:30 np0005626463.localdomain sudo[295605]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/etc/ceph/ceph.conf.new
Feb 23 09:46:30 np0005626463.localdomain sudo[295605]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:46:30 np0005626463.localdomain sudo[295605]: pam_unix(sudo:session): session closed for user root
Feb 23 09:46:30 np0005626463.localdomain sudo[295623]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Feb 23 09:46:30 np0005626463.localdomain sudo[295623]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:46:30 np0005626463.localdomain sudo[295623]: pam_unix(sudo:session): session closed for user root
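The sudo sequence above (mkdir, touch a `.new` file, chown, chmod, then `/bin/mv` into `/etc/ceph/ceph.conf`) is cephadm's write-then-rename pattern: the file is fully staged under `/tmp/cephadm-<fsid>/` and only renamed into place at the end, so readers never observe a partially written config. A minimal sketch of the same pattern in Python — the path and contents below are illustrative, not taken from this host:

```python
import os
import tempfile

def atomic_install(path, data, mode=0o644):
    """Stage data in a temp file next to the target, fix its mode,
    then rename() it into place -- readers never see a partial file."""
    d = os.path.dirname(path)
    os.makedirs(d, exist_ok=True)
    fd, tmp = tempfile.mkstemp(dir=d, suffix=".new")
    try:
        with os.fdopen(fd, "w") as f:
            f.write(data)
        os.chmod(tmp, mode)        # cephadm runs chmod 644 on the .new file
        # os.chown(tmp, 0, 0)      # the chown -R 0:0 step needs root
        os.replace(tmp, path)      # atomic on POSIX, like the /bin/mv step
    except Exception:
        os.unlink(tmp)
        raise

# Illustrative target path, not the real /etc/ceph/ceph.conf:
atomic_install("/tmp/demo-etc-ceph/ceph.conf", "[global]\nfsid = demo\n")
print(open("/tmp/demo-etc-ceph/ceph.conf").readline().strip())
```

Note the one difference for the keyring later in the log: it is installed with `chmod 600` rather than `644`, since `ceph.client.admin.keyring` holds a secret.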
Feb 23 09:46:30 np0005626463.localdomain sudo[295641]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config
Feb 23 09:46:30 np0005626463.localdomain sudo[295641]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:46:30 np0005626463.localdomain sudo[295641]: pam_unix(sudo:session): session closed for user root
Feb 23 09:46:30 np0005626463.localdomain sudo[295659]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config
Feb 23 09:46:30 np0005626463.localdomain sudo[295659]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:46:30 np0005626463.localdomain sudo[295659]: pam_unix(sudo:session): session closed for user root
Feb 23 09:46:30 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady' 
Feb 23 09:46:30 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady' 
Feb 23 09:46:30 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 172.18.0.104:0/1738626939' entity='mgr.np0005626460.fyrady' cmd={"prefix": "config rm", "who": "osd/host:np0005626460", "name": "osd_memory_target"} : dispatch
Feb 23 09:46:30 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady' cmd={"prefix": "config rm", "who": "osd/host:np0005626460", "name": "osd_memory_target"} : dispatch
Feb 23 09:46:30 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady' 
Feb 23 09:46:30 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady' 
Feb 23 09:46:30 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady' 
Feb 23 09:46:30 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 172.18.0.104:0/1738626939' entity='mgr.np0005626460.fyrady' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Feb 23 09:46:30 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady' 
Feb 23 09:46:30 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Feb 23 09:46:30 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 172.18.0.104:0/1738626939' entity='mgr.np0005626460.fyrady' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Feb 23 09:46:30 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady' 
Feb 23 09:46:30 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Feb 23 09:46:30 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 172.18.0.104:0/1738626939' entity='mgr.np0005626460.fyrady' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Feb 23 09:46:30 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 172.18.0.104:0/1738626939' entity='mgr.np0005626460.fyrady' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Feb 23 09:46:30 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Feb 23 09:46:30 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Feb 23 09:46:30 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady' 
Feb 23 09:46:30 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 172.18.0.104:0/1738626939' entity='mgr.np0005626460.fyrady' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Feb 23 09:46:30 np0005626463.localdomain ceph-mon[294160]: Adjusting osd_memory_target on np0005626466.localdomain to 836.6M
Feb 23 09:46:30 np0005626463.localdomain ceph-mon[294160]: Adjusting osd_memory_target on np0005626463.localdomain to 836.6M
Feb 23 09:46:30 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Feb 23 09:46:30 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady' 
Feb 23 09:46:30 np0005626463.localdomain ceph-mon[294160]: Unable to set osd_memory_target on np0005626466.localdomain to 877243801: error parsing value: Value '877243801' is below minimum 939524096
Feb 23 09:46:30 np0005626463.localdomain ceph-mon[294160]: Unable to set osd_memory_target on np0005626463.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Feb 23 09:46:30 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 172.18.0.104:0/1738626939' entity='mgr.np0005626460.fyrady' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Feb 23 09:46:30 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Feb 23 09:46:30 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady' 
Feb 23 09:46:30 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 172.18.0.104:0/1738626939' entity='mgr.np0005626460.fyrady' cmd={"prefix": "config rm", "who": "osd/host:np0005626461", "name": "osd_memory_target"} : dispatch
Feb 23 09:46:30 np0005626463.localdomain ceph-mon[294160]: Adjusting osd_memory_target on np0005626465.localdomain to 836.6M
Feb 23 09:46:30 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady' cmd={"prefix": "config rm", "who": "osd/host:np0005626461", "name": "osd_memory_target"} : dispatch
Feb 23 09:46:30 np0005626463.localdomain ceph-mon[294160]: Unable to set osd_memory_target on np0005626465.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
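The "Adjusting … to 836.6M" / "Unable to set … below minimum 939524096" pairs above show the mgr memory autotuner computing a per-OSD target of roughly 836.6 MiB and the config layer rejecting it, because `osd_memory_target` enforces a floor of 939524096 bytes (896 MiB). A small check that mirrors that arithmetic (the minimum constant is taken directly from the error text; the validation function is a sketch, not Ceph's actual code):

```python
MIN_OSD_MEMORY_TARGET = 939_524_096  # 896 MiB floor, per the mon error message

def autotune_ok(target_bytes):
    """Sketch of the bound check that produced the 'below minimum' errors."""
    return target_bytes >= MIN_OSD_MEMORY_TARGET

# The three rejected values from the log lines above:
for host, target in [("np0005626466", 877_243_801),
                     ("np0005626463", 877_246_668),
                     ("np0005626465", 877_246_668)]:
    mib = target / 1048576
    verdict = "ok" if autotune_ok(target) else "below minimum"
    print(f"{host}: {mib:.1f} MiB -> {verdict}")
```

Both rejected values round to the 836.6 MiB the autotuner reported, about 59 MiB short of the floor; the preceding `config rm` dispatches for `osd.0`–`osd.5` and the `osd/host:` masks are the autotuner clearing its earlier per-OSD overrides before attempting the new value.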
Feb 23 09:46:30 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 172.18.0.104:0/1738626939' entity='mgr.np0005626460.fyrady' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:46:30 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 172.18.0.104:0/1738626939' entity='mgr.np0005626460.fyrady' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 23 09:46:30 np0005626463.localdomain ceph-mon[294160]: Updating np0005626460.localdomain:/etc/ceph/ceph.conf
Feb 23 09:46:30 np0005626463.localdomain ceph-mon[294160]: Updating np0005626461.localdomain:/etc/ceph/ceph.conf
Feb 23 09:46:30 np0005626463.localdomain ceph-mon[294160]: Updating np0005626463.localdomain:/etc/ceph/ceph.conf
Feb 23 09:46:30 np0005626463.localdomain ceph-mon[294160]: Updating np0005626465.localdomain:/etc/ceph/ceph.conf
Feb 23 09:46:30 np0005626463.localdomain ceph-mon[294160]: Updating np0005626466.localdomain:/etc/ceph/ceph.conf
Feb 23 09:46:30 np0005626463.localdomain ceph-mon[294160]: pgmap v6: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:46:30 np0005626463.localdomain ceph-mon[294160]: mgrmap e22: np0005626460.fyrady(active, since 4s), standbys: np0005626465.hlpkwo, np0005626463.wtksup, np0005626466.nisqfq, np0005626459.pmtxxl
Feb 23 09:46:30 np0005626463.localdomain sudo[295677]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf.new
Feb 23 09:46:30 np0005626463.localdomain sudo[295677]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:46:30 np0005626463.localdomain sudo[295677]: pam_unix(sudo:session): session closed for user root
Feb 23 09:46:30 np0005626463.localdomain sudo[295695]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46
Feb 23 09:46:30 np0005626463.localdomain sudo[295695]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:46:30 np0005626463.localdomain sudo[295695]: pam_unix(sudo:session): session closed for user root
Feb 23 09:46:30 np0005626463.localdomain sudo[295713]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf.new
Feb 23 09:46:30 np0005626463.localdomain sudo[295713]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:46:30 np0005626463.localdomain sudo[295713]: pam_unix(sudo:session): session closed for user root
Feb 23 09:46:30 np0005626463.localdomain sudo[295747]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf.new
Feb 23 09:46:30 np0005626463.localdomain sudo[295747]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:46:30 np0005626463.localdomain sudo[295747]: pam_unix(sudo:session): session closed for user root
Feb 23 09:46:30 np0005626463.localdomain sudo[295765]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf.new
Feb 23 09:46:30 np0005626463.localdomain sudo[295765]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:46:30 np0005626463.localdomain sudo[295765]: pam_unix(sudo:session): session closed for user root
Feb 23 09:46:30 np0005626463.localdomain sudo[295783]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf.new /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf
Feb 23 09:46:30 np0005626463.localdomain sudo[295783]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:46:30 np0005626463.localdomain sudo[295783]: pam_unix(sudo:session): session closed for user root
Feb 23 09:46:30 np0005626463.localdomain sudo[295801]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Feb 23 09:46:30 np0005626463.localdomain sudo[295801]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:46:30 np0005626463.localdomain sudo[295801]: pam_unix(sudo:session): session closed for user root
Feb 23 09:46:30 np0005626463.localdomain sudo[295819]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/etc/ceph
Feb 23 09:46:30 np0005626463.localdomain sudo[295819]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:46:30 np0005626463.localdomain sudo[295819]: pam_unix(sudo:session): session closed for user root
Feb 23 09:46:30 np0005626463.localdomain sudo[295837]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/etc/ceph/ceph.client.admin.keyring.new
Feb 23 09:46:30 np0005626463.localdomain sudo[295837]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:46:30 np0005626463.localdomain sudo[295837]: pam_unix(sudo:session): session closed for user root
Feb 23 09:46:31 np0005626463.localdomain sudo[295855]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46
Feb 23 09:46:31 np0005626463.localdomain sudo[295855]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:46:31 np0005626463.localdomain sudo[295855]: pam_unix(sudo:session): session closed for user root
Feb 23 09:46:31 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@4(peon).osd e82 _set_new_cache_sizes cache_size:1020046648 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:46:31 np0005626463.localdomain sudo[295873]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/etc/ceph/ceph.client.admin.keyring.new
Feb 23 09:46:31 np0005626463.localdomain sudo[295873]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:46:31 np0005626463.localdomain sudo[295873]: pam_unix(sudo:session): session closed for user root
Feb 23 09:46:31 np0005626463.localdomain sudo[295907]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/etc/ceph/ceph.client.admin.keyring.new
Feb 23 09:46:31 np0005626463.localdomain sudo[295907]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:46:31 np0005626463.localdomain sudo[295907]: pam_unix(sudo:session): session closed for user root
Feb 23 09:46:31 np0005626463.localdomain sudo[295925]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/etc/ceph/ceph.client.admin.keyring.new
Feb 23 09:46:31 np0005626463.localdomain sudo[295925]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:46:31 np0005626463.localdomain sudo[295925]: pam_unix(sudo:session): session closed for user root
Feb 23 09:46:31 np0005626463.localdomain sudo[295943]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/etc/ceph/ceph.client.admin.keyring.new /etc/ceph/ceph.client.admin.keyring
Feb 23 09:46:31 np0005626463.localdomain sudo[295943]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:46:31 np0005626463.localdomain sudo[295943]: pam_unix(sudo:session): session closed for user root
Feb 23 09:46:31 np0005626463.localdomain sudo[295961]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config
Feb 23 09:46:31 np0005626463.localdomain sudo[295961]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:46:31 np0005626463.localdomain sudo[295961]: pam_unix(sudo:session): session closed for user root
Feb 23 09:46:31 np0005626463.localdomain ceph-mon[294160]: Updating np0005626460.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf
Feb 23 09:46:31 np0005626463.localdomain ceph-mon[294160]: Updating np0005626463.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf
Feb 23 09:46:31 np0005626463.localdomain ceph-mon[294160]: Updating np0005626466.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf
Feb 23 09:46:31 np0005626463.localdomain ceph-mon[294160]: Updating np0005626461.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf
Feb 23 09:46:31 np0005626463.localdomain ceph-mon[294160]: Updating np0005626465.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf
Feb 23 09:46:31 np0005626463.localdomain ceph-mon[294160]: Updating np0005626463.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 23 09:46:31 np0005626463.localdomain ceph-mon[294160]: Updating np0005626460.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 23 09:46:31 np0005626463.localdomain ceph-mon[294160]: Standby manager daemon np0005626461.lrfquh started
Feb 23 09:46:31 np0005626463.localdomain sudo[295979]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config
Feb 23 09:46:31 np0005626463.localdomain sudo[295979]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:46:31 np0005626463.localdomain sudo[295979]: pam_unix(sudo:session): session closed for user root
Feb 23 09:46:31 np0005626463.localdomain sudo[295997]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.client.admin.keyring.new
Feb 23 09:46:31 np0005626463.localdomain sudo[295997]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:46:31 np0005626463.localdomain sudo[295997]: pam_unix(sudo:session): session closed for user root
Feb 23 09:46:31 np0005626463.localdomain sudo[296015]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46
Feb 23 09:46:31 np0005626463.localdomain sudo[296015]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:46:31 np0005626463.localdomain sudo[296015]: pam_unix(sudo:session): session closed for user root
Feb 23 09:46:31 np0005626463.localdomain sudo[296033]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.client.admin.keyring.new
Feb 23 09:46:31 np0005626463.localdomain sudo[296033]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:46:31 np0005626463.localdomain sudo[296033]: pam_unix(sudo:session): session closed for user root
Feb 23 09:46:33 np0005626463.localdomain sudo[296067]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.client.admin.keyring.new
Feb 23 09:46:33 np0005626463.localdomain sudo[296067]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:46:33 np0005626463.localdomain sudo[296067]: pam_unix(sudo:session): session closed for user root
Feb 23 09:46:33 np0005626463.localdomain ceph-mon[294160]: Updating np0005626465.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 23 09:46:33 np0005626463.localdomain ceph-mon[294160]: Updating np0005626461.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 23 09:46:33 np0005626463.localdomain ceph-mon[294160]: Updating np0005626466.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 23 09:46:33 np0005626463.localdomain ceph-mon[294160]: Updating np0005626460.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.client.admin.keyring
Feb 23 09:46:33 np0005626463.localdomain ceph-mon[294160]: Updating np0005626465.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.client.admin.keyring
Feb 23 09:46:33 np0005626463.localdomain ceph-mon[294160]: Updating np0005626463.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.client.admin.keyring
Feb 23 09:46:33 np0005626463.localdomain ceph-mon[294160]: Updating np0005626461.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.client.admin.keyring
Feb 23 09:46:33 np0005626463.localdomain ceph-mon[294160]: pgmap v7: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:46:33 np0005626463.localdomain ceph-mon[294160]: Updating np0005626466.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.client.admin.keyring
Feb 23 09:46:33 np0005626463.localdomain sudo[296085]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.client.admin.keyring.new
Feb 23 09:46:33 np0005626463.localdomain sudo[296085]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:46:33 np0005626463.localdomain sudo[296085]: pam_unix(sudo:session): session closed for user root
Feb 23 09:46:33 np0005626463.localdomain sudo[296103]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.client.admin.keyring.new /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.client.admin.keyring
Feb 23 09:46:33 np0005626463.localdomain sudo[296103]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:46:33 np0005626463.localdomain sudo[296103]: pam_unix(sudo:session): session closed for user root
Feb 23 09:46:34 np0005626463.localdomain sudo[296121]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 09:46:34 np0005626463.localdomain sudo[296121]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:46:34 np0005626463.localdomain sudo[296121]: pam_unix(sudo:session): session closed for user root
Feb 23 09:46:34 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:46:34.638 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:46:34 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:46:34.640 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:46:34 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:46:34.640 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 09:46:34 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:46:34.641 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:46:34 np0005626463.localdomain ceph-mon[294160]: pgmap v8: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail; 30 KiB/s rd, 0 B/s wr, 16 op/s
Feb 23 09:46:34 np0005626463.localdomain ceph-mon[294160]: mgrmap e23: np0005626460.fyrady(active, since 8s), standbys: np0005626465.hlpkwo, np0005626463.wtksup, np0005626466.nisqfq, np0005626459.pmtxxl, np0005626461.lrfquh
Feb 23 09:46:34 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 172.18.0.104:0/1738626939' entity='mgr.np0005626460.fyrady' cmd={"prefix": "mgr metadata", "who": "np0005626461.lrfquh", "id": "np0005626461.lrfquh"} : dispatch
Feb 23 09:46:34 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady' 
Feb 23 09:46:34 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady' 
Feb 23 09:46:34 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady' 
Feb 23 09:46:34 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady' 
Feb 23 09:46:34 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady' 
Feb 23 09:46:34 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady' 
Feb 23 09:46:34 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady' 
Feb 23 09:46:34 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady' 
Feb 23 09:46:34 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady' 
Feb 23 09:46:34 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady' 
Feb 23 09:46:34 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady' 
Feb 23 09:46:34 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 172.18.0.104:0/1738626939' entity='mgr.np0005626460.fyrady' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 23 09:46:34 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:46:34.698 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:46:34 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:46:34.699 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
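The recurring ovsdbapp DEBUG lines trace the OVS reconnect state machine on the tcp:127.0.0.1:6640 ovsdb-server connection: after roughly 5000 ms with no traffic it sends an inactivity probe and enters IDLE, and the next received data (the `[POLLIN] on fd 20` wakeup) returns it to ACTIVE. A toy sketch of that probe logic, heavily simplified relative to the real `ovs/reconnect.py`:

```python
class Reconnect:
    """Toy model of the idle/probe cycle seen in the ovsdbapp DEBUG lines:
    ACTIVE --(idle >= interval, probe sent)--> IDLE --(data received)--> ACTIVE."""
    PROBE_INTERVAL_MS = 5000  # matches the ~5002 ms idle times in the log

    def __init__(self):
        self.state = "ACTIVE"
        self.idle_ms = 0

    def tick(self, elapsed_ms):
        """Advance the idle clock; emit a probe once the interval elapses."""
        self.idle_ms += elapsed_ms
        if self.state == "ACTIVE" and self.idle_ms >= self.PROBE_INTERVAL_MS:
            self.state = "IDLE"  # probe sent, waiting for any reply
            return "send inactivity probe"
        return None

    def received(self):
        """Any inbound data (POLLIN on the socket) revives the connection."""
        self.idle_ms = 0
        if self.state == "IDLE":
            self.state = "ACTIVE"

r = Reconnect()
print(r.tick(5002))  # past the interval -> probe, as at 09:46:34.640
print(r.state)       # IDLE
r.received()         # POLLIN on fd 20
print(r.state)       # ACTIVE again, as at 09:46:34.699
```

In the real implementation a probe that goes unanswered for another interval tears the connection down for reconnection; here the probe is always answered within ~60 ms, so the cycle just repeats every five seconds.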
Feb 23 09:46:34 np0005626463.localdomain sshd[296139]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:46:34 np0005626463.localdomain sudo[296140]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 09:46:34 np0005626463.localdomain sudo[296140]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:46:34 np0005626463.localdomain sudo[296140]: pam_unix(sudo:session): session closed for user root
Feb 23 09:46:34 np0005626463.localdomain sudo[296158]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid f1fea371-cb69-578d-a3d0-b5c472a84b46
Feb 23 09:46:34 np0005626463.localdomain sudo[296158]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:46:35 np0005626463.localdomain sshd[296189]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:46:35 np0005626463.localdomain sshd[296139]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:46:35 np0005626463.localdomain podman[296195]: 
Feb 23 09:46:35 np0005626463.localdomain podman[296195]: 2026-02-23 09:46:35.440989756 +0000 UTC m=+0.085657269 container create 0b288183a06c1b9b312330d7dc7cd22e507cfd431ac128cd722b6545a0a925c3 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=cool_gagarin, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-02-09T10:25:24Z, com.redhat.component=rhceph-container, GIT_CLEAN=True, vcs-type=git, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.42.2, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, ceph=True, RELEASE=main, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.created=2026-02-09T10:25:24Z, name=rhceph, GIT_BRANCH=main, architecture=x86_64, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1770267347)
Feb 23 09:46:35 np0005626463.localdomain systemd[1]: Started libpod-conmon-0b288183a06c1b9b312330d7dc7cd22e507cfd431ac128cd722b6545a0a925c3.scope.
Feb 23 09:46:35 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 09:46:35 np0005626463.localdomain podman[296195]: 2026-02-23 09:46:35.406712653 +0000 UTC m=+0.051380166 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 23 09:46:35 np0005626463.localdomain podman[296195]: 2026-02-23 09:46:35.512031847 +0000 UTC m=+0.156699350 container init 0b288183a06c1b9b312330d7dc7cd22e507cfd431ac128cd722b6545a0a925c3 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=cool_gagarin, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, CEPH_POINT_RELEASE=, release=1770267347, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, architecture=x86_64, io.openshift.tags=rhceph ceph, version=7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, RELEASE=main, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2026-02-09T10:25:24Z, description=Red Hat Ceph Storage 7, io.buildah.version=1.42.2, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=rhceph-container)
Feb 23 09:46:35 np0005626463.localdomain podman[296195]: 2026-02-23 09:46:35.523207457 +0000 UTC m=+0.167874960 container start 0b288183a06c1b9b312330d7dc7cd22e507cfd431ac128cd722b6545a0a925c3 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=cool_gagarin, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2026-02-09T10:25:24Z, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.buildah.version=1.42.2, name=rhceph, GIT_BRANCH=main, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, vcs-type=git, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1770267347, RELEASE=main, org.opencontainers.image.created=2026-02-09T10:25:24Z, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64)
Feb 23 09:46:35 np0005626463.localdomain podman[296195]: 2026-02-23 09:46:35.523471265 +0000 UTC m=+0.168138818 container attach 0b288183a06c1b9b312330d7dc7cd22e507cfd431ac128cd722b6545a0a925c3 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=cool_gagarin, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, release=1770267347, RELEASE=main, name=rhceph, version=7, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, architecture=x86_64, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, io.buildah.version=1.42.2, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2026-02-09T10:25:24Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2026-02-09T10:25:24Z, io.openshift.expose-services=, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph)
Feb 23 09:46:35 np0005626463.localdomain cool_gagarin[296210]: 167 167
Feb 23 09:46:35 np0005626463.localdomain systemd[1]: libpod-0b288183a06c1b9b312330d7dc7cd22e507cfd431ac128cd722b6545a0a925c3.scope: Deactivated successfully.
Feb 23 09:46:35 np0005626463.localdomain podman[296195]: 2026-02-23 09:46:35.528656163 +0000 UTC m=+0.173323686 container died 0b288183a06c1b9b312330d7dc7cd22e507cfd431ac128cd722b6545a0a925c3 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=cool_gagarin, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, GIT_CLEAN=True, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, GIT_BRANCH=main, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1770267347, version=7, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.buildah.version=1.42.2, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2026-02-09T10:25:24Z, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7)
Feb 23 09:46:35 np0005626463.localdomain podman[296215]: 2026-02-23 09:46:35.617672843 +0000 UTC m=+0.080155921 container remove 0b288183a06c1b9b312330d7dc7cd22e507cfd431ac128cd722b6545a0a925c3 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=cool_gagarin, GIT_BRANCH=main, com.redhat.component=rhceph-container, RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1770267347, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_CLEAN=True, vcs-type=git, io.openshift.expose-services=, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-02-09T10:25:24Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, CEPH_POINT_RELEASE=, version=7, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, io.buildah.version=1.42.2, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2026-02-09T10:25:24Z, name=rhceph, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph)
Feb 23 09:46:35 np0005626463.localdomain systemd[1]: libpod-conmon-0b288183a06c1b9b312330d7dc7cd22e507cfd431ac128cd722b6545a0a925c3.scope: Deactivated successfully.
Feb 23 09:46:35 np0005626463.localdomain ceph-mon[294160]: pgmap v9: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail; 27 KiB/s rd, 0 B/s wr, 15 op/s
Feb 23 09:46:35 np0005626463.localdomain ceph-mon[294160]: Health check failed: 1 stray daemon(s) not managed by cephadm (CEPHADM_STRAY_DAEMON)
Feb 23 09:46:35 np0005626463.localdomain ceph-mon[294160]: Health check failed: 1 stray host(s) with 1 daemon(s) not managed by cephadm (CEPHADM_STRAY_HOST)
Feb 23 09:46:35 np0005626463.localdomain ceph-mon[294160]: Reconfiguring mds.mds.np0005626463.qcthuc (monmap changed)...
Feb 23 09:46:35 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 172.18.0.104:0/1738626939' entity='mgr.np0005626460.fyrady' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005626463.qcthuc", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 23 09:46:35 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005626463.qcthuc", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 23 09:46:35 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 172.18.0.104:0/1738626939' entity='mgr.np0005626460.fyrady' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:46:35 np0005626463.localdomain ceph-mon[294160]: Reconfiguring daemon mds.mds.np0005626463.qcthuc on np0005626463.localdomain
Feb 23 09:46:35 np0005626463.localdomain sudo[296158]: pam_unix(sudo:session): session closed for user root
Feb 23 09:46:35 np0005626463.localdomain sshd[296189]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:46:35 np0005626463.localdomain sudo[296232]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 09:46:35 np0005626463.localdomain sudo[296232]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:46:35 np0005626463.localdomain sudo[296232]: pam_unix(sudo:session): session closed for user root
Feb 23 09:46:35 np0005626463.localdomain sudo[296250]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid f1fea371-cb69-578d-a3d0-b5c472a84b46
Feb 23 09:46:35 np0005626463.localdomain sudo[296250]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:46:36 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@4(peon).osd e82 _set_new_cache_sizes cache_size:1020054580 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:46:36 np0005626463.localdomain podman[296284]: 2026-02-23 09:46:36.331644403 +0000 UTC m=+0.078417698 container create 57e941f9c9b822e0a1f95b7369799a6cc5b05c9521cb2924197e9e94ecb66c42 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=exciting_swanson, GIT_CLEAN=True, org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-type=git, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.42.2, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2026-02-09T10:25:24Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, name=rhceph, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, release=1770267347, ceph=True, distribution-scope=public, version=7, io.openshift.expose-services=, GIT_BRANCH=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14)
Feb 23 09:46:36 np0005626463.localdomain systemd[1]: Started libpod-conmon-57e941f9c9b822e0a1f95b7369799a6cc5b05c9521cb2924197e9e94ecb66c42.scope.
Feb 23 09:46:36 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 09:46:36 np0005626463.localdomain podman[296284]: 2026-02-23 09:46:36.393312389 +0000 UTC m=+0.140085724 container init 57e941f9c9b822e0a1f95b7369799a6cc5b05c9521cb2924197e9e94ecb66c42 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=exciting_swanson, architecture=x86_64, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.buildah.version=1.42.2, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.created=2026-02-09T10:25:24Z, release=1770267347, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, ceph=True, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, build-date=2026-02-09T10:25:24Z, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14)
Feb 23 09:46:36 np0005626463.localdomain podman[296284]: 2026-02-23 09:46:36.297900696 +0000 UTC m=+0.044674011 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 23 09:46:36 np0005626463.localdomain podman[296284]: 2026-02-23 09:46:36.400499748 +0000 UTC m=+0.147273043 container start 57e941f9c9b822e0a1f95b7369799a6cc5b05c9521cb2924197e9e94ecb66c42 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=exciting_swanson, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, ceph=True, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, build-date=2026-02-09T10:25:24Z, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=rhceph-container, name=rhceph, io.buildah.version=1.42.2, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.created=2026-02-09T10:25:24Z, vendor=Red Hat, Inc., GIT_CLEAN=True, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-type=git, release=1770267347)
Feb 23 09:46:36 np0005626463.localdomain podman[296284]: 2026-02-23 09:46:36.400939971 +0000 UTC m=+0.147713266 container attach 57e941f9c9b822e0a1f95b7369799a6cc5b05c9521cb2924197e9e94ecb66c42 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=exciting_swanson, description=Red Hat Ceph Storage 7, distribution-scope=public, build-date=2026-02-09T10:25:24Z, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.42.2, url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, ceph=True, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, vcs-type=git, architecture=x86_64, vendor=Red Hat, Inc., io.openshift.expose-services=, GIT_CLEAN=True, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1770267347, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Feb 23 09:46:36 np0005626463.localdomain exciting_swanson[296300]: 167 167
Feb 23 09:46:36 np0005626463.localdomain systemd[1]: libpod-57e941f9c9b822e0a1f95b7369799a6cc5b05c9521cb2924197e9e94ecb66c42.scope: Deactivated successfully.
Feb 23 09:46:36 np0005626463.localdomain podman[296284]: 2026-02-23 09:46:36.404404137 +0000 UTC m=+0.151177452 container died 57e941f9c9b822e0a1f95b7369799a6cc5b05c9521cb2924197e9e94ecb66c42 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=exciting_swanson, release=1770267347, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, description=Red Hat Ceph Storage 7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, version=7, build-date=2026-02-09T10:25:24Z, RELEASE=main, distribution-scope=public, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, io.buildah.version=1.42.2, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main)
Feb 23 09:46:36 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-c1e744a8ad4bd2d9f8790c7b7d1756aecd0c747ece9a113d10c94dcad058ee51-merged.mount: Deactivated successfully.
Feb 23 09:46:36 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-b93adf770a490ffdcc678b074d950fbad1c73cb20c852b53bdf2231751fdb3be-merged.mount: Deactivated successfully.
Feb 23 09:46:36 np0005626463.localdomain podman[296305]: 2026-02-23 09:46:36.511687112 +0000 UTC m=+0.090986690 container remove 57e941f9c9b822e0a1f95b7369799a6cc5b05c9521cb2924197e9e94ecb66c42 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=exciting_swanson, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.42.2, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, ceph=True, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, release=1770267347, vcs-type=git, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, CEPH_POINT_RELEASE=, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2026-02-09T10:25:24Z, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-02-09T10:25:24Z, version=7, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True)
Feb 23 09:46:36 np0005626463.localdomain systemd[1]: libpod-conmon-57e941f9c9b822e0a1f95b7369799a6cc5b05c9521cb2924197e9e94ecb66c42.scope: Deactivated successfully.
Feb 23 09:46:36 np0005626463.localdomain sudo[296250]: pam_unix(sudo:session): session closed for user root
Feb 23 09:46:36 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady' 
Feb 23 09:46:36 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady' 
Feb 23 09:46:36 np0005626463.localdomain ceph-mon[294160]: Reconfiguring mgr.np0005626463.wtksup (monmap changed)...
Feb 23 09:46:36 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 172.18.0.104:0/1738626939' entity='mgr.np0005626460.fyrady' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626463.wtksup", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 23 09:46:36 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626463.wtksup", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 23 09:46:36 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 172.18.0.104:0/1738626939' entity='mgr.np0005626460.fyrady' cmd={"prefix": "mgr services"} : dispatch
Feb 23 09:46:36 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 172.18.0.104:0/1738626939' entity='mgr.np0005626460.fyrady' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:46:36 np0005626463.localdomain ceph-mon[294160]: Reconfiguring daemon mgr.np0005626463.wtksup on np0005626463.localdomain
Feb 23 09:46:36 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady' 
Feb 23 09:46:36 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady' 
Feb 23 09:46:36 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady' 
Feb 23 09:46:36 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 172.18.0.104:0/1738626939' entity='mgr.np0005626460.fyrady' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626465", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 23 09:46:36 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626465", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 23 09:46:36 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 172.18.0.104:0/1738626939' entity='mgr.np0005626460.fyrady' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:46:37 np0005626463.localdomain ceph-mon[294160]: pgmap v10: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail; 21 KiB/s rd, 0 B/s wr, 11 op/s
Feb 23 09:46:37 np0005626463.localdomain ceph-mon[294160]: Reconfiguring crash.np0005626465 (monmap changed)...
Feb 23 09:46:37 np0005626463.localdomain ceph-mon[294160]: Reconfiguring daemon crash.np0005626465 on np0005626465.localdomain
Feb 23 09:46:37 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady' 
Feb 23 09:46:37 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady' 
Feb 23 09:46:37 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 172.18.0.104:0/1738626939' entity='mgr.np0005626460.fyrady' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch
Feb 23 09:46:37 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 172.18.0.104:0/1738626939' entity='mgr.np0005626460.fyrady' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:46:38 np0005626463.localdomain ceph-mon[294160]: from='client.26940 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Feb 23 09:46:38 np0005626463.localdomain ceph-mon[294160]: Reconfiguring osd.0 (monmap changed)...
Feb 23 09:46:38 np0005626463.localdomain ceph-mon[294160]: Reconfiguring daemon osd.0 on np0005626465.localdomain
Feb 23 09:46:38 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady' 
Feb 23 09:46:38 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady' 
Feb 23 09:46:38 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady' 
Feb 23 09:46:38 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady' 
Feb 23 09:46:38 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 172.18.0.104:0/1738626939' entity='mgr.np0005626460.fyrady' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch
Feb 23 09:46:38 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 172.18.0.104:0/1738626939' entity='mgr.np0005626460.fyrady' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:46:38 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.
Feb 23 09:46:38 np0005626463.localdomain podman[296322]: 2026-02-23 09:46:38.92434721 +0000 UTC m=+0.090351570 container health_status da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 23 09:46:38 np0005626463.localdomain podman[296322]: 2026-02-23 09:46:38.935666975 +0000 UTC m=+0.101671365 container exec_died da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 23 09:46:38 np0005626463.localdomain systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Deactivated successfully.
Feb 23 09:46:39 np0005626463.localdomain podman[242954]: time="2026-02-23T09:46:39Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 23 09:46:39 np0005626463.localdomain podman[242954]: @ - - [23/Feb/2026:09:46:39 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155258 "" "Go-http-client/1.1"
Feb 23 09:46:39 np0005626463.localdomain podman[242954]: @ - - [23/Feb/2026:09:46:39 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18260 "" "Go-http-client/1.1"
Feb 23 09:46:39 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:46:39.700 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:46:39 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:46:39.702 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:46:39 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:46:39.702 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 09:46:39 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:46:39.702 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:46:39 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:46:39.746 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:46:39 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:46:39.747 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:46:39 np0005626463.localdomain ceph-mon[294160]: pgmap v11: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail; 19 KiB/s rd, 0 B/s wr, 10 op/s
Feb 23 09:46:39 np0005626463.localdomain ceph-mon[294160]: Reconfiguring osd.3 (monmap changed)...
Feb 23 09:46:39 np0005626463.localdomain ceph-mon[294160]: Reconfiguring daemon osd.3 on np0005626465.localdomain
Feb 23 09:46:39 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady' 
Feb 23 09:46:39 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady' 
Feb 23 09:46:39 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady' 
Feb 23 09:46:39 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady' 
Feb 23 09:46:39 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady' 
Feb 23 09:46:39 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 172.18.0.104:0/1738626939' entity='mgr.np0005626460.fyrady' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005626465.drvnoy", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 23 09:46:39 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005626465.drvnoy", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 23 09:46:39 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 172.18.0.104:0/1738626939' entity='mgr.np0005626460.fyrady' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:46:40 np0005626463.localdomain ceph-mon[294160]: from='client.34257 -' entity='client.admin' cmd=[{"prefix": "orch apply", "target": ["mon-mgr", ""]}]: dispatch
Feb 23 09:46:40 np0005626463.localdomain ceph-mon[294160]: Saving service mon spec with placement label:mon
Feb 23 09:46:40 np0005626463.localdomain ceph-mon[294160]: Reconfiguring mds.mds.np0005626465.drvnoy (monmap changed)...
Feb 23 09:46:40 np0005626463.localdomain ceph-mon[294160]: Reconfiguring daemon mds.mds.np0005626465.drvnoy on np0005626465.localdomain
Feb 23 09:46:40 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady' 
Feb 23 09:46:40 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady' 
Feb 23 09:46:40 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 172.18.0.104:0/1738626939' entity='mgr.np0005626460.fyrady' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626465.hlpkwo", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 23 09:46:40 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626465.hlpkwo", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 23 09:46:40 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 172.18.0.104:0/1738626939' entity='mgr.np0005626460.fyrady' cmd={"prefix": "mgr services"} : dispatch
Feb 23 09:46:40 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 172.18.0.104:0/1738626939' entity='mgr.np0005626460.fyrady' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:46:41 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@4(peon).osd e82 _set_new_cache_sizes cache_size:1020054729 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:46:41 np0005626463.localdomain ceph-mon[294160]: pgmap v12: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail; 19 KiB/s rd, 0 B/s wr, 10 op/s
Feb 23 09:46:41 np0005626463.localdomain ceph-mon[294160]: Reconfiguring mgr.np0005626465.hlpkwo (monmap changed)...
Feb 23 09:46:41 np0005626463.localdomain ceph-mon[294160]: Reconfiguring daemon mgr.np0005626465.hlpkwo on np0005626465.localdomain
Feb 23 09:46:41 np0005626463.localdomain ceph-mon[294160]: from='client.44139 -' entity='client.admin' cmd=[{"prefix": "orch ps", "daemon_type": "mon", "daemon_id": "np0005626463", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Feb 23 09:46:41 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady' 
Feb 23 09:46:41 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady' 
Feb 23 09:46:41 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 172.18.0.104:0/1738626939' entity='mgr.np0005626460.fyrady' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Feb 23 09:46:41 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 172.18.0.104:0/1738626939' entity='mgr.np0005626460.fyrady' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Feb 23 09:46:41 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 172.18.0.104:0/1738626939' entity='mgr.np0005626460.fyrady' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:46:42 np0005626463.localdomain ceph-mon[294160]: Reconfiguring mon.np0005626465 (monmap changed)...
Feb 23 09:46:42 np0005626463.localdomain ceph-mon[294160]: Reconfiguring daemon mon.np0005626465 on np0005626465.localdomain
Feb 23 09:46:42 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady' 
Feb 23 09:46:42 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady' 
Feb 23 09:46:42 np0005626463.localdomain ceph-mon[294160]: Reconfiguring crash.np0005626466 (monmap changed)...
Feb 23 09:46:42 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 172.18.0.104:0/1738626939' entity='mgr.np0005626460.fyrady' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626466", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 23 09:46:42 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626466", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 23 09:46:42 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 172.18.0.104:0/1738626939' entity='mgr.np0005626460.fyrady' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:46:42 np0005626463.localdomain ceph-mon[294160]: Reconfiguring daemon crash.np0005626466 on np0005626466.localdomain
Feb 23 09:46:42 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.200:0/2879459597' entity='client.admin' cmd={"prefix": "status", "format": "json"} : dispatch
Feb 23 09:46:42 np0005626463.localdomain ceph-mon[294160]: pgmap v13: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail; 19 KiB/s rd, 0 B/s wr, 10 op/s
Feb 23 09:46:43 np0005626463.localdomain openstack_network_exporter[245358]: ERROR   09:46:43 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 23 09:46:43 np0005626463.localdomain openstack_network_exporter[245358]: 
Feb 23 09:46:43 np0005626463.localdomain openstack_network_exporter[245358]: ERROR   09:46:43 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 23 09:46:43 np0005626463.localdomain openstack_network_exporter[245358]: 
Feb 23 09:46:43 np0005626463.localdomain sshd[292400]: Received disconnect from 192.168.122.11 port 50738:11: disconnected by user
Feb 23 09:46:43 np0005626463.localdomain sshd[292400]: Disconnected from user tripleo-admin 192.168.122.11 port 50738
Feb 23 09:46:43 np0005626463.localdomain sshd[292381]: pam_unix(sshd:session): session closed for user tripleo-admin
Feb 23 09:46:43 np0005626463.localdomain systemd[1]: session-65.scope: Deactivated successfully.
Feb 23 09:46:43 np0005626463.localdomain systemd[1]: session-65.scope: Consumed 1.746s CPU time.
Feb 23 09:46:43 np0005626463.localdomain systemd-logind[759]: Session 65 logged out. Waiting for processes to exit.
Feb 23 09:46:43 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.
Feb 23 09:46:43 np0005626463.localdomain systemd-logind[759]: Removed session 65.
Feb 23 09:46:43 np0005626463.localdomain podman[296345]: 2026-02-23 09:46:43.553994643 +0000 UTC m=+0.094672003 container health_status 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, name=ubi9/ubi-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, managed_by=edpm_ansible, version=9.7, config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, build-date=2026-02-05T04:57:10Z, architecture=x86_64, release=1770267347, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.buildah.version=1.33.7, org.opencontainers.image.created=2026-02-05T04:57:10Z, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., distribution-scope=public, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Feb 23 09:46:43 np0005626463.localdomain podman[296345]: 2026-02-23 09:46:43.56639855 +0000 UTC m=+0.107075890 container exec_died 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, org.opencontainers.image.created=2026-02-05T04:57:10Z, vendor=Red Hat, Inc., managed_by=edpm_ansible, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, container_name=openstack_network_exporter, io.buildah.version=1.33.7, config_id=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that 
uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1770267347, distribution-scope=public, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, name=ubi9/ubi-minimal, build-date=2026-02-05T04:57:10Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, maintainer=Red Hat, Inc., vcs-type=git, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.7, url=https://catalog.redhat.com/en/search?searchType=containers)
Feb 23 09:46:43 np0005626463.localdomain systemd[1]: 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.service: Deactivated successfully.
Feb 23 09:46:44 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady' 
Feb 23 09:46:44 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady' 
Feb 23 09:46:44 np0005626463.localdomain ceph-mon[294160]: Reconfiguring osd.1 (monmap changed)...
Feb 23 09:46:44 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 172.18.0.104:0/1738626939' entity='mgr.np0005626460.fyrady' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch
Feb 23 09:46:44 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 172.18.0.104:0/1738626939' entity='mgr.np0005626460.fyrady' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:46:44 np0005626463.localdomain ceph-mon[294160]: Reconfiguring daemon osd.1 on np0005626466.localdomain
Feb 23 09:46:44 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady' 
Feb 23 09:46:44 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady' 
Feb 23 09:46:44 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady' 
Feb 23 09:46:44 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:46:44.750 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:46:44 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:46:44.752 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:46:44 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:46:44.752 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 09:46:44 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:46:44.752 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:46:44 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:46:44.792 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:46:44 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:46:44.793 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:46:45 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady' 
Feb 23 09:46:45 np0005626463.localdomain ceph-mon[294160]: Reconfiguring osd.4 (monmap changed)...
Feb 23 09:46:45 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 172.18.0.104:0/1738626939' entity='mgr.np0005626460.fyrady' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch
Feb 23 09:46:45 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 172.18.0.104:0/1738626939' entity='mgr.np0005626460.fyrady' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:46:45 np0005626463.localdomain ceph-mon[294160]: Reconfiguring daemon osd.4 on np0005626466.localdomain
Feb 23 09:46:45 np0005626463.localdomain ceph-mon[294160]: pgmap v14: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:46:45 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady' 
Feb 23 09:46:46 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady' 
Feb 23 09:46:46 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady' 
Feb 23 09:46:46 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady' 
Feb 23 09:46:46 np0005626463.localdomain ceph-mon[294160]: Reconfiguring mds.mds.np0005626466.vaywlp (monmap changed)...
Feb 23 09:46:46 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 172.18.0.104:0/1738626939' entity='mgr.np0005626460.fyrady' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005626466.vaywlp", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 23 09:46:46 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005626466.vaywlp", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 23 09:46:46 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 172.18.0.104:0/1738626939' entity='mgr.np0005626460.fyrady' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:46:46 np0005626463.localdomain ceph-mon[294160]: Reconfiguring daemon mds.mds.np0005626466.vaywlp on np0005626466.localdomain
Feb 23 09:46:46 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady' 
Feb 23 09:46:46 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady' 
Feb 23 09:46:46 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 172.18.0.104:0/1738626939' entity='mgr.np0005626460.fyrady' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626466.nisqfq", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 23 09:46:46 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626466.nisqfq", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 23 09:46:46 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 172.18.0.104:0/1738626939' entity='mgr.np0005626460.fyrady' cmd={"prefix": "mgr services"} : dispatch
Feb 23 09:46:46 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 172.18.0.104:0/1738626939' entity='mgr.np0005626460.fyrady' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:46:46 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@4(peon).osd e82 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:46:47 np0005626463.localdomain ceph-mon[294160]: Reconfiguring mgr.np0005626466.nisqfq (monmap changed)...
Feb 23 09:46:47 np0005626463.localdomain ceph-mon[294160]: Reconfiguring daemon mgr.np0005626466.nisqfq on np0005626466.localdomain
Feb 23 09:46:47 np0005626463.localdomain ceph-mon[294160]: pgmap v15: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:46:47 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady' 
Feb 23 09:46:47 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady' 
Feb 23 09:46:47 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 172.18.0.104:0/1738626939' entity='mgr.np0005626460.fyrady' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Feb 23 09:46:47 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 172.18.0.104:0/1738626939' entity='mgr.np0005626460.fyrady' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Feb 23 09:46:47 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 172.18.0.104:0/1738626939' entity='mgr.np0005626460.fyrady' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:46:47 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.200:0/145607622' entity='client.admin' cmd={"prefix": "mgr stat", "format": "json"} : dispatch
Feb 23 09:46:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:46:48.548 163572 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 09:46:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:46:48.548 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 09:46:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:46:48.549 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 09:46:48 np0005626463.localdomain ceph-mon[294160]: Reconfiguring mon.np0005626466 (monmap changed)...
Feb 23 09:46:48 np0005626463.localdomain ceph-mon[294160]: Reconfiguring daemon mon.np0005626466 on np0005626466.localdomain
Feb 23 09:46:48 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady' 
Feb 23 09:46:48 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady' 
Feb 23 09:46:48 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 172.18.0.104:0/1738626939' entity='mgr.np0005626460.fyrady' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:46:48 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 172.18.0.104:0/1738626939' entity='mgr.np0005626460.fyrady' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 23 09:46:48 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady' 
Feb 23 09:46:48 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady' 
Feb 23 09:46:48 np0005626463.localdomain ceph-mon[294160]: from='mgr.24104 172.18.0.104:0/1738626939' entity='mgr.np0005626460.fyrady' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 23 09:46:48 np0005626463.localdomain sudo[296364]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 09:46:48 np0005626463.localdomain sudo[296364]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:46:48 np0005626463.localdomain sudo[296364]: pam_unix(sudo:session): session closed for user root
Feb 23 09:46:48 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@4(peon).osd e83 e83: 6 total, 6 up, 6 in
Feb 23 09:46:49 np0005626463.localdomain sshd[295091]: pam_unix(sshd:session): session closed for user ceph-admin
Feb 23 09:46:49 np0005626463.localdomain systemd[1]: session-67.scope: Deactivated successfully.
Feb 23 09:46:49 np0005626463.localdomain systemd[1]: session-67.scope: Consumed 7.147s CPU time.
Feb 23 09:46:49 np0005626463.localdomain systemd-logind[759]: Session 67 logged out. Waiting for processes to exit.
Feb 23 09:46:49 np0005626463.localdomain systemd-logind[759]: Removed session 67.
Feb 23 09:46:49 np0005626463.localdomain sshd[296382]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:46:49 np0005626463.localdomain sshd[296382]: Accepted publickey for ceph-admin from 192.168.122.107 port 57658 ssh2: RSA SHA256:Xa/VMkXtB77nHz5d33Gpc1SPjvrShbbTtqHwAtI7vJo
Feb 23 09:46:49 np0005626463.localdomain systemd-logind[759]: New session 68 of user ceph-admin.
Feb 23 09:46:49 np0005626463.localdomain systemd[1]: Started Session 68 of User ceph-admin.
Feb 23 09:46:49 np0005626463.localdomain sshd[296382]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Feb 23 09:46:49 np0005626463.localdomain sudo[296386]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 09:46:49 np0005626463.localdomain sudo[296386]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:46:49 np0005626463.localdomain sudo[296386]: pam_unix(sudo:session): session closed for user root
Feb 23 09:46:49 np0005626463.localdomain sudo[296404]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Feb 23 09:46:49 np0005626463.localdomain sudo[296404]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:46:49 np0005626463.localdomain ceph-mon[294160]: pgmap v16: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:46:49 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.200:0/3611724471' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch
Feb 23 09:46:49 np0005626463.localdomain ceph-mon[294160]: from='client.? ' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch
Feb 23 09:46:49 np0005626463.localdomain ceph-mon[294160]: Activating manager daemon np0005626465.hlpkwo
Feb 23 09:46:49 np0005626463.localdomain ceph-mon[294160]: osdmap e83: 6 total, 6 up, 6 in
Feb 23 09:46:49 np0005626463.localdomain ceph-mon[294160]: from='client.? ' entity='client.admin' cmd='[{"prefix": "mgr fail"}]': finished
Feb 23 09:46:49 np0005626463.localdomain ceph-mon[294160]: mgrmap e24: np0005626465.hlpkwo(active, starting, since 0.0537193s), standbys: np0005626463.wtksup, np0005626466.nisqfq, np0005626459.pmtxxl, np0005626461.lrfquh
Feb 23 09:46:49 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "mon metadata", "id": "np0005626460"} : dispatch
Feb 23 09:46:49 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "mon metadata", "id": "np0005626461"} : dispatch
Feb 23 09:46:49 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "mon metadata", "id": "np0005626463"} : dispatch
Feb 23 09:46:49 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "mon metadata", "id": "np0005626465"} : dispatch
Feb 23 09:46:49 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "mon metadata", "id": "np0005626466"} : dispatch
Feb 23 09:46:49 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "mds metadata", "who": "mds.np0005626465.drvnoy"} : dispatch
Feb 23 09:46:49 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "mds metadata", "who": "mds.np0005626466.vaywlp"} : dispatch
Feb 23 09:46:49 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "mds metadata", "who": "mds.np0005626463.qcthuc"} : dispatch
Feb 23 09:46:49 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "mgr metadata", "who": "np0005626465.hlpkwo", "id": "np0005626465.hlpkwo"} : dispatch
Feb 23 09:46:49 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "mgr metadata", "who": "np0005626463.wtksup", "id": "np0005626463.wtksup"} : dispatch
Feb 23 09:46:49 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "mgr metadata", "who": "np0005626466.nisqfq", "id": "np0005626466.nisqfq"} : dispatch
Feb 23 09:46:49 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "mgr metadata", "who": "np0005626459.pmtxxl", "id": "np0005626459.pmtxxl"} : dispatch
Feb 23 09:46:49 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "mgr metadata", "who": "np0005626461.lrfquh", "id": "np0005626461.lrfquh"} : dispatch
Feb 23 09:46:49 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "osd metadata", "id": 0} : dispatch
Feb 23 09:46:49 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "osd metadata", "id": 1} : dispatch
Feb 23 09:46:49 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "osd metadata", "id": 2} : dispatch
Feb 23 09:46:49 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "osd metadata", "id": 3} : dispatch
Feb 23 09:46:49 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "osd metadata", "id": 4} : dispatch
Feb 23 09:46:49 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "osd metadata", "id": 5} : dispatch
Feb 23 09:46:49 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "mds metadata"} : dispatch
Feb 23 09:46:49 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "osd metadata"} : dispatch
Feb 23 09:46:49 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "mon metadata"} : dispatch
Feb 23 09:46:49 np0005626463.localdomain ceph-mon[294160]: Manager daemon np0005626465.hlpkwo is now available
Feb 23 09:46:49 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005626465.hlpkwo/mirror_snapshot_schedule"} : dispatch
Feb 23 09:46:49 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005626465.hlpkwo/mirror_snapshot_schedule"} : dispatch
Feb 23 09:46:49 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005626465.hlpkwo/trash_purge_schedule"} : dispatch
Feb 23 09:46:49 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005626465.hlpkwo/trash_purge_schedule"} : dispatch
Feb 23 09:46:49 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:46:49.793 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:46:49 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:46:49.796 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:46:49 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:46:49.797 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 09:46:49 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:46:49.797 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:46:49 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:46:49.847 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:46:49 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:46:49.848 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:46:50 np0005626463.localdomain podman[296490]: 2026-02-23 09:46:50.494316251 +0000 UTC m=+0.091537906 container exec fdf07215f0388d0ebc44f1f3744080ba594441e647c300d0dade62ff5beba234 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-crash-np0005626463, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, ceph=True, org.opencontainers.image.created=2026-02-09T10:25:24Z, release=1770267347, RELEASE=main, com.redhat.component=rhceph-container, io.openshift.expose-services=, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., version=7, build-date=2026-02-09T10:25:24Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.42.2, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, distribution-scope=public, name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Feb 23 09:46:50 np0005626463.localdomain podman[296490]: 2026-02-23 09:46:50.627481424 +0000 UTC m=+0.224703089 container exec_died fdf07215f0388d0ebc44f1f3744080ba594441e647c300d0dade62ff5beba234 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-crash-np0005626463, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, vendor=Red Hat, Inc., name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, CEPH_POINT_RELEASE=, release=1770267347, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, RELEASE=main, description=Red Hat Ceph Storage 7, io.buildah.version=1.42.2, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, build-date=2026-02-09T10:25:24Z, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2026-02-09T10:25:24Z, ceph=True, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=rhceph-container)
Feb 23 09:46:50 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #16. Immutable memtables: 0.
Feb 23 09:46:50 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:46:50.973543) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Feb 23 09:46:50 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/flush_job.cc:856] [default] [JOB 5] Flushing memtable with next log file: 16
Feb 23 09:46:50 np0005626463.localdomain ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840010973606, "job": 5, "event": "flush_started", "num_memtables": 1, "num_entries": 1523, "num_deletes": 255, "total_data_size": 8165950, "memory_usage": 8638144, "flush_reason": "Manual Compaction"}
Feb 23 09:46:50 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/flush_job.cc:885] [default] [JOB 5] Level-0 flush table #17: started
Feb 23 09:46:51 np0005626463.localdomain ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840011008524, "cf_name": "default", "job": 5, "event": "table_file_creation", "file_number": 17, "file_size": 5001842, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 10912, "largest_seqno": 12430, "table_properties": {"data_size": 4994876, "index_size": 3855, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2053, "raw_key_size": 17591, "raw_average_key_size": 21, "raw_value_size": 4979849, "raw_average_value_size": 6147, "num_data_blocks": 160, "num_entries": 810, "num_filter_entries": 810, "num_deletions": 255, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771839985, "oldest_key_time": 1771839985, "file_creation_time": 1771840010, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4cfd6c8f-aafa-4003-b2f6-d22c49635dd4", "db_session_id": "66DAQ76CBLV8DSGL8JC7", "orig_file_number": 17, "seqno_to_time_mapping": "N/A"}}
Feb 23 09:46:51 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 5] Flush lasted 35055 microseconds, and 10182 cpu microseconds.
Feb 23 09:46:51 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 23 09:46:51 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:46:51.008600) [db/flush_job.cc:967] [default] [JOB 5] Level-0 flush table #17: 5001842 bytes OK
Feb 23 09:46:51 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:46:51.008628) [db/memtable_list.cc:519] [default] Level-0 commit table #17 started
Feb 23 09:46:51 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:46:51.011159) [db/memtable_list.cc:722] [default] Level-0 commit table #17: memtable #1 done
Feb 23 09:46:51 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:46:51.011184) EVENT_LOG_v1 {"time_micros": 1771840011011177, "job": 5, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Feb 23 09:46:51 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:46:51.011208) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Feb 23 09:46:51 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 5] Try to delete WAL files size 8157902, prev total WAL file size 8166459, number of live WAL files 2.
Feb 23 09:46:51 np0005626463.localdomain ceph-mon[294160]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005626463/store.db/000013.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 23 09:46:51 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:46:51.012806) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6B760031303231' seq:72057594037927935, type:22 .. '6B760031323734' seq:0, type:0; will stop at (end)
Feb 23 09:46:51 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 6] Compacting 1@0 + 1@6 files to L6, score -1.00
Feb 23 09:46:51 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 5 Base level 0, inputs: [17(4884KB)], [15(12MB)]
Feb 23 09:46:51 np0005626463.localdomain ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840011012954, "job": 6, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [17], "files_L6": [15], "score": -1, "input_data_size": 18025668, "oldest_snapshot_seqno": -1}
Feb 23 09:46:51 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.
Feb 23 09:46:51 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@4(peon).osd e83 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:46:51 np0005626463.localdomain ceph-mon[294160]: mgrmap e25: np0005626465.hlpkwo(active, since 1.06936s), standbys: np0005626463.wtksup, np0005626466.nisqfq, np0005626459.pmtxxl, np0005626461.lrfquh
Feb 23 09:46:51 np0005626463.localdomain ceph-mon[294160]: pgmap v3: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:46:51 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:46:51 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:46:51 np0005626463.localdomain ceph-mon[294160]: mgrmap e26: np0005626465.hlpkwo(active, since 2s), standbys: np0005626463.wtksup, np0005626466.nisqfq, np0005626461.lrfquh
Feb 23 09:46:51 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 6] Generated table #18: 10291 keys, 16995268 bytes, temperature: kUnknown
Feb 23 09:46:51 np0005626463.localdomain ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840011168429, "cf_name": "default", "job": 6, "event": "table_file_creation", "file_number": 18, "file_size": 16995268, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 16935229, "index_size": 33250, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 25733, "raw_key_size": 275322, "raw_average_key_size": 26, "raw_value_size": 16757620, "raw_average_value_size": 1628, "num_data_blocks": 1263, "num_entries": 10291, "num_filter_entries": 10291, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771839971, "oldest_key_time": 0, "file_creation_time": 1771840011, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4cfd6c8f-aafa-4003-b2f6-d22c49635dd4", "db_session_id": "66DAQ76CBLV8DSGL8JC7", "orig_file_number": 18, "seqno_to_time_mapping": "N/A"}}
Feb 23 09:46:51 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 23 09:46:51 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:46:51.168745) [db/compaction/compaction_job.cc:1663] [default] [JOB 6] Compacted 1@0 + 1@6 files to L6 => 16995268 bytes
Feb 23 09:46:51 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:46:51.170699) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 115.9 rd, 109.2 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(4.8, 12.4 +0.0 blob) out(16.2 +0.0 blob), read-write-amplify(7.0) write-amplify(3.4) OK, records in: 10813, records dropped: 522 output_compression: NoCompression
Feb 23 09:46:51 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:46:51.170718) EVENT_LOG_v1 {"time_micros": 1771840011170709, "job": 6, "event": "compaction_finished", "compaction_time_micros": 155573, "compaction_time_cpu_micros": 38908, "output_level": 6, "num_output_files": 1, "total_output_size": 16995268, "num_input_records": 10813, "num_output_records": 10291, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Feb 23 09:46:51 np0005626463.localdomain ceph-mon[294160]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005626463/store.db/000017.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 23 09:46:51 np0005626463.localdomain ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840011171570, "job": 6, "event": "table_file_deletion", "file_number": 17}
Feb 23 09:46:51 np0005626463.localdomain ceph-mon[294160]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005626463/store.db/000015.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 23 09:46:51 np0005626463.localdomain ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840011173093, "job": 6, "event": "table_file_deletion", "file_number": 15}
Feb 23 09:46:51 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:46:51.012657) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 23 09:46:51 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:46:51.173142) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 23 09:46:51 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:46:51.173149) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 23 09:46:51 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:46:51.173152) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 23 09:46:51 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:46:51.173155) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 23 09:46:51 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:46:51.173158) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 23 09:46:51 np0005626463.localdomain podman[296581]: 2026-02-23 09:46:51.187144498 +0000 UTC m=+0.103943815 container health_status bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Feb 23 09:46:51 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.
Feb 23 09:46:51 np0005626463.localdomain podman[296581]: 2026-02-23 09:46:51.200498653 +0000 UTC m=+0.117297970 container exec_died bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Feb 23 09:46:51 np0005626463.localdomain systemd[1]: bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.service: Deactivated successfully.
Feb 23 09:46:51 np0005626463.localdomain podman[296618]: 2026-02-23 09:46:51.291134432 +0000 UTC m=+0.085950477 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0)
Feb 23 09:46:51 np0005626463.localdomain podman[296618]: 2026-02-23 09:46:51.347350643 +0000 UTC m=+0.142166688 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb)
Feb 23 09:46:51 np0005626463.localdomain systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully.
Feb 23 09:46:51 np0005626463.localdomain sudo[296404]: pam_unix(sudo:session): session closed for user root
Feb 23 09:46:51 np0005626463.localdomain sudo[296659]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 09:46:51 np0005626463.localdomain sudo[296659]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:46:51 np0005626463.localdomain sudo[296659]: pam_unix(sudo:session): session closed for user root
Feb 23 09:46:51 np0005626463.localdomain sudo[296677]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 23 09:46:51 np0005626463.localdomain sudo[296677]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:46:52 np0005626463.localdomain sudo[296677]: pam_unix(sudo:session): session closed for user root
Feb 23 09:46:52 np0005626463.localdomain sudo[296727]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 09:46:52 np0005626463.localdomain sudo[296727]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:46:52 np0005626463.localdomain sudo[296727]: pam_unix(sudo:session): session closed for user root
Feb 23 09:46:52 np0005626463.localdomain sudo[296745]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 list-networks
Feb 23 09:46:52 np0005626463.localdomain sudo[296745]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:46:52 np0005626463.localdomain sudo[296745]: pam_unix(sudo:session): session closed for user root
Feb 23 09:46:53 np0005626463.localdomain ceph-mon[294160]: [23/Feb/2026:09:46:50] ENGINE Bus STARTING
Feb 23 09:46:53 np0005626463.localdomain ceph-mon[294160]: [23/Feb/2026:09:46:50] ENGINE Serving on https://172.18.0.107:7150
Feb 23 09:46:53 np0005626463.localdomain ceph-mon[294160]: [23/Feb/2026:09:46:50] ENGINE Client ('172.18.0.107', 34506) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Feb 23 09:46:53 np0005626463.localdomain ceph-mon[294160]: [23/Feb/2026:09:46:50] ENGINE Serving on http://172.18.0.107:8765
Feb 23 09:46:53 np0005626463.localdomain ceph-mon[294160]: [23/Feb/2026:09:46:50] ENGINE Bus STARTED
Feb 23 09:46:53 np0005626463.localdomain ceph-mon[294160]: Health check cleared: CEPHADM_STRAY_DAEMON (was: 1 stray daemon(s) not managed by cephadm)
Feb 23 09:46:53 np0005626463.localdomain ceph-mon[294160]: Health check cleared: CEPHADM_STRAY_HOST (was: 1 stray host(s) with 1 daemon(s) not managed by cephadm)
Feb 23 09:46:53 np0005626463.localdomain ceph-mon[294160]: Cluster is now healthy
Feb 23 09:46:53 np0005626463.localdomain ceph-mon[294160]: pgmap v4: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:46:53 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:46:53 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:46:53 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:46:53 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:46:53 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:46:53 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:46:53 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:46:53 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:46:53 np0005626463.localdomain sudo[296781]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Feb 23 09:46:53 np0005626463.localdomain sudo[296781]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:46:53 np0005626463.localdomain sudo[296781]: pam_unix(sudo:session): session closed for user root
Feb 23 09:46:53 np0005626463.localdomain sudo[296799]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/etc/ceph
Feb 23 09:46:53 np0005626463.localdomain sudo[296799]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:46:53 np0005626463.localdomain systemd[1]: Stopping User Manager for UID 1003...
Feb 23 09:46:53 np0005626463.localdomain systemd[292385]: Activating special unit Exit the Session...
Feb 23 09:46:53 np0005626463.localdomain systemd[292385]: Stopped target Main User Target.
Feb 23 09:46:53 np0005626463.localdomain sudo[296799]: pam_unix(sudo:session): session closed for user root
Feb 23 09:46:53 np0005626463.localdomain systemd[292385]: Stopped target Basic System.
Feb 23 09:46:53 np0005626463.localdomain systemd[292385]: Stopped target Paths.
Feb 23 09:46:53 np0005626463.localdomain systemd[292385]: Stopped target Sockets.
Feb 23 09:46:53 np0005626463.localdomain systemd[292385]: Stopped target Timers.
Feb 23 09:46:53 np0005626463.localdomain systemd[292385]: Stopped Mark boot as successful after the user session has run 2 minutes.
Feb 23 09:46:53 np0005626463.localdomain systemd[292385]: Stopped Daily Cleanup of User's Temporary Directories.
Feb 23 09:46:53 np0005626463.localdomain systemd[292385]: Closed D-Bus User Message Bus Socket.
Feb 23 09:46:53 np0005626463.localdomain systemd[292385]: Stopped Create User's Volatile Files and Directories.
Feb 23 09:46:53 np0005626463.localdomain systemd[292385]: Removed slice User Application Slice.
Feb 23 09:46:53 np0005626463.localdomain systemd[292385]: Reached target Shutdown.
Feb 23 09:46:53 np0005626463.localdomain systemd[292385]: Finished Exit the Session.
Feb 23 09:46:53 np0005626463.localdomain systemd[292385]: Reached target Exit the Session.
Feb 23 09:46:53 np0005626463.localdomain systemd[1]: user@1003.service: Deactivated successfully.
Feb 23 09:46:53 np0005626463.localdomain systemd[1]: Stopped User Manager for UID 1003.
Feb 23 09:46:53 np0005626463.localdomain systemd[1]: Stopping User Runtime Directory /run/user/1003...
Feb 23 09:46:53 np0005626463.localdomain systemd[1]: run-user-1003.mount: Deactivated successfully.
Feb 23 09:46:53 np0005626463.localdomain systemd[1]: user-runtime-dir@1003.service: Deactivated successfully.
Feb 23 09:46:53 np0005626463.localdomain systemd[1]: Stopped User Runtime Directory /run/user/1003.
Feb 23 09:46:53 np0005626463.localdomain systemd[1]: Removed slice User Slice of UID 1003.
Feb 23 09:46:53 np0005626463.localdomain systemd[1]: user-1003.slice: Consumed 2.336s CPU time.
Feb 23 09:46:53 np0005626463.localdomain sudo[296818]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/etc/ceph/ceph.conf.new
Feb 23 09:46:53 np0005626463.localdomain sudo[296818]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:46:53 np0005626463.localdomain sudo[296818]: pam_unix(sudo:session): session closed for user root
Feb 23 09:46:53 np0005626463.localdomain sudo[296836]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46
Feb 23 09:46:53 np0005626463.localdomain sudo[296836]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:46:53 np0005626463.localdomain sudo[296836]: pam_unix(sudo:session): session closed for user root
Feb 23 09:46:53 np0005626463.localdomain sudo[296854]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/etc/ceph/ceph.conf.new
Feb 23 09:46:53 np0005626463.localdomain sudo[296854]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:46:53 np0005626463.localdomain sudo[296854]: pam_unix(sudo:session): session closed for user root
Feb 23 09:46:53 np0005626463.localdomain sudo[296888]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/etc/ceph/ceph.conf.new
Feb 23 09:46:53 np0005626463.localdomain sudo[296888]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:46:53 np0005626463.localdomain sudo[296888]: pam_unix(sudo:session): session closed for user root
Feb 23 09:46:54 np0005626463.localdomain sudo[296906]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/etc/ceph/ceph.conf.new
Feb 23 09:46:54 np0005626463.localdomain sudo[296906]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:46:54 np0005626463.localdomain sudo[296906]: pam_unix(sudo:session): session closed for user root
Feb 23 09:46:54 np0005626463.localdomain sudo[296924]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Feb 23 09:46:54 np0005626463.localdomain sudo[296924]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:46:54 np0005626463.localdomain sudo[296924]: pam_unix(sudo:session): session closed for user root
Feb 23 09:46:54 np0005626463.localdomain sudo[296942]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config
Feb 23 09:46:54 np0005626463.localdomain sudo[296942]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:46:54 np0005626463.localdomain sudo[296942]: pam_unix(sudo:session): session closed for user root
Feb 23 09:46:54 np0005626463.localdomain ceph-mon[294160]: pgmap v5: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:46:54 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:46:54 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:46:54 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:46:54 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:46:54 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:46:54 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:46:54 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config rm", "who": "osd/host:np0005626460", "name": "osd_memory_target"} : dispatch
Feb 23 09:46:54 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:46:54 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Feb 23 09:46:54 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config rm", "who": "osd/host:np0005626460", "name": "osd_memory_target"} : dispatch
Feb 23 09:46:54 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Feb 23 09:46:54 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:46:54 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Feb 23 09:46:54 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:46:54 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Feb 23 09:46:54 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Feb 23 09:46:54 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:46:54 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Feb 23 09:46:54 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config rm", "who": "osd/host:np0005626461", "name": "osd_memory_target"} : dispatch
Feb 23 09:46:54 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config rm", "who": "osd/host:np0005626461", "name": "osd_memory_target"} : dispatch
Feb 23 09:46:54 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Feb 23 09:46:54 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Feb 23 09:46:54 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Feb 23 09:46:54 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Feb 23 09:46:54 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Feb 23 09:46:54 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Feb 23 09:46:54 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:46:54 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 23 09:46:54 np0005626463.localdomain ceph-mon[294160]: Standby manager daemon np0005626460.fyrady started
Feb 23 09:46:54 np0005626463.localdomain sudo[296960]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config
Feb 23 09:46:54 np0005626463.localdomain sudo[296960]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:46:54 np0005626463.localdomain sudo[296960]: pam_unix(sudo:session): session closed for user root
Feb 23 09:46:54 np0005626463.localdomain sudo[296978]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf.new
Feb 23 09:46:54 np0005626463.localdomain sudo[296978]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:46:54 np0005626463.localdomain sudo[296978]: pam_unix(sudo:session): session closed for user root
Feb 23 09:46:54 np0005626463.localdomain sudo[296996]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46
Feb 23 09:46:54 np0005626463.localdomain sudo[296996]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:46:54 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.
Feb 23 09:46:54 np0005626463.localdomain sudo[296996]: pam_unix(sudo:session): session closed for user root
Feb 23 09:46:54 np0005626463.localdomain sudo[297019]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf.new
Feb 23 09:46:54 np0005626463.localdomain sudo[297019]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:46:54 np0005626463.localdomain sudo[297019]: pam_unix(sudo:session): session closed for user root
Feb 23 09:46:54 np0005626463.localdomain podman[297014]: 2026-02-23 09:46:54.525722437 +0000 UTC m=+0.074538509 container health_status be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20260216, container_name=ceilometer_agent_compute, config_id=ceilometer_agent_compute)
Feb 23 09:46:54 np0005626463.localdomain podman[297014]: 2026-02-23 09:46:54.563185848 +0000 UTC m=+0.112001910 container exec_died be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=8419493e1fd846703d277695e03fc5eb)
Feb 23 09:46:54 np0005626463.localdomain systemd[1]: be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.service: Deactivated successfully.
Feb 23 09:46:54 np0005626463.localdomain sudo[297067]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf.new
Feb 23 09:46:54 np0005626463.localdomain sudo[297067]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:46:54 np0005626463.localdomain sudo[297067]: pam_unix(sudo:session): session closed for user root
Feb 23 09:46:54 np0005626463.localdomain sudo[297085]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf.new
Feb 23 09:46:54 np0005626463.localdomain sudo[297085]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:46:54 np0005626463.localdomain sudo[297085]: pam_unix(sudo:session): session closed for user root
Feb 23 09:46:54 np0005626463.localdomain sudo[297103]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf.new /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf
Feb 23 09:46:54 np0005626463.localdomain sudo[297103]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:46:54 np0005626463.localdomain sudo[297103]: pam_unix(sudo:session): session closed for user root
Feb 23 09:46:54 np0005626463.localdomain sudo[297121]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Feb 23 09:46:54 np0005626463.localdomain sudo[297121]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:46:54 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:46:54.848 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:46:54 np0005626463.localdomain sudo[297121]: pam_unix(sudo:session): session closed for user root
Feb 23 09:46:54 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:46:54.851 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:46:54 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:46:54.851 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 09:46:54 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:46:54.852 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:46:54 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:46:54.879 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:46:54 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:46:54.879 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:46:54 np0005626463.localdomain sudo[297139]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/etc/ceph
Feb 23 09:46:54 np0005626463.localdomain sudo[297139]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:46:54 np0005626463.localdomain sudo[297139]: pam_unix(sudo:session): session closed for user root
Feb 23 09:46:55 np0005626463.localdomain sudo[297157]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/etc/ceph/ceph.client.admin.keyring.new
Feb 23 09:46:55 np0005626463.localdomain sudo[297157]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:46:55 np0005626463.localdomain sudo[297157]: pam_unix(sudo:session): session closed for user root
Feb 23 09:46:55 np0005626463.localdomain sudo[297175]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46
Feb 23 09:46:55 np0005626463.localdomain sudo[297175]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:46:55 np0005626463.localdomain sudo[297175]: pam_unix(sudo:session): session closed for user root
Feb 23 09:46:55 np0005626463.localdomain sudo[297193]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/etc/ceph/ceph.client.admin.keyring.new
Feb 23 09:46:55 np0005626463.localdomain sudo[297193]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:46:55 np0005626463.localdomain sudo[297193]: pam_unix(sudo:session): session closed for user root
Feb 23 09:46:55 np0005626463.localdomain ceph-mon[294160]: Adjusting osd_memory_target on np0005626465.localdomain to 836.6M
Feb 23 09:46:55 np0005626463.localdomain ceph-mon[294160]: Adjusting osd_memory_target on np0005626466.localdomain to 836.6M
Feb 23 09:46:55 np0005626463.localdomain ceph-mon[294160]: Adjusting osd_memory_target on np0005626463.localdomain to 836.6M
Feb 23 09:46:55 np0005626463.localdomain ceph-mon[294160]: Unable to set osd_memory_target on np0005626465.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Feb 23 09:46:55 np0005626463.localdomain ceph-mon[294160]: Unable to set osd_memory_target on np0005626466.localdomain to 877243801: error parsing value: Value '877243801' is below minimum 939524096
Feb 23 09:46:55 np0005626463.localdomain ceph-mon[294160]: Unable to set osd_memory_target on np0005626463.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Feb 23 09:46:55 np0005626463.localdomain ceph-mon[294160]: Updating np0005626460.localdomain:/etc/ceph/ceph.conf
Feb 23 09:46:55 np0005626463.localdomain ceph-mon[294160]: Updating np0005626461.localdomain:/etc/ceph/ceph.conf
Feb 23 09:46:55 np0005626463.localdomain ceph-mon[294160]: Updating np0005626463.localdomain:/etc/ceph/ceph.conf
Feb 23 09:46:55 np0005626463.localdomain ceph-mon[294160]: Updating np0005626465.localdomain:/etc/ceph/ceph.conf
Feb 23 09:46:55 np0005626463.localdomain ceph-mon[294160]: Updating np0005626466.localdomain:/etc/ceph/ceph.conf
Feb 23 09:46:55 np0005626463.localdomain ceph-mon[294160]: Updating np0005626466.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf
Feb 23 09:46:55 np0005626463.localdomain ceph-mon[294160]: Updating np0005626460.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf
Feb 23 09:46:55 np0005626463.localdomain ceph-mon[294160]: Updating np0005626465.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf
Feb 23 09:46:55 np0005626463.localdomain ceph-mon[294160]: Updating np0005626461.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf
Feb 23 09:46:55 np0005626463.localdomain ceph-mon[294160]: Updating np0005626463.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf
Feb 23 09:46:55 np0005626463.localdomain ceph-mon[294160]: mgrmap e27: np0005626465.hlpkwo(active, since 5s), standbys: np0005626463.wtksup, np0005626466.nisqfq, np0005626461.lrfquh, np0005626460.fyrady
Feb 23 09:46:55 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "mgr metadata", "who": "np0005626460.fyrady", "id": "np0005626460.fyrady"} : dispatch
Feb 23 09:46:55 np0005626463.localdomain sudo[297227]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/etc/ceph/ceph.client.admin.keyring.new
Feb 23 09:46:55 np0005626463.localdomain sudo[297227]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:46:55 np0005626463.localdomain sudo[297227]: pam_unix(sudo:session): session closed for user root
Feb 23 09:46:55 np0005626463.localdomain sudo[297245]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/etc/ceph/ceph.client.admin.keyring.new
Feb 23 09:46:55 np0005626463.localdomain sudo[297245]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:46:55 np0005626463.localdomain sudo[297245]: pam_unix(sudo:session): session closed for user root
Feb 23 09:46:55 np0005626463.localdomain sudo[297263]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/etc/ceph/ceph.client.admin.keyring.new /etc/ceph/ceph.client.admin.keyring
Feb 23 09:46:55 np0005626463.localdomain sudo[297263]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:46:55 np0005626463.localdomain sudo[297263]: pam_unix(sudo:session): session closed for user root
Feb 23 09:46:55 np0005626463.localdomain sudo[297281]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config
Feb 23 09:46:55 np0005626463.localdomain sudo[297281]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:46:55 np0005626463.localdomain sudo[297281]: pam_unix(sudo:session): session closed for user root
Feb 23 09:46:55 np0005626463.localdomain sudo[297299]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config
Feb 23 09:46:55 np0005626463.localdomain sudo[297299]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:46:55 np0005626463.localdomain sudo[297299]: pam_unix(sudo:session): session closed for user root
Feb 23 09:46:55 np0005626463.localdomain sudo[297317]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.client.admin.keyring.new
Feb 23 09:46:55 np0005626463.localdomain sudo[297317]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:46:55 np0005626463.localdomain sudo[297317]: pam_unix(sudo:session): session closed for user root
Feb 23 09:46:55 np0005626463.localdomain sudo[297335]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46
Feb 23 09:46:55 np0005626463.localdomain sudo[297335]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:46:55 np0005626463.localdomain sudo[297335]: pam_unix(sudo:session): session closed for user root
Feb 23 09:46:55 np0005626463.localdomain sudo[297353]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.client.admin.keyring.new
Feb 23 09:46:55 np0005626463.localdomain sudo[297353]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:46:55 np0005626463.localdomain sudo[297353]: pam_unix(sudo:session): session closed for user root
Feb 23 09:46:56 np0005626463.localdomain sudo[297387]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.client.admin.keyring.new
Feb 23 09:46:56 np0005626463.localdomain sudo[297387]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:46:56 np0005626463.localdomain sudo[297387]: pam_unix(sudo:session): session closed for user root
Feb 23 09:46:56 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@4(peon).osd e83 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.134 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'name': 'test', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000003', 'OS-EXT-SRV-ATTR:host': 'np0005626463.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '37b8098efb0d4ecc90b451a2db0e966f', 'user_id': 'cb6895487918456aa599ca2f76872d00', 'hostId': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.135 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.144 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.146 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '50f6a5be-6717-43be-99c5-5d4ccece8b90', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:46:56.135243', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '971db5a0-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11456.324725675, 'message_signature': '32f1929f900d4aea43a62c479d5a5028f8af349febca1f926653e0ffa498df5d'}]}, 'timestamp': '2026-02-23 09:46:56.144828', '_unique_id': 'f402c5c58aa441398695826817720c8c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.146 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.146 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.146 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.146 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.146 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.146 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.146 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.146 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.146 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.146 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.146 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.146 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.146 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.146 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.146 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.146 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.146 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.146 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.146 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.146 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.146 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.146 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.146 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.146 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.146 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.146 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.146 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.146 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.146 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.146 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.146 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.147 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.147 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.149 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7afda6ea-bf69-4f7a-bb24-1ac48eb7e4e6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:46:56.147810', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '971e40d8-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11456.324725675, 'message_signature': '817a2d413328e026eccd4cc22afd0b1f63dd73010c37aeed7149f41e94f6cde9'}]}, 'timestamp': '2026-02-23 09:46:56.148335', '_unique_id': 'dee426847d754399b8157bd121d3b41f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.149 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.149 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.149 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.149 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.149 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.149 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.149 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.149 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.149 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.149 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.149 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.149 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.149 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.149 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.149 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.149 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.149 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.149 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.149 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.149 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.149 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.149 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.149 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.149 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.149 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.149 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.149 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.149 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:46:56 np0005626463.localdomain sudo[297405]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.client.admin.keyring.new
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.149 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.149 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.149 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.150 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.150 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.150 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.152 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7c87ace0-8350-4f5a-b336-cb99c696cbe4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:46:56.150782', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '971eb48c-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11456.324725675, 'message_signature': '49f55398390563c331f7215bccd333c40c4f1422c97541d69a3d1ecb2ca50db4'}]}, 'timestamp': '2026-02-23 09:46:56.151293', '_unique_id': 'bf8d63f508c648fa8cbbc20b65fe98db'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.152 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.152 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.152 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.152 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.152 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.152 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.152 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.152 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.152 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.152 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.152 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.152 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.152 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.152 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.152 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.152 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.152 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.152 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.152 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.152 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.152 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.152 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.152 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.152 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:46:56 np0005626463.localdomain sudo[297405]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.152 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.152 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.152 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.152 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.152 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.152 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.152 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.153 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.153 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.155 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '130d72d6-8ca4-4459-982b-ddf8e6464072', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:46:56.153467', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '971f1bd4-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11456.324725675, 'message_signature': '08df1b73a50b7d7f0f5947a1ac1ea3baacf9547b29ad625a0344a260b5513a9d'}]}, 'timestamp': '2026-02-23 09:46:56.154050', '_unique_id': '3c7e245b36db45bebfb88d358c922049'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.155 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.155 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.155 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.155 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.155 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.155 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.155 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.155 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.155 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.155 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.155 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.155 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:46:56 np0005626463.localdomain sudo[297405]: pam_unix(sudo:session): session closed for user root
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.155 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.155 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.155 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.155 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.155 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.155 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.155 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.155 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.155 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.155 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.155 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.155 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.155 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.155 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.155 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.155 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.155 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.155 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.155 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.156 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.188 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.latency volume: 1054797520 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.189 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.latency volume: 21338362 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.191 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2a87d629-b26d-4c9d-bd9d-3cf94b97aa71', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1054797520, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:46:56.156851', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '97247f8e-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11456.346406506, 'message_signature': 'c60245ed4a15df92172704893f3c34b63df78aa94b426b511fe586539d53d87a'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 21338362, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 
'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:46:56.156851', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '97249424-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11456.346406506, 'message_signature': 'fbfd08862eaaa809c8ee7260dbe63af02128df8fc4308f5267f8d4a619bf012c'}]}, 'timestamp': '2026-02-23 09:46:56.189774', '_unique_id': '4f0ab4a436d94f73ba01ea1a6a391697'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.191 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.191 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.191 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.191 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.191 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.191 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.191 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.191 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.191 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.191 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.191 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.191 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.191 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.191 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.191 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.191 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.191 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.191 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.191 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.191 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.191 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.191 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.191 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.191 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.191 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.191 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.191 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.191 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.191 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.191 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.191 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.192 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.193 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.bytes volume: 6808 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.194 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c41c68c0-fbd5-4193-a945-f4dacf831605', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6808, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:46:56.193051', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '972528b2-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11456.324725675, 'message_signature': 'b67fbaba54ed604ad87e6e21746330a6c24658b47652865d25d649c1a40d0f65'}]}, 'timestamp': '2026-02-23 09:46:56.193682', '_unique_id': 'b979d4bfd7194ec0aa228fe7ea015995'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.194 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.194 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.194 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.194 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.194 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.194 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.194 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.194 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.194 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.194 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.194 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.194 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.194 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.194 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.194 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.194 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.194 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.194 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.194 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.194 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.194 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.194 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.194 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.194 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.194 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.194 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.194 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.194 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.194 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.194 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.194 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.196 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.196 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.197 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a3f99d46-0645-47bc-b50e-fcbbff405822', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:46:56.196449', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '9725ab98-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11456.324725675, 'message_signature': '2bf74b9ef3e0dffec9d3da48446269b01652b43f0f664765a11699ab4b362618'}]}, 'timestamp': '2026-02-23 09:46:56.197009', '_unique_id': 'e2c05bcb21b8449c881f8edb7fd779ea'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.197 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.197 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.197 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.197 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.197 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.197 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.197 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.197 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.197 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.197 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.197 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.197 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.197 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.197 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.197 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.197 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.197 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.197 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.197 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.197 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.197 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.197 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.197 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.197 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.197 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.197 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.197 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.197 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.197 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.197 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.197 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.199 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.199 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.200 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e525eb79-3459-4112-a2de-2f99eec15dcc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:46:56.199278', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '972619f2-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11456.324725675, 'message_signature': 'f098cbd53e94351a71422e088f33773ea0762d113a4c4b8a51d077873dd24357'}]}, 'timestamp': '2026-02-23 09:46:56.199766', '_unique_id': 'c803b1a057c34fd48b7ade36c6adef90'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.200 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.200 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.200 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.200 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.200 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.200 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.200 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.200 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.200 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.200 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.200 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.200 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.200 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.200 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.200 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.200 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.200 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.200 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.200 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.200 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.200 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.200 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.200 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.200 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.200 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.200 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.200 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.200 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.200 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.200 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.200 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.201 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.202 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.219 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/memory.usage volume: 51.72265625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.221 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ef2f0845-bc0e-4265-853c-150b72a8b49d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.72265625, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'timestamp': '2026-02-23T09:46:56.202229', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '97292b42-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11456.408451054, 'message_signature': '0d2021a7a4ff6615072b1996dc7ac21e40c64008886ac04cda2bd2a97a3de307'}]}, 'timestamp': '2026-02-23 09:46:56.219963', '_unique_id': '20b89140c85341fdad567e0b7caedf7f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.221 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.221 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.221 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.221 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.221 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.221 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.221 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.221 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.221 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.221 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.221 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.221 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.221 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.221 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.221 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.221 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.221 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.221 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.221 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.221 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.221 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.221 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.221 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.221 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.221 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.221 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.221 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.221 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.221 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.221 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.221 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.222 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.223 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.bytes volume: 397312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.223 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.225 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3e259485-a02b-4f77-bb1d-bfe28bafa3bd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 397312, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:46:56.223022', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '9729ba3a-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11456.346406506, 'message_signature': '0aa7afac3bf008618a70d6b83f5346f46b2e4770104ca9e5a1c365331e8914b4'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:46:56.223022', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '9729ccbe-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11456.346406506, 'message_signature': '3d75ee7651d95319f974d25bd12fe40e7534dbabcf3d15afbc377c925b014548'}]}, 'timestamp': '2026-02-23 09:46:56.224040', '_unique_id': '000b6d0fbb8e4c5a8b94b217b5af3684'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.225 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.225 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.225 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.225 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.225 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.225 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.225 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.225 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.225 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.225 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.225 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.225 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.225 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.225 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.225 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.225 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.225 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.225 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.225 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.225 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.225 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.225 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.225 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.225 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.225 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.225 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.225 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.225 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.225 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.225 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.225 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.226 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.226 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.227 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '11bad915-f399-4537-b46b-a5a13026bf05', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:46:56.226360', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '972a3c6c-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11456.324725675, 'message_signature': 'c0c8d224ed1376f72983b07656dc4430c75d594dc35e4d7783b0cf192c0a23a5'}]}, 'timestamp': '2026-02-23 09:46:56.226859', '_unique_id': '29c23c526fed4282987afaeaf49b5909'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.227 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.227 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.227 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.227 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.227 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.227 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.227 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.227 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.227 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.227 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.227 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.227 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.227 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.227 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.227 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.227 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.227 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.227 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.227 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.227 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.227 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.227 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.227 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.227 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.227 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.227 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.227 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.227 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.227 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.227 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.227 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.229 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.229 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.latency volume: 1374424344 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.230 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.latency volume: 89322858 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:46:56 np0005626463.localdomain sudo[297423]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.client.admin.keyring.new /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.client.admin.keyring
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.232 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'deb476a7-2403-4782-ad84-c3cb23feee35', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1374424344, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:46:56.229463', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '972ab8e0-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11456.346406506, 'message_signature': 'ed4c1510ae0720b419b8c623afd3d9e98738ed8f4b619b1bab4a58be329e7d15'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 89322858, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:46:56.229463', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '972ad4e2-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11456.346406506, 'message_signature': 'a7ae7cdad8370c765d34adae1f9b2449299c4e21b11b7f2c56eab468cfa10664'}]}, 'timestamp': '2026-02-23 09:46:56.230805', '_unique_id': '938fec49ee2c4015b76f024f73322a31'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.232 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.232 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.232 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.232 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.232 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.232 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.232 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.232 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.232 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.232 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.232 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.232 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.232 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.232 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.232 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.232 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.232 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.232 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.232 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.232 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.232 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.232 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.232 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.232 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.232 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:46:56 np0005626463.localdomain sudo[297423]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.232 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.232 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.232 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.232 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.232 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.232 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.234 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.234 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.bytes volume: 35597312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.234 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.236 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0560c791-c276-4fde-b8bb-389bdf4796db', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35597312, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:46:56.234385', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '972b77ee-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11456.346406506, 'message_signature': 'ac3b900e4df4a2561f9182b9550bac38f5beb64a8fc00dd31222affef44baa08'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 
'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:46:56.234385', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '972b8c0c-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11456.346406506, 'message_signature': 'e588c37614ba5678030da1f150fb822f6d48b5901e07a15c57474166ea640dbd'}]}, 'timestamp': '2026-02-23 09:46:56.235431', '_unique_id': 'b9f1734974de4b4796341244602bab96'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.236 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.236 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.236 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.236 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.236 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.236 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.236 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.236 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.236 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.236 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.236 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.236 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.236 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.236 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.236 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.236 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.236 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.236 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.236 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:46:56 np0005626463.localdomain sudo[297423]: pam_unix(sudo:session): session closed for user root
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.236 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.236 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.236 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.236 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.236 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.236 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.236 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.236 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.236 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.236 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.236 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.236 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.237 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.238 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.requests volume: 1283 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.238 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.240 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8ac21f0c-818c-4429-8225-8b9ded36df9d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1283, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:46:56.238181', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '972c11a4-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11456.346406506, 'message_signature': '3167be254df7d66328cc07ac4184b53a68920976f2d80d882a46640d21343e18'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 
'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:46:56.238181', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '972c2af4-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11456.346406506, 'message_signature': '094795e4dc6b1001a1fe44e2d6e5fd88ecc522d51424e79ec593c15f647c2086'}]}, 'timestamp': '2026-02-23 09:46:56.239569', '_unique_id': '6d4af4d0b0f543f5a5ea11230ec104de'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.240 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.240 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.240 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.240 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.240 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.240 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.240 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.240 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.240 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.240 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.240 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.240 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.240 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.240 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.240 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.240 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.240 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.240 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.240 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.240 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.240 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.240 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.240 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.240 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.240 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.240 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.240 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.240 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.240 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.240 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.240 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.242 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.252 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.252 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.254 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6d7f1246-0795-4b32-972c-b11f3c7dac82', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:46:56.242189', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '972e2fc0-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11456.431705621, 'message_signature': '91946ebf0a00c888fd005c5a944f32215d1fdda8856a67a128ea251b749acf76'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 
'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:46:56.242189', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '972e43b6-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11456.431705621, 'message_signature': '8da4676c95cb48ff037cc1458874c715ea21565d63d0bc6ae6e362cfc5ef5282'}]}, 'timestamp': '2026-02-23 09:46:56.253250', '_unique_id': '1f475e546ffb4b9cba144bec0183a89d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.254 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.254 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.254 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.254 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.254 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.254 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.254 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.254 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.254 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.254 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.254 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.254 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.254 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.254 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.254 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.254 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.254 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.254 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.254 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.254 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.254 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.254 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.254 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.254 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.254 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.254 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.254 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.254 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.254 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.254 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.254 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.255 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.255 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.256 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.257 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4de55d29-983b-4880-9bf4-5a651ebbb937', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:46:56.255633', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '972eb468-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11456.431705621, 'message_signature': 'c60f9fdcf203eff2df5745dabb324c7a20fe2d190256e2653f77754da4b2e61e'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 
'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:46:56.255633', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '972ecbb0-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11456.431705621, 'message_signature': '285e0014d9dc2d155728e923af62f2c37253e5fde9c0e533021f21eec41eea37'}]}, 'timestamp': '2026-02-23 09:46:56.256709', '_unique_id': '00cf6494071d4c6aa076b9e923258916'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.257 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.257 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.257 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.257 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.257 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.257 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.257 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.257 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.257 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.257 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.257 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.257 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.257 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.257 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.257 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.257 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.257 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.257 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.257 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.257 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.257 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.257 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.257 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.257 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.257 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.257 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.257 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.257 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.257 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.257 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.257 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.258 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.259 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.259 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/cpu volume: 10890000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.260 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ee662f1b-7a04-4380-aa82-7c95446b7810', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 10890000000, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'timestamp': '2026-02-23T09:46:56.259247', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '972f4040-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11456.408451054, 'message_signature': '456f70bcf4c26d25f381f9b4452bfc880f446ddfe7fb60290ccc488b59f20c10'}]}, 'timestamp': '2026-02-23 09:46:56.259708', '_unique_id': '844c38cae9fe43f1bae020364771735e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.260 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.260 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.260 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.260 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.260 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.260 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.260 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.260 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.260 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.260 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.260 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.260 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.260 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.260 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.260 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.260 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.260 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.260 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.260 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.260 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.260 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.260 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.260 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.260 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.260 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.260 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.260 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.260 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.260 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.260 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.260 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.261 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.262 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.262 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.263 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bf0e9db4-8810-4000-9610-599a2914ce9b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:46:56.262022', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '972fac92-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11456.346406506, 'message_signature': '3c26ae9337b9e03afa22ef13fc4df14b15cd580f84af52ab68e7df0f9bff11a4'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 
'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:46:56.262022', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '972fbebc-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11456.346406506, 'message_signature': '9d109338901a97533fd29e5949364a5a54eebdb439facbf05ba3d8c932e3ac5e'}]}, 'timestamp': '2026-02-23 09:46:56.262983', '_unique_id': '54bd4531a4e249f7a8bb48a11a062650'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.263 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.263 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.263 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.263 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.263 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.263 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.263 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.263 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.263 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.263 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.263 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.263 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.263 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.263 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.263 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.263 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.263 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.263 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.263 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.263 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.263 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.263 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.263 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.263 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.263 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.263 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.263 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.263 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.263 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.263 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.263 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.265 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.265 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.266 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '018be324-7fd1-4161-b80d-728d6a1885fd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:46:56.265411', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '973030ea-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11456.324725675, 'message_signature': '470894224921996d5c1b155f6c58765098121eb09103d46915e622a45489f78e'}]}, 'timestamp': '2026-02-23 09:46:56.265930', '_unique_id': '58cbb17171f84b1b92322e3743027f8f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.266 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.266 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.266 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.266 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.266 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.266 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.266 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.266 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.266 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.266 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.266 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.266 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.266 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.266 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.266 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.266 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.266 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.266 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.266 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.266 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.266 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.266 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.266 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.266 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.266 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.266 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.266 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.266 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.266 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.266 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.266 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.268 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.268 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.268 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.270 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '45442e7c-3697-4d37-a5ef-796e473e062d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:46:56.268144', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '97309b3e-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11456.431705621, 'message_signature': 'dd397b966d9036ca9dfdf723a97985611892d21772da08121f5d0b0efd6e8a0c'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:46:56.268144', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '9730ab92-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11456.431705621, 'message_signature': 'a4792ee911e37f8fbf641fa38313229fc94d575c8f5f016b238f1ce4d4cf5897'}]}, 'timestamp': '2026-02-23 09:46:56.269062', '_unique_id': 'b2347a7fd7b34a508fccf5ef4b4845be'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.270 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.270 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.270 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.270 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.270 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.270 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.270 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.270 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.270 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.270 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.270 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.270 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.270 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.270 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.270 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.270 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.270 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.270 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.270 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.270 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.270 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.270 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.270 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.270 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.270 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.270 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.270 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.270 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.270 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.270 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.270 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.271 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.271 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.272 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4fa08527-a3bc-40ad-b9a5-ab1d9e49be59', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:46:56.271333', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '9731180c-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11456.324725675, 'message_signature': 'b32890d6dd8e2ab8368c64732e9aa9fa4f73e7b1fbfba5b23afb2c8688ff2aed'}]}, 'timestamp': '2026-02-23 09:46:56.271799', '_unique_id': '128c08b855b94a0b9380170e4403c402'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.272 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.272 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.272 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.272 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.272 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.272 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.272 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.272 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.272 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.272 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.272 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.272 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.272 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.272 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.272 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.272 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.272 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.272 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.272 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.272 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.272 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.272 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.272 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.272 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.272 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.272 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.272 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.272 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.272 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.272 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.272 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:46:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.273 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 23 09:46:56 np0005626463.localdomain ceph-mon[294160]: Updating np0005626460.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 23 09:46:56 np0005626463.localdomain ceph-mon[294160]: Updating np0005626466.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 23 09:46:56 np0005626463.localdomain ceph-mon[294160]: Updating np0005626465.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 23 09:46:56 np0005626463.localdomain ceph-mon[294160]: Updating np0005626463.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 23 09:46:56 np0005626463.localdomain ceph-mon[294160]: Updating np0005626461.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 23 09:46:56 np0005626463.localdomain ceph-mon[294160]: pgmap v6: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:46:56 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:46:56 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:46:56 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:46:56 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:46:56 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:46:56 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:46:56 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:46:56 np0005626463.localdomain sudo[297441]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 09:46:56 np0005626463.localdomain sudo[297441]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:46:56 np0005626463.localdomain sudo[297441]: pam_unix(sudo:session): session closed for user root
Feb 23 09:46:57 np0005626463.localdomain ceph-mon[294160]: Updating np0005626460.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.client.admin.keyring
Feb 23 09:46:57 np0005626463.localdomain ceph-mon[294160]: Updating np0005626465.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.client.admin.keyring
Feb 23 09:46:57 np0005626463.localdomain ceph-mon[294160]: Updating np0005626466.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.client.admin.keyring
Feb 23 09:46:57 np0005626463.localdomain ceph-mon[294160]: Updating np0005626463.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.client.admin.keyring
Feb 23 09:46:57 np0005626463.localdomain ceph-mon[294160]: Updating np0005626461.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.client.admin.keyring
Feb 23 09:46:57 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:46:57 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:46:57 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:46:57 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:46:57 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 23 09:46:57 np0005626463.localdomain ceph-mon[294160]: Reconfiguring mon.np0005626460 (monmap changed)...
Feb 23 09:46:57 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Feb 23 09:46:57 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Feb 23 09:46:57 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:46:57 np0005626463.localdomain ceph-mon[294160]: Reconfiguring daemon mon.np0005626460 on np0005626460.localdomain
Feb 23 09:46:57 np0005626463.localdomain ceph-mon[294160]: pgmap v7: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail; 30 KiB/s rd, 0 B/s wr, 16 op/s
Feb 23 09:46:57 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.
Feb 23 09:46:57 np0005626463.localdomain systemd[1]: tmp-crun.YaecEV.mount: Deactivated successfully.
Feb 23 09:46:57 np0005626463.localdomain podman[297459]: 2026-02-23 09:46:57.924181351 +0000 UTC m=+0.088500265 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.43.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20260216, managed_by=edpm_ansible)
Feb 23 09:46:57 np0005626463.localdomain podman[297459]: 2026-02-23 09:46:57.936252248 +0000 UTC m=+0.100571162 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 23 09:46:57 np0005626463.localdomain systemd[1]: 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully.
Feb 23 09:46:58 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:46:58 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:46:58 np0005626463.localdomain ceph-mon[294160]: Reconfiguring mon.np0005626461 (monmap changed)...
Feb 23 09:46:58 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Feb 23 09:46:58 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Feb 23 09:46:58 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:46:58 np0005626463.localdomain ceph-mon[294160]: Reconfiguring daemon mon.np0005626461 on np0005626461.localdomain
Feb 23 09:46:58 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:46:58 np0005626463.localdomain sudo[297479]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 09:46:58 np0005626463.localdomain sudo[297479]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:46:58 np0005626463.localdomain sudo[297479]: pam_unix(sudo:session): session closed for user root
Feb 23 09:46:58 np0005626463.localdomain sudo[297497]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid f1fea371-cb69-578d-a3d0-b5c472a84b46
Feb 23 09:46:58 np0005626463.localdomain sudo[297497]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:46:59 np0005626463.localdomain podman[297531]: 
Feb 23 09:46:59 np0005626463.localdomain podman[297531]: 2026-02-23 09:46:59.226499726 +0000 UTC m=+0.046053072 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 23 09:46:59 np0005626463.localdomain podman[297531]: 2026-02-23 09:46:59.76315338 +0000 UTC m=+0.582706676 container create ea0065d3ce8bcefdd2e57f9fbd4fc69b8d4474babdbc2ef48efe6bd405c04599 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=goofy_dirac, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, io.buildah.version=1.42.2, distribution-scope=public, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-type=git, build-date=2026-02-09T10:25:24Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.expose-services=, GIT_CLEAN=True, architecture=x86_64, release=1770267347, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, RELEASE=main)
Feb 23 09:46:59 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:46:59 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Feb 23 09:46:59 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Feb 23 09:46:59 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:46:59 np0005626463.localdomain systemd[1]: Started libpod-conmon-ea0065d3ce8bcefdd2e57f9fbd4fc69b8d4474babdbc2ef48efe6bd405c04599.scope.
Feb 23 09:46:59 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 09:46:59 np0005626463.localdomain podman[297531]: 2026-02-23 09:46:59.826033773 +0000 UTC m=+0.645587089 container init ea0065d3ce8bcefdd2e57f9fbd4fc69b8d4474babdbc2ef48efe6bd405c04599 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=goofy_dirac, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2026-02-09T10:25:24Z, vcs-type=git, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, RELEASE=main, name=rhceph, io.buildah.version=1.42.2, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-02-09T10:25:24Z, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.expose-services=, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, distribution-scope=public, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, version=7, release=1770267347, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 23 09:46:59 np0005626463.localdomain podman[297531]: 2026-02-23 09:46:59.839694259 +0000 UTC m=+0.659247585 container start ea0065d3ce8bcefdd2e57f9fbd4fc69b8d4474babdbc2ef48efe6bd405c04599 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=goofy_dirac, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, name=rhceph, vcs-type=git, io.buildah.version=1.42.2, version=7, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2026-02-09T10:25:24Z, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, ceph=True, CEPH_POINT_RELEASE=, architecture=x86_64, release=1770267347, org.opencontainers.image.created=2026-02-09T10:25:24Z)
Feb 23 09:46:59 np0005626463.localdomain podman[297531]: 2026-02-23 09:46:59.840180863 +0000 UTC m=+0.659734179 container attach ea0065d3ce8bcefdd2e57f9fbd4fc69b8d4474babdbc2ef48efe6bd405c04599 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=goofy_dirac, description=Red Hat Ceph Storage 7, release=1770267347, build-date=2026-02-09T10:25:24Z, vendor=Red Hat, Inc., org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.tags=rhceph ceph, name=rhceph, ceph=True, GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, io.buildah.version=1.42.2, url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, RELEASE=main, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-type=git)
Feb 23 09:46:59 np0005626463.localdomain goofy_dirac[297544]: 167 167
Feb 23 09:46:59 np0005626463.localdomain systemd[1]: libpod-ea0065d3ce8bcefdd2e57f9fbd4fc69b8d4474babdbc2ef48efe6bd405c04599.scope: Deactivated successfully.
Feb 23 09:46:59 np0005626463.localdomain podman[297531]: 2026-02-23 09:46:59.843559407 +0000 UTC m=+0.663112783 container died ea0065d3ce8bcefdd2e57f9fbd4fc69b8d4474babdbc2ef48efe6bd405c04599 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=goofy_dirac, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, distribution-scope=public, release=1770267347, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_CLEAN=True, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, build-date=2026-02-09T10:25:24Z, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, io.buildah.version=1.42.2, GIT_BRANCH=main, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14)
Feb 23 09:46:59 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:46:59.880 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:46:59 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:46:59.883 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:46:59 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:46:59.883 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 09:46:59 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:46:59.883 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:46:59 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:46:59.917 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:46:59 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:46:59.918 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:46:59 np0005626463.localdomain podman[297549]: 2026-02-23 09:46:59.975828552 +0000 UTC m=+0.121765717 container remove ea0065d3ce8bcefdd2e57f9fbd4fc69b8d4474babdbc2ef48efe6bd405c04599 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=goofy_dirac, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container, io.openshift.expose-services=, GIT_BRANCH=main, RELEASE=main, distribution-scope=public, vcs-type=git, build-date=2026-02-09T10:25:24Z, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, org.opencontainers.image.created=2026-02-09T10:25:24Z, vendor=Red Hat, Inc., GIT_CLEAN=True, release=1770267347, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, io.buildah.version=1.42.2, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14)
Feb 23 09:46:59 np0005626463.localdomain systemd[1]: libpod-conmon-ea0065d3ce8bcefdd2e57f9fbd4fc69b8d4474babdbc2ef48efe6bd405c04599.scope: Deactivated successfully.
Feb 23 09:47:00 np0005626463.localdomain sudo[297497]: pam_unix(sudo:session): session closed for user root
Feb 23 09:47:00 np0005626463.localdomain sudo[297564]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 09:47:00 np0005626463.localdomain sudo[297564]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:47:00 np0005626463.localdomain sudo[297564]: pam_unix(sudo:session): session closed for user root
Feb 23 09:47:00 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-5e1f596d40901f80883bc707021c73cb009df1f5f19e7aed28a829b48a673fdd-merged.mount: Deactivated successfully.
Feb 23 09:47:01 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@4(peon).osd e83 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:47:01 np0005626463.localdomain ceph-mon[294160]: Reconfiguring mon.np0005626463 (monmap changed)...
Feb 23 09:47:01 np0005626463.localdomain ceph-mon[294160]: Reconfiguring daemon mon.np0005626463 on np0005626463.localdomain
Feb 23 09:47:01 np0005626463.localdomain ceph-mon[294160]: pgmap v8: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail; 23 KiB/s rd, 0 B/s wr, 12 op/s
Feb 23 09:47:01 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:47:01 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:47:01 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:47:01 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:47:01 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 23 09:47:01 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:47:01 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 23 09:47:02 np0005626463.localdomain ceph-mon[294160]: from='client.44159 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Feb 23 09:47:02 np0005626463.localdomain ceph-mon[294160]: pgmap v9: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail; 19 KiB/s rd, 0 B/s wr, 10 op/s
Feb 23 09:47:03 np0005626463.localdomain ceph-mon[294160]: pgmap v10: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail; 17 KiB/s rd, 0 B/s wr, 9 op/s
Feb 23 09:47:03 np0005626463.localdomain ceph-mon[294160]: from='client.44169 -' entity='client.admin' cmd=[{"prefix": "orch ps", "daemon_type": "mon", "daemon_id": "np0005626460", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Feb 23 09:47:04 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/1830357104' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 23 09:47:04 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/1830357104' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 23 09:47:04 np0005626463.localdomain ceph-mgr[288036]: ms_deliver_dispatch: unhandled message 0x55ea593e5080 mon_map magic: 0 from mon.2 v2:172.18.0.108:3300/0
Feb 23 09:47:04 np0005626463.localdomain ceph-mgr[288036]: client.0 ms_handle_reset on v2:172.18.0.108:3300/0
Feb 23 09:47:04 np0005626463.localdomain ceph-mgr[288036]: client.0 ms_handle_reset on v2:172.18.0.108:3300/0
Feb 23 09:47:04 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@4(peon) e10  my rank is now 3 (was 4)
Feb 23 09:47:04 np0005626463.localdomain ceph-mgr[288036]: ms_deliver_dispatch: unhandled message 0x55ea593e51e0 mon_map magic: 0 from mon.2 v2:172.18.0.107:3300/0
Feb 23 09:47:04 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [INF] : mon.np0005626463 calling monitor election
Feb 23 09:47:04 np0005626463.localdomain ceph-mon[294160]: paxos.3).electionLogic(42) init, last seen epoch 42
Feb 23 09:47:04 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@3(electing) e10 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Feb 23 09:47:04 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@3(electing) e10 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Feb 23 09:47:04 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:47:04.919 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:47:04 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:47:04.921 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:47:04 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:47:04.921 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 09:47:04 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:47:04.922 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:47:04 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:47:04.972 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:47:04 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:47:04.972 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:47:06 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@3(electing) e10 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Feb 23 09:47:06 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@3(peon) e10 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Feb 23 09:47:06 np0005626463.localdomain ceph-mon[294160]: Remove daemons mon.np0005626460
Feb 23 09:47:06 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "quorum_status"} : dispatch
Feb 23 09:47:06 np0005626463.localdomain ceph-mon[294160]: Safe to remove mon.np0005626460: new quorum should be ['np0005626461', 'np0005626466', 'np0005626465', 'np0005626463'] (from ['np0005626461', 'np0005626466', 'np0005626465', 'np0005626463'])
Feb 23 09:47:06 np0005626463.localdomain ceph-mon[294160]: Removing monitor np0005626460 from monmap...
Feb 23 09:47:06 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "mon rm", "name": "np0005626460"} : dispatch
Feb 23 09:47:06 np0005626463.localdomain ceph-mon[294160]: Removing daemon mon.np0005626460 from np0005626460.localdomain -- ports []
Feb 23 09:47:06 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "mon metadata", "id": "np0005626461"} : dispatch
Feb 23 09:47:06 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "mon metadata", "id": "np0005626463"} : dispatch
Feb 23 09:47:06 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "mon metadata", "id": "np0005626465"} : dispatch
Feb 23 09:47:06 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "mon metadata", "id": "np0005626466"} : dispatch
Feb 23 09:47:06 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463 calling monitor election
Feb 23 09:47:06 np0005626463.localdomain ceph-mon[294160]: mon.np0005626466 calling monitor election
Feb 23 09:47:06 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:47:06 np0005626463.localdomain ceph-mon[294160]: mon.np0005626465 calling monitor election
Feb 23 09:47:06 np0005626463.localdomain ceph-mon[294160]: pgmap v11: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail; 17 KiB/s rd, 0 B/s wr, 9 op/s
Feb 23 09:47:06 np0005626463.localdomain ceph-mon[294160]: mon.np0005626461 calling monitor election
Feb 23 09:47:06 np0005626463.localdomain ceph-mon[294160]: mon.np0005626461 is new leader, mons np0005626461,np0005626466,np0005626465,np0005626463 in quorum (ranks 0,1,2,3)
Feb 23 09:47:06 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 23 09:47:06 np0005626463.localdomain ceph-mon[294160]: monmap epoch 10
Feb 23 09:47:06 np0005626463.localdomain ceph-mon[294160]: fsid f1fea371-cb69-578d-a3d0-b5c472a84b46
Feb 23 09:47:06 np0005626463.localdomain ceph-mon[294160]: last_changed 2026-02-23T09:47:04.504308+0000
Feb 23 09:47:06 np0005626463.localdomain ceph-mon[294160]: created 2026-02-23T07:36:01.997603+0000
Feb 23 09:47:06 np0005626463.localdomain ceph-mon[294160]: min_mon_release 18 (reef)
Feb 23 09:47:06 np0005626463.localdomain ceph-mon[294160]: election_strategy: 1
Feb 23 09:47:06 np0005626463.localdomain ceph-mon[294160]: 0: [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] mon.np0005626461
Feb 23 09:47:06 np0005626463.localdomain ceph-mon[294160]: 1: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005626466
Feb 23 09:47:06 np0005626463.localdomain ceph-mon[294160]: 2: [v2:172.18.0.107:3300/0,v1:172.18.0.107:6789/0] mon.np0005626465
Feb 23 09:47:06 np0005626463.localdomain ceph-mon[294160]: 3: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005626463
Feb 23 09:47:06 np0005626463.localdomain ceph-mon[294160]: fsmap cephfs:1 {0=mds.np0005626463.qcthuc=up:active} 2 up:standby
Feb 23 09:47:06 np0005626463.localdomain ceph-mon[294160]: osdmap e83: 6 total, 6 up, 6 in
Feb 23 09:47:06 np0005626463.localdomain ceph-mon[294160]: mgrmap e27: np0005626465.hlpkwo(active, since 17s), standbys: np0005626463.wtksup, np0005626466.nisqfq, np0005626461.lrfquh, np0005626460.fyrady
Feb 23 09:47:06 np0005626463.localdomain ceph-mon[294160]: overall HEALTH_OK
Feb 23 09:47:06 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:47:06 np0005626463.localdomain sudo[297582]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Feb 23 09:47:06 np0005626463.localdomain sudo[297582]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:47:06 np0005626463.localdomain sudo[297582]: pam_unix(sudo:session): session closed for user root
Feb 23 09:47:06 np0005626463.localdomain sudo[297600]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/etc/ceph
Feb 23 09:47:06 np0005626463.localdomain sudo[297600]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:47:06 np0005626463.localdomain sudo[297600]: pam_unix(sudo:session): session closed for user root
Feb 23 09:47:06 np0005626463.localdomain sudo[297618]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/etc/ceph/ceph.conf.new
Feb 23 09:47:06 np0005626463.localdomain sudo[297618]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:47:06 np0005626463.localdomain sudo[297618]: pam_unix(sudo:session): session closed for user root
Feb 23 09:47:06 np0005626463.localdomain sudo[297636]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46
Feb 23 09:47:06 np0005626463.localdomain sudo[297636]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:47:06 np0005626463.localdomain sudo[297636]: pam_unix(sudo:session): session closed for user root
Feb 23 09:47:06 np0005626463.localdomain sshd[297658]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:47:06 np0005626463.localdomain sudo[297654]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/etc/ceph/ceph.conf.new
Feb 23 09:47:06 np0005626463.localdomain sudo[297654]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:47:06 np0005626463.localdomain sudo[297654]: pam_unix(sudo:session): session closed for user root
Feb 23 09:47:07 np0005626463.localdomain sudo[297689]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/etc/ceph/ceph.conf.new
Feb 23 09:47:07 np0005626463.localdomain sudo[297689]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:47:07 np0005626463.localdomain sudo[297689]: pam_unix(sudo:session): session closed for user root
Feb 23 09:47:07 np0005626463.localdomain sudo[297707]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/etc/ceph/ceph.conf.new
Feb 23 09:47:07 np0005626463.localdomain sudo[297707]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:47:07 np0005626463.localdomain sudo[297707]: pam_unix(sudo:session): session closed for user root
Feb 23 09:47:07 np0005626463.localdomain sudo[297725]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Feb 23 09:47:07 np0005626463.localdomain sudo[297725]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:47:07 np0005626463.localdomain sudo[297725]: pam_unix(sudo:session): session closed for user root
Feb 23 09:47:07 np0005626463.localdomain sudo[297743]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config
Feb 23 09:47:07 np0005626463.localdomain sudo[297743]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:47:07 np0005626463.localdomain sudo[297743]: pam_unix(sudo:session): session closed for user root
Feb 23 09:47:07 np0005626463.localdomain sudo[297761]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config
Feb 23 09:47:07 np0005626463.localdomain sudo[297761]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:47:07 np0005626463.localdomain sudo[297761]: pam_unix(sudo:session): session closed for user root
Feb 23 09:47:07 np0005626463.localdomain sudo[297779]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf.new
Feb 23 09:47:07 np0005626463.localdomain sudo[297779]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:47:07 np0005626463.localdomain sudo[297779]: pam_unix(sudo:session): session closed for user root
Feb 23 09:47:07 np0005626463.localdomain sudo[297797]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46
Feb 23 09:47:07 np0005626463.localdomain sudo[297797]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:47:07 np0005626463.localdomain sudo[297797]: pam_unix(sudo:session): session closed for user root
Feb 23 09:47:07 np0005626463.localdomain sudo[297815]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf.new
Feb 23 09:47:07 np0005626463.localdomain sudo[297815]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:47:07 np0005626463.localdomain sudo[297815]: pam_unix(sudo:session): session closed for user root
Feb 23 09:47:07 np0005626463.localdomain ceph-mon[294160]: Updating np0005626460.localdomain:/etc/ceph/ceph.conf
Feb 23 09:47:07 np0005626463.localdomain ceph-mon[294160]: Updating np0005626461.localdomain:/etc/ceph/ceph.conf
Feb 23 09:47:07 np0005626463.localdomain ceph-mon[294160]: Updating np0005626463.localdomain:/etc/ceph/ceph.conf
Feb 23 09:47:07 np0005626463.localdomain ceph-mon[294160]: Updating np0005626465.localdomain:/etc/ceph/ceph.conf
Feb 23 09:47:07 np0005626463.localdomain ceph-mon[294160]: Updating np0005626466.localdomain:/etc/ceph/ceph.conf
Feb 23 09:47:07 np0005626463.localdomain ceph-mon[294160]: pgmap v12: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail; 17 KiB/s rd, 0 B/s wr, 9 op/s
Feb 23 09:47:07 np0005626463.localdomain ceph-mon[294160]: Updating np0005626463.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf
Feb 23 09:47:07 np0005626463.localdomain ceph-mon[294160]: Updating np0005626465.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf
Feb 23 09:47:07 np0005626463.localdomain ceph-mon[294160]: Updating np0005626460.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf
Feb 23 09:47:07 np0005626463.localdomain ceph-mon[294160]: Updating np0005626466.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf
Feb 23 09:47:07 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:47:07 np0005626463.localdomain sudo[297849]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf.new
Feb 23 09:47:07 np0005626463.localdomain sudo[297849]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:47:07 np0005626463.localdomain sudo[297849]: pam_unix(sudo:session): session closed for user root
Feb 23 09:47:07 np0005626463.localdomain sudo[297867]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf.new
Feb 23 09:47:07 np0005626463.localdomain sudo[297867]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:47:07 np0005626463.localdomain sudo[297867]: pam_unix(sudo:session): session closed for user root
Feb 23 09:47:07 np0005626463.localdomain sudo[297885]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf.new /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf
Feb 23 09:47:07 np0005626463.localdomain sudo[297885]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:47:07 np0005626463.localdomain sudo[297885]: pam_unix(sudo:session): session closed for user root
Feb 23 09:47:08 np0005626463.localdomain sudo[297903]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 09:47:08 np0005626463.localdomain sudo[297903]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:47:08 np0005626463.localdomain sudo[297903]: pam_unix(sudo:session): session closed for user root
Feb 23 09:47:08 np0005626463.localdomain ceph-mds[286877]: mds.beacon.mds.np0005626463.qcthuc missed beacon ack from the monitors
Feb 23 09:47:08 np0005626463.localdomain ceph-mon[294160]: Updating np0005626461.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf
Feb 23 09:47:08 np0005626463.localdomain ceph-mon[294160]: from='client.44187 -' entity='client.admin' cmd=[{"prefix": "orch host label rm", "hostname": "np0005626460.localdomain", "label": "mon", "target": ["mon-mgr", ""]}]: dispatch
Feb 23 09:47:08 np0005626463.localdomain ceph-mon[294160]: Removed label mon from host np0005626460.localdomain
Feb 23 09:47:08 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:47:08 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:47:08 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:47:08 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:47:08 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:47:08 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:47:08 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:47:08 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:47:08 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:47:08 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:47:08 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:47:08 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 23 09:47:08 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626460", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 23 09:47:08 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626460", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 23 09:47:08 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:47:09 np0005626463.localdomain podman[242954]: time="2026-02-23T09:47:09Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 23 09:47:09 np0005626463.localdomain sshd[297658]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:47:09 np0005626463.localdomain podman[242954]: @ - - [23/Feb/2026:09:47:09 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155258 "" "Go-http-client/1.1"
Feb 23 09:47:09 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.
Feb 23 09:47:09 np0005626463.localdomain podman[242954]: @ - - [23/Feb/2026:09:47:09 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18265 "" "Go-http-client/1.1"
Feb 23 09:47:09 np0005626463.localdomain podman[297922]: 2026-02-23 09:47:09.528686393 +0000 UTC m=+0.086010919 container health_status da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 23 09:47:09 np0005626463.localdomain podman[297922]: 2026-02-23 09:47:09.543318888 +0000 UTC m=+0.100643404 container exec_died da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 23 09:47:09 np0005626463.localdomain systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Deactivated successfully.
Feb 23 09:47:09 np0005626463.localdomain ceph-mon[294160]: Reconfiguring crash.np0005626460 (monmap changed)...
Feb 23 09:47:09 np0005626463.localdomain ceph-mon[294160]: Reconfiguring daemon crash.np0005626460 on np0005626460.localdomain
Feb 23 09:47:09 np0005626463.localdomain ceph-mon[294160]: pgmap v13: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:47:09 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:47:09 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:47:09 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:47:09 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626460.fyrady", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 23 09:47:09 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626460.fyrady", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 23 09:47:09 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "mgr services"} : dispatch
Feb 23 09:47:09 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:47:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:47:09.974 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:47:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:47:09.975 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:47:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:47:09.976 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 09:47:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:47:09.976 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:47:10 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:47:10.020 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:47:10 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:47:10.021 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:47:10 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #19. Immutable memtables: 0.
Feb 23 09:47:10 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:47:10.950973) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Feb 23 09:47:10 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/flush_job.cc:856] [default] [JOB 7] Flushing memtable with next log file: 19
Feb 23 09:47:10 np0005626463.localdomain ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840030951032, "job": 7, "event": "flush_started", "num_memtables": 1, "num_entries": 1161, "num_deletes": 261, "total_data_size": 2570671, "memory_usage": 2632496, "flush_reason": "Manual Compaction"}
Feb 23 09:47:10 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/flush_job.cc:885] [default] [JOB 7] Level-0 flush table #20: started
Feb 23 09:47:10 np0005626463.localdomain ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840030963319, "cf_name": "default", "job": 7, "event": "table_file_creation", "file_number": 20, "file_size": 1548075, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 12435, "largest_seqno": 13591, "table_properties": {"data_size": 1542467, "index_size": 2823, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1733, "raw_key_size": 14474, "raw_average_key_size": 21, "raw_value_size": 1530230, "raw_average_value_size": 2290, "num_data_blocks": 119, "num_entries": 668, "num_filter_entries": 668, "num_deletions": 261, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771840011, "oldest_key_time": 1771840011, "file_creation_time": 1771840030, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4cfd6c8f-aafa-4003-b2f6-d22c49635dd4", "db_session_id": "66DAQ76CBLV8DSGL8JC7", "orig_file_number": 20, "seqno_to_time_mapping": "N/A"}}
Feb 23 09:47:10 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 7] Flush lasted 12400 microseconds, and 3235 cpu microseconds.
Feb 23 09:47:10 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 23 09:47:10 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:47:10.963377) [db/flush_job.cc:967] [default] [JOB 7] Level-0 flush table #20: 1548075 bytes OK
Feb 23 09:47:10 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:47:10.963404) [db/memtable_list.cc:519] [default] Level-0 commit table #20 started
Feb 23 09:47:10 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:47:10.967328) [db/memtable_list.cc:722] [default] Level-0 commit table #20: memtable #1 done
Feb 23 09:47:10 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:47:10.967356) EVENT_LOG_v1 {"time_micros": 1771840030967349, "job": 7, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Feb 23 09:47:10 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:47:10.967379) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Feb 23 09:47:10 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 7] Try to delete WAL files size 2564455, prev total WAL file size 2564779, number of live WAL files 2.
Feb 23 09:47:10 np0005626463.localdomain ceph-mon[294160]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005626463/store.db/000016.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 23 09:47:10 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:47:10.971305) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0033353133' seq:72057594037927935, type:22 .. '6C6F676D0033373639' seq:0, type:0; will stop at (end)
Feb 23 09:47:10 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 8] Compacting 1@0 + 1@6 files to L6, score -1.00
Feb 23 09:47:10 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 7 Base level 0, inputs: [20(1511KB)], [18(16MB)]
Feb 23 09:47:10 np0005626463.localdomain ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840030971359, "job": 8, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [20], "files_L6": [18], "score": -1, "input_data_size": 18543343, "oldest_snapshot_seqno": -1}
Feb 23 09:47:11 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@3(peon).osd e83 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:47:11 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 8] Generated table #21: 10402 keys, 18390887 bytes, temperature: kUnknown
Feb 23 09:47:11 np0005626463.localdomain ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840031123657, "cf_name": "default", "job": 8, "event": "table_file_creation", "file_number": 21, "file_size": 18390887, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 18328410, "index_size": 35368, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 26053, "raw_key_size": 279469, "raw_average_key_size": 26, "raw_value_size": 18147236, "raw_average_value_size": 1744, "num_data_blocks": 1353, "num_entries": 10402, "num_filter_entries": 10402, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771839971, "oldest_key_time": 0, "file_creation_time": 1771840030, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4cfd6c8f-aafa-4003-b2f6-d22c49635dd4", "db_session_id": "66DAQ76CBLV8DSGL8JC7", "orig_file_number": 21, "seqno_to_time_mapping": "N/A"}}
Feb 23 09:47:11 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 23 09:47:11 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:47:11.124082) [db/compaction/compaction_job.cc:1663] [default] [JOB 8] Compacted 1@0 + 1@6 files to L6 => 18390887 bytes
Feb 23 09:47:11 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:47:11.128356) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 121.6 rd, 120.6 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.5, 16.2 +0.0 blob) out(17.5 +0.0 blob), read-write-amplify(23.9) write-amplify(11.9) OK, records in: 10959, records dropped: 557 output_compression: NoCompression
Feb 23 09:47:11 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:47:11.128388) EVENT_LOG_v1 {"time_micros": 1771840031128374, "job": 8, "event": "compaction_finished", "compaction_time_micros": 152446, "compaction_time_cpu_micros": 51845, "output_level": 6, "num_output_files": 1, "total_output_size": 18390887, "num_input_records": 10959, "num_output_records": 10402, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Feb 23 09:47:11 np0005626463.localdomain ceph-mon[294160]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005626463/store.db/000020.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 23 09:47:11 np0005626463.localdomain ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840031128924, "job": 8, "event": "table_file_deletion", "file_number": 20}
Feb 23 09:47:11 np0005626463.localdomain ceph-mon[294160]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005626463/store.db/000018.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 23 09:47:11 np0005626463.localdomain ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840031131506, "job": 8, "event": "table_file_deletion", "file_number": 18}
Feb 23 09:47:11 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:47:10.971199) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 23 09:47:11 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:47:11.131641) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 23 09:47:11 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:47:11.131647) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 23 09:47:11 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:47:11.131650) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 23 09:47:11 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:47:11.131653) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 23 09:47:11 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:47:11.131656) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 23 09:47:11 np0005626463.localdomain ceph-mon[294160]: from='client.26996 -' entity='client.admin' cmd=[{"prefix": "orch host label rm", "hostname": "np0005626460.localdomain", "label": "mgr", "target": ["mon-mgr", ""]}]: dispatch
Feb 23 09:47:11 np0005626463.localdomain ceph-mon[294160]: Removed label mgr from host np0005626460.localdomain
Feb 23 09:47:11 np0005626463.localdomain ceph-mon[294160]: Reconfiguring mgr.np0005626460.fyrady (monmap changed)...
Feb 23 09:47:11 np0005626463.localdomain ceph-mon[294160]: Reconfiguring daemon mgr.np0005626460.fyrady on np0005626460.localdomain
Feb 23 09:47:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:47:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:47:11 np0005626463.localdomain ceph-mon[294160]: Reconfiguring mon.np0005626461 (monmap changed)...
Feb 23 09:47:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Feb 23 09:47:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Feb 23 09:47:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:47:11 np0005626463.localdomain ceph-mon[294160]: Reconfiguring daemon mon.np0005626461 on np0005626461.localdomain
Feb 23 09:47:11 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.108:0/3987578730' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 09:47:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:47:11 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.108:0/2103941566' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 09:47:12 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:47:12.054 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:47:12 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:47:12.055 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 23 09:47:12 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:47:12.055 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 23 09:47:12 np0005626463.localdomain ceph-mon[294160]: from='client.44207 -' entity='client.admin' cmd=[{"prefix": "orch host label rm", "hostname": "np0005626460.localdomain", "label": "_admin", "target": ["mon-mgr", ""]}]: dispatch
Feb 23 09:47:12 np0005626463.localdomain ceph-mon[294160]: Removed label _admin from host np0005626460.localdomain
Feb 23 09:47:12 np0005626463.localdomain ceph-mon[294160]: pgmap v14: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:47:12 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:47:12 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:47:12 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626461.lrfquh", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 23 09:47:12 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626461.lrfquh", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 23 09:47:12 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "mgr services"} : dispatch
Feb 23 09:47:12 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:47:12 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:47:12 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:47:12 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:47:12 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626461", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 23 09:47:12 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626461", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 23 09:47:12 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:47:13 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:47:13.003 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 23 09:47:13 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:47:13.004 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquired lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 23 09:47:13 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:47:13.004 282211 DEBUG nova.network.neutron [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 23 09:47:13 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:47:13.004 282211 DEBUG nova.objects.instance [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c2a7d92b-952f-46a7-8a6a-3322a48fcf4b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 23 09:47:13 np0005626463.localdomain sudo[297945]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 09:47:13 np0005626463.localdomain sudo[297945]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:47:13 np0005626463.localdomain sudo[297945]: pam_unix(sudo:session): session closed for user root
Feb 23 09:47:13 np0005626463.localdomain ceph-mon[294160]: Reconfiguring mgr.np0005626461.lrfquh (monmap changed)...
Feb 23 09:47:13 np0005626463.localdomain ceph-mon[294160]: Reconfiguring daemon mgr.np0005626461.lrfquh on np0005626461.localdomain
Feb 23 09:47:13 np0005626463.localdomain ceph-mon[294160]: Reconfiguring crash.np0005626461 (monmap changed)...
Feb 23 09:47:13 np0005626463.localdomain ceph-mon[294160]: Reconfiguring daemon crash.np0005626461 on np0005626461.localdomain
Feb 23 09:47:13 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.107:0/630921377' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 09:47:13 np0005626463.localdomain ceph-mon[294160]: pgmap v15: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:47:13 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.107:0/287116646' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 09:47:13 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:47:13 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:47:13 np0005626463.localdomain ceph-mon[294160]: Reconfiguring crash.np0005626463 (monmap changed)...
Feb 23 09:47:13 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626463", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 23 09:47:13 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626463", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 23 09:47:13 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:47:13 np0005626463.localdomain ceph-mon[294160]: Reconfiguring daemon crash.np0005626463 on np0005626463.localdomain
Feb 23 09:47:13 np0005626463.localdomain sudo[297963]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid f1fea371-cb69-578d-a3d0-b5c472a84b46
Feb 23 09:47:13 np0005626463.localdomain sudo[297963]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:47:13 np0005626463.localdomain openstack_network_exporter[245358]: ERROR   09:47:13 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 23 09:47:13 np0005626463.localdomain openstack_network_exporter[245358]: 
Feb 23 09:47:13 np0005626463.localdomain openstack_network_exporter[245358]: ERROR   09:47:13 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 23 09:47:13 np0005626463.localdomain openstack_network_exporter[245358]: 
Feb 23 09:47:13 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.
Feb 23 09:47:13 np0005626463.localdomain podman[297996]: 2026-02-23 09:47:13.833365335 +0000 UTC m=+0.085273087 container health_status 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, io.openshift.tags=minimal rhel9, release=1770267347, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, version=9.7, config_id=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-02-05T04:57:10Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, maintainer=Red Hat, Inc., managed_by=edpm_ansible, io.buildah.version=1.33.7, container_name=openstack_network_exporter, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, org.opencontainers.image.created=2026-02-05T04:57:10Z, name=ubi9/ubi-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.openshift.expose-services=)
Feb 23 09:47:13 np0005626463.localdomain podman[297996]: 2026-02-23 09:47:13.84830854 +0000 UTC m=+0.100216262 container exec_died 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, container_name=openstack_network_exporter, architecture=x86_64, release=1770267347, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, vcs-type=git, io.openshift.tags=minimal rhel9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, config_id=openstack_network_exporter, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.buildah.version=1.33.7, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2026-02-05T04:57:10Z, maintainer=Red Hat, Inc., version=9.7, org.opencontainers.image.created=2026-02-05T04:57:10Z, name=ubi9/ubi-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Feb 23 09:47:13 np0005626463.localdomain systemd[1]: 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.service: Deactivated successfully.
Feb 23 09:47:13 np0005626463.localdomain podman[298004]: 
Feb 23 09:47:13 np0005626463.localdomain podman[298004]: 2026-02-23 09:47:13.93342385 +0000 UTC m=+0.163879639 container create 4b5da48da9cda6c7a0e01fc47e88d86c55405112aa3b15f082ef48e58c1fa3c4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=cool_rhodes, version=7, ceph=True, vendor=Red Hat, Inc., distribution-scope=public, name=rhceph, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, release=1770267347, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.buildah.version=1.42.2, build-date=2026-02-09T10:25:24Z, CEPH_POINT_RELEASE=, org.opencontainers.image.created=2026-02-09T10:25:24Z, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64)
Feb 23 09:47:13 np0005626463.localdomain systemd[1]: Started libpod-conmon-4b5da48da9cda6c7a0e01fc47e88d86c55405112aa3b15f082ef48e58c1fa3c4.scope.
Feb 23 09:47:14 np0005626463.localdomain podman[298004]: 2026-02-23 09:47:13.907777089 +0000 UTC m=+0.138232868 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 23 09:47:14 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:47:14.006 282211 DEBUG nova.network.neutron [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Updating instance_info_cache with network_info: [{"id": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "address": "fa:16:3e:a0:9d:00", "network": {"id": "9da5b53d-3184-450f-9a5b-bdba1a6c9f6d", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "37b8098efb0d4ecc90b451a2db0e966f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa27e5011-20", "ovs_interfaceid": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 23 09:47:14 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 09:47:14 np0005626463.localdomain podman[298004]: 2026-02-23 09:47:14.02478019 +0000 UTC m=+0.255235969 container init 4b5da48da9cda6c7a0e01fc47e88d86c55405112aa3b15f082ef48e58c1fa3c4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=cool_rhodes, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=rhceph-container, release=1770267347, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-02-09T10:25:24Z, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.42.2, io.openshift.expose-services=, RELEASE=main, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, version=7, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, distribution-scope=public, ceph=True, GIT_CLEAN=True, vendor=Red Hat, Inc., vcs-type=git, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_BRANCH=main)
Feb 23 09:47:14 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:47:14.034 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Releasing lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 23 09:47:14 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:47:14.034 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 23 09:47:14 np0005626463.localdomain cool_rhodes[298035]: 167 167
Feb 23 09:47:14 np0005626463.localdomain systemd[1]: libpod-4b5da48da9cda6c7a0e01fc47e88d86c55405112aa3b15f082ef48e58c1fa3c4.scope: Deactivated successfully.
Feb 23 09:47:14 np0005626463.localdomain podman[298004]: 2026-02-23 09:47:14.03823484 +0000 UTC m=+0.268690629 container start 4b5da48da9cda6c7a0e01fc47e88d86c55405112aa3b15f082ef48e58c1fa3c4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=cool_rhodes, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, build-date=2026-02-09T10:25:24Z, version=7, distribution-scope=public, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, ceph=True, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., io.openshift.expose-services=, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.42.2, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1770267347, vcs-type=git, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_CLEAN=True)
Feb 23 09:47:14 np0005626463.localdomain podman[298004]: 2026-02-23 09:47:14.038484058 +0000 UTC m=+0.268939857 container attach 4b5da48da9cda6c7a0e01fc47e88d86c55405112aa3b15f082ef48e58c1fa3c4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=cool_rhodes, name=rhceph, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, CEPH_POINT_RELEASE=, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.buildah.version=1.42.2, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, build-date=2026-02-09T10:25:24Z, distribution-scope=public, org.opencontainers.image.created=2026-02-09T10:25:24Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, vendor=Red Hat, Inc., RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, architecture=x86_64, io.openshift.expose-services=, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=rhceph-container, release=1770267347, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Feb 23 09:47:14 np0005626463.localdomain podman[298004]: 2026-02-23 09:47:14.040646844 +0000 UTC m=+0.271102623 container died 4b5da48da9cda6c7a0e01fc47e88d86c55405112aa3b15f082ef48e58c1fa3c4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=cool_rhodes, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, release=1770267347, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, build-date=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container, io.openshift.expose-services=, CEPH_POINT_RELEASE=, name=rhceph, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_BRANCH=main, distribution-scope=public, version=7, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.42.2, architecture=x86_64)
Feb 23 09:47:14 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:47:14.053 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:47:14 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:47:14.073 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:47:14 np0005626463.localdomain podman[298040]: 2026-02-23 09:47:14.132129377 +0000 UTC m=+0.084934455 container remove 4b5da48da9cda6c7a0e01fc47e88d86c55405112aa3b15f082ef48e58c1fa3c4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=cool_rhodes, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, build-date=2026-02-09T10:25:24Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, release=1770267347, io.buildah.version=1.42.2, architecture=x86_64, name=rhceph, GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, ceph=True, version=7, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, RELEASE=main, io.openshift.expose-services=, org.opencontainers.image.created=2026-02-09T10:25:24Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Feb 23 09:47:14 np0005626463.localdomain systemd[1]: libpod-conmon-4b5da48da9cda6c7a0e01fc47e88d86c55405112aa3b15f082ef48e58c1fa3c4.scope: Deactivated successfully.
Feb 23 09:47:14 np0005626463.localdomain sudo[297963]: pam_unix(sudo:session): session closed for user root
Feb 23 09:47:14 np0005626463.localdomain sudo[298056]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 09:47:14 np0005626463.localdomain sudo[298056]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:47:14 np0005626463.localdomain sudo[298056]: pam_unix(sudo:session): session closed for user root
Feb 23 09:47:14 np0005626463.localdomain sudo[298074]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid f1fea371-cb69-578d-a3d0-b5c472a84b46
Feb 23 09:47:14 np0005626463.localdomain sudo[298074]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:47:14 np0005626463.localdomain podman[298108]: 
Feb 23 09:47:14 np0005626463.localdomain podman[298108]: 2026-02-23 09:47:14.804536392 +0000 UTC m=+0.074447526 container create 6a1e9f239e7881ef12008136f3a9816cb429ef48c9dd8ec386b913da44593aa5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vigorous_carver, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., name=rhceph, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.42.2, release=1770267347, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, org.opencontainers.image.created=2026-02-09T10:25:24Z, build-date=2026-02-09T10:25:24Z, GIT_BRANCH=main, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, ceph=True, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 23 09:47:14 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-8cffeeb07020634fd5a1dcbf109b156005d2467919d0716ab65955ebeb38f74e-merged.mount: Deactivated successfully.
Feb 23 09:47:14 np0005626463.localdomain systemd[1]: Started libpod-conmon-6a1e9f239e7881ef12008136f3a9816cb429ef48c9dd8ec386b913da44593aa5.scope.
Feb 23 09:47:14 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 09:47:14 np0005626463.localdomain podman[298108]: 2026-02-23 09:47:14.775608902 +0000 UTC m=+0.045520076 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 23 09:47:14 np0005626463.localdomain podman[298108]: 2026-02-23 09:47:14.883703891 +0000 UTC m=+0.153615015 container init 6a1e9f239e7881ef12008136f3a9816cb429ef48c9dd8ec386b913da44593aa5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vigorous_carver, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, RELEASE=main, org.opencontainers.image.created=2026-02-09T10:25:24Z, name=rhceph, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, distribution-scope=public, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, io.buildah.version=1.42.2, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1770267347, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, build-date=2026-02-09T10:25:24Z)
Feb 23 09:47:14 np0005626463.localdomain vigorous_carver[298124]: 167 167
Feb 23 09:47:14 np0005626463.localdomain podman[298108]: 2026-02-23 09:47:14.895572023 +0000 UTC m=+0.165483157 container start 6a1e9f239e7881ef12008136f3a9816cb429ef48c9dd8ec386b913da44593aa5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vigorous_carver, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, release=1770267347, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2026-02-09T10:25:24Z, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, CEPH_POINT_RELEASE=, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, distribution-scope=public, vendor=Red Hat, Inc., name=rhceph, io.buildah.version=1.42.2, description=Red Hat Ceph Storage 7, version=7)
Feb 23 09:47:14 np0005626463.localdomain podman[298108]: 2026-02-23 09:47:14.895965155 +0000 UTC m=+0.165876319 container attach 6a1e9f239e7881ef12008136f3a9816cb429ef48c9dd8ec386b913da44593aa5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vigorous_carver, ceph=True, io.openshift.tags=rhceph ceph, build-date=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=7, name=rhceph, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, release=1770267347, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.openshift.expose-services=, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.42.2, RELEASE=main)
Feb 23 09:47:14 np0005626463.localdomain systemd[1]: libpod-6a1e9f239e7881ef12008136f3a9816cb429ef48c9dd8ec386b913da44593aa5.scope: Deactivated successfully.
Feb 23 09:47:14 np0005626463.localdomain podman[298108]: 2026-02-23 09:47:14.900752151 +0000 UTC m=+0.170663285 container died 6a1e9f239e7881ef12008136f3a9816cb429ef48c9dd8ec386b913da44593aa5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vigorous_carver, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., GIT_CLEAN=True, release=1770267347, build-date=2026-02-09T10:25:24Z, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, version=7, ceph=True, architecture=x86_64, CEPH_POINT_RELEASE=, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, RELEASE=main, vcs-type=git, io.buildah.version=1.42.2, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2026-02-09T10:25:24Z, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Feb 23 09:47:15 np0005626463.localdomain podman[298129]: 2026-02-23 09:47:15.050625402 +0000 UTC m=+0.144209150 container remove 6a1e9f239e7881ef12008136f3a9816cb429ef48c9dd8ec386b913da44593aa5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vigorous_carver, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, vcs-type=git, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, io.openshift.expose-services=, org.opencontainers.image.created=2026-02-09T10:25:24Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, io.buildah.version=1.42.2, GIT_CLEAN=True, name=rhceph, RELEASE=main, vendor=Red Hat, Inc., distribution-scope=public, com.redhat.component=rhceph-container, GIT_BRANCH=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2026-02-09T10:25:24Z, release=1770267347, url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=)
Feb 23 09:47:15 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:47:15.054 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:47:15 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:47:15.056 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:47:15 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:47:15.056 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:47:15 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:47:15.057 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5035 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 09:47:15 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:47:15.057 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:47:15 np0005626463.localdomain systemd[1]: libpod-conmon-6a1e9f239e7881ef12008136f3a9816cb429ef48c9dd8ec386b913da44593aa5.scope: Deactivated successfully.
Feb 23 09:47:15 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:47:15.059 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:47:15 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:47:15.060 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:47:15 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:47:15 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:47:15 np0005626463.localdomain ceph-mon[294160]: Reconfiguring osd.2 (monmap changed)...
Feb 23 09:47:15 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch
Feb 23 09:47:15 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:47:15 np0005626463.localdomain ceph-mon[294160]: Reconfiguring daemon osd.2 on np0005626463.localdomain
Feb 23 09:47:15 np0005626463.localdomain sudo[298074]: pam_unix(sudo:session): session closed for user root
Feb 23 09:47:15 np0005626463.localdomain sudo[298154]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 09:47:15 np0005626463.localdomain sudo[298154]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:47:15 np0005626463.localdomain sudo[298154]: pam_unix(sudo:session): session closed for user root
Feb 23 09:47:15 np0005626463.localdomain sudo[298172]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid f1fea371-cb69-578d-a3d0-b5c472a84b46
Feb 23 09:47:15 np0005626463.localdomain sudo[298172]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:47:15 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-4eee731cae5fdc18be0c10d2891fe31ef65d4d7fe265dcce23a01754e5da6e90-merged.mount: Deactivated successfully.
Feb 23 09:47:15 np0005626463.localdomain podman[298206]: 
Feb 23 09:47:15 np0005626463.localdomain podman[298206]: 2026-02-23 09:47:15.896949739 +0000 UTC m=+0.075854139 container create c3d69c241f38bd730468dd9eb786baa4991cf5be4edfe2c6c480b49333804202 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=clever_kare, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, architecture=x86_64, GIT_CLEAN=True, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, CEPH_POINT_RELEASE=, name=rhceph, vcs-type=git, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, io.buildah.version=1.42.2, release=1770267347, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.created=2026-02-09T10:25:24Z, description=Red Hat Ceph Storage 7, build-date=2026-02-09T10:25:24Z, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph)
Feb 23 09:47:15 np0005626463.localdomain systemd[1]: Started libpod-conmon-c3d69c241f38bd730468dd9eb786baa4991cf5be4edfe2c6c480b49333804202.scope.
Feb 23 09:47:15 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 09:47:15 np0005626463.localdomain podman[298206]: 2026-02-23 09:47:15.866388119 +0000 UTC m=+0.045292569 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 23 09:47:15 np0005626463.localdomain podman[298206]: 2026-02-23 09:47:15.96729052 +0000 UTC m=+0.146194930 container init c3d69c241f38bd730468dd9eb786baa4991cf5be4edfe2c6c480b49333804202 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=clever_kare, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, vcs-type=git, build-date=2026-02-09T10:25:24Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1770267347, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, vendor=Red Hat, Inc., vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.42.2, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, version=7, GIT_BRANCH=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.created=2026-02-09T10:25:24Z, name=rhceph, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True)
Feb 23 09:47:15 np0005626463.localdomain podman[298206]: 2026-02-23 09:47:15.977771689 +0000 UTC m=+0.156676099 container start c3d69c241f38bd730468dd9eb786baa4991cf5be4edfe2c6c480b49333804202 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=clever_kare, GIT_BRANCH=main, release=1770267347, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2026-02-09T10:25:24Z, name=rhceph, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.buildah.version=1.42.2, io.k8s.description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, GIT_CLEAN=True, vendor=Red Hat, Inc., version=7, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True)
Feb 23 09:47:15 np0005626463.localdomain clever_kare[298221]: 167 167
Feb 23 09:47:15 np0005626463.localdomain systemd[1]: libpod-c3d69c241f38bd730468dd9eb786baa4991cf5be4edfe2c6c480b49333804202.scope: Deactivated successfully.
Feb 23 09:47:15 np0005626463.localdomain podman[298206]: 2026-02-23 09:47:15.979404089 +0000 UTC m=+0.158308489 container attach c3d69c241f38bd730468dd9eb786baa4991cf5be4edfe2c6c480b49333804202 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=clever_kare, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.42.2, CEPH_POINT_RELEASE=, build-date=2026-02-09T10:25:24Z, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, io.openshift.expose-services=, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-type=git, GIT_BRANCH=main, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, org.opencontainers.image.created=2026-02-09T10:25:24Z, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1770267347, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., GIT_CLEAN=True)
Feb 23 09:47:15 np0005626463.localdomain podman[298206]: 2026-02-23 09:47:15.984419531 +0000 UTC m=+0.163323931 container died c3d69c241f38bd730468dd9eb786baa4991cf5be4edfe2c6c480b49333804202 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=clever_kare, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, ceph=True, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, release=1770267347, distribution-scope=public, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.created=2026-02-09T10:25:24Z, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, GIT_BRANCH=main, io.buildah.version=1.42.2)
Feb 23 09:47:16 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:47:16.053 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:47:16 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:47:16.054 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:47:16 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:47:16.055 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:47:16 np0005626463.localdomain podman[298226]: 2026-02-23 09:47:16.074100461 +0000 UTC m=+0.080924933 container remove c3d69c241f38bd730468dd9eb786baa4991cf5be4edfe2c6c480b49333804202 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=clever_kare, io.buildah.version=1.42.2, version=7, distribution-scope=public, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, build-date=2026-02-09T10:25:24Z, description=Red Hat Ceph Storage 7, release=1770267347, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.tags=rhceph ceph, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, CEPH_POINT_RELEASE=, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, vcs-type=git, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.created=2026-02-09T10:25:24Z)
Feb 23 09:47:16 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:47:16.075 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 09:47:16 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:47:16.077 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 09:47:16 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:47:16.077 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 09:47:16 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:47:16.079 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Auditing locally available compute resources for np0005626463.localdomain (node: np0005626463.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 23 09:47:16 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:47:16.080 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 09:47:16 np0005626463.localdomain systemd[1]: libpod-conmon-c3d69c241f38bd730468dd9eb786baa4991cf5be4edfe2c6c480b49333804202.scope: Deactivated successfully.
Feb 23 09:47:16 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@3(peon).osd e83 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:47:16 np0005626463.localdomain sudo[298172]: pam_unix(sudo:session): session closed for user root
Feb 23 09:47:16 np0005626463.localdomain ceph-mon[294160]: pgmap v16: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:47:16 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:47:16 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:47:16 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch
Feb 23 09:47:16 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:47:16 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:47:16 np0005626463.localdomain sudo[298270]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 09:47:16 np0005626463.localdomain sudo[298270]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:47:16 np0005626463.localdomain sudo[298270]: pam_unix(sudo:session): session closed for user root
Feb 23 09:47:16 np0005626463.localdomain sudo[298288]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid f1fea371-cb69-578d-a3d0-b5c472a84b46
Feb 23 09:47:16 np0005626463.localdomain sudo[298288]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:47:16 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:47:16.511 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.432s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 09:47:16 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:47:16.587 282211 DEBUG nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 23 09:47:16 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:47:16.588 282211 DEBUG nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 23 09:47:16 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:47:16.796 282211 WARNING nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 23 09:47:16 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:47:16.798 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Hypervisor/Node resource view: name=np0005626463.localdomain free_ram=11807MB free_disk=41.8366584777832GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 23 09:47:16 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:47:16.799 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 09:47:16 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:47:16.799 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 09:47:16 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-f2d55bfe64961b6a0033d5af37241c0299c9a0526d0cd67a6752868fa1a936d9-merged.mount: Deactivated successfully.
Feb 23 09:47:16 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:47:16.876 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Instance c2a7d92b-952f-46a7-8a6a-3322a48fcf4b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 23 09:47:16 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:47:16.877 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 23 09:47:16 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:47:16.877 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Final resource view: name=np0005626463.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 23 09:47:16 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:47:16.925 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 09:47:16 np0005626463.localdomain podman[298324]: 2026-02-23 09:47:16.941136239 +0000 UTC m=+0.079466849 container create 9e58a899972f32947268d546395271db818a1fea4b277e859a7625b8ef46ca77 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=kind_mestorf, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, architecture=x86_64, io.buildah.version=1.42.2, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1770267347, io.openshift.tags=rhceph ceph, name=rhceph, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2026-02-09T10:25:24Z, com.redhat.component=rhceph-container, io.openshift.expose-services=, ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, version=7, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2026-02-09T10:25:24Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., vcs-type=git, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, distribution-scope=public)
Feb 23 09:47:16 np0005626463.localdomain systemd[1]: Started libpod-conmon-9e58a899972f32947268d546395271db818a1fea4b277e859a7625b8ef46ca77.scope.
Feb 23 09:47:16 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 09:47:17 np0005626463.localdomain podman[298324]: 2026-02-23 09:47:17.007673054 +0000 UTC m=+0.146003664 container init 9e58a899972f32947268d546395271db818a1fea4b277e859a7625b8ef46ca77 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=kind_mestorf, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, RELEASE=main, GIT_CLEAN=True, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-type=git, CEPH_POINT_RELEASE=, io.buildah.version=1.42.2, vendor=Red Hat, Inc., version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=rhceph-container, distribution-scope=public, GIT_BRANCH=main, release=1770267347, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, build-date=2026-02-09T10:25:24Z, name=rhceph)
Feb 23 09:47:17 np0005626463.localdomain podman[298324]: 2026-02-23 09:47:16.910370463 +0000 UTC m=+0.048701113 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 23 09:47:17 np0005626463.localdomain podman[298324]: 2026-02-23 09:47:17.023783114 +0000 UTC m=+0.162113724 container start 9e58a899972f32947268d546395271db818a1fea4b277e859a7625b8ef46ca77 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=kind_mestorf, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, release=1770267347, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, CEPH_POINT_RELEASE=, version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, GIT_BRANCH=main, vcs-type=git, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2026-02-09T10:25:24Z, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.buildah.version=1.42.2, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14)
Feb 23 09:47:17 np0005626463.localdomain podman[298324]: 2026-02-23 09:47:17.024031381 +0000 UTC m=+0.162362041 container attach 9e58a899972f32947268d546395271db818a1fea4b277e859a7625b8ef46ca77 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=kind_mestorf, architecture=x86_64, io.buildah.version=1.42.2, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, CEPH_POINT_RELEASE=, distribution-scope=public, release=1770267347, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, name=rhceph, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, build-date=2026-02-09T10:25:24Z, vendor=Red Hat, Inc., ceph=True, com.redhat.component=rhceph-container, RELEASE=main, org.opencontainers.image.created=2026-02-09T10:25:24Z, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Feb 23 09:47:17 np0005626463.localdomain kind_mestorf[298339]: 167 167
Feb 23 09:47:17 np0005626463.localdomain systemd[1]: libpod-9e58a899972f32947268d546395271db818a1fea4b277e859a7625b8ef46ca77.scope: Deactivated successfully.
Feb 23 09:47:17 np0005626463.localdomain podman[298324]: 2026-02-23 09:47:17.02723453 +0000 UTC m=+0.165565160 container died 9e58a899972f32947268d546395271db818a1fea4b277e859a7625b8ef46ca77 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=kind_mestorf, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, vendor=Red Hat, Inc., ceph=True, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, version=7, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, io.buildah.version=1.42.2, vcs-type=git, distribution-scope=public, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, CEPH_POINT_RELEASE=, RELEASE=main, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-02-09T10:25:24Z, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, release=1770267347, architecture=x86_64, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Feb 23 09:47:17 np0005626463.localdomain podman[298344]: 2026-02-23 09:47:17.10973357 +0000 UTC m=+0.069235558 container remove 9e58a899972f32947268d546395271db818a1fea4b277e859a7625b8ef46ca77 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=kind_mestorf, RELEASE=main, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, architecture=x86_64, GIT_BRANCH=main, release=1770267347, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2026-02-09T10:25:24Z, io.buildah.version=1.42.2, org.opencontainers.image.created=2026-02-09T10:25:24Z, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.component=rhceph-container, name=rhceph, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public)
Feb 23 09:47:17 np0005626463.localdomain systemd[1]: libpod-conmon-9e58a899972f32947268d546395271db818a1fea4b277e859a7625b8ef46ca77.scope: Deactivated successfully.
Feb 23 09:47:17 np0005626463.localdomain sudo[298288]: pam_unix(sudo:session): session closed for user root
Feb 23 09:47:17 np0005626463.localdomain sudo[298378]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 09:47:17 np0005626463.localdomain sudo[298378]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:47:17 np0005626463.localdomain sudo[298378]: pam_unix(sudo:session): session closed for user root
Feb 23 09:47:17 np0005626463.localdomain ceph-mon[294160]: Reconfiguring osd.5 (monmap changed)...
Feb 23 09:47:17 np0005626463.localdomain ceph-mon[294160]: Reconfiguring daemon osd.5 on np0005626463.localdomain
Feb 23 09:47:17 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:47:17 np0005626463.localdomain ceph-mon[294160]: Reconfiguring mds.mds.np0005626463.qcthuc (monmap changed)...
Feb 23 09:47:17 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005626463.qcthuc", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 23 09:47:17 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005626463.qcthuc", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 23 09:47:17 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:47:17 np0005626463.localdomain ceph-mon[294160]: Reconfiguring daemon mds.mds.np0005626463.qcthuc on np0005626463.localdomain
Feb 23 09:47:17 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.106:0/1388002695' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 09:47:17 np0005626463.localdomain ceph-mon[294160]: pgmap v17: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:47:17 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:47:17 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:47:17 np0005626463.localdomain ceph-mon[294160]: Reconfiguring mgr.np0005626463.wtksup (monmap changed)...
Feb 23 09:47:17 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626463.wtksup", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 23 09:47:17 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626463.wtksup", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 23 09:47:17 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "mgr services"} : dispatch
Feb 23 09:47:17 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:47:17 np0005626463.localdomain ceph-mon[294160]: Reconfiguring daemon mgr.np0005626463.wtksup on np0005626463.localdomain
Feb 23 09:47:17 np0005626463.localdomain sudo[298396]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid f1fea371-cb69-578d-a3d0-b5c472a84b46
Feb 23 09:47:17 np0005626463.localdomain sudo[298396]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:47:17 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@3(peon) e10 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 23 09:47:17 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/762953382' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 09:47:17 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:47:17.389 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.464s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 09:47:17 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:47:17.397 282211 DEBUG nova.compute.provider_tree [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Inventory has not changed in ProviderTree for provider: be63d86c-a403-4ec9-a515-07ea2962cb4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 23 09:47:17 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:47:17.422 282211 DEBUG nova.scheduler.client.report [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Inventory has not changed for provider be63d86c-a403-4ec9-a515-07ea2962cb4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 23 09:47:17 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:47:17.424 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Compute_service record updated for np0005626463.localdomain:np0005626463.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 23 09:47:17 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:47:17.425 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.625s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 09:47:17 np0005626463.localdomain systemd[1]: tmp-crun.IzyYO4.mount: Deactivated successfully.
Feb 23 09:47:17 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-919defabe99933a0a258bd21b4a2c3d2b7cdde0f5b23926e44268ee7cf387d34-merged.mount: Deactivated successfully.
Feb 23 09:47:17 np0005626463.localdomain podman[298433]: 2026-02-23 09:47:17.82962922 +0000 UTC m=+0.088718411 container create ebcf31195723dadb5265b7dcc1b65000be597a9629273c58c556bc40b26372e5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=amazing_williams, version=7, vcs-type=git, GIT_CLEAN=True, com.redhat.component=rhceph-container, distribution-scope=public, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2026-02-09T10:25:24Z, vendor=Red Hat, Inc., vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, description=Red Hat Ceph Storage 7, name=rhceph, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, release=1770267347, GIT_BRANCH=main, io.buildah.version=1.42.2, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers)
Feb 23 09:47:17 np0005626463.localdomain systemd[1]: Started libpod-conmon-ebcf31195723dadb5265b7dcc1b65000be597a9629273c58c556bc40b26372e5.scope.
Feb 23 09:47:17 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 09:47:17 np0005626463.localdomain podman[298433]: 2026-02-23 09:47:17.788422526 +0000 UTC m=+0.047511747 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 23 09:47:17 np0005626463.localdomain podman[298433]: 2026-02-23 09:47:17.90485609 +0000 UTC m=+0.163945321 container init ebcf31195723dadb5265b7dcc1b65000be597a9629273c58c556bc40b26372e5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=amazing_williams, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.42.2, com.redhat.component=rhceph-container, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, RELEASE=main, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, distribution-scope=public, build-date=2026-02-09T10:25:24Z, GIT_CLEAN=True, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.openshift.expose-services=, release=1770267347, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph)
Feb 23 09:47:17 np0005626463.localdomain podman[298433]: 2026-02-23 09:47:17.912466512 +0000 UTC m=+0.171555713 container start ebcf31195723dadb5265b7dcc1b65000be597a9629273c58c556bc40b26372e5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=amazing_williams, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.42.2, distribution-scope=public, CEPH_POINT_RELEASE=, release=1770267347, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, ceph=True, build-date=2026-02-09T10:25:24Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.created=2026-02-09T10:25:24Z, description=Red Hat Ceph Storage 7, version=7, vendor=Red Hat, Inc., GIT_CLEAN=True)
Feb 23 09:47:17 np0005626463.localdomain podman[298433]: 2026-02-23 09:47:17.912947496 +0000 UTC m=+0.172036787 container attach ebcf31195723dadb5265b7dcc1b65000be597a9629273c58c556bc40b26372e5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=amazing_williams, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, build-date=2026-02-09T10:25:24Z, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, org.opencontainers.image.created=2026-02-09T10:25:24Z, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, ceph=True, io.buildah.version=1.42.2, name=rhceph, version=7, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, release=1770267347, RELEASE=main, com.redhat.component=rhceph-container)
Feb 23 09:47:17 np0005626463.localdomain amazing_williams[298448]: 167 167
Feb 23 09:47:17 np0005626463.localdomain systemd[1]: libpod-ebcf31195723dadb5265b7dcc1b65000be597a9629273c58c556bc40b26372e5.scope: Deactivated successfully.
Feb 23 09:47:17 np0005626463.localdomain podman[298433]: 2026-02-23 09:47:17.916081131 +0000 UTC m=+0.175170332 container died ebcf31195723dadb5265b7dcc1b65000be597a9629273c58c556bc40b26372e5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=amazing_williams, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.42.2, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, build-date=2026-02-09T10:25:24Z, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., architecture=x86_64, RELEASE=main, io.openshift.expose-services=, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_BRANCH=main, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, ceph=True, release=1770267347, GIT_REPO=https://github.com/ceph/ceph-container.git)
Feb 23 09:47:18 np0005626463.localdomain podman[298453]: 2026-02-23 09:47:18.019103177 +0000 UTC m=+0.090102513 container remove ebcf31195723dadb5265b7dcc1b65000be597a9629273c58c556bc40b26372e5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=amazing_williams, ceph=True, io.openshift.tags=rhceph ceph, vcs-type=git, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, release=1770267347, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.42.2, org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, build-date=2026-02-09T10:25:24Z, GIT_BRANCH=main, RELEASE=main, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=)
Feb 23 09:47:18 np0005626463.localdomain systemd[1]: libpod-conmon-ebcf31195723dadb5265b7dcc1b65000be597a9629273c58c556bc40b26372e5.scope: Deactivated successfully.
Feb 23 09:47:18 np0005626463.localdomain sudo[298396]: pam_unix(sudo:session): session closed for user root
Feb 23 09:47:18 np0005626463.localdomain sudo[298470]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 09:47:18 np0005626463.localdomain sudo[298470]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:47:18 np0005626463.localdomain sudo[298470]: pam_unix(sudo:session): session closed for user root
Feb 23 09:47:18 np0005626463.localdomain sudo[298488]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid f1fea371-cb69-578d-a3d0-b5c472a84b46
Feb 23 09:47:18 np0005626463.localdomain sudo[298488]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:47:18 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.106:0/762953382' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 09:47:18 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:47:18 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:47:18 np0005626463.localdomain ceph-mon[294160]: Reconfiguring mon.np0005626463 (monmap changed)...
Feb 23 09:47:18 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Feb 23 09:47:18 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Feb 23 09:47:18 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:47:18 np0005626463.localdomain ceph-mon[294160]: Reconfiguring daemon mon.np0005626463 on np0005626463.localdomain
Feb 23 09:47:18 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:47:18.424 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:47:18 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:47:18.425 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:47:18 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:47:18.425 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:47:18 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:47:18.425 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 23 09:47:18 np0005626463.localdomain podman[298523]: 2026-02-23 09:47:18.696322037 +0000 UTC m=+0.075211739 container create 4e260e59072219a7fc60e3a60394d6972bef3e4481460e7cd128be3a12b3f932 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nifty_archimedes, architecture=x86_64, io.openshift.expose-services=, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.42.2, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.created=2026-02-09T10:25:24Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, distribution-scope=public, com.redhat.component=rhceph-container, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, build-date=2026-02-09T10:25:24Z, description=Red Hat Ceph Storage 7, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, GIT_CLEAN=True, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, release=1770267347, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14)
Feb 23 09:47:18 np0005626463.localdomain systemd[1]: Started libpod-conmon-4e260e59072219a7fc60e3a60394d6972bef3e4481460e7cd128be3a12b3f932.scope.
Feb 23 09:47:18 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 09:47:18 np0005626463.localdomain podman[298523]: 2026-02-23 09:47:18.761945535 +0000 UTC m=+0.140835227 container init 4e260e59072219a7fc60e3a60394d6972bef3e4481460e7cd128be3a12b3f932 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nifty_archimedes, io.k8s.description=Red Hat Ceph Storage 7, version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.42.2, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, RELEASE=main, architecture=x86_64, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, build-date=2026-02-09T10:25:24Z, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, release=1770267347, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, vcs-type=git, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public)
Feb 23 09:47:18 np0005626463.localdomain podman[298523]: 2026-02-23 09:47:18.665047526 +0000 UTC m=+0.043937238 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 23 09:47:18 np0005626463.localdomain podman[298523]: 2026-02-23 09:47:18.769440774 +0000 UTC m=+0.148330466 container start 4e260e59072219a7fc60e3a60394d6972bef3e4481460e7cd128be3a12b3f932 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nifty_archimedes, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, org.opencontainers.image.created=2026-02-09T10:25:24Z, com.redhat.component=rhceph-container, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, build-date=2026-02-09T10:25:24Z, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, io.openshift.expose-services=, vendor=Red Hat, Inc., distribution-scope=public, release=1770267347, CEPH_POINT_RELEASE=, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, io.buildah.version=1.42.2, GIT_CLEAN=True)
Feb 23 09:47:18 np0005626463.localdomain podman[298523]: 2026-02-23 09:47:18.76965533 +0000 UTC m=+0.148545042 container attach 4e260e59072219a7fc60e3a60394d6972bef3e4481460e7cd128be3a12b3f932 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nifty_archimedes, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, ceph=True, name=rhceph, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_CLEAN=True, GIT_BRANCH=main, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1770267347, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, description=Red Hat Ceph Storage 7, io.buildah.version=1.42.2, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, build-date=2026-02-09T10:25:24Z, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, version=7, vcs-type=git, io.openshift.expose-services=)
Feb 23 09:47:18 np0005626463.localdomain nifty_archimedes[298539]: 167 167
Feb 23 09:47:18 np0005626463.localdomain systemd[1]: libpod-4e260e59072219a7fc60e3a60394d6972bef3e4481460e7cd128be3a12b3f932.scope: Deactivated successfully.
Feb 23 09:47:18 np0005626463.localdomain podman[298523]: 2026-02-23 09:47:18.77163503 +0000 UTC m=+0.150524752 container died 4e260e59072219a7fc60e3a60394d6972bef3e4481460e7cd128be3a12b3f932 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nifty_archimedes, ceph=True, RELEASE=main, vcs-type=git, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.expose-services=, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, release=1770267347, version=7, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, GIT_CLEAN=True, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, io.buildah.version=1.42.2, build-date=2026-02-09T10:25:24Z, com.redhat.component=rhceph-container)
Feb 23 09:47:18 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-ecaa1c383fa4bc26139d8ccc82e0cbd570b8b02a52542aecba0e731f2ceeddc0-merged.mount: Deactivated successfully.
Feb 23 09:47:18 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-048046cdb304d4b4b16d8ee31efbca59063103a54e3e2012616f105b0f3e6f4f-merged.mount: Deactivated successfully.
Feb 23 09:47:18 np0005626463.localdomain podman[298544]: 2026-02-23 09:47:18.886926898 +0000 UTC m=+0.103196491 container remove 4e260e59072219a7fc60e3a60394d6972bef3e4481460e7cd128be3a12b3f932 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nifty_archimedes, GIT_BRANCH=main, ceph=True, CEPH_POINT_RELEASE=, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.42.2, org.opencontainers.image.created=2026-02-09T10:25:24Z, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, build-date=2026-02-09T10:25:24Z, vendor=Red Hat, Inc., GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, RELEASE=main, architecture=x86_64, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, release=1770267347)
Feb 23 09:47:18 np0005626463.localdomain systemd[1]: libpod-conmon-4e260e59072219a7fc60e3a60394d6972bef3e4481460e7cd128be3a12b3f932.scope: Deactivated successfully.
Feb 23 09:47:18 np0005626463.localdomain sudo[298488]: pam_unix(sudo:session): session closed for user root
Feb 23 09:47:19 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:47:19 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:47:19 np0005626463.localdomain ceph-mon[294160]: Reconfiguring crash.np0005626465 (monmap changed)...
Feb 23 09:47:19 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626465", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 23 09:47:19 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626465", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 23 09:47:19 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:47:19 np0005626463.localdomain ceph-mon[294160]: Reconfiguring daemon crash.np0005626465 on np0005626465.localdomain
Feb 23 09:47:19 np0005626463.localdomain ceph-mon[294160]: pgmap v18: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:47:19 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:47:19 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:47:19 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch
Feb 23 09:47:19 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:47:20 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:47:20.060 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:47:20 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:47:20.062 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:47:20 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:47:20.062 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 09:47:20 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:47:20.062 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:47:20 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:47:20.088 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:47:20 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:47:20.089 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:47:21 np0005626463.localdomain ceph-mon[294160]: Reconfiguring osd.0 (monmap changed)...
Feb 23 09:47:21 np0005626463.localdomain ceph-mon[294160]: Reconfiguring daemon osd.0 on np0005626465.localdomain
Feb 23 09:47:21 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:47:21 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:47:21 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch
Feb 23 09:47:21 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:47:21 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@3(peon).osd e83 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:47:21 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.
Feb 23 09:47:21 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.
Feb 23 09:47:21 np0005626463.localdomain podman[298561]: 2026-02-23 09:47:21.912040107 +0000 UTC m=+0.085262916 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_managed=true, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260216, container_name=ovn_controller, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team)
Feb 23 09:47:21 np0005626463.localdomain podman[298561]: 2026-02-23 09:47:21.956350715 +0000 UTC m=+0.129573534 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.build-date=20260216)
Feb 23 09:47:21 np0005626463.localdomain podman[298562]: 2026-02-23 09:47:21.957020005 +0000 UTC m=+0.127992425 container health_status bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Feb 23 09:47:21 np0005626463.localdomain podman[298562]: 2026-02-23 09:47:21.971227088 +0000 UTC m=+0.142199528 container exec_died bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 23 09:47:21 np0005626463.localdomain systemd[1]: bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.service: Deactivated successfully.
Feb 23 09:47:22 np0005626463.localdomain ceph-mon[294160]: Reconfiguring osd.3 (monmap changed)...
Feb 23 09:47:22 np0005626463.localdomain ceph-mon[294160]: Reconfiguring daemon osd.3 on np0005626465.localdomain
Feb 23 09:47:22 np0005626463.localdomain ceph-mon[294160]: pgmap v19: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:47:22 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:47:22 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:47:22 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005626465.drvnoy", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 23 09:47:22 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005626465.drvnoy", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 23 09:47:22 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:47:22 np0005626463.localdomain systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully.
Feb 23 09:47:22 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #22. Immutable memtables: 0.
Feb 23 09:47:22 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:47:22.138075) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Feb 23 09:47:22 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/flush_job.cc:856] [default] [JOB 9] Flushing memtable with next log file: 22
Feb 23 09:47:22 np0005626463.localdomain ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840042138138, "job": 9, "event": "flush_started", "num_memtables": 1, "num_entries": 723, "num_deletes": 251, "total_data_size": 1044496, "memory_usage": 1057536, "flush_reason": "Manual Compaction"}
Feb 23 09:47:22 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/flush_job.cc:885] [default] [JOB 9] Level-0 flush table #23: started
Feb 23 09:47:22 np0005626463.localdomain ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840042146767, "cf_name": "default", "job": 9, "event": "table_file_creation", "file_number": 23, "file_size": 614951, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 13596, "largest_seqno": 14314, "table_properties": {"data_size": 611089, "index_size": 1589, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1221, "raw_key_size": 10138, "raw_average_key_size": 21, "raw_value_size": 603016, "raw_average_value_size": 1288, "num_data_blocks": 66, "num_entries": 468, "num_filter_entries": 468, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771840030, "oldest_key_time": 1771840030, "file_creation_time": 1771840042, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4cfd6c8f-aafa-4003-b2f6-d22c49635dd4", "db_session_id": "66DAQ76CBLV8DSGL8JC7", "orig_file_number": 23, "seqno_to_time_mapping": "N/A"}}
Feb 23 09:47:22 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 9] Flush lasted 8738 microseconds, and 3242 cpu microseconds.
Feb 23 09:47:22 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 23 09:47:22 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:47:22.146814) [db/flush_job.cc:967] [default] [JOB 9] Level-0 flush table #23: 614951 bytes OK
Feb 23 09:47:22 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:47:22.146838) [db/memtable_list.cc:519] [default] Level-0 commit table #23 started
Feb 23 09:47:22 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:47:22.148768) [db/memtable_list.cc:722] [default] Level-0 commit table #23: memtable #1 done
Feb 23 09:47:22 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:47:22.148791) EVENT_LOG_v1 {"time_micros": 1771840042148785, "job": 9, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Feb 23 09:47:22 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:47:22.148811) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Feb 23 09:47:22 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 9] Try to delete WAL files size 1040324, prev total WAL file size 1042176, number of live WAL files 2.
Feb 23 09:47:22 np0005626463.localdomain ceph-mon[294160]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005626463/store.db/000019.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 23 09:47:22 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:47:22.150468) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003130353432' seq:72057594037927935, type:22 .. '7061786F73003130373934' seq:0, type:0; will stop at (end)
Feb 23 09:47:22 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 10] Compacting 1@0 + 1@6 files to L6, score -1.00
Feb 23 09:47:22 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 9 Base level 0, inputs: [23(600KB)], [21(17MB)]
Feb 23 09:47:22 np0005626463.localdomain ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840042150540, "job": 10, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [23], "files_L6": [21], "score": -1, "input_data_size": 19005838, "oldest_snapshot_seqno": -1}
Feb 23 09:47:22 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 10] Generated table #24: 10346 keys, 15653784 bytes, temperature: kUnknown
Feb 23 09:47:22 np0005626463.localdomain ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840042243864, "cf_name": "default", "job": 10, "event": "table_file_creation", "file_number": 24, "file_size": 15653784, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 15593196, "index_size": 33607, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 25925, "raw_key_size": 279211, "raw_average_key_size": 26, "raw_value_size": 15414506, "raw_average_value_size": 1489, "num_data_blocks": 1277, "num_entries": 10346, "num_filter_entries": 10346, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771839971, "oldest_key_time": 0, "file_creation_time": 1771840042, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4cfd6c8f-aafa-4003-b2f6-d22c49635dd4", "db_session_id": "66DAQ76CBLV8DSGL8JC7", "orig_file_number": 24, "seqno_to_time_mapping": "N/A"}}
Feb 23 09:47:22 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 23 09:47:22 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:47:22.244231) [db/compaction/compaction_job.cc:1663] [default] [JOB 10] Compacted 1@0 + 1@6 files to L6 => 15653784 bytes
Feb 23 09:47:22 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:47:22.267474) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 203.4 rd, 167.5 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.6, 17.5 +0.0 blob) out(14.9 +0.0 blob), read-write-amplify(56.4) write-amplify(25.5) OK, records in: 10870, records dropped: 524 output_compression: NoCompression
Feb 23 09:47:22 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:47:22.267508) EVENT_LOG_v1 {"time_micros": 1771840042267491, "job": 10, "event": "compaction_finished", "compaction_time_micros": 93449, "compaction_time_cpu_micros": 43555, "output_level": 6, "num_output_files": 1, "total_output_size": 15653784, "num_input_records": 10870, "num_output_records": 10346, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Feb 23 09:47:22 np0005626463.localdomain ceph-mon[294160]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005626463/store.db/000023.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 23 09:47:22 np0005626463.localdomain ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840042267921, "job": 10, "event": "table_file_deletion", "file_number": 23}
Feb 23 09:47:22 np0005626463.localdomain ceph-mon[294160]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005626463/store.db/000021.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 23 09:47:22 np0005626463.localdomain ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840042270691, "job": 10, "event": "table_file_deletion", "file_number": 21}
Feb 23 09:47:22 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:47:22.150396) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 23 09:47:22 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:47:22.270808) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 23 09:47:22 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:47:22.270814) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 23 09:47:22 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:47:22.270817) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 23 09:47:22 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:47:22.270820) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 23 09:47:22 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:47:22.270823) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 23 09:47:22 np0005626463.localdomain sshd[298608]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:47:23 np0005626463.localdomain ceph-mon[294160]: Reconfiguring mds.mds.np0005626465.drvnoy (monmap changed)...
Feb 23 09:47:23 np0005626463.localdomain ceph-mon[294160]: Reconfiguring daemon mds.mds.np0005626465.drvnoy on np0005626465.localdomain
Feb 23 09:47:23 np0005626463.localdomain ceph-mon[294160]: from='client.27040 -' entity='client.admin' cmd=[{"prefix": "orch host drain", "hostname": "np0005626460.localdomain", "target": ["mon-mgr", ""]}]: dispatch
Feb 23 09:47:23 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:47:23 np0005626463.localdomain ceph-mon[294160]: Added label _no_schedule to host np0005626460.localdomain
Feb 23 09:47:23 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:47:23 np0005626463.localdomain ceph-mon[294160]: Added label SpecialHostLabels.DRAIN_CONF_KEYRING to host np0005626460.localdomain
Feb 23 09:47:23 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:47:23 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:47:23 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626465.hlpkwo", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 23 09:47:23 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626465.hlpkwo", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 23 09:47:23 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "mgr services"} : dispatch
Feb 23 09:47:23 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:47:23 np0005626463.localdomain sshd[298608]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:47:24 np0005626463.localdomain ceph-mon[294160]: Reconfiguring mgr.np0005626465.hlpkwo (monmap changed)...
Feb 23 09:47:24 np0005626463.localdomain ceph-mon[294160]: Reconfiguring daemon mgr.np0005626465.hlpkwo on np0005626465.localdomain
Feb 23 09:47:24 np0005626463.localdomain ceph-mon[294160]: pgmap v20: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:47:24 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:47:24 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:47:24 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Feb 23 09:47:24 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Feb 23 09:47:24 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:47:24 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.
Feb 23 09:47:24 np0005626463.localdomain podman[298610]: 2026-02-23 09:47:24.907416961 +0000 UTC m=+0.081059689 container health_status be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.build-date=20260216, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, config_id=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_managed=true)
Feb 23 09:47:24 np0005626463.localdomain podman[298610]: 2026-02-23 09:47:24.946367335 +0000 UTC m=+0.120010083 container exec_died be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Feb 23 09:47:24 np0005626463.localdomain systemd[1]: be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.service: Deactivated successfully.
Feb 23 09:47:25 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:47:25.089 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:47:25 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:47:25.092 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:47:25 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:47:25.092 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 09:47:25 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:47:25.092 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:47:25 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:47:25.122 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:47:25 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:47:25.122 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:47:25 np0005626463.localdomain ceph-mon[294160]: from='client.44247 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "host_pattern": "np0005626460.localdomain", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Feb 23 09:47:25 np0005626463.localdomain ceph-mon[294160]: Reconfiguring mon.np0005626465 (monmap changed)...
Feb 23 09:47:25 np0005626463.localdomain ceph-mon[294160]: Reconfiguring daemon mon.np0005626465 on np0005626465.localdomain
Feb 23 09:47:25 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:47:25 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:47:25 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626466", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 23 09:47:25 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626466", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 23 09:47:25 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:47:25 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:47:25 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005626460.localdomain"} : dispatch
Feb 23 09:47:25 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005626460.localdomain"} : dispatch
Feb 23 09:47:25 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005626460.localdomain"}]': finished
Feb 23 09:47:26 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@3(peon).osd e83 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:47:26 np0005626463.localdomain ceph-mon[294160]: Reconfiguring crash.np0005626466 (monmap changed)...
Feb 23 09:47:26 np0005626463.localdomain ceph-mon[294160]: Reconfiguring daemon crash.np0005626466 on np0005626466.localdomain
Feb 23 09:47:26 np0005626463.localdomain ceph-mon[294160]: from='client.34354 -' entity='client.admin' cmd=[{"prefix": "orch host rm", "hostname": "np0005626460.localdomain", "force": true, "target": ["mon-mgr", ""]}]: dispatch
Feb 23 09:47:26 np0005626463.localdomain ceph-mon[294160]: Removed host np0005626460.localdomain
Feb 23 09:47:26 np0005626463.localdomain ceph-mon[294160]: pgmap v21: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:47:26 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:47:26 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:47:26 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch
Feb 23 09:47:26 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:47:27 np0005626463.localdomain ceph-mon[294160]: Reconfiguring osd.1 (monmap changed)...
Feb 23 09:47:27 np0005626463.localdomain ceph-mon[294160]: Reconfiguring daemon osd.1 on np0005626466.localdomain
Feb 23 09:47:27 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:47:27 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:47:27 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch
Feb 23 09:47:27 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:47:28 np0005626463.localdomain ceph-mon[294160]: Reconfiguring osd.4 (monmap changed)...
Feb 23 09:47:28 np0005626463.localdomain ceph-mon[294160]: Reconfiguring daemon osd.4 on np0005626466.localdomain
Feb 23 09:47:28 np0005626463.localdomain ceph-mon[294160]: pgmap v22: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:47:28 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:47:28 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:47:28 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005626466.vaywlp", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 23 09:47:28 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005626466.vaywlp", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 23 09:47:28 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:47:28 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.
Feb 23 09:47:28 np0005626463.localdomain podman[298628]: 2026-02-23 09:47:28.946168926 +0000 UTC m=+0.115892748 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent)
Feb 23 09:47:28 np0005626463.localdomain podman[298628]: 2026-02-23 09:47:28.980230363 +0000 UTC m=+0.149954215 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ovn_metadata_agent, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.build-date=20260216, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Feb 23 09:47:28 np0005626463.localdomain systemd[1]: 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully.
Feb 23 09:47:29 np0005626463.localdomain ceph-mon[294160]: Reconfiguring mds.mds.np0005626466.vaywlp (monmap changed)...
Feb 23 09:47:29 np0005626463.localdomain ceph-mon[294160]: Reconfiguring daemon mds.mds.np0005626466.vaywlp on np0005626466.localdomain
Feb 23 09:47:29 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:47:29 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:47:29 np0005626463.localdomain ceph-mon[294160]: Reconfiguring mgr.np0005626466.nisqfq (monmap changed)...
Feb 23 09:47:29 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626466.nisqfq", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 23 09:47:29 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626466.nisqfq", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 23 09:47:29 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "mgr services"} : dispatch
Feb 23 09:47:29 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:47:29 np0005626463.localdomain ceph-mon[294160]: Reconfiguring daemon mgr.np0005626466.nisqfq on np0005626466.localdomain
Feb 23 09:47:29 np0005626463.localdomain ceph-mon[294160]: pgmap v23: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:47:29 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:47:29 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:47:29 np0005626463.localdomain ceph-mon[294160]: Reconfiguring mon.np0005626466 (monmap changed)...
Feb 23 09:47:29 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Feb 23 09:47:29 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Feb 23 09:47:29 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:47:29 np0005626463.localdomain ceph-mon[294160]: Reconfiguring daemon mon.np0005626466 on np0005626466.localdomain
Feb 23 09:47:30 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:47:30.123 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:47:30 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:47:30.125 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:47:30 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:47:30.125 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 09:47:30 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:47:30.125 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:47:30 np0005626463.localdomain sudo[298646]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 09:47:30 np0005626463.localdomain sudo[298646]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:47:30 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:47:30.163 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:47:30 np0005626463.localdomain sudo[298646]: pam_unix(sudo:session): session closed for user root
Feb 23 09:47:30 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:47:30.164 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:47:30 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:47:30 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:47:30 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:47:30 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 23 09:47:30 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:47:30 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 23 09:47:31 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@3(peon).osd e83 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:47:32 np0005626463.localdomain ceph-mon[294160]: pgmap v24: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:47:32 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:47:32 np0005626463.localdomain sudo[298664]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 09:47:32 np0005626463.localdomain sudo[298664]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:47:32 np0005626463.localdomain sudo[298664]: pam_unix(sudo:session): session closed for user root
Feb 23 09:47:33 np0005626463.localdomain ceph-mon[294160]: from='client.34385 -' entity='client.admin' cmd=[{"prefix": "orch apply", "target": ["mon-mgr", ""]}]: dispatch
Feb 23 09:47:33 np0005626463.localdomain ceph-mon[294160]: Saving service mon spec with placement label:mon
Feb 23 09:47:33 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:47:33 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:47:33 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 23 09:47:33 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:47:33 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 23 09:47:33 np0005626463.localdomain ceph-mon[294160]: pgmap v25: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:47:34 np0005626463.localdomain ceph-mon[294160]: from='client.34362 -' entity='client.admin' cmd=[{"prefix": "orch ps", "daemon_type": "mon", "daemon_id": "np0005626465", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Feb 23 09:47:34 np0005626463.localdomain ceph-mgr[288036]: ms_deliver_dispatch: unhandled message 0x55ea593e5600 mon_map magic: 0 from mon.2 v2:172.18.0.107:3300/0
Feb 23 09:47:34 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@3(peon) e11  my rank is now 2 (was 3)
Feb 23 09:47:34 np0005626463.localdomain ceph-mgr[288036]: client.0 ms_handle_reset on v2:172.18.0.103:3300/0
Feb 23 09:47:34 np0005626463.localdomain ceph-mgr[288036]: client.0 ms_handle_reset on v2:172.18.0.103:3300/0
Feb 23 09:47:34 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [INF] : mon.np0005626463 calling monitor election
Feb 23 09:47:34 np0005626463.localdomain ceph-mon[294160]: paxos.2).electionLogic(44) init, last seen epoch 44
Feb 23 09:47:34 np0005626463.localdomain ceph-mgr[288036]: ms_deliver_dispatch: unhandled message 0x55ea62c4e000 mon_map magic: 0 from mon.1 v2:172.18.0.108:3300/0
Feb 23 09:47:34 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(electing) e11 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Feb 23 09:47:35 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:47:35.165 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:47:35 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:47:35.166 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:47:35 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:47:35.166 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 09:47:35 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:47:35.166 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:47:35 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:47:35.198 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:47:35 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:47:35.199 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:47:36 np0005626463.localdomain sshd[298682]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:47:39 np0005626463.localdomain podman[242954]: time="2026-02-23T09:47:39Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 23 09:47:39 np0005626463.localdomain podman[242954]: @ - - [23/Feb/2026:09:47:39 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155258 "" "Go-http-client/1.1"
Feb 23 09:47:39 np0005626463.localdomain podman[242954]: @ - - [23/Feb/2026:09:47:39 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18262 "" "Go-http-client/1.1"
Feb 23 09:47:39 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.
Feb 23 09:47:39 np0005626463.localdomain ceph-mon[294160]: paxos.2).electionLogic(45) init, last seen epoch 45, mid-election, bumping
Feb 23 09:47:39 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(electing) e11 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Feb 23 09:47:39 np0005626463.localdomain sshd[298682]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:47:39 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(electing) e11 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Feb 23 09:47:39 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Feb 23 09:47:39 np0005626463.localdomain systemd[1]: tmp-crun.BIXF8w.mount: Deactivated successfully.
Feb 23 09:47:39 np0005626463.localdomain podman[298684]: 2026-02-23 09:47:39.917759537 +0000 UTC m=+0.088084802 container health_status da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 23 09:47:39 np0005626463.localdomain podman[298684]: 2026-02-23 09:47:39.950630897 +0000 UTC m=+0.120956192 container exec_died da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 23 09:47:39 np0005626463.localdomain systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Deactivated successfully.
Feb 23 09:47:39 np0005626463.localdomain ceph-mon[294160]: Remove daemons mon.np0005626465
Feb 23 09:47:39 np0005626463.localdomain ceph-mon[294160]: Safe to remove mon.np0005626465: new quorum should be ['np0005626461', 'np0005626466', 'np0005626463'] (from ['np0005626461', 'np0005626466', 'np0005626463'])
Feb 23 09:47:39 np0005626463.localdomain ceph-mon[294160]: Removing monitor np0005626465 from monmap...
Feb 23 09:47:39 np0005626463.localdomain ceph-mon[294160]: Removing daemon mon.np0005626465 from np0005626465.localdomain -- ports []
Feb 23 09:47:39 np0005626463.localdomain ceph-mon[294160]: mon.np0005626461 calling monitor election
Feb 23 09:47:39 np0005626463.localdomain ceph-mon[294160]: mon.np0005626466 calling monitor election
Feb 23 09:47:39 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463 calling monitor election
Feb 23 09:47:39 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "mon metadata", "id": "np0005626461"} : dispatch
Feb 23 09:47:39 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "mon metadata", "id": "np0005626463"} : dispatch
Feb 23 09:47:39 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "mon metadata", "id": "np0005626466"} : dispatch
Feb 23 09:47:39 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "mon metadata", "id": "np0005626461"} : dispatch
Feb 23 09:47:39 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "mon metadata", "id": "np0005626463"} : dispatch
Feb 23 09:47:39 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "mon metadata", "id": "np0005626466"} : dispatch
Feb 23 09:47:39 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:47:39 np0005626463.localdomain ceph-mon[294160]: pgmap v26: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:47:39 np0005626463.localdomain ceph-mon[294160]: pgmap v27: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:47:39 np0005626463.localdomain ceph-mon[294160]: pgmap v28: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:47:39 np0005626463.localdomain ceph-mon[294160]: mon.np0005626461 is new leader, mons np0005626461,np0005626466 in quorum (ranks 0,1)
Feb 23 09:47:39 np0005626463.localdomain ceph-mon[294160]: overall HEALTH_OK
Feb 23 09:47:39 np0005626463.localdomain ceph-mon[294160]: mon.np0005626461 calling monitor election
Feb 23 09:47:39 np0005626463.localdomain ceph-mon[294160]: mon.np0005626466 calling monitor election
Feb 23 09:47:39 np0005626463.localdomain ceph-mon[294160]: mon.np0005626461 is new leader, mons np0005626461,np0005626466,np0005626463 in quorum (ranks 0,1,2)
Feb 23 09:47:39 np0005626463.localdomain ceph-mon[294160]: monmap epoch 11
Feb 23 09:47:39 np0005626463.localdomain ceph-mon[294160]: fsid f1fea371-cb69-578d-a3d0-b5c472a84b46
Feb 23 09:47:39 np0005626463.localdomain ceph-mon[294160]: last_changed 2026-02-23T09:47:34.853919+0000
Feb 23 09:47:39 np0005626463.localdomain ceph-mon[294160]: created 2026-02-23T07:36:01.997603+0000
Feb 23 09:47:39 np0005626463.localdomain ceph-mon[294160]: min_mon_release 18 (reef)
Feb 23 09:47:39 np0005626463.localdomain ceph-mon[294160]: election_strategy: 1
Feb 23 09:47:39 np0005626463.localdomain ceph-mon[294160]: 0: [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] mon.np0005626461
Feb 23 09:47:39 np0005626463.localdomain ceph-mon[294160]: 1: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005626466
Feb 23 09:47:39 np0005626463.localdomain ceph-mon[294160]: 2: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005626463
Feb 23 09:47:39 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 23 09:47:39 np0005626463.localdomain ceph-mon[294160]: fsmap cephfs:1 {0=mds.np0005626463.qcthuc=up:active} 2 up:standby
Feb 23 09:47:39 np0005626463.localdomain ceph-mon[294160]: osdmap e83: 6 total, 6 up, 6 in
Feb 23 09:47:39 np0005626463.localdomain ceph-mon[294160]: mgrmap e27: np0005626465.hlpkwo(active, since 50s), standbys: np0005626463.wtksup, np0005626466.nisqfq, np0005626461.lrfquh, np0005626460.fyrady
Feb 23 09:47:39 np0005626463.localdomain ceph-mon[294160]: overall HEALTH_OK
Feb 23 09:47:39 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:47:39 np0005626463.localdomain sudo[298707]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Feb 23 09:47:39 np0005626463.localdomain sudo[298707]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:47:39 np0005626463.localdomain sudo[298707]: pam_unix(sudo:session): session closed for user root
Feb 23 09:47:40 np0005626463.localdomain sudo[298725]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/etc/ceph
Feb 23 09:47:40 np0005626463.localdomain sudo[298725]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:47:40 np0005626463.localdomain sudo[298725]: pam_unix(sudo:session): session closed for user root
Feb 23 09:47:40 np0005626463.localdomain sudo[298743]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/etc/ceph/ceph.conf.new
Feb 23 09:47:40 np0005626463.localdomain sudo[298743]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:47:40 np0005626463.localdomain sudo[298743]: pam_unix(sudo:session): session closed for user root
Feb 23 09:47:40 np0005626463.localdomain sudo[298761]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46
Feb 23 09:47:40 np0005626463.localdomain sudo[298761]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:47:40 np0005626463.localdomain sudo[298761]: pam_unix(sudo:session): session closed for user root
Feb 23 09:47:40 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:47:40.200 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:47:40 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:47:40.201 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:47:40 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:47:40.201 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 09:47:40 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:47:40.201 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:47:40 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:47:40.239 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:47:40 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:47:40.239 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:47:40 np0005626463.localdomain sudo[298779]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/etc/ceph/ceph.conf.new
Feb 23 09:47:40 np0005626463.localdomain sudo[298779]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:47:40 np0005626463.localdomain sudo[298779]: pam_unix(sudo:session): session closed for user root
Feb 23 09:47:40 np0005626463.localdomain sudo[298813]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/etc/ceph/ceph.conf.new
Feb 23 09:47:40 np0005626463.localdomain sudo[298813]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:47:40 np0005626463.localdomain sudo[298813]: pam_unix(sudo:session): session closed for user root
Feb 23 09:47:40 np0005626463.localdomain sudo[298831]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/etc/ceph/ceph.conf.new
Feb 23 09:47:40 np0005626463.localdomain sudo[298831]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:47:40 np0005626463.localdomain sudo[298831]: pam_unix(sudo:session): session closed for user root
Feb 23 09:47:40 np0005626463.localdomain sudo[298849]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Feb 23 09:47:40 np0005626463.localdomain sudo[298849]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:47:40 np0005626463.localdomain sudo[298849]: pam_unix(sudo:session): session closed for user root
Feb 23 09:47:40 np0005626463.localdomain sudo[298867]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config
Feb 23 09:47:40 np0005626463.localdomain sudo[298867]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:47:40 np0005626463.localdomain sudo[298867]: pam_unix(sudo:session): session closed for user root
Feb 23 09:47:40 np0005626463.localdomain sudo[298885]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config
Feb 23 09:47:40 np0005626463.localdomain sudo[298885]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:47:40 np0005626463.localdomain sudo[298885]: pam_unix(sudo:session): session closed for user root
Feb 23 09:47:40 np0005626463.localdomain sudo[298903]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf.new
Feb 23 09:47:40 np0005626463.localdomain sudo[298903]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:47:40 np0005626463.localdomain sudo[298903]: pam_unix(sudo:session): session closed for user root
Feb 23 09:47:40 np0005626463.localdomain sudo[298921]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46
Feb 23 09:47:40 np0005626463.localdomain sudo[298921]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:47:40 np0005626463.localdomain sudo[298921]: pam_unix(sudo:session): session closed for user root
Feb 23 09:47:41 np0005626463.localdomain sudo[298939]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf.new
Feb 23 09:47:41 np0005626463.localdomain ceph-mon[294160]: Updating np0005626461.localdomain:/etc/ceph/ceph.conf
Feb 23 09:47:41 np0005626463.localdomain ceph-mon[294160]: Updating np0005626463.localdomain:/etc/ceph/ceph.conf
Feb 23 09:47:41 np0005626463.localdomain ceph-mon[294160]: Updating np0005626465.localdomain:/etc/ceph/ceph.conf
Feb 23 09:47:41 np0005626463.localdomain ceph-mon[294160]: Updating np0005626466.localdomain:/etc/ceph/ceph.conf
Feb 23 09:47:41 np0005626463.localdomain sudo[298939]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:47:41 np0005626463.localdomain sudo[298939]: pam_unix(sudo:session): session closed for user root
Feb 23 09:47:41 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon).osd e83 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:47:41 np0005626463.localdomain sudo[298973]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf.new
Feb 23 09:47:41 np0005626463.localdomain sudo[298973]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:47:41 np0005626463.localdomain sudo[298973]: pam_unix(sudo:session): session closed for user root
Feb 23 09:47:41 np0005626463.localdomain sudo[298991]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf.new
Feb 23 09:47:41 np0005626463.localdomain sudo[298991]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:47:41 np0005626463.localdomain sudo[298991]: pam_unix(sudo:session): session closed for user root
Feb 23 09:47:41 np0005626463.localdomain sudo[299009]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf.new /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf
Feb 23 09:47:41 np0005626463.localdomain sudo[299009]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:47:41 np0005626463.localdomain sudo[299009]: pam_unix(sudo:session): session closed for user root
Feb 23 09:47:41 np0005626463.localdomain sudo[299027]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 09:47:41 np0005626463.localdomain sudo[299027]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:47:41 np0005626463.localdomain sudo[299027]: pam_unix(sudo:session): session closed for user root
Feb 23 09:47:42 np0005626463.localdomain ceph-mon[294160]: Updating np0005626465.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf
Feb 23 09:47:42 np0005626463.localdomain ceph-mon[294160]: Updating np0005626463.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf
Feb 23 09:47:42 np0005626463.localdomain ceph-mon[294160]: Updating np0005626466.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf
Feb 23 09:47:42 np0005626463.localdomain ceph-mon[294160]: Updating np0005626461.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf
Feb 23 09:47:42 np0005626463.localdomain ceph-mon[294160]: pgmap v29: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:47:42 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:47:42 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:47:42 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:47:42 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:47:42 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:47:42 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:47:42 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:47:42 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:47:42 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:47:42 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 23 09:47:42 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626461.lrfquh", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 23 09:47:42 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626461.lrfquh", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 23 09:47:42 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "mgr services"} : dispatch
Feb 23 09:47:42 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:47:43 np0005626463.localdomain ceph-mon[294160]: Reconfiguring mgr.np0005626461.lrfquh (monmap changed)...
Feb 23 09:47:43 np0005626463.localdomain ceph-mon[294160]: Reconfiguring daemon mgr.np0005626461.lrfquh on np0005626461.localdomain
Feb 23 09:47:43 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:47:43 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:47:43 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626461", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 23 09:47:43 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626461", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 23 09:47:43 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:47:43 np0005626463.localdomain openstack_network_exporter[245358]: ERROR   09:47:43 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 23 09:47:43 np0005626463.localdomain openstack_network_exporter[245358]: 
Feb 23 09:47:43 np0005626463.localdomain openstack_network_exporter[245358]: ERROR   09:47:43 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 23 09:47:43 np0005626463.localdomain openstack_network_exporter[245358]: 
Feb 23 09:47:43 np0005626463.localdomain sudo[299045]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 09:47:43 np0005626463.localdomain sudo[299045]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:47:43 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.
Feb 23 09:47:43 np0005626463.localdomain sudo[299045]: pam_unix(sudo:session): session closed for user root
Feb 23 09:47:44 np0005626463.localdomain sudo[299069]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid f1fea371-cb69-578d-a3d0-b5c472a84b46
Feb 23 09:47:44 np0005626463.localdomain sudo[299069]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:47:44 np0005626463.localdomain podman[299063]: 2026-02-23 09:47:44.014129139 +0000 UTC m=+0.067054861 container health_status 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.7, release=1770267347, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, vcs-type=git, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, name=ubi9/ubi-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, distribution-scope=public, build-date=2026-02-05T04:57:10Z, io.buildah.version=1.33.7, io.openshift.expose-services=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.created=2026-02-05T04:57:10Z)
Feb 23 09:47:44 np0005626463.localdomain podman[299063]: 2026-02-23 09:47:44.027219238 +0000 UTC m=+0.080144950 container exec_died 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, name=ubi9/ubi-minimal, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2026-02-05T04:57:10Z, config_id=openstack_network_exporter, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, org.opencontainers.image.created=2026-02-05T04:57:10Z, io.buildah.version=1.33.7, managed_by=edpm_ansible, release=1770267347, 
cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.7, io.openshift.expose-services=, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 23 09:47:44 np0005626463.localdomain systemd[1]: 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.service: Deactivated successfully.
Feb 23 09:47:44 np0005626463.localdomain ceph-mon[294160]: Reconfiguring crash.np0005626461 (monmap changed)...
Feb 23 09:47:44 np0005626463.localdomain ceph-mon[294160]: Reconfiguring daemon crash.np0005626461 on np0005626461.localdomain
Feb 23 09:47:44 np0005626463.localdomain ceph-mon[294160]: pgmap v30: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:47:44 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:47:44 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:47:44 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626463", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 23 09:47:44 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626463", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 23 09:47:44 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:47:44 np0005626463.localdomain podman[299120]: 
Feb 23 09:47:44 np0005626463.localdomain podman[299120]: 2026-02-23 09:47:44.404986366 +0000 UTC m=+0.061064380 container create 189b7ee2f67fc381bd6938fffb0f33ee90ba6164b6cde581cc16eafc29d640dc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=confident_gagarin, build-date=2026-02-09T10:25:24Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.42.2, distribution-scope=public, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, release=1770267347, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, org.opencontainers.image.created=2026-02-09T10:25:24Z, vendor=Red Hat, Inc., version=7, description=Red Hat Ceph Storage 7)
Feb 23 09:47:44 np0005626463.localdomain systemd[1]: Started libpod-conmon-189b7ee2f67fc381bd6938fffb0f33ee90ba6164b6cde581cc16eafc29d640dc.scope.
Feb 23 09:47:44 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 09:47:44 np0005626463.localdomain podman[299120]: 2026-02-23 09:47:44.373367172 +0000 UTC m=+0.029445206 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 23 09:47:44 np0005626463.localdomain podman[299120]: 2026-02-23 09:47:44.473195451 +0000 UTC m=+0.129273465 container init 189b7ee2f67fc381bd6938fffb0f33ee90ba6164b6cde581cc16eafc29d640dc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=confident_gagarin, description=Red Hat Ceph Storage 7, name=rhceph, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.42.2, io.k8s.description=Red Hat Ceph Storage 7, version=7, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2026-02-09T10:25:24Z, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, release=1770267347, CEPH_POINT_RELEASE=, GIT_CLEAN=True, vcs-type=git, architecture=x86_64, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.expose-services=)
Feb 23 09:47:44 np0005626463.localdomain podman[299120]: 2026-02-23 09:47:44.482761822 +0000 UTC m=+0.138839836 container start 189b7ee2f67fc381bd6938fffb0f33ee90ba6164b6cde581cc16eafc29d640dc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=confident_gagarin, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., vcs-type=git, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1770267347, io.buildah.version=1.42.2, RELEASE=main, build-date=2026-02-09T10:25:24Z, GIT_CLEAN=True, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_BRANCH=main, CEPH_POINT_RELEASE=, architecture=x86_64, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, version=7, distribution-scope=public, com.redhat.component=rhceph-container)
Feb 23 09:47:44 np0005626463.localdomain podman[299120]: 2026-02-23 09:47:44.483053501 +0000 UTC m=+0.139131555 container attach 189b7ee2f67fc381bd6938fffb0f33ee90ba6164b6cde581cc16eafc29d640dc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=confident_gagarin, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.42.2, build-date=2026-02-09T10:25:24Z, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, distribution-scope=public, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., RELEASE=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, description=Red Hat Ceph Storage 7, name=rhceph, release=1770267347, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=rhceph-container, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True)
Feb 23 09:47:44 np0005626463.localdomain confident_gagarin[299136]: 167 167
Feb 23 09:47:44 np0005626463.localdomain systemd[1]: libpod-189b7ee2f67fc381bd6938fffb0f33ee90ba6164b6cde581cc16eafc29d640dc.scope: Deactivated successfully.
Feb 23 09:47:44 np0005626463.localdomain podman[299120]: 2026-02-23 09:47:44.48599914 +0000 UTC m=+0.142077164 container died 189b7ee2f67fc381bd6938fffb0f33ee90ba6164b6cde581cc16eafc29d640dc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=confident_gagarin, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.expose-services=, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.42.2, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., version=7, description=Red Hat Ceph Storage 7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1770267347, vcs-type=git, build-date=2026-02-09T10:25:24Z, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public)
Feb 23 09:47:44 np0005626463.localdomain podman[299141]: 2026-02-23 09:47:44.574226126 +0000 UTC m=+0.076886302 container remove 189b7ee2f67fc381bd6938fffb0f33ee90ba6164b6cde581cc16eafc29d640dc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=confident_gagarin, release=1770267347, ceph=True, io.buildah.version=1.42.2, vcs-type=git, RELEASE=main, version=7, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, org.opencontainers.image.created=2026-02-09T10:25:24Z, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, architecture=x86_64, build-date=2026-02-09T10:25:24Z, GIT_BRANCH=main, vendor=Red Hat, Inc., com.redhat.component=rhceph-container)
Feb 23 09:47:44 np0005626463.localdomain systemd[1]: libpod-conmon-189b7ee2f67fc381bd6938fffb0f33ee90ba6164b6cde581cc16eafc29d640dc.scope: Deactivated successfully.
Feb 23 09:47:44 np0005626463.localdomain sudo[299069]: pam_unix(sudo:session): session closed for user root
Feb 23 09:47:44 np0005626463.localdomain sudo[299158]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 09:47:44 np0005626463.localdomain sudo[299158]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:47:44 np0005626463.localdomain sudo[299158]: pam_unix(sudo:session): session closed for user root
Feb 23 09:47:44 np0005626463.localdomain sudo[299176]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid f1fea371-cb69-578d-a3d0-b5c472a84b46
Feb 23 09:47:44 np0005626463.localdomain sudo[299176]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:47:45 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-2b81b50680d3517ec08aedb96ce724e8acdef65e24c8832a4bc0eaa323b463f7-merged.mount: Deactivated successfully.
Feb 23 09:47:45 np0005626463.localdomain ceph-mon[294160]: Reconfiguring crash.np0005626463 (monmap changed)...
Feb 23 09:47:45 np0005626463.localdomain ceph-mon[294160]: Reconfiguring daemon crash.np0005626463 on np0005626463.localdomain
Feb 23 09:47:45 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:47:45 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:47:45 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch
Feb 23 09:47:45 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:47:45 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:47:45 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:47:45.240 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:47:45 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:47:45.242 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:47:45 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:47:45.243 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 09:47:45 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:47:45.243 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:47:45 np0005626463.localdomain podman[299211]: 
Feb 23 09:47:45 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:47:45.277 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:47:45 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:47:45.277 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:47:45 np0005626463.localdomain podman[299211]: 2026-02-23 09:47:45.281998967 +0000 UTC m=+0.085013729 container create 1dc054cb1a3438bf0c04c55c06792238e8ef2bff7ad3cd24e40d72b669ccae13 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=recursing_franklin, version=7, build-date=2026-02-09T10:25:24Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, release=1770267347, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, RELEASE=main, io.buildah.version=1.42.2, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Feb 23 09:47:45 np0005626463.localdomain systemd[1]: Started libpod-conmon-1dc054cb1a3438bf0c04c55c06792238e8ef2bff7ad3cd24e40d72b669ccae13.scope.
Feb 23 09:47:45 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 09:47:45 np0005626463.localdomain podman[299211]: 2026-02-23 09:47:45.241673859 +0000 UTC m=+0.044688641 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 23 09:47:45 np0005626463.localdomain podman[299211]: 2026-02-23 09:47:45.341490388 +0000 UTC m=+0.144505150 container init 1dc054cb1a3438bf0c04c55c06792238e8ef2bff7ad3cd24e40d72b669ccae13 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=recursing_franklin, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, io.openshift.expose-services=, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2026-02-09T10:25:24Z, ceph=True, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.buildah.version=1.42.2, release=1770267347, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, CEPH_POINT_RELEASE=, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, GIT_CLEAN=True, version=7, io.openshift.tags=rhceph ceph)
Feb 23 09:47:45 np0005626463.localdomain podman[299211]: 2026-02-23 09:47:45.350159441 +0000 UTC m=+0.153174233 container start 1dc054cb1a3438bf0c04c55c06792238e8ef2bff7ad3cd24e40d72b669ccae13 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=recursing_franklin, version=7, vendor=Red Hat, Inc., RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, description=Red Hat Ceph Storage 7, name=rhceph, build-date=2026-02-09T10:25:24Z, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.42.2, release=1770267347, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-type=git, ceph=True, GIT_CLEAN=True, org.opencontainers.image.created=2026-02-09T10:25:24Z, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Feb 23 09:47:45 np0005626463.localdomain podman[299211]: 2026-02-23 09:47:45.350963446 +0000 UTC m=+0.153978208 container attach 1dc054cb1a3438bf0c04c55c06792238e8ef2bff7ad3cd24e40d72b669ccae13 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=recursing_franklin, ceph=True, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.created=2026-02-09T10:25:24Z, build-date=2026-02-09T10:25:24Z, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., distribution-scope=public, architecture=x86_64, release=1770267347, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.42.2, CEPH_POINT_RELEASE=)
Feb 23 09:47:45 np0005626463.localdomain recursing_franklin[299225]: 167 167
Feb 23 09:47:45 np0005626463.localdomain systemd[1]: libpod-1dc054cb1a3438bf0c04c55c06792238e8ef2bff7ad3cd24e40d72b669ccae13.scope: Deactivated successfully.
Feb 23 09:47:45 np0005626463.localdomain podman[299211]: 2026-02-23 09:47:45.35569792 +0000 UTC m=+0.158712702 container died 1dc054cb1a3438bf0c04c55c06792238e8ef2bff7ad3cd24e40d72b669ccae13 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=recursing_franklin, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, io.buildah.version=1.42.2, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, GIT_BRANCH=main, RELEASE=main, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.component=rhceph-container, build-date=2026-02-09T10:25:24Z, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7, release=1770267347, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Feb 23 09:47:45 np0005626463.localdomain podman[299230]: 2026-02-23 09:47:45.441807031 +0000 UTC m=+0.079063368 container remove 1dc054cb1a3438bf0c04c55c06792238e8ef2bff7ad3cd24e40d72b669ccae13 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=recursing_franklin, version=7, name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, org.opencontainers.image.created=2026-02-09T10:25:24Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, ceph=True, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, RELEASE=main, distribution-scope=public, vendor=Red Hat, Inc., GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.42.2, GIT_BRANCH=main, release=1770267347)
Feb 23 09:47:45 np0005626463.localdomain systemd[1]: libpod-conmon-1dc054cb1a3438bf0c04c55c06792238e8ef2bff7ad3cd24e40d72b669ccae13.scope: Deactivated successfully.
Feb 23 09:47:45 np0005626463.localdomain sudo[299176]: pam_unix(sudo:session): session closed for user root
Feb 23 09:47:45 np0005626463.localdomain sudo[299254]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 09:47:45 np0005626463.localdomain sudo[299254]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:47:45 np0005626463.localdomain sudo[299254]: pam_unix(sudo:session): session closed for user root
Feb 23 09:47:45 np0005626463.localdomain sudo[299272]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid f1fea371-cb69-578d-a3d0-b5c472a84b46
Feb 23 09:47:45 np0005626463.localdomain sudo[299272]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:47:46 np0005626463.localdomain systemd[1]: tmp-crun.F5REz1.mount: Deactivated successfully.
Feb 23 09:47:46 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-2620c415444659a5063586f167c4dca73244797973e0676666d4759a58f94e55-merged.mount: Deactivated successfully.
Feb 23 09:47:46 np0005626463.localdomain ceph-mon[294160]: Reconfiguring osd.2 (monmap changed)...
Feb 23 09:47:46 np0005626463.localdomain ceph-mon[294160]: Reconfiguring daemon osd.2 on np0005626463.localdomain
Feb 23 09:47:46 np0005626463.localdomain ceph-mon[294160]: pgmap v31: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:47:46 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:47:46 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:47:46 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch
Feb 23 09:47:46 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:47:46 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon).osd e83 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:47:46 np0005626463.localdomain podman[299306]: 
Feb 23 09:47:46 np0005626463.localdomain podman[299306]: 2026-02-23 09:47:46.328655422 +0000 UTC m=+0.080504012 container create a01be3416707ee0e7c83166e6c17b279acd5efb7f809218c20ed4af8a47e446a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=agitated_turing, build-date=2026-02-09T10:25:24Z, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1770267347, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.42.2, distribution-scope=public, CEPH_POINT_RELEASE=, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-type=git, com.redhat.component=rhceph-container, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14)
Feb 23 09:47:46 np0005626463.localdomain systemd[1]: Started libpod-conmon-a01be3416707ee0e7c83166e6c17b279acd5efb7f809218c20ed4af8a47e446a.scope.
Feb 23 09:47:46 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 09:47:46 np0005626463.localdomain podman[299306]: 2026-02-23 09:47:46.294172222 +0000 UTC m=+0.046020862 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 23 09:47:46 np0005626463.localdomain podman[299306]: 2026-02-23 09:47:46.395843096 +0000 UTC m=+0.147691686 container init a01be3416707ee0e7c83166e6c17b279acd5efb7f809218c20ed4af8a47e446a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=agitated_turing, io.openshift.expose-services=, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, build-date=2026-02-09T10:25:24Z, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.created=2026-02-09T10:25:24Z, distribution-scope=public, architecture=x86_64, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, name=rhceph, GIT_BRANCH=main, io.buildah.version=1.42.2, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, GIT_CLEAN=True, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1770267347)
Feb 23 09:47:46 np0005626463.localdomain podman[299306]: 2026-02-23 09:47:46.402683865 +0000 UTC m=+0.154532455 container start a01be3416707ee0e7c83166e6c17b279acd5efb7f809218c20ed4af8a47e446a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=agitated_turing, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, CEPH_POINT_RELEASE=, io.buildah.version=1.42.2, org.opencontainers.image.created=2026-02-09T10:25:24Z, ceph=True, com.redhat.component=rhceph-container, name=rhceph, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., RELEASE=main, distribution-scope=public, io.openshift.expose-services=, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, release=1770267347, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, build-date=2026-02-09T10:25:24Z)
Feb 23 09:47:46 np0005626463.localdomain podman[299306]: 2026-02-23 09:47:46.402963953 +0000 UTC m=+0.154812543 container attach a01be3416707ee0e7c83166e6c17b279acd5efb7f809218c20ed4af8a47e446a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=agitated_turing, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.created=2026-02-09T10:25:24Z, ceph=True, com.redhat.component=rhceph-container, architecture=x86_64, vcs-type=git, version=7, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, build-date=2026-02-09T10:25:24Z, io.buildah.version=1.42.2, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, release=1770267347, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph)
Feb 23 09:47:46 np0005626463.localdomain agitated_turing[299321]: 167 167
Feb 23 09:47:46 np0005626463.localdomain systemd[1]: libpod-a01be3416707ee0e7c83166e6c17b279acd5efb7f809218c20ed4af8a47e446a.scope: Deactivated successfully.
Feb 23 09:47:46 np0005626463.localdomain podman[299306]: 2026-02-23 09:47:46.406302565 +0000 UTC m=+0.158151165 container died a01be3416707ee0e7c83166e6c17b279acd5efb7f809218c20ed4af8a47e446a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=agitated_turing, vcs-type=git, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.created=2026-02-09T10:25:24Z, version=7, vendor=Red Hat, Inc., RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, build-date=2026-02-09T10:25:24Z, name=rhceph, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, ceph=True, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.42.2, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1770267347)
Feb 23 09:47:46 np0005626463.localdomain podman[299326]: 2026-02-23 09:47:46.542533061 +0000 UTC m=+0.124368106 container remove a01be3416707ee0e7c83166e6c17b279acd5efb7f809218c20ed4af8a47e446a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=agitated_turing, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, RELEASE=main, io.buildah.version=1.42.2, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=rhceph-container, name=rhceph, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., ceph=True, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, org.opencontainers.image.created=2026-02-09T10:25:24Z, distribution-scope=public, build-date=2026-02-09T10:25:24Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1770267347, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14)
Feb 23 09:47:46 np0005626463.localdomain systemd[1]: libpod-conmon-a01be3416707ee0e7c83166e6c17b279acd5efb7f809218c20ed4af8a47e446a.scope: Deactivated successfully.
Feb 23 09:47:46 np0005626463.localdomain sudo[299272]: pam_unix(sudo:session): session closed for user root
Feb 23 09:47:46 np0005626463.localdomain sudo[299348]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 09:47:46 np0005626463.localdomain sudo[299348]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:47:46 np0005626463.localdomain sudo[299348]: pam_unix(sudo:session): session closed for user root
Feb 23 09:47:46 np0005626463.localdomain sudo[299366]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid f1fea371-cb69-578d-a3d0-b5c472a84b46
Feb 23 09:47:46 np0005626463.localdomain sudo[299366]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:47:47 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-acc7bffc5d9a93190881b278be83339f28a0e53f0b17123632f22af7904d06b8-merged.mount: Deactivated successfully.
Feb 23 09:47:47 np0005626463.localdomain ceph-mon[294160]: Reconfiguring osd.5 (monmap changed)...
Feb 23 09:47:47 np0005626463.localdomain ceph-mon[294160]: Reconfiguring daemon osd.5 on np0005626463.localdomain
Feb 23 09:47:47 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:47:47 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:47:47 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005626463.qcthuc", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 23 09:47:47 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005626463.qcthuc", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 23 09:47:47 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:47:47 np0005626463.localdomain podman[299400]: 
Feb 23 09:47:47 np0005626463.localdomain podman[299400]: 2026-02-23 09:47:47.40138487 +0000 UTC m=+0.078037416 container create 77687fef8b7826008e7c936abb7c20fd805874aad2aa24527381dfeeb5cf871d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eager_bose, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, version=7, release=1770267347, distribution-scope=public, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, build-date=2026-02-09T10:25:24Z, GIT_BRANCH=main, io.openshift.expose-services=, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.tags=rhceph ceph, ceph=True, io.buildah.version=1.42.2, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git)
Feb 23 09:47:47 np0005626463.localdomain systemd[1]: Started libpod-conmon-77687fef8b7826008e7c936abb7c20fd805874aad2aa24527381dfeeb5cf871d.scope.
Feb 23 09:47:47 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 09:47:47 np0005626463.localdomain podman[299400]: 2026-02-23 09:47:47.468235324 +0000 UTC m=+0.144887860 container init 77687fef8b7826008e7c936abb7c20fd805874aad2aa24527381dfeeb5cf871d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eager_bose, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1770267347, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, name=rhceph, io.buildah.version=1.42.2, com.redhat.component=rhceph-container, version=7, io.openshift.expose-services=, org.opencontainers.image.created=2026-02-09T10:25:24Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, RELEASE=main, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2026-02-09T10:25:24Z, ceph=True, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_CLEAN=True, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=)
Feb 23 09:47:47 np0005626463.localdomain podman[299400]: 2026-02-23 09:47:47.370086227 +0000 UTC m=+0.046738803 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 23 09:47:47 np0005626463.localdomain podman[299400]: 2026-02-23 09:47:47.480390604 +0000 UTC m=+0.157043150 container start 77687fef8b7826008e7c936abb7c20fd805874aad2aa24527381dfeeb5cf871d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eager_bose, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1770267347, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, distribution-scope=public, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container, io.openshift.expose-services=, architecture=x86_64, build-date=2026-02-09T10:25:24Z, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, name=rhceph, version=7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.buildah.version=1.42.2, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 23 09:47:47 np0005626463.localdomain podman[299400]: 2026-02-23 09:47:47.480727534 +0000 UTC m=+0.157380120 container attach 77687fef8b7826008e7c936abb7c20fd805874aad2aa24527381dfeeb5cf871d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eager_bose, build-date=2026-02-09T10:25:24Z, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1770267347, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., ceph=True, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, io.buildah.version=1.42.2, name=rhceph, RELEASE=main, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.openshift.expose-services=, architecture=x86_64, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7)
Feb 23 09:47:47 np0005626463.localdomain eager_bose[299415]: 167 167
Feb 23 09:47:47 np0005626463.localdomain systemd[1]: libpod-77687fef8b7826008e7c936abb7c20fd805874aad2aa24527381dfeeb5cf871d.scope: Deactivated successfully.
Feb 23 09:47:47 np0005626463.localdomain podman[299400]: 2026-02-23 09:47:47.483564171 +0000 UTC m=+0.160216697 container died 77687fef8b7826008e7c936abb7c20fd805874aad2aa24527381dfeeb5cf871d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eager_bose, com.redhat.component=rhceph-container, GIT_BRANCH=main, RELEASE=main, name=rhceph, version=7, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, ceph=True, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, build-date=2026-02-09T10:25:24Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, release=1770267347, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, io.buildah.version=1.42.2, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Feb 23 09:47:47 np0005626463.localdomain podman[299420]: 2026-02-23 09:47:47.572630731 +0000 UTC m=+0.079829990 container remove 77687fef8b7826008e7c936abb7c20fd805874aad2aa24527381dfeeb5cf871d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eager_bose, distribution-scope=public, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, version=7, vendor=Red Hat, Inc., io.openshift.expose-services=, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, com.redhat.component=rhceph-container, org.opencontainers.image.created=2026-02-09T10:25:24Z, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2026-02-09T10:25:24Z, architecture=x86_64, io.buildah.version=1.42.2, release=1770267347, GIT_CLEAN=True, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Feb 23 09:47:47 np0005626463.localdomain systemd[1]: libpod-conmon-77687fef8b7826008e7c936abb7c20fd805874aad2aa24527381dfeeb5cf871d.scope: Deactivated successfully.
Feb 23 09:47:47 np0005626463.localdomain sudo[299366]: pam_unix(sudo:session): session closed for user root
Feb 23 09:47:47 np0005626463.localdomain sudo[299438]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 09:47:47 np0005626463.localdomain sudo[299438]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:47:47 np0005626463.localdomain sudo[299438]: pam_unix(sudo:session): session closed for user root
Feb 23 09:47:47 np0005626463.localdomain sudo[299456]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid f1fea371-cb69-578d-a3d0-b5c472a84b46
Feb 23 09:47:47 np0005626463.localdomain sudo[299456]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:47:48 np0005626463.localdomain systemd[1]: tmp-crun.wEDGYC.mount: Deactivated successfully.
Feb 23 09:47:48 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-08631254c1bab725bd764818c5a74919dac759a151daafbc9f661d84e3787234-merged.mount: Deactivated successfully.
Feb 23 09:47:48 np0005626463.localdomain ceph-mon[294160]: Reconfiguring mds.mds.np0005626463.qcthuc (monmap changed)...
Feb 23 09:47:48 np0005626463.localdomain ceph-mon[294160]: Reconfiguring daemon mds.mds.np0005626463.qcthuc on np0005626463.localdomain
Feb 23 09:47:48 np0005626463.localdomain ceph-mon[294160]: pgmap v32: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:47:48 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:47:48 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:47:48 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626463.wtksup", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 23 09:47:48 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626463.wtksup", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 23 09:47:48 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "mgr services"} : dispatch
Feb 23 09:47:48 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:47:48 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:47:48 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Feb 23 09:47:48 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:47:48 np0005626463.localdomain podman[299492]: 2026-02-23 09:47:48.315725168 +0000 UTC m=+0.077493060 container create 6ed5a1bd72ad074ea2b6e421a10f5d6a025a0a347b58bcc25b8c9bee68551634 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=cranky_ptolemy, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, release=1770267347, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2026-02-09T10:25:24Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.42.2, name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_BRANCH=main, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, ceph=True, version=7, io.openshift.expose-services=, GIT_CLEAN=True, CEPH_POINT_RELEASE=)
Feb 23 09:47:48 np0005626463.localdomain systemd[1]: Started libpod-conmon-6ed5a1bd72ad074ea2b6e421a10f5d6a025a0a347b58bcc25b8c9bee68551634.scope.
Feb 23 09:47:48 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 09:47:48 np0005626463.localdomain podman[299492]: 2026-02-23 09:47:48.376739254 +0000 UTC m=+0.138507166 container init 6ed5a1bd72ad074ea2b6e421a10f5d6a025a0a347b58bcc25b8c9bee68551634 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=cranky_ptolemy, RELEASE=main, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.buildah.version=1.42.2, com.redhat.component=rhceph-container, name=rhceph, distribution-scope=public, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, release=1770267347, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., ceph=True, build-date=2026-02-09T10:25:24Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, CEPH_POINT_RELEASE=, io.openshift.expose-services=, architecture=x86_64, GIT_CLEAN=True)
Feb 23 09:47:48 np0005626463.localdomain podman[299492]: 2026-02-23 09:47:48.282504267 +0000 UTC m=+0.044272189 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 23 09:47:48 np0005626463.localdomain podman[299492]: 2026-02-23 09:47:48.386042718 +0000 UTC m=+0.147810610 container start 6ed5a1bd72ad074ea2b6e421a10f5d6a025a0a347b58bcc25b8c9bee68551634 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=cranky_ptolemy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.42.2, CEPH_POINT_RELEASE=, architecture=x86_64, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, build-date=2026-02-09T10:25:24Z, RELEASE=main, name=rhceph, release=1770267347, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.openshift.expose-services=, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True)
Feb 23 09:47:48 np0005626463.localdomain podman[299492]: 2026-02-23 09:47:48.386308156 +0000 UTC m=+0.148076048 container attach 6ed5a1bd72ad074ea2b6e421a10f5d6a025a0a347b58bcc25b8c9bee68551634 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=cranky_ptolemy, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, GIT_BRANCH=main, io.openshift.expose-services=, RELEASE=main, distribution-scope=public, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., build-date=2026-02-09T10:25:24Z, ceph=True, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, release=1770267347, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.buildah.version=1.42.2, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container)
Feb 23 09:47:48 np0005626463.localdomain cranky_ptolemy[299507]: 167 167
Feb 23 09:47:48 np0005626463.localdomain systemd[1]: libpod-6ed5a1bd72ad074ea2b6e421a10f5d6a025a0a347b58bcc25b8c9bee68551634.scope: Deactivated successfully.
Feb 23 09:47:48 np0005626463.localdomain podman[299492]: 2026-02-23 09:47:48.390039739 +0000 UTC m=+0.151807631 container died 6ed5a1bd72ad074ea2b6e421a10f5d6a025a0a347b58bcc25b8c9bee68551634 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=cranky_ptolemy, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., name=rhceph, version=7, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, RELEASE=main, GIT_BRANCH=main, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, ceph=True, architecture=x86_64, release=1770267347, vcs-type=git, distribution-scope=public, GIT_CLEAN=True, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.buildah.version=1.42.2, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph)
Feb 23 09:47:48 np0005626463.localdomain podman[299512]: 2026-02-23 09:47:48.484837815 +0000 UTC m=+0.086305308 container remove 6ed5a1bd72ad074ea2b6e421a10f5d6a025a0a347b58bcc25b8c9bee68551634 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=cranky_ptolemy, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, io.openshift.expose-services=, architecture=x86_64, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, release=1770267347, distribution-scope=public, GIT_BRANCH=main, build-date=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_CLEAN=True, name=rhceph, vcs-type=git, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.42.2, RELEASE=main, vendor=Red Hat, Inc., vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, url=https://catalog.redhat.com/en/search?searchType=containers)
Feb 23 09:47:48 np0005626463.localdomain systemd[1]: libpod-conmon-6ed5a1bd72ad074ea2b6e421a10f5d6a025a0a347b58bcc25b8c9bee68551634.scope: Deactivated successfully.
Feb 23 09:47:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:47:48.549 163572 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 09:47:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:47:48.550 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 09:47:48 np0005626463.localdomain sudo[299456]: pam_unix(sudo:session): session closed for user root
Feb 23 09:47:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:47:48.551 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 09:47:49 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-86a5054a0f3bf282b2b900f7b905344455fccf51ecc22bf3909309fbe85f6045-merged.mount: Deactivated successfully.
Feb 23 09:47:49 np0005626463.localdomain ceph-mon[294160]: Reconfiguring mgr.np0005626463.wtksup (monmap changed)...
Feb 23 09:47:49 np0005626463.localdomain ceph-mon[294160]: from='client.34402 -' entity='client.admin' cmd=[{"prefix": "orch daemon add", "daemon_type": "mon", "placement": "np0005626465.localdomain:172.18.0.104", "target": ["mon-mgr", ""]}]: dispatch
Feb 23 09:47:49 np0005626463.localdomain ceph-mon[294160]: Reconfiguring daemon mgr.np0005626463.wtksup on np0005626463.localdomain
Feb 23 09:47:49 np0005626463.localdomain ceph-mon[294160]: Deploying daemon mon.np0005626465 on np0005626465.localdomain
Feb 23 09:47:49 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:47:49 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:47:49 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626465", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 23 09:47:49 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626465", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 23 09:47:49 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:47:50 np0005626463.localdomain ceph-mon[294160]: Reconfiguring crash.np0005626465 (monmap changed)...
Feb 23 09:47:50 np0005626463.localdomain ceph-mon[294160]: Reconfiguring daemon crash.np0005626465 on np0005626465.localdomain
Feb 23 09:47:50 np0005626463.localdomain ceph-mon[294160]: pgmap v33: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:47:50 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:47:50.278 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:47:50 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:47:50.280 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:47:50 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:47:50.281 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 09:47:50 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:47:50.281 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:47:50 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11  adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Feb 23 09:47:50 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11  adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Feb 23 09:47:50 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:47:50.321 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:47:50 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:47:50.322 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:47:50 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11  adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Feb 23 09:47:51 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon).osd e83 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:47:51 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:47:51 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:47:51 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:47:51 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:47:51 np0005626463.localdomain ceph-mon[294160]: Reconfiguring osd.0 (monmap changed)...
Feb 23 09:47:51 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch
Feb 23 09:47:51 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:47:51 np0005626463.localdomain ceph-mon[294160]: Reconfiguring daemon osd.0 on np0005626465.localdomain
Feb 23 09:47:51 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "mon metadata", "id": "np0005626465"} : dispatch
Feb 23 09:47:51 np0005626463.localdomain ceph-mon[294160]: pgmap v34: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:47:52 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11  adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Feb 23 09:47:52 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:47:52 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:47:52 np0005626463.localdomain ceph-mon[294160]: Reconfiguring osd.3 (monmap changed)...
Feb 23 09:47:52 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch
Feb 23 09:47:52 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:47:52 np0005626463.localdomain ceph-mon[294160]: Reconfiguring daemon osd.3 on np0005626465.localdomain
Feb 23 09:47:52 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "mon metadata", "id": "np0005626465"} : dispatch
Feb 23 09:47:52 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.
Feb 23 09:47:52 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.
Feb 23 09:47:52 np0005626463.localdomain podman[299529]: 2026-02-23 09:47:52.952068985 +0000 UTC m=+0.115395803 container health_status bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 23 09:47:52 np0005626463.localdomain podman[299529]: 2026-02-23 09:47:52.964363009 +0000 UTC m=+0.127689827 container exec_died bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 23 09:47:52 np0005626463.localdomain systemd[1]: tmp-crun.pg6iNH.mount: Deactivated successfully.
Feb 23 09:47:52 np0005626463.localdomain podman[299528]: 2026-02-23 09:47:52.979891581 +0000 UTC m=+0.143173988 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.43.0, org.label-schema.build-date=20260216)
Feb 23 09:47:52 np0005626463.localdomain systemd[1]: bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.service: Deactivated successfully.
Feb 23 09:47:53 np0005626463.localdomain podman[299528]: 2026-02-23 09:47:53.07412295 +0000 UTC m=+0.237405387 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, config_id=ovn_controller, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, container_name=ovn_controller, managed_by=edpm_ansible)
Feb 23 09:47:53 np0005626463.localdomain systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully.
Feb 23 09:47:53 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "mon metadata", "id": "np0005626465"} : dispatch
Feb 23 09:47:53 np0005626463.localdomain ceph-mon[294160]: pgmap v35: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:47:53 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:47:53 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:47:53 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005626465.drvnoy", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 23 09:47:53 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005626465.drvnoy", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 23 09:47:53 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:47:53 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "mon metadata", "id": "np0005626465"} : dispatch
Feb 23 09:47:54 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11  adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Feb 23 09:47:54 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11  adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Feb 23 09:47:55 np0005626463.localdomain ceph-mon[294160]: Reconfiguring mds.mds.np0005626465.drvnoy (monmap changed)...
Feb 23 09:47:55 np0005626463.localdomain ceph-mon[294160]: Reconfiguring daemon mds.mds.np0005626465.drvnoy on np0005626465.localdomain
Feb 23 09:47:55 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:47:55 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:47:55 np0005626463.localdomain ceph-mon[294160]: Reconfiguring mgr.np0005626465.hlpkwo (monmap changed)...
Feb 23 09:47:55 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626465.hlpkwo", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 23 09:47:55 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626465.hlpkwo", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 23 09:47:55 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "mgr services"} : dispatch
Feb 23 09:47:55 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:47:55 np0005626463.localdomain ceph-mon[294160]: Reconfiguring daemon mgr.np0005626465.hlpkwo on np0005626465.localdomain
Feb 23 09:47:55 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "mon metadata", "id": "np0005626465"} : dispatch
Feb 23 09:47:55 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:47:55 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:47:55 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626466", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 23 09:47:55 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626466", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 23 09:47:55 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:47:55 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:47:55.323 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:47:55 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:47:55.324 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:47:55 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:47:55.325 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 09:47:55 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:47:55.325 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:47:55 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:47:55.354 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:47:55 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:47:55.355 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:47:55 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.
Feb 23 09:47:55 np0005626463.localdomain podman[299578]: 2026-02-23 09:47:55.915100444 +0000 UTC m=+0.089174926 container health_status be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Feb 23 09:47:55 np0005626463.localdomain podman[299578]: 2026-02-23 09:47:55.954799532 +0000 UTC m=+0.128873994 container exec_died be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.build-date=20260216, container_name=ceilometer_agent_compute, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ceilometer_agent_compute)
Feb 23 09:47:55 np0005626463.localdomain systemd[1]: be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.service: Deactivated successfully.
Feb 23 09:47:56 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon).osd e83 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:47:56 np0005626463.localdomain ceph-mon[294160]: pgmap v36: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:47:56 np0005626463.localdomain ceph-mon[294160]: Reconfiguring crash.np0005626466 (monmap changed)...
Feb 23 09:47:56 np0005626463.localdomain ceph-mon[294160]: Reconfiguring daemon crash.np0005626466 on np0005626466.localdomain
Feb 23 09:47:56 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "mon metadata", "id": "np0005626465"} : dispatch
Feb 23 09:47:56 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:47:56 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:47:56 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch
Feb 23 09:47:56 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:47:56 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11  adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Feb 23 09:47:57 np0005626463.localdomain ceph-mon[294160]: Reconfiguring osd.1 (monmap changed)...
Feb 23 09:47:57 np0005626463.localdomain ceph-mon[294160]: Reconfiguring daemon osd.1 on np0005626466.localdomain
Feb 23 09:47:57 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "mon metadata", "id": "np0005626465"} : dispatch
Feb 23 09:47:57 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:47:57 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:47:57 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch
Feb 23 09:47:57 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:47:58 np0005626463.localdomain ceph-mon[294160]: pgmap v37: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:47:58 np0005626463.localdomain ceph-mon[294160]: Reconfiguring osd.4 (monmap changed)...
Feb 23 09:47:58 np0005626463.localdomain ceph-mon[294160]: Reconfiguring daemon osd.4 on np0005626466.localdomain
Feb 23 09:47:58 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "mon metadata", "id": "np0005626465"} : dispatch
Feb 23 09:47:58 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:47:58 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:47:58 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005626466.vaywlp", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 23 09:47:58 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005626466.vaywlp", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 23 09:47:58 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:47:58 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11  adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Feb 23 09:47:58 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11  adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Feb 23 09:47:59 np0005626463.localdomain ceph-mon[294160]: Reconfiguring mds.mds.np0005626466.vaywlp (monmap changed)...
Feb 23 09:47:59 np0005626463.localdomain ceph-mon[294160]: Reconfiguring daemon mds.mds.np0005626466.vaywlp on np0005626466.localdomain
Feb 23 09:47:59 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "mon metadata", "id": "np0005626465"} : dispatch
Feb 23 09:47:59 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.
Feb 23 09:47:59 np0005626463.localdomain systemd[1]: tmp-crun.7YqP8P.mount: Deactivated successfully.
Feb 23 09:47:59 np0005626463.localdomain podman[299597]: 2026-02-23 09:47:59.924059325 +0000 UTC m=+0.099442847 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true)
Feb 23 09:47:59 np0005626463.localdomain podman[299597]: 2026-02-23 09:47:59.932231714 +0000 UTC m=+0.107615156 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 23 09:47:59 np0005626463.localdomain systemd[1]: 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully.
Feb 23 09:48:00 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:48:00.356 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:48:00 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:48:00.357 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:48:00 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:48:00.358 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 09:48:00 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:48:00.358 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:48:00 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:48:00.397 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:48:00 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:48:00.398 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:48:00 np0005626463.localdomain ceph-mon[294160]: pgmap v38: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:48:00 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:48:00 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:48:00 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626466.nisqfq", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 23 09:48:00 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626466.nisqfq", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 23 09:48:00 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "mgr services"} : dispatch
Feb 23 09:48:00 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:48:00 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "mon metadata", "id": "np0005626465"} : dispatch
Feb 23 09:48:00 np0005626463.localdomain sudo[299613]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 09:48:00 np0005626463.localdomain sudo[299613]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:48:00 np0005626463.localdomain sudo[299613]: pam_unix(sudo:session): session closed for user root
Feb 23 09:48:00 np0005626463.localdomain sudo[299631]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 23 09:48:00 np0005626463.localdomain sudo[299631]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:48:00 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11  adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Feb 23 09:48:01 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon).osd e83 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:48:01 np0005626463.localdomain sudo[299631]: pam_unix(sudo:session): session closed for user root
Feb 23 09:48:01 np0005626463.localdomain ceph-mon[294160]: Reconfiguring mgr.np0005626466.nisqfq (monmap changed)...
Feb 23 09:48:01 np0005626463.localdomain ceph-mon[294160]: Reconfiguring daemon mgr.np0005626466.nisqfq on np0005626466.localdomain
Feb 23 09:48:01 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:48:01 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:48:01 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "mon metadata", "id": "np0005626465"} : dispatch
Feb 23 09:48:01 np0005626463.localdomain ceph-mon[294160]: pgmap v39: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:48:01 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix": "status", "format": "json"} v 0)
Feb 23 09:48:01 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.200:0/2706546237' entity='client.admin' cmd={"prefix": "status", "format": "json"} : dispatch
Feb 23 09:48:02 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.200:0/2706546237' entity='client.admin' cmd={"prefix": "status", "format": "json"} : dispatch
Feb 23 09:48:02 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "mon metadata", "id": "np0005626465"} : dispatch
Feb 23 09:48:02 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:48:02 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:48:02 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11  adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Feb 23 09:48:02 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11  adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Feb 23 09:48:03 np0005626463.localdomain sudo[299682]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 09:48:03 np0005626463.localdomain sudo[299682]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:48:03 np0005626463.localdomain sudo[299682]: pam_unix(sudo:session): session closed for user root
Feb 23 09:48:03 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "mon metadata", "id": "np0005626465"} : dispatch
Feb 23 09:48:03 np0005626463.localdomain ceph-mon[294160]: from='client.34411 -' entity='client.admin' cmd=[{"prefix": "orch", "action": "reconfig", "service_name": "osd.default_drive_group", "target": ["mon-mgr", ""]}]: dispatch
Feb 23 09:48:03 np0005626463.localdomain ceph-mon[294160]: Reconfig service osd.default_drive_group
Feb 23 09:48:03 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:48:03 np0005626463.localdomain ceph-mon[294160]: pgmap v40: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:48:03 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:48:03 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:48:03 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:48:03 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:48:03 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 23 09:48:03 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:48:03 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:48:03 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 23 09:48:03 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:48:03 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:48:03 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:48:03 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:48:03 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:48:03 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:48:03 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:48:03 np0005626463.localdomain sudo[299700]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 09:48:03 np0005626463.localdomain sudo[299700]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:48:03 np0005626463.localdomain sudo[299700]: pam_unix(sudo:session): session closed for user root
Feb 23 09:48:03 np0005626463.localdomain sudo[299718]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid f1fea371-cb69-578d-a3d0-b5c472a84b46
Feb 23 09:48:03 np0005626463.localdomain sudo[299718]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:48:04 np0005626463.localdomain podman[299752]: 
Feb 23 09:48:04 np0005626463.localdomain podman[299752]: 2026-02-23 09:48:04.04971057 +0000 UTC m=+0.074750997 container create 7817e5c1520603d87f75d56ecc4494f4efb6226fa7c751fba29bf46d3ea8bfba (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=cool_hertz, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, version=7, release=1770267347, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, org.opencontainers.image.created=2026-02-09T10:25:24Z, distribution-scope=public, build-date=2026-02-09T10:25:24Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, name=rhceph, io.buildah.version=1.42.2)
Feb 23 09:48:04 np0005626463.localdomain systemd[1]: Started libpod-conmon-7817e5c1520603d87f75d56ecc4494f4efb6226fa7c751fba29bf46d3ea8bfba.scope.
Feb 23 09:48:04 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 09:48:04 np0005626463.localdomain podman[299752]: 2026-02-23 09:48:04.117040279 +0000 UTC m=+0.142080746 container init 7817e5c1520603d87f75d56ecc4494f4efb6226fa7c751fba29bf46d3ea8bfba (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=cool_hertz, RELEASE=main, architecture=x86_64, org.opencontainers.image.created=2026-02-09T10:25:24Z, release=1770267347, CEPH_POINT_RELEASE=, io.buildah.version=1.42.2, com.redhat.component=rhceph-container, ceph=True, version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, distribution-scope=public, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, build-date=2026-02-09T10:25:24Z, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 23 09:48:04 np0005626463.localdomain podman[299752]: 2026-02-23 09:48:04.021659426 +0000 UTC m=+0.046699933 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 23 09:48:04 np0005626463.localdomain podman[299752]: 2026-02-23 09:48:04.178989775 +0000 UTC m=+0.204030232 container start 7817e5c1520603d87f75d56ecc4494f4efb6226fa7c751fba29bf46d3ea8bfba (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=cool_hertz, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, org.opencontainers.image.created=2026-02-09T10:25:24Z, com.redhat.component=rhceph-container, io.openshift.expose-services=, distribution-scope=public, release=1770267347, io.buildah.version=1.42.2, build-date=2026-02-09T10:25:24Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, name=rhceph, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, architecture=x86_64, CEPH_POINT_RELEASE=)
Feb 23 09:48:04 np0005626463.localdomain podman[299752]: 2026-02-23 09:48:04.17980659 +0000 UTC m=+0.204847057 container attach 7817e5c1520603d87f75d56ecc4494f4efb6226fa7c751fba29bf46d3ea8bfba (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=cool_hertz, distribution-scope=public, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, io.buildah.version=1.42.2, release=1770267347, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.component=rhceph-container, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, name=rhceph, build-date=2026-02-09T10:25:24Z, io.openshift.tags=rhceph ceph, org.opencontainers.image.created=2026-02-09T10:25:24Z, CEPH_POINT_RELEASE=, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Feb 23 09:48:04 np0005626463.localdomain cool_hertz[299767]: 167 167
Feb 23 09:48:04 np0005626463.localdomain systemd[1]: libpod-7817e5c1520603d87f75d56ecc4494f4efb6226fa7c751fba29bf46d3ea8bfba.scope: Deactivated successfully.
Feb 23 09:48:04 np0005626463.localdomain podman[299752]: 2026-02-23 09:48:04.183508983 +0000 UTC m=+0.208549460 container died 7817e5c1520603d87f75d56ecc4494f4efb6226fa7c751fba29bf46d3ea8bfba (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=cool_hertz, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1770267347, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, GIT_BRANCH=main, build-date=2026-02-09T10:25:24Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_CLEAN=True, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, ceph=True, io.openshift.expose-services=, version=7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, distribution-scope=public, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, io.buildah.version=1.42.2, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git)
Feb 23 09:48:04 np0005626463.localdomain podman[299772]: 2026-02-23 09:48:04.26822414 +0000 UTC m=+0.073460016 container remove 7817e5c1520603d87f75d56ecc4494f4efb6226fa7c751fba29bf46d3ea8bfba (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=cool_hertz, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.42.2, GIT_BRANCH=main, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, GIT_CLEAN=True, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, release=1770267347, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, version=7, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2026-02-09T10:25:24Z, org.opencontainers.image.created=2026-02-09T10:25:24Z, vendor=Red Hat, Inc., vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, RELEASE=main)
Feb 23 09:48:04 np0005626463.localdomain systemd[1]: libpod-conmon-7817e5c1520603d87f75d56ecc4494f4efb6226fa7c751fba29bf46d3ea8bfba.scope: Deactivated successfully.
Feb 23 09:48:04 np0005626463.localdomain sudo[299718]: pam_unix(sudo:session): session closed for user root
Feb 23 09:48:04 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch
Feb 23 09:48:04 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:48:04 np0005626463.localdomain ceph-mon[294160]: Reconfiguring daemon osd.2 on np0005626463.localdomain
Feb 23 09:48:04 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "mon metadata", "id": "np0005626465"} : dispatch
Feb 23 09:48:04 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/1406118199' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 23 09:48:04 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/1406118199' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 23 09:48:04 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:48:04 np0005626463.localdomain sudo[299795]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 09:48:04 np0005626463.localdomain sudo[299795]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:48:04 np0005626463.localdomain sudo[299795]: pam_unix(sudo:session): session closed for user root
Feb 23 09:48:04 np0005626463.localdomain sudo[299813]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid f1fea371-cb69-578d-a3d0-b5c472a84b46
Feb 23 09:48:04 np0005626463.localdomain sudo[299813]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:48:04 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11  adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Feb 23 09:48:04 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11  adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Feb 23 09:48:05 np0005626463.localdomain systemd[1]: tmp-crun.HnamUS.mount: Deactivated successfully.
Feb 23 09:48:05 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-956ac6c11b6a2b90a6335f65b451c5bd114a3c426040e7a4dbad42676f695186-merged.mount: Deactivated successfully.
Feb 23 09:48:05 np0005626463.localdomain podman[299847]: 
Feb 23 09:48:05 np0005626463.localdomain podman[299847]: 2026-02-23 09:48:05.107270857 +0000 UTC m=+0.072423005 container create e1f31e29d6240dcedc48c89801c390e79a58e879e6b4fa1b7e51e5d3a07992dc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=cool_heisenberg, build-date=2026-02-09T10:25:24Z, io.buildah.version=1.42.2, release=1770267347, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, name=rhceph, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, distribution-scope=public, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, RELEASE=main, version=7, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.created=2026-02-09T10:25:24Z, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Feb 23 09:48:05 np0005626463.localdomain systemd[1]: Started libpod-conmon-e1f31e29d6240dcedc48c89801c390e79a58e879e6b4fa1b7e51e5d3a07992dc.scope.
Feb 23 09:48:05 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 09:48:05 np0005626463.localdomain podman[299847]: 2026-02-23 09:48:05.168220791 +0000 UTC m=+0.133372919 container init e1f31e29d6240dcedc48c89801c390e79a58e879e6b4fa1b7e51e5d3a07992dc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=cool_heisenberg, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1770267347, version=7, distribution-scope=public, ceph=True, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, build-date=2026-02-09T10:25:24Z, RELEASE=main, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.42.2, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-type=git, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.created=2026-02-09T10:25:24Z, architecture=x86_64)
Feb 23 09:48:05 np0005626463.localdomain podman[299847]: 2026-02-23 09:48:05.076780129 +0000 UTC m=+0.041932317 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 23 09:48:05 np0005626463.localdomain podman[299847]: 2026-02-23 09:48:05.176851734 +0000 UTC m=+0.142003882 container start e1f31e29d6240dcedc48c89801c390e79a58e879e6b4fa1b7e51e5d3a07992dc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=cool_heisenberg, CEPH_POINT_RELEASE=, name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1770267347, build-date=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., distribution-scope=public, io.buildah.version=1.42.2, io.openshift.tags=rhceph ceph, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, GIT_BRANCH=main, GIT_CLEAN=True, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.created=2026-02-09T10:25:24Z, RELEASE=main, vcs-type=git, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 23 09:48:05 np0005626463.localdomain cool_heisenberg[299862]: 167 167
Feb 23 09:48:05 np0005626463.localdomain podman[299847]: 2026-02-23 09:48:05.177259227 +0000 UTC m=+0.142411365 container attach e1f31e29d6240dcedc48c89801c390e79a58e879e6b4fa1b7e51e5d3a07992dc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=cool_heisenberg, build-date=2026-02-09T10:25:24Z, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, io.buildah.version=1.42.2, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, ceph=True, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container, GIT_CLEAN=True, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.openshift.tags=rhceph ceph, release=1770267347, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, distribution-scope=public, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main)
Feb 23 09:48:05 np0005626463.localdomain systemd[1]: libpod-e1f31e29d6240dcedc48c89801c390e79a58e879e6b4fa1b7e51e5d3a07992dc.scope: Deactivated successfully.
Feb 23 09:48:05 np0005626463.localdomain podman[299847]: 2026-02-23 09:48:05.179691411 +0000 UTC m=+0.144843609 container died e1f31e29d6240dcedc48c89801c390e79a58e879e6b4fa1b7e51e5d3a07992dc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=cool_heisenberg, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, RELEASE=main, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, io.openshift.expose-services=, GIT_BRANCH=main, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1770267347, distribution-scope=public, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, build-date=2026-02-09T10:25:24Z, ceph=True, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.42.2)
Feb 23 09:48:05 np0005626463.localdomain podman[299867]: 2026-02-23 09:48:05.27690811 +0000 UTC m=+0.081427170 container remove e1f31e29d6240dcedc48c89801c390e79a58e879e6b4fa1b7e51e5d3a07992dc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=cool_heisenberg, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.expose-services=, com.redhat.component=rhceph-container, vcs-type=git, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, ceph=True, version=7, CEPH_POINT_RELEASE=, io.buildah.version=1.42.2, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1770267347, RELEASE=main, architecture=x86_64, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2026-02-09T10:25:24Z)
Feb 23 09:48:05 np0005626463.localdomain systemd[1]: libpod-conmon-e1f31e29d6240dcedc48c89801c390e79a58e879e6b4fa1b7e51e5d3a07992dc.scope: Deactivated successfully.
Feb 23 09:48:05 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:48:05.398 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:48:05 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:48:05.401 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:48:05 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:48:05.401 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 09:48:05 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:48:05.401 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:48:05 np0005626463.localdomain sudo[299813]: pam_unix(sudo:session): session closed for user root
Feb 23 09:48:05 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:48:05.472 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:48:05 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:48:05.473 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:48:05 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:48:05 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:48:05 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:48:05 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch
Feb 23 09:48:05 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:48:05 np0005626463.localdomain ceph-mon[294160]: Reconfiguring daemon osd.5 on np0005626463.localdomain
Feb 23 09:48:05 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "mon metadata", "id": "np0005626465"} : dispatch
Feb 23 09:48:05 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:48:05 np0005626463.localdomain ceph-mon[294160]: pgmap v41: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:48:05 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:48:06 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-545d1b1a2a35f22d6d3556b018dd151a0d75ba70343262c5036d92c3644dae9e-merged.mount: Deactivated successfully.
Feb 23 09:48:06 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon).osd e83 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:48:06 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:48:06 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:48:06 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:48:06 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch
Feb 23 09:48:06 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:48:06 np0005626463.localdomain ceph-mon[294160]: Reconfiguring daemon osd.0 on np0005626465.localdomain
Feb 23 09:48:06 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "mon metadata", "id": "np0005626465"} : dispatch
Feb 23 09:48:06 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11  adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Feb 23 09:48:06 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11  adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Feb 23 09:48:07 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:48:07 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:48:07 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:48:07 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:48:07 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch
Feb 23 09:48:07 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:48:07 np0005626463.localdomain ceph-mon[294160]: Reconfiguring daemon osd.3 on np0005626465.localdomain
Feb 23 09:48:07 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "mon metadata", "id": "np0005626465"} : dispatch
Feb 23 09:48:07 np0005626463.localdomain ceph-mon[294160]: pgmap v42: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:48:07 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:48:07 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix": "mgr fail"} v 0)
Feb 23 09:48:07 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='client.? 172.18.0.200:0/3009308721' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch
Feb 23 09:48:07 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon).osd e84 e84: 6 total, 6 up, 6 in
Feb 23 09:48:07 np0005626463.localdomain ceph-mgr[288036]: mgr handle_mgr_map Activating!
Feb 23 09:48:07 np0005626463.localdomain ceph-mgr[288036]: mgr handle_mgr_map I am now activating
Feb 23 09:48:07 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix": "mon metadata", "id": "np0005626461"} v 0)
Feb 23 09:48:07 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626461"} : dispatch
Feb 23 09:48:07 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix": "mon metadata", "id": "np0005626463"} v 0)
Feb 23 09:48:07 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626463"} : dispatch
Feb 23 09:48:07 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix": "mon metadata", "id": "np0005626466"} v 0)
Feb 23 09:48:07 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626466"} : dispatch
Feb 23 09:48:07 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix": "mds metadata", "who": "mds.np0005626465.drvnoy"} v 0)
Feb 23 09:48:07 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mds metadata", "who": "mds.np0005626465.drvnoy"} : dispatch
Feb 23 09:48:07 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon).mds e17 all = 0
Feb 23 09:48:07 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix": "mds metadata", "who": "mds.np0005626466.vaywlp"} v 0)
Feb 23 09:48:07 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mds metadata", "who": "mds.np0005626466.vaywlp"} : dispatch
Feb 23 09:48:07 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon).mds e17 all = 0
Feb 23 09:48:07 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix": "mds metadata", "who": "mds.np0005626463.qcthuc"} v 0)
Feb 23 09:48:07 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mds metadata", "who": "mds.np0005626463.qcthuc"} : dispatch
Feb 23 09:48:07 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon).mds e17 all = 0
Feb 23 09:48:07 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix": "mgr metadata", "who": "np0005626463.wtksup", "id": "np0005626463.wtksup"} v 0)
Feb 23 09:48:07 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mgr metadata", "who": "np0005626463.wtksup", "id": "np0005626463.wtksup"} : dispatch
Feb 23 09:48:07 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix": "mgr metadata", "who": "np0005626466.nisqfq", "id": "np0005626466.nisqfq"} v 0)
Feb 23 09:48:07 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mgr metadata", "who": "np0005626466.nisqfq", "id": "np0005626466.nisqfq"} : dispatch
Feb 23 09:48:07 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix": "mgr metadata", "who": "np0005626461.lrfquh", "id": "np0005626461.lrfquh"} v 0)
Feb 23 09:48:07 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mgr metadata", "who": "np0005626461.lrfquh", "id": "np0005626461.lrfquh"} : dispatch
Feb 23 09:48:07 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix": "mgr metadata", "who": "np0005626460.fyrady", "id": "np0005626460.fyrady"} v 0)
Feb 23 09:48:07 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mgr metadata", "who": "np0005626460.fyrady", "id": "np0005626460.fyrady"} : dispatch
Feb 23 09:48:07 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix": "osd metadata", "id": 0} v 0)
Feb 23 09:48:07 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "osd metadata", "id": 0} : dispatch
Feb 23 09:48:07 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix": "osd metadata", "id": 1} v 0)
Feb 23 09:48:07 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "osd metadata", "id": 1} : dispatch
Feb 23 09:48:07 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix": "osd metadata", "id": 2} v 0)
Feb 23 09:48:07 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "osd metadata", "id": 2} : dispatch
Feb 23 09:48:07 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix": "osd metadata", "id": 3} v 0)
Feb 23 09:48:07 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "osd metadata", "id": 3} : dispatch
Feb 23 09:48:07 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix": "osd metadata", "id": 4} v 0)
Feb 23 09:48:07 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "osd metadata", "id": 4} : dispatch
Feb 23 09:48:07 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix": "osd metadata", "id": 5} v 0)
Feb 23 09:48:07 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "osd metadata", "id": 5} : dispatch
Feb 23 09:48:07 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix": "mds metadata"} v 0)
Feb 23 09:48:07 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mds metadata"} : dispatch
Feb 23 09:48:07 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon).mds e17 all = 1
Feb 23 09:48:07 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix": "osd metadata"} v 0)
Feb 23 09:48:07 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "osd metadata"} : dispatch
Feb 23 09:48:07 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix": "mon metadata"} v 0)
Feb 23 09:48:07 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata"} : dispatch
Feb 23 09:48:07 np0005626463.localdomain ceph-mgr[288036]: [balancer DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Feb 23 09:48:07 np0005626463.localdomain ceph-mgr[288036]: mgr load Constructed class from module: balancer
Feb 23 09:48:07 np0005626463.localdomain ceph-mgr[288036]: [balancer INFO root] Starting
Feb 23 09:48:07 np0005626463.localdomain ceph-mgr[288036]: [cephadm DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Feb 23 09:48:07 np0005626463.localdomain ceph-mgr[288036]: [balancer INFO root] Optimize plan auto_2026-02-23_09:48:07
Feb 23 09:48:07 np0005626463.localdomain ceph-mgr[288036]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Feb 23 09:48:07 np0005626463.localdomain ceph-mgr[288036]: [balancer INFO root] Some PGs (1.000000) are unknown; try again later
Feb 23 09:48:07 np0005626463.localdomain sshd[296382]: pam_unix(sshd:session): session closed for user ceph-admin
Feb 23 09:48:07 np0005626463.localdomain ceph-mgr[288036]: [cephadm WARNING root] removing stray HostCache host record np0005626460.localdomain.devices.0
Feb 23 09:48:07 np0005626463.localdomain ceph-mgr[288036]: log_channel(cephadm) log [WRN] : removing stray HostCache host record np0005626460.localdomain.devices.0
Feb 23 09:48:07 np0005626463.localdomain systemd[1]: session-68.scope: Deactivated successfully.
Feb 23 09:48:07 np0005626463.localdomain systemd[1]: session-68.scope: Consumed 20.524s CPU time.
Feb 23 09:48:07 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix":"config-key del","key":"mgr/cephadm/host.np0005626460.localdomain.devices.0"} v 0)
Feb 23 09:48:07 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005626460.localdomain.devices.0"} : dispatch
Feb 23 09:48:07 np0005626463.localdomain systemd-logind[759]: Session 68 logged out. Waiting for processes to exit.
Feb 23 09:48:07 np0005626463.localdomain systemd-logind[759]: Removed session 68.
Feb 23 09:48:07 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix":"config-key del","key":"mgr/cephadm/host.np0005626460.localdomain.devices.0"} v 0)
Feb 23 09:48:07 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005626460.localdomain.devices.0"} : dispatch
Feb 23 09:48:07 np0005626463.localdomain ceph-mgr[288036]: mgr load Constructed class from module: cephadm
Feb 23 09:48:07 np0005626463.localdomain ceph-mgr[288036]: [crash DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Feb 23 09:48:07 np0005626463.localdomain ceph-mgr[288036]: mgr load Constructed class from module: crash
Feb 23 09:48:07 np0005626463.localdomain ceph-mgr[288036]: [devicehealth DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Feb 23 09:48:07 np0005626463.localdomain ceph-mgr[288036]: mgr load Constructed class from module: devicehealth
Feb 23 09:48:07 np0005626463.localdomain ceph-mgr[288036]: [iostat DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Feb 23 09:48:07 np0005626463.localdomain ceph-mgr[288036]: mgr load Constructed class from module: iostat
Feb 23 09:48:07 np0005626463.localdomain ceph-mgr[288036]: [devicehealth INFO root] Starting
Feb 23 09:48:07 np0005626463.localdomain ceph-mgr[288036]: [nfs DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Feb 23 09:48:07 np0005626463.localdomain ceph-mgr[288036]: mgr load Constructed class from module: nfs
Feb 23 09:48:07 np0005626463.localdomain ceph-mgr[288036]: [orchestrator DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Feb 23 09:48:07 np0005626463.localdomain ceph-mgr[288036]: mgr load Constructed class from module: orchestrator
Feb 23 09:48:07 np0005626463.localdomain ceph-mgr[288036]: [pg_autoscaler DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Feb 23 09:48:07 np0005626463.localdomain ceph-mgr[288036]: mgr load Constructed class from module: pg_autoscaler
Feb 23 09:48:07 np0005626463.localdomain ceph-mgr[288036]: [progress DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Feb 23 09:48:07 np0005626463.localdomain ceph-mgr[288036]: mgr load Constructed class from module: progress
Feb 23 09:48:07 np0005626463.localdomain ceph-mgr[288036]: [pg_autoscaler INFO root] _maybe_adjust
Feb 23 09:48:08 np0005626463.localdomain ceph-mgr[288036]: [rbd_support DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Feb 23 09:48:08 np0005626463.localdomain ceph-mgr[288036]: [progress INFO root] Loading...
Feb 23 09:48:08 np0005626463.localdomain ceph-mgr[288036]: [progress INFO root] Loaded [<progress.module.GhostEvent object at 0x7f4408fe6ac0>, <progress.module.GhostEvent object at 0x7f4408fe6d60>, <progress.module.GhostEvent object at 0x7f4408fe6d90>, <progress.module.GhostEvent object at 0x7f4408fe6dc0>, <progress.module.GhostEvent object at 0x7f4408fe6df0>, <progress.module.GhostEvent object at 0x7f4408fe6e20>, <progress.module.GhostEvent object at 0x7f4408fe6e50>, <progress.module.GhostEvent object at 0x7f4408fe6e80>, <progress.module.GhostEvent object at 0x7f4408fe6eb0>, <progress.module.GhostEvent object at 0x7f4408fe6ee0>, <progress.module.GhostEvent object at 0x7f4408fe6f10>, <progress.module.GhostEvent object at 0x7f4408fe6f40>, <progress.module.GhostEvent object at 0x7f4408fe6f70>, <progress.module.GhostEvent object at 0x7f4408fe6fa0>, <progress.module.GhostEvent object at 0x7f4408fe6fd0>, <progress.module.GhostEvent object at 0x7f4408f70040>, <progress.module.GhostEvent object at 0x7f4408f70070>, <progress.module.GhostEvent object at 0x7f4408f700a0>, <progress.module.GhostEvent object at 0x7f4408f700d0>, <progress.module.GhostEvent object at 0x7f4408f70100>, <progress.module.GhostEvent object at 0x7f4408f70130>, <progress.module.GhostEvent object at 0x7f4408f70160>, <progress.module.GhostEvent object at 0x7f4408f70190>, <progress.module.GhostEvent object at 0x7f4408f701c0>, <progress.module.GhostEvent object at 0x7f4408f701f0>, <progress.module.GhostEvent object at 0x7f4408f70220>, <progress.module.GhostEvent object at 0x7f4408f70250>, <progress.module.GhostEvent object at 0x7f4408f70280>, <progress.module.GhostEvent object at 0x7f4408f702b0>, <progress.module.GhostEvent object at 0x7f4408f702e0>, <progress.module.GhostEvent object at 0x7f4408f70310>, <progress.module.GhostEvent object at 0x7f4408f70340>, <progress.module.GhostEvent object at 0x7f4408f70370>, <progress.module.GhostEvent object at 0x7f4408f703a0>, <progress.module.GhostEvent object at 0x7f4408f703d0>, <progress.module.GhostEvent object at 0x7f4408f70400>, <progress.module.GhostEvent object at 0x7f4408f70430>, <progress.module.GhostEvent object at 0x7f4408f70460>, <progress.module.GhostEvent object at 0x7f4408f70490>, <progress.module.GhostEvent object at 0x7f4408f704c0>, <progress.module.GhostEvent object at 0x7f4408f704f0>, <progress.module.GhostEvent object at 0x7f4408f70520>, <progress.module.GhostEvent object at 0x7f4408f70550>, <progress.module.GhostEvent object at 0x7f4408f70580>, <progress.module.GhostEvent object at 0x7f4408f705b0>, <progress.module.GhostEvent object at 0x7f4408f705e0>, <progress.module.GhostEvent object at 0x7f4408f70610>, <progress.module.GhostEvent object at 0x7f4408f70640>, <progress.module.GhostEvent object at 0x7f4408f70670>, <progress.module.GhostEvent object at 0x7f4408f706a0>] historic events
Feb 23 09:48:08 np0005626463.localdomain ceph-mgr[288036]: [progress INFO root] Loaded OSDMap, ready.
Feb 23 09:48:08 np0005626463.localdomain ceph-mgr[288036]: [rbd_support INFO root] recovery thread starting
Feb 23 09:48:08 np0005626463.localdomain ceph-mgr[288036]: [rbd_support INFO root] starting setup
Feb 23 09:48:08 np0005626463.localdomain ceph-mgr[288036]: mgr load Constructed class from module: rbd_support
Feb 23 09:48:08 np0005626463.localdomain ceph-mgr[288036]: [restful DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Feb 23 09:48:08 np0005626463.localdomain ceph-mgr[288036]: mgr load Constructed class from module: restful
Feb 23 09:48:08 np0005626463.localdomain ceph-mgr[288036]: [restful INFO root] server_addr: :: server_port: 8003
Feb 23 09:48:08 np0005626463.localdomain ceph-mgr[288036]: [status DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Feb 23 09:48:08 np0005626463.localdomain ceph-mgr[288036]: mgr load Constructed class from module: status
Feb 23 09:48:08 np0005626463.localdomain ceph-mgr[288036]: [restful WARNING root] server not running: no certificate configured
Feb 23 09:48:08 np0005626463.localdomain ceph-mgr[288036]: [telemetry DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Feb 23 09:48:08 np0005626463.localdomain ceph-mgr[288036]: mgr load Constructed class from module: telemetry
Feb 23 09:48:08 np0005626463.localdomain ceph-mgr[288036]: [volumes DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Feb 23 09:48:08 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005626463.wtksup/mirror_snapshot_schedule"} v 0)
Feb 23 09:48:08 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005626463.wtksup/mirror_snapshot_schedule"} : dispatch
Feb 23 09:48:08 np0005626463.localdomain ceph-mgr[288036]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Feb 23 09:48:08 np0005626463.localdomain ceph-mgr[288036]: [rbd_support INFO root] load_schedules: vms, start_after=
Feb 23 09:48:08 np0005626463.localdomain ceph-mgr[288036]: [rbd_support INFO root] load_schedules: volumes, start_after=
Feb 23 09:48:08 np0005626463.localdomain ceph-mgr[288036]: [rbd_support INFO root] load_schedules: images, start_after=
Feb 23 09:48:08 np0005626463.localdomain ceph-mgr[288036]: [rbd_support INFO root] load_schedules: backups, start_after=
Feb 23 09:48:08 np0005626463.localdomain ceph-mgr[288036]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: starting
Feb 23 09:48:08 np0005626463.localdomain ceph-mgr[288036]: [rbd_support INFO root] PerfHandler: starting
Feb 23 09:48:08 np0005626463.localdomain ceph-mgr[288036]: [rbd_support INFO root] load_task_task: vms, start_after=
Feb 23 09:48:08 np0005626463.localdomain ceph-mgr[288036]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Feb 23 09:48:08 np0005626463.localdomain ceph-mgr[288036]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Feb 23 09:48:08 np0005626463.localdomain ceph-mgr[288036]: mgr load Constructed class from module: volumes
Feb 23 09:48:08 np0005626463.localdomain ceph-mgr[288036]: [rbd_support INFO root] load_task_task: volumes, start_after=
Feb 23 09:48:08 np0005626463.localdomain ceph-mgr[288036]: [rbd_support INFO root] load_task_task: images, start_after=
Feb 23 09:48:08 np0005626463.localdomain ceph-mgr[288036]: [rbd_support INFO root] load_task_task: backups, start_after=
Feb 23 09:48:08 np0005626463.localdomain ceph-mgr[288036]: [rbd_support INFO root] TaskHandler: starting
Feb 23 09:48:08 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005626463.wtksup/trash_purge_schedule"} v 0)
Feb 23 09:48:08 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005626463.wtksup/trash_purge_schedule"} : dispatch
Feb 23 09:48:08 np0005626463.localdomain ceph-mgr[288036]: client.0 error registering admin socket command: (17) File exists
Feb 23 09:48:08 np0005626463.localdomain ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mgr-np0005626463-wtksup[288032]: 2026-02-23T09:48:08.079+0000 7f43f870a640 -1 client.0 error registering admin socket command: (17) File exists
Feb 23 09:48:08 np0005626463.localdomain ceph-mgr[288036]: client.0 error registering admin socket command: (17) File exists
Feb 23 09:48:08 np0005626463.localdomain ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mgr-np0005626463-wtksup[288032]: 2026-02-23T09:48:08.079+0000 7f43f870a640 -1 client.0 error registering admin socket command: (17) File exists
Feb 23 09:48:08 np0005626463.localdomain ceph-mgr[288036]: client.0 error registering admin socket command: (17) File exists
Feb 23 09:48:08 np0005626463.localdomain ceph-mgr[288036]: client.0 error registering admin socket command: (17) File exists
Feb 23 09:48:08 np0005626463.localdomain ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mgr-np0005626463-wtksup[288032]: 2026-02-23T09:48:08.079+0000 7f43f870a640 -1 client.0 error registering admin socket command: (17) File exists
Feb 23 09:48:08 np0005626463.localdomain ceph-mgr[288036]: client.0 error registering admin socket command: (17) File exists
Feb 23 09:48:08 np0005626463.localdomain ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mgr-np0005626463-wtksup[288032]: 2026-02-23T09:48:08.079+0000 7f43f870a640 -1 client.0 error registering admin socket command: (17) File exists
Feb 23 09:48:08 np0005626463.localdomain ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mgr-np0005626463-wtksup[288032]: 2026-02-23T09:48:08.079+0000 7f43f870a640 -1 client.0 error registering admin socket command: (17) File exists
Feb 23 09:48:08 np0005626463.localdomain ceph-mgr[288036]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Feb 23 09:48:08 np0005626463.localdomain ceph-mgr[288036]: client.0 error registering admin socket command: (17) File exists
Feb 23 09:48:08 np0005626463.localdomain ceph-mgr[288036]: client.0 error registering admin socket command: (17) File exists
Feb 23 09:48:08 np0005626463.localdomain ceph-mgr[288036]: client.0 error registering admin socket command: (17) File exists
Feb 23 09:48:08 np0005626463.localdomain ceph-mgr[288036]: client.0 error registering admin socket command: (17) File exists
Feb 23 09:48:08 np0005626463.localdomain ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mgr-np0005626463-wtksup[288032]: 2026-02-23T09:48:08.082+0000 7f43f4702640 -1 client.0 error registering admin socket command: (17) File exists
Feb 23 09:48:08 np0005626463.localdomain ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mgr-np0005626463-wtksup[288032]: 2026-02-23T09:48:08.082+0000 7f43f4702640 -1 client.0 error registering admin socket command: (17) File exists
Feb 23 09:48:08 np0005626463.localdomain ceph-mgr[288036]: client.0 error registering admin socket command: (17) File exists
Feb 23 09:48:08 np0005626463.localdomain ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mgr-np0005626463-wtksup[288032]: 2026-02-23T09:48:08.082+0000 7f43f4702640 -1 client.0 error registering admin socket command: (17) File exists
Feb 23 09:48:08 np0005626463.localdomain ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mgr-np0005626463-wtksup[288032]: 2026-02-23T09:48:08.082+0000 7f43f4702640 -1 client.0 error registering admin socket command: (17) File exists
Feb 23 09:48:08 np0005626463.localdomain ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mgr-np0005626463-wtksup[288032]: 2026-02-23T09:48:08.082+0000 7f43f4702640 -1 client.0 error registering admin socket command: (17) File exists
Feb 23 09:48:08 np0005626463.localdomain ceph-mgr[288036]: [rbd_support INFO root] load_schedules: vms, start_after=
Feb 23 09:48:08 np0005626463.localdomain ceph-mgr[288036]: [rbd_support INFO root] load_schedules: volumes, start_after=
Feb 23 09:48:08 np0005626463.localdomain ceph-mgr[288036]: [rbd_support INFO root] load_schedules: images, start_after=
Feb 23 09:48:08 np0005626463.localdomain ceph-mgr[288036]: [rbd_support INFO root] load_schedules: backups, start_after=
Feb 23 09:48:08 np0005626463.localdomain ceph-mgr[288036]: [rbd_support INFO root] TrashPurgeScheduleHandler: starting
Feb 23 09:48:08 np0005626463.localdomain ceph-mgr[288036]: [rbd_support INFO root] setup complete
Feb 23 09:48:08 np0005626463.localdomain sshd[300031]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:48:08 np0005626463.localdomain sshd[300031]: Accepted publickey for ceph-admin from 192.168.122.106 port 43074 ssh2: RSA SHA256:Xa/VMkXtB77nHz5d33Gpc1SPjvrShbbTtqHwAtI7vJo
Feb 23 09:48:08 np0005626463.localdomain systemd-logind[759]: New session 69 of user ceph-admin.
Feb 23 09:48:08 np0005626463.localdomain systemd[1]: Started Session 69 of User ceph-admin.
Feb 23 09:48:08 np0005626463.localdomain sshd[300031]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Feb 23 09:48:08 np0005626463.localdomain sudo[300035]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 09:48:08 np0005626463.localdomain sudo[300035]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:48:08 np0005626463.localdomain sudo[300035]: pam_unix(sudo:session): session closed for user root
Feb 23 09:48:08 np0005626463.localdomain sudo[300053]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Feb 23 09:48:08 np0005626463.localdomain sudo[300053]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:48:08 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:48:08 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:48:08 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:48:08 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch
Feb 23 09:48:08 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:48:08 np0005626463.localdomain ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "mon metadata", "id": "np0005626465"} : dispatch
Feb 23 09:48:08 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.200:0/3009308721' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch
Feb 23 09:48:08 np0005626463.localdomain ceph-mon[294160]: from='client.? ' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch
Feb 23 09:48:08 np0005626463.localdomain ceph-mon[294160]: Activating manager daemon np0005626463.wtksup
Feb 23 09:48:08 np0005626463.localdomain ceph-mon[294160]: osdmap e84: 6 total, 6 up, 6 in
Feb 23 09:48:08 np0005626463.localdomain ceph-mon[294160]: from='client.? ' entity='client.admin' cmd='[{"prefix": "mgr fail"}]': finished
Feb 23 09:48:08 np0005626463.localdomain ceph-mon[294160]: mgrmap e28: np0005626463.wtksup(active, starting, since 0.0487667s), standbys: np0005626466.nisqfq, np0005626461.lrfquh, np0005626460.fyrady
Feb 23 09:48:08 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626461"} : dispatch
Feb 23 09:48:08 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626463"} : dispatch
Feb 23 09:48:08 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626466"} : dispatch
Feb 23 09:48:08 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mds metadata", "who": "mds.np0005626465.drvnoy"} : dispatch
Feb 23 09:48:08 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mds metadata", "who": "mds.np0005626466.vaywlp"} : dispatch
Feb 23 09:48:08 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mds metadata", "who": "mds.np0005626463.qcthuc"} : dispatch
Feb 23 09:48:08 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mgr metadata", "who": "np0005626463.wtksup", "id": "np0005626463.wtksup"} : dispatch
Feb 23 09:48:08 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mgr metadata", "who": "np0005626466.nisqfq", "id": "np0005626466.nisqfq"} : dispatch
Feb 23 09:48:08 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mgr metadata", "who": "np0005626461.lrfquh", "id": "np0005626461.lrfquh"} : dispatch
Feb 23 09:48:08 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mgr metadata", "who": "np0005626460.fyrady", "id": "np0005626460.fyrady"} : dispatch
Feb 23 09:48:08 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "osd metadata", "id": 0} : dispatch
Feb 23 09:48:08 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "osd metadata", "id": 1} : dispatch
Feb 23 09:48:08 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "osd metadata", "id": 2} : dispatch
Feb 23 09:48:08 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "osd metadata", "id": 3} : dispatch
Feb 23 09:48:08 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "osd metadata", "id": 4} : dispatch
Feb 23 09:48:08 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "osd metadata", "id": 5} : dispatch
Feb 23 09:48:08 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mds metadata"} : dispatch
Feb 23 09:48:08 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "osd metadata"} : dispatch
Feb 23 09:48:08 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata"} : dispatch
Feb 23 09:48:08 np0005626463.localdomain ceph-mon[294160]: Manager daemon np0005626463.wtksup is now available
Feb 23 09:48:08 np0005626463.localdomain ceph-mon[294160]: removing stray HostCache host record np0005626460.localdomain.devices.0
Feb 23 09:48:08 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005626460.localdomain.devices.0"} : dispatch
Feb 23 09:48:08 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005626460.localdomain.devices.0"} : dispatch
Feb 23 09:48:08 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005626460.localdomain.devices.0"}]': finished
Feb 23 09:48:08 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005626460.localdomain.devices.0"} : dispatch
Feb 23 09:48:08 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005626460.localdomain.devices.0"} : dispatch
Feb 23 09:48:08 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005626460.localdomain.devices.0"}]': finished
Feb 23 09:48:08 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005626463.wtksup/mirror_snapshot_schedule"} : dispatch
Feb 23 09:48:08 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005626463.wtksup/mirror_snapshot_schedule"} : dispatch
Feb 23 09:48:08 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005626463.wtksup/trash_purge_schedule"} : dispatch
Feb 23 09:48:08 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005626463.wtksup/trash_purge_schedule"} : dispatch
Feb 23 09:48:08 np0005626463.localdomain ceph-mgr[288036]: log_channel(cluster) log [DBG] : pgmap v3: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:48:08 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11  adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Feb 23 09:48:08 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11  adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Feb 23 09:48:09 np0005626463.localdomain podman[300142]: 2026-02-23 09:48:09.282972863 +0000 UTC m=+0.118534229 container exec fdf07215f0388d0ebc44f1f3744080ba594441e647c300d0dade62ff5beba234 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-crash-np0005626463, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, release=1770267347, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, io.openshift.expose-services=, org.opencontainers.image.created=2026-02-09T10:25:24Z, name=rhceph, ceph=True, io.buildah.version=1.42.2, RELEASE=main, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, build-date=2026-02-09T10:25:24Z, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container)
Feb 23 09:48:09 np0005626463.localdomain podman[242954]: time="2026-02-23T09:48:09Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 23 09:48:09 np0005626463.localdomain podman[300142]: 2026-02-23 09:48:09.445797798 +0000 UTC m=+0.281359214 container exec_died fdf07215f0388d0ebc44f1f3744080ba594441e647c300d0dade62ff5beba234 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-crash-np0005626463, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, org.opencontainers.image.created=2026-02-09T10:25:24Z, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., vcs-type=git, release=1770267347, version=7, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, name=rhceph, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.42.2, RELEASE=main, build-date=2026-02-09T10:25:24Z, distribution-scope=public, architecture=x86_64, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_CLEAN=True, ceph=True)
Feb 23 09:48:09 np0005626463.localdomain podman[242954]: @ - - [23/Feb/2026:09:48:09 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155258 "" "Go-http-client/1.1"
Feb 23 09:48:09 np0005626463.localdomain ceph-mgr[288036]: [cephadm INFO cherrypy.error] [23/Feb/2026:09:48:09] ENGINE Bus STARTING
Feb 23 09:48:09 np0005626463.localdomain ceph-mgr[288036]: log_channel(cephadm) log [INF] : [23/Feb/2026:09:48:09] ENGINE Bus STARTING
Feb 23 09:48:09 np0005626463.localdomain podman[242954]: @ - - [23/Feb/2026:09:48:09 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18267 "" "Go-http-client/1.1"
Feb 23 09:48:09 np0005626463.localdomain ceph-mgr[288036]: [cephadm INFO cherrypy.error] [23/Feb/2026:09:48:09] ENGINE Serving on http://172.18.0.106:8765
Feb 23 09:48:09 np0005626463.localdomain ceph-mgr[288036]: log_channel(cephadm) log [INF] : [23/Feb/2026:09:48:09] ENGINE Serving on http://172.18.0.106:8765
Feb 23 09:48:09 np0005626463.localdomain ceph-mgr[288036]: [cephadm INFO cherrypy.error] [23/Feb/2026:09:48:09] ENGINE Serving on https://172.18.0.106:7150
Feb 23 09:48:09 np0005626463.localdomain ceph-mgr[288036]: log_channel(cephadm) log [INF] : [23/Feb/2026:09:48:09] ENGINE Serving on https://172.18.0.106:7150
Feb 23 09:48:09 np0005626463.localdomain ceph-mgr[288036]: [cephadm INFO cherrypy.error] [23/Feb/2026:09:48:09] ENGINE Bus STARTED
Feb 23 09:48:09 np0005626463.localdomain ceph-mgr[288036]: log_channel(cephadm) log [INF] : [23/Feb/2026:09:48:09] ENGINE Bus STARTED
Feb 23 09:48:09 np0005626463.localdomain ceph-mgr[288036]: [cephadm INFO cherrypy.error] [23/Feb/2026:09:48:09] ENGINE Client ('172.18.0.106', 50730) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Feb 23 09:48:09 np0005626463.localdomain ceph-mgr[288036]: log_channel(cephadm) log [INF] : [23/Feb/2026:09:48:09] ENGINE Client ('172.18.0.106', 50730) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Feb 23 09:48:09 np0005626463.localdomain ceph-mgr[288036]: mgr.server handle_open ignoring open from mon.np0005626465 172.18.0.107:0/438899899; not ready for session (expect reconnect)
Feb 23 09:48:09 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix": "mon metadata", "id": "np0005626465"} v 0)
Feb 23 09:48:09 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626465"} : dispatch
Feb 23 09:48:09 np0005626463.localdomain ceph-mgr[288036]: mgr finish mon failed to return metadata for mon.np0005626465: (2) No such file or directory
Feb 23 09:48:09 np0005626463.localdomain ceph-mon[294160]: mgrmap e29: np0005626463.wtksup(active, since 1.06357s), standbys: np0005626466.nisqfq, np0005626461.lrfquh, np0005626460.fyrady
Feb 23 09:48:09 np0005626463.localdomain ceph-mon[294160]: pgmap v3: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:48:09 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626465"} : dispatch
Feb 23 09:48:09 np0005626463.localdomain ceph-mgr[288036]: log_channel(cluster) log [DBG] : pgmap v4: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:48:09 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain.devices.0}] v 0)
Feb 23 09:48:09 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain}] v 0)
Feb 23 09:48:09 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.
Feb 23 09:48:10 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626461.localdomain.devices.0}] v 0)
Feb 23 09:48:10 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626461.localdomain}] v 0)
Feb 23 09:48:10 np0005626463.localdomain sudo[300053]: pam_unix(sudo:session): session closed for user root
Feb 23 09:48:10 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626463.localdomain.devices.0}] v 0)
Feb 23 09:48:10 np0005626463.localdomain podman[300273]: 2026-02-23 09:48:10.090341885 +0000 UTC m=+0.088519155 container health_status da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 23 09:48:10 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626463.localdomain}] v 0)
Feb 23 09:48:10 np0005626463.localdomain ceph-mgr[288036]: [devicehealth INFO root] Check health
Feb 23 09:48:10 np0005626463.localdomain podman[300273]: 2026-02-23 09:48:10.176166907 +0000 UTC m=+0.174344167 container exec_died da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 23 09:48:10 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626465.localdomain.devices.0}] v 0)
Feb 23 09:48:10 np0005626463.localdomain systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Deactivated successfully.
Feb 23 09:48:10 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626465.localdomain}] v 0)
Feb 23 09:48:10 np0005626463.localdomain sudo[300323]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 09:48:10 np0005626463.localdomain sudo[300323]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:48:10 np0005626463.localdomain sudo[300323]: pam_unix(sudo:session): session closed for user root
Feb 23 09:48:10 np0005626463.localdomain sudo[300341]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 23 09:48:10 np0005626463.localdomain sudo[300341]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:48:10 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:48:10.473 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:48:10 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:48:10.475 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:48:10 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:48:10.475 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 09:48:10 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:48:10.475 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:48:10 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:48:10.543 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:48:10 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:48:10.544 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:48:10 np0005626463.localdomain ceph-mgr[288036]: mgr.server handle_open ignoring open from mon.np0005626465 172.18.0.107:0/438899899; not ready for session (expect reconnect)
Feb 23 09:48:10 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix": "mon metadata", "id": "np0005626465"} v 0)
Feb 23 09:48:10 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626465"} : dispatch
Feb 23 09:48:10 np0005626463.localdomain ceph-mgr[288036]: mgr finish mon failed to return metadata for mon.np0005626465: (2) No such file or directory
Feb 23 09:48:10 np0005626463.localdomain sudo[300341]: pam_unix(sudo:session): session closed for user root
Feb 23 09:48:10 np0005626463.localdomain ceph-mon[294160]: [23/Feb/2026:09:48:09] ENGINE Bus STARTING
Feb 23 09:48:10 np0005626463.localdomain ceph-mon[294160]: [23/Feb/2026:09:48:09] ENGINE Serving on http://172.18.0.106:8765
Feb 23 09:48:10 np0005626463.localdomain ceph-mon[294160]: [23/Feb/2026:09:48:09] ENGINE Serving on https://172.18.0.106:7150
Feb 23 09:48:10 np0005626463.localdomain ceph-mon[294160]: [23/Feb/2026:09:48:09] ENGINE Bus STARTED
Feb 23 09:48:10 np0005626463.localdomain ceph-mon[294160]: [23/Feb/2026:09:48:09] ENGINE Client ('172.18.0.106', 50730) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Feb 23 09:48:10 np0005626463.localdomain ceph-mon[294160]: pgmap v4: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:48:10 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' 
Feb 23 09:48:10 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' 
Feb 23 09:48:10 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' 
Feb 23 09:48:10 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' 
Feb 23 09:48:10 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' 
Feb 23 09:48:10 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' 
Feb 23 09:48:10 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' 
Feb 23 09:48:10 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' 
Feb 23 09:48:10 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.108:0/155984398' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 09:48:10 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626465"} : dispatch
Feb 23 09:48:10 np0005626463.localdomain ceph-mon[294160]: mgrmap e30: np0005626463.wtksup(active, since 3s), standbys: np0005626466.nisqfq, np0005626461.lrfquh, np0005626460.fyrady
Feb 23 09:48:10 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11  adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Feb 23 09:48:11 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11  adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Feb 23 09:48:11 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon).osd e84 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:48:11 np0005626463.localdomain sudo[300392]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 09:48:11 np0005626463.localdomain sudo[300392]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:48:11 np0005626463.localdomain sudo[300392]: pam_unix(sudo:session): session closed for user root
Feb 23 09:48:11 np0005626463.localdomain sudo[300410]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 list-networks
Feb 23 09:48:11 np0005626463.localdomain sudo[300410]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:48:11 np0005626463.localdomain sshd[300428]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:48:11 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain.devices.0}] v 0)
Feb 23 09:48:11 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain}] v 0)
Feb 23 09:48:11 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} v 0)
Feb 23 09:48:11 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Feb 23 09:48:11 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} v 0)
Feb 23 09:48:11 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Feb 23 09:48:11 np0005626463.localdomain ceph-mgr[288036]: [cephadm INFO root] Adjusting osd_memory_target on np0005626466.localdomain to 836.6M
Feb 23 09:48:11 np0005626463.localdomain ceph-mgr[288036]: log_channel(cephadm) log [INF] : Adjusting osd_memory_target on np0005626466.localdomain to 836.6M
Feb 23 09:48:11 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command([{prefix=config set, name=osd_memory_target}] v 0)
Feb 23 09:48:11 np0005626463.localdomain ceph-mgr[288036]: [cephadm WARNING cephadm.serve] Unable to set osd_memory_target on np0005626466.localdomain to 877243801: error parsing value: Value '877243801' is below minimum 939524096
Feb 23 09:48:11 np0005626463.localdomain ceph-mgr[288036]: log_channel(cephadm) log [WRN] : Unable to set osd_memory_target on np0005626466.localdomain to 877243801: error parsing value: Value '877243801' is below minimum 939524096
Feb 23 09:48:11 np0005626463.localdomain sudo[300410]: pam_unix(sudo:session): session closed for user root
Feb 23 09:48:11 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626463.localdomain.devices.0}] v 0)
Feb 23 09:48:11 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626463.localdomain}] v 0)
Feb 23 09:48:11 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} v 0)
Feb 23 09:48:11 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Feb 23 09:48:11 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} v 0)
Feb 23 09:48:11 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Feb 23 09:48:11 np0005626463.localdomain ceph-mgr[288036]: [cephadm INFO root] Adjusting osd_memory_target on np0005626463.localdomain to 836.6M
Feb 23 09:48:11 np0005626463.localdomain ceph-mgr[288036]: log_channel(cephadm) log [INF] : Adjusting osd_memory_target on np0005626463.localdomain to 836.6M
Feb 23 09:48:11 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command([{prefix=config set, name=osd_memory_target}] v 0)
Feb 23 09:48:11 np0005626463.localdomain ceph-mgr[288036]: [cephadm WARNING cephadm.serve] Unable to set osd_memory_target on np0005626463.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Feb 23 09:48:11 np0005626463.localdomain ceph-mgr[288036]: log_channel(cephadm) log [WRN] : Unable to set osd_memory_target on np0005626463.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Feb 23 09:48:11 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626465.localdomain.devices.0}] v 0)
Feb 23 09:48:11 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626461.localdomain.devices.0}] v 0)
Feb 23 09:48:11 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626465.localdomain}] v 0)
Feb 23 09:48:11 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626461.localdomain}] v 0)
Feb 23 09:48:11 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} v 0)
Feb 23 09:48:11 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Feb 23 09:48:11 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix": "config rm", "who": "osd/host:np0005626461", "name": "osd_memory_target"} v 0)
Feb 23 09:48:11 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config rm", "who": "osd/host:np0005626461", "name": "osd_memory_target"} : dispatch
Feb 23 09:48:11 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} v 0)
Feb 23 09:48:11 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Feb 23 09:48:11 np0005626463.localdomain ceph-mgr[288036]: [cephadm INFO root] Adjusting osd_memory_target on np0005626465.localdomain to 836.6M
Feb 23 09:48:11 np0005626463.localdomain ceph-mgr[288036]: log_channel(cephadm) log [INF] : Adjusting osd_memory_target on np0005626465.localdomain to 836.6M
Feb 23 09:48:11 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command([{prefix=config set, name=osd_memory_target}] v 0)
Feb 23 09:48:11 np0005626463.localdomain ceph-mgr[288036]: [cephadm WARNING cephadm.serve] Unable to set osd_memory_target on np0005626465.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Feb 23 09:48:11 np0005626463.localdomain ceph-mgr[288036]: log_channel(cephadm) log [WRN] : Unable to set osd_memory_target on np0005626465.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Feb 23 09:48:11 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 23 09:48:11 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:48:11 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0)
Feb 23 09:48:11 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 23 09:48:11 np0005626463.localdomain ceph-mgr[288036]: [cephadm INFO cephadm.serve] Updating np0005626461.localdomain:/etc/ceph/ceph.conf
Feb 23 09:48:11 np0005626463.localdomain ceph-mgr[288036]: log_channel(cephadm) log [INF] : Updating np0005626461.localdomain:/etc/ceph/ceph.conf
Feb 23 09:48:11 np0005626463.localdomain ceph-mgr[288036]: [cephadm INFO cephadm.serve] Updating np0005626463.localdomain:/etc/ceph/ceph.conf
Feb 23 09:48:11 np0005626463.localdomain ceph-mgr[288036]: [cephadm INFO cephadm.serve] Updating np0005626465.localdomain:/etc/ceph/ceph.conf
Feb 23 09:48:11 np0005626463.localdomain ceph-mgr[288036]: log_channel(cephadm) log [INF] : Updating np0005626463.localdomain:/etc/ceph/ceph.conf
Feb 23 09:48:11 np0005626463.localdomain ceph-mgr[288036]: [cephadm INFO cephadm.serve] Updating np0005626466.localdomain:/etc/ceph/ceph.conf
Feb 23 09:48:11 np0005626463.localdomain ceph-mgr[288036]: log_channel(cephadm) log [INF] : Updating np0005626465.localdomain:/etc/ceph/ceph.conf
Feb 23 09:48:11 np0005626463.localdomain ceph-mgr[288036]: log_channel(cephadm) log [INF] : Updating np0005626466.localdomain:/etc/ceph/ceph.conf
Feb 23 09:48:11 np0005626463.localdomain ceph-mgr[288036]: mgr.server handle_open ignoring open from mon.np0005626465 172.18.0.107:0/438899899; not ready for session (expect reconnect)
Feb 23 09:48:11 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix": "mon metadata", "id": "np0005626465"} v 0)
Feb 23 09:48:11 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626465"} : dispatch
Feb 23 09:48:11 np0005626463.localdomain ceph-mgr[288036]: mgr finish mon failed to return metadata for mon.np0005626465: (2) No such file or directory
Feb 23 09:48:11 np0005626463.localdomain sudo[300450]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Feb 23 09:48:11 np0005626463.localdomain sudo[300450]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:48:11 np0005626463.localdomain sudo[300450]: pam_unix(sudo:session): session closed for user root
Feb 23 09:48:11 np0005626463.localdomain ceph-mgr[288036]: log_channel(cluster) log [DBG] : pgmap v5: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:48:11 np0005626463.localdomain sudo[300468]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/etc/ceph
Feb 23 09:48:11 np0005626463.localdomain sudo[300468]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:48:11 np0005626463.localdomain sudo[300468]: pam_unix(sudo:session): session closed for user root
Feb 23 09:48:12 np0005626463.localdomain sudo[300486]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/etc/ceph/ceph.conf.new
Feb 23 09:48:12 np0005626463.localdomain sudo[300486]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:48:12 np0005626463.localdomain sudo[300486]: pam_unix(sudo:session): session closed for user root
Feb 23 09:48:12 np0005626463.localdomain ceph-mon[294160]: mgrmap e31: np0005626463.wtksup(active, since 3s), standbys: np0005626466.nisqfq, np0005626461.lrfquh
Feb 23 09:48:12 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.108:0/2718338230' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 09:48:12 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' 
Feb 23 09:48:12 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' 
Feb 23 09:48:12 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Feb 23 09:48:12 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Feb 23 09:48:12 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Feb 23 09:48:12 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Feb 23 09:48:12 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' 
Feb 23 09:48:12 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' 
Feb 23 09:48:12 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Feb 23 09:48:12 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Feb 23 09:48:12 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Feb 23 09:48:12 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Feb 23 09:48:12 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' 
Feb 23 09:48:12 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' 
Feb 23 09:48:12 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' 
Feb 23 09:48:12 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Feb 23 09:48:12 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Feb 23 09:48:12 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' 
Feb 23 09:48:12 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config rm", "who": "osd/host:np0005626461", "name": "osd_memory_target"} : dispatch
Feb 23 09:48:12 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Feb 23 09:48:12 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config rm", "who": "osd/host:np0005626461", "name": "osd_memory_target"} : dispatch
Feb 23 09:48:12 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Feb 23 09:48:12 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:48:12 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 23 09:48:12 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626465"} : dispatch
Feb 23 09:48:12 np0005626463.localdomain sudo[300504]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46
Feb 23 09:48:12 np0005626463.localdomain sudo[300504]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:48:12 np0005626463.localdomain sudo[300504]: pam_unix(sudo:session): session closed for user root
Feb 23 09:48:12 np0005626463.localdomain sudo[300522]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/etc/ceph/ceph.conf.new
Feb 23 09:48:12 np0005626463.localdomain sudo[300522]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:48:12 np0005626463.localdomain sudo[300522]: pam_unix(sudo:session): session closed for user root
Feb 23 09:48:12 np0005626463.localdomain sudo[300556]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/etc/ceph/ceph.conf.new
Feb 23 09:48:12 np0005626463.localdomain sudo[300556]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:48:12 np0005626463.localdomain sudo[300556]: pam_unix(sudo:session): session closed for user root
Feb 23 09:48:12 np0005626463.localdomain sudo[300574]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/etc/ceph/ceph.conf.new
Feb 23 09:48:12 np0005626463.localdomain sudo[300574]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:48:12 np0005626463.localdomain sudo[300574]: pam_unix(sudo:session): session closed for user root
Feb 23 09:48:12 np0005626463.localdomain sshd[300595]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:48:12 np0005626463.localdomain ceph-mgr[288036]: [cephadm INFO cephadm.serve] Updating np0005626465.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf
Feb 23 09:48:12 np0005626463.localdomain ceph-mgr[288036]: log_channel(cephadm) log [INF] : Updating np0005626465.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf
Feb 23 09:48:12 np0005626463.localdomain sshd[300428]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:48:12 np0005626463.localdomain ceph-mgr[288036]: [cephadm INFO cephadm.serve] Updating np0005626466.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf
Feb 23 09:48:12 np0005626463.localdomain ceph-mgr[288036]: log_channel(cephadm) log [INF] : Updating np0005626466.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf
Feb 23 09:48:12 np0005626463.localdomain ceph-mgr[288036]: [cephadm INFO cephadm.serve] Updating np0005626461.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf
Feb 23 09:48:12 np0005626463.localdomain ceph-mgr[288036]: log_channel(cephadm) log [INF] : Updating np0005626461.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf
Feb 23 09:48:12 np0005626463.localdomain sudo[300592]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Feb 23 09:48:12 np0005626463.localdomain sudo[300592]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:48:12 np0005626463.localdomain sudo[300592]: pam_unix(sudo:session): session closed for user root
Feb 23 09:48:12 np0005626463.localdomain ceph-mgr[288036]: [cephadm INFO cephadm.serve] Updating np0005626463.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf
Feb 23 09:48:12 np0005626463.localdomain ceph-mgr[288036]: log_channel(cephadm) log [INF] : Updating np0005626463.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf
Feb 23 09:48:12 np0005626463.localdomain sudo[300612]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config
Feb 23 09:48:12 np0005626463.localdomain sudo[300612]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:48:12 np0005626463.localdomain sudo[300612]: pam_unix(sudo:session): session closed for user root
Feb 23 09:48:12 np0005626463.localdomain sudo[300630]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config
Feb 23 09:48:12 np0005626463.localdomain sudo[300630]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:48:12 np0005626463.localdomain sudo[300630]: pam_unix(sudo:session): session closed for user root
Feb 23 09:48:12 np0005626463.localdomain sudo[300648]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf.new
Feb 23 09:48:12 np0005626463.localdomain sudo[300648]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:48:12 np0005626463.localdomain sudo[300648]: pam_unix(sudo:session): session closed for user root
Feb 23 09:48:12 np0005626463.localdomain ceph-mgr[288036]: mgr.server handle_open ignoring open from mgr.np0005626465.hlpkwo 172.18.0.107:0/3454997775; not ready for session (expect reconnect)
Feb 23 09:48:12 np0005626463.localdomain sudo[300666]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46
Feb 23 09:48:12 np0005626463.localdomain sudo[300666]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:48:12 np0005626463.localdomain sudo[300666]: pam_unix(sudo:session): session closed for user root
Feb 23 09:48:12 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix": "mgr metadata", "who": "np0005626465.hlpkwo", "id": "np0005626465.hlpkwo"} v 0)
Feb 23 09:48:12 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mgr metadata", "who": "np0005626465.hlpkwo", "id": "np0005626465.hlpkwo"} : dispatch
Feb 23 09:48:12 np0005626463.localdomain sudo[300684]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf.new
Feb 23 09:48:12 np0005626463.localdomain sudo[300684]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:48:12 np0005626463.localdomain ceph-mgr[288036]: mgr.server handle_open ignoring open from mon.np0005626465 172.18.0.107:0/438899899; not ready for session (expect reconnect)
Feb 23 09:48:12 np0005626463.localdomain sudo[300684]: pam_unix(sudo:session): session closed for user root
Feb 23 09:48:12 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix": "mon metadata", "id": "np0005626465"} v 0)
Feb 23 09:48:12 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626465"} : dispatch
Feb 23 09:48:12 np0005626463.localdomain ceph-mgr[288036]: mgr finish mon failed to return metadata for mon.np0005626465: (2) No such file or directory
Feb 23 09:48:12 np0005626463.localdomain sudo[300718]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf.new
Feb 23 09:48:12 np0005626463.localdomain sudo[300718]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:48:12 np0005626463.localdomain sudo[300718]: pam_unix(sudo:session): session closed for user root
Feb 23 09:48:12 np0005626463.localdomain sudo[300736]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf.new
Feb 23 09:48:12 np0005626463.localdomain sudo[300736]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:48:12 np0005626463.localdomain sudo[300736]: pam_unix(sudo:session): session closed for user root
Feb 23 09:48:13 np0005626463.localdomain sudo[300754]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf.new /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf
Feb 23 09:48:13 np0005626463.localdomain sudo[300754]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:48:13 np0005626463.localdomain sudo[300754]: pam_unix(sudo:session): session closed for user root
Feb 23 09:48:13 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11  adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Feb 23 09:48:13 np0005626463.localdomain ceph-mgr[288036]: [cephadm INFO cephadm.serve] Updating np0005626463.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 23 09:48:13 np0005626463.localdomain ceph-mgr[288036]: log_channel(cephadm) log [INF] : Updating np0005626463.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 23 09:48:13 np0005626463.localdomain ceph-mgr[288036]: [cephadm INFO cephadm.serve] Updating np0005626461.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 23 09:48:13 np0005626463.localdomain ceph-mgr[288036]: log_channel(cephadm) log [INF] : Updating np0005626461.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 23 09:48:13 np0005626463.localdomain ceph-mon[294160]: Adjusting osd_memory_target on np0005626466.localdomain to 836.6M
Feb 23 09:48:13 np0005626463.localdomain ceph-mon[294160]: Unable to set osd_memory_target on np0005626466.localdomain to 877243801: error parsing value: Value '877243801' is below minimum 939524096
Feb 23 09:48:13 np0005626463.localdomain ceph-mon[294160]: Adjusting osd_memory_target on np0005626463.localdomain to 836.6M
Feb 23 09:48:13 np0005626463.localdomain ceph-mon[294160]: Unable to set osd_memory_target on np0005626463.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Feb 23 09:48:13 np0005626463.localdomain ceph-mon[294160]: Adjusting osd_memory_target on np0005626465.localdomain to 836.6M
Feb 23 09:48:13 np0005626463.localdomain ceph-mon[294160]: Unable to set osd_memory_target on np0005626465.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Feb 23 09:48:13 np0005626463.localdomain ceph-mon[294160]: Updating np0005626461.localdomain:/etc/ceph/ceph.conf
Feb 23 09:48:13 np0005626463.localdomain ceph-mon[294160]: Updating np0005626463.localdomain:/etc/ceph/ceph.conf
Feb 23 09:48:13 np0005626463.localdomain ceph-mon[294160]: Updating np0005626465.localdomain:/etc/ceph/ceph.conf
Feb 23 09:48:13 np0005626463.localdomain ceph-mon[294160]: Updating np0005626466.localdomain:/etc/ceph/ceph.conf
Feb 23 09:48:13 np0005626463.localdomain ceph-mon[294160]: pgmap v5: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:48:13 np0005626463.localdomain ceph-mon[294160]: Standby manager daemon np0005626465.hlpkwo started
Feb 23 09:48:13 np0005626463.localdomain ceph-mon[294160]: mgrmap e32: np0005626463.wtksup(active, since 4s), standbys: np0005626466.nisqfq, np0005626465.hlpkwo, np0005626461.lrfquh
Feb 23 09:48:13 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mgr metadata", "who": "np0005626465.hlpkwo", "id": "np0005626465.hlpkwo"} : dispatch
Feb 23 09:48:13 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626465"} : dispatch
Feb 23 09:48:13 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11  adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Feb 23 09:48:13 np0005626463.localdomain ceph-mgr[288036]: [cephadm INFO cephadm.serve] Updating np0005626465.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 23 09:48:13 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:48:13.054 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:48:13 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:48:13.054 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 23 09:48:13 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:48:13.055 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 23 09:48:13 np0005626463.localdomain ceph-mgr[288036]: log_channel(cephadm) log [INF] : Updating np0005626465.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 23 09:48:13 np0005626463.localdomain ceph-mgr[288036]: [cephadm INFO cephadm.serve] Updating np0005626466.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 23 09:48:13 np0005626463.localdomain ceph-mgr[288036]: log_channel(cephadm) log [INF] : Updating np0005626466.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 23 09:48:13 np0005626463.localdomain sudo[300772]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Feb 23 09:48:13 np0005626463.localdomain sudo[300772]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:48:13 np0005626463.localdomain sudo[300772]: pam_unix(sudo:session): session closed for user root
Feb 23 09:48:13 np0005626463.localdomain sudo[300790]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/etc/ceph
Feb 23 09:48:13 np0005626463.localdomain sudo[300790]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:48:13 np0005626463.localdomain sudo[300790]: pam_unix(sudo:session): session closed for user root
Feb 23 09:48:13 np0005626463.localdomain sudo[300808]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/etc/ceph/ceph.client.admin.keyring.new
Feb 23 09:48:13 np0005626463.localdomain sudo[300808]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:48:13 np0005626463.localdomain sudo[300808]: pam_unix(sudo:session): session closed for user root
Feb 23 09:48:13 np0005626463.localdomain sudo[300826]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46
Feb 23 09:48:13 np0005626463.localdomain sudo[300826]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:48:13 np0005626463.localdomain sudo[300826]: pam_unix(sudo:session): session closed for user root
Feb 23 09:48:13 np0005626463.localdomain sudo[300844]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/etc/ceph/ceph.client.admin.keyring.new
Feb 23 09:48:13 np0005626463.localdomain sudo[300844]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:48:13 np0005626463.localdomain sudo[300844]: pam_unix(sudo:session): session closed for user root
Feb 23 09:48:13 np0005626463.localdomain openstack_network_exporter[245358]: ERROR   09:48:13 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 23 09:48:13 np0005626463.localdomain openstack_network_exporter[245358]: 
Feb 23 09:48:13 np0005626463.localdomain openstack_network_exporter[245358]: ERROR   09:48:13 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 23 09:48:13 np0005626463.localdomain openstack_network_exporter[245358]: 
Feb 23 09:48:13 np0005626463.localdomain sudo[300878]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/etc/ceph/ceph.client.admin.keyring.new
Feb 23 09:48:13 np0005626463.localdomain sudo[300878]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:48:13 np0005626463.localdomain sudo[300878]: pam_unix(sudo:session): session closed for user root
Feb 23 09:48:13 np0005626463.localdomain sudo[300896]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/etc/ceph/ceph.client.admin.keyring.new
Feb 23 09:48:13 np0005626463.localdomain sudo[300896]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:48:13 np0005626463.localdomain sudo[300896]: pam_unix(sudo:session): session closed for user root
Feb 23 09:48:13 np0005626463.localdomain sudo[300914]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/etc/ceph/ceph.client.admin.keyring.new /etc/ceph/ceph.client.admin.keyring
Feb 23 09:48:13 np0005626463.localdomain sudo[300914]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:48:13 np0005626463.localdomain sudo[300914]: pam_unix(sudo:session): session closed for user root
Feb 23 09:48:13 np0005626463.localdomain ceph-mgr[288036]: [cephadm INFO cephadm.serve] Updating np0005626463.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.client.admin.keyring
Feb 23 09:48:13 np0005626463.localdomain ceph-mgr[288036]: log_channel(cephadm) log [INF] : Updating np0005626463.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.client.admin.keyring
Feb 23 09:48:13 np0005626463.localdomain sudo[300932]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config
Feb 23 09:48:13 np0005626463.localdomain sudo[300932]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:48:13 np0005626463.localdomain ceph-mgr[288036]: [cephadm INFO cephadm.serve] Updating np0005626466.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.client.admin.keyring
Feb 23 09:48:13 np0005626463.localdomain ceph-mgr[288036]: log_channel(cephadm) log [INF] : Updating np0005626466.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.client.admin.keyring
Feb 23 09:48:13 np0005626463.localdomain sudo[300932]: pam_unix(sudo:session): session closed for user root
Feb 23 09:48:13 np0005626463.localdomain sudo[300950]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config
Feb 23 09:48:13 np0005626463.localdomain sudo[300950]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:48:13 np0005626463.localdomain sudo[300950]: pam_unix(sudo:session): session closed for user root
Feb 23 09:48:13 np0005626463.localdomain ceph-mgr[288036]: [cephadm INFO cephadm.serve] Updating np0005626461.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.client.admin.keyring
Feb 23 09:48:13 np0005626463.localdomain ceph-mgr[288036]: log_channel(cephadm) log [INF] : Updating np0005626461.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.client.admin.keyring
Feb 23 09:48:13 np0005626463.localdomain ceph-mgr[288036]: [cephadm INFO cephadm.serve] Updating np0005626465.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.client.admin.keyring
Feb 23 09:48:13 np0005626463.localdomain ceph-mgr[288036]: log_channel(cephadm) log [INF] : Updating np0005626465.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.client.admin.keyring
Feb 23 09:48:13 np0005626463.localdomain sudo[300968]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.client.admin.keyring.new
Feb 23 09:48:13 np0005626463.localdomain sudo[300968]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:48:13 np0005626463.localdomain sudo[300968]: pam_unix(sudo:session): session closed for user root
Feb 23 09:48:13 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:48:13.783 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 23 09:48:13 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:48:13.783 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquired lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 23 09:48:13 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:48:13.784 282211 DEBUG nova.network.neutron [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 23 09:48:13 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:48:13.784 282211 DEBUG nova.objects.instance [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c2a7d92b-952f-46a7-8a6a-3322a48fcf4b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 23 09:48:13 np0005626463.localdomain sudo[300986]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46
Feb 23 09:48:13 np0005626463.localdomain sudo[300986]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:48:13 np0005626463.localdomain ceph-mgr[288036]: mgr.server handle_open ignoring open from mon.np0005626465 172.18.0.107:0/438899899; not ready for session (expect reconnect)
Feb 23 09:48:13 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix": "mon metadata", "id": "np0005626465"} v 0)
Feb 23 09:48:13 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626465"} : dispatch
Feb 23 09:48:13 np0005626463.localdomain ceph-mgr[288036]: mgr finish mon failed to return metadata for mon.np0005626465: (2) No such file or directory
Feb 23 09:48:13 np0005626463.localdomain sudo[300986]: pam_unix(sudo:session): session closed for user root
Feb 23 09:48:13 np0005626463.localdomain sudo[301004]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.client.admin.keyring.new
Feb 23 09:48:13 np0005626463.localdomain sudo[301004]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:48:13 np0005626463.localdomain sudo[301004]: pam_unix(sudo:session): session closed for user root
Feb 23 09:48:13 np0005626463.localdomain ceph-mgr[288036]: log_channel(cluster) log [DBG] : pgmap v6: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:48:13 np0005626463.localdomain sudo[301038]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.client.admin.keyring.new
Feb 23 09:48:13 np0005626463.localdomain sudo[301038]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:48:13 np0005626463.localdomain sudo[301038]: pam_unix(sudo:session): session closed for user root
Feb 23 09:48:14 np0005626463.localdomain sudo[301056]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.client.admin.keyring.new
Feb 23 09:48:14 np0005626463.localdomain sudo[301056]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:48:14 np0005626463.localdomain sudo[301056]: pam_unix(sudo:session): session closed for user root
Feb 23 09:48:14 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.
Feb 23 09:48:14 np0005626463.localdomain ceph-mon[294160]: Updating np0005626465.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf
Feb 23 09:48:14 np0005626463.localdomain ceph-mon[294160]: Updating np0005626466.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf
Feb 23 09:48:14 np0005626463.localdomain ceph-mon[294160]: Updating np0005626461.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf
Feb 23 09:48:14 np0005626463.localdomain ceph-mon[294160]: Updating np0005626463.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf
Feb 23 09:48:14 np0005626463.localdomain ceph-mon[294160]: Updating np0005626463.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 23 09:48:14 np0005626463.localdomain ceph-mon[294160]: Updating np0005626461.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 23 09:48:14 np0005626463.localdomain ceph-mon[294160]: Updating np0005626465.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 23 09:48:14 np0005626463.localdomain ceph-mon[294160]: Updating np0005626466.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 23 09:48:14 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.107:0/2551777660' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 09:48:14 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626465"} : dispatch
Feb 23 09:48:14 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:48:14.139 282211 DEBUG nova.network.neutron [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Updating instance_info_cache with network_info: [{"id": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "address": "fa:16:3e:a0:9d:00", "network": {"id": "9da5b53d-3184-450f-9a5b-bdba1a6c9f6d", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "37b8098efb0d4ecc90b451a2db0e966f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa27e5011-20", "ovs_interfaceid": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 23 09:48:14 np0005626463.localdomain sudo[301075]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.client.admin.keyring.new /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.client.admin.keyring
Feb 23 09:48:14 np0005626463.localdomain sudo[301075]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:48:14 np0005626463.localdomain sudo[301075]: pam_unix(sudo:session): session closed for user root
Feb 23 09:48:14 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626463.localdomain.devices.0}] v 0)
Feb 23 09:48:14 np0005626463.localdomain podman[301074]: 2026-02-23 09:48:14.176005792 +0000 UTC m=+0.098759058 container health_status 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, version=9.7, io.buildah.version=1.33.7, architecture=x86_64, container_name=openstack_network_exporter, managed_by=edpm_ansible, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, release=1770267347, distribution-scope=public, build-date=2026-02-05T04:57:10Z, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Red Hat, Inc., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., name=ubi9/ubi-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, org.opencontainers.image.created=2026-02-05T04:57:10Z, io.openshift.expose-services=)
Feb 23 09:48:14 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:48:14.177 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Releasing lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 23 09:48:14 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:48:14.178 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 23 09:48:14 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:48:14.178 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:48:14 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626463.localdomain}] v 0)
Feb 23 09:48:14 np0005626463.localdomain podman[301074]: 2026-02-23 09:48:14.191357178 +0000 UTC m=+0.114110484 container exec_died 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, distribution-scope=public, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, config_id=openstack_network_exporter, vendor=Red Hat, Inc., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.7, build-date=2026-02-05T04:57:10Z, 
release=1770267347, maintainer=Red Hat, Inc., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, org.opencontainers.image.created=2026-02-05T04:57:10Z, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, name=ubi9/ubi-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=)
Feb 23 09:48:14 np0005626463.localdomain systemd[1]: 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.service: Deactivated successfully.
Feb 23 09:48:14 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain.devices.0}] v 0)
Feb 23 09:48:14 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain}] v 0)
Feb 23 09:48:14 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626465.localdomain.devices.0}] v 0)
Feb 23 09:48:14 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626465.localdomain}] v 0)
Feb 23 09:48:14 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626461.localdomain.devices.0}] v 0)
Feb 23 09:48:14 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626461.localdomain}] v 0)
Feb 23 09:48:14 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Feb 23 09:48:14 np0005626463.localdomain ceph-mgr[288036]: [progress INFO root] update: starting ev c17efa0e-f80d-41ba-9f40-152245f118b1 (Updating node-proxy deployment (+4 -> 4))
Feb 23 09:48:14 np0005626463.localdomain ceph-mgr[288036]: [progress INFO root] complete: finished ev c17efa0e-f80d-41ba-9f40-152245f118b1 (Updating node-proxy deployment (+4 -> 4))
Feb 23 09:48:14 np0005626463.localdomain ceph-mgr[288036]: [progress INFO root] Completed event c17efa0e-f80d-41ba-9f40-152245f118b1 (Updating node-proxy deployment (+4 -> 4)) in 0 seconds
Feb 23 09:48:14 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0)
Feb 23 09:48:14 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 23 09:48:14 np0005626463.localdomain sudo[301112]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 09:48:14 np0005626463.localdomain sudo[301112]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:48:14 np0005626463.localdomain sudo[301112]: pam_unix(sudo:session): session closed for user root
Feb 23 09:48:14 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix": "auth get", "entity": "osd.1"} v 0)
Feb 23 09:48:14 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch
Feb 23 09:48:14 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 23 09:48:14 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:48:14 np0005626463.localdomain ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.1 on np0005626466.localdomain
Feb 23 09:48:14 np0005626463.localdomain ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.1 on np0005626466.localdomain
Feb 23 09:48:14 np0005626463.localdomain ceph-mgr[288036]: mgr.server handle_open ignoring open from mon.np0005626465 172.18.0.107:0/438899899; not ready for session (expect reconnect)
Feb 23 09:48:14 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix": "mon metadata", "id": "np0005626465"} v 0)
Feb 23 09:48:14 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626465"} : dispatch
Feb 23 09:48:14 np0005626463.localdomain ceph-mgr[288036]: mgr finish mon failed to return metadata for mon.np0005626465: (2) No such file or directory
Feb 23 09:48:14 np0005626463.localdomain sshd[300595]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:48:15 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11  adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Feb 23 09:48:15 np0005626463.localdomain ceph-mon[294160]: Updating np0005626463.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.client.admin.keyring
Feb 23 09:48:15 np0005626463.localdomain ceph-mon[294160]: Updating np0005626466.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.client.admin.keyring
Feb 23 09:48:15 np0005626463.localdomain ceph-mon[294160]: Updating np0005626461.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.client.admin.keyring
Feb 23 09:48:15 np0005626463.localdomain ceph-mon[294160]: Updating np0005626465.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.client.admin.keyring
Feb 23 09:48:15 np0005626463.localdomain ceph-mon[294160]: pgmap v6: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:48:15 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.107:0/1896965894' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 09:48:15 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' 
Feb 23 09:48:15 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' 
Feb 23 09:48:15 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' 
Feb 23 09:48:15 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' 
Feb 23 09:48:15 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' 
Feb 23 09:48:15 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' 
Feb 23 09:48:15 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' 
Feb 23 09:48:15 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' 
Feb 23 09:48:15 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' 
Feb 23 09:48:15 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 23 09:48:15 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch
Feb 23 09:48:15 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:48:15 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626465"} : dispatch
Feb 23 09:48:15 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11  adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Feb 23 09:48:15 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:48:15.544 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:48:15 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:48:15.547 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:48:15 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:48:15.547 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 09:48:15 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:48:15.547 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:48:15 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:48:15.575 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:48:15 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:48:15.576 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:48:15 np0005626463.localdomain ceph-mgr[288036]: mgr.server handle_open ignoring open from mon.np0005626465 172.18.0.107:0/438899899; not ready for session (expect reconnect)
Feb 23 09:48:15 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix": "mon metadata", "id": "np0005626465"} v 0)
Feb 23 09:48:15 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626465"} : dispatch
Feb 23 09:48:15 np0005626463.localdomain ceph-mgr[288036]: mgr finish mon failed to return metadata for mon.np0005626465: (2) No such file or directory
Feb 23 09:48:15 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain.devices.0}] v 0)
Feb 23 09:48:15 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain}] v 0)
Feb 23 09:48:15 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain.devices.0}] v 0)
Feb 23 09:48:15 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain}] v 0)
Feb 23 09:48:15 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix": "auth get", "entity": "osd.4"} v 0)
Feb 23 09:48:15 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch
Feb 23 09:48:15 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 23 09:48:15 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:48:15 np0005626463.localdomain ceph-mgr[288036]: log_channel(cluster) log [DBG] : pgmap v7: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail; 29 KiB/s rd, 0 B/s wr, 16 op/s
Feb 23 09:48:15 np0005626463.localdomain ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.4 on np0005626466.localdomain
Feb 23 09:48:15 np0005626463.localdomain ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.4 on np0005626466.localdomain
Feb 23 09:48:16 np0005626463.localdomain ceph-mon[294160]: Reconfiguring daemon osd.1 on np0005626466.localdomain
Feb 23 09:48:16 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626465"} : dispatch
Feb 23 09:48:16 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' 
Feb 23 09:48:16 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' 
Feb 23 09:48:16 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' 
Feb 23 09:48:16 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' 
Feb 23 09:48:16 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch
Feb 23 09:48:16 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:48:16 np0005626463.localdomain ceph-mon[294160]: pgmap v7: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail; 29 KiB/s rd, 0 B/s wr, 16 op/s
Feb 23 09:48:16 np0005626463.localdomain ceph-mon[294160]: Reconfiguring daemon osd.4 on np0005626466.localdomain
Feb 23 09:48:16 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon).osd e84 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:48:16 np0005626463.localdomain ceph-mgr[288036]: mgr.server handle_open ignoring open from mon.np0005626465 172.18.0.107:0/438899899; not ready for session (expect reconnect)
Feb 23 09:48:16 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix": "mon metadata", "id": "np0005626465"} v 0)
Feb 23 09:48:16 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626465"} : dispatch
Feb 23 09:48:16 np0005626463.localdomain ceph-mgr[288036]: mgr finish mon failed to return metadata for mon.np0005626465: (2) No such file or directory
Feb 23 09:48:16 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain.devices.0}] v 0)
Feb 23 09:48:16 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain}] v 0)
Feb 23 09:48:16 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain.devices.0}] v 0)
Feb 23 09:48:16 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain}] v 0)
Feb 23 09:48:16 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 23 09:48:16 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:48:16 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0)
Feb 23 09:48:16 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 23 09:48:16 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Feb 23 09:48:16 np0005626463.localdomain ceph-mgr[288036]: [progress INFO root] update: starting ev a50cf599-777b-4364-bdbe-a2e0047bc2d0 (Updating node-proxy deployment (+4 -> 4))
Feb 23 09:48:16 np0005626463.localdomain ceph-mgr[288036]: [progress INFO root] complete: finished ev a50cf599-777b-4364-bdbe-a2e0047bc2d0 (Updating node-proxy deployment (+4 -> 4))
Feb 23 09:48:16 np0005626463.localdomain ceph-mgr[288036]: [progress INFO root] Completed event a50cf599-777b-4364-bdbe-a2e0047bc2d0 (Updating node-proxy deployment (+4 -> 4)) in 0 seconds
Feb 23 09:48:16 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0)
Feb 23 09:48:16 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 23 09:48:17 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:48:17.054 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:48:17 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:48:17.054 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:48:17 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:48:17.055 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:48:17 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:48:17.055 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 23 09:48:17 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11  adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Feb 23 09:48:17 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626465"} : dispatch
Feb 23 09:48:17 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' 
Feb 23 09:48:17 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' 
Feb 23 09:48:17 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' 
Feb 23 09:48:17 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' 
Feb 23 09:48:17 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:48:17 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 23 09:48:17 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' 
Feb 23 09:48:17 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 23 09:48:17 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11  adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Feb 23 09:48:17 np0005626463.localdomain sudo[301130]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 09:48:17 np0005626463.localdomain sudo[301130]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:48:17 np0005626463.localdomain sudo[301130]: pam_unix(sudo:session): session closed for user root
Feb 23 09:48:17 np0005626463.localdomain ceph-mgr[288036]: mgr.server handle_open ignoring open from mon.np0005626465 172.18.0.107:0/438899899; not ready for session (expect reconnect)
Feb 23 09:48:17 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix": "mon metadata", "id": "np0005626465"} v 0)
Feb 23 09:48:17 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626465"} : dispatch
Feb 23 09:48:17 np0005626463.localdomain ceph-mgr[288036]: mgr finish mon failed to return metadata for mon.np0005626465: (2) No such file or directory
Feb 23 09:48:17 np0005626463.localdomain ceph-mgr[288036]: log_channel(cluster) log [DBG] : pgmap v8: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail; 23 KiB/s rd, 0 B/s wr, 12 op/s
Feb 23 09:48:18 np0005626463.localdomain ceph-mgr[288036]: [progress INFO root] Writing back 50 completed events
Feb 23 09:48:18 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Feb 23 09:48:18 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:48:18.054 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:48:18 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:48:18.055 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:48:18 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:48:18.081 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 09:48:18 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:48:18.082 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 09:48:18 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:48:18.082 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 09:48:18 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:48:18.083 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Auditing locally available compute resources for np0005626463.localdomain (node: np0005626463.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 23 09:48:18 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:48:18.083 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 09:48:18 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626465"} : dispatch
Feb 23 09:48:18 np0005626463.localdomain ceph-mon[294160]: pgmap v8: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail; 23 KiB/s rd, 0 B/s wr, 12 op/s
Feb 23 09:48:18 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' 
Feb 23 09:48:18 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 23 09:48:18 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/1681775718' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 09:48:18 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:48:18.542 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.459s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 09:48:18 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:48:18.698 282211 DEBUG nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 23 09:48:18 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:48:18.699 282211 DEBUG nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 23 09:48:18 np0005626463.localdomain ceph-mgr[288036]: mgr.server handle_open ignoring open from mon.np0005626465 172.18.0.107:0/438899899; not ready for session (expect reconnect)
Feb 23 09:48:18 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix": "mon metadata", "id": "np0005626465"} v 0)
Feb 23 09:48:18 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626465"} : dispatch
Feb 23 09:48:18 np0005626463.localdomain ceph-mgr[288036]: mgr finish mon failed to return metadata for mon.np0005626465: (2) No such file or directory
Feb 23 09:48:18 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:48:18.930 282211 WARNING nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 23 09:48:18 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:48:18.932 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Hypervisor/Node resource view: name=np0005626463.localdomain free_ram=11723MB free_disk=41.8366584777832GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 23 09:48:18 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:48:18.932 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 09:48:18 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:48:18.933 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 09:48:19 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:48:19.002 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Instance c2a7d92b-952f-46a7-8a6a-3322a48fcf4b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 23 09:48:19 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:48:19.002 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 23 09:48:19 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:48:19.003 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Final resource view: name=np0005626463.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 23 09:48:19 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:48:19.045 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 09:48:19 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11  adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Feb 23 09:48:19 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.106:0/1681775718' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 09:48:19 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626465"} : dispatch
Feb 23 09:48:19 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 23 09:48:19 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/2580160333' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 09:48:19 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:48:19.487 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.441s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 09:48:19 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:48:19.494 282211 DEBUG nova.compute.provider_tree [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Inventory has not changed in ProviderTree for provider: be63d86c-a403-4ec9-a515-07ea2962cb4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 23 09:48:19 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:48:19.541 282211 DEBUG nova.scheduler.client.report [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Inventory has not changed for provider be63d86c-a403-4ec9-a515-07ea2962cb4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 23 09:48:19 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:48:19.544 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Compute_service record updated for np0005626463.localdomain:np0005626463.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 23 09:48:19 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:48:19.544 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.611s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 09:48:19 np0005626463.localdomain ceph-mgr[288036]: log_channel(audit) log [DBG] : from='client.34459 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Feb 23 09:48:19 np0005626463.localdomain ceph-mgr[288036]: mgr.server handle_open ignoring open from mon.np0005626465 172.18.0.107:0/438899899; not ready for session (expect reconnect)
Feb 23 09:48:19 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix": "mon metadata", "id": "np0005626465"} v 0)
Feb 23 09:48:19 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626465"} : dispatch
Feb 23 09:48:19 np0005626463.localdomain ceph-mgr[288036]: mgr finish mon failed to return metadata for mon.np0005626465: (2) No such file or directory
Feb 23 09:48:19 np0005626463.localdomain ceph-mgr[288036]: log_channel(cluster) log [DBG] : pgmap v9: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail; 19 KiB/s rd, 0 B/s wr, 10 op/s
Feb 23 09:48:20 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.106:0/2580160333' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 09:48:20 np0005626463.localdomain ceph-mon[294160]: from='client.34459 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Feb 23 09:48:20 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626465"} : dispatch
Feb 23 09:48:20 np0005626463.localdomain ceph-mon[294160]: pgmap v9: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail; 19 KiB/s rd, 0 B/s wr, 10 op/s
Feb 23 09:48:20 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:48:20.541 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:48:20 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:48:20.542 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:48:20 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:48:20.576 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:48:20 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:48:20.578 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:48:20 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:48:20.579 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 09:48:20 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:48:20.579 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:48:20 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:48:20.617 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:48:20 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:48:20.618 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:48:20 np0005626463.localdomain ceph-mgr[288036]: mgr.server handle_open ignoring open from mon.np0005626465 172.18.0.107:0/438899899; not ready for session (expect reconnect)
Feb 23 09:48:20 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix": "mon metadata", "id": "np0005626465"} v 0)
Feb 23 09:48:20 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626465"} : dispatch
Feb 23 09:48:20 np0005626463.localdomain ceph-mgr[288036]: mgr finish mon failed to return metadata for mon.np0005626465: (2) No such file or directory
Feb 23 09:48:21 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon).osd e84 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:48:21 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11  adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Feb 23 09:48:21 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626465"} : dispatch
Feb 23 09:48:21 np0005626463.localdomain ceph-mgr[288036]: mgr.server handle_open ignoring open from mon.np0005626465 172.18.0.107:0/438899899; not ready for session (expect reconnect)
Feb 23 09:48:21 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix": "mon metadata", "id": "np0005626465"} v 0)
Feb 23 09:48:21 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626465"} : dispatch
Feb 23 09:48:21 np0005626463.localdomain ceph-mgr[288036]: mgr finish mon failed to return metadata for mon.np0005626465: (2) No such file or directory
Feb 23 09:48:21 np0005626463.localdomain ceph-mgr[288036]: log_channel(cluster) log [DBG] : pgmap v10: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail; 17 KiB/s rd, 0 B/s wr, 9 op/s
Feb 23 09:48:22 np0005626463.localdomain ceph-mgr[288036]: log_channel(audit) log [DBG] : from='client.27135 -' entity='client.admin' cmd=[{"prefix": "orch apply", "target": ["mon-mgr", ""]}]: dispatch
Feb 23 09:48:22 np0005626463.localdomain ceph-mgr[288036]: [cephadm INFO root] Saving service mon spec with placement label:mon
Feb 23 09:48:22 np0005626463.localdomain ceph-mgr[288036]: log_channel(cephadm) log [INF] : Saving service mon spec with placement label:mon
Feb 23 09:48:22 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/spec.mon}] v 0)
Feb 23 09:48:22 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 23 09:48:22 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:48:22 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0)
Feb 23 09:48:22 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 23 09:48:22 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Feb 23 09:48:22 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/spec.mon}] v 0)
Feb 23 09:48:22 np0005626463.localdomain ceph-mgr[288036]: [progress INFO root] update: starting ev 2b72b9b3-71ae-42e5-a76a-18df583f9922 (Updating node-proxy deployment (+4 -> 4))
Feb 23 09:48:22 np0005626463.localdomain ceph-mgr[288036]: [progress INFO root] complete: finished ev 2b72b9b3-71ae-42e5-a76a-18df583f9922 (Updating node-proxy deployment (+4 -> 4))
Feb 23 09:48:22 np0005626463.localdomain ceph-mgr[288036]: [progress INFO root] Completed event 2b72b9b3-71ae-42e5-a76a-18df583f9922 (Updating node-proxy deployment (+4 -> 4)) in 0 seconds
Feb 23 09:48:22 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0)
Feb 23 09:48:22 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 23 09:48:22 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626465"} : dispatch
Feb 23 09:48:22 np0005626463.localdomain ceph-mon[294160]: pgmap v10: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail; 17 KiB/s rd, 0 B/s wr, 9 op/s
Feb 23 09:48:22 np0005626463.localdomain ceph-mon[294160]: from='client.27135 -' entity='client.admin' cmd=[{"prefix": "orch apply", "target": ["mon-mgr", ""]}]: dispatch
Feb 23 09:48:22 np0005626463.localdomain ceph-mon[294160]: Saving service mon spec with placement label:mon
Feb 23 09:48:22 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' 
Feb 23 09:48:22 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:48:22 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 23 09:48:22 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' 
Feb 23 09:48:22 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' 
Feb 23 09:48:22 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 23 09:48:22 np0005626463.localdomain sudo[301192]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 09:48:22 np0005626463.localdomain sudo[301192]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:48:22 np0005626463.localdomain sudo[301192]: pam_unix(sudo:session): session closed for user root
Feb 23 09:48:22 np0005626463.localdomain ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring mon.np0005626461 (monmap changed)...
Feb 23 09:48:22 np0005626463.localdomain ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring mon.np0005626461 (monmap changed)...
Feb 23 09:48:22 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix": "auth get", "entity": "mon."} v 0)
Feb 23 09:48:22 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Feb 23 09:48:22 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix": "config get", "who": "mon", "key": "public_network"} v 0)
Feb 23 09:48:22 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Feb 23 09:48:22 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 23 09:48:22 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:48:22 np0005626463.localdomain ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring daemon mon.np0005626461 on np0005626461.localdomain
Feb 23 09:48:22 np0005626463.localdomain ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring daemon mon.np0005626461 on np0005626461.localdomain
Feb 23 09:48:22 np0005626463.localdomain ceph-mgr[288036]: mgr.server handle_open ignoring open from mon.np0005626465 172.18.0.107:0/438899899; not ready for session (expect reconnect)
Feb 23 09:48:22 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix": "mon metadata", "id": "np0005626465"} v 0)
Feb 23 09:48:22 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626465"} : dispatch
Feb 23 09:48:22 np0005626463.localdomain ceph-mgr[288036]: mgr finish mon failed to return metadata for mon.np0005626465: (2) No such file or directory
Feb 23 09:48:23 np0005626463.localdomain ceph-mgr[288036]: [progress INFO root] Writing back 50 completed events
Feb 23 09:48:23 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Feb 23 09:48:23 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11  adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Feb 23 09:48:23 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11  adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Feb 23 09:48:23 np0005626463.localdomain ceph-mon[294160]: Reconfiguring mon.np0005626461 (monmap changed)...
Feb 23 09:48:23 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Feb 23 09:48:23 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Feb 23 09:48:23 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:48:23 np0005626463.localdomain ceph-mon[294160]: Reconfiguring daemon mon.np0005626461 on np0005626461.localdomain
Feb 23 09:48:23 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626465"} : dispatch
Feb 23 09:48:23 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' 
Feb 23 09:48:23 np0005626463.localdomain ceph-mgr[288036]: log_channel(audit) log [DBG] : from='client.34471 -' entity='client.admin' cmd=[{"prefix": "orch ps", "daemon_type": "mon", "daemon_id": "np0005626465", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Feb 23 09:48:23 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626461.localdomain.devices.0}] v 0)
Feb 23 09:48:23 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626461.localdomain}] v 0)
Feb 23 09:48:23 np0005626463.localdomain ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring mon.np0005626463 (monmap changed)...
Feb 23 09:48:23 np0005626463.localdomain ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring mon.np0005626463 (monmap changed)...
Feb 23 09:48:23 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix": "auth get", "entity": "mon."} v 0)
Feb 23 09:48:23 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Feb 23 09:48:23 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix": "config get", "who": "mon", "key": "public_network"} v 0)
Feb 23 09:48:23 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Feb 23 09:48:23 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 23 09:48:23 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:48:23 np0005626463.localdomain ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring daemon mon.np0005626463 on np0005626463.localdomain
Feb 23 09:48:23 np0005626463.localdomain ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring daemon mon.np0005626463 on np0005626463.localdomain
Feb 23 09:48:23 np0005626463.localdomain sudo[301210]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 09:48:23 np0005626463.localdomain sudo[301210]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:48:23 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.
Feb 23 09:48:23 np0005626463.localdomain sudo[301210]: pam_unix(sudo:session): session closed for user root
Feb 23 09:48:23 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.
Feb 23 09:48:23 np0005626463.localdomain sudo[301235]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid f1fea371-cb69-578d-a3d0-b5c472a84b46
Feb 23 09:48:23 np0005626463.localdomain sudo[301235]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:48:23 np0005626463.localdomain podman[301228]: 2026-02-23 09:48:23.640440531 +0000 UTC m=+0.085888305 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, managed_by=edpm_ansible)
Feb 23 09:48:23 np0005626463.localdomain podman[301229]: 2026-02-23 09:48:23.720998123 +0000 UTC m=+0.163152356 container health_status bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 23 09:48:23 np0005626463.localdomain podman[301229]: 2026-02-23 09:48:23.736213625 +0000 UTC m=+0.178367848 container exec_died bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 23 09:48:23 np0005626463.localdomain systemd[1]: bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.service: Deactivated successfully.
Feb 23 09:48:23 np0005626463.localdomain podman[301228]: 2026-02-23 09:48:23.752571803 +0000 UTC m=+0.198019647 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_controller, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb)
Feb 23 09:48:23 np0005626463.localdomain systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully.
Feb 23 09:48:23 np0005626463.localdomain ceph-mgr[288036]: mgr.server handle_open ignoring open from mon.np0005626465 172.18.0.107:0/438899899; not ready for session (expect reconnect)
Feb 23 09:48:23 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix": "mon metadata", "id": "np0005626465"} v 0)
Feb 23 09:48:23 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626465"} : dispatch
Feb 23 09:48:23 np0005626463.localdomain ceph-mgr[288036]: mgr finish mon failed to return metadata for mon.np0005626465: (2) No such file or directory
Feb 23 09:48:23 np0005626463.localdomain ceph-mgr[288036]: log_channel(cluster) log [DBG] : pgmap v11: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail; 17 KiB/s rd, 0 B/s wr, 9 op/s
Feb 23 09:48:24 np0005626463.localdomain podman[301312]: 2026-02-23 09:48:24.065184328 +0000 UTC m=+0.089544306 container create d58772fcbbec26abcccbf92423fef53f1a9b130fda70dd0632f49a6d53ce9200 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hungry_ritchie, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_CLEAN=True, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-type=git, io.openshift.expose-services=, io.buildah.version=1.42.2, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1770267347, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, build-date=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, description=Red Hat Ceph Storage 7, RELEASE=main, CEPH_POINT_RELEASE=)
Feb 23 09:48:24 np0005626463.localdomain systemd[1]: Started libpod-conmon-d58772fcbbec26abcccbf92423fef53f1a9b130fda70dd0632f49a6d53ce9200.scope.
Feb 23 09:48:24 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 09:48:24 np0005626463.localdomain podman[301312]: 2026-02-23 09:48:24.030712629 +0000 UTC m=+0.055072587 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 23 09:48:24 np0005626463.localdomain podman[301312]: 2026-02-23 09:48:24.129503376 +0000 UTC m=+0.153863294 container init d58772fcbbec26abcccbf92423fef53f1a9b130fda70dd0632f49a6d53ce9200 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hungry_ritchie, release=1770267347, name=rhceph, vcs-type=git, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, build-date=2026-02-09T10:25:24Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, architecture=x86_64, com.redhat.component=rhceph-container, io.openshift.expose-services=, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, description=Red Hat Ceph Storage 7, io.buildah.version=1.42.2, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git)
Feb 23 09:48:24 np0005626463.localdomain podman[301312]: 2026-02-23 09:48:24.139343425 +0000 UTC m=+0.163703383 container start d58772fcbbec26abcccbf92423fef53f1a9b130fda70dd0632f49a6d53ce9200 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hungry_ritchie, vcs-type=git, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, release=1770267347, distribution-scope=public, architecture=x86_64, GIT_CLEAN=True, ceph=True, GIT_BRANCH=main, io.buildah.version=1.42.2, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2026-02-09T10:25:24Z, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, RELEASE=main, CEPH_POINT_RELEASE=, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, version=7, build-date=2026-02-09T10:25:24Z)
Feb 23 09:48:24 np0005626463.localdomain podman[301312]: 2026-02-23 09:48:24.139748417 +0000 UTC m=+0.164108355 container attach d58772fcbbec26abcccbf92423fef53f1a9b130fda70dd0632f49a6d53ce9200 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hungry_ritchie, build-date=2026-02-09T10:25:24Z, ceph=True, release=1770267347, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, RELEASE=main, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2026-02-09T10:25:24Z, CEPH_POINT_RELEASE=, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.42.2, distribution-scope=public, version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Feb 23 09:48:24 np0005626463.localdomain hungry_ritchie[301327]: 167 167
Feb 23 09:48:24 np0005626463.localdomain systemd[1]: libpod-d58772fcbbec26abcccbf92423fef53f1a9b130fda70dd0632f49a6d53ce9200.scope: Deactivated successfully.
Feb 23 09:48:24 np0005626463.localdomain podman[301312]: 2026-02-23 09:48:24.143309156 +0000 UTC m=+0.167669074 container died d58772fcbbec26abcccbf92423fef53f1a9b130fda70dd0632f49a6d53ce9200 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hungry_ritchie, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2026-02-09T10:25:24Z, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, GIT_CLEAN=True, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.k8s.description=Red Hat Ceph Storage 7, release=1770267347, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, distribution-scope=public, architecture=x86_64, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.buildah.version=1.42.2, CEPH_POINT_RELEASE=)
Feb 23 09:48:24 np0005626463.localdomain podman[301332]: 2026-02-23 09:48:24.242661799 +0000 UTC m=+0.086058450 container remove d58772fcbbec26abcccbf92423fef53f1a9b130fda70dd0632f49a6d53ce9200 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hungry_ritchie, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=7, GIT_CLEAN=True, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1770267347, build-date=2026-02-09T10:25:24Z, vendor=Red Hat, Inc., RELEASE=main, org.opencontainers.image.created=2026-02-09T10:25:24Z, distribution-scope=public, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-type=git, GIT_BRANCH=main, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, io.buildah.version=1.42.2, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, io.openshift.expose-services=)
Feb 23 09:48:24 np0005626463.localdomain systemd[1]: libpod-conmon-d58772fcbbec26abcccbf92423fef53f1a9b130fda70dd0632f49a6d53ce9200.scope: Deactivated successfully.
Feb 23 09:48:24 np0005626463.localdomain sudo[301235]: pam_unix(sudo:session): session closed for user root
Feb 23 09:48:24 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626463.localdomain.devices.0}] v 0)
Feb 23 09:48:24 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626463.localdomain}] v 0)
Feb 23 09:48:24 np0005626463.localdomain ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring mon.np0005626466 (monmap changed)...
Feb 23 09:48:24 np0005626463.localdomain ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring mon.np0005626466 (monmap changed)...
Feb 23 09:48:24 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix": "auth get", "entity": "mon."} v 0)
Feb 23 09:48:24 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Feb 23 09:48:24 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix": "config get", "who": "mon", "key": "public_network"} v 0)
Feb 23 09:48:24 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Feb 23 09:48:24 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 23 09:48:24 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:48:24 np0005626463.localdomain ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring daemon mon.np0005626466 on np0005626466.localdomain
Feb 23 09:48:24 np0005626463.localdomain ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring daemon mon.np0005626466 on np0005626466.localdomain
Feb 23 09:48:24 np0005626463.localdomain ceph-mon[294160]: from='client.34471 -' entity='client.admin' cmd=[{"prefix": "orch ps", "daemon_type": "mon", "daemon_id": "np0005626465", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Feb 23 09:48:24 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' 
Feb 23 09:48:24 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' 
Feb 23 09:48:24 np0005626463.localdomain ceph-mon[294160]: Reconfiguring mon.np0005626463 (monmap changed)...
Feb 23 09:48:24 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Feb 23 09:48:24 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Feb 23 09:48:24 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:48:24 np0005626463.localdomain ceph-mon[294160]: Reconfiguring daemon mon.np0005626463 on np0005626463.localdomain
Feb 23 09:48:24 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626465"} : dispatch
Feb 23 09:48:24 np0005626463.localdomain ceph-mon[294160]: pgmap v11: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail; 17 KiB/s rd, 0 B/s wr, 9 op/s
Feb 23 09:48:24 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' 
Feb 23 09:48:24 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' 
Feb 23 09:48:24 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Feb 23 09:48:24 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Feb 23 09:48:24 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:48:24 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #25. Immutable memtables: 0.
Feb 23 09:48:24 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:48:24.458376) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Feb 23 09:48:24 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/flush_job.cc:856] [default] [JOB 11] Flushing memtable with next log file: 25
Feb 23 09:48:24 np0005626463.localdomain ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840104458429, "job": 11, "event": "flush_started", "num_memtables": 1, "num_entries": 2818, "num_deletes": 256, "total_data_size": 8432089, "memory_usage": 8912864, "flush_reason": "Manual Compaction"}
Feb 23 09:48:24 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/flush_job.cc:885] [default] [JOB 11] Level-0 flush table #26: started
Feb 23 09:48:24 np0005626463.localdomain ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840104496724, "cf_name": "default", "job": 11, "event": "table_file_creation", "file_number": 26, "file_size": 5016845, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 14319, "largest_seqno": 17132, "table_properties": {"data_size": 5004990, "index_size": 7400, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 3397, "raw_key_size": 30199, "raw_average_key_size": 22, "raw_value_size": 4979249, "raw_average_value_size": 3732, "num_data_blocks": 322, "num_entries": 1334, "num_filter_entries": 1334, "num_deletions": 254, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771840042, "oldest_key_time": 1771840042, "file_creation_time": 1771840104, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4cfd6c8f-aafa-4003-b2f6-d22c49635dd4", "db_session_id": "66DAQ76CBLV8DSGL8JC7", "orig_file_number": 26, "seqno_to_time_mapping": "N/A"}}
Feb 23 09:48:24 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 11] Flush lasted 38415 microseconds, and 10536 cpu microseconds.
Feb 23 09:48:24 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 23 09:48:24 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:48:24.496787) [db/flush_job.cc:967] [default] [JOB 11] Level-0 flush table #26: 5016845 bytes OK
Feb 23 09:48:24 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:48:24.496816) [db/memtable_list.cc:519] [default] Level-0 commit table #26 started
Feb 23 09:48:24 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:48:24.499651) [db/memtable_list.cc:722] [default] Level-0 commit table #26: memtable #1 done
Feb 23 09:48:24 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:48:24.499679) EVENT_LOG_v1 {"time_micros": 1771840104499672, "job": 11, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Feb 23 09:48:24 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:48:24.499699) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Feb 23 09:48:24 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 11] Try to delete WAL files size 8418371, prev total WAL file size 8443858, number of live WAL files 2.
Feb 23 09:48:24 np0005626463.localdomain ceph-mon[294160]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005626463/store.db/000022.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 23 09:48:24 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:48:24.501268) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003130373933' seq:72057594037927935, type:22 .. '7061786F73003131303435' seq:0, type:0; will stop at (end)
Feb 23 09:48:24 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 12] Compacting 1@0 + 1@6 files to L6, score -1.00
Feb 23 09:48:24 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 11 Base level 0, inputs: [26(4899KB)], [24(14MB)]
Feb 23 09:48:24 np0005626463.localdomain ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840104501324, "job": 12, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [26], "files_L6": [24], "score": -1, "input_data_size": 20670629, "oldest_snapshot_seqno": -1}
Feb 23 09:48:24 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-a2f285afc8e7edd2b875030538209a94f5477e72cdfff44d1bb335ee3fea9dcf-merged.mount: Deactivated successfully.
Feb 23 09:48:24 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 12] Generated table #27: 11131 keys, 18458382 bytes, temperature: kUnknown
Feb 23 09:48:24 np0005626463.localdomain ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840104660494, "cf_name": "default", "job": 12, "event": "table_file_creation", "file_number": 27, "file_size": 18458382, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 18392184, "index_size": 37297, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 27845, "raw_key_size": 298013, "raw_average_key_size": 26, "raw_value_size": 18199312, "raw_average_value_size": 1635, "num_data_blocks": 1436, "num_entries": 11131, "num_filter_entries": 11131, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771839971, "oldest_key_time": 0, "file_creation_time": 1771840104, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4cfd6c8f-aafa-4003-b2f6-d22c49635dd4", "db_session_id": "66DAQ76CBLV8DSGL8JC7", "orig_file_number": 27, "seqno_to_time_mapping": "N/A"}}
Feb 23 09:48:24 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 23 09:48:24 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:48:24.660858) [db/compaction/compaction_job.cc:1663] [default] [JOB 12] Compacted 1@0 + 1@6 files to L6 => 18458382 bytes
Feb 23 09:48:24 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:48:24.662724) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 129.8 rd, 115.9 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(4.8, 14.9 +0.0 blob) out(17.6 +0.0 blob), read-write-amplify(7.8) write-amplify(3.7) OK, records in: 11680, records dropped: 549 output_compression: NoCompression
Feb 23 09:48:24 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:48:24.662754) EVENT_LOG_v1 {"time_micros": 1771840104662741, "job": 12, "event": "compaction_finished", "compaction_time_micros": 159264, "compaction_time_cpu_micros": 53154, "output_level": 6, "num_output_files": 1, "total_output_size": 18458382, "num_input_records": 11680, "num_output_records": 11131, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Feb 23 09:48:24 np0005626463.localdomain ceph-mon[294160]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005626463/store.db/000026.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 23 09:48:24 np0005626463.localdomain ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840104663641, "job": 12, "event": "table_file_deletion", "file_number": 26}
Feb 23 09:48:24 np0005626463.localdomain ceph-mon[294160]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005626463/store.db/000024.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 23 09:48:24 np0005626463.localdomain ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840104666292, "job": 12, "event": "table_file_deletion", "file_number": 24}
Feb 23 09:48:24 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:48:24.501189) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 23 09:48:24 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:48:24.666407) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 23 09:48:24 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:48:24.666415) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 23 09:48:24 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:48:24.666418) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 23 09:48:24 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:48:24.666421) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 23 09:48:24 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:48:24.666424) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 23 09:48:24 np0005626463.localdomain ceph-mgr[288036]: mgr.server handle_open ignoring open from mon.np0005626465 172.18.0.107:0/438899899; not ready for session (expect reconnect)
Feb 23 09:48:24 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix": "mon metadata", "id": "np0005626465"} v 0)
Feb 23 09:48:24 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626465"} : dispatch
Feb 23 09:48:24 np0005626463.localdomain ceph-mgr[288036]: mgr finish mon failed to return metadata for mon.np0005626465: (2) No such file or directory
Feb 23 09:48:25 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11  adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Feb 23 09:48:25 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain.devices.0}] v 0)
Feb 23 09:48:25 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain}] v 0)
Feb 23 09:48:25 np0005626463.localdomain ceph-mon[294160]: Reconfiguring mon.np0005626466 (monmap changed)...
Feb 23 09:48:25 np0005626463.localdomain ceph-mon[294160]: Reconfiguring daemon mon.np0005626466 on np0005626466.localdomain
Feb 23 09:48:25 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626465"} : dispatch
Feb 23 09:48:25 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.200:0/188302181' entity='client.admin' cmd={"prefix": "status", "format": "json"} : dispatch
Feb 23 09:48:25 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' 
Feb 23 09:48:25 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' 
Feb 23 09:48:25 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:48:25.659 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:48:25 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:48:25.661 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:48:25 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:48:25.661 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5043 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 09:48:25 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:48:25.662 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:48:25 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:48:25.662 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:48:25 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:48:25.665 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:48:25 np0005626463.localdomain ceph-mgr[288036]: mgr.server handle_open ignoring open from mon.np0005626465 172.18.0.107:0/438899899; not ready for session (expect reconnect)
Feb 23 09:48:25 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix": "mon metadata", "id": "np0005626465"} v 0)
Feb 23 09:48:25 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626465"} : dispatch
Feb 23 09:48:25 np0005626463.localdomain ceph-mgr[288036]: mgr finish mon failed to return metadata for mon.np0005626465: (2) No such file or directory
Feb 23 09:48:25 np0005626463.localdomain ceph-mgr[288036]: log_channel(cluster) log [DBG] : pgmap v12: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail; 17 KiB/s rd, 0 B/s wr, 9 op/s
Feb 23 09:48:26 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon).osd e84 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:48:26 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626465"} : dispatch
Feb 23 09:48:26 np0005626463.localdomain ceph-mon[294160]: pgmap v12: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail; 17 KiB/s rd, 0 B/s wr, 9 op/s
Feb 23 09:48:26 np0005626463.localdomain ceph-mgr[288036]: mgr.server handle_open ignoring open from mon.np0005626465 172.18.0.107:0/438899899; not ready for session (expect reconnect)
Feb 23 09:48:26 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix": "mon metadata", "id": "np0005626465"} v 0)
Feb 23 09:48:26 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626465"} : dispatch
Feb 23 09:48:26 np0005626463.localdomain ceph-mgr[288036]: mgr finish mon failed to return metadata for mon.np0005626465: (2) No such file or directory
Feb 23 09:48:26 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.
Feb 23 09:48:26 np0005626463.localdomain systemd[1]: tmp-crun.9j97Ex.mount: Deactivated successfully.
Feb 23 09:48:26 np0005626463.localdomain podman[301349]: 2026-02-23 09:48:26.953501694 +0000 UTC m=+0.084300677 container health_status be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Feb 23 09:48:26 np0005626463.localdomain podman[301349]: 2026-02-23 09:48:26.989382756 +0000 UTC m=+0.120181709 container exec_died be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.43.0, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb)
Feb 23 09:48:27 np0005626463.localdomain systemd[1]: be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.service: Deactivated successfully.
Feb 23 09:48:27 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11  adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Feb 23 09:48:27 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11  adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Feb 23 09:48:27 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626465"} : dispatch
Feb 23 09:48:27 np0005626463.localdomain ceph-mgr[288036]: mgr.server handle_open ignoring open from mon.np0005626465 172.18.0.107:0/438899899; not ready for session (expect reconnect)
Feb 23 09:48:27 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix": "mon metadata", "id": "np0005626465"} v 0)
Feb 23 09:48:27 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626465"} : dispatch
Feb 23 09:48:27 np0005626463.localdomain ceph-mgr[288036]: mgr finish mon failed to return metadata for mon.np0005626465: (2) No such file or directory
Feb 23 09:48:27 np0005626463.localdomain ceph-mgr[288036]: log_channel(cluster) log [DBG] : pgmap v13: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:48:28 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626465"} : dispatch
Feb 23 09:48:28 np0005626463.localdomain ceph-mon[294160]: pgmap v13: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:48:28 np0005626463.localdomain ceph-mgr[288036]: mgr.server handle_open ignoring open from mon.np0005626465 172.18.0.107:0/438899899; not ready for session (expect reconnect)
Feb 23 09:48:28 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix": "mon metadata", "id": "np0005626465"} v 0)
Feb 23 09:48:28 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626465"} : dispatch
Feb 23 09:48:28 np0005626463.localdomain ceph-mgr[288036]: mgr finish mon failed to return metadata for mon.np0005626465: (2) No such file or directory
Feb 23 09:48:29 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11  adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Feb 23 09:48:29 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626465"} : dispatch
Feb 23 09:48:29 np0005626463.localdomain ceph-mgr[288036]: mgr.server handle_open ignoring open from mon.np0005626465 172.18.0.107:0/438899899; not ready for session (expect reconnect)
Feb 23 09:48:29 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix": "mon metadata", "id": "np0005626465"} v 0)
Feb 23 09:48:29 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626465"} : dispatch
Feb 23 09:48:29 np0005626463.localdomain ceph-mgr[288036]: mgr finish mon failed to return metadata for mon.np0005626465: (2) No such file or directory
Feb 23 09:48:29 np0005626463.localdomain ceph-mgr[288036]: log_channel(cluster) log [DBG] : pgmap v14: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:48:30 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.200:0/1434680955' entity='client.admin' cmd={"prefix": "mgr stat", "format": "json"} : dispatch
Feb 23 09:48:30 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626465"} : dispatch
Feb 23 09:48:30 np0005626463.localdomain ceph-mon[294160]: pgmap v14: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:48:30 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:48:30.665 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:48:30 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:48:30.668 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:48:30 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:48:30.668 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 09:48:30 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:48:30.668 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:48:30 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:48:30.698 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:48:30 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:48:30.699 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:48:30 np0005626463.localdomain ceph-mgr[288036]: mgr.server handle_open ignoring open from mon.np0005626465 172.18.0.107:0/438899899; not ready for session (expect reconnect)
Feb 23 09:48:30 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix": "mon metadata", "id": "np0005626465"} v 0)
Feb 23 09:48:30 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626465"} : dispatch
Feb 23 09:48:30 np0005626463.localdomain ceph-mgr[288036]: mgr finish mon failed to return metadata for mon.np0005626465: (2) No such file or directory
Feb 23 09:48:30 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.
Feb 23 09:48:30 np0005626463.localdomain systemd[1]: tmp-crun.CWCxBC.mount: Deactivated successfully.
Feb 23 09:48:30 np0005626463.localdomain podman[301368]: 2026-02-23 09:48:30.915980661 +0000 UTC m=+0.097075644 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, io.buildah.version=1.43.0, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb)
Feb 23 09:48:30 np0005626463.localdomain podman[301368]: 2026-02-23 09:48:30.922077038 +0000 UTC m=+0.103171981 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.43.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team)
Feb 23 09:48:30 np0005626463.localdomain systemd[1]: 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully.
Feb 23 09:48:31 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon).osd e84 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:48:31 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11  adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Feb 23 09:48:31 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626465"} : dispatch
Feb 23 09:48:31 np0005626463.localdomain ceph-mgr[288036]: mgr.server handle_open ignoring open from mon.np0005626465 172.18.0.107:0/438899899; not ready for session (expect reconnect)
Feb 23 09:48:31 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix": "mon metadata", "id": "np0005626465"} v 0)
Feb 23 09:48:31 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626465"} : dispatch
Feb 23 09:48:31 np0005626463.localdomain ceph-mgr[288036]: mgr finish mon failed to return metadata for mon.np0005626465: (2) No such file or directory
Feb 23 09:48:31 np0005626463.localdomain ceph-mgr[288036]: log_channel(cluster) log [DBG] : pgmap v15: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:48:32 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626465"} : dispatch
Feb 23 09:48:32 np0005626463.localdomain ceph-mon[294160]: pgmap v15: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:48:32 np0005626463.localdomain ceph-mgr[288036]: mgr.server handle_open ignoring open from mon.np0005626465 172.18.0.107:0/438899899; not ready for session (expect reconnect)
Feb 23 09:48:32 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix": "mon metadata", "id": "np0005626465"} v 0)
Feb 23 09:48:32 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626465"} : dispatch
Feb 23 09:48:32 np0005626463.localdomain ceph-mgr[288036]: mgr finish mon failed to return metadata for mon.np0005626465: (2) No such file or directory
Feb 23 09:48:33 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11  adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Feb 23 09:48:33 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626465"} : dispatch
Feb 23 09:48:33 np0005626463.localdomain ceph-mgr[288036]: mgr.server handle_open ignoring open from mon.np0005626465 172.18.0.107:0/438899899; not ready for session (expect reconnect)
Feb 23 09:48:33 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix": "mon metadata", "id": "np0005626465"} v 0)
Feb 23 09:48:33 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626465"} : dispatch
Feb 23 09:48:33 np0005626463.localdomain ceph-mgr[288036]: mgr finish mon failed to return metadata for mon.np0005626465: (2) No such file or directory
Feb 23 09:48:33 np0005626463.localdomain ceph-mgr[288036]: log_channel(cluster) log [DBG] : pgmap v16: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:48:34 np0005626463.localdomain ceph-mgr[288036]: log_channel(audit) log [DBG] : from='client.44399 -' entity='client.admin' cmd=[{"prefix": "orch ps", "daemon_type": "mon", "daemon_id": "np0005626461", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Feb 23 09:48:34 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626465"} : dispatch
Feb 23 09:48:34 np0005626463.localdomain ceph-mon[294160]: pgmap v16: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:48:34 np0005626463.localdomain ceph-mgr[288036]: mgr.server handle_open ignoring open from mon.np0005626465 172.18.0.107:0/438899899; not ready for session (expect reconnect)
Feb 23 09:48:34 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix": "mon metadata", "id": "np0005626465"} v 0)
Feb 23 09:48:34 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626465"} : dispatch
Feb 23 09:48:34 np0005626463.localdomain ceph-mgr[288036]: mgr finish mon failed to return metadata for mon.np0005626465: (2) No such file or directory
Feb 23 09:48:35 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11  adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Feb 23 09:48:35 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11  adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Feb 23 09:48:35 np0005626463.localdomain ceph-mgr[288036]: log_channel(audit) log [DBG] : from='client.27159 -' entity='client.admin' cmd=[{"prefix": "orch daemon rm", "names": ["mon.np0005626461"], "force": true, "target": ["mon-mgr", ""]}]: dispatch
Feb 23 09:48:35 np0005626463.localdomain ceph-mgr[288036]: [cephadm INFO root] Remove daemons mon.np0005626461
Feb 23 09:48:35 np0005626463.localdomain ceph-mgr[288036]: log_channel(cephadm) log [INF] : Remove daemons mon.np0005626461
Feb 23 09:48:35 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix": "quorum_status"} v 0)
Feb 23 09:48:35 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "quorum_status"} : dispatch
Feb 23 09:48:35 np0005626463.localdomain ceph-mgr[288036]: [cephadm INFO cephadm.services.cephadmservice] Safe to remove mon.np0005626461: new quorum should be ['np0005626466', 'np0005626463'] (from ['np0005626466', 'np0005626463'])
Feb 23 09:48:35 np0005626463.localdomain ceph-mgr[288036]: log_channel(cephadm) log [INF] : Safe to remove mon.np0005626461: new quorum should be ['np0005626466', 'np0005626463'] (from ['np0005626466', 'np0005626463'])
Feb 23 09:48:35 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix": "mon rm", "name": "np0005626461"} v 0)
Feb 23 09:48:35 np0005626463.localdomain ceph-mgr[288036]: [cephadm INFO cephadm.services.cephadmservice] Removing monitor np0005626461 from monmap...
Feb 23 09:48:35 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon rm", "name": "np0005626461"} : dispatch
Feb 23 09:48:35 np0005626463.localdomain ceph-mgr[288036]: log_channel(cephadm) log [INF] : Removing monitor np0005626461 from monmap...
Feb 23 09:48:35 np0005626463.localdomain ceph-mgr[288036]: [cephadm INFO cephadm.serve] Removing daemon mon.np0005626461 from np0005626461.localdomain -- ports []
Feb 23 09:48:35 np0005626463.localdomain ceph-mgr[288036]: log_channel(cephadm) log [INF] : Removing daemon mon.np0005626461 from np0005626461.localdomain -- ports []
Feb 23 09:48:35 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@2(peon) e12  my rank is now 1 (was 2)
Feb 23 09:48:35 np0005626463.localdomain ceph-mgr[288036]: client.44327 ms_handle_reset on v2:172.18.0.103:3300/0
Feb 23 09:48:35 np0005626463.localdomain ceph-mgr[288036]: client.0 ms_handle_reset on v2:172.18.0.103:3300/0
Feb 23 09:48:35 np0005626463.localdomain ceph-mgr[288036]: client.0 ms_handle_reset on v2:172.18.0.103:3300/0
Feb 23 09:48:35 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [INF] : mon.np0005626463 calling monitor election
Feb 23 09:48:35 np0005626463.localdomain ceph-mon[294160]: paxos.1).electionLogic(50) init, last seen epoch 50
Feb 23 09:48:35 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@1(electing) e12 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Feb 23 09:48:35 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@1(electing) e12 handle_command mon_command({"prefix": "mon metadata", "id": "np0005626463"} v 0)
Feb 23 09:48:35 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626463"} : dispatch
Feb 23 09:48:35 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@1(electing) e12 handle_command mon_command({"prefix": "mon metadata", "id": "np0005626466"} v 0)
Feb 23 09:48:35 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626466"} : dispatch
Feb 23 09:48:35 np0005626463.localdomain ceph-mgr[288036]: client.27096 ms_handle_reset on v2:172.18.0.105:3300/0
Feb 23 09:48:35 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@1(electing) e12 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 23 09:48:35 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:48:35 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:48:35.699 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:48:35 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:48:35.701 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:48:35 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:48:35.701 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 09:48:35 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:48:35.701 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:48:35 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:48:35.738 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:48:35 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:48:35.739 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:48:35 np0005626463.localdomain ceph-mgr[288036]: mgr.server handle_open ignoring open from mon.np0005626465 172.18.0.107:0/438899899; not ready for session (expect reconnect)
Feb 23 09:48:35 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@1(electing) e12 handle_command mon_command({"prefix": "mon metadata", "id": "np0005626465"} v 0)
Feb 23 09:48:35 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626465"} : dispatch
Feb 23 09:48:35 np0005626463.localdomain ceph-mgr[288036]: mgr finish mon failed to return metadata for mon.np0005626465: (2) No such file or directory
Feb 23 09:48:35 np0005626463.localdomain ceph-mgr[288036]: log_channel(cluster) log [DBG] : pgmap v17: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:48:36 np0005626463.localdomain ceph-mgr[288036]: mgr.server handle_open ignoring open from mon.np0005626465 172.18.0.107:0/438899899; not ready for session (expect reconnect)
Feb 23 09:48:36 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@1(electing) e12 handle_command mon_command({"prefix": "mon metadata", "id": "np0005626465"} v 0)
Feb 23 09:48:36 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626465"} : dispatch
Feb 23 09:48:36 np0005626463.localdomain ceph-mgr[288036]: mgr finish mon failed to return metadata for mon.np0005626465: (2) No such file or directory
Feb 23 09:48:37 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@1(electing) e12  adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Feb 23 09:48:37 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@1(electing) e12  adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Feb 23 09:48:37 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@1(electing) e12 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Feb 23 09:48:37 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@1(peon) e12 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Feb 23 09:48:37 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@1(peon) e12 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0)
Feb 23 09:48:37 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 23 09:48:37 np0005626463.localdomain ceph-mgr[288036]: [cephadm INFO cephadm.serve] Updating np0005626461.localdomain:/etc/ceph/ceph.conf
Feb 23 09:48:37 np0005626463.localdomain ceph-mgr[288036]: log_channel(cephadm) log [INF] : Updating np0005626461.localdomain:/etc/ceph/ceph.conf
Feb 23 09:48:37 np0005626463.localdomain ceph-mgr[288036]: [cephadm INFO cephadm.serve] Updating np0005626463.localdomain:/etc/ceph/ceph.conf
Feb 23 09:48:37 np0005626463.localdomain ceph-mgr[288036]: [cephadm INFO cephadm.serve] Updating np0005626465.localdomain:/etc/ceph/ceph.conf
Feb 23 09:48:37 np0005626463.localdomain ceph-mgr[288036]: log_channel(cephadm) log [INF] : Updating np0005626463.localdomain:/etc/ceph/ceph.conf
Feb 23 09:48:37 np0005626463.localdomain ceph-mgr[288036]: [cephadm INFO cephadm.serve] Updating np0005626466.localdomain:/etc/ceph/ceph.conf
Feb 23 09:48:37 np0005626463.localdomain ceph-mgr[288036]: log_channel(cephadm) log [INF] : Updating np0005626465.localdomain:/etc/ceph/ceph.conf
Feb 23 09:48:37 np0005626463.localdomain ceph-mgr[288036]: log_channel(cephadm) log [INF] : Updating np0005626466.localdomain:/etc/ceph/ceph.conf
Feb 23 09:48:37 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626465"} : dispatch
Feb 23 09:48:37 np0005626463.localdomain ceph-mon[294160]: from='client.27159 -' entity='client.admin' cmd=[{"prefix": "orch daemon rm", "names": ["mon.np0005626461"], "force": true, "target": ["mon-mgr", ""]}]: dispatch
Feb 23 09:48:37 np0005626463.localdomain ceph-mon[294160]: Remove daemons mon.np0005626461
Feb 23 09:48:37 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "quorum_status"} : dispatch
Feb 23 09:48:37 np0005626463.localdomain ceph-mon[294160]: Safe to remove mon.np0005626461: new quorum should be ['np0005626466', 'np0005626463'] (from ['np0005626466', 'np0005626463'])
Feb 23 09:48:37 np0005626463.localdomain ceph-mon[294160]: Removing monitor np0005626461 from monmap...
Feb 23 09:48:37 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon rm", "name": "np0005626461"} : dispatch
Feb 23 09:48:37 np0005626463.localdomain ceph-mon[294160]: Removing daemon mon.np0005626461 from np0005626461.localdomain -- ports []
Feb 23 09:48:37 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463 calling monitor election
Feb 23 09:48:37 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626463"} : dispatch
Feb 23 09:48:37 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626466"} : dispatch
Feb 23 09:48:37 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:48:37 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626465"} : dispatch
Feb 23 09:48:37 np0005626463.localdomain ceph-mon[294160]: pgmap v17: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:48:37 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626465"} : dispatch
Feb 23 09:48:37 np0005626463.localdomain ceph-mon[294160]: mon.np0005626466 calling monitor election
Feb 23 09:48:37 np0005626463.localdomain ceph-mon[294160]: mon.np0005626466 is new leader, mons np0005626466,np0005626463 in quorum (ranks 0,1)
Feb 23 09:48:37 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 23 09:48:37 np0005626463.localdomain ceph-mon[294160]: monmap epoch 12
Feb 23 09:48:37 np0005626463.localdomain ceph-mon[294160]: fsid f1fea371-cb69-578d-a3d0-b5c472a84b46
Feb 23 09:48:37 np0005626463.localdomain ceph-mon[294160]: last_changed 2026-02-23T09:48:35.633872+0000
Feb 23 09:48:37 np0005626463.localdomain ceph-mon[294160]: created 2026-02-23T07:36:01.997603+0000
Feb 23 09:48:37 np0005626463.localdomain ceph-mon[294160]: min_mon_release 18 (reef)
Feb 23 09:48:37 np0005626463.localdomain ceph-mon[294160]: election_strategy: 1
Feb 23 09:48:37 np0005626463.localdomain ceph-mon[294160]: 0: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005626466
Feb 23 09:48:37 np0005626463.localdomain ceph-mon[294160]: 1: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005626463
Feb 23 09:48:37 np0005626463.localdomain ceph-mon[294160]: fsmap cephfs:1 {0=mds.np0005626463.qcthuc=up:active} 2 up:standby
Feb 23 09:48:37 np0005626463.localdomain ceph-mon[294160]: osdmap e84: 6 total, 6 up, 6 in
Feb 23 09:48:37 np0005626463.localdomain ceph-mon[294160]: mgrmap e32: np0005626463.wtksup(active, since 29s), standbys: np0005626466.nisqfq, np0005626465.hlpkwo, np0005626461.lrfquh
Feb 23 09:48:37 np0005626463.localdomain ceph-mon[294160]: overall HEALTH_OK
Feb 23 09:48:37 np0005626463.localdomain sudo[301387]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Feb 23 09:48:37 np0005626463.localdomain sudo[301387]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:48:37 np0005626463.localdomain sudo[301387]: pam_unix(sudo:session): session closed for user root
Feb 23 09:48:37 np0005626463.localdomain ceph-mgr[288036]: mgr.server handle_open ignoring open from mon.np0005626465 172.18.0.107:0/438899899; not ready for session (expect reconnect)
Feb 23 09:48:37 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@1(peon) e12 handle_command mon_command({"prefix": "mon metadata", "id": "np0005626465"} v 0)
Feb 23 09:48:37 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626465"} : dispatch
Feb 23 09:48:37 np0005626463.localdomain ceph-mgr[288036]: mgr finish mon failed to return metadata for mon.np0005626465: (2) No such file or directory
Feb 23 09:48:37 np0005626463.localdomain sudo[301405]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/etc/ceph
Feb 23 09:48:37 np0005626463.localdomain sudo[301405]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:48:37 np0005626463.localdomain sudo[301405]: pam_unix(sudo:session): session closed for user root
Feb 23 09:48:37 np0005626463.localdomain ceph-mgr[288036]: log_channel(cluster) log [DBG] : pgmap v18: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:48:37 np0005626463.localdomain sudo[301423]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/etc/ceph/ceph.conf.new
Feb 23 09:48:37 np0005626463.localdomain sudo[301423]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:48:37 np0005626463.localdomain sudo[301423]: pam_unix(sudo:session): session closed for user root
Feb 23 09:48:37 np0005626463.localdomain sudo[301441]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46
Feb 23 09:48:37 np0005626463.localdomain sudo[301441]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:48:37 np0005626463.localdomain sudo[301441]: pam_unix(sudo:session): session closed for user root
Feb 23 09:48:38 np0005626463.localdomain ceph-mgr[288036]: [volumes INFO mgr_util] scanning for idle connections..
Feb 23 09:48:38 np0005626463.localdomain ceph-mgr[288036]: [volumes INFO mgr_util] cleaning up connections: []
Feb 23 09:48:38 np0005626463.localdomain ceph-mgr[288036]: [volumes INFO mgr_util] scanning for idle connections..
Feb 23 09:48:38 np0005626463.localdomain ceph-mgr[288036]: [volumes INFO mgr_util] cleaning up connections: []
Feb 23 09:48:38 np0005626463.localdomain sudo[301459]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/etc/ceph/ceph.conf.new
Feb 23 09:48:38 np0005626463.localdomain sudo[301459]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:48:38 np0005626463.localdomain ceph-mgr[288036]: [volumes INFO mgr_util] scanning for idle connections..
Feb 23 09:48:38 np0005626463.localdomain ceph-mgr[288036]: [volumes INFO mgr_util] cleaning up connections: []
Feb 23 09:48:38 np0005626463.localdomain sudo[301459]: pam_unix(sudo:session): session closed for user root
Feb 23 09:48:38 np0005626463.localdomain sudo[301493]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/etc/ceph/ceph.conf.new
Feb 23 09:48:38 np0005626463.localdomain sudo[301493]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:48:38 np0005626463.localdomain sudo[301493]: pam_unix(sudo:session): session closed for user root
Feb 23 09:48:38 np0005626463.localdomain sudo[301511]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/etc/ceph/ceph.conf.new
Feb 23 09:48:38 np0005626463.localdomain sudo[301511]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:48:38 np0005626463.localdomain sudo[301511]: pam_unix(sudo:session): session closed for user root
Feb 23 09:48:38 np0005626463.localdomain ceph-mgr[288036]: [cephadm INFO cephadm.serve] Updating np0005626465.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf
Feb 23 09:48:38 np0005626463.localdomain ceph-mgr[288036]: log_channel(cephadm) log [INF] : Updating np0005626465.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf
Feb 23 09:48:38 np0005626463.localdomain ceph-mgr[288036]: [cephadm INFO cephadm.serve] Updating np0005626466.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf
Feb 23 09:48:38 np0005626463.localdomain ceph-mgr[288036]: log_channel(cephadm) log [INF] : Updating np0005626466.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf
Feb 23 09:48:38 np0005626463.localdomain sudo[301529]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Feb 23 09:48:38 np0005626463.localdomain sudo[301529]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:48:38 np0005626463.localdomain sudo[301529]: pam_unix(sudo:session): session closed for user root
Feb 23 09:48:38 np0005626463.localdomain ceph-mgr[288036]: [cephadm INFO cephadm.serve] Updating np0005626463.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf
Feb 23 09:48:38 np0005626463.localdomain ceph-mgr[288036]: log_channel(cephadm) log [INF] : Updating np0005626463.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf
Feb 23 09:48:38 np0005626463.localdomain ceph-mgr[288036]: [cephadm INFO cephadm.serve] Updating np0005626461.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf
Feb 23 09:48:38 np0005626463.localdomain ceph-mgr[288036]: log_channel(cephadm) log [INF] : Updating np0005626461.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf
Feb 23 09:48:38 np0005626463.localdomain sudo[301547]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config
Feb 23 09:48:38 np0005626463.localdomain sudo[301547]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:48:38 np0005626463.localdomain sudo[301547]: pam_unix(sudo:session): session closed for user root
Feb 23 09:48:38 np0005626463.localdomain sudo[301565]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config
Feb 23 09:48:38 np0005626463.localdomain sudo[301565]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:48:38 np0005626463.localdomain sudo[301565]: pam_unix(sudo:session): session closed for user root
Feb 23 09:48:38 np0005626463.localdomain sudo[301583]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf.new
Feb 23 09:48:38 np0005626463.localdomain sudo[301583]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:48:38 np0005626463.localdomain sudo[301583]: pam_unix(sudo:session): session closed for user root
Feb 23 09:48:38 np0005626463.localdomain sudo[301601]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46
Feb 23 09:48:38 np0005626463.localdomain sudo[301601]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:48:38 np0005626463.localdomain sudo[301601]: pam_unix(sudo:session): session closed for user root
Feb 23 09:48:38 np0005626463.localdomain sudo[301619]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf.new
Feb 23 09:48:38 np0005626463.localdomain sudo[301619]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:48:38 np0005626463.localdomain sudo[301619]: pam_unix(sudo:session): session closed for user root
Feb 23 09:48:38 np0005626463.localdomain ceph-mon[294160]: Updating np0005626461.localdomain:/etc/ceph/ceph.conf
Feb 23 09:48:38 np0005626463.localdomain ceph-mon[294160]: Updating np0005626463.localdomain:/etc/ceph/ceph.conf
Feb 23 09:48:38 np0005626463.localdomain ceph-mon[294160]: Updating np0005626465.localdomain:/etc/ceph/ceph.conf
Feb 23 09:48:38 np0005626463.localdomain ceph-mon[294160]: Updating np0005626466.localdomain:/etc/ceph/ceph.conf
Feb 23 09:48:38 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626465"} : dispatch
Feb 23 09:48:38 np0005626463.localdomain ceph-mon[294160]: pgmap v18: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:48:38 np0005626463.localdomain ceph-mgr[288036]: mgr.server handle_open ignoring open from mon.np0005626465 172.18.0.107:0/438899899; not ready for session (expect reconnect)
Feb 23 09:48:38 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@1(peon) e12 handle_command mon_command({"prefix": "mon metadata", "id": "np0005626465"} v 0)
Feb 23 09:48:38 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626465"} : dispatch
Feb 23 09:48:38 np0005626463.localdomain ceph-mgr[288036]: mgr finish mon failed to return metadata for mon.np0005626465: (2) No such file or directory
Feb 23 09:48:38 np0005626463.localdomain ceph-mgr[288036]: log_channel(audit) log [DBG] : from='client.44407 -' entity='client.admin' cmd=[{"prefix": "orch host label rm", "hostname": "np0005626461.localdomain", "label": "mon", "target": ["mon-mgr", ""]}]: dispatch
Feb 23 09:48:38 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@1(peon) e12 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/inventory}] v 0)
Feb 23 09:48:38 np0005626463.localdomain ceph-mgr[288036]: [cephadm INFO root] Removed label mon from host np0005626461.localdomain
Feb 23 09:48:38 np0005626463.localdomain ceph-mgr[288036]: log_channel(cephadm) log [INF] : Removed label mon from host np0005626461.localdomain
Feb 23 09:48:38 np0005626463.localdomain sudo[301653]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf.new
Feb 23 09:48:38 np0005626463.localdomain sudo[301653]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:48:38 np0005626463.localdomain sudo[301653]: pam_unix(sudo:session): session closed for user root
Feb 23 09:48:38 np0005626463.localdomain sudo[301671]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf.new
Feb 23 09:48:38 np0005626463.localdomain sudo[301671]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:48:38 np0005626463.localdomain sudo[301671]: pam_unix(sudo:session): session closed for user root
Feb 23 09:48:39 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@1(peon) e12 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626465.localdomain.devices.0}] v 0)
Feb 23 09:48:39 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@1(peon) e12 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626465.localdomain}] v 0)
Feb 23 09:48:39 np0005626463.localdomain sudo[301689]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf.new /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf
Feb 23 09:48:39 np0005626463.localdomain sudo[301689]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:48:39 np0005626463.localdomain sudo[301689]: pam_unix(sudo:session): session closed for user root
Feb 23 09:48:39 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@1(peon) e12 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626463.localdomain.devices.0}] v 0)
Feb 23 09:48:39 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@1(peon) e12 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain.devices.0}] v 0)
Feb 23 09:48:39 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@1(peon) e12 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626463.localdomain}] v 0)
Feb 23 09:48:39 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@1(peon) e12 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626461.localdomain.devices.0}] v 0)
Feb 23 09:48:39 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@1(peon) e12 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain}] v 0)
Feb 23 09:48:39 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@1(peon) e12 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626461.localdomain}] v 0)
Feb 23 09:48:39 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@1(peon) e12 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Feb 23 09:48:39 np0005626463.localdomain ceph-mgr[288036]: [progress INFO root] update: starting ev 5660bf6e-c23f-4dad-b2ec-cb5c00c6efcb (Updating node-proxy deployment (+4 -> 4))
Feb 23 09:48:39 np0005626463.localdomain ceph-mgr[288036]: [progress INFO root] complete: finished ev 5660bf6e-c23f-4dad-b2ec-cb5c00c6efcb (Updating node-proxy deployment (+4 -> 4))
Feb 23 09:48:39 np0005626463.localdomain ceph-mgr[288036]: [progress INFO root] Completed event 5660bf6e-c23f-4dad-b2ec-cb5c00c6efcb (Updating node-proxy deployment (+4 -> 4)) in 0 seconds
Feb 23 09:48:39 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@1(peon) e12 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0)
Feb 23 09:48:39 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 23 09:48:39 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@1(peon) e12  adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Feb 23 09:48:39 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@1(peon) e12  adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Feb 23 09:48:39 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@1(probing) e13 handle_command mon_command({"prefix": "mon metadata", "id": "np0005626463"} v 0)
Feb 23 09:48:39 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626463"} : dispatch
Feb 23 09:48:39 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@1(probing) e13 handle_command mon_command({"prefix": "mon metadata", "id": "np0005626465"} v 0)
Feb 23 09:48:39 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626465"} : dispatch
Feb 23 09:48:39 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [INF] : mon.np0005626463 calling monitor election
Feb 23 09:48:39 np0005626463.localdomain ceph-mon[294160]: paxos.1).electionLogic(52) init, last seen epoch 52
Feb 23 09:48:39 np0005626463.localdomain ceph-mgr[288036]: mgr finish mon failed to return metadata for mon.np0005626465: (22) Invalid argument
Feb 23 09:48:39 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@1(electing) e13 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Feb 23 09:48:39 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@1(electing) e13 handle_command mon_command({"prefix": "mon metadata", "id": "np0005626466"} v 0)
Feb 23 09:48:39 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626466"} : dispatch
Feb 23 09:48:39 np0005626463.localdomain sudo[301707]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 09:48:39 np0005626463.localdomain sudo[301707]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:48:39 np0005626463.localdomain sudo[301707]: pam_unix(sudo:session): session closed for user root
Feb 23 09:48:39 np0005626463.localdomain podman[242954]: time="2026-02-23T09:48:39Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 23 09:48:39 np0005626463.localdomain podman[242954]: @ - - [23/Feb/2026:09:48:39 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155258 "" "Go-http-client/1.1"
Feb 23 09:48:39 np0005626463.localdomain ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring mgr.np0005626461.lrfquh (monmap changed)...
Feb 23 09:48:39 np0005626463.localdomain ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring mgr.np0005626461.lrfquh (monmap changed)...
Feb 23 09:48:39 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@1(electing) e13 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mgr.np0005626461.lrfquh", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} v 0)
Feb 23 09:48:39 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626461.lrfquh", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 23 09:48:39 np0005626463.localdomain podman[242954]: @ - - [23/Feb/2026:09:48:39 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18279 "" "Go-http-client/1.1"
Feb 23 09:48:39 np0005626463.localdomain ceph-mgr[288036]: mgr.server handle_open ignoring open from mon.np0005626465 172.18.0.107:0/438899899; not ready for session (expect reconnect)
Feb 23 09:48:39 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@1(electing) e13 handle_command mon_command({"prefix": "mon metadata", "id": "np0005626465"} v 0)
Feb 23 09:48:39 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626465"} : dispatch
Feb 23 09:48:39 np0005626463.localdomain ceph-mgr[288036]: mgr finish mon failed to return metadata for mon.np0005626465: (22) Invalid argument
Feb 23 09:48:39 np0005626463.localdomain ceph-mgr[288036]: log_channel(cluster) log [DBG] : pgmap v19: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:48:40 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:48:40.740 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:48:40 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:48:40.742 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:48:40 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:48:40.742 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 09:48:40 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:48:40.742 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:48:40 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:48:40.775 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:48:40 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:48:40.776 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:48:40 np0005626463.localdomain ceph-mgr[288036]: mgr.server handle_open ignoring open from mon.np0005626465 172.18.0.107:0/438899899; not ready for session (expect reconnect)
Feb 23 09:48:40 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@1(electing) e13 handle_command mon_command({"prefix": "mon metadata", "id": "np0005626465"} v 0)
Feb 23 09:48:40 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626465"} : dispatch
Feb 23 09:48:40 np0005626463.localdomain ceph-mgr[288036]: mgr finish mon failed to return metadata for mon.np0005626465: (22) Invalid argument
Feb 23 09:48:40 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.
Feb 23 09:48:40 np0005626463.localdomain podman[301725]: 2026-02-23 09:48:40.903138739 +0000 UTC m=+0.064809703 container health_status da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 23 09:48:40 np0005626463.localdomain podman[301725]: 2026-02-23 09:48:40.916512726 +0000 UTC m=+0.078183720 container exec_died da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 23 09:48:40 np0005626463.localdomain systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Deactivated successfully.
Feb 23 09:48:41 np0005626463.localdomain ceph-mgr[288036]: mgr.server handle_open ignoring open from mon.np0005626465 172.18.0.107:0/438899899; not ready for session (expect reconnect)
Feb 23 09:48:41 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@1(electing) e13 handle_command mon_command({"prefix": "mon metadata", "id": "np0005626465"} v 0)
Feb 23 09:48:41 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626465"} : dispatch
Feb 23 09:48:41 np0005626463.localdomain ceph-mgr[288036]: mgr finish mon failed to return metadata for mon.np0005626465: (22) Invalid argument
Feb 23 09:48:41 np0005626463.localdomain ceph-mgr[288036]: log_channel(cluster) log [DBG] : pgmap v20: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:48:42 np0005626463.localdomain ceph-mgr[288036]: mgr.server handle_open ignoring open from mon.np0005626465 172.18.0.107:0/438899899; not ready for session (expect reconnect)
Feb 23 09:48:42 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@1(electing) e13 handle_command mon_command({"prefix": "mon metadata", "id": "np0005626465"} v 0)
Feb 23 09:48:42 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626465"} : dispatch
Feb 23 09:48:42 np0005626463.localdomain ceph-mgr[288036]: mgr finish mon failed to return metadata for mon.np0005626465: (22) Invalid argument
Feb 23 09:48:43 np0005626463.localdomain ceph-mgr[288036]: [progress INFO root] Writing back 50 completed events
Feb 23 09:48:43 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@1(electing) e13 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Feb 23 09:48:43 np0005626463.localdomain openstack_network_exporter[245358]: ERROR   09:48:43 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 23 09:48:43 np0005626463.localdomain openstack_network_exporter[245358]: ERROR   09:48:43 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 23 09:48:43 np0005626463.localdomain ceph-mgr[288036]: mgr.server handle_open ignoring open from mon.np0005626465 172.18.0.107:0/438899899; not ready for session (expect reconnect)
Feb 23 09:48:43 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@1(electing) e13 handle_command mon_command({"prefix": "mon metadata", "id": "np0005626465"} v 0)
Feb 23 09:48:43 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626465"} : dispatch
Feb 23 09:48:43 np0005626463.localdomain ceph-mgr[288036]: mgr finish mon failed to return metadata for mon.np0005626465: (22) Invalid argument
Feb 23 09:48:43 np0005626463.localdomain ceph-mgr[288036]: log_channel(cluster) log [DBG] : pgmap v21: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:48:44 np0005626463.localdomain ceph-mon[294160]: paxos.1).electionLogic(53) init, last seen epoch 53, mid-election, bumping
Feb 23 09:48:44 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@1(electing) e13 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Feb 23 09:48:44 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@1(electing) e13 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Feb 23 09:48:44 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@1(peon) e13 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Feb 23 09:48:44 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command({"prefix": "mgr services"} v 0)
Feb 23 09:48:44 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mgr services"} : dispatch
Feb 23 09:48:44 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 23 09:48:44 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:48:44 np0005626463.localdomain ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring daemon mgr.np0005626461.lrfquh on np0005626461.localdomain
Feb 23 09:48:44 np0005626463.localdomain ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring daemon mgr.np0005626461.lrfquh on np0005626461.localdomain
Feb 23 09:48:44 np0005626463.localdomain ceph-mon[294160]: mon.np0005626466 calling monitor election
Feb 23 09:48:44 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626463"} : dispatch
Feb 23 09:48:44 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626465"} : dispatch
Feb 23 09:48:44 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463 calling monitor election
Feb 23 09:48:44 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626466"} : dispatch
Feb 23 09:48:44 np0005626463.localdomain ceph-mon[294160]: Reconfiguring mgr.np0005626461.lrfquh (monmap changed)...
Feb 23 09:48:44 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626461.lrfquh", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 23 09:48:44 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626465"} : dispatch
Feb 23 09:48:44 np0005626463.localdomain ceph-mon[294160]: pgmap v19: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:48:44 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626465"} : dispatch
Feb 23 09:48:44 np0005626463.localdomain ceph-mon[294160]: mon.np0005626465 calling monitor election
Feb 23 09:48:44 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626465"} : dispatch
Feb 23 09:48:44 np0005626463.localdomain ceph-mon[294160]: pgmap v20: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:48:44 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626465"} : dispatch
Feb 23 09:48:44 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626465"} : dispatch
Feb 23 09:48:44 np0005626463.localdomain ceph-mon[294160]: pgmap v21: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:48:44 np0005626463.localdomain ceph-mon[294160]: mon.np0005626466 is new leader, mons np0005626466,np0005626463,np0005626465 in quorum (ranks 0,1,2)
Feb 23 09:48:44 np0005626463.localdomain ceph-mon[294160]: monmap epoch 13
Feb 23 09:48:44 np0005626463.localdomain ceph-mon[294160]: fsid f1fea371-cb69-578d-a3d0-b5c472a84b46
Feb 23 09:48:44 np0005626463.localdomain ceph-mon[294160]: last_changed 2026-02-23T09:48:39.200687+0000
Feb 23 09:48:44 np0005626463.localdomain ceph-mon[294160]: created 2026-02-23T07:36:01.997603+0000
Feb 23 09:48:44 np0005626463.localdomain ceph-mon[294160]: min_mon_release 18 (reef)
Feb 23 09:48:44 np0005626463.localdomain ceph-mon[294160]: election_strategy: 1
Feb 23 09:48:44 np0005626463.localdomain ceph-mon[294160]: 0: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005626466
Feb 23 09:48:44 np0005626463.localdomain ceph-mon[294160]: 1: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005626463
Feb 23 09:48:44 np0005626463.localdomain ceph-mon[294160]: 2: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005626465
Feb 23 09:48:44 np0005626463.localdomain ceph-mon[294160]: fsmap cephfs:1 {0=mds.np0005626463.qcthuc=up:active} 2 up:standby
Feb 23 09:48:44 np0005626463.localdomain ceph-mon[294160]: osdmap e84: 6 total, 6 up, 6 in
Feb 23 09:48:44 np0005626463.localdomain ceph-mon[294160]: mgrmap e32: np0005626463.wtksup(active, since 36s), standbys: np0005626466.nisqfq, np0005626465.hlpkwo, np0005626461.lrfquh
Feb 23 09:48:44 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626461.lrfquh", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 23 09:48:44 np0005626463.localdomain ceph-mon[294160]: overall HEALTH_OK
Feb 23 09:48:44 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mgr services"} : dispatch
Feb 23 09:48:44 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' 
Feb 23 09:48:44 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:48:44 np0005626463.localdomain sshd[301749]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:48:44 np0005626463.localdomain ceph-mgr[288036]: log_channel(audit) log [DBG] : from='client.44410 -' entity='client.admin' cmd=[{"prefix": "orch host label rm", "hostname": "np0005626461.localdomain", "label": "mgr", "target": ["mon-mgr", ""]}]: dispatch
Feb 23 09:48:44 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/inventory}] v 0)
Feb 23 09:48:44 np0005626463.localdomain ceph-mgr[288036]: [cephadm INFO root] Removed label mgr from host np0005626461.localdomain
Feb 23 09:48:44 np0005626463.localdomain ceph-mgr[288036]: log_channel(cephadm) log [INF] : Removed label mgr from host np0005626461.localdomain
Feb 23 09:48:44 np0005626463.localdomain ceph-mgr[288036]: mgr.server handle_open ignoring open from mon.np0005626465 172.18.0.107:0/438899899; not ready for session (expect reconnect)
Feb 23 09:48:44 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command({"prefix": "mon metadata", "id": "np0005626465"} v 0)
Feb 23 09:48:44 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626465"} : dispatch
Feb 23 09:48:44 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.
Feb 23 09:48:44 np0005626463.localdomain podman[301751]: 2026-02-23 09:48:44.904537702 +0000 UTC m=+0.080590004 container health_status 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, version=9.7, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, maintainer=Red Hat, Inc., name=ubi9/ubi-minimal, managed_by=edpm_ansible, release=1770267347, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2026-02-05T04:57:10Z, com.redhat.component=ubi9-minimal-container, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.created=2026-02-05T04:57:10Z, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, distribution-scope=public, vendor=Red Hat, Inc., container_name=openstack_network_exporter, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, config_id=openstack_network_exporter)
Feb 23 09:48:44 np0005626463.localdomain podman[301751]: 2026-02-23 09:48:44.919553649 +0000 UTC m=+0.095605901 container exec_died 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, release=1770267347, org.opencontainers.image.created=2026-02-05T04:57:10Z, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, architecture=x86_64, name=ubi9/ubi-minimal, distribution-scope=public, vcs-type=git, config_id=openstack_network_exporter, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, version=9.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2026-02-05T04:57:10Z, io.buildah.version=1.33.7, io.openshift.expose-services=, maintainer=Red Hat, Inc.)
Feb 23 09:48:44 np0005626463.localdomain systemd[1]: 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.service: Deactivated successfully.
Feb 23 09:48:45 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626461.localdomain.devices.0}] v 0)
Feb 23 09:48:45 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626461.localdomain}] v 0)
Feb 23 09:48:45 np0005626463.localdomain ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring crash.np0005626461 (monmap changed)...
Feb 23 09:48:45 np0005626463.localdomain ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring crash.np0005626461 (monmap changed)...
Feb 23 09:48:45 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.crash.np0005626461", "caps": ["mon", "profile crash", "mgr", "profile crash"]} v 0)
Feb 23 09:48:45 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626461", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 23 09:48:45 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 23 09:48:45 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:48:45 np0005626463.localdomain ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring daemon crash.np0005626461 on np0005626461.localdomain
Feb 23 09:48:45 np0005626463.localdomain ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring daemon crash.np0005626461 on np0005626461.localdomain
Feb 23 09:48:45 np0005626463.localdomain sshd[301749]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:48:45 np0005626463.localdomain ceph-mon[294160]: Reconfiguring daemon mgr.np0005626461.lrfquh on np0005626461.localdomain
Feb 23 09:48:45 np0005626463.localdomain ceph-mon[294160]: from='client.44410 -' entity='client.admin' cmd=[{"prefix": "orch host label rm", "hostname": "np0005626461.localdomain", "label": "mgr", "target": ["mon-mgr", ""]}]: dispatch
Feb 23 09:48:45 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' 
Feb 23 09:48:45 np0005626463.localdomain ceph-mon[294160]: Removed label mgr from host np0005626461.localdomain
Feb 23 09:48:45 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626465"} : dispatch
Feb 23 09:48:45 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' 
Feb 23 09:48:45 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' 
Feb 23 09:48:45 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626461", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 23 09:48:45 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626461", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 23 09:48:45 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:48:45 np0005626463.localdomain ceph-mgr[288036]: log_channel(audit) log [DBG] : from='client.34494 -' entity='client.admin' cmd=[{"prefix": "orch host label rm", "hostname": "np0005626461.localdomain", "label": "_admin", "target": ["mon-mgr", ""]}]: dispatch
Feb 23 09:48:45 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/inventory}] v 0)
Feb 23 09:48:45 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:48:45.777 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:48:45 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:48:45.779 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:48:45 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:48:45.779 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 09:48:45 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:48:45.780 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:48:45 np0005626463.localdomain ceph-mgr[288036]: mgr.server handle_report got status from non-daemon mon.np0005626465
Feb 23 09:48:45 np0005626463.localdomain ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mgr-np0005626463-wtksup[288032]: 2026-02-23T09:48:45.798+0000 7f44258e4640 -1 mgr.server handle_report got status from non-daemon mon.np0005626465
Feb 23 09:48:45 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:48:45.819 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:48:45 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:48:45.820 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:48:45 np0005626463.localdomain ceph-mgr[288036]: [cephadm INFO root] Removed label _admin from host np0005626461.localdomain
Feb 23 09:48:45 np0005626463.localdomain ceph-mgr[288036]: log_channel(cephadm) log [INF] : Removed label _admin from host np0005626461.localdomain
Feb 23 09:48:45 np0005626463.localdomain ceph-mgr[288036]: log_channel(cluster) log [DBG] : pgmap v22: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:48:46 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@1(peon).osd e84 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:48:46 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626461.localdomain.devices.0}] v 0)
Feb 23 09:48:46 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626461.localdomain}] v 0)
Feb 23 09:48:46 np0005626463.localdomain ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring crash.np0005626463 (monmap changed)...
Feb 23 09:48:46 np0005626463.localdomain ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring crash.np0005626463 (monmap changed)...
Feb 23 09:48:46 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.crash.np0005626463", "caps": ["mon", "profile crash", "mgr", "profile crash"]} v 0)
Feb 23 09:48:46 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626463", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 23 09:48:46 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 23 09:48:46 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:48:46 np0005626463.localdomain ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring daemon crash.np0005626463 on np0005626463.localdomain
Feb 23 09:48:46 np0005626463.localdomain ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring daemon crash.np0005626463 on np0005626463.localdomain
Feb 23 09:48:46 np0005626463.localdomain sudo[301771]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 09:48:46 np0005626463.localdomain sudo[301771]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:48:46 np0005626463.localdomain sudo[301771]: pam_unix(sudo:session): session closed for user root
Feb 23 09:48:46 np0005626463.localdomain sudo[301789]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid f1fea371-cb69-578d-a3d0-b5c472a84b46
Feb 23 09:48:46 np0005626463.localdomain sudo[301789]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:48:46 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #28. Immutable memtables: 0.
Feb 23 09:48:46 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:48:46.566269) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Feb 23 09:48:46 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/flush_job.cc:856] [default] [JOB 13] Flushing memtable with next log file: 28
Feb 23 09:48:46 np0005626463.localdomain ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840126566315, "job": 13, "event": "flush_started", "num_memtables": 1, "num_entries": 776, "num_deletes": 251, "total_data_size": 834042, "memory_usage": 850072, "flush_reason": "Manual Compaction"}
Feb 23 09:48:46 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/flush_job.cc:885] [default] [JOB 13] Level-0 flush table #29: started
Feb 23 09:48:46 np0005626463.localdomain ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840126572713, "cf_name": "default", "job": 13, "event": "table_file_creation", "file_number": 29, "file_size": 525456, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 17137, "largest_seqno": 17908, "table_properties": {"data_size": 521724, "index_size": 1459, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1221, "raw_key_size": 9626, "raw_average_key_size": 19, "raw_value_size": 513473, "raw_average_value_size": 1063, "num_data_blocks": 58, "num_entries": 483, "num_filter_entries": 483, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771840104, "oldest_key_time": 1771840104, "file_creation_time": 1771840126, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4cfd6c8f-aafa-4003-b2f6-d22c49635dd4", "db_session_id": "66DAQ76CBLV8DSGL8JC7", "orig_file_number": 29, "seqno_to_time_mapping": "N/A"}}
Feb 23 09:48:46 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 13] Flush lasted 6482 microseconds, and 2569 cpu microseconds.
Feb 23 09:48:46 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 23 09:48:46 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:48:46.572750) [db/flush_job.cc:967] [default] [JOB 13] Level-0 flush table #29: 525456 bytes OK
Feb 23 09:48:46 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:48:46.572772) [db/memtable_list.cc:519] [default] Level-0 commit table #29 started
Feb 23 09:48:46 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:48:46.574652) [db/memtable_list.cc:722] [default] Level-0 commit table #29: memtable #1 done
Feb 23 09:48:46 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:48:46.574672) EVENT_LOG_v1 {"time_micros": 1771840126574667, "job": 13, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Feb 23 09:48:46 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:48:46.574693) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Feb 23 09:48:46 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 13] Try to delete WAL files size 829715, prev total WAL file size 829715, number of live WAL files 2.
Feb 23 09:48:46 np0005626463.localdomain ceph-mon[294160]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005626463/store.db/000025.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 23 09:48:46 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:48:46.575316) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6B760031323733' seq:72057594037927935, type:22 .. '6B760031353235' seq:0, type:0; will stop at (end)
Feb 23 09:48:46 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 14] Compacting 1@0 + 1@6 files to L6, score -1.00
Feb 23 09:48:46 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 13 Base level 0, inputs: [29(513KB)], [27(17MB)]
Feb 23 09:48:46 np0005626463.localdomain ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840126575353, "job": 14, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [29], "files_L6": [27], "score": -1, "input_data_size": 18983838, "oldest_snapshot_seqno": -1}
Feb 23 09:48:46 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 14] Generated table #30: 11078 keys, 17918100 bytes, temperature: kUnknown
Feb 23 09:48:46 np0005626463.localdomain ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840126666190, "cf_name": "default", "job": 14, "event": "table_file_creation", "file_number": 30, "file_size": 17918100, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 17853249, "index_size": 36078, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 27717, "raw_key_size": 298828, "raw_average_key_size": 26, "raw_value_size": 17662182, "raw_average_value_size": 1594, "num_data_blocks": 1367, "num_entries": 11078, "num_filter_entries": 11078, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771839971, "oldest_key_time": 0, "file_creation_time": 1771840126, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4cfd6c8f-aafa-4003-b2f6-d22c49635dd4", "db_session_id": "66DAQ76CBLV8DSGL8JC7", "orig_file_number": 30, "seqno_to_time_mapping": "N/A"}}
Feb 23 09:48:46 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 23 09:48:46 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:48:46.666500) [db/compaction/compaction_job.cc:1663] [default] [JOB 14] Compacted 1@0 + 1@6 files to L6 => 17918100 bytes
Feb 23 09:48:46 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:48:46.668373) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 208.8 rd, 197.1 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.5, 17.6 +0.0 blob) out(17.1 +0.0 blob), read-write-amplify(70.2) write-amplify(34.1) OK, records in: 11614, records dropped: 536 output_compression: NoCompression
Feb 23 09:48:46 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:48:46.668405) EVENT_LOG_v1 {"time_micros": 1771840126668390, "job": 14, "event": "compaction_finished", "compaction_time_micros": 90927, "compaction_time_cpu_micros": 35993, "output_level": 6, "num_output_files": 1, "total_output_size": 17918100, "num_input_records": 11614, "num_output_records": 11078, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Feb 23 09:48:46 np0005626463.localdomain ceph-mon[294160]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005626463/store.db/000029.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 23 09:48:46 np0005626463.localdomain ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840126668926, "job": 14, "event": "table_file_deletion", "file_number": 29}
Feb 23 09:48:46 np0005626463.localdomain ceph-mon[294160]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005626463/store.db/000027.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 23 09:48:46 np0005626463.localdomain ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840126671588, "job": 14, "event": "table_file_deletion", "file_number": 27}
Feb 23 09:48:46 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:48:46.575218) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 23 09:48:46 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:48:46.671652) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 23 09:48:46 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:48:46.671660) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 23 09:48:46 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:48:46.671664) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 23 09:48:46 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:48:46.671667) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 23 09:48:46 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:48:46.671670) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 23 09:48:46 np0005626463.localdomain podman[301824]: 
Feb 23 09:48:46 np0005626463.localdomain podman[301824]: 2026-02-23 09:48:46.819544695 +0000 UTC m=+0.072197267 container create 9a86e0d8c29733c0e36f1c2a9b637cd8c5301e546e42c171addadb95de233c06 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=friendly_benz, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.expose-services=, org.opencontainers.image.created=2026-02-09T10:25:24Z, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.buildah.version=1.42.2, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, RELEASE=main, release=1770267347, architecture=x86_64, description=Red Hat Ceph Storage 7, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2026-02-09T10:25:24Z, GIT_CLEAN=True, name=rhceph, version=7, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, ceph=True)
Feb 23 09:48:46 np0005626463.localdomain ceph-mon[294160]: Reconfiguring crash.np0005626461 (monmap changed)...
Feb 23 09:48:46 np0005626463.localdomain ceph-mon[294160]: Reconfiguring daemon crash.np0005626461 on np0005626461.localdomain
Feb 23 09:48:46 np0005626463.localdomain ceph-mon[294160]: from='client.34494 -' entity='client.admin' cmd=[{"prefix": "orch host label rm", "hostname": "np0005626461.localdomain", "label": "_admin", "target": ["mon-mgr", ""]}]: dispatch
Feb 23 09:48:46 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' 
Feb 23 09:48:46 np0005626463.localdomain ceph-mon[294160]: Removed label _admin from host np0005626461.localdomain
Feb 23 09:48:46 np0005626463.localdomain ceph-mon[294160]: pgmap v22: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:48:46 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' 
Feb 23 09:48:46 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' 
Feb 23 09:48:46 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626463", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 23 09:48:46 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626463", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 23 09:48:46 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:48:46 np0005626463.localdomain systemd[1]: Started libpod-conmon-9a86e0d8c29733c0e36f1c2a9b637cd8c5301e546e42c171addadb95de233c06.scope.
Feb 23 09:48:46 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 09:48:46 np0005626463.localdomain podman[301824]: 2026-02-23 09:48:46.790631355 +0000 UTC m=+0.043283957 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 23 09:48:46 np0005626463.localdomain podman[301824]: 2026-02-23 09:48:46.899352855 +0000 UTC m=+0.152005437 container init 9a86e0d8c29733c0e36f1c2a9b637cd8c5301e546e42c171addadb95de233c06 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=friendly_benz, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.created=2026-02-09T10:25:24Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, GIT_CLEAN=True, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container, build-date=2026-02-09T10:25:24Z, release=1770267347, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, ceph=True, io.buildah.version=1.42.2, description=Red Hat Ceph Storage 7)
Feb 23 09:48:46 np0005626463.localdomain podman[301824]: 2026-02-23 09:48:46.910174764 +0000 UTC m=+0.162827336 container start 9a86e0d8c29733c0e36f1c2a9b637cd8c5301e546e42c171addadb95de233c06 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=friendly_benz, com.redhat.component=rhceph-container, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, build-date=2026-02-09T10:25:24Z, io.k8s.description=Red Hat Ceph Storage 7, release=1770267347, version=7, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, GIT_CLEAN=True, io.buildah.version=1.42.2, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, ceph=True, RELEASE=main, distribution-scope=public, io.openshift.tags=rhceph ceph, vcs-type=git, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc.)
Feb 23 09:48:46 np0005626463.localdomain podman[301824]: 2026-02-23 09:48:46.91071504 +0000 UTC m=+0.163367672 container attach 9a86e0d8c29733c0e36f1c2a9b637cd8c5301e546e42c171addadb95de233c06 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=friendly_benz, distribution-scope=public, release=1770267347, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-02-09T10:25:24Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, name=rhceph, io.openshift.expose-services=, ceph=True, build-date=2026-02-09T10:25:24Z, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=7, GIT_BRANCH=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_CLEAN=True, io.buildah.version=1.42.2, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers)
Feb 23 09:48:46 np0005626463.localdomain friendly_benz[301841]: 167 167
Feb 23 09:48:46 np0005626463.localdomain systemd[1]: libpod-9a86e0d8c29733c0e36f1c2a9b637cd8c5301e546e42c171addadb95de233c06.scope: Deactivated successfully.
Feb 23 09:48:46 np0005626463.localdomain podman[301824]: 2026-02-23 09:48:46.912765353 +0000 UTC m=+0.165417965 container died 9a86e0d8c29733c0e36f1c2a9b637cd8c5301e546e42c171addadb95de233c06 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=friendly_benz, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, architecture=x86_64, ceph=True, io.buildah.version=1.42.2, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2026-02-09T10:25:24Z, RELEASE=main, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-02-09T10:25:24Z, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, release=1770267347, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, description=Red Hat Ceph Storage 7, name=rhceph)
Feb 23 09:48:47 np0005626463.localdomain podman[301846]: 2026-02-23 09:48:47.046946907 +0000 UTC m=+0.121620993 container remove 9a86e0d8c29733c0e36f1c2a9b637cd8c5301e546e42c171addadb95de233c06 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=friendly_benz, vcs-type=git, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, CEPH_POINT_RELEASE=, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, io.buildah.version=1.42.2, org.opencontainers.image.created=2026-02-09T10:25:24Z, release=1770267347, RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.description=Red Hat Ceph Storage 7, build-date=2026-02-09T10:25:24Z, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, distribution-scope=public, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, version=7)
Feb 23 09:48:47 np0005626463.localdomain systemd[1]: libpod-conmon-9a86e0d8c29733c0e36f1c2a9b637cd8c5301e546e42c171addadb95de233c06.scope: Deactivated successfully.
Feb 23 09:48:47 np0005626463.localdomain sudo[301789]: pam_unix(sudo:session): session closed for user root
Feb 23 09:48:47 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626463.localdomain.devices.0}] v 0)
Feb 23 09:48:47 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626463.localdomain}] v 0)
Feb 23 09:48:47 np0005626463.localdomain ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring osd.2 (monmap changed)...
Feb 23 09:48:47 np0005626463.localdomain ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring osd.2 (monmap changed)...
Feb 23 09:48:47 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command({"prefix": "auth get", "entity": "osd.2"} v 0)
Feb 23 09:48:47 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch
Feb 23 09:48:47 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 23 09:48:47 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:48:47 np0005626463.localdomain ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.2 on np0005626463.localdomain
Feb 23 09:48:47 np0005626463.localdomain ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.2 on np0005626463.localdomain
Feb 23 09:48:47 np0005626463.localdomain sudo[301863]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 09:48:47 np0005626463.localdomain sudo[301863]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:48:47 np0005626463.localdomain sudo[301863]: pam_unix(sudo:session): session closed for user root
Feb 23 09:48:47 np0005626463.localdomain sudo[301881]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid f1fea371-cb69-578d-a3d0-b5c472a84b46
Feb 23 09:48:47 np0005626463.localdomain sudo[301881]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:48:47 np0005626463.localdomain podman[301915]: 
Feb 23 09:48:47 np0005626463.localdomain podman[301915]: 2026-02-23 09:48:47.747270591 +0000 UTC m=+0.071761775 container create 99386ab6fe744e6cd82d1d14f586f756ededeb69d9f4787327cbc66ef9c27dfb (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=happy_jepsen, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, version=7, build-date=2026-02-09T10:25:24Z, io.buildah.version=1.42.2, vcs-type=git, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_CLEAN=True, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, release=1770267347, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, GIT_BRANCH=main, ceph=True, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Feb 23 09:48:47 np0005626463.localdomain systemd[1]: Started libpod-conmon-99386ab6fe744e6cd82d1d14f586f756ededeb69d9f4787327cbc66ef9c27dfb.scope.
Feb 23 09:48:47 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 09:48:47 np0005626463.localdomain podman[301915]: 2026-02-23 09:48:47.804978127 +0000 UTC m=+0.129469311 container init 99386ab6fe744e6cd82d1d14f586f756ededeb69d9f4787327cbc66ef9c27dfb (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=happy_jepsen, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1770267347, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., ceph=True, version=7, org.opencontainers.image.created=2026-02-09T10:25:24Z, name=rhceph, io.buildah.version=1.42.2, vcs-type=git, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, distribution-scope=public, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, build-date=2026-02-09T10:25:24Z, RELEASE=main, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main)
Feb 23 09:48:47 np0005626463.localdomain podman[301915]: 2026-02-23 09:48:47.814843698 +0000 UTC m=+0.139334892 container start 99386ab6fe744e6cd82d1d14f586f756ededeb69d9f4787327cbc66ef9c27dfb (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=happy_jepsen, distribution-scope=public, build-date=2026-02-09T10:25:24Z, io.openshift.tags=rhceph ceph, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, RELEASE=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.openshift.expose-services=, name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1770267347, GIT_CLEAN=True, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.42.2, ceph=True, version=7)
Feb 23 09:48:47 np0005626463.localdomain podman[301915]: 2026-02-23 09:48:47.815131526 +0000 UTC m=+0.139622750 container attach 99386ab6fe744e6cd82d1d14f586f756ededeb69d9f4787327cbc66ef9c27dfb (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=happy_jepsen, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1770267347, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.component=rhceph-container, io.buildah.version=1.42.2, GIT_BRANCH=main, io.openshift.expose-services=, org.opencontainers.image.created=2026-02-09T10:25:24Z, RELEASE=main, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, build-date=2026-02-09T10:25:24Z)
Feb 23 09:48:47 np0005626463.localdomain podman[301915]: 2026-02-23 09:48:47.718230717 +0000 UTC m=+0.042721911 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 23 09:48:47 np0005626463.localdomain happy_jepsen[301930]: 167 167
Feb 23 09:48:47 np0005626463.localdomain systemd[1]: libpod-99386ab6fe744e6cd82d1d14f586f756ededeb69d9f4787327cbc66ef9c27dfb.scope: Deactivated successfully.
Feb 23 09:48:47 np0005626463.localdomain podman[301915]: 2026-02-23 09:48:47.820028465 +0000 UTC m=+0.144519669 container died 99386ab6fe744e6cd82d1d14f586f756ededeb69d9f4787327cbc66ef9c27dfb (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=happy_jepsen, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, CEPH_POINT_RELEASE=, ceph=True, distribution-scope=public, build-date=2026-02-09T10:25:24Z, RELEASE=main, io.buildah.version=1.42.2, com.redhat.component=rhceph-container, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.created=2026-02-09T10:25:24Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, vcs-type=git, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1770267347, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, GIT_BRANCH=main, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Feb 23 09:48:47 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-6ccc8cd821e0e77de2133330dd6e4ebb587fdbc52d1d1bb0b6c7f2ddb14a27de-merged.mount: Deactivated successfully.
Feb 23 09:48:47 np0005626463.localdomain ceph-mon[294160]: Reconfiguring crash.np0005626463 (monmap changed)...
Feb 23 09:48:47 np0005626463.localdomain ceph-mon[294160]: Reconfiguring daemon crash.np0005626463 on np0005626463.localdomain
Feb 23 09:48:47 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' 
Feb 23 09:48:47 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' 
Feb 23 09:48:47 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch
Feb 23 09:48:47 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:48:47 np0005626463.localdomain ceph-mgr[288036]: log_channel(cluster) log [DBG] : pgmap v23: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:48:47 np0005626463.localdomain systemd[1]: tmp-crun.nZg3t9.mount: Deactivated successfully.
Feb 23 09:48:47 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-f924e07e34d7fc902ec0d8619ac4a9040361719cf6ecd2af6fe513a29e2ad9aa-merged.mount: Deactivated successfully.
Feb 23 09:48:47 np0005626463.localdomain podman[301935]: 2026-02-23 09:48:47.924300509 +0000 UTC m=+0.088678851 container remove 99386ab6fe744e6cd82d1d14f586f756ededeb69d9f4787327cbc66ef9c27dfb (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=happy_jepsen, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, org.opencontainers.image.created=2026-02-09T10:25:24Z, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.buildah.version=1.42.2, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1770267347, distribution-scope=public, io.openshift.expose-services=, ceph=True, architecture=x86_64, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_CLEAN=True, build-date=2026-02-09T10:25:24Z, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7)
Feb 23 09:48:47 np0005626463.localdomain systemd[1]: libpod-conmon-99386ab6fe744e6cd82d1d14f586f756ededeb69d9f4787327cbc66ef9c27dfb.scope: Deactivated successfully.
Feb 23 09:48:48 np0005626463.localdomain sudo[301881]: pam_unix(sudo:session): session closed for user root
Feb 23 09:48:48 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626463.localdomain.devices.0}] v 0)
Feb 23 09:48:48 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626463.localdomain}] v 0)
Feb 23 09:48:48 np0005626463.localdomain ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring osd.5 (monmap changed)...
Feb 23 09:48:48 np0005626463.localdomain ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring osd.5 (monmap changed)...
Feb 23 09:48:48 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command({"prefix": "auth get", "entity": "osd.5"} v 0)
Feb 23 09:48:48 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch
Feb 23 09:48:48 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 23 09:48:48 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:48:48 np0005626463.localdomain ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.5 on np0005626463.localdomain
Feb 23 09:48:48 np0005626463.localdomain ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.5 on np0005626463.localdomain
Feb 23 09:48:48 np0005626463.localdomain sudo[301958]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 09:48:48 np0005626463.localdomain sudo[301958]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:48:48 np0005626463.localdomain sudo[301958]: pam_unix(sudo:session): session closed for user root
Feb 23 09:48:48 np0005626463.localdomain sudo[301976]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid f1fea371-cb69-578d-a3d0-b5c472a84b46
Feb 23 09:48:48 np0005626463.localdomain sudo[301976]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:48:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:48:48.551 163572 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 09:48:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:48:48.552 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 09:48:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:48:48.553 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 09:48:48 np0005626463.localdomain podman[302010]: 
Feb 23 09:48:48 np0005626463.localdomain podman[302010]: 2026-02-23 09:48:48.713372623 +0000 UTC m=+0.074207809 container create 8944d372df72257636a21571f316571e9b0c25ec8429c36e404ff70d7520ae5f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=serene_goldwasser, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-type=git, name=rhceph, description=Red Hat Ceph Storage 7, build-date=2026-02-09T10:25:24Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, GIT_CLEAN=True, vendor=Red Hat, Inc., vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.buildah.version=1.42.2, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2026-02-09T10:25:24Z, version=7, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, release=1770267347, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Feb 23 09:48:48 np0005626463.localdomain systemd[1]: Started libpod-conmon-8944d372df72257636a21571f316571e9b0c25ec8429c36e404ff70d7520ae5f.scope.
Feb 23 09:48:48 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 09:48:48 np0005626463.localdomain podman[302010]: 2026-02-23 09:48:48.780007911 +0000 UTC m=+0.140843107 container init 8944d372df72257636a21571f316571e9b0c25ec8429c36e404ff70d7520ae5f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=serene_goldwasser, vcs-type=git, GIT_CLEAN=True, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, build-date=2026-02-09T10:25:24Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, version=7, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.42.2, com.redhat.component=rhceph-container, io.openshift.expose-services=, GIT_BRANCH=main, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-02-09T10:25:24Z, name=rhceph, release=1770267347, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Feb 23 09:48:48 np0005626463.localdomain podman[302010]: 2026-02-23 09:48:48.684820505 +0000 UTC m=+0.045655741 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 23 09:48:48 np0005626463.localdomain podman[302010]: 2026-02-23 09:48:48.789119359 +0000 UTC m=+0.149954555 container start 8944d372df72257636a21571f316571e9b0c25ec8429c36e404ff70d7520ae5f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=serene_goldwasser, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, release=1770267347, ceph=True, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, version=7, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, GIT_CLEAN=True, name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.buildah.version=1.42.2, build-date=2026-02-09T10:25:24Z, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, RELEASE=main)
Feb 23 09:48:48 np0005626463.localdomain podman[302010]: 2026-02-23 09:48:48.789350955 +0000 UTC m=+0.150186191 container attach 8944d372df72257636a21571f316571e9b0c25ec8429c36e404ff70d7520ae5f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=serene_goldwasser, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, build-date=2026-02-09T10:25:24Z, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, GIT_CLEAN=True, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.created=2026-02-09T10:25:24Z, vendor=Red Hat, Inc., io.openshift.expose-services=, RELEASE=main, name=rhceph, version=7, io.buildah.version=1.42.2, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, architecture=x86_64, CEPH_POINT_RELEASE=, release=1770267347)
Feb 23 09:48:48 np0005626463.localdomain serene_goldwasser[302025]: 167 167
Feb 23 09:48:48 np0005626463.localdomain systemd[1]: libpod-8944d372df72257636a21571f316571e9b0c25ec8429c36e404ff70d7520ae5f.scope: Deactivated successfully.
Feb 23 09:48:48 np0005626463.localdomain podman[302010]: 2026-02-23 09:48:48.791530902 +0000 UTC m=+0.152366118 container died 8944d372df72257636a21571f316571e9b0c25ec8429c36e404ff70d7520ae5f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=serene_goldwasser, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, distribution-scope=public, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, name=rhceph, version=7, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, GIT_BRANCH=main, CEPH_POINT_RELEASE=, RELEASE=main, io.buildah.version=1.42.2, org.opencontainers.image.created=2026-02-09T10:25:24Z, release=1770267347, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.tags=rhceph ceph, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, build-date=2026-02-09T10:25:24Z, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Feb 23 09:48:48 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-23c6762863234f041e65f11cb4d3527d94b7f66d53f48e5e21b871f6dec6e3ea-merged.mount: Deactivated successfully.
Feb 23 09:48:48 np0005626463.localdomain ceph-mon[294160]: Reconfiguring osd.2 (monmap changed)...
Feb 23 09:48:48 np0005626463.localdomain ceph-mon[294160]: Reconfiguring daemon osd.2 on np0005626463.localdomain
Feb 23 09:48:48 np0005626463.localdomain ceph-mon[294160]: pgmap v23: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:48:48 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' 
Feb 23 09:48:48 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' 
Feb 23 09:48:48 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch
Feb 23 09:48:48 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:48:48 np0005626463.localdomain podman[302030]: 2026-02-23 09:48:48.885000697 +0000 UTC m=+0.085136902 container remove 8944d372df72257636a21571f316571e9b0c25ec8429c36e404ff70d7520ae5f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=serene_goldwasser, url=https://catalog.redhat.com/en/search?searchType=containers, release=1770267347, distribution-scope=public, org.opencontainers.image.created=2026-02-09T10:25:24Z, architecture=x86_64, io.buildah.version=1.42.2, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2026-02-09T10:25:24Z, io.openshift.tags=rhceph ceph, RELEASE=main, vcs-type=git, CEPH_POINT_RELEASE=, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14)
Feb 23 09:48:48 np0005626463.localdomain systemd[1]: libpod-conmon-8944d372df72257636a21571f316571e9b0c25ec8429c36e404ff70d7520ae5f.scope: Deactivated successfully.
Feb 23 09:48:49 np0005626463.localdomain sudo[301976]: pam_unix(sudo:session): session closed for user root
Feb 23 09:48:49 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626463.localdomain.devices.0}] v 0)
Feb 23 09:48:49 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626463.localdomain}] v 0)
Feb 23 09:48:49 np0005626463.localdomain ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring mds.mds.np0005626463.qcthuc (monmap changed)...
Feb 23 09:48:49 np0005626463.localdomain ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring mds.mds.np0005626463.qcthuc (monmap changed)...
Feb 23 09:48:49 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mds.mds.np0005626463.qcthuc", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} v 0)
Feb 23 09:48:49 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005626463.qcthuc", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 23 09:48:49 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 23 09:48:49 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:48:49 np0005626463.localdomain ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring daemon mds.mds.np0005626463.qcthuc on np0005626463.localdomain
Feb 23 09:48:49 np0005626463.localdomain ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring daemon mds.mds.np0005626463.qcthuc on np0005626463.localdomain
Feb 23 09:48:49 np0005626463.localdomain sudo[302053]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 09:48:49 np0005626463.localdomain sudo[302053]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:48:49 np0005626463.localdomain sudo[302053]: pam_unix(sudo:session): session closed for user root
Feb 23 09:48:49 np0005626463.localdomain sudo[302071]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid f1fea371-cb69-578d-a3d0-b5c472a84b46
Feb 23 09:48:49 np0005626463.localdomain sudo[302071]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:48:49 np0005626463.localdomain podman[302105]: 2026-02-23 09:48:49.682784807 +0000 UTC m=+0.082925004 container create f5d6dbd1b4bab38207fdde3adcc4afc2337416115e621ab9fc53fe54aca62f92 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nervous_pare, name=rhceph, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, GIT_BRANCH=main, release=1770267347, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, architecture=x86_64, version=7, io.buildah.version=1.42.2, GIT_CLEAN=True, build-date=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.created=2026-02-09T10:25:24Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, vendor=Red Hat, Inc., distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Feb 23 09:48:49 np0005626463.localdomain systemd[1]: Started libpod-conmon-f5d6dbd1b4bab38207fdde3adcc4afc2337416115e621ab9fc53fe54aca62f92.scope.
Feb 23 09:48:49 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 09:48:49 np0005626463.localdomain podman[302105]: 2026-02-23 09:48:49.743501355 +0000 UTC m=+0.143641542 container init f5d6dbd1b4bab38207fdde3adcc4afc2337416115e621ab9fc53fe54aca62f92 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nervous_pare, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, io.buildah.version=1.42.2, io.openshift.expose-services=, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, architecture=x86_64, release=1770267347, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, org.opencontainers.image.created=2026-02-09T10:25:24Z, RELEASE=main, GIT_BRANCH=main, vendor=Red Hat, Inc., org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, build-date=2026-02-09T10:25:24Z, vcs-type=git, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=)
Feb 23 09:48:49 np0005626463.localdomain podman[302105]: 2026-02-23 09:48:49.646367609 +0000 UTC m=+0.046507826 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 23 09:48:49 np0005626463.localdomain podman[302105]: 2026-02-23 09:48:49.752114437 +0000 UTC m=+0.152254634 container start f5d6dbd1b4bab38207fdde3adcc4afc2337416115e621ab9fc53fe54aca62f92 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nervous_pare, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1770267347, architecture=x86_64, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.42.2, GIT_BRANCH=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, vcs-type=git, name=rhceph, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, description=Red Hat Ceph Storage 7, distribution-scope=public, CEPH_POINT_RELEASE=, build-date=2026-02-09T10:25:24Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Feb 23 09:48:49 np0005626463.localdomain podman[302105]: 2026-02-23 09:48:49.752408756 +0000 UTC m=+0.152548983 container attach f5d6dbd1b4bab38207fdde3adcc4afc2337416115e621ab9fc53fe54aca62f92 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nervous_pare, architecture=x86_64, GIT_CLEAN=True, version=7, io.openshift.expose-services=, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2026-02-09T10:25:24Z, release=1770267347, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, build-date=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=rhceph-container, name=rhceph, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, io.openshift.tags=rhceph ceph, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.42.2, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 23 09:48:49 np0005626463.localdomain nervous_pare[302120]: 167 167
Feb 23 09:48:49 np0005626463.localdomain systemd[1]: libpod-f5d6dbd1b4bab38207fdde3adcc4afc2337416115e621ab9fc53fe54aca62f92.scope: Deactivated successfully.
Feb 23 09:48:49 np0005626463.localdomain podman[302105]: 2026-02-23 09:48:49.755188061 +0000 UTC m=+0.155328278 container died f5d6dbd1b4bab38207fdde3adcc4afc2337416115e621ab9fc53fe54aca62f92 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nervous_pare, url=https://catalog.redhat.com/en/search?searchType=containers, release=1770267347, GIT_CLEAN=True, architecture=x86_64, distribution-scope=public, version=7, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, RELEASE=main, ceph=True, CEPH_POINT_RELEASE=, org.opencontainers.image.created=2026-02-09T10:25:24Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, description=Red Hat Ceph Storage 7, build-date=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.42.2)
Feb 23 09:48:49 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-fd6e93a01c30011113dbcf5b715246af0fcf1806fa795e42fe054e804dfb8762-merged.mount: Deactivated successfully.
Feb 23 09:48:49 np0005626463.localdomain podman[302125]: 2026-02-23 09:48:49.849016867 +0000 UTC m=+0.084937117 container remove f5d6dbd1b4bab38207fdde3adcc4afc2337416115e621ab9fc53fe54aca62f92 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nervous_pare, io.k8s.description=Red Hat Ceph Storage 7, version=7, io.openshift.expose-services=, GIT_CLEAN=True, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, architecture=x86_64, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, org.opencontainers.image.created=2026-02-09T10:25:24Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.42.2, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, release=1770267347, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, RELEASE=main, com.redhat.component=rhceph-container, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., distribution-scope=public, build-date=2026-02-09T10:25:24Z)
Feb 23 09:48:49 np0005626463.localdomain systemd[1]: libpod-conmon-f5d6dbd1b4bab38207fdde3adcc4afc2337416115e621ab9fc53fe54aca62f92.scope: Deactivated successfully.
Feb 23 09:48:49 np0005626463.localdomain ceph-mgr[288036]: log_channel(cluster) log [DBG] : pgmap v24: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:48:49 np0005626463.localdomain sudo[302071]: pam_unix(sudo:session): session closed for user root
Feb 23 09:48:49 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626463.localdomain.devices.0}] v 0)
Feb 23 09:48:49 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626463.localdomain}] v 0)
Feb 23 09:48:49 np0005626463.localdomain ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring mgr.np0005626463.wtksup (monmap changed)...
Feb 23 09:48:49 np0005626463.localdomain ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring mgr.np0005626463.wtksup (monmap changed)...
Feb 23 09:48:49 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mgr.np0005626463.wtksup", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} v 0)
Feb 23 09:48:49 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626463.wtksup", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 23 09:48:49 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command({"prefix": "mgr services"} v 0)
Feb 23 09:48:49 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mgr services"} : dispatch
Feb 23 09:48:49 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 23 09:48:49 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:48:49 np0005626463.localdomain ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring daemon mgr.np0005626463.wtksup on np0005626463.localdomain
Feb 23 09:48:49 np0005626463.localdomain ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring daemon mgr.np0005626463.wtksup on np0005626463.localdomain
Feb 23 09:48:50 np0005626463.localdomain sudo[302142]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 09:48:50 np0005626463.localdomain sudo[302142]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:48:50 np0005626463.localdomain sudo[302142]: pam_unix(sudo:session): session closed for user root
Feb 23 09:48:50 np0005626463.localdomain ceph-mon[294160]: Reconfiguring osd.5 (monmap changed)...
Feb 23 09:48:50 np0005626463.localdomain ceph-mon[294160]: Reconfiguring daemon osd.5 on np0005626463.localdomain
Feb 23 09:48:50 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' 
Feb 23 09:48:50 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' 
Feb 23 09:48:50 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005626463.qcthuc", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 23 09:48:50 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005626463.qcthuc", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 23 09:48:50 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:48:50 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' 
Feb 23 09:48:50 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' 
Feb 23 09:48:50 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626463.wtksup", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 23 09:48:50 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626463.wtksup", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 23 09:48:50 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mgr services"} : dispatch
Feb 23 09:48:50 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:48:50 np0005626463.localdomain sudo[302160]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid f1fea371-cb69-578d-a3d0-b5c472a84b46
Feb 23 09:48:50 np0005626463.localdomain sudo[302160]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:48:50 np0005626463.localdomain podman[302195]: 2026-02-23 09:48:50.582640895 +0000 UTC m=+0.080467740 container create 2b9d26d78f3be7756286cd09de8328fc7c5c3f39df4e4afa226164a52e5043c0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=modest_lehmann, architecture=x86_64, ceph=True, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, version=7, io.openshift.expose-services=, RELEASE=main, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.42.2, release=1770267347, io.k8s.description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2026-02-09T10:25:24Z, build-date=2026-02-09T10:25:24Z, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14)
Feb 23 09:48:50 np0005626463.localdomain systemd[1]: Started libpod-conmon-2b9d26d78f3be7756286cd09de8328fc7c5c3f39df4e4afa226164a52e5043c0.scope.
Feb 23 09:48:50 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 09:48:50 np0005626463.localdomain podman[302195]: 2026-02-23 09:48:50.542828512 +0000 UTC m=+0.040655417 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 23 09:48:50 np0005626463.localdomain podman[302195]: 2026-02-23 09:48:50.645508598 +0000 UTC m=+0.143335443 container init 2b9d26d78f3be7756286cd09de8328fc7c5c3f39df4e4afa226164a52e5043c0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=modest_lehmann, description=Red Hat Ceph Storage 7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.openshift.tags=rhceph ceph, build-date=2026-02-09T10:25:24Z, distribution-scope=public, vcs-type=git, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., architecture=x86_64, io.buildah.version=1.42.2, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, RELEASE=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1770267347)
Feb 23 09:48:50 np0005626463.localdomain podman[302195]: 2026-02-23 09:48:50.654643556 +0000 UTC m=+0.152470401 container start 2b9d26d78f3be7756286cd09de8328fc7c5c3f39df4e4afa226164a52e5043c0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=modest_lehmann, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2026-02-09T10:25:24Z, com.redhat.component=rhceph-container, release=1770267347, version=7, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.expose-services=, name=rhceph, io.buildah.version=1.42.2, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., vcs-type=git, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, org.opencontainers.image.created=2026-02-09T10:25:24Z, ceph=True, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers)
Feb 23 09:48:50 np0005626463.localdomain podman[302195]: 2026-02-23 09:48:50.654899733 +0000 UTC m=+0.152726588 container attach 2b9d26d78f3be7756286cd09de8328fc7c5c3f39df4e4afa226164a52e5043c0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=modest_lehmann, org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-type=git, com.redhat.component=rhceph-container, version=7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2026-02-09T10:25:24Z, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, RELEASE=main, name=rhceph, vendor=Red Hat, Inc., GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, io.buildah.version=1.42.2, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, release=1770267347, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14)
Feb 23 09:48:50 np0005626463.localdomain modest_lehmann[302210]: 167 167
Feb 23 09:48:50 np0005626463.localdomain systemd[1]: libpod-2b9d26d78f3be7756286cd09de8328fc7c5c3f39df4e4afa226164a52e5043c0.scope: Deactivated successfully.
Feb 23 09:48:50 np0005626463.localdomain podman[302195]: 2026-02-23 09:48:50.658331718 +0000 UTC m=+0.156158573 container died 2b9d26d78f3be7756286cd09de8328fc7c5c3f39df4e4afa226164a52e5043c0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=modest_lehmann, ceph=True, description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2026-02-09T10:25:24Z, name=rhceph, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, architecture=x86_64, io.buildah.version=1.42.2, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2026-02-09T10:25:24Z, distribution-scope=public, release=1770267347, version=7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, GIT_BRANCH=main, io.openshift.expose-services=, RELEASE=main)
Feb 23 09:48:50 np0005626463.localdomain podman[302215]: 2026-02-23 09:48:50.750276046 +0000 UTC m=+0.083652356 container remove 2b9d26d78f3be7756286cd09de8328fc7c5c3f39df4e4afa226164a52e5043c0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=modest_lehmann, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.created=2026-02-09T10:25:24Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, RELEASE=main, GIT_BRANCH=main, architecture=x86_64, io.openshift.expose-services=, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, build-date=2026-02-09T10:25:24Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1770267347, io.openshift.tags=rhceph ceph, distribution-scope=public, io.buildah.version=1.42.2, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, ceph=True, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, CEPH_POINT_RELEASE=)
Feb 23 09:48:50 np0005626463.localdomain systemd[1]: libpod-conmon-2b9d26d78f3be7756286cd09de8328fc7c5c3f39df4e4afa226164a52e5043c0.scope: Deactivated successfully.
Feb 23 09:48:50 np0005626463.localdomain sudo[302160]: pam_unix(sudo:session): session closed for user root
Feb 23 09:48:50 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626463.localdomain.devices.0}] v 0)
Feb 23 09:48:50 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:48:50.820 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:48:50 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:48:50.823 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:48:50 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:48:50.824 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 09:48:50 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:48:50.824 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:48:50 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-e588d0526933e2b45d6bb4ed1c5c80351e4c69b75a09351cd1fbffcc2bbb3e6f-merged.mount: Deactivated successfully.
Feb 23 09:48:50 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:48:50.860 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:48:50 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:48:50.860 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:48:50 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626463.localdomain}] v 0)
Feb 23 09:48:50 np0005626463.localdomain ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring mon.np0005626463 (monmap changed)...
Feb 23 09:48:50 np0005626463.localdomain ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring mon.np0005626463 (monmap changed)...
Feb 23 09:48:50 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command({"prefix": "auth get", "entity": "mon."} v 0)
Feb 23 09:48:50 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Feb 23 09:48:50 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command({"prefix": "config get", "who": "mon", "key": "public_network"} v 0)
Feb 23 09:48:50 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Feb 23 09:48:50 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 23 09:48:50 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:48:50 np0005626463.localdomain ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring daemon mon.np0005626463 on np0005626463.localdomain
Feb 23 09:48:50 np0005626463.localdomain ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring daemon mon.np0005626463 on np0005626463.localdomain
Feb 23 09:48:50 np0005626463.localdomain sudo[302232]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 09:48:50 np0005626463.localdomain sudo[302232]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:48:50 np0005626463.localdomain sudo[302232]: pam_unix(sudo:session): session closed for user root
Feb 23 09:48:51 np0005626463.localdomain sudo[302250]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid f1fea371-cb69-578d-a3d0-b5c472a84b46
Feb 23 09:48:51 np0005626463.localdomain sudo[302250]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:48:51 np0005626463.localdomain ceph-mon[294160]: Reconfiguring mds.mds.np0005626463.qcthuc (monmap changed)...
Feb 23 09:48:51 np0005626463.localdomain ceph-mon[294160]: Reconfiguring daemon mds.mds.np0005626463.qcthuc on np0005626463.localdomain
Feb 23 09:48:51 np0005626463.localdomain ceph-mon[294160]: pgmap v24: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:48:51 np0005626463.localdomain ceph-mon[294160]: Reconfiguring mgr.np0005626463.wtksup (monmap changed)...
Feb 23 09:48:51 np0005626463.localdomain ceph-mon[294160]: Reconfiguring daemon mgr.np0005626463.wtksup on np0005626463.localdomain
Feb 23 09:48:51 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' 
Feb 23 09:48:51 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' 
Feb 23 09:48:51 np0005626463.localdomain ceph-mon[294160]: Reconfiguring mon.np0005626463 (monmap changed)...
Feb 23 09:48:51 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Feb 23 09:48:51 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Feb 23 09:48:51 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:48:51 np0005626463.localdomain ceph-mon[294160]: Reconfiguring daemon mon.np0005626463 on np0005626463.localdomain
Feb 23 09:48:51 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@1(peon).osd e84 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:48:51 np0005626463.localdomain podman[302284]: 
Feb 23 09:48:51 np0005626463.localdomain podman[302284]: 2026-02-23 09:48:51.489503424 +0000 UTC m=+0.074508098 container create 9672a74d3d6f94bf556146361587208e3032672f5878b20c1914ece882df91ce (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=blissful_napier, io.openshift.expose-services=, build-date=2026-02-09T10:25:24Z, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., RELEASE=main, com.redhat.component=rhceph-container, io.buildah.version=1.42.2, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.created=2026-02-09T10:25:24Z, release=1770267347, distribution-scope=public, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main)
Feb 23 09:48:51 np0005626463.localdomain systemd[1]: Started libpod-conmon-9672a74d3d6f94bf556146361587208e3032672f5878b20c1914ece882df91ce.scope.
Feb 23 09:48:51 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 09:48:51 np0005626463.localdomain podman[302284]: 2026-02-23 09:48:51.552053059 +0000 UTC m=+0.137057733 container init 9672a74d3d6f94bf556146361587208e3032672f5878b20c1914ece882df91ce (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=blissful_napier, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-02-09T10:25:24Z, version=7, name=rhceph, io.openshift.expose-services=, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., io.buildah.version=1.42.2, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1770267347, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, vcs-type=git, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, build-date=2026-02-09T10:25:24Z)
Feb 23 09:48:51 np0005626463.localdomain podman[302284]: 2026-02-23 09:48:51.460022078 +0000 UTC m=+0.045026812 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 23 09:48:51 np0005626463.localdomain blissful_napier[302299]: 167 167
Feb 23 09:48:51 np0005626463.localdomain podman[302284]: 2026-02-23 09:48:51.56425311 +0000 UTC m=+0.149257784 container start 9672a74d3d6f94bf556146361587208e3032672f5878b20c1914ece882df91ce (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=blissful_napier, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, GIT_BRANCH=main, vcs-type=git, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, version=7, name=rhceph, org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_CLEAN=True, release=1770267347, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.42.2, vendor=Red Hat, Inc., ceph=True, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, RELEASE=main, build-date=2026-02-09T10:25:24Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git)
Feb 23 09:48:51 np0005626463.localdomain systemd[1]: libpod-9672a74d3d6f94bf556146361587208e3032672f5878b20c1914ece882df91ce.scope: Deactivated successfully.
Feb 23 09:48:51 np0005626463.localdomain podman[302284]: 2026-02-23 09:48:51.564546658 +0000 UTC m=+0.149551372 container attach 9672a74d3d6f94bf556146361587208e3032672f5878b20c1914ece882df91ce (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=blissful_napier, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, release=1770267347, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2026-02-09T10:25:24Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, version=7, name=rhceph, architecture=x86_64, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, io.buildah.version=1.42.2, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.created=2026-02-09T10:25:24Z)
Feb 23 09:48:51 np0005626463.localdomain podman[302284]: 2026-02-23 09:48:51.566694564 +0000 UTC m=+0.151699288 container died 9672a74d3d6f94bf556146361587208e3032672f5878b20c1914ece882df91ce (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=blissful_napier, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, release=1770267347, description=Red Hat Ceph Storage 7, io.buildah.version=1.42.2, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, vendor=Red Hat, Inc., architecture=x86_64, CEPH_POINT_RELEASE=, ceph=True, distribution-scope=public, org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-type=git, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, build-date=2026-02-09T10:25:24Z, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_CLEAN=True, version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph)
Feb 23 09:48:51 np0005626463.localdomain podman[302304]: 2026-02-23 09:48:51.658820248 +0000 UTC m=+0.085019299 container remove 9672a74d3d6f94bf556146361587208e3032672f5878b20c1914ece882df91ce (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=blissful_napier, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, GIT_CLEAN=True, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, vcs-type=git, release=1770267347, RELEASE=main, build-date=2026-02-09T10:25:24Z, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=rhceph-container, version=7, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.42.2, vendor=Red Hat, Inc., io.openshift.expose-services=)
Feb 23 09:48:51 np0005626463.localdomain systemd[1]: libpod-conmon-9672a74d3d6f94bf556146361587208e3032672f5878b20c1914ece882df91ce.scope: Deactivated successfully.
Feb 23 09:48:51 np0005626463.localdomain sudo[302250]: pam_unix(sudo:session): session closed for user root
Feb 23 09:48:51 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626463.localdomain.devices.0}] v 0)
Feb 23 09:48:51 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626463.localdomain}] v 0)
Feb 23 09:48:51 np0005626463.localdomain ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring crash.np0005626465 (monmap changed)...
Feb 23 09:48:51 np0005626463.localdomain ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring crash.np0005626465 (monmap changed)...
Feb 23 09:48:51 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.crash.np0005626465", "caps": ["mon", "profile crash", "mgr", "profile crash"]} v 0)
Feb 23 09:48:51 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626465", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 23 09:48:51 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 23 09:48:51 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:48:51 np0005626463.localdomain ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring daemon crash.np0005626465 on np0005626465.localdomain
Feb 23 09:48:51 np0005626463.localdomain ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring daemon crash.np0005626465 on np0005626465.localdomain
Feb 23 09:48:51 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-8dd0e2f6588fd66eb7098c0550d966c9f1413363904e613588fc090ca2842de3-merged.mount: Deactivated successfully.
Feb 23 09:48:51 np0005626463.localdomain ceph-mgr[288036]: log_channel(cluster) log [DBG] : pgmap v25: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:48:52 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626465.localdomain.devices.0}] v 0)
Feb 23 09:48:52 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626465.localdomain}] v 0)
Feb 23 09:48:52 np0005626463.localdomain ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring osd.0 (monmap changed)...
Feb 23 09:48:52 np0005626463.localdomain ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring osd.0 (monmap changed)...
Feb 23 09:48:52 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command({"prefix": "auth get", "entity": "osd.0"} v 0)
Feb 23 09:48:52 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch
Feb 23 09:48:52 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 23 09:48:52 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:48:52 np0005626463.localdomain ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.0 on np0005626465.localdomain
Feb 23 09:48:52 np0005626463.localdomain ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.0 on np0005626465.localdomain
Feb 23 09:48:52 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' 
Feb 23 09:48:52 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' 
Feb 23 09:48:52 np0005626463.localdomain ceph-mon[294160]: Reconfiguring crash.np0005626465 (monmap changed)...
Feb 23 09:48:52 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626465", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 23 09:48:52 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626465", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 23 09:48:52 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:48:52 np0005626463.localdomain ceph-mon[294160]: Reconfiguring daemon crash.np0005626465 on np0005626465.localdomain
Feb 23 09:48:52 np0005626463.localdomain ceph-mon[294160]: pgmap v25: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:48:52 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' 
Feb 23 09:48:52 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' 
Feb 23 09:48:52 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch
Feb 23 09:48:52 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:48:53 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626465.localdomain.devices.0}] v 0)
Feb 23 09:48:53 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626465.localdomain}] v 0)
Feb 23 09:48:53 np0005626463.localdomain ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring osd.3 (monmap changed)...
Feb 23 09:48:53 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command({"prefix": "auth get", "entity": "osd.3"} v 0)
Feb 23 09:48:53 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch
Feb 23 09:48:53 np0005626463.localdomain ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring osd.3 (monmap changed)...
Feb 23 09:48:53 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 23 09:48:53 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:48:53 np0005626463.localdomain ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.3 on np0005626465.localdomain
Feb 23 09:48:53 np0005626463.localdomain ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.3 on np0005626465.localdomain
Feb 23 09:48:53 np0005626463.localdomain ceph-mon[294160]: Reconfiguring osd.0 (monmap changed)...
Feb 23 09:48:53 np0005626463.localdomain ceph-mon[294160]: Reconfiguring daemon osd.0 on np0005626465.localdomain
Feb 23 09:48:53 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' 
Feb 23 09:48:53 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' 
Feb 23 09:48:53 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch
Feb 23 09:48:53 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:48:53 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.
Feb 23 09:48:53 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.
Feb 23 09:48:53 np0005626463.localdomain ceph-mgr[288036]: log_channel(cluster) log [DBG] : pgmap v26: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:48:53 np0005626463.localdomain podman[302322]: 2026-02-23 09:48:53.912691454 +0000 UTC m=+0.083517752 container health_status bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 23 09:48:53 np0005626463.localdomain podman[302321]: 2026-02-23 09:48:53.960889732 +0000 UTC m=+0.132811714 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260216, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Feb 23 09:48:53 np0005626463.localdomain podman[302322]: 2026-02-23 09:48:53.979360644 +0000 UTC m=+0.150186892 container exec_died bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 23 09:48:54 np0005626463.localdomain systemd[1]: bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.service: Deactivated successfully.
Feb 23 09:48:54 np0005626463.localdomain podman[302321]: 2026-02-23 09:48:54.036262576 +0000 UTC m=+0.208184588 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, io.buildah.version=1.43.0, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0)
Feb 23 09:48:54 np0005626463.localdomain systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully.
Feb 23 09:48:54 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626465.localdomain.devices.0}] v 0)
Feb 23 09:48:54 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626465.localdomain}] v 0)
Feb 23 09:48:54 np0005626463.localdomain ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring mds.mds.np0005626465.drvnoy (monmap changed)...
Feb 23 09:48:54 np0005626463.localdomain ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring mds.mds.np0005626465.drvnoy (monmap changed)...
Feb 23 09:48:54 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mds.mds.np0005626465.drvnoy", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} v 0)
Feb 23 09:48:54 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005626465.drvnoy", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 23 09:48:54 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 23 09:48:54 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:48:54 np0005626463.localdomain ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring daemon mds.mds.np0005626465.drvnoy on np0005626465.localdomain
Feb 23 09:48:54 np0005626463.localdomain ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring daemon mds.mds.np0005626465.drvnoy on np0005626465.localdomain
Feb 23 09:48:54 np0005626463.localdomain ceph-mon[294160]: Reconfiguring osd.3 (monmap changed)...
Feb 23 09:48:54 np0005626463.localdomain ceph-mon[294160]: Reconfiguring daemon osd.3 on np0005626465.localdomain
Feb 23 09:48:54 np0005626463.localdomain ceph-mon[294160]: pgmap v26: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:48:54 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' 
Feb 23 09:48:54 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' 
Feb 23 09:48:54 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005626465.drvnoy", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 23 09:48:54 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005626465.drvnoy", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 23 09:48:54 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:48:55 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626465.localdomain.devices.0}] v 0)
Feb 23 09:48:55 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626465.localdomain}] v 0)
Feb 23 09:48:55 np0005626463.localdomain ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring mgr.np0005626465.hlpkwo (monmap changed)...
Feb 23 09:48:55 np0005626463.localdomain ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring mgr.np0005626465.hlpkwo (monmap changed)...
Feb 23 09:48:55 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mgr.np0005626465.hlpkwo", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} v 0)
Feb 23 09:48:55 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626465.hlpkwo", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 23 09:48:55 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command({"prefix": "mgr services"} v 0)
Feb 23 09:48:55 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mgr services"} : dispatch
Feb 23 09:48:55 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 23 09:48:55 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:48:55 np0005626463.localdomain ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring daemon mgr.np0005626465.hlpkwo on np0005626465.localdomain
Feb 23 09:48:55 np0005626463.localdomain ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring daemon mgr.np0005626465.hlpkwo on np0005626465.localdomain
Feb 23 09:48:55 np0005626463.localdomain ceph-mon[294160]: Reconfiguring mds.mds.np0005626465.drvnoy (monmap changed)...
Feb 23 09:48:55 np0005626463.localdomain ceph-mon[294160]: Reconfiguring daemon mds.mds.np0005626465.drvnoy on np0005626465.localdomain
Feb 23 09:48:55 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' 
Feb 23 09:48:55 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' 
Feb 23 09:48:55 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626465.hlpkwo", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 23 09:48:55 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626465.hlpkwo", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 23 09:48:55 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mgr services"} : dispatch
Feb 23 09:48:55 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:48:55 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:48:55.861 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:48:55 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:48:55.863 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:48:55 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:48:55.863 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 09:48:55 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:48:55.864 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:48:55 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:48:55.886 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:48:55 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:48:55.887 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:48:55 np0005626463.localdomain ceph-mgr[288036]: log_channel(cluster) log [DBG] : pgmap v27: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:48:56 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@1(peon).osd e84 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.134 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'name': 'test', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000003', 'OS-EXT-SRV-ATTR:host': 'np0005626463.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '37b8098efb0d4ecc90b451a2db0e966f', 'user_id': 'cb6895487918456aa599ca2f76872d00', 'hostId': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.135 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.139 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.142 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fcf06184-990d-42d6-91aa-84651e9a4ebe', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:48:56.135834', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': 'dea3a4a2-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11576.32537355, 'message_signature': '2d7a93e8672f7cdb8c697f12458572b096a4b0c06cc52b64c3a11f6f75f808a7'}]}, 'timestamp': '2026-02-23 09:48:56.140827', '_unique_id': 'ffea7ba7d9f0457c8f4a72e8584aee02'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.142 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.142 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.142 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.142 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.142 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.142 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.142 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.142 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.142 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.142 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.142 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.142 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.142 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.142 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.142 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.142 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.142 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.142 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.142 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.142 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.142 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.142 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.142 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.142 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.142 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.142 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.142 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.142 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.142 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.142 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.142 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.143 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.172 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.173 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.175 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e2930937-7dfb-4424-ab39-634d90162cd2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:48:56.143809', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'dea8a1b4-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11576.333350432, 'message_signature': 'd5ecb5de55d57cdc9bd3051fc6548088495f98215baeddf619a23a4a65fae245'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:48:56.143809', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'dea8b758-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11576.333350432, 'message_signature': 'd3da68b32dcc75bab832ade7d8c58b6eb9b5f8f107c126c66c34ce2cdcc4086b'}]}, 'timestamp': '2026-02-23 09:48:56.174085', '_unique_id': 'fa43ed2e167042c6a42e38a4ae1f3514'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.175 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.175 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.175 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.175 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.175 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.175 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.175 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.175 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.175 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.175 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.175 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.175 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.175 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.175 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.175 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.175 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.175 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.175 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.175 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.175 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.175 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.175 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.175 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.175 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.175 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.175 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.175 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.175 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.175 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.175 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.175 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.176 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.176 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.bytes volume: 35597312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.177 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.179 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2420f8cf-0458-4d09-b7d2-9f6377ed260e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35597312, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:48:56.176609', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'dea936a6-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11576.333350432, 'message_signature': 'c062af07a614954ab17cae31513d50bfba31755d49b2a3337df5c297119f4d8a'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 
'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:48:56.176609', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'dea94fc4-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11576.333350432, 'message_signature': '5a7d974109475c362934bdc5a1e279bce5e53d0d090b7beaf16f0e3ca350fada'}]}, 'timestamp': '2026-02-23 09:48:56.178009', '_unique_id': '34ac02308f6a49fbb316689bfa09bc7d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.179 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.179 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.179 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.179 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.179 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.179 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.179 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.179 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.179 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.179 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.179 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.179 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.179 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.179 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.179 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.179 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.179 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.179 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.179 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.179 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.179 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.179 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.179 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.179 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.179 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.179 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.179 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.179 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.179 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.179 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.179 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.181 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.181 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.181 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.bytes volume: 397312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.182 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.184 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ca092a8f-8484-417d-b8e3-53b86ef3fa92', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 397312, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:48:56.181476', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'dea9f276-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11576.333350432, 'message_signature': '74da2ad9d45711689df562f8d9321bfbf84f18bb03bff63a330f7b85a526f541'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 
'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:48:56.181476', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'deaa0c16-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11576.333350432, 'message_signature': '77f3adbc6f5a30f29376e0d78d8fff13fae4c896feb25ee1ba41e8f354230eba'}]}, 'timestamp': '2026-02-23 09:48:56.182792', '_unique_id': '3425f55ebbc743acb6d7db07c0d2c017'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.184 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.184 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.184 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.184 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.184 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.184 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.184 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.184 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.184 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.184 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.184 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.184 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.184 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.184 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.184 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.184 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.184 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.184 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.184 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.184 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.184 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.184 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.184 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.184 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.184 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.184 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.184 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.184 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.184 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.184 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.184 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.185 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.201 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/cpu volume: 11460000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.202 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd8ced9ae-7b97-4f2f-8000-7ff1bf081c8b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11460000000, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'timestamp': '2026-02-23T09:48:56.186043', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': 'deaceef4-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11576.390507412, 'message_signature': '6c56c548a1c1aa03784069f163f22e2f8c19ece221f5f46c47fc4471fd97b629'}]}, 'timestamp': '2026-02-23 09:48:56.201680', '_unique_id': '7ca5d72f653d4a2e9e3f5034a68280a1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.202 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.202 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.202 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.202 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.202 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.202 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.202 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.202 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.202 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.202 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.202 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.202 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.202 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.202 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.202 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.202 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.202 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.202 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.202 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.202 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.202 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.202 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.202 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.202 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.202 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.202 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.202 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.202 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.202 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.202 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.202 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.203 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.213 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.214 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.216 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '84424edf-4d17-49d6-91bb-f75c9b4099a3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:48:56.204053', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'deaee470-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11576.393561005, 'message_signature': '5fdd84bacd818eba76300a7e6474a4970dbfe076449aabc49f252360e281d192'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 
'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:48:56.204053', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'deaef8b6-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11576.393561005, 'message_signature': '37c41befd37b852505e8248b719e58904a7177dcdcb79d3b26316e1989690ed1'}]}, 'timestamp': '2026-02-23 09:48:56.215058', '_unique_id': 'f17144799db848c1ae7caff4cdb9fdf6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.216 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.216 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.216 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.216 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.216 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.216 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.216 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.216 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.216 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.216 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.216 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.216 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.216 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.216 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.216 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.216 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.216 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.216 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.216 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.216 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.216 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.216 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.216 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.216 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.216 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.216 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.216 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.216 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.216 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.216 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.216 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.217 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.217 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.218 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.latency volume: 1374424344 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.218 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.latency volume: 89322858 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.220 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '40cac3ce-43ae-4711-9617-e6f57a70b1ba', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1374424344, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:48:56.217951', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'deaf83b2-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11576.333350432, 'message_signature': '3a8fd82d3941757ab172ff835d5fe895c9b581c26cdbe696e95f5fcf5b30712a'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 89322858, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:48:56.217951', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'deaf9d84-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11576.333350432, 'message_signature': 'b7dfcf078219e71767f99f4943d3f3475ab5796db8e0b6e83388104ff44abf36'}]}, 'timestamp': '2026-02-23 09:48:56.219286', '_unique_id': '7ff708b6b841499a89a9e8eae8666d07'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.220 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.220 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.220 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.220 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.220 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.220 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.220 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.220 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.220 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.220 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.220 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.220 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.220 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.220 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.220 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.220 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.220 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.220 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.220 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.220 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.220 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.220 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.220 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.220 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.220 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.220 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.220 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.220 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.220 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.220 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.220 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.222 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.222 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.224 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fe7bd427-e7ce-4c18-8457-2364b59ea463', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:48:56.222510', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': 'deb035a0-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11576.32537355, 'message_signature': 'a344fb1639bdf2a21184a2b8363132fbaecef784b2e03f3f54dd521ffdc2fd03'}]}, 'timestamp': '2026-02-23 09:48:56.223253', '_unique_id': '874597120c3f4ba5af5873d2b99c1510'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.224 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.224 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.224 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.224 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.224 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.224 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.224 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.224 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.224 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.224 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.224 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.224 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.224 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.224 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.224 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.224 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.224 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.224 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.224 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.224 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.224 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.224 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.224 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.224 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.224 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.224 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.224 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.224 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.224 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.224 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.224 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.226 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.226 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.227 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.229 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '032b8d73-3d8a-4e94-a98f-3efef62b46c5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:48:56.226948', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': 'deb0e34c-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11576.32537355, 'message_signature': 'babc36d92ffdd855cc36dc3e96a490a3e10ad18702edae29b70791ecfcf5d16f'}]}, 'timestamp': '2026-02-23 09:48:56.227660', '_unique_id': 'e0193ffa4a6c4423841cdc5797134d85'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.229 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.229 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.229 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.229 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.229 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.229 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.229 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.229 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.229 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.229 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.229 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.229 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.229 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.229 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.229 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.229 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.229 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.229 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.229 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.229 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.229 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.229 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.229 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.229 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.229 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.229 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.229 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.229 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.229 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.229 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.229 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.230 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.230 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.232 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '511374e9-230b-4e2e-9a2d-6df7db48fd1a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:48:56.230807', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': 'deb17b4a-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11576.32537355, 'message_signature': '8bb29df29ceaf3d3d5dc6ec32c2c424478b65118a119d61362a23de3c74116b6'}]}, 'timestamp': '2026-02-23 09:48:56.231564', '_unique_id': '9be6122fc22245bb9439fe5ece5faa18'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.232 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.232 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.232 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.232 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.232 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.232 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.232 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.232 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.232 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.232 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.232 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.232 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.232 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.232 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.232 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.232 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.232 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.232 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.232 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.232 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.232 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.232 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.232 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.232 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.232 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.232 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.232 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.232 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.232 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.232 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.232 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.234 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.234 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.bytes volume: 6808 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.235 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b215bbda-0a99-4aa1-adf9-6662599d0f59', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6808, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:48:56.234151', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': 'deb1f764-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11576.32537355, 'message_signature': 'ced36dd701c1777e2835810b10449d5f8209aab7fde83d1ab7e5753f672aa881'}]}, 'timestamp': '2026-02-23 09:48:56.234610', '_unique_id': '10431a8ca0034d91adc5fdbb99aa77fb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.235 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.235 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.235 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.235 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.235 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.235 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.235 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.235 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.235 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.235 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.235 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.235 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.235 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.235 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.235 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.235 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.235 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.235 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.235 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.235 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.235 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.235 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.235 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.235 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.235 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.235 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.235 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.235 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.235 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.235 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.235 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.236 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.236 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/memory.usage volume: 51.72265625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.238 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f0aee4ae-814d-44ea-b411-b5cd348354a8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.72265625, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'timestamp': '2026-02-23T09:48:56.236689', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': 'deb25bdc-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11576.390507412, 'message_signature': '3add3044e49ea38ef6b07d025d4766932053d9622e1467b48d537e2d495df2d1'}]}, 'timestamp': '2026-02-23 09:48:56.237168', '_unique_id': '2f8129d4c1c74d3f9bca9784e6499712'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.238 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.238 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.238 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.238 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.238 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.238 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.238 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.238 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.238 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.238 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.238 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.238 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.238 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.238 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.238 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.238 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.238 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.238 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.238 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.238 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.238 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.238 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.238 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.238 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.238 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.238 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.238 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.238 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.238 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.238 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.238 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.239 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.239 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.240 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4ca3a54d-288d-44ff-9511-c5199b935bdf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:48:56.239216', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': 'deb2bd0c-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11576.32537355, 'message_signature': '822b8657a7059213ceb214dd6f4c791497a9071fe5b1b72d3bee14cbb5e955cc'}]}, 'timestamp': '2026-02-23 09:48:56.239667', '_unique_id': 'da5cea82699f463fb8807dba6b067189'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.240 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.240 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.240 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.240 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.240 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.240 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.240 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.240 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.240 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.240 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.240 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.240 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.240 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.240 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.240 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.240 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.240 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.240 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.240 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.240 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.240 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.240 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.240 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.240 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.240 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.240 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.240 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.240 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.240 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.240 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.240 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.241 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.241 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.243 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '813e4f2b-8a0d-4b25-9ef3-f9ccc8452650', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:48:56.241742', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': 'deb320ee-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11576.32537355, 'message_signature': '79a66b048496b66f75440c120fbaafef2c4ae2b93557beed97972387ec0dc9fd'}]}, 'timestamp': '2026-02-23 09:48:56.242223', '_unique_id': '3622dff546ba4a24b219f1116956733b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.243 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.243 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.243 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.243 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.243 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.243 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.243 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.243 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.243 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.243 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.243 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.243 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.243 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.243 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.243 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.243 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.243 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.243 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.243 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.243 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.243 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.243 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.243 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.243 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.243 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.243 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.243 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.243 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.243 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.243 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.243 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.244 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.244 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.requests volume: 1283 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.244 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.246 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '16ee4dda-4bc7-4ac4-9045-5748fc6f44d9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1283, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:48:56.244270', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'deb38232-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11576.333350432, 'message_signature': '25c6c4ccd063ca0cdd4a4b5df70f23e7a57543428e327580a92634d2664ef14c'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:48:56.244270', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'deb393ee-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11576.333350432, 'message_signature': '1ac6bc53a4ae0c30f78c58f0e45c1589a533c27e3e8e4800d0d7beb87037d764'}]}, 'timestamp': '2026-02-23 09:48:56.245149', '_unique_id': '8d7408caab1741d393e5006ef7c21a47'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.246 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.246 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.246 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.246 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.246 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.246 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.246 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.246 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.246 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.246 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.246 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.246 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.246 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.246 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.246 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.246 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.246 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.246 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.246 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.246 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.246 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.246 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.246 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.246 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.246 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.246 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.246 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.246 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.246 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.246 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.246 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.247 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.247 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.247 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.249 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5529999e-8462-440b-a039-9b42fbbedebf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:48:56.247244', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'deb3f654-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11576.393561005, 'message_signature': '5b9d3bc5c193b73d90b19a85ec775bf1d620ab4f070518548d406f6e2c120617'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:48:56.247244', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'deb40612-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11576.393561005, 'message_signature': '5ee3e4bcd7b04b2b4ae3815953566c071c0e75d4145e6c6fefc67d73b92777cd'}]}, 'timestamp': '2026-02-23 09:48:56.248092', '_unique_id': 'd3b0dedb577f4423b2fd4cf4ab8b2510'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.249 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.249 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.249 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.249 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.249 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.249 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.249 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.249 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.249 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.249 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.249 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.249 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.249 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.249 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.249 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.249 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.249 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.249 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.249 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.249 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.249 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.249 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.249 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.249 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.249 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.249 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.249 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.249 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.249 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.249 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.249 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.250 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.250 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.250 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.251 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e42feefc-e886-4b12-a8ba-183ab81b6bc9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:48:56.250187', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': 'deb4663e-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11576.32537355, 'message_signature': 'eaab39345bc7cf8cb0987e44b5218677f4caa77a27da219b2849d947503afa1d'}]}, 'timestamp': '2026-02-23 09:48:56.250465', '_unique_id': 'be120899a7334b33be2722675d494655'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.251 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.251 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.251 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.251 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.251 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.251 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.251 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.251 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.251 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.251 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.251 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.251 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.251 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.251 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.251 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.251 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.251 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.251 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.251 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.251 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.251 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.251 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.251 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.251 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.251 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.251 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.251 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.251 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.251 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.251 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.251 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.251 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.251 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.252 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e973ccf8-aa3c-4bf2-a1dc-865f234fdd8f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:48:56.251725', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': 'deb4a248-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11576.32537355, 'message_signature': '57dd8c40e9d4595bcc6c9a6ec4a2c9164dd27d9cd5394c27b90227a65abd8224'}]}, 'timestamp': '2026-02-23 09:48:56.252024', '_unique_id': 'e11f300322084809bbf1b8e9b369abac'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.252 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.252 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.252 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.252 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.252 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.252 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.252 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.252 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.252 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.252 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.252 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.252 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.252 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.252 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.252 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.252 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.252 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.252 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.252 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.252 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.252 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.252 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.252 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.252 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.252 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.252 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.252 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.252 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.252 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.252 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.252 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.253 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.253 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.latency volume: 1054797520 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.253 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.latency volume: 21338362 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.254 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9fbc0759-e266-4781-97ad-9171e2f17ced', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1054797520, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:48:56.253297', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'deb4df9c-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11576.333350432, 'message_signature': 'a3456c5ba8ee2335265899029df5de46b9ee3063f678424a69ccef4560e707dd'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 21338362, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:48:56.253297', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'deb4e988-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11576.333350432, 'message_signature': 'a5a9198cc70f95e475c9bf7e35a4d024e6c45d264d36a6b7d83a4b601fab2df5'}]}, 'timestamp': '2026-02-23 09:48:56.253807', '_unique_id': '1cae021d530b474f88695916d301cfb2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.254 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.254 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.254 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.254 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.254 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.254 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.254 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.254 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.254 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.254 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.254 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.254 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.254 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.254 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.254 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.254 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.254 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.254 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.254 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.254 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.254 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.254 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.254 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.254 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.254 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.254 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.254 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.254 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.254 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.254 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.254 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.255 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.255 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.255 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c8d72495-cdc4-43c3-a22b-2d857731d773', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:48:56.255123', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': 'deb52718-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11576.32537355, 'message_signature': 'ffb12ca417bf20bdba08189a2ca89835b2b80e6670f8bb8b04f9c442367be499'}]}, 'timestamp': '2026-02-23 09:48:56.255402', '_unique_id': '1c1d3d7bad784b6da98c079f598833eb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.255 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.255 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.255 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.255 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.255 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.255 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.255 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.255 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.255 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.255 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.255 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.255 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.255 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.255 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.255 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.255 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.255 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.255 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.255 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.255 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.255 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.255 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.255 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.255 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.255 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.255 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.255 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.255 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.255 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.255 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.255 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.256 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.256 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.256 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.257 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ed0de983-4df3-4f75-95df-6aac5198b483', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:48:56.256647', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'deb5626e-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11576.393561005, 'message_signature': 'fb467dc5bad82ccf63064900f4ec01b2464c7a56403b022a5a504b49b288a334'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 
'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:48:56.256647', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'deb56f7a-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11576.393561005, 'message_signature': '7474be0551220ae1943323e7a8ceedc71974ffb39edee3ab119eb8a7aa4c8eed'}]}, 'timestamp': '2026-02-23 09:48:56.257238', '_unique_id': '00ab5926e34a45968066099aff9e6d39'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.257 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.257 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.257 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.257 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.257 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.257 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.257 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.257 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.257 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.257 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.257 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.257 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.257 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.257 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.257 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.257 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.257 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.257 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.257 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.257 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.257 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.257 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.257 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.257 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.257 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.257 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.257 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.257 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.257 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.257 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:48:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.257 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:48:56 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626465.localdomain.devices.0}] v 0)
Feb 23 09:48:56 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626465.localdomain}] v 0)
Feb 23 09:48:56 np0005626463.localdomain ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring mon.np0005626465 (monmap changed)...
Feb 23 09:48:56 np0005626463.localdomain ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring mon.np0005626465 (monmap changed)...
Feb 23 09:48:56 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command({"prefix": "auth get", "entity": "mon."} v 0)
Feb 23 09:48:56 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Feb 23 09:48:56 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command({"prefix": "config get", "who": "mon", "key": "public_network"} v 0)
Feb 23 09:48:56 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Feb 23 09:48:56 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 23 09:48:56 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:48:56 np0005626463.localdomain ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring daemon mon.np0005626465 on np0005626465.localdomain
Feb 23 09:48:56 np0005626463.localdomain ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring daemon mon.np0005626465 on np0005626465.localdomain
Feb 23 09:48:56 np0005626463.localdomain ceph-mon[294160]: Reconfiguring mgr.np0005626465.hlpkwo (monmap changed)...
Feb 23 09:48:56 np0005626463.localdomain ceph-mon[294160]: Reconfiguring daemon mgr.np0005626465.hlpkwo on np0005626465.localdomain
Feb 23 09:48:56 np0005626463.localdomain ceph-mon[294160]: pgmap v27: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:48:56 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' 
Feb 23 09:48:56 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' 
Feb 23 09:48:56 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Feb 23 09:48:56 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Feb 23 09:48:56 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:48:57 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626465.localdomain.devices.0}] v 0)
Feb 23 09:48:57 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626465.localdomain}] v 0)
Feb 23 09:48:57 np0005626463.localdomain ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring crash.np0005626466 (monmap changed)...
Feb 23 09:48:57 np0005626463.localdomain ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring crash.np0005626466 (monmap changed)...
Feb 23 09:48:57 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.crash.np0005626466", "caps": ["mon", "profile crash", "mgr", "profile crash"]} v 0)
Feb 23 09:48:57 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626466", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 23 09:48:57 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 23 09:48:57 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:48:57 np0005626463.localdomain ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring daemon crash.np0005626466 on np0005626466.localdomain
Feb 23 09:48:57 np0005626463.localdomain ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring daemon crash.np0005626466 on np0005626466.localdomain
Feb 23 09:48:57 np0005626463.localdomain ceph-mgr[288036]: log_channel(audit) log [DBG] : from='client.34500 -' entity='client.admin' cmd=[{"prefix": "orch host drain", "hostname": "np0005626461.localdomain", "target": ["mon-mgr", ""]}]: dispatch
Feb 23 09:48:57 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/inventory}] v 0)
Feb 23 09:48:57 np0005626463.localdomain ceph-mgr[288036]: [cephadm INFO root] Added label _no_schedule to host np0005626461.localdomain
Feb 23 09:48:57 np0005626463.localdomain ceph-mgr[288036]: log_channel(cephadm) log [INF] : Added label _no_schedule to host np0005626461.localdomain
Feb 23 09:48:57 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/inventory}] v 0)
Feb 23 09:48:57 np0005626463.localdomain ceph-mgr[288036]: [cephadm INFO root] Added label SpecialHostLabels.DRAIN_CONF_KEYRING to host np0005626461.localdomain
Feb 23 09:48:57 np0005626463.localdomain ceph-mgr[288036]: log_channel(cephadm) log [INF] : Added label SpecialHostLabels.DRAIN_CONF_KEYRING to host np0005626461.localdomain
Feb 23 09:48:57 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.
Feb 23 09:48:57 np0005626463.localdomain ceph-mon[294160]: Reconfiguring mon.np0005626465 (monmap changed)...
Feb 23 09:48:57 np0005626463.localdomain ceph-mon[294160]: Reconfiguring daemon mon.np0005626465 on np0005626465.localdomain
Feb 23 09:48:57 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' 
Feb 23 09:48:57 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' 
Feb 23 09:48:57 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626466", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 23 09:48:57 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626466", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 23 09:48:57 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:48:57 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' 
Feb 23 09:48:57 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' 
Feb 23 09:48:57 np0005626463.localdomain ceph-mgr[288036]: log_channel(cluster) log [DBG] : pgmap v28: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:48:57 np0005626463.localdomain systemd[1]: tmp-crun.6rclLq.mount: Deactivated successfully.
Feb 23 09:48:57 np0005626463.localdomain podman[302370]: 2026-02-23 09:48:57.914789138 +0000 UTC m=+0.089484985 container health_status be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible)
Feb 23 09:48:57 np0005626463.localdomain podman[302370]: 2026-02-23 09:48:57.929288809 +0000 UTC m=+0.103984656 container exec_died be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, 
container_name=ceilometer_agent_compute)
Feb 23 09:48:57 np0005626463.localdomain systemd[1]: be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.service: Deactivated successfully.
Feb 23 09:48:58 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain.devices.0}] v 0)
Feb 23 09:48:58 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain}] v 0)
Feb 23 09:48:58 np0005626463.localdomain ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring osd.1 (monmap changed)...
Feb 23 09:48:58 np0005626463.localdomain ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring osd.1 (monmap changed)...
Feb 23 09:48:58 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command({"prefix": "auth get", "entity": "osd.1"} v 0)
Feb 23 09:48:58 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch
Feb 23 09:48:58 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 23 09:48:58 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:48:58 np0005626463.localdomain ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.1 on np0005626466.localdomain
Feb 23 09:48:58 np0005626463.localdomain ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.1 on np0005626466.localdomain
Feb 23 09:48:58 np0005626463.localdomain ceph-mgr[288036]: log_channel(audit) log [DBG] : from='client.34506 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "host_pattern": "np0005626461.localdomain", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Feb 23 09:48:59 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain.devices.0}] v 0)
Feb 23 09:48:59 np0005626463.localdomain ceph-mon[294160]: Reconfiguring crash.np0005626466 (monmap changed)...
Feb 23 09:48:59 np0005626463.localdomain ceph-mon[294160]: Reconfiguring daemon crash.np0005626466 on np0005626466.localdomain
Feb 23 09:48:59 np0005626463.localdomain ceph-mon[294160]: from='client.34500 -' entity='client.admin' cmd=[{"prefix": "orch host drain", "hostname": "np0005626461.localdomain", "target": ["mon-mgr", ""]}]: dispatch
Feb 23 09:48:59 np0005626463.localdomain ceph-mon[294160]: Added label _no_schedule to host np0005626461.localdomain
Feb 23 09:48:59 np0005626463.localdomain ceph-mon[294160]: Added label SpecialHostLabels.DRAIN_CONF_KEYRING to host np0005626461.localdomain
Feb 23 09:48:59 np0005626463.localdomain ceph-mon[294160]: pgmap v28: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:48:59 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' 
Feb 23 09:48:59 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' 
Feb 23 09:48:59 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch
Feb 23 09:48:59 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:48:59 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain}] v 0)
Feb 23 09:48:59 np0005626463.localdomain ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring osd.4 (monmap changed)...
Feb 23 09:48:59 np0005626463.localdomain ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring osd.4 (monmap changed)...
Feb 23 09:48:59 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command({"prefix": "auth get", "entity": "osd.4"} v 0)
Feb 23 09:48:59 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch
Feb 23 09:48:59 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 23 09:48:59 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:48:59 np0005626463.localdomain ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.4 on np0005626466.localdomain
Feb 23 09:48:59 np0005626463.localdomain ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.4 on np0005626466.localdomain
Feb 23 09:48:59 np0005626463.localdomain sshd[302390]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:48:59 np0005626463.localdomain ceph-mgr[288036]: log_channel(cluster) log [DBG] : pgmap v29: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:49:00 np0005626463.localdomain ceph-mgr[288036]: log_channel(audit) log [DBG] : from='client.44431 -' entity='client.admin' cmd=[{"prefix": "orch host rm", "hostname": "np0005626461.localdomain", "force": true, "target": ["mon-mgr", ""]}]: dispatch
Feb 23 09:49:00 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/inventory}] v 0)
Feb 23 09:49:00 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command({"prefix":"config-key del","key":"mgr/cephadm/host.np0005626461.localdomain"} v 0)
Feb 23 09:49:00 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005626461.localdomain"} : dispatch
Feb 23 09:49:00 np0005626463.localdomain ceph-mgr[288036]: [cephadm INFO root] Removed host np0005626461.localdomain
Feb 23 09:49:00 np0005626463.localdomain ceph-mgr[288036]: log_channel(cephadm) log [INF] : Removed host np0005626461.localdomain
Feb 23 09:49:00 np0005626463.localdomain sshd[302390]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:49:00 np0005626463.localdomain ceph-mon[294160]: Reconfiguring osd.1 (monmap changed)...
Feb 23 09:49:00 np0005626463.localdomain ceph-mon[294160]: Reconfiguring daemon osd.1 on np0005626466.localdomain
Feb 23 09:49:00 np0005626463.localdomain ceph-mon[294160]: from='client.34506 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "host_pattern": "np0005626461.localdomain", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Feb 23 09:49:00 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' 
Feb 23 09:49:00 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' 
Feb 23 09:49:00 np0005626463.localdomain ceph-mon[294160]: Reconfiguring osd.4 (monmap changed)...
Feb 23 09:49:00 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch
Feb 23 09:49:00 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:49:00 np0005626463.localdomain ceph-mon[294160]: Reconfiguring daemon osd.4 on np0005626466.localdomain
Feb 23 09:49:00 np0005626463.localdomain ceph-mon[294160]: pgmap v29: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:49:00 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' 
Feb 23 09:49:00 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005626461.localdomain"} : dispatch
Feb 23 09:49:00 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005626461.localdomain"} : dispatch
Feb 23 09:49:00 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005626461.localdomain"}]': finished
Feb 23 09:49:00 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain.devices.0}] v 0)
Feb 23 09:49:00 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain}] v 0)
Feb 23 09:49:00 np0005626463.localdomain ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring mds.mds.np0005626466.vaywlp (monmap changed)...
Feb 23 09:49:00 np0005626463.localdomain ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring mds.mds.np0005626466.vaywlp (monmap changed)...
Feb 23 09:49:00 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mds.mds.np0005626466.vaywlp", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} v 0)
Feb 23 09:49:00 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005626466.vaywlp", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 23 09:49:00 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 23 09:49:00 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:49:00 np0005626463.localdomain ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring daemon mds.mds.np0005626466.vaywlp on np0005626466.localdomain
Feb 23 09:49:00 np0005626463.localdomain ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring daemon mds.mds.np0005626466.vaywlp on np0005626466.localdomain
Feb 23 09:49:00 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:49:00.888 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:49:00 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:49:00.891 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:49:00 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:49:00.891 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 09:49:00 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:49:00.891 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:49:00 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:49:00.928 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:49:00 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:49:00.929 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:49:01 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@1(peon).osd e84 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:49:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb 23 09:49:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 7800.1 total, 600.0 interval
                                                          Cumulative writes: 5308 writes, 23K keys, 5308 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s
                                                          Cumulative WAL: 5308 writes, 741 syncs, 7.16 writes per sync, written: 0.02 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 156 writes, 536 keys, 156 commit groups, 1.0 writes per commit group, ingest: 0.64 MB, 0.00 MB/s
                                                          Interval WAL: 156 writes, 62 syncs, 2.52 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Feb 23 09:49:01 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain.devices.0}] v 0)
Feb 23 09:49:01 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain}] v 0)
Feb 23 09:49:01 np0005626463.localdomain ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring mgr.np0005626466.nisqfq (monmap changed)...
Feb 23 09:49:01 np0005626463.localdomain ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring mgr.np0005626466.nisqfq (monmap changed)...
Feb 23 09:49:01 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mgr.np0005626466.nisqfq", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} v 0)
Feb 23 09:49:01 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626466.nisqfq", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 23 09:49:01 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command({"prefix": "mgr services"} v 0)
Feb 23 09:49:01 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mgr services"} : dispatch
Feb 23 09:49:01 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 23 09:49:01 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:49:01 np0005626463.localdomain ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring daemon mgr.np0005626466.nisqfq on np0005626466.localdomain
Feb 23 09:49:01 np0005626463.localdomain ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring daemon mgr.np0005626466.nisqfq on np0005626466.localdomain
Feb 23 09:49:01 np0005626463.localdomain ceph-mon[294160]: from='client.44431 -' entity='client.admin' cmd=[{"prefix": "orch host rm", "hostname": "np0005626461.localdomain", "force": true, "target": ["mon-mgr", ""]}]: dispatch
Feb 23 09:49:01 np0005626463.localdomain ceph-mon[294160]: Removed host np0005626461.localdomain
Feb 23 09:49:01 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' 
Feb 23 09:49:01 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' 
Feb 23 09:49:01 np0005626463.localdomain ceph-mon[294160]: Reconfiguring mds.mds.np0005626466.vaywlp (monmap changed)...
Feb 23 09:49:01 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005626466.vaywlp", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 23 09:49:01 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005626466.vaywlp", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 23 09:49:01 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:49:01 np0005626463.localdomain ceph-mon[294160]: Reconfiguring daemon mds.mds.np0005626466.vaywlp on np0005626466.localdomain
Feb 23 09:49:01 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' 
Feb 23 09:49:01 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' 
Feb 23 09:49:01 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626466.nisqfq", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 23 09:49:01 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626466.nisqfq", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 23 09:49:01 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mgr services"} : dispatch
Feb 23 09:49:01 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:49:01 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.
Feb 23 09:49:01 np0005626463.localdomain ceph-mgr[288036]: log_channel(cluster) log [DBG] : pgmap v30: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:49:01 np0005626463.localdomain podman[302392]: 2026-02-23 09:49:01.9093399 +0000 UTC m=+0.080395848 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2)
Feb 23 09:49:01 np0005626463.localdomain podman[302392]: 2026-02-23 09:49:01.943236531 +0000 UTC m=+0.114292439 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, managed_by=edpm_ansible)
Feb 23 09:49:01 np0005626463.localdomain systemd[1]: 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully.
Feb 23 09:49:02 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain.devices.0}] v 0)
Feb 23 09:49:02 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain}] v 0)
Feb 23 09:49:02 np0005626463.localdomain ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring mon.np0005626466 (monmap changed)...
Feb 23 09:49:02 np0005626463.localdomain ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring mon.np0005626466 (monmap changed)...
Feb 23 09:49:02 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command({"prefix": "auth get", "entity": "mon."} v 0)
Feb 23 09:49:02 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Feb 23 09:49:02 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command({"prefix": "config get", "who": "mon", "key": "public_network"} v 0)
Feb 23 09:49:02 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Feb 23 09:49:02 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 23 09:49:02 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:49:02 np0005626463.localdomain ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring daemon mon.np0005626466 on np0005626466.localdomain
Feb 23 09:49:02 np0005626463.localdomain ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring daemon mon.np0005626466 on np0005626466.localdomain
Feb 23 09:49:02 np0005626463.localdomain ceph-mon[294160]: Reconfiguring mgr.np0005626466.nisqfq (monmap changed)...
Feb 23 09:49:02 np0005626463.localdomain ceph-mon[294160]: Reconfiguring daemon mgr.np0005626466.nisqfq on np0005626466.localdomain
Feb 23 09:49:02 np0005626463.localdomain ceph-mon[294160]: pgmap v30: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:49:02 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' 
Feb 23 09:49:02 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' 
Feb 23 09:49:02 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Feb 23 09:49:02 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Feb 23 09:49:02 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:49:03 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain.devices.0}] v 0)
Feb 23 09:49:03 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain}] v 0)
Feb 23 09:49:03 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 23 09:49:03 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:49:03 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0)
Feb 23 09:49:03 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 23 09:49:03 np0005626463.localdomain ceph-mgr[288036]: [cephadm INFO cephadm.serve] Updating np0005626463.localdomain:/etc/ceph/ceph.conf
Feb 23 09:49:03 np0005626463.localdomain ceph-mgr[288036]: log_channel(cephadm) log [INF] : Updating np0005626463.localdomain:/etc/ceph/ceph.conf
Feb 23 09:49:03 np0005626463.localdomain ceph-mgr[288036]: [cephadm INFO cephadm.serve] Updating np0005626465.localdomain:/etc/ceph/ceph.conf
Feb 23 09:49:03 np0005626463.localdomain ceph-mgr[288036]: log_channel(cephadm) log [INF] : Updating np0005626465.localdomain:/etc/ceph/ceph.conf
Feb 23 09:49:03 np0005626463.localdomain ceph-mgr[288036]: [cephadm INFO cephadm.serve] Updating np0005626466.localdomain:/etc/ceph/ceph.conf
Feb 23 09:49:03 np0005626463.localdomain ceph-mgr[288036]: log_channel(cephadm) log [INF] : Updating np0005626466.localdomain:/etc/ceph/ceph.conf
Feb 23 09:49:03 np0005626463.localdomain sudo[302410]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Feb 23 09:49:03 np0005626463.localdomain sudo[302410]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:49:03 np0005626463.localdomain sudo[302410]: pam_unix(sudo:session): session closed for user root
Feb 23 09:49:03 np0005626463.localdomain sudo[302428]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/etc/ceph
Feb 23 09:49:03 np0005626463.localdomain ceph-mon[294160]: Reconfiguring mon.np0005626466 (monmap changed)...
Feb 23 09:49:03 np0005626463.localdomain ceph-mon[294160]: Reconfiguring daemon mon.np0005626466 on np0005626466.localdomain
Feb 23 09:49:03 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' 
Feb 23 09:49:03 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' 
Feb 23 09:49:03 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:49:03 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 23 09:49:03 np0005626463.localdomain sudo[302428]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:49:03 np0005626463.localdomain sudo[302428]: pam_unix(sudo:session): session closed for user root
Feb 23 09:49:03 np0005626463.localdomain sudo[302446]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/etc/ceph/ceph.conf.new
Feb 23 09:49:03 np0005626463.localdomain sudo[302446]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:49:03 np0005626463.localdomain sudo[302446]: pam_unix(sudo:session): session closed for user root
Feb 23 09:49:03 np0005626463.localdomain sudo[302464]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46
Feb 23 09:49:03 np0005626463.localdomain sudo[302464]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:49:03 np0005626463.localdomain sudo[302464]: pam_unix(sudo:session): session closed for user root
Feb 23 09:49:03 np0005626463.localdomain sudo[302482]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/etc/ceph/ceph.conf.new
Feb 23 09:49:03 np0005626463.localdomain sudo[302482]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:49:03 np0005626463.localdomain sudo[302482]: pam_unix(sudo:session): session closed for user root
Feb 23 09:49:03 np0005626463.localdomain ceph-mgr[288036]: log_channel(cluster) log [DBG] : pgmap v31: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:49:04 np0005626463.localdomain sudo[302516]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/etc/ceph/ceph.conf.new
Feb 23 09:49:04 np0005626463.localdomain sudo[302516]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:49:04 np0005626463.localdomain sudo[302516]: pam_unix(sudo:session): session closed for user root
Feb 23 09:49:04 np0005626463.localdomain sudo[302534]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/etc/ceph/ceph.conf.new
Feb 23 09:49:04 np0005626463.localdomain sudo[302534]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:49:04 np0005626463.localdomain sudo[302534]: pam_unix(sudo:session): session closed for user root
Feb 23 09:49:04 np0005626463.localdomain ceph-mgr[288036]: [cephadm INFO cephadm.serve] Updating np0005626466.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf
Feb 23 09:49:04 np0005626463.localdomain ceph-mgr[288036]: log_channel(cephadm) log [INF] : Updating np0005626466.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf
Feb 23 09:49:04 np0005626463.localdomain sudo[302552]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Feb 23 09:49:04 np0005626463.localdomain sudo[302552]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:49:04 np0005626463.localdomain sudo[302552]: pam_unix(sudo:session): session closed for user root
Feb 23 09:49:04 np0005626463.localdomain ceph-mgr[288036]: [cephadm INFO cephadm.serve] Updating np0005626463.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf
Feb 23 09:49:04 np0005626463.localdomain ceph-mgr[288036]: log_channel(cephadm) log [INF] : Updating np0005626463.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf
Feb 23 09:49:04 np0005626463.localdomain ceph-mgr[288036]: [cephadm INFO cephadm.serve] Updating np0005626465.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf
Feb 23 09:49:04 np0005626463.localdomain ceph-mgr[288036]: log_channel(cephadm) log [INF] : Updating np0005626465.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf
Feb 23 09:49:04 np0005626463.localdomain sudo[302570]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config
Feb 23 09:49:04 np0005626463.localdomain sudo[302570]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:49:04 np0005626463.localdomain sudo[302570]: pam_unix(sudo:session): session closed for user root
Feb 23 09:49:04 np0005626463.localdomain sudo[302588]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config
Feb 23 09:49:04 np0005626463.localdomain sudo[302588]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:49:04 np0005626463.localdomain sudo[302588]: pam_unix(sudo:session): session closed for user root
Feb 23 09:49:04 np0005626463.localdomain sudo[302606]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf.new
Feb 23 09:49:04 np0005626463.localdomain sudo[302606]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:49:04 np0005626463.localdomain sudo[302606]: pam_unix(sudo:session): session closed for user root
Feb 23 09:49:04 np0005626463.localdomain sudo[302624]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46
Feb 23 09:49:04 np0005626463.localdomain sudo[302624]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:49:04 np0005626463.localdomain sudo[302624]: pam_unix(sudo:session): session closed for user root
Feb 23 09:49:04 np0005626463.localdomain sudo[302642]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf.new
Feb 23 09:49:04 np0005626463.localdomain sudo[302642]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:49:04 np0005626463.localdomain sudo[302642]: pam_unix(sudo:session): session closed for user root
Feb 23 09:49:04 np0005626463.localdomain ceph-mon[294160]: Updating np0005626463.localdomain:/etc/ceph/ceph.conf
Feb 23 09:49:04 np0005626463.localdomain ceph-mon[294160]: Updating np0005626465.localdomain:/etc/ceph/ceph.conf
Feb 23 09:49:04 np0005626463.localdomain ceph-mon[294160]: Updating np0005626466.localdomain:/etc/ceph/ceph.conf
Feb 23 09:49:04 np0005626463.localdomain ceph-mon[294160]: pgmap v31: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:49:04 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/3235845437' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 23 09:49:04 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/3235845437' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 23 09:49:04 np0005626463.localdomain sudo[302676]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf.new
Feb 23 09:49:04 np0005626463.localdomain sudo[302676]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:49:04 np0005626463.localdomain sudo[302676]: pam_unix(sudo:session): session closed for user root
Feb 23 09:49:04 np0005626463.localdomain sudo[302694]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf.new
Feb 23 09:49:04 np0005626463.localdomain sudo[302694]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:49:04 np0005626463.localdomain sudo[302694]: pam_unix(sudo:session): session closed for user root
Feb 23 09:49:04 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain.devices.0}] v 0)
Feb 23 09:49:04 np0005626463.localdomain sudo[302712]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf.new /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf
Feb 23 09:49:04 np0005626463.localdomain sudo[302712]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:49:04 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626465.localdomain.devices.0}] v 0)
Feb 23 09:49:04 np0005626463.localdomain sudo[302712]: pam_unix(sudo:session): session closed for user root
Feb 23 09:49:04 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626463.localdomain.devices.0}] v 0)
Feb 23 09:49:04 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain}] v 0)
Feb 23 09:49:04 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626465.localdomain}] v 0)
Feb 23 09:49:04 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626463.localdomain}] v 0)
Feb 23 09:49:04 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Feb 23 09:49:04 np0005626463.localdomain ceph-mgr[288036]: [progress INFO root] update: starting ev a9ba5473-cf2f-46a7-90d2-1ffa422d6109 (Updating node-proxy deployment (+3 -> 3))
Feb 23 09:49:04 np0005626463.localdomain ceph-mgr[288036]: [progress INFO root] complete: finished ev a9ba5473-cf2f-46a7-90d2-1ffa422d6109 (Updating node-proxy deployment (+3 -> 3))
Feb 23 09:49:04 np0005626463.localdomain ceph-mgr[288036]: [progress INFO root] Completed event a9ba5473-cf2f-46a7-90d2-1ffa422d6109 (Updating node-proxy deployment (+3 -> 3)) in 0 seconds
Feb 23 09:49:04 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0)
Feb 23 09:49:04 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 23 09:49:05 np0005626463.localdomain sudo[302730]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 09:49:05 np0005626463.localdomain sudo[302730]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:49:05 np0005626463.localdomain sudo[302730]: pam_unix(sudo:session): session closed for user root
Feb 23 09:49:05 np0005626463.localdomain ceph-mon[294160]: Updating np0005626466.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf
Feb 23 09:49:05 np0005626463.localdomain ceph-mon[294160]: Updating np0005626463.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf
Feb 23 09:49:05 np0005626463.localdomain ceph-mon[294160]: Updating np0005626465.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf
Feb 23 09:49:05 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' 
Feb 23 09:49:05 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' 
Feb 23 09:49:05 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' 
Feb 23 09:49:05 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' 
Feb 23 09:49:05 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' 
Feb 23 09:49:05 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' 
Feb 23 09:49:05 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' 
Feb 23 09:49:05 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 23 09:49:05 np0005626463.localdomain ceph-mgr[288036]: log_channel(cluster) log [DBG] : pgmap v32: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:49:05 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:49:05.929 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:49:05 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:49:05.931 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:49:05 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:49:05.932 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 09:49:05 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:49:05.932 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:49:05 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:49:05.973 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:49:05 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:49:05.974 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:49:06 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@1(peon).osd e84 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:49:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb 23 09:49:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 7800.1 total, 600.0 interval
                                                          Cumulative writes: 5568 writes, 24K keys, 5568 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s
                                                          Cumulative WAL: 5568 writes, 778 syncs, 7.16 writes per sync, written: 0.02 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 147 writes, 320 keys, 147 commit groups, 1.0 writes per commit group, ingest: 0.43 MB, 0.00 MB/s
                                                          Interval WAL: 147 writes, 73 syncs, 2.01 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Feb 23 09:49:06 np0005626463.localdomain ceph-mon[294160]: pgmap v32: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:49:07 np0005626463.localdomain ceph-mgr[288036]: [balancer INFO root] Optimize plan auto_2026-02-23_09:49:07
Feb 23 09:49:07 np0005626463.localdomain ceph-mgr[288036]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Feb 23 09:49:07 np0005626463.localdomain ceph-mgr[288036]: [balancer INFO root] do_upmap
Feb 23 09:49:07 np0005626463.localdomain ceph-mgr[288036]: [balancer INFO root] pools ['manila_data', 'manila_metadata', '.mgr', 'backups', 'vms', 'volumes', 'images']
Feb 23 09:49:07 np0005626463.localdomain ceph-mgr[288036]: log_channel(cluster) log [DBG] : pgmap v33: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:49:07 np0005626463.localdomain ceph-mgr[288036]: [balancer INFO root] prepared 0/10 changes
Feb 23 09:49:08 np0005626463.localdomain ceph-mgr[288036]: [pg_autoscaler INFO root] _maybe_adjust
Feb 23 09:49:08 np0005626463.localdomain ceph-mgr[288036]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Feb 23 09:49:08 np0005626463.localdomain ceph-mgr[288036]: [pg_autoscaler INFO root] Pool '.mgr' root_id -1 using 3.080724804578448e-05 of space, bias 1.0, pg target 0.006161449609156895 quantized to 1 (current 1)
Feb 23 09:49:08 np0005626463.localdomain ceph-mgr[288036]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Feb 23 09:49:08 np0005626463.localdomain ceph-mgr[288036]: [pg_autoscaler INFO root] Pool 'vms' root_id -1 using 0.0033329080297319932 of space, bias 1.0, pg target 0.6665816059463986 quantized to 32 (current 32)
Feb 23 09:49:08 np0005626463.localdomain ceph-mgr[288036]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Feb 23 09:49:08 np0005626463.localdomain ceph-mgr[288036]: [pg_autoscaler INFO root] Pool 'volumes' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Feb 23 09:49:08 np0005626463.localdomain ceph-mgr[288036]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Feb 23 09:49:08 np0005626463.localdomain ceph-mgr[288036]: [pg_autoscaler INFO root] Pool 'images' root_id -1 using 0.0014449417225013959 of space, bias 1.0, pg target 0.2885066972594454 quantized to 32 (current 32)
Feb 23 09:49:08 np0005626463.localdomain ceph-mgr[288036]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Feb 23 09:49:08 np0005626463.localdomain ceph-mgr[288036]: [pg_autoscaler INFO root] Pool 'backups' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Feb 23 09:49:08 np0005626463.localdomain ceph-mgr[288036]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Feb 23 09:49:08 np0005626463.localdomain ceph-mgr[288036]: [pg_autoscaler INFO root] Pool 'manila_data' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Feb 23 09:49:08 np0005626463.localdomain ceph-mgr[288036]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Feb 23 09:49:08 np0005626463.localdomain ceph-mgr[288036]: [pg_autoscaler INFO root] Pool 'manila_metadata' root_id -1 using 2.453674623115578e-06 of space, bias 4.0, pg target 0.0019596681323283084 quantized to 16 (current 16)
Feb 23 09:49:08 np0005626463.localdomain ceph-mgr[288036]: [volumes INFO mgr_util] scanning for idle connections..
Feb 23 09:49:08 np0005626463.localdomain ceph-mgr[288036]: [volumes INFO mgr_util] cleaning up connections: []
Feb 23 09:49:08 np0005626463.localdomain ceph-mgr[288036]: [volumes INFO mgr_util] scanning for idle connections..
Feb 23 09:49:08 np0005626463.localdomain ceph-mgr[288036]: [volumes INFO mgr_util] cleaning up connections: []
Feb 23 09:49:08 np0005626463.localdomain ceph-mgr[288036]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Feb 23 09:49:08 np0005626463.localdomain ceph-mgr[288036]: [rbd_support INFO root] load_schedules: vms, start_after=
Feb 23 09:49:08 np0005626463.localdomain ceph-mgr[288036]: [volumes INFO mgr_util] scanning for idle connections..
Feb 23 09:49:08 np0005626463.localdomain ceph-mgr[288036]: [volumes INFO mgr_util] cleaning up connections: []
Feb 23 09:49:08 np0005626463.localdomain ceph-mgr[288036]: [rbd_support INFO root] load_schedules: volumes, start_after=
Feb 23 09:49:08 np0005626463.localdomain ceph-mgr[288036]: [rbd_support INFO root] load_schedules: images, start_after=
Feb 23 09:49:08 np0005626463.localdomain ceph-mgr[288036]: [rbd_support INFO root] load_schedules: backups, start_after=
Feb 23 09:49:08 np0005626463.localdomain ceph-mgr[288036]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Feb 23 09:49:08 np0005626463.localdomain ceph-mgr[288036]: [rbd_support INFO root] load_schedules: vms, start_after=
Feb 23 09:49:08 np0005626463.localdomain ceph-mgr[288036]: [rbd_support INFO root] load_schedules: volumes, start_after=
Feb 23 09:49:08 np0005626463.localdomain ceph-mgr[288036]: [rbd_support INFO root] load_schedules: images, start_after=
Feb 23 09:49:08 np0005626463.localdomain ceph-mgr[288036]: [rbd_support INFO root] load_schedules: backups, start_after=
Feb 23 09:49:08 np0005626463.localdomain ceph-mgr[288036]: log_channel(audit) log [DBG] : from='client.44449 -' entity='client.admin' cmd=[{"prefix": "orch apply", "target": ["mon-mgr", ""]}]: dispatch
Feb 23 09:49:08 np0005626463.localdomain ceph-mgr[288036]: [cephadm INFO root] Saving service mon spec with placement label:mon
Feb 23 09:49:08 np0005626463.localdomain ceph-mgr[288036]: log_channel(cephadm) log [INF] : Saving service mon spec with placement label:mon
Feb 23 09:49:08 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/spec.mon}] v 0)
Feb 23 09:49:08 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 23 09:49:08 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:49:08 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0)
Feb 23 09:49:08 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 23 09:49:08 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Feb 23 09:49:08 np0005626463.localdomain ceph-mgr[288036]: [progress INFO root] update: starting ev 63db0af8-3d95-4b25-984e-41f9ee1ef319 (Updating node-proxy deployment (+3 -> 3))
Feb 23 09:49:08 np0005626463.localdomain ceph-mgr[288036]: [progress INFO root] complete: finished ev 63db0af8-3d95-4b25-984e-41f9ee1ef319 (Updating node-proxy deployment (+3 -> 3))
Feb 23 09:49:08 np0005626463.localdomain ceph-mgr[288036]: [progress INFO root] Completed event 63db0af8-3d95-4b25-984e-41f9ee1ef319 (Updating node-proxy deployment (+3 -> 3)) in 0 seconds
Feb 23 09:49:08 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0)
Feb 23 09:49:08 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 23 09:49:08 np0005626463.localdomain sudo[302748]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 09:49:08 np0005626463.localdomain sudo[302748]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:49:08 np0005626463.localdomain sudo[302748]: pam_unix(sudo:session): session closed for user root
Feb 23 09:49:08 np0005626463.localdomain ceph-mon[294160]: pgmap v33: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:49:08 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' 
Feb 23 09:49:08 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:49:08 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 23 09:49:08 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' 
Feb 23 09:49:08 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 23 09:49:09 np0005626463.localdomain ceph-mgr[288036]: [progress INFO root] Writing back 50 completed events
Feb 23 09:49:09 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Feb 23 09:49:09 np0005626463.localdomain podman[242954]: time="2026-02-23T09:49:09Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 23 09:49:09 np0005626463.localdomain podman[242954]: @ - - [23/Feb/2026:09:49:09 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155258 "" "Go-http-client/1.1"
Feb 23 09:49:09 np0005626463.localdomain podman[242954]: @ - - [23/Feb/2026:09:49:09 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18271 "" "Go-http-client/1.1"
Feb 23 09:49:09 np0005626463.localdomain ceph-mgr[288036]: log_channel(audit) log [DBG] : from='client.44452 -' entity='client.admin' cmd=[{"prefix": "orch ps", "daemon_type": "mon", "daemon_id": "np0005626466", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Feb 23 09:49:09 np0005626463.localdomain ceph-mgr[288036]: log_channel(cluster) log [DBG] : pgmap v34: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:49:09 np0005626463.localdomain ceph-mon[294160]: from='client.44449 -' entity='client.admin' cmd=[{"prefix": "orch apply", "target": ["mon-mgr", ""]}]: dispatch
Feb 23 09:49:09 np0005626463.localdomain ceph-mon[294160]: Saving service mon spec with placement label:mon
Feb 23 09:49:09 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' 
Feb 23 09:49:10 np0005626463.localdomain ceph-mon[294160]: from='client.44452 -' entity='client.admin' cmd=[{"prefix": "orch ps", "daemon_type": "mon", "daemon_id": "np0005626466", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Feb 23 09:49:10 np0005626463.localdomain ceph-mon[294160]: pgmap v34: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:49:10 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:49:10.974 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:49:10 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:49:10.976 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:49:10 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:49:10.977 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 09:49:10 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:49:10.977 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:49:10 np0005626463.localdomain ceph-mgr[288036]: log_channel(audit) log [DBG] : from='client.54134 -' entity='client.admin' cmd=[{"prefix": "orch daemon rm", "names": ["mon.np0005626466"], "force": true, "target": ["mon-mgr", ""]}]: dispatch
Feb 23 09:49:10 np0005626463.localdomain ceph-mgr[288036]: [cephadm INFO root] Remove daemons mon.np0005626466
Feb 23 09:49:10 np0005626463.localdomain ceph-mgr[288036]: log_channel(cephadm) log [INF] : Remove daemons mon.np0005626466
Feb 23 09:49:10 np0005626463.localdomain ceph-mgr[288036]: [cephadm INFO cephadm.services.cephadmservice] Safe to remove mon.np0005626466: new quorum should be ['np0005626463', 'np0005626465'] (from ['np0005626463', 'np0005626465'])
Feb 23 09:49:10 np0005626463.localdomain ceph-mgr[288036]: log_channel(cephadm) log [INF] : Safe to remove mon.np0005626466: new quorum should be ['np0005626463', 'np0005626465'] (from ['np0005626463', 'np0005626465'])
Feb 23 09:49:10 np0005626463.localdomain ceph-mgr[288036]: [cephadm INFO cephadm.services.cephadmservice] Removing monitor np0005626466 from monmap...
Feb 23 09:49:10 np0005626463.localdomain ceph-mgr[288036]: log_channel(cephadm) log [INF] : Removing monitor np0005626466 from monmap...
Feb 23 09:49:10 np0005626463.localdomain ceph-mgr[288036]: [cephadm INFO cephadm.serve] Removing daemon mon.np0005626466 from np0005626466.localdomain -- ports []
Feb 23 09:49:10 np0005626463.localdomain ceph-mgr[288036]: log_channel(cephadm) log [INF] : Removing daemon mon.np0005626466 from np0005626466.localdomain -- ports []
Feb 23 09:49:11 np0005626463.localdomain ceph-mgr[288036]: client.27096 ms_handle_reset on v2:172.18.0.103:3300/0
Feb 23 09:49:11 np0005626463.localdomain ceph-mgr[288036]: client.44327 ms_handle_reset on v2:172.18.0.103:3300/0
Feb 23 09:49:11 np0005626463.localdomain ceph-mgr[288036]: client.0 ms_handle_reset on v2:172.18.0.103:3300/0
Feb 23 09:49:11 np0005626463.localdomain ceph-mgr[288036]: client.0 ms_handle_reset on v2:172.18.0.103:3300/0
Feb 23 09:49:11 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command({"prefix": "quorum_status"} v 0)
Feb 23 09:49:11 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "quorum_status"} : dispatch
Feb 23 09:49:11 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command({"prefix": "mon rm", "name": "np0005626466"} v 0)
Feb 23 09:49:11 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon rm", "name": "np0005626466"} : dispatch
Feb 23 09:49:11 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@1(peon) e14  my rank is now 0 (was 1)
Feb 23 09:49:11 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(probing) e14 handle_command mon_command({"prefix": "mon metadata", "id": "np0005626463"} v 0)
Feb 23 09:49:11 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626463"} : dispatch
Feb 23 09:49:11 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(probing) e14 handle_command mon_command({"prefix": "mon metadata", "id": "np0005626465"} v 0)
Feb 23 09:49:11 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626465"} : dispatch
Feb 23 09:49:11 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:49:11.073 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:49:11 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:49:11.074 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:49:11 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(probing) e14 handle_auth_request failed to assign global_id
Feb 23 09:49:11 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [INF] : mon.np0005626463 calling monitor election
Feb 23 09:49:11 np0005626463.localdomain ceph-mon[294160]: paxos.0).electionLogic(56) init, last seen epoch 56
Feb 23 09:49:11 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(electing) e14 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Feb 23 09:49:11 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [INF] : mon.np0005626463 is new leader, mons np0005626463,np0005626465 in quorum (ranks 0,1)
Feb 23 09:49:11 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : monmap epoch 14
Feb 23 09:49:11 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : fsid f1fea371-cb69-578d-a3d0-b5c472a84b46
Feb 23 09:49:11 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : last_changed 2026-02-23T09:49:10.990173+0000
Feb 23 09:49:11 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : created 2026-02-23T07:36:01.997603+0000
Feb 23 09:49:11 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : min_mon_release 18 (reef)
Feb 23 09:49:11 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : election_strategy: 1
Feb 23 09:49:11 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : 0: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005626463
Feb 23 09:49:11 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : 1: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005626465
Feb 23 09:49:11 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e14 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Feb 23 09:49:11 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : fsmap cephfs:1 {0=mds.np0005626463.qcthuc=up:active} 2 up:standby
Feb 23 09:49:11 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e84: 6 total, 6 up, 6 in
Feb 23 09:49:11 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : mgrmap e32: np0005626463.wtksup(active, since 63s), standbys: np0005626466.nisqfq, np0005626465.hlpkwo, np0005626461.lrfquh
Feb 23 09:49:11 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [INF] : overall HEALTH_OK
Feb 23 09:49:11 np0005626463.localdomain ceph-mon[294160]: Remove daemons mon.np0005626466
Feb 23 09:49:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "quorum_status"} : dispatch
Feb 23 09:49:11 np0005626463.localdomain ceph-mon[294160]: Safe to remove mon.np0005626466: new quorum should be ['np0005626463', 'np0005626465'] (from ['np0005626463', 'np0005626465'])
Feb 23 09:49:11 np0005626463.localdomain ceph-mon[294160]: Removing monitor np0005626466 from monmap...
Feb 23 09:49:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon rm", "name": "np0005626466"} : dispatch
Feb 23 09:49:11 np0005626463.localdomain ceph-mon[294160]: Removing daemon mon.np0005626466 from np0005626466.localdomain -- ports []
Feb 23 09:49:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626463"} : dispatch
Feb 23 09:49:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626465"} : dispatch
Feb 23 09:49:11 np0005626463.localdomain ceph-mon[294160]: mon.np0005626465 calling monitor election
Feb 23 09:49:11 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463 calling monitor election
Feb 23 09:49:11 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463 is new leader, mons np0005626463,np0005626465 in quorum (ranks 0,1)
Feb 23 09:49:11 np0005626463.localdomain ceph-mon[294160]: monmap epoch 14
Feb 23 09:49:11 np0005626463.localdomain ceph-mon[294160]: fsid f1fea371-cb69-578d-a3d0-b5c472a84b46
Feb 23 09:49:11 np0005626463.localdomain ceph-mon[294160]: last_changed 2026-02-23T09:49:10.990173+0000
Feb 23 09:49:11 np0005626463.localdomain ceph-mon[294160]: created 2026-02-23T07:36:01.997603+0000
Feb 23 09:49:11 np0005626463.localdomain ceph-mon[294160]: min_mon_release 18 (reef)
Feb 23 09:49:11 np0005626463.localdomain ceph-mon[294160]: election_strategy: 1
Feb 23 09:49:11 np0005626463.localdomain ceph-mon[294160]: 0: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005626463
Feb 23 09:49:11 np0005626463.localdomain ceph-mon[294160]: 1: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005626465
Feb 23 09:49:11 np0005626463.localdomain ceph-mon[294160]: fsmap cephfs:1 {0=mds.np0005626463.qcthuc=up:active} 2 up:standby
Feb 23 09:49:11 np0005626463.localdomain ceph-mon[294160]: osdmap e84: 6 total, 6 up, 6 in
Feb 23 09:49:11 np0005626463.localdomain ceph-mon[294160]: mgrmap e32: np0005626463.wtksup(active, since 63s), standbys: np0005626466.nisqfq, np0005626465.hlpkwo, np0005626461.lrfquh
Feb 23 09:49:11 np0005626463.localdomain ceph-mon[294160]: overall HEALTH_OK
Feb 23 09:49:11 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e14 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 23 09:49:11 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/553441531' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 09:49:11 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.
Feb 23 09:49:11 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e14 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 23 09:49:11 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:49:11 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e14 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0)
Feb 23 09:49:11 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 23 09:49:11 np0005626463.localdomain ceph-mgr[288036]: [cephadm INFO cephadm.serve] Updating np0005626463.localdomain:/etc/ceph/ceph.conf
Feb 23 09:49:11 np0005626463.localdomain ceph-mgr[288036]: log_channel(cephadm) log [INF] : Updating np0005626463.localdomain:/etc/ceph/ceph.conf
Feb 23 09:49:11 np0005626463.localdomain ceph-mgr[288036]: [cephadm INFO cephadm.serve] Updating np0005626465.localdomain:/etc/ceph/ceph.conf
Feb 23 09:49:11 np0005626463.localdomain ceph-mgr[288036]: log_channel(cephadm) log [INF] : Updating np0005626465.localdomain:/etc/ceph/ceph.conf
Feb 23 09:49:11 np0005626463.localdomain ceph-mgr[288036]: [cephadm INFO cephadm.serve] Updating np0005626466.localdomain:/etc/ceph/ceph.conf
Feb 23 09:49:11 np0005626463.localdomain ceph-mgr[288036]: log_channel(cephadm) log [INF] : Updating np0005626466.localdomain:/etc/ceph/ceph.conf
Feb 23 09:49:11 np0005626463.localdomain ceph-mgr[288036]: log_channel(cluster) log [DBG] : pgmap v35: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:49:11 np0005626463.localdomain podman[302766]: 2026-02-23 09:49:11.911372662 +0000 UTC m=+0.085571925 container health_status da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 23 09:49:11 np0005626463.localdomain podman[302766]: 2026-02-23 09:49:11.925284596 +0000 UTC m=+0.099483869 container exec_died da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 23 09:49:11 np0005626463.localdomain systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Deactivated successfully.
Feb 23 09:49:11 np0005626463.localdomain sudo[302782]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Feb 23 09:49:11 np0005626463.localdomain sudo[302782]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:49:11 np0005626463.localdomain sudo[302782]: pam_unix(sudo:session): session closed for user root
Feb 23 09:49:12 np0005626463.localdomain sudo[302807]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/etc/ceph
Feb 23 09:49:12 np0005626463.localdomain sudo[302807]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:49:12 np0005626463.localdomain sudo[302807]: pam_unix(sudo:session): session closed for user root
Feb 23 09:49:12 np0005626463.localdomain sudo[302825]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/etc/ceph/ceph.conf.new
Feb 23 09:49:12 np0005626463.localdomain sudo[302825]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:49:12 np0005626463.localdomain sudo[302825]: pam_unix(sudo:session): session closed for user root
Feb 23 09:49:12 np0005626463.localdomain sudo[302843]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46
Feb 23 09:49:12 np0005626463.localdomain sudo[302843]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:49:12 np0005626463.localdomain sudo[302843]: pam_unix(sudo:session): session closed for user root
Feb 23 09:49:12 np0005626463.localdomain sudo[302861]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/etc/ceph/ceph.conf.new
Feb 23 09:49:12 np0005626463.localdomain sudo[302861]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:49:12 np0005626463.localdomain sudo[302861]: pam_unix(sudo:session): session closed for user root
Feb 23 09:49:12 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.108:0/553441531' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 09:49:12 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:49:12 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 23 09:49:12 np0005626463.localdomain ceph-mon[294160]: Updating np0005626463.localdomain:/etc/ceph/ceph.conf
Feb 23 09:49:12 np0005626463.localdomain ceph-mon[294160]: Updating np0005626465.localdomain:/etc/ceph/ceph.conf
Feb 23 09:49:12 np0005626463.localdomain ceph-mon[294160]: Updating np0005626466.localdomain:/etc/ceph/ceph.conf
Feb 23 09:49:12 np0005626463.localdomain ceph-mon[294160]: pgmap v35: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:49:12 np0005626463.localdomain sudo[302895]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/etc/ceph/ceph.conf.new
Feb 23 09:49:12 np0005626463.localdomain sudo[302895]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:49:12 np0005626463.localdomain sudo[302895]: pam_unix(sudo:session): session closed for user root
Feb 23 09:49:12 np0005626463.localdomain sudo[302913]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/etc/ceph/ceph.conf.new
Feb 23 09:49:12 np0005626463.localdomain sudo[302913]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:49:12 np0005626463.localdomain sudo[302913]: pam_unix(sudo:session): session closed for user root
Feb 23 09:49:12 np0005626463.localdomain sudo[302931]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Feb 23 09:49:12 np0005626463.localdomain sudo[302931]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:49:12 np0005626463.localdomain sudo[302931]: pam_unix(sudo:session): session closed for user root
Feb 23 09:49:12 np0005626463.localdomain ceph-mgr[288036]: [cephadm INFO cephadm.serve] Updating np0005626463.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf
Feb 23 09:49:12 np0005626463.localdomain ceph-mgr[288036]: log_channel(cephadm) log [INF] : Updating np0005626463.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf
Feb 23 09:49:12 np0005626463.localdomain ceph-mgr[288036]: [cephadm INFO cephadm.serve] Updating np0005626465.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf
Feb 23 09:49:12 np0005626463.localdomain ceph-mgr[288036]: log_channel(cephadm) log [INF] : Updating np0005626465.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf
Feb 23 09:49:12 np0005626463.localdomain sudo[302949]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config
Feb 23 09:49:12 np0005626463.localdomain sudo[302949]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:49:12 np0005626463.localdomain sudo[302949]: pam_unix(sudo:session): session closed for user root
Feb 23 09:49:12 np0005626463.localdomain sudo[302967]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config
Feb 23 09:49:12 np0005626463.localdomain sudo[302967]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:49:12 np0005626463.localdomain sudo[302967]: pam_unix(sudo:session): session closed for user root
Feb 23 09:49:12 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e14 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 23 09:49:12 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/3509329788' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 09:49:12 np0005626463.localdomain sudo[302985]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf.new
Feb 23 09:49:12 np0005626463.localdomain sudo[302985]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:49:12 np0005626463.localdomain sudo[302985]: pam_unix(sudo:session): session closed for user root
Feb 23 09:49:12 np0005626463.localdomain sudo[303003]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46
Feb 23 09:49:12 np0005626463.localdomain sudo[303003]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:49:12 np0005626463.localdomain sudo[303003]: pam_unix(sudo:session): session closed for user root
Feb 23 09:49:12 np0005626463.localdomain ceph-mgr[288036]: [cephadm INFO cephadm.serve] Updating np0005626466.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf
Feb 23 09:49:12 np0005626463.localdomain ceph-mgr[288036]: log_channel(cephadm) log [INF] : Updating np0005626466.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf
Feb 23 09:49:12 np0005626463.localdomain sudo[303021]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf.new
Feb 23 09:49:12 np0005626463.localdomain sudo[303021]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:49:12 np0005626463.localdomain sudo[303021]: pam_unix(sudo:session): session closed for user root
Feb 23 09:49:13 np0005626463.localdomain sudo[303055]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf.new
Feb 23 09:49:13 np0005626463.localdomain sudo[303055]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:49:13 np0005626463.localdomain sudo[303055]: pam_unix(sudo:session): session closed for user root
Feb 23 09:49:13 np0005626463.localdomain sudo[303073]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf.new
Feb 23 09:49:13 np0005626463.localdomain sudo[303073]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:49:13 np0005626463.localdomain sudo[303073]: pam_unix(sudo:session): session closed for user root
Feb 23 09:49:13 np0005626463.localdomain sudo[303091]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf.new /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf
Feb 23 09:49:13 np0005626463.localdomain sudo[303091]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:49:13 np0005626463.localdomain sudo[303091]: pam_unix(sudo:session): session closed for user root
Feb 23 09:49:13 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626463.localdomain.devices.0}] v 0)
Feb 23 09:49:13 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' 
Feb 23 09:49:13 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626463.localdomain}] v 0)
Feb 23 09:49:13 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' 
Feb 23 09:49:13 np0005626463.localdomain ceph-mon[294160]: Updating np0005626463.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf
Feb 23 09:49:13 np0005626463.localdomain ceph-mon[294160]: Updating np0005626465.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf
Feb 23 09:49:13 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.108:0/3509329788' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 09:49:13 np0005626463.localdomain ceph-mon[294160]: Updating np0005626466.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf
Feb 23 09:49:13 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' 
Feb 23 09:49:13 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' 
Feb 23 09:49:13 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626465.localdomain.devices.0}] v 0)
Feb 23 09:49:13 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' 
Feb 23 09:49:13 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626465.localdomain}] v 0)
Feb 23 09:49:13 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' 
Feb 23 09:49:13 np0005626463.localdomain openstack_network_exporter[245358]: ERROR   09:49:13 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 23 09:49:13 np0005626463.localdomain openstack_network_exporter[245358]: ERROR   09:49:13 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 23 09:49:13 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain.devices.0}] v 0)
Feb 23 09:49:13 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' 
Feb 23 09:49:13 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain}] v 0)
Feb 23 09:49:13 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' 
Feb 23 09:49:13 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Feb 23 09:49:13 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' 
Feb 23 09:49:13 np0005626463.localdomain ceph-mgr[288036]: [progress INFO root] update: starting ev 8cfed40c-2653-46f8-be56-0c09e5ecb3c2 (Updating node-proxy deployment (+3 -> 3))
Feb 23 09:49:13 np0005626463.localdomain ceph-mgr[288036]: [progress INFO root] complete: finished ev 8cfed40c-2653-46f8-be56-0c09e5ecb3c2 (Updating node-proxy deployment (+3 -> 3))
Feb 23 09:49:13 np0005626463.localdomain ceph-mgr[288036]: [progress INFO root] Completed event 8cfed40c-2653-46f8-be56-0c09e5ecb3c2 (Updating node-proxy deployment (+3 -> 3)) in 0 seconds
Feb 23 09:49:13 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e14 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0)
Feb 23 09:49:13 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 23 09:49:13 np0005626463.localdomain sudo[303109]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 09:49:13 np0005626463.localdomain sudo[303109]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:49:13 np0005626463.localdomain sudo[303109]: pam_unix(sudo:session): session closed for user root
Feb 23 09:49:13 np0005626463.localdomain ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring crash.np0005626463 (monmap changed)...
Feb 23 09:49:13 np0005626463.localdomain ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring crash.np0005626463 (monmap changed)...
Feb 23 09:49:13 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e14 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.crash.np0005626463", "caps": ["mon", "profile crash", "mgr", "profile crash"]} v 0)
Feb 23 09:49:13 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626463", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 23 09:49:13 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e14 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 23 09:49:13 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:49:13 np0005626463.localdomain ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring daemon crash.np0005626463 on np0005626463.localdomain
Feb 23 09:49:13 np0005626463.localdomain ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring daemon crash.np0005626463 on np0005626463.localdomain
Feb 23 09:49:13 np0005626463.localdomain sudo[303127]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 09:49:13 np0005626463.localdomain sudo[303127]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:49:13 np0005626463.localdomain sudo[303127]: pam_unix(sudo:session): session closed for user root
Feb 23 09:49:13 np0005626463.localdomain ceph-mgr[288036]: log_channel(cluster) log [DBG] : pgmap v36: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:49:13 np0005626463.localdomain sudo[303145]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid f1fea371-cb69-578d-a3d0-b5c472a84b46
Feb 23 09:49:13 np0005626463.localdomain sudo[303145]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:49:14 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:49:14.050 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:49:14 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:49:14.067 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:49:14 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:49:14.068 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 23 09:49:14 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:49:14.068 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 23 09:49:14 np0005626463.localdomain ceph-mgr[288036]: [progress INFO root] Writing back 50 completed events
Feb 23 09:49:14 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Feb 23 09:49:14 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' 
Feb 23 09:49:14 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' 
Feb 23 09:49:14 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' 
Feb 23 09:49:14 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.107:0/1345472982' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 09:49:14 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' 
Feb 23 09:49:14 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' 
Feb 23 09:49:14 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' 
Feb 23 09:49:14 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 23 09:49:14 np0005626463.localdomain ceph-mon[294160]: Reconfiguring crash.np0005626463 (monmap changed)...
Feb 23 09:49:14 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626463", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 23 09:49:14 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:49:14 np0005626463.localdomain ceph-mon[294160]: Reconfiguring daemon crash.np0005626463 on np0005626463.localdomain
Feb 23 09:49:14 np0005626463.localdomain ceph-mon[294160]: pgmap v36: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:49:14 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' 
Feb 23 09:49:14 np0005626463.localdomain podman[303179]: 2026-02-23 09:49:14.426544001 +0000 UTC m=+0.080205323 container create ad8c7e6dd1ae3e7bf383da575a6a7ad26565460a029aa291408a1c3c182d8204 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=stoic_rubin, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_CLEAN=True, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, RELEASE=main, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, org.opencontainers.image.created=2026-02-09T10:25:24Z, build-date=2026-02-09T10:25:24Z, architecture=x86_64, vcs-type=git, version=7, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.42.2, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, distribution-scope=public, description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., release=1770267347, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True)
Feb 23 09:49:14 np0005626463.localdomain systemd[1]: Started libpod-conmon-ad8c7e6dd1ae3e7bf383da575a6a7ad26565460a029aa291408a1c3c182d8204.scope.
Feb 23 09:49:14 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 09:49:14 np0005626463.localdomain podman[303179]: 2026-02-23 09:49:14.392248497 +0000 UTC m=+0.045909839 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 23 09:49:14 np0005626463.localdomain podman[303179]: 2026-02-23 09:49:14.494081176 +0000 UTC m=+0.147742508 container init ad8c7e6dd1ae3e7bf383da575a6a7ad26565460a029aa291408a1c3c182d8204 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=stoic_rubin, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.openshift.tags=rhceph ceph, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1770267347, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, distribution-scope=public, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.42.2, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, description=Red Hat Ceph Storage 7, build-date=2026-02-09T10:25:24Z, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, io.openshift.expose-services=, GIT_CLEAN=True, GIT_BRANCH=main)
Feb 23 09:49:14 np0005626463.localdomain podman[303179]: 2026-02-23 09:49:14.503411861 +0000 UTC m=+0.157073183 container start ad8c7e6dd1ae3e7bf383da575a6a7ad26565460a029aa291408a1c3c182d8204 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=stoic_rubin, RELEASE=main, distribution-scope=public, GIT_CLEAN=True, GIT_BRANCH=main, org.opencontainers.image.created=2026-02-09T10:25:24Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, com.redhat.component=rhceph-container, name=rhceph, description=Red Hat Ceph Storage 7, architecture=x86_64, release=1770267347, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.42.2, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, version=7, build-date=2026-02-09T10:25:24Z, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers)
Feb 23 09:49:14 np0005626463.localdomain podman[303179]: 2026-02-23 09:49:14.503754451 +0000 UTC m=+0.157415813 container attach ad8c7e6dd1ae3e7bf383da575a6a7ad26565460a029aa291408a1c3c182d8204 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=stoic_rubin, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2026-02-09T10:25:24Z, architecture=x86_64, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1770267347, distribution-scope=public, version=7, org.opencontainers.image.created=2026-02-09T10:25:24Z, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, com.redhat.component=rhceph-container, vcs-type=git, io.buildah.version=1.42.2, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14)
Feb 23 09:49:14 np0005626463.localdomain stoic_rubin[303195]: 167 167
Feb 23 09:49:14 np0005626463.localdomain systemd[1]: libpod-ad8c7e6dd1ae3e7bf383da575a6a7ad26565460a029aa291408a1c3c182d8204.scope: Deactivated successfully.
Feb 23 09:49:14 np0005626463.localdomain podman[303179]: 2026-02-23 09:49:14.506713601 +0000 UTC m=+0.160374963 container died ad8c7e6dd1ae3e7bf383da575a6a7ad26565460a029aa291408a1c3c182d8204 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=stoic_rubin, vcs-type=git, release=1770267347, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, io.buildah.version=1.42.2, org.opencontainers.image.created=2026-02-09T10:25:24Z, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, RELEASE=main, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-02-09T10:25:24Z, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, GIT_CLEAN=True, name=rhceph, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, version=7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_REPO=https://github.com/ceph/ceph-container.git)
Feb 23 09:49:14 np0005626463.localdomain podman[303200]: 2026-02-23 09:49:14.61413383 +0000 UTC m=+0.091736593 container remove ad8c7e6dd1ae3e7bf383da575a6a7ad26565460a029aa291408a1c3c182d8204 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=stoic_rubin, architecture=x86_64, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, vendor=Red Hat, Inc., ceph=True, com.redhat.component=rhceph-container, vcs-type=git, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1770267347, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, version=7, description=Red Hat Ceph Storage 7, name=rhceph, build-date=2026-02-09T10:25:24Z, io.buildah.version=1.42.2, io.openshift.tags=rhceph ceph, org.opencontainers.image.created=2026-02-09T10:25:24Z)
Feb 23 09:49:14 np0005626463.localdomain systemd[1]: libpod-conmon-ad8c7e6dd1ae3e7bf383da575a6a7ad26565460a029aa291408a1c3c182d8204.scope: Deactivated successfully.
Feb 23 09:49:14 np0005626463.localdomain sudo[303145]: pam_unix(sudo:session): session closed for user root
Feb 23 09:49:14 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626463.localdomain.devices.0}] v 0)
Feb 23 09:49:14 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' 
Feb 23 09:49:14 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626463.localdomain}] v 0)
Feb 23 09:49:14 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' 
Feb 23 09:49:14 np0005626463.localdomain ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring osd.2 (monmap changed)...
Feb 23 09:49:14 np0005626463.localdomain ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring osd.2 (monmap changed)...
Feb 23 09:49:14 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e14 handle_command mon_command({"prefix": "auth get", "entity": "osd.2"} v 0)
Feb 23 09:49:14 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch
Feb 23 09:49:14 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e14 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 23 09:49:14 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:49:14 np0005626463.localdomain ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.2 on np0005626463.localdomain
Feb 23 09:49:14 np0005626463.localdomain ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.2 on np0005626463.localdomain
Feb 23 09:49:14 np0005626463.localdomain sudo[303217]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 09:49:14 np0005626463.localdomain sudo[303217]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:49:14 np0005626463.localdomain sudo[303217]: pam_unix(sudo:session): session closed for user root
Feb 23 09:49:14 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:49:14.831 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 23 09:49:14 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:49:14.833 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquired lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 23 09:49:14 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:49:14.833 282211 DEBUG nova.network.neutron [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 23 09:49:14 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:49:14.834 282211 DEBUG nova.objects.instance [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c2a7d92b-952f-46a7-8a6a-3322a48fcf4b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 23 09:49:14 np0005626463.localdomain sudo[303235]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid f1fea371-cb69-578d-a3d0-b5c472a84b46
Feb 23 09:49:14 np0005626463.localdomain sudo[303235]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:49:15 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:49:15.185 282211 DEBUG nova.network.neutron [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Updating instance_info_cache with network_info: [{"id": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "address": "fa:16:3e:a0:9d:00", "network": {"id": "9da5b53d-3184-450f-9a5b-bdba1a6c9f6d", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "37b8098efb0d4ecc90b451a2db0e966f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa27e5011-20", "ovs_interfaceid": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 23 09:49:15 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:49:15.199 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Releasing lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 23 09:49:15 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:49:15.199 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 23 09:49:15 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:49:15.200 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:49:15 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.
Feb 23 09:49:15 np0005626463.localdomain podman[303267]: 2026-02-23 09:49:15.358218247 +0000 UTC m=+0.078051127 container health_status 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.7, build-date=2026-02-05T04:57:10Z, container_name=openstack_network_exporter, managed_by=edpm_ansible, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, distribution-scope=public, org.opencontainers.image.created=2026-02-05T04:57:10Z, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, release=1770267347, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, name=ubi9/ubi-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c)
Feb 23 09:49:15 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.107:0/100647062' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 09:49:15 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' 
Feb 23 09:49:15 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' 
Feb 23 09:49:15 np0005626463.localdomain ceph-mon[294160]: Reconfiguring osd.2 (monmap changed)...
Feb 23 09:49:15 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch
Feb 23 09:49:15 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:49:15 np0005626463.localdomain ceph-mon[294160]: Reconfiguring daemon osd.2 on np0005626463.localdomain
Feb 23 09:49:15 np0005626463.localdomain podman[303267]: 2026-02-23 09:49:15.374280205 +0000 UTC m=+0.094113065 container exec_died 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, version=9.7, org.opencontainers.image.created=2026-02-05T04:57:10Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, build-date=2026-02-05T04:57:10Z, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, architecture=x86_64, distribution-scope=public, managed_by=edpm_ansible, name=ubi9/ubi-minimal, container_name=openstack_network_exporter, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.openshift.tags=minimal rhel9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, release=1770267347, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc.)
Feb 23 09:49:15 np0005626463.localdomain systemd[1]: 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.service: Deactivated successfully.
Feb 23 09:49:15 np0005626463.localdomain podman[303275]: 
Feb 23 09:49:15 np0005626463.localdomain podman[303275]: 2026-02-23 09:49:15.424964337 +0000 UTC m=+0.123553991 container create 893ab8997ef7a75ef8cb1833e1ae3be2679d26bb82011db736f00d40e7b35742 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=trusting_wozniak, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, RELEASE=main, release=1770267347, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.expose-services=, vcs-type=git, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, io.buildah.version=1.42.2, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., build-date=2026-02-09T10:25:24Z, architecture=x86_64, com.redhat.component=rhceph-container, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2026-02-09T10:25:24Z)
Feb 23 09:49:15 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-8ab0d3838dc42168bb770c38c550942bcf158f406d168831e05d2ea78e499fb0-merged.mount: Deactivated successfully.
Feb 23 09:49:15 np0005626463.localdomain systemd[1]: Started libpod-conmon-893ab8997ef7a75ef8cb1833e1ae3be2679d26bb82011db736f00d40e7b35742.scope.
Feb 23 09:49:15 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 09:49:15 np0005626463.localdomain podman[303275]: 2026-02-23 09:49:15.486102178 +0000 UTC m=+0.184691832 container init 893ab8997ef7a75ef8cb1833e1ae3be2679d26bb82011db736f00d40e7b35742 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=trusting_wozniak, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7, vcs-type=git, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, description=Red Hat Ceph Storage 7, io.buildah.version=1.42.2, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2026-02-09T10:25:24Z, architecture=x86_64, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, release=1770267347, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, io.openshift.expose-services=, distribution-scope=public, name=rhceph, RELEASE=main, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, GIT_BRANCH=main)
Feb 23 09:49:15 np0005626463.localdomain podman[303275]: 2026-02-23 09:49:15.394569093 +0000 UTC m=+0.093158797 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 23 09:49:15 np0005626463.localdomain trusting_wozniak[303305]: 167 167
Feb 23 09:49:15 np0005626463.localdomain podman[303275]: 2026-02-23 09:49:15.494624877 +0000 UTC m=+0.193214531 container start 893ab8997ef7a75ef8cb1833e1ae3be2679d26bb82011db736f00d40e7b35742 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=trusting_wozniak, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.openshift.expose-services=, GIT_CLEAN=True, com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, distribution-scope=public, build-date=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.buildah.version=1.42.2, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, architecture=x86_64, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, release=1770267347, CEPH_POINT_RELEASE=, name=rhceph, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7)
Feb 23 09:49:15 np0005626463.localdomain systemd[1]: libpod-893ab8997ef7a75ef8cb1833e1ae3be2679d26bb82011db736f00d40e7b35742.scope: Deactivated successfully.
Feb 23 09:49:15 np0005626463.localdomain podman[303275]: 2026-02-23 09:49:15.494937017 +0000 UTC m=+0.193526721 container attach 893ab8997ef7a75ef8cb1833e1ae3be2679d26bb82011db736f00d40e7b35742 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=trusting_wozniak, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, io.openshift.expose-services=, vcs-type=git, ceph=True, distribution-scope=public, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.42.2, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1770267347, GIT_BRANCH=main, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, name=rhceph, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, build-date=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=)
Feb 23 09:49:15 np0005626463.localdomain podman[303275]: 2026-02-23 09:49:15.497179275 +0000 UTC m=+0.195768949 container died 893ab8997ef7a75ef8cb1833e1ae3be2679d26bb82011db736f00d40e7b35742 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=trusting_wozniak, org.opencontainers.image.created=2026-02-09T10:25:24Z, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, RELEASE=main, io.buildah.version=1.42.2, ceph=True, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1770267347, version=7, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, io.openshift.expose-services=, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2026-02-09T10:25:24Z, GIT_BRANCH=main)
Feb 23 09:49:15 np0005626463.localdomain podman[303310]: 2026-02-23 09:49:15.586794353 +0000 UTC m=+0.081815081 container remove 893ab8997ef7a75ef8cb1833e1ae3be2679d26bb82011db736f00d40e7b35742 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=trusting_wozniak, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.buildah.version=1.42.2, org.opencontainers.image.created=2026-02-09T10:25:24Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, architecture=x86_64, vendor=Red Hat, Inc., version=7, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, build-date=2026-02-09T10:25:24Z, release=1770267347)
Feb 23 09:49:15 np0005626463.localdomain systemd[1]: libpod-conmon-893ab8997ef7a75ef8cb1833e1ae3be2679d26bb82011db736f00d40e7b35742.scope: Deactivated successfully.
Feb 23 09:49:15 np0005626463.localdomain sudo[303235]: pam_unix(sudo:session): session closed for user root
Feb 23 09:49:15 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626463.localdomain.devices.0}] v 0)
Feb 23 09:49:15 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' 
Feb 23 09:49:15 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626463.localdomain}] v 0)
Feb 23 09:49:15 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' 
Feb 23 09:49:15 np0005626463.localdomain ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring osd.5 (monmap changed)...
Feb 23 09:49:15 np0005626463.localdomain ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring osd.5 (monmap changed)...
Feb 23 09:49:15 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e14 handle_command mon_command({"prefix": "auth get", "entity": "osd.5"} v 0)
Feb 23 09:49:15 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch
Feb 23 09:49:15 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e14 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 23 09:49:15 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:49:15 np0005626463.localdomain ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.5 on np0005626463.localdomain
Feb 23 09:49:15 np0005626463.localdomain ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.5 on np0005626463.localdomain
Feb 23 09:49:15 np0005626463.localdomain sudo[303333]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 09:49:15 np0005626463.localdomain sudo[303333]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:49:15 np0005626463.localdomain sudo[303333]: pam_unix(sudo:session): session closed for user root
Feb 23 09:49:15 np0005626463.localdomain ceph-mgr[288036]: log_channel(cluster) log [DBG] : pgmap v37: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:49:15 np0005626463.localdomain sudo[303351]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid f1fea371-cb69-578d-a3d0-b5c472a84b46
Feb 23 09:49:15 np0005626463.localdomain sudo[303351]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:49:16 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:49:16.074 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:49:16 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:49:16.108 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:49:16 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:49:16.108 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5034 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 09:49:16 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:49:16.108 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:49:16 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:49:16.109 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:49:16 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:49:16.111 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:49:16 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e84 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:49:16 np0005626463.localdomain podman[303386]: 
Feb 23 09:49:16 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-0e289e261848ed4c1c2482eaf8df4286950606cb840cd763a5abd9c3a2de1513-merged.mount: Deactivated successfully.
Feb 23 09:49:16 np0005626463.localdomain podman[303386]: 2026-02-23 09:49:16.434219514 +0000 UTC m=+0.085872665 container create cd29203827fcbd15bb328840c656403d3a1d833b613782956c6e12ce401a1759 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=blissful_greider, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, RELEASE=main, io.buildah.version=1.42.2, org.opencontainers.image.created=2026-02-09T10:25:24Z, com.redhat.component=rhceph-container, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7, name=rhceph, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, GIT_BRANCH=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, release=1770267347, distribution-scope=public, io.openshift.tags=rhceph ceph, build-date=2026-02-09T10:25:24Z, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 23 09:49:16 np0005626463.localdomain systemd[1]: Started libpod-conmon-cd29203827fcbd15bb328840c656403d3a1d833b613782956c6e12ce401a1759.scope.
Feb 23 09:49:16 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 09:49:16 np0005626463.localdomain podman[303386]: 2026-02-23 09:49:16.394225446 +0000 UTC m=+0.045878597 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 23 09:49:16 np0005626463.localdomain podman[303386]: 2026-02-23 09:49:16.502807301 +0000 UTC m=+0.154460452 container init cd29203827fcbd15bb328840c656403d3a1d833b613782956c6e12ce401a1759 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=blissful_greider, architecture=x86_64, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, version=7, release=1770267347, org.opencontainers.image.created=2026-02-09T10:25:24Z, CEPH_POINT_RELEASE=, ceph=True, io.buildah.version=1.42.2, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, name=rhceph, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2026-02-09T10:25:24Z, distribution-scope=public, com.redhat.component=rhceph-container)
Feb 23 09:49:16 np0005626463.localdomain podman[303386]: 2026-02-23 09:49:16.513576619 +0000 UTC m=+0.165229780 container start cd29203827fcbd15bb328840c656403d3a1d833b613782956c6e12ce401a1759 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=blissful_greider, RELEASE=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, distribution-scope=public, name=rhceph, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.42.2, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2026-02-09T10:25:24Z, version=7, io.openshift.expose-services=, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, ceph=True, release=1770267347, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Feb 23 09:49:16 np0005626463.localdomain podman[303386]: 2026-02-23 09:49:16.513861688 +0000 UTC m=+0.165514849 container attach cd29203827fcbd15bb328840c656403d3a1d833b613782956c6e12ce401a1759 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=blissful_greider, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2026-02-09T10:25:24Z, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, release=1770267347, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.42.2, GIT_CLEAN=True, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2026-02-09T10:25:24Z, version=7, distribution-scope=public, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., name=rhceph, io.k8s.description=Red Hat Ceph Storage 7)
Feb 23 09:49:16 np0005626463.localdomain blissful_greider[303401]: 167 167
Feb 23 09:49:16 np0005626463.localdomain podman[303386]: 2026-02-23 09:49:16.516939391 +0000 UTC m=+0.168592562 container died cd29203827fcbd15bb328840c656403d3a1d833b613782956c6e12ce401a1759 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=blissful_greider, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, vendor=Red Hat, Inc., GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.created=2026-02-09T10:25:24Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=7, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1770267347, io.buildah.version=1.42.2, GIT_BRANCH=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, build-date=2026-02-09T10:25:24Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7)
Feb 23 09:49:16 np0005626463.localdomain systemd[1]: libpod-cd29203827fcbd15bb328840c656403d3a1d833b613782956c6e12ce401a1759.scope: Deactivated successfully.
Feb 23 09:49:16 np0005626463.localdomain podman[303406]: 2026-02-23 09:49:16.609075506 +0000 UTC m=+0.081844272 container remove cd29203827fcbd15bb328840c656403d3a1d833b613782956c6e12ce401a1759 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=blissful_greider, vendor=Red Hat, Inc., GIT_BRANCH=main, distribution-scope=public, io.openshift.expose-services=, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, release=1770267347, RELEASE=main, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.created=2026-02-09T10:25:24Z, build-date=2026-02-09T10:25:24Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-type=git, io.openshift.tags=rhceph ceph, ceph=True, io.buildah.version=1.42.2, name=rhceph, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, version=7, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Feb 23 09:49:16 np0005626463.localdomain systemd[1]: libpod-conmon-cd29203827fcbd15bb328840c656403d3a1d833b613782956c6e12ce401a1759.scope: Deactivated successfully.
Feb 23 09:49:16 np0005626463.localdomain sudo[303351]: pam_unix(sudo:session): session closed for user root
Feb 23 09:49:16 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626463.localdomain.devices.0}] v 0)
Feb 23 09:49:16 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' 
Feb 23 09:49:16 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' 
Feb 23 09:49:16 np0005626463.localdomain ceph-mon[294160]: Reconfiguring osd.5 (monmap changed)...
Feb 23 09:49:16 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch
Feb 23 09:49:16 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:49:16 np0005626463.localdomain ceph-mon[294160]: Reconfiguring daemon osd.5 on np0005626463.localdomain
Feb 23 09:49:16 np0005626463.localdomain ceph-mon[294160]: pgmap v37: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:49:16 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' 
Feb 23 09:49:16 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626463.localdomain}] v 0)
Feb 23 09:49:16 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' 
Feb 23 09:49:16 np0005626463.localdomain ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring mds.mds.np0005626463.qcthuc (monmap changed)...
Feb 23 09:49:16 np0005626463.localdomain ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring mds.mds.np0005626463.qcthuc (monmap changed)...
Feb 23 09:49:16 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e14 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mds.mds.np0005626463.qcthuc", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} v 0)
Feb 23 09:49:16 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005626463.qcthuc", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 23 09:49:16 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e14 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 23 09:49:16 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:49:16 np0005626463.localdomain ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring daemon mds.mds.np0005626463.qcthuc on np0005626463.localdomain
Feb 23 09:49:16 np0005626463.localdomain ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring daemon mds.mds.np0005626463.qcthuc on np0005626463.localdomain
Feb 23 09:49:16 np0005626463.localdomain sudo[303429]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 09:49:16 np0005626463.localdomain sudo[303429]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:49:16 np0005626463.localdomain sudo[303429]: pam_unix(sudo:session): session closed for user root
Feb 23 09:49:16 np0005626463.localdomain sudo[303447]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid f1fea371-cb69-578d-a3d0-b5c472a84b46
Feb 23 09:49:16 np0005626463.localdomain sudo[303447]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:49:17 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:49:17.055 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:49:17 np0005626463.localdomain podman[303482]: 
Feb 23 09:49:17 np0005626463.localdomain podman[303482]: 2026-02-23 09:49:17.421679907 +0000 UTC m=+0.077536680 container create b92e1273ab597d7dfe6cd268790d1721bd07b01929b188365a427cc190026b04 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=thirsty_nightingale, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, ceph=True, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, build-date=2026-02-09T10:25:24Z, RELEASE=main, version=7, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.buildah.version=1.42.2, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, release=1770267347, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, name=rhceph, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git)
Feb 23 09:49:17 np0005626463.localdomain systemd[1]: tmp-crun.I5r2fT.mount: Deactivated successfully.
Feb 23 09:49:17 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-8179127daad17bf965845a3875a822603381bc2f529d8f3cdc6afa4736d2b6f3-merged.mount: Deactivated successfully.
Feb 23 09:49:17 np0005626463.localdomain systemd[1]: Started libpod-conmon-b92e1273ab597d7dfe6cd268790d1721bd07b01929b188365a427cc190026b04.scope.
Feb 23 09:49:17 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 09:49:17 np0005626463.localdomain podman[303482]: 2026-02-23 09:49:17.481650903 +0000 UTC m=+0.137507686 container init b92e1273ab597d7dfe6cd268790d1721bd07b01929b188365a427cc190026b04 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=thirsty_nightingale, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.buildah.version=1.42.2, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, name=rhceph, release=1770267347, GIT_BRANCH=main, distribution-scope=public, com.redhat.component=rhceph-container, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-02-09T10:25:24Z, ceph=True, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2026-02-09T10:25:24Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Feb 23 09:49:17 np0005626463.localdomain podman[303482]: 2026-02-23 09:49:17.391621653 +0000 UTC m=+0.047478466 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 23 09:49:17 np0005626463.localdomain podman[303482]: 2026-02-23 09:49:17.491650027 +0000 UTC m=+0.147506810 container start b92e1273ab597d7dfe6cd268790d1721bd07b01929b188365a427cc190026b04 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=thirsty_nightingale, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, version=7, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2026-02-09T10:25:24Z, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1770267347, com.redhat.component=rhceph-container, io.buildah.version=1.42.2, ceph=True, description=Red Hat Ceph Storage 7, distribution-scope=public, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14)
Feb 23 09:49:17 np0005626463.localdomain podman[303482]: 2026-02-23 09:49:17.491923965 +0000 UTC m=+0.147780778 container attach b92e1273ab597d7dfe6cd268790d1721bd07b01929b188365a427cc190026b04 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=thirsty_nightingale, CEPH_POINT_RELEASE=, version=7, ceph=True, distribution-scope=public, name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_CLEAN=True, build-date=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, release=1770267347, com.redhat.component=rhceph-container, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, io.buildah.version=1.42.2, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.tags=rhceph ceph, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 23 09:49:17 np0005626463.localdomain thirsty_nightingale[303497]: 167 167
Feb 23 09:49:17 np0005626463.localdomain systemd[1]: libpod-b92e1273ab597d7dfe6cd268790d1721bd07b01929b188365a427cc190026b04.scope: Deactivated successfully.
Feb 23 09:49:17 np0005626463.localdomain podman[303482]: 2026-02-23 09:49:17.494087101 +0000 UTC m=+0.149943874 container died b92e1273ab597d7dfe6cd268790d1721bd07b01929b188365a427cc190026b04 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=thirsty_nightingale, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, release=1770267347, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, GIT_CLEAN=True, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-02-09T10:25:24Z, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.created=2026-02-09T10:25:24Z, name=rhceph, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, io.buildah.version=1.42.2, vcs-type=git, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64)
Feb 23 09:49:17 np0005626463.localdomain podman[303502]: 2026-02-23 09:49:17.591345341 +0000 UTC m=+0.085761111 container remove b92e1273ab597d7dfe6cd268790d1721bd07b01929b188365a427cc190026b04 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=thirsty_nightingale, release=1770267347, ceph=True, version=7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, CEPH_POINT_RELEASE=, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, org.opencontainers.image.created=2026-02-09T10:25:24Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.42.2, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, build-date=2026-02-09T10:25:24Z, name=rhceph, vendor=Red Hat, Inc., GIT_BRANCH=main)
Feb 23 09:49:17 np0005626463.localdomain systemd[1]: libpod-conmon-b92e1273ab597d7dfe6cd268790d1721bd07b01929b188365a427cc190026b04.scope: Deactivated successfully.
Feb 23 09:49:17 np0005626463.localdomain sudo[303447]: pam_unix(sudo:session): session closed for user root
Feb 23 09:49:17 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626463.localdomain.devices.0}] v 0)
Feb 23 09:49:17 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' 
Feb 23 09:49:17 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626463.localdomain}] v 0)
Feb 23 09:49:17 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' 
Feb 23 09:49:17 np0005626463.localdomain ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring mgr.np0005626463.wtksup (monmap changed)...
Feb 23 09:49:17 np0005626463.localdomain ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring mgr.np0005626463.wtksup (monmap changed)...
Feb 23 09:49:17 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e14 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mgr.np0005626463.wtksup", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} v 0)
Feb 23 09:49:17 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626463.wtksup", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 23 09:49:17 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e14 handle_command mon_command({"prefix": "mgr services"} v 0)
Feb 23 09:49:17 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mgr services"} : dispatch
Feb 23 09:49:17 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e14 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 23 09:49:17 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:49:17 np0005626463.localdomain ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring daemon mgr.np0005626463.wtksup on np0005626463.localdomain
Feb 23 09:49:17 np0005626463.localdomain ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring daemon mgr.np0005626463.wtksup on np0005626463.localdomain
Feb 23 09:49:17 np0005626463.localdomain sudo[303519]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 09:49:17 np0005626463.localdomain sudo[303519]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:49:17 np0005626463.localdomain sudo[303519]: pam_unix(sudo:session): session closed for user root
Feb 23 09:49:17 np0005626463.localdomain sudo[303537]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid f1fea371-cb69-578d-a3d0-b5c472a84b46
Feb 23 09:49:17 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' 
Feb 23 09:49:17 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' 
Feb 23 09:49:17 np0005626463.localdomain ceph-mon[294160]: Reconfiguring mds.mds.np0005626463.qcthuc (monmap changed)...
Feb 23 09:49:17 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005626463.qcthuc", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 23 09:49:17 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:49:17 np0005626463.localdomain ceph-mon[294160]: Reconfiguring daemon mds.mds.np0005626463.qcthuc on np0005626463.localdomain
Feb 23 09:49:17 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' 
Feb 23 09:49:17 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' 
Feb 23 09:49:17 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626463.wtksup", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 23 09:49:17 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mgr services"} : dispatch
Feb 23 09:49:17 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:49:17 np0005626463.localdomain sudo[303537]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:49:17 np0005626463.localdomain ceph-mgr[288036]: log_channel(cluster) log [DBG] : pgmap v38: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:49:18 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:49:18.054 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:49:18 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:49:18.055 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:49:18 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:49:18.055 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 23 09:49:18 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:49:18.056 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:49:18 np0005626463.localdomain podman[303571]: 
Feb 23 09:49:18 np0005626463.localdomain podman[303571]: 2026-02-23 09:49:18.263379415 +0000 UTC m=+0.074288423 container create e95808fe281fa179cd8bf04e62a45de2e81ebda56cd753e6b5a7b3aa87a399f8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=suspicious_joliot, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, RELEASE=main, release=1770267347, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, CEPH_POINT_RELEASE=, build-date=2026-02-09T10:25:24Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.created=2026-02-09T10:25:24Z, name=rhceph, GIT_BRANCH=main, io.buildah.version=1.42.2, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, vendor=Red Hat, Inc., com.redhat.component=rhceph-container)
Feb 23 09:49:18 np0005626463.localdomain systemd[1]: Started libpod-conmon-e95808fe281fa179cd8bf04e62a45de2e81ebda56cd753e6b5a7b3aa87a399f8.scope.
Feb 23 09:49:18 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 09:49:18 np0005626463.localdomain podman[303571]: 2026-02-23 09:49:18.319813772 +0000 UTC m=+0.130722810 container init e95808fe281fa179cd8bf04e62a45de2e81ebda56cd753e6b5a7b3aa87a399f8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=suspicious_joliot, name=rhceph, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_CLEAN=True, io.buildah.version=1.42.2, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, vcs-type=git, architecture=x86_64, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, version=7, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, ceph=True, GIT_BRANCH=main, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2026-02-09T10:25:24Z, org.opencontainers.image.created=2026-02-09T10:25:24Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, release=1770267347)
Feb 23 09:49:18 np0005626463.localdomain suspicious_joliot[303586]: 167 167
Feb 23 09:49:18 np0005626463.localdomain podman[303571]: 2026-02-23 09:49:18.330950301 +0000 UTC m=+0.141859329 container start e95808fe281fa179cd8bf04e62a45de2e81ebda56cd753e6b5a7b3aa87a399f8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=suspicious_joliot, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, GIT_CLEAN=True, ceph=True, GIT_BRANCH=main, architecture=x86_64, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, build-date=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-02-09T10:25:24Z, io.openshift.expose-services=, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, io.buildah.version=1.42.2, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1770267347, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph)
Feb 23 09:49:18 np0005626463.localdomain podman[303571]: 2026-02-23 09:49:18.331555239 +0000 UTC m=+0.142464257 container attach e95808fe281fa179cd8bf04e62a45de2e81ebda56cd753e6b5a7b3aa87a399f8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=suspicious_joliot, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., ceph=True, build-date=2026-02-09T10:25:24Z, release=1770267347, version=7, vcs-type=git, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.42.2, RELEASE=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, architecture=x86_64, com.redhat.component=rhceph-container, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, CEPH_POINT_RELEASE=, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Feb 23 09:49:18 np0005626463.localdomain podman[303571]: 2026-02-23 09:49:18.233091432 +0000 UTC m=+0.044000470 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 23 09:49:18 np0005626463.localdomain systemd[1]: libpod-e95808fe281fa179cd8bf04e62a45de2e81ebda56cd753e6b5a7b3aa87a399f8.scope: Deactivated successfully.
Feb 23 09:49:18 np0005626463.localdomain podman[303571]: 2026-02-23 09:49:18.333915761 +0000 UTC m=+0.144824769 container died e95808fe281fa179cd8bf04e62a45de2e81ebda56cd753e6b5a7b3aa87a399f8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=suspicious_joliot, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.42.2, architecture=x86_64, release=1770267347, io.openshift.tags=rhceph ceph, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2026-02-09T10:25:24Z, distribution-scope=public, name=rhceph, version=7, GIT_CLEAN=True, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.component=rhceph-container, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, RELEASE=main, vcs-type=git, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Feb 23 09:49:18 np0005626463.localdomain podman[303591]: 2026-02-23 09:49:18.426288532 +0000 UTC m=+0.079229062 container remove e95808fe281fa179cd8bf04e62a45de2e81ebda56cd753e6b5a7b3aa87a399f8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=suspicious_joliot, GIT_BRANCH=main, version=7, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.buildah.version=1.42.2, io.openshift.expose-services=, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, build-date=2026-02-09T10:25:24Z, org.opencontainers.image.created=2026-02-09T10:25:24Z, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, architecture=x86_64, com.redhat.component=rhceph-container, distribution-scope=public, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, release=1770267347, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Feb 23 09:49:18 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-89335beadee5d2918112b22abd8fd4f968d60c2807236c28273a7ecee2122fa4-merged.mount: Deactivated successfully.
Feb 23 09:49:18 np0005626463.localdomain systemd[1]: libpod-conmon-e95808fe281fa179cd8bf04e62a45de2e81ebda56cd753e6b5a7b3aa87a399f8.scope: Deactivated successfully.
Feb 23 09:49:18 np0005626463.localdomain sudo[303537]: pam_unix(sudo:session): session closed for user root
Feb 23 09:49:18 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626463.localdomain.devices.0}] v 0)
Feb 23 09:49:18 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' 
Feb 23 09:49:18 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626463.localdomain}] v 0)
Feb 23 09:49:18 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' 
Feb 23 09:49:18 np0005626463.localdomain ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring crash.np0005626465 (monmap changed)...
Feb 23 09:49:18 np0005626463.localdomain ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring crash.np0005626465 (monmap changed)...
Feb 23 09:49:18 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e14 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.crash.np0005626465", "caps": ["mon", "profile crash", "mgr", "profile crash"]} v 0)
Feb 23 09:49:18 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626465", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 23 09:49:18 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e14 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 23 09:49:18 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:49:18 np0005626463.localdomain ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring daemon crash.np0005626465 on np0005626465.localdomain
Feb 23 09:49:18 np0005626463.localdomain ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring daemon crash.np0005626465 on np0005626465.localdomain
Feb 23 09:49:18 np0005626463.localdomain ceph-mon[294160]: Reconfiguring mgr.np0005626463.wtksup (monmap changed)...
Feb 23 09:49:18 np0005626463.localdomain ceph-mon[294160]: Reconfiguring daemon mgr.np0005626463.wtksup on np0005626463.localdomain
Feb 23 09:49:18 np0005626463.localdomain ceph-mon[294160]: pgmap v38: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:49:18 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' 
Feb 23 09:49:18 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' 
Feb 23 09:49:18 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626465", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 23 09:49:18 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:49:19 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:49:19.063 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:49:19 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:49:19.064 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:49:19 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:49:19.064 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Feb 23 09:49:19 np0005626463.localdomain sshd[303608]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:49:19 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626465.localdomain.devices.0}] v 0)
Feb 23 09:49:19 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' 
Feb 23 09:49:19 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626465.localdomain}] v 0)
Feb 23 09:49:19 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' 
Feb 23 09:49:19 np0005626463.localdomain ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring osd.0 (monmap changed)...
Feb 23 09:49:19 np0005626463.localdomain ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring osd.0 (monmap changed)...
Feb 23 09:49:19 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e14 handle_command mon_command({"prefix": "auth get", "entity": "osd.0"} v 0)
Feb 23 09:49:19 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch
Feb 23 09:49:19 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e14 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 23 09:49:19 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:49:19 np0005626463.localdomain ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.0 on np0005626465.localdomain
Feb 23 09:49:19 np0005626463.localdomain ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.0 on np0005626465.localdomain
Feb 23 09:49:19 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:49:19.807 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:49:19 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:49:19.833 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Triggering sync for uuid c2a7d92b-952f-46a7-8a6a-3322a48fcf4b _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Feb 23 09:49:19 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:49:19.834 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 09:49:19 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:49:19.834 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 09:49:19 np0005626463.localdomain ceph-mon[294160]: Reconfiguring crash.np0005626465 (monmap changed)...
Feb 23 09:49:19 np0005626463.localdomain ceph-mon[294160]: Reconfiguring daemon crash.np0005626465 on np0005626465.localdomain
Feb 23 09:49:19 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' 
Feb 23 09:49:19 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' 
Feb 23 09:49:19 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch
Feb 23 09:49:19 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:49:19 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:49:19.859 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.024s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 09:49:19 np0005626463.localdomain ceph-mgr[288036]: log_channel(cluster) log [DBG] : pgmap v39: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:49:20 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:49:20.054 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:49:20 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:49:20.055 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:49:20 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:49:20.210 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 09:49:20 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:49:20.210 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 09:49:20 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:49:20.211 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 09:49:20 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:49:20.211 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Auditing locally available compute resources for np0005626463.localdomain (node: np0005626463.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 23 09:49:20 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:49:20.212 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 09:49:20 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626465.localdomain.devices.0}] v 0)
Feb 23 09:49:20 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' 
Feb 23 09:49:20 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626465.localdomain}] v 0)
Feb 23 09:49:20 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' 
Feb 23 09:49:20 np0005626463.localdomain ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring osd.3 (monmap changed)...
Feb 23 09:49:20 np0005626463.localdomain ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring osd.3 (monmap changed)...
Feb 23 09:49:20 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e14 handle_command mon_command({"prefix": "auth get", "entity": "osd.3"} v 0)
Feb 23 09:49:20 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch
Feb 23 09:49:20 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e14 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 23 09:49:20 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:49:20 np0005626463.localdomain ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.3 on np0005626465.localdomain
Feb 23 09:49:20 np0005626463.localdomain ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.3 on np0005626465.localdomain
Feb 23 09:49:20 np0005626463.localdomain sshd[303608]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:49:20 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e14 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 23 09:49:20 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/4209324860' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 09:49:20 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:49:20.660 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.448s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 09:49:20 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:49:20.718 282211 DEBUG nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 23 09:49:20 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:49:20.718 282211 DEBUG nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 23 09:49:20 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:49:20.896 282211 WARNING nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 23 09:49:20 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:49:20.898 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Hypervisor/Node resource view: name=np0005626463.localdomain free_ram=11736MB free_disk=41.8366584777832GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 23 09:49:20 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:49:20.898 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 09:49:20 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:49:20.898 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 09:49:20 np0005626463.localdomain ceph-mon[294160]: Reconfiguring osd.0 (monmap changed)...
Feb 23 09:49:20 np0005626463.localdomain ceph-mon[294160]: Reconfiguring daemon osd.0 on np0005626465.localdomain
Feb 23 09:49:20 np0005626463.localdomain ceph-mon[294160]: pgmap v39: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:49:20 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' 
Feb 23 09:49:20 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' 
Feb 23 09:49:20 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch
Feb 23 09:49:20 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:49:20 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.106:0/4209324860' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 09:49:21 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:49:21.040 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Instance c2a7d92b-952f-46a7-8a6a-3322a48fcf4b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 23 09:49:21 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:49:21.041 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 23 09:49:21 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:49:21.041 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Final resource view: name=np0005626463.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 23 09:49:21 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:49:21.095 282211 DEBUG nova.scheduler.client.report [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Refreshing inventories for resource provider be63d86c-a403-4ec9-a515-07ea2962cb4d _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Feb 23 09:49:21 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:49:21.112 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:49:21 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:49:21.114 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:49:21 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:49:21.114 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 09:49:21 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:49:21.114 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:49:21 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e84 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:49:21 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:49:21.137 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:49:21 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:49:21.138 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:49:21 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:49:21.159 282211 DEBUG nova.scheduler.client.report [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Updating ProviderTree inventory for provider be63d86c-a403-4ec9-a515-07ea2962cb4d from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Feb 23 09:49:21 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:49:21.160 282211 DEBUG nova.compute.provider_tree [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Updating inventory in ProviderTree for provider be63d86c-a403-4ec9-a515-07ea2962cb4d with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Feb 23 09:49:21 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:49:21.183 282211 DEBUG nova.scheduler.client.report [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Refreshing aggregate associations for resource provider be63d86c-a403-4ec9-a515-07ea2962cb4d, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Feb 23 09:49:21 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:49:21.210 282211 DEBUG nova.scheduler.client.report [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Refreshing trait associations for resource provider be63d86c-a403-4ec9-a515-07ea2962cb4d, traits: HW_CPU_X86_AVX2,COMPUTE_ACCELERATORS,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_MMX,HW_CPU_X86_SSE41,HW_CPU_X86_SSE2,HW_CPU_X86_SVM,COMPUTE_TRUSTED_CERTS,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_SECURITY_TPM_2_0,COMPUTE_STORAGE_BUS_FDC,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_E1000,HW_CPU_X86_BMI2,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_RESCUE_BFV,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_CLMUL,HW_CPU_X86_SHA,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_AESNI,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_DEVICE_TAGGING,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_ABM,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NODE,COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_CIRRUS,HW_CPU_X86_SSE4A,HW_CPU_X86_BMI,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_F16C,COMPUTE_STORAGE_BUS_IDE,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_SSSE3,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_FMA3,HW_CPU_X86_AMD_SVM,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_AVX,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_SSE42 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Feb 23 09:49:21 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:49:21.247 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 09:49:21 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626465.localdomain.devices.0}] v 0)
Feb 23 09:49:21 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' 
Feb 23 09:49:21 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626465.localdomain}] v 0)
Feb 23 09:49:21 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' 
Feb 23 09:49:21 np0005626463.localdomain ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring mds.mds.np0005626465.drvnoy (monmap changed)...
Feb 23 09:49:21 np0005626463.localdomain ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring mds.mds.np0005626465.drvnoy (monmap changed)...
Feb 23 09:49:21 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e14 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mds.mds.np0005626465.drvnoy", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} v 0)
Feb 23 09:49:21 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005626465.drvnoy", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 23 09:49:21 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e14 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 23 09:49:21 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:49:21 np0005626463.localdomain ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring daemon mds.mds.np0005626465.drvnoy on np0005626465.localdomain
Feb 23 09:49:21 np0005626463.localdomain ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring daemon mds.mds.np0005626465.drvnoy on np0005626465.localdomain
Feb 23 09:49:21 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e14 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 23 09:49:21 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/1220083815' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 09:49:21 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:49:21.699 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.452s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 09:49:21 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:49:21.708 282211 DEBUG nova.compute.provider_tree [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Inventory has not changed in ProviderTree for provider: be63d86c-a403-4ec9-a515-07ea2962cb4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 23 09:49:21 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:49:21.730 282211 DEBUG nova.scheduler.client.report [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Inventory has not changed for provider be63d86c-a403-4ec9-a515-07ea2962cb4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 23 09:49:21 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:49:21.733 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Compute_service record updated for np0005626463.localdomain:np0005626463.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 23 09:49:21 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:49:21.733 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.835s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 09:49:21 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:49:21.734 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:49:21 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:49:21.734 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Feb 23 09:49:21 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:49:21.753 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Feb 23 09:49:21 np0005626463.localdomain ceph-mgr[288036]: log_channel(cluster) log [DBG] : pgmap v40: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:49:21 np0005626463.localdomain ceph-mon[294160]: Reconfiguring osd.3 (monmap changed)...
Feb 23 09:49:21 np0005626463.localdomain ceph-mon[294160]: Reconfiguring daemon osd.3 on np0005626465.localdomain
Feb 23 09:49:21 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' 
Feb 23 09:49:21 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' 
Feb 23 09:49:21 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005626465.drvnoy", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 23 09:49:21 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:49:21 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.106:0/1220083815' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 09:49:22 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626465.localdomain.devices.0}] v 0)
Feb 23 09:49:22 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' 
Feb 23 09:49:22 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626465.localdomain}] v 0)
Feb 23 09:49:22 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' 
Feb 23 09:49:22 np0005626463.localdomain ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring mgr.np0005626465.hlpkwo (monmap changed)...
Feb 23 09:49:22 np0005626463.localdomain ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring mgr.np0005626465.hlpkwo (monmap changed)...
Feb 23 09:49:22 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e14 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mgr.np0005626465.hlpkwo", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} v 0)
Feb 23 09:49:22 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626465.hlpkwo", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 23 09:49:22 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e14 handle_command mon_command({"prefix": "mgr services"} v 0)
Feb 23 09:49:22 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mgr services"} : dispatch
Feb 23 09:49:22 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e14 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 23 09:49:22 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:49:22 np0005626463.localdomain ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring daemon mgr.np0005626465.hlpkwo on np0005626465.localdomain
Feb 23 09:49:22 np0005626463.localdomain ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring daemon mgr.np0005626465.hlpkwo on np0005626465.localdomain
Feb 23 09:49:22 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:49:22.753 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:49:22 np0005626463.localdomain ceph-mon[294160]: Reconfiguring mds.mds.np0005626465.drvnoy (monmap changed)...
Feb 23 09:49:22 np0005626463.localdomain ceph-mon[294160]: Reconfiguring daemon mds.mds.np0005626465.drvnoy on np0005626465.localdomain
Feb 23 09:49:22 np0005626463.localdomain ceph-mon[294160]: pgmap v40: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:49:22 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' 
Feb 23 09:49:22 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' 
Feb 23 09:49:22 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626465.hlpkwo", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 23 09:49:22 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mgr services"} : dispatch
Feb 23 09:49:22 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:49:23 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626465.localdomain.devices.0}] v 0)
Feb 23 09:49:23 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' 
Feb 23 09:49:23 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626465.localdomain}] v 0)
Feb 23 09:49:23 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' 
Feb 23 09:49:23 np0005626463.localdomain ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring crash.np0005626466 (monmap changed)...
Feb 23 09:49:23 np0005626463.localdomain ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring crash.np0005626466 (monmap changed)...
Feb 23 09:49:23 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e14 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.crash.np0005626466", "caps": ["mon", "profile crash", "mgr", "profile crash"]} v 0)
Feb 23 09:49:23 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626466", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 23 09:49:23 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e14 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 23 09:49:23 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:49:23 np0005626463.localdomain ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring daemon crash.np0005626466 on np0005626466.localdomain
Feb 23 09:49:23 np0005626463.localdomain ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring daemon crash.np0005626466 on np0005626466.localdomain
Feb 23 09:49:23 np0005626463.localdomain ceph-mgr[288036]: log_channel(cluster) log [DBG] : pgmap v41: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:49:23 np0005626463.localdomain ceph-mgr[288036]: log_channel(audit) log [DBG] : from='client.54153 -' entity='client.admin' cmd=[{"prefix": "orch daemon add", "daemon_type": "mon", "placement": "np0005626466.localdomain:172.18.0.105", "target": ["mon-mgr", ""]}]: dispatch
Feb 23 09:49:23 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/spec.mon}] v 0)
Feb 23 09:49:23 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' 
Feb 23 09:49:23 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e14 handle_command mon_command({"prefix": "auth get", "entity": "mon."} v 0)
Feb 23 09:49:23 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Feb 23 09:49:23 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e14 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 23 09:49:23 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:49:23 np0005626463.localdomain ceph-mgr[288036]: [cephadm INFO cephadm.serve] Deploying daemon mon.np0005626466 on np0005626466.localdomain
Feb 23 09:49:23 np0005626463.localdomain ceph-mgr[288036]: log_channel(cephadm) log [INF] : Deploying daemon mon.np0005626466 on np0005626466.localdomain
Feb 23 09:49:23 np0005626463.localdomain ceph-mon[294160]: Reconfiguring mgr.np0005626465.hlpkwo (monmap changed)...
Feb 23 09:49:23 np0005626463.localdomain ceph-mon[294160]: Reconfiguring daemon mgr.np0005626465.hlpkwo on np0005626465.localdomain
Feb 23 09:49:23 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' 
Feb 23 09:49:23 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' 
Feb 23 09:49:23 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626466", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 23 09:49:23 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:49:23 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' 
Feb 23 09:49:23 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Feb 23 09:49:23 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:49:24 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain.devices.0}] v 0)
Feb 23 09:49:24 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' 
Feb 23 09:49:24 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain}] v 0)
Feb 23 09:49:24 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' 
Feb 23 09:49:24 np0005626463.localdomain ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring osd.1 (monmap changed)...
Feb 23 09:49:24 np0005626463.localdomain ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring osd.1 (monmap changed)...
Feb 23 09:49:24 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e14 handle_command mon_command({"prefix": "auth get", "entity": "osd.1"} v 0)
Feb 23 09:49:24 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch
Feb 23 09:49:24 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e14 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 23 09:49:24 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:49:24 np0005626463.localdomain ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.1 on np0005626466.localdomain
Feb 23 09:49:24 np0005626463.localdomain ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.1 on np0005626466.localdomain
Feb 23 09:49:24 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.
Feb 23 09:49:24 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.
Feb 23 09:49:24 np0005626463.localdomain podman[303654]: 2026-02-23 09:49:24.919291875 +0000 UTC m=+0.090379782 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.43.0)
Feb 23 09:49:24 np0005626463.localdomain podman[303654]: 2026-02-23 09:49:24.957317922 +0000 UTC m=+0.128405829 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, container_name=ovn_controller, io.buildah.version=1.43.0, org.label-schema.build-date=20260216)
Feb 23 09:49:24 np0005626463.localdomain systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully.
Feb 23 09:49:24 np0005626463.localdomain ceph-mon[294160]: Reconfiguring crash.np0005626466 (monmap changed)...
Feb 23 09:49:24 np0005626463.localdomain ceph-mon[294160]: Reconfiguring daemon crash.np0005626466 on np0005626466.localdomain
Feb 23 09:49:24 np0005626463.localdomain ceph-mon[294160]: pgmap v41: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:49:24 np0005626463.localdomain ceph-mon[294160]: from='client.54153 -' entity='client.admin' cmd=[{"prefix": "orch daemon add", "daemon_type": "mon", "placement": "np0005626466.localdomain:172.18.0.105", "target": ["mon-mgr", ""]}]: dispatch
Feb 23 09:49:24 np0005626463.localdomain ceph-mon[294160]: Deploying daemon mon.np0005626466 on np0005626466.localdomain
Feb 23 09:49:24 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' 
Feb 23 09:49:24 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' 
Feb 23 09:49:24 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch
Feb 23 09:49:24 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:49:25 np0005626463.localdomain podman[303655]: 2026-02-23 09:49:24.962097848 +0000 UTC m=+0.131457272 container health_status bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 23 09:49:25 np0005626463.localdomain podman[303655]: 2026-02-23 09:49:25.045567649 +0000 UTC m=+0.214927033 container exec_died bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 23 09:49:25 np0005626463.localdomain systemd[1]: bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.service: Deactivated successfully.
Feb 23 09:49:25 np0005626463.localdomain ceph-mgr[288036]: log_channel(cluster) log [DBG] : pgmap v42: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:49:25 np0005626463.localdomain ceph-mon[294160]: Reconfiguring osd.1 (monmap changed)...
Feb 23 09:49:25 np0005626463.localdomain ceph-mon[294160]: Reconfiguring daemon osd.1 on np0005626466.localdomain
Feb 23 09:49:26 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e84 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:49:26 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:49:26.139 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:49:26 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:49:26.141 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:49:26 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:49:26.141 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 09:49:26 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:49:26.141 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:49:26 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:49:26.175 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:49:26 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:49:26.175 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:49:26 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain.devices.0}] v 0)
Feb 23 09:49:26 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' 
Feb 23 09:49:26 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e14  adding peer [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] to list of hints
Feb 23 09:49:26 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e14  adding peer [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] to list of hints
Feb 23 09:49:26 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain}] v 0)
Feb 23 09:49:26 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' 
Feb 23 09:49:26 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e14  adding peer [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] to list of hints
Feb 23 09:49:26 np0005626463.localdomain ceph-mgr[288036]: mgr.server handle_open ignoring open from mon.np0005626466 172.18.0.108:0/53979952; not ready for session (expect reconnect)
Feb 23 09:49:26 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).monmap v14 adding/updating np0005626466 at [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] to monitor cluster
Feb 23 09:49:26 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e14 handle_command mon_command({"prefix": "mon metadata", "id": "np0005626466"} v 0)
Feb 23 09:49:26 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626466"} : dispatch
Feb 23 09:49:26 np0005626463.localdomain ceph-mgr[288036]: mgr finish mon failed to return metadata for mon.np0005626466: (2) No such file or directory
Feb 23 09:49:26 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(probing) e15 handle_command mon_command({"prefix": "mon metadata", "id": "np0005626463"} v 0)
Feb 23 09:49:26 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626463"} : dispatch
Feb 23 09:49:26 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(probing) e15 handle_command mon_command({"prefix": "mon metadata", "id": "np0005626465"} v 0)
Feb 23 09:49:26 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626465"} : dispatch
Feb 23 09:49:26 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(probing) e15 handle_command mon_command({"prefix": "mon metadata", "id": "np0005626466"} v 0)
Feb 23 09:49:26 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626466"} : dispatch
Feb 23 09:49:26 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [INF] : mon.np0005626463 calling monitor election
Feb 23 09:49:26 np0005626463.localdomain ceph-mon[294160]: paxos.0).electionLogic(58) init, last seen epoch 58
Feb 23 09:49:26 np0005626463.localdomain ceph-mgr[288036]: mgr finish mon failed to return metadata for mon.np0005626466: (22) Invalid argument
Feb 23 09:49:26 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(electing) e15 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Feb 23 09:49:27 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(electing) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain.devices.0}] v 0)
Feb 23 09:49:27 np0005626463.localdomain ceph-mgr[288036]: log_channel(cluster) log [DBG] : pgmap v43: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:49:27 np0005626463.localdomain ceph-mgr[288036]: mgr.server handle_open ignoring open from mon.np0005626466 172.18.0.108:0/53979952; not ready for session (expect reconnect)
Feb 23 09:49:27 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(electing) e15 handle_command mon_command({"prefix": "mon metadata", "id": "np0005626466"} v 0)
Feb 23 09:49:27 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626466"} : dispatch
Feb 23 09:49:27 np0005626463.localdomain ceph-mgr[288036]: mgr finish mon failed to return metadata for mon.np0005626466: (22) Invalid argument
Feb 23 09:49:28 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.
Feb 23 09:49:28 np0005626463.localdomain podman[303703]: 2026-02-23 09:49:28.910462166 +0000 UTC m=+0.082758130 container health_status be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ceilometer_agent_compute)
Feb 23 09:49:28 np0005626463.localdomain podman[303703]: 2026-02-23 09:49:28.922342138 +0000 UTC m=+0.094638102 container exec_died be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ceilometer_agent_compute, org.label-schema.build-date=20260216, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.43.0)
Feb 23 09:49:28 np0005626463.localdomain ceph-mgr[288036]: mgr.server handle_open ignoring open from mon.np0005626466 172.18.0.108:0/53979952; not ready for session (expect reconnect)
Feb 23 09:49:28 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(electing) e15 handle_command mon_command({"prefix": "mon metadata", "id": "np0005626466"} v 0)
Feb 23 09:49:28 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626466"} : dispatch
Feb 23 09:49:28 np0005626463.localdomain ceph-mgr[288036]: mgr finish mon failed to return metadata for mon.np0005626466: (22) Invalid argument
Feb 23 09:49:28 np0005626463.localdomain systemd[1]: be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.service: Deactivated successfully.
Feb 23 09:49:29 np0005626463.localdomain ceph-mgr[288036]: log_channel(cluster) log [DBG] : pgmap v44: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:49:29 np0005626463.localdomain ceph-mgr[288036]: mgr.server handle_open ignoring open from mon.np0005626466 172.18.0.108:0/53979952; not ready for session (expect reconnect)
Feb 23 09:49:29 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(electing) e15 handle_command mon_command({"prefix": "mon metadata", "id": "np0005626466"} v 0)
Feb 23 09:49:29 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626466"} : dispatch
Feb 23 09:49:29 np0005626463.localdomain ceph-mgr[288036]: mgr finish mon failed to return metadata for mon.np0005626466: (22) Invalid argument
Feb 23 09:49:30 np0005626463.localdomain ceph-mgr[288036]: mgr.server handle_open ignoring open from mon.np0005626466 172.18.0.108:0/53979952; not ready for session (expect reconnect)
Feb 23 09:49:30 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(electing) e15 handle_command mon_command({"prefix": "mon metadata", "id": "np0005626466"} v 0)
Feb 23 09:49:30 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626466"} : dispatch
Feb 23 09:49:30 np0005626463.localdomain ceph-mgr[288036]: mgr finish mon failed to return metadata for mon.np0005626466: (22) Invalid argument
Feb 23 09:49:31 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:49:31.176 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:49:31 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:49:31.178 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:49:31 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:49:31.178 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 09:49:31 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:49:31.179 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:49:31 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:49:31.213 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:49:31 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:49:31.213 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:49:31 np0005626463.localdomain ceph-mgr[288036]: log_channel(cluster) log [DBG] : pgmap v45: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:49:31 np0005626463.localdomain ceph-mgr[288036]: mgr.server handle_open ignoring open from mon.np0005626466 172.18.0.108:0/53979952; not ready for session (expect reconnect)
Feb 23 09:49:31 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(electing) e15 handle_command mon_command({"prefix": "mon metadata", "id": "np0005626466"} v 0)
Feb 23 09:49:31 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626466"} : dispatch
Feb 23 09:49:31 np0005626463.localdomain ceph-mgr[288036]: mgr finish mon failed to return metadata for mon.np0005626466: (22) Invalid argument
Feb 23 09:49:31 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [INF] : mon.np0005626463 is new leader, mons np0005626463,np0005626465 in quorum (ranks 0,1)
Feb 23 09:49:31 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : monmap epoch 15
Feb 23 09:49:31 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : fsid f1fea371-cb69-578d-a3d0-b5c472a84b46
Feb 23 09:49:31 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : last_changed 2026-02-23T09:49:26.924061+0000
Feb 23 09:49:31 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : created 2026-02-23T07:36:01.997603+0000
Feb 23 09:49:31 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : min_mon_release 18 (reef)
Feb 23 09:49:31 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : election_strategy: 1
Feb 23 09:49:31 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : 0: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005626463
Feb 23 09:49:31 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : 1: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005626465
Feb 23 09:49:31 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : 2: [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] mon.np0005626466
Feb 23 09:49:31 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Feb 23 09:49:31 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : fsmap cephfs:1 {0=mds.np0005626463.qcthuc=up:active} 2 up:standby
Feb 23 09:49:31 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e84: 6 total, 6 up, 6 in
Feb 23 09:49:31 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : mgrmap e32: np0005626463.wtksup(active, since 84s), standbys: np0005626466.nisqfq, np0005626465.hlpkwo, np0005626461.lrfquh
Feb 23 09:49:31 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [WRN] : Health check failed: 1/3 mons down, quorum np0005626463,np0005626465 (MON_DOWN)
Feb 23 09:49:31 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [WRN] : Health detail: HEALTH_WARN 1/3 mons down, quorum np0005626463,np0005626465
Feb 23 09:49:31 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [WRN] : [WRN] MON_DOWN: 1/3 mons down, quorum np0005626463,np0005626465
Feb 23 09:49:31 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [WRN] :     mon.np0005626466 (rank 2) addr [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] is down (out of quorum)
Feb 23 09:49:31 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' 
Feb 23 09:49:31 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain}] v 0)
Feb 23 09:49:31 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' 
Feb 23 09:49:32 np0005626463.localdomain ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring osd.4 (monmap changed)...
Feb 23 09:49:32 np0005626463.localdomain ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring osd.4 (monmap changed)...
Feb 23 09:49:32 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth get", "entity": "osd.4"} v 0)
Feb 23 09:49:32 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch
Feb 23 09:49:32 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 23 09:49:32 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:49:32 np0005626463.localdomain ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.4 on np0005626466.localdomain
Feb 23 09:49:32 np0005626463.localdomain ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.4 on np0005626466.localdomain
Feb 23 09:49:32 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626463"} : dispatch
Feb 23 09:49:32 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626465"} : dispatch
Feb 23 09:49:32 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626466"} : dispatch
Feb 23 09:49:32 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463 calling monitor election
Feb 23 09:49:32 np0005626463.localdomain ceph-mon[294160]: mon.np0005626465 calling monitor election
Feb 23 09:49:32 np0005626463.localdomain ceph-mon[294160]: pgmap v43: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:49:32 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626466"} : dispatch
Feb 23 09:49:32 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626466"} : dispatch
Feb 23 09:49:32 np0005626463.localdomain ceph-mon[294160]: pgmap v44: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:49:32 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626466"} : dispatch
Feb 23 09:49:32 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626466"} : dispatch
Feb 23 09:49:32 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626466"} : dispatch
Feb 23 09:49:32 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463 is new leader, mons np0005626463,np0005626465 in quorum (ranks 0,1)
Feb 23 09:49:32 np0005626463.localdomain ceph-mon[294160]: monmap epoch 15
Feb 23 09:49:32 np0005626463.localdomain ceph-mon[294160]: fsid f1fea371-cb69-578d-a3d0-b5c472a84b46
Feb 23 09:49:32 np0005626463.localdomain ceph-mon[294160]: last_changed 2026-02-23T09:49:26.924061+0000
Feb 23 09:49:32 np0005626463.localdomain ceph-mon[294160]: created 2026-02-23T07:36:01.997603+0000
Feb 23 09:49:32 np0005626463.localdomain ceph-mon[294160]: min_mon_release 18 (reef)
Feb 23 09:49:32 np0005626463.localdomain ceph-mon[294160]: election_strategy: 1
Feb 23 09:49:32 np0005626463.localdomain ceph-mon[294160]: 0: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005626463
Feb 23 09:49:32 np0005626463.localdomain ceph-mon[294160]: 1: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005626465
Feb 23 09:49:32 np0005626463.localdomain ceph-mon[294160]: 2: [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] mon.np0005626466
Feb 23 09:49:32 np0005626463.localdomain ceph-mon[294160]: fsmap cephfs:1 {0=mds.np0005626463.qcthuc=up:active} 2 up:standby
Feb 23 09:49:32 np0005626463.localdomain ceph-mon[294160]: osdmap e84: 6 total, 6 up, 6 in
Feb 23 09:49:32 np0005626463.localdomain ceph-mon[294160]: mgrmap e32: np0005626463.wtksup(active, since 84s), standbys: np0005626466.nisqfq, np0005626465.hlpkwo, np0005626461.lrfquh
Feb 23 09:49:32 np0005626463.localdomain ceph-mon[294160]: Health check failed: 1/3 mons down, quorum np0005626463,np0005626465 (MON_DOWN)
Feb 23 09:49:32 np0005626463.localdomain ceph-mon[294160]: Health detail: HEALTH_WARN 1/3 mons down, quorum np0005626463,np0005626465
Feb 23 09:49:32 np0005626463.localdomain ceph-mon[294160]: [WRN] MON_DOWN: 1/3 mons down, quorum np0005626463,np0005626465
Feb 23 09:49:32 np0005626463.localdomain ceph-mon[294160]:     mon.np0005626466 (rank 2) addr [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] is down (out of quorum)
Feb 23 09:49:32 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' 
Feb 23 09:49:32 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' 
Feb 23 09:49:32 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch
Feb 23 09:49:32 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:49:32 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.
Feb 23 09:49:32 np0005626463.localdomain systemd[1]: tmp-crun.nogxYS.mount: Deactivated successfully.
Feb 23 09:49:32 np0005626463.localdomain podman[303722]: 2026-02-23 09:49:32.907057481 +0000 UTC m=+0.078981044 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_build_tag=8419493e1fd846703d277695e03fc5eb)
Feb 23 09:49:32 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain.devices.0}] v 0)
Feb 23 09:49:32 np0005626463.localdomain ceph-mgr[288036]: mgr.server handle_open ignoring open from mon.np0005626466 172.18.0.108:0/53979952; not ready for session (expect reconnect)
Feb 23 09:49:32 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' 
Feb 23 09:49:32 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mon metadata", "id": "np0005626466"} v 0)
Feb 23 09:49:32 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626466"} : dispatch
Feb 23 09:49:32 np0005626463.localdomain ceph-mgr[288036]: mgr finish mon failed to return metadata for mon.np0005626466: (22) Invalid argument
Feb 23 09:49:32 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain}] v 0)
Feb 23 09:49:32 np0005626463.localdomain podman[303722]: 2026-02-23 09:49:32.940347384 +0000 UTC m=+0.112271017 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Feb 23 09:49:32 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' 
Feb 23 09:49:32 np0005626463.localdomain ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring mds.mds.np0005626466.vaywlp (monmap changed)...
Feb 23 09:49:32 np0005626463.localdomain ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring mds.mds.np0005626466.vaywlp (monmap changed)...
Feb 23 09:49:32 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mds.mds.np0005626466.vaywlp", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} v 0)
Feb 23 09:49:32 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005626466.vaywlp", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 23 09:49:32 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 23 09:49:32 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:49:32 np0005626463.localdomain ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring daemon mds.mds.np0005626466.vaywlp on np0005626466.localdomain
Feb 23 09:49:32 np0005626463.localdomain ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring daemon mds.mds.np0005626466.vaywlp on np0005626466.localdomain
Feb 23 09:49:32 np0005626463.localdomain systemd[1]: 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully.
Feb 23 09:49:33 np0005626463.localdomain ceph-mon[294160]: pgmap v45: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:49:33 np0005626463.localdomain ceph-mon[294160]: Reconfiguring osd.4 (monmap changed)...
Feb 23 09:49:33 np0005626463.localdomain ceph-mon[294160]: Reconfiguring daemon osd.4 on np0005626466.localdomain
Feb 23 09:49:33 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' 
Feb 23 09:49:33 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626466"} : dispatch
Feb 23 09:49:33 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' 
Feb 23 09:49:33 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005626466.vaywlp", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 23 09:49:33 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:49:33 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain.devices.0}] v 0)
Feb 23 09:49:33 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' 
Feb 23 09:49:33 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain}] v 0)
Feb 23 09:49:33 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' 
Feb 23 09:49:33 np0005626463.localdomain ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring mgr.np0005626466.nisqfq (monmap changed)...
Feb 23 09:49:33 np0005626463.localdomain ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring mgr.np0005626466.nisqfq (monmap changed)...
Feb 23 09:49:33 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mgr.np0005626466.nisqfq", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} v 0)
Feb 23 09:49:33 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626466.nisqfq", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 23 09:49:33 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mgr services"} v 0)
Feb 23 09:49:33 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mgr services"} : dispatch
Feb 23 09:49:33 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 23 09:49:33 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:49:33 np0005626463.localdomain ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring daemon mgr.np0005626466.nisqfq on np0005626466.localdomain
Feb 23 09:49:33 np0005626463.localdomain ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring daemon mgr.np0005626466.nisqfq on np0005626466.localdomain
Feb 23 09:49:33 np0005626463.localdomain ceph-mgr[288036]: log_channel(cluster) log [DBG] : pgmap v46: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:49:33 np0005626463.localdomain ceph-mgr[288036]: mgr.server handle_open ignoring open from mon.np0005626466 172.18.0.108:0/53979952; not ready for session (expect reconnect)
Feb 23 09:49:33 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mon metadata", "id": "np0005626466"} v 0)
Feb 23 09:49:33 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626466"} : dispatch
Feb 23 09:49:33 np0005626463.localdomain ceph-mgr[288036]: mgr finish mon failed to return metadata for mon.np0005626466: (22) Invalid argument
Feb 23 09:49:33 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [INF] : mon.np0005626463 calling monitor election
Feb 23 09:49:33 np0005626463.localdomain ceph-mon[294160]: paxos.0).electionLogic(60) init, last seen epoch 60
Feb 23 09:49:33 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(electing) e15 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Feb 23 09:49:33 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [INF] : mon.np0005626463 is new leader, mons np0005626463,np0005626465,np0005626466 in quorum (ranks 0,1,2)
Feb 23 09:49:33 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : monmap epoch 15
Feb 23 09:49:33 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : fsid f1fea371-cb69-578d-a3d0-b5c472a84b46
Feb 23 09:49:33 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : last_changed 2026-02-23T09:49:26.924061+0000
Feb 23 09:49:33 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : created 2026-02-23T07:36:01.997603+0000
Feb 23 09:49:33 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : min_mon_release 18 (reef)
Feb 23 09:49:33 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : election_strategy: 1
Feb 23 09:49:33 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : 0: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005626463
Feb 23 09:49:33 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : 1: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005626465
Feb 23 09:49:33 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : 2: [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] mon.np0005626466
Feb 23 09:49:33 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Feb 23 09:49:33 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : fsmap cephfs:1 {0=mds.np0005626463.qcthuc=up:active} 2 up:standby
Feb 23 09:49:33 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e84: 6 total, 6 up, 6 in
Feb 23 09:49:33 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : mgrmap e32: np0005626463.wtksup(active, since 86s), standbys: np0005626466.nisqfq, np0005626465.hlpkwo, np0005626461.lrfquh
Feb 23 09:49:33 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [INF] : Health check cleared: MON_DOWN (was: 1/3 mons down, quorum np0005626463,np0005626465)
Feb 23 09:49:33 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [INF] : Cluster is now healthy
Feb 23 09:49:34 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [INF] : overall HEALTH_OK
Feb 23 09:49:34 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain.devices.0}] v 0)
Feb 23 09:49:34 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' 
Feb 23 09:49:34 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain}] v 0)
Feb 23 09:49:34 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' 
Feb 23 09:49:34 np0005626463.localdomain sudo[303740]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 09:49:34 np0005626463.localdomain sudo[303740]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:49:34 np0005626463.localdomain sudo[303740]: pam_unix(sudo:session): session closed for user root
Feb 23 09:49:34 np0005626463.localdomain ceph-mon[294160]: mon.np0005626466 calling monitor election
Feb 23 09:49:34 np0005626463.localdomain ceph-mon[294160]: Reconfiguring mgr.np0005626466.nisqfq (monmap changed)...
Feb 23 09:49:34 np0005626463.localdomain ceph-mon[294160]: Reconfiguring daemon mgr.np0005626466.nisqfq on np0005626466.localdomain
Feb 23 09:49:34 np0005626463.localdomain ceph-mon[294160]: pgmap v46: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:49:34 np0005626463.localdomain ceph-mon[294160]: mon.np0005626466 calling monitor election
Feb 23 09:49:34 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463 calling monitor election
Feb 23 09:49:34 np0005626463.localdomain ceph-mon[294160]: mon.np0005626465 calling monitor election
Feb 23 09:49:34 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463 is new leader, mons np0005626463,np0005626465,np0005626466 in quorum (ranks 0,1,2)
Feb 23 09:49:34 np0005626463.localdomain ceph-mon[294160]: monmap epoch 15
Feb 23 09:49:34 np0005626463.localdomain ceph-mon[294160]: fsid f1fea371-cb69-578d-a3d0-b5c472a84b46
Feb 23 09:49:34 np0005626463.localdomain ceph-mon[294160]: last_changed 2026-02-23T09:49:26.924061+0000
Feb 23 09:49:34 np0005626463.localdomain ceph-mon[294160]: created 2026-02-23T07:36:01.997603+0000
Feb 23 09:49:34 np0005626463.localdomain ceph-mon[294160]: min_mon_release 18 (reef)
Feb 23 09:49:34 np0005626463.localdomain ceph-mon[294160]: election_strategy: 1
Feb 23 09:49:34 np0005626463.localdomain ceph-mon[294160]: 0: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005626463
Feb 23 09:49:34 np0005626463.localdomain ceph-mon[294160]: 1: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005626465
Feb 23 09:49:34 np0005626463.localdomain ceph-mon[294160]: 2: [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] mon.np0005626466
Feb 23 09:49:34 np0005626463.localdomain ceph-mon[294160]: fsmap cephfs:1 {0=mds.np0005626463.qcthuc=up:active} 2 up:standby
Feb 23 09:49:34 np0005626463.localdomain ceph-mon[294160]: osdmap e84: 6 total, 6 up, 6 in
Feb 23 09:49:34 np0005626463.localdomain ceph-mon[294160]: mgrmap e32: np0005626463.wtksup(active, since 86s), standbys: np0005626466.nisqfq, np0005626465.hlpkwo, np0005626461.lrfquh
Feb 23 09:49:34 np0005626463.localdomain ceph-mon[294160]: Health check cleared: MON_DOWN (was: 1/3 mons down, quorum np0005626463,np0005626465)
Feb 23 09:49:34 np0005626463.localdomain ceph-mon[294160]: Cluster is now healthy
Feb 23 09:49:34 np0005626463.localdomain ceph-mon[294160]: overall HEALTH_OK
Feb 23 09:49:34 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' 
Feb 23 09:49:34 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' 
Feb 23 09:49:34 np0005626463.localdomain sudo[303758]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 23 09:49:34 np0005626463.localdomain sudo[303758]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:49:34 np0005626463.localdomain ceph-mgr[288036]: mgr.server handle_open ignoring open from mon.np0005626466 172.18.0.108:0/53979952; not ready for session (expect reconnect)
Feb 23 09:49:34 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mon metadata", "id": "np0005626466"} v 0)
Feb 23 09:49:34 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626466"} : dispatch
Feb 23 09:49:35 np0005626463.localdomain sudo[303758]: pam_unix(sudo:session): session closed for user root
Feb 23 09:49:35 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626466"} : dispatch
Feb 23 09:49:35 np0005626463.localdomain ceph-mgr[288036]: log_channel(cluster) log [DBG] : pgmap v47: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:49:35 np0005626463.localdomain ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mgr-np0005626463-wtksup[288032]: 2026-02-23T09:49:35.924+0000 7f44258e4640 -1 mgr.server handle_report got status from non-daemon mon.np0005626466
Feb 23 09:49:35 np0005626463.localdomain ceph-mgr[288036]: mgr.server handle_report got status from non-daemon mon.np0005626466
Feb 23 09:49:36 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e84 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:49:36 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:49:36.215 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:49:36 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:49:36.216 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:49:36 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:49:36.217 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 09:49:36 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:49:36.217 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:49:36 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:49:36.244 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:49:36 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:49:36.245 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:49:36 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain.devices.0}] v 0)
Feb 23 09:49:36 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' 
Feb 23 09:49:36 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain}] v 0)
Feb 23 09:49:36 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' 
Feb 23 09:49:36 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 23 09:49:36 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:49:36 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0)
Feb 23 09:49:36 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 23 09:49:36 np0005626463.localdomain ceph-mgr[288036]: [cephadm INFO cephadm.serve] Updating np0005626463.localdomain:/etc/ceph/ceph.conf
Feb 23 09:49:36 np0005626463.localdomain ceph-mgr[288036]: log_channel(cephadm) log [INF] : Updating np0005626463.localdomain:/etc/ceph/ceph.conf
Feb 23 09:49:36 np0005626463.localdomain ceph-mgr[288036]: [cephadm INFO cephadm.serve] Updating np0005626465.localdomain:/etc/ceph/ceph.conf
Feb 23 09:49:36 np0005626463.localdomain ceph-mgr[288036]: [cephadm INFO cephadm.serve] Updating np0005626466.localdomain:/etc/ceph/ceph.conf
Feb 23 09:49:36 np0005626463.localdomain ceph-mgr[288036]: log_channel(cephadm) log [INF] : Updating np0005626465.localdomain:/etc/ceph/ceph.conf
Feb 23 09:49:36 np0005626463.localdomain ceph-mgr[288036]: log_channel(cephadm) log [INF] : Updating np0005626466.localdomain:/etc/ceph/ceph.conf
Feb 23 09:49:36 np0005626463.localdomain sudo[303807]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Feb 23 09:49:36 np0005626463.localdomain sudo[303807]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:49:36 np0005626463.localdomain sudo[303807]: pam_unix(sudo:session): session closed for user root
Feb 23 09:49:36 np0005626463.localdomain sudo[303825]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/etc/ceph
Feb 23 09:49:36 np0005626463.localdomain sudo[303825]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:49:36 np0005626463.localdomain sudo[303825]: pam_unix(sudo:session): session closed for user root
Feb 23 09:49:36 np0005626463.localdomain sudo[303843]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/etc/ceph/ceph.conf.new
Feb 23 09:49:36 np0005626463.localdomain sudo[303843]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:49:36 np0005626463.localdomain sudo[303843]: pam_unix(sudo:session): session closed for user root
Feb 23 09:49:36 np0005626463.localdomain sudo[303861]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46
Feb 23 09:49:36 np0005626463.localdomain sudo[303861]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:49:36 np0005626463.localdomain sudo[303861]: pam_unix(sudo:session): session closed for user root
Feb 23 09:49:36 np0005626463.localdomain sudo[303879]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/etc/ceph/ceph.conf.new
Feb 23 09:49:36 np0005626463.localdomain sudo[303879]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:49:36 np0005626463.localdomain sudo[303879]: pam_unix(sudo:session): session closed for user root
Feb 23 09:49:36 np0005626463.localdomain ceph-mon[294160]: pgmap v47: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:49:36 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' 
Feb 23 09:49:36 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' 
Feb 23 09:49:36 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:49:36 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 23 09:49:36 np0005626463.localdomain sudo[303913]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/etc/ceph/ceph.conf.new
Feb 23 09:49:36 np0005626463.localdomain sudo[303913]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:49:36 np0005626463.localdomain sudo[303913]: pam_unix(sudo:session): session closed for user root
Feb 23 09:49:36 np0005626463.localdomain sudo[303931]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/etc/ceph/ceph.conf.new
Feb 23 09:49:36 np0005626463.localdomain sudo[303931]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:49:36 np0005626463.localdomain sudo[303931]: pam_unix(sudo:session): session closed for user root
Feb 23 09:49:37 np0005626463.localdomain sudo[303949]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Feb 23 09:49:37 np0005626463.localdomain sudo[303949]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:49:37 np0005626463.localdomain sudo[303949]: pam_unix(sudo:session): session closed for user root
Feb 23 09:49:37 np0005626463.localdomain ceph-mgr[288036]: [cephadm INFO cephadm.serve] Updating np0005626463.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf
Feb 23 09:49:37 np0005626463.localdomain ceph-mgr[288036]: log_channel(cephadm) log [INF] : Updating np0005626463.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf
Feb 23 09:49:37 np0005626463.localdomain ceph-mgr[288036]: [cephadm INFO cephadm.serve] Updating np0005626466.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf
Feb 23 09:49:37 np0005626463.localdomain ceph-mgr[288036]: log_channel(cephadm) log [INF] : Updating np0005626466.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf
Feb 23 09:49:37 np0005626463.localdomain sudo[303967]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config
Feb 23 09:49:37 np0005626463.localdomain sudo[303967]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:49:37 np0005626463.localdomain sudo[303967]: pam_unix(sudo:session): session closed for user root
Feb 23 09:49:37 np0005626463.localdomain ceph-mgr[288036]: [cephadm INFO cephadm.serve] Updating np0005626465.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf
Feb 23 09:49:37 np0005626463.localdomain ceph-mgr[288036]: log_channel(cephadm) log [INF] : Updating np0005626465.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf
Feb 23 09:49:37 np0005626463.localdomain sudo[303985]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config
Feb 23 09:49:37 np0005626463.localdomain sudo[303985]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:49:37 np0005626463.localdomain sudo[303985]: pam_unix(sudo:session): session closed for user root
Feb 23 09:49:37 np0005626463.localdomain sudo[304003]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf.new
Feb 23 09:49:37 np0005626463.localdomain sudo[304003]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:49:37 np0005626463.localdomain sudo[304003]: pam_unix(sudo:session): session closed for user root
Feb 23 09:49:37 np0005626463.localdomain sudo[304021]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46
Feb 23 09:49:37 np0005626463.localdomain sudo[304021]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:49:37 np0005626463.localdomain sudo[304021]: pam_unix(sudo:session): session closed for user root
Feb 23 09:49:37 np0005626463.localdomain sudo[304039]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf.new
Feb 23 09:49:37 np0005626463.localdomain sudo[304039]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:49:37 np0005626463.localdomain sudo[304039]: pam_unix(sudo:session): session closed for user root
Feb 23 09:49:37 np0005626463.localdomain sudo[304073]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf.new
Feb 23 09:49:37 np0005626463.localdomain sudo[304073]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:49:37 np0005626463.localdomain sudo[304073]: pam_unix(sudo:session): session closed for user root
Feb 23 09:49:37 np0005626463.localdomain sudo[304091]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf.new
Feb 23 09:49:37 np0005626463.localdomain sudo[304091]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:49:37 np0005626463.localdomain sudo[304091]: pam_unix(sudo:session): session closed for user root
Feb 23 09:49:37 np0005626463.localdomain sudo[304109]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf.new /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf
Feb 23 09:49:37 np0005626463.localdomain sudo[304109]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:49:37 np0005626463.localdomain sudo[304109]: pam_unix(sudo:session): session closed for user root
Feb 23 09:49:37 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626463.localdomain.devices.0}] v 0)
Feb 23 09:49:37 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' 
Feb 23 09:49:37 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626463.localdomain}] v 0)
Feb 23 09:49:37 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' 
Feb 23 09:49:37 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain.devices.0}] v 0)
Feb 23 09:49:37 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626465.localdomain.devices.0}] v 0)
Feb 23 09:49:37 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' 
Feb 23 09:49:37 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain}] v 0)
Feb 23 09:49:37 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' 
Feb 23 09:49:37 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626465.localdomain}] v 0)
Feb 23 09:49:37 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' 
Feb 23 09:49:37 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' 
Feb 23 09:49:37 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Feb 23 09:49:37 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' 
Feb 23 09:49:37 np0005626463.localdomain ceph-mgr[288036]: [progress INFO root] update: starting ev 26d9902b-d853-4da6-8c6d-f48cb0605daa (Updating node-proxy deployment (+3 -> 3))
Feb 23 09:49:37 np0005626463.localdomain ceph-mgr[288036]: [progress INFO root] complete: finished ev 26d9902b-d853-4da6-8c6d-f48cb0605daa (Updating node-proxy deployment (+3 -> 3))
Feb 23 09:49:37 np0005626463.localdomain ceph-mgr[288036]: [progress INFO root] Completed event 26d9902b-d853-4da6-8c6d-f48cb0605daa (Updating node-proxy deployment (+3 -> 3)) in 0 seconds
Feb 23 09:49:37 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0)
Feb 23 09:49:37 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 23 09:49:37 np0005626463.localdomain ceph-mon[294160]: Updating np0005626463.localdomain:/etc/ceph/ceph.conf
Feb 23 09:49:37 np0005626463.localdomain ceph-mon[294160]: Updating np0005626465.localdomain:/etc/ceph/ceph.conf
Feb 23 09:49:37 np0005626463.localdomain ceph-mon[294160]: Updating np0005626466.localdomain:/etc/ceph/ceph.conf
Feb 23 09:49:37 np0005626463.localdomain ceph-mon[294160]: Updating np0005626463.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf
Feb 23 09:49:37 np0005626463.localdomain ceph-mon[294160]: Updating np0005626466.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf
Feb 23 09:49:37 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' 
Feb 23 09:49:37 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' 
Feb 23 09:49:37 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' 
Feb 23 09:49:37 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' 
Feb 23 09:49:37 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' 
Feb 23 09:49:37 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' 
Feb 23 09:49:37 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' 
Feb 23 09:49:37 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 23 09:49:37 np0005626463.localdomain ceph-mgr[288036]: log_channel(cluster) log [DBG] : pgmap v48: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:49:37 np0005626463.localdomain sudo[304127]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 09:49:37 np0005626463.localdomain sudo[304127]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:49:37 np0005626463.localdomain sudo[304127]: pam_unix(sudo:session): session closed for user root
Feb 23 09:49:38 np0005626463.localdomain ceph-mgr[288036]: [volumes INFO mgr_util] scanning for idle connections..
Feb 23 09:49:38 np0005626463.localdomain ceph-mgr[288036]: [volumes INFO mgr_util] cleaning up connections: []
Feb 23 09:49:38 np0005626463.localdomain ceph-mgr[288036]: [volumes INFO mgr_util] scanning for idle connections..
Feb 23 09:49:38 np0005626463.localdomain ceph-mgr[288036]: [volumes INFO mgr_util] cleaning up connections: [('cephfs', <mgr_util.CephfsConnectionPool.Connection object at 0x7f43fdf1bac0>)]
Feb 23 09:49:38 np0005626463.localdomain ceph-mgr[288036]: [volumes INFO mgr_util] disconnecting from cephfs 'cephfs'
Feb 23 09:49:38 np0005626463.localdomain ceph-mgr[288036]: [volumes INFO mgr_util] scanning for idle connections..
Feb 23 09:49:38 np0005626463.localdomain ceph-mgr[288036]: [volumes INFO mgr_util] cleaning up connections: [('cephfs', <mgr_util.CephfsConnectionPool.Connection object at 0x7f43fdf1b8b0>)]
Feb 23 09:49:38 np0005626463.localdomain ceph-mgr[288036]: [volumes INFO mgr_util] disconnecting from cephfs 'cephfs'
Feb 23 09:49:38 np0005626463.localdomain ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring crash.np0005626463 (monmap changed)...
Feb 23 09:49:38 np0005626463.localdomain ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring crash.np0005626463 (monmap changed)...
Feb 23 09:49:38 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.crash.np0005626463", "caps": ["mon", "profile crash", "mgr", "profile crash"]} v 0)
Feb 23 09:49:38 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626463", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 23 09:49:38 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 23 09:49:38 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:49:38 np0005626463.localdomain ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring daemon crash.np0005626463 on np0005626463.localdomain
Feb 23 09:49:38 np0005626463.localdomain ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring daemon crash.np0005626463 on np0005626463.localdomain
Feb 23 09:49:38 np0005626463.localdomain sudo[304145]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 09:49:38 np0005626463.localdomain sudo[304145]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:49:38 np0005626463.localdomain sudo[304145]: pam_unix(sudo:session): session closed for user root
Feb 23 09:49:38 np0005626463.localdomain sudo[304163]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid f1fea371-cb69-578d-a3d0-b5c472a84b46
Feb 23 09:49:38 np0005626463.localdomain sudo[304163]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:49:38 np0005626463.localdomain podman[304197]: 
Feb 23 09:49:38 np0005626463.localdomain podman[304197]: 2026-02-23 09:49:38.674968457 +0000 UTC m=+0.075425287 container create ae6fc89d35e4f5acef62cfadee5d0277c8724e4c84153035069e16b0f1182f1a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=crazy_turing, distribution-scope=public, io.buildah.version=1.42.2, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, name=rhceph, com.redhat.component=rhceph-container, GIT_CLEAN=True, org.opencontainers.image.created=2026-02-09T10:25:24Z, release=1770267347, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, description=Red Hat Ceph Storage 7, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.tags=rhceph ceph, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., GIT_BRANCH=main, io.openshift.expose-services=, build-date=2026-02-09T10:25:24Z)
Feb 23 09:49:38 np0005626463.localdomain systemd[1]: Started libpod-conmon-ae6fc89d35e4f5acef62cfadee5d0277c8724e4c84153035069e16b0f1182f1a.scope.
Feb 23 09:49:38 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 09:49:38 np0005626463.localdomain podman[304197]: 2026-02-23 09:49:38.644225651 +0000 UTC m=+0.044682511 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 23 09:49:38 np0005626463.localdomain podman[304197]: 2026-02-23 09:49:38.747947278 +0000 UTC m=+0.148404108 container init ae6fc89d35e4f5acef62cfadee5d0277c8724e4c84153035069e16b0f1182f1a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=crazy_turing, vendor=Red Hat, Inc., build-date=2026-02-09T10:25:24Z, distribution-scope=public, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, RELEASE=main, description=Red Hat Ceph Storage 7, architecture=x86_64, io.openshift.expose-services=, io.buildah.version=1.42.2, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, CEPH_POINT_RELEASE=, release=1770267347, com.redhat.component=rhceph-container, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7)
Feb 23 09:49:38 np0005626463.localdomain podman[304197]: 2026-02-23 09:49:38.758651584 +0000 UTC m=+0.159108414 container start ae6fc89d35e4f5acef62cfadee5d0277c8724e4c84153035069e16b0f1182f1a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=crazy_turing, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, url=https://catalog.redhat.com/en/search?searchType=containers, release=1770267347, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, build-date=2026-02-09T10:25:24Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, RELEASE=main, GIT_BRANCH=main, io.buildah.version=1.42.2, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=rhceph-container, GIT_CLEAN=True, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, architecture=x86_64, version=7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, distribution-scope=public)
Feb 23 09:49:38 np0005626463.localdomain podman[304197]: 2026-02-23 09:49:38.758910161 +0000 UTC m=+0.159367001 container attach ae6fc89d35e4f5acef62cfadee5d0277c8724e4c84153035069e16b0f1182f1a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=crazy_turing, RELEASE=main, build-date=2026-02-09T10:25:24Z, com.redhat.component=rhceph-container, vcs-type=git, org.opencontainers.image.created=2026-02-09T10:25:24Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, architecture=x86_64, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, version=7, distribution-scope=public, release=1770267347, io.buildah.version=1.42.2, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7)
Feb 23 09:49:38 np0005626463.localdomain crazy_turing[304212]: 167 167
Feb 23 09:49:38 np0005626463.localdomain systemd[1]: libpod-ae6fc89d35e4f5acef62cfadee5d0277c8724e4c84153035069e16b0f1182f1a.scope: Deactivated successfully.
Feb 23 09:49:38 np0005626463.localdomain podman[304197]: 2026-02-23 09:49:38.761695286 +0000 UTC m=+0.162152146 container died ae6fc89d35e4f5acef62cfadee5d0277c8724e4c84153035069e16b0f1182f1a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=crazy_turing, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, io.buildah.version=1.42.2, io.openshift.expose-services=, version=7, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, GIT_BRANCH=main, build-date=2026-02-09T10:25:24Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, RELEASE=main, org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1770267347, architecture=x86_64, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True)
Feb 23 09:49:38 np0005626463.localdomain podman[304217]: 2026-02-23 09:49:38.872778927 +0000 UTC m=+0.092074783 container remove ae6fc89d35e4f5acef62cfadee5d0277c8724e4c84153035069e16b0f1182f1a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=crazy_turing, org.opencontainers.image.created=2026-02-09T10:25:24Z, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, RELEASE=main, version=7, description=Red Hat Ceph Storage 7, distribution-scope=public, build-date=2026-02-09T10:25:24Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph, ceph=True, name=rhceph, io.buildah.version=1.42.2, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1770267347, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main)
Feb 23 09:49:38 np0005626463.localdomain systemd[1]: libpod-conmon-ae6fc89d35e4f5acef62cfadee5d0277c8724e4c84153035069e16b0f1182f1a.scope: Deactivated successfully.
Feb 23 09:49:38 np0005626463.localdomain ceph-mon[294160]: Updating np0005626465.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf
Feb 23 09:49:38 np0005626463.localdomain ceph-mon[294160]: pgmap v48: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:49:38 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.200:0/2088209685' entity='client.admin' cmd={"prefix": "status", "format": "json"} : dispatch
Feb 23 09:49:38 np0005626463.localdomain ceph-mon[294160]: Reconfiguring crash.np0005626463 (monmap changed)...
Feb 23 09:49:38 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626463", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 23 09:49:38 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:49:38 np0005626463.localdomain ceph-mon[294160]: Reconfiguring daemon crash.np0005626463 on np0005626463.localdomain
Feb 23 09:49:38 np0005626463.localdomain sudo[304163]: pam_unix(sudo:session): session closed for user root
Feb 23 09:49:38 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626463.localdomain.devices.0}] v 0)
Feb 23 09:49:38 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' 
Feb 23 09:49:38 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626463.localdomain}] v 0)
Feb 23 09:49:38 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' 
Feb 23 09:49:38 np0005626463.localdomain ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring osd.2 (monmap changed)...
Feb 23 09:49:38 np0005626463.localdomain ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring osd.2 (monmap changed)...
Feb 23 09:49:38 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth get", "entity": "osd.2"} v 0)
Feb 23 09:49:38 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch
Feb 23 09:49:38 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 23 09:49:38 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:49:38 np0005626463.localdomain ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.2 on np0005626463.localdomain
Feb 23 09:49:38 np0005626463.localdomain ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.2 on np0005626463.localdomain
Feb 23 09:49:39 np0005626463.localdomain sudo[304234]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 09:49:39 np0005626463.localdomain sudo[304234]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:49:39 np0005626463.localdomain sudo[304234]: pam_unix(sudo:session): session closed for user root
Feb 23 09:49:39 np0005626463.localdomain sudo[304252]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid f1fea371-cb69-578d-a3d0-b5c472a84b46
Feb 23 09:49:39 np0005626463.localdomain sudo[304252]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:49:39 np0005626463.localdomain ceph-mgr[288036]: log_channel(audit) log [DBG] : from='client.54166 -' entity='client.admin' cmd=[{"prefix": "orch", "action": "reconfig", "service_name": "osd.default_drive_group", "target": ["mon-mgr", ""]}]: dispatch
Feb 23 09:49:39 np0005626463.localdomain ceph-mgr[288036]: [cephadm INFO root] Reconfig service osd.default_drive_group
Feb 23 09:49:39 np0005626463.localdomain ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfig service osd.default_drive_group
Feb 23 09:49:39 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626463.localdomain.devices.0}] v 0)
Feb 23 09:49:39 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' 
Feb 23 09:49:39 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626463.localdomain}] v 0)
Feb 23 09:49:39 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' 
Feb 23 09:49:39 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626463.localdomain.devices.0}] v 0)
Feb 23 09:49:39 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' 
Feb 23 09:49:39 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626463.localdomain}] v 0)
Feb 23 09:49:39 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' 
Feb 23 09:49:39 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626465.localdomain.devices.0}] v 0)
Feb 23 09:49:39 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' 
Feb 23 09:49:39 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626465.localdomain}] v 0)
Feb 23 09:49:39 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' 
Feb 23 09:49:39 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626465.localdomain.devices.0}] v 0)
Feb 23 09:49:39 np0005626463.localdomain ceph-mgr[288036]: [progress INFO root] Writing back 50 completed events
Feb 23 09:49:39 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Feb 23 09:49:39 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' 
Feb 23 09:49:39 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626465.localdomain}] v 0)
Feb 23 09:49:39 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' 
Feb 23 09:49:39 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' 
Feb 23 09:49:39 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain.devices.0}] v 0)
Feb 23 09:49:39 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' 
Feb 23 09:49:39 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain}] v 0)
Feb 23 09:49:39 np0005626463.localdomain podman[242954]: time="2026-02-23T09:49:39Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 23 09:49:39 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' 
Feb 23 09:49:39 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain.devices.0}] v 0)
Feb 23 09:49:39 np0005626463.localdomain podman[242954]: @ - - [23/Feb/2026:09:49:39 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155258 "" "Go-http-client/1.1"
Feb 23 09:49:39 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' 
Feb 23 09:49:39 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain}] v 0)
Feb 23 09:49:39 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' 
Feb 23 09:49:39 np0005626463.localdomain podman[242954]: @ - - [23/Feb/2026:09:49:39 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18278 "" "Go-http-client/1.1"
Feb 23 09:49:39 np0005626463.localdomain podman[304288]: 
Feb 23 09:49:39 np0005626463.localdomain podman[304288]: 2026-02-23 09:49:39.61347653 +0000 UTC m=+0.077772469 container create a772a85bb2b71d6bffa76dd8deb6aada4bbe0adadb6fcc921ff7f88ea7a830f8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=recursing_swartz, distribution-scope=public, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, CEPH_POINT_RELEASE=, name=rhceph, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, architecture=x86_64, GIT_CLEAN=True, org.opencontainers.image.created=2026-02-09T10:25:24Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, release=1770267347, io.buildah.version=1.42.2, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, version=7, ceph=True, build-date=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14)
Feb 23 09:49:39 np0005626463.localdomain systemd[1]: Started libpod-conmon-a772a85bb2b71d6bffa76dd8deb6aada4bbe0adadb6fcc921ff7f88ea7a830f8.scope.
Feb 23 09:49:39 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 09:49:39 np0005626463.localdomain podman[304288]: 2026-02-23 09:49:39.58097838 +0000 UTC m=+0.045274369 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 23 09:49:39 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-a06b2b25667b3e704ebf9634303fe6bc5679ded988ffc68f93a235fc21261233-merged.mount: Deactivated successfully.
Feb 23 09:49:39 np0005626463.localdomain podman[304288]: 2026-02-23 09:49:39.682435199 +0000 UTC m=+0.146731138 container init a772a85bb2b71d6bffa76dd8deb6aada4bbe0adadb6fcc921ff7f88ea7a830f8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=recursing_swartz, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.expose-services=, GIT_BRANCH=main, CEPH_POINT_RELEASE=, org.opencontainers.image.created=2026-02-09T10:25:24Z, RELEASE=main, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, build-date=2026-02-09T10:25:24Z, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.42.2, name=rhceph, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, release=1770267347, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_CLEAN=True, com.redhat.component=rhceph-container, version=7, architecture=x86_64)
Feb 23 09:49:39 np0005626463.localdomain podman[304288]: 2026-02-23 09:49:39.691154544 +0000 UTC m=+0.155450483 container start a772a85bb2b71d6bffa76dd8deb6aada4bbe0adadb6fcc921ff7f88ea7a830f8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=recursing_swartz, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.created=2026-02-09T10:25:24Z, RELEASE=main, io.openshift.expose-services=, ceph=True, architecture=x86_64, CEPH_POINT_RELEASE=, io.buildah.version=1.42.2, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, distribution-scope=public, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, release=1770267347, com.redhat.component=rhceph-container, build-date=2026-02-09T10:25:24Z, version=7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14)
Feb 23 09:49:39 np0005626463.localdomain recursing_swartz[304303]: 167 167
Feb 23 09:49:39 np0005626463.localdomain podman[304288]: 2026-02-23 09:49:39.691587997 +0000 UTC m=+0.155884016 container attach a772a85bb2b71d6bffa76dd8deb6aada4bbe0adadb6fcc921ff7f88ea7a830f8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=recursing_swartz, vendor=Red Hat, Inc., GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, version=7, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, CEPH_POINT_RELEASE=, ceph=True, release=1770267347, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.42.2, org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, architecture=x86_64, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2026-02-09T10:25:24Z, vcs-type=git)
Feb 23 09:49:39 np0005626463.localdomain systemd[1]: libpod-a772a85bb2b71d6bffa76dd8deb6aada4bbe0adadb6fcc921ff7f88ea7a830f8.scope: Deactivated successfully.
Feb 23 09:49:39 np0005626463.localdomain podman[304288]: 2026-02-23 09:49:39.695672741 +0000 UTC m=+0.159968720 container died a772a85bb2b71d6bffa76dd8deb6aada4bbe0adadb6fcc921ff7f88ea7a830f8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=recursing_swartz, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.42.2, url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-type=git, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_CLEAN=True, com.redhat.component=rhceph-container, build-date=2026-02-09T10:25:24Z, release=1770267347, ceph=True, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.created=2026-02-09T10:25:24Z, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, architecture=x86_64, version=7, io.k8s.description=Red Hat Ceph Storage 7)
Feb 23 09:49:39 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-f30df2981ef37d9c237946931441128736a9ce4c64e09201c00980f67e92e2c1-merged.mount: Deactivated successfully.
Feb 23 09:49:39 np0005626463.localdomain podman[304308]: 2026-02-23 09:49:39.792491608 +0000 UTC m=+0.088221266 container remove a772a85bb2b71d6bffa76dd8deb6aada4bbe0adadb6fcc921ff7f88ea7a830f8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=recursing_swartz, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.created=2026-02-09T10:25:24Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, vendor=Red Hat, Inc., build-date=2026-02-09T10:25:24Z, distribution-scope=public, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, RELEASE=main, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1770267347, name=rhceph, io.buildah.version=1.42.2, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, description=Red Hat Ceph Storage 7, vcs-type=git, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph)
Feb 23 09:49:39 np0005626463.localdomain systemd[1]: libpod-conmon-a772a85bb2b71d6bffa76dd8deb6aada4bbe0adadb6fcc921ff7f88ea7a830f8.scope: Deactivated successfully.
Feb 23 09:49:39 np0005626463.localdomain ceph-mgr[288036]: log_channel(cluster) log [DBG] : pgmap v49: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail; 255 B/s wr, 0 op/s
Feb 23 09:49:39 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' 
Feb 23 09:49:39 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' 
Feb 23 09:49:39 np0005626463.localdomain ceph-mon[294160]: Reconfiguring osd.2 (monmap changed)...
Feb 23 09:49:39 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch
Feb 23 09:49:39 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:49:39 np0005626463.localdomain ceph-mon[294160]: Reconfiguring daemon osd.2 on np0005626463.localdomain
Feb 23 09:49:39 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' 
Feb 23 09:49:39 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' 
Feb 23 09:49:39 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' 
Feb 23 09:49:39 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' 
Feb 23 09:49:39 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' 
Feb 23 09:49:39 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' 
Feb 23 09:49:39 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' 
Feb 23 09:49:39 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' 
Feb 23 09:49:39 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' 
Feb 23 09:49:39 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' 
Feb 23 09:49:39 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' 
Feb 23 09:49:39 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' 
Feb 23 09:49:39 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' 
Feb 23 09:49:40 np0005626463.localdomain sudo[304252]: pam_unix(sudo:session): session closed for user root
Feb 23 09:49:40 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626463.localdomain.devices.0}] v 0)
Feb 23 09:49:40 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' 
Feb 23 09:49:40 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626463.localdomain}] v 0)
Feb 23 09:49:40 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #31. Immutable memtables: 0.
Feb 23 09:49:40 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:49:40.034991) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Feb 23 09:49:40 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/flush_job.cc:856] [default] [JOB 15] Flushing memtable with next log file: 31
Feb 23 09:49:40 np0005626463.localdomain ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840180035055, "job": 15, "event": "flush_started", "num_memtables": 1, "num_entries": 2166, "num_deletes": 252, "total_data_size": 3505561, "memory_usage": 3558216, "flush_reason": "Manual Compaction"}
Feb 23 09:49:40 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/flush_job.cc:885] [default] [JOB 15] Level-0 flush table #32: started
Feb 23 09:49:40 np0005626463.localdomain ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840180046154, "cf_name": "default", "job": 15, "event": "table_file_creation", "file_number": 32, "file_size": 2398594, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 17913, "largest_seqno": 20074, "table_properties": {"data_size": 2389183, "index_size": 5596, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2757, "raw_key_size": 24651, "raw_average_key_size": 22, "raw_value_size": 2368421, "raw_average_value_size": 2158, "num_data_blocks": 248, "num_entries": 1097, "num_filter_entries": 1097, "num_deletions": 252, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771840126, "oldest_key_time": 1771840126, "file_creation_time": 1771840180, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4cfd6c8f-aafa-4003-b2f6-d22c49635dd4", "db_session_id": "66DAQ76CBLV8DSGL8JC7", "orig_file_number": 32, "seqno_to_time_mapping": "N/A"}}
Feb 23 09:49:40 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 15] Flush lasted 11207 microseconds, and 5889 cpu microseconds.
Feb 23 09:49:40 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 23 09:49:40 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:49:40.046203) [db/flush_job.cc:967] [default] [JOB 15] Level-0 flush table #32: 2398594 bytes OK
Feb 23 09:49:40 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:49:40.046226) [db/memtable_list.cc:519] [default] Level-0 commit table #32 started
Feb 23 09:49:40 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:49:40.048570) [db/memtable_list.cc:722] [default] Level-0 commit table #32: memtable #1 done
Feb 23 09:49:40 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:49:40.048607) EVENT_LOG_v1 {"time_micros": 1771840180048599, "job": 15, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Feb 23 09:49:40 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:49:40.048628) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Feb 23 09:49:40 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 15] Try to delete WAL files size 3495127, prev total WAL file size 3511606, number of live WAL files 2.
Feb 23 09:49:40 np0005626463.localdomain ceph-mon[294160]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005626463/store.db/000028.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 23 09:49:40 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:49:40.049663) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003131303434' seq:72057594037927935, type:22 .. '7061786F73003131323936' seq:0, type:0; will stop at (end)
Feb 23 09:49:40 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 16] Compacting 1@0 + 1@6 files to L6, score -1.00
Feb 23 09:49:40 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 15 Base level 0, inputs: [32(2342KB)], [30(17MB)]
Feb 23 09:49:40 np0005626463.localdomain ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840180049723, "job": 16, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [32], "files_L6": [30], "score": -1, "input_data_size": 20316694, "oldest_snapshot_seqno": -1}
Feb 23 09:49:40 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' 
Feb 23 09:49:40 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626463.localdomain.devices.0}] v 0)
Feb 23 09:49:40 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' 
Feb 23 09:49:40 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626463.localdomain}] v 0)
Feb 23 09:49:40 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 16] Generated table #33: 11639 keys, 16330924 bytes, temperature: kUnknown
Feb 23 09:49:40 np0005626463.localdomain ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840180122652, "cf_name": "default", "job": 16, "event": "table_file_creation", "file_number": 33, "file_size": 16330924, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 16262696, "index_size": 38047, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 29125, "raw_key_size": 312649, "raw_average_key_size": 26, "raw_value_size": 16062192, "raw_average_value_size": 1380, "num_data_blocks": 1452, "num_entries": 11639, "num_filter_entries": 11639, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771839971, "oldest_key_time": 0, "file_creation_time": 1771840180, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4cfd6c8f-aafa-4003-b2f6-d22c49635dd4", "db_session_id": "66DAQ76CBLV8DSGL8JC7", "orig_file_number": 33, "seqno_to_time_mapping": "N/A"}}
Feb 23 09:49:40 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 23 09:49:40 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:49:40.122901) [db/compaction/compaction_job.cc:1663] [default] [JOB 16] Compacted 1@0 + 1@6 files to L6 => 16330924 bytes
Feb 23 09:49:40 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:49:40.124449) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 278.4 rd, 223.8 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.3, 17.1 +0.0 blob) out(15.6 +0.0 blob), read-write-amplify(15.3) write-amplify(6.8) OK, records in: 12175, records dropped: 536 output_compression: NoCompression
Feb 23 09:49:40 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:49:40.124467) EVENT_LOG_v1 {"time_micros": 1771840180124458, "job": 16, "event": "compaction_finished", "compaction_time_micros": 72982, "compaction_time_cpu_micros": 19274, "output_level": 6, "num_output_files": 1, "total_output_size": 16330924, "num_input_records": 12175, "num_output_records": 11639, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Feb 23 09:49:40 np0005626463.localdomain ceph-mon[294160]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005626463/store.db/000032.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 23 09:49:40 np0005626463.localdomain ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840180124772, "job": 16, "event": "table_file_deletion", "file_number": 32}
Feb 23 09:49:40 np0005626463.localdomain ceph-mon[294160]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005626463/store.db/000030.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 23 09:49:40 np0005626463.localdomain ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840180126550, "job": 16, "event": "table_file_deletion", "file_number": 30}
Feb 23 09:49:40 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:49:40.049568) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 23 09:49:40 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:49:40.126776) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 23 09:49:40 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:49:40.126784) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 23 09:49:40 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:49:40.126787) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 23 09:49:40 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:49:40.126791) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 23 09:49:40 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:49:40.127023) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 23 09:49:40 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' 
Feb 23 09:49:40 np0005626463.localdomain ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring osd.5 (monmap changed)...
Feb 23 09:49:40 np0005626463.localdomain ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring osd.5 (monmap changed)...
Feb 23 09:49:40 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth get", "entity": "osd.5"} v 0)
Feb 23 09:49:40 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch
Feb 23 09:49:40 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 23 09:49:40 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:49:40 np0005626463.localdomain ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.5 on np0005626463.localdomain
Feb 23 09:49:40 np0005626463.localdomain ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.5 on np0005626463.localdomain
Feb 23 09:49:40 np0005626463.localdomain sudo[304330]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 09:49:40 np0005626463.localdomain sudo[304330]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:49:40 np0005626463.localdomain sudo[304330]: pam_unix(sudo:session): session closed for user root
Feb 23 09:49:40 np0005626463.localdomain sudo[304348]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid f1fea371-cb69-578d-a3d0-b5c472a84b46
Feb 23 09:49:40 np0005626463.localdomain sudo[304348]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:49:40 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : mgrmap e33: np0005626463.wtksup(active, since 92s), standbys: np0005626466.nisqfq, np0005626465.hlpkwo, np0005626461.lrfquh
Feb 23 09:49:40 np0005626463.localdomain podman[304383]: 
Feb 23 09:49:40 np0005626463.localdomain podman[304383]: 2026-02-23 09:49:40.793602527 +0000 UTC m=+0.074470347 container create dbfdcfcd6065581a94de3f3d6c99eaf1e43230c85521f876d5c024b3b621e3e4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vigorous_lederberg, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2026-02-09T10:25:24Z, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, CEPH_POINT_RELEASE=, vcs-type=git, architecture=x86_64, release=1770267347, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, vendor=Red Hat, Inc., ceph=True, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.buildah.version=1.42.2, com.redhat.component=rhceph-container, GIT_BRANCH=main, name=rhceph, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, io.openshift.tags=rhceph ceph)
Feb 23 09:49:40 np0005626463.localdomain systemd[1]: Started libpod-conmon-dbfdcfcd6065581a94de3f3d6c99eaf1e43230c85521f876d5c024b3b621e3e4.scope.
Feb 23 09:49:40 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 09:49:40 np0005626463.localdomain podman[304383]: 2026-02-23 09:49:40.856257334 +0000 UTC m=+0.137125144 container init dbfdcfcd6065581a94de3f3d6c99eaf1e43230c85521f876d5c024b3b621e3e4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vigorous_lederberg, vcs-type=git, build-date=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, RELEASE=main, io.buildah.version=1.42.2, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, CEPH_POINT_RELEASE=, name=rhceph, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_BRANCH=main, GIT_CLEAN=True, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1770267347, vendor=Red Hat, Inc., io.openshift.expose-services=, version=7, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7)
Feb 23 09:49:40 np0005626463.localdomain podman[304383]: 2026-02-23 09:49:40.762081557 +0000 UTC m=+0.042949377 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 23 09:49:40 np0005626463.localdomain vigorous_lederberg[304398]: 167 167
Feb 23 09:49:40 np0005626463.localdomain systemd[1]: libpod-dbfdcfcd6065581a94de3f3d6c99eaf1e43230c85521f876d5c024b3b621e3e4.scope: Deactivated successfully.
Feb 23 09:49:40 np0005626463.localdomain podman[304383]: 2026-02-23 09:49:40.869500036 +0000 UTC m=+0.150367846 container start dbfdcfcd6065581a94de3f3d6c99eaf1e43230c85521f876d5c024b3b621e3e4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vigorous_lederberg, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., build-date=2026-02-09T10:25:24Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, ceph=True, RELEASE=main, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.component=rhceph-container, architecture=x86_64, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, release=1770267347, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.expose-services=, name=rhceph, io.buildah.version=1.42.2, vcs-type=git, version=7, url=https://catalog.redhat.com/en/search?searchType=containers)
Feb 23 09:49:40 np0005626463.localdomain podman[304383]: 2026-02-23 09:49:40.86994933 +0000 UTC m=+0.150817180 container attach dbfdcfcd6065581a94de3f3d6c99eaf1e43230c85521f876d5c024b3b621e3e4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vigorous_lederberg, GIT_BRANCH=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vendor=Red Hat, Inc., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, build-date=2026-02-09T10:25:24Z, ceph=True, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.buildah.version=1.42.2, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, version=7, CEPH_POINT_RELEASE=, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, RELEASE=main, release=1770267347, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, vcs-type=git, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Feb 23 09:49:40 np0005626463.localdomain podman[304383]: 2026-02-23 09:49:40.872390324 +0000 UTC m=+0.153258134 container died dbfdcfcd6065581a94de3f3d6c99eaf1e43230c85521f876d5c024b3b621e3e4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vigorous_lederberg, vendor=Red Hat, Inc., RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, version=7, architecture=x86_64, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, vcs-type=git, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, release=1770267347, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, build-date=2026-02-09T10:25:24Z, io.buildah.version=1.42.2, com.redhat.component=rhceph-container, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git)
Feb 23 09:49:40 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mgr fail"} v 0)
Feb 23 09:49:40 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='client.? 172.18.0.200:0/1175127914' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch
Feb 23 09:49:40 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e84 do_prune osdmap full prune enabled
Feb 23 09:49:40 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [INF] : Activating manager daemon np0005626466.nisqfq
Feb 23 09:49:40 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e85 e85: 6 total, 6 up, 6 in
Feb 23 09:49:40 np0005626463.localdomain ceph-mgr[288036]: mgr handle_mgr_map I was active but no longer am
Feb 23 09:49:40 np0005626463.localdomain ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mgr-np0005626463-wtksup[288032]: 2026-02-23T09:49:40.952+0000 7f4481af5640 -1 mgr handle_mgr_map I was active but no longer am
Feb 23 09:49:40 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e85: 6 total, 6 up, 6 in
Feb 23 09:49:40 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='client.? 172.18.0.200:0/1175127914' entity='client.admin' cmd='[{"prefix": "mgr fail"}]': finished
Feb 23 09:49:40 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : mgrmap e34: np0005626466.nisqfq(active, starting, since 0.0428083s), standbys: np0005626465.hlpkwo, np0005626461.lrfquh
Feb 23 09:49:40 np0005626463.localdomain podman[304403]: 2026-02-23 09:49:40.968724857 +0000 UTC m=+0.091001241 container remove dbfdcfcd6065581a94de3f3d6c99eaf1e43230c85521f876d5c024b3b621e3e4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vigorous_lederberg, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_CLEAN=True, version=7, build-date=2026-02-09T10:25:24Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7, name=rhceph, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, com.redhat.component=rhceph-container, ceph=True, architecture=x86_64, io.buildah.version=1.42.2, org.opencontainers.image.created=2026-02-09T10:25:24Z, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, distribution-scope=public, release=1770267347, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc.)
Feb 23 09:49:40 np0005626463.localdomain systemd[1]: libpod-conmon-dbfdcfcd6065581a94de3f3d6c99eaf1e43230c85521f876d5c024b3b621e3e4.scope: Deactivated successfully.
Feb 23 09:49:40 np0005626463.localdomain sshd[300031]: pam_unix(sshd:session): session closed for user ceph-admin
Feb 23 09:49:40 np0005626463.localdomain systemd-logind[759]: Session 69 logged out. Waiting for processes to exit.
Feb 23 09:49:41 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [INF] : Manager daemon np0005626466.nisqfq is now available
Feb 23 09:49:41 np0005626463.localdomain ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mgr-np0005626463-wtksup[288032]: ignoring --setuser ceph since I am not root
Feb 23 09:49:41 np0005626463.localdomain ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mgr-np0005626463-wtksup[288032]: ignoring --setgroup ceph since I am not root
Feb 23 09:49:41 np0005626463.localdomain ceph-mgr[288036]: ceph version 18.2.1-381.el9cp (984f410e2a30899deb131725765b62212b1621db) reef (stable), process ceph-mgr, pid 2
Feb 23 09:49:41 np0005626463.localdomain ceph-mgr[288036]: pidfile_write: ignore empty --pid-file
Feb 23 09:49:41 np0005626463.localdomain ceph-mon[294160]: from='client.54166 -' entity='client.admin' cmd=[{"prefix": "orch", "action": "reconfig", "service_name": "osd.default_drive_group", "target": ["mon-mgr", ""]}]: dispatch
Feb 23 09:49:41 np0005626463.localdomain ceph-mon[294160]: Reconfig service osd.default_drive_group
Feb 23 09:49:41 np0005626463.localdomain ceph-mon[294160]: pgmap v49: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail; 255 B/s wr, 0 op/s
Feb 23 09:49:41 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' 
Feb 23 09:49:41 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' 
Feb 23 09:49:41 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' 
Feb 23 09:49:41 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' 
Feb 23 09:49:41 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch
Feb 23 09:49:41 np0005626463.localdomain ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:49:41 np0005626463.localdomain ceph-mon[294160]: mgrmap e33: np0005626463.wtksup(active, since 92s), standbys: np0005626466.nisqfq, np0005626465.hlpkwo, np0005626461.lrfquh
Feb 23 09:49:41 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.200:0/1175127914' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch
Feb 23 09:49:41 np0005626463.localdomain ceph-mon[294160]: Activating manager daemon np0005626466.nisqfq
Feb 23 09:49:41 np0005626463.localdomain ceph-mon[294160]: osdmap e85: 6 total, 6 up, 6 in
Feb 23 09:49:41 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.200:0/1175127914' entity='client.admin' cmd='[{"prefix": "mgr fail"}]': finished
Feb 23 09:49:41 np0005626463.localdomain ceph-mon[294160]: mgrmap e34: np0005626466.nisqfq(active, starting, since 0.0428083s), standbys: np0005626465.hlpkwo, np0005626461.lrfquh
Feb 23 09:49:41 np0005626463.localdomain ceph-mon[294160]: from='mgr.26638 172.18.0.108:0/2769928129' entity='mgr.np0005626466.nisqfq' cmd={"prefix": "mon metadata", "id": "np0005626463"} : dispatch
Feb 23 09:49:41 np0005626463.localdomain ceph-mon[294160]: from='mgr.26638 172.18.0.108:0/2769928129' entity='mgr.np0005626466.nisqfq' cmd={"prefix": "mon metadata", "id": "np0005626465"} : dispatch
Feb 23 09:49:41 np0005626463.localdomain ceph-mon[294160]: from='mgr.26638 172.18.0.108:0/2769928129' entity='mgr.np0005626466.nisqfq' cmd={"prefix": "mon metadata", "id": "np0005626466"} : dispatch
Feb 23 09:49:41 np0005626463.localdomain ceph-mon[294160]: from='mgr.26638 172.18.0.108:0/2769928129' entity='mgr.np0005626466.nisqfq' cmd={"prefix": "mds metadata", "who": "mds.np0005626465.drvnoy"} : dispatch
Feb 23 09:49:41 np0005626463.localdomain ceph-mon[294160]: from='mgr.26638 172.18.0.108:0/2769928129' entity='mgr.np0005626466.nisqfq' cmd={"prefix": "mds metadata", "who": "mds.np0005626466.vaywlp"} : dispatch
Feb 23 09:49:41 np0005626463.localdomain ceph-mon[294160]: from='mgr.26638 172.18.0.108:0/2769928129' entity='mgr.np0005626466.nisqfq' cmd={"prefix": "mds metadata", "who": "mds.np0005626463.qcthuc"} : dispatch
Feb 23 09:49:41 np0005626463.localdomain ceph-mon[294160]: from='mgr.26638 172.18.0.108:0/2769928129' entity='mgr.np0005626466.nisqfq' cmd={"prefix": "mgr metadata", "who": "np0005626466.nisqfq", "id": "np0005626466.nisqfq"} : dispatch
Feb 23 09:49:41 np0005626463.localdomain ceph-mon[294160]: from='mgr.26638 172.18.0.108:0/2769928129' entity='mgr.np0005626466.nisqfq' cmd={"prefix": "mgr metadata", "who": "np0005626465.hlpkwo", "id": "np0005626465.hlpkwo"} : dispatch
Feb 23 09:49:41 np0005626463.localdomain ceph-mon[294160]: from='mgr.26638 172.18.0.108:0/2769928129' entity='mgr.np0005626466.nisqfq' cmd={"prefix": "mgr metadata", "who": "np0005626461.lrfquh", "id": "np0005626461.lrfquh"} : dispatch
Feb 23 09:49:41 np0005626463.localdomain ceph-mon[294160]: from='mgr.26638 172.18.0.108:0/2769928129' entity='mgr.np0005626466.nisqfq' cmd={"prefix": "osd metadata", "id": 0} : dispatch
Feb 23 09:49:41 np0005626463.localdomain ceph-mon[294160]: from='mgr.26638 172.18.0.108:0/2769928129' entity='mgr.np0005626466.nisqfq' cmd={"prefix": "osd metadata", "id": 1} : dispatch
Feb 23 09:49:41 np0005626463.localdomain ceph-mon[294160]: from='mgr.26638 172.18.0.108:0/2769928129' entity='mgr.np0005626466.nisqfq' cmd={"prefix": "osd metadata", "id": 2} : dispatch
Feb 23 09:49:41 np0005626463.localdomain ceph-mon[294160]: from='mgr.26638 172.18.0.108:0/2769928129' entity='mgr.np0005626466.nisqfq' cmd={"prefix": "osd metadata", "id": 3} : dispatch
Feb 23 09:49:41 np0005626463.localdomain ceph-mon[294160]: from='mgr.26638 172.18.0.108:0/2769928129' entity='mgr.np0005626466.nisqfq' cmd={"prefix": "osd metadata", "id": 4} : dispatch
Feb 23 09:49:41 np0005626463.localdomain ceph-mon[294160]: from='mgr.26638 172.18.0.108:0/2769928129' entity='mgr.np0005626466.nisqfq' cmd={"prefix": "osd metadata", "id": 5} : dispatch
Feb 23 09:49:41 np0005626463.localdomain ceph-mon[294160]: from='mgr.26638 172.18.0.108:0/2769928129' entity='mgr.np0005626466.nisqfq' cmd={"prefix": "mds metadata"} : dispatch
Feb 23 09:49:41 np0005626463.localdomain ceph-mon[294160]: from='mgr.26638 172.18.0.108:0/2769928129' entity='mgr.np0005626466.nisqfq' cmd={"prefix": "osd metadata"} : dispatch
Feb 23 09:49:41 np0005626463.localdomain ceph-mon[294160]: from='mgr.26638 172.18.0.108:0/2769928129' entity='mgr.np0005626466.nisqfq' cmd={"prefix": "mon metadata"} : dispatch
Feb 23 09:49:41 np0005626463.localdomain ceph-mon[294160]: Manager daemon np0005626466.nisqfq is now available
Feb 23 09:49:41 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix":"config-key del","key":"mgr/cephadm/host.np0005626461.localdomain.devices.0"} v 0)
Feb 23 09:49:41 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005626461.localdomain.devices.0"} : dispatch
Feb 23 09:49:41 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005626461.localdomain.devices.0"}]': finished
Feb 23 09:49:41 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix":"config-key del","key":"mgr/cephadm/host.np0005626461.localdomain.devices.0"} v 0)
Feb 23 09:49:41 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005626461.localdomain.devices.0"} : dispatch
Feb 23 09:49:41 np0005626463.localdomain ceph-mgr[288036]: mgr[py] Loading python module 'alerts'
Feb 23 09:49:41 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005626461.localdomain.devices.0"}]': finished
Feb 23 09:49:41 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e85 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:49:41 np0005626463.localdomain sudo[304348]: pam_unix(sudo:session): session closed for user root
Feb 23 09:49:41 np0005626463.localdomain systemd[1]: session-69.scope: Deactivated successfully.
Feb 23 09:49:41 np0005626463.localdomain systemd[1]: session-69.scope: Consumed 23.429s CPU time.
Feb 23 09:49:41 np0005626463.localdomain systemd-logind[759]: Removed session 69.
Feb 23 09:49:41 np0005626463.localdomain ceph-mgr[288036]: mgr[py] Module alerts has missing NOTIFY_TYPES member
Feb 23 09:49:41 np0005626463.localdomain ceph-mgr[288036]: mgr[py] Loading python module 'balancer'
Feb 23 09:49:41 np0005626463.localdomain ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mgr-np0005626463-wtksup[288032]: 2026-02-23T09:49:41.151+0000 7f6fbef71140 -1 mgr[py] Module alerts has missing NOTIFY_TYPES member
Feb 23 09:49:41 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005626466.nisqfq/mirror_snapshot_schedule"} v 0)
Feb 23 09:49:41 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005626466.nisqfq/mirror_snapshot_schedule"} : dispatch
Feb 23 09:49:41 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005626466.nisqfq/trash_purge_schedule"} v 0)
Feb 23 09:49:41 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005626466.nisqfq/trash_purge_schedule"} : dispatch
Feb 23 09:49:41 np0005626463.localdomain ceph-mgr[288036]: mgr[py] Module balancer has missing NOTIFY_TYPES member
Feb 23 09:49:41 np0005626463.localdomain ceph-mgr[288036]: mgr[py] Loading python module 'cephadm'
Feb 23 09:49:41 np0005626463.localdomain ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mgr-np0005626463-wtksup[288032]: 2026-02-23T09:49:41.226+0000 7f6fbef71140 -1 mgr[py] Module balancer has missing NOTIFY_TYPES member
Feb 23 09:49:41 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:49:41.246 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:49:41 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:49:41.248 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:49:41 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:49:41.248 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 09:49:41 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:49:41.248 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:49:41 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:49:41.285 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:49:41 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:49:41.286 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:49:41 np0005626463.localdomain sshd[304449]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:49:41 np0005626463.localdomain sshd[304449]: Accepted publickey for ceph-admin from 192.168.122.108 port 58944 ssh2: RSA SHA256:Xa/VMkXtB77nHz5d33Gpc1SPjvrShbbTtqHwAtI7vJo
Feb 23 09:49:41 np0005626463.localdomain systemd-logind[759]: New session 70 of user ceph-admin.
Feb 23 09:49:41 np0005626463.localdomain systemd[1]: Started Session 70 of User ceph-admin.
Feb 23 09:49:41 np0005626463.localdomain sshd[304449]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Feb 23 09:49:41 np0005626463.localdomain sudo[304453]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 09:49:41 np0005626463.localdomain sudo[304453]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:49:41 np0005626463.localdomain sudo[304453]: pam_unix(sudo:session): session closed for user root
Feb 23 09:49:41 np0005626463.localdomain sudo[304471]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Feb 23 09:49:41 np0005626463.localdomain sudo[304471]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:49:41 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-c4033d6ce700523fb971bc451b1396ff9a33dd68e76f08a48a0dd0cb5f2d245c-merged.mount: Deactivated successfully.
Feb 23 09:49:41 np0005626463.localdomain ceph-mgr[288036]: mgr[py] Loading python module 'crash'
Feb 23 09:49:41 np0005626463.localdomain ceph-mgr[288036]: mgr[py] Module crash has missing NOTIFY_TYPES member
Feb 23 09:49:41 np0005626463.localdomain ceph-mgr[288036]: mgr[py] Loading python module 'dashboard'
Feb 23 09:49:41 np0005626463.localdomain ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mgr-np0005626463-wtksup[288032]: 2026-02-23T09:49:41.945+0000 7f6fbef71140 -1 mgr[py] Module crash has missing NOTIFY_TYPES member
Feb 23 09:49:42 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : mgrmap e35: np0005626466.nisqfq(active, since 1.07954s), standbys: np0005626465.hlpkwo, np0005626461.lrfquh
Feb 23 09:49:42 np0005626463.localdomain ceph-mon[294160]: removing stray HostCache host record np0005626461.localdomain.devices.0
Feb 23 09:49:42 np0005626463.localdomain ceph-mon[294160]: from='mgr.26638 172.18.0.108:0/2769928129' entity='mgr.np0005626466.nisqfq' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005626461.localdomain.devices.0"} : dispatch
Feb 23 09:49:42 np0005626463.localdomain ceph-mon[294160]: from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005626461.localdomain.devices.0"} : dispatch
Feb 23 09:49:42 np0005626463.localdomain ceph-mon[294160]: from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005626461.localdomain.devices.0"}]': finished
Feb 23 09:49:42 np0005626463.localdomain ceph-mon[294160]: from='mgr.26638 172.18.0.108:0/2769928129' entity='mgr.np0005626466.nisqfq' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005626461.localdomain.devices.0"} : dispatch
Feb 23 09:49:42 np0005626463.localdomain ceph-mon[294160]: from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005626461.localdomain.devices.0"} : dispatch
Feb 23 09:49:42 np0005626463.localdomain ceph-mon[294160]: from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005626461.localdomain.devices.0"}]': finished
Feb 23 09:49:42 np0005626463.localdomain ceph-mon[294160]: from='mgr.26638 172.18.0.108:0/2769928129' entity='mgr.np0005626466.nisqfq' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005626466.nisqfq/mirror_snapshot_schedule"} : dispatch
Feb 23 09:49:42 np0005626463.localdomain ceph-mon[294160]: from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005626466.nisqfq/mirror_snapshot_schedule"} : dispatch
Feb 23 09:49:42 np0005626463.localdomain ceph-mon[294160]: from='mgr.26638 172.18.0.108:0/2769928129' entity='mgr.np0005626466.nisqfq' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005626466.nisqfq/trash_purge_schedule"} : dispatch
Feb 23 09:49:42 np0005626463.localdomain ceph-mon[294160]: from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005626466.nisqfq/trash_purge_schedule"} : dispatch
Feb 23 09:49:42 np0005626463.localdomain ceph-mon[294160]: mgrmap e35: np0005626466.nisqfq(active, since 1.07954s), standbys: np0005626465.hlpkwo, np0005626461.lrfquh
Feb 23 09:49:42 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.
Feb 23 09:49:42 np0005626463.localdomain podman[304537]: 2026-02-23 09:49:42.303917813 +0000 UTC m=+0.083886544 container health_status da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 23 09:49:42 np0005626463.localdomain podman[304537]: 2026-02-23 09:49:42.339161816 +0000 UTC m=+0.119130577 container exec_died da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 23 09:49:42 np0005626463.localdomain systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Deactivated successfully.
Feb 23 09:49:42 np0005626463.localdomain ceph-mgr[288036]: mgr[py] Loading python module 'devicehealth'
Feb 23 09:49:42 np0005626463.localdomain podman[304588]: 2026-02-23 09:49:42.528652843 +0000 UTC m=+0.091280029 container exec fdf07215f0388d0ebc44f1f3744080ba594441e647c300d0dade62ff5beba234 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-crash-np0005626463, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, CEPH_POINT_RELEASE=, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1770267347, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.42.2, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., version=7, distribution-scope=public, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, build-date=2026-02-09T10:25:24Z, name=rhceph, GIT_BRANCH=main, ceph=True, description=Red Hat Ceph Storage 7, io.openshift.expose-services=)
Feb 23 09:49:42 np0005626463.localdomain ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mgr-np0005626463-wtksup[288032]: 2026-02-23T09:49:42.538+0000 7f6fbef71140 -1 mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Feb 23 09:49:42 np0005626463.localdomain ceph-mgr[288036]: mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Feb 23 09:49:42 np0005626463.localdomain ceph-mgr[288036]: mgr[py] Loading python module 'diskprediction_local'
Feb 23 09:49:42 np0005626463.localdomain podman[304588]: 2026-02-23 09:49:42.662416674 +0000 UTC m=+0.225043860 container exec_died fdf07215f0388d0ebc44f1f3744080ba594441e647c300d0dade62ff5beba234 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-crash-np0005626463, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, ceph=True, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, version=7, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-type=git, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, io.buildah.version=1.42.2, org.opencontainers.image.created=2026-02-09T10:25:24Z, release=1770267347, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2026-02-09T10:25:24Z, RELEASE=main)
Feb 23 09:49:42 np0005626463.localdomain ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mgr-np0005626463-wtksup[288032]: /lib64/python3.9/site-packages/scipy/__init__.py:73: UserWarning: NumPy was imported from a Python sub-interpreter but NumPy does not properly support sub-interpreters. This will likely work for most users but might cause hard to track down issues or subtle bugs. A common user of the rare sub-interpreter feature is wsgi which also allows single-interpreter mode.
Feb 23 09:49:42 np0005626463.localdomain ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mgr-np0005626463-wtksup[288032]: Improvements in the case of bugs are welcome, but is not on the NumPy roadmap, and full support may require significant effort to achieve.
Feb 23 09:49:42 np0005626463.localdomain ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mgr-np0005626463-wtksup[288032]:   from numpy import show_config as show_numpy_config
Feb 23 09:49:42 np0005626463.localdomain ceph-mgr[288036]: mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Feb 23 09:49:42 np0005626463.localdomain ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mgr-np0005626463-wtksup[288032]: 2026-02-23T09:49:42.673+0000 7f6fbef71140 -1 mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Feb 23 09:49:42 np0005626463.localdomain ceph-mgr[288036]: mgr[py] Loading python module 'influx'
Feb 23 09:49:42 np0005626463.localdomain ceph-mgr[288036]: mgr[py] Module influx has missing NOTIFY_TYPES member
Feb 23 09:49:42 np0005626463.localdomain ceph-mgr[288036]: mgr[py] Loading python module 'insights'
Feb 23 09:49:42 np0005626463.localdomain ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mgr-np0005626463-wtksup[288032]: 2026-02-23T09:49:42.731+0000 7f6fbef71140 -1 mgr[py] Module influx has missing NOTIFY_TYPES member
Feb 23 09:49:42 np0005626463.localdomain ceph-mgr[288036]: mgr[py] Loading python module 'iostat'
Feb 23 09:49:42 np0005626463.localdomain systemd[1]: tmp-crun.b51AMX.mount: Deactivated successfully.
Feb 23 09:49:42 np0005626463.localdomain ceph-mgr[288036]: mgr[py] Module iostat has missing NOTIFY_TYPES member
Feb 23 09:49:42 np0005626463.localdomain ceph-mgr[288036]: mgr[py] Loading python module 'k8sevents'
Feb 23 09:49:42 np0005626463.localdomain ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mgr-np0005626463-wtksup[288032]: 2026-02-23T09:49:42.845+0000 7f6fbef71140 -1 mgr[py] Module iostat has missing NOTIFY_TYPES member
Feb 23 09:49:43 np0005626463.localdomain ceph-mon[294160]: pgmap v3: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:49:43 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : mgrmap e36: np0005626466.nisqfq(active, since 2s), standbys: np0005626465.hlpkwo, np0005626461.lrfquh
Feb 23 09:49:43 np0005626463.localdomain ceph-mgr[288036]: mgr[py] Loading python module 'localpool'
Feb 23 09:49:43 np0005626463.localdomain ceph-mgr[288036]: mgr[py] Loading python module 'mds_autoscaler'
Feb 23 09:49:43 np0005626463.localdomain sudo[304471]: pam_unix(sudo:session): session closed for user root
Feb 23 09:49:43 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626465.localdomain.devices.0}] v 0)
Feb 23 09:49:43 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' 
Feb 23 09:49:43 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626463.localdomain.devices.0}] v 0)
Feb 23 09:49:43 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' 
Feb 23 09:49:43 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626465.localdomain}] v 0)
Feb 23 09:49:43 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626463.localdomain}] v 0)
Feb 23 09:49:43 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' 
Feb 23 09:49:43 np0005626463.localdomain ceph-mgr[288036]: mgr[py] Loading python module 'mirroring'
Feb 23 09:49:43 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' 
Feb 23 09:49:43 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain.devices.0}] v 0)
Feb 23 09:49:43 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' 
Feb 23 09:49:43 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain}] v 0)
Feb 23 09:49:43 np0005626463.localdomain openstack_network_exporter[245358]: ERROR   09:49:43 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 23 09:49:43 np0005626463.localdomain openstack_network_exporter[245358]: 
Feb 23 09:49:43 np0005626463.localdomain openstack_network_exporter[245358]: ERROR   09:49:43 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 23 09:49:43 np0005626463.localdomain openstack_network_exporter[245358]: 
Feb 23 09:49:43 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' 
Feb 23 09:49:43 np0005626463.localdomain ceph-mgr[288036]: mgr[py] Loading python module 'nfs'
Feb 23 09:49:43 np0005626463.localdomain sudo[304706]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 09:49:43 np0005626463.localdomain sudo[304706]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:49:43 np0005626463.localdomain sudo[304706]: pam_unix(sudo:session): session closed for user root
Feb 23 09:49:43 np0005626463.localdomain sudo[304724]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 23 09:49:43 np0005626463.localdomain sudo[304724]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:49:43 np0005626463.localdomain ceph-mgr[288036]: mgr[py] Module nfs has missing NOTIFY_TYPES member
Feb 23 09:49:43 np0005626463.localdomain ceph-mgr[288036]: mgr[py] Loading python module 'orchestrator'
Feb 23 09:49:43 np0005626463.localdomain ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mgr-np0005626463-wtksup[288032]: 2026-02-23T09:49:43.575+0000 7f6fbef71140 -1 mgr[py] Module nfs has missing NOTIFY_TYPES member
Feb 23 09:49:43 np0005626463.localdomain ceph-mgr[288036]: mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Feb 23 09:49:43 np0005626463.localdomain ceph-mgr[288036]: mgr[py] Loading python module 'osd_perf_query'
Feb 23 09:49:43 np0005626463.localdomain ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mgr-np0005626463-wtksup[288032]: 2026-02-23T09:49:43.718+0000 7f6fbef71140 -1 mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Feb 23 09:49:43 np0005626463.localdomain ceph-mgr[288036]: mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Feb 23 09:49:43 np0005626463.localdomain ceph-mgr[288036]: mgr[py] Loading python module 'osd_support'
Feb 23 09:49:43 np0005626463.localdomain ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mgr-np0005626463-wtksup[288032]: 2026-02-23T09:49:43.781+0000 7f6fbef71140 -1 mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Feb 23 09:49:43 np0005626463.localdomain ceph-mgr[288036]: mgr[py] Module osd_support has missing NOTIFY_TYPES member
Feb 23 09:49:43 np0005626463.localdomain ceph-mgr[288036]: mgr[py] Loading python module 'pg_autoscaler'
Feb 23 09:49:43 np0005626463.localdomain ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mgr-np0005626463-wtksup[288032]: 2026-02-23T09:49:43.837+0000 7f6fbef71140 -1 mgr[py] Module osd_support has missing NOTIFY_TYPES member
Feb 23 09:49:43 np0005626463.localdomain ceph-mgr[288036]: mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Feb 23 09:49:43 np0005626463.localdomain ceph-mgr[288036]: mgr[py] Loading python module 'progress'
Feb 23 09:49:43 np0005626463.localdomain ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mgr-np0005626463-wtksup[288032]: 2026-02-23T09:49:43.903+0000 7f6fbef71140 -1 mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Feb 23 09:49:43 np0005626463.localdomain ceph-mgr[288036]: mgr[py] Module progress has missing NOTIFY_TYPES member
Feb 23 09:49:43 np0005626463.localdomain ceph-mgr[288036]: mgr[py] Loading python module 'prometheus'
Feb 23 09:49:43 np0005626463.localdomain ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mgr-np0005626463-wtksup[288032]: 2026-02-23T09:49:43.963+0000 7f6fbef71140 -1 mgr[py] Module progress has missing NOTIFY_TYPES member
Feb 23 09:49:44 np0005626463.localdomain ceph-mon[294160]: pgmap v4: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:49:44 np0005626463.localdomain ceph-mon[294160]: mgrmap e36: np0005626466.nisqfq(active, since 2s), standbys: np0005626465.hlpkwo, np0005626461.lrfquh
Feb 23 09:49:44 np0005626463.localdomain ceph-mon[294160]: [23/Feb/2026:09:49:43] ENGINE Bus STARTING
Feb 23 09:49:44 np0005626463.localdomain ceph-mon[294160]: [23/Feb/2026:09:49:43] ENGINE Serving on http://172.18.0.108:8765
Feb 23 09:49:44 np0005626463.localdomain ceph-mon[294160]: from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' 
Feb 23 09:49:44 np0005626463.localdomain ceph-mon[294160]: from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' 
Feb 23 09:49:44 np0005626463.localdomain ceph-mon[294160]: from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' 
Feb 23 09:49:44 np0005626463.localdomain ceph-mon[294160]: from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' 
Feb 23 09:49:44 np0005626463.localdomain ceph-mon[294160]: from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' 
Feb 23 09:49:44 np0005626463.localdomain ceph-mon[294160]: from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' 
Feb 23 09:49:44 np0005626463.localdomain ceph-mon[294160]: [23/Feb/2026:09:49:43] ENGINE Serving on https://172.18.0.108:7150
Feb 23 09:49:44 np0005626463.localdomain ceph-mon[294160]: [23/Feb/2026:09:49:43] ENGINE Bus STARTED
Feb 23 09:49:44 np0005626463.localdomain ceph-mon[294160]: [23/Feb/2026:09:49:43] ENGINE Client ('172.18.0.108', 60904) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Feb 23 09:49:44 np0005626463.localdomain sudo[304724]: pam_unix(sudo:session): session closed for user root
Feb 23 09:49:44 np0005626463.localdomain ceph-mgr[288036]: mgr[py] Module prometheus has missing NOTIFY_TYPES member
Feb 23 09:49:44 np0005626463.localdomain ceph-mgr[288036]: mgr[py] Loading python module 'rbd_support'
Feb 23 09:49:44 np0005626463.localdomain ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mgr-np0005626463-wtksup[288032]: 2026-02-23T09:49:44.258+0000 7f6fbef71140 -1 mgr[py] Module prometheus has missing NOTIFY_TYPES member
Feb 23 09:49:44 np0005626463.localdomain ceph-mgr[288036]: mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Feb 23 09:49:44 np0005626463.localdomain ceph-mgr[288036]: mgr[py] Loading python module 'restful'
Feb 23 09:49:44 np0005626463.localdomain ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mgr-np0005626463-wtksup[288032]: 2026-02-23T09:49:44.338+0000 7f6fbef71140 -1 mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Feb 23 09:49:44 np0005626463.localdomain sudo[304773]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 09:49:44 np0005626463.localdomain sudo[304773]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:49:44 np0005626463.localdomain sudo[304773]: pam_unix(sudo:session): session closed for user root
Feb 23 09:49:44 np0005626463.localdomain sudo[304791]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 list-networks
Feb 23 09:49:44 np0005626463.localdomain sudo[304791]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:49:44 np0005626463.localdomain ceph-mgr[288036]: mgr[py] Loading python module 'rgw'
Feb 23 09:49:44 np0005626463.localdomain ceph-mgr[288036]: mgr[py] Module rgw has missing NOTIFY_TYPES member
Feb 23 09:49:44 np0005626463.localdomain ceph-mgr[288036]: mgr[py] Loading python module 'rook'
Feb 23 09:49:44 np0005626463.localdomain ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mgr-np0005626463-wtksup[288032]: 2026-02-23T09:49:44.655+0000 7f6fbef71140 -1 mgr[py] Module rgw has missing NOTIFY_TYPES member
Feb 23 09:49:44 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain.devices.0}] v 0)
Feb 23 09:49:44 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' 
Feb 23 09:49:44 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain}] v 0)
Feb 23 09:49:44 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' 
Feb 23 09:49:44 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} v 0)
Feb 23 09:49:44 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Feb 23 09:49:44 np0005626463.localdomain sudo[304791]: pam_unix(sudo:session): session closed for user root
Feb 23 09:49:44 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} v 0)
Feb 23 09:49:44 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Feb 23 09:49:44 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config set, name=osd_memory_target}] v 0)
Feb 23 09:49:44 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626463.localdomain.devices.0}] v 0)
Feb 23 09:49:44 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' 
Feb 23 09:49:44 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626463.localdomain}] v 0)
Feb 23 09:49:44 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' 
Feb 23 09:49:44 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} v 0)
Feb 23 09:49:44 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Feb 23 09:49:44 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} v 0)
Feb 23 09:49:44 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Feb 23 09:49:44 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config set, name=osd_memory_target}] v 0)
Feb 23 09:49:44 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626465.localdomain.devices.0}] v 0)
Feb 23 09:49:44 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' 
Feb 23 09:49:44 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626465.localdomain}] v 0)
Feb 23 09:49:44 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' 
Feb 23 09:49:44 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} v 0)
Feb 23 09:49:44 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Feb 23 09:49:44 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} v 0)
Feb 23 09:49:44 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Feb 23 09:49:44 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config set, name=osd_memory_target}] v 0)
Feb 23 09:49:44 np0005626463.localdomain sudo[304829]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Feb 23 09:49:44 np0005626463.localdomain sudo[304829]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:49:44 np0005626463.localdomain sudo[304829]: pam_unix(sudo:session): session closed for user root
Feb 23 09:49:45 np0005626463.localdomain sudo[304847]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/etc/ceph
Feb 23 09:49:45 np0005626463.localdomain ceph-mgr[288036]: mgr[py] Module rook has missing NOTIFY_TYPES member
Feb 23 09:49:45 np0005626463.localdomain ceph-mgr[288036]: mgr[py] Loading python module 'selftest'
Feb 23 09:49:45 np0005626463.localdomain ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mgr-np0005626463-wtksup[288032]: 2026-02-23T09:49:45.055+0000 7f6fbef71140 -1 mgr[py] Module rook has missing NOTIFY_TYPES member
Feb 23 09:49:45 np0005626463.localdomain sudo[304847]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:49:45 np0005626463.localdomain sudo[304847]: pam_unix(sudo:session): session closed for user root
Feb 23 09:49:45 np0005626463.localdomain ceph-mgr[288036]: mgr[py] Module selftest has missing NOTIFY_TYPES member
Feb 23 09:49:45 np0005626463.localdomain ceph-mgr[288036]: mgr[py] Loading python module 'snap_schedule'
Feb 23 09:49:45 np0005626463.localdomain ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mgr-np0005626463-wtksup[288032]: 2026-02-23T09:49:45.116+0000 7f6fbef71140 -1 mgr[py] Module selftest has missing NOTIFY_TYPES member
Feb 23 09:49:45 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : mgrmap e37: np0005626466.nisqfq(active, since 4s), standbys: np0005626465.hlpkwo, np0005626461.lrfquh
Feb 23 09:49:45 np0005626463.localdomain sudo[304865]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/etc/ceph/ceph.conf.new
Feb 23 09:49:45 np0005626463.localdomain sudo[304865]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:49:45 np0005626463.localdomain sudo[304865]: pam_unix(sudo:session): session closed for user root
Feb 23 09:49:45 np0005626463.localdomain ceph-mgr[288036]: mgr[py] Loading python module 'stats'
Feb 23 09:49:45 np0005626463.localdomain sudo[304883]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46
Feb 23 09:49:45 np0005626463.localdomain sudo[304883]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:49:45 np0005626463.localdomain sudo[304883]: pam_unix(sudo:session): session closed for user root
Feb 23 09:49:45 np0005626463.localdomain ceph-mgr[288036]: mgr[py] Loading python module 'status'
Feb 23 09:49:45 np0005626463.localdomain sudo[304901]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/etc/ceph/ceph.conf.new
Feb 23 09:49:45 np0005626463.localdomain sudo[304901]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:49:45 np0005626463.localdomain sudo[304901]: pam_unix(sudo:session): session closed for user root
Feb 23 09:49:45 np0005626463.localdomain ceph-mgr[288036]: mgr[py] Module status has missing NOTIFY_TYPES member
Feb 23 09:49:45 np0005626463.localdomain ceph-mgr[288036]: mgr[py] Loading python module 'telegraf'
Feb 23 09:49:45 np0005626463.localdomain ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mgr-np0005626463-wtksup[288032]: 2026-02-23T09:49:45.305+0000 7f6fbef71140 -1 mgr[py] Module status has missing NOTIFY_TYPES member
Feb 23 09:49:45 np0005626463.localdomain ceph-mgr[288036]: mgr[py] Module telegraf has missing NOTIFY_TYPES member
Feb 23 09:49:45 np0005626463.localdomain ceph-mgr[288036]: mgr[py] Loading python module 'telemetry'
Feb 23 09:49:45 np0005626463.localdomain ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mgr-np0005626463-wtksup[288032]: 2026-02-23T09:49:45.362+0000 7f6fbef71140 -1 mgr[py] Module telegraf has missing NOTIFY_TYPES member
Feb 23 09:49:45 np0005626463.localdomain sudo[304935]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/etc/ceph/ceph.conf.new
Feb 23 09:49:45 np0005626463.localdomain sudo[304935]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:49:45 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.
Feb 23 09:49:45 np0005626463.localdomain sudo[304935]: pam_unix(sudo:session): session closed for user root
Feb 23 09:49:45 np0005626463.localdomain ceph-mgr[288036]: mgr[py] Module telemetry has missing NOTIFY_TYPES member
Feb 23 09:49:45 np0005626463.localdomain ceph-mgr[288036]: mgr[py] Loading python module 'test_orchestrator'
Feb 23 09:49:45 np0005626463.localdomain ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mgr-np0005626463-wtksup[288032]: 2026-02-23T09:49:45.491+0000 7f6fbef71140 -1 mgr[py] Module telemetry has missing NOTIFY_TYPES member
Feb 23 09:49:45 np0005626463.localdomain sudo[304959]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/etc/ceph/ceph.conf.new
Feb 23 09:49:45 np0005626463.localdomain sudo[304959]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:49:45 np0005626463.localdomain sudo[304959]: pam_unix(sudo:session): session closed for user root
Feb 23 09:49:45 np0005626463.localdomain podman[304953]: 2026-02-23 09:49:45.504906184 +0000 UTC m=+0.078760417 container health_status 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, config_id=openstack_network_exporter, vcs-type=git, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, maintainer=Red Hat, Inc., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vendor=Red Hat, Inc., managed_by=edpm_ansible, build-date=2026-02-05T04:57:10Z, distribution-scope=public, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9/ubi-minimal, org.opencontainers.image.created=2026-02-05T04:57:10Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, release=1770267347, version=9.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Feb 23 09:49:45 np0005626463.localdomain podman[304953]: 2026-02-23 09:49:45.522382216 +0000 UTC m=+0.096236489 container exec_died 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, release=1770267347, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.7, vendor=Red Hat, Inc., managed_by=edpm_ansible, config_id=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, org.opencontainers.image.created=2026-02-05T04:57:10Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, distribution-scope=public, io.openshift.expose-services=, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, architecture=x86_64, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-02-05T04:57:10Z, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, name=ubi9/ubi-minimal)
Feb 23 09:49:45 np0005626463.localdomain systemd[1]: 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.service: Deactivated successfully.
Feb 23 09:49:45 np0005626463.localdomain sudo[304992]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Feb 23 09:49:45 np0005626463.localdomain sudo[304992]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:49:45 np0005626463.localdomain sudo[304992]: pam_unix(sudo:session): session closed for user root
Feb 23 09:49:45 np0005626463.localdomain ceph-mgr[288036]: mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Feb 23 09:49:45 np0005626463.localdomain ceph-mgr[288036]: mgr[py] Loading python module 'volumes'
Feb 23 09:49:45 np0005626463.localdomain ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mgr-np0005626463-wtksup[288032]: 2026-02-23T09:49:45.638+0000 7f6fbef71140 -1 mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Feb 23 09:49:45 np0005626463.localdomain sudo[305010]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config
Feb 23 09:49:45 np0005626463.localdomain sudo[305010]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:49:45 np0005626463.localdomain sudo[305010]: pam_unix(sudo:session): session closed for user root
Feb 23 09:49:45 np0005626463.localdomain sudo[305028]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config
Feb 23 09:49:45 np0005626463.localdomain sudo[305028]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:49:45 np0005626463.localdomain sudo[305028]: pam_unix(sudo:session): session closed for user root
Feb 23 09:49:45 np0005626463.localdomain ceph-mon[294160]: from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' 
Feb 23 09:49:45 np0005626463.localdomain ceph-mon[294160]: from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' 
Feb 23 09:49:45 np0005626463.localdomain ceph-mon[294160]: from='mgr.26638 172.18.0.108:0/2769928129' entity='mgr.np0005626466.nisqfq' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Feb 23 09:49:45 np0005626463.localdomain ceph-mon[294160]: from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Feb 23 09:49:45 np0005626463.localdomain ceph-mon[294160]: from='mgr.26638 172.18.0.108:0/2769928129' entity='mgr.np0005626466.nisqfq' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Feb 23 09:49:45 np0005626463.localdomain ceph-mon[294160]: from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Feb 23 09:49:45 np0005626463.localdomain ceph-mon[294160]: Adjusting osd_memory_target on np0005626466.localdomain to 836.6M
Feb 23 09:49:45 np0005626463.localdomain ceph-mon[294160]: Unable to set osd_memory_target on np0005626466.localdomain to 877243801: error parsing value: Value '877243801' is below minimum 939524096
Feb 23 09:49:45 np0005626463.localdomain ceph-mon[294160]: from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' 
Feb 23 09:49:45 np0005626463.localdomain ceph-mon[294160]: from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' 
Feb 23 09:49:45 np0005626463.localdomain ceph-mon[294160]: from='mgr.26638 172.18.0.108:0/2769928129' entity='mgr.np0005626466.nisqfq' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Feb 23 09:49:45 np0005626463.localdomain ceph-mon[294160]: from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Feb 23 09:49:45 np0005626463.localdomain ceph-mon[294160]: from='mgr.26638 172.18.0.108:0/2769928129' entity='mgr.np0005626466.nisqfq' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Feb 23 09:49:45 np0005626463.localdomain ceph-mon[294160]: from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Feb 23 09:49:45 np0005626463.localdomain ceph-mon[294160]: Adjusting osd_memory_target on np0005626463.localdomain to 836.6M
Feb 23 09:49:45 np0005626463.localdomain ceph-mon[294160]: Unable to set osd_memory_target on np0005626463.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Feb 23 09:49:45 np0005626463.localdomain ceph-mon[294160]: from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' 
Feb 23 09:49:45 np0005626463.localdomain ceph-mon[294160]: from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' 
Feb 23 09:49:45 np0005626463.localdomain ceph-mon[294160]: from='mgr.26638 172.18.0.108:0/2769928129' entity='mgr.np0005626466.nisqfq' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Feb 23 09:49:45 np0005626463.localdomain ceph-mon[294160]: from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Feb 23 09:49:45 np0005626463.localdomain ceph-mon[294160]: from='mgr.26638 172.18.0.108:0/2769928129' entity='mgr.np0005626466.nisqfq' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Feb 23 09:49:45 np0005626463.localdomain ceph-mon[294160]: from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Feb 23 09:49:45 np0005626463.localdomain ceph-mon[294160]: Adjusting osd_memory_target on np0005626465.localdomain to 836.6M
Feb 23 09:49:45 np0005626463.localdomain ceph-mon[294160]: Unable to set osd_memory_target on np0005626465.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Feb 23 09:49:45 np0005626463.localdomain ceph-mon[294160]: from='mgr.26638 172.18.0.108:0/2769928129' entity='mgr.np0005626466.nisqfq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:49:45 np0005626463.localdomain ceph-mon[294160]: from='mgr.26638 172.18.0.108:0/2769928129' entity='mgr.np0005626466.nisqfq' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 23 09:49:45 np0005626463.localdomain ceph-mon[294160]: Updating np0005626463.localdomain:/etc/ceph/ceph.conf
Feb 23 09:49:45 np0005626463.localdomain ceph-mon[294160]: Updating np0005626465.localdomain:/etc/ceph/ceph.conf
Feb 23 09:49:45 np0005626463.localdomain ceph-mon[294160]: Updating np0005626466.localdomain:/etc/ceph/ceph.conf
Feb 23 09:49:45 np0005626463.localdomain ceph-mon[294160]: pgmap v5: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:49:45 np0005626463.localdomain ceph-mon[294160]: mgrmap e37: np0005626466.nisqfq(active, since 4s), standbys: np0005626465.hlpkwo, np0005626461.lrfquh
Feb 23 09:49:45 np0005626463.localdomain ceph-mon[294160]: Updating np0005626466.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf
Feb 23 09:49:45 np0005626463.localdomain sudo[305046]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf.new
Feb 23 09:49:45 np0005626463.localdomain sudo[305046]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:49:45 np0005626463.localdomain sudo[305046]: pam_unix(sudo:session): session closed for user root
Feb 23 09:49:45 np0005626463.localdomain ceph-mgr[288036]: mgr[py] Module volumes has missing NOTIFY_TYPES member
Feb 23 09:49:45 np0005626463.localdomain ceph-mgr[288036]: mgr[py] Loading python module 'zabbix'
Feb 23 09:49:45 np0005626463.localdomain ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mgr-np0005626463-wtksup[288032]: 2026-02-23T09:49:45.837+0000 7f6fbef71140 -1 mgr[py] Module volumes has missing NOTIFY_TYPES member
Feb 23 09:49:45 np0005626463.localdomain sudo[305064]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46
Feb 23 09:49:45 np0005626463.localdomain ceph-mgr[288036]: mgr[py] Module zabbix has missing NOTIFY_TYPES member
Feb 23 09:49:45 np0005626463.localdomain ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mgr-np0005626463-wtksup[288032]: 2026-02-23T09:49:45.900+0000 7f6fbef71140 -1 mgr[py] Module zabbix has missing NOTIFY_TYPES member
Feb 23 09:49:45 np0005626463.localdomain sudo[305064]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:49:45 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : Standby manager daemon np0005626463.wtksup started
Feb 23 09:49:45 np0005626463.localdomain ceph-mgr[288036]: ms_deliver_dispatch: unhandled message 0x559b75999600 mon_map magic: 0 from mon.0 v2:172.18.0.103:3300/0
Feb 23 09:49:45 np0005626463.localdomain sudo[305064]: pam_unix(sudo:session): session closed for user root
Feb 23 09:49:45 np0005626463.localdomain ceph-mgr[288036]: client.0 ms_handle_reset on v2:172.18.0.108:6810/1471406
Feb 23 09:49:45 np0005626463.localdomain sudo[305082]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf.new
Feb 23 09:49:45 np0005626463.localdomain sudo[305082]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:49:45 np0005626463.localdomain sudo[305082]: pam_unix(sudo:session): session closed for user root
Feb 23 09:49:46 np0005626463.localdomain sudo[305116]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf.new
Feb 23 09:49:46 np0005626463.localdomain sudo[305116]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:49:46 np0005626463.localdomain sudo[305116]: pam_unix(sudo:session): session closed for user root
Feb 23 09:49:46 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e85 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:49:46 np0005626463.localdomain sudo[305134]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf.new
Feb 23 09:49:46 np0005626463.localdomain sudo[305134]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:49:46 np0005626463.localdomain sudo[305134]: pam_unix(sudo:session): session closed for user root
Feb 23 09:49:46 np0005626463.localdomain sudo[305152]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf.new /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf
Feb 23 09:49:46 np0005626463.localdomain sudo[305152]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:49:46 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:49:46.291 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:49:46 np0005626463.localdomain sudo[305152]: pam_unix(sudo:session): session closed for user root
Feb 23 09:49:46 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:49:46.296 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:49:46 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:49:46.298 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5012 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 09:49:46 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:49:46.298 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:49:46 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:49:46.324 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:49:46 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:49:46.325 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:49:46 np0005626463.localdomain sudo[305170]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Feb 23 09:49:46 np0005626463.localdomain sudo[305170]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:49:46 np0005626463.localdomain sudo[305170]: pam_unix(sudo:session): session closed for user root
Feb 23 09:49:46 np0005626463.localdomain sudo[305188]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/etc/ceph
Feb 23 09:49:46 np0005626463.localdomain sudo[305188]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:49:46 np0005626463.localdomain sudo[305188]: pam_unix(sudo:session): session closed for user root
Feb 23 09:49:46 np0005626463.localdomain sudo[305206]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/etc/ceph/ceph.client.admin.keyring.new
Feb 23 09:49:46 np0005626463.localdomain sudo[305206]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:49:46 np0005626463.localdomain sudo[305206]: pam_unix(sudo:session): session closed for user root
Feb 23 09:49:46 np0005626463.localdomain sudo[305224]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46
Feb 23 09:49:46 np0005626463.localdomain sudo[305224]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:49:46 np0005626463.localdomain sudo[305224]: pam_unix(sudo:session): session closed for user root
Feb 23 09:49:46 np0005626463.localdomain sudo[305242]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/etc/ceph/ceph.client.admin.keyring.new
Feb 23 09:49:46 np0005626463.localdomain sudo[305242]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:49:46 np0005626463.localdomain sudo[305242]: pam_unix(sudo:session): session closed for user root
Feb 23 09:49:46 np0005626463.localdomain sudo[305276]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/etc/ceph/ceph.client.admin.keyring.new
Feb 23 09:49:46 np0005626463.localdomain sudo[305276]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:49:46 np0005626463.localdomain sudo[305276]: pam_unix(sudo:session): session closed for user root
Feb 23 09:49:46 np0005626463.localdomain ceph-mon[294160]: Updating np0005626463.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf
Feb 23 09:49:46 np0005626463.localdomain ceph-mon[294160]: Updating np0005626465.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf
Feb 23 09:49:46 np0005626463.localdomain ceph-mon[294160]: Standby manager daemon np0005626463.wtksup started
Feb 23 09:49:46 np0005626463.localdomain ceph-mon[294160]: Updating np0005626466.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 23 09:49:46 np0005626463.localdomain ceph-mon[294160]: Updating np0005626463.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 23 09:49:46 np0005626463.localdomain ceph-mon[294160]: Updating np0005626465.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 23 09:49:46 np0005626463.localdomain ceph-mgr[288036]: client.0 ms_handle_reset on v2:172.18.0.108:6810/1471406
Feb 23 09:49:46 np0005626463.localdomain sudo[305294]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/etc/ceph/ceph.client.admin.keyring.new
Feb 23 09:49:46 np0005626463.localdomain sudo[305294]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:49:46 np0005626463.localdomain sudo[305294]: pam_unix(sudo:session): session closed for user root
Feb 23 09:49:46 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : mgrmap e38: np0005626466.nisqfq(active, since 6s), standbys: np0005626465.hlpkwo, np0005626461.lrfquh, np0005626463.wtksup
Feb 23 09:49:47 np0005626463.localdomain sudo[305312]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/etc/ceph/ceph.client.admin.keyring.new /etc/ceph/ceph.client.admin.keyring
Feb 23 09:49:47 np0005626463.localdomain sudo[305312]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:49:47 np0005626463.localdomain sudo[305312]: pam_unix(sudo:session): session closed for user root
Feb 23 09:49:47 np0005626463.localdomain sudo[305330]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config
Feb 23 09:49:47 np0005626463.localdomain sudo[305330]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:49:47 np0005626463.localdomain sudo[305330]: pam_unix(sudo:session): session closed for user root
Feb 23 09:49:47 np0005626463.localdomain sudo[305348]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config
Feb 23 09:49:47 np0005626463.localdomain sudo[305348]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:49:47 np0005626463.localdomain sudo[305348]: pam_unix(sudo:session): session closed for user root
Feb 23 09:49:47 np0005626463.localdomain sudo[305366]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.client.admin.keyring.new
Feb 23 09:49:47 np0005626463.localdomain sudo[305366]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:49:47 np0005626463.localdomain sudo[305366]: pam_unix(sudo:session): session closed for user root
Feb 23 09:49:47 np0005626463.localdomain sudo[305384]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46
Feb 23 09:49:47 np0005626463.localdomain sudo[305384]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:49:47 np0005626463.localdomain sudo[305384]: pam_unix(sudo:session): session closed for user root
Feb 23 09:49:47 np0005626463.localdomain sudo[305402]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.client.admin.keyring.new
Feb 23 09:49:47 np0005626463.localdomain sudo[305402]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:49:47 np0005626463.localdomain sudo[305402]: pam_unix(sudo:session): session closed for user root
Feb 23 09:49:47 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain.devices.0}] v 0)
Feb 23 09:49:47 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' 
Feb 23 09:49:47 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain}] v 0)
Feb 23 09:49:47 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' 
Feb 23 09:49:47 np0005626463.localdomain sudo[305436]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.client.admin.keyring.new
Feb 23 09:49:47 np0005626463.localdomain sudo[305436]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:49:47 np0005626463.localdomain sudo[305436]: pam_unix(sudo:session): session closed for user root
Feb 23 09:49:47 np0005626463.localdomain sudo[305454]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.client.admin.keyring.new
Feb 23 09:49:47 np0005626463.localdomain sudo[305454]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:49:47 np0005626463.localdomain sudo[305454]: pam_unix(sudo:session): session closed for user root
Feb 23 09:49:47 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626465.localdomain.devices.0}] v 0)
Feb 23 09:49:47 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' 
Feb 23 09:49:47 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626465.localdomain}] v 0)
Feb 23 09:49:47 np0005626463.localdomain sudo[305472]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.client.admin.keyring.new /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.client.admin.keyring
Feb 23 09:49:47 np0005626463.localdomain sudo[305472]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:49:47 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' 
Feb 23 09:49:47 np0005626463.localdomain sudo[305472]: pam_unix(sudo:session): session closed for user root
Feb 23 09:49:47 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626463.localdomain.devices.0}] v 0)
Feb 23 09:49:47 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' 
Feb 23 09:49:47 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626463.localdomain}] v 0)
Feb 23 09:49:47 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' 
Feb 23 09:49:47 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Feb 23 09:49:47 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' 
Feb 23 09:49:47 np0005626463.localdomain sudo[305490]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 09:49:47 np0005626463.localdomain sudo[305490]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:49:47 np0005626463.localdomain sudo[305490]: pam_unix(sudo:session): session closed for user root
Feb 23 09:49:47 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [WRN] : Health check failed: 1 stray daemon(s) not managed by cephadm (CEPHADM_STRAY_DAEMON)
Feb 23 09:49:47 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [WRN] : Health check failed: 1 stray host(s) with 1 daemon(s) not managed by cephadm (CEPHADM_STRAY_HOST)
Feb 23 09:49:47 np0005626463.localdomain ceph-mon[294160]: Updating np0005626466.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.client.admin.keyring
Feb 23 09:49:47 np0005626463.localdomain ceph-mon[294160]: from='mgr.26638 172.18.0.108:0/2769928129' entity='mgr.np0005626466.nisqfq' cmd={"prefix": "mgr metadata", "who": "np0005626463.wtksup", "id": "np0005626463.wtksup"} : dispatch
Feb 23 09:49:47 np0005626463.localdomain ceph-mon[294160]: mgrmap e38: np0005626466.nisqfq(active, since 6s), standbys: np0005626465.hlpkwo, np0005626461.lrfquh, np0005626463.wtksup
Feb 23 09:49:47 np0005626463.localdomain ceph-mon[294160]: pgmap v6: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:49:47 np0005626463.localdomain ceph-mon[294160]: Updating np0005626463.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.client.admin.keyring
Feb 23 09:49:47 np0005626463.localdomain ceph-mon[294160]: Updating np0005626465.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.client.admin.keyring
Feb 23 09:49:47 np0005626463.localdomain ceph-mon[294160]: from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' 
Feb 23 09:49:47 np0005626463.localdomain ceph-mon[294160]: from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' 
Feb 23 09:49:47 np0005626463.localdomain ceph-mon[294160]: from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' 
Feb 23 09:49:47 np0005626463.localdomain ceph-mon[294160]: from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' 
Feb 23 09:49:47 np0005626463.localdomain ceph-mon[294160]: from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' 
Feb 23 09:49:47 np0005626463.localdomain ceph-mon[294160]: from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' 
Feb 23 09:49:47 np0005626463.localdomain ceph-mon[294160]: from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' 
Feb 23 09:49:47 np0005626463.localdomain ceph-mon[294160]: from='mgr.26638 172.18.0.108:0/2769928129' entity='mgr.np0005626466.nisqfq' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 23 09:49:48 np0005626463.localdomain sudo[305508]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 09:49:48 np0005626463.localdomain sudo[305508]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:49:48 np0005626463.localdomain sudo[305508]: pam_unix(sudo:session): session closed for user root
Feb 23 09:49:48 np0005626463.localdomain sudo[305526]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid f1fea371-cb69-578d-a3d0-b5c472a84b46
Feb 23 09:49:48 np0005626463.localdomain sudo[305526]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:49:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:49:48.552 163572 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 09:49:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:49:48.552 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 09:49:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:49:48.553 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 09:49:48 np0005626463.localdomain podman[305561]: 
Feb 23 09:49:48 np0005626463.localdomain podman[305561]: 2026-02-23 09:49:48.701280336 +0000 UTC m=+0.083153462 container create 1b80f8230a511807e7c8a32c0a9a6dae304e82209e65f6aed84aec8df6ff376c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=silly_babbage, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, architecture=x86_64, vcs-type=git, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., io.buildah.version=1.42.2, build-date=2026-02-09T10:25:24Z, description=Red Hat Ceph Storage 7, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, release=1770267347, GIT_CLEAN=True, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, org.opencontainers.image.created=2026-02-09T10:25:24Z)
Feb 23 09:49:48 np0005626463.localdomain systemd[1]: Started libpod-conmon-1b80f8230a511807e7c8a32c0a9a6dae304e82209e65f6aed84aec8df6ff376c.scope.
Feb 23 09:49:48 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 09:49:48 np0005626463.localdomain podman[305561]: 2026-02-23 09:49:48.66758125 +0000 UTC m=+0.049454466 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 23 09:49:48 np0005626463.localdomain podman[305561]: 2026-02-23 09:49:48.779229698 +0000 UTC m=+0.161102864 container init 1b80f8230a511807e7c8a32c0a9a6dae304e82209e65f6aed84aec8df6ff376c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=silly_babbage, release=1770267347, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.42.2, version=7, build-date=2026-02-09T10:25:24Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, io.openshift.expose-services=, RELEASE=main, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, name=rhceph, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git)
Feb 23 09:49:48 np0005626463.localdomain podman[305561]: 2026-02-23 09:49:48.792527603 +0000 UTC m=+0.174400769 container start 1b80f8230a511807e7c8a32c0a9a6dae304e82209e65f6aed84aec8df6ff376c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=silly_babbage, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., distribution-scope=public, vcs-type=git, ceph=True, release=1770267347, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, version=7, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, build-date=2026-02-09T10:25:24Z, RELEASE=main, architecture=x86_64, org.opencontainers.image.created=2026-02-09T10:25:24Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, description=Red Hat Ceph Storage 7, io.buildah.version=1.42.2, GIT_BRANCH=main, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Feb 23 09:49:48 np0005626463.localdomain podman[305561]: 2026-02-23 09:49:48.79309854 +0000 UTC m=+0.174971706 container attach 1b80f8230a511807e7c8a32c0a9a6dae304e82209e65f6aed84aec8df6ff376c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=silly_babbage, GIT_BRANCH=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=7, org.opencontainers.image.created=2026-02-09T10:25:24Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, release=1770267347, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, build-date=2026-02-09T10:25:24Z, io.buildah.version=1.42.2, ceph=True, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, name=rhceph, io.openshift.expose-services=, CEPH_POINT_RELEASE=)
Feb 23 09:49:48 np0005626463.localdomain silly_babbage[305576]: 167 167
Feb 23 09:49:48 np0005626463.localdomain systemd[1]: libpod-1b80f8230a511807e7c8a32c0a9a6dae304e82209e65f6aed84aec8df6ff376c.scope: Deactivated successfully.
Feb 23 09:49:48 np0005626463.localdomain podman[305561]: 2026-02-23 09:49:48.797325819 +0000 UTC m=+0.179199055 container died 1b80f8230a511807e7c8a32c0a9a6dae304e82209e65f6aed84aec8df6ff376c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=silly_babbage, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, vcs-type=git, io.buildah.version=1.42.2, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, build-date=2026-02-09T10:25:24Z, description=Red Hat Ceph Storage 7, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., ceph=True, version=7, release=1770267347, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, io.openshift.expose-services=, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, distribution-scope=public, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.created=2026-02-09T10:25:24Z)
Feb 23 09:49:48 np0005626463.localdomain podman[305581]: 2026-02-23 09:49:48.896565619 +0000 UTC m=+0.090667821 container remove 1b80f8230a511807e7c8a32c0a9a6dae304e82209e65f6aed84aec8df6ff376c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=silly_babbage, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.42.2, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, version=7, RELEASE=main, architecture=x86_64, build-date=2026-02-09T10:25:24Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container, io.openshift.expose-services=, release=1770267347, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_CLEAN=True, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, vendor=Red Hat, Inc.)
Feb 23 09:49:48 np0005626463.localdomain systemd[1]: libpod-conmon-1b80f8230a511807e7c8a32c0a9a6dae304e82209e65f6aed84aec8df6ff376c.scope: Deactivated successfully.
Feb 23 09:49:48 np0005626463.localdomain ceph-mon[294160]: pgmap v7: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail; 36 KiB/s rd, 0 B/s wr, 20 op/s
Feb 23 09:49:48 np0005626463.localdomain ceph-mon[294160]: Health check failed: 1 stray daemon(s) not managed by cephadm (CEPHADM_STRAY_DAEMON)
Feb 23 09:49:48 np0005626463.localdomain ceph-mon[294160]: Health check failed: 1 stray host(s) with 1 daemon(s) not managed by cephadm (CEPHADM_STRAY_HOST)
Feb 23 09:49:48 np0005626463.localdomain ceph-mon[294160]: Reconfiguring osd.5 (monmap changed)...
Feb 23 09:49:48 np0005626463.localdomain ceph-mon[294160]: from='mgr.26638 172.18.0.108:0/2769928129' entity='mgr.np0005626466.nisqfq' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch
Feb 23 09:49:48 np0005626463.localdomain ceph-mon[294160]: from='mgr.26638 172.18.0.108:0/2769928129' entity='mgr.np0005626466.nisqfq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:49:48 np0005626463.localdomain ceph-mon[294160]: Reconfiguring daemon osd.5 on np0005626463.localdomain
Feb 23 09:49:48 np0005626463.localdomain sshd[305600]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:49:49 np0005626463.localdomain sudo[305526]: pam_unix(sudo:session): session closed for user root
Feb 23 09:49:49 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626463.localdomain.devices.0}] v 0)
Feb 23 09:49:49 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' 
Feb 23 09:49:49 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626463.localdomain}] v 0)
Feb 23 09:49:49 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' 
Feb 23 09:49:49 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626463.localdomain.devices.0}] v 0)
Feb 23 09:49:49 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' 
Feb 23 09:49:49 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626463.localdomain}] v 0)
Feb 23 09:49:49 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' 
Feb 23 09:49:49 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mds.mds.np0005626463.qcthuc", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} v 0)
Feb 23 09:49:49 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005626463.qcthuc", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 23 09:49:49 np0005626463.localdomain sudo[305607]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 09:49:49 np0005626463.localdomain sudo[305607]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:49:49 np0005626463.localdomain sudo[305607]: pam_unix(sudo:session): session closed for user root
Feb 23 09:49:49 np0005626463.localdomain sudo[305625]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid f1fea371-cb69-578d-a3d0-b5c472a84b46
Feb 23 09:49:49 np0005626463.localdomain sudo[305625]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:49:49 np0005626463.localdomain sshd[305600]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:49:49 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-3e5dcd73db399b7bd165e48ba18e6960b89c64b2e1269e4fe0b6a9d247aa6147-merged.mount: Deactivated successfully.
Feb 23 09:49:49 np0005626463.localdomain podman[305660]: 
Feb 23 09:49:49 np0005626463.localdomain podman[305660]: 2026-02-23 09:49:49.868051526 +0000 UTC m=+0.082874353 container create 70be4ef4e3ad862e9be560db25c76d33459eb2a8d487dcd484c6df0e76d78708 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sweet_lewin, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, CEPH_POINT_RELEASE=, version=7, release=1770267347, name=rhceph, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, io.buildah.version=1.42.2, build-date=2026-02-09T10:25:24Z, org.opencontainers.image.created=2026-02-09T10:25:24Z, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vendor=Red Hat, Inc., RELEASE=main)
Feb 23 09:49:49 np0005626463.localdomain systemd[1]: Started libpod-conmon-70be4ef4e3ad862e9be560db25c76d33459eb2a8d487dcd484c6df0e76d78708.scope.
Feb 23 09:49:49 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 09:49:49 np0005626463.localdomain podman[305660]: 2026-02-23 09:49:49.83436269 +0000 UTC m=+0.049185557 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 23 09:49:49 np0005626463.localdomain podman[305660]: 2026-02-23 09:49:49.940916043 +0000 UTC m=+0.155738880 container init 70be4ef4e3ad862e9be560db25c76d33459eb2a8d487dcd484c6df0e76d78708 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sweet_lewin, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, GIT_CLEAN=True, build-date=2026-02-09T10:25:24Z, architecture=x86_64, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, GIT_BRANCH=main, release=1770267347, org.opencontainers.image.created=2026-02-09T10:25:24Z, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, distribution-scope=public, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.42.2)
Feb 23 09:49:49 np0005626463.localdomain podman[305660]: 2026-02-23 09:49:49.951537807 +0000 UTC m=+0.166360644 container start 70be4ef4e3ad862e9be560db25c76d33459eb2a8d487dcd484c6df0e76d78708 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sweet_lewin, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.42.2, name=rhceph, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, org.opencontainers.image.created=2026-02-09T10:25:24Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, vendor=Red Hat, Inc., build-date=2026-02-09T10:25:24Z, architecture=x86_64, io.openshift.expose-services=, com.redhat.component=rhceph-container, GIT_BRANCH=main, vcs-type=git, release=1770267347, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14)
Feb 23 09:49:49 np0005626463.localdomain podman[305660]: 2026-02-23 09:49:49.951835336 +0000 UTC m=+0.166658163 container attach 70be4ef4e3ad862e9be560db25c76d33459eb2a8d487dcd484c6df0e76d78708 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sweet_lewin, GIT_CLEAN=True, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-type=git, RELEASE=main, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.description=Red Hat Ceph Storage 7, release=1770267347, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.42.2, architecture=x86_64, build-date=2026-02-09T10:25:24Z, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, version=7, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, ceph=True, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Feb 23 09:49:49 np0005626463.localdomain sweet_lewin[305675]: 167 167
Feb 23 09:49:49 np0005626463.localdomain systemd[1]: libpod-70be4ef4e3ad862e9be560db25c76d33459eb2a8d487dcd484c6df0e76d78708.scope: Deactivated successfully.
Feb 23 09:49:49 np0005626463.localdomain podman[305660]: 2026-02-23 09:49:49.956487348 +0000 UTC m=+0.171310205 container died 70be4ef4e3ad862e9be560db25c76d33459eb2a8d487dcd484c6df0e76d78708 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sweet_lewin, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, distribution-scope=public, build-date=2026-02-09T10:25:24Z, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.created=2026-02-09T10:25:24Z, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, version=7, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, release=1770267347, io.buildah.version=1.42.2, vendor=Red Hat, Inc., GIT_BRANCH=main, vcs-type=git, io.openshift.expose-services=, com.redhat.component=rhceph-container, RELEASE=main, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14)
Feb 23 09:49:50 np0005626463.localdomain podman[305680]: 2026-02-23 09:49:50.096006804 +0000 UTC m=+0.089111013 container remove 70be4ef4e3ad862e9be560db25c76d33459eb2a8d487dcd484c6df0e76d78708 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sweet_lewin, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.created=2026-02-09T10:25:24Z, release=1770267347, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.42.2, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., vcs-type=git, ceph=True, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, build-date=2026-02-09T10:25:24Z, version=7, architecture=x86_64)
Feb 23 09:49:50 np0005626463.localdomain systemd[1]: libpod-conmon-70be4ef4e3ad862e9be560db25c76d33459eb2a8d487dcd484c6df0e76d78708.scope: Deactivated successfully.
Feb 23 09:49:50 np0005626463.localdomain ceph-mon[294160]: from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' 
Feb 23 09:49:50 np0005626463.localdomain ceph-mon[294160]: from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' 
Feb 23 09:49:50 np0005626463.localdomain ceph-mon[294160]: from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' 
Feb 23 09:49:50 np0005626463.localdomain ceph-mon[294160]: from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' 
Feb 23 09:49:50 np0005626463.localdomain ceph-mon[294160]: Reconfiguring mds.mds.np0005626463.qcthuc (monmap changed)...
Feb 23 09:49:50 np0005626463.localdomain ceph-mon[294160]: from='mgr.26638 172.18.0.108:0/2769928129' entity='mgr.np0005626466.nisqfq' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005626463.qcthuc", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 23 09:49:50 np0005626463.localdomain ceph-mon[294160]: from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005626463.qcthuc", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 23 09:49:50 np0005626463.localdomain ceph-mon[294160]: from='mgr.26638 172.18.0.108:0/2769928129' entity='mgr.np0005626466.nisqfq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:49:50 np0005626463.localdomain ceph-mon[294160]: Reconfiguring daemon mds.mds.np0005626463.qcthuc on np0005626463.localdomain
Feb 23 09:49:50 np0005626463.localdomain sudo[305625]: pam_unix(sudo:session): session closed for user root
Feb 23 09:49:50 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626463.localdomain.devices.0}] v 0)
Feb 23 09:49:50 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' 
Feb 23 09:49:50 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626463.localdomain}] v 0)
Feb 23 09:49:50 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' 
Feb 23 09:49:50 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mgr.np0005626463.wtksup", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} v 0)
Feb 23 09:49:50 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626463.wtksup", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 23 09:49:50 np0005626463.localdomain sudo[305696]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 09:49:50 np0005626463.localdomain sudo[305696]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:49:50 np0005626463.localdomain sudo[305696]: pam_unix(sudo:session): session closed for user root
Feb 23 09:49:50 np0005626463.localdomain sudo[305714]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid f1fea371-cb69-578d-a3d0-b5c472a84b46
Feb 23 09:49:50 np0005626463.localdomain sudo[305714]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:49:50 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-2a3256188aa71ccd5d56dce7a4a0ab2c4e98074d8d862e0e18879f63fd382dac-merged.mount: Deactivated successfully.
Feb 23 09:49:50 np0005626463.localdomain podman[305750]: 
Feb 23 09:49:50 np0005626463.localdomain podman[305750]: 2026-02-23 09:49:50.868902607 +0000 UTC m=+0.076478379 container create 74b83f61184b317e21feca4fece2b1cd5d3e98349045c9755116c4f4cb6a8cb7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=serene_agnesi, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2026-02-09T10:25:24Z, distribution-scope=public, RELEASE=main, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, release=1770267347, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, io.openshift.expose-services=, io.buildah.version=1.42.2, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, build-date=2026-02-09T10:25:24Z, CEPH_POINT_RELEASE=, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, vcs-type=git, name=rhceph, GIT_BRANCH=main)
Feb 23 09:49:50 np0005626463.localdomain systemd[1]: Started libpod-conmon-74b83f61184b317e21feca4fece2b1cd5d3e98349045c9755116c4f4cb6a8cb7.scope.
Feb 23 09:49:50 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 09:49:50 np0005626463.localdomain podman[305750]: 2026-02-23 09:49:50.843190595 +0000 UTC m=+0.050766467 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 23 09:49:50 np0005626463.localdomain podman[305750]: 2026-02-23 09:49:50.95115651 +0000 UTC m=+0.158732292 container init 74b83f61184b317e21feca4fece2b1cd5d3e98349045c9755116c4f4cb6a8cb7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=serene_agnesi, release=1770267347, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, ceph=True, version=7, io.buildah.version=1.42.2, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.created=2026-02-09T10:25:24Z, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, RELEASE=main, name=rhceph, vendor=Red Hat, Inc., io.openshift.expose-services=, GIT_CLEAN=True, GIT_BRANCH=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, build-date=2026-02-09T10:25:24Z, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, io.openshift.tags=rhceph ceph)
Feb 23 09:49:50 np0005626463.localdomain podman[305750]: 2026-02-23 09:49:50.96331845 +0000 UTC m=+0.170894222 container start 74b83f61184b317e21feca4fece2b1cd5d3e98349045c9755116c4f4cb6a8cb7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=serene_agnesi, GIT_BRANCH=main, release=1770267347, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, vendor=Red Hat, Inc., io.openshift.expose-services=, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2026-02-09T10:25:24Z, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, GIT_CLEAN=True, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, version=7, io.buildah.version=1.42.2)
Feb 23 09:49:50 np0005626463.localdomain podman[305750]: 2026-02-23 09:49:50.964069434 +0000 UTC m=+0.171645236 container attach 74b83f61184b317e21feca4fece2b1cd5d3e98349045c9755116c4f4cb6a8cb7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=serene_agnesi, ceph=True, release=1770267347, CEPH_POINT_RELEASE=, org.opencontainers.image.created=2026-02-09T10:25:24Z, architecture=x86_64, GIT_CLEAN=True, RELEASE=main, build-date=2026-02-09T10:25:24Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.42.2, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, distribution-scope=public)
Feb 23 09:49:50 np0005626463.localdomain serene_agnesi[305765]: 167 167
Feb 23 09:49:50 np0005626463.localdomain systemd[1]: libpod-74b83f61184b317e21feca4fece2b1cd5d3e98349045c9755116c4f4cb6a8cb7.scope: Deactivated successfully.
Feb 23 09:49:50 np0005626463.localdomain podman[305750]: 2026-02-23 09:49:50.969178289 +0000 UTC m=+0.176754131 container died 74b83f61184b317e21feca4fece2b1cd5d3e98349045c9755116c4f4cb6a8cb7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=serene_agnesi, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vendor=Red Hat, Inc., io.buildah.version=1.42.2, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.openshift.expose-services=, build-date=2026-02-09T10:25:24Z, RELEASE=main, description=Red Hat Ceph Storage 7, release=1770267347, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, com.redhat.component=rhceph-container, GIT_BRANCH=main, CEPH_POINT_RELEASE=, name=rhceph, version=7, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Feb 23 09:49:51 np0005626463.localdomain podman[305770]: 2026-02-23 09:49:51.071687299 +0000 UTC m=+0.089576388 container remove 74b83f61184b317e21feca4fece2b1cd5d3e98349045c9755116c4f4cb6a8cb7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=serene_agnesi, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, RELEASE=main, vendor=Red Hat, Inc., build-date=2026-02-09T10:25:24Z, com.redhat.component=rhceph-container, release=1770267347, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.created=2026-02-09T10:25:24Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, name=rhceph, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph, version=7, CEPH_POINT_RELEASE=, GIT_CLEAN=True, io.openshift.expose-services=, io.buildah.version=1.42.2, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Feb 23 09:49:51 np0005626463.localdomain systemd[1]: libpod-conmon-74b83f61184b317e21feca4fece2b1cd5d3e98349045c9755116c4f4cb6a8cb7.scope: Deactivated successfully.
Feb 23 09:49:51 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Feb 23 09:49:51 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e85 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:49:51 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' 
Feb 23 09:49:51 np0005626463.localdomain sudo[305714]: pam_unix(sudo:session): session closed for user root
Feb 23 09:49:51 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626463.localdomain.devices.0}] v 0)
Feb 23 09:49:51 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' 
Feb 23 09:49:51 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626463.localdomain}] v 0)
Feb 23 09:49:51 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' 
Feb 23 09:49:51 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.crash.np0005626465", "caps": ["mon", "profile crash", "mgr", "profile crash"]} v 0)
Feb 23 09:49:51 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626465", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 23 09:49:51 np0005626463.localdomain ceph-mon[294160]: pgmap v8: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail; 26 KiB/s rd, 0 B/s wr, 14 op/s
Feb 23 09:49:51 np0005626463.localdomain ceph-mon[294160]: from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' 
Feb 23 09:49:51 np0005626463.localdomain ceph-mon[294160]: from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' 
Feb 23 09:49:51 np0005626463.localdomain ceph-mon[294160]: Reconfiguring mgr.np0005626463.wtksup (monmap changed)...
Feb 23 09:49:51 np0005626463.localdomain ceph-mon[294160]: from='mgr.26638 172.18.0.108:0/2769928129' entity='mgr.np0005626466.nisqfq' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626463.wtksup", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 23 09:49:51 np0005626463.localdomain ceph-mon[294160]: from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626463.wtksup", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 23 09:49:51 np0005626463.localdomain ceph-mon[294160]: from='mgr.26638 172.18.0.108:0/2769928129' entity='mgr.np0005626466.nisqfq' cmd={"prefix": "mgr services"} : dispatch
Feb 23 09:49:51 np0005626463.localdomain ceph-mon[294160]: from='mgr.26638 172.18.0.108:0/2769928129' entity='mgr.np0005626466.nisqfq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:49:51 np0005626463.localdomain ceph-mon[294160]: Reconfiguring daemon mgr.np0005626463.wtksup on np0005626463.localdomain
Feb 23 09:49:51 np0005626463.localdomain ceph-mon[294160]: from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' 
Feb 23 09:49:51 np0005626463.localdomain ceph-mon[294160]: from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' 
Feb 23 09:49:51 np0005626463.localdomain ceph-mon[294160]: from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' 
Feb 23 09:49:51 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:49:51.325 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:49:51 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:49:51.328 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:49:51 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:49:51.328 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 09:49:51 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:49:51.328 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:49:51 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:49:51.358 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:49:51 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:49:51.358 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:49:51 np0005626463.localdomain sshd[305787]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:49:51 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-6a3b81da0c89bdf3142d4296e702090d78b44bc3007472fbd1e2db96cf492f8c-merged.mount: Deactivated successfully.
Feb 23 09:49:52 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626465.localdomain.devices.0}] v 0)
Feb 23 09:49:52 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' 
Feb 23 09:49:52 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626465.localdomain}] v 0)
Feb 23 09:49:52 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' 
Feb 23 09:49:52 np0005626463.localdomain ceph-mon[294160]: Reconfiguring crash.np0005626465 (monmap changed)...
Feb 23 09:49:52 np0005626463.localdomain ceph-mon[294160]: from='mgr.26638 172.18.0.108:0/2769928129' entity='mgr.np0005626466.nisqfq' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626465", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 23 09:49:52 np0005626463.localdomain ceph-mon[294160]: from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626465", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 23 09:49:52 np0005626463.localdomain ceph-mon[294160]: from='mgr.26638 172.18.0.108:0/2769928129' entity='mgr.np0005626466.nisqfq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:49:52 np0005626463.localdomain ceph-mon[294160]: Reconfiguring daemon crash.np0005626465 on np0005626465.localdomain
Feb 23 09:49:52 np0005626463.localdomain ceph-mon[294160]: from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' 
Feb 23 09:49:52 np0005626463.localdomain ceph-mon[294160]: from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' 
Feb 23 09:49:53 np0005626463.localdomain ceph-mon[294160]: pgmap v9: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail; 21 KiB/s rd, 0 B/s wr, 11 op/s
Feb 23 09:49:53 np0005626463.localdomain ceph-mon[294160]: Reconfiguring osd.0 (monmap changed)...
Feb 23 09:49:53 np0005626463.localdomain ceph-mon[294160]: from='mgr.26638 172.18.0.108:0/2769928129' entity='mgr.np0005626466.nisqfq' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch
Feb 23 09:49:53 np0005626463.localdomain ceph-mon[294160]: from='mgr.26638 172.18.0.108:0/2769928129' entity='mgr.np0005626466.nisqfq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:49:53 np0005626463.localdomain ceph-mon[294160]: Reconfiguring daemon osd.0 on np0005626465.localdomain
Feb 23 09:49:53 np0005626463.localdomain ceph-mon[294160]: from='client.64136 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Feb 23 09:49:53 np0005626463.localdomain sshd[305787]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:49:54 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626465.localdomain.devices.0}] v 0)
Feb 23 09:49:54 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' 
Feb 23 09:49:54 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626465.localdomain}] v 0)
Feb 23 09:49:54 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' 
Feb 23 09:49:54 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626465.localdomain.devices.0}] v 0)
Feb 23 09:49:54 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' 
Feb 23 09:49:54 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626465.localdomain}] v 0)
Feb 23 09:49:54 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' 
Feb 23 09:49:55 np0005626463.localdomain ceph-mon[294160]: pgmap v10: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail; 19 KiB/s rd, 0 B/s wr, 10 op/s
Feb 23 09:49:55 np0005626463.localdomain ceph-mon[294160]: from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' 
Feb 23 09:49:55 np0005626463.localdomain ceph-mon[294160]: from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' 
Feb 23 09:49:55 np0005626463.localdomain ceph-mon[294160]: from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' 
Feb 23 09:49:55 np0005626463.localdomain ceph-mon[294160]: from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' 
Feb 23 09:49:55 np0005626463.localdomain ceph-mon[294160]: Reconfiguring osd.3 (monmap changed)...
Feb 23 09:49:55 np0005626463.localdomain ceph-mon[294160]: from='mgr.26638 172.18.0.108:0/2769928129' entity='mgr.np0005626466.nisqfq' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch
Feb 23 09:49:55 np0005626463.localdomain ceph-mon[294160]: from='mgr.26638 172.18.0.108:0/2769928129' entity='mgr.np0005626466.nisqfq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:49:55 np0005626463.localdomain ceph-mon[294160]: Reconfiguring daemon osd.3 on np0005626465.localdomain
Feb 23 09:49:55 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626465.localdomain.devices.0}] v 0)
Feb 23 09:49:55 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' 
Feb 23 09:49:55 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626465.localdomain}] v 0)
Feb 23 09:49:55 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' 
Feb 23 09:49:55 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626465.localdomain.devices.0}] v 0)
Feb 23 09:49:55 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' 
Feb 23 09:49:55 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626465.localdomain}] v 0)
Feb 23 09:49:55 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' 
Feb 23 09:49:55 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mds.mds.np0005626465.drvnoy", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} v 0)
Feb 23 09:49:55 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005626465.drvnoy", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 23 09:49:55 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.
Feb 23 09:49:55 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.
Feb 23 09:49:55 np0005626463.localdomain podman[305790]: 2026-02-23 09:49:55.920469781 +0000 UTC m=+0.087970048 container health_status bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 23 09:49:55 np0005626463.localdomain podman[305790]: 2026-02-23 09:49:55.935256771 +0000 UTC m=+0.102757008 container exec_died bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 23 09:49:55 np0005626463.localdomain systemd[1]: bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.service: Deactivated successfully.
Feb 23 09:49:56 np0005626463.localdomain podman[305789]: 2026-02-23 09:49:56.025126946 +0000 UTC m=+0.192874031 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.43.0, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Feb 23 09:49:56 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626465.localdomain.devices.0}] v 0)
Feb 23 09:49:56 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' 
Feb 23 09:49:56 np0005626463.localdomain podman[305789]: 2026-02-23 09:49:56.115767486 +0000 UTC m=+0.283514591 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 23 09:49:56 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626465.localdomain}] v 0)
Feb 23 09:49:56 np0005626463.localdomain systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully.
Feb 23 09:49:56 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' 
Feb 23 09:49:56 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e85 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:49:56 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mgr.np0005626465.hlpkwo", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} v 0)
Feb 23 09:49:56 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626465.hlpkwo", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 23 09:49:56 np0005626463.localdomain ceph-mon[294160]: from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' 
Feb 23 09:49:56 np0005626463.localdomain ceph-mon[294160]: from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' 
Feb 23 09:49:56 np0005626463.localdomain ceph-mon[294160]: from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' 
Feb 23 09:49:56 np0005626463.localdomain ceph-mon[294160]: from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' 
Feb 23 09:49:56 np0005626463.localdomain ceph-mon[294160]: Reconfiguring mds.mds.np0005626465.drvnoy (monmap changed)...
Feb 23 09:49:56 np0005626463.localdomain ceph-mon[294160]: from='mgr.26638 172.18.0.108:0/2769928129' entity='mgr.np0005626466.nisqfq' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005626465.drvnoy", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 23 09:49:56 np0005626463.localdomain ceph-mon[294160]: from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005626465.drvnoy", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 23 09:49:56 np0005626463.localdomain ceph-mon[294160]: from='mgr.26638 172.18.0.108:0/2769928129' entity='mgr.np0005626466.nisqfq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:49:56 np0005626463.localdomain ceph-mon[294160]: Reconfiguring daemon mds.mds.np0005626465.drvnoy on np0005626465.localdomain
Feb 23 09:49:56 np0005626463.localdomain ceph-mon[294160]: from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' 
Feb 23 09:49:56 np0005626463.localdomain ceph-mon[294160]: from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' 
Feb 23 09:49:56 np0005626463.localdomain ceph-mon[294160]: from='mgr.26638 172.18.0.108:0/2769928129' entity='mgr.np0005626466.nisqfq' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626465.hlpkwo", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 23 09:49:56 np0005626463.localdomain ceph-mon[294160]: from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626465.hlpkwo", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 23 09:49:56 np0005626463.localdomain ceph-mon[294160]: from='mgr.26638 172.18.0.108:0/2769928129' entity='mgr.np0005626466.nisqfq' cmd={"prefix": "mgr services"} : dispatch
Feb 23 09:49:56 np0005626463.localdomain ceph-mon[294160]: from='mgr.26638 172.18.0.108:0/2769928129' entity='mgr.np0005626466.nisqfq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:49:56 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:49:56.359 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:49:56 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:49:56.361 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:49:56 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:49:56.361 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 09:49:56 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:49:56.361 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:49:56 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:49:56.362 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:49:56 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:49:56.365 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:49:57 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626465.localdomain.devices.0}] v 0)
Feb 23 09:49:57 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' 
Feb 23 09:49:57 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626465.localdomain}] v 0)
Feb 23 09:49:57 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' 
Feb 23 09:49:57 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.crash.np0005626466", "caps": ["mon", "profile crash", "mgr", "profile crash"]} v 0)
Feb 23 09:49:57 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626466", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 23 09:49:57 np0005626463.localdomain ceph-mon[294160]: pgmap v11: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail; 19 KiB/s rd, 0 B/s wr, 10 op/s
Feb 23 09:49:57 np0005626463.localdomain ceph-mon[294160]: Reconfiguring mgr.np0005626465.hlpkwo (monmap changed)...
Feb 23 09:49:57 np0005626463.localdomain ceph-mon[294160]: Reconfiguring daemon mgr.np0005626465.hlpkwo on np0005626465.localdomain
Feb 23 09:49:57 np0005626463.localdomain ceph-mon[294160]: from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' 
Feb 23 09:49:57 np0005626463.localdomain ceph-mon[294160]: from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' 
Feb 23 09:49:57 np0005626463.localdomain ceph-mon[294160]: from='mgr.26638 172.18.0.108:0/2769928129' entity='mgr.np0005626466.nisqfq' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626466", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 23 09:49:57 np0005626463.localdomain ceph-mon[294160]: from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626466", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 23 09:49:57 np0005626463.localdomain ceph-mon[294160]: from='mgr.26638 172.18.0.108:0/2769928129' entity='mgr.np0005626466.nisqfq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:49:57 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/spec.mon}] v 0)
Feb 23 09:49:57 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' 
Feb 23 09:49:57 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain.devices.0}] v 0)
Feb 23 09:49:57 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' 
Feb 23 09:49:57 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain}] v 0)
Feb 23 09:49:57 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' 
Feb 23 09:49:58 np0005626463.localdomain ceph-mon[294160]: Reconfiguring crash.np0005626466 (monmap changed)...
Feb 23 09:49:58 np0005626463.localdomain ceph-mon[294160]: Reconfiguring daemon crash.np0005626466 on np0005626466.localdomain
Feb 23 09:49:58 np0005626463.localdomain ceph-mon[294160]: from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' 
Feb 23 09:49:58 np0005626463.localdomain ceph-mon[294160]: from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' 
Feb 23 09:49:58 np0005626463.localdomain ceph-mon[294160]: from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' 
Feb 23 09:49:58 np0005626463.localdomain ceph-mon[294160]: from='mgr.26638 172.18.0.108:0/2769928129' entity='mgr.np0005626466.nisqfq' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch
Feb 23 09:49:58 np0005626463.localdomain ceph-mon[294160]: from='mgr.26638 172.18.0.108:0/2769928129' entity='mgr.np0005626466.nisqfq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:49:58 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain.devices.0}] v 0)
Feb 23 09:49:58 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' 
Feb 23 09:49:58 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain}] v 0)
Feb 23 09:49:59 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' 
Feb 23 09:49:59 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain.devices.0}] v 0)
Feb 23 09:49:59 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' 
Feb 23 09:49:59 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain}] v 0)
Feb 23 09:49:59 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' 
Feb 23 09:49:59 np0005626463.localdomain ceph-mon[294160]: pgmap v12: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail; 19 KiB/s rd, 0 B/s wr, 10 op/s
Feb 23 09:49:59 np0005626463.localdomain ceph-mon[294160]: from='client.54184 -' entity='client.admin' cmd=[{"prefix": "orch apply", "target": ["mon-mgr", ""]}]: dispatch
Feb 23 09:49:59 np0005626463.localdomain ceph-mon[294160]: Saving service mon spec with placement label:mon
Feb 23 09:49:59 np0005626463.localdomain ceph-mon[294160]: Reconfiguring osd.1 (monmap changed)...
Feb 23 09:49:59 np0005626463.localdomain ceph-mon[294160]: Reconfiguring daemon osd.1 on np0005626466.localdomain
Feb 23 09:49:59 np0005626463.localdomain ceph-mon[294160]: from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' 
Feb 23 09:49:59 np0005626463.localdomain ceph-mon[294160]: from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' 
Feb 23 09:49:59 np0005626463.localdomain ceph-mon[294160]: from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' 
Feb 23 09:49:59 np0005626463.localdomain ceph-mon[294160]: from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' 
Feb 23 09:49:59 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.
Feb 23 09:49:59 np0005626463.localdomain podman[305838]: 2026-02-23 09:49:59.924497373 +0000 UTC m=+0.101875061 container health_status be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ceilometer_agent_compute, io.buildah.version=1.43.0, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Feb 23 09:49:59 np0005626463.localdomain podman[305838]: 2026-02-23 09:49:59.935015043 +0000 UTC m=+0.112392761 container exec_died be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Feb 23 09:49:59 np0005626463.localdomain systemd[1]: be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.service: Deactivated successfully.
Feb 23 09:50:00 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [WRN] : Health detail: HEALTH_WARN 1 stray daemon(s) not managed by cephadm; 1 stray host(s) with 1 daemon(s) not managed by cephadm
Feb 23 09:50:00 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [WRN] : [WRN] CEPHADM_STRAY_DAEMON: 1 stray daemon(s) not managed by cephadm
Feb 23 09:50:00 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [WRN] :     stray daemon mgr.np0005626461.lrfquh on host np0005626461.localdomain not managed by cephadm
Feb 23 09:50:00 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [WRN] : [WRN] CEPHADM_STRAY_HOST: 1 stray host(s) with 1 daemon(s) not managed by cephadm
Feb 23 09:50:00 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [WRN] :     stray host np0005626461.localdomain has 1 stray daemons: ['mgr.np0005626461.lrfquh']
Feb 23 09:50:00 np0005626463.localdomain ceph-mon[294160]: from='client.54190 -' entity='client.admin' cmd=[{"prefix": "orch ps", "daemon_type": "mon", "daemon_id": "np0005626466", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Feb 23 09:50:00 np0005626463.localdomain ceph-mon[294160]: from='mgr.26638 172.18.0.108:0/2769928129' entity='mgr.np0005626466.nisqfq' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch
Feb 23 09:50:00 np0005626463.localdomain ceph-mon[294160]: from='mgr.26638 172.18.0.108:0/2769928129' entity='mgr.np0005626466.nisqfq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:50:00 np0005626463.localdomain ceph-mon[294160]: Reconfiguring daemon osd.4 on np0005626466.localdomain
Feb 23 09:50:00 np0005626463.localdomain ceph-mon[294160]: Health detail: HEALTH_WARN 1 stray daemon(s) not managed by cephadm; 1 stray host(s) with 1 daemon(s) not managed by cephadm
Feb 23 09:50:00 np0005626463.localdomain ceph-mon[294160]: [WRN] CEPHADM_STRAY_DAEMON: 1 stray daemon(s) not managed by cephadm
Feb 23 09:50:00 np0005626463.localdomain ceph-mon[294160]:     stray daemon mgr.np0005626461.lrfquh on host np0005626461.localdomain not managed by cephadm
Feb 23 09:50:00 np0005626463.localdomain ceph-mon[294160]: [WRN] CEPHADM_STRAY_HOST: 1 stray host(s) with 1 daemon(s) not managed by cephadm
Feb 23 09:50:00 np0005626463.localdomain ceph-mon[294160]:     stray host np0005626461.localdomain has 1 stray daemons: ['mgr.np0005626461.lrfquh']
Feb 23 09:50:00 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain.devices.0}] v 0)
Feb 23 09:50:00 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' 
Feb 23 09:50:00 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain}] v 0)
Feb 23 09:50:00 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' 
Feb 23 09:50:00 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain.devices.0}] v 0)
Feb 23 09:50:00 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' 
Feb 23 09:50:00 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain}] v 0)
Feb 23 09:50:00 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' 
Feb 23 09:50:01 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e85 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:50:01 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:50:01.366 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:50:01 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:50:01.368 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:50:01 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:50:01.368 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 09:50:01 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:50:01.369 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:50:01 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:50:01.392 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:50:01 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:50:01.393 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:50:01 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain.devices.0}] v 0)
Feb 23 09:50:01 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' 
Feb 23 09:50:01 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain}] v 0)
Feb 23 09:50:01 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' 
Feb 23 09:50:01 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Feb 23 09:50:01 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' 
Feb 23 09:50:01 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/spec.mon}] v 0)
Feb 23 09:50:01 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' 
Feb 23 09:50:01 np0005626463.localdomain ceph-mon[294160]: pgmap v13: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:50:01 np0005626463.localdomain ceph-mon[294160]: from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' 
Feb 23 09:50:01 np0005626463.localdomain ceph-mon[294160]: from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' 
Feb 23 09:50:01 np0005626463.localdomain ceph-mon[294160]: from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' 
Feb 23 09:50:01 np0005626463.localdomain ceph-mon[294160]: from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' 
Feb 23 09:50:01 np0005626463.localdomain ceph-mon[294160]: Reconfiguring mon.np0005626466 (monmap changed)...
Feb 23 09:50:01 np0005626463.localdomain ceph-mon[294160]: from='mgr.26638 172.18.0.108:0/2769928129' entity='mgr.np0005626466.nisqfq' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Feb 23 09:50:01 np0005626463.localdomain ceph-mon[294160]: from='mgr.26638 172.18.0.108:0/2769928129' entity='mgr.np0005626466.nisqfq' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Feb 23 09:50:01 np0005626463.localdomain ceph-mon[294160]: from='mgr.26638 172.18.0.108:0/2769928129' entity='mgr.np0005626466.nisqfq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:50:01 np0005626463.localdomain ceph-mon[294160]: Reconfiguring daemon mon.np0005626466 on np0005626466.localdomain
Feb 23 09:50:01 np0005626463.localdomain ceph-mon[294160]: from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' 
Feb 23 09:50:01 np0005626463.localdomain ceph-mon[294160]: from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' 
Feb 23 09:50:01 np0005626463.localdomain ceph-mon[294160]: from='mgr.26638 172.18.0.108:0/2769928129' entity='mgr.np0005626466.nisqfq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:50:01 np0005626463.localdomain ceph-mon[294160]: from='mgr.26638 172.18.0.108:0/2769928129' entity='mgr.np0005626466.nisqfq' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 23 09:50:01 np0005626463.localdomain ceph-mon[294160]: from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' 
Feb 23 09:50:01 np0005626463.localdomain ceph-mon[294160]: from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' 
Feb 23 09:50:01 np0005626463.localdomain sudo[305856]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 09:50:01 np0005626463.localdomain sudo[305856]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:50:01 np0005626463.localdomain sudo[305856]: pam_unix(sudo:session): session closed for user root
Feb 23 09:50:01 np0005626463.localdomain sudo[305874]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 09:50:01 np0005626463.localdomain sudo[305874]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:50:01 np0005626463.localdomain sudo[305874]: pam_unix(sudo:session): session closed for user root
Feb 23 09:50:01 np0005626463.localdomain sudo[305892]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid f1fea371-cb69-578d-a3d0-b5c472a84b46
Feb 23 09:50:01 np0005626463.localdomain sudo[305892]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:50:02 np0005626463.localdomain podman[305928]: 
Feb 23 09:50:02 np0005626463.localdomain podman[305928]: 2026-02-23 09:50:02.431607397 +0000 UTC m=+0.083866324 container create 33fd161f6b5f5cc101635e6b730c27bb93d088aeabccce0eef145f2117fe3f68 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=jovial_wilson, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, io.openshift.tags=rhceph ceph, version=7, name=rhceph, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.42.2, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., io.openshift.expose-services=, vcs-type=git, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, release=1770267347, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, build-date=2026-02-09T10:25:24Z, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, ceph=True, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.created=2026-02-09T10:25:24Z)
Feb 23 09:50:02 np0005626463.localdomain systemd[1]: Started libpod-conmon-33fd161f6b5f5cc101635e6b730c27bb93d088aeabccce0eef145f2117fe3f68.scope.
Feb 23 09:50:02 np0005626463.localdomain podman[305928]: 2026-02-23 09:50:02.398743296 +0000 UTC m=+0.051002263 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 23 09:50:02 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 09:50:02 np0005626463.localdomain ceph-mon[294160]: from='mgr.26638 172.18.0.108:0/2769928129' entity='mgr.np0005626466.nisqfq' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 23 09:50:02 np0005626463.localdomain ceph-mon[294160]: from='mgr.26638 172.18.0.108:0/2769928129' entity='mgr.np0005626466.nisqfq' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Feb 23 09:50:02 np0005626463.localdomain ceph-mon[294160]: from='mgr.26638 172.18.0.108:0/2769928129' entity='mgr.np0005626466.nisqfq' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Feb 23 09:50:02 np0005626463.localdomain ceph-mon[294160]: from='mgr.26638 172.18.0.108:0/2769928129' entity='mgr.np0005626466.nisqfq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:50:02 np0005626463.localdomain podman[305928]: 2026-02-23 09:50:02.516967185 +0000 UTC m=+0.169226112 container init 33fd161f6b5f5cc101635e6b730c27bb93d088aeabccce0eef145f2117fe3f68 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=jovial_wilson, com.redhat.component=rhceph-container, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, ceph=True, GIT_CLEAN=True, RELEASE=main, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1770267347, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-02-09T10:25:24Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.42.2, distribution-scope=public, io.openshift.expose-services=, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, architecture=x86_64)
Feb 23 09:50:02 np0005626463.localdomain podman[305928]: 2026-02-23 09:50:02.562533612 +0000 UTC m=+0.214792539 container start 33fd161f6b5f5cc101635e6b730c27bb93d088aeabccce0eef145f2117fe3f68 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=jovial_wilson, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.42.2, RELEASE=main, release=1770267347, io.openshift.expose-services=, build-date=2026-02-09T10:25:24Z, description=Red Hat Ceph Storage 7, ceph=True, vendor=Red Hat, Inc., org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, version=7, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, com.redhat.component=rhceph-container, org.opencontainers.image.created=2026-02-09T10:25:24Z)
Feb 23 09:50:02 np0005626463.localdomain podman[305928]: 2026-02-23 09:50:02.563050197 +0000 UTC m=+0.215309174 container attach 33fd161f6b5f5cc101635e6b730c27bb93d088aeabccce0eef145f2117fe3f68 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=jovial_wilson, vendor=Red Hat, Inc., io.openshift.expose-services=, release=1770267347, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, vcs-type=git, io.openshift.tags=rhceph ceph, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.42.2, build-date=2026-02-09T10:25:24Z, architecture=x86_64, version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, distribution-scope=public, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, org.opencontainers.image.created=2026-02-09T10:25:24Z)
Feb 23 09:50:02 np0005626463.localdomain jovial_wilson[305943]: 167 167
Feb 23 09:50:02 np0005626463.localdomain systemd[1]: libpod-33fd161f6b5f5cc101635e6b730c27bb93d088aeabccce0eef145f2117fe3f68.scope: Deactivated successfully.
Feb 23 09:50:02 np0005626463.localdomain podman[305928]: 2026-02-23 09:50:02.566534693 +0000 UTC m=+0.218793620 container died 33fd161f6b5f5cc101635e6b730c27bb93d088aeabccce0eef145f2117fe3f68 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=jovial_wilson, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, version=7, io.openshift.expose-services=, build-date=2026-02-09T10:25:24Z, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, io.openshift.tags=rhceph ceph, io.buildah.version=1.42.2, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, GIT_BRANCH=main, release=1770267347, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2026-02-09T10:25:24Z)
Feb 23 09:50:02 np0005626463.localdomain podman[305948]: 2026-02-23 09:50:02.674784818 +0000 UTC m=+0.091691952 container remove 33fd161f6b5f5cc101635e6b730c27bb93d088aeabccce0eef145f2117fe3f68 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=jovial_wilson, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, GIT_CLEAN=True, org.opencontainers.image.created=2026-02-09T10:25:24Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-02-09T10:25:24Z, io.k8s.description=Red Hat Ceph Storage 7, release=1770267347, name=rhceph, io.openshift.tags=rhceph ceph, distribution-scope=public, vcs-type=git, RELEASE=main, io.buildah.version=1.42.2, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc.)
Feb 23 09:50:02 np0005626463.localdomain systemd[1]: libpod-conmon-33fd161f6b5f5cc101635e6b730c27bb93d088aeabccce0eef145f2117fe3f68.scope: Deactivated successfully.
Feb 23 09:50:02 np0005626463.localdomain sudo[305892]: pam_unix(sudo:session): session closed for user root
Feb 23 09:50:02 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626463.localdomain.devices.0}] v 0)
Feb 23 09:50:02 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' 
Feb 23 09:50:02 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626463.localdomain}] v 0)
Feb 23 09:50:02 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' 
Feb 23 09:50:03 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.
Feb 23 09:50:03 np0005626463.localdomain podman[305965]: 2026-02-23 09:50:03.419844205 +0000 UTC m=+0.091861417 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216)
Feb 23 09:50:03 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-87234b8af18ff0dfc55697c9b7c7ba4949dd82605185e6b8db0eda3121ff4a5a-merged.mount: Deactivated successfully.
Feb 23 09:50:03 np0005626463.localdomain podman[305965]: 2026-02-23 09:50:03.455286254 +0000 UTC m=+0.127303456 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, managed_by=edpm_ansible)
Feb 23 09:50:03 np0005626463.localdomain systemd[1]: 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully.
Feb 23 09:50:03 np0005626463.localdomain ceph-mon[294160]: Reconfiguring mon.np0005626463 (monmap changed)...
Feb 23 09:50:03 np0005626463.localdomain ceph-mon[294160]: Reconfiguring daemon mon.np0005626463 on np0005626463.localdomain
Feb 23 09:50:03 np0005626463.localdomain ceph-mon[294160]: pgmap v14: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:50:03 np0005626463.localdomain ceph-mon[294160]: from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' 
Feb 23 09:50:03 np0005626463.localdomain ceph-mon[294160]: from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' 
Feb 23 09:50:03 np0005626463.localdomain ceph-mon[294160]: from='mgr.26638 172.18.0.108:0/2769928129' entity='mgr.np0005626466.nisqfq' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Feb 23 09:50:03 np0005626463.localdomain ceph-mon[294160]: from='mgr.26638 172.18.0.108:0/2769928129' entity='mgr.np0005626466.nisqfq' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Feb 23 09:50:03 np0005626463.localdomain ceph-mon[294160]: from='mgr.26638 172.18.0.108:0/2769928129' entity='mgr.np0005626466.nisqfq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:50:03 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626465.localdomain.devices.0}] v 0)
Feb 23 09:50:03 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' 
Feb 23 09:50:03 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626465.localdomain}] v 0)
Feb 23 09:50:03 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' 
Feb 23 09:50:04 np0005626463.localdomain ceph-mon[294160]: Reconfiguring mon.np0005626465 (monmap changed)...
Feb 23 09:50:04 np0005626463.localdomain ceph-mon[294160]: Reconfiguring daemon mon.np0005626465 on np0005626465.localdomain
Feb 23 09:50:04 np0005626463.localdomain ceph-mon[294160]: from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' 
Feb 23 09:50:04 np0005626463.localdomain ceph-mon[294160]: from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' 
Feb 23 09:50:04 np0005626463.localdomain ceph-mon[294160]: pgmap v15: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:50:04 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/1366015865' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 23 09:50:04 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/1366015865' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 23 09:50:06 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e85 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:50:06 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Feb 23 09:50:06 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : mgrmap e39: np0005626466.nisqfq(active, since 25s), standbys: np0005626465.hlpkwo, np0005626463.wtksup
Feb 23 09:50:06 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' 
Feb 23 09:50:06 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:50:06.394 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:50:06 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:50:06.396 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:50:06 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:50:06.396 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 09:50:06 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:50:06.396 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:50:06 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:50:06.397 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:50:06 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:50:06.398 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:50:07 np0005626463.localdomain ceph-mon[294160]: pgmap v16: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:50:07 np0005626463.localdomain ceph-mon[294160]: mgrmap e39: np0005626466.nisqfq(active, since 25s), standbys: np0005626465.hlpkwo, np0005626463.wtksup
Feb 23 09:50:07 np0005626463.localdomain ceph-mon[294160]: from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' 
Feb 23 09:50:09 np0005626463.localdomain ceph-mon[294160]: pgmap v17: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:50:09 np0005626463.localdomain podman[242954]: time="2026-02-23T09:50:09Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 23 09:50:09 np0005626463.localdomain podman[242954]: @ - - [23/Feb/2026:09:50:09 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155258 "" "Go-http-client/1.1"
Feb 23 09:50:09 np0005626463.localdomain podman[242954]: @ - - [23/Feb/2026:09:50:09 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18291 "" "Go-http-client/1.1"
Feb 23 09:50:11 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e85 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:50:11 np0005626463.localdomain ceph-mon[294160]: pgmap v18: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:50:11 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:50:11.402 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:50:11 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:50:11.404 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:50:11 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:50:11.404 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 09:50:11 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:50:11.404 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:50:11 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:50:11.429 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:50:11 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:50:11.429 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:50:12 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.
Feb 23 09:50:12 np0005626463.localdomain podman[305984]: 2026-02-23 09:50:12.915776172 +0000 UTC m=+0.091324761 container health_status da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 23 09:50:12 np0005626463.localdomain podman[305984]: 2026-02-23 09:50:12.934205082 +0000 UTC m=+0.109753681 container exec_died da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 23 09:50:12 np0005626463.localdomain systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Deactivated successfully.
Feb 23 09:50:13 np0005626463.localdomain ceph-mon[294160]: pgmap v19: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:50:13 np0005626463.localdomain openstack_network_exporter[245358]: ERROR   09:50:13 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 23 09:50:13 np0005626463.localdomain openstack_network_exporter[245358]: 
Feb 23 09:50:13 np0005626463.localdomain openstack_network_exporter[245358]: ERROR   09:50:13 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 23 09:50:13 np0005626463.localdomain openstack_network_exporter[245358]: 
Feb 23 09:50:14 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:50:14.054 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:50:14 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:50:14.055 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 23 09:50:14 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:50:14.055 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 23 09:50:14 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.107:0/1942676297' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 09:50:14 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.108:0/2417706239' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 09:50:14 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.107:0/1873174007' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 09:50:14 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.108:0/3122146212' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 09:50:14 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:50:14.913 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 23 09:50:14 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:50:14.913 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquired lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 23 09:50:14 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:50:14.914 282211 DEBUG nova.network.neutron [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 23 09:50:14 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:50:14.914 282211 DEBUG nova.objects.instance [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c2a7d92b-952f-46a7-8a6a-3322a48fcf4b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 23 09:50:15 np0005626463.localdomain ceph-mon[294160]: pgmap v20: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:50:15 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:50:15.458 282211 DEBUG nova.network.neutron [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Updating instance_info_cache with network_info: [{"id": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "address": "fa:16:3e:a0:9d:00", "network": {"id": "9da5b53d-3184-450f-9a5b-bdba1a6c9f6d", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "37b8098efb0d4ecc90b451a2db0e966f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa27e5011-20", "ovs_interfaceid": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 23 09:50:15 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:50:15.478 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Releasing lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 23 09:50:15 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:50:15.478 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 23 09:50:15 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.
Feb 23 09:50:15 np0005626463.localdomain podman[306006]: 2026-02-23 09:50:15.918403847 +0000 UTC m=+0.080894004 container health_status 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, release=1770267347, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., config_id=openstack_network_exporter, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': 
['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, build-date=2026-02-05T04:57:10Z, container_name=openstack_network_exporter, name=ubi9/ubi-minimal, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, architecture=x86_64, io.openshift.tags=minimal rhel9, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, org.opencontainers.image.created=2026-02-05T04:57:10Z, com.redhat.component=ubi9-minimal-container, version=9.7)
Feb 23 09:50:15 np0005626463.localdomain podman[306006]: 2026-02-23 09:50:15.957655391 +0000 UTC m=+0.120145558 container exec_died 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, build-date=2026-02-05T04:57:10Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, name=ubi9/ubi-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, container_name=openstack_network_exporter, release=1770267347, architecture=x86_64, io.buildah.version=1.33.7, io.openshift.expose-services=, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2026-02-05T04:57:10Z, config_id=openstack_network_exporter, distribution-scope=public, managed_by=edpm_ansible)
Feb 23 09:50:15 np0005626463.localdomain systemd[1]: 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.service: Deactivated successfully.
Feb 23 09:50:16 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:50:16.054 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:50:16 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e85 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:50:16 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:50:16.430 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:50:16 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:50:16.432 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:50:16 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:50:16.432 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 09:50:16 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:50:16.432 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:50:16 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:50:16.465 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:50:16 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:50:16.465 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:50:17 np0005626463.localdomain ceph-mon[294160]: pgmap v21: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:50:19 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:50:19.054 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:50:19 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:50:19.055 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:50:19 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:50:19.055 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 23 09:50:19 np0005626463.localdomain ceph-mon[294160]: pgmap v22: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:50:20 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:50:20.050 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:50:20 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:50:20.053 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:50:21 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:50:21.055 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:50:21 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e85 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:50:21 np0005626463.localdomain ceph-mon[294160]: pgmap v23: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:50:21 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:50:21.466 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:50:21 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:50:21.468 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:50:21 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:50:21.468 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 09:50:21 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:50:21.469 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:50:21 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:50:21.500 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:50:21 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:50:21.500 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:50:22 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:50:22.054 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:50:22 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:50:22.054 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:50:22 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:50:22.080 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 09:50:22 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:50:22.080 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 09:50:22 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:50:22.081 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 09:50:22 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:50:22.081 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Auditing locally available compute resources for np0005626463.localdomain (node: np0005626463.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 23 09:50:22 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:50:22.081 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 09:50:22 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 23 09:50:22 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/1980972140' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 09:50:22 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:50:22.530 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.448s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 09:50:22 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:50:22.590 282211 DEBUG nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 23 09:50:22 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:50:22.590 282211 DEBUG nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 23 09:50:22 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:50:22.834 282211 WARNING nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 23 09:50:22 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:50:22.836 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Hypervisor/Node resource view: name=np0005626463.localdomain free_ram=11733MB free_disk=41.8366584777832GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 23 09:50:22 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:50:22.836 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 09:50:22 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:50:22.837 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 09:50:22 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:50:22.949 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Instance c2a7d92b-952f-46a7-8a6a-3322a48fcf4b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 23 09:50:22 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:50:22.949 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 23 09:50:22 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:50:22.950 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Final resource view: name=np0005626463.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 23 09:50:22 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:50:22.999 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 09:50:23 np0005626463.localdomain ceph-mon[294160]: pgmap v24: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:50:23 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.106:0/1980972140' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 09:50:23 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 23 09:50:23 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/2213743575' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 09:50:23 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:50:23.454 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.456s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 09:50:23 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:50:23.461 282211 DEBUG nova.compute.provider_tree [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Inventory has not changed in ProviderTree for provider: be63d86c-a403-4ec9-a515-07ea2962cb4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 23 09:50:23 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:50:23.479 282211 DEBUG nova.scheduler.client.report [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Inventory has not changed for provider be63d86c-a403-4ec9-a515-07ea2962cb4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 23 09:50:23 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:50:23.481 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Compute_service record updated for np0005626463.localdomain:np0005626463.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 23 09:50:23 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:50:23.482 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.645s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 09:50:24 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.106:0/2213743575' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 09:50:25 np0005626463.localdomain ceph-mon[294160]: pgmap v25: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:50:26 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e85 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:50:26 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:50:26.535 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:50:26 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:50:26.537 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:50:26 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:50:26.537 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5036 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 09:50:26 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:50:26.537 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:50:26 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:50:26.538 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:50:26 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:50:26.543 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:50:26 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.
Feb 23 09:50:26 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.
Feb 23 09:50:26 np0005626463.localdomain podman[306070]: 2026-02-23 09:50:26.916680118 +0000 UTC m=+0.086299108 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_managed=true, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, container_name=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ovn_controller, org.label-schema.build-date=20260216)
Feb 23 09:50:26 np0005626463.localdomain podman[306070]: 2026-02-23 09:50:26.958305604 +0000 UTC m=+0.127924624 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, tcib_managed=true)
Feb 23 09:50:26 np0005626463.localdomain podman[306071]: 2026-02-23 09:50:26.967929537 +0000 UTC m=+0.134331859 container health_status bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Feb 23 09:50:26 np0005626463.localdomain systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully.
Feb 23 09:50:26 np0005626463.localdomain podman[306071]: 2026-02-23 09:50:26.981297534 +0000 UTC m=+0.147699896 container exec_died bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 23 09:50:26 np0005626463.localdomain systemd[1]: bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.service: Deactivated successfully.
Feb 23 09:50:27 np0005626463.localdomain ceph-mon[294160]: pgmap v26: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:50:29 np0005626463.localdomain sshd[306118]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:50:29 np0005626463.localdomain ceph-mon[294160]: pgmap v27: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:50:30 np0005626463.localdomain sshd[306118]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:50:30 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.
Feb 23 09:50:30 np0005626463.localdomain podman[306120]: 2026-02-23 09:50:30.423187438 +0000 UTC m=+0.083181151 container health_status be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 23 09:50:30 np0005626463.localdomain podman[306120]: 2026-02-23 09:50:30.431958705 +0000 UTC m=+0.091952378 container exec_died be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, container_name=ceilometer_agent_compute, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, config_id=ceilometer_agent_compute, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible)
Feb 23 09:50:30 np0005626463.localdomain systemd[1]: be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.service: Deactivated successfully.
Feb 23 09:50:31 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e85 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:50:31 np0005626463.localdomain ceph-mon[294160]: pgmap v28: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:50:31 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:50:31.544 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:50:31 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:50:31.546 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:50:31 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:50:31.546 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 09:50:31 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:50:31.546 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:50:31 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:50:31.579 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:50:31 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:50:31.580 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:50:33 np0005626463.localdomain ceph-mon[294160]: pgmap v29: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:50:33 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.
Feb 23 09:50:33 np0005626463.localdomain podman[306139]: 2026-02-23 09:50:33.911324019 +0000 UTC m=+0.086839754 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Feb 23 09:50:33 np0005626463.localdomain podman[306139]: 2026-02-23 09:50:33.917209318 +0000 UTC m=+0.092725093 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb)
Feb 23 09:50:33 np0005626463.localdomain systemd[1]: 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully.
Feb 23 09:50:35 np0005626463.localdomain ceph-mon[294160]: pgmap v30: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:50:36 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e85 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:50:36 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:50:36.581 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:50:36 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:50:36.583 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:50:36 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:50:36.584 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 09:50:36 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:50:36.584 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:50:36 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:50:36.604 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:50:36 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:50:36.605 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:50:37 np0005626463.localdomain ceph-mon[294160]: pgmap v31: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:50:37 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.200:0/2126719983' entity='client.admin' cmd={"prefix": "status", "format": "json"} : dispatch
Feb 23 09:50:39 np0005626463.localdomain podman[242954]: time="2026-02-23T09:50:39Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 23 09:50:39 np0005626463.localdomain podman[242954]: @ - - [23/Feb/2026:09:50:39 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155258 "" "Go-http-client/1.1"
Feb 23 09:50:39 np0005626463.localdomain podman[242954]: @ - - [23/Feb/2026:09:50:39 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18290 "" "Go-http-client/1.1"
Feb 23 09:50:39 np0005626463.localdomain ceph-mon[294160]: pgmap v32: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:50:40 np0005626463.localdomain sshd[306157]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:50:40 np0005626463.localdomain sshd[306157]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:50:41 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e85 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:50:41 np0005626463.localdomain ceph-mon[294160]: pgmap v33: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:50:41 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:50:41.606 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:50:41 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:50:41.608 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:50:41 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:50:41.609 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 09:50:41 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:50:41.609 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:50:41 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:50:41.644 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:50:41 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:50:41.644 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:50:42 np0005626463.localdomain ceph-mon[294160]: pgmap v34: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:50:43 np0005626463.localdomain openstack_network_exporter[245358]: ERROR   09:50:43 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 23 09:50:43 np0005626463.localdomain openstack_network_exporter[245358]: 
Feb 23 09:50:43 np0005626463.localdomain openstack_network_exporter[245358]: ERROR   09:50:43 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 23 09:50:43 np0005626463.localdomain openstack_network_exporter[245358]: 
Feb 23 09:50:43 np0005626463.localdomain ceph-mon[294160]: from='client.64169 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Feb 23 09:50:43 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.
Feb 23 09:50:43 np0005626463.localdomain podman[306159]: 2026-02-23 09:50:43.913587459 +0000 UTC m=+0.086317127 container health_status da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 23 09:50:43 np0005626463.localdomain podman[306159]: 2026-02-23 09:50:43.922783351 +0000 UTC m=+0.095513009 container exec_died da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 23 09:50:43 np0005626463.localdomain systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Deactivated successfully.
Feb 23 09:50:44 np0005626463.localdomain ceph-mon[294160]: pgmap v35: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:50:46 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e85 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:50:46 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:50:46.645 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:50:46 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:50:46.647 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:50:46 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:50:46.647 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 09:50:46 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:50:46.648 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:50:46 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:50:46.678 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:50:46 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:50:46.678 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:50:46 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.
Feb 23 09:50:46 np0005626463.localdomain podman[306184]: 2026-02-23 09:50:46.902280449 +0000 UTC m=+0.078454227 container health_status 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, build-date=2026-02-05T04:57:10Z, config_id=openstack_network_exporter, name=ubi9/ubi-minimal, org.opencontainers.image.created=2026-02-05T04:57:10Z, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, managed_by=edpm_ansible, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.7, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, io.buildah.version=1.33.7, vcs-type=git, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, release=1770267347)
Feb 23 09:50:46 np0005626463.localdomain podman[306184]: 2026-02-23 09:50:46.914619557 +0000 UTC m=+0.090793335 container exec_died 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, release=1770267347, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, name=ubi9/ubi-minimal, vcs-type=git, distribution-scope=public, architecture=x86_64, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-02-05T04:57:10Z, container_name=openstack_network_exporter, build-date=2026-02-05T04:57:10Z, io.openshift.expose-services=, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.7, config_id=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 23 09:50:46 np0005626463.localdomain systemd[1]: 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.service: Deactivated successfully.
Feb 23 09:50:47 np0005626463.localdomain ceph-mon[294160]: pgmap v36: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:50:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:50:48.553 163572 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 09:50:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:50:48.553 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 09:50:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:50:48.554 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 09:50:49 np0005626463.localdomain ceph-mon[294160]: pgmap v37: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:50:50 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.200:0/4257872446' entity='client.admin' cmd={"prefix": "config dump", "format": "json"} : dispatch
Feb 23 09:50:51 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e85 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:50:51 np0005626463.localdomain ceph-mon[294160]: pgmap v38: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:50:51 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:50:51.679 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:50:51 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:50:51.680 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:50:51 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:50:51.681 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 09:50:51 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:50:51.681 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:50:51 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:50:51.709 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:50:51 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:50:51.710 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:50:53 np0005626463.localdomain ceph-mon[294160]: pgmap v39: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:50:55 np0005626463.localdomain ceph-mon[294160]: pgmap v40: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.134 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'name': 'test', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000003', 'OS-EXT-SRV-ATTR:host': 'np0005626463.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '37b8098efb0d4ecc90b451a2db0e966f', 'user_id': 'cb6895487918456aa599ca2f76872d00', 'hostId': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.135 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.165 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.bytes volume: 397312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.166 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.168 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a3a58282-47aa-44ed-b791-edcbd7927f01', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 397312, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:50:56.135412', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '262e164a-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11696.324894935, 'message_signature': 'b14f6174a7b1924bed0645e35c409950639554c73b168f152408d47a44cf3844'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:50:56.135412', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '262e2cac-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11696.324894935, 'message_signature': '8a127751fc54542f2917f4a9b25bf66d1b1d39d69bf061b9bb3c1623ecc3bffc'}]}, 'timestamp': '2026-02-23 09:50:56.166856', '_unique_id': '72c79dc885e24a5aaf75dcac365b7922'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.168 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.168 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.168 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.168 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.168 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.168 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.168 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.168 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.168 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.168 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.168 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.168 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.168 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.168 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.168 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.168 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.168 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.168 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.168 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.168 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.168 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.168 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.168 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.168 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.168 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.168 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.168 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.168 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.168 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.168 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.168 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.169 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.174 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.175 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1a8d1c5f-c9fa-4a00-b812-701a44393794', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:50:56.170095', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '262f5e6a-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11696.359595199, 'message_signature': 'cfc02a11302be6d047ec3753f003dea55d030c502fc4d41f9ebc5e9a6a691ce4'}]}, 'timestamp': '2026-02-23 09:50:56.174738', '_unique_id': '88d64e77d50a464ba82072cacb5a41f3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.175 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.175 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.175 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.175 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.175 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.175 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.175 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.175 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.175 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.175 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.175 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.175 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.175 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.175 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.175 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.175 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.175 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.175 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.175 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.175 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.175 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.175 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.175 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.175 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.175 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.175 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.175 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.175 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.175 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.175 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.175 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.176 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.177 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.178 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '225f574e-6c26-45b7-a249-66da55fe33ff', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:50:56.177130', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '262fd0c0-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11696.359595199, 'message_signature': '2b007ab4dfc8f541ba8e5e342ad9a29c9cea6cb1379f463e684d0c43557ebfed'}]}, 'timestamp': '2026-02-23 09:50:56.177646', '_unique_id': 'd4990bc6fa3e4c68a7362f1e368bd380'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.178 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.178 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.178 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.178 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.178 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.178 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.178 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.178 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.178 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.178 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.178 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.178 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.178 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.178 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.178 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.178 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.178 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.178 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.178 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.178 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.178 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.178 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.178 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.178 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.178 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.178 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.178 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.178 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.178 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.178 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.178 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.179 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.180 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.181 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3585f430-e7ec-4090-a413-13fbae5e0f99', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:50:56.179985', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '263040d2-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11696.359595199, 'message_signature': '801ea1908d7040b77258fa5fb5bc52678c398b09b16ba818d643f12f1b845340'}]}, 'timestamp': '2026-02-23 09:50:56.180486', '_unique_id': '6011db3ad72f437f911c8c25d1c63566'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.181 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.181 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.181 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.181 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.181 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.181 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.181 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.181 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.181 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.181 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.181 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.181 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.181 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.181 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.181 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.181 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.181 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.181 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.181 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.181 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.181 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.181 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.181 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.181 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.181 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.181 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.181 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.181 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.181 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.181 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.181 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.182 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.182 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.184 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd461f23e-13b9-43df-95fd-faf8f397c2f9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:50:56.182808', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '2630b04e-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11696.359595199, 'message_signature': '4b22dd8aa348e8cadd611bd1e538eca9487624f41c29b7e49c5707f7227f316a'}]}, 'timestamp': '2026-02-23 09:50:56.183343', '_unique_id': '639fedd46f9e43a4a6ca107155509544'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.184 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.184 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.184 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.184 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.184 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.184 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.184 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.184 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.184 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.184 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.184 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.184 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.184 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.184 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.184 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.184 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.184 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.184 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.184 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.184 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.184 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.184 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.184 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.184 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.184 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.184 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.184 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.184 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.184 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.184 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.184 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.185 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.185 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.186 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.187 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '49ca14c7-74c3-4480-bf6a-29d2e614478b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:50:56.185639', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '26311e9e-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11696.324894935, 'message_signature': '449256359c61936e385b3f2bea67f2aeda4ba4d7eb044f91b06182d6beb220c9'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 
'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:50:56.185639', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '2631300a-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11696.324894935, 'message_signature': '353fff40f293e529d86f1b233e7af49d383540388187fee0531c058f9f0f1703'}]}, 'timestamp': '2026-02-23 09:50:56.186609', '_unique_id': 'd315b3bf19a24c2b906f360f926723da'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.187 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.187 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.187 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.187 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.187 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.187 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.187 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.187 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.187 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.187 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.187 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.187 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.187 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.187 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.187 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.187 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.187 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.187 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.187 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.187 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.187 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.187 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.187 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.187 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.187 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.187 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.187 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.187 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.187 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.187 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.187 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.189 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.189 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.requests volume: 1283 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.189 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.191 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '65e4d908-262b-49cc-9b6b-c5d122e22dcd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1283, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:50:56.189276', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '2631aada-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11696.324894935, 'message_signature': '70ce45cb8f3f26b262933f8ef52c987a62ba5b56467a1e09b3c72220723cf623'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 
'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:50:56.189276', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '2631bdc2-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11696.324894935, 'message_signature': '9e1b7934d1d5d97b9c0153498eec1b78bfdd57c4c8d5c5d461d27cf9e01147e1'}]}, 'timestamp': '2026-02-23 09:50:56.190216', '_unique_id': '1409330b51b9451eb03e38e8d776ee8b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.191 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.191 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.191 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.191 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.191 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.191 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.191 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.191 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.191 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.191 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.191 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.191 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.191 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.191 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.191 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.191 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.191 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.191 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.191 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.191 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.191 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.191 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.191 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.191 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.191 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.191 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.191 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.191 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.191 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.191 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.191 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.192 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.192 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.bytes volume: 6808 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:50:56 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e85 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.194 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9a2f6191-9e06-4597-ada2-2798f245fe0c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6808, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:50:56.192601', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '26322cf8-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11696.359595199, 'message_signature': 'b77c788dc167fbc23cd30562abb86ac7073abea1530daefe43da56a4a99e956f'}]}, 'timestamp': '2026-02-23 09:50:56.193146', '_unique_id': 'ba3b317f3998401bb66dbdcd056e691f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.194 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.194 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.194 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.194 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.194 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.194 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.194 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.194 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.194 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.194 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.194 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.194 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.194 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.194 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.194 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.194 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.194 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.194 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.194 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.194 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.194 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.194 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.194 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.194 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.194 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.194 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.194 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.194 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.194 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.194 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.194 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.195 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.195 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.195 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.197 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '81cb3e69-bd4f-4f91-b9f8-d6434dc8c596', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:50:56.195618', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '2632a2fa-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11696.359595199, 'message_signature': '7d887917eb4338632be4343abc17297012526ef3db75beab79b8dcb8caa288ad'}]}, 'timestamp': '2026-02-23 09:50:56.196152', '_unique_id': '3117ca2c405a4788b0bbb9377620481e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.197 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.197 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.197 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.197 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.197 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.197 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.197 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.197 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.197 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.197 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.197 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.197 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.197 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.197 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.197 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.197 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.197 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.197 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.197 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.197 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.197 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.197 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.197 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.197 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.197 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.197 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.197 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.197 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.197 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.197 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.197 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.198 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.198 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.latency volume: 1374424344 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.199 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.latency volume: 89322858 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.200 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd37ce054-8812-410e-ada7-49e55e58a15e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1374424344, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:50:56.198469', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '263314c4-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11696.324894935, 'message_signature': '441594391873d8ce07afad15de5693bb22a3a437a302cd991149b625691c1401'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 89322858, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 
'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:50:56.198469', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '2633287e-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11696.324894935, 'message_signature': 'cc8e2cd3ca226d4af2d525c8d52768b0b4505fbb87cb533f2b8a538a85ee830b'}]}, 'timestamp': '2026-02-23 09:50:56.199507', '_unique_id': 'f584abf51f81444f8dbd3c87f59f97fc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.200 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.200 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.200 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.200 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.200 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.200 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.200 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.200 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.200 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.200 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.200 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.200 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.200 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.200 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.200 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.200 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.200 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.200 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.200 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.200 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.200 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.200 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.200 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.200 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.200 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.200 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.200 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.200 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.200 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.200 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.200 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.201 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.201 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.203 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8cb80aaf-ea99-4654-9cb4-d0c177e2795d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:50:56.201815', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '2633967e-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11696.359595199, 'message_signature': '3dca02fb5375d9fa62ce8cba61c27ced8bb17d9449990f7fb3cd0e55421a7f06'}]}, 'timestamp': '2026-02-23 09:50:56.202338', '_unique_id': '6e1d930d2a6d4d36b088edb3c837814f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.203 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.203 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.203 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.203 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.203 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.203 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.203 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.203 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.203 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.203 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.203 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.203 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.203 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.203 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.203 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.203 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.203 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.203 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.203 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.203 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.203 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.203 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.203 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.203 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.203 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.203 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.203 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.203 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.203 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.203 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.203 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.204 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.204 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.217 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.217 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.219 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '133bacfd-f0ec-4760-89f5-cbf32f75c347', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:50:56.204818', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '2635e7a8-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11696.394350654, 'message_signature': '08a640b5173a8b4a2447333a6340f4f87e54b3a0d0030811b1a2aa6025de3211'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:50:56.204818', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '2635f8ec-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11696.394350654, 'message_signature': '0b9edb8f941872fe2a7b0751d590e62b07e85451a6e2e96c86526f5b4cb5b596'}]}, 'timestamp': '2026-02-23 09:50:56.217990', '_unique_id': 'ca729a09df8f41c7a300e719d86bc2d9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.219 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.219 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.219 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.219 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.219 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.219 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.219 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.219 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.219 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.219 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.219 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.219 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.219 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.219 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.219 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.219 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.219 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.219 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.219 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.219 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.219 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.219 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.219 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.219 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.219 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.219 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.219 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.219 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.219 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.219 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.219 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.220 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.220 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.220 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.222 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c4b95441-09a7-4159-a11e-b3f5eb97c112', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:50:56.220332', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '263667d2-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11696.394350654, 'message_signature': 'cea25f0e81f0ab9b292a597f1402d3d7db656bca0c90616b713e05dd91299cf4'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:50:56.220332', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '26367b46-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11696.394350654, 'message_signature': '504c07465faef051b050869bf5e3c216557ca99881c6156bb37ad660ebc4a5b8'}]}, 'timestamp': '2026-02-23 09:50:56.221279', '_unique_id': '59fa1e86130b4462b30e43f68813798f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.222 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.222 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.222 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.222 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.222 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.222 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.222 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.222 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.222 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.222 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.222 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.222 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.222 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.222 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.222 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.222 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.222 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.222 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.222 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.222 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.222 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.222 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.222 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.222 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.222 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.222 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.222 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.222 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.222 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.222 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.222 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.223 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.223 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.223 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.225 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '55c09baa-55e2-4ce6-aa05-4b0d4b6b957e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:50:56.223721', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '2636edb0-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11696.359595199, 'message_signature': '790bed47a7d2adc94d10230301dafe6c5876761dfb27dfa5646ac04665f2ae13'}]}, 'timestamp': '2026-02-23 09:50:56.224254', '_unique_id': 'f243107820434f849e720aa28a8e137b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.225 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.225 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.225 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.225 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.225 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.225 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.225 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.225 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.225 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.225 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.225 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.225 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.225 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.225 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.225 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.225 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.225 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.225 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.225 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.225 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.225 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.225 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.225 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.225 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.225 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.225 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.225 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.225 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.225 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.225 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.225 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.226 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.226 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.227 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '99e48839-bf61-4381-bc0b-3ffe759f4e5b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:50:56.226471', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '263758fe-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11696.359595199, 'message_signature': '0f3d3ee9ff0aed03809053aea663e5c11c90de7c025e5fbc2d179560d19a63bc'}]}, 'timestamp': '2026-02-23 09:50:56.227011', '_unique_id': '1d318e8857f64565aa1d2968c175c959'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.227 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.227 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.227 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.227 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.227 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.227 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.227 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.227 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.227 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.227 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.227 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.227 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.227 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.227 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.227 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.227 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.227 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.227 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.227 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.227 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.227 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.227 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.227 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.227 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.227 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.227 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.227 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.227 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.227 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.227 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.227 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.229 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.245 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/memory.usage volume: 51.72265625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.247 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '287377ef-0c12-46f2-9023-a00898ebb660', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.72265625, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'timestamp': '2026-02-23T09:50:56.229253', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '263a531a-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11696.435351571, 'message_signature': 'd87e52e61b7c09ee926ab0bc5fb37b0208e3765fa0a78fd33c2c68c833a63093'}]}, 'timestamp': '2026-02-23 09:50:56.246476', '_unique_id': '55daba01d8464c8081fc936f39c078ce'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.247 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.247 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.247 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.247 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.247 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.247 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.247 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.247 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.247 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.247 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.247 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.247 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.247 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.247 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.247 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.247 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.247 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.247 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.247 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.247 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.247 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.247 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.247 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.247 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.247 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.247 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.247 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.247 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.247 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.247 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.247 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.248 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.248 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.250 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0d00a9a2-45c9-4367-98ed-669ae1acd688', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:50:56.248863', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '263ac41c-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11696.359595199, 'message_signature': '95c8977cda96e8a4094e5059a9d390eb0a663b6c9771f8fbb1d05655878ba538'}]}, 'timestamp': '2026-02-23 09:50:56.249379', '_unique_id': 'e507f66dc4bb4f40952b70d6c182238e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.250 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.250 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.250 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.250 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.250 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.250 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.250 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.250 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.250 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.250 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.250 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.250 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.250 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.250 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.250 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.250 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.250 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.250 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.250 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.250 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.250 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.250 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.250 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.250 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.250 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.250 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.250 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.250 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.250 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.250 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.250 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.251 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.251 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.bytes volume: 35597312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.252 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.253 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6b5266e5-9f20-4c9b-a9c8-76de0e25ae40', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35597312, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:50:56.251656', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '263b3136-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11696.324894935, 'message_signature': '57c50762be8981feff221bba8e300d5d2fd15be6031cc35a1d8a28f355566f05'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:50:56.251656', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '263b427a-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11696.324894935, 'message_signature': '6b8d60b837695dd5e6ca7e9249dabce01d59bdaf9cce94df23ca9cd715a083c5'}]}, 'timestamp': '2026-02-23 09:50:56.252583', '_unique_id': '3966151dbd904fdfbf99c0fca8b945bb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.253 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.253 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.253 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.253 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.253 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.253 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.253 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.253 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.253 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.253 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.253 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.253 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.253 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.253 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.253 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.253 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.253 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.253 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.253 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.253 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.253 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.253 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.253 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.253 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.253 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.253 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.253 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.253 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.253 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.253 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.253 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.254 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.254 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/cpu volume: 12010000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.256 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b4261036-a3bc-4ea7-86a0-0f39378a823c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 12010000000, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'timestamp': '2026-02-23T09:50:56.254939', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '263bb0ac-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11696.435351571, 'message_signature': '1a5ba7625224179ca8a70c0f2add09c4300778f144d1eb4cf197015e89a460f6'}]}, 'timestamp': '2026-02-23 09:50:56.255422', '_unique_id': '134f432099aa4457a413ae858f0351e3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.256 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.256 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.256 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.256 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.256 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.256 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.256 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.256 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.256 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.256 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.256 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.256 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.256 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.256 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.256 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.256 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.256 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.256 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.256 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.256 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.256 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.256 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.256 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.256 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.256 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.256 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.256 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.256 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.256 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.256 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.256 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.257 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.257 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.latency volume: 1054797520 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.258 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.latency volume: 21338362 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.259 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1c1cdcb3-31da-4cb5-8802-b961dde47dff', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1054797520, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:50:56.257725', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '263c1d94-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11696.324894935, 'message_signature': '355d574f8780627a375794b8a15cfa0e3fe3d00c44cb5b2d87dccf66ed4ed59d'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 21338362, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:50:56.257725', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '263c2fa0-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11696.324894935, 'message_signature': '4dadbf8684b2996d70d98bf04e1b07f49df49030225d901a940f8472f9d0144a'}]}, 'timestamp': '2026-02-23 09:50:56.258683', '_unique_id': '8ab393d45687486eb4eda2a926dbf40d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.259 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.259 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.259 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.259 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.259 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.259 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.259 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.259 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.259 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.259 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.259 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.259 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.259 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.259 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.259 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.259 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.259 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.259 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.259 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.259 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.259 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.259 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.259 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.259 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.259 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.259 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.259 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.259 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.259 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.259 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.259 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.260 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.261 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.261 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.262 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a958edb9-ba3f-4d96-b8a0-8c2d2ee9ab3a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:50:56.261012', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '263c9d96-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11696.394350654, 'message_signature': '5631222fe4bd04d006772f3c006389eddfa2850c325a3ed390008e5fe8d68785'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:50:56.261012', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '263cae6c-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11696.394350654, 'message_signature': '68c92ecb5f75194ebc39bfff1ed149b6de0926d34ff3ec8c3533251feb7840d0'}]}, 'timestamp': '2026-02-23 09:50:56.261933', '_unique_id': 'f965329f0fa041a98b0ad1dbb04536e1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.262 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.262 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.262 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.262 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.262 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.262 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.262 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.262 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.262 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.262 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.262 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.262 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.262 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.262 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.262 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.262 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.262 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.262 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.262 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.262 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.262 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.262 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.262 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.262 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.262 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.262 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.262 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.262 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.262 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.262 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.262 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:50:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.264 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 23 09:50:56 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:50:56.710 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:50:56 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:50:56.742 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:50:56 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:50:56.743 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5032 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 09:50:56 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:50:56.743 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:50:56 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:50:56.743 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:50:56 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:50:56.746 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:50:57 np0005626463.localdomain ceph-mon[294160]: pgmap v41: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:50:57 np0005626463.localdomain ceph-mon[294160]: from='client.54241 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Feb 23 09:50:57 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.
Feb 23 09:50:57 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.
Feb 23 09:50:57 np0005626463.localdomain systemd[1]: tmp-crun.BDjUDM.mount: Deactivated successfully.
Feb 23 09:50:57 np0005626463.localdomain podman[306203]: 2026-02-23 09:50:57.922995747 +0000 UTC m=+0.094599731 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, tcib_build_tag=8419493e1fd846703d277695e03fc5eb)
Feb 23 09:50:58 np0005626463.localdomain podman[306204]: 2026-02-23 09:50:58.009045905 +0000 UTC m=+0.175237533 container health_status bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Feb 23 09:50:58 np0005626463.localdomain podman[306204]: 2026-02-23 09:50:58.02126236 +0000 UTC m=+0.187453968 container exec_died bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 23 09:50:58 np0005626463.localdomain systemd[1]: bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.service: Deactivated successfully.
Feb 23 09:50:58 np0005626463.localdomain podman[306203]: 2026-02-23 09:50:58.062942778 +0000 UTC m=+0.234546732 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.43.0)
Feb 23 09:50:58 np0005626463.localdomain systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully.
Feb 23 09:50:58 np0005626463.localdomain systemd[1]: tmp-crun.Q1n8DA.mount: Deactivated successfully.
Feb 23 09:50:59 np0005626463.localdomain ceph-mon[294160]: pgmap v42: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:51:00 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.
Feb 23 09:51:00 np0005626463.localdomain podman[306251]: 2026-02-23 09:51:00.908852639 +0000 UTC m=+0.083346656 container health_status be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.43.0, tcib_managed=true, org.label-schema.license=GPLv2, 
org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute)
Feb 23 09:51:00 np0005626463.localdomain podman[306251]: 2026-02-23 09:51:00.921248239 +0000 UTC m=+0.095742236 container exec_died be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, 
container_name=ceilometer_agent_compute)
Feb 23 09:51:00 np0005626463.localdomain systemd[1]: be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.service: Deactivated successfully.
Feb 23 09:51:01 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e85 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:51:01 np0005626463.localdomain ceph-mon[294160]: pgmap v43: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:51:01 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:51:01.747 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:51:01 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:51:01.750 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:51:01 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:51:01.750 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 09:51:01 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:51:01.750 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:51:01 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:51:01.790 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:51:01 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:51:01.791 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:51:03 np0005626463.localdomain sshd[306270]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:51:03 np0005626463.localdomain ceph-mon[294160]: pgmap v44: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:51:03 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.200:0/3625719757' entity='client.admin' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 09:51:03 np0005626463.localdomain sudo[306272]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 09:51:03 np0005626463.localdomain sudo[306272]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:51:03 np0005626463.localdomain sudo[306272]: pam_unix(sudo:session): session closed for user root
Feb 23 09:51:03 np0005626463.localdomain sudo[306290]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 23 09:51:03 np0005626463.localdomain sudo[306290]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:51:04 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/4258538050' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 23 09:51:04 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/4258538050' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 23 09:51:04 np0005626463.localdomain sudo[306290]: pam_unix(sudo:session): session closed for user root
Feb 23 09:51:04 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Feb 23 09:51:04 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' 
Feb 23 09:51:04 np0005626463.localdomain sudo[306341]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 09:51:04 np0005626463.localdomain sudo[306341]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:51:04 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.
Feb 23 09:51:04 np0005626463.localdomain sudo[306341]: pam_unix(sudo:session): session closed for user root
Feb 23 09:51:04 np0005626463.localdomain podman[306359]: 2026-02-23 09:51:04.795937861 +0000 UTC m=+0.086903545 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_managed=true, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260216, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Feb 23 09:51:04 np0005626463.localdomain podman[306359]: 2026-02-23 09:51:04.802528873 +0000 UTC m=+0.093494557 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true)
Feb 23 09:51:04 np0005626463.localdomain systemd[1]: 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully.
Feb 23 09:51:04 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mgr fail"} v 0)
Feb 23 09:51:04 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='client.? ' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch
Feb 23 09:51:04 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e85 do_prune osdmap full prune enabled
Feb 23 09:51:04 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [INF] : Activating manager daemon np0005626465.hlpkwo
Feb 23 09:51:04 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e86 e86: 6 total, 6 up, 6 in
Feb 23 09:51:04 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e86: 6 total, 6 up, 6 in
Feb 23 09:51:04 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='client.? ' entity='client.admin' cmd='[{"prefix": "mgr fail"}]': finished
Feb 23 09:51:04 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : mgrmap e40: np0005626465.hlpkwo(active, starting, since 0.0333241s), standbys: np0005626463.wtksup
Feb 23 09:51:05 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [INF] : Manager daemon np0005626465.hlpkwo is now available
Feb 23 09:51:05 np0005626463.localdomain sshd[304449]: pam_unix(sshd:session): session closed for user ceph-admin
Feb 23 09:51:05 np0005626463.localdomain systemd-logind[759]: Session 70 logged out. Waiting for processes to exit.
Feb 23 09:51:05 np0005626463.localdomain systemd[1]: session-70.scope: Deactivated successfully.
Feb 23 09:51:05 np0005626463.localdomain systemd[1]: session-70.scope: Consumed 10.431s CPU time.
Feb 23 09:51:05 np0005626463.localdomain systemd-logind[759]: Removed session 70.
Feb 23 09:51:05 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005626465.hlpkwo/mirror_snapshot_schedule"} v 0)
Feb 23 09:51:05 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005626465.hlpkwo/mirror_snapshot_schedule"} : dispatch
Feb 23 09:51:05 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005626465.hlpkwo/trash_purge_schedule"} v 0)
Feb 23 09:51:05 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005626465.hlpkwo/trash_purge_schedule"} : dispatch
Feb 23 09:51:05 np0005626463.localdomain sshd[306377]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:51:05 np0005626463.localdomain sshd[306377]: Accepted publickey for ceph-admin from 192.168.122.107 port 52926 ssh2: RSA SHA256:Xa/VMkXtB77nHz5d33Gpc1SPjvrShbbTtqHwAtI7vJo
Feb 23 09:51:05 np0005626463.localdomain systemd-logind[759]: New session 71 of user ceph-admin.
Feb 23 09:51:05 np0005626463.localdomain systemd[1]: Started Session 71 of User ceph-admin.
Feb 23 09:51:05 np0005626463.localdomain sshd[306377]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Feb 23 09:51:05 np0005626463.localdomain sudo[306381]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 09:51:05 np0005626463.localdomain sudo[306381]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:51:05 np0005626463.localdomain sudo[306381]: pam_unix(sudo:session): session closed for user root
Feb 23 09:51:05 np0005626463.localdomain ceph-mon[294160]: pgmap v45: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:51:05 np0005626463.localdomain ceph-mon[294160]: from='mgr.26638 172.18.0.108:0/2769928129' entity='mgr.np0005626466.nisqfq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:51:05 np0005626463.localdomain ceph-mon[294160]: from='mgr.26638 172.18.0.108:0/2769928129' entity='mgr.np0005626466.nisqfq' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 23 09:51:05 np0005626463.localdomain ceph-mon[294160]: from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' 
Feb 23 09:51:05 np0005626463.localdomain ceph-mon[294160]: from='mgr.26638 172.18.0.108:0/2769928129' entity='mgr.np0005626466.nisqfq' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 23 09:51:05 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.200:0/2649255566' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch
Feb 23 09:51:05 np0005626463.localdomain ceph-mon[294160]: from='client.? ' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch
Feb 23 09:51:05 np0005626463.localdomain ceph-mon[294160]: Activating manager daemon np0005626465.hlpkwo
Feb 23 09:51:05 np0005626463.localdomain ceph-mon[294160]: osdmap e86: 6 total, 6 up, 6 in
Feb 23 09:51:05 np0005626463.localdomain ceph-mon[294160]: from='client.? ' entity='client.admin' cmd='[{"prefix": "mgr fail"}]': finished
Feb 23 09:51:05 np0005626463.localdomain ceph-mon[294160]: mgrmap e40: np0005626465.hlpkwo(active, starting, since 0.0333241s), standbys: np0005626463.wtksup
Feb 23 09:51:05 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "mon metadata", "id": "np0005626463"} : dispatch
Feb 23 09:51:05 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "mon metadata", "id": "np0005626465"} : dispatch
Feb 23 09:51:05 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "mon metadata", "id": "np0005626466"} : dispatch
Feb 23 09:51:05 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "mds metadata", "who": "mds.np0005626465.drvnoy"} : dispatch
Feb 23 09:51:05 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "mds metadata", "who": "mds.np0005626466.vaywlp"} : dispatch
Feb 23 09:51:05 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "mds metadata", "who": "mds.np0005626463.qcthuc"} : dispatch
Feb 23 09:51:05 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "mgr metadata", "who": "np0005626465.hlpkwo", "id": "np0005626465.hlpkwo"} : dispatch
Feb 23 09:51:05 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "mgr metadata", "who": "np0005626463.wtksup", "id": "np0005626463.wtksup"} : dispatch
Feb 23 09:51:05 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "osd metadata", "id": 0} : dispatch
Feb 23 09:51:05 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "osd metadata", "id": 1} : dispatch
Feb 23 09:51:05 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "osd metadata", "id": 2} : dispatch
Feb 23 09:51:05 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "osd metadata", "id": 3} : dispatch
Feb 23 09:51:05 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "osd metadata", "id": 4} : dispatch
Feb 23 09:51:05 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "osd metadata", "id": 5} : dispatch
Feb 23 09:51:05 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "mds metadata"} : dispatch
Feb 23 09:51:05 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "osd metadata"} : dispatch
Feb 23 09:51:05 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "mon metadata"} : dispatch
Feb 23 09:51:05 np0005626463.localdomain ceph-mon[294160]: Manager daemon np0005626465.hlpkwo is now available
Feb 23 09:51:05 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005626465.hlpkwo/mirror_snapshot_schedule"} : dispatch
Feb 23 09:51:05 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005626465.hlpkwo/mirror_snapshot_schedule"} : dispatch
Feb 23 09:51:05 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005626465.hlpkwo/trash_purge_schedule"} : dispatch
Feb 23 09:51:05 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005626465.hlpkwo/trash_purge_schedule"} : dispatch
Feb 23 09:51:05 np0005626463.localdomain sudo[306399]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Feb 23 09:51:05 np0005626463.localdomain sudo[306399]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:51:05 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : mgrmap e41: np0005626465.hlpkwo(active, since 1.05196s), standbys: np0005626463.wtksup
Feb 23 09:51:06 np0005626463.localdomain sshd[306270]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:51:06 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:51:06 np0005626463.localdomain podman[306488]: 2026-02-23 09:51:06.492167105 +0000 UTC m=+0.099676517 container exec fdf07215f0388d0ebc44f1f3744080ba594441e647c300d0dade62ff5beba234 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-crash-np0005626463, description=Red Hat Ceph Storage 7, vcs-type=git, io.buildah.version=1.42.2, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, CEPH_POINT_RELEASE=, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1770267347, architecture=x86_64, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, RELEASE=main, org.opencontainers.image.created=2026-02-09T10:25:24Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, build-date=2026-02-09T10:25:24Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_BRANCH=main, name=rhceph)
Feb 23 09:51:06 np0005626463.localdomain podman[306488]: 2026-02-23 09:51:06.625578415 +0000 UTC m=+0.233087837 container exec_died fdf07215f0388d0ebc44f1f3744080ba594441e647c300d0dade62ff5beba234 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-crash-np0005626463, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, release=1770267347, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_CLEAN=True, RELEASE=main, architecture=x86_64, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, io.buildah.version=1.42.2, vcs-type=git, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, version=7, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-02-09T10:25:24Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, ceph=True, CEPH_POINT_RELEASE=)
Feb 23 09:51:06 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:51:06.791 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:51:06 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:51:06.794 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:51:06 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:51:06.795 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 09:51:06 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:51:06.795 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:51:06 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:51:06.846 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:51:06 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:51:06.847 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:51:06 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [INF] : Health check cleared: CEPHADM_STRAY_DAEMON (was: 1 stray daemon(s) not managed by cephadm)
Feb 23 09:51:06 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [INF] : Health check cleared: CEPHADM_STRAY_HOST (was: 1 stray host(s) with 1 daemon(s) not managed by cephadm)
Feb 23 09:51:06 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [INF] : Cluster is now healthy
Feb 23 09:51:07 np0005626463.localdomain ceph-mon[294160]: mgrmap e41: np0005626465.hlpkwo(active, since 1.05196s), standbys: np0005626463.wtksup
Feb 23 09:51:07 np0005626463.localdomain ceph-mon[294160]: pgmap v3: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:51:07 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626465.localdomain.devices.0}] v 0)
Feb 23 09:51:07 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:51:07 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626465.localdomain}] v 0)
Feb 23 09:51:07 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:51:07 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : mgrmap e42: np0005626465.hlpkwo(active, since 2s), standbys: np0005626463.wtksup
Feb 23 09:51:07 np0005626463.localdomain sudo[306399]: pam_unix(sudo:session): session closed for user root
Feb 23 09:51:07 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626463.localdomain.devices.0}] v 0)
Feb 23 09:51:07 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:51:07 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626463.localdomain}] v 0)
Feb 23 09:51:07 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:51:07 np0005626463.localdomain sudo[306604]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 09:51:07 np0005626463.localdomain sudo[306604]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:51:07 np0005626463.localdomain sudo[306604]: pam_unix(sudo:session): session closed for user root
Feb 23 09:51:07 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain.devices.0}] v 0)
Feb 23 09:51:07 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:51:07 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain}] v 0)
Feb 23 09:51:07 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:51:07 np0005626463.localdomain sudo[306622]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 23 09:51:07 np0005626463.localdomain sudo[306622]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:51:08 np0005626463.localdomain sudo[306622]: pam_unix(sudo:session): session closed for user root
Feb 23 09:51:08 np0005626463.localdomain ceph-mon[294160]: [23/Feb/2026:09:51:06] ENGINE Bus STARTING
Feb 23 09:51:08 np0005626463.localdomain ceph-mon[294160]: [23/Feb/2026:09:51:06] ENGINE Serving on http://172.18.0.107:8765
Feb 23 09:51:08 np0005626463.localdomain ceph-mon[294160]: [23/Feb/2026:09:51:06] ENGINE Serving on https://172.18.0.107:7150
Feb 23 09:51:08 np0005626463.localdomain ceph-mon[294160]: [23/Feb/2026:09:51:06] ENGINE Bus STARTED
Feb 23 09:51:08 np0005626463.localdomain ceph-mon[294160]: [23/Feb/2026:09:51:06] ENGINE Client ('172.18.0.107', 34908) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Feb 23 09:51:08 np0005626463.localdomain ceph-mon[294160]: Health check cleared: CEPHADM_STRAY_DAEMON (was: 1 stray daemon(s) not managed by cephadm)
Feb 23 09:51:08 np0005626463.localdomain ceph-mon[294160]: Health check cleared: CEPHADM_STRAY_HOST (was: 1 stray host(s) with 1 daemon(s) not managed by cephadm)
Feb 23 09:51:08 np0005626463.localdomain ceph-mon[294160]: Cluster is now healthy
Feb 23 09:51:08 np0005626463.localdomain ceph-mon[294160]: pgmap v4: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:51:08 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:51:08 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:51:08 np0005626463.localdomain ceph-mon[294160]: mgrmap e42: np0005626465.hlpkwo(active, since 2s), standbys: np0005626463.wtksup
Feb 23 09:51:08 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:51:08 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:51:08 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:51:08 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:51:08 np0005626463.localdomain sudo[306672]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 09:51:08 np0005626463.localdomain sudo[306672]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:51:08 np0005626463.localdomain sudo[306672]: pam_unix(sudo:session): session closed for user root
Feb 23 09:51:08 np0005626463.localdomain sudo[306690]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 list-networks
Feb 23 09:51:08 np0005626463.localdomain sudo[306690]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:51:08 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626465.localdomain.devices.0}] v 0)
Feb 23 09:51:08 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:51:08 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626465.localdomain}] v 0)
Feb 23 09:51:08 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:51:08 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} v 0)
Feb 23 09:51:08 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Feb 23 09:51:08 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} v 0)
Feb 23 09:51:08 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Feb 23 09:51:08 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config set, name=osd_memory_target}] v 0)
Feb 23 09:51:08 np0005626463.localdomain sudo[306690]: pam_unix(sudo:session): session closed for user root
Feb 23 09:51:08 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626463.localdomain.devices.0}] v 0)
Feb 23 09:51:08 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:51:08 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626463.localdomain}] v 0)
Feb 23 09:51:08 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:51:08 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} v 0)
Feb 23 09:51:08 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Feb 23 09:51:08 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} v 0)
Feb 23 09:51:08 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Feb 23 09:51:08 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config set, name=osd_memory_target}] v 0)
Feb 23 09:51:08 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain.devices.0}] v 0)
Feb 23 09:51:08 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:51:08 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain}] v 0)
Feb 23 09:51:09 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:51:09 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} v 0)
Feb 23 09:51:09 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Feb 23 09:51:09 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} v 0)
Feb 23 09:51:09 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Feb 23 09:51:09 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config set, name=osd_memory_target}] v 0)
Feb 23 09:51:09 np0005626463.localdomain sudo[306728]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Feb 23 09:51:09 np0005626463.localdomain sudo[306728]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:51:09 np0005626463.localdomain sudo[306728]: pam_unix(sudo:session): session closed for user root
Feb 23 09:51:09 np0005626463.localdomain sudo[306746]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/etc/ceph
Feb 23 09:51:09 np0005626463.localdomain sudo[306746]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:51:09 np0005626463.localdomain sudo[306746]: pam_unix(sudo:session): session closed for user root
Feb 23 09:51:09 np0005626463.localdomain sudo[306764]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/etc/ceph/ceph.conf.new
Feb 23 09:51:09 np0005626463.localdomain sudo[306764]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:51:09 np0005626463.localdomain sudo[306764]: pam_unix(sudo:session): session closed for user root
Feb 23 09:51:09 np0005626463.localdomain sudo[306782]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46
Feb 23 09:51:09 np0005626463.localdomain sudo[306782]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:51:09 np0005626463.localdomain sudo[306782]: pam_unix(sudo:session): session closed for user root
Feb 23 09:51:09 np0005626463.localdomain podman[242954]: time="2026-02-23T09:51:09Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 23 09:51:09 np0005626463.localdomain sudo[306800]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/etc/ceph/ceph.conf.new
Feb 23 09:51:09 np0005626463.localdomain sudo[306800]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:51:09 np0005626463.localdomain sudo[306800]: pam_unix(sudo:session): session closed for user root
Feb 23 09:51:09 np0005626463.localdomain podman[242954]: @ - - [23/Feb/2026:09:51:09 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155258 "" "Go-http-client/1.1"
Feb 23 09:51:09 np0005626463.localdomain podman[242954]: @ - - [23/Feb/2026:09:51:09 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18284 "" "Go-http-client/1.1"
Feb 23 09:51:09 np0005626463.localdomain sudo[306834]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/etc/ceph/ceph.conf.new
Feb 23 09:51:09 np0005626463.localdomain sudo[306834]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:51:09 np0005626463.localdomain sudo[306834]: pam_unix(sudo:session): session closed for user root
Feb 23 09:51:09 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:51:09 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:51:09 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Feb 23 09:51:09 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Feb 23 09:51:09 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Feb 23 09:51:09 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Feb 23 09:51:09 np0005626463.localdomain ceph-mon[294160]: Adjusting osd_memory_target on np0005626465.localdomain to 836.6M
Feb 23 09:51:09 np0005626463.localdomain ceph-mon[294160]: Unable to set osd_memory_target on np0005626465.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Feb 23 09:51:09 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:51:09 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:51:09 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Feb 23 09:51:09 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Feb 23 09:51:09 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Feb 23 09:51:09 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Feb 23 09:51:09 np0005626463.localdomain ceph-mon[294160]: Adjusting osd_memory_target on np0005626463.localdomain to 836.6M
Feb 23 09:51:09 np0005626463.localdomain ceph-mon[294160]: Unable to set osd_memory_target on np0005626463.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Feb 23 09:51:09 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:51:09 np0005626463.localdomain ceph-mon[294160]: pgmap v5: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:51:09 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:51:09 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Feb 23 09:51:09 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Feb 23 09:51:09 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Feb 23 09:51:09 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Feb 23 09:51:09 np0005626463.localdomain ceph-mon[294160]: Adjusting osd_memory_target on np0005626466.localdomain to 836.6M
Feb 23 09:51:09 np0005626463.localdomain ceph-mon[294160]: Unable to set osd_memory_target on np0005626466.localdomain to 877243801: error parsing value: Value '877243801' is below minimum 939524096
Feb 23 09:51:09 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:51:09 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 23 09:51:09 np0005626463.localdomain ceph-mon[294160]: Updating np0005626463.localdomain:/etc/ceph/ceph.conf
Feb 23 09:51:09 np0005626463.localdomain ceph-mon[294160]: Updating np0005626465.localdomain:/etc/ceph/ceph.conf
Feb 23 09:51:09 np0005626463.localdomain ceph-mon[294160]: Updating np0005626466.localdomain:/etc/ceph/ceph.conf
Feb 23 09:51:09 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : mgrmap e43: np0005626465.hlpkwo(active, since 4s), standbys: np0005626463.wtksup
Feb 23 09:51:09 np0005626463.localdomain sudo[306852]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/etc/ceph/ceph.conf.new
Feb 23 09:51:09 np0005626463.localdomain sudo[306852]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:51:09 np0005626463.localdomain sudo[306852]: pam_unix(sudo:session): session closed for user root
Feb 23 09:51:09 np0005626463.localdomain sudo[306870]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Feb 23 09:51:09 np0005626463.localdomain sudo[306870]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:51:09 np0005626463.localdomain sudo[306870]: pam_unix(sudo:session): session closed for user root
Feb 23 09:51:09 np0005626463.localdomain sudo[306888]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config
Feb 23 09:51:09 np0005626463.localdomain sudo[306888]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:51:09 np0005626463.localdomain sudo[306888]: pam_unix(sudo:session): session closed for user root
Feb 23 09:51:09 np0005626463.localdomain sudo[306906]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config
Feb 23 09:51:09 np0005626463.localdomain sudo[306906]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:51:09 np0005626463.localdomain sudo[306906]: pam_unix(sudo:session): session closed for user root
Feb 23 09:51:09 np0005626463.localdomain sudo[306924]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf.new
Feb 23 09:51:09 np0005626463.localdomain sudo[306924]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:51:09 np0005626463.localdomain sudo[306924]: pam_unix(sudo:session): session closed for user root
Feb 23 09:51:09 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : Standby manager daemon np0005626466.nisqfq started
Feb 23 09:51:10 np0005626463.localdomain sudo[306942]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46
Feb 23 09:51:10 np0005626463.localdomain sudo[306942]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:51:10 np0005626463.localdomain sudo[306942]: pam_unix(sudo:session): session closed for user root
Feb 23 09:51:10 np0005626463.localdomain sudo[306960]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf.new
Feb 23 09:51:10 np0005626463.localdomain sudo[306960]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:51:10 np0005626463.localdomain sudo[306960]: pam_unix(sudo:session): session closed for user root
Feb 23 09:51:10 np0005626463.localdomain sudo[306994]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf.new
Feb 23 09:51:10 np0005626463.localdomain sudo[306994]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:51:10 np0005626463.localdomain sudo[306994]: pam_unix(sudo:session): session closed for user root
Feb 23 09:51:10 np0005626463.localdomain sudo[307012]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf.new
Feb 23 09:51:10 np0005626463.localdomain sudo[307012]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:51:10 np0005626463.localdomain sudo[307012]: pam_unix(sudo:session): session closed for user root
Feb 23 09:51:10 np0005626463.localdomain sudo[307030]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf.new /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf
Feb 23 09:51:10 np0005626463.localdomain sudo[307030]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:51:10 np0005626463.localdomain sudo[307030]: pam_unix(sudo:session): session closed for user root
Feb 23 09:51:10 np0005626463.localdomain sudo[307048]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Feb 23 09:51:10 np0005626463.localdomain sudo[307048]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:51:10 np0005626463.localdomain sudo[307048]: pam_unix(sudo:session): session closed for user root
Feb 23 09:51:10 np0005626463.localdomain sudo[307066]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/etc/ceph
Feb 23 09:51:10 np0005626463.localdomain sudo[307066]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:51:10 np0005626463.localdomain sudo[307066]: pam_unix(sudo:session): session closed for user root
Feb 23 09:51:10 np0005626463.localdomain ceph-mon[294160]: mgrmap e43: np0005626465.hlpkwo(active, since 4s), standbys: np0005626463.wtksup
Feb 23 09:51:10 np0005626463.localdomain ceph-mon[294160]: Updating np0005626466.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf
Feb 23 09:51:10 np0005626463.localdomain ceph-mon[294160]: Updating np0005626463.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf
Feb 23 09:51:10 np0005626463.localdomain ceph-mon[294160]: Updating np0005626465.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf
Feb 23 09:51:10 np0005626463.localdomain ceph-mon[294160]: Standby manager daemon np0005626466.nisqfq started
Feb 23 09:51:10 np0005626463.localdomain sudo[307084]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/etc/ceph/ceph.client.admin.keyring.new
Feb 23 09:51:10 np0005626463.localdomain sudo[307084]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:51:10 np0005626463.localdomain sudo[307084]: pam_unix(sudo:session): session closed for user root
Feb 23 09:51:10 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : mgrmap e44: np0005626465.hlpkwo(active, since 5s), standbys: np0005626463.wtksup, np0005626466.nisqfq
Feb 23 09:51:10 np0005626463.localdomain sudo[307102]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46
Feb 23 09:51:10 np0005626463.localdomain sudo[307102]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:51:10 np0005626463.localdomain sudo[307102]: pam_unix(sudo:session): session closed for user root
Feb 23 09:51:10 np0005626463.localdomain sudo[307120]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/etc/ceph/ceph.client.admin.keyring.new
Feb 23 09:51:10 np0005626463.localdomain sudo[307120]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:51:10 np0005626463.localdomain sudo[307120]: pam_unix(sudo:session): session closed for user root
Feb 23 09:51:10 np0005626463.localdomain sudo[307154]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/etc/ceph/ceph.client.admin.keyring.new
Feb 23 09:51:10 np0005626463.localdomain sudo[307154]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:51:10 np0005626463.localdomain sudo[307154]: pam_unix(sudo:session): session closed for user root
Feb 23 09:51:10 np0005626463.localdomain sudo[307172]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/etc/ceph/ceph.client.admin.keyring.new
Feb 23 09:51:11 np0005626463.localdomain sudo[307172]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:51:11 np0005626463.localdomain sudo[307172]: pam_unix(sudo:session): session closed for user root
Feb 23 09:51:11 np0005626463.localdomain sudo[307190]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/etc/ceph/ceph.client.admin.keyring.new /etc/ceph/ceph.client.admin.keyring
Feb 23 09:51:11 np0005626463.localdomain sudo[307190]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:51:11 np0005626463.localdomain sudo[307190]: pam_unix(sudo:session): session closed for user root
Feb 23 09:51:11 np0005626463.localdomain sudo[307208]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config
Feb 23 09:51:11 np0005626463.localdomain sudo[307208]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:51:11 np0005626463.localdomain sudo[307208]: pam_unix(sudo:session): session closed for user root
Feb 23 09:51:11 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:51:11 np0005626463.localdomain sudo[307226]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config
Feb 23 09:51:11 np0005626463.localdomain sudo[307226]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:51:11 np0005626463.localdomain sudo[307226]: pam_unix(sudo:session): session closed for user root
Feb 23 09:51:11 np0005626463.localdomain sudo[307244]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.client.admin.keyring.new
Feb 23 09:51:11 np0005626463.localdomain sudo[307244]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:51:11 np0005626463.localdomain sudo[307244]: pam_unix(sudo:session): session closed for user root
Feb 23 09:51:11 np0005626463.localdomain sudo[307262]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46
Feb 23 09:51:11 np0005626463.localdomain sudo[307262]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:51:11 np0005626463.localdomain sudo[307262]: pam_unix(sudo:session): session closed for user root
Feb 23 09:51:11 np0005626463.localdomain sudo[307280]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.client.admin.keyring.new
Feb 23 09:51:11 np0005626463.localdomain sudo[307280]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:51:11 np0005626463.localdomain sudo[307280]: pam_unix(sudo:session): session closed for user root
Feb 23 09:51:11 np0005626463.localdomain sudo[307314]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.client.admin.keyring.new
Feb 23 09:51:11 np0005626463.localdomain sudo[307314]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:51:11 np0005626463.localdomain sudo[307314]: pam_unix(sudo:session): session closed for user root
Feb 23 09:51:11 np0005626463.localdomain ceph-mon[294160]: Updating np0005626466.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 23 09:51:11 np0005626463.localdomain ceph-mon[294160]: Updating np0005626463.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 23 09:51:11 np0005626463.localdomain ceph-mon[294160]: Updating np0005626465.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 23 09:51:11 np0005626463.localdomain ceph-mon[294160]: mgrmap e44: np0005626465.hlpkwo(active, since 5s), standbys: np0005626463.wtksup, np0005626466.nisqfq
Feb 23 09:51:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "mgr metadata", "who": "np0005626466.nisqfq", "id": "np0005626466.nisqfq"} : dispatch
Feb 23 09:51:11 np0005626463.localdomain ceph-mon[294160]: pgmap v6: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:51:11 np0005626463.localdomain ceph-mon[294160]: Updating np0005626465.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.client.admin.keyring
Feb 23 09:51:11 np0005626463.localdomain ceph-mon[294160]: Updating np0005626463.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.client.admin.keyring
Feb 23 09:51:11 np0005626463.localdomain ceph-mon[294160]: Updating np0005626466.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.client.admin.keyring
Feb 23 09:51:11 np0005626463.localdomain sudo[307332]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.client.admin.keyring.new
Feb 23 09:51:11 np0005626463.localdomain sudo[307332]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:51:11 np0005626463.localdomain sudo[307332]: pam_unix(sudo:session): session closed for user root
Feb 23 09:51:11 np0005626463.localdomain sudo[307350]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-f1fea371-cb69-578d-a3d0-b5c472a84b46/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.client.admin.keyring.new /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.client.admin.keyring
Feb 23 09:51:11 np0005626463.localdomain sudo[307350]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:51:11 np0005626463.localdomain sudo[307350]: pam_unix(sudo:session): session closed for user root
Feb 23 09:51:11 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626463.localdomain.devices.0}] v 0)
Feb 23 09:51:11 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:51:11 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain.devices.0}] v 0)
Feb 23 09:51:11 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626463.localdomain}] v 0)
Feb 23 09:51:11 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:51:11 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626465.localdomain.devices.0}] v 0)
Feb 23 09:51:11 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:51:11 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain}] v 0)
Feb 23 09:51:11 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:51:11.847 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:51:11 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:51:11.849 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:51:11 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:51:11.849 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 09:51:11 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:51:11.850 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:51:11 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:51:11 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:51:11 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626465.localdomain}] v 0)
Feb 23 09:51:11 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:51:11 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Feb 23 09:51:11 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:51:11.888 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:51:11 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:51:11.888 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:51:11 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:51:11 np0005626463.localdomain sudo[307368]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 09:51:11 np0005626463.localdomain sudo[307368]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:51:11 np0005626463.localdomain sudo[307368]: pam_unix(sudo:session): session closed for user root
Feb 23 09:51:12 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Feb 23 09:51:12 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:51:12 np0005626463.localdomain sudo[307386]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 09:51:12 np0005626463.localdomain sudo[307386]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:51:12 np0005626463.localdomain sudo[307386]: pam_unix(sudo:session): session closed for user root
Feb 23 09:51:12 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:51:12 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:51:12 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:51:12 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:51:12 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:51:12 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:51:12 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:51:12 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 23 09:51:12 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:51:12 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 23 09:51:12 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:51:12 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 23 09:51:13 np0005626463.localdomain openstack_network_exporter[245358]: ERROR   09:51:13 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 23 09:51:13 np0005626463.localdomain openstack_network_exporter[245358]: ERROR   09:51:13 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 23 09:51:13 np0005626463.localdomain ceph-mon[294160]: pgmap v7: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail; 29 KiB/s rd, 0 B/s wr, 16 op/s
Feb 23 09:51:13 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.107:0/476270285' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 09:51:14 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.
Feb 23 09:51:14 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.107:0/3609414126' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 09:51:14 np0005626463.localdomain podman[307404]: 2026-02-23 09:51:14.910424814 +0000 UTC m=+0.085347368 container health_status da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 23 09:51:14 np0005626463.localdomain podman[307404]: 2026-02-23 09:51:14.924366742 +0000 UTC m=+0.099289286 container exec_died da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 23 09:51:14 np0005626463.localdomain systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Deactivated successfully.
Feb 23 09:51:15 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Feb 23 09:51:15 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:51:16 np0005626463.localdomain ceph-mon[294160]: pgmap v8: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail; 23 KiB/s rd, 0 B/s wr, 12 op/s
Feb 23 09:51:16 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:51:16 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.108:0/1278054367' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 09:51:16 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:51:16 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:51:16.887 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:51:17 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.108:0/3823247606' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 09:51:17 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:51:17.483 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:51:17 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:51:17.484 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 23 09:51:17 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:51:17.485 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 23 09:51:17 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:51:17.582 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 23 09:51:17 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:51:17.583 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquired lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 23 09:51:17 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:51:17.583 282211 DEBUG nova.network.neutron [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 23 09:51:17 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:51:17.584 282211 DEBUG nova.objects.instance [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c2a7d92b-952f-46a7-8a6a-3322a48fcf4b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 23 09:51:17 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.
Feb 23 09:51:17 np0005626463.localdomain systemd[1]: tmp-crun.CMYvMQ.mount: Deactivated successfully.
Feb 23 09:51:17 np0005626463.localdomain podman[307427]: 2026-02-23 09:51:17.907980645 +0000 UTC m=+0.084279005 container health_status 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, maintainer=Red Hat, Inc., version=9.7, release=1770267347, io.openshift.expose-services=, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-type=git, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, org.opencontainers.image.created=2026-02-05T04:57:10Z, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., managed_by=edpm_ansible, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': 
['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9/ubi-minimal, io.buildah.version=1.33.7, build-date=2026-02-05T04:57:10Z, architecture=x86_64, config_id=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Feb 23 09:51:17 np0005626463.localdomain podman[307427]: 2026-02-23 09:51:17.924204412 +0000 UTC m=+0.100502762 container exec_died 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1770267347, build-date=2026-02-05T04:57:10Z, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', 
'/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9/ubi-minimal, version=9.7, io.openshift.expose-services=, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, config_id=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2026-02-05T04:57:10Z, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, architecture=x86_64, io.buildah.version=1.33.7, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, maintainer=Red Hat, Inc.)
Feb 23 09:51:17 np0005626463.localdomain systemd[1]: 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.service: Deactivated successfully.
Feb 23 09:51:18 np0005626463.localdomain ceph-mon[294160]: pgmap v9: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail; 19 KiB/s rd, 0 B/s wr, 10 op/s
Feb 23 09:51:18 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:51:18.295 282211 DEBUG nova.network.neutron [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Updating instance_info_cache with network_info: [{"id": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "address": "fa:16:3e:a0:9d:00", "network": {"id": "9da5b53d-3184-450f-9a5b-bdba1a6c9f6d", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "37b8098efb0d4ecc90b451a2db0e966f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa27e5011-20", "ovs_interfaceid": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 23 09:51:18 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:51:18.314 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Releasing lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 23 09:51:18 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:51:18.314 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 23 09:51:18 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:51:18.315 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:51:19 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:51:19.882 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:51:20 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:51:20.054 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:51:20 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:51:20.055 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:51:20 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:51:20.055 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 23 09:51:20 np0005626463.localdomain ceph-mon[294160]: pgmap v10: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail; 17 KiB/s rd, 0 B/s wr, 9 op/s
Feb 23 09:51:21 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:51:21.051 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:51:21 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:51:21.053 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:51:21 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:51:21 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:51:21.889 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:51:22 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:51:22.054 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:51:22 np0005626463.localdomain ceph-mon[294160]: pgmap v11: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail; 17 KiB/s rd, 0 B/s wr, 9 op/s
Feb 23 09:51:23 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:51:23.054 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:51:24 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:51:24.054 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:51:24 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:51:24.076 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 09:51:24 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:51:24.076 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 09:51:24 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:51:24.077 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 09:51:24 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:51:24.077 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Auditing locally available compute resources for np0005626463.localdomain (node: np0005626463.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 23 09:51:24 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:51:24.078 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 09:51:24 np0005626463.localdomain ceph-mon[294160]: pgmap v12: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail; 17 KiB/s rd, 0 B/s wr, 9 op/s
Feb 23 09:51:24 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 23 09:51:24 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/3454500488' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 09:51:24 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:51:24.552 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.474s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 09:51:24 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:51:24.608 282211 DEBUG nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 23 09:51:24 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:51:24.609 282211 DEBUG nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 23 09:51:24 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:51:24.809 282211 WARNING nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 23 09:51:24 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:51:24.811 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Hypervisor/Node resource view: name=np0005626463.localdomain free_ram=11743MB free_disk=41.8366584777832GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 23 09:51:24 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:51:24.811 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 09:51:24 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:51:24.812 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 09:51:24 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:51:24.876 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Instance c2a7d92b-952f-46a7-8a6a-3322a48fcf4b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 23 09:51:24 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:51:24.877 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 23 09:51:24 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:51:24.877 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Final resource view: name=np0005626463.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 23 09:51:24 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:51:24.918 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 09:51:25 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.106:0/3454500488' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 09:51:25 np0005626463.localdomain ceph-mon[294160]: pgmap v13: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:51:25 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 23 09:51:25 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/1539052837' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 09:51:25 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:51:25.350 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.431s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 09:51:25 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:51:25.356 282211 DEBUG nova.compute.provider_tree [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Inventory has not changed in ProviderTree for provider: be63d86c-a403-4ec9-a515-07ea2962cb4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 23 09:51:25 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:51:25.389 282211 DEBUG nova.scheduler.client.report [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Inventory has not changed for provider be63d86c-a403-4ec9-a515-07ea2962cb4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 23 09:51:25 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:51:25.392 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Compute_service record updated for np0005626463.localdomain:np0005626463.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 23 09:51:25 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:51:25.393 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.581s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 09:51:26 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:51:26 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.106:0/1539052837' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 09:51:26 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:51:26.892 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:51:27 np0005626463.localdomain ceph-mon[294160]: pgmap v14: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:51:27 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #34. Immutable memtables: 0.
Feb 23 09:51:27 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:51:27.278362) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Feb 23 09:51:27 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/flush_job.cc:856] [default] [JOB 17] Flushing memtable with next log file: 34
Feb 23 09:51:27 np0005626463.localdomain ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840287278441, "job": 17, "event": "flush_started", "num_memtables": 1, "num_entries": 2447, "num_deletes": 256, "total_data_size": 6625461, "memory_usage": 6807144, "flush_reason": "Manual Compaction"}
Feb 23 09:51:27 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/flush_job.cc:885] [default] [JOB 17] Level-0 flush table #35: started
Feb 23 09:51:27 np0005626463.localdomain ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840287312148, "cf_name": "default", "job": 17, "event": "table_file_creation", "file_number": 35, "file_size": 6126102, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 20075, "largest_seqno": 22521, "table_properties": {"data_size": 6115243, "index_size": 6788, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 3013, "raw_key_size": 26255, "raw_average_key_size": 22, "raw_value_size": 6092245, "raw_average_value_size": 5145, "num_data_blocks": 292, "num_entries": 1184, "num_filter_entries": 1184, "num_deletions": 253, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771840180, "oldest_key_time": 1771840180, "file_creation_time": 1771840287, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4cfd6c8f-aafa-4003-b2f6-d22c49635dd4", "db_session_id": "66DAQ76CBLV8DSGL8JC7", "orig_file_number": 35, "seqno_to_time_mapping": "N/A"}}
Feb 23 09:51:27 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 17] Flush lasted 33841 microseconds, and 14261 cpu microseconds.
Feb 23 09:51:27 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 23 09:51:27 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:51:27.312210) [db/flush_job.cc:967] [default] [JOB 17] Level-0 flush table #35: 6126102 bytes OK
Feb 23 09:51:27 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:51:27.312239) [db/memtable_list.cc:519] [default] Level-0 commit table #35 started
Feb 23 09:51:27 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:51:27.314457) [db/memtable_list.cc:722] [default] Level-0 commit table #35: memtable #1 done
Feb 23 09:51:27 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:51:27.314480) EVENT_LOG_v1 {"time_micros": 1771840287314474, "job": 17, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Feb 23 09:51:27 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:51:27.314503) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Feb 23 09:51:27 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 17] Try to delete WAL files size 6614376, prev total WAL file size 6614376, number of live WAL files 2.
Feb 23 09:51:27 np0005626463.localdomain ceph-mon[294160]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005626463/store.db/000031.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 23 09:51:27 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:51:27.315836) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003131323935' seq:72057594037927935, type:22 .. '7061786F73003131353437' seq:0, type:0; will stop at (end)
Feb 23 09:51:27 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 18] Compacting 1@0 + 1@6 files to L6, score -1.00
Feb 23 09:51:27 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 17 Base level 0, inputs: [35(5982KB)], [33(15MB)]
Feb 23 09:51:27 np0005626463.localdomain ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840287315915, "job": 18, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [35], "files_L6": [33], "score": -1, "input_data_size": 22457026, "oldest_snapshot_seqno": -1}
Feb 23 09:51:27 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 18] Generated table #36: 12283 keys, 19493283 bytes, temperature: kUnknown
Feb 23 09:51:27 np0005626463.localdomain ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840287444935, "cf_name": "default", "job": 18, "event": "table_file_creation", "file_number": 36, "file_size": 19493283, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 19420243, "index_size": 41259, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 30725, "raw_key_size": 327917, "raw_average_key_size": 26, "raw_value_size": 19207895, "raw_average_value_size": 1563, "num_data_blocks": 1588, "num_entries": 12283, "num_filter_entries": 12283, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771839971, "oldest_key_time": 0, "file_creation_time": 1771840287, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4cfd6c8f-aafa-4003-b2f6-d22c49635dd4", "db_session_id": "66DAQ76CBLV8DSGL8JC7", "orig_file_number": 36, "seqno_to_time_mapping": "N/A"}}
Feb 23 09:51:27 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 23 09:51:27 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:51:27.445203) [db/compaction/compaction_job.cc:1663] [default] [JOB 18] Compacted 1@0 + 1@6 files to L6 => 19493283 bytes
Feb 23 09:51:27 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:51:27.446960) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 174.0 rd, 151.0 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(5.8, 15.6 +0.0 blob) out(18.6 +0.0 blob), read-write-amplify(6.8) write-amplify(3.2) OK, records in: 12823, records dropped: 540 output_compression: NoCompression
Feb 23 09:51:27 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:51:27.446991) EVENT_LOG_v1 {"time_micros": 1771840287446979, "job": 18, "event": "compaction_finished", "compaction_time_micros": 129074, "compaction_time_cpu_micros": 53367, "output_level": 6, "num_output_files": 1, "total_output_size": 19493283, "num_input_records": 12823, "num_output_records": 12283, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Feb 23 09:51:27 np0005626463.localdomain ceph-mon[294160]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005626463/store.db/000035.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 23 09:51:27 np0005626463.localdomain ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840287448031, "job": 18, "event": "table_file_deletion", "file_number": 35}
Feb 23 09:51:27 np0005626463.localdomain ceph-mon[294160]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005626463/store.db/000033.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 23 09:51:27 np0005626463.localdomain ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840287450458, "job": 18, "event": "table_file_deletion", "file_number": 33}
Feb 23 09:51:27 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:51:27.315701) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 23 09:51:27 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:51:27.450547) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 23 09:51:27 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:51:27.450555) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 23 09:51:27 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:51:27.450559) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 23 09:51:27 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:51:27.450562) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 23 09:51:27 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:51:27.450565) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 23 09:51:28 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.
Feb 23 09:51:28 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.
Feb 23 09:51:28 np0005626463.localdomain podman[307492]: 2026-02-23 09:51:28.912056382 +0000 UTC m=+0.085321477 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20260216, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.vendor=CentOS)
Feb 23 09:51:28 np0005626463.localdomain podman[307493]: 2026-02-23 09:51:28.965026156 +0000 UTC m=+0.134987000 container health_status bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 23 09:51:28 np0005626463.localdomain podman[307493]: 2026-02-23 09:51:28.976474807 +0000 UTC m=+0.146435661 container exec_died bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 23 09:51:28 np0005626463.localdomain systemd[1]: bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.service: Deactivated successfully.
Feb 23 09:51:29 np0005626463.localdomain podman[307492]: 2026-02-23 09:51:29.027022097 +0000 UTC m=+0.200287182 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, io.buildah.version=1.43.0)
Feb 23 09:51:29 np0005626463.localdomain systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully.
Feb 23 09:51:30 np0005626463.localdomain ceph-mon[294160]: pgmap v15: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:51:31 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:51:31 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.
Feb 23 09:51:31 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:51:31.894 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:51:31 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:51:31.897 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:51:31 np0005626463.localdomain podman[307544]: 2026-02-23 09:51:31.909992543 +0000 UTC m=+0.085233204 container health_status be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Feb 23 09:51:31 np0005626463.localdomain podman[307544]: 2026-02-23 09:51:31.920994861 +0000 UTC m=+0.096235472 container exec_died be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.43.0, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 23 09:51:31 np0005626463.localdomain systemd[1]: be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.service: Deactivated successfully.
Feb 23 09:51:32 np0005626463.localdomain sshd[307563]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:51:32 np0005626463.localdomain ceph-mon[294160]: pgmap v16: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:51:32 np0005626463.localdomain sshd[307563]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:51:34 np0005626463.localdomain ceph-mon[294160]: pgmap v17: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:51:35 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.
Feb 23 09:51:35 np0005626463.localdomain podman[307565]: 2026-02-23 09:51:35.910654597 +0000 UTC m=+0.084924784 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true)
Feb 23 09:51:35 np0005626463.localdomain podman[307565]: 2026-02-23 09:51:35.94303004 +0000 UTC m=+0.117300237 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260216, maintainer=OpenStack Kubernetes Operator team)
Feb 23 09:51:35 np0005626463.localdomain systemd[1]: 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully.
Feb 23 09:51:36 np0005626463.localdomain ceph-mon[294160]: pgmap v18: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:51:36 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:51:36 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #37. Immutable memtables: 0.
Feb 23 09:51:36 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:51:36.243151) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Feb 23 09:51:36 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/flush_job.cc:856] [default] [JOB 19] Flushing memtable with next log file: 37
Feb 23 09:51:36 np0005626463.localdomain ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840296243187, "job": 19, "event": "flush_started", "num_memtables": 1, "num_entries": 323, "num_deletes": 251, "total_data_size": 88314, "memory_usage": 95160, "flush_reason": "Manual Compaction"}
Feb 23 09:51:36 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/flush_job.cc:885] [default] [JOB 19] Level-0 flush table #38: started
Feb 23 09:51:36 np0005626463.localdomain ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840296249717, "cf_name": "default", "job": 19, "event": "table_file_creation", "file_number": 38, "file_size": 86555, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 22522, "largest_seqno": 22844, "table_properties": {"data_size": 84483, "index_size": 247, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 773, "raw_key_size": 5806, "raw_average_key_size": 20, "raw_value_size": 80359, "raw_average_value_size": 280, "num_data_blocks": 11, "num_entries": 286, "num_filter_entries": 286, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771840289, "oldest_key_time": 1771840289, "file_creation_time": 1771840296, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4cfd6c8f-aafa-4003-b2f6-d22c49635dd4", "db_session_id": "66DAQ76CBLV8DSGL8JC7", "orig_file_number": 38, "seqno_to_time_mapping": "N/A"}}
Feb 23 09:51:36 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 19] Flush lasted 6608 microseconds, and 1085 cpu microseconds.
Feb 23 09:51:36 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 23 09:51:36 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:51:36.249760) [db/flush_job.cc:967] [default] [JOB 19] Level-0 flush table #38: 86555 bytes OK
Feb 23 09:51:36 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:51:36.249779) [db/memtable_list.cc:519] [default] Level-0 commit table #38 started
Feb 23 09:51:36 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:51:36.251721) [db/memtable_list.cc:722] [default] Level-0 commit table #38: memtable #1 done
Feb 23 09:51:36 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:51:36.251747) EVENT_LOG_v1 {"time_micros": 1771840296251741, "job": 19, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Feb 23 09:51:36 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:51:36.251767) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Feb 23 09:51:36 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 19] Try to delete WAL files size 86066, prev total WAL file size 86390, number of live WAL files 2.
Feb 23 09:51:36 np0005626463.localdomain ceph-mon[294160]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005626463/store.db/000034.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 23 09:51:36 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:51:36.252606) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740033373532' seq:72057594037927935, type:22 .. '6D6772737461740034303034' seq:0, type:0; will stop at (end)
Feb 23 09:51:36 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 20] Compacting 1@0 + 1@6 files to L6, score -1.00
Feb 23 09:51:36 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 19 Base level 0, inputs: [38(84KB)], [36(18MB)]
Feb 23 09:51:36 np0005626463.localdomain ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840296252648, "job": 20, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [38], "files_L6": [36], "score": -1, "input_data_size": 19579838, "oldest_snapshot_seqno": -1}
Feb 23 09:51:36 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 20] Generated table #39: 12054 keys, 17350766 bytes, temperature: kUnknown
Feb 23 09:51:36 np0005626463.localdomain ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840296363859, "cf_name": "default", "job": 20, "event": "table_file_creation", "file_number": 39, "file_size": 17350766, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 17284081, "index_size": 35480, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 30149, "raw_key_size": 323351, "raw_average_key_size": 26, "raw_value_size": 17080516, "raw_average_value_size": 1416, "num_data_blocks": 1346, "num_entries": 12054, "num_filter_entries": 12054, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771839971, "oldest_key_time": 0, "file_creation_time": 1771840296, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4cfd6c8f-aafa-4003-b2f6-d22c49635dd4", "db_session_id": "66DAQ76CBLV8DSGL8JC7", "orig_file_number": 39, "seqno_to_time_mapping": "N/A"}}
Feb 23 09:51:36 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 23 09:51:36 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:51:36.364243) [db/compaction/compaction_job.cc:1663] [default] [JOB 20] Compacted 1@0 + 1@6 files to L6 => 17350766 bytes
Feb 23 09:51:36 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:51:36.365967) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 175.8 rd, 155.8 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.1, 18.6 +0.0 blob) out(16.5 +0.0 blob), read-write-amplify(426.7) write-amplify(200.5) OK, records in: 12569, records dropped: 515 output_compression: NoCompression
Feb 23 09:51:36 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:51:36.365999) EVENT_LOG_v1 {"time_micros": 1771840296365983, "job": 20, "event": "compaction_finished", "compaction_time_micros": 111375, "compaction_time_cpu_micros": 49515, "output_level": 6, "num_output_files": 1, "total_output_size": 17350766, "num_input_records": 12569, "num_output_records": 12054, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Feb 23 09:51:36 np0005626463.localdomain ceph-mon[294160]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005626463/store.db/000038.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 23 09:51:36 np0005626463.localdomain ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840296366162, "job": 20, "event": "table_file_deletion", "file_number": 38}
Feb 23 09:51:36 np0005626463.localdomain ceph-mon[294160]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005626463/store.db/000036.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 23 09:51:36 np0005626463.localdomain ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840296368918, "job": 20, "event": "table_file_deletion", "file_number": 36}
Feb 23 09:51:36 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:51:36.252538) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 23 09:51:36 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:51:36.368985) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 23 09:51:36 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:51:36.368992) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 23 09:51:36 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:51:36.368995) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 23 09:51:36 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:51:36.368998) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 23 09:51:36 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:51:36.369002) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 23 09:51:36 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:51:36.896 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:51:37 np0005626463.localdomain ceph-mon[294160]: pgmap v19: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:51:39 np0005626463.localdomain podman[242954]: time="2026-02-23T09:51:39Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 23 09:51:39 np0005626463.localdomain podman[242954]: @ - - [23/Feb/2026:09:51:39 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155258 "" "Go-http-client/1.1"
Feb 23 09:51:39 np0005626463.localdomain podman[242954]: @ - - [23/Feb/2026:09:51:39 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18286 "" "Go-http-client/1.1"
Feb 23 09:51:40 np0005626463.localdomain ceph-mon[294160]: pgmap v20: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:51:41 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:51:41 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:51:41.898 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:51:42 np0005626463.localdomain ceph-mon[294160]: pgmap v21: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:51:43 np0005626463.localdomain openstack_network_exporter[245358]: ERROR   09:51:43 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 23 09:51:43 np0005626463.localdomain openstack_network_exporter[245358]: 
Feb 23 09:51:43 np0005626463.localdomain openstack_network_exporter[245358]: ERROR   09:51:43 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 23 09:51:43 np0005626463.localdomain openstack_network_exporter[245358]: 
Feb 23 09:51:44 np0005626463.localdomain ceph-mon[294160]: pgmap v22: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:51:45 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.
Feb 23 09:51:45 np0005626463.localdomain systemd[1]: tmp-crun.f64STc.mount: Deactivated successfully.
Feb 23 09:51:45 np0005626463.localdomain podman[307583]: 2026-02-23 09:51:45.90740763 +0000 UTC m=+0.083496210 container health_status da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 23 09:51:45 np0005626463.localdomain podman[307583]: 2026-02-23 09:51:45.921306086 +0000 UTC m=+0.097394676 container exec_died da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 23 09:51:45 np0005626463.localdomain systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Deactivated successfully.
Feb 23 09:51:46 np0005626463.localdomain ceph-mon[294160]: pgmap v23: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:51:46 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:51:46 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:51:46.901 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:51:47 np0005626463.localdomain ceph-mon[294160]: pgmap v24: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:51:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:51:48.553 163572 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 09:51:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:51:48.553 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 09:51:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:51:48.555 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 09:51:48 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.
Feb 23 09:51:48 np0005626463.localdomain systemd[1]: tmp-crun.CAY8qW.mount: Deactivated successfully.
Feb 23 09:51:48 np0005626463.localdomain podman[307607]: 2026-02-23 09:51:48.913423088 +0000 UTC m=+0.086943766 container health_status 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, architecture=x86_64, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, io.buildah.version=1.33.7, release=1770267347, org.opencontainers.image.created=2026-02-05T04:57:10Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, url=https://catalog.redhat.com/en/search?searchType=containers, description=The 
Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, name=ubi9/ubi-minimal, version=9.7, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, vendor=Red Hat, Inc., build-date=2026-02-05T04:57:10Z, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Feb 23 09:51:48 np0005626463.localdomain podman[307607]: 2026-02-23 09:51:48.929400979 +0000 UTC m=+0.102921647 container exec_died 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, version=9.7, vendor=Red Hat, Inc., io.buildah.version=1.33.7, name=ubi9/ubi-minimal, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, build-date=2026-02-05T04:57:10Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, org.opencontainers.image.created=2026-02-05T04:57:10Z, release=1770267347, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9)
Feb 23 09:51:48 np0005626463.localdomain systemd[1]: 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.service: Deactivated successfully.
Feb 23 09:51:50 np0005626463.localdomain ceph-mon[294160]: pgmap v25: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:51:51 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:51:51 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:51:51.903 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:51:52 np0005626463.localdomain ceph-mon[294160]: pgmap v26: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:51:54 np0005626463.localdomain ceph-mon[294160]: pgmap v27: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:51:56 np0005626463.localdomain ceph-mon[294160]: pgmap v28: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:51:56 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:51:56 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:51:56.905 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:51:57 np0005626463.localdomain ceph-mon[294160]: pgmap v29: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:51:59 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.
Feb 23 09:51:59 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.
Feb 23 09:51:59 np0005626463.localdomain systemd[1]: tmp-crun.GYOjMW.mount: Deactivated successfully.
Feb 23 09:51:59 np0005626463.localdomain podman[307627]: 2026-02-23 09:51:59.91303019 +0000 UTC m=+0.088550346 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.43.0, config_id=ovn_controller, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260216, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true)
Feb 23 09:51:59 np0005626463.localdomain podman[307628]: 2026-02-23 09:51:59.974902087 +0000 UTC m=+0.144292064 container health_status bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 23 09:51:59 np0005626463.localdomain podman[307628]: 2026-02-23 09:51:59.98642326 +0000 UTC m=+0.155813237 container exec_died bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Feb 23 09:51:59 np0005626463.localdomain systemd[1]: bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.service: Deactivated successfully.
Feb 23 09:52:00 np0005626463.localdomain podman[307627]: 2026-02-23 09:52:00.001770101 +0000 UTC m=+0.177290207 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Feb 23 09:52:00 np0005626463.localdomain systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully.
Feb 23 09:52:00 np0005626463.localdomain ceph-mon[294160]: pgmap v30: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:52:01 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:52:01 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:52:01.907 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:52:02 np0005626463.localdomain ceph-mon[294160]: pgmap v31: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:52:02 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.
Feb 23 09:52:02 np0005626463.localdomain podman[307675]: 2026-02-23 09:52:02.90503371 +0000 UTC m=+0.081208391 container health_status be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20260216, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, 
org.label-schema.schema-version=1.0)
Feb 23 09:52:02 np0005626463.localdomain podman[307675]: 2026-02-23 09:52:02.915198282 +0000 UTC m=+0.091373003 container exec_died be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.build-date=20260216, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ceilometer_agent_compute, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, 
tcib_managed=true)
Feb 23 09:52:02 np0005626463.localdomain systemd[1]: be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.service: Deactivated successfully.
Feb 23 09:52:04 np0005626463.localdomain ceph-mon[294160]: pgmap v32: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:52:04 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/1652933032' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 23 09:52:04 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/1652933032' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 23 09:52:06 np0005626463.localdomain ceph-mon[294160]: pgmap v33: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:52:06 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:52:06 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.
Feb 23 09:52:06 np0005626463.localdomain podman[307694]: 2026-02-23 09:52:06.909156489 +0000 UTC m=+0.081210801 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.43.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb)
Feb 23 09:52:06 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:52:06.909 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:52:06 np0005626463.localdomain podman[307694]: 2026-02-23 09:52:06.922309532 +0000 UTC m=+0.094363834 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.43.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, tcib_managed=true)
Feb 23 09:52:06 np0005626463.localdomain systemd[1]: 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully.
Feb 23 09:52:07 np0005626463.localdomain ceph-mon[294160]: pgmap v34: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:52:09 np0005626463.localdomain podman[242954]: time="2026-02-23T09:52:09Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 23 09:52:09 np0005626463.localdomain podman[242954]: @ - - [23/Feb/2026:09:52:09 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155258 "" "Go-http-client/1.1"
Feb 23 09:52:09 np0005626463.localdomain podman[242954]: @ - - [23/Feb/2026:09:52:09 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18285 "" "Go-http-client/1.1"
Feb 23 09:52:10 np0005626463.localdomain ceph-mon[294160]: pgmap v35: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:52:11 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:52:11 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:52:11.912 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:52:12 np0005626463.localdomain ceph-mon[294160]: pgmap v36: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:52:12 np0005626463.localdomain sudo[307712]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 09:52:12 np0005626463.localdomain sudo[307712]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:52:12 np0005626463.localdomain sudo[307712]: pam_unix(sudo:session): session closed for user root
Feb 23 09:52:12 np0005626463.localdomain sudo[307730]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 23 09:52:12 np0005626463.localdomain sudo[307730]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:52:13 np0005626463.localdomain sudo[307730]: pam_unix(sudo:session): session closed for user root
Feb 23 09:52:13 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Feb 23 09:52:13 np0005626463.localdomain openstack_network_exporter[245358]: ERROR   09:52:13 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 23 09:52:13 np0005626463.localdomain openstack_network_exporter[245358]: ERROR   09:52:13 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 23 09:52:13 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:52:13 np0005626463.localdomain sudo[307779]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 09:52:13 np0005626463.localdomain sudo[307779]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:52:13 np0005626463.localdomain sudo[307779]: pam_unix(sudo:session): session closed for user root
Feb 23 09:52:14 np0005626463.localdomain ceph-mon[294160]: pgmap v37: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:52:14 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.107:0/4174244319' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 09:52:14 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:52:14 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 23 09:52:14 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:52:14 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 23 09:52:14 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.107:0/807576527' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 09:52:15 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Feb 23 09:52:15 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:52:16 np0005626463.localdomain ceph-mon[294160]: pgmap v38: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:52:16 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:52:16 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.108:0/1031089501' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 09:52:16 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:52:16 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.
Feb 23 09:52:16 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:52:16.914 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:52:16 np0005626463.localdomain podman[307797]: 2026-02-23 09:52:16.919911161 +0000 UTC m=+0.088198335 container health_status da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 23 09:52:16 np0005626463.localdomain podman[307797]: 2026-02-23 09:52:16.935320313 +0000 UTC m=+0.103607467 container exec_died da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 23 09:52:16 np0005626463.localdomain systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Deactivated successfully.
Feb 23 09:52:17 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.108:0/490166560' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 09:52:18 np0005626463.localdomain ceph-mon[294160]: pgmap v39: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:52:18 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:52:18.394 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:52:18 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:52:18.394 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 23 09:52:18 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:52:18.395 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 23 09:52:18 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:52:18.988 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 23 09:52:18 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:52:18.988 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquired lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 23 09:52:18 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:52:18.988 282211 DEBUG nova.network.neutron [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 23 09:52:18 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:52:18.988 282211 DEBUG nova.objects.instance [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c2a7d92b-952f-46a7-8a6a-3322a48fcf4b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 23 09:52:19 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:52:19.386 282211 DEBUG nova.network.neutron [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Updating instance_info_cache with network_info: [{"id": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "address": "fa:16:3e:a0:9d:00", "network": {"id": "9da5b53d-3184-450f-9a5b-bdba1a6c9f6d", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "37b8098efb0d4ecc90b451a2db0e966f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa27e5011-20", "ovs_interfaceid": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 23 09:52:19 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:52:19.403 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Releasing lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 23 09:52:19 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:52:19.403 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 23 09:52:19 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:52:19.404 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:52:19 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.
Feb 23 09:52:19 np0005626463.localdomain podman[307819]: 2026-02-23 09:52:19.912426597 +0000 UTC m=+0.082275553 container health_status 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, maintainer=Red Hat, Inc., config_id=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, vendor=Red Hat, Inc., io.openshift.expose-services=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, build-date=2026-02-05T04:57:10Z, release=1770267347, org.opencontainers.image.created=2026-02-05T04:57:10Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.openshift.tags=minimal rhel9, distribution-scope=public, com.redhat.component=ubi9-minimal-container, version=9.7, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, name=ubi9/ubi-minimal)
Feb 23 09:52:19 np0005626463.localdomain podman[307819]: 2026-02-23 09:52:19.929633835 +0000 UTC m=+0.099482791 container exec_died 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, release=1770267347, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, name=ubi9/ubi-minimal, architecture=x86_64, org.opencontainers.image.created=2026-02-05T04:57:10Z, build-date=2026-02-05T04:57:10Z, version=9.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, managed_by=edpm_ansible, io.buildah.version=1.33.7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, maintainer=Red Hat, Inc.)
Feb 23 09:52:19 np0005626463.localdomain systemd[1]: 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.service: Deactivated successfully.
Feb 23 09:52:20 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:52:20.054 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:52:20 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:52:20.054 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:52:20 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:52:20.055 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 23 09:52:20 np0005626463.localdomain ceph-mon[294160]: pgmap v40: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:52:21 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:52:21.050 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:52:21 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:52:21.053 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:52:21 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:52:21 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:52:21.916 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:52:21 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:52:21.918 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:52:21 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:52:21.918 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 09:52:21 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:52:21.918 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:52:21 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:52:21.919 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:52:21 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:52:21.921 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:52:22 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:52:22.054 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:52:22 np0005626463.localdomain sshd[307839]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:52:22 np0005626463.localdomain ceph-mon[294160]: pgmap v41: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:52:22 np0005626463.localdomain sshd[307839]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:52:24 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:52:24.054 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:52:24 np0005626463.localdomain ceph-mon[294160]: pgmap v42: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:52:25 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:52:25.054 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:52:25 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:52:25.075 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 09:52:25 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:52:25.075 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 09:52:25 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:52:25.076 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 09:52:25 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:52:25.076 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Auditing locally available compute resources for np0005626463.localdomain (node: np0005626463.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 23 09:52:25 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:52:25.077 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 09:52:25 np0005626463.localdomain ceph-mon[294160]: pgmap v43: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:52:25 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 23 09:52:25 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/2038369328' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 09:52:25 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:52:25.511 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.434s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 09:52:25 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:52:25.578 282211 DEBUG nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 23 09:52:25 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:52:25.579 282211 DEBUG nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 23 09:52:25 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:52:25.777 282211 WARNING nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 23 09:52:25 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:52:25.779 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Hypervisor/Node resource view: name=np0005626463.localdomain free_ram=11734MB free_disk=41.8366584777832GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 23 09:52:25 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:52:25.779 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 09:52:25 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:52:25.780 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 09:52:25 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:52:25.854 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Instance c2a7d92b-952f-46a7-8a6a-3322a48fcf4b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 23 09:52:25 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:52:25.854 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 23 09:52:25 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:52:25.855 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Final resource view: name=np0005626463.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 23 09:52:25 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:52:25.900 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 09:52:26 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:52:26 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.106:0/2038369328' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 09:52:26 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 23 09:52:26 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/1701877501' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 09:52:26 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:52:26.460 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.560s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 09:52:26 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:52:26.466 282211 DEBUG nova.compute.provider_tree [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Inventory has not changed in ProviderTree for provider: be63d86c-a403-4ec9-a515-07ea2962cb4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 23 09:52:26 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:52:26.484 282211 DEBUG nova.scheduler.client.report [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Inventory has not changed for provider be63d86c-a403-4ec9-a515-07ea2962cb4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 23 09:52:26 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:52:26.487 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Compute_service record updated for np0005626463.localdomain:np0005626463.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 23 09:52:26 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:52:26.487 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.708s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 09:52:26 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:52:26.922 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:52:27 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.106:0/1701877501' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 09:52:27 np0005626463.localdomain ceph-mon[294160]: pgmap v44: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:52:30 np0005626463.localdomain ceph-mon[294160]: pgmap v45: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:52:30 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:52:30.300 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:52:30 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:52:30.301 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=8, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '22:68:bc', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'c6:19:65:94:49:af'}, ipsec=False) old=SB_Global(nb_cfg=7) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 09:52:30 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:52:30.302 163572 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 23 09:52:30 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.
Feb 23 09:52:30 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.
Feb 23 09:52:30 np0005626463.localdomain podman[307886]: 2026-02-23 09:52:30.916986528 +0000 UTC m=+0.087625047 container health_status bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 23 09:52:30 np0005626463.localdomain podman[307886]: 2026-02-23 09:52:30.954500879 +0000 UTC m=+0.125139438 container exec_died bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Feb 23 09:52:30 np0005626463.localdomain podman[307885]: 2026-02-23 09:52:30.967323542 +0000 UTC m=+0.138520728 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, tcib_build_tag=8419493e1fd846703d277695e03fc5eb)
Feb 23 09:52:30 np0005626463.localdomain systemd[1]: bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.service: Deactivated successfully.
Feb 23 09:52:31 np0005626463.localdomain podman[307885]: 2026-02-23 09:52:31.081235834 +0000 UTC m=+0.252432950 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true)
Feb 23 09:52:31 np0005626463.localdomain systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully.
Feb 23 09:52:31 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:52:31 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:52:31.925 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:52:32 np0005626463.localdomain ceph-mon[294160]: pgmap v46: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:52:32 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:52:32.305 163572 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=96b5bb93-7341-4ce6-9b93-6a5de566c711, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '8'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 09:52:32 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:52:32.760 265541 INFO oslo.privsep.daemon [None req-4e1c89ef-b6d9-48ca-81aa-4080e1a293c6 - - - - - -] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.default', '--privsep_sock_path', '/tmp/tmpu3w1xssi/privsep.sock']
Feb 23 09:52:33 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:52:33.383 265541 INFO oslo.privsep.daemon [None req-4e1c89ef-b6d9-48ca-81aa-4080e1a293c6 - - - - - -] Spawned new privsep daemon via rootwrap
Feb 23 09:52:33 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:52:33.264 307935 INFO oslo.privsep.daemon [-] privsep daemon starting
Feb 23 09:52:33 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:52:33.269 307935 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Feb 23 09:52:33 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:52:33.273 307935 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/none
Feb 23 09:52:33 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:52:33.273 307935 INFO oslo.privsep.daemon [-] privsep daemon running as pid 307935
Feb 23 09:52:33 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.
Feb 23 09:52:33 np0005626463.localdomain podman[307940]: 2026-02-23 09:52:33.909122412 +0000 UTC m=+0.082194551 container health_status be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ceilometer_agent_compute, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Feb 23 09:52:33 np0005626463.localdomain podman[307940]: 2026-02-23 09:52:33.922207563 +0000 UTC m=+0.095279702 container exec_died be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ceilometer_agent_compute, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Feb 23 09:52:33 np0005626463.localdomain systemd[1]: be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.service: Deactivated successfully.
Feb 23 09:52:33 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:52:33.968 265541 INFO oslo.privsep.daemon [None req-4e1c89ef-b6d9-48ca-81aa-4080e1a293c6 - - - - - -] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.namespace_cmd', '--privsep_sock_path', '/tmp/tmpmznh5rez/privsep.sock']
Feb 23 09:52:34 np0005626463.localdomain ceph-mon[294160]: pgmap v47: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:52:34 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:52:34.648 265541 INFO oslo.privsep.daemon [None req-4e1c89ef-b6d9-48ca-81aa-4080e1a293c6 - - - - - -] Spawned new privsep daemon via rootwrap
Feb 23 09:52:34 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:52:34.508 307963 INFO oslo.privsep.daemon [-] privsep daemon starting
Feb 23 09:52:34 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:52:34.514 307963 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Feb 23 09:52:34 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:52:34.517 307963 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_SYS_ADMIN/CAP_SYS_ADMIN/none
Feb 23 09:52:34 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:52:34.518 307963 INFO oslo.privsep.daemon [-] privsep daemon running as pid 307963
Feb 23 09:52:35 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e86 do_prune osdmap full prune enabled
Feb 23 09:52:35 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e87 e87: 6 total, 6 up, 6 in
Feb 23 09:52:35 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e87: 6 total, 6 up, 6 in
Feb 23 09:52:35 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:52:35.560 265541 INFO oslo.privsep.daemon [None req-4e1c89ef-b6d9-48ca-81aa-4080e1a293c6 - - - - - -] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.link_cmd', '--privsep_sock_path', '/tmp/tmpu8soh03c/privsep.sock']
Feb 23 09:52:36 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:52:36.159 265541 INFO oslo.privsep.daemon [None req-4e1c89ef-b6d9-48ca-81aa-4080e1a293c6 - - - - - -] Spawned new privsep daemon via rootwrap
Feb 23 09:52:36 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:52:36.058 307975 INFO oslo.privsep.daemon [-] privsep daemon starting
Feb 23 09:52:36 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:52:36.062 307975 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Feb 23 09:52:36 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:52:36.066 307975 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_NET_ADMIN|CAP_SYS_ADMIN/none
Feb 23 09:52:36 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:52:36.066 307975 INFO oslo.privsep.daemon [-] privsep daemon running as pid 307975
Feb 23 09:52:36 np0005626463.localdomain ceph-mon[294160]: pgmap v48: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:52:36 np0005626463.localdomain ceph-mon[294160]: osdmap e87: 6 total, 6 up, 6 in
Feb 23 09:52:36 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e87 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:52:36 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:52:36.929 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:52:37 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e87 do_prune osdmap full prune enabled
Feb 23 09:52:37 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e88 e88: 6 total, 6 up, 6 in
Feb 23 09:52:37 np0005626463.localdomain ceph-mon[294160]: pgmap v50: 177 pgs: 177 active+clean; 133 MiB data, 653 MiB used, 41 GiB / 42 GiB avail; 20 KiB/s rd, 2.8 MiB/s wr, 29 op/s
Feb 23 09:52:37 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e88: 6 total, 6 up, 6 in
Feb 23 09:52:37 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : mgrmap e45: np0005626465.hlpkwo(active, since 92s), standbys: np0005626463.wtksup, np0005626466.nisqfq
Feb 23 09:52:37 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:52:37.593 265541 INFO neutron.agent.linux.ip_lib [None req-4e1c89ef-b6d9-48ca-81aa-4080e1a293c6 - - - - - -] Device tapa6e249e9-2a cannot be used as it has no MAC address
Feb 23 09:52:37 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:52:37.708 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:52:37 np0005626463.localdomain kernel: device tapa6e249e9-2a entered promiscuous mode
Feb 23 09:52:37 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:52:37Z|00074|binding|INFO|Claiming lport a6e249e9-2ac1-4cf8-814c-e468492579ad for this chassis.
Feb 23 09:52:37 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:52:37Z|00075|binding|INFO|a6e249e9-2ac1-4cf8-814c-e468492579ad: Claiming unknown
Feb 23 09:52:37 np0005626463.localdomain NetworkManager[5974]: <info>  [1771840357.7236] manager: (tapa6e249e9-2a): new Generic device (/org/freedesktop/NetworkManager/Devices/17)
Feb 23 09:52:37 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:52:37.723 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:52:37 np0005626463.localdomain systemd-udevd[307991]: Network interface NamePolicy= disabled on kernel command line.
Feb 23 09:52:37 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.
Feb 23 09:52:37 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:52:37.736 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005626463.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '192.168.199.3/24', 'neutron:device_id': 'dhcpfb23302c-55c1-5de0-badf-4fc1ff22837a-f52ac7ca-e197-490d-a7bf-412806b20437', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f52ac7ca-e197-490d-a7bf-412806b20437', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f00c2d7924384b97b57547b4797141dc', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d438fe50-32ab-4d61-865c-d5c18259e35d, chassis=[<ovs.db.idl.Row object at 0x7f808c075610>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f808c075610>], logical_port=a6e249e9-2ac1-4cf8-814c-e468492579ad) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 09:52:37 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:52:37.737 163572 INFO neutron.agent.ovn.metadata.agent [-] Port a6e249e9-2ac1-4cf8-814c-e468492579ad in datapath f52ac7ca-e197-490d-a7bf-412806b20437 bound to our chassis
Feb 23 09:52:37 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:52:37.740 163572 DEBUG neutron.agent.ovn.metadata.agent [-] Port 38c2bb38-6d42-43ab-8465-bf9a5286871e IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Feb 23 09:52:37 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:52:37.740 163572 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f52ac7ca-e197-490d-a7bf-412806b20437, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 23 09:52:37 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:52:37.743 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[9cc62bd7-f464-4f55-83b9-34aeadc04de3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 09:52:37 np0005626463.localdomain virtnodedevd[231253]: libvirt version: 11.10.0, package: 4.el9 (builder@centos.org, 2026-01-29-15:25:17, )
Feb 23 09:52:37 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:52:37Z|00076|binding|INFO|Setting lport a6e249e9-2ac1-4cf8-814c-e468492579ad ovn-installed in OVS
Feb 23 09:52:37 np0005626463.localdomain virtnodedevd[231253]: hostname: np0005626463.localdomain
Feb 23 09:52:37 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tapa6e249e9-2a: No such device
Feb 23 09:52:37 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:52:37Z|00077|binding|INFO|Setting lport a6e249e9-2ac1-4cf8-814c-e468492579ad up in Southbound
Feb 23 09:52:37 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:52:37.753 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:52:37 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tapa6e249e9-2a: No such device
Feb 23 09:52:37 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tapa6e249e9-2a: No such device
Feb 23 09:52:37 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tapa6e249e9-2a: No such device
Feb 23 09:52:37 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tapa6e249e9-2a: No such device
Feb 23 09:52:37 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tapa6e249e9-2a: No such device
Feb 23 09:52:37 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tapa6e249e9-2a: No such device
Feb 23 09:52:37 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tapa6e249e9-2a: No such device
Feb 23 09:52:37 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:52:37.802 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:52:37 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:52:37.827 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:52:37 np0005626463.localdomain systemd[1]: tmp-crun.VOH4Jg.mount: Deactivated successfully.
Feb 23 09:52:37 np0005626463.localdomain podman[307993]: 2026-02-23 09:52:37.875765623 +0000 UTC m=+0.129144291 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.build-date=20260216, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Feb 23 09:52:37 np0005626463.localdomain podman[307993]: 2026-02-23 09:52:37.880691244 +0000 UTC m=+0.134069892 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260216, tcib_managed=true, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Feb 23 09:52:37 np0005626463.localdomain systemd[1]: 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully.
Feb 23 09:52:38 np0005626463.localdomain ceph-mon[294160]: osdmap e88: 6 total, 6 up, 6 in
Feb 23 09:52:38 np0005626463.localdomain ceph-mon[294160]: mgrmap e45: np0005626465.hlpkwo(active, since 92s), standbys: np0005626463.wtksup, np0005626466.nisqfq
Feb 23 09:52:38 np0005626463.localdomain podman[308083]: 
Feb 23 09:52:38 np0005626463.localdomain podman[308083]: 2026-02-23 09:52:38.758913939 +0000 UTC m=+0.098382828 container create f4fb933ef414244081c84e33388df31570fc83c922a1140ac94ef0bca735ba11 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f52ac7ca-e197-490d-a7bf-412806b20437, io.buildah.version=1.43.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0)
Feb 23 09:52:38 np0005626463.localdomain podman[308083]: 2026-02-23 09:52:38.707285626 +0000 UTC m=+0.046754535 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 23 09:52:38 np0005626463.localdomain systemd[1]: Started libpod-conmon-f4fb933ef414244081c84e33388df31570fc83c922a1140ac94ef0bca735ba11.scope.
Feb 23 09:52:38 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 09:52:38 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0584738fbc7368073bf3c71d73f757000942af3fc730583c035cdde3d707c677/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 23 09:52:38 np0005626463.localdomain podman[308083]: 2026-02-23 09:52:38.847589437 +0000 UTC m=+0.187058316 container init f4fb933ef414244081c84e33388df31570fc83c922a1140ac94ef0bca735ba11 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f52ac7ca-e197-490d-a7bf-412806b20437, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 23 09:52:38 np0005626463.localdomain podman[308083]: 2026-02-23 09:52:38.85615665 +0000 UTC m=+0.195625529 container start f4fb933ef414244081c84e33388df31570fc83c922a1140ac94ef0bca735ba11 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f52ac7ca-e197-490d-a7bf-412806b20437, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Feb 23 09:52:38 np0005626463.localdomain dnsmasq[308101]: started, version 2.85 cachesize 150
Feb 23 09:52:38 np0005626463.localdomain dnsmasq[308101]: DNS service limited to local subnets
Feb 23 09:52:38 np0005626463.localdomain dnsmasq[308101]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 23 09:52:38 np0005626463.localdomain dnsmasq[308101]: warning: no upstream servers configured
Feb 23 09:52:38 np0005626463.localdomain dnsmasq-dhcp[308101]: DHCP, static leases only on 192.168.199.0, lease time 1d
Feb 23 09:52:38 np0005626463.localdomain dnsmasq[308101]: read /var/lib/neutron/dhcp/f52ac7ca-e197-490d-a7bf-412806b20437/addn_hosts - 0 addresses
Feb 23 09:52:38 np0005626463.localdomain dnsmasq-dhcp[308101]: read /var/lib/neutron/dhcp/f52ac7ca-e197-490d-a7bf-412806b20437/host
Feb 23 09:52:38 np0005626463.localdomain dnsmasq-dhcp[308101]: read /var/lib/neutron/dhcp/f52ac7ca-e197-490d-a7bf-412806b20437/opts
Feb 23 09:52:39 np0005626463.localdomain ceph-mon[294160]: pgmap v52: 177 pgs: 177 active+clean; 133 MiB data, 653 MiB used, 41 GiB / 42 GiB avail; 25 KiB/s rd, 3.6 MiB/s wr, 36 op/s
Feb 23 09:52:39 np0005626463.localdomain podman[242954]: time="2026-02-23T09:52:39Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 23 09:52:39 np0005626463.localdomain podman[242954]: @ - - [23/Feb/2026:09:52:39 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 157081 "" "Go-http-client/1.1"
Feb 23 09:52:39 np0005626463.localdomain podman[242954]: @ - - [23/Feb/2026:09:52:39 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18777 "" "Go-http-client/1.1"
Feb 23 09:52:39 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:52:39.981 265541 INFO neutron.agent.dhcp.agent [None req-970079b0-9620-418a-9f8a-2f59947e4e0c - - - - - -] DHCP configuration for ports {'4d46b2db-eaf9-4e89-bf4e-88bee8d2ccef'} is completed
Feb 23 09:52:41 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:52:41 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:52:41.931 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:52:41 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:52:41.936 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:52:42 np0005626463.localdomain ceph-mon[294160]: pgmap v53: 177 pgs: 177 active+clean; 145 MiB data, 694 MiB used, 41 GiB / 42 GiB avail; 33 KiB/s rd, 5.1 MiB/s wr, 48 op/s
Feb 23 09:52:43 np0005626463.localdomain openstack_network_exporter[245358]: ERROR   09:52:43 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 23 09:52:43 np0005626463.localdomain openstack_network_exporter[245358]: 
Feb 23 09:52:43 np0005626463.localdomain openstack_network_exporter[245358]: ERROR   09:52:43 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 23 09:52:43 np0005626463.localdomain openstack_network_exporter[245358]: 
Feb 23 09:52:44 np0005626463.localdomain ceph-mon[294160]: pgmap v54: 177 pgs: 177 active+clean; 145 MiB data, 707 MiB used, 41 GiB / 42 GiB avail; 33 KiB/s rd, 5.1 MiB/s wr, 48 op/s
Feb 23 09:52:46 np0005626463.localdomain ceph-mon[294160]: pgmap v55: 177 pgs: 177 active+clean; 145 MiB data, 707 MiB used, 41 GiB / 42 GiB avail; 27 KiB/s rd, 4.1 MiB/s wr, 38 op/s
Feb 23 09:52:46 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:52:46 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:52:46.933 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:52:46 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:52:46.937 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:52:47 np0005626463.localdomain ceph-mon[294160]: pgmap v56: 177 pgs: 177 active+clean; 145 MiB data, 707 MiB used, 41 GiB / 42 GiB avail; 6.2 KiB/s rd, 1.2 MiB/s wr, 8 op/s
Feb 23 09:52:47 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.
Feb 23 09:52:47 np0005626463.localdomain podman[308103]: 2026-02-23 09:52:47.921527408 +0000 UTC m=+0.088866915 container health_status da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 23 09:52:47 np0005626463.localdomain podman[308103]: 2026-02-23 09:52:47.964562398 +0000 UTC m=+0.131901875 container exec_died da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 23 09:52:47 np0005626463.localdomain systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Deactivated successfully.
Feb 23 09:52:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:52:48.554 163572 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 09:52:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:52:48.555 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 09:52:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:52:48.556 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 09:52:50 np0005626463.localdomain ceph-mon[294160]: pgmap v57: 177 pgs: 177 active+clean; 145 MiB data, 707 MiB used, 41 GiB / 42 GiB avail; 5.3 KiB/s rd, 1.1 MiB/s wr, 7 op/s
Feb 23 09:52:50 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.
Feb 23 09:52:50 np0005626463.localdomain podman[308127]: 2026-02-23 09:52:50.920579414 +0000 UTC m=+0.088749002 container health_status 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., org.opencontainers.image.created=2026-02-05T04:57:10Z, build-date=2026-02-05T04:57:10Z, vcs-type=git, io.buildah.version=1.33.7, version=9.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vendor=Red Hat, Inc., managed_by=edpm_ansible, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, io.openshift.expose-services=, release=1770267347, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, name=ubi9/ubi-minimal)
Feb 23 09:52:50 np0005626463.localdomain podman[308127]: 2026-02-23 09:52:50.965364217 +0000 UTC m=+0.133533785 container exec_died 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, org.opencontainers.image.created=2026-02-05T04:57:10Z, io.openshift.expose-services=, release=1770267347, name=ubi9/ubi-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, io.openshift.tags=minimal rhel9, distribution-scope=public, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, managed_by=edpm_ansible, build-date=2026-02-05T04:57:10Z, container_name=openstack_network_exporter, vcs-type=git, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter, version=9.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 23 09:52:50 np0005626463.localdomain systemd[1]: 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.service: Deactivated successfully.
Feb 23 09:52:51 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:52:51 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:52:51.936 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:52:51 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:52:51.940 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:52:52 np0005626463.localdomain ceph-mon[294160]: pgmap v58: 177 pgs: 177 active+clean; 145 MiB data, 707 MiB used, 41 GiB / 42 GiB avail; 5.2 KiB/s rd, 1.0 MiB/s wr, 7 op/s
Feb 23 09:52:53 np0005626463.localdomain ceph-mon[294160]: pgmap v59: 177 pgs: 177 active+clean; 145 MiB data, 707 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:52:56 np0005626463.localdomain ceph-mon[294160]: pgmap v60: 177 pgs: 177 active+clean; 145 MiB data, 707 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.136 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'name': 'test', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000003', 'OS-EXT-SRV-ATTR:host': 'np0005626463.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '37b8098efb0d4ecc90b451a2db0e966f', 'user_id': 'cb6895487918456aa599ca2f76872d00', 'hostId': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.137 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.149 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.150 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.152 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a403f99f-cf8c-43ce-91f8-4976707ea483', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:52:56.137615', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '6db23bfe-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11816.32709778, 'message_signature': '6c48a94920d0213268430a5cc86cbc02cb0fb30f420f90f88f8ff10d2c1e8497'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:52:56.137615', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '6db24edc-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11816.32709778, 'message_signature': 'ab61ed7628a9bbe58b4201933bb5cce59891bef55808c9c09ffdcd4155338ccc'}]}, 'timestamp': '2026-02-23 09:52:56.151093', '_unique_id': 'c0c110997904434c8c24adf7576e625d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.152 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.152 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.152 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.152 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.152 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.152 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.152 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.152 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.152 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.152 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.152 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.152 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.152 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.152 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.152 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.152 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.152 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.152 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.152 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.152 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.152 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.152 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.152 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.152 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.152 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.152 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.152 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.152 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.152 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.152 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.152 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.154 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.154 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.154 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.156 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e584afda-a9e7-4d91-afb6-8fd389c5d023', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:52:56.154149', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '6db2da3c-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11816.32709778, 'message_signature': '978ff5a49e58209fba251079a9a93872994fdca4d7b9b72556068484ed874b93'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 
'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:52:56.154149', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '6db2ea9a-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11816.32709778, 'message_signature': '823ff808c19f2c5d7876cba172d18e6052b268bce3e6c3e76efb64d2b48058b8'}]}, 'timestamp': '2026-02-23 09:52:56.155073', '_unique_id': 'fb21675608c143739754fa1b19fd4a90'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.156 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.156 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.156 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.156 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.156 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.156 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.156 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.156 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.156 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.156 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.156 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.156 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.156 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.156 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.156 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.156 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.156 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.156 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.156 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.156 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.156 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.156 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.156 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.156 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.156 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.156 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.156 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.156 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.156 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.156 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.156 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.157 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.185 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.requests volume: 1283 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.186 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.188 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6621c95f-3213-4f48-b4f9-bc5de10fa92b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1283, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:52:56.157306', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '6db7b796-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11816.346789914, 'message_signature': '841ac17f2e952ee0c776cbc5f4fca4f8b742307ab3a45caa419a2fc9850d3c82'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 
'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:52:56.157306', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '6db7c998-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11816.346789914, 'message_signature': '77983e0af840e6b53e233aac95af7cb465a42e4759328cb942948795466fe07b'}]}, 'timestamp': '2026-02-23 09:52:56.187020', '_unique_id': 'd58893ee9fa746be8389dc0fefba112c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.188 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.188 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.188 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.188 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.188 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.188 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.188 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.188 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.188 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.188 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.188 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.188 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.188 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.188 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.188 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.188 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.188 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.188 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.188 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.188 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.188 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.188 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.188 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.188 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.188 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.188 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.188 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.188 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.188 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.188 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.188 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.189 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.192 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.194 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '386f5ba1-ed5d-48ff-94a9-37f9a653ea18', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:52:56.189684', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '6db8c366-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11816.379190088, 'message_signature': 'c6e76c09462620436629b8e16153a3547f98604c3471bd296d7253a032ffc0cd'}]}, 'timestamp': '2026-02-23 09:52:56.193352', '_unique_id': '9e72e5f3bb1a4dd68b8dbe46ecd1397b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.194 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.194 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.194 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.194 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.194 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.194 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.194 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.194 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.194 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.194 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.194 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.194 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.194 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.194 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.194 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.194 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.194 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.194 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.194 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.194 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.194 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.194 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.194 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.194 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.194 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.194 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.194 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.194 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.194 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.194 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.194 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.195 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.195 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.latency volume: 1374424344 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.196 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.latency volume: 89322858 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.197 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0d502be5-94f9-4da0-94bc-a977af167870', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1374424344, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:52:56.195577', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '6db92d06-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11816.346789914, 'message_signature': '2e4d1cba71708a7d3f8d8fcdd2e27ab4ce62f9bee3bc5d96fa2150967c5b986c'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 89322858, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:52:56.195577', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '6db942fa-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11816.346789914, 'message_signature': '42d8fab9ba00abc825586744d7b692aceebc44f0ba23935fb5de0711b4bb182a'}]}, 'timestamp': '2026-02-23 09:52:56.196589', '_unique_id': '85cf677e87d640b7b344ab345ad7548e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.197 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.197 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.197 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.197 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.197 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.197 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.197 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.197 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.197 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.197 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.197 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.197 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.197 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.197 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.197 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.197 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.197 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.197 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.197 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.197 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.197 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.197 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.197 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.197 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.197 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.197 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.197 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.197 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.197 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.197 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.197 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.198 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.198 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.200 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd99a8241-67db-48a3-8177-ea9b24b84302', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:52:56.198830', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '6db9ad62-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11816.379190088, 'message_signature': 'c0995ccbf590bfb08538ffdc6443a0c5cc4ee5e01be8bcab81e915c76c3ea8b9'}]}, 'timestamp': '2026-02-23 09:52:56.199334', '_unique_id': 'cb73d5b15b3b488f8e1c51bdee72d250'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.200 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.200 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.200 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.200 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.200 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.200 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.200 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.200 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.200 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.200 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.200 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.200 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.200 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.200 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.200 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.200 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.200 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.200 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.200 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.200 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.200 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.200 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.200 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.200 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.200 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.200 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.200 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.200 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.200 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.200 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.200 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.201 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.201 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.203 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e18b8400-ca80-454e-8a4e-785b1e9a3b48', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:52:56.201764', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '6dba1fe0-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11816.379190088, 'message_signature': '4546ee1f5d35d0115e92a3b0572b8878211c4071daaffb6505c870f0c2bc7a1f'}]}, 'timestamp': '2026-02-23 09:52:56.202264', '_unique_id': 'f91f54c49b49492a9cb657651d89aef7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.203 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.203 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.203 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.203 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.203 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.203 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.203 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.203 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.203 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.203 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.203 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.203 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.203 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.203 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.203 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.203 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.203 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.203 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.203 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.203 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.203 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.203 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.203 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.203 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.203 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.203 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.203 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.203 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.203 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.203 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.203 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.204 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.204 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.204 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.206 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b96b9796-14b3-4ad5-8600-298cdb5868da', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:52:56.204373', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '6dba83fe-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11816.346789914, 'message_signature': '76bd46dc750382bf2e87974ee1bc60b1a24a8de96219a290b7a23ab2db583875'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 
'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:52:56.204373', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '6dba972c-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11816.346789914, 'message_signature': '9b06c6d9b51409139bb9a015e53c5eec2372085d0c942f7debcc2538c26150c0'}]}, 'timestamp': '2026-02-23 09:52:56.205291', '_unique_id': 'd3c5402438564833b157a5b553ef1b8d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.206 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.206 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.206 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.206 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.206 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.206 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.206 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.206 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.206 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.206 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.206 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.206 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.206 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.206 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.206 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.206 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.206 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.206 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.206 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.206 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.206 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.206 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.206 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.206 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.206 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.206 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.206 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.206 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.206 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.206 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.206 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.207 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.207 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.208 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5acfa468-eee8-4f15-824b-dba0e794d0f4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:52:56.207513', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '6dbaff0a-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11816.379190088, 'message_signature': '7d3accf3f0a7e28b289d4052378e384c3abd10c003771e77171354757fa7a68e'}]}, 'timestamp': '2026-02-23 09:52:56.208013', '_unique_id': '894671bdf0e34bd592aea32f98ae5d27'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.208 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.208 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.208 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.208 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.208 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.208 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.208 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.208 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.208 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.208 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.208 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.208 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.208 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.208 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.208 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.208 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.208 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.208 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.208 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.208 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.208 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.208 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.208 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.208 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.208 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.208 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.208 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.208 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.208 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.208 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.208 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.210 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.210 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.bytes volume: 35597312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.210 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.211 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6dc1271d-da02-42de-8439-8f52324b0d10', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35597312, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:52:56.210143', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '6dbb6576-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11816.346789914, 'message_signature': '14ca091eb53f86ae5c1411dff702a54cc1a7f0780b4240a1a4993d5274312dc0'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:52:56.210143', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '6dbb75f2-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11816.346789914, 'message_signature': '1eaa722fd9381772b6afbfb5b6338be14f238ba0543205d0e3ff1286f416f3dc'}]}, 'timestamp': '2026-02-23 09:52:56.211029', '_unique_id': 'cd304300f34c4b18918d532d418d6751'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.211 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.211 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.211 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.211 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.211 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.211 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.211 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.211 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.211 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.211 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.211 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.211 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.211 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.211 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.211 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.211 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.211 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.211 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.211 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.211 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.211 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.211 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.211 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.211 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.211 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.211 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.211 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.211 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.211 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.211 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.211 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.213 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.213 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.214 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dd85fe9d-449b-480a-b853-915e5f1380f6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:52:56.213198', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '6dbbdcf4-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11816.379190088, 'message_signature': 'b1def77732d128eb3dc21dde6449edfda377876fe628b6406325e7aa1fb6f8a7'}]}, 'timestamp': '2026-02-23 09:52:56.213655', '_unique_id': '509774196d13445cb4d06d9616704c56'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.214 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.214 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.214 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.214 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.214 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.214 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.214 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.214 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.214 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.214 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.214 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.214 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.214 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.214 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.214 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.214 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.214 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.214 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.214 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.214 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.214 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.214 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.214 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.214 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.214 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.214 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.214 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.214 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.214 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.214 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.214 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.215 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.215 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.latency volume: 1054797520 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.216 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.latency volume: 21338362 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.217 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0df52c24-b4a8-4e5d-a2e5-bfc2679fa5ae', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1054797520, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:52:56.215740', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '6dbc416c-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11816.346789914, 'message_signature': '2147ae0344d6b13e64ac5fc95040f4e9439125e9a0561f4660b8a31a39646540'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 21338362, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:52:56.215740', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '6dbc542c-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11816.346789914, 'message_signature': '431874d2453e2d6a570547692b73242f03d8db7cfb42313e58c6b0d404a1de0b'}]}, 'timestamp': '2026-02-23 09:52:56.216716', '_unique_id': '18ea33ae83894cd89294216ba1a3ae7d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.217 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.217 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.217 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.217 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.217 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.217 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.217 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.217 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.217 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.217 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.217 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.217 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.217 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.217 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.217 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.217 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.217 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.217 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.217 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.217 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.217 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.217 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.217 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.217 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.217 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.217 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.217 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.217 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.217 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.217 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.217 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.218 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.218 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.220 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '14474806-b793-4bf5-b24b-2cbd43f16ace', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:52:56.218823', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '6dbcb9da-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11816.379190088, 'message_signature': '9f24032eb2ed2d3baad3631733c6dc2b8c28b5d44554c4ddf34056268cb28c1d'}]}, 'timestamp': '2026-02-23 09:52:56.219309', '_unique_id': '64d5cfda42574d84bffad74186b70a60'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.220 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.220 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.220 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.220 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.220 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.220 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.220 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.220 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.220 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.220 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.220 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.220 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.220 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.220 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.220 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.220 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.220 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.220 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.220 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.220 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.220 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.220 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.220 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.220 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.220 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.220 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.220 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.220 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.220 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.220 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.220 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.221 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.238 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/memory.usage volume: 51.72265625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.240 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5a4f49d3-e09c-4136-93a1-a4a78d032e81', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.72265625, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'timestamp': '2026-02-23T09:52:56.221393', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '6dbfc72e-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11816.428154928, 'message_signature': 'a8262a8b95bf422dbb974a22fa63b2f5016202586facccf483c38a3577476998'}]}, 'timestamp': '2026-02-23 09:52:56.239307', '_unique_id': 'd13a3807f03147fbb26b5e2d28af895c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.240 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.240 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.240 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.240 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.240 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.240 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.240 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.240 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.240 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.240 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.240 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.240 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.240 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.240 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.240 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.240 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.240 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.240 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.240 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.240 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.240 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.240 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.240 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.240 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.240 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.240 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.240 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.240 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.240 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.240 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.240 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.241 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.241 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.242 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2ead0b49-2547-46db-a196-25eef3141b34', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:52:56.241470', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '6dc02d22-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11816.379190088, 'message_signature': 'c60d285185e7b1a6a5c74c1cbbde595997742586e99559e3536000156b6804ee'}]}, 'timestamp': '2026-02-23 09:52:56.241952', '_unique_id': '4e68a67fe8614e8eb4f5a8c9b116b509'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.242 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.242 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.242 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.242 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.242 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.242 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.242 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.242 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.242 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.242 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.242 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.242 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.242 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.242 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.242 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.242 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.242 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.242 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.242 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.242 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.242 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.242 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.242 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.242 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.242 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.242 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.242 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.242 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.242 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.242 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.242 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.244 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.244 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.bytes volume: 6808 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.245 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8da5cc68-f293-4cc9-b5bc-2161baf76c4c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6808, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:52:56.244196', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '6dc097e4-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11816.379190088, 'message_signature': '8e1c4baeca59843ce48eedba6abe47b85861581f9730ac7defa690d6e78634cb'}]}, 'timestamp': '2026-02-23 09:52:56.244659', '_unique_id': 'ad884156fec545b7a4fc4a1f4c5bba9b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.245 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.245 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.245 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.245 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.245 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.245 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.245 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.245 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.245 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.245 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.245 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.245 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.245 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.245 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.245 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.245 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.245 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.245 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.245 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.245 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.245 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.245 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.245 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.245 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.245 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.245 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.245 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.245 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.245 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.245 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.245 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.246 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.246 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.248 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dacf9d90-83b4-4075-8727-b8d51a730c38', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:52:56.246736', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '6dc0fbf8-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11816.379190088, 'message_signature': '75c9d44e7c1500f54d3a2abf49c5b4f6532108328716542a2c2b708578b97a0c'}]}, 'timestamp': '2026-02-23 09:52:56.247214', '_unique_id': '74d7bd41780d48bb89d99b9f1f7e0c7f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.248 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.248 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.248 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.248 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.248 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.248 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.248 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.248 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.248 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.248 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.248 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.248 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.248 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.248 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.248 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.248 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.248 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.248 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.248 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.248 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.248 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.248 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.248 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.248 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.248 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.248 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.248 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.248 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.248 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.248 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.248 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.249 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.249 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.249 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.249 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/cpu volume: 12580000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.250 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '793bb3f7-8238-4dd8-8623-a3a628fe8345', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 12580000000, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'timestamp': '2026-02-23T09:52:56.249558', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '6dc16908-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11816.428154928, 'message_signature': 'f0768357bb1628766dfa4fe85467acc72904e86a5c4502f83d6f8007a9c7bf6a'}]}, 'timestamp': '2026-02-23 09:52:56.250026', '_unique_id': '55a0d6a3f8f147c5b14ff51656a3d97b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.250 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.250 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.250 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.250 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.250 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.250 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.250 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.250 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.250 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.250 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.250 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.250 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.250 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.250 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.250 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.250 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.250 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.250 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.250 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.250 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.250 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.250 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.250 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.250 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.250 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.250 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.250 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.250 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.250 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.250 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.250 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.251 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.252 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.253 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '28fb9d72-697c-45c5-9e71-d2d73b9de224', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:52:56.252119', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '6dc1cd12-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11816.379190088, 'message_signature': 'b8bc347771d4511549327bb73ad0e9f3f0d33903280a702b9cf37224fb7b819e'}]}, 'timestamp': '2026-02-23 09:52:56.252571', '_unique_id': '00c0abcd65514d51a3cc4d5269642e38'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.253 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.253 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.253 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.253 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.253 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.253 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.253 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.253 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.253 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.253 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.253 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.253 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.253 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.253 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.253 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.253 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.253 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.253 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.253 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.253 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.253 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.253 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.253 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.253 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.253 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.253 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.253 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.253 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.253 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.253 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.253 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.254 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.254 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.255 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.256 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e2b644b6-55d9-4c1c-a679-e7b8cd8c923a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:52:56.254821', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '6dc238c4-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11816.32709778, 'message_signature': 'ebd2a0487b074eabd1af949976d0a6659ae72de830e4cd9c7b85f87891cf3dfd'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 
'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:52:56.254821', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '6dc2492c-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11816.32709778, 'message_signature': '6868fbebe194e89cf1fc9e3d4445bcd321ba23f21b7e802bc25e8a63ffa53baf'}]}, 'timestamp': '2026-02-23 09:52:56.255722', '_unique_id': '52c830864f4e41cdad9e7c3c7a12a2f5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.256 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.256 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.256 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.256 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.256 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.256 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.256 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.256 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.256 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.256 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.256 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.256 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.256 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.256 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.256 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.256 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.256 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.256 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.256 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.256 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.256 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.256 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.256 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.256 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.256 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.256 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.256 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.256 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.256 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.256 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.256 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.257 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.257 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.bytes volume: 397312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.258 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.259 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c35896ca-df79-4f07-b2d0-3a9fdcdc8d03', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 397312, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:52:56.257862', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '6dc2aeda-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11816.346789914, 'message_signature': '5d68c482bae114b9820a8f95d5a1d2c847f33ce90a3dc89bf87cce5433e3fcc8'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 
'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:52:56.257862', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '6dc2bf24-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11816.346789914, 'message_signature': '489d8e93b6dbf0c69486950b13ba647544ce93b6611302468d9bd2a0b42f6d62'}]}, 'timestamp': '2026-02-23 09:52:56.258737', '_unique_id': 'db876bfeae4e4e4e9877c87039a49a1b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.259 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.259 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.259 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.259 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.259 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.259 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.259 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.259 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.259 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.259 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.259 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.259 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.259 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.259 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.259 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.259 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.259 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.259 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.259 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.259 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.259 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.259 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.259 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.259 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.259 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.259 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.259 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.259 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.259 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.259 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.259 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.260 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 23 09:52:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.261 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 23 09:52:56 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:52:56 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:52:56.938 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:52:56 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:52:56.941 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:52:57 np0005626463.localdomain ceph-mon[294160]: pgmap v61: 177 pgs: 177 active+clean; 145 MiB data, 707 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:52:59 np0005626463.localdomain ceph-mon[294160]: pgmap v62: 177 pgs: 177 active+clean; 145 MiB data, 707 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:53:01 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:53:01 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.
Feb 23 09:53:01 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.
Feb 23 09:53:01 np0005626463.localdomain podman[308149]: 2026-02-23 09:53:01.906916666 +0000 UTC m=+0.081670696 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.build-date=20260216, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 23 09:53:01 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:53:01.940 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:53:01 np0005626463.localdomain podman[308150]: 2026-02-23 09:53:01.966158842 +0000 UTC m=+0.138012243 container health_status bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Feb 23 09:53:01 np0005626463.localdomain podman[308149]: 2026-02-23 09:53:01.969813474 +0000 UTC m=+0.144567484 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ovn_controller, org.label-schema.build-date=20260216, managed_by=edpm_ansible, io.buildah.version=1.43.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Feb 23 09:53:01 np0005626463.localdomain podman[308150]: 2026-02-23 09:53:01.973522577 +0000 UTC m=+0.145375968 container exec_died bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 23 09:53:01 np0005626463.localdomain systemd[1]: bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.service: Deactivated successfully.
Feb 23 09:53:02 np0005626463.localdomain systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully.
Feb 23 09:53:02 np0005626463.localdomain ceph-mon[294160]: pgmap v63: 177 pgs: 177 active+clean; 145 MiB data, 707 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:53:04 np0005626463.localdomain ceph-mon[294160]: pgmap v64: 177 pgs: 177 active+clean; 145 MiB data, 707 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:53:04 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/2214862826' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 23 09:53:04 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/2214862826' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 23 09:53:04 np0005626463.localdomain sshd[308197]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:53:04 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.
Feb 23 09:53:04 np0005626463.localdomain podman[308199]: 2026-02-23 09:53:04.910571623 +0000 UTC m=+0.086631497 container health_status be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute)
Feb 23 09:53:04 np0005626463.localdomain podman[308199]: 2026-02-23 09:53:04.924311434 +0000 UTC m=+0.100371328 container exec_died be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, container_name=ceilometer_agent_compute, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Feb 23 09:53:04 np0005626463.localdomain systemd[1]: be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.service: Deactivated successfully.
Feb 23 09:53:05 np0005626463.localdomain sshd[308197]: Connection closed by 45.148.10.121 port 44374 [preauth]
Feb 23 09:53:06 np0005626463.localdomain ceph-mon[294160]: pgmap v65: 177 pgs: 177 active+clean; 145 MiB data, 707 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:53:06 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:53:06 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:53:06.943 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:53:07 np0005626463.localdomain ceph-mon[294160]: pgmap v66: 177 pgs: 177 active+clean; 145 MiB data, 707 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:53:07 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:53:07Z|00078|memory_trim|INFO|Detected inactivity (last active 30006 ms ago): trimming memory
Feb 23 09:53:08 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:53:08.536 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=9, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '22:68:bc', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'c6:19:65:94:49:af'}, ipsec=False) old=SB_Global(nb_cfg=8) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 09:53:08 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:53:08.537 163572 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 23 09:53:08 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:53:08.537 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:53:08 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.
Feb 23 09:53:08 np0005626463.localdomain podman[308219]: 2026-02-23 09:53:08.903094557 +0000 UTC m=+0.078190118 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true)
Feb 23 09:53:08 np0005626463.localdomain podman[308219]: 2026-02-23 09:53:08.936550033 +0000 UTC m=+0.111645594 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.build-date=20260216)
Feb 23 09:53:08 np0005626463.localdomain systemd[1]: 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully.
Feb 23 09:53:09 np0005626463.localdomain podman[242954]: time="2026-02-23T09:53:09Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 23 09:53:09 np0005626463.localdomain podman[242954]: @ - - [23/Feb/2026:09:53:09 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 157081 "" "Go-http-client/1.1"
Feb 23 09:53:09 np0005626463.localdomain podman[242954]: @ - - [23/Feb/2026:09:53:09 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18798 "" "Go-http-client/1.1"
Feb 23 09:53:10 np0005626463.localdomain ceph-mon[294160]: pgmap v67: 177 pgs: 177 active+clean; 145 MiB data, 707 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:53:11 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:53:11 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #40. Immutable memtables: 0.
Feb 23 09:53:11 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:53:11.318457) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Feb 23 09:53:11 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/flush_job.cc:856] [default] [JOB 21] Flushing memtable with next log file: 40
Feb 23 09:53:11 np0005626463.localdomain ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840391318495, "job": 21, "event": "flush_started", "num_memtables": 1, "num_entries": 1188, "num_deletes": 256, "total_data_size": 1290539, "memory_usage": 1312240, "flush_reason": "Manual Compaction"}
Feb 23 09:53:11 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/flush_job.cc:885] [default] [JOB 21] Level-0 flush table #41: started
Feb 23 09:53:11 np0005626463.localdomain ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840391327057, "cf_name": "default", "job": 21, "event": "table_file_creation", "file_number": 41, "file_size": 1267672, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 22845, "largest_seqno": 24032, "table_properties": {"data_size": 1262533, "index_size": 2610, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1477, "raw_key_size": 11142, "raw_average_key_size": 19, "raw_value_size": 1251999, "raw_average_value_size": 2188, "num_data_blocks": 116, "num_entries": 572, "num_filter_entries": 572, "num_deletions": 256, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771840296, "oldest_key_time": 1771840296, "file_creation_time": 1771840391, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4cfd6c8f-aafa-4003-b2f6-d22c49635dd4", "db_session_id": "66DAQ76CBLV8DSGL8JC7", "orig_file_number": 41, "seqno_to_time_mapping": "N/A"}}
Feb 23 09:53:11 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 21] Flush lasted 8647 microseconds, and 3883 cpu microseconds.
Feb 23 09:53:11 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 23 09:53:11 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:53:11.327104) [db/flush_job.cc:967] [default] [JOB 21] Level-0 flush table #41: 1267672 bytes OK
Feb 23 09:53:11 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:53:11.327128) [db/memtable_list.cc:519] [default] Level-0 commit table #41 started
Feb 23 09:53:11 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:53:11.331161) [db/memtable_list.cc:722] [default] Level-0 commit table #41: memtable #1 done
Feb 23 09:53:11 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:53:11.331179) EVENT_LOG_v1 {"time_micros": 1771840391331173, "job": 21, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Feb 23 09:53:11 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:53:11.331199) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Feb 23 09:53:11 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 21] Try to delete WAL files size 1285123, prev total WAL file size 1285447, number of live WAL files 2.
Feb 23 09:53:11 np0005626463.localdomain ceph-mon[294160]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005626463/store.db/000037.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 23 09:53:11 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:53:11.331916) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0033373638' seq:72057594037927935, type:22 .. '6C6F676D0034303230' seq:0, type:0; will stop at (end)
Feb 23 09:53:11 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 22] Compacting 1@0 + 1@6 files to L6, score -1.00
Feb 23 09:53:11 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 21 Base level 0, inputs: [41(1237KB)], [39(16MB)]
Feb 23 09:53:11 np0005626463.localdomain ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840391331979, "job": 22, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [41], "files_L6": [39], "score": -1, "input_data_size": 18618438, "oldest_snapshot_seqno": -1}
Feb 23 09:53:11 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 22] Generated table #42: 12090 keys, 18507129 bytes, temperature: kUnknown
Feb 23 09:53:11 np0005626463.localdomain ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840391439850, "cf_name": "default", "job": 22, "event": "table_file_creation", "file_number": 42, "file_size": 18507129, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 18438120, "index_size": 37676, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 30277, "raw_key_size": 325082, "raw_average_key_size": 26, "raw_value_size": 18231880, "raw_average_value_size": 1508, "num_data_blocks": 1437, "num_entries": 12090, "num_filter_entries": 12090, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771839971, "oldest_key_time": 0, "file_creation_time": 1771840391, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4cfd6c8f-aafa-4003-b2f6-d22c49635dd4", "db_session_id": "66DAQ76CBLV8DSGL8JC7", "orig_file_number": 42, "seqno_to_time_mapping": "N/A"}}
Feb 23 09:53:11 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 23 09:53:11 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:53:11.440227) [db/compaction/compaction_job.cc:1663] [default] [JOB 22] Compacted 1@0 + 1@6 files to L6 => 18507129 bytes
Feb 23 09:53:11 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:53:11.441846) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 172.4 rd, 171.4 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.2, 16.5 +0.0 blob) out(17.6 +0.0 blob), read-write-amplify(29.3) write-amplify(14.6) OK, records in: 12626, records dropped: 536 output_compression: NoCompression
Feb 23 09:53:11 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:53:11.441897) EVENT_LOG_v1 {"time_micros": 1771840391441863, "job": 22, "event": "compaction_finished", "compaction_time_micros": 107996, "compaction_time_cpu_micros": 51197, "output_level": 6, "num_output_files": 1, "total_output_size": 18507129, "num_input_records": 12626, "num_output_records": 12090, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Feb 23 09:53:11 np0005626463.localdomain ceph-mon[294160]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005626463/store.db/000041.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 23 09:53:11 np0005626463.localdomain ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840391442210, "job": 22, "event": "table_file_deletion", "file_number": 41}
Feb 23 09:53:11 np0005626463.localdomain ceph-mon[294160]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005626463/store.db/000039.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 23 09:53:11 np0005626463.localdomain ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840391444741, "job": 22, "event": "table_file_deletion", "file_number": 39}
Feb 23 09:53:11 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:53:11.331780) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 23 09:53:11 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:53:11.444860) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 23 09:53:11 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:53:11.444889) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 23 09:53:11 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:53:11.444892) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 23 09:53:11 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:53:11.444895) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 23 09:53:11 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:53:11.444898) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 23 09:53:11 np0005626463.localdomain sshd[308237]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:53:11 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:53:11.945 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:53:12 np0005626463.localdomain sshd[308237]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:53:12 np0005626463.localdomain ceph-mon[294160]: pgmap v68: 177 pgs: 177 active+clean; 145 MiB data, 707 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:53:13 np0005626463.localdomain openstack_network_exporter[245358]: ERROR   09:53:13 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 23 09:53:13 np0005626463.localdomain openstack_network_exporter[245358]: 
Feb 23 09:53:13 np0005626463.localdomain openstack_network_exporter[245358]: ERROR   09:53:13 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 23 09:53:13 np0005626463.localdomain openstack_network_exporter[245358]: 
Feb 23 09:53:13 np0005626463.localdomain sudo[308239]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 09:53:13 np0005626463.localdomain sudo[308239]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:53:13 np0005626463.localdomain sudo[308239]: pam_unix(sudo:session): session closed for user root
Feb 23 09:53:13 np0005626463.localdomain sudo[308257]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 check-host
Feb 23 09:53:13 np0005626463.localdomain sudo[308257]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:53:14 np0005626463.localdomain ceph-mon[294160]: pgmap v69: 177 pgs: 177 active+clean; 145 MiB data, 707 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:53:14 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626465.localdomain.devices.0}] v 0)
Feb 23 09:53:14 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:53:14 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626465.localdomain}] v 0)
Feb 23 09:53:14 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:53:14 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain.devices.0}] v 0)
Feb 23 09:53:14 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:53:14 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain}] v 0)
Feb 23 09:53:14 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:53:14 np0005626463.localdomain sudo[308257]: pam_unix(sudo:session): session closed for user root
Feb 23 09:53:14 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626463.localdomain.devices.0}] v 0)
Feb 23 09:53:14 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:53:14 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626463.localdomain}] v 0)
Feb 23 09:53:14 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:53:14 np0005626463.localdomain sudo[308295]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 09:53:14 np0005626463.localdomain sudo[308295]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:53:14 np0005626463.localdomain sudo[308295]: pam_unix(sudo:session): session closed for user root
Feb 23 09:53:14 np0005626463.localdomain sudo[308313]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 23 09:53:14 np0005626463.localdomain sudo[308313]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:53:14 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:53:14.540 163572 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=96b5bb93-7341-4ce6-9b93-6a5de566c711, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '9'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 09:53:15 np0005626463.localdomain sudo[308313]: pam_unix(sudo:session): session closed for user root
Feb 23 09:53:15 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Feb 23 09:53:15 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:53:15 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:53:15 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:53:15 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:53:15 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:53:15 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:53:15 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:53:15 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:53:15 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 23 09:53:15 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:53:15 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 23 09:53:15 np0005626463.localdomain sudo[308363]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 09:53:15 np0005626463.localdomain sudo[308363]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:53:15 np0005626463.localdomain sudo[308363]: pam_unix(sudo:session): session closed for user root
Feb 23 09:53:16 np0005626463.localdomain ceph-mon[294160]: pgmap v70: 177 pgs: 177 active+clean; 145 MiB data, 707 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:53:16 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.107:0/4261639450' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 09:53:16 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.107:0/2906954021' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 09:53:16 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:53:16 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:53:16.947 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:53:16 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:53:16.948 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:53:16 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:53:16.949 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 09:53:16 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:53:16.949 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:53:16 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:53:16.950 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:53:16 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:53:16.953 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:53:17 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.108:0/594327541' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 09:53:17 np0005626463.localdomain ceph-mon[294160]: pgmap v71: 177 pgs: 177 active+clean; 145 MiB data, 707 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:53:17 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.108:0/1515000314' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 09:53:18 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:53:18.489 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:53:18 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:53:18.489 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 23 09:53:18 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:53:18.490 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 23 09:53:18 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:53:18.654 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 23 09:53:18 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:53:18.655 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquired lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 23 09:53:18 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:53:18.655 282211 DEBUG nova.network.neutron [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 23 09:53:18 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:53:18.656 282211 DEBUG nova.objects.instance [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c2a7d92b-952f-46a7-8a6a-3322a48fcf4b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 23 09:53:18 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.
Feb 23 09:53:18 np0005626463.localdomain systemd[1]: tmp-crun.Hhwyue.mount: Deactivated successfully.
Feb 23 09:53:18 np0005626463.localdomain podman[308381]: 2026-02-23 09:53:18.923071214 +0000 UTC m=+0.094691184 container health_status da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 23 09:53:18 np0005626463.localdomain podman[308381]: 2026-02-23 09:53:18.935262187 +0000 UTC m=+0.106882167 container exec_died da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 23 09:53:18 np0005626463.localdomain systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Deactivated successfully.
Feb 23 09:53:19 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:53:19.093 282211 DEBUG nova.network.neutron [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Updating instance_info_cache with network_info: [{"id": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "address": "fa:16:3e:a0:9d:00", "network": {"id": "9da5b53d-3184-450f-9a5b-bdba1a6c9f6d", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "37b8098efb0d4ecc90b451a2db0e966f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa27e5011-20", "ovs_interfaceid": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 23 09:53:19 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:53:19.145 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Releasing lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 23 09:53:19 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:53:19.146 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 23 09:53:20 np0005626463.localdomain ceph-mon[294160]: pgmap v72: 177 pgs: 177 active+clean; 145 MiB data, 707 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:53:20 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Feb 23 09:53:20 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:53:21 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:53:21.054 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:53:21 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:53:21.055 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:53:21 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:53:21.074 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:53:21 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:53:21.075 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:53:21 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:53:21.075 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 23 09:53:21 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:53:21 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:53:21 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.
Feb 23 09:53:21 np0005626463.localdomain podman[308404]: 2026-02-23 09:53:21.947919671 +0000 UTC m=+0.076862448 container health_status 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, managed_by=edpm_ansible, architecture=x86_64, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, build-date=2026-02-05T04:57:10Z, com.redhat.component=ubi9-minimal-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9/ubi-minimal, release=1770267347, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, distribution-scope=public, org.opencontainers.image.created=2026-02-05T04:57:10Z, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, version=9.7, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, vcs-type=git)
Feb 23 09:53:21 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:53:21.955 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:53:21 np0005626463.localdomain podman[308404]: 2026-02-23 09:53:21.96357699 +0000 UTC m=+0.092519737 container exec_died 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9/ubi-minimal, release=1770267347, version=9.7, vcs-type=git, managed_by=edpm_ansible, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, container_name=openstack_network_exporter, build-date=2026-02-05T04:57:10Z, org.opencontainers.image.created=2026-02-05T04:57:10Z, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, config_id=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, architecture=x86_64)
Feb 23 09:53:21 np0005626463.localdomain systemd[1]: 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.service: Deactivated successfully.
Feb 23 09:53:22 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:53:22.055 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:53:22 np0005626463.localdomain ceph-mon[294160]: pgmap v73: 177 pgs: 177 active+clean; 145 MiB data, 707 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:53:23 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:53:23.056 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:53:24 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:53:24.054 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:53:24 np0005626463.localdomain ceph-mon[294160]: pgmap v74: 177 pgs: 177 active+clean; 145 MiB data, 707 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:53:26 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:53:26.054 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:53:26 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:53:26.054 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:53:26 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:53:26.075 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 09:53:26 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:53:26.075 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 09:53:26 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:53:26.076 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 09:53:26 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:53:26.076 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Auditing locally available compute resources for np0005626463.localdomain (node: np0005626463.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 23 09:53:26 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:53:26.076 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 09:53:26 np0005626463.localdomain ceph-mon[294160]: pgmap v75: 177 pgs: 177 active+clean; 145 MiB data, 707 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:53:26 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.108:0/4027208213' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 09:53:26 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:53:26 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 23 09:53:26 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/2615064459' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 09:53:26 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:53:26.526 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.450s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 09:53:26 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:53:26.587 282211 DEBUG nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 23 09:53:26 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:53:26.587 282211 DEBUG nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 23 09:53:26 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:53:26.781 282211 WARNING nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 23 09:53:26 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:53:26.782 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Hypervisor/Node resource view: name=np0005626463.localdomain free_ram=11453MB free_disk=41.8366584777832GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 23 09:53:26 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:53:26.782 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 09:53:26 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:53:26.782 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 09:53:26 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:53:26.858 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Instance c2a7d92b-952f-46a7-8a6a-3322a48fcf4b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 23 09:53:26 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:53:26.859 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 23 09:53:26 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:53:26.859 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Final resource view: name=np0005626463.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 23 09:53:26 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:53:26.904 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 09:53:26 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:53:26.956 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:53:27 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 23 09:53:27 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/2212968234' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 09:53:27 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:53:27.333 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.428s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 09:53:27 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:53:27.341 282211 DEBUG nova.compute.provider_tree [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Inventory has not changed in ProviderTree for provider: be63d86c-a403-4ec9-a515-07ea2962cb4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 23 09:53:27 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.106:0/2615064459' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 09:53:27 np0005626463.localdomain ceph-mon[294160]: pgmap v76: 177 pgs: 177 active+clean; 145 MiB data, 707 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:53:27 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.106:0/2212968234' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 09:53:27 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:53:27.588 282211 DEBUG nova.scheduler.client.report [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Inventory has not changed for provider be63d86c-a403-4ec9-a515-07ea2962cb4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 23 09:53:27 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:53:27.591 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Compute_service record updated for np0005626463.localdomain:np0005626463.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 23 09:53:27 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:53:27.591 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.809s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 09:53:30 np0005626463.localdomain ceph-mon[294160]: pgmap v77: 177 pgs: 177 active+clean; 145 MiB data, 707 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:53:31 np0005626463.localdomain ceph-mon[294160]: pgmap v78: 177 pgs: 177 active+clean; 148 MiB data, 728 MiB used, 41 GiB / 42 GiB avail; 10 KiB/s rd, 176 KiB/s wr, 11 op/s
Feb 23 09:53:31 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:53:31 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:53:31.960 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:53:31 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:53:31.962 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:53:31 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:53:31.963 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 09:53:31 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:53:31.963 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:53:32 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:53:32.271 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:53:32 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:53:32.271 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:53:32 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.108:0/410565884' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 09:53:32 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.108:0/515893759' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 09:53:32 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.
Feb 23 09:53:32 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.
Feb 23 09:53:32 np0005626463.localdomain podman[308472]: 2026-02-23 09:53:32.922166071 +0000 UTC m=+0.089649930 container health_status bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Feb 23 09:53:32 np0005626463.localdomain podman[308471]: 2026-02-23 09:53:32.963811268 +0000 UTC m=+0.133944698 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.43.0)
Feb 23 09:53:32 np0005626463.localdomain podman[308472]: 2026-02-23 09:53:32.983948205 +0000 UTC m=+0.151432084 container exec_died bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Feb 23 09:53:32 np0005626463.localdomain systemd[1]: bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.service: Deactivated successfully.
Feb 23 09:53:33 np0005626463.localdomain podman[308471]: 2026-02-23 09:53:33.030322897 +0000 UTC m=+0.200456297 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, tcib_managed=true)
Feb 23 09:53:33 np0005626463.localdomain systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully.
Feb 23 09:53:33 np0005626463.localdomain ceph-mon[294160]: pgmap v79: 177 pgs: 177 active+clean; 192 MiB data, 770 MiB used, 41 GiB / 42 GiB avail; 1.7 MiB/s rd, 1.8 MiB/s wr, 33 op/s
Feb 23 09:53:35 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:53:35.755 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:53:35 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.
Feb 23 09:53:35 np0005626463.localdomain systemd[1]: tmp-crun.3qLLqj.mount: Deactivated successfully.
Feb 23 09:53:35 np0005626463.localdomain podman[308520]: 2026-02-23 09:53:35.922845848 +0000 UTC m=+0.093491377 container health_status be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, config_id=ceilometer_agent_compute, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 23 09:53:35 np0005626463.localdomain podman[308520]: 2026-02-23 09:53:35.95812505 +0000 UTC m=+0.128770569 container exec_died be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.43.0, container_name=ceilometer_agent_compute, org.label-schema.build-date=20260216)
Feb 23 09:53:35 np0005626463.localdomain systemd[1]: be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.service: Deactivated successfully.
Feb 23 09:53:36 np0005626463.localdomain ceph-mon[294160]: pgmap v80: 177 pgs: 177 active+clean; 192 MiB data, 770 MiB used, 41 GiB / 42 GiB avail; 1.7 MiB/s rd, 1.8 MiB/s wr, 33 op/s
Feb 23 09:53:36 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:53:37 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:53:37.311 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:53:37 np0005626463.localdomain ceph-mon[294160]: pgmap v81: 177 pgs: 177 active+clean; 192 MiB data, 775 MiB used, 41 GiB / 42 GiB avail; 3.6 MiB/s rd, 1.8 MiB/s wr, 107 op/s
Feb 23 09:53:37 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:53:37.859 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:53:39 np0005626463.localdomain podman[242954]: time="2026-02-23T09:53:39Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 23 09:53:39 np0005626463.localdomain podman[242954]: @ - - [23/Feb/2026:09:53:39 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 157081 "" "Go-http-client/1.1"
Feb 23 09:53:39 np0005626463.localdomain podman[242954]: @ - - [23/Feb/2026:09:53:39 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18798 "" "Go-http-client/1.1"
Feb 23 09:53:39 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.
Feb 23 09:53:39 np0005626463.localdomain podman[308538]: 2026-02-23 09:53:39.905238473 +0000 UTC m=+0.082410368 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ovn_metadata_agent, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, tcib_managed=true)
Feb 23 09:53:39 np0005626463.localdomain podman[308538]: 2026-02-23 09:53:39.938353728 +0000 UTC m=+0.115525583 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.build-date=20260216, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, managed_by=edpm_ansible)
Feb 23 09:53:39 np0005626463.localdomain systemd[1]: 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully.
Feb 23 09:53:40 np0005626463.localdomain ceph-mon[294160]: pgmap v82: 177 pgs: 177 active+clean; 192 MiB data, 775 MiB used, 41 GiB / 42 GiB avail; 3.6 MiB/s rd, 1.8 MiB/s wr, 107 op/s
Feb 23 09:53:41 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:53:42 np0005626463.localdomain ceph-mon[294160]: pgmap v83: 177 pgs: 177 active+clean; 192 MiB data, 775 MiB used, 41 GiB / 42 GiB avail; 3.6 MiB/s rd, 1.8 MiB/s wr, 107 op/s
Feb 23 09:53:42 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:53:42.314 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:53:42 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:53:42.806 265541 INFO neutron.agent.linux.ip_lib [None req-3eef3e80-071c-4985-92db-d392e5623e8f - - - - - -] Device tapff39110f-d5 cannot be used as it has no MAC address
Feb 23 09:53:42 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:53:42.862 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:53:42 np0005626463.localdomain kernel: device tapff39110f-d5 entered promiscuous mode
Feb 23 09:53:42 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:53:42Z|00079|binding|INFO|Claiming lport ff39110f-d5ab-4f4c-b656-11139ee6c196 for this chassis.
Feb 23 09:53:42 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:53:42.869 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:53:42 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:53:42Z|00080|binding|INFO|ff39110f-d5ab-4f4c-b656-11139ee6c196: Claiming unknown
Feb 23 09:53:42 np0005626463.localdomain NetworkManager[5974]: <info>  [1771840422.8709] manager: (tapff39110f-d5): new Generic device (/org/freedesktop/NetworkManager/Devices/18)
Feb 23 09:53:42 np0005626463.localdomain systemd-udevd[308566]: Network interface NamePolicy= disabled on kernel command line.
Feb 23 09:53:42 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:53:42.890 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005626463.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpfb23302c-55c1-5de0-badf-4fc1ff22837a-b3238cd9-9eb9-4ae1-bb2b-833536c18deb', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b3238cd9-9eb9-4ae1-bb2b-833536c18deb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '02917a1d904f4889b9e244e1ebfc57ca', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3fe629c8-1dc0-4c84-9b5b-6b0d444ac4ee, chassis=[<ovs.db.idl.Row object at 0x7f808c075610>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f808c075610>], logical_port=ff39110f-d5ab-4f4c-b656-11139ee6c196) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 09:53:42 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:53:42.893 163572 INFO neutron.agent.ovn.metadata.agent [-] Port ff39110f-d5ab-4f4c-b656-11139ee6c196 in datapath b3238cd9-9eb9-4ae1-bb2b-833536c18deb bound to our chassis
Feb 23 09:53:42 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:53:42.897 163572 DEBUG neutron.agent.ovn.metadata.agent [-] Port a8a7da51-0c94-4438-a1bc-0b45c4adff92 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Feb 23 09:53:42 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:53:42.898 163572 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b3238cd9-9eb9-4ae1-bb2b-833536c18deb, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 23 09:53:42 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tapff39110f-d5: No such device
Feb 23 09:53:42 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:53:42.899 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[d789b269-be6e-4aad-b1e7-15938a37f976]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 09:53:42 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tapff39110f-d5: No such device
Feb 23 09:53:42 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:53:42Z|00081|binding|INFO|Setting lport ff39110f-d5ab-4f4c-b656-11139ee6c196 ovn-installed in OVS
Feb 23 09:53:42 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:53:42Z|00082|binding|INFO|Setting lport ff39110f-d5ab-4f4c-b656-11139ee6c196 up in Southbound
Feb 23 09:53:42 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:53:42.908 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:53:42 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tapff39110f-d5: No such device
Feb 23 09:53:42 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tapff39110f-d5: No such device
Feb 23 09:53:42 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tapff39110f-d5: No such device
Feb 23 09:53:42 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tapff39110f-d5: No such device
Feb 23 09:53:42 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tapff39110f-d5: No such device
Feb 23 09:53:42 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tapff39110f-d5: No such device
Feb 23 09:53:42 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:53:42.942 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:53:42 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:53:42.970 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:53:43 np0005626463.localdomain openstack_network_exporter[245358]: ERROR   09:53:43 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 23 09:53:43 np0005626463.localdomain openstack_network_exporter[245358]: 
Feb 23 09:53:43 np0005626463.localdomain openstack_network_exporter[245358]: ERROR   09:53:43 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 23 09:53:43 np0005626463.localdomain openstack_network_exporter[245358]: 
Feb 23 09:53:43 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:53:43.562 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:53:43 np0005626463.localdomain podman[308638]: 
Feb 23 09:53:43 np0005626463.localdomain podman[308638]: 2026-02-23 09:53:43.807470459 +0000 UTC m=+0.084792782 container create b45d2a41c1142d2be7f1aa4bc27db6097a18330a539974ea3f84437645115e35 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b3238cd9-9eb9-4ae1-bb2b-833536c18deb, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20260216, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Feb 23 09:53:43 np0005626463.localdomain systemd[1]: Started libpod-conmon-b45d2a41c1142d2be7f1aa4bc27db6097a18330a539974ea3f84437645115e35.scope.
Feb 23 09:53:43 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 09:53:43 np0005626463.localdomain podman[308638]: 2026-02-23 09:53:43.767007257 +0000 UTC m=+0.044329651 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 23 09:53:43 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/aecf83ec51d273f51dfbd2c3054b8e6328f27606ae999dc013ec2a626bc3eb75/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 23 09:53:43 np0005626463.localdomain podman[308638]: 2026-02-23 09:53:43.880353343 +0000 UTC m=+0.157675656 container init b45d2a41c1142d2be7f1aa4bc27db6097a18330a539974ea3f84437645115e35 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b3238cd9-9eb9-4ae1-bb2b-833536c18deb, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260216, tcib_managed=true)
Feb 23 09:53:43 np0005626463.localdomain podman[308638]: 2026-02-23 09:53:43.889450982 +0000 UTC m=+0.166773305 container start b45d2a41c1142d2be7f1aa4bc27db6097a18330a539974ea3f84437645115e35 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b3238cd9-9eb9-4ae1-bb2b-833536c18deb, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.vendor=CentOS)
Feb 23 09:53:43 np0005626463.localdomain dnsmasq[308657]: started, version 2.85 cachesize 150
Feb 23 09:53:43 np0005626463.localdomain dnsmasq[308657]: DNS service limited to local subnets
Feb 23 09:53:43 np0005626463.localdomain dnsmasq[308657]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 23 09:53:43 np0005626463.localdomain dnsmasq[308657]: warning: no upstream servers configured
Feb 23 09:53:43 np0005626463.localdomain dnsmasq-dhcp[308657]: DHCP, static leases only on 10.100.0.0, lease time 1d
Feb 23 09:53:43 np0005626463.localdomain dnsmasq[308657]: read /var/lib/neutron/dhcp/b3238cd9-9eb9-4ae1-bb2b-833536c18deb/addn_hosts - 0 addresses
Feb 23 09:53:43 np0005626463.localdomain dnsmasq-dhcp[308657]: read /var/lib/neutron/dhcp/b3238cd9-9eb9-4ae1-bb2b-833536c18deb/host
Feb 23 09:53:43 np0005626463.localdomain dnsmasq-dhcp[308657]: read /var/lib/neutron/dhcp/b3238cd9-9eb9-4ae1-bb2b-833536c18deb/opts
Feb 23 09:53:44 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:53:44.026 265541 INFO neutron.agent.dhcp.agent [None req-dab2c9f8-4c4d-41a8-847d-d075e54f936e - - - - - -] DHCP configuration for ports {'2934a6f5-43b8-45c4-9f75-838564def4b3'} is completed
Feb 23 09:53:44 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e88 do_prune osdmap full prune enabled
Feb 23 09:53:44 np0005626463.localdomain ceph-mon[294160]: pgmap v84: 177 pgs: 177 active+clean; 192 MiB data, 775 MiB used, 41 GiB / 42 GiB avail; 3.6 MiB/s rd, 1.6 MiB/s wr, 95 op/s
Feb 23 09:53:44 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e89 e89: 6 total, 6 up, 6 in
Feb 23 09:53:44 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e89: 6 total, 6 up, 6 in
Feb 23 09:53:44 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:53:44.305 265541 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-23T09:53:43Z, description=, device_id=6ce1665d-3eb2-47a2-bfdb-37d82bbd1318, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f2829345730>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f28293456a0>], id=af44d09a-4063-4295-a15e-3aae2c1b49de, ip_allocation=immediate, mac_address=fa:16:3e:25:86:49, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-23T09:53:40Z, description=, dns_domain=, id=b3238cd9-9eb9-4ae1-bb2b-833536c18deb, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-LiveAutoBlockMigrationV225Test-800312373-network, port_security_enabled=True, project_id=02917a1d904f4889b9e244e1ebfc57ca, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=4993, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=372, status=ACTIVE, subnets=['376bac44-58af-4a22-9c21-46b42c4d09e0'], tags=[], tenant_id=02917a1d904f4889b9e244e1ebfc57ca, updated_at=2026-02-23T09:53:41Z, vlan_transparent=None, network_id=b3238cd9-9eb9-4ae1-bb2b-833536c18deb, port_security_enabled=False, project_id=02917a1d904f4889b9e244e1ebfc57ca, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=380, status=DOWN, tags=[], tenant_id=02917a1d904f4889b9e244e1ebfc57ca, updated_at=2026-02-23T09:53:44Z on network b3238cd9-9eb9-4ae1-bb2b-833536c18deb
Feb 23 09:53:44 np0005626463.localdomain dnsmasq[308657]: read /var/lib/neutron/dhcp/b3238cd9-9eb9-4ae1-bb2b-833536c18deb/addn_hosts - 1 addresses
Feb 23 09:53:44 np0005626463.localdomain dnsmasq-dhcp[308657]: read /var/lib/neutron/dhcp/b3238cd9-9eb9-4ae1-bb2b-833536c18deb/host
Feb 23 09:53:44 np0005626463.localdomain dnsmasq-dhcp[308657]: read /var/lib/neutron/dhcp/b3238cd9-9eb9-4ae1-bb2b-833536c18deb/opts
Feb 23 09:53:44 np0005626463.localdomain podman[308675]: 2026-02-23 09:53:44.561053072 +0000 UTC m=+0.066197161 container kill b45d2a41c1142d2be7f1aa4bc27db6097a18330a539974ea3f84437645115e35 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b3238cd9-9eb9-4ae1-bb2b-833536c18deb, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Feb 23 09:53:44 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:53:44.801 265541 INFO neutron.agent.dhcp.agent [None req-9eadebd3-0920-4ae1-8127-d9fe2ae8b8fc - - - - - -] DHCP configuration for ports {'af44d09a-4063-4295-a15e-3aae2c1b49de'} is completed
Feb 23 09:53:45 np0005626463.localdomain ceph-mon[294160]: osdmap e89: 6 total, 6 up, 6 in
Feb 23 09:53:45 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:53:45.601 265541 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-23T09:53:43Z, description=, device_id=6ce1665d-3eb2-47a2-bfdb-37d82bbd1318, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f2829351790>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f2829351730>], id=af44d09a-4063-4295-a15e-3aae2c1b49de, ip_allocation=immediate, mac_address=fa:16:3e:25:86:49, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-23T09:53:40Z, description=, dns_domain=, id=b3238cd9-9eb9-4ae1-bb2b-833536c18deb, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-LiveAutoBlockMigrationV225Test-800312373-network, port_security_enabled=True, project_id=02917a1d904f4889b9e244e1ebfc57ca, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=4993, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=372, status=ACTIVE, subnets=['376bac44-58af-4a22-9c21-46b42c4d09e0'], tags=[], tenant_id=02917a1d904f4889b9e244e1ebfc57ca, updated_at=2026-02-23T09:53:41Z, vlan_transparent=None, network_id=b3238cd9-9eb9-4ae1-bb2b-833536c18deb, port_security_enabled=False, project_id=02917a1d904f4889b9e244e1ebfc57ca, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=380, status=DOWN, tags=[], tenant_id=02917a1d904f4889b9e244e1ebfc57ca, updated_at=2026-02-23T09:53:44Z on network b3238cd9-9eb9-4ae1-bb2b-833536c18deb
Feb 23 09:53:45 np0005626463.localdomain dnsmasq[308657]: read /var/lib/neutron/dhcp/b3238cd9-9eb9-4ae1-bb2b-833536c18deb/addn_hosts - 1 addresses
Feb 23 09:53:45 np0005626463.localdomain dnsmasq-dhcp[308657]: read /var/lib/neutron/dhcp/b3238cd9-9eb9-4ae1-bb2b-833536c18deb/host
Feb 23 09:53:45 np0005626463.localdomain dnsmasq-dhcp[308657]: read /var/lib/neutron/dhcp/b3238cd9-9eb9-4ae1-bb2b-833536c18deb/opts
Feb 23 09:53:45 np0005626463.localdomain podman[308713]: 2026-02-23 09:53:45.828602403 +0000 UTC m=+0.064098256 container kill b45d2a41c1142d2be7f1aa4bc27db6097a18330a539974ea3f84437645115e35 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b3238cd9-9eb9-4ae1-bb2b-833536c18deb, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2)
Feb 23 09:53:45 np0005626463.localdomain systemd[1]: tmp-crun.vv3bTf.mount: Deactivated successfully.
Feb 23 09:53:46 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:53:46.065 265541 INFO neutron.agent.dhcp.agent [None req-e8fd42cd-b4ad-4e40-a14c-7297c64ce0e8 - - - - - -] DHCP configuration for ports {'af44d09a-4063-4295-a15e-3aae2c1b49de'} is completed
Feb 23 09:53:46 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e89 do_prune osdmap full prune enabled
Feb 23 09:53:46 np0005626463.localdomain ceph-mon[294160]: pgmap v86: 177 pgs: 177 active+clean; 192 MiB data, 775 MiB used, 41 GiB / 42 GiB avail; 2.3 MiB/s rd, 15 KiB/s wr, 88 op/s
Feb 23 09:53:46 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e90 e90: 6 total, 6 up, 6 in
Feb 23 09:53:46 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e90: 6 total, 6 up, 6 in
Feb 23 09:53:46 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:53:47 np0005626463.localdomain ceph-mon[294160]: osdmap e90: 6 total, 6 up, 6 in
Feb 23 09:53:47 np0005626463.localdomain ceph-mon[294160]: pgmap v88: 177 pgs: 177 active+clean; 225 MiB data, 849 MiB used, 41 GiB / 42 GiB avail; 480 KiB/s rd, 3.2 MiB/s wr, 136 op/s
Feb 23 09:53:47 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:53:47.319 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:53:48 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:53:48.150 2 INFO neutron.agent.securitygroups_rpc [None req-a2d5984e-7e32-490c-a625-105e3b4d8b68 0b7edff084ac4cda88d2d8f5182da779 b5e1135ba2724a69b072bbda0ea8476c - - default default] Security group member updated ['5e2da0ff-f592-42de-9188-06e3b0bca61b']
Feb 23 09:53:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:53:48.555 163572 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 09:53:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:53:48.555 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 09:53:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:53:48.556 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 09:53:48 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e90 do_prune osdmap full prune enabled
Feb 23 09:53:48 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e91 e91: 6 total, 6 up, 6 in
Feb 23 09:53:48 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e91: 6 total, 6 up, 6 in
Feb 23 09:53:49 np0005626463.localdomain ceph-mon[294160]: osdmap e91: 6 total, 6 up, 6 in
Feb 23 09:53:49 np0005626463.localdomain ceph-mon[294160]: pgmap v90: 177 pgs: 177 active+clean; 225 MiB data, 849 MiB used, 41 GiB / 42 GiB avail; 640 KiB/s rd, 4.3 MiB/s wr, 181 op/s
Feb 23 09:53:49 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.
Feb 23 09:53:49 np0005626463.localdomain podman[308736]: 2026-02-23 09:53:49.904977948 +0000 UTC m=+0.082573642 container health_status da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 23 09:53:49 np0005626463.localdomain podman[308736]: 2026-02-23 09:53:49.917300127 +0000 UTC m=+0.094895811 container exec_died da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 23 09:53:49 np0005626463.localdomain systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Deactivated successfully.
Feb 23 09:53:50 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e91 do_prune osdmap full prune enabled
Feb 23 09:53:50 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e92 e92: 6 total, 6 up, 6 in
Feb 23 09:53:50 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e92: 6 total, 6 up, 6 in
Feb 23 09:53:50 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:53:50.772 265541 INFO neutron.agent.linux.ip_lib [None req-8a2a38ed-38fa-49f4-b9b5-b7bd9ebf4b88 - - - - - -] Device tap4cbf3d42-6e cannot be used as it has no MAC address
Feb 23 09:53:50 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:53:50.833 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:53:50 np0005626463.localdomain kernel: device tap4cbf3d42-6e entered promiscuous mode
Feb 23 09:53:50 np0005626463.localdomain NetworkManager[5974]: <info>  [1771840430.8404] manager: (tap4cbf3d42-6e): new Generic device (/org/freedesktop/NetworkManager/Devices/19)
Feb 23 09:53:50 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:53:50.840 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:53:50 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:53:50Z|00083|binding|INFO|Claiming lport 4cbf3d42-6ec8-4d67-8923-c66e0247fcd0 for this chassis.
Feb 23 09:53:50 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:53:50Z|00084|binding|INFO|4cbf3d42-6ec8-4d67-8923-c66e0247fcd0: Claiming unknown
Feb 23 09:53:50 np0005626463.localdomain systemd-udevd[308770]: Network interface NamePolicy= disabled on kernel command line.
Feb 23 09:53:50 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:53:50.849 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005626463.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '19.80.0.2/24', 'neutron:device_id': 'dhcpfb23302c-55c1-5de0-badf-4fc1ff22837a-c4367d4b-271d-4a28-a878-d77074456171', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c4367d4b-271d-4a28-a878-d77074456171', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b5e1135ba2724a69b072bbda0ea8476c', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=57c5c75f-3246-4a64-87cf-649ab7e0f2d0, chassis=[<ovs.db.idl.Row object at 0x7f808c075610>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f808c075610>], logical_port=4cbf3d42-6ec8-4d67-8923-c66e0247fcd0) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 09:53:50 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:53:50.851 163572 INFO neutron.agent.ovn.metadata.agent [-] Port 4cbf3d42-6ec8-4d67-8923-c66e0247fcd0 in datapath c4367d4b-271d-4a28-a878-d77074456171 bound to our chassis
Feb 23 09:53:50 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:53:50.853 163572 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network c4367d4b-271d-4a28-a878-d77074456171 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Feb 23 09:53:50 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:53:50.853 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[8756a610-6bdf-41f3-aa62-a078d450e9d1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 09:53:50 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap4cbf3d42-6e: No such device
Feb 23 09:53:50 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:53:50Z|00085|binding|INFO|Setting lport 4cbf3d42-6ec8-4d67-8923-c66e0247fcd0 ovn-installed in OVS
Feb 23 09:53:50 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:53:50Z|00086|binding|INFO|Setting lport 4cbf3d42-6ec8-4d67-8923-c66e0247fcd0 up in Southbound
Feb 23 09:53:50 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:53:50.872 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:53:50 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap4cbf3d42-6e: No such device
Feb 23 09:53:50 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap4cbf3d42-6e: No such device
Feb 23 09:53:50 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap4cbf3d42-6e: No such device
Feb 23 09:53:50 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap4cbf3d42-6e: No such device
Feb 23 09:53:50 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap4cbf3d42-6e: No such device
Feb 23 09:53:50 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap4cbf3d42-6e: No such device
Feb 23 09:53:50 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap4cbf3d42-6e: No such device
Feb 23 09:53:50 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:53:50.908 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:53:50 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:53:50.936 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:53:51 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e92 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:53:51 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:53:51.594 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:53:51 np0005626463.localdomain ceph-mon[294160]: osdmap e92: 6 total, 6 up, 6 in
Feb 23 09:53:51 np0005626463.localdomain ceph-mon[294160]: pgmap v92: 177 pgs: 177 active+clean; 225 MiB data, 851 MiB used, 41 GiB / 42 GiB avail; 766 KiB/s rd, 4.3 MiB/s wr, 295 op/s
Feb 23 09:53:51 np0005626463.localdomain podman[308841]: 
Feb 23 09:53:51 np0005626463.localdomain podman[308841]: 2026-02-23 09:53:51.801183483 +0000 UTC m=+0.089448293 container create c10cdca575a450dc369efd991d451e65c3e680818367723941494a0848776f84 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c4367d4b-271d-4a28-a878-d77074456171, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.43.0)
Feb 23 09:53:51 np0005626463.localdomain systemd[1]: Started libpod-conmon-c10cdca575a450dc369efd991d451e65c3e680818367723941494a0848776f84.scope.
Feb 23 09:53:51 np0005626463.localdomain podman[308841]: 2026-02-23 09:53:51.760235518 +0000 UTC m=+0.048500358 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 23 09:53:51 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 09:53:51 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e813bd4a46c7971615ac88ffd7ed7c4cab04c48395cbaf4f846b5efffe82a9e8/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 23 09:53:51 np0005626463.localdomain podman[308841]: 2026-02-23 09:53:51.881078763 +0000 UTC m=+0.169343563 container init c10cdca575a450dc369efd991d451e65c3e680818367723941494a0848776f84 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c4367d4b-271d-4a28-a878-d77074456171, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Feb 23 09:53:51 np0005626463.localdomain podman[308841]: 2026-02-23 09:53:51.889601294 +0000 UTC m=+0.177866094 container start c10cdca575a450dc369efd991d451e65c3e680818367723941494a0848776f84 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c4367d4b-271d-4a28-a878-d77074456171, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Feb 23 09:53:51 np0005626463.localdomain dnsmasq[308859]: started, version 2.85 cachesize 150
Feb 23 09:53:51 np0005626463.localdomain dnsmasq[308859]: DNS service limited to local subnets
Feb 23 09:53:51 np0005626463.localdomain dnsmasq[308859]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 23 09:53:51 np0005626463.localdomain dnsmasq[308859]: warning: no upstream servers configured
Feb 23 09:53:51 np0005626463.localdomain dnsmasq-dhcp[308859]: DHCP, static leases only on 19.80.0.0, lease time 1d
Feb 23 09:53:51 np0005626463.localdomain dnsmasq[308859]: read /var/lib/neutron/dhcp/c4367d4b-271d-4a28-a878-d77074456171/addn_hosts - 0 addresses
Feb 23 09:53:51 np0005626463.localdomain dnsmasq-dhcp[308859]: read /var/lib/neutron/dhcp/c4367d4b-271d-4a28-a878-d77074456171/host
Feb 23 09:53:51 np0005626463.localdomain dnsmasq-dhcp[308859]: read /var/lib/neutron/dhcp/c4367d4b-271d-4a28-a878-d77074456171/opts
Feb 23 09:53:52 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:53:52.073 265541 INFO neutron.agent.dhcp.agent [None req-590e9331-7138-4f5c-a26c-74331c2db684 - - - - - -] DHCP configuration for ports {'c580c9b8-a35b-42fb-bda8-24401f2a22e1'} is completed
Feb 23 09:53:52 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:53:52.361 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:53:52 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:53:52.569 2 INFO neutron.agent.securitygroups_rpc [None req-92b63054-7ec2-4ccb-bee7-2b93ea819111 0b7edff084ac4cda88d2d8f5182da779 b5e1135ba2724a69b072bbda0ea8476c - - default default] Security group member updated ['5e2da0ff-f592-42de-9188-06e3b0bca61b']
Feb 23 09:53:52 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:53:52.619 265541 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-23T09:53:52Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f282935cdc0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f282935c850>], id=1fc7da92-c93a-4191-b374-5aef0705e0ce, ip_allocation=immediate, mac_address=fa:16:3e:77:6d:80, name=tempest-subport-1525156145, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-23T09:53:48Z, description=, dns_domain=, id=c4367d4b-271d-4a28-a878-d77074456171, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-subport_net-631042675, port_security_enabled=True, project_id=b5e1135ba2724a69b072bbda0ea8476c, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=63828, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=419, status=ACTIVE, subnets=['7dfbfbc0-d83e-4729-b852-8f17e8a182f9'], tags=[], tenant_id=b5e1135ba2724a69b072bbda0ea8476c, updated_at=2026-02-23T09:53:49Z, vlan_transparent=None, network_id=c4367d4b-271d-4a28-a878-d77074456171, port_security_enabled=True, project_id=b5e1135ba2724a69b072bbda0ea8476c, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['5e2da0ff-f592-42de-9188-06e3b0bca61b'], standard_attr_id=454, status=DOWN, tags=[], tenant_id=b5e1135ba2724a69b072bbda0ea8476c, updated_at=2026-02-23T09:53:52Z on network c4367d4b-271d-4a28-a878-d77074456171
Feb 23 09:53:52 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e92 do_prune osdmap full prune enabled
Feb 23 09:53:52 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e93 e93: 6 total, 6 up, 6 in
Feb 23 09:53:52 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e93: 6 total, 6 up, 6 in
Feb 23 09:53:52 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.
Feb 23 09:53:52 np0005626463.localdomain systemd[1]: tmp-crun.2CaR5E.mount: Deactivated successfully.
Feb 23 09:53:52 np0005626463.localdomain podman[308877]: 2026-02-23 09:53:52.8559166 +0000 UTC m=+0.071465263 container kill c10cdca575a450dc369efd991d451e65c3e680818367723941494a0848776f84 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c4367d4b-271d-4a28-a878-d77074456171, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.vendor=CentOS, tcib_managed=true)
Feb 23 09:53:52 np0005626463.localdomain dnsmasq[308859]: read /var/lib/neutron/dhcp/c4367d4b-271d-4a28-a878-d77074456171/addn_hosts - 1 addresses
Feb 23 09:53:52 np0005626463.localdomain dnsmasq-dhcp[308859]: read /var/lib/neutron/dhcp/c4367d4b-271d-4a28-a878-d77074456171/host
Feb 23 09:53:52 np0005626463.localdomain dnsmasq-dhcp[308859]: read /var/lib/neutron/dhcp/c4367d4b-271d-4a28-a878-d77074456171/opts
Feb 23 09:53:52 np0005626463.localdomain podman[308888]: 2026-02-23 09:53:52.941800282 +0000 UTC m=+0.110643863 container health_status 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, managed_by=edpm_ansible, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1770267347, io.openshift.tags=minimal rhel9, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.buildah.version=1.33.7, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2026-02-05T04:57:10Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9/ubi-minimal, config_id=openstack_network_exporter, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-02-05T04:57:10Z, version=9.7, io.openshift.expose-services=, vcs-type=git)
Feb 23 09:53:52 np0005626463.localdomain podman[308888]: 2026-02-23 09:53:52.965219721 +0000 UTC m=+0.134063332 container exec_died 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.7, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, name=ubi9/ubi-minimal, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1770267347, org.opencontainers.image.created=2026-02-05T04:57:10Z, io.buildah.version=1.33.7, distribution-scope=public, architecture=x86_64, 
com.redhat.component=ubi9-minimal-container, build-date=2026-02-05T04:57:10Z, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., vcs-type=git, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter)
Feb 23 09:53:52 np0005626463.localdomain systemd[1]: 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.service: Deactivated successfully.
Feb 23 09:53:53 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:53:53.148 265541 INFO neutron.agent.dhcp.agent [None req-e19dc4ce-495b-42fc-a50f-199680036686 - - - - - -] DHCP configuration for ports {'1fc7da92-c93a-4191-b374-5aef0705e0ce'} is completed
Feb 23 09:53:53 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e93 do_prune osdmap full prune enabled
Feb 23 09:53:53 np0005626463.localdomain ceph-mon[294160]: osdmap e93: 6 total, 6 up, 6 in
Feb 23 09:53:53 np0005626463.localdomain ceph-mon[294160]: pgmap v94: 177 pgs: 177 active+clean; 225 MiB data, 851 MiB used, 41 GiB / 42 GiB avail; 126 KiB/s rd, 54 KiB/s wr, 113 op/s
Feb 23 09:53:53 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e94 e94: 6 total, 6 up, 6 in
Feb 23 09:53:53 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e94: 6 total, 6 up, 6 in
Feb 23 09:53:53 np0005626463.localdomain systemd[1]: tmp-crun.cZQya0.mount: Deactivated successfully.
Feb 23 09:53:54 np0005626463.localdomain ceph-mon[294160]: osdmap e94: 6 total, 6 up, 6 in
Feb 23 09:53:55 np0005626463.localdomain ceph-mon[294160]: pgmap v96: 177 pgs: 177 active+clean; 225 MiB data, 851 MiB used, 41 GiB / 42 GiB avail; 126 KiB/s rd, 54 KiB/s wr, 113 op/s
Feb 23 09:53:56 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e94 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:53:56 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e94 do_prune osdmap full prune enabled
Feb 23 09:53:56 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e95 e95: 6 total, 6 up, 6 in
Feb 23 09:53:56 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e95: 6 total, 6 up, 6 in
Feb 23 09:53:56 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:53:56.607 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:53:57 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:53:57.094 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:53:57 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:53:57.365 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:53:57 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:53:57.368 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:53:57 np0005626463.localdomain ceph-mon[294160]: osdmap e95: 6 total, 6 up, 6 in
Feb 23 09:53:57 np0005626463.localdomain ceph-mon[294160]: pgmap v98: 177 pgs: 177 active+clean; 304 MiB data, 1008 MiB used, 41 GiB / 42 GiB avail; 7.8 MiB/s rd, 7.7 MiB/s wr, 141 op/s
Feb 23 09:53:58 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:53:58.479 2 INFO neutron.agent.securitygroups_rpc [req-eb0b5adf-b7bd-4216-9e46-c2d60917a5c7 req-02734a2f-bd2e-4435-92b8-64f041df35e7 c511c0d31bd1497ea63920bacbc29b16 bba12cc9382b485789a88c5fc615cc96 - - default default] Security group rule updated ['9bb94a7a-3596-41a3-a016-4ce3c9b7d984']
Feb 23 09:53:58 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.108:0/3937151935' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 09:53:58 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.108:0/2906370524' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 09:53:58 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:53:58.754 2 INFO neutron.agent.securitygroups_rpc [req-a40b7300-b557-4abd-b7da-db42c2c5294c req-3cf01c79-11ae-4df5-84a2-12d00520d629 c511c0d31bd1497ea63920bacbc29b16 bba12cc9382b485789a88c5fc615cc96 - - default default] Security group rule updated ['9bb94a7a-3596-41a3-a016-4ce3c9b7d984']
Feb 23 09:53:59 np0005626463.localdomain sshd[308919]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:54:00 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.107:0/1200777599' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 09:54:00 np0005626463.localdomain ceph-mon[294160]: pgmap v99: 177 pgs: 177 active+clean; 304 MiB data, 1008 MiB used, 41 GiB / 42 GiB avail; 7.4 MiB/s rd, 7.4 MiB/s wr, 134 op/s
Feb 23 09:54:00 np0005626463.localdomain sshd[308919]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:54:01 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e95 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:54:01 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e95 do_prune osdmap full prune enabled
Feb 23 09:54:01 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e96 e96: 6 total, 6 up, 6 in
Feb 23 09:54:01 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e96: 6 total, 6 up, 6 in
Feb 23 09:54:01 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:54:01.799 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:54:01 np0005626463.localdomain ceph-mon[294160]: pgmap v100: 177 pgs: 177 active+clean; 289 MiB data, 965 MiB used, 41 GiB / 42 GiB avail; 5.9 MiB/s rd, 5.8 MiB/s wr, 121 op/s
Feb 23 09:54:01 np0005626463.localdomain ceph-mon[294160]: osdmap e96: 6 total, 6 up, 6 in
Feb 23 09:54:02 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:54:02.367 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:54:02 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:54:02.748 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:54:02 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.108:0/2877882451' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 09:54:03 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.
Feb 23 09:54:03 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.
Feb 23 09:54:03 np0005626463.localdomain ceph-mon[294160]: pgmap v102: 177 pgs: 177 active+clean; 224 MiB data, 880 MiB used, 41 GiB / 42 GiB avail; 5.9 MiB/s rd, 5.8 MiB/s wr, 147 op/s
Feb 23 09:54:03 np0005626463.localdomain podman[308921]: 2026-02-23 09:54:03.9289202 +0000 UTC m=+0.093753335 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_managed=true, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20260216)
Feb 23 09:54:04 np0005626463.localdomain podman[308921]: 2026-02-23 09:54:04.015897656 +0000 UTC m=+0.180730751 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, config_id=ovn_controller, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS)
Feb 23 09:54:04 np0005626463.localdomain systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully.
Feb 23 09:54:04 np0005626463.localdomain podman[308922]: 2026-02-23 09:54:04.017180346 +0000 UTC m=+0.180525036 container health_status bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 23 09:54:04 np0005626463.localdomain podman[308922]: 2026-02-23 09:54:04.09784369 +0000 UTC m=+0.261188370 container exec_died bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Feb 23 09:54:04 np0005626463.localdomain systemd[1]: bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.service: Deactivated successfully.
Feb 23 09:54:04 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/3081989457' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 23 09:54:04 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/3081989457' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 23 09:54:04 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.107:0/1891678803' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 09:54:04 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.107:0/954476728' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 09:54:05 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.108:0/758634170' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 09:54:05 np0005626463.localdomain ceph-mon[294160]: pgmap v103: 177 pgs: 177 active+clean; 224 MiB data, 880 MiB used, 41 GiB / 42 GiB avail; 26 KiB/s rd, 1.6 KiB/s wr, 38 op/s
Feb 23 09:54:05 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.108:0/3807949205' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 09:54:06 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e96 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:54:06 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.
Feb 23 09:54:06 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e96 do_prune osdmap full prune enabled
Feb 23 09:54:06 np0005626463.localdomain podman[308970]: 2026-02-23 09:54:06.926502562 +0000 UTC m=+0.088849104 container health_status be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, 
tcib_build_tag=8419493e1fd846703d277695e03fc5eb)
Feb 23 09:54:06 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e97 e97: 6 total, 6 up, 6 in
Feb 23 09:54:06 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e97: 6 total, 6 up, 6 in
Feb 23 09:54:06 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:54:06.951 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:54:06 np0005626463.localdomain podman[308970]: 2026-02-23 09:54:06.969589103 +0000 UTC m=+0.131935695 container exec_died be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 
Base Image)
Feb 23 09:54:06 np0005626463.localdomain systemd[1]: be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.service: Deactivated successfully.
Feb 23 09:54:07 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:54:07.371 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:54:07 np0005626463.localdomain ceph-mon[294160]: osdmap e97: 6 total, 6 up, 6 in
Feb 23 09:54:07 np0005626463.localdomain ceph-mon[294160]: pgmap v105: 177 pgs: 177 active+clean; 350 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 8.5 MiB/s rd, 8.5 MiB/s wr, 221 op/s
Feb 23 09:54:08 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:54:08Z|00087|binding|INFO|Releasing lport 4143c8ea-7577-4792-9744-bcff90eb20f2 from this chassis (sb_readonly=0)
Feb 23 09:54:08 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:54:08.239 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:54:08 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:54:08Z|00088|binding|INFO|Releasing lport 4143c8ea-7577-4792-9744-bcff90eb20f2 from this chassis (sb_readonly=0)
Feb 23 09:54:08 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:54:08.370 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:54:08 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:54:08.602 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:54:08 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:54:08.603 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=10, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '22:68:bc', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'c6:19:65:94:49:af'}, ipsec=False) old=SB_Global(nb_cfg=9) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 09:54:08 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:54:08.604 163572 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 23 09:54:09 np0005626463.localdomain podman[242954]: time="2026-02-23T09:54:09Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 23 09:54:09 np0005626463.localdomain podman[242954]: @ - - [23/Feb/2026:09:54:09 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 160727 "" "Go-http-client/1.1"
Feb 23 09:54:09 np0005626463.localdomain podman[242954]: @ - - [23/Feb/2026:09:54:09 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19762 "" "Go-http-client/1.1"
Feb 23 09:54:10 np0005626463.localdomain ceph-mon[294160]: pgmap v106: 177 pgs: 177 active+clean; 350 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 8.5 MiB/s rd, 8.5 MiB/s wr, 205 op/s
Feb 23 09:54:10 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.
Feb 23 09:54:10 np0005626463.localdomain podman[308989]: 2026-02-23 09:54:10.915438237 +0000 UTC m=+0.088640908 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, managed_by=edpm_ansible)
Feb 23 09:54:10 np0005626463.localdomain podman[308989]: 2026-02-23 09:54:10.925119614 +0000 UTC m=+0.098322305 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 23 09:54:10 np0005626463.localdomain systemd[1]: 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully.
Feb 23 09:54:11 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:54:11Z|00089|binding|INFO|Releasing lport 4143c8ea-7577-4792-9744-bcff90eb20f2 from this chassis (sb_readonly=0)
Feb 23 09:54:11 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:54:11.162 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:54:11 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e97 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:54:12 np0005626463.localdomain ceph-mon[294160]: pgmap v107: 177 pgs: 177 active+clean; 306 MiB data, 989 MiB used, 41 GiB / 42 GiB avail; 8.1 MiB/s rd, 7.3 MiB/s wr, 202 op/s
Feb 23 09:54:12 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:54:12.413 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:54:12 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:54:12.414 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:54:13 np0005626463.localdomain openstack_network_exporter[245358]: ERROR   09:54:13 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 23 09:54:13 np0005626463.localdomain openstack_network_exporter[245358]: 
Feb 23 09:54:13 np0005626463.localdomain openstack_network_exporter[245358]: ERROR   09:54:13 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 23 09:54:13 np0005626463.localdomain openstack_network_exporter[245358]: 
Feb 23 09:54:14 np0005626463.localdomain ceph-mon[294160]: pgmap v108: 177 pgs: 177 active+clean; 271 MiB data, 933 MiB used, 41 GiB / 42 GiB avail; 9.1 MiB/s rd, 6.8 MiB/s wr, 260 op/s
Feb 23 09:54:14 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:54:14.607 163572 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=96b5bb93-7341-4ce6-9b93-6a5de566c711, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '10'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 09:54:15 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:54:15.190 265541 INFO neutron.agent.linux.ip_lib [None req-16e3220b-2424-4022-a903-b5a2cd0d9f75 - - - - - -] Device tapc54c90dc-59 cannot be used as it has no MAC address
Feb 23 09:54:15 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:54:15.215 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:54:15 np0005626463.localdomain kernel: device tapc54c90dc-59 entered promiscuous mode
Feb 23 09:54:15 np0005626463.localdomain NetworkManager[5974]: <info>  [1771840455.2241] manager: (tapc54c90dc-59): new Generic device (/org/freedesktop/NetworkManager/Devices/20)
Feb 23 09:54:15 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:54:15.223 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:54:15 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:54:15Z|00090|binding|INFO|Claiming lport c54c90dc-59eb-4ba6-a441-5146f8224a2f for this chassis.
Feb 23 09:54:15 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:54:15Z|00091|binding|INFO|c54c90dc-59eb-4ba6-a441-5146f8224a2f: Claiming unknown
Feb 23 09:54:15 np0005626463.localdomain systemd-udevd[309017]: Network interface NamePolicy= disabled on kernel command line.
Feb 23 09:54:15 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:54:15.237 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005626463.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpfb23302c-55c1-5de0-badf-4fc1ff22837a-10e2e9cc-29ed-4970-84df-64c996e76871', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-10e2e9cc-29ed-4970-84df-64c996e76871', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6c5e1c9f6e8d451fa766e1133da7c78c', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fc3abb40-d939-4e85-874a-e1c4e0aed9c9, chassis=[<ovs.db.idl.Row object at 0x7f808c075610>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f808c075610>], logical_port=c54c90dc-59eb-4ba6-a441-5146f8224a2f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 09:54:15 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:54:15.240 163572 INFO neutron.agent.ovn.metadata.agent [-] Port c54c90dc-59eb-4ba6-a441-5146f8224a2f in datapath 10e2e9cc-29ed-4970-84df-64c996e76871 bound to our chassis
Feb 23 09:54:15 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:54:15.242 163572 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 10e2e9cc-29ed-4970-84df-64c996e76871 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Feb 23 09:54:15 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:54:15.243 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[85147674-c299-47ad-b38c-f2c2e39e26fd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 09:54:15 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tapc54c90dc-59: No such device
Feb 23 09:54:15 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:54:15Z|00092|binding|INFO|Setting lport c54c90dc-59eb-4ba6-a441-5146f8224a2f ovn-installed in OVS
Feb 23 09:54:15 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:54:15Z|00093|binding|INFO|Setting lport c54c90dc-59eb-4ba6-a441-5146f8224a2f up in Southbound
Feb 23 09:54:15 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:54:15.263 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:54:15 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tapc54c90dc-59: No such device
Feb 23 09:54:15 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tapc54c90dc-59: No such device
Feb 23 09:54:15 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tapc54c90dc-59: No such device
Feb 23 09:54:15 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tapc54c90dc-59: No such device
Feb 23 09:54:15 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tapc54c90dc-59: No such device
Feb 23 09:54:15 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tapc54c90dc-59: No such device
Feb 23 09:54:15 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tapc54c90dc-59: No such device
Feb 23 09:54:15 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:54:15.298 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:54:15 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:54:15.330 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:54:15 np0005626463.localdomain sudo[309048]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 09:54:15 np0005626463.localdomain sudo[309048]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:54:15 np0005626463.localdomain sudo[309048]: pam_unix(sudo:session): session closed for user root
Feb 23 09:54:15 np0005626463.localdomain sudo[309070]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 23 09:54:15 np0005626463.localdomain sudo[309070]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:54:16 np0005626463.localdomain ceph-mon[294160]: pgmap v109: 177 pgs: 177 active+clean; 271 MiB data, 933 MiB used, 41 GiB / 42 GiB avail; 9.1 MiB/s rd, 6.8 MiB/s wr, 260 op/s
Feb 23 09:54:16 np0005626463.localdomain sudo[309070]: pam_unix(sudo:session): session closed for user root
Feb 23 09:54:16 np0005626463.localdomain podman[309153]: 
Feb 23 09:54:16 np0005626463.localdomain podman[309153]: 2026-02-23 09:54:16.323243002 +0000 UTC m=+0.081427147 container create bafc254051d45d6a8f48958438ff8ce08d49c897b3f13b9deeb4ecf88054bd7c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-10e2e9cc-29ed-4970-84df-64c996e76871, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.43.0, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 23 09:54:16 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Feb 23 09:54:16 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:54:16 np0005626463.localdomain systemd[1]: Started libpod-conmon-bafc254051d45d6a8f48958438ff8ce08d49c897b3f13b9deeb4ecf88054bd7c.scope.
Feb 23 09:54:16 np0005626463.localdomain podman[309153]: 2026-02-23 09:54:16.281593625 +0000 UTC m=+0.039777840 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 23 09:54:16 np0005626463.localdomain systemd[1]: tmp-crun.ZPQzZI.mount: Deactivated successfully.
Feb 23 09:54:16 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 09:54:16 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/edab4fa058864b90b5cf7f18b174b68d3de95b3e3a1dd659b3050d106daf7c15/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 23 09:54:16 np0005626463.localdomain podman[309153]: 2026-02-23 09:54:16.434830033 +0000 UTC m=+0.193014188 container init bafc254051d45d6a8f48958438ff8ce08d49c897b3f13b9deeb4ecf88054bd7c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-10e2e9cc-29ed-4970-84df-64c996e76871, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Feb 23 09:54:16 np0005626463.localdomain podman[309153]: 2026-02-23 09:54:16.443704816 +0000 UTC m=+0.201888951 container start bafc254051d45d6a8f48958438ff8ce08d49c897b3f13b9deeb4ecf88054bd7c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-10e2e9cc-29ed-4970-84df-64c996e76871, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true)
Feb 23 09:54:16 np0005626463.localdomain sudo[309171]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 09:54:16 np0005626463.localdomain dnsmasq[309191]: started, version 2.85 cachesize 150
Feb 23 09:54:16 np0005626463.localdomain dnsmasq[309191]: DNS service limited to local subnets
Feb 23 09:54:16 np0005626463.localdomain dnsmasq[309191]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 23 09:54:16 np0005626463.localdomain dnsmasq[309191]: warning: no upstream servers configured
Feb 23 09:54:16 np0005626463.localdomain dnsmasq-dhcp[309191]: DHCP, static leases only on 10.100.0.0, lease time 1d
Feb 23 09:54:16 np0005626463.localdomain dnsmasq[309191]: read /var/lib/neutron/dhcp/10e2e9cc-29ed-4970-84df-64c996e76871/addn_hosts - 0 addresses
Feb 23 09:54:16 np0005626463.localdomain dnsmasq-dhcp[309191]: read /var/lib/neutron/dhcp/10e2e9cc-29ed-4970-84df-64c996e76871/host
Feb 23 09:54:16 np0005626463.localdomain dnsmasq-dhcp[309191]: read /var/lib/neutron/dhcp/10e2e9cc-29ed-4970-84df-64c996e76871/opts
Feb 23 09:54:16 np0005626463.localdomain sudo[309171]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:54:16 np0005626463.localdomain sudo[309171]: pam_unix(sudo:session): session closed for user root
Feb 23 09:54:16 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:54:16.678 265541 INFO neutron.agent.dhcp.agent [None req-4c359eed-cf97-4fcd-b878-458d218022d9 - - - - - -] DHCP configuration for ports {'ebac5c61-7d67-4897-8f42-37105858c5d4'} is completed
Feb 23 09:54:16 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e97 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:54:16 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e97 do_prune osdmap full prune enabled
Feb 23 09:54:16 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e98 e98: 6 total, 6 up, 6 in
Feb 23 09:54:16 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e98: 6 total, 6 up, 6 in
Feb 23 09:54:17 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:54:17 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 23 09:54:17 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:54:17 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 23 09:54:17 np0005626463.localdomain ceph-mon[294160]: osdmap e98: 6 total, 6 up, 6 in
Feb 23 09:54:17 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:54:17.417 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:54:17 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:54:17.900 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:54:18 np0005626463.localdomain ceph-mon[294160]: pgmap v111: 177 pgs: 177 active+clean; 271 MiB data, 933 MiB used, 41 GiB / 42 GiB avail; 5.1 MiB/s rd, 29 KiB/s wr, 237 op/s
Feb 23 09:54:18 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.107:0/2975642150' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 09:54:19 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.108:0/3781968202' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 09:54:19 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.107:0/1652408262' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 09:54:19 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:54:19.480 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:54:19 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:54:19.547 265541 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-23T09:54:19Z, description=, device_id=5cb19bc2-a554-4277-a22e-8a81f247e688, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f2829345d90>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f2829345a30>], id=f934b998-7206-4f75-a3ae-c758bf173f59, ip_allocation=immediate, mac_address=fa:16:3e:18:8c:b2, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-23T09:54:13Z, description=, dns_domain=, id=10e2e9cc-29ed-4970-84df-64c996e76871, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-SecurityGroupsTestJSON-984997043-network, port_security_enabled=True, project_id=6c5e1c9f6e8d451fa766e1133da7c78c, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=62679, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=566, status=ACTIVE, subnets=['da529488-ae4a-474a-81ed-8a85a5e66a50'], tags=[], tenant_id=6c5e1c9f6e8d451fa766e1133da7c78c, updated_at=2026-02-23T09:54:14Z, vlan_transparent=None, network_id=10e2e9cc-29ed-4970-84df-64c996e76871, port_security_enabled=False, project_id=6c5e1c9f6e8d451fa766e1133da7c78c, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=596, status=DOWN, tags=[], tenant_id=6c5e1c9f6e8d451fa766e1133da7c78c, updated_at=2026-02-23T09:54:19Z on network 10e2e9cc-29ed-4970-84df-64c996e76871
Feb 23 09:54:19 np0005626463.localdomain systemd[1]: tmp-crun.m9n4gp.mount: Deactivated successfully.
Feb 23 09:54:19 np0005626463.localdomain dnsmasq[309191]: read /var/lib/neutron/dhcp/10e2e9cc-29ed-4970-84df-64c996e76871/addn_hosts - 1 addresses
Feb 23 09:54:19 np0005626463.localdomain podman[309210]: 2026-02-23 09:54:19.787894712 +0000 UTC m=+0.068931674 container kill bafc254051d45d6a8f48958438ff8ce08d49c897b3f13b9deeb4ecf88054bd7c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-10e2e9cc-29ed-4970-84df-64c996e76871, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Feb 23 09:54:19 np0005626463.localdomain dnsmasq-dhcp[309191]: read /var/lib/neutron/dhcp/10e2e9cc-29ed-4970-84df-64c996e76871/host
Feb 23 09:54:19 np0005626463.localdomain dnsmasq-dhcp[309191]: read /var/lib/neutron/dhcp/10e2e9cc-29ed-4970-84df-64c996e76871/opts
Feb 23 09:54:20 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:54:20.153 265541 INFO neutron.agent.dhcp.agent [None req-522110c1-716e-49e2-a0ca-2b71dcbf71a3 - - - - - -] DHCP configuration for ports {'f934b998-7206-4f75-a3ae-c758bf173f59'} is completed
Feb 23 09:54:20 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Feb 23 09:54:20 np0005626463.localdomain ceph-mon[294160]: pgmap v112: 177 pgs: 177 active+clean; 271 MiB data, 933 MiB used, 41 GiB / 42 GiB avail; 5.1 MiB/s rd, 29 KiB/s wr, 237 op/s
Feb 23 09:54:20 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.108:0/3989029809' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 09:54:20 np0005626463.localdomain sshd[309232]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:54:20 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:54:20 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:54:20.593 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:54:20 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:54:20.594 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 23 09:54:20 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:54:20.594 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 23 09:54:20 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.
Feb 23 09:54:20 np0005626463.localdomain podman[309233]: 2026-02-23 09:54:20.919543306 +0000 UTC m=+0.083722107 container health_status da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 23 09:54:20 np0005626463.localdomain podman[309233]: 2026-02-23 09:54:20.927996596 +0000 UTC m=+0.092175357 container exec_died da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 23 09:54:20 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:54:20.935 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 23 09:54:20 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:54:20.936 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquired lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 23 09:54:20 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:54:20.936 282211 DEBUG nova.network.neutron [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 23 09:54:20 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:54:20.937 282211 DEBUG nova.objects.instance [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c2a7d92b-952f-46a7-8a6a-3322a48fcf4b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 23 09:54:20 np0005626463.localdomain systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Deactivated successfully.
Feb 23 09:54:21 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:54:21 np0005626463.localdomain ceph-mon[294160]: pgmap v113: 177 pgs: 177 active+clean; 273 MiB data, 933 MiB used, 41 GiB / 42 GiB avail; 4.5 MiB/s rd, 20 KiB/s wr, 202 op/s
Feb 23 09:54:21 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e98 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:54:21 np0005626463.localdomain sshd[309232]: Connection closed by authenticating user root 185.156.73.233 port 49974 [preauth]
Feb 23 09:54:22 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e98 do_prune osdmap full prune enabled
Feb 23 09:54:22 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:54:22.445 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:54:22 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e99 e99: 6 total, 6 up, 6 in
Feb 23 09:54:22 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e99: 6 total, 6 up, 6 in
Feb 23 09:54:22 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:54:22.525 282211 DEBUG nova.network.neutron [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Updating instance_info_cache with network_info: [{"id": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "address": "fa:16:3e:a0:9d:00", "network": {"id": "9da5b53d-3184-450f-9a5b-bdba1a6c9f6d", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "37b8098efb0d4ecc90b451a2db0e966f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa27e5011-20", "ovs_interfaceid": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 23 09:54:22 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:54:22.550 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Releasing lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 23 09:54:22 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:54:22.550 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 23 09:54:22 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:54:22.550 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:54:22 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:54:22.551 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 23 09:54:22 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:54:22.551 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:54:22 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:54:22.552 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Feb 23 09:54:22 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:54:22.566 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:54:22 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:54:22.784 265541 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-23T09:54:19Z, description=, device_id=5cb19bc2-a554-4277-a22e-8a81f247e688, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f282935ccd0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f282935cbe0>], id=f934b998-7206-4f75-a3ae-c758bf173f59, ip_allocation=immediate, mac_address=fa:16:3e:18:8c:b2, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-23T09:54:13Z, description=, dns_domain=, id=10e2e9cc-29ed-4970-84df-64c996e76871, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-SecurityGroupsTestJSON-984997043-network, port_security_enabled=True, project_id=6c5e1c9f6e8d451fa766e1133da7c78c, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=62679, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=566, status=ACTIVE, subnets=['da529488-ae4a-474a-81ed-8a85a5e66a50'], tags=[], tenant_id=6c5e1c9f6e8d451fa766e1133da7c78c, updated_at=2026-02-23T09:54:14Z, vlan_transparent=None, network_id=10e2e9cc-29ed-4970-84df-64c996e76871, port_security_enabled=False, project_id=6c5e1c9f6e8d451fa766e1133da7c78c, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=596, status=DOWN, tags=[], tenant_id=6c5e1c9f6e8d451fa766e1133da7c78c, updated_at=2026-02-23T09:54:19Z on network 10e2e9cc-29ed-4970-84df-64c996e76871
Feb 23 09:54:23 np0005626463.localdomain podman[309271]: 2026-02-23 09:54:23.024214693 +0000 UTC m=+0.060981641 container kill bafc254051d45d6a8f48958438ff8ce08d49c897b3f13b9deeb4ecf88054bd7c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-10e2e9cc-29ed-4970-84df-64c996e76871, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.schema-version=1.0)
Feb 23 09:54:23 np0005626463.localdomain dnsmasq[309191]: read /var/lib/neutron/dhcp/10e2e9cc-29ed-4970-84df-64c996e76871/addn_hosts - 1 addresses
Feb 23 09:54:23 np0005626463.localdomain dnsmasq-dhcp[309191]: read /var/lib/neutron/dhcp/10e2e9cc-29ed-4970-84df-64c996e76871/host
Feb 23 09:54:23 np0005626463.localdomain dnsmasq-dhcp[309191]: read /var/lib/neutron/dhcp/10e2e9cc-29ed-4970-84df-64c996e76871/opts
Feb 23 09:54:23 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.
Feb 23 09:54:23 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:54:23.080 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:54:23 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:54:23.081 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:54:23 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:54:23.081 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:54:23 np0005626463.localdomain podman[309286]: 2026-02-23 09:54:23.108306821 +0000 UTC m=+0.058804784 container health_status 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2026-02-05T04:57:10Z, io.openshift.expose-services=, distribution-scope=public, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=9.7, org.opencontainers.image.created=2026-02-05T04:57:10Z, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, name=ubi9/ubi-minimal, release=1770267347)
Feb 23 09:54:23 np0005626463.localdomain podman[309286]: 2026-02-23 09:54:23.120649389 +0000 UTC m=+0.071147392 container exec_died 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, version=9.7, managed_by=edpm_ansible, org.opencontainers.image.created=2026-02-05T04:57:10Z, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9/ubi-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., architecture=x86_64, config_id=openstack_network_exporter, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1770267347, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, build-date=2026-02-05T04:57:10Z, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers)
Feb 23 09:54:23 np0005626463.localdomain systemd[1]: 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.service: Deactivated successfully.
Feb 23 09:54:23 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:54:23.272 265541 INFO neutron.agent.dhcp.agent [None req-12ed2ead-5d81-43c4-a1b1-5026044fa5a1 - - - - - -] DHCP configuration for ports {'f934b998-7206-4f75-a3ae-c758bf173f59'} is completed
Feb 23 09:54:23 np0005626463.localdomain ceph-mon[294160]: osdmap e99: 6 total, 6 up, 6 in
Feb 23 09:54:23 np0005626463.localdomain ceph-mon[294160]: pgmap v115: 177 pgs: 177 active+clean; 275 MiB data, 933 MiB used, 41 GiB / 42 GiB avail; 3.6 MiB/s rd, 250 KiB/s wr, 174 op/s
Feb 23 09:54:24 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:54:24.055 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:54:24 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e99 do_prune osdmap full prune enabled
Feb 23 09:54:24 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e100 e100: 6 total, 6 up, 6 in
Feb 23 09:54:24 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e100: 6 total, 6 up, 6 in
Feb 23 09:54:24 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #43. Immutable memtables: 0.
Feb 23 09:54:24 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:54:24.834914) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Feb 23 09:54:24 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/flush_job.cc:856] [default] [JOB 23] Flushing memtable with next log file: 43
Feb 23 09:54:24 np0005626463.localdomain ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840464834960, "job": 23, "event": "flush_started", "num_memtables": 1, "num_entries": 1229, "num_deletes": 254, "total_data_size": 1170312, "memory_usage": 1192328, "flush_reason": "Manual Compaction"}
Feb 23 09:54:24 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/flush_job.cc:885] [default] [JOB 23] Level-0 flush table #44: started
Feb 23 09:54:24 np0005626463.localdomain ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840464844836, "cf_name": "default", "job": 23, "event": "table_file_creation", "file_number": 44, "file_size": 1135721, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 24033, "largest_seqno": 25261, "table_properties": {"data_size": 1130174, "index_size": 2954, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1541, "raw_key_size": 13048, "raw_average_key_size": 21, "raw_value_size": 1118642, "raw_average_value_size": 1827, "num_data_blocks": 125, "num_entries": 612, "num_filter_entries": 612, "num_deletions": 254, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771840391, "oldest_key_time": 1771840391, "file_creation_time": 1771840464, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4cfd6c8f-aafa-4003-b2f6-d22c49635dd4", "db_session_id": "66DAQ76CBLV8DSGL8JC7", "orig_file_number": 44, "seqno_to_time_mapping": "N/A"}}
Feb 23 09:54:24 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 23] Flush lasted 10005 microseconds, and 4168 cpu microseconds.
Feb 23 09:54:24 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 23 09:54:24 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:54:24.844916) [db/flush_job.cc:967] [default] [JOB 23] Level-0 flush table #44: 1135721 bytes OK
Feb 23 09:54:24 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:54:24.844984) [db/memtable_list.cc:519] [default] Level-0 commit table #44 started
Feb 23 09:54:24 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:54:24.847755) [db/memtable_list.cc:722] [default] Level-0 commit table #44: memtable #1 done
Feb 23 09:54:24 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:54:24.847778) EVENT_LOG_v1 {"time_micros": 1771840464847772, "job": 23, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Feb 23 09:54:24 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:54:24.847799) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Feb 23 09:54:24 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 23] Try to delete WAL files size 1164659, prev total WAL file size 1164659, number of live WAL files 2.
Feb 23 09:54:24 np0005626463.localdomain ceph-mon[294160]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005626463/store.db/000040.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 23 09:54:24 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:54:24.848608) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003131353436' seq:72057594037927935, type:22 .. '7061786F73003131373938' seq:0, type:0; will stop at (end)
Feb 23 09:54:24 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 24] Compacting 1@0 + 1@6 files to L6, score -1.00
Feb 23 09:54:24 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 23 Base level 0, inputs: [44(1109KB)], [42(17MB)]
Feb 23 09:54:24 np0005626463.localdomain ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840464848653, "job": 24, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [44], "files_L6": [42], "score": -1, "input_data_size": 19642850, "oldest_snapshot_seqno": -1}
Feb 23 09:54:24 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 24] Generated table #45: 12169 keys, 17877121 bytes, temperature: kUnknown
Feb 23 09:54:24 np0005626463.localdomain ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840464979818, "cf_name": "default", "job": 24, "event": "table_file_creation", "file_number": 45, "file_size": 17877121, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 17808406, "index_size": 37190, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 30469, "raw_key_size": 327407, "raw_average_key_size": 26, "raw_value_size": 17601662, "raw_average_value_size": 1446, "num_data_blocks": 1414, "num_entries": 12169, "num_filter_entries": 12169, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771839971, "oldest_key_time": 0, "file_creation_time": 1771840464, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4cfd6c8f-aafa-4003-b2f6-d22c49635dd4", "db_session_id": "66DAQ76CBLV8DSGL8JC7", "orig_file_number": 45, "seqno_to_time_mapping": "N/A"}}
Feb 23 09:54:24 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 23 09:54:24 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:54:24.980173) [db/compaction/compaction_job.cc:1663] [default] [JOB 24] Compacted 1@0 + 1@6 files to L6 => 17877121 bytes
Feb 23 09:54:24 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:54:24.982510) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 149.6 rd, 136.1 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.1, 17.6 +0.0 blob) out(17.0 +0.0 blob), read-write-amplify(33.0) write-amplify(15.7) OK, records in: 12702, records dropped: 533 output_compression: NoCompression
Feb 23 09:54:24 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:54:24.982550) EVENT_LOG_v1 {"time_micros": 1771840464982532, "job": 24, "event": "compaction_finished", "compaction_time_micros": 131319, "compaction_time_cpu_micros": 46147, "output_level": 6, "num_output_files": 1, "total_output_size": 17877121, "num_input_records": 12702, "num_output_records": 12169, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Feb 23 09:54:24 np0005626463.localdomain ceph-mon[294160]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005626463/store.db/000044.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 23 09:54:24 np0005626463.localdomain ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840464982856, "job": 24, "event": "table_file_deletion", "file_number": 44}
Feb 23 09:54:24 np0005626463.localdomain ceph-mon[294160]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005626463/store.db/000042.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 23 09:54:24 np0005626463.localdomain ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840464985692, "job": 24, "event": "table_file_deletion", "file_number": 42}
Feb 23 09:54:24 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:54:24.848520) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 23 09:54:24 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:54:24.985797) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 23 09:54:24 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:54:24.985804) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 23 09:54:24 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:54:24.985808) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 23 09:54:24 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:54:24.985811) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 23 09:54:24 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:54:24.985814) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 23 09:54:25 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:54:25.054 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:54:26 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:54:26.143 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:54:26 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:54:26.155 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:54:26 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:54:26.159 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 09:54:26 np0005626463.localdomain ceph-mon[294160]: osdmap e100: 6 total, 6 up, 6 in
Feb 23 09:54:26 np0005626463.localdomain ceph-mon[294160]: pgmap v117: 177 pgs: 177 active+clean; 275 MiB data, 933 MiB used, 41 GiB / 42 GiB avail; 135 KiB/s rd, 233 KiB/s wr, 23 op/s
Feb 23 09:54:26 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:54:26.160 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 09:54:26 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:54:26.160 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 09:54:26 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:54:26.160 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Auditing locally available compute resources for np0005626463.localdomain (node: np0005626463.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 23 09:54:26 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:54:26.160 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 09:54:26 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 23 09:54:26 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/2552967718' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 09:54:26 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:54:26.754 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.594s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 09:54:26 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:54:26.823 282211 DEBUG nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 23 09:54:26 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:54:26.824 282211 DEBUG nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 23 09:54:26 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e100 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:54:26 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:54:26.969 282211 WARNING nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 23 09:54:26 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:54:26.970 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Hypervisor/Node resource view: name=np0005626463.localdomain free_ram=11357MB free_disk=41.63432312011719GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": 
null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 23 09:54:26 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:54:26.970 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 09:54:26 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:54:26.970 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 09:54:27 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.106:0/2552967718' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 09:54:27 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:54:27.190 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Instance c2a7d92b-952f-46a7-8a6a-3322a48fcf4b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 23 09:54:27 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:54:27.190 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 23 09:54:27 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:54:27.191 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Final resource view: name=np0005626463.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 23 09:54:27 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:54:27.360 282211 DEBUG nova.scheduler.client.report [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Refreshing inventories for resource provider be63d86c-a403-4ec9-a515-07ea2962cb4d _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Feb 23 09:54:27 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:54:27.530 282211 DEBUG nova.scheduler.client.report [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Updating ProviderTree inventory for provider be63d86c-a403-4ec9-a515-07ea2962cb4d from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Feb 23 09:54:27 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:54:27.533 282211 DEBUG nova.compute.provider_tree [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Updating inventory in ProviderTree for provider be63d86c-a403-4ec9-a515-07ea2962cb4d with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Feb 23 09:54:27 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:54:27.535 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:54:27 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:54:27.607 282211 DEBUG nova.scheduler.client.report [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Refreshing aggregate associations for resource provider be63d86c-a403-4ec9-a515-07ea2962cb4d, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Feb 23 09:54:27 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:54:27.647 282211 DEBUG nova.scheduler.client.report [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Refreshing trait associations for resource provider be63d86c-a403-4ec9-a515-07ea2962cb4d, traits: HW_CPU_X86_AVX2,COMPUTE_ACCELERATORS,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_MMX,HW_CPU_X86_SSE41,HW_CPU_X86_SSE2,HW_CPU_X86_SVM,COMPUTE_TRUSTED_CERTS,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_SECURITY_TPM_2_0,COMPUTE_STORAGE_BUS_FDC,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_E1000,HW_CPU_X86_BMI2,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_RESCUE_BFV,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_CLMUL,HW_CPU_X86_SHA,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_AESNI,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_DEVICE_TAGGING,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_ABM,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NODE,COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_CIRRUS,HW_CPU_X86_SSE4A,HW_CPU_X86_BMI,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_F16C,COMPUTE_STORAGE_BUS_IDE,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_SSSE3,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_FMA3,HW_CPU_X86_AMD_SVM,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_AVX,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_SSE42 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Feb 23 09:54:27 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e100 do_prune osdmap full prune enabled
Feb 23 09:54:28 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e101 e101: 6 total, 6 up, 6 in
Feb 23 09:54:28 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e101: 6 total, 6 up, 6 in
Feb 23 09:54:28 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:54:28.914 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 09:54:28 np0005626463.localdomain ceph-mon[294160]: pgmap v118: 177 pgs: 177 active+clean; 379 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 6.4 MiB/s rd, 8.9 MiB/s wr, 202 op/s
Feb 23 09:54:29 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 23 09:54:29 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/2426814409' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 09:54:29 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:54:29.293 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.378s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 09:54:29 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:54:29.299 282211 DEBUG nova.compute.provider_tree [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Inventory has not changed in ProviderTree for provider: be63d86c-a403-4ec9-a515-07ea2962cb4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 23 09:54:29 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:54:29.341 282211 DEBUG nova.scheduler.client.report [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Inventory has not changed for provider be63d86c-a403-4ec9-a515-07ea2962cb4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 23 09:54:29 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:54:29.380 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Compute_service record updated for np0005626463.localdomain:np0005626463.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 23 09:54:29 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:54:29.380 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.410s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 09:54:29 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:54:29.381 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:54:29 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:54:29.382 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Feb 23 09:54:29 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:54:29.417 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Feb 23 09:54:29 np0005626463.localdomain ceph-mon[294160]: osdmap e101: 6 total, 6 up, 6 in
Feb 23 09:54:29 np0005626463.localdomain ceph-mon[294160]: pgmap v120: 177 pgs: 177 active+clean; 379 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 7.6 MiB/s rd, 11 MiB/s wr, 217 op/s
Feb 23 09:54:29 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.106:0/2426814409' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 09:54:30 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:54:30.329 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:54:31 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 23 09:54:31 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/2410368303' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 09:54:31 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e101 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:54:31 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e101 do_prune osdmap full prune enabled
Feb 23 09:54:31 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e102 e102: 6 total, 6 up, 6 in
Feb 23 09:54:31 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e102: 6 total, 6 up, 6 in
Feb 23 09:54:32 np0005626463.localdomain ceph-mon[294160]: pgmap v121: 177 pgs: 177 active+clean; 354 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 6.3 MiB/s rd, 8.8 MiB/s wr, 222 op/s
Feb 23 09:54:32 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.108:0/2410368303' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 09:54:32 np0005626463.localdomain ceph-mon[294160]: osdmap e102: 6 total, 6 up, 6 in
Feb 23 09:54:32 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:54:32.532 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:54:32 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:54:32.541 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:54:33 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.107:0/4269194252' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 09:54:34 np0005626463.localdomain ceph-mon[294160]: pgmap v123: 177 pgs: 177 active+clean; 306 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 6.4 MiB/s rd, 8.8 MiB/s wr, 261 op/s
Feb 23 09:54:34 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.107:0/764192646' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 09:54:34 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.
Feb 23 09:54:34 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.
Feb 23 09:54:34 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:54:34Z|00094|binding|INFO|Releasing lport 4143c8ea-7577-4792-9744-bcff90eb20f2 from this chassis (sb_readonly=0)
Feb 23 09:54:34 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:54:34.918 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:54:34 np0005626463.localdomain systemd[1]: tmp-crun.SMcMgf.mount: Deactivated successfully.
Feb 23 09:54:34 np0005626463.localdomain podman[309355]: 2026-02-23 09:54:34.933666465 +0000 UTC m=+0.102172414 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.43.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.build-date=20260216)
Feb 23 09:54:34 np0005626463.localdomain systemd[1]: tmp-crun.EIDCKP.mount: Deactivated successfully.
Feb 23 09:54:35 np0005626463.localdomain podman[309356]: 2026-02-23 09:54:35.004433955 +0000 UTC m=+0.168407194 container health_status bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 23 09:54:35 np0005626463.localdomain podman[309355]: 2026-02-23 09:54:35.005461007 +0000 UTC m=+0.173966956 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true)
Feb 23 09:54:35 np0005626463.localdomain podman[309356]: 2026-02-23 09:54:35.041382217 +0000 UTC m=+0.205355446 container exec_died bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Feb 23 09:54:35 np0005626463.localdomain systemd[1]: bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.service: Deactivated successfully.
Feb 23 09:54:35 np0005626463.localdomain systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully.
Feb 23 09:54:35 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.107:0/2116820524' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 09:54:35 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:54:35.341 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:54:36 np0005626463.localdomain ceph-mon[294160]: pgmap v124: 177 pgs: 177 active+clean; 306 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 101 KiB/s rd, 105 KiB/s wr, 82 op/s
Feb 23 09:54:36 np0005626463.localdomain podman[309421]: 2026-02-23 09:54:36.8125633 +0000 UTC m=+0.060835177 container kill bafc254051d45d6a8f48958438ff8ce08d49c897b3f13b9deeb4ecf88054bd7c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-10e2e9cc-29ed-4970-84df-64c996e76871, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team)
Feb 23 09:54:36 np0005626463.localdomain dnsmasq[309191]: read /var/lib/neutron/dhcp/10e2e9cc-29ed-4970-84df-64c996e76871/addn_hosts - 0 addresses
Feb 23 09:54:36 np0005626463.localdomain dnsmasq-dhcp[309191]: read /var/lib/neutron/dhcp/10e2e9cc-29ed-4970-84df-64c996e76871/host
Feb 23 09:54:36 np0005626463.localdomain dnsmasq-dhcp[309191]: read /var/lib/neutron/dhcp/10e2e9cc-29ed-4970-84df-64c996e76871/opts
Feb 23 09:54:36 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e102 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:54:37 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:54:37.321 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:54:37 np0005626463.localdomain kernel: device tapc54c90dc-59 left promiscuous mode
Feb 23 09:54:37 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:54:37Z|00095|binding|INFO|Releasing lport c54c90dc-59eb-4ba6-a441-5146f8224a2f from this chassis (sb_readonly=0)
Feb 23 09:54:37 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:54:37Z|00096|binding|INFO|Setting lport c54c90dc-59eb-4ba6-a441-5146f8224a2f down in Southbound
Feb 23 09:54:37 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:54:37.331 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005626463.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpfb23302c-55c1-5de0-badf-4fc1ff22837a-10e2e9cc-29ed-4970-84df-64c996e76871', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-10e2e9cc-29ed-4970-84df-64c996e76871', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6c5e1c9f6e8d451fa766e1133da7c78c', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005626463.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fc3abb40-d939-4e85-874a-e1c4e0aed9c9, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f808c075610>], logical_port=c54c90dc-59eb-4ba6-a441-5146f8224a2f) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f808c075610>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 09:54:37 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:54:37.333 163572 INFO neutron.agent.ovn.metadata.agent [-] Port c54c90dc-59eb-4ba6-a441-5146f8224a2f in datapath 10e2e9cc-29ed-4970-84df-64c996e76871 unbound from our chassis
Feb 23 09:54:37 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:54:37.337 163572 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 10e2e9cc-29ed-4970-84df-64c996e76871, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 23 09:54:37 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:54:37.338 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[3aee6415-6e71-482c-9029-ef0377888538]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 09:54:37 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:54:37.344 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:54:37 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:54:37.537 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:54:37 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:54:37.542 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:54:37 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.
Feb 23 09:54:37 np0005626463.localdomain ceph-mon[294160]: pgmap v125: 177 pgs: 177 active+clean; 285 MiB data, 958 MiB used, 41 GiB / 42 GiB avail; 3.9 MiB/s rd, 3.9 MiB/s wr, 223 op/s
Feb 23 09:54:37 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.107:0/1785721779' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 09:54:37 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.107:0/1512085890' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 09:54:37 np0005626463.localdomain systemd[1]: tmp-crun.1UZ3VI.mount: Deactivated successfully.
Feb 23 09:54:37 np0005626463.localdomain podman[309445]: 2026-02-23 09:54:37.915252006 +0000 UTC m=+0.086904045 container health_status be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute)
Feb 23 09:54:37 np0005626463.localdomain podman[309445]: 2026-02-23 09:54:37.926081128 +0000 UTC m=+0.097733187 container exec_died be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ceilometer_agent_compute, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216)
Feb 23 09:54:37 np0005626463.localdomain systemd[1]: be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.service: Deactivated successfully.
Feb 23 09:54:39 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e102 do_prune osdmap full prune enabled
Feb 23 09:54:39 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e103 e103: 6 total, 6 up, 6 in
Feb 23 09:54:39 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e103: 6 total, 6 up, 6 in
Feb 23 09:54:39 np0005626463.localdomain podman[242954]: time="2026-02-23T09:54:39Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 23 09:54:39 np0005626463.localdomain podman[242954]: @ - - [23/Feb/2026:09:54:39 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 162551 "" "Go-http-client/1.1"
Feb 23 09:54:39 np0005626463.localdomain podman[242954]: @ - - [23/Feb/2026:09:54:39 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 20246 "" "Go-http-client/1.1"
Feb 23 09:54:40 np0005626463.localdomain ceph-mon[294160]: osdmap e103: 6 total, 6 up, 6 in
Feb 23 09:54:40 np0005626463.localdomain ceph-mon[294160]: pgmap v127: 177 pgs: 177 active+clean; 285 MiB data, 958 MiB used, 41 GiB / 42 GiB avail; 3.9 MiB/s rd, 4.0 MiB/s wr, 185 op/s
Feb 23 09:54:41 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:54:41.077 2 INFO neutron.agent.securitygroups_rpc [None req-ff2971e4-9240-4c6a-b550-c220da174a71 fb712af440b2428b8717631185f9fc4e 2fbe870428324feda18014285ef9eb40 - - default default] Security group member updated ['0982a37e-b389-40e3-834f-dcc14e42d01c']
Feb 23 09:54:41 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:54:41.367 2 INFO neutron.agent.securitygroups_rpc [None req-ff2971e4-9240-4c6a-b550-c220da174a71 fb712af440b2428b8717631185f9fc4e 2fbe870428324feda18014285ef9eb40 - - default default] Security group member updated ['0982a37e-b389-40e3-834f-dcc14e42d01c']
Feb 23 09:54:41 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.
Feb 23 09:54:41 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e103 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:54:41 np0005626463.localdomain podman[309465]: 2026-02-23 09:54:41.905653868 +0000 UTC m=+0.081898523 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, container_name=ovn_metadata_agent)
Feb 23 09:54:41 np0005626463.localdomain podman[309465]: 2026-02-23 09:54:41.916286373 +0000 UTC m=+0.092531068 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Feb 23 09:54:41 np0005626463.localdomain systemd[1]: 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully.
Feb 23 09:54:42 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:54:42.131 2 INFO neutron.agent.securitygroups_rpc [None req-2742a8e1-f348-43b4-9fb2-dd2072837cd2 0b7edff084ac4cda88d2d8f5182da779 b5e1135ba2724a69b072bbda0ea8476c - - default default] Security group member updated ['5e2da0ff-f592-42de-9188-06e3b0bca61b']
Feb 23 09:54:42 np0005626463.localdomain ceph-mon[294160]: pgmap v128: 177 pgs: 177 active+clean; 230 MiB data, 892 MiB used, 41 GiB / 42 GiB avail; 6.4 MiB/s rd, 5.1 MiB/s wr, 197 op/s
Feb 23 09:54:42 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.107:0/1822049600' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 09:54:42 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:54:42.229 2 INFO neutron.agent.securitygroups_rpc [None req-e6e1d23b-7f3f-4c9f-825c-f305c0d37186 fb712af440b2428b8717631185f9fc4e 2fbe870428324feda18014285ef9eb40 - - default default] Security group member updated ['0982a37e-b389-40e3-834f-dcc14e42d01c']
Feb 23 09:54:42 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:54:42Z|00097|binding|INFO|Releasing lport 4143c8ea-7577-4792-9744-bcff90eb20f2 from this chassis (sb_readonly=0)
Feb 23 09:54:42 np0005626463.localdomain podman[309501]: 2026-02-23 09:54:42.4791633 +0000 UTC m=+0.061856057 container kill c10cdca575a450dc369efd991d451e65c3e680818367723941494a0848776f84 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c4367d4b-271d-4a28-a878-d77074456171, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true)
Feb 23 09:54:42 np0005626463.localdomain dnsmasq[308859]: read /var/lib/neutron/dhcp/c4367d4b-271d-4a28-a878-d77074456171/addn_hosts - 0 addresses
Feb 23 09:54:42 np0005626463.localdomain dnsmasq-dhcp[308859]: read /var/lib/neutron/dhcp/c4367d4b-271d-4a28-a878-d77074456171/host
Feb 23 09:54:42 np0005626463.localdomain dnsmasq-dhcp[308859]: read /var/lib/neutron/dhcp/c4367d4b-271d-4a28-a878-d77074456171/opts
Feb 23 09:54:42 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:54:42.483 2 INFO neutron.agent.securitygroups_rpc [None req-ca0adc89-f8c7-402b-bd8c-8b5b257e86b1 fb712af440b2428b8717631185f9fc4e 2fbe870428324feda18014285ef9eb40 - - default default] Security group member updated ['0982a37e-b389-40e3-834f-dcc14e42d01c']
Feb 23 09:54:42 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:54:42.501 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:54:42 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:54:42.538 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:54:42 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:54:42.543 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:54:42 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:54:42.960 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:54:43 np0005626463.localdomain dnsmasq[308859]: exiting on receipt of SIGTERM
Feb 23 09:54:43 np0005626463.localdomain systemd[1]: tmp-crun.rU82g4.mount: Deactivated successfully.
Feb 23 09:54:43 np0005626463.localdomain podman[309538]: 2026-02-23 09:54:43.020200298 +0000 UTC m=+0.068877433 container kill c10cdca575a450dc369efd991d451e65c3e680818367723941494a0848776f84 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c4367d4b-271d-4a28-a878-d77074456171, io.buildah.version=1.43.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb)
Feb 23 09:54:43 np0005626463.localdomain systemd[1]: libpod-c10cdca575a450dc369efd991d451e65c3e680818367723941494a0848776f84.scope: Deactivated successfully.
Feb 23 09:54:43 np0005626463.localdomain podman[309552]: 2026-02-23 09:54:43.090129451 +0000 UTC m=+0.058084441 container died c10cdca575a450dc369efd991d451e65c3e680818367723941494a0848776f84 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c4367d4b-271d-4a28-a878-d77074456171, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb)
Feb 23 09:54:43 np0005626463.localdomain podman[309552]: 2026-02-23 09:54:43.12725574 +0000 UTC m=+0.095210690 container cleanup c10cdca575a450dc369efd991d451e65c3e680818367723941494a0848776f84 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c4367d4b-271d-4a28-a878-d77074456171, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, org.label-schema.license=GPLv2)
Feb 23 09:54:43 np0005626463.localdomain systemd[1]: libpod-conmon-c10cdca575a450dc369efd991d451e65c3e680818367723941494a0848776f84.scope: Deactivated successfully.
Feb 23 09:54:43 np0005626463.localdomain podman[309556]: 2026-02-23 09:54:43.176596462 +0000 UTC m=+0.134045881 container remove c10cdca575a450dc369efd991d451e65c3e680818367723941494a0848776f84 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c4367d4b-271d-4a28-a878-d77074456171, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Feb 23 09:54:43 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:54:43Z|00098|binding|INFO|Releasing lport 4cbf3d42-6ec8-4d67-8923-c66e0247fcd0 from this chassis (sb_readonly=0)
Feb 23 09:54:43 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:54:43.189 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:54:43 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:54:43Z|00099|binding|INFO|Setting lport 4cbf3d42-6ec8-4d67-8923-c66e0247fcd0 down in Southbound
Feb 23 09:54:43 np0005626463.localdomain kernel: device tap4cbf3d42-6e left promiscuous mode
Feb 23 09:54:43 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:54:43.197 163572 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port 20a12874-8371-4400-bfd5-f2688e2d3266 with type ""
Feb 23 09:54:43 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:54:43.199 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005626463.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '19.80.0.2/24', 'neutron:device_id': 'dhcpfb23302c-55c1-5de0-badf-4fc1ff22837a-c4367d4b-271d-4a28-a878-d77074456171', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c4367d4b-271d-4a28-a878-d77074456171', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b5e1135ba2724a69b072bbda0ea8476c', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005626463.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=57c5c75f-3246-4a64-87cf-649ab7e0f2d0, chassis=[<ovs.db.idl.Row object at 0x7f808c075610>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f808c075610>], logical_port=4cbf3d42-6ec8-4d67-8923-c66e0247fcd0) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 09:54:43 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:54:43.201 163572 INFO neutron.agent.ovn.metadata.agent [-] Port 4cbf3d42-6ec8-4d67-8923-c66e0247fcd0 in datapath c4367d4b-271d-4a28-a878-d77074456171 unbound from our chassis
Feb 23 09:54:43 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:54:43.204 163572 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c4367d4b-271d-4a28-a878-d77074456171, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 23 09:54:43 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:54:43.205 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[affde97d-20ec-4fb0-be08-322cd094ef68]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 09:54:43 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:54:43.212 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:54:43 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:54:43.238 265541 INFO neutron.agent.dhcp.agent [None req-14252414-a13b-4f2d-867e-ef5a0d9933fd - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 23 09:54:43 np0005626463.localdomain dnsmasq[309191]: exiting on receipt of SIGTERM
Feb 23 09:54:43 np0005626463.localdomain podman[309599]: 2026-02-23 09:54:43.311596992 +0000 UTC m=+0.058833155 container kill bafc254051d45d6a8f48958438ff8ce08d49c897b3f13b9deeb4ecf88054bd7c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-10e2e9cc-29ed-4970-84df-64c996e76871, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260216, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 23 09:54:43 np0005626463.localdomain systemd[1]: libpod-bafc254051d45d6a8f48958438ff8ce08d49c897b3f13b9deeb4ecf88054bd7c.scope: Deactivated successfully.
Feb 23 09:54:43 np0005626463.localdomain podman[309614]: 2026-02-23 09:54:43.372051225 +0000 UTC m=+0.043855436 container died bafc254051d45d6a8f48958438ff8ce08d49c897b3f13b9deeb4ecf88054bd7c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-10e2e9cc-29ed-4970-84df-64c996e76871, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, tcib_managed=true, org.label-schema.license=GPLv2)
Feb 23 09:54:43 np0005626463.localdomain openstack_network_exporter[245358]: ERROR   09:54:43 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 23 09:54:43 np0005626463.localdomain openstack_network_exporter[245358]: ERROR   09:54:43 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 23 09:54:43 np0005626463.localdomain podman[309614]: 2026-02-23 09:54:43.421639685 +0000 UTC m=+0.093443856 container remove bafc254051d45d6a8f48958438ff8ce08d49c897b3f13b9deeb4ecf88054bd7c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-10e2e9cc-29ed-4970-84df-64c996e76871, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb)
Feb 23 09:54:43 np0005626463.localdomain systemd[1]: libpod-conmon-bafc254051d45d6a8f48958438ff8ce08d49c897b3f13b9deeb4ecf88054bd7c.scope: Deactivated successfully.
Feb 23 09:54:43 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:54:43.488 265541 INFO neutron.agent.dhcp.agent [None req-d7b77519-726d-47a8-927b-110421a7141c - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 23 09:54:43 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:54:43.576 265541 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Feb 23 09:54:43 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:54:43.853 265541 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Feb 23 09:54:43 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:54:43Z|00100|binding|INFO|Releasing lport 4143c8ea-7577-4792-9744-bcff90eb20f2 from this chassis (sb_readonly=0)
Feb 23 09:54:43 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:54:43.987 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:54:44 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-edab4fa058864b90b5cf7f18b174b68d3de95b3e3a1dd659b3050d106daf7c15-merged.mount: Deactivated successfully.
Feb 23 09:54:44 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-bafc254051d45d6a8f48958438ff8ce08d49c897b3f13b9deeb4ecf88054bd7c-userdata-shm.mount: Deactivated successfully.
Feb 23 09:54:44 np0005626463.localdomain systemd[1]: run-netns-qdhcp\x2d10e2e9cc\x2d29ed\x2d4970\x2d84df\x2d64c996e76871.mount: Deactivated successfully.
Feb 23 09:54:44 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-e813bd4a46c7971615ac88ffd7ed7c4cab04c48395cbaf4f846b5efffe82a9e8-merged.mount: Deactivated successfully.
Feb 23 09:54:44 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c10cdca575a450dc369efd991d451e65c3e680818367723941494a0848776f84-userdata-shm.mount: Deactivated successfully.
Feb 23 09:54:44 np0005626463.localdomain systemd[1]: run-netns-qdhcp\x2dc4367d4b\x2d271d\x2d4a28\x2da878\x2dd77074456171.mount: Deactivated successfully.
Feb 23 09:54:44 np0005626463.localdomain ceph-mon[294160]: pgmap v129: 177 pgs: 177 active+clean; 146 MiB data, 815 MiB used, 41 GiB / 42 GiB avail; 6.9 MiB/s rd, 4.7 MiB/s wr, 265 op/s
Feb 23 09:54:44 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:54:44.637 2 INFO neutron.agent.securitygroups_rpc [None req-c06d8773-2479-4f88-85a7-f04d29c76a1d 0b7edff084ac4cda88d2d8f5182da779 b5e1135ba2724a69b072bbda0ea8476c - - default default] Security group member updated ['5e2da0ff-f592-42de-9188-06e3b0bca61b']
Feb 23 09:54:45 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e103 do_prune osdmap full prune enabled
Feb 23 09:54:45 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e104 e104: 6 total, 6 up, 6 in
Feb 23 09:54:45 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e104: 6 total, 6 up, 6 in
Feb 23 09:54:46 np0005626463.localdomain ceph-mon[294160]: pgmap v130: 177 pgs: 177 active+clean; 146 MiB data, 815 MiB used, 41 GiB / 42 GiB avail; 6.9 MiB/s rd, 4.7 MiB/s wr, 264 op/s
Feb 23 09:54:46 np0005626463.localdomain ceph-mon[294160]: osdmap e104: 6 total, 6 up, 6 in
Feb 23 09:54:46 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e104 do_prune osdmap full prune enabled
Feb 23 09:54:46 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e105 e105: 6 total, 6 up, 6 in
Feb 23 09:54:46 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e105: 6 total, 6 up, 6 in
Feb 23 09:54:46 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:54:46Z|00101|binding|INFO|Releasing lport 4143c8ea-7577-4792-9744-bcff90eb20f2 from this chassis (sb_readonly=0)
Feb 23 09:54:46 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:54:46.388 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:54:46 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e105 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:54:46 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e105 do_prune osdmap full prune enabled
Feb 23 09:54:46 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e106 e106: 6 total, 6 up, 6 in
Feb 23 09:54:46 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e106: 6 total, 6 up, 6 in
Feb 23 09:54:47 np0005626463.localdomain ceph-mon[294160]: osdmap e105: 6 total, 6 up, 6 in
Feb 23 09:54:47 np0005626463.localdomain ceph-mon[294160]: osdmap e106: 6 total, 6 up, 6 in
Feb 23 09:54:47 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:54:47.550 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=11, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '22:68:bc', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'c6:19:65:94:49:af'}, ipsec=False) old=SB_Global(nb_cfg=10) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 09:54:47 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:54:47.551 163572 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 23 09:54:47 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:54:47.586 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:54:47 np0005626463.localdomain sshd[309641]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:54:48 np0005626463.localdomain ceph-mon[294160]: pgmap v134: 177 pgs: 177 active+clean; 145 MiB data, 738 MiB used, 41 GiB / 42 GiB avail; 1.7 MiB/s rd, 9.8 KiB/s wr, 193 op/s
Feb 23 09:54:48 np0005626463.localdomain sshd[309641]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:54:48 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:54:48.312 265541 INFO neutron.agent.linux.ip_lib [None req-fce4ed71-03d3-41c5-bb15-fbd2116eca99 - - - - - -] Device tap1d4d42e8-30 cannot be used as it has no MAC address
Feb 23 09:54:48 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:54:48.339 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:54:48 np0005626463.localdomain kernel: device tap1d4d42e8-30 entered promiscuous mode
Feb 23 09:54:48 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:54:48Z|00102|binding|INFO|Claiming lport 1d4d42e8-30fc-45f0-86d9-320b12bff55b for this chassis.
Feb 23 09:54:48 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:54:48Z|00103|binding|INFO|1d4d42e8-30fc-45f0-86d9-320b12bff55b: Claiming unknown
Feb 23 09:54:48 np0005626463.localdomain NetworkManager[5974]: <info>  [1771840488.3495] manager: (tap1d4d42e8-30): new Generic device (/org/freedesktop/NetworkManager/Devices/21)
Feb 23 09:54:48 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:54:48.349 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:54:48 np0005626463.localdomain systemd-udevd[309652]: Network interface NamePolicy= disabled on kernel command line.
Feb 23 09:54:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:54:48.363 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005626463.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::1/64', 'neutron:device_id': 'dhcpfb23302c-55c1-5de0-badf-4fc1ff22837a-02ee7c5a-2db8-434a-a435-61821ceb4b9b', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-02ee7c5a-2db8-434a-a435-61821ceb4b9b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2fbe870428324feda18014285ef9eb40', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=042074d1-ff67-46dd-af64-fc750025a9c1, chassis=[<ovs.db.idl.Row object at 0x7f808c075610>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f808c075610>], logical_port=1d4d42e8-30fc-45f0-86d9-320b12bff55b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 09:54:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:54:48.365 163572 INFO neutron.agent.ovn.metadata.agent [-] Port 1d4d42e8-30fc-45f0-86d9-320b12bff55b in datapath 02ee7c5a-2db8-434a-a435-61821ceb4b9b bound to our chassis
Feb 23 09:54:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:54:48.367 163572 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 02ee7c5a-2db8-434a-a435-61821ceb4b9b or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Feb 23 09:54:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:54:48.368 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[3cb28397-80ec-4031-ae24-75137e90f386]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 09:54:48 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:54:48Z|00104|binding|INFO|Setting lport 1d4d42e8-30fc-45f0-86d9-320b12bff55b ovn-installed in OVS
Feb 23 09:54:48 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:54:48Z|00105|binding|INFO|Setting lport 1d4d42e8-30fc-45f0-86d9-320b12bff55b up in Southbound
Feb 23 09:54:48 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:54:48.373 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:54:48 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap1d4d42e8-30: No such device
Feb 23 09:54:48 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:54:48.391 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:54:48 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap1d4d42e8-30: No such device
Feb 23 09:54:48 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap1d4d42e8-30: No such device
Feb 23 09:54:48 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap1d4d42e8-30: No such device
Feb 23 09:54:48 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap1d4d42e8-30: No such device
Feb 23 09:54:48 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap1d4d42e8-30: No such device
Feb 23 09:54:48 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap1d4d42e8-30: No such device
Feb 23 09:54:48 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap1d4d42e8-30: No such device
Feb 23 09:54:48 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:54:48.438 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:54:48 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:54:48.464 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:54:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:54:48.555 163572 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 09:54:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:54:48.556 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 09:54:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:54:48.557 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 09:54:49 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:54:49Z|00106|binding|INFO|Releasing lport 4143c8ea-7577-4792-9744-bcff90eb20f2 from this chassis (sb_readonly=0)
Feb 23 09:54:49 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:54:49.293 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:54:49 np0005626463.localdomain podman[309723]: 2026-02-23 09:54:49.321670591 +0000 UTC m=+0.100597285 container create c4931b1e71b8b1290fe866fefb1a5885f236e07cf1e81f48f9e248434aa23b6a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-02ee7c5a-2db8-434a-a435-61821ceb4b9b, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.43.0, tcib_managed=true, org.label-schema.build-date=20260216)
Feb 23 09:54:49 np0005626463.localdomain systemd[1]: Started libpod-conmon-c4931b1e71b8b1290fe866fefb1a5885f236e07cf1e81f48f9e248434aa23b6a.scope.
Feb 23 09:54:49 np0005626463.localdomain podman[309723]: 2026-02-23 09:54:49.282243562 +0000 UTC m=+0.061170296 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 23 09:54:49 np0005626463.localdomain systemd[1]: tmp-crun.yZv2oU.mount: Deactivated successfully.
Feb 23 09:54:49 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 09:54:49 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/590d7f40eabf0d6dfb27cf9871c5b44da87730638b147c979467440410e9b196/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 23 09:54:49 np0005626463.localdomain podman[309723]: 2026-02-23 09:54:49.431539059 +0000 UTC m=+0.210465753 container init c4931b1e71b8b1290fe866fefb1a5885f236e07cf1e81f48f9e248434aa23b6a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-02ee7c5a-2db8-434a-a435-61821ceb4b9b, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.43.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, tcib_managed=true)
Feb 23 09:54:49 np0005626463.localdomain podman[309723]: 2026-02-23 09:54:49.440742081 +0000 UTC m=+0.219668775 container start c4931b1e71b8b1290fe866fefb1a5885f236e07cf1e81f48f9e248434aa23b6a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-02ee7c5a-2db8-434a-a435-61821ceb4b9b, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 23 09:54:49 np0005626463.localdomain dnsmasq[309742]: started, version 2.85 cachesize 150
Feb 23 09:54:49 np0005626463.localdomain dnsmasq[309742]: DNS service limited to local subnets
Feb 23 09:54:49 np0005626463.localdomain dnsmasq[309742]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 23 09:54:49 np0005626463.localdomain dnsmasq[309742]: warning: no upstream servers configured
Feb 23 09:54:49 np0005626463.localdomain dnsmasq-dhcp[309742]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Feb 23 09:54:49 np0005626463.localdomain dnsmasq[309742]: read /var/lib/neutron/dhcp/02ee7c5a-2db8-434a-a435-61821ceb4b9b/addn_hosts - 0 addresses
Feb 23 09:54:49 np0005626463.localdomain dnsmasq-dhcp[309742]: read /var/lib/neutron/dhcp/02ee7c5a-2db8-434a-a435-61821ceb4b9b/host
Feb 23 09:54:49 np0005626463.localdomain dnsmasq-dhcp[309742]: read /var/lib/neutron/dhcp/02ee7c5a-2db8-434a-a435-61821ceb4b9b/opts
Feb 23 09:54:49 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:54:49.652 265541 INFO neutron.agent.dhcp.agent [None req-0d6a0230-d3ea-40b2-9bcd-7410244dd29e - - - - - -] DHCP configuration for ports {'db544d9d-8b97-42b8-8a54-79f7ab5f8b46'} is completed
Feb 23 09:54:49 np0005626463.localdomain dnsmasq[309742]: exiting on receipt of SIGTERM
Feb 23 09:54:49 np0005626463.localdomain systemd[1]: libpod-c4931b1e71b8b1290fe866fefb1a5885f236e07cf1e81f48f9e248434aa23b6a.scope: Deactivated successfully.
Feb 23 09:54:49 np0005626463.localdomain podman[309761]: 2026-02-23 09:54:49.827346434 +0000 UTC m=+0.071534034 container kill c4931b1e71b8b1290fe866fefb1a5885f236e07cf1e81f48f9e248434aa23b6a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-02ee7c5a-2db8-434a-a435-61821ceb4b9b, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team)
Feb 23 09:54:49 np0005626463.localdomain podman[309774]: 2026-02-23 09:54:49.905703126 +0000 UTC m=+0.067690596 container died c4931b1e71b8b1290fe866fefb1a5885f236e07cf1e81f48f9e248434aa23b6a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-02ee7c5a-2db8-434a-a435-61821ceb4b9b, tcib_managed=true, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Feb 23 09:54:49 np0005626463.localdomain podman[309774]: 2026-02-23 09:54:49.936947395 +0000 UTC m=+0.098934835 container cleanup c4931b1e71b8b1290fe866fefb1a5885f236e07cf1e81f48f9e248434aa23b6a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-02ee7c5a-2db8-434a-a435-61821ceb4b9b, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, org.label-schema.build-date=20260216, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 23 09:54:49 np0005626463.localdomain systemd[1]: libpod-conmon-c4931b1e71b8b1290fe866fefb1a5885f236e07cf1e81f48f9e248434aa23b6a.scope: Deactivated successfully.
Feb 23 09:54:49 np0005626463.localdomain podman[309781]: 2026-02-23 09:54:49.988100913 +0000 UTC m=+0.136069673 container remove c4931b1e71b8b1290fe866fefb1a5885f236e07cf1e81f48f9e248434aa23b6a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-02ee7c5a-2db8-434a-a435-61821ceb4b9b, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216)
Feb 23 09:54:50 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:54:50Z|00107|binding|INFO|Releasing lport 1d4d42e8-30fc-45f0-86d9-320b12bff55b from this chassis (sb_readonly=0)
Feb 23 09:54:50 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:54:50Z|00108|binding|INFO|Setting lport 1d4d42e8-30fc-45f0-86d9-320b12bff55b down in Southbound
Feb 23 09:54:50 np0005626463.localdomain kernel: device tap1d4d42e8-30 left promiscuous mode
Feb 23 09:54:50 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:54:50.045 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:54:50 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:54:50.051 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005626463.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'dhcpfb23302c-55c1-5de0-badf-4fc1ff22837a-02ee7c5a-2db8-434a-a435-61821ceb4b9b', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-02ee7c5a-2db8-434a-a435-61821ceb4b9b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2fbe870428324feda18014285ef9eb40', 'neutron:revision_number': '4', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005626463.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=042074d1-ff67-46dd-af64-fc750025a9c1, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f808c075610>], logical_port=1d4d42e8-30fc-45f0-86d9-320b12bff55b) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f808c075610>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 09:54:50 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:54:50.053 163572 INFO neutron.agent.ovn.metadata.agent [-] Port 1d4d42e8-30fc-45f0-86d9-320b12bff55b in datapath 02ee7c5a-2db8-434a-a435-61821ceb4b9b unbound from our chassis
Feb 23 09:54:50 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:54:50.055 163572 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 02ee7c5a-2db8-434a-a435-61821ceb4b9b or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Feb 23 09:54:50 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:54:50.056 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[641328a5-d474-45cf-acbd-5977ea6b07b2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 09:54:50 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:54:50.066 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:54:50 np0005626463.localdomain ceph-mon[294160]: pgmap v135: 177 pgs: 177 active+clean; 145 MiB data, 738 MiB used, 41 GiB / 42 GiB avail; 38 KiB/s rd, 5.3 KiB/s wr, 55 op/s
Feb 23 09:54:50 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:54:50.261 265541 INFO neutron.agent.dhcp.agent [None req-6c50aab4-86c9-4e9e-89f0-3f62c8618500 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 23 09:54:50 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-590d7f40eabf0d6dfb27cf9871c5b44da87730638b147c979467440410e9b196-merged.mount: Deactivated successfully.
Feb 23 09:54:50 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c4931b1e71b8b1290fe866fefb1a5885f236e07cf1e81f48f9e248434aa23b6a-userdata-shm.mount: Deactivated successfully.
Feb 23 09:54:50 np0005626463.localdomain systemd[1]: run-netns-qdhcp\x2d02ee7c5a\x2d2db8\x2d434a\x2da435\x2d61821ceb4b9b.mount: Deactivated successfully.
Feb 23 09:54:50 np0005626463.localdomain podman[309822]: 2026-02-23 09:54:50.889410866 +0000 UTC m=+0.061537297 container kill b45d2a41c1142d2be7f1aa4bc27db6097a18330a539974ea3f84437645115e35 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b3238cd9-9eb9-4ae1-bb2b-833536c18deb, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Feb 23 09:54:50 np0005626463.localdomain dnsmasq[308657]: read /var/lib/neutron/dhcp/b3238cd9-9eb9-4ae1-bb2b-833536c18deb/addn_hosts - 0 addresses
Feb 23 09:54:50 np0005626463.localdomain dnsmasq-dhcp[308657]: read /var/lib/neutron/dhcp/b3238cd9-9eb9-4ae1-bb2b-833536c18deb/host
Feb 23 09:54:50 np0005626463.localdomain dnsmasq-dhcp[308657]: read /var/lib/neutron/dhcp/b3238cd9-9eb9-4ae1-bb2b-833536c18deb/opts
Feb 23 09:54:51 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:54:51Z|00109|binding|INFO|Releasing lport ff39110f-d5ab-4f4c-b656-11139ee6c196 from this chassis (sb_readonly=0)
Feb 23 09:54:51 np0005626463.localdomain kernel: device tapff39110f-d5 left promiscuous mode
Feb 23 09:54:51 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:54:51.062 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:54:51 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:54:51Z|00110|binding|INFO|Setting lport ff39110f-d5ab-4f4c-b656-11139ee6c196 down in Southbound
Feb 23 09:54:51 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:54:51.070 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005626463.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpfb23302c-55c1-5de0-badf-4fc1ff22837a-b3238cd9-9eb9-4ae1-bb2b-833536c18deb', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b3238cd9-9eb9-4ae1-bb2b-833536c18deb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '02917a1d904f4889b9e244e1ebfc57ca', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005626463.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3fe629c8-1dc0-4c84-9b5b-6b0d444ac4ee, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f808c075610>], logical_port=ff39110f-d5ab-4f4c-b656-11139ee6c196) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f808c075610>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 09:54:51 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:54:51.071 265541 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Feb 23 09:54:51 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:54:51.072 163572 INFO neutron.agent.ovn.metadata.agent [-] Port ff39110f-d5ab-4f4c-b656-11139ee6c196 in datapath b3238cd9-9eb9-4ae1-bb2b-833536c18deb unbound from our chassis
Feb 23 09:54:51 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:54:51.075 163572 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b3238cd9-9eb9-4ae1-bb2b-833536c18deb, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 23 09:54:51 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:54:51.077 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[6f199ec1-b5f7-41fe-9db7-7392f052d521]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 09:54:51 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:54:51.083 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:54:51 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:54:51.085 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:54:51 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.
Feb 23 09:54:51 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:54:51Z|00111|binding|INFO|Releasing lport 4143c8ea-7577-4792-9744-bcff90eb20f2 from this chassis (sb_readonly=0)
Feb 23 09:54:51 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:54:51.388 265541 INFO neutron.agent.linux.ip_lib [None req-1658a744-993a-437c-9d55-5b8ee12ce82d - - - - - -] Device tap8f0c5ba8-2f cannot be used as it has no MAC address
Feb 23 09:54:51 np0005626463.localdomain podman[309847]: 2026-02-23 09:54:51.419173718 +0000 UTC m=+0.099312896 container health_status da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 23 09:54:51 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:54:51.423 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:54:51 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:54:51.427 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:54:51 np0005626463.localdomain kernel: device tap8f0c5ba8-2f entered promiscuous mode
Feb 23 09:54:51 np0005626463.localdomain NetworkManager[5974]: <info>  [1771840491.4347] manager: (tap8f0c5ba8-2f): new Generic device (/org/freedesktop/NetworkManager/Devices/22)
Feb 23 09:54:51 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:54:51Z|00112|binding|INFO|Claiming lport 8f0c5ba8-2f8d-4a97-9bc4-b7dd81bb7431 for this chassis.
Feb 23 09:54:51 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:54:51Z|00113|binding|INFO|8f0c5ba8-2f8d-4a97-9bc4-b7dd81bb7431: Claiming unknown
Feb 23 09:54:51 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:54:51.436 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:54:51 np0005626463.localdomain systemd-udevd[309877]: Network interface NamePolicy= disabled on kernel command line.
Feb 23 09:54:51 np0005626463.localdomain podman[309847]: 2026-02-23 09:54:51.448802716 +0000 UTC m=+0.128941904 container exec_died da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 23 09:54:51 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:54:51.449 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005626463.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpfb23302c-55c1-5de0-badf-4fc1ff22837a-c5dfb0c2-3b4f-4a80-ad77-54809a86fa2f', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c5dfb0c2-3b4f-4a80-ad77-54809a86fa2f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a0bd74f028104d8eab41537070541f77', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d55ec2a4-c84a-4c50-ac81-d4cad2b77be1, chassis=[<ovs.db.idl.Row object at 0x7f808c075610>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f808c075610>], logical_port=8f0c5ba8-2f8d-4a97-9bc4-b7dd81bb7431) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 09:54:51 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:54:51.451 163572 INFO neutron.agent.ovn.metadata.agent [-] Port 8f0c5ba8-2f8d-4a97-9bc4-b7dd81bb7431 in datapath c5dfb0c2-3b4f-4a80-ad77-54809a86fa2f bound to our chassis
Feb 23 09:54:51 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:54:51.453 163572 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network c5dfb0c2-3b4f-4a80-ad77-54809a86fa2f or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Feb 23 09:54:51 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:54:51.454 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[80f7f2ad-79a9-4004-942c-4a37ecec1d85]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 09:54:51 np0005626463.localdomain systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Deactivated successfully.
Feb 23 09:54:51 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap8f0c5ba8-2f: No such device
Feb 23 09:54:51 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:54:51Z|00114|binding|INFO|Setting lport 8f0c5ba8-2f8d-4a97-9bc4-b7dd81bb7431 ovn-installed in OVS
Feb 23 09:54:51 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:54:51Z|00115|binding|INFO|Setting lport 8f0c5ba8-2f8d-4a97-9bc4-b7dd81bb7431 up in Southbound
Feb 23 09:54:51 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:54:51.471 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:54:51 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap8f0c5ba8-2f: No such device
Feb 23 09:54:51 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap8f0c5ba8-2f: No such device
Feb 23 09:54:51 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap8f0c5ba8-2f: No such device
Feb 23 09:54:51 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap8f0c5ba8-2f: No such device
Feb 23 09:54:51 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap8f0c5ba8-2f: No such device
Feb 23 09:54:51 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap8f0c5ba8-2f: No such device
Feb 23 09:54:51 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap8f0c5ba8-2f: No such device
Feb 23 09:54:51 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:54:51.518 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:54:51 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:54:51.551 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:54:51 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e106 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:54:51 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e106 do_prune osdmap full prune enabled
Feb 23 09:54:51 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e107 e107: 6 total, 6 up, 6 in
Feb 23 09:54:51 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e107: 6 total, 6 up, 6 in
Feb 23 09:54:52 np0005626463.localdomain ceph-mon[294160]: pgmap v136: 177 pgs: 177 active+clean; 145 MiB data, 742 MiB used, 41 GiB / 42 GiB avail; 53 KiB/s rd, 5.5 KiB/s wr, 73 op/s
Feb 23 09:54:52 np0005626463.localdomain ceph-mon[294160]: osdmap e107: 6 total, 6 up, 6 in
Feb 23 09:54:52 np0005626463.localdomain podman[309949]: 
Feb 23 09:54:52 np0005626463.localdomain podman[309949]: 2026-02-23 09:54:52.39583052 +0000 UTC m=+0.087282337 container create caa7016a7c1ecbf861dfa0d41a48c5810edcc616cd59faa2f2dd0e6e56e73a81 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c5dfb0c2-3b4f-4a80-ad77-54809a86fa2f, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 23 09:54:52 np0005626463.localdomain systemd[1]: Started libpod-conmon-caa7016a7c1ecbf861dfa0d41a48c5810edcc616cd59faa2f2dd0e6e56e73a81.scope.
Feb 23 09:54:52 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 09:54:52 np0005626463.localdomain podman[309949]: 2026-02-23 09:54:52.353705779 +0000 UTC m=+0.045157636 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 23 09:54:52 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/166f830536de571ce740513568578247182f4fa7b08cb2a3bc0f31bbb6bd274f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 23 09:54:52 np0005626463.localdomain podman[309949]: 2026-02-23 09:54:52.468806917 +0000 UTC m=+0.160258734 container init caa7016a7c1ecbf861dfa0d41a48c5810edcc616cd59faa2f2dd0e6e56e73a81 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c5dfb0c2-3b4f-4a80-ad77-54809a86fa2f, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS)
Feb 23 09:54:52 np0005626463.localdomain systemd[1]: tmp-crun.Hmzlmo.mount: Deactivated successfully.
Feb 23 09:54:52 np0005626463.localdomain podman[309949]: 2026-02-23 09:54:52.480120614 +0000 UTC m=+0.171572431 container start caa7016a7c1ecbf861dfa0d41a48c5810edcc616cd59faa2f2dd0e6e56e73a81 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c5dfb0c2-3b4f-4a80-ad77-54809a86fa2f, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Feb 23 09:54:52 np0005626463.localdomain dnsmasq[309968]: started, version 2.85 cachesize 150
Feb 23 09:54:52 np0005626463.localdomain dnsmasq[309968]: DNS service limited to local subnets
Feb 23 09:54:52 np0005626463.localdomain dnsmasq[309968]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 23 09:54:52 np0005626463.localdomain dnsmasq[309968]: warning: no upstream servers configured
Feb 23 09:54:52 np0005626463.localdomain dnsmasq-dhcp[309968]: DHCP, static leases only on 10.100.0.0, lease time 1d
Feb 23 09:54:52 np0005626463.localdomain dnsmasq[309968]: read /var/lib/neutron/dhcp/c5dfb0c2-3b4f-4a80-ad77-54809a86fa2f/addn_hosts - 0 addresses
Feb 23 09:54:52 np0005626463.localdomain dnsmasq-dhcp[309968]: read /var/lib/neutron/dhcp/c5dfb0c2-3b4f-4a80-ad77-54809a86fa2f/host
Feb 23 09:54:52 np0005626463.localdomain dnsmasq-dhcp[309968]: read /var/lib/neutron/dhcp/c5dfb0c2-3b4f-4a80-ad77-54809a86fa2f/opts
Feb 23 09:54:52 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:54:52.616 265541 INFO neutron.agent.dhcp.agent [None req-efc9d436-1b4f-4d05-bf40-cc111ee43471 - - - - - -] DHCP configuration for ports {'6d4c17c0-28d2-4957-9681-dd047b0ebba7'} is completed
Feb 23 09:54:52 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:54:52.623 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:54:53 np0005626463.localdomain ceph-mon[294160]: pgmap v138: 177 pgs: 177 active+clean; 145 MiB data, 742 MiB used, 41 GiB / 42 GiB avail; 20 KiB/s rd, 2.8 KiB/s wr, 30 op/s
Feb 23 09:54:53 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:54:53Z|00116|binding|INFO|Releasing lport 4143c8ea-7577-4792-9744-bcff90eb20f2 from this chassis (sb_readonly=0)
Feb 23 09:54:53 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:54:53.594 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:54:53 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.
Feb 23 09:54:53 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:54:53.856 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:54:53 np0005626463.localdomain systemd[1]: tmp-crun.I9QnIP.mount: Deactivated successfully.
Feb 23 09:54:53 np0005626463.localdomain podman[309969]: 2026-02-23 09:54:53.925662563 +0000 UTC m=+0.094600101 container health_status 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, build-date=2026-02-05T04:57:10Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, managed_by=edpm_ansible, distribution-scope=public, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9/ubi-minimal, release=1770267347, io.openshift.tags=minimal rhel9, vcs-type=git, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., version=9.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=ubi9-minimal-container, architecture=x86_64, org.opencontainers.image.created=2026-02-05T04:57:10Z)
Feb 23 09:54:53 np0005626463.localdomain podman[309969]: 2026-02-23 09:54:53.942264782 +0000 UTC m=+0.111202310 container exec_died 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, org.opencontainers.image.created=2026-02-05T04:57:10Z, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1770267347, config_id=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., architecture=x86_64, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2026-02-05T04:57:10Z, container_name=openstack_network_exporter, version=9.7, name=ubi9/ubi-minimal, io.openshift.expose-services=, io.buildah.version=1.33.7, vcs-type=git, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Feb 23 09:54:53 np0005626463.localdomain systemd[1]: 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.service: Deactivated successfully.
Feb 23 09:54:55 np0005626463.localdomain dnsmasq[308657]: exiting on receipt of SIGTERM
Feb 23 09:54:55 np0005626463.localdomain systemd[1]: libpod-b45d2a41c1142d2be7f1aa4bc27db6097a18330a539974ea3f84437645115e35.scope: Deactivated successfully.
Feb 23 09:54:55 np0005626463.localdomain podman[310006]: 2026-02-23 09:54:55.206078908 +0000 UTC m=+0.072333658 container kill b45d2a41c1142d2be7f1aa4bc27db6097a18330a539974ea3f84437645115e35 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b3238cd9-9eb9-4ae1-bb2b-833536c18deb, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Feb 23 09:54:55 np0005626463.localdomain podman[310020]: 2026-02-23 09:54:55.289457514 +0000 UTC m=+0.069174391 container died b45d2a41c1142d2be7f1aa4bc27db6097a18330a539974ea3f84437645115e35 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b3238cd9-9eb9-4ae1-bb2b-833536c18deb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb)
Feb 23 09:54:55 np0005626463.localdomain systemd[1]: tmp-crun.ybdHWf.mount: Deactivated successfully.
Feb 23 09:54:55 np0005626463.localdomain podman[310020]: 2026-02-23 09:54:55.321694063 +0000 UTC m=+0.101410900 container cleanup b45d2a41c1142d2be7f1aa4bc27db6097a18330a539974ea3f84437645115e35 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b3238cd9-9eb9-4ae1-bb2b-833536c18deb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Feb 23 09:54:55 np0005626463.localdomain systemd[1]: libpod-conmon-b45d2a41c1142d2be7f1aa4bc27db6097a18330a539974ea3f84437645115e35.scope: Deactivated successfully.
Feb 23 09:54:55 np0005626463.localdomain podman[310027]: 2026-02-23 09:54:55.368172417 +0000 UTC m=+0.135640509 container remove b45d2a41c1142d2be7f1aa4bc27db6097a18330a539974ea3f84437645115e35 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b3238cd9-9eb9-4ae1-bb2b-833536c18deb, org.label-schema.build-date=20260216, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Feb 23 09:54:55 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:54:55.411 265541 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Feb 23 09:54:55 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:54:55.554 163572 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=96b5bb93-7341-4ce6-9b93-6a5de566c711, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '11'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 09:54:55 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:54:55.618 265541 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-23T09:54:54Z, description=, device_id=daed35e2-7c42-4b10-a372-e586a9bec4ff, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f2829345610>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f2829345d90>], id=126e04eb-5ef6-4c95-9995-1dc19782e90c, ip_allocation=immediate, mac_address=fa:16:3e:63:df:d6, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-23T09:54:48Z, description=, dns_domain=, id=c5dfb0c2-3b4f-4a80-ad77-54809a86fa2f, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-ImagesNegativeTestJSON-1249780383-network, port_security_enabled=True, project_id=a0bd74f028104d8eab41537070541f77, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=3813, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=781, status=ACTIVE, subnets=['aadca0bd-2aca-4d46-8acc-f53c76863e44'], tags=[], tenant_id=a0bd74f028104d8eab41537070541f77, updated_at=2026-02-23T09:54:50Z, vlan_transparent=None, network_id=c5dfb0c2-3b4f-4a80-ad77-54809a86fa2f, port_security_enabled=False, project_id=a0bd74f028104d8eab41537070541f77, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=830, status=DOWN, tags=[], tenant_id=a0bd74f028104d8eab41537070541f77, updated_at=2026-02-23T09:54:55Z on network c5dfb0c2-3b4f-4a80-ad77-54809a86fa2f
Feb 23 09:54:55 np0005626463.localdomain dnsmasq[309968]: read /var/lib/neutron/dhcp/c5dfb0c2-3b4f-4a80-ad77-54809a86fa2f/addn_hosts - 1 addresses
Feb 23 09:54:55 np0005626463.localdomain dnsmasq-dhcp[309968]: read /var/lib/neutron/dhcp/c5dfb0c2-3b4f-4a80-ad77-54809a86fa2f/host
Feb 23 09:54:55 np0005626463.localdomain podman[310065]: 2026-02-23 09:54:55.831667708 +0000 UTC m=+0.067609874 container kill caa7016a7c1ecbf861dfa0d41a48c5810edcc616cd59faa2f2dd0e6e56e73a81 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c5dfb0c2-3b4f-4a80-ad77-54809a86fa2f, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, tcib_managed=true)
Feb 23 09:54:55 np0005626463.localdomain dnsmasq-dhcp[309968]: read /var/lib/neutron/dhcp/c5dfb0c2-3b4f-4a80-ad77-54809a86fa2f/opts
Feb 23 09:54:55 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:54:55.920 265541 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Feb 23 09:54:56 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:54:56.083 265541 INFO neutron.agent.dhcp.agent [None req-63645ab9-f215-4269-8159-fc0bce210a6d - - - - - -] DHCP configuration for ports {'126e04eb-5ef6-4c95-9995-1dc19782e90c'} is completed
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.145 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'name': 'test', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000003', 'OS-EXT-SRV-ATTR:host': 'np0005626463.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '37b8098efb0d4ecc90b451a2db0e966f', 'user_id': 'cb6895487918456aa599ca2f76872d00', 'hostId': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.147 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.160 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.163 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:54:56 np0005626463.localdomain ceph-mon[294160]: pgmap v139: 177 pgs: 177 active+clean; 145 MiB data, 742 MiB used, 41 GiB / 42 GiB avail; 11 KiB/s rd, 124 B/s wr, 12 op/s
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.167 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '12496de6-bffa-490d-9daa-577b047a4908', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:54:56.147733', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'b53aa722-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11936.337230841, 'message_signature': 'cda72ab4ba8ab08db2f762b09e5426289ec7ad89bab3dc00aed808fc0e15c793'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 
'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:54:56.147733', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'b53ad38c-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11936.337230841, 'message_signature': 'b6112b8d6c0c3ea860beda9105ceaea4ba816277876dfbd7c833c1a1082f2415'}]}, 'timestamp': '2026-02-23 09:54:56.163990', '_unique_id': '1c3b53c35afa49d8b7333883141c5658'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.167 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.167 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.167 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.167 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.167 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.167 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.167 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.167 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.167 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.167 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.167 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.167 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.167 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.167 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.167 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.167 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.167 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.167 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.167 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.167 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.167 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.167 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.167 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.167 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.167 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.167 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.167 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.167 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.167 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.167 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.167 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.170 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.173 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.176 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '96ae017d-6b70-4be5-bc11-44544e819e78', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:54:56.170172', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': 'b53c70d4-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11936.35971834, 'message_signature': '5670fa7c8d92f9d157fed3f1841d4dcac63d832a7bc16e1ba38d2a9998a86a9f'}]}, 'timestamp': '2026-02-23 09:54:56.174613', '_unique_id': '79523850fab94a4e8fe40d3efa46a99d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.176 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.176 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.176 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.176 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.176 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.176 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.176 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.176 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.176 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.176 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.176 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.176 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.176 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.176 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.176 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.176 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.176 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.176 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.176 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.176 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.176 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.176 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.176 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.176 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.176 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.176 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.176 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.176 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.176 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.176 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.176 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.177 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.177 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.177 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.178 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd195aafe-e084-4f7e-bc5d-0d5961c5fec8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:54:56.177491', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': 'b53cf81a-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11936.35971834, 'message_signature': '81777c6ebc3fddea48fb59f495d51097f517cf1c47158326c1f8f1a172b5e124'}]}, 'timestamp': '2026-02-23 09:54:56.178047', '_unique_id': '91b08addf6fb4e629633d6af0122ad27'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.178 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.178 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.178 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.178 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.178 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.178 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.178 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.178 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.178 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.178 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.178 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.178 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.178 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.178 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.178 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.178 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.178 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.178 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.178 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.178 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.178 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.178 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.178 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.178 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.178 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.178 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.178 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.178 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.178 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.178 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.178 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.181 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Feb 23 09:54:56 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-aecf83ec51d273f51dfbd2c3054b8e6328f27606ae999dc013ec2a626bc3eb75-merged.mount: Deactivated successfully.
Feb 23 09:54:56 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b45d2a41c1142d2be7f1aa4bc27db6097a18330a539974ea3f84437645115e35-userdata-shm.mount: Deactivated successfully.
Feb 23 09:54:56 np0005626463.localdomain systemd[1]: run-netns-qdhcp\x2db3238cd9\x2d9eb9\x2d4ae1\x2dbb2b\x2d833536c18deb.mount: Deactivated successfully.
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.220 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.221 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.224 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '16b753ca-6f35-461d-a827-1b6557019594', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:54:56.182100', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'b543a61a-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11936.371695197, 'message_signature': '13277080cab96402787c272fa8fee4abe207fd8e06c5629140fc521baeaff3ad'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:54:56.182100', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'b543c08c-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11936.371695197, 'message_signature': '21511fc604bca0fa5777744f4a8e7b8e813e8b5e895a9797f3dba1e232828536'}]}, 'timestamp': '2026-02-23 09:54:56.222469', '_unique_id': 'e1b85ae5534b471a8ffc8abb11797f98'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.224 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.224 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.224 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.224 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.224 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.224 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.224 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.224 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.224 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.224 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.224 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.224 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.224 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.224 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.224 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.224 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.224 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.224 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.224 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.224 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.224 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.224 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.224 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.224 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.224 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.224 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.224 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.224 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.224 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.224 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.224 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.226 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.226 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.227 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.229 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '77e135b3-bc9e-44c3-9be6-3921fff16088', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:54:56.226563', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'b54474b4-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11936.337230841, 'message_signature': '40e210079174950b247d807a1dc723c73c8553f0a7ac690671f09ef5fdb2598a'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 
'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:54:56.226563', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'b5449110-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11936.337230841, 'message_signature': '0d1b04927a16acb530460c676f3b7565dc3b1eb4c3c9c4169318180c15f7fa21'}]}, 'timestamp': '2026-02-23 09:54:56.227803', '_unique_id': '9bd90f0796be4d4caacd47ac56846ae7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.229 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.229 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.229 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.229 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.229 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.229 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.229 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.229 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.229 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.229 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.229 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.229 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.229 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.229 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.229 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.229 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.229 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.229 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.229 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.229 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.229 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.229 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.229 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.229 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.229 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.229 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.229 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.229 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.229 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.229 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.229 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.230 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.231 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.requests volume: 1283 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.231 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.233 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1ac3eee6-ac8f-49fc-ab7b-92187ea44cff', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1283, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:54:56.231115', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'b5452a80-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11936.371695197, 'message_signature': '14bfc4b730b4c8b6079b7aca36a053a3d012c2a994cf26de21055e861944b4c4'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 
'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:54:56.231115', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'b5454650-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11936.371695197, 'message_signature': '19d312a854ecfc6f4259374753ecb2779106770ce18dd56c34d0f61a9552a3b4'}]}, 'timestamp': '2026-02-23 09:54:56.232525', '_unique_id': '7e54064a770b40048cc807b794acb0f5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.233 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.233 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.233 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.233 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.233 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.233 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.233 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.233 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.233 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.233 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.233 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.233 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.233 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.233 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.233 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.233 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.233 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.233 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.233 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.233 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.233 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.233 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.233 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.233 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.233 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.233 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.233 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.233 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.233 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.233 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.233 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.235 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.236 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.236 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.latency volume: 1374424344 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.237 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.latency volume: 89322858 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.238 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '46fc66c1-ba89-4d12-92c9-d7e7fcc6c884', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1374424344, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:54:56.236394', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'b545f938-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11936.371695197, 'message_signature': '72c2776b43ffed0221e4b45b101d26821bcddb353ca248677023bb7ea7607ccd'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 89322858, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 
'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:54:56.236394', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'b5460fea-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11936.371695197, 'message_signature': 'c7ffac90749452e6893d6bfed09c86790d4357383ce41c342b9b5b5b4203e007'}]}, 'timestamp': '2026-02-23 09:54:56.237569', '_unique_id': '3abfa13945324cbfb36b588e9126675d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.238 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.238 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.238 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.238 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.238 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.238 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.238 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.238 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.238 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.238 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.238 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.238 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.238 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.238 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.238 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.238 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.238 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.238 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.238 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.238 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.238 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.238 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.238 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.238 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.238 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.238 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.238 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.238 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.238 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.238 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.238 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.240 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Feb 23 09:54:56 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:54:56.250 2 INFO neutron.agent.securitygroups_rpc [None req-e4e33ab4-a498-48c3-b62b-d466306b0b2c 8c67fb6133284335807155391776f7a4 35e3e6665f014caf91b19ef9e685a75a - - default default] Security group member updated ['18ad37ac-3bf6-435c-949b-384a2e1dc20f']
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.258 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/cpu volume: 13130000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.260 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4c4011c7-feb8-423f-9760-4d67a6aeff40', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 13130000000, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'timestamp': '2026-02-23T09:54:56.241176', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': 'b54954b6-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11936.447862772, 'message_signature': '118f84f09689002f96a8ae7d593b2c013e8eb5c76e2b947c11208b9142596fa4'}]}, 'timestamp': '2026-02-23 09:54:56.259086', '_unique_id': 'b0d681469f7743f58711def3e0fb6c1d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.260 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.260 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.260 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.260 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.260 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.260 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.260 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.260 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.260 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.260 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.260 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.260 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.260 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.260 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.260 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.260 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.260 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.260 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.260 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.260 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.260 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.260 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.260 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.260 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.260 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.260 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.260 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.260 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.260 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.260 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.260 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.261 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.262 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.262 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.263 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '36adebc1-d2ac-4ed6-8c90-46a64224a7a0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:54:56.262327', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': 'b549e93a-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11936.35971834, 'message_signature': '93f6057e414dc7617c452e34c871e3ac5c098bb8afcec36213a0da66c97813e3'}]}, 'timestamp': '2026-02-23 09:54:56.262840', '_unique_id': '9da49dd6bbc048cc9f4bc387dcc72ada'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.263 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.263 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.263 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.263 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.263 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.263 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.263 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.263 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.263 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.263 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.263 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.263 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.263 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.263 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.263 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.263 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.263 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.263 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.263 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.263 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.263 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.263 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.263 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.263 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.263 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.263 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.263 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.263 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.263 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.263 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.263 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.265 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.265 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/memory.usage volume: 51.72265625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.267 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'aba727f1-df44-400c-9d11-6b956bf11125', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.72265625, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'timestamp': '2026-02-23T09:54:56.265600', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': 'b54a70ee-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11936.447862772, 'message_signature': '85ec1429ed383a4ea2a2b08606da369b5288b8f489c861ee003227f9abae42ba'}]}, 'timestamp': '2026-02-23 09:54:56.266360', '_unique_id': 'bf3fe55251de4a9ba042ec9781f3b1f1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.267 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.267 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.267 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.267 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.267 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.267 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.267 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.267 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.267 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.267 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.267 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.267 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.267 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.267 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.267 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.267 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.267 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.267 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.267 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.267 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.267 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.267 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.267 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.267 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.267 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.267 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.267 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.267 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.267 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.267 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.267 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.269 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.269 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.270 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.271 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0589513c-2fcb-4869-b120-086e3022de2c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:54:56.269357', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'b54b009a-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11936.337230841, 'message_signature': '67e873917b392a7a1a61eb0d8f53ca7506c03e26d4b03494f17b891b0df42894'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:54:56.269357', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'b54b180a-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11936.337230841, 'message_signature': 'dbce286e6d8e909fc0b4fa386df6db3375f1d3f5dd6935c9f00d7621681ce45e'}]}, 'timestamp': '2026-02-23 09:54:56.270551', '_unique_id': 'e5de1998b89e4621bf58c1546dd655d1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.271 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.271 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.271 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.271 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.271 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.271 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.271 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.271 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.271 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.271 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.271 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.271 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.271 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.271 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.271 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.271 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.271 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.271 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.271 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.271 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.271 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.271 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.271 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.271 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.271 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.271 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.271 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.271 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.271 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.271 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.271 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.273 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.273 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.275 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c9b2896e-6d3e-40db-b90d-cd8626b5c124', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:54:56.273246', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': 'b54b98e8-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11936.35971834, 'message_signature': '2c2cbf8e0058465d9fb0d5b46325d35a3336658630fdee3c41ad979faa2046d7'}]}, 'timestamp': '2026-02-23 09:54:56.274048', '_unique_id': 'ec88f56c5b8c44f08a9029429688f7ec'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.275 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.275 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.275 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.275 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.275 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.275 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.275 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.275 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.275 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.275 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.275 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.275 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.275 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.275 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.275 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.275 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.275 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.275 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.275 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.275 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.275 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.275 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.275 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.275 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.275 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.275 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.275 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.275 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.275 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.275 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.275 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.276 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.276 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.278 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4773f4f0-abe9-41a3-9fc4-e0de44ae2415', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:54:56.276600', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': 'b54c22ea-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11936.35971834, 'message_signature': 'b3bd28d7839920e1556199d5d60846c2a9a8a96d58e023792dada96b9fb3b404'}]}, 'timestamp': '2026-02-23 09:54:56.277511', '_unique_id': 'b7f5d05ecdab44acaa20bf6870ffc11a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.278 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.278 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.278 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.278 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.278 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.278 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.278 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.278 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.278 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.278 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.278 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.278 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.278 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.278 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.278 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.278 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.278 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.278 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.278 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.278 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.278 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.278 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.278 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.278 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.278 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.278 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.278 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.278 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.278 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.278 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.278 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.279 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.279 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.bytes volume: 35597312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.280 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.281 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '66d6317e-9b6c-4f39-8525-9a2394d2c5d8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35597312, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:54:56.279894', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'b54c948c-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11936.371695197, 'message_signature': '931303c450476eac1cbc4b51b7f33005b8eed205737ad8a525b78d11ff6109e5'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 
'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:54:56.279894', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'b54c9f86-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11936.371695197, 'message_signature': 'f4752b9e065c74376873824fbe7bceb27bc0e38e560372d2273933f47f559dc5'}]}, 'timestamp': '2026-02-23 09:54:56.280565', '_unique_id': '2562b358cc054694a4beffecb9332865'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.281 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.281 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.281 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.281 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.281 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.281 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.281 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.281 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.281 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.281 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.281 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.281 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.281 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.281 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.281 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.281 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.281 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.281 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.281 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.281 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.281 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.281 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.281 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.281 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.281 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.281 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.281 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.281 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.281 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.281 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.281 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.282 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.282 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.283 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'eedfa6c8-c180-4a85-bc25-49269f86e16f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:54:56.282512', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': 'b54cf9c2-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11936.35971834, 'message_signature': 'f550b42406adf324968bf67556ee49f8932a604469252de7a3093fad74149aff'}]}, 'timestamp': '2026-02-23 09:54:56.282815', '_unique_id': '609dfe2043f94029977274d536413ff1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.283 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.283 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.283 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.283 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.283 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.283 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.283 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.283 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.283 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.283 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.283 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.283 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.283 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.283 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.283 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.283 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.283 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.283 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.283 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.283 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.283 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.283 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.283 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.283 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.283 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.283 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.283 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.283 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.283 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.283 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.283 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.284 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.284 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.latency volume: 1054797520 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.284 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.latency volume: 21338362 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.285 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2604a197-9685-420d-ba90-5d14560955e2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1054797520, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:54:56.284213', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'b54d3e8c-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11936.371695197, 'message_signature': '52c407b65b1f92a509df30304557a3b7ec2a1bff992f59c195e09b7509ed95ee'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 21338362, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:54:56.284213', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'b54d4e2c-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11936.371695197, 'message_signature': '7c62fde312d422d98c166af4399fa2bbd8fbf11f2a37e747a00b9baaaf4621ca'}]}, 'timestamp': '2026-02-23 09:54:56.285042', '_unique_id': '806378f9789a4615980e99486fd006cc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.285 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.285 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.285 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.285 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.285 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.285 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.285 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.285 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.285 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.285 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.285 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.285 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.285 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.285 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.285 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.285 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.285 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.285 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.285 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.285 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.285 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.285 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.285 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.285 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.285 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.285 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.285 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.285 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.285 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.285 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.285 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.286 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.286 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.bytes volume: 6808 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.287 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd2f4e8f8-75b7-4060-b997-a0e51a59606f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6808, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:54:56.286961', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': 'b54da76e-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11936.35971834, 'message_signature': '4bcf5ab3dc81e72d1bc2077c1dd4d96080402374899c9712028bbb0bc45376c0'}]}, 'timestamp': '2026-02-23 09:54:56.287262', '_unique_id': '04df3ed4419b4f82984b5daf09a2e29d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.287 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.287 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.287 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.287 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.287 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.287 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.287 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.287 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.287 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.287 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.287 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.287 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.287 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.287 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.287 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.287 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.287 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.287 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.287 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.287 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.287 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.287 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.287 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.287 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.287 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.287 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.287 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.287 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.287 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.287 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.287 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.288 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.288 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.bytes volume: 397312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.289 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.290 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8d26078d-87ad-442c-ada0-7760a3f82e7a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 397312, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:54:56.288825', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'b54df228-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11936.371695197, 'message_signature': '9c4d4f0182c375aed6528bcf2e3ca89aa77ada34c7865b7f20e61419e85c308d'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:54:56.288825', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'b54dfe80-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11936.371695197, 'message_signature': '3ff48f39c7c70094bad5c000092fa557d7b3a0424b0e7fe893c426fa49b1765b'}]}, 'timestamp': '2026-02-23 09:54:56.289557', '_unique_id': 'aa6cbc5a164e4588b3d3903931c96114'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.290 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.290 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.290 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.290 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.290 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.290 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.290 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.290 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.290 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.290 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.290 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.290 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.290 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.290 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.290 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.290 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.290 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.290 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.290 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.290 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.290 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.290 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.290 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.290 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.290 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.290 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.290 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.290 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.290 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.290 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.290 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.291 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.291 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.292 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dad6553d-9c61-4512-a01c-551023e7ee8a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:54:56.291353', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': 'b54e5628-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11936.35971834, 'message_signature': 'd6d07421900bd9a5e15c3b50a4d8a8f626919650189a8922cd09a148f83bb2c1'}]}, 'timestamp': '2026-02-23 09:54:56.291751', '_unique_id': 'bceff6e266824e3881b1b0af72ef335a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.292 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.292 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.292 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.292 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.292 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.292 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.292 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.292 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.292 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.292 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.292 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.292 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.292 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.292 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.292 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.292 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.292 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.292 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.292 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.292 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.292 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.292 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.292 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.292 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.292 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.292 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.292 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.292 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.292 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.292 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.292 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.292 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.292 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.292 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.292 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.292 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.292 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.292 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.292 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.292 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.292 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.292 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.292 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.292 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.292 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.292 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.292 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.292 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.292 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.292 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.292 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.292 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.292 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.292 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.293 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.293 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.293 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.294 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5e80d240-436b-4a87-bf84-8364b205642b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:54:56.293591', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': 'b54ea9f2-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11936.35971834, 'message_signature': '3ce1d0ba98ef7ea51bdea3c48cc7dfd17bb7680b3c6409dc9693f70ec9c9dd06'}]}, 'timestamp': '2026-02-23 09:54:56.293916', '_unique_id': '4c29eaa94b8f46e591a966952cd1b748'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.294 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.294 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.294 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.294 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.294 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.294 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.294 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.294 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.294 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.294 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.294 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.294 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.294 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.294 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.294 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.294 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.294 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.294 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.294 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.294 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.294 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.294 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.294 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.294 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.294 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.294 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.294 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.294 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.294 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.294 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.294 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.295 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.295 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.296 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '52495aba-bd53-4ede-b021-c8dab20721b8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:54:56.295492', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': 'b54ef43e-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11936.35971834, 'message_signature': 'fc67d181bf5d9f1f239fd13eb72a8599e24c51255801c9e41f3f6afaee62c9ca'}]}, 'timestamp': '2026-02-23 09:54:56.295780', '_unique_id': '7bff1a9cc8ea422b9bb355f896375118'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.296 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.296 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.296 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.296 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.296 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.296 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.296 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.296 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.296 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.296 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.296 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.296 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.296 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.296 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.296 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.296 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.296 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.296 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.296 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.296 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.296 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.296 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.296 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.296 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.296 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.296 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.296 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.296 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.296 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.296 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.296 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.296 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.296 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.296 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.296 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.296 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.296 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.296 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.296 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.296 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.296 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.296 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.296 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.296 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.296 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.296 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.296 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.296 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.296 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.296 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.296 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.296 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.296 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:54:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.296 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:54:56 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:54:56.698 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:54:56 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:54:56.733 2 INFO neutron.agent.securitygroups_rpc [None req-5f241d63-4aea-41aa-b22a-dca2e6055a49 8c67fb6133284335807155391776f7a4 35e3e6665f014caf91b19ef9e685a75a - - default default] Security group member updated ['18ad37ac-3bf6-435c-949b-384a2e1dc20f']
Feb 23 09:54:56 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e107 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:54:57 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:54:57.393 265541 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-23T09:54:54Z, description=, device_id=daed35e2-7c42-4b10-a372-e586a9bec4ff, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f28293a56a0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f28293a57c0>], id=126e04eb-5ef6-4c95-9995-1dc19782e90c, ip_allocation=immediate, mac_address=fa:16:3e:63:df:d6, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-23T09:54:48Z, description=, dns_domain=, id=c5dfb0c2-3b4f-4a80-ad77-54809a86fa2f, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-ImagesNegativeTestJSON-1249780383-network, port_security_enabled=True, project_id=a0bd74f028104d8eab41537070541f77, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=3813, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=781, status=ACTIVE, subnets=['aadca0bd-2aca-4d46-8acc-f53c76863e44'], tags=[], tenant_id=a0bd74f028104d8eab41537070541f77, updated_at=2026-02-23T09:54:50Z, vlan_transparent=None, network_id=c5dfb0c2-3b4f-4a80-ad77-54809a86fa2f, port_security_enabled=False, project_id=a0bd74f028104d8eab41537070541f77, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=830, status=DOWN, tags=[], tenant_id=a0bd74f028104d8eab41537070541f77, updated_at=2026-02-23T09:54:55Z on network c5dfb0c2-3b4f-4a80-ad77-54809a86fa2f
Feb 23 09:54:57 np0005626463.localdomain dnsmasq[309968]: read /var/lib/neutron/dhcp/c5dfb0c2-3b4f-4a80-ad77-54809a86fa2f/addn_hosts - 1 addresses
Feb 23 09:54:57 np0005626463.localdomain dnsmasq-dhcp[309968]: read /var/lib/neutron/dhcp/c5dfb0c2-3b4f-4a80-ad77-54809a86fa2f/host
Feb 23 09:54:57 np0005626463.localdomain dnsmasq-dhcp[309968]: read /var/lib/neutron/dhcp/c5dfb0c2-3b4f-4a80-ad77-54809a86fa2f/opts
Feb 23 09:54:57 np0005626463.localdomain podman[310100]: 2026-02-23 09:54:57.626016559 +0000 UTC m=+0.064801278 container kill caa7016a7c1ecbf861dfa0d41a48c5810edcc616cd59faa2f2dd0e6e56e73a81 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c5dfb0c2-3b4f-4a80-ad77-54809a86fa2f, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260216)
Feb 23 09:54:57 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:54:57.629 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:54:57 np0005626463.localdomain ceph-mon[294160]: pgmap v140: 177 pgs: 177 active+clean; 145 MiB data, 742 MiB used, 41 GiB / 42 GiB avail; 8.7 KiB/s rd, 102 B/s wr, 10 op/s
Feb 23 09:54:57 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:54:57.958 265541 INFO neutron.agent.dhcp.agent [None req-e608f138-ffbf-4dc8-8bd2-b1786562fd53 - - - - - -] DHCP configuration for ports {'126e04eb-5ef6-4c95-9995-1dc19782e90c'} is completed
Feb 23 09:55:00 np0005626463.localdomain ceph-mon[294160]: pgmap v141: 177 pgs: 177 active+clean; 145 MiB data, 742 MiB used, 41 GiB / 42 GiB avail; 8.7 KiB/s rd, 102 B/s wr, 10 op/s
Feb 23 09:55:01 np0005626463.localdomain ceph-mon[294160]: pgmap v142: 177 pgs: 177 active+clean; 145 MiB data, 742 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:55:01 np0005626463.localdomain dnsmasq[309968]: read /var/lib/neutron/dhcp/c5dfb0c2-3b4f-4a80-ad77-54809a86fa2f/addn_hosts - 0 addresses
Feb 23 09:55:01 np0005626463.localdomain dnsmasq-dhcp[309968]: read /var/lib/neutron/dhcp/c5dfb0c2-3b4f-4a80-ad77-54809a86fa2f/host
Feb 23 09:55:01 np0005626463.localdomain podman[310139]: 2026-02-23 09:55:01.564821196 +0000 UTC m=+0.038147080 container kill caa7016a7c1ecbf861dfa0d41a48c5810edcc616cd59faa2f2dd0e6e56e73a81 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c5dfb0c2-3b4f-4a80-ad77-54809a86fa2f, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216)
Feb 23 09:55:01 np0005626463.localdomain dnsmasq-dhcp[309968]: read /var/lib/neutron/dhcp/c5dfb0c2-3b4f-4a80-ad77-54809a86fa2f/opts
Feb 23 09:55:01 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:55:01Z|00117|binding|INFO|Releasing lport 8f0c5ba8-2f8d-4a97-9bc4-b7dd81bb7431 from this chassis (sb_readonly=0)
Feb 23 09:55:01 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:55:01.715 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:55:01 np0005626463.localdomain kernel: device tap8f0c5ba8-2f left promiscuous mode
Feb 23 09:55:01 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:55:01Z|00118|binding|INFO|Setting lport 8f0c5ba8-2f8d-4a97-9bc4-b7dd81bb7431 down in Southbound
Feb 23 09:55:01 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:55:01.726 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005626463.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpfb23302c-55c1-5de0-badf-4fc1ff22837a-c5dfb0c2-3b4f-4a80-ad77-54809a86fa2f', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c5dfb0c2-3b4f-4a80-ad77-54809a86fa2f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a0bd74f028104d8eab41537070541f77', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005626463.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d55ec2a4-c84a-4c50-ac81-d4cad2b77be1, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f808c075610>], logical_port=8f0c5ba8-2f8d-4a97-9bc4-b7dd81bb7431) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f808c075610>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 09:55:01 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:55:01.728 163572 INFO neutron.agent.ovn.metadata.agent [-] Port 8f0c5ba8-2f8d-4a97-9bc4-b7dd81bb7431 in datapath c5dfb0c2-3b4f-4a80-ad77-54809a86fa2f unbound from our chassis
Feb 23 09:55:01 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:55:01.732 163572 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c5dfb0c2-3b4f-4a80-ad77-54809a86fa2f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 23 09:55:01 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:55:01.733 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[5d03b4c4-fbe6-4bfa-bcdc-40f3d378507c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 09:55:01 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:55:01.739 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:55:01 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e107 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:55:02 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:55:02.663 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:55:03 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:55:03.013 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:55:03 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:55:03Z|00119|binding|INFO|Releasing lport 4143c8ea-7577-4792-9744-bcff90eb20f2 from this chassis (sb_readonly=0)
Feb 23 09:55:03 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:55:03.175 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:55:03 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:55:03.907 2 INFO neutron.agent.securitygroups_rpc [None req-d42fa55b-3158-4a2c-8987-32ad0eaad7e9 c2b38675f57640819bf191ad8152e7cb 7f67087411544c55a9225236eb297b90 - - default default] Security group rule updated ['d03de417-eb2e-47e8-ad59-eae56add5dd4']
Feb 23 09:55:04 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:55:04.028 2 INFO neutron.agent.securitygroups_rpc [None req-e1f7edff-cd43-4ce1-b8ac-263a3462cef1 c2b38675f57640819bf191ad8152e7cb 7f67087411544c55a9225236eb297b90 - - default default] Security group rule updated ['d03de417-eb2e-47e8-ad59-eae56add5dd4']
Feb 23 09:55:04 np0005626463.localdomain ceph-mon[294160]: pgmap v143: 177 pgs: 177 active+clean; 145 MiB data, 742 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:55:04 np0005626463.localdomain systemd[1]: tmp-crun.7Nw3bz.mount: Deactivated successfully.
Feb 23 09:55:04 np0005626463.localdomain dnsmasq[309968]: exiting on receipt of SIGTERM
Feb 23 09:55:04 np0005626463.localdomain podman[310180]: 2026-02-23 09:55:04.451471308 +0000 UTC m=+0.069259001 container kill caa7016a7c1ecbf861dfa0d41a48c5810edcc616cd59faa2f2dd0e6e56e73a81 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c5dfb0c2-3b4f-4a80-ad77-54809a86fa2f, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Feb 23 09:55:04 np0005626463.localdomain systemd[1]: libpod-caa7016a7c1ecbf861dfa0d41a48c5810edcc616cd59faa2f2dd0e6e56e73a81.scope: Deactivated successfully.
Feb 23 09:55:04 np0005626463.localdomain podman[310192]: 2026-02-23 09:55:04.519550421 +0000 UTC m=+0.058085126 container died caa7016a7c1ecbf861dfa0d41a48c5810edcc616cd59faa2f2dd0e6e56e73a81 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c5dfb0c2-3b4f-4a80-ad77-54809a86fa2f, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 23 09:55:04 np0005626463.localdomain podman[310192]: 2026-02-23 09:55:04.564642264 +0000 UTC m=+0.103176929 container cleanup caa7016a7c1ecbf861dfa0d41a48c5810edcc616cd59faa2f2dd0e6e56e73a81 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c5dfb0c2-3b4f-4a80-ad77-54809a86fa2f, org.label-schema.build-date=20260216, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0)
Feb 23 09:55:04 np0005626463.localdomain systemd[1]: libpod-conmon-caa7016a7c1ecbf861dfa0d41a48c5810edcc616cd59faa2f2dd0e6e56e73a81.scope: Deactivated successfully.
Feb 23 09:55:04 np0005626463.localdomain podman[310199]: 2026-02-23 09:55:04.592927948 +0000 UTC m=+0.117744529 container remove caa7016a7c1ecbf861dfa0d41a48c5810edcc616cd59faa2f2dd0e6e56e73a81 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c5dfb0c2-3b4f-4a80-ad77-54809a86fa2f, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Feb 23 09:55:04 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:55:04.853 265541 INFO neutron.agent.dhcp.agent [None req-ff3cd298-8062-4235-a6fd-9e5d9742d7db - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 23 09:55:04 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:55:04.916 265541 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Feb 23 09:55:05 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.
Feb 23 09:55:05 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.
Feb 23 09:55:05 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/3895599750' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 23 09:55:05 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/3895599750' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 23 09:55:05 np0005626463.localdomain ceph-mon[294160]: pgmap v144: 177 pgs: 177 active+clean; 145 MiB data, 742 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:55:05 np0005626463.localdomain podman[310218]: 2026-02-23 09:55:05.417269506 +0000 UTC m=+0.087335319 container health_status bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Feb 23 09:55:05 np0005626463.localdomain podman[310218]: 2026-02-23 09:55:05.429604728 +0000 UTC m=+0.099670501 container exec_died bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Feb 23 09:55:05 np0005626463.localdomain systemd[1]: tmp-crun.WhzQWp.mount: Deactivated successfully.
Feb 23 09:55:05 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-166f830536de571ce740513568578247182f4fa7b08cb2a3bc0f31bbb6bd274f-merged.mount: Deactivated successfully.
Feb 23 09:55:05 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-caa7016a7c1ecbf861dfa0d41a48c5810edcc616cd59faa2f2dd0e6e56e73a81-userdata-shm.mount: Deactivated successfully.
Feb 23 09:55:05 np0005626463.localdomain systemd[1]: run-netns-qdhcp\x2dc5dfb0c2\x2d3b4f\x2d4a80\x2dad77\x2d54809a86fa2f.mount: Deactivated successfully.
Feb 23 09:55:05 np0005626463.localdomain systemd[1]: bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.service: Deactivated successfully.
Feb 23 09:55:05 np0005626463.localdomain podman[310217]: 2026-02-23 09:55:05.519961819 +0000 UTC m=+0.194062706 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, container_name=ovn_controller, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_controller, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 23 09:55:05 np0005626463.localdomain podman[310217]: 2026-02-23 09:55:05.614299334 +0000 UTC m=+0.288400241 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 23 09:55:05 np0005626463.localdomain systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully.
Feb 23 09:55:05 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:55:05.768 265541 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Feb 23 09:55:06 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:55:06.237 2 INFO neutron.agent.securitygroups_rpc [None req-ba0f42b0-8ba1-470a-9030-cc088e382d98 dcce68a4e6d440099c8b52030a278ab7 1349075215be49eda0b375e59aa77e22 - - default default] Security group member updated ['1a30abeb-10f2-4401-bae3-62a7c905b8e3']
Feb 23 09:55:06 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:55:06.384 2 INFO neutron.agent.securitygroups_rpc [None req-ba0f42b0-8ba1-470a-9030-cc088e382d98 dcce68a4e6d440099c8b52030a278ab7 1349075215be49eda0b375e59aa77e22 - - default default] Security group member updated ['1a30abeb-10f2-4401-bae3-62a7c905b8e3']
Feb 23 09:55:06 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e107 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:55:07 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:55:07.516 2 INFO neutron.agent.securitygroups_rpc [None req-d0d1b9f7-4125-4c2c-a5f0-a4480a3b4692 dcce68a4e6d440099c8b52030a278ab7 1349075215be49eda0b375e59aa77e22 - - default default] Security group member updated ['1a30abeb-10f2-4401-bae3-62a7c905b8e3']
Feb 23 09:55:07 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:55:07.564 265541 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Feb 23 09:55:07 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:55:07.663 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:55:07 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:55:07.666 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:55:07 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:55:07.877 2 INFO neutron.agent.securitygroups_rpc [None req-295aa794-6290-4e56-bed7-db18bb8fb456 dcce68a4e6d440099c8b52030a278ab7 1349075215be49eda0b375e59aa77e22 - - default default] Security group member updated ['1a30abeb-10f2-4401-bae3-62a7c905b8e3']
Feb 23 09:55:07 np0005626463.localdomain ceph-mon[294160]: pgmap v145: 177 pgs: 177 active+clean; 145 MiB data, 742 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:55:07 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:55:07.913 265541 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Feb 23 09:55:08 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:55:08.761 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:55:08 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.
Feb 23 09:55:08 np0005626463.localdomain podman[310267]: 2026-02-23 09:55:08.91146452 +0000 UTC m=+0.083355726 container health_status be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.license=GPLv2, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.build-date=20260216, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Feb 23 09:55:08 np0005626463.localdomain podman[310267]: 2026-02-23 09:55:08.927172535 +0000 UTC m=+0.099063741 container exec_died be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, config_id=ceilometer_agent_compute, org.label-schema.build-date=20260216, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true)
Feb 23 09:55:08 np0005626463.localdomain systemd[1]: be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.service: Deactivated successfully.
Feb 23 09:55:09 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.107:0/3228397301' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 09:55:09 np0005626463.localdomain podman[242954]: time="2026-02-23T09:55:09Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 23 09:55:09 np0005626463.localdomain podman[242954]: @ - - [23/Feb/2026:09:55:09 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 157081 "" "Go-http-client/1.1"
Feb 23 09:55:09 np0005626463.localdomain podman[242954]: @ - - [23/Feb/2026:09:55:09 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18811 "" "Go-http-client/1.1"
Feb 23 09:55:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:55:09.607 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:55:10 np0005626463.localdomain ceph-mon[294160]: pgmap v146: 177 pgs: 177 active+clean; 145 MiB data, 742 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:55:10 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:55:10.290 2 INFO neutron.agent.securitygroups_rpc [req-54148787-22c3-403e-9c83-5d532e459a95 req-dcd0fbb9-2ed1-469d-8be3-13ca7cbeb9c8 c2b38675f57640819bf191ad8152e7cb 7f67087411544c55a9225236eb297b90 - - default default] Security group member updated ['d03de417-eb2e-47e8-ad59-eae56add5dd4']
Feb 23 09:55:11 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e107 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:55:12 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:55:12.085 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:55:12 np0005626463.localdomain ceph-mon[294160]: pgmap v147: 177 pgs: 177 active+clean; 163 MiB data, 780 MiB used, 41 GiB / 42 GiB avail; 1023 B/s rd, 1010 KiB/s wr, 2 op/s
Feb 23 09:55:12 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:55:12.665 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:55:12 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.
Feb 23 09:55:12 np0005626463.localdomain podman[310288]: 2026-02-23 09:55:12.903651482 +0000 UTC m=+0.073210122 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260216)
Feb 23 09:55:12 np0005626463.localdomain podman[310288]: 2026-02-23 09:55:12.938176859 +0000 UTC m=+0.107735509 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true)
Feb 23 09:55:12 np0005626463.localdomain systemd[1]: 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully.
Feb 23 09:55:13 np0005626463.localdomain openstack_network_exporter[245358]: ERROR   09:55:13 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 23 09:55:13 np0005626463.localdomain openstack_network_exporter[245358]: 
Feb 23 09:55:13 np0005626463.localdomain openstack_network_exporter[245358]: ERROR   09:55:13 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 23 09:55:13 np0005626463.localdomain openstack_network_exporter[245358]: 
Feb 23 09:55:14 np0005626463.localdomain ceph-mon[294160]: pgmap v148: 177 pgs: 177 active+clean; 192 MiB data, 806 MiB used, 41 GiB / 42 GiB avail; 17 KiB/s rd, 1.8 MiB/s wr, 27 op/s
Feb 23 09:55:14 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:55:14Z|00120|binding|INFO|Releasing lport 4143c8ea-7577-4792-9744-bcff90eb20f2 from this chassis (sb_readonly=0)
Feb 23 09:55:14 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:55:14.540 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:55:15 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:55:15Z|00121|binding|INFO|Releasing lport 4143c8ea-7577-4792-9744-bcff90eb20f2 from this chassis (sb_readonly=0)
Feb 23 09:55:15 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:55:15.196 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:55:15 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.107:0/3506502717' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 09:55:16 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:55:16.180 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:55:16 np0005626463.localdomain ceph-mon[294160]: pgmap v149: 177 pgs: 177 active+clean; 192 MiB data, 806 MiB used, 41 GiB / 42 GiB avail; 17 KiB/s rd, 1.8 MiB/s wr, 27 op/s
Feb 23 09:55:16 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.107:0/496496546' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 09:55:16 np0005626463.localdomain sudo[310306]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 09:55:16 np0005626463.localdomain sudo[310306]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:55:16 np0005626463.localdomain sudo[310306]: pam_unix(sudo:session): session closed for user root
Feb 23 09:55:16 np0005626463.localdomain sudo[310324]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 23 09:55:16 np0005626463.localdomain sudo[310324]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:55:16 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e107 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:55:17 np0005626463.localdomain sudo[310324]: pam_unix(sudo:session): session closed for user root
Feb 23 09:55:17 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Feb 23 09:55:17 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:55:17 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:55:17.668 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:55:17 np0005626463.localdomain sudo[310373]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 09:55:17 np0005626463.localdomain sudo[310373]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:55:17 np0005626463.localdomain sudo[310373]: pam_unix(sudo:session): session closed for user root
Feb 23 09:55:17 np0005626463.localdomain ceph-mon[294160]: pgmap v150: 177 pgs: 177 active+clean; 192 MiB data, 806 MiB used, 41 GiB / 42 GiB avail; 24 KiB/s rd, 1.8 MiB/s wr, 36 op/s
Feb 23 09:55:17 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:55:17 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 23 09:55:17 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:55:17 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 23 09:55:19 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:55:19.939 265541 INFO neutron.agent.linux.ip_lib [None req-17e43156-c4f0-496a-902f-89d922724e50 - - - - - -] Device tapfea38170-06 cannot be used as it has no MAC address
Feb 23 09:55:20 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:55:20.006 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:55:20 np0005626463.localdomain kernel: device tapfea38170-06 entered promiscuous mode
Feb 23 09:55:20 np0005626463.localdomain NetworkManager[5974]: <info>  [1771840520.0168] manager: (tapfea38170-06): new Generic device (/org/freedesktop/NetworkManager/Devices/23)
Feb 23 09:55:20 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:55:20Z|00122|binding|INFO|Claiming lport fea38170-0626-427b-8a36-b82b8e008ab6 for this chassis.
Feb 23 09:55:20 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:55:20Z|00123|binding|INFO|fea38170-0626-427b-8a36-b82b8e008ab6: Claiming unknown
Feb 23 09:55:20 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:55:20.018 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:55:20 np0005626463.localdomain systemd-udevd[310401]: Network interface NamePolicy= disabled on kernel command line.
Feb 23 09:55:20 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:55:20.025 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:55:20 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:55:20.043 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005626463.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpfb23302c-55c1-5de0-badf-4fc1ff22837a-5b164f5a-6aae-4898-a6ea-a1c77a8cf652', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5b164f5a-6aae-4898-a6ea-a1c77a8cf652', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5632ff1108264def864ca9b5473cb716', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=cca3f636-88c2-4a23-a28f-aa045d27b076, chassis=[<ovs.db.idl.Row object at 0x7f808c075610>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f808c075610>], logical_port=fea38170-0626-427b-8a36-b82b8e008ab6) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 09:55:20 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:55:20.045 163572 INFO neutron.agent.ovn.metadata.agent [-] Port fea38170-0626-427b-8a36-b82b8e008ab6 in datapath 5b164f5a-6aae-4898-a6ea-a1c77a8cf652 bound to our chassis
Feb 23 09:55:20 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:55:20.047 163572 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 5b164f5a-6aae-4898-a6ea-a1c77a8cf652 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Feb 23 09:55:20 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:55:20.048 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[8bd7a47a-49cb-431d-bfd3-1c208c57d7c0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 09:55:20 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tapfea38170-06: No such device
Feb 23 09:55:20 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:55:20Z|00124|binding|INFO|Setting lport fea38170-0626-427b-8a36-b82b8e008ab6 ovn-installed in OVS
Feb 23 09:55:20 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:55:20Z|00125|binding|INFO|Setting lport fea38170-0626-427b-8a36-b82b8e008ab6 up in Southbound
Feb 23 09:55:20 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:55:20.054 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:55:20 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:55:20.055 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 23 09:55:20 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:55:20.056 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 23 09:55:20 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:55:20.058 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:55:20 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tapfea38170-06: No such device
Feb 23 09:55:20 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tapfea38170-06: No such device
Feb 23 09:55:20 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tapfea38170-06: No such device
Feb 23 09:55:20 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tapfea38170-06: No such device
Feb 23 09:55:20 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tapfea38170-06: No such device
Feb 23 09:55:20 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tapfea38170-06: No such device
Feb 23 09:55:20 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tapfea38170-06: No such device
Feb 23 09:55:20 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:55:20.092 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:55:20 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:55:20.115 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:55:20 np0005626463.localdomain ceph-mon[294160]: pgmap v151: 177 pgs: 177 active+clean; 192 MiB data, 806 MiB used, 41 GiB / 42 GiB avail; 24 KiB/s rd, 1.8 MiB/s wr, 36 op/s
Feb 23 09:55:20 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.107:0/970130502' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 09:55:20 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Feb 23 09:55:20 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:55:20 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:55:20.593 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 23 09:55:20 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:55:20.593 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquired lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 23 09:55:20 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:55:20.594 282211 DEBUG nova.network.neutron [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 23 09:55:20 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:55:20.594 282211 DEBUG nova.objects.instance [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c2a7d92b-952f-46a7-8a6a-3322a48fcf4b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 23 09:55:20 np0005626463.localdomain podman[310472]: 
Feb 23 09:55:20 np0005626463.localdomain podman[310472]: 2026-02-23 09:55:20.945994665 +0000 UTC m=+0.094701417 container create 877649022c10c10c453bdf68b55a9c7d3447d25e11bf446b40e71f7097310905 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5b164f5a-6aae-4898-a6ea-a1c77a8cf652, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260216)
Feb 23 09:55:20 np0005626463.localdomain systemd[1]: Started libpod-conmon-877649022c10c10c453bdf68b55a9c7d3447d25e11bf446b40e71f7097310905.scope.
Feb 23 09:55:20 np0005626463.localdomain podman[310472]: 2026-02-23 09:55:20.897638401 +0000 UTC m=+0.046345172 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 23 09:55:21 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 09:55:21 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/930a766cf5e4b8a84c31286e57382a4788939504b12f986029e04d96b0f3a126/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 23 09:55:21 np0005626463.localdomain podman[310472]: 2026-02-23 09:55:21.015027938 +0000 UTC m=+0.163734689 container init 877649022c10c10c453bdf68b55a9c7d3447d25e11bf446b40e71f7097310905 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5b164f5a-6aae-4898-a6ea-a1c77a8cf652, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 23 09:55:21 np0005626463.localdomain podman[310472]: 2026-02-23 09:55:21.023816569 +0000 UTC m=+0.172523310 container start 877649022c10c10c453bdf68b55a9c7d3447d25e11bf446b40e71f7097310905 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5b164f5a-6aae-4898-a6ea-a1c77a8cf652, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260216)
Feb 23 09:55:21 np0005626463.localdomain dnsmasq[310490]: started, version 2.85 cachesize 150
Feb 23 09:55:21 np0005626463.localdomain dnsmasq[310490]: DNS service limited to local subnets
Feb 23 09:55:21 np0005626463.localdomain dnsmasq[310490]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 23 09:55:21 np0005626463.localdomain dnsmasq[310490]: warning: no upstream servers configured
Feb 23 09:55:21 np0005626463.localdomain dnsmasq-dhcp[310490]: DHCP, static leases only on 10.100.0.0, lease time 1d
Feb 23 09:55:21 np0005626463.localdomain dnsmasq[310490]: read /var/lib/neutron/dhcp/5b164f5a-6aae-4898-a6ea-a1c77a8cf652/addn_hosts - 0 addresses
Feb 23 09:55:21 np0005626463.localdomain dnsmasq-dhcp[310490]: read /var/lib/neutron/dhcp/5b164f5a-6aae-4898-a6ea-a1c77a8cf652/host
Feb 23 09:55:21 np0005626463.localdomain dnsmasq-dhcp[310490]: read /var/lib/neutron/dhcp/5b164f5a-6aae-4898-a6ea-a1c77a8cf652/opts
Feb 23 09:55:21 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.107:0/4058803820' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 09:55:21 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.108:0/2954119314' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 09:55:21 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:55:21 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.108:0/3601251603' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 09:55:21 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:55:21.204 265541 INFO neutron.agent.dhcp.agent [None req-a77d95f7-8e38-4044-b7ba-f69e17d3490f - - - - - -] DHCP configuration for ports {'a1c7667d-69ec-4b95-8d4d-acc5bfc0d1b5'} is completed
Feb 23 09:55:21 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.
Feb 23 09:55:21 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:55:21.842 282211 DEBUG nova.network.neutron [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Updating instance_info_cache with network_info: [{"id": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "address": "fa:16:3e:a0:9d:00", "network": {"id": "9da5b53d-3184-450f-9a5b-bdba1a6c9f6d", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "37b8098efb0d4ecc90b451a2db0e966f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa27e5011-20", "ovs_interfaceid": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 23 09:55:21 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:55:21.859 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Releasing lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 23 09:55:21 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:55:21.860 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 23 09:55:21 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e107 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:55:21 np0005626463.localdomain podman[310491]: 2026-02-23 09:55:21.895722828 +0000 UTC m=+0.070989125 container health_status da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 23 09:55:21 np0005626463.localdomain podman[310491]: 2026-02-23 09:55:21.908361258 +0000 UTC m=+0.083627545 container exec_died da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 23 09:55:21 np0005626463.localdomain systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Deactivated successfully.
Feb 23 09:55:21 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:55:21.926 265541 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Feb 23 09:55:22 np0005626463.localdomain ceph-mon[294160]: pgmap v152: 177 pgs: 177 active+clean; 192 MiB data, 806 MiB used, 41 GiB / 42 GiB avail; 1.1 MiB/s rd, 1.8 MiB/s wr, 70 op/s
Feb 23 09:55:22 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:55:22.718 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:55:22 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:55:22.928 265541 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Feb 23 09:55:23 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:55:23.054 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:55:23 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:55:23.054 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:55:23 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:55:23.055 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 23 09:55:23 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:55:23.706 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:55:24 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:55:24.050 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:55:24 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:55:24.053 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:55:24 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:55:24.053 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:55:24 np0005626463.localdomain ceph-mon[294160]: pgmap v153: 177 pgs: 177 active+clean; 192 MiB data, 806 MiB used, 41 GiB / 42 GiB avail; 1.9 MiB/s rd, 819 KiB/s wr, 97 op/s
Feb 23 09:55:24 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:55:24.813 265541 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-23T09:55:24Z, description=, device_id=f4707105-8993-4923-9b08-bd7f3dfa76d5, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f2829c05fd0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f2829c05ca0>], id=96c8333d-b497-441e-9fb8-278da37499dc, ip_allocation=immediate, mac_address=fa:16:3e:e3:7e:66, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-23T09:55:17Z, description=, dns_domain=, id=5b164f5a-6aae-4898-a6ea-a1c77a8cf652, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-LiveMigrationTest-170814225-network, port_security_enabled=True, project_id=5632ff1108264def864ca9b5473cb716, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=63310, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=955, status=ACTIVE, subnets=['a71d2280-a111-45a5-9765-078a6bc8268f'], tags=[], tenant_id=5632ff1108264def864ca9b5473cb716, updated_at=2026-02-23T09:55:18Z, vlan_transparent=None, network_id=5b164f5a-6aae-4898-a6ea-a1c77a8cf652, port_security_enabled=False, project_id=5632ff1108264def864ca9b5473cb716, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=998, status=DOWN, tags=[], tenant_id=5632ff1108264def864ca9b5473cb716, updated_at=2026-02-23T09:55:24Z on network 5b164f5a-6aae-4898-a6ea-a1c77a8cf652
Feb 23 09:55:24 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.
Feb 23 09:55:24 np0005626463.localdomain podman[310514]: 2026-02-23 09:55:24.900949735 +0000 UTC m=+0.071526191 container health_status 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, com.redhat.component=ubi9-minimal-container, release=1770267347, version=9.7, io.openshift.expose-services=, vcs-type=git, config_id=openstack_network_exporter, io.buildah.version=1.33.7, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vendor=Red Hat, Inc., managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9/ubi-minimal, distribution-scope=public, org.opencontainers.image.created=2026-02-05T04:57:10Z, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, build-date=2026-02-05T04:57:10Z, container_name=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64)
Feb 23 09:55:24 np0005626463.localdomain podman[310514]: 2026-02-23 09:55:24.916242758 +0000 UTC m=+0.086819204 container exec_died 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, architecture=x86_64, release=1770267347, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, managed_by=edpm_ansible, build-date=2026-02-05T04:57:10Z, name=ubi9/ubi-minimal, container_name=openstack_network_exporter, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.7, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.openshift.expose-services=, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., io.buildah.version=1.33.7, org.opencontainers.image.created=2026-02-05T04:57:10Z)
Feb 23 09:55:24 np0005626463.localdomain systemd[1]: 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.service: Deactivated successfully.
Feb 23 09:55:25 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:55:25.050 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:55:25 np0005626463.localdomain podman[310548]: 2026-02-23 09:55:25.053838149 +0000 UTC m=+0.068927261 container kill 877649022c10c10c453bdf68b55a9c7d3447d25e11bf446b40e71f7097310905 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5b164f5a-6aae-4898-a6ea-a1c77a8cf652, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true)
Feb 23 09:55:25 np0005626463.localdomain systemd[1]: tmp-crun.lq3rCc.mount: Deactivated successfully.
Feb 23 09:55:25 np0005626463.localdomain dnsmasq[310490]: read /var/lib/neutron/dhcp/5b164f5a-6aae-4898-a6ea-a1c77a8cf652/addn_hosts - 1 addresses
Feb 23 09:55:25 np0005626463.localdomain dnsmasq-dhcp[310490]: read /var/lib/neutron/dhcp/5b164f5a-6aae-4898-a6ea-a1c77a8cf652/host
Feb 23 09:55:25 np0005626463.localdomain dnsmasq-dhcp[310490]: read /var/lib/neutron/dhcp/5b164f5a-6aae-4898-a6ea-a1c77a8cf652/opts
Feb 23 09:55:25 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:55:25.283 265541 INFO neutron.agent.dhcp.agent [None req-17b79acc-a3db-4c16-b59e-6f9ed9a4a168 - - - - - -] DHCP configuration for ports {'96c8333d-b497-441e-9fb8-278da37499dc'} is completed
Feb 23 09:55:25 np0005626463.localdomain ceph-mon[294160]: pgmap v154: 177 pgs: 177 active+clean; 192 MiB data, 806 MiB used, 41 GiB / 42 GiB avail; 1.9 MiB/s rd, 14 KiB/s wr, 73 op/s
Feb 23 09:55:25 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:55:25.884 265541 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-23T09:55:24Z, description=, device_id=f4707105-8993-4923-9b08-bd7f3dfa76d5, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f2829c39e50>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f2829c73400>], id=96c8333d-b497-441e-9fb8-278da37499dc, ip_allocation=immediate, mac_address=fa:16:3e:e3:7e:66, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-23T09:55:17Z, description=, dns_domain=, id=5b164f5a-6aae-4898-a6ea-a1c77a8cf652, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-LiveMigrationTest-170814225-network, port_security_enabled=True, project_id=5632ff1108264def864ca9b5473cb716, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=63310, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=955, status=ACTIVE, subnets=['a71d2280-a111-45a5-9765-078a6bc8268f'], tags=[], tenant_id=5632ff1108264def864ca9b5473cb716, updated_at=2026-02-23T09:55:18Z, vlan_transparent=None, network_id=5b164f5a-6aae-4898-a6ea-a1c77a8cf652, port_security_enabled=False, project_id=5632ff1108264def864ca9b5473cb716, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=998, status=DOWN, tags=[], tenant_id=5632ff1108264def864ca9b5473cb716, updated_at=2026-02-23T09:55:24Z on network 5b164f5a-6aae-4898-a6ea-a1c77a8cf652
Feb 23 09:55:26 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:55:26.054 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:55:26 np0005626463.localdomain podman[310585]: 2026-02-23 09:55:26.118085189 +0000 UTC m=+0.068650342 container kill 877649022c10c10c453bdf68b55a9c7d3447d25e11bf446b40e71f7097310905 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5b164f5a-6aae-4898-a6ea-a1c77a8cf652, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2)
Feb 23 09:55:26 np0005626463.localdomain dnsmasq[310490]: read /var/lib/neutron/dhcp/5b164f5a-6aae-4898-a6ea-a1c77a8cf652/addn_hosts - 1 addresses
Feb 23 09:55:26 np0005626463.localdomain dnsmasq-dhcp[310490]: read /var/lib/neutron/dhcp/5b164f5a-6aae-4898-a6ea-a1c77a8cf652/host
Feb 23 09:55:26 np0005626463.localdomain dnsmasq-dhcp[310490]: read /var/lib/neutron/dhcp/5b164f5a-6aae-4898-a6ea-a1c77a8cf652/opts
Feb 23 09:55:26 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:55:26.813 265541 INFO neutron.agent.dhcp.agent [None req-36e0b5c7-2f90-42f4-a8e4-18f2d34148ce - - - - - -] DHCP configuration for ports {'96c8333d-b497-441e-9fb8-278da37499dc'} is completed
Feb 23 09:55:26 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e107 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:55:27 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:55:27.054 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:55:27 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:55:27.067 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 09:55:27 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:55:27.067 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 09:55:27 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:55:27.068 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 09:55:27 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:55:27.068 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Auditing locally available compute resources for np0005626463.localdomain (node: np0005626463.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 23 09:55:27 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:55:27.068 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 09:55:27 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 23 09:55:27 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/2239828795' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 09:55:27 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:55:27.512 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.444s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 09:55:27 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:55:27.578 282211 DEBUG nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 23 09:55:27 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:55:27.579 282211 DEBUG nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 23 09:55:27 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:55:27.721 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:55:27 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:55:27.817 282211 WARNING nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 23 09:55:27 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:55:27.818 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Hypervisor/Node resource view: name=np0005626463.localdomain free_ram=11380MB free_disk=41.77389907836914GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": 
null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 23 09:55:27 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:55:27.819 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 09:55:27 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:55:27.820 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 09:55:27 np0005626463.localdomain ceph-mon[294160]: pgmap v155: 177 pgs: 177 active+clean; 201 MiB data, 820 MiB used, 41 GiB / 42 GiB avail; 1.9 MiB/s rd, 1009 KiB/s wr, 89 op/s
Feb 23 09:55:27 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.106:0/2239828795' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 09:55:27 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:55:27.919 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Instance c2a7d92b-952f-46a7-8a6a-3322a48fcf4b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 23 09:55:27 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:55:27.920 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 23 09:55:27 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:55:27.921 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Final resource view: name=np0005626463.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 23 09:55:27 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:55:27.964 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 09:55:28 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 23 09:55:28 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/1845912676' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 09:55:28 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:55:28.420 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.455s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 09:55:28 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:55:28.427 282211 DEBUG nova.compute.provider_tree [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Inventory has not changed in ProviderTree for provider: be63d86c-a403-4ec9-a515-07ea2962cb4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 23 09:55:28 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:55:28.444 282211 DEBUG nova.scheduler.client.report [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Inventory has not changed for provider be63d86c-a403-4ec9-a515-07ea2962cb4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 23 09:55:28 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:55:28.447 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Compute_service record updated for np0005626463.localdomain:np0005626463.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 23 09:55:28 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:55:28.447 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.628s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 09:55:28 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:55:28.601 2 INFO neutron.agent.securitygroups_rpc [None req-9a0eab4a-b1f8-4315-8f9f-526f0b4e43e4 6d15d44765db469a9e04a32fb56dcff2 2ac6a6009ea84eb99f60bd242e459002 - - default default] Security group member updated ['917bfa8c-752a-4a55-9acc-5ce6144207b4']
Feb 23 09:55:28 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:55:28.900 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:55:28 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.106:0/1845912676' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 09:55:29 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:55:29.448 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:55:30 np0005626463.localdomain ceph-mon[294160]: pgmap v156: 177 pgs: 177 active+clean; 201 MiB data, 820 MiB used, 41 GiB / 42 GiB avail; 1.9 MiB/s rd, 995 KiB/s wr, 80 op/s
Feb 23 09:55:30 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:55:30.654 265541 INFO neutron.agent.linux.ip_lib [None req-c9653e48-282d-47cb-9d54-ee84f65f6f1a - - - - - -] Device tap5e31c1f9-f2 cannot be used as it has no MAC address
Feb 23 09:55:30 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:55:30.675 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:55:30 np0005626463.localdomain kernel: device tap5e31c1f9-f2 entered promiscuous mode
Feb 23 09:55:30 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:55:30.683 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:55:30 np0005626463.localdomain NetworkManager[5974]: <info>  [1771840530.6837] manager: (tap5e31c1f9-f2): new Generic device (/org/freedesktop/NetworkManager/Devices/24)
Feb 23 09:55:30 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:55:30Z|00126|binding|INFO|Claiming lport 5e31c1f9-f29e-43a3-8e7c-2e3adb446c2a for this chassis.
Feb 23 09:55:30 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:55:30Z|00127|binding|INFO|5e31c1f9-f29e-43a3-8e7c-2e3adb446c2a: Claiming unknown
Feb 23 09:55:30 np0005626463.localdomain systemd-udevd[310659]: Network interface NamePolicy= disabled on kernel command line.
Feb 23 09:55:30 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:55:30.694 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005626463.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:ffff::2/64', 'neutron:device_id': 'dhcpfb23302c-55c1-5de0-badf-4fc1ff22837a-b207e42e-4d3c-43ce-b855-2d1a36797be6', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b207e42e-4d3c-43ce-b855-2d1a36797be6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5c50f2ae0a2f4cdd8225f6794547909b', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c0b7f1d9-1471-4000-b583-343082500ed7, chassis=[<ovs.db.idl.Row object at 0x7f808c075610>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f808c075610>], logical_port=5e31c1f9-f29e-43a3-8e7c-2e3adb446c2a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 09:55:30 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:55:30.696 163572 INFO neutron.agent.ovn.metadata.agent [-] Port 5e31c1f9-f29e-43a3-8e7c-2e3adb446c2a in datapath b207e42e-4d3c-43ce-b855-2d1a36797be6 bound to our chassis
Feb 23 09:55:30 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:55:30.699 163572 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network b207e42e-4d3c-43ce-b855-2d1a36797be6 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Feb 23 09:55:30 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:55:30.701 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[ed3b815d-04a2-4053-9b16-a81e541838b7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 09:55:30 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap5e31c1f9-f2: No such device
Feb 23 09:55:30 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:55:30Z|00128|binding|INFO|Setting lport 5e31c1f9-f29e-43a3-8e7c-2e3adb446c2a ovn-installed in OVS
Feb 23 09:55:30 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:55:30Z|00129|binding|INFO|Setting lport 5e31c1f9-f29e-43a3-8e7c-2e3adb446c2a up in Southbound
Feb 23 09:55:30 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:55:30.729 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:55:30 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap5e31c1f9-f2: No such device
Feb 23 09:55:30 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap5e31c1f9-f2: No such device
Feb 23 09:55:30 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap5e31c1f9-f2: No such device
Feb 23 09:55:30 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap5e31c1f9-f2: No such device
Feb 23 09:55:30 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap5e31c1f9-f2: No such device
Feb 23 09:55:30 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap5e31c1f9-f2: No such device
Feb 23 09:55:30 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap5e31c1f9-f2: No such device
Feb 23 09:55:30 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:55:30.765 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:55:30 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:55:30.775 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=12, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '22:68:bc', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'c6:19:65:94:49:af'}, ipsec=False) old=SB_Global(nb_cfg=11) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 09:55:30 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:55:30.776 163572 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 23 09:55:30 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:55:30.779 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:55:30 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:55:30.798 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:55:31 np0005626463.localdomain ceph-mon[294160]: pgmap v157: 177 pgs: 177 active+clean; 213 MiB data, 860 MiB used, 41 GiB / 42 GiB avail; 2.0 MiB/s rd, 1.8 MiB/s wr, 92 op/s
Feb 23 09:55:31 np0005626463.localdomain podman[310730]: 
Feb 23 09:55:31 np0005626463.localdomain podman[310730]: 2026-02-23 09:55:31.574125426 +0000 UTC m=+0.095762110 container create 2532b22c884299358d19e096c9cf0cdee2dd0b81d7b698e52ccc67ba00649cf1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b207e42e-4d3c-43ce-b855-2d1a36797be6, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260216)
Feb 23 09:55:31 np0005626463.localdomain systemd[1]: Started libpod-conmon-2532b22c884299358d19e096c9cf0cdee2dd0b81d7b698e52ccc67ba00649cf1.scope.
Feb 23 09:55:31 np0005626463.localdomain podman[310730]: 2026-02-23 09:55:31.531789568 +0000 UTC m=+0.053426242 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 23 09:55:31 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 09:55:31 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/85d2c0758da427144c001a7e97c7359bf4b87ae2a86e80ab66c7f916f65db929/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 23 09:55:31 np0005626463.localdomain podman[310730]: 2026-02-23 09:55:31.655509711 +0000 UTC m=+0.177146345 container init 2532b22c884299358d19e096c9cf0cdee2dd0b81d7b698e52ccc67ba00649cf1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b207e42e-4d3c-43ce-b855-2d1a36797be6, tcib_managed=true, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Feb 23 09:55:31 np0005626463.localdomain podman[310730]: 2026-02-23 09:55:31.666424318 +0000 UTC m=+0.188060952 container start 2532b22c884299358d19e096c9cf0cdee2dd0b81d7b698e52ccc67ba00649cf1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b207e42e-4d3c-43ce-b855-2d1a36797be6, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Feb 23 09:55:31 np0005626463.localdomain dnsmasq[310749]: started, version 2.85 cachesize 150
Feb 23 09:55:31 np0005626463.localdomain dnsmasq[310749]: DNS service limited to local subnets
Feb 23 09:55:31 np0005626463.localdomain dnsmasq[310749]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 23 09:55:31 np0005626463.localdomain dnsmasq[310749]: warning: no upstream servers configured
Feb 23 09:55:31 np0005626463.localdomain dnsmasq-dhcp[310749]: DHCPv6, static leases only on 2001:db8:0:ffff::, lease time 1d
Feb 23 09:55:31 np0005626463.localdomain dnsmasq[310749]: read /var/lib/neutron/dhcp/b207e42e-4d3c-43ce-b855-2d1a36797be6/addn_hosts - 0 addresses
Feb 23 09:55:31 np0005626463.localdomain dnsmasq-dhcp[310749]: read /var/lib/neutron/dhcp/b207e42e-4d3c-43ce-b855-2d1a36797be6/host
Feb 23 09:55:31 np0005626463.localdomain dnsmasq-dhcp[310749]: read /var/lib/neutron/dhcp/b207e42e-4d3c-43ce-b855-2d1a36797be6/opts
Feb 23 09:55:31 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e107 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:55:32 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:55:32.089 265541 INFO neutron.agent.dhcp.agent [None req-279e7834-41a1-4ec5-859a-2b8e7a4535b7 - - - - - -] DHCP configuration for ports {'24d8afdd-ed5e-4b36-b6ce-17f8eb8c09d3'} is completed
Feb 23 09:55:32 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:55:32.759 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:55:32 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:55:32.778 163572 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=96b5bb93-7341-4ce6-9b93-6a5de566c711, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '12'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 09:55:33 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e107 do_prune osdmap full prune enabled
Feb 23 09:55:33 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e108 e108: 6 total, 6 up, 6 in
Feb 23 09:55:33 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e108: 6 total, 6 up, 6 in
Feb 23 09:55:33 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:55:33.238 265541 INFO neutron.agent.linux.ip_lib [None req-5b78d67a-d67a-4488-96a5-505670629ace - - - - - -] Device tap6025ad38-91 cannot be used as it has no MAC address
Feb 23 09:55:33 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:55:33.260 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:55:33 np0005626463.localdomain kernel: device tap6025ad38-91 entered promiscuous mode
Feb 23 09:55:33 np0005626463.localdomain NetworkManager[5974]: <info>  [1771840533.2670] manager: (tap6025ad38-91): new Generic device (/org/freedesktop/NetworkManager/Devices/25)
Feb 23 09:55:33 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:55:33.266 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:55:33 np0005626463.localdomain systemd-udevd[310662]: Network interface NamePolicy= disabled on kernel command line.
Feb 23 09:55:33 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:55:33Z|00130|binding|INFO|Claiming lport 6025ad38-916c-468c-898d-3a9de80cd0c9 for this chassis.
Feb 23 09:55:33 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:55:33Z|00131|binding|INFO|6025ad38-916c-468c-898d-3a9de80cd0c9: Claiming unknown
Feb 23 09:55:33 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:55:33.279 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005626463.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpfb23302c-55c1-5de0-badf-4fc1ff22837a-8523f038-ac71-4b3d-b11f-1dcce416acd1', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8523f038-ac71-4b3d-b11f-1dcce416acd1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5c50f2ae0a2f4cdd8225f6794547909b', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bb2ad155-4f5a-4d4b-8819-d0aef1f50516, chassis=[<ovs.db.idl.Row object at 0x7f808c075610>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f808c075610>], logical_port=6025ad38-916c-468c-898d-3a9de80cd0c9) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 09:55:33 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:55:33.281 163572 INFO neutron.agent.ovn.metadata.agent [-] Port 6025ad38-916c-468c-898d-3a9de80cd0c9 in datapath 8523f038-ac71-4b3d-b11f-1dcce416acd1 bound to our chassis
Feb 23 09:55:33 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:55:33.284 163572 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 8523f038-ac71-4b3d-b11f-1dcce416acd1 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Feb 23 09:55:33 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:55:33.285 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[53ac0be8-7757-42c8-a1e6-045c3f4e15fd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 09:55:33 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap6025ad38-91: No such device
Feb 23 09:55:33 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:55:33Z|00132|binding|INFO|Setting lport 6025ad38-916c-468c-898d-3a9de80cd0c9 ovn-installed in OVS
Feb 23 09:55:33 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:55:33Z|00133|binding|INFO|Setting lport 6025ad38-916c-468c-898d-3a9de80cd0c9 up in Southbound
Feb 23 09:55:33 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap6025ad38-91: No such device
Feb 23 09:55:33 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:55:33.304 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:55:33 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap6025ad38-91: No such device
Feb 23 09:55:33 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap6025ad38-91: No such device
Feb 23 09:55:33 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap6025ad38-91: No such device
Feb 23 09:55:33 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap6025ad38-91: No such device
Feb 23 09:55:33 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap6025ad38-91: No such device
Feb 23 09:55:33 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap6025ad38-91: No such device
Feb 23 09:55:33 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:55:33.345 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:55:33 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:55:33.377 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:55:34 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:55:34Z|00134|binding|INFO|Removing iface tap6025ad38-91 ovn-installed in OVS
Feb 23 09:55:34 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:55:34.039 163572 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port 8eebb565-f801-41e8-8e69-e55ef03184ba with type ""
Feb 23 09:55:34 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:55:34Z|00135|binding|INFO|Removing lport 6025ad38-916c-468c-898d-3a9de80cd0c9 ovn-installed in OVS
Feb 23 09:55:34 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:55:34.075 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005626463.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpfb23302c-55c1-5de0-badf-4fc1ff22837a-8523f038-ac71-4b3d-b11f-1dcce416acd1', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8523f038-ac71-4b3d-b11f-1dcce416acd1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5c50f2ae0a2f4cdd8225f6794547909b', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bb2ad155-4f5a-4d4b-8819-d0aef1f50516, chassis=[<ovs.db.idl.Row object at 0x7f808c075610>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f808c075610>], logical_port=6025ad38-916c-468c-898d-3a9de80cd0c9) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 09:55:34 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:55:34.075 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:55:34 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:55:34.077 163572 INFO neutron.agent.ovn.metadata.agent [-] Port 6025ad38-916c-468c-898d-3a9de80cd0c9 in datapath 8523f038-ac71-4b3d-b11f-1dcce416acd1 unbound from our chassis
Feb 23 09:55:34 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:55:34.079 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:55:34 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:55:34.080 163572 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 8523f038-ac71-4b3d-b11f-1dcce416acd1 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Feb 23 09:55:34 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:55:34.081 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[4b02b6e3-0607-4ae4-afe6-a599187fa219]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 09:55:34 np0005626463.localdomain ceph-mon[294160]: osdmap e108: 6 total, 6 up, 6 in
Feb 23 09:55:34 np0005626463.localdomain ceph-mon[294160]: pgmap v159: 177 pgs: 177 active+clean; 225 MiB data, 882 MiB used, 41 GiB / 42 GiB avail; 392 KiB/s rd, 2.6 MiB/s wr, 76 op/s
Feb 23 09:55:34 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:55:34Z|00136|binding|INFO|Releasing lport 4143c8ea-7577-4792-9744-bcff90eb20f2 from this chassis (sb_readonly=0)
Feb 23 09:55:34 np0005626463.localdomain podman[310829]: 
Feb 23 09:55:34 np0005626463.localdomain podman[310829]: 2026-02-23 09:55:34.194685189 +0000 UTC m=+0.089000251 container create a511d8aab5b7a0c853198896612e723afc41d439d12988cb6f377af15bc217fb (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8523f038-ac71-4b3d-b11f-1dcce416acd1, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260216, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, org.label-schema.license=GPLv2)
Feb 23 09:55:34 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:55:34.222 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:55:34 np0005626463.localdomain systemd[1]: Started libpod-conmon-a511d8aab5b7a0c853198896612e723afc41d439d12988cb6f377af15bc217fb.scope.
Feb 23 09:55:34 np0005626463.localdomain podman[310829]: 2026-02-23 09:55:34.151816205 +0000 UTC m=+0.046131297 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 23 09:55:34 np0005626463.localdomain systemd[1]: tmp-crun.3aeZMY.mount: Deactivated successfully.
Feb 23 09:55:34 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 09:55:34 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b6228693ebd24685afa42cc46e8b1f978c093b0e70be55a48dc426f2ec7a90c0/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 23 09:55:34 np0005626463.localdomain podman[310829]: 2026-02-23 09:55:34.280038616 +0000 UTC m=+0.174353678 container init a511d8aab5b7a0c853198896612e723afc41d439d12988cb6f377af15bc217fb (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8523f038-ac71-4b3d-b11f-1dcce416acd1, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.43.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb)
Feb 23 09:55:34 np0005626463.localdomain podman[310829]: 2026-02-23 09:55:34.289083556 +0000 UTC m=+0.183398618 container start a511d8aab5b7a0c853198896612e723afc41d439d12988cb6f377af15bc217fb (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8523f038-ac71-4b3d-b11f-1dcce416acd1, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260216, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=8419493e1fd846703d277695e03fc5eb)
Feb 23 09:55:34 np0005626463.localdomain dnsmasq[310845]: started, version 2.85 cachesize 150
Feb 23 09:55:34 np0005626463.localdomain dnsmasq[310845]: DNS service limited to local subnets
Feb 23 09:55:34 np0005626463.localdomain dnsmasq[310845]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 23 09:55:34 np0005626463.localdomain dnsmasq[310845]: warning: no upstream servers configured
Feb 23 09:55:34 np0005626463.localdomain dnsmasq-dhcp[310845]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Feb 23 09:55:34 np0005626463.localdomain dnsmasq[310845]: read /var/lib/neutron/dhcp/8523f038-ac71-4b3d-b11f-1dcce416acd1/addn_hosts - 0 addresses
Feb 23 09:55:34 np0005626463.localdomain dnsmasq-dhcp[310845]: read /var/lib/neutron/dhcp/8523f038-ac71-4b3d-b11f-1dcce416acd1/host
Feb 23 09:55:34 np0005626463.localdomain dnsmasq-dhcp[310845]: read /var/lib/neutron/dhcp/8523f038-ac71-4b3d-b11f-1dcce416acd1/opts
Feb 23 09:55:34 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:55:34.420 265541 INFO neutron.agent.dhcp.agent [None req-7458c64c-f1da-42ea-8af0-2ba340d0e403 - - - - - -] DHCP configuration for ports {'65f10a64-150b-4c93-8c26-165b199a7803'} is completed
Feb 23 09:55:34 np0005626463.localdomain dnsmasq[310845]: exiting on receipt of SIGTERM
Feb 23 09:55:34 np0005626463.localdomain podman[310862]: 2026-02-23 09:55:34.540665169 +0000 UTC m=+0.065008130 container kill a511d8aab5b7a0c853198896612e723afc41d439d12988cb6f377af15bc217fb (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8523f038-ac71-4b3d-b11f-1dcce416acd1, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Feb 23 09:55:34 np0005626463.localdomain systemd[1]: libpod-a511d8aab5b7a0c853198896612e723afc41d439d12988cb6f377af15bc217fb.scope: Deactivated successfully.
Feb 23 09:55:34 np0005626463.localdomain podman[310876]: 2026-02-23 09:55:34.604282404 +0000 UTC m=+0.050326926 container died a511d8aab5b7a0c853198896612e723afc41d439d12988cb6f377af15bc217fb (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8523f038-ac71-4b3d-b11f-1dcce416acd1, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Feb 23 09:55:34 np0005626463.localdomain podman[310876]: 2026-02-23 09:55:34.690136217 +0000 UTC m=+0.136180689 container cleanup a511d8aab5b7a0c853198896612e723afc41d439d12988cb6f377af15bc217fb (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8523f038-ac71-4b3d-b11f-1dcce416acd1, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true)
Feb 23 09:55:34 np0005626463.localdomain systemd[1]: libpod-conmon-a511d8aab5b7a0c853198896612e723afc41d439d12988cb6f377af15bc217fb.scope: Deactivated successfully.
Feb 23 09:55:34 np0005626463.localdomain podman[310883]: 2026-02-23 09:55:34.716909283 +0000 UTC m=+0.143806073 container remove a511d8aab5b7a0c853198896612e723afc41d439d12988cb6f377af15bc217fb (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8523f038-ac71-4b3d-b11f-1dcce416acd1, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team)
Feb 23 09:55:34 np0005626463.localdomain kernel: device tap6025ad38-91 left promiscuous mode
Feb 23 09:55:34 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:55:34.733 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:55:34 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:55:34.750 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:55:34 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:55:34.775 265541 INFO neutron.agent.dhcp.agent [None req-0faf63b8-4fac-4327-9992-7e942b5fe495 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 23 09:55:34 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:55:34.775 265541 INFO neutron.agent.dhcp.agent [None req-0faf63b8-4fac-4327-9992-7e942b5fe495 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 23 09:55:35 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e108 do_prune osdmap full prune enabled
Feb 23 09:55:35 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e109 e109: 6 total, 6 up, 6 in
Feb 23 09:55:35 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e109: 6 total, 6 up, 6 in
Feb 23 09:55:35 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-b6228693ebd24685afa42cc46e8b1f978c093b0e70be55a48dc426f2ec7a90c0-merged.mount: Deactivated successfully.
Feb 23 09:55:35 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a511d8aab5b7a0c853198896612e723afc41d439d12988cb6f377af15bc217fb-userdata-shm.mount: Deactivated successfully.
Feb 23 09:55:35 np0005626463.localdomain systemd[1]: run-netns-qdhcp\x2d8523f038\x2dac71\x2d4b3d\x2db11f\x2d1dcce416acd1.mount: Deactivated successfully.
Feb 23 09:55:35 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:55:35.592 2 INFO neutron.agent.securitygroups_rpc [None req-0cb29720-58ee-4ab0-99f8-69e7c954667c 6d15d44765db469a9e04a32fb56dcff2 2ac6a6009ea84eb99f60bd242e459002 - - default default] Security group member updated ['917bfa8c-752a-4a55-9acc-5ce6144207b4']
Feb 23 09:55:35 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.
Feb 23 09:55:35 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.
Feb 23 09:55:35 np0005626463.localdomain podman[310905]: 2026-02-23 09:55:35.898674404 +0000 UTC m=+0.067515446 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0)
Feb 23 09:55:35 np0005626463.localdomain podman[310905]: 2026-02-23 09:55:35.960425982 +0000 UTC m=+0.129266974 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Feb 23 09:55:35 np0005626463.localdomain systemd[1]: tmp-crun.yr3v2d.mount: Deactivated successfully.
Feb 23 09:55:35 np0005626463.localdomain podman[310906]: 2026-02-23 09:55:35.979341677 +0000 UTC m=+0.145761124 container health_status bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 23 09:55:35 np0005626463.localdomain systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully.
Feb 23 09:55:36 np0005626463.localdomain podman[310906]: 2026-02-23 09:55:36.011753229 +0000 UTC m=+0.178172656 container exec_died bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 23 09:55:36 np0005626463.localdomain systemd[1]: bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.service: Deactivated successfully.
Feb 23 09:55:36 np0005626463.localdomain ceph-mon[294160]: pgmap v160: 177 pgs: 177 active+clean; 225 MiB data, 882 MiB used, 41 GiB / 42 GiB avail; 392 KiB/s rd, 2.6 MiB/s wr, 76 op/s
Feb 23 09:55:36 np0005626463.localdomain ceph-mon[294160]: osdmap e109: 6 total, 6 up, 6 in
Feb 23 09:55:36 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e109 do_prune osdmap full prune enabled
Feb 23 09:55:36 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e110 e110: 6 total, 6 up, 6 in
Feb 23 09:55:36 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e110: 6 total, 6 up, 6 in
Feb 23 09:55:36 np0005626463.localdomain sshd[310952]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:55:36 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e110 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:55:37 np0005626463.localdomain sshd[310952]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:55:37 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:55:37.165 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:55:37 np0005626463.localdomain ceph-mon[294160]: osdmap e110: 6 total, 6 up, 6 in
Feb 23 09:55:37 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:55:37.766 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:55:37 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e110 do_prune osdmap full prune enabled
Feb 23 09:55:37 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e111 e111: 6 total, 6 up, 6 in
Feb 23 09:55:37 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e111: 6 total, 6 up, 6 in
Feb 23 09:55:38 np0005626463.localdomain ceph-mon[294160]: pgmap v163: 177 pgs: 177 active+clean; 225 MiB data, 882 MiB used, 41 GiB / 42 GiB avail; 513 KiB/s rd, 803 KiB/s wr, 183 op/s
Feb 23 09:55:38 np0005626463.localdomain ceph-mon[294160]: osdmap e111: 6 total, 6 up, 6 in
Feb 23 09:55:38 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:55:38.726 2 INFO neutron.agent.securitygroups_rpc [req-2c9e84fe-f5f5-4169-b610-b000c50ec955 req-5f426135-7d9d-4897-a8f1-4e578256ef9c b2cb9c14c50346658af8c86574d3a360 91db788359a945be921785f05bf8c883 - - default default] Security group rule updated ['81d638c1-b5b2-4310-a6ad-c12f8ffa8182']
Feb 23 09:55:39 np0005626463.localdomain podman[242954]: time="2026-02-23T09:55:39Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 23 09:55:39 np0005626463.localdomain podman[242954]: @ - - [23/Feb/2026:09:55:39 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 160729 "" "Go-http-client/1.1"
Feb 23 09:55:39 np0005626463.localdomain podman[242954]: @ - - [23/Feb/2026:09:55:39 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19760 "" "Go-http-client/1.1"
Feb 23 09:55:39 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:55:39.719 2 INFO neutron.agent.securitygroups_rpc [req-2f94c104-6551-4f26-9e12-afea6d919b19 req-a08f01d8-ef48-4fc3-bf4b-c9f57e9a499f b2cb9c14c50346658af8c86574d3a360 91db788359a945be921785f05bf8c883 - - default default] Security group rule updated ['81fcdb25-34e2-4e01-b6c6-c95398c61f96']
Feb 23 09:55:39 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.
Feb 23 09:55:39 np0005626463.localdomain podman[310954]: 2026-02-23 09:55:39.915043292 +0000 UTC m=+0.082222431 container health_status be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ceilometer_agent_compute, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb)
Feb 23 09:55:39 np0005626463.localdomain podman[310954]: 2026-02-23 09:55:39.925666651 +0000 UTC m=+0.092845820 container exec_died be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, tcib_managed=true)
Feb 23 09:55:39 np0005626463.localdomain systemd[1]: be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.service: Deactivated successfully.
Feb 23 09:55:40 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e111 do_prune osdmap full prune enabled
Feb 23 09:55:40 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e112 e112: 6 total, 6 up, 6 in
Feb 23 09:55:40 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e112: 6 total, 6 up, 6 in
Feb 23 09:55:40 np0005626463.localdomain ceph-mon[294160]: pgmap v165: 177 pgs: 177 active+clean; 225 MiB data, 882 MiB used, 41 GiB / 42 GiB avail; 82 KiB/s rd, 30 KiB/s wr, 111 op/s
Feb 23 09:55:40 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:55:40.228 2 INFO neutron.agent.securitygroups_rpc [req-740eea5f-ed9c-433b-ad37-bd212433a1f7 req-c27bf517-3a13-4ed5-83bb-bf5a421e5259 c2b38675f57640819bf191ad8152e7cb 7f67087411544c55a9225236eb297b90 - - default default] Security group member updated ['d03de417-eb2e-47e8-ad59-eae56add5dd4']
Feb 23 09:55:41 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e112 do_prune osdmap full prune enabled
Feb 23 09:55:41 np0005626463.localdomain ceph-mon[294160]: osdmap e112: 6 total, 6 up, 6 in
Feb 23 09:55:41 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.108:0/506515557' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 09:55:41 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e113 e113: 6 total, 6 up, 6 in
Feb 23 09:55:41 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e113: 6 total, 6 up, 6 in
Feb 23 09:55:41 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:55:41.455 2 INFO neutron.agent.securitygroups_rpc [req-e8f689e1-cc88-4c2b-896f-8a7df5cfb707 req-464689d7-a67a-4f0a-8a5e-818033ca8861 b2cb9c14c50346658af8c86574d3a360 91db788359a945be921785f05bf8c883 - - default default] Security group rule updated ['dafd3ce0-31be-4a51-acc9-61744d386010']
Feb 23 09:55:41 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e113 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:55:41 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e113 do_prune osdmap full prune enabled
Feb 23 09:55:41 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e114 e114: 6 total, 6 up, 6 in
Feb 23 09:55:41 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e114: 6 total, 6 up, 6 in
Feb 23 09:55:42 np0005626463.localdomain ceph-mon[294160]: pgmap v167: 177 pgs: 177 active+clean; 193 MiB data, 804 MiB used, 41 GiB / 42 GiB avail; 133 KiB/s rd, 35 KiB/s wr, 180 op/s
Feb 23 09:55:42 np0005626463.localdomain ceph-mon[294160]: osdmap e113: 6 total, 6 up, 6 in
Feb 23 09:55:42 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.107:0/2157903728' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 09:55:42 np0005626463.localdomain ceph-mon[294160]: osdmap e114: 6 total, 6 up, 6 in
Feb 23 09:55:42 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:55:42.486 2 INFO neutron.agent.securitygroups_rpc [req-1b7ecdc5-863d-4ab9-b539-c81bbfea1261 req-49751f6e-ab71-482b-8a57-c1aca7f7d635 b2cb9c14c50346658af8c86574d3a360 91db788359a945be921785f05bf8c883 - - default default] Security group rule updated ['315ff60a-a295-4b8a-bcc8-fd8b624c828e']
Feb 23 09:55:42 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:55:42.768 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:55:42 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:55:42.770 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:55:42 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:55:42.770 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 09:55:42 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:55:42.770 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:55:42 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:55:42.826 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:55:42 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:55:42.827 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:55:42 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e114 do_prune osdmap full prune enabled
Feb 23 09:55:42 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e115 e115: 6 total, 6 up, 6 in
Feb 23 09:55:42 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e115: 6 total, 6 up, 6 in
Feb 23 09:55:43 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.
Feb 23 09:55:43 np0005626463.localdomain podman[310976]: 2026-02-23 09:55:43.226745347 +0000 UTC m=+0.088502434 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.43.0, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Feb 23 09:55:43 np0005626463.localdomain podman[310976]: 2026-02-23 09:55:43.232299909 +0000 UTC m=+0.094056926 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Feb 23 09:55:43 np0005626463.localdomain systemd[1]: 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully.
Feb 23 09:55:43 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:55:43.247 265541 INFO neutron.agent.linux.ip_lib [None req-7079d0a7-0d2a-463c-8608-29e21df312c5 - - - - - -] Device tap2aa5a3c8-b2 cannot be used as it has no MAC address
Feb 23 09:55:43 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:55:43.277 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:55:43 np0005626463.localdomain kernel: device tap2aa5a3c8-b2 entered promiscuous mode
Feb 23 09:55:43 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:55:43.286 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:55:43 np0005626463.localdomain NetworkManager[5974]: <info>  [1771840543.2873] manager: (tap2aa5a3c8-b2): new Generic device (/org/freedesktop/NetworkManager/Devices/26)
Feb 23 09:55:43 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:55:43Z|00137|binding|INFO|Claiming lport 2aa5a3c8-b285-43de-863b-75a8af32f886 for this chassis.
Feb 23 09:55:43 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:55:43Z|00138|binding|INFO|2aa5a3c8-b285-43de-863b-75a8af32f886: Claiming unknown
Feb 23 09:55:43 np0005626463.localdomain systemd-udevd[311002]: Network interface NamePolicy= disabled on kernel command line.
Feb 23 09:55:43 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:55:43.306 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005626463.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::1/64', 'neutron:device_id': 'dhcpfb23302c-55c1-5de0-badf-4fc1ff22837a-03fa0e8f-af23-4fd5-aa8c-5de2330e1869', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-03fa0e8f-af23-4fd5-aa8c-5de2330e1869', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5c50f2ae0a2f4cdd8225f6794547909b', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0a5faee4-697a-4afe-96ef-26362544bf3c, chassis=[<ovs.db.idl.Row object at 0x7f808c075610>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f808c075610>], logical_port=2aa5a3c8-b285-43de-863b-75a8af32f886) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 09:55:43 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:55:43.308 163572 INFO neutron.agent.ovn.metadata.agent [-] Port 2aa5a3c8-b285-43de-863b-75a8af32f886 in datapath 03fa0e8f-af23-4fd5-aa8c-5de2330e1869 bound to our chassis
Feb 23 09:55:43 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:55:43.310 163572 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 03fa0e8f-af23-4fd5-aa8c-5de2330e1869 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Feb 23 09:55:43 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:55:43.312 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[6598eeb4-9108-4f74-8efc-665c1ff79dab]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 09:55:43 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap2aa5a3c8-b2: No such device
Feb 23 09:55:43 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:55:43Z|00139|binding|INFO|Setting lport 2aa5a3c8-b285-43de-863b-75a8af32f886 ovn-installed in OVS
Feb 23 09:55:43 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:55:43Z|00140|binding|INFO|Setting lport 2aa5a3c8-b285-43de-863b-75a8af32f886 up in Southbound
Feb 23 09:55:43 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:55:43.323 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:55:43 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap2aa5a3c8-b2: No such device
Feb 23 09:55:43 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap2aa5a3c8-b2: No such device
Feb 23 09:55:43 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap2aa5a3c8-b2: No such device
Feb 23 09:55:43 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap2aa5a3c8-b2: No such device
Feb 23 09:55:43 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap2aa5a3c8-b2: No such device
Feb 23 09:55:43 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap2aa5a3c8-b2: No such device
Feb 23 09:55:43 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap2aa5a3c8-b2: No such device
Feb 23 09:55:43 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:55:43.368 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:55:43 np0005626463.localdomain openstack_network_exporter[245358]: ERROR   09:55:43 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 23 09:55:43 np0005626463.localdomain openstack_network_exporter[245358]: 
Feb 23 09:55:43 np0005626463.localdomain openstack_network_exporter[245358]: ERROR   09:55:43 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 23 09:55:43 np0005626463.localdomain openstack_network_exporter[245358]: 
Feb 23 09:55:43 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:55:43.414 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:55:43 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e115 do_prune osdmap full prune enabled
Feb 23 09:55:43 np0005626463.localdomain ceph-mon[294160]: osdmap e115: 6 total, 6 up, 6 in
Feb 23 09:55:43 np0005626463.localdomain ceph-mon[294160]: pgmap v171: 177 pgs: 177 active+clean; 161 MiB data, 753 MiB used, 41 GiB / 42 GiB avail; 191 KiB/s rd, 2.2 MiB/s wr, 278 op/s
Feb 23 09:55:44 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e116 e116: 6 total, 6 up, 6 in
Feb 23 09:55:44 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e116: 6 total, 6 up, 6 in
Feb 23 09:55:44 np0005626463.localdomain podman[311071]: 
Feb 23 09:55:44 np0005626463.localdomain podman[311071]: 2026-02-23 09:55:44.225702021 +0000 UTC m=+0.096765561 container create 4f44c2730494b5d7469d7d535852233bf0bae7a5ec4d3488127612de54537d99 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-03fa0e8f-af23-4fd5-aa8c-5de2330e1869, org.label-schema.build-date=20260216, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, org.label-schema.schema-version=1.0)
Feb 23 09:55:44 np0005626463.localdomain systemd[1]: Started libpod-conmon-4f44c2730494b5d7469d7d535852233bf0bae7a5ec4d3488127612de54537d99.scope.
Feb 23 09:55:44 np0005626463.localdomain podman[311071]: 2026-02-23 09:55:44.178261205 +0000 UTC m=+0.049324785 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 23 09:55:44 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 09:55:44 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/76ea2f986b386bcf7db108926dc2e34cd1d13749b6df6aaab1304793a77d4243/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 23 09:55:44 np0005626463.localdomain podman[311071]: 2026-02-23 09:55:44.323250305 +0000 UTC m=+0.194313845 container init 4f44c2730494b5d7469d7d535852233bf0bae7a5ec4d3488127612de54537d99 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-03fa0e8f-af23-4fd5-aa8c-5de2330e1869, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=8419493e1fd846703d277695e03fc5eb)
Feb 23 09:55:44 np0005626463.localdomain podman[311071]: 2026-02-23 09:55:44.331993515 +0000 UTC m=+0.203057015 container start 4f44c2730494b5d7469d7d535852233bf0bae7a5ec4d3488127612de54537d99 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-03fa0e8f-af23-4fd5-aa8c-5de2330e1869, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20260216, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb)
Feb 23 09:55:44 np0005626463.localdomain dnsmasq[311089]: started, version 2.85 cachesize 150
Feb 23 09:55:44 np0005626463.localdomain dnsmasq[311089]: DNS service limited to local subnets
Feb 23 09:55:44 np0005626463.localdomain dnsmasq[311089]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 23 09:55:44 np0005626463.localdomain dnsmasq[311089]: warning: no upstream servers configured
Feb 23 09:55:44 np0005626463.localdomain dnsmasq-dhcp[311089]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Feb 23 09:55:44 np0005626463.localdomain dnsmasq[311089]: read /var/lib/neutron/dhcp/03fa0e8f-af23-4fd5-aa8c-5de2330e1869/addn_hosts - 0 addresses
Feb 23 09:55:44 np0005626463.localdomain dnsmasq-dhcp[311089]: read /var/lib/neutron/dhcp/03fa0e8f-af23-4fd5-aa8c-5de2330e1869/host
Feb 23 09:55:44 np0005626463.localdomain dnsmasq-dhcp[311089]: read /var/lib/neutron/dhcp/03fa0e8f-af23-4fd5-aa8c-5de2330e1869/opts
Feb 23 09:55:44 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:55:44.356 2 INFO neutron.agent.securitygroups_rpc [req-66b36902-2e90-44c7-97eb-062607295697 req-4da6358e-447d-4c62-9578-e6f0834d0e3a b2cb9c14c50346658af8c86574d3a360 91db788359a945be921785f05bf8c883 - - default default] Security group rule updated ['6e64f6d6-976d-4cdf-bc43-87f175a49821']
Feb 23 09:55:44 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:55:44.503 265541 INFO neutron.agent.dhcp.agent [None req-0228ef53-fe82-400d-a28a-61496c4c13ac - - - - - -] DHCP configuration for ports {'badfeffc-25c8-4fd8-98d9-84c11f857262'} is completed
Feb 23 09:55:44 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:55:44.710 2 INFO neutron.agent.securitygroups_rpc [req-06a39d77-6c9e-4d1d-991d-0c622d1b8570 req-e582e36b-a1ce-42ef-aae5-7f3dd95ea0e7 b2cb9c14c50346658af8c86574d3a360 91db788359a945be921785f05bf8c883 - - default default] Security group rule updated ['6e64f6d6-976d-4cdf-bc43-87f175a49821']
Feb 23 09:55:44 np0005626463.localdomain dnsmasq[311089]: exiting on receipt of SIGTERM
Feb 23 09:55:44 np0005626463.localdomain podman[311108]: 2026-02-23 09:55:44.718749014 +0000 UTC m=+0.060211352 container kill 4f44c2730494b5d7469d7d535852233bf0bae7a5ec4d3488127612de54537d99 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-03fa0e8f-af23-4fd5-aa8c-5de2330e1869, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Feb 23 09:55:44 np0005626463.localdomain systemd[1]: libpod-4f44c2730494b5d7469d7d535852233bf0bae7a5ec4d3488127612de54537d99.scope: Deactivated successfully.
Feb 23 09:55:44 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:55:44.760 163572 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port 101d2419-fd78-4ca6-a2ca-7ccf1bd2d588 with type ""
Feb 23 09:55:44 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:55:44.762 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005626463.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'dhcpfb23302c-55c1-5de0-badf-4fc1ff22837a-03fa0e8f-af23-4fd5-aa8c-5de2330e1869', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-03fa0e8f-af23-4fd5-aa8c-5de2330e1869', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5c50f2ae0a2f4cdd8225f6794547909b', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005626463.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0a5faee4-697a-4afe-96ef-26362544bf3c, chassis=[<ovs.db.idl.Row object at 0x7f808c075610>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f808c075610>], logical_port=2aa5a3c8-b285-43de-863b-75a8af32f886) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 09:55:44 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:55:44Z|00141|binding|INFO|Removing iface tap2aa5a3c8-b2 ovn-installed in OVS
Feb 23 09:55:44 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:55:44Z|00142|binding|INFO|Removing lport 2aa5a3c8-b285-43de-863b-75a8af32f886 ovn-installed in OVS
Feb 23 09:55:44 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:55:44.763 163572 INFO neutron.agent.ovn.metadata.agent [-] Port 2aa5a3c8-b285-43de-863b-75a8af32f886 in datapath 03fa0e8f-af23-4fd5-aa8c-5de2330e1869 unbound from our chassis
Feb 23 09:55:44 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:55:44.764 163572 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 03fa0e8f-af23-4fd5-aa8c-5de2330e1869 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Feb 23 09:55:44 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:55:44.764 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[89cdfb2a-c016-4139-a26c-bcba61b9b990]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 09:55:44 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:55:44.765 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:55:44 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:55:44.771 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:55:44 np0005626463.localdomain podman[311122]: 2026-02-23 09:55:44.796512316 +0000 UTC m=+0.062147871 container died 4f44c2730494b5d7469d7d535852233bf0bae7a5ec4d3488127612de54537d99 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-03fa0e8f-af23-4fd5-aa8c-5de2330e1869, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 23 09:55:44 np0005626463.localdomain podman[311122]: 2026-02-23 09:55:44.882215524 +0000 UTC m=+0.147851059 container cleanup 4f44c2730494b5d7469d7d535852233bf0bae7a5ec4d3488127612de54537d99 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-03fa0e8f-af23-4fd5-aa8c-5de2330e1869, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, tcib_build_tag=8419493e1fd846703d277695e03fc5eb)
Feb 23 09:55:44 np0005626463.localdomain systemd[1]: libpod-conmon-4f44c2730494b5d7469d7d535852233bf0bae7a5ec4d3488127612de54537d99.scope: Deactivated successfully.
Feb 23 09:55:44 np0005626463.localdomain podman[311124]: 2026-02-23 09:55:44.911211259 +0000 UTC m=+0.171006084 container remove 4f44c2730494b5d7469d7d535852233bf0bae7a5ec4d3488127612de54537d99 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-03fa0e8f-af23-4fd5-aa8c-5de2330e1869, org.label-schema.build-date=20260216, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0)
Feb 23 09:55:44 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:55:44.923 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:55:44 np0005626463.localdomain kernel: device tap2aa5a3c8-b2 left promiscuous mode
Feb 23 09:55:44 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:55:44.940 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:55:44 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:55:44Z|00143|binding|INFO|Releasing lport 4143c8ea-7577-4792-9744-bcff90eb20f2 from this chassis (sb_readonly=0)
Feb 23 09:55:44 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:55:44.960 265541 INFO neutron.agent.dhcp.agent [None req-bebac8b9-8225-4d6f-82da-0e727140ba2e - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 23 09:55:44 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:55:44.961 265541 INFO neutron.agent.dhcp.agent [None req-bebac8b9-8225-4d6f-82da-0e727140ba2e - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 23 09:55:45 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e116 do_prune osdmap full prune enabled
Feb 23 09:55:45 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:55:45.008 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:55:45 np0005626463.localdomain ceph-mon[294160]: osdmap e116: 6 total, 6 up, 6 in
Feb 23 09:55:45 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e117 e117: 6 total, 6 up, 6 in
Feb 23 09:55:45 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e117: 6 total, 6 up, 6 in
Feb 23 09:55:45 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-76ea2f986b386bcf7db108926dc2e34cd1d13749b6df6aaab1304793a77d4243-merged.mount: Deactivated successfully.
Feb 23 09:55:45 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-4f44c2730494b5d7469d7d535852233bf0bae7a5ec4d3488127612de54537d99-userdata-shm.mount: Deactivated successfully.
Feb 23 09:55:45 np0005626463.localdomain systemd[1]: run-netns-qdhcp\x2d03fa0e8f\x2daf23\x2d4fd5\x2daa8c\x2d5de2330e1869.mount: Deactivated successfully.
Feb 23 09:55:45 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:55:45.352 2 INFO neutron.agent.securitygroups_rpc [req-b47c5ca0-09bc-4b25-97b3-180f5e0f18ac req-fd279825-b764-4bf6-b8d3-f7613a7508ce b2cb9c14c50346658af8c86574d3a360 91db788359a945be921785f05bf8c883 - - default default] Security group rule updated ['6e64f6d6-976d-4cdf-bc43-87f175a49821']
Feb 23 09:55:46 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e117 do_prune osdmap full prune enabled
Feb 23 09:55:46 np0005626463.localdomain ceph-mon[294160]: osdmap e117: 6 total, 6 up, 6 in
Feb 23 09:55:46 np0005626463.localdomain ceph-mon[294160]: pgmap v174: 177 pgs: 177 active+clean; 161 MiB data, 753 MiB used, 41 GiB / 42 GiB avail; 119 KiB/s rd, 2.2 MiB/s wr, 182 op/s
Feb 23 09:55:46 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.108:0/2159369974' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 09:55:46 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.108:0/2924604571' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 09:55:46 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e118 e118: 6 total, 6 up, 6 in
Feb 23 09:55:46 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e118: 6 total, 6 up, 6 in
Feb 23 09:55:46 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:55:46Z|00144|binding|INFO|Releasing lport 4143c8ea-7577-4792-9744-bcff90eb20f2 from this chassis (sb_readonly=0)
Feb 23 09:55:46 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:55:46.440 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:55:46 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e118 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:55:46 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e118 do_prune osdmap full prune enabled
Feb 23 09:55:46 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e119 e119: 6 total, 6 up, 6 in
Feb 23 09:55:46 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e119: 6 total, 6 up, 6 in
Feb 23 09:55:47 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:55:47.027 2 INFO neutron.agent.securitygroups_rpc [None req-f20a6c5c-ae1e-41e5-8a0b-e142fc8dd656 7e2d209f03d04e6d86f4e10f7490cf37 8532226521ac43ca82723a0b71168e03 - - default default] Security group member updated ['709ad995-bfde-4096-a0b4-2ba30248a611']
Feb 23 09:55:47 np0005626463.localdomain ceph-mon[294160]: osdmap e118: 6 total, 6 up, 6 in
Feb 23 09:55:47 np0005626463.localdomain ceph-mon[294160]: osdmap e119: 6 total, 6 up, 6 in
Feb 23 09:55:47 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:55:47.828 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:55:47 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:55:47.830 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:55:48 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e119 do_prune osdmap full prune enabled
Feb 23 09:55:48 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e120 e120: 6 total, 6 up, 6 in
Feb 23 09:55:48 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e120: 6 total, 6 up, 6 in
Feb 23 09:55:48 np0005626463.localdomain ceph-mon[294160]: pgmap v177: 177 pgs: 177 active+clean; 192 MiB data, 815 MiB used, 41 GiB / 42 GiB avail; 128 KiB/s rd, 3.2 MiB/s wr, 188 op/s
Feb 23 09:55:48 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:55:48.503 2 INFO neutron.agent.securitygroups_rpc [None req-1ad0cbd3-986c-404f-b323-25b4bb76d296 7e2d209f03d04e6d86f4e10f7490cf37 8532226521ac43ca82723a0b71168e03 - - default default] Security group member updated ['709ad995-bfde-4096-a0b4-2ba30248a611']
Feb 23 09:55:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:55:48.556 163572 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 09:55:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:55:48.556 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 09:55:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:55:48.556 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 09:55:48 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:55:48.673 2 INFO neutron.agent.securitygroups_rpc [None req-1ad0cbd3-986c-404f-b323-25b4bb76d296 7e2d209f03d04e6d86f4e10f7490cf37 8532226521ac43ca82723a0b71168e03 - - default default] Security group member updated ['709ad995-bfde-4096-a0b4-2ba30248a611']
Feb 23 09:55:49 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e120 do_prune osdmap full prune enabled
Feb 23 09:55:49 np0005626463.localdomain ceph-mon[294160]: osdmap e120: 6 total, 6 up, 6 in
Feb 23 09:55:49 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e121 e121: 6 total, 6 up, 6 in
Feb 23 09:55:49 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e121: 6 total, 6 up, 6 in
Feb 23 09:55:49 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:55:49.287 2 INFO neutron.agent.securitygroups_rpc [None req-2c61be2a-9b0c-4b49-b66c-1a782048b54f 7e2d209f03d04e6d86f4e10f7490cf37 8532226521ac43ca82723a0b71168e03 - - default default] Security group member updated ['709ad995-bfde-4096-a0b4-2ba30248a611']
Feb 23 09:55:49 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:55:49.313 265541 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Feb 23 09:55:50 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e121 do_prune osdmap full prune enabled
Feb 23 09:55:50 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e122 e122: 6 total, 6 up, 6 in
Feb 23 09:55:50 np0005626463.localdomain ceph-mon[294160]: pgmap v179: 177 pgs: 177 active+clean; 192 MiB data, 815 MiB used, 41 GiB / 42 GiB avail; 125 KiB/s rd, 3.1 MiB/s wr, 184 op/s
Feb 23 09:55:50 np0005626463.localdomain ceph-mon[294160]: osdmap e121: 6 total, 6 up, 6 in
Feb 23 09:55:50 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e122: 6 total, 6 up, 6 in
Feb 23 09:55:50 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:55:50.433 2 INFO neutron.agent.securitygroups_rpc [None req-0bde3ec4-1f43-4d89-82d0-564202a4897c 7e2d209f03d04e6d86f4e10f7490cf37 8532226521ac43ca82723a0b71168e03 - - default default] Security group member updated ['709ad995-bfde-4096-a0b4-2ba30248a611']
Feb 23 09:55:51 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:55:51.117 265541 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Feb 23 09:55:51 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e122 do_prune osdmap full prune enabled
Feb 23 09:55:51 np0005626463.localdomain ceph-mon[294160]: osdmap e122: 6 total, 6 up, 6 in
Feb 23 09:55:51 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e123 e123: 6 total, 6 up, 6 in
Feb 23 09:55:51 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e123: 6 total, 6 up, 6 in
Feb 23 09:55:51 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e123 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:55:52 np0005626463.localdomain ceph-mon[294160]: pgmap v182: 177 pgs: 177 active+clean; 192 MiB data, 815 MiB used, 41 GiB / 42 GiB avail; 2.6 MiB/s rd, 38 KiB/s wr, 172 op/s
Feb 23 09:55:52 np0005626463.localdomain ceph-mon[294160]: osdmap e123: 6 total, 6 up, 6 in
Feb 23 09:55:52 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:55:52Z|00145|binding|INFO|Releasing lport 4143c8ea-7577-4792-9744-bcff90eb20f2 from this chassis (sb_readonly=0)
Feb 23 09:55:52 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:55:52.815 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:55:52 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.
Feb 23 09:55:52 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:55:52.829 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:55:52 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:55:52.834 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:55:52 np0005626463.localdomain systemd[1]: tmp-crun.qFacVN.mount: Deactivated successfully.
Feb 23 09:55:52 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e123 do_prune osdmap full prune enabled
Feb 23 09:55:52 np0005626463.localdomain podman[311152]: 2026-02-23 09:55:52.924584758 +0000 UTC m=+0.092913811 container health_status da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 23 09:55:52 np0005626463.localdomain podman[311152]: 2026-02-23 09:55:52.961364964 +0000 UTC m=+0.129694067 container exec_died da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 23 09:55:52 np0005626463.localdomain systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Deactivated successfully.
Feb 23 09:55:53 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e124 e124: 6 total, 6 up, 6 in
Feb 23 09:55:53 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e124: 6 total, 6 up, 6 in
Feb 23 09:55:53 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:55:53.975 265541 INFO neutron.agent.linux.ip_lib [None req-81e8d427-46c9-456f-9ee1-b5d20f69f376 - - - - - -] Device tape39b42ef-29 cannot be used as it has no MAC address
Feb 23 09:55:53 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:55:53.997 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:55:54 np0005626463.localdomain kernel: device tape39b42ef-29 entered promiscuous mode
Feb 23 09:55:54 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:55:54Z|00146|binding|INFO|Claiming lport e39b42ef-2915-4d7d-bb0f-f93a8d18df3f for this chassis.
Feb 23 09:55:54 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:55:54Z|00147|binding|INFO|e39b42ef-2915-4d7d-bb0f-f93a8d18df3f: Claiming unknown
Feb 23 09:55:54 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:55:54.004 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:55:54 np0005626463.localdomain NetworkManager[5974]: <info>  [1771840554.0082] manager: (tape39b42ef-29): new Generic device (/org/freedesktop/NetworkManager/Devices/27)
Feb 23 09:55:54 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e124 do_prune osdmap full prune enabled
Feb 23 09:55:54 np0005626463.localdomain systemd-udevd[311186]: Network interface NamePolicy= disabled on kernel command line.
Feb 23 09:55:54 np0005626463.localdomain ceph-mon[294160]: osdmap e124: 6 total, 6 up, 6 in
Feb 23 09:55:54 np0005626463.localdomain ceph-mon[294160]: pgmap v185: 177 pgs: 177 active+clean; 192 MiB data, 816 MiB used, 41 GiB / 42 GiB avail; 5.9 MiB/s rd, 51 KiB/s wr, 465 op/s
Feb 23 09:55:54 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:55:54.033 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005626463.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpfb23302c-55c1-5de0-badf-4fc1ff22837a-462b226e-df7b-4026-91be-fef5d89fea0c', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-462b226e-df7b-4026-91be-fef5d89fea0c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8532226521ac43ca82723a0b71168e03', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3ad6b648-6ffe-4ae2-bf96-781afc55b826, chassis=[<ovs.db.idl.Row object at 0x7f808c075610>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f808c075610>], logical_port=e39b42ef-2915-4d7d-bb0f-f93a8d18df3f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 09:55:54 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:55:54.034 163572 INFO neutron.agent.ovn.metadata.agent [-] Port e39b42ef-2915-4d7d-bb0f-f93a8d18df3f in datapath 462b226e-df7b-4026-91be-fef5d89fea0c bound to our chassis
Feb 23 09:55:54 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e125 e125: 6 total, 6 up, 6 in
Feb 23 09:55:54 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:55:54.037 163572 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 462b226e-df7b-4026-91be-fef5d89fea0c or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Feb 23 09:55:54 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:55:54.037 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[dee78a72-12b7-45c2-a6d3-38a83ce6dc83]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 09:55:54 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e125: 6 total, 6 up, 6 in
Feb 23 09:55:54 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:55:54Z|00148|binding|INFO|Setting lport e39b42ef-2915-4d7d-bb0f-f93a8d18df3f ovn-installed in OVS
Feb 23 09:55:54 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:55:54Z|00149|binding|INFO|Setting lport e39b42ef-2915-4d7d-bb0f-f93a8d18df3f up in Southbound
Feb 23 09:55:54 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:55:54.080 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:55:54 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:55:54.090 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:55:54 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:55:54.121 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:55:55 np0005626463.localdomain podman[311241]: 
Feb 23 09:55:55 np0005626463.localdomain podman[311241]: 2026-02-23 09:55:55.024278059 +0000 UTC m=+0.093846100 container create 9a5fe7243542d4da164ca84f8618b5d7467b4823396a362458b2528c43320bb7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-462b226e-df7b-4026-91be-fef5d89fea0c, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260216, maintainer=OpenStack Kubernetes Operator team)
Feb 23 09:55:55 np0005626463.localdomain ceph-mon[294160]: osdmap e125: 6 total, 6 up, 6 in
Feb 23 09:55:55 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.
Feb 23 09:55:55 np0005626463.localdomain systemd[1]: Started libpod-conmon-9a5fe7243542d4da164ca84f8618b5d7467b4823396a362458b2528c43320bb7.scope.
Feb 23 09:55:55 np0005626463.localdomain podman[311241]: 2026-02-23 09:55:54.978061851 +0000 UTC m=+0.047629912 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 23 09:55:55 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 09:55:55 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/244af4fa1319ccb844fecc1fd3464421321a9916252948b6f4a6c8b69c4eec9a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 23 09:55:55 np0005626463.localdomain podman[311241]: 2026-02-23 09:55:55.103822587 +0000 UTC m=+0.173390618 container init 9a5fe7243542d4da164ca84f8618b5d7467b4823396a362458b2528c43320bb7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-462b226e-df7b-4026-91be-fef5d89fea0c, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20260216, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 23 09:55:55 np0005626463.localdomain podman[311241]: 2026-02-23 09:55:55.114730424 +0000 UTC m=+0.184298455 container start 9a5fe7243542d4da164ca84f8618b5d7467b4823396a362458b2528c43320bb7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-462b226e-df7b-4026-91be-fef5d89fea0c, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260216, maintainer=OpenStack Kubernetes Operator team)
Feb 23 09:55:55 np0005626463.localdomain dnsmasq[311270]: started, version 2.85 cachesize 150
Feb 23 09:55:55 np0005626463.localdomain dnsmasq[311270]: DNS service limited to local subnets
Feb 23 09:55:55 np0005626463.localdomain dnsmasq[311270]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 23 09:55:55 np0005626463.localdomain dnsmasq[311270]: warning: no upstream servers configured
Feb 23 09:55:55 np0005626463.localdomain dnsmasq-dhcp[311270]: DHCP, static leases only on 10.100.0.0, lease time 1d
Feb 23 09:55:55 np0005626463.localdomain dnsmasq[311270]: read /var/lib/neutron/dhcp/462b226e-df7b-4026-91be-fef5d89fea0c/addn_hosts - 0 addresses
Feb 23 09:55:55 np0005626463.localdomain dnsmasq-dhcp[311270]: read /var/lib/neutron/dhcp/462b226e-df7b-4026-91be-fef5d89fea0c/host
Feb 23 09:55:55 np0005626463.localdomain dnsmasq-dhcp[311270]: read /var/lib/neutron/dhcp/462b226e-df7b-4026-91be-fef5d89fea0c/opts
Feb 23 09:55:55 np0005626463.localdomain podman[311255]: 2026-02-23 09:55:55.159154487 +0000 UTC m=+0.088415013 container health_status 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, distribution-scope=public, maintainer=Red Hat, Inc., vcs-type=git, architecture=x86_64, name=ubi9/ubi-minimal, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1770267347, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, managed_by=edpm_ansible, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, version=9.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2026-02-05T04:57:10Z, org.opencontainers.image.created=2026-02-05T04:57:10Z, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., io.openshift.expose-services=, config_id=openstack_network_exporter, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers)
Feb 23 09:55:55 np0005626463.localdomain podman[311255]: 2026-02-23 09:55:55.200024529 +0000 UTC m=+0.129285075 container exec_died 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, io.openshift.expose-services=, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1770267347, org.opencontainers.image.created=2026-02-05T04:57:10Z, config_id=openstack_network_exporter, build-date=2026-02-05T04:57:10Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.7, vendor=Red Hat, Inc., name=ubi9/ubi-minimal, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-type=git, container_name=openstack_network_exporter, architecture=x86_64, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public)
Feb 23 09:55:55 np0005626463.localdomain systemd[1]: 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.service: Deactivated successfully.
Feb 23 09:55:55 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:55:55.318 265541 INFO neutron.agent.dhcp.agent [None req-f1e60851-382a-47cf-8cc4-b5b23388d7f1 - - - - - -] DHCP configuration for ports {'3f0b7505-ef5e-4e16-9dc2-7cc00255b2e3'} is completed
Feb 23 09:55:56 np0005626463.localdomain ceph-mon[294160]: pgmap v187: 177 pgs: 177 active+clean; 192 MiB data, 816 MiB used, 41 GiB / 42 GiB avail; 4.8 MiB/s rd, 41 KiB/s wr, 374 op/s
Feb 23 09:55:56 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:55:56.398 2 INFO neutron.agent.securitygroups_rpc [None req-680f6195-a81a-4ace-93bb-ca63b4542035 7e2d209f03d04e6d86f4e10f7490cf37 8532226521ac43ca82723a0b71168e03 - - default default] Security group member updated ['709ad995-bfde-4096-a0b4-2ba30248a611']
Feb 23 09:55:56 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:55:56.436 265541 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-23T09:55:56Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f2829300a60>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f2829300e20>], id=ee23af99-5af9-4088-9933-048b02e82885, ip_allocation=immediate, mac_address=fa:16:3e:c7:c9:6f, name=tempest-PortsTestJSON-272504090, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-23T09:55:51Z, description=, dns_domain=, id=462b226e-df7b-4026-91be-fef5d89fea0c, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-PortsTestJSON-445099310, port_security_enabled=True, project_id=8532226521ac43ca82723a0b71168e03, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=37265, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1186, status=ACTIVE, subnets=['c82bacf0-e9f5-4b47-ab92-0189f37d0778'], tags=[], tenant_id=8532226521ac43ca82723a0b71168e03, updated_at=2026-02-23T09:55:53Z, vlan_transparent=None, network_id=462b226e-df7b-4026-91be-fef5d89fea0c, port_security_enabled=True, project_id=8532226521ac43ca82723a0b71168e03, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['709ad995-bfde-4096-a0b4-2ba30248a611'], standard_attr_id=1222, status=DOWN, tags=[], tenant_id=8532226521ac43ca82723a0b71168e03, updated_at=2026-02-23T09:55:56Z on network 462b226e-df7b-4026-91be-fef5d89fea0c
Feb 23 09:55:56 np0005626463.localdomain podman[311295]: 2026-02-23 09:55:56.791501188 +0000 UTC m=+0.062850662 container kill 9a5fe7243542d4da164ca84f8618b5d7467b4823396a362458b2528c43320bb7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-462b226e-df7b-4026-91be-fef5d89fea0c, org.label-schema.build-date=20260216, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0)
Feb 23 09:55:56 np0005626463.localdomain dnsmasq[311270]: read /var/lib/neutron/dhcp/462b226e-df7b-4026-91be-fef5d89fea0c/addn_hosts - 1 addresses
Feb 23 09:55:56 np0005626463.localdomain dnsmasq-dhcp[311270]: read /var/lib/neutron/dhcp/462b226e-df7b-4026-91be-fef5d89fea0c/host
Feb 23 09:55:56 np0005626463.localdomain dnsmasq-dhcp[311270]: read /var/lib/neutron/dhcp/462b226e-df7b-4026-91be-fef5d89fea0c/opts
Feb 23 09:55:56 np0005626463.localdomain systemd[1]: tmp-crun.j6tApH.mount: Deactivated successfully.
Feb 23 09:55:56 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e125 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:55:56 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e125 do_prune osdmap full prune enabled
Feb 23 09:55:56 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e126 e126: 6 total, 6 up, 6 in
Feb 23 09:55:56 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e126: 6 total, 6 up, 6 in
Feb 23 09:55:57 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:55:57.087 265541 INFO neutron.agent.dhcp.agent [None req-39c42241-d377-4afd-9487-6b705568ac4c - - - - - -] DHCP configuration for ports {'ee23af99-5af9-4088-9933-048b02e82885'} is completed
Feb 23 09:55:57 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:55:57.365 2 INFO neutron.agent.securitygroups_rpc [None req-e495f557-19e6-4684-b099-a0de07319228 7e2d209f03d04e6d86f4e10f7490cf37 8532226521ac43ca82723a0b71168e03 - - default default] Security group member updated ['709ad995-bfde-4096-a0b4-2ba30248a611']
Feb 23 09:55:57 np0005626463.localdomain dnsmasq[311270]: read /var/lib/neutron/dhcp/462b226e-df7b-4026-91be-fef5d89fea0c/addn_hosts - 0 addresses
Feb 23 09:55:57 np0005626463.localdomain podman[311332]: 2026-02-23 09:55:57.592071502 +0000 UTC m=+0.042495853 container kill 9a5fe7243542d4da164ca84f8618b5d7467b4823396a362458b2528c43320bb7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-462b226e-df7b-4026-91be-fef5d89fea0c, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 23 09:55:57 np0005626463.localdomain dnsmasq-dhcp[311270]: read /var/lib/neutron/dhcp/462b226e-df7b-4026-91be-fef5d89fea0c/host
Feb 23 09:55:57 np0005626463.localdomain dnsmasq-dhcp[311270]: read /var/lib/neutron/dhcp/462b226e-df7b-4026-91be-fef5d89fea0c/opts
Feb 23 09:55:57 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:55:57.832 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:55:57 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:55:57.837 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:55:57 np0005626463.localdomain ceph-mon[294160]: osdmap e126: 6 total, 6 up, 6 in
Feb 23 09:55:57 np0005626463.localdomain ceph-mon[294160]: pgmap v189: 177 pgs: 177 active+clean; 192 MiB data, 820 MiB used, 41 GiB / 42 GiB avail; 2.2 MiB/s rd, 12 KiB/s wr, 285 op/s
Feb 23 09:55:58 np0005626463.localdomain systemd[1]: tmp-crun.JAlg08.mount: Deactivated successfully.
Feb 23 09:55:58 np0005626463.localdomain dnsmasq[311270]: exiting on receipt of SIGTERM
Feb 23 09:55:58 np0005626463.localdomain podman[311370]: 2026-02-23 09:55:58.222238182 +0000 UTC m=+0.081252602 container kill 9a5fe7243542d4da164ca84f8618b5d7467b4823396a362458b2528c43320bb7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-462b226e-df7b-4026-91be-fef5d89fea0c, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=8419493e1fd846703d277695e03fc5eb)
Feb 23 09:55:58 np0005626463.localdomain systemd[1]: libpod-9a5fe7243542d4da164ca84f8618b5d7467b4823396a362458b2528c43320bb7.scope: Deactivated successfully.
Feb 23 09:55:58 np0005626463.localdomain podman[311384]: 2026-02-23 09:55:58.290904054 +0000 UTC m=+0.056793287 container died 9a5fe7243542d4da164ca84f8618b5d7467b4823396a362458b2528c43320bb7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-462b226e-df7b-4026-91be-fef5d89fea0c, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0)
Feb 23 09:55:58 np0005626463.localdomain podman[311384]: 2026-02-23 09:55:58.326853974 +0000 UTC m=+0.092743167 container cleanup 9a5fe7243542d4da164ca84f8618b5d7467b4823396a362458b2528c43320bb7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-462b226e-df7b-4026-91be-fef5d89fea0c, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, org.label-schema.license=GPLv2)
Feb 23 09:55:58 np0005626463.localdomain systemd[1]: libpod-conmon-9a5fe7243542d4da164ca84f8618b5d7467b4823396a362458b2528c43320bb7.scope: Deactivated successfully.
Feb 23 09:55:58 np0005626463.localdomain podman[311386]: 2026-02-23 09:55:58.370270745 +0000 UTC m=+0.128675146 container remove 9a5fe7243542d4da164ca84f8618b5d7467b4823396a362458b2528c43320bb7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-462b226e-df7b-4026-91be-fef5d89fea0c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, org.label-schema.schema-version=1.0)
Feb 23 09:55:58 np0005626463.localdomain kernel: device tape39b42ef-29 left promiscuous mode
Feb 23 09:55:58 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:55:58.421 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:55:58 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:55:58Z|00150|binding|INFO|Releasing lport e39b42ef-2915-4d7d-bb0f-f93a8d18df3f from this chassis (sb_readonly=0)
Feb 23 09:55:58 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:55:58Z|00151|binding|INFO|Setting lport e39b42ef-2915-4d7d-bb0f-f93a8d18df3f down in Southbound
Feb 23 09:55:58 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:55:58.433 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005626463.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpfb23302c-55c1-5de0-badf-4fc1ff22837a-462b226e-df7b-4026-91be-fef5d89fea0c', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-462b226e-df7b-4026-91be-fef5d89fea0c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8532226521ac43ca82723a0b71168e03', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005626463.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3ad6b648-6ffe-4ae2-bf96-781afc55b826, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f808c075610>], logical_port=e39b42ef-2915-4d7d-bb0f-f93a8d18df3f) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f808c075610>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 09:55:58 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:55:58.436 163572 INFO neutron.agent.ovn.metadata.agent [-] Port e39b42ef-2915-4d7d-bb0f-f93a8d18df3f in datapath 462b226e-df7b-4026-91be-fef5d89fea0c unbound from our chassis
Feb 23 09:55:58 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:55:58.440 163572 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 462b226e-df7b-4026-91be-fef5d89fea0c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 23 09:55:58 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:55:58.440 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:55:58 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:55:58.441 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[db23f5ee-c122-4895-8b60-38c829ce0e18]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 09:55:58 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-244af4fa1319ccb844fecc1fd3464421321a9916252948b6f4a6c8b69c4eec9a-merged.mount: Deactivated successfully.
Feb 23 09:55:58 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9a5fe7243542d4da164ca84f8618b5d7467b4823396a362458b2528c43320bb7-userdata-shm.mount: Deactivated successfully.
Feb 23 09:55:59 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:55:59.008 265541 INFO neutron.agent.dhcp.agent [None req-0c8823e9-1756-435e-8972-19e10f09bef2 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 23 09:55:59 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:55:59.009 265541 INFO neutron.agent.dhcp.agent [None req-0c8823e9-1756-435e-8972-19e10f09bef2 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 23 09:55:59 np0005626463.localdomain systemd[1]: run-netns-qdhcp\x2d462b226e\x2ddf7b\x2d4026\x2d91be\x2dfef5d89fea0c.mount: Deactivated successfully.
Feb 23 09:55:59 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:55:59.311 265541 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Feb 23 09:55:59 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:55:59Z|00152|binding|INFO|Releasing lport 4143c8ea-7577-4792-9744-bcff90eb20f2 from this chassis (sb_readonly=0)
Feb 23 09:55:59 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:55:59.743 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:56:00 np0005626463.localdomain ceph-mon[294160]: pgmap v190: 177 pgs: 177 active+clean; 192 MiB data, 820 MiB used, 41 GiB / 42 GiB avail; 69 KiB/s rd, 4.6 KiB/s wr, 92 op/s
Feb 23 09:56:01 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:56:01 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e126 do_prune osdmap full prune enabled
Feb 23 09:56:01 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e127 e127: 6 total, 6 up, 6 in
Feb 23 09:56:01 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e127: 6 total, 6 up, 6 in
Feb 23 09:56:02 np0005626463.localdomain ceph-mon[294160]: pgmap v191: 177 pgs: 177 active+clean; 202 MiB data, 885 MiB used, 41 GiB / 42 GiB avail; 296 KiB/s rd, 1.5 MiB/s wr, 113 op/s
Feb 23 09:56:02 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.108:0/3639386327' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 09:56:02 np0005626463.localdomain ceph-mon[294160]: osdmap e127: 6 total, 6 up, 6 in
Feb 23 09:56:02 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:56:02.834 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:56:02 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:56:02.838 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:56:03 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.108:0/3168391536' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 09:56:04 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:56:04.001 265541 INFO neutron.agent.linux.ip_lib [None req-adac8943-853d-42f8-b694-c25be0846689 - - - - - -] Device tap635b363d-ef cannot be used as it has no MAC address
Feb 23 09:56:04 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:56:04.023 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:56:04 np0005626463.localdomain kernel: device tap635b363d-ef entered promiscuous mode
Feb 23 09:56:04 np0005626463.localdomain NetworkManager[5974]: <info>  [1771840564.0316] manager: (tap635b363d-ef): new Generic device (/org/freedesktop/NetworkManager/Devices/28)
Feb 23 09:56:04 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:56:04.032 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:56:04 np0005626463.localdomain systemd-udevd[311423]: Network interface NamePolicy= disabled on kernel command line.
Feb 23 09:56:04 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:56:04Z|00153|binding|INFO|Claiming lport 635b363d-ef8c-4e25-843f-da965f86fee0 for this chassis.
Feb 23 09:56:04 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:56:04Z|00154|binding|INFO|635b363d-ef8c-4e25-843f-da965f86fee0: Claiming unknown
Feb 23 09:56:04 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap635b363d-ef: No such device
Feb 23 09:56:04 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap635b363d-ef: No such device
Feb 23 09:56:04 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap635b363d-ef: No such device
Feb 23 09:56:04 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:56:04.066 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:56:04 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:56:04Z|00155|binding|INFO|Setting lport 635b363d-ef8c-4e25-843f-da965f86fee0 ovn-installed in OVS
Feb 23 09:56:04 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:56:04.069 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:56:04 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:56:04.070 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:56:04 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap635b363d-ef: No such device
Feb 23 09:56:04 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap635b363d-ef: No such device
Feb 23 09:56:04 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap635b363d-ef: No such device
Feb 23 09:56:04 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap635b363d-ef: No such device
Feb 23 09:56:04 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap635b363d-ef: No such device
Feb 23 09:56:04 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:56:04.105 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:56:04 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:56:04.130 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:56:04 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:56:04Z|00156|binding|INFO|Setting lport 635b363d-ef8c-4e25-843f-da965f86fee0 up in Southbound
Feb 23 09:56:04 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:56:04.207 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005626463.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpfb23302c-55c1-5de0-badf-4fc1ff22837a-a74642b2-dd5d-4d6b-b98a-2a45bd6773c0', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a74642b2-dd5d-4d6b-b98a-2a45bd6773c0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8532226521ac43ca82723a0b71168e03', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6302580f-701e-45c5-96d0-5d526435f898, chassis=[<ovs.db.idl.Row object at 0x7f808c075610>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f808c075610>], logical_port=635b363d-ef8c-4e25-843f-da965f86fee0) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 09:56:04 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:56:04.209 163572 INFO neutron.agent.ovn.metadata.agent [-] Port 635b363d-ef8c-4e25-843f-da965f86fee0 in datapath a74642b2-dd5d-4d6b-b98a-2a45bd6773c0 bound to our chassis
Feb 23 09:56:04 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:56:04.211 163572 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network a74642b2-dd5d-4d6b-b98a-2a45bd6773c0 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Feb 23 09:56:04 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:56:04.212 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[c7dcbd8d-edd9-4485-8c00-5508df5fd057]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 09:56:04 np0005626463.localdomain ceph-mon[294160]: pgmap v193: 177 pgs: 177 active+clean; 225 MiB data, 916 MiB used, 41 GiB / 42 GiB avail; 541 KiB/s rd, 3.2 MiB/s wr, 164 op/s
Feb 23 09:56:04 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/3455665244' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 23 09:56:04 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/3455665244' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 23 09:56:04 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:56:04.254 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:56:05 np0005626463.localdomain podman[311494]: 2026-02-23 09:56:05.259807161 +0000 UTC m=+0.091125346 container create 130a4bfce8aa5bdc5aa0546a6a9f179af2123d546ecc2cb5ba40343f56380785 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a74642b2-dd5d-4d6b-b98a-2a45bd6773c0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Feb 23 09:56:05 np0005626463.localdomain systemd[1]: Started libpod-conmon-130a4bfce8aa5bdc5aa0546a6a9f179af2123d546ecc2cb5ba40343f56380785.scope.
Feb 23 09:56:05 np0005626463.localdomain podman[311494]: 2026-02-23 09:56:05.215299166 +0000 UTC m=+0.046617361 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 23 09:56:05 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 09:56:05 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7238c05bef8095068d10408be5e9c281faac1078934f1d2694534ecaf86cdf18/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 23 09:56:05 np0005626463.localdomain podman[311494]: 2026-02-23 09:56:05.332526767 +0000 UTC m=+0.163844932 container init 130a4bfce8aa5bdc5aa0546a6a9f179af2123d546ecc2cb5ba40343f56380785 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a74642b2-dd5d-4d6b-b98a-2a45bd6773c0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.43.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 23 09:56:05 np0005626463.localdomain podman[311494]: 2026-02-23 09:56:05.342547708 +0000 UTC m=+0.173865873 container start 130a4bfce8aa5bdc5aa0546a6a9f179af2123d546ecc2cb5ba40343f56380785 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a74642b2-dd5d-4d6b-b98a-2a45bd6773c0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.43.0, org.label-schema.build-date=20260216)
Feb 23 09:56:05 np0005626463.localdomain dnsmasq[311512]: started, version 2.85 cachesize 150
Feb 23 09:56:05 np0005626463.localdomain dnsmasq[311512]: DNS service limited to local subnets
Feb 23 09:56:05 np0005626463.localdomain dnsmasq[311512]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 23 09:56:05 np0005626463.localdomain dnsmasq[311512]: warning: no upstream servers configured
Feb 23 09:56:05 np0005626463.localdomain dnsmasq-dhcp[311512]: DHCP, static leases only on 10.100.0.0, lease time 1d
Feb 23 09:56:05 np0005626463.localdomain dnsmasq[311512]: read /var/lib/neutron/dhcp/a74642b2-dd5d-4d6b-b98a-2a45bd6773c0/addn_hosts - 0 addresses
Feb 23 09:56:05 np0005626463.localdomain dnsmasq-dhcp[311512]: read /var/lib/neutron/dhcp/a74642b2-dd5d-4d6b-b98a-2a45bd6773c0/host
Feb 23 09:56:05 np0005626463.localdomain dnsmasq-dhcp[311512]: read /var/lib/neutron/dhcp/a74642b2-dd5d-4d6b-b98a-2a45bd6773c0/opts
Feb 23 09:56:05 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:56:05.653 265541 INFO neutron.agent.dhcp.agent [None req-6782edd9-f124-453d-a0ec-26d036c007c0 - - - - - -] DHCP configuration for ports {'4f6e98e8-9bd4-4004-9cb5-c86e3901ec62'} is completed
Feb 23 09:56:06 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.
Feb 23 09:56:06 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.
Feb 23 09:56:06 np0005626463.localdomain podman[311514]: 2026-02-23 09:56:06.152102248 +0000 UTC m=+0.072455559 container health_status bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Feb 23 09:56:06 np0005626463.localdomain podman[311514]: 2026-02-23 09:56:06.161534949 +0000 UTC m=+0.081888320 container exec_died bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 23 09:56:06 np0005626463.localdomain systemd[1]: bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.service: Deactivated successfully.
Feb 23 09:56:06 np0005626463.localdomain podman[311513]: 2026-02-23 09:56:06.207179769 +0000 UTC m=+0.130488922 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Feb 23 09:56:06 np0005626463.localdomain ceph-mon[294160]: pgmap v194: 177 pgs: 177 active+clean; 225 MiB data, 916 MiB used, 41 GiB / 42 GiB avail; 478 KiB/s rd, 3.1 MiB/s wr, 92 op/s
Feb 23 09:56:06 np0005626463.localdomain podman[311513]: 2026-02-23 09:56:06.279348069 +0000 UTC m=+0.202657272 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.build-date=20260216, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.43.0)
Feb 23 09:56:06 np0005626463.localdomain systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully.
Feb 23 09:56:06 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e127 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:56:07 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:56:07.208 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:56:07 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:56:07.835 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:56:07 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:56:07.841 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:56:07 np0005626463.localdomain ceph-mon[294160]: pgmap v195: 177 pgs: 177 active+clean; 225 MiB data, 917 MiB used, 41 GiB / 42 GiB avail; 391 KiB/s rd, 2.6 MiB/s wr, 75 op/s
Feb 23 09:56:08 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:56:08.066 265541 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-23T09:56:07Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f28294e3340>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f28294e3970>], id=b540ec8b-5a2e-4ae4-8a26-f1d68cdc922c, ip_allocation=immediate, mac_address=fa:16:3e:eb:dc:34, name=tempest-PortsTestJSON-835694374, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-23T09:56:00Z, description=, dns_domain=, id=a74642b2-dd5d-4d6b-b98a-2a45bd6773c0, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-PortsTestJSON-582323247, port_security_enabled=True, project_id=8532226521ac43ca82723a0b71168e03, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=44895, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1258, status=ACTIVE, subnets=['0067f710-f96d-4ecf-888a-9e2b98e326fd'], tags=[], tenant_id=8532226521ac43ca82723a0b71168e03, updated_at=2026-02-23T09:56:02Z, vlan_transparent=None, network_id=a74642b2-dd5d-4d6b-b98a-2a45bd6773c0, port_security_enabled=True, project_id=8532226521ac43ca82723a0b71168e03, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1297, status=DOWN, tags=[], tenant_id=8532226521ac43ca82723a0b71168e03, updated_at=2026-02-23T09:56:07Z on network a74642b2-dd5d-4d6b-b98a-2a45bd6773c0
Feb 23 09:56:08 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:56:08.216 265541 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Feb 23 09:56:08 np0005626463.localdomain dnsmasq[311512]: read /var/lib/neutron/dhcp/a74642b2-dd5d-4d6b-b98a-2a45bd6773c0/addn_hosts - 1 addresses
Feb 23 09:56:08 np0005626463.localdomain dnsmasq-dhcp[311512]: read /var/lib/neutron/dhcp/a74642b2-dd5d-4d6b-b98a-2a45bd6773c0/host
Feb 23 09:56:08 np0005626463.localdomain podman[311579]: 2026-02-23 09:56:08.355483322 +0000 UTC m=+0.060243681 container kill 130a4bfce8aa5bdc5aa0546a6a9f179af2123d546ecc2cb5ba40343f56380785 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a74642b2-dd5d-4d6b-b98a-2a45bd6773c0, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260216, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0)
Feb 23 09:56:08 np0005626463.localdomain dnsmasq-dhcp[311512]: read /var/lib/neutron/dhcp/a74642b2-dd5d-4d6b-b98a-2a45bd6773c0/opts
Feb 23 09:56:08 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:56:08.607 265541 INFO neutron.agent.dhcp.agent [None req-af2cc72e-afdd-4e3b-a3c2-9fdfc5a27c76 - - - - - -] DHCP configuration for ports {'b540ec8b-5a2e-4ae4-8a26-f1d68cdc922c'} is completed
Feb 23 09:56:09 np0005626463.localdomain podman[242954]: time="2026-02-23T09:56:09Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 23 09:56:09 np0005626463.localdomain podman[242954]: @ - - [23/Feb/2026:09:56:09 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 162553 "" "Go-http-client/1.1"
Feb 23 09:56:09 np0005626463.localdomain podman[242954]: @ - - [23/Feb/2026:09:56:09 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 20244 "" "Go-http-client/1.1"
Feb 23 09:56:09 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:56:09.556 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=13, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '22:68:bc', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'c6:19:65:94:49:af'}, ipsec=False) old=SB_Global(nb_cfg=12) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 09:56:09 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:56:09.557 163572 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 23 09:56:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:56:09.598 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:56:09 np0005626463.localdomain dnsmasq[311512]: read /var/lib/neutron/dhcp/a74642b2-dd5d-4d6b-b98a-2a45bd6773c0/addn_hosts - 0 addresses
Feb 23 09:56:09 np0005626463.localdomain dnsmasq-dhcp[311512]: read /var/lib/neutron/dhcp/a74642b2-dd5d-4d6b-b98a-2a45bd6773c0/host
Feb 23 09:56:09 np0005626463.localdomain dnsmasq-dhcp[311512]: read /var/lib/neutron/dhcp/a74642b2-dd5d-4d6b-b98a-2a45bd6773c0/opts
Feb 23 09:56:09 np0005626463.localdomain podman[311616]: 2026-02-23 09:56:09.866737114 +0000 UTC m=+0.063560665 container kill 130a4bfce8aa5bdc5aa0546a6a9f179af2123d546ecc2cb5ba40343f56380785 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a74642b2-dd5d-4d6b-b98a-2a45bd6773c0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0)
Feb 23 09:56:09 np0005626463.localdomain systemd[1]: tmp-crun.Nq1gu7.mount: Deactivated successfully.
Feb 23 09:56:10 np0005626463.localdomain ceph-mon[294160]: pgmap v196: 177 pgs: 177 active+clean; 225 MiB data, 917 MiB used, 41 GiB / 42 GiB avail; 391 KiB/s rd, 2.6 MiB/s wr, 75 op/s
Feb 23 09:56:10 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.
Feb 23 09:56:10 np0005626463.localdomain podman[311637]: 2026-02-23 09:56:10.905229678 +0000 UTC m=+0.081391656 container health_status be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260216, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.43.0, tcib_managed=true)
Feb 23 09:56:10 np0005626463.localdomain podman[311637]: 2026-02-23 09:56:10.9156334 +0000 UTC m=+0.091795378 container exec_died be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ceilometer_agent_compute, io.buildah.version=1.43.0, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260216, tcib_managed=true, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Feb 23 09:56:10 np0005626463.localdomain systemd[1]: be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.service: Deactivated successfully.
Feb 23 09:56:11 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb 23 09:56:11 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                           ** DB Stats **
                                                           Uptime(secs): 600.0 total, 600.0 interval
                                                           Cumulative writes: 3171 writes, 26K keys, 3171 commit groups, 1.0 writes per commit group, ingest: 0.05 GB, 0.08 MB/s
                                                           Cumulative WAL: 3171 writes, 3171 syncs, 1.00 writes per sync, written: 0.05 GB, 0.08 MB/s
                                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                           Interval writes: 3171 writes, 26K keys, 3171 commit groups, 1.0 writes per commit group, ingest: 48.04 MB, 0.08 MB/s
                                                           Interval WAL: 3171 writes, 3171 syncs, 1.00 writes per sync, written: 0.05 GB, 0.08 MB/s
                                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                                           
                                                           ** Compaction Stats [default] **
                                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0    151.6      0.23              0.09        12    0.019       0      0       0.0       0.0
                                                             L6      1/0   17.05 MB   0.0      0.2     0.0      0.2       0.2      0.0       0.0   5.2    155.7    141.7      1.28              0.48        11    0.116    129K   5605       0.0       0.0
                                                            Sum      1/0   17.05 MB   0.0      0.2     0.0      0.2       0.2      0.1       0.0   6.2    131.9    143.2      1.51              0.57        23    0.066    129K   5605       0.0       0.0
                                                            Int      0/0    0.00 KB   0.0      0.2     0.0      0.2       0.2      0.1       0.0   6.2    132.2    143.5      1.51              0.57        22    0.068    129K   5605       0.0       0.0
                                                           
                                                           ** Compaction Stats [default] **
                                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            Low      0/0    0.00 KB   0.0      0.2     0.0      0.2       0.2      0.0       0.0   0.0    155.7    141.7      1.28              0.48        11    0.116    129K   5605       0.0       0.0
                                                           High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0    153.8      0.23              0.09        11    0.021       0      0       0.0       0.0
                                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.6      0.00              0.00         1    0.003       0      0       0.0       0.0
                                                           
                                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                           
                                                           Uptime(secs): 600.0 total, 600.0 interval
                                                           Flush(GB): cumulative 0.034, interval 0.034
                                                           AddFile(GB): cumulative 0.000, interval 0.000
                                                           AddFile(Total Files): cumulative 0, interval 0
                                                           AddFile(L0 Files): cumulative 0, interval 0
                                                           AddFile(Keys): cumulative 0, interval 0
                                                           Cumulative compaction: 0.21 GB write, 0.36 MB/s write, 0.19 GB read, 0.33 MB/s read, 1.5 seconds
                                                           Interval compaction: 0.21 GB write, 0.36 MB/s write, 0.19 GB read, 0.33 MB/s read, 1.5 seconds
                                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                           Block cache BinnedLRUCache@0x5609fbab9350#2 capacity: 308.00 MB usage: 33.10 MB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 0 last_secs: 0.000317 secs_since: 0
                                                           Block cache entry stats(count,size,portion): DataBlock(2166,32.23 MB,10.4648%) FilterBlock(23,375.17 KB,0.118954%) IndexBlock(23,512.08 KB,0.162362%) Misc(1,0.00 KB,0%)
                                                           
                                                           ** File Read Latency Histogram By Level [default] **
Feb 23 09:56:11 np0005626463.localdomain dnsmasq[311512]: exiting on receipt of SIGTERM
Feb 23 09:56:11 np0005626463.localdomain systemd[1]: libpod-130a4bfce8aa5bdc5aa0546a6a9f179af2123d546ecc2cb5ba40343f56380785.scope: Deactivated successfully.
Feb 23 09:56:11 np0005626463.localdomain podman[311671]: 2026-02-23 09:56:11.591007316 +0000 UTC m=+0.056424705 container kill 130a4bfce8aa5bdc5aa0546a6a9f179af2123d546ecc2cb5ba40343f56380785 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a74642b2-dd5d-4d6b-b98a-2a45bd6773c0, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260216)
Feb 23 09:56:11 np0005626463.localdomain podman[311683]: 2026-02-23 09:56:11.668804359 +0000 UTC m=+0.061472560 container died 130a4bfce8aa5bdc5aa0546a6a9f179af2123d546ecc2cb5ba40343f56380785 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a74642b2-dd5d-4d6b-b98a-2a45bd6773c0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 23 09:56:11 np0005626463.localdomain podman[311683]: 2026-02-23 09:56:11.697398843 +0000 UTC m=+0.090067004 container cleanup 130a4bfce8aa5bdc5aa0546a6a9f179af2123d546ecc2cb5ba40343f56380785 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a74642b2-dd5d-4d6b-b98a-2a45bd6773c0, org.label-schema.build-date=20260216, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 23 09:56:11 np0005626463.localdomain systemd[1]: libpod-conmon-130a4bfce8aa5bdc5aa0546a6a9f179af2123d546ecc2cb5ba40343f56380785.scope: Deactivated successfully.
Feb 23 09:56:11 np0005626463.localdomain podman[311685]: 2026-02-23 09:56:11.742470535 +0000 UTC m=+0.129989877 container remove 130a4bfce8aa5bdc5aa0546a6a9f179af2123d546ecc2cb5ba40343f56380785 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a74642b2-dd5d-4d6b-b98a-2a45bd6773c0, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb)
Feb 23 09:56:11 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:56:11.791 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:56:11 np0005626463.localdomain kernel: device tap635b363d-ef left promiscuous mode
Feb 23 09:56:11 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:56:11Z|00157|binding|INFO|Releasing lport 635b363d-ef8c-4e25-843f-da965f86fee0 from this chassis (sb_readonly=0)
Feb 23 09:56:11 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:56:11Z|00158|binding|INFO|Setting lport 635b363d-ef8c-4e25-843f-da965f86fee0 down in Southbound
Feb 23 09:56:11 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:56:11.803 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005626463.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpfb23302c-55c1-5de0-badf-4fc1ff22837a-a74642b2-dd5d-4d6b-b98a-2a45bd6773c0', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a74642b2-dd5d-4d6b-b98a-2a45bd6773c0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8532226521ac43ca82723a0b71168e03', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005626463.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6302580f-701e-45c5-96d0-5d526435f898, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f808c075610>], logical_port=635b363d-ef8c-4e25-843f-da965f86fee0) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f808c075610>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 09:56:11 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:56:11.804 163572 INFO neutron.agent.ovn.metadata.agent [-] Port 635b363d-ef8c-4e25-843f-da965f86fee0 in datapath a74642b2-dd5d-4d6b-b98a-2a45bd6773c0 unbound from our chassis
Feb 23 09:56:11 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:56:11.808 163572 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network a74642b2-dd5d-4d6b-b98a-2a45bd6773c0, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 23 09:56:11 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:56:11.809 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[e95a620b-b8dc-49bb-9e35-ae4ffc346a86]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 09:56:11 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:56:11.817 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:56:11 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-7238c05bef8095068d10408be5e9c281faac1078934f1d2694534ecaf86cdf18-merged.mount: Deactivated successfully.
Feb 23 09:56:11 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-130a4bfce8aa5bdc5aa0546a6a9f179af2123d546ecc2cb5ba40343f56380785-userdata-shm.mount: Deactivated successfully.
Feb 23 09:56:11 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e127 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:56:12 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:56:12.075 265541 INFO neutron.agent.dhcp.agent [None req-76bdfc34-db0d-4a92-8381-da44c2c45924 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 23 09:56:12 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:56:12.075 265541 INFO neutron.agent.dhcp.agent [None req-76bdfc34-db0d-4a92-8381-da44c2c45924 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 23 09:56:12 np0005626463.localdomain systemd[1]: run-netns-qdhcp\x2da74642b2\x2ddd5d\x2d4d6b\x2db98a\x2d2a45bd6773c0.mount: Deactivated successfully.
Feb 23 09:56:12 np0005626463.localdomain ceph-mon[294160]: pgmap v197: 177 pgs: 177 active+clean; 190 MiB data, 839 MiB used, 41 GiB / 42 GiB avail; 206 KiB/s rd, 1.3 MiB/s wr, 54 op/s
Feb 23 09:56:12 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:56:12.446 265541 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Feb 23 09:56:12 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:56:12.766 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:56:12 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:56:12Z|00159|binding|INFO|Releasing lport 4143c8ea-7577-4792-9744-bcff90eb20f2 from this chassis (sb_readonly=0)
Feb 23 09:56:12 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:56:12.813 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:56:12 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:56:12.874 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:56:13 np0005626463.localdomain openstack_network_exporter[245358]: ERROR   09:56:13 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 23 09:56:13 np0005626463.localdomain openstack_network_exporter[245358]: 
Feb 23 09:56:13 np0005626463.localdomain openstack_network_exporter[245358]: ERROR   09:56:13 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 23 09:56:13 np0005626463.localdomain openstack_network_exporter[245358]: 
Feb 23 09:56:13 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:56:13.401 2 INFO neutron.agent.securitygroups_rpc [None req-d12b9e97-0a30-4af0-bab2-d9a3d950dae1 7e2d209f03d04e6d86f4e10f7490cf37 8532226521ac43ca82723a0b71168e03 - - default default] Security group member updated ['709ad995-bfde-4096-a0b4-2ba30248a611']
Feb 23 09:56:13 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.
Feb 23 09:56:13 np0005626463.localdomain systemd[1]: tmp-crun.gkeYSB.mount: Deactivated successfully.
Feb 23 09:56:13 np0005626463.localdomain podman[311714]: 2026-02-23 09:56:13.91641726 +0000 UTC m=+0.091573129 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.43.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Feb 23 09:56:13 np0005626463.localdomain podman[311714]: 2026-02-23 09:56:13.925278754 +0000 UTC m=+0.100434613 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Feb 23 09:56:13 np0005626463.localdomain systemd[1]: 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully.
Feb 23 09:56:14 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:56:14.159 2 INFO neutron.agent.securitygroups_rpc [None req-2b12f6d0-e3fd-4fa1-a330-93a66177eb38 7e2d209f03d04e6d86f4e10f7490cf37 8532226521ac43ca82723a0b71168e03 - - default default] Security group member updated ['709ad995-bfde-4096-a0b4-2ba30248a611']
Feb 23 09:56:14 np0005626463.localdomain ceph-mon[294160]: pgmap v198: 177 pgs: 177 active+clean; 145 MiB data, 778 MiB used, 41 GiB / 42 GiB avail; 20 KiB/s rd, 15 KiB/s wr, 30 op/s
Feb 23 09:56:14 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:56:14.691 2 INFO neutron.agent.securitygroups_rpc [None req-313a7d3e-1b0f-4380-96f5-be20bc42956f 7e2d209f03d04e6d86f4e10f7490cf37 8532226521ac43ca82723a0b71168e03 - - default default] Security group member updated ['709ad995-bfde-4096-a0b4-2ba30248a611']
Feb 23 09:56:15 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:56:15.559 163572 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=96b5bb93-7341-4ce6-9b93-6a5de566c711, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '13'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 09:56:15 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:56:15.661 2 INFO neutron.agent.securitygroups_rpc [None req-ab6e9fb7-3784-4829-9f74-5b432c230863 7e2d209f03d04e6d86f4e10f7490cf37 8532226521ac43ca82723a0b71168e03 - - default default] Security group member updated ['709ad995-bfde-4096-a0b4-2ba30248a611']
Feb 23 09:56:16 np0005626463.localdomain ceph-mon[294160]: pgmap v199: 177 pgs: 177 active+clean; 145 MiB data, 778 MiB used, 41 GiB / 42 GiB avail; 19 KiB/s rd, 13 KiB/s wr, 28 op/s
Feb 23 09:56:16 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:56:16.418 2 INFO neutron.agent.securitygroups_rpc [None req-f9178f29-327a-4b87-b505-9a750a3f52d0 7e2d209f03d04e6d86f4e10f7490cf37 8532226521ac43ca82723a0b71168e03 - - default default] Security group member updated ['709ad995-bfde-4096-a0b4-2ba30248a611']
Feb 23 09:56:16 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e127 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:56:17 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:56:17.377 2 INFO neutron.agent.securitygroups_rpc [None req-9798c79a-b835-452b-b3e7-ba6f51410008 d6a332b0f2e446a68dc6c19280644090 2207de28dcd245d2b198a56e6161001a - - default default] Security group member updated ['ba10f066-a353-4b6a-98b8-dab53422ee14']
Feb 23 09:56:17 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:56:17.482 2 INFO neutron.agent.securitygroups_rpc [None req-266d671f-bdaf-4cc0-a88f-fea21e1850b2 7e2d209f03d04e6d86f4e10f7490cf37 8532226521ac43ca82723a0b71168e03 - - default default] Security group member updated ['709ad995-bfde-4096-a0b4-2ba30248a611']
Feb 23 09:56:17 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:56:17.747 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:56:17 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:56:17.879 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:56:17 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:56:17.881 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:56:17 np0005626463.localdomain sudo[311732]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 09:56:17 np0005626463.localdomain sudo[311732]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:56:17 np0005626463.localdomain sudo[311732]: pam_unix(sudo:session): session closed for user root
Feb 23 09:56:17 np0005626463.localdomain ceph-mon[294160]: pgmap v200: 177 pgs: 177 active+clean; 145 MiB data, 778 MiB used, 41 GiB / 42 GiB avail; 19 KiB/s rd, 13 KiB/s wr, 28 op/s
Feb 23 09:56:18 np0005626463.localdomain sudo[311750]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 23 09:56:18 np0005626463.localdomain sudo[311750]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:56:18 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:56:18.323 2 INFO neutron.agent.securitygroups_rpc [None req-aa9784b4-8658-4ac5-a544-4839237cb0a4 6d15d44765db469a9e04a32fb56dcff2 2ac6a6009ea84eb99f60bd242e459002 - - default default] Security group member updated ['917bfa8c-752a-4a55-9acc-5ce6144207b4']
Feb 23 09:56:18 np0005626463.localdomain sudo[311750]: pam_unix(sudo:session): session closed for user root
Feb 23 09:56:18 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:56:18.733 2 INFO neutron.agent.securitygroups_rpc [None req-40589676-1ef1-47e5-81ec-92f9fb0c6844 d6a332b0f2e446a68dc6c19280644090 2207de28dcd245d2b198a56e6161001a - - default default] Security group member updated ['ba10f066-a353-4b6a-98b8-dab53422ee14']
Feb 23 09:56:18 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Feb 23 09:56:18 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:56:18 np0005626463.localdomain sudo[311800]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 09:56:18 np0005626463.localdomain sudo[311800]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:56:18 np0005626463.localdomain sudo[311800]: pam_unix(sudo:session): session closed for user root
Feb 23 09:56:18 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:56:18 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 23 09:56:18 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:56:18 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 23 09:56:19 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:56:19.212 265541 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Feb 23 09:56:19 np0005626463.localdomain ceph-mon[294160]: pgmap v201: 177 pgs: 177 active+clean; 145 MiB data, 778 MiB used, 41 GiB / 42 GiB avail; 19 KiB/s rd, 1.2 KiB/s wr, 27 op/s
Feb 23 09:56:20 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Feb 23 09:56:20 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:56:21 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:56:21.054 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:56:21 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:56:21.055 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 23 09:56:21 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:56:21.055 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 23 09:56:21 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:56:21.188 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 23 09:56:21 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:56:21.188 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquired lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 23 09:56:21 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:56:21.189 282211 DEBUG nova.network.neutron [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 23 09:56:21 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:56:21.189 282211 DEBUG nova.objects.instance [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c2a7d92b-952f-46a7-8a6a-3322a48fcf4b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 23 09:56:21 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:56:21Z|00160|binding|INFO|Releasing lport 4143c8ea-7577-4792-9744-bcff90eb20f2 from this chassis (sb_readonly=0)
Feb 23 09:56:21 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:56:21.470 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:56:21 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:56:21 np0005626463.localdomain ceph-mon[294160]: pgmap v202: 177 pgs: 177 active+clean; 145 MiB data, 778 MiB used, 41 GiB / 42 GiB avail; 19 KiB/s rd, 1.2 KiB/s wr, 27 op/s
Feb 23 09:56:21 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.107:0/4284316826' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 09:56:21 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:56:21.787 265541 INFO neutron.agent.linux.ip_lib [None req-821e9013-c504-4be4-94d4-01337e729fb3 - - - - - -] Device tap372a9673-6c cannot be used as it has no MAC address
Feb 23 09:56:21 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:56:21.842 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:56:21 np0005626463.localdomain kernel: device tap372a9673-6c entered promiscuous mode
Feb 23 09:56:21 np0005626463.localdomain NetworkManager[5974]: <info>  [1771840581.8519] manager: (tap372a9673-6c): new Generic device (/org/freedesktop/NetworkManager/Devices/29)
Feb 23 09:56:21 np0005626463.localdomain systemd-udevd[311828]: Network interface NamePolicy= disabled on kernel command line.
Feb 23 09:56:21 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:56:21.855 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:56:21 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:56:21Z|00161|binding|INFO|Claiming lport 372a9673-6c0e-49dd-9d35-ac2275c153ff for this chassis.
Feb 23 09:56:21 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:56:21Z|00162|binding|INFO|372a9673-6c0e-49dd-9d35-ac2275c153ff: Claiming unknown
Feb 23 09:56:21 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:56:21.866 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005626463.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpfb23302c-55c1-5de0-badf-4fc1ff22837a-05d93df9-29e9-48b0-9b9e-8c7a4eaa7448', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-05d93df9-29e9-48b0-9b9e-8c7a4eaa7448', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5c50f2ae0a2f4cdd8225f6794547909b', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3af6da47-311a-4978-bae3-0b17a56bce02, chassis=[<ovs.db.idl.Row object at 0x7f808c075610>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f808c075610>], logical_port=372a9673-6c0e-49dd-9d35-ac2275c153ff) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 09:56:21 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:56:21.867 163572 INFO neutron.agent.ovn.metadata.agent [-] Port 372a9673-6c0e-49dd-9d35-ac2275c153ff in datapath 05d93df9-29e9-48b0-9b9e-8c7a4eaa7448 bound to our chassis
Feb 23 09:56:21 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:56:21.869 163572 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 05d93df9-29e9-48b0-9b9e-8c7a4eaa7448 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Feb 23 09:56:21 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:56:21.870 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[1ce18b77-18f8-4ea5-bb39-80b2a593e706]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 09:56:21 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap372a9673-6c: No such device
Feb 23 09:56:21 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:56:21Z|00163|binding|INFO|Setting lport 372a9673-6c0e-49dd-9d35-ac2275c153ff ovn-installed in OVS
Feb 23 09:56:21 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:56:21Z|00164|binding|INFO|Setting lport 372a9673-6c0e-49dd-9d35-ac2275c153ff up in Southbound
Feb 23 09:56:21 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:56:21.884 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:56:21 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap372a9673-6c: No such device
Feb 23 09:56:21 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap372a9673-6c: No such device
Feb 23 09:56:21 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap372a9673-6c: No such device
Feb 23 09:56:21 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap372a9673-6c: No such device
Feb 23 09:56:21 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap372a9673-6c: No such device
Feb 23 09:56:21 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap372a9673-6c: No such device
Feb 23 09:56:21 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap372a9673-6c: No such device
Feb 23 09:56:21 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:56:21.926 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:56:21 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e127 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:56:21 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:56:21.954 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:56:22 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:56:22.184 163572 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port de3bbf63-31eb-40e8-b51e-d7191f3813e3 with type ""
Feb 23 09:56:22 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:56:22Z|00165|binding|INFO|Removing iface tap372a9673-6c ovn-installed in OVS
Feb 23 09:56:22 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:56:22Z|00166|binding|INFO|Removing lport 372a9673-6c0e-49dd-9d35-ac2275c153ff ovn-installed in OVS
Feb 23 09:56:22 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:56:22.186 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005626463.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpfb23302c-55c1-5de0-badf-4fc1ff22837a-05d93df9-29e9-48b0-9b9e-8c7a4eaa7448', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-05d93df9-29e9-48b0-9b9e-8c7a4eaa7448', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5c50f2ae0a2f4cdd8225f6794547909b', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3af6da47-311a-4978-bae3-0b17a56bce02, chassis=[<ovs.db.idl.Row object at 0x7f808c075610>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f808c075610>], logical_port=372a9673-6c0e-49dd-9d35-ac2275c153ff) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 09:56:22 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:56:22.186 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:56:22 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:56:22.188 163572 INFO neutron.agent.ovn.metadata.agent [-] Port 372a9673-6c0e-49dd-9d35-ac2275c153ff in datapath 05d93df9-29e9-48b0-9b9e-8c7a4eaa7448 unbound from our chassis
Feb 23 09:56:22 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:56:22.191 163572 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 05d93df9-29e9-48b0-9b9e-8c7a4eaa7448 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Feb 23 09:56:22 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:56:22.191 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[e74a35cf-a7e0-4097-b28d-bfc67f0dbfaa]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 09:56:22 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:56:22.196 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:56:22 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:56:22.573 2 INFO neutron.agent.securitygroups_rpc [None req-0e752e7e-c507-42b6-b334-815c25dce29c 6d15d44765db469a9e04a32fb56dcff2 2ac6a6009ea84eb99f60bd242e459002 - - default default] Security group member updated ['917bfa8c-752a-4a55-9acc-5ce6144207b4']
Feb 23 09:56:22 np0005626463.localdomain podman[311899]: 
Feb 23 09:56:22 np0005626463.localdomain podman[311899]: 2026-02-23 09:56:22.716011627 +0000 UTC m=+0.077397421 container create db7eafbc6ed283ccfebcf7b59820659e7db729ce0b671b853472707c4601cc8a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-05d93df9-29e9-48b0-9b9e-8c7a4eaa7448, org.label-schema.build-date=20260216, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Feb 23 09:56:22 np0005626463.localdomain systemd[1]: Started libpod-conmon-db7eafbc6ed283ccfebcf7b59820659e7db729ce0b671b853472707c4601cc8a.scope.
Feb 23 09:56:22 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:56:22.761 282211 DEBUG nova.network.neutron [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Updating instance_info_cache with network_info: [{"id": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "address": "fa:16:3e:a0:9d:00", "network": {"id": "9da5b53d-3184-450f-9a5b-bdba1a6c9f6d", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "37b8098efb0d4ecc90b451a2db0e966f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa27e5011-20", "ovs_interfaceid": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 23 09:56:22 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 09:56:22 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:56:22.776 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Releasing lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 23 09:56:22 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:56:22.777 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 23 09:56:22 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6fd10a276b25a13959ac50d6133201d49913bb2a1ba87afd357e7af4f3a8b1bb/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 23 09:56:22 np0005626463.localdomain podman[311899]: 2026-02-23 09:56:22.682399039 +0000 UTC m=+0.043784773 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 23 09:56:22 np0005626463.localdomain podman[311899]: 2026-02-23 09:56:22.790114867 +0000 UTC m=+0.151500611 container init db7eafbc6ed283ccfebcf7b59820659e7db729ce0b671b853472707c4601cc8a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-05d93df9-29e9-48b0-9b9e-8c7a4eaa7448, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb)
Feb 23 09:56:22 np0005626463.localdomain podman[311899]: 2026-02-23 09:56:22.800311973 +0000 UTC m=+0.161697707 container start db7eafbc6ed283ccfebcf7b59820659e7db729ce0b671b853472707c4601cc8a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-05d93df9-29e9-48b0-9b9e-8c7a4eaa7448, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20260216)
Feb 23 09:56:22 np0005626463.localdomain dnsmasq[311917]: started, version 2.85 cachesize 150
Feb 23 09:56:22 np0005626463.localdomain dnsmasq[311917]: DNS service limited to local subnets
Feb 23 09:56:22 np0005626463.localdomain dnsmasq[311917]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 23 09:56:22 np0005626463.localdomain dnsmasq[311917]: warning: no upstream servers configured
Feb 23 09:56:22 np0005626463.localdomain dnsmasq-dhcp[311917]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Feb 23 09:56:22 np0005626463.localdomain dnsmasq[311917]: read /var/lib/neutron/dhcp/05d93df9-29e9-48b0-9b9e-8c7a4eaa7448/addn_hosts - 0 addresses
Feb 23 09:56:22 np0005626463.localdomain dnsmasq-dhcp[311917]: read /var/lib/neutron/dhcp/05d93df9-29e9-48b0-9b9e-8c7a4eaa7448/host
Feb 23 09:56:22 np0005626463.localdomain dnsmasq-dhcp[311917]: read /var/lib/neutron/dhcp/05d93df9-29e9-48b0-9b9e-8c7a4eaa7448/opts
Feb 23 09:56:22 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:56:22.903 265541 INFO neutron.agent.dhcp.agent [None req-1ccbce1c-4cea-49df-9178-1b829f03f556 - - - - - -] DHCP configuration for ports {'9bb465c3-7b01-4dc3-9b6f-a528439f0d87'} is completed
Feb 23 09:56:22 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:56:22.911 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:56:22 np0005626463.localdomain kernel: device tap372a9673-6c left promiscuous mode
Feb 23 09:56:22 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:56:22.917 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:56:22 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:56:22.928 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:56:22 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.107:0/1198659923' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 09:56:22 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.108:0/1232735857' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 09:56:23 np0005626463.localdomain dnsmasq[311917]: read /var/lib/neutron/dhcp/05d93df9-29e9-48b0-9b9e-8c7a4eaa7448/addn_hosts - 0 addresses
Feb 23 09:56:23 np0005626463.localdomain dnsmasq-dhcp[311917]: read /var/lib/neutron/dhcp/05d93df9-29e9-48b0-9b9e-8c7a4eaa7448/host
Feb 23 09:56:23 np0005626463.localdomain dnsmasq-dhcp[311917]: read /var/lib/neutron/dhcp/05d93df9-29e9-48b0-9b9e-8c7a4eaa7448/opts
Feb 23 09:56:23 np0005626463.localdomain podman[311935]: 2026-02-23 09:56:23.110948349 +0000 UTC m=+0.061418359 container kill db7eafbc6ed283ccfebcf7b59820659e7db729ce0b671b853472707c4601cc8a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-05d93df9-29e9-48b0-9b9e-8c7a4eaa7448, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.build-date=20260216, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0)
Feb 23 09:56:23 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:56:23.132 265541 ERROR neutron.agent.dhcp.agent [None req-821e9013-c504-4be4-94d4-01337e729fb3 - - - - - -] Unable to reload_allocations dhcp for 05d93df9-29e9-48b0-9b9e-8c7a4eaa7448.: neutron.privileged.agent.linux.ip_lib.NetworkInterfaceNotFound: Network interface tap372a9673-6c not found in namespace qdhcp-05d93df9-29e9-48b0-9b9e-8c7a4eaa7448.
Feb 23 09:56:23 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:56:23.132 265541 ERROR neutron.agent.dhcp.agent Traceback (most recent call last):
Feb 23 09:56:23 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:56:23.132 265541 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/dhcp/agent.py", line 264, in _call_driver
Feb 23 09:56:23 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:56:23.132 265541 ERROR neutron.agent.dhcp.agent     rv = getattr(driver, action)(**action_kwargs)
Feb 23 09:56:23 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:56:23.132 265541 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 673, in reload_allocations
Feb 23 09:56:23 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:56:23.132 265541 ERROR neutron.agent.dhcp.agent     self.device_manager.update(self.network, self.interface_name)
Feb 23 09:56:23 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:56:23.132 265541 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1899, in update
Feb 23 09:56:23 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:56:23.132 265541 ERROR neutron.agent.dhcp.agent     self._set_default_route(network, device_name)
Feb 23 09:56:23 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:56:23.132 265541 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1610, in _set_default_route
Feb 23 09:56:23 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:56:23.132 265541 ERROR neutron.agent.dhcp.agent     self._set_default_route_ip_version(network, device_name,
Feb 23 09:56:23 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:56:23.132 265541 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1539, in _set_default_route_ip_version
Feb 23 09:56:23 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:56:23.132 265541 ERROR neutron.agent.dhcp.agent     gateway = device.route.get_gateway(ip_version=ip_version)
Feb 23 09:56:23 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:56:23.132 265541 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 671, in get_gateway
Feb 23 09:56:23 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:56:23.132 265541 ERROR neutron.agent.dhcp.agent     routes = self.list_routes(ip_version, scope=scope, table=table)
Feb 23 09:56:23 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:56:23.132 265541 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 656, in list_routes
Feb 23 09:56:23 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:56:23.132 265541 ERROR neutron.agent.dhcp.agent     return list_ip_routes(self._parent.namespace, ip_version, scope=scope,
Feb 23 09:56:23 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:56:23.132 265541 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 1611, in list_ip_routes
Feb 23 09:56:23 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:56:23.132 265541 ERROR neutron.agent.dhcp.agent     routes = privileged.list_ip_routes(namespace, ip_version, device=device,
Feb 23 09:56:23 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:56:23.132 265541 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 333, in wrapped_f
Feb 23 09:56:23 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:56:23.132 265541 ERROR neutron.agent.dhcp.agent     return self(f, *args, **kw)
Feb 23 09:56:23 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:56:23.132 265541 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 423, in __call__
Feb 23 09:56:23 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:56:23.132 265541 ERROR neutron.agent.dhcp.agent     do = self.iter(retry_state=retry_state)
Feb 23 09:56:23 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:56:23.132 265541 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 360, in iter
Feb 23 09:56:23 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:56:23.132 265541 ERROR neutron.agent.dhcp.agent     return fut.result()
Feb 23 09:56:23 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:56:23.132 265541 ERROR neutron.agent.dhcp.agent   File "/usr/lib64/python3.9/concurrent/futures/_base.py", line 439, in result
Feb 23 09:56:23 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:56:23.132 265541 ERROR neutron.agent.dhcp.agent     return self.__get_result()
Feb 23 09:56:23 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:56:23.132 265541 ERROR neutron.agent.dhcp.agent   File "/usr/lib64/python3.9/concurrent/futures/_base.py", line 391, in __get_result
Feb 23 09:56:23 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:56:23.132 265541 ERROR neutron.agent.dhcp.agent     raise self._exception
Feb 23 09:56:23 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:56:23.132 265541 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 426, in __call__
Feb 23 09:56:23 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:56:23.132 265541 ERROR neutron.agent.dhcp.agent     result = fn(*args, **kwargs)
Feb 23 09:56:23 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:56:23.132 265541 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/oslo_privsep/priv_context.py", line 271, in _wrap
Feb 23 09:56:23 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:56:23.132 265541 ERROR neutron.agent.dhcp.agent     return self.channel.remote_call(name, args, kwargs,
Feb 23 09:56:23 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:56:23.132 265541 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/oslo_privsep/daemon.py", line 215, in remote_call
Feb 23 09:56:23 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:56:23.132 265541 ERROR neutron.agent.dhcp.agent     raise exc_type(*result[2])
Feb 23 09:56:23 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:56:23.132 265541 ERROR neutron.agent.dhcp.agent neutron.privileged.agent.linux.ip_lib.NetworkInterfaceNotFound: Network interface tap372a9673-6c not found in namespace qdhcp-05d93df9-29e9-48b0-9b9e-8c7a4eaa7448.
Feb 23 09:56:23 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:56:23.132 265541 ERROR neutron.agent.dhcp.agent 
Feb 23 09:56:23 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:56:23.139 265541 INFO neutron.agent.dhcp.agent [-] Synchronizing state
Feb 23 09:56:23 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:56:23Z|00167|binding|INFO|Releasing lport 4143c8ea-7577-4792-9744-bcff90eb20f2 from this chassis (sb_readonly=0)
Feb 23 09:56:23 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:56:23.310 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:56:23 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:56:23.400 265541 INFO neutron.agent.dhcp.agent [None req-dee707e2-4fcb-4925-9852-1b2d562dd6b0 - - - - - -] All active networks have been fetched through RPC.
Feb 23 09:56:23 np0005626463.localdomain dnsmasq[311917]: exiting on receipt of SIGTERM
Feb 23 09:56:23 np0005626463.localdomain podman[311965]: 2026-02-23 09:56:23.546888358 +0000 UTC m=+0.042882956 container kill db7eafbc6ed283ccfebcf7b59820659e7db729ce0b671b853472707c4601cc8a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-05d93df9-29e9-48b0-9b9e-8c7a4eaa7448, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 23 09:56:23 np0005626463.localdomain systemd[1]: libpod-db7eafbc6ed283ccfebcf7b59820659e7db729ce0b671b853472707c4601cc8a.scope: Deactivated successfully.
Feb 23 09:56:23 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.
Feb 23 09:56:23 np0005626463.localdomain podman[311980]: 2026-02-23 09:56:23.626128006 +0000 UTC m=+0.061878663 container died db7eafbc6ed283ccfebcf7b59820659e7db729ce0b671b853472707c4601cc8a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-05d93df9-29e9-48b0-9b9e-8c7a4eaa7448, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 23 09:56:23 np0005626463.localdomain podman[311980]: 2026-02-23 09:56:23.679403352 +0000 UTC m=+0.115153959 container remove db7eafbc6ed283ccfebcf7b59820659e7db729ce0b671b853472707c4601cc8a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-05d93df9-29e9-48b0-9b9e-8c7a4eaa7448, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, tcib_build_tag=8419493e1fd846703d277695e03fc5eb)
Feb 23 09:56:23 np0005626463.localdomain podman[311991]: 2026-02-23 09:56:23.693847518 +0000 UTC m=+0.119909735 container health_status da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 23 09:56:23 np0005626463.localdomain podman[311991]: 2026-02-23 09:56:23.705272902 +0000 UTC m=+0.131335069 container exec_died da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 23 09:56:23 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:56:23.706 265541 INFO neutron.agent.dhcp.agent [None req-96683e3f-c935-4255-96f4-59816f0e3683 - - - - - -] Synchronizing state complete
Feb 23 09:56:23 np0005626463.localdomain systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Deactivated successfully.
Feb 23 09:56:23 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-6fd10a276b25a13959ac50d6133201d49913bb2a1ba87afd357e7af4f3a8b1bb-merged.mount: Deactivated successfully.
Feb 23 09:56:23 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-db7eafbc6ed283ccfebcf7b59820659e7db729ce0b671b853472707c4601cc8a-userdata-shm.mount: Deactivated successfully.
Feb 23 09:56:23 np0005626463.localdomain systemd[1]: run-netns-qdhcp\x2d05d93df9\x2d29e9\x2d48b0\x2d9b9e\x2d8c7a4eaa7448.mount: Deactivated successfully.
Feb 23 09:56:23 np0005626463.localdomain systemd[1]: libpod-conmon-db7eafbc6ed283ccfebcf7b59820659e7db729ce0b671b853472707c4601cc8a.scope: Deactivated successfully.
Feb 23 09:56:23 np0005626463.localdomain ceph-mon[294160]: pgmap v203: 177 pgs: 177 active+clean; 145 MiB data, 778 MiB used, 41 GiB / 42 GiB avail; 11 KiB/s rd, 682 B/s wr, 16 op/s
Feb 23 09:56:23 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.108:0/2141240059' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 09:56:24 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:56:24.054 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:56:24 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:56:24.054 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:56:24 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:56:24.055 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 23 09:56:25 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:56:25.054 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:56:25 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:56:25.537 2 INFO neutron.agent.securitygroups_rpc [None req-7a66e0e4-687a-49c0-b7a3-b39df7d3f4b0 70f605e811404c4bb9fe49c02ce24bf3 a2f9492758b148768734fafb039e58db - - default default] Security group member updated ['a4d30edc-cb55-4200-8dd2-93ea986a3cd5']
Feb 23 09:56:25 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.
Feb 23 09:56:25 np0005626463.localdomain podman[312029]: 2026-02-23 09:56:25.912339519 +0000 UTC m=+0.081073786 container health_status 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, build-date=2026-02-05T04:57:10Z, vcs-type=git, name=ubi9/ubi-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1770267347, distribution-scope=public, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, version=9.7, io.openshift.expose-services=, config_id=openstack_network_exporter, org.opencontainers.image.created=2026-02-05T04:57:10Z, architecture=x86_64, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.component=ubi9-minimal-container)
Feb 23 09:56:25 np0005626463.localdomain podman[312029]: 2026-02-23 09:56:25.92822739 +0000 UTC m=+0.096961677 container exec_died 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., build-date=2026-02-05T04:57:10Z, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, distribution-scope=public, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1770267347, container_name=openstack_network_exporter, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, name=ubi9/ubi-minimal, managed_by=edpm_ansible, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.7, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-02-05T04:57:10Z, vcs-type=git)
Feb 23 09:56:25 np0005626463.localdomain systemd[1]: 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.service: Deactivated successfully.
Feb 23 09:56:26 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:56:26.050 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:56:26 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:56:26.053 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:56:26 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:56:26.054 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:56:26 np0005626463.localdomain ceph-mon[294160]: pgmap v204: 177 pgs: 177 active+clean; 145 MiB data, 778 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:56:26 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:56:26Z|00168|binding|INFO|Releasing lport 4143c8ea-7577-4792-9744-bcff90eb20f2 from this chassis (sb_readonly=0)
Feb 23 09:56:26 np0005626463.localdomain sshd[312049]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:56:26 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:56:26.276 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:56:26 np0005626463.localdomain sshd[312049]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:56:26 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e127 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:56:27 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:56:27.054 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:56:27 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:56:27.079 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 09:56:27 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:56:27.079 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 09:56:27 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:56:27.080 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 09:56:27 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:56:27.080 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Auditing locally available compute resources for np0005626463.localdomain (node: np0005626463.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 23 09:56:27 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:56:27.080 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 09:56:27 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 23 09:56:27 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/194672215' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 09:56:27 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:56:27.523 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.443s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 09:56:27 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:56:27.601 282211 DEBUG nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 23 09:56:27 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:56:27.602 282211 DEBUG nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 23 09:56:27 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:56:27.829 282211 WARNING nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 23 09:56:27 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:56:27.830 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Hypervisor/Node resource view: name=np0005626463.localdomain free_ram=11384MB free_disk=41.8366584777832GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 23 09:56:27 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:56:27.831 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 09:56:27 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:56:27.831 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 09:56:27 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:56:27.884 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Instance c2a7d92b-952f-46a7-8a6a-3322a48fcf4b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 23 09:56:27 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:56:27.884 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 23 09:56:27 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:56:27.884 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Final resource view: name=np0005626463.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 23 09:56:27 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:56:27.911 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:56:27 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:56:27.916 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:56:27 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:56:27.920 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 09:56:27 np0005626463.localdomain ceph-mon[294160]: pgmap v205: 177 pgs: 177 active+clean; 145 MiB data, 778 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:56:27 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.106:0/194672215' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 09:56:28 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 23 09:56:28 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/2335027054' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 09:56:28 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:56:28.351 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.432s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 09:56:28 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:56:28.358 282211 DEBUG nova.compute.provider_tree [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Inventory has not changed in ProviderTree for provider: be63d86c-a403-4ec9-a515-07ea2962cb4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 23 09:56:28 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:56:28.378 282211 DEBUG nova.scheduler.client.report [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Inventory has not changed for provider be63d86c-a403-4ec9-a515-07ea2962cb4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 23 09:56:28 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:56:28.380 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Compute_service record updated for np0005626463.localdomain:np0005626463.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 23 09:56:28 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:56:28.381 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.550s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 09:56:28 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:56:28.788 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:23:09:8b 10.100.0.19 10.100.0.3'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.19/28 10.100.0.3/28', 'neutron:device_id': 'ovnmeta-f6f90c3e-e9fc-4b4d-8000-6715492c6006', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f6f90c3e-e9fc-4b4d-8000-6715492c6006', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8532226521ac43ca82723a0b71168e03', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=85b3f5b6-1b29-412a-9c40-de284e163599, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=b34126a6-b855-4d0f-af32-431b42ec89f3) old=Port_Binding(mac=['fa:16:3e:23:09:8b 10.100.0.3'], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'ovnmeta-f6f90c3e-e9fc-4b4d-8000-6715492c6006', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f6f90c3e-e9fc-4b4d-8000-6715492c6006', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8532226521ac43ca82723a0b71168e03', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 09:56:28 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:56:28.790 163572 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port b34126a6-b855-4d0f-af32-431b42ec89f3 in datapath f6f90c3e-e9fc-4b4d-8000-6715492c6006 updated
Feb 23 09:56:28 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:56:28.793 163572 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f6f90c3e-e9fc-4b4d-8000-6715492c6006, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 23 09:56:28 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:56:28.794 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[3782a614-483a-4527-8690-3d7c631bd7eb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 09:56:28 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.106:0/2335027054' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 09:56:29 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:56:29.077 265541 INFO neutron.agent.linux.ip_lib [None req-9053744e-16fe-4544-acb7-09db1134d43f - - - - - -] Device tap82d3eaca-cb cannot be used as it has no MAC address
Feb 23 09:56:29 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:56:29.134 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:56:29 np0005626463.localdomain kernel: device tap82d3eaca-cb entered promiscuous mode
Feb 23 09:56:29 np0005626463.localdomain NetworkManager[5974]: <info>  [1771840589.1421] manager: (tap82d3eaca-cb): new Generic device (/org/freedesktop/NetworkManager/Devices/30)
Feb 23 09:56:29 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:56:29.143 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:56:29 np0005626463.localdomain systemd-udevd[312105]: Network interface NamePolicy= disabled on kernel command line.
Feb 23 09:56:29 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:56:29.148 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:56:29 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:56:29Z|00169|binding|INFO|Claiming lport 82d3eaca-cb68-45a9-bf57-61ec3eca3d02 for this chassis.
Feb 23 09:56:29 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:56:29Z|00170|binding|INFO|82d3eaca-cb68-45a9-bf57-61ec3eca3d02: Claiming unknown
Feb 23 09:56:29 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:56:29.163 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005626463.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::3/64', 'neutron:device_id': 'dhcpfb23302c-55c1-5de0-badf-4fc1ff22837a-2ad34212-24f1-4cd3-b44f-f6713c550041', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2ad34212-24f1-4cd3-b44f-f6713c550041', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5c50f2ae0a2f4cdd8225f6794547909b', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e035a433-4457-40d5-9858-3562dbadafb2, chassis=[<ovs.db.idl.Row object at 0x7f808c075610>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f808c075610>], logical_port=82d3eaca-cb68-45a9-bf57-61ec3eca3d02) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 09:56:29 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:56:29.165 163572 INFO neutron.agent.ovn.metadata.agent [-] Port 82d3eaca-cb68-45a9-bf57-61ec3eca3d02 in datapath 2ad34212-24f1-4cd3-b44f-f6713c550041 bound to our chassis
Feb 23 09:56:29 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:56:29.168 163572 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 2ad34212-24f1-4cd3-b44f-f6713c550041 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Feb 23 09:56:29 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:56:29.169 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[35d603bf-0982-44fc-8443-55d39408bfc8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 09:56:29 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap82d3eaca-cb: No such device
Feb 23 09:56:29 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:56:29Z|00171|binding|INFO|Setting lport 82d3eaca-cb68-45a9-bf57-61ec3eca3d02 ovn-installed in OVS
Feb 23 09:56:29 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:56:29Z|00172|binding|INFO|Setting lport 82d3eaca-cb68-45a9-bf57-61ec3eca3d02 up in Southbound
Feb 23 09:56:29 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap82d3eaca-cb: No such device
Feb 23 09:56:29 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:56:29.176 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:56:29 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap82d3eaca-cb: No such device
Feb 23 09:56:29 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap82d3eaca-cb: No such device
Feb 23 09:56:29 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap82d3eaca-cb: No such device
Feb 23 09:56:29 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap82d3eaca-cb: No such device
Feb 23 09:56:29 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap82d3eaca-cb: No such device
Feb 23 09:56:29 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap82d3eaca-cb: No such device
Feb 23 09:56:29 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:56:29.218 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:56:29 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:56:29.251 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:56:30 np0005626463.localdomain ceph-mon[294160]: pgmap v206: 177 pgs: 177 active+clean; 145 MiB data, 778 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:56:30 np0005626463.localdomain podman[312176]: 
Feb 23 09:56:30 np0005626463.localdomain podman[312176]: 2026-02-23 09:56:30.069211777 +0000 UTC m=+0.092214360 container create b9051af7e34c51a3cfc500658b9baedafff338976b75ea91a3e471246f719182 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2ad34212-24f1-4cd3-b44f-f6713c550041, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2)
Feb 23 09:56:30 np0005626463.localdomain systemd[1]: Started libpod-conmon-b9051af7e34c51a3cfc500658b9baedafff338976b75ea91a3e471246f719182.scope.
Feb 23 09:56:30 np0005626463.localdomain podman[312176]: 2026-02-23 09:56:30.025620771 +0000 UTC m=+0.048623394 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 23 09:56:30 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 09:56:30 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/acad2e7ba45657c1d140e9cfc4440102975b73a8b85654d17bed266ce7bbe0a8/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 23 09:56:30 np0005626463.localdomain podman[312176]: 2026-02-23 09:56:30.159991052 +0000 UTC m=+0.182993635 container init b9051af7e34c51a3cfc500658b9baedafff338976b75ea91a3e471246f719182 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2ad34212-24f1-4cd3-b44f-f6713c550041, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260216, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.43.0)
Feb 23 09:56:30 np0005626463.localdomain podman[312176]: 2026-02-23 09:56:30.169845606 +0000 UTC m=+0.192848189 container start b9051af7e34c51a3cfc500658b9baedafff338976b75ea91a3e471246f719182 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2ad34212-24f1-4cd3-b44f-f6713c550041, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 23 09:56:30 np0005626463.localdomain dnsmasq[312194]: started, version 2.85 cachesize 150
Feb 23 09:56:30 np0005626463.localdomain dnsmasq[312194]: DNS service limited to local subnets
Feb 23 09:56:30 np0005626463.localdomain dnsmasq[312194]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 23 09:56:30 np0005626463.localdomain dnsmasq[312194]: warning: no upstream servers configured
Feb 23 09:56:30 np0005626463.localdomain dnsmasq-dhcp[312194]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Feb 23 09:56:30 np0005626463.localdomain dnsmasq[312194]: read /var/lib/neutron/dhcp/2ad34212-24f1-4cd3-b44f-f6713c550041/addn_hosts - 0 addresses
Feb 23 09:56:30 np0005626463.localdomain dnsmasq-dhcp[312194]: read /var/lib/neutron/dhcp/2ad34212-24f1-4cd3-b44f-f6713c550041/host
Feb 23 09:56:30 np0005626463.localdomain dnsmasq-dhcp[312194]: read /var/lib/neutron/dhcp/2ad34212-24f1-4cd3-b44f-f6713c550041/opts
Feb 23 09:56:30 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:56:30.311 265541 INFO neutron.agent.dhcp.agent [None req-1ace6790-003c-4f84-8ec5-da0bdccc3811 - - - - - -] DHCP configuration for ports {'90d73603-9275-4864-aac8-38669974f0c1'} is completed
Feb 23 09:56:30 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:56:30.382 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:56:30 np0005626463.localdomain dnsmasq[312194]: read /var/lib/neutron/dhcp/2ad34212-24f1-4cd3-b44f-f6713c550041/addn_hosts - 0 addresses
Feb 23 09:56:30 np0005626463.localdomain dnsmasq-dhcp[312194]: read /var/lib/neutron/dhcp/2ad34212-24f1-4cd3-b44f-f6713c550041/host
Feb 23 09:56:30 np0005626463.localdomain dnsmasq-dhcp[312194]: read /var/lib/neutron/dhcp/2ad34212-24f1-4cd3-b44f-f6713c550041/opts
Feb 23 09:56:30 np0005626463.localdomain podman[312210]: 2026-02-23 09:56:30.493359562 +0000 UTC m=+0.058045025 container kill b9051af7e34c51a3cfc500658b9baedafff338976b75ea91a3e471246f719182 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2ad34212-24f1-4cd3-b44f-f6713c550041, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.build-date=20260216)
Feb 23 09:56:30 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:56:30.763 265541 INFO neutron.agent.dhcp.agent [None req-d36b9e67-9dda-46bc-8595-5ffd7018ed96 - - - - - -] DHCP configuration for ports {'82d3eaca-cb68-45a9-bf57-61ec3eca3d02', '90d73603-9275-4864-aac8-38669974f0c1'} is completed
Feb 23 09:56:30 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:56:30.909 2 INFO neutron.agent.securitygroups_rpc [None req-51f4eed0-aded-49bd-977f-d8680aeec69e 7e2d209f03d04e6d86f4e10f7490cf37 8532226521ac43ca82723a0b71168e03 - - default default] Security group member updated ['709ad995-bfde-4096-a0b4-2ba30248a611']
Feb 23 09:56:31 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:56:31.239 2 INFO neutron.agent.securitygroups_rpc [None req-2890ed0d-a00f-4342-b403-59e82c71dfe3 d6a332b0f2e446a68dc6c19280644090 2207de28dcd245d2b198a56e6161001a - - default default] Security group member updated ['ba10f066-a353-4b6a-98b8-dab53422ee14']
Feb 23 09:56:31 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:56:31Z|00173|binding|INFO|Releasing lport 4143c8ea-7577-4792-9744-bcff90eb20f2 from this chassis (sb_readonly=0)
Feb 23 09:56:31 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:56:31.404 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:56:31 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e127 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:56:32 np0005626463.localdomain ceph-mon[294160]: pgmap v207: 177 pgs: 177 active+clean; 145 MiB data, 778 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:56:32 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:56:32Z|00174|binding|INFO|Removing iface tap82d3eaca-cb ovn-installed in OVS
Feb 23 09:56:32 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:56:32.436 163572 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port 567e9ecf-e387-4a72-9734-88b8c159914d with type ""
Feb 23 09:56:32 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:56:32Z|00175|binding|INFO|Removing lport 82d3eaca-cb68-45a9-bf57-61ec3eca3d02 ovn-installed in OVS
Feb 23 09:56:32 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:56:32.437 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005626463.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::3/64', 'neutron:device_id': 'dhcpfb23302c-55c1-5de0-badf-4fc1ff22837a-2ad34212-24f1-4cd3-b44f-f6713c550041', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2ad34212-24f1-4cd3-b44f-f6713c550041', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5c50f2ae0a2f4cdd8225f6794547909b', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e035a433-4457-40d5-9858-3562dbadafb2, chassis=[<ovs.db.idl.Row object at 0x7f808c075610>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f808c075610>], logical_port=82d3eaca-cb68-45a9-bf57-61ec3eca3d02) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 09:56:32 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:56:32.439 163572 INFO neutron.agent.ovn.metadata.agent [-] Port 82d3eaca-cb68-45a9-bf57-61ec3eca3d02 in datapath 2ad34212-24f1-4cd3-b44f-f6713c550041 unbound from our chassis
Feb 23 09:56:32 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:56:32.443 163572 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 2ad34212-24f1-4cd3-b44f-f6713c550041, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 23 09:56:32 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:56:32.444 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:56:32 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:56:32.444 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[75c660df-ce9c-43c8-a337-95f303c52707]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 09:56:32 np0005626463.localdomain dnsmasq[312194]: exiting on receipt of SIGTERM
Feb 23 09:56:32 np0005626463.localdomain podman[312246]: 2026-02-23 09:56:32.554654265 +0000 UTC m=+0.057868298 container kill b9051af7e34c51a3cfc500658b9baedafff338976b75ea91a3e471246f719182 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2ad34212-24f1-4cd3-b44f-f6713c550041, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, tcib_managed=true)
Feb 23 09:56:32 np0005626463.localdomain systemd[1]: libpod-b9051af7e34c51a3cfc500658b9baedafff338976b75ea91a3e471246f719182.scope: Deactivated successfully.
Feb 23 09:56:32 np0005626463.localdomain podman[312260]: 2026-02-23 09:56:32.623849284 +0000 UTC m=+0.051796531 container died b9051af7e34c51a3cfc500658b9baedafff338976b75ea91a3e471246f719182 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2ad34212-24f1-4cd3-b44f-f6713c550041, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Feb 23 09:56:32 np0005626463.localdomain systemd[1]: tmp-crun.YTYRfG.mount: Deactivated successfully.
Feb 23 09:56:32 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b9051af7e34c51a3cfc500658b9baedafff338976b75ea91a3e471246f719182-userdata-shm.mount: Deactivated successfully.
Feb 23 09:56:32 np0005626463.localdomain podman[312260]: 2026-02-23 09:56:32.665734598 +0000 UTC m=+0.093681815 container remove b9051af7e34c51a3cfc500658b9baedafff338976b75ea91a3e471246f719182 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2ad34212-24f1-4cd3-b44f-f6713c550041, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.43.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260216)
Feb 23 09:56:32 np0005626463.localdomain systemd[1]: libpod-conmon-b9051af7e34c51a3cfc500658b9baedafff338976b75ea91a3e471246f719182.scope: Deactivated successfully.
Feb 23 09:56:32 np0005626463.localdomain kernel: device tap82d3eaca-cb left promiscuous mode
Feb 23 09:56:32 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:56:32.680 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:56:32 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:56:32.689 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:56:32 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:56:32.705 265541 INFO neutron.agent.dhcp.agent [None req-96683e3f-c935-4255-96f4-59816f0e3683 - - - - - -] Synchronizing state
Feb 23 09:56:32 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:56:32.915 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:56:32 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:56:32.962 265541 INFO neutron.agent.dhcp.agent [None req-274e03b2-f74d-44f1-b54c-1efda683bbca - - - - - -] All active networks have been fetched through RPC.
Feb 23 09:56:32 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:56:32.963 265541 INFO neutron.agent.dhcp.agent [-] Starting network 2ad34212-24f1-4cd3-b44f-f6713c550041 dhcp configuration
Feb 23 09:56:32 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:56:32.966 265541 INFO neutron.agent.dhcp.agent [-] Starting network f240f7f9-ece5-4389-81ed-fec84e1bb5f7 dhcp configuration
Feb 23 09:56:32 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:56:32.967 265541 INFO neutron.agent.dhcp.agent [-] Finished network f240f7f9-ece5-4389-81ed-fec84e1bb5f7 dhcp configuration
Feb 23 09:56:32 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:56:32.980 2 INFO neutron.agent.securitygroups_rpc [None req-5d3e7d76-e87c-4157-8484-7f31d3f7ba7b 7e2d209f03d04e6d86f4e10f7490cf37 8532226521ac43ca82723a0b71168e03 - - default default] Security group member updated ['709ad995-bfde-4096-a0b4-2ba30248a611']
Feb 23 09:56:32 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:56:32.988 2 INFO neutron.agent.securitygroups_rpc [None req-f8235bf5-0ea1-4901-953e-20a5808b9d67 70f605e811404c4bb9fe49c02ce24bf3 a2f9492758b148768734fafb039e58db - - default default] Security group member updated ['a4d30edc-cb55-4200-8dd2-93ea986a3cd5']
Feb 23 09:56:33 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-acad2e7ba45657c1d140e9cfc4440102975b73a8b85654d17bed266ce7bbe0a8-merged.mount: Deactivated successfully.
Feb 23 09:56:33 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:56:33.644 265541 INFO neutron.agent.linux.ip_lib [None req-14238360-846a-4748-b5df-245855e99c07 - - - - - -] Device tapf95ad720-f8 cannot be used as it has no MAC address
Feb 23 09:56:33 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:56:33.706 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:56:33 np0005626463.localdomain kernel: device tapf95ad720-f8 entered promiscuous mode
Feb 23 09:56:33 np0005626463.localdomain NetworkManager[5974]: <info>  [1771840593.7135] manager: (tapf95ad720-f8): new Generic device (/org/freedesktop/NetworkManager/Devices/31)
Feb 23 09:56:33 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:56:33Z|00176|binding|INFO|Claiming lport f95ad720-f864-4c58-8f6b-16288980b877 for this chassis.
Feb 23 09:56:33 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:56:33Z|00177|binding|INFO|f95ad720-f864-4c58-8f6b-16288980b877: Claiming unknown
Feb 23 09:56:33 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:56:33.714 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:56:33 np0005626463.localdomain systemd-udevd[312295]: Network interface NamePolicy= disabled on kernel command line.
Feb 23 09:56:33 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:56:33.722 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:56:33 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:56:33Z|00178|binding|INFO|Setting lport f95ad720-f864-4c58-8f6b-16288980b877 ovn-installed in OVS
Feb 23 09:56:33 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:56:33Z|00179|binding|INFO|Setting lport f95ad720-f864-4c58-8f6b-16288980b877 up in Southbound
Feb 23 09:56:33 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:56:33.726 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:56:33 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:56:33.723 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005626463.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::3/64', 'neutron:device_id': 'dhcpfb23302c-55c1-5de0-badf-4fc1ff22837a-2ad34212-24f1-4cd3-b44f-f6713c550041', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2ad34212-24f1-4cd3-b44f-f6713c550041', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5c50f2ae0a2f4cdd8225f6794547909b', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e035a433-4457-40d5-9858-3562dbadafb2, chassis=[<ovs.db.idl.Row object at 0x7f808c075610>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f808c075610>], logical_port=f95ad720-f864-4c58-8f6b-16288980b877) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 09:56:33 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:56:33.726 163572 INFO neutron.agent.ovn.metadata.agent [-] Port f95ad720-f864-4c58-8f6b-16288980b877 in datapath 2ad34212-24f1-4cd3-b44f-f6713c550041 bound to our chassis
Feb 23 09:56:33 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:56:33.728 163572 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 2ad34212-24f1-4cd3-b44f-f6713c550041 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Feb 23 09:56:33 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:56:33.729 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[dc5ff5af-413c-4b35-b5bd-a04bd4247ae2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 09:56:33 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tapf95ad720-f8: No such device
Feb 23 09:56:33 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tapf95ad720-f8: No such device
Feb 23 09:56:33 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:56:33.754 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:56:33 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tapf95ad720-f8: No such device
Feb 23 09:56:33 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tapf95ad720-f8: No such device
Feb 23 09:56:33 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tapf95ad720-f8: No such device
Feb 23 09:56:33 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tapf95ad720-f8: No such device
Feb 23 09:56:33 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tapf95ad720-f8: No such device
Feb 23 09:56:33 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tapf95ad720-f8: No such device
Feb 23 09:56:33 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:56:33.797 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:56:33 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:56:33.821 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:56:34 np0005626463.localdomain ceph-mon[294160]: pgmap v208: 177 pgs: 177 active+clean; 145 MiB data, 778 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:56:34 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:56:34.327 2 INFO neutron.agent.securitygroups_rpc [None req-3413d0d6-8a0f-497e-801e-d0383982e452 7e2d209f03d04e6d86f4e10f7490cf37 8532226521ac43ca82723a0b71168e03 - - default default] Security group member updated ['709ad995-bfde-4096-a0b4-2ba30248a611']
Feb 23 09:56:34 np0005626463.localdomain podman[312366]: 
Feb 23 09:56:34 np0005626463.localdomain podman[312366]: 2026-02-23 09:56:34.592209297 +0000 UTC m=+0.087680129 container create f08fa3c657fc5e00bd3d993ad47b84408d9b498838a5f2f3281a53587a59f783 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2ad34212-24f1-4cd3-b44f-f6713c550041, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0)
Feb 23 09:56:34 np0005626463.localdomain systemd[1]: Started libpod-conmon-f08fa3c657fc5e00bd3d993ad47b84408d9b498838a5f2f3281a53587a59f783.scope.
Feb 23 09:56:34 np0005626463.localdomain systemd[1]: tmp-crun.PFO2Gh.mount: Deactivated successfully.
Feb 23 09:56:34 np0005626463.localdomain podman[312366]: 2026-02-23 09:56:34.548647011 +0000 UTC m=+0.044117863 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 23 09:56:34 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 09:56:34 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1f03d05a0e5a4232a0d0df03e37aa60cfe00098407af5ed8503081a6f008fa81/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 23 09:56:34 np0005626463.localdomain podman[312366]: 2026-02-23 09:56:34.666260935 +0000 UTC m=+0.161731777 container init f08fa3c657fc5e00bd3d993ad47b84408d9b498838a5f2f3281a53587a59f783 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2ad34212-24f1-4cd3-b44f-f6713c550041, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0)
Feb 23 09:56:34 np0005626463.localdomain podman[312366]: 2026-02-23 09:56:34.680223956 +0000 UTC m=+0.175694798 container start f08fa3c657fc5e00bd3d993ad47b84408d9b498838a5f2f3281a53587a59f783 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2ad34212-24f1-4cd3-b44f-f6713c550041, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, org.label-schema.build-date=20260216)
Feb 23 09:56:34 np0005626463.localdomain dnsmasq[312385]: started, version 2.85 cachesize 150
Feb 23 09:56:34 np0005626463.localdomain dnsmasq[312385]: DNS service limited to local subnets
Feb 23 09:56:34 np0005626463.localdomain dnsmasq[312385]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 23 09:56:34 np0005626463.localdomain dnsmasq[312385]: warning: no upstream servers configured
Feb 23 09:56:34 np0005626463.localdomain dnsmasq-dhcp[312385]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Feb 23 09:56:34 np0005626463.localdomain dnsmasq[312385]: read /var/lib/neutron/dhcp/2ad34212-24f1-4cd3-b44f-f6713c550041/addn_hosts - 0 addresses
Feb 23 09:56:34 np0005626463.localdomain dnsmasq-dhcp[312385]: read /var/lib/neutron/dhcp/2ad34212-24f1-4cd3-b44f-f6713c550041/host
Feb 23 09:56:34 np0005626463.localdomain dnsmasq-dhcp[312385]: read /var/lib/neutron/dhcp/2ad34212-24f1-4cd3-b44f-f6713c550041/opts
Feb 23 09:56:34 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:56:34.739 265541 INFO neutron.agent.dhcp.agent [None req-14238360-846a-4748-b5df-245855e99c07 - - - - - -] Finished network 2ad34212-24f1-4cd3-b44f-f6713c550041 dhcp configuration
Feb 23 09:56:34 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:56:34.740 265541 INFO neutron.agent.dhcp.agent [None req-274e03b2-f74d-44f1-b54c-1efda683bbca - - - - - -] Synchronizing state complete
Feb 23 09:56:34 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:56:34.823 265541 INFO neutron.agent.dhcp.agent [None req-38eb3c3a-4c0d-4b0e-a13e-9cd621e7421d - - - - - -] DHCP configuration for ports {'90d73603-9275-4864-aac8-38669974f0c1'} is completed
Feb 23 09:56:34 np0005626463.localdomain dnsmasq[312385]: read /var/lib/neutron/dhcp/2ad34212-24f1-4cd3-b44f-f6713c550041/addn_hosts - 0 addresses
Feb 23 09:56:34 np0005626463.localdomain dnsmasq-dhcp[312385]: read /var/lib/neutron/dhcp/2ad34212-24f1-4cd3-b44f-f6713c550041/host
Feb 23 09:56:34 np0005626463.localdomain dnsmasq-dhcp[312385]: read /var/lib/neutron/dhcp/2ad34212-24f1-4cd3-b44f-f6713c550041/opts
Feb 23 09:56:34 np0005626463.localdomain podman[312402]: 2026-02-23 09:56:34.909621033 +0000 UTC m=+0.060547521 container kill f08fa3c657fc5e00bd3d993ad47b84408d9b498838a5f2f3281a53587a59f783 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2ad34212-24f1-4cd3-b44f-f6713c550041, org.label-schema.license=GPLv2, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Feb 23 09:56:34 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:56:34.926 265541 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Feb 23 09:56:35 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:56:35.243 163572 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port 7a034147-002e-4bd4-bc87-146a85b374e8 with type ""
Feb 23 09:56:35 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:56:35Z|00180|binding|INFO|Removing iface tapf95ad720-f8 ovn-installed in OVS
Feb 23 09:56:35 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:56:35.244 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005626463.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'dhcpfb23302c-55c1-5de0-badf-4fc1ff22837a-2ad34212-24f1-4cd3-b44f-f6713c550041', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2ad34212-24f1-4cd3-b44f-f6713c550041', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5c50f2ae0a2f4cdd8225f6794547909b', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005626463.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e035a433-4457-40d5-9858-3562dbadafb2, chassis=[<ovs.db.idl.Row object at 0x7f808c075610>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f808c075610>], logical_port=f95ad720-f864-4c58-8f6b-16288980b877) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 09:56:35 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:56:35.247 163572 INFO neutron.agent.ovn.metadata.agent [-] Port f95ad720-f864-4c58-8f6b-16288980b877 in datapath 2ad34212-24f1-4cd3-b44f-f6713c550041 unbound from our chassis
Feb 23 09:56:35 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:56:35.250 163572 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 2ad34212-24f1-4cd3-b44f-f6713c550041 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Feb 23 09:56:35 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:56:35Z|00181|binding|INFO|Removing lport f95ad720-f864-4c58-8f6b-16288980b877 ovn-installed in OVS
Feb 23 09:56:35 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:56:35.251 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[196ea26e-b02e-4694-971c-4194d867ae50]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 09:56:35 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:56:35.252 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:56:35 np0005626463.localdomain dnsmasq[312385]: exiting on receipt of SIGTERM
Feb 23 09:56:35 np0005626463.localdomain podman[312441]: 2026-02-23 09:56:35.289326775 +0000 UTC m=+0.061161951 container kill f08fa3c657fc5e00bd3d993ad47b84408d9b498838a5f2f3281a53587a59f783 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2ad34212-24f1-4cd3-b44f-f6713c550041, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb)
Feb 23 09:56:35 np0005626463.localdomain systemd[1]: libpod-f08fa3c657fc5e00bd3d993ad47b84408d9b498838a5f2f3281a53587a59f783.scope: Deactivated successfully.
Feb 23 09:56:35 np0005626463.localdomain podman[312456]: 2026-02-23 09:56:35.367392567 +0000 UTC m=+0.061534882 container died f08fa3c657fc5e00bd3d993ad47b84408d9b498838a5f2f3281a53587a59f783 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2ad34212-24f1-4cd3-b44f-f6713c550041, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260216, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.license=GPLv2)
Feb 23 09:56:35 np0005626463.localdomain podman[312456]: 2026-02-23 09:56:35.397019942 +0000 UTC m=+0.091162217 container cleanup f08fa3c657fc5e00bd3d993ad47b84408d9b498838a5f2f3281a53587a59f783 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2ad34212-24f1-4cd3-b44f-f6713c550041, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Feb 23 09:56:35 np0005626463.localdomain systemd[1]: libpod-conmon-f08fa3c657fc5e00bd3d993ad47b84408d9b498838a5f2f3281a53587a59f783.scope: Deactivated successfully.
Feb 23 09:56:35 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:56:35.442 2 INFO neutron.agent.securitygroups_rpc [None req-aa725386-7116-408a-a619-1f9f73c010a8 7e2d209f03d04e6d86f4e10f7490cf37 8532226521ac43ca82723a0b71168e03 - - default default] Security group member updated ['709ad995-bfde-4096-a0b4-2ba30248a611']
Feb 23 09:56:35 np0005626463.localdomain podman[312457]: 2026-02-23 09:56:35.44551574 +0000 UTC m=+0.133950139 container remove f08fa3c657fc5e00bd3d993ad47b84408d9b498838a5f2f3281a53587a59f783 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2ad34212-24f1-4cd3-b44f-f6713c550041, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0)
Feb 23 09:56:35 np0005626463.localdomain kernel: device tapf95ad720-f8 left promiscuous mode
Feb 23 09:56:35 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:56:35.458 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:56:35 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:56:35.462 2 INFO neutron.agent.securitygroups_rpc [None req-ff1e8db8-3064-40a0-abdd-bddb9c09e449 d6a332b0f2e446a68dc6c19280644090 2207de28dcd245d2b198a56e6161001a - - default default] Security group member updated ['ba10f066-a353-4b6a-98b8-dab53422ee14']
Feb 23 09:56:35 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:56:35.478 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:56:35 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:56:35.510 265541 INFO neutron.agent.dhcp.agent [None req-33006b32-0a26-43fd-a576-94e8c8a0331e - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 23 09:56:35 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:56:35Z|00182|binding|INFO|Releasing lport 4143c8ea-7577-4792-9744-bcff90eb20f2 from this chassis (sb_readonly=0)
Feb 23 09:56:35 np0005626463.localdomain systemd[1]: tmp-crun.BbSzKM.mount: Deactivated successfully.
Feb 23 09:56:35 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-1f03d05a0e5a4232a0d0df03e37aa60cfe00098407af5ed8503081a6f008fa81-merged.mount: Deactivated successfully.
Feb 23 09:56:35 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f08fa3c657fc5e00bd3d993ad47b84408d9b498838a5f2f3281a53587a59f783-userdata-shm.mount: Deactivated successfully.
Feb 23 09:56:35 np0005626463.localdomain systemd[1]: run-netns-qdhcp\x2d2ad34212\x2d24f1\x2d4cd3\x2db44f\x2df6713c550041.mount: Deactivated successfully.
Feb 23 09:56:35 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:56:35.600 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:56:35 np0005626463.localdomain podman[312501]: 2026-02-23 09:56:35.747140359 +0000 UTC m=+0.057305211 container kill 877649022c10c10c453bdf68b55a9c7d3447d25e11bf446b40e71f7097310905 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5b164f5a-6aae-4898-a6ea-a1c77a8cf652, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20260216, tcib_managed=true)
Feb 23 09:56:35 np0005626463.localdomain dnsmasq[310490]: read /var/lib/neutron/dhcp/5b164f5a-6aae-4898-a6ea-a1c77a8cf652/addn_hosts - 0 addresses
Feb 23 09:56:35 np0005626463.localdomain dnsmasq-dhcp[310490]: read /var/lib/neutron/dhcp/5b164f5a-6aae-4898-a6ea-a1c77a8cf652/host
Feb 23 09:56:35 np0005626463.localdomain dnsmasq-dhcp[310490]: read /var/lib/neutron/dhcp/5b164f5a-6aae-4898-a6ea-a1c77a8cf652/opts
Feb 23 09:56:35 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:56:35Z|00183|binding|INFO|Releasing lport fea38170-0626-427b-8a36-b82b8e008ab6 from this chassis (sb_readonly=0)
Feb 23 09:56:35 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:56:35Z|00184|binding|INFO|Setting lport fea38170-0626-427b-8a36-b82b8e008ab6 down in Southbound
Feb 23 09:56:35 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:56:35.963 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:56:35 np0005626463.localdomain kernel: device tapfea38170-06 left promiscuous mode
Feb 23 09:56:35 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:56:35.976 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005626463.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpfb23302c-55c1-5de0-badf-4fc1ff22837a-5b164f5a-6aae-4898-a6ea-a1c77a8cf652', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5b164f5a-6aae-4898-a6ea-a1c77a8cf652', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5632ff1108264def864ca9b5473cb716', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005626463.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=cca3f636-88c2-4a23-a28f-aa045d27b076, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f808c075610>], logical_port=fea38170-0626-427b-8a36-b82b8e008ab6) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f808c075610>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 09:56:35 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:56:35.978 163572 INFO neutron.agent.ovn.metadata.agent [-] Port fea38170-0626-427b-8a36-b82b8e008ab6 in datapath 5b164f5a-6aae-4898-a6ea-a1c77a8cf652 unbound from our chassis
Feb 23 09:56:35 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:56:35.982 163572 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 5b164f5a-6aae-4898-a6ea-a1c77a8cf652, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 23 09:56:35 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:56:35.983 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[e8857760-669a-4443-a1a1-6ff0f740ca92]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 09:56:35 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:56:35.988 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:56:36 np0005626463.localdomain ceph-mon[294160]: pgmap v209: 177 pgs: 177 active+clean; 145 MiB data, 778 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:56:36 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:56:36.513 2 INFO neutron.agent.securitygroups_rpc [None req-1fda4918-c68f-4c9d-a41e-c71c19c42e64 f6ed429d4dee4c5abef411f5952801ef 7760b87546484c7693fd48206e06d3f8 - - default default] Security group member updated ['1a09a3fa-6a99-44c4-8684-508fe117a320']
Feb 23 09:56:36 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.
Feb 23 09:56:36 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.
Feb 23 09:56:36 np0005626463.localdomain systemd[1]: tmp-crun.k7X80M.mount: Deactivated successfully.
Feb 23 09:56:36 np0005626463.localdomain podman[312525]: 2026-02-23 09:56:36.923159993 +0000 UTC m=+0.095212823 container health_status bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Feb 23 09:56:36 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e127 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:56:36 np0005626463.localdomain podman[312525]: 2026-02-23 09:56:36.962529419 +0000 UTC m=+0.134582209 container exec_died bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Feb 23 09:56:36 np0005626463.localdomain systemd[1]: bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.service: Deactivated successfully.
Feb 23 09:56:37 np0005626463.localdomain podman[312524]: 2026-02-23 09:56:37.022400858 +0000 UTC m=+0.197000727 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, container_name=ovn_controller)
Feb 23 09:56:37 np0005626463.localdomain podman[312524]: 2026-02-23 09:56:37.121260863 +0000 UTC m=+0.295860762 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller)
Feb 23 09:56:37 np0005626463.localdomain systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully.
Feb 23 09:56:37 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:56:37.667 2 INFO neutron.agent.securitygroups_rpc [None req-0f5e541e-74ab-492f-8902-38d7300ec53b f6ed429d4dee4c5abef411f5952801ef 7760b87546484c7693fd48206e06d3f8 - - default default] Security group member updated ['1a09a3fa-6a99-44c4-8684-508fe117a320']
Feb 23 09:56:37 np0005626463.localdomain systemd[1]: tmp-crun.IiNlwB.mount: Deactivated successfully.
Feb 23 09:56:37 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:56:37.917 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:56:37 np0005626463.localdomain ceph-mon[294160]: pgmap v210: 177 pgs: 177 active+clean; 145 MiB data, 778 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:56:39 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:56:39.084 2 INFO neutron.agent.securitygroups_rpc [None req-0564baea-5d79-415c-93b0-64b1a1c7383f d6a332b0f2e446a68dc6c19280644090 2207de28dcd245d2b198a56e6161001a - - default default] Security group member updated ['ba10f066-a353-4b6a-98b8-dab53422ee14']
Feb 23 09:56:39 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:56:39Z|00185|binding|INFO|Releasing lport 4143c8ea-7577-4792-9744-bcff90eb20f2 from this chassis (sb_readonly=0)
Feb 23 09:56:39 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:56:39.157 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:56:39 np0005626463.localdomain podman[242954]: time="2026-02-23T09:56:39Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 23 09:56:39 np0005626463.localdomain podman[242954]: @ - - [23/Feb/2026:09:56:39 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 160729 "" "Go-http-client/1.1"
Feb 23 09:56:39 np0005626463.localdomain podman[242954]: @ - - [23/Feb/2026:09:56:39 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19763 "" "Go-http-client/1.1"
Feb 23 09:56:39 np0005626463.localdomain dnsmasq[310490]: exiting on receipt of SIGTERM
Feb 23 09:56:39 np0005626463.localdomain podman[312589]: 2026-02-23 09:56:39.689982756 +0000 UTC m=+0.060289904 container kill 877649022c10c10c453bdf68b55a9c7d3447d25e11bf446b40e71f7097310905 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5b164f5a-6aae-4898-a6ea-a1c77a8cf652, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Feb 23 09:56:39 np0005626463.localdomain systemd[1]: libpod-877649022c10c10c453bdf68b55a9c7d3447d25e11bf446b40e71f7097310905.scope: Deactivated successfully.
Feb 23 09:56:39 np0005626463.localdomain podman[312601]: 2026-02-23 09:56:39.756737387 +0000 UTC m=+0.055635239 container died 877649022c10c10c453bdf68b55a9c7d3447d25e11bf446b40e71f7097310905 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5b164f5a-6aae-4898-a6ea-a1c77a8cf652, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true)
Feb 23 09:56:39 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-877649022c10c10c453bdf68b55a9c7d3447d25e11bf446b40e71f7097310905-userdata-shm.mount: Deactivated successfully.
Feb 23 09:56:39 np0005626463.localdomain podman[312601]: 2026-02-23 09:56:39.79048329 +0000 UTC m=+0.089381112 container cleanup 877649022c10c10c453bdf68b55a9c7d3447d25e11bf446b40e71f7097310905 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5b164f5a-6aae-4898-a6ea-a1c77a8cf652, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.license=GPLv2, tcib_managed=true)
Feb 23 09:56:39 np0005626463.localdomain systemd[1]: libpod-conmon-877649022c10c10c453bdf68b55a9c7d3447d25e11bf446b40e71f7097310905.scope: Deactivated successfully.
Feb 23 09:56:39 np0005626463.localdomain podman[312608]: 2026-02-23 09:56:39.84065743 +0000 UTC m=+0.126824789 container remove 877649022c10c10c453bdf68b55a9c7d3447d25e11bf446b40e71f7097310905 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5b164f5a-6aae-4898-a6ea-a1c77a8cf652, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 23 09:56:39 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:56:39.873 265541 INFO neutron.agent.dhcp.agent [None req-2c69bfeb-4dd0-4d65-83c8-716dcc5d0958 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 23 09:56:39 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:56:39.874 265541 INFO neutron.agent.dhcp.agent [None req-2c69bfeb-4dd0-4d65-83c8-716dcc5d0958 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 23 09:56:40 np0005626463.localdomain ceph-mon[294160]: pgmap v211: 177 pgs: 177 active+clean; 145 MiB data, 778 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:56:40 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-930a766cf5e4b8a84c31286e57382a4788939504b12f986029e04d96b0f3a126-merged.mount: Deactivated successfully.
Feb 23 09:56:40 np0005626463.localdomain systemd[1]: run-netns-qdhcp\x2d5b164f5a\x2d6aae\x2d4898\x2da6ea\x2da1c77a8cf652.mount: Deactivated successfully.
Feb 23 09:56:41 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.
Feb 23 09:56:41 np0005626463.localdomain podman[312629]: 2026-02-23 09:56:41.912552333 +0000 UTC m=+0.084813361 container health_status be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_managed=true, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 23 09:56:41 np0005626463.localdomain podman[312629]: 2026-02-23 09:56:41.927400921 +0000 UTC m=+0.099661919 container exec_died be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.build-date=20260216, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Feb 23 09:56:41 np0005626463.localdomain systemd[1]: be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.service: Deactivated successfully.
Feb 23 09:56:41 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e127 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:56:41 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:56:41.987 2 INFO neutron.agent.securitygroups_rpc [None req-00c7bab6-3ec4-412e-b4e0-8dc31e0d8362 d6a332b0f2e446a68dc6c19280644090 2207de28dcd245d2b198a56e6161001a - - default default] Security group member updated ['ba10f066-a353-4b6a-98b8-dab53422ee14']
Feb 23 09:56:42 np0005626463.localdomain ceph-mon[294160]: pgmap v212: 177 pgs: 177 active+clean; 145 MiB data, 778 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:56:42 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:56:42.920 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:56:43 np0005626463.localdomain openstack_network_exporter[245358]: ERROR   09:56:43 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 23 09:56:43 np0005626463.localdomain openstack_network_exporter[245358]: 
Feb 23 09:56:43 np0005626463.localdomain openstack_network_exporter[245358]: ERROR   09:56:43 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 23 09:56:43 np0005626463.localdomain openstack_network_exporter[245358]: 
Feb 23 09:56:43 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:56:43.447 265541 INFO neutron.agent.linux.ip_lib [None req-c0a85ad2-4ddd-4097-ada4-a5dc6b3ee6e8 - - - - - -] Device tap556381c3-ac cannot be used as it has no MAC address
Feb 23 09:56:43 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:56:43.468 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:56:43 np0005626463.localdomain kernel: device tap556381c3-ac entered promiscuous mode
Feb 23 09:56:43 np0005626463.localdomain NetworkManager[5974]: <info>  [1771840603.4749] manager: (tap556381c3-ac): new Generic device (/org/freedesktop/NetworkManager/Devices/32)
Feb 23 09:56:43 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:56:43.474 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:56:43 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:56:43Z|00186|binding|INFO|Claiming lport 556381c3-ace4-4128-82b0-c33c8ad0e1c9 for this chassis.
Feb 23 09:56:43 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:56:43Z|00187|binding|INFO|556381c3-ace4-4128-82b0-c33c8ad0e1c9: Claiming unknown
Feb 23 09:56:43 np0005626463.localdomain systemd-udevd[312658]: Network interface NamePolicy= disabled on kernel command line.
Feb 23 09:56:43 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:56:43.494 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005626463.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpfb23302c-55c1-5de0-badf-4fc1ff22837a-6d7119b8-7a22-4328-9884-4df90f2c3ebd', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6d7119b8-7a22-4328-9884-4df90f2c3ebd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8532226521ac43ca82723a0b71168e03', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0aaf8a49-34b1-4108-a38e-a49be6b7ace1, chassis=[<ovs.db.idl.Row object at 0x7f808c075610>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f808c075610>], logical_port=556381c3-ace4-4128-82b0-c33c8ad0e1c9) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 09:56:43 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:56:43.495 163572 INFO neutron.agent.ovn.metadata.agent [-] Port 556381c3-ace4-4128-82b0-c33c8ad0e1c9 in datapath 6d7119b8-7a22-4328-9884-4df90f2c3ebd bound to our chassis
Feb 23 09:56:43 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:56:43.497 163572 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 6d7119b8-7a22-4328-9884-4df90f2c3ebd or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Feb 23 09:56:43 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:56:43.498 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[f10fe368-7b53-4397-97ee-c9d8fb5a36f0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 09:56:43 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap556381c3-ac: No such device
Feb 23 09:56:43 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:56:43.505 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:56:43 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap556381c3-ac: No such device
Feb 23 09:56:43 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:56:43Z|00188|binding|INFO|Setting lport 556381c3-ace4-4128-82b0-c33c8ad0e1c9 ovn-installed in OVS
Feb 23 09:56:43 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:56:43Z|00189|binding|INFO|Setting lport 556381c3-ace4-4128-82b0-c33c8ad0e1c9 up in Southbound
Feb 23 09:56:43 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:56:43.510 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:56:43 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap556381c3-ac: No such device
Feb 23 09:56:43 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:56:43.513 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:56:43 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap556381c3-ac: No such device
Feb 23 09:56:43 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap556381c3-ac: No such device
Feb 23 09:56:43 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap556381c3-ac: No such device
Feb 23 09:56:43 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap556381c3-ac: No such device
Feb 23 09:56:43 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap556381c3-ac: No such device
Feb 23 09:56:43 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:56:43.551 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:56:43 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:56:43.579 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:56:44 np0005626463.localdomain ceph-mon[294160]: pgmap v213: 177 pgs: 177 active+clean; 145 MiB data, 778 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:56:44 np0005626463.localdomain podman[312729]: 
Feb 23 09:56:44 np0005626463.localdomain podman[312729]: 2026-02-23 09:56:44.395000679 +0000 UTC m=+0.086549165 container create 4c97b74bb0c1ccb48aaab8f53a54e5bed89d083ddf83f7899a80a394fe24eefd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6d7119b8-7a22-4328-9884-4df90f2c3ebd, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260216, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 23 09:56:44 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.
Feb 23 09:56:44 np0005626463.localdomain systemd[1]: Started libpod-conmon-4c97b74bb0c1ccb48aaab8f53a54e5bed89d083ddf83f7899a80a394fe24eefd.scope.
Feb 23 09:56:44 np0005626463.localdomain podman[312729]: 2026-02-23 09:56:44.354194409 +0000 UTC m=+0.045742925 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 23 09:56:44 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 09:56:44 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b266334acb0e55f0426e207375e16e3c1c5eaa3a9f7c324de8884ec556d7b30d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 23 09:56:44 np0005626463.localdomain podman[312742]: 2026-02-23 09:56:44.518122113 +0000 UTC m=+0.084030098 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, org.label-schema.build-date=20260216, config_id=ovn_metadata_agent, managed_by=edpm_ansible, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS)
Feb 23 09:56:44 np0005626463.localdomain podman[312742]: 2026-02-23 09:56:44.523518849 +0000 UTC m=+0.089426804 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, io.buildah.version=1.43.0, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent)
Feb 23 09:56:44 np0005626463.localdomain systemd[1]: 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully.
Feb 23 09:56:44 np0005626463.localdomain podman[312729]: 2026-02-23 09:56:44.550828483 +0000 UTC m=+0.242376969 container init 4c97b74bb0c1ccb48aaab8f53a54e5bed89d083ddf83f7899a80a394fe24eefd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6d7119b8-7a22-4328-9884-4df90f2c3ebd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260216, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 23 09:56:44 np0005626463.localdomain podman[312729]: 2026-02-23 09:56:44.560502312 +0000 UTC m=+0.252050808 container start 4c97b74bb0c1ccb48aaab8f53a54e5bed89d083ddf83f7899a80a394fe24eefd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6d7119b8-7a22-4328-9884-4df90f2c3ebd, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260216, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 23 09:56:44 np0005626463.localdomain dnsmasq[312766]: started, version 2.85 cachesize 150
Feb 23 09:56:44 np0005626463.localdomain dnsmasq[312766]: DNS service limited to local subnets
Feb 23 09:56:44 np0005626463.localdomain dnsmasq[312766]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 23 09:56:44 np0005626463.localdomain dnsmasq[312766]: warning: no upstream servers configured
Feb 23 09:56:44 np0005626463.localdomain dnsmasq-dhcp[312766]: DHCP, static leases only on 10.100.0.0, lease time 1d
Feb 23 09:56:44 np0005626463.localdomain dnsmasq[312766]: read /var/lib/neutron/dhcp/6d7119b8-7a22-4328-9884-4df90f2c3ebd/addn_hosts - 0 addresses
Feb 23 09:56:44 np0005626463.localdomain dnsmasq-dhcp[312766]: read /var/lib/neutron/dhcp/6d7119b8-7a22-4328-9884-4df90f2c3ebd/host
Feb 23 09:56:44 np0005626463.localdomain dnsmasq-dhcp[312766]: read /var/lib/neutron/dhcp/6d7119b8-7a22-4328-9884-4df90f2c3ebd/opts
Feb 23 09:56:45 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:56:45.031 265541 INFO neutron.agent.dhcp.agent [None req-6ee0dc6d-6a24-4d3b-b2fd-e00b24f3197f - - - - - -] DHCP configuration for ports {'771090fe-1a3e-436c-a027-4e2b743c323c'} is completed
Feb 23 09:56:45 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:56:45.650 2 INFO neutron.agent.securitygroups_rpc [None req-f79d3420-cd8f-4700-a586-57c3375d8a5c 7e2d209f03d04e6d86f4e10f7490cf37 8532226521ac43ca82723a0b71168e03 - - default default] Security group member updated ['709ad995-bfde-4096-a0b4-2ba30248a611']
Feb 23 09:56:45 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:56:45.694 265541 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-23T09:56:45Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f282937c310>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f282937c400>], id=36514155-4ae9-4996-bb88-a40fb9ad6bcf, ip_allocation=immediate, mac_address=fa:16:3e:92:f1:55, name=tempest-PortsTestJSON-1106137784, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-23T09:56:40Z, description=, dns_domain=, id=6d7119b8-7a22-4328-9884-4df90f2c3ebd, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-PortsTestJSON-1304508516, port_security_enabled=True, project_id=8532226521ac43ca82723a0b71168e03, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=7415, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1405, status=ACTIVE, subnets=['ebe8b110-7353-4fff-bc4d-dbab752dab8b'], tags=[], tenant_id=8532226521ac43ca82723a0b71168e03, updated_at=2026-02-23T09:56:42Z, vlan_transparent=None, network_id=6d7119b8-7a22-4328-9884-4df90f2c3ebd, port_security_enabled=True, project_id=8532226521ac43ca82723a0b71168e03, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['709ad995-bfde-4096-a0b4-2ba30248a611'], standard_attr_id=1440, status=DOWN, tags=[], tenant_id=8532226521ac43ca82723a0b71168e03, updated_at=2026-02-23T09:56:45Z on network 6d7119b8-7a22-4328-9884-4df90f2c3ebd
Feb 23 09:56:45 np0005626463.localdomain dnsmasq[312766]: read /var/lib/neutron/dhcp/6d7119b8-7a22-4328-9884-4df90f2c3ebd/addn_hosts - 1 addresses
Feb 23 09:56:45 np0005626463.localdomain dnsmasq-dhcp[312766]: read /var/lib/neutron/dhcp/6d7119b8-7a22-4328-9884-4df90f2c3ebd/host
Feb 23 09:56:45 np0005626463.localdomain dnsmasq-dhcp[312766]: read /var/lib/neutron/dhcp/6d7119b8-7a22-4328-9884-4df90f2c3ebd/opts
Feb 23 09:56:45 np0005626463.localdomain podman[312784]: 2026-02-23 09:56:45.90595029 +0000 UTC m=+0.065001959 container kill 4c97b74bb0c1ccb48aaab8f53a54e5bed89d083ddf83f7899a80a394fe24eefd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6d7119b8-7a22-4328-9884-4df90f2c3ebd, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.43.0, tcib_managed=true, org.label-schema.license=GPLv2)
Feb 23 09:56:46 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:56:46.121 265541 INFO neutron.agent.dhcp.agent [None req-1a048b78-c850-4495-8445-cc78629209cc - - - - - -] DHCP configuration for ports {'36514155-4ae9-4996-bb88-a40fb9ad6bcf'} is completed
Feb 23 09:56:46 np0005626463.localdomain ceph-mon[294160]: pgmap v214: 177 pgs: 177 active+clean; 145 MiB data, 778 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:56:46 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:56:46.317 2 INFO neutron.agent.securitygroups_rpc [None req-23858e93-d857-4321-b908-702c515a7b92 7e2d209f03d04e6d86f4e10f7490cf37 8532226521ac43ca82723a0b71168e03 - - default default] Security group member updated ['709ad995-bfde-4096-a0b4-2ba30248a611']
Feb 23 09:56:46 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:56:46.605 265541 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-23T09:56:46Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f28294bff10>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f28294bf730>], id=7a454d8b-c57f-41ac-a163-5c4d28169a90, ip_allocation=immediate, mac_address=fa:16:3e:18:a1:a7, name=tempest-PortsTestJSON-341249040, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-23T09:56:40Z, description=, dns_domain=, id=6d7119b8-7a22-4328-9884-4df90f2c3ebd, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-PortsTestJSON-1304508516, port_security_enabled=True, project_id=8532226521ac43ca82723a0b71168e03, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=7415, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1405, status=ACTIVE, subnets=['ebe8b110-7353-4fff-bc4d-dbab752dab8b'], tags=[], tenant_id=8532226521ac43ca82723a0b71168e03, updated_at=2026-02-23T09:56:42Z, vlan_transparent=None, network_id=6d7119b8-7a22-4328-9884-4df90f2c3ebd, port_security_enabled=True, project_id=8532226521ac43ca82723a0b71168e03, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['709ad995-bfde-4096-a0b4-2ba30248a611'], standard_attr_id=1442, status=DOWN, tags=[], tenant_id=8532226521ac43ca82723a0b71168e03, updated_at=2026-02-23T09:56:46Z on network 6d7119b8-7a22-4328-9884-4df90f2c3ebd
Feb 23 09:56:46 np0005626463.localdomain dnsmasq[312766]: read /var/lib/neutron/dhcp/6d7119b8-7a22-4328-9884-4df90f2c3ebd/addn_hosts - 2 addresses
Feb 23 09:56:46 np0005626463.localdomain dnsmasq-dhcp[312766]: read /var/lib/neutron/dhcp/6d7119b8-7a22-4328-9884-4df90f2c3ebd/host
Feb 23 09:56:46 np0005626463.localdomain dnsmasq-dhcp[312766]: read /var/lib/neutron/dhcp/6d7119b8-7a22-4328-9884-4df90f2c3ebd/opts
Feb 23 09:56:46 np0005626463.localdomain podman[312832]: 2026-02-23 09:56:46.849946295 +0000 UTC m=+0.058179947 container kill 4c97b74bb0c1ccb48aaab8f53a54e5bed89d083ddf83f7899a80a394fe24eefd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6d7119b8-7a22-4328-9884-4df90f2c3ebd, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.build-date=20260216, tcib_managed=true)
Feb 23 09:56:46 np0005626463.localdomain dnsmasq[310749]: exiting on receipt of SIGTERM
Feb 23 09:56:46 np0005626463.localdomain systemd[1]: libpod-2532b22c884299358d19e096c9cf0cdee2dd0b81d7b698e52ccc67ba00649cf1.scope: Deactivated successfully.
Feb 23 09:56:46 np0005626463.localdomain podman[312849]: 2026-02-23 09:56:46.908823554 +0000 UTC m=+0.069234799 container kill 2532b22c884299358d19e096c9cf0cdee2dd0b81d7b698e52ccc67ba00649cf1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b207e42e-4d3c-43ce-b855-2d1a36797be6, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=8419493e1fd846703d277695e03fc5eb)
Feb 23 09:56:46 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e127 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:56:46 np0005626463.localdomain podman[312869]: 2026-02-23 09:56:46.984998478 +0000 UTC m=+0.064466672 container died 2532b22c884299358d19e096c9cf0cdee2dd0b81d7b698e52ccc67ba00649cf1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b207e42e-4d3c-43ce-b855-2d1a36797be6, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Feb 23 09:56:47 np0005626463.localdomain systemd[1]: tmp-crun.BtYp9v.mount: Deactivated successfully.
Feb 23 09:56:47 np0005626463.localdomain podman[312869]: 2026-02-23 09:56:47.028337287 +0000 UTC m=+0.107805441 container cleanup 2532b22c884299358d19e096c9cf0cdee2dd0b81d7b698e52ccc67ba00649cf1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b207e42e-4d3c-43ce-b855-2d1a36797be6, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 23 09:56:47 np0005626463.localdomain systemd[1]: libpod-conmon-2532b22c884299358d19e096c9cf0cdee2dd0b81d7b698e52ccc67ba00649cf1.scope: Deactivated successfully.
Feb 23 09:56:47 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:56:47.101 265541 INFO neutron.agent.dhcp.agent [None req-3fff2107-a8bf-49c6-a1da-5c55c8e1ef8b - - - - - -] DHCP configuration for ports {'7a454d8b-c57f-41ac-a163-5c4d28169a90'} is completed
Feb 23 09:56:47 np0005626463.localdomain podman[312871]: 2026-02-23 09:56:47.11616207 +0000 UTC m=+0.183437118 container remove 2532b22c884299358d19e096c9cf0cdee2dd0b81d7b698e52ccc67ba00649cf1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b207e42e-4d3c-43ce-b855-2d1a36797be6, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0)
Feb 23 09:56:47 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:56:47.164 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:56:47 np0005626463.localdomain kernel: device tap5e31c1f9-f2 left promiscuous mode
Feb 23 09:56:47 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:56:47Z|00190|binding|INFO|Releasing lport 5e31c1f9-f29e-43a3-8e7c-2e3adb446c2a from this chassis (sb_readonly=0)
Feb 23 09:56:47 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:56:47Z|00191|binding|INFO|Setting lport 5e31c1f9-f29e-43a3-8e7c-2e3adb446c2a down in Southbound
Feb 23 09:56:47 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:56:47.173 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005626463.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:ffff::2/64', 'neutron:device_id': 'dhcpfb23302c-55c1-5de0-badf-4fc1ff22837a-b207e42e-4d3c-43ce-b855-2d1a36797be6', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b207e42e-4d3c-43ce-b855-2d1a36797be6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5c50f2ae0a2f4cdd8225f6794547909b', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005626463.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c0b7f1d9-1471-4000-b583-343082500ed7, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f808c075610>], logical_port=5e31c1f9-f29e-43a3-8e7c-2e3adb446c2a) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f808c075610>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 09:56:47 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:56:47.175 163572 INFO neutron.agent.ovn.metadata.agent [-] Port 5e31c1f9-f29e-43a3-8e7c-2e3adb446c2a in datapath b207e42e-4d3c-43ce-b855-2d1a36797be6 unbound from our chassis
Feb 23 09:56:47 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:56:47.177 163572 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network b207e42e-4d3c-43ce-b855-2d1a36797be6 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Feb 23 09:56:47 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:56:47.178 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[fdc75a83-2689-4bd8-8c87-e859bdd4583b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 09:56:47 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:56:47.185 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:56:47 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:56:47.187 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:56:47 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:56:47.212 265541 INFO neutron.agent.dhcp.agent [None req-e7b3f8fc-1587-45d7-bebb-80552091a52c - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 23 09:56:47 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:56:47.438 265541 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Feb 23 09:56:47 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:56:47Z|00192|binding|INFO|Releasing lport 4143c8ea-7577-4792-9744-bcff90eb20f2 from this chassis (sb_readonly=0)
Feb 23 09:56:47 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:56:47.631 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:56:47 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:56:47.754 2 INFO neutron.agent.securitygroups_rpc [None req-531eb3a8-b1e9-4d08-88ed-f2a5323c2530 7e2d209f03d04e6d86f4e10f7490cf37 8532226521ac43ca82723a0b71168e03 - - default default] Security group member updated ['709ad995-bfde-4096-a0b4-2ba30248a611']
Feb 23 09:56:47 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-85d2c0758da427144c001a7e97c7359bf4b87ae2a86e80ab66c7f916f65db929-merged.mount: Deactivated successfully.
Feb 23 09:56:47 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-2532b22c884299358d19e096c9cf0cdee2dd0b81d7b698e52ccc67ba00649cf1-userdata-shm.mount: Deactivated successfully.
Feb 23 09:56:47 np0005626463.localdomain systemd[1]: run-netns-qdhcp\x2db207e42e\x2d4d3c\x2d43ce\x2db855\x2d2d1a36797be6.mount: Deactivated successfully.
Feb 23 09:56:47 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:56:47.923 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:56:47 np0005626463.localdomain ceph-mon[294160]: pgmap v215: 177 pgs: 177 active+clean; 145 MiB data, 778 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:56:48 np0005626463.localdomain dnsmasq[312766]: read /var/lib/neutron/dhcp/6d7119b8-7a22-4328-9884-4df90f2c3ebd/addn_hosts - 1 addresses
Feb 23 09:56:48 np0005626463.localdomain dnsmasq-dhcp[312766]: read /var/lib/neutron/dhcp/6d7119b8-7a22-4328-9884-4df90f2c3ebd/host
Feb 23 09:56:48 np0005626463.localdomain dnsmasq-dhcp[312766]: read /var/lib/neutron/dhcp/6d7119b8-7a22-4328-9884-4df90f2c3ebd/opts
Feb 23 09:56:48 np0005626463.localdomain podman[312917]: 2026-02-23 09:56:48.038400924 +0000 UTC m=+0.067758656 container kill 4c97b74bb0c1ccb48aaab8f53a54e5bed89d083ddf83f7899a80a394fe24eefd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6d7119b8-7a22-4328-9884-4df90f2c3ebd, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, io.buildah.version=1.43.0)
Feb 23 09:56:48 np0005626463.localdomain systemd[1]: tmp-crun.epo7Gj.mount: Deactivated successfully.
Feb 23 09:56:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:56:48.557 163572 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 09:56:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:56:48.558 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 09:56:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:56:48.558 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 09:56:49 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:56:49.054 2 INFO neutron.agent.securitygroups_rpc [None req-7c4570b4-c3dc-480d-b91f-3f4320c93168 7e2d209f03d04e6d86f4e10f7490cf37 8532226521ac43ca82723a0b71168e03 - - default default] Security group member updated ['709ad995-bfde-4096-a0b4-2ba30248a611']
Feb 23 09:56:49 np0005626463.localdomain dnsmasq[312766]: read /var/lib/neutron/dhcp/6d7119b8-7a22-4328-9884-4df90f2c3ebd/addn_hosts - 0 addresses
Feb 23 09:56:49 np0005626463.localdomain podman[312952]: 2026-02-23 09:56:49.379993272 +0000 UTC m=+0.059415017 container kill 4c97b74bb0c1ccb48aaab8f53a54e5bed89d083ddf83f7899a80a394fe24eefd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6d7119b8-7a22-4328-9884-4df90f2c3ebd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, tcib_managed=true, org.label-schema.build-date=20260216, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 23 09:56:49 np0005626463.localdomain dnsmasq-dhcp[312766]: read /var/lib/neutron/dhcp/6d7119b8-7a22-4328-9884-4df90f2c3ebd/host
Feb 23 09:56:49 np0005626463.localdomain dnsmasq-dhcp[312766]: read /var/lib/neutron/dhcp/6d7119b8-7a22-4328-9884-4df90f2c3ebd/opts
Feb 23 09:56:49 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:56:49.883 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=14, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '22:68:bc', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'c6:19:65:94:49:af'}, ipsec=False) old=SB_Global(nb_cfg=13) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 09:56:49 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:56:49.884 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:56:49 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:56:49.884 163572 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 23 09:56:49 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:56:49.980 2 INFO neutron.agent.securitygroups_rpc [None req-df38b538-2f68-43d4-a4e5-1e14d933a2a9 a882fa93577048b68025b6e97dbb9195 e8630a66fd9f41828b0bd2cf93b5956f - - default default] Security group member updated ['ea9a997e-7b09-4599-8d8f-c6dc5472496e']
Feb 23 09:56:50 np0005626463.localdomain ceph-mon[294160]: pgmap v216: 177 pgs: 177 active+clean; 145 MiB data, 778 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:56:50 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:56:50.359 265541 INFO neutron.agent.linux.ip_lib [None req-7d76da0b-577a-4df2-a4d3-a0ad80518032 - - - - - -] Device tap6b8e941b-e3 cannot be used as it has no MAC address
Feb 23 09:56:50 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:56:50.383 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:56:50 np0005626463.localdomain kernel: device tap6b8e941b-e3 entered promiscuous mode
Feb 23 09:56:50 np0005626463.localdomain NetworkManager[5974]: <info>  [1771840610.3920] manager: (tap6b8e941b-e3): new Generic device (/org/freedesktop/NetworkManager/Devices/33)
Feb 23 09:56:50 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:56:50Z|00193|binding|INFO|Claiming lport 6b8e941b-e318-43c8-8da1-efc8c08d0ac8 for this chassis.
Feb 23 09:56:50 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:56:50Z|00194|binding|INFO|6b8e941b-e318-43c8-8da1-efc8c08d0ac8: Claiming unknown
Feb 23 09:56:50 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:56:50.392 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:56:50 np0005626463.localdomain systemd-udevd[313011]: Network interface NamePolicy= disabled on kernel command line.
Feb 23 09:56:50 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:56:50.407 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005626463.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpfb23302c-55c1-5de0-badf-4fc1ff22837a-2fe91a2a-5b02-4767-89ca-7f8954141d90', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2fe91a2a-5b02-4767-89ca-7f8954141d90', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e8630a66fd9f41828b0bd2cf93b5956f', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4c6d3781-7aae-4474-bf2c-0e950a13f37c, chassis=[<ovs.db.idl.Row object at 0x7f808c075610>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f808c075610>], logical_port=6b8e941b-e318-43c8-8da1-efc8c08d0ac8) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 09:56:50 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:56:50.408 163572 INFO neutron.agent.ovn.metadata.agent [-] Port 6b8e941b-e318-43c8-8da1-efc8c08d0ac8 in datapath 2fe91a2a-5b02-4767-89ca-7f8954141d90 bound to our chassis
Feb 23 09:56:50 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:56:50.410 163572 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 2fe91a2a-5b02-4767-89ca-7f8954141d90 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Feb 23 09:56:50 np0005626463.localdomain dnsmasq[312766]: exiting on receipt of SIGTERM
Feb 23 09:56:50 np0005626463.localdomain podman[312993]: 2026-02-23 09:56:50.411363666 +0000 UTC m=+0.075830173 container kill 4c97b74bb0c1ccb48aaab8f53a54e5bed89d083ddf83f7899a80a394fe24eefd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6d7119b8-7a22-4328-9884-4df90f2c3ebd, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true)
Feb 23 09:56:50 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:56:50.411 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[2001e787-29d8-408f-9ccf-9678ed1fb6cc]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 09:56:50 np0005626463.localdomain systemd[1]: libpod-4c97b74bb0c1ccb48aaab8f53a54e5bed89d083ddf83f7899a80a394fe24eefd.scope: Deactivated successfully.
Feb 23 09:56:50 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:56:50.423 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:56:50 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:56:50Z|00195|binding|INFO|Setting lport 6b8e941b-e318-43c8-8da1-efc8c08d0ac8 ovn-installed in OVS
Feb 23 09:56:50 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:56:50Z|00196|binding|INFO|Setting lport 6b8e941b-e318-43c8-8da1-efc8c08d0ac8 up in Southbound
Feb 23 09:56:50 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:56:50.443 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:56:50 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:56:50.446 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:56:50 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:56:50.492 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:56:50 np0005626463.localdomain podman[313015]: 2026-02-23 09:56:50.497749676 +0000 UTC m=+0.066608490 container died 4c97b74bb0c1ccb48aaab8f53a54e5bed89d083ddf83f7899a80a394fe24eefd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6d7119b8-7a22-4328-9884-4df90f2c3ebd, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Feb 23 09:56:50 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:56:50.519 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:56:50 np0005626463.localdomain podman[313015]: 2026-02-23 09:56:50.52962343 +0000 UTC m=+0.098482184 container cleanup 4c97b74bb0c1ccb48aaab8f53a54e5bed89d083ddf83f7899a80a394fe24eefd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6d7119b8-7a22-4328-9884-4df90f2c3ebd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.build-date=20260216, io.buildah.version=1.43.0, tcib_managed=true)
Feb 23 09:56:50 np0005626463.localdomain systemd[1]: libpod-conmon-4c97b74bb0c1ccb48aaab8f53a54e5bed89d083ddf83f7899a80a394fe24eefd.scope: Deactivated successfully.
Feb 23 09:56:50 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:56:50Z|00197|binding|INFO|Removing iface tap556381c3-ac ovn-installed in OVS
Feb 23 09:56:50 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:56:50.546 163572 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port 5c3fc2f9-1f90-4ed3-b1c0-73af4bb6bf5f with type ""
Feb 23 09:56:50 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:56:50Z|00198|binding|INFO|Removing lport 556381c3-ace4-4128-82b0-c33c8ad0e1c9 ovn-installed in OVS
Feb 23 09:56:50 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:56:50.548 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005626463.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpfb23302c-55c1-5de0-badf-4fc1ff22837a-6d7119b8-7a22-4328-9884-4df90f2c3ebd', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6d7119b8-7a22-4328-9884-4df90f2c3ebd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8532226521ac43ca82723a0b71168e03', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005626463.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0aaf8a49-34b1-4108-a38e-a49be6b7ace1, chassis=[<ovs.db.idl.Row object at 0x7f808c075610>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f808c075610>], logical_port=556381c3-ace4-4128-82b0-c33c8ad0e1c9) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 09:56:50 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:56:50.548 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:56:50 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:56:50.550 163572 INFO neutron.agent.ovn.metadata.agent [-] Port 556381c3-ace4-4128-82b0-c33c8ad0e1c9 in datapath 6d7119b8-7a22-4328-9884-4df90f2c3ebd unbound from our chassis
Feb 23 09:56:50 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:56:50.553 163572 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6d7119b8-7a22-4328-9884-4df90f2c3ebd, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 23 09:56:50 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:56:50.553 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[d7f1f984-ad83-49ab-8a46-57f63cccd672]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 09:56:50 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:56:50.554 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:56:50 np0005626463.localdomain podman[313017]: 2026-02-23 09:56:50.57302823 +0000 UTC m=+0.126369405 container remove 4c97b74bb0c1ccb48aaab8f53a54e5bed89d083ddf83f7899a80a394fe24eefd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6d7119b8-7a22-4328-9884-4df90f2c3ebd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0)
Feb 23 09:56:50 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:56:50.589 2 INFO neutron.agent.securitygroups_rpc [None req-8eb799ea-d360-4da4-8f9a-9902d730dcc7 a882fa93577048b68025b6e97dbb9195 e8630a66fd9f41828b0bd2cf93b5956f - - default default] Security group member updated ['ea9a997e-7b09-4599-8d8f-c6dc5472496e']
Feb 23 09:56:50 np0005626463.localdomain kernel: device tap556381c3-ac left promiscuous mode
Feb 23 09:56:50 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:56:50.625 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:56:50 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:56:50.637 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:56:50 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:56:50.650 265541 INFO neutron.agent.dhcp.agent [None req-30f6c4db-23e3-4788-8b95-e85b5b625129 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 23 09:56:50 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:56:50.769 265541 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Feb 23 09:56:50 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:56:50Z|00199|binding|INFO|Releasing lport 4143c8ea-7577-4792-9744-bcff90eb20f2 from this chassis (sb_readonly=0)
Feb 23 09:56:50 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:56:50.959 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:56:51 np0005626463.localdomain podman[313096]: 
Feb 23 09:56:51 np0005626463.localdomain podman[313096]: 2026-02-23 09:56:51.368992963 +0000 UTC m=+0.087450054 container create a96b696892d823e8b0bba4e645f2a1b2d19750d84c810551e0c6042fbbc568c1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2fe91a2a-5b02-4767-89ca-7f8954141d90, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260216, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team)
Feb 23 09:56:51 np0005626463.localdomain systemd[1]: Started libpod-conmon-a96b696892d823e8b0bba4e645f2a1b2d19750d84c810551e0c6042fbbc568c1.scope.
Feb 23 09:56:51 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-b266334acb0e55f0426e207375e16e3c1c5eaa3a9f7c324de8884ec556d7b30d-merged.mount: Deactivated successfully.
Feb 23 09:56:51 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-4c97b74bb0c1ccb48aaab8f53a54e5bed89d083ddf83f7899a80a394fe24eefd-userdata-shm.mount: Deactivated successfully.
Feb 23 09:56:51 np0005626463.localdomain systemd[1]: run-netns-qdhcp\x2d6d7119b8\x2d7a22\x2d4328\x2d9884\x2d4df90f2c3ebd.mount: Deactivated successfully.
Feb 23 09:56:51 np0005626463.localdomain podman[313096]: 2026-02-23 09:56:51.325815168 +0000 UTC m=+0.044272269 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 23 09:56:51 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 09:56:51 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bb5a3a963be7354e78652f4a92c2c34304d37c406133dfb79b7eec719c4f26a2/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 23 09:56:51 np0005626463.localdomain podman[313096]: 2026-02-23 09:56:51.452029258 +0000 UTC m=+0.170486339 container init a96b696892d823e8b0bba4e645f2a1b2d19750d84c810551e0c6042fbbc568c1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2fe91a2a-5b02-4767-89ca-7f8954141d90, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20260216, maintainer=OpenStack Kubernetes Operator team)
Feb 23 09:56:51 np0005626463.localdomain podman[313096]: 2026-02-23 09:56:51.462152311 +0000 UTC m=+0.180609402 container start a96b696892d823e8b0bba4e645f2a1b2d19750d84c810551e0c6042fbbc568c1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2fe91a2a-5b02-4767-89ca-7f8954141d90, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Feb 23 09:56:51 np0005626463.localdomain dnsmasq[313115]: started, version 2.85 cachesize 150
Feb 23 09:56:51 np0005626463.localdomain dnsmasq[313115]: DNS service limited to local subnets
Feb 23 09:56:51 np0005626463.localdomain dnsmasq[313115]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 23 09:56:51 np0005626463.localdomain dnsmasq[313115]: warning: no upstream servers configured
Feb 23 09:56:51 np0005626463.localdomain dnsmasq-dhcp[313115]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Feb 23 09:56:51 np0005626463.localdomain dnsmasq[313115]: read /var/lib/neutron/dhcp/2fe91a2a-5b02-4767-89ca-7f8954141d90/addn_hosts - 0 addresses
Feb 23 09:56:51 np0005626463.localdomain dnsmasq-dhcp[313115]: read /var/lib/neutron/dhcp/2fe91a2a-5b02-4767-89ca-7f8954141d90/host
Feb 23 09:56:51 np0005626463.localdomain dnsmasq-dhcp[313115]: read /var/lib/neutron/dhcp/2fe91a2a-5b02-4767-89ca-7f8954141d90/opts
Feb 23 09:56:51 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:56:51.524 265541 INFO neutron.agent.dhcp.agent [None req-7d76da0b-577a-4df2-a4d3-a0ad80518032 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-23T09:56:49Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f2829303b50>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f2829303b20>], id=4f99192a-5334-414e-8f15-7eeaaea3cb3b, ip_allocation=immediate, mac_address=fa:16:3e:93:f2:6b, name=tempest-ExtraDHCPOptionsIpV6TestJSON-44067334, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-23T09:56:47Z, description=, dns_domain=, id=2fe91a2a-5b02-4767-89ca-7f8954141d90, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-ExtraDHCPOptionsIpV6TestJSON-test-network-581439413, port_security_enabled=True, project_id=e8630a66fd9f41828b0bd2cf93b5956f, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=18315, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1450, status=ACTIVE, subnets=['ffdf14ca-f105-4a70-9464-7924b5a5f427'], tags=[], tenant_id=e8630a66fd9f41828b0bd2cf93b5956f, updated_at=2026-02-23T09:56:49Z, vlan_transparent=None, network_id=2fe91a2a-5b02-4767-89ca-7f8954141d90, port_security_enabled=True, project_id=e8630a66fd9f41828b0bd2cf93b5956f, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['ea9a997e-7b09-4599-8d8f-c6dc5472496e'], standard_attr_id=1459, status=DOWN, tags=[], tenant_id=e8630a66fd9f41828b0bd2cf93b5956f, updated_at=2026-02-23T09:56:49Z on network 2fe91a2a-5b02-4767-89ca-7f8954141d90
Feb 23 09:56:51 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:56:51.597 265541 INFO neutron.agent.dhcp.agent [None req-11ce90fe-c87a-4973-91a4-46f144b76ce1 - - - - - -] DHCP configuration for ports {'21e9c9a4-2bce-498a-92d2-ed020111f0ed'} is completed
Feb 23 09:56:51 np0005626463.localdomain dnsmasq[313115]: read /var/lib/neutron/dhcp/2fe91a2a-5b02-4767-89ca-7f8954141d90/addn_hosts - 1 addresses
Feb 23 09:56:51 np0005626463.localdomain podman[313135]: 2026-02-23 09:56:51.719602265 +0000 UTC m=+0.062743920 container kill a96b696892d823e8b0bba4e645f2a1b2d19750d84c810551e0c6042fbbc568c1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2fe91a2a-5b02-4767-89ca-7f8954141d90, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS)
Feb 23 09:56:51 np0005626463.localdomain dnsmasq-dhcp[313115]: read /var/lib/neutron/dhcp/2fe91a2a-5b02-4767-89ca-7f8954141d90/host
Feb 23 09:56:51 np0005626463.localdomain dnsmasq-dhcp[313115]: read /var/lib/neutron/dhcp/2fe91a2a-5b02-4767-89ca-7f8954141d90/opts
Feb 23 09:56:51 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:56:51.865 265541 INFO neutron.agent.dhcp.agent [None req-7d76da0b-577a-4df2-a4d3-a0ad80518032 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-23T09:56:50Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f28294344f0>], dns_domain=, dns_name=, extra_dhcp_opts=[<neutron.agent.linux.dhcp.DictModel object at 0x7f2829c4d5b0>, <neutron.agent.linux.dhcp.DictModel object at 0x7f28294341f0>, <neutron.agent.linux.dhcp.DictModel object at 0x7f2829434730>], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f2829c4de20>], id=d84e91b2-8278-4035-a47c-6d8d258897cd, ip_allocation=immediate, mac_address=fa:16:3e:15:53:ce, name=tempest-ExtraDHCPOptionsIpV6TestJSON-7708552, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-23T09:56:47Z, description=, dns_domain=, id=2fe91a2a-5b02-4767-89ca-7f8954141d90, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-ExtraDHCPOptionsIpV6TestJSON-test-network-581439413, port_security_enabled=True, project_id=e8630a66fd9f41828b0bd2cf93b5956f, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=18315, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1450, status=ACTIVE, subnets=['ffdf14ca-f105-4a70-9464-7924b5a5f427'], tags=[], tenant_id=e8630a66fd9f41828b0bd2cf93b5956f, updated_at=2026-02-23T09:56:49Z, vlan_transparent=None, network_id=2fe91a2a-5b02-4767-89ca-7f8954141d90, port_security_enabled=True, project_id=e8630a66fd9f41828b0bd2cf93b5956f, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['ea9a997e-7b09-4599-8d8f-c6dc5472496e'], standard_attr_id=1460, status=DOWN, tags=[], tenant_id=e8630a66fd9f41828b0bd2cf93b5956f, updated_at=2026-02-23T09:56:50Z on network 2fe91a2a-5b02-4767-89ca-7f8954141d90
Feb 23 09:56:51 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:56:51.882 265541 INFO neutron.agent.linux.dhcp [None req-7d76da0b-577a-4df2-a4d3-a0ad80518032 - - - - - -] Cannot apply dhcp option tftp-server because it's ip_version 4 is not in port's address IP versions
Feb 23 09:56:51 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:56:51.883 265541 INFO neutron.agent.linux.dhcp [None req-7d76da0b-577a-4df2-a4d3-a0ad80518032 - - - - - -] Cannot apply dhcp option server-ip-address because it's ip_version 4 is not in port's address IP versions
Feb 23 09:56:51 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:56:51.884 265541 INFO neutron.agent.linux.dhcp [None req-7d76da0b-577a-4df2-a4d3-a0ad80518032 - - - - - -] Cannot apply dhcp option bootfile-name because it's ip_version 4 is not in port's address IP versions
Feb 23 09:56:51 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:56:51.886 163572 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=96b5bb93-7341-4ce6-9b93-6a5de566c711, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '14'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 09:56:51 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:56:51.917 2 INFO neutron.agent.securitygroups_rpc [None req-b1440841-9f18-4923-88ab-a2d6a50a7349 a882fa93577048b68025b6e97dbb9195 e8630a66fd9f41828b0bd2cf93b5956f - - default default] Security group member updated ['ea9a997e-7b09-4599-8d8f-c6dc5472496e']
Feb 23 09:56:51 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:56:51.948 265541 INFO neutron.agent.dhcp.agent [None req-d0f0bbc0-c431-4844-b3ae-9fea310134a7 - - - - - -] DHCP configuration for ports {'4f99192a-5334-414e-8f15-7eeaaea3cb3b'} is completed
Feb 23 09:56:51 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e127 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:56:52 np0005626463.localdomain dnsmasq[313115]: read /var/lib/neutron/dhcp/2fe91a2a-5b02-4767-89ca-7f8954141d90/addn_hosts - 2 addresses
Feb 23 09:56:52 np0005626463.localdomain dnsmasq-dhcp[313115]: read /var/lib/neutron/dhcp/2fe91a2a-5b02-4767-89ca-7f8954141d90/host
Feb 23 09:56:52 np0005626463.localdomain podman[313173]: 2026-02-23 09:56:52.061332713 +0000 UTC m=+0.058572581 container kill a96b696892d823e8b0bba4e645f2a1b2d19750d84c810551e0c6042fbbc568c1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2fe91a2a-5b02-4767-89ca-7f8954141d90, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Feb 23 09:56:52 np0005626463.localdomain dnsmasq-dhcp[313115]: read /var/lib/neutron/dhcp/2fe91a2a-5b02-4767-89ca-7f8954141d90/opts
Feb 23 09:56:52 np0005626463.localdomain ceph-mon[294160]: pgmap v217: 177 pgs: 177 active+clean; 145 MiB data, 778 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:56:52 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:56:52.271 265541 INFO neutron.agent.dhcp.agent [None req-ef822ec0-1efe-424b-b1c8-9db8cf35b866 - - - - - -] DHCP configuration for ports {'d84e91b2-8278-4035-a47c-6d8d258897cd'} is completed
Feb 23 09:56:52 np0005626463.localdomain dnsmasq[313115]: read /var/lib/neutron/dhcp/2fe91a2a-5b02-4767-89ca-7f8954141d90/addn_hosts - 1 addresses
Feb 23 09:56:52 np0005626463.localdomain dnsmasq-dhcp[313115]: read /var/lib/neutron/dhcp/2fe91a2a-5b02-4767-89ca-7f8954141d90/host
Feb 23 09:56:52 np0005626463.localdomain dnsmasq-dhcp[313115]: read /var/lib/neutron/dhcp/2fe91a2a-5b02-4767-89ca-7f8954141d90/opts
Feb 23 09:56:52 np0005626463.localdomain podman[313210]: 2026-02-23 09:56:52.423471971 +0000 UTC m=+0.060187711 container kill a96b696892d823e8b0bba4e645f2a1b2d19750d84c810551e0c6042fbbc568c1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2fe91a2a-5b02-4767-89ca-7f8954141d90, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.vendor=CentOS, tcib_managed=true)
Feb 23 09:56:52 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:56:52.926 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:56:53 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:56:53.114 265541 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-23T09:56:49Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f282937c0a0>], dns_domain=, dns_name=, extra_dhcp_opts=[<neutron.agent.linux.dhcp.DictModel object at 0x7f282937caf0>, <neutron.agent.linux.dhcp.DictModel object at 0x7f282937c2e0>, <neutron.agent.linux.dhcp.DictModel object at 0x7f282937cfa0>], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f282937cb20>], id=4f99192a-5334-414e-8f15-7eeaaea3cb3b, ip_allocation=immediate, mac_address=fa:16:3e:93:f2:6b, name=tempest-new-port-name-2068423017, network_id=2fe91a2a-5b02-4767-89ca-7f8954141d90, port_security_enabled=True, project_id=e8630a66fd9f41828b0bd2cf93b5956f, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=2, security_groups=['ea9a997e-7b09-4599-8d8f-c6dc5472496e'], standard_attr_id=1459, status=DOWN, tags=[], tenant_id=e8630a66fd9f41828b0bd2cf93b5956f, updated_at=2026-02-23T09:56:52Z on network 2fe91a2a-5b02-4767-89ca-7f8954141d90
Feb 23 09:56:53 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:56:53.130 265541 INFO neutron.agent.linux.dhcp [-] Cannot apply dhcp option server-ip-address because it's ip_version 4 is not in port's address IP versions
Feb 23 09:56:53 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:56:53.131 265541 INFO neutron.agent.linux.dhcp [-] Cannot apply dhcp option tftp-server because it's ip_version 4 is not in port's address IP versions
Feb 23 09:56:53 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:56:53.131 265541 INFO neutron.agent.linux.dhcp [-] Cannot apply dhcp option bootfile-name because it's ip_version 4 is not in port's address IP versions
Feb 23 09:56:53 np0005626463.localdomain dnsmasq[313115]: read /var/lib/neutron/dhcp/2fe91a2a-5b02-4767-89ca-7f8954141d90/addn_hosts - 1 addresses
Feb 23 09:56:53 np0005626463.localdomain dnsmasq-dhcp[313115]: read /var/lib/neutron/dhcp/2fe91a2a-5b02-4767-89ca-7f8954141d90/host
Feb 23 09:56:53 np0005626463.localdomain podman[313250]: 2026-02-23 09:56:53.304668696 +0000 UTC m=+0.060684876 container kill a96b696892d823e8b0bba4e645f2a1b2d19750d84c810551e0c6042fbbc568c1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2fe91a2a-5b02-4767-89ca-7f8954141d90, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, org.label-schema.build-date=20260216)
Feb 23 09:56:53 np0005626463.localdomain dnsmasq-dhcp[313115]: read /var/lib/neutron/dhcp/2fe91a2a-5b02-4767-89ca-7f8954141d90/opts
Feb 23 09:56:53 np0005626463.localdomain systemd[1]: tmp-crun.qhvDwc.mount: Deactivated successfully.
Feb 23 09:56:53 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:56:53.525 265541 INFO neutron.agent.dhcp.agent [None req-234108c3-b184-4e9a-ae6d-ad34f2539da2 - - - - - -] DHCP configuration for ports {'4f99192a-5334-414e-8f15-7eeaaea3cb3b'} is completed
Feb 23 09:56:53 np0005626463.localdomain ceph-mon[294160]: pgmap v218: 177 pgs: 177 active+clean; 145 MiB data, 778 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:56:53 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.
Feb 23 09:56:53 np0005626463.localdomain systemd[1]: tmp-crun.kj9Ul1.mount: Deactivated successfully.
Feb 23 09:56:53 np0005626463.localdomain podman[313272]: 2026-02-23 09:56:53.909895825 +0000 UTC m=+0.086809513 container health_status da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 23 09:56:53 np0005626463.localdomain podman[313272]: 2026-02-23 09:56:53.920224603 +0000 UTC m=+0.097138271 container exec_died da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 23 09:56:53 np0005626463.localdomain systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Deactivated successfully.
Feb 23 09:56:54 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:56:54.217 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:56:55 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:56:55.381 2 INFO neutron.agent.securitygroups_rpc [None req-7f69dcdd-1ed9-444c-bbd4-b59ea7457a69 a882fa93577048b68025b6e97dbb9195 e8630a66fd9f41828b0bd2cf93b5956f - - default default] Security group member updated ['ea9a997e-7b09-4599-8d8f-c6dc5472496e']
Feb 23 09:56:55 np0005626463.localdomain podman[313312]: 2026-02-23 09:56:55.602755826 +0000 UTC m=+0.061171561 container kill a96b696892d823e8b0bba4e645f2a1b2d19750d84c810551e0c6042fbbc568c1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2fe91a2a-5b02-4767-89ca-7f8954141d90, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, org.label-schema.build-date=20260216, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Feb 23 09:56:55 np0005626463.localdomain dnsmasq[313115]: read /var/lib/neutron/dhcp/2fe91a2a-5b02-4767-89ca-7f8954141d90/addn_hosts - 0 addresses
Feb 23 09:56:55 np0005626463.localdomain dnsmasq-dhcp[313115]: read /var/lib/neutron/dhcp/2fe91a2a-5b02-4767-89ca-7f8954141d90/host
Feb 23 09:56:55 np0005626463.localdomain dnsmasq-dhcp[313115]: read /var/lib/neutron/dhcp/2fe91a2a-5b02-4767-89ca-7f8954141d90/opts
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.142 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'name': 'test', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000003', 'OS-EXT-SRV-ATTR:host': 'np0005626463.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '37b8098efb0d4ecc90b451a2db0e966f', 'user_id': 'cb6895487918456aa599ca2f76872d00', 'hostId': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.143 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.147 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.150 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '182a5f29-179f-43ec-b134-057a9b3fac25', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:56:56.143353', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': 'fcbf0cdc-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12056.332842681, 'message_signature': '6194f0052505030bd3519d93021853ba22b0b090be6798160ace84ce9d8700a2'}]}, 'timestamp': '2026-02-23 09:56:56.148818', '_unique_id': '5a6c332814c944f08eb2473c599ae688'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.150 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.150 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.150 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.150 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.150 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.150 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.150 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.150 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.150 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.150 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.150 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.150 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.150 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.150 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.150 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.150 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.150 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.150 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.150 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.150 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.150 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.150 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.150 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.150 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.150 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.150 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.150 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.150 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.150 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.150 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.150 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.151 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.181 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.181 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.183 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c1b78942-3f92-4b55-94d1-fab77d176ba4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:56:56.151945', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'fcc416b4-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12056.341605851, 'message_signature': '8c142561203ec2b5fec368456ac2f750b06efca8d70e92d20e5c12128dc8a49d'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:56:56.151945', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'fcc42f32-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12056.341605851, 'message_signature': '299ede49d855797d836b24e6a7a116e24df3e569c267cae52b84c1cc6ffcb855'}]}, 'timestamp': '2026-02-23 09:56:56.182391', '_unique_id': '24182d04d8634dceba034eb1237fe821'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.183 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.183 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.183 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.183 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.183 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.183 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.183 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.183 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.183 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.183 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.183 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.183 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.183 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.183 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.183 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.183 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.183 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.183 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.183 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.183 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.183 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.183 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.183 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.183 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.183 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.183 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.183 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.183 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.183 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.183 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.183 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.185 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.194 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.194 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.196 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '61af4262-028f-40ca-be6b-688bc93c2fb3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:56:56.185214', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'fcc60ece-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12056.374714634, 'message_signature': 'b607776cc9865ecb4916baab60e1772f4460e66568bf88e174753882e925d928'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 
'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:56:56.185214', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'fcc6217a-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12056.374714634, 'message_signature': '55aa50982b54a718da044b05df746454cd7625d466d627161612536571186656'}]}, 'timestamp': '2026-02-23 09:56:56.195133', '_unique_id': '251d2fa854d4417dbc8f19b163fb8e67'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.196 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.196 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.196 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.196 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.196 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.196 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.196 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.196 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.196 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.196 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.196 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.196 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.196 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.196 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.196 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.196 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.196 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.196 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.196 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.196 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.196 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.196 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.196 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.196 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.196 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.196 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.196 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.196 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.196 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.196 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.196 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.197 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.197 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.197 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.198 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.198 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.199 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '88ba2dd9-4868-4ef7-b497-76e2fec1936b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:56:56.197972', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'fcc6a424-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12056.374714634, 'message_signature': 'ee7fd69b86bc7694fb5f8cec63c9fa3b047447938ef0c4a1793cf0bdd064694f'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 
'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:56:56.197972', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'fcc6b554-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12056.374714634, 'message_signature': '38695524ae6a3da7519614866bd3233b01fdd105a9b278904b41ade224ca9d8d'}]}, 'timestamp': '2026-02-23 09:56:56.198930', '_unique_id': '35e18140c057497d94a7cc6191560aba'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.199 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.199 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.199 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.199 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.199 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.199 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.199 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.199 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.199 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.199 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.199 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.199 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.199 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.199 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.199 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.199 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.199 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.199 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.199 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.199 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.199 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.199 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.199 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.199 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.199 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.199 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.199 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.199 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.199 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.199 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.199 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.201 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.201 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.203 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a055a7d7-4096-4049-9960-4ccf233328b7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:56:56.201568', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': 'fcc73600-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12056.332842681, 'message_signature': '23f0088a78569f6a1e54853b5e39d6003731fa0db7cd8e473ccfa173b223fd21'}]}, 'timestamp': '2026-02-23 09:56:56.202361', '_unique_id': '5837282362cf4d1bad08183a1f1849b3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.203 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.203 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.203 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.203 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.203 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.203 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.203 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.203 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.203 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.203 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.203 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.203 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.203 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.203 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.203 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.203 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.203 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.203 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.203 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.203 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.203 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.203 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.203 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.203 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.203 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.203 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.203 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.203 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.203 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.203 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.203 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.205 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.205 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.bytes volume: 35597312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.205 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.206 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cb10808a-6306-4a91-8713-1a3cb86bf505', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35597312, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:56:56.205586', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'fcc7c87c-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12056.341605851, 'message_signature': '6babd20c1e088b50735303df2cbc9219990b19a333379d8fac875316b3b762a7'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 
'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:56:56.205586', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'fcc7d4b6-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12056.341605851, 'message_signature': '96cb1faa2cb55abd599cb6acc85559f04e3b0e935899c22be1e7071f74aacd5c'}]}, 'timestamp': '2026-02-23 09:56:56.206194', '_unique_id': 'f01b0785c0314633acfb3f2bcd20ee58'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.206 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.206 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.206 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.206 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.206 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.206 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.206 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.206 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.206 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.206 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.206 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.206 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.206 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.206 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.206 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.206 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.206 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.206 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.206 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.206 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.206 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.206 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.206 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.206 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.206 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.206 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.206 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.206 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.206 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.206 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.206 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.207 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.207 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.208 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e691f2d7-88db-435b-91e3-72a62913a6ce', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:56:56.207947', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': 'fcc824ca-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12056.332842681, 'message_signature': '8ea2b04db810df805269e6cd833e364a288d475993ef15ecf6e1c07907a43f7f'}]}, 'timestamp': '2026-02-23 09:56:56.208263', '_unique_id': 'e4f34c3c52fd461cbe36b73109f3476b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.208 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.208 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.208 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.208 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.208 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.208 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.208 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.208 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.208 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.208 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.208 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.208 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.208 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.208 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.208 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.208 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.208 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.208 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.208 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.208 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.208 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.208 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.208 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.208 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.208 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.208 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.208 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.208 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.208 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.208 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.208 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.209 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.209 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.210 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.210 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1a077978-79a7-47b8-a8b7-56b7d019bf94', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:56:56.209694', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'fcc868d6-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12056.374714634, 'message_signature': '64371352c826cba8e0707c00c36567a52eb7f983ba30b98690f26186db2542aa'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 
'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:56:56.209694', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'fcc8756a-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12056.374714634, 'message_signature': 'cf40f1b50719a6a354cef93960760b7b0d366e99fdaa5ff07f70d74481f65756'}]}, 'timestamp': '2026-02-23 09:56:56.210311', '_unique_id': '9f9d378e2a2e4f158c4cb9dcb33693b2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.210 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.210 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.210 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.210 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.210 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.210 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.210 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.210 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.210 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.210 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.210 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.210 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.210 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.210 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.210 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.210 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.210 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.210 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.210 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.210 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.210 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.210 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.210 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.210 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.210 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.210 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.210 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.210 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.210 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.210 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.210 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.211 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.212 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.212 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.213 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '632651f9-5f27-4ccc-b991-85b912c06212', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:56:56.212223', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': 'fcc8ce98-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12056.332842681, 'message_signature': '1f50af63fbbebec0ae311e9a7c1d40f34400c9a5f883ebccd56541f263857d0b'}]}, 'timestamp': '2026-02-23 09:56:56.212614', '_unique_id': '83bb708883f34e889df3d297ee222d77'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.213 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.213 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.213 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.213 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.213 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.213 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.213 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.213 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.213 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.213 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.213 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.213 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.213 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.213 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.213 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.213 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.213 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.213 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.213 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.213 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.213 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.213 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.213 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.213 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.213 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.213 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.213 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.213 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.213 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.213 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.213 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.214 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Feb 23 09:56:56 np0005626463.localdomain ceph-mon[294160]: pgmap v219: 177 pgs: 177 active+clean; 145 MiB data, 778 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.230 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/memory.usage volume: 51.72265625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.231 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8e5269a2-9a9e-4598-85c3-71c96e48621b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.72265625, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'timestamp': '2026-02-23T09:56:56.214110', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': 'fccb94c0-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12056.419708015, 'message_signature': 'ed77c9f4a29b5952c95b0ccd98badfdd5a09bce4e42a1b4998f1c0693601c9aa'}]}, 'timestamp': '2026-02-23 09:56:56.230785', '_unique_id': '9e7d78e5b21b4be6990d635dd9e6876e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.231 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.231 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.231 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.231 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.231 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.231 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.231 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.231 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.231 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.231 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.231 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.231 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.231 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.231 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.231 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.231 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.231 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.231 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.231 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.231 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.231 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.231 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.231 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.231 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.231 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.231 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.231 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.231 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.231 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.231 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.231 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.232 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.232 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/cpu volume: 13700000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.232 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6b6f5c72-fe4f-48cc-973c-7ffb24cf45ed', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 13700000000, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'timestamp': '2026-02-23T09:56:56.232342', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': 'fccbdc0a-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12056.419708015, 'message_signature': '77e25bc5a17c50048a2571569f9c094afd818d1c62ea111872812fefe9c2acc7'}]}, 'timestamp': '2026-02-23 09:56:56.232549', '_unique_id': '06c2a898dec0488fa23ff2506741f42a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.232 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.232 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.232 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.232 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.232 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.232 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.232 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.232 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.232 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.232 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.232 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.232 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.232 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.232 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.232 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.232 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.232 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.232 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.232 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.232 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.232 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.232 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.232 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.232 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.232 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.232 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.232 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.232 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.232 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.232 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.232 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.233 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.233 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.234 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1edb2890-7ebc-4b49-a7aa-42ba1822fd6e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:56:56.233500', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': 'fccc095a-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12056.332842681, 'message_signature': 'cefa7aacf0009fad796f984e5abcbb3fd03a4257f75c7d9ea63e57a704e69800'}]}, 'timestamp': '2026-02-23 09:56:56.233714', '_unique_id': 'd8e8179a5c6b46ae871bc82dd8479bed'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.234 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.234 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.234 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.234 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.234 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.234 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.234 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.234 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.234 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.234 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.234 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.234 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.234 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.234 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.234 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.234 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.234 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.234 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.234 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.234 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.234 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.234 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.234 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.234 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.234 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.234 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.234 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.234 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.234 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.234 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.234 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.234 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.234 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.234 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.latency volume: 1054797520 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.234 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.latency volume: 21338362 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.235 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '00bd1090-0aa6-430b-a47a-c5fe5714dc59', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1054797520, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:56:56.234760', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'fccc3a56-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12056.341605851, 'message_signature': '3e2c3326a8927a4f233a899e4634e8893e3fd241045555486a85a3fa44cbcc73'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 21338362, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:56:56.234760', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'fccc4276-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12056.341605851, 'message_signature': 'ac5986d178ec894a3bbd764ccf4378c116a036c3424ef7feab5d48a00af6711b'}]}, 'timestamp': '2026-02-23 09:56:56.235162', '_unique_id': 'a1267f309e57426eb5121827c4e3c4a0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.235 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.235 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.235 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.235 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.235 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.235 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.235 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.235 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.235 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.235 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.235 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.235 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.235 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.235 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.235 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.235 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.235 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.235 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.235 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.235 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.235 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.235 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.235 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.235 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.235 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.235 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.235 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.235 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.235 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.235 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.235 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.236 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.236 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.236 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '59e39053-beb3-43ad-a797-f41f56bf7f73', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:56:56.236129', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': 'fccc6fda-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12056.332842681, 'message_signature': '047a2796f43520a1ba449f7fbe072f0ee9365138d71d429734719079ba1136c0'}]}, 'timestamp': '2026-02-23 09:56:56.236337', '_unique_id': 'a615d1f5095743e98c2c8c30b8ea9651'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.236 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.236 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.236 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.236 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.236 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.236 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.236 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.236 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.236 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.236 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.236 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.236 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.236 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.236 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.236 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.236 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.236 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.236 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.236 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.236 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.236 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.236 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.236 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.236 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.236 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.236 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.236 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.236 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.236 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.236 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.236 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.237 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.237 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.237 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '410da362-33ab-4de3-8359-f0269045e4f4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:56:56.237288', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': 'fccc9d16-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12056.332842681, 'message_signature': 'd856f632dac75183d63cf6a266c33ef7ab9d0822e985cbd43a04e6587a99707d'}]}, 'timestamp': '2026-02-23 09:56:56.237496', '_unique_id': '968648cdd6154eca813e82812cc224c2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.237 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.237 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.237 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.237 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.237 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.237 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.237 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.237 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.237 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.237 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.237 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.237 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.237 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.237 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.237 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.237 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.237 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.237 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.237 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.237 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.237 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.237 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.237 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.237 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.237 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.237 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.237 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.237 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.237 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.237 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.237 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.238 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.238 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.239 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '55223e62-4424-4f26-b20c-2b28d73f19db', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:56:56.238452', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': 'fcccca98-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12056.332842681, 'message_signature': '8c292fbc0cc199012fb461b2a8a26f5192eb613694006581b008420442a5f88f'}]}, 'timestamp': '2026-02-23 09:56:56.238661', '_unique_id': '55a5555339de4969a637ce59694fbf0f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.239 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.239 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.239 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.239 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.239 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.239 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.239 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.239 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.239 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.239 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.239 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.239 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.239 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.239 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.239 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.239 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.239 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.239 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.239 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.239 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.239 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.239 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.239 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.239 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.239 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.239 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.239 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.239 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.239 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.239 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.239 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.239 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.239 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.bytes volume: 397312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.239 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.240 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '98f1b036-8927-4d72-bdf9-4b27aa9ea2ea', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 397312, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:56:56.239601', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'fcccf75c-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12056.341605851, 'message_signature': 'cd7c6fa981d4acc3b795b3f57fd737cb5ca10fcf81fe557ecec16f5017b31fae'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 
'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:56:56.239601', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'fcccff18-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12056.341605851, 'message_signature': '7444efe10927925292e19018d94d342456484dfe728649588ff0dbfce84feb70'}]}, 'timestamp': '2026-02-23 09:56:56.239993', '_unique_id': '70a29b9903554df2ab004c29aa4b4acc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.240 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.240 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.240 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.240 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.240 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.240 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.240 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.240 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.240 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.240 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.240 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.240 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.240 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.240 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.240 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.240 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.240 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.240 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.240 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.240 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.240 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.240 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.240 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.240 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.240 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.240 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.240 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.240 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.240 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.240 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.240 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.240 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.240 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.requests volume: 1283 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.241 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.241 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '112f18bd-2243-4656-97a8-e71d73a58571', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1283, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:56:56.240955', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'fccd2c5e-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12056.341605851, 'message_signature': 'f62585010c67e6e3cc6b9311d5b2b0b5edfa3ec7efb773a269f55df182ca4b95'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 
'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:56:56.240955', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'fccd3384-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12056.341605851, 'message_signature': '7cafdff3fce54c33429659dd61013c32a2e5bd9ffd514ba2a3af8e93a7ea52a4'}]}, 'timestamp': '2026-02-23 09:56:56.241333', '_unique_id': '001868957c5846a9a72f90e678cdd0e1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.241 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.241 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.241 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.241 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.241 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.241 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.241 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.241 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.241 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.241 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.241 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.241 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.241 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.241 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.241 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.241 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.241 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.241 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.241 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.241 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.241 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.241 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.241 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.241 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.241 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.241 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.241 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.241 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.241 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.241 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.241 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.242 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.242 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.bytes volume: 6808 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.242 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '94d3ebad-6450-495e-bc2e-4c09962c479d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6808, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:56:56.242309', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': 'fccd6142-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12056.332842681, 'message_signature': '0515c2e51e4b619907699068af492a9bfb0f5cc49ae8ba7b35897c32443ee902'}]}, 'timestamp': '2026-02-23 09:56:56.242517', '_unique_id': '6ab435671e4a40e1aa19cbc4e27d0605'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.242 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.242 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.242 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.242 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.242 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.242 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.242 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.242 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.242 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.242 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.242 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.242 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.242 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.242 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.242 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.242 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.242 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.242 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.242 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.242 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.242 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.242 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.242 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.242 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.242 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.242 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.242 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.242 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.242 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.242 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.242 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.243 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.243 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.244 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e730e557-7738-4936-a7ac-5c9f76772ab5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:56:56.243447', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': 'fccd8dac-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12056.332842681, 'message_signature': '0b2124eb8fc21bffa71622127f5f0f55dedf757b810426415b676441a0cd176d'}]}, 'timestamp': '2026-02-23 09:56:56.243654', '_unique_id': 'b8dcfb95008f4ef4837af2f90408b2a1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.244 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.244 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.244 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.244 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.244 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.244 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.244 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.244 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.244 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.244 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.244 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.244 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.244 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.244 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.244 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.244 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.244 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.244 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.244 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.244 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.244 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.244 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.244 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.244 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.244 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.244 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.244 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.244 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.244 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.244 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.244 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.244 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.244 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.latency volume: 1374424344 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.245 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.latency volume: 89322858 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.245 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '42f90fab-d961-46e5-a769-daaf64b110f7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1374424344, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:56:56.244768', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'fccdc1d2-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12056.341605851, 'message_signature': '2421ac46e7a1eeaf960243dd59bc72cd403ed406a5ac80c232ca083007784916'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 89322858, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:56:56.244768', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'fccdca2e-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12056.341605851, 'message_signature': 'cbdbee54cd02b7762607bec45c931d366d12e45775e123576a299c85264e1091'}]}, 'timestamp': '2026-02-23 09:56:56.245190', '_unique_id': '37bc28ea8c474e9ab38487d0dac7d098'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.245 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.245 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.245 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.245 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.245 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.245 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.245 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.245 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.245 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.245 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.245 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.245 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.245 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.245 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.245 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.245 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.245 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.245 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.245 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.245 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.245 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.245 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.245 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.245 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.245 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.245 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.245 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.245 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.245 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.245 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:56:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.245 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:56:56 np0005626463.localdomain dnsmasq[313115]: exiting on receipt of SIGTERM
Feb 23 09:56:56 np0005626463.localdomain podman[313349]: 2026-02-23 09:56:56.616563589 +0000 UTC m=+0.061751509 container kill a96b696892d823e8b0bba4e645f2a1b2d19750d84c810551e0c6042fbbc568c1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2fe91a2a-5b02-4767-89ca-7f8954141d90, org.label-schema.build-date=20260216, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 23 09:56:56 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.
Feb 23 09:56:56 np0005626463.localdomain systemd[1]: libpod-a96b696892d823e8b0bba4e645f2a1b2d19750d84c810551e0c6042fbbc568c1.scope: Deactivated successfully.
Feb 23 09:56:56 np0005626463.localdomain podman[313362]: 2026-02-23 09:56:56.682451984 +0000 UTC m=+0.053857535 container died a96b696892d823e8b0bba4e645f2a1b2d19750d84c810551e0c6042fbbc568c1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2fe91a2a-5b02-4767-89ca-7f8954141d90, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20260216, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.43.0)
Feb 23 09:56:56 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a96b696892d823e8b0bba4e645f2a1b2d19750d84c810551e0c6042fbbc568c1-userdata-shm.mount: Deactivated successfully.
Feb 23 09:56:56 np0005626463.localdomain podman[313362]: 2026-02-23 09:56:56.718031623 +0000 UTC m=+0.089437094 container cleanup a96b696892d823e8b0bba4e645f2a1b2d19750d84c810551e0c6042fbbc568c1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2fe91a2a-5b02-4767-89ca-7f8954141d90, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb)
Feb 23 09:56:56 np0005626463.localdomain systemd[1]: libpod-conmon-a96b696892d823e8b0bba4e645f2a1b2d19750d84c810551e0c6042fbbc568c1.scope: Deactivated successfully.
Feb 23 09:56:56 np0005626463.localdomain podman[313364]: 2026-02-23 09:56:56.739218598 +0000 UTC m=+0.097642987 container remove a96b696892d823e8b0bba4e645f2a1b2d19750d84c810551e0c6042fbbc568c1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2fe91a2a-5b02-4767-89ca-7f8954141d90, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Feb 23 09:56:56 np0005626463.localdomain kernel: device tap6b8e941b-e3 left promiscuous mode
Feb 23 09:56:56 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:56:56Z|00200|binding|INFO|Releasing lport 6b8e941b-e318-43c8-8da1-efc8c08d0ac8 from this chassis (sb_readonly=0)
Feb 23 09:56:56 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:56:56Z|00201|binding|INFO|Setting lport 6b8e941b-e318-43c8-8da1-efc8c08d0ac8 down in Southbound
Feb 23 09:56:56 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:56:56.776 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:56:56 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:56:56.782 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005626463.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpfb23302c-55c1-5de0-badf-4fc1ff22837a-2fe91a2a-5b02-4767-89ca-7f8954141d90', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2fe91a2a-5b02-4767-89ca-7f8954141d90', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e8630a66fd9f41828b0bd2cf93b5956f', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005626463.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4c6d3781-7aae-4474-bf2c-0e950a13f37c, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f808c075610>], logical_port=6b8e941b-e318-43c8-8da1-efc8c08d0ac8) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f808c075610>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 09:56:56 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:56:56.784 163572 INFO neutron.agent.ovn.metadata.agent [-] Port 6b8e941b-e318-43c8-8da1-efc8c08d0ac8 in datapath 2fe91a2a-5b02-4767-89ca-7f8954141d90 unbound from our chassis
Feb 23 09:56:56 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:56:56.785 163572 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 2fe91a2a-5b02-4767-89ca-7f8954141d90 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Feb 23 09:56:56 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:56:56.785 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[a8393d00-fe05-4f60-a0dc-40681a74f55d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 09:56:56 np0005626463.localdomain podman[313365]: 2026-02-23 09:56:56.786022864 +0000 UTC m=+0.149073307 container health_status 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, release=1770267347, com.redhat.component=ubi9-minimal-container, vcs-type=git, architecture=x86_64, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, name=ubi9/ubi-minimal, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-02-05T04:57:10Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2026-02-05T04:57:10Z, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.7, io.openshift.expose-services=, io.buildah.version=1.33.7, config_id=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public)
Feb 23 09:56:56 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:56:56.800 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:56:56 np0005626463.localdomain podman[313365]: 2026-02-23 09:56:56.825520114 +0000 UTC m=+0.188570507 container exec_died 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, managed_by=edpm_ansible, build-date=2026-02-05T04:57:10Z, vcs-type=git, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.7, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9/ubi-minimal, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, release=1770267347, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, org.opencontainers.image.created=2026-02-05T04:57:10Z, com.redhat.component=ubi9-minimal-container)
Feb 23 09:56:56 np0005626463.localdomain systemd[1]: 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.service: Deactivated successfully.
Feb 23 09:56:56 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:56:56.865 265541 INFO neutron.agent.dhcp.agent [None req-abb12fa1-32ad-49de-b347-2217f17a4166 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 23 09:56:56 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e127 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:56:57 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-bb5a3a963be7354e78652f4a92c2c34304d37c406133dfb79b7eec719c4f26a2-merged.mount: Deactivated successfully.
Feb 23 09:56:57 np0005626463.localdomain systemd[1]: run-netns-qdhcp\x2d2fe91a2a\x2d5b02\x2d4767\x2d89ca\x2d7f8954141d90.mount: Deactivated successfully.
Feb 23 09:56:57 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:56:57.823 265541 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Feb 23 09:56:57 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:56:57.964 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:56:58 np0005626463.localdomain ceph-mon[294160]: pgmap v220: 177 pgs: 177 active+clean; 145 MiB data, 778 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:56:58 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:56:58.199 2 INFO neutron.agent.securitygroups_rpc [None req-527daa95-1754-43ce-9cc1-4adbaa0b338f d6a332b0f2e446a68dc6c19280644090 2207de28dcd245d2b198a56e6161001a - - default default] Security group member updated ['ba10f066-a353-4b6a-98b8-dab53422ee14']
Feb 23 09:56:58 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:56:58.216 2 INFO neutron.agent.securitygroups_rpc [None req-7c0e711d-1075-4bc5-843f-4c1a8d666461 7e2d209f03d04e6d86f4e10f7490cf37 8532226521ac43ca82723a0b71168e03 - - default default] Security group member updated ['709ad995-bfde-4096-a0b4-2ba30248a611']
Feb 23 09:56:58 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:56:58Z|00202|binding|INFO|Releasing lport 4143c8ea-7577-4792-9744-bcff90eb20f2 from this chassis (sb_readonly=0)
Feb 23 09:56:58 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:56:58.725 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:56:58 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:56:58.871 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:57:00 np0005626463.localdomain ceph-mon[294160]: pgmap v221: 177 pgs: 177 active+clean; 145 MiB data, 778 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:57:00 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:57:00Z|00203|binding|INFO|Releasing lport 4143c8ea-7577-4792-9744-bcff90eb20f2 from this chassis (sb_readonly=0)
Feb 23 09:57:00 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:57:00.413 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:57:01 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:57:01Z|00204|binding|INFO|Releasing lport 4143c8ea-7577-4792-9744-bcff90eb20f2 from this chassis (sb_readonly=0)
Feb 23 09:57:01 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:57:01.132 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:57:01 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:57:01.837 2 INFO neutron.agent.securitygroups_rpc [None req-7a651b51-72a3-44e2-bd5a-07e4363f8263 3247f4a1ec054de78018b025a6933ab5 13ab81953d004010a22a72d978d31c4d - - default default] Security group member updated ['b31260d7-60e9-40f6-abcc-b3b02fd41e3c']
Feb 23 09:57:01 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e127 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:57:02 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:57:02.041 2 INFO neutron.agent.securitygroups_rpc [None req-861b86c2-eda9-4017-bb33-3609a7ddd89c d6a332b0f2e446a68dc6c19280644090 2207de28dcd245d2b198a56e6161001a - - default default] Security group member updated ['ba10f066-a353-4b6a-98b8-dab53422ee14']
Feb 23 09:57:02 np0005626463.localdomain ceph-mon[294160]: pgmap v222: 177 pgs: 177 active+clean; 145 MiB data, 778 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:57:02 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:57:02.967 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:57:04 np0005626463.localdomain ceph-mon[294160]: pgmap v223: 177 pgs: 177 active+clean; 145 MiB data, 778 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:57:04 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/2669093195' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 23 09:57:04 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/2669093195' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 23 09:57:05 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:57:05.231 265541 INFO neutron.agent.linux.ip_lib [None req-2f446502-99ed-410f-a2d7-3fc6d870c5d9 - - - - - -] Device tap73a840c6-4a cannot be used as it has no MAC address
Feb 23 09:57:05 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:57:05.250 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:57:05 np0005626463.localdomain kernel: device tap73a840c6-4a entered promiscuous mode
Feb 23 09:57:05 np0005626463.localdomain NetworkManager[5974]: <info>  [1771840625.2577] manager: (tap73a840c6-4a): new Generic device (/org/freedesktop/NetworkManager/Devices/34)
Feb 23 09:57:05 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:57:05.258 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:57:05 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:57:05Z|00205|binding|INFO|Claiming lport 73a840c6-4a09-4a70-b6d2-862b377cac9d for this chassis.
Feb 23 09:57:05 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:57:05Z|00206|binding|INFO|73a840c6-4a09-4a70-b6d2-862b377cac9d: Claiming unknown
Feb 23 09:57:05 np0005626463.localdomain systemd-udevd[313421]: Network interface NamePolicy= disabled on kernel command line.
Feb 23 09:57:05 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:57:05.270 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005626463.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpfb23302c-55c1-5de0-badf-4fc1ff22837a-5c327632-3675-4ada-bcc8-d5fb15ecb5d7', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5c327632-3675-4ada-bcc8-d5fb15ecb5d7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8aef873a00904cab867a4692ec3a78cb', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3ce95503-eca6-4225-b8ee-cabdbb8ec29c, chassis=[<ovs.db.idl.Row object at 0x7f808c075610>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f808c075610>], logical_port=73a840c6-4a09-4a70-b6d2-862b377cac9d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 09:57:05 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:57:05.278 163572 INFO neutron.agent.ovn.metadata.agent [-] Port 73a840c6-4a09-4a70-b6d2-862b377cac9d in datapath 5c327632-3675-4ada-bcc8-d5fb15ecb5d7 bound to our chassis
Feb 23 09:57:05 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:57:05.280 163572 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 5c327632-3675-4ada-bcc8-d5fb15ecb5d7 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Feb 23 09:57:05 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:57:05.281 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[b6ccee22-37fe-4862-b839-b03c4c61d631]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 09:57:05 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap73a840c6-4a: No such device
Feb 23 09:57:05 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:57:05Z|00207|binding|INFO|Setting lport 73a840c6-4a09-4a70-b6d2-862b377cac9d ovn-installed in OVS
Feb 23 09:57:05 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:57:05Z|00208|binding|INFO|Setting lport 73a840c6-4a09-4a70-b6d2-862b377cac9d up in Southbound
Feb 23 09:57:05 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:57:05.294 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:57:05 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:57:05.298 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:57:05 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:57:05.300 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:57:05 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap73a840c6-4a: No such device
Feb 23 09:57:05 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap73a840c6-4a: No such device
Feb 23 09:57:05 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap73a840c6-4a: No such device
Feb 23 09:57:05 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap73a840c6-4a: No such device
Feb 23 09:57:05 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap73a840c6-4a: No such device
Feb 23 09:57:05 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap73a840c6-4a: No such device
Feb 23 09:57:05 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap73a840c6-4a: No such device
Feb 23 09:57:05 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:57:05.340 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:57:05 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:57:05.369 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:57:05 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:57:05.519 2 INFO neutron.agent.securitygroups_rpc [None req-12c23e79-c442-4ba1-a515-53d2f774a574 7e2d209f03d04e6d86f4e10f7490cf37 8532226521ac43ca82723a0b71168e03 - - default default] Security group member updated ['709ad995-bfde-4096-a0b4-2ba30248a611']
Feb 23 09:57:06 np0005626463.localdomain podman[313492]: 2026-02-23 09:57:06.214309464 +0000 UTC m=+0.087610128 container create 01161c4a98a010c56a7785a367a22f589e24b894de2e0a91b7997e72600f5882 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5c327632-3675-4ada-bcc8-d5fb15ecb5d7, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS)
Feb 23 09:57:06 np0005626463.localdomain systemd[1]: Started libpod-conmon-01161c4a98a010c56a7785a367a22f589e24b894de2e0a91b7997e72600f5882.scope.
Feb 23 09:57:06 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 09:57:06 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/658cf474e63c73733b33588ec74e8a0b39a8eb6542d46aec9fd53fcc5cc3218f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 23 09:57:06 np0005626463.localdomain podman[313492]: 2026-02-23 09:57:06.182236752 +0000 UTC m=+0.055537416 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 23 09:57:06 np0005626463.localdomain ceph-mon[294160]: pgmap v224: 177 pgs: 177 active+clean; 145 MiB data, 778 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:57:06 np0005626463.localdomain podman[313492]: 2026-02-23 09:57:06.28347073 +0000 UTC m=+0.156771384 container init 01161c4a98a010c56a7785a367a22f589e24b894de2e0a91b7997e72600f5882 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5c327632-3675-4ada-bcc8-d5fb15ecb5d7, org.label-schema.build-date=20260216, tcib_managed=true, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Feb 23 09:57:06 np0005626463.localdomain podman[313492]: 2026-02-23 09:57:06.292057096 +0000 UTC m=+0.165357750 container start 01161c4a98a010c56a7785a367a22f589e24b894de2e0a91b7997e72600f5882 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5c327632-3675-4ada-bcc8-d5fb15ecb5d7, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20260216, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Feb 23 09:57:06 np0005626463.localdomain dnsmasq[313510]: started, version 2.85 cachesize 150
Feb 23 09:57:06 np0005626463.localdomain dnsmasq[313510]: DNS service limited to local subnets
Feb 23 09:57:06 np0005626463.localdomain dnsmasq[313510]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 23 09:57:06 np0005626463.localdomain dnsmasq[313510]: warning: no upstream servers configured
Feb 23 09:57:06 np0005626463.localdomain dnsmasq-dhcp[313510]: DHCP, static leases only on 10.100.0.0, lease time 1d
Feb 23 09:57:06 np0005626463.localdomain dnsmasq[313510]: read /var/lib/neutron/dhcp/5c327632-3675-4ada-bcc8-d5fb15ecb5d7/addn_hosts - 0 addresses
Feb 23 09:57:06 np0005626463.localdomain dnsmasq-dhcp[313510]: read /var/lib/neutron/dhcp/5c327632-3675-4ada-bcc8-d5fb15ecb5d7/host
Feb 23 09:57:06 np0005626463.localdomain dnsmasq-dhcp[313510]: read /var/lib/neutron/dhcp/5c327632-3675-4ada-bcc8-d5fb15ecb5d7/opts
Feb 23 09:57:06 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:57:06.454 265541 INFO neutron.agent.dhcp.agent [None req-aaa66832-cd7b-4b68-b1e2-5ff8487e1f57 - - - - - -] DHCP configuration for ports {'237004ec-c4e6-4656-8fdf-edc428916b84'} is completed
Feb 23 09:57:06 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e127 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:57:07 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.
Feb 23 09:57:07 np0005626463.localdomain podman[313511]: 2026-02-23 09:57:07.155115381 +0000 UTC m=+0.078806287 container health_status bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Feb 23 09:57:07 np0005626463.localdomain podman[313511]: 2026-02-23 09:57:07.171189717 +0000 UTC m=+0.094880613 container exec_died bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 23 09:57:07 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.
Feb 23 09:57:07 np0005626463.localdomain systemd[1]: bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.service: Deactivated successfully.
Feb 23 09:57:07 np0005626463.localdomain podman[313532]: 2026-02-23 09:57:07.24183849 +0000 UTC m=+0.056355233 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, io.buildah.version=1.43.0, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller)
Feb 23 09:57:07 np0005626463.localdomain podman[313532]: 2026-02-23 09:57:07.326283899 +0000 UTC m=+0.140800682 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, container_name=ovn_controller, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260216, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Feb 23 09:57:07 np0005626463.localdomain systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully.
Feb 23 09:57:07 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:57:07.970 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:57:08 np0005626463.localdomain ceph-mon[294160]: pgmap v225: 177 pgs: 177 active+clean; 145 MiB data, 778 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:57:08 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:57:08.919 2 INFO neutron.agent.securitygroups_rpc [None req-0e398338-d100-41a6-9496-902001e5b466 9a9c222b96714eb4b3e886d05c8c4dce 8aef873a00904cab867a4692ec3a78cb - - default default] Security group member updated ['58404006-bc14-42d2-ad0c-1fbb19168177']
Feb 23 09:57:08 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:57:08.971 265541 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-23T09:57:08Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f28294344f0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f2829bf7340>], id=8cb7a230-2eec-4e99-ae4f-fdf93493284b, ip_allocation=immediate, mac_address=fa:16:3e:fc:9e:c0, name=tempest-ExtraDHCPOptionsTestJSON-1719479753, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-23T09:57:02Z, description=, dns_domain=, id=5c327632-3675-4ada-bcc8-d5fb15ecb5d7, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-ExtraDHCPOptionsTestJSON-test-network-2021646158, port_security_enabled=True, project_id=8aef873a00904cab867a4692ec3a78cb, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=18544, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1542, status=ACTIVE, subnets=['357cab67-d9ec-4b7b-abb7-813d68b84f46'], tags=[], tenant_id=8aef873a00904cab867a4692ec3a78cb, updated_at=2026-02-23T09:57:03Z, vlan_transparent=None, network_id=5c327632-3675-4ada-bcc8-d5fb15ecb5d7, port_security_enabled=True, project_id=8aef873a00904cab867a4692ec3a78cb, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['58404006-bc14-42d2-ad0c-1fbb19168177'], standard_attr_id=1576, status=DOWN, tags=[], tenant_id=8aef873a00904cab867a4692ec3a78cb, updated_at=2026-02-23T09:57:08Z on network 5c327632-3675-4ada-bcc8-d5fb15ecb5d7
Feb 23 09:57:09 np0005626463.localdomain dnsmasq[313510]: read /var/lib/neutron/dhcp/5c327632-3675-4ada-bcc8-d5fb15ecb5d7/addn_hosts - 1 addresses
Feb 23 09:57:09 np0005626463.localdomain dnsmasq-dhcp[313510]: read /var/lib/neutron/dhcp/5c327632-3675-4ada-bcc8-d5fb15ecb5d7/host
Feb 23 09:57:09 np0005626463.localdomain podman[313574]: 2026-02-23 09:57:09.219017165 +0000 UTC m=+0.060590913 container kill 01161c4a98a010c56a7785a367a22f589e24b894de2e0a91b7997e72600f5882 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5c327632-3675-4ada-bcc8-d5fb15ecb5d7, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 23 09:57:09 np0005626463.localdomain dnsmasq-dhcp[313510]: read /var/lib/neutron/dhcp/5c327632-3675-4ada-bcc8-d5fb15ecb5d7/opts
Feb 23 09:57:09 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:57:09.306 2 INFO neutron.agent.securitygroups_rpc [None req-bf52f67a-8b52-4e6b-9a4f-c7fa470a3b71 3247f4a1ec054de78018b025a6933ab5 13ab81953d004010a22a72d978d31c4d - - default default] Security group member updated ['b31260d7-60e9-40f6-abcc-b3b02fd41e3c']
Feb 23 09:57:09 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:57:09.360 265541 INFO neutron.agent.linux.ip_lib [None req-172401bf-454c-4035-90c6-47dae912c45b - - - - - -] Device tap3200d85d-83 cannot be used as it has no MAC address
Feb 23 09:57:09 np0005626463.localdomain podman[242954]: time="2026-02-23T09:57:09Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 23 09:57:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:57:09.397 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:57:09 np0005626463.localdomain kernel: device tap3200d85d-83 entered promiscuous mode
Feb 23 09:57:09 np0005626463.localdomain podman[242954]: @ - - [23/Feb/2026:09:57:09 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 158905 "" "Go-http-client/1.1"
Feb 23 09:57:09 np0005626463.localdomain NetworkManager[5974]: <info>  [1771840629.4041] manager: (tap3200d85d-83): new Generic device (/org/freedesktop/NetworkManager/Devices/35)
Feb 23 09:57:09 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:57:09Z|00209|binding|INFO|Claiming lport 3200d85d-8350-45a5-8b75-5b140c0b6067 for this chassis.
Feb 23 09:57:09 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:57:09Z|00210|binding|INFO|3200d85d-8350-45a5-8b75-5b140c0b6067: Claiming unknown
Feb 23 09:57:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:57:09.405 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:57:09 np0005626463.localdomain systemd-udevd[313604]: Network interface NamePolicy= disabled on kernel command line.
Feb 23 09:57:09 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:57:09.416 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005626463.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe2e:2224/64', 'neutron:device_id': 'dhcpfb23302c-55c1-5de0-badf-4fc1ff22837a-b12395f4-8746-4a5f-8dd6-aa83c6decb4b', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b12395f4-8746-4a5f-8dd6-aa83c6decb4b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '13ab81953d004010a22a72d978d31c4d', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8718afba-f509-414c-83e3-3464c2a9d2a2, chassis=[<ovs.db.idl.Row object at 0x7f808c075610>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f808c075610>], logical_port=3200d85d-8350-45a5-8b75-5b140c0b6067) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 09:57:09 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:57:09.418 163572 INFO neutron.agent.ovn.metadata.agent [-] Port 3200d85d-8350-45a5-8b75-5b140c0b6067 in datapath b12395f4-8746-4a5f-8dd6-aa83c6decb4b bound to our chassis
Feb 23 09:57:09 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:57:09.422 163572 DEBUG neutron.agent.ovn.metadata.agent [-] Port 4eaffcb3-e5a2-46b4-90f8-068ef050ee16 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Feb 23 09:57:09 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:57:09.422 163572 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b12395f4-8746-4a5f-8dd6-aa83c6decb4b, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 23 09:57:09 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:57:09.423 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[f99000e5-f8dd-43ee-930f-e93a62a43587]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 09:57:09 np0005626463.localdomain podman[242954]: @ - - [23/Feb/2026:09:57:09 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19279 "" "Go-http-client/1.1"
Feb 23 09:57:09 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap3200d85d-83: No such device
Feb 23 09:57:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:57:09.454 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:57:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:57:09.455 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:57:09 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap3200d85d-83: No such device
Feb 23 09:57:09 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:57:09Z|00211|binding|INFO|Setting lport 3200d85d-8350-45a5-8b75-5b140c0b6067 ovn-installed in OVS
Feb 23 09:57:09 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:57:09Z|00212|binding|INFO|Setting lport 3200d85d-8350-45a5-8b75-5b140c0b6067 up in Southbound
Feb 23 09:57:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:57:09.460 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:57:09 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap3200d85d-83: No such device
Feb 23 09:57:09 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap3200d85d-83: No such device
Feb 23 09:57:09 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap3200d85d-83: No such device
Feb 23 09:57:09 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap3200d85d-83: No such device
Feb 23 09:57:09 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap3200d85d-83: No such device
Feb 23 09:57:09 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap3200d85d-83: No such device
Feb 23 09:57:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:57:09.491 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:57:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:57:09.524 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:57:09 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:57:09.571 265541 INFO neutron.agent.dhcp.agent [None req-03869fb1-26fd-47c1-b63c-0635cf29e0b5 - - - - - -] DHCP configuration for ports {'8cb7a230-2eec-4e99-ae4f-fdf93493284b'} is completed
Feb 23 09:57:09 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:57:09.687 2 INFO neutron.agent.securitygroups_rpc [None req-459134b2-4c8a-4e1c-97e8-455a751256d3 d6a332b0f2e446a68dc6c19280644090 2207de28dcd245d2b198a56e6161001a - - default default] Security group member updated ['ba10f066-a353-4b6a-98b8-dab53422ee14']
Feb 23 09:57:10 np0005626463.localdomain ceph-mon[294160]: pgmap v226: 177 pgs: 177 active+clean; 145 MiB data, 778 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:57:10 np0005626463.localdomain podman[313675]: 
Feb 23 09:57:10 np0005626463.localdomain podman[313675]: 2026-02-23 09:57:10.361913975 +0000 UTC m=+0.086793073 container create c48caa9e2d0eb59ce80d11fce6a50402ac6f80f49b96c9732c49466b0636a48c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b12395f4-8746-4a5f-8dd6-aa83c6decb4b, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.43.0, tcib_managed=true, tcib_build_tag=8419493e1fd846703d277695e03fc5eb)
Feb 23 09:57:10 np0005626463.localdomain systemd[1]: Started libpod-conmon-c48caa9e2d0eb59ce80d11fce6a50402ac6f80f49b96c9732c49466b0636a48c.scope.
Feb 23 09:57:10 np0005626463.localdomain podman[313675]: 2026-02-23 09:57:10.319275758 +0000 UTC m=+0.044154916 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 23 09:57:10 np0005626463.localdomain systemd[1]: tmp-crun.TmZwYe.mount: Deactivated successfully.
Feb 23 09:57:10 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 09:57:10 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5b96123a8fc3824dd5eb5d08dfdf32746d180cd8d815e2da97bc82b4af6ecaa2/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 23 09:57:10 np0005626463.localdomain podman[313675]: 2026-02-23 09:57:10.451706459 +0000 UTC m=+0.176585567 container init c48caa9e2d0eb59ce80d11fce6a50402ac6f80f49b96c9732c49466b0636a48c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b12395f4-8746-4a5f-8dd6-aa83c6decb4b, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Feb 23 09:57:10 np0005626463.localdomain podman[313675]: 2026-02-23 09:57:10.462159212 +0000 UTC m=+0.187038320 container start c48caa9e2d0eb59ce80d11fce6a50402ac6f80f49b96c9732c49466b0636a48c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b12395f4-8746-4a5f-8dd6-aa83c6decb4b, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Feb 23 09:57:10 np0005626463.localdomain dnsmasq[313695]: started, version 2.85 cachesize 150
Feb 23 09:57:10 np0005626463.localdomain dnsmasq[313695]: DNS service limited to local subnets
Feb 23 09:57:10 np0005626463.localdomain dnsmasq[313695]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 23 09:57:10 np0005626463.localdomain dnsmasq[313695]: warning: no upstream servers configured
Feb 23 09:57:10 np0005626463.localdomain dnsmasq-dhcp[313695]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Feb 23 09:57:10 np0005626463.localdomain dnsmasq[313695]: read /var/lib/neutron/dhcp/b12395f4-8746-4a5f-8dd6-aa83c6decb4b/addn_hosts - 0 addresses
Feb 23 09:57:10 np0005626463.localdomain dnsmasq-dhcp[313695]: read /var/lib/neutron/dhcp/b12395f4-8746-4a5f-8dd6-aa83c6decb4b/host
Feb 23 09:57:10 np0005626463.localdomain dnsmasq-dhcp[313695]: read /var/lib/neutron/dhcp/b12395f4-8746-4a5f-8dd6-aa83c6decb4b/opts
Feb 23 09:57:10 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:57:10.522 2 INFO neutron.agent.securitygroups_rpc [None req-fe5ed929-0971-4d54-b2a5-f195cec2efa3 9a9c222b96714eb4b3e886d05c8c4dce 8aef873a00904cab867a4692ec3a78cb - - default default] Security group member updated ['58404006-bc14-42d2-ad0c-1fbb19168177']
Feb 23 09:57:10 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:57:10.522 265541 INFO neutron.agent.dhcp.agent [None req-172401bf-454c-4035-90c6-47dae912c45b - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-23T09:57:08Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f2829262fd0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f2829262100>], id=f22d0768-46f3-45fe-bc61-792e40f72919, ip_allocation=immediate, mac_address=fa:16:3e:6e:cd:24, name=tempest-NetworksIpV6TestAttrs-136630378, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-23T09:57:03Z, description=, dns_domain=, id=b12395f4-8746-4a5f-8dd6-aa83c6decb4b, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksIpV6TestAttrs-test-network-1394058243, port_security_enabled=True, project_id=13ab81953d004010a22a72d978d31c4d, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=26666, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1550, status=ACTIVE, subnets=['6b0f1695-d31e-45f7-9ddd-6677f996c496'], tags=[], tenant_id=13ab81953d004010a22a72d978d31c4d, updated_at=2026-02-23T09:57:05Z, vlan_transparent=None, network_id=b12395f4-8746-4a5f-8dd6-aa83c6decb4b, port_security_enabled=True, project_id=13ab81953d004010a22a72d978d31c4d, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['b31260d7-60e9-40f6-abcc-b3b02fd41e3c'], standard_attr_id=1578, status=DOWN, tags=[], tenant_id=13ab81953d004010a22a72d978d31c4d, updated_at=2026-02-23T09:57:09Z on network b12395f4-8746-4a5f-8dd6-aa83c6decb4b
Feb 23 09:57:10 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:57:10.593 265541 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-23T09:57:09Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f2829c47640>], dns_domain=, dns_name=, extra_dhcp_opts=[<neutron.agent.linux.dhcp.DictModel object at 0x7f282936c580>, <neutron.agent.linux.dhcp.DictModel object at 0x7f2829c47bb0>, <neutron.agent.linux.dhcp.DictModel object at 0x7f2829c47cd0>], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f282936c490>], id=bbf20927-9fb7-4141-8e3b-4dfa8e35b2b9, ip_allocation=immediate, mac_address=fa:16:3e:1d:8b:96, name=tempest-ExtraDHCPOptionsTestJSON-1081097454, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-23T09:57:02Z, description=, dns_domain=, id=5c327632-3675-4ada-bcc8-d5fb15ecb5d7, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-ExtraDHCPOptionsTestJSON-test-network-2021646158, port_security_enabled=True, project_id=8aef873a00904cab867a4692ec3a78cb, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=18544, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1542, status=ACTIVE, subnets=['357cab67-d9ec-4b7b-abb7-813d68b84f46'], tags=[], tenant_id=8aef873a00904cab867a4692ec3a78cb, updated_at=2026-02-23T09:57:03Z, vlan_transparent=None, network_id=5c327632-3675-4ada-bcc8-d5fb15ecb5d7, port_security_enabled=True, project_id=8aef873a00904cab867a4692ec3a78cb, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['58404006-bc14-42d2-ad0c-1fbb19168177'], standard_attr_id=1580, status=DOWN, tags=[], tenant_id=8aef873a00904cab867a4692ec3a78cb, updated_at=2026-02-23T09:57:09Z on network 5c327632-3675-4ada-bcc8-d5fb15ecb5d7
Feb 23 09:57:10 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:57:10.640 265541 INFO neutron.agent.dhcp.agent [None req-194fcf6d-77de-4d38-bb4c-72537dfbac3e - - - - - -] DHCP configuration for ports {'463c3006-8123-4c7a-9689-b280160ef34c'} is completed
Feb 23 09:57:10 np0005626463.localdomain dnsmasq[313695]: read /var/lib/neutron/dhcp/b12395f4-8746-4a5f-8dd6-aa83c6decb4b/addn_hosts - 1 addresses
Feb 23 09:57:10 np0005626463.localdomain podman[313715]: 2026-02-23 09:57:10.705029696 +0000 UTC m=+0.054698381 container kill c48caa9e2d0eb59ce80d11fce6a50402ac6f80f49b96c9732c49466b0636a48c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b12395f4-8746-4a5f-8dd6-aa83c6decb4b, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0)
Feb 23 09:57:10 np0005626463.localdomain dnsmasq-dhcp[313695]: read /var/lib/neutron/dhcp/b12395f4-8746-4a5f-8dd6-aa83c6decb4b/host
Feb 23 09:57:10 np0005626463.localdomain dnsmasq-dhcp[313695]: read /var/lib/neutron/dhcp/b12395f4-8746-4a5f-8dd6-aa83c6decb4b/opts
Feb 23 09:57:10 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:57:10.713 2 INFO neutron.agent.securitygroups_rpc [None req-3f245d6d-a6d1-496c-9e86-70a6eb49954a d6a332b0f2e446a68dc6c19280644090 2207de28dcd245d2b198a56e6161001a - - default default] Security group member updated ['ba10f066-a353-4b6a-98b8-dab53422ee14']
Feb 23 09:57:10 np0005626463.localdomain dnsmasq[313510]: read /var/lib/neutron/dhcp/5c327632-3675-4ada-bcc8-d5fb15ecb5d7/addn_hosts - 2 addresses
Feb 23 09:57:10 np0005626463.localdomain dnsmasq-dhcp[313510]: read /var/lib/neutron/dhcp/5c327632-3675-4ada-bcc8-d5fb15ecb5d7/host
Feb 23 09:57:10 np0005626463.localdomain dnsmasq-dhcp[313510]: read /var/lib/neutron/dhcp/5c327632-3675-4ada-bcc8-d5fb15ecb5d7/opts
Feb 23 09:57:10 np0005626463.localdomain podman[313746]: 2026-02-23 09:57:10.832227936 +0000 UTC m=+0.047828899 container kill 01161c4a98a010c56a7785a367a22f589e24b894de2e0a91b7997e72600f5882 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5c327632-3675-4ada-bcc8-d5fb15ecb5d7, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Feb 23 09:57:10 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:57:10.880 265541 INFO neutron.agent.dhcp.agent [None req-642c9972-ee2d-40ea-8224-86c8d4e0779b - - - - - -] DHCP configuration for ports {'f22d0768-46f3-45fe-bc61-792e40f72919'} is completed
Feb 23 09:57:11 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:57:11.090 265541 INFO neutron.agent.dhcp.agent [None req-546a045c-d9bb-4620-8897-b5ab8e72ae7c - - - - - -] DHCP configuration for ports {'bbf20927-9fb7-4141-8e3b-4dfa8e35b2b9'} is completed
Feb 23 09:57:11 np0005626463.localdomain dnsmasq[313695]: exiting on receipt of SIGTERM
Feb 23 09:57:11 np0005626463.localdomain podman[313788]: 2026-02-23 09:57:11.299628136 +0000 UTC m=+0.060276083 container kill c48caa9e2d0eb59ce80d11fce6a50402ac6f80f49b96c9732c49466b0636a48c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b12395f4-8746-4a5f-8dd6-aa83c6decb4b, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.vendor=CentOS)
Feb 23 09:57:11 np0005626463.localdomain systemd[1]: libpod-c48caa9e2d0eb59ce80d11fce6a50402ac6f80f49b96c9732c49466b0636a48c.scope: Deactivated successfully.
Feb 23 09:57:11 np0005626463.localdomain systemd[1]: tmp-crun.liA9ph.mount: Deactivated successfully.
Feb 23 09:57:11 np0005626463.localdomain podman[313801]: 2026-02-23 09:57:11.371378773 +0000 UTC m=+0.055686791 container died c48caa9e2d0eb59ce80d11fce6a50402ac6f80f49b96c9732c49466b0636a48c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b12395f4-8746-4a5f-8dd6-aa83c6decb4b, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, org.label-schema.build-date=20260216)
Feb 23 09:57:11 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c48caa9e2d0eb59ce80d11fce6a50402ac6f80f49b96c9732c49466b0636a48c-userdata-shm.mount: Deactivated successfully.
Feb 23 09:57:11 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-5b96123a8fc3824dd5eb5d08dfdf32746d180cd8d815e2da97bc82b4af6ecaa2-merged.mount: Deactivated successfully.
Feb 23 09:57:11 np0005626463.localdomain podman[313801]: 2026-02-23 09:57:11.403620209 +0000 UTC m=+0.087928187 container cleanup c48caa9e2d0eb59ce80d11fce6a50402ac6f80f49b96c9732c49466b0636a48c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b12395f4-8746-4a5f-8dd6-aa83c6decb4b, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Feb 23 09:57:11 np0005626463.localdomain systemd[1]: libpod-conmon-c48caa9e2d0eb59ce80d11fce6a50402ac6f80f49b96c9732c49466b0636a48c.scope: Deactivated successfully.
Feb 23 09:57:11 np0005626463.localdomain podman[313802]: 2026-02-23 09:57:11.44281531 +0000 UTC m=+0.123696923 container remove c48caa9e2d0eb59ce80d11fce6a50402ac6f80f49b96c9732c49466b0636a48c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b12395f4-8746-4a5f-8dd6-aa83c6decb4b, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=8419493e1fd846703d277695e03fc5eb)
Feb 23 09:57:11 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:57:11Z|00213|binding|INFO|Releasing lport 3200d85d-8350-45a5-8b75-5b140c0b6067 from this chassis (sb_readonly=0)
Feb 23 09:57:11 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:57:11.495 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:57:11 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:57:11Z|00214|binding|INFO|Setting lport 3200d85d-8350-45a5-8b75-5b140c0b6067 down in Southbound
Feb 23 09:57:11 np0005626463.localdomain kernel: device tap3200d85d-83 left promiscuous mode
Feb 23 09:57:11 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:57:11.506 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005626463.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe2e:2224/64', 'neutron:device_id': 'dhcpfb23302c-55c1-5de0-badf-4fc1ff22837a-b12395f4-8746-4a5f-8dd6-aa83c6decb4b', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b12395f4-8746-4a5f-8dd6-aa83c6decb4b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '13ab81953d004010a22a72d978d31c4d', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005626463.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8718afba-f509-414c-83e3-3464c2a9d2a2, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f808c075610>], logical_port=3200d85d-8350-45a5-8b75-5b140c0b6067) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f808c075610>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 09:57:11 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:57:11.508 163572 INFO neutron.agent.ovn.metadata.agent [-] Port 3200d85d-8350-45a5-8b75-5b140c0b6067 in datapath b12395f4-8746-4a5f-8dd6-aa83c6decb4b unbound from our chassis
Feb 23 09:57:11 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:57:11.511 163572 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b12395f4-8746-4a5f-8dd6-aa83c6decb4b, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 23 09:57:11 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:57:11.512 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[1810e062-aff9-4a21-9182-6e76399d6a0a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 09:57:11 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:57:11.517 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:57:11 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:57:11.764 265541 INFO neutron.agent.dhcp.agent [None req-29575065-45e4-45d7-be30-be1abb3dc396 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 23 09:57:11 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:57:11.794 2 INFO neutron.agent.securitygroups_rpc [None req-a8c239ff-03aa-42a5-97c5-b84d2c82f384 9a9c222b96714eb4b3e886d05c8c4dce 8aef873a00904cab867a4692ec3a78cb - - default default] Security group member updated ['58404006-bc14-42d2-ad0c-1fbb19168177']
Feb 23 09:57:11 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e127 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:57:12 np0005626463.localdomain dnsmasq[313510]: read /var/lib/neutron/dhcp/5c327632-3675-4ada-bcc8-d5fb15ecb5d7/addn_hosts - 1 addresses
Feb 23 09:57:12 np0005626463.localdomain dnsmasq-dhcp[313510]: read /var/lib/neutron/dhcp/5c327632-3675-4ada-bcc8-d5fb15ecb5d7/host
Feb 23 09:57:12 np0005626463.localdomain dnsmasq-dhcp[313510]: read /var/lib/neutron/dhcp/5c327632-3675-4ada-bcc8-d5fb15ecb5d7/opts
Feb 23 09:57:12 np0005626463.localdomain podman[313847]: 2026-02-23 09:57:12.02901136 +0000 UTC m=+0.056351532 container kill 01161c4a98a010c56a7785a367a22f589e24b894de2e0a91b7997e72600f5882 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5c327632-3675-4ada-bcc8-d5fb15ecb5d7, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team)
Feb 23 09:57:12 np0005626463.localdomain ceph-mon[294160]: pgmap v227: 177 pgs: 177 active+clean; 145 MiB data, 778 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:57:12 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.
Feb 23 09:57:12 np0005626463.localdomain systemd[1]: run-netns-qdhcp\x2db12395f4\x2d8746\x2d4a5f\x2d8dd6\x2daa83c6decb4b.mount: Deactivated successfully.
Feb 23 09:57:12 np0005626463.localdomain podman[313869]: 2026-02-23 09:57:12.396146943 +0000 UTC m=+0.075491012 container health_status be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, managed_by=edpm_ansible, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true)
Feb 23 09:57:12 np0005626463.localdomain podman[313869]: 2026-02-23 09:57:12.43323695 +0000 UTC m=+0.112581019 container exec_died be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.license=GPLv2, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb)
Feb 23 09:57:12 np0005626463.localdomain systemd[1]: be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.service: Deactivated successfully.
Feb 23 09:57:12 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:57:12.628 265541 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-23T09:57:08Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f2829345160>], dns_domain=, dns_name=, extra_dhcp_opts=[<neutron.agent.linux.dhcp.DictModel object at 0x7f2829345760>, <neutron.agent.linux.dhcp.DictModel object at 0x7f2829345070>, <neutron.agent.linux.dhcp.DictModel object at 0x7f2829345be0>], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f2829345460>], id=8cb7a230-2eec-4e99-ae4f-fdf93493284b, ip_allocation=immediate, mac_address=fa:16:3e:fc:9e:c0, name=tempest-new-port-name-1817703293, network_id=5c327632-3675-4ada-bcc8-d5fb15ecb5d7, port_security_enabled=True, project_id=8aef873a00904cab867a4692ec3a78cb, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=2, security_groups=['58404006-bc14-42d2-ad0c-1fbb19168177'], standard_attr_id=1576, status=DOWN, tags=[], tenant_id=8aef873a00904cab867a4692ec3a78cb, updated_at=2026-02-23T09:57:12Z on network 5c327632-3675-4ada-bcc8-d5fb15ecb5d7
Feb 23 09:57:12 np0005626463.localdomain dnsmasq[313510]: read /var/lib/neutron/dhcp/5c327632-3675-4ada-bcc8-d5fb15ecb5d7/addn_hosts - 1 addresses
Feb 23 09:57:12 np0005626463.localdomain dnsmasq-dhcp[313510]: read /var/lib/neutron/dhcp/5c327632-3675-4ada-bcc8-d5fb15ecb5d7/host
Feb 23 09:57:12 np0005626463.localdomain dnsmasq-dhcp[313510]: read /var/lib/neutron/dhcp/5c327632-3675-4ada-bcc8-d5fb15ecb5d7/opts
Feb 23 09:57:12 np0005626463.localdomain podman[313903]: 2026-02-23 09:57:12.848216221 +0000 UTC m=+0.060412848 container kill 01161c4a98a010c56a7785a367a22f589e24b894de2e0a91b7997e72600f5882 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5c327632-3675-4ada-bcc8-d5fb15ecb5d7, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 23 09:57:13 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:57:13.011 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:57:13 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:57:13.086 265541 INFO neutron.agent.dhcp.agent [None req-03682992-3a6f-4acd-a296-ef8b2025e96c - - - - - -] DHCP configuration for ports {'8cb7a230-2eec-4e99-ae4f-fdf93493284b'} is completed
Feb 23 09:57:13 np0005626463.localdomain openstack_network_exporter[245358]: ERROR   09:57:13 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 23 09:57:13 np0005626463.localdomain openstack_network_exporter[245358]: 
Feb 23 09:57:13 np0005626463.localdomain openstack_network_exporter[245358]: ERROR   09:57:13 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 23 09:57:13 np0005626463.localdomain openstack_network_exporter[245358]: 
Feb 23 09:57:13 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:57:13.525 2 INFO neutron.agent.securitygroups_rpc [None req-7a9cb938-6590-4a31-87fa-171ce7ffabc7 7e2d209f03d04e6d86f4e10f7490cf37 8532226521ac43ca82723a0b71168e03 - - default default] Security group member updated ['feaffec7-34aa-4c16-87f3-892bafdc2b78']
Feb 23 09:57:14 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:57:14.209 2 INFO neutron.agent.securitygroups_rpc [None req-d957f830-6dae-4ea3-8f5c-2c9a9324e520 9a9c222b96714eb4b3e886d05c8c4dce 8aef873a00904cab867a4692ec3a78cb - - default default] Security group member updated ['58404006-bc14-42d2-ad0c-1fbb19168177']
Feb 23 09:57:14 np0005626463.localdomain ceph-mon[294160]: pgmap v228: 177 pgs: 177 active+clean; 145 MiB data, 778 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:57:14 np0005626463.localdomain dnsmasq[313510]: read /var/lib/neutron/dhcp/5c327632-3675-4ada-bcc8-d5fb15ecb5d7/addn_hosts - 0 addresses
Feb 23 09:57:14 np0005626463.localdomain dnsmasq-dhcp[313510]: read /var/lib/neutron/dhcp/5c327632-3675-4ada-bcc8-d5fb15ecb5d7/host
Feb 23 09:57:14 np0005626463.localdomain podman[313939]: 2026-02-23 09:57:14.450331679 +0000 UTC m=+0.055868377 container kill 01161c4a98a010c56a7785a367a22f589e24b894de2e0a91b7997e72600f5882 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5c327632-3675-4ada-bcc8-d5fb15ecb5d7, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 23 09:57:14 np0005626463.localdomain dnsmasq-dhcp[313510]: read /var/lib/neutron/dhcp/5c327632-3675-4ada-bcc8-d5fb15ecb5d7/opts
Feb 23 09:57:14 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.
Feb 23 09:57:14 np0005626463.localdomain podman[313959]: 2026-02-23 09:57:14.911468096 +0000 UTC m=+0.085847323 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260216, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.43.0, tcib_managed=true)
Feb 23 09:57:14 np0005626463.localdomain podman[313959]: 2026-02-23 09:57:14.945370713 +0000 UTC m=+0.119749900 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20260216, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, managed_by=edpm_ansible)
Feb 23 09:57:14 np0005626463.localdomain systemd[1]: 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully.
Feb 23 09:57:15 np0005626463.localdomain dnsmasq[313510]: exiting on receipt of SIGTERM
Feb 23 09:57:15 np0005626463.localdomain podman[313994]: 2026-02-23 09:57:15.21327145 +0000 UTC m=+0.061256173 container kill 01161c4a98a010c56a7785a367a22f589e24b894de2e0a91b7997e72600f5882 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5c327632-3675-4ada-bcc8-d5fb15ecb5d7, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260216, io.buildah.version=1.43.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 23 09:57:15 np0005626463.localdomain systemd[1]: libpod-01161c4a98a010c56a7785a367a22f589e24b894de2e0a91b7997e72600f5882.scope: Deactivated successfully.
Feb 23 09:57:15 np0005626463.localdomain podman[314006]: 2026-02-23 09:57:15.294428148 +0000 UTC m=+0.067508207 container died 01161c4a98a010c56a7785a367a22f589e24b894de2e0a91b7997e72600f5882 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5c327632-3675-4ada-bcc8-d5fb15ecb5d7, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 23 09:57:15 np0005626463.localdomain podman[314006]: 2026-02-23 09:57:15.325188658 +0000 UTC m=+0.098268657 container cleanup 01161c4a98a010c56a7785a367a22f589e24b894de2e0a91b7997e72600f5882 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5c327632-3675-4ada-bcc8-d5fb15ecb5d7, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 23 09:57:15 np0005626463.localdomain systemd[1]: libpod-conmon-01161c4a98a010c56a7785a367a22f589e24b894de2e0a91b7997e72600f5882.scope: Deactivated successfully.
Feb 23 09:57:15 np0005626463.localdomain podman[314008]: 2026-02-23 09:57:15.366724401 +0000 UTC m=+0.132587377 container remove 01161c4a98a010c56a7785a367a22f589e24b894de2e0a91b7997e72600f5882 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5c327632-3675-4ada-bcc8-d5fb15ecb5d7, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.build-date=20260216, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 23 09:57:15 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:57:15Z|00215|binding|INFO|Releasing lport 73a840c6-4a09-4a70-b6d2-862b377cac9d from this chassis (sb_readonly=0)
Feb 23 09:57:15 np0005626463.localdomain kernel: device tap73a840c6-4a left promiscuous mode
Feb 23 09:57:15 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:57:15Z|00216|binding|INFO|Setting lport 73a840c6-4a09-4a70-b6d2-862b377cac9d down in Southbound
Feb 23 09:57:15 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:57:15.381 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:57:15 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:57:15.387 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005626463.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpfb23302c-55c1-5de0-badf-4fc1ff22837a-5c327632-3675-4ada-bcc8-d5fb15ecb5d7', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5c327632-3675-4ada-bcc8-d5fb15ecb5d7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8aef873a00904cab867a4692ec3a78cb', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005626463.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3ce95503-eca6-4225-b8ee-cabdbb8ec29c, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f808c075610>], logical_port=73a840c6-4a09-4a70-b6d2-862b377cac9d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f808c075610>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 09:57:15 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:57:15.389 163572 INFO neutron.agent.ovn.metadata.agent [-] Port 73a840c6-4a09-4a70-b6d2-862b377cac9d in datapath 5c327632-3675-4ada-bcc8-d5fb15ecb5d7 unbound from our chassis
Feb 23 09:57:15 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:57:15.392 163572 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 5c327632-3675-4ada-bcc8-d5fb15ecb5d7, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 23 09:57:15 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:57:15.392 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[cf369efd-107b-423d-b55f-912bb4cd0919]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 09:57:15 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:57:15.400 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:57:15 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-658cf474e63c73733b33588ec74e8a0b39a8eb6542d46aec9fd53fcc5cc3218f-merged.mount: Deactivated successfully.
Feb 23 09:57:15 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-01161c4a98a010c56a7785a367a22f589e24b894de2e0a91b7997e72600f5882-userdata-shm.mount: Deactivated successfully.
Feb 23 09:57:15 np0005626463.localdomain systemd[1]: run-netns-qdhcp\x2d5c327632\x2d3675\x2d4ada\x2dbcc8\x2dd5fb15ecb5d7.mount: Deactivated successfully.
Feb 23 09:57:15 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:57:15.625 265541 INFO neutron.agent.dhcp.agent [None req-30e9e5c2-b621-45ea-9d63-28d07266ef16 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 23 09:57:16 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:57:16.039 265541 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Feb 23 09:57:16 np0005626463.localdomain ceph-mon[294160]: pgmap v229: 177 pgs: 177 active+clean; 145 MiB data, 778 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:57:16 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:57:16.609 265541 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Feb 23 09:57:16 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e127 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:57:17 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:57:17Z|00217|binding|INFO|Releasing lport 4143c8ea-7577-4792-9744-bcff90eb20f2 from this chassis (sb_readonly=0)
Feb 23 09:57:17 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:57:17.159 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:57:17 np0005626463.localdomain sshd[314037]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:57:17 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:57:17.712 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:df:31:67 10.100.0.18 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.18/28 10.100.0.2/28', 'neutron:device_id': 'ovnmeta-c762e206-cc42-4a9e-b8ad-4f8da87fd30e', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c762e206-cc42-4a9e-b8ad-4f8da87fd30e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8532226521ac43ca82723a0b71168e03', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1cdd9294-518a-4cd4-8e14-e309ee77be41, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=a147cb7a-5506-4d6a-9946-52357210b7c0) old=Port_Binding(mac=['fa:16:3e:df:31:67 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-c762e206-cc42-4a9e-b8ad-4f8da87fd30e', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c762e206-cc42-4a9e-b8ad-4f8da87fd30e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8532226521ac43ca82723a0b71168e03', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 09:57:17 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:57:17.715 163572 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port a147cb7a-5506-4d6a-9946-52357210b7c0 in datapath c762e206-cc42-4a9e-b8ad-4f8da87fd30e updated
Feb 23 09:57:17 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:57:17.718 163572 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c762e206-cc42-4a9e-b8ad-4f8da87fd30e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 23 09:57:17 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:57:17.719 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[5af42ff8-fc8a-4135-b1a3-01dd64211291]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 09:57:18 np0005626463.localdomain ceph-mon[294160]: pgmap v230: 177 pgs: 177 active+clean; 145 MiB data, 778 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:57:18 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:57:18.015 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:57:18 np0005626463.localdomain sshd[314037]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:57:18 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #46. Immutable memtables: 0.
Feb 23 09:57:18 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:57:18.028231) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Feb 23 09:57:18 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/flush_job.cc:856] [default] [JOB 25] Flushing memtable with next log file: 46
Feb 23 09:57:18 np0005626463.localdomain ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840638028269, "job": 25, "event": "flush_started", "num_memtables": 1, "num_entries": 2313, "num_deletes": 262, "total_data_size": 2324773, "memory_usage": 2374528, "flush_reason": "Manual Compaction"}
Feb 23 09:57:18 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/flush_job.cc:885] [default] [JOB 25] Level-0 flush table #47: started
Feb 23 09:57:18 np0005626463.localdomain ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840638038077, "cf_name": "default", "job": 25, "event": "table_file_creation", "file_number": 47, "file_size": 2261070, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 25262, "largest_seqno": 27574, "table_properties": {"data_size": 2251713, "index_size": 5862, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2437, "raw_key_size": 20328, "raw_average_key_size": 21, "raw_value_size": 2232616, "raw_average_value_size": 2332, "num_data_blocks": 255, "num_entries": 957, "num_filter_entries": 957, "num_deletions": 262, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771840465, "oldest_key_time": 1771840465, "file_creation_time": 1771840638, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4cfd6c8f-aafa-4003-b2f6-d22c49635dd4", "db_session_id": "66DAQ76CBLV8DSGL8JC7", "orig_file_number": 47, "seqno_to_time_mapping": "N/A"}}
Feb 23 09:57:18 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 25] Flush lasted 9882 microseconds, and 4741 cpu microseconds.
Feb 23 09:57:18 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 23 09:57:18 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:57:18.038111) [db/flush_job.cc:967] [default] [JOB 25] Level-0 flush table #47: 2261070 bytes OK
Feb 23 09:57:18 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:57:18.038131) [db/memtable_list.cc:519] [default] Level-0 commit table #47 started
Feb 23 09:57:18 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:57:18.042805) [db/memtable_list.cc:722] [default] Level-0 commit table #47: memtable #1 done
Feb 23 09:57:18 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:57:18.042819) EVENT_LOG_v1 {"time_micros": 1771840638042814, "job": 25, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Feb 23 09:57:18 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:57:18.042836) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Feb 23 09:57:18 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 25] Try to delete WAL files size 2315088, prev total WAL file size 2315088, number of live WAL files 2.
Feb 23 09:57:18 np0005626463.localdomain ceph-mon[294160]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005626463/store.db/000043.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 23 09:57:18 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:57:18.043562) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003131373937' seq:72057594037927935, type:22 .. '7061786F73003132303439' seq:0, type:0; will stop at (end)
Feb 23 09:57:18 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 26] Compacting 1@0 + 1@6 files to L6, score -1.00
Feb 23 09:57:18 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 25 Base level 0, inputs: [47(2208KB)], [45(17MB)]
Feb 23 09:57:18 np0005626463.localdomain ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840638043631, "job": 26, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [47], "files_L6": [45], "score": -1, "input_data_size": 20138191, "oldest_snapshot_seqno": -1}
Feb 23 09:57:18 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 26] Generated table #48: 12589 keys, 16805521 bytes, temperature: kUnknown
Feb 23 09:57:18 np0005626463.localdomain ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840638146571, "cf_name": "default", "job": 26, "event": "table_file_creation", "file_number": 48, "file_size": 16805521, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 16732855, "index_size": 40117, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 31493, "raw_key_size": 336956, "raw_average_key_size": 26, "raw_value_size": 16517500, "raw_average_value_size": 1312, "num_data_blocks": 1533, "num_entries": 12589, "num_filter_entries": 12589, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771839971, "oldest_key_time": 0, "file_creation_time": 1771840638, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4cfd6c8f-aafa-4003-b2f6-d22c49635dd4", "db_session_id": "66DAQ76CBLV8DSGL8JC7", "orig_file_number": 48, "seqno_to_time_mapping": "N/A"}}
Feb 23 09:57:18 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 23 09:57:18 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:57:18.146963) [db/compaction/compaction_job.cc:1663] [default] [JOB 26] Compacted 1@0 + 1@6 files to L6 => 16805521 bytes
Feb 23 09:57:18 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:57:18.149239) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 195.5 rd, 163.1 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.2, 17.0 +0.0 blob) out(16.0 +0.0 blob), read-write-amplify(16.3) write-amplify(7.4) OK, records in: 13126, records dropped: 537 output_compression: NoCompression
Feb 23 09:57:18 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:57:18.149276) EVENT_LOG_v1 {"time_micros": 1771840638149260, "job": 26, "event": "compaction_finished", "compaction_time_micros": 103025, "compaction_time_cpu_micros": 46138, "output_level": 6, "num_output_files": 1, "total_output_size": 16805521, "num_input_records": 13126, "num_output_records": 12589, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Feb 23 09:57:18 np0005626463.localdomain ceph-mon[294160]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005626463/store.db/000047.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 23 09:57:18 np0005626463.localdomain ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840638149845, "job": 26, "event": "table_file_deletion", "file_number": 47}
Feb 23 09:57:18 np0005626463.localdomain ceph-mon[294160]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005626463/store.db/000045.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 23 09:57:18 np0005626463.localdomain ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840638153152, "job": 26, "event": "table_file_deletion", "file_number": 45}
Feb 23 09:57:18 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:57:18.043461) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 23 09:57:18 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:57:18.153296) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 23 09:57:18 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:57:18.153322) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 23 09:57:18 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:57:18.153326) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 23 09:57:18 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:57:18.153331) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 23 09:57:18 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:57:18.153335) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 23 09:57:18 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:57:18.547 2 INFO neutron.agent.securitygroups_rpc [None req-c9ef2d4b-d1c0-41a7-a7dd-2f93a7b3fec0 7e2d209f03d04e6d86f4e10f7490cf37 8532226521ac43ca82723a0b71168e03 - - default default] Security group member updated ['feaffec7-34aa-4c16-87f3-892bafdc2b78', '5909553e-06f7-4a4f-a61c-c51f18e5203a']
Feb 23 09:57:19 np0005626463.localdomain sudo[314039]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 09:57:19 np0005626463.localdomain sudo[314039]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:57:19 np0005626463.localdomain sudo[314039]: pam_unix(sudo:session): session closed for user root
Feb 23 09:57:19 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:57:19.292 2 INFO neutron.agent.securitygroups_rpc [None req-37a94c58-b8ba-4f2f-a685-f32e299cd9ba 7e2d209f03d04e6d86f4e10f7490cf37 8532226521ac43ca82723a0b71168e03 - - default default] Security group member updated ['5909553e-06f7-4a4f-a61c-c51f18e5203a']
Feb 23 09:57:19 np0005626463.localdomain sudo[314057]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 23 09:57:19 np0005626463.localdomain sudo[314057]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:57:19 np0005626463.localdomain sudo[314057]: pam_unix(sudo:session): session closed for user root
Feb 23 09:57:20 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Feb 23 09:57:20 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:57:20 np0005626463.localdomain sudo[314106]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 09:57:20 np0005626463.localdomain sudo[314106]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:57:20 np0005626463.localdomain sudo[314106]: pam_unix(sudo:session): session closed for user root
Feb 23 09:57:20 np0005626463.localdomain ceph-mon[294160]: pgmap v231: 177 pgs: 177 active+clean; 145 MiB data, 778 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:57:20 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:57:20 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 23 09:57:20 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:57:20 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 23 09:57:20 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:57:20.368 265541 INFO neutron.agent.linux.ip_lib [None req-25e29513-5b46-47b7-a32e-94580f62f62c - - - - - -] Device tapa809c819-39 cannot be used as it has no MAC address
Feb 23 09:57:20 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:57:20.396 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:57:20 np0005626463.localdomain kernel: device tapa809c819-39 entered promiscuous mode
Feb 23 09:57:20 np0005626463.localdomain NetworkManager[5974]: <info>  [1771840640.4085] manager: (tapa809c819-39): new Generic device (/org/freedesktop/NetworkManager/Devices/36)
Feb 23 09:57:20 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:57:20.407 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:57:20 np0005626463.localdomain systemd-udevd[314134]: Network interface NamePolicy= disabled on kernel command line.
Feb 23 09:57:20 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:57:20.416 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:57:20 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tapa809c819-39: No such device
Feb 23 09:57:20 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tapa809c819-39: No such device
Feb 23 09:57:20 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:57:20.440 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:57:20 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tapa809c819-39: No such device
Feb 23 09:57:20 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tapa809c819-39: No such device
Feb 23 09:57:20 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tapa809c819-39: No such device
Feb 23 09:57:20 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tapa809c819-39: No such device
Feb 23 09:57:20 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tapa809c819-39: No such device
Feb 23 09:57:20 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tapa809c819-39: No such device
Feb 23 09:57:20 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Feb 23 09:57:20 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:57:20.484 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:57:20 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:57:20 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:57:20.513 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:57:21 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:57:21.056 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:57:21 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:57:21.056 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 23 09:57:21 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:57:21.057 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 23 09:57:21 np0005626463.localdomain podman[314205]: 
Feb 23 09:57:21 np0005626463.localdomain podman[314205]: 2026-02-23 09:57:21.31476689 +0000 UTC m=+0.090098945 container create d057d13f16d27b5ec0564cacf762096d87585b298d5c4ad835d7b8e2aefed53a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0f06419-a50d-48bd-89c2-2b77ee893c23, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.build-date=20260216)
Feb 23 09:57:21 np0005626463.localdomain systemd[1]: Started libpod-conmon-d057d13f16d27b5ec0564cacf762096d87585b298d5c4ad835d7b8e2aefed53a.scope.
Feb 23 09:57:21 np0005626463.localdomain podman[314205]: 2026-02-23 09:57:21.270685607 +0000 UTC m=+0.046017662 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 23 09:57:21 np0005626463.localdomain systemd[1]: tmp-crun.Ys8o94.mount: Deactivated successfully.
Feb 23 09:57:21 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 09:57:21 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f42a7192ddb811b49db5c05e8b220a055625d570407e7c102749a3e2675b6520/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 23 09:57:21 np0005626463.localdomain podman[314205]: 2026-02-23 09:57:21.411940732 +0000 UTC m=+0.187272797 container init d057d13f16d27b5ec0564cacf762096d87585b298d5c4ad835d7b8e2aefed53a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0f06419-a50d-48bd-89c2-2b77ee893c23, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=8419493e1fd846703d277695e03fc5eb)
Feb 23 09:57:21 np0005626463.localdomain podman[314205]: 2026-02-23 09:57:21.420562888 +0000 UTC m=+0.195894943 container start d057d13f16d27b5ec0564cacf762096d87585b298d5c4ad835d7b8e2aefed53a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0f06419-a50d-48bd-89c2-2b77ee893c23, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0)
Feb 23 09:57:21 np0005626463.localdomain dnsmasq[314223]: started, version 2.85 cachesize 150
Feb 23 09:57:21 np0005626463.localdomain dnsmasq[314223]: DNS service limited to local subnets
Feb 23 09:57:21 np0005626463.localdomain dnsmasq[314223]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 23 09:57:21 np0005626463.localdomain dnsmasq[314223]: warning: no upstream servers configured
Feb 23 09:57:21 np0005626463.localdomain dnsmasq-dhcp[314223]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Feb 23 09:57:21 np0005626463.localdomain dnsmasq[314223]: read /var/lib/neutron/dhcp/c0f06419-a50d-48bd-89c2-2b77ee893c23/addn_hosts - 0 addresses
Feb 23 09:57:21 np0005626463.localdomain dnsmasq-dhcp[314223]: read /var/lib/neutron/dhcp/c0f06419-a50d-48bd-89c2-2b77ee893c23/host
Feb 23 09:57:21 np0005626463.localdomain dnsmasq-dhcp[314223]: read /var/lib/neutron/dhcp/c0f06419-a50d-48bd-89c2-2b77ee893c23/opts
Feb 23 09:57:21 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:57:21.455 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 23 09:57:21 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:57:21.455 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquired lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 23 09:57:21 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:57:21.456 282211 DEBUG nova.network.neutron [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 23 09:57:21 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:57:21.456 282211 DEBUG nova.objects.instance [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c2a7d92b-952f-46a7-8a6a-3322a48fcf4b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 23 09:57:21 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:57:21 np0005626463.localdomain ceph-mon[294160]: pgmap v232: 177 pgs: 177 active+clean; 145 MiB data, 778 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:57:21 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:57:21.562 265541 INFO neutron.agent.dhcp.agent [None req-6d2f5670-a36c-4149-a35c-fe66671e2499 - - - - - -] DHCP configuration for ports {'bea39c1a-6fe3-4e05-84fb-8c695be45a3d'} is completed
Feb 23 09:57:21 np0005626463.localdomain dnsmasq[314223]: exiting on receipt of SIGTERM
Feb 23 09:57:21 np0005626463.localdomain systemd[1]: libpod-d057d13f16d27b5ec0564cacf762096d87585b298d5c4ad835d7b8e2aefed53a.scope: Deactivated successfully.
Feb 23 09:57:21 np0005626463.localdomain podman[314240]: 2026-02-23 09:57:21.666798576 +0000 UTC m=+0.061986406 container kill d057d13f16d27b5ec0564cacf762096d87585b298d5c4ad835d7b8e2aefed53a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0f06419-a50d-48bd-89c2-2b77ee893c23, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.43.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb)
Feb 23 09:57:21 np0005626463.localdomain podman[314252]: 2026-02-23 09:57:21.737409447 +0000 UTC m=+0.057303431 container died d057d13f16d27b5ec0564cacf762096d87585b298d5c4ad835d7b8e2aefed53a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0f06419-a50d-48bd-89c2-2b77ee893c23, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, tcib_managed=true, io.buildah.version=1.43.0)
Feb 23 09:57:21 np0005626463.localdomain podman[314252]: 2026-02-23 09:57:21.763433651 +0000 UTC m=+0.083327615 container cleanup d057d13f16d27b5ec0564cacf762096d87585b298d5c4ad835d7b8e2aefed53a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0f06419-a50d-48bd-89c2-2b77ee893c23, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0)
Feb 23 09:57:21 np0005626463.localdomain systemd[1]: libpod-conmon-d057d13f16d27b5ec0564cacf762096d87585b298d5c4ad835d7b8e2aefed53a.scope: Deactivated successfully.
Feb 23 09:57:21 np0005626463.localdomain podman[314254]: 2026-02-23 09:57:21.816423298 +0000 UTC m=+0.129600445 container remove d057d13f16d27b5ec0564cacf762096d87585b298d5c4ad835d7b8e2aefed53a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0f06419-a50d-48bd-89c2-2b77ee893c23, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260216, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Feb 23 09:57:21 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:57:21.829 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:57:21 np0005626463.localdomain kernel: device tapa809c819-39 left promiscuous mode
Feb 23 09:57:21 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:57:21.845 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:57:21 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:57:21.864 265541 INFO neutron.agent.dhcp.agent [None req-a6b448ca-e319-42dd-83f5-345c00ce6e49 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 23 09:57:21 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:57:21.865 265541 INFO neutron.agent.dhcp.agent [None req-a6b448ca-e319-42dd-83f5-345c00ce6e49 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 23 09:57:21 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e127 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:57:22 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:57:22.242 282211 DEBUG nova.network.neutron [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Updating instance_info_cache with network_info: [{"id": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "address": "fa:16:3e:a0:9d:00", "network": {"id": "9da5b53d-3184-450f-9a5b-bdba1a6c9f6d", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "37b8098efb0d4ecc90b451a2db0e966f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa27e5011-20", "ovs_interfaceid": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 23 09:57:22 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:57:22.272 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Releasing lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 23 09:57:22 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:57:22.273 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 23 09:57:22 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-f42a7192ddb811b49db5c05e8b220a055625d570407e7c102749a3e2675b6520-merged.mount: Deactivated successfully.
Feb 23 09:57:22 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d057d13f16d27b5ec0564cacf762096d87585b298d5c4ad835d7b8e2aefed53a-userdata-shm.mount: Deactivated successfully.
Feb 23 09:57:22 np0005626463.localdomain systemd[1]: run-netns-qdhcp\x2dc0f06419\x2da50d\x2d48bd\x2d89c2\x2d2b77ee893c23.mount: Deactivated successfully.
Feb 23 09:57:23 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.107:0/1058112558' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 09:57:23 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:57:23.047 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:57:24 np0005626463.localdomain ceph-mon[294160]: pgmap v233: 177 pgs: 177 active+clean; 145 MiB data, 778 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:57:24 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.107:0/356798990' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 09:57:24 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:57:24.054 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:57:24 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:57:24.056 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 23 09:57:24 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:57:24.753 2 INFO neutron.agent.securitygroups_rpc [None req-3b70b1d6-deb1-439a-bfa9-157fec652445 51fa3c9fb33d4d329f38b19fc5b6099c ed50c6ac234342909e3b98d8ad22ae1a - - default default] Security group rule updated ['c0f5985b-58d8-49b8-86bf-b98bcb003892']
Feb 23 09:57:24 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.
Feb 23 09:57:24 np0005626463.localdomain podman[314282]: 2026-02-23 09:57:24.905974922 +0000 UTC m=+0.080783447 container health_status da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 23 09:57:24 np0005626463.localdomain podman[314282]: 2026-02-23 09:57:24.917484947 +0000 UTC m=+0.092293812 container exec_died da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 23 09:57:24 np0005626463.localdomain systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Deactivated successfully.
Feb 23 09:57:25 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e127 do_prune osdmap full prune enabled
Feb 23 09:57:25 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e128 e128: 6 total, 6 up, 6 in
Feb 23 09:57:25 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e128: 6 total, 6 up, 6 in
Feb 23 09:57:25 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:57:25.057 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:57:25 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.108:0/361885459' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 09:57:25 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:57:25.189 2 INFO neutron.agent.securitygroups_rpc [None req-92aad836-c088-4438-915c-a0369f05fd7b 51fa3c9fb33d4d329f38b19fc5b6099c ed50c6ac234342909e3b98d8ad22ae1a - - default default] Security group rule updated ['c0f5985b-58d8-49b8-86bf-b98bcb003892']
Feb 23 09:57:25 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:57:25.801 2 INFO neutron.agent.securitygroups_rpc [None req-76b2a14f-bec0-4305-8207-ebe5d9da3f5c 7e2d209f03d04e6d86f4e10f7490cf37 8532226521ac43ca82723a0b71168e03 - - default default] Security group member updated ['4833eb66-b753-4947-9fd8-0b38ba04d2e6']
Feb 23 09:57:26 np0005626463.localdomain ceph-mon[294160]: osdmap e128: 6 total, 6 up, 6 in
Feb 23 09:57:26 np0005626463.localdomain ceph-mon[294160]: pgmap v235: 177 pgs: 177 active+clean; 145 MiB data, 778 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:57:26 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.108:0/1659014629' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 09:57:26 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:57:26.561 265541 INFO neutron.agent.linux.ip_lib [None req-a47420e0-4163-4fed-9576-7de53f70d6c5 - - - - - -] Device tapb3ce4649-e6 cannot be used as it has no MAC address
Feb 23 09:57:26 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:57:26.580 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:57:26 np0005626463.localdomain kernel: device tapb3ce4649-e6 entered promiscuous mode
Feb 23 09:57:26 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:57:26Z|00218|binding|INFO|Claiming lport b3ce4649-e63b-416c-91b2-f98efa2cafaa for this chassis.
Feb 23 09:57:26 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:57:26Z|00219|binding|INFO|b3ce4649-e63b-416c-91b2-f98efa2cafaa: Claiming unknown
Feb 23 09:57:26 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:57:26.587 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:57:26 np0005626463.localdomain NetworkManager[5974]: <info>  [1771840646.5909] manager: (tapb3ce4649-e6): new Generic device (/org/freedesktop/NetworkManager/Devices/37)
Feb 23 09:57:26 np0005626463.localdomain systemd-udevd[314313]: Network interface NamePolicy= disabled on kernel command line.
Feb 23 09:57:26 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:57:26.600 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005626463.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpfb23302c-55c1-5de0-badf-4fc1ff22837a-bc69bb6e-6101-4390-bb1b-15e16ba6649d', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-bc69bb6e-6101-4390-bb1b-15e16ba6649d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6aadd525d3dd402cb701922115d00291', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f84e01b3-7a51-49d7-acad-f20a75f6eb9b, chassis=[<ovs.db.idl.Row object at 0x7f808c075610>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f808c075610>], logical_port=b3ce4649-e63b-416c-91b2-f98efa2cafaa) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 09:57:26 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:57:26.601 163572 INFO neutron.agent.ovn.metadata.agent [-] Port b3ce4649-e63b-416c-91b2-f98efa2cafaa in datapath bc69bb6e-6101-4390-bb1b-15e16ba6649d bound to our chassis
Feb 23 09:57:26 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:57:26.602 163572 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network bc69bb6e-6101-4390-bb1b-15e16ba6649d or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Feb 23 09:57:26 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:57:26.603 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[4e15933b-ad98-45fe-b495-704e6d07f7d9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 09:57:26 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tapb3ce4649-e6: No such device
Feb 23 09:57:26 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tapb3ce4649-e6: No such device
Feb 23 09:57:26 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:57:26Z|00220|binding|INFO|Setting lport b3ce4649-e63b-416c-91b2-f98efa2cafaa ovn-installed in OVS
Feb 23 09:57:26 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:57:26Z|00221|binding|INFO|Setting lport b3ce4649-e63b-416c-91b2-f98efa2cafaa up in Southbound
Feb 23 09:57:26 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:57:26.630 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:57:26 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tapb3ce4649-e6: No such device
Feb 23 09:57:26 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tapb3ce4649-e6: No such device
Feb 23 09:57:26 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tapb3ce4649-e6: No such device
Feb 23 09:57:26 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tapb3ce4649-e6: No such device
Feb 23 09:57:26 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tapb3ce4649-e6: No such device
Feb 23 09:57:26 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tapb3ce4649-e6: No such device
Feb 23 09:57:26 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:57:26.667 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:57:26 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:57:26.695 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:57:26 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:57:26.890 2 INFO neutron.agent.securitygroups_rpc [None req-0bf5f6c8-9298-4358-bc07-570223771cfc 51fa3c9fb33d4d329f38b19fc5b6099c ed50c6ac234342909e3b98d8ad22ae1a - - default default] Security group rule updated ['170ab605-ad28-44f3-a2b8-68fbb6ed92e0']
Feb 23 09:57:26 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:57:27 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:57:27.054 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:57:27 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:57:27.443 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:57:27 np0005626463.localdomain podman[314382]: 
Feb 23 09:57:27 np0005626463.localdomain podman[314382]: 2026-02-23 09:57:27.562528036 +0000 UTC m=+0.089799385 container create 94fddd27d989ddd5b7ff6b83a6707cecd5d35645999cbfb25441cd52ed72f424 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-bc69bb6e-6101-4390-bb1b-15e16ba6649d, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20260216)
Feb 23 09:57:27 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.
Feb 23 09:57:27 np0005626463.localdomain systemd[1]: Started libpod-conmon-94fddd27d989ddd5b7ff6b83a6707cecd5d35645999cbfb25441cd52ed72f424.scope.
Feb 23 09:57:27 np0005626463.localdomain systemd[1]: tmp-crun.63OFTK.mount: Deactivated successfully.
Feb 23 09:57:27 np0005626463.localdomain podman[314382]: 2026-02-23 09:57:27.520001773 +0000 UTC m=+0.047273172 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 23 09:57:27 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 09:57:27 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/06e3421c55fa244bcd15bc7912f7bbd56b819c3903bda310ddc27d1afd8ce870/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 23 09:57:27 np0005626463.localdomain podman[314382]: 2026-02-23 09:57:27.639465744 +0000 UTC m=+0.166737103 container init 94fddd27d989ddd5b7ff6b83a6707cecd5d35645999cbfb25441cd52ed72f424 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-bc69bb6e-6101-4390-bb1b-15e16ba6649d, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260216, io.buildah.version=1.43.0)
Feb 23 09:57:27 np0005626463.localdomain dnsmasq[314412]: started, version 2.85 cachesize 150
Feb 23 09:57:27 np0005626463.localdomain dnsmasq[314412]: DNS service limited to local subnets
Feb 23 09:57:27 np0005626463.localdomain dnsmasq[314412]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 23 09:57:27 np0005626463.localdomain dnsmasq[314412]: warning: no upstream servers configured
Feb 23 09:57:27 np0005626463.localdomain dnsmasq-dhcp[314412]: DHCP, static leases only on 10.100.0.0, lease time 1d
Feb 23 09:57:27 np0005626463.localdomain dnsmasq[314412]: read /var/lib/neutron/dhcp/bc69bb6e-6101-4390-bb1b-15e16ba6649d/addn_hosts - 0 addresses
Feb 23 09:57:27 np0005626463.localdomain dnsmasq-dhcp[314412]: read /var/lib/neutron/dhcp/bc69bb6e-6101-4390-bb1b-15e16ba6649d/host
Feb 23 09:57:27 np0005626463.localdomain dnsmasq-dhcp[314412]: read /var/lib/neutron/dhcp/bc69bb6e-6101-4390-bb1b-15e16ba6649d/opts
Feb 23 09:57:27 np0005626463.localdomain podman[314397]: 2026-02-23 09:57:27.691960446 +0000 UTC m=+0.092177719 container health_status 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, config_id=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, org.opencontainers.image.created=2026-02-05T04:57:10Z, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., architecture=x86_64, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.openshift.expose-services=, name=ubi9/ubi-minimal, build-date=2026-02-05T04:57:10Z, distribution-scope=public, vcs-type=git, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.7, release=1770267347)
Feb 23 09:57:27 np0005626463.localdomain podman[314382]: 2026-02-23 09:57:27.69922851 +0000 UTC m=+0.226499869 container start 94fddd27d989ddd5b7ff6b83a6707cecd5d35645999cbfb25441cd52ed72f424 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-bc69bb6e-6101-4390-bb1b-15e16ba6649d, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb)
Feb 23 09:57:27 np0005626463.localdomain podman[314397]: 2026-02-23 09:57:27.72933353 +0000 UTC m=+0.129550773 container exec_died 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, vendor=Red Hat, Inc., config_id=openstack_network_exporter, version=9.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1770267347, io.buildah.version=1.33.7, container_name=openstack_network_exporter, org.opencontainers.image.created=2026-02-05T04:57:10Z, distribution-scope=public, architecture=x86_64, io.openshift.tags=minimal rhel9, build-date=2026-02-05T04:57:10Z, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal)
Feb 23 09:57:27 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:57:27.734 2 INFO neutron.agent.securitygroups_rpc [None req-f7ddc684-702a-4c23-8012-3babeac19865 51fa3c9fb33d4d329f38b19fc5b6099c ed50c6ac234342909e3b98d8ad22ae1a - - default default] Security group rule updated ['170ab605-ad28-44f3-a2b8-68fbb6ed92e0']
Feb 23 09:57:27 np0005626463.localdomain systemd[1]: 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.service: Deactivated successfully.
Feb 23 09:57:27 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:57:27.860 265541 INFO neutron.agent.dhcp.agent [None req-ea245279-e00b-4bbe-be88-7daf5f708a6c - - - - - -] DHCP configuration for ports {'2321dadf-6b0b-4e84-985a-8e2e5348f794'} is completed
Feb 23 09:57:28 np0005626463.localdomain ceph-mon[294160]: pgmap v236: 177 pgs: 177 active+clean; 145 MiB data, 778 MiB used, 41 GiB / 42 GiB avail; 10 KiB/s rd, 1.6 KiB/s wr, 14 op/s
Feb 23 09:57:28 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:57:28.050 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:57:28 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:57:28.054 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:57:28 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:57:28.055 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:57:28 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:57:28.074 2 INFO neutron.agent.securitygroups_rpc [None req-40990976-1223-4b33-bf88-75d4d256e3c2 51fa3c9fb33d4d329f38b19fc5b6099c ed50c6ac234342909e3b98d8ad22ae1a - - default default] Security group rule updated ['170ab605-ad28-44f3-a2b8-68fbb6ed92e0']
Feb 23 09:57:28 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:57:28.079 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:57:28 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:57:28.079 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:57:28 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:57:28.080 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:57:28 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:57:28.102 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 09:57:28 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:57:28.102 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 09:57:28 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:57:28.103 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 09:57:28 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:57:28.103 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Auditing locally available compute resources for np0005626463.localdomain (node: np0005626463.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 23 09:57:28 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:57:28.104 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 09:57:28 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:57:28.308 2 INFO neutron.agent.securitygroups_rpc [None req-db0e7eff-9ed0-4b03-af25-1355357a4e27 51fa3c9fb33d4d329f38b19fc5b6099c ed50c6ac234342909e3b98d8ad22ae1a - - default default] Security group rule updated ['170ab605-ad28-44f3-a2b8-68fbb6ed92e0']
Feb 23 09:57:28 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:57:28.564 2 INFO neutron.agent.securitygroups_rpc [None req-31c5f26a-a932-4879-896f-b6452f801d39 3247f4a1ec054de78018b025a6933ab5 13ab81953d004010a22a72d978d31c4d - - default default] Security group member updated ['b31260d7-60e9-40f6-abcc-b3b02fd41e3c']
Feb 23 09:57:28 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 23 09:57:28 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/913075697' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 09:57:28 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:57:28.579 2 INFO neutron.agent.securitygroups_rpc [None req-6e46d629-2807-49ac-9764-efa5e97c15d9 51fa3c9fb33d4d329f38b19fc5b6099c ed50c6ac234342909e3b98d8ad22ae1a - - default default] Security group rule updated ['170ab605-ad28-44f3-a2b8-68fbb6ed92e0']
Feb 23 09:57:28 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:57:28.598 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.494s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 09:57:28 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:57:28.625 265541 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Feb 23 09:57:28 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:57:28.654 282211 DEBUG nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 23 09:57:28 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:57:28.655 282211 DEBUG nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 23 09:57:28 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:57:28.836 282211 WARNING nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 23 09:57:28 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:57:28.837 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Hypervisor/Node resource view: name=np0005626463.localdomain free_ram=11351MB free_disk=41.8366584777832GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 23 09:57:28 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:57:28.838 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 09:57:28 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:57:28.838 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 09:57:28 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:57:28.844 2 INFO neutron.agent.securitygroups_rpc [None req-6e38729b-cba3-4aa6-8317-eef924b8040d 51fa3c9fb33d4d329f38b19fc5b6099c ed50c6ac234342909e3b98d8ad22ae1a - - default default] Security group rule updated ['170ab605-ad28-44f3-a2b8-68fbb6ed92e0']
Feb 23 09:57:28 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:57:28.941 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Instance c2a7d92b-952f-46a7-8a6a-3322a48fcf4b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 23 09:57:28 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:57:28.942 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 23 09:57:28 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:57:28.942 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Final resource view: name=np0005626463.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 23 09:57:28 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:57:28.988 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 09:57:29 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:57:29.198 2 INFO neutron.agent.securitygroups_rpc [None req-08211a26-5630-4885-b1ea-4c86be58309d 51fa3c9fb33d4d329f38b19fc5b6099c ed50c6ac234342909e3b98d8ad22ae1a - - default default] Security group rule updated ['170ab605-ad28-44f3-a2b8-68fbb6ed92e0']
Feb 23 09:57:29 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.106:0/913075697' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 09:57:29 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 23 09:57:29 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/2744435039' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 09:57:29 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:57:29.462 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.474s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 09:57:29 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:57:29.469 282211 DEBUG nova.compute.provider_tree [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Inventory has not changed in ProviderTree for provider: be63d86c-a403-4ec9-a515-07ea2962cb4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 23 09:57:29 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:57:29.500 282211 DEBUG nova.scheduler.client.report [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Inventory has not changed for provider be63d86c-a403-4ec9-a515-07ea2962cb4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 23 09:57:29 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:57:29.504 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Compute_service record updated for np0005626463.localdomain:np0005626463.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 23 09:57:29 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:57:29.504 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.666s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 09:57:29 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:57:29.652 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:df:31:67 10.100.0.18 10.100.0.2 10.100.0.34'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.18/28 10.100.0.2/28 10.100.0.34/28', 'neutron:device_id': 'ovnmeta-c762e206-cc42-4a9e-b8ad-4f8da87fd30e', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c762e206-cc42-4a9e-b8ad-4f8da87fd30e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8532226521ac43ca82723a0b71168e03', 'neutron:revision_number': '6', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1cdd9294-518a-4cd4-8e14-e309ee77be41, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=a147cb7a-5506-4d6a-9946-52357210b7c0) old=Port_Binding(mac=['fa:16:3e:df:31:67 10.100.0.18 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.18/28 10.100.0.2/28', 'neutron:device_id': 'ovnmeta-c762e206-cc42-4a9e-b8ad-4f8da87fd30e', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c762e206-cc42-4a9e-b8ad-4f8da87fd30e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8532226521ac43ca82723a0b71168e03', 'neutron:revision_number': '5', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 09:57:29 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:57:29.654 163572 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port a147cb7a-5506-4d6a-9946-52357210b7c0 in datapath c762e206-cc42-4a9e-b8ad-4f8da87fd30e updated
Feb 23 09:57:29 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:57:29.657 163572 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c762e206-cc42-4a9e-b8ad-4f8da87fd30e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 23 09:57:29 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:57:29.658 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[df72594f-aac3-474f-bfdd-83ee0f141a58]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 09:57:29 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:57:29.674 265541 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Feb 23 09:57:29 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:57:29.848 2 INFO neutron.agent.securitygroups_rpc [None req-73652ffc-e47d-466c-823f-eff68fd895b7 51fa3c9fb33d4d329f38b19fc5b6099c ed50c6ac234342909e3b98d8ad22ae1a - - default default] Security group rule updated ['170ab605-ad28-44f3-a2b8-68fbb6ed92e0']
Feb 23 09:57:30 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:57:30.143 2 INFO neutron.agent.securitygroups_rpc [None req-fa7b29ab-7d8a-4213-9062-a27a98eacca6 51fa3c9fb33d4d329f38b19fc5b6099c ed50c6ac234342909e3b98d8ad22ae1a - - default default] Security group rule updated ['170ab605-ad28-44f3-a2b8-68fbb6ed92e0']
Feb 23 09:57:30 np0005626463.localdomain ceph-mon[294160]: pgmap v237: 177 pgs: 177 active+clean; 145 MiB data, 778 MiB used, 41 GiB / 42 GiB avail; 10 KiB/s rd, 1.6 KiB/s wr, 14 op/s
Feb 23 09:57:30 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.106:0/2744435039' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 09:57:31 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:57:31.273 2 INFO neutron.agent.securitygroups_rpc [None req-9681c33b-6d83-4d64-b717-f3c8315322b0 3247f4a1ec054de78018b025a6933ab5 13ab81953d004010a22a72d978d31c4d - - default default] Security group member updated ['b31260d7-60e9-40f6-abcc-b3b02fd41e3c']
Feb 23 09:57:31 np0005626463.localdomain ceph-mon[294160]: pgmap v238: 177 pgs: 177 active+clean; 169 MiB data, 858 MiB used, 41 GiB / 42 GiB avail; 16 KiB/s rd, 2.4 MiB/s wr, 24 op/s
Feb 23 09:57:31 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:57:31.403 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=15, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '22:68:bc', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'c6:19:65:94:49:af'}, ipsec=False) old=SB_Global(nb_cfg=14) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 09:57:31 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:57:31.404 163572 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 23 09:57:31 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:57:31.448 2 INFO neutron.agent.securitygroups_rpc [None req-e145e96f-8a3e-446a-b7a0-f3b217974e58 51fa3c9fb33d4d329f38b19fc5b6099c ed50c6ac234342909e3b98d8ad22ae1a - - default default] Security group rule updated ['170ab605-ad28-44f3-a2b8-68fbb6ed92e0']
Feb 23 09:57:31 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:57:31.454 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:57:32 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:57:32 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:57:32.150 2 INFO neutron.agent.securitygroups_rpc [None req-3adfef6e-10be-4bfb-8ba9-878d58df2fe8 7e2d209f03d04e6d86f4e10f7490cf37 8532226521ac43ca82723a0b71168e03 - - default default] Security group member updated ['dd2fbf50-7988-4b4c-88a5-46b24a60bfee', 'b620ad4c-7f28-46c7-a322-d11687a2bc43', '4833eb66-b753-4947-9fd8-0b38ba04d2e6']
Feb 23 09:57:32 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:57:32.480 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:57:33 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:57:33.053 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:57:33 np0005626463.localdomain ceph-mon[294160]: pgmap v239: 177 pgs: 177 active+clean; 209 MiB data, 946 MiB used, 41 GiB / 42 GiB avail; 24 KiB/s rd, 6.4 MiB/s wr, 36 op/s
Feb 23 09:57:33 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:57:33.636 2 INFO neutron.agent.securitygroups_rpc [None req-e46f16e8-8d8a-4032-be19-52144050a32d 7e2d209f03d04e6d86f4e10f7490cf37 8532226521ac43ca82723a0b71168e03 - - default default] Security group member updated ['dd2fbf50-7988-4b4c-88a5-46b24a60bfee', 'b620ad4c-7f28-46c7-a322-d11687a2bc43']
Feb 23 09:57:33 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:57:33.877 265541 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-23T09:57:33Z, description=, device_id=7647edfc-3361-4769-b093-9cecdc6821d1, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f2829be0460>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f2829be0fd0>], id=fd50e0c4-aa06-4c9b-842f-9aaadd740dc1, ip_allocation=immediate, mac_address=fa:16:3e:1a:97:03, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-23T09:57:22Z, description=, dns_domain=, id=bc69bb6e-6101-4390-bb1b-15e16ba6649d, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersTest-336483506, port_security_enabled=True, project_id=6aadd525d3dd402cb701922115d00291, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=9099, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1679, status=ACTIVE, subnets=['2defc5bd-3127-4ac7-b565-39a0a9014c59'], tags=[], tenant_id=6aadd525d3dd402cb701922115d00291, updated_at=2026-02-23T09:57:24Z, vlan_transparent=None, network_id=bc69bb6e-6101-4390-bb1b-15e16ba6649d, port_security_enabled=False, project_id=6aadd525d3dd402cb701922115d00291, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1739, status=DOWN, tags=[], tenant_id=6aadd525d3dd402cb701922115d00291, updated_at=2026-02-23T09:57:33Z on network bc69bb6e-6101-4390-bb1b-15e16ba6649d
Feb 23 09:57:33 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:57:33.911 2 INFO neutron.agent.securitygroups_rpc [None req-a5ef7891-4d9f-416c-9e05-7114c855bf40 51fa3c9fb33d4d329f38b19fc5b6099c ed50c6ac234342909e3b98d8ad22ae1a - - default default] Security group rule updated ['b8e8f331-db67-4dd9-802a-05abdcea8dd4']
Feb 23 09:57:34 np0005626463.localdomain podman[314483]: 2026-02-23 09:57:34.078166842 +0000 UTC m=+0.061277684 container kill 94fddd27d989ddd5b7ff6b83a6707cecd5d35645999cbfb25441cd52ed72f424 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-bc69bb6e-6101-4390-bb1b-15e16ba6649d, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260216, tcib_build_tag=8419493e1fd846703d277695e03fc5eb)
Feb 23 09:57:34 np0005626463.localdomain systemd[1]: tmp-crun.gVNq9F.mount: Deactivated successfully.
Feb 23 09:57:34 np0005626463.localdomain dnsmasq[314412]: read /var/lib/neutron/dhcp/bc69bb6e-6101-4390-bb1b-15e16ba6649d/addn_hosts - 1 addresses
Feb 23 09:57:34 np0005626463.localdomain dnsmasq-dhcp[314412]: read /var/lib/neutron/dhcp/bc69bb6e-6101-4390-bb1b-15e16ba6649d/host
Feb 23 09:57:34 np0005626463.localdomain dnsmasq-dhcp[314412]: read /var/lib/neutron/dhcp/bc69bb6e-6101-4390-bb1b-15e16ba6649d/opts
Feb 23 09:57:34 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:57:34.328 265541 INFO neutron.agent.dhcp.agent [None req-74e35aad-a10d-4d66-8adc-b15e3d8d9b18 - - - - - -] DHCP configuration for ports {'fd50e0c4-aa06-4c9b-842f-9aaadd740dc1'} is completed
Feb 23 09:57:34 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e128 do_prune osdmap full prune enabled
Feb 23 09:57:34 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e129 e129: 6 total, 6 up, 6 in
Feb 23 09:57:34 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e129: 6 total, 6 up, 6 in
Feb 23 09:57:34 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:57:34.870 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:57:35 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:57:35.069 265541 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-23T09:57:33Z, description=, device_id=7647edfc-3361-4769-b093-9cecdc6821d1, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f2829203070>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f28292038b0>], id=fd50e0c4-aa06-4c9b-842f-9aaadd740dc1, ip_allocation=immediate, mac_address=fa:16:3e:1a:97:03, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-23T09:57:22Z, description=, dns_domain=, id=bc69bb6e-6101-4390-bb1b-15e16ba6649d, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersTest-336483506, port_security_enabled=True, project_id=6aadd525d3dd402cb701922115d00291, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=9099, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1679, status=ACTIVE, subnets=['2defc5bd-3127-4ac7-b565-39a0a9014c59'], tags=[], tenant_id=6aadd525d3dd402cb701922115d00291, updated_at=2026-02-23T09:57:24Z, vlan_transparent=None, network_id=bc69bb6e-6101-4390-bb1b-15e16ba6649d, port_security_enabled=False, project_id=6aadd525d3dd402cb701922115d00291, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1739, status=DOWN, tags=[], tenant_id=6aadd525d3dd402cb701922115d00291, updated_at=2026-02-23T09:57:33Z on network bc69bb6e-6101-4390-bb1b-15e16ba6649d
Feb 23 09:57:35 np0005626463.localdomain dnsmasq[314412]: read /var/lib/neutron/dhcp/bc69bb6e-6101-4390-bb1b-15e16ba6649d/addn_hosts - 1 addresses
Feb 23 09:57:35 np0005626463.localdomain dnsmasq-dhcp[314412]: read /var/lib/neutron/dhcp/bc69bb6e-6101-4390-bb1b-15e16ba6649d/host
Feb 23 09:57:35 np0005626463.localdomain dnsmasq-dhcp[314412]: read /var/lib/neutron/dhcp/bc69bb6e-6101-4390-bb1b-15e16ba6649d/opts
Feb 23 09:57:35 np0005626463.localdomain podman[314521]: 2026-02-23 09:57:35.281703756 +0000 UTC m=+0.062363547 container kill 94fddd27d989ddd5b7ff6b83a6707cecd5d35645999cbfb25441cd52ed72f424 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-bc69bb6e-6101-4390-bb1b-15e16ba6649d, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Feb 23 09:57:35 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e129 do_prune osdmap full prune enabled
Feb 23 09:57:35 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e130 e130: 6 total, 6 up, 6 in
Feb 23 09:57:35 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e130: 6 total, 6 up, 6 in
Feb 23 09:57:35 np0005626463.localdomain ceph-mon[294160]: osdmap e129: 6 total, 6 up, 6 in
Feb 23 09:57:35 np0005626463.localdomain ceph-mon[294160]: pgmap v241: 177 pgs: 177 active+clean; 209 MiB data, 946 MiB used, 41 GiB / 42 GiB avail; 24 KiB/s rd, 6.4 MiB/s wr, 36 op/s
Feb 23 09:57:35 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:57:35.522 265541 INFO neutron.agent.dhcp.agent [None req-a64dc3ee-fcca-4965-b775-95ac3b7231c7 - - - - - -] DHCP configuration for ports {'fd50e0c4-aa06-4c9b-842f-9aaadd740dc1'} is completed
Feb 23 09:57:35 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:57:35.814 2 INFO neutron.agent.securitygroups_rpc [None req-db00dc8c-6ada-4161-9538-1913f2da78b1 51fa3c9fb33d4d329f38b19fc5b6099c ed50c6ac234342909e3b98d8ad22ae1a - - default default] Security group rule updated ['a98db953-42d0-4f19-90d3-a50bfc8bf55e']
Feb 23 09:57:36 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e130 do_prune osdmap full prune enabled
Feb 23 09:57:36 np0005626463.localdomain ceph-mon[294160]: osdmap e130: 6 total, 6 up, 6 in
Feb 23 09:57:36 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e131 e131: 6 total, 6 up, 6 in
Feb 23 09:57:36 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e131: 6 total, 6 up, 6 in
Feb 23 09:57:36 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:57:36.863 2 INFO neutron.agent.securitygroups_rpc [None req-e2d7d9c7-9cf6-4713-8a6e-85619a152752 51fa3c9fb33d4d329f38b19fc5b6099c ed50c6ac234342909e3b98d8ad22ae1a - - default default] Security group rule updated ['a98db953-42d0-4f19-90d3-a50bfc8bf55e']
Feb 23 09:57:37 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e131 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:57:37 np0005626463.localdomain ceph-mon[294160]: osdmap e131: 6 total, 6 up, 6 in
Feb 23 09:57:37 np0005626463.localdomain ceph-mon[294160]: pgmap v244: 177 pgs: 177 active+clean; 145 MiB data, 890 MiB used, 41 GiB / 42 GiB avail; 83 KiB/s rd, 15 MiB/s wr, 116 op/s
Feb 23 09:57:37 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.
Feb 23 09:57:37 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.
Feb 23 09:57:37 np0005626463.localdomain podman[314541]: 2026-02-23 09:57:37.918618634 +0000 UTC m=+0.080766936 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_controller)
Feb 23 09:57:37 np0005626463.localdomain podman[314541]: 2026-02-23 09:57:37.963554833 +0000 UTC m=+0.125703135 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20260216, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible)
Feb 23 09:57:37 np0005626463.localdomain systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully.
Feb 23 09:57:38 np0005626463.localdomain podman[314542]: 2026-02-23 09:57:37.965827193 +0000 UTC m=+0.123778095 container health_status bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Feb 23 09:57:38 np0005626463.localdomain podman[314542]: 2026-02-23 09:57:38.046016261 +0000 UTC m=+0.203967263 container exec_died bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Feb 23 09:57:38 np0005626463.localdomain systemd[1]: bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.service: Deactivated successfully.
Feb 23 09:57:38 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:57:38.059 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:57:39 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:57:39.005 2 INFO neutron.agent.securitygroups_rpc [None req-4377d041-aa51-4800-a96f-871d02773dd7 7e2d209f03d04e6d86f4e10f7490cf37 8532226521ac43ca82723a0b71168e03 - - default default] Security group member updated ['709ad995-bfde-4096-a0b4-2ba30248a611']
Feb 23 09:57:39 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:57:39.369 2 INFO neutron.agent.securitygroups_rpc [None req-0a6c1cd1-12af-4dd6-b394-41e125fa511a 51fa3c9fb33d4d329f38b19fc5b6099c ed50c6ac234342909e3b98d8ad22ae1a - - default default] Security group rule updated ['5db8e770-276e-4b00-beb9-c97310b59e62']
Feb 23 09:57:39 np0005626463.localdomain podman[242954]: time="2026-02-23T09:57:39Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 23 09:57:39 np0005626463.localdomain podman[242954]: @ - - [23/Feb/2026:09:57:39 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 158905 "" "Go-http-client/1.1"
Feb 23 09:57:39 np0005626463.localdomain podman[242954]: @ - - [23/Feb/2026:09:57:39 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19287 "" "Go-http-client/1.1"
Feb 23 09:57:39 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:57:39.752 2 INFO neutron.agent.securitygroups_rpc [None req-bb9f0bb2-0d36-4f02-872c-c348f9c20cb9 51fa3c9fb33d4d329f38b19fc5b6099c ed50c6ac234342909e3b98d8ad22ae1a - - default default] Security group rule updated ['5db8e770-276e-4b00-beb9-c97310b59e62']
Feb 23 09:57:40 np0005626463.localdomain ceph-mon[294160]: pgmap v245: 177 pgs: 177 active+clean; 145 MiB data, 890 MiB used, 41 GiB / 42 GiB avail; 69 KiB/s rd, 8.0 MiB/s wr, 96 op/s
Feb 23 09:57:40 np0005626463.localdomain dnsmasq[314412]: read /var/lib/neutron/dhcp/bc69bb6e-6101-4390-bb1b-15e16ba6649d/addn_hosts - 0 addresses
Feb 23 09:57:40 np0005626463.localdomain dnsmasq-dhcp[314412]: read /var/lib/neutron/dhcp/bc69bb6e-6101-4390-bb1b-15e16ba6649d/host
Feb 23 09:57:40 np0005626463.localdomain dnsmasq-dhcp[314412]: read /var/lib/neutron/dhcp/bc69bb6e-6101-4390-bb1b-15e16ba6649d/opts
Feb 23 09:57:40 np0005626463.localdomain podman[314605]: 2026-02-23 09:57:40.348846108 +0000 UTC m=+0.056730684 container kill 94fddd27d989ddd5b7ff6b83a6707cecd5d35645999cbfb25441cd52ed72f424 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-bc69bb6e-6101-4390-bb1b-15e16ba6649d, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Feb 23 09:57:40 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:57:40.406 163572 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=96b5bb93-7341-4ce6-9b93-6a5de566c711, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '15'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 09:57:40 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:57:40Z|00222|binding|INFO|Releasing lport b3ce4649-e63b-416c-91b2-f98efa2cafaa from this chassis (sb_readonly=0)
Feb 23 09:57:40 np0005626463.localdomain kernel: device tapb3ce4649-e6 left promiscuous mode
Feb 23 09:57:40 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:57:40Z|00223|binding|INFO|Setting lport b3ce4649-e63b-416c-91b2-f98efa2cafaa down in Southbound
Feb 23 09:57:40 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:57:40.517 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:57:40 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:57:40.529 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005626463.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpfb23302c-55c1-5de0-badf-4fc1ff22837a-bc69bb6e-6101-4390-bb1b-15e16ba6649d', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-bc69bb6e-6101-4390-bb1b-15e16ba6649d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6aadd525d3dd402cb701922115d00291', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005626463.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f84e01b3-7a51-49d7-acad-f20a75f6eb9b, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f808c075610>], logical_port=b3ce4649-e63b-416c-91b2-f98efa2cafaa) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f808c075610>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 09:57:40 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:57:40.531 163572 INFO neutron.agent.ovn.metadata.agent [-] Port b3ce4649-e63b-416c-91b2-f98efa2cafaa in datapath bc69bb6e-6101-4390-bb1b-15e16ba6649d unbound from our chassis
Feb 23 09:57:40 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:57:40.534 163572 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network bc69bb6e-6101-4390-bb1b-15e16ba6649d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 23 09:57:40 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:57:40.536 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:57:40 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:57:40.536 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[5ddbd6da-e2df-4132-a083-423c772717e0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 09:57:40 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:57:40Z|00224|binding|INFO|Releasing lport 4143c8ea-7577-4792-9744-bcff90eb20f2 from this chassis (sb_readonly=0)
Feb 23 09:57:40 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:57:40.589 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:57:41 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:57:41.069 2 INFO neutron.agent.securitygroups_rpc [None req-0a45a65f-615e-41b1-9ed5-950f0c0558fe 51fa3c9fb33d4d329f38b19fc5b6099c ed50c6ac234342909e3b98d8ad22ae1a - - default default] Security group rule updated ['677a40c3-5537-439d-a2b8-f8dc0d2877b8']
Feb 23 09:57:41 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:57:41.655 2 INFO neutron.agent.securitygroups_rpc [None req-e4560e49-35cf-47a1-b71e-2f4b224de7c3 51fa3c9fb33d4d329f38b19fc5b6099c ed50c6ac234342909e3b98d8ad22ae1a - - default default] Security group rule updated ['677a40c3-5537-439d-a2b8-f8dc0d2877b8']
Feb 23 09:57:42 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e131 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:57:42 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e131 do_prune osdmap full prune enabled
Feb 23 09:57:42 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e132 e132: 6 total, 6 up, 6 in
Feb 23 09:57:42 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e132: 6 total, 6 up, 6 in
Feb 23 09:57:42 np0005626463.localdomain dnsmasq[314412]: exiting on receipt of SIGTERM
Feb 23 09:57:42 np0005626463.localdomain podman[314642]: 2026-02-23 09:57:42.077034632 +0000 UTC m=+0.052614537 container kill 94fddd27d989ddd5b7ff6b83a6707cecd5d35645999cbfb25441cd52ed72f424 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-bc69bb6e-6101-4390-bb1b-15e16ba6649d, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true)
Feb 23 09:57:42 np0005626463.localdomain systemd[1]: libpod-94fddd27d989ddd5b7ff6b83a6707cecd5d35645999cbfb25441cd52ed72f424.scope: Deactivated successfully.
Feb 23 09:57:42 np0005626463.localdomain podman[314654]: 2026-02-23 09:57:42.123511667 +0000 UTC m=+0.034735903 container died 94fddd27d989ddd5b7ff6b83a6707cecd5d35645999cbfb25441cd52ed72f424 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-bc69bb6e-6101-4390-bb1b-15e16ba6649d, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2)
Feb 23 09:57:42 np0005626463.localdomain systemd[1]: tmp-crun.PZaKFQ.mount: Deactivated successfully.
Feb 23 09:57:42 np0005626463.localdomain podman[314654]: 2026-02-23 09:57:42.149518601 +0000 UTC m=+0.060742827 container cleanup 94fddd27d989ddd5b7ff6b83a6707cecd5d35645999cbfb25441cd52ed72f424 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-bc69bb6e-6101-4390-bb1b-15e16ba6649d, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260216)
Feb 23 09:57:42 np0005626463.localdomain systemd[1]: libpod-conmon-94fddd27d989ddd5b7ff6b83a6707cecd5d35645999cbfb25441cd52ed72f424.scope: Deactivated successfully.
Feb 23 09:57:42 np0005626463.localdomain ceph-mon[294160]: pgmap v246: 177 pgs: 177 active+clean; 145 MiB data, 795 MiB used, 41 GiB / 42 GiB avail; 66 KiB/s rd, 7.1 MiB/s wr, 94 op/s
Feb 23 09:57:42 np0005626463.localdomain ceph-mon[294160]: osdmap e132: 6 total, 6 up, 6 in
Feb 23 09:57:42 np0005626463.localdomain podman[314656]: 2026-02-23 09:57:42.238856571 +0000 UTC m=+0.142361960 container remove 94fddd27d989ddd5b7ff6b83a6707cecd5d35645999cbfb25441cd52ed72f424 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-bc69bb6e-6101-4390-bb1b-15e16ba6649d, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Feb 23 09:57:42 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:57:42.337 2 INFO neutron.agent.securitygroups_rpc [None req-602c32a6-283b-418b-b086-4a732e23eda9 51fa3c9fb33d4d329f38b19fc5b6099c ed50c6ac234342909e3b98d8ad22ae1a - - default default] Security group rule updated ['677a40c3-5537-439d-a2b8-f8dc0d2877b8']
Feb 23 09:57:42 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:57:42.613 265541 INFO neutron.agent.dhcp.agent [None req-57e28c15-8e1b-4ab2-9b60-28485b2c37cb - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 23 09:57:42 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:57:42.776 2 INFO neutron.agent.securitygroups_rpc [None req-dd2fa8de-dc79-4abe-8c24-c96dc8537d15 51fa3c9fb33d4d329f38b19fc5b6099c ed50c6ac234342909e3b98d8ad22ae1a - - default default] Security group rule updated ['677a40c3-5537-439d-a2b8-f8dc0d2877b8']
Feb 23 09:57:42 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.
Feb 23 09:57:42 np0005626463.localdomain podman[314680]: 2026-02-23 09:57:42.878160553 +0000 UTC m=+0.079072435 container health_status be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ceilometer_agent_compute)
Feb 23 09:57:42 np0005626463.localdomain podman[314680]: 2026-02-23 09:57:42.893366533 +0000 UTC m=+0.094278405 container exec_died be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ceilometer_agent_compute, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute)
Feb 23 09:57:42 np0005626463.localdomain systemd[1]: be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.service: Deactivated successfully.
Feb 23 09:57:43 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:57:43.058 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:57:43 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:57:43.062 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:57:43 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-06e3421c55fa244bcd15bc7912f7bbd56b819c3903bda310ddc27d1afd8ce870-merged.mount: Deactivated successfully.
Feb 23 09:57:43 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-94fddd27d989ddd5b7ff6b83a6707cecd5d35645999cbfb25441cd52ed72f424-userdata-shm.mount: Deactivated successfully.
Feb 23 09:57:43 np0005626463.localdomain systemd[1]: run-netns-qdhcp\x2dbc69bb6e\x2d6101\x2d4390\x2dbb1b\x2d15e16ba6649d.mount: Deactivated successfully.
Feb 23 09:57:43 np0005626463.localdomain openstack_network_exporter[245358]: ERROR   09:57:43 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 23 09:57:43 np0005626463.localdomain openstack_network_exporter[245358]: 
Feb 23 09:57:43 np0005626463.localdomain openstack_network_exporter[245358]: ERROR   09:57:43 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 23 09:57:43 np0005626463.localdomain openstack_network_exporter[245358]: 
Feb 23 09:57:43 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:57:43.639 265541 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Feb 23 09:57:43 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:57:43.672 2 INFO neutron.agent.securitygroups_rpc [None req-5c824a03-c3ca-4f5e-acd4-18b4108691ee 51fa3c9fb33d4d329f38b19fc5b6099c ed50c6ac234342909e3b98d8ad22ae1a - - default default] Security group rule updated ['677a40c3-5537-439d-a2b8-f8dc0d2877b8']
Feb 23 09:57:44 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:57:44.203 2 INFO neutron.agent.securitygroups_rpc [None req-9496869f-6aca-4c6c-85b6-7d3756f27d34 51fa3c9fb33d4d329f38b19fc5b6099c ed50c6ac234342909e3b98d8ad22ae1a - - default default] Security group rule updated ['677a40c3-5537-439d-a2b8-f8dc0d2877b8']
Feb 23 09:57:44 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e132 do_prune osdmap full prune enabled
Feb 23 09:57:44 np0005626463.localdomain ceph-mon[294160]: pgmap v248: 177 pgs: 177 active+clean; 145 MiB data, 795 MiB used, 41 GiB / 42 GiB avail; 58 KiB/s rd, 6.2 MiB/s wr, 82 op/s
Feb 23 09:57:44 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e133 e133: 6 total, 6 up, 6 in
Feb 23 09:57:44 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e133: 6 total, 6 up, 6 in
Feb 23 09:57:44 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:57:44.983 2 INFO neutron.agent.securitygroups_rpc [None req-18778c88-b1b9-4b6b-be87-89332e5bf54d 51fa3c9fb33d4d329f38b19fc5b6099c ed50c6ac234342909e3b98d8ad22ae1a - - default default] Security group rule updated ['15676482-e837-4bed-9cab-0aada6b790b9']
Feb 23 09:57:45 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e133 do_prune osdmap full prune enabled
Feb 23 09:57:45 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e134 e134: 6 total, 6 up, 6 in
Feb 23 09:57:45 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e134: 6 total, 6 up, 6 in
Feb 23 09:57:45 np0005626463.localdomain ceph-mon[294160]: osdmap e133: 6 total, 6 up, 6 in
Feb 23 09:57:45 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:57:45.718 2 INFO neutron.agent.securitygroups_rpc [None req-d9005b4c-4b89-4219-884c-0c4473a3114a 0bcd1a517bf5477491d448b5d8ebf7eb ef475d924469485f883dd5a9d719a22d - - default default] Security group rule updated ['3ef9048d-1c37-421d-bb50-73975b08bdfd']
Feb 23 09:57:45 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.
Feb 23 09:57:45 np0005626463.localdomain podman[314699]: 2026-02-23 09:57:45.905262026 +0000 UTC m=+0.087154023 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Feb 23 09:57:45 np0005626463.localdomain podman[314699]: 2026-02-23 09:57:45.935176761 +0000 UTC m=+0.117068748 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 23 09:57:45 np0005626463.localdomain systemd[1]: 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully.
Feb 23 09:57:46 np0005626463.localdomain ceph-mon[294160]: pgmap v250: 177 pgs: 177 active+clean; 145 MiB data, 795 MiB used, 41 GiB / 42 GiB avail; 3.9 KiB/s rd, 639 B/s wr, 6 op/s
Feb 23 09:57:46 np0005626463.localdomain ceph-mon[294160]: osdmap e134: 6 total, 6 up, 6 in
Feb 23 09:57:47 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:57:47 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:57:47.122 2 INFO neutron.agent.securitygroups_rpc [None req-16ac4a4c-9b36-487b-bcc5-bd14fe4ab634 b34a74d70b104d598259f3881ae86305 8635084f010e445d861ab634b753fa27 - - default default] Security group member updated ['80d4c661-e254-4356-81d1-bc4c19a37e6b']
Feb 23 09:57:47 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e134 do_prune osdmap full prune enabled
Feb 23 09:57:47 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e135 e135: 6 total, 6 up, 6 in
Feb 23 09:57:47 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e135: 6 total, 6 up, 6 in
Feb 23 09:57:47 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:57:47.529 2 INFO neutron.agent.securitygroups_rpc [None req-28134d87-6144-4ae7-b9ec-7045d33170e4 4e19ac6dec8e40fbad0c3f681ec14665 6aadd525d3dd402cb701922115d00291 - - default default] Security group member updated ['a015e445-a8f1-4c73-9375-43b03b806b24']
Feb 23 09:57:47 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:57:47.590 265541 INFO neutron.agent.linux.ip_lib [None req-50dfad2c-221c-4076-8f93-d22270139bf1 - - - - - -] Device tap9fe0c543-bd cannot be used as it has no MAC address
Feb 23 09:57:47 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:57:47.617 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:57:47 np0005626463.localdomain kernel: device tap9fe0c543-bd entered promiscuous mode
Feb 23 09:57:47 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:57:47Z|00225|binding|INFO|Claiming lport 9fe0c543-bd4c-4645-bfe3-41c26546041f for this chassis.
Feb 23 09:57:47 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:57:47Z|00226|binding|INFO|9fe0c543-bd4c-4645-bfe3-41c26546041f: Claiming unknown
Feb 23 09:57:47 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:57:47.627 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:57:47 np0005626463.localdomain NetworkManager[5974]: <info>  [1771840667.6297] manager: (tap9fe0c543-bd): new Generic device (/org/freedesktop/NetworkManager/Devices/38)
Feb 23 09:57:47 np0005626463.localdomain systemd-udevd[314727]: Network interface NamePolicy= disabled on kernel command line.
Feb 23 09:57:47 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:57:47.635 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005626463.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpfb23302c-55c1-5de0-badf-4fc1ff22837a-003df3c3-8c3a-4476-9564-6a5246acd7a3', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-003df3c3-8c3a-4476-9564-6a5246acd7a3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6aadd525d3dd402cb701922115d00291', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7d02ccbf-9c2f-41fe-937e-ed56e04b90fb, chassis=[<ovs.db.idl.Row object at 0x7f808c075610>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f808c075610>], logical_port=9fe0c543-bd4c-4645-bfe3-41c26546041f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 09:57:47 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:57:47.637 163572 INFO neutron.agent.ovn.metadata.agent [-] Port 9fe0c543-bd4c-4645-bfe3-41c26546041f in datapath 003df3c3-8c3a-4476-9564-6a5246acd7a3 bound to our chassis
Feb 23 09:57:47 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:57:47.639 163572 DEBUG neutron.agent.ovn.metadata.agent [-] Port 5306cfa7-96e5-4d20-b7cc-0e879b95e6f5 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Feb 23 09:57:47 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:57:47.640 163572 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 003df3c3-8c3a-4476-9564-6a5246acd7a3, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 23 09:57:47 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:57:47.640 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[c0aeb981-7e96-42a2-a695-36ea1c9215d8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 09:57:47 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap9fe0c543-bd: No such device
Feb 23 09:57:47 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:57:47Z|00227|binding|INFO|Setting lport 9fe0c543-bd4c-4645-bfe3-41c26546041f ovn-installed in OVS
Feb 23 09:57:47 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:57:47Z|00228|binding|INFO|Setting lport 9fe0c543-bd4c-4645-bfe3-41c26546041f up in Southbound
Feb 23 09:57:47 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:57:47.663 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:57:47 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap9fe0c543-bd: No such device
Feb 23 09:57:47 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap9fe0c543-bd: No such device
Feb 23 09:57:47 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap9fe0c543-bd: No such device
Feb 23 09:57:47 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap9fe0c543-bd: No such device
Feb 23 09:57:47 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap9fe0c543-bd: No such device
Feb 23 09:57:47 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap9fe0c543-bd: No such device
Feb 23 09:57:47 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap9fe0c543-bd: No such device
Feb 23 09:57:47 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:57:47.714 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:57:47 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:57:47.741 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:57:48 np0005626463.localdomain ceph-mon[294160]: pgmap v252: 177 pgs: 177 active+clean; 145 MiB data, 796 MiB used, 41 GiB / 42 GiB avail; 853 B/s rd, 1023 B/s wr, 2 op/s
Feb 23 09:57:48 np0005626463.localdomain ceph-mon[294160]: osdmap e135: 6 total, 6 up, 6 in
Feb 23 09:57:48 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:57:48.062 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:57:48 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:57:48.167 265541 INFO neutron.agent.linux.ip_lib [None req-fdd09cea-58fb-4227-ad7f-785b75fd49b9 - - - - - -] Device tap908116f5-e2 cannot be used as it has no MAC address
Feb 23 09:57:48 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:57:48.191 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:57:48 np0005626463.localdomain kernel: device tap908116f5-e2 entered promiscuous mode
Feb 23 09:57:48 np0005626463.localdomain NetworkManager[5974]: <info>  [1771840668.1972] manager: (tap908116f5-e2): new Generic device (/org/freedesktop/NetworkManager/Devices/39)
Feb 23 09:57:48 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:57:48Z|00229|binding|INFO|Claiming lport 908116f5-e230-40e0-818e-5844b37f3a2c for this chassis.
Feb 23 09:57:48 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:57:48Z|00230|binding|INFO|908116f5-e230-40e0-818e-5844b37f3a2c: Claiming unknown
Feb 23 09:57:48 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:57:48.199 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:57:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:57:48.212 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005626463.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpfb23302c-55c1-5de0-badf-4fc1ff22837a-c843867f-296c-42fa-9f8c-55712f0f7c56', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c843867f-296c-42fa-9f8c-55712f0f7c56', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'acbc03f0564045b8857e1689cfa4a66d', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8df8eb84-f846-4c21-a0a4-93d19730bc64, chassis=[<ovs.db.idl.Row object at 0x7f808c075610>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f808c075610>], logical_port=908116f5-e230-40e0-818e-5844b37f3a2c) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 09:57:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:57:48.214 163572 INFO neutron.agent.ovn.metadata.agent [-] Port 908116f5-e230-40e0-818e-5844b37f3a2c in datapath c843867f-296c-42fa-9f8c-55712f0f7c56 bound to our chassis
Feb 23 09:57:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:57:48.215 163572 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network c843867f-296c-42fa-9f8c-55712f0f7c56 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Feb 23 09:57:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:57:48.216 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[a3b29065-bb4e-47cb-aaf2-0ed8f7fde857]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 09:57:48 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:57:48Z|00231|binding|INFO|Setting lport 908116f5-e230-40e0-818e-5844b37f3a2c ovn-installed in OVS
Feb 23 09:57:48 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:57:48Z|00232|binding|INFO|Setting lport 908116f5-e230-40e0-818e-5844b37f3a2c up in Southbound
Feb 23 09:57:48 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:57:48.236 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:57:48 np0005626463.localdomain systemd-journald[47710]: Data hash table of /run/log/journal/c0212a8b024a111cfc61293864f36c87/system.journal has a fill level at 75.0 (53724 of 71630 items, 25165824 file size, 468 bytes per hash table item), suggesting rotation.
Feb 23 09:57:48 np0005626463.localdomain systemd-journald[47710]: /run/log/journal/c0212a8b024a111cfc61293864f36c87/system.journal: Journal header limits reached or header out-of-date, rotating.
Feb 23 09:57:48 np0005626463.localdomain rsyslogd[758]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Feb 23 09:57:48 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:57:48.286 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:57:48 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:57:48.300 2 INFO neutron.agent.securitygroups_rpc [None req-16297440-fc3c-47fa-be29-1571a923d41c b34a74d70b104d598259f3881ae86305 8635084f010e445d861ab634b753fa27 - - default default] Security group member updated ['80d4c661-e254-4356-81d1-bc4c19a37e6b']
Feb 23 09:57:48 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:57:48.327 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:57:48 np0005626463.localdomain rsyslogd[758]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Feb 23 09:57:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:57:48.558 163572 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 09:57:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:57:48.558 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 09:57:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:57:48.559 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 09:57:48 np0005626463.localdomain podman[314833]: 
Feb 23 09:57:48 np0005626463.localdomain podman[314833]: 2026-02-23 09:57:48.812460565 +0000 UTC m=+0.095519642 container create bfce5c19dc23ffedf53633ad7904ac4bc4ec469f4af130c1a3d18600db50fa79 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-003df3c3-8c3a-4476-9564-6a5246acd7a3, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 23 09:57:48 np0005626463.localdomain systemd[1]: Started libpod-conmon-bfce5c19dc23ffedf53633ad7904ac4bc4ec469f4af130c1a3d18600db50fa79.scope.
Feb 23 09:57:48 np0005626463.localdomain podman[314833]: 2026-02-23 09:57:48.767850007 +0000 UTC m=+0.050909124 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 23 09:57:48 np0005626463.localdomain systemd[1]: tmp-crun.sxckDD.mount: Deactivated successfully.
Feb 23 09:57:48 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 09:57:48 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ef8e07bef3889686b85bfa5eac244cc48bec7dc7da70c4883c2434cba185c80b/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 23 09:57:48 np0005626463.localdomain podman[314833]: 2026-02-23 09:57:48.90811565 +0000 UTC m=+0.191174717 container init bfce5c19dc23ffedf53633ad7904ac4bc4ec469f4af130c1a3d18600db50fa79 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-003df3c3-8c3a-4476-9564-6a5246acd7a3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.license=GPLv2, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260216)
Feb 23 09:57:48 np0005626463.localdomain podman[314833]: 2026-02-23 09:57:48.917257713 +0000 UTC m=+0.200316780 container start bfce5c19dc23ffedf53633ad7904ac4bc4ec469f4af130c1a3d18600db50fa79 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-003df3c3-8c3a-4476-9564-6a5246acd7a3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20260216, maintainer=OpenStack Kubernetes Operator team)
Feb 23 09:57:48 np0005626463.localdomain dnsmasq[314857]: started, version 2.85 cachesize 150
Feb 23 09:57:48 np0005626463.localdomain dnsmasq[314857]: DNS service limited to local subnets
Feb 23 09:57:48 np0005626463.localdomain dnsmasq[314857]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 23 09:57:48 np0005626463.localdomain dnsmasq[314857]: warning: no upstream servers configured
Feb 23 09:57:48 np0005626463.localdomain dnsmasq-dhcp[314857]: DHCP, static leases only on 10.100.0.0, lease time 1d
Feb 23 09:57:48 np0005626463.localdomain dnsmasq[314857]: read /var/lib/neutron/dhcp/003df3c3-8c3a-4476-9564-6a5246acd7a3/addn_hosts - 0 addresses
Feb 23 09:57:48 np0005626463.localdomain dnsmasq-dhcp[314857]: read /var/lib/neutron/dhcp/003df3c3-8c3a-4476-9564-6a5246acd7a3/host
Feb 23 09:57:48 np0005626463.localdomain dnsmasq-dhcp[314857]: read /var/lib/neutron/dhcp/003df3c3-8c3a-4476-9564-6a5246acd7a3/opts
Feb 23 09:57:48 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:57:48.973 265541 INFO neutron.agent.dhcp.agent [None req-9f7599b3-6ad5-4943-8708-f6fdebdfcb2c - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-23T09:57:47Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f28292469a0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f28292468e0>], id=67870819-392f-4850-8a0e-4545b8c9e49b, ip_allocation=immediate, mac_address=fa:16:3e:6d:29:d1, name=tempest-RoutersTest-9259871, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-23T09:57:44Z, description=, dns_domain=, id=003df3c3-8c3a-4476-9564-6a5246acd7a3, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersTest-1090077587, port_security_enabled=True, project_id=6aadd525d3dd402cb701922115d00291, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=11935, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1789, status=ACTIVE, subnets=['21a37bfd-fe0e-4035-abac-601c14169356'], tags=[], tenant_id=6aadd525d3dd402cb701922115d00291, updated_at=2026-02-23T09:57:45Z, vlan_transparent=None, network_id=003df3c3-8c3a-4476-9564-6a5246acd7a3, port_security_enabled=True, project_id=6aadd525d3dd402cb701922115d00291, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['a015e445-a8f1-4c73-9375-43b03b806b24'], standard_attr_id=1844, status=DOWN, tags=[], tenant_id=6aadd525d3dd402cb701922115d00291, updated_at=2026-02-23T09:57:47Z on network 003df3c3-8c3a-4476-9564-6a5246acd7a3
Feb 23 09:57:49 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e135 do_prune osdmap full prune enabled
Feb 23 09:57:49 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:57:49.063 265541 INFO neutron.agent.dhcp.agent [None req-a52d8f5a-876f-4f5b-96e0-52e204ec051f - - - - - -] DHCP configuration for ports {'2d82a807-e8c0-4bc7-82c9-ebbf52a48105'} is completed
Feb 23 09:57:49 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e136 e136: 6 total, 6 up, 6 in
Feb 23 09:57:49 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e136: 6 total, 6 up, 6 in
Feb 23 09:57:49 np0005626463.localdomain dnsmasq[314857]: read /var/lib/neutron/dhcp/003df3c3-8c3a-4476-9564-6a5246acd7a3/addn_hosts - 1 addresses
Feb 23 09:57:49 np0005626463.localdomain dnsmasq-dhcp[314857]: read /var/lib/neutron/dhcp/003df3c3-8c3a-4476-9564-6a5246acd7a3/host
Feb 23 09:57:49 np0005626463.localdomain dnsmasq-dhcp[314857]: read /var/lib/neutron/dhcp/003df3c3-8c3a-4476-9564-6a5246acd7a3/opts
Feb 23 09:57:49 np0005626463.localdomain podman[314907]: 2026-02-23 09:57:49.234198315 +0000 UTC m=+0.050487191 container kill bfce5c19dc23ffedf53633ad7904ac4bc4ec469f4af130c1a3d18600db50fa79 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-003df3c3-8c3a-4476-9564-6a5246acd7a3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.43.0)
Feb 23 09:57:49 np0005626463.localdomain podman[314895]: 
Feb 23 09:57:49 np0005626463.localdomain podman[314895]: 2026-02-23 09:57:49.255286597 +0000 UTC m=+0.130906456 container create 61844cecbb88e05cd64c9c8c1b4fbab159f87a7bad8c1686f8d1bec04920bec1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c843867f-296c-42fa-9f8c-55712f0f7c56, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260216, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.license=GPLv2, tcib_managed=true)
Feb 23 09:57:49 np0005626463.localdomain systemd[1]: Started libpod-conmon-61844cecbb88e05cd64c9c8c1b4fbab159f87a7bad8c1686f8d1bec04920bec1.scope.
Feb 23 09:57:49 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 09:57:49 np0005626463.localdomain podman[314895]: 2026-02-23 09:57:49.211500053 +0000 UTC m=+0.087119952 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 23 09:57:49 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/72bec57338a9b4c2dc713f40ed6af99e9442fa24f4fd216de691c92203a3e570/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 23 09:57:49 np0005626463.localdomain podman[314895]: 2026-02-23 09:57:49.319453509 +0000 UTC m=+0.195073338 container init 61844cecbb88e05cd64c9c8c1b4fbab159f87a7bad8c1686f8d1bec04920bec1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c843867f-296c-42fa-9f8c-55712f0f7c56, tcib_managed=true, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Feb 23 09:57:49 np0005626463.localdomain podman[314895]: 2026-02-23 09:57:49.327834908 +0000 UTC m=+0.203454757 container start 61844cecbb88e05cd64c9c8c1b4fbab159f87a7bad8c1686f8d1bec04920bec1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c843867f-296c-42fa-9f8c-55712f0f7c56, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20260216)
Feb 23 09:57:49 np0005626463.localdomain dnsmasq[314933]: started, version 2.85 cachesize 150
Feb 23 09:57:49 np0005626463.localdomain dnsmasq[314933]: DNS service limited to local subnets
Feb 23 09:57:49 np0005626463.localdomain dnsmasq[314933]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 23 09:57:49 np0005626463.localdomain dnsmasq[314933]: warning: no upstream servers configured
Feb 23 09:57:49 np0005626463.localdomain dnsmasq-dhcp[314933]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Feb 23 09:57:49 np0005626463.localdomain dnsmasq[314933]: read /var/lib/neutron/dhcp/c843867f-296c-42fa-9f8c-55712f0f7c56/addn_hosts - 0 addresses
Feb 23 09:57:49 np0005626463.localdomain dnsmasq-dhcp[314933]: read /var/lib/neutron/dhcp/c843867f-296c-42fa-9f8c-55712f0f7c56/host
Feb 23 09:57:49 np0005626463.localdomain dnsmasq-dhcp[314933]: read /var/lib/neutron/dhcp/c843867f-296c-42fa-9f8c-55712f0f7c56/opts
Feb 23 09:57:49 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:57:49.545 265541 INFO neutron.agent.dhcp.agent [None req-78f02289-4e63-43d9-9067-891e262dc7a4 - - - - - -] DHCP configuration for ports {'67870819-392f-4850-8a0e-4545b8c9e49b', '8f12aa06-ee55-4f04-a11b-abfcbb418947'} is completed
Feb 23 09:57:49 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:57:49.584 2 INFO neutron.agent.securitygroups_rpc [None req-35663660-8b26-425d-b466-39f75aa64f72 b34a74d70b104d598259f3881ae86305 8635084f010e445d861ab634b753fa27 - - default default] Security group member updated ['80d4c661-e254-4356-81d1-bc4c19a37e6b']
Feb 23 09:57:50 np0005626463.localdomain ceph-mon[294160]: osdmap e136: 6 total, 6 up, 6 in
Feb 23 09:57:50 np0005626463.localdomain ceph-mon[294160]: pgmap v255: 177 pgs: 177 active+clean; 145 MiB data, 796 MiB used, 41 GiB / 42 GiB avail; 1.0 KiB/s rd, 1.2 KiB/s wr, 3 op/s
Feb 23 09:57:50 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:57:50.371 2 INFO neutron.agent.securitygroups_rpc [None req-2cc4babb-57fc-4ffe-921b-a7d431fda5c5 b34a74d70b104d598259f3881ae86305 8635084f010e445d861ab634b753fa27 - - default default] Security group member updated ['80d4c661-e254-4356-81d1-bc4c19a37e6b']
Feb 23 09:57:51 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:57:51.096 265541 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-23T09:57:47Z, description=, device_id=e9c5e1e3-ec18-4894-90a9-76e50883b72e, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f2829bea880>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f2829262d60>], id=67870819-392f-4850-8a0e-4545b8c9e49b, ip_allocation=immediate, mac_address=fa:16:3e:6d:29:d1, name=tempest-RoutersTest-9259871, network_id=003df3c3-8c3a-4476-9564-6a5246acd7a3, port_security_enabled=True, project_id=6aadd525d3dd402cb701922115d00291, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=2, security_groups=['a015e445-a8f1-4c73-9375-43b03b806b24'], standard_attr_id=1844, status=DOWN, tags=[], tenant_id=6aadd525d3dd402cb701922115d00291, updated_at=2026-02-23T09:57:48Z on network 003df3c3-8c3a-4476-9564-6a5246acd7a3
Feb 23 09:57:51 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/2069526124' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 23 09:57:51 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/2069526124' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 23 09:57:51 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:57:51.259 2 INFO neutron.agent.securitygroups_rpc [None req-a96e8b20-0ccf-41bb-a529-34f8cbad2842 b34a74d70b104d598259f3881ae86305 8635084f010e445d861ab634b753fa27 - - default default] Security group member updated ['80d4c661-e254-4356-81d1-bc4c19a37e6b']
Feb 23 09:57:51 np0005626463.localdomain podman[314954]: 2026-02-23 09:57:51.318727256 +0000 UTC m=+0.060599682 container kill bfce5c19dc23ffedf53633ad7904ac4bc4ec469f4af130c1a3d18600db50fa79 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-003df3c3-8c3a-4476-9564-6a5246acd7a3, org.label-schema.license=GPLv2, org.label-schema.build-date=20260216, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.schema-version=1.0)
Feb 23 09:57:51 np0005626463.localdomain dnsmasq[314857]: read /var/lib/neutron/dhcp/003df3c3-8c3a-4476-9564-6a5246acd7a3/addn_hosts - 1 addresses
Feb 23 09:57:51 np0005626463.localdomain dnsmasq-dhcp[314857]: read /var/lib/neutron/dhcp/003df3c3-8c3a-4476-9564-6a5246acd7a3/host
Feb 23 09:57:51 np0005626463.localdomain dnsmasq-dhcp[314857]: read /var/lib/neutron/dhcp/003df3c3-8c3a-4476-9564-6a5246acd7a3/opts
Feb 23 09:57:51 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:57:51.534 265541 INFO neutron.agent.dhcp.agent [None req-c91303f1-a104-464c-b45a-58b958b4263a - - - - - -] DHCP configuration for ports {'67870819-392f-4850-8a0e-4545b8c9e49b'} is completed
Feb 23 09:57:51 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:57:51.954 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:57:52 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:57:52 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e136 do_prune osdmap full prune enabled
Feb 23 09:57:52 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e137 e137: 6 total, 6 up, 6 in
Feb 23 09:57:52 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e137: 6 total, 6 up, 6 in
Feb 23 09:57:52 np0005626463.localdomain ceph-mon[294160]: pgmap v256: 177 pgs: 177 active+clean; 145 MiB data, 796 MiB used, 41 GiB / 42 GiB avail; 42 KiB/s rd, 3.0 KiB/s wr, 56 op/s
Feb 23 09:57:52 np0005626463.localdomain ceph-mon[294160]: osdmap e137: 6 total, 6 up, 6 in
Feb 23 09:57:52 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:57:52.561 265541 INFO neutron.agent.linux.ip_lib [None req-2e7c7154-d0f6-4ae0-9881-fc135e57cc03 - - - - - -] Device tap471e704b-d2 cannot be used as it has no MAC address
Feb 23 09:57:52 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:57:52.584 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:57:52 np0005626463.localdomain kernel: device tap471e704b-d2 entered promiscuous mode
Feb 23 09:57:52 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:57:52.590 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:57:52 np0005626463.localdomain NetworkManager[5974]: <info>  [1771840672.5922] manager: (tap471e704b-d2): new Generic device (/org/freedesktop/NetworkManager/Devices/40)
Feb 23 09:57:52 np0005626463.localdomain systemd-udevd[314985]: Network interface NamePolicy= disabled on kernel command line.
Feb 23 09:57:52 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:57:52Z|00233|binding|INFO|Claiming lport 471e704b-d2af-4ed9-bee8-b3da9da96eee for this chassis.
Feb 23 09:57:52 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:57:52Z|00234|binding|INFO|471e704b-d2af-4ed9-bee8-b3da9da96eee: Claiming unknown
Feb 23 09:57:52 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:57:52.606 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005626463.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpfb23302c-55c1-5de0-badf-4fc1ff22837a-53593c6a-4c2a-420a-9472-e7be0052fa39', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-53593c6a-4c2a-420a-9472-e7be0052fa39', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'acbc03f0564045b8857e1689cfa4a66d', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d673836f-3caa-47cd-a8d5-bae35d62c0ff, chassis=[<ovs.db.idl.Row object at 0x7f808c075610>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f808c075610>], logical_port=471e704b-d2af-4ed9-bee8-b3da9da96eee) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 09:57:52 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:57:52.612 163572 INFO neutron.agent.ovn.metadata.agent [-] Port 471e704b-d2af-4ed9-bee8-b3da9da96eee in datapath 53593c6a-4c2a-420a-9472-e7be0052fa39 bound to our chassis
Feb 23 09:57:52 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:57:52.614 163572 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 53593c6a-4c2a-420a-9472-e7be0052fa39 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Feb 23 09:57:52 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:57:52.614 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[4f187005-63d5-42f2-b16d-539ef5d8f6ea]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 09:57:52 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap471e704b-d2: No such device
Feb 23 09:57:52 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap471e704b-d2: No such device
Feb 23 09:57:52 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:57:52.625 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:57:52 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:57:52Z|00235|binding|INFO|Setting lport 471e704b-d2af-4ed9-bee8-b3da9da96eee ovn-installed in OVS
Feb 23 09:57:52 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:57:52Z|00236|binding|INFO|Setting lport 471e704b-d2af-4ed9-bee8-b3da9da96eee up in Southbound
Feb 23 09:57:52 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:57:52.630 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:57:52 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap471e704b-d2: No such device
Feb 23 09:57:52 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:57:52.633 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:57:52 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap471e704b-d2: No such device
Feb 23 09:57:52 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap471e704b-d2: No such device
Feb 23 09:57:52 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap471e704b-d2: No such device
Feb 23 09:57:52 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap471e704b-d2: No such device
Feb 23 09:57:52 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap471e704b-d2: No such device
Feb 23 09:57:52 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:57:52.672 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:57:52 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:57:52.700 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:57:52 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:57:52.710 2 INFO neutron.agent.securitygroups_rpc [None req-6e03d4f3-0be2-400e-8da5-d3c25ea65d96 b34a74d70b104d598259f3881ae86305 8635084f010e445d861ab634b753fa27 - - default default] Security group member updated ['80d4c661-e254-4356-81d1-bc4c19a37e6b']
Feb 23 09:57:52 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:57:52.865 2 INFO neutron.agent.securitygroups_rpc [None req-36c16d45-719d-4f5a-b525-80c8ec71acee 4e19ac6dec8e40fbad0c3f681ec14665 6aadd525d3dd402cb701922115d00291 - - default default] Security group member updated ['a015e445-a8f1-4c73-9375-43b03b806b24']
Feb 23 09:57:53 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:57:53.077 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:57:53 np0005626463.localdomain systemd[1]: tmp-crun.YfzErZ.mount: Deactivated successfully.
Feb 23 09:57:53 np0005626463.localdomain dnsmasq[314857]: read /var/lib/neutron/dhcp/003df3c3-8c3a-4476-9564-6a5246acd7a3/addn_hosts - 0 addresses
Feb 23 09:57:53 np0005626463.localdomain dnsmasq-dhcp[314857]: read /var/lib/neutron/dhcp/003df3c3-8c3a-4476-9564-6a5246acd7a3/host
Feb 23 09:57:53 np0005626463.localdomain dnsmasq-dhcp[314857]: read /var/lib/neutron/dhcp/003df3c3-8c3a-4476-9564-6a5246acd7a3/opts
Feb 23 09:57:53 np0005626463.localdomain podman[315046]: 2026-02-23 09:57:53.130182573 +0000 UTC m=+0.074106470 container kill bfce5c19dc23ffedf53633ad7904ac4bc4ec469f4af130c1a3d18600db50fa79 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-003df3c3-8c3a-4476-9564-6a5246acd7a3, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0)
Feb 23 09:57:53 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:57:53Z|00237|binding|INFO|Releasing lport 9fe0c543-bd4c-4645-bfe3-41c26546041f from this chassis (sb_readonly=0)
Feb 23 09:57:53 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:57:53Z|00238|binding|INFO|Setting lport 9fe0c543-bd4c-4645-bfe3-41c26546041f down in Southbound
Feb 23 09:57:53 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:57:53.299 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:57:53 np0005626463.localdomain kernel: device tap9fe0c543-bd left promiscuous mode
Feb 23 09:57:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:57:53.309 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005626463.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpfb23302c-55c1-5de0-badf-4fc1ff22837a-003df3c3-8c3a-4476-9564-6a5246acd7a3', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-003df3c3-8c3a-4476-9564-6a5246acd7a3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6aadd525d3dd402cb701922115d00291', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005626463.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7d02ccbf-9c2f-41fe-937e-ed56e04b90fb, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f808c075610>], logical_port=9fe0c543-bd4c-4645-bfe3-41c26546041f) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f808c075610>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 09:57:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:57:53.311 163572 INFO neutron.agent.ovn.metadata.agent [-] Port 9fe0c543-bd4c-4645-bfe3-41c26546041f in datapath 003df3c3-8c3a-4476-9564-6a5246acd7a3 unbound from our chassis
Feb 23 09:57:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:57:53.315 163572 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 003df3c3-8c3a-4476-9564-6a5246acd7a3, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 23 09:57:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:57:53.316 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[1ef40639-3f1f-4798-8932-e713d58f1867]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 09:57:53 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:57:53.318 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:57:53 np0005626463.localdomain podman[315097]: 
Feb 23 09:57:53 np0005626463.localdomain podman[315097]: 2026-02-23 09:57:53.526461236 +0000 UTC m=+0.093695956 container create 372b1b70d42257631680f0e1c8a922f42cd8a88a6ad7a8af2743c42e78e3eedf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-53593c6a-4c2a-420a-9472-e7be0052fa39, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260216, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.43.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 23 09:57:53 np0005626463.localdomain systemd[1]: Started libpod-conmon-372b1b70d42257631680f0e1c8a922f42cd8a88a6ad7a8af2743c42e78e3eedf.scope.
Feb 23 09:57:53 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 09:57:53 np0005626463.localdomain podman[315097]: 2026-02-23 09:57:53.480425344 +0000 UTC m=+0.047660094 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 23 09:57:53 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5440e0fd82ce5dacc9263097b3678f8d36f3eeeec3a08ac5ea46b12225532482/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 23 09:57:53 np0005626463.localdomain podman[315097]: 2026-02-23 09:57:53.591142355 +0000 UTC m=+0.158377055 container init 372b1b70d42257631680f0e1c8a922f42cd8a88a6ad7a8af2743c42e78e3eedf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-53593c6a-4c2a-420a-9472-e7be0052fa39, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20260216, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Feb 23 09:57:53 np0005626463.localdomain podman[315097]: 2026-02-23 09:57:53.599508043 +0000 UTC m=+0.166742743 container start 372b1b70d42257631680f0e1c8a922f42cd8a88a6ad7a8af2743c42e78e3eedf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-53593c6a-4c2a-420a-9472-e7be0052fa39, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, tcib_build_tag=8419493e1fd846703d277695e03fc5eb)
Feb 23 09:57:53 np0005626463.localdomain dnsmasq[315116]: started, version 2.85 cachesize 150
Feb 23 09:57:53 np0005626463.localdomain dnsmasq[315116]: DNS service limited to local subnets
Feb 23 09:57:53 np0005626463.localdomain dnsmasq[315116]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 23 09:57:53 np0005626463.localdomain dnsmasq[315116]: warning: no upstream servers configured
Feb 23 09:57:53 np0005626463.localdomain dnsmasq-dhcp[315116]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Feb 23 09:57:53 np0005626463.localdomain dnsmasq[315116]: read /var/lib/neutron/dhcp/53593c6a-4c2a-420a-9472-e7be0052fa39/addn_hosts - 0 addresses
Feb 23 09:57:53 np0005626463.localdomain dnsmasq-dhcp[315116]: read /var/lib/neutron/dhcp/53593c6a-4c2a-420a-9472-e7be0052fa39/host
Feb 23 09:57:53 np0005626463.localdomain dnsmasq-dhcp[315116]: read /var/lib/neutron/dhcp/53593c6a-4c2a-420a-9472-e7be0052fa39/opts
Feb 23 09:57:54 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:57:54.028 265541 INFO neutron.agent.dhcp.agent [None req-89831f41-dbfb-4976-8fd0-345c397ca2d8 - - - - - -] DHCP configuration for ports {'f831d45b-9e7e-481e-9011-7f9545ba3cff'} is completed
Feb 23 09:57:54 np0005626463.localdomain ceph-mon[294160]: pgmap v258: 177 pgs: 177 active+clean; 145 MiB data, 796 MiB used, 41 GiB / 42 GiB avail; 43 KiB/s rd, 2.2 KiB/s wr, 57 op/s
Feb 23 09:57:55 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:57:55.154 2 INFO neutron.agent.securitygroups_rpc [None req-518b7324-91f0-437a-87bd-ce34c1a3be1e e22ee96829d64023b04af5ccfdd0ab53 a3622447b13c4164ad418e851634e3b3 - - default default] Security group member updated ['b06f7d0b-a9fc-4c26-994a-bc68e12c2cf6']
Feb 23 09:57:55 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:57:55.219 2 INFO neutron.agent.securitygroups_rpc [None req-1ba87428-a084-4aaf-9c26-da4479389d22 b34a74d70b104d598259f3881ae86305 8635084f010e445d861ab634b753fa27 - - default default] Security group member updated ['80d4c661-e254-4356-81d1-bc4c19a37e6b']
Feb 23 09:57:55 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:57:55.449 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:57:55 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.
Feb 23 09:57:55 np0005626463.localdomain podman[315135]: 2026-02-23 09:57:55.873007644 +0000 UTC m=+0.060645904 container kill bfce5c19dc23ffedf53633ad7904ac4bc4ec469f4af130c1a3d18600db50fa79 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-003df3c3-8c3a-4476-9564-6a5246acd7a3, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.43.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Feb 23 09:57:55 np0005626463.localdomain dnsmasq[314857]: exiting on receipt of SIGTERM
Feb 23 09:57:55 np0005626463.localdomain systemd[1]: libpod-bfce5c19dc23ffedf53633ad7904ac4bc4ec469f4af130c1a3d18600db50fa79.scope: Deactivated successfully.
Feb 23 09:57:55 np0005626463.localdomain systemd[1]: tmp-crun.4AS6Cl.mount: Deactivated successfully.
Feb 23 09:57:55 np0005626463.localdomain podman[315141]: 2026-02-23 09:57:55.919976306 +0000 UTC m=+0.093197790 container health_status da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 23 09:57:55 np0005626463.localdomain podman[315158]: 2026-02-23 09:57:55.95118514 +0000 UTC m=+0.061335876 container died bfce5c19dc23ffedf53633ad7904ac4bc4ec469f4af130c1a3d18600db50fa79 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-003df3c3-8c3a-4476-9564-6a5246acd7a3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 23 09:57:55 np0005626463.localdomain podman[315158]: 2026-02-23 09:57:55.990845235 +0000 UTC m=+0.100995931 container cleanup bfce5c19dc23ffedf53633ad7904ac4bc4ec469f4af130c1a3d18600db50fa79 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-003df3c3-8c3a-4476-9564-6a5246acd7a3, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20260216, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0)
Feb 23 09:57:55 np0005626463.localdomain systemd[1]: libpod-conmon-bfce5c19dc23ffedf53633ad7904ac4bc4ec469f4af130c1a3d18600db50fa79.scope: Deactivated successfully.
Feb 23 09:57:56 np0005626463.localdomain podman[315160]: 2026-02-23 09:57:56.009708338 +0000 UTC m=+0.112655771 container remove bfce5c19dc23ffedf53633ad7904ac4bc4ec469f4af130c1a3d18600db50fa79 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-003df3c3-8c3a-4476-9564-6a5246acd7a3, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Feb 23 09:57:56 np0005626463.localdomain podman[315141]: 2026-02-23 09:57:56.022683489 +0000 UTC m=+0.195904933 container exec_died da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 23 09:57:56 np0005626463.localdomain systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Deactivated successfully.
Feb 23 09:57:56 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:57:56.068 265541 INFO neutron.agent.dhcp.agent [None req-2dfe25f0-f3d6-4e3a-b631-7966a74d8b02 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 23 09:57:56 np0005626463.localdomain ceph-mon[294160]: pgmap v259: 177 pgs: 177 active+clean; 145 MiB data, 796 MiB used, 41 GiB / 42 GiB avail; 33 KiB/s rd, 1.7 KiB/s wr, 44 op/s
Feb 23 09:57:56 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:57:56.783 265541 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Feb 23 09:57:56 np0005626463.localdomain systemd[1]: tmp-crun.M72uVd.mount: Deactivated successfully.
Feb 23 09:57:56 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-ef8e07bef3889686b85bfa5eac244cc48bec7dc7da70c4883c2434cba185c80b-merged.mount: Deactivated successfully.
Feb 23 09:57:56 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-bfce5c19dc23ffedf53633ad7904ac4bc4ec469f4af130c1a3d18600db50fa79-userdata-shm.mount: Deactivated successfully.
Feb 23 09:57:56 np0005626463.localdomain systemd[1]: run-netns-qdhcp\x2d003df3c3\x2d8c3a\x2d4476\x2d9564\x2d6a5246acd7a3.mount: Deactivated successfully.
Feb 23 09:57:57 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:57:57.022 2 INFO neutron.agent.securitygroups_rpc [None req-758250a6-9620-43e4-8e85-56ce0e87f261 b34a74d70b104d598259f3881ae86305 8635084f010e445d861ab634b753fa27 - - default default] Security group member updated ['80d4c661-e254-4356-81d1-bc4c19a37e6b']
Feb 23 09:57:57 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:57:57 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:57:57Z|00239|binding|INFO|Releasing lport 4143c8ea-7577-4792-9744-bcff90eb20f2 from this chassis (sb_readonly=0)
Feb 23 09:57:57 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:57:57.410 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:57:57 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.
Feb 23 09:57:57 np0005626463.localdomain podman[315194]: 2026-02-23 09:57:57.905863551 +0000 UTC m=+0.079503118 container health_status 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, config_id=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2026-02-05T04:57:10Z, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.openshift.expose-services=, version=9.7, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, release=1770267347, com.redhat.component=ubi9-minimal-container, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, org.opencontainers.image.created=2026-02-05T04:57:10Z, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., architecture=x86_64)
Feb 23 09:57:57 np0005626463.localdomain podman[315194]: 2026-02-23 09:57:57.919479952 +0000 UTC m=+0.093119479 container exec_died 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, build-date=2026-02-05T04:57:10Z, io.openshift.expose-services=, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.created=2026-02-05T04:57:10Z, vcs-type=git, release=1770267347, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, config_id=openstack_network_exporter, architecture=x86_64, container_name=openstack_network_exporter, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, distribution-scope=public, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Feb 23 09:57:57 np0005626463.localdomain systemd[1]: 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.service: Deactivated successfully.
Feb 23 09:57:58 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e137 do_prune osdmap full prune enabled
Feb 23 09:57:58 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e138 e138: 6 total, 6 up, 6 in
Feb 23 09:57:58 np0005626463.localdomain ceph-mon[294160]: pgmap v260: 177 pgs: 177 active+clean; 145 MiB data, 800 MiB used, 41 GiB / 42 GiB avail; 32 KiB/s rd, 1.7 KiB/s wr, 43 op/s
Feb 23 09:57:58 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e138: 6 total, 6 up, 6 in
Feb 23 09:57:58 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:57:58.113 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:57:58 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:57:58.662 2 INFO neutron.agent.securitygroups_rpc [None req-90d7272c-2a16-4c66-834f-ae9e72b06d5d b34a74d70b104d598259f3881ae86305 8635084f010e445d861ab634b753fa27 - - default default] Security group member updated ['80d4c661-e254-4356-81d1-bc4c19a37e6b']
Feb 23 09:57:58 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:57:58.724 2 INFO neutron.agent.securitygroups_rpc [None req-92d71163-849e-49a8-9e78-04255fc35661 e22ee96829d64023b04af5ccfdd0ab53 a3622447b13c4164ad418e851634e3b3 - - default default] Security group member updated ['b06f7d0b-a9fc-4c26-994a-bc68e12c2cf6']
Feb 23 09:57:59 np0005626463.localdomain ceph-mon[294160]: osdmap e138: 6 total, 6 up, 6 in
Feb 23 09:58:00 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Feb 23 09:58:00 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3705593809' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 23 09:58:00 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Feb 23 09:58:00 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3705593809' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 23 09:58:00 np0005626463.localdomain ceph-mon[294160]: pgmap v262: 177 pgs: 177 active+clean; 145 MiB data, 800 MiB used, 41 GiB / 42 GiB avail; 1.7 KiB/s rd, 255 B/s wr, 3 op/s
Feb 23 09:58:00 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/3705593809' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 23 09:58:00 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/3705593809' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 23 09:58:01 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/3885513677' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 23 09:58:01 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/3885513677' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 23 09:58:01 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:58:01.242 2 INFO neutron.agent.securitygroups_rpc [None req-28ec64b1-e1b5-4dd6-940d-987b4dc3aa1e b34a74d70b104d598259f3881ae86305 8635084f010e445d861ab634b753fa27 - - default default] Security group member updated ['80d4c661-e254-4356-81d1-bc4c19a37e6b']
Feb 23 09:58:01 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:58:01.946 2 INFO neutron.agent.securitygroups_rpc [None req-1398d805-3ae6-40de-8906-b5cfcdc73dab b34a74d70b104d598259f3881ae86305 8635084f010e445d861ab634b753fa27 - - default default] Security group member updated ['80d4c661-e254-4356-81d1-bc4c19a37e6b']
Feb 23 09:58:02 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:58:02 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e138 do_prune osdmap full prune enabled
Feb 23 09:58:02 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e139 e139: 6 total, 6 up, 6 in
Feb 23 09:58:02 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e139: 6 total, 6 up, 6 in
Feb 23 09:58:02 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:58:02Z|00240|binding|INFO|Releasing lport 4143c8ea-7577-4792-9744-bcff90eb20f2 from this chassis (sb_readonly=0)
Feb 23 09:58:02 np0005626463.localdomain ceph-mon[294160]: pgmap v263: 177 pgs: 177 active+clean; 145 MiB data, 800 MiB used, 41 GiB / 42 GiB avail; 27 KiB/s rd, 1.5 KiB/s wr, 35 op/s
Feb 23 09:58:02 np0005626463.localdomain ceph-mon[294160]: osdmap e139: 6 total, 6 up, 6 in
Feb 23 09:58:02 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:58:02.169 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:58:02 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:58:02.199 265541 INFO neutron.agent.linux.ip_lib [None req-a9c48374-050a-46cf-8cb2-df1ce0cb8828 - - - - - -] Device tap756bf056-72 cannot be used as it has no MAC address
Feb 23 09:58:02 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:58:02.220 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:58:02 np0005626463.localdomain kernel: device tap756bf056-72 entered promiscuous mode
Feb 23 09:58:02 np0005626463.localdomain NetworkManager[5974]: <info>  [1771840682.2283] manager: (tap756bf056-72): new Generic device (/org/freedesktop/NetworkManager/Devices/41)
Feb 23 09:58:02 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:58:02Z|00241|binding|INFO|Claiming lport 756bf056-7233-47b6-8872-465c3e350d26 for this chassis.
Feb 23 09:58:02 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:58:02.229 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:58:02 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:58:02Z|00242|binding|INFO|756bf056-7233-47b6-8872-465c3e350d26: Claiming unknown
Feb 23 09:58:02 np0005626463.localdomain systemd-udevd[315225]: Network interface NamePolicy= disabled on kernel command line.
Feb 23 09:58:02 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:58:02.247 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005626463.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:1::2/64', 'neutron:device_id': 'dhcpfb23302c-55c1-5de0-badf-4fc1ff22837a-74bfca51-6a6b-4fa1-bb91-4bfc9a96dacd', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-74bfca51-6a6b-4fa1-bb91-4bfc9a96dacd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'acbc03f0564045b8857e1689cfa4a66d', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e43874ae-5b5b-4695-9210-e284b1e04551, chassis=[<ovs.db.idl.Row object at 0x7f808c075610>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f808c075610>], logical_port=756bf056-7233-47b6-8872-465c3e350d26) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 09:58:02 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:58:02.249 163572 INFO neutron.agent.ovn.metadata.agent [-] Port 756bf056-7233-47b6-8872-465c3e350d26 in datapath 74bfca51-6a6b-4fa1-bb91-4bfc9a96dacd bound to our chassis
Feb 23 09:58:02 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:58:02.251 163572 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 74bfca51-6a6b-4fa1-bb91-4bfc9a96dacd or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Feb 23 09:58:02 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:58:02.252 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[f6cff029-ce32-4a9a-91e8-28550280e4a4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 09:58:02 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap756bf056-72: No such device
Feb 23 09:58:02 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap756bf056-72: No such device
Feb 23 09:58:02 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:58:02.263 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:58:02 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:58:02Z|00243|binding|INFO|Setting lport 756bf056-7233-47b6-8872-465c3e350d26 ovn-installed in OVS
Feb 23 09:58:02 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap756bf056-72: No such device
Feb 23 09:58:02 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:58:02Z|00244|binding|INFO|Setting lport 756bf056-7233-47b6-8872-465c3e350d26 up in Southbound
Feb 23 09:58:02 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:58:02.266 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:58:02 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap756bf056-72: No such device
Feb 23 09:58:02 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap756bf056-72: No such device
Feb 23 09:58:02 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap756bf056-72: No such device
Feb 23 09:58:02 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap756bf056-72: No such device
Feb 23 09:58:02 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap756bf056-72: No such device
Feb 23 09:58:02 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:58:02.302 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:58:02 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:58:02.327 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:58:02 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:58:02.794 2 INFO neutron.agent.securitygroups_rpc [None req-7618cb81-6bfd-47c7-b5cf-c5e505798fed b34a74d70b104d598259f3881ae86305 8635084f010e445d861ab634b753fa27 - - default default] Security group member updated ['80d4c661-e254-4356-81d1-bc4c19a37e6b']
Feb 23 09:58:03 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:58:03.116 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:58:03 np0005626463.localdomain podman[315296]: 
Feb 23 09:58:03 np0005626463.localdomain podman[315296]: 2026-02-23 09:58:03.133269796 +0000 UTC m=+0.100605380 container create 8dacfa928e6d23d5b0bcb26e6023e866c9cfe6d3eca9d6a0f3b0e7b2de68dfd1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-74bfca51-6a6b-4fa1-bb91-4bfc9a96dacd, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Feb 23 09:58:03 np0005626463.localdomain podman[315296]: 2026-02-23 09:58:03.084039564 +0000 UTC m=+0.051375178 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 23 09:58:03 np0005626463.localdomain systemd[1]: Started libpod-conmon-8dacfa928e6d23d5b0bcb26e6023e866c9cfe6d3eca9d6a0f3b0e7b2de68dfd1.scope.
Feb 23 09:58:03 np0005626463.localdomain systemd[1]: tmp-crun.KE978s.mount: Deactivated successfully.
Feb 23 09:58:03 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 09:58:03 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/52c67337069f1c964652cb1231caff2024e0c511d433352b5a89e8eb28d84b9a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 23 09:58:03 np0005626463.localdomain podman[315296]: 2026-02-23 09:58:03.22791591 +0000 UTC m=+0.195251494 container init 8dacfa928e6d23d5b0bcb26e6023e866c9cfe6d3eca9d6a0f3b0e7b2de68dfd1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-74bfca51-6a6b-4fa1-bb91-4bfc9a96dacd, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb)
Feb 23 09:58:03 np0005626463.localdomain podman[315296]: 2026-02-23 09:58:03.240578131 +0000 UTC m=+0.207913715 container start 8dacfa928e6d23d5b0bcb26e6023e866c9cfe6d3eca9d6a0f3b0e7b2de68dfd1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-74bfca51-6a6b-4fa1-bb91-4bfc9a96dacd, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=8419493e1fd846703d277695e03fc5eb)
Feb 23 09:58:03 np0005626463.localdomain dnsmasq[315315]: started, version 2.85 cachesize 150
Feb 23 09:58:03 np0005626463.localdomain dnsmasq[315315]: DNS service limited to local subnets
Feb 23 09:58:03 np0005626463.localdomain dnsmasq[315315]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 23 09:58:03 np0005626463.localdomain dnsmasq[315315]: warning: no upstream servers configured
Feb 23 09:58:03 np0005626463.localdomain dnsmasq-dhcp[315315]: DHCPv6, static leases only on 2001:db8:1::, lease time 1d
Feb 23 09:58:03 np0005626463.localdomain dnsmasq[315315]: read /var/lib/neutron/dhcp/74bfca51-6a6b-4fa1-bb91-4bfc9a96dacd/addn_hosts - 0 addresses
Feb 23 09:58:03 np0005626463.localdomain dnsmasq-dhcp[315315]: read /var/lib/neutron/dhcp/74bfca51-6a6b-4fa1-bb91-4bfc9a96dacd/host
Feb 23 09:58:03 np0005626463.localdomain dnsmasq-dhcp[315315]: read /var/lib/neutron/dhcp/74bfca51-6a6b-4fa1-bb91-4bfc9a96dacd/opts
Feb 23 09:58:03 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:58:03.348 265541 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-23T09:58:02Z, description=, device_id=719d7925-a22d-417d-9a79-c8610838b916, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f28292366d0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f2829246d00>], id=1d3dbdf1-1d97-4ebd-a95e-1fc3ea0ad5d3, ip_allocation=immediate, mac_address=fa:16:3e:75:1a:d0, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-23T09:57:45Z, description=, dns_domain=, id=c843867f-296c-42fa-9f8c-55712f0f7c56, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersNegativeIpV6Test-test-network-1442624154, port_security_enabled=True, project_id=acbc03f0564045b8857e1689cfa4a66d, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=35057, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1819, status=ACTIVE, subnets=['394dad53-0de3-4125-ba86-854949b31577'], tags=[], tenant_id=acbc03f0564045b8857e1689cfa4a66d, updated_at=2026-02-23T09:57:46Z, vlan_transparent=None, network_id=c843867f-296c-42fa-9f8c-55712f0f7c56, port_security_enabled=False, project_id=acbc03f0564045b8857e1689cfa4a66d, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1940, status=DOWN, tags=[], tenant_id=acbc03f0564045b8857e1689cfa4a66d, updated_at=2026-02-23T09:58:02Z on network c843867f-296c-42fa-9f8c-55712f0f7c56
Feb 23 09:58:03 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:58:03.427 265541 INFO neutron.agent.dhcp.agent [None req-0652b052-b4c2-454d-a519-9d07e54daee5 - - - - - -] DHCP configuration for ports {'b99e0923-3e0d-4d50-bd39-8a10af9a0874'} is completed
Feb 23 09:58:03 np0005626463.localdomain dnsmasq[314933]: read /var/lib/neutron/dhcp/c843867f-296c-42fa-9f8c-55712f0f7c56/addn_hosts - 1 addresses
Feb 23 09:58:03 np0005626463.localdomain dnsmasq-dhcp[314933]: read /var/lib/neutron/dhcp/c843867f-296c-42fa-9f8c-55712f0f7c56/host
Feb 23 09:58:03 np0005626463.localdomain podman[315334]: 2026-02-23 09:58:03.538762894 +0000 UTC m=+0.057513228 container kill 61844cecbb88e05cd64c9c8c1b4fbab159f87a7bad8c1686f8d1bec04920bec1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c843867f-296c-42fa-9f8c-55712f0f7c56, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 23 09:58:03 np0005626463.localdomain dnsmasq-dhcp[314933]: read /var/lib/neutron/dhcp/c843867f-296c-42fa-9f8c-55712f0f7c56/opts
Feb 23 09:58:03 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:58:03.773 265541 INFO neutron.agent.dhcp.agent [None req-83a05713-f416-409a-b6dd-4ca92f3b3057 - - - - - -] DHCP configuration for ports {'1d3dbdf1-1d97-4ebd-a95e-1fc3ea0ad5d3'} is completed
Feb 23 09:58:04 np0005626463.localdomain ceph-mon[294160]: pgmap v265: 177 pgs: 177 active+clean; 145 MiB data, 800 MiB used, 41 GiB / 42 GiB avail; 49 KiB/s rd, 2.4 KiB/s wr, 65 op/s
Feb 23 09:58:04 np0005626463.localdomain sshd[315354]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:58:04 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:58:04.885 265541 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-23T09:58:02Z, description=, device_id=719d7925-a22d-417d-9a79-c8610838b916, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f282936cd30>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f2829bf9c40>], id=1d3dbdf1-1d97-4ebd-a95e-1fc3ea0ad5d3, ip_allocation=immediate, mac_address=fa:16:3e:75:1a:d0, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-23T09:57:45Z, description=, dns_domain=, id=c843867f-296c-42fa-9f8c-55712f0f7c56, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersNegativeIpV6Test-test-network-1442624154, port_security_enabled=True, project_id=acbc03f0564045b8857e1689cfa4a66d, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=35057, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1819, status=ACTIVE, subnets=['394dad53-0de3-4125-ba86-854949b31577'], tags=[], tenant_id=acbc03f0564045b8857e1689cfa4a66d, updated_at=2026-02-23T09:57:46Z, vlan_transparent=None, network_id=c843867f-296c-42fa-9f8c-55712f0f7c56, port_security_enabled=False, project_id=acbc03f0564045b8857e1689cfa4a66d, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1940, status=DOWN, tags=[], tenant_id=acbc03f0564045b8857e1689cfa4a66d, updated_at=2026-02-23T09:58:02Z on network c843867f-296c-42fa-9f8c-55712f0f7c56
Feb 23 09:58:05 np0005626463.localdomain podman[315374]: 2026-02-23 09:58:05.084775128 +0000 UTC m=+0.058216189 container kill 61844cecbb88e05cd64c9c8c1b4fbab159f87a7bad8c1686f8d1bec04920bec1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c843867f-296c-42fa-9f8c-55712f0f7c56, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, tcib_managed=true)
Feb 23 09:58:05 np0005626463.localdomain systemd[1]: tmp-crun.YXd9kp.mount: Deactivated successfully.
Feb 23 09:58:05 np0005626463.localdomain dnsmasq[314933]: read /var/lib/neutron/dhcp/c843867f-296c-42fa-9f8c-55712f0f7c56/addn_hosts - 1 addresses
Feb 23 09:58:05 np0005626463.localdomain dnsmasq-dhcp[314933]: read /var/lib/neutron/dhcp/c843867f-296c-42fa-9f8c-55712f0f7c56/host
Feb 23 09:58:05 np0005626463.localdomain dnsmasq-dhcp[314933]: read /var/lib/neutron/dhcp/c843867f-296c-42fa-9f8c-55712f0f7c56/opts
Feb 23 09:58:05 np0005626463.localdomain sshd[315354]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:58:05 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:58:05.336 265541 INFO neutron.agent.dhcp.agent [None req-26abc0b7-b08d-45f7-9c29-51dcd8441aa9 - - - - - -] DHCP configuration for ports {'1d3dbdf1-1d97-4ebd-a95e-1fc3ea0ad5d3'} is completed
Feb 23 09:58:06 np0005626463.localdomain ceph-mon[294160]: pgmap v266: 177 pgs: 177 active+clean; 145 MiB data, 800 MiB used, 41 GiB / 42 GiB avail; 49 KiB/s rd, 2.2 KiB/s wr, 64 op/s
Feb 23 09:58:06 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:58:06.389 2 INFO neutron.agent.securitygroups_rpc [None req-f1c412d9-695e-47a0-9d6e-ba30fd3bc526 982b83c89c37422a910f5359ef7b6ea5 5bed3d4ae9fd4fe7b9440b4587246c14 - - default default] Security group member updated ['3f91b09d-b6ac-403f-adaf-7a684ee36fe5']
Feb 23 09:58:07 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:58:07 np0005626463.localdomain dnsmasq[314933]: read /var/lib/neutron/dhcp/c843867f-296c-42fa-9f8c-55712f0f7c56/addn_hosts - 0 addresses
Feb 23 09:58:07 np0005626463.localdomain dnsmasq-dhcp[314933]: read /var/lib/neutron/dhcp/c843867f-296c-42fa-9f8c-55712f0f7c56/host
Feb 23 09:58:07 np0005626463.localdomain podman[315411]: 2026-02-23 09:58:07.142255406 +0000 UTC m=+0.058726686 container kill 61844cecbb88e05cd64c9c8c1b4fbab159f87a7bad8c1686f8d1bec04920bec1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c843867f-296c-42fa-9f8c-55712f0f7c56, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20260216, tcib_build_tag=8419493e1fd846703d277695e03fc5eb)
Feb 23 09:58:07 np0005626463.localdomain dnsmasq-dhcp[314933]: read /var/lib/neutron/dhcp/c843867f-296c-42fa-9f8c-55712f0f7c56/opts
Feb 23 09:58:07 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:58:07.157 2 INFO neutron.agent.securitygroups_rpc [None req-a9451f2f-78ed-41db-9d83-c983c36607eb 982b83c89c37422a910f5359ef7b6ea5 5bed3d4ae9fd4fe7b9440b4587246c14 - - default default] Security group member updated ['3f91b09d-b6ac-403f-adaf-7a684ee36fe5']
Feb 23 09:58:07 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:58:07.279 2 INFO neutron.agent.securitygroups_rpc [None req-c82f4f13-a6c6-4dfc-aae6-5892f71ca6d5 8ff2abb777c74a6dbae4721d46f0d17a 182b0ebb06754cfab10ebabcdf7056ed - - default default] Security group member updated ['c029b069-aec5-44a4-9af0-e58cbf64895c']
Feb 23 09:58:07 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:58:07.296 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:58:07 np0005626463.localdomain kernel: device tap908116f5-e2 left promiscuous mode
Feb 23 09:58:07 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:58:07Z|00245|binding|INFO|Releasing lport 908116f5-e230-40e0-818e-5844b37f3a2c from this chassis (sb_readonly=0)
Feb 23 09:58:07 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:58:07Z|00246|binding|INFO|Setting lport 908116f5-e230-40e0-818e-5844b37f3a2c down in Southbound
Feb 23 09:58:07 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:58:07.309 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005626463.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpfb23302c-55c1-5de0-badf-4fc1ff22837a-c843867f-296c-42fa-9f8c-55712f0f7c56', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c843867f-296c-42fa-9f8c-55712f0f7c56', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'acbc03f0564045b8857e1689cfa4a66d', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005626463.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8df8eb84-f846-4c21-a0a4-93d19730bc64, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f808c075610>], logical_port=908116f5-e230-40e0-818e-5844b37f3a2c) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f808c075610>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 09:58:07 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:58:07.311 163572 INFO neutron.agent.ovn.metadata.agent [-] Port 908116f5-e230-40e0-818e-5844b37f3a2c in datapath c843867f-296c-42fa-9f8c-55712f0f7c56 unbound from our chassis
Feb 23 09:58:07 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:58:07.311 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:58:07 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:58:07.313 163572 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network c843867f-296c-42fa-9f8c-55712f0f7c56 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Feb 23 09:58:07 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:58:07.314 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[130a033a-f4b0-405c-b683-60659a6f8f91]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 09:58:08 np0005626463.localdomain dnsmasq[315315]: exiting on receipt of SIGTERM
Feb 23 09:58:08 np0005626463.localdomain podman[315448]: 2026-02-23 09:58:08.012083439 +0000 UTC m=+0.060374845 container kill 8dacfa928e6d23d5b0bcb26e6023e866c9cfe6d3eca9d6a0f3b0e7b2de68dfd1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-74bfca51-6a6b-4fa1-bb91-4bfc9a96dacd, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS)
Feb 23 09:58:08 np0005626463.localdomain systemd[1]: libpod-8dacfa928e6d23d5b0bcb26e6023e866c9cfe6d3eca9d6a0f3b0e7b2de68dfd1.scope: Deactivated successfully.
Feb 23 09:58:08 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.
Feb 23 09:58:08 np0005626463.localdomain ceph-mon[294160]: pgmap v267: 177 pgs: 177 active+clean; 145 MiB data, 800 MiB used, 41 GiB / 42 GiB avail; 43 KiB/s rd, 2.0 KiB/s wr, 57 op/s
Feb 23 09:58:08 np0005626463.localdomain podman[315461]: 2026-02-23 09:58:08.091904496 +0000 UTC m=+0.064852915 container died 8dacfa928e6d23d5b0bcb26e6023e866c9cfe6d3eca9d6a0f3b0e7b2de68dfd1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-74bfca51-6a6b-4fa1-bb91-4bfc9a96dacd, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Feb 23 09:58:08 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:58:08.119 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:58:08 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.
Feb 23 09:58:08 np0005626463.localdomain podman[315469]: 2026-02-23 09:58:08.154464088 +0000 UTC m=+0.106113369 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb)
Feb 23 09:58:08 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8dacfa928e6d23d5b0bcb26e6023e866c9cfe6d3eca9d6a0f3b0e7b2de68dfd1-userdata-shm.mount: Deactivated successfully.
Feb 23 09:58:08 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-52c67337069f1c964652cb1231caff2024e0c511d433352b5a89e8eb28d84b9a-merged.mount: Deactivated successfully.
Feb 23 09:58:08 np0005626463.localdomain podman[315469]: 2026-02-23 09:58:08.238102433 +0000 UTC m=+0.189751694 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, io.buildah.version=1.43.0, container_name=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Feb 23 09:58:08 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:58:08.243 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:58:08 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:58:08.239 163572 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port d5af4bb4-2866-499e-8695-7ba6d053f969 with type ""
Feb 23 09:58:08 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:58:08.242 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005626463.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:1::2/64', 'neutron:device_id': 'dhcpfb23302c-55c1-5de0-badf-4fc1ff22837a-74bfca51-6a6b-4fa1-bb91-4bfc9a96dacd', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-74bfca51-6a6b-4fa1-bb91-4bfc9a96dacd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'acbc03f0564045b8857e1689cfa4a66d', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005626463.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e43874ae-5b5b-4695-9210-e284b1e04551, chassis=[<ovs.db.idl.Row object at 0x7f808c075610>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f808c075610>], logical_port=756bf056-7233-47b6-8872-465c3e350d26) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 09:58:08 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:58:08Z|00247|binding|INFO|Removing iface tap756bf056-72 ovn-installed in OVS
Feb 23 09:58:08 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:58:08Z|00248|binding|INFO|Removing lport 756bf056-7233-47b6-8872-465c3e350d26 ovn-installed in OVS
Feb 23 09:58:08 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:58:08.246 163572 INFO neutron.agent.ovn.metadata.agent [-] Port 756bf056-7233-47b6-8872-465c3e350d26 in datapath 74bfca51-6a6b-4fa1-bb91-4bfc9a96dacd unbound from our chassis
Feb 23 09:58:08 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:58:08.249 163572 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 74bfca51-6a6b-4fa1-bb91-4bfc9a96dacd or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Feb 23 09:58:08 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:58:08.250 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:58:08 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:58:08.250 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[9f8d9e11-606a-4dac-9453-bbe5857f5a69]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 09:58:08 np0005626463.localdomain systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully.
Feb 23 09:58:08 np0005626463.localdomain podman[315461]: 2026-02-23 09:58:08.293833644 +0000 UTC m=+0.266782023 container cleanup 8dacfa928e6d23d5b0bcb26e6023e866c9cfe6d3eca9d6a0f3b0e7b2de68dfd1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-74bfca51-6a6b-4fa1-bb91-4bfc9a96dacd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS)
Feb 23 09:58:08 np0005626463.localdomain systemd[1]: libpod-conmon-8dacfa928e6d23d5b0bcb26e6023e866c9cfe6d3eca9d6a0f3b0e7b2de68dfd1.scope: Deactivated successfully.
Feb 23 09:58:08 np0005626463.localdomain podman[315463]: 2026-02-23 09:58:08.318556769 +0000 UTC m=+0.282517130 container remove 8dacfa928e6d23d5b0bcb26e6023e866c9cfe6d3eca9d6a0f3b0e7b2de68dfd1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-74bfca51-6a6b-4fa1-bb91-4bfc9a96dacd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.build-date=20260216, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 23 09:58:08 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:58:08.334 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:58:08 np0005626463.localdomain kernel: device tap756bf056-72 left promiscuous mode
Feb 23 09:58:08 np0005626463.localdomain podman[315497]: 2026-02-23 09:58:08.243144929 +0000 UTC m=+0.111389793 container health_status bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Feb 23 09:58:08 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:58:08.351 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:58:08 np0005626463.localdomain podman[315497]: 2026-02-23 09:58:08.376389265 +0000 UTC m=+0.244634169 container exec_died bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 23 09:58:08 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:58:08.386 265541 INFO neutron.agent.dhcp.agent [None req-4bd86484-d733-437f-acc1-bf3cc7a70cbc - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 23 09:58:08 np0005626463.localdomain systemd[1]: bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.service: Deactivated successfully.
Feb 23 09:58:08 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:58:08.660 265541 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Feb 23 09:58:09 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:58:09Z|00249|binding|INFO|Releasing lport 4143c8ea-7577-4792-9744-bcff90eb20f2 from this chassis (sb_readonly=0)
Feb 23 09:58:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:58:09.138 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:58:09 np0005626463.localdomain systemd[1]: run-netns-qdhcp\x2d74bfca51\x2d6a6b\x2d4fa1\x2dbb91\x2d4bfc9a96dacd.mount: Deactivated successfully.
Feb 23 09:58:09 np0005626463.localdomain podman[242954]: time="2026-02-23T09:58:09Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 23 09:58:09 np0005626463.localdomain podman[242954]: @ - - [23/Feb/2026:09:58:09 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 160715 "" "Go-http-client/1.1"
Feb 23 09:58:09 np0005626463.localdomain podman[242954]: @ - - [23/Feb/2026:09:58:09 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19762 "" "Go-http-client/1.1"
Feb 23 09:58:09 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:58:09.760 2 INFO neutron.agent.securitygroups_rpc [None req-32f4623b-5152-4fb4-8665-550d3831cd54 982b83c89c37422a910f5359ef7b6ea5 5bed3d4ae9fd4fe7b9440b4587246c14 - - default default] Security group member updated ['3f91b09d-b6ac-403f-adaf-7a684ee36fe5']
Feb 23 09:58:10 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e139 do_prune osdmap full prune enabled
Feb 23 09:58:10 np0005626463.localdomain ceph-mon[294160]: pgmap v268: 177 pgs: 177 active+clean; 145 MiB data, 800 MiB used, 41 GiB / 42 GiB avail; 39 KiB/s rd, 1.8 KiB/s wr, 51 op/s
Feb 23 09:58:10 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e140 e140: 6 total, 6 up, 6 in
Feb 23 09:58:10 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e140: 6 total, 6 up, 6 in
Feb 23 09:58:10 np0005626463.localdomain podman[315554]: 2026-02-23 09:58:10.299539032 +0000 UTC m=+0.069734816 container kill 372b1b70d42257631680f0e1c8a922f42cd8a88a6ad7a8af2743c42e78e3eedf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-53593c6a-4c2a-420a-9472-e7be0052fa39, tcib_managed=true, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.schema-version=1.0)
Feb 23 09:58:10 np0005626463.localdomain dnsmasq[315116]: exiting on receipt of SIGTERM
Feb 23 09:58:10 np0005626463.localdomain systemd[1]: libpod-372b1b70d42257631680f0e1c8a922f42cd8a88a6ad7a8af2743c42e78e3eedf.scope: Deactivated successfully.
Feb 23 09:58:10 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:58:10.351 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=16, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '22:68:bc', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'c6:19:65:94:49:af'}, ipsec=False) old=SB_Global(nb_cfg=15) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 09:58:10 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:58:10.352 163572 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 23 09:58:10 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:58:10.384 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:58:10 np0005626463.localdomain podman[315569]: 2026-02-23 09:58:10.390458531 +0000 UTC m=+0.069697654 container died 372b1b70d42257631680f0e1c8a922f42cd8a88a6ad7a8af2743c42e78e3eedf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-53593c6a-4c2a-420a-9472-e7be0052fa39, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.43.0)
Feb 23 09:58:10 np0005626463.localdomain systemd[1]: tmp-crun.390MTa.mount: Deactivated successfully.
Feb 23 09:58:10 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-372b1b70d42257631680f0e1c8a922f42cd8a88a6ad7a8af2743c42e78e3eedf-userdata-shm.mount: Deactivated successfully.
Feb 23 09:58:10 np0005626463.localdomain podman[315569]: 2026-02-23 09:58:10.435645457 +0000 UTC m=+0.114884550 container remove 372b1b70d42257631680f0e1c8a922f42cd8a88a6ad7a8af2743c42e78e3eedf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-53593c6a-4c2a-420a-9472-e7be0052fa39, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216)
Feb 23 09:58:10 np0005626463.localdomain systemd[1]: libpod-conmon-372b1b70d42257631680f0e1c8a922f42cd8a88a6ad7a8af2743c42e78e3eedf.scope: Deactivated successfully.
Feb 23 09:58:10 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:58:10.452 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:58:10 np0005626463.localdomain kernel: device tap471e704b-d2 left promiscuous mode
Feb 23 09:58:10 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:58:10Z|00250|binding|INFO|Releasing lport 471e704b-d2af-4ed9-bee8-b3da9da96eee from this chassis (sb_readonly=0)
Feb 23 09:58:10 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:58:10Z|00251|binding|INFO|Setting lport 471e704b-d2af-4ed9-bee8-b3da9da96eee down in Southbound
Feb 23 09:58:10 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:58:10.461 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005626463.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpfb23302c-55c1-5de0-badf-4fc1ff22837a-53593c6a-4c2a-420a-9472-e7be0052fa39', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-53593c6a-4c2a-420a-9472-e7be0052fa39', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'acbc03f0564045b8857e1689cfa4a66d', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005626463.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d673836f-3caa-47cd-a8d5-bae35d62c0ff, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f808c075610>], logical_port=471e704b-d2af-4ed9-bee8-b3da9da96eee) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f808c075610>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 09:58:10 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:58:10.463 163572 INFO neutron.agent.ovn.metadata.agent [-] Port 471e704b-d2af-4ed9-bee8-b3da9da96eee in datapath 53593c6a-4c2a-420a-9472-e7be0052fa39 unbound from our chassis
Feb 23 09:58:10 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:58:10.465 163572 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 53593c6a-4c2a-420a-9472-e7be0052fa39 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Feb 23 09:58:10 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:58:10.465 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[d67b0d9a-d0bb-4ae4-8d79-611fd22147b1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 09:58:10 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:58:10.483 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:58:10 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:58:10.681 265541 INFO neutron.agent.dhcp.agent [None req-5b205fac-0dac-4ec3-86e3-6be5a69902db - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 23 09:58:10 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:58:10.984 2 INFO neutron.agent.securitygroups_rpc [None req-19ae9284-14e3-4a20-834d-20dede799690 982b83c89c37422a910f5359ef7b6ea5 5bed3d4ae9fd4fe7b9440b4587246c14 - - default default] Security group member updated ['3f91b09d-b6ac-403f-adaf-7a684ee36fe5']
Feb 23 09:58:11 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:58:11.126 265541 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Feb 23 09:58:11 np0005626463.localdomain ceph-mon[294160]: osdmap e140: 6 total, 6 up, 6 in
Feb 23 09:58:11 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-5440e0fd82ce5dacc9263097b3678f8d36f3eeeec3a08ac5ea46b12225532482-merged.mount: Deactivated successfully.
Feb 23 09:58:11 np0005626463.localdomain systemd[1]: run-netns-qdhcp\x2d53593c6a\x2d4c2a\x2d420a\x2d9472\x2de7be0052fa39.mount: Deactivated successfully.
Feb 23 09:58:11 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:58:11Z|00252|binding|INFO|Releasing lport 4143c8ea-7577-4792-9744-bcff90eb20f2 from this chassis (sb_readonly=0)
Feb 23 09:58:11 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:58:11.359 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:58:12 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:58:12.024 265541 INFO neutron.agent.linux.ip_lib [None req-acbd2cfd-da9a-435e-8746-88eb3178d90b - - - - - -] Device tap591e24ae-b5 cannot be used as it has no MAC address
Feb 23 09:58:12 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e140 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:58:12 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:58:12.095 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:58:12 np0005626463.localdomain kernel: device tap591e24ae-b5 entered promiscuous mode
Feb 23 09:58:12 np0005626463.localdomain NetworkManager[5974]: <info>  [1771840692.1031] manager: (tap591e24ae-b5): new Generic device (/org/freedesktop/NetworkManager/Devices/42)
Feb 23 09:58:12 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:58:12.102 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:58:12 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:58:12Z|00253|binding|INFO|Claiming lport 591e24ae-b5aa-4de2-bb90-f63788f17656 for this chassis.
Feb 23 09:58:12 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:58:12Z|00254|binding|INFO|591e24ae-b5aa-4de2-bb90-f63788f17656: Claiming unknown
Feb 23 09:58:12 np0005626463.localdomain systemd-udevd[315604]: Network interface NamePolicy= disabled on kernel command line.
Feb 23 09:58:12 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:58:12.121 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005626463.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpfb23302c-55c1-5de0-badf-4fc1ff22837a-59b1761b-6159-4589-b0ab-d24692c6be4a', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-59b1761b-6159-4589-b0ab-d24692c6be4a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '983d362fe1064ddd8f80d65a731f1168', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a0f1210f-d7d4-4dc7-9c80-9ac59d666074, chassis=[<ovs.db.idl.Row object at 0x7f808c075610>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f808c075610>], logical_port=591e24ae-b5aa-4de2-bb90-f63788f17656) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 09:58:12 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:58:12.123 163572 INFO neutron.agent.ovn.metadata.agent [-] Port 591e24ae-b5aa-4de2-bb90-f63788f17656 in datapath 59b1761b-6159-4589-b0ab-d24692c6be4a bound to our chassis
Feb 23 09:58:12 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:58:12.127 163572 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 59b1761b-6159-4589-b0ab-d24692c6be4a or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Feb 23 09:58:12 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:58:12.129 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[d6568e78-7528-4be6-8f7f-fbbfad844393]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 09:58:12 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap591e24ae-b5: No such device
Feb 23 09:58:12 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap591e24ae-b5: No such device
Feb 23 09:58:12 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:58:12.140 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:58:12 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:58:12Z|00255|binding|INFO|Setting lport 591e24ae-b5aa-4de2-bb90-f63788f17656 ovn-installed in OVS
Feb 23 09:58:12 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:58:12Z|00256|binding|INFO|Setting lport 591e24ae-b5aa-4de2-bb90-f63788f17656 up in Southbound
Feb 23 09:58:12 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap591e24ae-b5: No such device
Feb 23 09:58:12 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap591e24ae-b5: No such device
Feb 23 09:58:12 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap591e24ae-b5: No such device
Feb 23 09:58:12 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap591e24ae-b5: No such device
Feb 23 09:58:12 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap591e24ae-b5: No such device
Feb 23 09:58:12 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap591e24ae-b5: No such device
Feb 23 09:58:12 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:58:12.178 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:58:12 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:58:12.209 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:58:12 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e140 do_prune osdmap full prune enabled
Feb 23 09:58:12 np0005626463.localdomain ceph-mon[294160]: pgmap v270: 177 pgs: 177 active+clean; 145 MiB data, 800 MiB used, 41 GiB / 42 GiB avail; 1.1 KiB/s rd, 673 B/s wr, 2 op/s
Feb 23 09:58:12 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e141 e141: 6 total, 6 up, 6 in
Feb 23 09:58:12 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e141: 6 total, 6 up, 6 in
Feb 23 09:58:12 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:58:12.353 163572 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=96b5bb93-7341-4ce6-9b93-6a5de566c711, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '16'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 09:58:12 np0005626463.localdomain dnsmasq[314933]: exiting on receipt of SIGTERM
Feb 23 09:58:12 np0005626463.localdomain podman[315670]: 2026-02-23 09:58:12.837295506 +0000 UTC m=+0.055628209 container kill 61844cecbb88e05cd64c9c8c1b4fbab159f87a7bad8c1686f8d1bec04920bec1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c843867f-296c-42fa-9f8c-55712f0f7c56, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216)
Feb 23 09:58:12 np0005626463.localdomain systemd[1]: libpod-61844cecbb88e05cd64c9c8c1b4fbab159f87a7bad8c1686f8d1bec04920bec1.scope: Deactivated successfully.
Feb 23 09:58:12 np0005626463.localdomain podman[315683]: 2026-02-23 09:58:12.903069198 +0000 UTC m=+0.052706059 container died 61844cecbb88e05cd64c9c8c1b4fbab159f87a7bad8c1686f8d1bec04920bec1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c843867f-296c-42fa-9f8c-55712f0f7c56, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.license=GPLv2)
Feb 23 09:58:13 np0005626463.localdomain systemd[1]: tmp-crun.fXp0zA.mount: Deactivated successfully.
Feb 23 09:58:13 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.
Feb 23 09:58:13 np0005626463.localdomain podman[315683]: 2026-02-23 09:58:13.087913279 +0000 UTC m=+0.237550080 container cleanup 61844cecbb88e05cd64c9c8c1b4fbab159f87a7bad8c1686f8d1bec04920bec1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c843867f-296c-42fa-9f8c-55712f0f7c56, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Feb 23 09:58:13 np0005626463.localdomain systemd[1]: libpod-conmon-61844cecbb88e05cd64c9c8c1b4fbab159f87a7bad8c1686f8d1bec04920bec1.scope: Deactivated successfully.
Feb 23 09:58:13 np0005626463.localdomain podman[315685]: 2026-02-23 09:58:13.11027948 +0000 UTC m=+0.250722087 container remove 61844cecbb88e05cd64c9c8c1b4fbab159f87a7bad8c1686f8d1bec04920bec1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c843867f-296c-42fa-9f8c-55712f0f7c56, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 23 09:58:13 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:58:13.157 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:58:13 np0005626463.localdomain podman[315728]: 2026-02-23 09:58:13.182919934 +0000 UTC m=+0.113567729 container health_status be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, io.buildah.version=1.43.0, container_name=ceilometer_agent_compute, org.label-schema.build-date=20260216, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Feb 23 09:58:13 np0005626463.localdomain podman[315728]: 2026-02-23 09:58:13.193085799 +0000 UTC m=+0.123733594 container exec_died be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.43.0, tcib_managed=true)
Feb 23 09:58:13 np0005626463.localdomain systemd[1]: be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.service: Deactivated successfully.
Feb 23 09:58:13 np0005626463.localdomain podman[315744]: 
Feb 23 09:58:13 np0005626463.localdomain podman[315744]: 2026-02-23 09:58:13.249630465 +0000 UTC m=+0.075153523 container create 8ab3b14bd71db46ee1e28e4ce44c5c50d8f5aba2d8c1832a51fb8d6387f4bde1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-59b1761b-6159-4589-b0ab-d24692c6be4a, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260216, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Feb 23 09:58:13 np0005626463.localdomain systemd[1]: Started libpod-conmon-8ab3b14bd71db46ee1e28e4ce44c5c50d8f5aba2d8c1832a51fb8d6387f4bde1.scope.
Feb 23 09:58:13 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 09:58:13 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a6596b8fd2403688e1c2846251827eb8351db900fadb6ba717052caf8eb93089/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 23 09:58:13 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:58:13.304 265541 INFO neutron.agent.dhcp.agent [None req-ab0e1c05-a52a-4777-94ac-5fda81f8a7db - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 23 09:58:13 np0005626463.localdomain podman[315744]: 2026-02-23 09:58:13.305988817 +0000 UTC m=+0.131511895 container init 8ab3b14bd71db46ee1e28e4ce44c5c50d8f5aba2d8c1832a51fb8d6387f4bde1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-59b1761b-6159-4589-b0ab-d24692c6be4a, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Feb 23 09:58:13 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:58:13.306 265541 INFO neutron.agent.dhcp.agent [None req-ab0e1c05-a52a-4777-94ac-5fda81f8a7db - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 23 09:58:13 np0005626463.localdomain podman[315744]: 2026-02-23 09:58:13.318835124 +0000 UTC m=+0.144358202 container start 8ab3b14bd71db46ee1e28e4ce44c5c50d8f5aba2d8c1832a51fb8d6387f4bde1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-59b1761b-6159-4589-b0ab-d24692c6be4a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260216, tcib_managed=true)
Feb 23 09:58:13 np0005626463.localdomain podman[315744]: 2026-02-23 09:58:13.221458825 +0000 UTC m=+0.046981893 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 23 09:58:13 np0005626463.localdomain ceph-mon[294160]: osdmap e141: 6 total, 6 up, 6 in
Feb 23 09:58:13 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/249112054' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 23 09:58:13 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/249112054' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 23 09:58:13 np0005626463.localdomain dnsmasq[315769]: started, version 2.85 cachesize 150
Feb 23 09:58:13 np0005626463.localdomain dnsmasq[315769]: DNS service limited to local subnets
Feb 23 09:58:13 np0005626463.localdomain dnsmasq[315769]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 23 09:58:13 np0005626463.localdomain dnsmasq[315769]: warning: no upstream servers configured
Feb 23 09:58:13 np0005626463.localdomain dnsmasq-dhcp[315769]: DHCP, static leases only on 10.100.0.0, lease time 1d
Feb 23 09:58:13 np0005626463.localdomain dnsmasq[315769]: read /var/lib/neutron/dhcp/59b1761b-6159-4589-b0ab-d24692c6be4a/addn_hosts - 0 addresses
Feb 23 09:58:13 np0005626463.localdomain dnsmasq-dhcp[315769]: read /var/lib/neutron/dhcp/59b1761b-6159-4589-b0ab-d24692c6be4a/host
Feb 23 09:58:13 np0005626463.localdomain dnsmasq-dhcp[315769]: read /var/lib/neutron/dhcp/59b1761b-6159-4589-b0ab-d24692c6be4a/opts
Feb 23 09:58:13 np0005626463.localdomain openstack_network_exporter[245358]: ERROR   09:58:13 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 23 09:58:13 np0005626463.localdomain openstack_network_exporter[245358]: 
Feb 23 09:58:13 np0005626463.localdomain openstack_network_exporter[245358]: ERROR   09:58:13 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 23 09:58:13 np0005626463.localdomain openstack_network_exporter[245358]: 
Feb 23 09:58:13 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:58:13.451 265541 INFO neutron.agent.dhcp.agent [None req-81980b94-9ebd-460c-b985-f9233238ccc5 - - - - - -] DHCP configuration for ports {'59da67d1-dc5c-4e0b-8d29-1109ee1dfd79'} is completed
Feb 23 09:58:13 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:58:13.819 2 INFO neutron.agent.securitygroups_rpc [None req-eb522331-80d1-4b00-bd36-6fd8378962f5 982b83c89c37422a910f5359ef7b6ea5 5bed3d4ae9fd4fe7b9440b4587246c14 - - default default] Security group member updated ['3f91b09d-b6ac-403f-adaf-7a684ee36fe5']
Feb 23 09:58:13 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-72bec57338a9b4c2dc713f40ed6af99e9442fa24f4fd216de691c92203a3e570-merged.mount: Deactivated successfully.
Feb 23 09:58:13 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-61844cecbb88e05cd64c9c8c1b4fbab159f87a7bad8c1686f8d1bec04920bec1-userdata-shm.mount: Deactivated successfully.
Feb 23 09:58:13 np0005626463.localdomain systemd[1]: run-netns-qdhcp\x2dc843867f\x2d296c\x2d42fa\x2d9f8c\x2d55712f0f7c56.mount: Deactivated successfully.
Feb 23 09:58:14 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:58:14.207 265541 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Feb 23 09:58:14 np0005626463.localdomain ceph-mon[294160]: pgmap v272: 177 pgs: 177 active+clean; 145 MiB data, 800 MiB used, 41 GiB / 42 GiB avail; 17 KiB/s rd, 2.1 KiB/s wr, 24 op/s
Feb 23 09:58:14 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Feb 23 09:58:14 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/873803536' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 23 09:58:14 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Feb 23 09:58:14 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/873803536' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 23 09:58:15 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:58:15.157 2 INFO neutron.agent.securitygroups_rpc [None req-4d44920c-4a63-4197-972a-c30d277ee529 982b83c89c37422a910f5359ef7b6ea5 5bed3d4ae9fd4fe7b9440b4587246c14 - - default default] Security group member updated ['3f91b09d-b6ac-403f-adaf-7a684ee36fe5']
Feb 23 09:58:15 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:58:15.161 2 INFO neutron.agent.securitygroups_rpc [None req-54919a2c-3a58-4a60-9867-0e5c23ba956a 8ff2abb777c74a6dbae4721d46f0d17a 182b0ebb06754cfab10ebabcdf7056ed - - default default] Security group member updated ['c029b069-aec5-44a4-9af0-e58cbf64895c']
Feb 23 09:58:15 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/873803536' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 23 09:58:15 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/873803536' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 23 09:58:15 np0005626463.localdomain ceph-mon[294160]: pgmap v273: 177 pgs: 177 active+clean; 145 MiB data, 800 MiB used, 41 GiB / 42 GiB avail; 17 KiB/s rd, 2.1 KiB/s wr, 24 op/s
Feb 23 09:58:15 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:58:15.463 265541 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-23T09:58:15Z, description=, device_id=2eb0a382-819f-43ad-8f50-a7e13025787c, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f28291c11c0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f28291c1250>], id=12133776-1eda-452b-a19f-0bf1cb92b9c1, ip_allocation=immediate, mac_address=fa:16:3e:f6:82:dc, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-23T09:58:09Z, description=, dns_domain=, id=59b1761b-6159-4589-b0ab-d24692c6be4a, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-FloatingIPNegativeTestJSON-test-network-543621127, port_security_enabled=True, project_id=983d362fe1064ddd8f80d65a731f1168, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=58541, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1991, status=ACTIVE, subnets=['dbc5e08e-6d95-4682-93af-5a6ba15e0328'], tags=[], tenant_id=983d362fe1064ddd8f80d65a731f1168, updated_at=2026-02-23T09:58:10Z, vlan_transparent=None, network_id=59b1761b-6159-4589-b0ab-d24692c6be4a, port_security_enabled=False, project_id=983d362fe1064ddd8f80d65a731f1168, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2004, status=DOWN, tags=[], tenant_id=983d362fe1064ddd8f80d65a731f1168, updated_at=2026-02-23T09:58:15Z on network 59b1761b-6159-4589-b0ab-d24692c6be4a
Feb 23 09:58:15 np0005626463.localdomain systemd[1]: tmp-crun.28n2Ib.mount: Deactivated successfully.
Feb 23 09:58:15 np0005626463.localdomain podman[315787]: 2026-02-23 09:58:15.719822943 +0000 UTC m=+0.064598086 container kill 8ab3b14bd71db46ee1e28e4ce44c5c50d8f5aba2d8c1832a51fb8d6387f4bde1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-59b1761b-6159-4589-b0ab-d24692c6be4a, org.label-schema.build-date=20260216, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team)
Feb 23 09:58:15 np0005626463.localdomain dnsmasq[315769]: read /var/lib/neutron/dhcp/59b1761b-6159-4589-b0ab-d24692c6be4a/addn_hosts - 1 addresses
Feb 23 09:58:15 np0005626463.localdomain dnsmasq-dhcp[315769]: read /var/lib/neutron/dhcp/59b1761b-6159-4589-b0ab-d24692c6be4a/host
Feb 23 09:58:15 np0005626463.localdomain dnsmasq-dhcp[315769]: read /var/lib/neutron/dhcp/59b1761b-6159-4589-b0ab-d24692c6be4a/opts
Feb 23 09:58:15 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:58:15.901 265541 INFO neutron.agent.dhcp.agent [None req-2b0f6ab5-7875-4b16-821f-35040e07a6c9 - - - - - -] DHCP configuration for ports {'12133776-1eda-452b-a19f-0bf1cb92b9c1'} is completed
Feb 23 09:58:16 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/3497166256' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 23 09:58:16 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/3497166256' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 23 09:58:16 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:58:16.700 265541 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-23T09:58:15Z, description=, device_id=2eb0a382-819f-43ad-8f50-a7e13025787c, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f2829236910>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f28292365e0>], id=12133776-1eda-452b-a19f-0bf1cb92b9c1, ip_allocation=immediate, mac_address=fa:16:3e:f6:82:dc, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-23T09:58:09Z, description=, dns_domain=, id=59b1761b-6159-4589-b0ab-d24692c6be4a, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-FloatingIPNegativeTestJSON-test-network-543621127, port_security_enabled=True, project_id=983d362fe1064ddd8f80d65a731f1168, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=58541, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1991, status=ACTIVE, subnets=['dbc5e08e-6d95-4682-93af-5a6ba15e0328'], tags=[], tenant_id=983d362fe1064ddd8f80d65a731f1168, updated_at=2026-02-23T09:58:10Z, vlan_transparent=None, network_id=59b1761b-6159-4589-b0ab-d24692c6be4a, port_security_enabled=False, project_id=983d362fe1064ddd8f80d65a731f1168, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2004, status=DOWN, tags=[], tenant_id=983d362fe1064ddd8f80d65a731f1168, updated_at=2026-02-23T09:58:15Z on network 59b1761b-6159-4589-b0ab-d24692c6be4a
Feb 23 09:58:16 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.
Feb 23 09:58:16 np0005626463.localdomain podman[315809]: 2026-02-23 09:58:16.909940903 +0000 UTC m=+0.086360159 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260216, tcib_managed=true, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb)
Feb 23 09:58:16 np0005626463.localdomain podman[315809]: 2026-02-23 09:58:16.921329845 +0000 UTC m=+0.097749091 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.43.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, managed_by=edpm_ansible, tcib_managed=true)
Feb 23 09:58:16 np0005626463.localdomain systemd[1]: 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully.
Feb 23 09:58:17 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e141 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:58:17 np0005626463.localdomain podman[315842]: 2026-02-23 09:58:17.054529539 +0000 UTC m=+0.058213509 container kill 8ab3b14bd71db46ee1e28e4ce44c5c50d8f5aba2d8c1832a51fb8d6387f4bde1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-59b1761b-6159-4589-b0ab-d24692c6be4a, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, tcib_managed=true, org.label-schema.schema-version=1.0)
Feb 23 09:58:17 np0005626463.localdomain dnsmasq[315769]: read /var/lib/neutron/dhcp/59b1761b-6159-4589-b0ab-d24692c6be4a/addn_hosts - 1 addresses
Feb 23 09:58:17 np0005626463.localdomain dnsmasq-dhcp[315769]: read /var/lib/neutron/dhcp/59b1761b-6159-4589-b0ab-d24692c6be4a/host
Feb 23 09:58:17 np0005626463.localdomain dnsmasq-dhcp[315769]: read /var/lib/neutron/dhcp/59b1761b-6159-4589-b0ab-d24692c6be4a/opts
Feb 23 09:58:17 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:58:17.307 265541 INFO neutron.agent.dhcp.agent [None req-55a8c3e2-35cc-4d48-aa2d-26993f90f86d - - - - - -] DHCP configuration for ports {'12133776-1eda-452b-a19f-0bf1cb92b9c1'} is completed
Feb 23 09:58:17 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:58:17.716 2 INFO neutron.agent.securitygroups_rpc [None req-d1c62e78-6e68-4888-a717-f17406167923 1a9e25d9a0c746578e1b6c457935b6c2 983d362fe1064ddd8f80d65a731f1168 - - default default] Security group member updated ['011ab8d8-354c-4fb1-b0db-21af2eca313e']
Feb 23 09:58:17 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:58:17.734 265541 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-23T09:58:17Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f28291c2e80>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f28291c2cd0>], id=de6b3f04-06ea-414f-8da1-27ddb41e69e1, ip_allocation=immediate, mac_address=fa:16:3e:42:93:58, name=tempest-FloatingIPNegativeTestJSON-1931896205, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-23T09:58:09Z, description=, dns_domain=, id=59b1761b-6159-4589-b0ab-d24692c6be4a, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-FloatingIPNegativeTestJSON-test-network-543621127, port_security_enabled=True, project_id=983d362fe1064ddd8f80d65a731f1168, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=58541, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1991, status=ACTIVE, subnets=['dbc5e08e-6d95-4682-93af-5a6ba15e0328'], tags=[], tenant_id=983d362fe1064ddd8f80d65a731f1168, updated_at=2026-02-23T09:58:10Z, vlan_transparent=None, network_id=59b1761b-6159-4589-b0ab-d24692c6be4a, port_security_enabled=True, project_id=983d362fe1064ddd8f80d65a731f1168, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['011ab8d8-354c-4fb1-b0db-21af2eca313e'], standard_attr_id=2021, status=DOWN, tags=[], tenant_id=983d362fe1064ddd8f80d65a731f1168, updated_at=2026-02-23T09:58:17Z on network 59b1761b-6159-4589-b0ab-d24692c6be4a
Feb 23 09:58:18 np0005626463.localdomain systemd[1]: tmp-crun.6T9ZWj.mount: Deactivated successfully.
Feb 23 09:58:18 np0005626463.localdomain dnsmasq[315769]: read /var/lib/neutron/dhcp/59b1761b-6159-4589-b0ab-d24692c6be4a/addn_hosts - 2 addresses
Feb 23 09:58:18 np0005626463.localdomain podman[315880]: 2026-02-23 09:58:18.015974474 +0000 UTC m=+0.058933772 container kill 8ab3b14bd71db46ee1e28e4ce44c5c50d8f5aba2d8c1832a51fb8d6387f4bde1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-59b1761b-6159-4589-b0ab-d24692c6be4a, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=8419493e1fd846703d277695e03fc5eb)
Feb 23 09:58:18 np0005626463.localdomain dnsmasq-dhcp[315769]: read /var/lib/neutron/dhcp/59b1761b-6159-4589-b0ab-d24692c6be4a/host
Feb 23 09:58:18 np0005626463.localdomain dnsmasq-dhcp[315769]: read /var/lib/neutron/dhcp/59b1761b-6159-4589-b0ab-d24692c6be4a/opts
Feb 23 09:58:18 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:58:18.041 2 INFO neutron.agent.securitygroups_rpc [None req-45386671-0a66-4623-b814-6d3841258b3c 982b83c89c37422a910f5359ef7b6ea5 5bed3d4ae9fd4fe7b9440b4587246c14 - - default default] Security group member updated ['3f91b09d-b6ac-403f-adaf-7a684ee36fe5']
Feb 23 09:58:18 np0005626463.localdomain ceph-mon[294160]: pgmap v274: 177 pgs: 177 active+clean; 145 MiB data, 800 MiB used, 41 GiB / 42 GiB avail; 65 KiB/s rd, 3.9 KiB/s wr, 88 op/s
Feb 23 09:58:18 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:58:18.159 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 09:58:18 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:58:18.161 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:58:18 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:58:18.161 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 09:58:18 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:58:18.162 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:58:18 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:58:18.162 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 09:58:18 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:58:18.165 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:58:18 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:58:18.294 265541 INFO neutron.agent.dhcp.agent [None req-422fbba8-ec27-4586-80f8-2bb96cf2b3ca - - - - - -] DHCP configuration for ports {'de6b3f04-06ea-414f-8da1-27ddb41e69e1'} is completed
Feb 23 09:58:18 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:58:18.765 2 INFO neutron.agent.securitygroups_rpc [None req-ad743b42-9d5f-46bc-bf0d-b2f432d91b64 982b83c89c37422a910f5359ef7b6ea5 5bed3d4ae9fd4fe7b9440b4587246c14 - - default default] Security group member updated ['3f91b09d-b6ac-403f-adaf-7a684ee36fe5']
Feb 23 09:58:20 np0005626463.localdomain ceph-mon[294160]: pgmap v275: 177 pgs: 177 active+clean; 145 MiB data, 800 MiB used, 41 GiB / 42 GiB avail; 58 KiB/s rd, 3.5 KiB/s wr, 79 op/s
Feb 23 09:58:20 np0005626463.localdomain sudo[315902]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 09:58:20 np0005626463.localdomain sudo[315902]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:58:20 np0005626463.localdomain sudo[315902]: pam_unix(sudo:session): session closed for user root
Feb 23 09:58:20 np0005626463.localdomain sudo[315920]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 23 09:58:20 np0005626463.localdomain sudo[315920]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:58:20 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:58:20.749 2 INFO neutron.agent.securitygroups_rpc [None req-e6124d39-875a-4fc5-8c30-4d7caf025748 982b83c89c37422a910f5359ef7b6ea5 5bed3d4ae9fd4fe7b9440b4587246c14 - - default default] Security group member updated ['3f91b09d-b6ac-403f-adaf-7a684ee36fe5']
Feb 23 09:58:21 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:58:21.055 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:58:21 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:58:21.055 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 23 09:58:21 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:58:21.055 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 23 09:58:21 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:58:21.157 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 23 09:58:21 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:58:21.157 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquired lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 23 09:58:21 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:58:21.158 282211 DEBUG nova.network.neutron [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 23 09:58:21 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:58:21.158 282211 DEBUG nova.objects.instance [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c2a7d92b-952f-46a7-8a6a-3322a48fcf4b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 23 09:58:21 np0005626463.localdomain sudo[315920]: pam_unix(sudo:session): session closed for user root
Feb 23 09:58:21 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Feb 23 09:58:21 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:58:21 np0005626463.localdomain sudo[315971]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 09:58:21 np0005626463.localdomain sudo[315971]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:58:21 np0005626463.localdomain sudo[315971]: pam_unix(sudo:session): session closed for user root
Feb 23 09:58:21 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:58:21.577 2 INFO neutron.agent.securitygroups_rpc [None req-362b8c0d-6857-4758-a913-b5b9ee733cd1 982b83c89c37422a910f5359ef7b6ea5 5bed3d4ae9fd4fe7b9440b4587246c14 - - default default] Security group member updated ['3f91b09d-b6ac-403f-adaf-7a684ee36fe5']
Feb 23 09:58:21 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:58:21.618 2 INFO neutron.agent.securitygroups_rpc [None req-12654231-88d0-4565-b196-1dda145060e4 1a9e25d9a0c746578e1b6c457935b6c2 983d362fe1064ddd8f80d65a731f1168 - - default default] Security group member updated ['011ab8d8-354c-4fb1-b0db-21af2eca313e']
Feb 23 09:58:21 np0005626463.localdomain dnsmasq[315769]: read /var/lib/neutron/dhcp/59b1761b-6159-4589-b0ab-d24692c6be4a/addn_hosts - 1 addresses
Feb 23 09:58:21 np0005626463.localdomain dnsmasq-dhcp[315769]: read /var/lib/neutron/dhcp/59b1761b-6159-4589-b0ab-d24692c6be4a/host
Feb 23 09:58:21 np0005626463.localdomain dnsmasq-dhcp[315769]: read /var/lib/neutron/dhcp/59b1761b-6159-4589-b0ab-d24692c6be4a/opts
Feb 23 09:58:21 np0005626463.localdomain podman[316005]: 2026-02-23 09:58:21.877732545 +0000 UTC m=+0.062749089 container kill 8ab3b14bd71db46ee1e28e4ce44c5c50d8f5aba2d8c1832a51fb8d6387f4bde1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-59b1761b-6159-4589-b0ab-d24692c6be4a, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS)
Feb 23 09:58:22 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:58:22.031 282211 DEBUG nova.network.neutron [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Updating instance_info_cache with network_info: [{"id": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "address": "fa:16:3e:a0:9d:00", "network": {"id": "9da5b53d-3184-450f-9a5b-bdba1a6c9f6d", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "37b8098efb0d4ecc90b451a2db0e966f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa27e5011-20", "ovs_interfaceid": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 23 09:58:22 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e141 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:58:22 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e141 do_prune osdmap full prune enabled
Feb 23 09:58:22 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:58:22.053 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Releasing lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 23 09:58:22 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:58:22.054 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 23 09:58:22 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e142 e142: 6 total, 6 up, 6 in
Feb 23 09:58:22 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e142: 6 total, 6 up, 6 in
Feb 23 09:58:22 np0005626463.localdomain ceph-mon[294160]: pgmap v276: 177 pgs: 177 active+clean; 145 MiB data, 800 MiB used, 41 GiB / 42 GiB avail; 51 KiB/s rd, 2.5 KiB/s wr, 68 op/s
Feb 23 09:58:22 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:58:22 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 23 09:58:22 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:58:22 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 23 09:58:22 np0005626463.localdomain ceph-mon[294160]: osdmap e142: 6 total, 6 up, 6 in
Feb 23 09:58:22 np0005626463.localdomain systemd[1]: tmp-crun.sm2aXv.mount: Deactivated successfully.
Feb 23 09:58:22 np0005626463.localdomain dnsmasq[315769]: read /var/lib/neutron/dhcp/59b1761b-6159-4589-b0ab-d24692c6be4a/addn_hosts - 0 addresses
Feb 23 09:58:22 np0005626463.localdomain dnsmasq-dhcp[315769]: read /var/lib/neutron/dhcp/59b1761b-6159-4589-b0ab-d24692c6be4a/host
Feb 23 09:58:22 np0005626463.localdomain dnsmasq-dhcp[315769]: read /var/lib/neutron/dhcp/59b1761b-6159-4589-b0ab-d24692c6be4a/opts
Feb 23 09:58:22 np0005626463.localdomain podman[316042]: 2026-02-23 09:58:22.73629223 +0000 UTC m=+0.070489208 container kill 8ab3b14bd71db46ee1e28e4ce44c5c50d8f5aba2d8c1832a51fb8d6387f4bde1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-59b1761b-6159-4589-b0ab-d24692c6be4a, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS)
Feb 23 09:58:22 np0005626463.localdomain kernel: device tap591e24ae-b5 left promiscuous mode
Feb 23 09:58:22 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:58:22Z|00257|binding|INFO|Releasing lport 591e24ae-b5aa-4de2-bb90-f63788f17656 from this chassis (sb_readonly=0)
Feb 23 09:58:22 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:58:22Z|00258|binding|INFO|Setting lport 591e24ae-b5aa-4de2-bb90-f63788f17656 down in Southbound
Feb 23 09:58:22 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:58:22.907 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:58:22 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:58:22.924 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005626463.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpfb23302c-55c1-5de0-badf-4fc1ff22837a-59b1761b-6159-4589-b0ab-d24692c6be4a', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-59b1761b-6159-4589-b0ab-d24692c6be4a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '983d362fe1064ddd8f80d65a731f1168', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005626463.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a0f1210f-d7d4-4dc7-9c80-9ac59d666074, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f808c075610>], logical_port=591e24ae-b5aa-4de2-bb90-f63788f17656) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f808c075610>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 09:58:22 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:58:22.926 163572 INFO neutron.agent.ovn.metadata.agent [-] Port 591e24ae-b5aa-4de2-bb90-f63788f17656 in datapath 59b1761b-6159-4589-b0ab-d24692c6be4a unbound from our chassis
Feb 23 09:58:22 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:58:22.928 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:58:22 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:58:22.929 163572 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 59b1761b-6159-4589-b0ab-d24692c6be4a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 23 09:58:22 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:58:22.930 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[b41241db-4cfb-494a-9266-1717f1eba2ec]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 09:58:23 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:58:23.163 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:58:23 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:58:23.828 2 INFO neutron.agent.securitygroups_rpc [None req-843b8c0e-9c58-433a-b8a5-d142a0ae4b56 982b83c89c37422a910f5359ef7b6ea5 5bed3d4ae9fd4fe7b9440b4587246c14 - - default default] Security group member updated ['3f91b09d-b6ac-403f-adaf-7a684ee36fe5']
Feb 23 09:58:24 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:58:24.055 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:58:24 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:58:24.055 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 23 09:58:24 np0005626463.localdomain ceph-mon[294160]: pgmap v278: 177 pgs: 177 active+clean; 145 MiB data, 800 MiB used, 41 GiB / 42 GiB avail; 38 KiB/s rd, 1.4 KiB/s wr, 51 op/s
Feb 23 09:58:24 np0005626463.localdomain systemd[1]: tmp-crun.UimNtR.mount: Deactivated successfully.
Feb 23 09:58:24 np0005626463.localdomain dnsmasq[315769]: exiting on receipt of SIGTERM
Feb 23 09:58:24 np0005626463.localdomain podman[316084]: 2026-02-23 09:58:24.345750626 +0000 UTC m=+0.068027182 container kill 8ab3b14bd71db46ee1e28e4ce44c5c50d8f5aba2d8c1832a51fb8d6387f4bde1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-59b1761b-6159-4589-b0ab-d24692c6be4a, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Feb 23 09:58:24 np0005626463.localdomain systemd[1]: libpod-8ab3b14bd71db46ee1e28e4ce44c5c50d8f5aba2d8c1832a51fb8d6387f4bde1.scope: Deactivated successfully.
Feb 23 09:58:24 np0005626463.localdomain podman[316099]: 2026-02-23 09:58:24.416587664 +0000 UTC m=+0.054883116 container died 8ab3b14bd71db46ee1e28e4ce44c5c50d8f5aba2d8c1832a51fb8d6387f4bde1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-59b1761b-6159-4589-b0ab-d24692c6be4a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.build-date=20260216, tcib_managed=true)
Feb 23 09:58:24 np0005626463.localdomain podman[316099]: 2026-02-23 09:58:24.451342639 +0000 UTC m=+0.089638021 container cleanup 8ab3b14bd71db46ee1e28e4ce44c5c50d8f5aba2d8c1832a51fb8d6387f4bde1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-59b1761b-6159-4589-b0ab-d24692c6be4a, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0)
Feb 23 09:58:24 np0005626463.localdomain systemd[1]: libpod-conmon-8ab3b14bd71db46ee1e28e4ce44c5c50d8f5aba2d8c1832a51fb8d6387f4bde1.scope: Deactivated successfully.
Feb 23 09:58:24 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:58:24.457 2 INFO neutron.agent.securitygroups_rpc [None req-f2783f1b-db17-4abb-b550-2a7eaeb7f1e9 982b83c89c37422a910f5359ef7b6ea5 5bed3d4ae9fd4fe7b9440b4587246c14 - - default default] Security group member updated ['3f91b09d-b6ac-403f-adaf-7a684ee36fe5']
Feb 23 09:58:24 np0005626463.localdomain podman[316100]: 2026-02-23 09:58:24.499441934 +0000 UTC m=+0.134076763 container remove 8ab3b14bd71db46ee1e28e4ce44c5c50d8f5aba2d8c1832a51fb8d6387f4bde1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-59b1761b-6159-4589-b0ab-d24692c6be4a, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 23 09:58:24 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:58:24.531 265541 INFO neutron.agent.dhcp.agent [None req-75265f1a-11dc-4535-ae70-e8314a9f3b26 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 23 09:58:24 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:58:24.749 265541 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Feb 23 09:58:25 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:58:25.055 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:58:25 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:58:25Z|00259|binding|INFO|Releasing lport 4143c8ea-7577-4792-9744-bcff90eb20f2 from this chassis (sb_readonly=0)
Feb 23 09:58:25 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:58:25.099 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:58:25 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.107:0/2779787032' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 09:58:25 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.108:0/449675571' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 09:58:25 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.108:0/807534848' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 09:58:25 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.107:0/3065101984' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 09:58:25 np0005626463.localdomain systemd[1]: tmp-crun.t2fHr6.mount: Deactivated successfully.
Feb 23 09:58:25 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-a6596b8fd2403688e1c2846251827eb8351db900fadb6ba717052caf8eb93089-merged.mount: Deactivated successfully.
Feb 23 09:58:25 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8ab3b14bd71db46ee1e28e4ce44c5c50d8f5aba2d8c1832a51fb8d6387f4bde1-userdata-shm.mount: Deactivated successfully.
Feb 23 09:58:25 np0005626463.localdomain systemd[1]: run-netns-qdhcp\x2d59b1761b\x2d6159\x2d4589\x2db0ab\x2dd24692c6be4a.mount: Deactivated successfully.
Feb 23 09:58:25 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Feb 23 09:58:25 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:58:26 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:58:26.247 2 INFO neutron.agent.securitygroups_rpc [None req-e81a47f4-2a39-4882-ae5e-f110dbf1c96f 982b83c89c37422a910f5359ef7b6ea5 5bed3d4ae9fd4fe7b9440b4587246c14 - - default default] Security group member updated ['3f91b09d-b6ac-403f-adaf-7a684ee36fe5']
Feb 23 09:58:26 np0005626463.localdomain ceph-mon[294160]: pgmap v279: 177 pgs: 177 active+clean; 145 MiB data, 800 MiB used, 41 GiB / 42 GiB avail; 38 KiB/s rd, 1.4 KiB/s wr, 51 op/s
Feb 23 09:58:26 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:58:26 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.
Feb 23 09:58:26 np0005626463.localdomain podman[316130]: 2026-02-23 09:58:26.905987156 +0000 UTC m=+0.077839456 container health_status da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 23 09:58:26 np0005626463.localdomain podman[316130]: 2026-02-23 09:58:26.917487521 +0000 UTC m=+0.089339821 container exec_died da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 23 09:58:26 np0005626463.localdomain systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Deactivated successfully.
Feb 23 09:58:27 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:58:27 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:58:27.801 2 INFO neutron.agent.securitygroups_rpc [None req-99ddfa8e-0299-435b-a099-c8da64e3d700 982b83c89c37422a910f5359ef7b6ea5 5bed3d4ae9fd4fe7b9440b4587246c14 - - default default] Security group member updated ['3f91b09d-b6ac-403f-adaf-7a684ee36fe5']
Feb 23 09:58:28 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:58:28.051 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:58:28 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:58:28.054 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:58:28 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:58:28.055 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:58:28 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:58:28.075 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 09:58:28 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:58:28.075 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 09:58:28 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:58:28.076 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 09:58:28 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:58:28.077 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Auditing locally available compute resources for np0005626463.localdomain (node: np0005626463.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 23 09:58:28 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:58:28.077 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 09:58:28 np0005626463.localdomain ceph-mon[294160]: pgmap v280: 177 pgs: 177 active+clean; 145 MiB data, 801 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:58:28 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:58:28.216 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:58:28 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 23 09:58:28 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/1889799132' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 09:58:28 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:58:28.584 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.507s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 09:58:28 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:58:28.646 282211 DEBUG nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 23 09:58:28 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:58:28.646 282211 DEBUG nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 23 09:58:28 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.
Feb 23 09:58:28 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:58:28.840 282211 WARNING nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 23 09:58:28 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:58:28.842 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Hypervisor/Node resource view: name=np0005626463.localdomain free_ram=11359MB free_disk=41.8366584777832GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 23 09:58:28 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:58:28.842 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 09:58:28 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:58:28.843 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 09:58:28 np0005626463.localdomain podman[316175]: 2026-02-23 09:58:28.888865019 +0000 UTC m=+0.064684299 container health_status 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2026-02-05T04:57:10Z, name=ubi9/ubi-minimal, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.openshift.expose-services=, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.created=2026-02-05T04:57:10Z, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1770267347)
Feb 23 09:58:28 np0005626463.localdomain podman[316175]: 2026-02-23 09:58:28.902180409 +0000 UTC m=+0.077999719 container exec_died 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, org.opencontainers.image.created=2026-02-05T04:57:10Z, vcs-type=git, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2026-02-05T04:57:10Z, release=1770267347, config_id=openstack_network_exporter, distribution-scope=public, vendor=Red Hat, Inc., name=ubi9/ubi-minimal, container_name=openstack_network_exporter, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., version=9.7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9)
Feb 23 09:58:28 np0005626463.localdomain systemd[1]: 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.service: Deactivated successfully.
Feb 23 09:58:28 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:58:28.945 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Instance c2a7d92b-952f-46a7-8a6a-3322a48fcf4b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 23 09:58:28 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:58:28.946 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 23 09:58:28 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:58:28.946 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Final resource view: name=np0005626463.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 23 09:58:29 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:58:29.004 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 09:58:29 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.106:0/1889799132' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 09:58:29 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 23 09:58:29 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/591754470' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 09:58:29 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:58:29.471 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.467s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 09:58:29 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:58:29.478 282211 DEBUG nova.compute.provider_tree [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Inventory has not changed in ProviderTree for provider: be63d86c-a403-4ec9-a515-07ea2962cb4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 23 09:58:29 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:58:29.506 282211 DEBUG nova.scheduler.client.report [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Inventory has not changed for provider be63d86c-a403-4ec9-a515-07ea2962cb4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 23 09:58:29 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:58:29.509 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Compute_service record updated for np0005626463.localdomain:np0005626463.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 23 09:58:29 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:58:29.509 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.667s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 09:58:29 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:58:29.765 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:58:30 np0005626463.localdomain ceph-mon[294160]: pgmap v281: 177 pgs: 177 active+clean; 145 MiB data, 801 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:58:30 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.106:0/591754470' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 09:58:31 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:58:31.510 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:58:31 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:58:31.511 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:58:32 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:58:32 np0005626463.localdomain ceph-mon[294160]: pgmap v282: 177 pgs: 177 active+clean; 145 MiB data, 801 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:58:32 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Feb 23 09:58:32 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2365892405' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 23 09:58:32 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Feb 23 09:58:32 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2365892405' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 23 09:58:33 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:58:33.055 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:58:33 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:58:33.219 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:58:33 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/2365892405' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 23 09:58:33 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/2365892405' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 23 09:58:33 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:58:33.277 265541 INFO neutron.agent.linux.ip_lib [None req-d3ae1095-1940-42af-8546-1ec9cb187347 - - - - - -] Device tapd5a42e1b-50 cannot be used as it has no MAC address
Feb 23 09:58:33 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:58:33.304 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:58:33 np0005626463.localdomain kernel: device tapd5a42e1b-50 entered promiscuous mode
Feb 23 09:58:33 np0005626463.localdomain NetworkManager[5974]: <info>  [1771840713.3132] manager: (tapd5a42e1b-50): new Generic device (/org/freedesktop/NetworkManager/Devices/43)
Feb 23 09:58:33 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:58:33.315 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:58:33 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:58:33Z|00260|binding|INFO|Claiming lport d5a42e1b-5089-41c4-9d02-d28b44b515d2 for this chassis.
Feb 23 09:58:33 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:58:33Z|00261|binding|INFO|d5a42e1b-5089-41c4-9d02-d28b44b515d2: Claiming unknown
Feb 23 09:58:33 np0005626463.localdomain systemd-udevd[316226]: Network interface NamePolicy= disabled on kernel command line.
Feb 23 09:58:33 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:58:33.331 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005626463.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpfb23302c-55c1-5de0-badf-4fc1ff22837a-ff7aa220-5765-44c6-9121-cfbd718241c5', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ff7aa220-5765-44c6-9121-cfbd718241c5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0421515e6bb54dea8db3ed218999e195', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6cbbaed5-c16c-4b6f-96d8-1ef1b1b430f5, chassis=[<ovs.db.idl.Row object at 0x7f808c075610>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f808c075610>], logical_port=d5a42e1b-5089-41c4-9d02-d28b44b515d2) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 09:58:33 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:58:33.335 163572 INFO neutron.agent.ovn.metadata.agent [-] Port d5a42e1b-5089-41c4-9d02-d28b44b515d2 in datapath ff7aa220-5765-44c6-9121-cfbd718241c5 bound to our chassis
Feb 23 09:58:33 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:58:33.338 163572 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network ff7aa220-5765-44c6-9121-cfbd718241c5 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Feb 23 09:58:33 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:58:33.339 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[d22036d3-1127-4cea-8b56-defdc6f40853]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 09:58:33 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tapd5a42e1b-50: No such device
Feb 23 09:58:33 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:58:33Z|00262|binding|INFO|Setting lport d5a42e1b-5089-41c4-9d02-d28b44b515d2 ovn-installed in OVS
Feb 23 09:58:33 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tapd5a42e1b-50: No such device
Feb 23 09:58:33 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:58:33Z|00263|binding|INFO|Setting lport d5a42e1b-5089-41c4-9d02-d28b44b515d2 up in Southbound
Feb 23 09:58:33 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:58:33.356 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:58:33 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tapd5a42e1b-50: No such device
Feb 23 09:58:33 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tapd5a42e1b-50: No such device
Feb 23 09:58:33 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tapd5a42e1b-50: No such device
Feb 23 09:58:33 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tapd5a42e1b-50: No such device
Feb 23 09:58:33 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tapd5a42e1b-50: No such device
Feb 23 09:58:33 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tapd5a42e1b-50: No such device
Feb 23 09:58:33 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:58:33.389 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:58:33 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:58:33.418 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:58:34 np0005626463.localdomain podman[316297]: 
Feb 23 09:58:34 np0005626463.localdomain podman[316297]: 2026-02-23 09:58:34.237039273 +0000 UTC m=+0.093965075 container create e66f962a68cc6113335b6f9dd3920b4164c15c9c93c83a854a3d14ce3266d7bf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ff7aa220-5765-44c6-9121-cfbd718241c5, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0)
Feb 23 09:58:34 np0005626463.localdomain systemd[1]: Started libpod-conmon-e66f962a68cc6113335b6f9dd3920b4164c15c9c93c83a854a3d14ce3266d7bf.scope.
Feb 23 09:58:34 np0005626463.localdomain podman[316297]: 2026-02-23 09:58:34.191409323 +0000 UTC m=+0.048335175 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 23 09:58:34 np0005626463.localdomain ceph-mon[294160]: pgmap v283: 177 pgs: 177 active+clean; 145 MiB data, 801 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:58:34 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 09:58:34 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0a4908f0bc9328306b516915e2de85425e40798a526663541b1416ff04dc528a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 23 09:58:34 np0005626463.localdomain podman[316297]: 2026-02-23 09:58:34.315895009 +0000 UTC m=+0.172820821 container init e66f962a68cc6113335b6f9dd3920b4164c15c9c93c83a854a3d14ce3266d7bf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ff7aa220-5765-44c6-9121-cfbd718241c5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0)
Feb 23 09:58:34 np0005626463.localdomain podman[316297]: 2026-02-23 09:58:34.32499541 +0000 UTC m=+0.181921212 container start e66f962a68cc6113335b6f9dd3920b4164c15c9c93c83a854a3d14ce3266d7bf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ff7aa220-5765-44c6-9121-cfbd718241c5, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.43.0, tcib_managed=true, org.label-schema.vendor=CentOS)
Feb 23 09:58:34 np0005626463.localdomain dnsmasq[316315]: started, version 2.85 cachesize 150
Feb 23 09:58:34 np0005626463.localdomain dnsmasq[316315]: DNS service limited to local subnets
Feb 23 09:58:34 np0005626463.localdomain dnsmasq[316315]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 23 09:58:34 np0005626463.localdomain dnsmasq[316315]: warning: no upstream servers configured
Feb 23 09:58:34 np0005626463.localdomain dnsmasq-dhcp[316315]: DHCP, static leases only on 10.100.0.0, lease time 1d
Feb 23 09:58:34 np0005626463.localdomain dnsmasq[316315]: read /var/lib/neutron/dhcp/ff7aa220-5765-44c6-9121-cfbd718241c5/addn_hosts - 0 addresses
Feb 23 09:58:34 np0005626463.localdomain dnsmasq-dhcp[316315]: read /var/lib/neutron/dhcp/ff7aa220-5765-44c6-9121-cfbd718241c5/host
Feb 23 09:58:34 np0005626463.localdomain dnsmasq-dhcp[316315]: read /var/lib/neutron/dhcp/ff7aa220-5765-44c6-9121-cfbd718241c5/opts
Feb 23 09:58:34 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:58:34.559 265541 INFO neutron.agent.dhcp.agent [None req-47199348-c40d-463e-b620-75c60ffc1f48 - - - - - -] DHCP configuration for ports {'d908999e-0d8c-4805-bfe5-8996d9567d4b'} is completed
Feb 23 09:58:35 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:58:35.172 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:58:35 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/3937103108' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 23 09:58:35 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/3937103108' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 23 09:58:36 np0005626463.localdomain ceph-mon[294160]: pgmap v284: 177 pgs: 177 active+clean; 145 MiB data, 801 MiB used, 41 GiB / 42 GiB avail
Feb 23 09:58:36 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Feb 23 09:58:36 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1157746556' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 23 09:58:36 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Feb 23 09:58:36 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1157746556' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 23 09:58:37 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:58:37 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:58:37.321 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:58:37 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/1157746556' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 23 09:58:37 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/1157746556' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 23 09:58:37 np0005626463.localdomain ceph-mon[294160]: pgmap v285: 177 pgs: 177 active+clean; 145 MiB data, 804 MiB used, 41 GiB / 42 GiB avail; 938 B/s rd, 341 B/s wr, 1 op/s
Feb 23 09:58:38 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:58:38.222 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:58:38 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.
Feb 23 09:58:38 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.
Feb 23 09:58:38 np0005626463.localdomain podman[316316]: 2026-02-23 09:58:38.916436244 +0000 UTC m=+0.089844467 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.build-date=20260216, tcib_managed=true, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_controller, config_id=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb)
Feb 23 09:58:38 np0005626463.localdomain systemd[1]: tmp-crun.Il6Uzl.mount: Deactivated successfully.
Feb 23 09:58:38 np0005626463.localdomain podman[316317]: 2026-02-23 09:58:38.989639376 +0000 UTC m=+0.157031123 container health_status bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Feb 23 09:58:39 np0005626463.localdomain podman[316317]: 2026-02-23 09:58:39.021681676 +0000 UTC m=+0.189073403 container exec_died bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 23 09:58:39 np0005626463.localdomain systemd[1]: bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.service: Deactivated successfully.
Feb 23 09:58:39 np0005626463.localdomain podman[316316]: 2026-02-23 09:58:39.079588574 +0000 UTC m=+0.252996737 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Feb 23 09:58:39 np0005626463.localdomain systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully.
Feb 23 09:58:39 np0005626463.localdomain podman[242954]: time="2026-02-23T09:58:39Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 23 09:58:39 np0005626463.localdomain podman[242954]: @ - - [23/Feb/2026:09:58:39 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 158905 "" "Go-http-client/1.1"
Feb 23 09:58:39 np0005626463.localdomain podman[242954]: @ - - [23/Feb/2026:09:58:39 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19284 "" "Go-http-client/1.1"
Feb 23 09:58:39 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:58:39.699 265541 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-23T09:58:38Z, description=, device_id=c42435b0-1c04-411a-b921-da46209bd2fe, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f28291ccdf0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f2829c2cb80>], id=149dde58-b9e7-4644-99ab-581857e961de, ip_allocation=immediate, mac_address=fa:16:3e:86:80:c6, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-23T09:58:30Z, description=, dns_domain=, id=ff7aa220-5765-44c6-9121-cfbd718241c5, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-VolumesBackupsTest-2004318845-network, port_security_enabled=True, project_id=0421515e6bb54dea8db3ed218999e195, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=34500, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2127, status=ACTIVE, subnets=['8e5f5052-1626-4168-ae6d-3107f2c16e7a'], tags=[], tenant_id=0421515e6bb54dea8db3ed218999e195, updated_at=2026-02-23T09:58:32Z, vlan_transparent=None, network_id=ff7aa220-5765-44c6-9121-cfbd718241c5, port_security_enabled=False, project_id=0421515e6bb54dea8db3ed218999e195, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2185, status=DOWN, tags=[], tenant_id=0421515e6bb54dea8db3ed218999e195, updated_at=2026-02-23T09:58:39Z on network ff7aa220-5765-44c6-9121-cfbd718241c5
Feb 23 09:58:39 np0005626463.localdomain dnsmasq[316315]: read /var/lib/neutron/dhcp/ff7aa220-5765-44c6-9121-cfbd718241c5/addn_hosts - 1 addresses
Feb 23 09:58:39 np0005626463.localdomain dnsmasq-dhcp[316315]: read /var/lib/neutron/dhcp/ff7aa220-5765-44c6-9121-cfbd718241c5/host
Feb 23 09:58:39 np0005626463.localdomain dnsmasq-dhcp[316315]: read /var/lib/neutron/dhcp/ff7aa220-5765-44c6-9121-cfbd718241c5/opts
Feb 23 09:58:39 np0005626463.localdomain podman[316381]: 2026-02-23 09:58:39.918929466 +0000 UTC m=+0.061896503 container kill e66f962a68cc6113335b6f9dd3920b4164c15c9c93c83a854a3d14ce3266d7bf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ff7aa220-5765-44c6-9121-cfbd718241c5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.build-date=20260216, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Feb 23 09:58:40 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:58:40.161 265541 INFO neutron.agent.dhcp.agent [None req-7ac5cd57-3586-4f5d-8634-63691d8f7931 - - - - - -] DHCP configuration for ports {'149dde58-b9e7-4644-99ab-581857e961de'} is completed
Feb 23 09:58:40 np0005626463.localdomain ceph-mon[294160]: pgmap v286: 177 pgs: 177 active+clean; 145 MiB data, 804 MiB used, 41 GiB / 42 GiB avail; 938 B/s rd, 341 B/s wr, 1 op/s
Feb 23 09:58:40 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:58:40.903 265541 INFO neutron.agent.linux.ip_lib [None req-a0dde59b-31ed-4f7b-8c73-25dda12e803d - - - - - -] Device tap99eeaa57-51 cannot be used as it has no MAC address
Feb 23 09:58:40 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:58:40.927 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:58:40 np0005626463.localdomain kernel: device tap99eeaa57-51 entered promiscuous mode
Feb 23 09:58:40 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:58:40Z|00264|binding|INFO|Claiming lport 99eeaa57-5103-4a27-8cb6-f740d5ffcf80 for this chassis.
Feb 23 09:58:40 np0005626463.localdomain NetworkManager[5974]: <info>  [1771840720.9334] manager: (tap99eeaa57-51): new Generic device (/org/freedesktop/NetworkManager/Devices/44)
Feb 23 09:58:40 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:58:40Z|00265|binding|INFO|99eeaa57-5103-4a27-8cb6-f740d5ffcf80: Claiming unknown
Feb 23 09:58:40 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:58:40.933 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:58:40 np0005626463.localdomain systemd-udevd[316413]: Network interface NamePolicy= disabled on kernel command line.
Feb 23 09:58:40 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:58:40.945 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005626463.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpfb23302c-55c1-5de0-badf-4fc1ff22837a-fd854bec-4386-47ab-bc93-a08354b81ab6', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fd854bec-4386-47ab-bc93-a08354b81ab6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6a6c1ed33b7a401e921451e25668daed', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c8ac0cd8-62b1-44f3-b0a9-7a358af2ef4f, chassis=[<ovs.db.idl.Row object at 0x7f808c075610>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f808c075610>], logical_port=99eeaa57-5103-4a27-8cb6-f740d5ffcf80) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 09:58:40 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:58:40.947 163572 INFO neutron.agent.ovn.metadata.agent [-] Port 99eeaa57-5103-4a27-8cb6-f740d5ffcf80 in datapath fd854bec-4386-47ab-bc93-a08354b81ab6 bound to our chassis
Feb 23 09:58:40 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:58:40.949 163572 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network fd854bec-4386-47ab-bc93-a08354b81ab6 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Feb 23 09:58:40 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:58:40.951 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[33d2122f-10ea-42f5-a681-46aff8deb8f1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 09:58:40 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap99eeaa57-51: No such device
Feb 23 09:58:40 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:58:40Z|00266|binding|INFO|Setting lport 99eeaa57-5103-4a27-8cb6-f740d5ffcf80 ovn-installed in OVS
Feb 23 09:58:40 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:58:40Z|00267|binding|INFO|Setting lport 99eeaa57-5103-4a27-8cb6-f740d5ffcf80 up in Southbound
Feb 23 09:58:40 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:58:40.968 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:58:40 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap99eeaa57-51: No such device
Feb 23 09:58:40 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap99eeaa57-51: No such device
Feb 23 09:58:40 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap99eeaa57-51: No such device
Feb 23 09:58:40 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap99eeaa57-51: No such device
Feb 23 09:58:40 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap99eeaa57-51: No such device
Feb 23 09:58:41 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap99eeaa57-51: No such device
Feb 23 09:58:41 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap99eeaa57-51: No such device
Feb 23 09:58:41 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:58:41.011 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:58:41 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:58:41.039 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:58:41 np0005626463.localdomain podman[316484]: 
Feb 23 09:58:41 np0005626463.localdomain podman[316484]: 2026-02-23 09:58:41.954430084 +0000 UTC m=+0.095259224 container create 09dad08e9c72d7d1b99bbbc519a732dc2042ec07079abeaba60f983f4d203137 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fd854bec-4386-47ab-bc93-a08354b81ab6, io.buildah.version=1.43.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true)
Feb 23 09:58:42 np0005626463.localdomain systemd[1]: Started libpod-conmon-09dad08e9c72d7d1b99bbbc519a732dc2042ec07079abeaba60f983f4d203137.scope.
Feb 23 09:58:42 np0005626463.localdomain podman[316484]: 2026-02-23 09:58:41.909546218 +0000 UTC m=+0.050375418 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 23 09:58:42 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 09:58:42 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4b4d99b1f623479e08c1ba172833c2002e032a9da0c51ca920aa6a9e5f04cbc7/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 23 09:58:42 np0005626463.localdomain podman[316484]: 2026-02-23 09:58:42.03552608 +0000 UTC m=+0.176355220 container init 09dad08e9c72d7d1b99bbbc519a732dc2042ec07079abeaba60f983f4d203137 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fd854bec-4386-47ab-bc93-a08354b81ab6, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS)
Feb 23 09:58:42 np0005626463.localdomain podman[316484]: 2026-02-23 09:58:42.045752475 +0000 UTC m=+0.186581615 container start 09dad08e9c72d7d1b99bbbc519a732dc2042ec07079abeaba60f983f4d203137 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fd854bec-4386-47ab-bc93-a08354b81ab6, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216)
Feb 23 09:58:42 np0005626463.localdomain dnsmasq[316503]: started, version 2.85 cachesize 150
Feb 23 09:58:42 np0005626463.localdomain dnsmasq[316503]: DNS service limited to local subnets
Feb 23 09:58:42 np0005626463.localdomain dnsmasq[316503]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 23 09:58:42 np0005626463.localdomain dnsmasq[316503]: warning: no upstream servers configured
Feb 23 09:58:42 np0005626463.localdomain dnsmasq-dhcp[316503]: DHCP, static leases only on 10.100.0.0, lease time 1d
Feb 23 09:58:42 np0005626463.localdomain dnsmasq[316503]: read /var/lib/neutron/dhcp/fd854bec-4386-47ab-bc93-a08354b81ab6/addn_hosts - 0 addresses
Feb 23 09:58:42 np0005626463.localdomain dnsmasq-dhcp[316503]: read /var/lib/neutron/dhcp/fd854bec-4386-47ab-bc93-a08354b81ab6/host
Feb 23 09:58:42 np0005626463.localdomain dnsmasq-dhcp[316503]: read /var/lib/neutron/dhcp/fd854bec-4386-47ab-bc93-a08354b81ab6/opts
Feb 23 09:58:42 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:58:42 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:58:42.245 265541 INFO neutron.agent.dhcp.agent [None req-4a7eaf64-14f3-44eb-9e50-50827eb5357e - - - - - -] DHCP configuration for ports {'3e53d539-7a24-4706-8d58-d2ed9e960162'} is completed
Feb 23 09:58:42 np0005626463.localdomain ceph-mon[294160]: pgmap v287: 177 pgs: 177 active+clean; 145 MiB data, 804 MiB used, 41 GiB / 42 GiB avail; 10 KiB/s rd, 597 B/s wr, 14 op/s
Feb 23 09:58:42 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:58:42.707 265541 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-23T09:58:38Z, description=, device_id=c42435b0-1c04-411a-b921-da46209bd2fe, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f28291ece50>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f28291ecbb0>], id=149dde58-b9e7-4644-99ab-581857e961de, ip_allocation=immediate, mac_address=fa:16:3e:86:80:c6, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-23T09:58:30Z, description=, dns_domain=, id=ff7aa220-5765-44c6-9121-cfbd718241c5, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-VolumesBackupsTest-2004318845-network, port_security_enabled=True, project_id=0421515e6bb54dea8db3ed218999e195, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=34500, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2127, status=ACTIVE, subnets=['8e5f5052-1626-4168-ae6d-3107f2c16e7a'], tags=[], tenant_id=0421515e6bb54dea8db3ed218999e195, updated_at=2026-02-23T09:58:32Z, vlan_transparent=None, network_id=ff7aa220-5765-44c6-9121-cfbd718241c5, port_security_enabled=False, project_id=0421515e6bb54dea8db3ed218999e195, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2185, status=DOWN, tags=[], tenant_id=0421515e6bb54dea8db3ed218999e195, updated_at=2026-02-23T09:58:39Z on network ff7aa220-5765-44c6-9121-cfbd718241c5
Feb 23 09:58:42 np0005626463.localdomain dnsmasq[316315]: read /var/lib/neutron/dhcp/ff7aa220-5765-44c6-9121-cfbd718241c5/addn_hosts - 1 addresses
Feb 23 09:58:42 np0005626463.localdomain dnsmasq-dhcp[316315]: read /var/lib/neutron/dhcp/ff7aa220-5765-44c6-9121-cfbd718241c5/host
Feb 23 09:58:42 np0005626463.localdomain dnsmasq-dhcp[316315]: read /var/lib/neutron/dhcp/ff7aa220-5765-44c6-9121-cfbd718241c5/opts
Feb 23 09:58:42 np0005626463.localdomain podman[316521]: 2026-02-23 09:58:42.968815234 +0000 UTC m=+0.062412590 container kill e66f962a68cc6113335b6f9dd3920b4164c15c9c93c83a854a3d14ce3266d7bf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ff7aa220-5765-44c6-9121-cfbd718241c5, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.build-date=20260216, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0)
Feb 23 09:58:43 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:58:43.225 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:58:43 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:58:43.294 265541 INFO neutron.agent.dhcp.agent [None req-1e6d6e7f-5755-4b2f-a32d-639afa4ca577 - - - - - -] DHCP configuration for ports {'149dde58-b9e7-4644-99ab-581857e961de'} is completed
Feb 23 09:58:43 np0005626463.localdomain openstack_network_exporter[245358]: ERROR   09:58:43 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 23 09:58:43 np0005626463.localdomain openstack_network_exporter[245358]: 
Feb 23 09:58:43 np0005626463.localdomain openstack_network_exporter[245358]: ERROR   09:58:43 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 23 09:58:43 np0005626463.localdomain openstack_network_exporter[245358]: 
Feb 23 09:58:43 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.
Feb 23 09:58:43 np0005626463.localdomain podman[316542]: 2026-02-23 09:58:43.919923638 +0000 UTC m=+0.087774172 container health_status be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ceilometer_agent_compute, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible)
Feb 23 09:58:43 np0005626463.localdomain podman[316542]: 2026-02-23 09:58:43.938431841 +0000 UTC m=+0.106282385 container exec_died be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ceilometer_agent_compute, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Feb 23 09:58:43 np0005626463.localdomain systemd[1]: be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.service: Deactivated successfully.
Feb 23 09:58:44 np0005626463.localdomain ceph-mon[294160]: pgmap v288: 177 pgs: 177 active+clean; 145 MiB data, 804 MiB used, 41 GiB / 42 GiB avail; 10 KiB/s rd, 597 B/s wr, 14 op/s
Feb 23 09:58:45 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:58:45.966 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:58:46 np0005626463.localdomain ceph-mon[294160]: pgmap v289: 177 pgs: 177 active+clean; 145 MiB data, 804 MiB used, 41 GiB / 42 GiB avail; 10 KiB/s rd, 597 B/s wr, 14 op/s
Feb 23 09:58:46 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:58:46.745 265541 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-23T09:58:46Z, description=, device_id=c5c52445-4186-4d74-aba9-70f654d51933, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f28291ec700>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f28291ecf10>], id=1a2c6d24-642f-408f-84d0-5d51e879d5dc, ip_allocation=immediate, mac_address=fa:16:3e:ab:19:2f, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-23T09:58:36Z, description=, dns_domain=, id=fd854bec-4386-47ab-bc93-a08354b81ab6, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-VolumesActionsTest-716275198-network, port_security_enabled=True, project_id=6a6c1ed33b7a401e921451e25668daed, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=59549, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2181, status=ACTIVE, subnets=['1cbee800-b3d4-414b-b23f-90c05ceb0493'], tags=[], tenant_id=6a6c1ed33b7a401e921451e25668daed, updated_at=2026-02-23T09:58:39Z, vlan_transparent=None, network_id=fd854bec-4386-47ab-bc93-a08354b81ab6, port_security_enabled=False, project_id=6a6c1ed33b7a401e921451e25668daed, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2232, status=DOWN, tags=[], tenant_id=6a6c1ed33b7a401e921451e25668daed, updated_at=2026-02-23T09:58:46Z on network fd854bec-4386-47ab-bc93-a08354b81ab6
Feb 23 09:58:46 np0005626463.localdomain podman[316578]: 2026-02-23 09:58:46.994318224 +0000 UTC m=+0.064350689 container kill 09dad08e9c72d7d1b99bbbc519a732dc2042ec07079abeaba60f983f4d203137 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fd854bec-4386-47ab-bc93-a08354b81ab6, org.label-schema.build-date=20260216, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.43.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Feb 23 09:58:46 np0005626463.localdomain dnsmasq[316503]: read /var/lib/neutron/dhcp/fd854bec-4386-47ab-bc93-a08354b81ab6/addn_hosts - 1 addresses
Feb 23 09:58:46 np0005626463.localdomain dnsmasq-dhcp[316503]: read /var/lib/neutron/dhcp/fd854bec-4386-47ab-bc93-a08354b81ab6/host
Feb 23 09:58:46 np0005626463.localdomain dnsmasq-dhcp[316503]: read /var/lib/neutron/dhcp/fd854bec-4386-47ab-bc93-a08354b81ab6/opts
Feb 23 09:58:47 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.
Feb 23 09:58:47 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:58:47 np0005626463.localdomain systemd[1]: tmp-crun.XTueAG.mount: Deactivated successfully.
Feb 23 09:58:47 np0005626463.localdomain podman[316593]: 2026-02-23 09:58:47.1233449 +0000 UTC m=+0.103726525 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Feb 23 09:58:47 np0005626463.localdomain podman[316593]: 2026-02-23 09:58:47.156388522 +0000 UTC m=+0.136770177 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.build-date=20260216, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, io.buildah.version=1.43.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb)
Feb 23 09:58:47 np0005626463.localdomain systemd[1]: 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully.
Feb 23 09:58:47 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:58:47.271 265541 INFO neutron.agent.dhcp.agent [None req-0fe33b19-dda4-4592-a150-29072e1cc8d8 - - - - - -] DHCP configuration for ports {'1a2c6d24-642f-408f-84d0-5d51e879d5dc'} is completed
Feb 23 09:58:48 np0005626463.localdomain ceph-mon[294160]: pgmap v290: 177 pgs: 177 active+clean; 145 MiB data, 804 MiB used, 41 GiB / 42 GiB avail; 10 KiB/s rd, 597 B/s wr, 14 op/s
Feb 23 09:58:48 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:58:48.271 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:58:48 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:58:48.293 265541 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-23T09:58:46Z, description=, device_id=c5c52445-4186-4d74-aba9-70f654d51933, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f282937ca60>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f282937cfa0>], id=1a2c6d24-642f-408f-84d0-5d51e879d5dc, ip_allocation=immediate, mac_address=fa:16:3e:ab:19:2f, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-23T09:58:36Z, description=, dns_domain=, id=fd854bec-4386-47ab-bc93-a08354b81ab6, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-VolumesActionsTest-716275198-network, port_security_enabled=True, project_id=6a6c1ed33b7a401e921451e25668daed, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=59549, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2181, status=ACTIVE, subnets=['1cbee800-b3d4-414b-b23f-90c05ceb0493'], tags=[], tenant_id=6a6c1ed33b7a401e921451e25668daed, updated_at=2026-02-23T09:58:39Z, vlan_transparent=None, network_id=fd854bec-4386-47ab-bc93-a08354b81ab6, port_security_enabled=False, project_id=6a6c1ed33b7a401e921451e25668daed, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2232, status=DOWN, tags=[], tenant_id=6a6c1ed33b7a401e921451e25668daed, updated_at=2026-02-23T09:58:46Z on network fd854bec-4386-47ab-bc93-a08354b81ab6
Feb 23 09:58:48 np0005626463.localdomain dnsmasq[316503]: read /var/lib/neutron/dhcp/fd854bec-4386-47ab-bc93-a08354b81ab6/addn_hosts - 1 addresses
Feb 23 09:58:48 np0005626463.localdomain podman[316636]: 2026-02-23 09:58:48.531442974 +0000 UTC m=+0.060927133 container kill 09dad08e9c72d7d1b99bbbc519a732dc2042ec07079abeaba60f983f4d203137 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fd854bec-4386-47ab-bc93-a08354b81ab6, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team)
Feb 23 09:58:48 np0005626463.localdomain dnsmasq-dhcp[316503]: read /var/lib/neutron/dhcp/fd854bec-4386-47ab-bc93-a08354b81ab6/host
Feb 23 09:58:48 np0005626463.localdomain dnsmasq-dhcp[316503]: read /var/lib/neutron/dhcp/fd854bec-4386-47ab-bc93-a08354b81ab6/opts
Feb 23 09:58:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:58:48.558 163572 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 09:58:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:58:48.559 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 09:58:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:58:48.560 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 09:58:48 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:58:48.775 265541 INFO neutron.agent.dhcp.agent [None req-501debf8-4dac-46d2-a51e-8d2330460ac6 - - - - - -] DHCP configuration for ports {'1a2c6d24-642f-408f-84d0-5d51e879d5dc'} is completed
Feb 23 09:58:48 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:58:48.888 265541 INFO neutron.agent.linux.ip_lib [None req-0693f31f-1e33-4e13-93a5-4bf59a810c38 - - - - - -] Device tap244c57bd-d2 cannot be used as it has no MAC address
Feb 23 09:58:48 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:58:48.913 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:58:48 np0005626463.localdomain kernel: device tap244c57bd-d2 entered promiscuous mode
Feb 23 09:58:48 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:58:48Z|00268|binding|INFO|Claiming lport 244c57bd-d22b-4733-8be3-0ce88383151e for this chassis.
Feb 23 09:58:48 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:58:48Z|00269|binding|INFO|244c57bd-d22b-4733-8be3-0ce88383151e: Claiming unknown
Feb 23 09:58:48 np0005626463.localdomain NetworkManager[5974]: <info>  [1771840728.9207] manager: (tap244c57bd-d2): new Generic device (/org/freedesktop/NetworkManager/Devices/45)
Feb 23 09:58:48 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:58:48.924 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:58:48 np0005626463.localdomain systemd-udevd[316668]: Network interface NamePolicy= disabled on kernel command line.
Feb 23 09:58:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:58:48.940 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005626463.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.103.0.2/28', 'neutron:device_id': 'dhcpfb23302c-55c1-5de0-badf-4fc1ff22837a-169a9bd5-a623-4b9d-83a6-bf3f6708a358', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-169a9bd5-a623-4b9d-83a6-bf3f6708a358', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6aadd525d3dd402cb701922115d00291', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f70b6503-965d-4883-bf00-ea5f7a873818, chassis=[<ovs.db.idl.Row object at 0x7f808c075610>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f808c075610>], logical_port=244c57bd-d22b-4733-8be3-0ce88383151e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 09:58:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:58:48.942 163572 INFO neutron.agent.ovn.metadata.agent [-] Port 244c57bd-d22b-4733-8be3-0ce88383151e in datapath 169a9bd5-a623-4b9d-83a6-bf3f6708a358 bound to our chassis
Feb 23 09:58:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:58:48.945 163572 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 169a9bd5-a623-4b9d-83a6-bf3f6708a358 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Feb 23 09:58:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:58:48.945 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[0a8d18cb-975c-4bf5-a185-bfb454bb6e98]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 09:58:48 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap244c57bd-d2: No such device
Feb 23 09:58:48 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:58:48Z|00270|binding|INFO|Setting lport 244c57bd-d22b-4733-8be3-0ce88383151e ovn-installed in OVS
Feb 23 09:58:48 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:58:48Z|00271|binding|INFO|Setting lport 244c57bd-d22b-4733-8be3-0ce88383151e up in Southbound
Feb 23 09:58:48 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:58:48.957 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:58:48 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:58:48.958 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:58:48 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap244c57bd-d2: No such device
Feb 23 09:58:48 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap244c57bd-d2: No such device
Feb 23 09:58:48 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap244c57bd-d2: No such device
Feb 23 09:58:48 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap244c57bd-d2: No such device
Feb 23 09:58:48 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap244c57bd-d2: No such device
Feb 23 09:58:48 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap244c57bd-d2: No such device
Feb 23 09:58:48 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap244c57bd-d2: No such device
Feb 23 09:58:49 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:58:49.006 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:58:49 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:58:49.039 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:58:50 np0005626463.localdomain podman[316739]: 
Feb 23 09:58:50 np0005626463.localdomain podman[316739]: 2026-02-23 09:58:50.12474214 +0000 UTC m=+0.088238017 container create 99a319c90f4ac1870108199a405d4c36696c9613c78962ecaa9ad4ef6f1bd576 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-169a9bd5-a623-4b9d-83a6-bf3f6708a358, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0)
Feb 23 09:58:50 np0005626463.localdomain systemd[1]: Started libpod-conmon-99a319c90f4ac1870108199a405d4c36696c9613c78962ecaa9ad4ef6f1bd576.scope.
Feb 23 09:58:50 np0005626463.localdomain podman[316739]: 2026-02-23 09:58:50.081994649 +0000 UTC m=+0.045490586 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 23 09:58:50 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 09:58:50 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/74cfcad24671a33e4fe137c8b775f0c8c7b0e5f093f9846bc6f205eac1c88d45/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 23 09:58:50 np0005626463.localdomain podman[316739]: 2026-02-23 09:58:50.209728586 +0000 UTC m=+0.173224463 container init 99a319c90f4ac1870108199a405d4c36696c9613c78962ecaa9ad4ef6f1bd576 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-169a9bd5-a623-4b9d-83a6-bf3f6708a358, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20260216, tcib_managed=true)
Feb 23 09:58:50 np0005626463.localdomain podman[316739]: 2026-02-23 09:58:50.219032253 +0000 UTC m=+0.182528130 container start 99a319c90f4ac1870108199a405d4c36696c9613c78962ecaa9ad4ef6f1bd576 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-169a9bd5-a623-4b9d-83a6-bf3f6708a358, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 23 09:58:50 np0005626463.localdomain dnsmasq[316758]: started, version 2.85 cachesize 150
Feb 23 09:58:50 np0005626463.localdomain dnsmasq[316758]: DNS service limited to local subnets
Feb 23 09:58:50 np0005626463.localdomain dnsmasq[316758]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 23 09:58:50 np0005626463.localdomain dnsmasq[316758]: warning: no upstream servers configured
Feb 23 09:58:50 np0005626463.localdomain dnsmasq-dhcp[316758]: DHCP, static leases only on 10.103.0.0, lease time 1d
Feb 23 09:58:50 np0005626463.localdomain dnsmasq[316758]: read /var/lib/neutron/dhcp/169a9bd5-a623-4b9d-83a6-bf3f6708a358/addn_hosts - 0 addresses
Feb 23 09:58:50 np0005626463.localdomain dnsmasq-dhcp[316758]: read /var/lib/neutron/dhcp/169a9bd5-a623-4b9d-83a6-bf3f6708a358/host
Feb 23 09:58:50 np0005626463.localdomain dnsmasq-dhcp[316758]: read /var/lib/neutron/dhcp/169a9bd5-a623-4b9d-83a6-bf3f6708a358/opts
Feb 23 09:58:50 np0005626463.localdomain ceph-mon[294160]: pgmap v291: 177 pgs: 177 active+clean; 145 MiB data, 804 MiB used, 41 GiB / 42 GiB avail; 9.5 KiB/s rd, 255 B/s wr, 12 op/s
Feb 23 09:58:50 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:58:50.425 265541 INFO neutron.agent.dhcp.agent [None req-346905fc-8667-42f4-96f4-e0f0e63b898a - - - - - -] DHCP configuration for ports {'d5e8428f-bde0-41bf-9c99-6da36b6352c2'} is completed
Feb 23 09:58:51 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/1412528347' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 23 09:58:51 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/1412528347' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 23 09:58:51 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:58:51.391 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4b:79:d0 10.100.0.2 2001:db8::f816:3eff:fe4b:79d0'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fe4b:79d0/64', 'neutron:device_id': 'ovnmeta-c73ae202-1b92-44f9-b55c-a6eba0a348b0', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c73ae202-1b92-44f9-b55c-a6eba0a348b0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5bed3d4ae9fd4fe7b9440b4587246c14', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b8f6a957-d664-4eea-abeb-3145dfe24d16, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=f9f417b1-97ff-4889-a004-ebc80afbc25a) old=Port_Binding(mac=['fa:16:3e:4b:79:d0 2001:db8::f816:3eff:fe4b:79d0'], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe4b:79d0/64', 'neutron:device_id': 'ovnmeta-c73ae202-1b92-44f9-b55c-a6eba0a348b0', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c73ae202-1b92-44f9-b55c-a6eba0a348b0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5bed3d4ae9fd4fe7b9440b4587246c14', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 09:58:51 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:58:51.393 163572 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port f9f417b1-97ff-4889-a004-ebc80afbc25a in datapath c73ae202-1b92-44f9-b55c-a6eba0a348b0 updated
Feb 23 09:58:51 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:58:51.397 163572 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c73ae202-1b92-44f9-b55c-a6eba0a348b0, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 23 09:58:51 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:58:51.398 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[742bb80e-ef74-42c4-ad92-d46f92e656fe]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 09:58:51 np0005626463.localdomain sshd[316759]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:58:51 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:58:51.457 265541 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-23T09:58:50Z, description=, device_id=cfa27e09-aa47-4679-8e70-2bef8c0fc3b1, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f282936cd90>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f282936c730>], id=da6ff8ec-6264-4847-beac-d4de76d2b819, ip_allocation=immediate, mac_address=fa:16:3e:fd:a2:ba, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-23T09:58:45Z, description=, dns_domain=, id=169a9bd5-a623-4b9d-83a6-bf3f6708a358, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersTest-1315899538, port_security_enabled=True, project_id=6aadd525d3dd402cb701922115d00291, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=65364, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2230, status=ACTIVE, subnets=['3a493491-cb34-442e-b652-e18d508e6e1e'], tags=[], tenant_id=6aadd525d3dd402cb701922115d00291, updated_at=2026-02-23T09:58:47Z, vlan_transparent=None, network_id=169a9bd5-a623-4b9d-83a6-bf3f6708a358, port_security_enabled=False, project_id=6aadd525d3dd402cb701922115d00291, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2249, status=DOWN, tags=[], tenant_id=6aadd525d3dd402cb701922115d00291, updated_at=2026-02-23T09:58:51Z on network 169a9bd5-a623-4b9d-83a6-bf3f6708a358
Feb 23 09:58:51 np0005626463.localdomain dnsmasq[316758]: read /var/lib/neutron/dhcp/169a9bd5-a623-4b9d-83a6-bf3f6708a358/addn_hosts - 1 addresses
Feb 23 09:58:51 np0005626463.localdomain dnsmasq-dhcp[316758]: read /var/lib/neutron/dhcp/169a9bd5-a623-4b9d-83a6-bf3f6708a358/host
Feb 23 09:58:51 np0005626463.localdomain dnsmasq-dhcp[316758]: read /var/lib/neutron/dhcp/169a9bd5-a623-4b9d-83a6-bf3f6708a358/opts
Feb 23 09:58:51 np0005626463.localdomain podman[316778]: 2026-02-23 09:58:51.704758165 +0000 UTC m=+0.057411844 container kill 99a319c90f4ac1870108199a405d4c36696c9613c78962ecaa9ad4ef6f1bd576 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-169a9bd5-a623-4b9d-83a6-bf3f6708a358, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Feb 23 09:58:51 np0005626463.localdomain sshd[316759]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:58:52 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:58:52.006 265541 INFO neutron.agent.dhcp.agent [None req-7a20fdea-f7d6-4b00-9145-d1af8673ce12 - - - - - -] DHCP configuration for ports {'da6ff8ec-6264-4847-beac-d4de76d2b819'} is completed
Feb 23 09:58:52 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:58:52 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:58:52.187 2 INFO neutron.agent.securitygroups_rpc [None req-a57c617c-c4ee-4d55-b7f0-53311658d2fd 982b83c89c37422a910f5359ef7b6ea5 5bed3d4ae9fd4fe7b9440b4587246c14 - - default default] Security group member updated ['3f91b09d-b6ac-403f-adaf-7a684ee36fe5']
Feb 23 09:58:52 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:58:52.418 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=17, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '22:68:bc', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'c6:19:65:94:49:af'}, ipsec=False) old=SB_Global(nb_cfg=16) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 09:58:52 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:58:52.420 163572 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 23 09:58:52 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:58:52.421 163572 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=96b5bb93-7341-4ce6-9b93-6a5de566c711, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '17'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 09:58:52 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:58:52.461 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:58:52 np0005626463.localdomain ceph-mon[294160]: pgmap v292: 177 pgs: 177 active+clean; 145 MiB data, 804 MiB used, 41 GiB / 42 GiB avail; 1.4 MiB/s rd, 767 B/s wr, 27 op/s
Feb 23 09:58:53 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:58:53.278 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:58:53 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Feb 23 09:58:53 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3806430089' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 23 09:58:53 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Feb 23 09:58:53 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3806430089' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 23 09:58:53 np0005626463.localdomain ceph-mon[294160]: pgmap v293: 177 pgs: 177 active+clean; 145 MiB data, 805 MiB used, 41 GiB / 42 GiB avail; 3.4 MiB/s rd, 767 B/s wr, 28 op/s
Feb 23 09:58:53 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/3806430089' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 23 09:58:53 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/3806430089' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 23 09:58:53 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:58:53.559 265541 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-23T09:58:50Z, description=, device_id=cfa27e09-aa47-4679-8e70-2bef8c0fc3b1, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f2829321be0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f2829247a30>], id=da6ff8ec-6264-4847-beac-d4de76d2b819, ip_allocation=immediate, mac_address=fa:16:3e:fd:a2:ba, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-23T09:58:45Z, description=, dns_domain=, id=169a9bd5-a623-4b9d-83a6-bf3f6708a358, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersTest-1315899538, port_security_enabled=True, project_id=6aadd525d3dd402cb701922115d00291, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=65364, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2230, status=ACTIVE, subnets=['3a493491-cb34-442e-b652-e18d508e6e1e'], tags=[], tenant_id=6aadd525d3dd402cb701922115d00291, updated_at=2026-02-23T09:58:47Z, vlan_transparent=None, network_id=169a9bd5-a623-4b9d-83a6-bf3f6708a358, port_security_enabled=False, project_id=6aadd525d3dd402cb701922115d00291, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2249, status=DOWN, tags=[], tenant_id=6aadd525d3dd402cb701922115d00291, updated_at=2026-02-23T09:58:51Z on network 169a9bd5-a623-4b9d-83a6-bf3f6708a358
Feb 23 09:58:53 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:58:53.586 2 INFO neutron.agent.securitygroups_rpc [None req-bc2e33ea-0cb8-4312-bf00-811550151f9a 982b83c89c37422a910f5359ef7b6ea5 5bed3d4ae9fd4fe7b9440b4587246c14 - - default default] Security group member updated ['3f91b09d-b6ac-403f-adaf-7a684ee36fe5']
Feb 23 09:58:53 np0005626463.localdomain dnsmasq[316758]: read /var/lib/neutron/dhcp/169a9bd5-a623-4b9d-83a6-bf3f6708a358/addn_hosts - 1 addresses
Feb 23 09:58:53 np0005626463.localdomain dnsmasq-dhcp[316758]: read /var/lib/neutron/dhcp/169a9bd5-a623-4b9d-83a6-bf3f6708a358/host
Feb 23 09:58:53 np0005626463.localdomain podman[316816]: 2026-02-23 09:58:53.8139679 +0000 UTC m=+0.060674606 container kill 99a319c90f4ac1870108199a405d4c36696c9613c78962ecaa9ad4ef6f1bd576 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-169a9bd5-a623-4b9d-83a6-bf3f6708a358, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb)
Feb 23 09:58:53 np0005626463.localdomain dnsmasq-dhcp[316758]: read /var/lib/neutron/dhcp/169a9bd5-a623-4b9d-83a6-bf3f6708a358/opts
Feb 23 09:58:54 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:58:54.061 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:58:54 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:58:54.105 265541 INFO neutron.agent.dhcp.agent [None req-feeb937f-ece0-4b1b-91b1-cfc429dec68b - - - - - -] DHCP configuration for ports {'da6ff8ec-6264-4847-beac-d4de76d2b819'} is completed
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.151 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'name': 'test', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000003', 'OS-EXT-SRV-ATTR:host': 'np0005626463.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '37b8098efb0d4ecc90b451a2db0e966f', 'user_id': 'cb6895487918456aa599ca2f76872d00', 'hostId': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.152 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.157 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.162 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f984c4e8-b3ca-42e5-8ac6-b8996d34ad92', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:58:56.152826', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '4447247c-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12176.342317522, 'message_signature': 'dca78750f935e2471be500abbc7563f127c6bbd723446997d7ca144886e02515'}]}, 'timestamp': '2026-02-23 09:58:56.159746', '_unique_id': '5100e6f474ad43f3b3a5e198e98deeeb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.162 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.162 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.162 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.162 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.162 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.162 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.162 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.162 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.162 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.162 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.162 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.162 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.162 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.162 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.162 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.162 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.162 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.162 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.162 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.162 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.162 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.162 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.162 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.162 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.162 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.162 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.162 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.162 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.162 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.162 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.162 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.164 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.196 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.197 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.198 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c5e62cdf-6e36-44d4-97d3-dc3e9ea97823', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:58:56.164549', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '444cf974-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12176.354036264, 'message_signature': '04e7249fe5a9f96a8dc29d943288b122786d01b424b40410fde1e035804401e8'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:58:56.164549', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '444d0752-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12176.354036264, 'message_signature': 'ad8235436fb1eff9f50584497938b6d5e2d6168c1fae94f46e0a881add380c68'}]}, 'timestamp': '2026-02-23 09:58:56.197344', '_unique_id': 'd1c70a1bbfa6463c8ad09f6a36a4162c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.198 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.198 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.198 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.198 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.198 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.198 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.198 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.198 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.198 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.198 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.198 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.198 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.198 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.198 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.198 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.198 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.198 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.198 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.198 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.198 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.198 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.198 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.198 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.198 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.198 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.198 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.198 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.198 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.198 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.198 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.198 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.199 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.199 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.latency volume: 1054797520 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.199 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.latency volume: 21338362 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.200 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1a07e759-3c80-4b14-8917-bf2010a02a0e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1054797520, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:58:56.199390', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '444d6292-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12176.354036264, 'message_signature': '3e7849fb1959627cfb6c572a705e5bfb8f59187eed237f933f8edb84413e31bd'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 21338362, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 
'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:58:56.199390', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '444d7494-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12176.354036264, 'message_signature': 'ef568f22599a855d569fdbc0b6de88f53c0760314a6769b67d524ccb73cebbde'}]}, 'timestamp': '2026-02-23 09:58:56.200131', '_unique_id': 'dbe2215518d54ac68725b66ac163aa4e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.200 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.200 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.200 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.200 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.200 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.200 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.200 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.200 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.200 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.200 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.200 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.200 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.200 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.200 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.200 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.200 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.200 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.200 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.200 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.200 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.200 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.200 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.200 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.200 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.200 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.200 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.200 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.200 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.200 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.200 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.200 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.201 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.201 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.201 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.bytes volume: 35597312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.202 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.203 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c812296c-6285-4143-aeac-fcbb87e11a9c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35597312, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:58:56.201663', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '444dbac6-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12176.354036264, 'message_signature': 'b170db6bdaaedca6c9e449a77091700248bbe32634bed7583591170607a4b986'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 
'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:58:56.201663', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '444dcc46-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12176.354036264, 'message_signature': '37c4fffb19a085e76f5ec3fa1d0d7a44b139704b774ecc86cf44b36f0b099ef4'}]}, 'timestamp': '2026-02-23 09:58:56.202373', '_unique_id': '36a198a83de04422815e4c2cb093c8d7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.203 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.203 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.203 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.203 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.203 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.203 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.203 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.203 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.203 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.203 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.203 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.203 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.203 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.203 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.203 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.203 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.203 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.203 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.203 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.203 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.203 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.203 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.203 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.203 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.203 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.203 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.203 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.203 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.203 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.203 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.203 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.203 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.203 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.latency volume: 1374424344 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.204 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.latency volume: 89322858 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.205 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '60d31a62-345d-4652-b9c7-f1362b3219da', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1374424344, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:58:56.203809', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '444e0f9e-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12176.354036264, 'message_signature': 'dd960f0e86c00c7f38afd7a34df990a52122544dbb0ac8ed617e657f66cbcc3c'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 89322858, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:58:56.203809', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '444e2088-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12176.354036264, 'message_signature': 'a7111064de063ed32ea9ad503d9866291040d0d733b291f9445bdd679bf4d283'}]}, 'timestamp': '2026-02-23 09:58:56.204533', '_unique_id': '25cb8dbcfc0441fba226fdb89d37fcdd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.205 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.205 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.205 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.205 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.205 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.205 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.205 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.205 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.205 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.205 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.205 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.205 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.205 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.205 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.205 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.205 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.205 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.205 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.205 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.205 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.205 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.205 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.205 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.205 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.205 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.205 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.205 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.205 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.205 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.205 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.205 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.206 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.206 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.bytes volume: 397312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.206 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.207 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a6273096-642d-438f-9ac0-5ac5364c33af', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 397312, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:58:56.206249', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '444e6e30-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12176.354036264, 'message_signature': 'dfc45d44a4fe631826f503e3b0814017d1db605c601a3c0741f6a1f2efe1d652'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:58:56.206249', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '444e7830-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12176.354036264, 'message_signature': '9d136adf2d0338743bd4c16baa78ec5017129f6c592876f1eab91aaeeb704ee6'}]}, 'timestamp': '2026-02-23 09:58:56.206970', '_unique_id': '4e6ed5ed46d34b7db834f2df7f5a53f2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.207 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.207 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.207 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.207 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.207 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.207 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.207 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.207 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.207 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.207 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.207 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.207 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.207 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.207 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.207 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.207 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.207 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.207 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.207 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.207 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.207 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.207 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.207 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.207 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.207 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.207 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.207 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.207 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.207 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.207 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.207 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.208 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.220 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.220 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.221 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fb7599b7-43b3-4b93-ac75-35d7fe95b782', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:58:56.208703', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '44509476-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12176.398189878, 'message_signature': 'b3e0a67b4b17b16e7d326d34e614c1e7829906751495dadf66369b6a4bdce73b'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:58:56.208703', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '44509f16-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12176.398189878, 'message_signature': 'bd47571e80eb4c6a25deb9d80fb05ad701f75f1d022b906c596cf46b71d398e3'}]}, 'timestamp': '2026-02-23 09:58:56.220941', '_unique_id': '8ddfe8c72a884241a13dc6146633c451'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.221 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.221 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.221 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.221 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.221 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.221 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.221 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.221 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.221 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.221 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.221 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.221 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.221 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.221 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.221 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.221 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.221 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.221 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.221 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.221 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.221 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.221 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.221 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.221 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.221 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.221 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.221 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.221 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.221 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.221 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.221 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.222 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.222 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.223 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '44704524-4428-4d86-a7f8-c969e9b8ffab', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:58:56.222362', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '4450e3ae-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12176.342317522, 'message_signature': '5f8bb649c3d2181118df065add0bff854f0211e9cc911ba9d4c3123c735e2611'}]}, 'timestamp': '2026-02-23 09:58:56.222651', '_unique_id': '73186567fc4e4711896b76e58b195889'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.223 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.223 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.223 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.223 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.223 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.223 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.223 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.223 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.223 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.223 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.223 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.223 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.223 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.223 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.223 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.223 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.223 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.223 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.223 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.223 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.223 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.223 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.223 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.223 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.223 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.223 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.223 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.223 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.223 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.223 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.223 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.223 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.224 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.225 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cd605ae7-2f80-4a2c-a64e-c43f8dd48a14', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:58:56.224038', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '445124fe-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12176.342317522, 'message_signature': '13c47d2ad3b01937c3b38c69ffc91372cbdfe52772454433092939f44057ca87'}]}, 'timestamp': '2026-02-23 09:58:56.224374', '_unique_id': 'aecccd5384aa4a328f14fab06c600c50'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.225 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.225 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.225 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.225 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.225 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.225 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.225 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.225 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.225 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.225 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.225 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.225 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.225 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.225 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.225 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.225 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.225 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.225 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.225 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.225 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.225 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.225 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.225 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.225 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.225 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.225 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.225 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.225 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.225 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.225 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.225 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.225 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.225 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.bytes volume: 6808 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.226 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0f01f88e-f48a-4dff-ab13-54e0f36a61af', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6808, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:58:56.225784', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '44516a4a-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12176.342317522, 'message_signature': '93129dec73ebf183a43e8da223a73d08304d86b5c490667f9460f404093c01ee'}]}, 'timestamp': '2026-02-23 09:58:56.226098', '_unique_id': '7c1dfbd0203947abb36d3cfa1944811c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.226 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.226 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.226 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.226 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.226 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.226 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.226 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.226 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.226 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.226 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.226 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.226 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.226 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.226 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.226 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.226 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.226 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.226 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.226 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.226 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.226 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.226 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.226 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.226 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.226 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.226 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.226 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.226 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.226 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.226 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.226 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.227 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.227 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.228 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6f19c06b-8a02-4f4b-91da-d4d8c8877bd0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:58:56.227594', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '4451b0ea-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12176.342317522, 'message_signature': '6ac4c5b4cc85b88d1465fe9e6c4408830302cae4bdab036612e08a03a2605060'}]}, 'timestamp': '2026-02-23 09:58:56.227932', '_unique_id': '0f92354c54c8465ca834f08799568ebb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.228 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.228 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.228 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.228 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.228 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.228 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.228 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.228 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.228 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.228 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.228 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.228 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.228 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.228 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.228 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.228 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.228 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.228 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.228 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.228 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.228 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.228 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.228 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.228 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.228 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.228 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.228 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.228 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.228 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.228 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.228 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.229 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.229 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.230 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '74f8a78c-0287-4712-9463-7e031de5226c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:58:56.229343', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '4451f5dc-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12176.342317522, 'message_signature': '1d7c9f25b611ad00eea6f34a1a160fafb36599d2baa381f2b6fc30a1cea5e8d8'}]}, 'timestamp': '2026-02-23 09:58:56.229673', '_unique_id': '470dad61c532430381306ffff8a9b98f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.230 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.230 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.230 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.230 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.230 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.230 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.230 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.230 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.230 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.230 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.230 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.230 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.230 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.230 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.230 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.230 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.230 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.230 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.230 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.230 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.230 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.230 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.230 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.230 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.230 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.230 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.230 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.230 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.230 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.230 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.230 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.231 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.231 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.231 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.232 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '50463f94-82d3-4e14-8347-38a427e157e0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:58:56.231261', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '44523efc-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12176.342317522, 'message_signature': 'd4824f254686d541fe018324a3bc4ac5e8533ec6901c5d08f4c61c649a46cc0f'}]}, 'timestamp': '2026-02-23 09:58:56.231644', '_unique_id': 'c428a30b27e7442780a2aef7767f27a4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.232 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.232 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.232 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.232 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.232 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.232 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.232 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.232 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.232 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.232 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.232 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.232 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.232 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.232 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.232 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.232 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.232 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.232 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.232 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.232 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.232 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.232 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.232 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.232 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.232 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.232 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.232 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.232 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.232 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.232 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.232 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.233 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.233 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.requests volume: 1283 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.233 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.234 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cdee3c1d-81ed-46e5-8ee6-e87113edb262', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1283, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:58:56.233155', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '44528a38-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12176.354036264, 'message_signature': 'f24e208638724fcd4d8ef784e95e41b088d31b6a25efded163bcdc8f68c8d655'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:58:56.233155', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '44529582-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12176.354036264, 'message_signature': '9e98305f6bfdf473645264618e4db59be256606d6eb8f0276a8ed69eff059f77'}]}, 'timestamp': '2026-02-23 09:58:56.233738', '_unique_id': '21b9477fdcf84556b69b8637ae90a551'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.234 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.234 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.234 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.234 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.234 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.234 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.234 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.234 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.234 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.234 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.234 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.234 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.234 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.234 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.234 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.234 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.234 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.234 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.234 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.234 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.234 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.234 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.234 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.234 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.234 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.234 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.234 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.234 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.234 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.234 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.234 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.235 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.235 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.236 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.236 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd71e00df-a4df-4800-83df-4d4d2a615bfc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:58:56.236052', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '4452fb1c-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12176.342317522, 'message_signature': 'f2ec4afeb1c8ff9f29dabf6a6a1360fa73276d9dae1980724de598301a577e83'}]}, 'timestamp': '2026-02-23 09:58:56.236360', '_unique_id': '90018c81ec22449792b0ebfd63adadd7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.236 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.236 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.236 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.236 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.236 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.236 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.236 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.236 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.236 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.236 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.236 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.236 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.236 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.236 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.236 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.236 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.236 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.236 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.236 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.236 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.236 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.236 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.236 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.236 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.236 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.236 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.236 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.236 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.236 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.236 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.236 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.237 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.259 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/cpu volume: 14280000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.260 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a2f88f5f-efc8-4705-855b-ccf31cbac0b5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 14280000000, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'timestamp': '2026-02-23T09:58:56.237692', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '44569ef2-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12176.449263496, 'message_signature': '71ba07e11bc4b343f0a50627bc8a10a6deea2b0ff95e80bf920b546dfae0f576'}]}, 'timestamp': '2026-02-23 09:58:56.260211', '_unique_id': 'edd077817958429d8fe3af0d1409b73a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.260 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.260 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.260 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.260 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.260 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.260 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.260 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.260 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.260 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.260 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.260 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.260 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.260 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.260 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.260 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.260 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.260 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.260 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.260 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.260 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.260 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.260 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.260 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.260 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.260 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.260 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.260 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.260 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.260 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.260 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.260 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.261 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.261 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.261 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.262 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.263 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f5e1fbc8-cb28-4018-9f09-328b20697ad2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:58:56.261803', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '4456e902-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12176.398189878, 'message_signature': 'cbee7d8f15be757cf6baae76c6345938764a47eb7ae146583d75edb27003d67d'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 
'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:58:56.261803', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '4456f5be-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12176.398189878, 'message_signature': 'b412034e9683c124ec6715ecc607792b13e92822b2c645eae244a1f859fa985d'}]}, 'timestamp': '2026-02-23 09:58:56.262452', '_unique_id': 'fd77681d6ae545f7b53a5ca68f2f9855'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.263 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.263 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.263 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.263 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.263 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.263 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.263 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.263 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.263 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.263 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.263 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.263 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.263 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.263 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.263 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.263 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.263 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.263 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.263 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.263 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.263 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.263 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.263 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.263 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.263 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.263 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.263 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.263 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.263 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.263 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.263 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.264 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.264 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.265 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5fea9d0e-5a81-4731-861a-77e9c1c14673', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:58:56.264156', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '44574762-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12176.342317522, 'message_signature': '6c6ac233e447a174253df767b9808abbd00d232946c64894b6e5cd8d1228f313'}]}, 'timestamp': '2026-02-23 09:58:56.264556', '_unique_id': 'c63290e0355b43db8bc61c9efa1ce1fa'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.265 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.265 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.265 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.265 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.265 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.265 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.265 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.265 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.265 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.265 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.265 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.265 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.265 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.265 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.265 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.265 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.265 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.265 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.265 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.265 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.265 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.265 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.265 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.265 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.265 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.265 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.265 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.265 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.265 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.265 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.265 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.265 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.265 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.265 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.265 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.265 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.265 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.265 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.265 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.265 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.265 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.265 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.265 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.265 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.265 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.265 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.265 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.265 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.265 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.265 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.265 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.265 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.265 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.265 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.265 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.266 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.267 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1a7e0b21-5c37-4ef4-8856-816ae61dc4a9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:58:56.266111', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '445790c8-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12176.342317522, 'message_signature': '18cfe4da80f67ffeb405dfdaf18aa7c9dab651d1a011a0baa817769634dba447'}]}, 'timestamp': '2026-02-23 09:58:56.266404', '_unique_id': 'b7f2fc1e5f354320a5ea72d701d496bb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.267 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.267 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.267 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.267 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.267 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.267 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.267 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.267 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.267 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.267 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.267 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.267 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.267 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.267 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.267 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.267 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.267 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.267 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.267 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.267 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.267 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.267 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.267 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.267 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.267 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.267 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.267 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.267 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.267 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.267 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.267 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.267 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.268 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/memory.usage volume: 51.72265625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:58:56 np0005626463.localdomain ceph-mon[294160]: pgmap v294: 177 pgs: 177 active+clean; 145 MiB data, 805 MiB used, 41 GiB / 42 GiB avail; 3.4 MiB/s rd, 767 B/s wr, 28 op/s
Feb 23 09:58:56 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/3383097945' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 23 09:58:56 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/3383097945' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.269 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '67ca4aac-0ea6-4ee9-b638-0e860a24c0a7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.72265625, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'timestamp': '2026-02-23T09:58:56.268005', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '4457dc4a-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12176.449263496, 'message_signature': 'a62342408bdf7b01d8c1b6e22d2b419bc1e51d9fb1ff5599bee0ae6b43b969ea'}]}, 'timestamp': '2026-02-23 09:58:56.268352', '_unique_id': '5fa038e51b5c4167a0b33acf73b2dcbf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.269 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.269 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.269 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.269 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.269 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.269 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.269 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.269 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.269 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.269 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.269 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.269 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.269 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.269 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.269 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.269 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.269 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.269 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.269 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.269 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.269 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.269 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.269 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.269 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.269 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.269 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.269 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.269 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.269 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.269 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.269 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.269 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.269 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.270 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.271 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f8c01e31-ad01-4a3a-9784-b39717a61af9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:58:56.269850', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '44582330-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12176.398189878, 'message_signature': '2b09d88f0b15d39696557d48cb0a9ac7a3114d9a148f284006ff2b53e4d12115'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:58:56.269850', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '44582f1a-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12176.398189878, 'message_signature': 'b1b5e63ea4f8a536274ca6541a7c6728e8e6bd99cd1e4febfc0526c4ff996175'}]}, 'timestamp': '2026-02-23 09:58:56.270438', '_unique_id': '7e3a38aaf9d149a08f6024be2d5f9e94'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.271 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.271 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.271 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.271 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.271 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.271 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.271 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.271 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.271 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.271 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.271 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.271 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.271 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.271 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.271 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.271 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.271 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.271 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.271 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.271 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.271 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.271 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.271 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.271 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.271 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.271 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.271 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.271 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.271 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.271 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 09:58:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.271 12 ERROR oslo_messaging.notify.messaging 
Feb 23 09:58:57 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:58:57 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:58:57.182 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4b:79:d0 2001:db8::f816:3eff:fe4b:79d0'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe4b:79d0/64', 'neutron:device_id': 'ovnmeta-c73ae202-1b92-44f9-b55c-a6eba0a348b0', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c73ae202-1b92-44f9-b55c-a6eba0a348b0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5bed3d4ae9fd4fe7b9440b4587246c14', 'neutron:revision_number': '6', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b8f6a957-d664-4eea-abeb-3145dfe24d16, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=f9f417b1-97ff-4889-a004-ebc80afbc25a) old=Port_Binding(mac=['fa:16:3e:4b:79:d0 10.100.0.2 2001:db8::f816:3eff:fe4b:79d0'], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fe4b:79d0/64', 'neutron:device_id': 'ovnmeta-c73ae202-1b92-44f9-b55c-a6eba0a348b0', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c73ae202-1b92-44f9-b55c-a6eba0a348b0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5bed3d4ae9fd4fe7b9440b4587246c14', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 09:58:57 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:58:57.184 163572 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port f9f417b1-97ff-4889-a004-ebc80afbc25a in datapath c73ae202-1b92-44f9-b55c-a6eba0a348b0 updated
Feb 23 09:58:57 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:58:57.188 163572 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c73ae202-1b92-44f9-b55c-a6eba0a348b0, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 23 09:58:57 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:58:57.189 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[142ca4f5-6cbc-40b8-85da-d8290640d8c9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 09:58:57 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.
Feb 23 09:58:57 np0005626463.localdomain systemd[1]: tmp-crun.25ZlgV.mount: Deactivated successfully.
Feb 23 09:58:57 np0005626463.localdomain dnsmasq[316758]: read /var/lib/neutron/dhcp/169a9bd5-a623-4b9d-83a6-bf3f6708a358/addn_hosts - 0 addresses
Feb 23 09:58:57 np0005626463.localdomain dnsmasq-dhcp[316758]: read /var/lib/neutron/dhcp/169a9bd5-a623-4b9d-83a6-bf3f6708a358/host
Feb 23 09:58:57 np0005626463.localdomain podman[316865]: 2026-02-23 09:58:57.939801769 +0000 UTC m=+0.065122822 container kill 99a319c90f4ac1870108199a405d4c36696c9613c78962ecaa9ad4ef6f1bd576 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-169a9bd5-a623-4b9d-83a6-bf3f6708a358, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.license=GPLv2)
Feb 23 09:58:57 np0005626463.localdomain dnsmasq-dhcp[316758]: read /var/lib/neutron/dhcp/169a9bd5-a623-4b9d-83a6-bf3f6708a358/opts
Feb 23 09:58:57 np0005626463.localdomain podman[316849]: 2026-02-23 09:58:57.927164429 +0000 UTC m=+0.100029021 container health_status da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 23 09:58:58 np0005626463.localdomain podman[316849]: 2026-02-23 09:58:58.006252313 +0000 UTC m=+0.179116885 container exec_died da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 23 09:58:58 np0005626463.localdomain systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Deactivated successfully.
Feb 23 09:58:58 np0005626463.localdomain ceph-mon[294160]: pgmap v295: 177 pgs: 177 active+clean; 201 MiB data, 834 MiB used, 41 GiB / 42 GiB avail; 3.5 MiB/s rd, 1.9 MiB/s wr, 99 op/s
Feb 23 09:58:58 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:58:58.141 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:58:58 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:58:58Z|00272|binding|INFO|Releasing lport 244c57bd-d22b-4733-8be3-0ce88383151e from this chassis (sb_readonly=0)
Feb 23 09:58:58 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:58:58Z|00273|binding|INFO|Setting lport 244c57bd-d22b-4733-8be3-0ce88383151e down in Southbound
Feb 23 09:58:58 np0005626463.localdomain kernel: device tap244c57bd-d2 left promiscuous mode
Feb 23 09:58:58 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:58:58.155 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005626463.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.103.0.2/28', 'neutron:device_id': 'dhcpfb23302c-55c1-5de0-badf-4fc1ff22837a-169a9bd5-a623-4b9d-83a6-bf3f6708a358', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-169a9bd5-a623-4b9d-83a6-bf3f6708a358', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6aadd525d3dd402cb701922115d00291', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005626463.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f70b6503-965d-4883-bf00-ea5f7a873818, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f808c075610>], logical_port=244c57bd-d22b-4733-8be3-0ce88383151e) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f808c075610>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 09:58:58 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:58:58.157 163572 INFO neutron.agent.ovn.metadata.agent [-] Port 244c57bd-d22b-4733-8be3-0ce88383151e in datapath 169a9bd5-a623-4b9d-83a6-bf3f6708a358 unbound from our chassis
Feb 23 09:58:58 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:58:58.161 163572 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 169a9bd5-a623-4b9d-83a6-bf3f6708a358, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 23 09:58:58 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:58:58.162 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[0be8fc78-4d4d-46df-9a6f-b9f6d1dd0afd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 09:58:58 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:58:58.165 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:58:58 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:58:58.167 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:58:58 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:58:58.207 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4b:79:d0 10.100.0.2 2001:db8::f816:3eff:fe4b:79d0'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fe4b:79d0/64', 'neutron:device_id': 'ovnmeta-c73ae202-1b92-44f9-b55c-a6eba0a348b0', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c73ae202-1b92-44f9-b55c-a6eba0a348b0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5bed3d4ae9fd4fe7b9440b4587246c14', 'neutron:revision_number': '7', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b8f6a957-d664-4eea-abeb-3145dfe24d16, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=f9f417b1-97ff-4889-a004-ebc80afbc25a) old=Port_Binding(mac=['fa:16:3e:4b:79:d0 2001:db8::f816:3eff:fe4b:79d0'], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe4b:79d0/64', 'neutron:device_id': 'ovnmeta-c73ae202-1b92-44f9-b55c-a6eba0a348b0', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c73ae202-1b92-44f9-b55c-a6eba0a348b0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5bed3d4ae9fd4fe7b9440b4587246c14', 'neutron:revision_number': '6', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 09:58:58 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:58:58.209 163572 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port f9f417b1-97ff-4889-a004-ebc80afbc25a in datapath c73ae202-1b92-44f9-b55c-a6eba0a348b0 updated
Feb 23 09:58:58 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:58:58.213 163572 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c73ae202-1b92-44f9-b55c-a6eba0a348b0, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 23 09:58:58 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:58:58.214 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[5a91fe0c-b703-4a1b-b234-6283153cd7d7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 09:58:58 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:58:58.277 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:58:58 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:58:58.282 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:58:58 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:58:58.833 2 INFO neutron.agent.securitygroups_rpc [None req-d199110d-6878-4918-921e-08b8a8f76fc7 982b83c89c37422a910f5359ef7b6ea5 5bed3d4ae9fd4fe7b9440b4587246c14 - - default default] Security group member updated ['3f91b09d-b6ac-403f-adaf-7a684ee36fe5']
Feb 23 09:58:59 np0005626463.localdomain dnsmasq[316758]: exiting on receipt of SIGTERM
Feb 23 09:58:59 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.
Feb 23 09:58:59 np0005626463.localdomain systemd[1]: libpod-99a319c90f4ac1870108199a405d4c36696c9613c78962ecaa9ad4ef6f1bd576.scope: Deactivated successfully.
Feb 23 09:58:59 np0005626463.localdomain podman[316912]: 2026-02-23 09:58:59.351780273 +0000 UTC m=+0.421054249 container kill 99a319c90f4ac1870108199a405d4c36696c9613c78962ecaa9ad4ef6f1bd576 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-169a9bd5-a623-4b9d-83a6-bf3f6708a358, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true)
Feb 23 09:58:59 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/2980215869' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 23 09:58:59 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/2980215869' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 23 09:58:59 np0005626463.localdomain podman[316937]: 2026-02-23 09:58:59.4487772 +0000 UTC m=+0.066445494 container died 99a319c90f4ac1870108199a405d4c36696c9613c78962ecaa9ad4ef6f1bd576 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-169a9bd5-a623-4b9d-83a6-bf3f6708a358, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, tcib_managed=true, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Feb 23 09:58:59 np0005626463.localdomain podman[316924]: 2026-02-23 09:58:59.448679326 +0000 UTC m=+0.083191010 container health_status 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1770267347, build-date=2026-02-05T04:57:10Z, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, org.opencontainers.image.created=2026-02-05T04:57:10Z, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-type=git, distribution-scope=public, com.redhat.component=ubi9-minimal-container, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, version=9.7, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.openshift.expose-services=, name=ubi9/ubi-minimal, maintainer=Red Hat, Inc., config_id=openstack_network_exporter, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Feb 23 09:58:59 np0005626463.localdomain systemd[1]: tmp-crun.PXW1yu.mount: Deactivated successfully.
Feb 23 09:58:59 np0005626463.localdomain podman[316937]: 2026-02-23 09:58:59.558679436 +0000 UTC m=+0.176347690 container remove 99a319c90f4ac1870108199a405d4c36696c9613c78962ecaa9ad4ef6f1bd576 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-169a9bd5-a623-4b9d-83a6-bf3f6708a358, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 23 09:58:59 np0005626463.localdomain systemd[1]: libpod-conmon-99a319c90f4ac1870108199a405d4c36696c9613c78962ecaa9ad4ef6f1bd576.scope: Deactivated successfully.
Feb 23 09:58:59 np0005626463.localdomain podman[316924]: 2026-02-23 09:58:59.578632792 +0000 UTC m=+0.213144426 container exec_died 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=minimal rhel9, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vendor=Red Hat, Inc., version=9.7, build-date=2026-02-05T04:57:10Z, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1770267347, managed_by=edpm_ansible, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, org.opencontainers.image.created=2026-02-05T04:57:10Z, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container)
Feb 23 09:58:59 np0005626463.localdomain systemd[1]: 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.service: Deactivated successfully.
Feb 23 09:58:59 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:58:59.683 265541 INFO neutron.agent.dhcp.agent [None req-75f5ce48-1363-4120-bad5-08d52bde641d - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 23 09:58:59 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:58:59.684 265541 INFO neutron.agent.dhcp.agent [None req-75f5ce48-1363-4120-bad5-08d52bde641d - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 23 09:58:59 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:58:59Z|00274|binding|INFO|Releasing lport 4143c8ea-7577-4792-9744-bcff90eb20f2 from this chassis (sb_readonly=0)
Feb 23 09:58:59 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-74cfcad24671a33e4fe137c8b775f0c8c7b0e5f093f9846bc6f205eac1c88d45-merged.mount: Deactivated successfully.
Feb 23 09:58:59 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-99a319c90f4ac1870108199a405d4c36696c9613c78962ecaa9ad4ef6f1bd576-userdata-shm.mount: Deactivated successfully.
Feb 23 09:58:59 np0005626463.localdomain systemd[1]: run-netns-qdhcp\x2d169a9bd5\x2da623\x2d4b9d\x2d83a6\x2dbf3f6708a358.mount: Deactivated successfully.
Feb 23 09:59:00 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:59:00.077 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:59:00 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:59:00.147 2 INFO neutron.agent.securitygroups_rpc [None req-378c2f4f-1435-4dd5-9061-db29c2eb645b 982b83c89c37422a910f5359ef7b6ea5 5bed3d4ae9fd4fe7b9440b4587246c14 - - default default] Security group member updated ['3f91b09d-b6ac-403f-adaf-7a684ee36fe5']
Feb 23 09:59:00 np0005626463.localdomain ceph-mon[294160]: pgmap v296: 177 pgs: 177 active+clean; 201 MiB data, 834 MiB used, 41 GiB / 42 GiB avail; 3.5 MiB/s rd, 1.9 MiB/s wr, 99 op/s
Feb 23 09:59:00 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/1476094254' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 23 09:59:00 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/1476094254' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 23 09:59:00 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Feb 23 09:59:00 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1474432843' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 09:59:01 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Feb 23 09:59:01 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1692926290' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 23 09:59:01 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Feb 23 09:59:01 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1692926290' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 23 09:59:01 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e142 do_prune osdmap full prune enabled
Feb 23 09:59:01 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e143 e143: 6 total, 6 up, 6 in
Feb 23 09:59:01 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e143: 6 total, 6 up, 6 in
Feb 23 09:59:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb 23 09:59:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 8400.1 total, 600.0 interval
                                                          Cumulative writes: 10K writes, 40K keys, 10K commit groups, 1.0 writes per commit group, ingest: 0.04 GB, 0.00 MB/s
                                                          Cumulative WAL: 10K writes, 2868 syncs, 3.58 writes per sync, written: 0.04 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 4956 writes, 16K keys, 4956 commit groups, 1.0 writes per commit group, ingest: 15.78 MB, 0.03 MB/s
                                                          Interval WAL: 4955 writes, 2127 syncs, 2.33 writes per sync, written: 0.02 GB, 0.03 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Feb 23 09:59:01 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/1474432843' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 09:59:01 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/1692926290' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 23 09:59:01 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/1692926290' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 23 09:59:01 np0005626463.localdomain ceph-mon[294160]: pgmap v297: 177 pgs: 177 active+clean; 218 MiB data, 863 MiB used, 41 GiB / 42 GiB avail; 3.5 MiB/s rd, 3.5 MiB/s wr, 159 op/s
Feb 23 09:59:02 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:59:02 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e143 do_prune osdmap full prune enabled
Feb 23 09:59:02 np0005626463.localdomain ceph-mon[294160]: osdmap e143: 6 total, 6 up, 6 in
Feb 23 09:59:02 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e144 e144: 6 total, 6 up, 6 in
Feb 23 09:59:02 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e144: 6 total, 6 up, 6 in
Feb 23 09:59:02 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:59:02.898 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4b:79:d0 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-c73ae202-1b92-44f9-b55c-a6eba0a348b0', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c73ae202-1b92-44f9-b55c-a6eba0a348b0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5bed3d4ae9fd4fe7b9440b4587246c14', 'neutron:revision_number': '10', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b8f6a957-d664-4eea-abeb-3145dfe24d16, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=f9f417b1-97ff-4889-a004-ebc80afbc25a) old=Port_Binding(mac=['fa:16:3e:4b:79:d0 10.100.0.2 2001:db8::f816:3eff:fe4b:79d0'], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fe4b:79d0/64', 'neutron:device_id': 'ovnmeta-c73ae202-1b92-44f9-b55c-a6eba0a348b0', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c73ae202-1b92-44f9-b55c-a6eba0a348b0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5bed3d4ae9fd4fe7b9440b4587246c14', 'neutron:revision_number': '7', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches 
/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 09:59:02 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:59:02.900 163572 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port f9f417b1-97ff-4889-a004-ebc80afbc25a in datapath c73ae202-1b92-44f9-b55c-a6eba0a348b0 updated
Feb 23 09:59:02 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:59:02.904 163572 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c73ae202-1b92-44f9-b55c-a6eba0a348b0, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 23 09:59:02 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:59:02.905 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[c648f7d3-d8aa-46c7-bac4-0de58bc6c14d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 09:59:03 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:59:03.280 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:59:03 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e144 do_prune osdmap full prune enabled
Feb 23 09:59:03 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e145 e145: 6 total, 6 up, 6 in
Feb 23 09:59:03 np0005626463.localdomain ceph-mon[294160]: osdmap e144: 6 total, 6 up, 6 in
Feb 23 09:59:03 np0005626463.localdomain ceph-mon[294160]: pgmap v300: 177 pgs: 177 active+clean; 192 MiB data, 873 MiB used, 41 GiB / 42 GiB avail; 147 KiB/s rd, 5.3 MiB/s wr, 216 op/s
Feb 23 09:59:03 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e145: 6 total, 6 up, 6 in
Feb 23 09:59:04 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e145 do_prune osdmap full prune enabled
Feb 23 09:59:04 np0005626463.localdomain ceph-mon[294160]: osdmap e145: 6 total, 6 up, 6 in
Feb 23 09:59:04 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/3290866913' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 23 09:59:04 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/3290866913' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 23 09:59:04 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/3240796382' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 23 09:59:04 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/3240796382' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 23 09:59:04 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e146 e146: 6 total, 6 up, 6 in
Feb 23 09:59:04 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e146: 6 total, 6 up, 6 in
Feb 23 09:59:05 np0005626463.localdomain ceph-mon[294160]: osdmap e146: 6 total, 6 up, 6 in
Feb 23 09:59:05 np0005626463.localdomain ceph-mon[294160]: pgmap v303: 177 pgs: 177 active+clean; 192 MiB data, 873 MiB used, 41 GiB / 42 GiB avail; 31 KiB/s rd, 511 B/s wr, 39 op/s
Feb 23 09:59:05 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:59:05.511 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4b:79:d0 10.100.0.2 2001:db8::f816:3eff:fe4b:79d0'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fe4b:79d0/64', 'neutron:device_id': 'ovnmeta-c73ae202-1b92-44f9-b55c-a6eba0a348b0', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c73ae202-1b92-44f9-b55c-a6eba0a348b0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5bed3d4ae9fd4fe7b9440b4587246c14', 'neutron:revision_number': '11', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b8f6a957-d664-4eea-abeb-3145dfe24d16, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=f9f417b1-97ff-4889-a004-ebc80afbc25a) old=Port_Binding(mac=['fa:16:3e:4b:79:d0 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-c73ae202-1b92-44f9-b55c-a6eba0a348b0', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c73ae202-1b92-44f9-b55c-a6eba0a348b0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5bed3d4ae9fd4fe7b9440b4587246c14', 'neutron:revision_number': '10', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches 
/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 09:59:05 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:59:05.513 163572 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port f9f417b1-97ff-4889-a004-ebc80afbc25a in datapath c73ae202-1b92-44f9-b55c-a6eba0a348b0 updated
Feb 23 09:59:05 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:59:05.516 163572 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c73ae202-1b92-44f9-b55c-a6eba0a348b0, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 23 09:59:05 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:59:05.516 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[1f29e7fe-73ae-4b1e-9cf3-2335d3a5dd7b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 09:59:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb 23 09:59:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 8400.1 total, 600.0 interval
                                                          Cumulative writes: 10K writes, 40K keys, 10K commit groups, 1.0 writes per commit group, ingest: 0.03 GB, 0.00 MB/s
                                                          Cumulative WAL: 10K writes, 2814 syncs, 3.67 writes per sync, written: 0.03 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 4763 writes, 16K keys, 4763 commit groups, 1.0 writes per commit group, ingest: 13.99 MB, 0.02 MB/s
                                                          Interval WAL: 4763 writes, 2036 syncs, 2.34 writes per sync, written: 0.01 GB, 0.02 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Feb 23 09:59:06 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:59:06.660 2 INFO neutron.agent.securitygroups_rpc [None req-662ed715-f0df-4027-a61e-4a9b758296d7 982b83c89c37422a910f5359ef7b6ea5 5bed3d4ae9fd4fe7b9440b4587246c14 - - default default] Security group member updated ['3f91b09d-b6ac-403f-adaf-7a684ee36fe5']
Feb 23 09:59:06 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/2170764385' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 09:59:07 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:59:07 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:59:07.921 2 INFO neutron.agent.securitygroups_rpc [None req-8305faf4-3c52-460c-b047-e282e2502f1e 982b83c89c37422a910f5359ef7b6ea5 5bed3d4ae9fd4fe7b9440b4587246c14 - - default default] Security group member updated ['3f91b09d-b6ac-403f-adaf-7a684ee36fe5']
Feb 23 09:59:08 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e146 do_prune osdmap full prune enabled
Feb 23 09:59:08 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e147 e147: 6 total, 6 up, 6 in
Feb 23 09:59:08 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e147: 6 total, 6 up, 6 in
Feb 23 09:59:08 np0005626463.localdomain ceph-mon[294160]: pgmap v304: 177 pgs: 177 active+clean; 238 MiB data, 936 MiB used, 41 GiB / 42 GiB avail; 3.7 MiB/s rd, 3.7 MiB/s wr, 119 op/s
Feb 23 09:59:08 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:59:08.284 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:59:08 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:59:08Z|00275|binding|INFO|Releasing lport 4143c8ea-7577-4792-9744-bcff90eb20f2 from this chassis (sb_readonly=0)
Feb 23 09:59:08 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:59:08.902 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:59:09 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:59:09.077 2 INFO neutron.agent.securitygroups_rpc [None req-0b1bb1a0-ad29-4db2-a82b-e96c70807da2 9903926d083041b9a33881e7cab5b89f c0dc7447f79a422a9af7dbd04780afa6 - - default default] Security group member updated ['abb3c63b-8b38-4dd7-99e4-d8f07472a5d2']
Feb 23 09:59:09 np0005626463.localdomain ceph-mon[294160]: osdmap e147: 6 total, 6 up, 6 in
Feb 23 09:59:09 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/159217426' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 23 09:59:09 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/159217426' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 23 09:59:09 np0005626463.localdomain podman[242954]: time="2026-02-23T09:59:09Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 23 09:59:09 np0005626463.localdomain podman[242954]: @ - - [23/Feb/2026:09:59:09 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 160729 "" "Go-http-client/1.1"
Feb 23 09:59:09 np0005626463.localdomain podman[242954]: @ - - [23/Feb/2026:09:59:09 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19763 "" "Go-http-client/1.1"
Feb 23 09:59:09 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.
Feb 23 09:59:09 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.
Feb 23 09:59:09 np0005626463.localdomain systemd[1]: tmp-crun.DBMul9.mount: Deactivated successfully.
Feb 23 09:59:09 np0005626463.localdomain podman[316973]: 2026-02-23 09:59:09.921986465 +0000 UTC m=+0.087284627 container health_status bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Feb 23 09:59:09 np0005626463.localdomain podman[316973]: 2026-02-23 09:59:09.961601879 +0000 UTC m=+0.126900021 container exec_died bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Feb 23 09:59:09 np0005626463.localdomain systemd[1]: bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.service: Deactivated successfully.
Feb 23 09:59:09 np0005626463.localdomain podman[316972]: 2026-02-23 09:59:09.974743625 +0000 UTC m=+0.143549725 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, container_name=ovn_controller, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, tcib_managed=true)
Feb 23 09:59:10 np0005626463.localdomain podman[316972]: 2026-02-23 09:59:10.015335809 +0000 UTC m=+0.184141899 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260216, tcib_managed=true, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Feb 23 09:59:10 np0005626463.localdomain systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully.
Feb 23 09:59:10 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e147 do_prune osdmap full prune enabled
Feb 23 09:59:10 np0005626463.localdomain ceph-mon[294160]: pgmap v306: 177 pgs: 177 active+clean; 238 MiB data, 936 MiB used, 41 GiB / 42 GiB avail; 3.6 MiB/s rd, 3.5 MiB/s wr, 88 op/s
Feb 23 09:59:10 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/1774236521' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 23 09:59:10 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/1774236521' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 23 09:59:10 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e148 e148: 6 total, 6 up, 6 in
Feb 23 09:59:10 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e148: 6 total, 6 up, 6 in
Feb 23 09:59:10 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Feb 23 09:59:10 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1122435503' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 23 09:59:10 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Feb 23 09:59:10 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1122435503' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 23 09:59:11 np0005626463.localdomain ceph-mon[294160]: osdmap e148: 6 total, 6 up, 6 in
Feb 23 09:59:11 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/1122435503' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 23 09:59:11 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/1122435503' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 23 09:59:11 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:59:11.304 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4b:79:d0 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-c73ae202-1b92-44f9-b55c-a6eba0a348b0', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c73ae202-1b92-44f9-b55c-a6eba0a348b0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5bed3d4ae9fd4fe7b9440b4587246c14', 'neutron:revision_number': '14', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b8f6a957-d664-4eea-abeb-3145dfe24d16, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=f9f417b1-97ff-4889-a004-ebc80afbc25a) old=Port_Binding(mac=['fa:16:3e:4b:79:d0 10.100.0.2 2001:db8::f816:3eff:fe4b:79d0'], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fe4b:79d0/64', 'neutron:device_id': 'ovnmeta-c73ae202-1b92-44f9-b55c-a6eba0a348b0', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c73ae202-1b92-44f9-b55c-a6eba0a348b0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5bed3d4ae9fd4fe7b9440b4587246c14', 'neutron:revision_number': '11', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 09:59:11 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:59:11.306 163572 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port f9f417b1-97ff-4889-a004-ebc80afbc25a in datapath c73ae202-1b92-44f9-b55c-a6eba0a348b0 updated
Feb 23 09:59:11 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:59:11.309 163572 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c73ae202-1b92-44f9-b55c-a6eba0a348b0, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 23 09:59:11 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:59:11.310 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[c1ba46cd-3260-496a-b1a2-68d3894fe5d6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 09:59:11 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:59:11.517 2 INFO neutron.agent.securitygroups_rpc [None req-3920dd90-663f-4a5e-863e-875f00aeb78d 9903926d083041b9a33881e7cab5b89f c0dc7447f79a422a9af7dbd04780afa6 - - default default] Security group member updated ['abb3c63b-8b38-4dd7-99e4-d8f07472a5d2']
Feb 23 09:59:11 np0005626463.localdomain dnsmasq[316503]: read /var/lib/neutron/dhcp/fd854bec-4386-47ab-bc93-a08354b81ab6/addn_hosts - 0 addresses
Feb 23 09:59:11 np0005626463.localdomain dnsmasq-dhcp[316503]: read /var/lib/neutron/dhcp/fd854bec-4386-47ab-bc93-a08354b81ab6/host
Feb 23 09:59:11 np0005626463.localdomain dnsmasq-dhcp[316503]: read /var/lib/neutron/dhcp/fd854bec-4386-47ab-bc93-a08354b81ab6/opts
Feb 23 09:59:11 np0005626463.localdomain podman[317033]: 2026-02-23 09:59:11.687646466 +0000 UTC m=+0.067650921 container kill 09dad08e9c72d7d1b99bbbc519a732dc2042ec07079abeaba60f983f4d203137 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fd854bec-4386-47ab-bc93-a08354b81ab6, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260216, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Feb 23 09:59:11 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:59:11Z|00276|binding|INFO|Releasing lport 99eeaa57-5103-4a27-8cb6-f740d5ffcf80 from this chassis (sb_readonly=0)
Feb 23 09:59:11 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:59:11.888 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:59:11 np0005626463.localdomain kernel: device tap99eeaa57-51 left promiscuous mode
Feb 23 09:59:11 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:59:11Z|00277|binding|INFO|Setting lport 99eeaa57-5103-4a27-8cb6-f740d5ffcf80 down in Southbound
Feb 23 09:59:11 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:59:11.896 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005626463.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpfb23302c-55c1-5de0-badf-4fc1ff22837a-fd854bec-4386-47ab-bc93-a08354b81ab6', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fd854bec-4386-47ab-bc93-a08354b81ab6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6a6c1ed33b7a401e921451e25668daed', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005626463.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c8ac0cd8-62b1-44f3-b0a9-7a358af2ef4f, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f808c075610>], logical_port=99eeaa57-5103-4a27-8cb6-f740d5ffcf80) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f808c075610>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 09:59:11 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:59:11.898 163572 INFO neutron.agent.ovn.metadata.agent [-] Port 99eeaa57-5103-4a27-8cb6-f740d5ffcf80 in datapath fd854bec-4386-47ab-bc93-a08354b81ab6 unbound from our chassis
Feb 23 09:59:11 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:59:11.900 163572 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network fd854bec-4386-47ab-bc93-a08354b81ab6, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 23 09:59:11 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:59:11.901 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[5589ff04-f058-4543-8779-3ea97bfbdf0d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 09:59:11 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:59:11.911 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:59:12 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e148 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:59:12 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e148 do_prune osdmap full prune enabled
Feb 23 09:59:12 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e149 e149: 6 total, 6 up, 6 in
Feb 23 09:59:12 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e149: 6 total, 6 up, 6 in
Feb 23 09:59:12 np0005626463.localdomain ceph-mon[294160]: pgmap v308: 177 pgs: 177 active+clean; 238 MiB data, 966 MiB used, 41 GiB / 42 GiB avail; 5.1 MiB/s rd, 5.9 MiB/s wr, 147 op/s
Feb 23 09:59:12 np0005626463.localdomain ceph-mon[294160]: osdmap e149: 6 total, 6 up, 6 in
Feb 23 09:59:13 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/2607662654' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 23 09:59:13 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/2607662654' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 23 09:59:13 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:59:13.324 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:59:13 np0005626463.localdomain openstack_network_exporter[245358]: ERROR   09:59:13 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 23 09:59:13 np0005626463.localdomain openstack_network_exporter[245358]: 
Feb 23 09:59:13 np0005626463.localdomain openstack_network_exporter[245358]: ERROR   09:59:13 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 23 09:59:13 np0005626463.localdomain openstack_network_exporter[245358]: 
Feb 23 09:59:13 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:59:13.911 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4b:79:d0 10.100.0.2 2001:db8::f816:3eff:fe4b:79d0'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fe4b:79d0/64', 'neutron:device_id': 'ovnmeta-c73ae202-1b92-44f9-b55c-a6eba0a348b0', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c73ae202-1b92-44f9-b55c-a6eba0a348b0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5bed3d4ae9fd4fe7b9440b4587246c14', 'neutron:revision_number': '15', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b8f6a957-d664-4eea-abeb-3145dfe24d16, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=f9f417b1-97ff-4889-a004-ebc80afbc25a) old=Port_Binding(mac=['fa:16:3e:4b:79:d0 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-c73ae202-1b92-44f9-b55c-a6eba0a348b0', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c73ae202-1b92-44f9-b55c-a6eba0a348b0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5bed3d4ae9fd4fe7b9440b4587246c14', 'neutron:revision_number': '14', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 09:59:13 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:59:13.913 163572 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port f9f417b1-97ff-4889-a004-ebc80afbc25a in datapath c73ae202-1b92-44f9-b55c-a6eba0a348b0 updated
Feb 23 09:59:13 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:59:13.916 163572 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c73ae202-1b92-44f9-b55c-a6eba0a348b0, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 23 09:59:13 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:59:13.917 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[096eddeb-8fa9-4939-82f3-9cd7aa53b089]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 09:59:14 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e149 do_prune osdmap full prune enabled
Feb 23 09:59:14 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e150 e150: 6 total, 6 up, 6 in
Feb 23 09:59:14 np0005626463.localdomain ceph-mon[294160]: pgmap v310: 177 pgs: 177 active+clean; 238 MiB data, 972 MiB used, 41 GiB / 42 GiB avail; 3.7 MiB/s rd, 3.5 MiB/s wr, 187 op/s
Feb 23 09:59:14 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e150: 6 total, 6 up, 6 in
Feb 23 09:59:14 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:59:14Z|00278|binding|INFO|Releasing lport 4143c8ea-7577-4792-9744-bcff90eb20f2 from this chassis (sb_readonly=0)
Feb 23 09:59:14 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.
Feb 23 09:59:14 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:59:14.900 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:59:14 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Feb 23 09:59:14 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/4123969531' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 23 09:59:14 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Feb 23 09:59:14 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/4123969531' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 23 09:59:14 np0005626463.localdomain systemd[1]: tmp-crun.aiE9hR.mount: Deactivated successfully.
Feb 23 09:59:14 np0005626463.localdomain podman[317055]: 2026-02-23 09:59:14.956110478 +0000 UTC m=+0.128479031 container health_status be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.43.0, container_name=ceilometer_agent_compute, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true)
Feb 23 09:59:14 np0005626463.localdomain podman[317055]: 2026-02-23 09:59:14.994293298 +0000 UTC m=+0.166661801 container exec_died be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb)
Feb 23 09:59:15 np0005626463.localdomain systemd[1]: be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.service: Deactivated successfully.
Feb 23 09:59:15 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:59:15.024 2 INFO neutron.agent.securitygroups_rpc [None req-785e188e-0451-4720-a843-201f1ea322a4 982b83c89c37422a910f5359ef7b6ea5 5bed3d4ae9fd4fe7b9440b4587246c14 - - default default] Security group member updated ['3f91b09d-b6ac-403f-adaf-7a684ee36fe5']
Feb 23 09:59:15 np0005626463.localdomain ceph-mon[294160]: osdmap e150: 6 total, 6 up, 6 in
Feb 23 09:59:15 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/4123969531' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 23 09:59:15 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/4123969531' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 23 09:59:15 np0005626463.localdomain dnsmasq[316503]: exiting on receipt of SIGTERM
Feb 23 09:59:15 np0005626463.localdomain systemd[1]: libpod-09dad08e9c72d7d1b99bbbc519a732dc2042ec07079abeaba60f983f4d203137.scope: Deactivated successfully.
Feb 23 09:59:15 np0005626463.localdomain podman[317089]: 2026-02-23 09:59:15.687995129 +0000 UTC m=+0.059136958 container kill 09dad08e9c72d7d1b99bbbc519a732dc2042ec07079abeaba60f983f4d203137 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fd854bec-4386-47ab-bc93-a08354b81ab6, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260216, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Feb 23 09:59:15 np0005626463.localdomain podman[317102]: 2026-02-23 09:59:15.76018631 +0000 UTC m=+0.058969713 container died 09dad08e9c72d7d1b99bbbc519a732dc2042ec07079abeaba60f983f4d203137 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fd854bec-4386-47ab-bc93-a08354b81ab6, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb)
Feb 23 09:59:15 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:59:15.776 2 INFO neutron.agent.securitygroups_rpc [None req-abe579f6-947b-47f0-ad4d-2e1c9d13dcd9 982b83c89c37422a910f5359ef7b6ea5 5bed3d4ae9fd4fe7b9440b4587246c14 - - default default] Security group member updated ['3f91b09d-b6ac-403f-adaf-7a684ee36fe5']
Feb 23 09:59:15 np0005626463.localdomain podman[317102]: 2026-02-23 09:59:15.787792593 +0000 UTC m=+0.086575956 container cleanup 09dad08e9c72d7d1b99bbbc519a732dc2042ec07079abeaba60f983f4d203137 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fd854bec-4386-47ab-bc93-a08354b81ab6, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Feb 23 09:59:15 np0005626463.localdomain systemd[1]: libpod-conmon-09dad08e9c72d7d1b99bbbc519a732dc2042ec07079abeaba60f983f4d203137.scope: Deactivated successfully.
Feb 23 09:59:15 np0005626463.localdomain podman[317104]: 2026-02-23 09:59:15.830854983 +0000 UTC m=+0.122507925 container remove 09dad08e9c72d7d1b99bbbc519a732dc2042ec07079abeaba60f983f4d203137 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fd854bec-4386-47ab-bc93-a08354b81ab6, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, tcib_managed=true, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS)
Feb 23 09:59:15 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:59:15.854 265541 INFO neutron.agent.dhcp.agent [None req-43cafa2d-f814-489f-ac45-1179456daee8 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 23 09:59:15 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-4b4d99b1f623479e08c1ba172833c2002e032a9da0c51ca920aa6a9e5f04cbc7-merged.mount: Deactivated successfully.
Feb 23 09:59:15 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-09dad08e9c72d7d1b99bbbc519a732dc2042ec07079abeaba60f983f4d203137-userdata-shm.mount: Deactivated successfully.
Feb 23 09:59:15 np0005626463.localdomain systemd[1]: run-netns-qdhcp\x2dfd854bec\x2d4386\x2d47ab\x2dbc93\x2da08354b81ab6.mount: Deactivated successfully.
Feb 23 09:59:16 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:59:16.021 265541 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Feb 23 09:59:16 np0005626463.localdomain ceph-mon[294160]: pgmap v312: 177 pgs: 177 active+clean; 238 MiB data, 972 MiB used, 41 GiB / 42 GiB avail; 3.7 MiB/s rd, 3.5 MiB/s wr, 187 op/s
Feb 23 09:59:16 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:59:16.560 2 INFO neutron.agent.securitygroups_rpc [None req-b25a55ad-3164-4d9c-85c0-7cab33b9b16d f49fd8b6937445efab40892d03b375d7 0421515e6bb54dea8db3ed218999e195 - - default default] Security group rule updated ['c46df023-9a3e-4c54-a0bb-44b675220af4']
Feb 23 09:59:16 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:59:16.816 2 INFO neutron.agent.securitygroups_rpc [None req-e4424286-7161-4a51-a79b-4dabbb149f4e f49fd8b6937445efab40892d03b375d7 0421515e6bb54dea8db3ed218999e195 - - default default] Security group rule updated ['c46df023-9a3e-4c54-a0bb-44b675220af4']
Feb 23 09:59:17 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:59:17 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e150 do_prune osdmap full prune enabled
Feb 23 09:59:17 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e151 e151: 6 total, 6 up, 6 in
Feb 23 09:59:17 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e151: 6 total, 6 up, 6 in
Feb 23 09:59:17 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.
Feb 23 09:59:17 np0005626463.localdomain podman[317133]: 2026-02-23 09:59:17.905556992 +0000 UTC m=+0.077989781 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.43.0, managed_by=edpm_ansible, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true)
Feb 23 09:59:17 np0005626463.localdomain podman[317133]: 2026-02-23 09:59:17.939292254 +0000 UTC m=+0.111725043 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_managed=true, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=8419493e1fd846703d277695e03fc5eb)
Feb 23 09:59:17 np0005626463.localdomain systemd[1]: 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully.
Feb 23 09:59:18 np0005626463.localdomain ceph-mon[294160]: osdmap e151: 6 total, 6 up, 6 in
Feb 23 09:59:18 np0005626463.localdomain ceph-mon[294160]: pgmap v314: 177 pgs: 177 active+clean; 145 MiB data, 823 MiB used, 41 GiB / 42 GiB avail; 1.6 MiB/s rd, 478 KiB/s wr, 207 op/s
Feb 23 09:59:18 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:59:18.328 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:59:19 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:59:19.271 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4b:79:d0 2001:db8::f816:3eff:fe4b:79d0'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe4b:79d0/64', 'neutron:device_id': 'ovnmeta-c73ae202-1b92-44f9-b55c-a6eba0a348b0', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c73ae202-1b92-44f9-b55c-a6eba0a348b0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5bed3d4ae9fd4fe7b9440b4587246c14', 'neutron:revision_number': '18', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b8f6a957-d664-4eea-abeb-3145dfe24d16, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=f9f417b1-97ff-4889-a004-ebc80afbc25a) old=Port_Binding(mac=['fa:16:3e:4b:79:d0 10.100.0.2 2001:db8::f816:3eff:fe4b:79d0'], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fe4b:79d0/64', 'neutron:device_id': 'ovnmeta-c73ae202-1b92-44f9-b55c-a6eba0a348b0', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c73ae202-1b92-44f9-b55c-a6eba0a348b0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5bed3d4ae9fd4fe7b9440b4587246c14', 'neutron:revision_number': '15', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 09:59:19 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:59:19.273 163572 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port f9f417b1-97ff-4889-a004-ebc80afbc25a in datapath c73ae202-1b92-44f9-b55c-a6eba0a348b0 updated
Feb 23 09:59:19 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:59:19.277 163572 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c73ae202-1b92-44f9-b55c-a6eba0a348b0, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 23 09:59:19 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:59:19.278 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[3be2bed1-9ca7-4263-b058-83e3eec06cbb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 09:59:20 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:59:20.246 2 INFO neutron.agent.securitygroups_rpc [None req-22dd3c2a-2988-4b13-8ce0-dbc57aa028bb 982b83c89c37422a910f5359ef7b6ea5 5bed3d4ae9fd4fe7b9440b4587246c14 - - default default] Security group member updated ['3f91b09d-b6ac-403f-adaf-7a684ee36fe5']
Feb 23 09:59:20 np0005626463.localdomain ceph-mon[294160]: pgmap v315: 177 pgs: 177 active+clean; 145 MiB data, 823 MiB used, 41 GiB / 42 GiB avail; 55 KiB/s rd, 4.6 KiB/s wr, 80 op/s
Feb 23 09:59:21 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:59:21.233 2 INFO neutron.agent.securitygroups_rpc [None req-f686a950-083b-4059-aa56-fe5b5d1b4f8c 982b83c89c37422a910f5359ef7b6ea5 5bed3d4ae9fd4fe7b9440b4587246c14 - - default default] Security group member updated ['3f91b09d-b6ac-403f-adaf-7a684ee36fe5']
Feb 23 09:59:21 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/3964395073' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 23 09:59:21 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/3964395073' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 23 09:59:21 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/2346079799' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 23 09:59:21 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/2346079799' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 23 09:59:21 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/2453675004' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 23 09:59:21 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/2453675004' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 23 09:59:21 np0005626463.localdomain sudo[317152]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 09:59:21 np0005626463.localdomain sudo[317152]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:59:21 np0005626463.localdomain sudo[317152]: pam_unix(sudo:session): session closed for user root
Feb 23 09:59:21 np0005626463.localdomain sudo[317170]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 23 09:59:21 np0005626463.localdomain sudo[317170]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:59:22 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:59:22 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e151 do_prune osdmap full prune enabled
Feb 23 09:59:22 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e152 e152: 6 total, 6 up, 6 in
Feb 23 09:59:22 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e152: 6 total, 6 up, 6 in
Feb 23 09:59:22 np0005626463.localdomain ceph-mon[294160]: pgmap v316: 177 pgs: 177 active+clean; 145 MiB data, 838 MiB used, 41 GiB / 42 GiB avail; 79 KiB/s rd, 6.6 KiB/s wr, 110 op/s
Feb 23 09:59:22 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.108:0/2984075853' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 09:59:22 np0005626463.localdomain ceph-mon[294160]: osdmap e152: 6 total, 6 up, 6 in
Feb 23 09:59:22 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/1542723496' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 23 09:59:22 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/1542723496' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 23 09:59:22 np0005626463.localdomain sudo[317170]: pam_unix(sudo:session): session closed for user root
Feb 23 09:59:22 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Feb 23 09:59:22 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:59:22 np0005626463.localdomain sudo[317219]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 09:59:22 np0005626463.localdomain sudo[317219]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 09:59:22 np0005626463.localdomain sudo[317219]: pam_unix(sudo:session): session closed for user root
Feb 23 09:59:23 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:59:23.054 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:59:23 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:59:23.055 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 23 09:59:23 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:59:23.056 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 23 09:59:23 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:59:23.232 265541 INFO neutron.agent.linux.ip_lib [None req-3daf7321-3b3e-4175-8d98-c073a2c48a90 - - - - - -] Device tap7d2d4ed7-8f cannot be used as it has no MAC address
Feb 23 09:59:23 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:59:23.258 2 INFO neutron.agent.securitygroups_rpc [req-d0cb4007-c7bf-4f23-9a12-bffb679ca45d req-9d22fdae-ec49-499d-83f5-abbd29e1424d f49fd8b6937445efab40892d03b375d7 0421515e6bb54dea8db3ed218999e195 - - default default] Security group member updated ['c46df023-9a3e-4c54-a0bb-44b675220af4']
Feb 23 09:59:23 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:59:23.293 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:59:23 np0005626463.localdomain kernel: device tap7d2d4ed7-8f entered promiscuous mode
Feb 23 09:59:23 np0005626463.localdomain NetworkManager[5974]: <info>  [1771840763.3025] manager: (tap7d2d4ed7-8f): new Generic device (/org/freedesktop/NetworkManager/Devices/46)
Feb 23 09:59:23 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:59:23.303 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:59:23 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:59:23Z|00279|binding|INFO|Claiming lport 7d2d4ed7-8f1a-44e5-99e2-954f427618a6 for this chassis.
Feb 23 09:59:23 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:59:23Z|00280|binding|INFO|7d2d4ed7-8f1a-44e5-99e2-954f427618a6: Claiming unknown
Feb 23 09:59:23 np0005626463.localdomain systemd-udevd[317247]: Network interface NamePolicy= disabled on kernel command line.
Feb 23 09:59:23 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:59:23.316 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005626463.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpfb23302c-55c1-5de0-badf-4fc1ff22837a-0c26ff8d-9894-4f75-a5d3-33bc934f6b99', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0c26ff8d-9894-4f75-a5d3-33bc934f6b99', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '20ee87502ddb4dde82419e1f4302f590', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=18c7d4ac-b41a-428e-8eb5-19914e19db45, chassis=[<ovs.db.idl.Row object at 0x7f808c075610>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f808c075610>], logical_port=7d2d4ed7-8f1a-44e5-99e2-954f427618a6) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 09:59:23 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:59:23.318 163572 INFO neutron.agent.ovn.metadata.agent [-] Port 7d2d4ed7-8f1a-44e5-99e2-954f427618a6 in datapath 0c26ff8d-9894-4f75-a5d3-33bc934f6b99 bound to our chassis
Feb 23 09:59:23 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:59:23.321 163572 DEBUG neutron.agent.ovn.metadata.agent [-] Port df6f2a94-b7ca-4c07-965a-6e1d3125157c IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Feb 23 09:59:23 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:59:23.321 163572 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 0c26ff8d-9894-4f75-a5d3-33bc934f6b99, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 23 09:59:23 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:59:23.323 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[8369fffa-d8e7-4de6-b04e-ed0f13a1caa2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 09:59:23 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:59:23.320 265541 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-23T09:59:22Z, description=, device_id=eb34b095-bb71-40c8-bef1-74bba1c6b6f7, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f282939cdc0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f282939c310>], id=76b4fad5-6bb5-46d7-9cb7-fb6f22f09785, ip_allocation=immediate, mac_address=fa:16:3e:df:92:41, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-23T09:58:30Z, description=, dns_domain=, id=ff7aa220-5765-44c6-9121-cfbd718241c5, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-VolumesBackupsTest-2004318845-network, port_security_enabled=True, project_id=0421515e6bb54dea8db3ed218999e195, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=34500, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2127, status=ACTIVE, subnets=['8e5f5052-1626-4168-ae6d-3107f2c16e7a'], tags=[], tenant_id=0421515e6bb54dea8db3ed218999e195, updated_at=2026-02-23T09:58:32Z, vlan_transparent=None, network_id=ff7aa220-5765-44c6-9121-cfbd718241c5, port_security_enabled=True, project_id=0421515e6bb54dea8db3ed218999e195, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['c46df023-9a3e-4c54-a0bb-44b675220af4'], standard_attr_id=2426, status=DOWN, tags=[], tenant_id=0421515e6bb54dea8db3ed218999e195, updated_at=2026-02-23T09:59:22Z on network ff7aa220-5765-44c6-9121-cfbd718241c5
Feb 23 09:59:23 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 09:59:23 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 23 09:59:23 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:59:23 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 23 09:59:23 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:59:23.333 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:59:23 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap7d2d4ed7-8f: No such device
Feb 23 09:59:23 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:59:23.338 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:59:23 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:59:23Z|00281|binding|INFO|Setting lport 7d2d4ed7-8f1a-44e5-99e2-954f427618a6 ovn-installed in OVS
Feb 23 09:59:23 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:59:23Z|00282|binding|INFO|Setting lport 7d2d4ed7-8f1a-44e5-99e2-954f427618a6 up in Southbound
Feb 23 09:59:23 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap7d2d4ed7-8f: No such device
Feb 23 09:59:23 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:59:23.344 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:59:23 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap7d2d4ed7-8f: No such device
Feb 23 09:59:23 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap7d2d4ed7-8f: No such device
Feb 23 09:59:23 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap7d2d4ed7-8f: No such device
Feb 23 09:59:23 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap7d2d4ed7-8f: No such device
Feb 23 09:59:23 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap7d2d4ed7-8f: No such device
Feb 23 09:59:23 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap7d2d4ed7-8f: No such device
Feb 23 09:59:23 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:59:23.387 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:59:23 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:59:23.398 2 INFO neutron.agent.securitygroups_rpc [None req-669db256-0e1b-436d-bf03-d63867ff4f10 446d22b7a9534ee1a139ffd6bef47f3c 582130ae966043d38e47148509dbe266 - - default default] Security group member updated ['ee3954e0-cd09-4323-ae3c-c3f1e63159bd']
Feb 23 09:59:23 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:59:23.403 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 23 09:59:23 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:59:23.404 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquired lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 23 09:59:23 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:59:23.405 282211 DEBUG nova.network.neutron [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 23 09:59:23 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:59:23.408 282211 DEBUG nova.objects.instance [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c2a7d92b-952f-46a7-8a6a-3322a48fcf4b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 23 09:59:23 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:59:23.423 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:59:23 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Feb 23 09:59:23 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1208066577' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 23 09:59:23 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Feb 23 09:59:23 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1208066577' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 23 09:59:23 np0005626463.localdomain systemd[1]: tmp-crun.OenR2H.mount: Deactivated successfully.
Feb 23 09:59:23 np0005626463.localdomain dnsmasq[316315]: read /var/lib/neutron/dhcp/ff7aa220-5765-44c6-9121-cfbd718241c5/addn_hosts - 2 addresses
Feb 23 09:59:23 np0005626463.localdomain dnsmasq-dhcp[316315]: read /var/lib/neutron/dhcp/ff7aa220-5765-44c6-9121-cfbd718241c5/host
Feb 23 09:59:23 np0005626463.localdomain dnsmasq-dhcp[316315]: read /var/lib/neutron/dhcp/ff7aa220-5765-44c6-9121-cfbd718241c5/opts
Feb 23 09:59:23 np0005626463.localdomain podman[317290]: 2026-02-23 09:59:23.556057211 +0000 UTC m=+0.066975625 container kill e66f962a68cc6113335b6f9dd3920b4164c15c9c93c83a854a3d14ce3266d7bf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ff7aa220-5765-44c6-9121-cfbd718241c5, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Feb 23 09:59:23 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:59:23.914 265541 INFO neutron.agent.dhcp.agent [None req-fe33f44d-bb0a-4789-9e57-6d493b09e65c - - - - - -] DHCP configuration for ports {'76b4fad5-6bb5-46d7-9cb7-fb6f22f09785'} is completed
Feb 23 09:59:24 np0005626463.localdomain ceph-mon[294160]: pgmap v318: 177 pgs: 177 active+clean; 145 MiB data, 835 MiB used, 41 GiB / 42 GiB avail; 107 KiB/s rd, 7.6 KiB/s wr, 151 op/s
Feb 23 09:59:24 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/1208066577' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 23 09:59:24 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/1208066577' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 23 09:59:24 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/423842423' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 23 09:59:24 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/423842423' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 23 09:59:24 np0005626463.localdomain podman[317354]: 
Feb 23 09:59:24 np0005626463.localdomain podman[317354]: 2026-02-23 09:59:24.409554766 +0000 UTC m=+0.093099136 container create c234c6af3cf9946dc051be1dcad89c50800ef76a490b597d3a6cad540a59a53e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0c26ff8d-9894-4f75-a5d3-33bc934f6b99, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.43.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.build-date=20260216, tcib_managed=true, org.label-schema.schema-version=1.0)
Feb 23 09:59:24 np0005626463.localdomain systemd[1]: Started libpod-conmon-c234c6af3cf9946dc051be1dcad89c50800ef76a490b597d3a6cad540a59a53e.scope.
Feb 23 09:59:24 np0005626463.localdomain systemd[1]: tmp-crun.qwq38d.mount: Deactivated successfully.
Feb 23 09:59:24 np0005626463.localdomain podman[317354]: 2026-02-23 09:59:24.365141091 +0000 UTC m=+0.048685451 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 23 09:59:24 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 09:59:24 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0aceec28e8080a4a37142df66778b1baec5d9c5e7130e92c6c16ab6cf85905ce/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 23 09:59:24 np0005626463.localdomain podman[317354]: 2026-02-23 09:59:24.49692725 +0000 UTC m=+0.180471600 container init c234c6af3cf9946dc051be1dcad89c50800ef76a490b597d3a6cad540a59a53e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0c26ff8d-9894-4f75-a5d3-33bc934f6b99, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216)
Feb 23 09:59:24 np0005626463.localdomain podman[317354]: 2026-02-23 09:59:24.506069327 +0000 UTC m=+0.189613657 container start c234c6af3cf9946dc051be1dcad89c50800ef76a490b597d3a6cad540a59a53e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0c26ff8d-9894-4f75-a5d3-33bc934f6b99, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0)
Feb 23 09:59:24 np0005626463.localdomain dnsmasq[317373]: started, version 2.85 cachesize 150
Feb 23 09:59:24 np0005626463.localdomain dnsmasq[317373]: DNS service limited to local subnets
Feb 23 09:59:24 np0005626463.localdomain dnsmasq[317373]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 23 09:59:24 np0005626463.localdomain dnsmasq[317373]: warning: no upstream servers configured
Feb 23 09:59:24 np0005626463.localdomain dnsmasq-dhcp[317373]: DHCP, static leases only on 10.100.0.0, lease time 1d
Feb 23 09:59:24 np0005626463.localdomain dnsmasq[317373]: read /var/lib/neutron/dhcp/0c26ff8d-9894-4f75-a5d3-33bc934f6b99/addn_hosts - 0 addresses
Feb 23 09:59:24 np0005626463.localdomain dnsmasq-dhcp[317373]: read /var/lib/neutron/dhcp/0c26ff8d-9894-4f75-a5d3-33bc934f6b99/host
Feb 23 09:59:24 np0005626463.localdomain dnsmasq-dhcp[317373]: read /var/lib/neutron/dhcp/0c26ff8d-9894-4f75-a5d3-33bc934f6b99/opts
Feb 23 09:59:24 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e152 do_prune osdmap full prune enabled
Feb 23 09:59:24 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e153 e153: 6 total, 6 up, 6 in
Feb 23 09:59:24 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e153: 6 total, 6 up, 6 in
Feb 23 09:59:24 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:59:24.715 282211 DEBUG nova.network.neutron [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Updating instance_info_cache with network_info: [{"id": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "address": "fa:16:3e:a0:9d:00", "network": {"id": "9da5b53d-3184-450f-9a5b-bdba1a6c9f6d", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "37b8098efb0d4ecc90b451a2db0e966f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa27e5011-20", "ovs_interfaceid": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 23 09:59:24 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:59:24.739 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Releasing lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 23 09:59:24 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:59:24.739 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 23 09:59:24 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:59:24.740 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:59:24 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:59:24.740 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 23 09:59:24 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:59:24.741 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:59:24 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:59:24.741 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Feb 23 09:59:24 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:59:24.816 265541 INFO neutron.agent.dhcp.agent [None req-f1954efa-b521-4467-86ff-026fae6d0d60 - - - - - -] DHCP configuration for ports {'e5d210d8-ac9c-41fd-8d79-4fc011a4e59f'} is completed
Feb 23 09:59:24 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:59:24.947 2 INFO neutron.agent.securitygroups_rpc [None req-f9eeb769-9fe9-4ea4-8c2c-b3370b8aaa5a 446d22b7a9534ee1a139ffd6bef47f3c 582130ae966043d38e47148509dbe266 - - default default] Security group member updated ['ee3954e0-cd09-4323-ae3c-c3f1e63159bd']
Feb 23 09:59:25 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:59:25.282 265541 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=np0005626466.localdomain, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-23T09:59:22Z, description=, device_id=eb34b095-bb71-40c8-bef1-74bba1c6b6f7, device_owner=compute:nova, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f2829309bb0>], dns_domain=, dns_name=tempest-volumesbackupstest-instance-2145282493, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f28291ecb50>], id=76b4fad5-6bb5-46d7-9cb7-fb6f22f09785, ip_allocation=immediate, mac_address=fa:16:3e:df:92:41, name=, network_id=ff7aa220-5765-44c6-9121-cfbd718241c5, port_security_enabled=True, project_id=0421515e6bb54dea8db3ed218999e195, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=2, security_groups=['c46df023-9a3e-4c54-a0bb-44b675220af4'], standard_attr_id=2426, status=DOWN, tags=[], tenant_id=0421515e6bb54dea8db3ed218999e195, updated_at=2026-02-23T09:59:24Z on network ff7aa220-5765-44c6-9121-cfbd718241c5
Feb 23 09:59:25 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.108:0/1847854992' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 09:59:25 np0005626463.localdomain ceph-mon[294160]: osdmap e153: 6 total, 6 up, 6 in
Feb 23 09:59:25 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.108:0/1024019890' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 09:59:25 np0005626463.localdomain dnsmasq[316315]: read /var/lib/neutron/dhcp/ff7aa220-5765-44c6-9121-cfbd718241c5/addn_hosts - 2 addresses
Feb 23 09:59:25 np0005626463.localdomain dnsmasq-dhcp[316315]: read /var/lib/neutron/dhcp/ff7aa220-5765-44c6-9121-cfbd718241c5/host
Feb 23 09:59:25 np0005626463.localdomain podman[317391]: 2026-02-23 09:59:25.49196777 +0000 UTC m=+0.058583751 container kill e66f962a68cc6113335b6f9dd3920b4164c15c9c93c83a854a3d14ce3266d7bf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ff7aa220-5765-44c6-9121-cfbd718241c5, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0)
Feb 23 09:59:25 np0005626463.localdomain dnsmasq-dhcp[316315]: read /var/lib/neutron/dhcp/ff7aa220-5765-44c6-9121-cfbd718241c5/opts
Feb 23 09:59:25 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Feb 23 09:59:25 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:59:25 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:59:25.745 265541 INFO neutron.agent.dhcp.agent [None req-fdb320db-3739-482e-9ba7-1e06f4355dcd - - - - - -] DHCP configuration for ports {'76b4fad5-6bb5-46d7-9cb7-fb6f22f09785'} is completed
Feb 23 09:59:25 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:59:25.779 2 INFO neutron.agent.securitygroups_rpc [None req-451b3e02-3a26-4970-9980-9e73ed6341a9 446d22b7a9534ee1a139ffd6bef47f3c 582130ae966043d38e47148509dbe266 - - default default] Security group member updated ['ee3954e0-cd09-4323-ae3c-c3f1e63159bd']
Feb 23 09:59:26 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:59:26.110 2 INFO neutron.agent.securitygroups_rpc [None req-69ced21c-6b6a-47d6-87bf-2537c99ace20 982b83c89c37422a910f5359ef7b6ea5 5bed3d4ae9fd4fe7b9440b4587246c14 - - default default] Security group member updated ['3f91b09d-b6ac-403f-adaf-7a684ee36fe5']
Feb 23 09:59:26 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:59:26.194 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:59:26 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:59:26.210 2 INFO neutron.agent.securitygroups_rpc [None req-12deaede-70c5-4b22-963c-9ca013b19b91 446d22b7a9534ee1a139ffd6bef47f3c 582130ae966043d38e47148509dbe266 - - default default] Security group member updated ['ee3954e0-cd09-4323-ae3c-c3f1e63159bd']
Feb 23 09:59:26 np0005626463.localdomain ceph-mon[294160]: pgmap v320: 177 pgs: 177 active+clean; 145 MiB data, 835 MiB used, 41 GiB / 42 GiB avail; 58 KiB/s rd, 3.5 KiB/s wr, 80 op/s
Feb 23 09:59:26 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.107:0/1603226869' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 09:59:26 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 09:59:27 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:59:27.096 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:59:27 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e153 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:59:27 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/2005301742' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 23 09:59:27 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/2005301742' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 23 09:59:27 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.107:0/3917936370' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 09:59:27 np0005626463.localdomain ceph-mon[294160]: pgmap v321: 177 pgs: 177 active+clean; 192 MiB data, 901 MiB used, 41 GiB / 42 GiB avail; 151 KiB/s rd, 2.7 MiB/s wr, 212 op/s
Feb 23 09:59:27 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:59:27.419 2 INFO neutron.agent.securitygroups_rpc [None req-3c7a21a7-3db2-45cc-9718-507bcfbefba1 982b83c89c37422a910f5359ef7b6ea5 5bed3d4ae9fd4fe7b9440b4587246c14 - - default default] Security group member updated ['3f91b09d-b6ac-403f-adaf-7a684ee36fe5']
Feb 23 09:59:27 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:59:27.684 2 INFO neutron.agent.securitygroups_rpc [None req-d9a86df9-d280-468f-9907-620800eed5df 446d22b7a9534ee1a139ffd6bef47f3c 582130ae966043d38e47148509dbe266 - - default default] Security group member updated ['ee3954e0-cd09-4323-ae3c-c3f1e63159bd']
Feb 23 09:59:28 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:59:28.054 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:59:28 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:59:28.055 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:59:28 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:59:28.334 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:59:28 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:59:28.342 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:59:28 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:59:28.394 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 09:59:28 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:59:28.395 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 09:59:28 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:59:28.395 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 09:59:28 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:59:28.396 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Auditing locally available compute resources for np0005626463.localdomain (node: np0005626463.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 23 09:59:28 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:59:28.396 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 09:59:28 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 23 09:59:28 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/427022608' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 09:59:28 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:59:28.782 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.385s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 09:59:28 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.
Feb 23 09:59:28 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:59:28.853 282211 DEBUG nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 23 09:59:28 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:59:28.853 282211 DEBUG nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 23 09:59:28 np0005626463.localdomain systemd[1]: tmp-crun.9T4ZDT.mount: Deactivated successfully.
Feb 23 09:59:28 np0005626463.localdomain podman[317434]: 2026-02-23 09:59:28.909345245 +0000 UTC m=+0.088728928 container health_status da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 23 09:59:28 np0005626463.localdomain podman[317434]: 2026-02-23 09:59:28.915584071 +0000 UTC m=+0.094967754 container exec_died da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 23 09:59:28 np0005626463.localdomain systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Deactivated successfully.
Feb 23 09:59:29 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:59:29.048 282211 WARNING nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 23 09:59:29 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:59:29.051 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Hypervisor/Node resource view: name=np0005626463.localdomain free_ram=11325MB free_disk=41.7744026184082GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 23 09:59:29 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:59:29.051 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 09:59:29 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:59:29.052 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 09:59:29 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.106:0/427022608' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 09:59:29 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.108:0/2230961610' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 09:59:29 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:59:29.348 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Instance c2a7d92b-952f-46a7-8a6a-3322a48fcf4b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 23 09:59:29 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:59:29.348 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 23 09:59:29 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:59:29.349 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Final resource view: name=np0005626463.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 23 09:59:29 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:59:29.411 282211 DEBUG nova.scheduler.client.report [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Refreshing inventories for resource provider be63d86c-a403-4ec9-a515-07ea2962cb4d _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Feb 23 09:59:29 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:59:29.470 282211 DEBUG nova.scheduler.client.report [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Updating ProviderTree inventory for provider be63d86c-a403-4ec9-a515-07ea2962cb4d from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Feb 23 09:59:29 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:59:29.471 282211 DEBUG nova.compute.provider_tree [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Updating inventory in ProviderTree for provider be63d86c-a403-4ec9-a515-07ea2962cb4d with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Feb 23 09:59:29 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:59:29.494 282211 DEBUG nova.scheduler.client.report [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Refreshing aggregate associations for resource provider be63d86c-a403-4ec9-a515-07ea2962cb4d, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Feb 23 09:59:29 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:59:29.518 282211 DEBUG nova.scheduler.client.report [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Refreshing trait associations for resource provider be63d86c-a403-4ec9-a515-07ea2962cb4d, traits: HW_CPU_X86_AVX2,COMPUTE_ACCELERATORS,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_MMX,HW_CPU_X86_SSE41,HW_CPU_X86_SSE2,HW_CPU_X86_SVM,COMPUTE_TRUSTED_CERTS,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_SECURITY_TPM_2_0,COMPUTE_STORAGE_BUS_FDC,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_E1000,HW_CPU_X86_BMI2,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_RESCUE_BFV,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_CLMUL,HW_CPU_X86_SHA,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_AESNI,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_DEVICE_TAGGING,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_ABM,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NODE,COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_CIRRUS,HW_CPU_X86_SSE4A,HW_CPU_X86_BMI,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_F16C,COMPUTE_STORAGE_BUS_IDE,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_SSSE3,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_FMA3,HW_CPU_X86_AMD_SVM,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_AVX,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_SSE42 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Feb 23 09:59:29 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:59:29.563 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 09:59:29 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:59:29.758 2 INFO neutron.agent.securitygroups_rpc [None req-79540abb-7fc2-40cd-b9ab-4b003306a8d3 446d22b7a9534ee1a139ffd6bef47f3c 582130ae966043d38e47148509dbe266 - - default default] Security group member updated ['ee3954e0-cd09-4323-ae3c-c3f1e63159bd']
Feb 23 09:59:29 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.
Feb 23 09:59:29 np0005626463.localdomain podman[317476]: 2026-02-23 09:59:29.910181408 +0000 UTC m=+0.088356096 container health_status 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, build-date=2026-02-05T04:57:10Z, org.opencontainers.image.created=2026-02-05T04:57:10Z, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.7, managed_by=edpm_ansible, name=ubi9/ubi-minimal, container_name=openstack_network_exporter, distribution-scope=public, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1770267347, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=openstack_network_exporter, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vendor=Red Hat, Inc., io.buildah.version=1.33.7, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Feb 23 09:59:29 np0005626463.localdomain podman[317476]: 2026-02-23 09:59:29.947344024 +0000 UTC m=+0.125518692 container exec_died 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, release=1770267347, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, build-date=2026-02-05T04:57:10Z, name=ubi9/ubi-minimal, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., org.opencontainers.image.created=2026-02-05T04:57:10Z, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, config_id=openstack_network_exporter, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.buildah.version=1.33.7, version=9.7)
Feb 23 09:59:29 np0005626463.localdomain systemd[1]: 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.service: Deactivated successfully.
Feb 23 09:59:30 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 23 09:59:30 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/1605527560' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 09:59:30 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:59:30.068 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.505s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 09:59:30 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:59:30.075 282211 DEBUG nova.compute.provider_tree [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Inventory has not changed in ProviderTree for provider: be63d86c-a403-4ec9-a515-07ea2962cb4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 23 09:59:30 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:59:30.090 282211 DEBUG nova.scheduler.client.report [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Inventory has not changed for provider be63d86c-a403-4ec9-a515-07ea2962cb4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 23 09:59:30 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:59:30.093 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Compute_service record updated for np0005626463.localdomain:np0005626463.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 23 09:59:30 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:59:30.093 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.042s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 09:59:30 np0005626463.localdomain ceph-mon[294160]: pgmap v322: 177 pgs: 177 active+clean; 192 MiB data, 901 MiB used, 41 GiB / 42 GiB avail; 122 KiB/s rd, 2.7 MiB/s wr, 172 op/s
Feb 23 09:59:30 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.108:0/2149651166' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 09:59:30 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.106:0/1605527560' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 09:59:31 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:59:31.090 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:59:31 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:59:31.091 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:59:31 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:59:31.264 2 INFO neutron.agent.securitygroups_rpc [None req-5bebbbf2-890e-4b6f-92fc-689753c8df12 446d22b7a9534ee1a139ffd6bef47f3c 582130ae966043d38e47148509dbe266 - - default default] Security group member updated ['ee3954e0-cd09-4323-ae3c-c3f1e63159bd']
Feb 23 09:59:31 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:59:31.363 2 INFO neutron.agent.securitygroups_rpc [None req-6d65bfbe-b42e-4b67-8b3f-02498c056686 982b83c89c37422a910f5359ef7b6ea5 5bed3d4ae9fd4fe7b9440b4587246c14 - - default default] Security group member updated ['3f91b09d-b6ac-403f-adaf-7a684ee36fe5']
Feb 23 09:59:32 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:59:32.055 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:59:32 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e153 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:59:32 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e153 do_prune osdmap full prune enabled
Feb 23 09:59:32 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e154 e154: 6 total, 6 up, 6 in
Feb 23 09:59:32 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e154: 6 total, 6 up, 6 in
Feb 23 09:59:32 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:59:32.242 2 INFO neutron.agent.securitygroups_rpc [None req-b9f58e08-6424-4ab4-86b8-227dc5c4bdfc 446d22b7a9534ee1a139ffd6bef47f3c 582130ae966043d38e47148509dbe266 - - default default] Security group member updated ['ee3954e0-cd09-4323-ae3c-c3f1e63159bd']
Feb 23 09:59:32 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:59:32.259 2 INFO neutron.agent.securitygroups_rpc [None req-9acdbf04-3277-47bc-8ea9-7f279ed4a9a4 982b83c89c37422a910f5359ef7b6ea5 5bed3d4ae9fd4fe7b9440b4587246c14 - - default default] Security group member updated ['3f91b09d-b6ac-403f-adaf-7a684ee36fe5']
Feb 23 09:59:32 np0005626463.localdomain ceph-mon[294160]: pgmap v323: 177 pgs: 177 active+clean; 192 MiB data, 901 MiB used, 41 GiB / 42 GiB avail; 86 KiB/s rd, 2.3 MiB/s wr, 122 op/s
Feb 23 09:59:32 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/2785896045' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 23 09:59:32 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/2785896045' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 23 09:59:32 np0005626463.localdomain ceph-mon[294160]: osdmap e154: 6 total, 6 up, 6 in
Feb 23 09:59:32 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:59:32.337 265541 INFO neutron.agent.linux.ip_lib [None req-ce05a959-3235-471f-bc6b-decd45cf34af - - - - - -] Device tapeff7547d-16 cannot be used as it has no MAC address
Feb 23 09:59:32 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:59:32.358 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=18, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '22:68:bc', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'c6:19:65:94:49:af'}, ipsec=False) old=SB_Global(nb_cfg=17) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 09:59:32 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:59:32.359 163572 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 23 09:59:32 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:59:32.418 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:59:32 np0005626463.localdomain kernel: device tapeff7547d-16 entered promiscuous mode
Feb 23 09:59:32 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:59:32Z|00283|binding|INFO|Claiming lport eff7547d-1684-4a00-829e-9369e5af1a4c for this chassis.
Feb 23 09:59:32 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:59:32Z|00284|binding|INFO|eff7547d-1684-4a00-829e-9369e5af1a4c: Claiming unknown
Feb 23 09:59:32 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:59:32.426 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:59:32 np0005626463.localdomain NetworkManager[5974]: <info>  [1771840772.4282] manager: (tapeff7547d-16): new Generic device (/org/freedesktop/NetworkManager/Devices/47)
Feb 23 09:59:32 np0005626463.localdomain systemd-udevd[317508]: Network interface NamePolicy= disabled on kernel command line.
Feb 23 09:59:32 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:59:32.437 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005626463.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpfb23302c-55c1-5de0-badf-4fc1ff22837a-d9ae15ed-aa9d-4bce-9192-334c9725a10c', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d9ae15ed-aa9d-4bce-9192-334c9725a10c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '20ee87502ddb4dde82419e1f4302f590', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=891eac08-7b13-4a8b-bfb9-8821cd51b516, chassis=[<ovs.db.idl.Row object at 0x7f808c075610>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f808c075610>], logical_port=eff7547d-1684-4a00-829e-9369e5af1a4c) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 09:59:32 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:59:32.439 163572 INFO neutron.agent.ovn.metadata.agent [-] Port eff7547d-1684-4a00-829e-9369e5af1a4c in datapath d9ae15ed-aa9d-4bce-9192-334c9725a10c bound to our chassis
Feb 23 09:59:32 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:59:32.441 163572 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network d9ae15ed-aa9d-4bce-9192-334c9725a10c or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Feb 23 09:59:32 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:59:32.442 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[5e81222e-caeb-4eb5-b178-435743de9a4d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 09:59:32 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tapeff7547d-16: No such device
Feb 23 09:59:32 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tapeff7547d-16: No such device
Feb 23 09:59:32 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:59:32Z|00285|binding|INFO|Setting lport eff7547d-1684-4a00-829e-9369e5af1a4c ovn-installed in OVS
Feb 23 09:59:32 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:59:32Z|00286|binding|INFO|Setting lport eff7547d-1684-4a00-829e-9369e5af1a4c up in Southbound
Feb 23 09:59:32 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:59:32.467 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:59:32 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tapeff7547d-16: No such device
Feb 23 09:59:32 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tapeff7547d-16: No such device
Feb 23 09:59:32 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tapeff7547d-16: No such device
Feb 23 09:59:32 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tapeff7547d-16: No such device
Feb 23 09:59:32 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tapeff7547d-16: No such device
Feb 23 09:59:32 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tapeff7547d-16: No such device
Feb 23 09:59:32 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:59:32.507 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:59:32 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:59:32.536 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:59:33 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:59:33.051 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:59:33 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:59:33.118 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:59:33 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:59:33.336 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:59:33 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:59:33.347 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:59:33 np0005626463.localdomain podman[317578]: 2026-02-23 09:59:33.429837607 +0000 UTC m=+0.102735718 container create 1f2b0c282578fa14ce7f3234a36f89894c3bfd2be0e1124588d44437a5f72242 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d9ae15ed-aa9d-4bce-9192-334c9725a10c, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20260216)
Feb 23 09:59:33 np0005626463.localdomain podman[317578]: 2026-02-23 09:59:33.379509156 +0000 UTC m=+0.052407317 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 23 09:59:33 np0005626463.localdomain systemd[1]: Started libpod-conmon-1f2b0c282578fa14ce7f3234a36f89894c3bfd2be0e1124588d44437a5f72242.scope.
Feb 23 09:59:33 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 09:59:33 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/182134e9323da2a38d8aba6d82a6ee7725f82a27095416b853817f55614cd542/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 23 09:59:33 np0005626463.localdomain podman[317578]: 2026-02-23 09:59:33.512726239 +0000 UTC m=+0.185624350 container init 1f2b0c282578fa14ce7f3234a36f89894c3bfd2be0e1124588d44437a5f72242 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d9ae15ed-aa9d-4bce-9192-334c9725a10c, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20260216)
Feb 23 09:59:33 np0005626463.localdomain podman[317578]: 2026-02-23 09:59:33.522352012 +0000 UTC m=+0.195250123 container start 1f2b0c282578fa14ce7f3234a36f89894c3bfd2be0e1124588d44437a5f72242 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d9ae15ed-aa9d-4bce-9192-334c9725a10c, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0)
Feb 23 09:59:33 np0005626463.localdomain dnsmasq[317597]: started, version 2.85 cachesize 150
Feb 23 09:59:33 np0005626463.localdomain dnsmasq[317597]: DNS service limited to local subnets
Feb 23 09:59:33 np0005626463.localdomain dnsmasq[317597]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 23 09:59:33 np0005626463.localdomain dnsmasq[317597]: warning: no upstream servers configured
Feb 23 09:59:33 np0005626463.localdomain dnsmasq-dhcp[317597]: DHCP, static leases only on 10.100.0.0, lease time 1d
Feb 23 09:59:33 np0005626463.localdomain dnsmasq[317597]: read /var/lib/neutron/dhcp/d9ae15ed-aa9d-4bce-9192-334c9725a10c/addn_hosts - 0 addresses
Feb 23 09:59:33 np0005626463.localdomain dnsmasq-dhcp[317597]: read /var/lib/neutron/dhcp/d9ae15ed-aa9d-4bce-9192-334c9725a10c/host
Feb 23 09:59:33 np0005626463.localdomain dnsmasq-dhcp[317597]: read /var/lib/neutron/dhcp/d9ae15ed-aa9d-4bce-9192-334c9725a10c/opts
Feb 23 09:59:33 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:59:33.656 265541 INFO neutron.agent.dhcp.agent [None req-e5840c49-f204-4b66-a790-f8a94ecb8e75 - - - - - -] DHCP configuration for ports {'c33a8bfd-5ed5-47c7-86ae-d437d28631db'} is completed
Feb 23 09:59:33 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:59:33.785 2 INFO neutron.agent.securitygroups_rpc [None req-fd6425dd-2055-4cbb-a840-beb509d9498c 446d22b7a9534ee1a139ffd6bef47f3c 582130ae966043d38e47148509dbe266 - - default default] Security group member updated ['ee3954e0-cd09-4323-ae3c-c3f1e63159bd']
Feb 23 09:59:33 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:59:33.918 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:59:34 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e154 do_prune osdmap full prune enabled
Feb 23 09:59:34 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e155 e155: 6 total, 6 up, 6 in
Feb 23 09:59:34 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e155: 6 total, 6 up, 6 in
Feb 23 09:59:34 np0005626463.localdomain ceph-mon[294160]: pgmap v325: 177 pgs: 177 active+clean; 192 MiB data, 901 MiB used, 41 GiB / 42 GiB avail; 110 KiB/s rd, 2.5 MiB/s wr, 154 op/s
Feb 23 09:59:35 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:59:35.076 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:59:35 np0005626463.localdomain ceph-mon[294160]: osdmap e155: 6 total, 6 up, 6 in
Feb 23 09:59:35 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:59:35.953 2 INFO neutron.agent.securitygroups_rpc [None req-98b565c4-bd61-4452-bccd-bd5e5d274484 446d22b7a9534ee1a139ffd6bef47f3c 582130ae966043d38e47148509dbe266 - - default default] Security group member updated ['ee3954e0-cd09-4323-ae3c-c3f1e63159bd']
Feb 23 09:59:36 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:59:36.038 2 INFO neutron.agent.securitygroups_rpc [None req-1e0c6ce2-0fa1-4c95-beb4-0c824ad3b485 982b83c89c37422a910f5359ef7b6ea5 5bed3d4ae9fd4fe7b9440b4587246c14 - - default default] Security group member updated ['3f91b09d-b6ac-403f-adaf-7a684ee36fe5']
Feb 23 09:59:36 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e155 do_prune osdmap full prune enabled
Feb 23 09:59:36 np0005626463.localdomain ceph-mon[294160]: pgmap v327: 177 pgs: 177 active+clean; 192 MiB data, 901 MiB used, 41 GiB / 42 GiB avail; 25 KiB/s rd, 22 KiB/s wr, 33 op/s
Feb 23 09:59:36 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e156 e156: 6 total, 6 up, 6 in
Feb 23 09:59:36 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e156: 6 total, 6 up, 6 in
Feb 23 09:59:36 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:59:36.627 2 INFO neutron.agent.securitygroups_rpc [None req-8ded7049-1595-4d50-af9e-057a8bcc7a90 982b83c89c37422a910f5359ef7b6ea5 5bed3d4ae9fd4fe7b9440b4587246c14 - - default default] Security group member updated ['3f91b09d-b6ac-403f-adaf-7a684ee36fe5']
Feb 23 09:59:36 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:59:36.810 2 INFO neutron.agent.securitygroups_rpc [None req-0d2af92f-c1c3-481d-a685-438d421cfdf3 446d22b7a9534ee1a139ffd6bef47f3c 582130ae966043d38e47148509dbe266 - - default default] Security group member updated ['ee3954e0-cd09-4323-ae3c-c3f1e63159bd']
Feb 23 09:59:37 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e156 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:59:37 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #49. Immutable memtables: 0.
Feb 23 09:59:37 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:59:37.121083) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Feb 23 09:59:37 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/flush_job.cc:856] [default] [JOB 27] Flushing memtable with next log file: 49
Feb 23 09:59:37 np0005626463.localdomain ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840777121148, "job": 27, "event": "flush_started", "num_memtables": 1, "num_entries": 2135, "num_deletes": 263, "total_data_size": 2136994, "memory_usage": 2193680, "flush_reason": "Manual Compaction"}
Feb 23 09:59:37 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/flush_job.cc:885] [default] [JOB 27] Level-0 flush table #50: started
Feb 23 09:59:37 np0005626463.localdomain ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840777134616, "cf_name": "default", "job": 27, "event": "table_file_creation", "file_number": 50, "file_size": 2069328, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 27575, "largest_seqno": 29709, "table_properties": {"data_size": 2060472, "index_size": 5429, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2373, "raw_key_size": 19826, "raw_average_key_size": 20, "raw_value_size": 2041984, "raw_average_value_size": 2158, "num_data_blocks": 236, "num_entries": 946, "num_filter_entries": 946, "num_deletions": 263, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771840639, "oldest_key_time": 1771840639, "file_creation_time": 1771840777, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4cfd6c8f-aafa-4003-b2f6-d22c49635dd4", "db_session_id": "66DAQ76CBLV8DSGL8JC7", "orig_file_number": 50, "seqno_to_time_mapping": "N/A"}}
Feb 23 09:59:37 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 27] Flush lasted 13575 microseconds, and 5564 cpu microseconds.
Feb 23 09:59:37 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 23 09:59:37 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:59:37.134666) [db/flush_job.cc:967] [default] [JOB 27] Level-0 flush table #50: 2069328 bytes OK
Feb 23 09:59:37 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:59:37.134687) [db/memtable_list.cc:519] [default] Level-0 commit table #50 started
Feb 23 09:59:37 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:59:37.137076) [db/memtable_list.cc:722] [default] Level-0 commit table #50: memtable #1 done
Feb 23 09:59:37 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:59:37.137096) EVENT_LOG_v1 {"time_micros": 1771840777137090, "job": 27, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Feb 23 09:59:37 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:59:37.137120) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Feb 23 09:59:37 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 27] Try to delete WAL files size 2127774, prev total WAL file size 2128379, number of live WAL files 2.
Feb 23 09:59:37 np0005626463.localdomain ceph-mon[294160]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005626463/store.db/000046.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 23 09:59:37 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:59:37.137924) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0034303139' seq:72057594037927935, type:22 .. '6C6F676D0034323731' seq:0, type:0; will stop at (end)
Feb 23 09:59:37 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 28] Compacting 1@0 + 1@6 files to L6, score -1.00
Feb 23 09:59:37 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 27 Base level 0, inputs: [50(2020KB)], [48(16MB)]
Feb 23 09:59:37 np0005626463.localdomain ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840777137984, "job": 28, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [50], "files_L6": [48], "score": -1, "input_data_size": 18874849, "oldest_snapshot_seqno": -1}
Feb 23 09:59:37 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 28] Generated table #51: 12994 keys, 18375915 bytes, temperature: kUnknown
Feb 23 09:59:37 np0005626463.localdomain ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840777241501, "cf_name": "default", "job": 28, "event": "table_file_creation", "file_number": 51, "file_size": 18375915, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 18298438, "index_size": 43904, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 32517, "raw_key_size": 347070, "raw_average_key_size": 26, "raw_value_size": 18073886, "raw_average_value_size": 1390, "num_data_blocks": 1680, "num_entries": 12994, "num_filter_entries": 12994, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771839971, "oldest_key_time": 0, "file_creation_time": 1771840777, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4cfd6c8f-aafa-4003-b2f6-d22c49635dd4", "db_session_id": "66DAQ76CBLV8DSGL8JC7", "orig_file_number": 51, "seqno_to_time_mapping": "N/A"}}
Feb 23 09:59:37 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 23 09:59:37 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:59:37.241826) [db/compaction/compaction_job.cc:1663] [default] [JOB 28] Compacted 1@0 + 1@6 files to L6 => 18375915 bytes
Feb 23 09:59:37 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:59:37.243971) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 182.2 rd, 177.4 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.0, 16.0 +0.0 blob) out(17.5 +0.0 blob), read-write-amplify(18.0) write-amplify(8.9) OK, records in: 13535, records dropped: 541 output_compression: NoCompression
Feb 23 09:59:37 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:59:37.243998) EVENT_LOG_v1 {"time_micros": 1771840777243986, "job": 28, "event": "compaction_finished", "compaction_time_micros": 103592, "compaction_time_cpu_micros": 53826, "output_level": 6, "num_output_files": 1, "total_output_size": 18375915, "num_input_records": 13535, "num_output_records": 12994, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Feb 23 09:59:37 np0005626463.localdomain ceph-mon[294160]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005626463/store.db/000050.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 23 09:59:37 np0005626463.localdomain ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840777244801, "job": 28, "event": "table_file_deletion", "file_number": 50}
Feb 23 09:59:37 np0005626463.localdomain ceph-mon[294160]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005626463/store.db/000048.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 23 09:59:37 np0005626463.localdomain ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840777248215, "job": 28, "event": "table_file_deletion", "file_number": 48}
Feb 23 09:59:37 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:59:37.137781) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 23 09:59:37 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:59:37.248287) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 23 09:59:37 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:59:37.248296) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 23 09:59:37 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:59:37.248299) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 23 09:59:37 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:59:37.248302) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 23 09:59:37 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:59:37.248307) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 23 09:59:37 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e156 do_prune osdmap full prune enabled
Feb 23 09:59:37 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e157 e157: 6 total, 6 up, 6 in
Feb 23 09:59:37 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e157: 6 total, 6 up, 6 in
Feb 23 09:59:37 np0005626463.localdomain ceph-mon[294160]: osdmap e156: 6 total, 6 up, 6 in
Feb 23 09:59:37 np0005626463.localdomain ceph-mon[294160]: pgmap v329: 177 pgs: 177 active+clean; 192 MiB data, 902 MiB used, 41 GiB / 42 GiB avail; 3.9 MiB/s rd, 33 KiB/s wr, 197 op/s
Feb 23 09:59:37 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:59:37.957 2 INFO neutron.agent.securitygroups_rpc [None req-4c34a092-1bee-4730-b083-a159f9af8bdd 446d22b7a9534ee1a139ffd6bef47f3c 582130ae966043d38e47148509dbe266 - - default default] Security group member updated ['ee3954e0-cd09-4323-ae3c-c3f1e63159bd']
Feb 23 09:59:38 np0005626463.localdomain sshd[317598]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 09:59:38 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:59:38.055 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:59:38 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:59:38.055 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Feb 23 09:59:38 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:59:38.079 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Feb 23 09:59:38 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:59:38.368 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:59:38 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:59:38.370 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:59:38 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e157 do_prune osdmap full prune enabled
Feb 23 09:59:38 np0005626463.localdomain ceph-mon[294160]: osdmap e157: 6 total, 6 up, 6 in
Feb 23 09:59:38 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e158 e158: 6 total, 6 up, 6 in
Feb 23 09:59:38 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e158: 6 total, 6 up, 6 in
Feb 23 09:59:38 np0005626463.localdomain sshd[317598]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 09:59:38 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:59:38.598 2 INFO neutron.agent.securitygroups_rpc [None req-32c78c98-6e6f-4ee3-a8c4-559f4243f6cc 982b83c89c37422a910f5359ef7b6ea5 5bed3d4ae9fd4fe7b9440b4587246c14 - - default default] Security group member updated ['3f91b09d-b6ac-403f-adaf-7a684ee36fe5']
Feb 23 09:59:38 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:59:38.977 2 INFO neutron.agent.securitygroups_rpc [None req-6b31b9c1-f7cb-4728-a77a-3dbb7699a58d 982b83c89c37422a910f5359ef7b6ea5 5bed3d4ae9fd4fe7b9440b4587246c14 - - default default] Security group member updated ['3f91b09d-b6ac-403f-adaf-7a684ee36fe5']
Feb 23 09:59:39 np0005626463.localdomain podman[242954]: time="2026-02-23T09:59:39Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 23 09:59:39 np0005626463.localdomain podman[242954]: @ - - [23/Feb/2026:09:59:39 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 162552 "" "Go-http-client/1.1"
Feb 23 09:59:39 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e158 do_prune osdmap full prune enabled
Feb 23 09:59:39 np0005626463.localdomain podman[242954]: @ - - [23/Feb/2026:09:59:39 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 20248 "" "Go-http-client/1.1"
Feb 23 09:59:39 np0005626463.localdomain ceph-mon[294160]: osdmap e158: 6 total, 6 up, 6 in
Feb 23 09:59:39 np0005626463.localdomain ceph-mon[294160]: pgmap v332: 177 pgs: 177 active+clean; 192 MiB data, 902 MiB used, 41 GiB / 42 GiB avail; 4.7 MiB/s rd, 4.5 KiB/s wr, 197 op/s
Feb 23 09:59:39 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e159 e159: 6 total, 6 up, 6 in
Feb 23 09:59:39 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e159: 6 total, 6 up, 6 in
Feb 23 09:59:40 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:59:40.362 163572 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=96b5bb93-7341-4ce6-9b93-6a5de566c711, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '18'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 09:59:40 np0005626463.localdomain ceph-mon[294160]: osdmap e159: 6 total, 6 up, 6 in
Feb 23 09:59:40 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.
Feb 23 09:59:40 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.
Feb 23 09:59:40 np0005626463.localdomain systemd[1]: tmp-crun.HSRSWQ.mount: Deactivated successfully.
Feb 23 09:59:40 np0005626463.localdomain podman[317601]: 2026-02-23 09:59:40.97438535 +0000 UTC m=+0.140841645 container health_status bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 23 09:59:40 np0005626463.localdomain podman[317601]: 2026-02-23 09:59:40.982376222 +0000 UTC m=+0.148832537 container exec_died bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Feb 23 09:59:40 np0005626463.localdomain systemd[1]: bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.service: Deactivated successfully.
Feb 23 09:59:41 np0005626463.localdomain podman[317600]: 2026-02-23 09:59:40.938181683 +0000 UTC m=+0.109043655 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.license=GPLv2, config_id=ovn_controller, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260216, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 23 09:59:41 np0005626463.localdomain podman[317600]: 2026-02-23 09:59:41.06800895 +0000 UTC m=+0.238870992 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.43.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260216)
Feb 23 09:59:41 np0005626463.localdomain systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully.
Feb 23 09:59:41 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e159 do_prune osdmap full prune enabled
Feb 23 09:59:41 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e160 e160: 6 total, 6 up, 6 in
Feb 23 09:59:41 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e160: 6 total, 6 up, 6 in
Feb 23 09:59:41 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/498547519' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 23 09:59:41 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/498547519' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 23 09:59:41 np0005626463.localdomain ceph-mon[294160]: pgmap v334: 177 pgs: 177 active+clean; 216 MiB data, 990 MiB used, 41 GiB / 42 GiB avail; 5.5 MiB/s rd, 5.0 MiB/s wr, 341 op/s
Feb 23 09:59:41 np0005626463.localdomain systemd[1]: tmp-crun.5SI1FK.mount: Deactivated successfully.
Feb 23 09:59:42 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e160 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:59:42 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:59:42.410 2 INFO neutron.agent.securitygroups_rpc [None req-ea1ece96-8ed7-4628-9eb3-8fcb879af238 982b83c89c37422a910f5359ef7b6ea5 5bed3d4ae9fd4fe7b9440b4587246c14 - - default default] Security group member updated ['3f91b09d-b6ac-403f-adaf-7a684ee36fe5']
Feb 23 09:59:42 np0005626463.localdomain ceph-mon[294160]: osdmap e160: 6 total, 6 up, 6 in
Feb 23 09:59:43 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:59:43.371 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:59:43 np0005626463.localdomain openstack_network_exporter[245358]: ERROR   09:59:43 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 23 09:59:43 np0005626463.localdomain openstack_network_exporter[245358]: 
Feb 23 09:59:43 np0005626463.localdomain openstack_network_exporter[245358]: ERROR   09:59:43 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 23 09:59:43 np0005626463.localdomain openstack_network_exporter[245358]: 
Feb 23 09:59:43 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:59:43.380 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:59:43 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:59:43.391 265541 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-23T09:59:42Z, description=, device_id=e923d3f9-631b-4cb5-b450-aef4f64e2d2c, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f2829189fa0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f2829189d90>], id=b75036d4-c2df-48cf-944c-8a783718cf0a, ip_allocation=immediate, mac_address=fa:16:3e:53:1e:21, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-23T09:59:20Z, description=, dns_domain=, id=0c26ff8d-9894-4f75-a5d3-33bc934f6b99, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersNegativeTest-test-network-1148185001, port_security_enabled=True, project_id=20ee87502ddb4dde82419e1f4302f590, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=48707, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2410, status=ACTIVE, subnets=['4203c185-aad1-4b87-8334-57a63d4b2075'], tags=[], tenant_id=20ee87502ddb4dde82419e1f4302f590, updated_at=2026-02-23T09:59:21Z, vlan_transparent=None, network_id=0c26ff8d-9894-4f75-a5d3-33bc934f6b99, port_security_enabled=False, project_id=20ee87502ddb4dde82419e1f4302f590, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2498, status=DOWN, tags=[], tenant_id=20ee87502ddb4dde82419e1f4302f590, updated_at=2026-02-23T09:59:42Z on network 0c26ff8d-9894-4f75-a5d3-33bc934f6b99
Feb 23 09:59:43 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:59:43.477 2 INFO neutron.agent.securitygroups_rpc [None req-cd7d52bd-6d88-4de8-a801-ff1f615f2e4b 982b83c89c37422a910f5359ef7b6ea5 5bed3d4ae9fd4fe7b9440b4587246c14 - - default default] Security group member updated ['3f91b09d-b6ac-403f-adaf-7a684ee36fe5']
Feb 23 09:59:43 np0005626463.localdomain podman[317662]: 2026-02-23 09:59:43.655337368 +0000 UTC m=+0.047533994 container kill c234c6af3cf9946dc051be1dcad89c50800ef76a490b597d3a6cad540a59a53e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0c26ff8d-9894-4f75-a5d3-33bc934f6b99, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true)
Feb 23 09:59:43 np0005626463.localdomain dnsmasq[317373]: read /var/lib/neutron/dhcp/0c26ff8d-9894-4f75-a5d3-33bc934f6b99/addn_hosts - 1 addresses
Feb 23 09:59:43 np0005626463.localdomain dnsmasq-dhcp[317373]: read /var/lib/neutron/dhcp/0c26ff8d-9894-4f75-a5d3-33bc934f6b99/host
Feb 23 09:59:43 np0005626463.localdomain dnsmasq-dhcp[317373]: read /var/lib/neutron/dhcp/0c26ff8d-9894-4f75-a5d3-33bc934f6b99/opts
Feb 23 09:59:43 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:59:43.842 265541 INFO neutron.agent.dhcp.agent [None req-1578ebdc-a17c-47a5-9fd6-8d19d4f96456 - - - - - -] DHCP configuration for ports {'b75036d4-c2df-48cf-944c-8a783718cf0a'} is completed
Feb 23 09:59:43 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e160 do_prune osdmap full prune enabled
Feb 23 09:59:43 np0005626463.localdomain ceph-mon[294160]: pgmap v336: 177 pgs: 177 active+clean; 288 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 3.7 MiB/s rd, 17 MiB/s wr, 223 op/s
Feb 23 09:59:43 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e161 e161: 6 total, 6 up, 6 in
Feb 23 09:59:43 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e161: 6 total, 6 up, 6 in
Feb 23 09:59:44 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:59:44.576 265541 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-23T09:59:42Z, description=, device_id=e923d3f9-631b-4cb5-b450-aef4f64e2d2c, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f2829198d00>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f2829198610>], id=b75036d4-c2df-48cf-944c-8a783718cf0a, ip_allocation=immediate, mac_address=fa:16:3e:53:1e:21, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-23T09:59:20Z, description=, dns_domain=, id=0c26ff8d-9894-4f75-a5d3-33bc934f6b99, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersNegativeTest-test-network-1148185001, port_security_enabled=True, project_id=20ee87502ddb4dde82419e1f4302f590, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=48707, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2410, status=ACTIVE, subnets=['4203c185-aad1-4b87-8334-57a63d4b2075'], tags=[], tenant_id=20ee87502ddb4dde82419e1f4302f590, updated_at=2026-02-23T09:59:21Z, vlan_transparent=None, network_id=0c26ff8d-9894-4f75-a5d3-33bc934f6b99, port_security_enabled=False, project_id=20ee87502ddb4dde82419e1f4302f590, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2498, status=DOWN, tags=[], tenant_id=20ee87502ddb4dde82419e1f4302f590, updated_at=2026-02-23T09:59:42Z on network 0c26ff8d-9894-4f75-a5d3-33bc934f6b99
Feb 23 09:59:44 np0005626463.localdomain dnsmasq[317373]: read /var/lib/neutron/dhcp/0c26ff8d-9894-4f75-a5d3-33bc934f6b99/addn_hosts - 1 addresses
Feb 23 09:59:44 np0005626463.localdomain podman[317700]: 2026-02-23 09:59:44.809474614 +0000 UTC m=+0.061482131 container kill c234c6af3cf9946dc051be1dcad89c50800ef76a490b597d3a6cad540a59a53e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0c26ff8d-9894-4f75-a5d3-33bc934f6b99, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Feb 23 09:59:44 np0005626463.localdomain dnsmasq-dhcp[317373]: read /var/lib/neutron/dhcp/0c26ff8d-9894-4f75-a5d3-33bc934f6b99/host
Feb 23 09:59:44 np0005626463.localdomain dnsmasq-dhcp[317373]: read /var/lib/neutron/dhcp/0c26ff8d-9894-4f75-a5d3-33bc934f6b99/opts
Feb 23 09:59:44 np0005626463.localdomain ceph-mon[294160]: osdmap e161: 6 total, 6 up, 6 in
Feb 23 09:59:45 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:59:45.162 265541 INFO neutron.agent.dhcp.agent [None req-f9af8222-d352-4117-a3c1-39f02d72ab62 - - - - - -] DHCP configuration for ports {'b75036d4-c2df-48cf-944c-8a783718cf0a'} is completed
Feb 23 09:59:45 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.
Feb 23 09:59:45 np0005626463.localdomain podman[317720]: 2026-02-23 09:59:45.925958388 +0000 UTC m=+0.092560608 container health_status be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_managed=true, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ceilometer_agent_compute)
Feb 23 09:59:45 np0005626463.localdomain podman[317720]: 2026-02-23 09:59:45.966369288 +0000 UTC m=+0.132971548 container exec_died be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ceilometer_agent_compute, io.buildah.version=1.43.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20260216, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team)
Feb 23 09:59:45 np0005626463.localdomain ceph-mon[294160]: pgmap v338: 177 pgs: 177 active+clean; 288 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 3.6 MiB/s rd, 16 MiB/s wr, 216 op/s
Feb 23 09:59:45 np0005626463.localdomain systemd[1]: be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.service: Deactivated successfully.
Feb 23 09:59:46 np0005626463.localdomain dnsmasq[317373]: read /var/lib/neutron/dhcp/0c26ff8d-9894-4f75-a5d3-33bc934f6b99/addn_hosts - 0 addresses
Feb 23 09:59:46 np0005626463.localdomain dnsmasq-dhcp[317373]: read /var/lib/neutron/dhcp/0c26ff8d-9894-4f75-a5d3-33bc934f6b99/host
Feb 23 09:59:46 np0005626463.localdomain dnsmasq-dhcp[317373]: read /var/lib/neutron/dhcp/0c26ff8d-9894-4f75-a5d3-33bc934f6b99/opts
Feb 23 09:59:46 np0005626463.localdomain podman[317754]: 2026-02-23 09:59:46.561977073 +0000 UTC m=+0.058353054 container kill c234c6af3cf9946dc051be1dcad89c50800ef76a490b597d3a6cad540a59a53e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0c26ff8d-9894-4f75-a5d3-33bc934f6b99, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Feb 23 09:59:47 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e161 do_prune osdmap full prune enabled
Feb 23 09:59:47 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e162 e162: 6 total, 6 up, 6 in
Feb 23 09:59:47 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e162: 6 total, 6 up, 6 in
Feb 23 09:59:47 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e162 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:59:47 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e162 do_prune osdmap full prune enabled
Feb 23 09:59:47 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e163 e163: 6 total, 6 up, 6 in
Feb 23 09:59:47 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e163: 6 total, 6 up, 6 in
Feb 23 09:59:47 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:59:47.282 265541 INFO neutron.agent.linux.ip_lib [None req-b474b464-2d05-4bfb-a71a-3e1191b45e29 - - - - - -] Device tapcf62e779-21 cannot be used as it has no MAC address
Feb 23 09:59:47 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:59:47.343 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:59:47 np0005626463.localdomain kernel: device tapcf62e779-21 entered promiscuous mode
Feb 23 09:59:47 np0005626463.localdomain NetworkManager[5974]: <info>  [1771840787.3507] manager: (tapcf62e779-21): new Generic device (/org/freedesktop/NetworkManager/Devices/48)
Feb 23 09:59:47 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:59:47.350 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:59:47 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:59:47Z|00287|binding|INFO|Claiming lport cf62e779-21c9-44d1-992a-8d67e75ee9a4 for this chassis.
Feb 23 09:59:47 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:59:47Z|00288|binding|INFO|cf62e779-21c9-44d1-992a-8d67e75ee9a4: Claiming unknown
Feb 23 09:59:47 np0005626463.localdomain systemd-udevd[317784]: Network interface NamePolicy= disabled on kernel command line.
Feb 23 09:59:47 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:59:47.360 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005626463.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.255.243/28', 'neutron:device_id': 'dhcpfb23302c-55c1-5de0-badf-4fc1ff22837a-5e40f037-bddd-4e41-9358-072288273862', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5e40f037-bddd-4e41-9358-072288273862', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '51dc993993124a3f926af711e8b0f088', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fd636fd1-c9ff-44d0-b6d5-3a4c5f8e69de, chassis=[<ovs.db.idl.Row object at 0x7f808c075610>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f808c075610>], logical_port=cf62e779-21c9-44d1-992a-8d67e75ee9a4) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 09:59:47 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:59:47.362 163572 INFO neutron.agent.ovn.metadata.agent [-] Port cf62e779-21c9-44d1-992a-8d67e75ee9a4 in datapath 5e40f037-bddd-4e41-9358-072288273862 bound to our chassis
Feb 23 09:59:47 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:59:47.364 163572 DEBUG neutron.agent.ovn.metadata.agent [-] Port 034ef5b0-24bc-4eeb-b7eb-fa73747ebcf1 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Feb 23 09:59:47 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:59:47.364 163572 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 5e40f037-bddd-4e41-9358-072288273862, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 23 09:59:47 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:59:47.365 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[ad03834a-1390-48cb-8b71-f95cfc79a5c4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 09:59:47 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tapcf62e779-21: No such device
Feb 23 09:59:47 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:59:47Z|00289|binding|INFO|Setting lport cf62e779-21c9-44d1-992a-8d67e75ee9a4 ovn-installed in OVS
Feb 23 09:59:47 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:59:47Z|00290|binding|INFO|Setting lport cf62e779-21c9-44d1-992a-8d67e75ee9a4 up in Southbound
Feb 23 09:59:47 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:59:47.404 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:59:47 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tapcf62e779-21: No such device
Feb 23 09:59:47 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tapcf62e779-21: No such device
Feb 23 09:59:47 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tapcf62e779-21: No such device
Feb 23 09:59:47 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:59:47.416 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:59:47 np0005626463.localdomain kernel: device tap7d2d4ed7-8f left promiscuous mode
Feb 23 09:59:47 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:59:47Z|00291|binding|INFO|Releasing lport 7d2d4ed7-8f1a-44e5-99e2-954f427618a6 from this chassis (sb_readonly=0)
Feb 23 09:59:47 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:59:47Z|00292|binding|INFO|Setting lport 7d2d4ed7-8f1a-44e5-99e2-954f427618a6 down in Southbound
Feb 23 09:59:47 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tapcf62e779-21: No such device
Feb 23 09:59:47 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tapcf62e779-21: No such device
Feb 23 09:59:47 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tapcf62e779-21: No such device
Feb 23 09:59:47 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:59:47.432 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005626463.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpfb23302c-55c1-5de0-badf-4fc1ff22837a-0c26ff8d-9894-4f75-a5d3-33bc934f6b99', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0c26ff8d-9894-4f75-a5d3-33bc934f6b99', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '20ee87502ddb4dde82419e1f4302f590', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005626463.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=18c7d4ac-b41a-428e-8eb5-19914e19db45, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f808c075610>], logical_port=7d2d4ed7-8f1a-44e5-99e2-954f427618a6) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f808c075610>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 09:59:47 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:59:47.433 163572 INFO neutron.agent.ovn.metadata.agent [-] Port 7d2d4ed7-8f1a-44e5-99e2-954f427618a6 in datapath 0c26ff8d-9894-4f75-a5d3-33bc934f6b99 unbound from our chassis
Feb 23 09:59:47 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tapcf62e779-21: No such device
Feb 23 09:59:47 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:59:47.435 163572 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 0c26ff8d-9894-4f75-a5d3-33bc934f6b99, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 23 09:59:47 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:59:47.436 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[be8de939-bd72-48c9-a60a-34a3273bff0d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 09:59:47 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:59:47.458 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:59:47 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:59:47.464 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:59:47 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:59:47.468 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:59:47 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:59:47.818 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 09:59:47 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:59:47.846 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Triggering sync for uuid c2a7d92b-952f-46a7-8a6a-3322a48fcf4b _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Feb 23 09:59:47 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:59:47.847 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 09:59:47 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:59:47.847 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 09:59:47 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:59:47.876 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.028s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 09:59:48 np0005626463.localdomain ceph-mon[294160]: osdmap e162: 6 total, 6 up, 6 in
Feb 23 09:59:48 np0005626463.localdomain ceph-mon[294160]: osdmap e163: 6 total, 6 up, 6 in
Feb 23 09:59:48 np0005626463.localdomain ceph-mon[294160]: pgmap v341: 177 pgs: 177 active+clean; 479 MiB data, 1.6 GiB used, 40 GiB / 42 GiB avail; 2.9 MiB/s rd, 41 MiB/s wr, 376 op/s
Feb 23 09:59:48 np0005626463.localdomain podman[317856]: 
Feb 23 09:59:48 np0005626463.localdomain podman[317856]: 2026-02-23 09:59:48.345090174 +0000 UTC m=+0.099851548 container create 993d8a17873435aed651c58621e43c8085b43d7880121cc4af8dde85919b951a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5e40f037-bddd-4e41-9358-072288273862, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Feb 23 09:59:48 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.
Feb 23 09:59:48 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:59:48.374 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:59:48 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:59:48.380 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:59:48 np0005626463.localdomain systemd[1]: Started libpod-conmon-993d8a17873435aed651c58621e43c8085b43d7880121cc4af8dde85919b951a.scope.
Feb 23 09:59:48 np0005626463.localdomain podman[317856]: 2026-02-23 09:59:48.296031802 +0000 UTC m=+0.050793226 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 23 09:59:48 np0005626463.localdomain systemd[1]: tmp-crun.lOjpOE.mount: Deactivated successfully.
Feb 23 09:59:48 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 09:59:48 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/db56fcd2726acc1983ef9844d0380fb1390c65eda950b0c88ea7f886d5e3e2ce/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 23 09:59:48 np0005626463.localdomain podman[317856]: 2026-02-23 09:59:48.441807041 +0000 UTC m=+0.196568425 container init 993d8a17873435aed651c58621e43c8085b43d7880121cc4af8dde85919b951a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5e40f037-bddd-4e41-9358-072288273862, org.label-schema.license=GPLv2, org.label-schema.build-date=20260216, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Feb 23 09:59:48 np0005626463.localdomain dnsmasq[317885]: started, version 2.85 cachesize 150
Feb 23 09:59:48 np0005626463.localdomain dnsmasq[317885]: DNS service limited to local subnets
Feb 23 09:59:48 np0005626463.localdomain dnsmasq[317885]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 23 09:59:48 np0005626463.localdomain dnsmasq[317885]: warning: no upstream servers configured
Feb 23 09:59:48 np0005626463.localdomain dnsmasq-dhcp[317885]: DHCP, static leases only on 10.100.255.240, lease time 1d
Feb 23 09:59:48 np0005626463.localdomain dnsmasq[317885]: read /var/lib/neutron/dhcp/5e40f037-bddd-4e41-9358-072288273862/addn_hosts - 0 addresses
Feb 23 09:59:48 np0005626463.localdomain dnsmasq-dhcp[317885]: read /var/lib/neutron/dhcp/5e40f037-bddd-4e41-9358-072288273862/host
Feb 23 09:59:48 np0005626463.localdomain dnsmasq-dhcp[317885]: read /var/lib/neutron/dhcp/5e40f037-bddd-4e41-9358-072288273862/opts
Feb 23 09:59:48 np0005626463.localdomain podman[317869]: 2026-02-23 09:59:48.486726841 +0000 UTC m=+0.102472208 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 23 09:59:48 np0005626463.localdomain podman[317856]: 2026-02-23 09:59:48.502537868 +0000 UTC m=+0.257299242 container start 993d8a17873435aed651c58621e43c8085b43d7880121cc4af8dde85919b951a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5e40f037-bddd-4e41-9358-072288273862, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260216, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.43.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true)
Feb 23 09:59:48 np0005626463.localdomain podman[317869]: 2026-02-23 09:59:48.520231863 +0000 UTC m=+0.135977250 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.build-date=20260216, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Feb 23 09:59:48 np0005626463.localdomain systemd[1]: 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully.
Feb 23 09:59:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:59:48.559 163572 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 09:59:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:59:48.560 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 09:59:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:59:48.561 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 09:59:48 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:59:48.722 2 INFO neutron.agent.securitygroups_rpc [None req-48538392-c56b-4b21-9b40-cb72b13d2341 982b83c89c37422a910f5359ef7b6ea5 5bed3d4ae9fd4fe7b9440b4587246c14 - - default default] Security group member updated ['3f91b09d-b6ac-403f-adaf-7a684ee36fe5']
Feb 23 09:59:48 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:59:48.776 265541 INFO neutron.agent.dhcp.agent [None req-c84393dc-9795-449b-be53-2b977d3475e7 - - - - - -] DHCP configuration for ports {'580598fc-fc38-454c-a9c1-155d41252ed8'} is completed
Feb 23 09:59:49 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/1701378987' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 23 09:59:49 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/1701378987' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 23 09:59:49 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:59:49.352 2 INFO neutron.agent.securitygroups_rpc [None req-fe77e4f6-f29f-4a3f-9227-290ccdc2ae2e 982b83c89c37422a910f5359ef7b6ea5 5bed3d4ae9fd4fe7b9440b4587246c14 - - default default] Security group member updated ['3f91b09d-b6ac-403f-adaf-7a684ee36fe5']
Feb 23 09:59:49 np0005626463.localdomain dnsmasq[317597]: exiting on receipt of SIGTERM
Feb 23 09:59:49 np0005626463.localdomain podman[317911]: 2026-02-23 09:59:49.67184569 +0000 UTC m=+0.066237990 container kill 1f2b0c282578fa14ce7f3234a36f89894c3bfd2be0e1124588d44437a5f72242 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d9ae15ed-aa9d-4bce-9192-334c9725a10c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, io.buildah.version=1.43.0)
Feb 23 09:59:49 np0005626463.localdomain systemd[1]: tmp-crun.vGg7jO.mount: Deactivated successfully.
Feb 23 09:59:49 np0005626463.localdomain systemd[1]: libpod-1f2b0c282578fa14ce7f3234a36f89894c3bfd2be0e1124588d44437a5f72242.scope: Deactivated successfully.
Feb 23 09:59:49 np0005626463.localdomain podman[317926]: 2026-02-23 09:59:49.749499139 +0000 UTC m=+0.063200545 container died 1f2b0c282578fa14ce7f3234a36f89894c3bfd2be0e1124588d44437a5f72242 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d9ae15ed-aa9d-4bce-9192-334c9725a10c, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2)
Feb 23 09:59:49 np0005626463.localdomain podman[317926]: 2026-02-23 09:59:49.784685655 +0000 UTC m=+0.098387021 container cleanup 1f2b0c282578fa14ce7f3234a36f89894c3bfd2be0e1124588d44437a5f72242 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d9ae15ed-aa9d-4bce-9192-334c9725a10c, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb)
Feb 23 09:59:49 np0005626463.localdomain systemd[1]: libpod-conmon-1f2b0c282578fa14ce7f3234a36f89894c3bfd2be0e1124588d44437a5f72242.scope: Deactivated successfully.
Feb 23 09:59:49 np0005626463.localdomain podman[317928]: 2026-02-23 09:59:49.8374079 +0000 UTC m=+0.141294808 container remove 1f2b0c282578fa14ce7f3234a36f89894c3bfd2be0e1124588d44437a5f72242 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d9ae15ed-aa9d-4bce-9192-334c9725a10c, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260216, tcib_managed=true)
Feb 23 09:59:49 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:59:49Z|00293|binding|INFO|Releasing lport eff7547d-1684-4a00-829e-9369e5af1a4c from this chassis (sb_readonly=0)
Feb 23 09:59:49 np0005626463.localdomain kernel: device tapeff7547d-16 left promiscuous mode
Feb 23 09:59:49 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:59:49Z|00294|binding|INFO|Setting lport eff7547d-1684-4a00-829e-9369e5af1a4c down in Southbound
Feb 23 09:59:49 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:59:49.901 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:59:49 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:59:49.919 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005626463.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpfb23302c-55c1-5de0-badf-4fc1ff22837a-d9ae15ed-aa9d-4bce-9192-334c9725a10c', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d9ae15ed-aa9d-4bce-9192-334c9725a10c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '20ee87502ddb4dde82419e1f4302f590', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005626463.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=891eac08-7b13-4a8b-bfb9-8821cd51b516, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f808c075610>], logical_port=eff7547d-1684-4a00-829e-9369e5af1a4c) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f808c075610>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 09:59:49 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:59:49.921 163572 INFO neutron.agent.ovn.metadata.agent [-] Port eff7547d-1684-4a00-829e-9369e5af1a4c in datapath d9ae15ed-aa9d-4bce-9192-334c9725a10c unbound from our chassis
Feb 23 09:59:49 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:59:49.924 163572 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d9ae15ed-aa9d-4bce-9192-334c9725a10c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 23 09:59:49 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:59:49.925 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[827accd0-cf3d-41c3-b33b-9c80310624ff]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 09:59:49 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:59:49.931 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:59:50 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/314257030' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 23 09:59:50 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/314257030' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 23 09:59:50 np0005626463.localdomain ceph-mon[294160]: pgmap v342: 177 pgs: 177 active+clean; 479 MiB data, 1.6 GiB used, 40 GiB / 42 GiB avail; 702 KiB/s rd, 26 MiB/s wr, 304 op/s
Feb 23 09:59:50 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/144846476' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 23 09:59:50 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/144846476' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 23 09:59:50 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:59:50.153 265541 INFO neutron.agent.dhcp.agent [None req-c10188fe-5a98-4e71-ab93-ea285fb52e20 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 23 09:59:50 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:59:50.154 265541 INFO neutron.agent.dhcp.agent [None req-c10188fe-5a98-4e71-ab93-ea285fb52e20 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 23 09:59:50 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-182134e9323da2a38d8aba6d82a6ee7725f82a27095416b853817f55614cd542-merged.mount: Deactivated successfully.
Feb 23 09:59:50 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-1f2b0c282578fa14ce7f3234a36f89894c3bfd2be0e1124588d44437a5f72242-userdata-shm.mount: Deactivated successfully.
Feb 23 09:59:50 np0005626463.localdomain systemd[1]: run-netns-qdhcp\x2dd9ae15ed\x2daa9d\x2d4bce\x2d9192\x2d334c9725a10c.mount: Deactivated successfully.
Feb 23 09:59:50 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:59:50.548 265541 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Feb 23 09:59:50 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:59:50Z|00295|binding|INFO|Releasing lport 4143c8ea-7577-4792-9744-bcff90eb20f2 from this chassis (sb_readonly=0)
Feb 23 09:59:50 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:59:50.789 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:59:51 np0005626463.localdomain dnsmasq[317373]: exiting on receipt of SIGTERM
Feb 23 09:59:51 np0005626463.localdomain podman[317973]: 2026-02-23 09:59:51.826364706 +0000 UTC m=+0.059537711 container kill c234c6af3cf9946dc051be1dcad89c50800ef76a490b597d3a6cad540a59a53e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0c26ff8d-9894-4f75-a5d3-33bc934f6b99, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Feb 23 09:59:51 np0005626463.localdomain systemd[1]: libpod-c234c6af3cf9946dc051be1dcad89c50800ef76a490b597d3a6cad540a59a53e.scope: Deactivated successfully.
Feb 23 09:59:51 np0005626463.localdomain podman[317987]: 2026-02-23 09:59:51.898346807 +0000 UTC m=+0.058585671 container died c234c6af3cf9946dc051be1dcad89c50800ef76a490b597d3a6cad540a59a53e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0c26ff8d-9894-4f75-a5d3-33bc934f6b99, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true)
Feb 23 09:59:51 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c234c6af3cf9946dc051be1dcad89c50800ef76a490b597d3a6cad540a59a53e-userdata-shm.mount: Deactivated successfully.
Feb 23 09:59:51 np0005626463.localdomain podman[317987]: 2026-02-23 09:59:51.931719545 +0000 UTC m=+0.091958369 container cleanup c234c6af3cf9946dc051be1dcad89c50800ef76a490b597d3a6cad540a59a53e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0c26ff8d-9894-4f75-a5d3-33bc934f6b99, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Feb 23 09:59:51 np0005626463.localdomain systemd[1]: libpod-conmon-c234c6af3cf9946dc051be1dcad89c50800ef76a490b597d3a6cad540a59a53e.scope: Deactivated successfully.
Feb 23 09:59:51 np0005626463.localdomain podman[317989]: 2026-02-23 09:59:51.985309788 +0000 UTC m=+0.136993734 container remove c234c6af3cf9946dc051be1dcad89c50800ef76a490b597d3a6cad540a59a53e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0c26ff8d-9894-4f75-a5d3-33bc934f6b99, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, org.label-schema.build-date=20260216, io.buildah.version=1.43.0)
Feb 23 09:59:52 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:59:52.015 265541 INFO neutron.agent.dhcp.agent [None req-72146101-3418-4c72-af6e-09010364e98d - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 23 09:59:52 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e163 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:59:52 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e163 do_prune osdmap full prune enabled
Feb 23 09:59:52 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:59:52.218 265541 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Feb 23 09:59:52 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e164 e164: 6 total, 6 up, 6 in
Feb 23 09:59:52 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e164: 6 total, 6 up, 6 in
Feb 23 09:59:52 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #52. Immutable memtables: 0.
Feb 23 09:59:52 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:59:52.245447) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Feb 23 09:59:52 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/flush_job.cc:856] [default] [JOB 29] Flushing memtable with next log file: 52
Feb 23 09:59:52 np0005626463.localdomain ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840792245487, "job": 29, "event": "flush_started", "num_memtables": 1, "num_entries": 523, "num_deletes": 255, "total_data_size": 334964, "memory_usage": 344728, "flush_reason": "Manual Compaction"}
Feb 23 09:59:52 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/flush_job.cc:885] [default] [JOB 29] Level-0 flush table #53: started
Feb 23 09:59:52 np0005626463.localdomain ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840792251181, "cf_name": "default", "job": 29, "event": "table_file_creation", "file_number": 53, "file_size": 328652, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 29710, "largest_seqno": 30232, "table_properties": {"data_size": 325799, "index_size": 836, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 965, "raw_key_size": 7459, "raw_average_key_size": 20, "raw_value_size": 319910, "raw_average_value_size": 878, "num_data_blocks": 36, "num_entries": 364, "num_filter_entries": 364, "num_deletions": 255, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771840777, "oldest_key_time": 1771840777, "file_creation_time": 1771840792, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4cfd6c8f-aafa-4003-b2f6-d22c49635dd4", "db_session_id": "66DAQ76CBLV8DSGL8JC7", "orig_file_number": 53, "seqno_to_time_mapping": "N/A"}}
Feb 23 09:59:52 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 29] Flush lasted 5775 microseconds, and 1799 cpu microseconds.
Feb 23 09:59:52 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 23 09:59:52 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:59:52.251223) [db/flush_job.cc:967] [default] [JOB 29] Level-0 flush table #53: 328652 bytes OK
Feb 23 09:59:52 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:59:52.251245) [db/memtable_list.cc:519] [default] Level-0 commit table #53 started
Feb 23 09:59:52 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:59:52.256409) [db/memtable_list.cc:722] [default] Level-0 commit table #53: memtable #1 done
Feb 23 09:59:52 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:59:52.256431) EVENT_LOG_v1 {"time_micros": 1771840792256424, "job": 29, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Feb 23 09:59:52 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:59:52.256453) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Feb 23 09:59:52 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 29] Try to delete WAL files size 331876, prev total WAL file size 331876, number of live WAL files 2.
Feb 23 09:59:52 np0005626463.localdomain ceph-mon[294160]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005626463/store.db/000049.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 23 09:59:52 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:59:52.257135) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003132303438' seq:72057594037927935, type:22 .. '7061786F73003132333030' seq:0, type:0; will stop at (end)
Feb 23 09:59:52 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 30] Compacting 1@0 + 1@6 files to L6, score -1.00
Feb 23 09:59:52 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 29 Base level 0, inputs: [53(320KB)], [51(17MB)]
Feb 23 09:59:52 np0005626463.localdomain ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840792257183, "job": 30, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [53], "files_L6": [51], "score": -1, "input_data_size": 18704567, "oldest_snapshot_seqno": -1}
Feb 23 09:59:52 np0005626463.localdomain ceph-mon[294160]: pgmap v343: 177 pgs: 177 active+clean; 543 MiB data, 1.8 GiB used, 40 GiB / 42 GiB avail; 632 KiB/s rd, 31 MiB/s wr, 330 op/s
Feb 23 09:59:52 np0005626463.localdomain ceph-mon[294160]: osdmap e164: 6 total, 6 up, 6 in
Feb 23 09:59:52 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 30] Generated table #54: 12833 keys, 17422861 bytes, temperature: kUnknown
Feb 23 09:59:52 np0005626463.localdomain ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840792380444, "cf_name": "default", "job": 30, "event": "table_file_creation", "file_number": 54, "file_size": 17422861, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 17347442, "index_size": 42230, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 32133, "raw_key_size": 344271, "raw_average_key_size": 26, "raw_value_size": 17126615, "raw_average_value_size": 1334, "num_data_blocks": 1604, "num_entries": 12833, "num_filter_entries": 12833, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771839971, "oldest_key_time": 0, "file_creation_time": 1771840792, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4cfd6c8f-aafa-4003-b2f6-d22c49635dd4", "db_session_id": "66DAQ76CBLV8DSGL8JC7", "orig_file_number": 54, "seqno_to_time_mapping": "N/A"}}
Feb 23 09:59:52 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 23 09:59:52 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:59:52.381197) [db/compaction/compaction_job.cc:1663] [default] [JOB 30] Compacted 1@0 + 1@6 files to L6 => 17422861 bytes
Feb 23 09:59:52 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:59:52.385346) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 151.7 rd, 141.3 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.3, 17.5 +0.0 blob) out(16.6 +0.0 blob), read-write-amplify(109.9) write-amplify(53.0) OK, records in: 13358, records dropped: 525 output_compression: NoCompression
Feb 23 09:59:52 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:59:52.385377) EVENT_LOG_v1 {"time_micros": 1771840792385364, "job": 30, "event": "compaction_finished", "compaction_time_micros": 123324, "compaction_time_cpu_micros": 49010, "output_level": 6, "num_output_files": 1, "total_output_size": 17422861, "num_input_records": 13358, "num_output_records": 12833, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Feb 23 09:59:52 np0005626463.localdomain ceph-mon[294160]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005626463/store.db/000053.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 23 09:59:52 np0005626463.localdomain ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840792385609, "job": 30, "event": "table_file_deletion", "file_number": 53}
Feb 23 09:59:52 np0005626463.localdomain ceph-mon[294160]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005626463/store.db/000051.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 23 09:59:52 np0005626463.localdomain ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840792388831, "job": 30, "event": "table_file_deletion", "file_number": 51}
Feb 23 09:59:52 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:59:52.257047) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 23 09:59:52 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:59:52.388919) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 23 09:59:52 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:59:52.388926) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 23 09:59:52 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:59:52.388930) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 23 09:59:52 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:59:52.388934) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 23 09:59:52 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:59:52.388938) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 23 09:59:52 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-0aceec28e8080a4a37142df66778b1baec5d9c5e7130e92c6c16ab6cf85905ce-merged.mount: Deactivated successfully.
Feb 23 09:59:52 np0005626463.localdomain systemd[1]: run-netns-qdhcp\x2d0c26ff8d\x2d9894\x2d4f75\x2da5d3\x2d33bc934f6b99.mount: Deactivated successfully.
Feb 23 09:59:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:59:53.175 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4b:79:d0 2001:db8:0:1:f816:3eff:fe4b:79d0'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1:f816:3eff:fe4b:79d0/64', 'neutron:device_id': 'ovnmeta-c73ae202-1b92-44f9-b55c-a6eba0a348b0', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c73ae202-1b92-44f9-b55c-a6eba0a348b0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5bed3d4ae9fd4fe7b9440b4587246c14', 'neutron:revision_number': '30', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b8f6a957-d664-4eea-abeb-3145dfe24d16, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=f9f417b1-97ff-4889-a004-ebc80afbc25a) old=Port_Binding(mac=['fa:16:3e:4b:79:d0 2001:db8::f816:3eff:fe4b:79d0'], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe4b:79d0/64', 'neutron:device_id': 'ovnmeta-c73ae202-1b92-44f9-b55c-a6eba0a348b0', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c73ae202-1b92-44f9-b55c-a6eba0a348b0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5bed3d4ae9fd4fe7b9440b4587246c14', 'neutron:revision_number': '28', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 09:59:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:59:53.178 163572 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port f9f417b1-97ff-4889-a004-ebc80afbc25a in datapath c73ae202-1b92-44f9-b55c-a6eba0a348b0 updated
Feb 23 09:59:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:59:53.182 163572 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c73ae202-1b92-44f9-b55c-a6eba0a348b0, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 23 09:59:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:59:53.183 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[b705a777-2894-4fef-9fa7-aebd73e52ebe]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 09:59:53 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/2100994161' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 09:59:53 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:59:53.415 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:59:53 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:59:53.710 2 INFO neutron.agent.securitygroups_rpc [None req-e07daaf7-bd51-4e1d-9776-48aaf7c6d99d 982b83c89c37422a910f5359ef7b6ea5 5bed3d4ae9fd4fe7b9440b4587246c14 - - default default] Security group member updated ['3f91b09d-b6ac-403f-adaf-7a684ee36fe5']
Feb 23 09:59:54 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:59:54.178 2 INFO neutron.agent.securitygroups_rpc [None req-0c26025a-3ebd-4293-969f-53f7049344a7 982b83c89c37422a910f5359ef7b6ea5 5bed3d4ae9fd4fe7b9440b4587246c14 - - default default] Security group member updated ['3f91b09d-b6ac-403f-adaf-7a684ee36fe5']
Feb 23 09:59:54 np0005626463.localdomain ceph-mon[294160]: pgmap v345: 177 pgs: 177 active+clean; 615 MiB data, 2.0 GiB used, 40 GiB / 42 GiB avail; 126 KiB/s rd, 22 MiB/s wr, 173 op/s
Feb 23 09:59:55 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e164 do_prune osdmap full prune enabled
Feb 23 09:59:55 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e165 e165: 6 total, 6 up, 6 in
Feb 23 09:59:55 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/3080484500' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 09:59:55 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e165: 6 total, 6 up, 6 in
Feb 23 09:59:56 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e165 do_prune osdmap full prune enabled
Feb 23 09:59:56 np0005626463.localdomain ceph-mon[294160]: pgmap v346: 177 pgs: 177 active+clean; 615 MiB data, 2.0 GiB used, 40 GiB / 42 GiB avail; 97 KiB/s rd, 17 MiB/s wr, 133 op/s
Feb 23 09:59:56 np0005626463.localdomain ceph-mon[294160]: osdmap e165: 6 total, 6 up, 6 in
Feb 23 09:59:56 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/641710101' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 23 09:59:56 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/641710101' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 23 09:59:56 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e166 e166: 6 total, 6 up, 6 in
Feb 23 09:59:56 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e166: 6 total, 6 up, 6 in
Feb 23 09:59:57 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Feb 23 09:59:57 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1368292100' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 23 09:59:57 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Feb 23 09:59:57 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1368292100' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 23 09:59:57 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e166 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 09:59:57 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #55. Immutable memtables: 0.
Feb 23 09:59:57 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:59:57.179132) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Feb 23 09:59:57 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/flush_job.cc:856] [default] [JOB 31] Flushing memtable with next log file: 55
Feb 23 09:59:57 np0005626463.localdomain ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840797179224, "job": 31, "event": "flush_started", "num_memtables": 1, "num_entries": 326, "num_deletes": 250, "total_data_size": 97591, "memory_usage": 103056, "flush_reason": "Manual Compaction"}
Feb 23 09:59:57 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/flush_job.cc:885] [default] [JOB 31] Level-0 flush table #56: started
Feb 23 09:59:57 np0005626463.localdomain ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840797182192, "cf_name": "default", "job": 31, "event": "table_file_creation", "file_number": 56, "file_size": 96107, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 30233, "largest_seqno": 30558, "table_properties": {"data_size": 94000, "index_size": 282, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 773, "raw_key_size": 6006, "raw_average_key_size": 20, "raw_value_size": 89689, "raw_average_value_size": 307, "num_data_blocks": 12, "num_entries": 292, "num_filter_entries": 292, "num_deletions": 250, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771840792, "oldest_key_time": 1771840792, "file_creation_time": 1771840797, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4cfd6c8f-aafa-4003-b2f6-d22c49635dd4", "db_session_id": "66DAQ76CBLV8DSGL8JC7", "orig_file_number": 56, "seqno_to_time_mapping": "N/A"}}
Feb 23 09:59:57 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 31] Flush lasted 3098 microseconds, and 1198 cpu microseconds.
Feb 23 09:59:57 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 23 09:59:57 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:59:57.182241) [db/flush_job.cc:967] [default] [JOB 31] Level-0 flush table #56: 96107 bytes OK
Feb 23 09:59:57 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:59:57.182262) [db/memtable_list.cc:519] [default] Level-0 commit table #56 started
Feb 23 09:59:57 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:59:57.184126) [db/memtable_list.cc:722] [default] Level-0 commit table #56: memtable #1 done
Feb 23 09:59:57 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:59:57.184147) EVENT_LOG_v1 {"time_micros": 1771840797184141, "job": 31, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Feb 23 09:59:57 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:59:57.184169) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Feb 23 09:59:57 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 31] Try to delete WAL files size 95311, prev total WAL file size 95635, number of live WAL files 2.
Feb 23 09:59:57 np0005626463.localdomain ceph-mon[294160]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005626463/store.db/000052.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 23 09:59:57 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:59:57.187026) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740034303033' seq:72057594037927935, type:22 .. '6D6772737461740034323534' seq:0, type:0; will stop at (end)
Feb 23 09:59:57 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 32] Compacting 1@0 + 1@6 files to L6, score -1.00
Feb 23 09:59:57 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 31 Base level 0, inputs: [56(93KB)], [54(16MB)]
Feb 23 09:59:57 np0005626463.localdomain ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840797187071, "job": 32, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [56], "files_L6": [54], "score": -1, "input_data_size": 17518968, "oldest_snapshot_seqno": -1}
Feb 23 09:59:57 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 32] Generated table #57: 12613 keys, 15414183 bytes, temperature: kUnknown
Feb 23 09:59:57 np0005626463.localdomain ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840797287572, "cf_name": "default", "job": 32, "event": "table_file_creation", "file_number": 57, "file_size": 15414183, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 15345078, "index_size": 36492, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 31557, "raw_key_size": 339949, "raw_average_key_size": 26, "raw_value_size": 15132880, "raw_average_value_size": 1199, "num_data_blocks": 1365, "num_entries": 12613, "num_filter_entries": 12613, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771839971, "oldest_key_time": 0, "file_creation_time": 1771840797, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4cfd6c8f-aafa-4003-b2f6-d22c49635dd4", "db_session_id": "66DAQ76CBLV8DSGL8JC7", "orig_file_number": 57, "seqno_to_time_mapping": "N/A"}}
Feb 23 09:59:57 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 23 09:59:57 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:59:57.287963) [db/compaction/compaction_job.cc:1663] [default] [JOB 32] Compacted 1@0 + 1@6 files to L6 => 15414183 bytes
Feb 23 09:59:57 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:59:57.289823) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 174.2 rd, 153.2 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.1, 16.6 +0.0 blob) out(14.7 +0.0 blob), read-write-amplify(342.7) write-amplify(160.4) OK, records in: 13125, records dropped: 512 output_compression: NoCompression
Feb 23 09:59:57 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:59:57.289863) EVENT_LOG_v1 {"time_micros": 1771840797289849, "job": 32, "event": "compaction_finished", "compaction_time_micros": 100590, "compaction_time_cpu_micros": 46750, "output_level": 6, "num_output_files": 1, "total_output_size": 15414183, "num_input_records": 13125, "num_output_records": 12613, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Feb 23 09:59:57 np0005626463.localdomain ceph-mon[294160]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005626463/store.db/000056.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 23 09:59:57 np0005626463.localdomain ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840797290075, "job": 32, "event": "table_file_deletion", "file_number": 56}
Feb 23 09:59:57 np0005626463.localdomain ceph-mon[294160]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005626463/store.db/000054.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 23 09:59:57 np0005626463.localdomain ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840797292423, "job": 32, "event": "table_file_deletion", "file_number": 54}
Feb 23 09:59:57 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:59:57.186943) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 23 09:59:57 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:59:57.292573) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 23 09:59:57 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:59:57.292581) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 23 09:59:57 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:59:57.292585) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 23 09:59:57 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:59:57.292588) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 23 09:59:57 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:59:57.292591) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 23 09:59:57 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e166 do_prune osdmap full prune enabled
Feb 23 09:59:57 np0005626463.localdomain ceph-mon[294160]: osdmap e166: 6 total, 6 up, 6 in
Feb 23 09:59:57 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/3653968853' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 23 09:59:57 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/3653968853' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 23 09:59:57 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/1368292100' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 23 09:59:57 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/1368292100' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 23 09:59:57 np0005626463.localdomain ceph-mon[294160]: pgmap v349: 177 pgs: 177 active+clean; 751 MiB data, 2.4 GiB used, 40 GiB / 42 GiB avail; 198 KiB/s rd, 35 MiB/s wr, 263 op/s
Feb 23 09:59:57 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e167 e167: 6 total, 6 up, 6 in
Feb 23 09:59:57 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e167: 6 total, 6 up, 6 in
Feb 23 09:59:57 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 09:59:57.935 2 INFO neutron.agent.securitygroups_rpc [None req-d119b339-8a48-47e8-bd68-4b4fb753cf46 92730c8dc08c46ec9f30a1ded731d654 b7501fe3a8904b43b875ec99452354a0 - - default default] Security group rule updated ['9373b311-126d-4dcc-ae54-c4b5d87c2dd5']
Feb 23 09:59:58 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 09:59:58.331 265541 INFO neutron.agent.linux.ip_lib [None req-f758b58b-9cee-4d2a-8524-163310e3b69a - - - - - -] Device tap5dac6bd1-d4 cannot be used as it has no MAC address
Feb 23 09:59:58 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:59:58.403 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:59:58 np0005626463.localdomain kernel: device tap5dac6bd1-d4 entered promiscuous mode
Feb 23 09:59:58 np0005626463.localdomain NetworkManager[5974]: <info>  [1771840798.4100] manager: (tap5dac6bd1-d4): new Generic device (/org/freedesktop/NetworkManager/Devices/49)
Feb 23 09:59:58 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:59:58.410 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:59:58 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:59:58Z|00296|binding|INFO|Claiming lport 5dac6bd1-d4a3-4582-9c29-07477bd4a21f for this chassis.
Feb 23 09:59:58 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:59:58Z|00297|binding|INFO|5dac6bd1-d4a3-4582-9c29-07477bd4a21f: Claiming unknown
Feb 23 09:59:58 np0005626463.localdomain systemd-udevd[318026]: Network interface NamePolicy= disabled on kernel command line.
Feb 23 09:59:58 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:59:58.422 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:59:58 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:59:58.434 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005626463.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpfb23302c-55c1-5de0-badf-4fc1ff22837a-1d4e57e1-6bfe-49bd-917a-f0bd4f0ec451', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1d4e57e1-6bfe-49bd-917a-f0bd4f0ec451', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '51dc993993124a3f926af711e8b0f088', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fa5daa8e-ec49-444c-ae76-ef0b7a442dc0, chassis=[<ovs.db.idl.Row object at 0x7f808c075610>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f808c075610>], logical_port=5dac6bd1-d4a3-4582-9c29-07477bd4a21f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 09:59:58 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:59:58.436 163572 INFO neutron.agent.ovn.metadata.agent [-] Port 5dac6bd1-d4a3-4582-9c29-07477bd4a21f in datapath 1d4e57e1-6bfe-49bd-917a-f0bd4f0ec451 bound to our chassis
Feb 23 09:59:58 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:59:58.438 163572 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 1d4e57e1-6bfe-49bd-917a-f0bd4f0ec451 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Feb 23 09:59:58 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:59:58.439 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[52ac14c7-0f05-45b7-9c72-1d47bd56f871]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 09:59:58 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap5dac6bd1-d4: No such device
Feb 23 09:59:58 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:59:58Z|00298|binding|INFO|Setting lport 5dac6bd1-d4a3-4582-9c29-07477bd4a21f ovn-installed in OVS
Feb 23 09:59:58 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:59:58Z|00299|binding|INFO|Setting lport 5dac6bd1-d4a3-4582-9c29-07477bd4a21f up in Southbound
Feb 23 09:59:58 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:59:58.444 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:59:58 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:59:58.448 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:59:58 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap5dac6bd1-d4: No such device
Feb 23 09:59:58 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap5dac6bd1-d4: No such device
Feb 23 09:59:58 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap5dac6bd1-d4: No such device
Feb 23 09:59:58 np0005626463.localdomain ceph-mon[294160]: osdmap e167: 6 total, 6 up, 6 in
Feb 23 09:59:58 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap5dac6bd1-d4: No such device
Feb 23 09:59:58 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap5dac6bd1-d4: No such device
Feb 23 09:59:58 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap5dac6bd1-d4: No such device
Feb 23 09:59:58 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap5dac6bd1-d4: No such device
Feb 23 09:59:58 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:59:58.488 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:59:58 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:59:58.514 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:59:59 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:59:59.596 163572 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port 10a3eba3-46b8-4384-8d8b-3be8944e41b3 with type ""
Feb 23 09:59:59 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:59:59Z|00300|binding|INFO|Removing iface tap5dac6bd1-d4 ovn-installed in OVS
Feb 23 09:59:59 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:59:59.598 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005626463.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'dhcpfb23302c-55c1-5de0-badf-4fc1ff22837a-1d4e57e1-6bfe-49bd-917a-f0bd4f0ec451', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1d4e57e1-6bfe-49bd-917a-f0bd4f0ec451', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '51dc993993124a3f926af711e8b0f088', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005626463.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fa5daa8e-ec49-444c-ae76-ef0b7a442dc0, chassis=[<ovs.db.idl.Row object at 0x7f808c075610>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f808c075610>], logical_port=5dac6bd1-d4a3-4582-9c29-07477bd4a21f) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 09:59:59 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:59:59Z|00301|binding|INFO|Removing lport 5dac6bd1-d4a3-4582-9c29-07477bd4a21f ovn-installed in OVS
Feb 23 09:59:59 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:59:59.600 163572 INFO neutron.agent.ovn.metadata.agent [-] Port 5dac6bd1-d4a3-4582-9c29-07477bd4a21f in datapath 1d4e57e1-6bfe-49bd-917a-f0bd4f0ec451 unbound from our chassis
Feb 23 09:59:59 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:59:59.601 163572 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 1d4e57e1-6bfe-49bd-917a-f0bd4f0ec451 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Feb 23 09:59:59 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T09:59:59Z|00302|binding|INFO|Releasing lport 4143c8ea-7577-4792-9744-bcff90eb20f2 from this chassis (sb_readonly=0)
Feb 23 09:59:59 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:59:59.943 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:59:59 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 09:59:59.945 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[b86c58f7-8669-450c-8522-5db750187515]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 09:59:59 np0005626463.localdomain nova_compute[282206]: 2026-02-23 09:59:59.949 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 09:59:59 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.
Feb 23 09:59:59 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.
Feb 23 10:00:00 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [INF] : overall HEALTH_OK
Feb 23 10:00:00 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 10:00:00.235 2 INFO neutron.agent.securitygroups_rpc [None req-d445cf08-2b11-449c-8097-70ad611d5c5d 982b83c89c37422a910f5359ef7b6ea5 5bed3d4ae9fd4fe7b9440b4587246c14 - - default default] Security group member updated ['3f91b09d-b6ac-403f-adaf-7a684ee36fe5']
Feb 23 10:00:00 np0005626463.localdomain ceph-mon[294160]: pgmap v351: 177 pgs: 177 active+clean; 751 MiB data, 2.4 GiB used, 40 GiB / 42 GiB avail; 131 KiB/s rd, 23 MiB/s wr, 178 op/s
Feb 23 10:00:00 np0005626463.localdomain podman[318116]: 
Feb 23 10:00:00 np0005626463.localdomain podman[318090]: 2026-02-23 10:00:00.263057655 +0000 UTC m=+0.296420640 container health_status da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 23 10:00:00 np0005626463.localdomain podman[318100]: 2026-02-23 10:00:00.344958527 +0000 UTC m=+0.344493149 container health_status 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, version=9.7, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, name=ubi9/ubi-minimal, build-date=2026-02-05T04:57:10Z, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1770267347, container_name=openstack_network_exporter, io.buildah.version=1.33.7, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-02-05T04:57:10Z, distribution-scope=public, io.openshift.expose-services=)
Feb 23 10:00:00 np0005626463.localdomain podman[318116]: 2026-02-23 10:00:00.276644052 +0000 UTC m=+0.250661513 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 23 10:00:00 np0005626463.localdomain podman[318116]: 2026-02-23 10:00:00.373511915 +0000 UTC m=+0.347529336 container create 5d2acad13ee6a591c5fdcbdc19983bbfb59f65ec6dad2fa886ed7b4890b56913 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1d4e57e1-6bfe-49bd-917a-f0bd4f0ec451, io.buildah.version=1.43.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Feb 23 10:00:00 np0005626463.localdomain podman[318100]: 2026-02-23 10:00:00.390499138 +0000 UTC m=+0.390033760 container exec_died 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.created=2026-02-05T04:57:10Z, io.buildah.version=1.33.7, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, version=9.7, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, build-date=2026-02-05T04:57:10Z, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, name=ubi9/ubi-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., release=1770267347, io.openshift.tags=minimal rhel9, io.openshift.expose-services=)
Feb 23 10:00:00 np0005626463.localdomain podman[318090]: 2026-02-23 10:00:00.397613121 +0000 UTC m=+0.430976146 container exec_died da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 23 10:00:00 np0005626463.localdomain systemd[1]: 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.service: Deactivated successfully.
Feb 23 10:00:00 np0005626463.localdomain systemd[1]: Started libpod-conmon-5d2acad13ee6a591c5fdcbdc19983bbfb59f65ec6dad2fa886ed7b4890b56913.scope.
Feb 23 10:00:00 np0005626463.localdomain systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Deactivated successfully.
Feb 23 10:00:00 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 10:00:00 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/01d3abec598585e6b85eea6b9d1d90a4888675fb80fe787c6c9e936cebc8d754/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 23 10:00:00 np0005626463.localdomain podman[318116]: 2026-02-23 10:00:00.461553879 +0000 UTC m=+0.435571330 container init 5d2acad13ee6a591c5fdcbdc19983bbfb59f65ec6dad2fa886ed7b4890b56913 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1d4e57e1-6bfe-49bd-917a-f0bd4f0ec451, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216)
Feb 23 10:00:00 np0005626463.localdomain podman[318116]: 2026-02-23 10:00:00.470357276 +0000 UTC m=+0.444374707 container start 5d2acad13ee6a591c5fdcbdc19983bbfb59f65ec6dad2fa886ed7b4890b56913 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1d4e57e1-6bfe-49bd-917a-f0bd4f0ec451, org.label-schema.build-date=20260216, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.43.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Feb 23 10:00:00 np0005626463.localdomain dnsmasq[318156]: started, version 2.85 cachesize 150
Feb 23 10:00:00 np0005626463.localdomain dnsmasq[318156]: DNS service limited to local subnets
Feb 23 10:00:00 np0005626463.localdomain dnsmasq[318156]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 23 10:00:00 np0005626463.localdomain dnsmasq[318156]: warning: no upstream servers configured
Feb 23 10:00:00 np0005626463.localdomain dnsmasq-dhcp[318156]: DHCP, static leases only on 10.100.0.0, lease time 1d
Feb 23 10:00:00 np0005626463.localdomain dnsmasq[318156]: read /var/lib/neutron/dhcp/1d4e57e1-6bfe-49bd-917a-f0bd4f0ec451/addn_hosts - 0 addresses
Feb 23 10:00:00 np0005626463.localdomain dnsmasq-dhcp[318156]: read /var/lib/neutron/dhcp/1d4e57e1-6bfe-49bd-917a-f0bd4f0ec451/host
Feb 23 10:00:00 np0005626463.localdomain dnsmasq-dhcp[318156]: read /var/lib/neutron/dhcp/1d4e57e1-6bfe-49bd-917a-f0bd4f0ec451/opts
Feb 23 10:00:00 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 10:00:00.638 265541 INFO neutron.agent.dhcp.agent [None req-e0e9c5f7-ced6-4771-bf8f-3b9fe0e5d235 - - - - - -] DHCP configuration for ports {'8327b9e9-ba3e-419e-9c3e-f43bbcff0dd0'} is completed
Feb 23 10:00:00 np0005626463.localdomain dnsmasq[318156]: exiting on receipt of SIGTERM
Feb 23 10:00:00 np0005626463.localdomain podman[318174]: 2026-02-23 10:00:00.708626179 +0000 UTC m=+0.060711448 container kill 5d2acad13ee6a591c5fdcbdc19983bbfb59f65ec6dad2fa886ed7b4890b56913 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1d4e57e1-6bfe-49bd-917a-f0bd4f0ec451, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260216, tcib_build_tag=8419493e1fd846703d277695e03fc5eb)
Feb 23 10:00:00 np0005626463.localdomain systemd[1]: libpod-5d2acad13ee6a591c5fdcbdc19983bbfb59f65ec6dad2fa886ed7b4890b56913.scope: Deactivated successfully.
Feb 23 10:00:00 np0005626463.localdomain podman[318187]: 2026-02-23 10:00:00.780681232 +0000 UTC m=+0.054751891 container died 5d2acad13ee6a591c5fdcbdc19983bbfb59f65ec6dad2fa886ed7b4890b56913 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1d4e57e1-6bfe-49bd-917a-f0bd4f0ec451, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Feb 23 10:00:00 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 10:00:00.807 2 INFO neutron.agent.securitygroups_rpc [None req-f716d86f-e353-494e-a028-e9ccf623762c 982b83c89c37422a910f5359ef7b6ea5 5bed3d4ae9fd4fe7b9440b4587246c14 - - default default] Security group member updated ['3f91b09d-b6ac-403f-adaf-7a684ee36fe5']
Feb 23 10:00:00 np0005626463.localdomain podman[318187]: 2026-02-23 10:00:00.86402295 +0000 UTC m=+0.138093559 container cleanup 5d2acad13ee6a591c5fdcbdc19983bbfb59f65ec6dad2fa886ed7b4890b56913 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1d4e57e1-6bfe-49bd-917a-f0bd4f0ec451, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260216, tcib_managed=true)
Feb 23 10:00:00 np0005626463.localdomain systemd[1]: libpod-conmon-5d2acad13ee6a591c5fdcbdc19983bbfb59f65ec6dad2fa886ed7b4890b56913.scope: Deactivated successfully.
Feb 23 10:00:00 np0005626463.localdomain podman[318188]: 2026-02-23 10:00:00.90447541 +0000 UTC m=+0.172859720 container remove 5d2acad13ee6a591c5fdcbdc19983bbfb59f65ec6dad2fa886ed7b4890b56913 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1d4e57e1-6bfe-49bd-917a-f0bd4f0ec451, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0)
Feb 23 10:00:00 np0005626463.localdomain kernel: device tap5dac6bd1-d4 left promiscuous mode
Feb 23 10:00:00 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:00:00.953 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:00:00 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:00:00.965 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:00:01 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 10:00:01.000 265541 INFO neutron.agent.dhcp.agent [None req-2051c492-3c01-4c11-90ee-a3bfdd2c9f7f - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 23 10:00:01 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 10:00:01.001 265541 INFO neutron.agent.dhcp.agent [None req-2051c492-3c01-4c11-90ee-a3bfdd2c9f7f - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 23 10:00:01 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-01d3abec598585e6b85eea6b9d1d90a4888675fb80fe787c6c9e936cebc8d754-merged.mount: Deactivated successfully.
Feb 23 10:00:01 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5d2acad13ee6a591c5fdcbdc19983bbfb59f65ec6dad2fa886ed7b4890b56913-userdata-shm.mount: Deactivated successfully.
Feb 23 10:00:01 np0005626463.localdomain systemd[1]: run-netns-qdhcp\x2d1d4e57e1\x2d6bfe\x2d49bd\x2d917a\x2df0bd4f0ec451.mount: Deactivated successfully.
Feb 23 10:00:01 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/1283440830' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 10:00:01 np0005626463.localdomain ceph-mon[294160]: overall HEALTH_OK
Feb 23 10:00:01 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/494365671' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 23 10:00:01 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/494365671' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 23 10:00:01 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e167 do_prune osdmap full prune enabled
Feb 23 10:00:01 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e168 e168: 6 total, 6 up, 6 in
Feb 23 10:00:01 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e168: 6 total, 6 up, 6 in
Feb 23 10:00:02 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e168 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 10:00:02 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e168 do_prune osdmap full prune enabled
Feb 23 10:00:02 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e169 e169: 6 total, 6 up, 6 in
Feb 23 10:00:02 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e169: 6 total, 6 up, 6 in
Feb 23 10:00:02 np0005626463.localdomain ceph-mon[294160]: pgmap v352: 177 pgs: 177 active+clean; 807 MiB data, 2.6 GiB used, 39 GiB / 42 GiB avail; 160 KiB/s rd, 32 MiB/s wr, 225 op/s
Feb 23 10:00:02 np0005626463.localdomain ceph-mon[294160]: osdmap e168: 6 total, 6 up, 6 in
Feb 23 10:00:02 np0005626463.localdomain ceph-mon[294160]: osdmap e169: 6 total, 6 up, 6 in
Feb 23 10:00:03 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:00:03.491 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:00:04 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e169 do_prune osdmap full prune enabled
Feb 23 10:00:04 np0005626463.localdomain ceph-mon[294160]: pgmap v355: 177 pgs: 177 active+clean; 855 MiB data, 2.7 GiB used, 39 GiB / 42 GiB avail; 74 KiB/s rd, 17 MiB/s wr, 103 op/s
Feb 23 10:00:04 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/252796869' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 23 10:00:04 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/252796869' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 23 10:00:04 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e170 e170: 6 total, 6 up, 6 in
Feb 23 10:00:04 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e170: 6 total, 6 up, 6 in
Feb 23 10:00:04 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Feb 23 10:00:04 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 10:00:05 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "2daba99f-25c4-4b16-a8ea-ce269d15600b", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 23 10:00:05 np0005626463.localdomain ceph-mon[294160]: osdmap e170: 6 total, 6 up, 6 in
Feb 23 10:00:05 np0005626463.localdomain ceph-mon[294160]: from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 10:00:05 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/2946249327' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 23 10:00:05 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/2946249327' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 23 10:00:05 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : mgrmap e46: np0005626465.hlpkwo(active, since 9m), standbys: np0005626463.wtksup, np0005626466.nisqfq
Feb 23 10:00:06 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "2daba99f-25c4-4b16-a8ea-ce269d15600b", "format": "json"}]: dispatch
Feb 23 10:00:06 np0005626463.localdomain ceph-mon[294160]: pgmap v357: 177 pgs: 177 active+clean; 855 MiB data, 2.7 GiB used, 39 GiB / 42 GiB avail; 74 KiB/s rd, 17 MiB/s wr, 103 op/s
Feb 23 10:00:06 np0005626463.localdomain ceph-mon[294160]: mgrmap e46: np0005626465.hlpkwo(active, since 9m), standbys: np0005626463.wtksup, np0005626466.nisqfq
Feb 23 10:00:06 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/2940900338' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 10:00:07 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e170 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 10:00:07 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e170 do_prune osdmap full prune enabled
Feb 23 10:00:07 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e171 e171: 6 total, 6 up, 6 in
Feb 23 10:00:07 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e171: 6 total, 6 up, 6 in
Feb 23 10:00:08 np0005626463.localdomain ceph-mon[294160]: pgmap v358: 177 pgs: 177 active+clean; 991 MiB data, 3.1 GiB used, 39 GiB / 42 GiB avail; 112 KiB/s rd, 31 MiB/s wr, 155 op/s
Feb 23 10:00:08 np0005626463.localdomain ceph-mon[294160]: osdmap e171: 6 total, 6 up, 6 in
Feb 23 10:00:08 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e171 do_prune osdmap full prune enabled
Feb 23 10:00:08 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e172 e172: 6 total, 6 up, 6 in
Feb 23 10:00:08 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e172: 6 total, 6 up, 6 in
Feb 23 10:00:08 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:00:08.494 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 10:00:08 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:00:08.495 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:00:08 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:00:08.495 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 10:00:08 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:00:08.496 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 10:00:08 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:00:08.496 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 10:00:08 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:00:08.498 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:00:09 np0005626463.localdomain podman[242954]: time="2026-02-23T10:00:09Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 23 10:00:09 np0005626463.localdomain ceph-mon[294160]: osdmap e172: 6 total, 6 up, 6 in
Feb 23 10:00:09 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "2daba99f-25c4-4b16-a8ea-ce269d15600b", "format": "json"}]: dispatch
Feb 23 10:00:09 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "2daba99f-25c4-4b16-a8ea-ce269d15600b", "force": true, "format": "json"}]: dispatch
Feb 23 10:00:09 np0005626463.localdomain ceph-mon[294160]: pgmap v361: 177 pgs: 177 active+clean; 991 MiB data, 3.1 GiB used, 39 GiB / 42 GiB avail; 66 KiB/s rd, 23 MiB/s wr, 98 op/s
Feb 23 10:00:09 np0005626463.localdomain podman[242954]: @ - - [23/Feb/2026:10:00:09 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 160733 "" "Go-http-client/1.1"
Feb 23 10:00:09 np0005626463.localdomain podman[242954]: @ - - [23/Feb/2026:10:00:09 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19764 "" "Go-http-client/1.1"
Feb 23 10:00:10 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e172 do_prune osdmap full prune enabled
Feb 23 10:00:10 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e173 e173: 6 total, 6 up, 6 in
Feb 23 10:00:10 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e173: 6 total, 6 up, 6 in
Feb 23 10:00:10 np0005626463.localdomain ceph-osd[31633]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #44. Immutable memtables: 1.
Feb 23 10:00:11 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:00:11.169 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:00:11 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 10:00:11.169 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=19, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '22:68:bc', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'c6:19:65:94:49:af'}, ipsec=False) old=SB_Global(nb_cfg=18) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 10:00:11 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 10:00:11.173 163572 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 23 10:00:11 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.
Feb 23 10:00:11 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.
Feb 23 10:00:11 np0005626463.localdomain podman[318217]: 2026-02-23 10:00:11.504620317 +0000 UTC m=+0.107304441 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.license=GPLv2, container_name=ovn_controller, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller)
Feb 23 10:00:11 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 10:00:11.515 265541 INFO neutron.agent.linux.ip_lib [None req-3c25de25-c79a-40f5-a7f7-41e6e4684a57 - - - - - -] Device tapa44c3247-d9 cannot be used as it has no MAC address
Feb 23 10:00:11 np0005626463.localdomain podman[318218]: 2026-02-23 10:00:11.555886647 +0000 UTC m=+0.158592902 container health_status bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Feb 23 10:00:11 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:00:11.559 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:00:11 np0005626463.localdomain kernel: device tapa44c3247-d9 entered promiscuous mode
Feb 23 10:00:11 np0005626463.localdomain podman[318218]: 2026-02-23 10:00:11.568485743 +0000 UTC m=+0.171192068 container exec_died bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 23 10:00:11 np0005626463.localdomain NetworkManager[5974]: <info>  [1771840811.5695] manager: (tapa44c3247-d9): new Generic device (/org/freedesktop/NetworkManager/Devices/50)
Feb 23 10:00:11 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T10:00:11Z|00303|binding|INFO|Claiming lport a44c3247-d9d2-43bf-89f0-20a41979b22d for this chassis.
Feb 23 10:00:11 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T10:00:11Z|00304|binding|INFO|a44c3247-d9d2-43bf-89f0-20a41979b22d: Claiming unknown
Feb 23 10:00:11 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:00:11.570 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:00:11 np0005626463.localdomain systemd-udevd[318272]: Network interface NamePolicy= disabled on kernel command line.
Feb 23 10:00:11 np0005626463.localdomain podman[318217]: 2026-02-23 10:00:11.582574995 +0000 UTC m=+0.185259129 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 23 10:00:11 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 10:00:11.581 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005626463.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpfb23302c-55c1-5de0-badf-4fc1ff22837a-9cf7565c-6c34-4032-874f-5045764c2d40', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9cf7565c-6c34-4032-874f-5045764c2d40', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '51dc993993124a3f926af711e8b0f088', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2fc70b01-d165-4d08-954b-a4aac704fc26, chassis=[<ovs.db.idl.Row object at 0x7f808c075610>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f808c075610>], logical_port=a44c3247-d9d2-43bf-89f0-20a41979b22d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 10:00:11 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 10:00:11.586 163572 INFO neutron.agent.ovn.metadata.agent [-] Port a44c3247-d9d2-43bf-89f0-20a41979b22d in datapath 9cf7565c-6c34-4032-874f-5045764c2d40 bound to our chassis
Feb 23 10:00:11 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 10:00:11.592 163572 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 9cf7565c-6c34-4032-874f-5045764c2d40 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Feb 23 10:00:11 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 10:00:11.594 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[ac4895ad-5821-47a7-9667-4a531ddc0105]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 10:00:11 np0005626463.localdomain systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully.
Feb 23 10:00:11 np0005626463.localdomain systemd[1]: bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.service: Deactivated successfully.
Feb 23 10:00:11 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tapa44c3247-d9: No such device
Feb 23 10:00:11 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T10:00:11Z|00305|binding|INFO|Setting lport a44c3247-d9d2-43bf-89f0-20a41979b22d ovn-installed in OVS
Feb 23 10:00:11 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T10:00:11Z|00306|binding|INFO|Setting lport a44c3247-d9d2-43bf-89f0-20a41979b22d up in Southbound
Feb 23 10:00:11 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tapa44c3247-d9: No such device
Feb 23 10:00:11 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:00:11.621 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:00:11 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tapa44c3247-d9: No such device
Feb 23 10:00:11 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tapa44c3247-d9: No such device
Feb 23 10:00:11 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tapa44c3247-d9: No such device
Feb 23 10:00:11 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tapa44c3247-d9: No such device
Feb 23 10:00:11 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tapa44c3247-d9: No such device
Feb 23 10:00:11 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tapa44c3247-d9: No such device
Feb 23 10:00:11 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e173 do_prune osdmap full prune enabled
Feb 23 10:00:11 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:00:11.669 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:00:11 np0005626463.localdomain ceph-mon[294160]: osdmap e173: 6 total, 6 up, 6 in
Feb 23 10:00:11 np0005626463.localdomain ceph-mon[294160]: pgmap v363: 177 pgs: 177 active+clean; 1.0 GiB data, 3.4 GiB used, 39 GiB / 42 GiB avail; 82 KiB/s rd, 35 MiB/s wr, 130 op/s
Feb 23 10:00:11 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e174 e174: 6 total, 6 up, 6 in
Feb 23 10:00:11 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:00:11.756 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:00:11 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : mgrmap e47: np0005626465.hlpkwo(active, since 9m), standbys: np0005626463.wtksup, np0005626466.nisqfq
Feb 23 10:00:11 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e174: 6 total, 6 up, 6 in
Feb 23 10:00:12 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e174 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 23 10:00:12 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e174 do_prune osdmap full prune enabled
Feb 23 10:00:12 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e175 e175: 6 total, 6 up, 6 in
Feb 23 10:00:12 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e175: 6 total, 6 up, 6 in
Feb 23 10:00:12 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T10:00:12Z|00307|binding|INFO|Removing iface tapa44c3247-d9 ovn-installed in OVS
Feb 23 10:00:12 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T10:00:12Z|00308|binding|INFO|Removing lport a44c3247-d9d2-43bf-89f0-20a41979b22d ovn-installed in OVS
Feb 23 10:00:12 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:00:12.386 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:00:12 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:00:12.395 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:00:12 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 10:00:12.384 163572 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port ecdcc65f-a986-4b6d-a221-9e63ddf8c10a with type ""
Feb 23 10:00:12 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 10:00:12.396 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005626463.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'dhcpfb23302c-55c1-5de0-badf-4fc1ff22837a-9cf7565c-6c34-4032-874f-5045764c2d40', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9cf7565c-6c34-4032-874f-5045764c2d40', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '51dc993993124a3f926af711e8b0f088', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005626463.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2fc70b01-d165-4d08-954b-a4aac704fc26, chassis=[<ovs.db.idl.Row object at 0x7f808c075610>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f808c075610>], logical_port=a44c3247-d9d2-43bf-89f0-20a41979b22d) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 10:00:12 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 10:00:12.402 163572 INFO neutron.agent.ovn.metadata.agent [-] Port a44c3247-d9d2-43bf-89f0-20a41979b22d in datapath 9cf7565c-6c34-4032-874f-5045764c2d40 unbound from our chassis
Feb 23 10:00:12 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 10:00:12.406 163572 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 9cf7565c-6c34-4032-874f-5045764c2d40 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Feb 23 10:00:12 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 10:00:12.407 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[d10e106e-d21a-40f8-b8fb-1670a2dac261]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 10:00:12 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T10:00:12Z|00309|binding|INFO|Releasing lport 4143c8ea-7577-4792-9744-bcff90eb20f2 from this chassis (sb_readonly=0)
Feb 23 10:00:12 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:00:12.609 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:00:12 np0005626463.localdomain podman[318343]: 
Feb 23 10:00:12 np0005626463.localdomain podman[318343]: 2026-02-23 10:00:12.708439274 +0000 UTC m=+0.089403299 container create 1af71bd71e33e7dc3032bc4dd17ddb84dae489562cbc7bff71efd680a1110ab2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9cf7565c-6c34-4032-874f-5045764c2d40, org.label-schema.build-date=20260216, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0)
Feb 23 10:00:12 np0005626463.localdomain ceph-mon[294160]: mgrmap e47: np0005626465.hlpkwo(active, since 9m), standbys: np0005626463.wtksup, np0005626466.nisqfq
Feb 23 10:00:12 np0005626463.localdomain ceph-mon[294160]: osdmap e174: 6 total, 6 up, 6 in
Feb 23 10:00:12 np0005626463.localdomain ceph-mon[294160]: osdmap e175: 6 total, 6 up, 6 in
Feb 23 10:00:12 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/3700784966' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 23 10:00:12 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/3700784966' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 23 10:00:12 np0005626463.localdomain podman[318343]: 2026-02-23 10:00:12.661946813 +0000 UTC m=+0.042910878 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 23 10:00:12 np0005626463.localdomain systemd[1]: Started libpod-conmon-1af71bd71e33e7dc3032bc4dd17ddb84dae489562cbc7bff71efd680a1110ab2.scope.
Feb 23 10:00:12 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 10:00:12 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3a31478153c0cdc9bae681bf35ede0b8adfe37bddc02d2a887055da94332473a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 23 10:00:12 np0005626463.localdomain podman[318343]: 2026-02-23 10:00:12.799378339 +0000 UTC m=+0.180342364 container init 1af71bd71e33e7dc3032bc4dd17ddb84dae489562cbc7bff71efd680a1110ab2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9cf7565c-6c34-4032-874f-5045764c2d40, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20260216, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.schema-version=1.0)
Feb 23 10:00:12 np0005626463.localdomain podman[318343]: 2026-02-23 10:00:12.808661031 +0000 UTC m=+0.189625046 container start 1af71bd71e33e7dc3032bc4dd17ddb84dae489562cbc7bff71efd680a1110ab2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9cf7565c-6c34-4032-874f-5045764c2d40, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS)
Feb 23 10:00:12 np0005626463.localdomain dnsmasq[318360]: started, version 2.85 cachesize 150
Feb 23 10:00:12 np0005626463.localdomain dnsmasq[318360]: DNS service limited to local subnets
Feb 23 10:00:12 np0005626463.localdomain dnsmasq[318360]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 23 10:00:12 np0005626463.localdomain dnsmasq[318360]: warning: no upstream servers configured
Feb 23 10:00:12 np0005626463.localdomain dnsmasq-dhcp[318360]: DHCP, static leases only on 10.100.0.0, lease time 1d
Feb 23 10:00:12 np0005626463.localdomain dnsmasq[318360]: read /var/lib/neutron/dhcp/9cf7565c-6c34-4032-874f-5045764c2d40/addn_hosts - 0 addresses
Feb 23 10:00:12 np0005626463.localdomain dnsmasq-dhcp[318360]: read /var/lib/neutron/dhcp/9cf7565c-6c34-4032-874f-5045764c2d40/host
Feb 23 10:00:12 np0005626463.localdomain dnsmasq-dhcp[318360]: read /var/lib/neutron/dhcp/9cf7565c-6c34-4032-874f-5045764c2d40/opts
Feb 23 10:00:12 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 10:00:12.900 265541 INFO neutron.agent.dhcp.agent [None req-bb1cc315-296f-4d7a-b16b-9b377c3be573 - - - - - -] DHCP configuration for ports {'4f81ba9a-180b-410b-a42c-37e692c5652b'} is completed
Feb 23 10:00:13 np0005626463.localdomain dnsmasq[318360]: exiting on receipt of SIGTERM
Feb 23 10:00:13 np0005626463.localdomain podman[318377]: 2026-02-23 10:00:13.026535293 +0000 UTC m=+0.058701314 container kill 1af71bd71e33e7dc3032bc4dd17ddb84dae489562cbc7bff71efd680a1110ab2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9cf7565c-6c34-4032-874f-5045764c2d40, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 23 10:00:13 np0005626463.localdomain systemd[1]: libpod-1af71bd71e33e7dc3032bc4dd17ddb84dae489562cbc7bff71efd680a1110ab2.scope: Deactivated successfully.
Feb 23 10:00:13 np0005626463.localdomain podman[318390]: 2026-02-23 10:00:13.085050691 +0000 UTC m=+0.046842892 container died 1af71bd71e33e7dc3032bc4dd17ddb84dae489562cbc7bff71efd680a1110ab2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9cf7565c-6c34-4032-874f-5045764c2d40, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Feb 23 10:00:13 np0005626463.localdomain podman[318390]: 2026-02-23 10:00:13.16522897 +0000 UTC m=+0.127021141 container cleanup 1af71bd71e33e7dc3032bc4dd17ddb84dae489562cbc7bff71efd680a1110ab2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9cf7565c-6c34-4032-874f-5045764c2d40, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.43.0, org.label-schema.build-date=20260216)
Feb 23 10:00:13 np0005626463.localdomain systemd[1]: libpod-conmon-1af71bd71e33e7dc3032bc4dd17ddb84dae489562cbc7bff71efd680a1110ab2.scope: Deactivated successfully.
Feb 23 10:00:13 np0005626463.localdomain podman[318397]: 2026-02-23 10:00:13.223571572 +0000 UTC m=+0.166346476 container remove 1af71bd71e33e7dc3032bc4dd17ddb84dae489562cbc7bff71efd680a1110ab2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9cf7565c-6c34-4032-874f-5045764c2d40, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb)
Feb 23 10:00:13 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:00:13.238 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:00:13 np0005626463.localdomain kernel: device tapa44c3247-d9 left promiscuous mode
Feb 23 10:00:13 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:00:13.257 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:00:13 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 10:00:13.282 265541 INFO neutron.agent.dhcp.agent [None req-63f71cf9-e17c-4ac8-8877-bfe3b1ae197b - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 23 10:00:13 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 10:00:13.283 265541 INFO neutron.agent.dhcp.agent [None req-63f71cf9-e17c-4ac8-8877-bfe3b1ae197b - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 23 10:00:13 np0005626463.localdomain openstack_network_exporter[245358]: ERROR   10:00:13 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 23 10:00:13 np0005626463.localdomain openstack_network_exporter[245358]: 
Feb 23 10:00:13 np0005626463.localdomain openstack_network_exporter[245358]: ERROR   10:00:13 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 23 10:00:13 np0005626463.localdomain openstack_network_exporter[245358]: 
Feb 23 10:00:13 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:00:13.496 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:00:13 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-3a31478153c0cdc9bae681bf35ede0b8adfe37bddc02d2a887055da94332473a-merged.mount: Deactivated successfully.
Feb 23 10:00:13 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-1af71bd71e33e7dc3032bc4dd17ddb84dae489562cbc7bff71efd680a1110ab2-userdata-shm.mount: Deactivated successfully.
Feb 23 10:00:13 np0005626463.localdomain systemd[1]: run-netns-qdhcp\x2d9cf7565c\x2d6c34\x2d4032\x2d874f\x2d5045764c2d40.mount: Deactivated successfully.
Feb 23 10:00:13 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/718401093' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 10:00:13 np0005626463.localdomain ceph-mon[294160]: pgmap v366: 177 pgs: 177 active+clean; 1.1 GiB data, 3.5 GiB used, 38 GiB / 42 GiB avail; 108 KiB/s rd, 25 MiB/s wr, 153 op/s
Feb 23 10:00:13 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Feb 23 10:00:13 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 10:00:14 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 10:00:14.176 163572 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=96b5bb93-7341-4ce6-9b93-6a5de566c711, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '19'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 10:00:14 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e175 do_prune osdmap full prune enabled
Feb 23 10:00:14 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e176 e176: 6 total, 6 up, 6 in
Feb 23 10:00:14 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e176: 6 total, 6 up, 6 in
Feb 23 10:00:14 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "7b2a2c48-552a-4612-b861-e74517b6032f", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 23 10:00:14 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "7b2a2c48-552a-4612-b861-e74517b6032f", "format": "json"}]: dispatch
Feb 23 10:00:14 np0005626463.localdomain ceph-mon[294160]: from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 10:00:15 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e176 do_prune osdmap full prune enabled
Feb 23 10:00:15 np0005626463.localdomain ceph-mon[294160]: osdmap e176: 6 total, 6 up, 6 in
Feb 23 10:00:15 np0005626463.localdomain ceph-mon[294160]: pgmap v368: 177 pgs: 177 active+clean; 1.1 GiB data, 3.5 GiB used, 38 GiB / 42 GiB avail; 115 KiB/s rd, 26 MiB/s wr, 163 op/s
Feb 23 10:00:15 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e177 e177: 6 total, 6 up, 6 in
Feb 23 10:00:15 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e177: 6 total, 6 up, 6 in
Feb 23 10:00:16 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.
Feb 23 10:00:16 np0005626463.localdomain ceph-mon[294160]: osdmap e177: 6 total, 6 up, 6 in
Feb 23 10:00:16 np0005626463.localdomain systemd[1]: tmp-crun.C2FKdt.mount: Deactivated successfully.
Feb 23 10:00:16 np0005626463.localdomain podman[318420]: 2026-02-23 10:00:16.922395288 +0000 UTC m=+0.095747418 container health_status be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_managed=true, org.label-schema.build-date=20260216, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2)
Feb 23 10:00:16 np0005626463.localdomain podman[318420]: 2026-02-23 10:00:16.937358818 +0000 UTC m=+0.110710998 container exec_died be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2)
Feb 23 10:00:16 np0005626463.localdomain systemd[1]: be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.service: Deactivated successfully.
Feb 23 10:00:17 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e177 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 23 10:00:17 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e177 do_prune osdmap full prune enabled
Feb 23 10:00:17 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e178 e178: 6 total, 6 up, 6 in
Feb 23 10:00:17 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e178: 6 total, 6 up, 6 in
Feb 23 10:00:17 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Feb 23 10:00:17 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 10:00:17 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "fcbab7a3-8c51-4c59-a723-d901625bf91c", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 23 10:00:17 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/3461046797' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 23 10:00:17 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/3461046797' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 23 10:00:17 np0005626463.localdomain ceph-mon[294160]: osdmap e178: 6 total, 6 up, 6 in
Feb 23 10:00:17 np0005626463.localdomain ceph-mon[294160]: pgmap v371: 177 pgs: 177 active+clean; 1.2 GiB data, 3.9 GiB used, 38 GiB / 42 GiB avail; 94 KiB/s rd, 24 MiB/s wr, 141 op/s
Feb 23 10:00:17 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "fcbab7a3-8c51-4c59-a723-d901625bf91c", "format": "json"}]: dispatch
Feb 23 10:00:17 np0005626463.localdomain ceph-mon[294160]: from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 10:00:17 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/1432835237' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 23 10:00:17 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/1432835237' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 23 10:00:18 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:00:18.500 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:00:18 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.
Feb 23 10:00:18 np0005626463.localdomain podman[318441]: 2026-02-23 10:00:18.914793021 +0000 UTC m=+0.086708934 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260216, container_name=ovn_metadata_agent, io.buildah.version=1.43.0)
Feb 23 10:00:18 np0005626463.localdomain podman[318441]: 2026-02-23 10:00:18.920997396 +0000 UTC m=+0.092913319 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2)
Feb 23 10:00:18 np0005626463.localdomain systemd[1]: 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully.
Feb 23 10:00:19 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e178 do_prune osdmap full prune enabled
Feb 23 10:00:19 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e179 e179: 6 total, 6 up, 6 in
Feb 23 10:00:19 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e179: 6 total, 6 up, 6 in
Feb 23 10:00:19 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 10:00:19.947 2 INFO neutron.agent.securitygroups_rpc [req-eeab04f9-52db-422e-9728-ee390003483c req-72446ef4-1fda-4ad3-9026-f16aa7abb40d f49fd8b6937445efab40892d03b375d7 0421515e6bb54dea8db3ed218999e195 - - default default] Security group member updated ['c46df023-9a3e-4c54-a0bb-44b675220af4']
Feb 23 10:00:20 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Feb 23 10:00:20 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 10:00:20 np0005626463.localdomain ceph-mon[294160]: osdmap e179: 6 total, 6 up, 6 in
Feb 23 10:00:20 np0005626463.localdomain ceph-mon[294160]: pgmap v373: 177 pgs: 177 active+clean; 1.2 GiB data, 3.9 GiB used, 38 GiB / 42 GiB avail; 109 KiB/s rd, 27 MiB/s wr, 163 op/s
Feb 23 10:00:20 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/1137513287' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 10:00:20 np0005626463.localdomain ceph-mon[294160]: from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 10:00:20 np0005626463.localdomain dnsmasq[316315]: read /var/lib/neutron/dhcp/ff7aa220-5765-44c6-9121-cfbd718241c5/addn_hosts - 1 addresses
Feb 23 10:00:20 np0005626463.localdomain dnsmasq-dhcp[316315]: read /var/lib/neutron/dhcp/ff7aa220-5765-44c6-9121-cfbd718241c5/host
Feb 23 10:00:20 np0005626463.localdomain systemd[1]: tmp-crun.KKj4Ya.mount: Deactivated successfully.
Feb 23 10:00:20 np0005626463.localdomain dnsmasq-dhcp[316315]: read /var/lib/neutron/dhcp/ff7aa220-5765-44c6-9121-cfbd718241c5/opts
Feb 23 10:00:20 np0005626463.localdomain podman[318478]: 2026-02-23 10:00:20.253743592 +0000 UTC m=+0.076968428 container kill e66f962a68cc6113335b6f9dd3920b4164c15c9c93c83a854a3d14ce3266d7bf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ff7aa220-5765-44c6-9121-cfbd718241c5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, org.label-schema.license=GPLv2)
Feb 23 10:00:21 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e179 do_prune osdmap full prune enabled
Feb 23 10:00:21 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "ee02e5c3-d4ec-4cf3-86f8-0c8d48f0b098", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 23 10:00:21 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "ee02e5c3-d4ec-4cf3-86f8-0c8d48f0b098", "format": "json"}]: dispatch
Feb 23 10:00:21 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.108:0/1064828933' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 10:00:21 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e180 e180: 6 total, 6 up, 6 in
Feb 23 10:00:21 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e180: 6 total, 6 up, 6 in
Feb 23 10:00:22 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e180 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 23 10:00:22 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e180 do_prune osdmap full prune enabled
Feb 23 10:00:22 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e181 e181: 6 total, 6 up, 6 in
Feb 23 10:00:22 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e181: 6 total, 6 up, 6 in
Feb 23 10:00:22 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume resize", "vol_name": "cephfs", "sub_name": "fcbab7a3-8c51-4c59-a723-d901625bf91c", "new_size": 2147483648, "format": "json"}]: dispatch
Feb 23 10:00:22 np0005626463.localdomain ceph-mon[294160]: pgmap v374: 177 pgs: 177 active+clean; 1.2 GiB data, 3.9 GiB used, 38 GiB / 42 GiB avail; 114 KiB/s rd, 27 MiB/s wr, 175 op/s
Feb 23 10:00:22 np0005626463.localdomain ceph-mon[294160]: osdmap e180: 6 total, 6 up, 6 in
Feb 23 10:00:22 np0005626463.localdomain ceph-mon[294160]: osdmap e181: 6 total, 6 up, 6 in
Feb 23 10:00:22 np0005626463.localdomain sudo[318500]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 10:00:22 np0005626463.localdomain sudo[318500]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 10:00:22 np0005626463.localdomain sudo[318500]: pam_unix(sudo:session): session closed for user root
Feb 23 10:00:23 np0005626463.localdomain sudo[318518]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 23 10:00:23 np0005626463.localdomain sudo[318518]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 10:00:23 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:00:23.084 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:00:23 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:00:23.085 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 23 10:00:23 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:00:23.085 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 23 10:00:23 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e181 do_prune osdmap full prune enabled
Feb 23 10:00:23 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/869925231' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 23 10:00:23 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/869925231' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 23 10:00:23 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e182 e182: 6 total, 6 up, 6 in
Feb 23 10:00:23 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e182: 6 total, 6 up, 6 in
Feb 23 10:00:23 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:00:23.481 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 23 10:00:23 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:00:23.483 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquired lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 23 10:00:23 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:00:23.483 282211 DEBUG nova.network.neutron [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 23 10:00:23 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:00:23.484 282211 DEBUG nova.objects.instance [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c2a7d92b-952f-46a7-8a6a-3322a48fcf4b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 23 10:00:23 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:00:23.501 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 10:00:23 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:00:23.503 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:00:23 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:00:23.504 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 10:00:23 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:00:23.504 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 10:00:23 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:00:23.505 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 10:00:23 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:00:23.509 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:00:23 np0005626463.localdomain sudo[318518]: pam_unix(sudo:session): session closed for user root
Feb 23 10:00:23 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Feb 23 10:00:23 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 10:00:23 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Feb 23 10:00:23 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 10:00:23 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Feb 23 10:00:23 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1428402500' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 23 10:00:23 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Feb 23 10:00:23 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1428402500' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 23 10:00:24 np0005626463.localdomain sudo[318567]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 10:00:24 np0005626463.localdomain sudo[318567]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 10:00:24 np0005626463.localdomain sudo[318567]: pam_unix(sudo:session): session closed for user root
Feb 23 10:00:24 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:00:24.039 282211 DEBUG nova.network.neutron [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Updating instance_info_cache with network_info: [{"id": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "address": "fa:16:3e:a0:9d:00", "network": {"id": "9da5b53d-3184-450f-9a5b-bdba1a6c9f6d", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "37b8098efb0d4ecc90b451a2db0e966f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa27e5011-20", "ovs_interfaceid": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 23 10:00:24 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:00:24.066 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Releasing lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 23 10:00:24 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:00:24.067 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 23 10:00:24 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:00:24.068 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:00:24 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:00:24.068 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 23 10:00:24 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e182 do_prune osdmap full prune enabled
Feb 23 10:00:24 np0005626463.localdomain ceph-mon[294160]: pgmap v377: 177 pgs: 177 active+clean; 640 MiB data, 3.5 GiB used, 38 GiB / 42 GiB avail; 124 KiB/s rd, 11 MiB/s wr, 191 op/s
Feb 23 10:00:24 np0005626463.localdomain ceph-mon[294160]: osdmap e182: 6 total, 6 up, 6 in
Feb 23 10:00:24 np0005626463.localdomain ceph-mon[294160]: from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 10:00:24 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 10:00:24 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 23 10:00:24 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 10:00:24 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/1428402500' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 23 10:00:24 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/1428402500' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 23 10:00:24 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 23 10:00:24 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e183 e183: 6 total, 6 up, 6 in
Feb 23 10:00:24 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e183: 6 total, 6 up, 6 in
Feb 23 10:00:24 np0005626463.localdomain sshd[318585]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 10:00:25 np0005626463.localdomain sshd[318585]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 23 10:00:25 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e183 do_prune osdmap full prune enabled
Feb 23 10:00:25 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "ee848905-323b-4447-944b-9bd735c2e380", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 23 10:00:25 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "ee848905-323b-4447-944b-9bd735c2e380", "format": "json"}]: dispatch
Feb 23 10:00:25 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "fcbab7a3-8c51-4c59-a723-d901625bf91c", "format": "json"}]: dispatch
Feb 23 10:00:25 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "fcbab7a3-8c51-4c59-a723-d901625bf91c", "force": true, "format": "json"}]: dispatch
Feb 23 10:00:25 np0005626463.localdomain ceph-mon[294160]: osdmap e183: 6 total, 6 up, 6 in
Feb 23 10:00:25 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.108:0/2946001217' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 10:00:25 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.108:0/3409382469' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 10:00:25 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e184 e184: 6 total, 6 up, 6 in
Feb 23 10:00:25 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e184: 6 total, 6 up, 6 in
Feb 23 10:00:25 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Feb 23 10:00:25 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 10:00:25 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Feb 23 10:00:25 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/4221272771' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 23 10:00:25 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Feb 23 10:00:25 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/4221272771' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 23 10:00:26 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T10:00:26Z|00310|binding|INFO|Releasing lport 4143c8ea-7577-4792-9744-bcff90eb20f2 from this chassis (sb_readonly=0)
Feb 23 10:00:26 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:00:26.207 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:00:26 np0005626463.localdomain ceph-mon[294160]: pgmap v380: 177 pgs: 177 active+clean; 640 MiB data, 3.5 GiB used, 38 GiB / 42 GiB avail; 134 KiB/s rd, 6.0 MiB/s wr, 202 op/s
Feb 23 10:00:26 np0005626463.localdomain ceph-mon[294160]: osdmap e184: 6 total, 6 up, 6 in
Feb 23 10:00:26 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 10:00:26 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/4221272771' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 23 10:00:26 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/4221272771' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 23 10:00:26 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e184 do_prune osdmap full prune enabled
Feb 23 10:00:26 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e185 e185: 6 total, 6 up, 6 in
Feb 23 10:00:26 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e185: 6 total, 6 up, 6 in
Feb 23 10:00:26 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Feb 23 10:00:26 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 10:00:27 np0005626463.localdomain dnsmasq[316315]: read /var/lib/neutron/dhcp/ff7aa220-5765-44c6-9121-cfbd718241c5/addn_hosts - 0 addresses
Feb 23 10:00:27 np0005626463.localdomain dnsmasq-dhcp[316315]: read /var/lib/neutron/dhcp/ff7aa220-5765-44c6-9121-cfbd718241c5/host
Feb 23 10:00:27 np0005626463.localdomain dnsmasq-dhcp[316315]: read /var/lib/neutron/dhcp/ff7aa220-5765-44c6-9121-cfbd718241c5/opts
Feb 23 10:00:27 np0005626463.localdomain podman[318604]: 2026-02-23 10:00:27.146414731 +0000 UTC m=+0.065735325 container kill e66f962a68cc6113335b6f9dd3920b4164c15c9c93c83a854a3d14ce3266d7bf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ff7aa220-5765-44c6-9121-cfbd718241c5, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260216, io.buildah.version=1.43.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 23 10:00:27 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e185 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 23 10:00:27 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Feb 23 10:00:27 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 10:00:27 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T10:00:27Z|00311|binding|INFO|Releasing lport d5a42e1b-5089-41c4-9d02-d28b44b515d2 from this chassis (sb_readonly=0)
Feb 23 10:00:27 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T10:00:27Z|00312|binding|INFO|Setting lport d5a42e1b-5089-41c4-9d02-d28b44b515d2 down in Southbound
Feb 23 10:00:27 np0005626463.localdomain kernel: device tapd5a42e1b-50 left promiscuous mode
Feb 23 10:00:27 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:00:27.427 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:00:27 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 10:00:27.433 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005626463.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpfb23302c-55c1-5de0-badf-4fc1ff22837a-ff7aa220-5765-44c6-9121-cfbd718241c5', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ff7aa220-5765-44c6-9121-cfbd718241c5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0421515e6bb54dea8db3ed218999e195', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005626463.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6cbbaed5-c16c-4b6f-96d8-1ef1b1b430f5, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f808c075610>], logical_port=d5a42e1b-5089-41c4-9d02-d28b44b515d2) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f808c075610>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 10:00:27 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 10:00:27.435 163572 INFO neutron.agent.ovn.metadata.agent [-] Port d5a42e1b-5089-41c4-9d02-d28b44b515d2 in datapath ff7aa220-5765-44c6-9121-cfbd718241c5 unbound from our chassis
Feb 23 10:00:27 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 10:00:27.438 163572 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ff7aa220-5765-44c6-9121-cfbd718241c5, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 23 10:00:27 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 10:00:27.439 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[e96f8653-33e4-4200-a548-a6093f712819]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 10:00:27 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:00:27.457 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:00:27 np0005626463.localdomain ceph-mon[294160]: osdmap e185: 6 total, 6 up, 6 in
Feb 23 10:00:27 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "ca4f151e-2925-40b3-9d6e-b36295484ff5", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 23 10:00:27 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/3846692408' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 23 10:00:27 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/3846692408' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 23 10:00:27 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "ca4f151e-2925-40b3-9d6e-b36295484ff5", "format": "json"}]: dispatch
Feb 23 10:00:27 np0005626463.localdomain ceph-mon[294160]: from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 10:00:27 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "40524a29-3861-4339-95ff-304a6d8eab80", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 23 10:00:27 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "40524a29-3861-4339-95ff-304a6d8eab80", "format": "json"}]: dispatch
Feb 23 10:00:27 np0005626463.localdomain ceph-mon[294160]: from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 10:00:27 np0005626463.localdomain ceph-mon[294160]: pgmap v383: 177 pgs: 177 active+clean; 192 MiB data, 935 MiB used, 41 GiB / 42 GiB avail; 229 KiB/s rd, 23 KiB/s wr, 330 op/s
Feb 23 10:00:27 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/1851345063' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 23 10:00:27 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/1851345063' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 23 10:00:27 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "ee848905-323b-4447-944b-9bd735c2e380", "snap_name": "89eb0a17-32bc-40f6-a334-3962941ec616", "format": "json"}]: dispatch
Feb 23 10:00:27 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.107:0/2503795885' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 10:00:28 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:00:28.054 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:00:28 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:00:28.075 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 10:00:28 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:00:28.076 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 10:00:28 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:00:28.076 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 10:00:28 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:00:28.077 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Auditing locally available compute resources for np0005626463.localdomain (node: np0005626463.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 23 10:00:28 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:00:28.077 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 10:00:28 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:00:28.548 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:00:28 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.107:0/1243287233' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 10:00:28 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 23 10:00:28 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/1372422225' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 10:00:28 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:00:28.617 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.540s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 10:00:28 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:00:28.697 282211 DEBUG nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 23 10:00:28 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:00:28.698 282211 DEBUG nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 23 10:00:28 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:00:28.909 282211 WARNING nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 23 10:00:28 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:00:28.911 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Hypervisor/Node resource view: name=np0005626463.localdomain free_ram=11285MB free_disk=41.8366584777832GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 23 10:00:28 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:00:28.912 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 10:00:28 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:00:28.912 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 10:00:28 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T10:00:28Z|00313|binding|INFO|Releasing lport 4143c8ea-7577-4792-9744-bcff90eb20f2 from this chassis (sb_readonly=0)
Feb 23 10:00:28 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:00:28.999 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Instance c2a7d92b-952f-46a7-8a6a-3322a48fcf4b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 23 10:00:29 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:00:29.000 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 23 10:00:29 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:00:29.000 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Final resource view: name=np0005626463.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 23 10:00:29 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:00:29.034 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 10:00:29 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:00:29.054 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:00:29 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 23 10:00:29 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/3101926543' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 10:00:29 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:00:29.544 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.510s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 10:00:29 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:00:29.551 282211 DEBUG nova.compute.provider_tree [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Inventory has not changed in ProviderTree for provider: be63d86c-a403-4ec9-a515-07ea2962cb4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 23 10:00:29 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.106:0/1372422225' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 10:00:29 np0005626463.localdomain ceph-mon[294160]: pgmap v384: 177 pgs: 177 active+clean; 192 MiB data, 935 MiB used, 41 GiB / 42 GiB avail; 155 KiB/s rd, 16 KiB/s wr, 222 op/s
Feb 23 10:00:29 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.106:0/3101926543' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 10:00:29 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:00:29.607 282211 DEBUG nova.scheduler.client.report [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Inventory has not changed for provider be63d86c-a403-4ec9-a515-07ea2962cb4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 23 10:00:29 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:00:29.610 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Compute_service record updated for np0005626463.localdomain:np0005626463.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 23 10:00:29 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:00:29.611 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.699s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 10:00:29 np0005626463.localdomain dnsmasq[316315]: exiting on receipt of SIGTERM
Feb 23 10:00:29 np0005626463.localdomain podman[318687]: 2026-02-23 10:00:29.822715263 +0000 UTC m=+0.063255608 container kill e66f962a68cc6113335b6f9dd3920b4164c15c9c93c83a854a3d14ce3266d7bf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ff7aa220-5765-44c6-9121-cfbd718241c5, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb)
Feb 23 10:00:29 np0005626463.localdomain systemd[1]: libpod-e66f962a68cc6113335b6f9dd3920b4164c15c9c93c83a854a3d14ce3266d7bf.scope: Deactivated successfully.
Feb 23 10:00:29 np0005626463.localdomain podman[318700]: 2026-02-23 10:00:29.892601768 +0000 UTC m=+0.053429189 container died e66f962a68cc6113335b6f9dd3920b4164c15c9c93c83a854a3d14ce3266d7bf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ff7aa220-5765-44c6-9121-cfbd718241c5, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Feb 23 10:00:29 np0005626463.localdomain systemd[1]: tmp-crun.y0e47k.mount: Deactivated successfully.
Feb 23 10:00:29 np0005626463.localdomain podman[318700]: 2026-02-23 10:00:29.949089332 +0000 UTC m=+0.109916723 container cleanup e66f962a68cc6113335b6f9dd3920b4164c15c9c93c83a854a3d14ce3266d7bf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ff7aa220-5765-44c6-9121-cfbd718241c5, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=8419493e1fd846703d277695e03fc5eb)
Feb 23 10:00:29 np0005626463.localdomain systemd[1]: libpod-conmon-e66f962a68cc6113335b6f9dd3920b4164c15c9c93c83a854a3d14ce3266d7bf.scope: Deactivated successfully.
Feb 23 10:00:29 np0005626463.localdomain podman[318702]: 2026-02-23 10:00:29.968862313 +0000 UTC m=+0.117901644 container remove e66f962a68cc6113335b6f9dd3920b4164c15c9c93c83a854a3d14ce3266d7bf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ff7aa220-5765-44c6-9121-cfbd718241c5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0)
Feb 23 10:00:30 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 10:00:30.131 265541 INFO neutron.agent.dhcp.agent [None req-eee5d287-8dd8-4baa-a93d-7c177f39d304 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 23 10:00:30 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 10:00:30.512 265541 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Feb 23 10:00:30 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:00:30.617 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:00:30 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:00:30.618 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:00:30 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:00:30.618 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:00:30 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/3812846647' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 10:00:30 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.
Feb 23 10:00:30 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.
Feb 23 10:00:30 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-0a4908f0bc9328306b516915e2de85425e40798a526663541b1416ff04dc528a-merged.mount: Deactivated successfully.
Feb 23 10:00:30 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e66f962a68cc6113335b6f9dd3920b4164c15c9c93c83a854a3d14ce3266d7bf-userdata-shm.mount: Deactivated successfully.
Feb 23 10:00:30 np0005626463.localdomain systemd[1]: run-netns-qdhcp\x2dff7aa220\x2d5765\x2d44c6\x2d9121\x2dcfbd718241c5.mount: Deactivated successfully.
Feb 23 10:00:30 np0005626463.localdomain podman[318730]: 2026-02-23 10:00:30.933818198 +0000 UTC m=+0.093856168 container health_status da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 23 10:00:30 np0005626463.localdomain podman[318730]: 2026-02-23 10:00:30.969243361 +0000 UTC m=+0.129281301 container exec_died da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 23 10:00:30 np0005626463.localdomain systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Deactivated successfully.
Feb 23 10:00:30 np0005626463.localdomain podman[318729]: 2026-02-23 10:00:30.986314677 +0000 UTC m=+0.148815544 container health_status 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, maintainer=Red Hat, Inc., architecture=x86_64, name=ubi9/ubi-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, vendor=Red Hat, Inc., version=9.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, release=1770267347, build-date=2026-02-05T04:57:10Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2026-02-05T04:57:10Z, managed_by=edpm_ansible, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, io.buildah.version=1.33.7, io.openshift.expose-services=, io.openshift.tags=minimal rhel9)
Feb 23 10:00:31 np0005626463.localdomain podman[318729]: 2026-02-23 10:00:31.00231509 +0000 UTC m=+0.164816027 container exec_died 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., version=9.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2026-02-05T04:57:10Z, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, distribution-scope=public, config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, vcs-type=git, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, release=1770267347, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, name=ubi9/ubi-minimal, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2026-02-05T04:57:10Z, io.openshift.expose-services=)
Feb 23 10:00:31 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 10:00:31.010 265541 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Feb 23 10:00:31 np0005626463.localdomain systemd[1]: 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.service: Deactivated successfully.
Feb 23 10:00:31 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e185 do_prune osdmap full prune enabled
Feb 23 10:00:31 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "ca4f151e-2925-40b3-9d6e-b36295484ff5", "format": "json"}]: dispatch
Feb 23 10:00:31 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume resize", "vol_name": "cephfs", "sub_name": "40524a29-3861-4339-95ff-304a6d8eab80", "new_size": 2147483648, "format": "json"}]: dispatch
Feb 23 10:00:31 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "ca4f151e-2925-40b3-9d6e-b36295484ff5", "force": true, "format": "json"}]: dispatch
Feb 23 10:00:31 np0005626463.localdomain ceph-mon[294160]: pgmap v385: 177 pgs: 177 active+clean; 192 MiB data, 935 MiB used, 41 GiB / 42 GiB avail; 168 KiB/s rd, 30 KiB/s wr, 242 op/s
Feb 23 10:00:31 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e186 e186: 6 total, 6 up, 6 in
Feb 23 10:00:31 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e186: 6 total, 6 up, 6 in
Feb 23 10:00:32 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:00:32.055 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:00:32 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e186 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 23 10:00:32 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e186 do_prune osdmap full prune enabled
Feb 23 10:00:32 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e187 e187: 6 total, 6 up, 6 in
Feb 23 10:00:32 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e187: 6 total, 6 up, 6 in
Feb 23 10:00:32 np0005626463.localdomain dnsmasq[317885]: exiting on receipt of SIGTERM
Feb 23 10:00:32 np0005626463.localdomain podman[318787]: 2026-02-23 10:00:32.275615939 +0000 UTC m=+0.069966628 container kill 993d8a17873435aed651c58621e43c8085b43d7880121cc4af8dde85919b951a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5e40f037-bddd-4e41-9358-072288273862, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260216)
Feb 23 10:00:32 np0005626463.localdomain systemd[1]: libpod-993d8a17873435aed651c58621e43c8085b43d7880121cc4af8dde85919b951a.scope: Deactivated successfully.
Feb 23 10:00:32 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 10:00:32.289 163572 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port 034ef5b0-24bc-4eeb-b7eb-fa73747ebcf1 with type ""
Feb 23 10:00:32 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T10:00:32Z|00314|binding|INFO|Removing iface tapcf62e779-21 ovn-installed in OVS
Feb 23 10:00:32 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 10:00:32.291 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005626463.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.255.243/28', 'neutron:device_id': 'dhcpfb23302c-55c1-5de0-badf-4fc1ff22837a-5e40f037-bddd-4e41-9358-072288273862', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5e40f037-bddd-4e41-9358-072288273862', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '51dc993993124a3f926af711e8b0f088', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005626463.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fd636fd1-c9ff-44d0-b6d5-3a4c5f8e69de, chassis=[<ovs.db.idl.Row object at 0x7f808c075610>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f808c075610>], logical_port=cf62e779-21c9-44d1-992a-8d67e75ee9a4) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 10:00:32 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T10:00:32Z|00315|binding|INFO|Removing lport cf62e779-21c9-44d1-992a-8d67e75ee9a4 ovn-installed in OVS
Feb 23 10:00:32 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 10:00:32.301 163572 INFO neutron.agent.ovn.metadata.agent [-] Port cf62e779-21c9-44d1-992a-8d67e75ee9a4 in datapath 5e40f037-bddd-4e41-9358-072288273862 unbound from our chassis
Feb 23 10:00:32 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 10:00:32.303 163572 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 5e40f037-bddd-4e41-9358-072288273862, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 23 10:00:32 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 10:00:32.306 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[057df0c0-dd81-4f4e-8260-0ad6c8469059]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 10:00:32 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:00:32.333 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:00:32 np0005626463.localdomain podman[318802]: 2026-02-23 10:00:32.365514644 +0000 UTC m=+0.064824068 container died 993d8a17873435aed651c58621e43c8085b43d7880121cc4af8dde85919b951a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5e40f037-bddd-4e41-9358-072288273862, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, tcib_managed=true, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Feb 23 10:00:32 np0005626463.localdomain systemd[1]: tmp-crun.1lVIK3.mount: Deactivated successfully.
Feb 23 10:00:32 np0005626463.localdomain podman[318802]: 2026-02-23 10:00:32.41060603 +0000 UTC m=+0.109915414 container cleanup 993d8a17873435aed651c58621e43c8085b43d7880121cc4af8dde85919b951a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5e40f037-bddd-4e41-9358-072288273862, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb)
Feb 23 10:00:32 np0005626463.localdomain systemd[1]: libpod-conmon-993d8a17873435aed651c58621e43c8085b43d7880121cc4af8dde85919b951a.scope: Deactivated successfully.
Feb 23 10:00:32 np0005626463.localdomain podman[318803]: 2026-02-23 10:00:32.454100045 +0000 UTC m=+0.144869041 container remove 993d8a17873435aed651c58621e43c8085b43d7880121cc4af8dde85919b951a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5e40f037-bddd-4e41-9358-072288273862, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0)
Feb 23 10:00:32 np0005626463.localdomain kernel: device tapcf62e779-21 left promiscuous mode
Feb 23 10:00:32 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:00:32.467 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:00:32 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:00:32.483 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:00:32 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 10:00:32.504 265541 INFO neutron.agent.dhcp.agent [None req-81cab5d0-47b8-4830-8d57-c82a39e48f71 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 23 10:00:32 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 10:00:32.505 265541 INFO neutron.agent.dhcp.agent [None req-81cab5d0-47b8-4830-8d57-c82a39e48f71 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 23 10:00:32 np0005626463.localdomain ceph-mon[294160]: osdmap e186: 6 total, 6 up, 6 in
Feb 23 10:00:32 np0005626463.localdomain ceph-mon[294160]: osdmap e187: 6 total, 6 up, 6 in
Feb 23 10:00:32 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T10:00:32Z|00316|binding|INFO|Releasing lport 4143c8ea-7577-4792-9744-bcff90eb20f2 from this chassis (sb_readonly=0)
Feb 23 10:00:32 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:00:32.687 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:00:33 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:00:33.055 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:00:33 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-db56fcd2726acc1983ef9844d0380fb1390c65eda950b0c88ea7f886d5e3e2ce-merged.mount: Deactivated successfully.
Feb 23 10:00:33 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-993d8a17873435aed651c58621e43c8085b43d7880121cc4af8dde85919b951a-userdata-shm.mount: Deactivated successfully.
Feb 23 10:00:33 np0005626463.localdomain systemd[1]: run-netns-qdhcp\x2d5e40f037\x2dbddd\x2d4e41\x2d9358\x2d072288273862.mount: Deactivated successfully.
Feb 23 10:00:33 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:00:33.592 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:00:33 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e187 do_prune osdmap full prune enabled
Feb 23 10:00:33 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e188 e188: 6 total, 6 up, 6 in
Feb 23 10:00:33 np0005626463.localdomain ceph-mon[294160]: pgmap v388: 177 pgs: 177 active+clean; 192 MiB data, 936 MiB used, 41 GiB / 42 GiB avail; 132 KiB/s rd, 22 KiB/s wr, 179 op/s
Feb 23 10:00:33 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e188: 6 total, 6 up, 6 in
Feb 23 10:00:34 np0005626463.localdomain ceph-mon[294160]: osdmap e188: 6 total, 6 up, 6 in
Feb 23 10:00:34 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "40524a29-3861-4339-95ff-304a6d8eab80", "format": "json"}]: dispatch
Feb 23 10:00:34 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "40524a29-3861-4339-95ff-304a6d8eab80", "force": true, "format": "json"}]: dispatch
Feb 23 10:00:35 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:00:35.055 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:00:35 np0005626463.localdomain ceph-mon[294160]: pgmap v390: 177 pgs: 177 active+clean; 192 MiB data, 936 MiB used, 41 GiB / 42 GiB avail; 50 KiB/s rd, 21 KiB/s wr, 72 op/s
Feb 23 10:00:37 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e188 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 23 10:00:38 np0005626463.localdomain ceph-mon[294160]: pgmap v391: 177 pgs: 177 active+clean; 223 MiB data, 954 MiB used, 41 GiB / 42 GiB avail; 3.5 MiB/s rd, 2.7 MiB/s wr, 143 op/s
Feb 23 10:00:38 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:00:38.595 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 10:00:38 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:00:38.597 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 10:00:38 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:00:38.597 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 10:00:38 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:00:38.598 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 10:00:38 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:00:38.644 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:00:38 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:00:38.645 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 10:00:39 np0005626463.localdomain podman[242954]: time="2026-02-23T10:00:39Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 23 10:00:39 np0005626463.localdomain podman[242954]: @ - - [23/Feb/2026:10:00:39 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 157081 "" "Go-http-client/1.1"
Feb 23 10:00:39 np0005626463.localdomain podman[242954]: @ - - [23/Feb/2026:10:00:39 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18817 "" "Go-http-client/1.1"
Feb 23 10:00:40 np0005626463.localdomain ceph-mon[294160]: pgmap v392: 177 pgs: 177 active+clean; 223 MiB data, 954 MiB used, 41 GiB / 42 GiB avail; 2.8 MiB/s rd, 2.1 MiB/s wr, 113 op/s
Feb 23 10:00:40 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Feb 23 10:00:40 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 10:00:41 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e188 do_prune osdmap full prune enabled
Feb 23 10:00:41 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "ee848905-323b-4447-944b-9bd735c2e380", "snap_name": "89eb0a17-32bc-40f6-a334-3962941ec616_e9d2bf1c-54a0-4805-9a00-108fd2939de1", "force": true, "format": "json"}]: dispatch
Feb 23 10:00:41 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "ee848905-323b-4447-944b-9bd735c2e380", "snap_name": "89eb0a17-32bc-40f6-a334-3962941ec616", "force": true, "format": "json"}]: dispatch
Feb 23 10:00:41 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "99b5759b-6bd3-4969-8e77-f18612d84802", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 23 10:00:41 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "99b5759b-6bd3-4969-8e77-f18612d84802", "format": "json"}]: dispatch
Feb 23 10:00:41 np0005626463.localdomain ceph-mon[294160]: from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 10:00:41 np0005626463.localdomain ceph-mon[294160]: pgmap v393: 177 pgs: 177 active+clean; 238 MiB data, 985 MiB used, 41 GiB / 42 GiB avail; 2.3 MiB/s rd, 2.4 MiB/s wr, 101 op/s
Feb 23 10:00:41 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e189 e189: 6 total, 6 up, 6 in
Feb 23 10:00:41 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e189: 6 total, 6 up, 6 in
Feb 23 10:00:41 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.
Feb 23 10:00:41 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.
Feb 23 10:00:41 np0005626463.localdomain podman[318835]: 2026-02-23 10:00:41.923893621 +0000 UTC m=+0.094441267 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.43.0)
Feb 23 10:00:41 np0005626463.localdomain podman[318836]: 2026-02-23 10:00:41.891945157 +0000 UTC m=+0.064206128 container health_status bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Feb 23 10:00:41 np0005626463.localdomain podman[318836]: 2026-02-23 10:00:41.974380156 +0000 UTC m=+0.146641107 container exec_died bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Feb 23 10:00:41 np0005626463.localdomain systemd[1]: bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.service: Deactivated successfully.
Feb 23 10:00:42 np0005626463.localdomain podman[318835]: 2026-02-23 10:00:42.026391819 +0000 UTC m=+0.196939505 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.build-date=20260216, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller)
Feb 23 10:00:42 np0005626463.localdomain systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully.
Feb 23 10:00:42 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e189 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 23 10:00:42 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e189 do_prune osdmap full prune enabled
Feb 23 10:00:42 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e190 e190: 6 total, 6 up, 6 in
Feb 23 10:00:42 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e190: 6 total, 6 up, 6 in
Feb 23 10:00:42 np0005626463.localdomain ceph-mon[294160]: osdmap e189: 6 total, 6 up, 6 in
Feb 23 10:00:42 np0005626463.localdomain ceph-mon[294160]: osdmap e190: 6 total, 6 up, 6 in
Feb 23 10:00:42 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Feb 23 10:00:42 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1664302141' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 23 10:00:42 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Feb 23 10:00:42 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1664302141' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 23 10:00:42 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Feb 23 10:00:42 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1142525926' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 23 10:00:42 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Feb 23 10:00:42 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1142525926' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 23 10:00:43 np0005626463.localdomain openstack_network_exporter[245358]: ERROR   10:00:43 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 23 10:00:43 np0005626463.localdomain openstack_network_exporter[245358]: 
Feb 23 10:00:43 np0005626463.localdomain openstack_network_exporter[245358]: ERROR   10:00:43 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 23 10:00:43 np0005626463.localdomain openstack_network_exporter[245358]: 
Feb 23 10:00:43 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/1664302141' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 23 10:00:43 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/1664302141' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 23 10:00:43 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/1142525926' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 23 10:00:43 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/1142525926' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 23 10:00:43 np0005626463.localdomain ceph-mon[294160]: pgmap v396: 177 pgs: 177 active+clean; 238 MiB data, 1000 MiB used, 41 GiB / 42 GiB avail; 2.6 MiB/s rd, 2.7 MiB/s wr, 131 op/s
Feb 23 10:00:43 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:00:43.644 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:00:43 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:00:43.647 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:00:44 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e190 do_prune osdmap full prune enabled
Feb 23 10:00:44 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e191 e191: 6 total, 6 up, 6 in
Feb 23 10:00:44 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e191: 6 total, 6 up, 6 in
Feb 23 10:00:45 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 10:00:45.224 2 INFO neutron.agent.securitygroups_rpc [None req-07956afb-903c-4340-9a87-60ef781dc496 4ce76e2a79a849e8a6b3b31c05f9bc96 90343b3c0ce240adab2c21e5c92b6952 - - default default] Security group member updated ['9a671aa5-4d76-4c1e-8de2-506f29ad907b']
Feb 23 10:00:45 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "ee848905-323b-4447-944b-9bd735c2e380", "format": "json"}]: dispatch
Feb 23 10:00:45 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "ee848905-323b-4447-944b-9bd735c2e380", "force": true, "format": "json"}]: dispatch
Feb 23 10:00:45 np0005626463.localdomain ceph-mon[294160]: osdmap e191: 6 total, 6 up, 6 in
Feb 23 10:00:45 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/3423787226' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 23 10:00:45 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/3423787226' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 23 10:00:45 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "99b5759b-6bd3-4969-8e77-f18612d84802", "format": "json"}]: dispatch
Feb 23 10:00:45 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "99b5759b-6bd3-4969-8e77-f18612d84802", "force": true, "format": "json"}]: dispatch
Feb 23 10:00:45 np0005626463.localdomain ceph-mon[294160]: pgmap v398: 177 pgs: 177 active+clean; 238 MiB data, 1000 MiB used, 41 GiB / 42 GiB avail; 28 KiB/s rd, 889 KiB/s wr, 44 op/s
Feb 23 10:00:45 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/2882897457' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 23 10:00:45 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/2882897457' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 23 10:00:46 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e191 do_prune osdmap full prune enabled
Feb 23 10:00:46 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e192 e192: 6 total, 6 up, 6 in
Feb 23 10:00:46 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e192: 6 total, 6 up, 6 in
Feb 23 10:00:46 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 10:00:46.612 2 INFO neutron.agent.securitygroups_rpc [None req-8a14fab2-7a8a-4c43-8296-30b056f5b06d 4ce76e2a79a849e8a6b3b31c05f9bc96 90343b3c0ce240adab2c21e5c92b6952 - - default default] Security group member updated ['9a671aa5-4d76-4c1e-8de2-506f29ad907b']
Feb 23 10:00:46 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 10:00:46.805 2 INFO neutron.agent.securitygroups_rpc [None req-8a14fab2-7a8a-4c43-8296-30b056f5b06d 4ce76e2a79a849e8a6b3b31c05f9bc96 90343b3c0ce240adab2c21e5c92b6952 - - default default] Security group member updated ['9a671aa5-4d76-4c1e-8de2-506f29ad907b']
Feb 23 10:00:47 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 10:00:47.061 2 INFO neutron.agent.securitygroups_rpc [None req-9c6cef5b-9665-4a72-8732-114020aa83fb 4ce76e2a79a849e8a6b3b31c05f9bc96 90343b3c0ce240adab2c21e5c92b6952 - - default default] Security group member updated ['9a671aa5-4d76-4c1e-8de2-506f29ad907b']
Feb 23 10:00:47 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e192 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 23 10:00:47 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 10:00:47.377 2 INFO neutron.agent.securitygroups_rpc [None req-0ce85d6b-73e9-4380-afd1-5f90a1fc10e8 4ce76e2a79a849e8a6b3b31c05f9bc96 90343b3c0ce240adab2c21e5c92b6952 - - default default] Security group member updated ['9a671aa5-4d76-4c1e-8de2-506f29ad907b']
Feb 23 10:00:47 np0005626463.localdomain ceph-mon[294160]: osdmap e192: 6 total, 6 up, 6 in
Feb 23 10:00:47 np0005626463.localdomain ceph-mon[294160]: pgmap v400: 177 pgs: 177 active+clean; 192 MiB data, 937 MiB used, 41 GiB / 42 GiB avail; 138 KiB/s rd, 22 KiB/s wr, 193 op/s
Feb 23 10:00:47 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.
Feb 23 10:00:47 np0005626463.localdomain systemd[1]: tmp-crun.LDI31E.mount: Deactivated successfully.
Feb 23 10:00:47 np0005626463.localdomain podman[318883]: 2026-02-23 10:00:47.990808116 +0000 UTC m=+0.054979247 container health_status be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, 
org.label-schema.vendor=CentOS, tcib_managed=true)
Feb 23 10:00:48 np0005626463.localdomain podman[318883]: 2026-02-23 10:00:48.028069647 +0000 UTC m=+0.092240778 container exec_died be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_managed=true, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, managed_by=edpm_ansible, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, container_name=ceilometer_agent_compute, org.label-schema.build-date=20260216, 
org.label-schema.schema-version=1.0)
Feb 23 10:00:48 np0005626463.localdomain systemd[1]: be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.service: Deactivated successfully.
Feb 23 10:00:48 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e192 do_prune osdmap full prune enabled
Feb 23 10:00:48 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e193 e193: 6 total, 6 up, 6 in
Feb 23 10:00:48 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e193: 6 total, 6 up, 6 in
Feb 23 10:00:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 10:00:48.560 163572 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 10:00:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 10:00:48.561 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 10:00:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 10:00:48.561 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 10:00:48 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "ee02e5c3-d4ec-4cf3-86f8-0c8d48f0b098", "format": "json"}]: dispatch
Feb 23 10:00:48 np0005626463.localdomain ceph-mon[294160]: osdmap e193: 6 total, 6 up, 6 in
Feb 23 10:00:48 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:00:48.648 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 10:00:48 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:00:48.650 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 10:00:48 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:00:48.650 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 10:00:48 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:00:48.650 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 10:00:48 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:00:48.698 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:00:48 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:00:48.699 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 10:00:48 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Feb 23 10:00:48 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 10:00:49 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 10:00:49.480 2 INFO neutron.agent.securitygroups_rpc [None req-81c2d0d8-1692-4b29-b62c-eb471980e5c6 4ce76e2a79a849e8a6b3b31c05f9bc96 90343b3c0ce240adab2c21e5c92b6952 - - default default] Security group member updated ['9a671aa5-4d76-4c1e-8de2-506f29ad907b']
Feb 23 10:00:49 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e193 do_prune osdmap full prune enabled
Feb 23 10:00:49 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "ee02e5c3-d4ec-4cf3-86f8-0c8d48f0b098", "force": true, "format": "json"}]: dispatch
Feb 23 10:00:49 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "066cedc7-9ad0-458f-a1e2-21940628c140", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 23 10:00:49 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "066cedc7-9ad0-458f-a1e2-21940628c140", "format": "json"}]: dispatch
Feb 23 10:00:49 np0005626463.localdomain ceph-mon[294160]: from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 10:00:49 np0005626463.localdomain ceph-mon[294160]: pgmap v402: 177 pgs: 177 active+clean; 192 MiB data, 937 MiB used, 41 GiB / 42 GiB avail; 116 KiB/s rd, 19 KiB/s wr, 164 op/s
Feb 23 10:00:49 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e194 e194: 6 total, 6 up, 6 in
Feb 23 10:00:49 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e194: 6 total, 6 up, 6 in
Feb 23 10:00:49 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.
Feb 23 10:00:49 np0005626463.localdomain podman[318902]: 2026-02-23 10:00:49.889233778 +0000 UTC m=+0.071288060 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, io.buildah.version=1.43.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true)
Feb 23 10:00:49 np0005626463.localdomain podman[318902]: 2026-02-23 10:00:49.923333009 +0000 UTC m=+0.105387221 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260216, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.43.0)
Feb 23 10:00:49 np0005626463.localdomain systemd[1]: 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully.
Feb 23 10:00:49 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 10:00:49.968 2 INFO neutron.agent.securitygroups_rpc [None req-502b98ac-7cd1-4314-8dc8-56f7a95bf1c0 4ce76e2a79a849e8a6b3b31c05f9bc96 90343b3c0ce240adab2c21e5c92b6952 - - default default] Security group member updated ['9a671aa5-4d76-4c1e-8de2-506f29ad907b']
Feb 23 10:00:50 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e194 do_prune osdmap full prune enabled
Feb 23 10:00:50 np0005626463.localdomain ceph-mon[294160]: osdmap e194: 6 total, 6 up, 6 in
Feb 23 10:00:50 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e195 e195: 6 total, 6 up, 6 in
Feb 23 10:00:50 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e195: 6 total, 6 up, 6 in
Feb 23 10:00:51 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e195 do_prune osdmap full prune enabled
Feb 23 10:00:51 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e196 e196: 6 total, 6 up, 6 in
Feb 23 10:00:51 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e196: 6 total, 6 up, 6 in
Feb 23 10:00:51 np0005626463.localdomain ceph-mon[294160]: osdmap e195: 6 total, 6 up, 6 in
Feb 23 10:00:51 np0005626463.localdomain ceph-mon[294160]: pgmap v405: 177 pgs: 177 active+clean; 192 MiB data, 937 MiB used, 41 GiB / 42 GiB avail; 118 KiB/s rd, 26 KiB/s wr, 167 op/s
Feb 23 10:00:51 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/3338744072' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 23 10:00:51 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/3338744072' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 23 10:00:52 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Feb 23 10:00:52 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2581763312' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 23 10:00:52 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Feb 23 10:00:52 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2581763312' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 23 10:00:52 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e196 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 23 10:00:52 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e196 do_prune osdmap full prune enabled
Feb 23 10:00:52 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e197 e197: 6 total, 6 up, 6 in
Feb 23 10:00:52 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e197: 6 total, 6 up, 6 in
Feb 23 10:00:52 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "066cedc7-9ad0-458f-a1e2-21940628c140", "format": "json"}]: dispatch
Feb 23 10:00:52 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "066cedc7-9ad0-458f-a1e2-21940628c140", "force": true, "format": "json"}]: dispatch
Feb 23 10:00:52 np0005626463.localdomain ceph-mon[294160]: osdmap e196: 6 total, 6 up, 6 in
Feb 23 10:00:52 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/2581763312' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 23 10:00:52 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/2581763312' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 23 10:00:52 np0005626463.localdomain ceph-mon[294160]: osdmap e197: 6 total, 6 up, 6 in
Feb 23 10:00:52 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/2943719624' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 23 10:00:52 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/2943719624' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 23 10:00:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 10:00:53.060 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=20, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '22:68:bc', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'c6:19:65:94:49:af'}, ipsec=False) old=SB_Global(nb_cfg=19) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 10:00:53 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:00:53.061 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:00:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 10:00:53.062 163572 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 23 10:00:53 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Feb 23 10:00:53 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/4075192615' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 23 10:00:53 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Feb 23 10:00:53 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/4075192615' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 23 10:00:53 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:00:53.727 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:00:53 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e197 do_prune osdmap full prune enabled
Feb 23 10:00:53 np0005626463.localdomain ceph-mon[294160]: pgmap v408: 177 pgs: 177 active+clean; 192 MiB data, 941 MiB used, 41 GiB / 42 GiB avail; 181 KiB/s rd, 33 KiB/s wr, 252 op/s
Feb 23 10:00:53 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/4075192615' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 23 10:00:53 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/4075192615' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 23 10:00:53 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e198 e198: 6 total, 6 up, 6 in
Feb 23 10:00:53 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e198: 6 total, 6 up, 6 in
Feb 23 10:00:54 np0005626463.localdomain ceph-mon[294160]: osdmap e198: 6 total, 6 up, 6 in
Feb 23 10:00:55 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 10:00:55.203 2 INFO neutron.agent.securitygroups_rpc [None req-572fb810-ab54-44db-ac0d-47376cf59f66 4ce76e2a79a849e8a6b3b31c05f9bc96 90343b3c0ce240adab2c21e5c92b6952 - - default default] Security group member updated ['9a671aa5-4d76-4c1e-8de2-506f29ad907b']
Feb 23 10:00:55 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 10:00:55.534 2 INFO neutron.agent.securitygroups_rpc [None req-3cc00f74-b029-4176-993c-a46e50d01263 4ce76e2a79a849e8a6b3b31c05f9bc96 90343b3c0ce240adab2c21e5c92b6952 - - default default] Security group member updated ['9a671aa5-4d76-4c1e-8de2-506f29ad907b']
Feb 23 10:00:55 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 10:00:55.806 2 INFO neutron.agent.securitygroups_rpc [None req-252a7af6-1118-4c58-9e08-ff23cd8dfa0e 4ce76e2a79a849e8a6b3b31c05f9bc96 90343b3c0ce240adab2c21e5c92b6952 - - default default] Security group member updated ['9a671aa5-4d76-4c1e-8de2-506f29ad907b']
Feb 23 10:00:55 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e198 do_prune osdmap full prune enabled
Feb 23 10:00:55 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "7b2a2c48-552a-4612-b861-e74517b6032f", "snap_name": "0999e527-f1ec-432e-b9da-01ea97a863f0", "format": "json"}]: dispatch
Feb 23 10:00:55 np0005626463.localdomain ceph-mon[294160]: pgmap v410: 177 pgs: 177 active+clean; 192 MiB data, 941 MiB used, 41 GiB / 42 GiB avail; 158 KiB/s rd, 29 KiB/s wr, 220 op/s
Feb 23 10:00:55 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e199 e199: 6 total, 6 up, 6 in
Feb 23 10:00:55 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e199: 6 total, 6 up, 6 in
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.144 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'name': 'test', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000003', 'OS-EXT-SRV-ATTR:host': 'np0005626463.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '37b8098efb0d4ecc90b451a2db0e966f', 'user_id': 'cb6895487918456aa599ca2f76872d00', 'hostId': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.144 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.145 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.145 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.150 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.bytes volume: 6808 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.152 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f6c2bf04-0278-461c-bb91-654fe331cfd4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6808, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T10:00:56.145685', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '8bcc7ad6-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12296.33519967, 'message_signature': 'b9d1c0e4136abdc77622d645b539f0f36fa30239089fab5d7cd7bf0ff34f84ff'}]}, 'timestamp': '2026-02-23 10:00:56.151037', '_unique_id': '890494c2c06e4b32a566b7bf0d8391d7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.152 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.152 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.152 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.152 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.152 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.152 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.152 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.152 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.152 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.152 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.152 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.152 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.152 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.152 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.152 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.152 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.152 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.152 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.152 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.152 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.152 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.152 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.152 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.152 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.152 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.152 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.152 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.152 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.152 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.152 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.152 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.153 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.153 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.184 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.bytes volume: 35597312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.184 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.186 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e2919e70-bc51-47e6-abb8-e9a2c13a2df1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35597312, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T10:00:56.154106', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '8bd19b42-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12296.343602384, 'message_signature': '64e32666570d1a70b77e755d421f11f235990abc064b62a393f5b33b6dd1a20a'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 
'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T10:00:56.154106', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '8bd1aeca-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12296.343602384, 'message_signature': 'e6f7596c86973b332d4bcf9fe678043ec7dd4a6120e4a6f838154d65a7fcd1ac'}]}, 'timestamp': '2026-02-23 10:00:56.185045', '_unique_id': 'ba5a1f0d936d4f0e8d1bc870b7a055c9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.186 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.186 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.186 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.186 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.186 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.186 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.186 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.186 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.186 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.186 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.186 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.186 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.186 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.186 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.186 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.186 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.186 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.186 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.186 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.186 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.186 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.186 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.186 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.186 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.186 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.186 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.186 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.186 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.186 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.186 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.186 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.187 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.198 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.199 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.200 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '574409cf-3756-48f4-8f89-37fb5ffcd899', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T10:00:56.187517', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '8bd3cf5c-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12296.377011584, 'message_signature': 'ff81cdebb056c454e1d81b627dbf357ca17a6933880a180167f02c48ac2f0269'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T10:00:56.187517', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '8bd3e21c-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12296.377011584, 'message_signature': '660aab5f87c6981cb852b808a5588ca349c074894f74d2cb3a0d0c6ccc88654e'}]}, 'timestamp': '2026-02-23 10:00:56.199429', '_unique_id': '5cc9d0188a52455082c2443381e8db29'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.200 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.200 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.200 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.200 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.200 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.200 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.200 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.200 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.200 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.200 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.200 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.200 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.200 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.200 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.200 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.200 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.200 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.200 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.200 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.200 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.200 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.200 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.200 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.200 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.200 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.200 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.200 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.200 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.200 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.200 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.200 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.201 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.201 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.203 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4b25202b-02fb-466b-8a7a-31eaae7ae990', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T10:00:56.201820', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '8bd4522e-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12296.33519967, 'message_signature': '7b951179ef1c2c56b59893a924acc75947cd295a8815aff05dd5660b151a61d2'}]}, 'timestamp': '2026-02-23 10:00:56.202325', '_unique_id': '5cf1d60bf175421691decc00b2cfa6d2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.203 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.203 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.203 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.203 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.203 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.203 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.203 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.203 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.203 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.203 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.203 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.203 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.203 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.203 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.203 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.203 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.203 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.203 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.203 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.203 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.203 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.203 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.203 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.203 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.203 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.203 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.203 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.203 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.203 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.203 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.203 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.204 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.204 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.requests volume: 1283 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.204 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.206 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ca9164bb-9f13-4760-a464-96e566605d3e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1283, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T10:00:56.204461', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '8bd4b890-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12296.343602384, 'message_signature': 'c27b44e6edd01925495145055a8961ea348f9ebe51f6b98d8888edb6adc28b0b'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T10:00:56.204461', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '8bd4ca38-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12296.343602384, 'message_signature': '240cbc0a4b5f4ddd569d407e5f0d2b27d42917e1e80b36cfc033804e2e4b6f5c'}]}, 'timestamp': '2026-02-23 10:00:56.205370', '_unique_id': '07f1b551489b41e59060a2621f944d4c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.206 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.206 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.206 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.206 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.206 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.206 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.206 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.206 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.206 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.206 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.206 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.206 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.206 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.206 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.206 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.206 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.206 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.206 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.206 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.206 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.206 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.206 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.206 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.206 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.206 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.206 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.206 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.206 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.206 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.206 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.206 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.207 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.207 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.208 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '510ebb4d-1488-4d16-bef4-91cd61d30e25', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T10:00:56.207528', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '8bd52f78-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12296.33519967, 'message_signature': '98832c87f1114055f456e289ec98e0dcc5ae1e152f81b61186c26f53da5b95f6'}]}, 'timestamp': '2026-02-23 10:00:56.208022', '_unique_id': '94da94fb8ad74c64abbccda810ed1258'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.208 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.208 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.208 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.208 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.208 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.208 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.208 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.208 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.208 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.208 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.208 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.208 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.208 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.208 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.208 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.208 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.208 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.208 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.208 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.208 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.208 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.208 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.208 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.208 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.208 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.208 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.208 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.208 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.208 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.208 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.208 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.210 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.225 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/memory.usage volume: 51.72265625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 10:00:56 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 10:00:56.227 2 INFO neutron.agent.securitygroups_rpc [None req-a3f5d0f4-f2a7-4483-8e6e-6de10e55779f 4ce76e2a79a849e8a6b3b31c05f9bc96 90343b3c0ce240adab2c21e5c92b6952 - - default default] Security group member updated ['9a671aa5-4d76-4c1e-8de2-506f29ad907b']
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.227 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7d5bb7b6-f39f-42e2-b0ca-9a5406b9a8f1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.72265625, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'timestamp': '2026-02-23T10:00:56.210423', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '8bd7fece-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12296.415218954, 'message_signature': '2fc18f39bd6f4069fbbf2fda1d81c2e3d770d9d4b2649b1f47980e1eac884863'}]}, 'timestamp': '2026-02-23 10:00:56.226456', '_unique_id': 'd8922b6d7c2d453384ec095fe125e04d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.227 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.227 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.227 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.227 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.227 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.227 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.227 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.227 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.227 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.227 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.227 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.227 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.227 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.227 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.227 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.227 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.227 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.227 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.227 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.227 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.227 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.227 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.227 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.227 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.227 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.227 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.227 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.227 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.227 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.227 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.227 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.228 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.228 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.230 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '81976945-ac22-4212-b322-aa792ab0eb15', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T10:00:56.228944', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '8bd87674-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12296.33519967, 'message_signature': '252f91ee4f0a0def57a500ef89082db05f4069ac8bd0aa789e41f374b6a5b4a5'}]}, 'timestamp': '2026-02-23 10:00:56.229479', '_unique_id': 'e4aaf7c904db44649624c575bcc4e2a1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.230 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.230 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.230 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.230 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.230 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.230 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.230 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.230 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.230 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.230 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.230 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.230 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.230 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.230 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.230 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.230 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.230 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.230 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.230 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.230 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.230 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.230 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.230 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.230 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.230 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.230 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.230 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.230 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.230 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.230 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.230 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.231 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.231 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.233 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '19ecc10f-a046-44b8-84b5-be0a2f62189a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T10:00:56.231766', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '8bd8e41a-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12296.33519967, 'message_signature': '2c48bd37bcef21e03711c8f528ed299645999707f331ae822649f5d0f41d61f0'}]}, 'timestamp': '2026-02-23 10:00:56.232320', '_unique_id': 'bf29c8ca251b42c8a631de29f98ad0f1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.233 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.233 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.233 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.233 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.233 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.233 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.233 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.233 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.233 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.233 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.233 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.233 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.233 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.233 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.233 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.233 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.233 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.233 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.233 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.233 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.233 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.233 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.233 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.233 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.233 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.233 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.233 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.233 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.233 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.233 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.233 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.234 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.234 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.236 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ebce52c2-e71d-4f9a-9f69-a4bb4e75c42f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T10:00:56.234608', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '8bd9517a-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12296.33519967, 'message_signature': 'df229a6d991370a63362e21cebf9405fcccd314cc4ef71ce61ea41c3984c67e7'}]}, 'timestamp': '2026-02-23 10:00:56.235111', '_unique_id': '6c6ea3ee5add44dfbd1501632daf7fae'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.236 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.236 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.236 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.236 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.236 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.236 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.236 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.236 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.236 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.236 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.236 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.236 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.236 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.236 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.236 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.236 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.236 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.236 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.236 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.236 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.236 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.236 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.236 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.236 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.236 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.236 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.236 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.236 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.236 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.236 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.236 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.237 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.237 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.237 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.239 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3b85a616-f2f2-4569-88f0-2e9d04459e10', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T10:00:56.237642', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '8bd9c9c0-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12296.33519967, 'message_signature': 'd23444cd6267e5c01945efea793699e73348d25750e5360045c186ce091cf162'}]}, 'timestamp': '2026-02-23 10:00:56.238270', '_unique_id': '99d1c1e3a641443da5a805dcc10fd83c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.239 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.239 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.239 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.239 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.239 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.239 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.239 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.239 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.239 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.239 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.239 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.239 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.239 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.239 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.239 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.239 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.239 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.239 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.239 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.239 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.239 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.239 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.239 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.239 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.239 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.239 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.239 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.239 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.239 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.239 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.239 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.240 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.240 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.241 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.242 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9925f058-5d7d-4d2b-a038-a9e5aefa5e38', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T10:00:56.240541', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '8bda3914-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12296.343602384, 'message_signature': 'e363dca22e4214ef188a865592242a6d62c1e956a9421ad99e3f617c7ed7964a'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T10:00:56.240541', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '8bda4c38-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12296.343602384, 'message_signature': 'f32bb874f557b20fce9df58d14430c10d3f4d656db2bf6d73a93823344119ef4'}]}, 'timestamp': '2026-02-23 10:00:56.241466', '_unique_id': '1356d5e737e544f68a07527cfffef623'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.242 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.242 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.242 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.242 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.242 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.242 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.242 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.242 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.242 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.242 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.242 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.242 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.242 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.242 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.242 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.242 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.242 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.242 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.242 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.242 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.242 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.242 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.242 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.242 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.242 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.242 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.242 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.242 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.242 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.242 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.242 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.243 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.243 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.244 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.245 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8cfcbf7d-0458-4a27-969a-51c3ba158713', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T10:00:56.243677', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '8bdab4a2-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12296.377011584, 'message_signature': 'fb4666d732d403a4e93e6b2ac0c906494e40042db1ffd38e8c376d9e1b32d060'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 
'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T10:00:56.243677', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '8bdac564-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12296.377011584, 'message_signature': '0e4392b35193a69e77fe57787014433f6169c12a6739f880b545e5f74387e9bc'}]}, 'timestamp': '2026-02-23 10:00:56.244603', '_unique_id': '4e7bd622b2704465bec62e0ad46c69a2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.245 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.245 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.245 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.245 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.245 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.245 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.245 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.245 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.245 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.245 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.245 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.245 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.245 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.245 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.245 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.245 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.245 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.245 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.245 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.245 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.245 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.245 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.245 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.245 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.245 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.245 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.245 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.245 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.245 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.245 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.245 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.246 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.246 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.248 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '234e5f17-4e0b-4cd2-9857-12b883fad38e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T10:00:56.246820', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '8bdb304e-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12296.33519967, 'message_signature': '11e9acbbf9d78df5f34512e50cdfc64c88b40265f939c1c0949935201d4bc8ec'}]}, 'timestamp': '2026-02-23 10:00:56.247333', '_unique_id': '5ae68f0bd3ee4319a2821e1fdb57a53b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.248 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.248 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.248 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.248 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.248 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.248 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.248 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.248 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.248 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.248 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.248 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.248 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.248 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.248 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.248 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.248 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.248 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.248 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.248 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.248 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.248 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.248 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.248 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.248 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.248 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.248 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.248 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.248 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.248 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.248 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.248 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.249 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.249 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.250 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.252 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f0268a5b-f6bc-4abb-87be-5efa26b7719d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T10:00:56.249497', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '8bdb96ba-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12296.377011584, 'message_signature': 'fd7b48ef33cd27821bb1f04c6859d50957645e733c6454646bbbae8140f2ecf2'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T10:00:56.249497', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '8bdbb9ec-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12296.377011584, 'message_signature': '42a44054d1f2f31ca216287ae664c981d68b85a1b2713ca32fe7598a581d88f8'}]}, 'timestamp': '2026-02-23 10:00:56.250987', '_unique_id': '44c832d03cde41519216114953443909'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.252 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.252 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.252 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.252 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.252 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.252 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.252 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.252 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.252 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.252 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.252 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.252 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.252 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.252 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.252 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.252 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.252 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.252 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.252 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.252 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.252 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.252 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.252 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.252 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.252 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.252 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.252 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.252 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.252 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.252 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.252 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.254 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.254 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.latency volume: 1374424344 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.255 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.latency volume: 89322858 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.256 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a8c915f6-271f-4f62-b25d-fe1878574267', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1374424344, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T10:00:56.254354', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '8bdc5cbc-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12296.343602384, 'message_signature': 'b58f6c6dad5b5f0b81e96d71abf7c1b26ef39a7a8e5c330f6c64cc0dccb23638'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 89322858, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T10:00:56.254354', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '8bdc7314-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12296.343602384, 'message_signature': '6af67ddb8142bc9e5d9f6087668dece677d81b662a5279e0cb343a73b5abb717'}]}, 'timestamp': '2026-02-23 10:00:56.255570', '_unique_id': '4dfd55337d7d43d2801a66c1efb9e4e1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.256 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.256 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.256 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.256 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.256 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.256 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.256 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.256 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.256 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.256 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.256 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.256 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.256 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.256 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.256 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.256 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.256 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.256 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.256 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.256 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.256 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.256 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.256 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.256 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.256 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.256 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.256 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.256 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.256 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.256 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.256 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.257 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.257 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.latency volume: 1054797520 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.258 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.latency volume: 21338362 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.260 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '928fcb0f-5d30-4efc-a22c-50532174a63f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1054797520, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T10:00:56.257860', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '8bdcdfe8-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12296.343602384, 'message_signature': '5f484b8026f8e365cc860933081fd0f68a4c864d2e0eccea5876cf54d5a063f5'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 21338362, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T10:00:56.257860', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '8bdcf05a-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12296.343602384, 'message_signature': 'f6c383d15d83eacf8ee1fb1232909ea10ab7ebf6fe121c59038f5cc0288a0af4'}]}, 'timestamp': '2026-02-23 10:00:56.258773', '_unique_id': '773c8bec359746059b506a73e3994d72'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.260 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.260 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.260 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.260 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.260 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.260 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.260 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.260 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.260 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.260 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.260 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.260 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.260 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.260 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.260 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.260 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.260 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.260 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.260 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.260 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.260 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.260 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.260 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.260 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.260 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.260 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.260 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.260 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.260 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.260 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.260 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.261 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.261 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.262 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '79012cf4-6c87-46ba-abe7-e133783cc539', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T10:00:56.261726', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '8bdd7282-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12296.33519967, 'message_signature': 'ef10eae0eecf80ea3fb5fdf78daee43ccb1da7dddc6f862aae8e5554e4d07ea4'}]}, 'timestamp': '2026-02-23 10:00:56.262056', '_unique_id': 'e7ec4a0b5972457c94069e9dd2dd8f15'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.262 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.262 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.262 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.262 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.262 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.262 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.262 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.262 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.262 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.262 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.262 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.262 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.262 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.262 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.262 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.262 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.262 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.262 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.262 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.262 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.262 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.262 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.262 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.262 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.262 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.262 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.262 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.262 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.262 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.262 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.262 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.263 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.263 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.264 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1b77ba3e-3eaa-4f4d-9767-ae43df8c186b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T10:00:56.263406', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '8bddb328-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12296.33519967, 'message_signature': '3bfb41ab7c6f7c250b7a3d127e379bdbc1631b9f3afbbf0045d55b6dd2f28ad6'}]}, 'timestamp': '2026-02-23 10:00:56.263700', '_unique_id': '98ddd7c9fb4c441697ab60042b043545'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.264 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.264 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.264 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.264 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.264 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.264 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.264 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.264 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.264 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.264 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.264 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.264 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.264 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.264 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.264 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.264 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.264 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.264 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.264 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.264 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.264 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.264 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.264 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.264 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.264 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.264 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.264 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.264 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.264 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.264 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.264 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.264 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.265 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/cpu volume: 14880000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.265 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c2503d5c-75c0-414b-a7f3-fe6b2f0fdd30', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 14880000000, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'timestamp': '2026-02-23T10:00:56.265110', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '8bddf5c2-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12296.415218954, 'message_signature': 'eaafd2f6f576736364ec8a7a8fce765efe1f135b96de1e17b837f9809aee04d6'}]}, 'timestamp': '2026-02-23 10:00:56.265398', '_unique_id': 'a3381aa984f94dfdb490bb0e2031f8ee'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.265 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.265 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.265 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.265 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.265 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.265 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.265 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.265 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.265 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.265 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.265 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.265 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.265 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.265 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.265 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.265 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.265 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.265 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.265 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.265 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.265 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.265 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.265 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.265 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.265 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.265 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.265 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.265 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.265 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.265 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.265 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.265 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.265 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.265 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.265 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.265 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.265 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.265 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.265 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.265 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.265 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.265 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.265 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.265 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.265 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.265 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.265 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.265 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.265 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.265 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.265 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.265 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.265 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.265 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.266 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.266 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.bytes volume: 397312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.267 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.267 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '17c7d105-fbdc-4cd5-b1dd-ad3a446b45bf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 397312, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T10:00:56.266740', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '8bde35aa-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12296.343602384, 'message_signature': 'cb732127c5602dfbee01e679ca3b54e1211f3f01a9bec6ab287f532953430c84'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T10:00:56.266740', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '8bde3ffa-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12296.343602384, 'message_signature': 'd6f4928ec8eadbb5745bbcbecb368876c928a73e5373eb5496a322f41161e8c2'}]}, 'timestamp': '2026-02-23 10:00:56.267287', '_unique_id': '2d24a51e866a4f10962b05c2023f3bae'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.267 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.267 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.267 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.267 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.267 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.267 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.267 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.267 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.267 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.267 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.267 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.267 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.267 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.267 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.267 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.267 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.267 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.267 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.267 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.267 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.267 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.267 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.267 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.267 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.267 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.267 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.267 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.267 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.267 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.267 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:00:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.267 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:00:56 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 10:00:56.569 2 INFO neutron.agent.securitygroups_rpc [None req-84c168ac-c1cc-4ad3-a429-c07c41b0ce10 4ce76e2a79a849e8a6b3b31c05f9bc96 90343b3c0ce240adab2c21e5c92b6952 - - default default] Security group member updated ['9a671aa5-4d76-4c1e-8de2-506f29ad907b']
Feb 23 10:00:56 np0005626463.localdomain ceph-mon[294160]: osdmap e199: 6 total, 6 up, 6 in
Feb 23 10:00:57 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 10:00:57.207 2 INFO neutron.agent.securitygroups_rpc [None req-076cf072-6b56-42a9-a049-cb8eb1290757 4ce76e2a79a849e8a6b3b31c05f9bc96 90343b3c0ce240adab2c21e5c92b6952 - - default default] Security group member updated ['9a671aa5-4d76-4c1e-8de2-506f29ad907b']
Feb 23 10:00:57 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e199 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 23 10:00:57 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e199 do_prune osdmap full prune enabled
Feb 23 10:00:57 np0005626463.localdomain ceph-mon[294160]: pgmap v412: 177 pgs: 177 active+clean; 192 MiB data, 962 MiB used, 41 GiB / 42 GiB avail; 4.0 MiB/s rd, 20 KiB/s wr, 311 op/s
Feb 23 10:00:57 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e200 e200: 6 total, 6 up, 6 in
Feb 23 10:00:57 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e200: 6 total, 6 up, 6 in
Feb 23 10:00:58 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Feb 23 10:00:58 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 10:00:58 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:00:58.728 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 10:00:58 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:00:58.730 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 10:00:58 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:00:58.730 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 10:00:58 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:00:58.730 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 10:00:58 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:00:58.759 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:00:58 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:00:58.760 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 10:00:58 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e200 do_prune osdmap full prune enabled
Feb 23 10:00:58 np0005626463.localdomain ceph-mon[294160]: osdmap e200: 6 total, 6 up, 6 in
Feb 23 10:00:58 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "7b2a2c48-552a-4612-b861-e74517b6032f", "snap_name": "0999e527-f1ec-432e-b9da-01ea97a863f0_a37ec8d1-9603-4bc2-b3c2-056d8fe1be15", "force": true, "format": "json"}]: dispatch
Feb 23 10:00:58 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "7b2a2c48-552a-4612-b861-e74517b6032f", "snap_name": "0999e527-f1ec-432e-b9da-01ea97a863f0", "force": true, "format": "json"}]: dispatch
Feb 23 10:00:58 np0005626463.localdomain ceph-mon[294160]: from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 10:00:58 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e201 e201: 6 total, 6 up, 6 in
Feb 23 10:00:58 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e201: 6 total, 6 up, 6 in
Feb 23 10:00:59 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 10:00:59.127 265541 INFO neutron.agent.linux.ip_lib [None req-87a63f7f-d975-45c2-84d9-883b37d73f3d - - - - - -] Device tapca3573cf-58 cannot be used as it has no MAC address
Feb 23 10:00:59 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:00:59.159 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:00:59 np0005626463.localdomain kernel: device tapca3573cf-58 entered promiscuous mode
Feb 23 10:00:59 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:00:59.169 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:00:59 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T10:00:59Z|00317|binding|INFO|Claiming lport ca3573cf-58bc-4b24-8e6e-7f86bcaa638e for this chassis.
Feb 23 10:00:59 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T10:00:59Z|00318|binding|INFO|ca3573cf-58bc-4b24-8e6e-7f86bcaa638e: Claiming unknown
Feb 23 10:00:59 np0005626463.localdomain NetworkManager[5974]: <info>  [1771840859.1761] manager: (tapca3573cf-58): new Generic device (/org/freedesktop/NetworkManager/Devices/51)
Feb 23 10:00:59 np0005626463.localdomain systemd-udevd[318932]: Network interface NamePolicy= disabled on kernel command line.
Feb 23 10:00:59 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 10:00:59.182 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005626463.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpfb23302c-55c1-5de0-badf-4fc1ff22837a-9215e91e-1f4e-4608-9372-53243278a03d', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9215e91e-1f4e-4608-9372-53243278a03d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '90343b3c0ce240adab2c21e5c92b6952', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e9b465d1-55ea-4c9a-b8cb-0871f64f66d5, chassis=[<ovs.db.idl.Row object at 0x7f808c075610>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f808c075610>], logical_port=ca3573cf-58bc-4b24-8e6e-7f86bcaa638e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 10:00:59 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 10:00:59.184 163572 INFO neutron.agent.ovn.metadata.agent [-] Port ca3573cf-58bc-4b24-8e6e-7f86bcaa638e in datapath 9215e91e-1f4e-4608-9372-53243278a03d bound to our chassis
Feb 23 10:00:59 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 10:00:59.186 163572 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 9215e91e-1f4e-4608-9372-53243278a03d or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Feb 23 10:00:59 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 10:00:59.188 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[bf3931a9-9362-47b9-993a-8103855922df]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 10:00:59 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T10:00:59Z|00319|binding|INFO|Setting lport ca3573cf-58bc-4b24-8e6e-7f86bcaa638e ovn-installed in OVS
Feb 23 10:00:59 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T10:00:59Z|00320|binding|INFO|Setting lport ca3573cf-58bc-4b24-8e6e-7f86bcaa638e up in Southbound
Feb 23 10:00:59 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:00:59.210 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:00:59 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:00:59.259 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:00:59 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:00:59.303 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:00:59 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 10:00:59.724 2 INFO neutron.agent.securitygroups_rpc [None req-ecc0bc70-4d61-4d71-86a4-16b51953c44e 4ce76e2a79a849e8a6b3b31c05f9bc96 90343b3c0ce240adab2c21e5c92b6952 - - default default] Security group member updated ['9a671aa5-4d76-4c1e-8de2-506f29ad907b']
Feb 23 10:00:59 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e201 do_prune osdmap full prune enabled
Feb 23 10:00:59 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "eaea33d2-5529-46ec-832a-9d7cb7e5b233", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 23 10:00:59 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "eaea33d2-5529-46ec-832a-9d7cb7e5b233", "format": "json"}]: dispatch
Feb 23 10:00:59 np0005626463.localdomain ceph-mon[294160]: osdmap e201: 6 total, 6 up, 6 in
Feb 23 10:00:59 np0005626463.localdomain ceph-mon[294160]: pgmap v415: 177 pgs: 177 active+clean; 192 MiB data, 962 MiB used, 41 GiB / 42 GiB avail; 4.0 MiB/s rd, 17 KiB/s wr, 225 op/s
Feb 23 10:00:59 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e202 e202: 6 total, 6 up, 6 in
Feb 23 10:01:00 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e202: 6 total, 6 up, 6 in
Feb 23 10:01:00 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 10:01:00.065 163572 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=96b5bb93-7341-4ce6-9b93-6a5de566c711, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '20'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 10:01:00 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 10:01:00.326 2 INFO neutron.agent.securitygroups_rpc [None req-0f7cdb10-6b15-41bf-967b-b0f718bce643 4ce76e2a79a849e8a6b3b31c05f9bc96 90343b3c0ce240adab2c21e5c92b6952 - - default default] Security group member updated ['9a671aa5-4d76-4c1e-8de2-506f29ad907b']
Feb 23 10:01:00 np0005626463.localdomain podman[318985]: 
Feb 23 10:01:00 np0005626463.localdomain podman[318985]: 2026-02-23 10:01:00.398006194 +0000 UTC m=+0.101822478 container create 8e328d5c9583c54d82bc7578a8d4a7e2918247e539b7dcafd39192d465e1b308 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9215e91e-1f4e-4608-9372-53243278a03d, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.43.0, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0)
Feb 23 10:01:00 np0005626463.localdomain systemd[1]: Started libpod-conmon-8e328d5c9583c54d82bc7578a8d4a7e2918247e539b7dcafd39192d465e1b308.scope.
Feb 23 10:01:00 np0005626463.localdomain podman[318985]: 2026-02-23 10:01:00.345435673 +0000 UTC m=+0.049251957 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 23 10:01:00 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 10:01:00 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/716b341786c1c3263e23387b3074c8217c19b5220ae1993515ce916482e724d4/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 23 10:01:00 np0005626463.localdomain podman[318985]: 2026-02-23 10:01:00.492666438 +0000 UTC m=+0.196482692 container init 8e328d5c9583c54d82bc7578a8d4a7e2918247e539b7dcafd39192d465e1b308 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9215e91e-1f4e-4608-9372-53243278a03d, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20260216, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Feb 23 10:01:00 np0005626463.localdomain podman[318985]: 2026-02-23 10:01:00.506925365 +0000 UTC m=+0.210741649 container start 8e328d5c9583c54d82bc7578a8d4a7e2918247e539b7dcafd39192d465e1b308 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9215e91e-1f4e-4608-9372-53243278a03d, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.build-date=20260216, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS)
Feb 23 10:01:00 np0005626463.localdomain dnsmasq[319003]: started, version 2.85 cachesize 150
Feb 23 10:01:00 np0005626463.localdomain dnsmasq[319003]: DNS service limited to local subnets
Feb 23 10:01:00 np0005626463.localdomain dnsmasq[319003]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 23 10:01:00 np0005626463.localdomain dnsmasq[319003]: warning: no upstream servers configured
Feb 23 10:01:00 np0005626463.localdomain dnsmasq-dhcp[319003]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Feb 23 10:01:00 np0005626463.localdomain dnsmasq[319003]: read /var/lib/neutron/dhcp/9215e91e-1f4e-4608-9372-53243278a03d/addn_hosts - 0 addresses
Feb 23 10:01:00 np0005626463.localdomain dnsmasq-dhcp[319003]: read /var/lib/neutron/dhcp/9215e91e-1f4e-4608-9372-53243278a03d/host
Feb 23 10:01:00 np0005626463.localdomain dnsmasq-dhcp[319003]: read /var/lib/neutron/dhcp/9215e91e-1f4e-4608-9372-53243278a03d/opts
Feb 23 10:01:00 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 10:01:00.687 265541 INFO neutron.agent.dhcp.agent [None req-781ebf1e-ee3c-40a5-a3fe-2998c03fa027 - - - - - -] DHCP configuration for ports {'28a87a71-0275-4bed-a2c7-c89f13d53c0a'} is completed
Feb 23 10:01:00 np0005626463.localdomain dnsmasq[319003]: exiting on receipt of SIGTERM
Feb 23 10:01:00 np0005626463.localdomain podman[319021]: 2026-02-23 10:01:00.87356791 +0000 UTC m=+0.062026919 container kill 8e328d5c9583c54d82bc7578a8d4a7e2918247e539b7dcafd39192d465e1b308 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9215e91e-1f4e-4608-9372-53243278a03d, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Feb 23 10:01:00 np0005626463.localdomain systemd[1]: libpod-8e328d5c9583c54d82bc7578a8d4a7e2918247e539b7dcafd39192d465e1b308.scope: Deactivated successfully.
Feb 23 10:01:00 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 10:01:00.891 2 INFO neutron.agent.securitygroups_rpc [None req-9f100d5f-55ff-4c02-bab1-7dd2c773e5e4 4ce76e2a79a849e8a6b3b31c05f9bc96 90343b3c0ce240adab2c21e5c92b6952 - - default default] Security group member updated ['9a671aa5-4d76-4c1e-8de2-506f29ad907b']
Feb 23 10:01:00 np0005626463.localdomain podman[319034]: 2026-02-23 10:01:00.939655015 +0000 UTC m=+0.054961016 container died 8e328d5c9583c54d82bc7578a8d4a7e2918247e539b7dcafd39192d465e1b308 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9215e91e-1f4e-4608-9372-53243278a03d, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Feb 23 10:01:01 np0005626463.localdomain ceph-mon[294160]: osdmap e202: 6 total, 6 up, 6 in
Feb 23 10:01:01 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/3008549570' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 23 10:01:01 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/3008549570' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 23 10:01:01 np0005626463.localdomain podman[319034]: 2026-02-23 10:01:01.028572788 +0000 UTC m=+0.143878739 container cleanup 8e328d5c9583c54d82bc7578a8d4a7e2918247e539b7dcafd39192d465e1b308 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9215e91e-1f4e-4608-9372-53243278a03d, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.43.0)
Feb 23 10:01:01 np0005626463.localdomain systemd[1]: libpod-conmon-8e328d5c9583c54d82bc7578a8d4a7e2918247e539b7dcafd39192d465e1b308.scope: Deactivated successfully.
Feb 23 10:01:01 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.
Feb 23 10:01:01 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.
Feb 23 10:01:01 np0005626463.localdomain podman[319041]: 2026-02-23 10:01:01.053495051 +0000 UTC m=+0.147744532 container remove 8e328d5c9583c54d82bc7578a8d4a7e2918247e539b7dcafd39192d465e1b308 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9215e91e-1f4e-4608-9372-53243278a03d, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.build-date=20260216)
Feb 23 10:01:01 np0005626463.localdomain podman[319062]: 2026-02-23 10:01:01.152121328 +0000 UTC m=+0.088409767 container health_status 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, vcs-type=git, com.redhat.component=ubi9-minimal-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.openshift.expose-services=, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, build-date=2026-02-05T04:57:10Z, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., managed_by=edpm_ansible, version=9.7, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, release=1770267347, config_id=openstack_network_exporter, io.openshift.tags=minimal rhel9, org.opencontainers.image.created=2026-02-05T04:57:10Z, container_name=openstack_network_exporter, name=ubi9/ubi-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Feb 23 10:01:01 np0005626463.localdomain podman[319062]: 2026-02-23 10:01:01.170444504 +0000 UTC m=+0.106732993 container exec_died 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, release=1770267347, org.opencontainers.image.created=2026-02-05T04:57:10Z, name=ubi9/ubi-minimal, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, version=9.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, distribution-scope=public, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, vcs-type=git, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, build-date=2026-02-05T04:57:10Z, com.redhat.component=ubi9-minimal-container)
Feb 23 10:01:01 np0005626463.localdomain systemd[1]: 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.service: Deactivated successfully.
Feb 23 10:01:01 np0005626463.localdomain podman[319064]: 2026-02-23 10:01:01.266030626 +0000 UTC m=+0.201205641 container health_status da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 23 10:01:01 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 10:01:01.282 2 INFO neutron.agent.securitygroups_rpc [None req-3d8beedf-5b31-48e0-a2b2-815a1e0fad78 4ce76e2a79a849e8a6b3b31c05f9bc96 90343b3c0ce240adab2c21e5c92b6952 - - default default] Security group member updated ['9a671aa5-4d76-4c1e-8de2-506f29ad907b']
Feb 23 10:01:01 np0005626463.localdomain podman[319064]: 2026-02-23 10:01:01.302508001 +0000 UTC m=+0.237682996 container exec_died da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 23 10:01:01 np0005626463.localdomain systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Deactivated successfully.
Feb 23 10:01:01 np0005626463.localdomain systemd[1]: tmp-crun.acftY0.mount: Deactivated successfully.
Feb 23 10:01:01 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-716b341786c1c3263e23387b3074c8217c19b5220ae1993515ce916482e724d4-merged.mount: Deactivated successfully.
Feb 23 10:01:01 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8e328d5c9583c54d82bc7578a8d4a7e2918247e539b7dcafd39192d465e1b308-userdata-shm.mount: Deactivated successfully.
Feb 23 10:01:01 np0005626463.localdomain CROND[319111]: (root) CMD (run-parts /etc/cron.hourly)
Feb 23 10:01:01 np0005626463.localdomain run-parts[319115]: (/etc/cron.hourly) starting 0anacron
Feb 23 10:01:01 np0005626463.localdomain run-parts[319122]: (/etc/cron.hourly) finished 0anacron
Feb 23 10:01:01 np0005626463.localdomain CROND[319110]: (root) CMDEND (run-parts /etc/cron.hourly)
Feb 23 10:01:02 np0005626463.localdomain ceph-mon[294160]: pgmap v417: 177 pgs: 177 active+clean; 218 MiB data, 994 MiB used, 41 GiB / 42 GiB avail; 4.1 MiB/s rd, 2.9 MiB/s wr, 372 op/s
Feb 23 10:01:02 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/551893517' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 23 10:01:02 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/551893517' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 23 10:01:02 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e202 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 23 10:01:02 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e202 do_prune osdmap full prune enabled
Feb 23 10:01:02 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e203 e203: 6 total, 6 up, 6 in
Feb 23 10:01:02 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e203: 6 total, 6 up, 6 in
Feb 23 10:01:02 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T10:01:02Z|00321|binding|INFO|Removing iface tapca3573cf-58 ovn-installed in OVS
Feb 23 10:01:02 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 10:01:02.430 163572 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port 3f25bc68-8a54-46df-a558-fc9b93a655ff with type ""
Feb 23 10:01:02 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T10:01:02Z|00322|binding|INFO|Removing lport ca3573cf-58bc-4b24-8e6e-7f86bcaa638e ovn-installed in OVS
Feb 23 10:01:02 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 10:01:02.432 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005626463.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1::2/64 2001:db8::2/64', 'neutron:device_id': 'dhcpfb23302c-55c1-5de0-badf-4fc1ff22837a-9215e91e-1f4e-4608-9372-53243278a03d', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9215e91e-1f4e-4608-9372-53243278a03d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '90343b3c0ce240adab2c21e5c92b6952', 'neutron:revision_number': '4', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005626463.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e9b465d1-55ea-4c9a-b8cb-0871f64f66d5, chassis=[<ovs.db.idl.Row object at 0x7f808c075610>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f808c075610>], logical_port=ca3573cf-58bc-4b24-8e6e-7f86bcaa638e) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 10:01:02 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 10:01:02.434 163572 INFO neutron.agent.ovn.metadata.agent [-] Port ca3573cf-58bc-4b24-8e6e-7f86bcaa638e in datapath 9215e91e-1f4e-4608-9372-53243278a03d unbound from our chassis
Feb 23 10:01:02 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 10:01:02.436 163572 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 9215e91e-1f4e-4608-9372-53243278a03d or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Feb 23 10:01:02 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 10:01:02.441 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[598db6b0-ce86-4f12-a0b0-9a6b0ac7efe0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 10:01:02 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:01:02.465 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:01:02 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:01:02.467 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:01:02 np0005626463.localdomain podman[319169]: 
Feb 23 10:01:02 np0005626463.localdomain podman[319169]: 2026-02-23 10:01:02.643784345 +0000 UTC m=+0.098969800 container create 46433afe735638d07e26140d4d379e9e0550ee11808203548579ae3ef8e4cc0d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9215e91e-1f4e-4608-9372-53243278a03d, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, tcib_managed=true)
Feb 23 10:01:02 np0005626463.localdomain podman[319169]: 2026-02-23 10:01:02.598513744 +0000 UTC m=+0.053699249 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 23 10:01:02 np0005626463.localdomain systemd[1]: Started libpod-conmon-46433afe735638d07e26140d4d379e9e0550ee11808203548579ae3ef8e4cc0d.scope.
Feb 23 10:01:02 np0005626463.localdomain systemd[1]: tmp-crun.HEif4p.mount: Deactivated successfully.
Feb 23 10:01:02 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 10:01:02 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0f08d8ee518831f994af401d4ecaa6867d89605187afe8b2fdadfbff6f089c22/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 23 10:01:02 np0005626463.localdomain podman[319169]: 2026-02-23 10:01:02.736177527 +0000 UTC m=+0.191362982 container init 46433afe735638d07e26140d4d379e9e0550ee11808203548579ae3ef8e4cc0d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9215e91e-1f4e-4608-9372-53243278a03d, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0)
Feb 23 10:01:02 np0005626463.localdomain podman[319169]: 2026-02-23 10:01:02.746835402 +0000 UTC m=+0.202020857 container start 46433afe735638d07e26140d4d379e9e0550ee11808203548579ae3ef8e4cc0d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9215e91e-1f4e-4608-9372-53243278a03d, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260216)
Feb 23 10:01:02 np0005626463.localdomain dnsmasq[319187]: started, version 2.85 cachesize 150
Feb 23 10:01:02 np0005626463.localdomain dnsmasq[319187]: DNS service limited to local subnets
Feb 23 10:01:02 np0005626463.localdomain dnsmasq[319187]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 23 10:01:02 np0005626463.localdomain dnsmasq[319187]: warning: no upstream servers configured
Feb 23 10:01:02 np0005626463.localdomain dnsmasq-dhcp[319187]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Feb 23 10:01:02 np0005626463.localdomain dnsmasq-dhcp[319187]: DHCPv6, static leases only on 2001:db8:0:1::, lease time 1d
Feb 23 10:01:02 np0005626463.localdomain dnsmasq[319187]: read /var/lib/neutron/dhcp/9215e91e-1f4e-4608-9372-53243278a03d/addn_hosts - 1 addresses
Feb 23 10:01:02 np0005626463.localdomain dnsmasq-dhcp[319187]: read /var/lib/neutron/dhcp/9215e91e-1f4e-4608-9372-53243278a03d/host
Feb 23 10:01:02 np0005626463.localdomain dnsmasq-dhcp[319187]: read /var/lib/neutron/dhcp/9215e91e-1f4e-4608-9372-53243278a03d/opts
Feb 23 10:01:02 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T10:01:02Z|00323|binding|INFO|Releasing lport 4143c8ea-7577-4792-9744-bcff90eb20f2 from this chassis (sb_readonly=0)
Feb 23 10:01:02 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 10:01:02.935 265541 INFO neutron.agent.dhcp.agent [None req-f132ed7e-7992-48a7-a661-9fc0064681a1 - - - - - -] DHCP configuration for ports {'28a87a71-0275-4bed-a2c7-c89f13d53c0a', 'aa36b0a9-2380-4c2e-a8cd-48f5992ac1f5', 'ca3573cf-58bc-4b24-8e6e-7f86bcaa638e'} is completed
Feb 23 10:01:02 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:01:02.969 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:01:03 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "eaea33d2-5529-46ec-832a-9d7cb7e5b233", "snap_name": "7b0236fc-1095-4ed0-a7af-e81e5f884e6a", "format": "json"}]: dispatch
Feb 23 10:01:03 np0005626463.localdomain ceph-mon[294160]: osdmap e203: 6 total, 6 up, 6 in
Feb 23 10:01:03 np0005626463.localdomain dnsmasq[319187]: exiting on receipt of SIGTERM
Feb 23 10:01:03 np0005626463.localdomain podman[319205]: 2026-02-23 10:01:03.110783802 +0000 UTC m=+0.067841152 container kill 46433afe735638d07e26140d4d379e9e0550ee11808203548579ae3ef8e4cc0d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9215e91e-1f4e-4608-9372-53243278a03d, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.vendor=CentOS)
Feb 23 10:01:03 np0005626463.localdomain systemd[1]: libpod-46433afe735638d07e26140d4d379e9e0550ee11808203548579ae3ef8e4cc0d.scope: Deactivated successfully.
Feb 23 10:01:03 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Feb 23 10:01:03 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2470762040' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 23 10:01:03 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Feb 23 10:01:03 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2470762040' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 23 10:01:03 np0005626463.localdomain podman[319217]: 2026-02-23 10:01:03.183979321 +0000 UTC m=+0.059564693 container died 46433afe735638d07e26140d4d379e9e0550ee11808203548579ae3ef8e4cc0d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9215e91e-1f4e-4608-9372-53243278a03d, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2)
Feb 23 10:01:03 np0005626463.localdomain podman[319217]: 2026-02-23 10:01:03.228577751 +0000 UTC m=+0.104163063 container cleanup 46433afe735638d07e26140d4d379e9e0550ee11808203548579ae3ef8e4cc0d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9215e91e-1f4e-4608-9372-53243278a03d, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.43.0, org.label-schema.build-date=20260216)
Feb 23 10:01:03 np0005626463.localdomain systemd[1]: libpod-conmon-46433afe735638d07e26140d4d379e9e0550ee11808203548579ae3ef8e4cc0d.scope: Deactivated successfully.
Feb 23 10:01:03 np0005626463.localdomain podman[319219]: 2026-02-23 10:01:03.266417259 +0000 UTC m=+0.133211804 container remove 46433afe735638d07e26140d4d379e9e0550ee11808203548579ae3ef8e4cc0d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9215e91e-1f4e-4608-9372-53243278a03d, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260216, tcib_managed=true)
Feb 23 10:01:03 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:01:03.278 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:01:03 np0005626463.localdomain kernel: device tapca3573cf-58 left promiscuous mode
Feb 23 10:01:03 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:01:03.292 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:01:03 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 10:01:03.312 265541 INFO neutron.agent.dhcp.agent [None req-632bf260-5ef9-432d-9db3-b514c42098c2 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 23 10:01:03 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 10:01:03.313 265541 INFO neutron.agent.dhcp.agent [None req-632bf260-5ef9-432d-9db3-b514c42098c2 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 23 10:01:03 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 10:01:03.313 265541 INFO neutron.agent.dhcp.agent [None req-632bf260-5ef9-432d-9db3-b514c42098c2 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 23 10:01:03 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-0f08d8ee518831f994af401d4ecaa6867d89605187afe8b2fdadfbff6f089c22-merged.mount: Deactivated successfully.
Feb 23 10:01:03 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-46433afe735638d07e26140d4d379e9e0550ee11808203548579ae3ef8e4cc0d-userdata-shm.mount: Deactivated successfully.
Feb 23 10:01:03 np0005626463.localdomain systemd[1]: run-netns-qdhcp\x2d9215e91e\x2d1f4e\x2d4608\x2d9372\x2d53243278a03d.mount: Deactivated successfully.
Feb 23 10:01:03 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:01:03.799 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:01:04 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/2470762040' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 23 10:01:04 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/2470762040' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 23 10:01:04 np0005626463.localdomain ceph-mon[294160]: pgmap v419: 177 pgs: 177 active+clean; 239 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 148 KiB/s rd, 4.0 MiB/s wr, 216 op/s
Feb 23 10:01:04 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Feb 23 10:01:04 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3340721442' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 23 10:01:04 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Feb 23 10:01:04 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3340721442' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 23 10:01:04 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 10:01:04.408 2 INFO neutron.agent.securitygroups_rpc [None req-8dbf60ea-0a4f-4b6f-8f3b-cd15a4e6bbb5 4ce76e2a79a849e8a6b3b31c05f9bc96 90343b3c0ce240adab2c21e5c92b6952 - - default default] Security group member updated ['9a671aa5-4d76-4c1e-8de2-506f29ad907b']
Feb 23 10:01:05 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "7b2a2c48-552a-4612-b861-e74517b6032f", "snap_name": "a6576274-fdc3-4c38-8214-f23c6386f2cf", "format": "json"}]: dispatch
Feb 23 10:01:05 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/3818338506' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 23 10:01:05 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/3818338506' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 23 10:01:05 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/3340721442' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 23 10:01:05 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/3340721442' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 23 10:01:05 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 10:01:05.237 2 INFO neutron.agent.securitygroups_rpc [None req-03b9fc8c-f7be-48ab-90b7-4895d3ad0871 4ce76e2a79a849e8a6b3b31c05f9bc96 90343b3c0ce240adab2c21e5c92b6952 - - default default] Security group member updated ['9a671aa5-4d76-4c1e-8de2-506f29ad907b']
Feb 23 10:01:05 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 10:01:05.837 2 INFO neutron.agent.securitygroups_rpc [None req-33f0d874-7a76-4789-bd46-4efe27d23136 4ce76e2a79a849e8a6b3b31c05f9bc96 90343b3c0ce240adab2c21e5c92b6952 - - default default] Security group member updated ['9a671aa5-4d76-4c1e-8de2-506f29ad907b']
Feb 23 10:01:06 np0005626463.localdomain ceph-mon[294160]: pgmap v420: 177 pgs: 177 active+clean; 239 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 125 KiB/s rd, 3.4 MiB/s wr, 182 op/s
Feb 23 10:01:06 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 10:01:06.256 2 INFO neutron.agent.securitygroups_rpc [None req-57a0e794-cdb7-43ae-ad96-b51298fae526 4ce76e2a79a849e8a6b3b31c05f9bc96 90343b3c0ce240adab2c21e5c92b6952 - - default default] Security group member updated ['9a671aa5-4d76-4c1e-8de2-506f29ad907b']
Feb 23 10:01:06 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Feb 23 10:01:06 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 10:01:07 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "eaea33d2-5529-46ec-832a-9d7cb7e5b233", "snap_name": "7b0236fc-1095-4ed0-a7af-e81e5f884e6a_d3ad30ba-4758-4125-8037-7dc5cb3e89aa", "force": true, "format": "json"}]: dispatch
Feb 23 10:01:07 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "eaea33d2-5529-46ec-832a-9d7cb7e5b233", "snap_name": "7b0236fc-1095-4ed0-a7af-e81e5f884e6a", "force": true, "format": "json"}]: dispatch
Feb 23 10:01:07 np0005626463.localdomain ceph-mon[294160]: from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 10:01:07 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e203 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 23 10:01:07 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e203 do_prune osdmap full prune enabled
Feb 23 10:01:07 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e204 e204: 6 total, 6 up, 6 in
Feb 23 10:01:07 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e204: 6 total, 6 up, 6 in
Feb 23 10:01:08 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "ce93cb3d-9c0e-41e4-ad7d-9451b8b5d7b4", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 23 10:01:08 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "ce93cb3d-9c0e-41e4-ad7d-9451b8b5d7b4", "format": "json"}]: dispatch
Feb 23 10:01:08 np0005626463.localdomain ceph-mon[294160]: pgmap v421: 177 pgs: 177 active+clean; 192 MiB data, 959 MiB used, 41 GiB / 42 GiB avail; 132 KiB/s rd, 2.7 MiB/s wr, 193 op/s
Feb 23 10:01:08 np0005626463.localdomain ceph-mon[294160]: osdmap e204: 6 total, 6 up, 6 in
Feb 23 10:01:08 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e204 do_prune osdmap full prune enabled
Feb 23 10:01:08 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e205 e205: 6 total, 6 up, 6 in
Feb 23 10:01:08 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e205: 6 total, 6 up, 6 in
Feb 23 10:01:08 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:01:08.802 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:01:09 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "7b2a2c48-552a-4612-b861-e74517b6032f", "snap_name": "5dbe1f82-6e63-4670-abcb-d97d24ea7f3d", "format": "json"}]: dispatch
Feb 23 10:01:09 np0005626463.localdomain ceph-mon[294160]: osdmap e205: 6 total, 6 up, 6 in
Feb 23 10:01:09 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 10:01:09.251 2 INFO neutron.agent.securitygroups_rpc [None req-8f58242c-3b5b-4ffa-9190-0f405a707baa 4ce76e2a79a849e8a6b3b31c05f9bc96 90343b3c0ce240adab2c21e5c92b6952 - - default default] Security group member updated ['9a671aa5-4d76-4c1e-8de2-506f29ad907b']
Feb 23 10:01:09 np0005626463.localdomain podman[242954]: time="2026-02-23T10:01:09Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 23 10:01:09 np0005626463.localdomain podman[242954]: @ - - [23/Feb/2026:10:01:09 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 157081 "" "Go-http-client/1.1"
Feb 23 10:01:09 np0005626463.localdomain podman[242954]: @ - - [23/Feb/2026:10:01:09 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18822 "" "Go-http-client/1.1"
Feb 23 10:01:09 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Feb 23 10:01:09 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 10:01:10 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "eaea33d2-5529-46ec-832a-9d7cb7e5b233", "format": "json"}]: dispatch
Feb 23 10:01:10 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "eaea33d2-5529-46ec-832a-9d7cb7e5b233", "force": true, "format": "json"}]: dispatch
Feb 23 10:01:10 np0005626463.localdomain ceph-mon[294160]: pgmap v424: 177 pgs: 177 active+clean; 192 MiB data, 959 MiB used, 41 GiB / 42 GiB avail; 39 KiB/s rd, 14 KiB/s wr, 58 op/s
Feb 23 10:01:10 np0005626463.localdomain ceph-mon[294160]: from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 10:01:11 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "7a63f65b-263e-4f0a-be43-9aace02f6e45", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 23 10:01:11 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "7a63f65b-263e-4f0a-be43-9aace02f6e45", "format": "json"}]: dispatch
Feb 23 10:01:12 np0005626463.localdomain ceph-mon[294160]: pgmap v425: 177 pgs: 177 active+clean; 192 MiB data, 955 MiB used, 41 GiB / 42 GiB avail; 34 KiB/s rd, 31 KiB/s wr, 53 op/s
Feb 23 10:01:12 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e205 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 23 10:01:12 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e205 do_prune osdmap full prune enabled
Feb 23 10:01:12 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e206 e206: 6 total, 6 up, 6 in
Feb 23 10:01:12 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e206: 6 total, 6 up, 6 in
Feb 23 10:01:12 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.
Feb 23 10:01:12 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.
Feb 23 10:01:12 np0005626463.localdomain podman[319248]: 2026-02-23 10:01:12.911885412 +0000 UTC m=+0.084302319 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260216, io.buildah.version=1.43.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Feb 23 10:01:13 np0005626463.localdomain systemd[1]: tmp-crun.cMfIaG.mount: Deactivated successfully.
Feb 23 10:01:13 np0005626463.localdomain podman[319249]: 2026-02-23 10:01:13.008720153 +0000 UTC m=+0.174133979 container health_status bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Feb 23 10:01:13 np0005626463.localdomain podman[319249]: 2026-02-23 10:01:13.020236465 +0000 UTC m=+0.185650261 container exec_died bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 23 10:01:13 np0005626463.localdomain systemd[1]: bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.service: Deactivated successfully.
Feb 23 10:01:13 np0005626463.localdomain podman[319248]: 2026-02-23 10:01:13.033584234 +0000 UTC m=+0.206001161 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, container_name=ovn_controller)
Feb 23 10:01:13 np0005626463.localdomain systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully.
Feb 23 10:01:13 np0005626463.localdomain openstack_network_exporter[245358]: ERROR   10:01:13 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 23 10:01:13 np0005626463.localdomain openstack_network_exporter[245358]: 
Feb 23 10:01:13 np0005626463.localdomain openstack_network_exporter[245358]: ERROR   10:01:13 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 23 10:01:13 np0005626463.localdomain openstack_network_exporter[245358]: 
Feb 23 10:01:13 np0005626463.localdomain ceph-mon[294160]: osdmap e206: 6 total, 6 up, 6 in
Feb 23 10:01:13 np0005626463.localdomain ceph-mon[294160]: pgmap v427: 177 pgs: 177 active+clean; 192 MiB data, 947 MiB used, 41 GiB / 42 GiB avail; 682 B/s rd, 25 KiB/s wr, 5 op/s
Feb 23 10:01:13 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:01:13.805 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:01:16 np0005626463.localdomain ceph-mon[294160]: pgmap v428: 177 pgs: 177 active+clean; 192 MiB data, 947 MiB used, 41 GiB / 42 GiB avail; 521 B/s rd, 19 KiB/s wr, 4 op/s
Feb 23 10:01:17 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e206 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 23 10:01:17 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e206 do_prune osdmap full prune enabled
Feb 23 10:01:17 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e207 e207: 6 total, 6 up, 6 in
Feb 23 10:01:17 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e207: 6 total, 6 up, 6 in
Feb 23 10:01:18 np0005626463.localdomain ceph-mon[294160]: pgmap v429: 177 pgs: 177 active+clean; 193 MiB data, 947 MiB used, 41 GiB / 42 GiB avail; 463 B/s rd, 24 KiB/s wr, 4 op/s
Feb 23 10:01:18 np0005626463.localdomain ceph-mon[294160]: osdmap e207: 6 total, 6 up, 6 in
Feb 23 10:01:18 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:01:18.809 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:01:18 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.
Feb 23 10:01:18 np0005626463.localdomain podman[319296]: 2026-02-23 10:01:18.914711126 +0000 UTC m=+0.087966754 container health_status be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0)
Feb 23 10:01:18 np0005626463.localdomain podman[319296]: 2026-02-23 10:01:18.954386181 +0000 UTC m=+0.127641810 container exec_died be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0)
Feb 23 10:01:18 np0005626463.localdomain systemd[1]: be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.service: Deactivated successfully.
Feb 23 10:01:20 np0005626463.localdomain ceph-mon[294160]: pgmap v431: 177 pgs: 177 active+clean; 193 MiB data, 947 MiB used, 41 GiB / 42 GiB avail; 8.0 KiB/s wr, 1 op/s
Feb 23 10:01:20 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.
Feb 23 10:01:20 np0005626463.localdomain podman[319315]: 2026-02-23 10:01:20.925562619 +0000 UTC m=+0.090059219 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.build-date=20260216)
Feb 23 10:01:20 np0005626463.localdomain podman[319315]: 2026-02-23 10:01:20.934772278 +0000 UTC m=+0.099268878 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.43.0)
Feb 23 10:01:20 np0005626463.localdomain systemd[1]: 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully.
Feb 23 10:01:21 np0005626463.localdomain ceph-mon[294160]: pgmap v432: 177 pgs: 177 active+clean; 193 MiB data, 948 MiB used, 41 GiB / 42 GiB avail; 8.4 KiB/s wr, 1 op/s
Feb 23 10:01:22 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e207 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 23 10:01:23 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:01:23.055 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:01:23 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:01:23.056 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 23 10:01:23 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:01:23.056 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 23 10:01:23 np0005626463.localdomain ceph-mon[294160]: pgmap v433: 177 pgs: 177 active+clean; 193 MiB data, 948 MiB used, 41 GiB / 42 GiB avail; 7.4 KiB/s wr, 1 op/s
Feb 23 10:01:23 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:01:23.811 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:01:24 np0005626463.localdomain sudo[319334]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 10:01:24 np0005626463.localdomain sudo[319334]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 10:01:24 np0005626463.localdomain sudo[319334]: pam_unix(sudo:session): session closed for user root
Feb 23 10:01:24 np0005626463.localdomain sudo[319352]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Feb 23 10:01:24 np0005626463.localdomain sudo[319352]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 10:01:25 np0005626463.localdomain systemd[1]: tmp-crun.tv0G3v.mount: Deactivated successfully.
Feb 23 10:01:25 np0005626463.localdomain podman[319443]: 2026-02-23 10:01:25.19992067 +0000 UTC m=+0.090853915 container exec fdf07215f0388d0ebc44f1f3744080ba594441e647c300d0dade62ff5beba234 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-crash-np0005626463, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, vcs-type=git, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., io.buildah.version=1.42.2, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2026-02-09T10:25:24Z, RELEASE=main, GIT_BRANCH=main, release=1770267347, version=7, io.openshift.tags=rhceph ceph)
Feb 23 10:01:25 np0005626463.localdomain podman[319443]: 2026-02-23 10:01:25.338176522 +0000 UTC m=+0.229109717 container exec_died fdf07215f0388d0ebc44f1f3744080ba594441e647c300d0dade62ff5beba234 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-crash-np0005626463, com.redhat.component=rhceph-container, release=1770267347, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2026-02-09T10:25:24Z, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, ceph=True, io.buildah.version=1.42.2, CEPH_POINT_RELEASE=, distribution-scope=public, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Feb 23 10:01:25 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain.devices.0}] v 0)
Feb 23 10:01:25 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 10:01:25 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain}] v 0)
Feb 23 10:01:25 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 10:01:25 np0005626463.localdomain sudo[319352]: pam_unix(sudo:session): session closed for user root
Feb 23 10:01:25 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626463.localdomain.devices.0}] v 0)
Feb 23 10:01:25 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 10:01:25 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626463.localdomain}] v 0)
Feb 23 10:01:25 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 10:01:26 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626465.localdomain.devices.0}] v 0)
Feb 23 10:01:26 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 10:01:26 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626465.localdomain}] v 0)
Feb 23 10:01:26 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 10:01:26 np0005626463.localdomain sudo[319563]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 10:01:26 np0005626463.localdomain sudo[319563]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 10:01:26 np0005626463.localdomain sudo[319563]: pam_unix(sudo:session): session closed for user root
Feb 23 10:01:26 np0005626463.localdomain sudo[319581]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 23 10:01:26 np0005626463.localdomain sudo[319581]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 10:01:26 np0005626463.localdomain ceph-mon[294160]: pgmap v434: 177 pgs: 177 active+clean; 193 MiB data, 948 MiB used, 41 GiB / 42 GiB avail; 7.4 KiB/s wr, 1 op/s
Feb 23 10:01:26 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 10:01:26 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 10:01:26 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 10:01:26 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 10:01:26 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 10:01:26 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 10:01:26 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} v 0)
Feb 23 10:01:26 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Feb 23 10:01:26 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} v 0)
Feb 23 10:01:26 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Feb 23 10:01:26 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config set, name=osd_memory_target}] v 0)
Feb 23 10:01:26 np0005626463.localdomain sudo[319581]: pam_unix(sudo:session): session closed for user root
Feb 23 10:01:26 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} v 0)
Feb 23 10:01:26 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Feb 23 10:01:26 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} v 0)
Feb 23 10:01:26 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Feb 23 10:01:26 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config set, name=osd_memory_target}] v 0)
Feb 23 10:01:26 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} v 0)
Feb 23 10:01:26 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Feb 23 10:01:26 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} v 0)
Feb 23 10:01:26 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Feb 23 10:01:26 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config set, name=osd_memory_target}] v 0)
Feb 23 10:01:26 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Feb 23 10:01:26 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 10:01:26 np0005626463.localdomain sudo[319630]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 10:01:26 np0005626463.localdomain sudo[319630]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 10:01:26 np0005626463.localdomain sudo[319630]: pam_unix(sudo:session): session closed for user root
Feb 23 10:01:27 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Feb 23 10:01:27 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Feb 23 10:01:27 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Feb 23 10:01:27 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Feb 23 10:01:27 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Feb 23 10:01:27 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Feb 23 10:01:27 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Feb 23 10:01:27 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Feb 23 10:01:27 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Feb 23 10:01:27 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Feb 23 10:01:27 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Feb 23 10:01:27 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Feb 23 10:01:27 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 10:01:27 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 23 10:01:27 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 10:01:27 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 23 10:01:27 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e207 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 23 10:01:28 np0005626463.localdomain ceph-mon[294160]: Adjusting osd_memory_target on np0005626466.localdomain to 836.6M
Feb 23 10:01:28 np0005626463.localdomain ceph-mon[294160]: Unable to set osd_memory_target on np0005626466.localdomain to 877243801: error parsing value: Value '877243801' is below minimum 939524096
Feb 23 10:01:28 np0005626463.localdomain ceph-mon[294160]: Adjusting osd_memory_target on np0005626463.localdomain to 836.6M
Feb 23 10:01:28 np0005626463.localdomain ceph-mon[294160]: Unable to set osd_memory_target on np0005626463.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Feb 23 10:01:28 np0005626463.localdomain ceph-mon[294160]: Adjusting osd_memory_target on np0005626465.localdomain to 836.6M
Feb 23 10:01:28 np0005626463.localdomain ceph-mon[294160]: Unable to set osd_memory_target on np0005626465.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Feb 23 10:01:28 np0005626463.localdomain ceph-mon[294160]: pgmap v435: 177 pgs: 177 active+clean; 193 MiB data, 948 MiB used, 41 GiB / 42 GiB avail; 1.8 KiB/s wr, 0 op/s
Feb 23 10:01:28 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:01:28.814 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:01:30 np0005626463.localdomain ceph-mon[294160]: pgmap v436: 177 pgs: 177 active+clean; 193 MiB data, 948 MiB used, 41 GiB / 42 GiB avail; 1.5 KiB/s wr, 0 op/s
Feb 23 10:01:30 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Feb 23 10:01:30 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 10:01:31 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 10:01:31 np0005626463.localdomain ceph-mon[294160]: pgmap v437: 177 pgs: 177 active+clean; 193 MiB data, 948 MiB used, 41 GiB / 42 GiB avail; 1.5 KiB/s wr, 0 op/s
Feb 23 10:01:31 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.
Feb 23 10:01:31 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.
Feb 23 10:01:31 np0005626463.localdomain podman[319649]: 2026-02-23 10:01:31.911331475 +0000 UTC m=+0.079990632 container health_status da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 23 10:01:31 np0005626463.localdomain podman[319649]: 2026-02-23 10:01:31.917650105 +0000 UTC m=+0.086309212 container exec_died da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 23 10:01:31 np0005626463.localdomain systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Deactivated successfully.
Feb 23 10:01:31 np0005626463.localdomain podman[319648]: 2026-02-23 10:01:31.971847716 +0000 UTC m=+0.142475135 container health_status 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9/ubi-minimal, io.openshift.tags=minimal rhel9, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, version=9.7, vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, release=1770267347, config_id=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, container_name=openstack_network_exporter, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.openshift.expose-services=, managed_by=edpm_ansible, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, build-date=2026-02-05T04:57:10Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, org.opencontainers.image.created=2026-02-05T04:57:10Z)
Feb 23 10:01:31 np0005626463.localdomain podman[319648]: 2026-02-23 10:01:31.984662028 +0000 UTC m=+0.155289457 container exec_died 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, vendor=Red Hat, Inc., config_id=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, build-date=2026-02-05T04:57:10Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1770267347, org.opencontainers.image.created=2026-02-05T04:57:10Z, container_name=openstack_network_exporter, distribution-scope=public, io.openshift.expose-services=, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, version=9.7, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9/ubi-minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 
'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers)
Feb 23 10:01:31 np0005626463.localdomain systemd[1]: 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.service: Deactivated successfully.
Feb 23 10:01:32 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e207 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 23 10:01:33 np0005626463.localdomain ceph-mon[294160]: pgmap v438: 177 pgs: 177 active+clean; 193 MiB data, 948 MiB used, 41 GiB / 42 GiB avail; 426 B/s wr, 0 op/s
Feb 23 10:01:33 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:01:33.817 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 10:01:33 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:01:33.818 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:01:33 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:01:33.818 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 10:01:33 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:01:33.819 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 10:01:33 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:01:33.819 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 10:01:35 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:01:35.290 282211 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 7.45 sec
Feb 23 10:01:36 np0005626463.localdomain ceph-mon[294160]: pgmap v439: 177 pgs: 177 active+clean; 193 MiB data, 948 MiB used, 41 GiB / 42 GiB avail; 426 B/s wr, 0 op/s
Feb 23 10:01:36 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.108:0/1094906826' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 10:01:36 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.107:0/4257796391' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 10:01:36 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:01:36.384 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 23 10:01:36 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:01:36.384 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquired lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 23 10:01:36 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:01:36.385 282211 DEBUG nova.network.neutron [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 23 10:01:36 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:01:36.385 282211 DEBUG nova.objects.instance [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c2a7d92b-952f-46a7-8a6a-3322a48fcf4b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 23 10:01:37 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.107:0/2581822653' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 10:01:37 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.108:0/2908369641' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 10:01:37 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 10:01:37.391 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=21, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '22:68:bc', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'c6:19:65:94:49:af'}, ipsec=False) old=SB_Global(nb_cfg=20) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 10:01:37 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 10:01:37.392 163572 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 23 10:01:37 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e207 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 23 10:01:37 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:01:37.419 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:01:37 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.Joe", "caps": ["mds", "allow rw path=/volumes/_nogroup/7a63f65b-263e-4f0a-be43-9aace02f6e45/c5b2858c-2130-4d42-a52c-3132bbd1bfb7", "osd", "allow rw pool=manila_data namespace=fsvolumens_7a63f65b-263e-4f0a-be43-9aace02f6e45", "mon", "allow r"], "format": "json"} v 0)
Feb 23 10:01:37 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.Joe", "caps": ["mds", "allow rw path=/volumes/_nogroup/7a63f65b-263e-4f0a-be43-9aace02f6e45/c5b2858c-2130-4d42-a52c-3132bbd1bfb7", "osd", "allow rw pool=manila_data namespace=fsvolumens_7a63f65b-263e-4f0a-be43-9aace02f6e45", "mon", "allow r"], "format": "json"} : dispatch
Feb 23 10:01:37 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.Joe", "caps": ["mds", "allow rw path=/volumes/_nogroup/7a63f65b-263e-4f0a-be43-9aace02f6e45/c5b2858c-2130-4d42-a52c-3132bbd1bfb7", "osd", "allow rw pool=manila_data namespace=fsvolumens_7a63f65b-263e-4f0a-be43-9aace02f6e45", "mon", "allow r"], "format": "json"}]': finished
Feb 23 10:01:38 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "7b2a2c48-552a-4612-b861-e74517b6032f", "snap_name": "1be5156e-560a-4ea6-aa3b-098d527fc684", "format": "json"}]: dispatch
Feb 23 10:01:38 np0005626463.localdomain ceph-mon[294160]: pgmap v440: 177 pgs: 177 active+clean; 193 MiB data, 948 MiB used, 41 GiB / 42 GiB avail; 426 B/s wr, 0 op/s
Feb 23 10:01:38 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "7a63f65b-263e-4f0a-be43-9aace02f6e45", "auth_id": "Joe", "tenant_id": "1b9d2e21adaa4adab3e6f69b48abf75a", "access_level": "rw", "format": "json"}]: dispatch
Feb 23 10:01:38 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.Joe", "format": "json"} : dispatch
Feb 23 10:01:38 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.Joe", "caps": ["mds", "allow rw path=/volumes/_nogroup/7a63f65b-263e-4f0a-be43-9aace02f6e45/c5b2858c-2130-4d42-a52c-3132bbd1bfb7", "osd", "allow rw pool=manila_data namespace=fsvolumens_7a63f65b-263e-4f0a-be43-9aace02f6e45", "mon", "allow r"], "format": "json"} : dispatch
Feb 23 10:01:38 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.Joe", "caps": ["mds", "allow rw path=/volumes/_nogroup/7a63f65b-263e-4f0a-be43-9aace02f6e45/c5b2858c-2130-4d42-a52c-3132bbd1bfb7", "osd", "allow rw pool=manila_data namespace=fsvolumens_7a63f65b-263e-4f0a-be43-9aace02f6e45", "mon", "allow r"], "format": "json"} : dispatch
Feb 23 10:01:38 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.Joe", "caps": ["mds", "allow rw path=/volumes/_nogroup/7a63f65b-263e-4f0a-be43-9aace02f6e45/c5b2858c-2130-4d42-a52c-3132bbd1bfb7", "osd", "allow rw pool=manila_data namespace=fsvolumens_7a63f65b-263e-4f0a-be43-9aace02f6e45", "mon", "allow r"], "format": "json"}]': finished
Feb 23 10:01:38 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:01:38.426 282211 DEBUG nova.network.neutron [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Updating instance_info_cache with network_info: [{"id": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "address": "fa:16:3e:a0:9d:00", "network": {"id": "9da5b53d-3184-450f-9a5b-bdba1a6c9f6d", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "37b8098efb0d4ecc90b451a2db0e966f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa27e5011-20", "ovs_interfaceid": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 23 10:01:38 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:01:38.453 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Releasing lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 23 10:01:38 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:01:38.453 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 23 10:01:38 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:01:38.454 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:01:38 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:01:38.454 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:01:38 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:01:38.455 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:01:38 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:01:38.455 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:01:38 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:01:38.455 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:01:38 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:01:38.456 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:01:38 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:01:38.456 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 23 10:01:38 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:01:38.457 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:01:38 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:01:38.483 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 10:01:38 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:01:38.484 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 10:01:38 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:01:38.484 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 10:01:38 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:01:38.484 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Auditing locally available compute resources for np0005626463.localdomain (node: np0005626463.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 23 10:01:38 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:01:38.485 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 10:01:38 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:01:38.859 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:01:38 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 23 10:01:38 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/1410413463' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 10:01:38 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:01:38.946 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.461s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 10:01:39 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:01:39.052 282211 DEBUG nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 23 10:01:39 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:01:39.053 282211 DEBUG nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 23 10:01:39 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Feb 23 10:01:39 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 10:01:39 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:01:39.287 282211 WARNING nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 23 10:01:39 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:01:39.289 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Hypervisor/Node resource view: name=np0005626463.localdomain free_ram=11283MB free_disk=41.8366584777832GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 23 10:01:39 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:01:39.289 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 10:01:39 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:01:39.289 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 10:01:39 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.106:0/1410413463' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 10:01:39 np0005626463.localdomain ceph-mon[294160]: from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 10:01:39 np0005626463.localdomain podman[242954]: time="2026-02-23T10:01:39Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 23 10:01:39 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 10:01:39.394 163572 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=96b5bb93-7341-4ce6-9b93-6a5de566c711, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '21'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 10:01:39 np0005626463.localdomain podman[242954]: @ - - [23/Feb/2026:10:01:39 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 157081 "" "Go-http-client/1.1"
Feb 23 10:01:39 np0005626463.localdomain podman[242954]: @ - - [23/Feb/2026:10:01:39 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18827 "" "Go-http-client/1.1"
Feb 23 10:01:39 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:01:39.441 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Instance c2a7d92b-952f-46a7-8a6a-3322a48fcf4b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 23 10:01:39 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:01:39.442 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 23 10:01:39 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:01:39.443 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Final resource view: name=np0005626463.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 23 10:01:39 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:01:39.539 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 10:01:39 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 10:01:39.677 2 INFO neutron.agent.securitygroups_rpc [None req-0ebe300c-0924-4a3a-86fe-0f106df51381 4ce76e2a79a849e8a6b3b31c05f9bc96 90343b3c0ce240adab2c21e5c92b6952 - - default default] Security group member updated ['9a671aa5-4d76-4c1e-8de2-506f29ad907b']
Feb 23 10:01:39 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 23 10:01:39 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/2001042167' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 10:01:39 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:01:39.978 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.439s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 10:01:39 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:01:39.985 282211 DEBUG nova.compute.provider_tree [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Inventory has not changed in ProviderTree for provider: be63d86c-a403-4ec9-a515-07ea2962cb4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 23 10:01:40 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:01:40.013 282211 DEBUG nova.scheduler.client.report [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Inventory has not changed for provider be63d86c-a403-4ec9-a515-07ea2962cb4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 23 10:01:40 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:01:40.017 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Compute_service record updated for np0005626463.localdomain:np0005626463.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 23 10:01:40 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:01:40.018 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.728s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 10:01:40 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T10:01:40Z|00324|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Feb 23 10:01:40 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "836ea875-40c0-4d41-ac93-96ac1cabc343", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 23 10:01:40 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "836ea875-40c0-4d41-ac93-96ac1cabc343", "format": "json"}]: dispatch
Feb 23 10:01:40 np0005626463.localdomain ceph-mon[294160]: pgmap v441: 177 pgs: 177 active+clean; 193 MiB data, 948 MiB used, 41 GiB / 42 GiB avail
Feb 23 10:01:40 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.106:0/2001042167' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 10:01:41 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Feb 23 10:01:41 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 10:01:41 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "c3aedd71-b342-4920-afd2-d5c6fd4776d2", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 23 10:01:41 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "c3aedd71-b342-4920-afd2-d5c6fd4776d2", "format": "json"}]: dispatch
Feb 23 10:01:41 np0005626463.localdomain ceph-mon[294160]: from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 10:01:41 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "7b2a2c48-552a-4612-b861-e74517b6032f", "snap_name": "a687b09a-63b9-4132-9be0-38d64393e4b6", "format": "json"}]: dispatch
Feb 23 10:01:41 np0005626463.localdomain ceph-mon[294160]: pgmap v442: 177 pgs: 177 active+clean; 193 MiB data, 948 MiB used, 41 GiB / 42 GiB avail; 13 KiB/s wr, 2 op/s
Feb 23 10:01:42 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/683222515' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 23 10:01:42 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/683222515' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 23 10:01:42 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/1851732089' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 23 10:01:42 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/1851732089' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 23 10:01:42 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e207 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 23 10:01:43 np0005626463.localdomain openstack_network_exporter[245358]: ERROR   10:01:43 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 23 10:01:43 np0005626463.localdomain openstack_network_exporter[245358]: 
Feb 23 10:01:43 np0005626463.localdomain openstack_network_exporter[245358]: ERROR   10:01:43 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 23 10:01:43 np0005626463.localdomain openstack_network_exporter[245358]: 
Feb 23 10:01:43 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/3589222603' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 23 10:01:43 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/3589222603' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 23 10:01:43 np0005626463.localdomain ceph-mon[294160]: pgmap v443: 177 pgs: 177 active+clean; 193 MiB data, 948 MiB used, 41 GiB / 42 GiB avail; 13 KiB/s wr, 2 op/s
Feb 23 10:01:43 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 10:01:43.815 2 INFO neutron.agent.securitygroups_rpc [None req-72581ce1-6ecd-4794-bdfa-9dea488ea111 4ce76e2a79a849e8a6b3b31c05f9bc96 90343b3c0ce240adab2c21e5c92b6952 - - default default] Security group member updated ['48663913-ae52-424c-8374-b7539096caba']
Feb 23 10:01:43 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.
Feb 23 10:01:43 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.
Feb 23 10:01:43 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:01:43.862 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 10:01:43 np0005626463.localdomain systemd[1]: tmp-crun.IRThQN.mount: Deactivated successfully.
Feb 23 10:01:43 np0005626463.localdomain podman[319736]: 2026-02-23 10:01:43.941709558 +0000 UTC m=+0.105680160 container health_status bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 23 10:01:43 np0005626463.localdomain podman[319736]: 2026-02-23 10:01:43.9545088 +0000 UTC m=+0.118479392 container exec_died bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Feb 23 10:01:43 np0005626463.localdomain systemd[1]: bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.service: Deactivated successfully.
Feb 23 10:01:44 np0005626463.localdomain podman[319735]: 2026-02-23 10:01:44.037443225 +0000 UTC m=+0.204189924 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_controller, container_name=ovn_controller)
Feb 23 10:01:44 np0005626463.localdomain podman[319735]: 2026-02-23 10:01:44.108567098 +0000 UTC m=+0.275313747 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, tcib_managed=true)
Feb 23 10:01:44 np0005626463.localdomain systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully.
Feb 23 10:01:44 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.Joe", "format": "json"} : dispatch
Feb 23 10:01:45 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "c3aedd71-b342-4920-afd2-d5c6fd4776d2", "auth_id": "Joe", "tenant_id": "4d2b2d5862b8427aac5a9c709976e3ff", "access_level": "rw", "format": "json"}]: dispatch
Feb 23 10:01:45 np0005626463.localdomain ceph-mon[294160]: pgmap v444: 177 pgs: 177 active+clean; 193 MiB data, 948 MiB used, 41 GiB / 42 GiB avail; 13 KiB/s wr, 2 op/s
Feb 23 10:01:45 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "836ea875-40c0-4d41-ac93-96ac1cabc343", "format": "json"}]: dispatch
Feb 23 10:01:45 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 10:01:45.596 2 INFO neutron.agent.securitygroups_rpc [None req-97c39ba0-e75b-404a-baf7-3d9225783656 4ce76e2a79a849e8a6b3b31c05f9bc96 90343b3c0ce240adab2c21e5c92b6952 - - default default] Security group member updated ['48663913-ae52-424c-8374-b7539096caba', '3384fe18-0fab-4c8a-9159-5a07fe1d4f48']
Feb 23 10:01:45 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 10:01:45.971 2 INFO neutron.agent.securitygroups_rpc [None req-62619d1c-ae99-4a07-8196-e49ad1562f12 4ce76e2a79a849e8a6b3b31c05f9bc96 90343b3c0ce240adab2c21e5c92b6952 - - default default] Security group member updated ['3384fe18-0fab-4c8a-9159-5a07fe1d4f48']
Feb 23 10:01:46 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "836ea875-40c0-4d41-ac93-96ac1cabc343", "force": true, "format": "json"}]: dispatch
Feb 23 10:01:46 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "7b2a2c48-552a-4612-b861-e74517b6032f", "snap_name": "f524bd25-e758-44bb-a21d-5fd935532860", "format": "json"}]: dispatch
Feb 23 10:01:47 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e207 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 23 10:01:47 np0005626463.localdomain ceph-mon[294160]: pgmap v445: 177 pgs: 177 active+clean; 193 MiB data, 948 MiB used, 41 GiB / 42 GiB avail; 26 KiB/s wr, 3 op/s
Feb 23 10:01:47 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-557795333", "format": "json"} : dispatch
Feb 23 10:01:47 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-557795333", "caps": ["mds", "allow rw path=/volumes/_nogroup/c3aedd71-b342-4920-afd2-d5c6fd4776d2/28f0b5f4-80b5-453b-a6b1-657204142a24", "osd", "allow rw pool=manila_data namespace=fsvolumens_c3aedd71-b342-4920-afd2-d5c6fd4776d2", "mon", "allow r"], "format": "json"} v 0)
Feb 23 10:01:47 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-557795333", "caps": ["mds", "allow rw path=/volumes/_nogroup/c3aedd71-b342-4920-afd2-d5c6fd4776d2/28f0b5f4-80b5-453b-a6b1-657204142a24", "osd", "allow rw pool=manila_data namespace=fsvolumens_c3aedd71-b342-4920-afd2-d5c6fd4776d2", "mon", "allow r"], "format": "json"} : dispatch
Feb 23 10:01:47 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-557795333", "caps": ["mds", "allow rw path=/volumes/_nogroup/c3aedd71-b342-4920-afd2-d5c6fd4776d2/28f0b5f4-80b5-453b-a6b1-657204142a24", "osd", "allow rw pool=manila_data namespace=fsvolumens_c3aedd71-b342-4920-afd2-d5c6fd4776d2", "mon", "allow r"], "format": "json"}]': finished
Feb 23 10:01:48 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:01:48.013 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:01:48 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:01:48.013 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:01:48 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 10:01:48.558 2 INFO neutron.agent.securitygroups_rpc [None req-deb2d108-d73d-4f3b-b409-b0d5aa38dbb2 4ce76e2a79a849e8a6b3b31c05f9bc96 90343b3c0ce240adab2c21e5c92b6952 - - default default] Security group member updated ['7ccfcbe5-3a12-4044-a554-c033a2966e5e']
Feb 23 10:01:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 10:01:48.561 163572 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 10:01:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 10:01:48.561 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 10:01:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 10:01:48.562 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 10:01:48 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "c3aedd71-b342-4920-afd2-d5c6fd4776d2", "auth_id": "tempest-cephx-id-557795333", "tenant_id": "4d2b2d5862b8427aac5a9c709976e3ff", "access_level": "rw", "format": "json"}]: dispatch
Feb 23 10:01:48 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-557795333", "caps": ["mds", "allow rw path=/volumes/_nogroup/c3aedd71-b342-4920-afd2-d5c6fd4776d2/28f0b5f4-80b5-453b-a6b1-657204142a24", "osd", "allow rw pool=manila_data namespace=fsvolumens_c3aedd71-b342-4920-afd2-d5c6fd4776d2", "mon", "allow r"], "format": "json"} : dispatch
Feb 23 10:01:48 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-557795333", "caps": ["mds", "allow rw path=/volumes/_nogroup/c3aedd71-b342-4920-afd2-d5c6fd4776d2/28f0b5f4-80b5-453b-a6b1-657204142a24", "osd", "allow rw pool=manila_data namespace=fsvolumens_c3aedd71-b342-4920-afd2-d5c6fd4776d2", "mon", "allow r"], "format": "json"} : dispatch
Feb 23 10:01:48 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-557795333", "caps": ["mds", "allow rw path=/volumes/_nogroup/c3aedd71-b342-4920-afd2-d5c6fd4776d2/28f0b5f4-80b5-453b-a6b1-657204142a24", "osd", "allow rw pool=manila_data namespace=fsvolumens_c3aedd71-b342-4920-afd2-d5c6fd4776d2", "mon", "allow r"], "format": "json"}]': finished
Feb 23 10:01:48 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:01:48.865 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:01:49 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "7b2a2c48-552a-4612-b861-e74517b6032f", "snap_name": "f524bd25-e758-44bb-a21d-5fd935532860_b917d76c-b777-4089-bc48-72bb8837214f", "force": true, "format": "json"}]: dispatch
Feb 23 10:01:49 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "7b2a2c48-552a-4612-b861-e74517b6032f", "snap_name": "f524bd25-e758-44bb-a21d-5fd935532860", "force": true, "format": "json"}]: dispatch
Feb 23 10:01:49 np0005626463.localdomain ceph-mon[294160]: pgmap v446: 177 pgs: 177 active+clean; 193 MiB data, 948 MiB used, 41 GiB / 42 GiB avail; 26 KiB/s wr, 3 op/s
Feb 23 10:01:49 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.
Feb 23 10:01:49 np0005626463.localdomain podman[319783]: 2026-02-23 10:01:49.909952766 +0000 UTC m=+0.084144494 container health_status be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, 
org.label-schema.build-date=20260216, tcib_managed=true)
Feb 23 10:01:49 np0005626463.localdomain podman[319783]: 2026-02-23 10:01:49.923290375 +0000 UTC m=+0.097482103 container exec_died be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.schema-version=1.0)
Feb 23 10:01:49 np0005626463.localdomain systemd[1]: be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.service: Deactivated successfully.
Feb 23 10:01:50 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 10:01:50.145 2 INFO neutron.agent.securitygroups_rpc [None req-80e2a041-7b6e-4f6c-b102-2630aa52a7b1 4ce76e2a79a849e8a6b3b31c05f9bc96 90343b3c0ce240adab2c21e5c92b6952 - - default default] Security group member updated ['fdc27d14-90ab-419e-a68e-458d1f31be69', 'abc26baf-67f8-4703-8e61-63db3bbb7b3b', '7ccfcbe5-3a12-4044-a554-c033a2966e5e']
Feb 23 10:01:50 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Feb 23 10:01:50 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 10:01:50 np0005626463.localdomain ceph-mon[294160]: from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 10:01:50 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 10:01:50.833 2 INFO neutron.agent.securitygroups_rpc [None req-5d850956-08e8-4589-ad52-05619933d70c 4ce76e2a79a849e8a6b3b31c05f9bc96 90343b3c0ce240adab2c21e5c92b6952 - - default default] Security group member updated ['abc26baf-67f8-4703-8e61-63db3bbb7b3b', 'fdc27d14-90ab-419e-a68e-458d1f31be69']
Feb 23 10:01:51 np0005626463.localdomain ceph-mds[286877]: mds.mds.np0005626463.qcthuc asok_command: session evict {filters=[auth_name=Joe,client_metadata.root=/volumes/_nogroup/c3aedd71-b342-4920-afd2-d5c6fd4776d2/28f0b5f4-80b5-453b-a6b1-657204142a24],prefix=session evict} (starting...)
Feb 23 10:01:51 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "cff56037-7760-4e22-a017-c787f60f0646", "size": 4294967296, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 23 10:01:51 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "cff56037-7760-4e22-a017-c787f60f0646", "format": "json"}]: dispatch
Feb 23 10:01:51 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/158326346' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 23 10:01:51 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/158326346' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 23 10:01:51 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "c3aedd71-b342-4920-afd2-d5c6fd4776d2", "auth_id": "Joe", "format": "json"}]: dispatch
Feb 23 10:01:51 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "c3aedd71-b342-4920-afd2-d5c6fd4776d2", "auth_id": "Joe", "format": "json"}]: dispatch
Feb 23 10:01:51 np0005626463.localdomain ceph-mon[294160]: pgmap v447: 177 pgs: 177 active+clean; 193 MiB data, 949 MiB used, 41 GiB / 42 GiB avail; 2.1 KiB/s rd, 45 KiB/s wr, 10 op/s
Feb 23 10:01:51 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.
Feb 23 10:01:51 np0005626463.localdomain podman[319802]: 2026-02-23 10:01:51.910100231 +0000 UTC m=+0.085558548 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Feb 23 10:01:51 np0005626463.localdomain podman[319802]: 2026-02-23 10:01:51.918278338 +0000 UTC m=+0.093737135 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb)
Feb 23 10:01:51 np0005626463.localdomain systemd[1]: 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully.
Feb 23 10:01:52 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e207 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 23 10:01:52 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "7b2a2c48-552a-4612-b861-e74517b6032f", "snap_name": "a687b09a-63b9-4132-9be0-38d64393e4b6_ec6ac042-cb02-4a02-a780-56bcc558597b", "force": true, "format": "json"}]: dispatch
Feb 23 10:01:52 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "7b2a2c48-552a-4612-b861-e74517b6032f", "snap_name": "a687b09a-63b9-4132-9be0-38d64393e4b6", "force": true, "format": "json"}]: dispatch
Feb 23 10:01:53 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 10:01:53.182 2 INFO neutron.agent.securitygroups_rpc [None req-6b3edf75-e2b6-4f03-a982-c5a8a9628048 4ce76e2a79a849e8a6b3b31c05f9bc96 90343b3c0ce240adab2c21e5c92b6952 - - default default] Security group member updated ['9a671aa5-4d76-4c1e-8de2-506f29ad907b']
Feb 23 10:01:53 np0005626463.localdomain ceph-mon[294160]: pgmap v448: 177 pgs: 177 active+clean; 193 MiB data, 949 MiB used, 41 GiB / 42 GiB avail; 11 KiB/s rd, 32 KiB/s wr, 19 op/s
Feb 23 10:01:53 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Feb 23 10:01:53 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 10:01:53 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:01:53.869 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:01:54 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.tempest-cephx-id-557795333"} v 0)
Feb 23 10:01:54 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-557795333"} : dispatch
Feb 23 10:01:54 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-557795333"}]': finished
Feb 23 10:01:54 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "e9472a91-114f-4670-a2a3-a4947279ea50", "size": 3221225472, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 23 10:01:54 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "e9472a91-114f-4670-a2a3-a4947279ea50", "format": "json"}]: dispatch
Feb 23 10:01:54 np0005626463.localdomain ceph-mon[294160]: from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 10:01:54 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-557795333", "format": "json"} : dispatch
Feb 23 10:01:54 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-557795333"} : dispatch
Feb 23 10:01:54 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-557795333"} : dispatch
Feb 23 10:01:54 np0005626463.localdomain ceph-mds[286877]: mds.mds.np0005626463.qcthuc asok_command: session evict {filters=[auth_name=tempest-cephx-id-557795333,client_metadata.root=/volumes/_nogroup/c3aedd71-b342-4920-afd2-d5c6fd4776d2/28f0b5f4-80b5-453b-a6b1-657204142a24],prefix=session evict} (starting...)
Feb 23 10:01:55 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e207 do_prune osdmap full prune enabled
Feb 23 10:01:55 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "c3aedd71-b342-4920-afd2-d5c6fd4776d2", "auth_id": "tempest-cephx-id-557795333", "format": "json"}]: dispatch
Feb 23 10:01:55 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-557795333"}]': finished
Feb 23 10:01:55 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "c3aedd71-b342-4920-afd2-d5c6fd4776d2", "auth_id": "tempest-cephx-id-557795333", "format": "json"}]: dispatch
Feb 23 10:01:55 np0005626463.localdomain ceph-mon[294160]: pgmap v449: 177 pgs: 177 active+clean; 193 MiB data, 949 MiB used, 41 GiB / 42 GiB avail; 11 KiB/s rd, 32 KiB/s wr, 19 op/s
Feb 23 10:01:55 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e208 e208: 6 total, 6 up, 6 in
Feb 23 10:01:55 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e208: 6 total, 6 up, 6 in
Feb 23 10:01:56 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "7b2a2c48-552a-4612-b861-e74517b6032f", "snap_name": "1be5156e-560a-4ea6-aa3b-098d527fc684_4337db94-db06-4427-a272-8f09abd769af", "force": true, "format": "json"}]: dispatch
Feb 23 10:01:56 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "7b2a2c48-552a-4612-b861-e74517b6032f", "snap_name": "1be5156e-560a-4ea6-aa3b-098d527fc684", "force": true, "format": "json"}]: dispatch
Feb 23 10:01:56 np0005626463.localdomain ceph-mon[294160]: osdmap e208: 6 total, 6 up, 6 in
Feb 23 10:01:57 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e208 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 23 10:01:57 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "cff56037-7760-4e22-a017-c787f60f0646", "format": "json"}]: dispatch
Feb 23 10:01:57 np0005626463.localdomain ceph-mon[294160]: pgmap v451: 177 pgs: 177 active+clean; 193 MiB data, 950 MiB used, 41 GiB / 42 GiB avail; 13 KiB/s rd, 66 KiB/s wr, 27 op/s
Feb 23 10:01:57 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "cff56037-7760-4e22-a017-c787f60f0646", "force": true, "format": "json"}]: dispatch
Feb 23 10:01:58 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.Joe"} v 0)
Feb 23 10:01:58 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.Joe"} : dispatch
Feb 23 10:01:58 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth rm", "entity": "client.Joe"}]': finished
Feb 23 10:01:58 np0005626463.localdomain ceph-mds[286877]: mds.mds.np0005626463.qcthuc asok_command: session evict {filters=[auth_name=Joe,client_metadata.root=/volumes/_nogroup/7a63f65b-263e-4f0a-be43-9aace02f6e45/c5b2858c-2130-4d42-a52c-3132bbd1bfb7],prefix=session evict} (starting...)
Feb 23 10:01:58 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "7a63f65b-263e-4f0a-be43-9aace02f6e45", "auth_id": "Joe", "format": "json"}]: dispatch
Feb 23 10:01:58 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.Joe", "format": "json"} : dispatch
Feb 23 10:01:58 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.Joe"} : dispatch
Feb 23 10:01:58 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.Joe"} : dispatch
Feb 23 10:01:58 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth rm", "entity": "client.Joe"}]': finished
Feb 23 10:01:58 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "7a63f65b-263e-4f0a-be43-9aace02f6e45", "auth_id": "Joe", "format": "json"}]: dispatch
Feb 23 10:01:58 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:01:58.871 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 10:01:58 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Feb 23 10:01:58 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 10:02:00 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e208 do_prune osdmap full prune enabled
Feb 23 10:02:00 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "90cc3386-9efe-4ed5-a2c9-f8ae76a69843", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 23 10:02:00 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "90cc3386-9efe-4ed5-a2c9-f8ae76a69843", "format": "json"}]: dispatch
Feb 23 10:02:00 np0005626463.localdomain ceph-mon[294160]: from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 10:02:00 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "7b2a2c48-552a-4612-b861-e74517b6032f", "snap_name": "5dbe1f82-6e63-4670-abcb-d97d24ea7f3d_a57b3e42-f791-4c80-9397-393854388632", "force": true, "format": "json"}]: dispatch
Feb 23 10:02:00 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "7b2a2c48-552a-4612-b861-e74517b6032f", "snap_name": "5dbe1f82-6e63-4670-abcb-d97d24ea7f3d", "force": true, "format": "json"}]: dispatch
Feb 23 10:02:00 np0005626463.localdomain ceph-mon[294160]: pgmap v452: 177 pgs: 177 active+clean; 193 MiB data, 950 MiB used, 41 GiB / 42 GiB avail; 13 KiB/s rd, 66 KiB/s wr, 27 op/s
Feb 23 10:02:00 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e209 e209: 6 total, 6 up, 6 in
Feb 23 10:02:00 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e209: 6 total, 6 up, 6 in
Feb 23 10:02:00 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Feb 23 10:02:00 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 10:02:01 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e209 do_prune osdmap full prune enabled
Feb 23 10:02:01 np0005626463.localdomain ceph-mon[294160]: osdmap e209: 6 total, 6 up, 6 in
Feb 23 10:02:01 np0005626463.localdomain ceph-mon[294160]: from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 10:02:01 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e210 e210: 6 total, 6 up, 6 in
Feb 23 10:02:01 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e210: 6 total, 6 up, 6 in
Feb 23 10:02:01 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 10:02:01.123 265541 INFO neutron.agent.linux.ip_lib [None req-52c672d4-2817-4e80-a6b6-55e7dbc6cefe - - - - - -] Device tap1dc46c99-c0 cannot be used as it has no MAC address
Feb 23 10:02:01 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:02:01.147 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:02:01 np0005626463.localdomain kernel: device tap1dc46c99-c0 entered promiscuous mode
Feb 23 10:02:01 np0005626463.localdomain NetworkManager[5974]: <info>  [1771840921.1556] manager: (tap1dc46c99-c0): new Generic device (/org/freedesktop/NetworkManager/Devices/52)
Feb 23 10:02:01 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:02:01.159 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:02:01 np0005626463.localdomain systemd-udevd[319829]: Network interface NamePolicy= disabled on kernel command line.
Feb 23 10:02:01 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T10:02:01Z|00325|binding|INFO|Claiming lport 1dc46c99-c0ab-47c0-8b05-6120a8497956 for this chassis.
Feb 23 10:02:01 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T10:02:01Z|00326|binding|INFO|1dc46c99-c0ab-47c0-8b05-6120a8497956: Claiming unknown
Feb 23 10:02:01 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T10:02:01Z|00327|binding|INFO|Setting lport 1dc46c99-c0ab-47c0-8b05-6120a8497956 ovn-installed in OVS
Feb 23 10:02:01 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:02:01.199 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:02:01 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:02:01.234 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:02:01 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:02:01.264 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:02:01 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T10:02:01Z|00328|binding|INFO|Setting lport 1dc46c99-c0ab-47c0-8b05-6120a8497956 up in Southbound
Feb 23 10:02:01 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 10:02:01.281 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005626463.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:1::2/64', 'neutron:device_id': 'dhcpfb23302c-55c1-5de0-badf-4fc1ff22837a-02994efd-16d1-4091-9839-70c330f56226', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-02994efd-16d1-4091-9839-70c330f56226', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '68a48b471ed84048aeb651374fff5111', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a05abb56-1acf-40c9-887f-cac11ff4663b, chassis=[<ovs.db.idl.Row object at 0x7f808c075610>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f808c075610>], logical_port=1dc46c99-c0ab-47c0-8b05-6120a8497956) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 10:02:01 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 10:02:01.283 163572 INFO neutron.agent.ovn.metadata.agent [-] Port 1dc46c99-c0ab-47c0-8b05-6120a8497956 in datapath 02994efd-16d1-4091-9839-70c330f56226 bound to our chassis
Feb 23 10:02:01 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 10:02:01.285 163572 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 02994efd-16d1-4091-9839-70c330f56226 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Feb 23 10:02:01 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 10:02:01.288 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[1664133f-1441-4b7b-9912-4146b871877a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 10:02:02 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "e9472a91-114f-4670-a2a3-a4947279ea50", "format": "json"}]: dispatch
Feb 23 10:02:02 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "e9472a91-114f-4670-a2a3-a4947279ea50", "force": true, "format": "json"}]: dispatch
Feb 23 10:02:02 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "3ce246a5-9df7-419d-94b3-cd5751260b5f", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 23 10:02:02 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "3ce246a5-9df7-419d-94b3-cd5751260b5f", "format": "json"}]: dispatch
Feb 23 10:02:02 np0005626463.localdomain ceph-mon[294160]: osdmap e210: 6 total, 6 up, 6 in
Feb 23 10:02:02 np0005626463.localdomain ceph-mon[294160]: pgmap v455: 177 pgs: 177 active+clean; 194 MiB data, 951 MiB used, 41 GiB / 42 GiB avail; 1.5 KiB/s rd, 146 KiB/s wr, 19 op/s
Feb 23 10:02:02 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.admin", "format": "json"} : dispatch
Feb 23 10:02:02 np0005626463.localdomain podman[319883]: 
Feb 23 10:02:02 np0005626463.localdomain podman[319883]: 2026-02-23 10:02:02.12316297 +0000 UTC m=+0.103431879 container create e4e6266ea23b53936bba2075cfc553a8c045b26e4497c28e2b060fa44c524805 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-02994efd-16d1-4091-9839-70c330f56226, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Feb 23 10:02:02 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.
Feb 23 10:02:02 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.
Feb 23 10:02:02 np0005626463.localdomain systemd[1]: Started libpod-conmon-e4e6266ea23b53936bba2075cfc553a8c045b26e4497c28e2b060fa44c524805.scope.
Feb 23 10:02:02 np0005626463.localdomain podman[319883]: 2026-02-23 10:02:02.073155979 +0000 UTC m=+0.053424908 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 23 10:02:02 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 10:02:02 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b5625997ec8f95dd5a95ab1651fea2d62360cf9a611d1d9f87c7783ab5b51320/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 23 10:02:02 np0005626463.localdomain podman[319883]: 2026-02-23 10:02:02.205554437 +0000 UTC m=+0.185823336 container init e4e6266ea23b53936bba2075cfc553a8c045b26e4497c28e2b060fa44c524805 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-02994efd-16d1-4091-9839-70c330f56226, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Feb 23 10:02:02 np0005626463.localdomain podman[319883]: 2026-02-23 10:02:02.216209502 +0000 UTC m=+0.196478401 container start e4e6266ea23b53936bba2075cfc553a8c045b26e4497c28e2b060fa44c524805 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-02994efd-16d1-4091-9839-70c330f56226, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20260216, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.43.0)
Feb 23 10:02:02 np0005626463.localdomain dnsmasq[319923]: started, version 2.85 cachesize 150
Feb 23 10:02:02 np0005626463.localdomain dnsmasq[319923]: DNS service limited to local subnets
Feb 23 10:02:02 np0005626463.localdomain dnsmasq[319923]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 23 10:02:02 np0005626463.localdomain dnsmasq[319923]: warning: no upstream servers configured
Feb 23 10:02:02 np0005626463.localdomain dnsmasq-dhcp[319923]: DHCPv6, static leases only on 2001:db8:1::, lease time 1d
Feb 23 10:02:02 np0005626463.localdomain dnsmasq[319923]: read /var/lib/neutron/dhcp/02994efd-16d1-4091-9839-70c330f56226/addn_hosts - 0 addresses
Feb 23 10:02:02 np0005626463.localdomain dnsmasq-dhcp[319923]: read /var/lib/neutron/dhcp/02994efd-16d1-4091-9839-70c330f56226/host
Feb 23 10:02:02 np0005626463.localdomain dnsmasq-dhcp[319923]: read /var/lib/neutron/dhcp/02994efd-16d1-4091-9839-70c330f56226/opts
Feb 23 10:02:02 np0005626463.localdomain podman[319897]: 2026-02-23 10:02:02.268263137 +0000 UTC m=+0.100011372 container health_status 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, version=9.7, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2026-02-05T04:57:10Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9/ubi-minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, release=1770267347, com.redhat.component=ubi9-minimal-container, vcs-type=git, io.buildah.version=1.33.7, vendor=Red Hat, Inc., distribution-scope=public, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-02-05T04:57:10Z, config_id=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Red Hat, Inc., managed_by=edpm_ansible)
Feb 23 10:02:02 np0005626463.localdomain podman[319897]: 2026-02-23 10:02:02.277719734 +0000 UTC m=+0.109467919 container exec_died 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, io.buildah.version=1.33.7, vendor=Red Hat, Inc., release=1770267347, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, org.opencontainers.image.created=2026-02-05T04:57:10Z, name=ubi9/ubi-minimal, container_name=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=9.7, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, distribution-scope=public, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, build-date=2026-02-05T04:57:10Z, config_id=openstack_network_exporter, architecture=x86_64, io.openshift.tags=minimal rhel9)
Feb 23 10:02:02 np0005626463.localdomain systemd[1]: 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.service: Deactivated successfully.
Feb 23 10:02:02 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 10:02:02.339 265541 INFO neutron.agent.dhcp.agent [None req-6b4183d0-b8bf-44ce-82c6-0e22de4cb49a - - - - - -] DHCP configuration for ports {'12437aba-a85e-444a-8c2a-043bc899c9b2'} is completed
Feb 23 10:02:02 np0005626463.localdomain podman[319898]: 2026-02-23 10:02:02.370504628 +0000 UTC m=+0.197669099 container health_status da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 23 10:02:02 np0005626463.localdomain podman[319898]: 2026-02-23 10:02:02.41131933 +0000 UTC m=+0.238483811 container exec_died da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 23 10:02:02 np0005626463.localdomain systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Deactivated successfully.
Feb 23 10:02:02 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e210 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 23 10:02:02 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e210 do_prune osdmap full prune enabled
Feb 23 10:02:02 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e211 e211: 6 total, 6 up, 6 in
Feb 23 10:02:02 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e211: 6 total, 6 up, 6 in
Feb 23 10:02:02 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 10:02:02.646 265541 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-23T10:02:02Z, description=, device_id=d11c654a-ee3c-4ca8-93a5-f268ff8b2e3b, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f282913c7c0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f282913c8e0>], id=11716704-7d2e-4ac8-ab68-6757822cada4, ip_allocation=immediate, mac_address=fa:16:3e:fc:79:0a, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-23T10:01:58Z, description=, dns_domain=, id=02994efd-16d1-4091-9839-70c330f56226, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersIpV6Test-1823901390, port_security_enabled=True, project_id=68a48b471ed84048aeb651374fff5111, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=41387, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2973, status=ACTIVE, subnets=['cdc6b546-53fa-420b-b344-ddddb12b955f'], tags=[], tenant_id=68a48b471ed84048aeb651374fff5111, updated_at=2026-02-23T10:02:00Z, vlan_transparent=None, network_id=02994efd-16d1-4091-9839-70c330f56226, port_security_enabled=False, project_id=68a48b471ed84048aeb651374fff5111, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2982, status=DOWN, tags=[], tenant_id=68a48b471ed84048aeb651374fff5111, updated_at=2026-02-23T10:02:02Z on network 02994efd-16d1-4091-9839-70c330f56226
Feb 23 10:02:02 np0005626463.localdomain dnsmasq[319923]: read /var/lib/neutron/dhcp/02994efd-16d1-4091-9839-70c330f56226/addn_hosts - 1 addresses
Feb 23 10:02:02 np0005626463.localdomain dnsmasq-dhcp[319923]: read /var/lib/neutron/dhcp/02994efd-16d1-4091-9839-70c330f56226/host
Feb 23 10:02:02 np0005626463.localdomain podman[319960]: 2026-02-23 10:02:02.830673199 +0000 UTC m=+0.057964940 container kill e4e6266ea23b53936bba2075cfc553a8c045b26e4497c28e2b060fa44c524805 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-02994efd-16d1-4091-9839-70c330f56226, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.43.0)
Feb 23 10:02:02 np0005626463.localdomain dnsmasq-dhcp[319923]: read /var/lib/neutron/dhcp/02994efd-16d1-4091-9839-70c330f56226/opts
Feb 23 10:02:03 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "ce93cb3d-9c0e-41e4-ad7d-9451b8b5d7b4", "auth_id": "admin", "tenant_id": "1b9d2e21adaa4adab3e6f69b48abf75a", "access_level": "rw", "format": "json"}]: dispatch
Feb 23 10:02:03 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "90cc3386-9efe-4ed5-a2c9-f8ae76a69843", "snap_name": "e9f999c5-8797-4bf3-86d8-5475823561e0", "format": "json"}]: dispatch
Feb 23 10:02:03 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "7b2a2c48-552a-4612-b861-e74517b6032f", "snap_name": "a6576274-fdc3-4c38-8214-f23c6386f2cf_f3807254-0afd-4ece-ae6b-65f6e63da3b8", "force": true, "format": "json"}]: dispatch
Feb 23 10:02:03 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "7b2a2c48-552a-4612-b861-e74517b6032f", "snap_name": "a6576274-fdc3-4c38-8214-f23c6386f2cf", "force": true, "format": "json"}]: dispatch
Feb 23 10:02:03 np0005626463.localdomain ceph-mon[294160]: osdmap e211: 6 total, 6 up, 6 in
Feb 23 10:02:03 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 10:02:03.172 265541 INFO neutron.agent.dhcp.agent [None req-a6593d7d-3d1a-41bb-9271-b655eb5b020a - - - - - -] DHCP configuration for ports {'11716704-7d2e-4ac8-ab68-6757822cada4'} is completed
Feb 23 10:02:03 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 10:02:03.640 265541 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-23T10:02:02Z, description=, device_id=d11c654a-ee3c-4ca8-93a5-f268ff8b2e3b, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f2829189a00>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f2829300f10>], id=11716704-7d2e-4ac8-ab68-6757822cada4, ip_allocation=immediate, mac_address=fa:16:3e:fc:79:0a, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-23T10:01:58Z, description=, dns_domain=, id=02994efd-16d1-4091-9839-70c330f56226, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersIpV6Test-1823901390, port_security_enabled=True, project_id=68a48b471ed84048aeb651374fff5111, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=41387, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2973, status=ACTIVE, subnets=['cdc6b546-53fa-420b-b344-ddddb12b955f'], tags=[], tenant_id=68a48b471ed84048aeb651374fff5111, updated_at=2026-02-23T10:02:00Z, vlan_transparent=None, network_id=02994efd-16d1-4091-9839-70c330f56226, port_security_enabled=False, project_id=68a48b471ed84048aeb651374fff5111, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2982, status=DOWN, tags=[], tenant_id=68a48b471ed84048aeb651374fff5111, updated_at=2026-02-23T10:02:02Z on network 02994efd-16d1-4091-9839-70c330f56226
Feb 23 10:02:03 np0005626463.localdomain dnsmasq[319923]: read /var/lib/neutron/dhcp/02994efd-16d1-4091-9839-70c330f56226/addn_hosts - 1 addresses
Feb 23 10:02:03 np0005626463.localdomain dnsmasq-dhcp[319923]: read /var/lib/neutron/dhcp/02994efd-16d1-4091-9839-70c330f56226/host
Feb 23 10:02:03 np0005626463.localdomain podman[319998]: 2026-02-23 10:02:03.832562254 +0000 UTC m=+0.062379179 container kill e4e6266ea23b53936bba2075cfc553a8c045b26e4497c28e2b060fa44c524805 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-02994efd-16d1-4091-9839-70c330f56226, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, org.label-schema.license=GPLv2)
Feb 23 10:02:03 np0005626463.localdomain dnsmasq-dhcp[319923]: read /var/lib/neutron/dhcp/02994efd-16d1-4091-9839-70c330f56226/opts
Feb 23 10:02:03 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:02:03.875 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:02:04 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 10:02:04.113 265541 INFO neutron.agent.dhcp.agent [None req-df82d37c-32a0-42a4-83dd-76b9a0e4fa20 - - - - - -] DHCP configuration for ports {'11716704-7d2e-4ac8-ab68-6757822cada4'} is completed
Feb 23 10:02:04 np0005626463.localdomain ceph-mon[294160]: pgmap v457: 177 pgs: 177 active+clean; 194 MiB data, 952 MiB used, 41 GiB / 42 GiB avail; 1.2 KiB/s rd, 76 KiB/s wr, 11 op/s
Feb 23 10:02:04 np0005626463.localdomain dnsmasq[319923]: read /var/lib/neutron/dhcp/02994efd-16d1-4091-9839-70c330f56226/addn_hosts - 0 addresses
Feb 23 10:02:04 np0005626463.localdomain dnsmasq-dhcp[319923]: read /var/lib/neutron/dhcp/02994efd-16d1-4091-9839-70c330f56226/host
Feb 23 10:02:04 np0005626463.localdomain dnsmasq-dhcp[319923]: read /var/lib/neutron/dhcp/02994efd-16d1-4091-9839-70c330f56226/opts
Feb 23 10:02:04 np0005626463.localdomain podman[320034]: 2026-02-23 10:02:04.473003059 +0000 UTC m=+0.061841563 container kill e4e6266ea23b53936bba2075cfc553a8c045b26e4497c28e2b060fa44c524805 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-02994efd-16d1-4091-9839-70c330f56226, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, tcib_managed=true)
Feb 23 10:02:04 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.david", "caps": ["mds", "allow rw path=/volumes/_nogroup/ce93cb3d-9c0e-41e4-ad7d-9451b8b5d7b4/555eb612-b64e-47a9-a736-f6e77cc4a819", "osd", "allow rw pool=manila_data namespace=fsvolumens_ce93cb3d-9c0e-41e4-ad7d-9451b8b5d7b4", "mon", "allow r"], "format": "json"} v 0)
Feb 23 10:02:04 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.david", "caps": ["mds", "allow rw path=/volumes/_nogroup/ce93cb3d-9c0e-41e4-ad7d-9451b8b5d7b4/555eb612-b64e-47a9-a736-f6e77cc4a819", "osd", "allow rw pool=manila_data namespace=fsvolumens_ce93cb3d-9c0e-41e4-ad7d-9451b8b5d7b4", "mon", "allow r"], "format": "json"} : dispatch
Feb 23 10:02:04 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.david", "caps": ["mds", "allow rw path=/volumes/_nogroup/ce93cb3d-9c0e-41e4-ad7d-9451b8b5d7b4/555eb612-b64e-47a9-a736-f6e77cc4a819", "osd", "allow rw pool=manila_data namespace=fsvolumens_ce93cb3d-9c0e-41e4-ad7d-9451b8b5d7b4", "mon", "allow r"], "format": "json"}]': finished
Feb 23 10:02:04 np0005626463.localdomain kernel: device tap1dc46c99-c0 left promiscuous mode
Feb 23 10:02:04 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:02:04.725 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:02:04 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T10:02:04Z|00329|binding|INFO|Releasing lport 1dc46c99-c0ab-47c0-8b05-6120a8497956 from this chassis (sb_readonly=0)
Feb 23 10:02:04 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T10:02:04Z|00330|binding|INFO|Setting lport 1dc46c99-c0ab-47c0-8b05-6120a8497956 down in Southbound
Feb 23 10:02:04 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 10:02:04.740 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005626463.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:1::2/64', 'neutron:device_id': 'dhcpfb23302c-55c1-5de0-badf-4fc1ff22837a-02994efd-16d1-4091-9839-70c330f56226', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-02994efd-16d1-4091-9839-70c330f56226', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '68a48b471ed84048aeb651374fff5111', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005626463.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a05abb56-1acf-40c9-887f-cac11ff4663b, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f808c075610>], logical_port=1dc46c99-c0ab-47c0-8b05-6120a8497956) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f808c075610>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 10:02:04 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 10:02:04.743 163572 INFO neutron.agent.ovn.metadata.agent [-] Port 1dc46c99-c0ab-47c0-8b05-6120a8497956 in datapath 02994efd-16d1-4091-9839-70c330f56226 unbound from our chassis
Feb 23 10:02:04 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 10:02:04.745 163572 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 02994efd-16d1-4091-9839-70c330f56226 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Feb 23 10:02:04 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 10:02:04.746 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[1fd0397e-6ec1-4464-a43e-09584b77d699]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 10:02:04 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:02:04.756 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:02:05 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "3ce246a5-9df7-419d-94b3-cd5751260b5f", "snap_name": "4304fccc-23e4-4511-ab8d-92fee67e08e5", "format": "json"}]: dispatch
Feb 23 10:02:05 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.david", "format": "json"} : dispatch
Feb 23 10:02:05 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.david", "caps": ["mds", "allow rw path=/volumes/_nogroup/ce93cb3d-9c0e-41e4-ad7d-9451b8b5d7b4/555eb612-b64e-47a9-a736-f6e77cc4a819", "osd", "allow rw pool=manila_data namespace=fsvolumens_ce93cb3d-9c0e-41e4-ad7d-9451b8b5d7b4", "mon", "allow r"], "format": "json"} : dispatch
Feb 23 10:02:05 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.david", "caps": ["mds", "allow rw path=/volumes/_nogroup/ce93cb3d-9c0e-41e4-ad7d-9451b8b5d7b4/555eb612-b64e-47a9-a736-f6e77cc4a819", "osd", "allow rw pool=manila_data namespace=fsvolumens_ce93cb3d-9c0e-41e4-ad7d-9451b8b5d7b4", "mon", "allow r"], "format": "json"} : dispatch
Feb 23 10:02:05 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.david", "caps": ["mds", "allow rw path=/volumes/_nogroup/ce93cb3d-9c0e-41e4-ad7d-9451b8b5d7b4/555eb612-b64e-47a9-a736-f6e77cc4a819", "osd", "allow rw pool=manila_data namespace=fsvolumens_ce93cb3d-9c0e-41e4-ad7d-9451b8b5d7b4", "mon", "allow r"], "format": "json"}]': finished
Feb 23 10:02:05 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/989643730' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 23 10:02:05 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/989643730' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 23 10:02:05 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #58. Immutable memtables: 0.
Feb 23 10:02:05 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:02:05.190899) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Feb 23 10:02:05 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/flush_job.cc:856] [default] [JOB 33] Flushing memtable with next log file: 58
Feb 23 10:02:05 np0005626463.localdomain ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840925190935, "job": 33, "event": "flush_started", "num_memtables": 1, "num_entries": 2560, "num_deletes": 275, "total_data_size": 3244680, "memory_usage": 3296672, "flush_reason": "Manual Compaction"}
Feb 23 10:02:05 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/flush_job.cc:885] [default] [JOB 33] Level-0 flush table #59: started
Feb 23 10:02:05 np0005626463.localdomain ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840925212796, "cf_name": "default", "job": 33, "event": "table_file_creation", "file_number": 59, "file_size": 3180588, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 30559, "largest_seqno": 33118, "table_properties": {"data_size": 3169010, "index_size": 7443, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 3077, "raw_key_size": 28004, "raw_average_key_size": 22, "raw_value_size": 3144663, "raw_average_value_size": 2579, "num_data_blocks": 311, "num_entries": 1219, "num_filter_entries": 1219, "num_deletions": 275, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771840797, "oldest_key_time": 1771840797, "file_creation_time": 1771840925, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4cfd6c8f-aafa-4003-b2f6-d22c49635dd4", "db_session_id": "66DAQ76CBLV8DSGL8JC7", "orig_file_number": 59, "seqno_to_time_mapping": "N/A"}}
Feb 23 10:02:05 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 33] Flush lasted 21994 microseconds, and 7458 cpu microseconds.
Feb 23 10:02:05 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 23 10:02:05 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:02:05.212850) [db/flush_job.cc:967] [default] [JOB 33] Level-0 flush table #59: 3180588 bytes OK
Feb 23 10:02:05 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:02:05.212915) [db/memtable_list.cc:519] [default] Level-0 commit table #59 started
Feb 23 10:02:05 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:02:05.215294) [db/memtable_list.cc:722] [default] Level-0 commit table #59: memtable #1 done
Feb 23 10:02:05 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:02:05.215315) EVENT_LOG_v1 {"time_micros": 1771840925215309, "job": 33, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Feb 23 10:02:05 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:02:05.215339) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Feb 23 10:02:05 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 33] Try to delete WAL files size 3233133, prev total WAL file size 3233133, number of live WAL files 2.
Feb 23 10:02:05 np0005626463.localdomain ceph-mon[294160]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005626463/store.db/000055.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 23 10:02:05 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:02:05.216194) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003132323939' seq:72057594037927935, type:22 .. '7061786F73003132353531' seq:0, type:0; will stop at (end)
Feb 23 10:02:05 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 34] Compacting 1@0 + 1@6 files to L6, score -1.00
Feb 23 10:02:05 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 33 Base level 0, inputs: [59(3106KB)], [57(14MB)]
Feb 23 10:02:05 np0005626463.localdomain ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840925216272, "job": 34, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [59], "files_L6": [57], "score": -1, "input_data_size": 18594771, "oldest_snapshot_seqno": -1}
Feb 23 10:02:05 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 34] Generated table #60: 13268 keys, 17389646 bytes, temperature: kUnknown
Feb 23 10:02:05 np0005626463.localdomain ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840925324551, "cf_name": "default", "job": 34, "event": "table_file_creation", "file_number": 60, "file_size": 17389646, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 17314048, "index_size": 41321, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 33221, "raw_key_size": 355930, "raw_average_key_size": 26, "raw_value_size": 17088119, "raw_average_value_size": 1287, "num_data_blocks": 1557, "num_entries": 13268, "num_filter_entries": 13268, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771839971, "oldest_key_time": 0, "file_creation_time": 1771840925, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4cfd6c8f-aafa-4003-b2f6-d22c49635dd4", "db_session_id": "66DAQ76CBLV8DSGL8JC7", "orig_file_number": 60, "seqno_to_time_mapping": "N/A"}}
Feb 23 10:02:05 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 23 10:02:05 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:02:05.325011) [db/compaction/compaction_job.cc:1663] [default] [JOB 34] Compacted 1@0 + 1@6 files to L6 => 17389646 bytes
Feb 23 10:02:05 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:02:05.328006) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 171.6 rd, 160.5 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.0, 14.7 +0.0 blob) out(16.6 +0.0 blob), read-write-amplify(11.3) write-amplify(5.5) OK, records in: 13832, records dropped: 564 output_compression: NoCompression
Feb 23 10:02:05 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:02:05.328037) EVENT_LOG_v1 {"time_micros": 1771840925328023, "job": 34, "event": "compaction_finished", "compaction_time_micros": 108357, "compaction_time_cpu_micros": 47967, "output_level": 6, "num_output_files": 1, "total_output_size": 17389646, "num_input_records": 13832, "num_output_records": 13268, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Feb 23 10:02:05 np0005626463.localdomain ceph-mon[294160]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005626463/store.db/000059.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 23 10:02:05 np0005626463.localdomain ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840925328623, "job": 34, "event": "table_file_deletion", "file_number": 59}
Feb 23 10:02:05 np0005626463.localdomain ceph-mon[294160]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005626463/store.db/000057.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 23 10:02:05 np0005626463.localdomain ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840925331478, "job": 34, "event": "table_file_deletion", "file_number": 57}
Feb 23 10:02:05 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:02:05.216085) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 23 10:02:05 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:02:05.331717) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 23 10:02:05 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:02:05.331739) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 23 10:02:05 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:02:05.331743) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 23 10:02:05 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:02:05.331746) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 23 10:02:05 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:02:05.331749) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 23 10:02:05 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e211 do_prune osdmap full prune enabled
Feb 23 10:02:05 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e212 e212: 6 total, 6 up, 6 in
Feb 23 10:02:05 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e212: 6 total, 6 up, 6 in
Feb 23 10:02:06 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "ce93cb3d-9c0e-41e4-ad7d-9451b8b5d7b4", "auth_id": "david", "tenant_id": "1b9d2e21adaa4adab3e6f69b48abf75a", "access_level": "rw", "format": "json"}]: dispatch
Feb 23 10:02:06 np0005626463.localdomain ceph-mon[294160]: pgmap v458: 177 pgs: 177 active+clean; 194 MiB data, 952 MiB used, 41 GiB / 42 GiB avail; 1.2 KiB/s rd, 76 KiB/s wr, 11 op/s
Feb 23 10:02:06 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "7b2a2c48-552a-4612-b861-e74517b6032f", "format": "json"}]: dispatch
Feb 23 10:02:06 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "7b2a2c48-552a-4612-b861-e74517b6032f", "force": true, "format": "json"}]: dispatch
Feb 23 10:02:06 np0005626463.localdomain ceph-mon[294160]: osdmap e212: 6 total, 6 up, 6 in
Feb 23 10:02:06 np0005626463.localdomain dnsmasq[319923]: exiting on receipt of SIGTERM
Feb 23 10:02:06 np0005626463.localdomain systemd[1]: libpod-e4e6266ea23b53936bba2075cfc553a8c045b26e4497c28e2b060fa44c524805.scope: Deactivated successfully.
Feb 23 10:02:06 np0005626463.localdomain podman[320075]: 2026-02-23 10:02:06.271012835 +0000 UTC m=+0.071607339 container kill e4e6266ea23b53936bba2075cfc553a8c045b26e4497c28e2b060fa44c524805 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-02994efd-16d1-4091-9839-70c330f56226, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Feb 23 10:02:06 np0005626463.localdomain podman[320087]: 2026-02-23 10:02:06.347508238 +0000 UTC m=+0.065105475 container died e4e6266ea23b53936bba2075cfc553a8c045b26e4497c28e2b060fa44c524805 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-02994efd-16d1-4091-9839-70c330f56226, io.buildah.version=1.43.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 23 10:02:06 np0005626463.localdomain podman[320087]: 2026-02-23 10:02:06.386061009 +0000 UTC m=+0.103658206 container cleanup e4e6266ea23b53936bba2075cfc553a8c045b26e4497c28e2b060fa44c524805 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-02994efd-16d1-4091-9839-70c330f56226, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 23 10:02:06 np0005626463.localdomain systemd[1]: libpod-conmon-e4e6266ea23b53936bba2075cfc553a8c045b26e4497c28e2b060fa44c524805.scope: Deactivated successfully.
Feb 23 10:02:06 np0005626463.localdomain podman[320089]: 2026-02-23 10:02:06.423710341 +0000 UTC m=+0.133568196 container remove e4e6266ea23b53936bba2075cfc553a8c045b26e4497c28e2b060fa44c524805 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-02994efd-16d1-4091-9839-70c330f56226, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Feb 23 10:02:06 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 10:02:06.457 265541 INFO neutron.agent.dhcp.agent [None req-264e6601-72c4-439f-84bc-7e07da93162e - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 23 10:02:06 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 10:02:06.644 265541 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Feb 23 10:02:06 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T10:02:06Z|00331|binding|INFO|Releasing lport 4143c8ea-7577-4792-9744-bcff90eb20f2 from this chassis (sb_readonly=0)
Feb 23 10:02:07 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:02:07.063 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:02:07 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "90cc3386-9efe-4ed5-a2c9-f8ae76a69843", "snap_name": "dc78da5f-f2b8-4261-af0a-8d02dae8fe58", "format": "json"}]: dispatch
Feb 23 10:02:07 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-b5625997ec8f95dd5a95ab1651fea2d62360cf9a611d1d9f87c7783ab5b51320-merged.mount: Deactivated successfully.
Feb 23 10:02:07 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e4e6266ea23b53936bba2075cfc553a8c045b26e4497c28e2b060fa44c524805-userdata-shm.mount: Deactivated successfully.
Feb 23 10:02:07 np0005626463.localdomain systemd[1]: run-netns-qdhcp\x2d02994efd\x2d16d1\x2d4091\x2d9839\x2d70c330f56226.mount: Deactivated successfully.
Feb 23 10:02:07 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e212 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 23 10:02:07 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e212 do_prune osdmap full prune enabled
Feb 23 10:02:07 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e213 e213: 6 total, 6 up, 6 in
Feb 23 10:02:07 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e213: 6 total, 6 up, 6 in
Feb 23 10:02:07 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Feb 23 10:02:07 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 10:02:08 np0005626463.localdomain ceph-mon[294160]: pgmap v460: 177 pgs: 177 active+clean; 194 MiB data, 953 MiB used, 41 GiB / 42 GiB avail; 23 KiB/s rd, 140 KiB/s wr, 49 op/s
Feb 23 10:02:08 np0005626463.localdomain ceph-mon[294160]: osdmap e213: 6 total, 6 up, 6 in
Feb 23 10:02:08 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/1962019075' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 23 10:02:08 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/1962019075' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 23 10:02:08 np0005626463.localdomain ceph-mon[294160]: from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 10:02:08 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:02:08.877 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:02:09 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 10:02:09.223 2 INFO neutron.agent.securitygroups_rpc [None req-34a9c0cf-78ea-4a50-a570-1c89dfa87f59 ccd9ce6e3fef42b59d2107f1a22eac97 68a48b471ed84048aeb651374fff5111 - - default default] Security group member updated ['712b70a2-0074-4f4c-8d5a-c22b0f563b07']
Feb 23 10:02:09 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "3ce246a5-9df7-419d-94b3-cd5751260b5f", "snap_name": "4304fccc-23e4-4511-ab8d-92fee67e08e5_64c05dae-7e24-484d-a0fc-c9c71e45796d", "force": true, "format": "json"}]: dispatch
Feb 23 10:02:09 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "3ce246a5-9df7-419d-94b3-cd5751260b5f", "snap_name": "4304fccc-23e4-4511-ab8d-92fee67e08e5", "force": true, "format": "json"}]: dispatch
Feb 23 10:02:09 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "d958ade4-5b7f-45eb-b23d-cb42046e5d2f", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 23 10:02:09 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "d958ade4-5b7f-45eb-b23d-cb42046e5d2f", "format": "json"}]: dispatch
Feb 23 10:02:09 np0005626463.localdomain podman[242954]: time="2026-02-23T10:02:09Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 23 10:02:09 np0005626463.localdomain podman[242954]: @ - - [23/Feb/2026:10:02:09 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 157081 "" "Go-http-client/1.1"
Feb 23 10:02:09 np0005626463.localdomain podman[242954]: @ - - [23/Feb/2026:10:02:09 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18825 "" "Go-http-client/1.1"
Feb 23 10:02:10 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e213 do_prune osdmap full prune enabled
Feb 23 10:02:10 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e214 e214: 6 total, 6 up, 6 in
Feb 23 10:02:10 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e214: 6 total, 6 up, 6 in
Feb 23 10:02:10 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "90cc3386-9efe-4ed5-a2c9-f8ae76a69843", "snap_name": "dc78da5f-f2b8-4261-af0a-8d02dae8fe58_20af97ff-9f0c-4a59-ae24-cffe3dd99ad7", "force": true, "format": "json"}]: dispatch
Feb 23 10:02:10 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "90cc3386-9efe-4ed5-a2c9-f8ae76a69843", "snap_name": "dc78da5f-f2b8-4261-af0a-8d02dae8fe58", "force": true, "format": "json"}]: dispatch
Feb 23 10:02:10 np0005626463.localdomain ceph-mon[294160]: pgmap v462: 177 pgs: 177 active+clean; 194 MiB data, 953 MiB used, 41 GiB / 42 GiB avail; 20 KiB/s rd, 60 KiB/s wr, 34 op/s
Feb 23 10:02:11 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e214 do_prune osdmap full prune enabled
Feb 23 10:02:11 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e215 e215: 6 total, 6 up, 6 in
Feb 23 10:02:11 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e215: 6 total, 6 up, 6 in
Feb 23 10:02:11 np0005626463.localdomain ceph-mon[294160]: osdmap e214: 6 total, 6 up, 6 in
Feb 23 10:02:11 np0005626463.localdomain ceph-mon[294160]: osdmap e215: 6 total, 6 up, 6 in
Feb 23 10:02:11 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 10:02:11.788 2 INFO neutron.agent.securitygroups_rpc [None req-90d4154c-098d-45e4-8f77-47336730be40 ccd9ce6e3fef42b59d2107f1a22eac97 68a48b471ed84048aeb651374fff5111 - - default default] Security group member updated ['712b70a2-0074-4f4c-8d5a-c22b0f563b07']
Feb 23 10:02:12 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "3ce246a5-9df7-419d-94b3-cd5751260b5f", "format": "json"}]: dispatch
Feb 23 10:02:12 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "3ce246a5-9df7-419d-94b3-cd5751260b5f", "force": true, "format": "json"}]: dispatch
Feb 23 10:02:12 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "d958ade4-5b7f-45eb-b23d-cb42046e5d2f", "auth_id": "david", "tenant_id": "4d2b2d5862b8427aac5a9c709976e3ff", "access_level": "rw", "format": "json"}]: dispatch
Feb 23 10:02:12 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.david", "format": "json"} : dispatch
Feb 23 10:02:12 np0005626463.localdomain ceph-mon[294160]: pgmap v464: 177 pgs: 177 active+clean; 194 MiB data, 954 MiB used, 41 GiB / 42 GiB avail; 27 KiB/s rd, 127 KiB/s wr, 52 op/s
Feb 23 10:02:12 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e215 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 23 10:02:12 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e215 do_prune osdmap full prune enabled
Feb 23 10:02:12 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e216 e216: 6 total, 6 up, 6 in
Feb 23 10:02:12 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e216: 6 total, 6 up, 6 in
Feb 23 10:02:13 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "90cc3386-9efe-4ed5-a2c9-f8ae76a69843", "snap_name": "e9f999c5-8797-4bf3-86d8-5475823561e0_76bf6727-66ba-4b02-aadb-4af6ec6b8ba3", "force": true, "format": "json"}]: dispatch
Feb 23 10:02:13 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "90cc3386-9efe-4ed5-a2c9-f8ae76a69843", "snap_name": "e9f999c5-8797-4bf3-86d8-5475823561e0", "force": true, "format": "json"}]: dispatch
Feb 23 10:02:13 np0005626463.localdomain ceph-mon[294160]: osdmap e216: 6 total, 6 up, 6 in
Feb 23 10:02:13 np0005626463.localdomain openstack_network_exporter[245358]: ERROR   10:02:13 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 23 10:02:13 np0005626463.localdomain openstack_network_exporter[245358]: ERROR   10:02:13 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 23 10:02:13 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:02:13.880 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 10:02:13 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:02:13.881 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:02:13 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:02:13.881 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 10:02:13 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:02:13.881 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 10:02:13 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:02:13.882 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 10:02:13 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:02:13.884 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:02:14 np0005626463.localdomain ceph-mon[294160]: pgmap v467: 177 pgs: 177 active+clean; 194 MiB data, 954 MiB used, 41 GiB / 42 GiB avail; 22 KiB/s rd, 62 KiB/s wr, 36 op/s
Feb 23 10:02:14 np0005626463.localdomain ceph-mds[286877]: mds.mds.np0005626463.qcthuc asok_command: session evict {filters=[auth_name=david,client_metadata.root=/volumes/_nogroup/d958ade4-5b7f-45eb-b23d-cb42046e5d2f/7b3f1ec9-37eb-4781-a710-21b4c27d3f21],prefix=session evict} (starting...)
Feb 23 10:02:14 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.
Feb 23 10:02:14 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.
Feb 23 10:02:14 np0005626463.localdomain podman[320117]: 2026-02-23 10:02:14.919354303 +0000 UTC m=+0.087272563 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260216)
Feb 23 10:02:14 np0005626463.localdomain systemd[1]: tmp-crun.vu5Fs6.mount: Deactivated successfully.
Feb 23 10:02:14 np0005626463.localdomain podman[320117]: 2026-02-23 10:02:14.981652609 +0000 UTC m=+0.149570849 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.build-date=20260216, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Feb 23 10:02:14 np0005626463.localdomain podman[320118]: 2026-02-23 10:02:14.984464908 +0000 UTC m=+0.146782542 container health_status bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 23 10:02:15 np0005626463.localdomain podman[320118]: 2026-02-23 10:02:15.017463304 +0000 UTC m=+0.179780888 container exec_died bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Feb 23 10:02:15 np0005626463.localdomain systemd[1]: bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.service: Deactivated successfully.
Feb 23 10:02:15 np0005626463.localdomain systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully.
Feb 23 10:02:15 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 10:02:15.348 265541 INFO neutron.agent.linux.ip_lib [None req-391867ec-2d21-4208-aff1-162f6dbd896d - - - - - -] Device tap1634e81d-e7 cannot be used as it has no MAC address
Feb 23 10:02:15 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:02:15.375 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:02:15 np0005626463.localdomain kernel: device tap1634e81d-e7 entered promiscuous mode
Feb 23 10:02:15 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:02:15.386 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:02:15 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T10:02:15Z|00332|binding|INFO|Claiming lport 1634e81d-e702-4faf-9e73-2065f8f0e08b for this chassis.
Feb 23 10:02:15 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T10:02:15Z|00333|binding|INFO|1634e81d-e702-4faf-9e73-2065f8f0e08b: Claiming unknown
Feb 23 10:02:15 np0005626463.localdomain NetworkManager[5974]: <info>  [1771840935.3947] manager: (tap1634e81d-e7): new Generic device (/org/freedesktop/NetworkManager/Devices/53)
Feb 23 10:02:15 np0005626463.localdomain systemd-udevd[320176]: Network interface NamePolicy= disabled on kernel command line.
Feb 23 10:02:15 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 10:02:15.401 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005626463.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpfb23302c-55c1-5de0-badf-4fc1ff22837a-e63b9444-12b5-401f-bd30-af34ee321bad', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e63b9444-12b5-401f-bd30-af34ee321bad', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '68a48b471ed84048aeb651374fff5111', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f42d0d0d-6ed8-4a36-9fc6-62bf3138a3ca, chassis=[<ovs.db.idl.Row object at 0x7f808c075610>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f808c075610>], logical_port=1634e81d-e702-4faf-9e73-2065f8f0e08b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 10:02:15 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 10:02:15.404 163572 INFO neutron.agent.ovn.metadata.agent [-] Port 1634e81d-e702-4faf-9e73-2065f8f0e08b in datapath e63b9444-12b5-401f-bd30-af34ee321bad bound to our chassis
Feb 23 10:02:15 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 10:02:15.406 163572 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network e63b9444-12b5-401f-bd30-af34ee321bad or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Feb 23 10:02:15 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 10:02:15.408 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[49087957-68ee-4269-b096-6678990299d3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 10:02:15 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap1634e81d-e7: No such device
Feb 23 10:02:15 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap1634e81d-e7: No such device
Feb 23 10:02:15 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T10:02:15Z|00334|binding|INFO|Setting lport 1634e81d-e702-4faf-9e73-2065f8f0e08b ovn-installed in OVS
Feb 23 10:02:15 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T10:02:15Z|00335|binding|INFO|Setting lport 1634e81d-e702-4faf-9e73-2065f8f0e08b up in Southbound
Feb 23 10:02:15 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:02:15.432 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:02:15 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap1634e81d-e7: No such device
Feb 23 10:02:15 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap1634e81d-e7: No such device
Feb 23 10:02:15 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap1634e81d-e7: No such device
Feb 23 10:02:15 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap1634e81d-e7: No such device
Feb 23 10:02:15 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap1634e81d-e7: No such device
Feb 23 10:02:15 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap1634e81d-e7: No such device
Feb 23 10:02:15 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:02:15.476 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:02:15 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:02:15.510 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:02:16 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "d958ade4-5b7f-45eb-b23d-cb42046e5d2f", "auth_id": "david", "format": "json"}]: dispatch
Feb 23 10:02:16 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "d958ade4-5b7f-45eb-b23d-cb42046e5d2f", "auth_id": "david", "format": "json"}]: dispatch
Feb 23 10:02:16 np0005626463.localdomain ceph-mon[294160]: pgmap v468: 177 pgs: 177 active+clean; 194 MiB data, 954 MiB used, 41 GiB / 42 GiB avail; 21 KiB/s rd, 60 KiB/s wr, 34 op/s
Feb 23 10:02:16 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "90cc3386-9efe-4ed5-a2c9-f8ae76a69843", "format": "json"}]: dispatch
Feb 23 10:02:16 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "90cc3386-9efe-4ed5-a2c9-f8ae76a69843", "force": true, "format": "json"}]: dispatch
Feb 23 10:02:16 np0005626463.localdomain podman[320247]: 2026-02-23 10:02:16.355526786 +0000 UTC m=+0.101334403 container create 5e57aeaa7d1ab326dfebffba525985903ef608e2be3eaeb26be383063f6f8591 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e63b9444-12b5-401f-bd30-af34ee321bad, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb)
Feb 23 10:02:16 np0005626463.localdomain systemd[1]: Started libpod-conmon-5e57aeaa7d1ab326dfebffba525985903ef608e2be3eaeb26be383063f6f8591.scope.
Feb 23 10:02:16 np0005626463.localdomain podman[320247]: 2026-02-23 10:02:16.308317324 +0000 UTC m=+0.054124981 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 23 10:02:16 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 10:02:16 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2ebec7039e60db65acdf7c0ecd16e923040d414148abcea66fbe688ace2f9805/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 23 10:02:16 np0005626463.localdomain podman[320247]: 2026-02-23 10:02:16.435082176 +0000 UTC m=+0.180889793 container init 5e57aeaa7d1ab326dfebffba525985903ef608e2be3eaeb26be383063f6f8591 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e63b9444-12b5-401f-bd30-af34ee321bad, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0)
Feb 23 10:02:16 np0005626463.localdomain podman[320247]: 2026-02-23 10:02:16.441008242 +0000 UTC m=+0.186815869 container start 5e57aeaa7d1ab326dfebffba525985903ef608e2be3eaeb26be383063f6f8591 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e63b9444-12b5-401f-bd30-af34ee321bad, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 23 10:02:16 np0005626463.localdomain dnsmasq[320265]: started, version 2.85 cachesize 150
Feb 23 10:02:16 np0005626463.localdomain dnsmasq[320265]: DNS service limited to local subnets
Feb 23 10:02:16 np0005626463.localdomain dnsmasq[320265]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 23 10:02:16 np0005626463.localdomain dnsmasq[320265]: warning: no upstream servers configured
Feb 23 10:02:16 np0005626463.localdomain dnsmasq-dhcp[320265]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Feb 23 10:02:16 np0005626463.localdomain dnsmasq[320265]: read /var/lib/neutron/dhcp/e63b9444-12b5-401f-bd30-af34ee321bad/addn_hosts - 0 addresses
Feb 23 10:02:16 np0005626463.localdomain dnsmasq-dhcp[320265]: read /var/lib/neutron/dhcp/e63b9444-12b5-401f-bd30-af34ee321bad/host
Feb 23 10:02:16 np0005626463.localdomain dnsmasq-dhcp[320265]: read /var/lib/neutron/dhcp/e63b9444-12b5-401f-bd30-af34ee321bad/opts
Feb 23 10:02:16 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 10:02:16.507 265541 INFO neutron.agent.dhcp.agent [None req-391867ec-2d21-4208-aff1-162f6dbd896d - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-23T10:02:15Z, description=, device_id=2cd213c5-1cc0-4ece-97cb-aacda1a21f15, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f282914fb50>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f282919e5b0>], id=a163dd8c-da34-4abd-b418-3c438cf8fe6f, ip_allocation=immediate, mac_address=fa:16:3e:b6:17:9f, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-23T10:02:13Z, description=, dns_domain=, id=e63b9444-12b5-401f-bd30-af34ee321bad, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersIpV6Test-1517713891, port_security_enabled=True, project_id=68a48b471ed84048aeb651374fff5111, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=54215, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=3015, status=ACTIVE, subnets=['448cc415-671f-4bd9-897a-4370ca9edf8d'], tags=[], tenant_id=68a48b471ed84048aeb651374fff5111, updated_at=2026-02-23T10:02:14Z, vlan_transparent=None, network_id=e63b9444-12b5-401f-bd30-af34ee321bad, port_security_enabled=False, project_id=68a48b471ed84048aeb651374fff5111, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3031, status=DOWN, tags=[], tenant_id=68a48b471ed84048aeb651374fff5111, updated_at=2026-02-23T10:02:15Z on network e63b9444-12b5-401f-bd30-af34ee321bad
Feb 23 10:02:16 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 10:02:16.574 265541 INFO neutron.agent.dhcp.agent [None req-01c86bfa-5238-444b-ad0e-daa85ed342cb - - - - - -] DHCP configuration for ports {'a52ef47a-5244-4d45-bfde-ab5e56a9da0d'} is completed
Feb 23 10:02:16 np0005626463.localdomain dnsmasq[320265]: read /var/lib/neutron/dhcp/e63b9444-12b5-401f-bd30-af34ee321bad/addn_hosts - 1 addresses
Feb 23 10:02:16 np0005626463.localdomain podman[320284]: 2026-02-23 10:02:16.720102946 +0000 UTC m=+0.065821858 container kill 5e57aeaa7d1ab326dfebffba525985903ef608e2be3eaeb26be383063f6f8591 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e63b9444-12b5-401f-bd30-af34ee321bad, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Feb 23 10:02:16 np0005626463.localdomain dnsmasq-dhcp[320265]: read /var/lib/neutron/dhcp/e63b9444-12b5-401f-bd30-af34ee321bad/host
Feb 23 10:02:16 np0005626463.localdomain dnsmasq-dhcp[320265]: read /var/lib/neutron/dhcp/e63b9444-12b5-401f-bd30-af34ee321bad/opts
Feb 23 10:02:16 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 10:02:16.938 265541 INFO neutron.agent.dhcp.agent [None req-391867ec-2d21-4208-aff1-162f6dbd896d - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-23T10:02:15Z, description=, device_id=2cd213c5-1cc0-4ece-97cb-aacda1a21f15, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f28292468b0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f282919e5e0>], id=a163dd8c-da34-4abd-b418-3c438cf8fe6f, ip_allocation=immediate, mac_address=fa:16:3e:b6:17:9f, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-23T10:02:13Z, description=, dns_domain=, id=e63b9444-12b5-401f-bd30-af34ee321bad, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersIpV6Test-1517713891, port_security_enabled=True, project_id=68a48b471ed84048aeb651374fff5111, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=54215, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=3015, status=ACTIVE, subnets=['448cc415-671f-4bd9-897a-4370ca9edf8d'], tags=[], tenant_id=68a48b471ed84048aeb651374fff5111, updated_at=2026-02-23T10:02:14Z, vlan_transparent=None, network_id=e63b9444-12b5-401f-bd30-af34ee321bad, port_security_enabled=False, project_id=68a48b471ed84048aeb651374fff5111, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3031, status=DOWN, tags=[], tenant_id=68a48b471ed84048aeb651374fff5111, updated_at=2026-02-23T10:02:15Z on network e63b9444-12b5-401f-bd30-af34ee321bad
Feb 23 10:02:17 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 10:02:17.031 265541 INFO neutron.agent.dhcp.agent [None req-2fe1288e-0211-40c4-91a3-6b94f1898434 - - - - - -] DHCP configuration for ports {'a163dd8c-da34-4abd-b418-3c438cf8fe6f'} is completed
Feb 23 10:02:17 np0005626463.localdomain dnsmasq[320265]: read /var/lib/neutron/dhcp/e63b9444-12b5-401f-bd30-af34ee321bad/addn_hosts - 1 addresses
Feb 23 10:02:17 np0005626463.localdomain dnsmasq-dhcp[320265]: read /var/lib/neutron/dhcp/e63b9444-12b5-401f-bd30-af34ee321bad/host
Feb 23 10:02:17 np0005626463.localdomain podman[320323]: 2026-02-23 10:02:17.149813262 +0000 UTC m=+0.066424557 container kill 5e57aeaa7d1ab326dfebffba525985903ef608e2be3eaeb26be383063f6f8591 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e63b9444-12b5-401f-bd30-af34ee321bad, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Feb 23 10:02:17 np0005626463.localdomain dnsmasq-dhcp[320265]: read /var/lib/neutron/dhcp/e63b9444-12b5-401f-bd30-af34ee321bad/opts
Feb 23 10:02:17 np0005626463.localdomain systemd[1]: tmp-crun.GXbD42.mount: Deactivated successfully.
Feb 23 10:02:17 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e216 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 23 10:02:17 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e216 do_prune osdmap full prune enabled
Feb 23 10:02:17 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e217 e217: 6 total, 6 up, 6 in
Feb 23 10:02:17 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e217: 6 total, 6 up, 6 in
Feb 23 10:02:17 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 10:02:17.474 265541 INFO neutron.agent.dhcp.agent [None req-ab67c062-1b9e-4f32-8032-c93b18a893e9 - - - - - -] DHCP configuration for ports {'a163dd8c-da34-4abd-b418-3c438cf8fe6f'} is completed
Feb 23 10:02:17 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Feb 23 10:02:17 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 10:02:17 np0005626463.localdomain dnsmasq[320265]: read /var/lib/neutron/dhcp/e63b9444-12b5-401f-bd30-af34ee321bad/addn_hosts - 0 addresses
Feb 23 10:02:17 np0005626463.localdomain podman[320360]: 2026-02-23 10:02:17.726974908 +0000 UTC m=+0.070633439 container kill 5e57aeaa7d1ab326dfebffba525985903ef608e2be3eaeb26be383063f6f8591 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e63b9444-12b5-401f-bd30-af34ee321bad, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.build-date=20260216)
Feb 23 10:02:17 np0005626463.localdomain dnsmasq-dhcp[320265]: read /var/lib/neutron/dhcp/e63b9444-12b5-401f-bd30-af34ee321bad/host
Feb 23 10:02:17 np0005626463.localdomain dnsmasq-dhcp[320265]: read /var/lib/neutron/dhcp/e63b9444-12b5-401f-bd30-af34ee321bad/opts
Feb 23 10:02:17 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T10:02:17Z|00336|binding|INFO|Releasing lport 1634e81d-e702-4faf-9e73-2065f8f0e08b from this chassis (sb_readonly=0)
Feb 23 10:02:17 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T10:02:17Z|00337|binding|INFO|Setting lport 1634e81d-e702-4faf-9e73-2065f8f0e08b down in Southbound
Feb 23 10:02:17 np0005626463.localdomain kernel: device tap1634e81d-e7 left promiscuous mode
Feb 23 10:02:17 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:02:17.945 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:02:17 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 10:02:17.957 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005626463.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpfb23302c-55c1-5de0-badf-4fc1ff22837a-e63b9444-12b5-401f-bd30-af34ee321bad', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e63b9444-12b5-401f-bd30-af34ee321bad', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '68a48b471ed84048aeb651374fff5111', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005626463.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f42d0d0d-6ed8-4a36-9fc6-62bf3138a3ca, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f808c075610>], logical_port=1634e81d-e702-4faf-9e73-2065f8f0e08b) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f808c075610>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 10:02:17 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 10:02:17.959 163572 INFO neutron.agent.ovn.metadata.agent [-] Port 1634e81d-e702-4faf-9e73-2065f8f0e08b in datapath e63b9444-12b5-401f-bd30-af34ee321bad unbound from our chassis
Feb 23 10:02:17 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 10:02:17.961 163572 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network e63b9444-12b5-401f-bd30-af34ee321bad or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Feb 23 10:02:17 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 10:02:17.963 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[fc5ce108-ba8a-4843-9849-936642f38428]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 10:02:17 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:02:17.966 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:02:18 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.david"} v 0)
Feb 23 10:02:18 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.david"} : dispatch
Feb 23 10:02:18 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth rm", "entity": "client.david"}]': finished
Feb 23 10:02:18 np0005626463.localdomain ceph-mds[286877]: mds.mds.np0005626463.qcthuc asok_command: session evict {filters=[auth_name=david,client_metadata.root=/volumes/_nogroup/ce93cb3d-9c0e-41e4-ad7d-9451b8b5d7b4/555eb612-b64e-47a9-a736-f6e77cc4a819],prefix=session evict} (starting...)
Feb 23 10:02:18 np0005626463.localdomain ceph-mon[294160]: pgmap v469: 177 pgs: 177 active+clean; 195 MiB data, 973 MiB used, 41 GiB / 42 GiB avail; 19 KiB/s rd, 97 KiB/s wr, 36 op/s
Feb 23 10:02:18 np0005626463.localdomain ceph-mon[294160]: osdmap e217: 6 total, 6 up, 6 in
Feb 23 10:02:18 np0005626463.localdomain ceph-mon[294160]: from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 10:02:18 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/257103341' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 23 10:02:18 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/257103341' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 23 10:02:18 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.david", "format": "json"} : dispatch
Feb 23 10:02:18 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.david"} : dispatch
Feb 23 10:02:18 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.david"} : dispatch
Feb 23 10:02:18 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth rm", "entity": "client.david"}]': finished
Feb 23 10:02:18 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e217 do_prune osdmap full prune enabled
Feb 23 10:02:18 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e218 e218: 6 total, 6 up, 6 in
Feb 23 10:02:18 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e218: 6 total, 6 up, 6 in
Feb 23 10:02:18 np0005626463.localdomain dnsmasq[320265]: exiting on receipt of SIGTERM
Feb 23 10:02:18 np0005626463.localdomain podman[320401]: 2026-02-23 10:02:18.840593791 +0000 UTC m=+0.065822367 container kill 5e57aeaa7d1ab326dfebffba525985903ef608e2be3eaeb26be383063f6f8591 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e63b9444-12b5-401f-bd30-af34ee321bad, org.label-schema.build-date=20260216, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.43.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 23 10:02:18 np0005626463.localdomain systemd[1]: libpod-5e57aeaa7d1ab326dfebffba525985903ef608e2be3eaeb26be383063f6f8591.scope: Deactivated successfully.
Feb 23 10:02:18 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:02:18.886 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:02:18 np0005626463.localdomain podman[320416]: 2026-02-23 10:02:18.923613319 +0000 UTC m=+0.065447866 container died 5e57aeaa7d1ab326dfebffba525985903ef608e2be3eaeb26be383063f6f8591 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e63b9444-12b5-401f-bd30-af34ee321bad, tcib_managed=true, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, org.label-schema.license=GPLv2)
Feb 23 10:02:18 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5e57aeaa7d1ab326dfebffba525985903ef608e2be3eaeb26be383063f6f8591-userdata-shm.mount: Deactivated successfully.
Feb 23 10:02:18 np0005626463.localdomain podman[320416]: 2026-02-23 10:02:18.96377533 +0000 UTC m=+0.105609837 container cleanup 5e57aeaa7d1ab326dfebffba525985903ef608e2be3eaeb26be383063f6f8591 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e63b9444-12b5-401f-bd30-af34ee321bad, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.build-date=20260216, tcib_managed=true)
Feb 23 10:02:18 np0005626463.localdomain systemd[1]: libpod-conmon-5e57aeaa7d1ab326dfebffba525985903ef608e2be3eaeb26be383063f6f8591.scope: Deactivated successfully.
Feb 23 10:02:19 np0005626463.localdomain podman[320417]: 2026-02-23 10:02:19.018771017 +0000 UTC m=+0.153028737 container remove 5e57aeaa7d1ab326dfebffba525985903ef608e2be3eaeb26be383063f6f8591 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e63b9444-12b5-401f-bd30-af34ee321bad, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, org.label-schema.build-date=20260216, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Feb 23 10:02:19 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 10:02:19.045 265541 INFO neutron.agent.dhcp.agent [None req-8ad3a064-f5ac-4d06-b8f5-2b5226166d2f - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 23 10:02:19 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 10:02:19.046 265541 INFO neutron.agent.dhcp.agent [None req-8ad3a064-f5ac-4d06-b8f5-2b5226166d2f - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 23 10:02:19 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T10:02:19Z|00338|binding|INFO|Releasing lport 4143c8ea-7577-4792-9744-bcff90eb20f2 from this chassis (sb_readonly=0)
Feb 23 10:02:19 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:02:19.264 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:02:19 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "4f53052b-0441-47a6-9bd6-84191fbc6dcb", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 23 10:02:19 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "4f53052b-0441-47a6-9bd6-84191fbc6dcb", "format": "json"}]: dispatch
Feb 23 10:02:19 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "ce93cb3d-9c0e-41e4-ad7d-9451b8b5d7b4", "auth_id": "david", "format": "json"}]: dispatch
Feb 23 10:02:19 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "ce93cb3d-9c0e-41e4-ad7d-9451b8b5d7b4", "auth_id": "david", "format": "json"}]: dispatch
Feb 23 10:02:19 np0005626463.localdomain ceph-mon[294160]: osdmap e218: 6 total, 6 up, 6 in
Feb 23 10:02:19 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-2ebec7039e60db65acdf7c0ecd16e923040d414148abcea66fbe688ace2f9805-merged.mount: Deactivated successfully.
Feb 23 10:02:19 np0005626463.localdomain systemd[1]: run-netns-qdhcp\x2de63b9444\x2d12b5\x2d401f\x2dbd30\x2daf34ee321bad.mount: Deactivated successfully.
Feb 23 10:02:20 np0005626463.localdomain ceph-mon[294160]: pgmap v472: 177 pgs: 177 active+clean; 195 MiB data, 973 MiB used, 41 GiB / 42 GiB avail; 1.3 KiB/s rd, 47 KiB/s wr, 6 op/s
Feb 23 10:02:20 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.
Feb 23 10:02:20 np0005626463.localdomain podman[320444]: 2026-02-23 10:02:20.931305053 +0000 UTC m=+0.103515092 container health_status be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 23 10:02:20 np0005626463.localdomain podman[320444]: 2026-02-23 10:02:20.947305425 +0000 UTC m=+0.119515474 container exec_died be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, managed_by=edpm_ansible)
Feb 23 10:02:20 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Feb 23 10:02:20 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 10:02:20 np0005626463.localdomain systemd[1]: be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.service: Deactivated successfully.
Feb 23 10:02:21 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "22eec113-2c60-4be1-9ecd-9ef9c2418dbe", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 23 10:02:21 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "22eec113-2c60-4be1-9ecd-9ef9c2418dbe", "format": "json"}]: dispatch
Feb 23 10:02:21 np0005626463.localdomain ceph-mon[294160]: from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 10:02:21 np0005626463.localdomain ceph-mon[294160]: pgmap v473: 177 pgs: 177 active+clean; 195 MiB data, 991 MiB used, 41 GiB / 42 GiB avail; 3.9 KiB/s rd, 82 KiB/s wr, 15 op/s
Feb 23 10:02:22 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "d958ade4-5b7f-45eb-b23d-cb42046e5d2f", "format": "json"}]: dispatch
Feb 23 10:02:22 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "d958ade4-5b7f-45eb-b23d-cb42046e5d2f", "force": true, "format": "json"}]: dispatch
Feb 23 10:02:22 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e218 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 23 10:02:22 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.
Feb 23 10:02:22 np0005626463.localdomain systemd[1]: tmp-crun.3lyK1j.mount: Deactivated successfully.
Feb 23 10:02:22 np0005626463.localdomain podman[320463]: 2026-02-23 10:02:22.902299514 +0000 UTC m=+0.082921845 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.build-date=20260216)
Feb 23 10:02:22 np0005626463.localdomain podman[320463]: 2026-02-23 10:02:22.912257627 +0000 UTC m=+0.092879928 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20260216, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.43.0, config_id=ovn_metadata_agent)
Feb 23 10:02:22 np0005626463.localdomain systemd[1]: 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully.
Feb 23 10:02:23 np0005626463.localdomain ceph-mon[294160]: pgmap v474: 177 pgs: 177 active+clean; 195 MiB data, 992 MiB used, 41 GiB / 42 GiB avail; 16 KiB/s rd, 83 KiB/s wr, 31 op/s
Feb 23 10:02:23 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:02:23.889 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:02:24 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:02:24.054 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:02:24 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:02:24.055 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 23 10:02:24 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:02:24.055 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 23 10:02:24 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:02:24.144 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 23 10:02:24 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:02:24.145 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquired lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 23 10:02:24 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:02:24.145 282211 DEBUG nova.network.neutron [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 23 10:02:24 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:02:24.145 282211 DEBUG nova.objects.instance [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c2a7d92b-952f-46a7-8a6a-3322a48fcf4b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 23 10:02:24 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Feb 23 10:02:24 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 10:02:24 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "ff275bf0-7ab0-4200-9dfe-d972931f7856", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 23 10:02:24 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "ff275bf0-7ab0-4200-9dfe-d972931f7856", "format": "json"}]: dispatch
Feb 23 10:02:24 np0005626463.localdomain ceph-mon[294160]: from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 10:02:25 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:02:25.015 282211 DEBUG nova.network.neutron [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Updating instance_info_cache with network_info: [{"id": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "address": "fa:16:3e:a0:9d:00", "network": {"id": "9da5b53d-3184-450f-9a5b-bdba1a6c9f6d", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "37b8098efb0d4ecc90b451a2db0e966f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa27e5011-20", "ovs_interfaceid": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 23 10:02:25 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:02:25.032 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Releasing lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 23 10:02:25 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:02:25.033 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 23 10:02:25 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:02:25.053 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:02:25 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:02:25.054 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 23 10:02:25 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Feb 23 10:02:25 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 10:02:25 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "4f53052b-0441-47a6-9bd6-84191fbc6dcb", "format": "json"}]: dispatch
Feb 23 10:02:25 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "4f53052b-0441-47a6-9bd6-84191fbc6dcb", "force": true, "format": "json"}]: dispatch
Feb 23 10:02:25 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "22eec113-2c60-4be1-9ecd-9ef9c2418dbe", "format": "json"}]: dispatch
Feb 23 10:02:25 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "22eec113-2c60-4be1-9ecd-9ef9c2418dbe", "force": true, "format": "json"}]: dispatch
Feb 23 10:02:25 np0005626463.localdomain ceph-mon[294160]: pgmap v475: 177 pgs: 177 active+clean; 195 MiB data, 992 MiB used, 41 GiB / 42 GiB avail; 15 KiB/s rd, 43 KiB/s wr, 25 op/s
Feb 23 10:02:25 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "bedacb3b-517e-43b1-b025-790f9bc892fc", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 23 10:02:25 np0005626463.localdomain ceph-mon[294160]: from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 10:02:26 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "bedacb3b-517e-43b1-b025-790f9bc892fc", "format": "json"}]: dispatch
Feb 23 10:02:26 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "c3aedd71-b342-4920-afd2-d5c6fd4776d2", "format": "json"}]: dispatch
Feb 23 10:02:26 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "c3aedd71-b342-4920-afd2-d5c6fd4776d2", "force": true, "format": "json"}]: dispatch
Feb 23 10:02:26 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.108:0/2714269398' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 10:02:27 np0005626463.localdomain sudo[320481]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 10:02:27 np0005626463.localdomain sudo[320481]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 10:02:27 np0005626463.localdomain sudo[320481]: pam_unix(sudo:session): session closed for user root
Feb 23 10:02:27 np0005626463.localdomain sudo[320499]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 23 10:02:27 np0005626463.localdomain sudo[320499]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 10:02:27 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e218 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 23 10:02:27 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e218 do_prune osdmap full prune enabled
Feb 23 10:02:27 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e219 e219: 6 total, 6 up, 6 in
Feb 23 10:02:27 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e219: 6 total, 6 up, 6 in
Feb 23 10:02:27 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.108:0/2407988189' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 10:02:27 np0005626463.localdomain ceph-mon[294160]: pgmap v476: 177 pgs: 177 active+clean; 196 MiB data, 993 MiB used, 41 GiB / 42 GiB avail; 14 KiB/s rd, 103 KiB/s wr, 28 op/s
Feb 23 10:02:27 np0005626463.localdomain ceph-mon[294160]: osdmap e219: 6 total, 6 up, 6 in
Feb 23 10:02:27 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.eve49", "caps": ["mds", "allow rw path=/volumes/_nogroup/ff275bf0-7ab0-4200-9dfe-d972931f7856/b70c063c-69eb-4a53-8ff2-f992c2e4cf5a", "osd", "allow rw pool=manila_data namespace=fsvolumens_ff275bf0-7ab0-4200-9dfe-d972931f7856", "mon", "allow r"], "format": "json"} v 0)
Feb 23 10:02:27 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.eve49", "caps": ["mds", "allow rw path=/volumes/_nogroup/ff275bf0-7ab0-4200-9dfe-d972931f7856/b70c063c-69eb-4a53-8ff2-f992c2e4cf5a", "osd", "allow rw pool=manila_data namespace=fsvolumens_ff275bf0-7ab0-4200-9dfe-d972931f7856", "mon", "allow r"], "format": "json"} : dispatch
Feb 23 10:02:27 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.eve49", "caps": ["mds", "allow rw path=/volumes/_nogroup/ff275bf0-7ab0-4200-9dfe-d972931f7856/b70c063c-69eb-4a53-8ff2-f992c2e4cf5a", "osd", "allow rw pool=manila_data namespace=fsvolumens_ff275bf0-7ab0-4200-9dfe-d972931f7856", "mon", "allow r"], "format": "json"}]': finished
Feb 23 10:02:27 np0005626463.localdomain sudo[320499]: pam_unix(sudo:session): session closed for user root
Feb 23 10:02:28 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Feb 23 10:02:28 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 10:02:28 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "ff275bf0-7ab0-4200-9dfe-d972931f7856", "auth_id": "eve49", "tenant_id": "f47d5caa97d244edb5aef31a3870507a", "access_level": "rw", "format": "json"}]: dispatch
Feb 23 10:02:28 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.eve49", "format": "json"} : dispatch
Feb 23 10:02:28 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.eve49", "caps": ["mds", "allow rw path=/volumes/_nogroup/ff275bf0-7ab0-4200-9dfe-d972931f7856/b70c063c-69eb-4a53-8ff2-f992c2e4cf5a", "osd", "allow rw pool=manila_data namespace=fsvolumens_ff275bf0-7ab0-4200-9dfe-d972931f7856", "mon", "allow r"], "format": "json"} : dispatch
Feb 23 10:02:28 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.eve49", "caps": ["mds", "allow rw path=/volumes/_nogroup/ff275bf0-7ab0-4200-9dfe-d972931f7856/b70c063c-69eb-4a53-8ff2-f992c2e4cf5a", "osd", "allow rw pool=manila_data namespace=fsvolumens_ff275bf0-7ab0-4200-9dfe-d972931f7856", "mon", "allow r"], "format": "json"} : dispatch
Feb 23 10:02:28 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.eve49", "caps": ["mds", "allow rw path=/volumes/_nogroup/ff275bf0-7ab0-4200-9dfe-d972931f7856/b70c063c-69eb-4a53-8ff2-f992c2e4cf5a", "osd", "allow rw pool=manila_data namespace=fsvolumens_ff275bf0-7ab0-4200-9dfe-d972931f7856", "mon", "allow r"], "format": "json"}]': finished
Feb 23 10:02:28 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 10:02:28 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 23 10:02:28 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 10:02:28 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 23 10:02:28 np0005626463.localdomain sudo[320548]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 10:02:28 np0005626463.localdomain sudo[320548]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 10:02:28 np0005626463.localdomain sudo[320548]: pam_unix(sudo:session): session closed for user root
Feb 23 10:02:28 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-380228807", "caps": ["mds", "allow rw path=/volumes/_nogroup/bedacb3b-517e-43b1-b025-790f9bc892fc/10c0e9c9-eacd-4b8a-ae8b-8483042ecb15", "osd", "allow rw pool=manila_data namespace=fsvolumens_bedacb3b-517e-43b1-b025-790f9bc892fc", "mon", "allow r"], "format": "json"} v 0)
Feb 23 10:02:28 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-380228807", "caps": ["mds", "allow rw path=/volumes/_nogroup/bedacb3b-517e-43b1-b025-790f9bc892fc/10c0e9c9-eacd-4b8a-ae8b-8483042ecb15", "osd", "allow rw pool=manila_data namespace=fsvolumens_bedacb3b-517e-43b1-b025-790f9bc892fc", "mon", "allow r"], "format": "json"} : dispatch
Feb 23 10:02:28 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-380228807", "caps": ["mds", "allow rw path=/volumes/_nogroup/bedacb3b-517e-43b1-b025-790f9bc892fc/10c0e9c9-eacd-4b8a-ae8b-8483042ecb15", "osd", "allow rw pool=manila_data namespace=fsvolumens_bedacb3b-517e-43b1-b025-790f9bc892fc", "mon", "allow r"], "format": "json"}]': finished
Feb 23 10:02:28 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:02:28.892 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:02:29 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "bedacb3b-517e-43b1-b025-790f9bc892fc", "auth_id": "tempest-cephx-id-380228807", "tenant_id": "afc38bb20ffe4287899bc080a5fd2741", "access_level": "rw", "format": "json"}]: dispatch
Feb 23 10:02:29 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-380228807", "format": "json"} : dispatch
Feb 23 10:02:29 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-380228807", "caps": ["mds", "allow rw path=/volumes/_nogroup/bedacb3b-517e-43b1-b025-790f9bc892fc/10c0e9c9-eacd-4b8a-ae8b-8483042ecb15", "osd", "allow rw pool=manila_data namespace=fsvolumens_bedacb3b-517e-43b1-b025-790f9bc892fc", "mon", "allow r"], "format": "json"} : dispatch
Feb 23 10:02:29 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-380228807", "caps": ["mds", "allow rw path=/volumes/_nogroup/bedacb3b-517e-43b1-b025-790f9bc892fc/10c0e9c9-eacd-4b8a-ae8b-8483042ecb15", "osd", "allow rw pool=manila_data namespace=fsvolumens_bedacb3b-517e-43b1-b025-790f9bc892fc", "mon", "allow r"], "format": "json"} : dispatch
Feb 23 10:02:29 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-380228807", "caps": ["mds", "allow rw path=/volumes/_nogroup/bedacb3b-517e-43b1-b025-790f9bc892fc/10c0e9c9-eacd-4b8a-ae8b-8483042ecb15", "osd", "allow rw pool=manila_data namespace=fsvolumens_bedacb3b-517e-43b1-b025-790f9bc892fc", "mon", "allow r"], "format": "json"}]': finished
Feb 23 10:02:29 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "7a63f65b-263e-4f0a-be43-9aace02f6e45", "format": "json"}]: dispatch
Feb 23 10:02:29 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "7a63f65b-263e-4f0a-be43-9aace02f6e45", "force": true, "format": "json"}]: dispatch
Feb 23 10:02:29 np0005626463.localdomain ceph-mon[294160]: pgmap v478: 177 pgs: 177 active+clean; 196 MiB data, 993 MiB used, 41 GiB / 42 GiB avail; 14 KiB/s rd, 101 KiB/s wr, 27 op/s
Feb 23 10:02:29 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.107:0/1226954335' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 10:02:29 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Feb 23 10:02:29 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3640977047' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 23 10:02:29 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Feb 23 10:02:29 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3640977047' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 23 10:02:29 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.tempest-cephx-id-380228807"} v 0)
Feb 23 10:02:29 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-380228807"} : dispatch
Feb 23 10:02:29 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-380228807"}]': finished
Feb 23 10:02:29 np0005626463.localdomain ceph-mds[286877]: mds.mds.np0005626463.qcthuc asok_command: session evict {filters=[auth_name=tempest-cephx-id-380228807,client_metadata.root=/volumes/_nogroup/bedacb3b-517e-43b1-b025-790f9bc892fc/10c0e9c9-eacd-4b8a-ae8b-8483042ecb15],prefix=session evict} (starting...)
Feb 23 10:02:30 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "bedacb3b-517e-43b1-b025-790f9bc892fc", "auth_id": "tempest-cephx-id-380228807", "format": "json"}]: dispatch
Feb 23 10:02:30 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/3640977047' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 23 10:02:30 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/3640977047' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 23 10:02:30 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-380228807", "format": "json"} : dispatch
Feb 23 10:02:30 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-380228807"} : dispatch
Feb 23 10:02:30 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-380228807"} : dispatch
Feb 23 10:02:30 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-380228807"}]': finished
Feb 23 10:02:30 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "bedacb3b-517e-43b1-b025-790f9bc892fc", "auth_id": "tempest-cephx-id-380228807", "format": "json"}]: dispatch
Feb 23 10:02:30 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "bedacb3b-517e-43b1-b025-790f9bc892fc", "format": "json"}]: dispatch
Feb 23 10:02:30 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "bedacb3b-517e-43b1-b025-790f9bc892fc", "force": true, "format": "json"}]: dispatch
Feb 23 10:02:30 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.107:0/738794719' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 10:02:30 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Feb 23 10:02:30 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 10:02:30 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Feb 23 10:02:30 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2519691642' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 23 10:02:30 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Feb 23 10:02:30 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2519691642' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 23 10:02:30 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.eve48", "caps": ["mds", "allow rw path=/volumes/_nogroup/ff275bf0-7ab0-4200-9dfe-d972931f7856/b70c063c-69eb-4a53-8ff2-f992c2e4cf5a", "osd", "allow rw pool=manila_data namespace=fsvolumens_ff275bf0-7ab0-4200-9dfe-d972931f7856", "mon", "allow r"], "format": "json"} v 0)
Feb 23 10:02:30 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.eve48", "caps": ["mds", "allow rw path=/volumes/_nogroup/ff275bf0-7ab0-4200-9dfe-d972931f7856/b70c063c-69eb-4a53-8ff2-f992c2e4cf5a", "osd", "allow rw pool=manila_data namespace=fsvolumens_ff275bf0-7ab0-4200-9dfe-d972931f7856", "mon", "allow r"], "format": "json"} : dispatch
Feb 23 10:02:30 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.eve48", "caps": ["mds", "allow rw path=/volumes/_nogroup/ff275bf0-7ab0-4200-9dfe-d972931f7856/b70c063c-69eb-4a53-8ff2-f992c2e4cf5a", "osd", "allow rw pool=manila_data namespace=fsvolumens_ff275bf0-7ab0-4200-9dfe-d972931f7856", "mon", "allow r"], "format": "json"}]': finished
Feb 23 10:02:31 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:02:31.051 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:02:31 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:02:31.053 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:02:31 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:02:31.054 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:02:31 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:02:31.055 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:02:31 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:02:31.078 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 10:02:31 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:02:31.079 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 10:02:31 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:02:31.079 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 10:02:31 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:02:31.080 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Auditing locally available compute resources for np0005626463.localdomain (node: np0005626463.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 23 10:02:31 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:02:31.081 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 10:02:31 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 10:02:31.084 265541 INFO neutron.agent.linux.ip_lib [None req-8791b20c-6006-428b-aee4-272eee9e8751 - - - - - -] Device tap66dde4f3-68 cannot be used as it has no MAC address
Feb 23 10:02:31 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:02:31.109 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:02:31 np0005626463.localdomain kernel: device tap66dde4f3-68 entered promiscuous mode
Feb 23 10:02:31 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T10:02:31Z|00339|binding|INFO|Claiming lport 66dde4f3-68ac-48ac-be4a-678a62b364e3 for this chassis.
Feb 23 10:02:31 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T10:02:31Z|00340|binding|INFO|66dde4f3-68ac-48ac-be4a-678a62b364e3: Claiming unknown
Feb 23 10:02:31 np0005626463.localdomain NetworkManager[5974]: <info>  [1771840951.1190] manager: (tap66dde4f3-68): new Generic device (/org/freedesktop/NetworkManager/Devices/54)
Feb 23 10:02:31 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:02:31.120 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:02:31 np0005626463.localdomain systemd-udevd[320577]: Network interface NamePolicy= disabled on kernel command line.
Feb 23 10:02:31 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 10:02:31.135 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005626463.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpfb23302c-55c1-5de0-badf-4fc1ff22837a-98deef06-e9d3-4399-8238-57fb5d318b61', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-98deef06-e9d3-4399-8238-57fb5d318b61', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '68a48b471ed84048aeb651374fff5111', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d7d058c2-7406-4067-bd55-39030093b520, chassis=[<ovs.db.idl.Row object at 0x7f808c075610>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f808c075610>], logical_port=66dde4f3-68ac-48ac-be4a-678a62b364e3) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 10:02:31 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 10:02:31.138 163572 INFO neutron.agent.ovn.metadata.agent [-] Port 66dde4f3-68ac-48ac-be4a-678a62b364e3 in datapath 98deef06-e9d3-4399-8238-57fb5d318b61 bound to our chassis
Feb 23 10:02:31 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 10:02:31.139 163572 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 98deef06-e9d3-4399-8238-57fb5d318b61 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Feb 23 10:02:31 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 10:02:31.141 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[c2660d29-08b7-4da3-a0c0-102bc6b99e86]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 10:02:31 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap66dde4f3-68: No such device
Feb 23 10:02:31 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap66dde4f3-68: No such device
Feb 23 10:02:31 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T10:02:31Z|00341|binding|INFO|Setting lport 66dde4f3-68ac-48ac-be4a-678a62b364e3 ovn-installed in OVS
Feb 23 10:02:31 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T10:02:31Z|00342|binding|INFO|Setting lport 66dde4f3-68ac-48ac-be4a-678a62b364e3 up in Southbound
Feb 23 10:02:31 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:02:31.166 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:02:31 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap66dde4f3-68: No such device
Feb 23 10:02:31 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap66dde4f3-68: No such device
Feb 23 10:02:31 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap66dde4f3-68: No such device
Feb 23 10:02:31 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap66dde4f3-68: No such device
Feb 23 10:02:31 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap66dde4f3-68: No such device
Feb 23 10:02:31 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap66dde4f3-68: No such device
Feb 23 10:02:31 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:02:31.216 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:02:31 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:02:31.252 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:02:31 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 23 10:02:31 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/333892643' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 10:02:31 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:02:31.569 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.488s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 10:02:31 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 10:02:31 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/2519691642' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 23 10:02:31 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/2519691642' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 23 10:02:31 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "ff275bf0-7ab0-4200-9dfe-d972931f7856", "auth_id": "eve48", "tenant_id": "f47d5caa97d244edb5aef31a3870507a", "access_level": "rw", "format": "json"}]: dispatch
Feb 23 10:02:31 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.eve48", "format": "json"} : dispatch
Feb 23 10:02:31 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.eve48", "caps": ["mds", "allow rw path=/volumes/_nogroup/ff275bf0-7ab0-4200-9dfe-d972931f7856/b70c063c-69eb-4a53-8ff2-f992c2e4cf5a", "osd", "allow rw pool=manila_data namespace=fsvolumens_ff275bf0-7ab0-4200-9dfe-d972931f7856", "mon", "allow r"], "format": "json"} : dispatch
Feb 23 10:02:31 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.eve48", "caps": ["mds", "allow rw path=/volumes/_nogroup/ff275bf0-7ab0-4200-9dfe-d972931f7856/b70c063c-69eb-4a53-8ff2-f992c2e4cf5a", "osd", "allow rw pool=manila_data namespace=fsvolumens_ff275bf0-7ab0-4200-9dfe-d972931f7856", "mon", "allow r"], "format": "json"} : dispatch
Feb 23 10:02:31 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.eve48", "caps": ["mds", "allow rw path=/volumes/_nogroup/ff275bf0-7ab0-4200-9dfe-d972931f7856/b70c063c-69eb-4a53-8ff2-f992c2e4cf5a", "osd", "allow rw pool=manila_data namespace=fsvolumens_ff275bf0-7ab0-4200-9dfe-d972931f7856", "mon", "allow r"], "format": "json"}]': finished
Feb 23 10:02:31 np0005626463.localdomain ceph-mon[294160]: pgmap v479: 177 pgs: 177 active+clean; 196 MiB data, 998 MiB used, 41 GiB / 42 GiB avail; 34 KiB/s rd, 131 KiB/s wr, 56 op/s
Feb 23 10:02:31 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.106:0/333892643' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 10:02:31 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:02:31.651 282211 DEBUG nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 23 10:02:31 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:02:31.651 282211 DEBUG nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 23 10:02:31 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:02:31.744 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:02:31 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:02:31.881 282211 WARNING nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 23 10:02:31 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:02:31.883 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Hypervisor/Node resource view: name=np0005626463.localdomain free_ram=11269MB free_disk=41.8366584777832GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 23 10:02:31 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:02:31.883 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 10:02:31 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:02:31.884 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 10:02:31 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:02:31.970 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Instance c2a7d92b-952f-46a7-8a6a-3322a48fcf4b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 23 10:02:31 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:02:31.971 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 23 10:02:31 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:02:31.972 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Final resource view: name=np0005626463.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 23 10:02:32 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:02:32.030 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 10:02:32 np0005626463.localdomain podman[320669]: 
Feb 23 10:02:32 np0005626463.localdomain podman[320669]: 2026-02-23 10:02:32.209731581 +0000 UTC m=+0.111617507 container create fc50d34ac72d7e730c365dd0ac6cb0b580f34277c1783bdebb358ec1942d59e0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-98deef06-e9d3-4399-8238-57fb5d318b61, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260216, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true)
Feb 23 10:02:32 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Feb 23 10:02:32 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1163720048' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 23 10:02:32 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Feb 23 10:02:32 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1163720048' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 23 10:02:32 np0005626463.localdomain podman[320669]: 2026-02-23 10:02:32.153382811 +0000 UTC m=+0.055268767 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 23 10:02:32 np0005626463.localdomain systemd[1]: Started libpod-conmon-fc50d34ac72d7e730c365dd0ac6cb0b580f34277c1783bdebb358ec1942d59e0.scope.
Feb 23 10:02:32 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 10:02:32 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9690d5c0a30798f94ac43c813648f7fb8820202cf10f3cc9168198333355fba5/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 23 10:02:32 np0005626463.localdomain podman[320669]: 2026-02-23 10:02:32.292370335 +0000 UTC m=+0.194256261 container init fc50d34ac72d7e730c365dd0ac6cb0b580f34277c1783bdebb358ec1942d59e0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-98deef06-e9d3-4399-8238-57fb5d318b61, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true)
Feb 23 10:02:32 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.
Feb 23 10:02:32 np0005626463.localdomain podman[320669]: 2026-02-23 10:02:32.304429144 +0000 UTC m=+0.206315070 container start fc50d34ac72d7e730c365dd0ac6cb0b580f34277c1783bdebb358ec1942d59e0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-98deef06-e9d3-4399-8238-57fb5d318b61, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Feb 23 10:02:32 np0005626463.localdomain dnsmasq[320707]: started, version 2.85 cachesize 150
Feb 23 10:02:32 np0005626463.localdomain dnsmasq[320707]: DNS service limited to local subnets
Feb 23 10:02:32 np0005626463.localdomain dnsmasq[320707]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 23 10:02:32 np0005626463.localdomain dnsmasq[320707]: warning: no upstream servers configured
Feb 23 10:02:32 np0005626463.localdomain dnsmasq-dhcp[320707]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Feb 23 10:02:32 np0005626463.localdomain dnsmasq[320707]: read /var/lib/neutron/dhcp/98deef06-e9d3-4399-8238-57fb5d318b61/addn_hosts - 0 addresses
Feb 23 10:02:32 np0005626463.localdomain dnsmasq-dhcp[320707]: read /var/lib/neutron/dhcp/98deef06-e9d3-4399-8238-57fb5d318b61/host
Feb 23 10:02:32 np0005626463.localdomain dnsmasq-dhcp[320707]: read /var/lib/neutron/dhcp/98deef06-e9d3-4399-8238-57fb5d318b61/opts
Feb 23 10:02:32 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 10:02:32.385 265541 INFO neutron.agent.dhcp.agent [None req-8791b20c-6006-428b-aee4-272eee9e8751 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-23T10:02:30Z, description=, device_id=2025587f-5897-4d24-b841-a40c65f335b2, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f28291f8ee0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f28291f80a0>], id=e72eca06-06ff-46ce-af4c-6d430ebe4ef4, ip_allocation=immediate, mac_address=fa:16:3e:eb:88:78, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-23T10:02:28Z, description=, dns_domain=, id=98deef06-e9d3-4399-8238-57fb5d318b61, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersIpV6Test-1054964184, port_security_enabled=True, project_id=68a48b471ed84048aeb651374fff5111, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=50124, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=3089, status=ACTIVE, subnets=['460c027b-0b10-4cdf-b00f-448326ec6496'], tags=[], tenant_id=68a48b471ed84048aeb651374fff5111, updated_at=2026-02-23T10:02:29Z, vlan_transparent=None, network_id=98deef06-e9d3-4399-8238-57fb5d318b61, port_security_enabled=False, project_id=68a48b471ed84048aeb651374fff5111, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3097, status=DOWN, tags=[], tenant_id=68a48b471ed84048aeb651374fff5111, updated_at=2026-02-23T10:02:30Z on network 98deef06-e9d3-4399-8238-57fb5d318b61
Feb 23 10:02:32 np0005626463.localdomain podman[320706]: 2026-02-23 10:02:32.405602292 +0000 UTC m=+0.086509668 container health_status 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2026-02-05T04:57:10Z, name=ubi9/ubi-minimal, managed_by=edpm_ansible, release=1770267347, version=9.7, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-type=git, io.openshift.tags=minimal rhel9, distribution-scope=public, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=openstack_network_exporter, org.opencontainers.image.created=2026-02-05T04:57:10Z, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64)
Feb 23 10:02:32 np0005626463.localdomain podman[320706]: 2026-02-23 10:02:32.420292063 +0000 UTC m=+0.101199439 container exec_died 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, build-date=2026-02-05T04:57:10Z, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, org.opencontainers.image.created=2026-02-05T04:57:10Z, maintainer=Red Hat, Inc., vcs-type=git, 
version=9.7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vendor=Red Hat, Inc., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, config_id=openstack_network_exporter, container_name=openstack_network_exporter, release=1770267347, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, name=ubi9/ubi-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container)
Feb 23 10:02:32 np0005626463.localdomain systemd[1]: 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.service: Deactivated successfully.
Feb 23 10:02:32 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.
Feb 23 10:02:32 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e219 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 23 10:02:32 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 23 10:02:32 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/933428432' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 10:02:32 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 10:02:32.506 265541 INFO neutron.agent.dhcp.agent [None req-abaa4419-7d88-4741-af99-f9e9c1091cc0 - - - - - -] DHCP configuration for ports {'3e01d07f-b4bf-4688-8209-6f5e13561d8f'} is completed
Feb 23 10:02:32 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:02:32.524 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.494s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 10:02:32 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:02:32.532 282211 DEBUG nova.compute.provider_tree [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Inventory has not changed in ProviderTree for provider: be63d86c-a403-4ec9-a515-07ea2962cb4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 23 10:02:32 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:02:32.554 282211 DEBUG nova.scheduler.client.report [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Inventory has not changed for provider be63d86c-a403-4ec9-a515-07ea2962cb4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 23 10:02:32 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:02:32.559 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Compute_service record updated for np0005626463.localdomain:np0005626463.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 23 10:02:32 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:02:32.559 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.676s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 10:02:32 np0005626463.localdomain dnsmasq[320707]: read /var/lib/neutron/dhcp/98deef06-e9d3-4399-8238-57fb5d318b61/addn_hosts - 1 addresses
Feb 23 10:02:32 np0005626463.localdomain dnsmasq-dhcp[320707]: read /var/lib/neutron/dhcp/98deef06-e9d3-4399-8238-57fb5d318b61/host
Feb 23 10:02:32 np0005626463.localdomain dnsmasq-dhcp[320707]: read /var/lib/neutron/dhcp/98deef06-e9d3-4399-8238-57fb5d318b61/opts
Feb 23 10:02:32 np0005626463.localdomain podman[320754]: 2026-02-23 10:02:32.585283945 +0000 UTC m=+0.052860081 container kill fc50d34ac72d7e730c365dd0ac6cb0b580f34277c1783bdebb358ec1942d59e0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-98deef06-e9d3-4399-8238-57fb5d318b61, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb)
Feb 23 10:02:32 np0005626463.localdomain podman[320738]: 2026-02-23 10:02:32.570756559 +0000 UTC m=+0.113086073 container health_status da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 23 10:02:32 np0005626463.localdomain podman[320738]: 2026-02-23 10:02:32.654328013 +0000 UTC m=+0.196657487 container exec_died da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 23 10:02:32 np0005626463.localdomain systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Deactivated successfully.
Feb 23 10:02:32 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 10:02:32.753 265541 INFO neutron.agent.dhcp.agent [None req-8791b20c-6006-428b-aee4-272eee9e8751 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-23T10:02:30Z, description=, device_id=2025587f-5897-4d24-b841-a40c65f335b2, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f28291b4c10>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f28291b4df0>], id=e72eca06-06ff-46ce-af4c-6d430ebe4ef4, ip_allocation=immediate, mac_address=fa:16:3e:eb:88:78, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-23T10:02:28Z, description=, dns_domain=, id=98deef06-e9d3-4399-8238-57fb5d318b61, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersIpV6Test-1054964184, port_security_enabled=True, project_id=68a48b471ed84048aeb651374fff5111, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=50124, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=3089, status=ACTIVE, subnets=['460c027b-0b10-4cdf-b00f-448326ec6496'], tags=[], tenant_id=68a48b471ed84048aeb651374fff5111, updated_at=2026-02-23T10:02:29Z, vlan_transparent=None, network_id=98deef06-e9d3-4399-8238-57fb5d318b61, port_security_enabled=False, project_id=68a48b471ed84048aeb651374fff5111, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3097, status=DOWN, tags=[], tenant_id=68a48b471ed84048aeb651374fff5111, updated_at=2026-02-23T10:02:30Z on network 98deef06-e9d3-4399-8238-57fb5d318b61
Feb 23 10:02:32 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 10:02:32.870 265541 INFO neutron.agent.dhcp.agent [None req-a7d140a0-b06c-4530-b6e5-24e13ca5e839 - - - - - -] DHCP configuration for ports {'e72eca06-06ff-46ce-af4c-6d430ebe4ef4'} is completed
Feb 23 10:02:32 np0005626463.localdomain dnsmasq[320707]: read /var/lib/neutron/dhcp/98deef06-e9d3-4399-8238-57fb5d318b61/addn_hosts - 1 addresses
Feb 23 10:02:32 np0005626463.localdomain dnsmasq-dhcp[320707]: read /var/lib/neutron/dhcp/98deef06-e9d3-4399-8238-57fb5d318b61/host
Feb 23 10:02:32 np0005626463.localdomain dnsmasq-dhcp[320707]: read /var/lib/neutron/dhcp/98deef06-e9d3-4399-8238-57fb5d318b61/opts
Feb 23 10:02:32 np0005626463.localdomain podman[320806]: 2026-02-23 10:02:32.961073647 +0000 UTC m=+0.064920500 container kill fc50d34ac72d7e730c365dd0ac6cb0b580f34277c1783bdebb358ec1942d59e0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-98deef06-e9d3-4399-8238-57fb5d318b61, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 23 10:02:32 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/1163720048' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 23 10:02:32 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/1163720048' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 23 10:02:32 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "ce93cb3d-9c0e-41e4-ad7d-9451b8b5d7b4", "auth_id": "admin", "format": "json"}]: dispatch
Feb 23 10:02:32 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.106:0/933428432' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 10:02:33 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 10:02:33.236 265541 INFO neutron.agent.dhcp.agent [None req-5f53b066-1c39-448a-83f2-27436a925264 - - - - - -] DHCP configuration for ports {'e72eca06-06ff-46ce-af4c-6d430ebe4ef4'} is completed
Feb 23 10:02:33 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:02:33.895 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:02:33 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "ce93cb3d-9c0e-41e4-ad7d-9451b8b5d7b4", "format": "json"}]: dispatch
Feb 23 10:02:33 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "ce93cb3d-9c0e-41e4-ad7d-9451b8b5d7b4", "force": true, "format": "json"}]: dispatch
Feb 23 10:02:33 np0005626463.localdomain ceph-mon[294160]: pgmap v480: 177 pgs: 177 active+clean; 196 MiB data, 999 MiB used, 41 GiB / 42 GiB avail; 36 KiB/s rd, 132 KiB/s wr, 60 op/s
Feb 23 10:02:34 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.eve48"} v 0)
Feb 23 10:02:34 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.eve48"} : dispatch
Feb 23 10:02:34 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth rm", "entity": "client.eve48"}]': finished
Feb 23 10:02:34 np0005626463.localdomain ceph-mds[286877]: mds.mds.np0005626463.qcthuc asok_command: session evict {filters=[auth_name=eve48,client_metadata.root=/volumes/_nogroup/ff275bf0-7ab0-4200-9dfe-d972931f7856/b70c063c-69eb-4a53-8ff2-f992c2e4cf5a],prefix=session evict} (starting...)
Feb 23 10:02:34 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:02:34.758 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:02:35 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.eve48", "format": "json"} : dispatch
Feb 23 10:02:35 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.eve48"} : dispatch
Feb 23 10:02:35 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.eve48"} : dispatch
Feb 23 10:02:35 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth rm", "entity": "client.eve48"}]': finished
Feb 23 10:02:36 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "ff275bf0-7ab0-4200-9dfe-d972931f7856", "auth_id": "eve48", "format": "json"}]: dispatch
Feb 23 10:02:36 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "ff275bf0-7ab0-4200-9dfe-d972931f7856", "auth_id": "eve48", "format": "json"}]: dispatch
Feb 23 10:02:36 np0005626463.localdomain ceph-mon[294160]: pgmap v481: 177 pgs: 177 active+clean; 196 MiB data, 999 MiB used, 41 GiB / 42 GiB avail; 36 KiB/s rd, 132 KiB/s wr, 60 op/s
Feb 23 10:02:36 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 10:02:36.324 265541 INFO neutron.agent.linux.ip_lib [None req-327dff01-1ecd-4aea-a992-7bfef9900a0e - - - - - -] Device tap6ea52883-11 cannot be used as it has no MAC address
Feb 23 10:02:36 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:02:36.357 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:02:36 np0005626463.localdomain kernel: device tap6ea52883-11 entered promiscuous mode
Feb 23 10:02:36 np0005626463.localdomain NetworkManager[5974]: <info>  [1771840956.3640] manager: (tap6ea52883-11): new Generic device (/org/freedesktop/NetworkManager/Devices/55)
Feb 23 10:02:36 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:02:36.365 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:02:36 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T10:02:36Z|00343|binding|INFO|Claiming lport 6ea52883-1131-4fad-9e9e-b739d54ff0bf for this chassis.
Feb 23 10:02:36 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T10:02:36Z|00344|binding|INFO|6ea52883-1131-4fad-9e9e-b739d54ff0bf: Claiming unknown
Feb 23 10:02:36 np0005626463.localdomain systemd-udevd[320836]: Network interface NamePolicy= disabled on kernel command line.
Feb 23 10:02:36 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 10:02:36.376 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005626463.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:2::2/64', 'neutron:device_id': 'dhcpfb23302c-55c1-5de0-badf-4fc1ff22837a-d54c9f86-28a1-4f1b-8617-dc63ba0e4fee', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d54c9f86-28a1-4f1b-8617-dc63ba0e4fee', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '68a48b471ed84048aeb651374fff5111', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0ec4f16f-35f7-432c-b2c5-c12ffa9973a2, chassis=[<ovs.db.idl.Row object at 0x7f808c075610>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f808c075610>], logical_port=6ea52883-1131-4fad-9e9e-b739d54ff0bf) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 10:02:36 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 10:02:36.378 163572 INFO neutron.agent.ovn.metadata.agent [-] Port 6ea52883-1131-4fad-9e9e-b739d54ff0bf in datapath d54c9f86-28a1-4f1b-8617-dc63ba0e4fee bound to our chassis
Feb 23 10:02:36 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 10:02:36.380 163572 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network d54c9f86-28a1-4f1b-8617-dc63ba0e4fee or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Feb 23 10:02:36 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 10:02:36.381 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[5498c26e-6729-41d5-a965-1167241240fa]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 10:02:36 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap6ea52883-11: No such device
Feb 23 10:02:36 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap6ea52883-11: No such device
Feb 23 10:02:36 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap6ea52883-11: No such device
Feb 23 10:02:36 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T10:02:36Z|00345|binding|INFO|Setting lport 6ea52883-1131-4fad-9e9e-b739d54ff0bf ovn-installed in OVS
Feb 23 10:02:36 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T10:02:36Z|00346|binding|INFO|Setting lport 6ea52883-1131-4fad-9e9e-b739d54ff0bf up in Southbound
Feb 23 10:02:36 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap6ea52883-11: No such device
Feb 23 10:02:36 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:02:36.405 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:02:36 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap6ea52883-11: No such device
Feb 23 10:02:36 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap6ea52883-11: No such device
Feb 23 10:02:36 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap6ea52883-11: No such device
Feb 23 10:02:36 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap6ea52883-11: No such device
Feb 23 10:02:36 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:02:36.454 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:02:36 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:02:36.487 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:02:36 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:02:36.562 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:02:36 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:02:36.562 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:02:37 np0005626463.localdomain podman[320907]: 
Feb 23 10:02:37 np0005626463.localdomain podman[320907]: 2026-02-23 10:02:37.400361415 +0000 UTC m=+0.093296641 container create dfad4dd845bc7ed39e88d033b62f05ceb63aa1d449b0ae6927f1ad41ecc2da1b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d54c9f86-28a1-4f1b-8617-dc63ba0e4fee, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true)
Feb 23 10:02:37 np0005626463.localdomain systemd[1]: Started libpod-conmon-dfad4dd845bc7ed39e88d033b62f05ceb63aa1d449b0ae6927f1ad41ecc2da1b.scope.
Feb 23 10:02:37 np0005626463.localdomain podman[320907]: 2026-02-23 10:02:37.354495374 +0000 UTC m=+0.047430630 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 23 10:02:37 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 10:02:37 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a7bb5f63af1888222c2e9d63660f7e53fc4d9a2bdaccc92e1e2b415de8d1e37c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 23 10:02:37 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e219 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 23 10:02:37 np0005626463.localdomain podman[320907]: 2026-02-23 10:02:37.475267387 +0000 UTC m=+0.168202613 container init dfad4dd845bc7ed39e88d033b62f05ceb63aa1d449b0ae6927f1ad41ecc2da1b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d54c9f86-28a1-4f1b-8617-dc63ba0e4fee, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.43.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true)
Feb 23 10:02:37 np0005626463.localdomain podman[320907]: 2026-02-23 10:02:37.491317061 +0000 UTC m=+0.184252297 container start dfad4dd845bc7ed39e88d033b62f05ceb63aa1d449b0ae6927f1ad41ecc2da1b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d54c9f86-28a1-4f1b-8617-dc63ba0e4fee, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.43.0, tcib_managed=true, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Feb 23 10:02:37 np0005626463.localdomain dnsmasq[320925]: started, version 2.85 cachesize 150
Feb 23 10:02:37 np0005626463.localdomain dnsmasq[320925]: DNS service limited to local subnets
Feb 23 10:02:37 np0005626463.localdomain dnsmasq[320925]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 23 10:02:37 np0005626463.localdomain dnsmasq[320925]: warning: no upstream servers configured
Feb 23 10:02:37 np0005626463.localdomain dnsmasq-dhcp[320925]: DHCPv6, static leases only on 2001:db8:2::, lease time 1d
Feb 23 10:02:37 np0005626463.localdomain dnsmasq[320925]: read /var/lib/neutron/dhcp/d54c9f86-28a1-4f1b-8617-dc63ba0e4fee/addn_hosts - 0 addresses
Feb 23 10:02:37 np0005626463.localdomain dnsmasq-dhcp[320925]: read /var/lib/neutron/dhcp/d54c9f86-28a1-4f1b-8617-dc63ba0e4fee/host
Feb 23 10:02:37 np0005626463.localdomain dnsmasq-dhcp[320925]: read /var/lib/neutron/dhcp/d54c9f86-28a1-4f1b-8617-dc63ba0e4fee/opts
Feb 23 10:02:37 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 10:02:37.557 265541 INFO neutron.agent.dhcp.agent [None req-327dff01-1ecd-4aea-a992-7bfef9900a0e - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-23T10:02:36Z, description=, device_id=2025587f-5897-4d24-b841-a40c65f335b2, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f2829102100>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f2829102220>], id=799df518-da07-48da-9a80-2682e9892d43, ip_allocation=immediate, mac_address=fa:16:3e:1e:ee:b3, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-23T10:02:34Z, description=, dns_domain=, id=d54c9f86-28a1-4f1b-8617-dc63ba0e4fee, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersIpV6Test-1408104542, port_security_enabled=True, project_id=68a48b471ed84048aeb651374fff5111, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=189, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=3119, status=ACTIVE, subnets=['43370c10-c3d6-4ec9-9149-8ef541cda21c'], tags=[], tenant_id=68a48b471ed84048aeb651374fff5111, updated_at=2026-02-23T10:02:35Z, vlan_transparent=None, network_id=d54c9f86-28a1-4f1b-8617-dc63ba0e4fee, port_security_enabled=False, project_id=68a48b471ed84048aeb651374fff5111, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3134, status=DOWN, tags=[], tenant_id=68a48b471ed84048aeb651374fff5111, updated_at=2026-02-23T10:02:36Z on network d54c9f86-28a1-4f1b-8617-dc63ba0e4fee
Feb 23 10:02:37 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 10:02:37.720 265541 INFO neutron.agent.dhcp.agent [None req-4e58f5dc-c733-4bb1-8f48-2ecc514dc88b - - - - - -] DHCP configuration for ports {'97001ae9-cd83-49b2-b11f-40e665a87f8f'} is completed
Feb 23 10:02:37 np0005626463.localdomain dnsmasq[320925]: read /var/lib/neutron/dhcp/d54c9f86-28a1-4f1b-8617-dc63ba0e4fee/addn_hosts - 1 addresses
Feb 23 10:02:37 np0005626463.localdomain dnsmasq-dhcp[320925]: read /var/lib/neutron/dhcp/d54c9f86-28a1-4f1b-8617-dc63ba0e4fee/host
Feb 23 10:02:37 np0005626463.localdomain podman[320944]: 2026-02-23 10:02:37.750125299 +0000 UTC m=+0.061109779 container kill dfad4dd845bc7ed39e88d033b62f05ceb63aa1d449b0ae6927f1ad41ecc2da1b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d54c9f86-28a1-4f1b-8617-dc63ba0e4fee, org.label-schema.build-date=20260216, tcib_managed=true, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb)
Feb 23 10:02:37 np0005626463.localdomain dnsmasq-dhcp[320925]: read /var/lib/neutron/dhcp/d54c9f86-28a1-4f1b-8617-dc63ba0e4fee/opts
Feb 23 10:02:37 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 10:02:37.807 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=22, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '22:68:bc', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'c6:19:65:94:49:af'}, ipsec=False) old=SB_Global(nb_cfg=21) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 10:02:37 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 10:02:37.808 163572 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 23 10:02:37 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 10:02:37.810 163572 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=96b5bb93-7341-4ce6-9b93-6a5de566c711, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '22'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 10:02:37 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:02:37.846 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:02:38 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:02:38.056 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:02:38 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 10:02:38.069 265541 INFO neutron.agent.dhcp.agent [None req-ff4c2e13-8745-4e56-b17d-5ec0d170c869 - - - - - -] DHCP configuration for ports {'799df518-da07-48da-9a80-2682e9892d43'} is completed
Feb 23 10:02:38 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.eve47", "caps": ["mds", "allow rw path=/volumes/_nogroup/ff275bf0-7ab0-4200-9dfe-d972931f7856/b70c063c-69eb-4a53-8ff2-f992c2e4cf5a", "osd", "allow rw pool=manila_data namespace=fsvolumens_ff275bf0-7ab0-4200-9dfe-d972931f7856", "mon", "allow r"], "format": "json"} v 0)
Feb 23 10:02:38 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.eve47", "caps": ["mds", "allow rw path=/volumes/_nogroup/ff275bf0-7ab0-4200-9dfe-d972931f7856/b70c063c-69eb-4a53-8ff2-f992c2e4cf5a", "osd", "allow rw pool=manila_data namespace=fsvolumens_ff275bf0-7ab0-4200-9dfe-d972931f7856", "mon", "allow r"], "format": "json"} : dispatch
Feb 23 10:02:38 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.eve47", "caps": ["mds", "allow rw path=/volumes/_nogroup/ff275bf0-7ab0-4200-9dfe-d972931f7856/b70c063c-69eb-4a53-8ff2-f992c2e4cf5a", "osd", "allow rw pool=manila_data namespace=fsvolumens_ff275bf0-7ab0-4200-9dfe-d972931f7856", "mon", "allow r"], "format": "json"}]': finished
Feb 23 10:02:38 np0005626463.localdomain ceph-mon[294160]: pgmap v482: 177 pgs: 177 active+clean; 197 MiB data, 1000 MiB used, 41 GiB / 42 GiB avail; 37 KiB/s rd, 100 KiB/s wr, 62 op/s
Feb 23 10:02:38 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.eve47", "format": "json"} : dispatch
Feb 23 10:02:38 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.eve47", "caps": ["mds", "allow rw path=/volumes/_nogroup/ff275bf0-7ab0-4200-9dfe-d972931f7856/b70c063c-69eb-4a53-8ff2-f992c2e4cf5a", "osd", "allow rw pool=manila_data namespace=fsvolumens_ff275bf0-7ab0-4200-9dfe-d972931f7856", "mon", "allow r"], "format": "json"} : dispatch
Feb 23 10:02:38 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.eve47", "caps": ["mds", "allow rw path=/volumes/_nogroup/ff275bf0-7ab0-4200-9dfe-d972931f7856/b70c063c-69eb-4a53-8ff2-f992c2e4cf5a", "osd", "allow rw pool=manila_data namespace=fsvolumens_ff275bf0-7ab0-4200-9dfe-d972931f7856", "mon", "allow r"], "format": "json"} : dispatch
Feb 23 10:02:38 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.eve47", "caps": ["mds", "allow rw path=/volumes/_nogroup/ff275bf0-7ab0-4200-9dfe-d972931f7856/b70c063c-69eb-4a53-8ff2-f992c2e4cf5a", "osd", "allow rw pool=manila_data namespace=fsvolumens_ff275bf0-7ab0-4200-9dfe-d972931f7856", "mon", "allow r"], "format": "json"}]': finished
Feb 23 10:02:38 np0005626463.localdomain systemd[1]: tmp-crun.mWt77a.mount: Deactivated successfully.
Feb 23 10:02:38 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:02:38.934 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:02:39 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 10:02:39.108 265541 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-23T10:02:36Z, description=, device_id=2025587f-5897-4d24-b841-a40c65f335b2, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f28294dbfd0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f28293a2310>], id=799df518-da07-48da-9a80-2682e9892d43, ip_allocation=immediate, mac_address=fa:16:3e:1e:ee:b3, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-23T10:02:34Z, description=, dns_domain=, id=d54c9f86-28a1-4f1b-8617-dc63ba0e4fee, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersIpV6Test-1408104542, port_security_enabled=True, project_id=68a48b471ed84048aeb651374fff5111, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=189, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=3119, status=ACTIVE, subnets=['43370c10-c3d6-4ec9-9149-8ef541cda21c'], tags=[], tenant_id=68a48b471ed84048aeb651374fff5111, updated_at=2026-02-23T10:02:35Z, vlan_transparent=None, network_id=d54c9f86-28a1-4f1b-8617-dc63ba0e4fee, port_security_enabled=False, project_id=68a48b471ed84048aeb651374fff5111, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3134, status=DOWN, tags=[], tenant_id=68a48b471ed84048aeb651374fff5111, updated_at=2026-02-23T10:02:36Z on network d54c9f86-28a1-4f1b-8617-dc63ba0e4fee
Feb 23 10:02:39 np0005626463.localdomain dnsmasq[320925]: read /var/lib/neutron/dhcp/d54c9f86-28a1-4f1b-8617-dc63ba0e4fee/addn_hosts - 1 addresses
Feb 23 10:02:39 np0005626463.localdomain dnsmasq-dhcp[320925]: read /var/lib/neutron/dhcp/d54c9f86-28a1-4f1b-8617-dc63ba0e4fee/host
Feb 23 10:02:39 np0005626463.localdomain podman[320983]: 2026-02-23 10:02:39.305286472 +0000 UTC m=+0.046207783 container kill dfad4dd845bc7ed39e88d033b62f05ceb63aa1d449b0ae6927f1ad41ecc2da1b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d54c9f86-28a1-4f1b-8617-dc63ba0e4fee, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Feb 23 10:02:39 np0005626463.localdomain dnsmasq-dhcp[320925]: read /var/lib/neutron/dhcp/d54c9f86-28a1-4f1b-8617-dc63ba0e4fee/opts
Feb 23 10:02:39 np0005626463.localdomain systemd[1]: tmp-crun.g8vMed.mount: Deactivated successfully.
Feb 23 10:02:39 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "ff275bf0-7ab0-4200-9dfe-d972931f7856", "auth_id": "eve47", "tenant_id": "f47d5caa97d244edb5aef31a3870507a", "access_level": "rw", "format": "json"}]: dispatch
Feb 23 10:02:39 np0005626463.localdomain podman[242954]: time="2026-02-23T10:02:39Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 23 10:02:39 np0005626463.localdomain podman[242954]: @ - - [23/Feb/2026:10:02:39 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 160717 "" "Go-http-client/1.1"
Feb 23 10:02:39 np0005626463.localdomain podman[242954]: @ - - [23/Feb/2026:10:02:39 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19781 "" "Go-http-client/1.1"
Feb 23 10:02:39 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 10:02:39.553 265541 INFO neutron.agent.dhcp.agent [None req-094e1568-2d91-4ecc-8884-503d20b890be - - - - - -] DHCP configuration for ports {'799df518-da07-48da-9a80-2682e9892d43'} is completed
Feb 23 10:02:40 np0005626463.localdomain ceph-mon[294160]: pgmap v483: 177 pgs: 177 active+clean; 197 MiB data, 1000 MiB used, 41 GiB / 42 GiB avail; 32 KiB/s rd, 85 KiB/s wr, 52 op/s
Feb 23 10:02:41 np0005626463.localdomain ceph-mon[294160]: pgmap v484: 177 pgs: 177 active+clean; 197 MiB data, 1001 MiB used, 41 GiB / 42 GiB avail; 31 KiB/s rd, 96 KiB/s wr, 53 op/s
Feb 23 10:02:41 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.eve47"} v 0)
Feb 23 10:02:41 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.eve47"} : dispatch
Feb 23 10:02:41 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth rm", "entity": "client.eve47"}]': finished
Feb 23 10:02:41 np0005626463.localdomain ceph-mds[286877]: mds.mds.np0005626463.qcthuc asok_command: session evict {filters=[auth_name=eve47,client_metadata.root=/volumes/_nogroup/ff275bf0-7ab0-4200-9dfe-d972931f7856/b70c063c-69eb-4a53-8ff2-f992c2e4cf5a],prefix=session evict} (starting...)
Feb 23 10:02:42 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "ff275bf0-7ab0-4200-9dfe-d972931f7856", "auth_id": "eve47", "format": "json"}]: dispatch
Feb 23 10:02:42 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.eve47", "format": "json"} : dispatch
Feb 23 10:02:42 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.eve47"} : dispatch
Feb 23 10:02:42 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.eve47"} : dispatch
Feb 23 10:02:42 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth rm", "entity": "client.eve47"}]': finished
Feb 23 10:02:42 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "ff275bf0-7ab0-4200-9dfe-d972931f7856", "auth_id": "eve47", "format": "json"}]: dispatch
Feb 23 10:02:42 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e219 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 23 10:02:42 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:02:42.896 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:02:43 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:02:43.129 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:02:43 np0005626463.localdomain openstack_network_exporter[245358]: ERROR   10:02:43 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 23 10:02:43 np0005626463.localdomain openstack_network_exporter[245358]: 
Feb 23 10:02:43 np0005626463.localdomain openstack_network_exporter[245358]: ERROR   10:02:43 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 23 10:02:43 np0005626463.localdomain openstack_network_exporter[245358]: 
Feb 23 10:02:43 np0005626463.localdomain ceph-mon[294160]: pgmap v485: 177 pgs: 177 active+clean; 197 MiB data, 1001 MiB used, 41 GiB / 42 GiB avail; 12 KiB/s rd, 44 KiB/s wr, 23 op/s
Feb 23 10:02:43 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:02:43.970 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:02:45 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.
Feb 23 10:02:45 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.
Feb 23 10:02:45 np0005626463.localdomain podman[321004]: 2026-02-23 10:02:45.909008777 +0000 UTC m=+0.081897383 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, org.label-schema.build-date=20260216, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true)
Feb 23 10:02:45 np0005626463.localdomain podman[321004]: 2026-02-23 10:02:45.99318957 +0000 UTC m=+0.166078136 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260216, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Feb 23 10:02:46 np0005626463.localdomain systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully.
Feb 23 10:02:46 np0005626463.localdomain podman[321005]: 2026-02-23 10:02:45.997492525 +0000 UTC m=+0.164267149 container health_status bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Feb 23 10:02:46 np0005626463.localdomain podman[321005]: 2026-02-23 10:02:46.080374438 +0000 UTC m=+0.247148992 container exec_died bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Feb 23 10:02:46 np0005626463.localdomain systemd[1]: bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.service: Deactivated successfully.
Feb 23 10:02:46 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:02:46.120 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:02:46 np0005626463.localdomain ceph-mon[294160]: pgmap v486: 177 pgs: 177 active+clean; 197 MiB data, 1001 MiB used, 41 GiB / 42 GiB avail; 2.4 KiB/s rd, 43 KiB/s wr, 9 op/s
Feb 23 10:02:46 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.eve49"} v 0)
Feb 23 10:02:46 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.eve49"} : dispatch
Feb 23 10:02:46 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth rm", "entity": "client.eve49"}]': finished
Feb 23 10:02:46 np0005626463.localdomain ceph-mds[286877]: mds.mds.np0005626463.qcthuc asok_command: session evict {filters=[auth_name=eve49,client_metadata.root=/volumes/_nogroup/ff275bf0-7ab0-4200-9dfe-d972931f7856/b70c063c-69eb-4a53-8ff2-f992c2e4cf5a],prefix=session evict} (starting...)
Feb 23 10:02:47 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "ff275bf0-7ab0-4200-9dfe-d972931f7856", "auth_id": "eve49", "format": "json"}]: dispatch
Feb 23 10:02:47 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.eve49", "format": "json"} : dispatch
Feb 23 10:02:47 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.eve49"} : dispatch
Feb 23 10:02:47 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.eve49"} : dispatch
Feb 23 10:02:47 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth rm", "entity": "client.eve49"}]': finished
Feb 23 10:02:47 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e219 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 23 10:02:47 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T10:02:47Z|00347|binding|INFO|Releasing lport 4143c8ea-7577-4792-9744-bcff90eb20f2 from this chassis (sb_readonly=0)
Feb 23 10:02:47 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:02:47.696 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:02:48 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "ff275bf0-7ab0-4200-9dfe-d972931f7856", "auth_id": "eve49", "format": "json"}]: dispatch
Feb 23 10:02:48 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "ff275bf0-7ab0-4200-9dfe-d972931f7856", "format": "json"}]: dispatch
Feb 23 10:02:48 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "ff275bf0-7ab0-4200-9dfe-d972931f7856", "force": true, "format": "json"}]: dispatch
Feb 23 10:02:48 np0005626463.localdomain ceph-mon[294160]: pgmap v487: 177 pgs: 177 active+clean; 197 MiB data, 1001 MiB used, 41 GiB / 42 GiB avail; 2.4 KiB/s rd, 56 KiB/s wr, 11 op/s
Feb 23 10:02:48 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Feb 23 10:02:48 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 10:02:48 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e219 do_prune osdmap full prune enabled
Feb 23 10:02:48 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e220 e220: 6 total, 6 up, 6 in
Feb 23 10:02:48 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e220: 6 total, 6 up, 6 in
Feb 23 10:02:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 10:02:48.561 163572 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 10:02:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 10:02:48.562 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 10:02:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 10:02:48.562 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 10:02:48 np0005626463.localdomain dnsmasq[320925]: read /var/lib/neutron/dhcp/d54c9f86-28a1-4f1b-8617-dc63ba0e4fee/addn_hosts - 0 addresses
Feb 23 10:02:48 np0005626463.localdomain dnsmasq-dhcp[320925]: read /var/lib/neutron/dhcp/d54c9f86-28a1-4f1b-8617-dc63ba0e4fee/host
Feb 23 10:02:48 np0005626463.localdomain systemd[1]: tmp-crun.bYJ5YE.mount: Deactivated successfully.
Feb 23 10:02:48 np0005626463.localdomain dnsmasq-dhcp[320925]: read /var/lib/neutron/dhcp/d54c9f86-28a1-4f1b-8617-dc63ba0e4fee/opts
Feb 23 10:02:48 np0005626463.localdomain podman[321070]: 2026-02-23 10:02:48.752692315 +0000 UTC m=+0.073328593 container kill dfad4dd845bc7ed39e88d033b62f05ceb63aa1d449b0ae6927f1ad41ecc2da1b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d54c9f86-28a1-4f1b-8617-dc63ba0e4fee, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 23 10:02:48 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T10:02:48Z|00348|binding|INFO|Releasing lport 6ea52883-1131-4fad-9e9e-b739d54ff0bf from this chassis (sb_readonly=0)
Feb 23 10:02:48 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:02:48.935 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:02:48 np0005626463.localdomain kernel: device tap6ea52883-11 left promiscuous mode
Feb 23 10:02:48 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T10:02:48Z|00349|binding|INFO|Setting lport 6ea52883-1131-4fad-9e9e-b739d54ff0bf down in Southbound
Feb 23 10:02:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 10:02:48.948 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005626463.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:2::2/64', 'neutron:device_id': 'dhcpfb23302c-55c1-5de0-badf-4fc1ff22837a-d54c9f86-28a1-4f1b-8617-dc63ba0e4fee', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d54c9f86-28a1-4f1b-8617-dc63ba0e4fee', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '68a48b471ed84048aeb651374fff5111', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005626463.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0ec4f16f-35f7-432c-b2c5-c12ffa9973a2, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f808c075610>], logical_port=6ea52883-1131-4fad-9e9e-b739d54ff0bf) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f808c075610>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 10:02:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 10:02:48.950 163572 INFO neutron.agent.ovn.metadata.agent [-] Port 6ea52883-1131-4fad-9e9e-b739d54ff0bf in datapath d54c9f86-28a1-4f1b-8617-dc63ba0e4fee unbound from our chassis
Feb 23 10:02:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 10:02:48.952 163572 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network d54c9f86-28a1-4f1b-8617-dc63ba0e4fee or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Feb 23 10:02:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 10:02:48.953 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[642d2195-6ec0-4b74-bfd4-9d3d02c05fdc]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 10:02:48 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:02:48.956 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:02:48 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:02:48.957 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:02:48 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:02:48.972 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:02:49 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "a87f3747-06f6-4188-82ee-060b8ce9fc02", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 23 10:02:49 np0005626463.localdomain ceph-mon[294160]: from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 10:02:49 np0005626463.localdomain ceph-mon[294160]: osdmap e220: 6 total, 6 up, 6 in
Feb 23 10:02:49 np0005626463.localdomain dnsmasq[320925]: exiting on receipt of SIGTERM
Feb 23 10:02:49 np0005626463.localdomain podman[321111]: 2026-02-23 10:02:49.636651026 +0000 UTC m=+0.059323894 container kill dfad4dd845bc7ed39e88d033b62f05ceb63aa1d449b0ae6927f1ad41ecc2da1b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d54c9f86-28a1-4f1b-8617-dc63ba0e4fee, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, io.buildah.version=1.43.0)
Feb 23 10:02:49 np0005626463.localdomain systemd[1]: libpod-dfad4dd845bc7ed39e88d033b62f05ceb63aa1d449b0ae6927f1ad41ecc2da1b.scope: Deactivated successfully.
Feb 23 10:02:49 np0005626463.localdomain podman[321123]: 2026-02-23 10:02:49.696580129 +0000 UTC m=+0.049863887 container died dfad4dd845bc7ed39e88d033b62f05ceb63aa1d449b0ae6927f1ad41ecc2da1b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d54c9f86-28a1-4f1b-8617-dc63ba0e4fee, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, org.label-schema.build-date=20260216, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Feb 23 10:02:49 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-dfad4dd845bc7ed39e88d033b62f05ceb63aa1d449b0ae6927f1ad41ecc2da1b-userdata-shm.mount: Deactivated successfully.
Feb 23 10:02:49 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-a7bb5f63af1888222c2e9d63660f7e53fc4d9a2bdaccc92e1e2b415de8d1e37c-merged.mount: Deactivated successfully.
Feb 23 10:02:49 np0005626463.localdomain podman[321123]: 2026-02-23 10:02:49.780921588 +0000 UTC m=+0.134205306 container cleanup dfad4dd845bc7ed39e88d033b62f05ceb63aa1d449b0ae6927f1ad41ecc2da1b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d54c9f86-28a1-4f1b-8617-dc63ba0e4fee, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0)
Feb 23 10:02:49 np0005626463.localdomain systemd[1]: libpod-conmon-dfad4dd845bc7ed39e88d033b62f05ceb63aa1d449b0ae6927f1ad41ecc2da1b.scope: Deactivated successfully.
Feb 23 10:02:49 np0005626463.localdomain podman[321130]: 2026-02-23 10:02:49.807789552 +0000 UTC m=+0.151102607 container remove dfad4dd845bc7ed39e88d033b62f05ceb63aa1d449b0ae6927f1ad41ecc2da1b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d54c9f86-28a1-4f1b-8617-dc63ba0e4fee, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.43.0)
Feb 23 10:02:49 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 10:02:49.831 265541 INFO neutron.agent.dhcp.agent [None req-7423510d-ec8a-4e02-ac6f-9b7ca928647d - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 23 10:02:49 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 10:02:49.890 265541 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Feb 23 10:02:50 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T10:02:50Z|00350|binding|INFO|Releasing lport 4143c8ea-7577-4792-9744-bcff90eb20f2 from this chassis (sb_readonly=0)
Feb 23 10:02:50 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:02:50.221 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:02:50 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "a87f3747-06f6-4188-82ee-060b8ce9fc02", "format": "json"}]: dispatch
Feb 23 10:02:50 np0005626463.localdomain ceph-mon[294160]: pgmap v489: 177 pgs: 177 active+clean; 197 MiB data, 1001 MiB used, 41 GiB / 42 GiB avail; 102 B/s rd, 32 KiB/s wr, 4 op/s
Feb 23 10:02:50 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Feb 23 10:02:50 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 10:02:50 np0005626463.localdomain systemd[1]: run-netns-qdhcp\x2dd54c9f86\x2d28a1\x2d4f1b\x2d8617\x2ddc63ba0e4fee.mount: Deactivated successfully.
Feb 23 10:02:50 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Feb 23 10:02:50 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 10:02:51 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "20902d73-7434-438f-9b7e-d3fbd0c8aa20", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 23 10:02:51 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "20902d73-7434-438f-9b7e-d3fbd0c8aa20", "format": "json"}]: dispatch
Feb 23 10:02:51 np0005626463.localdomain ceph-mon[294160]: from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 10:02:51 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "9a6060c0-ab8f-4a4f-a7a8-36c764fc6c11", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 23 10:02:51 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "9a6060c0-ab8f-4a4f-a7a8-36c764fc6c11", "format": "json"}]: dispatch
Feb 23 10:02:51 np0005626463.localdomain ceph-mon[294160]: from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 10:02:51 np0005626463.localdomain ceph-mon[294160]: pgmap v490: 177 pgs: 177 active+clean; 197 MiB data, 1002 MiB used, 41 GiB / 42 GiB avail; 6.0 KiB/s rd, 57 KiB/s wr, 15 op/s
Feb 23 10:02:51 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.
Feb 23 10:02:51 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} v 0)
Feb 23 10:02:51 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} : dispatch
Feb 23 10:02:51 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"}]': finished
Feb 23 10:02:51 np0005626463.localdomain podman[321153]: 2026-02-23 10:02:51.922768695 +0000 UTC m=+0.095993327 container health_status be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260216, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team)
Feb 23 10:02:51 np0005626463.localdomain podman[321153]: 2026-02-23 10:02:51.961786269 +0000 UTC m=+0.135010861 container exec_died be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=ceilometer_agent_compute, io.buildah.version=1.43.0, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260216)
Feb 23 10:02:51 np0005626463.localdomain systemd[1]: be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.service: Deactivated successfully.
Feb 23 10:02:52 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T10:02:52Z|00351|binding|INFO|Releasing lport 4143c8ea-7577-4792-9744-bcff90eb20f2 from this chassis (sb_readonly=0)
Feb 23 10:02:52 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:02:52.072 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:02:52 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e220 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 23 10:02:52 np0005626463.localdomain podman[321190]: 2026-02-23 10:02:52.638811952 +0000 UTC m=+0.061462891 container kill fc50d34ac72d7e730c365dd0ac6cb0b580f34277c1783bdebb358ec1942d59e0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-98deef06-e9d3-4399-8238-57fb5d318b61, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Feb 23 10:02:52 np0005626463.localdomain dnsmasq[320707]: read /var/lib/neutron/dhcp/98deef06-e9d3-4399-8238-57fb5d318b61/addn_hosts - 0 addresses
Feb 23 10:02:52 np0005626463.localdomain dnsmasq-dhcp[320707]: read /var/lib/neutron/dhcp/98deef06-e9d3-4399-8238-57fb5d318b61/host
Feb 23 10:02:52 np0005626463.localdomain dnsmasq-dhcp[320707]: read /var/lib/neutron/dhcp/98deef06-e9d3-4399-8238-57fb5d318b61/opts
Feb 23 10:02:52 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "a87f3747-06f6-4188-82ee-060b8ce9fc02", "auth_id": "alice", "tenant_id": "b8a78bca43aa415e9b740fe00d08afee", "access_level": "rw", "format": "json"}]: dispatch
Feb 23 10:02:52 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Feb 23 10:02:52 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} : dispatch
Feb 23 10:02:52 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} : dispatch
Feb 23 10:02:52 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"}]': finished
Feb 23 10:02:52 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T10:02:52Z|00352|binding|INFO|Releasing lport 66dde4f3-68ac-48ac-be4a-678a62b364e3 from this chassis (sb_readonly=0)
Feb 23 10:02:52 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:02:52.807 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:02:52 np0005626463.localdomain kernel: device tap66dde4f3-68 left promiscuous mode
Feb 23 10:02:52 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T10:02:52Z|00353|binding|INFO|Setting lport 66dde4f3-68ac-48ac-be4a-678a62b364e3 down in Southbound
Feb 23 10:02:52 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 10:02:52.820 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005626463.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpfb23302c-55c1-5de0-badf-4fc1ff22837a-98deef06-e9d3-4399-8238-57fb5d318b61', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-98deef06-e9d3-4399-8238-57fb5d318b61', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '68a48b471ed84048aeb651374fff5111', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005626463.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d7d058c2-7406-4067-bd55-39030093b520, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f808c075610>], logical_port=66dde4f3-68ac-48ac-be4a-678a62b364e3) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f808c075610>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 10:02:52 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 10:02:52.822 163572 INFO neutron.agent.ovn.metadata.agent [-] Port 66dde4f3-68ac-48ac-be4a-678a62b364e3 in datapath 98deef06-e9d3-4399-8238-57fb5d318b61 unbound from our chassis
Feb 23 10:02:52 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 10:02:52.823 163572 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 98deef06-e9d3-4399-8238-57fb5d318b61 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Feb 23 10:02:52 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 10:02:52.824 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[f2a5ed11-65f3-425c-a855-85a93d657b86]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 10:02:52 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:02:52.828 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:02:53 np0005626463.localdomain podman[321229]: 2026-02-23 10:02:53.096390862 +0000 UTC m=+0.059649824 container kill fc50d34ac72d7e730c365dd0ac6cb0b580f34277c1783bdebb358ec1942d59e0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-98deef06-e9d3-4399-8238-57fb5d318b61, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true)
Feb 23 10:02:53 np0005626463.localdomain dnsmasq[320707]: exiting on receipt of SIGTERM
Feb 23 10:02:53 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.
Feb 23 10:02:53 np0005626463.localdomain systemd[1]: libpod-fc50d34ac72d7e730c365dd0ac6cb0b580f34277c1783bdebb358ec1942d59e0.scope: Deactivated successfully.
Feb 23 10:02:53 np0005626463.localdomain podman[321245]: 2026-02-23 10:02:53.175078534 +0000 UTC m=+0.058796607 container died fc50d34ac72d7e730c365dd0ac6cb0b580f34277c1783bdebb358ec1942d59e0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-98deef06-e9d3-4399-8238-57fb5d318b61, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Feb 23 10:02:53 np0005626463.localdomain podman[321245]: 2026-02-23 10:02:53.25709587 +0000 UTC m=+0.140813943 container cleanup fc50d34ac72d7e730c365dd0ac6cb0b580f34277c1783bdebb358ec1942d59e0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-98deef06-e9d3-4399-8238-57fb5d318b61, org.label-schema.build-date=20260216, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Feb 23 10:02:53 np0005626463.localdomain systemd[1]: libpod-conmon-fc50d34ac72d7e730c365dd0ac6cb0b580f34277c1783bdebb358ec1942d59e0.scope: Deactivated successfully.
Feb 23 10:02:53 np0005626463.localdomain podman[321244]: 2026-02-23 10:02:53.279531315 +0000 UTC m=+0.162106013 container remove fc50d34ac72d7e730c365dd0ac6cb0b580f34277c1783bdebb358ec1942d59e0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-98deef06-e9d3-4399-8238-57fb5d318b61, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2)
Feb 23 10:02:53 np0005626463.localdomain podman[321256]: 2026-02-23 10:02:53.233268691 +0000 UTC m=+0.103128749 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, tcib_managed=true)
Feb 23 10:02:53 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 10:02:53.301 265541 INFO neutron.agent.dhcp.agent [None req-1ee41c73-7427-4721-afaa-d17654462885 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 23 10:02:53 np0005626463.localdomain podman[321256]: 2026-02-23 10:02:53.315224456 +0000 UTC m=+0.185084504 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Feb 23 10:02:53 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 10:02:53.316 265541 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Feb 23 10:02:53 np0005626463.localdomain systemd[1]: 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully.
Feb 23 10:02:53 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T10:02:53Z|00354|binding|INFO|Releasing lport 4143c8ea-7577-4792-9744-bcff90eb20f2 from this chassis (sb_readonly=0)
Feb 23 10:02:53 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:02:53.570 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:02:53 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e220 do_prune osdmap full prune enabled
Feb 23 10:02:53 np0005626463.localdomain ceph-mon[294160]: pgmap v491: 177 pgs: 177 active+clean; 197 MiB data, 1002 MiB used, 41 GiB / 42 GiB avail; 6.5 KiB/s rd, 56 KiB/s wr, 15 op/s
Feb 23 10:02:53 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e221 e221: 6 total, 6 up, 6 in
Feb 23 10:02:53 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e221: 6 total, 6 up, 6 in
Feb 23 10:02:53 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:02:53.975 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:02:54 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-9690d5c0a30798f94ac43c813648f7fb8820202cf10f3cc9168198333355fba5-merged.mount: Deactivated successfully.
Feb 23 10:02:54 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-fc50d34ac72d7e730c365dd0ac6cb0b580f34277c1783bdebb358ec1942d59e0-userdata-shm.mount: Deactivated successfully.
Feb 23 10:02:54 np0005626463.localdomain systemd[1]: run-netns-qdhcp\x2d98deef06\x2de9d3\x2d4399\x2d8238\x2d57fb5d318b61.mount: Deactivated successfully.
Feb 23 10:02:54 np0005626463.localdomain ceph-mon[294160]: osdmap e221: 6 total, 6 up, 6 in
Feb 23 10:02:54 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "20902d73-7434-438f-9b7e-d3fbd0c8aa20", "snap_name": "a9baadce-a22e-41ca-bf91-5533058fa60f", "format": "json"}]: dispatch
Feb 23 10:02:55 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice"} v 0)
Feb 23 10:02:55 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Feb 23 10:02:55 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished
Feb 23 10:02:55 np0005626463.localdomain ceph-mds[286877]: mds.mds.np0005626463.qcthuc asok_command: session evict {filters=[auth_name=alice,client_metadata.root=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae],prefix=session evict} (starting...)
Feb 23 10:02:55 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 10:02:55.327 265541 INFO neutron.agent.linux.ip_lib [None req-b28e6ae9-386e-4e32-8d1d-2e2e7f5f33ab - - - - - -] Device tap76d8ca35-ba cannot be used as it has no MAC address
Feb 23 10:02:55 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:02:55.400 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:02:55 np0005626463.localdomain kernel: device tap76d8ca35-ba entered promiscuous mode
Feb 23 10:02:55 np0005626463.localdomain NetworkManager[5974]: <info>  [1771840975.4076] manager: (tap76d8ca35-ba): new Generic device (/org/freedesktop/NetworkManager/Devices/56)
Feb 23 10:02:55 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:02:55.411 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:02:55 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T10:02:55Z|00355|binding|INFO|Claiming lport 76d8ca35-bae5-4d5e-827a-3f91e4067a0d for this chassis.
Feb 23 10:02:55 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T10:02:55Z|00356|binding|INFO|76d8ca35-bae5-4d5e-827a-3f91e4067a0d: Claiming unknown
Feb 23 10:02:55 np0005626463.localdomain systemd-udevd[321301]: Network interface NamePolicy= disabled on kernel command line.
Feb 23 10:02:55 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 10:02:55.420 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005626463.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpfb23302c-55c1-5de0-badf-4fc1ff22837a-255d578f-65f8-4643-b21d-1ec8d68e886d', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-255d578f-65f8-4643-b21d-1ec8d68e886d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a24474b213514491beaa97b54bfd695f', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=40e0ff5c-1777-4eef-b7ef-c7cac136f27e, chassis=[<ovs.db.idl.Row object at 0x7f808c075610>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f808c075610>], logical_port=76d8ca35-bae5-4d5e-827a-3f91e4067a0d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 10:02:55 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 10:02:55.422 163572 INFO neutron.agent.ovn.metadata.agent [-] Port 76d8ca35-bae5-4d5e-827a-3f91e4067a0d in datapath 255d578f-65f8-4643-b21d-1ec8d68e886d bound to our chassis
Feb 23 10:02:55 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 10:02:55.424 163572 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 255d578f-65f8-4643-b21d-1ec8d68e886d or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Feb 23 10:02:55 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Feb 23 10:02:55 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 10:02:55 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 10:02:55.425 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[b58cd471-011e-478d-9fe7-8c391fef9a92]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 10:02:55 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T10:02:55Z|00357|binding|INFO|Setting lport 76d8ca35-bae5-4d5e-827a-3f91e4067a0d ovn-installed in OVS
Feb 23 10:02:55 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T10:02:55Z|00358|binding|INFO|Setting lport 76d8ca35-bae5-4d5e-827a-3f91e4067a0d up in Southbound
Feb 23 10:02:55 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:02:55.457 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:02:55 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:02:55.497 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:02:55 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:02:55.525 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:02:55 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "9a6060c0-ab8f-4a4f-a7a8-36c764fc6c11", "format": "json"}]: dispatch
Feb 23 10:02:55 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "9a6060c0-ab8f-4a4f-a7a8-36c764fc6c11", "force": true, "format": "json"}]: dispatch
Feb 23 10:02:55 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "a87f3747-06f6-4188-82ee-060b8ce9fc02", "auth_id": "alice", "format": "json"}]: dispatch
Feb 23 10:02:55 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Feb 23 10:02:55 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Feb 23 10:02:55 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Feb 23 10:02:55 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished
Feb 23 10:02:55 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "a87f3747-06f6-4188-82ee-060b8ce9fc02", "auth_id": "alice", "format": "json"}]: dispatch
Feb 23 10:02:55 np0005626463.localdomain ceph-mon[294160]: pgmap v493: 177 pgs: 177 active+clean; 197 MiB data, 1002 MiB used, 41 GiB / 42 GiB avail; 8.1 KiB/s rd, 51 KiB/s wr, 17 op/s
Feb 23 10:02:55 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "c65f4d3c-90b6-4215-892b-9d6eb1a375b1", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 23 10:02:55 np0005626463.localdomain ceph-mon[294160]: from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.145 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'name': 'test', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000003', 'OS-EXT-SRV-ATTR:host': 'np0005626463.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '37b8098efb0d4ecc90b451a2db0e966f', 'user_id': 'cb6895487918456aa599ca2f76872d00', 'hostId': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.147 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.176 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.bytes volume: 397312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.177 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.180 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7cdf7cb8-a976-4e03-94f5-494e706a1b4b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 397312, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T10:02:56.148214', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'd3571dd4-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12416.337932157, 'message_signature': 'a0cd51b5e4954fa65f7d95ee08514dbbc9b78f09af51e68e7f694de73806d46c'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T10:02:56.148214', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'd3573ba2-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12416.337932157, 'message_signature': '6acc43618268c804f827c5fa4826f895ed11d32d8b5fd1716146a78541120942'}]}, 'timestamp': '2026-02-23 10:02:56.178529', '_unique_id': 'c9cfea9bff464815ae1e553d92844893'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.180 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.180 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.180 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.180 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.180 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.180 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.180 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.180 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.180 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.180 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.180 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.180 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.180 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.180 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.180 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.180 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.180 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.180 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.180 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.180 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.180 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.180 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.180 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.180 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.180 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.180 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.180 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.180 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.180 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.180 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.180 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.185 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.188 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.190 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '43cbea76-8aa7-41fb-9e42-df147f6c755f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T10:02:56.185449', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': 'd358dca0-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12416.374995031, 'message_signature': 'ad8cf752041775708432b4a9f6356e7e30257c5154a6f608ab036541c5366d8c'}]}, 'timestamp': '2026-02-23 10:02:56.189246', '_unique_id': '508721252e65430fa5fa85508fefbd36'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.190 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.190 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.190 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.190 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.190 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.190 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.190 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.190 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.190 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.190 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.190 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.190 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.190 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.190 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.190 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.190 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.190 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.190 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.190 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.190 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.190 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.190 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.190 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.190 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.190 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.190 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.190 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.190 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.190 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.190 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.190 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.191 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.191 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.192 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.bytes volume: 6808 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.193 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '14fe9efe-8cec-4dba-9809-a3ea031b8853', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6808, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T10:02:56.192110', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': 'd359613e-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12416.374995031, 'message_signature': 'ca2ebc0f119fc0d09f925ed09a2a28492c2e26cd9e2959a398dfcd7f2d1c04ba'}]}, 'timestamp': '2026-02-23 10:02:56.192573', '_unique_id': 'fe11d7b5a819423c8b1f17ad8e182e27'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.193 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.193 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.193 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.193 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.193 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.193 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.193 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.193 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.193 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.193 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.193 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.193 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.193 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.193 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.193 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.193 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.193 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.193 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.193 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.193 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.193 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.193 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.193 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.193 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.193 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.193 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.193 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.193 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.193 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.193 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.193 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.194 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.194 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.194 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.196 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2a14eddc-3143-45eb-b4d7-402591b39b06', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T10:02:56.194943', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': 'd359d074-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12416.374995031, 'message_signature': '31b9eb376ba0cbcbbc81d3c34497f3cd784a1c629bd16f7a8eec4a95428e20b4'}]}, 'timestamp': '2026-02-23 10:02:56.195418', '_unique_id': 'b033fb7ab651476095445ca9676e1a50'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.196 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.196 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.196 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.196 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.196 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.196 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.196 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.196 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.196 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.196 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.196 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.196 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.196 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.196 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.196 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.196 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.196 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.196 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.196 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.196 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.196 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.196 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.196 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.196 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.196 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.196 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.196 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.196 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.196 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.196 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.196 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.197 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.197 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.199 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a10e779f-40f9-4e46-a176-11c5f44176a2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T10:02:56.197481', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': 'd35a3294-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12416.374995031, 'message_signature': '76515aa2fee1d97193b5d2dd6579aeac4a694f26d7dd23e8ac3773be3b235e44'}]}, 'timestamp': '2026-02-23 10:02:56.197957', '_unique_id': '10403af646924f91b45751c7ffba1a77'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.199 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.199 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.199 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.199 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.199 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.199 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.199 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.199 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.199 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.199 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.199 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.199 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.199 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.199 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.199 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.199 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.199 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.199 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.199 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.199 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.199 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.199 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.199 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.199 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.199 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.199 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.199 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.199 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.199 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.199 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.199 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.200 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.212 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.213 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.214 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'efcbf453-010f-4e90-86a8-56fadc6755ae', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T10:02:56.200415', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'd35c7e46-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12416.38991283, 'message_signature': 'd8fc60cea31083e833ab74b995f1a1de7984070c2e1d4f8aed6a54bb06047e58'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T10:02:56.200415', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'd35c9408-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12416.38991283, 'message_signature': 'da3c13a1c5e61f01dfe43b3c3e60431c8440a41287851a95970dbebed6aa52e1'}]}, 'timestamp': '2026-02-23 10:02:56.213533', '_unique_id': '955ea825f13743c984abc9a1a5c2fb78'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.214 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.214 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.214 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.214 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.214 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.214 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.214 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.214 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.214 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.214 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.214 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.214 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.214 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.214 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.214 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.214 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.214 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.214 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.214 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.214 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.214 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.214 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.214 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.214 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.214 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.214 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.214 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.214 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.214 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.214 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.214 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.216 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.216 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.216 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.218 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fc7a3ce4-7267-4093-af9c-d64bb5938198', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T10:02:56.216858', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': 'd35d2abc-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12416.374995031, 'message_signature': '2ae6fcf75e455b02fe7e3af6a600c905cc79d09cdc8fc3d1b3df6a7e6b3279c9'}]}, 'timestamp': '2026-02-23 10:02:56.217428', '_unique_id': '3d4aacfa14554a0aa0cf411899facf1a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.218 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.218 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.218 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.218 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.218 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.218 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.218 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.218 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.218 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.218 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.218 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.218 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.218 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.218 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.218 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.218 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.218 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.218 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.218 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.218 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.218 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.218 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.218 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.218 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.218 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.218 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.218 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.218 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.218 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.218 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.218 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.219 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.235 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/memory.usage volume: 51.72265625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.237 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b973a784-160e-4f9c-ae51-33fb7fc49351', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.72265625, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'timestamp': '2026-02-23T10:02:56.219695', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': 'd35ffb84-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12416.42464927, 'message_signature': 'ecb865c306313e265f96014b66b326e0f275b77db54565fc74839bdae8ca34a8'}]}, 'timestamp': '2026-02-23 10:02:56.235927', '_unique_id': '7e4a55183fc84cda826d07926fa75c59'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.237 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.237 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.237 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.237 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.237 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.237 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.237 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.237 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.237 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.237 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.237 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.237 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.237 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.237 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.237 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.237 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.237 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.237 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.237 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.237 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.237 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.237 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.237 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.237 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.237 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.237 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.237 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.237 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.237 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.237 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.237 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.238 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.238 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.bytes volume: 35597312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.238 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.240 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0ddde6fe-8cf3-42a6-8283-77b08fc9b52b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35597312, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T10:02:56.238390', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'd3607230-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12416.337932157, 'message_signature': '71de84f98e2d8abe39ca4046b60bf1e89338c655749661b9b2d0ab2cefaf5c7b'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 
'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T10:02:56.238390', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'd360854a-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12416.337932157, 'message_signature': '5ed6ab7fc13a55558124e060455e244dafe877584591848b2bb3decdb8b6fd17'}]}, 'timestamp': '2026-02-23 10:02:56.239369', '_unique_id': '1f4f22fd9794477db04d1d7094407f22'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.240 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.240 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.240 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.240 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.240 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.240 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.240 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.240 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.240 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.240 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.240 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.240 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.240 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.240 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.240 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.240 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.240 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.240 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.240 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.240 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.240 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.240 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.240 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.240 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.240 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.240 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.240 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.240 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.240 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.240 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.240 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.241 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.241 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.241 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/cpu volume: 15480000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.243 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0ac3b462-604a-46f3-ac76-1b5696358c8f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 15480000000, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'timestamp': '2026-02-23T10:02:56.241766', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': 'd360f5fc-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12416.42464927, 'message_signature': '572ec925621cd95ece6559dafea91e5ca47ae6749aa1878ff111e0f218f6fff5'}]}, 'timestamp': '2026-02-23 10:02:56.242241', '_unique_id': '27b95adbb76447ddb9c154021b80cc26'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.243 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.243 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.243 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.243 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.243 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.243 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.243 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.243 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.243 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.243 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.243 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.243 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.243 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.243 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.243 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.243 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.243 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.243 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.243 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.243 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.243 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.243 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.243 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.243 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.243 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.243 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.243 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.243 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.243 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.243 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.243 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.244 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.244 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.244 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.246 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'df1b5b4a-27aa-4521-8eff-05721a8aaf27', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T10:02:56.244349', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'd3615966-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12416.38991283, 'message_signature': '0ac2e4fab49bc1e4d78b7a0c3f153a1b6178511e97bdd5c72e5009dbf57cfd2a'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T10:02:56.244349', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'd3616a8c-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12416.38991283, 'message_signature': '1e1335aba2dc594545e8b1579adaa677bf90f996cb1da4b45c7071692237ac60'}]}, 'timestamp': '2026-02-23 10:02:56.245206', '_unique_id': '32d9ce3748e34727b984438d30c56aea'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.246 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.246 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.246 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.246 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.246 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.246 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.246 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.246 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.246 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.246 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.246 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.246 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.246 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.246 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.246 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.246 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.246 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.246 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.246 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.246 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.246 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.246 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.246 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.246 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.246 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.246 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.246 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.246 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.246 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.246 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.246 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.247 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.247 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.248 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '63cc1be6-9465-4d51-93c7-6525eea786c8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T10:02:56.247334', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': 'd361ce46-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12416.374995031, 'message_signature': 'bbeeca9a0b363e2785cd29fbdf531af0220d22cb7977e94b85a9cb73a7630422'}]}, 'timestamp': '2026-02-23 10:02:56.247787', '_unique_id': '79489717e6904685ae1a2b7e697f4f21'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.248 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.248 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.248 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.248 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.248 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.248 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.248 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.248 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.248 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.248 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.248 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.248 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.248 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.248 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.248 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.248 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.248 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.248 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.248 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.248 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.248 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.248 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.248 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.248 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.248 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.248 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.248 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.248 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.248 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.248 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.248 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.249 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.250 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.250 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.251 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8dcf3bc7-9eb6-4cb1-96f9-f07407ce3cb0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T10:02:56.250069', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'd3623a2a-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12416.38991283, 'message_signature': 'ea01cd254ade0b1af95735d91743ae085fcbcb15f5dfc25eb3896b11a8d2ac7f'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 
'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T10:02:56.250069', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'd3624ace-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12416.38991283, 'message_signature': 'cbaf51a6080dab767a54516cb0c568a0156fc835c10b8465f7de1fd236b1cf37'}]}, 'timestamp': '2026-02-23 10:02:56.251011', '_unique_id': '83e8109dc551400f94247b888a56b785'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.251 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.251 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.251 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.251 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.251 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.251 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.251 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.251 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.251 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.251 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.251 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.251 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.251 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.251 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.251 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.251 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.251 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.251 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.251 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.251 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.251 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.251 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.251 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.251 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.251 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.251 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.251 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.251 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.251 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.251 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.251 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.253 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.253 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.latency volume: 1054797520 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.253 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.latency volume: 21338362 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.255 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5d372c91-d7cd-4425-877e-058a41d346e4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1054797520, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T10:02:56.253261', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'd362b64e-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12416.337932157, 'message_signature': 'fa2ebf87d0d2b3803513d0cea1b589406daacc41a4e9e29f28213cbb35348cfd'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 21338362, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 
'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T10:02:56.253261', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'd362c7d8-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12416.337932157, 'message_signature': 'c8406a2536e3aa81afd8a03f3c49ecb6a5aa62c7d65cd1aa3e73bebfef152be5'}]}, 'timestamp': '2026-02-23 10:02:56.254254', '_unique_id': 'ab928cbf063f47d8a242f5ea9cbfa810'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.255 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.255 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.255 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.255 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.255 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.255 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.255 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.255 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.255 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.255 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.255 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.255 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.255 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.255 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.255 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.255 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.255 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.255 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.255 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.255 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.255 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.255 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.255 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.255 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.255 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.255 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.255 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.255 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.255 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.255 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.255 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.256 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.256 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.256 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.257 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd90809da-2060-4729-80a0-908a50c6d945', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T10:02:56.256083', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'd36320c0-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12416.337932157, 'message_signature': 'dff518270c1a8239c5a7a33510a7298c64abc8a57e6eec293cca92e745ddcc43'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 
'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T10:02:56.256083', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'd3632a98-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12416.337932157, 'message_signature': '1a4b468e0dcce1e3fcfe3e13084d1e00ee58a9519b1c50d63a83f4ee2856bb85'}]}, 'timestamp': '2026-02-23 10:02:56.256602', '_unique_id': 'e3893ca2419b4c4bb814ce9573fb5f7f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.257 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.257 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.257 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.257 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.257 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.257 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.257 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.257 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.257 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.257 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.257 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.257 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.257 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.257 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.257 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.257 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.257 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.257 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.257 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.257 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.257 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.257 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.257 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.257 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.257 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.257 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.257 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.257 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.257 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.257 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.257 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.257 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.258 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.258 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a93b2715-e474-4583-83ed-12f86decf3b7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T10:02:56.258071', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': 'd3636ec2-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12416.374995031, 'message_signature': 'cca4879a5c692d1855cf2d6868d461ccb0312b5c7b1e1ee576f52c1e7e84f334'}]}, 'timestamp': '2026-02-23 10:02:56.258366', '_unique_id': 'b8ad34c3a0aa4cbd9135afd52cea9905'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.258 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.258 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.258 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.258 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.258 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.258 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.258 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.258 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.258 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.258 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.258 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.258 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.258 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.258 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.258 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.258 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.258 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.258 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.258 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.258 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.258 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.258 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.258 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.258 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.258 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.258 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.258 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.258 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.258 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.258 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.258 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.259 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.259 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.latency volume: 1374424344 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.260 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.latency volume: 89322858 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.261 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd523ec54-bac3-45d2-8a06-84f5a7a7071b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1374424344, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T10:02:56.259684', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'd363b5c6-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12416.337932157, 'message_signature': '6d58e21173d4ba46775558db7988d7bef82b67e249f2574dafb99d514e2c00f3'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 89322858, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 
'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T10:02:56.259684', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'd363c12e-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12416.337932157, 'message_signature': 'ca07201e9789a2900b8d96575c1332636d96ea2d3c659a576b6bcc536969c389'}]}, 'timestamp': '2026-02-23 10:02:56.260459', '_unique_id': 'ff0ba07f924a45d9bc8c63a92e7ce8b5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.261 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.261 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.261 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.261 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.261 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.261 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.261 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.261 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.261 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.261 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.261 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.261 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.261 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.261 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.261 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.261 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.261 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.261 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.261 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.261 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.261 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.261 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.261 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.261 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.261 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.261 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.261 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.261 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.261 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.261 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.261 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.261 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.261 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.262 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f6df9049-00cb-4b43-9fd2-ecda04f9da4e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T10:02:56.261852', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': 'd3640328-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12416.374995031, 'message_signature': 'd164512031e3245861ed1b7746edbe7d92eb60797457e5e3d16cc4c32a03e508'}]}, 'timestamp': '2026-02-23 10:02:56.262166', '_unique_id': 'e30e6e11838c435783abf4b0658dfc86'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.262 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.262 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.262 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.262 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.262 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.262 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.262 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.262 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.262 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.262 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.262 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.262 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.262 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.262 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.262 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.262 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.262 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.262 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.262 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.262 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.262 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.262 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.262 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.262 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.262 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.262 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.262 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.262 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.262 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.262 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.262 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.263 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.263 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.264 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '121b8d96-e3e1-4fc1-9553-55368b3a7103', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T10:02:56.263479', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': 'd36441a8-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12416.374995031, 'message_signature': '0668e5d4f46e84fa6f1b304de3c5ffcd29c0b87d5204c6af148cebd62eb50c5b'}]}, 'timestamp': '2026-02-23 10:02:56.263764', '_unique_id': '89d86ba75df04dc18092da5c7edea324'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.264 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.264 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.264 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.264 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.264 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.264 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.264 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.264 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.264 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.264 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.264 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.264 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.264 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.264 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.264 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.264 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.264 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.264 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.264 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.264 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.264 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.264 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.264 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.264 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.264 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.264 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.264 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.264 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.264 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.264 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.264 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.264 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.265 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.266 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'df63c6a8-40c8-49f8-9c1d-5270cfc0b67b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T10:02:56.265130', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': 'd36483ac-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12416.374995031, 'message_signature': 'f0e9c6878060d2b1461f306d5810a51e38bf390d02a0c828682ad8becaa8afe1'}]}, 'timestamp': '2026-02-23 10:02:56.265459', '_unique_id': '1f75408ebb1a440888f2f6bbdcfc996f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.266 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.266 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.266 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.266 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.266 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.266 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.266 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.266 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.266 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.266 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.266 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.266 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.266 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.266 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.266 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.266 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.266 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.266 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.266 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.266 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.266 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.266 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.266 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.266 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.266 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.266 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.266 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.266 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.266 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.266 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.266 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.266 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.266 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.requests volume: 1283 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.267 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.268 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5b93d42d-28f5-4295-b2cd-0b9bd7d37a2d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1283, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T10:02:56.266787', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'd364c498-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12416.337932157, 'message_signature': '440d827f36cfdd0f80599110373a761b050019d79348cfa4ee0342454664a9de'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 
'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T10:02:56.266787', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'd364cf74-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12416.337932157, 'message_signature': 'dd4bf841624c479d1d1460d930c2c4cf2e31c2404db385c692704545b93178ec'}]}, 'timestamp': '2026-02-23 10:02:56.267379', '_unique_id': 'cd78a33affac4591be71e707c60a3452'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.268 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.268 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.268 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.268 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.268 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.268 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.268 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.268 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.268 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.268 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.268 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.268 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.268 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.268 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.268 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.268 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.268 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.268 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.268 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.268 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.268 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.268 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.268 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.268 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.268 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.268 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.268 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.268 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.268 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.268 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:02:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.268 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:02:56 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Feb 23 10:02:56 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/824066865' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 23 10:02:56 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Feb 23 10:02:56 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/824066865' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 23 10:02:56 np0005626463.localdomain podman[321354]: 
Feb 23 10:02:56 np0005626463.localdomain podman[321354]: 2026-02-23 10:02:56.399131837 +0000 UTC m=+0.085463084 container create 36b444cefab4e0d0a3395ea11015581ac534c601549b4a8ad5ca75745502a0d2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-255d578f-65f8-4643-b21d-1ec8d68e886d, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb)
Feb 23 10:02:56 np0005626463.localdomain systemd[1]: Started libpod-conmon-36b444cefab4e0d0a3395ea11015581ac534c601549b4a8ad5ca75745502a0d2.scope.
Feb 23 10:02:56 np0005626463.localdomain systemd[1]: tmp-crun.EcdjRn.mount: Deactivated successfully.
Feb 23 10:02:56 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 10:02:56 np0005626463.localdomain podman[321354]: 2026-02-23 10:02:56.358803581 +0000 UTC m=+0.045134858 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 23 10:02:56 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/da48425f8f699e12a206073870f9c35f47d4c0c8337a2b7589ceef2e3b973683/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 23 10:02:56 np0005626463.localdomain podman[321354]: 2026-02-23 10:02:56.471430658 +0000 UTC m=+0.157761955 container init 36b444cefab4e0d0a3395ea11015581ac534c601549b4a8ad5ca75745502a0d2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-255d578f-65f8-4643-b21d-1ec8d68e886d, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260216, tcib_managed=true, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Feb 23 10:02:56 np0005626463.localdomain podman[321354]: 2026-02-23 10:02:56.480526384 +0000 UTC m=+0.166857641 container start 36b444cefab4e0d0a3395ea11015581ac534c601549b4a8ad5ca75745502a0d2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-255d578f-65f8-4643-b21d-1ec8d68e886d, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.43.0, org.label-schema.build-date=20260216)
Feb 23 10:02:56 np0005626463.localdomain dnsmasq[321373]: started, version 2.85 cachesize 150
Feb 23 10:02:56 np0005626463.localdomain dnsmasq[321373]: DNS service limited to local subnets
Feb 23 10:02:56 np0005626463.localdomain dnsmasq[321373]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 23 10:02:56 np0005626463.localdomain dnsmasq[321373]: warning: no upstream servers configured
Feb 23 10:02:56 np0005626463.localdomain dnsmasq-dhcp[321373]: DHCP, static leases only on 10.100.0.0, lease time 1d
Feb 23 10:02:56 np0005626463.localdomain dnsmasq[321373]: read /var/lib/neutron/dhcp/255d578f-65f8-4643-b21d-1ec8d68e886d/addn_hosts - 0 addresses
Feb 23 10:02:56 np0005626463.localdomain dnsmasq-dhcp[321373]: read /var/lib/neutron/dhcp/255d578f-65f8-4643-b21d-1ec8d68e886d/host
Feb 23 10:02:56 np0005626463.localdomain dnsmasq-dhcp[321373]: read /var/lib/neutron/dhcp/255d578f-65f8-4643-b21d-1ec8d68e886d/opts
Feb 23 10:02:56 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 10:02:56.612 265541 INFO neutron.agent.dhcp.agent [None req-32fbde91-026b-45f8-a38b-a43fcbce574f - - - - - -] DHCP configuration for ports {'7a54f1f1-bb03-42b1-8518-feb534fb6ef9'} is completed
Feb 23 10:02:57 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e221 do_prune osdmap full prune enabled
Feb 23 10:02:57 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e222 e222: 6 total, 6 up, 6 in
Feb 23 10:02:57 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e222: 6 total, 6 up, 6 in
Feb 23 10:02:57 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "c65f4d3c-90b6-4215-892b-9d6eb1a375b1", "format": "json"}]: dispatch
Feb 23 10:02:57 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/824066865' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 23 10:02:57 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/824066865' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 23 10:02:57 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:02:57.405 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:02:57 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e222 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 23 10:02:58 np0005626463.localdomain ceph-mon[294160]: osdmap e222: 6 total, 6 up, 6 in
Feb 23 10:02:58 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot clone", "vol_name": "cephfs", "sub_name": "20902d73-7434-438f-9b7e-d3fbd0c8aa20", "snap_name": "a9baadce-a22e-41ca-bf91-5533058fa60f", "target_sub_name": "1730ed94-3e57-4cba-99cd-1fafcc2f97aa", "format": "json"}]: dispatch
Feb 23 10:02:58 np0005626463.localdomain ceph-mon[294160]: pgmap v495: 177 pgs: 177 active+clean; 198 MiB data, 1008 MiB used, 41 GiB / 42 GiB avail; 60 KiB/s rd, 122 KiB/s wr, 91 op/s
Feb 23 10:02:58 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 10:02:58.280 265541 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-23T10:02:57Z, description=, device_id=f2f8a48a-d698-4e75-9a87-03e5d2e9729d, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f28293698e0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f2829369eb0>], id=371d162a-b0cd-4d91-9745-94e1cb8296bd, ip_allocation=immediate, mac_address=fa:16:3e:c4:bf:87, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-23T10:02:53Z, description=, dns_domain=, id=255d578f-65f8-4643-b21d-1ec8d68e886d, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-TelemetryAlarmingAPIMysqlTest-1698092807-network, port_security_enabled=True, project_id=a24474b213514491beaa97b54bfd695f, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=57786, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=3274, status=ACTIVE, subnets=['45a769dd-5712-4113-814d-773e2ed0a09c'], tags=[], tenant_id=a24474b213514491beaa97b54bfd695f, updated_at=2026-02-23T10:02:54Z, vlan_transparent=None, network_id=255d578f-65f8-4643-b21d-1ec8d68e886d, port_security_enabled=False, project_id=a24474b213514491beaa97b54bfd695f, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3287, status=DOWN, tags=[], tenant_id=a24474b213514491beaa97b54bfd695f, updated_at=2026-02-23T10:02:58Z on network 255d578f-65f8-4643-b21d-1ec8d68e886d
Feb 23 10:02:58 np0005626463.localdomain systemd[1]: tmp-crun.eExSE7.mount: Deactivated successfully.
Feb 23 10:02:58 np0005626463.localdomain dnsmasq[321373]: read /var/lib/neutron/dhcp/255d578f-65f8-4643-b21d-1ec8d68e886d/addn_hosts - 1 addresses
Feb 23 10:02:58 np0005626463.localdomain dnsmasq-dhcp[321373]: read /var/lib/neutron/dhcp/255d578f-65f8-4643-b21d-1ec8d68e886d/host
Feb 23 10:02:58 np0005626463.localdomain podman[321391]: 2026-02-23 10:02:58.911898643 +0000 UTC m=+0.461578897 container kill 36b444cefab4e0d0a3395ea11015581ac534c601549b4a8ad5ca75745502a0d2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-255d578f-65f8-4643-b21d-1ec8d68e886d, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216)
Feb 23 10:02:58 np0005626463.localdomain dnsmasq-dhcp[321373]: read /var/lib/neutron/dhcp/255d578f-65f8-4643-b21d-1ec8d68e886d/opts
Feb 23 10:02:58 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:02:58.977 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:02:59 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 10:02:59.203 265541 INFO neutron.agent.dhcp.agent [None req-9bc4a8b4-f693-4fe5-b1db-532a094aa1dd - - - - - -] DHCP configuration for ports {'371d162a-b0cd-4d91-9745-94e1cb8296bd'} is completed
Feb 23 10:02:59 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : mgrmap e48: np0005626465.hlpkwo(active, since 11m), standbys: np0005626463.wtksup, np0005626466.nisqfq
Feb 23 10:02:59 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "1730ed94-3e57-4cba-99cd-1fafcc2f97aa", "format": "json"}]: dispatch
Feb 23 10:02:59 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "a87f3747-06f6-4188-82ee-060b8ce9fc02", "auth_id": "alice", "tenant_id": "b8a78bca43aa415e9b740fe00d08afee", "access_level": "r", "format": "json"}]: dispatch
Feb 23 10:02:59 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/475540389' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 23 10:02:59 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/475540389' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 23 10:02:59 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 10:02:59.819 265541 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-23T10:02:57Z, description=, device_id=f2f8a48a-d698-4e75-9a87-03e5d2e9729d, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f28290fbc10>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f28290fbf40>], id=371d162a-b0cd-4d91-9745-94e1cb8296bd, ip_allocation=immediate, mac_address=fa:16:3e:c4:bf:87, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-23T10:02:53Z, description=, dns_domain=, id=255d578f-65f8-4643-b21d-1ec8d68e886d, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-TelemetryAlarmingAPIMysqlTest-1698092807-network, port_security_enabled=True, project_id=a24474b213514491beaa97b54bfd695f, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=57786, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=3274, status=ACTIVE, subnets=['45a769dd-5712-4113-814d-773e2ed0a09c'], tags=[], tenant_id=a24474b213514491beaa97b54bfd695f, updated_at=2026-02-23T10:02:54Z, vlan_transparent=None, network_id=255d578f-65f8-4643-b21d-1ec8d68e886d, port_security_enabled=False, project_id=a24474b213514491beaa97b54bfd695f, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3287, status=DOWN, tags=[], tenant_id=a24474b213514491beaa97b54bfd695f, updated_at=2026-02-23T10:02:58Z on network 255d578f-65f8-4643-b21d-1ec8d68e886d
Feb 23 10:03:00 np0005626463.localdomain dnsmasq[321373]: read /var/lib/neutron/dhcp/255d578f-65f8-4643-b21d-1ec8d68e886d/addn_hosts - 1 addresses
Feb 23 10:03:00 np0005626463.localdomain dnsmasq-dhcp[321373]: read /var/lib/neutron/dhcp/255d578f-65f8-4643-b21d-1ec8d68e886d/host
Feb 23 10:03:00 np0005626463.localdomain dnsmasq-dhcp[321373]: read /var/lib/neutron/dhcp/255d578f-65f8-4643-b21d-1ec8d68e886d/opts
Feb 23 10:03:00 np0005626463.localdomain podman[321431]: 2026-02-23 10:03:00.037845373 +0000 UTC m=+0.067767149 container kill 36b444cefab4e0d0a3395ea11015581ac534c601549b4a8ad5ca75745502a0d2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-255d578f-65f8-4643-b21d-1ec8d68e886d, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.43.0)
Feb 23 10:03:00 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow r pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} v 0)
Feb 23 10:03:00 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow r pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} : dispatch
Feb 23 10:03:00 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow r pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"}]': finished
Feb 23 10:03:00 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 10:03:00.254 265541 INFO neutron.agent.dhcp.agent [None req-50ec6260-869f-47ce-bb87-d1f6423194c0 - - - - - -] DHCP configuration for ports {'371d162a-b0cd-4d91-9745-94e1cb8296bd'} is completed
Feb 23 10:03:00 np0005626463.localdomain ceph-mon[294160]: mgrmap e48: np0005626465.hlpkwo(active, since 11m), standbys: np0005626463.wtksup, np0005626466.nisqfq
Feb 23 10:03:00 np0005626463.localdomain ceph-mon[294160]: pgmap v496: 177 pgs: 177 active+clean; 198 MiB data, 1008 MiB used, 41 GiB / 42 GiB avail; 52 KiB/s rd, 71 KiB/s wr, 76 op/s
Feb 23 10:03:00 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Feb 23 10:03:00 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow r pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} : dispatch
Feb 23 10:03:00 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow r pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} : dispatch
Feb 23 10:03:00 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow r pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"}]': finished
Feb 23 10:03:00 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Feb 23 10:03:00 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 10:03:01 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e222 do_prune osdmap full prune enabled
Feb 23 10:03:01 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "c2e06815-30f9-4d3c-bbe4-f2d82eac2683", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 23 10:03:01 np0005626463.localdomain ceph-mon[294160]: from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 10:03:01 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e223 e223: 6 total, 6 up, 6 in
Feb 23 10:03:01 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e223: 6 total, 6 up, 6 in
Feb 23 10:03:01 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 10:03:01.614 2 INFO neutron.agent.securitygroups_rpc [None req-b3584fe0-66dc-4b0f-aa46-a4b6d8c2c0ea a9be4932f1a84a8293065e9227797a47 d45d0b9da54741348d1d12c73041586e - - default default] Security group rule updated ['26548beb-0e57-409b-96fc-150c1ca0653f']
Feb 23 10:03:01 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice"} v 0)
Feb 23 10:03:01 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Feb 23 10:03:01 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished
Feb 23 10:03:01 np0005626463.localdomain ceph-mds[286877]: mds.mds.np0005626463.qcthuc asok_command: session evict {filters=[auth_name=alice,client_metadata.root=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae],prefix=session evict} (starting...)
Feb 23 10:03:01 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 10:03:01.817 2 INFO neutron.agent.securitygroups_rpc [None req-f7ece8f8-128a-45ae-8798-1ff174ac4586 a9be4932f1a84a8293065e9227797a47 d45d0b9da54741348d1d12c73041586e - - default default] Security group rule updated ['26548beb-0e57-409b-96fc-150c1ca0653f']
Feb 23 10:03:02 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "c2e06815-30f9-4d3c-bbe4-f2d82eac2683", "format": "json"}]: dispatch
Feb 23 10:03:02 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "c65f4d3c-90b6-4215-892b-9d6eb1a375b1", "snap_name": "37d3cec1-9f1f-45f8-814b-ceabcde60a0c", "format": "json"}]: dispatch
Feb 23 10:03:02 np0005626463.localdomain ceph-mon[294160]: pgmap v497: 177 pgs: 177 active+clean; 198 MiB data, 1009 MiB used, 41 GiB / 42 GiB avail; 80 KiB/s rd, 132 KiB/s wr, 118 op/s
Feb 23 10:03:02 np0005626463.localdomain ceph-mon[294160]: osdmap e223: 6 total, 6 up, 6 in
Feb 23 10:03:02 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Feb 23 10:03:02 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Feb 23 10:03:02 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Feb 23 10:03:02 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished
Feb 23 10:03:02 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 10:03:02.440 2 INFO neutron.agent.securitygroups_rpc [None req-0700196d-2b57-4e61-9046-6185a324d5af a9be4932f1a84a8293065e9227797a47 d45d0b9da54741348d1d12c73041586e - - default default] Security group rule updated ['90bc54da-a8e2-4a34-bb37-4fcf0d4c07c4']
Feb 23 10:03:02 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e223 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 23 10:03:02 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 10:03:02.574 2 INFO neutron.agent.securitygroups_rpc [None req-26370991-3c47-4f17-8c42-2e91914f0a91 a9be4932f1a84a8293065e9227797a47 d45d0b9da54741348d1d12c73041586e - - default default] Security group rule updated ['90bc54da-a8e2-4a34-bb37-4fcf0d4c07c4']
Feb 23 10:03:02 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 10:03:02.718 2 INFO neutron.agent.securitygroups_rpc [None req-0212ca46-343b-4ade-95ab-95bdce90a790 a9be4932f1a84a8293065e9227797a47 d45d0b9da54741348d1d12c73041586e - - default default] Security group rule updated ['90bc54da-a8e2-4a34-bb37-4fcf0d4c07c4']
Feb 23 10:03:02 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.
Feb 23 10:03:02 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.
Feb 23 10:03:02 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 10:03:02.845 2 INFO neutron.agent.securitygroups_rpc [None req-bb2604e0-3280-4b44-93fb-d763a31d90c5 a9be4932f1a84a8293065e9227797a47 d45d0b9da54741348d1d12c73041586e - - default default] Security group rule updated ['90bc54da-a8e2-4a34-bb37-4fcf0d4c07c4']
Feb 23 10:03:02 np0005626463.localdomain podman[321453]: 2026-02-23 10:03:02.921559599 +0000 UTC m=+0.084287298 container health_status da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 23 10:03:02 np0005626463.localdomain podman[321452]: 2026-02-23 10:03:02.964731274 +0000 UTC m=+0.130052044 container health_status 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, architecture=x86_64, org.opencontainers.image.created=2026-02-05T04:57:10Z, config_id=openstack_network_exporter, version=9.7, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., managed_by=edpm_ansible, build-date=2026-02-05T04:57:10Z, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.tags=minimal rhel9, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9/ubi-minimal, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1770267347, io.openshift.expose-services=)
Feb 23 10:03:02 np0005626463.localdomain podman[321452]: 2026-02-23 10:03:02.982208783 +0000 UTC m=+0.147529583 container exec_died 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, version=9.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, build-date=2026-02-05T04:57:10Z, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, io.openshift.tags=minimal rhel9, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, container_name=openstack_network_exporter, name=ubi9/ubi-minimal, org.opencontainers.image.created=2026-02-05T04:57:10Z, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, config_id=openstack_network_exporter, release=1770267347, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., distribution-scope=public, managed_by=edpm_ansible, io.openshift.expose-services=)
Feb 23 10:03:02 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 10:03:02.997 2 INFO neutron.agent.securitygroups_rpc [None req-c9497ba8-e7cb-4ffb-8fd8-5a9e080024dc a9be4932f1a84a8293065e9227797a47 d45d0b9da54741348d1d12c73041586e - - default default] Security group rule updated ['90bc54da-a8e2-4a34-bb37-4fcf0d4c07c4']
Feb 23 10:03:02 np0005626463.localdomain systemd[1]: 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.service: Deactivated successfully.
Feb 23 10:03:03 np0005626463.localdomain podman[321453]: 2026-02-23 10:03:03.035732174 +0000 UTC m=+0.198459913 container exec_died da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 23 10:03:03 np0005626463.localdomain systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Deactivated successfully.
Feb 23 10:03:03 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 10:03:03.120 2 INFO neutron.agent.securitygroups_rpc [None req-d0ec72bc-8566-4325-bbab-9dfaee70a365 a9be4932f1a84a8293065e9227797a47 d45d0b9da54741348d1d12c73041586e - - default default] Security group rule updated ['90bc54da-a8e2-4a34-bb37-4fcf0d4c07c4']
Feb 23 10:03:03 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "a87f3747-06f6-4188-82ee-060b8ce9fc02", "auth_id": "alice", "format": "json"}]: dispatch
Feb 23 10:03:03 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "a87f3747-06f6-4188-82ee-060b8ce9fc02", "auth_id": "alice", "format": "json"}]: dispatch
Feb 23 10:03:03 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot clone", "vol_name": "cephfs", "sub_name": "c65f4d3c-90b6-4215-892b-9d6eb1a375b1", "snap_name": "37d3cec1-9f1f-45f8-814b-ceabcde60a0c", "target_sub_name": "77119be1-a395-4613-8428-4049b6a55ee4", "format": "json"}]: dispatch
Feb 23 10:03:03 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "77119be1-a395-4613-8428-4049b6a55ee4", "format": "json"}]: dispatch
Feb 23 10:03:03 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 10:03:03.443 2 INFO neutron.agent.securitygroups_rpc [None req-ddb1651b-3ac2-4a8f-8977-00c142537a52 a9be4932f1a84a8293065e9227797a47 d45d0b9da54741348d1d12c73041586e - - default default] Security group rule updated ['90bc54da-a8e2-4a34-bb37-4fcf0d4c07c4']
Feb 23 10:03:03 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Feb 23 10:03:03 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2123620571' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 23 10:03:03 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Feb 23 10:03:03 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2123620571' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 23 10:03:03 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 10:03:03.600 2 INFO neutron.agent.securitygroups_rpc [None req-907a0d0b-deea-47b0-a1db-b229494a62fe a9be4932f1a84a8293065e9227797a47 d45d0b9da54741348d1d12c73041586e - - default default] Security group rule updated ['90bc54da-a8e2-4a34-bb37-4fcf0d4c07c4']
Feb 23 10:03:03 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 10:03:03.781 2 INFO neutron.agent.securitygroups_rpc [None req-bbab3356-bbf6-432a-a4a9-3be3927ef59f a9be4932f1a84a8293065e9227797a47 d45d0b9da54741348d1d12c73041586e - - default default] Security group rule updated ['90bc54da-a8e2-4a34-bb37-4fcf0d4c07c4']
Feb 23 10:03:03 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 10:03:03.960 2 INFO neutron.agent.securitygroups_rpc [None req-a3d9205c-35b8-4d92-aa27-e429ab8dbfe2 a9be4932f1a84a8293065e9227797a47 d45d0b9da54741348d1d12c73041586e - - default default] Security group rule updated ['90bc54da-a8e2-4a34-bb37-4fcf0d4c07c4']
Feb 23 10:03:03 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:03:03.979 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:03:04 np0005626463.localdomain ceph-mon[294160]: pgmap v499: 177 pgs: 177 active+clean; 198 MiB data, 1013 MiB used, 41 GiB / 42 GiB avail; 83 KiB/s rd, 134 KiB/s wr, 124 op/s
Feb 23 10:03:04 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/2123620571' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 23 10:03:04 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/2123620571' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 23 10:03:04 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e223 do_prune osdmap full prune enabled
Feb 23 10:03:04 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e224 e224: 6 total, 6 up, 6 in
Feb 23 10:03:04 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e224: 6 total, 6 up, 6 in
Feb 23 10:03:04 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 10:03:04.636 2 INFO neutron.agent.securitygroups_rpc [None req-daa2f3e5-708b-419e-b743-0c985d38df1b a9be4932f1a84a8293065e9227797a47 d45d0b9da54741348d1d12c73041586e - - default default] Security group rule updated ['087f2ead-29df-46bd-b356-e193c3a6c3a1']
Feb 23 10:03:05 np0005626463.localdomain ceph-mon[294160]: osdmap e224: 6 total, 6 up, 6 in
Feb 23 10:03:05 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "c2e06815-30f9-4d3c-bbe4-f2d82eac2683", "snap_name": "9df5708b-0dc0-46f9-a0f4-8a66460e11a4", "format": "json"}]: dispatch
Feb 23 10:03:05 np0005626463.localdomain ceph-mon[294160]: pgmap v501: 177 pgs: 177 active+clean; 198 MiB data, 1013 MiB used, 41 GiB / 42 GiB avail; 31 KiB/s rd, 64 KiB/s wr, 50 op/s
Feb 23 10:03:05 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Feb 23 10:03:05 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3379545140' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 23 10:03:05 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Feb 23 10:03:05 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3379545140' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 23 10:03:05 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 10:03:05.781 2 INFO neutron.agent.securitygroups_rpc [None req-931492cc-c107-4f00-af23-020a7b1bfb4a a9be4932f1a84a8293065e9227797a47 d45d0b9da54741348d1d12c73041586e - - default default] Security group rule updated ['33c4ddfa-59ae-40a7-8c2f-cf1ffe09eb9f']
Feb 23 10:03:05 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 10:03:05.920 2 INFO neutron.agent.securitygroups_rpc [None req-8d452d14-dd60-417e-87bd-a7a2372759de a9be4932f1a84a8293065e9227797a47 d45d0b9da54741348d1d12c73041586e - - default default] Security group rule updated ['33c4ddfa-59ae-40a7-8c2f-cf1ffe09eb9f']
Feb 23 10:03:06 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/3379545140' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 23 10:03:06 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/3379545140' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 23 10:03:06 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 10:03:06.751 2 INFO neutron.agent.securitygroups_rpc [None req-43388687-9a00-4563-af08-790772899ea4 a9be4932f1a84a8293065e9227797a47 d45d0b9da54741348d1d12c73041586e - - default default] Security group rule updated ['e9d6e743-53d6-4e9c-950f-ebadc1a82c0f']
Feb 23 10:03:06 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 10:03:06.871 2 INFO neutron.agent.securitygroups_rpc [None req-8bc11e13-0374-4d0f-a420-dc9bed7775c4 a9be4932f1a84a8293065e9227797a47 d45d0b9da54741348d1d12c73041586e - - default default] Security group rule updated ['e9d6e743-53d6-4e9c-950f-ebadc1a82c0f']
Feb 23 10:03:07 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e224 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 23 10:03:07 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e224 do_prune osdmap full prune enabled
Feb 23 10:03:07 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e225 e225: 6 total, 6 up, 6 in
Feb 23 10:03:07 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e225: 6 total, 6 up, 6 in
Feb 23 10:03:07 np0005626463.localdomain ceph-mon[294160]: pgmap v502: 177 pgs: 177 active+clean; 199 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 93 KiB/s rd, 115 KiB/s wr, 137 op/s
Feb 23 10:03:07 np0005626463.localdomain ceph-mon[294160]: osdmap e225: 6 total, 6 up, 6 in
Feb 23 10:03:07 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 10:03:07.697 2 INFO neutron.agent.securitygroups_rpc [None req-7032d099-2d57-4911-980c-bd2a477e3e37 a9be4932f1a84a8293065e9227797a47 d45d0b9da54741348d1d12c73041586e - - default default] Security group rule updated ['b2b511b6-235a-4475-b039-adf8e8bf337f']
Feb 23 10:03:07 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 10:03:07.905 2 INFO neutron.agent.securitygroups_rpc [None req-09a1ef76-51b7-437b-b37e-bb449ab05579 a9be4932f1a84a8293065e9227797a47 d45d0b9da54741348d1d12c73041586e - - default default] Security group rule updated ['b2b511b6-235a-4475-b039-adf8e8bf337f']
Feb 23 10:03:08 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 10:03:08.079 2 INFO neutron.agent.securitygroups_rpc [None req-6957b0df-fd0d-456d-b289-5c8ae6b2d3ab a9be4932f1a84a8293065e9227797a47 d45d0b9da54741348d1d12c73041586e - - default default] Security group rule updated ['b2b511b6-235a-4475-b039-adf8e8bf337f']
Feb 23 10:03:08 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 10:03:08.358 2 INFO neutron.agent.securitygroups_rpc [None req-2e64ea36-7ced-48d5-9b78-d6c2b3b8afa9 a9be4932f1a84a8293065e9227797a47 d45d0b9da54741348d1d12c73041586e - - default default] Security group rule updated ['b2b511b6-235a-4475-b039-adf8e8bf337f']
Feb 23 10:03:08 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T10:03:08Z|00359|binding|INFO|Releasing lport 4143c8ea-7577-4792-9744-bcff90eb20f2 from this chassis (sb_readonly=0)
Feb 23 10:03:08 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 10:03:08.645 2 INFO neutron.agent.securitygroups_rpc [None req-ac68416c-3f8b-4a61-93f4-c722d3303f27 a9be4932f1a84a8293065e9227797a47 d45d0b9da54741348d1d12c73041586e - - default default] Security group rule updated ['b2b511b6-235a-4475-b039-adf8e8bf337f']
Feb 23 10:03:08 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:03:08.655 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:03:08 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 10:03:08.908 2 INFO neutron.agent.securitygroups_rpc [None req-b0017ab9-a450-4286-8fae-6ccd40fd4966 a9be4932f1a84a8293065e9227797a47 d45d0b9da54741348d1d12c73041586e - - default default] Security group rule updated ['b2b511b6-235a-4475-b039-adf8e8bf337f']
Feb 23 10:03:08 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:03:08.982 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:03:09 np0005626463.localdomain podman[242954]: time="2026-02-23T10:03:09Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 23 10:03:09 np0005626463.localdomain podman[242954]: @ - - [23/Feb/2026:10:03:09 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 158905 "" "Go-http-client/1.1"
Feb 23 10:03:09 np0005626463.localdomain podman[242954]: @ - - [23/Feb/2026:10:03:09 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19298 "" "Go-http-client/1.1"
Feb 23 10:03:09 np0005626463.localdomain neutron_sriov_agent[258207]: 2026-02-23 10:03:09.745 2 INFO neutron.agent.securitygroups_rpc [None req-b5ba1047-11d6-4902-ae19-bf3c18cfb931 a9be4932f1a84a8293065e9227797a47 d45d0b9da54741348d1d12c73041586e - - default default] Security group rule updated ['9ad178d0-3a41-40dd-be58-0e7ebb53d59d']
Feb 23 10:03:10 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} v 0)
Feb 23 10:03:10 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} : dispatch
Feb 23 10:03:10 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"}]': finished
Feb 23 10:03:10 np0005626463.localdomain ceph-mon[294160]: pgmap v504: 177 pgs: 177 active+clean; 199 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 65 KiB/s rd, 53 KiB/s wr, 93 op/s
Feb 23 10:03:10 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Feb 23 10:03:10 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} : dispatch
Feb 23 10:03:10 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} : dispatch
Feb 23 10:03:10 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"}]': finished
Feb 23 10:03:10 np0005626463.localdomain podman[321510]: 2026-02-23 10:03:10.44413568 +0000 UTC m=+0.060759320 container kill 36b444cefab4e0d0a3395ea11015581ac534c601549b4a8ad5ca75745502a0d2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-255d578f-65f8-4643-b21d-1ec8d68e886d, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Feb 23 10:03:10 np0005626463.localdomain dnsmasq[321373]: read /var/lib/neutron/dhcp/255d578f-65f8-4643-b21d-1ec8d68e886d/addn_hosts - 0 addresses
Feb 23 10:03:10 np0005626463.localdomain dnsmasq-dhcp[321373]: read /var/lib/neutron/dhcp/255d578f-65f8-4643-b21d-1ec8d68e886d/host
Feb 23 10:03:10 np0005626463.localdomain dnsmasq-dhcp[321373]: read /var/lib/neutron/dhcp/255d578f-65f8-4643-b21d-1ec8d68e886d/opts
Feb 23 10:03:10 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:03:10.696 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:03:10 np0005626463.localdomain kernel: device tap76d8ca35-ba left promiscuous mode
Feb 23 10:03:10 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T10:03:10Z|00360|binding|INFO|Releasing lport 76d8ca35-bae5-4d5e-827a-3f91e4067a0d from this chassis (sb_readonly=0)
Feb 23 10:03:10 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T10:03:10Z|00361|binding|INFO|Setting lport 76d8ca35-bae5-4d5e-827a-3f91e4067a0d down in Southbound
Feb 23 10:03:10 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 10:03:10.706 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005626463.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpfb23302c-55c1-5de0-badf-4fc1ff22837a-255d578f-65f8-4643-b21d-1ec8d68e886d', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-255d578f-65f8-4643-b21d-1ec8d68e886d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a24474b213514491beaa97b54bfd695f', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005626463.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=40e0ff5c-1777-4eef-b7ef-c7cac136f27e, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f808c075610>], logical_port=76d8ca35-bae5-4d5e-827a-3f91e4067a0d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f808c075610>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 10:03:10 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 10:03:10.708 163572 INFO neutron.agent.ovn.metadata.agent [-] Port 76d8ca35-bae5-4d5e-827a-3f91e4067a0d in datapath 255d578f-65f8-4643-b21d-1ec8d68e886d unbound from our chassis
Feb 23 10:03:10 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 10:03:10.710 163572 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 255d578f-65f8-4643-b21d-1ec8d68e886d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 23 10:03:10 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 10:03:10.711 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[59af9dfb-28b9-4bda-80b5-f67fee184575]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 10:03:10 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:03:10.723 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:03:11 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "a87f3747-06f6-4188-82ee-060b8ce9fc02", "auth_id": "alice_bob", "tenant_id": "b8a78bca43aa415e9b740fe00d08afee", "access_level": "rw", "format": "json"}]: dispatch
Feb 23 10:03:11 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice_bob"} v 0)
Feb 23 10:03:11 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Feb 23 10:03:11 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished
Feb 23 10:03:11 np0005626463.localdomain ceph-mds[286877]: mds.mds.np0005626463.qcthuc asok_command: session evict {filters=[auth_name=alice_bob,client_metadata.root=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae],prefix=session evict} (starting...)
Feb 23 10:03:12 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T10:03:12Z|00362|binding|INFO|Releasing lport 4143c8ea-7577-4792-9744-bcff90eb20f2 from this chassis (sb_readonly=0)
Feb 23 10:03:12 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:03:12.268 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:03:12 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Feb 23 10:03:12 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 10:03:12 np0005626463.localdomain ceph-mon[294160]: pgmap v505: 177 pgs: 177 active+clean; 199 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 63 KiB/s rd, 99 KiB/s wr, 91 op/s
Feb 23 10:03:12 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "a87f3747-06f6-4188-82ee-060b8ce9fc02", "auth_id": "alice_bob", "format": "json"}]: dispatch
Feb 23 10:03:12 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Feb 23 10:03:12 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Feb 23 10:03:12 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Feb 23 10:03:12 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished
Feb 23 10:03:12 np0005626463.localdomain ceph-mon[294160]: from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 10:03:12 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e225 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 23 10:03:12 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e225 do_prune osdmap full prune enabled
Feb 23 10:03:12 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e226 e226: 6 total, 6 up, 6 in
Feb 23 10:03:12 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e226: 6 total, 6 up, 6 in
Feb 23 10:03:12 np0005626463.localdomain podman[321551]: 2026-02-23 10:03:12.593405359 +0000 UTC m=+0.066357004 container kill 36b444cefab4e0d0a3395ea11015581ac534c601549b4a8ad5ca75745502a0d2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-255d578f-65f8-4643-b21d-1ec8d68e886d, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team)
Feb 23 10:03:12 np0005626463.localdomain dnsmasq[321373]: exiting on receipt of SIGTERM
Feb 23 10:03:12 np0005626463.localdomain systemd[1]: libpod-36b444cefab4e0d0a3395ea11015581ac534c601549b4a8ad5ca75745502a0d2.scope: Deactivated successfully.
Feb 23 10:03:12 np0005626463.localdomain podman[321564]: 2026-02-23 10:03:12.67971706 +0000 UTC m=+0.066352085 container died 36b444cefab4e0d0a3395ea11015581ac534c601549b4a8ad5ca75745502a0d2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-255d578f-65f8-4643-b21d-1ec8d68e886d, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true)
Feb 23 10:03:12 np0005626463.localdomain podman[321564]: 2026-02-23 10:03:12.714853004 +0000 UTC m=+0.101487989 container cleanup 36b444cefab4e0d0a3395ea11015581ac534c601549b4a8ad5ca75745502a0d2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-255d578f-65f8-4643-b21d-1ec8d68e886d, tcib_managed=true, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Feb 23 10:03:12 np0005626463.localdomain systemd[1]: libpod-conmon-36b444cefab4e0d0a3395ea11015581ac534c601549b4a8ad5ca75745502a0d2.scope: Deactivated successfully.
Feb 23 10:03:12 np0005626463.localdomain podman[321565]: 2026-02-23 10:03:12.757207774 +0000 UTC m=+0.143194748 container remove 36b444cefab4e0d0a3395ea11015581ac534c601549b4a8ad5ca75745502a0d2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-255d578f-65f8-4643-b21d-1ec8d68e886d, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 23 10:03:12 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 10:03:12.791 265541 INFO neutron.agent.dhcp.agent [None req-20033a4e-e604-41c3-bc6e-9c974713406a - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 23 10:03:12 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 10:03:12.795 265541 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Feb 23 10:03:13 np0005626463.localdomain ceph-osd[32575]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #44. Immutable memtables: 1.
Feb 23 10:03:13 np0005626463.localdomain openstack_network_exporter[245358]: ERROR   10:03:13 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 23 10:03:13 np0005626463.localdomain openstack_network_exporter[245358]: 
Feb 23 10:03:13 np0005626463.localdomain openstack_network_exporter[245358]: ERROR   10:03:13 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 23 10:03:13 np0005626463.localdomain openstack_network_exporter[245358]: 
Feb 23 10:03:13 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "a87f3747-06f6-4188-82ee-060b8ce9fc02", "auth_id": "alice_bob", "format": "json"}]: dispatch
Feb 23 10:03:13 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "f6c95aca-bd80-4d48-87fc-8bc6f3370a1f", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 23 10:03:13 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "f6c95aca-bd80-4d48-87fc-8bc6f3370a1f", "format": "json"}]: dispatch
Feb 23 10:03:13 np0005626463.localdomain ceph-mon[294160]: osdmap e226: 6 total, 6 up, 6 in
Feb 23 10:03:13 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-da48425f8f699e12a206073870f9c35f47d4c0c8337a2b7589ceef2e3b973683-merged.mount: Deactivated successfully.
Feb 23 10:03:13 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-36b444cefab4e0d0a3395ea11015581ac534c601549b4a8ad5ca75745502a0d2-userdata-shm.mount: Deactivated successfully.
Feb 23 10:03:13 np0005626463.localdomain systemd[1]: run-netns-qdhcp\x2d255d578f\x2d65f8\x2d4643\x2db21d\x2d1ec8d68e886d.mount: Deactivated successfully.
Feb 23 10:03:13 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:03:13.985 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:03:14 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "c2e06815-30f9-4d3c-bbe4-f2d82eac2683", "snap_name": "9df5708b-0dc0-46f9-a0f4-8a66460e11a4_90e57bd5-ed99-4502-a6ff-31eb33e53be8", "force": true, "format": "json"}]: dispatch
Feb 23 10:03:14 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "c2e06815-30f9-4d3c-bbe4-f2d82eac2683", "snap_name": "9df5708b-0dc0-46f9-a0f4-8a66460e11a4", "force": true, "format": "json"}]: dispatch
Feb 23 10:03:14 np0005626463.localdomain ceph-mon[294160]: pgmap v507: 177 pgs: 177 active+clean; 199 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 63 KiB/s rd, 99 KiB/s wr, 92 op/s
Feb 23 10:03:14 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow r pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} v 0)
Feb 23 10:03:14 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow r pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} : dispatch
Feb 23 10:03:14 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow r pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"}]': finished
Feb 23 10:03:15 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Feb 23 10:03:15 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 10:03:15 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "a87f3747-06f6-4188-82ee-060b8ce9fc02", "auth_id": "alice_bob", "tenant_id": "b8a78bca43aa415e9b740fe00d08afee", "access_level": "r", "format": "json"}]: dispatch
Feb 23 10:03:15 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Feb 23 10:03:15 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow r pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} : dispatch
Feb 23 10:03:15 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow r pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} : dispatch
Feb 23 10:03:15 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow r pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"}]': finished
Feb 23 10:03:15 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "cbf0f0f5-8772-4c98-89a6-f866d6a9652c", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 23 10:03:15 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "cbf0f0f5-8772-4c98-89a6-f866d6a9652c", "format": "json"}]: dispatch
Feb 23 10:03:15 np0005626463.localdomain ceph-mon[294160]: from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 10:03:15 np0005626463.localdomain ceph-mon[294160]: pgmap v508: 177 pgs: 177 active+clean; 199 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 383 B/s rd, 48 KiB/s wr, 5 op/s
Feb 23 10:03:15 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e226 do_prune osdmap full prune enabled
Feb 23 10:03:15 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e227 e227: 6 total, 6 up, 6 in
Feb 23 10:03:15 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e227: 6 total, 6 up, 6 in
Feb 23 10:03:16 np0005626463.localdomain ceph-mon[294160]: osdmap e227: 6 total, 6 up, 6 in
Feb 23 10:03:16 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "c2e06815-30f9-4d3c-bbe4-f2d82eac2683", "format": "json"}]: dispatch
Feb 23 10:03:16 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "c2e06815-30f9-4d3c-bbe4-f2d82eac2683", "force": true, "format": "json"}]: dispatch
Feb 23 10:03:16 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.
Feb 23 10:03:16 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.
Feb 23 10:03:16 np0005626463.localdomain systemd[1]: tmp-crun.AKQFTc.mount: Deactivated successfully.
Feb 23 10:03:16 np0005626463.localdomain podman[321592]: 2026-02-23 10:03:16.947218283 +0000 UTC m=+0.119123102 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.43.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS)
Feb 23 10:03:17 np0005626463.localdomain podman[321593]: 2026-02-23 10:03:17.015175938 +0000 UTC m=+0.183767863 container health_status bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Feb 23 10:03:17 np0005626463.localdomain podman[321593]: 2026-02-23 10:03:17.028623689 +0000 UTC m=+0.197215664 container exec_died bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Feb 23 10:03:17 np0005626463.localdomain systemd[1]: bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.service: Deactivated successfully.
Feb 23 10:03:17 np0005626463.localdomain podman[321592]: 2026-02-23 10:03:17.054690609 +0000 UTC m=+0.226595438 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20260216, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Feb 23 10:03:17 np0005626463.localdomain systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully.
Feb 23 10:03:17 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e227 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 23 10:03:17 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Feb 23 10:03:17 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3710238268' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 23 10:03:17 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Feb 23 10:03:17 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3710238268' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 23 10:03:17 np0005626463.localdomain ceph-mon[294160]: pgmap v510: 177 pgs: 177 active+clean; 199 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 895 B/s rd, 123 KiB/s wr, 14 op/s
Feb 23 10:03:17 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/3710238268' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 23 10:03:17 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/3710238268' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 23 10:03:17 np0005626463.localdomain systemd[1]: tmp-crun.P35pO1.mount: Deactivated successfully.
Feb 23 10:03:18 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "1730ed94-3e57-4cba-99cd-1fafcc2f97aa", "format": "json"}]: dispatch
Feb 23 10:03:18 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:03:18.987 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 10:03:19 np0005626463.localdomain ceph-mon[294160]: pgmap v511: 177 pgs: 177 active+clean; 199 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 75 KiB/s wr, 10 op/s
Feb 23 10:03:20 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Feb 23 10:03:20 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 10:03:20 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "1730ed94-3e57-4cba-99cd-1fafcc2f97aa", "format": "json"}]: dispatch
Feb 23 10:03:20 np0005626463.localdomain ceph-mon[294160]: from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 10:03:20 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "77119be1-a395-4613-8428-4049b6a55ee4", "format": "json"}]: dispatch
Feb 23 10:03:21 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e227 do_prune osdmap full prune enabled
Feb 23 10:03:21 np0005626463.localdomain ceph-mon[294160]: pgmap v512: 177 pgs: 177 active+clean; 200 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 3.6 KiB/s rd, 92 KiB/s wr, 14 op/s
Feb 23 10:03:21 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e228 e228: 6 total, 6 up, 6 in
Feb 23 10:03:21 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e228: 6 total, 6 up, 6 in
Feb 23 10:03:22 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e228 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 23 10:03:22 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e228 do_prune osdmap full prune enabled
Feb 23 10:03:22 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e229 e229: 6 total, 6 up, 6 in
Feb 23 10:03:22 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e229: 6 total, 6 up, 6 in
Feb 23 10:03:22 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.
Feb 23 10:03:22 np0005626463.localdomain ceph-mon[294160]: osdmap e228: 6 total, 6 up, 6 in
Feb 23 10:03:22 np0005626463.localdomain ceph-mon[294160]: osdmap e229: 6 total, 6 up, 6 in
Feb 23 10:03:22 np0005626463.localdomain podman[321641]: 2026-02-23 10:03:22.908382241 +0000 UTC m=+0.085545618 container health_status be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, config_id=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Feb 23 10:03:22 np0005626463.localdomain podman[321641]: 2026-02-23 10:03:22.921991728 +0000 UTC m=+0.099155135 container exec_died be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, io.buildah.version=1.43.0, managed_by=edpm_ansible)
Feb 23 10:03:22 np0005626463.localdomain systemd[1]: be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.service: Deactivated successfully.
Feb 23 10:03:23 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.
Feb 23 10:03:23 np0005626463.localdomain ceph-mon[294160]: pgmap v515: 177 pgs: 177 active+clean; 200 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 5.3 KiB/s rd, 107 KiB/s wr, 18 op/s
Feb 23 10:03:23 np0005626463.localdomain podman[321661]: 2026-02-23 10:03:23.915293914 +0000 UTC m=+0.083693940 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260216, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.43.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent)
Feb 23 10:03:23 np0005626463.localdomain podman[321661]: 2026-02-23 10:03:23.949175398 +0000 UTC m=+0.117575424 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb)
Feb 23 10:03:23 np0005626463.localdomain systemd[1]: 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully.
Feb 23 10:03:23 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:03:23.991 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 10:03:23 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:03:23.993 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:03:23 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:03:23.994 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 10:03:23 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:03:23.994 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 10:03:23 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:03:23.994 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 10:03:23 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:03:23.997 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:03:25 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Feb 23 10:03:25 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 10:03:25 np0005626463.localdomain ceph-mon[294160]: from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 10:03:25 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Feb 23 10:03:25 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/164380082' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 23 10:03:25 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Feb 23 10:03:25 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/164380082' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 23 10:03:25 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Feb 23 10:03:25 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 10:03:25 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice_bob"} v 0)
Feb 23 10:03:25 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Feb 23 10:03:25 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished
Feb 23 10:03:25 np0005626463.localdomain ceph-mds[286877]: mds.mds.np0005626463.qcthuc asok_command: session evict {filters=[auth_name=alice_bob,client_metadata.root=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae],prefix=session evict} (starting...)
Feb 23 10:03:26 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:03:26.054 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:03:26 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:03:26.055 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 23 10:03:26 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:03:26.055 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 23 10:03:26 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:03:26.115 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 23 10:03:26 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:03:26.115 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquired lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 23 10:03:26 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:03:26.116 282211 DEBUG nova.network.neutron [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 23 10:03:26 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:03:26.116 282211 DEBUG nova.objects.instance [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c2a7d92b-952f-46a7-8a6a-3322a48fcf4b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 23 10:03:26 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "77119be1-a395-4613-8428-4049b6a55ee4", "format": "json"}]: dispatch
Feb 23 10:03:26 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "cbf0f0f5-8772-4c98-89a6-f866d6a9652c", "format": "json"}]: dispatch
Feb 23 10:03:26 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "cbf0f0f5-8772-4c98-89a6-f866d6a9652c", "force": true, "format": "json"}]: dispatch
Feb 23 10:03:26 np0005626463.localdomain ceph-mon[294160]: pgmap v516: 177 pgs: 177 active+clean; 200 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 4.5 KiB/s rd, 27 KiB/s wr, 8 op/s
Feb 23 10:03:26 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "d8b8020f-4ee6-469b-affa-207d2f0b4b2c", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 23 10:03:26 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/164380082' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 23 10:03:26 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/164380082' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 23 10:03:26 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "d8b8020f-4ee6-469b-affa-207d2f0b4b2c", "format": "json"}]: dispatch
Feb 23 10:03:26 np0005626463.localdomain ceph-mon[294160]: from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 10:03:26 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Feb 23 10:03:26 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Feb 23 10:03:26 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Feb 23 10:03:26 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished
Feb 23 10:03:26 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e229 do_prune osdmap full prune enabled
Feb 23 10:03:26 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e230 e230: 6 total, 6 up, 6 in
Feb 23 10:03:26 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e230: 6 total, 6 up, 6 in
Feb 23 10:03:26 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:03:26.807 282211 DEBUG nova.network.neutron [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Updating instance_info_cache with network_info: [{"id": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "address": "fa:16:3e:a0:9d:00", "network": {"id": "9da5b53d-3184-450f-9a5b-bdba1a6c9f6d", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "37b8098efb0d4ecc90b451a2db0e966f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa27e5011-20", "ovs_interfaceid": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 23 10:03:26 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:03:26.824 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Releasing lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 23 10:03:26 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:03:26.824 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 23 10:03:26 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:03:26.825 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:03:26 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:03:26.825 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 23 10:03:26 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Feb 23 10:03:26 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 10:03:27 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "1730ed94-3e57-4cba-99cd-1fafcc2f97aa", "format": "json"}]: dispatch
Feb 23 10:03:27 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "1730ed94-3e57-4cba-99cd-1fafcc2f97aa", "force": true, "format": "json"}]: dispatch
Feb 23 10:03:27 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "a87f3747-06f6-4188-82ee-060b8ce9fc02", "auth_id": "alice_bob", "format": "json"}]: dispatch
Feb 23 10:03:27 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "a87f3747-06f6-4188-82ee-060b8ce9fc02", "auth_id": "alice_bob", "format": "json"}]: dispatch
Feb 23 10:03:27 np0005626463.localdomain ceph-mon[294160]: osdmap e230: 6 total, 6 up, 6 in
Feb 23 10:03:27 np0005626463.localdomain ceph-mon[294160]: from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 10:03:27 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e230 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 23 10:03:27 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} v 0)
Feb 23 10:03:27 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} : dispatch
Feb 23 10:03:27 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"}]': finished
Feb 23 10:03:28 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Feb 23 10:03:28 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 10:03:28 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Feb 23 10:03:28 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2068515464' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 23 10:03:28 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Feb 23 10:03:28 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2068515464' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 23 10:03:28 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "5fa488c6-5f64-4942-8c4d-23ef356045f5", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 23 10:03:28 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "5fa488c6-5f64-4942-8c4d-23ef356045f5", "format": "json"}]: dispatch
Feb 23 10:03:28 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "20902d73-7434-438f-9b7e-d3fbd0c8aa20", "snap_name": "a9baadce-a22e-41ca-bf91-5533058fa60f_7d5a3e75-c497-45e3-a6f0-07253848d71b", "force": true, "format": "json"}]: dispatch
Feb 23 10:03:28 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "20902d73-7434-438f-9b7e-d3fbd0c8aa20", "snap_name": "a9baadce-a22e-41ca-bf91-5533058fa60f", "force": true, "format": "json"}]: dispatch
Feb 23 10:03:28 np0005626463.localdomain ceph-mon[294160]: pgmap v518: 177 pgs: 2 active+clean+snaptrim_wait, 2 active+clean+snaptrim, 173 active+clean; 200 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 62 KiB/s rd, 88 KiB/s wr, 88 op/s
Feb 23 10:03:28 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Feb 23 10:03:28 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} : dispatch
Feb 23 10:03:28 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} : dispatch
Feb 23 10:03:28 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"}]': finished
Feb 23 10:03:28 np0005626463.localdomain ceph-mon[294160]: from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 10:03:28 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/2068515464' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 23 10:03:28 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/2068515464' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 23 10:03:28 np0005626463.localdomain sudo[321679]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 10:03:28 np0005626463.localdomain sudo[321679]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 10:03:28 np0005626463.localdomain sudo[321679]: pam_unix(sudo:session): session closed for user root
Feb 23 10:03:28 np0005626463.localdomain sudo[321697]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 check-host
Feb 23 10:03:28 np0005626463.localdomain sudo[321697]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 10:03:28 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:03:28.995 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:03:29 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "a87f3747-06f6-4188-82ee-060b8ce9fc02", "auth_id": "alice bob", "tenant_id": "b8a78bca43aa415e9b740fe00d08afee", "access_level": "rw", "format": "json"}]: dispatch
Feb 23 10:03:29 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "9f8ed63b-f013-41f1-9020-ae061837f09e", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 23 10:03:29 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "9f8ed63b-f013-41f1-9020-ae061837f09e", "format": "json"}]: dispatch
Feb 23 10:03:29 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "f6c95aca-bd80-4d48-87fc-8bc6f3370a1f", "format": "json"}]: dispatch
Feb 23 10:03:29 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "f6c95aca-bd80-4d48-87fc-8bc6f3370a1f", "force": true, "format": "json"}]: dispatch
Feb 23 10:03:29 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.108:0/485121767' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 10:03:29 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626465.localdomain.devices.0}] v 0)
Feb 23 10:03:29 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 10:03:29 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626465.localdomain}] v 0)
Feb 23 10:03:29 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain.devices.0}] v 0)
Feb 23 10:03:29 np0005626463.localdomain sudo[321697]: pam_unix(sudo:session): session closed for user root
Feb 23 10:03:29 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 10:03:29 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 10:03:29 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626463.localdomain.devices.0}] v 0)
Feb 23 10:03:29 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain}] v 0)
Feb 23 10:03:29 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 10:03:29 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 10:03:29 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626463.localdomain}] v 0)
Feb 23 10:03:29 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 10:03:29 np0005626463.localdomain sudo[321737]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 10:03:29 np0005626463.localdomain sudo[321737]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 10:03:29 np0005626463.localdomain sudo[321737]: pam_unix(sudo:session): session closed for user root
Feb 23 10:03:29 np0005626463.localdomain sudo[321755]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 23 10:03:29 np0005626463.localdomain sudo[321755]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 10:03:30 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.108:0/2335753205' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 10:03:30 np0005626463.localdomain ceph-mon[294160]: pgmap v519: 177 pgs: 2 active+clean+snaptrim_wait, 2 active+clean+snaptrim, 173 active+clean; 200 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 50 KiB/s rd, 70 KiB/s wr, 71 op/s
Feb 23 10:03:30 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.107:0/26541569' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 10:03:30 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 10:03:30 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 10:03:30 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 10:03:30 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 10:03:30 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 10:03:30 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 10:03:30 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.107:0/3668932612' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 10:03:30 np0005626463.localdomain sudo[321755]: pam_unix(sudo:session): session closed for user root
Feb 23 10:03:30 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Feb 23 10:03:30 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 10:03:30 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e230 do_prune osdmap full prune enabled
Feb 23 10:03:30 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e231 e231: 6 total, 6 up, 6 in
Feb 23 10:03:30 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e231: 6 total, 6 up, 6 in
Feb 23 10:03:30 np0005626463.localdomain sudo[321805]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 10:03:30 np0005626463.localdomain sudo[321805]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 10:03:30 np0005626463.localdomain sudo[321805]: pam_unix(sudo:session): session closed for user root
Feb 23 10:03:30 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Feb 23 10:03:30 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 10:03:31 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:03:31.054 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:03:31 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice bob"} v 0)
Feb 23 10:03:31 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Feb 23 10:03:31 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished
Feb 23 10:03:31 np0005626463.localdomain ceph-mds[286877]: mds.mds.np0005626463.qcthuc asok_command: session evict {filters=[auth_name=alice bob,client_metadata.root=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae],prefix=session evict} (starting...)
Feb 23 10:03:31 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume resize", "vol_name": "cephfs", "sub_name": "5fa488c6-5f64-4942-8c4d-23ef356045f5", "new_size": 2147483648, "format": "json"}]: dispatch
Feb 23 10:03:31 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "20902d73-7434-438f-9b7e-d3fbd0c8aa20", "format": "json"}]: dispatch
Feb 23 10:03:31 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "20902d73-7434-438f-9b7e-d3fbd0c8aa20", "force": true, "format": "json"}]: dispatch
Feb 23 10:03:31 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 10:03:31 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 23 10:03:31 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 10:03:31 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 23 10:03:31 np0005626463.localdomain ceph-mon[294160]: osdmap e231: 6 total, 6 up, 6 in
Feb 23 10:03:31 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 10:03:31 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Feb 23 10:03:31 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Feb 23 10:03:31 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Feb 23 10:03:31 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished
Feb 23 10:03:31 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Feb 23 10:03:31 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 10:03:32 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:03:32.054 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:03:32 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "a87f3747-06f6-4188-82ee-060b8ce9fc02", "auth_id": "alice bob", "format": "json"}]: dispatch
Feb 23 10:03:32 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "a87f3747-06f6-4188-82ee-060b8ce9fc02", "auth_id": "alice bob", "format": "json"}]: dispatch
Feb 23 10:03:32 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolumegroup create", "vol_name": "cephfs", "group_name": "564c85df-1944-453b-837f-3ce015c7f964", "mode": "0755", "format": "json"}]: dispatch
Feb 23 10:03:32 np0005626463.localdomain ceph-mon[294160]: pgmap v521: 177 pgs: 2 active+clean+snaptrim_wait, 2 active+clean+snaptrim, 173 active+clean; 201 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 61 KiB/s rd, 159 KiB/s wr, 93 op/s
Feb 23 10:03:32 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "06a68059-7c6e-4e80-9728-910ba3a5d245", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 23 10:03:32 np0005626463.localdomain ceph-mon[294160]: from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 10:03:32 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:03:32.284 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 10:03:32 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:03:32.284 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 10:03:32 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:03:32.285 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 10:03:32 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:03:32.285 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Auditing locally available compute resources for np0005626463.localdomain (node: np0005626463.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 23 10:03:32 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:03:32.285 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 10:03:32 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e231 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 23 10:03:32 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 23 10:03:32 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/4145762043' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 10:03:32 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:03:32.739 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.454s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 10:03:33 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "06a68059-7c6e-4e80-9728-910ba3a5d245", "format": "json"}]: dispatch
Feb 23 10:03:33 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "9f8ed63b-f013-41f1-9020-ae061837f09e", "format": "json"}]: dispatch
Feb 23 10:03:33 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "9f8ed63b-f013-41f1-9020-ae061837f09e", "force": true, "format": "json"}]: dispatch
Feb 23 10:03:33 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.106:0/4145762043' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 10:03:33 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.
Feb 23 10:03:33 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.
Feb 23 10:03:33 np0005626463.localdomain podman[321845]: 2026-02-23 10:03:33.947380902 +0000 UTC m=+0.092641790 container health_status 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=9.7, release=1770267347, com.redhat.component=ubi9-minimal-container, architecture=x86_64, managed_by=edpm_ansible, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, io.openshift.tags=minimal rhel9, org.opencontainers.image.created=2026-02-05T04:57:10Z, container_name=openstack_network_exporter, io.buildah.version=1.33.7, build-date=2026-02-05T04:57:10Z, name=ubi9/ubi-minimal, config_id=openstack_network_exporter, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers)
Feb 23 10:03:33 np0005626463.localdomain podman[321845]: 2026-02-23 10:03:33.992453427 +0000 UTC m=+0.137714235 container exec_died 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2026-02-05T04:57:10Z, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., distribution-scope=public, architecture=x86_64, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., version=9.7, name=ubi9/ubi-minimal, org.opencontainers.image.created=2026-02-05T04:57:10Z, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, release=1770267347, config_id=openstack_network_exporter, container_name=openstack_network_exporter)
Feb 23 10:03:34 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:03:33.999 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:03:34 np0005626463.localdomain systemd[1]: 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.service: Deactivated successfully.
Feb 23 10:03:34 np0005626463.localdomain podman[321846]: 2026-02-23 10:03:33.99605035 +0000 UTC m=+0.139055907 container health_status da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 23 10:03:34 np0005626463.localdomain podman[321846]: 2026-02-23 10:03:34.078274563 +0000 UTC m=+0.221280080 container exec_died da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 23 10:03:34 np0005626463.localdomain systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Deactivated successfully.
Feb 23 10:03:34 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:03:34.230 282211 DEBUG nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 23 10:03:34 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:03:34.230 282211 DEBUG nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 23 10:03:34 np0005626463.localdomain ceph-mon[294160]: pgmap v522: 177 pgs: 177 active+clean; 201 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 64 KiB/s rd, 160 KiB/s wr, 99 op/s
Feb 23 10:03:34 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:03:34.439 282211 WARNING nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 23 10:03:34 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:03:34.440 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Hypervisor/Node resource view: name=np0005626463.localdomain free_ram=11255MB free_disk=41.8366584777832GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 23 10:03:34 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:03:34.440 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 10:03:34 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:03:34.440 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 10:03:34 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:03:34.562 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Instance c2a7d92b-952f-46a7-8a6a-3322a48fcf4b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 23 10:03:34 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:03:34.563 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 23 10:03:34 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:03:34.563 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Final resource view: name=np0005626463.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 23 10:03:34 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:03:34.630 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 10:03:34 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow r pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} v 0)
Feb 23 10:03:34 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow r pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} : dispatch
Feb 23 10:03:34 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow r pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"}]': finished
Feb 23 10:03:35 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 23 10:03:35 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/4272332485' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 10:03:35 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:03:35.116 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.486s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 10:03:35 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:03:35.123 282211 DEBUG nova.compute.provider_tree [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Inventory has not changed in ProviderTree for provider: be63d86c-a403-4ec9-a515-07ea2962cb4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 23 10:03:35 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:03:35.145 282211 DEBUG nova.scheduler.client.report [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Inventory has not changed for provider be63d86c-a403-4ec9-a515-07ea2962cb4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 23 10:03:35 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:03:35.147 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Compute_service record updated for np0005626463.localdomain:np0005626463.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 23 10:03:35 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:03:35.148 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.707s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 10:03:35 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Feb 23 10:03:35 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 10:03:35 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolumegroup rm", "vol_name": "cephfs", "group_name": "564c85df-1944-453b-837f-3ce015c7f964", "force": true, "format": "json"}]: dispatch
Feb 23 10:03:35 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Feb 23 10:03:35 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow r pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} : dispatch
Feb 23 10:03:35 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow r pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} : dispatch
Feb 23 10:03:35 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow r pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"}]': finished
Feb 23 10:03:35 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.106:0/4272332485' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 10:03:35 np0005626463.localdomain ceph-mon[294160]: from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 10:03:36 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:03:36.144 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:03:36 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:03:36.145 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:03:36 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:03:36.173 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:03:36 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:03:36.173 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:03:36 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:03:36.174 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:03:36 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "5fa488c6-5f64-4942-8c4d-23ef356045f5", "format": "json"}]: dispatch
Feb 23 10:03:36 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "5fa488c6-5f64-4942-8c4d-23ef356045f5", "force": true, "format": "json"}]: dispatch
Feb 23 10:03:36 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "a87f3747-06f6-4188-82ee-060b8ce9fc02", "auth_id": "alice bob", "tenant_id": "b8a78bca43aa415e9b740fe00d08afee", "access_level": "r", "format": "json"}]: dispatch
Feb 23 10:03:36 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "06a68059-7c6e-4e80-9728-910ba3a5d245", "snap_name": "55825af0-7142-4a33-bd75-4716e6819cc0", "format": "json"}]: dispatch
Feb 23 10:03:36 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "8ca6198b-7076-4a80-86fe-294603711c42", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 23 10:03:36 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "8ca6198b-7076-4a80-86fe-294603711c42", "format": "json"}]: dispatch
Feb 23 10:03:36 np0005626463.localdomain ceph-mon[294160]: pgmap v523: 177 pgs: 177 active+clean; 201 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 20 KiB/s rd, 89 KiB/s wr, 38 op/s
Feb 23 10:03:37 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e231 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 23 10:03:37 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e231 do_prune osdmap full prune enabled
Feb 23 10:03:37 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e232 e232: 6 total, 6 up, 6 in
Feb 23 10:03:37 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e232: 6 total, 6 up, 6 in
Feb 23 10:03:38 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:03:38.055 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:03:38 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice bob"} v 0)
Feb 23 10:03:38 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Feb 23 10:03:38 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished
Feb 23 10:03:38 np0005626463.localdomain ceph-mon[294160]: pgmap v524: 177 pgs: 177 active+clean; 201 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 15 KiB/s rd, 142 KiB/s wr, 35 op/s
Feb 23 10:03:38 np0005626463.localdomain ceph-mon[294160]: osdmap e232: 6 total, 6 up, 6 in
Feb 23 10:03:38 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Feb 23 10:03:38 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Feb 23 10:03:38 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Feb 23 10:03:38 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished
Feb 23 10:03:38 np0005626463.localdomain ceph-mds[286877]: mds.mds.np0005626463.qcthuc asok_command: session evict {filters=[auth_name=alice bob,client_metadata.root=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae],prefix=session evict} (starting...)
Feb 23 10:03:39 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 10:03:39.000 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=23, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '22:68:bc', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'c6:19:65:94:49:af'}, ipsec=False) old=SB_Global(nb_cfg=22) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 10:03:39 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:03:39.000 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:03:39 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 10:03:39.003 163572 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 23 10:03:39 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:03:39.003 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:03:39 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e232 do_prune osdmap full prune enabled
Feb 23 10:03:39 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e233 e233: 6 total, 6 up, 6 in
Feb 23 10:03:39 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolumegroup create", "vol_name": "cephfs", "group_name": "c79a6bcf-2cb2-4f61-9af5-279ad7119641", "mode": "0755", "format": "json"}]: dispatch
Feb 23 10:03:39 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "a87f3747-06f6-4188-82ee-060b8ce9fc02", "auth_id": "alice bob", "format": "json"}]: dispatch
Feb 23 10:03:39 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "a87f3747-06f6-4188-82ee-060b8ce9fc02", "auth_id": "alice bob", "format": "json"}]: dispatch
Feb 23 10:03:39 np0005626463.localdomain podman[242954]: time="2026-02-23T10:03:39Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 23 10:03:39 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e233: 6 total, 6 up, 6 in
Feb 23 10:03:39 np0005626463.localdomain podman[242954]: @ - - [23/Feb/2026:10:03:39 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 157081 "" "Go-http-client/1.1"
Feb 23 10:03:39 np0005626463.localdomain podman[242954]: @ - - [23/Feb/2026:10:03:39 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18833 "" "Go-http-client/1.1"
Feb 23 10:03:40 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e233 do_prune osdmap full prune enabled
Feb 23 10:03:40 np0005626463.localdomain ceph-mon[294160]: pgmap v526: 177 pgs: 177 active+clean; 201 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 17 KiB/s rd, 160 KiB/s wr, 39 op/s
Feb 23 10:03:40 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "06a68059-7c6e-4e80-9728-910ba3a5d245", "snap_name": "55825af0-7142-4a33-bd75-4716e6819cc0_b42c9f23-4d47-4f3f-9852-b48bf40cff7c", "force": true, "format": "json"}]: dispatch
Feb 23 10:03:40 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "06a68059-7c6e-4e80-9728-910ba3a5d245", "snap_name": "55825af0-7142-4a33-bd75-4716e6819cc0", "force": true, "format": "json"}]: dispatch
Feb 23 10:03:40 np0005626463.localdomain ceph-mon[294160]: osdmap e233: 6 total, 6 up, 6 in
Feb 23 10:03:40 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e234 e234: 6 total, 6 up, 6 in
Feb 23 10:03:40 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e234: 6 total, 6 up, 6 in
Feb 23 10:03:41 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Feb 23 10:03:41 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 10:03:41 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "8ca6198b-7076-4a80-86fe-294603711c42", "format": "json"}]: dispatch
Feb 23 10:03:41 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "8ca6198b-7076-4a80-86fe-294603711c42", "force": true, "format": "json"}]: dispatch
Feb 23 10:03:41 np0005626463.localdomain ceph-mon[294160]: osdmap e234: 6 total, 6 up, 6 in
Feb 23 10:03:41 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolumegroup rm", "vol_name": "cephfs", "group_name": "c79a6bcf-2cb2-4f61-9af5-279ad7119641", "force": true, "format": "json"}]: dispatch
Feb 23 10:03:41 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "8115884b-219c-4ac0-b085-66e7920ae15b", "size": 2147483648, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 23 10:03:41 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "8115884b-219c-4ac0-b085-66e7920ae15b", "format": "json"}]: dispatch
Feb 23 10:03:41 np0005626463.localdomain ceph-mon[294160]: from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 10:03:41 np0005626463.localdomain ceph-mon[294160]: pgmap v529: 177 pgs: 177 active+clean; 202 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 11 KiB/s rd, 191 KiB/s wr, 32 op/s
Feb 23 10:03:41 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} v 0)
Feb 23 10:03:41 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} : dispatch
Feb 23 10:03:41 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"}]': finished
Feb 23 10:03:41 np0005626463.localdomain ceph-osd[31633]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #45. Immutable memtables: 2.
Feb 23 10:03:42 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 10:03:42.008 163572 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=96b5bb93-7341-4ce6-9b93-6a5de566c711, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '23'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 10:03:42 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "a87f3747-06f6-4188-82ee-060b8ce9fc02", "auth_id": "alice", "tenant_id": "b8a78bca43aa415e9b740fe00d08afee", "access_level": "rw", "format": "json"}]: dispatch
Feb 23 10:03:42 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Feb 23 10:03:42 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} : dispatch
Feb 23 10:03:42 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} : dispatch
Feb 23 10:03:42 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"}]': finished
Feb 23 10:03:42 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e234 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 23 10:03:42 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e234 do_prune osdmap full prune enabled
Feb 23 10:03:42 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e235 e235: 6 total, 6 up, 6 in
Feb 23 10:03:42 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e235: 6 total, 6 up, 6 in
Feb 23 10:03:43 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T10:03:43Z|00363|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Feb 23 10:03:43 np0005626463.localdomain openstack_network_exporter[245358]: ERROR   10:03:43 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 23 10:03:43 np0005626463.localdomain openstack_network_exporter[245358]: ERROR   10:03:43 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 23 10:03:43 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Feb 23 10:03:43 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 10:03:43 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e235 do_prune osdmap full prune enabled
Feb 23 10:03:43 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e236 e236: 6 total, 6 up, 6 in
Feb 23 10:03:43 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e236: 6 total, 6 up, 6 in
Feb 23 10:03:43 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "06a68059-7c6e-4e80-9728-910ba3a5d245", "format": "json"}]: dispatch
Feb 23 10:03:43 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "06a68059-7c6e-4e80-9728-910ba3a5d245", "force": true, "format": "json"}]: dispatch
Feb 23 10:03:43 np0005626463.localdomain ceph-mon[294160]: osdmap e235: 6 total, 6 up, 6 in
Feb 23 10:03:43 np0005626463.localdomain ceph-mon[294160]: pgmap v531: 177 pgs: 177 active+clean; 202 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 33 KiB/s rd, 87 KiB/s wr, 52 op/s
Feb 23 10:03:43 np0005626463.localdomain ceph-mon[294160]: from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 10:03:44 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:03:44.006 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:03:44 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Feb 23 10:03:44 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 10:03:44 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "4dead0ae-6bab-4667-ac7c-3228f29e590a", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 23 10:03:44 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "4dead0ae-6bab-4667-ac7c-3228f29e590a", "format": "json"}]: dispatch
Feb 23 10:03:44 np0005626463.localdomain ceph-mon[294160]: osdmap e236: 6 total, 6 up, 6 in
Feb 23 10:03:44 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume resize", "vol_name": "cephfs", "sub_name": "8115884b-219c-4ac0-b085-66e7920ae15b", "new_size": 1073741824, "no_shrink": true, "format": "json"}]: dispatch
Feb 23 10:03:44 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "46bba3de-0cfe-49be-8cb3-692c9a38d3b6", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 23 10:03:44 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "46bba3de-0cfe-49be-8cb3-692c9a38d3b6", "format": "json"}]: dispatch
Feb 23 10:03:44 np0005626463.localdomain ceph-mon[294160]: from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 10:03:45 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice"} v 0)
Feb 23 10:03:45 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Feb 23 10:03:45 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished
Feb 23 10:03:45 np0005626463.localdomain ceph-mds[286877]: mds.mds.np0005626463.qcthuc asok_command: session evict {filters=[auth_name=alice,client_metadata.root=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae],prefix=session evict} (starting...)
Feb 23 10:03:45 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "a87f3747-06f6-4188-82ee-060b8ce9fc02", "auth_id": "alice", "format": "json"}]: dispatch
Feb 23 10:03:45 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Feb 23 10:03:45 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Feb 23 10:03:45 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Feb 23 10:03:45 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished
Feb 23 10:03:45 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "a87f3747-06f6-4188-82ee-060b8ce9fc02", "auth_id": "alice", "format": "json"}]: dispatch
Feb 23 10:03:45 np0005626463.localdomain ceph-mon[294160]: pgmap v533: 177 pgs: 177 active+clean; 202 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 32 KiB/s rd, 85 KiB/s wr, 51 op/s
Feb 23 10:03:47 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/357402712' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 23 10:03:47 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/357402712' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 23 10:03:47 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e236 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 23 10:03:47 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e236 do_prune osdmap full prune enabled
Feb 23 10:03:47 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e237 e237: 6 total, 6 up, 6 in
Feb 23 10:03:47 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e237: 6 total, 6 up, 6 in
Feb 23 10:03:47 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.
Feb 23 10:03:47 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.
Feb 23 10:03:47 np0005626463.localdomain podman[321912]: 2026-02-23 10:03:47.924451707 +0000 UTC m=+0.093763409 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, container_name=ovn_controller, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller)
Feb 23 10:03:47 np0005626463.localdomain podman[321913]: 2026-02-23 10:03:47.975263918 +0000 UTC m=+0.141614349 container health_status bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 23 10:03:47 np0005626463.localdomain podman[321912]: 2026-02-23 10:03:47.990361575 +0000 UTC m=+0.159673277 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_controller)
Feb 23 10:03:48 np0005626463.localdomain systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully.
Feb 23 10:03:48 np0005626463.localdomain podman[321913]: 2026-02-23 10:03:48.012437268 +0000 UTC m=+0.178787719 container exec_died bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Feb 23 10:03:48 np0005626463.localdomain systemd[1]: bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.service: Deactivated successfully.
Feb 23 10:03:48 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "4dead0ae-6bab-4667-ac7c-3228f29e590a", "format": "json"}]: dispatch
Feb 23 10:03:48 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "4dead0ae-6bab-4667-ac7c-3228f29e590a", "force": true, "format": "json"}]: dispatch
Feb 23 10:03:48 np0005626463.localdomain ceph-mon[294160]: pgmap v534: 177 pgs: 177 active+clean; 203 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 77 KiB/s rd, 177 KiB/s wr, 118 op/s
Feb 23 10:03:48 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "8115884b-219c-4ac0-b085-66e7920ae15b", "format": "json"}]: dispatch
Feb 23 10:03:48 np0005626463.localdomain ceph-mon[294160]: osdmap e237: 6 total, 6 up, 6 in
Feb 23 10:03:48 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow r pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} v 0)
Feb 23 10:03:48 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow r pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} : dispatch
Feb 23 10:03:48 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow r pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"}]': finished
Feb 23 10:03:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 10:03:48.563 163572 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 10:03:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 10:03:48.563 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 10:03:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 10:03:48.564 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 10:03:49 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:03:49.008 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:03:49 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "8115884b-219c-4ac0-b085-66e7920ae15b", "force": true, "format": "json"}]: dispatch
Feb 23 10:03:49 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "46bba3de-0cfe-49be-8cb3-692c9a38d3b6", "format": "json"}]: dispatch
Feb 23 10:03:49 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "46bba3de-0cfe-49be-8cb3-692c9a38d3b6", "force": true, "format": "json"}]: dispatch
Feb 23 10:03:49 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "a87f3747-06f6-4188-82ee-060b8ce9fc02", "auth_id": "alice", "tenant_id": "b8a78bca43aa415e9b740fe00d08afee", "access_level": "r", "format": "json"}]: dispatch
Feb 23 10:03:49 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Feb 23 10:03:49 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow r pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} : dispatch
Feb 23 10:03:49 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow r pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} : dispatch
Feb 23 10:03:49 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow r pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"}]': finished
Feb 23 10:03:50 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e237 do_prune osdmap full prune enabled
Feb 23 10:03:50 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e238 e238: 6 total, 6 up, 6 in
Feb 23 10:03:50 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e238: 6 total, 6 up, 6 in
Feb 23 10:03:50 np0005626463.localdomain ceph-mon[294160]: pgmap v536: 177 pgs: 177 active+clean; 203 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 51 KiB/s rd, 107 KiB/s wr, 76 op/s
Feb 23 10:03:50 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Feb 23 10:03:50 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 10:03:50 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Feb 23 10:03:50 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 10:03:51 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Feb 23 10:03:51 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 10:03:51 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "b28e5355-81f5-4d56-b72e-21e83f187f85", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 23 10:03:51 np0005626463.localdomain ceph-mon[294160]: osdmap e238: 6 total, 6 up, 6 in
Feb 23 10:03:51 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "b28e5355-81f5-4d56-b72e-21e83f187f85", "format": "json"}]: dispatch
Feb 23 10:03:51 np0005626463.localdomain ceph-mon[294160]: from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 10:03:51 np0005626463.localdomain ceph-mon[294160]: from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 10:03:51 np0005626463.localdomain ceph-mon[294160]: from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 10:03:52 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e238 do_prune osdmap full prune enabled
Feb 23 10:03:52 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "7ad99e71-8300-4ae6-902e-ece59e9f7aad", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 23 10:03:52 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "7ad99e71-8300-4ae6-902e-ece59e9f7aad", "format": "json"}]: dispatch
Feb 23 10:03:52 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "718d23c9-8a91-4933-afe0-96e2eaaade86", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 23 10:03:52 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "718d23c9-8a91-4933-afe0-96e2eaaade86", "format": "json"}]: dispatch
Feb 23 10:03:52 np0005626463.localdomain ceph-mon[294160]: pgmap v538: 177 pgs: 177 active+clean; 203 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 45 KiB/s rd, 181 KiB/s wr, 74 op/s
Feb 23 10:03:52 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e239 e239: 6 total, 6 up, 6 in
Feb 23 10:03:52 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e239: 6 total, 6 up, 6 in
Feb 23 10:03:52 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice"} v 0)
Feb 23 10:03:52 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Feb 23 10:03:52 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e239 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 23 10:03:52 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e239 do_prune osdmap full prune enabled
Feb 23 10:03:52 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished
Feb 23 10:03:52 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e240 e240: 6 total, 6 up, 6 in
Feb 23 10:03:52 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e240: 6 total, 6 up, 6 in
Feb 23 10:03:52 np0005626463.localdomain ceph-mds[286877]: mds.mds.np0005626463.qcthuc asok_command: session evict {filters=[auth_name=alice,client_metadata.root=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae],prefix=session evict} (starting...)
Feb 23 10:03:52 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Feb 23 10:03:52 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 10:03:53 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "a87f3747-06f6-4188-82ee-060b8ce9fc02", "auth_id": "alice", "format": "json"}]: dispatch
Feb 23 10:03:53 np0005626463.localdomain ceph-mon[294160]: osdmap e239: 6 total, 6 up, 6 in
Feb 23 10:03:53 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Feb 23 10:03:53 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Feb 23 10:03:53 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Feb 23 10:03:53 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished
Feb 23 10:03:53 np0005626463.localdomain ceph-mon[294160]: osdmap e240: 6 total, 6 up, 6 in
Feb 23 10:03:53 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "a87f3747-06f6-4188-82ee-060b8ce9fc02", "auth_id": "alice", "format": "json"}]: dispatch
Feb 23 10:03:53 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "c2d99fde-fd9e-46a6-885a-dbda121973f1", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 23 10:03:53 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "c2d99fde-fd9e-46a6-885a-dbda121973f1", "format": "json"}]: dispatch
Feb 23 10:03:53 np0005626463.localdomain ceph-mon[294160]: from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 10:03:53 np0005626463.localdomain ceph-mon[294160]: pgmap v541: 177 pgs: 177 active+clean; 203 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 24 KiB/s rd, 121 KiB/s wr, 43 op/s
Feb 23 10:03:53 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.
Feb 23 10:03:53 np0005626463.localdomain podman[321956]: 2026-02-23 10:03:53.92010622 +0000 UTC m=+0.091792698 container health_status be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ceilometer_agent_compute, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute)
Feb 23 10:03:53 np0005626463.localdomain podman[321956]: 2026-02-23 10:03:53.934307499 +0000 UTC m=+0.105994007 container exec_died be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.license=GPLv2, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Feb 23 10:03:53 np0005626463.localdomain systemd[1]: be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.service: Deactivated successfully.
Feb 23 10:03:54 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:03:54.012 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:03:54 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "718d23c9-8a91-4933-afe0-96e2eaaade86", "format": "json"}]: dispatch
Feb 23 10:03:54 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "718d23c9-8a91-4933-afe0-96e2eaaade86", "force": true, "format": "json"}]: dispatch
Feb 23 10:03:54 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/3126635705' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 23 10:03:54 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/3126635705' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 23 10:03:54 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.
Feb 23 10:03:54 np0005626463.localdomain podman[321975]: 2026-02-23 10:03:54.922076827 +0000 UTC m=+0.088899060 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, tcib_managed=true, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2)
Feb 23 10:03:54 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} v 0)
Feb 23 10:03:54 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} : dispatch
Feb 23 10:03:54 np0005626463.localdomain podman[321975]: 2026-02-23 10:03:54.96131138 +0000 UTC m=+0.128133553 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ovn_metadata_agent, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Feb 23 10:03:54 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"}]': finished
Feb 23 10:03:54 np0005626463.localdomain systemd[1]: 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully.
Feb 23 10:03:55 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "b28e5355-81f5-4d56-b72e-21e83f187f85", "format": "json"}]: dispatch
Feb 23 10:03:55 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "b28e5355-81f5-4d56-b72e-21e83f187f85", "force": true, "format": "json"}]: dispatch
Feb 23 10:03:55 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "7ad99e71-8300-4ae6-902e-ece59e9f7aad", "format": "json"}]: dispatch
Feb 23 10:03:55 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "7ad99e71-8300-4ae6-902e-ece59e9f7aad", "force": true, "format": "json"}]: dispatch
Feb 23 10:03:55 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "a87f3747-06f6-4188-82ee-060b8ce9fc02", "auth_id": "alice_bob", "tenant_id": "b8a78bca43aa415e9b740fe00d08afee", "access_level": "rw", "format": "json"}]: dispatch
Feb 23 10:03:55 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Feb 23 10:03:55 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} : dispatch
Feb 23 10:03:55 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} : dispatch
Feb 23 10:03:55 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"}]': finished
Feb 23 10:03:55 np0005626463.localdomain ceph-mon[294160]: pgmap v542: 177 pgs: 177 active+clean; 203 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 23 KiB/s rd, 116 KiB/s wr, 41 op/s
Feb 23 10:03:56 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "c2d99fde-fd9e-46a6-885a-dbda121973f1", "snap_name": "bb21444e-b36b-4d43-bc43-885dd09ac729", "format": "json"}]: dispatch
Feb 23 10:03:57 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e240 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 23 10:03:57 np0005626463.localdomain ceph-mon[294160]: pgmap v543: 177 pgs: 177 active+clean; 204 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 38 KiB/s rd, 212 KiB/s wr, 69 op/s
Feb 23 10:03:58 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice_bob"} v 0)
Feb 23 10:03:58 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Feb 23 10:03:58 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished
Feb 23 10:03:58 np0005626463.localdomain ceph-mds[286877]: mds.mds.np0005626463.qcthuc asok_command: session evict {filters=[auth_name=alice_bob,client_metadata.root=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae],prefix=session evict} (starting...)
Feb 23 10:03:58 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Feb 23 10:03:58 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 10:03:59 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:03:59.016 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 10:03:59 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:03:59.018 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 10:03:59 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:03:59.019 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 10:03:59 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:03:59.019 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 10:03:59 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:03:59.041 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:03:59 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:03:59.042 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 10:03:59 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "77119be1-a395-4613-8428-4049b6a55ee4", "format": "json"}]: dispatch
Feb 23 10:03:59 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "77119be1-a395-4613-8428-4049b6a55ee4", "force": true, "format": "json"}]: dispatch
Feb 23 10:03:59 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "a87f3747-06f6-4188-82ee-060b8ce9fc02", "auth_id": "alice_bob", "format": "json"}]: dispatch
Feb 23 10:03:59 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Feb 23 10:03:59 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Feb 23 10:03:59 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Feb 23 10:03:59 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished
Feb 23 10:03:59 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "a87f3747-06f6-4188-82ee-060b8ce9fc02", "auth_id": "alice_bob", "format": "json"}]: dispatch
Feb 23 10:03:59 np0005626463.localdomain ceph-mon[294160]: from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 10:04:00 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "d04b36e0-00cb-49ab-bf81-b75179f44a78", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 23 10:04:00 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "d04b36e0-00cb-49ab-bf81-b75179f44a78", "format": "json"}]: dispatch
Feb 23 10:04:00 np0005626463.localdomain ceph-mon[294160]: pgmap v544: 177 pgs: 177 active+clean; 204 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 33 KiB/s rd, 106 KiB/s wr, 55 op/s
Feb 23 10:04:01 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "c2d99fde-fd9e-46a6-885a-dbda121973f1", "snap_name": "bb21444e-b36b-4d43-bc43-885dd09ac729_cedb6a9b-82fb-45a2-8d59-0fd8971a0ba8", "force": true, "format": "json"}]: dispatch
Feb 23 10:04:01 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "c2d99fde-fd9e-46a6-885a-dbda121973f1", "snap_name": "bb21444e-b36b-4d43-bc43-885dd09ac729", "force": true, "format": "json"}]: dispatch
Feb 23 10:04:01 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow r pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} v 0)
Feb 23 10:04:01 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow r pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} : dispatch
Feb 23 10:04:01 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow r pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"}]': finished
Feb 23 10:04:02 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "c65f4d3c-90b6-4215-892b-9d6eb1a375b1", "snap_name": "37d3cec1-9f1f-45f8-814b-ceabcde60a0c_457e4fec-6656-4e8e-9d9d-436c4ef214e1", "force": true, "format": "json"}]: dispatch
Feb 23 10:04:02 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "c65f4d3c-90b6-4215-892b-9d6eb1a375b1", "snap_name": "37d3cec1-9f1f-45f8-814b-ceabcde60a0c", "force": true, "format": "json"}]: dispatch
Feb 23 10:04:02 np0005626463.localdomain ceph-mon[294160]: pgmap v545: 177 pgs: 177 active+clean; 204 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 16 KiB/s rd, 148 KiB/s wr, 33 op/s
Feb 23 10:04:02 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Feb 23 10:04:02 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow r pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} : dispatch
Feb 23 10:04:02 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow r pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} : dispatch
Feb 23 10:04:02 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow r pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"}]': finished
Feb 23 10:04:02 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e240 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 23 10:04:02 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e240 do_prune osdmap full prune enabled
Feb 23 10:04:02 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e241 e241: 6 total, 6 up, 6 in
Feb 23 10:04:02 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e241: 6 total, 6 up, 6 in
Feb 23 10:04:03 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "a87f3747-06f6-4188-82ee-060b8ce9fc02", "auth_id": "alice_bob", "tenant_id": "b8a78bca43aa415e9b740fe00d08afee", "access_level": "r", "format": "json"}]: dispatch
Feb 23 10:04:03 np0005626463.localdomain ceph-mon[294160]: osdmap e241: 6 total, 6 up, 6 in
Feb 23 10:04:03 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e241 do_prune osdmap full prune enabled
Feb 23 10:04:03 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e242 e242: 6 total, 6 up, 6 in
Feb 23 10:04:03 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e242: 6 total, 6 up, 6 in
Feb 23 10:04:03 np0005626463.localdomain sshd[321994]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 10:04:04 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:04:04.043 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 10:04:04 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:04:04.046 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 10:04:04 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:04:04.047 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 10:04:04 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:04:04.047 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 10:04:04 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:04:04.086 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:04:04 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:04:04.087 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 10:04:04 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "c2d99fde-fd9e-46a6-885a-dbda121973f1", "format": "json"}]: dispatch
Feb 23 10:04:04 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "c2d99fde-fd9e-46a6-885a-dbda121973f1", "force": true, "format": "json"}]: dispatch
Feb 23 10:04:04 np0005626463.localdomain ceph-mon[294160]: pgmap v547: 177 pgs: 177 active+clean; 204 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 14 KiB/s rd, 132 KiB/s wr, 31 op/s
Feb 23 10:04:04 np0005626463.localdomain ceph-mon[294160]: osdmap e242: 6 total, 6 up, 6 in
Feb 23 10:04:04 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/881987299' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 23 10:04:04 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/881987299' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 23 10:04:04 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #61. Immutable memtables: 0.
Feb 23 10:04:04 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:04:04.151637) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Feb 23 10:04:04 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/flush_job.cc:856] [default] [JOB 35] Flushing memtable with next log file: 61
Feb 23 10:04:04 np0005626463.localdomain ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771841044151767, "job": 35, "event": "flush_started", "num_memtables": 1, "num_entries": 2661, "num_deletes": 265, "total_data_size": 2833728, "memory_usage": 2921888, "flush_reason": "Manual Compaction"}
Feb 23 10:04:04 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/flush_job.cc:885] [default] [JOB 35] Level-0 flush table #62: started
Feb 23 10:04:04 np0005626463.localdomain ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771841044172227, "cf_name": "default", "job": 35, "event": "table_file_creation", "file_number": 62, "file_size": 2771005, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 33119, "largest_seqno": 35779, "table_properties": {"data_size": 2759023, "index_size": 7591, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 3333, "raw_key_size": 30038, "raw_average_key_size": 22, "raw_value_size": 2733367, "raw_average_value_size": 2077, "num_data_blocks": 318, "num_entries": 1316, "num_filter_entries": 1316, "num_deletions": 265, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771840925, "oldest_key_time": 1771840925, "file_creation_time": 1771841044, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4cfd6c8f-aafa-4003-b2f6-d22c49635dd4", "db_session_id": "66DAQ76CBLV8DSGL8JC7", "orig_file_number": 62, "seqno_to_time_mapping": "N/A"}}
Feb 23 10:04:04 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 35] Flush lasted 20638 microseconds, and 8952 cpu microseconds.
Feb 23 10:04:04 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 23 10:04:04 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:04:04.172291) [db/flush_job.cc:967] [default] [JOB 35] Level-0 flush table #62: 2771005 bytes OK
Feb 23 10:04:04 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:04:04.172323) [db/memtable_list.cc:519] [default] Level-0 commit table #62 started
Feb 23 10:04:04 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:04:04.174304) [db/memtable_list.cc:722] [default] Level-0 commit table #62: memtable #1 done
Feb 23 10:04:04 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:04:04.174325) EVENT_LOG_v1 {"time_micros": 1771841044174319, "job": 35, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Feb 23 10:04:04 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:04:04.174360) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Feb 23 10:04:04 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 35] Try to delete WAL files size 2821520, prev total WAL file size 2821520, number of live WAL files 2.
Feb 23 10:04:04 np0005626463.localdomain ceph-mon[294160]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005626463/store.db/000058.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 23 10:04:04 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:04:04.175291) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003132353530' seq:72057594037927935, type:22 .. '7061786F73003132383032' seq:0, type:0; will stop at (end)
Feb 23 10:04:04 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 36] Compacting 1@0 + 1@6 files to L6, score -1.00
Feb 23 10:04:04 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 35 Base level 0, inputs: [62(2706KB)], [60(16MB)]
Feb 23 10:04:04 np0005626463.localdomain ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771841044175346, "job": 36, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [62], "files_L6": [60], "score": -1, "input_data_size": 20160651, "oldest_snapshot_seqno": -1}
Feb 23 10:04:04 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 36] Generated table #63: 14031 keys, 18879027 bytes, temperature: kUnknown
Feb 23 10:04:04 np0005626463.localdomain ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771841044291744, "cf_name": "default", "job": 36, "event": "table_file_creation", "file_number": 63, "file_size": 18879027, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 18796651, "index_size": 46179, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 35141, "raw_key_size": 374462, "raw_average_key_size": 26, "raw_value_size": 18555953, "raw_average_value_size": 1322, "num_data_blocks": 1751, "num_entries": 14031, "num_filter_entries": 14031, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771839971, "oldest_key_time": 0, "file_creation_time": 1771841044, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4cfd6c8f-aafa-4003-b2f6-d22c49635dd4", "db_session_id": "66DAQ76CBLV8DSGL8JC7", "orig_file_number": 63, "seqno_to_time_mapping": "N/A"}}
Feb 23 10:04:04 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 23 10:04:04 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:04:04.292195) [db/compaction/compaction_job.cc:1663] [default] [JOB 36] Compacted 1@0 + 1@6 files to L6 => 18879027 bytes
Feb 23 10:04:04 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:04:04.294064) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 173.0 rd, 162.0 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.6, 16.6 +0.0 blob) out(18.0 +0.0 blob), read-write-amplify(14.1) write-amplify(6.8) OK, records in: 14584, records dropped: 553 output_compression: NoCompression
Feb 23 10:04:04 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:04:04.294094) EVENT_LOG_v1 {"time_micros": 1771841044294080, "job": 36, "event": "compaction_finished", "compaction_time_micros": 116545, "compaction_time_cpu_micros": 54004, "output_level": 6, "num_output_files": 1, "total_output_size": 18879027, "num_input_records": 14584, "num_output_records": 14031, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Feb 23 10:04:04 np0005626463.localdomain ceph-mon[294160]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005626463/store.db/000062.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 23 10:04:04 np0005626463.localdomain ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771841044294687, "job": 36, "event": "table_file_deletion", "file_number": 62}
Feb 23 10:04:04 np0005626463.localdomain ceph-mon[294160]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005626463/store.db/000060.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 23 10:04:04 np0005626463.localdomain ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771841044297720, "job": 36, "event": "table_file_deletion", "file_number": 60}
Feb 23 10:04:04 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:04:04.175194) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 23 10:04:04 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:04:04.298232) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 23 10:04:04 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:04:04.298242) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 23 10:04:04 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:04:04.298265) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 23 10:04:04 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:04:04.298269) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 23 10:04:04 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:04:04.298274) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 23 10:04:04 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.
Feb 23 10:04:04 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.
Feb 23 10:04:04 np0005626463.localdomain systemd[1]: tmp-crun.pMVI1n.mount: Deactivated successfully.
Feb 23 10:04:04 np0005626463.localdomain podman[321996]: 2026-02-23 10:04:04.939361666 +0000 UTC m=+0.104976987 container health_status da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 23 10:04:04 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice_bob"} v 0)
Feb 23 10:04:04 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Feb 23 10:04:04 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished
Feb 23 10:04:04 np0005626463.localdomain podman[321995]: 2026-02-23 10:04:04.996075729 +0000 UTC m=+0.165702034 container health_status 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, container_name=openstack_network_exporter, vendor=Red Hat, Inc., release=1770267347, name=ubi9/ubi-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.openshift.expose-services=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, build-date=2026-02-05T04:57:10Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.7, config_id=openstack_network_exporter, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, org.opencontainers.image.created=2026-02-05T04:57:10Z, io.openshift.tags=minimal rhel9, architecture=x86_64, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers)
Feb 23 10:04:05 np0005626463.localdomain podman[321996]: 2026-02-23 10:04:05.002510197 +0000 UTC m=+0.168125498 container exec_died da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 23 10:04:05 np0005626463.localdomain systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Deactivated successfully.
Feb 23 10:04:05 np0005626463.localdomain podman[321995]: 2026-02-23 10:04:05.040383409 +0000 UTC m=+0.210009694 container exec_died 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, distribution-scope=public, io.openshift.expose-services=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, version=9.7, io.buildah.version=1.33.7, vcs-type=git, org.opencontainers.image.created=2026-02-05T04:57:10Z, io.openshift.tags=minimal rhel9, build-date=2026-02-05T04:57:10Z, release=1770267347, managed_by=edpm_ansible, name=ubi9/ubi-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, architecture=x86_64, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c)
Feb 23 10:04:05 np0005626463.localdomain systemd[1]: 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.service: Deactivated successfully.
Feb 23 10:04:05 np0005626463.localdomain ceph-mds[286877]: mds.mds.np0005626463.qcthuc asok_command: session evict {filters=[auth_name=alice_bob,client_metadata.root=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae],prefix=session evict} (starting...)
Feb 23 10:04:05 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "d04b36e0-00cb-49ab-bf81-b75179f44a78", "format": "json"}]: dispatch
Feb 23 10:04:05 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "d04b36e0-00cb-49ab-bf81-b75179f44a78", "force": true, "format": "json"}]: dispatch
Feb 23 10:04:05 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Feb 23 10:04:05 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Feb 23 10:04:05 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Feb 23 10:04:05 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished
Feb 23 10:04:05 np0005626463.localdomain sshd[321994]: Invalid user admin from 185.156.73.233 port 53806
Feb 23 10:04:05 np0005626463.localdomain sshd[321994]: Connection closed by invalid user admin 185.156.73.233 port 53806 [preauth]
Feb 23 10:04:05 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Feb 23 10:04:05 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 10:04:06 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : mgrmap e49: np0005626465.hlpkwo(active, since 13m), standbys: np0005626463.wtksup, np0005626466.nisqfq
Feb 23 10:04:06 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "c65f4d3c-90b6-4215-892b-9d6eb1a375b1", "format": "json"}]: dispatch
Feb 23 10:04:06 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "c65f4d3c-90b6-4215-892b-9d6eb1a375b1", "force": true, "format": "json"}]: dispatch
Feb 23 10:04:06 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "a87f3747-06f6-4188-82ee-060b8ce9fc02", "auth_id": "alice_bob", "format": "json"}]: dispatch
Feb 23 10:04:06 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "a87f3747-06f6-4188-82ee-060b8ce9fc02", "auth_id": "alice_bob", "format": "json"}]: dispatch
Feb 23 10:04:06 np0005626463.localdomain ceph-mon[294160]: pgmap v549: 177 pgs: 177 active+clean; 204 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 895 B/s rd, 61 KiB/s wr, 7 op/s
Feb 23 10:04:06 np0005626463.localdomain ceph-mon[294160]: from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 10:04:07 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "5f193bc0-59f1-4f26-8a9d-5945320ef049", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 23 10:04:07 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "5f193bc0-59f1-4f26-8a9d-5945320ef049", "format": "json"}]: dispatch
Feb 23 10:04:07 np0005626463.localdomain ceph-mon[294160]: mgrmap e49: np0005626465.hlpkwo(active, since 13m), standbys: np0005626463.wtksup, np0005626466.nisqfq
Feb 23 10:04:07 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e242 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 23 10:04:08 np0005626463.localdomain ceph-mon[294160]: pgmap v550: 177 pgs: 177 active+clean; 205 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 17 KiB/s rd, 183 KiB/s wr, 39 op/s
Feb 23 10:04:08 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} v 0)
Feb 23 10:04:08 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} : dispatch
Feb 23 10:04:08 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"}]': finished
Feb 23 10:04:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:04:09.088 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 10:04:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:04:09.089 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 10:04:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:04:09.089 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 10:04:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:04:09.089 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 10:04:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:04:09.121 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:04:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:04:09.122 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 10:04:09 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "d8b8020f-4ee6-469b-affa-207d2f0b4b2c", "format": "json"}]: dispatch
Feb 23 10:04:09 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "d8b8020f-4ee6-469b-affa-207d2f0b4b2c", "force": true, "format": "json"}]: dispatch
Feb 23 10:04:09 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Feb 23 10:04:09 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} : dispatch
Feb 23 10:04:09 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} : dispatch
Feb 23 10:04:09 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"}]': finished
Feb 23 10:04:09 np0005626463.localdomain podman[242954]: time="2026-02-23T10:04:09Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 23 10:04:09 np0005626463.localdomain podman[242954]: @ - - [23/Feb/2026:10:04:09 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 157081 "" "Go-http-client/1.1"
Feb 23 10:04:09 np0005626463.localdomain podman[242954]: @ - - [23/Feb/2026:10:04:09 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18836 "" "Go-http-client/1.1"
Feb 23 10:04:10 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "a87f3747-06f6-4188-82ee-060b8ce9fc02", "auth_id": "alice bob", "tenant_id": "b8a78bca43aa415e9b740fe00d08afee", "access_level": "rw", "format": "json"}]: dispatch
Feb 23 10:04:10 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "5f193bc0-59f1-4f26-8a9d-5945320ef049", "snap_name": "1ff3f892-8978-436c-bef2-ded3032f9484", "format": "json"}]: dispatch
Feb 23 10:04:10 np0005626463.localdomain ceph-mon[294160]: pgmap v551: 177 pgs: 177 active+clean; 205 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 16 KiB/s rd, 123 KiB/s wr, 34 op/s
Feb 23 10:04:11 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice bob"} v 0)
Feb 23 10:04:11 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Feb 23 10:04:11 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished
Feb 23 10:04:11 np0005626463.localdomain ceph-mds[286877]: mds.mds.np0005626463.qcthuc asok_command: session evict {filters=[auth_name=alice bob,client_metadata.root=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae],prefix=session evict} (starting...)
Feb 23 10:04:12 np0005626463.localdomain ceph-mon[294160]: pgmap v552: 177 pgs: 177 active+clean; 206 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 15 KiB/s rd, 160 KiB/s wr, 32 op/s
Feb 23 10:04:12 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Feb 23 10:04:12 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Feb 23 10:04:12 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Feb 23 10:04:12 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished
Feb 23 10:04:12 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e242 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 23 10:04:12 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e242 do_prune osdmap full prune enabled
Feb 23 10:04:12 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e243 e243: 6 total, 6 up, 6 in
Feb 23 10:04:12 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e243: 6 total, 6 up, 6 in
Feb 23 10:04:13 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Feb 23 10:04:13 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 10:04:13 np0005626463.localdomain openstack_network_exporter[245358]: ERROR   10:04:13 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 23 10:04:13 np0005626463.localdomain openstack_network_exporter[245358]: 
Feb 23 10:04:13 np0005626463.localdomain openstack_network_exporter[245358]: ERROR   10:04:13 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 23 10:04:13 np0005626463.localdomain openstack_network_exporter[245358]: 
Feb 23 10:04:13 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "a87f3747-06f6-4188-82ee-060b8ce9fc02", "auth_id": "alice bob", "format": "json"}]: dispatch
Feb 23 10:04:13 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "a87f3747-06f6-4188-82ee-060b8ce9fc02", "auth_id": "alice bob", "format": "json"}]: dispatch
Feb 23 10:04:13 np0005626463.localdomain ceph-mon[294160]: osdmap e243: 6 total, 6 up, 6 in
Feb 23 10:04:13 np0005626463.localdomain ceph-mon[294160]: from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 10:04:14 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:04:14.123 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 10:04:14 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:04:14.125 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 10:04:14 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:04:14.125 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 10:04:14 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:04:14.125 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 10:04:14 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:04:14.167 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:04:14 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:04:14.168 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 10:04:14 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "3993960e-fb7a-4270-9064-58b550d63afb", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 23 10:04:14 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "3993960e-fb7a-4270-9064-58b550d63afb", "format": "json"}]: dispatch
Feb 23 10:04:14 np0005626463.localdomain ceph-mon[294160]: pgmap v554: 177 pgs: 177 active+clean; 206 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 14 KiB/s rd, 144 KiB/s wr, 30 op/s
Feb 23 10:04:14 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow r pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} v 0)
Feb 23 10:04:14 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow r pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} : dispatch
Feb 23 10:04:14 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow r pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"}]': finished
Feb 23 10:04:15 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "a87f3747-06f6-4188-82ee-060b8ce9fc02", "auth_id": "alice bob", "tenant_id": "b8a78bca43aa415e9b740fe00d08afee", "access_level": "r", "format": "json"}]: dispatch
Feb 23 10:04:15 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Feb 23 10:04:15 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow r pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} : dispatch
Feb 23 10:04:15 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow r pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} : dispatch
Feb 23 10:04:15 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow r pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"}]': finished
Feb 23 10:04:15 np0005626463.localdomain ceph-mon[294160]: pgmap v555: 177 pgs: 177 active+clean; 206 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 13 KiB/s rd, 141 KiB/s wr, 29 op/s
Feb 23 10:04:15 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:04:15.966 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:04:16 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Feb 23 10:04:16 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 10:04:16 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "38f81eeb-b7e7-4646-8ff6-ff03f3551f4f", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 23 10:04:16 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "38f81eeb-b7e7-4646-8ff6-ff03f3551f4f", "format": "json"}]: dispatch
Feb 23 10:04:16 np0005626463.localdomain ceph-mon[294160]: from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 10:04:16 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "3993960e-fb7a-4270-9064-58b550d63afb", "format": "json"}]: dispatch
Feb 23 10:04:16 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "3993960e-fb7a-4270-9064-58b550d63afb", "force": true, "format": "json"}]: dispatch
Feb 23 10:04:17 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e243 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 23 10:04:17 np0005626463.localdomain ceph-mon[294160]: pgmap v556: 177 pgs: 177 active+clean; 206 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 119 KiB/s wr, 9 op/s
Feb 23 10:04:17 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/3043800053' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 23 10:04:17 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/3043800053' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 23 10:04:18 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice bob"} v 0)
Feb 23 10:04:18 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Feb 23 10:04:18 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished
Feb 23 10:04:18 np0005626463.localdomain ceph-mds[286877]: mds.mds.np0005626463.qcthuc asok_command: session evict {filters=[auth_name=alice bob,client_metadata.root=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae],prefix=session evict} (starting...)
Feb 23 10:04:18 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Feb 23 10:04:18 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 10:04:18 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:04:18.808 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:04:18 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "a87f3747-06f6-4188-82ee-060b8ce9fc02", "auth_id": "alice bob", "format": "json"}]: dispatch
Feb 23 10:04:18 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Feb 23 10:04:18 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Feb 23 10:04:18 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Feb 23 10:04:18 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished
Feb 23 10:04:18 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "a87f3747-06f6-4188-82ee-060b8ce9fc02", "auth_id": "alice bob", "format": "json"}]: dispatch
Feb 23 10:04:18 np0005626463.localdomain ceph-mon[294160]: from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 10:04:18 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.
Feb 23 10:04:18 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.
Feb 23 10:04:18 np0005626463.localdomain podman[322039]: 2026-02-23 10:04:18.92714154 +0000 UTC m=+0.086517505 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible)
Feb 23 10:04:18 np0005626463.localdomain podman[322040]: 2026-02-23 10:04:18.983190613 +0000 UTC m=+0.137896654 container health_status bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 23 10:04:18 np0005626463.localdomain podman[322040]: 2026-02-23 10:04:18.992402988 +0000 UTC m=+0.147109019 container exec_died bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 23 10:04:19 np0005626463.localdomain systemd[1]: bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.service: Deactivated successfully.
Feb 23 10:04:19 np0005626463.localdomain podman[322039]: 2026-02-23 10:04:19.046738538 +0000 UTC m=+0.206114493 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, config_id=ovn_controller, managed_by=edpm_ansible)
Feb 23 10:04:19 np0005626463.localdomain systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully.
Feb 23 10:04:19 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:04:19.168 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:04:19 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:04:19.171 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:04:19 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Feb 23 10:04:19 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 10:04:19 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Feb 23 10:04:19 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 10:04:19 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "fd4d1e61-7ed1-400d-936e-3350eedf43bb", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 23 10:04:19 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "fd4d1e61-7ed1-400d-936e-3350eedf43bb", "format": "json"}]: dispatch
Feb 23 10:04:19 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "f018492c-b089-4142-aefb-5bfccd1b64b1", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 23 10:04:19 np0005626463.localdomain ceph-mon[294160]: pgmap v557: 177 pgs: 177 active+clean; 206 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 119 KiB/s wr, 9 op/s
Feb 23 10:04:19 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "f018492c-b089-4142-aefb-5bfccd1b64b1", "format": "json"}]: dispatch
Feb 23 10:04:19 np0005626463.localdomain ceph-mon[294160]: from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 10:04:19 np0005626463.localdomain ceph-mon[294160]: from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 10:04:20 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "99702c97-e987-48a2-abde-b72286105ebc", "size": 2147483648, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 23 10:04:20 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "99702c97-e987-48a2-abde-b72286105ebc", "format": "json"}]: dispatch
Feb 23 10:04:21 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} v 0)
Feb 23 10:04:21 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} : dispatch
Feb 23 10:04:21 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"}]': finished
Feb 23 10:04:21 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "a87f3747-06f6-4188-82ee-060b8ce9fc02", "auth_id": "alice", "tenant_id": "b8a78bca43aa415e9b740fe00d08afee", "access_level": "rw", "format": "json"}]: dispatch
Feb 23 10:04:21 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Feb 23 10:04:21 np0005626463.localdomain ceph-mon[294160]: pgmap v558: 177 pgs: 177 active+clean; 207 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 409 B/s rd, 134 KiB/s wr, 11 op/s
Feb 23 10:04:21 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} : dispatch
Feb 23 10:04:21 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} : dispatch
Feb 23 10:04:21 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"}]': finished
Feb 23 10:04:22 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e243 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 23 10:04:22 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "fd4d1e61-7ed1-400d-936e-3350eedf43bb", "format": "json"}]: dispatch
Feb 23 10:04:22 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "fd4d1e61-7ed1-400d-936e-3350eedf43bb", "force": true, "format": "json"}]: dispatch
Feb 23 10:04:23 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T10:04:23Z|00364|binding|INFO|Releasing lport 4143c8ea-7577-4792-9744-bcff90eb20f2 from this chassis (sb_readonly=0)
Feb 23 10:04:23 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:04:23.623 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:04:23 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "f018492c-b089-4142-aefb-5bfccd1b64b1", "format": "json"}]: dispatch
Feb 23 10:04:23 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "f018492c-b089-4142-aefb-5bfccd1b64b1", "force": true, "format": "json"}]: dispatch
Feb 23 10:04:23 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "99702c97-e987-48a2-abde-b72286105ebc", "format": "json"}]: dispatch
Feb 23 10:04:23 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "99702c97-e987-48a2-abde-b72286105ebc", "force": true, "format": "json"}]: dispatch
Feb 23 10:04:23 np0005626463.localdomain ceph-mon[294160]: pgmap v559: 177 pgs: 177 active+clean; 207 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 380 B/s rd, 124 KiB/s wr, 10 op/s
Feb 23 10:04:24 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:04:24.170 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:04:24 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:04:24.173 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:04:24 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice"} v 0)
Feb 23 10:04:24 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Feb 23 10:04:24 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished
Feb 23 10:04:24 np0005626463.localdomain ceph-mds[286877]: mds.mds.np0005626463.qcthuc asok_command: session evict {filters=[auth_name=alice,client_metadata.root=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae],prefix=session evict} (starting...)
Feb 23 10:04:24 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.
Feb 23 10:04:24 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Feb 23 10:04:24 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Feb 23 10:04:24 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Feb 23 10:04:24 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished
Feb 23 10:04:24 np0005626463.localdomain podman[322085]: 2026-02-23 10:04:24.919975993 +0000 UTC m=+0.092340455 container health_status be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true)
Feb 23 10:04:24 np0005626463.localdomain podman[322085]: 2026-02-23 10:04:24.962350303 +0000 UTC m=+0.134714755 container exec_died be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, config_id=ceilometer_agent_compute, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute)
Feb 23 10:04:24 np0005626463.localdomain systemd[1]: be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.service: Deactivated successfully.
Feb 23 10:04:24 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.
Feb 23 10:04:25 np0005626463.localdomain podman[322104]: 2026-02-23 10:04:25.104829588 +0000 UTC m=+0.097356740 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.43.0, managed_by=edpm_ansible, container_name=ovn_metadata_agent, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Feb 23 10:04:25 np0005626463.localdomain podman[322104]: 2026-02-23 10:04:25.141378398 +0000 UTC m=+0.133905510 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260216, config_id=ovn_metadata_agent, io.buildah.version=1.43.0, managed_by=edpm_ansible, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team)
Feb 23 10:04:25 np0005626463.localdomain systemd[1]: 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully.
Feb 23 10:04:25 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "a87f3747-06f6-4188-82ee-060b8ce9fc02", "auth_id": "alice", "format": "json"}]: dispatch
Feb 23 10:04:25 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "a87f3747-06f6-4188-82ee-060b8ce9fc02", "auth_id": "alice", "format": "json"}]: dispatch
Feb 23 10:04:25 np0005626463.localdomain ceph-mon[294160]: pgmap v560: 177 pgs: 177 active+clean; 207 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 341 B/s rd, 112 KiB/s wr, 9 op/s
Feb 23 10:04:26 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:04:26.056 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:04:26 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:04:26.056 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 23 10:04:26 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Feb 23 10:04:26 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 10:04:26 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "38f81eeb-b7e7-4646-8ff6-ff03f3551f4f", "format": "json"}]: dispatch
Feb 23 10:04:26 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "38f81eeb-b7e7-4646-8ff6-ff03f3551f4f", "force": true, "format": "json"}]: dispatch
Feb 23 10:04:26 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "191b0228-9167-42d0-943c-9ccefc03c217", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 23 10:04:26 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "191b0228-9167-42d0-943c-9ccefc03c217", "format": "json"}]: dispatch
Feb 23 10:04:26 np0005626463.localdomain ceph-mon[294160]: from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 10:04:27 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:04:27.057 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:04:27 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:04:27.057 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 23 10:04:27 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:04:27.058 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 23 10:04:27 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:04:27.157 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 23 10:04:27 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:04:27.159 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquired lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 23 10:04:27 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:04:27.159 282211 DEBUG nova.network.neutron [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 23 10:04:27 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:04:27.159 282211 DEBUG nova.objects.instance [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c2a7d92b-952f-46a7-8a6a-3322a48fcf4b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 23 10:04:27 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e243 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 23 10:04:27 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow r pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} v 0)
Feb 23 10:04:27 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow r pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} : dispatch
Feb 23 10:04:27 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:04:27.886 282211 DEBUG nova.network.neutron [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Updating instance_info_cache with network_info: [{"id": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "address": "fa:16:3e:a0:9d:00", "network": {"id": "9da5b53d-3184-450f-9a5b-bdba1a6c9f6d", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "37b8098efb0d4ecc90b451a2db0e966f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa27e5011-20", "ovs_interfaceid": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 23 10:04:27 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow r pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"}]': finished
Feb 23 10:04:27 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:04:27.901 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Releasing lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 23 10:04:27 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:04:27.902 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 23 10:04:27 np0005626463.localdomain ceph-mon[294160]: pgmap v561: 177 pgs: 177 active+clean; 208 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 195 KiB/s wr, 15 op/s
Feb 23 10:04:27 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Feb 23 10:04:27 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow r pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} : dispatch
Feb 23 10:04:27 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow r pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} : dispatch
Feb 23 10:04:27 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow r pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"}]': finished
Feb 23 10:04:28 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:04:28.054 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:04:28 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:04:28.055 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Feb 23 10:04:28 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "a87f3747-06f6-4188-82ee-060b8ce9fc02", "auth_id": "alice", "tenant_id": "b8a78bca43aa415e9b740fe00d08afee", "access_level": "r", "format": "json"}]: dispatch
Feb 23 10:04:28 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.108:0/3001108921' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 10:04:29 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:04:29.174 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 10:04:29 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:04:29.176 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 10:04:29 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:04:29.177 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 10:04:29 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:04:29.178 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 10:04:29 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:04:29.213 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:04:29 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:04:29.214 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 10:04:29 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Feb 23 10:04:29 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 10:04:29 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "13a5ec60-9f6a-43c3-975b-c3ae8bf92865", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 23 10:04:29 np0005626463.localdomain ceph-mon[294160]: pgmap v562: 177 pgs: 177 active+clean; 208 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 426 B/s rd, 131 KiB/s wr, 10 op/s
Feb 23 10:04:29 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "13a5ec60-9f6a-43c3-975b-c3ae8bf92865", "format": "json"}]: dispatch
Feb 23 10:04:29 np0005626463.localdomain ceph-mon[294160]: from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 10:04:29 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.108:0/3811629531' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 10:04:30 np0005626463.localdomain sudo[322122]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 10:04:30 np0005626463.localdomain sudo[322122]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 10:04:30 np0005626463.localdomain sudo[322122]: pam_unix(sudo:session): session closed for user root
Feb 23 10:04:30 np0005626463.localdomain sudo[322140]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 23 10:04:30 np0005626463.localdomain sudo[322140]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 10:04:31 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "191b0228-9167-42d0-943c-9ccefc03c217", "format": "json"}]: dispatch
Feb 23 10:04:31 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "191b0228-9167-42d0-943c-9ccefc03c217", "force": true, "format": "json"}]: dispatch
Feb 23 10:04:31 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.107:0/1163012827' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 10:04:31 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:04:31.068 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:04:31 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice"} v 0)
Feb 23 10:04:31 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Feb 23 10:04:31 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished
Feb 23 10:04:31 np0005626463.localdomain ceph-mds[286877]: mds.mds.np0005626463.qcthuc asok_command: session evict {filters=[auth_name=alice,client_metadata.root=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae],prefix=session evict} (starting...)
Feb 23 10:04:31 np0005626463.localdomain sudo[322140]: pam_unix(sudo:session): session closed for user root
Feb 23 10:04:31 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Feb 23 10:04:31 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 10:04:31 np0005626463.localdomain sudo[322189]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 10:04:31 np0005626463.localdomain sudo[322189]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 10:04:31 np0005626463.localdomain sudo[322189]: pam_unix(sudo:session): session closed for user root
Feb 23 10:04:32 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "a87f3747-06f6-4188-82ee-060b8ce9fc02", "auth_id": "alice", "format": "json"}]: dispatch
Feb 23 10:04:32 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Feb 23 10:04:32 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Feb 23 10:04:32 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Feb 23 10:04:32 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished
Feb 23 10:04:32 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "a87f3747-06f6-4188-82ee-060b8ce9fc02", "auth_id": "alice", "format": "json"}]: dispatch
Feb 23 10:04:32 np0005626463.localdomain ceph-mon[294160]: pgmap v563: 177 pgs: 177 active+clean; 208 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 682 B/s rd, 132 KiB/s wr, 12 op/s
Feb 23 10:04:32 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.107:0/1919430634' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 10:04:32 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 10:04:32 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 23 10:04:32 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 10:04:32 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 23 10:04:32 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:04:32.055 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:04:32 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:04:32.075 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 10:04:32 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:04:32.075 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 10:04:32 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:04:32.076 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 10:04:32 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:04:32.076 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Auditing locally available compute resources for np0005626463.localdomain (node: np0005626463.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 23 10:04:32 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:04:32.076 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 10:04:32 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 23 10:04:32 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/3464871891' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 10:04:32 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:04:32.527 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.451s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 10:04:32 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e243 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 23 10:04:32 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #64. Immutable memtables: 0.
Feb 23 10:04:32 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:04:32.584064) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Feb 23 10:04:32 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/flush_job.cc:856] [default] [JOB 37] Flushing memtable with next log file: 64
Feb 23 10:04:32 np0005626463.localdomain ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771841072584168, "job": 37, "event": "flush_started", "num_memtables": 1, "num_entries": 788, "num_deletes": 261, "total_data_size": 743234, "memory_usage": 758088, "flush_reason": "Manual Compaction"}
Feb 23 10:04:32 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/flush_job.cc:885] [default] [JOB 37] Level-0 flush table #65: started
Feb 23 10:04:32 np0005626463.localdomain ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771841072596390, "cf_name": "default", "job": 37, "event": "table_file_creation", "file_number": 65, "file_size": 733430, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 35780, "largest_seqno": 36567, "table_properties": {"data_size": 729463, "index_size": 1566, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1349, "raw_key_size": 10567, "raw_average_key_size": 20, "raw_value_size": 720714, "raw_average_value_size": 1388, "num_data_blocks": 68, "num_entries": 519, "num_filter_entries": 519, "num_deletions": 261, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771841044, "oldest_key_time": 1771841044, "file_creation_time": 1771841072, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4cfd6c8f-aafa-4003-b2f6-d22c49635dd4", "db_session_id": "66DAQ76CBLV8DSGL8JC7", "orig_file_number": 65, "seqno_to_time_mapping": "N/A"}}
Feb 23 10:04:32 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 37] Flush lasted 12373 microseconds, and 5312 cpu microseconds.
Feb 23 10:04:32 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 23 10:04:32 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:04:32.596 282211 DEBUG nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 23 10:04:32 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:04:32.597 282211 DEBUG nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 23 10:04:32 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:04:32.596449) [db/flush_job.cc:967] [default] [JOB 37] Level-0 flush table #65: 733430 bytes OK
Feb 23 10:04:32 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:04:32.596480) [db/memtable_list.cc:519] [default] Level-0 commit table #65 started
Feb 23 10:04:32 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:04:32.598722) [db/memtable_list.cc:722] [default] Level-0 commit table #65: memtable #1 done
Feb 23 10:04:32 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:04:32.598744) EVENT_LOG_v1 {"time_micros": 1771841072598737, "job": 37, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Feb 23 10:04:32 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:04:32.598776) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Feb 23 10:04:32 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 37] Try to delete WAL files size 738909, prev total WAL file size 739233, number of live WAL files 2.
Feb 23 10:04:32 np0005626463.localdomain ceph-mon[294160]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005626463/store.db/000061.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 23 10:04:32 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:04:32.599443) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0034323730' seq:72057594037927935, type:22 .. '6C6F676D0034353235' seq:0, type:0; will stop at (end)
Feb 23 10:04:32 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 38] Compacting 1@0 + 1@6 files to L6, score -1.00
Feb 23 10:04:32 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 37 Base level 0, inputs: [65(716KB)], [63(18MB)]
Feb 23 10:04:32 np0005626463.localdomain ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771841072599495, "job": 38, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [65], "files_L6": [63], "score": -1, "input_data_size": 19612457, "oldest_snapshot_seqno": -1}
Feb 23 10:04:32 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 38] Generated table #66: 14002 keys, 19209445 bytes, temperature: kUnknown
Feb 23 10:04:32 np0005626463.localdomain ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771841072721903, "cf_name": "default", "job": 38, "event": "table_file_creation", "file_number": 66, "file_size": 19209445, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 19127151, "index_size": 46225, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 35013, "raw_key_size": 375300, "raw_average_key_size": 26, "raw_value_size": 18886681, "raw_average_value_size": 1348, "num_data_blocks": 1745, "num_entries": 14002, "num_filter_entries": 14002, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771839971, "oldest_key_time": 0, "file_creation_time": 1771841072, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4cfd6c8f-aafa-4003-b2f6-d22c49635dd4", "db_session_id": "66DAQ76CBLV8DSGL8JC7", "orig_file_number": 66, "seqno_to_time_mapping": "N/A"}}
Feb 23 10:04:32 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 23 10:04:32 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:04:32.722185) [db/compaction/compaction_job.cc:1663] [default] [JOB 38] Compacted 1@0 + 1@6 files to L6 => 19209445 bytes
Feb 23 10:04:32 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:04:32.724649) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 160.1 rd, 156.8 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.7, 18.0 +0.0 blob) out(18.3 +0.0 blob), read-write-amplify(52.9) write-amplify(26.2) OK, records in: 14550, records dropped: 548 output_compression: NoCompression
Feb 23 10:04:32 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:04:32.724669) EVENT_LOG_v1 {"time_micros": 1771841072724660, "job": 38, "event": "compaction_finished", "compaction_time_micros": 122471, "compaction_time_cpu_micros": 52435, "output_level": 6, "num_output_files": 1, "total_output_size": 19209445, "num_input_records": 14550, "num_output_records": 14002, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Feb 23 10:04:32 np0005626463.localdomain ceph-mon[294160]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005626463/store.db/000065.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 23 10:04:32 np0005626463.localdomain ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771841072724900, "job": 38, "event": "table_file_deletion", "file_number": 65}
Feb 23 10:04:32 np0005626463.localdomain ceph-mon[294160]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005626463/store.db/000063.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 23 10:04:32 np0005626463.localdomain ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771841072726636, "job": 38, "event": "table_file_deletion", "file_number": 63}
Feb 23 10:04:32 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:04:32.599352) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 23 10:04:32 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:04:32.726716) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 23 10:04:32 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:04:32.726723) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 23 10:04:32 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:04:32.726727) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 23 10:04:32 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:04:32.726731) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 23 10:04:32 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:04:32.726735) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 23 10:04:32 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:04:32.848 282211 WARNING nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 23 10:04:32 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:04:32.850 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Hypervisor/Node resource view: name=np0005626463.localdomain free_ram=11245MB free_disk=41.8366584777832GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 23 10:04:32 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:04:32.851 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 10:04:32 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:04:32.851 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 10:04:32 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Feb 23 10:04:32 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 10:04:33 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:04:33.035 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Instance c2a7d92b-952f-46a7-8a6a-3322a48fcf4b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 23 10:04:33 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:04:33.036 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 23 10:04:33 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:04:33.036 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Final resource view: name=np0005626463.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 23 10:04:33 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:04:33.097 282211 DEBUG nova.scheduler.client.report [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Refreshing inventories for resource provider be63d86c-a403-4ec9-a515-07ea2962cb4d _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Feb 23 10:04:33 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:04:33.171 282211 DEBUG nova.scheduler.client.report [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Updating ProviderTree inventory for provider be63d86c-a403-4ec9-a515-07ea2962cb4d from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Feb 23 10:04:33 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:04:33.172 282211 DEBUG nova.compute.provider_tree [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Updating inventory in ProviderTree for provider be63d86c-a403-4ec9-a515-07ea2962cb4d with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Feb 23 10:04:33 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:04:33.187 282211 DEBUG nova.scheduler.client.report [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Refreshing aggregate associations for resource provider be63d86c-a403-4ec9-a515-07ea2962cb4d, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Feb 23 10:04:33 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:04:33.220 282211 DEBUG nova.scheduler.client.report [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Refreshing trait associations for resource provider be63d86c-a403-4ec9-a515-07ea2962cb4d, traits: HW_CPU_X86_AVX2,COMPUTE_ACCELERATORS,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_MMX,HW_CPU_X86_SSE41,HW_CPU_X86_SSE2,HW_CPU_X86_SVM,COMPUTE_TRUSTED_CERTS,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_SECURITY_TPM_2_0,COMPUTE_STORAGE_BUS_FDC,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_E1000,HW_CPU_X86_BMI2,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_RESCUE_BFV,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_CLMUL,HW_CPU_X86_SHA,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_AESNI,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_DEVICE_TAGGING,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_ABM,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NODE,COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_CIRRUS,HW_CPU_X86_SSE4A,HW_CPU_X86_BMI,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_F16C,COMPUTE_STORAGE_BUS_IDE,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_SSSE3,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_FMA3,HW_CPU_X86_AMD_SVM,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_AVX,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_SSE42 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Feb 23 10:04:33 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.106:0/3464871891' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 10:04:33 np0005626463.localdomain ceph-mon[294160]: from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 10:04:33 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:04:33.258 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 10:04:33 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 23 10:04:33 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/1770454627' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 10:04:33 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:04:33.748 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.490s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 10:04:33 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:04:33.756 282211 DEBUG nova.compute.provider_tree [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Inventory has not changed in ProviderTree for provider: be63d86c-a403-4ec9-a515-07ea2962cb4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 23 10:04:33 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:04:33.777 282211 DEBUG nova.scheduler.client.report [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Inventory has not changed for provider be63d86c-a403-4ec9-a515-07ea2962cb4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 23 10:04:33 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:04:33.779 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Compute_service record updated for np0005626463.localdomain:np0005626463.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 23 10:04:33 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:04:33.780 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.929s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 10:04:34 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:04:34.215 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 10:04:34 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:04:34.218 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 10:04:34 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:04:34.218 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 10:04:34 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:04:34.219 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 10:04:34 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:04:34.242 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:04:34 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:04:34.243 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 10:04:34 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "13a5ec60-9f6a-43c3-975b-c3ae8bf92865", "format": "json"}]: dispatch
Feb 23 10:04:34 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "13a5ec60-9f6a-43c3-975b-c3ae8bf92865", "force": true, "format": "json"}]: dispatch
Feb 23 10:04:34 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "6efe2d5a-4b19-4449-89fe-477e42835180", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 23 10:04:34 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "6efe2d5a-4b19-4449-89fe-477e42835180", "format": "json"}]: dispatch
Feb 23 10:04:34 np0005626463.localdomain ceph-mon[294160]: pgmap v564: 177 pgs: 177 active+clean; 208 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 426 B/s rd, 134 KiB/s wr, 11 op/s
Feb 23 10:04:34 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.106:0/1770454627' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 10:04:34 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:04:34.776 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:04:34 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:04:34.777 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:04:34 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} v 0)
Feb 23 10:04:34 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} : dispatch
Feb 23 10:04:34 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"}]': finished
Feb 23 10:04:35 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Feb 23 10:04:35 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} : dispatch
Feb 23 10:04:35 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} : dispatch
Feb 23 10:04:35 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"}]': finished
Feb 23 10:04:35 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Feb 23 10:04:35 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 10:04:35 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.
Feb 23 10:04:35 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.
Feb 23 10:04:35 np0005626463.localdomain podman[322252]: 2026-02-23 10:04:35.92514173 +0000 UTC m=+0.091923663 container health_status da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 23 10:04:35 np0005626463.localdomain podman[322252]: 2026-02-23 10:04:35.939333649 +0000 UTC m=+0.106115562 container exec_died da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 23 10:04:35 np0005626463.localdomain systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Deactivated successfully.
Feb 23 10:04:36 np0005626463.localdomain podman[322251]: 2026-02-23 10:04:35.999625762 +0000 UTC m=+0.169044276 container health_status 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, io.openshift.tags=minimal rhel9, org.opencontainers.image.created=2026-02-05T04:57:10Z, distribution-scope=public, release=1770267347, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, name=ubi9/ubi-minimal, build-date=2026-02-05T04:57:10Z, version=9.7, managed_by=edpm_ansible, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, architecture=x86_64, config_id=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, vcs-type=git, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 23 10:04:36 np0005626463.localdomain podman[322251]: 2026-02-23 10:04:36.012309215 +0000 UTC m=+0.181727689 container exec_died 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, name=ubi9/ubi-minimal, release=1770267347, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2026-02-05T04:57:10Z, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, config_id=openstack_network_exporter, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.7, vcs-type=git, managed_by=edpm_ansible, org.opencontainers.image.created=2026-02-05T04:57:10Z, container_name=openstack_network_exporter)
Feb 23 10:04:36 np0005626463.localdomain systemd[1]: 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.service: Deactivated successfully.
Feb 23 10:04:36 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:04:36.054 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:04:36 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "a87f3747-06f6-4188-82ee-060b8ce9fc02", "auth_id": "alice_bob", "tenant_id": "b8a78bca43aa415e9b740fe00d08afee", "access_level": "rw", "format": "json"}]: dispatch
Feb 23 10:04:36 np0005626463.localdomain ceph-mon[294160]: pgmap v565: 177 pgs: 177 active+clean; 208 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 426 B/s rd, 134 KiB/s wr, 10 op/s
Feb 23 10:04:36 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 10:04:37 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e243 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 23 10:04:37 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "6efe2d5a-4b19-4449-89fe-477e42835180", "format": "json"}]: dispatch
Feb 23 10:04:37 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "6efe2d5a-4b19-4449-89fe-477e42835180", "force": true, "format": "json"}]: dispatch
Feb 23 10:04:37 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "812d8099-4259-45c6-8802-8b5ec410d596", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 23 10:04:37 np0005626463.localdomain ceph-mon[294160]: pgmap v566: 177 pgs: 177 active+clean; 209 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 767 B/s rd, 192 KiB/s wr, 15 op/s
Feb 23 10:04:37 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Feb 23 10:04:37 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 10:04:38 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : mgrmap e50: np0005626465.hlpkwo(active, since 13m), standbys: np0005626463.wtksup, np0005626466.nisqfq
Feb 23 10:04:38 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:04:38.054 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:04:38 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:04:38.055 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:04:38 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice_bob"} v 0)
Feb 23 10:04:38 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Feb 23 10:04:38 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished
Feb 23 10:04:38 np0005626463.localdomain ceph-mds[286877]: mds.mds.np0005626463.qcthuc asok_command: session evict {filters=[auth_name=alice_bob,client_metadata.root=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae],prefix=session evict} (starting...)
Feb 23 10:04:38 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "812d8099-4259-45c6-8802-8b5ec410d596", "format": "json"}]: dispatch
Feb 23 10:04:38 np0005626463.localdomain ceph-mon[294160]: from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 10:04:38 np0005626463.localdomain ceph-mon[294160]: mgrmap e50: np0005626465.hlpkwo(active, since 13m), standbys: np0005626463.wtksup, np0005626466.nisqfq
Feb 23 10:04:38 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "a87f3747-06f6-4188-82ee-060b8ce9fc02", "auth_id": "alice_bob", "format": "json"}]: dispatch
Feb 23 10:04:38 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Feb 23 10:04:38 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Feb 23 10:04:38 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Feb 23 10:04:38 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished
Feb 23 10:04:38 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "a87f3747-06f6-4188-82ee-060b8ce9fc02", "auth_id": "alice_bob", "format": "json"}]: dispatch
Feb 23 10:04:39 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:04:39.244 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4995-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 10:04:39 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:04:39.246 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 10:04:39 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:04:39.246 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 10:04:39 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:04:39.246 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 10:04:39 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:04:39.269 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:04:39 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:04:39.270 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 10:04:39 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Feb 23 10:04:39 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 10:04:39 np0005626463.localdomain podman[242954]: time="2026-02-23T10:04:39Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 23 10:04:39 np0005626463.localdomain podman[242954]: @ - - [23/Feb/2026:10:04:39 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 157081 "" "Go-http-client/1.1"
Feb 23 10:04:39 np0005626463.localdomain podman[242954]: @ - - [23/Feb/2026:10:04:39 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18831 "" "Go-http-client/1.1"
Feb 23 10:04:39 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "c2ab3a9d-9959-4847-94e0-7e993629469a", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 23 10:04:39 np0005626463.localdomain ceph-mon[294160]: pgmap v567: 177 pgs: 177 active+clean; 209 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 597 B/s rd, 109 KiB/s wr, 9 op/s
Feb 23 10:04:39 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "c2ab3a9d-9959-4847-94e0-7e993629469a", "format": "json"}]: dispatch
Feb 23 10:04:39 np0005626463.localdomain ceph-mon[294160]: from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 10:04:40 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Feb 23 10:04:40 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 10:04:41 np0005626463.localdomain ceph-mon[294160]: from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 10:04:41 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow r pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} v 0)
Feb 23 10:04:41 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow r pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} : dispatch
Feb 23 10:04:41 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow r pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"}]': finished
Feb 23 10:04:41 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Feb 23 10:04:41 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 10:04:42 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "f47619b3-d060-43cb-beb4-45b54645fdc0", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 23 10:04:42 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "f47619b3-d060-43cb-beb4-45b54645fdc0", "format": "json"}]: dispatch
Feb 23 10:04:42 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "a87f3747-06f6-4188-82ee-060b8ce9fc02", "auth_id": "alice_bob", "tenant_id": "b8a78bca43aa415e9b740fe00d08afee", "access_level": "r", "format": "json"}]: dispatch
Feb 23 10:04:42 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Feb 23 10:04:42 np0005626463.localdomain ceph-mon[294160]: pgmap v568: 177 pgs: 177 active+clean; 209 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 767 B/s rd, 109 KiB/s wr, 10 op/s
Feb 23 10:04:42 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow r pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} : dispatch
Feb 23 10:04:42 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow r pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} : dispatch
Feb 23 10:04:42 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow r pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"}]': finished
Feb 23 10:04:42 np0005626463.localdomain ceph-mon[294160]: from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 10:04:42 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e243 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 23 10:04:43 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "c5862b43-dbf5-4d0f-83df-5eda47d44ff1", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 23 10:04:43 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "c5862b43-dbf5-4d0f-83df-5eda47d44ff1", "format": "json"}]: dispatch
Feb 23 10:04:43 np0005626463.localdomain openstack_network_exporter[245358]: ERROR   10:04:43 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 23 10:04:43 np0005626463.localdomain openstack_network_exporter[245358]: 
Feb 23 10:04:43 np0005626463.localdomain openstack_network_exporter[245358]: ERROR   10:04:43 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 23 10:04:43 np0005626463.localdomain openstack_network_exporter[245358]: 
Feb 23 10:04:44 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:04:44.055 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:04:44 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:04:44.271 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 10:04:44 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:04:44.273 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 10:04:44 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:04:44.273 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 10:04:44 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:04:44.273 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 10:04:44 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 10:04:44.797 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=24, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '22:68:bc', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'c6:19:65:94:49:af'}, ipsec=False) old=SB_Global(nb_cfg=23) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 10:04:44 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 10:04:44.797 163572 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 23 10:04:44 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "c2ab3a9d-9959-4847-94e0-7e993629469a", "format": "json"}]: dispatch
Feb 23 10:04:44 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "c2ab3a9d-9959-4847-94e0-7e993629469a", "force": true, "format": "json"}]: dispatch
Feb 23 10:04:44 np0005626463.localdomain ceph-mon[294160]: pgmap v569: 177 pgs: 177 active+clean; 210 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 170 KiB/s wr, 13 op/s
Feb 23 10:04:44 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:04:44.841 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:04:44 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:04:44.842 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 10:04:44 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-550070678", "caps": ["mds", "allow rw path=/volumes/_nogroup/f47619b3-d060-43cb-beb4-45b54645fdc0/84bd9b7f-b699-46ec-b8f3-fe3b99d4f740", "osd", "allow rw pool=manila_data namespace=fsvolumens_f47619b3-d060-43cb-beb4-45b54645fdc0", "mon", "allow r"], "format": "json"} v 0)
Feb 23 10:04:44 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-550070678", "caps": ["mds", "allow rw path=/volumes/_nogroup/f47619b3-d060-43cb-beb4-45b54645fdc0/84bd9b7f-b699-46ec-b8f3-fe3b99d4f740", "osd", "allow rw pool=manila_data namespace=fsvolumens_f47619b3-d060-43cb-beb4-45b54645fdc0", "mon", "allow r"], "format": "json"} : dispatch
Feb 23 10:04:44 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-550070678", "caps": ["mds", "allow rw path=/volumes/_nogroup/f47619b3-d060-43cb-beb4-45b54645fdc0/84bd9b7f-b699-46ec-b8f3-fe3b99d4f740", "osd", "allow rw pool=manila_data namespace=fsvolumens_f47619b3-d060-43cb-beb4-45b54645fdc0", "mon", "allow r"], "format": "json"}]': finished
Feb 23 10:04:45 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice_bob"} v 0)
Feb 23 10:04:45 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Feb 23 10:04:45 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished
Feb 23 10:04:45 np0005626463.localdomain ceph-mds[286877]: mds.mds.np0005626463.qcthuc asok_command: session evict {filters=[auth_name=alice_bob,client_metadata.root=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae],prefix=session evict} (starting...)
Feb 23 10:04:45 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "f47619b3-d060-43cb-beb4-45b54645fdc0", "auth_id": "tempest-cephx-id-550070678", "tenant_id": "15d1711403cd469e88c36db6fc4b0add", "access_level": "rw", "format": "json"}]: dispatch
Feb 23 10:04:45 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-550070678", "format": "json"} : dispatch
Feb 23 10:04:45 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-550070678", "caps": ["mds", "allow rw path=/volumes/_nogroup/f47619b3-d060-43cb-beb4-45b54645fdc0/84bd9b7f-b699-46ec-b8f3-fe3b99d4f740", "osd", "allow rw pool=manila_data namespace=fsvolumens_f47619b3-d060-43cb-beb4-45b54645fdc0", "mon", "allow r"], "format": "json"} : dispatch
Feb 23 10:04:45 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-550070678", "caps": ["mds", "allow rw path=/volumes/_nogroup/f47619b3-d060-43cb-beb4-45b54645fdc0/84bd9b7f-b699-46ec-b8f3-fe3b99d4f740", "osd", "allow rw pool=manila_data namespace=fsvolumens_f47619b3-d060-43cb-beb4-45b54645fdc0", "mon", "allow r"], "format": "json"} : dispatch
Feb 23 10:04:45 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-550070678", "caps": ["mds", "allow rw path=/volumes/_nogroup/f47619b3-d060-43cb-beb4-45b54645fdc0/84bd9b7f-b699-46ec-b8f3-fe3b99d4f740", "osd", "allow rw pool=manila_data namespace=fsvolumens_f47619b3-d060-43cb-beb4-45b54645fdc0", "mon", "allow r"], "format": "json"}]': finished
Feb 23 10:04:45 np0005626463.localdomain ceph-mon[294160]: pgmap v570: 177 pgs: 177 active+clean; 210 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 120 KiB/s wr, 10 op/s
Feb 23 10:04:45 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "a87f3747-06f6-4188-82ee-060b8ce9fc02", "auth_id": "alice_bob", "format": "json"}]: dispatch
Feb 23 10:04:45 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Feb 23 10:04:45 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Feb 23 10:04:45 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Feb 23 10:04:45 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished
Feb 23 10:04:46 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:04:46.822 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:04:46 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:04:46.823 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Feb 23 10:04:46 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:04:46.845 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Feb 23 10:04:46 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "a87f3747-06f6-4188-82ee-060b8ce9fc02", "auth_id": "alice_bob", "format": "json"}]: dispatch
Feb 23 10:04:46 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "c5862b43-dbf5-4d0f-83df-5eda47d44ff1", "format": "json"}]: dispatch
Feb 23 10:04:46 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "c5862b43-dbf5-4d0f-83df-5eda47d44ff1", "force": true, "format": "json"}]: dispatch
Feb 23 10:04:46 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "5f193bc0-59f1-4f26-8a9d-5945320ef049", "snap_name": "1ff3f892-8978-436c-bef2-ded3032f9484_cf854440-8378-445e-9ea8-f90ba2766e24", "force": true, "format": "json"}]: dispatch
Feb 23 10:04:46 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "5f193bc0-59f1-4f26-8a9d-5945320ef049", "snap_name": "1ff3f892-8978-436c-bef2-ded3032f9484", "force": true, "format": "json"}]: dispatch
Feb 23 10:04:47 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.tempest-cephx-id-550070678"} v 0)
Feb 23 10:04:47 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-550070678"} : dispatch
Feb 23 10:04:47 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-550070678"}]': finished
Feb 23 10:04:47 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e243 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 23 10:04:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 10:04:48.564 163572 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 10:04:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 10:04:48.564 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 10:04:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 10:04:48.565 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 10:04:48 np0005626463.localdomain ceph-mon[294160]: pgmap v571: 177 pgs: 177 active+clean; 211 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 767 B/s rd, 183 KiB/s wr, 15 op/s
Feb 23 10:04:48 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-550070678", "format": "json"} : dispatch
Feb 23 10:04:48 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-550070678"} : dispatch
Feb 23 10:04:48 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-550070678"} : dispatch
Feb 23 10:04:48 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-550070678"}]': finished
Feb 23 10:04:48 np0005626463.localdomain ceph-mds[286877]: mds.mds.np0005626463.qcthuc asok_command: session evict {filters=[auth_name=tempest-cephx-id-550070678,client_metadata.root=/volumes/_nogroup/f47619b3-d060-43cb-beb4-45b54645fdc0/84bd9b7f-b699-46ec-b8f3-fe3b99d4f740],prefix=session evict} (starting...)
Feb 23 10:04:48 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} v 0)
Feb 23 10:04:48 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} : dispatch
Feb 23 10:04:48 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"}]': finished
Feb 23 10:04:49 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Feb 23 10:04:49 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 10:04:49 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "f47619b3-d060-43cb-beb4-45b54645fdc0", "auth_id": "tempest-cephx-id-550070678", "format": "json"}]: dispatch
Feb 23 10:04:49 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "f47619b3-d060-43cb-beb4-45b54645fdc0", "auth_id": "tempest-cephx-id-550070678", "format": "json"}]: dispatch
Feb 23 10:04:49 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "a87f3747-06f6-4188-82ee-060b8ce9fc02", "auth_id": "alice bob", "tenant_id": "b8a78bca43aa415e9b740fe00d08afee", "access_level": "rw", "format": "json"}]: dispatch
Feb 23 10:04:49 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Feb 23 10:04:49 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} : dispatch
Feb 23 10:04:49 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} : dispatch
Feb 23 10:04:49 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"}]': finished
Feb 23 10:04:49 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "f47619b3-d060-43cb-beb4-45b54645fdc0", "format": "json"}]: dispatch
Feb 23 10:04:49 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "f47619b3-d060-43cb-beb4-45b54645fdc0", "force": true, "format": "json"}]: dispatch
Feb 23 10:04:49 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "4b358cf8-a902-49d4-80e3-13df36b9098c", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 23 10:04:49 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "4b358cf8-a902-49d4-80e3-13df36b9098c", "format": "json"}]: dispatch
Feb 23 10:04:49 np0005626463.localdomain ceph-mon[294160]: from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 10:04:49 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "5f193bc0-59f1-4f26-8a9d-5945320ef049", "format": "json"}]: dispatch
Feb 23 10:04:49 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "5f193bc0-59f1-4f26-8a9d-5945320ef049", "force": true, "format": "json"}]: dispatch
Feb 23 10:04:49 np0005626463.localdomain ceph-mon[294160]: pgmap v572: 177 pgs: 177 active+clean; 211 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 426 B/s rd, 125 KiB/s wr, 10 op/s
Feb 23 10:04:49 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.
Feb 23 10:04:49 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.
Feb 23 10:04:49 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:04:49.843 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 10:04:49 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:04:49.845 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 10:04:49 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:04:49.845 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 10:04:49 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:04:49.845 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 10:04:49 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:04:49.874 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:04:49 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:04:49.874 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 10:04:49 np0005626463.localdomain podman[322296]: 2026-02-23 10:04:49.948565775 +0000 UTC m=+0.074580807 container health_status bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Feb 23 10:04:49 np0005626463.localdomain podman[322296]: 2026-02-23 10:04:49.956071666 +0000 UTC m=+0.082086708 container exec_died bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Feb 23 10:04:49 np0005626463.localdomain systemd[1]: bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.service: Deactivated successfully.
Feb 23 10:04:50 np0005626463.localdomain podman[322295]: 2026-02-23 10:04:50.055584743 +0000 UTC m=+0.216019759 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20260216, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Feb 23 10:04:50 np0005626463.localdomain podman[322295]: 2026-02-23 10:04:50.096343853 +0000 UTC m=+0.256778879 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, container_name=ovn_controller, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Feb 23 10:04:50 np0005626463.localdomain systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully.
Feb 23 10:04:50 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e243 do_prune osdmap full prune enabled
Feb 23 10:04:50 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e244 e244: 6 total, 6 up, 6 in
Feb 23 10:04:50 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e244: 6 total, 6 up, 6 in
Feb 23 10:04:50 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Feb 23 10:04:50 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 10:04:51 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice bob"} v 0)
Feb 23 10:04:51 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Feb 23 10:04:51 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished
Feb 23 10:04:51 np0005626463.localdomain ceph-mds[286877]: mds.mds.np0005626463.qcthuc asok_command: session evict {filters=[auth_name=alice bob,client_metadata.root=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae],prefix=session evict} (starting...)
Feb 23 10:04:51 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "11555dd1-63b1-44b4-8930-21367a0b0414", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 23 10:04:51 np0005626463.localdomain ceph-mon[294160]: osdmap e244: 6 total, 6 up, 6 in
Feb 23 10:04:51 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "11555dd1-63b1-44b4-8930-21367a0b0414", "format": "json"}]: dispatch
Feb 23 10:04:51 np0005626463.localdomain ceph-mon[294160]: from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 10:04:51 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "a87f3747-06f6-4188-82ee-060b8ce9fc02", "auth_id": "alice bob", "format": "json"}]: dispatch
Feb 23 10:04:51 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Feb 23 10:04:51 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Feb 23 10:04:51 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Feb 23 10:04:51 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished
Feb 23 10:04:51 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "a87f3747-06f6-4188-82ee-060b8ce9fc02", "auth_id": "alice bob", "format": "json"}]: dispatch
Feb 23 10:04:51 np0005626463.localdomain ceph-mon[294160]: pgmap v574: 177 pgs: 177 active+clean; 211 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 716 B/s rd, 150 KiB/s wr, 13 op/s
Feb 23 10:04:52 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e244 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 23 10:04:53 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "4b358cf8-a902-49d4-80e3-13df36b9098c", "format": "json"}]: dispatch
Feb 23 10:04:53 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "4b358cf8-a902-49d4-80e3-13df36b9098c", "force": true, "format": "json"}]: dispatch
Feb 23 10:04:53 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 10:04:53.799 163572 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=96b5bb93-7341-4ce6-9b93-6a5de566c711, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '24'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 10:04:54 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-550070678", "caps": ["mds", "allow rw path=/volumes/_nogroup/11555dd1-63b1-44b4-8930-21367a0b0414/72e5f463-1431-499e-98f9-e7d184066592", "osd", "allow rw pool=manila_data namespace=fsvolumens_11555dd1-63b1-44b4-8930-21367a0b0414", "mon", "allow r"], "format": "json"} v 0)
Feb 23 10:04:54 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-550070678", "caps": ["mds", "allow rw path=/volumes/_nogroup/11555dd1-63b1-44b4-8930-21367a0b0414/72e5f463-1431-499e-98f9-e7d184066592", "osd", "allow rw pool=manila_data namespace=fsvolumens_11555dd1-63b1-44b4-8930-21367a0b0414", "mon", "allow r"], "format": "json"} : dispatch
Feb 23 10:04:54 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-550070678", "caps": ["mds", "allow rw path=/volumes/_nogroup/11555dd1-63b1-44b4-8930-21367a0b0414/72e5f463-1431-499e-98f9-e7d184066592", "osd", "allow rw pool=manila_data namespace=fsvolumens_11555dd1-63b1-44b4-8930-21367a0b0414", "mon", "allow r"], "format": "json"}]': finished
Feb 23 10:04:54 np0005626463.localdomain ceph-mon[294160]: pgmap v575: 177 pgs: 177 active+clean; 211 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 716 B/s rd, 170 KiB/s wr, 14 op/s
Feb 23 10:04:54 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-550070678", "format": "json"} : dispatch
Feb 23 10:04:54 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-550070678", "caps": ["mds", "allow rw path=/volumes/_nogroup/11555dd1-63b1-44b4-8930-21367a0b0414/72e5f463-1431-499e-98f9-e7d184066592", "osd", "allow rw pool=manila_data namespace=fsvolumens_11555dd1-63b1-44b4-8930-21367a0b0414", "mon", "allow r"], "format": "json"} : dispatch
Feb 23 10:04:54 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-550070678", "caps": ["mds", "allow rw path=/volumes/_nogroup/11555dd1-63b1-44b4-8930-21367a0b0414/72e5f463-1431-499e-98f9-e7d184066592", "osd", "allow rw pool=manila_data namespace=fsvolumens_11555dd1-63b1-44b4-8930-21367a0b0414", "mon", "allow r"], "format": "json"} : dispatch
Feb 23 10:04:54 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-550070678", "caps": ["mds", "allow rw path=/volumes/_nogroup/11555dd1-63b1-44b4-8930-21367a0b0414/72e5f463-1431-499e-98f9-e7d184066592", "osd", "allow rw pool=manila_data namespace=fsvolumens_11555dd1-63b1-44b4-8930-21367a0b0414", "mon", "allow r"], "format": "json"}]': finished
Feb 23 10:04:54 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T10:04:54Z|00365|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Feb 23 10:04:54 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow r pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} v 0)
Feb 23 10:04:54 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow r pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} : dispatch
Feb 23 10:04:54 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow r pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"}]': finished
Feb 23 10:04:54 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:04:54.876 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 10:04:54 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:04:54.878 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 10:04:54 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:04:54.878 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 10:04:54 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:04:54.878 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 10:04:54 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:04:54.911 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:04:54 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:04:54.912 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 10:04:55 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "11555dd1-63b1-44b4-8930-21367a0b0414", "auth_id": "tempest-cephx-id-550070678", "tenant_id": "15d1711403cd469e88c36db6fc4b0add", "access_level": "rw", "format": "json"}]: dispatch
Feb 23 10:04:55 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "a87f3747-06f6-4188-82ee-060b8ce9fc02", "auth_id": "alice bob", "tenant_id": "b8a78bca43aa415e9b740fe00d08afee", "access_level": "r", "format": "json"}]: dispatch
Feb 23 10:04:55 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Feb 23 10:04:55 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow r pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} : dispatch
Feb 23 10:04:55 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow r pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} : dispatch
Feb 23 10:04:55 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow r pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"}]': finished
Feb 23 10:04:55 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.
Feb 23 10:04:55 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.
Feb 23 10:04:55 np0005626463.localdomain podman[322344]: 2026-02-23 10:04:55.923980662 +0000 UTC m=+0.090279603 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Feb 23 10:04:55 np0005626463.localdomain systemd[1]: tmp-crun.LWeMpO.mount: Deactivated successfully.
Feb 23 10:04:55 np0005626463.localdomain podman[322345]: 2026-02-23 10:04:55.987166464 +0000 UTC m=+0.143715353 container health_status be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, config_id=ceilometer_agent_compute, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, tcib_build_tag=8419493e1fd846703d277695e03fc5eb)
Feb 23 10:04:56 np0005626463.localdomain podman[322345]: 2026-02-23 10:04:56.001490977 +0000 UTC m=+0.158039966 container exec_died be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.build-date=20260216, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute)
Feb 23 10:04:56 np0005626463.localdomain systemd[1]: be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.service: Deactivated successfully.
Feb 23 10:04:56 np0005626463.localdomain podman[322344]: 2026-02-23 10:04:56.05883274 +0000 UTC m=+0.225131691 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20260216, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb)
Feb 23 10:04:56 np0005626463.localdomain systemd[1]: 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully.
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.148 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'name': 'test', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000003', 'OS-EXT-SRV-ATTR:host': 'np0005626463.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '37b8098efb0d4ecc90b451a2db0e966f', 'user_id': 'cb6895487918456aa599ca2f76872d00', 'hostId': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.149 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.155 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.159 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b483bb33-2129-4e5d-acb0-9691f0f0636c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T10:04:56.149495', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '1ada6648-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12536.339004495, 'message_signature': 'ad3873d0779aa0810f07dec5fb28355bace6915f3e5c9fa5dcbaf2d6503da93a'}]}, 'timestamp': '2026-02-23 10:04:56.156473', '_unique_id': '83384a36efdd4784a747e4ba8302584a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.159 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.159 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.159 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.159 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.159 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.159 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.159 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.159 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.159 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.159 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.159 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.159 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.159 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.159 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.159 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.159 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.159 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.159 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.159 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.159 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.159 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.159 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.159 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.159 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.159 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.159 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.159 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.159 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.159 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.159 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.159 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.160 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.175 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.176 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.178 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ed91d370-934a-49c3-885d-dc529ec9fcc8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T10:04:56.160508', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '1add7356-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12536.350025306, 'message_signature': '7e3f0f92ef057ea5eec85b9edcaaad0899ff2971762fe62380bcc5bc9b8e9705'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T10:04:56.160508', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '1add8990-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12536.350025306, 'message_signature': 'c3b5903503049f4ea064867878d06c3accfa981c4938fd3607f7cca27e09976c'}]}, 'timestamp': '2026-02-23 10:04:56.177010', '_unique_id': 'bc6b27d7fcc940c5bc7c8c34deb212ff'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.178 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.178 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.178 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.178 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.178 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.178 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.178 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.178 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.178 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.178 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.178 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.178 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.178 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.178 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.178 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.178 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.178 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.178 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.178 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.178 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.178 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.178 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.178 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.178 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.178 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.178 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.178 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.178 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.178 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.178 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.178 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.179 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.179 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.181 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'de403f24-e15b-4f59-a6b3-b111aca7d265', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T10:04:56.179626', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '1ade0c3a-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12536.339004495, 'message_signature': '1da2133420d5ef754424bfe042a2edf0b0c478885860a95f8f629476a1d7a64c'}]}, 'timestamp': '2026-02-23 10:04:56.180375', '_unique_id': '54e09f552326489bb11a1b4610aac8d4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.181 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.181 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.181 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.181 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.181 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.181 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.181 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.181 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.181 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.181 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.181 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.181 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.181 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.181 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.181 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.181 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.181 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.181 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.181 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.181 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.181 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.181 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.181 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.181 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.181 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.181 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.181 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.181 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.181 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.181 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.181 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.183 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Feb 23 10:04:56 np0005626463.localdomain ceph-mon[294160]: pgmap v576: 177 pgs: 177 active+clean; 211 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 716 B/s rd, 170 KiB/s wr, 14 op/s
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.218 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.bytes volume: 397312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.219 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.221 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5ecccfe2-9274-497d-93e0-cb6340ef745c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 397312, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T10:04:56.183594', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '1ae40202-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12536.37311148, 'message_signature': '859c44a1cf8e63742ecc079849337edd3e188b2552693953fe718c85d63013df'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T10:04:56.183594', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '1ae418c8-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12536.37311148, 'message_signature': '5d55c874b6bdb07f290de634bec94923b1b8b57f73f730ca04f2e4b073375c49'}]}, 'timestamp': '2026-02-23 10:04:56.219934', '_unique_id': '20d16bb1bfe541e2b357eff593772cdf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.221 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.221 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.221 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.221 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.221 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.221 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.221 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.221 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.221 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.221 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.221 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.221 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.221 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.221 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.221 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.221 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.221 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.221 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.221 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.221 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.221 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.221 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.221 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.221 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.221 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.221 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.221 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.221 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.221 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.221 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.221 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.223 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.223 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.223 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.225 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1c5ddbee-9f2b-4a0c-a7d5-c96081934f57', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T10:04:56.223776', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '1ae4c494-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12536.339004495, 'message_signature': '5e16149ffe304f504afc8c18d31c3544ea0dd0053d81e715ce32a99aa53f7dee'}]}, 'timestamp': '2026-02-23 10:04:56.224301', '_unique_id': 'fa02983e8a744aa6b5cdb753b54d4f3e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.225 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.225 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.225 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.225 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.225 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.225 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.225 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.225 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.225 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.225 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.225 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.225 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.225 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.225 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.225 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.225 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.225 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.225 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.225 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.225 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.225 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.225 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.225 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.225 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.225 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.225 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.225 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.225 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.225 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.225 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.225 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.226 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.226 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.226 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.228 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e5e51475-38d4-47f5-8920-765749077ffd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T10:04:56.226691', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '1ae5355a-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12536.339004495, 'message_signature': 'af723f0ea35d470832f92aafc35ec51a42d0734f9eb047a5d3950535722b3df0'}]}, 'timestamp': '2026-02-23 10:04:56.227188', '_unique_id': '1cf9d841bacc4a338c4e30fb904e39e4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.228 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.228 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.228 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.228 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.228 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.228 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.228 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.228 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.228 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.228 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.228 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.228 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.228 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.228 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.228 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.228 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.228 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.228 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.228 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.228 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.228 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.228 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.228 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.228 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.228 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.228 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.228 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.228 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.228 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.228 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.228 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.229 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.229 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.bytes volume: 35597312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.230 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.231 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fd7a6ea0-72ec-40d1-aaa9-e3a32bb47760', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35597312, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T10:04:56.229834', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '1ae5aff8-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12536.37311148, 'message_signature': 'd38b5cb2394ec2c9b254c2639e46202676cf8bd5652d8c5eb28a431603261e93'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T10:04:56.229834', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '1ae5c038-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12536.37311148, 'message_signature': '7584869702a5884c07d5b2359a0c70ea1fb41bb512201b050c83d2b875b9df73'}]}, 'timestamp': '2026-02-23 10:04:56.230709', '_unique_id': 'fe70f00ad73e4112a8b8916777db196a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.231 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.231 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.231 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.231 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.231 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.231 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.231 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.231 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.231 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.231 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.231 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.231 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.231 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.231 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.231 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.231 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.231 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.231 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.231 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.231 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.231 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.231 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.231 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.231 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.231 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.231 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.231 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.231 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.231 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.231 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.231 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.232 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.233 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.234 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '813e51fb-0fd0-4d44-a1cc-3e02453c9337', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T10:04:56.232972', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '1ae62a00-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12536.339004495, 'message_signature': 'b57734b39e6f764a844628d91e778233b52a0a692e26b9b80e21a2edcaa51af6'}]}, 'timestamp': '2026-02-23 10:04:56.233446', '_unique_id': 'f90fdb96f13b4ec6b0eae23e970b6aab'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.234 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.234 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.234 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.234 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.234 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.234 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.234 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.234 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.234 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.234 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.234 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.234 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.234 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.234 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.234 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.234 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.234 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.234 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.234 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.234 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.234 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.234 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.234 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.234 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.234 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.234 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.234 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.234 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.234 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.234 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.234 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.235 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.235 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.237 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '71887e8b-e2e4-4c14-bf30-035e3dacb0a6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T10:04:56.235576', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '1ae68edc-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12536.339004495, 'message_signature': '014544f5d8c790a2674ac065fa892c66ae8934b869d023a63e0521220e5005a6'}]}, 'timestamp': '2026-02-23 10:04:56.236067', '_unique_id': '8ae4d8629c36460c8caade65bd1fd4dc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.237 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.237 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.237 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.237 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.237 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.237 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.237 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.237 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.237 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.237 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.237 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.237 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.237 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.237 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.237 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.237 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.237 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.237 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.237 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.237 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.237 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.237 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.237 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.237 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.237 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.237 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.237 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.237 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.237 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.237 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.237 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.238 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.238 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.239 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7020d685-eeb1-40f3-8e17-356f980a2298', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T10:04:56.238178', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '1ae6f476-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12536.339004495, 'message_signature': '4a586facbfd28767f474d766fea113cd3652706610c83a053e831c4984855397'}]}, 'timestamp': '2026-02-23 10:04:56.238630', '_unique_id': '120f6687079042f9abfd17a514dfecce'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.239 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.239 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.239 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.239 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.239 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.239 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.239 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.239 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.239 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.239 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.239 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.239 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.239 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.239 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.239 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.239 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.239 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.239 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.239 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.239 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.239 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.239 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.239 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.239 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.239 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.239 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.239 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.239 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.239 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.239 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.239 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.240 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.240 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.242 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '60aa8e98-f9ce-43a9-9cc8-df05a87bc3cb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T10:04:56.240708', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '1ae75966-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12536.339004495, 'message_signature': '296d9405d991f91237d9118741a5759835458ee2ffcccd1cbd39b58647fa0cdd'}]}, 'timestamp': '2026-02-23 10:04:56.241221', '_unique_id': '4953b6c3c21747bfbd4a38eb606c3c53'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.242 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.242 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.242 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.242 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.242 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.242 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.242 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.242 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.242 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.242 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.242 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.242 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.242 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.242 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.242 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.242 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.242 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.242 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.242 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.242 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.242 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.242 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.242 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.242 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.242 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.242 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.242 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.242 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.242 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.242 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.242 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.243 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.243 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.243 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.245 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fcf245ad-d37f-4a28-810a-4ae9bf21b98e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T10:04:56.243278', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '1ae7bb54-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12536.37311148, 'message_signature': 'f2aa55c4fc81e3bcdd62cfb7b0d0719f7be766000d2454a7601b8d710264513b'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T10:04:56.243278', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '1ae7ccca-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12536.37311148, 'message_signature': 'caed60d2e53e30935e149e6e20afb1b214f0bc2ac110f3b8b4fb65cdb234c78d'}]}, 'timestamp': '2026-02-23 10:04:56.244141', '_unique_id': '879a6b91d5034719a0ce7093f57d704b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.245 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.245 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.245 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.245 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.245 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.245 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.245 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.245 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.245 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.245 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.245 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.245 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.245 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.245 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.245 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.245 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.245 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.245 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.245 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.245 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.245 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.245 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.245 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.245 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.245 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.245 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.245 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.245 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.245 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.245 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.245 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.246 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.246 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.246 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.246 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.requests volume: 1283 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.247 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.248 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1e45d0d7-8f23-4e2c-8fbc-d415e5f170ec', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1283, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T10:04:56.246541', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '1ae83b6a-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12536.37311148, 'message_signature': '7a094cdff3f5c5e0189581c79e226f9c97762a861cdc2bc46f059cf77e133a86'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T10:04:56.246541', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '1ae84ce0-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12536.37311148, 'message_signature': '8bc24c4c29a9b3e5585e3939cd43c74b574d66ada3fbff05699570510e3a2548'}]}, 'timestamp': '2026-02-23 10:04:56.247542', '_unique_id': '6b269a460906478faec06858e79a534f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.248 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.248 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.248 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.248 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.248 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.248 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.248 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.248 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.248 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.248 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.248 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.248 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.248 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.248 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.248 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.248 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.248 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.248 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.248 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.248 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.248 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.248 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.248 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.248 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.248 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.248 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.248 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.248 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.248 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.248 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.248 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.249 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.249 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.250 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.251 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1f8ce283-c990-44bd-8312-d69d82b08760', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T10:04:56.249678', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '1ae8b6c6-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12536.350025306, 'message_signature': '5455cf6bb10b6629958397a5b79459ae2c08fd9b5150d61000c7608677e4d011'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T10:04:56.249678', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '1ae8c76a-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12536.350025306, 'message_signature': '43912296fcf9b23406c62adb1a94bf18ebcc2f582c981601448f9156a549eff9'}]}, 'timestamp': '2026-02-23 10:04:56.250554', '_unique_id': 'def1e7182d4b45d3a181f862acfac4dc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.251 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.251 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.251 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.251 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.251 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.251 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.251 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.251 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.251 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.251 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.251 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.251 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.251 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.251 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.251 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.251 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.251 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.251 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.251 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.251 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.251 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.251 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.251 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.251 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.251 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.251 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.251 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.251 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.251 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.251 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.251 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.252 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.269 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/memory.usage volume: 51.72265625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.271 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6087281f-5421-419c-9903-162e26cdc609', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.72265625, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'timestamp': '2026-02-23T10:04:56.252737', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '1aebc5be-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12536.459027746, 'message_signature': '2756da8eac731182bd9da0e5f5a562094925f3a800ec0e81dc9699c54d41832e'}]}, 'timestamp': '2026-02-23 10:04:56.270191', '_unique_id': 'ff69b64382fb49e7bbd16c51e6bee7cf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.271 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.271 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.271 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.271 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.271 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.271 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.271 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.271 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.271 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.271 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.271 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.271 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.271 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.271 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.271 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.271 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.271 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.271 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.271 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.271 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.271 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.271 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.271 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.271 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.271 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.271 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.271 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.271 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.271 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.271 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.271 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.272 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.272 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.latency volume: 1054797520 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.272 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.latency volume: 21338362 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.274 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '26a47411-9b4c-4413-8fca-224d37da438a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1054797520, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T10:04:56.272390', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '1aec2c98-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12536.37311148, 'message_signature': '8a81802d02057f414f05d0eb703dce96680970d6aace8697ac745d19ab6d1790'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 21338362, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T10:04:56.272390', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '1aec3db4-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12536.37311148, 'message_signature': '32ce533f13852ed104e4511d5b00e1e84fa2a9db76f0f865d5b864e02aa59d05'}]}, 'timestamp': '2026-02-23 10:04:56.273269', '_unique_id': '5001eab617d24b18941de5f1f54d39c6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.274 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.274 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.274 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.274 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.274 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.274 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.274 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.274 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.274 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.274 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.274 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.274 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.274 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.274 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.274 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.274 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.274 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.274 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.274 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.274 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.274 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.274 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.274 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.274 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.274 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.274 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.274 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.274 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.274 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.274 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.274 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.275 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.275 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/cpu volume: 16080000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.276 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '29096275-9287-44c3-814a-78729271a191', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 16080000000, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'timestamp': '2026-02-23T10:04:56.275404', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '1aeca25e-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12536.459027746, 'message_signature': 'e3846e6536b1ad9245f041a47297b1953894e2ea80217bcdfebf788e4a9f8b15'}]}, 'timestamp': '2026-02-23 10:04:56.275832', '_unique_id': '3b161aff77674335b22c0314d9067241'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.276 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.276 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.276 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.276 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.276 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.276 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.276 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.276 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.276 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.276 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.276 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.276 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.276 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.276 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.276 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.276 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.276 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.276 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.276 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.276 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.276 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.276 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.276 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.276 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.276 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.276 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.276 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.276 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.276 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.276 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.276 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.277 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.277 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.latency volume: 1374424344 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.278 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.latency volume: 89322858 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.279 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6f10561f-3bfa-4f1e-b4f5-d2df26159836', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1374424344, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T10:04:56.277890', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '1aed056e-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12536.37311148, 'message_signature': '4d23a6e9ab04b556fff82d63966d3060fbf33305221f7c3db9a48f0837f569ff'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 89322858, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T10:04:56.277890', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '1aed14dc-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12536.37311148, 'message_signature': 'ed914fb74d7b32b9793d1ab418126dc3bd3505f2f54c462189b7cefa306fdfc8'}]}, 'timestamp': '2026-02-23 10:04:56.278748', '_unique_id': '402d6a2fd7f0452db7ffaae4286823ff'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.279 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.279 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.279 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.279 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.279 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.279 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.279 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.279 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.279 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.279 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.279 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.279 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.279 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.279 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.279 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.279 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.279 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.279 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.279 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.279 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.279 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.279 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.279 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.279 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.279 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.279 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.279 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.279 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.279 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.279 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.279 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.280 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.280 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.281 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.282 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9fe7311a-956b-4b6b-92c8-02512a69ecdc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T10:04:56.280831', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '1aed7814-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12536.350025306, 'message_signature': 'af1b948c35c0b73c422079a39508e652709c25ee9a74445ce2638d6bf06d1ffb'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T10:04:56.280831', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '1aed875a-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12536.350025306, 'message_signature': 'f22d55c9e5fa80e5325b839fbc3f9d3e8035d8d3b218ce5e2e92202935371dbd'}]}, 'timestamp': '2026-02-23 10:04:56.281678', '_unique_id': 'ff0d3cc49095472784836b8d781c562a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.282 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.282 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.282 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.282 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.282 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.282 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.282 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.282 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.282 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.282 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.282 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.282 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.282 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.282 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.282 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.282 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.282 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.282 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.282 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.282 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.282 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.282 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.282 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.282 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.282 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.282 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.282 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.282 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.282 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.282 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.282 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.283 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.283 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.bytes volume: 6808 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.285 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4dabe0ba-de86-4714-9e41-b5c2aa446245', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6808, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T10:04:56.283800', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '1aedebfa-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12536.339004495, 'message_signature': 'fefec436f6dd6ab139d0d64ba137939dff77e18ab898c365e87efc5507e6f44d'}]}, 'timestamp': '2026-02-23 10:04:56.284284', '_unique_id': '12c6e54fb2304fff87cc7cd8b3eba9b8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.285 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.285 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.285 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.285 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.285 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.285 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.285 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.285 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.285 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.285 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.285 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.285 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.285 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.285 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.285 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.285 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.285 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.285 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.285 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.285 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.285 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.285 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.285 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.285 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.285 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.285 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.285 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.285 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.285 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.285 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.285 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.286 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.286 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.287 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '41d9381c-e2f7-477f-907d-9fc530ab3eca', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T10:04:56.286385', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '1aee4f5a-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12536.339004495, 'message_signature': 'a7a91bdf65aa60b92f5c64545cde52548ae8bcc91ffabfc3234592f80b6dae28'}]}, 'timestamp': '2026-02-23 10:04:56.286829', '_unique_id': '47d04eff04914df5b7492418d51ca2b3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.287 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.287 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.287 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.287 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.287 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.287 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.287 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.287 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.287 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.287 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.287 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.287 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.287 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.287 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.287 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.287 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.287 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.287 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.287 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.287 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.287 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.287 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.287 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.287 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.287 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.287 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.287 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.287 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.287 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.287 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:04:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.287 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:04:57 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.tempest-cephx-id-550070678"} v 0)
Feb 23 10:04:57 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-550070678"} : dispatch
Feb 23 10:04:57 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-550070678"}]': finished
Feb 23 10:04:57 np0005626463.localdomain ceph-mds[286877]: mds.mds.np0005626463.qcthuc asok_command: session evict {filters=[auth_name=tempest-cephx-id-550070678,client_metadata.root=/volumes/_nogroup/11555dd1-63b1-44b4-8930-21367a0b0414/72e5f463-1431-499e-98f9-e7d184066592],prefix=session evict} (starting...)
Feb 23 10:04:57 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice bob"} v 0)
Feb 23 10:04:57 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Feb 23 10:04:57 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished
Feb 23 10:04:57 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e244 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 23 10:04:57 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e244 do_prune osdmap full prune enabled
Feb 23 10:04:57 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e245 e245: 6 total, 6 up, 6 in
Feb 23 10:04:57 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e245: 6 total, 6 up, 6 in
Feb 23 10:04:58 np0005626463.localdomain ceph-mds[286877]: mds.mds.np0005626463.qcthuc asok_command: session evict {filters=[auth_name=alice bob,client_metadata.root=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae],prefix=session evict} (starting...)
Feb 23 10:04:58 np0005626463.localdomain ceph-mon[294160]: pgmap v577: 177 pgs: 177 active+clean; 212 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 818 B/s rd, 164 KiB/s wr, 14 op/s
Feb 23 10:04:58 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-550070678", "format": "json"} : dispatch
Feb 23 10:04:58 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-550070678"} : dispatch
Feb 23 10:04:58 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-550070678"} : dispatch
Feb 23 10:04:58 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-550070678"}]': finished
Feb 23 10:04:58 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Feb 23 10:04:58 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Feb 23 10:04:58 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Feb 23 10:04:58 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished
Feb 23 10:04:58 np0005626463.localdomain ceph-mon[294160]: osdmap e245: 6 total, 6 up, 6 in
Feb 23 10:04:59 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "11555dd1-63b1-44b4-8930-21367a0b0414", "auth_id": "tempest-cephx-id-550070678", "format": "json"}]: dispatch
Feb 23 10:04:59 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "11555dd1-63b1-44b4-8930-21367a0b0414", "auth_id": "tempest-cephx-id-550070678", "format": "json"}]: dispatch
Feb 23 10:04:59 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "11555dd1-63b1-44b4-8930-21367a0b0414", "format": "json"}]: dispatch
Feb 23 10:04:59 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "11555dd1-63b1-44b4-8930-21367a0b0414", "force": true, "format": "json"}]: dispatch
Feb 23 10:04:59 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "a87f3747-06f6-4188-82ee-060b8ce9fc02", "auth_id": "alice bob", "format": "json"}]: dispatch
Feb 23 10:04:59 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "a87f3747-06f6-4188-82ee-060b8ce9fc02", "auth_id": "alice bob", "format": "json"}]: dispatch
Feb 23 10:04:59 np0005626463.localdomain ceph-mon[294160]: pgmap v579: 177 pgs: 177 active+clean; 212 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 965 B/s rd, 194 KiB/s wr, 17 op/s
Feb 23 10:04:59 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:04:59.617 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:04:59 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:04:59.913 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:05:00 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Feb 23 10:05:00 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 10:05:00 np0005626463.localdomain ceph-mon[294160]: from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 10:05:00 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} v 0)
Feb 23 10:05:00 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} : dispatch
Feb 23 10:05:00 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"}]': finished
Feb 23 10:05:01 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:05:01.322 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:05:01 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "08fb1b39-07a9-4d7d-bcb4-cece4d84a3fc", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 23 10:05:01 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "08fb1b39-07a9-4d7d-bcb4-cece4d84a3fc", "format": "json"}]: dispatch
Feb 23 10:05:01 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "a87f3747-06f6-4188-82ee-060b8ce9fc02", "auth_id": "alice", "tenant_id": "b8a78bca43aa415e9b740fe00d08afee", "access_level": "rw", "format": "json"}]: dispatch
Feb 23 10:05:01 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Feb 23 10:05:01 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} : dispatch
Feb 23 10:05:01 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} : dispatch
Feb 23 10:05:01 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"}]': finished
Feb 23 10:05:01 np0005626463.localdomain ceph-mon[294160]: pgmap v580: 177 pgs: 177 active+clean; 213 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 614 B/s rd, 243 KiB/s wr, 18 op/s
Feb 23 10:05:02 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e245 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 23 10:05:03 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-550070678", "caps": ["mds", "allow rw path=/volumes/_nogroup/08fb1b39-07a9-4d7d-bcb4-cece4d84a3fc/3090d14f-d090-4924-bc2f-5b879acc23d8", "osd", "allow rw pool=manila_data namespace=fsvolumens_08fb1b39-07a9-4d7d-bcb4-cece4d84a3fc", "mon", "allow r"], "format": "json"} v 0)
Feb 23 10:05:03 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-550070678", "caps": ["mds", "allow rw path=/volumes/_nogroup/08fb1b39-07a9-4d7d-bcb4-cece4d84a3fc/3090d14f-d090-4924-bc2f-5b879acc23d8", "osd", "allow rw pool=manila_data namespace=fsvolumens_08fb1b39-07a9-4d7d-bcb4-cece4d84a3fc", "mon", "allow r"], "format": "json"} : dispatch
Feb 23 10:05:03 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-550070678", "caps": ["mds", "allow rw path=/volumes/_nogroup/08fb1b39-07a9-4d7d-bcb4-cece4d84a3fc/3090d14f-d090-4924-bc2f-5b879acc23d8", "osd", "allow rw pool=manila_data namespace=fsvolumens_08fb1b39-07a9-4d7d-bcb4-cece4d84a3fc", "mon", "allow r"], "format": "json"}]': finished
Feb 23 10:05:03 np0005626463.localdomain ceph-mon[294160]: pgmap v581: 177 pgs: 177 active+clean; 213 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 614 B/s rd, 150 KiB/s wr, 13 op/s
Feb 23 10:05:03 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-550070678", "format": "json"} : dispatch
Feb 23 10:05:03 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-550070678", "caps": ["mds", "allow rw path=/volumes/_nogroup/08fb1b39-07a9-4d7d-bcb4-cece4d84a3fc/3090d14f-d090-4924-bc2f-5b879acc23d8", "osd", "allow rw pool=manila_data namespace=fsvolumens_08fb1b39-07a9-4d7d-bcb4-cece4d84a3fc", "mon", "allow r"], "format": "json"} : dispatch
Feb 23 10:05:03 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-550070678", "caps": ["mds", "allow rw path=/volumes/_nogroup/08fb1b39-07a9-4d7d-bcb4-cece4d84a3fc/3090d14f-d090-4924-bc2f-5b879acc23d8", "osd", "allow rw pool=manila_data namespace=fsvolumens_08fb1b39-07a9-4d7d-bcb4-cece4d84a3fc", "mon", "allow r"], "format": "json"} : dispatch
Feb 23 10:05:03 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-550070678", "caps": ["mds", "allow rw path=/volumes/_nogroup/08fb1b39-07a9-4d7d-bcb4-cece4d84a3fc/3090d14f-d090-4924-bc2f-5b879acc23d8", "osd", "allow rw pool=manila_data namespace=fsvolumens_08fb1b39-07a9-4d7d-bcb4-cece4d84a3fc", "mon", "allow r"], "format": "json"}]': finished
Feb 23 10:05:04 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice"} v 0)
Feb 23 10:05:04 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Feb 23 10:05:04 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished
Feb 23 10:05:04 np0005626463.localdomain ceph-mds[286877]: mds.mds.np0005626463.qcthuc asok_command: session evict {filters=[auth_name=alice,client_metadata.root=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae],prefix=session evict} (starting...)
Feb 23 10:05:04 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:05:04.949 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:05:04 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:05:04.952 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:05:04 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "08fb1b39-07a9-4d7d-bcb4-cece4d84a3fc", "auth_id": "tempest-cephx-id-550070678", "tenant_id": "15d1711403cd469e88c36db6fc4b0add", "access_level": "rw", "format": "json"}]: dispatch
Feb 23 10:05:04 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "a87f3747-06f6-4188-82ee-060b8ce9fc02", "auth_id": "alice", "format": "json"}]: dispatch
Feb 23 10:05:04 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Feb 23 10:05:04 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Feb 23 10:05:04 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Feb 23 10:05:04 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished
Feb 23 10:05:04 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "a87f3747-06f6-4188-82ee-060b8ce9fc02", "auth_id": "alice", "format": "json"}]: dispatch
Feb 23 10:05:06 np0005626463.localdomain ceph-mon[294160]: pgmap v582: 177 pgs: 177 active+clean; 213 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 614 B/s rd, 150 KiB/s wr, 13 op/s
Feb 23 10:05:06 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:05:06.345 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:05:06 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.
Feb 23 10:05:06 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.
Feb 23 10:05:06 np0005626463.localdomain podman[322379]: 2026-02-23 10:05:06.918240044 +0000 UTC m=+0.089430306 container health_status 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, architecture=x86_64, distribution-scope=public, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, release=1770267347, vcs-type=git, io.openshift.tags=minimal rhel9, version=9.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., build-date=2026-02-05T04:57:10Z, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., name=ubi9/ubi-minimal, config_id=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, container_name=openstack_network_exporter, managed_by=edpm_ansible, maintainer=Red Hat, Inc., org.opencontainers.image.created=2026-02-05T04:57:10Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Feb 23 10:05:06 np0005626463.localdomain podman[322379]: 2026-02-23 10:05:06.933260399 +0000 UTC m=+0.104450631 container exec_died 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, container_name=openstack_network_exporter, managed_by=edpm_ansible, build-date=2026-02-05T04:57:10Z, org.opencontainers.image.created=2026-02-05T04:57:10Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': 
['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, architecture=x86_64, io.buildah.version=1.33.7, config_id=openstack_network_exporter, maintainer=Red Hat, Inc., distribution-scope=public, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.component=ubi9-minimal-container, vcs-type=git, version=9.7, release=1770267347)
Feb 23 10:05:06 np0005626463.localdomain systemd[1]: 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.service: Deactivated successfully.
Feb 23 10:05:07 np0005626463.localdomain podman[322380]: 2026-02-23 10:05:07.025420427 +0000 UTC m=+0.190231981 container health_status da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 23 10:05:07 np0005626463.localdomain podman[322380]: 2026-02-23 10:05:07.063601448 +0000 UTC m=+0.228413002 container exec_died da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 23 10:05:07 np0005626463.localdomain systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Deactivated successfully.
Feb 23 10:05:07 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.tempest-cephx-id-550070678"} v 0)
Feb 23 10:05:07 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-550070678"} : dispatch
Feb 23 10:05:07 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-550070678"}]': finished
Feb 23 10:05:07 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-550070678", "format": "json"} : dispatch
Feb 23 10:05:07 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-550070678"} : dispatch
Feb 23 10:05:07 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-550070678"} : dispatch
Feb 23 10:05:07 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-550070678"}]': finished
Feb 23 10:05:07 np0005626463.localdomain ceph-mds[286877]: mds.mds.np0005626463.qcthuc asok_command: session evict {filters=[auth_name=tempest-cephx-id-550070678,client_metadata.root=/volumes/_nogroup/08fb1b39-07a9-4d7d-bcb4-cece4d84a3fc/3090d14f-d090-4924-bc2f-5b879acc23d8],prefix=session evict} (starting...)
Feb 23 10:05:07 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow r pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} v 0)
Feb 23 10:05:07 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow r pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} : dispatch
Feb 23 10:05:07 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow r pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"}]': finished
Feb 23 10:05:07 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e245 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 23 10:05:08 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "08fb1b39-07a9-4d7d-bcb4-cece4d84a3fc", "auth_id": "tempest-cephx-id-550070678", "format": "json"}]: dispatch
Feb 23 10:05:08 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "08fb1b39-07a9-4d7d-bcb4-cece4d84a3fc", "auth_id": "tempest-cephx-id-550070678", "format": "json"}]: dispatch
Feb 23 10:05:08 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "08fb1b39-07a9-4d7d-bcb4-cece4d84a3fc", "format": "json"}]: dispatch
Feb 23 10:05:08 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "08fb1b39-07a9-4d7d-bcb4-cece4d84a3fc", "force": true, "format": "json"}]: dispatch
Feb 23 10:05:08 np0005626463.localdomain ceph-mon[294160]: pgmap v583: 177 pgs: 177 active+clean; 213 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 307 B/s rd, 128 KiB/s wr, 11 op/s
Feb 23 10:05:08 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Feb 23 10:05:08 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow r pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} : dispatch
Feb 23 10:05:08 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow r pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} : dispatch
Feb 23 10:05:08 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow r pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"}]': finished
Feb 23 10:05:08 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:05:08.306 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:05:09 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "a87f3747-06f6-4188-82ee-060b8ce9fc02", "auth_id": "alice", "tenant_id": "b8a78bca43aa415e9b740fe00d08afee", "access_level": "r", "format": "json"}]: dispatch
Feb 23 10:05:09 np0005626463.localdomain podman[242954]: time="2026-02-23T10:05:09Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 23 10:05:09 np0005626463.localdomain podman[242954]: @ - - [23/Feb/2026:10:05:09 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 157081 "" "Go-http-client/1.1"
Feb 23 10:05:09 np0005626463.localdomain podman[242954]: @ - - [23/Feb/2026:10:05:09 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18830 "" "Go-http-client/1.1"
Feb 23 10:05:09 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:05:09.973 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:05:10 np0005626463.localdomain ceph-mon[294160]: pgmap v584: 177 pgs: 177 active+clean; 213 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 269 B/s rd, 113 KiB/s wr, 9 op/s
Feb 23 10:05:10 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-550070678", "format": "json"} : dispatch
Feb 23 10:05:10 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-550070678", "caps": ["mds", "allow rw path=/volumes/_nogroup/812d8099-4259-45c6-8802-8b5ec410d596/c2facf6e-037e-4fc9-867d-c6a4c723aa9e", "osd", "allow rw pool=manila_data namespace=fsvolumens_812d8099-4259-45c6-8802-8b5ec410d596", "mon", "allow r"], "format": "json"} v 0)
Feb 23 10:05:10 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-550070678", "caps": ["mds", "allow rw path=/volumes/_nogroup/812d8099-4259-45c6-8802-8b5ec410d596/c2facf6e-037e-4fc9-867d-c6a4c723aa9e", "osd", "allow rw pool=manila_data namespace=fsvolumens_812d8099-4259-45c6-8802-8b5ec410d596", "mon", "allow r"], "format": "json"} : dispatch
Feb 23 10:05:10 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-550070678", "caps": ["mds", "allow rw path=/volumes/_nogroup/812d8099-4259-45c6-8802-8b5ec410d596/c2facf6e-037e-4fc9-867d-c6a4c723aa9e", "osd", "allow rw pool=manila_data namespace=fsvolumens_812d8099-4259-45c6-8802-8b5ec410d596", "mon", "allow r"], "format": "json"}]': finished
Feb 23 10:05:10 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice"} v 0)
Feb 23 10:05:10 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Feb 23 10:05:10 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished
Feb 23 10:05:10 np0005626463.localdomain ceph-mds[286877]: mds.mds.np0005626463.qcthuc asok_command: session evict {filters=[auth_name=alice,client_metadata.root=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae],prefix=session evict} (starting...)
Feb 23 10:05:11 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "812d8099-4259-45c6-8802-8b5ec410d596", "auth_id": "tempest-cephx-id-550070678", "tenant_id": "15d1711403cd469e88c36db6fc4b0add", "access_level": "rw", "format": "json"}]: dispatch
Feb 23 10:05:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-550070678", "caps": ["mds", "allow rw path=/volumes/_nogroup/812d8099-4259-45c6-8802-8b5ec410d596/c2facf6e-037e-4fc9-867d-c6a4c723aa9e", "osd", "allow rw pool=manila_data namespace=fsvolumens_812d8099-4259-45c6-8802-8b5ec410d596", "mon", "allow r"], "format": "json"} : dispatch
Feb 23 10:05:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-550070678", "caps": ["mds", "allow rw path=/volumes/_nogroup/812d8099-4259-45c6-8802-8b5ec410d596/c2facf6e-037e-4fc9-867d-c6a4c723aa9e", "osd", "allow rw pool=manila_data namespace=fsvolumens_812d8099-4259-45c6-8802-8b5ec410d596", "mon", "allow r"], "format": "json"} : dispatch
Feb 23 10:05:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-550070678", "caps": ["mds", "allow rw path=/volumes/_nogroup/812d8099-4259-45c6-8802-8b5ec410d596/c2facf6e-037e-4fc9-867d-c6a4c723aa9e", "osd", "allow rw pool=manila_data namespace=fsvolumens_812d8099-4259-45c6-8802-8b5ec410d596", "mon", "allow r"], "format": "json"}]': finished
Feb 23 10:05:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Feb 23 10:05:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Feb 23 10:05:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Feb 23 10:05:11 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished
Feb 23 10:05:12 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "a87f3747-06f6-4188-82ee-060b8ce9fc02", "auth_id": "alice", "format": "json"}]: dispatch
Feb 23 10:05:12 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "a87f3747-06f6-4188-82ee-060b8ce9fc02", "auth_id": "alice", "format": "json"}]: dispatch
Feb 23 10:05:12 np0005626463.localdomain ceph-mon[294160]: pgmap v585: 177 pgs: 177 active+clean; 214 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 180 KiB/s wr, 16 op/s
Feb 23 10:05:12 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T10:05:12Z|00366|binding|INFO|Releasing lport 4143c8ea-7577-4792-9744-bcff90eb20f2 from this chassis (sb_readonly=0)
Feb 23 10:05:12 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:05:12.569 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:05:12 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e245 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 23 10:05:13 np0005626463.localdomain openstack_network_exporter[245358]: ERROR   10:05:13 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 23 10:05:13 np0005626463.localdomain openstack_network_exporter[245358]: 
Feb 23 10:05:13 np0005626463.localdomain openstack_network_exporter[245358]: ERROR   10:05:13 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 23 10:05:13 np0005626463.localdomain openstack_network_exporter[245358]: 
Feb 23 10:05:13 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.tempest-cephx-id-550070678"} v 0)
Feb 23 10:05:13 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-550070678"} : dispatch
Feb 23 10:05:13 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-550070678"}]': finished
Feb 23 10:05:13 np0005626463.localdomain ceph-mds[286877]: mds.mds.np0005626463.qcthuc asok_command: session evict {filters=[auth_name=tempest-cephx-id-550070678,client_metadata.root=/volumes/_nogroup/812d8099-4259-45c6-8802-8b5ec410d596/c2facf6e-037e-4fc9-867d-c6a4c723aa9e],prefix=session evict} (starting...)
Feb 23 10:05:13 np0005626463.localdomain ceph-mon[294160]: pgmap v586: 177 pgs: 177 active+clean; 214 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 341 B/s rd, 114 KiB/s wr, 10 op/s
Feb 23 10:05:13 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-550070678", "format": "json"} : dispatch
Feb 23 10:05:13 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-550070678"} : dispatch
Feb 23 10:05:13 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-550070678"} : dispatch
Feb 23 10:05:13 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-550070678"}]': finished
Feb 23 10:05:14 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} v 0)
Feb 23 10:05:14 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} : dispatch
Feb 23 10:05:14 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"}]': finished
Feb 23 10:05:14 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "812d8099-4259-45c6-8802-8b5ec410d596", "auth_id": "tempest-cephx-id-550070678", "format": "json"}]: dispatch
Feb 23 10:05:14 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "812d8099-4259-45c6-8802-8b5ec410d596", "auth_id": "tempest-cephx-id-550070678", "format": "json"}]: dispatch
Feb 23 10:05:14 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "a87f3747-06f6-4188-82ee-060b8ce9fc02", "auth_id": "alice_bob", "tenant_id": "b8a78bca43aa415e9b740fe00d08afee", "access_level": "rw", "format": "json"}]: dispatch
Feb 23 10:05:14 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Feb 23 10:05:14 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} : dispatch
Feb 23 10:05:14 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} : dispatch
Feb 23 10:05:14 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"}]': finished
Feb 23 10:05:15 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:05:15.007 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:05:15 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T10:05:15Z|00367|binding|INFO|Releasing lport 4143c8ea-7577-4792-9744-bcff90eb20f2 from this chassis (sb_readonly=0)
Feb 23 10:05:15 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:05:15.513 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:05:16 np0005626463.localdomain ceph-mon[294160]: pgmap v587: 177 pgs: 177 active+clean; 214 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 341 B/s rd, 113 KiB/s wr, 10 op/s
Feb 23 10:05:16 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-550070678", "caps": ["mds", "allow rw path=/volumes/_nogroup/812d8099-4259-45c6-8802-8b5ec410d596/c2facf6e-037e-4fc9-867d-c6a4c723aa9e", "osd", "allow rw pool=manila_data namespace=fsvolumens_812d8099-4259-45c6-8802-8b5ec410d596", "mon", "allow r"], "format": "json"} v 0)
Feb 23 10:05:16 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-550070678", "caps": ["mds", "allow rw path=/volumes/_nogroup/812d8099-4259-45c6-8802-8b5ec410d596/c2facf6e-037e-4fc9-867d-c6a4c723aa9e", "osd", "allow rw pool=manila_data namespace=fsvolumens_812d8099-4259-45c6-8802-8b5ec410d596", "mon", "allow r"], "format": "json"} : dispatch
Feb 23 10:05:16 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-550070678", "caps": ["mds", "allow rw path=/volumes/_nogroup/812d8099-4259-45c6-8802-8b5ec410d596/c2facf6e-037e-4fc9-867d-c6a4c723aa9e", "osd", "allow rw pool=manila_data namespace=fsvolumens_812d8099-4259-45c6-8802-8b5ec410d596", "mon", "allow r"], "format": "json"}]': finished
Feb 23 10:05:17 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-550070678", "format": "json"} : dispatch
Feb 23 10:05:17 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-550070678", "caps": ["mds", "allow rw path=/volumes/_nogroup/812d8099-4259-45c6-8802-8b5ec410d596/c2facf6e-037e-4fc9-867d-c6a4c723aa9e", "osd", "allow rw pool=manila_data namespace=fsvolumens_812d8099-4259-45c6-8802-8b5ec410d596", "mon", "allow r"], "format": "json"} : dispatch
Feb 23 10:05:17 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-550070678", "caps": ["mds", "allow rw path=/volumes/_nogroup/812d8099-4259-45c6-8802-8b5ec410d596/c2facf6e-037e-4fc9-867d-c6a4c723aa9e", "osd", "allow rw pool=manila_data namespace=fsvolumens_812d8099-4259-45c6-8802-8b5ec410d596", "mon", "allow r"], "format": "json"} : dispatch
Feb 23 10:05:17 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-550070678", "caps": ["mds", "allow rw path=/volumes/_nogroup/812d8099-4259-45c6-8802-8b5ec410d596/c2facf6e-037e-4fc9-867d-c6a4c723aa9e", "osd", "allow rw pool=manila_data namespace=fsvolumens_812d8099-4259-45c6-8802-8b5ec410d596", "mon", "allow r"], "format": "json"}]': finished
Feb 23 10:05:17 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice_bob"} v 0)
Feb 23 10:05:17 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Feb 23 10:05:17 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished
Feb 23 10:05:17 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Feb 23 10:05:17 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2058529482' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 23 10:05:17 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Feb 23 10:05:17 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2058529482' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 23 10:05:17 np0005626463.localdomain ceph-mds[286877]: mds.mds.np0005626463.qcthuc asok_command: session evict {filters=[auth_name=alice_bob,client_metadata.root=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae],prefix=session evict} (starting...)
Feb 23 10:05:17 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e245 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 23 10:05:18 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "812d8099-4259-45c6-8802-8b5ec410d596", "auth_id": "tempest-cephx-id-550070678", "tenant_id": "15d1711403cd469e88c36db6fc4b0add", "access_level": "rw", "format": "json"}]: dispatch
Feb 23 10:05:18 np0005626463.localdomain ceph-mon[294160]: pgmap v588: 177 pgs: 177 active+clean; 215 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 163 KiB/s wr, 14 op/s
Feb 23 10:05:18 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "a87f3747-06f6-4188-82ee-060b8ce9fc02", "auth_id": "alice_bob", "format": "json"}]: dispatch
Feb 23 10:05:18 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Feb 23 10:05:18 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Feb 23 10:05:18 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Feb 23 10:05:18 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished
Feb 23 10:05:18 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/2058529482' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 23 10:05:18 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/2058529482' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 23 10:05:19 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "a87f3747-06f6-4188-82ee-060b8ce9fc02", "auth_id": "alice_bob", "format": "json"}]: dispatch
Feb 23 10:05:20 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:05:20.207 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:05:20 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.tempest-cephx-id-550070678"} v 0)
Feb 23 10:05:20 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-550070678"} : dispatch
Feb 23 10:05:20 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-550070678"}]': finished
Feb 23 10:05:20 np0005626463.localdomain ceph-mds[286877]: mds.mds.np0005626463.qcthuc asok_command: session evict {filters=[auth_name=tempest-cephx-id-550070678,client_metadata.root=/volumes/_nogroup/812d8099-4259-45c6-8802-8b5ec410d596/c2facf6e-037e-4fc9-867d-c6a4c723aa9e],prefix=session evict} (starting...)
Feb 23 10:05:20 np0005626463.localdomain ceph-mon[294160]: pgmap v589: 177 pgs: 177 active+clean; 215 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 426 B/s rd, 122 KiB/s wr, 11 op/s
Feb 23 10:05:20 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-550070678", "format": "json"} : dispatch
Feb 23 10:05:20 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-550070678"} : dispatch
Feb 23 10:05:20 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-550070678"} : dispatch
Feb 23 10:05:20 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-550070678"}]': finished
Feb 23 10:05:20 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow r pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} v 0)
Feb 23 10:05:20 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow r pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} : dispatch
Feb 23 10:05:20 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow r pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"}]': finished
Feb 23 10:05:20 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.
Feb 23 10:05:20 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.
Feb 23 10:05:20 np0005626463.localdomain podman[322423]: 2026-02-23 10:05:20.895406232 +0000 UTC m=+0.070888414 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, org.label-schema.build-date=20260216, container_name=ovn_controller, io.buildah.version=1.43.0, config_id=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Feb 23 10:05:20 np0005626463.localdomain podman[322423]: 2026-02-23 10:05:20.958557573 +0000 UTC m=+0.134039785 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.43.0, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, tcib_managed=true)
Feb 23 10:05:20 np0005626463.localdomain systemd[1]: tmp-crun.oLd5Jl.mount: Deactivated successfully.
Feb 23 10:05:20 np0005626463.localdomain systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully.
Feb 23 10:05:20 np0005626463.localdomain podman[322424]: 2026-02-23 10:05:20.976690264 +0000 UTC m=+0.148999298 container health_status bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 23 10:05:20 np0005626463.localdomain podman[322424]: 2026-02-23 10:05:20.988438047 +0000 UTC m=+0.160747091 container exec_died bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 23 10:05:21 np0005626463.localdomain systemd[1]: bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.service: Deactivated successfully.
Feb 23 10:05:21 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "812d8099-4259-45c6-8802-8b5ec410d596", "auth_id": "tempest-cephx-id-550070678", "format": "json"}]: dispatch
Feb 23 10:05:21 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "812d8099-4259-45c6-8802-8b5ec410d596", "auth_id": "tempest-cephx-id-550070678", "format": "json"}]: dispatch
Feb 23 10:05:21 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Feb 23 10:05:21 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow r pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} : dispatch
Feb 23 10:05:21 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow r pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} : dispatch
Feb 23 10:05:21 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow r pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"}]': finished
Feb 23 10:05:21 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 10:05:21.532 265541 INFO neutron.agent.linux.ip_lib [None req-7b396933-5536-46b6-8c4e-1d8db45bcb74 - - - - - -] Device tap97945a14-af cannot be used as it has no MAC address
Feb 23 10:05:21 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:05:21.557 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:05:21 np0005626463.localdomain kernel: device tap97945a14-af entered promiscuous mode
Feb 23 10:05:21 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T10:05:21Z|00368|binding|INFO|Claiming lport 97945a14-af1a-4a6f-ba38-cf9a96201926 for this chassis.
Feb 23 10:05:21 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T10:05:21Z|00369|binding|INFO|97945a14-af1a-4a6f-ba38-cf9a96201926: Claiming unknown
Feb 23 10:05:21 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:05:21.564 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:05:21 np0005626463.localdomain NetworkManager[5974]: <info>  [1771841121.5732] manager: (tap97945a14-af): new Generic device (/org/freedesktop/NetworkManager/Devices/57)
Feb 23 10:05:21 np0005626463.localdomain systemd-udevd[322477]: Network interface NamePolicy= disabled on kernel command line.
Feb 23 10:05:21 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 10:05:21.576 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005626463.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpfb23302c-55c1-5de0-badf-4fc1ff22837a-1fd1ea36-61b2-4373-a7fa-84d2547a4ab6', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1fd1ea36-61b2-4373-a7fa-84d2547a4ab6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1e7be559e0474f2f877f7adf99941064', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=70ad7e38-c8d0-4a81-8fc5-3d8731b8b543, chassis=[<ovs.db.idl.Row object at 0x7f808c075610>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f808c075610>], logical_port=97945a14-af1a-4a6f-ba38-cf9a96201926) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 10:05:21 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 10:05:21.578 163572 INFO neutron.agent.ovn.metadata.agent [-] Port 97945a14-af1a-4a6f-ba38-cf9a96201926 in datapath 1fd1ea36-61b2-4373-a7fa-84d2547a4ab6 bound to our chassis
Feb 23 10:05:21 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 10:05:21.580 163572 DEBUG neutron.agent.ovn.metadata.agent [-] Port ace0f608-b357-4062-b0dd-1f7d83446866 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Feb 23 10:05:21 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 10:05:21.580 163572 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 1fd1ea36-61b2-4373-a7fa-84d2547a4ab6, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 23 10:05:21 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 10:05:21.581 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[000882b1-ff00-46fe-b2f6-1cda86ed765b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 10:05:21 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap97945a14-af: No such device
Feb 23 10:05:21 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:05:21.604 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:05:21 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T10:05:21Z|00370|binding|INFO|Setting lport 97945a14-af1a-4a6f-ba38-cf9a96201926 ovn-installed in OVS
Feb 23 10:05:21 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T10:05:21Z|00371|binding|INFO|Setting lport 97945a14-af1a-4a6f-ba38-cf9a96201926 up in Southbound
Feb 23 10:05:21 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:05:21.608 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:05:21 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap97945a14-af: No such device
Feb 23 10:05:21 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:05:21.614 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:05:21 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap97945a14-af: No such device
Feb 23 10:05:21 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap97945a14-af: No such device
Feb 23 10:05:21 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap97945a14-af: No such device
Feb 23 10:05:21 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap97945a14-af: No such device
Feb 23 10:05:21 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap97945a14-af: No such device
Feb 23 10:05:21 np0005626463.localdomain virtnodedevd[231253]: ethtool ioctl error on tap97945a14-af: No such device
Feb 23 10:05:21 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:05:21.650 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:05:21 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:05:21.677 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:05:22 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:05:22.185 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:05:22 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "a87f3747-06f6-4188-82ee-060b8ce9fc02", "auth_id": "alice_bob", "tenant_id": "b8a78bca43aa415e9b740fe00d08afee", "access_level": "r", "format": "json"}]: dispatch
Feb 23 10:05:22 np0005626463.localdomain ceph-mon[294160]: pgmap v590: 177 pgs: 177 active+clean; 215 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 426 B/s rd, 194 KiB/s wr, 16 op/s
Feb 23 10:05:22 np0005626463.localdomain podman[322549]: 2026-02-23 10:05:22.581826676 +0000 UTC m=+0.092735998 container create c954a8ff366e93a4167ce730a1032544b49b69c5edd113d980b7645d9211801c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1fd1ea36-61b2-4373-a7fa-84d2547a4ab6, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Feb 23 10:05:22 np0005626463.localdomain systemd[1]: Started libpod-conmon-c954a8ff366e93a4167ce730a1032544b49b69c5edd113d980b7645d9211801c.scope.
Feb 23 10:05:22 np0005626463.localdomain podman[322549]: 2026-02-23 10:05:22.534794582 +0000 UTC m=+0.045703944 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 23 10:05:22 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 10:05:22 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0cf8b8c1279fdb39306472f422941fc3ae1cade469254f26899cabd285efde04/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 23 10:05:22 np0005626463.localdomain podman[322549]: 2026-02-23 10:05:22.650803368 +0000 UTC m=+0.161712680 container init c954a8ff366e93a4167ce730a1032544b49b69c5edd113d980b7645d9211801c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1fd1ea36-61b2-4373-a7fa-84d2547a4ab6, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.schema-version=1.0, tcib_managed=true)
Feb 23 10:05:22 np0005626463.localdomain podman[322549]: 2026-02-23 10:05:22.661990204 +0000 UTC m=+0.172899546 container start c954a8ff366e93a4167ce730a1032544b49b69c5edd113d980b7645d9211801c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1fd1ea36-61b2-4373-a7fa-84d2547a4ab6, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260216, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0)
Feb 23 10:05:22 np0005626463.localdomain dnsmasq[322567]: started, version 2.85 cachesize 150
Feb 23 10:05:22 np0005626463.localdomain dnsmasq[322567]: DNS service limited to local subnets
Feb 23 10:05:22 np0005626463.localdomain dnsmasq[322567]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 23 10:05:22 np0005626463.localdomain dnsmasq[322567]: warning: no upstream servers configured
Feb 23 10:05:22 np0005626463.localdomain dnsmasq-dhcp[322567]: DHCP, static leases only on 10.100.0.0, lease time 1d
Feb 23 10:05:22 np0005626463.localdomain dnsmasq[322567]: read /var/lib/neutron/dhcp/1fd1ea36-61b2-4373-a7fa-84d2547a4ab6/addn_hosts - 0 addresses
Feb 23 10:05:22 np0005626463.localdomain dnsmasq-dhcp[322567]: read /var/lib/neutron/dhcp/1fd1ea36-61b2-4373-a7fa-84d2547a4ab6/host
Feb 23 10:05:22 np0005626463.localdomain dnsmasq-dhcp[322567]: read /var/lib/neutron/dhcp/1fd1ea36-61b2-4373-a7fa-84d2547a4ab6/opts
Feb 23 10:05:22 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 10:05:22.912 265541 INFO neutron.agent.dhcp.agent [None req-de2561f4-4069-497e-8bd9-5861df172dea - - - - - -] DHCP configuration for ports {'c899d854-97b2-4f5c-9a4f-b3ac893e22e2'} is completed
Feb 23 10:05:22 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e245 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 23 10:05:23 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 10:05:23.373 265541 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-23T10:05:23Z, description=, device_id=36bd3448-ec4f-40e6-b201-5bd21215f6b7, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f28292620d0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f2829262370>], id=7dafe099-5e91-4531-bb08-a4050630ab61, ip_allocation=immediate, mac_address=fa:16:3e:cd:94:06, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-23T10:05:19Z, description=, dns_domain=, id=1fd1ea36-61b2-4373-a7fa-84d2547a4ab6, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-PrometheusGabbiTest-1611802295-network, port_security_enabled=True, project_id=1e7be559e0474f2f877f7adf99941064, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=28649, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=3734, status=ACTIVE, subnets=['5bfa3b6e-e658-4e35-be2b-09dffb18d94e'], tags=[], tenant_id=1e7be559e0474f2f877f7adf99941064, updated_at=2026-02-23T10:05:20Z, vlan_transparent=None, network_id=1fd1ea36-61b2-4373-a7fa-84d2547a4ab6, port_security_enabled=False, project_id=1e7be559e0474f2f877f7adf99941064, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3742, status=DOWN, tags=[], tenant_id=1e7be559e0474f2f877f7adf99941064, updated_at=2026-02-23T10:05:23Z on network 1fd1ea36-61b2-4373-a7fa-84d2547a4ab6
Feb 23 10:05:23 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-550070678", "caps": ["mds", "allow rw path=/volumes/_nogroup/812d8099-4259-45c6-8802-8b5ec410d596/c2facf6e-037e-4fc9-867d-c6a4c723aa9e", "osd", "allow rw pool=manila_data namespace=fsvolumens_812d8099-4259-45c6-8802-8b5ec410d596", "mon", "allow r"], "format": "json"} v 0)
Feb 23 10:05:23 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-550070678", "caps": ["mds", "allow rw path=/volumes/_nogroup/812d8099-4259-45c6-8802-8b5ec410d596/c2facf6e-037e-4fc9-867d-c6a4c723aa9e", "osd", "allow rw pool=manila_data namespace=fsvolumens_812d8099-4259-45c6-8802-8b5ec410d596", "mon", "allow r"], "format": "json"} : dispatch
Feb 23 10:05:23 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-550070678", "caps": ["mds", "allow rw path=/volumes/_nogroup/812d8099-4259-45c6-8802-8b5ec410d596/c2facf6e-037e-4fc9-867d-c6a4c723aa9e", "osd", "allow rw pool=manila_data namespace=fsvolumens_812d8099-4259-45c6-8802-8b5ec410d596", "mon", "allow r"], "format": "json"}]': finished
Feb 23 10:05:23 np0005626463.localdomain dnsmasq[322567]: read /var/lib/neutron/dhcp/1fd1ea36-61b2-4373-a7fa-84d2547a4ab6/addn_hosts - 1 addresses
Feb 23 10:05:23 np0005626463.localdomain dnsmasq-dhcp[322567]: read /var/lib/neutron/dhcp/1fd1ea36-61b2-4373-a7fa-84d2547a4ab6/host
Feb 23 10:05:23 np0005626463.localdomain dnsmasq-dhcp[322567]: read /var/lib/neutron/dhcp/1fd1ea36-61b2-4373-a7fa-84d2547a4ab6/opts
Feb 23 10:05:23 np0005626463.localdomain podman[322585]: 2026-02-23 10:05:23.602976333 +0000 UTC m=+0.067534669 container kill c954a8ff366e93a4167ce730a1032544b49b69c5edd113d980b7645d9211801c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1fd1ea36-61b2-4373-a7fa-84d2547a4ab6, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Feb 23 10:05:23 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 10:05:23.859 265541 INFO neutron.agent.dhcp.agent [None req-246d2597-6dee-4296-8c20-68a48c971254 - - - - - -] DHCP configuration for ports {'7dafe099-5e91-4531-bb08-a4050630ab61'} is completed
Feb 23 10:05:23 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "812d8099-4259-45c6-8802-8b5ec410d596", "auth_id": "tempest-cephx-id-550070678", "tenant_id": "15d1711403cd469e88c36db6fc4b0add", "access_level": "rw", "format": "json"}]: dispatch
Feb 23 10:05:23 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-550070678", "format": "json"} : dispatch
Feb 23 10:05:23 np0005626463.localdomain ceph-mon[294160]: pgmap v591: 177 pgs: 177 active+clean; 215 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 170 B/s rd, 122 KiB/s wr, 10 op/s
Feb 23 10:05:23 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-550070678", "caps": ["mds", "allow rw path=/volumes/_nogroup/812d8099-4259-45c6-8802-8b5ec410d596/c2facf6e-037e-4fc9-867d-c6a4c723aa9e", "osd", "allow rw pool=manila_data namespace=fsvolumens_812d8099-4259-45c6-8802-8b5ec410d596", "mon", "allow r"], "format": "json"} : dispatch
Feb 23 10:05:23 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-550070678", "caps": ["mds", "allow rw path=/volumes/_nogroup/812d8099-4259-45c6-8802-8b5ec410d596/c2facf6e-037e-4fc9-867d-c6a4c723aa9e", "osd", "allow rw pool=manila_data namespace=fsvolumens_812d8099-4259-45c6-8802-8b5ec410d596", "mon", "allow r"], "format": "json"} : dispatch
Feb 23 10:05:23 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-550070678", "caps": ["mds", "allow rw path=/volumes/_nogroup/812d8099-4259-45c6-8802-8b5ec410d596/c2facf6e-037e-4fc9-867d-c6a4c723aa9e", "osd", "allow rw pool=manila_data namespace=fsvolumens_812d8099-4259-45c6-8802-8b5ec410d596", "mon", "allow r"], "format": "json"}]': finished
Feb 23 10:05:24 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice_bob"} v 0)
Feb 23 10:05:24 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Feb 23 10:05:24 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished
Feb 23 10:05:24 np0005626463.localdomain ceph-mds[286877]: mds.mds.np0005626463.qcthuc asok_command: session evict {filters=[auth_name=alice_bob,client_metadata.root=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae],prefix=session evict} (starting...)
Feb 23 10:05:24 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 10:05:24.305 265541 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-23T10:05:23Z, description=, device_id=36bd3448-ec4f-40e6-b201-5bd21215f6b7, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f282912ea30>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f282912ea90>], id=7dafe099-5e91-4531-bb08-a4050630ab61, ip_allocation=immediate, mac_address=fa:16:3e:cd:94:06, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-23T10:05:19Z, description=, dns_domain=, id=1fd1ea36-61b2-4373-a7fa-84d2547a4ab6, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-PrometheusGabbiTest-1611802295-network, port_security_enabled=True, project_id=1e7be559e0474f2f877f7adf99941064, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=28649, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=3734, status=ACTIVE, subnets=['5bfa3b6e-e658-4e35-be2b-09dffb18d94e'], tags=[], tenant_id=1e7be559e0474f2f877f7adf99941064, updated_at=2026-02-23T10:05:20Z, vlan_transparent=None, network_id=1fd1ea36-61b2-4373-a7fa-84d2547a4ab6, port_security_enabled=False, project_id=1e7be559e0474f2f877f7adf99941064, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3742, status=DOWN, tags=[], tenant_id=1e7be559e0474f2f877f7adf99941064, updated_at=2026-02-23T10:05:23Z on network 1fd1ea36-61b2-4373-a7fa-84d2547a4ab6
Feb 23 10:05:24 np0005626463.localdomain dnsmasq[322567]: read /var/lib/neutron/dhcp/1fd1ea36-61b2-4373-a7fa-84d2547a4ab6/addn_hosts - 1 addresses
Feb 23 10:05:24 np0005626463.localdomain dnsmasq-dhcp[322567]: read /var/lib/neutron/dhcp/1fd1ea36-61b2-4373-a7fa-84d2547a4ab6/host
Feb 23 10:05:24 np0005626463.localdomain dnsmasq-dhcp[322567]: read /var/lib/neutron/dhcp/1fd1ea36-61b2-4373-a7fa-84d2547a4ab6/opts
Feb 23 10:05:24 np0005626463.localdomain podman[322622]: 2026-02-23 10:05:24.506774613 +0000 UTC m=+0.061388789 container kill c954a8ff366e93a4167ce730a1032544b49b69c5edd113d980b7645d9211801c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1fd1ea36-61b2-4373-a7fa-84d2547a4ab6, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.build-date=20260216)
Feb 23 10:05:24 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 10:05:24.872 265541 INFO neutron.agent.dhcp.agent [None req-9d74762b-d024-40a5-9b82-55f464dd2f56 - - - - - -] DHCP configuration for ports {'7dafe099-5e91-4531-bb08-a4050630ab61'} is completed
Feb 23 10:05:25 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "a87f3747-06f6-4188-82ee-060b8ce9fc02", "auth_id": "alice_bob", "format": "json"}]: dispatch
Feb 23 10:05:25 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Feb 23 10:05:25 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Feb 23 10:05:25 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Feb 23 10:05:25 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished
Feb 23 10:05:25 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "a87f3747-06f6-4188-82ee-060b8ce9fc02", "auth_id": "alice_bob", "format": "json"}]: dispatch
Feb 23 10:05:25 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:05:25.241 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:05:26 np0005626463.localdomain ceph-mon[294160]: pgmap v592: 177 pgs: 177 active+clean; 215 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 170 B/s rd, 122 KiB/s wr, 10 op/s
Feb 23 10:05:26 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.tempest-cephx-id-550070678"} v 0)
Feb 23 10:05:26 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-550070678"} : dispatch
Feb 23 10:05:26 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.
Feb 23 10:05:26 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.
Feb 23 10:05:26 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-550070678"}]': finished
Feb 23 10:05:26 np0005626463.localdomain ceph-mds[286877]: mds.mds.np0005626463.qcthuc asok_command: session evict {filters=[auth_name=tempest-cephx-id-550070678,client_metadata.root=/volumes/_nogroup/812d8099-4259-45c6-8802-8b5ec410d596/c2facf6e-037e-4fc9-867d-c6a4c723aa9e],prefix=session evict} (starting...)
Feb 23 10:05:26 np0005626463.localdomain podman[322643]: 2026-02-23 10:05:26.924534346 +0000 UTC m=+0.088126046 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.43.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Feb 23 10:05:26 np0005626463.localdomain podman[322644]: 2026-02-23 10:05:26.983015494 +0000 UTC m=+0.143445856 container health_status be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible)
Feb 23 10:05:27 np0005626463.localdomain podman[322643]: 2026-02-23 10:05:27.006556392 +0000 UTC m=+0.170148112 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20260216, managed_by=edpm_ansible, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.43.0)
Feb 23 10:05:27 np0005626463.localdomain systemd[1]: 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully.
Feb 23 10:05:27 np0005626463.localdomain podman[322644]: 2026-02-23 10:05:27.023543927 +0000 UTC m=+0.183974239 container exec_died be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_managed=true, config_id=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, io.buildah.version=1.43.0)
Feb 23 10:05:27 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-550070678", "format": "json"} : dispatch
Feb 23 10:05:27 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-550070678"} : dispatch
Feb 23 10:05:27 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-550070678"} : dispatch
Feb 23 10:05:27 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-550070678"}]': finished
Feb 23 10:05:27 np0005626463.localdomain systemd[1]: be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.service: Deactivated successfully.
Feb 23 10:05:27 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:05:27.078 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:05:27 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:05:27.078 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 23 10:05:27 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} v 0)
Feb 23 10:05:27 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} : dispatch
Feb 23 10:05:27 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"}]': finished
Feb 23 10:05:27 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e245 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 23 10:05:28 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:05:28.055 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:05:28 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:05:28.056 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 23 10:05:28 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:05:28.056 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 23 10:05:28 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "812d8099-4259-45c6-8802-8b5ec410d596", "auth_id": "tempest-cephx-id-550070678", "format": "json"}]: dispatch
Feb 23 10:05:28 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "812d8099-4259-45c6-8802-8b5ec410d596", "auth_id": "tempest-cephx-id-550070678", "format": "json"}]: dispatch
Feb 23 10:05:28 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "a87f3747-06f6-4188-82ee-060b8ce9fc02", "auth_id": "alice bob", "tenant_id": "b8a78bca43aa415e9b740fe00d08afee", "access_level": "rw", "format": "json"}]: dispatch
Feb 23 10:05:28 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Feb 23 10:05:28 np0005626463.localdomain ceph-mon[294160]: pgmap v593: 177 pgs: 177 active+clean; 216 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 341 B/s rd, 160 KiB/s wr, 14 op/s
Feb 23 10:05:28 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} : dispatch
Feb 23 10:05:28 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} : dispatch
Feb 23 10:05:28 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"}]': finished
Feb 23 10:05:28 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:05:28.874 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 23 10:05:28 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:05:28.874 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquired lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 23 10:05:28 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:05:28.875 282211 DEBUG nova.network.neutron [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 23 10:05:28 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:05:28.875 282211 DEBUG nova.objects.instance [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c2a7d92b-952f-46a7-8a6a-3322a48fcf4b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 23 10:05:29 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:05:29.377 282211 DEBUG nova.network.neutron [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Updating instance_info_cache with network_info: [{"id": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "address": "fa:16:3e:a0:9d:00", "network": {"id": "9da5b53d-3184-450f-9a5b-bdba1a6c9f6d", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "37b8098efb0d4ecc90b451a2db0e966f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa27e5011-20", "ovs_interfaceid": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 23 10:05:29 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:05:29.402 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Releasing lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 23 10:05:29 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:05:29.403 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 23 10:05:29 np0005626463.localdomain ceph-mon[294160]: pgmap v594: 177 pgs: 177 active+clean; 216 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 170 B/s rd, 111 KiB/s wr, 10 op/s
Feb 23 10:05:29 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.108:0/487866331' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 10:05:29 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:05:29.683 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:05:30 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-550070678", "caps": ["mds", "allow rw path=/volumes/_nogroup/812d8099-4259-45c6-8802-8b5ec410d596/c2facf6e-037e-4fc9-867d-c6a4c723aa9e", "osd", "allow rw pool=manila_data namespace=fsvolumens_812d8099-4259-45c6-8802-8b5ec410d596", "mon", "allow r"], "format": "json"} v 0)
Feb 23 10:05:30 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-550070678", "caps": ["mds", "allow rw path=/volumes/_nogroup/812d8099-4259-45c6-8802-8b5ec410d596/c2facf6e-037e-4fc9-867d-c6a4c723aa9e", "osd", "allow rw pool=manila_data namespace=fsvolumens_812d8099-4259-45c6-8802-8b5ec410d596", "mon", "allow r"], "format": "json"} : dispatch
Feb 23 10:05:30 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-550070678", "caps": ["mds", "allow rw path=/volumes/_nogroup/812d8099-4259-45c6-8802-8b5ec410d596/c2facf6e-037e-4fc9-867d-c6a4c723aa9e", "osd", "allow rw pool=manila_data namespace=fsvolumens_812d8099-4259-45c6-8802-8b5ec410d596", "mon", "allow r"], "format": "json"}]': finished
Feb 23 10:05:30 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:05:30.243 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:05:30 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "812d8099-4259-45c6-8802-8b5ec410d596", "auth_id": "tempest-cephx-id-550070678", "tenant_id": "15d1711403cd469e88c36db6fc4b0add", "access_level": "rw", "format": "json"}]: dispatch
Feb 23 10:05:30 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-550070678", "format": "json"} : dispatch
Feb 23 10:05:30 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-550070678", "caps": ["mds", "allow rw path=/volumes/_nogroup/812d8099-4259-45c6-8802-8b5ec410d596/c2facf6e-037e-4fc9-867d-c6a4c723aa9e", "osd", "allow rw pool=manila_data namespace=fsvolumens_812d8099-4259-45c6-8802-8b5ec410d596", "mon", "allow r"], "format": "json"} : dispatch
Feb 23 10:05:30 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-550070678", "caps": ["mds", "allow rw path=/volumes/_nogroup/812d8099-4259-45c6-8802-8b5ec410d596/c2facf6e-037e-4fc9-867d-c6a4c723aa9e", "osd", "allow rw pool=manila_data namespace=fsvolumens_812d8099-4259-45c6-8802-8b5ec410d596", "mon", "allow r"], "format": "json"} : dispatch
Feb 23 10:05:30 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-550070678", "caps": ["mds", "allow rw path=/volumes/_nogroup/812d8099-4259-45c6-8802-8b5ec410d596/c2facf6e-037e-4fc9-867d-c6a4c723aa9e", "osd", "allow rw pool=manila_data namespace=fsvolumens_812d8099-4259-45c6-8802-8b5ec410d596", "mon", "allow r"], "format": "json"}]': finished
Feb 23 10:05:30 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.108:0/3486844311' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 10:05:30 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice bob"} v 0)
Feb 23 10:05:30 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Feb 23 10:05:30 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished
Feb 23 10:05:30 np0005626463.localdomain ceph-mds[286877]: mds.mds.np0005626463.qcthuc asok_command: session evict {filters=[auth_name=alice bob,client_metadata.root=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae],prefix=session evict} (starting...)
Feb 23 10:05:31 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:05:31.055 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:05:31 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "a87f3747-06f6-4188-82ee-060b8ce9fc02", "auth_id": "alice bob", "format": "json"}]: dispatch
Feb 23 10:05:31 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Feb 23 10:05:31 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Feb 23 10:05:31 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Feb 23 10:05:31 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished
Feb 23 10:05:31 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "a87f3747-06f6-4188-82ee-060b8ce9fc02", "auth_id": "alice bob", "format": "json"}]: dispatch
Feb 23 10:05:31 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.107:0/974437338' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 10:05:31 np0005626463.localdomain ceph-mon[294160]: pgmap v595: 177 pgs: 177 active+clean; 217 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 341 B/s rd, 182 KiB/s wr, 16 op/s
Feb 23 10:05:32 np0005626463.localdomain sudo[322681]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 10:05:32 np0005626463.localdomain sudo[322681]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 10:05:32 np0005626463.localdomain sudo[322681]: pam_unix(sudo:session): session closed for user root
Feb 23 10:05:32 np0005626463.localdomain sudo[322699]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 23 10:05:32 np0005626463.localdomain sudo[322699]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 10:05:32 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.107:0/1581869464' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 10:05:32 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:05:32.776 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:05:32 np0005626463.localdomain sudo[322699]: pam_unix(sudo:session): session closed for user root
Feb 23 10:05:32 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Feb 23 10:05:32 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 10:05:32 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e245 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 23 10:05:32 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #67. Immutable memtables: 0.
Feb 23 10:05:32 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:05:32.986582) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Feb 23 10:05:32 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/flush_job.cc:856] [default] [JOB 39] Flushing memtable with next log file: 67
Feb 23 10:05:32 np0005626463.localdomain ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771841132986666, "job": 39, "event": "flush_started", "num_memtables": 1, "num_entries": 1449, "num_deletes": 251, "total_data_size": 1325057, "memory_usage": 1353056, "flush_reason": "Manual Compaction"}
Feb 23 10:05:32 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/flush_job.cc:885] [default] [JOB 39] Level-0 flush table #68: started
Feb 23 10:05:32 np0005626463.localdomain ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771841132996260, "cf_name": "default", "job": 39, "event": "table_file_creation", "file_number": 68, "file_size": 1301381, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 36568, "largest_seqno": 38016, "table_properties": {"data_size": 1295011, "index_size": 3328, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1989, "raw_key_size": 16197, "raw_average_key_size": 20, "raw_value_size": 1280910, "raw_average_value_size": 1625, "num_data_blocks": 144, "num_entries": 788, "num_filter_entries": 788, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771841072, "oldest_key_time": 1771841072, "file_creation_time": 1771841132, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4cfd6c8f-aafa-4003-b2f6-d22c49635dd4", "db_session_id": "66DAQ76CBLV8DSGL8JC7", "orig_file_number": 68, "seqno_to_time_mapping": "N/A"}}
Feb 23 10:05:32 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 39] Flush lasted 9729 microseconds, and 4964 cpu microseconds.
Feb 23 10:05:32 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 23 10:05:33 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:05:32.996325) [db/flush_job.cc:967] [default] [JOB 39] Level-0 flush table #68: 1301381 bytes OK
Feb 23 10:05:33 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:05:32.996358) [db/memtable_list.cc:519] [default] Level-0 commit table #68 started
Feb 23 10:05:33 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:05:32.999948) [db/memtable_list.cc:722] [default] Level-0 commit table #68: memtable #1 done
Feb 23 10:05:33 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:05:32.999971) EVENT_LOG_v1 {"time_micros": 1771841132999964, "job": 39, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Feb 23 10:05:33 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:05:33.000001) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Feb 23 10:05:33 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 39] Try to delete WAL files size 1317951, prev total WAL file size 1317951, number of live WAL files 2.
Feb 23 10:05:33 np0005626463.localdomain ceph-mon[294160]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005626463/store.db/000064.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 23 10:05:33 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:05:33.000723) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6B760031353234' seq:72057594037927935, type:22 .. '6B760031373735' seq:0, type:0; will stop at (end)
Feb 23 10:05:33 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 40] Compacting 1@0 + 1@6 files to L6, score -1.00
Feb 23 10:05:33 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 39 Base level 0, inputs: [68(1270KB)], [66(18MB)]
Feb 23 10:05:33 np0005626463.localdomain ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771841133000824, "job": 40, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [68], "files_L6": [66], "score": -1, "input_data_size": 20510826, "oldest_snapshot_seqno": -1}
Feb 23 10:05:33 np0005626463.localdomain sudo[322749]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 10:05:33 np0005626463.localdomain sudo[322749]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 10:05:33 np0005626463.localdomain sudo[322749]: pam_unix(sudo:session): session closed for user root
Feb 23 10:05:33 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:05:33.055 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:05:33 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:05:33.073 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 10:05:33 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:05:33.074 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 10:05:33 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:05:33.074 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 10:05:33 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:05:33.074 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Auditing locally available compute resources for np0005626463.localdomain (node: np0005626463.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 23 10:05:33 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:05:33.075 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 10:05:33 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 40] Generated table #69: 14260 keys, 19462717 bytes, temperature: kUnknown
Feb 23 10:05:33 np0005626463.localdomain ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771841133132683, "cf_name": "default", "job": 40, "event": "table_file_creation", "file_number": 69, "file_size": 19462717, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 19378571, "index_size": 47373, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 35717, "raw_key_size": 383151, "raw_average_key_size": 26, "raw_value_size": 19133408, "raw_average_value_size": 1341, "num_data_blocks": 1771, "num_entries": 14260, "num_filter_entries": 14260, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771839971, "oldest_key_time": 0, "file_creation_time": 1771841133, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4cfd6c8f-aafa-4003-b2f6-d22c49635dd4", "db_session_id": "66DAQ76CBLV8DSGL8JC7", "orig_file_number": 69, "seqno_to_time_mapping": "N/A"}}
Feb 23 10:05:33 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 23 10:05:33 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:05:33.133143) [db/compaction/compaction_job.cc:1663] [default] [JOB 40] Compacted 1@0 + 1@6 files to L6 => 19462717 bytes
Feb 23 10:05:33 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:05:33.135127) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 155.4 rd, 147.5 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.2, 18.3 +0.0 blob) out(18.6 +0.0 blob), read-write-amplify(30.7) write-amplify(15.0) OK, records in: 14790, records dropped: 530 output_compression: NoCompression
Feb 23 10:05:33 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:05:33.135159) EVENT_LOG_v1 {"time_micros": 1771841133135144, "job": 40, "event": "compaction_finished", "compaction_time_micros": 131984, "compaction_time_cpu_micros": 63441, "output_level": 6, "num_output_files": 1, "total_output_size": 19462717, "num_input_records": 14790, "num_output_records": 14260, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Feb 23 10:05:33 np0005626463.localdomain ceph-mon[294160]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005626463/store.db/000068.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 23 10:05:33 np0005626463.localdomain ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771841133135542, "job": 40, "event": "table_file_deletion", "file_number": 68}
Feb 23 10:05:33 np0005626463.localdomain ceph-mon[294160]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005626463/store.db/000066.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 23 10:05:33 np0005626463.localdomain ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771841133138450, "job": 40, "event": "table_file_deletion", "file_number": 66}
Feb 23 10:05:33 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:05:33.000575) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 23 10:05:33 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:05:33.138601) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 23 10:05:33 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:05:33.138618) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 23 10:05:33 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:05:33.138621) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 23 10:05:33 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:05:33.138624) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 23 10:05:33 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:05:33.138628) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 23 10:05:33 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.tempest-cephx-id-550070678"} v 0)
Feb 23 10:05:33 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-550070678"} : dispatch
Feb 23 10:05:33 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-550070678"}]': finished
Feb 23 10:05:33 np0005626463.localdomain ceph-mds[286877]: mds.mds.np0005626463.qcthuc asok_command: session evict {filters=[auth_name=tempest-cephx-id-550070678,client_metadata.root=/volumes/_nogroup/812d8099-4259-45c6-8802-8b5ec410d596/c2facf6e-037e-4fc9-867d-c6a4c723aa9e],prefix=session evict} (starting...)
Feb 23 10:05:33 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 23 10:05:33 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/2976359123' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 10:05:33 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:05:33.592 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.517s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 10:05:33 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 10:05:33 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 23 10:05:33 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 10:05:33 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 23 10:05:33 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "812d8099-4259-45c6-8802-8b5ec410d596", "auth_id": "tempest-cephx-id-550070678", "format": "json"}]: dispatch
Feb 23 10:05:33 np0005626463.localdomain ceph-mon[294160]: pgmap v596: 177 pgs: 177 active+clean; 217 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 341 B/s rd, 110 KiB/s wr, 11 op/s
Feb 23 10:05:33 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-550070678", "format": "json"} : dispatch
Feb 23 10:05:33 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-550070678"} : dispatch
Feb 23 10:05:33 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-550070678"} : dispatch
Feb 23 10:05:33 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-550070678"}]': finished
Feb 23 10:05:33 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "812d8099-4259-45c6-8802-8b5ec410d596", "auth_id": "tempest-cephx-id-550070678", "format": "json"}]: dispatch
Feb 23 10:05:33 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.106:0/2976359123' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 10:05:33 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:05:33.673 282211 DEBUG nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 23 10:05:33 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:05:33.674 282211 DEBUG nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 23 10:05:33 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow r pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} v 0)
Feb 23 10:05:33 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow r pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} : dispatch
Feb 23 10:05:33 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:05:33.890 282211 WARNING nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 23 10:05:33 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow r pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"}]': finished
Feb 23 10:05:33 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:05:33.892 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Hypervisor/Node resource view: name=np0005626463.localdomain free_ram=11213MB free_disk=41.8366584777832GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 23 10:05:33 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:05:33.892 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 10:05:33 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:05:33.892 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 10:05:33 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:05:33.966 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Instance c2a7d92b-952f-46a7-8a6a-3322a48fcf4b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 23 10:05:33 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:05:33.967 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 23 10:05:33 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:05:33.968 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Final resource view: name=np0005626463.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 23 10:05:34 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:05:34.016 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 10:05:34 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 23 10:05:34 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/3581738185' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 10:05:34 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:05:34.487 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.471s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 10:05:34 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:05:34.493 282211 DEBUG nova.compute.provider_tree [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Inventory has not changed in ProviderTree for provider: be63d86c-a403-4ec9-a515-07ea2962cb4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 23 10:05:34 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:05:34.510 282211 DEBUG nova.scheduler.client.report [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Inventory has not changed for provider be63d86c-a403-4ec9-a515-07ea2962cb4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 23 10:05:34 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:05:34.511 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Compute_service record updated for np0005626463.localdomain:np0005626463.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 23 10:05:34 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:05:34.511 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.619s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 10:05:34 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "a87f3747-06f6-4188-82ee-060b8ce9fc02", "auth_id": "alice bob", "tenant_id": "b8a78bca43aa415e9b740fe00d08afee", "access_level": "r", "format": "json"}]: dispatch
Feb 23 10:05:34 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Feb 23 10:05:34 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow r pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} : dispatch
Feb 23 10:05:34 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow r pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} : dispatch
Feb 23 10:05:34 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow r pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"}]': finished
Feb 23 10:05:34 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.106:0/3581738185' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 10:05:35 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:05:35.246 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:05:35 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:05:35.506 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:05:35 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:05:35.507 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:05:35 np0005626463.localdomain ceph-mon[294160]: pgmap v597: 177 pgs: 177 active+clean; 217 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 341 B/s rd, 110 KiB/s wr, 10 op/s
Feb 23 10:05:35 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Feb 23 10:05:35 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 10:05:36 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:05:36.053 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:05:36 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e245 do_prune osdmap full prune enabled
Feb 23 10:05:36 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 10:05:36 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e246 e246: 6 total, 6 up, 6 in
Feb 23 10:05:36 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e246: 6 total, 6 up, 6 in
Feb 23 10:05:37 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice bob"} v 0)
Feb 23 10:05:37 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Feb 23 10:05:37 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished
Feb 23 10:05:37 np0005626463.localdomain ceph-mds[286877]: mds.mds.np0005626463.qcthuc asok_command: session evict {filters=[auth_name=alice bob,client_metadata.root=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae],prefix=session evict} (starting...)
Feb 23 10:05:37 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e246 do_prune osdmap full prune enabled
Feb 23 10:05:37 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e247 e247: 6 total, 6 up, 6 in
Feb 23 10:05:37 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e247: 6 total, 6 up, 6 in
Feb 23 10:05:37 np0005626463.localdomain ceph-mon[294160]: osdmap e246: 6 total, 6 up, 6 in
Feb 23 10:05:37 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "812d8099-4259-45c6-8802-8b5ec410d596", "format": "json"}]: dispatch
Feb 23 10:05:37 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "812d8099-4259-45c6-8802-8b5ec410d596", "force": true, "format": "json"}]: dispatch
Feb 23 10:05:37 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "a87f3747-06f6-4188-82ee-060b8ce9fc02", "auth_id": "alice bob", "format": "json"}]: dispatch
Feb 23 10:05:37 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Feb 23 10:05:37 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Feb 23 10:05:37 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Feb 23 10:05:37 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished
Feb 23 10:05:37 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "a87f3747-06f6-4188-82ee-060b8ce9fc02", "auth_id": "alice bob", "format": "json"}]: dispatch
Feb 23 10:05:37 np0005626463.localdomain ceph-mon[294160]: pgmap v599: 177 pgs: 177 active+clean; 237 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 8.4 KiB/s rd, 2.2 MiB/s wr, 24 op/s
Feb 23 10:05:37 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.
Feb 23 10:05:37 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.
Feb 23 10:05:37 np0005626463.localdomain systemd[1]: tmp-crun.GXQahf.mount: Deactivated successfully.
Feb 23 10:05:37 np0005626463.localdomain podman[322812]: 2026-02-23 10:05:37.935643109 +0000 UTC m=+0.088213377 container health_status da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 23 10:05:37 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e247 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 23 10:05:37 np0005626463.localdomain podman[322811]: 2026-02-23 10:05:37.979070192 +0000 UTC m=+0.130984130 container health_status 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, vendor=Red Hat, Inc., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.openshift.expose-services=, build-date=2026-02-05T04:57:10Z, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.7, managed_by=edpm_ansible, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, config_id=openstack_network_exporter, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., architecture=x86_64, vcs-type=git, release=1770267347, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=ubi9/ubi-minimal, org.opencontainers.image.created=2026-02-05T04:57:10Z, maintainer=Red Hat, Inc., distribution-scope=public, io.openshift.tags=minimal rhel9)
Feb 23 10:05:37 np0005626463.localdomain podman[322812]: 2026-02-23 10:05:37.998546514 +0000 UTC m=+0.151116782 container exec_died da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 23 10:05:38 np0005626463.localdomain systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Deactivated successfully.
Feb 23 10:05:38 np0005626463.localdomain podman[322811]: 2026-02-23 10:05:38.021347189 +0000 UTC m=+0.173261147 container exec_died 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-02-05T04:57:10Z, config_id=openstack_network_exporter, io.buildah.version=1.33.7, release=1770267347, container_name=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, org.opencontainers.image.created=2026-02-05T04:57:10Z, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9/ubi-minimal, io.openshift.tags=minimal rhel9, version=9.7)
Feb 23 10:05:38 np0005626463.localdomain systemd[1]: 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.service: Deactivated successfully.
Feb 23 10:05:38 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:05:38.055 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:05:38 np0005626463.localdomain ceph-mon[294160]: osdmap e247: 6 total, 6 up, 6 in
Feb 23 10:05:39 np0005626463.localdomain podman[242954]: time="2026-02-23T10:05:39Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 23 10:05:39 np0005626463.localdomain podman[242954]: @ - - [23/Feb/2026:10:05:39 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 158905 "" "Go-http-client/1.1"
Feb 23 10:05:39 np0005626463.localdomain podman[242954]: @ - - [23/Feb/2026:10:05:39 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19314 "" "Go-http-client/1.1"
Feb 23 10:05:39 np0005626463.localdomain dnsmasq[322567]: read /var/lib/neutron/dhcp/1fd1ea36-61b2-4373-a7fa-84d2547a4ab6/addn_hosts - 0 addresses
Feb 23 10:05:39 np0005626463.localdomain dnsmasq-dhcp[322567]: read /var/lib/neutron/dhcp/1fd1ea36-61b2-4373-a7fa-84d2547a4ab6/host
Feb 23 10:05:39 np0005626463.localdomain dnsmasq-dhcp[322567]: read /var/lib/neutron/dhcp/1fd1ea36-61b2-4373-a7fa-84d2547a4ab6/opts
Feb 23 10:05:39 np0005626463.localdomain podman[322873]: 2026-02-23 10:05:39.78558504 +0000 UTC m=+0.069363096 container kill c954a8ff366e93a4167ce730a1032544b49b69c5edd113d980b7645d9211801c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1fd1ea36-61b2-4373-a7fa-84d2547a4ab6, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team)
Feb 23 10:05:39 np0005626463.localdomain ceph-mon[294160]: pgmap v601: 177 pgs: 177 active+clean; 237 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 10 KiB/s rd, 2.6 MiB/s wr, 21 op/s
Feb 23 10:05:40 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T10:05:40Z|00372|binding|INFO|Releasing lport 97945a14-af1a-4a6f-ba38-cf9a96201926 from this chassis (sb_readonly=0)
Feb 23 10:05:40 np0005626463.localdomain kernel: device tap97945a14-af left promiscuous mode
Feb 23 10:05:40 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T10:05:40Z|00373|binding|INFO|Setting lport 97945a14-af1a-4a6f-ba38-cf9a96201926 down in Southbound
Feb 23 10:05:40 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:05:40.037 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:05:40 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 10:05:40.045 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005626463.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpfb23302c-55c1-5de0-badf-4fc1ff22837a-1fd1ea36-61b2-4373-a7fa-84d2547a4ab6', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1fd1ea36-61b2-4373-a7fa-84d2547a4ab6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1e7be559e0474f2f877f7adf99941064', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005626463.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=70ad7e38-c8d0-4a81-8fc5-3d8731b8b543, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f808c075610>], logical_port=97945a14-af1a-4a6f-ba38-cf9a96201926) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f808c075610>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 10:05:40 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 10:05:40.047 163572 INFO neutron.agent.ovn.metadata.agent [-] Port 97945a14-af1a-4a6f-ba38-cf9a96201926 in datapath 1fd1ea36-61b2-4373-a7fa-84d2547a4ab6 unbound from our chassis
Feb 23 10:05:40 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 10:05:40.050 163572 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 1fd1ea36-61b2-4373-a7fa-84d2547a4ab6, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 23 10:05:40 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 10:05:40.051 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[b8fc5c99-fa1b-47e5-b8dd-3baf768006a7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 10:05:40 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:05:40.054 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:05:40 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:05:40.055 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:05:40 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:05:40.060 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:05:40 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:05:40.249 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:05:40 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} v 0)
Feb 23 10:05:40 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} : dispatch
Feb 23 10:05:40 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"}]': finished
Feb 23 10:05:40 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "a87f3747-06f6-4188-82ee-060b8ce9fc02", "auth_id": "bob", "tenant_id": "b8a78bca43aa415e9b740fe00d08afee", "access_level": "rw", "format": "json"}]: dispatch
Feb 23 10:05:40 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.bob", "format": "json"} : dispatch
Feb 23 10:05:40 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} : dispatch
Feb 23 10:05:40 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} : dispatch
Feb 23 10:05:40 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"}]': finished
Feb 23 10:05:41 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T10:05:41Z|00374|binding|INFO|Releasing lport 4143c8ea-7577-4792-9744-bcff90eb20f2 from this chassis (sb_readonly=0)
Feb 23 10:05:41 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:05:41.173 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:05:41 np0005626463.localdomain dnsmasq[322567]: exiting on receipt of SIGTERM
Feb 23 10:05:41 np0005626463.localdomain podman[322914]: 2026-02-23 10:05:41.620130363 +0000 UTC m=+0.066166246 container kill c954a8ff366e93a4167ce730a1032544b49b69c5edd113d980b7645d9211801c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1fd1ea36-61b2-4373-a7fa-84d2547a4ab6, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, maintainer=OpenStack Kubernetes Operator team)
Feb 23 10:05:41 np0005626463.localdomain systemd[1]: libpod-c954a8ff366e93a4167ce730a1032544b49b69c5edd113d980b7645d9211801c.scope: Deactivated successfully.
Feb 23 10:05:41 np0005626463.localdomain podman[322927]: 2026-02-23 10:05:41.701192209 +0000 UTC m=+0.061866264 container died c954a8ff366e93a4167ce730a1032544b49b69c5edd113d980b7645d9211801c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1fd1ea36-61b2-4373-a7fa-84d2547a4ab6, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0)
Feb 23 10:05:41 np0005626463.localdomain podman[322927]: 2026-02-23 10:05:41.74556717 +0000 UTC m=+0.106241195 container cleanup c954a8ff366e93a4167ce730a1032544b49b69c5edd113d980b7645d9211801c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1fd1ea36-61b2-4373-a7fa-84d2547a4ab6, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb)
Feb 23 10:05:41 np0005626463.localdomain systemd[1]: libpod-conmon-c954a8ff366e93a4167ce730a1032544b49b69c5edd113d980b7645d9211801c.scope: Deactivated successfully.
Feb 23 10:05:41 np0005626463.localdomain sshd[322951]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 10:05:41 np0005626463.localdomain sshd[322951]: error: kex_exchange_identification: Connection closed by remote host
Feb 23 10:05:41 np0005626463.localdomain sshd[322951]: Connection closed by 120.157.18.252 port 54516
Feb 23 10:05:41 np0005626463.localdomain ceph-mon[294160]: pgmap v602: 177 pgs: 177 active+clean; 230 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 38 KiB/s rd, 2.7 MiB/s wr, 63 op/s
Feb 23 10:05:41 np0005626463.localdomain podman[322928]: 2026-02-23 10:05:41.828493345 +0000 UTC m=+0.184397702 container remove c954a8ff366e93a4167ce730a1032544b49b69c5edd113d980b7645d9211801c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1fd1ea36-61b2-4373-a7fa-84d2547a4ab6, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260216, tcib_build_tag=8419493e1fd846703d277695e03fc5eb)
Feb 23 10:05:41 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 10:05:41.862 265541 INFO neutron.agent.dhcp.agent [None req-00909490-9987-4e52-b130-5408b2448de2 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 23 10:05:41 np0005626463.localdomain neutron_dhcp_agent[265537]: 2026-02-23 10:05:41.865 265541 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Feb 23 10:05:42 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-0cf8b8c1279fdb39306472f422941fc3ae1cade469254f26899cabd285efde04-merged.mount: Deactivated successfully.
Feb 23 10:05:42 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c954a8ff366e93a4167ce730a1032544b49b69c5edd113d980b7645d9211801c-userdata-shm.mount: Deactivated successfully.
Feb 23 10:05:42 np0005626463.localdomain systemd[1]: run-netns-qdhcp\x2d1fd1ea36\x2d61b2\x2d4373\x2da7fa\x2d84d2547a4ab6.mount: Deactivated successfully.
Feb 23 10:05:42 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e247 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 23 10:05:43 np0005626463.localdomain openstack_network_exporter[245358]: ERROR   10:05:43 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 23 10:05:43 np0005626463.localdomain openstack_network_exporter[245358]: 
Feb 23 10:05:43 np0005626463.localdomain openstack_network_exporter[245358]: ERROR   10:05:43 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 23 10:05:43 np0005626463.localdomain openstack_network_exporter[245358]: 
Feb 23 10:05:43 np0005626463.localdomain ceph-mon[294160]: pgmap v603: 177 pgs: 177 active+clean; 218 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 40 KiB/s rd, 2.7 MiB/s wr, 67 op/s
Feb 23 10:05:44 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T10:05:44Z|00375|binding|INFO|Releasing lport 4143c8ea-7577-4792-9744-bcff90eb20f2 from this chassis (sb_readonly=0)
Feb 23 10:05:44 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:05:44.252 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:05:44 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Feb 23 10:05:44 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 10:05:45 np0005626463.localdomain ceph-mon[294160]: from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 10:05:45 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:05:45.252 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:05:46 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "0f0c4f00-1527-445a-bf94-f839c0a6f476", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 23 10:05:46 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "0f0c4f00-1527-445a-bf94-f839c0a6f476", "format": "json"}]: dispatch
Feb 23 10:05:46 np0005626463.localdomain ceph-mon[294160]: pgmap v604: 177 pgs: 177 active+clean; 218 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 28 KiB/s rd, 1.5 MiB/s wr, 44 op/s
Feb 23 10:05:47 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth caps", "entity": "client.bob", "caps": ["mon", "allow r", "mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae,allow rw path=/volumes/_nogroup/0f0c4f00-1527-445a-bf94-f839c0a6f476/15dadb0f-d665-4d06-aa15-aab1aaac7a26", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02,allow rw pool=manila_data namespace=fsvolumens_0f0c4f00-1527-445a-bf94-f839c0a6f476"]} v 0)
Feb 23 10:05:47 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth caps", "entity": "client.bob", "caps": ["mon", "allow r", "mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae,allow rw path=/volumes/_nogroup/0f0c4f00-1527-445a-bf94-f839c0a6f476/15dadb0f-d665-4d06-aa15-aab1aaac7a26", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02,allow rw pool=manila_data namespace=fsvolumens_0f0c4f00-1527-445a-bf94-f839c0a6f476"]} : dispatch
Feb 23 10:05:47 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth caps", "entity": "client.bob", "caps": ["mon", "allow r", "mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae,allow rw path=/volumes/_nogroup/0f0c4f00-1527-445a-bf94-f839c0a6f476/15dadb0f-d665-4d06-aa15-aab1aaac7a26", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02,allow rw pool=manila_data namespace=fsvolumens_0f0c4f00-1527-445a-bf94-f839c0a6f476"]}]': finished
Feb 23 10:05:47 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e247 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 23 10:05:47 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e247 do_prune osdmap full prune enabled
Feb 23 10:05:47 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e248 e248: 6 total, 6 up, 6 in
Feb 23 10:05:48 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e248: 6 total, 6 up, 6 in
Feb 23 10:05:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 10:05:48.564 163572 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 10:05:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 10:05:48.565 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 10:05:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 10:05:48.566 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 10:05:48 np0005626463.localdomain ceph-mon[294160]: pgmap v605: 177 pgs: 177 active+clean; 218 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 24 KiB/s rd, 78 KiB/s wr, 38 op/s
Feb 23 10:05:48 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.bob", "format": "json"} : dispatch
Feb 23 10:05:48 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth caps", "entity": "client.bob", "caps": ["mon", "allow r", "mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae,allow rw path=/volumes/_nogroup/0f0c4f00-1527-445a-bf94-f839c0a6f476/15dadb0f-d665-4d06-aa15-aab1aaac7a26", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02,allow rw pool=manila_data namespace=fsvolumens_0f0c4f00-1527-445a-bf94-f839c0a6f476"]} : dispatch
Feb 23 10:05:48 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth caps", "entity": "client.bob", "caps": ["mon", "allow r", "mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae,allow rw path=/volumes/_nogroup/0f0c4f00-1527-445a-bf94-f839c0a6f476/15dadb0f-d665-4d06-aa15-aab1aaac7a26", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02,allow rw pool=manila_data namespace=fsvolumens_0f0c4f00-1527-445a-bf94-f839c0a6f476"]} : dispatch
Feb 23 10:05:48 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth caps", "entity": "client.bob", "caps": ["mon", "allow r", "mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae,allow rw path=/volumes/_nogroup/0f0c4f00-1527-445a-bf94-f839c0a6f476/15dadb0f-d665-4d06-aa15-aab1aaac7a26", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02,allow rw pool=manila_data namespace=fsvolumens_0f0c4f00-1527-445a-bf94-f839c0a6f476"]}]': finished
Feb 23 10:05:48 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.bob", "format": "json"} : dispatch
Feb 23 10:05:48 np0005626463.localdomain ceph-mon[294160]: osdmap e248: 6 total, 6 up, 6 in
Feb 23 10:05:50 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:05:50.254 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:05:50 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "0f0c4f00-1527-445a-bf94-f839c0a6f476", "auth_id": "bob", "tenant_id": "b8a78bca43aa415e9b740fe00d08afee", "access_level": "rw", "format": "json"}]: dispatch
Feb 23 10:05:50 np0005626463.localdomain ceph-mon[294160]: pgmap v607: 177 pgs: 177 active+clean; 218 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 24 KiB/s rd, 78 KiB/s wr, 38 op/s
Feb 23 10:05:51 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth caps", "entity": "client.bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02"]} v 0)
Feb 23 10:05:51 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth caps", "entity": "client.bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02"]} : dispatch
Feb 23 10:05:51 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth caps", "entity": "client.bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02"]}]': finished
Feb 23 10:05:51 np0005626463.localdomain ceph-mds[286877]: mds.mds.np0005626463.qcthuc asok_command: session evict {filters=[auth_name=bob,client_metadata.root=/volumes/_nogroup/0f0c4f00-1527-445a-bf94-f839c0a6f476/15dadb0f-d665-4d06-aa15-aab1aaac7a26],prefix=session evict} (starting...)
Feb 23 10:05:51 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.
Feb 23 10:05:51 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.
Feb 23 10:05:51 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "0f0c4f00-1527-445a-bf94-f839c0a6f476", "auth_id": "bob", "format": "json"}]: dispatch
Feb 23 10:05:51 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.bob", "format": "json"} : dispatch
Feb 23 10:05:51 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth caps", "entity": "client.bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02"]} : dispatch
Feb 23 10:05:51 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth caps", "entity": "client.bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02"]} : dispatch
Feb 23 10:05:51 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth caps", "entity": "client.bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02"]}]': finished
Feb 23 10:05:51 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "0f0c4f00-1527-445a-bf94-f839c0a6f476", "auth_id": "bob", "format": "json"}]: dispatch
Feb 23 10:05:51 np0005626463.localdomain ceph-mon[294160]: pgmap v608: 177 pgs: 177 active+clean; 218 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 1.3 KiB/s rd, 58 KiB/s wr, 6 op/s
Feb 23 10:05:51 np0005626463.localdomain podman[322957]: 2026-02-23 10:05:51.923958019 +0000 UTC m=+0.092234923 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_controller, tcib_managed=true)
Feb 23 10:05:51 np0005626463.localdomain podman[322957]: 2026-02-23 10:05:51.973266402 +0000 UTC m=+0.141543306 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20260216, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, tcib_build_tag=8419493e1fd846703d277695e03fc5eb)
Feb 23 10:05:51 np0005626463.localdomain systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully.
Feb 23 10:05:52 np0005626463.localdomain podman[322958]: 2026-02-23 10:05:51.975088929 +0000 UTC m=+0.139062980 container health_status bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 23 10:05:52 np0005626463.localdomain podman[322958]: 2026-02-23 10:05:52.058081055 +0000 UTC m=+0.222055096 container exec_died bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 23 10:05:52 np0005626463.localdomain systemd[1]: bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.service: Deactivated successfully.
Feb 23 10:05:52 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e248 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 23 10:05:54 np0005626463.localdomain ceph-mon[294160]: pgmap v609: 177 pgs: 177 active+clean; 218 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 102 B/s rd, 57 KiB/s wr, 4 op/s
Feb 23 10:05:54 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.bob"} v 0)
Feb 23 10:05:54 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.bob"} : dispatch
Feb 23 10:05:54 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth rm", "entity": "client.bob"}]': finished
Feb 23 10:05:54 np0005626463.localdomain ceph-mds[286877]: mds.mds.np0005626463.qcthuc asok_command: session evict {filters=[auth_name=bob,client_metadata.root=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae],prefix=session evict} (starting...)
Feb 23 10:05:55 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.bob", "format": "json"} : dispatch
Feb 23 10:05:55 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.bob"} : dispatch
Feb 23 10:05:55 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.bob"} : dispatch
Feb 23 10:05:55 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth rm", "entity": "client.bob"}]': finished
Feb 23 10:05:55 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:05:55.082 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:05:55 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 10:05:55.081 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=25, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '22:68:bc', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'c6:19:65:94:49:af'}, ipsec=False) old=SB_Global(nb_cfg=24) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 10:05:55 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 10:05:55.083 163572 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 23 10:05:55 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:05:55.256 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:05:56 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "a87f3747-06f6-4188-82ee-060b8ce9fc02", "auth_id": "bob", "format": "json"}]: dispatch
Feb 23 10:05:56 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "a87f3747-06f6-4188-82ee-060b8ce9fc02", "auth_id": "bob", "format": "json"}]: dispatch
Feb 23 10:05:56 np0005626463.localdomain ceph-mon[294160]: pgmap v610: 177 pgs: 177 active+clean; 218 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 102 B/s rd, 57 KiB/s wr, 4 op/s
Feb 23 10:05:57 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.
Feb 23 10:05:57 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.
Feb 23 10:05:57 np0005626463.localdomain podman[323006]: 2026-02-23 10:05:57.960334919 +0000 UTC m=+0.132318642 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, io.buildah.version=1.43.0)
Feb 23 10:05:57 np0005626463.localdomain systemd[1]: tmp-crun.qdUabg.mount: Deactivated successfully.
Feb 23 10:05:57 np0005626463.localdomain podman[323007]: 2026-02-23 10:05:57.98041423 +0000 UTC m=+0.149898055 container health_status be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, org.label-schema.build-date=20260216, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Feb 23 10:05:57 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e248 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 23 10:05:57 np0005626463.localdomain podman[323007]: 2026-02-23 10:05:57.989196602 +0000 UTC m=+0.158680387 container exec_died be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_managed=true, io.buildah.version=1.43.0, managed_by=edpm_ansible, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, tcib_build_tag=8419493e1fd846703d277695e03fc5eb)
Feb 23 10:05:58 np0005626463.localdomain systemd[1]: be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.service: Deactivated successfully.
Feb 23 10:05:58 np0005626463.localdomain podman[323006]: 2026-02-23 10:05:58.047959238 +0000 UTC m=+0.219942991 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.43.0, managed_by=edpm_ansible, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true)
Feb 23 10:05:58 np0005626463.localdomain systemd[1]: 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully.
Feb 23 10:05:58 np0005626463.localdomain ceph-mon[294160]: pgmap v611: 177 pgs: 177 active+clean; 218 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 76 KiB/s wr, 5 op/s
Feb 23 10:05:58 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "0f0c4f00-1527-445a-bf94-f839c0a6f476", "format": "json"}]: dispatch
Feb 23 10:05:58 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "0f0c4f00-1527-445a-bf94-f839c0a6f476", "force": true, "format": "json"}]: dispatch
Feb 23 10:05:59 np0005626463.localdomain ceph-mon[294160]: pgmap v612: 177 pgs: 177 active+clean; 218 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 67 KiB/s wr, 5 op/s
Feb 23 10:06:00 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:06:00.259 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:06:02 np0005626463.localdomain ceph-mon[294160]: pgmap v613: 177 pgs: 177 active+clean; 219 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 255 B/s rd, 78 KiB/s wr, 6 op/s
Feb 23 10:06:02 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "a87f3747-06f6-4188-82ee-060b8ce9fc02", "format": "json"}]: dispatch
Feb 23 10:06:02 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "a87f3747-06f6-4188-82ee-060b8ce9fc02", "force": true, "format": "json"}]: dispatch
Feb 23 10:06:02 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e248 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 23 10:06:03 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 10:06:03.085 163572 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=96b5bb93-7341-4ce6-9b93-6a5de566c711, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '25'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 10:06:04 np0005626463.localdomain ceph-mon[294160]: pgmap v614: 177 pgs: 177 active+clean; 219 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 255 B/s rd, 49 KiB/s wr, 4 op/s
Feb 23 10:06:05 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:06:05.261 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 10:06:06 np0005626463.localdomain ceph-mgr[288036]: client.0 ms_handle_reset on v2:172.18.0.107:6810/2356945423
Feb 23 10:06:06 np0005626463.localdomain ceph-mon[294160]: pgmap v615: 177 pgs: 177 active+clean; 219 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 255 B/s rd, 49 KiB/s wr, 4 op/s
Feb 23 10:06:07 np0005626463.localdomain ceph-mon[294160]: pgmap v616: 177 pgs: 177 active+clean; 219 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 426 B/s rd, 73 KiB/s wr, 5 op/s
Feb 23 10:06:07 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e248 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 23 10:06:08 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.
Feb 23 10:06:08 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.
Feb 23 10:06:08 np0005626463.localdomain podman[323044]: 2026-02-23 10:06:08.929137465 +0000 UTC m=+0.097491836 container health_status 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, org.opencontainers.image.created=2026-02-05T04:57:10Z, maintainer=Red Hat, Inc., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, name=ubi9/ubi-minimal, vendor=Red Hat, Inc., io.buildah.version=1.33.7, build-date=2026-02-05T04:57:10Z, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1770267347, io.openshift.expose-services=, version=9.7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, architecture=x86_64, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, config_id=openstack_network_exporter)
Feb 23 10:06:08 np0005626463.localdomain podman[323044]: 2026-02-23 10:06:08.96586379 +0000 UTC m=+0.134218151 container exec_died 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, config_id=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9/ubi-minimal, com.redhat.component=ubi9-minimal-container, vcs-type=git, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, container_name=openstack_network_exporter, managed_by=edpm_ansible, release=1770267347, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, build-date=2026-02-05T04:57:10Z, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, distribution-scope=public, io.buildah.version=1.33.7, architecture=x86_64, org.opencontainers.image.created=2026-02-05T04:57:10Z, version=9.7)
Feb 23 10:06:08 np0005626463.localdomain systemd[1]: 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.service: Deactivated successfully.
Feb 23 10:06:08 np0005626463.localdomain systemd[1]: tmp-crun.98HFpw.mount: Deactivated successfully.
Feb 23 10:06:09 np0005626463.localdomain podman[323045]: 2026-02-23 10:06:09.002684268 +0000 UTC m=+0.161255926 container health_status da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 23 10:06:09 np0005626463.localdomain podman[323045]: 2026-02-23 10:06:09.010534301 +0000 UTC m=+0.169105989 container exec_died da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 23 10:06:09 np0005626463.localdomain systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Deactivated successfully.
Feb 23 10:06:09 np0005626463.localdomain podman[242954]: time="2026-02-23T10:06:09Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 23 10:06:09 np0005626463.localdomain podman[242954]: @ - - [23/Feb/2026:10:06:09 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 157081 "" "Go-http-client/1.1"
Feb 23 10:06:09 np0005626463.localdomain podman[242954]: @ - - [23/Feb/2026:10:06:09 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18830 "" "Go-http-client/1.1"
Feb 23 10:06:09 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Feb 23 10:06:09 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 10:06:10 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:06:10.264 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 10:06:10 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:06:10.266 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 10:06:10 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:06:10.266 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 10:06:10 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:06:10.266 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 10:06:10 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:06:10.267 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 10:06:10 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:06:10.269 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:06:10 np0005626463.localdomain ceph-mon[294160]: pgmap v617: 177 pgs: 177 active+clean; 219 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 426 B/s rd, 38 KiB/s wr, 3 op/s
Feb 23 10:06:10 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "068d76ef-57bb-47e8-bc0e-5cafb295f112", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 23 10:06:10 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "068d76ef-57bb-47e8-bc0e-5cafb295f112", "format": "json"}]: dispatch
Feb 23 10:06:10 np0005626463.localdomain ceph-mon[294160]: from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 10:06:11 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb 23 10:06:11 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                           ** DB Stats **
                                                           Uptime(secs): 1200.0 total, 600.0 interval
                                                           Cumulative writes: 5590 writes, 38K keys, 5588 commit groups, 1.0 writes per commit group, ingest: 0.06 GB, 0.05 MB/s
                                                           Cumulative WAL: 5590 writes, 5588 syncs, 1.00 writes per sync, written: 0.06 GB, 0.05 MB/s
                                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                           Interval writes: 2419 writes, 11K keys, 2417 commit groups, 1.0 writes per commit group, ingest: 11.37 MB, 0.02 MB/s
                                                           Interval WAL: 2419 writes, 2417 syncs, 1.00 writes per sync, written: 0.01 GB, 0.02 MB/s
                                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                                           
                                                           ** Compaction Stats [default] **
                                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0    143.8      0.33              0.13        20    0.016       0      0       0.0       0.0
                                                             L6      1/0   18.56 MB   0.0      0.3     0.0      0.3       0.3      0.0       0.0   6.7    158.1    145.1      2.19              0.90        19    0.115    239K   9915       0.0       0.0
                                                            Sum      1/0   18.56 MB   0.0      0.3     0.0      0.3       0.4      0.1       0.0   7.7    137.5    144.9      2.52              1.02        39    0.065    239K   9915       0.0       0.0
                                                            Int      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0  12.2    146.0    147.5      1.01              0.45        16    0.063    110K   4310       0.0       0.0
                                                           
                                                           ** Compaction Stats [default] **
                                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            Low      0/0    0.00 KB   0.0      0.3     0.0      0.3       0.3      0.0       0.0   0.0    158.1    145.1      2.19              0.90        19    0.115    239K   9915       0.0       0.0
                                                           High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0    145.3      0.32              0.13        19    0.017       0      0       0.0       0.0
                                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.6      0.00              0.00         1    0.003       0      0       0.0       0.0
                                                           
                                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                           
                                                           Uptime(secs): 1200.0 total, 600.0 interval
                                                           Flush(GB): cumulative 0.046, interval 0.012
                                                           AddFile(GB): cumulative 0.000, interval 0.000
                                                           AddFile(Total Files): cumulative 0, interval 0
                                                           AddFile(L0 Files): cumulative 0, interval 0
                                                           AddFile(Keys): cumulative 0, interval 0
                                                           Cumulative compaction: 0.36 GB write, 0.30 MB/s write, 0.34 GB read, 0.29 MB/s read, 2.5 seconds
                                                           Interval compaction: 0.15 GB write, 0.25 MB/s write, 0.14 GB read, 0.24 MB/s read, 1.0 seconds
                                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                           Block cache BinnedLRUCache@0x5609fbab9350#2 capacity: 304.00 MB usage: 43.85 MB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 0 last_secs: 0.00036 secs_since: 0
                                                           Block cache entry stats(count,size,portion): DataBlock(2802,42.29 MB,13.9119%) FilterBlock(39,681.17 KB,0.218818%) IndexBlock(39,915.48 KB,0.294088%) Misc(1,0.00 KB,0%)
                                                           
                                                           ** File Read Latency Histogram By Level [default] **
Feb 23 10:06:11 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #70. Immutable memtables: 0.
Feb 23 10:06:11 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:06:11.475940) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Feb 23 10:06:11 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/flush_job.cc:856] [default] [JOB 41] Flushing memtable with next log file: 70
Feb 23 10:06:11 np0005626463.localdomain ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771841171476048, "job": 41, "event": "flush_started", "num_memtables": 1, "num_entries": 799, "num_deletes": 252, "total_data_size": 588734, "memory_usage": 603560, "flush_reason": "Manual Compaction"}
Feb 23 10:06:11 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/flush_job.cc:885] [default] [JOB 41] Level-0 flush table #71: started
Feb 23 10:06:11 np0005626463.localdomain ceph-mon[294160]: pgmap v618: 177 pgs: 177 active+clean; 219 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 426 B/s rd, 47 KiB/s wr, 3 op/s
Feb 23 10:06:11 np0005626463.localdomain ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771841171484443, "cf_name": "default", "job": 41, "event": "table_file_creation", "file_number": 71, "file_size": 576994, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 38017, "largest_seqno": 38815, "table_properties": {"data_size": 573110, "index_size": 1611, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1221, "raw_key_size": 10096, "raw_average_key_size": 20, "raw_value_size": 564774, "raw_average_value_size": 1166, "num_data_blocks": 71, "num_entries": 484, "num_filter_entries": 484, "num_deletions": 252, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771841133, "oldest_key_time": 1771841133, "file_creation_time": 1771841171, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4cfd6c8f-aafa-4003-b2f6-d22c49635dd4", "db_session_id": "66DAQ76CBLV8DSGL8JC7", "orig_file_number": 71, "seqno_to_time_mapping": "N/A"}}
Feb 23 10:06:11 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 41] Flush lasted 8560 microseconds, and 3257 cpu microseconds.
Feb 23 10:06:11 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 23 10:06:11 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:06:11.484513) [db/flush_job.cc:967] [default] [JOB 41] Level-0 flush table #71: 576994 bytes OK
Feb 23 10:06:11 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:06:11.484538) [db/memtable_list.cc:519] [default] Level-0 commit table #71 started
Feb 23 10:06:11 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:06:11.489200) [db/memtable_list.cc:722] [default] Level-0 commit table #71: memtable #1 done
Feb 23 10:06:11 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:06:11.489222) EVENT_LOG_v1 {"time_micros": 1771841171489215, "job": 41, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Feb 23 10:06:11 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:06:11.489251) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Feb 23 10:06:11 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 41] Try to delete WAL files size 584545, prev total WAL file size 584545, number of live WAL files 2.
Feb 23 10:06:11 np0005626463.localdomain ceph-mon[294160]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005626463/store.db/000067.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 23 10:06:11 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:06:11.489974) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003132383031' seq:72057594037927935, type:22 .. '7061786F73003133303533' seq:0, type:0; will stop at (end)
Feb 23 10:06:11 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 42] Compacting 1@0 + 1@6 files to L6, score -1.00
Feb 23 10:06:11 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 41 Base level 0, inputs: [71(563KB)], [69(18MB)]
Feb 23 10:06:11 np0005626463.localdomain ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771841171490039, "job": 42, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [71], "files_L6": [69], "score": -1, "input_data_size": 20039711, "oldest_snapshot_seqno": -1}
Feb 23 10:06:11 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 42] Generated table #72: 14217 keys, 18348228 bytes, temperature: kUnknown
Feb 23 10:06:11 np0005626463.localdomain ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771841171602284, "cf_name": "default", "job": 42, "event": "table_file_creation", "file_number": 72, "file_size": 18348228, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 18265588, "index_size": 45995, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 35589, "raw_key_size": 383010, "raw_average_key_size": 26, "raw_value_size": 18022353, "raw_average_value_size": 1267, "num_data_blocks": 1708, "num_entries": 14217, "num_filter_entries": 14217, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771839971, "oldest_key_time": 0, "file_creation_time": 1771841171, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4cfd6c8f-aafa-4003-b2f6-d22c49635dd4", "db_session_id": "66DAQ76CBLV8DSGL8JC7", "orig_file_number": 72, "seqno_to_time_mapping": "N/A"}}
Feb 23 10:06:11 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 23 10:06:11 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:06:11.602613) [db/compaction/compaction_job.cc:1663] [default] [JOB 42] Compacted 1@0 + 1@6 files to L6 => 18348228 bytes
Feb 23 10:06:11 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:06:11.604527) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 178.4 rd, 163.3 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.6, 18.6 +0.0 blob) out(17.5 +0.0 blob), read-write-amplify(66.5) write-amplify(31.8) OK, records in: 14744, records dropped: 527 output_compression: NoCompression
Feb 23 10:06:11 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:06:11.604556) EVENT_LOG_v1 {"time_micros": 1771841171604543, "job": 42, "event": "compaction_finished", "compaction_time_micros": 112359, "compaction_time_cpu_micros": 51392, "output_level": 6, "num_output_files": 1, "total_output_size": 18348228, "num_input_records": 14744, "num_output_records": 14217, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Feb 23 10:06:11 np0005626463.localdomain ceph-mon[294160]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005626463/store.db/000071.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 23 10:06:11 np0005626463.localdomain ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771841171604785, "job": 42, "event": "table_file_deletion", "file_number": 71}
Feb 23 10:06:11 np0005626463.localdomain ceph-mon[294160]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005626463/store.db/000069.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 23 10:06:11 np0005626463.localdomain ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771841171607500, "job": 42, "event": "table_file_deletion", "file_number": 69}
Feb 23 10:06:11 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:06:11.489857) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 23 10:06:11 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:06:11.607650) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 23 10:06:11 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:06:11.607659) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 23 10:06:11 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:06:11.607664) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 23 10:06:11 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:06:11.607667) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 23 10:06:11 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:06:11.607671) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 23 10:06:12 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e248 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 23 10:06:13 np0005626463.localdomain openstack_network_exporter[245358]: ERROR   10:06:13 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 23 10:06:13 np0005626463.localdomain openstack_network_exporter[245358]: 
Feb 23 10:06:13 np0005626463.localdomain openstack_network_exporter[245358]: ERROR   10:06:13 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 23 10:06:13 np0005626463.localdomain openstack_network_exporter[245358]: 
Feb 23 10:06:13 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Feb 23 10:06:13 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 10:06:14 np0005626463.localdomain ceph-mon[294160]: pgmap v619: 177 pgs: 177 active+clean; 219 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 170 B/s rd, 33 KiB/s wr, 2 op/s
Feb 23 10:06:14 np0005626463.localdomain ceph-mon[294160]: from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 10:06:14 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T10:06:14Z|00376|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Feb 23 10:06:15 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "9ee6a803-0cca-4521-9b16-39980c0fff51", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 23 10:06:15 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "9ee6a803-0cca-4521-9b16-39980c0fff51", "format": "json"}]: dispatch
Feb 23 10:06:15 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:06:15.269 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:06:16 np0005626463.localdomain ceph-mon[294160]: pgmap v620: 177 pgs: 177 active+clean; 219 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 170 B/s rd, 32 KiB/s wr, 2 op/s
Feb 23 10:06:17 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Feb 23 10:06:17 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 10:06:17 np0005626463.localdomain ceph-mon[294160]: from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 10:06:18 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e248 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 23 10:06:18 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "b02fa7b7-50ec-418a-bd71-13045a2641bc", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 23 10:06:18 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "b02fa7b7-50ec-418a-bd71-13045a2641bc", "format": "json"}]: dispatch
Feb 23 10:06:18 np0005626463.localdomain ceph-mon[294160]: pgmap v621: 177 pgs: 177 active+clean; 219 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 170 B/s rd, 51 KiB/s wr, 3 op/s
Feb 23 10:06:18 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/923462755' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 23 10:06:18 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/923462755' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 23 10:06:19 np0005626463.localdomain ceph-mon[294160]: pgmap v622: 177 pgs: 177 active+clean; 219 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 27 KiB/s wr, 1 op/s
Feb 23 10:06:20 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:06:20.272 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 10:06:20 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:06:20.273 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:06:20 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:06:20.273 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 10:06:20 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:06:20.273 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 10:06:20 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:06:20.274 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 10:06:20 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Feb 23 10:06:20 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 10:06:20 np0005626463.localdomain ceph-mon[294160]: from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 10:06:21 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "35bee8f5-0cb3-465a-86d6-ab23214f69ab", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 23 10:06:21 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "35bee8f5-0cb3-465a-86d6-ab23214f69ab", "format": "json"}]: dispatch
Feb 23 10:06:21 np0005626463.localdomain ceph-mon[294160]: pgmap v623: 177 pgs: 177 active+clean; 219 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 51 KiB/s wr, 3 op/s
Feb 23 10:06:22 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.
Feb 23 10:06:22 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.
Feb 23 10:06:22 np0005626463.localdomain systemd[1]: tmp-crun.79pf9O.mount: Deactivated successfully.
Feb 23 10:06:22 np0005626463.localdomain podman[323088]: 2026-02-23 10:06:22.916587597 +0000 UTC m=+0.089538879 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.43.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.schema-version=1.0, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 23 10:06:22 np0005626463.localdomain podman[323088]: 2026-02-23 10:06:22.980328678 +0000 UTC m=+0.153279950 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, io.buildah.version=1.43.0, managed_by=edpm_ansible, config_id=ovn_controller, org.label-schema.schema-version=1.0)
Feb 23 10:06:22 np0005626463.localdomain systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully.
Feb 23 10:06:22 np0005626463.localdomain podman[323089]: 2026-02-23 10:06:22.993930348 +0000 UTC m=+0.162150294 container health_status bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 23 10:06:23 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e248 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 23 10:06:23 np0005626463.localdomain podman[323089]: 2026-02-23 10:06:23.02604364 +0000 UTC m=+0.194263616 container exec_died bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Feb 23 10:06:23 np0005626463.localdomain systemd[1]: bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.service: Deactivated successfully.
Feb 23 10:06:24 np0005626463.localdomain ceph-mon[294160]: pgmap v624: 177 pgs: 177 active+clean; 219 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 42 KiB/s wr, 2 op/s
Feb 23 10:06:25 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "35bee8f5-0cb3-465a-86d6-ab23214f69ab", "format": "json"}]: dispatch
Feb 23 10:06:25 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "35bee8f5-0cb3-465a-86d6-ab23214f69ab", "force": true, "format": "json"}]: dispatch
Feb 23 10:06:25 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:06:25.274 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:06:25 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:06:25.276 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:06:26 np0005626463.localdomain ceph-mon[294160]: pgmap v625: 177 pgs: 177 active+clean; 219 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 42 KiB/s wr, 2 op/s
Feb 23 10:06:27 np0005626463.localdomain ceph-mon[294160]: pgmap v626: 177 pgs: 177 active+clean; 220 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 66 KiB/s wr, 3 op/s
Feb 23 10:06:28 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e248 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 23 10:06:28 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:06:28.054 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:06:28 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:06:28.054 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 23 10:06:28 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.
Feb 23 10:06:28 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.
Feb 23 10:06:28 np0005626463.localdomain podman[323137]: 2026-02-23 10:06:28.915595091 +0000 UTC m=+0.081494421 container health_status be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute)
Feb 23 10:06:28 np0005626463.localdomain podman[323137]: 2026-02-23 10:06:28.956250538 +0000 UTC m=+0.122149868 container exec_died be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Feb 23 10:06:28 np0005626463.localdomain systemd[1]: be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.service: Deactivated successfully.
Feb 23 10:06:29 np0005626463.localdomain podman[323136]: 2026-02-23 10:06:28.965475963 +0000 UTC m=+0.132824367 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team)
Feb 23 10:06:29 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "b02fa7b7-50ec-418a-bd71-13045a2641bc", "format": "json"}]: dispatch
Feb 23 10:06:29 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "b02fa7b7-50ec-418a-bd71-13045a2641bc", "force": true, "format": "json"}]: dispatch
Feb 23 10:06:29 np0005626463.localdomain podman[323136]: 2026-02-23 10:06:29.049390047 +0000 UTC m=+0.216738421 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_managed=true)
Feb 23 10:06:29 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:06:29.055 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:06:29 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:06:29.055 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 23 10:06:29 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:06:29.056 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 23 10:06:29 np0005626463.localdomain systemd[1]: 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully.
Feb 23 10:06:30 np0005626463.localdomain ceph-mon[294160]: pgmap v627: 177 pgs: 177 active+clean; 220 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 48 KiB/s wr, 2 op/s
Feb 23 10:06:30 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:06:30.276 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:06:30 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:06:30.279 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:06:30 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:06:30.482 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 23 10:06:30 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:06:30.482 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquired lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 23 10:06:30 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:06:30.483 282211 DEBUG nova.network.neutron [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 23 10:06:30 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:06:30.483 282211 DEBUG nova.objects.instance [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c2a7d92b-952f-46a7-8a6a-3322a48fcf4b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 23 10:06:31 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.108:0/4173027308' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 10:06:31 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:06:31.087 282211 DEBUG nova.network.neutron [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Updating instance_info_cache with network_info: [{"id": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "address": "fa:16:3e:a0:9d:00", "network": {"id": "9da5b53d-3184-450f-9a5b-bdba1a6c9f6d", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "37b8098efb0d4ecc90b451a2db0e966f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa27e5011-20", "ovs_interfaceid": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 23 10:06:31 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:06:31.103 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Releasing lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 23 10:06:31 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:06:31.103 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 23 10:06:32 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:06:32.055 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:06:32 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "9ee6a803-0cca-4521-9b16-39980c0fff51", "format": "json"}]: dispatch
Feb 23 10:06:32 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "9ee6a803-0cca-4521-9b16-39980c0fff51", "force": true, "format": "json"}]: dispatch
Feb 23 10:06:32 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.108:0/3574223874' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 10:06:32 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.107:0/3290805769' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 10:06:32 np0005626463.localdomain ceph-mon[294160]: pgmap v628: 177 pgs: 177 active+clean; 220 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 341 B/s rd, 69 KiB/s wr, 4 op/s
Feb 23 10:06:33 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e248 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 23 10:06:33 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.107:0/3505312169' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 10:06:33 np0005626463.localdomain sudo[323174]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 10:06:33 np0005626463.localdomain sudo[323174]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 10:06:33 np0005626463.localdomain sudo[323174]: pam_unix(sudo:session): session closed for user root
Feb 23 10:06:33 np0005626463.localdomain sudo[323192]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 23 10:06:33 np0005626463.localdomain sudo[323192]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 10:06:34 np0005626463.localdomain sudo[323192]: pam_unix(sudo:session): session closed for user root
Feb 23 10:06:34 np0005626463.localdomain ceph-mon[294160]: pgmap v629: 177 pgs: 177 active+clean; 220 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 341 B/s rd, 45 KiB/s wr, 2 op/s
Feb 23 10:06:34 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Feb 23 10:06:34 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 10:06:34 np0005626463.localdomain sudo[323242]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 10:06:34 np0005626463.localdomain sudo[323242]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 10:06:34 np0005626463.localdomain sudo[323242]: pam_unix(sudo:session): session closed for user root
Feb 23 10:06:35 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:06:35.055 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:06:35 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:06:35.078 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 10:06:35 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:06:35.079 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 10:06:35 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:06:35.079 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 10:06:35 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:06:35.079 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Auditing locally available compute resources for np0005626463.localdomain (node: np0005626463.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 23 10:06:35 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:06:35.080 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 10:06:35 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "068d76ef-57bb-47e8-bc0e-5cafb295f112", "format": "json"}]: dispatch
Feb 23 10:06:35 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 10:06:35 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 23 10:06:35 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "068d76ef-57bb-47e8-bc0e-5cafb295f112", "force": true, "format": "json"}]: dispatch
Feb 23 10:06:35 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 10:06:35 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 23 10:06:35 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:06:35.278 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:06:35 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:06:35.283 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:06:35 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 23 10:06:35 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/240892429' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 10:06:35 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:06:35.587 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.507s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 10:06:35 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:06:35.656 282211 DEBUG nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 23 10:06:35 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:06:35.657 282211 DEBUG nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 23 10:06:35 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Feb 23 10:06:35 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 10:06:35 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:06:35.914 282211 WARNING nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 23 10:06:35 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:06:35.916 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Hypervisor/Node resource view: name=np0005626463.localdomain free_ram=11214MB free_disk=41.8366584777832GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 23 10:06:35 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:06:35.916 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 10:06:35 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:06:35.917 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 10:06:35 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:06:35.977 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Instance c2a7d92b-952f-46a7-8a6a-3322a48fcf4b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 23 10:06:35 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:06:35.978 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 23 10:06:35 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:06:35.979 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Final resource view: name=np0005626463.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 23 10:06:36 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:06:36.020 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 10:06:36 np0005626463.localdomain ceph-mon[294160]: pgmap v630: 177 pgs: 177 active+clean; 220 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 341 B/s rd, 45 KiB/s wr, 2 op/s
Feb 23 10:06:36 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.106:0/240892429' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 10:06:36 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 10:06:36 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 23 10:06:36 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/3562749387' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 10:06:36 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:06:36.476 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.456s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 10:06:36 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:06:36.483 282211 DEBUG nova.compute.provider_tree [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Inventory has not changed in ProviderTree for provider: be63d86c-a403-4ec9-a515-07ea2962cb4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 23 10:06:36 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:06:36.505 282211 DEBUG nova.scheduler.client.report [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Inventory has not changed for provider be63d86c-a403-4ec9-a515-07ea2962cb4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 23 10:06:36 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:06:36.508 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Compute_service record updated for np0005626463.localdomain:np0005626463.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 23 10:06:36 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:06:36.508 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.592s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 10:06:37 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.106:0/3562749387' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 10:06:37 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:06:37.504 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:06:37 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:06:37.505 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:06:38 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e248 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 23 10:06:38 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:06:38.054 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:06:38 np0005626463.localdomain ceph-mon[294160]: pgmap v631: 177 pgs: 177 active+clean; 220 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 682 B/s rd, 62 KiB/s wr, 4 op/s
Feb 23 10:06:39 np0005626463.localdomain podman[242954]: time="2026-02-23T10:06:39Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 23 10:06:39 np0005626463.localdomain podman[242954]: @ - - [23/Feb/2026:10:06:39 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 157081 "" "Go-http-client/1.1"
Feb 23 10:06:39 np0005626463.localdomain podman[242954]: @ - - [23/Feb/2026:10:06:39 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18826 "" "Go-http-client/1.1"
Feb 23 10:06:39 np0005626463.localdomain ceph-mon[294160]: pgmap v632: 177 pgs: 177 active+clean; 220 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 682 B/s rd, 38 KiB/s wr, 3 op/s
Feb 23 10:06:39 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.
Feb 23 10:06:39 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.
Feb 23 10:06:39 np0005626463.localdomain podman[323305]: 2026-02-23 10:06:39.916424254 +0000 UTC m=+0.086923108 container health_status da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 23 10:06:39 np0005626463.localdomain podman[323305]: 2026-02-23 10:06:39.954149601 +0000 UTC m=+0.124648445 container exec_died da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 23 10:06:39 np0005626463.localdomain systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Deactivated successfully.
Feb 23 10:06:40 np0005626463.localdomain podman[323304]: 2026-02-23 10:06:39.961420685 +0000 UTC m=+0.132776675 container health_status 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, name=ubi9/ubi-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, io.openshift.expose-services=, container_name=openstack_network_exporter, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.buildah.version=1.33.7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Red Hat, Inc., distribution-scope=public, release=1770267347, managed_by=edpm_ansible, build-date=2026-02-05T04:57:10Z, com.redhat.component=ubi9-minimal-container, org.opencontainers.image.created=2026-02-05T04:57:10Z, version=9.7, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Feb 23 10:06:40 np0005626463.localdomain podman[323304]: 2026-02-23 10:06:40.045218076 +0000 UTC m=+0.216574046 container exec_died 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, build-date=2026-02-05T04:57:10Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, name=ubi9/ubi-minimal, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, version=9.7, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, maintainer=Red Hat, Inc., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.33.7, vcs-type=git, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-02-05T04:57:10Z, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1770267347, io.openshift.tags=minimal rhel9)
Feb 23 10:06:40 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:06:40.054 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:06:40 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:06:40.054 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:06:40 np0005626463.localdomain systemd[1]: 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.service: Deactivated successfully.
Feb 23 10:06:40 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:06:40.285 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 10:06:40 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:06:40.287 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 10:06:40 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:06:40.287 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 10:06:40 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:06:40.287 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 10:06:40 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:06:40.310 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:06:40 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:06:40.311 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 10:06:41 np0005626463.localdomain ceph-mon[294160]: pgmap v633: 177 pgs: 177 active+clean; 220 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 682 B/s rd, 43 KiB/s wr, 4 op/s
Feb 23 10:06:43 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e248 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 23 10:06:43 np0005626463.localdomain openstack_network_exporter[245358]: ERROR   10:06:43 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 23 10:06:43 np0005626463.localdomain openstack_network_exporter[245358]: 
Feb 23 10:06:43 np0005626463.localdomain openstack_network_exporter[245358]: ERROR   10:06:43 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 23 10:06:43 np0005626463.localdomain openstack_network_exporter[245358]: 
Feb 23 10:06:44 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Feb 23 10:06:44 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 10:06:44 np0005626463.localdomain ceph-mon[294160]: pgmap v634: 177 pgs: 177 active+clean; 220 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 341 B/s rd, 22 KiB/s wr, 2 op/s
Feb 23 10:06:44 np0005626463.localdomain ceph-mon[294160]: from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 10:06:44 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Feb 23 10:06:44 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 10:06:45 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "bb91ee88-1d36-4aac-91eb-1313f2ece1d9", "size": 2147483648, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 23 10:06:45 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "bb91ee88-1d36-4aac-91eb-1313f2ece1d9", "format": "json"}]: dispatch
Feb 23 10:06:45 np0005626463.localdomain ceph-mon[294160]: from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 10:06:45 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:06:45.312 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 10:06:45 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:06:45.312 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 10:06:45 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:06:45.313 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 10:06:45 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:06:45.313 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 10:06:45 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:06:45.349 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:06:45 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:06:45.350 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 10:06:46 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "3c02aa0e-f4f3-4457-b672-3178b42295fb", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 23 10:06:46 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "3c02aa0e-f4f3-4457-b672-3178b42295fb", "format": "json"}]: dispatch
Feb 23 10:06:46 np0005626463.localdomain ceph-mon[294160]: pgmap v635: 177 pgs: 177 active+clean; 220 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 341 B/s rd, 22 KiB/s wr, 2 op/s
Feb 23 10:06:47 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "bb91ee88-1d36-4aac-91eb-1313f2ece1d9", "snap_name": "9064a32e-718f-4f83-9c53-c0c3061d4e6f", "format": "json"}]: dispatch
Feb 23 10:06:47 np0005626463.localdomain ceph-mon[294160]: pgmap v636: 177 pgs: 177 active+clean; 220 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 341 B/s rd, 38 KiB/s wr, 3 op/s
Feb 23 10:06:48 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e248 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 23 10:06:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 10:06:48.566 163572 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 10:06:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 10:06:48.567 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 10:06:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 10:06:48.567 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 10:06:49 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "3c02aa0e-f4f3-4457-b672-3178b42295fb", "snap_name": "522218e6-6786-4e43-9e8b-ad59b96bf4ec", "format": "json"}]: dispatch
Feb 23 10:06:50 np0005626463.localdomain ceph-mon[294160]: pgmap v637: 177 pgs: 177 active+clean; 220 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 21 KiB/s wr, 1 op/s
Feb 23 10:06:50 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:06:50.351 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 10:06:50 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:06:50.353 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 10:06:50 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:06:50.353 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 10:06:50 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:06:50.353 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 10:06:50 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:06:50.378 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:06:50 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:06:50.379 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 10:06:50 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:06:50.381 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:06:52 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "bb91ee88-1d36-4aac-91eb-1313f2ece1d9", "snap_name": "9064a32e-718f-4f83-9c53-c0c3061d4e6f_e446a0bc-a113-46f8-af36-02f01906b501", "force": true, "format": "json"}]: dispatch
Feb 23 10:06:52 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "bb91ee88-1d36-4aac-91eb-1313f2ece1d9", "snap_name": "9064a32e-718f-4f83-9c53-c0c3061d4e6f", "force": true, "format": "json"}]: dispatch
Feb 23 10:06:52 np0005626463.localdomain ceph-mon[294160]: pgmap v638: 177 pgs: 177 active+clean; 221 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 43 KiB/s wr, 3 op/s
Feb 23 10:06:52 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "3c02aa0e-f4f3-4457-b672-3178b42295fb", "snap_name": "522218e6-6786-4e43-9e8b-ad59b96bf4ec_ff4ad49f-c9ab-46b4-916a-7d71df69fd70", "force": true, "format": "json"}]: dispatch
Feb 23 10:06:53 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e248 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 23 10:06:53 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "3c02aa0e-f4f3-4457-b672-3178b42295fb", "snap_name": "522218e6-6786-4e43-9e8b-ad59b96bf4ec", "force": true, "format": "json"}]: dispatch
Feb 23 10:06:53 np0005626463.localdomain ceph-mon[294160]: pgmap v639: 177 pgs: 177 active+clean; 221 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 39 KiB/s wr, 3 op/s
Feb 23 10:06:53 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.
Feb 23 10:06:53 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.
Feb 23 10:06:53 np0005626463.localdomain systemd[1]: tmp-crun.G0mkmz.mount: Deactivated successfully.
Feb 23 10:06:53 np0005626463.localdomain podman[323347]: 2026-02-23 10:06:53.92879964 +0000 UTC m=+0.097315030 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.43.0)
Feb 23 10:06:53 np0005626463.localdomain systemd[1]: tmp-crun.oiorvI.mount: Deactivated successfully.
Feb 23 10:06:53 np0005626463.localdomain podman[323348]: 2026-02-23 10:06:53.983904954 +0000 UTC m=+0.148374018 container health_status bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 23 10:06:53 np0005626463.localdomain podman[323348]: 2026-02-23 10:06:53.992139199 +0000 UTC m=+0.156608233 container exec_died bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 23 10:06:54 np0005626463.localdomain systemd[1]: bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.service: Deactivated successfully.
Feb 23 10:06:54 np0005626463.localdomain podman[323347]: 2026-02-23 10:06:54.04814148 +0000 UTC m=+0.216656860 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, org.label-schema.license=GPLv2, container_name=ovn_controller, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, managed_by=edpm_ansible)
Feb 23 10:06:54 np0005626463.localdomain systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully.
Feb 23 10:06:55 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e248 do_prune osdmap full prune enabled
Feb 23 10:06:55 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "bb91ee88-1d36-4aac-91eb-1313f2ece1d9", "format": "json"}]: dispatch
Feb 23 10:06:55 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "bb91ee88-1d36-4aac-91eb-1313f2ece1d9", "force": true, "format": "json"}]: dispatch
Feb 23 10:06:55 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e249 e249: 6 total, 6 up, 6 in
Feb 23 10:06:55 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e249: 6 total, 6 up, 6 in
Feb 23 10:06:55 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:06:55.382 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 10:06:55 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:06:55.384 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 10:06:55 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:06:55.385 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 10:06:55 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:06:55.385 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 10:06:55 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:06:55.420 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:06:55 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:06:55.421 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 10:06:56 np0005626463.localdomain ceph-mon[294160]: osdmap e249: 6 total, 6 up, 6 in
Feb 23 10:06:56 np0005626463.localdomain ceph-mon[294160]: pgmap v641: 177 pgs: 177 active+clean; 221 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 46 KiB/s wr, 3 op/s
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.148 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'name': 'test', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000003', 'OS-EXT-SRV-ATTR:host': 'np0005626463.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '37b8098efb0d4ecc90b451a2db0e966f', 'user_id': 'cb6895487918456aa599ca2f76872d00', 'hostId': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.150 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.181 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.latency volume: 1054797520 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.182 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.latency volume: 21338362 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.184 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3f71ddbd-8d31-402b-a39b-a773cc301534', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1054797520, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T10:06:56.150471', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '6264dea8-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12656.339967575, 'message_signature': '1ac67800e2f6fa5758bbe59247e50fe04c413269d640e052a970e5f477afae0f'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 21338362, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T10:06:56.150471', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '6264f8ca-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12656.339967575, 'message_signature': '1360af677187730c691f903f04dcaa038f87676732384424850bcc43a60bf1c6'}]}, 'timestamp': '2026-02-23 10:06:56.182826', '_unique_id': '843041e16a2f4acab0803fcf6434543f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.184 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.184 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.184 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.184 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.184 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.184 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.184 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.184 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.184 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.184 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.184 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.184 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.184 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.184 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.184 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.184 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.184 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.184 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.184 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.184 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.184 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.184 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.184 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.184 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.184 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.184 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.184 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.184 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.184 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.184 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.184 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.186 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.191 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.193 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2062ca99-eec0-499b-bcc9-ed2e074cc628', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T10:06:56.186715', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '62667268-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12656.376344259, 'message_signature': 'e2d8cfcf9d3a6f18a2d762b83ff7225d5d85be7f4518d5ac8ed56aca50a8b3e2'}]}, 'timestamp': '2026-02-23 10:06:56.192413', '_unique_id': '21bd71e7d55647d5b37ee2bd10514eff'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.193 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.193 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.193 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.193 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.193 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.193 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.193 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.193 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.193 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.193 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.193 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.193 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.193 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.193 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.193 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.193 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.193 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.193 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.193 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.193 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.193 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.193 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.193 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.193 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.193 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.193 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.193 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.193 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.193 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.193 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.193 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.194 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.194 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.requests volume: 1283 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.195 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.196 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '53d2c8c8-09c5-495b-b544-3f7079e3dcb2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1283, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T10:06:56.194773', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '6266e69e-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12656.339967575, 'message_signature': 'dbbf22f2522bacb9d29950d07968a93b44813fe1fc4353f7f1846b91b317ee58'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 
'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T10:06:56.194773', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '6266f9cc-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12656.339967575, 'message_signature': '0dfab44c87dcefc03cc613ed20adef23d94e809432c4eb4b4adeba645643a9d8'}]}, 'timestamp': '2026-02-23 10:06:56.195847', '_unique_id': '7729a0cd57f04cecb185d2dcb884d36e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.196 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.196 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.196 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.196 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.196 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.196 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.196 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.196 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.196 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.196 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.196 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.196 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.196 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.196 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.196 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.196 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.196 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.196 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.196 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.196 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.196 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.196 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.196 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.196 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.196 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.196 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.196 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.196 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.196 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.196 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.196 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.198 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.213 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/cpu volume: 16650000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.215 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4ed6dfed-b26a-4411-af9d-3f7159ac8e25', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 16650000000, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'timestamp': '2026-02-23T10:06:56.198236', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '6269c990-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12656.402913251, 'message_signature': '3f62761134e4663adcd01a8c6282d923e3d69559eb3d3c1a9aa538df31576053'}]}, 'timestamp': '2026-02-23 10:06:56.214337', '_unique_id': '2adef1521ba0498d97f7266543f751d8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.215 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.215 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.215 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.215 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.215 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.215 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.215 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.215 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.215 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.215 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.215 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.215 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.215 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.215 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.215 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.215 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.215 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.215 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.215 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.215 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.215 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.215 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.215 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.215 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.215 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.215 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.215 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.215 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.215 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.215 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.215 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.216 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.216 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.217 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.218 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '97a5b6b2-0ba5-471d-b682-5edebd470bed', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T10:06:56.217005', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '626a49f6-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12656.376344259, 'message_signature': '75abbcaa37df0a76fc9a637b2e7bebcf1ee443e236d84281254080a47e4df2c5'}]}, 'timestamp': '2026-02-23 10:06:56.217585', '_unique_id': '12ddcebe6b274da59c0968e31904318d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.218 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.218 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.218 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.218 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.218 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.218 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.218 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.218 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.218 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.218 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.218 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.218 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.218 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.218 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.218 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.218 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.218 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.218 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.218 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.218 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.218 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.218 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.218 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.218 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.218 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.218 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.218 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.218 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.218 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.218 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.218 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.219 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.220 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.221 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '43ceea4a-b5e1-4c65-b0c5-fa0b0a591344', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T10:06:56.220099', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '626ac20a-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12656.376344259, 'message_signature': '7ec4ac2c1152ba89f4dd246d0a47018d967c401962c4458dc41a2445c828cf91'}]}, 'timestamp': '2026-02-23 10:06:56.220684', '_unique_id': '3336ca5306ee40e599c8256f26d82b5d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.221 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.221 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.221 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.221 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.221 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.221 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.221 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.221 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.221 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.221 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.221 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.221 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.221 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.221 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.221 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.221 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.221 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.221 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.221 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.221 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.221 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.221 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.221 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.221 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.221 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.221 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.221 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.221 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.221 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.221 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.221 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.223 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.223 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.225 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5e9a6908-0a09-46a6-9b43-b97218a9fe54', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T10:06:56.223479', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '626b45e0-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12656.376344259, 'message_signature': 'ec5a1a9563304b42cbb2d0009d69e92ab3ce311fb6ddc781b5114e75696a0d47'}]}, 'timestamp': '2026-02-23 10:06:56.224147', '_unique_id': 'e9856e2cf2e643b2a0b5b93f00fbfece'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.225 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.225 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.225 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.225 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.225 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.225 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.225 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.225 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.225 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.225 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.225 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.225 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.225 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.225 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.225 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.225 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.225 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.225 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.225 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.225 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.225 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.225 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.225 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.225 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.225 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.225 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.225 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.225 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.225 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.225 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.225 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.226 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.226 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.bytes volume: 35597312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.227 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.228 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0999a21c-0e56-4297-9779-c7df1122549d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35597312, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T10:06:56.226485', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '626bbc8c-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12656.339967575, 'message_signature': 'e38f658ae9d81fd094f63689f76ef8587a3e1e3674199d6f661b9ef10c2efba5'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T10:06:56.226485', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '626bd1fe-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12656.339967575, 'message_signature': 'ee686e58d219e8798e6854064e3e0345add842b7f9df94ab182a41b84c35fc69'}]}, 'timestamp': '2026-02-23 10:06:56.227612', '_unique_id': '71c184babfb04ce79485eba8c370955e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.228 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.228 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.228 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.228 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.228 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.228 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.228 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.228 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.228 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.228 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.228 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.228 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.228 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.228 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.228 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.228 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.228 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.228 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.228 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.228 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.228 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.228 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.228 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.228 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.228 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.228 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.228 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.228 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.228 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.228 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.228 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.229 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.229 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.231 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '14f1536b-9a53-46ab-94ac-fe0ac7aff734', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T10:06:56.229938', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '626c4300-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12656.376344259, 'message_signature': '0232e761e5ef2a0e08cb8b647479c0fcfa0ff3b9e1c4aa0e974ec294f62a988d'}]}, 'timestamp': '2026-02-23 10:06:56.230582', '_unique_id': '9eb2b8d39daf4591897506a35415cb6f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.231 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.231 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.231 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.231 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.231 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.231 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.231 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.231 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.231 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.231 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.231 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.231 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.231 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.231 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.231 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.231 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.231 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.231 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.231 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.231 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.231 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.231 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.231 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.231 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.231 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.231 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.231 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.231 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.231 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.231 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.231 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.232 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.232 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.latency volume: 1374424344 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.233 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.latency volume: 89322858 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.235 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2ad4cf7e-cd2f-4e05-acfe-8d5aa09e3587', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1374424344, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T10:06:56.232849', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '626cb54c-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12656.339967575, 'message_signature': '5ea3bdb22ceaad60db5418a32b066daedb2dddc6ef722698a1dcfaaf6ae20ddc'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 89322858, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T10:06:56.232849', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '626ccb54-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12656.339967575, 'message_signature': '2f0d8ac59c3af8eefc2267d65d3d195340c0ec081f0418b93b050f3626eb2a61'}]}, 'timestamp': '2026-02-23 10:06:56.234033', '_unique_id': 'ba09a9e48ec54debbaa745d3a7ec2dde'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.235 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.235 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.235 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.235 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.235 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.235 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.235 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.235 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.235 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.235 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.235 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.235 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.235 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.235 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.235 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.235 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.235 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.235 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.235 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.235 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.235 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.235 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.235 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.235 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.235 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.235 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.235 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.235 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.235 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.235 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.235 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.236 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.236 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.236 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.bytes volume: 6808 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.238 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1ac47e51-75f6-4793-b30f-49f35f594a11', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6808, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T10:06:56.236712', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '626d4d2c-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12656.376344259, 'message_signature': '0cadbbb5ebf60617a30b3839d0f600cb7aa5b0c80c2047f9cf34059aba78ffc9'}]}, 'timestamp': '2026-02-23 10:06:56.237327', '_unique_id': 'd86db4bf474042f79ab129613f4f2305'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.238 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.238 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.238 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.238 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.238 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.238 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.238 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.238 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.238 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.238 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.238 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.238 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.238 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.238 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.238 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.238 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.238 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.238 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.238 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.238 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.238 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.238 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.238 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.238 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.238 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.238 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.238 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.238 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.238 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.238 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.238 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.239 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.250 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.250 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.252 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '85754099-30e2-4321-b388-54e9a2ce21f6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T10:06:56.239699', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '626f5d60-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12656.429211394, 'message_signature': '3fe8cfc956cc837215a24c89cdd002d049c2e379ac29d8393fd286ba91d951ec'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T10:06:56.239699', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '626f72b4-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12656.429211394, 'message_signature': 'a710113c08d3c203c304598db8717352566a45e1c05f57159fda8747dd668c5d'}]}, 'timestamp': '2026-02-23 10:06:56.251406', '_unique_id': '0de56de8be2444f7a49d087d5618f0a9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.252 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.252 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.252 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.252 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.252 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.252 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.252 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.252 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.252 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.252 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.252 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.252 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.252 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.252 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.252 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.252 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.252 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.252 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.252 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.252 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.252 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.252 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.252 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.252 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.252 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.252 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.252 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.252 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.252 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.252 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.252 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.253 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.253 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.255 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ae0af908-05c0-4ae2-a59c-efb3681857b5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T10:06:56.253807', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '626fe8e8-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12656.376344259, 'message_signature': '6b4bdb32c91fe218cf81b71fa5e55dda4280d2f52df135aed7fdd52cce46a85c'}]}, 'timestamp': '2026-02-23 10:06:56.254516', '_unique_id': 'da2c64731a5f4d14bf7b980ec6c74522'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.255 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.255 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.255 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.255 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.255 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.255 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.255 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.255 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.255 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.255 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.255 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.255 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.255 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.255 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.255 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.255 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.255 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.255 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.255 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.255 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.255 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.255 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.255 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.255 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.255 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.255 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.255 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.255 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.255 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.255 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.255 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.256 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.256 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.bytes volume: 397312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.257 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.258 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '32c100cc-360f-436d-a262-92e9c36dd9ee', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 397312, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T10:06:56.256727', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '62705a80-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12656.339967575, 'message_signature': 'ecace1a42ee472b54f1e14d3ea07dddccc34b371e90a895b5e23eac78d7279ef'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 
'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T10:06:56.256727', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '62706d90-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12656.339967575, 'message_signature': '4be21145db42d467795147939922802ea334bccb37dc8e14bde9bbdd7898e835'}]}, 'timestamp': '2026-02-23 10:06:56.257826', '_unique_id': '3ec20b5740ad4921a6d8fc168b6e8469'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.258 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.258 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.258 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.258 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.258 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.258 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.258 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.258 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.258 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.258 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.258 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.258 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.258 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.258 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.258 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.258 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.258 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.258 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.258 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.258 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.258 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.258 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.258 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.258 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.258 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.258 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.258 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.258 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.258 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.258 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.258 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.259 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.260 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/memory.usage volume: 51.72265625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.261 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '47cea142-4bc2-4daf-bddf-7c54903183b5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.72265625, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'timestamp': '2026-02-23T10:06:56.260104', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '6270dc6c-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12656.402913251, 'message_signature': 'd96763d49d4d37360e316bc555317a5aaf9b3112b9ed7fd870cfb4fb64161e58'}]}, 'timestamp': '2026-02-23 10:06:56.260669', '_unique_id': 'f756c8cf879b4e678815bbfd822998e6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.261 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.261 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.261 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.261 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.261 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.261 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.261 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.261 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.261 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.261 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.261 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.261 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.261 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.261 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.261 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.261 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.261 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.261 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.261 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.261 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.261 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.261 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.261 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.261 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.261 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.261 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.261 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.261 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.261 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.261 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.261 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.262 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.263 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.263 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.264 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5725afd8-7ec7-4301-af99-7915ee15a5f0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T10:06:56.262958', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '62714bd4-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12656.429211394, 'message_signature': '717cd262a69cea1ac07cd9d661dee18256619c0dbc3c5b4d38a28e844ab92348'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 
'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T10:06:56.262958', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '62715ee4-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12656.429211394, 'message_signature': '1f92cc7c60ccd2a708c0251c37cfff1b7c7263629fb9f0dbc80486e82d8e547c'}]}, 'timestamp': '2026-02-23 10:06:56.264029', '_unique_id': '414ff6517d1d4b7a932490345a4e7cfa'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.264 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.264 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.264 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.264 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.264 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.264 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.264 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.264 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.264 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.264 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.264 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.264 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.264 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.264 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.264 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.264 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.264 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.264 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.264 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.264 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.264 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.264 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.264 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.264 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.264 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.264 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.264 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.264 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.264 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.264 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.264 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.266 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.266 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.267 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '03b13b39-ac4b-4cfe-a18f-c45b732d55d2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T10:06:56.266311', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '6271cec4-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12656.376344259, 'message_signature': 'e36b9dc39b9f60204224a4547e8c9fc77a9d33529995f1298c30c0d3c2cf4458'}]}, 'timestamp': '2026-02-23 10:06:56.266927', '_unique_id': '75056aa0bf694287948606af0364f677'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.267 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.267 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.267 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.267 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.267 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.267 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.267 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.267 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.267 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.267 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.267 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.267 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.267 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.267 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.267 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.267 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.267 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.267 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.267 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.267 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.267 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.267 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.267 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.267 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.267 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.267 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.267 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.267 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.267 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.267 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.267 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.269 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.269 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.269 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.269 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.270 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.271 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f85a8899-0999-4704-9d27-f07eacc12479', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T10:06:56.269676', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '627254a2-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12656.339967575, 'message_signature': '674869070745e6430dbd0e64ff02112e99b02139ec806867659421990ac24f57'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T10:06:56.269676', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '62726712-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12656.339967575, 'message_signature': '541592fc9cd45f89cf55d56c9368ef07670f1e85bb26e90c9d4ed385d70d4b77'}]}, 'timestamp': '2026-02-23 10:06:56.270730', '_unique_id': 'dd2b0fd37e9346db845376a2820a8d28'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.271 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.271 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.271 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.271 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.271 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.271 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.271 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.271 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.271 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.271 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.271 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.271 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.271 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.271 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.271 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.271 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.271 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.271 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.271 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.271 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.271 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.271 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.271 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.271 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.271 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.271 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.271 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.271 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.271 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.271 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.271 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.273 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.273 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.274 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.275 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b84f8198-ec3f-4927-a758-64bc91ea2b61', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T10:06:56.273750', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '6272f42a-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12656.429211394, 'message_signature': 'e35c264b881158fd5babe1f1146ceddeacf6d0800fb7eaf7e1d151d5da6d485e'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T10:06:56.273750', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '6273047e-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12656.429211394, 'message_signature': '6733d3b20ccda55fdc71b22bd97e795b8aac34642afe729bce589fb46c258a92'}]}, 'timestamp': '2026-02-23 10:06:56.274738', '_unique_id': 'c9b82f73097348e7a99208189cb03bcc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.275 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.275 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.275 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.275 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.275 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.275 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.275 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.275 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.275 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.275 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.275 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.275 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.275 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.275 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.275 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.275 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.275 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.275 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.275 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.275 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.275 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.275 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.275 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.275 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.275 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.275 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.275 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.275 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.275 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.275 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.275 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.276 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.276 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.277 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '334fc1f8-3000-4afb-900d-7c3bc56f2190', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T10:06:56.276154', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '62734ad8-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12656.376344259, 'message_signature': '29b19ebc690dccdcf4970af27ccbb041b056d45f637cd794744d29d3b0cfd3e4'}]}, 'timestamp': '2026-02-23 10:06:56.276528', '_unique_id': 'f62748a750c84151b6c4383baa8e70d5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.277 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.277 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.277 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.277 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.277 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.277 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.277 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.277 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.277 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.277 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.277 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.277 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.277 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.277 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.277 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.277 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.277 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.277 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.277 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.277 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.277 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.277 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.277 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.277 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.277 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.277 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.277 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.277 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.277 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.277 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.277 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.277 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.277 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.278 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5de34e47-a66b-469f-a41e-ccc0146da6ca', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T10:06:56.277916', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '62738fe8-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12656.376344259, 'message_signature': '647b81063d54a4f1ec38daf4a8e5da9ab35b7a0d3f813e3286997c7804f79a63'}]}, 'timestamp': '2026-02-23 10:06:56.278290', '_unique_id': '8ea38ff0f2f943a7b37d4d088636737a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.278 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.278 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.278 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.278 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.278 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.278 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.278 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.278 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.278 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.278 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.278 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.278 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.278 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.278 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.278 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.278 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.278 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.278 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.278 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.278 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.278 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.278 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.278 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.278 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.278 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.278 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.278 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.278 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.278 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.278 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:06:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.278 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:06:57 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Feb 23 10:06:57 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 10:06:58 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e249 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 23 10:06:58 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e249 do_prune osdmap full prune enabled
Feb 23 10:06:58 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e250 e250: 6 total, 6 up, 6 in
Feb 23 10:06:58 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e250: 6 total, 6 up, 6 in
Feb 23 10:06:58 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "3c02aa0e-f4f3-4457-b672-3178b42295fb", "format": "json"}]: dispatch
Feb 23 10:06:58 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "3c02aa0e-f4f3-4457-b672-3178b42295fb", "force": true, "format": "json"}]: dispatch
Feb 23 10:06:58 np0005626463.localdomain ceph-mon[294160]: pgmap v642: 177 pgs: 177 active+clean; 221 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 614 B/s rd, 53 KiB/s wr, 5 op/s
Feb 23 10:06:58 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "2b92e732-df0d-4744-9403-d9d9479c95ae", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 23 10:06:58 np0005626463.localdomain ceph-mon[294160]: from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 10:06:58 np0005626463.localdomain ceph-mon[294160]: osdmap e250: 6 total, 6 up, 6 in
Feb 23 10:06:59 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "2b92e732-df0d-4744-9403-d9d9479c95ae", "format": "json"}]: dispatch
Feb 23 10:06:59 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.
Feb 23 10:06:59 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.
Feb 23 10:06:59 np0005626463.localdomain podman[323394]: 2026-02-23 10:06:59.915270388 +0000 UTC m=+0.081658516 container health_status be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.43.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.license=GPLv2)
Feb 23 10:06:59 np0005626463.localdomain podman[323394]: 2026-02-23 10:06:59.931345085 +0000 UTC m=+0.097733183 container exec_died be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, managed_by=edpm_ansible, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, container_name=ceilometer_agent_compute, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Feb 23 10:06:59 np0005626463.localdomain systemd[1]: be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.service: Deactivated successfully.
Feb 23 10:07:00 np0005626463.localdomain podman[323393]: 2026-02-23 10:07:00.025565228 +0000 UTC m=+0.195626399 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent)
Feb 23 10:07:00 np0005626463.localdomain podman[323393]: 2026-02-23 10:07:00.055568005 +0000 UTC m=+0.225629196 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Feb 23 10:07:00 np0005626463.localdomain systemd[1]: 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully.
Feb 23 10:07:00 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Feb 23 10:07:00 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 10:07:00 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:07:00.422 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 10:07:00 np0005626463.localdomain ceph-mon[294160]: pgmap v644: 177 pgs: 177 active+clean; 221 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 767 B/s rd, 32 KiB/s wr, 4 op/s
Feb 23 10:07:00 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "aa421a79-c1f6-4044-80a5-096747f89a69", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 23 10:07:00 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "aa421a79-c1f6-4044-80a5-096747f89a69", "format": "json"}]: dispatch
Feb 23 10:07:00 np0005626463.localdomain ceph-mon[294160]: from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 10:07:01 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "2b92e732-df0d-4744-9403-d9d9479c95ae", "snap_name": "c8d432f8-6b3a-45b5-8455-1fb016c15c44", "format": "json"}]: dispatch
Feb 23 10:07:01 np0005626463.localdomain ceph-mon[294160]: pgmap v645: 177 pgs: 177 active+clean; 221 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 1023 B/s rd, 95 KiB/s wr, 8 op/s
Feb 23 10:07:03 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e250 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 23 10:07:04 np0005626463.localdomain ceph-mon[294160]: pgmap v646: 177 pgs: 177 active+clean; 221 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 984 B/s rd, 92 KiB/s wr, 8 op/s
Feb 23 10:07:04 np0005626463.localdomain ceph-osd[32575]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #45. Immutable memtables: 2.
Feb 23 10:07:05 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "aa421a79-c1f6-4044-80a5-096747f89a69", "format": "json"}]: dispatch
Feb 23 10:07:05 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "aa421a79-c1f6-4044-80a5-096747f89a69", "force": true, "format": "json"}]: dispatch
Feb 23 10:07:05 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "2b92e732-df0d-4744-9403-d9d9479c95ae", "snap_name": "c8d432f8-6b3a-45b5-8455-1fb016c15c44_0e17a8dc-e179-4a46-95d3-704480393090", "force": true, "format": "json"}]: dispatch
Feb 23 10:07:05 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "2b92e732-df0d-4744-9403-d9d9479c95ae", "snap_name": "c8d432f8-6b3a-45b5-8455-1fb016c15c44", "force": true, "format": "json"}]: dispatch
Feb 23 10:07:05 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:07:05.426 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 10:07:05 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:07:05.428 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 10:07:05 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:07:05.429 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 10:07:05 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:07:05.429 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 10:07:05 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:07:05.455 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:07:05 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:07:05.455 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 10:07:06 np0005626463.localdomain ceph-mon[294160]: pgmap v647: 177 pgs: 177 active+clean; 221 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 818 B/s rd, 76 KiB/s wr, 6 op/s
Feb 23 10:07:07 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Feb 23 10:07:07 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 10:07:07 np0005626463.localdomain ceph-mon[294160]: from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 10:07:08 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e250 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 23 10:07:08 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "04d2d886-4cf3-41da-bebe-433b0519fd88", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 23 10:07:08 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "04d2d886-4cf3-41da-bebe-433b0519fd88", "format": "json"}]: dispatch
Feb 23 10:07:08 np0005626463.localdomain ceph-mon[294160]: pgmap v648: 177 pgs: 177 active+clean; 222 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 409 B/s rd, 70 KiB/s wr, 5 op/s
Feb 23 10:07:09 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "2b92e732-df0d-4744-9403-d9d9479c95ae", "format": "json"}]: dispatch
Feb 23 10:07:09 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "2b92e732-df0d-4744-9403-d9d9479c95ae", "force": true, "format": "json"}]: dispatch
Feb 23 10:07:09 np0005626463.localdomain podman[242954]: time="2026-02-23T10:07:09Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 23 10:07:09 np0005626463.localdomain podman[242954]: @ - - [23/Feb/2026:10:07:09 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 157081 "" "Go-http-client/1.1"
Feb 23 10:07:09 np0005626463.localdomain podman[242954]: @ - - [23/Feb/2026:10:07:09 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18824 "" "Go-http-client/1.1"
Feb 23 10:07:10 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e250 do_prune osdmap full prune enabled
Feb 23 10:07:10 np0005626463.localdomain ceph-mon[294160]: pgmap v649: 177 pgs: 177 active+clean; 222 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 360 B/s rd, 62 KiB/s wr, 4 op/s
Feb 23 10:07:10 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e251 e251: 6 total, 6 up, 6 in
Feb 23 10:07:10 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e251: 6 total, 6 up, 6 in
Feb 23 10:07:10 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:07:10.456 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 10:07:10 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:07:10.458 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 10:07:10 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:07:10.458 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 10:07:10 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:07:10.458 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 10:07:10 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:07:10.501 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:07:10 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:07:10.501 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 10:07:10 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.
Feb 23 10:07:10 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.
Feb 23 10:07:10 np0005626463.localdomain systemd[1]: tmp-crun.bPn3mH.mount: Deactivated successfully.
Feb 23 10:07:10 np0005626463.localdomain podman[323432]: 2026-02-23 10:07:10.920138998 +0000 UTC m=+0.092139559 container health_status 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., managed_by=edpm_ansible, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, release=1770267347, org.opencontainers.image.created=2026-02-05T04:57:10Z, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.7, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, io.buildah.version=1.33.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, build-date=2026-02-05T04:57:10Z, distribution-scope=public, vendor=Red Hat, Inc., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Feb 23 10:07:10 np0005626463.localdomain podman[323433]: 2026-02-23 10:07:10.965526881 +0000 UTC m=+0.133095626 container health_status da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 23 10:07:10 np0005626463.localdomain podman[323432]: 2026-02-23 10:07:10.984211778 +0000 UTC m=+0.156212329 container exec_died 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, build-date=2026-02-05T04:57:10Z, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, distribution-scope=public, architecture=x86_64, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.7, vcs-type=git, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1770267347, io.buildah.version=1.33.7, name=ubi9/ubi-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, org.opencontainers.image.created=2026-02-05T04:57:10Z, managed_by=edpm_ansible)
Feb 23 10:07:10 np0005626463.localdomain systemd[1]: 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.service: Deactivated successfully.
Feb 23 10:07:11 np0005626463.localdomain podman[323433]: 2026-02-23 10:07:11.000203663 +0000 UTC m=+0.167772398 container exec_died da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 23 10:07:11 np0005626463.localdomain systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Deactivated successfully.
Feb 23 10:07:11 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "04d2d886-4cf3-41da-bebe-433b0519fd88", "snap_name": "196dffb9-7630-47d7-9de4-5970b7eeedd7", "format": "json"}]: dispatch
Feb 23 10:07:11 np0005626463.localdomain ceph-mon[294160]: osdmap e251: 6 total, 6 up, 6 in
Feb 23 10:07:12 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:07:12.255 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:07:12 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 10:07:12.256 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=26, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '22:68:bc', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'c6:19:65:94:49:af'}, ipsec=False) old=SB_Global(nb_cfg=25) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 10:07:12 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 10:07:12.258 163572 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 23 10:07:12 np0005626463.localdomain ceph-mon[294160]: pgmap v651: 177 pgs: 177 active+clean; 222 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 614 B/s rd, 68 KiB/s wr, 5 op/s
Feb 23 10:07:13 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e251 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 23 10:07:13 np0005626463.localdomain openstack_network_exporter[245358]: ERROR   10:07:13 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 23 10:07:13 np0005626463.localdomain openstack_network_exporter[245358]: 
Feb 23 10:07:13 np0005626463.localdomain openstack_network_exporter[245358]: ERROR   10:07:13 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 23 10:07:13 np0005626463.localdomain openstack_network_exporter[245358]: 
Feb 23 10:07:14 np0005626463.localdomain ceph-mon[294160]: pgmap v652: 177 pgs: 177 active+clean; 222 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 614 B/s rd, 68 KiB/s wr, 5 op/s
Feb 23 10:07:15 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot clone", "vol_name": "cephfs", "sub_name": "04d2d886-4cf3-41da-bebe-433b0519fd88", "snap_name": "196dffb9-7630-47d7-9de4-5970b7eeedd7", "target_sub_name": "3b946c8c-2581-45e9-b8f6-ec1026ed5801", "format": "json"}]: dispatch
Feb 23 10:07:15 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "3b946c8c-2581-45e9-b8f6-ec1026ed5801", "format": "json"}]: dispatch
Feb 23 10:07:15 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 10:07:15.260 163572 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=96b5bb93-7341-4ce6-9b93-6a5de566c711, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '26'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 10:07:15 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:07:15.530 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:07:16 np0005626463.localdomain ceph-mon[294160]: pgmap v653: 177 pgs: 177 active+clean; 222 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 614 B/s rd, 68 KiB/s wr, 5 op/s
Feb 23 10:07:16 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : mgrmap e51: np0005626465.hlpkwo(active, since 16m), standbys: np0005626463.wtksup, np0005626466.nisqfq
Feb 23 10:07:16 np0005626463.localdomain sshd[323475]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 10:07:17 np0005626463.localdomain ceph-mon[294160]: mgrmap e51: np0005626465.hlpkwo(active, since 16m), standbys: np0005626463.wtksup, np0005626466.nisqfq
Feb 23 10:07:17 np0005626463.localdomain sshd[323475]: Received disconnect from 45.148.10.152 port 61798:11:  [preauth]
Feb 23 10:07:17 np0005626463.localdomain sshd[323475]: Disconnected from authenticating user root 45.148.10.152 port 61798 [preauth]
Feb 23 10:07:18 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e251 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 23 10:07:18 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e251 do_prune osdmap full prune enabled
Feb 23 10:07:18 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e252 e252: 6 total, 6 up, 6 in
Feb 23 10:07:18 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e252: 6 total, 6 up, 6 in
Feb 23 10:07:18 np0005626463.localdomain ceph-mon[294160]: pgmap v654: 177 pgs: 177 active+clean; 222 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 716 B/s rd, 78 KiB/s wr, 6 op/s
Feb 23 10:07:18 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/657266939' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 23 10:07:18 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/657266939' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 23 10:07:18 np0005626463.localdomain ceph-mon[294160]: osdmap e252: 6 total, 6 up, 6 in
Feb 23 10:07:19 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "3b946c8c-2581-45e9-b8f6-ec1026ed5801", "format": "json"}]: dispatch
Feb 23 10:07:20 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Feb 23 10:07:20 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 10:07:20 np0005626463.localdomain ceph-mon[294160]: pgmap v656: 177 pgs: 177 active+clean; 222 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 795 B/s rd, 87 KiB/s wr, 7 op/s
Feb 23 10:07:20 np0005626463.localdomain ceph-mon[294160]: from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 10:07:20 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Feb 23 10:07:20 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 10:07:20 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:07:20.532 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 10:07:20 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:07:20.533 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 10:07:20 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:07:20.534 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 10:07:20 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:07:20.534 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 10:07:20 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:07:20.563 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:07:20 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:07:20.564 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 10:07:21 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "3b946c8c-2581-45e9-b8f6-ec1026ed5801", "format": "json"}]: dispatch
Feb 23 10:07:21 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "3db593df-73a6-45c0-b329-a0101e95070e", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 23 10:07:21 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "3db593df-73a6-45c0-b329-a0101e95070e", "format": "json"}]: dispatch
Feb 23 10:07:21 np0005626463.localdomain ceph-mon[294160]: from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 10:07:22 np0005626463.localdomain ceph-mon[294160]: pgmap v657: 177 pgs: 177 active+clean; 223 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 716 B/s rd, 62 KiB/s wr, 6 op/s
Feb 23 10:07:23 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e252 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 23 10:07:23 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "3db593df-73a6-45c0-b329-a0101e95070e", "snap_name": "4660794f-8745-4cec-b528-d5d739724996", "format": "json"}]: dispatch
Feb 23 10:07:24 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "3b946c8c-2581-45e9-b8f6-ec1026ed5801", "format": "json"}]: dispatch
Feb 23 10:07:24 np0005626463.localdomain ceph-mon[294160]: pgmap v658: 177 pgs: 177 active+clean; 223 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 818 B/s rd, 63 KiB/s wr, 6 op/s
Feb 23 10:07:24 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "3b946c8c-2581-45e9-b8f6-ec1026ed5801", "force": true, "format": "json"}]: dispatch
Feb 23 10:07:24 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.
Feb 23 10:07:24 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.
Feb 23 10:07:24 np0005626463.localdomain podman[323477]: 2026-02-23 10:07:24.916143369 +0000 UTC m=+0.089714335 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.license=GPLv2)
Feb 23 10:07:24 np0005626463.localdomain systemd[1]: tmp-crun.CYEM3O.mount: Deactivated successfully.
Feb 23 10:07:24 np0005626463.localdomain podman[323478]: 2026-02-23 10:07:24.983988936 +0000 UTC m=+0.155388795 container health_status bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Feb 23 10:07:25 np0005626463.localdomain podman[323478]: 2026-02-23 10:07:25.021275798 +0000 UTC m=+0.192675647 container exec_died bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 23 10:07:25 np0005626463.localdomain systemd[1]: bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.service: Deactivated successfully.
Feb 23 10:07:25 np0005626463.localdomain podman[323477]: 2026-02-23 10:07:25.071811241 +0000 UTC m=+0.245382217 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, tcib_managed=true, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.vendor=CentOS)
Feb 23 10:07:25 np0005626463.localdomain systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully.
Feb 23 10:07:25 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:07:25.564 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:07:25 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:07:25.566 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:07:26 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "3db593df-73a6-45c0-b329-a0101e95070e", "snap_name": "dc7adc7f-ef4c-4373-9a38-3f5ab1a74e0b", "format": "json"}]: dispatch
Feb 23 10:07:26 np0005626463.localdomain ceph-mon[294160]: pgmap v659: 177 pgs: 177 active+clean; 223 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 818 B/s rd, 63 KiB/s wr, 6 op/s
Feb 23 10:07:28 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e252 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 23 10:07:28 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "04d2d886-4cf3-41da-bebe-433b0519fd88", "snap_name": "196dffb9-7630-47d7-9de4-5970b7eeedd7_0eaf20cf-e6ae-41ce-b2d6-53edd51a51b8", "force": true, "format": "json"}]: dispatch
Feb 23 10:07:28 np0005626463.localdomain ceph-mon[294160]: pgmap v660: 177 pgs: 177 active+clean; 223 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 614 B/s rd, 59 KiB/s wr, 4 op/s
Feb 23 10:07:29 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:07:29.055 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:07:29 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:07:29.055 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 23 10:07:30 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:07:30.055 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:07:30 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:07:30.056 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 23 10:07:30 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:07:30.056 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 23 10:07:30 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:07:30.202 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 23 10:07:30 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:07:30.203 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquired lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 23 10:07:30 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:07:30.203 282211 DEBUG nova.network.neutron [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 23 10:07:30 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:07:30.203 282211 DEBUG nova.objects.instance [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c2a7d92b-952f-46a7-8a6a-3322a48fcf4b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 23 10:07:30 np0005626463.localdomain ceph-mon[294160]: pgmap v661: 177 pgs: 177 active+clean; 223 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 541 B/s rd, 52 KiB/s wr, 4 op/s
Feb 23 10:07:30 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:07:30.560 282211 DEBUG nova.network.neutron [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Updating instance_info_cache with network_info: [{"id": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "address": "fa:16:3e:a0:9d:00", "network": {"id": "9da5b53d-3184-450f-9a5b-bdba1a6c9f6d", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "37b8098efb0d4ecc90b451a2db0e966f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa27e5011-20", "ovs_interfaceid": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 23 10:07:30 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:07:30.567 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:07:30 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:07:30.570 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:07:30 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:07:30.575 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Releasing lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 23 10:07:30 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:07:30.576 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 23 10:07:30 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.
Feb 23 10:07:30 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.
Feb 23 10:07:30 np0005626463.localdomain podman[323526]: 2026-02-23 10:07:30.925974749 +0000 UTC m=+0.093655477 container health_status be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Feb 23 10:07:30 np0005626463.localdomain podman[323525]: 2026-02-23 10:07:30.976043197 +0000 UTC m=+0.147381628 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260216, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 23 10:07:30 np0005626463.localdomain podman[323526]: 2026-02-23 10:07:30.990779482 +0000 UTC m=+0.158460220 container exec_died be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ceilometer_agent_compute, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Feb 23 10:07:31 np0005626463.localdomain systemd[1]: be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.service: Deactivated successfully.
Feb 23 10:07:31 np0005626463.localdomain podman[323525]: 2026-02-23 10:07:31.009322316 +0000 UTC m=+0.180660767 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20260216, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team)
Feb 23 10:07:31 np0005626463.localdomain systemd[1]: 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully.
Feb 23 10:07:31 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 23 10:07:31 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/1412922500' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 10:07:31 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "04d2d886-4cf3-41da-bebe-433b0519fd88", "snap_name": "196dffb9-7630-47d7-9de4-5970b7eeedd7", "force": true, "format": "json"}]: dispatch
Feb 23 10:07:31 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "3db593df-73a6-45c0-b329-a0101e95070e", "snap_name": "dc7adc7f-ef4c-4373-9a38-3f5ab1a74e0b_cf1c99c7-561b-4f60-8c3a-b48fb1569a95", "force": true, "format": "json"}]: dispatch
Feb 23 10:07:31 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "3db593df-73a6-45c0-b329-a0101e95070e", "snap_name": "dc7adc7f-ef4c-4373-9a38-3f5ab1a74e0b", "force": true, "format": "json"}]: dispatch
Feb 23 10:07:31 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.108:0/1412922500' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 10:07:32 np0005626463.localdomain ceph-mon[294160]: pgmap v662: 177 pgs: 177 active+clean; 223 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 90 KiB/s wr, 6 op/s
Feb 23 10:07:32 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.108:0/401003035' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 10:07:33 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e252 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 23 10:07:33 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "3db593df-73a6-45c0-b329-a0101e95070e", "snap_name": "86119af1-6e98-43f4-a09a-0b1717aea16f", "format": "json"}]: dispatch
Feb 23 10:07:33 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.107:0/529488543' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 10:07:34 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:07:34.054 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:07:34 np0005626463.localdomain sudo[323562]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 10:07:34 np0005626463.localdomain sudo[323562]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 10:07:34 np0005626463.localdomain sudo[323562]: pam_unix(sudo:session): session closed for user root
Feb 23 10:07:34 np0005626463.localdomain sudo[323580]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 23 10:07:34 np0005626463.localdomain sudo[323580]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 10:07:34 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "04d2d886-4cf3-41da-bebe-433b0519fd88", "format": "json"}]: dispatch
Feb 23 10:07:34 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "04d2d886-4cf3-41da-bebe-433b0519fd88", "force": true, "format": "json"}]: dispatch
Feb 23 10:07:34 np0005626463.localdomain ceph-mon[294160]: pgmap v663: 177 pgs: 177 active+clean; 223 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 255 B/s rd, 64 KiB/s wr, 4 op/s
Feb 23 10:07:34 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.107:0/3036817684' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 10:07:35 np0005626463.localdomain sudo[323580]: pam_unix(sudo:session): session closed for user root
Feb 23 10:07:35 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Feb 23 10:07:35 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 10:07:35 np0005626463.localdomain sudo[323630]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 10:07:35 np0005626463.localdomain sudo[323630]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 10:07:35 np0005626463.localdomain sudo[323630]: pam_unix(sudo:session): session closed for user root
Feb 23 10:07:35 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:07:35.572 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 10:07:35 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:07:35.574 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 10:07:35 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:07:35.575 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 10:07:35 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:07:35.575 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 10:07:35 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e252 do_prune osdmap full prune enabled
Feb 23 10:07:35 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e253 e253: 6 total, 6 up, 6 in
Feb 23 10:07:35 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e253: 6 total, 6 up, 6 in
Feb 23 10:07:35 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 10:07:35 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 23 10:07:35 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 10:07:35 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 23 10:07:35 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:07:35.614 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:07:35 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:07:35.615 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 10:07:35 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Feb 23 10:07:35 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 10:07:36 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:07:36.050 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:07:36 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:07:36.075 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:07:36 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:07:36.076 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:07:36 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:07:36.095 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 10:07:36 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:07:36.095 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 10:07:36 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:07:36.096 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 10:07:36 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:07:36.096 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Auditing locally available compute resources for np0005626463.localdomain (node: np0005626463.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 23 10:07:36 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:07:36.097 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 10:07:36 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 23 10:07:36 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/6716228' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 10:07:36 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:07:36.589 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.492s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 10:07:36 np0005626463.localdomain ceph-mon[294160]: pgmap v664: 177 pgs: 177 active+clean; 223 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 170 B/s rd, 63 KiB/s wr, 3 op/s
Feb 23 10:07:36 np0005626463.localdomain ceph-mon[294160]: osdmap e253: 6 total, 6 up, 6 in
Feb 23 10:07:36 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 10:07:36 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "3db593df-73a6-45c0-b329-a0101e95070e", "snap_name": "86119af1-6e98-43f4-a09a-0b1717aea16f_c09c305a-0055-4b92-9393-c17a5f36cbfe", "force": true, "format": "json"}]: dispatch
Feb 23 10:07:36 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "3db593df-73a6-45c0-b329-a0101e95070e", "snap_name": "86119af1-6e98-43f4-a09a-0b1717aea16f", "force": true, "format": "json"}]: dispatch
Feb 23 10:07:36 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.106:0/6716228' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 10:07:36 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:07:36.650 282211 DEBUG nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 23 10:07:36 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:07:36.651 282211 DEBUG nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 23 10:07:36 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:07:36.813 282211 WARNING nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 23 10:07:36 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:07:36.815 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Hypervisor/Node resource view: name=np0005626463.localdomain free_ram=11200MB free_disk=41.8366584777832GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 23 10:07:36 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:07:36.815 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 10:07:36 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:07:36.815 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 10:07:36 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:07:36.867 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Instance c2a7d92b-952f-46a7-8a6a-3322a48fcf4b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 23 10:07:36 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:07:36.868 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 23 10:07:36 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:07:36.868 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Final resource view: name=np0005626463.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 23 10:07:36 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:07:36.907 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 10:07:37 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 23 10:07:37 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/63466345' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 10:07:37 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:07:37.382 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.475s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 10:07:37 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:07:37.390 282211 DEBUG nova.compute.provider_tree [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Inventory has not changed in ProviderTree for provider: be63d86c-a403-4ec9-a515-07ea2962cb4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 23 10:07:37 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:07:37.409 282211 DEBUG nova.scheduler.client.report [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Inventory has not changed for provider be63d86c-a403-4ec9-a515-07ea2962cb4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 23 10:07:37 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:07:37.412 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Compute_service record updated for np0005626463.localdomain:np0005626463.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 23 10:07:37 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:07:37.413 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.597s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 10:07:37 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.106:0/63466345' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 10:07:37 np0005626463.localdomain ceph-mon[294160]: pgmap v666: 177 pgs: 177 active+clean; 224 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 716 B/s rd, 86 KiB/s wr, 6 op/s
Feb 23 10:07:38 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e253 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 23 10:07:38 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e253 do_prune osdmap full prune enabled
Feb 23 10:07:38 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e254 e254: 6 total, 6 up, 6 in
Feb 23 10:07:38 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e254: 6 total, 6 up, 6 in
Feb 23 10:07:38 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:07:38.413 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:07:38 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:07:38.414 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:07:39 np0005626463.localdomain ceph-mon[294160]: osdmap e254: 6 total, 6 up, 6 in
Feb 23 10:07:39 np0005626463.localdomain podman[242954]: time="2026-02-23T10:07:39Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 23 10:07:39 np0005626463.localdomain podman[242954]: @ - - [23/Feb/2026:10:07:39 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 157081 "" "Go-http-client/1.1"
Feb 23 10:07:39 np0005626463.localdomain podman[242954]: @ - - [23/Feb/2026:10:07:39 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18826 "" "Go-http-client/1.1"
Feb 23 10:07:40 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:07:40.054 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:07:40 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e254 do_prune osdmap full prune enabled
Feb 23 10:07:40 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e255 e255: 6 total, 6 up, 6 in
Feb 23 10:07:40 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "3db593df-73a6-45c0-b329-a0101e95070e", "snap_name": "d873c3c1-7f9f-4457-b00a-3b07bd8d3452", "format": "json"}]: dispatch
Feb 23 10:07:40 np0005626463.localdomain ceph-mon[294160]: pgmap v668: 177 pgs: 177 active+clean; 224 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 895 B/s rd, 46 KiB/s wr, 5 op/s
Feb 23 10:07:40 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e255: 6 total, 6 up, 6 in
Feb 23 10:07:40 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:07:40.615 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 10:07:40 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:07:40.638 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 10:07:40 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:07:40.638 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5023 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 10:07:40 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:07:40.639 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 10:07:40 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:07:40.639 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:07:40 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:07:40.640 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 10:07:41 np0005626463.localdomain ceph-mon[294160]: osdmap e255: 6 total, 6 up, 6 in
Feb 23 10:07:41 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.
Feb 23 10:07:41 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.
Feb 23 10:07:41 np0005626463.localdomain podman[323693]: 2026-02-23 10:07:41.920390938 +0000 UTC m=+0.086233027 container health_status da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 23 10:07:41 np0005626463.localdomain podman[323693]: 2026-02-23 10:07:41.929918452 +0000 UTC m=+0.095760541 container exec_died da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 23 10:07:41 np0005626463.localdomain systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Deactivated successfully.
Feb 23 10:07:42 np0005626463.localdomain podman[323692]: 2026-02-23 10:07:42.019936525 +0000 UTC m=+0.188410115 container health_status 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-02-05T04:57:10Z, name=ubi9/ubi-minimal, version=9.7, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-02-05T04:57:10Z, vendor=Red Hat, Inc., release=1770267347, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, distribution-scope=public, container_name=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c)
Feb 23 10:07:42 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:07:42.055 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:07:42 np0005626463.localdomain podman[323692]: 2026-02-23 10:07:42.061324085 +0000 UTC m=+0.229797665 container exec_died 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, com.redhat.component=ubi9-minimal-container, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, vcs-type=git, name=ubi9/ubi-minimal, vendor=Red Hat, Inc., release=1770267347, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, org.opencontainers.image.created=2026-02-05T04:57:10Z, managed_by=edpm_ansible, build-date=2026-02-05T04:57:10Z, config_id=openstack_network_exporter, version=9.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Feb 23 10:07:42 np0005626463.localdomain systemd[1]: 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.service: Deactivated successfully.
Feb 23 10:07:42 np0005626463.localdomain ceph-mon[294160]: pgmap v670: 177 pgs: 177 active+clean; 224 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 1.3 KiB/s rd, 118 KiB/s wr, 8 op/s
Feb 23 10:07:43 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e255 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 23 10:07:43 np0005626463.localdomain openstack_network_exporter[245358]: ERROR   10:07:43 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 23 10:07:43 np0005626463.localdomain openstack_network_exporter[245358]: 
Feb 23 10:07:43 np0005626463.localdomain openstack_network_exporter[245358]: ERROR   10:07:43 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 23 10:07:43 np0005626463.localdomain openstack_network_exporter[245358]: 
Feb 23 10:07:44 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "3db593df-73a6-45c0-b329-a0101e95070e", "snap_name": "d873c3c1-7f9f-4457-b00a-3b07bd8d3452_4d13e6f8-69ff-4b98-871c-1442174b91a7", "force": true, "format": "json"}]: dispatch
Feb 23 10:07:44 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "3db593df-73a6-45c0-b329-a0101e95070e", "snap_name": "d873c3c1-7f9f-4457-b00a-3b07bd8d3452", "force": true, "format": "json"}]: dispatch
Feb 23 10:07:44 np0005626463.localdomain ceph-mon[294160]: pgmap v671: 177 pgs: 177 active+clean; 224 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 1.0 KiB/s rd, 90 KiB/s wr, 7 op/s
Feb 23 10:07:45 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:07:45.641 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 10:07:45 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:07:45.643 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 10:07:45 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:07:45.644 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 10:07:45 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:07:45.644 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 10:07:45 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:07:45.671 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:07:45 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:07:45.671 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 10:07:46 np0005626463.localdomain ceph-mon[294160]: pgmap v672: 177 pgs: 177 active+clean; 224 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 255 B/s rd, 43 KiB/s wr, 2 op/s
Feb 23 10:07:47 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "3db593df-73a6-45c0-b329-a0101e95070e", "snap_name": "509d0785-aae6-4ab5-ae40-f641cfde0067", "format": "json"}]: dispatch
Feb 23 10:07:48 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e255 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 23 10:07:48 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e255 do_prune osdmap full prune enabled
Feb 23 10:07:48 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e256 e256: 6 total, 6 up, 6 in
Feb 23 10:07:48 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e256: 6 total, 6 up, 6 in
Feb 23 10:07:48 np0005626463.localdomain ceph-mon[294160]: pgmap v673: 177 pgs: 177 active+clean; 224 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 439 B/s rd, 69 KiB/s wr, 4 op/s
Feb 23 10:07:48 np0005626463.localdomain ceph-mon[294160]: osdmap e256: 6 total, 6 up, 6 in
Feb 23 10:07:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 10:07:48.568 163572 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 10:07:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 10:07:48.568 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 10:07:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 10:07:48.569 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 10:07:49 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e256 do_prune osdmap full prune enabled
Feb 23 10:07:49 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e257 e257: 6 total, 6 up, 6 in
Feb 23 10:07:49 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e257: 6 total, 6 up, 6 in
Feb 23 10:07:50 np0005626463.localdomain ceph-mon[294160]: osdmap e257: 6 total, 6 up, 6 in
Feb 23 10:07:50 np0005626463.localdomain ceph-mon[294160]: pgmap v676: 177 pgs: 177 active+clean; 224 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 255 B/s rd, 37 KiB/s wr, 2 op/s
Feb 23 10:07:50 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:07:50.672 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 10:07:50 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:07:50.673 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 10:07:50 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:07:50.673 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 10:07:50 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:07:50.673 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 10:07:50 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:07:50.705 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:07:50 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:07:50.705 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 10:07:51 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "3db593df-73a6-45c0-b329-a0101e95070e", "snap_name": "509d0785-aae6-4ab5-ae40-f641cfde0067_7f4a989f-639e-4dce-8f6e-7269130fc579", "force": true, "format": "json"}]: dispatch
Feb 23 10:07:51 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "3db593df-73a6-45c0-b329-a0101e95070e", "snap_name": "509d0785-aae6-4ab5-ae40-f641cfde0067", "force": true, "format": "json"}]: dispatch
Feb 23 10:07:52 np0005626463.localdomain ceph-mon[294160]: pgmap v677: 177 pgs: 177 active+clean; 224 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 255 B/s rd, 71 KiB/s wr, 4 op/s
Feb 23 10:07:53 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e257 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 23 10:07:54 np0005626463.localdomain ceph-mon[294160]: pgmap v678: 177 pgs: 177 active+clean; 224 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 255 B/s rd, 71 KiB/s wr, 4 op/s
Feb 23 10:07:55 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e257 do_prune osdmap full prune enabled
Feb 23 10:07:55 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e258 e258: 6 total, 6 up, 6 in
Feb 23 10:07:55 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e258: 6 total, 6 up, 6 in
Feb 23 10:07:55 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "3db593df-73a6-45c0-b329-a0101e95070e", "snap_name": "fbb57434-352b-4604-82ba-6268d7f75f30", "format": "json"}]: dispatch
Feb 23 10:07:55 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:07:55.707 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 10:07:55 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:07:55.708 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 10:07:55 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:07:55.709 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 10:07:55 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:07:55.709 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 10:07:55 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:07:55.736 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:07:55 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:07:55.737 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 10:07:55 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.
Feb 23 10:07:55 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.
Feb 23 10:07:55 np0005626463.localdomain systemd[1]: tmp-crun.4Bv5rm.mount: Deactivated successfully.
Feb 23 10:07:55 np0005626463.localdomain podman[323736]: 2026-02-23 10:07:55.924584572 +0000 UTC m=+0.097667591 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 23 10:07:55 np0005626463.localdomain podman[323737]: 2026-02-23 10:07:55.970428428 +0000 UTC m=+0.139431991 container health_status bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Feb 23 10:07:55 np0005626463.localdomain podman[323737]: 2026-02-23 10:07:55.982288595 +0000 UTC m=+0.151292168 container exec_died bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Feb 23 10:07:55 np0005626463.localdomain systemd[1]: bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.service: Deactivated successfully.
Feb 23 10:07:56 np0005626463.localdomain podman[323736]: 2026-02-23 10:07:56.009542238 +0000 UTC m=+0.182625317 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Feb 23 10:07:56 np0005626463.localdomain systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully.
Feb 23 10:07:56 np0005626463.localdomain ceph-mon[294160]: osdmap e258: 6 total, 6 up, 6 in
Feb 23 10:07:56 np0005626463.localdomain ceph-mon[294160]: pgmap v680: 177 pgs: 177 active+clean; 224 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 37 KiB/s wr, 2 op/s
Feb 23 10:07:56 np0005626463.localdomain systemd[1]: tmp-crun.qP82XC.mount: Deactivated successfully.
Feb 23 10:07:58 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e258 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 23 10:07:58 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e258 do_prune osdmap full prune enabled
Feb 23 10:07:58 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e259 e259: 6 total, 6 up, 6 in
Feb 23 10:07:58 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e259: 6 total, 6 up, 6 in
Feb 23 10:07:58 np0005626463.localdomain ceph-mon[294160]: pgmap v681: 177 pgs: 177 active+clean; 225 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 246 B/s rd, 51 KiB/s wr, 3 op/s
Feb 23 10:07:58 np0005626463.localdomain ceph-mon[294160]: osdmap e259: 6 total, 6 up, 6 in
Feb 23 10:07:59 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "3db593df-73a6-45c0-b329-a0101e95070e", "snap_name": "fbb57434-352b-4604-82ba-6268d7f75f30_d4ac2c99-63f7-4931-9b29-6a1792d5f0fd", "force": true, "format": "json"}]: dispatch
Feb 23 10:07:59 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "3db593df-73a6-45c0-b329-a0101e95070e", "snap_name": "fbb57434-352b-4604-82ba-6268d7f75f30", "force": true, "format": "json"}]: dispatch
Feb 23 10:08:00 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:08:00.738 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 10:08:00 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:08:00.740 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 10:08:00 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:08:00.740 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 10:08:00 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:08:00.740 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 10:08:00 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:08:00.741 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 10:08:00 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:08:00.744 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 10:08:00 np0005626463.localdomain ceph-mon[294160]: pgmap v683: 177 pgs: 177 active+clean; 225 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 255 B/s rd, 19 KiB/s wr, 1 op/s
Feb 23 10:08:01 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.
Feb 23 10:08:01 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.
Feb 23 10:08:01 np0005626463.localdomain podman[323785]: 2026-02-23 10:08:01.908197111 +0000 UTC m=+0.077337152 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, tcib_managed=true, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20260216)
Feb 23 10:08:01 np0005626463.localdomain ceph-mon[294160]: pgmap v684: 177 pgs: 177 active+clean; 225 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 63 KiB/s wr, 3 op/s
Feb 23 10:08:01 np0005626463.localdomain podman[323785]: 2026-02-23 10:08:01.943448261 +0000 UTC m=+0.112588312 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.43.0, org.label-schema.build-date=20260216)
Feb 23 10:08:01 np0005626463.localdomain systemd[1]: 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully.
Feb 23 10:08:02 np0005626463.localdomain podman[323786]: 2026-02-23 10:08:02.032150113 +0000 UTC m=+0.198729734 container health_status be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_id=ceilometer_agent_compute, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb)
Feb 23 10:08:02 np0005626463.localdomain podman[323786]: 2026-02-23 10:08:02.047151607 +0000 UTC m=+0.213731258 container exec_died be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=ceilometer_agent_compute, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Feb 23 10:08:02 np0005626463.localdomain systemd[1]: be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.service: Deactivated successfully.
Feb 23 10:08:03 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e259 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 23 10:08:04 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "3db593df-73a6-45c0-b329-a0101e95070e", "snap_name": "4660794f-8745-4cec-b528-d5d739724996_30b563b9-0106-405d-8770-640aaf2912d1", "force": true, "format": "json"}]: dispatch
Feb 23 10:08:04 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "3db593df-73a6-45c0-b329-a0101e95070e", "snap_name": "4660794f-8745-4cec-b528-d5d739724996", "force": true, "format": "json"}]: dispatch
Feb 23 10:08:04 np0005626463.localdomain ceph-mon[294160]: pgmap v685: 177 pgs: 177 active+clean; 225 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 494 B/s rd, 62 KiB/s wr, 4 op/s
Feb 23 10:08:05 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e259 do_prune osdmap full prune enabled
Feb 23 10:08:05 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e260 e260: 6 total, 6 up, 6 in
Feb 23 10:08:05 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e260: 6 total, 6 up, 6 in
Feb 23 10:08:05 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:08:05.743 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 10:08:05 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:08:05.745 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 10:08:05 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:08:05.745 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 10:08:05 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:08:05.746 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 10:08:05 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:08:05.784 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:08:05 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:08:05.785 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 10:08:06 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e260 do_prune osdmap full prune enabled
Feb 23 10:08:06 np0005626463.localdomain ceph-mon[294160]: osdmap e260: 6 total, 6 up, 6 in
Feb 23 10:08:06 np0005626463.localdomain ceph-mon[294160]: pgmap v687: 177 pgs: 177 active+clean; 225 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 255 B/s rd, 45 KiB/s wr, 2 op/s
Feb 23 10:08:06 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e261 e261: 6 total, 6 up, 6 in
Feb 23 10:08:06 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e261: 6 total, 6 up, 6 in
Feb 23 10:08:07 np0005626463.localdomain ceph-mon[294160]: osdmap e261: 6 total, 6 up, 6 in
Feb 23 10:08:07 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "3db593df-73a6-45c0-b329-a0101e95070e", "format": "json"}]: dispatch
Feb 23 10:08:07 np0005626463.localdomain ceph-mon[294160]: from='client.15723 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "3db593df-73a6-45c0-b329-a0101e95070e", "force": true, "format": "json"}]: dispatch
Feb 23 10:08:08 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e261 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 23 10:08:08 np0005626463.localdomain ceph-mon[294160]: pgmap v689: 177 pgs: 177 active+clean; 225 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 75 KiB/s wr, 5 op/s
Feb 23 10:08:09 np0005626463.localdomain podman[242954]: time="2026-02-23T10:08:09Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 23 10:08:09 np0005626463.localdomain podman[242954]: @ - - [23/Feb/2026:10:08:09 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 157081 "" "Go-http-client/1.1"
Feb 23 10:08:09 np0005626463.localdomain podman[242954]: @ - - [23/Feb/2026:10:08:09 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18831 "" "Go-http-client/1.1"
Feb 23 10:08:10 np0005626463.localdomain ceph-mon[294160]: pgmap v690: 177 pgs: 177 active+clean; 225 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 255 B/s rd, 30 KiB/s wr, 2 op/s
Feb 23 10:08:10 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 10:08:10.691 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=27, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '22:68:bc', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'c6:19:65:94:49:af'}, ipsec=False) old=SB_Global(nb_cfg=26) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 10:08:10 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 10:08:10.692 163572 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 23 10:08:10 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:08:10.725 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:08:10 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:08:10.786 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:08:10 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:08:10.788 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:08:12 np0005626463.localdomain ceph-mon[294160]: pgmap v691: 177 pgs: 177 active+clean; 226 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 73 KiB/s wr, 4 op/s
Feb 23 10:08:12 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.
Feb 23 10:08:12 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.
Feb 23 10:08:12 np0005626463.localdomain podman[323822]: 2026-02-23 10:08:12.900795761 +0000 UTC m=+0.076766614 container health_status 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, name=ubi9/ubi-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-02-05T04:57:10Z, com.redhat.component=ubi9-minimal-container, version=9.7, build-date=2026-02-05T04:57:10Z, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, architecture=x86_64, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, vendor=Red Hat, Inc., release=1770267347, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c)
Feb 23 10:08:12 np0005626463.localdomain podman[323822]: 2026-02-23 10:08:12.916306041 +0000 UTC m=+0.092276954 container exec_died 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-02-05T04:57:10Z, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., vcs-type=git, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, managed_by=edpm_ansible, container_name=openstack_network_exporter, name=ubi9/ubi-minimal, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2026-02-05T04:57:10Z, io.openshift.tags=minimal rhel9, config_id=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.7, release=1770267347, io.openshift.expose-services=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Feb 23 10:08:12 np0005626463.localdomain systemd[1]: 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.service: Deactivated successfully.
Feb 23 10:08:12 np0005626463.localdomain systemd[1]: tmp-crun.XOmIoc.mount: Deactivated successfully.
Feb 23 10:08:12 np0005626463.localdomain podman[323823]: 2026-02-23 10:08:12.963195671 +0000 UTC m=+0.136268034 container health_status da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 23 10:08:12 np0005626463.localdomain podman[323823]: 2026-02-23 10:08:12.999352759 +0000 UTC m=+0.172425102 container exec_died da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 23 10:08:13 np0005626463.localdomain systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Deactivated successfully.
Feb 23 10:08:13 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e261 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 23 10:08:13 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e261 do_prune osdmap full prune enabled
Feb 23 10:08:13 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e262 e262: 6 total, 6 up, 6 in
Feb 23 10:08:13 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e262: 6 total, 6 up, 6 in
Feb 23 10:08:13 np0005626463.localdomain openstack_network_exporter[245358]: ERROR   10:08:13 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 23 10:08:13 np0005626463.localdomain openstack_network_exporter[245358]: 
Feb 23 10:08:13 np0005626463.localdomain openstack_network_exporter[245358]: ERROR   10:08:13 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 23 10:08:13 np0005626463.localdomain openstack_network_exporter[245358]: 
Feb 23 10:08:14 np0005626463.localdomain ceph-mon[294160]: osdmap e262: 6 total, 6 up, 6 in
Feb 23 10:08:14 np0005626463.localdomain ceph-mon[294160]: pgmap v693: 177 pgs: 177 active+clean; 226 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 73 KiB/s wr, 4 op/s
Feb 23 10:08:15 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:08:15.817 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 10:08:15 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:08:15.819 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:08:15 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:08:15.820 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5031 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 10:08:15 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:08:15.820 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 10:08:15 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:08:15.821 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 10:08:15 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:08:15.825 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 10:08:16 np0005626463.localdomain ceph-mon[294160]: pgmap v694: 177 pgs: 177 active+clean; 226 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 442 B/s rd, 63 KiB/s wr, 4 op/s
Feb 23 10:08:16 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 10:08:16.694 163572 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=96b5bb93-7341-4ce6-9b93-6a5de566c711, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '27'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 10:08:17 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Feb 23 10:08:17 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2040120774' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 23 10:08:17 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Feb 23 10:08:17 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2040120774' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 23 10:08:18 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 23 10:08:18 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #73. Immutable memtables: 0.
Feb 23 10:08:18 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:08:18.126359) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Feb 23 10:08:18 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/flush_job.cc:856] [default] [JOB 43] Flushing memtable with next log file: 73
Feb 23 10:08:18 np0005626463.localdomain ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771841298126400, "job": 43, "event": "flush_started", "num_memtables": 1, "num_entries": 1882, "num_deletes": 258, "total_data_size": 1999982, "memory_usage": 2134784, "flush_reason": "Manual Compaction"}
Feb 23 10:08:18 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/flush_job.cc:885] [default] [JOB 43] Level-0 flush table #74: started
Feb 23 10:08:18 np0005626463.localdomain ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771841298136602, "cf_name": "default", "job": 43, "event": "table_file_creation", "file_number": 74, "file_size": 1470944, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 38816, "largest_seqno": 40697, "table_properties": {"data_size": 1464544, "index_size": 3293, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2053, "raw_key_size": 17637, "raw_average_key_size": 21, "raw_value_size": 1450143, "raw_average_value_size": 1794, "num_data_blocks": 144, "num_entries": 808, "num_filter_entries": 808, "num_deletions": 258, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771841173, "oldest_key_time": 1771841173, "file_creation_time": 1771841298, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4cfd6c8f-aafa-4003-b2f6-d22c49635dd4", "db_session_id": "66DAQ76CBLV8DSGL8JC7", "orig_file_number": 74, "seqno_to_time_mapping": "N/A"}}
Feb 23 10:08:18 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 43] Flush lasted 10310 microseconds, and 4737 cpu microseconds.
Feb 23 10:08:18 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 23 10:08:18 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:08:18.136670) [db/flush_job.cc:967] [default] [JOB 43] Level-0 flush table #74: 1470944 bytes OK
Feb 23 10:08:18 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:08:18.136694) [db/memtable_list.cc:519] [default] Level-0 commit table #74 started
Feb 23 10:08:18 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:08:18.140539) [db/memtable_list.cc:722] [default] Level-0 commit table #74: memtable #1 done
Feb 23 10:08:18 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:08:18.140559) EVENT_LOG_v1 {"time_micros": 1771841298140553, "job": 43, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Feb 23 10:08:18 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:08:18.140583) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Feb 23 10:08:18 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 43] Try to delete WAL files size 1991782, prev total WAL file size 1992272, number of live WAL files 2.
Feb 23 10:08:18 np0005626463.localdomain ceph-mon[294160]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005626463/store.db/000070.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 23 10:08:18 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:08:18.141450) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740034323533' seq:72057594037927935, type:22 .. '6D6772737461740034353035' seq:0, type:0; will stop at (end)
Feb 23 10:08:18 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 44] Compacting 1@0 + 1@6 files to L6, score -1.00
Feb 23 10:08:18 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 43 Base level 0, inputs: [74(1436KB)], [72(17MB)]
Feb 23 10:08:18 np0005626463.localdomain ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771841298141493, "job": 44, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [74], "files_L6": [72], "score": -1, "input_data_size": 19819172, "oldest_snapshot_seqno": -1}
Feb 23 10:08:18 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 44] Generated table #75: 14548 keys, 18170398 bytes, temperature: kUnknown
Feb 23 10:08:18 np0005626463.localdomain ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771841298251013, "cf_name": "default", "job": 44, "event": "table_file_creation", "file_number": 75, "file_size": 18170398, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 18088153, "index_size": 44768, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 36421, "raw_key_size": 390711, "raw_average_key_size": 26, "raw_value_size": 17841712, "raw_average_value_size": 1226, "num_data_blocks": 1656, "num_entries": 14548, "num_filter_entries": 14548, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771839971, "oldest_key_time": 0, "file_creation_time": 1771841298, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4cfd6c8f-aafa-4003-b2f6-d22c49635dd4", "db_session_id": "66DAQ76CBLV8DSGL8JC7", "orig_file_number": 75, "seqno_to_time_mapping": "N/A"}}
Feb 23 10:08:18 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 23 10:08:18 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:08:18.251375) [db/compaction/compaction_job.cc:1663] [default] [JOB 44] Compacted 1@0 + 1@6 files to L6 => 18170398 bytes
Feb 23 10:08:18 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:08:18.257548) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 180.8 rd, 165.8 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.4, 17.5 +0.0 blob) out(17.3 +0.0 blob), read-write-amplify(25.8) write-amplify(12.4) OK, records in: 15025, records dropped: 477 output_compression: NoCompression
Feb 23 10:08:18 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:08:18.257576) EVENT_LOG_v1 {"time_micros": 1771841298257563, "job": 44, "event": "compaction_finished", "compaction_time_micros": 109618, "compaction_time_cpu_micros": 52698, "output_level": 6, "num_output_files": 1, "total_output_size": 18170398, "num_input_records": 15025, "num_output_records": 14548, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Feb 23 10:08:18 np0005626463.localdomain ceph-mon[294160]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005626463/store.db/000074.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 23 10:08:18 np0005626463.localdomain ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771841298257999, "job": 44, "event": "table_file_deletion", "file_number": 74}
Feb 23 10:08:18 np0005626463.localdomain ceph-mon[294160]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005626463/store.db/000072.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 23 10:08:18 np0005626463.localdomain ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771841298260762, "job": 44, "event": "table_file_deletion", "file_number": 72}
Feb 23 10:08:18 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:08:18.141367) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 23 10:08:18 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:08:18.260863) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 23 10:08:18 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:08:18.260909) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 23 10:08:18 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:08:18.260914) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 23 10:08:18 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:08:18.260918) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 23 10:08:18 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:08:18.260923) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 23 10:08:18 np0005626463.localdomain ceph-mon[294160]: pgmap v695: 177 pgs: 177 active+clean; 226 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 204 B/s rd, 41 KiB/s wr, 2 op/s
Feb 23 10:08:18 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/2040120774' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 23 10:08:18 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/2040120774' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 23 10:08:20 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:08:20.069 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:08:20 np0005626463.localdomain ceph-mon[294160]: pgmap v696: 177 pgs: 177 active+clean; 226 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 204 B/s rd, 41 KiB/s wr, 2 op/s
Feb 23 10:08:20 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:08:20.822 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:08:22 np0005626463.localdomain ceph-mon[294160]: pgmap v697: 177 pgs: 177 active+clean; 226 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 8.8 KiB/s wr, 0 op/s
Feb 23 10:08:23 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 23 10:08:23 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:08:23.511 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:08:24 np0005626463.localdomain ceph-mon[294160]: pgmap v698: 177 pgs: 177 active+clean; 226 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 8.4 KiB/s wr, 0 op/s
Feb 23 10:08:25 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:08:25.826 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:08:26 np0005626463.localdomain ceph-mon[294160]: pgmap v699: 177 pgs: 177 active+clean; 226 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 7.2 KiB/s wr, 0 op/s
Feb 23 10:08:26 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.
Feb 23 10:08:26 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.
Feb 23 10:08:26 np0005626463.localdomain systemd[1]: tmp-crun.sY3QB4.mount: Deactivated successfully.
Feb 23 10:08:26 np0005626463.localdomain podman[323867]: 2026-02-23 10:08:26.922010427 +0000 UTC m=+0.096672280 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20260216, tcib_build_tag=8419493e1fd846703d277695e03fc5eb)
Feb 23 10:08:26 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T10:08:26Z|00377|binding|INFO|Releasing lport 4143c8ea-7577-4792-9744-bcff90eb20f2 from this chassis (sb_readonly=0)
Feb 23 10:08:26 np0005626463.localdomain podman[323868]: 2026-02-23 10:08:26.991919037 +0000 UTC m=+0.163004050 container health_status bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Feb 23 10:08:27 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:08:27.006 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:08:27 np0005626463.localdomain podman[323867]: 2026-02-23 10:08:27.022209124 +0000 UTC m=+0.196870957 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible)
Feb 23 10:08:27 np0005626463.localdomain podman[323868]: 2026-02-23 10:08:27.02824258 +0000 UTC m=+0.199327593 container exec_died bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 23 10:08:27 np0005626463.localdomain systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully.
Feb 23 10:08:27 np0005626463.localdomain systemd[1]: bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.service: Deactivated successfully.
Feb 23 10:08:28 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 23 10:08:28 np0005626463.localdomain ceph-mon[294160]: pgmap v700: 177 pgs: 177 active+clean; 226 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 7.2 KiB/s wr, 0 op/s
Feb 23 10:08:30 np0005626463.localdomain ceph-mon[294160]: pgmap v701: 177 pgs: 177 active+clean; 226 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 2.1 KiB/s wr, 0 op/s
Feb 23 10:08:30 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:08:30.868 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:08:31 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:08:31.054 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:08:31 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:08:31.054 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 23 10:08:32 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:08:32.055 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:08:32 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:08:32.055 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 23 10:08:32 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:08:32.056 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 23 10:08:32 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:08:32.482 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 23 10:08:32 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:08:32.483 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquired lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 23 10:08:32 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:08:32.483 282211 DEBUG nova.network.neutron [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 23 10:08:32 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:08:32.484 282211 DEBUG nova.objects.instance [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c2a7d92b-952f-46a7-8a6a-3322a48fcf4b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 23 10:08:32 np0005626463.localdomain ceph-mon[294160]: pgmap v702: 177 pgs: 177 active+clean; 226 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 2.1 KiB/s wr, 0 op/s
Feb 23 10:08:32 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.108:0/2639862132' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 10:08:32 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.
Feb 23 10:08:32 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.
Feb 23 10:08:32 np0005626463.localdomain systemd[1]: tmp-crun.oP3OL8.mount: Deactivated successfully.
Feb 23 10:08:32 np0005626463.localdomain podman[323918]: 2026-02-23 10:08:32.913437047 +0000 UTC m=+0.085208235 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20260216)
Feb 23 10:08:32 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:08:32.923 282211 DEBUG nova.network.neutron [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Updating instance_info_cache with network_info: [{"id": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "address": "fa:16:3e:a0:9d:00", "network": {"id": "9da5b53d-3184-450f-9a5b-bdba1a6c9f6d", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "37b8098efb0d4ecc90b451a2db0e966f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa27e5011-20", "ovs_interfaceid": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 23 10:08:32 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:08:32.944 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Releasing lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 23 10:08:32 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:08:32.945 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 23 10:08:32 np0005626463.localdomain podman[323919]: 2026-02-23 10:08:32.965245829 +0000 UTC m=+0.133766576 container health_status be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team)
Feb 23 10:08:32 np0005626463.localdomain podman[323919]: 2026-02-23 10:08:32.979359286 +0000 UTC m=+0.147880003 container exec_died be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, container_name=ceilometer_agent_compute, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.vendor=CentOS)
Feb 23 10:08:32 np0005626463.localdomain systemd[1]: be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.service: Deactivated successfully.
Feb 23 10:08:32 np0005626463.localdomain podman[323918]: 2026-02-23 10:08:32.995295088 +0000 UTC m=+0.167066286 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260216, container_name=ovn_metadata_agent)
Feb 23 10:08:33 np0005626463.localdomain systemd[1]: 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully.
Feb 23 10:08:33 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 23 10:08:34 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.108:0/3900623070' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 10:08:34 np0005626463.localdomain ceph-mon[294160]: pgmap v703: 177 pgs: 177 active+clean; 226 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Feb 23 10:08:35 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.107:0/723157173' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 10:08:35 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.107:0/1665956422' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 10:08:35 np0005626463.localdomain sudo[323956]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 10:08:35 np0005626463.localdomain sudo[323956]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 10:08:35 np0005626463.localdomain sudo[323956]: pam_unix(sudo:session): session closed for user root
Feb 23 10:08:35 np0005626463.localdomain sudo[323974]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 23 10:08:35 np0005626463.localdomain sudo[323974]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 10:08:35 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:08:35.870 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 10:08:35 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:08:35.872 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 10:08:35 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:08:35.872 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 10:08:35 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:08:35.873 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 10:08:35 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:08:35.912 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:08:35 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:08:35.913 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 10:08:36 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:08:36.054 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:08:36 np0005626463.localdomain ceph-mon[294160]: pgmap v704: 177 pgs: 177 active+clean; 226 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Feb 23 10:08:36 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : mgrmap e52: np0005626465.hlpkwo(active, since 17m), standbys: np0005626463.wtksup, np0005626466.nisqfq
Feb 23 10:08:36 np0005626463.localdomain sudo[323974]: pam_unix(sudo:session): session closed for user root
Feb 23 10:08:36 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Feb 23 10:08:36 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 10:08:36 np0005626463.localdomain sudo[324024]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 10:08:36 np0005626463.localdomain sudo[324024]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 10:08:36 np0005626463.localdomain sudo[324024]: pam_unix(sudo:session): session closed for user root
Feb 23 10:08:37 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:08:37.050 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:08:37 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:08:37.053 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:08:37 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:08:37.054 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:08:37 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:08:37.077 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 10:08:37 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:08:37.078 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 10:08:37 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:08:37.078 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 10:08:37 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:08:37.078 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Auditing locally available compute resources for np0005626463.localdomain (node: np0005626463.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 23 10:08:37 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:08:37.079 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 10:08:37 np0005626463.localdomain ceph-mon[294160]: mgrmap e52: np0005626465.hlpkwo(active, since 17m), standbys: np0005626463.wtksup, np0005626466.nisqfq
Feb 23 10:08:37 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 10:08:37 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 23 10:08:37 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 10:08:37 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 23 10:08:37 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 23 10:08:37 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/1622287573' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 10:08:37 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:08:37.559 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.480s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 10:08:37 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:08:37.618 282211 DEBUG nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 23 10:08:37 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:08:37.619 282211 DEBUG nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 23 10:08:37 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:08:37.887 282211 WARNING nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 23 10:08:37 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:08:37.888 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Hypervisor/Node resource view: name=np0005626463.localdomain free_ram=11199MB free_disk=41.8366584777832GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 23 10:08:37 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:08:37.889 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 10:08:37 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:08:37.889 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 10:08:38 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 23 10:08:38 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:08:38.208 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Instance c2a7d92b-952f-46a7-8a6a-3322a48fcf4b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 23 10:08:38 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:08:38.209 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 23 10:08:38 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:08:38.209 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Final resource view: name=np0005626463.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 23 10:08:38 np0005626463.localdomain ceph-mon[294160]: pgmap v705: 177 pgs: 177 active+clean; 226 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 341 B/s wr, 0 op/s
Feb 23 10:08:38 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.106:0/1622287573' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 10:08:38 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:08:38.240 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 10:08:38 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 23 10:08:38 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/2678903232' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 10:08:38 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:08:38.704 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.464s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 10:08:38 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:08:38.712 282211 DEBUG nova.compute.provider_tree [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Inventory has not changed in ProviderTree for provider: be63d86c-a403-4ec9-a515-07ea2962cb4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 23 10:08:38 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:08:38.733 282211 DEBUG nova.scheduler.client.report [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Inventory has not changed for provider be63d86c-a403-4ec9-a515-07ea2962cb4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 23 10:08:38 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:08:38.736 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Compute_service record updated for np0005626463.localdomain:np0005626463.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 23 10:08:38 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:08:38.736 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.847s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 10:08:39 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.106:0/2678903232' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 10:08:39 np0005626463.localdomain podman[242954]: time="2026-02-23T10:08:39Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 23 10:08:39 np0005626463.localdomain podman[242954]: @ - - [23/Feb/2026:10:08:39 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 157081 "" "Go-http-client/1.1"
Feb 23 10:08:39 np0005626463.localdomain podman[242954]: @ - - [23/Feb/2026:10:08:39 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18822 "" "Go-http-client/1.1"
Feb 23 10:08:39 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:08:39.738 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:08:40 np0005626463.localdomain ceph-mon[294160]: pgmap v706: 177 pgs: 177 active+clean; 226 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 341 B/s wr, 0 op/s
Feb 23 10:08:40 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Feb 23 10:08:40 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 10:08:40 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:08:40.914 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 10:08:40 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:08:40.916 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 10:08:40 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:08:40.916 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 10:08:40 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:08:40.916 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 10:08:40 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:08:40.945 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:08:40 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:08:40.946 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 10:08:41 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:08:41.054 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:08:41 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 10:08:41 np0005626463.localdomain ceph-mon[294160]: pgmap v707: 177 pgs: 177 active+clean; 226 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 341 B/s wr, 0 op/s
Feb 23 10:08:42 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:08:42.055 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:08:43 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 23 10:08:43 np0005626463.localdomain openstack_network_exporter[245358]: ERROR   10:08:43 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 23 10:08:43 np0005626463.localdomain openstack_network_exporter[245358]: 
Feb 23 10:08:43 np0005626463.localdomain openstack_network_exporter[245358]: ERROR   10:08:43 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 23 10:08:43 np0005626463.localdomain openstack_network_exporter[245358]: 
Feb 23 10:08:43 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.
Feb 23 10:08:43 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.
Feb 23 10:08:43 np0005626463.localdomain podman[324086]: 2026-02-23 10:08:43.915749227 +0000 UTC m=+0.079969423 container health_status 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1770267347, vendor=Red Hat, Inc., managed_by=edpm_ansible, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, build-date=2026-02-05T04:57:10Z, io.openshift.tags=minimal rhel9, vcs-type=git, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, config_id=openstack_network_exporter, version=9.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9/ubi-minimal, io.openshift.expose-services=, io.buildah.version=1.33.7, org.opencontainers.image.created=2026-02-05T04:57:10Z, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Feb 23 10:08:43 np0005626463.localdomain podman[324086]: 2026-02-23 10:08:43.932272148 +0000 UTC m=+0.096492334 container exec_died 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, name=ubi9/ubi-minimal, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, version=9.7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-type=git, release=1770267347, io.openshift.expose-services=, build-date=2026-02-05T04:57:10Z, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, org.opencontainers.image.created=2026-02-05T04:57:10Z)
Feb 23 10:08:43 np0005626463.localdomain systemd[1]: 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.service: Deactivated successfully.
Feb 23 10:08:44 np0005626463.localdomain podman[324087]: 2026-02-23 10:08:44.018997479 +0000 UTC m=+0.179977035 container health_status da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 23 10:08:44 np0005626463.localdomain podman[324087]: 2026-02-23 10:08:44.056182029 +0000 UTC m=+0.217161485 container exec_died da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 23 10:08:44 np0005626463.localdomain systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Deactivated successfully.
Feb 23 10:08:44 np0005626463.localdomain ceph-mon[294160]: pgmap v708: 177 pgs: 177 active+clean; 226 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 341 B/s wr, 0 op/s
Feb 23 10:08:45 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:08:45.947 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 10:08:45 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:08:45.949 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 10:08:45 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:08:45.949 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 10:08:45 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:08:45.949 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 10:08:45 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:08:45.984 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:08:45 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:08:45.984 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 10:08:46 np0005626463.localdomain ceph-mon[294160]: pgmap v709: 177 pgs: 177 active+clean; 226 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 341 B/s wr, 0 op/s
Feb 23 10:08:48 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 23 10:08:48 np0005626463.localdomain ceph-mon[294160]: pgmap v710: 177 pgs: 177 active+clean; 226 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 341 B/s wr, 0 op/s
Feb 23 10:08:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 10:08:48.568 163572 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 10:08:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 10:08:48.569 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 10:08:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 10:08:48.570 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 10:08:50 np0005626463.localdomain ceph-mon[294160]: pgmap v711: 177 pgs: 177 active+clean; 226 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Feb 23 10:08:50 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:08:50.986 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 10:08:50 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:08:50.988 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 10:08:50 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:08:50.988 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 10:08:50 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:08:50.988 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 10:08:51 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:08:51.014 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:08:51 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:08:51.015 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 10:08:52 np0005626463.localdomain ceph-mon[294160]: pgmap v712: 177 pgs: 177 active+clean; 226 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Feb 23 10:08:53 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 23 10:08:54 np0005626463.localdomain ceph-mon[294160]: pgmap v713: 177 pgs: 177 active+clean; 226 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Feb 23 10:08:56 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:08:56.016 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 10:08:56 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:08:56.018 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.152 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'name': 'test', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000003', 'OS-EXT-SRV-ATTR:host': 'np0005626463.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '37b8098efb0d4ecc90b451a2db0e966f', 'user_id': 'cb6895487918456aa599ca2f76872d00', 'hostId': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.153 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.173 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/memory.usage volume: 51.72265625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.175 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8009e87f-a614-45b8-9874-f65c73dda896', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.72265625, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'timestamp': '2026-02-23T10:08:56.154104', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': 'a9ea2530-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12776.362539547, 'message_signature': '63a9e5969553b9963b4750d37752251eb8a57dd7959e011d50403b12ef79498f'}]}, 'timestamp': '2026-02-23 10:08:56.173791', '_unique_id': 'cac2a529356e4d27b7159af96e48050f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.175 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.175 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.175 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.175 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.175 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.175 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.175 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.175 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.175 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.175 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.175 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.175 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.175 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.175 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.175 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.175 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.175 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.175 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.175 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.175 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.175 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.175 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.175 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.175 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.175 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.175 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.175 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.175 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.175 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.175 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.175 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.176 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.187 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.187 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.189 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'aa3ef6fe-79d9-4779-bc0a-bec864c64fac', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T10:08:56.176777', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a9ec4ab8-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12776.366260761, 'message_signature': '4f7e10bcedbb4f441b488605291f9f69ff467e53490bb71cb7911a8707890b0d'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T10:08:56.176777', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a9ec5d8c-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12776.366260761, 'message_signature': '405e5d8affb23fdfc6683f704cf5b1f756fdc133f8c4b8b52122781e87b26c8c'}]}, 'timestamp': '2026-02-23 10:08:56.188251', '_unique_id': '626b37d1370048d89cd04efbdc44f6b1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.189 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.189 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.189 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.189 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.189 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.189 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.189 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.189 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.189 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.189 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.189 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.189 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.189 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.189 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.189 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.189 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.189 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.189 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.189 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.189 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.189 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.189 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.189 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.189 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.189 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.189 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.189 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.189 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.189 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.189 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.189 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.190 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.193 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.195 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4021d904-30f1-40af-9ef0-d84b9c1615e5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T10:08:56.190463', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': 'a9ed51b0-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12776.379948035, 'message_signature': '850d38692b39139cff8b20d25d9f6c9d58f890e7f269d02c5b678d1d899d1eb8'}]}, 'timestamp': '2026-02-23 10:08:56.194541', '_unique_id': '1c266a4fe67844abb1720a1c0d4a2e02'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.195 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.195 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.195 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.195 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.195 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.195 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.195 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.195 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.195 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.195 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.195 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.195 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.195 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.195 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.195 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.195 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.195 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.195 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.195 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.195 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.195 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.195 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.195 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.195 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.195 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.195 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.195 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.195 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.195 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.195 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.195 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.196 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.227 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.latency volume: 1054797520 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.228 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.latency volume: 21338362 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.229 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ca80bf58-e775-4564-9550-374a1a1ed391', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1054797520, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T10:08:56.196712', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a9f26b50-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12776.386197558, 'message_signature': 'cc69b18a833af66eb028514277d2fed2d70b2254adbf1968e47a94071ca3471e'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 21338362, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T10:08:56.196712', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a9f27f0a-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12776.386197558, 'message_signature': '8ccf9a6da0926b4f9cca130d564367600a154e343db6f3d2d44772c1feb3cd5d'}]}, 'timestamp': '2026-02-23 10:08:56.228431', '_unique_id': '962e7c11dd3c4510a9eee7d0cbeab2db'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.229 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.229 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.229 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.229 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.229 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.229 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.229 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.229 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.229 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.229 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.229 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.229 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.229 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.229 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.229 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.229 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.229 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.229 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.229 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.229 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.229 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.229 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.229 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.229 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.229 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.229 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.229 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.229 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.229 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.229 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.229 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.231 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.231 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.232 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1926fbbf-5d1e-4242-a02c-e5d432048fc8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T10:08:56.231171', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': 'a9f2fcc8-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12776.379948035, 'message_signature': 'a426efd7c2334118de754c322afd8637318960a61919b6f435a36f33e0be1864'}]}, 'timestamp': '2026-02-23 10:08:56.231689', '_unique_id': 'a59248e0a5054bc3baf85dcde6af5a33'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.232 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.232 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.232 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.232 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.232 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.232 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.232 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.232 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.232 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.232 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.232 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.232 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.232 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.232 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.232 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.232 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.232 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.232 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.232 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.232 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.232 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.232 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.232 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.232 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.232 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.232 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.232 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.232 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.232 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.232 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.232 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.233 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.233 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.bytes volume: 35597312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.234 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.235 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c7a7fde6-5f7a-4cbe-9b9c-ba2fd8307afc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35597312, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T10:08:56.233833', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a9f364b0-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12776.386197558, 'message_signature': 'a705188eba0a6a103b0fe2446a02cd130ede5377304bc809d786dd2ca6e385a7'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T10:08:56.233833', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a9f374b4-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12776.386197558, 'message_signature': 'ab2b2e505972d1274fc138f950eaffdec717bf3cb51ebedd2095a1793c62dcbd'}]}, 'timestamp': '2026-02-23 10:08:56.234712', '_unique_id': 'ca0803c1ad2f4bb7a6645c50cdae27bb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.235 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.235 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.235 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.235 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.235 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.235 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.235 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.235 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.235 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.235 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.235 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.235 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.235 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.235 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.235 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.235 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.235 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.235 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.235 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.235 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.235 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.235 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.235 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.235 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.235 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.235 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.235 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.235 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.235 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.235 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.235 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.237 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.237 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.237 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.238 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2f33bdc4-b4bc-4b92-b3c9-dc5c5746cb20', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T10:08:56.237200', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a9f3e688-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12776.366260761, 'message_signature': '7f15ad1934af7fa3359d061a0d15ac2029a1a934392a6ac2065bf12d71122914'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T10:08:56.237200', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a9f3f65a-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12776.366260761, 'message_signature': '001fb09de9f34673f98bb36c555c64f132b7e86c086616cfb44f083c5015f12e'}]}, 'timestamp': '2026-02-23 10:08:56.238066', '_unique_id': 'ee9f3d147cec4583848c256daef701fb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.238 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.238 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.238 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.238 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.238 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.238 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.238 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.238 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.238 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.238 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.238 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.238 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.238 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.238 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.238 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.238 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.238 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.238 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.238 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.238 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.238 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.238 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.238 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.238 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.238 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.238 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.238 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.238 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.238 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.238 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.238 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.240 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.240 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.241 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0adc0112-2245-4f44-ba97-a635f859c225', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T10:08:56.240214', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': 'a9f45c08-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12776.379948035, 'message_signature': '8dd3930324257d99bf7458cae1264116f1f3a16660be3b26ada51f2c7deb5d2b'}]}, 'timestamp': '2026-02-23 10:08:56.240665', '_unique_id': '5e6082886e194950a5f1562ad5603176'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.241 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.241 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.241 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.241 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.241 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.241 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.241 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.241 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.241 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.241 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.241 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.241 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.241 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.241 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.241 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.241 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.241 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.241 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.241 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.241 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.241 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.241 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.241 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.241 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.241 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.241 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.241 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.241 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.241 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.241 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.241 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.242 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.242 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.242 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.latency volume: 1374424344 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.243 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.latency volume: 89322858 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.244 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'aaee693a-928e-40bf-aa25-642005ebf13c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1374424344, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T10:08:56.242905', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a9f4c580-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12776.386197558, 'message_signature': '68ad7f26fe493987044aba8043223a4d5f4fba6a3d17e1d6c45667831382b846'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 89322858, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 
'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T10:08:56.242905', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a9f4d6c4-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12776.386197558, 'message_signature': 'd1f4716fd2c547f6b258048db36b77162643f627c31959cb3c92fbe68df742be'}]}, 'timestamp': '2026-02-23 10:08:56.243779', '_unique_id': '2c9a4d0a712b47d28e1d67fe518795cc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.244 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.244 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.244 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.244 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.244 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.244 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.244 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.244 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.244 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.244 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.244 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.244 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.244 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.244 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.244 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.244 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.244 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.244 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.244 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.244 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.244 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.244 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.244 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.244 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.244 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.244 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.244 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.244 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.244 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.244 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.244 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.245 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.245 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.246 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.247 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '581be7b9-4c1c-4db1-9932-61fbf6f84d22', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T10:08:56.245944', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a9f53c22-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12776.386197558, 'message_signature': 'b067b447287271ddaeeb0b44b93ad87fcd4b0883c68b3d949fc713e5b2685c95'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 
'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T10:08:56.245944', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a9f54bfe-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12776.386197558, 'message_signature': '7fc7b8a1a08467f9f34d60ec61f5ab456d47738e9c700d598abc55fd741d18d4'}]}, 'timestamp': '2026-02-23 10:08:56.246780', '_unique_id': 'f9b9a84ae8704229b3795e33bc08d914'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.247 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.247 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.247 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.247 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.247 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.247 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.247 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.247 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.247 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.247 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.247 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.247 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.247 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.247 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.247 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.247 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.247 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.247 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.247 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.247 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.247 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.247 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.247 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.247 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.247 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.247 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.247 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.247 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.247 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.247 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.247 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.248 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.249 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.250 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'eb3ff36c-bdf0-4725-a607-3c10b92d35cb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T10:08:56.249106', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': 'a9f5b7c4-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12776.379948035, 'message_signature': 'a094e24746ed9b037dc41147886c81841c14e8d83b7209e265d4524713be18e0'}]}, 'timestamp': '2026-02-23 10:08:56.249568', '_unique_id': '2fa164e015754bd487ef2f42fc0d60ae'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.250 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.250 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.250 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.250 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.250 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.250 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.250 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.250 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.250 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.250 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.250 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.250 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.250 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.250 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.250 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.250 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.250 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.250 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.250 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.250 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.250 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.250 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.250 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.250 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.250 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.250 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.250 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.250 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.250 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.250 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.250 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.251 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.251 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.252 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.253 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fd2c4313-e3d1-4520-8f7c-7aa0f51a26ec', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T10:08:56.251646', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a9f61bba-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12776.366260761, 'message_signature': 'bc582ec99d6dd56a97d24c7ecab339602976ea113e8f6ce5b366c8615136d152'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 
'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T10:08:56.251646', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a9f62bb4-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12776.366260761, 'message_signature': '93fc29ba3004824282dfbc2ba13ed420bc5ec6362188aaae45c43a2eab10c8ee'}]}, 'timestamp': '2026-02-23 10:08:56.252505', '_unique_id': 'daf4f9114c344034a87c48834f434445'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.253 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.253 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.253 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.253 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.253 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.253 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.253 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.253 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.253 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.253 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.253 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.253 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.253 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.253 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.253 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.253 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.253 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.253 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.253 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.253 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.253 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.253 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.253 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.253 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.253 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.253 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.253 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.253 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.253 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.253 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.253 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.254 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.254 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.254 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.256 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cd0444b0-c5e1-40dc-8812-ddf558440aaa', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T10:08:56.254741', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': 'a9f694c8-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12776.379948035, 'message_signature': '61b6c0dbd547b51e28566400e9d920174e2d1926ed1a9fd563f457c26e42cc73'}]}, 'timestamp': '2026-02-23 10:08:56.255225', '_unique_id': '18e0c76b132644cb893e3b0af0d09022'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.256 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.256 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.256 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.256 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.256 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.256 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.256 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.256 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.256 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.256 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.256 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.256 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.256 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.256 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.256 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.256 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.256 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.256 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.256 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.256 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.256 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.256 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.256 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.256 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.256 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.256 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.256 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.256 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.256 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.256 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.256 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.257 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.257 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.258 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ae859397-b11a-4ede-b73e-fd67edf9b1d2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T10:08:56.257290', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': 'a9f6f756-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12776.379948035, 'message_signature': 'db128ceecb296bd80da8d0a351f39b810e2651255ff80e2160239f114a4a2647'}]}, 'timestamp': '2026-02-23 10:08:56.257748', '_unique_id': '06b23ba9230f4f1e94919d4505220a64'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.258 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.258 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.258 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.258 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.258 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.258 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.258 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.258 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.258 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.258 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.258 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.258 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.258 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.258 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.258 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.258 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.258 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.258 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.258 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.258 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.258 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.258 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.258 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.258 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.258 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.258 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.258 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.258 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.258 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.258 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.258 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.259 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.259 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.261 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3deb007b-77fb-493b-8a5c-d105d221c750', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T10:08:56.259804', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': 'a9f75afc-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12776.379948035, 'message_signature': '6ce56ce8d968108711e3d6d5f5c4fd0103003ba684267de6b16e0cf075afaf98'}]}, 'timestamp': '2026-02-23 10:08:56.260300', '_unique_id': 'a6db04f669d044fca298bace61e1e362'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.261 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.261 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.261 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.261 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.261 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.261 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.261 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.261 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.261 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.261 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.261 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.261 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.261 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.261 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.261 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.261 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.261 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.261 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.261 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.261 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.261 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.261 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.261 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.261 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.261 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.261 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.261 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.261 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.261 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.261 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.261 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.262 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.262 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.263 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5f02ec09-d1cf-4884-9874-d6ca8788779f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T10:08:56.262353', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': 'a9f7bd44-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12776.379948035, 'message_signature': 'fa286d883b9791a2c32133bcc19de00e3192d942f0c3955fe6c2dbdaa5b15198'}]}, 'timestamp': '2026-02-23 10:08:56.262815', '_unique_id': '82ddf7ad7a8145ca9101abfd714262f9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.263 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.263 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.263 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.263 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.263 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.263 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.263 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.263 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.263 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.263 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.263 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.263 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.263 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.263 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.263 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.263 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.263 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.263 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.263 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.263 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.263 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.263 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.263 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.263 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.263 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.263 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.263 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.263 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.263 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.263 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.263 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.264 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.264 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/cpu volume: 17210000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.265 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e98e97df-f251-4eb0-b847-45f836090e14', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 17210000000, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'timestamp': '2026-02-23T10:08:56.264612', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': 'a9f81190-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12776.362539547, 'message_signature': 'e44701d7f3e779cfa1c69218679ebeced872c7826ff5b9701c1167493d26334d'}]}, 'timestamp': '2026-02-23 10:08:56.264899', '_unique_id': 'a003eaa6596d4c8d9a46257ba4eaebe8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.265 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.265 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.265 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.265 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.265 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.265 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.265 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.265 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.265 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.265 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.265 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.265 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.265 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.265 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.265 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.265 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.265 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.265 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.265 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.265 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.265 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.265 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.265 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.265 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.265 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.265 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.265 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.265 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.265 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.265 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.265 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.265 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.265 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.265 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.265 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.265 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.265 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.265 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.265 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.265 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.265 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.265 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.265 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.265 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.265 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.265 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.265 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.265 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.265 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.265 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.265 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.265 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.265 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.265 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.266 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.266 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.requests volume: 1283 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.266 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.267 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8ec4ca58-4f04-4911-bedc-0f254cfdbe1a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1283, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T10:08:56.266153', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a9f84dd6-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12776.386197558, 'message_signature': 'cbf2b37b19f54b911d9a4c105a55bb95870b2f80395d85a00f3053a4e8ee0379'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 
'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T10:08:56.266153', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a9f857c2-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12776.386197558, 'message_signature': '820fbfc60ccaeb2c743f5540f95ad6dcd902166a043864e68207306ebec6c517'}]}, 'timestamp': '2026-02-23 10:08:56.266666', '_unique_id': '4ad358c149e04ddeb87045cc8cb21b72'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.267 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.267 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.267 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.267 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.267 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.267 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.267 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.267 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.267 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.267 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.267 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.267 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.267 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.267 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.267 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.267 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.267 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.267 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.267 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.267 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.267 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.267 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.267 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.267 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.267 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.267 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.267 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.267 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.267 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.267 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.267 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.268 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.268 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.bytes volume: 6808 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.269 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8f4ea06c-d4d4-419c-8191-e4ae87e06e69', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6808, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T10:08:56.268353', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': 'a9f8a696-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12776.379948035, 'message_signature': '7b1a3213ec939d38b37bdc77ab407140511411b55a5991012d93c45d3b7c694c'}]}, 'timestamp': '2026-02-23 10:08:56.268755', '_unique_id': 'd9ad75be96764fedb1eb427f039f071e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.269 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.269 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.269 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.269 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.269 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.269 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.269 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.269 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.269 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.269 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.269 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.269 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.269 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.269 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.269 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.269 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.269 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.269 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.269 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.269 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.269 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.269 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.269 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.269 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.269 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.269 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.269 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.269 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.269 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.269 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.269 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.270 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.270 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.270 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.bytes volume: 397312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.271 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.272 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ee743eab-b899-460b-af4a-f8052c363285', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 397312, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T10:08:56.270714', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a9f90384-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12776.386197558, 'message_signature': '1046cea73af8b9393d9e24a4c772f042a87a4ceb4f7aef1128d0dda0df632f78'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 
'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T10:08:56.270714', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a9f9132e-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12776.386197558, 'message_signature': '636a0baf906ccfed931399bde931cb3a83c2f4691d9fc9bf2f52c646fbcbfe04'}]}, 'timestamp': '2026-02-23 10:08:56.271527', '_unique_id': '28dfc6db346e446180a6509765b355a0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.272 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.272 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.272 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.272 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.272 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.272 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.272 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.272 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.272 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.272 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.272 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.272 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.272 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.272 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.272 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.272 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.272 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.272 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.272 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.272 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.272 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.272 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.272 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.272 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.272 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.272 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.272 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.272 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.272 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.272 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.272 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.273 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.273 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.273 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.274 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a029a6ad-7b1f-4419-a511-6a82b1003e5e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T10:08:56.273613', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': 'a9f973fa-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12776.379948035, 'message_signature': 'c100fc092a8aff870ce958b3beabc8519ac16e08ae12803842920ec9953b0b33'}]}, 'timestamp': '2026-02-23 10:08:56.274052', '_unique_id': '5056f4334d134e7389e08dff56efc6fc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.274 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.274 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.274 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.274 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.274 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.274 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.274 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.274 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.274 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.274 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.274 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.274 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.274 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.274 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.274 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.274 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.274 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.274 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.274 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.274 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.274 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.274 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.274 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.274 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.274 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.274 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.274 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.274 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.274 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.274 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 10:08:56 np0005626463.localdomain ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.274 12 ERROR oslo_messaging.notify.messaging 
Feb 23 10:08:56 np0005626463.localdomain ceph-mon[294160]: pgmap v714: 177 pgs: 177 active+clean; 226 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Feb 23 10:08:56 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #76. Immutable memtables: 0.
Feb 23 10:08:56 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:08:56.548283) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Feb 23 10:08:56 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/flush_job.cc:856] [default] [JOB 45] Flushing memtable with next log file: 76
Feb 23 10:08:56 np0005626463.localdomain ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771841336548339, "job": 45, "event": "flush_started", "num_memtables": 1, "num_entries": 660, "num_deletes": 251, "total_data_size": 713916, "memory_usage": 726328, "flush_reason": "Manual Compaction"}
Feb 23 10:08:56 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/flush_job.cc:885] [default] [JOB 45] Level-0 flush table #77: started
Feb 23 10:08:56 np0005626463.localdomain ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771841336556342, "cf_name": "default", "job": 45, "event": "table_file_creation", "file_number": 77, "file_size": 704350, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 40698, "largest_seqno": 41357, "table_properties": {"data_size": 701154, "index_size": 1115, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1029, "raw_key_size": 8123, "raw_average_key_size": 20, "raw_value_size": 694445, "raw_average_value_size": 1718, "num_data_blocks": 50, "num_entries": 404, "num_filter_entries": 404, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771841298, "oldest_key_time": 1771841298, "file_creation_time": 1771841336, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4cfd6c8f-aafa-4003-b2f6-d22c49635dd4", "db_session_id": "66DAQ76CBLV8DSGL8JC7", "orig_file_number": 77, "seqno_to_time_mapping": "N/A"}}
Feb 23 10:08:56 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 45] Flush lasted 8100 microseconds, and 3160 cpu microseconds.
Feb 23 10:08:56 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 23 10:08:56 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:08:56.556382) [db/flush_job.cc:967] [default] [JOB 45] Level-0 flush table #77: 704350 bytes OK
Feb 23 10:08:56 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:08:56.556405) [db/memtable_list.cc:519] [default] Level-0 commit table #77 started
Feb 23 10:08:56 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:08:56.558616) [db/memtable_list.cc:722] [default] Level-0 commit table #77: memtable #1 done
Feb 23 10:08:56 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:08:56.558667) EVENT_LOG_v1 {"time_micros": 1771841336558660, "job": 45, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Feb 23 10:08:56 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:08:56.558693) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Feb 23 10:08:56 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 45] Try to delete WAL files size 710401, prev total WAL file size 710401, number of live WAL files 2.
Feb 23 10:08:56 np0005626463.localdomain ceph-mon[294160]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005626463/store.db/000073.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 23 10:08:56 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:08:56.559381) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003133303532' seq:72057594037927935, type:22 .. '7061786F73003133333034' seq:0, type:0; will stop at (end)
Feb 23 10:08:56 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 46] Compacting 1@0 + 1@6 files to L6, score -1.00
Feb 23 10:08:56 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 45 Base level 0, inputs: [77(687KB)], [75(17MB)]
Feb 23 10:08:56 np0005626463.localdomain ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771841336559427, "job": 46, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [77], "files_L6": [75], "score": -1, "input_data_size": 18874748, "oldest_snapshot_seqno": -1}
Feb 23 10:08:56 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 46] Generated table #78: 14430 keys, 17438139 bytes, temperature: kUnknown
Feb 23 10:08:56 np0005626463.localdomain ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771841336674049, "cf_name": "default", "job": 46, "event": "table_file_creation", "file_number": 78, "file_size": 17438139, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 17358252, "index_size": 42730, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 36101, "raw_key_size": 388746, "raw_average_key_size": 26, "raw_value_size": 17115375, "raw_average_value_size": 1186, "num_data_blocks": 1566, "num_entries": 14430, "num_filter_entries": 14430, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771839971, "oldest_key_time": 0, "file_creation_time": 1771841336, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4cfd6c8f-aafa-4003-b2f6-d22c49635dd4", "db_session_id": "66DAQ76CBLV8DSGL8JC7", "orig_file_number": 78, "seqno_to_time_mapping": "N/A"}}
Feb 23 10:08:56 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 23 10:08:56 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:08:56.674386) [db/compaction/compaction_job.cc:1663] [default] [JOB 46] Compacted 1@0 + 1@6 files to L6 => 17438139 bytes
Feb 23 10:08:56 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:08:56.676359) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 164.6 rd, 152.0 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.7, 17.3 +0.0 blob) out(16.6 +0.0 blob), read-write-amplify(51.6) write-amplify(24.8) OK, records in: 14952, records dropped: 522 output_compression: NoCompression
Feb 23 10:08:56 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:08:56.676388) EVENT_LOG_v1 {"time_micros": 1771841336676374, "job": 46, "event": "compaction_finished", "compaction_time_micros": 114692, "compaction_time_cpu_micros": 47131, "output_level": 6, "num_output_files": 1, "total_output_size": 17438139, "num_input_records": 14952, "num_output_records": 14430, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Feb 23 10:08:56 np0005626463.localdomain ceph-mon[294160]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005626463/store.db/000077.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 23 10:08:56 np0005626463.localdomain ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771841336676622, "job": 46, "event": "table_file_deletion", "file_number": 77}
Feb 23 10:08:56 np0005626463.localdomain ceph-mon[294160]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005626463/store.db/000075.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 23 10:08:56 np0005626463.localdomain ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771841336679045, "job": 46, "event": "table_file_deletion", "file_number": 75}
Feb 23 10:08:56 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:08:56.559294) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 23 10:08:56 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:08:56.679113) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 23 10:08:56 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:08:56.679119) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 23 10:08:56 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:08:56.679122) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 23 10:08:56 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:08:56.679126) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 23 10:08:56 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:08:56.679129) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 23 10:08:57 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.
Feb 23 10:08:57 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.
Feb 23 10:08:57 np0005626463.localdomain podman[324129]: 2026-02-23 10:08:57.908810794 +0000 UTC m=+0.079836029 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.43.0, managed_by=edpm_ansible)
Feb 23 10:08:57 np0005626463.localdomain podman[324130]: 2026-02-23 10:08:57.962086481 +0000 UTC m=+0.130540587 container health_status bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Feb 23 10:08:57 np0005626463.localdomain podman[324130]: 2026-02-23 10:08:57.974465113 +0000 UTC m=+0.142919219 container exec_died bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 23 10:08:57 np0005626463.localdomain systemd[1]: bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.service: Deactivated successfully.
Feb 23 10:08:58 np0005626463.localdomain podman[324129]: 2026-02-23 10:08:58.025047527 +0000 UTC m=+0.196072802 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Feb 23 10:08:58 np0005626463.localdomain systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully.
Feb 23 10:08:58 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 23 10:08:58 np0005626463.localdomain ceph-mon[294160]: pgmap v715: 177 pgs: 177 active+clean; 226 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Feb 23 10:08:59 np0005626463.localdomain sshd[324176]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 10:08:59 np0005626463.localdomain sshd[324176]: Accepted publickey for zuul from 38.102.83.114 port 56006 ssh2: RSA SHA256:/ShS2J5Dq7o9P59e/NmgQORSAcJOBwu46Huo03HBdB4
Feb 23 10:08:59 np0005626463.localdomain systemd-logind[759]: New session 72 of user zuul.
Feb 23 10:08:59 np0005626463.localdomain systemd[1]: Started Session 72 of User zuul.
Feb 23 10:08:59 np0005626463.localdomain sshd[324176]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Feb 23 10:08:59 np0005626463.localdomain ovn_controller[157695]: 2026-02-23T10:08:59Z|00378|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Feb 23 10:08:59 np0005626463.localdomain sudo[324196]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lgnqtqakuahjsygvodhknwzbdsscuumm ; /usr/bin/python3
Feb 23 10:08:59 np0005626463.localdomain sudo[324196]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 23 10:08:59 np0005626463.localdomain python3[324198]: ansible-ansible.legacy.command Invoked with _raw_params=subscription-manager unregister
                                                           _uses_shell=True zuul_log_id=fa163ef9-e89a-f788-b23e-00000000000c-1-overcloudnovacompute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 10:09:00 np0005626463.localdomain sudo[324196]: pam_unix(sudo:session): session closed for user root
Feb 23 10:09:00 np0005626463.localdomain ceph-mon[294160]: pgmap v716: 177 pgs: 177 active+clean; 226 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Feb 23 10:09:01 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:09:01.019 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 10:09:01 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:09:01.021 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 10:09:01 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:09:01.022 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 10:09:01 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:09:01.022 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 10:09:01 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:09:01.053 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:09:01 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:09:01.053 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 10:09:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb 23 10:09:01 np0005626463.localdomain ceph-osd[31633]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 9000.1 total, 600.0 interval
                                                          Cumulative writes: 21K writes, 79K keys, 21K commit groups, 1.0 writes per commit group, ingest: 0.07 GB, 0.01 MB/s
                                                          Cumulative WAL: 21K writes, 7586 syncs, 2.88 writes per sync, written: 0.07 GB, 0.01 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 11K writes, 39K keys, 11K commit groups, 1.0 writes per commit group, ingest: 36.14 MB, 0.06 MB/s
                                                          Interval WAL: 11K writes, 4718 syncs, 2.46 writes per sync, written: 0.04 GB, 0.06 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Feb 23 10:09:02 np0005626463.localdomain ceph-mon[294160]: pgmap v717: 177 pgs: 177 active+clean; 226 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Feb 23 10:09:03 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 23 10:09:03 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.
Feb 23 10:09:03 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.
Feb 23 10:09:03 np0005626463.localdomain podman[324202]: 2026-02-23 10:09:03.922749651 +0000 UTC m=+0.092438979 container health_status be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_id=ceilometer_agent_compute, io.buildah.version=1.43.0, managed_by=edpm_ansible, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, container_name=ceilometer_agent_compute)
Feb 23 10:09:03 np0005626463.localdomain podman[324202]: 2026-02-23 10:09:03.934359181 +0000 UTC m=+0.104048529 container exec_died be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20260216, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2)
Feb 23 10:09:03 np0005626463.localdomain systemd[1]: be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.service: Deactivated successfully.
Feb 23 10:09:04 np0005626463.localdomain podman[324201]: 2026-02-23 10:09:04.029403939 +0000 UTC m=+0.199056865 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, io.buildah.version=1.43.0, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Feb 23 10:09:04 np0005626463.localdomain podman[324201]: 2026-02-23 10:09:04.038279783 +0000 UTC m=+0.207932719 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Feb 23 10:09:04 np0005626463.localdomain systemd[1]: 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully.
Feb 23 10:09:04 np0005626463.localdomain ceph-mon[294160]: pgmap v718: 177 pgs: 177 active+clean; 226 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Feb 23 10:09:05 np0005626463.localdomain sshd[324176]: pam_unix(sshd:session): session closed for user zuul
Feb 23 10:09:05 np0005626463.localdomain systemd[1]: session-72.scope: Deactivated successfully.
Feb 23 10:09:05 np0005626463.localdomain systemd-logind[759]: Session 72 logged out. Waiting for processes to exit.
Feb 23 10:09:05 np0005626463.localdomain systemd-logind[759]: Removed session 72.
Feb 23 10:09:06 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:09:06.054 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 10:09:06 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:09:06.056 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 10:09:06 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:09:06.056 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 10:09:06 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:09:06.056 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 10:09:06 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:09:06.073 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:09:06 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:09:06.074 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 10:09:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb 23 10:09:06 np0005626463.localdomain ceph-osd[32575]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 9000.1 total, 600.0 interval
                                                          Cumulative writes: 21K writes, 79K keys, 21K commit groups, 1.0 writes per commit group, ingest: 0.07 GB, 0.01 MB/s
                                                          Cumulative WAL: 21K writes, 7360 syncs, 2.95 writes per sync, written: 0.07 GB, 0.01 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 11K writes, 39K keys, 11K commit groups, 1.0 writes per commit group, ingest: 33.66 MB, 0.06 MB/s
                                                          Interval WAL: 11K writes, 4546 syncs, 2.50 writes per sync, written: 0.03 GB, 0.06 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Feb 23 10:09:06 np0005626463.localdomain ceph-mon[294160]: pgmap v719: 177 pgs: 177 active+clean; 226 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Feb 23 10:09:08 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 23 10:09:08 np0005626463.localdomain ceph-mon[294160]: pgmap v720: 177 pgs: 177 active+clean; 226 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Feb 23 10:09:09 np0005626463.localdomain podman[242954]: time="2026-02-23T10:09:09Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 23 10:09:09 np0005626463.localdomain podman[242954]: @ - - [23/Feb/2026:10:09:09 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 157081 "" "Go-http-client/1.1"
Feb 23 10:09:09 np0005626463.localdomain podman[242954]: @ - - [23/Feb/2026:10:09:09 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18833 "" "Go-http-client/1.1"
Feb 23 10:09:10 np0005626463.localdomain ceph-mon[294160]: pgmap v721: 177 pgs: 177 active+clean; 226 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Feb 23 10:09:11 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:09:11.075 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 10:09:11 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:09:11.103 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 10:09:11 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:09:11.104 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5029 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 10:09:11 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:09:11.104 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 10:09:11 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:09:11.105 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:09:11 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:09:11.106 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 10:09:12 np0005626463.localdomain ceph-mon[294160]: pgmap v722: 177 pgs: 177 active+clean; 226 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Feb 23 10:09:13 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 23 10:09:13 np0005626463.localdomain openstack_network_exporter[245358]: ERROR   10:09:13 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 23 10:09:13 np0005626463.localdomain openstack_network_exporter[245358]: 
Feb 23 10:09:13 np0005626463.localdomain openstack_network_exporter[245358]: ERROR   10:09:13 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 23 10:09:13 np0005626463.localdomain openstack_network_exporter[245358]: 
Feb 23 10:09:14 np0005626463.localdomain ceph-mon[294160]: pgmap v723: 177 pgs: 177 active+clean; 226 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Feb 23 10:09:14 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.
Feb 23 10:09:14 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.
Feb 23 10:09:14 np0005626463.localdomain systemd[1]: tmp-crun.HCEgzB.mount: Deactivated successfully.
Feb 23 10:09:14 np0005626463.localdomain podman[324239]: 2026-02-23 10:09:14.921284012 +0000 UTC m=+0.090092417 container health_status da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 23 10:09:14 np0005626463.localdomain podman[324239]: 2026-02-23 10:09:14.955635333 +0000 UTC m=+0.124443728 container exec_died da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 23 10:09:14 np0005626463.localdomain podman[324238]: 2026-02-23 10:09:14.970697759 +0000 UTC m=+0.142947410 container health_status 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, version=9.7, container_name=openstack_network_exporter, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.buildah.version=1.33.7, distribution-scope=public, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, release=1770267347, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, maintainer=Red Hat, Inc., vcs-type=git, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, managed_by=edpm_ansible, io.openshift.expose-services=, org.opencontainers.image.created=2026-02-05T04:57:10Z, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-02-05T04:57:10Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9)
Feb 23 10:09:14 np0005626463.localdomain systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Deactivated successfully.
Feb 23 10:09:14 np0005626463.localdomain podman[324238]: 2026-02-23 10:09:14.989394247 +0000 UTC m=+0.161643938 container exec_died 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, version=9.7, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, build-date=2026-02-05T04:57:10Z, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, release=1770267347, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, maintainer=Red Hat, Inc., org.opencontainers.image.created=2026-02-05T04:57:10Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9/ubi-minimal, vcs-type=git)
Feb 23 10:09:15 np0005626463.localdomain systemd[1]: 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.service: Deactivated successfully.
Feb 23 10:09:16 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:09:16.107 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 10:09:16 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:09:16.109 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 10:09:16 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:09:16.109 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 10:09:16 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:09:16.109 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 10:09:16 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:09:16.141 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:09:16 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:09:16.141 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 10:09:16 np0005626463.localdomain ceph-mon[294160]: pgmap v724: 177 pgs: 177 active+clean; 226 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Feb 23 10:09:17 np0005626463.localdomain sshd[324282]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 10:09:17 np0005626463.localdomain sshd[324282]: Accepted publickey for zuul from 38.102.83.114 port 34872 ssh2: RSA SHA256:/ShS2J5Dq7o9P59e/NmgQORSAcJOBwu46Huo03HBdB4
Feb 23 10:09:17 np0005626463.localdomain systemd-logind[759]: New session 73 of user zuul.
Feb 23 10:09:17 np0005626463.localdomain systemd[1]: Started Session 73 of User zuul.
Feb 23 10:09:17 np0005626463.localdomain sshd[324282]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Feb 23 10:09:17 np0005626463.localdomain sudo[324286]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/rsync --server --sender -lLogDtprze.LsfxC . /var/log
Feb 23 10:09:17 np0005626463.localdomain sudo[324286]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 23 10:09:17 np0005626463.localdomain ceph-mon[294160]: pgmap v725: 177 pgs: 177 active+clean; 226 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Feb 23 10:09:17 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/763389869' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 23 10:09:17 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/763389869' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 23 10:09:18 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 23 10:09:18 np0005626463.localdomain sudo[324286]: pam_unix(sudo:session): session closed for user root
Feb 23 10:09:18 np0005626463.localdomain sshd[324285]: Received disconnect from 38.102.83.114 port 34872:11: disconnected by user
Feb 23 10:09:18 np0005626463.localdomain sshd[324285]: Disconnected from user zuul 38.102.83.114 port 34872
Feb 23 10:09:18 np0005626463.localdomain sshd[324282]: pam_unix(sshd:session): session closed for user zuul
Feb 23 10:09:18 np0005626463.localdomain systemd[1]: session-73.scope: Deactivated successfully.
Feb 23 10:09:18 np0005626463.localdomain systemd[1]: session-73.scope: Consumed 1.125s CPU time.
Feb 23 10:09:18 np0005626463.localdomain systemd-logind[759]: Session 73 logged out. Waiting for processes to exit.
Feb 23 10:09:18 np0005626463.localdomain systemd-logind[759]: Removed session 73.
Feb 23 10:09:19 np0005626463.localdomain sshd[324304]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 10:09:19 np0005626463.localdomain sshd[324304]: Accepted publickey for zuul from 38.102.83.114 port 34884 ssh2: RSA SHA256:/ShS2J5Dq7o9P59e/NmgQORSAcJOBwu46Huo03HBdB4
Feb 23 10:09:19 np0005626463.localdomain systemd-logind[759]: New session 74 of user zuul.
Feb 23 10:09:19 np0005626463.localdomain systemd[1]: Started Session 74 of User zuul.
Feb 23 10:09:19 np0005626463.localdomain sshd[324304]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Feb 23 10:09:19 np0005626463.localdomain sudo[324308]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/rsync --server --sender -lLogDtprze.LsfxC . /etc/containers/networks
Feb 23 10:09:19 np0005626463.localdomain sudo[324308]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 23 10:09:19 np0005626463.localdomain sudo[324308]: pam_unix(sudo:session): session closed for user root
Feb 23 10:09:19 np0005626463.localdomain sshd[324307]: Received disconnect from 38.102.83.114 port 34884:11: disconnected by user
Feb 23 10:09:19 np0005626463.localdomain sshd[324307]: Disconnected from user zuul 38.102.83.114 port 34884
Feb 23 10:09:19 np0005626463.localdomain sshd[324304]: pam_unix(sshd:session): session closed for user zuul
Feb 23 10:09:19 np0005626463.localdomain systemd[1]: session-74.scope: Deactivated successfully.
Feb 23 10:09:19 np0005626463.localdomain systemd-logind[759]: Session 74 logged out. Waiting for processes to exit.
Feb 23 10:09:19 np0005626463.localdomain systemd-logind[759]: Removed session 74.
Feb 23 10:09:19 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:09:19.818 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:09:19 np0005626463.localdomain sshd[324326]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 10:09:20 np0005626463.localdomain sshd[324326]: Accepted publickey for zuul from 38.102.83.114 port 34886 ssh2: RSA SHA256:/ShS2J5Dq7o9P59e/NmgQORSAcJOBwu46Huo03HBdB4
Feb 23 10:09:20 np0005626463.localdomain systemd-logind[759]: New session 75 of user zuul.
Feb 23 10:09:20 np0005626463.localdomain systemd[1]: Started Session 75 of User zuul.
Feb 23 10:09:20 np0005626463.localdomain sshd[324326]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Feb 23 10:09:20 np0005626463.localdomain sudo[324330]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/rsync --server --sender -lLogDtprze.LsfxC . /etc/containers/containers.conf
Feb 23 10:09:20 np0005626463.localdomain sudo[324330]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 23 10:09:20 np0005626463.localdomain sudo[324330]: pam_unix(sudo:session): session closed for user root
Feb 23 10:09:20 np0005626463.localdomain sshd[324329]: Received disconnect from 38.102.83.114 port 34886:11: disconnected by user
Feb 23 10:09:20 np0005626463.localdomain sshd[324329]: Disconnected from user zuul 38.102.83.114 port 34886
Feb 23 10:09:20 np0005626463.localdomain sshd[324326]: pam_unix(sshd:session): session closed for user zuul
Feb 23 10:09:20 np0005626463.localdomain systemd[1]: session-75.scope: Deactivated successfully.
Feb 23 10:09:20 np0005626463.localdomain systemd-logind[759]: Session 75 logged out. Waiting for processes to exit.
Feb 23 10:09:20 np0005626463.localdomain systemd-logind[759]: Removed session 75.
Feb 23 10:09:20 np0005626463.localdomain sshd[324348]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 10:09:20 np0005626463.localdomain ceph-mon[294160]: pgmap v726: 177 pgs: 177 active+clean; 226 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Feb 23 10:09:20 np0005626463.localdomain sshd[324348]: Accepted publickey for zuul from 38.102.83.114 port 34892 ssh2: RSA SHA256:/ShS2J5Dq7o9P59e/NmgQORSAcJOBwu46Huo03HBdB4
Feb 23 10:09:20 np0005626463.localdomain systemd-logind[759]: New session 76 of user zuul.
Feb 23 10:09:20 np0005626463.localdomain systemd[1]: Started Session 76 of User zuul.
Feb 23 10:09:20 np0005626463.localdomain sshd[324348]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Feb 23 10:09:20 np0005626463.localdomain sudo[324352]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/rsync --server --sender -lLogDtprze.LsfxC . /etc/ceph
Feb 23 10:09:20 np0005626463.localdomain sudo[324352]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 23 10:09:20 np0005626463.localdomain sudo[324352]: pam_unix(sudo:session): session closed for user root
Feb 23 10:09:20 np0005626463.localdomain sshd[324351]: Received disconnect from 38.102.83.114 port 34892:11: disconnected by user
Feb 23 10:09:20 np0005626463.localdomain sshd[324351]: Disconnected from user zuul 38.102.83.114 port 34892
Feb 23 10:09:20 np0005626463.localdomain sshd[324348]: pam_unix(sshd:session): session closed for user zuul
Feb 23 10:09:20 np0005626463.localdomain systemd[1]: session-76.scope: Deactivated successfully.
Feb 23 10:09:20 np0005626463.localdomain systemd-logind[759]: Session 76 logged out. Waiting for processes to exit.
Feb 23 10:09:20 np0005626463.localdomain systemd-logind[759]: Removed session 76.
Feb 23 10:09:21 np0005626463.localdomain sshd[324370]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 10:09:21 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:09:21.142 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 10:09:21 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:09:21.145 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 10:09:21 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:09:21.145 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 10:09:21 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:09:21.146 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 10:09:21 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:09:21.187 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:09:21 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:09:21.187 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 10:09:21 np0005626463.localdomain sshd[324370]: Accepted publickey for zuul from 38.102.83.114 port 34896 ssh2: RSA SHA256:/ShS2J5Dq7o9P59e/NmgQORSAcJOBwu46Huo03HBdB4
Feb 23 10:09:21 np0005626463.localdomain systemd-logind[759]: New session 77 of user zuul.
Feb 23 10:09:21 np0005626463.localdomain systemd[1]: Started Session 77 of User zuul.
Feb 23 10:09:21 np0005626463.localdomain sshd[324370]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Feb 23 10:09:21 np0005626463.localdomain sudo[324374]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/rsync --server --sender -lLogDtprze.LsfxC . /etc/ci
Feb 23 10:09:21 np0005626463.localdomain sudo[324374]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 23 10:09:21 np0005626463.localdomain sudo[324374]: pam_unix(sudo:session): session closed for user root
Feb 23 10:09:21 np0005626463.localdomain sshd[324373]: Received disconnect from 38.102.83.114 port 34896:11: disconnected by user
Feb 23 10:09:21 np0005626463.localdomain sshd[324373]: Disconnected from user zuul 38.102.83.114 port 34896
Feb 23 10:09:21 np0005626463.localdomain sshd[324370]: pam_unix(sshd:session): session closed for user zuul
Feb 23 10:09:21 np0005626463.localdomain systemd[1]: session-77.scope: Deactivated successfully.
Feb 23 10:09:21 np0005626463.localdomain systemd-logind[759]: Session 77 logged out. Waiting for processes to exit.
Feb 23 10:09:21 np0005626463.localdomain systemd-logind[759]: Removed session 77.
Feb 23 10:09:21 np0005626463.localdomain sshd[324392]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 10:09:21 np0005626463.localdomain sshd[324392]: Accepted publickey for zuul from 38.102.83.114 port 34910 ssh2: RSA SHA256:/ShS2J5Dq7o9P59e/NmgQORSAcJOBwu46Huo03HBdB4
Feb 23 10:09:21 np0005626463.localdomain systemd-logind[759]: New session 78 of user zuul.
Feb 23 10:09:21 np0005626463.localdomain systemd[1]: Started Session 78 of User zuul.
Feb 23 10:09:21 np0005626463.localdomain sshd[324392]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Feb 23 10:09:21 np0005626463.localdomain sudo[324396]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/rsync --server --sender -lLogDtprze.LsfxC . /etc/yum.conf
Feb 23 10:09:21 np0005626463.localdomain sudo[324396]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 23 10:09:21 np0005626463.localdomain sudo[324396]: pam_unix(sudo:session): session closed for user root
Feb 23 10:09:21 np0005626463.localdomain sshd[324395]: Received disconnect from 38.102.83.114 port 34910:11: disconnected by user
Feb 23 10:09:21 np0005626463.localdomain sshd[324395]: Disconnected from user zuul 38.102.83.114 port 34910
Feb 23 10:09:21 np0005626463.localdomain sshd[324392]: pam_unix(sshd:session): session closed for user zuul
Feb 23 10:09:21 np0005626463.localdomain systemd[1]: session-78.scope: Deactivated successfully.
Feb 23 10:09:21 np0005626463.localdomain systemd-logind[759]: Session 78 logged out. Waiting for processes to exit.
Feb 23 10:09:21 np0005626463.localdomain systemd-logind[759]: Removed session 78.
Feb 23 10:09:22 np0005626463.localdomain sshd[324414]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 10:09:22 np0005626463.localdomain sshd[324414]: Accepted publickey for zuul from 38.102.83.114 port 34924 ssh2: RSA SHA256:/ShS2J5Dq7o9P59e/NmgQORSAcJOBwu46Huo03HBdB4
Feb 23 10:09:22 np0005626463.localdomain systemd-logind[759]: New session 79 of user zuul.
Feb 23 10:09:22 np0005626463.localdomain systemd[1]: Started Session 79 of User zuul.
Feb 23 10:09:22 np0005626463.localdomain sshd[324414]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Feb 23 10:09:22 np0005626463.localdomain sudo[324418]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/rsync --server --sender -lLogDtprze.LsfxC . /etc/yum.repos.d
Feb 23 10:09:22 np0005626463.localdomain sudo[324418]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 23 10:09:22 np0005626463.localdomain sudo[324418]: pam_unix(sudo:session): session closed for user root
Feb 23 10:09:22 np0005626463.localdomain sshd[324417]: Received disconnect from 38.102.83.114 port 34924:11: disconnected by user
Feb 23 10:09:22 np0005626463.localdomain sshd[324417]: Disconnected from user zuul 38.102.83.114 port 34924
Feb 23 10:09:22 np0005626463.localdomain sshd[324414]: pam_unix(sshd:session): session closed for user zuul
Feb 23 10:09:22 np0005626463.localdomain systemd[1]: session-79.scope: Deactivated successfully.
Feb 23 10:09:22 np0005626463.localdomain systemd-logind[759]: Session 79 logged out. Waiting for processes to exit.
Feb 23 10:09:22 np0005626463.localdomain systemd-logind[759]: Removed session 79.
Feb 23 10:09:22 np0005626463.localdomain ceph-mon[294160]: pgmap v727: 177 pgs: 177 active+clean; 226 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Feb 23 10:09:22 np0005626463.localdomain sshd[324436]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 10:09:22 np0005626463.localdomain sshd[324436]: Accepted publickey for zuul from 38.102.83.114 port 34936 ssh2: RSA SHA256:/ShS2J5Dq7o9P59e/NmgQORSAcJOBwu46Huo03HBdB4
Feb 23 10:09:22 np0005626463.localdomain systemd-logind[759]: New session 80 of user zuul.
Feb 23 10:09:22 np0005626463.localdomain systemd[1]: Started Session 80 of User zuul.
Feb 23 10:09:22 np0005626463.localdomain sshd[324436]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Feb 23 10:09:23 np0005626463.localdomain sudo[324440]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/rsync --server --sender -lLogDtprze.LsfxC . /etc/os-net-config
Feb 23 10:09:23 np0005626463.localdomain sudo[324440]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 23 10:09:23 np0005626463.localdomain sudo[324440]: pam_unix(sudo:session): session closed for user root
Feb 23 10:09:23 np0005626463.localdomain sshd[324439]: Received disconnect from 38.102.83.114 port 34936:11: disconnected by user
Feb 23 10:09:23 np0005626463.localdomain sshd[324439]: Disconnected from user zuul 38.102.83.114 port 34936
Feb 23 10:09:23 np0005626463.localdomain sshd[324436]: pam_unix(sshd:session): session closed for user zuul
Feb 23 10:09:23 np0005626463.localdomain systemd[1]: session-80.scope: Deactivated successfully.
Feb 23 10:09:23 np0005626463.localdomain systemd-logind[759]: Session 80 logged out. Waiting for processes to exit.
Feb 23 10:09:23 np0005626463.localdomain systemd-logind[759]: Removed session 80.
Feb 23 10:09:23 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 23 10:09:23 np0005626463.localdomain sshd[324458]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 10:09:23 np0005626463.localdomain sshd[324458]: Accepted publickey for zuul from 38.102.83.114 port 34946 ssh2: RSA SHA256:/ShS2J5Dq7o9P59e/NmgQORSAcJOBwu46Huo03HBdB4
Feb 23 10:09:23 np0005626463.localdomain systemd-logind[759]: New session 81 of user zuul.
Feb 23 10:09:23 np0005626463.localdomain systemd[1]: Started Session 81 of User zuul.
Feb 23 10:09:23 np0005626463.localdomain sshd[324458]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Feb 23 10:09:23 np0005626463.localdomain sudo[324462]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/rsync --server --sender -lLogDtprze.LsfxC . /home/zuul/ansible_hostname
Feb 23 10:09:23 np0005626463.localdomain sudo[324462]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 23 10:09:23 np0005626463.localdomain sudo[324462]: pam_unix(sudo:session): session closed for user root
Feb 23 10:09:23 np0005626463.localdomain sshd[324461]: Received disconnect from 38.102.83.114 port 34946:11: disconnected by user
Feb 23 10:09:23 np0005626463.localdomain sshd[324461]: Disconnected from user zuul 38.102.83.114 port 34946
Feb 23 10:09:23 np0005626463.localdomain sshd[324458]: pam_unix(sshd:session): session closed for user zuul
Feb 23 10:09:23 np0005626463.localdomain systemd[1]: session-81.scope: Deactivated successfully.
Feb 23 10:09:23 np0005626463.localdomain systemd-logind[759]: Session 81 logged out. Waiting for processes to exit.
Feb 23 10:09:23 np0005626463.localdomain systemd-logind[759]: Removed session 81.
Feb 23 10:09:24 np0005626463.localdomain ceph-mon[294160]: pgmap v728: 177 pgs: 177 active+clean; 226 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Feb 23 10:09:26 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:09:26.189 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 10:09:26 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:09:26.191 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:09:26 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:09:26.191 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 10:09:26 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:09:26.191 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 10:09:26 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:09:26.192 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 10:09:26 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:09:26.196 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 10:09:26 np0005626463.localdomain ceph-mon[294160]: pgmap v729: 177 pgs: 177 active+clean; 226 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Feb 23 10:09:28 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:09:28.055 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:09:28 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:09:28.055 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Feb 23 10:09:28 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 23 10:09:28 np0005626463.localdomain ceph-mon[294160]: pgmap v730: 177 pgs: 177 active+clean; 226 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Feb 23 10:09:28 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.
Feb 23 10:09:28 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.
Feb 23 10:09:28 np0005626463.localdomain systemd[1]: tmp-crun.tOyCCQ.mount: Deactivated successfully.
Feb 23 10:09:28 np0005626463.localdomain podman[324480]: 2026-02-23 10:09:28.944127716 +0000 UTC m=+0.106202135 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.build-date=20260216, tcib_managed=true, org.label-schema.license=GPLv2)
Feb 23 10:09:29 np0005626463.localdomain podman[324481]: 2026-02-23 10:09:29.041072723 +0000 UTC m=+0.202602304 container health_status bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Feb 23 10:09:29 np0005626463.localdomain podman[324481]: 2026-02-23 10:09:29.054246931 +0000 UTC m=+0.215776542 container exec_died bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Feb 23 10:09:29 np0005626463.localdomain podman[324480]: 2026-02-23 10:09:29.068165301 +0000 UTC m=+0.230239690 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20260216, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.license=GPLv2, io.buildah.version=1.43.0, tcib_managed=true)
Feb 23 10:09:29 np0005626463.localdomain systemd[1]: bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.service: Deactivated successfully.
Feb 23 10:09:29 np0005626463.localdomain systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully.
Feb 23 10:09:30 np0005626463.localdomain ceph-mon[294160]: pgmap v731: 177 pgs: 177 active+clean; 226 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Feb 23 10:09:31 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:09:31.069 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:09:31 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:09:31.069 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 23 10:09:31 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:09:31.195 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 10:09:31 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:09:31.197 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 10:09:31 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:09:31.197 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 10:09:31 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:09:31.197 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 10:09:31 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:09:31.226 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:09:31 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:09:31.227 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 10:09:32 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:09:32.055 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:09:32 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:09:32.055 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 23 10:09:32 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:09:32.056 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 23 10:09:32 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:09:32.472 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 23 10:09:32 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:09:32.473 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquired lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 23 10:09:32 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:09:32.473 282211 DEBUG nova.network.neutron [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 23 10:09:32 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:09:32.473 282211 DEBUG nova.objects.instance [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c2a7d92b-952f-46a7-8a6a-3322a48fcf4b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 23 10:09:32 np0005626463.localdomain ceph-mon[294160]: pgmap v732: 177 pgs: 177 active+clean; 226 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Feb 23 10:09:32 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:09:32.817 282211 DEBUG nova.network.neutron [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Updating instance_info_cache with network_info: [{"id": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "address": "fa:16:3e:a0:9d:00", "network": {"id": "9da5b53d-3184-450f-9a5b-bdba1a6c9f6d", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "37b8098efb0d4ecc90b451a2db0e966f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa27e5011-20", "ovs_interfaceid": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 23 10:09:32 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:09:32.832 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Releasing lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 23 10:09:32 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:09:32.832 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 23 10:09:33 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 23 10:09:34 np0005626463.localdomain ceph-mon[294160]: pgmap v733: 177 pgs: 177 active+clean; 226 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Feb 23 10:09:34 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.
Feb 23 10:09:34 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.
Feb 23 10:09:34 np0005626463.localdomain podman[324529]: 2026-02-23 10:09:34.916638292 +0000 UTC m=+0.086316870 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20260216, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Feb 23 10:09:34 np0005626463.localdomain podman[324529]: 2026-02-23 10:09:34.926106245 +0000 UTC m=+0.095784783 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20260216, tcib_managed=true, config_id=ovn_metadata_agent)
Feb 23 10:09:34 np0005626463.localdomain systemd[1]: 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully.
Feb 23 10:09:34 np0005626463.localdomain podman[324530]: 2026-02-23 10:09:34.968891218 +0000 UTC m=+0.135799180 container health_status be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20260216, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Feb 23 10:09:34 np0005626463.localdomain podman[324530]: 2026-02-23 10:09:34.98221995 +0000 UTC m=+0.149127882 container exec_died be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=8419493e1fd846703d277695e03fc5eb)
Feb 23 10:09:34 np0005626463.localdomain systemd[1]: be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.service: Deactivated successfully.
Feb 23 10:09:35 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.108:0/2170495665' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 10:09:36 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:09:36.228 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 10:09:36 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:09:36.230 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 10:09:36 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:09:36.230 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 10:09:36 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:09:36.230 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 10:09:36 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:09:36.267 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:09:36 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:09:36.268 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 10:09:36 np0005626463.localdomain ceph-mon[294160]: pgmap v734: 177 pgs: 177 active+clean; 226 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Feb 23 10:09:36 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.108:0/3582362366' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 10:09:36 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [DBG] : mgrmap e53: np0005626465.hlpkwo(active, since 18m), standbys: np0005626463.wtksup, np0005626466.nisqfq
Feb 23 10:09:36 np0005626463.localdomain sudo[324564]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 10:09:36 np0005626463.localdomain sudo[324564]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 10:09:36 np0005626463.localdomain sudo[324564]: pam_unix(sudo:session): session closed for user root
Feb 23 10:09:36 np0005626463.localdomain sudo[324582]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 23 10:09:36 np0005626463.localdomain sudo[324582]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 10:09:37 np0005626463.localdomain ceph-mon[294160]: mgrmap e53: np0005626465.hlpkwo(active, since 18m), standbys: np0005626463.wtksup, np0005626466.nisqfq
Feb 23 10:09:37 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.107:0/1764422679' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 10:09:37 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:09:37.828 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:09:37 np0005626463.localdomain sudo[324582]: pam_unix(sudo:session): session closed for user root
Feb 23 10:09:37 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Feb 23 10:09:37 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 10:09:37 np0005626463.localdomain sudo[324633]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 10:09:37 np0005626463.localdomain sudo[324633]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 10:09:37 np0005626463.localdomain sudo[324633]: pam_unix(sudo:session): session closed for user root
Feb 23 10:09:38 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:09:38.054 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:09:38 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:09:38.055 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:09:38 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 23 10:09:38 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.107:0/1259935457' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 10:09:38 np0005626463.localdomain ceph-mon[294160]: pgmap v735: 177 pgs: 177 active+clean; 226 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 426 B/s wr, 0 op/s
Feb 23 10:09:38 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 10:09:38 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 23 10:09:38 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 10:09:38 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 23 10:09:39 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:09:39.055 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:09:39 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:09:39.055 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:09:39 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:09:39.078 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 10:09:39 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:09:39.079 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 10:09:39 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:09:39.079 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 10:09:39 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:09:39.079 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Auditing locally available compute resources for np0005626463.localdomain (node: np0005626463.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 23 10:09:39 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:09:39.080 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 10:09:39 np0005626463.localdomain podman[242954]: time="2026-02-23T10:09:39Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 23 10:09:39 np0005626463.localdomain podman[242954]: @ - - [23/Feb/2026:10:09:39 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 157081 "" "Go-http-client/1.1"
Feb 23 10:09:39 np0005626463.localdomain podman[242954]: @ - - [23/Feb/2026:10:09:39 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18837 "" "Go-http-client/1.1"
Feb 23 10:09:39 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 23 10:09:39 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/2929933948' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 10:09:39 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:09:39.546 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.466s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 10:09:39 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.106:0/2929933948' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 10:09:39 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:09:39.616 282211 DEBUG nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 23 10:09:39 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:09:39.617 282211 DEBUG nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 23 10:09:39 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:09:39.851 282211 WARNING nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 23 10:09:39 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:09:39.853 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Hypervisor/Node resource view: name=np0005626463.localdomain free_ram=11192MB free_disk=41.8366584777832GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 23 10:09:39 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:09:39.853 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 10:09:39 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:09:39.854 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 10:09:39 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:09:39.978 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Instance c2a7d92b-952f-46a7-8a6a-3322a48fcf4b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 23 10:09:39 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:09:39.979 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 23 10:09:39 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:09:39.980 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Final resource view: name=np0005626463.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 23 10:09:40 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:09:40.025 282211 DEBUG nova.scheduler.client.report [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Refreshing inventories for resource provider be63d86c-a403-4ec9-a515-07ea2962cb4d _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Feb 23 10:09:40 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:09:40.090 282211 DEBUG nova.scheduler.client.report [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Updating ProviderTree inventory for provider be63d86c-a403-4ec9-a515-07ea2962cb4d from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Feb 23 10:09:40 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:09:40.091 282211 DEBUG nova.compute.provider_tree [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Updating inventory in ProviderTree for provider be63d86c-a403-4ec9-a515-07ea2962cb4d with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Feb 23 10:09:40 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:09:40.113 282211 DEBUG nova.scheduler.client.report [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Refreshing aggregate associations for resource provider be63d86c-a403-4ec9-a515-07ea2962cb4d, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Feb 23 10:09:40 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:09:40.145 282211 DEBUG nova.scheduler.client.report [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Refreshing trait associations for resource provider be63d86c-a403-4ec9-a515-07ea2962cb4d, traits: HW_CPU_X86_AVX2,COMPUTE_ACCELERATORS,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_MMX,HW_CPU_X86_SSE41,HW_CPU_X86_SSE2,HW_CPU_X86_SVM,COMPUTE_TRUSTED_CERTS,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_SECURITY_TPM_2_0,COMPUTE_STORAGE_BUS_FDC,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_E1000,HW_CPU_X86_BMI2,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_RESCUE_BFV,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_CLMUL,HW_CPU_X86_SHA,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_AESNI,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_DEVICE_TAGGING,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_ABM,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NODE,COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_CIRRUS,HW_CPU_X86_SSE4A,HW_CPU_X86_BMI,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_F16C,COMPUTE_STORAGE_BUS_IDE,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_SSSE3,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_FMA3,HW_CPU_X86_AMD_SVM,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_AVX,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_SSE42 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Feb 23 10:09:40 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:09:40.185 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 10:09:40 np0005626463.localdomain ceph-mon[294160]: pgmap v736: 177 pgs: 177 active+clean; 226 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 426 B/s wr, 0 op/s
Feb 23 10:09:40 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 23 10:09:40 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/3085463583' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 10:09:40 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:09:40.658 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.472s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 10:09:40 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:09:40.665 282211 DEBUG nova.compute.provider_tree [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Inventory has not changed in ProviderTree for provider: be63d86c-a403-4ec9-a515-07ea2962cb4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 23 10:09:40 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:09:40.680 282211 DEBUG nova.scheduler.client.report [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Inventory has not changed for provider be63d86c-a403-4ec9-a515-07ea2962cb4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 23 10:09:40 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:09:40.682 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Compute_service record updated for np0005626463.localdomain:np0005626463.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 23 10:09:40 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:09:40.683 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.829s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 10:09:40 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Feb 23 10:09:40 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 10:09:41 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:09:41.290 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 10:09:41 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:09:41.292 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:09:41 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:09:41.292 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5024 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 10:09:41 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:09:41.292 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 10:09:41 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:09:41.293 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 10:09:41 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:09:41.296 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:09:41 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.106:0/3085463583' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 10:09:41 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 10:09:41 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:09:41.679 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:09:42 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:09:42.054 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:09:42 np0005626463.localdomain ceph-mon[294160]: pgmap v737: 177 pgs: 177 active+clean; 226 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 426 B/s wr, 0 op/s
Feb 23 10:09:43 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:09:43.054 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:09:43 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 23 10:09:43 np0005626463.localdomain openstack_network_exporter[245358]: ERROR   10:09:43 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 23 10:09:43 np0005626463.localdomain openstack_network_exporter[245358]: 
Feb 23 10:09:43 np0005626463.localdomain openstack_network_exporter[245358]: ERROR   10:09:43 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 23 10:09:43 np0005626463.localdomain openstack_network_exporter[245358]: 
Feb 23 10:09:44 np0005626463.localdomain ceph-mon[294160]: pgmap v738: 177 pgs: 177 active+clean; 226 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 426 B/s wr, 0 op/s
Feb 23 10:09:45 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.
Feb 23 10:09:45 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.
Feb 23 10:09:45 np0005626463.localdomain podman[324696]: 2026-02-23 10:09:45.928034752 +0000 UTC m=+0.091195901 container health_status da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 23 10:09:45 np0005626463.localdomain podman[324696]: 2026-02-23 10:09:45.940413754 +0000 UTC m=+0.103574903 container exec_died da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 23 10:09:45 np0005626463.localdomain systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Deactivated successfully.
Feb 23 10:09:46 np0005626463.localdomain podman[324695]: 2026-02-23 10:09:46.033724659 +0000 UTC m=+0.198531218 container health_status 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, distribution-scope=public, org.opencontainers.image.created=2026-02-05T04:57:10Z, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.7, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2026-02-05T04:57:10Z, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., 
io.openshift.tags=minimal rhel9, release=1770267347, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.buildah.version=1.33.7)
Feb 23 10:09:46 np0005626463.localdomain podman[324695]: 2026-02-23 10:09:46.051198679 +0000 UTC m=+0.216005248 container exec_died 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, name=ubi9/ubi-minimal, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, architecture=x86_64, 
io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, release=1770267347, build-date=2026-02-05T04:57:10Z, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-type=git, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.7, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-02-05T04:57:10Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Feb 23 10:09:46 np0005626463.localdomain systemd[1]: 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.service: Deactivated successfully.
Feb 23 10:09:46 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:09:46.297 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 10:09:46 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:09:46.299 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 10:09:46 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:09:46.299 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 10:09:46 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:09:46.300 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 10:09:46 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:09:46.330 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:09:46 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:09:46.331 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 10:09:46 np0005626463.localdomain ceph-mon[294160]: pgmap v739: 177 pgs: 177 active+clean; 226 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 426 B/s wr, 0 op/s
Feb 23 10:09:47 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:09:47.054 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:09:47 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:09:47.055 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Feb 23 10:09:47 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:09:47.081 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Feb 23 10:09:48 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 23 10:09:48 np0005626463.localdomain ceph-mon[294160]: pgmap v740: 177 pgs: 177 active+clean; 226 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 426 B/s wr, 0 op/s
Feb 23 10:09:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 10:09:48.570 163572 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 10:09:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 10:09:48.570 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 10:09:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 10:09:48.571 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 10:09:50 np0005626463.localdomain ceph-mon[294160]: pgmap v741: 177 pgs: 177 active+clean; 226 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 0 B/s wr, 0 op/s
Feb 23 10:09:51 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:09:51.332 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 10:09:51 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:09:51.335 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 10:09:51 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:09:51.335 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 10:09:51 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:09:51.335 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 10:09:51 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:09:51.371 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:09:51 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:09:51.373 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 10:09:53 np0005626463.localdomain ceph-mon[294160]: pgmap v742: 177 pgs: 177 active+clean; 226 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 0 B/s wr, 0 op/s
Feb 23 10:09:53 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 23 10:09:54 np0005626463.localdomain ceph-mon[294160]: pgmap v743: 177 pgs: 177 active+clean; 226 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Feb 23 10:09:56 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:09:56.055 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:09:56 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:09:56.373 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:09:56 np0005626463.localdomain ceph-mon[294160]: pgmap v744: 177 pgs: 177 active+clean; 226 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Feb 23 10:09:58 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 23 10:09:58 np0005626463.localdomain ceph-mon[294160]: pgmap v745: 177 pgs: 177 active+clean; 226 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Feb 23 10:09:59 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.
Feb 23 10:09:59 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.
Feb 23 10:09:59 np0005626463.localdomain podman[324740]: 2026-02-23 10:09:59.947808726 +0000 UTC m=+0.122838087 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Feb 23 10:09:59 np0005626463.localdomain podman[324741]: 2026-02-23 10:09:59.918747838 +0000 UTC m=+0.089201859 container health_status bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Feb 23 10:09:59 np0005626463.localdomain podman[324741]: 2026-02-23 10:09:59.999394672 +0000 UTC m=+0.169848683 container exec_died bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 23 10:10:00 np0005626463.localdomain ceph-mon[294160]: log_channel(cluster) log [INF] : overall HEALTH_OK
Feb 23 10:10:00 np0005626463.localdomain podman[324740]: 2026-02-23 10:10:00.013037833 +0000 UTC m=+0.188067164 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, io.buildah.version=1.43.0, managed_by=edpm_ansible, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216)
Feb 23 10:10:00 np0005626463.localdomain systemd[1]: bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.service: Deactivated successfully.
Feb 23 10:10:00 np0005626463.localdomain systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully.
Feb 23 10:10:00 np0005626463.localdomain ceph-mon[294160]: pgmap v746: 177 pgs: 177 active+clean; 226 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Feb 23 10:10:00 np0005626463.localdomain ceph-mon[294160]: overall HEALTH_OK
Feb 23 10:10:01 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:10:01.375 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 10:10:01 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:10:01.377 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 10:10:01 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:10:01.378 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 10:10:01 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:10:01.378 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 10:10:01 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:10:01.432 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:10:01 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:10:01.434 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 10:10:01 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:10:01.808 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:10:01 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:10:01.837 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Triggering sync for uuid c2a7d92b-952f-46a7-8a6a-3322a48fcf4b _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Feb 23 10:10:01 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:10:01.838 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 10:10:01 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:10:01.839 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 10:10:01 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:10:01.867 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.028s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 10:10:02 np0005626463.localdomain ceph-mon[294160]: pgmap v747: 177 pgs: 177 active+clean; 226 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Feb 23 10:10:03 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 23 10:10:04 np0005626463.localdomain ceph-mon[294160]: pgmap v748: 177 pgs: 177 active+clean; 226 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Feb 23 10:10:05 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.
Feb 23 10:10:05 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.
Feb 23 10:10:05 np0005626463.localdomain podman[324789]: 2026-02-23 10:10:05.923018675 +0000 UTC m=+0.091920962 container health_status be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, managed_by=edpm_ansible, org.label-schema.build-date=20260216, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Feb 23 10:10:05 np0005626463.localdomain podman[324788]: 2026-02-23 10:10:05.972949479 +0000 UTC m=+0.144462387 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.build-date=20260216, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Feb 23 10:10:05 np0005626463.localdomain podman[324789]: 2026-02-23 10:10:05.994491485 +0000 UTC m=+0.163393752 container exec_died be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.43.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260216, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible)
Feb 23 10:10:06 np0005626463.localdomain podman[324788]: 2026-02-23 10:10:06.007520578 +0000 UTC m=+0.179033496 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.43.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Feb 23 10:10:06 np0005626463.localdomain systemd[1]: be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.service: Deactivated successfully.
Feb 23 10:10:06 np0005626463.localdomain systemd[1]: 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully.
Feb 23 10:10:06 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:10:06.434 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 10:10:06 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:10:06.436 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 10:10:06 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:10:06.437 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 10:10:06 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:10:06.437 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 10:10:06 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:10:06.468 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:10:06 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:10:06.468 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 10:10:06 np0005626463.localdomain ceph-mon[294160]: pgmap v749: 177 pgs: 177 active+clean; 226 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Feb 23 10:10:08 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 23 10:10:08 np0005626463.localdomain ceph-mon[294160]: pgmap v750: 177 pgs: 177 active+clean; 226 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Feb 23 10:10:09 np0005626463.localdomain podman[242954]: time="2026-02-23T10:10:09Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 23 10:10:09 np0005626463.localdomain podman[242954]: @ - - [23/Feb/2026:10:10:09 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 157081 "" "Go-http-client/1.1"
Feb 23 10:10:09 np0005626463.localdomain podman[242954]: @ - - [23/Feb/2026:10:10:09 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18832 "" "Go-http-client/1.1"
Feb 23 10:10:10 np0005626463.localdomain ceph-mon[294160]: pgmap v751: 177 pgs: 177 active+clean; 226 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Feb 23 10:10:11 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:10:11.469 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 10:10:11 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:10:11.470 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:10:11 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:10:11.471 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 10:10:11 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:10:11.471 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 10:10:11 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:10:11.472 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 10:10:11 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:10:11.474 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:10:12 np0005626463.localdomain ceph-mon[294160]: pgmap v752: 177 pgs: 177 active+clean; 226 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Feb 23 10:10:13 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 23 10:10:13 np0005626463.localdomain openstack_network_exporter[245358]: ERROR   10:10:13 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 23 10:10:13 np0005626463.localdomain openstack_network_exporter[245358]: 
Feb 23 10:10:13 np0005626463.localdomain openstack_network_exporter[245358]: ERROR   10:10:13 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 23 10:10:13 np0005626463.localdomain openstack_network_exporter[245358]: 
Feb 23 10:10:14 np0005626463.localdomain ceph-mon[294160]: pgmap v753: 177 pgs: 177 active+clean; 226 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Feb 23 10:10:16 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:10:16.475 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 10:10:16 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:10:16.477 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 10:10:16 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:10:16.477 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 10:10:16 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:10:16.478 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 10:10:16 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:10:16.515 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:10:16 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:10:16.516 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 10:10:16 np0005626463.localdomain ceph-mon[294160]: pgmap v754: 177 pgs: 177 active+clean; 226 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Feb 23 10:10:16 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.
Feb 23 10:10:16 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.
Feb 23 10:10:16 np0005626463.localdomain podman[324824]: 2026-02-23 10:10:16.9203495 +0000 UTC m=+0.091674226 container health_status 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, org.opencontainers.image.created=2026-02-05T04:57:10Z, io.openshift.expose-services=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, version=9.7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=ubi9/ubi-minimal, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, build-date=2026-02-05T04:57:10Z, container_name=openstack_network_exporter, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, release=1770267347, managed_by=edpm_ansible)
Feb 23 10:10:16 np0005626463.localdomain podman[324824]: 2026-02-23 10:10:16.933360412 +0000 UTC m=+0.104685118 container exec_died 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, io.buildah.version=1.33.7, release=1770267347, vendor=Red Hat, Inc., version=9.7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, architecture=x86_64, build-date=2026-02-05T04:57:10Z, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, org.opencontainers.image.created=2026-02-05T04:57:10Z, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9/ubi-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers)
Feb 23 10:10:16 np0005626463.localdomain systemd[1]: 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.service: Deactivated successfully.
Feb 23 10:10:17 np0005626463.localdomain podman[324825]: 2026-02-23 10:10:17.02095039 +0000 UTC m=+0.186949261 container health_status da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 23 10:10:17 np0005626463.localdomain podman[324825]: 2026-02-23 10:10:17.032124345 +0000 UTC m=+0.198123206 container exec_died da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 23 10:10:17 np0005626463.localdomain systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Deactivated successfully.
Feb 23 10:10:17 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/2448140031' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 23 10:10:17 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.32:0/2448140031' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 23 10:10:18 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 23 10:10:18 np0005626463.localdomain ceph-mon[294160]: pgmap v755: 177 pgs: 177 active+clean; 226 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Feb 23 10:10:20 np0005626463.localdomain sshd[324865]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 10:10:20 np0005626463.localdomain sshd[324865]: Accepted publickey for zuul from 192.168.122.10 port 34776 ssh2: RSA SHA256:/ShS2J5Dq7o9P59e/NmgQORSAcJOBwu46Huo03HBdB4
Feb 23 10:10:20 np0005626463.localdomain systemd-logind[759]: New session 82 of user zuul.
Feb 23 10:10:20 np0005626463.localdomain systemd[1]: Started Session 82 of User zuul.
Feb 23 10:10:20 np0005626463.localdomain sshd[324865]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Feb 23 10:10:20 np0005626463.localdomain sudo[324869]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/bash -c rm -rf /var/tmp/sos-osp && mkdir /var/tmp/sos-osp && sos report --batch --all-logs --tmp-dir=/var/tmp/sos-osp  -p container,openstack_edpm,system,storage,virt
Feb 23 10:10:20 np0005626463.localdomain sudo[324869]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 23 10:10:20 np0005626463.localdomain ceph-mon[294160]: pgmap v756: 177 pgs: 177 active+clean; 226 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Feb 23 10:10:21 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:10:21.517 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 10:10:21 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:10:21.522 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 10:10:21 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:10:21.522 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5005 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 10:10:21 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:10:21.522 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 10:10:21 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:10:21.556 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:10:21 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:10:21.556 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 10:10:21 np0005626463.localdomain ceph-mon[294160]: pgmap v757: 177 pgs: 177 active+clean; 226 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Feb 23 10:10:23 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 23 10:10:24 np0005626463.localdomain ceph-mon[294160]: from='client.58888 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""]}]: dispatch
Feb 23 10:10:24 np0005626463.localdomain ceph-mon[294160]: from='client.68969 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""]}]: dispatch
Feb 23 10:10:24 np0005626463.localdomain ceph-mon[294160]: from='client.49002 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""]}]: dispatch
Feb 23 10:10:24 np0005626463.localdomain ceph-mon[294160]: pgmap v758: 177 pgs: 177 active+clean; 226 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Feb 23 10:10:24 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.107:0/656278644' entity='client.admin' cmd={"prefix": "status"} : dispatch
Feb 23 10:10:24 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "status"} v 0)
Feb 23 10:10:24 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/2674339267' entity='client.admin' cmd={"prefix": "status"} : dispatch
Feb 23 10:10:24 np0005626463.localdomain sshd[325073]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 10:10:24 np0005626463.localdomain sshd[325073]: error: kex_exchange_identification: Connection closed by remote host
Feb 23 10:10:24 np0005626463.localdomain sshd[325073]: Connection closed by 143.198.30.3 port 44348
Feb 23 10:10:25 np0005626463.localdomain ceph-mon[294160]: from='client.58894 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Feb 23 10:10:25 np0005626463.localdomain ceph-mon[294160]: from='client.68978 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Feb 23 10:10:25 np0005626463.localdomain ceph-mon[294160]: from='client.49008 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Feb 23 10:10:25 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.108:0/3077362282' entity='client.admin' cmd={"prefix": "status"} : dispatch
Feb 23 10:10:25 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.106:0/2674339267' entity='client.admin' cmd={"prefix": "status"} : dispatch
Feb 23 10:10:26 np0005626463.localdomain ceph-mon[294160]: pgmap v759: 177 pgs: 177 active+clean; 226 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Feb 23 10:10:26 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:10:26.557 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:10:26 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:10:26.561 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 10:10:27 np0005626463.localdomain ovs-vsctl[325121]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Feb 23 10:10:27 np0005626463.localdomain virtqemud[207530]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Feb 23 10:10:28 np0005626463.localdomain virtqemud[207530]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Feb 23 10:10:28 np0005626463.localdomain virtqemud[207530]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Feb 23 10:10:28 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 23 10:10:28 np0005626463.localdomain systemd[1]: efi.automount: Got automount request for /efi, triggered by 325275 (lsinitrd)
Feb 23 10:10:28 np0005626463.localdomain systemd[1]: Mounting EFI System Partition Automount...
Feb 23 10:10:28 np0005626463.localdomain systemd[1]: Mounted EFI System Partition Automount.
Feb 23 10:10:28 np0005626463.localdomain ceph-mds[286877]: mds.mds.np0005626463.qcthuc asok_command: cache status {prefix=cache status} (starting...)
Feb 23 10:10:28 np0005626463.localdomain ceph-mon[294160]: pgmap v760: 177 pgs: 177 active+clean; 226 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Feb 23 10:10:28 np0005626463.localdomain ceph-mds[286877]: mds.mds.np0005626463.qcthuc asok_command: client ls {prefix=client ls} (starting...)
Feb 23 10:10:28 np0005626463.localdomain lvm[325368]: PV /dev/loop4 online, VG ceph_vg1 is complete.
Feb 23 10:10:28 np0005626463.localdomain lvm[325368]: VG ceph_vg1 finished
Feb 23 10:10:28 np0005626463.localdomain lvm[325375]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Feb 23 10:10:28 np0005626463.localdomain lvm[325375]: VG ceph_vg0 finished
Feb 23 10:10:29 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "report"} v 0)
Feb 23 10:10:29 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? ' entity='client.admin' cmd={"prefix": "report"} : dispatch
Feb 23 10:10:29 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "report"} v 0)
Feb 23 10:10:29 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? ' entity='client.admin' cmd={"prefix": "report"} : dispatch
Feb 23 10:10:29 np0005626463.localdomain ceph-mds[286877]: mds.mds.np0005626463.qcthuc asok_command: damage ls {prefix=damage ls} (starting...)
Feb 23 10:10:29 np0005626463.localdomain ceph-mds[286877]: mds.mds.np0005626463.qcthuc asok_command: dump loads {prefix=dump loads} (starting...)
Feb 23 10:10:29 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 23 10:10:29 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/2169730671' entity='client.admin' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 10:10:29 np0005626463.localdomain ceph-mon[294160]: from='client.58906 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""]}]: dispatch
Feb 23 10:10:29 np0005626463.localdomain ceph-mon[294160]: from='client.68993 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""]}]: dispatch
Feb 23 10:10:29 np0005626463.localdomain ceph-mon[294160]: from='client.58912 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""]}]: dispatch
Feb 23 10:10:29 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.107:0/2334828674' entity='client.admin' cmd={"prefix": "report"} : dispatch
Feb 23 10:10:29 np0005626463.localdomain ceph-mon[294160]: from='client.? ' entity='client.admin' cmd={"prefix": "report"} : dispatch
Feb 23 10:10:29 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.108:0/2536923091' entity='client.admin' cmd={"prefix": "report"} : dispatch
Feb 23 10:10:29 np0005626463.localdomain ceph-mon[294160]: from='client.? ' entity='client.admin' cmd={"prefix": "report"} : dispatch
Feb 23 10:10:29 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.107:0/3105185597' entity='client.admin' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 10:10:29 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.108:0/2169730671' entity='client.admin' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 10:10:29 np0005626463.localdomain ceph-mds[286877]: mds.mds.np0005626463.qcthuc asok_command: dump tree {prefix=dump tree,root=/} (starting...)
Feb 23 10:10:29 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "report"} v 0)
Feb 23 10:10:29 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/1399630349' entity='client.admin' cmd={"prefix": "report"} : dispatch
Feb 23 10:10:29 np0005626463.localdomain ceph-mds[286877]: mds.mds.np0005626463.qcthuc asok_command: dump_blocked_ops {prefix=dump_blocked_ops} (starting...)
Feb 23 10:10:30 np0005626463.localdomain ceph-mds[286877]: mds.mds.np0005626463.qcthuc asok_command: dump_historic_ops {prefix=dump_historic_ops} (starting...)
Feb 23 10:10:30 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 23 10:10:30 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/3737416543' entity='client.admin' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 10:10:30 np0005626463.localdomain ceph-mds[286877]: mds.mds.np0005626463.qcthuc asok_command: dump_historic_ops_by_duration {prefix=dump_historic_ops_by_duration} (starting...)
Feb 23 10:10:30 np0005626463.localdomain ceph-mds[286877]: mds.mds.np0005626463.qcthuc asok_command: dump_ops_in_flight {prefix=dump_ops_in_flight} (starting...)
Feb 23 10:10:30 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "config log"} v 0)
Feb 23 10:10:30 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/840476199' entity='client.admin' cmd={"prefix": "config log"} : dispatch
Feb 23 10:10:30 np0005626463.localdomain ceph-mds[286877]: mds.mds.np0005626463.qcthuc asok_command: get subtrees {prefix=get subtrees} (starting...)
Feb 23 10:10:30 np0005626463.localdomain ceph-mon[294160]: from='client.68999 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""]}]: dispatch
Feb 23 10:10:30 np0005626463.localdomain ceph-mon[294160]: from='client.49026 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""]}]: dispatch
Feb 23 10:10:30 np0005626463.localdomain ceph-mon[294160]: from='client.49041 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""]}]: dispatch
Feb 23 10:10:30 np0005626463.localdomain ceph-mon[294160]: from='client.58936 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""]}]: dispatch
Feb 23 10:10:30 np0005626463.localdomain ceph-mon[294160]: from='client.69023 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""]}]: dispatch
Feb 23 10:10:30 np0005626463.localdomain ceph-mon[294160]: pgmap v761: 177 pgs: 177 active+clean; 226 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Feb 23 10:10:30 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.107:0/3975561366' entity='client.admin' cmd={"prefix": "config log"} : dispatch
Feb 23 10:10:30 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.106:0/1399630349' entity='client.admin' cmd={"prefix": "report"} : dispatch
Feb 23 10:10:30 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.107:0/2826152229' entity='client.admin' cmd={"prefix": "log last", "channel": "cephadm"} : dispatch
Feb 23 10:10:30 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.108:0/4256708368' entity='client.admin' cmd={"prefix": "log last", "channel": "cephadm"} : dispatch
Feb 23 10:10:30 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.108:0/247694003' entity='client.admin' cmd={"prefix": "config log"} : dispatch
Feb 23 10:10:30 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.106:0/3737416543' entity='client.admin' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 10:10:30 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.107:0/1372337432' entity='client.admin' cmd={"prefix": "config-key dump"} : dispatch
Feb 23 10:10:30 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.107:0/3087407739' entity='client.admin' cmd={"prefix": "mgr dump"} : dispatch
Feb 23 10:10:30 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.108:0/1388500024' entity='client.admin' cmd={"prefix": "mgr dump"} : dispatch
Feb 23 10:10:30 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.108:0/2072617387' entity='client.admin' cmd={"prefix": "config-key dump"} : dispatch
Feb 23 10:10:30 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.106:0/502753632' entity='client.admin' cmd={"prefix": "log last", "channel": "cephadm"} : dispatch
Feb 23 10:10:30 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.106:0/840476199' entity='client.admin' cmd={"prefix": "config log"} : dispatch
Feb 23 10:10:30 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mgr metadata"} v 0)
Feb 23 10:10:30 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/2205330165' entity='client.admin' cmd={"prefix": "mgr metadata"} : dispatch
Feb 23 10:10:30 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.
Feb 23 10:10:30 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.
Feb 23 10:10:30 np0005626463.localdomain ceph-mds[286877]: mds.mds.np0005626463.qcthuc asok_command: ops {prefix=ops} (starting...)
Feb 23 10:10:30 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mgr dump"} v 0)
Feb 23 10:10:30 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/3846767999' entity='client.admin' cmd={"prefix": "mgr dump"} : dispatch
Feb 23 10:10:30 np0005626463.localdomain podman[325651]: 2026-02-23 10:10:30.907818895 +0000 UTC m=+0.080513241 container health_status bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 23 10:10:30 np0005626463.localdomain podman[325650]: 2026-02-23 10:10:30.965451297 +0000 UTC m=+0.138130862 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_managed=true, container_name=ovn_controller, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, managed_by=edpm_ansible)
Feb 23 10:10:30 np0005626463.localdomain podman[325651]: 2026-02-23 10:10:30.994751172 +0000 UTC m=+0.167445528 container exec_died bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Feb 23 10:10:31 np0005626463.localdomain systemd[1]: bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.service: Deactivated successfully.
Feb 23 10:10:31 np0005626463.localdomain podman[325650]: 2026-02-23 10:10:31.040182696 +0000 UTC m=+0.212862281 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Feb 23 10:10:31 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:10:31.054 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:10:31 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:10:31.054 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 23 10:10:31 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "config-key dump"} v 0)
Feb 23 10:10:31 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/2642635724' entity='client.admin' cmd={"prefix": "config-key dump"} : dispatch
Feb 23 10:10:31 np0005626463.localdomain systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully.
Feb 23 10:10:31 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mgr metadata"} v 0)
Feb 23 10:10:31 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/1146500256' entity='client.admin' cmd={"prefix": "mgr metadata"} : dispatch
Feb 23 10:10:31 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "features"} v 0)
Feb 23 10:10:31 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? ' entity='client.admin' cmd={"prefix": "features"} : dispatch
Feb 23 10:10:31 np0005626463.localdomain ceph-mds[286877]: mds.mds.np0005626463.qcthuc asok_command: session ls {prefix=session ls} (starting...)
Feb 23 10:10:31 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "features"} v 0)
Feb 23 10:10:31 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? ' entity='client.admin' cmd={"prefix": "features"} : dispatch
Feb 23 10:10:31 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:10:31.560 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4995-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 10:10:31 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:10:31.562 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:10:31 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:10:31.562 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 10:10:31 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:10:31.562 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 10:10:31 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:10:31.562 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 10:10:31 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:10:31.564 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 10:10:31 np0005626463.localdomain ceph-mds[286877]: mds.mds.np0005626463.qcthuc asok_command: status {prefix=status} (starting...)
Feb 23 10:10:31 np0005626463.localdomain ceph-mon[294160]: from='client.49071 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""]}]: dispatch
Feb 23 10:10:31 np0005626463.localdomain ceph-mon[294160]: from='client.58981 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Feb 23 10:10:31 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.108:0/2205330165' entity='client.admin' cmd={"prefix": "mgr metadata"} : dispatch
Feb 23 10:10:31 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.107:0/604197490' entity='client.admin' cmd={"prefix": "mgr metadata"} : dispatch
Feb 23 10:10:31 np0005626463.localdomain ceph-mon[294160]: from='client.69083 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Feb 23 10:10:31 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.106:0/3846767999' entity='client.admin' cmd={"prefix": "mgr dump"} : dispatch
Feb 23 10:10:31 np0005626463.localdomain ceph-mon[294160]: from='client.58993 -' entity='client.admin' cmd=[{"prefix": "crash stat", "target": ["mon-mgr", ""]}]: dispatch
Feb 23 10:10:31 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.106:0/2642635724' entity='client.admin' cmd={"prefix": "config-key dump"} : dispatch
Feb 23 10:10:31 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.107:0/132847898' entity='client.admin' cmd={"prefix": "mgr module ls"} : dispatch
Feb 23 10:10:31 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.108:0/838413950' entity='client.admin' cmd={"prefix": "mgr module ls"} : dispatch
Feb 23 10:10:31 np0005626463.localdomain ceph-mon[294160]: from='client.69101 -' entity='client.admin' cmd=[{"prefix": "crash stat", "target": ["mon-mgr", ""]}]: dispatch
Feb 23 10:10:31 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.106:0/1146500256' entity='client.admin' cmd={"prefix": "mgr metadata"} : dispatch
Feb 23 10:10:31 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.107:0/840532855' entity='client.admin' cmd={"prefix": "features"} : dispatch
Feb 23 10:10:31 np0005626463.localdomain ceph-mon[294160]: from='client.? ' entity='client.admin' cmd={"prefix": "features"} : dispatch
Feb 23 10:10:31 np0005626463.localdomain ceph-mon[294160]: from='client.49140 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Feb 23 10:10:31 np0005626463.localdomain ceph-mon[294160]: pgmap v762: 177 pgs: 177 active+clean; 226 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Feb 23 10:10:31 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.108:0/3451573815' entity='client.admin' cmd={"prefix": "features"} : dispatch
Feb 23 10:10:31 np0005626463.localdomain ceph-mon[294160]: from='client.? ' entity='client.admin' cmd={"prefix": "features"} : dispatch
Feb 23 10:10:31 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.108:0/1233806699' entity='client.admin' cmd={"prefix": "mgr services"} : dispatch
Feb 23 10:10:31 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.107:0/1781676493' entity='client.admin' cmd={"prefix": "mgr services"} : dispatch
Feb 23 10:10:31 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mgr module ls"} v 0)
Feb 23 10:10:31 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/1098802984' entity='client.admin' cmd={"prefix": "mgr module ls"} : dispatch
Feb 23 10:10:32 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:10:32.054 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:10:32 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:10:32.054 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 23 10:10:32 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:10:32.054 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 23 10:10:32 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mgr services"} v 0)
Feb 23 10:10:32 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/3427895369' entity='client.admin' cmd={"prefix": "mgr services"} : dispatch
Feb 23 10:10:32 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "features"} v 0)
Feb 23 10:10:32 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/547775962' entity='client.admin' cmd={"prefix": "features"} : dispatch
Feb 23 10:10:32 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:10:32.541 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 23 10:10:32 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:10:32.542 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquired lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 23 10:10:32 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:10:32.542 282211 DEBUG nova.network.neutron [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 23 10:10:32 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:10:32.542 282211 DEBUG nova.objects.instance [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c2a7d92b-952f-46a7-8a6a-3322a48fcf4b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 23 10:10:32 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mgr stat"} v 0)
Feb 23 10:10:32 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/2150818902' entity='client.admin' cmd={"prefix": "mgr stat"} : dispatch
Feb 23 10:10:32 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.106:0/1098802984' entity='client.admin' cmd={"prefix": "mgr module ls"} : dispatch
Feb 23 10:10:32 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.107:0/318422905' entity='client.admin' cmd={"prefix": "health", "detail": "detail"} : dispatch
Feb 23 10:10:32 np0005626463.localdomain ceph-mon[294160]: from='client.49155 -' entity='client.admin' cmd=[{"prefix": "crash stat", "target": ["mon-mgr", ""]}]: dispatch
Feb 23 10:10:32 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.108:0/2496943267' entity='client.admin' cmd={"prefix": "health", "detail": "detail"} : dispatch
Feb 23 10:10:32 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.107:0/1096821994' entity='client.admin' cmd={"prefix": "mgr stat"} : dispatch
Feb 23 10:10:32 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.108:0/3786663329' entity='client.admin' cmd={"prefix": "mgr stat"} : dispatch
Feb 23 10:10:32 np0005626463.localdomain ceph-mon[294160]: from='client.59038 -' entity='client.admin' cmd=[{"prefix": "insights", "target": ["mon-mgr", ""]}]: dispatch
Feb 23 10:10:32 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.106:0/3427895369' entity='client.admin' cmd={"prefix": "mgr services"} : dispatch
Feb 23 10:10:32 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.106:0/547775962' entity='client.admin' cmd={"prefix": "features"} : dispatch
Feb 23 10:10:32 np0005626463.localdomain ceph-mon[294160]: from='client.69152 -' entity='client.admin' cmd=[{"prefix": "insights", "target": ["mon-mgr", ""]}]: dispatch
Feb 23 10:10:32 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.108:0/3591565141' entity='client.admin' cmd={"prefix": "mgr versions"} : dispatch
Feb 23 10:10:32 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.107:0/3430672046' entity='client.admin' cmd={"prefix": "mgr versions"} : dispatch
Feb 23 10:10:32 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.107:0/422259144' entity='client.admin' cmd={"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"} : dispatch
Feb 23 10:10:32 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.106:0/2150818902' entity='client.admin' cmd={"prefix": "mgr stat"} : dispatch
Feb 23 10:10:32 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "health", "detail": "detail"} v 0)
Feb 23 10:10:32 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/1535678785' entity='client.admin' cmd={"prefix": "health", "detail": "detail"} : dispatch
Feb 23 10:10:33 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mgr versions"} v 0)
Feb 23 10:10:33 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/2021780763' entity='client.admin' cmd={"prefix": "mgr versions"} : dispatch
Feb 23 10:10:33 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:10:33.088 282211 DEBUG nova.network.neutron [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Updating instance_info_cache with network_info: [{"id": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "address": "fa:16:3e:a0:9d:00", "network": {"id": "9da5b53d-3184-450f-9a5b-bdba1a6c9f6d", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "37b8098efb0d4ecc90b451a2db0e966f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa27e5011-20", "ovs_interfaceid": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 23 10:10:33 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:10:33.105 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Releasing lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 23 10:10:33 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:10:33.106 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 23 10:10:33 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 23 10:10:33 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #79. Immutable memtables: 0.
Feb 23 10:10:33 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:10:33.284213) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Feb 23 10:10:33 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/flush_job.cc:856] [default] [JOB 47] Flushing memtable with next log file: 79
Feb 23 10:10:33 np0005626463.localdomain ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771841433284245, "job": 47, "event": "flush_started", "num_memtables": 1, "num_entries": 1284, "num_deletes": 255, "total_data_size": 1323525, "memory_usage": 1351360, "flush_reason": "Manual Compaction"}
Feb 23 10:10:33 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/flush_job.cc:885] [default] [JOB 47] Level-0 flush table #80: started
Feb 23 10:10:33 np0005626463.localdomain ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771841433290076, "cf_name": "default", "job": 47, "event": "table_file_creation", "file_number": 80, "file_size": 1299107, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 41358, "largest_seqno": 42641, "table_properties": {"data_size": 1293598, "index_size": 2788, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1669, "raw_key_size": 13080, "raw_average_key_size": 20, "raw_value_size": 1281913, "raw_average_value_size": 1981, "num_data_blocks": 123, "num_entries": 647, "num_filter_entries": 647, "num_deletions": 255, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771841337, "oldest_key_time": 1771841337, "file_creation_time": 1771841433, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4cfd6c8f-aafa-4003-b2f6-d22c49635dd4", "db_session_id": "66DAQ76CBLV8DSGL8JC7", "orig_file_number": 80, "seqno_to_time_mapping": "N/A"}}
Feb 23 10:10:33 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 47] Flush lasted 5891 microseconds, and 2047 cpu microseconds.
Feb 23 10:10:33 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 23 10:10:33 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:10:33.290103) [db/flush_job.cc:967] [default] [JOB 47] Level-0 flush table #80: 1299107 bytes OK
Feb 23 10:10:33 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:10:33.290118) [db/memtable_list.cc:519] [default] Level-0 commit table #80 started
Feb 23 10:10:33 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:10:33.293713) [db/memtable_list.cc:722] [default] Level-0 commit table #80: memtable #1 done
Feb 23 10:10:33 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:10:33.293724) EVENT_LOG_v1 {"time_micros": 1771841433293721, "job": 47, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Feb 23 10:10:33 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:10:33.293740) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Feb 23 10:10:33 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 47] Try to delete WAL files size 1317570, prev total WAL file size 1317894, number of live WAL files 2.
Feb 23 10:10:33 np0005626463.localdomain ceph-mon[294160]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005626463/store.db/000076.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 23 10:10:33 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:10:33.294279) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0034353234' seq:72057594037927935, type:22 .. '6C6F676D0034373735' seq:0, type:0; will stop at (end)
Feb 23 10:10:33 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 48] Compacting 1@0 + 1@6 files to L6, score -1.00
Feb 23 10:10:33 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 47 Base level 0, inputs: [80(1268KB)], [78(16MB)]
Feb 23 10:10:33 np0005626463.localdomain ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771841433294316, "job": 48, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [80], "files_L6": [78], "score": -1, "input_data_size": 18737246, "oldest_snapshot_seqno": -1}
Feb 23 10:10:33 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 48] Generated table #81: 14544 keys, 18604837 bytes, temperature: kUnknown
Feb 23 10:10:33 np0005626463.localdomain ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771841433400326, "cf_name": "default", "job": 48, "event": "table_file_creation", "file_number": 81, "file_size": 18604837, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 18522358, "index_size": 45002, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 36421, "raw_key_size": 392484, "raw_average_key_size": 26, "raw_value_size": 18275728, "raw_average_value_size": 1256, "num_data_blocks": 1659, "num_entries": 14544, "num_filter_entries": 14544, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771839971, "oldest_key_time": 0, "file_creation_time": 1771841433, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4cfd6c8f-aafa-4003-b2f6-d22c49635dd4", "db_session_id": "66DAQ76CBLV8DSGL8JC7", "orig_file_number": 81, "seqno_to_time_mapping": "N/A"}}
Feb 23 10:10:33 np0005626463.localdomain ceph-mon[294160]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 23 10:10:33 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:10:33.400660) [db/compaction/compaction_job.cc:1663] [default] [JOB 48] Compacted 1@0 + 1@6 files to L6 => 18604837 bytes
Feb 23 10:10:33 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:10:33.402777) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 176.6 rd, 175.3 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.2, 16.6 +0.0 blob) out(17.7 +0.0 blob), read-write-amplify(28.7) write-amplify(14.3) OK, records in: 15077, records dropped: 533 output_compression: NoCompression
Feb 23 10:10:33 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:10:33.402804) EVENT_LOG_v1 {"time_micros": 1771841433402792, "job": 48, "event": "compaction_finished", "compaction_time_micros": 106106, "compaction_time_cpu_micros": 32330, "output_level": 6, "num_output_files": 1, "total_output_size": 18604837, "num_input_records": 15077, "num_output_records": 14544, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Feb 23 10:10:33 np0005626463.localdomain ceph-mon[294160]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005626463/store.db/000080.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 23 10:10:33 np0005626463.localdomain ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771841433403130, "job": 48, "event": "table_file_deletion", "file_number": 80}
Feb 23 10:10:33 np0005626463.localdomain ceph-mon[294160]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005626463/store.db/000078.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 23 10:10:33 np0005626463.localdomain ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771841433406057, "job": 48, "event": "table_file_deletion", "file_number": 78}
Feb 23 10:10:33 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:10:33.294192) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 23 10:10:33 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:10:33.406147) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 23 10:10:33 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:10:33.406154) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 23 10:10:33 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:10:33.406156) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 23 10:10:33 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:10:33.406159) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 23 10:10:33 np0005626463.localdomain ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:10:33.406160) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 23 10:10:33 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"} v 0)
Feb 23 10:10:33 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/2249013292' entity='client.admin' cmd={"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"} : dispatch
Feb 23 10:10:33 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.106:0/1535678785' entity='client.admin' cmd={"prefix": "health", "detail": "detail"} : dispatch
Feb 23 10:10:33 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.108:0/782860407' entity='client.admin' cmd={"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"} : dispatch
Feb 23 10:10:33 np0005626463.localdomain ceph-mon[294160]: from='client.59068 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""]}]: dispatch
Feb 23 10:10:33 np0005626463.localdomain ceph-mon[294160]: from='client.49200 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""]}]: dispatch
Feb 23 10:10:33 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.106:0/2021780763' entity='client.admin' cmd={"prefix": "mgr versions"} : dispatch
Feb 23 10:10:33 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.107:0/3330005011' entity='client.admin' cmd={"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"} : dispatch
Feb 23 10:10:33 np0005626463.localdomain ceph-mon[294160]: from='client.49215 -' entity='client.admin' cmd=[{"prefix": "insights", "target": ["mon-mgr", ""]}]: dispatch
Feb 23 10:10:33 np0005626463.localdomain ceph-mon[294160]: from='client.69182 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""]}]: dispatch
Feb 23 10:10:33 np0005626463.localdomain ceph-mon[294160]: from='client.59080 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""]}]: dispatch
Feb 23 10:10:33 np0005626463.localdomain ceph-mon[294160]: from='client.49221 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""]}]: dispatch
Feb 23 10:10:33 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.108:0/2563891923' entity='client.admin' cmd={"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"} : dispatch
Feb 23 10:10:33 np0005626463.localdomain ceph-mon[294160]: pgmap v763: 177 pgs: 177 active+clean; 226 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Feb 23 10:10:33 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.107:0/1871381033' entity='client.admin' cmd={"prefix": "mgr dump"} : dispatch
Feb 23 10:10:33 np0005626463.localdomain ceph-mon[294160]: from='client.69200 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""]}]: dispatch
Feb 23 10:10:33 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.106:0/2249013292' entity='client.admin' cmd={"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"} : dispatch
Feb 23 10:10:33 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mgr dump"} v 0)
Feb 23 10:10:33 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/2293136373' entity='client.admin' cmd={"prefix": "mgr dump"} : dispatch
Feb 23 10:10:34 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"} v 0)
Feb 23 10:10:34 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/2242601199' entity='client.admin' cmd={"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"} : dispatch
Feb 23 10:10:34 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mgr dump"} v 0)
Feb 23 10:10:34 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/1463588209' entity='client.admin' cmd={"prefix": "mgr dump"} : dispatch
Feb 23 10:10:34 np0005626463.localdomain ceph-mon[294160]: from='client.59092 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""]}]: dispatch
Feb 23 10:10:34 np0005626463.localdomain ceph-mon[294160]: from='client.49236 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""]}]: dispatch
Feb 23 10:10:34 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.108:0/2293136373' entity='client.admin' cmd={"prefix": "mgr dump"} : dispatch
Feb 23 10:10:34 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.107:0/3525939574' entity='client.admin' cmd={"prefix": "mgr metadata"} : dispatch
Feb 23 10:10:34 np0005626463.localdomain ceph-mon[294160]: from='client.69215 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""]}]: dispatch
Feb 23 10:10:34 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.106:0/2242601199' entity='client.admin' cmd={"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"} : dispatch
Feb 23 10:10:34 np0005626463.localdomain ceph-mon[294160]: from='client.59107 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""]}]: dispatch
Feb 23 10:10:34 np0005626463.localdomain ceph-mon[294160]: from='client.49257 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""]}]: dispatch
Feb 23 10:10:34 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.107:0/402796184' entity='client.admin' cmd={"prefix": "mgr module ls"} : dispatch
Feb 23 10:10:34 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.108:0/2302957154' entity='client.admin' cmd={"prefix": "mgr metadata"} : dispatch
Feb 23 10:10:34 np0005626463.localdomain ceph-mon[294160]: from='client.69242 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch
Feb 23 10:10:34 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.108:0/2945892897' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 10:10:34 np0005626463.localdomain ceph-mon[294160]: from='client.59128 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch
Feb 23 10:10:34 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.106:0/1463588209' entity='client.admin' cmd={"prefix": "mgr dump"} : dispatch
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97329152 unmapped: 1196032 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 84 heartbeat osd_stat(store_statfs(0x1b90a0000/0x0/0x1bfc00000, data 0x2971399/0x29ee000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:48:50.112056+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97329152 unmapped: 1196032 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:48:51.112185+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97329152 unmapped: 1196032 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:48:52.112309+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97329152 unmapped: 1196032 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 910181 data_alloc: 184549376 data_used: 15441920
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:48:53.112442+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97329152 unmapped: 1196032 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 84 heartbeat osd_stat(store_statfs(0x1b90a0000/0x0/0x1bfc00000, data 0x2971399/0x29ee000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:48:54.112583+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97329152 unmapped: 1196032 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:48:55.112742+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97329152 unmapped: 1196032 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:48:56.112943+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 84 heartbeat osd_stat(store_statfs(0x1b90a0000/0x0/0x1bfc00000, data 0x2971399/0x29ee000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97329152 unmapped: 1196032 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 84 heartbeat osd_stat(store_statfs(0x1b90a0000/0x0/0x1bfc00000, data 0x2971399/0x29ee000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_monmap mon_map magic: 0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient:  got monmap 15 from mon.np0005626463 (according to old e15)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: dump:
                                                          epoch 15
                                                          fsid f1fea371-cb69-578d-a3d0-b5c472a84b46
                                                          last_changed 2026-02-23T09:49:26.924061+0000
                                                          created 2026-02-23T07:36:01.997603+0000
                                                          min_mon_release 18 (reef)
                                                          election_strategy: 1
                                                          0: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005626463
                                                          1: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005626465
                                                          2: [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] mon.np0005626466
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:48:57.113124+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97329152 unmapped: 1196032 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 910181 data_alloc: 184549376 data_used: 15441920
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:48:58.113237+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97329152 unmapped: 1196032 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 84 heartbeat osd_stat(store_statfs(0x1b90a0000/0x0/0x1bfc00000, data 0x2971399/0x29ee000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:48:59.113367+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97329152 unmapped: 1196032 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:49:00.113510+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97329152 unmapped: 1196032 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:49:01.113651+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97329152 unmapped: 1196032 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:49:02.113785+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97329152 unmapped: 1196032 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 910181 data_alloc: 184549376 data_used: 15441920
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:49:03.113920+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97329152 unmapped: 1196032 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:49:04.114088+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97329152 unmapped: 1196032 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 84 heartbeat osd_stat(store_statfs(0x1b90a0000/0x0/0x1bfc00000, data 0x2971399/0x29ee000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:49:05.114249+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97329152 unmapped: 1196032 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:49:06.114418+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97329152 unmapped: 1196032 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:49:07.114563+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97329152 unmapped: 1196032 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 910181 data_alloc: 184549376 data_used: 15441920
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:49:08.114688+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97329152 unmapped: 1196032 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 84 heartbeat osd_stat(store_statfs(0x1b90a0000/0x0/0x1bfc00000, data 0x2971399/0x29ee000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:49:09.114826+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97329152 unmapped: 1196032 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:49:10.114982+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: mgrc handle_mgr_map Got map version 33
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.106:6810/1055095676,v1:172.18.0.106:6811/1055095676]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97329152 unmapped: 1196032 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: mgrc handle_mgr_map Got map version 34
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: mgrc handle_mgr_map Active mgr is now 
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: mgrc reconnect Terminating session with v2:172.18.0.106:6810/1055095676
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: mgrc reconnect No active mgr available yet
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 84 ms_handle_reset con 0x564b598fb800 session 0x564b5b06d680
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:49:11.115148+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b5b153800
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _renew_subs
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _send_mon_message to mon.np0005626463 at v2:172.18.0.103:3300/0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 84 handle_osd_map epochs [85,85], i have 84, src has [1,85]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 93.286582947s of 93.345413208s, submitted: 12
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 85 heartbeat osd_stat(store_statfs(0x1b909b000/0x0/0x1bfc00000, data 0x2973593/0x29f2000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97484800 unmapped: 1040384 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: mgrc handle_mgr_map Got map version 35
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/1471406,v1:172.18.0.108:6811/1471406]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: mgrc reconnect Starting new session with [v2:172.18.0.108:6810/1471406,v1:172.18.0.108:6811/1471406]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: get_auth_request con 0x564b57d3e800 auth_method 0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: mgrc handle_mgr_configure stats_period=5
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:49:12.115296+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 85 heartbeat osd_stat(store_statfs(0x1b909b000/0x0/0x1bfc00000, data 0x2973593/0x29f2000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97648640 unmapped: 876544 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 914381 data_alloc: 184549376 data_used: 15450112
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 85 heartbeat osd_stat(store_statfs(0x1b909b000/0x0/0x1bfc00000, data 0x2973593/0x29f2000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: mgrc handle_mgr_map Got map version 36
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/1471406,v1:172.18.0.108:6811/1471406]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:49:13.115451+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97665024 unmapped: 860160 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:49:14.115626+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97665024 unmapped: 860160 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: mgrc handle_mgr_map Got map version 37
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/1471406,v1:172.18.0.108:6811/1471406]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:49:15.115818+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97665024 unmapped: 860160 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:49:16.115987+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97665024 unmapped: 860160 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: mgrc handle_mgr_map Got map version 38
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/1471406,v1:172.18.0.108:6811/1471406]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 85 heartbeat osd_stat(store_statfs(0x1b909c000/0x0/0x1bfc00000, data 0x2973593/0x29f2000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:49:17.116141+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97665024 unmapped: 860160 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 913501 data_alloc: 184549376 data_used: 15450112
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:49:18.116305+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97665024 unmapped: 860160 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:49:19.116478+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97665024 unmapped: 860160 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:49:20.116637+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97665024 unmapped: 860160 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:49:21.116793+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 85 heartbeat osd_stat(store_statfs(0x1b909c000/0x0/0x1bfc00000, data 0x2973593/0x29f2000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97665024 unmapped: 860160 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:49:22.116970+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97665024 unmapped: 860160 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 913501 data_alloc: 184549376 data_used: 15450112
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:49:23.117754+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97665024 unmapped: 860160 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:49:24.118102+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97665024 unmapped: 860160 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:49:25.118376+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97665024 unmapped: 860160 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:49:26.118561+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 85 heartbeat osd_stat(store_statfs(0x1b909c000/0x0/0x1bfc00000, data 0x2973593/0x29f2000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97665024 unmapped: 860160 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:49:27.118955+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97665024 unmapped: 860160 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 913501 data_alloc: 184549376 data_used: 15450112
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:49:28.119226+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97665024 unmapped: 860160 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:49:29.119635+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97665024 unmapped: 860160 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:49:30.119969+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 85 heartbeat osd_stat(store_statfs(0x1b909c000/0x0/0x1bfc00000, data 0x2973593/0x29f2000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97665024 unmapped: 860160 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:49:31.120205+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97665024 unmapped: 860160 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:49:32.120383+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97665024 unmapped: 860160 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 913501 data_alloc: 184549376 data_used: 15450112
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:49:33.120671+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97665024 unmapped: 860160 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:49:34.120943+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97665024 unmapped: 860160 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:49:35.121173+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 85 heartbeat osd_stat(store_statfs(0x1b909c000/0x0/0x1bfc00000, data 0x2973593/0x29f2000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97665024 unmapped: 860160 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:49:36.121447+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: mgrc handle_mgr_map Got map version 39
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/1471406,v1:172.18.0.108:6811/1471406]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97673216 unmapped: 851968 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:49:37.121587+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97673216 unmapped: 851968 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 913501 data_alloc: 184549376 data_used: 15450112
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:49:38.121945+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97673216 unmapped: 851968 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:49:39.122192+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 85 heartbeat osd_stat(store_statfs(0x1b909c000/0x0/0x1bfc00000, data 0x2973593/0x29f2000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 85 heartbeat osd_stat(store_statfs(0x1b909c000/0x0/0x1bfc00000, data 0x2973593/0x29f2000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97673216 unmapped: 851968 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:49:40.122557+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97673216 unmapped: 851968 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:49:41.122769+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97673216 unmapped: 851968 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:49:42.123025+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 85 heartbeat osd_stat(store_statfs(0x1b909c000/0x0/0x1bfc00000, data 0x2973593/0x29f2000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97673216 unmapped: 851968 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 913501 data_alloc: 184549376 data_used: 15450112
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:49:43.123241+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97673216 unmapped: 851968 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:49:44.123461+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97673216 unmapped: 851968 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:49:45.123678+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97673216 unmapped: 851968 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:49:46.123974+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 85 heartbeat osd_stat(store_statfs(0x1b909c000/0x0/0x1bfc00000, data 0x2973593/0x29f2000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97673216 unmapped: 851968 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:49:47.124202+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97673216 unmapped: 851968 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 913501 data_alloc: 184549376 data_used: 15450112
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:49:48.125090+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97673216 unmapped: 851968 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:49:49.125273+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97673216 unmapped: 851968 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:49:50.125466+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97673216 unmapped: 851968 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:49:51.125629+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97673216 unmapped: 851968 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:49:52.125762+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 85 heartbeat osd_stat(store_statfs(0x1b909c000/0x0/0x1bfc00000, data 0x2973593/0x29f2000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97673216 unmapped: 851968 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 913501 data_alloc: 184549376 data_used: 15450112
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:49:53.125938+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97673216 unmapped: 851968 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:49:54.126062+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97673216 unmapped: 851968 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:49:55.126235+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97673216 unmapped: 851968 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:49:56.126425+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97673216 unmapped: 851968 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:49:57.126509+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 85 heartbeat osd_stat(store_statfs(0x1b909c000/0x0/0x1bfc00000, data 0x2973593/0x29f2000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97673216 unmapped: 851968 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 913501 data_alloc: 184549376 data_used: 15450112
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:49:58.126653+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97673216 unmapped: 851968 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:49:59.126973+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97673216 unmapped: 851968 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:50:00.127130+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97673216 unmapped: 851968 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:50:01.127331+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 85 heartbeat osd_stat(store_statfs(0x1b909c000/0x0/0x1bfc00000, data 0x2973593/0x29f2000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97673216 unmapped: 851968 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:50:02.127517+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97673216 unmapped: 851968 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 913501 data_alloc: 184549376 data_used: 15450112
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:50:03.127717+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97673216 unmapped: 851968 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:50:04.127921+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97673216 unmapped: 851968 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 85 heartbeat osd_stat(store_statfs(0x1b909c000/0x0/0x1bfc00000, data 0x2973593/0x29f2000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:50:05.128093+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97673216 unmapped: 851968 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:50:06.128339+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97673216 unmapped: 851968 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:50:07.128599+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97673216 unmapped: 851968 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 913501 data_alloc: 184549376 data_used: 15450112
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:50:08.128814+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97673216 unmapped: 851968 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:50:09.128953+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97673216 unmapped: 851968 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:50:10.129082+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 85 heartbeat osd_stat(store_statfs(0x1b909c000/0x0/0x1bfc00000, data 0x2973593/0x29f2000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97673216 unmapped: 851968 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:50:11.129186+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97673216 unmapped: 851968 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:50:12.129325+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97673216 unmapped: 851968 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 913501 data_alloc: 184549376 data_used: 15450112
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:50:13.129515+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97673216 unmapped: 851968 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:50:14.129606+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97673216 unmapped: 851968 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 85 heartbeat osd_stat(store_statfs(0x1b909c000/0x0/0x1bfc00000, data 0x2973593/0x29f2000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:50:15.129712+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97673216 unmapped: 851968 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:50:16.129950+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97673216 unmapped: 851968 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:50:17.130128+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97673216 unmapped: 851968 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 913501 data_alloc: 184549376 data_used: 15450112
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:50:18.130282+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97673216 unmapped: 851968 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:50:19.130426+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97673216 unmapped: 851968 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:50:20.130622+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 85 heartbeat osd_stat(store_statfs(0x1b909c000/0x0/0x1bfc00000, data 0x2973593/0x29f2000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97673216 unmapped: 851968 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:50:21.130843+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97673216 unmapped: 851968 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:50:22.131002+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97673216 unmapped: 851968 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 913501 data_alloc: 184549376 data_used: 15450112
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:50:23.131174+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 85 heartbeat osd_stat(store_statfs(0x1b909c000/0x0/0x1bfc00000, data 0x2973593/0x29f2000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97673216 unmapped: 851968 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:50:24.131339+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97673216 unmapped: 851968 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:50:25.131705+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97673216 unmapped: 851968 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:50:26.131959+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97673216 unmapped: 851968 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:50:27.132103+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97656832 unmapped: 868352 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 913501 data_alloc: 184549376 data_used: 15450112
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:50:28.132336+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97656832 unmapped: 868352 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:50:29.132474+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 85 heartbeat osd_stat(store_statfs(0x1b909c000/0x0/0x1bfc00000, data 0x2973593/0x29f2000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97656832 unmapped: 868352 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:50:30.132847+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97656832 unmapped: 868352 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:50:31.133170+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97656832 unmapped: 868352 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:50:32.133393+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97656832 unmapped: 868352 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 913501 data_alloc: 184549376 data_used: 15450112
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:50:33.133591+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 85 heartbeat osd_stat(store_statfs(0x1b909c000/0x0/0x1bfc00000, data 0x2973593/0x29f2000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97656832 unmapped: 868352 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:50:34.133734+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: mgrc handle_mgr_map Got map version 40
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: mgrc handle_mgr_map Active mgr is now 
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: mgrc reconnect Terminating session with v2:172.18.0.108:6810/1471406
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: mgrc reconnect No active mgr available yet
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 85 heartbeat osd_stat(store_statfs(0x1b909c000/0x0/0x1bfc00000, data 0x2973593/0x29f2000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 85 handle_osd_map epochs [86,86], i have 85, src has [1,86]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 85 handle_osd_map epochs [86,86], i have 86, src has [1,86]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 85 handle_osd_map epochs [86,86], i have 86, src has [1,86]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 85 handle_osd_map epochs [86,86], i have 86, src has [1,86]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 83.789047241s of 83.843292236s, submitted: 12
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 86 ms_handle_reset con 0x564b5b153800 session 0x564b5b1b4000
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b57530000
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:50:35.133903+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97869824 unmapped: 655360 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: mgrc handle_mgr_map Got map version 41
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.107:6810/2356945423,v1:172.18.0.107:6811/2356945423]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: mgrc reconnect Starting new session with [v2:172.18.0.107:6810/2356945423,v1:172.18.0.107:6811/2356945423]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: get_auth_request con 0x564b59a85000 auth_method 0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: mgrc handle_mgr_configure stats_period=5
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:50:36.134073+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97533952 unmapped: 991232 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 86 heartbeat osd_stat(store_statfs(0x1b9097000/0x0/0x1bfc00000, data 0x297593b/0x29f6000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:50:37.134227+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97533952 unmapped: 991232 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: mgrc handle_mgr_map Got map version 42
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.107:6810/2356945423,v1:172.18.0.107:6811/2356945423]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:50:38.134339+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97533952 unmapped: 991232 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 916821 data_alloc: 184549376 data_used: 15462400
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:50:39.134526+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97533952 unmapped: 991232 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: mgrc handle_mgr_map Got map version 43
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.107:6810/2356945423,v1:172.18.0.107:6811/2356945423]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:50:40.134684+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97542144 unmapped: 983040 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 86 heartbeat osd_stat(store_statfs(0x1b9098000/0x0/0x1bfc00000, data 0x297593b/0x29f6000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: mgrc handle_mgr_map Got map version 44
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.107:6810/2356945423,v1:172.18.0.107:6811/2356945423]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:50:41.134843+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97132544 unmapped: 1392640 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:50:42.135006+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97132544 unmapped: 1392640 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:50:43.135152+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97132544 unmapped: 1392640 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 916821 data_alloc: 184549376 data_used: 15462400
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:50:44.135313+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97132544 unmapped: 1392640 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 86 heartbeat osd_stat(store_statfs(0x1b9098000/0x0/0x1bfc00000, data 0x297593b/0x29f6000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:50:45.135488+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97132544 unmapped: 1392640 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 86 heartbeat osd_stat(store_statfs(0x1b9098000/0x0/0x1bfc00000, data 0x297593b/0x29f6000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:50:46.135684+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97132544 unmapped: 1392640 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:50:47.135863+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97132544 unmapped: 1392640 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:50:48.136027+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97132544 unmapped: 1392640 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 916821 data_alloc: 184549376 data_used: 15462400
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:50:49.136193+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97132544 unmapped: 1392640 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:50:50.136330+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97132544 unmapped: 1392640 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 86 heartbeat osd_stat(store_statfs(0x1b9098000/0x0/0x1bfc00000, data 0x297593b/0x29f6000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:50:51.136477+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97132544 unmapped: 1392640 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:50:52.136620+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97132544 unmapped: 1392640 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:50:53.136753+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97132544 unmapped: 1392640 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 916821 data_alloc: 184549376 data_used: 15462400
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:50:54.136919+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97132544 unmapped: 1392640 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:50:55.137077+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97132544 unmapped: 1392640 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:50:56.137244+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97132544 unmapped: 1392640 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 86 heartbeat osd_stat(store_statfs(0x1b9098000/0x0/0x1bfc00000, data 0x297593b/0x29f6000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:50:57.137394+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97132544 unmapped: 1392640 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:50:58.137537+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97132544 unmapped: 1392640 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 916821 data_alloc: 184549376 data_used: 15462400
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:50:59.137676+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97132544 unmapped: 1392640 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:51:00.137856+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97132544 unmapped: 1392640 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:51:01.138077+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97132544 unmapped: 1392640 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 86 heartbeat osd_stat(store_statfs(0x1b9098000/0x0/0x1bfc00000, data 0x297593b/0x29f6000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:51:02.138256+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97132544 unmapped: 1392640 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:51:03.138465+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97132544 unmapped: 1392640 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 916821 data_alloc: 184549376 data_used: 15462400
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:51:04.138610+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97132544 unmapped: 1392640 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 86 heartbeat osd_stat(store_statfs(0x1b9098000/0x0/0x1bfc00000, data 0x297593b/0x29f6000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:51:05.138754+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97132544 unmapped: 1392640 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:51:06.138943+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97132544 unmapped: 1392640 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:51:07.139111+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97132544 unmapped: 1392640 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:51:08.139317+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97132544 unmapped: 1392640 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 916821 data_alloc: 184549376 data_used: 15462400
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:51:09.139454+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97132544 unmapped: 1392640 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:51:10.139612+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97132544 unmapped: 1392640 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 86 heartbeat osd_stat(store_statfs(0x1b9098000/0x0/0x1bfc00000, data 0x297593b/0x29f6000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:51:11.139745+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97132544 unmapped: 1392640 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:51:12.139921+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97132544 unmapped: 1392640 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:51:13.140109+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97132544 unmapped: 1392640 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 916821 data_alloc: 184549376 data_used: 15462400
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 86 heartbeat osd_stat(store_statfs(0x1b9098000/0x0/0x1bfc00000, data 0x297593b/0x29f6000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:51:14.140297+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97132544 unmapped: 1392640 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:51:15.140434+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97132544 unmapped: 1392640 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:51:16.140604+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97132544 unmapped: 1392640 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 86 heartbeat osd_stat(store_statfs(0x1b9098000/0x0/0x1bfc00000, data 0x297593b/0x29f6000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:51:17.140748+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97132544 unmapped: 1392640 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:51:18.140915+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97132544 unmapped: 1392640 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 916821 data_alloc: 184549376 data_used: 15462400
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 86 heartbeat osd_stat(store_statfs(0x1b9098000/0x0/0x1bfc00000, data 0x297593b/0x29f6000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:51:19.141073+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97132544 unmapped: 1392640 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:51:20.141380+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97132544 unmapped: 1392640 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:51:21.141574+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97132544 unmapped: 1392640 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:51:22.141767+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97132544 unmapped: 1392640 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 86 heartbeat osd_stat(store_statfs(0x1b9098000/0x0/0x1bfc00000, data 0x297593b/0x29f6000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:51:23.141925+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97132544 unmapped: 1392640 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 916821 data_alloc: 184549376 data_used: 15462400
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:51:24.142053+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97132544 unmapped: 1392640 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:51:25.142253+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97132544 unmapped: 1392640 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:51:26.142456+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97132544 unmapped: 1392640 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:51:27.142628+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97132544 unmapped: 1392640 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:51:28.142791+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97132544 unmapped: 1392640 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 916821 data_alloc: 184549376 data_used: 15462400
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 86 heartbeat osd_stat(store_statfs(0x1b9098000/0x0/0x1bfc00000, data 0x297593b/0x29f6000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:51:29.142971+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97132544 unmapped: 1392640 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:51:30.143141+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97132544 unmapped: 1392640 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:51:31.143306+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97132544 unmapped: 1392640 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:51:32.143481+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97132544 unmapped: 1392640 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 86 heartbeat osd_stat(store_statfs(0x1b9098000/0x0/0x1bfc00000, data 0x297593b/0x29f6000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:51:33.143643+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97132544 unmapped: 1392640 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 916821 data_alloc: 184549376 data_used: 15462400
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:51:34.143790+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97132544 unmapped: 1392640 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:51:35.143970+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97132544 unmapped: 1392640 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:51:36.144190+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97132544 unmapped: 1392640 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 86 heartbeat osd_stat(store_statfs(0x1b9098000/0x0/0x1bfc00000, data 0x297593b/0x29f6000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:51:37.144335+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97132544 unmapped: 1392640 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:51:38.144496+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97132544 unmapped: 1392640 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 916821 data_alloc: 184549376 data_used: 15462400
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:51:39.144662+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97132544 unmapped: 1392640 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:51:40.144806+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97132544 unmapped: 1392640 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:51:41.145013+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97132544 unmapped: 1392640 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:51:42.145182+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 86 heartbeat osd_stat(store_statfs(0x1b9098000/0x0/0x1bfc00000, data 0x297593b/0x29f6000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97132544 unmapped: 1392640 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:51:43.145293+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97132544 unmapped: 1392640 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 916821 data_alloc: 184549376 data_used: 15462400
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:51:44.145419+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97132544 unmapped: 1392640 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:51:45.145555+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97132544 unmapped: 1392640 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:51:46.145734+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97132544 unmapped: 1392640 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 86 heartbeat osd_stat(store_statfs(0x1b9098000/0x0/0x1bfc00000, data 0x297593b/0x29f6000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:51:47.145894+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97132544 unmapped: 1392640 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:51:48.146076+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97132544 unmapped: 1392640 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 916821 data_alloc: 184549376 data_used: 15462400
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 86 heartbeat osd_stat(store_statfs(0x1b9098000/0x0/0x1bfc00000, data 0x297593b/0x29f6000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:51:49.146245+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97132544 unmapped: 1392640 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:51:50.146405+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97132544 unmapped: 1392640 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 86 heartbeat osd_stat(store_statfs(0x1b9098000/0x0/0x1bfc00000, data 0x297593b/0x29f6000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:51:51.146589+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97132544 unmapped: 1392640 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 86 heartbeat osd_stat(store_statfs(0x1b9098000/0x0/0x1bfc00000, data 0x297593b/0x29f6000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:51:52.146742+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97132544 unmapped: 1392640 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:51:53.146908+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97132544 unmapped: 1392640 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 916821 data_alloc: 184549376 data_used: 15462400
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:51:54.147115+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97132544 unmapped: 1392640 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:51:55.147267+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97132544 unmapped: 1392640 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:51:56.147428+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97132544 unmapped: 1392640 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:51:57.147625+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97132544 unmapped: 1392640 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 86 heartbeat osd_stat(store_statfs(0x1b9098000/0x0/0x1bfc00000, data 0x297593b/0x29f6000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:51:58.147754+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97132544 unmapped: 1392640 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 916821 data_alloc: 184549376 data_used: 15462400
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:51:59.147896+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97132544 unmapped: 1392640 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:52:00.148075+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97132544 unmapped: 1392640 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:52:01.148220+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97132544 unmapped: 1392640 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:52:02.148407+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97132544 unmapped: 1392640 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:52:03.148522+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 97132544 unmapped: 1392640 heap: 98525184 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 916821 data_alloc: 184549376 data_used: 15462400
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 88.429977417s of 88.484077454s, submitted: 12
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b59a85800
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 86 heartbeat osd_stat(store_statfs(0x1b9098000/0x0/0x1bfc00000, data 0x297593b/0x29f6000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:52:04.148654+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 96100352 unmapped: 16064512 heap: 112164864 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:52:05.148813+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _renew_subs
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _send_mon_message to mon.np0005626463 at v2:172.18.0.103:3300/0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 86 handle_osd_map epochs [87,87], i have 86, src has [1,87]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 87 ms_handle_reset con 0x564b59a85800 session 0x564b5b1b4d20
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 96198656 unmapped: 15966208 heap: 112164864 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:52:06.148962+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 87 heartbeat osd_stat(store_statfs(0x1b8090000/0x0/0x1bfc00000, data 0x3977d33/0x39fd000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 96223232 unmapped: 15941632 heap: 112164864 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:52:07.149118+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 87 handle_osd_map epochs [87,88], i have 87, src has [1,88]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: mgrc handle_mgr_map Got map version 45
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.107:6810/2356945423,v1:172.18.0.107:6811/2356945423]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 88 handle_osd_map epochs [88,88], i have 88, src has [1,88]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 88 handle_osd_map epochs [88,88], i have 88, src has [1,88]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 96509952 unmapped: 15654912 heap: 112164864 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:52:08.149258+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1043110 data_alloc: 184549376 data_used: 12853248
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 96518144 unmapped: 15646720 heap: 112164864 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:52:09.149414+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 96518144 unmapped: 15646720 heap: 112164864 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:52:10.149641+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 88 heartbeat osd_stat(store_statfs(0x1b808a000/0x0/0x1bfc00000, data 0x397a0f8/0x3a02000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 96518144 unmapped: 15646720 heap: 112164864 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:52:11.149855+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 96518144 unmapped: 15646720 heap: 112164864 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:52:12.150082+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 96518144 unmapped: 15646720 heap: 112164864 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:52:13.150281+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1043110 data_alloc: 184549376 data_used: 12853248
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 96518144 unmapped: 15646720 heap: 112164864 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:52:14.150488+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 88 heartbeat osd_stat(store_statfs(0x1b808a000/0x0/0x1bfc00000, data 0x397a0f8/0x3a02000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 96518144 unmapped: 15646720 heap: 112164864 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:52:15.150684+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 96518144 unmapped: 15646720 heap: 112164864 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:52:16.150941+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 96518144 unmapped: 15646720 heap: 112164864 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 88 heartbeat osd_stat(store_statfs(0x1b808a000/0x0/0x1bfc00000, data 0x397a0f8/0x3a02000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:52:17.151141+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 96518144 unmapped: 15646720 heap: 112164864 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 88 heartbeat osd_stat(store_statfs(0x1b808a000/0x0/0x1bfc00000, data 0x397a0f8/0x3a02000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:52:18.151340+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1043110 data_alloc: 184549376 data_used: 12853248
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 96518144 unmapped: 15646720 heap: 112164864 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:52:19.151554+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 96518144 unmapped: 15646720 heap: 112164864 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:52:20.151783+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 96518144 unmapped: 15646720 heap: 112164864 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:52:21.151982+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 96518144 unmapped: 15646720 heap: 112164864 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:52:22.152214+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 96518144 unmapped: 15646720 heap: 112164864 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:52:23.152439+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1043110 data_alloc: 184549376 data_used: 12853248
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 96518144 unmapped: 15646720 heap: 112164864 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 88 heartbeat osd_stat(store_statfs(0x1b808a000/0x0/0x1bfc00000, data 0x397a0f8/0x3a02000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:52:24.152668+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 96518144 unmapped: 15646720 heap: 112164864 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:52:25.152863+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 88 heartbeat osd_stat(store_statfs(0x1b808a000/0x0/0x1bfc00000, data 0x397a0f8/0x3a02000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 96518144 unmapped: 15646720 heap: 112164864 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:52:26.153119+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 96518144 unmapped: 15646720 heap: 112164864 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:52:27.153311+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 96518144 unmapped: 15646720 heap: 112164864 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:52:28.153479+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1043110 data_alloc: 184549376 data_used: 12853248
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 96518144 unmapped: 15646720 heap: 112164864 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:52:29.153715+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 96518144 unmapped: 15646720 heap: 112164864 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 88 heartbeat osd_stat(store_statfs(0x1b808a000/0x0/0x1bfc00000, data 0x397a0f8/0x3a02000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:52:30.153925+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 96518144 unmapped: 15646720 heap: 112164864 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:52:31.154088+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 88 heartbeat osd_stat(store_statfs(0x1b808a000/0x0/0x1bfc00000, data 0x397a0f8/0x3a02000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 96518144 unmapped: 15646720 heap: 112164864 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:52:32.154259+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 88 heartbeat osd_stat(store_statfs(0x1b808a000/0x0/0x1bfc00000, data 0x397a0f8/0x3a02000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 96518144 unmapped: 15646720 heap: 112164864 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:52:33.154444+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1043110 data_alloc: 184549376 data_used: 12853248
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 96518144 unmapped: 15646720 heap: 112164864 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:52:34.154601+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 96518144 unmapped: 15646720 heap: 112164864 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:52:35.154741+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 88 heartbeat osd_stat(store_statfs(0x1b808a000/0x0/0x1bfc00000, data 0x397a0f8/0x3a02000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 96518144 unmapped: 15646720 heap: 112164864 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 88 heartbeat osd_stat(store_statfs(0x1b808a000/0x0/0x1bfc00000, data 0x397a0f8/0x3a02000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:52:36.154920+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 96518144 unmapped: 15646720 heap: 112164864 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:52:37.155069+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 96518144 unmapped: 15646720 heap: 112164864 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:52:38.155170+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1043110 data_alloc: 184549376 data_used: 12853248
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 96518144 unmapped: 15646720 heap: 112164864 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:52:39.155355+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 96518144 unmapped: 15646720 heap: 112164864 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:52:40.155514+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 88 heartbeat osd_stat(store_statfs(0x1b808a000/0x0/0x1bfc00000, data 0x397a0f8/0x3a02000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 96518144 unmapped: 15646720 heap: 112164864 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:52:41.155616+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 96518144 unmapped: 15646720 heap: 112164864 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:52:42.155765+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 96518144 unmapped: 15646720 heap: 112164864 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:52:43.155917+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 88 heartbeat osd_stat(store_statfs(0x1b808a000/0x0/0x1bfc00000, data 0x397a0f8/0x3a02000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1043110 data_alloc: 184549376 data_used: 12853248
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 96518144 unmapped: 15646720 heap: 112164864 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:52:44.156074+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 96518144 unmapped: 15646720 heap: 112164864 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 88 heartbeat osd_stat(store_statfs(0x1b808a000/0x0/0x1bfc00000, data 0x397a0f8/0x3a02000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:52:45.156217+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 96518144 unmapped: 15646720 heap: 112164864 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:52:46.156400+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 96518144 unmapped: 15646720 heap: 112164864 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:52:47.156549+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 96518144 unmapped: 15646720 heap: 112164864 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:52:48.156642+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1043110 data_alloc: 184549376 data_used: 12853248
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 96518144 unmapped: 15646720 heap: 112164864 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 88 heartbeat osd_stat(store_statfs(0x1b808a000/0x0/0x1bfc00000, data 0x397a0f8/0x3a02000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:52:49.156747+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 96518144 unmapped: 15646720 heap: 112164864 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:52:50.156929+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 88 heartbeat osd_stat(store_statfs(0x1b808a000/0x0/0x1bfc00000, data 0x397a0f8/0x3a02000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 96518144 unmapped: 15646720 heap: 112164864 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:52:51.157069+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 96518144 unmapped: 15646720 heap: 112164864 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:52:52.157226+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 96518144 unmapped: 15646720 heap: 112164864 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 88 heartbeat osd_stat(store_statfs(0x1b808a000/0x0/0x1bfc00000, data 0x397a0f8/0x3a02000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:52:53.157367+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1043110 data_alloc: 184549376 data_used: 12853248
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 96518144 unmapped: 15646720 heap: 112164864 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:52:54.157524+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 96518144 unmapped: 15646720 heap: 112164864 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 88 heartbeat osd_stat(store_statfs(0x1b808a000/0x0/0x1bfc00000, data 0x397a0f8/0x3a02000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:52:55.157669+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 96518144 unmapped: 15646720 heap: 112164864 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:52:56.157859+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 96518144 unmapped: 15646720 heap: 112164864 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:52:57.157980+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 96518144 unmapped: 15646720 heap: 112164864 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b5a2c0800
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 54.312435150s of 54.537883759s, submitted: 37
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 88 ms_handle_reset con 0x564b5a2c0800 session 0x564b57c810e0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b57d3e800
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:52:58.158126+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1040986 data_alloc: 184549376 data_used: 12853248
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 96518144 unmapped: 15646720 heap: 112164864 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 88 ms_handle_reset con 0x564b57d3e800 session 0x564b5833a1e0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:52:59.158311+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b598fb800
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 102801408 unmapped: 9363456 heap: 112164864 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 88 ms_handle_reset con 0x564b598fb800 session 0x564b59a785a0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:53:00.158436+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 96722944 unmapped: 15441920 heap: 112164864 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 88 heartbeat osd_stat(store_statfs(0x1b7493000/0x0/0x1bfc00000, data 0x4572121/0x45fb000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:53:01.158564+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b59a85800
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 88 ms_handle_reset con 0x564b59a85800 session 0x564b5aee3860
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 96722944 unmapped: 15441920 heap: 112164864 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:53:02.158702+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b5b153800
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 88 ms_handle_reset con 0x564b5b153800 session 0x564b5aee2960
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 96722944 unmapped: 15441920 heap: 112164864 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:53:03.158858+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1130983 data_alloc: 184549376 data_used: 12853248
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 96722944 unmapped: 15441920 heap: 112164864 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b59896400
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 88 ms_handle_reset con 0x564b59896400 session 0x564b583441e0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b57d3e800
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 88 ms_handle_reset con 0x564b57d3e800 session 0x564b59f8d2c0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:53:04.159017+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b598fb800
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 96763904 unmapped: 15400960 heap: 112164864 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b59a85800
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:53:05.159290+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 101425152 unmapped: 10739712 heap: 112164864 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 88 heartbeat osd_stat(store_statfs(0x1b7492000/0x0/0x1bfc00000, data 0x457217d/0x45fc000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:53:06.159497+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 103161856 unmapped: 9003008 heap: 112164864 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 88 heartbeat osd_stat(store_statfs(0x1b7492000/0x0/0x1bfc00000, data 0x457217d/0x45fc000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:53:07.159671+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 103161856 unmapped: 9003008 heap: 112164864 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:53:08.159829+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1184505 data_alloc: 184549376 data_used: 20004864
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 103161856 unmapped: 9003008 heap: 112164864 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:53:09.160026+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 103161856 unmapped: 9003008 heap: 112164864 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:53:10.160206+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 103161856 unmapped: 9003008 heap: 112164864 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 88 heartbeat osd_stat(store_statfs(0x1b7492000/0x0/0x1bfc00000, data 0x457217d/0x45fc000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:53:11.160403+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 103161856 unmapped: 9003008 heap: 112164864 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:53:12.160626+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 88 heartbeat osd_stat(store_statfs(0x1b7492000/0x0/0x1bfc00000, data 0x457217d/0x45fc000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 103170048 unmapped: 8994816 heap: 112164864 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b5b153800
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 14.692027092s of 14.996329308s, submitted: 43
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:53:13.160785+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1186319 data_alloc: 184549376 data_used: 20004864
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 103194624 unmapped: 8970240 heap: 112164864 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:53:14.160935+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 88 handle_osd_map epochs [89,89], i have 88, src has [1,89]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _renew_subs
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _send_mon_message to mon.np0005626463 at v2:172.18.0.103:3300/0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 88 handle_osd_map epochs [89,89], i have 89, src has [1,89]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 89 handle_osd_map epochs [89,89], i have 89, src has [1,89]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 89 ms_handle_reset con 0x564b5b153800 session 0x564b57c825a0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 89 heartbeat osd_stat(store_statfs(0x1b7140000/0x0/0x1bfc00000, data 0x48b7552/0x4945000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 105414656 unmapped: 6750208 heap: 112164864 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:53:15.161070+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 105570304 unmapped: 6594560 heap: 112164864 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:53:16.161228+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 89 handle_osd_map epochs [90,90], i have 89, src has [1,90]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.16] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1e] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.17] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.b] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.0] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b5b153c00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 105914368 unmapped: 6250496 heap: 112164864 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 90 ms_handle_reset con 0x564b5b153c00 session 0x564b595e9c20
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:53:17.161369+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 105013248 unmapped: 7151616 heap: 112164864 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 90 heartbeat osd_stat(store_statfs(0x1b6e8b000/0x0/0x1bfc00000, data 0x4b73948/0x4c03000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:53:18.161513+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b598fe800
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1246584 data_alloc: 184549376 data_used: 19955712
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 105013248 unmapped: 7151616 heap: 112164864 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _renew_subs
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _send_mon_message to mon.np0005626463 at v2:172.18.0.103:3300/0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 90 handle_osd_map epochs [91,91], i have 90, src has [1,91]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.17] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1e] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.b] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.16] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.0] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 91 ms_handle_reset con 0x564b59a85800 session 0x564b57a8b4a0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 91 ms_handle_reset con 0x564b598fb800 session 0x564b59ff7c20
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:53:19.161662+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 91 ms_handle_reset con 0x564b598fe800 session 0x564b5833fa40
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 104996864 unmapped: 7168000 heap: 112164864 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 91 heartbeat osd_stat(store_statfs(0x1b6e88000/0x0/0x1bfc00000, data 0x4b75d0b/0x4c05000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mgr metadata"} v 0)
Feb 23 10:10:35 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/2832297312' entity='client.admin' cmd={"prefix": "mgr metadata"} : dispatch
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:53:20.161787+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 104996864 unmapped: 7168000 heap: 112164864 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b57d3e800
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _renew_subs
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _send_mon_message to mon.np0005626463 at v2:172.18.0.103:3300/0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 91 handle_osd_map epochs [92,92], i have 91, src has [1,92]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 92 handle_osd_map epochs [92,92], i have 92, src has [1,92]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 92 ms_handle_reset con 0x564b57d3e800 session 0x564b5a30c5a0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b59a85800
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b5b153800
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 92 ms_handle_reset con 0x564b5b153800 session 0x564b584934a0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 92 ms_handle_reset con 0x564b59a85800 session 0x564b5a30c1e0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b5b153c00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:53:21.161925+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 92 heartbeat osd_stat(store_statfs(0x1b6e3e000/0x0/0x1bfc00000, data 0x4bbd12b/0x4c4f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [0,0,0,0,0,1])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 110379008 unmapped: 26296320 heap: 136675328 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 92 ms_handle_reset con 0x564b5b153c00 session 0x564b5a30cf00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b5b153c00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:53:22.162080+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 92 heartbeat osd_stat(store_statfs(0x1b4f73000/0x0/0x1bfc00000, data 0x6a8a0c9/0x6b1b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 110379008 unmapped: 26296320 heap: 136675328 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 92 handle_osd_map epochs [93,93], i have 92, src has [1,93]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[2.17] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 93 ms_handle_reset con 0x564b5b153c00 session 0x564b57c810e0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[2.1] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[2.0] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b57d3e800
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[2.9] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[2.3] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[2.1e] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[2.1a] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 93 heartbeat osd_stat(store_statfs(0x1b4f73000/0x0/0x1bfc00000, data 0x6a8a0c9/0x6b1b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:53:23.162211+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1511348 data_alloc: 184549376 data_used: 23756800
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 110182400 unmapped: 26492928 heap: 136675328 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _renew_subs
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _send_mon_message to mon.np0005626463 at v2:172.18.0.103:3300/0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 93 handle_osd_map epochs [94,94], i have 93, src has [1,94]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.875823021s of 10.757616997s, submitted: 192
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 94 handle_osd_map epochs [94,94], i have 94, src has [1,94]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 94 ms_handle_reset con 0x564b57d3e800 session 0x564b5b6e0960
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:53:24.162361+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b598fe800
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 94 ms_handle_reset con 0x564b598fe800 session 0x564b5aee3e00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 110288896 unmapped: 26386432 heap: 136675328 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b59a85800
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 94 ms_handle_reset con 0x564b59a85800 session 0x564b5aee2960
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:53:25.162531+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b5b153800
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 94 ms_handle_reset con 0x564b5b153800 session 0x564b5b6e0b40
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 110026752 unmapped: 26648576 heap: 136675328 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:53:26.162740+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 94 handle_osd_map epochs [95,95], i have 94, src has [1,95]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.0] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.b] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.16] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.17] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1e] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 95 handle_osd_map epochs [95,95], i have 95, src has [1,95]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 95 heartbeat osd_stat(store_statfs(0x1b4f6a000/0x0/0x1bfc00000, data 0x6a8e917/0x6b24000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b57d3e800
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 110043136 unmapped: 26632192 heap: 136675328 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 95 ms_handle_reset con 0x564b57d3e800 session 0x564b5b6e1a40
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:53:27.162825+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 103284736 unmapped: 33390592 heap: 136675328 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:53:28.162974+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1327006 data_alloc: 184549376 data_used: 12836864
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 103284736 unmapped: 33390592 heap: 136675328 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:53:29.163117+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 103284736 unmapped: 33390592 heap: 136675328 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:53:30.163228+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 103284736 unmapped: 33390592 heap: 136675328 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:53:31.777491+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b598fe800
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _renew_subs
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _send_mon_message to mon.np0005626463 at v2:172.18.0.103:3300/0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 95 handle_osd_map epochs [96,96], i have 95, src has [1,96]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[2.1a] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[2.0] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[2.3] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[2.17] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 96 ms_handle_reset con 0x564b598fe800 session 0x564b57a8d860
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b59a85800
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[2.9] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[2.1] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[2.1e] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 103391232 unmapped: 33284096 heap: 136675328 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 96 ms_handle_reset con 0x564b59a85800 session 0x564b5bc42000
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 96 heartbeat osd_stat(store_statfs(0x1b5d4b000/0x0/0x1bfc00000, data 0x589bb36/0x5931000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:53:32.777627+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 103407616 unmapped: 33267712 heap: 136675328 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b5b153c00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b5b153000
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 96 ms_handle_reset con 0x564b5b153000 session 0x564b58493860
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b5b153400
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 96 ms_handle_reset con 0x564b5b153c00 session 0x564b5bc425a0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b5b153c00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b57d3e800
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 96 ms_handle_reset con 0x564b5b153c00 session 0x564b5bc42d20
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 96 ms_handle_reset con 0x564b57d3e800 session 0x564b5bc42f00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 96 ms_handle_reset con 0x564b5b153400 session 0x564b58492000
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b598fe800
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 96 ms_handle_reset con 0x564b598fe800 session 0x564b5bc434a0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b59a85800
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1414767 data_alloc: 184549376 data_used: 12836864
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 96 ms_handle_reset con 0x564b59a85800 session 0x564b5bc43860
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b59a85800
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b57d3e800
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 96 ms_handle_reset con 0x564b57d3e800 session 0x564b595e92c0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 96 ms_handle_reset con 0x564b59a85800 session 0x564b5bc43e00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:53:33.777753+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b598fe800
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b5b153400
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 96 ms_handle_reset con 0x564b598fe800 session 0x564b59675a40
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 96 ms_handle_reset con 0x564b5b153400 session 0x564b59ff6b40
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b5b153c00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 96 ms_handle_reset con 0x564b5b153c00 session 0x564b585b90e0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b5b153c00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 104628224 unmapped: 32047104 heap: 136675328 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.767568588s of 10.221001625s, submitted: 135
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b57d3e800
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 96 ms_handle_reset con 0x564b57d3e800 session 0x564b59ff7a40
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 96 ms_handle_reset con 0x564b5b153c00 session 0x564b58342960
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:53:34.777955+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 122093568 unmapped: 22986752 heap: 145080320 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b598fe800
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 96 ms_handle_reset con 0x564b598fe800 session 0x564b5965a000
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b59a85800
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 96 ms_handle_reset con 0x564b59a85800 session 0x564b584934a0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:53:35.778129+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b5b153400
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 96 ms_handle_reset con 0x564b5b153400 session 0x564b5a30d0e0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 122093568 unmapped: 22986752 heap: 145080320 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b5b153400
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 96 ms_handle_reset con 0x564b5b153400 session 0x564b5aee32c0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b57d3e800
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 96 heartbeat osd_stat(store_statfs(0x1b3a89000/0x0/0x1bfc00000, data 0x7f6ce20/0x8005000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 96 ms_handle_reset con 0x564b57d3e800 session 0x564b5aee2960
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b598fe800
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 96 ms_handle_reset con 0x564b598fe800 session 0x564b59591a40
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b59a85800
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b5b153c00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 96 ms_handle_reset con 0x564b59a85800 session 0x564b595903c0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:53:36.778906+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b5b153000
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _renew_subs
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _send_mon_message to mon.np0005626463 at v2:172.18.0.103:3300/0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 96 handle_osd_map epochs [97,97], i have 96, src has [1,97]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.16] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 122626048 unmapped: 22454272 heap: 145080320 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1e] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b5a304400
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.0] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.17] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.b] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 97 ms_handle_reset con 0x564b5b153c00 session 0x564b58343c20
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets getting new tickets!
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _send_mon_message to mon.np0005626463 at v2:172.18.0.103:3300/0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:53:37.779122+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _finish_auth 0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:53:37.779836+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 123002880 unmapped: 22077440 heap: 145080320 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1519843 data_alloc: 184549376 data_used: 22548480
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:53:38.779258+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 123355136 unmapped: 21725184 heap: 145080320 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:53:39.779379+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 123355136 unmapped: 21725184 heap: 145080320 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:53:40.779585+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 123355136 unmapped: 21725184 heap: 145080320 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:53:41.779715+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 97 heartbeat osd_stat(store_statfs(0x1b594f000/0x0/0x1bfc00000, data 0x60a51b4/0x613e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 123387904 unmapped: 21692416 heap: 145080320 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:53:42.779903+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _send_mon_message to mon.np0005626463 at v2:172.18.0.103:3300/0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 126509056 unmapped: 18571264 heap: 145080320 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1554403 data_alloc: 201326592 data_used: 25694208
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:53:43.780061+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 126509056 unmapped: 18571264 heap: 145080320 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:53:44.780226+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 126509056 unmapped: 18571264 heap: 145080320 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 10.959340096s of 11.483025551s, submitted: 110
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:53:45.780370+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 97 heartbeat osd_stat(store_statfs(0x1b5950000/0x0/0x1bfc00000, data 0x60a51b4/0x613e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 119283712 unmapped: 25796608 heap: 145080320 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:53:46.780765+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 97 handle_osd_map epochs [97,98], i have 97, src has [1,98]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1e] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.b] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.17] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 119529472 unmapped: 25550848 heap: 145080320 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 98 heartbeat osd_stat(store_statfs(0x1b594b000/0x0/0x1bfc00000, data 0x60a743c/0x6142000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.16] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.0] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:53:47.780911+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 119562240 unmapped: 25518080 heap: 145080320 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1558403 data_alloc: 201326592 data_used: 27459584
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:53:48.781042+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 98 heartbeat osd_stat(store_statfs(0x1b594c000/0x0/0x1bfc00000, data 0x60a743c/0x6142000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 119799808 unmapped: 25280512 heap: 145080320 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:53:49.782331+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 119799808 unmapped: 25280512 heap: 145080320 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 98 ms_handle_reset con 0x564b5b153000 session 0x564b596703c0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:53:50.782517+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 120848384 unmapped: 24231936 heap: 145080320 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b57d3e800
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:53:51.782716+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 125952000 unmapped: 19128320 heap: 145080320 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _renew_subs
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _send_mon_message to mon.np0005626463 at v2:172.18.0.103:3300/0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 98 handle_osd_map epochs [99,99], i have 98, src has [1,99]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 99 ms_handle_reset con 0x564b57d3e800 session 0x564b57a8d860
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:53:52.782907+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 122159104 unmapped: 22921216 heap: 145080320 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b598fe800
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1751153 data_alloc: 201326592 data_used: 31518720
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:53:53.783035+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 141893632 unmapped: 7045120 heap: 148938752 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 99 heartbeat osd_stat(store_statfs(0x1b3e89000/0x0/0x1bfc00000, data 0x7b677ec/0x7c05000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b59a85800
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 99 ms_handle_reset con 0x564b598fe800 session 0x564b57a901e0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 99 heartbeat osd_stat(store_statfs(0x1b3e89000/0x0/0x1bfc00000, data 0x7b677ec/0x7c05000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b5b153400
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:53:54.783184+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _renew_subs
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _send_mon_message to mon.np0005626463 at v2:172.18.0.103:3300/0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 99 handle_osd_map epochs [100,100], i have 99, src has [1,100]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[2.17] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[2.3] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 130711552 unmapped: 30392320 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 100 heartbeat osd_stat(store_statfs(0x1b2336000/0x0/0x1bfc00000, data 0x96b7be2/0x9757000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [0,0,0,0,0,0,0,0,0,1])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[2.0] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[2.1a] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 100 handle_osd_map epochs [100,100], i have 100, src has [1,100]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:53:55.783336+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 130965504 unmapped: 30138368 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.868025780s of 10.648583412s, submitted: 154
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[2.1] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[2.9] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[2.1e] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 100 ms_handle_reset con 0x564b5b153400 session 0x564b596745a0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:53:56.783474+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 132251648 unmapped: 28852224 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 100 ms_handle_reset con 0x564b5a304400 session 0x564b598b90e0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:53:57.783646+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 132284416 unmapped: 28819456 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:53:58.888641+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1952597 data_alloc: 201326592 data_used: 31571968
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _renew_subs
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _send_mon_message to mon.np0005626463 at v2:172.18.0.103:3300/0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 100 handle_osd_map epochs [101,101], i have 100, src has [1,101]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 100 handle_osd_map epochs [101,101], i have 101, src has [1,101]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 131645440 unmapped: 29458432 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 101 heartbeat osd_stat(store_statfs(0x1b22a3000/0x0/0x1bfc00000, data 0x9742be2/0x97e2000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:53:59.888778+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 131645440 unmapped: 29458432 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b5a304400
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:54:00.888956+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 101 ms_handle_reset con 0x564b5a304400 session 0x564b598b8b40
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 116817920 unmapped: 44285952 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _renew_subs
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _send_mon_message to mon.np0005626463 at v2:172.18.0.103:3300/0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 101 handle_osd_map epochs [102,102], i have 101, src has [1,102]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[2.1] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[2.1e] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[2.9] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:54:01.889091+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 116834304 unmapped: 44269568 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[2.17] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[2.0] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[2.3] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[2.1a] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:54:02.889215+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 116097024 unmapped: 45006848 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:54:03.889338+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1664643 data_alloc: 184549376 data_used: 13803520
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 102 heartbeat osd_stat(store_statfs(0x1b3f8c000/0x0/0x1bfc00000, data 0x7a5e228/0x7b01000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 116097024 unmapped: 45006848 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:54:04.889931+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 116097024 unmapped: 45006848 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 102 ms_handle_reset con 0x564b59a85800 session 0x564b59ff65a0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b57d3e800
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 102 ms_handle_reset con 0x564b57d3e800 session 0x564b59f8cf00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b598fe800
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b5b153000
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 102 ms_handle_reset con 0x564b5b153000 session 0x564b57c83680
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b5b153400
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 102 ms_handle_reset con 0x564b5b153400 session 0x564b57a90960
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b57d3e800
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 102 ms_handle_reset con 0x564b57d3e800 session 0x564b57a91e00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b59a85800
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:54:05.890091+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 102 ms_handle_reset con 0x564b59a85800 session 0x564b57c814a0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b5a304400
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.398721695s of 10.001255035s, submitted: 140
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 126230528 unmapped: 34873344 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 102 ms_handle_reset con 0x564b598fe800 session 0x564b59f8da40
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 102 ms_handle_reset con 0x564b5a304400 session 0x564b59f12b40
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:54:06.890265+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 117514240 unmapped: 43589632 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b5b153000
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 102 ms_handle_reset con 0x564b5b153000 session 0x564b598b9680
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b5b153000
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 102 ms_handle_reset con 0x564b5b153000 session 0x564b57c830e0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:54:07.890427+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 102 heartbeat osd_stat(store_statfs(0x1b4098000/0x0/0x1bfc00000, data 0x7953228/0x79f6000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b57d3e800
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 102 ms_handle_reset con 0x564b57d3e800 session 0x564b57a905a0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 117514240 unmapped: 43589632 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b598fe800
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 102 ms_handle_reset con 0x564b598fe800 session 0x564b59675680
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b59a85800
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:54:08.890571+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1668989 data_alloc: 184549376 data_used: 14454784
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b5a304400
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _renew_subs
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _send_mon_message to mon.np0005626463 at v2:172.18.0.103:3300/0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 102 handle_osd_map epochs [103,103], i have 102, src has [1,103]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.16] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1e] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.0] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.17] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.b] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 117604352 unmapped: 43499520 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 103 handle_osd_map epochs [103,103], i have 103, src has [1,103]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 103 handle_osd_map epochs [103,103], i have 103, src has [1,103]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 103 ms_handle_reset con 0x564b5a304400 session 0x564b585b8000
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:54:09.890722+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 111337472 unmapped: 49766400 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 103 ms_handle_reset con 0x564b59a85800 session 0x564b57c83c20
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b59a85800
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:54:10.890846+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 103 heartbeat osd_stat(store_statfs(0x1b8050000/0x0/0x1bfc00000, data 0x399b5ae/0x3a3e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 103 ms_handle_reset con 0x564b59a85800 session 0x564b59ff61e0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 104341504 unmapped: 56762368 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:54:11.890983+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 104341504 unmapped: 56762368 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:54:12.891214+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 104341504 unmapped: 56762368 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:54:13.891395+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1166528 data_alloc: 184549376 data_used: 3203072
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 104341504 unmapped: 56762368 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 103 heartbeat osd_stat(store_statfs(0x1b8050000/0x0/0x1bfc00000, data 0x399b5ae/0x3a3e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:54:14.891597+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b57d3e800
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 103 heartbeat osd_stat(store_statfs(0x1b8050000/0x0/0x1bfc00000, data 0x399b5ae/0x3a3e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 103 handle_osd_map epochs [103,104], i have 103, src has [1,104]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 104 handle_osd_map epochs [104,104], i have 104, src has [1,104]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 104382464 unmapped: 56721408 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 104 ms_handle_reset con 0x564b57d3e800 session 0x564b5833d860
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b598fe800
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:54:15.891844+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 104382464 unmapped: 56721408 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _renew_subs
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _send_mon_message to mon.np0005626463 at v2:172.18.0.103:3300/0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 104 handle_osd_map epochs [105,105], i have 104, src has [1,105]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.653216362s of 10.180534363s, submitted: 132
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.b] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.17] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1e] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.16] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 105 ms_handle_reset con 0x564b598fe800 session 0x564b5b1b52c0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.0] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _renew_subs
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _send_mon_message to mon.np0005626463 at v2:172.18.0.103:3300/0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 105 handle_osd_map epochs [106,106], i have 105, src has [1,106]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.0] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.b] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.17] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.16] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1e] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:54:16.892080+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 104480768 unmapped: 56623104 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:54:17.892298+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 104480768 unmapped: 56623104 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:54:18.892462+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1178491 data_alloc: 184549376 data_used: 3219456
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 104480768 unmapped: 56623104 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:54:19.892668+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 104480768 unmapped: 56623104 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 106 heartbeat osd_stat(store_statfs(0x1b8042000/0x0/0x1bfc00000, data 0x39a2016/0x3a4a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:54:20.892842+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 104480768 unmapped: 56623104 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:54:21.893114+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 104488960 unmapped: 56614912 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 106 handle_osd_map epochs [107,107], i have 106, src has [1,107]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.0] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.b] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.17] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.16] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1e] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:54:22.893341+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 107 heartbeat osd_stat(store_statfs(0x1b803f000/0x0/0x1bfc00000, data 0x39a429e/0x3a4e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 104538112 unmapped: 56565760 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:54:23.893581+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1180821 data_alloc: 184549376 data_used: 3219456
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 104538112 unmapped: 56565760 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 107 heartbeat osd_stat(store_statfs(0x1b803f000/0x0/0x1bfc00000, data 0x39a429e/0x3a4e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:54:24.894076+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 104538112 unmapped: 56565760 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:54:25.894216+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 104538112 unmapped: 56565760 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 107 heartbeat osd_stat(store_statfs(0x1b803f000/0x0/0x1bfc00000, data 0x39a429e/0x3a4e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:54:26.894475+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 104538112 unmapped: 56565760 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:54:27.894745+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 107 heartbeat osd_stat(store_statfs(0x1b803f000/0x0/0x1bfc00000, data 0x39a429e/0x3a4e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 104538112 unmapped: 56565760 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:54:28.894950+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1180821 data_alloc: 184549376 data_used: 3219456
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 104538112 unmapped: 56565760 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:54:29.895224+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 104538112 unmapped: 56565760 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:54:30.895496+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 107 heartbeat osd_stat(store_statfs(0x1b803f000/0x0/0x1bfc00000, data 0x39a429e/0x3a4e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 104538112 unmapped: 56565760 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:54:31.895658+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 104538112 unmapped: 56565760 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:54:32.895814+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 104538112 unmapped: 56565760 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:54:33.896029+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1180821 data_alloc: 184549376 data_used: 3219456
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 104538112 unmapped: 56565760 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:54:34.896197+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 104562688 unmapped: 56541184 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 107 heartbeat osd_stat(store_statfs(0x1b803f000/0x0/0x1bfc00000, data 0x39a429e/0x3a4e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:54:35.896352+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 107 heartbeat osd_stat(store_statfs(0x1b803f000/0x0/0x1bfc00000, data 0x39a429e/0x3a4e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 104562688 unmapped: 56541184 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:54:36.896551+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 104562688 unmapped: 56541184 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:54:37.896708+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 104562688 unmapped: 56541184 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:54:38.896908+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1180981 data_alloc: 184549376 data_used: 3223552
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 107 heartbeat osd_stat(store_statfs(0x1b803f000/0x0/0x1bfc00000, data 0x39a429e/0x3a4e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b5a304400
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 107 ms_handle_reset con 0x564b5a304400 session 0x564b5965ad20
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b5b153000
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 107 ms_handle_reset con 0x564b5b153000 session 0x564b595e9e00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b5b153000
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 107 ms_handle_reset con 0x564b5b153000 session 0x564b5965b680
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 104562688 unmapped: 56541184 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b57d3e800
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 107 ms_handle_reset con 0x564b57d3e800 session 0x564b583430e0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b598fe800
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 23.180376053s of 23.283435822s, submitted: 59
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:54:39.897124+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 107 ms_handle_reset con 0x564b598fe800 session 0x564b57a8c960
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b59a85800
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 107 ms_handle_reset con 0x564b59a85800 session 0x564b59aa7680
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b5a304400
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 107 ms_handle_reset con 0x564b5a304400 session 0x564b5833e780
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b5a304400
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 107 ms_handle_reset con 0x564b5a304400 session 0x564b57a8ab40
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b57d3e800
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 107 ms_handle_reset con 0x564b57d3e800 session 0x564b598b8000
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 103251968 unmapped: 57851904 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:54:40.897271+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 103292928 unmapped: 57810944 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:54:41.897428+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 103292928 unmapped: 57810944 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:54:42.897606+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 107 heartbeat osd_stat(store_statfs(0x1b6f9b000/0x0/0x1bfc00000, data 0x4a4929e/0x4af3000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 103301120 unmapped: 57802752 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:54:43.897764+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1306518 data_alloc: 184549376 data_used: 3223552
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 103301120 unmapped: 57802752 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:54:44.897926+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 103301120 unmapped: 57802752 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:54:45.898122+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 107 heartbeat osd_stat(store_statfs(0x1b6f9b000/0x0/0x1bfc00000, data 0x4a4929e/0x4af3000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b598fe800
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 107 ms_handle_reset con 0x564b598fe800 session 0x564b59aa7860
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 103489536 unmapped: 57614336 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b59a85800
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b5b153000
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:54:46.898287+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 103481344 unmapped: 57622528 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:54:47.898422+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 104333312 unmapped: 56770560 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:54:48.898563+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1343119 data_alloc: 184549376 data_used: 7516160
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 104333312 unmapped: 56770560 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 107 heartbeat osd_stat(store_statfs(0x1b6f70000/0x0/0x1bfc00000, data 0x4a732ae/0x4b1e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:54:49.898717+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 104333312 unmapped: 56770560 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:54:50.898847+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 104333312 unmapped: 56770560 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 107 heartbeat osd_stat(store_statfs(0x1b6f70000/0x0/0x1bfc00000, data 0x4a732ae/0x4b1e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:54:51.898972+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 104333312 unmapped: 56770560 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:54:52.899092+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 104333312 unmapped: 56770560 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:54:53.899378+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1343119 data_alloc: 184549376 data_used: 7516160
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 104333312 unmapped: 56770560 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:54:54.899526+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 104333312 unmapped: 56770560 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 107 heartbeat osd_stat(store_statfs(0x1b6f70000/0x0/0x1bfc00000, data 0x4a732ae/0x4b1e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:54:55.899671+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 16.140342712s of 16.325368881s, submitted: 35
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 110075904 unmapped: 51027968 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:54:56.899818+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 108208128 unmapped: 52895744 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:54:57.899952+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 108486656 unmapped: 52617216 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:54:58.900072+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1432779 data_alloc: 184549376 data_used: 7671808
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 108494848 unmapped: 52609024 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:54:59.900157+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 108494848 unmapped: 52609024 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:55:00.900293+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 107 heartbeat osd_stat(store_statfs(0x1b652e000/0x0/0x1bfc00000, data 0x54b52ae/0x5560000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 108494848 unmapped: 52609024 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:55:01.900440+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b59ff9c00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 108511232 unmapped: 52592640 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 107 heartbeat osd_stat(store_statfs(0x1b652e000/0x0/0x1bfc00000, data 0x54b52ae/0x5560000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:55:02.900583+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _renew_subs
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _send_mon_message to mon.np0005626463 at v2:172.18.0.103:3300/0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 107 handle_osd_map epochs [108,108], i have 107, src has [1,108]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 108 handle_osd_map epochs [108,108], i have 108, src has [1,108]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 108 ms_handle_reset con 0x564b59ff9c00 session 0x564b59aa6960
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 108593152 unmapped: 52510720 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:55:03.900697+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1442484 data_alloc: 184549376 data_used: 7692288
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 108 heartbeat osd_stat(store_statfs(0x1b6525000/0x0/0x1bfc00000, data 0x54b7aa6/0x5568000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 108675072 unmapped: 52428800 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:55:04.900832+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 108 handle_osd_map epochs [109,109], i have 108, src has [1,109]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 109 handle_osd_map epochs [109,109], i have 109, src has [1,109]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 109977600 unmapped: 51126272 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:55:05.901086+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 109 handle_osd_map epochs [110,110], i have 109, src has [1,110]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.812421799s of 10.248526573s, submitted: 94
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1e] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.17] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.b] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b5b1b6400
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.16] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.0] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 110 handle_osd_map epochs [110,110], i have 110, src has [1,110]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 109002752 unmapped: 52101120 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 110 ms_handle_reset con 0x564b5b1b6400 session 0x564b596714a0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:55:06.901268+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b57d3e800
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 109027328 unmapped: 52076544 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 110 heartbeat osd_stat(store_statfs(0x1b64f9000/0x0/0x1bfc00000, data 0x54dc23e/0x5590000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 110 ms_handle_reset con 0x564b59a85800 session 0x564b5833f680
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 110 ms_handle_reset con 0x564b5b153000 session 0x564b5833e1e0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:55:07.901406+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _renew_subs
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _send_mon_message to mon.np0005626463 at v2:172.18.0.103:3300/0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 110 handle_osd_map epochs [111,111], i have 110, src has [1,111]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1e] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.16] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.17] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.b] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.0] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b598fe800
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 111 ms_handle_reset con 0x564b57d3e800 session 0x564b596705a0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 106823680 unmapped: 54280192 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 111 ms_handle_reset con 0x564b598fe800 session 0x564b5c4494a0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:55:08.901727+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1215477 data_alloc: 184549376 data_used: 3260416
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 106823680 unmapped: 54280192 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b59ff9c00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:55:09.901993+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 111 handle_osd_map epochs [112,112], i have 111, src has [1,112]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _renew_subs
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _send_mon_message to mon.np0005626463 at v2:172.18.0.103:3300/0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 111 handle_osd_map epochs [112,112], i have 112, src has [1,112]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 112 handle_osd_map epochs [112,112], i have 112, src has [1,112]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 112 ms_handle_reset con 0x564b59ff9c00 session 0x564b5b06c5a0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 106905600 unmapped: 54198272 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b59ff9c00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:55:10.902141+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 112 heartbeat osd_stat(store_statfs(0x1b7c27000/0x0/0x1bfc00000, data 0x39afb66/0x3a67000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 112 handle_osd_map epochs [113,113], i have 112, src has [1,113]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 112 handle_osd_map epochs [113,113], i have 113, src has [1,113]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 113 handle_osd_map epochs [113,113], i have 113, src has [1,113]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 113 ms_handle_reset con 0x564b59ff9c00 session 0x564b5774d2c0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 113 handle_osd_map epochs [113,113], i have 113, src has [1,113]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b57d3e800
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 106921984 unmapped: 54181888 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b598fe800
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:55:11.902272+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 113 handle_osd_map epochs [113,114], i have 113, src has [1,114]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1e] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.17] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.b] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 114 handle_osd_map epochs [114,114], i have 114, src has [1,114]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.16] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.0] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 114 ms_handle_reset con 0x564b57d3e800 session 0x564b5774dc20
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b59a85800
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 114 ms_handle_reset con 0x564b59a85800 session 0x564b59f8da40
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b5b153000
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 114 ms_handle_reset con 0x564b5b153000 session 0x564b5bc42b40
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b5a304400
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 114 ms_handle_reset con 0x564b5a304400 session 0x564b5bc421e0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b57d3e800
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 114 ms_handle_reset con 0x564b57d3e800 session 0x564b57a90f00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 114 heartbeat osd_stat(store_statfs(0x1b71ad000/0x0/0x1bfc00000, data 0x441e6bc/0x44df000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 108134400 unmapped: 52969472 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:55:12.902456+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _renew_subs
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _send_mon_message to mon.np0005626463 at v2:172.18.0.103:3300/0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 114 handle_osd_map epochs [115,115], i have 114, src has [1,115]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 115 ms_handle_reset con 0x564b598fe800 session 0x564b598b9680
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 108191744 unmapped: 52912128 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b59a85800
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:55:13.902628+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _renew_subs
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _send_mon_message to mon.np0005626463 at v2:172.18.0.103:3300/0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 115 handle_osd_map epochs [116,116], i have 115, src has [1,116]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 116 heartbeat osd_stat(store_statfs(0x1b71a5000/0x0/0x1bfc00000, data 0x4422e94/0x44e8000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 116 ms_handle_reset con 0x564b59a85800 session 0x564b5bc42960
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1331168 data_alloc: 184549376 data_used: 3284992
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 107208704 unmapped: 53895168 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 116 heartbeat osd_stat(store_statfs(0x1b71a5000/0x0/0x1bfc00000, data 0x4422e94/0x44e8000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:55:14.902777+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b59ff9c00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _renew_subs
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _send_mon_message to mon.np0005626463 at v2:172.18.0.103:3300/0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 116 handle_osd_map epochs [117,117], i have 116, src has [1,117]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 117 ms_handle_reset con 0x564b59ff9c00 session 0x564b598b90e0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b5b153000
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 117 ms_handle_reset con 0x564b5b153000 session 0x564b5a30c000
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 117 handle_osd_map epochs [117,117], i have 117, src has [1,117]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 107307008 unmapped: 53796864 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b57d3e800
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 117 ms_handle_reset con 0x564b57d3e800 session 0x564b5c449860
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:55:15.902986+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 117 handle_osd_map epochs [117,118], i have 117, src has [1,118]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 118 handle_osd_map epochs [118,118], i have 118, src has [1,118]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b598fe800
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 118 ms_handle_reset con 0x564b598fe800 session 0x564b59671860
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b59a85800
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 8.990425110s of 10.109376907s, submitted: 279
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 118 ms_handle_reset con 0x564b59a85800 session 0x564b57c82d20
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 107380736 unmapped: 53723136 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b59ff9c00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b582e1400
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:55:16.903294+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _renew_subs
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _send_mon_message to mon.np0005626463 at v2:172.18.0.103:3300/0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 118 handle_osd_map epochs [119,119], i have 118, src has [1,119]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.0] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.16] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1e] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.17] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.b] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 107397120 unmapped: 53706752 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 119 handle_osd_map epochs [119,119], i have 119, src has [1,119]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:55:17.903413+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 119 handle_osd_map epochs [120,120], i have 119, src has [1,120]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1e] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.16] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.17] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.b] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.0] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b59ff8000
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 120 handle_osd_map epochs [119,120], i have 120, src has [1,120]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 120 ms_handle_reset con 0x564b59ff8000 session 0x564b58342d20
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 109264896 unmapped: 51838976 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 120 heartbeat osd_stat(store_statfs(0x1b7194000/0x0/0x1bfc00000, data 0x442a323/0x44f9000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b598fac00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:55:18.903552+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 120 handle_osd_map epochs [121,121], i have 120, src has [1,121]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1e] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.16] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.0] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.17] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.b] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1393927 data_alloc: 184549376 data_used: 8617984
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 121 handle_osd_map epochs [121,121], i have 121, src has [1,121]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 121 ms_handle_reset con 0x564b598fac00 session 0x564b59ff6000
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 109297664 unmapped: 51806208 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:55:19.903722+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b57d3e800
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _renew_subs
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _send_mon_message to mon.np0005626463 at v2:172.18.0.103:3300/0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 121 handle_osd_map epochs [122,122], i have 121, src has [1,122]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.16] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.0] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 122 heartbeat osd_stat(store_statfs(0x1b7191000/0x0/0x1bfc00000, data 0x442e094/0x44fd000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1e] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.17] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.b] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 122 ms_handle_reset con 0x564b57d3e800 session 0x564b5aee2f00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 109322240 unmapped: 51781632 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b598fe800
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:55:20.903914+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _renew_subs
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _send_mon_message to mon.np0005626463 at v2:172.18.0.103:3300/0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 122 handle_osd_map epochs [123,123], i have 122, src has [1,123]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1e] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.16] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.0] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.17] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.b] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 109371392 unmapped: 51732480 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 123 ms_handle_reset con 0x564b598fe800 session 0x564b585b9e00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:55:21.904062+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 109387776 unmapped: 51716096 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 123 heartbeat osd_stat(store_statfs(0x1b718a000/0x0/0x1bfc00000, data 0x44323a2/0x4500000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:55:22.904205+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 123 handle_osd_map epochs [124,124], i have 123, src has [1,124]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1e] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.16] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.17] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.b] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.0] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b59a85800
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 124 ms_handle_reset con 0x564b59a85800 session 0x564b585b8b40
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 109404160 unmapped: 51699712 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b59ff8000
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:55:23.904810+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 124 handle_osd_map epochs [125,125], i have 124, src has [1,125]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1e] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.16] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.17] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.b] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.0] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1398942 data_alloc: 184549376 data_used: 8630272
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 125 handle_osd_map epochs [125,125], i have 125, src has [1,125]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 125 ms_handle_reset con 0x564b59ff8000 session 0x564b58493860
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 110501888 unmapped: 50601984 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:55:24.904960+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b59897000
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b582e0c00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 110526464 unmapped: 50577408 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 125 heartbeat osd_stat(store_statfs(0x1b7187000/0x0/0x1bfc00000, data 0x44365b4/0x4503000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:55:25.905106+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 110526464 unmapped: 50577408 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.319211960s of 10.273212433s, submitted: 261
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 125 ms_handle_reset con 0x564b59ff9c00 session 0x564b5833c960
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 125 ms_handle_reset con 0x564b582e1400 session 0x564b598b85a0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:55:26.905298+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _renew_subs
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _send_mon_message to mon.np0005626463 at v2:172.18.0.103:3300/0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 125 handle_osd_map epochs [126,126], i have 125, src has [1,126]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.0] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.16] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.17] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1e] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.b] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 110526464 unmapped: 50577408 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:55:27.905445+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 117563392 unmapped: 43540480 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:55:28.905592+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1513553 data_alloc: 184549376 data_used: 9048064
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 117506048 unmapped: 43597824 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:55:29.905735+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 126 heartbeat osd_stat(store_statfs(0x1b6225000/0x0/0x1bfc00000, data 0x539a858/0x5469000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [0,0,0,0,1,0,4])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 117014528 unmapped: 44089344 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:55:30.905924+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 117022720 unmapped: 44081152 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:55:31.906072+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 126 handle_osd_map epochs [126,127], i have 126, src has [1,127]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1e] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.17] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.b] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.16] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.0] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 116826112 unmapped: 44277760 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:55:32.906245+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 116826112 unmapped: 44277760 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:55:33.906457+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1522789 data_alloc: 184549376 data_used: 8884224
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 127 heartbeat osd_stat(store_statfs(0x1b620f000/0x0/0x1bfc00000, data 0x53adae0/0x547e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 116826112 unmapped: 44277760 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:55:34.906934+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 116826112 unmapped: 44277760 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:55:35.907100+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 116826112 unmapped: 44277760 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:55:36.907320+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 116826112 unmapped: 44277760 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:55:37.907477+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 11.135357857s of 11.639736176s, submitted: 162
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 127 ms_handle_reset con 0x564b59897000 session 0x564b5b06c780
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 127 ms_handle_reset con 0x564b582e0c00 session 0x564b585b8f00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b598fe800
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 116834304 unmapped: 44269568 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:55:38.907613+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 127 ms_handle_reset con 0x564b598fe800 session 0x564b59a78000
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1294536 data_alloc: 184549376 data_used: 3309568
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 113745920 unmapped: 47357952 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:55:39.907743+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 127 heartbeat osd_stat(store_statfs(0x1b7ba5000/0x0/0x1bfc00000, data 0x39d0a5e/0x3a9e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 113745920 unmapped: 47357952 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:55:40.909970+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 113745920 unmapped: 47357952 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:55:41.910121+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 113745920 unmapped: 47357952 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:55:42.910274+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 113745920 unmapped: 47357952 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:55:43.910432+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1294536 data_alloc: 184549376 data_used: 3309568
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 113745920 unmapped: 47357952 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:55:44.910602+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 127 heartbeat osd_stat(store_statfs(0x1b7ba5000/0x0/0x1bfc00000, data 0x39d0a5e/0x3a9e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 113745920 unmapped: 47357952 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:55:45.910773+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 113745920 unmapped: 47357952 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:55:46.910975+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 113745920 unmapped: 47357952 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 127 heartbeat osd_stat(store_statfs(0x1b7ba5000/0x0/0x1bfc00000, data 0x39d0a5e/0x3a9e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:55:47.911113+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 127 heartbeat osd_stat(store_statfs(0x1b7ba5000/0x0/0x1bfc00000, data 0x39d0a5e/0x3a9e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 113745920 unmapped: 47357952 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:55:48.911288+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1294536 data_alloc: 184549376 data_used: 3309568
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 113745920 unmapped: 47357952 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:55:49.911431+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 113745920 unmapped: 47357952 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 127 heartbeat osd_stat(store_statfs(0x1b7ba5000/0x0/0x1bfc00000, data 0x39d0a5e/0x3a9e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:55:50.911544+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 113745920 unmapped: 47357952 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:55:51.911692+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 113745920 unmapped: 47357952 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:55:52.911840+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 113745920 unmapped: 47357952 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:55:53.912179+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 127 heartbeat osd_stat(store_statfs(0x1b7ba5000/0x0/0x1bfc00000, data 0x39d0a5e/0x3a9e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1294536 data_alloc: 184549376 data_used: 3309568
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 113745920 unmapped: 47357952 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:55:54.912317+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 113745920 unmapped: 47357952 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:55:55.912437+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 113745920 unmapped: 47357952 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:55:56.912608+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 113745920 unmapped: 47357952 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:55:57.912751+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 113745920 unmapped: 47357952 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:55:58.912928+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1294536 data_alloc: 184549376 data_used: 3309568
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 113745920 unmapped: 47357952 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:55:59.913082+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 127 heartbeat osd_stat(store_statfs(0x1b7ba5000/0x0/0x1bfc00000, data 0x39d0a5e/0x3a9e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 113745920 unmapped: 47357952 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:56:00.913247+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 113745920 unmapped: 47357952 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:56:01.913396+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 113745920 unmapped: 47357952 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:56:02.913560+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 127 heartbeat osd_stat(store_statfs(0x1b7ba5000/0x0/0x1bfc00000, data 0x39d0a5e/0x3a9e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 113745920 unmapped: 47357952 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:56:03.913699+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1294536 data_alloc: 184549376 data_used: 3309568
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 113745920 unmapped: 47357952 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 127 heartbeat osd_stat(store_statfs(0x1b7ba5000/0x0/0x1bfc00000, data 0x39d0a5e/0x3a9e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:56:04.913831+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 113745920 unmapped: 47357952 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:56:05.913957+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 127 heartbeat osd_stat(store_statfs(0x1b7ba5000/0x0/0x1bfc00000, data 0x39d0a5e/0x3a9e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 113745920 unmapped: 47357952 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:56:06.914133+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 113745920 unmapped: 47357952 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:56:07.914261+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 113745920 unmapped: 47357952 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:56:08.914402+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1294536 data_alloc: 184549376 data_used: 3309568
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 113745920 unmapped: 47357952 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 127 heartbeat osd_stat(store_statfs(0x1b7ba5000/0x0/0x1bfc00000, data 0x39d0a5e/0x3a9e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:56:09.914544+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 113745920 unmapped: 47357952 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:56:10.914683+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 113745920 unmapped: 47357952 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:56:11.914972+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 113745920 unmapped: 47357952 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:56:12.915160+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 127 heartbeat osd_stat(store_statfs(0x1b7ba5000/0x0/0x1bfc00000, data 0x39d0a5e/0x3a9e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 113745920 unmapped: 47357952 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:56:13.915302+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1294536 data_alloc: 184549376 data_used: 3309568
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 113745920 unmapped: 47357952 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:56:14.915426+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 113745920 unmapped: 47357952 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:56:15.915571+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 113745920 unmapped: 47357952 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 127 heartbeat osd_stat(store_statfs(0x1b7ba5000/0x0/0x1bfc00000, data 0x39d0a5e/0x3a9e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:56:16.915743+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 113745920 unmapped: 47357952 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:56:17.915921+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 113745920 unmapped: 47357952 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:56:18.916043+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1294536 data_alloc: 184549376 data_used: 3309568
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 113745920 unmapped: 47357952 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 127 heartbeat osd_stat(store_statfs(0x1b7ba5000/0x0/0x1bfc00000, data 0x39d0a5e/0x3a9e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:56:19.916192+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 113745920 unmapped: 47357952 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:56:20.916349+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 113745920 unmapped: 47357952 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:56:21.916511+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 113745920 unmapped: 47357952 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:56:22.916659+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 113745920 unmapped: 47357952 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:56:23.916785+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1294536 data_alloc: 184549376 data_used: 3309568
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 113745920 unmapped: 47357952 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:56:24.916922+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 127 heartbeat osd_stat(store_statfs(0x1b7ba5000/0x0/0x1bfc00000, data 0x39d0a5e/0x3a9e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 113745920 unmapped: 47357952 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:56:25.917059+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 127 heartbeat osd_stat(store_statfs(0x1b7ba5000/0x0/0x1bfc00000, data 0x39d0a5e/0x3a9e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 113745920 unmapped: 47357952 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:56:26.917232+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 113745920 unmapped: 47357952 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:56:27.917367+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 113745920 unmapped: 47357952 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:56:28.917528+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 127 heartbeat osd_stat(store_statfs(0x1b7ba5000/0x0/0x1bfc00000, data 0x39d0a5e/0x3a9e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1294536 data_alloc: 184549376 data_used: 3309568
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 113745920 unmapped: 47357952 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:56:29.917689+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 113745920 unmapped: 47357952 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:56:30.917839+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 113745920 unmapped: 47357952 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:56:31.917968+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 113745920 unmapped: 47357952 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:56:32.918154+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 127 heartbeat osd_stat(store_statfs(0x1b7ba5000/0x0/0x1bfc00000, data 0x39d0a5e/0x3a9e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 113745920 unmapped: 47357952 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:56:33.918304+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1294536 data_alloc: 184549376 data_used: 3309568
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 113745920 unmapped: 47357952 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:56:34.918449+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 113745920 unmapped: 47357952 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:56:35.918620+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 113745920 unmapped: 47357952 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:56:36.918797+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 113745920 unmapped: 47357952 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:56:37.918950+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 113745920 unmapped: 47357952 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:56:38.919099+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1294536 data_alloc: 184549376 data_used: 3309568
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 127 heartbeat osd_stat(store_statfs(0x1b7ba5000/0x0/0x1bfc00000, data 0x39d0a5e/0x3a9e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 113745920 unmapped: 47357952 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:56:39.919257+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 113745920 unmapped: 47357952 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:56:40.919386+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 113745920 unmapped: 47357952 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:56:41.919525+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 127 heartbeat osd_stat(store_statfs(0x1b7ba5000/0x0/0x1bfc00000, data 0x39d0a5e/0x3a9e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 113541120 unmapped: 47562752 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:56:42.919648+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 113541120 unmapped: 47562752 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:56:43.919775+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1294536 data_alloc: 184549376 data_used: 3309568
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 113541120 unmapped: 47562752 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:56:44.919931+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 113541120 unmapped: 47562752 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:56:45.920122+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 113541120 unmapped: 47562752 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:56:46.920298+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 113541120 unmapped: 47562752 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 127 heartbeat osd_stat(store_statfs(0x1b7ba5000/0x0/0x1bfc00000, data 0x39d0a5e/0x3a9e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:56:47.920474+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 113541120 unmapped: 47562752 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:56:48.920615+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1294536 data_alloc: 184549376 data_used: 3309568
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 113541120 unmapped: 47562752 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:56:49.920744+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 113541120 unmapped: 47562752 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:56:50.920907+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 113541120 unmapped: 47562752 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:56:51.921042+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 127 heartbeat osd_stat(store_statfs(0x1b7ba5000/0x0/0x1bfc00000, data 0x39d0a5e/0x3a9e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 113541120 unmapped: 47562752 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:56:52.921179+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 113541120 unmapped: 47562752 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:56:53.921340+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1294536 data_alloc: 184549376 data_used: 3309568
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 76.214950562s of 76.465080261s, submitted: 47
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 113541120 unmapped: 47562752 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:56:54.921491+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b59a85800
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _renew_subs
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _send_mon_message to mon.np0005626463 at v2:172.18.0.103:3300/0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 127 handle_osd_map epochs [128,128], i have 127, src has [1,128]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 128 ms_handle_reset con 0x564b59a85800 session 0x564b5bc43680
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 113549312 unmapped: 47554560 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:56:55.921626+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 128 heartbeat osd_stat(store_statfs(0x1b7be8000/0x0/0x1bfc00000, data 0x39d3232/0x3aa5000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 113549312 unmapped: 47554560 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:56:56.921782+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 128 heartbeat osd_stat(store_statfs(0x1b7be8000/0x0/0x1bfc00000, data 0x39d3232/0x3aa5000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 113541120 unmapped: 47562752 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:56:57.921904+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 114622464 unmapped: 46481408 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:56:58.921983+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 128 heartbeat osd_stat(store_statfs(0x1b5be8000/0x0/0x1bfc00000, data 0x59d3256/0x5aa6000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1523893 data_alloc: 184549376 data_used: 3321856
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 114663424 unmapped: 46440448 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:56:59.922100+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 114540544 unmapped: 46563328 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:57:00.922231+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 114540544 unmapped: 46563328 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:57:01.922392+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 128 heartbeat osd_stat(store_statfs(0x1b53e8000/0x0/0x1bfc00000, data 0x61d3258/0x62a6000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 114556928 unmapped: 46546944 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:57:02.922529+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b59a85800
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 114581504 unmapped: 46522368 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:57:03.922649+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 128 handle_osd_map epochs [129,129], i have 128, src has [1,129]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 129 handle_osd_map epochs [129,129], i have 129, src has [1,129]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 129 ms_handle_reset con 0x564b59a85800 session 0x564b59675c20
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1804175 data_alloc: 184549376 data_used: 3334144
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 114647040 unmapped: 46456832 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:57:04.922757+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 10.130503654s of 10.457572937s, submitted: 40
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 129 handle_osd_map epochs [130,130], i have 129, src has [1,130]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1e] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b582e0c00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.17] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.b] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.16] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 130 handle_osd_map epochs [129,130], i have 130, src has [1,130]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 130 handle_osd_map epochs [129,130], i have 130, src has [1,130]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 130 handle_osd_map epochs [130,130], i have 130, src has [1,130]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.0] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 130 ms_handle_reset con 0x564b582e0c00 session 0x564b59aa63c0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 114704384 unmapped: 46399488 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:57:05.922928+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b582e1400
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 130 heartbeat osd_stat(store_statfs(0x1b7bdf000/0x0/0x1bfc00000, data 0x39d79ca/0x3aad000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 130 handle_osd_map epochs [131,131], i have 130, src has [1,131]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 130 handle_osd_map epochs [131,131], i have 131, src has [1,131]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 130 handle_osd_map epochs [131,131], i have 131, src has [1,131]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 130 handle_osd_map epochs [131,131], i have 131, src has [1,131]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.16] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1e] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.0] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.17] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.b] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 131 handle_osd_map epochs [131,131], i have 131, src has [1,131]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 131 ms_handle_reset con 0x564b582e1400 session 0x564b584923c0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 114720768 unmapped: 46383104 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:57:06.923065+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:57:07.923223+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 114720768 unmapped: 46383104 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 131 heartbeat osd_stat(store_statfs(0x1b7bdd000/0x0/0x1bfc00000, data 0x39d998e/0x3aae000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:57:08.923323+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 114720768 unmapped: 46383104 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1325284 data_alloc: 184549376 data_used: 3350528
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:57:09.923483+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 114720768 unmapped: 46383104 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:57:10.923695+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 114720768 unmapped: 46383104 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:57:11.923845+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 114720768 unmapped: 46383104 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 131 handle_osd_map epochs [131,132], i have 131, src has [1,132]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.16] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.0] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1e] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.17] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.b] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:57:12.923962+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 114745344 unmapped: 46358528 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b59897000
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 132 heartbeat osd_stat(store_statfs(0x1b7bdb000/0x0/0x1bfc00000, data 0x39dbc16/0x3ab2000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:57:13.924108+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 114745344 unmapped: 46358528 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _renew_subs
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _send_mon_message to mon.np0005626463 at v2:172.18.0.103:3300/0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 132 handle_osd_map epochs [133,133], i have 132, src has [1,133]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1331259 data_alloc: 184549376 data_used: 3350528
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:57:14.924320+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 114745344 unmapped: 46358528 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 133 handle_osd_map epochs [133,134], i have 133, src has [1,134]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.845407486s of 10.143686295s, submitted: 89
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _renew_subs
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _send_mon_message to mon.np0005626463 at v2:172.18.0.103:3300/0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 134 handle_osd_map epochs [134,134], i have 134, src has [1,134]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 134 ms_handle_reset con 0x564b59897000 session 0x564b583430e0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:57:15.924468+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 114745344 unmapped: 46358528 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b598fe800
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:57:16.924646+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 114745344 unmapped: 46358528 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 134 handle_osd_map epochs [135,135], i have 134, src has [1,135]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 135 ms_handle_reset con 0x564b598fe800 session 0x564b5aee3860
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 135 heartbeat osd_stat(store_statfs(0x1b7bcf000/0x0/0x1bfc00000, data 0x39e0433/0x3abd000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:57:17.924787+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 114769920 unmapped: 46333952 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b598fe800
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 135 ms_handle_reset con 0x564b598fe800 session 0x564b5a30de00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b582e0c00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:57:18.924917+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 114778112 unmapped: 46325760 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 135 handle_osd_map epochs [136,136], i have 135, src has [1,136]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 136 ms_handle_reset con 0x564b582e0c00 session 0x564b5a30c960
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 136 heartbeat osd_stat(store_statfs(0x1b7bcc000/0x0/0x1bfc00000, data 0x39e27f1/0x3ac1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1344194 data_alloc: 184549376 data_used: 3354624
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b582e1400
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 136 ms_handle_reset con 0x564b582e1400 session 0x564b5a30d4a0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b59897000
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:57:19.925045+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 114860032 unmapped: 46243840 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 136 ms_handle_reset con 0x564b59897000 session 0x564b598b9c20
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:57:20.925227+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 114876416 unmapped: 46227456 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:57:21.925377+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 114876416 unmapped: 46227456 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 136 heartbeat osd_stat(store_statfs(0x1b7bcb000/0x0/0x1bfc00000, data 0x39e4b85/0x3ac3000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 136 handle_osd_map epochs [137,137], i have 136, src has [1,137]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 136 handle_osd_map epochs [137,137], i have 137, src has [1,137]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 137 handle_osd_map epochs [137,137], i have 137, src has [1,137]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:57:22.925740+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 114917376 unmapped: 46186496 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 137 heartbeat osd_stat(store_statfs(0x1b7bc6000/0x0/0x1bfc00000, data 0x39e6e29/0x3ac7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:57:23.925904+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 114917376 unmapped: 46186496 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1346698 data_alloc: 184549376 data_used: 3362816
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:57:24.926046+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 114917376 unmapped: 46186496 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:57:25.926181+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 114925568 unmapped: 46178304 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b59a85800
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 10.555224419s of 10.838193893s, submitted: 93
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 137 ms_handle_reset con 0x564b59a85800 session 0x564b598b83c0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 137 heartbeat osd_stat(store_statfs(0x1b7bc6000/0x0/0x1bfc00000, data 0x39e6e29/0x3ac7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 137 heartbeat osd_stat(store_statfs(0x1b7bc5000/0x0/0x1bfc00000, data 0x39e6e8b/0x3ac8000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:57:26.926685+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 114925568 unmapped: 46178304 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b59a85800
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:57:27.926864+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 114925568 unmapped: 46178304 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 137 handle_osd_map epochs [138,138], i have 137, src has [1,138]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 138 handle_osd_map epochs [138,138], i have 138, src has [1,138]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 138 handle_osd_map epochs [138,138], i have 138, src has [1,138]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 138 heartbeat osd_stat(store_statfs(0x1b7bbf000/0x0/0x1bfc00000, data 0x39e930d/0x3ace000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [0,0,0,0,0,0,0,1])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 138 ms_handle_reset con 0x564b59a85800 session 0x564b598b8960
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:57:28.927013+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 114941952 unmapped: 46161920 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b582e0c00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 138 ms_handle_reset con 0x564b582e0c00 session 0x564b598b8000
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b582e1400
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1355636 data_alloc: 184549376 data_used: 3379200
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 138 ms_handle_reset con 0x564b582e1400 session 0x564b5bc42780
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:57:29.927205+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 114958336 unmapped: 46145536 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b59897000
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 138 ms_handle_reset con 0x564b59897000 session 0x564b5bc421e0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b598fe800
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 138 ms_handle_reset con 0x564b598fe800 session 0x564b5774cb40
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:57:30.927364+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 114958336 unmapped: 46145536 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:57:31.927678+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 114966528 unmapped: 46137344 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _renew_subs
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _send_mon_message to mon.np0005626463 at v2:172.18.0.103:3300/0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 138 handle_osd_map epochs [139,139], i have 138, src has [1,139]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:57:32.927839+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 114982912 unmapped: 46120960 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 139 heartbeat osd_stat(store_statfs(0x1b7bc0000/0x0/0x1bfc00000, data 0x39e92bb/0x3ace000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:57:33.928248+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 114982912 unmapped: 46120960 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 139 heartbeat osd_stat(store_statfs(0x1b7bbb000/0x0/0x1bfc00000, data 0x39eb543/0x3ad2000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1360793 data_alloc: 184549376 data_used: 3391488
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:57:34.928390+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 114982912 unmapped: 46120960 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 139 heartbeat osd_stat(store_statfs(0x1b7bbb000/0x0/0x1bfc00000, data 0x39eb543/0x3ad2000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:57:35.928581+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 114982912 unmapped: 46120960 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 139 heartbeat osd_stat(store_statfs(0x1b7bbb000/0x0/0x1bfc00000, data 0x39eb543/0x3ad2000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:57:36.928944+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 114982912 unmapped: 46120960 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:57:37.929139+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 114982912 unmapped: 46120960 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:57:38.929321+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b598fe800
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 12.438465118s of 12.903591156s, submitted: 47
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 139 ms_handle_reset con 0x564b598fe800 session 0x564b5774c1e0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 114982912 unmapped: 46120960 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1362269 data_alloc: 184549376 data_used: 3391488
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:57:39.929570+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 114991104 unmapped: 46112768 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b582e0c00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _renew_subs
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _send_mon_message to mon.np0005626463 at v2:172.18.0.103:3300/0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 139 handle_osd_map epochs [140,140], i have 139, src has [1,140]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 140 ms_handle_reset con 0x564b582e0c00 session 0x564b5774dc20
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b582e1400
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 140 handle_osd_map epochs [140,140], i have 140, src has [1,140]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 140 ms_handle_reset con 0x564b582e1400 session 0x564b5833e780
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:57:40.929698+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 115064832 unmapped: 46039040 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b59897000
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 140 ms_handle_reset con 0x564b59897000 session 0x564b59aa6960
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b59a85800
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b59ff8000
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 140 ms_handle_reset con 0x564b59ff8000 session 0x564b5b1b54a0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b582e0c00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:57:41.929823+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 115113984 unmapped: 45989888 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 140 ms_handle_reset con 0x564b582e0c00 session 0x564b5b1b5680
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 140 heartbeat osd_stat(store_statfs(0x1b7bb5000/0x0/0x1bfc00000, data 0x39ed958/0x3ad8000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _renew_subs
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _send_mon_message to mon.np0005626463 at v2:172.18.0.103:3300/0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 140 handle_osd_map epochs [141,141], i have 140, src has [1,141]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 141 ms_handle_reset con 0x564b59a85800 session 0x564b59f8d2c0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:57:42.929967+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b582e1400
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 115154944 unmapped: 45948928 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 141 ms_handle_reset con 0x564b582e1400 session 0x564b5833e780
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b59897000
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 141 ms_handle_reset con 0x564b59897000 session 0x564b5b6e01e0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b598fe800
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:57:43.930109+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 114245632 unmapped: 46858240 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 141 handle_osd_map epochs [141,141], i have 141, src has [1,141]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 141 ms_handle_reset con 0x564b598fe800 session 0x564b5bc42780
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1374943 data_alloc: 184549376 data_used: 3416064
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:57:44.930249+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 114253824 unmapped: 46850048 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b598fe800
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 141 ms_handle_reset con 0x564b598fe800 session 0x564b59670b40
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 141 heartbeat osd_stat(store_statfs(0x1b7bb2000/0x0/0x1bfc00000, data 0x39efd4d/0x3adc000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b582e0c00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 141 ms_handle_reset con 0x564b582e0c00 session 0x564b5d73a000
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 141 heartbeat osd_stat(store_statfs(0x1b7bb2000/0x0/0x1bfc00000, data 0x39efd4d/0x3adc000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:57:45.930416+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 114319360 unmapped: 46784512 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 141 heartbeat osd_stat(store_statfs(0x1b7bb4000/0x0/0x1bfc00000, data 0x39efcdb/0x3ada000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:57:46.930603+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 114319360 unmapped: 46784512 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 141 heartbeat osd_stat(store_statfs(0x1b7bb4000/0x0/0x1bfc00000, data 0x39efcdb/0x3ada000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:57:47.930748+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 114327552 unmapped: 46776320 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:57:48.930978+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 114327552 unmapped: 46776320 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1372957 data_alloc: 184549376 data_used: 3416064
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:57:49.931155+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 114327552 unmapped: 46776320 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:57:50.931327+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 114327552 unmapped: 46776320 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:57:51.931557+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 114327552 unmapped: 46776320 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _renew_subs
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _send_mon_message to mon.np0005626463 at v2:172.18.0.103:3300/0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 141 handle_osd_map epochs [142,142], i have 141, src has [1,142]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 12.876964569s of 13.404407501s, submitted: 131
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 142 heartbeat osd_stat(store_statfs(0x1b7bb4000/0x0/0x1bfc00000, data 0x39efcdb/0x3ada000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:57:52.931695+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 114335744 unmapped: 46768128 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:57:53.931836+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 114335744 unmapped: 46768128 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1377159 data_alloc: 184549376 data_used: 3428352
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 142 heartbeat osd_stat(store_statfs(0x1b7baf000/0x0/0x1bfc00000, data 0x39f1f63/0x3ade000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:57:54.931969+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 114335744 unmapped: 46768128 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:57:55.932128+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 114335744 unmapped: 46768128 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:57:56.932306+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 114335744 unmapped: 46768128 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:57:57.933063+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 114335744 unmapped: 46768128 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:57:58.933524+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 114335744 unmapped: 46768128 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1377159 data_alloc: 184549376 data_used: 3428352
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:57:59.934121+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 142 heartbeat osd_stat(store_statfs(0x1b7baf000/0x0/0x1bfc00000, data 0x39f1f63/0x3ade000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 114335744 unmapped: 46768128 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:58:00.934709+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 114335744 unmapped: 46768128 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:58:01.935254+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 114335744 unmapped: 46768128 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:58:02.935724+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 114335744 unmapped: 46768128 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:58:03.935860+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b582e1400
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 11.577858925s of 11.593151093s, submitted: 21
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 142 ms_handle_reset con 0x564b582e1400 session 0x564b5d73b860
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 114343936 unmapped: 46759936 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1379581 data_alloc: 184549376 data_used: 3428352
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:58:04.936303+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 114343936 unmapped: 46759936 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 142 heartbeat osd_stat(store_statfs(0x1b7baf000/0x0/0x1bfc00000, data 0x39f1f73/0x3adf000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:58:05.936703+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 114343936 unmapped: 46759936 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b59897000
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 142 ms_handle_reset con 0x564b59897000 session 0x564b5d73ba40
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b59a85800
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 142 ms_handle_reset con 0x564b59a85800 session 0x564b5d73be00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:58:06.937009+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 114294784 unmapped: 46809088 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:58:07.937343+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 114294784 unmapped: 46809088 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:58:08.937457+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 114294784 unmapped: 46809088 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1378116 data_alloc: 184549376 data_used: 3428352
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:58:09.937611+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 142 heartbeat osd_stat(store_statfs(0x1b7bb0000/0x0/0x1bfc00000, data 0x39f1f63/0x3ade000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 114294784 unmapped: 46809088 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:58:10.937765+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 114294784 unmapped: 46809088 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:58:11.937924+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 114294784 unmapped: 46809088 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 142 heartbeat osd_stat(store_statfs(0x1b7bb0000/0x0/0x1bfc00000, data 0x39f1f63/0x3ade000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:58:12.938072+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 114294784 unmapped: 46809088 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:58:13.938218+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 114294784 unmapped: 46809088 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1378116 data_alloc: 184549376 data_used: 3428352
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:58:14.938390+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 114294784 unmapped: 46809088 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 142 heartbeat osd_stat(store_statfs(0x1b7bb0000/0x0/0x1bfc00000, data 0x39f1f63/0x3ade000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:58:15.938533+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 114294784 unmapped: 46809088 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:58:16.938726+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 114294784 unmapped: 46809088 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b582e0c00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 13.525009155s of 13.629662514s, submitted: 24
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b582e1400
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 142 ms_handle_reset con 0x564b582e0c00 session 0x564b57c7a960
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 142 heartbeat osd_stat(store_statfs(0x1b7bb0000/0x0/0x1bfc00000, data 0x39f1f63/0x3ade000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 142 ms_handle_reset con 0x564b582e1400 session 0x564b5bc423c0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b59897000
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b598fe800
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:58:17.938902+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 114311168 unmapped: 46792704 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b5b349400
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 142 ms_handle_reset con 0x564b598fe800 session 0x564b57a8ba40
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 142 ms_handle_reset con 0x564b5b349400 session 0x564b5c448f00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 142 ms_handle_reset con 0x564b59897000 session 0x564b57c7b860
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:58:18.939167+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 114352128 unmapped: 46751744 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b59897000
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 142 ms_handle_reset con 0x564b59897000 session 0x564b5965ad20
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b582e0c00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1380673 data_alloc: 184549376 data_used: 3428352
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:58:19.939300+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 142 ms_handle_reset con 0x564b582e0c00 session 0x564b584934a0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 114368512 unmapped: 46735360 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:58:20.939453+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b582e1400
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 142 ms_handle_reset con 0x564b582e1400 session 0x564b57a90960
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 114368512 unmapped: 46735360 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 142 heartbeat osd_stat(store_statfs(0x1b7baf000/0x0/0x1bfc00000, data 0x39f1f63/0x3ade000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:58:21.939607+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b598fe800
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 114368512 unmapped: 46735360 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b5b349400
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 142 ms_handle_reset con 0x564b5b349400 session 0x564b5d73ab40
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 142 ms_handle_reset con 0x564b598fe800 session 0x564b59590b40
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b598fe800
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 142 ms_handle_reset con 0x564b598fe800 session 0x564b57a8a1e0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b582e0c00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:58:22.939754+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 142 ms_handle_reset con 0x564b582e0c00 session 0x564b58347860
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 114368512 unmapped: 46735360 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:58:23.947407+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b582e1400
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 142 ms_handle_reset con 0x564b582e1400 session 0x564b5833c1e0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 114376704 unmapped: 46727168 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:58:24.947541+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1389524 data_alloc: 184549376 data_used: 3428352
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b59897000
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b5b349400
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 142 ms_handle_reset con 0x564b5b349400 session 0x564b57c82d20
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 142 ms_handle_reset con 0x564b59897000 session 0x564b57c825a0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b59897000
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 142 heartbeat osd_stat(store_statfs(0x1b7bac000/0x0/0x1bfc00000, data 0x39f2099/0x3ae2000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b582e0c00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 114393088 unmapped: 46710784 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b582e1400
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 142 ms_handle_reset con 0x564b582e1400 session 0x564b57c814a0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b598fe800
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 142 ms_handle_reset con 0x564b59897000 session 0x564b598b81e0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 142 ms_handle_reset con 0x564b582e0c00 session 0x564b5b06c780
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 142 ms_handle_reset con 0x564b598fe800 session 0x564b59590000
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:58:25.947679+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b5b349400
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b59897c00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 119054336 unmapped: 42049536 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 142 ms_handle_reset con 0x564b5b349400 session 0x564b59f125a0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b5b349400
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 142 ms_handle_reset con 0x564b59897c00 session 0x564b5b1b4780
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 142 ms_handle_reset con 0x564b5b349400 session 0x564b5bc434a0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b582e0c00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b582e1400
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 142 ms_handle_reset con 0x564b582e0c00 session 0x564b59ff7e00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 142 ms_handle_reset con 0x564b582e1400 session 0x564b5833f4a0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:58:26.948052+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 115720192 unmapped: 45383680 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b59897000
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:58:27.948192+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 142 ms_handle_reset con 0x564b59897000 session 0x564b583421e0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b59897000
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.442699432s of 10.391208649s, submitted: 220
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 115720192 unmapped: 45383680 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b582e0c00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 142 ms_handle_reset con 0x564b59897000 session 0x564b5b06c960
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 142 ms_handle_reset con 0x564b582e0c00 session 0x564b583434a0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b582e1400
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 142 heartbeat osd_stat(store_statfs(0x1b6648000/0x0/0x1bfc00000, data 0x4f56038/0x5045000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:58:28.948321+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 115761152 unmapped: 45342720 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b59897c00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 142 ms_handle_reset con 0x564b582e1400 session 0x564b58492d20
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 142 ms_handle_reset con 0x564b59897c00 session 0x564b595e8d20
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:58:29.948463+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1499572 data_alloc: 184549376 data_used: 3432448
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 115785728 unmapped: 45318144 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b5b349400
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 142 ms_handle_reset con 0x564b5b349400 session 0x564b5965ad20
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b5b349400
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b582e0c00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b582e1400
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 142 ms_handle_reset con 0x564b5b349400 session 0x564b57c7a960
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:58:30.948604+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 115834880 unmapped: 45268992 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b59897000
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _renew_subs
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _send_mon_message to mon.np0005626463 at v2:172.18.0.103:3300/0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 142 handle_osd_map epochs [143,143], i have 142, src has [1,143]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 143 handle_osd_map epochs [143,143], i have 143, src has [1,143]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:58:31.948770+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 143 ms_handle_reset con 0x564b59897000 session 0x564b5c448f00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 143 heartbeat osd_stat(store_statfs(0x1b6ac0000/0x0/0x1bfc00000, data 0x45bffd6/0x46ae000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 115859456 unmapped: 45244416 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _renew_subs
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _send_mon_message to mon.np0005626463 at v2:172.18.0.103:3300/0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 143 handle_osd_map epochs [144,144], i have 143, src has [1,144]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.19] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.17] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.10] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.1d] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.9] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.5] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.1f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.15] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.7] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.1] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 144 ms_handle_reset con 0x564b582e1400 session 0x564b57c7b860
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 144 handle_osd_map epochs [144,144], i have 144, src has [1,144]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:58:32.948926+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 115859456 unmapped: 45244416 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b59897c00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _renew_subs
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _send_mon_message to mon.np0005626463 at v2:172.18.0.103:3300/0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 144 handle_osd_map epochs [145,145], i have 144, src has [1,145]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 145 handle_osd_map epochs [145,145], i have 145, src has [1,145]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 145 handle_osd_map epochs [145,145], i have 145, src has [1,145]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 145 ms_handle_reset con 0x564b59897c00 session 0x564b5bc425a0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b598fe800
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b57531400
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b59fde000
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 145 handle_osd_map epochs [145,145], i have 145, src has [1,145]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:58:33.949055+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 145 ms_handle_reset con 0x564b57531400 session 0x564b5bc423c0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 145 ms_handle_reset con 0x564b598fe800 session 0x564b5bc42780
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 116916224 unmapped: 44187648 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _renew_subs
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _send_mon_message to mon.np0005626463 at v2:172.18.0.103:3300/0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 145 handle_osd_map epochs [146,146], i have 145, src has [1,146]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 146 ms_handle_reset con 0x564b59fde000 session 0x564b5dd36000
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 146 handle_osd_map epochs [146,146], i have 146, src has [1,146]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 146 ms_handle_reset con 0x564b582e0c00 session 0x564b57c7b680
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:58:34.949189+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1651314 data_alloc: 184549376 data_used: 3444736
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b582e1400
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 146 ms_handle_reset con 0x564b582e1400 session 0x564b598b94a0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 116908032 unmapped: 44195840 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:58:35.949323+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 8400.1 total, 600.0 interval
                                                          Cumulative writes: 10K writes, 40K keys, 10K commit groups, 1.0 writes per commit group, ingest: 0.03 GB, 0.00 MB/s
                                                          Cumulative WAL: 10K writes, 2814 syncs, 3.67 writes per sync, written: 0.03 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 4763 writes, 16K keys, 4763 commit groups, 1.0 writes per commit group, ingest: 13.99 MB, 0.02 MB/s
                                                          Interval WAL: 4763 writes, 2036 syncs, 2.34 writes per sync, written: 0.01 GB, 0.02 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 116998144 unmapped: 44105728 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b59897000
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:58:36.949482+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b59897c00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 146 ms_handle_reset con 0x564b59897c00 session 0x564b5d73b860
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b582e0c00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b582e1400
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b598fe800
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b59fde000
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 146 ms_handle_reset con 0x564b598fe800 session 0x564b5dd36780
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 118128640 unmapped: 42975232 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 146 ms_handle_reset con 0x564b59fde000 session 0x564b5de921e0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 146 ms_handle_reset con 0x564b582e0c00 session 0x564b5d73ab40
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:58:37.949613+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 146 heartbeat osd_stat(store_statfs(0x1b5a8d000/0x0/0x1bfc00000, data 0x5703704/0x5801000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 146 handle_osd_map epochs [147,147], i have 146, src has [1,147]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 146 handle_osd_map epochs [147,147], i have 147, src has [1,147]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.292324066s of 10.151702881s, submitted: 196
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b5b349400
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 147 handle_osd_map epochs [147,147], i have 147, src has [1,147]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 147 ms_handle_reset con 0x564b5b349400 session 0x564b5de92960
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b582e1800
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 117571584 unmapped: 43532288 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 147 ms_handle_reset con 0x564b582e1400 session 0x564b5bc423c0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b582e1400
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 147 ms_handle_reset con 0x564b582e1400 session 0x564b57c810e0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 147 handle_osd_map epochs [147,147], i have 147, src has [1,147]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 147 ms_handle_reset con 0x564b582e1800 session 0x564b5de92d20
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 147 ms_handle_reset con 0x564b59897000 session 0x564b59ff6f00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:58:38.949742+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b582e0c00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 147 ms_handle_reset con 0x564b582e0c00 session 0x564b5b6e05a0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b598fe800
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 117596160 unmapped: 43507712 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b59fde000
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 147 ms_handle_reset con 0x564b598fe800 session 0x564b595e8d20
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:58:39.949896+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1745914 data_alloc: 184549376 data_used: 3461120
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 147 handle_osd_map epochs [148,148], i have 147, src has [1,148]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 148 handle_osd_map epochs [148,148], i have 148, src has [1,148]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 148 ms_handle_reset con 0x564b59fde000 session 0x564b5de930e0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b598fe800
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 148 handle_osd_map epochs [148,148], i have 148, src has [1,148]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 117628928 unmapped: 43474944 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b582e0c00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 148 ms_handle_reset con 0x564b582e0c00 session 0x564b5de934a0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 148 ms_handle_reset con 0x564b598fe800 session 0x564b58492d20
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:58:40.950029+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 117653504 unmapped: 43450368 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b582e1400
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b582e1800
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 148 ms_handle_reset con 0x564b582e1800 session 0x564b5de93860
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b59897000
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:58:41.950164+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 148 ms_handle_reset con 0x564b59897000 session 0x564b5de93e00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 148 handle_osd_map epochs [148,149], i have 148, src has [1,149]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.1d] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.19] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.1d] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.1f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.15] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.7] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.1] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.17] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.5] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.19] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.1f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.9] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.10] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.17] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.9] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.15] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.7] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.1] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.5] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.10] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 149 handle_osd_map epochs [149,149], i have 149, src has [1,149]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 117243904 unmapped: 43859968 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 149 ms_handle_reset con 0x564b582e1400 session 0x564b59590960
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:58:42.950297+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 117243904 unmapped: 43859968 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b582e0c00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:58:43.950418+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 149 heartbeat osd_stat(store_statfs(0x1b6bc2000/0x0/0x1bfc00000, data 0x45cfb66/0x46cb000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _renew_subs
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _send_mon_message to mon.np0005626463 at v2:172.18.0.103:3300/0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 149 handle_osd_map epochs [150,150], i have 149, src has [1,150]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 150 handle_osd_map epochs [150,150], i have 150, src has [1,150]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 150 ms_handle_reset con 0x564b582e0c00 session 0x564b595e9e00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _send_mon_message to mon.np0005626463 at v2:172.18.0.103:3300/0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b582e1800
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 118317056 unmapped: 42786816 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 150 handle_osd_map epochs [150,150], i have 150, src has [1,150]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 150 ms_handle_reset con 0x564b582e1800 session 0x564b5e30ad20
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:58:44.950542+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b59897000
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1466590 data_alloc: 184549376 data_used: 3477504
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 150 ms_handle_reset con 0x564b59897000 session 0x564b5b06c780
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b598fe800
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 150 ms_handle_reset con 0x564b598fe800 session 0x564b5de93680
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 118333440 unmapped: 42770432 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:58:45.950678+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b598fe800
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 150 ms_handle_reset con 0x564b598fe800 session 0x564b5833b4a0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 118333440 unmapped: 42770432 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:58:46.950852+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _renew_subs
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _send_mon_message to mon.np0005626463 at v2:172.18.0.103:3300/0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 150 handle_osd_map epochs [151,151], i have 150, src has [1,151]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 118341632 unmapped: 42762240 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b582e0c00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 151 ms_handle_reset con 0x564b582e0c00 session 0x564b5833be00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b582e1400
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 151 handle_osd_map epochs [151,151], i have 151, src has [1,151]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 151 handle_osd_map epochs [151,151], i have 151, src has [1,151]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 151 ms_handle_reset con 0x564b582e1400 session 0x564b59f8d860
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:58:47.951005+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 118341632 unmapped: 42762240 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:58:48.951142+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b582e1800
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.924703598s of 10.916292191s, submitted: 288
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 151 ms_handle_reset con 0x564b582e1800 session 0x564b59ff70e0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 118349824 unmapped: 42754048 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:58:49.951275+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 151 heartbeat osd_stat(store_statfs(0x1b7783000/0x0/0x1bfc00000, data 0x3a0638e/0x3b0b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1487848 data_alloc: 184549376 data_used: 3489792
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b59897000
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 151 ms_handle_reset con 0x564b59897000 session 0x564b58344d20
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b582e0c00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b582e1400
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 151 ms_handle_reset con 0x564b582e1400 session 0x564b5bc98000
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b582e1800
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b598fe800
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 118366208 unmapped: 42737664 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 151 ms_handle_reset con 0x564b598fe800 session 0x564b57a8c000
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b59fde000
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 151 ms_handle_reset con 0x564b582e0c00 session 0x564b5833be00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 151 ms_handle_reset con 0x564b582e1800 session 0x564b5bc981e0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 151 ms_handle_reset con 0x564b59fde000 session 0x564b5d73a960
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b582e0c00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 151 ms_handle_reset con 0x564b582e0c00 session 0x564b5de92960
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:58:50.951415+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 118366208 unmapped: 42737664 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b582e1400
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b582e1800
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 151 ms_handle_reset con 0x564b582e1400 session 0x564b5de92d20
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 151 ms_handle_reset con 0x564b582e1800 session 0x564b5bc98780
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b598fe800
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:58:51.969440+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b5b349400
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 151 ms_handle_reset con 0x564b598fe800 session 0x564b5de934a0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 151 handle_osd_map epochs [152,152], i have 151, src has [1,152]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _renew_subs
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _send_mon_message to mon.np0005626463 at v2:172.18.0.103:3300/0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 151 handle_osd_map epochs [152,152], i have 152, src has [1,152]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.19] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.1d] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.9] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.10] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.17] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.5] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.1f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.15] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.7] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.1] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 152 ms_handle_reset con 0x564b5b349400 session 0x564b5bc98d20
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b582e0c00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 118464512 unmapped: 42639360 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 152 ms_handle_reset con 0x564b582e0c00 session 0x564b5b6e05a0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b582e1400
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b582e1800
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 152 ms_handle_reset con 0x564b582e1400 session 0x564b5bc98f00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 152 ms_handle_reset con 0x564b582e1800 session 0x564b57c810e0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b598fe800
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b59ff5400
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:58:52.969545+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 152 ms_handle_reset con 0x564b598fe800 session 0x564b5bc423c0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 152 ms_handle_reset con 0x564b59ff5400 session 0x564b5bc990e0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 119283712 unmapped: 41820160 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b582e0c00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 152 ms_handle_reset con 0x564b582e0c00 session 0x564b5d73b860
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b582e1400
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:58:53.969697+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 119283712 unmapped: 41820160 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 152 handle_osd_map epochs [152,153], i have 152, src has [1,153]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 153 handle_osd_map epochs [153,153], i have 153, src has [1,153]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 153 heartbeat osd_stat(store_statfs(0x1b741a000/0x0/0x1bfc00000, data 0x3d745e7/0x3e74000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 153 ms_handle_reset con 0x564b582e1400 session 0x564b5d73ab40
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:58:54.969889+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1519926 data_alloc: 184549376 data_used: 3514368
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 119283712 unmapped: 41820160 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b582e1800
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 153 ms_handle_reset con 0x564b582e1800 session 0x564b5dd36000
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b598fe800
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:58:55.970052+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 153 ms_handle_reset con 0x564b598fe800 session 0x564b598b94a0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 153 heartbeat osd_stat(store_statfs(0x1b7415000/0x0/0x1bfc00000, data 0x3d769dd/0x3e78000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 119283712 unmapped: 41820160 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:58:56.970228+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 153 heartbeat osd_stat(store_statfs(0x1b7415000/0x0/0x1bfc00000, data 0x3d769dd/0x3e78000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 119283712 unmapped: 41820160 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:58:57.970387+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 119283712 unmapped: 41820160 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b57d3f400
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 153 ms_handle_reset con 0x564b57d3f400 session 0x564b57c7b680
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:58:58.970558+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b57d3f400
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 153 ms_handle_reset con 0x564b57d3f400 session 0x564b5b6e12c0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 119283712 unmapped: 41820160 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b582e0c00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.927986145s of 10.787474632s, submitted: 231
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 153 ms_handle_reset con 0x564b582e0c00 session 0x564b5b1b4780
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b582e1400
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 153 ms_handle_reset con 0x564b582e1400 session 0x564b5b06c960
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:58:59.970666+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b582e1800
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 153 ms_handle_reset con 0x564b582e1800 session 0x564b5de925a0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1521115 data_alloc: 184549376 data_used: 3514368
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 119291904 unmapped: 41811968 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:59:00.970826+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 119291904 unmapped: 41811968 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b598fe800
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 153 ms_handle_reset con 0x564b598fe800 session 0x564b5de93c20
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b598fe800
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 153 ms_handle_reset con 0x564b598fe800 session 0x564b5b6e01e0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:59:01.970988+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b57d3f400
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b582e0c00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 153 handle_osd_map epochs [154,154], i have 153, src has [1,154]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 154 ms_handle_reset con 0x564b582e0c00 session 0x564b5bc99680
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 154 heartbeat osd_stat(store_statfs(0x1b7414000/0x0/0x1bfc00000, data 0x3d76a3f/0x3e79000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 119341056 unmapped: 41762816 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:59:02.971148+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b582e1400
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 119709696 unmapped: 41394176 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:59:03.971582+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 154 handle_osd_map epochs [155,155], i have 154, src has [1,155]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 154 handle_osd_map epochs [155,155], i have 155, src has [1,155]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 155 ms_handle_reset con 0x564b582e1400 session 0x564b5bc99860
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 119717888 unmapped: 41385984 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b582e1800
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 155 ms_handle_reset con 0x564b582e1800 session 0x564b5bc99c20
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:59:04.972200+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1553322 data_alloc: 184549376 data_used: 6696960
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b58482c00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 155 handle_osd_map epochs [155,155], i have 155, src has [1,155]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b58371000
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 155 ms_handle_reset con 0x564b58482c00 session 0x564b5bc99e00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 119734272 unmapped: 41369600 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:59:05.974142+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 155 heartbeat osd_stat(store_statfs(0x1b7409000/0x0/0x1bfc00000, data 0x3d7b13d/0x3e84000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _renew_subs
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _send_mon_message to mon.np0005626463 at v2:172.18.0.103:3300/0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 155 handle_osd_map epochs [156,156], i have 155, src has [1,156]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 156 handle_osd_map epochs [156,156], i have 156, src has [1,156]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 156 ms_handle_reset con 0x564b58371000 session 0x564b5bc98780
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b582e0c00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 119742464 unmapped: 41361408 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b582e1400
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:59:06.974298+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 156 ms_handle_reset con 0x564b582e1400 session 0x564b5bc98f00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 156 handle_osd_map epochs [157,157], i have 156, src has [1,157]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 157 ms_handle_reset con 0x564b582e0c00 session 0x564b59ff70e0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 157 handle_osd_map epochs [157,157], i have 157, src has [1,157]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 119767040 unmapped: 41336832 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b582e1800
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 157 ms_handle_reset con 0x564b582e1800 session 0x564b59ff7e00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b598fe800
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 157 handle_osd_map epochs [157,157], i have 157, src has [1,157]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:59:07.974447+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 157 handle_osd_map epochs [158,158], i have 157, src has [1,158]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 158 handle_osd_map epochs [158,158], i have 158, src has [1,158]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 158 ms_handle_reset con 0x564b598fe800 session 0x564b59ff7860
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 119939072 unmapped: 41164800 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 158 handle_osd_map epochs [158,158], i have 158, src has [1,158]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b582e0c00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 158 ms_handle_reset con 0x564b582e0c00 session 0x564b5d7bc000
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b582e1400
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b582e1800
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b58371000
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 158 ms_handle_reset con 0x564b582e1400 session 0x564b5d7bc5a0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b5a304c00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:59:08.974614+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b5b152400
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 158 ms_handle_reset con 0x564b58371000 session 0x564b5bc992c0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 158 handle_osd_map epochs [159,159], i have 158, src has [1,159]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1e] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 159 handle_osd_map epochs [159,159], i have 159, src has [1,159]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.16] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 119947264 unmapped: 41156608 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 159 ms_handle_reset con 0x564b5a304c00 session 0x564b5d7bc780
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 159 handle_osd_map epochs [159,159], i have 159, src has [1,159]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.0] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.17] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:59:09.974773+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1907605 data_alloc: 184549376 data_used: 6713344
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.b] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.535307884s of 10.371746063s, submitted: 176
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b588dbc00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 159 ms_handle_reset con 0x564b588dbc00 session 0x564b5dd36780
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b582e0c00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 128409600 unmapped: 32694272 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 159 ms_handle_reset con 0x564b582e1800 session 0x564b5de92960
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b582e1400
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 159 ms_handle_reset con 0x564b582e1400 session 0x564b5de934a0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:59:10.975012+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 159 ms_handle_reset con 0x564b582e0c00 session 0x564b5dd36f00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 159 heartbeat osd_stat(store_statfs(0x1b43f6000/0x0/0x1bfc00000, data 0x6d8421b/0x6e97000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b58371000
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 128671744 unmapped: 32432128 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 159 handle_osd_map epochs [160,160], i have 159, src has [1,160]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:59:11.975159+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b5a304c00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 160 ms_handle_reset con 0x564b58371000 session 0x564b5de925a0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 128884736 unmapped: 32219136 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:59:12.975406+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b5b22fc00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 160 ms_handle_reset con 0x564b5a304c00 session 0x564b5dd370e0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 160 ms_handle_reset con 0x564b5b22fc00 session 0x564b5d73a960
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b582e0c00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b582e1400
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b582e1800
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 121962496 unmapped: 39141376 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 160 ms_handle_reset con 0x564b582e0c00 session 0x564b5d73ab40
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b58371000
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:59:13.975531+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 160 handle_osd_map epochs [161,161], i have 160, src has [1,161]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 161 handle_osd_map epochs [161,161], i have 161, src has [1,161]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1e] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 161 handle_osd_map epochs [161,161], i have 161, src has [1,161]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.16] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.0] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.17] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 161 ms_handle_reset con 0x564b58371000 session 0x564b5833be00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.b] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b5a304c00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 161 ms_handle_reset con 0x564b582e1800 session 0x564b5d7bcd20
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 161 ms_handle_reset con 0x564b5a304c00 session 0x564b5bc42f00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 122912768 unmapped: 38191104 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 161 ms_handle_reset con 0x564b582e1400 session 0x564b5bc98d20
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:59:14.975711+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2673296 data_alloc: 184549376 data_used: 6803456
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 124133376 unmapped: 36970496 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b582e0c00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:59:15.975924+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b582e1800
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b58371000
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 161 heartbeat osd_stat(store_statfs(0x1acaa5000/0x0/0x1bfc00000, data 0xe6c8bcc/0xe7e1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 124346368 unmapped: 36757504 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:59:16.976109+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _renew_subs
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _send_mon_message to mon.np0005626463 at v2:172.18.0.103:3300/0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 161 handle_osd_map epochs [162,162], i have 161, src has [1,162]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 162 ms_handle_reset con 0x564b582e1800 session 0x564b5dd37a40
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 162 ms_handle_reset con 0x564b582e0c00 session 0x564b59f8de00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b5b22fc00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _renew_subs
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _send_mon_message to mon.np0005626463 at v2:172.18.0.103:3300/0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 162 handle_osd_map epochs [163,163], i have 162, src has [1,163]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.0] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.16] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1e] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.17] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.b] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 163 handle_osd_map epochs [163,163], i have 163, src has [1,163]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b57d3ec00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 163 handle_osd_map epochs [162,163], i have 163, src has [1,163]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 163 ms_handle_reset con 0x564b58371000 session 0x564b5d7bd2c0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 163 ms_handle_reset con 0x564b5b22fc00 session 0x564b583421e0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 163 ms_handle_reset con 0x564b57d3ec00 session 0x564b5c1a6000
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b582e0c00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 124526592 unmapped: 36577280 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 163 ms_handle_reset con 0x564b582e0c00 session 0x564b5e30ad20
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:59:17.976252+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 163 heartbeat osd_stat(store_statfs(0x1ab8fa000/0x0/0x1bfc00000, data 0xe6cd2a2/0xe7ea000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5b0f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b582e1400
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 163 ms_handle_reset con 0x564b582e1400 session 0x564b5de92f00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b582e1800
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b58371000
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 163 ms_handle_reset con 0x564b58371000 session 0x564b5c1a61e0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b5a305400
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 124084224 unmapped: 37019648 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 163 ms_handle_reset con 0x564b5a305400 session 0x564b5c1a63c0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b57d3ec00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 163 ms_handle_reset con 0x564b582e1800 session 0x564b5aee25a0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 163 ms_handle_reset con 0x564b57d3ec00 session 0x564b5c1a6960
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:59:18.976395+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b582e0c00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 163 ms_handle_reset con 0x564b582e0c00 session 0x564b5c449e00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b582e1400
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 126386176 unmapped: 34717696 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 163 ms_handle_reset con 0x564b582e1400 session 0x564b57a901e0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:59:19.976536+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 3193916 data_alloc: 184549376 data_used: 6819840
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 7.588166714s of 10.029601097s, submitted: 389
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 126656512 unmapped: 34447360 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b58371000
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 163 ms_handle_reset con 0x564b58371000 session 0x564b5b06c960
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:59:20.976721+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 126656512 unmapped: 34447360 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b57d3ec00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:59:21.976914+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 163 heartbeat osd_stat(store_statfs(0x1a78e2000/0x0/0x1bfc00000, data 0x126ef2a9/0x1280c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5b0f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 163 handle_osd_map epochs [164,164], i have 163, src has [1,164]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 163 handle_osd_map epochs [164,164], i have 164, src has [1,164]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.0] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.16] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.17] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1e] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.b] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 164 handle_osd_map epochs [164,164], i have 164, src has [1,164]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 126820352 unmapped: 34283520 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b582e0c00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:59:22.977043+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 164 ms_handle_reset con 0x564b582e0c00 session 0x564b57a905a0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 164 ms_handle_reset con 0x564b57d3ec00 session 0x564b5833e000
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b582e1400
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 135282688 unmapped: 25821184 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 164 heartbeat osd_stat(store_statfs(0x1a58da000/0x0/0x1bfc00000, data 0x146f1797/0x14813000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5b0f9b7), peers [0,1,2,3,4] op hist [0,0,0,0,0,0,0,0,1])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:59:23.977173+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b582e1800
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b588da800
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 164 ms_handle_reset con 0x564b582e1800 session 0x564b5c1a72c0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b5a305000
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 164 ms_handle_reset con 0x564b588da800 session 0x564b5dd37680
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b59ff9800
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 127082496 unmapped: 34021376 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:59:24.977277+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 3734882 data_alloc: 184549376 data_used: 6836224
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 164 handle_osd_map epochs [165,165], i have 164, src has [1,165]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 165 ms_handle_reset con 0x564b5a305000 session 0x564b5c1a7680
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 165 handle_osd_map epochs [165,165], i have 165, src has [1,165]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 165 handle_osd_map epochs [165,165], i have 165, src has [1,165]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 165 ms_handle_reset con 0x564b59ff9800 session 0x564b5dd363c0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b57d3ec00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 127246336 unmapped: 33857536 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b582e0c00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 165 ms_handle_reset con 0x564b582e0c00 session 0x564b5aee23c0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b582e1800
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:59:25.977380+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b588da800
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 165 ms_handle_reset con 0x564b588da800 session 0x564b5c4492c0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 165 handle_osd_map epochs [166,166], i have 165, src has [1,166]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _renew_subs
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _send_mon_message to mon.np0005626463 at v2:172.18.0.103:3300/0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 165 handle_osd_map epochs [166,166], i have 166, src has [1,166]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b5a2ce400
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 166 ms_handle_reset con 0x564b582e1800 session 0x564b5b1b5860
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b5b22f400
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 127549440 unmapped: 33554432 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 166 ms_handle_reset con 0x564b5a2ce400 session 0x564b5bc98960
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 166 ms_handle_reset con 0x564b5b22f400 session 0x564b5de921e0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:59:26.977514+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 166 heartbeat osd_stat(store_statfs(0x1a0cd2000/0x0/0x1bfc00000, data 0x18efae27/0x1901c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5f0f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 166 handle_osd_map epochs [167,167], i have 166, src has [1,167]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 136003584 unmapped: 25100288 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 167 ms_handle_reset con 0x564b57d3ec00 session 0x564b5c1a7a40
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 167 handle_osd_map epochs [167,167], i have 167, src has [1,167]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:59:27.977680+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b582e0c00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 167 ms_handle_reset con 0x564b582e0c00 session 0x564b5b1b54a0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b582e1800
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 127778816 unmapped: 33325056 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:59:28.977834+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 127787008 unmapped: 33316864 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b588da800
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:59:29.977956+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 167 ms_handle_reset con 0x564b582e1800 session 0x564b5de925a0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 7.476424694s of 10.002325058s, submitted: 227
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 4127773 data_alloc: 184549376 data_used: 6856704
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 127787008 unmapped: 33316864 heap: 161103872 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 167 heartbeat osd_stat(store_statfs(0x19f4cc000/0x0/0x1bfc00000, data 0x1a6fd6e4/0x1a820000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5f0f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:59:30.978126+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _renew_subs
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _send_mon_message to mon.np0005626463 at v2:172.18.0.103:3300/0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 167 handle_osd_map epochs [168,168], i have 167, src has [1,168]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 168 handle_osd_map epochs [168,168], i have 168, src has [1,168]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b59ff9800
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 128983040 unmapped: 40517632 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 168 ms_handle_reset con 0x564b59ff9800 session 0x564b5d73a960
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:59:31.978274+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _renew_subs
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _send_mon_message to mon.np0005626463 at v2:172.18.0.103:3300/0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 168 handle_osd_map epochs [169,169], i have 168, src has [1,169]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 169 handle_osd_map epochs [169,169], i have 169, src has [1,169]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b57d3ec00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 169 handle_osd_map epochs [168,169], i have 169, src has [1,169]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 169 ms_handle_reset con 0x564b57d3ec00 session 0x564b5b6e01e0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 169 ms_handle_reset con 0x564b588da800 session 0x564b5de92000
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 129187840 unmapped: 40312832 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:59:32.978397+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b582e0c00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 169 ms_handle_reset con 0x564b582e0c00 session 0x564b5b6e0780
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 129335296 unmapped: 40165376 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b582e1800
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:59:33.978578+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 169 handle_osd_map epochs [170,170], i have 169, src has [1,170]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.15] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.1] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.5] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.17] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.7] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.1f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.9] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.10] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.1d] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.19] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 170 handle_osd_map epochs [170,170], i have 170, src has [1,170]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 170 handle_osd_map epochs [170,170], i have 170, src has [1,170]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 170 handle_osd_map epochs [170,170], i have 170, src has [1,170]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 129490944 unmapped: 40009728 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 170 ms_handle_reset con 0x564b582e1800 session 0x564b5d7bc000
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:59:34.978716+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 4635084 data_alloc: 184549376 data_used: 6893568
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: mgrc handle_mgr_map Got map version 46
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.107:6810/2356945423,v1:172.18.0.107:6811/2356945423]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 129835008 unmapped: 39665664 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:59:35.979141+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 170 heartbeat osd_stat(store_statfs(0x19b4be000/0x0/0x1bfc00000, data 0x1e707272/0x1e82f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5f0f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b5b22f400
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b595a0400
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 170 ms_handle_reset con 0x564b595a0400 session 0x564b5dd37c20
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 129998848 unmapped: 39501824 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:59:36.979370+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 170 handle_osd_map epochs [170,171], i have 170, src has [1,171]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 171 handle_osd_map epochs [171,171], i have 171, src has [1,171]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b57d3ec00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 171 ms_handle_reset con 0x564b57d3ec00 session 0x564b5b6e12c0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 138551296 unmapped: 30949376 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b582e0c00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:59:37.979557+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _renew_subs
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _send_mon_message to mon.np0005626463 at v2:172.18.0.103:3300/0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 171 handle_osd_map epochs [172,172], i have 171, src has [1,172]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b582e1800
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 172 ms_handle_reset con 0x564b582e1800 session 0x564b5de92b40
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 172 ms_handle_reset con 0x564b582e0c00 session 0x564b5de92960
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 138772480 unmapped: 30728192 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 172 ms_handle_reset con 0x564b5b22f400 session 0x564b5d7bc780
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:59:38.979692+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b588da800
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 172 ms_handle_reset con 0x564b588da800 session 0x564b5c4492c0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b57d3ec00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 130392064 unmapped: 39108608 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:59:39.979846+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 5136328 data_alloc: 184549376 data_used: 6905856
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.359140396s of 10.353400230s, submitted: 134
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 172 handle_osd_map epochs [173,173], i have 172, src has [1,173]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 139862016 unmapped: 29638656 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 173 ms_handle_reset con 0x564b57d3ec00 session 0x564b5b1b5860
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b582e0c00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:59:40.979988+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b582e1800
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 173 ms_handle_reset con 0x564b582e1800 session 0x564b5d73b0e0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b5b22f400
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b5a304000
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 173 handle_osd_map epochs [173,173], i have 173, src has [1,173]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 173 ms_handle_reset con 0x564b5a304000 session 0x564b5c449a40
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 173 handle_osd_map epochs [174,174], i have 173, src has [1,174]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: mgrc handle_mgr_map Got map version 47
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.107:6810/2356945423,v1:172.18.0.107:6811/2356945423]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 140378112 unmapped: 29122560 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.5] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.9] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.10] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.17] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.1f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.15] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.7] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.1] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.1d] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.19] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 174 handle_osd_map epochs [174,174], i have 174, src has [1,174]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 174 ms_handle_reset con 0x564b582e0c00 session 0x564b5b1b54a0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:59:41.980141+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 174 heartbeat osd_stat(store_statfs(0x196492000/0x0/0x1bfc00000, data 0x2470df52/0x2483b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x4f2f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 174 ms_handle_reset con 0x564b5b22f400 session 0x564b59f8c3c0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 174 handle_osd_map epochs [175,175], i have 174, src has [1,175]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.5] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.1] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.17] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.7] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.10] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.15] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.1f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.9] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.1d] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.19] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 132128768 unmapped: 37371904 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:59:42.980338+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 140525568 unmapped: 28975104 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b57d3ec00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:59:43.980493+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b582e0c00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 175 ms_handle_reset con 0x564b582e0c00 session 0x564b5d73a5a0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b582e1800
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 175 ms_handle_reset con 0x564b582e1800 session 0x564b5d7bc1e0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 132382720 unmapped: 37117952 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 175 handle_osd_map epochs [176,176], i have 175, src has [1,176]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:59:44.980639+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.19] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.1] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.5] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.1d] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.7] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.17] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.9] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.15] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.10] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.1f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 176 handle_osd_map epochs [176,176], i have 176, src has [1,176]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 176 handle_osd_map epochs [176,176], i have 176, src has [1,176]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b5a304000
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 176 handle_osd_map epochs [176,176], i have 176, src has [1,176]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b595a1000
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 176 ms_handle_reset con 0x564b5a304000 session 0x564b5bc43a40
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 5700270 data_alloc: 184549376 data_used: 6934528
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 140959744 unmapped: 28540928 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:59:45.980792+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 176 handle_osd_map epochs [177,177], i have 176, src has [1,177]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 177 ms_handle_reset con 0x564b595a1000 session 0x564b5c448b40
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b58483000
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 177 handle_osd_map epochs [177,177], i have 177, src has [1,177]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 177 ms_handle_reset con 0x564b58483000 session 0x564b59ff65a0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b582e0c00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 177 handle_osd_map epochs [177,177], i have 177, src has [1,177]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 177 ms_handle_reset con 0x564b57d3ec00 session 0x564b5c1a61e0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 177 handle_osd_map epochs [177,177], i have 177, src has [1,177]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 177 ms_handle_reset con 0x564b582e1400 session 0x564b59591860
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 132718592 unmapped: 36782080 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 177 ms_handle_reset con 0x564b582e0c00 session 0x564b5aee3680
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:59:46.981007+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 177 handle_osd_map epochs [178,178], i have 177, src has [1,178]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 178 ms_handle_reset con 0x564b57d3f400 session 0x564b5b6e0b40
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 132784128 unmapped: 36716544 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 178 heartbeat osd_stat(store_statfs(0x190c82000/0x0/0x1bfc00000, data 0x29f16a42/0x2a04a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x4f2f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:59:47.981126+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b582e1800
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b595a1000
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 178 ms_handle_reset con 0x564b595a1000 session 0x564b57a8b0e0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 130809856 unmapped: 38690816 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 178 ms_handle_reset con 0x564b582e1800 session 0x564b5b6e1860
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:59:48.981260+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 178 handle_osd_map epochs [179,179], i have 178, src has [1,179]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 179 ms_handle_reset con 0x564b5b152400 session 0x564b5d7bc960
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 130818048 unmapped: 38682624 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:59:49.981400+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b595a1000
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 179 handle_osd_map epochs [179,179], i have 179, src has [1,179]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b57d3ec00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 179 ms_handle_reset con 0x564b57d3ec00 session 0x564b59f134a0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b57d3f400
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 5831764 data_alloc: 184549376 data_used: 3686400
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 179 ms_handle_reset con 0x564b57d3f400 session 0x564b5a30d4a0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b582e0c00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b582e1400
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 179 ms_handle_reset con 0x564b582e0c00 session 0x564b57c83e00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 130850816 unmapped: 38649856 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.127758026s of 10.215961456s, submitted: 203
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:59:50.981540+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 179 handle_osd_map epochs [180,180], i have 179, src has [1,180]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 180 handle_osd_map epochs [180,180], i have 180, src has [1,180]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1e] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 180 ms_handle_reset con 0x564b582e1400 session 0x564b5b06cb40
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.16] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b57d3ec00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b57d3f400
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 180 ms_handle_reset con 0x564b57d3f400 session 0x564b59a79a40
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.0] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.17] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 180 handle_osd_map epochs [180,180], i have 180, src has [1,180]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.b] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 180 ms_handle_reset con 0x564b595a1000 session 0x564b5b06c1e0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b582e0c00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 129712128 unmapped: 39788544 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 180 ms_handle_reset con 0x564b582e0c00 session 0x564b5bc99860
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 180 ms_handle_reset con 0x564b57d3ec00 session 0x564b5bc99c20
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:59:51.981678+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 180 heartbeat osd_stat(store_statfs(0x190ef1000/0x0/0x1bfc00000, data 0x29ca4170/0x29ddd000, compress 0x0/0x0/0x0, omap 0x649, meta 0x4f2f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b57d3ec00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 180 ms_handle_reset con 0x564b57d3ec00 session 0x564b57c810e0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b57d3f400
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _renew_subs
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _send_mon_message to mon.np0005626463 at v2:172.18.0.103:3300/0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 180 handle_osd_map epochs [181,181], i have 180, src has [1,181]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.1] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.7] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.1f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.15] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.17] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.5] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b582e0c00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.9] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.10] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.1d] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.19] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 181 handle_osd_map epochs [181,181], i have 181, src has [1,181]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 181 handle_osd_map epochs [181,181], i have 181, src has [1,181]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 181 ms_handle_reset con 0x564b57d3f400 session 0x564b595914a0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b582e1400
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b595a1000
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 129425408 unmapped: 40075264 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 181 ms_handle_reset con 0x564b582e1400 session 0x564b5de93c20
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:59:52.981819+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 181 heartbeat osd_stat(store_statfs(0x1b4eeb000/0x0/0x1bfc00000, data 0x4ca6704/0x4de1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x4f2f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _renew_subs
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _send_mon_message to mon.np0005626463 at v2:172.18.0.103:3300/0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 181 handle_osd_map epochs [182,182], i have 181, src has [1,182]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 182 handle_osd_map epochs [182,182], i have 182, src has [1,182]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 182 ms_handle_reset con 0x564b595a1000 session 0x564b59670b40
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b582e1800
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 182 ms_handle_reset con 0x564b582e1800 session 0x564b5de923c0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 129499136 unmapped: 40001536 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b582e1800
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:59:53.981927+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 182 handle_osd_map epochs [182,183], i have 182, src has [1,183]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.1f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.1d] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.9] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.17] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.10] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 183 handle_osd_map epochs [183,183], i have 183, src has [1,183]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.15] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.19] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.7] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.1] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.5] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 183 ms_handle_reset con 0x564b582e1800 session 0x564b59f13680
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 129507328 unmapped: 39993344 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:59:54.982055+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _renew_subs
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _send_mon_message to mon.np0005626463 at v2:172.18.0.103:3300/0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 183 handle_osd_map epochs [184,184], i have 183, src has [1,184]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 184 handle_osd_map epochs [184,184], i have 184, src has [1,184]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1886828 data_alloc: 184549376 data_used: 3698688
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 184 ms_handle_reset con 0x564b582e0c00 session 0x564b5d7bc5a0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b57d3ec00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b57d3f400
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 184 ms_handle_reset con 0x564b57d3f400 session 0x564b5e30a5a0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b582e1400
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 184 ms_handle_reset con 0x564b57d3ec00 session 0x564b57c814a0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 129597440 unmapped: 39903232 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:59:55.982192+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b595a1000
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 184 ms_handle_reset con 0x564b582e1400 session 0x564b5e30ab40
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b57d3ec00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 184 ms_handle_reset con 0x564b57d3ec00 session 0x564b5d73bc20
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b57d3f400
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 184 ms_handle_reset con 0x564b57d3f400 session 0x564b5774f4a0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _renew_subs
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _send_mon_message to mon.np0005626463 at v2:172.18.0.103:3300/0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 184 handle_osd_map epochs [185,185], i have 184, src has [1,185]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 185 ms_handle_reset con 0x564b595a1000 session 0x564b57c82d20
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b582e0c00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 129662976 unmapped: 39837696 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 185 handle_osd_map epochs [185,185], i have 185, src has [1,185]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 185 handle_osd_map epochs [185,185], i have 185, src has [1,185]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:59:56.982348+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 185 ms_handle_reset con 0x564b582e0c00 session 0x564b5b06da40
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 185 heartbeat osd_stat(store_statfs(0x1b5ede000/0x0/0x1bfc00000, data 0x4cb1c6d/0x4df0000, compress 0x0/0x0/0x0, omap 0x649, meta 0x4f2f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 185 heartbeat osd_stat(store_statfs(0x1b5ede000/0x0/0x1bfc00000, data 0x4cb1c6d/0x4df0000, compress 0x0/0x0/0x0, omap 0x649, meta 0x4f2f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 129695744 unmapped: 39804928 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:59:57.982512+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b582e1800
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 185 ms_handle_reset con 0x564b582e1800 session 0x564b5b6e0f00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 129695744 unmapped: 39804928 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:59:58.982659+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 129695744 unmapped: 39804928 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:59:59.982780+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1889739 data_alloc: 184549376 data_used: 3694592
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b582e1800
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b57d3ec00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 129695744 unmapped: 39804928 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 8.596391678s of 10.004580498s, submitted: 394
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 185 ms_handle_reset con 0x564b57d3ec00 session 0x564b5a30d4a0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:00:00.983954+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _renew_subs
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _send_mon_message to mon.np0005626463 at v2:172.18.0.103:3300/0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 185 handle_osd_map epochs [186,186], i have 185, src has [1,186]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b57d3f400
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b582e0c00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 130752512 unmapped: 38748160 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 186 ms_handle_reset con 0x564b57d3f400 session 0x564b5aee3860
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 186 handle_osd_map epochs [186,186], i have 186, src has [1,186]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:00:01.984085+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b595a1000
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 186 ms_handle_reset con 0x564b595a1000 session 0x564b5a30cd20
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b5b152400
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _renew_subs
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _send_mon_message to mon.np0005626463 at v2:172.18.0.103:3300/0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 186 handle_osd_map epochs [187,187], i have 186, src has [1,187]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.19] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.1d] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.0] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.1] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.7] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.b] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.17] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.16] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.17] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.15] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1e] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.1f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.5] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 187 handle_osd_map epochs [187,187], i have 187, src has [1,187]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.9] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.10] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 187 ms_handle_reset con 0x564b582e0c00 session 0x564b5c4490e0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 187 heartbeat osd_stat(store_statfs(0x1b5ed3000/0x0/0x1bfc00000, data 0x4cb4253/0x4dfa000, compress 0x0/0x0/0x0, omap 0x649, meta 0x4f2f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 187 ms_handle_reset con 0x564b582e1800 session 0x564b5d7bc3c0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 187 ms_handle_reset con 0x564b5b152400 session 0x564b5b06c3c0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 130834432 unmapped: 38666240 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:00:02.984219+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b582e1800
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b57d3ec00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 187 ms_handle_reset con 0x564b57d3ec00 session 0x564b5833eb40
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 187 handle_osd_map epochs [187,188], i have 187, src has [1,188]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.1f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.15] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.7] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.1] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.1d] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.19] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 188 handle_osd_map epochs [188,188], i have 188, src has [1,188]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.9] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.10] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.17] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.5] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 130859008 unmapped: 38641664 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:00:03.984358+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 188 ms_handle_reset con 0x564b582e1800 session 0x564b5d73af00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 188 heartbeat osd_stat(store_statfs(0x1b5ecd000/0x0/0x1bfc00000, data 0x4cb8eae/0x4e00000, compress 0x0/0x0/0x0, omap 0x649, meta 0x4f2f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 130826240 unmapped: 38674432 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:00:04.984510+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1904814 data_alloc: 184549376 data_used: 3719168
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b57d3f400
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 188 ms_handle_reset con 0x564b57d3f400 session 0x564b59f8d860
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b582e0c00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 130850816 unmapped: 38649856 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 188 ms_handle_reset con 0x564b582e0c00 session 0x564b5aee34a0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:00:05.984648+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b582e0c00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 188 ms_handle_reset con 0x564b582e0c00 session 0x564b5de930e0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b57d3ec00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 188 ms_handle_reset con 0x564b57d3ec00 session 0x564b5833ef00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 130932736 unmapped: 38567936 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:00:06.984806+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 188 heartbeat osd_stat(store_statfs(0x1b5ecf000/0x0/0x1bfc00000, data 0x4cb8935/0x4dff000, compress 0x0/0x0/0x0, omap 0x649, meta 0x4f2f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 130932736 unmapped: 38567936 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:00:07.984959+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 130932736 unmapped: 38567936 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:00:08.985112+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 130932736 unmapped: 38567936 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:00:09.985272+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b57d3f400
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b582e1800
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b5b152400
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 188 ms_handle_reset con 0x564b57d3f400 session 0x564b585b94a0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2004460 data_alloc: 184549376 data_used: 3719168
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b595a1000
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 188 ms_handle_reset con 0x564b5b152400 session 0x564b5833a960
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 188 ms_handle_reset con 0x564b582e1800 session 0x564b58345e00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 130957312 unmapped: 38543360 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:00:10.985409+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b57d3ec00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.758655548s of 10.441881180s, submitted: 182
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 188 handle_osd_map epochs [189,189], i have 188, src has [1,189]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 189 handle_osd_map epochs [189,189], i have 189, src has [1,189]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b57d3f400
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 189 handle_osd_map epochs [189,189], i have 189, src has [1,189]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 189 ms_handle_reset con 0x564b595a1000 session 0x564b5d7bcf00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 189 ms_handle_reset con 0x564b57d3ec00 session 0x564b583454a0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b582e0c00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 189 ms_handle_reset con 0x564b57d3f400 session 0x564b5774dc20
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b582e1800
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b5b152400
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 189 ms_handle_reset con 0x564b582e1800 session 0x564b596743c0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 130965504 unmapped: 38535168 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:00:11.985513+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 189 handle_osd_map epochs [189,189], i have 189, src has [1,189]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 189 handle_osd_map epochs [189,190], i have 189, src has [1,190]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.1d] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.19] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.9] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.10] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.1f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.15] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.7] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.1] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 190 handle_osd_map epochs [190,190], i have 190, src has [1,190]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.17] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.5] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 190 handle_osd_map epochs [190,190], i have 190, src has [1,190]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 190 ms_handle_reset con 0x564b582e0c00 session 0x564b5774cd20
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 190 ms_handle_reset con 0x564b5b152400 session 0x564b5833a1e0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 131031040 unmapped: 38469632 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 190 heartbeat osd_stat(store_statfs(0x1b52ab000/0x0/0x1bfc00000, data 0x58d7ee6/0x5a23000, compress 0x0/0x0/0x0, omap 0x649, meta 0x4f2f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:00:12.985635+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b57d3ec00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 190 ms_handle_reset con 0x564b57d3ec00 session 0x564b5c1a74a0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b57d3f400
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 190 heartbeat osd_stat(store_statfs(0x1b57c4000/0x0/0x1bfc00000, data 0x4cbd241/0x4e08000, compress 0x0/0x0/0x0, omap 0x649, meta 0x4f2f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b582e0c00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 190 ms_handle_reset con 0x564b57d3f400 session 0x564b598b9c20
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 131063808 unmapped: 38436864 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:00:13.985778+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b582e1800
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 190 handle_osd_map epochs [191,191], i have 190, src has [1,191]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 191 handle_osd_map epochs [191,191], i have 191, src has [1,191]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 191 ms_handle_reset con 0x564b582e1800 session 0x564b5bc983c0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b595a1000
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 191 ms_handle_reset con 0x564b595a1000 session 0x564b596750e0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b595a1000
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 191 ms_handle_reset con 0x564b582e0c00 session 0x564b5833e3c0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 131129344 unmapped: 38371328 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:00:14.985903+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 191 handle_osd_map epochs [191,191], i have 191, src has [1,191]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 191 ms_handle_reset con 0x564b595a1000 session 0x564b5774f2c0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1928998 data_alloc: 184549376 data_used: 3743744
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b57d3ec00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b57d3f400
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 191 ms_handle_reset con 0x564b57d3ec00 session 0x564b5c448780
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b582e1800
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b5b152400
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 131145728 unmapped: 38354944 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 191 ms_handle_reset con 0x564b5b152400 session 0x564b5e30a1e0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:00:15.986022+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 191 heartbeat osd_stat(store_statfs(0x1b5ec2000/0x0/0x1bfc00000, data 0x4cbf61b/0x4e0c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x4f2f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 191 handle_osd_map epochs [192,192], i have 191, src has [1,192]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 192 ms_handle_reset con 0x564b582e1800 session 0x564b58493c20
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 192 ms_handle_reset con 0x564b57d3f400 session 0x564b5b6e1680
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 131153920 unmapped: 38346752 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:00:16.986189+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b582e1800
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 131153920 unmapped: 38346752 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:00:17.986302+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _renew_subs
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _send_mon_message to mon.np0005626463 at v2:172.18.0.103:3300/0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 192 handle_osd_map epochs [193,193], i have 192, src has [1,193]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b57d3ec00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b582e0c00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 193 ms_handle_reset con 0x564b582e1800 session 0x564b57c83680
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 193 ms_handle_reset con 0x564b57d3ec00 session 0x564b57c83c20
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 131203072 unmapped: 38297600 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b595a1000
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:00:18.986425+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 193 ms_handle_reset con 0x564b595a1000 session 0x564b5bc43e00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b5b152400
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b5a304000
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 193 ms_handle_reset con 0x564b5a304000 session 0x564b5b6e0000
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 193 handle_osd_map epochs [194,194], i have 193, src has [1,194]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 193 handle_osd_map epochs [194,194], i have 194, src has [1,194]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 193 handle_osd_map epochs [194,194], i have 194, src has [1,194]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 194 ms_handle_reset con 0x564b5b152400 session 0x564b5bc42000
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 194 ms_handle_reset con 0x564b582e0c00 session 0x564b59ff74a0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 131219456 unmapped: 38281216 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:00:19.986566+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b57d3ec00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b57d3f400
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 194 ms_handle_reset con 0x564b57d3ec00 session 0x564b5b06d680
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b582e1800
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 194 handle_osd_map epochs [194,194], i have 194, src has [1,194]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 194 ms_handle_reset con 0x564b57d3f400 session 0x564b57c7a3c0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b595a1000
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1956371 data_alloc: 184549376 data_used: 3756032
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b59a85c00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 194 handle_osd_map epochs [195,195], i have 194, src has [1,195]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b598ff400
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 195 ms_handle_reset con 0x564b598ff400 session 0x564b57c825a0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 195 ms_handle_reset con 0x564b582e1800 session 0x564b5b6e0f00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b57d3ec00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 195 ms_handle_reset con 0x564b595a1000 session 0x564b59675e00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 131293184 unmapped: 38207488 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:00:20.986701+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 195 handle_osd_map epochs [195,195], i have 195, src has [1,195]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b57d3f400
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 195 ms_handle_reset con 0x564b57d3ec00 session 0x564b5b1b41e0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 195 ms_handle_reset con 0x564b57d3f400 session 0x564b5aee30e0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b582e0c00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 8.759701729s of 10.003255844s, submitted: 313
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b5b152400
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 195 handle_osd_map epochs [195,196], i have 195, src has [1,196]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 195 handle_osd_map epochs [196,196], i have 196, src has [1,196]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b598fb000
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 131383296 unmapped: 38117376 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 196 ms_handle_reset con 0x564b582e0c00 session 0x564b584930e0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:00:21.986906+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 196 ms_handle_reset con 0x564b598fb000 session 0x564b59aa61e0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b57d3ec00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 196 ms_handle_reset con 0x564b5b152400 session 0x564b5aee21e0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 196 ms_handle_reset con 0x564b59a85c00 session 0x564b57c82d20
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 196 handle_osd_map epochs [196,196], i have 196, src has [1,196]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 196 heartbeat osd_stat(store_statfs(0x1b5aab000/0x0/0x1bfc00000, data 0x4ccaa5c/0x4e22000, compress 0x0/0x0/0x0, omap 0x649, meta 0x532f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 196 ms_handle_reset con 0x564b57d3ec00 session 0x564b57c7ad20
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 196 heartbeat osd_stat(store_statfs(0x1b5aab000/0x0/0x1bfc00000, data 0x4ccaa5c/0x4e22000, compress 0x0/0x0/0x0, omap 0x649, meta 0x532f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 196 handle_osd_map epochs [197,197], i have 196, src has [1,197]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 196 handle_osd_map epochs [197,197], i have 197, src has [1,197]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 196 handle_osd_map epochs [197,197], i have 197, src has [1,197]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 196 handle_osd_map epochs [197,197], i have 197, src has [1,197]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 196 handle_osd_map epochs [197,197], i have 197, src has [1,197]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b57d3f400
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 197 ms_handle_reset con 0x564b57d3f400 session 0x564b57a8bc20
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b57d3f400
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 132489216 unmapped: 37011456 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:00:22.987028+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b57d3ec00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b598fb000
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 197 ms_handle_reset con 0x564b598fb000 session 0x564b59a790e0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b59a85c00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 197 ms_handle_reset con 0x564b59a85c00 session 0x564b57a8a780
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 197 handle_osd_map epochs [198,198], i have 197, src has [1,198]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 198 handle_osd_map epochs [198,198], i have 198, src has [1,198]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 132505600 unmapped: 36995072 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 198 ms_handle_reset con 0x564b57d3f400 session 0x564b5d73b2c0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 198 ms_handle_reset con 0x564b57d3ec00 session 0x564b595e9c20
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:00:23.987188+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b5b152400
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 132521984 unmapped: 36978688 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 198 ms_handle_reset con 0x564b5b152400 session 0x564b596752c0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b5b152400
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:00:24.987319+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b57d3ec00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 198 ms_handle_reset con 0x564b57d3ec00 session 0x564b5bc98960
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1961815 data_alloc: 184549376 data_used: 3772416
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b57d3f400
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 198 ms_handle_reset con 0x564b57d3f400 session 0x564b5bc98000
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b598fb000
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 132538368 unmapped: 36962304 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 198 handle_osd_map epochs [199,199], i have 198, src has [1,199]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 199 ms_handle_reset con 0x564b598fb000 session 0x564b5bc99a40
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 199 handle_osd_map epochs [199,199], i have 199, src has [1,199]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:00:25.987435+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 199 ms_handle_reset con 0x564b5b152400 session 0x564b5aee25a0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b59a85c00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 199 handle_osd_map epochs [199,199], i have 199, src has [1,199]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 199 ms_handle_reset con 0x564b59a85c00 session 0x564b5aee3c20
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b59a85c00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 199 ms_handle_reset con 0x564b59a85c00 session 0x564b57c7b680
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b57d3ec00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 132587520 unmapped: 36913152 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:00:26.987588+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b57d3f400
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 199 ms_handle_reset con 0x564b57d3f400 session 0x564b5bc98b40
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b598fb000
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 132603904 unmapped: 36896768 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 199 handle_osd_map epochs [200,200], i have 199, src has [1,200]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:00:27.987717+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 200 heartbeat osd_stat(store_statfs(0x1b5aa1000/0x0/0x1bfc00000, data 0x4cd1585/0x4e2c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x532f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 200 ms_handle_reset con 0x564b57d3ec00 session 0x564b5b06c960
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 200 ms_handle_reset con 0x564b598fb000 session 0x564b5bc985a0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b5b152400
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 200 ms_handle_reset con 0x564b5b152400 session 0x564b5b06c780
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b57d3ec00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b57d3f400
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 132628480 unmapped: 36872192 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 200 handle_osd_map epochs [200,201], i have 200, src has [1,201]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:00:28.987945+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b598fb000
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 201 handle_osd_map epochs [201,201], i have 201, src has [1,201]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 201 ms_handle_reset con 0x564b57d3f400 session 0x564b5b1b54a0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b59a85c00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 201 ms_handle_reset con 0x564b598fb000 session 0x564b5b06c1e0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 201 ms_handle_reset con 0x564b57d3ec00 session 0x564b5b06cb40
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b582e0c00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 201 handle_osd_map epochs [201,201], i have 201, src has [1,201]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 201 handle_osd_map epochs [201,201], i have 201, src has [1,201]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 201 ms_handle_reset con 0x564b582e0c00 session 0x564b5b06d4a0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b582e1800
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b595a1000
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 201 ms_handle_reset con 0x564b595a1000 session 0x564b58346d20
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b595a1000
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 133423104 unmapped: 36077568 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:00:29.988137+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 201 handle_osd_map epochs [202,202], i have 201, src has [1,202]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 202 handle_osd_map epochs [202,202], i have 202, src has [1,202]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 202 handle_osd_map epochs [202,202], i have 202, src has [1,202]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 202 ms_handle_reset con 0x564b59a85c00 session 0x564b5b1b4f00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 202 ms_handle_reset con 0x564b582e1800 session 0x564b5bc43a40
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b57d3ec00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 202 ms_handle_reset con 0x564b57d3ec00 session 0x564b5833e000
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 202 handle_osd_map epochs [202,202], i have 202, src has [1,202]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 202 ms_handle_reset con 0x564b595a1000 session 0x564b584925a0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2101536 data_alloc: 184549376 data_used: 3809280
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 132161536 unmapped: 37339136 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:00:30.988270+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b57d3f400
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 202 ms_handle_reset con 0x564b57d3f400 session 0x564b5c1a7a40
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b57d3f400
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 8.844838142s of 10.049607277s, submitted: 359
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b57d3ec00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 202 ms_handle_reset con 0x564b57d3ec00 session 0x564b5833fa40
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 202 ms_handle_reset con 0x564b57d3f400 session 0x564b5c1a7680
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 132161536 unmapped: 37339136 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:00:31.988385+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 202 handle_osd_map epochs [202,203], i have 202, src has [1,203]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b582e1800
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 203 ms_handle_reset con 0x564b582e1800 session 0x564b5c1a6960
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b595a1000
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 203 handle_osd_map epochs [203,203], i have 203, src has [1,203]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 203 handle_osd_map epochs [203,203], i have 203, src has [1,203]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 203 handle_osd_map epochs [203,203], i have 203, src has [1,203]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 203 ms_handle_reset con 0x564b595a1000 session 0x564b5c1a63c0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 132161536 unmapped: 37339136 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:00:32.988524+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 203 heartbeat osd_stat(store_statfs(0x1b4b57000/0x0/0x1bfc00000, data 0x5c14655/0x5d73000, compress 0x0/0x0/0x0, omap 0x649, meta 0x532f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b59a85c00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 203 ms_handle_reset con 0x564b59a85c00 session 0x564b5c448000
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b57d3ec00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 132169728 unmapped: 37330944 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:00:33.988671+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 203 ms_handle_reset con 0x564b57d3ec00 session 0x564b5d73a780
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 203 heartbeat osd_stat(store_statfs(0x1b4b5b000/0x0/0x1bfc00000, data 0x5c14655/0x5d73000, compress 0x0/0x0/0x0, omap 0x649, meta 0x532f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 132186112 unmapped: 37314560 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:00:34.988801+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1990980 data_alloc: 184549376 data_used: 3854336
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 132186112 unmapped: 37314560 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:00:35.988934+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 132186112 unmapped: 37314560 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:00:36.989131+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 203 handle_osd_map epochs [204,204], i have 203, src has [1,204]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 204 handle_osd_map epochs [204,204], i have 204, src has [1,204]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 131604480 unmapped: 37896192 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:00:37.989253+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _renew_subs
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _send_mon_message to mon.np0005626463 at v2:172.18.0.103:3300/0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 204 handle_osd_map epochs [205,205], i have 204, src has [1,205]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 131604480 unmapped: 37896192 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:00:38.989355+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 205 heartbeat osd_stat(store_statfs(0x1b5a92000/0x0/0x1bfc00000, data 0x4cdc854/0x4e3b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x532f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b57d3f400
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:00:39.989469+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 131604480 unmapped: 37896192 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1999428 data_alloc: 184549376 data_used: 3874816
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 205 heartbeat osd_stat(store_statfs(0x1b5a8e000/0x0/0x1bfc00000, data 0x4cded66/0x4e40000, compress 0x0/0x0/0x0, omap 0x649, meta 0x532f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:00:40.989593+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 131604480 unmapped: 37896192 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:00:41.989727+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 131604480 unmapped: 37896192 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 205 handle_osd_map epochs [206,206], i have 205, src has [1,206]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 11.327302933s of 11.611303329s, submitted: 140
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:00:42.990594+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 132653056 unmapped: 36847616 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:00:43.990774+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 132653056 unmapped: 36847616 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 206 heartbeat osd_stat(store_statfs(0x1b5a89000/0x0/0x1bfc00000, data 0x4ce105a/0x4e44000, compress 0x0/0x0/0x0, omap 0x649, meta 0x532f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:00:44.990937+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 132653056 unmapped: 36847616 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2003454 data_alloc: 184549376 data_used: 3887104
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:00:45.991093+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 206 heartbeat osd_stat(store_statfs(0x1b5a89000/0x0/0x1bfc00000, data 0x4ce105a/0x4e44000, compress 0x0/0x0/0x0, omap 0x649, meta 0x532f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 132653056 unmapped: 36847616 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:00:46.991317+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 132653056 unmapped: 36847616 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 206 heartbeat osd_stat(store_statfs(0x1b5a89000/0x0/0x1bfc00000, data 0x4ce105a/0x4e44000, compress 0x0/0x0/0x0, omap 0x649, meta 0x532f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 206 handle_osd_map epochs [207,207], i have 206, src has [1,207]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 206 handle_osd_map epochs [207,207], i have 207, src has [1,207]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 206 handle_osd_map epochs [207,207], i have 207, src has [1,207]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:00:47.991479+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 132653056 unmapped: 36847616 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:00:48.991639+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 132653056 unmapped: 36847616 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 207 heartbeat osd_stat(store_statfs(0x1b5a86000/0x0/0x1bfc00000, data 0x4ce3302/0x4e48000, compress 0x0/0x0/0x0, omap 0x649, meta 0x532f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:00:49.991770+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 132653056 unmapped: 36847616 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2005576 data_alloc: 184549376 data_used: 3887104
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:00:50.991946+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 132653056 unmapped: 36847616 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:00:51.992127+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 132653056 unmapped: 36847616 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:00:52.992309+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 132653056 unmapped: 36847616 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 207 heartbeat osd_stat(store_statfs(0x1b5a86000/0x0/0x1bfc00000, data 0x4ce3302/0x4e48000, compress 0x0/0x0/0x0, omap 0x649, meta 0x532f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:00:53.992459+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 207 heartbeat osd_stat(store_statfs(0x1b5a86000/0x0/0x1bfc00000, data 0x4ce3302/0x4e48000, compress 0x0/0x0/0x0, omap 0x649, meta 0x532f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 132653056 unmapped: 36847616 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:00:54.992595+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 132653056 unmapped: 36847616 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2005576 data_alloc: 184549376 data_used: 3887104
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:00:55.992788+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 132653056 unmapped: 36847616 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 207 heartbeat osd_stat(store_statfs(0x1b5a86000/0x0/0x1bfc00000, data 0x4ce3302/0x4e48000, compress 0x0/0x0/0x0, omap 0x649, meta 0x532f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:00:56.992947+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 132653056 unmapped: 36847616 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:00:57.993024+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 132653056 unmapped: 36847616 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:00:58.993190+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 132653056 unmapped: 36847616 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:00:59.993350+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 132653056 unmapped: 36847616 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 207 heartbeat osd_stat(store_statfs(0x1b5a86000/0x0/0x1bfc00000, data 0x4ce3302/0x4e48000, compress 0x0/0x0/0x0, omap 0x649, meta 0x532f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2005576 data_alloc: 184549376 data_used: 3887104
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:01:00.993509+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 132653056 unmapped: 36847616 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:01:01.993673+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 132653056 unmapped: 36847616 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:01:02.993904+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 132653056 unmapped: 36847616 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:01:03.994034+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 132653056 unmapped: 36847616 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:01:04.994221+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 132653056 unmapped: 36847616 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 207 heartbeat osd_stat(store_statfs(0x1b5a86000/0x0/0x1bfc00000, data 0x4ce3302/0x4e48000, compress 0x0/0x0/0x0, omap 0x649, meta 0x532f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2005576 data_alloc: 184549376 data_used: 3887104
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:01:05.994381+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 132653056 unmapped: 36847616 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:01:06.994573+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 132653056 unmapped: 36847616 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 24.419725418s of 24.487459183s, submitted: 41
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:01:07.994749+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 132661248 unmapped: 36839424 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 207 heartbeat osd_stat(store_statfs(0x1b5a85000/0x0/0x1bfc00000, data 0x4ce335f/0x4e49000, compress 0x0/0x0/0x0, omap 0x649, meta 0x532f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:01:08.994922+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 132661248 unmapped: 36839424 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:01:09.995137+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 132661248 unmapped: 36839424 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2007344 data_alloc: 184549376 data_used: 3887104
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 207 heartbeat osd_stat(store_statfs(0x1b5a85000/0x0/0x1bfc00000, data 0x4ce335f/0x4e49000, compress 0x0/0x0/0x0, omap 0x649, meta 0x532f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:01:10.995297+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 132661248 unmapped: 36839424 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:01:11.995488+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 132448256 unmapped: 37052416 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:01:12.995647+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 132448256 unmapped: 37052416 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 207 heartbeat osd_stat(store_statfs(0x1b5a84000/0x0/0x1bfc00000, data 0x4ce33fa/0x4e4a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x532f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:01:13.995776+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 132448256 unmapped: 37052416 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:01:14.995991+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 132448256 unmapped: 37052416 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2008936 data_alloc: 184549376 data_used: 3887104
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:01:15.996160+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 132456448 unmapped: 37044224 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:01:16.996370+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 132456448 unmapped: 37044224 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 10.039158821s of 10.064663887s, submitted: 4
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:01:17.996552+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 132456448 unmapped: 37044224 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:01:18.996732+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 132456448 unmapped: 37044224 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 207 heartbeat osd_stat(store_statfs(0x1b5a84000/0x0/0x1bfc00000, data 0x4ce33fa/0x4e4a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x532f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b582e0c00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 207 ms_handle_reset con 0x564b582e0c00 session 0x564b59aa65a0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:01:19.996902+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 132472832 unmapped: 37027840 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b598fb000
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 207 ms_handle_reset con 0x564b598fb000 session 0x564b59aa7680
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b59ff9000
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2012325 data_alloc: 184549376 data_used: 3887104
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 207 ms_handle_reset con 0x564b59ff9000 session 0x564b59aa6b40
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:01:20.997027+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 132481024 unmapped: 37019648 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:01:21.997260+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 132481024 unmapped: 37019648 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:01:22.997537+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 132481024 unmapped: 37019648 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 207 heartbeat osd_stat(store_statfs(0x1b5a82000/0x0/0x1bfc00000, data 0x4ce3495/0x4e4b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x532f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:01:23.997709+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 132489216 unmapped: 37011456 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:01:24.997853+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 132489216 unmapped: 37011456 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2008760 data_alloc: 184549376 data_used: 3887104
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _renew_subs
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _send_mon_message to mon.np0005626463 at v2:172.18.0.103:3300/0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 207 handle_osd_map epochs [208,208], i have 207, src has [1,208]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:01:25.998043+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 208 heartbeat osd_stat(store_statfs(0x1b5a84000/0x0/0x1bfc00000, data 0x4ce33fa/0x4e4a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x532f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 132497408 unmapped: 37003264 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:01:26.998240+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 132497408 unmapped: 37003264 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:01:27.998395+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 10.253166199s of 10.466239929s, submitted: 83
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 132497408 unmapped: 37003264 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:01:28.998565+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 132521984 unmapped: 36978688 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 208 heartbeat osd_stat(store_statfs(0x1b5a7f000/0x0/0x1bfc00000, data 0x4ce57d8/0x4e4e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x532f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:01:29.998692+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 208 handle_osd_map epochs [208,209], i have 208, src has [1,209]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 132530176 unmapped: 36970496 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 209 handle_osd_map epochs [209,209], i have 209, src has [1,209]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b5a2c1800
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2016466 data_alloc: 184549376 data_used: 3915776
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:01:30.998837+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 132538368 unmapped: 36962304 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 209 handle_osd_map epochs [210,210], i have 209, src has [1,210]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:01:31.998968+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 210 handle_osd_map epochs [210,210], i have 210, src has [1,210]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 132587520 unmapped: 36913152 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 210 handle_osd_map epochs [210,211], i have 210, src has [1,211]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b59897800
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 211 handle_osd_map epochs [211,211], i have 211, src has [1,211]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 211 ms_handle_reset con 0x564b59897800 session 0x564b5d7bd680
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:01:32.999103+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 132612096 unmapped: 36888576 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:01:33.999247+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 132612096 unmapped: 36888576 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b57d3ec00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 211 ms_handle_reset con 0x564b57d3ec00 session 0x564b59670b40
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b582e0c00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 211 ms_handle_reset con 0x564b582e0c00 session 0x564b5965ad20
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:01:34.999376+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 132685824 unmapped: 36814848 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 211 heartbeat osd_stat(store_statfs(0x1b5a70000/0x0/0x1bfc00000, data 0x4cec4a5/0x4e5d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x532f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2031400 data_alloc: 184549376 data_used: 3940352
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b598fb000
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _renew_subs
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _send_mon_message to mon.np0005626463 at v2:172.18.0.103:3300/0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 211 handle_osd_map epochs [212,212], i have 211, src has [1,212]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 212 handle_osd_map epochs [212,212], i have 212, src has [1,212]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 212 ms_handle_reset con 0x564b598fb000 session 0x564b5774da40
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:01:35.999517+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 132702208 unmapped: 36798464 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 212 heartbeat osd_stat(store_statfs(0x1b5a6a000/0x0/0x1bfc00000, data 0x4cee962/0x4e62000, compress 0x0/0x0/0x0, omap 0x649, meta 0x532f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:01:36.999681+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 132702208 unmapped: 36798464 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b59ff9000
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 212 ms_handle_reset con 0x564b59ff9000 session 0x564b59ff6000
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b588db400
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 212 handle_osd_map epochs [213,213], i have 212, src has [1,213]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 213 handle_osd_map epochs [213,213], i have 213, src has [1,213]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 213 ms_handle_reset con 0x564b588db400 session 0x564b5bc98f00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:01:37.999823+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 132734976 unmapped: 36765696 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:01:38.999948+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 132734976 unmapped: 36765696 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 10.937666893s of 11.566725731s, submitted: 203
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:01:40.000095+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 132743168 unmapped: 36757504 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 213 handle_osd_map epochs [214,214], i have 213, src has [1,214]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 214 heartbeat osd_stat(store_statfs(0x1b5a6a000/0x0/0x1bfc00000, data 0x4cf0b39/0x4e64000, compress 0x0/0x0/0x0, omap 0x649, meta 0x532f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2042145 data_alloc: 184549376 data_used: 3981312
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:01:41.000271+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 132775936 unmapped: 36724736 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 214 handle_osd_map epochs [215,215], i have 214, src has [1,215]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:01:42.000406+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 132792320 unmapped: 36708352 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 215 heartbeat osd_stat(store_statfs(0x1b5a5f000/0x0/0x1bfc00000, data 0x4cf549a/0x4e6d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x532f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 215 handle_osd_map epochs [215,216], i have 215, src has [1,216]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:01:43.000593+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 132825088 unmapped: 36675584 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b588db400
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 216 handle_osd_map epochs [216,216], i have 216, src has [1,216]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 216 ms_handle_reset con 0x564b588db400 session 0x564b5b06da40
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:01:44.000919+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 132833280 unmapped: 36667392 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:01:45.001064+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 132833280 unmapped: 36667392 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2053045 data_alloc: 184549376 data_used: 3981312
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 216 heartbeat osd_stat(store_statfs(0x1b5a5b000/0x0/0x1bfc00000, data 0x4cf77af/0x4e72000, compress 0x0/0x0/0x0, omap 0x649, meta 0x532f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:01:46.001619+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 132833280 unmapped: 36667392 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 216 heartbeat osd_stat(store_statfs(0x1b5a5b000/0x0/0x1bfc00000, data 0x4cf77af/0x4e72000, compress 0x0/0x0/0x0, omap 0x649, meta 0x532f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:01:47.001905+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 132833280 unmapped: 36667392 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b57d3ec00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 216 ms_handle_reset con 0x564b57d3ec00 session 0x564b5aee3a40
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b582e0c00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 216 handle_osd_map epochs [217,217], i have 216, src has [1,217]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 217 handle_osd_map epochs [217,217], i have 217, src has [1,217]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 217 handle_osd_map epochs [217,217], i have 217, src has [1,217]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 217 ms_handle_reset con 0x564b582e0c00 session 0x564b59ff61e0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:01:48.002081+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 132923392 unmapped: 36577280 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:01:49.002350+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _renew_subs
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _send_mon_message to mon.np0005626463 at v2:172.18.0.103:3300/0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 217 handle_osd_map epochs [218,218], i have 217, src has [1,218]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 132939776 unmapped: 36560896 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:01:50.002490+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 218 handle_osd_map epochs [218,218], i have 218, src has [1,218]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 10.046837807s of 10.437446594s, submitted: 172
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 132947968 unmapped: 36552704 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2059429 data_alloc: 184549376 data_used: 4005888
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:01:51.002673+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 132947968 unmapped: 36552704 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:01:52.002859+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 132956160 unmapped: 36544512 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 218 heartbeat osd_stat(store_statfs(0x1b5a55000/0x0/0x1bfc00000, data 0x4cfbe04/0x4e79000, compress 0x0/0x0/0x0, omap 0x649, meta 0x532f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:01:53.003013+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 132956160 unmapped: 36544512 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b598fb000
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 218 ms_handle_reset con 0x564b598fb000 session 0x564b58346d20
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:01:54.003157+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 132956160 unmapped: 36544512 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b59ff9000
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 218 ms_handle_reset con 0x564b59ff9000 session 0x564b5b06d4a0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:01:55.003296+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 132956160 unmapped: 36544512 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2063881 data_alloc: 184549376 data_used: 4005888
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:01:56.003453+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b59ff9000
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 218 ms_handle_reset con 0x564b59ff9000 session 0x564b5b06c1e0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 132964352 unmapped: 36536320 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:01:57.003664+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 218 heartbeat osd_stat(store_statfs(0x1b5a51000/0x0/0x1bfc00000, data 0x4cfbee9/0x4e7d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x532f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 132964352 unmapped: 36536320 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 218 handle_osd_map epochs [218,219], i have 218, src has [1,219]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 219 handle_osd_map epochs [219,219], i have 219, src has [1,219]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:01:58.003826+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 134021120 unmapped: 35479552 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:01:59.003988+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b57d3ec00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 219 handle_osd_map epochs [219,219], i have 219, src has [1,219]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 219 ms_handle_reset con 0x564b57d3ec00 session 0x564b5b1b54a0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b582e0c00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 219 handle_osd_map epochs [219,219], i have 219, src has [1,219]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 134037504 unmapped: 35463168 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 219 ms_handle_reset con 0x564b582e0c00 session 0x564b5833b4a0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 219 heartbeat osd_stat(store_statfs(0x1b5a4c000/0x0/0x1bfc00000, data 0x4cfe0e8/0x4e80000, compress 0x0/0x0/0x0, omap 0x649, meta 0x532f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:02:00.004222+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.703578949s of 10.006926537s, submitted: 77
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b588db400
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 134037504 unmapped: 35463168 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 219 ms_handle_reset con 0x564b588db400 session 0x564b5b06c960
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b598fb000
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 219 ms_handle_reset con 0x564b598fb000 session 0x564b5bc98b40
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2073659 data_alloc: 184549376 data_used: 4018176
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:02:01.004472+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 134045696 unmapped: 35454976 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b57d3ec00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 219 ms_handle_reset con 0x564b57d3ec00 session 0x564b5aee3c20
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 219 heartbeat osd_stat(store_statfs(0x1b5a51000/0x0/0x1bfc00000, data 0x4cfdfd9/0x4e7c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x532f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b582e0c00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 219 ms_handle_reset con 0x564b582e0c00 session 0x564b5aee25a0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:02:02.005036+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 134037504 unmapped: 35463168 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:02:03.005218+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 134037504 unmapped: 35463168 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:02:04.005419+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 134045696 unmapped: 35454976 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:02:05.005681+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 134053888 unmapped: 35446784 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2071166 data_alloc: 184549376 data_used: 4018176
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:02:06.005911+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 134053888 unmapped: 35446784 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 219 heartbeat osd_stat(store_statfs(0x1b5a55000/0x0/0x1bfc00000, data 0x4cfde3f/0x4e78000, compress 0x0/0x0/0x0, omap 0x649, meta 0x532f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:02:07.006180+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 134053888 unmapped: 35446784 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:02:08.006416+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 134062080 unmapped: 35438592 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:02:09.006601+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 134062080 unmapped: 35438592 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:02:10.006846+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 134062080 unmapped: 35438592 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2073062 data_alloc: 184549376 data_used: 4018176
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:02:11.007118+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 134062080 unmapped: 35438592 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 219 heartbeat osd_stat(store_statfs(0x1b5a53000/0x0/0x1bfc00000, data 0x4cfdf39/0x4e79000, compress 0x0/0x0/0x0, omap 0x649, meta 0x532f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 11.475559235s of 11.720080376s, submitted: 55
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:02:12.007342+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 134070272 unmapped: 35430400 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:02:13.007515+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 134070272 unmapped: 35430400 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 219 heartbeat osd_stat(store_statfs(0x1b5a54000/0x0/0x1bfc00000, data 0x4cfdf06/0x4e79000, compress 0x0/0x0/0x0, omap 0x649, meta 0x532f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:02:14.007611+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 134070272 unmapped: 35430400 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:02:15.007748+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 134070272 unmapped: 35430400 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2072934 data_alloc: 184549376 data_used: 4018176
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:02:16.007880+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 134070272 unmapped: 35430400 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:02:17.008068+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b588db400
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 219 ms_handle_reset con 0x564b588db400 session 0x564b5bc98000
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 134070272 unmapped: 35430400 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 219 heartbeat osd_stat(store_statfs(0x1b5a55000/0x0/0x1bfc00000, data 0x4cfde10/0x4e78000, compress 0x0/0x0/0x0, omap 0x649, meta 0x532f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:02:18.008227+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b59ff9000
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 134070272 unmapped: 35430400 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _renew_subs
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _send_mon_message to mon.np0005626463 at v2:172.18.0.103:3300/0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 219 handle_osd_map epochs [220,220], i have 219, src has [1,220]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 220 handle_osd_map epochs [220,220], i have 220, src has [1,220]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 220 ms_handle_reset con 0x564b59ff9000 session 0x564b5bc98960
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:02:19.008388+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 134070272 unmapped: 35430400 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 220 handle_osd_map epochs [220,220], i have 220, src has [1,220]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:02:20.008520+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 134078464 unmapped: 35422208 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 220 handle_osd_map epochs [220,220], i have 220, src has [1,220]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2080464 data_alloc: 184549376 data_used: 4030464
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:02:21.008659+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 134078464 unmapped: 35422208 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:02:22.008804+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b588dac00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 10.245538712s of 10.331121445s, submitted: 18
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 220 ms_handle_reset con 0x564b588dac00 session 0x564b596752c0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 134078464 unmapped: 35422208 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 220 heartbeat osd_stat(store_statfs(0x1b5a4f000/0x0/0x1bfc00000, data 0x4d002c8/0x4e7e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x532f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:02:23.008975+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b57d3ec00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 134086656 unmapped: 35414016 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 220 handle_osd_map epochs [220,221], i have 220, src has [1,221]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 220 heartbeat osd_stat(store_statfs(0x1b5a4e000/0x0/0x1bfc00000, data 0x4d0033a/0x4e80000, compress 0x0/0x0/0x0, omap 0x649, meta 0x532f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 221 handle_osd_map epochs [221,221], i have 221, src has [1,221]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 221 ms_handle_reset con 0x564b57d3ec00 session 0x564b5d73b2c0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:02:24.009115+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 134111232 unmapped: 35389440 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b582e0c00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b588db400
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 221 heartbeat osd_stat(store_statfs(0x1b5a49000/0x0/0x1bfc00000, data 0x4d026dc/0x4e84000, compress 0x0/0x0/0x0, omap 0x649, meta 0x532f9b7), peers [0,1,2,3,4] op hist [0,1])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 221 ms_handle_reset con 0x564b588db400 session 0x564b5d7bc5a0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 221 ms_handle_reset con 0x564b582e0c00 session 0x564b59a790e0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:02:25.009284+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b59ff9000
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 221 ms_handle_reset con 0x564b59ff9000 session 0x564b5833fe00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b5b348400
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 221 ms_handle_reset con 0x564b5b348400 session 0x564b59591c20
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 134119424 unmapped: 35381248 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 221 heartbeat osd_stat(store_statfs(0x1b5a49000/0x0/0x1bfc00000, data 0x4d026ec/0x4e85000, compress 0x0/0x0/0x0, omap 0x649, meta 0x532f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b5b348400
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 221 ms_handle_reset con 0x564b5b348400 session 0x564b5aee2780
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b57d3ec00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2092928 data_alloc: 184549376 data_used: 4046848
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:02:26.009397+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 221 ms_handle_reset con 0x564b57d3ec00 session 0x564b5aee3860
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 134127616 unmapped: 35373056 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b582e0c00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 221 ms_handle_reset con 0x564b582e0c00 session 0x564b5b6e0f00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b588db400
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:02:27.009604+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 221 handle_osd_map epochs [221,222], i have 221, src has [1,222]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 222 handle_osd_map epochs [222,222], i have 222, src has [1,222]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 222 ms_handle_reset con 0x564b588db400 session 0x564b5e30bc20
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 134168576 unmapped: 35332096 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 222 handle_osd_map epochs [222,222], i have 222, src has [1,222]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:02:28.009760+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b59ff9000
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 222 ms_handle_reset con 0x564b59ff9000 session 0x564b59674000
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b57d3ec00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 134242304 unmapped: 35258368 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 222 ms_handle_reset con 0x564b57d3ec00 session 0x564b5bc99860
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:02:29.009907+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: mgrc handle_mgr_map Got map version 48
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.107:6810/2356945423,v1:172.18.0.107:6811/2356945423]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 134258688 unmapped: 35241984 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b582e0c00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 222 ms_handle_reset con 0x564b582e0c00 session 0x564b57c7ba40
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:02:30.010025+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 134266880 unmapped: 35233792 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b588db400
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2100943 data_alloc: 184549376 data_used: 4055040
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:02:31.010204+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 222 heartbeat osd_stat(store_statfs(0x1b5a43000/0x0/0x1bfc00000, data 0x4d04d78/0x4e8b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x532f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 134291456 unmapped: 35209216 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 222 handle_osd_map epochs [223,223], i have 222, src has [1,223]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 223 handle_osd_map epochs [223,223], i have 223, src has [1,223]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 223 ms_handle_reset con 0x564b588db400 session 0x564b5b6e14a0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b5b348400
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b59fdec00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:02:32.010345+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.356479645s of 10.005580902s, submitted: 132
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 223 ms_handle_reset con 0x564b59fdec00 session 0x564b5d73ba40
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 223 ms_handle_reset con 0x564b5b348400 session 0x564b583452c0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 134324224 unmapped: 35176448 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b5b348400
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 223 ms_handle_reset con 0x564b5b348400 session 0x564b5dd363c0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b57d3ec00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:02:33.010491+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 223 ms_handle_reset con 0x564b57d3ec00 session 0x564b5b1b41e0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 134348800 unmapped: 35151872 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 223 heartbeat osd_stat(store_statfs(0x1b5a3e000/0x0/0x1bfc00000, data 0x4d06fef/0x4e8f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x532f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:02:34.010732+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b582e0c00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 223 ms_handle_reset con 0x564b582e0c00 session 0x564b5b06d2c0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 223 heartbeat osd_stat(store_statfs(0x1b5a3e000/0x0/0x1bfc00000, data 0x4d06f8d/0x4e8e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x532f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b588db400
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 134348800 unmapped: 35151872 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 223 handle_osd_map epochs [224,224], i have 223, src has [1,224]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 224 ms_handle_reset con 0x564b588db400 session 0x564b59aa7e00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 224 heartbeat osd_stat(store_statfs(0x1b5a3e000/0x0/0x1bfc00000, data 0x4d06f8d/0x4e8e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x532f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:02:35.010921+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b59fdec00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 224 ms_handle_reset con 0x564b59fdec00 session 0x564b596703c0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b57d3ec00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 134365184 unmapped: 35135488 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 224 ms_handle_reset con 0x564b57d3ec00 session 0x564b5c1a61e0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 224 heartbeat osd_stat(store_statfs(0x1b5a3c000/0x0/0x1bfc00000, data 0x4d09331/0x4e91000, compress 0x0/0x0/0x0, omap 0x649, meta 0x532f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2108390 data_alloc: 184549376 data_used: 4079616
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:02:36.011065+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 134389760 unmapped: 35110912 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:02:37.011311+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 134389760 unmapped: 35110912 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 224 handle_osd_map epochs [225,225], i have 224, src has [1,225]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:02:38.011490+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 134397952 unmapped: 35102720 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:02:39.011632+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 225 heartbeat osd_stat(store_statfs(0x1b5a39000/0x0/0x1bfc00000, data 0x4d0b678/0x4e94000, compress 0x0/0x0/0x0, omap 0x649, meta 0x532f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 134406144 unmapped: 35094528 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:02:40.011798+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 134430720 unmapped: 35069952 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2117240 data_alloc: 184549376 data_used: 4079616
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:02:41.011993+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 134430720 unmapped: 35069952 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:02:42.012151+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 134430720 unmapped: 35069952 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _renew_subs
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _send_mon_message to mon.np0005626463 at v2:172.18.0.103:3300/0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 225 handle_osd_map epochs [226,226], i have 225, src has [1,226]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 10.461457253s of 10.860767365s, submitted: 102
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:02:43.012299+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 226 handle_osd_map epochs [226,226], i have 226, src has [1,226]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #44. Immutable memtables: 1.
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 226 heartbeat osd_stat(store_statfs(0x1b5a36000/0x0/0x1bfc00000, data 0x4d0b75c/0x4e96000, compress 0x0/0x0/0x0, omap 0x649, meta 0x532f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 135487488 unmapped: 34013184 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:02:44.012727+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 135495680 unmapped: 34004992 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:02:45.012979+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 226 heartbeat osd_stat(store_statfs(0x1b4892000/0x0/0x1bfc00000, data 0x4d0d9e5/0x4e9a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 136544256 unmapped: 32956416 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 226 handle_osd_map epochs [226,227], i have 226, src has [1,227]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:02:46.013165+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2124902 data_alloc: 184549376 data_used: 4091904
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 136552448 unmapped: 32948224 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 227 heartbeat osd_stat(store_statfs(0x1b488f000/0x0/0x1bfc00000, data 0x4d0fdf0/0x4e9e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:02:47.013339+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 136552448 unmapped: 32948224 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:02:48.013545+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 136552448 unmapped: 32948224 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:02:49.013739+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b582e0c00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 227 ms_handle_reset con 0x564b582e0c00 session 0x564b585b9e00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 136552448 unmapped: 32948224 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:02:50.013911+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 136552448 unmapped: 32948224 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b588db400
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:02:51.014134+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2125052 data_alloc: 184549376 data_used: 4091904
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 136552448 unmapped: 32948224 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 227 handle_osd_map epochs [228,228], i have 227, src has [1,228]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 228 handle_osd_map epochs [228,228], i have 228, src has [1,228]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 228 ms_handle_reset con 0x564b588db400 session 0x564b5aee23c0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:02:52.014366+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 228 heartbeat osd_stat(store_statfs(0x1b488b000/0x0/0x1bfc00000, data 0x4d12175/0x4ea2000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 136585216 unmapped: 32915456 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 228 handle_osd_map epochs [228,229], i have 228, src has [1,229]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:02:53.014542+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 10.087131500s of 10.309572220s, submitted: 100
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b5b348400
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b5a2c0000
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 136593408 unmapped: 32907264 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 229 handle_osd_map epochs [229,229], i have 229, src has [1,229]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 229 handle_osd_map epochs [229,229], i have 229, src has [1,229]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 229 ms_handle_reset con 0x564b5a2c0000 session 0x564b585b9c20
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 229 ms_handle_reset con 0x564b5b348400 session 0x564b57a8a3c0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:02:54.014763+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 136585216 unmapped: 32915456 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b5b348400
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 229 ms_handle_reset con 0x564b5b348400 session 0x564b585b8000
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b57d3ec00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:02:55.014931+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 229 ms_handle_reset con 0x564b57d3ec00 session 0x564b57a91e00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 136609792 unmapped: 32890880 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b582e0c00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 229 ms_handle_reset con 0x564b582e0c00 session 0x564b59a79a40
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:02:56.015090+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b588db400
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2137613 data_alloc: 184549376 data_used: 4104192
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 136617984 unmapped: 32882688 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 229 heartbeat osd_stat(store_statfs(0x1b4887000/0x0/0x1bfc00000, data 0x4d14496/0x4ea7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 229 handle_osd_map epochs [230,230], i have 229, src has [1,230]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 230 ms_handle_reset con 0x564b588db400 session 0x564b598b81e0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:02:57.015354+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b5a2c0000
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 230 ms_handle_reset con 0x564b5a2c0000 session 0x564b5c1a7c20
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 136626176 unmapped: 32874496 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b5a2c0000
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 230 ms_handle_reset con 0x564b5a2c0000 session 0x564b57a8a960
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:02:58.015487+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 136658944 unmapped: 32841728 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:02:59.015680+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 136658944 unmapped: 32841728 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 230 heartbeat osd_stat(store_statfs(0x1b4482000/0x0/0x1bfc00000, data 0x4d168c8/0x4eab000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:03:00.015806+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b57d3ec00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 230 ms_handle_reset con 0x564b57d3ec00 session 0x564b5bc43680
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 136683520 unmapped: 32817152 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 230 handle_osd_map epochs [231,231], i have 230, src has [1,231]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:03:01.015996+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2149021 data_alloc: 184549376 data_used: 4128768
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b582e0c00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 231 handle_osd_map epochs [231,231], i have 231, src has [1,231]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 231 ms_handle_reset con 0x564b582e0c00 session 0x564b59ff7e00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 136708096 unmapped: 32792576 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:03:02.016195+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 136708096 unmapped: 32792576 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:03:03.016361+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 136708096 unmapped: 32792576 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:03:04.016515+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 231 heartbeat osd_stat(store_statfs(0x1b447b000/0x0/0x1bfc00000, data 0x4d18d2a/0x4eb1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 136708096 unmapped: 32792576 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 11.158406258s of 11.766262054s, submitted: 166
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:03:05.016623+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 136708096 unmapped: 32792576 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:03:06.016762+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 231 heartbeat osd_stat(store_statfs(0x1b447b000/0x0/0x1bfc00000, data 0x4d18d2a/0x4eb1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2151833 data_alloc: 184549376 data_used: 4128768
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 136708096 unmapped: 32792576 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:03:07.016978+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 136708096 unmapped: 32792576 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _renew_subs
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _send_mon_message to mon.np0005626463 at v2:172.18.0.103:3300/0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 231 handle_osd_map epochs [232,232], i have 231, src has [1,232]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:03:08.017123+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 136724480 unmapped: 32776192 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:03:09.017342+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b588db400
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 232 handle_osd_map epochs [232,233], i have 232, src has [1,233]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 136757248 unmapped: 32743424 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 233 ms_handle_reset con 0x564b588db400 session 0x564b57c7b680
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b5b348400
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:03:10.017488+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 233 handle_osd_map epochs [234,234], i have 233, src has [1,234]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 136798208 unmapped: 32702464 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 234 ms_handle_reset con 0x564b5b348400 session 0x564b59675e00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 234 handle_osd_map epochs [234,234], i have 234, src has [1,234]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:03:11.017625+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2165499 data_alloc: 184549376 data_used: 4157440
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 234 heartbeat osd_stat(store_statfs(0x1b446f000/0x0/0x1bfc00000, data 0x4d1f96b/0x4ebe000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b5b348400
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 234 ms_handle_reset con 0x564b5b348400 session 0x564b5bc99a40
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 136814592 unmapped: 32686080 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b57d3ec00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:03:12.017766+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 136822784 unmapped: 32677888 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _renew_subs
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _send_mon_message to mon.np0005626463 at v2:172.18.0.103:3300/0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 234 handle_osd_map epochs [235,235], i have 234, src has [1,235]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 235 ms_handle_reset con 0x564b57d3ec00 session 0x564b5b6e0000
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b582e0c00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 235 ms_handle_reset con 0x564b582e0c00 session 0x564b5a30d2c0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b588db400
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 235 handle_osd_map epochs [235,235], i have 235, src has [1,235]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:03:13.017925+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 136839168 unmapped: 32661504 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 235 handle_osd_map epochs [236,236], i have 235, src has [1,236]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _renew_subs
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _send_mon_message to mon.np0005626463 at v2:172.18.0.103:3300/0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 236 ms_handle_reset con 0x564b588db400 session 0x564b5b6e0960
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:03:14.018085+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 236 heartbeat osd_stat(store_statfs(0x1b4465000/0x0/0x1bfc00000, data 0x4d24291/0x4ec7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 136863744 unmapped: 32636928 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.659623146s of 10.000049591s, submitted: 145
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b5a2c0000
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:03:15.018236+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 236 ms_handle_reset con 0x564b5a2c0000 session 0x564b59ff74a0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b57d3ec00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 236 ms_handle_reset con 0x564b57d3ec00 session 0x564b5c1a7860
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 136863744 unmapped: 32636928 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b582e0c00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 236 ms_handle_reset con 0x564b582e0c00 session 0x564b584925a0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b588db400
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 236 ms_handle_reset con 0x564b588db400 session 0x564b5d7bd860
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:03:16.018412+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2170825 data_alloc: 184549376 data_used: 4153344
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 136912896 unmapped: 32587776 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:03:17.018605+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 136912896 unmapped: 32587776 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 236 handle_osd_map epochs [237,237], i have 236, src has [1,237]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 236 handle_osd_map epochs [236,237], i have 237, src has [1,237]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b5b348400
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 237 handle_osd_map epochs [237,237], i have 237, src has [1,237]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 237 ms_handle_reset con 0x564b5b348400 session 0x564b57c83c20
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:03:18.018770+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 237 heartbeat osd_stat(store_statfs(0x1b4465000/0x0/0x1bfc00000, data 0x4d2645f/0x4ec8000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 136945664 unmapped: 32555008 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:03:19.018944+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b583a2400
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 136945664 unmapped: 32555008 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:03:20.019124+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 237 handle_osd_map epochs [238,238], i have 237, src has [1,238]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _renew_subs
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _send_mon_message to mon.np0005626463 at v2:172.18.0.103:3300/0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 237 handle_osd_map epochs [238,238], i have 238, src has [1,238]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 238 ms_handle_reset con 0x564b583a2400 session 0x564b59590d20
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 136953856 unmapped: 32546816 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:03:21.019693+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2182111 data_alloc: 184549376 data_used: 4182016
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b57d3ec00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 238 ms_handle_reset con 0x564b57d3ec00 session 0x564b5c1a65a0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b582e0c00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 136962048 unmapped: 32538624 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:03:22.021313+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 238 heartbeat osd_stat(store_statfs(0x1b4462000/0x0/0x1bfc00000, data 0x4d286bf/0x4ecb000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 238 handle_osd_map epochs [239,239], i have 238, src has [1,239]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 136994816 unmapped: 32505856 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 239 ms_handle_reset con 0x564b582e0c00 session 0x564b5b6e0d20
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 239 handle_osd_map epochs [240,240], i have 239, src has [1,240]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 240 heartbeat osd_stat(store_statfs(0x1b4459000/0x0/0x1bfc00000, data 0x4d2cd67/0x4ed3000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:03:23.021739+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 137052160 unmapped: 32448512 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b588db400
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 240 ms_handle_reset con 0x564b588db400 session 0x564b5c1a7860
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b5b348400
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:03:24.021928+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 240 ms_handle_reset con 0x564b5b348400 session 0x564b59ff74a0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 137117696 unmapped: 32382976 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 240 heartbeat osd_stat(store_statfs(0x1b4458000/0x0/0x1bfc00000, data 0x4d2ce02/0x4ed4000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.388644218s of 10.003359795s, submitted: 168
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:03:25.022067+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 137347072 unmapped: 32153600 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:03:26.022462+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2202153 data_alloc: 184549376 data_used: 4190208
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 137478144 unmapped: 32022528 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:03:27.022793+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 137478144 unmapped: 32022528 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:03:28.023012+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 138846208 unmapped: 30654464 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:03:29.023270+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 240 heartbeat osd_stat(store_statfs(0x1b43d5000/0x0/0x1bfc00000, data 0x4db35bc/0x4f59000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 138919936 unmapped: 30580736 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:03:30.023435+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 240 heartbeat osd_stat(store_statfs(0x1b43c6000/0x0/0x1bfc00000, data 0x4dc23f8/0x4f68000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 139272192 unmapped: 30228480 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:03:31.023715+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2210077 data_alloc: 184549376 data_used: 4190208
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b5a2c1400
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 240 ms_handle_reset con 0x564b5a2c1400 session 0x564b5b6e0000
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 138625024 unmapped: 30875648 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:03:32.023909+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 138731520 unmapped: 30769152 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _renew_subs
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _send_mon_message to mon.np0005626463 at v2:172.18.0.103:3300/0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 240 handle_osd_map epochs [241,241], i have 240, src has [1,241]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:03:33.024070+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b5a2c1400
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 241 ms_handle_reset con 0x564b5a2c1400 session 0x564b5774f4a0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b57d3ec00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 241 handle_osd_map epochs [241,241], i have 241, src has [1,241]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 241 handle_osd_map epochs [241,241], i have 241, src has [1,241]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 241 handle_osd_map epochs [241,241], i have 241, src has [1,241]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 241 ms_handle_reset con 0x564b57d3ec00 session 0x564b59675e00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 138862592 unmapped: 30638080 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _renew_subs
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _send_mon_message to mon.np0005626463 at v2:172.18.0.103:3300/0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 241 handle_osd_map epochs [242,242], i have 241, src has [1,242]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 242 handle_osd_map epochs [242,242], i have 242, src has [1,242]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:03:34.024271+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 242 handle_osd_map epochs [242,242], i have 242, src has [1,242]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 139231232 unmapped: 30269440 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.239533424s of 10.002458572s, submitted: 197
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:03:35.024400+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 242 heartbeat osd_stat(store_statfs(0x1b4303000/0x0/0x1bfc00000, data 0x4e82d8e/0x502a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 141385728 unmapped: 28114944 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:03:36.024602+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: mgrc handle_mgr_map Got map version 49
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.107:6810/2356945423,v1:172.18.0.107:6811/2356945423]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2226697 data_alloc: 184549376 data_used: 4218880
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 141459456 unmapped: 28041216 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:03:37.024911+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 141828096 unmapped: 27672576 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:03:38.025115+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 140926976 unmapped: 28573696 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:03:39.025265+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 140926976 unmapped: 28573696 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:03:40.025404+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 242 heartbeat osd_stat(store_statfs(0x1b425a000/0x0/0x1bfc00000, data 0x4f2f732/0x50d4000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 141246464 unmapped: 28254208 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:03:41.025604+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2222673 data_alloc: 184549376 data_used: 4218880
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 141254656 unmapped: 28246016 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:03:42.025789+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 242 handle_osd_map epochs [242,243], i have 242, src has [1,243]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 142303232 unmapped: 27197440 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 243 handle_osd_map epochs [243,243], i have 243, src has [1,243]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:03:43.025976+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 142516224 unmapped: 26984448 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:03:44.026133+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 142516224 unmapped: 26984448 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.612854004s of 10.001272202s, submitted: 91
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:03:45.026286+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 142557184 unmapped: 26943488 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _send_mon_message to mon.np0005626463 at v2:172.18.0.103:3300/0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:03:46.026449+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2236665 data_alloc: 184549376 data_used: 4231168
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 243 heartbeat osd_stat(store_statfs(0x1b41a6000/0x0/0x1bfc00000, data 0x4fe155e/0x5188000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 142565376 unmapped: 26935296 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:03:47.026681+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 142565376 unmapped: 26935296 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:03:48.026820+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 142843904 unmapped: 26656768 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:03:49.027002+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 243 heartbeat osd_stat(store_statfs(0x1b413d000/0x0/0x1bfc00000, data 0x504be2a/0x51f1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 143515648 unmapped: 25985024 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:03:50.027143+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 143589376 unmapped: 25911296 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:03:51.027391+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2244799 data_alloc: 184549376 data_used: 4231168
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 143753216 unmapped: 25747456 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:03:52.027569+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 143040512 unmapped: 26460160 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:03:53.027716+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 243 heartbeat osd_stat(store_statfs(0x1b40b4000/0x0/0x1bfc00000, data 0x50d48f0/0x527a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 143327232 unmapped: 26173440 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:03:54.027931+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 143327232 unmapped: 26173440 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.588497162s of 10.000463486s, submitted: 78
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:03:55.028757+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 243 heartbeat osd_stat(store_statfs(0x1b4087000/0x0/0x1bfc00000, data 0x5101b4d/0x52a7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 143532032 unmapped: 25968640 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:03:56.028958+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2255387 data_alloc: 184549376 data_used: 4231168
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 143532032 unmapped: 25968640 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 243 heartbeat osd_stat(store_statfs(0x1b4056000/0x0/0x1bfc00000, data 0x5133332/0x52d8000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:03:57.029171+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 143532032 unmapped: 25968640 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:03:58.029323+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 143704064 unmapped: 25796608 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:03:59.029504+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 143704064 unmapped: 25796608 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:04:00.029633+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 143704064 unmapped: 25796608 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:04:01.029813+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2253611 data_alloc: 184549376 data_used: 4231168
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 243 heartbeat osd_stat(store_statfs(0x1b4053000/0x0/0x1bfc00000, data 0x5133495/0x52da000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 143712256 unmapped: 25788416 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:04:02.029999+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 243 heartbeat osd_stat(store_statfs(0x1b4054000/0x0/0x1bfc00000, data 0x5133493/0x52da000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 143712256 unmapped: 25788416 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:04:03.030174+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 143712256 unmapped: 25788416 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:04:04.030401+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 143712256 unmapped: 25788416 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:04:05.030566+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 10.122197151s of 10.201749802s, submitted: 16
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 144441344 unmapped: 25059328 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 23 10:10:35 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/3951898632' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:04:06.030761+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2252745 data_alloc: 184549376 data_used: 4231168
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 243 heartbeat osd_stat(store_statfs(0x1b4055000/0x0/0x1bfc00000, data 0x51333cd/0x52d9000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 144441344 unmapped: 25059328 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:04:07.031047+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 144441344 unmapped: 25059328 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: mgrc handle_mgr_map Got map version 50
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.107:6810/2356945423,v1:172.18.0.107:6811/2356945423]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:04:08.031214+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 144449536 unmapped: 25051136 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:04:09.031340+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 144449536 unmapped: 25051136 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:04:10.031557+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 144457728 unmapped: 25042944 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:04:11.031737+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2258775 data_alloc: 184549376 data_used: 4231168
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 144474112 unmapped: 25026560 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:04:12.031966+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 243 heartbeat osd_stat(store_statfs(0x1b4051000/0x0/0x1bfc00000, data 0x51335cb/0x52dc000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 144474112 unmapped: 25026560 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:04:13.032129+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 144474112 unmapped: 25026560 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:04:14.032280+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 144474112 unmapped: 25026560 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:04:15.032420+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 10.051443100s of 10.165925980s, submitted: 258
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 144474112 unmapped: 25026560 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:04:16.032589+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2262861 data_alloc: 184549376 data_used: 4231168
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 144482304 unmapped: 25018368 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:04:17.032764+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 243 heartbeat osd_stat(store_statfs(0x1b4050000/0x0/0x1bfc00000, data 0x513372d/0x52de000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 144482304 unmapped: 25018368 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:04:18.032963+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 144490496 unmapped: 25010176 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:04:19.033160+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 144498688 unmapped: 25001984 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:04:20.033322+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 243 heartbeat osd_stat(store_statfs(0x1b4050000/0x0/0x1bfc00000, data 0x5133665/0x52dd000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 144498688 unmapped: 25001984 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 243 handle_osd_map epochs [243,244], i have 243, src has [1,244]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:04:21.033474+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2266585 data_alloc: 184549376 data_used: 4243456
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 144523264 unmapped: 24977408 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 244 handle_osd_map epochs [244,244], i have 244, src has [1,244]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:04:22.033662+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 145571840 unmapped: 23928832 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:04:23.033826+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 145571840 unmapped: 23928832 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:04:24.034031+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 244 handle_osd_map epochs [244,244], i have 244, src has [1,244]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 145620992 unmapped: 23879680 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:04:25.034204+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 146038784 unmapped: 23461888 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:04:26.034588+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 244 heartbeat osd_stat(store_statfs(0x1b3ff0000/0x0/0x1bfc00000, data 0x5195181/0x533e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2275577 data_alloc: 184549376 data_used: 4243456
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 146046976 unmapped: 23453696 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:04:27.035084+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 11.735066414s of 12.083557129s, submitted: 74
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 146112512 unmapped: 23388160 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 244 handle_osd_map epochs [245,245], i have 244, src has [1,245]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:04:28.035727+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 146259968 unmapped: 23240704 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:04:29.037361+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 146259968 unmapped: 23240704 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:04:30.037797+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 146374656 unmapped: 23126016 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:04:31.037966+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 245 heartbeat osd_stat(store_statfs(0x1b3f68000/0x0/0x1bfc00000, data 0x521ba51/0x53c5000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2289425 data_alloc: 184549376 data_used: 4255744
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 146989056 unmapped: 22511616 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:04:32.038482+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 245 heartbeat osd_stat(store_statfs(0x1b3f2e000/0x0/0x1bfc00000, data 0x5254359/0x53ff000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 146989056 unmapped: 22511616 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:04:33.039297+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 146989056 unmapped: 22511616 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:04:34.039828+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 148299776 unmapped: 21200896 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:04:35.040229+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 147726336 unmapped: 21774336 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 245 heartbeat osd_stat(store_statfs(0x1b3ebd000/0x0/0x1bfc00000, data 0x52c6452/0x5471000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:04:36.040574+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2299441 data_alloc: 184549376 data_used: 4263936
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 147726336 unmapped: 21774336 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:04:37.040820+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.505228043s of 10.005705833s, submitted: 104
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 148873216 unmapped: 20627456 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:04:38.041221+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 148873216 unmapped: 20627456 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:04:39.041485+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 148873216 unmapped: 20627456 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:04:40.041663+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 148316160 unmapped: 21184512 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:04:41.041808+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 245 heartbeat osd_stat(store_statfs(0x1b3e14000/0x0/0x1bfc00000, data 0x536e3c1/0x5519000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2308707 data_alloc: 184549376 data_used: 4263936
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 148316160 unmapped: 21184512 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:04:42.042044+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 148324352 unmapped: 21176320 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:04:43.042224+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 148807680 unmapped: 20692992 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:04:44.042440+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 149921792 unmapped: 19578880 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:04:45.042646+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 149962752 unmapped: 19537920 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:04:46.042861+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2321173 data_alloc: 184549376 data_used: 4263936
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 150077440 unmapped: 19423232 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:04:47.043087+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 245 heartbeat osd_stat(store_statfs(0x1b3d61000/0x0/0x1bfc00000, data 0x54205e5/0x55cb000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.585545540s of 10.001330376s, submitted: 89
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 150093824 unmapped: 19406848 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:04:48.043276+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 150093824 unmapped: 19406848 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:04:49.043478+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 150355968 unmapped: 19144704 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:04:50.205191+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 150618112 unmapped: 18882560 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:04:51.205354+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2330089 data_alloc: 184549376 data_used: 4263936
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 245 heartbeat osd_stat(store_statfs(0x1b3c95000/0x0/0x1bfc00000, data 0x54ed5d1/0x5699000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 150618112 unmapped: 18882560 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:04:52.205501+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 245 heartbeat osd_stat(store_statfs(0x1b3c95000/0x0/0x1bfc00000, data 0x54ed5d1/0x5699000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 151044096 unmapped: 18456576 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:04:53.205649+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 150863872 unmapped: 18636800 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:04:54.205836+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 150872064 unmapped: 18628608 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:04:55.206009+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 245 heartbeat osd_stat(store_statfs(0x1b3c2b000/0x0/0x1bfc00000, data 0x5558038/0x5703000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 151126016 unmapped: 18374656 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:04:56.206167+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2329293 data_alloc: 184549376 data_used: 4263936
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 152174592 unmapped: 17326080 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:04:57.206381+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 152289280 unmapped: 17211392 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:04:58.206561+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 152788992 unmapped: 16711680 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:04:59.206797+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 11.844199181s of 12.252660751s, submitted: 85
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 152788992 unmapped: 16711680 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:05:00.206988+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 150994944 unmapped: 18505728 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:05:01.207222+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2347731 data_alloc: 184549376 data_used: 4263936
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 245 heartbeat osd_stat(store_statfs(0x1b3b60000/0x0/0x1bfc00000, data 0x5621daa/0x57cd000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 151339008 unmapped: 18161664 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:05:02.207476+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 151339008 unmapped: 18161664 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:05:03.207649+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 152395776 unmapped: 17104896 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:05:04.207795+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 152846336 unmapped: 16654336 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:05:05.207947+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 152567808 unmapped: 16932864 heap: 169500672 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b582e0c00
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:05:06.208084+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 245 heartbeat osd_stat(store_statfs(0x1b3aed000/0x0/0x1bfc00000, data 0x5694602/0x5841000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2380667 data_alloc: 184549376 data_used: 4263936
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 245 handle_osd_map epochs [246,246], i have 245, src has [1,246]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 246 handle_osd_map epochs [246,246], i have 246, src has [1,246]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 246 ms_handle_reset con 0x564b582e0c00 session 0x564b598b81e0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 152584192 unmapped: 21577728 heap: 174161920 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b588db400
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:05:07.208184+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 246 handle_osd_map epochs [247,247], i have 246, src has [1,247]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 246 handle_osd_map epochs [247,247], i have 247, src has [1,247]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1e] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.17] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.b] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 152608768 unmapped: 21553152 heap: 174161920 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.16] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.0] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 247 ms_handle_reset con 0x564b588db400 session 0x564b59a79a40
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:05:08.208310+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 152616960 unmapped: 21544960 heap: 174161920 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:05:09.208594+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 152616960 unmapped: 21544960 heap: 174161920 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.604196548s of 10.285016060s, submitted: 148
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:05:10.208695+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 153780224 unmapped: 20381696 heap: 174161920 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:05:11.208861+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2367788 data_alloc: 184549376 data_used: 4288512
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 153780224 unmapped: 20381696 heap: 174161920 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:05:12.209028+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 247 heartbeat osd_stat(store_statfs(0x1b3a67000/0x0/0x1bfc00000, data 0x5718dab/0x58c6000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 153780224 unmapped: 20381696 heap: 174161920 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:05:13.209163+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 153944064 unmapped: 20217856 heap: 174161920 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:05:14.209288+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 154066944 unmapped: 20094976 heap: 174161920 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:05:15.209459+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 247 heartbeat osd_stat(store_statfs(0x1b3a33000/0x0/0x1bfc00000, data 0x574e0df/0x58fb000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 154066944 unmapped: 20094976 heap: 174161920 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:05:16.209617+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2369448 data_alloc: 184549376 data_used: 4288512
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 154198016 unmapped: 19963904 heap: 174161920 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:05:17.209814+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 154214400 unmapped: 19947520 heap: 174161920 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 247 handle_osd_map epochs [248,248], i have 247, src has [1,248]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 247 handle_osd_map epochs [248,248], i have 248, src has [1,248]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.17] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.b] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1e] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.16] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.0] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:05:18.209949+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 154148864 unmapped: 20013056 heap: 174161920 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:05:19.210083+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.760478020s of 10.000183105s, submitted: 63
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 154066944 unmapped: 20094976 heap: 174161920 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:05:20.210238+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 154066944 unmapped: 20094976 heap: 174161920 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 248 heartbeat osd_stat(store_statfs(0x1b39f5000/0x0/0x1bfc00000, data 0x5786e9b/0x5937000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,4] op hist [0,0,1,2])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:05:21.210396+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2381786 data_alloc: 184549376 data_used: 4300800
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 154083328 unmapped: 20078592 heap: 174161920 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:05:22.210536+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 154304512 unmapped: 19857408 heap: 174161920 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:05:23.210778+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 154304512 unmapped: 19857408 heap: 174161920 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:05:24.210992+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 154689536 unmapped: 19472384 heap: 174161920 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:05:25.211156+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 248 heartbeat osd_stat(store_statfs(0x1b3579000/0x0/0x1bfc00000, data 0x580534c/0x59b5000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 154828800 unmapped: 19333120 heap: 174161920 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:05:26.211364+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 248 heartbeat osd_stat(store_statfs(0x1b3579000/0x0/0x1bfc00000, data 0x580534c/0x59b5000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2384146 data_alloc: 184549376 data_used: 4300800
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 154828800 unmapped: 19333120 heap: 174161920 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:05:27.211599+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 154828800 unmapped: 19333120 heap: 174161920 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 248 heartbeat osd_stat(store_statfs(0x1b3579000/0x0/0x1bfc00000, data 0x5805285/0x59b4000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:05:28.211806+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 154836992 unmapped: 19324928 heap: 174161920 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:05:29.211981+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 248 heartbeat osd_stat(store_statfs(0x1b3551000/0x0/0x1bfc00000, data 0x582e1db/0x59dd000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.779703140s of 10.000446320s, submitted: 42
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 154836992 unmapped: 19324928 heap: 174161920 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:05:30.212151+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 154836992 unmapped: 19324928 heap: 174161920 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:05:31.212341+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 248 heartbeat osd_stat(store_statfs(0x1b3551000/0x0/0x1bfc00000, data 0x582e659/0x59dd000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2381842 data_alloc: 184549376 data_used: 4300800
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 154968064 unmapped: 19193856 heap: 174161920 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:05:32.212496+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 154968064 unmapped: 19193856 heap: 174161920 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:05:33.212699+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 154968064 unmapped: 19193856 heap: 174161920 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:05:34.212942+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 248 heartbeat osd_stat(store_statfs(0x1b352d000/0x0/0x1bfc00000, data 0x58530ca/0x5a01000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 153427968 unmapped: 20733952 heap: 174161920 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:05:35.213175+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 248 heartbeat osd_stat(store_statfs(0x1b350c000/0x0/0x1bfc00000, data 0x5873dca/0x5a22000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 153427968 unmapped: 20733952 heap: 174161920 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:05:36.213400+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2385442 data_alloc: 184549376 data_used: 4300800
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 153427968 unmapped: 20733952 heap: 174161920 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 248 heartbeat osd_stat(store_statfs(0x1b350c000/0x0/0x1bfc00000, data 0x5873dca/0x5a22000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:05:37.213671+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 153427968 unmapped: 20733952 heap: 174161920 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:05:38.213950+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 153427968 unmapped: 20733952 heap: 174161920 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:05:39.214127+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.918416023s of 10.000082970s, submitted: 18
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 153526272 unmapped: 20635648 heap: 174161920 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:05:40.214254+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 153608192 unmapped: 20553728 heap: 174161920 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:05:41.214396+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2385474 data_alloc: 184549376 data_used: 4300800
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 248 heartbeat osd_stat(store_statfs(0x1b34f4000/0x0/0x1bfc00000, data 0x588c1e9/0x5a3a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 153608192 unmapped: 20553728 heap: 174161920 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:05:42.214540+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 248 heartbeat osd_stat(store_statfs(0x1b34f4000/0x0/0x1bfc00000, data 0x588c1e9/0x5a3a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 153608192 unmapped: 20553728 heap: 174161920 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:05:43.214685+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 153739264 unmapped: 20422656 heap: 174161920 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:05:44.214817+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 153739264 unmapped: 20422656 heap: 174161920 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:05:45.214962+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 248 heartbeat osd_stat(store_statfs(0x1b34be000/0x0/0x1bfc00000, data 0x58c0b37/0x5a70000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 153739264 unmapped: 20422656 heap: 174161920 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:05:46.215182+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2392562 data_alloc: 184549376 data_used: 4300800
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 248 heartbeat osd_stat(store_statfs(0x1b34be000/0x0/0x1bfc00000, data 0x58c0b37/0x5a70000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 153821184 unmapped: 20340736 heap: 174161920 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:05:47.215340+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 153837568 unmapped: 20324352 heap: 174161920 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:05:48.215526+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 248 heartbeat osd_stat(store_statfs(0x1b349d000/0x0/0x1bfc00000, data 0x58e0684/0x5a91000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 248 heartbeat osd_stat(store_statfs(0x1b349d000/0x0/0x1bfc00000, data 0x58e0684/0x5a91000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 153837568 unmapped: 20324352 heap: 174161920 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:05:49.215709+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.893459320s of 10.000332832s, submitted: 21
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 153837568 unmapped: 20324352 heap: 174161920 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:05:50.215934+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 248 heartbeat osd_stat(store_statfs(0x1b349d000/0x0/0x1bfc00000, data 0x58e0684/0x5a91000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 153845760 unmapped: 20316160 heap: 174161920 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:05:51.216078+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2394410 data_alloc: 184549376 data_used: 4300800
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 153845760 unmapped: 20316160 heap: 174161920 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:05:52.216317+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 153845760 unmapped: 20316160 heap: 174161920 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:05:53.216475+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 153845760 unmapped: 20316160 heap: 174161920 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:05:54.216659+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 154992640 unmapped: 19169280 heap: 174161920 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:05:55.216795+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 248 heartbeat osd_stat(store_statfs(0x1b344e000/0x0/0x1bfc00000, data 0x592fc99/0x5ae0000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 154992640 unmapped: 19169280 heap: 174161920 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:05:56.216899+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2400518 data_alloc: 184549376 data_used: 4300800
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 154992640 unmapped: 19169280 heap: 174161920 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:05:57.217059+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 155090944 unmapped: 19070976 heap: 174161920 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:05:58.217217+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 155090944 unmapped: 19070976 heap: 174161920 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:05:59.217392+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.902788162s of 10.005618095s, submitted: 22
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 155099136 unmapped: 19062784 heap: 174161920 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:06:00.217542+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 248 heartbeat osd_stat(store_statfs(0x1b3408000/0x0/0x1bfc00000, data 0x5976881/0x5b26000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 155099136 unmapped: 19062784 heap: 174161920 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:06:01.217686+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2402814 data_alloc: 184549376 data_used: 4300800
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 155099136 unmapped: 19062784 heap: 174161920 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:06:02.217890+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 155107328 unmapped: 19054592 heap: 174161920 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:06:03.218001+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 155107328 unmapped: 19054592 heap: 174161920 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:06:04.218136+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:06:05.218305+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 155107328 unmapped: 19054592 heap: 174161920 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:06:06.218577+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 155107328 unmapped: 19054592 heap: 174161920 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 248 heartbeat osd_stat(store_statfs(0x1b33c8000/0x0/0x1bfc00000, data 0x59b82cc/0x5b66000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2404722 data_alloc: 184549376 data_used: 4300800
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:06:07.218788+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 155107328 unmapped: 19054592 heap: 174161920 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:06:08.218996+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 155107328 unmapped: 19054592 heap: 174161920 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 248 heartbeat osd_stat(store_statfs(0x1b33c8000/0x0/0x1bfc00000, data 0x59b82cc/0x5b66000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:06:09.219343+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 155107328 unmapped: 19054592 heap: 174161920 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.924438477s of 10.000024796s, submitted: 16
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:06:10.220262+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 155107328 unmapped: 19054592 heap: 174161920 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 248 heartbeat osd_stat(store_statfs(0x1b33b9000/0x0/0x1bfc00000, data 0x59c69b6/0x5b75000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:06:11.220573+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 155107328 unmapped: 19054592 heap: 174161920 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2405350 data_alloc: 184549376 data_used: 4300800
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 248 heartbeat osd_stat(store_statfs(0x1b33b9000/0x0/0x1bfc00000, data 0x59c69b6/0x5b75000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:06:12.220711+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 155107328 unmapped: 19054592 heap: 174161920 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 248 heartbeat osd_stat(store_statfs(0x1b33b9000/0x0/0x1bfc00000, data 0x59c69b6/0x5b75000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:06:13.221212+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 155107328 unmapped: 19054592 heap: 174161920 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:06:14.221398+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 155115520 unmapped: 19046400 heap: 174161920 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:06:15.221585+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 248 heartbeat osd_stat(store_statfs(0x1b33a5000/0x0/0x1bfc00000, data 0x59d9f1e/0x5b89000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 155230208 unmapped: 18931712 heap: 174161920 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:06:16.221766+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 155230208 unmapped: 18931712 heap: 174161920 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2409158 data_alloc: 184549376 data_used: 4300800
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:06:17.221943+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 155336704 unmapped: 18825216 heap: 174161920 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:06:18.222093+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 155336704 unmapped: 18825216 heap: 174161920 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 248 heartbeat osd_stat(store_statfs(0x1b337b000/0x0/0x1bfc00000, data 0x5a03ff8/0x5bb3000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:06:19.222389+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 155336704 unmapped: 18825216 heap: 174161920 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.903597832s of 10.003234863s, submitted: 21
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:06:20.222519+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 155516928 unmapped: 18644992 heap: 174161920 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:06:21.222779+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 155525120 unmapped: 18636800 heap: 174161920 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 248 heartbeat osd_stat(store_statfs(0x1b3343000/0x0/0x1bfc00000, data 0x5a3b049/0x5beb000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2416102 data_alloc: 184549376 data_used: 4300800
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:06:22.223003+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 155525120 unmapped: 18636800 heap: 174161920 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:06:23.223265+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 156786688 unmapped: 17375232 heap: 174161920 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 248 heartbeat osd_stat(store_statfs(0x1b3331000/0x0/0x1bfc00000, data 0x5a4d3c4/0x5bfd000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:06:24.223463+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 156901376 unmapped: 17260544 heap: 174161920 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:06:25.223665+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 248 handle_osd_map epochs [249,249], i have 248, src has [1,249]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 156909568 unmapped: 17252352 heap: 174161920 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:06:26.223827+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 156909568 unmapped: 17252352 heap: 174161920 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 249 heartbeat osd_stat(store_statfs(0x1b330c000/0x0/0x1bfc00000, data 0x5a714e1/0x5c22000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2417510 data_alloc: 184549376 data_used: 4313088
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:06:27.224034+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 156327936 unmapped: 17833984 heap: 174161920 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 249 heartbeat osd_stat(store_statfs(0x1b213d000/0x0/0x1bfc00000, data 0x5a9ecc1/0x5c51000, compress 0x0/0x0/0x0, omap 0x649, meta 0x7e6f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 249 handle_osd_map epochs [250,250], i have 249, src has [1,250]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:06:28.224187+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 156426240 unmapped: 17735680 heap: 174161920 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 250 heartbeat osd_stat(store_statfs(0x1b213d000/0x0/0x1bfc00000, data 0x5a9ecc1/0x5c51000, compress 0x0/0x0/0x0, omap 0x649, meta 0x7e6f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:06:29.224303+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 156426240 unmapped: 17735680 heap: 174161920 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.661595345s of 10.001096725s, submitted: 122
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:06:30.224457+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 156672000 unmapped: 17489920 heap: 174161920 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:06:31.224599+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 156672000 unmapped: 17489920 heap: 174161920 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2433254 data_alloc: 184549376 data_used: 4329472
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:06:32.224777+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 156966912 unmapped: 17195008 heap: 174161920 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 250 heartbeat osd_stat(store_statfs(0x1b20ec000/0x0/0x1bfc00000, data 0x5aedf42/0x5ca2000, compress 0x0/0x0/0x0, omap 0x649, meta 0x7e6f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:06:33.224971+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 156975104 unmapped: 17186816 heap: 174161920 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 250 heartbeat osd_stat(store_statfs(0x1b20ec000/0x0/0x1bfc00000, data 0x5aedf42/0x5ca2000, compress 0x0/0x0/0x0, omap 0x649, meta 0x7e6f9b7), peers [0,1,2,3,4] op hist [0,1])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:06:34.225121+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 156647424 unmapped: 17514496 heap: 174161920 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #45. Immutable memtables: 2.
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:06:35.225275+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 158769152 unmapped: 15392768 heap: 174161920 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:06:36.225447+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 158777344 unmapped: 15384576 heap: 174161920 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2431930 data_alloc: 184549376 data_used: 4333568
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:06:37.225644+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 158777344 unmapped: 15384576 heap: 174161920 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:06:38.225771+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 158900224 unmapped: 15261696 heap: 174161920 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:06:39.225953+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 158900224 unmapped: 15261696 heap: 174161920 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 250 heartbeat osd_stat(store_statfs(0x1b0edf000/0x0/0x1bfc00000, data 0x5b5ba17/0x5d0f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.791142464s of 10.000650406s, submitted: 46
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 250 heartbeat osd_stat(store_statfs(0x1b0edf000/0x0/0x1bfc00000, data 0x5b5ba17/0x5d0f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:06:40.226151+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 250 handle_osd_map epochs [251,251], i have 250, src has [1,251]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 159080448 unmapped: 15081472 heap: 174161920 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:06:41.226285+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 158957568 unmapped: 15204352 heap: 174161920 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 251 heartbeat osd_stat(store_statfs(0x1b0ea7000/0x0/0x1bfc00000, data 0x5b9210f/0x5d47000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2440034 data_alloc: 184549376 data_used: 4345856
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:06:42.226417+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 158957568 unmapped: 15204352 heap: 174161920 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:06:43.226575+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 158957568 unmapped: 15204352 heap: 174161920 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 251 heartbeat osd_stat(store_statfs(0x1b0e91000/0x0/0x1bfc00000, data 0x5ba8e1b/0x5d5d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:06:44.226721+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 158957568 unmapped: 15204352 heap: 174161920 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 251 heartbeat osd_stat(store_statfs(0x1b0e91000/0x0/0x1bfc00000, data 0x5ba8e1b/0x5d5d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b598fa800
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:06:45.226863+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 159162368 unmapped: 14999552 heap: 174161920 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: mgrc handle_mgr_map Got map version 51
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.107:6810/2356945423,v1:172.18.0.107:6811/2356945423]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:06:46.227034+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 159334400 unmapped: 14827520 heap: 174161920 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2459462 data_alloc: 184549376 data_used: 4345856
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:06:47.227179+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 159334400 unmapped: 14827520 heap: 174161920 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:06:48.227306+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 251 handle_osd_map epochs [252,252], i have 251, src has [1,252]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 159334400 unmapped: 14827520 heap: 174161920 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 252 heartbeat osd_stat(store_statfs(0x1b0e5d000/0x0/0x1bfc00000, data 0x5bd9b67/0x5d91000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:06:49.227490+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 159334400 unmapped: 14827520 heap: 174161920 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.710847855s of 10.002344131s, submitted: 90
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:06:50.227632+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 161767424 unmapped: 12394496 heap: 174161920 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:06:51.227813+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 161767424 unmapped: 12394496 heap: 174161920 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 252 heartbeat osd_stat(store_statfs(0x1b0e0e000/0x0/0x1bfc00000, data 0x5c26fac/0x5de0000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2464754 data_alloc: 184549376 data_used: 4358144
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:06:52.228005+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 161832960 unmapped: 12328960 heap: 174161920 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:06:53.228190+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 162062336 unmapped: 12099584 heap: 174161920 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 252 heartbeat osd_stat(store_statfs(0x1b0dfa000/0x0/0x1bfc00000, data 0x5c3b0c8/0x5df4000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:06:54.228362+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 162062336 unmapped: 12099584 heap: 174161920 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:06:55.228513+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 162193408 unmapped: 11968512 heap: 174161920 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 252 heartbeat osd_stat(store_statfs(0x1b0ddc000/0x0/0x1bfc00000, data 0x5c591f5/0x5e12000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:06:56.228659+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 162193408 unmapped: 11968512 heap: 174161920 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2462544 data_alloc: 184549376 data_used: 4358144
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:06:57.228852+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 162193408 unmapped: 11968512 heap: 174161920 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 252 heartbeat osd_stat(store_statfs(0x1b0dce000/0x0/0x1bfc00000, data 0x5c67fcc/0x5e20000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:06:58.229043+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 162193408 unmapped: 11968512 heap: 174161920 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:06:59.229204+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 162193408 unmapped: 11968512 heap: 174161920 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.886539459s of 10.002117157s, submitted: 24
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:07:00.229331+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 162480128 unmapped: 11681792 heap: 174161920 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 252 heartbeat osd_stat(store_statfs(0x1b0dce000/0x0/0x1bfc00000, data 0x5c67fcc/0x5e20000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:07:01.229494+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 162480128 unmapped: 11681792 heap: 174161920 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2471276 data_alloc: 184549376 data_used: 4358144
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:07:02.229672+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 252 heartbeat osd_stat(store_statfs(0x1b0d56000/0x0/0x1bfc00000, data 0x5cdf155/0x5e98000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 162807808 unmapped: 11354112 heap: 174161920 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:07:03.229936+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 162816000 unmapped: 11345920 heap: 174161920 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:07:04.230113+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 162816000 unmapped: 11345920 heap: 174161920 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:07:05.230259+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 162816000 unmapped: 11345920 heap: 174161920 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 252 handle_osd_map epochs [253,253], i have 252, src has [1,253]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:07:06.230463+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 163799040 unmapped: 10362880 heap: 174161920 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2480176 data_alloc: 184549376 data_used: 4370432
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:07:07.230708+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 163799040 unmapped: 10362880 heap: 174161920 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 253 heartbeat osd_stat(store_statfs(0x1b0cfb000/0x0/0x1bfc00000, data 0x5d377c1/0x5ef2000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:07:08.230826+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 253 heartbeat osd_stat(store_statfs(0x1b0cfc000/0x0/0x1bfc00000, data 0x5d37726/0x5ef1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 253 handle_osd_map epochs [254,254], i have 253, src has [1,254]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 253 handle_osd_map epochs [254,254], i have 254, src has [1,254]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 253 handle_osd_map epochs [254,254], i have 254, src has [1,254]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 164855808 unmapped: 9306112 heap: 174161920 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:07:09.231034+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 254 handle_osd_map epochs [254,254], i have 254, src has [1,254]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 164855808 unmapped: 9306112 heap: 174161920 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.584746361s of 10.000225067s, submitted: 134
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:07:10.231177+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 164945920 unmapped: 9216000 heap: 174161920 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _renew_subs
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _send_mon_message to mon.np0005626463 at v2:172.18.0.103:3300/0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 254 handle_osd_map epochs [255,255], i have 254, src has [1,255]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:07:11.231298+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 164102144 unmapped: 10059776 heap: 174161920 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2490090 data_alloc: 184549376 data_used: 4370432
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:07:12.231481+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 164102144 unmapped: 10059776 heap: 174161920 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:07:13.231613+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 164241408 unmapped: 9920512 heap: 174161920 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 255 heartbeat osd_stat(store_statfs(0x1b0c7b000/0x0/0x1bfc00000, data 0x5db3ef0/0x5f72000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:07:14.231759+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 164241408 unmapped: 9920512 heap: 174161920 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 255 heartbeat osd_stat(store_statfs(0x1b0c7b000/0x0/0x1bfc00000, data 0x5db3ef0/0x5f72000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:07:15.231921+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 164151296 unmapped: 10010624 heap: 174161920 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:07:16.232067+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 164151296 unmapped: 10010624 heap: 174161920 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2495864 data_alloc: 184549376 data_used: 4370432
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:07:17.232260+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 164151296 unmapped: 10010624 heap: 174161920 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 255 handle_osd_map epochs [255,256], i have 255, src has [1,256]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:07:18.232421+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 164159488 unmapped: 10002432 heap: 174161920 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 256 heartbeat osd_stat(store_statfs(0x1b0c5b000/0x0/0x1bfc00000, data 0x5dd6ca3/0x5f93000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 256 handle_osd_map epochs [257,257], i have 256, src has [1,257]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:07:19.232560+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 164175872 unmapped: 9986048 heap: 174161920 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 257 handle_osd_map epochs [257,257], i have 257, src has [1,257]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.750893593s of 10.004892349s, submitted: 103
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:07:20.232727+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 164306944 unmapped: 10903552 heap: 175210496 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:07:21.232925+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 164306944 unmapped: 10903552 heap: 175210496 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2505414 data_alloc: 184549376 data_used: 4382720
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:07:22.233071+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 164306944 unmapped: 10903552 heap: 175210496 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 257 heartbeat osd_stat(store_statfs(0x1b0c12000/0x0/0x1bfc00000, data 0x5e1ca02/0x5fdc000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:07:23.233200+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 164577280 unmapped: 10633216 heap: 175210496 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:07:24.233350+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 164839424 unmapped: 10371072 heap: 175210496 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:07:25.233518+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 164839424 unmapped: 10371072 heap: 175210496 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 257 handle_osd_map epochs [258,258], i have 257, src has [1,258]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:07:26.233639+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 164700160 unmapped: 10510336 heap: 175210496 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2507104 data_alloc: 184549376 data_used: 4395008
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:07:27.233847+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 258 heartbeat osd_stat(store_statfs(0x1b0be9000/0x0/0x1bfc00000, data 0x5e432db/0x6004000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 164700160 unmapped: 10510336 heap: 175210496 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:07:28.233983+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 258 handle_osd_map epochs [259,259], i have 258, src has [1,259]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 164790272 unmapped: 10420224 heap: 175210496 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:07:29.234168+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 164790272 unmapped: 10420224 heap: 175210496 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 259 handle_osd_map epochs [259,259], i have 259, src has [1,259]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.837017059s of 10.000536919s, submitted: 75
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:07:30.234348+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 164782080 unmapped: 10428416 heap: 175210496 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:07:31.234443+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 164782080 unmapped: 10428416 heap: 175210496 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:07:32.234581+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2516630 data_alloc: 184549376 data_used: 4395008
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 164790272 unmapped: 10420224 heap: 175210496 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:07:33.234711+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 259 heartbeat osd_stat(store_statfs(0x1b0b8f000/0x0/0x1bfc00000, data 0x5e9bc79/0x605f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 164954112 unmapped: 10256384 heap: 175210496 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:07:34.234919+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 259 heartbeat osd_stat(store_statfs(0x1b0b55000/0x0/0x1bfc00000, data 0x5ed4b9b/0x6099000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 164986880 unmapped: 10223616 heap: 175210496 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 259 handle_osd_map epochs [259,260], i have 259, src has [1,260]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:07:35.235052+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 164995072 unmapped: 11264000 heap: 176259072 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _renew_subs
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _send_mon_message to mon.np0005626463 at v2:172.18.0.103:3300/0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 260 handle_osd_map epochs [261,261], i have 260, src has [1,261]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:07:36.235187+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 261 handle_osd_map epochs [260,261], i have 261, src has [1,261]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 261 handle_osd_map epochs [260,261], i have 261, src has [1,261]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 165158912 unmapped: 11100160 heap: 176259072 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:07:37.235374+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2525988 data_alloc: 184549376 data_used: 4407296
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 165158912 unmapped: 11100160 heap: 176259072 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:07:38.235566+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 261 heartbeat osd_stat(store_statfs(0x1b0b23000/0x0/0x1bfc00000, data 0x5f04037/0x60c9000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 165158912 unmapped: 11100160 heap: 176259072 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 261 heartbeat osd_stat(store_statfs(0x1b0b23000/0x0/0x1bfc00000, data 0x5f04037/0x60c9000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:07:39.235781+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 165158912 unmapped: 11100160 heap: 176259072 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.662960052s of 10.003593445s, submitted: 95
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:07:40.236019+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 165167104 unmapped: 11091968 heap: 176259072 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:07:41.236405+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 261 heartbeat osd_stat(store_statfs(0x1b0afa000/0x0/0x1bfc00000, data 0x5f2e59e/0x60f4000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 165273600 unmapped: 10985472 heap: 176259072 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:07:42.237042+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2528240 data_alloc: 184549376 data_used: 4407296
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 165273600 unmapped: 10985472 heap: 176259072 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 261 heartbeat osd_stat(store_statfs(0x1b0afa000/0x0/0x1bfc00000, data 0x5f2e59e/0x60f4000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 261 handle_osd_map epochs [262,262], i have 261, src has [1,262]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 261 handle_osd_map epochs [262,262], i have 262, src has [1,262]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:07:43.237256+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 165281792 unmapped: 10977280 heap: 176259072 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:07:44.237549+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 165281792 unmapped: 10977280 heap: 176259072 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 262 handle_osd_map epochs [262,262], i have 262, src has [1,262]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:07:45.237658+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 165396480 unmapped: 10862592 heap: 176259072 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:07:46.237877+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 262 heartbeat osd_stat(store_statfs(0x1b0ae7000/0x0/0x1bfc00000, data 0x5f3ffea/0x6107000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 165396480 unmapped: 10862592 heap: 176259072 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:07:47.238067+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2530482 data_alloc: 184549376 data_used: 4419584
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 165396480 unmapped: 10862592 heap: 176259072 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:07:48.238257+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 165396480 unmapped: 10862592 heap: 176259072 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:07:49.238630+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 262 heartbeat osd_stat(store_statfs(0x1b0ae7000/0x0/0x1bfc00000, data 0x5f3ffea/0x6107000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 165396480 unmapped: 10862592 heap: 176259072 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:07:50.238837+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 165396480 unmapped: 10862592 heap: 176259072 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:07:51.239094+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 262 heartbeat osd_stat(store_statfs(0x1b0ae1000/0x0/0x1bfc00000, data 0x5f45ee9/0x610d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 165396480 unmapped: 10862592 heap: 176259072 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:07:52.239467+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2530930 data_alloc: 184549376 data_used: 4419584
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 262 heartbeat osd_stat(store_statfs(0x1b0ae1000/0x0/0x1bfc00000, data 0x5f45ee9/0x610d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 165396480 unmapped: 10862592 heap: 176259072 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:07:53.239687+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 165396480 unmapped: 10862592 heap: 176259072 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:07:54.240138+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 262 heartbeat osd_stat(store_statfs(0x1b0ae1000/0x0/0x1bfc00000, data 0x5f45ee9/0x610d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 165396480 unmapped: 10862592 heap: 176259072 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:07:55.240478+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 165404672 unmapped: 10854400 heap: 176259072 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:07:56.241021+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 165404672 unmapped: 10854400 heap: 176259072 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:07:57.241346+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2530930 data_alloc: 184549376 data_used: 4419584
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 262 heartbeat osd_stat(store_statfs(0x1b0ae1000/0x0/0x1bfc00000, data 0x5f45ee9/0x610d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 165404672 unmapped: 10854400 heap: 176259072 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:07:58.241681+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 165404672 unmapped: 10854400 heap: 176259072 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:07:59.241981+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 165404672 unmapped: 10854400 heap: 176259072 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:08:00.242179+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 165404672 unmapped: 10854400 heap: 176259072 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 262 heartbeat osd_stat(store_statfs(0x1b0ae1000/0x0/0x1bfc00000, data 0x5f45ee9/0x610d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:08:01.242323+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 165404672 unmapped: 10854400 heap: 176259072 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:08:02.242502+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2530930 data_alloc: 184549376 data_used: 4419584
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 165404672 unmapped: 10854400 heap: 176259072 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:08:03.242690+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 165404672 unmapped: 10854400 heap: 176259072 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:08:04.242932+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 262 heartbeat osd_stat(store_statfs(0x1b0ae1000/0x0/0x1bfc00000, data 0x5f45ee9/0x610d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 165404672 unmapped: 10854400 heap: 176259072 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 25.156999588s of 25.203319550s, submitted: 21
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:08:05.243100+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 262 ms_handle_reset con 0x564b598fa800 session 0x564b5dd363c0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 165871616 unmapped: 10387456 heap: 176259072 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: mgrc handle_mgr_map Got map version 52
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.107:6810/2356945423,v1:172.18.0.107:6811/2356945423]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:08:06.243281+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 166019072 unmapped: 10240000 heap: 176259072 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 262 heartbeat osd_stat(store_statfs(0x1b0ae1000/0x0/0x1bfc00000, data 0x5f4608f/0x610d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:08:07.243496+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2530066 data_alloc: 184549376 data_used: 4419584
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 166019072 unmapped: 10240000 heap: 176259072 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:08:08.243660+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 166019072 unmapped: 10240000 heap: 176259072 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:08:09.243838+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 166019072 unmapped: 10240000 heap: 176259072 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:08:10.244029+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 166019072 unmapped: 10240000 heap: 176259072 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 262 heartbeat osd_stat(store_statfs(0x1b0ae1000/0x0/0x1bfc00000, data 0x5f4608f/0x610d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:08:11.244201+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 262 heartbeat osd_stat(store_statfs(0x1b0ae1000/0x0/0x1bfc00000, data 0x5f4608f/0x610d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 166019072 unmapped: 10240000 heap: 176259072 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:08:12.244392+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2530066 data_alloc: 184549376 data_used: 4419584
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 166019072 unmapped: 10240000 heap: 176259072 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:08:13.244569+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 166019072 unmapped: 10240000 heap: 176259072 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:08:14.244746+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 166019072 unmapped: 10240000 heap: 176259072 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:08:15.244923+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 262 heartbeat osd_stat(store_statfs(0x1b0ae1000/0x0/0x1bfc00000, data 0x5f4608f/0x610d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 166019072 unmapped: 10240000 heap: 176259072 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:08:16.245067+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 166019072 unmapped: 10240000 heap: 176259072 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:08:17.245248+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2530066 data_alloc: 184549376 data_used: 4419584
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 166019072 unmapped: 10240000 heap: 176259072 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:08:18.245359+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 166019072 unmapped: 10240000 heap: 176259072 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:08:19.245511+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 166043648 unmapped: 10215424 heap: 176259072 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:08:20.245712+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 262 heartbeat osd_stat(store_statfs(0x1b0ae1000/0x0/0x1bfc00000, data 0x5f4608f/0x610d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 166043648 unmapped: 10215424 heap: 176259072 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:08:21.245961+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 166051840 unmapped: 10207232 heap: 176259072 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:08:22.246170+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2530066 data_alloc: 184549376 data_used: 4419584
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 166051840 unmapped: 10207232 heap: 176259072 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:08:23.246382+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 262 heartbeat osd_stat(store_statfs(0x1b0ae1000/0x0/0x1bfc00000, data 0x5f4608f/0x610d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 166051840 unmapped: 10207232 heap: 176259072 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:08:24.246556+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 166051840 unmapped: 10207232 heap: 176259072 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:08:25.246726+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 166051840 unmapped: 10207232 heap: 176259072 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:08:26.246897+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 166051840 unmapped: 10207232 heap: 176259072 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:08:27.247086+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2530066 data_alloc: 184549376 data_used: 4419584
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 166068224 unmapped: 10190848 heap: 176259072 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:08:28.247273+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 262 heartbeat osd_stat(store_statfs(0x1b0ae1000/0x0/0x1bfc00000, data 0x5f4608f/0x610d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 166076416 unmapped: 10182656 heap: 176259072 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:08:29.247575+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 166084608 unmapped: 10174464 heap: 176259072 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:08:30.247722+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 166084608 unmapped: 10174464 heap: 176259072 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:08:31.247889+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 166084608 unmapped: 10174464 heap: 176259072 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:08:32.248033+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2530066 data_alloc: 184549376 data_used: 4419584
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 166084608 unmapped: 10174464 heap: 176259072 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:08:33.248178+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 262 heartbeat osd_stat(store_statfs(0x1b0ae1000/0x0/0x1bfc00000, data 0x5f4608f/0x610d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 166084608 unmapped: 10174464 heap: 176259072 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:08:34.248371+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 166084608 unmapped: 10174464 heap: 176259072 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:08:35.248501+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 166092800 unmapped: 10166272 heap: 176259072 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:08:36.248705+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 9000.1 total, 600.0 interval
                                                          Cumulative writes: 21K writes, 79K keys, 21K commit groups, 1.0 writes per commit group, ingest: 0.07 GB, 0.01 MB/s
                                                          Cumulative WAL: 21K writes, 7360 syncs, 2.95 writes per sync, written: 0.07 GB, 0.01 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 11K writes, 39K keys, 11K commit groups, 1.0 writes per commit group, ingest: 33.66 MB, 0.06 MB/s
                                                          Interval WAL: 11K writes, 4546 syncs, 2.50 writes per sync, written: 0.03 GB, 0.06 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 166092800 unmapped: 10166272 heap: 176259072 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:08:37.248910+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2530066 data_alloc: 184549376 data_used: 4419584
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 166100992 unmapped: 10158080 heap: 176259072 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:08:38.249060+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 262 heartbeat osd_stat(store_statfs(0x1b0ae1000/0x0/0x1bfc00000, data 0x5f4608f/0x610d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 166100992 unmapped: 10158080 heap: 176259072 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:08:39.249244+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 166100992 unmapped: 10158080 heap: 176259072 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:08:40.249419+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 166100992 unmapped: 10158080 heap: 176259072 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:08:41.249621+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 166100992 unmapped: 10158080 heap: 176259072 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:08:42.249746+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2530066 data_alloc: 184549376 data_used: 4419584
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 262 heartbeat osd_stat(store_statfs(0x1b0ae1000/0x0/0x1bfc00000, data 0x5f4608f/0x610d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 166100992 unmapped: 10158080 heap: 176259072 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:08:43.250459+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 166109184 unmapped: 10149888 heap: 176259072 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:08:44.250670+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 262 heartbeat osd_stat(store_statfs(0x1b0ae1000/0x0/0x1bfc00000, data 0x5f4608f/0x610d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 166109184 unmapped: 10149888 heap: 176259072 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:08:45.251864+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 166109184 unmapped: 10149888 heap: 176259072 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:08:46.252952+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 166109184 unmapped: 10149888 heap: 176259072 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:08:47.253204+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2530066 data_alloc: 184549376 data_used: 4419584
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _send_mon_message to mon.np0005626463 at v2:172.18.0.103:3300/0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 166109184 unmapped: 10149888 heap: 176259072 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:08:48.253896+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 262 heartbeat osd_stat(store_statfs(0x1b0ae1000/0x0/0x1bfc00000, data 0x5f4608f/0x610d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 166109184 unmapped: 10149888 heap: 176259072 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:08:49.254052+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 166109184 unmapped: 10149888 heap: 176259072 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 262 heartbeat osd_stat(store_statfs(0x1b0ae1000/0x0/0x1bfc00000, data 0x5f4608f/0x610d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:08:50.254622+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 166109184 unmapped: 10149888 heap: 176259072 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:08:51.254968+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 166117376 unmapped: 10141696 heap: 176259072 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:08:52.255246+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2530066 data_alloc: 184549376 data_used: 4419584
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 166117376 unmapped: 10141696 heap: 176259072 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:08:53.255517+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 166117376 unmapped: 10141696 heap: 176259072 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:08:54.255715+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 262 heartbeat osd_stat(store_statfs(0x1b0ae1000/0x0/0x1bfc00000, data 0x5f4608f/0x610d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 166117376 unmapped: 10141696 heap: 176259072 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:08:55.255901+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 166117376 unmapped: 10141696 heap: 176259072 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:08:56.256337+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 166117376 unmapped: 10141696 heap: 176259072 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:08:57.256579+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2530066 data_alloc: 184549376 data_used: 4419584
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 166117376 unmapped: 10141696 heap: 176259072 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:08:58.256817+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 262 heartbeat osd_stat(store_statfs(0x1b0ae1000/0x0/0x1bfc00000, data 0x5f4608f/0x610d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 166125568 unmapped: 10133504 heap: 176259072 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:08:59.256981+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 166133760 unmapped: 10125312 heap: 176259072 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:09:00.257164+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 166133760 unmapped: 10125312 heap: 176259072 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:09:01.257411+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 166133760 unmapped: 10125312 heap: 176259072 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:09:02.257595+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2530066 data_alloc: 184549376 data_used: 4419584
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 166133760 unmapped: 10125312 heap: 176259072 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:09:03.257955+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 166133760 unmapped: 10125312 heap: 176259072 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:09:04.258155+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 262 heartbeat osd_stat(store_statfs(0x1b0ae1000/0x0/0x1bfc00000, data 0x5f4608f/0x610d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 166133760 unmapped: 10125312 heap: 176259072 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 59.828536987s of 59.880477905s, submitted: 246
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 262 ms_handle_reset con 0x564b57d3f400 session 0x564b59675680
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:09:05.258306+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 166690816 unmapped: 9568256 heap: 176259072 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:09:06.258528+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: mgrc handle_mgr_map Got map version 53
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.107:6810/2356945423,v1:172.18.0.107:6811/2356945423]
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 166707200 unmapped: 9551872 heap: 176259072 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:09:07.258830+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2530082 data_alloc: 184549376 data_used: 4419584
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 166715392 unmapped: 9543680 heap: 176259072 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:09:08.259109+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 166715392 unmapped: 9543680 heap: 176259072 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:09:09.259302+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 262 heartbeat osd_stat(store_statfs(0x1b0ae1000/0x0/0x1bfc00000, data 0x5f462a2/0x610d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 166715392 unmapped: 9543680 heap: 176259072 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:09:10.259550+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 166715392 unmapped: 9543680 heap: 176259072 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 262 heartbeat osd_stat(store_statfs(0x1b0ae1000/0x0/0x1bfc00000, data 0x5f462a2/0x610d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:09:11.259733+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 166715392 unmapped: 9543680 heap: 176259072 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:09:12.259954+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2530082 data_alloc: 184549376 data_used: 4419584
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 262 heartbeat osd_stat(store_statfs(0x1b0ae1000/0x0/0x1bfc00000, data 0x5f462a2/0x610d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 166715392 unmapped: 9543680 heap: 176259072 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:09:13.260300+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 166715392 unmapped: 9543680 heap: 176259072 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:09:14.260506+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 166723584 unmapped: 9535488 heap: 176259072 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:09:15.260708+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 166723584 unmapped: 9535488 heap: 176259072 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:09:16.260863+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 262 heartbeat osd_stat(store_statfs(0x1b0ae1000/0x0/0x1bfc00000, data 0x5f462a2/0x610d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 166723584 unmapped: 9535488 heap: 176259072 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:09:17.261141+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2530082 data_alloc: 184549376 data_used: 4419584
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 166731776 unmapped: 9527296 heap: 176259072 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:09:18.261269+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 262 heartbeat osd_stat(store_statfs(0x1b0ae1000/0x0/0x1bfc00000, data 0x5f462a2/0x610d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 166739968 unmapped: 9519104 heap: 176259072 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:09:19.261455+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 166739968 unmapped: 9519104 heap: 176259072 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:09:20.261608+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 262 heartbeat osd_stat(store_statfs(0x1b0ae1000/0x0/0x1bfc00000, data 0x5f462a2/0x610d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 166739968 unmapped: 9519104 heap: 176259072 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:09:21.261812+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 166739968 unmapped: 9519104 heap: 176259072 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:09:22.262009+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 262 heartbeat osd_stat(store_statfs(0x1b0ae1000/0x0/0x1bfc00000, data 0x5f462a2/0x610d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2530082 data_alloc: 184549376 data_used: 4419584
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 166739968 unmapped: 9519104 heap: 176259072 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:09:23.262178+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 166739968 unmapped: 9519104 heap: 176259072 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:09:24.262378+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 262 heartbeat osd_stat(store_statfs(0x1b0ae1000/0x0/0x1bfc00000, data 0x5f462a2/0x610d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 166739968 unmapped: 9519104 heap: 176259072 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:09:25.262519+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 166756352 unmapped: 9502720 heap: 176259072 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:09:26.262691+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 166764544 unmapped: 9494528 heap: 176259072 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:09:27.262924+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2530082 data_alloc: 184549376 data_used: 4419584
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 262 heartbeat osd_stat(store_statfs(0x1b0ae1000/0x0/0x1bfc00000, data 0x5f462a2/0x610d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 166764544 unmapped: 9494528 heap: 176259072 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:09:28.263063+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 166764544 unmapped: 9494528 heap: 176259072 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:09:29.263252+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 262 heartbeat osd_stat(store_statfs(0x1b0ae1000/0x0/0x1bfc00000, data 0x5f462a2/0x610d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 166764544 unmapped: 9494528 heap: 176259072 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:09:30.263450+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 262 heartbeat osd_stat(store_statfs(0x1b0ae1000/0x0/0x1bfc00000, data 0x5f462a2/0x610d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 166764544 unmapped: 9494528 heap: 176259072 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:09:31.263601+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 166764544 unmapped: 9494528 heap: 176259072 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:09:32.263740+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2530082 data_alloc: 184549376 data_used: 4419584
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 262 heartbeat osd_stat(store_statfs(0x1b0ae1000/0x0/0x1bfc00000, data 0x5f462a2/0x610d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 166764544 unmapped: 9494528 heap: 176259072 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:09:33.263926+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 166772736 unmapped: 9486336 heap: 176259072 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:09:34.264079+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 166772736 unmapped: 9486336 heap: 176259072 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:09:35.264362+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 166772736 unmapped: 9486336 heap: 176259072 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:09:36.264508+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 166772736 unmapped: 9486336 heap: 176259072 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:09:37.264729+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 262 heartbeat osd_stat(store_statfs(0x1b0ae1000/0x0/0x1bfc00000, data 0x5f462a2/0x610d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2530082 data_alloc: 184549376 data_used: 4419584
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 166780928 unmapped: 9478144 heap: 176259072 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:09:38.264932+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 166797312 unmapped: 9461760 heap: 176259072 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:09:39.265138+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 166797312 unmapped: 9461760 heap: 176259072 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:09:40.265317+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 166797312 unmapped: 9461760 heap: 176259072 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:09:41.265517+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 166797312 unmapped: 9461760 heap: 176259072 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:09:42.265708+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 262 heartbeat osd_stat(store_statfs(0x1b0ae1000/0x0/0x1bfc00000, data 0x5f462a2/0x610d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2530082 data_alloc: 184549376 data_used: 4419584
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 166813696 unmapped: 9445376 heap: 176259072 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:09:43.265852+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 166813696 unmapped: 9445376 heap: 176259072 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:09:44.266054+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 166813696 unmapped: 9445376 heap: 176259072 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:09:45.266212+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 166813696 unmapped: 9445376 heap: 176259072 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:09:46.266394+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 262 heartbeat osd_stat(store_statfs(0x1b0ae1000/0x0/0x1bfc00000, data 0x5f462a2/0x610d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 166813696 unmapped: 9445376 heap: 176259072 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:09:47.266551+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2530082 data_alloc: 184549376 data_used: 4419584
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 262 heartbeat osd_stat(store_statfs(0x1b0ae1000/0x0/0x1bfc00000, data 0x5f462a2/0x610d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:09:48.266991+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 166813696 unmapped: 9445376 heap: 176259072 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:09:49.267336+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 166813696 unmapped: 9445376 heap: 176259072 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:09:50.267538+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 166821888 unmapped: 9437184 heap: 176259072 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:09:51.267955+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 166821888 unmapped: 9437184 heap: 176259072 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:09:52.268124+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 166821888 unmapped: 9437184 heap: 176259072 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2530082 data_alloc: 184549376 data_used: 4419584
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:09:53.268379+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 166821888 unmapped: 9437184 heap: 176259072 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 262 ms_handle_reset con 0x564b595a0000 session 0x564b59670780
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b595a1000
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 262 ms_handle_reset con 0x564b5a2cf800 session 0x564b59ff72c0
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: handle_auth_request added challenge on 0x564b595a0000
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 262 heartbeat osd_stat(store_statfs(0x1b0ae1000/0x0/0x1bfc00000, data 0x5f462a2/0x610d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:09:54.268508+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 166821888 unmapped: 9437184 heap: 176259072 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:09:55.268852+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 166821888 unmapped: 9437184 heap: 176259072 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 262 heartbeat osd_stat(store_statfs(0x1b0ae1000/0x0/0x1bfc00000, data 0x5f462a2/0x610d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:09:56.269103+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 166821888 unmapped: 9437184 heap: 176259072 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:09:57.269303+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 166821888 unmapped: 9437184 heap: 176259072 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2530082 data_alloc: 184549376 data_used: 4419584
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 262 heartbeat osd_stat(store_statfs(0x1b0ae1000/0x0/0x1bfc00000, data 0x5f462a2/0x610d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:09:58.269476+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 166821888 unmapped: 9437184 heap: 176259072 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:09:59.269622+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 166821888 unmapped: 9437184 heap: 176259072 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:10:00.269748+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 166821888 unmapped: 9437184 heap: 176259072 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:10:01.269860+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 166821888 unmapped: 9437184 heap: 176259072 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:10:02.270017+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 166895616 unmapped: 9363456 heap: 176259072 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: do_command 'config diff' '{prefix=config diff}'
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: do_command 'config diff' '{prefix=config diff}' result is 0 bytes
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: do_command 'config show' '{prefix=config show}'
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: do_command 'config show' '{prefix=config show}' result is 0 bytes
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: bluestore.MempoolThread(0x564b561d3b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2530082 data_alloc: 184549376 data_used: 4419584
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: do_command 'counter dump' '{prefix=counter dump}'
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: do_command 'counter dump' '{prefix=counter dump}' result is 0 bytes
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: do_command 'counter schema' '{prefix=counter schema}'
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: do_command 'counter schema' '{prefix=counter schema}' result is 0 bytes
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:10:03.270127+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 166535168 unmapped: 9723904 heap: 176259072 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: osd.5 262 heartbeat osd_stat(store_statfs(0x1b0ae1000/0x0/0x1bfc00000, data 0x5f462a2/0x610d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: tick
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_tickets
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:10:04.270612+0000)
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: prioritycache tune_memory target: 3561601228 mapped: 166805504 unmapped: 9453568 heap: 176259072 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:35 np0005626463.localdomain ceph-osd[32575]: do_command 'log dump' '{prefix=log dump}'
Feb 23 10:10:35 np0005626463.localdomain ceph-mon[294160]: from='client.49290 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""]}]: dispatch
Feb 23 10:10:35 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.107:0/1664137232' entity='client.admin' cmd={"prefix": "mgr services"} : dispatch
Feb 23 10:10:35 np0005626463.localdomain ceph-mon[294160]: from='client.49302 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""]}]: dispatch
Feb 23 10:10:35 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.108:0/4173607529' entity='client.admin' cmd={"prefix": "mgr module ls"} : dispatch
Feb 23 10:10:35 np0005626463.localdomain ceph-mon[294160]: from='client.59146 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""]}]: dispatch
Feb 23 10:10:35 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.106:0/2832297312' entity='client.admin' cmd={"prefix": "mgr metadata"} : dispatch
Feb 23 10:10:35 np0005626463.localdomain ceph-mon[294160]: from='client.49314 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch
Feb 23 10:10:35 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.108:0/3951898632' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 10:10:35 np0005626463.localdomain ceph-mon[294160]: from='client.69272 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
Feb 23 10:10:35 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.107:0/2668187497' entity='client.admin' cmd={"prefix": "mgr versions"} : dispatch
Feb 23 10:10:35 np0005626463.localdomain ceph-mon[294160]: from='client.59161 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
Feb 23 10:10:35 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.108:0/2485740865' entity='client.admin' cmd={"prefix": "mgr services"} : dispatch
Feb 23 10:10:35 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.106:0/90522445' entity='client.admin' cmd={"prefix": "mgr module ls"} : dispatch
Feb 23 10:10:35 np0005626463.localdomain ceph-mon[294160]: from='client.49332 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""]}]: dispatch
Feb 23 10:10:35 np0005626463.localdomain ceph-mon[294160]: pgmap v764: 177 pgs: 177 active+clean; 226 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Feb 23 10:10:35 np0005626463.localdomain ceph-mon[294160]: from='client.69296 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 23 10:10:35 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mgr services"} v 0)
Feb 23 10:10:35 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/431687781' entity='client.admin' cmd={"prefix": "mgr services"} : dispatch
Feb 23 10:10:36 np0005626463.localdomain crontab[326407]: (root) LIST (root)
Feb 23 10:10:36 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:10:36.566 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 10:10:36 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:10:36.567 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 10:10:36 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:10:36.567 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 10:10:36 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:10:36.567 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 10:10:36 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:10:36.605 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:10:36 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:10:36.606 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 10:10:36 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.107:0/2496655294' entity='client.admin' cmd={"prefix": "mon stat"} : dispatch
Feb 23 10:10:36 np0005626463.localdomain ceph-mon[294160]: from='client.59182 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 23 10:10:36 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.108:0/2033944952' entity='client.admin' cmd={"prefix": "mgr versions"} : dispatch
Feb 23 10:10:36 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.106:0/431687781' entity='client.admin' cmd={"prefix": "mgr services"} : dispatch
Feb 23 10:10:36 np0005626463.localdomain ceph-mon[294160]: from='client.69311 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 23 10:10:36 np0005626463.localdomain ceph-mon[294160]: from='client.49356 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
Feb 23 10:10:36 np0005626463.localdomain ceph-mon[294160]: from='client.59194 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 23 10:10:36 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.107:0/3050292705' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 10:10:36 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.108:0/1294500577' entity='client.admin' cmd={"prefix": "mon stat"} : dispatch
Feb 23 10:10:36 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.106:0/315776036' entity='client.admin' cmd={"prefix": "mgr versions"} : dispatch
Feb 23 10:10:36 np0005626463.localdomain ceph-mon[294160]: from='client.49365 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 23 10:10:36 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.107:0/3272345447' entity='client.admin' cmd={"prefix": "node ls"} : dispatch
Feb 23 10:10:36 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.
Feb 23 10:10:36 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.
Feb 23 10:10:36 np0005626463.localdomain podman[326514]: 2026-02-23 10:10:36.891044842 +0000 UTC m=+0.064780123 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260216, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team)
Feb 23 10:10:36 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mon stat"} v 0)
Feb 23 10:10:36 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/3968225865' entity='client.admin' cmd={"prefix": "mon stat"} : dispatch
Feb 23 10:10:36 np0005626463.localdomain podman[326515]: 2026-02-23 10:10:36.92009783 +0000 UTC m=+0.089706334 container health_status be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_managed=true)
Feb 23 10:10:36 np0005626463.localdomain podman[326515]: 2026-02-23 10:10:36.955391371 +0000 UTC m=+0.124999855 container exec_died be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ceilometer_agent_compute, tcib_managed=true, org.label-schema.build-date=20260216, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, 
config_id=ceilometer_agent_compute)
Feb 23 10:10:36 np0005626463.localdomain systemd[1]: be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.service: Deactivated successfully.
Feb 23 10:10:36 np0005626463.localdomain podman[326514]: 2026-02-23 10:10:36.976199634 +0000 UTC m=+0.149934915 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ovn_metadata_agent, managed_by=edpm_ansible, container_name=ovn_metadata_agent, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Feb 23 10:10:36 np0005626463.localdomain systemd[1]: 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully.
Feb 23 10:10:37 np0005626463.localdomain ceph-mon[294160]: from='client.69350 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 23 10:10:37 np0005626463.localdomain ceph-mon[294160]: from='client.49386 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 23 10:10:37 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.106:0/3968225865' entity='client.admin' cmd={"prefix": "mon stat"} : dispatch
Feb 23 10:10:37 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.107:0/403139963' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 10:10:37 np0005626463.localdomain ceph-mon[294160]: from='client.59233 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 23 10:10:37 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.107:0/2928258918' entity='client.admin' cmd={"prefix": "osd crush class ls"} : dispatch
Feb 23 10:10:37 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.108:0/1237411397' entity='client.admin' cmd={"prefix": "log last", "channel": "cephadm", "format": "json-pretty"} : dispatch
Feb 23 10:10:37 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.108:0/930166630' entity='client.admin' cmd={"prefix": "node ls"} : dispatch
Feb 23 10:10:37 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.107:0/3266093420' entity='client.admin' cmd={"prefix": "log last", "channel": "cephadm", "format": "json-pretty"} : dispatch
Feb 23 10:10:37 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.107:0/3994099881' entity='client.admin' cmd={"prefix": "osd crush dump"} : dispatch
Feb 23 10:10:37 np0005626463.localdomain ceph-mon[294160]: pgmap v765: 177 pgs: 177 active+clean; 226 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Feb 23 10:10:37 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.108:0/2807399635' entity='client.admin' cmd={"prefix": "osd crush class ls"} : dispatch
Feb 23 10:10:37 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.108:0/2694379616' entity='client.admin' cmd={"prefix": "mgr dump", "format": "json-pretty"} : dispatch
Feb 23 10:10:37 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "node ls"} v 0)
Feb 23 10:10:37 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/798831580' entity='client.admin' cmd={"prefix": "node ls"} : dispatch
Feb 23 10:10:38 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:10:38.102 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:10:38 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "osd crush class ls"} v 0)
Feb 23 10:10:38 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/1240513113' entity='client.admin' cmd={"prefix": "osd crush class ls"} : dispatch
Feb 23 10:10:38 np0005626463.localdomain sudo[326709]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 10:10:38 np0005626463.localdomain sudo[326709]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 10:10:38 np0005626463.localdomain sudo[326709]: pam_unix(sudo:session): session closed for user root
Feb 23 10:10:38 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "log last", "channel": "cephadm", "format": "json-pretty"} v 0)
Feb 23 10:10:38 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/3992318563' entity='client.admin' cmd={"prefix": "log last", "channel": "cephadm", "format": "json-pretty"} : dispatch
Feb 23 10:10:38 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 23 10:10:38 np0005626463.localdomain sudo[326735]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 23 10:10:38 np0005626463.localdomain sudo[326735]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 10:10:38 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "osd crush dump"} v 0)
Feb 23 10:10:38 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/2560707444' entity='client.admin' cmd={"prefix": "osd crush dump"} : dispatch
Feb 23 10:10:38 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mgr dump", "format": "json-pretty"} v 0)
Feb 23 10:10:38 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/212614963' entity='client.admin' cmd={"prefix": "mgr dump", "format": "json-pretty"} : dispatch
Feb 23 10:10:38 np0005626463.localdomain ceph-mon[294160]: from='client.49410 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 23 10:10:38 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.107:0/238421729' entity='client.admin' cmd={"prefix": "osd crush rule ls"} : dispatch
Feb 23 10:10:38 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.106:0/798831580' entity='client.admin' cmd={"prefix": "node ls"} : dispatch
Feb 23 10:10:38 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.107:0/489840950' entity='client.admin' cmd={"prefix": "mgr dump", "format": "json-pretty"} : dispatch
Feb 23 10:10:38 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.108:0/3960534053' entity='client.admin' cmd={"prefix": "osd crush dump"} : dispatch
Feb 23 10:10:38 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.108:0/2390595576' entity='client.admin' cmd={"prefix": "mgr metadata", "format": "json-pretty"} : dispatch
Feb 23 10:10:38 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.107:0/3124370847' entity='client.admin' cmd={"prefix": "osd crush show-tunables"} : dispatch
Feb 23 10:10:38 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.106:0/1240513113' entity='client.admin' cmd={"prefix": "osd crush class ls"} : dispatch
Feb 23 10:10:38 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.106:0/3992318563' entity='client.admin' cmd={"prefix": "log last", "channel": "cephadm", "format": "json-pretty"} : dispatch
Feb 23 10:10:38 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.108:0/3090533570' entity='client.admin' cmd={"prefix": "osd crush rule ls"} : dispatch
Feb 23 10:10:38 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.107:0/468771352' entity='client.admin' cmd={"prefix": "mgr metadata", "format": "json-pretty"} : dispatch
Feb 23 10:10:38 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.108:0/202647636' entity='client.admin' cmd={"prefix": "mgr module ls", "format": "json-pretty"} : dispatch
Feb 23 10:10:38 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.106:0/2560707444' entity='client.admin' cmd={"prefix": "osd crush dump"} : dispatch
Feb 23 10:10:38 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.107:0/1641398332' entity='client.admin' cmd={"prefix": "osd crush tree", "show_shadow": true} : dispatch
Feb 23 10:10:38 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.106:0/212614963' entity='client.admin' cmd={"prefix": "mgr dump", "format": "json-pretty"} : dispatch
Feb 23 10:10:38 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.108:0/3820535450' entity='client.admin' cmd={"prefix": "osd crush show-tunables"} : dispatch
Feb 23 10:10:38 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.107:0/2872439826' entity='client.admin' cmd={"prefix": "mgr module ls", "format": "json-pretty"} : dispatch
Feb 23 10:10:38 np0005626463.localdomain sudo[326735]: pam_unix(sudo:session): session closed for user root
Feb 23 10:10:38 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "osd crush rule ls"} v 0)
Feb 23 10:10:38 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/1467611387' entity='client.admin' cmd={"prefix": "osd crush rule ls"} : dispatch
Feb 23 10:10:38 np0005626463.localdomain sudo[326883]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 23 10:10:38 np0005626463.localdomain sudo[326883]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 10:10:38 np0005626463.localdomain sudo[326883]: pam_unix(sudo:session): session closed for user root
Feb 23 10:10:39 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:10:39.054 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:10:39 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mgr metadata", "format": "json-pretty"} v 0)
Feb 23 10:10:39 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/4266852345' entity='client.admin' cmd={"prefix": "mgr metadata", "format": "json-pretty"} : dispatch
Feb 23 10:10:39 np0005626463.localdomain sudo[326905]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ceph-volume --fsid f1fea371-cb69-578d-a3d0-b5c472a84b46 -- inventory --format=json-pretty --filter-for-batch
Feb 23 10:10:39 np0005626463.localdomain sudo[326905]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 10:10:39 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "osd crush show-tunables"} v 0)
Feb 23 10:10:39 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/2621701882' entity='client.admin' cmd={"prefix": "osd crush show-tunables"} : dispatch
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:47:39.319065+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91725824 unmapped: 1105920 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:47:40.319193+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91725824 unmapped: 1105920 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: mgrc handle_mgr_map Got map version 30
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.106:6810/1055095676,v1:172.18.0.106:6811/1055095676]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: mgrc handle_mgr_map Got map version 31
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.106:6810/1055095676,v1:172.18.0.106:6811/1055095676]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:47:41.319334+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91602944 unmapped: 1228800 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:47:42.319514+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 835126 data_alloc: 184549376 data_used: 8830976
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: mgrc handle_mgr_map Got map version 32
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.106:6810/1055095676,v1:172.18.0.106:6811/1055095676]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91340800 unmapped: 1490944 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:47:43.319649+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91340800 unmapped: 1490944 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:47:44.319801+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 84 heartbeat osd_stat(store_statfs(0x1b9db0000/0x0/0x1bfc00000, data 0x1c6459f/0x1cdd000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91340800 unmapped: 1490944 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:47:45.319946+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91340800 unmapped: 1490944 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:47:46.320066+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91340800 unmapped: 1490944 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:47:47.320170+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 835126 data_alloc: 184549376 data_used: 8830976
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91340800 unmapped: 1490944 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:47:48.320311+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91340800 unmapped: 1490944 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:47:49.320417+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91340800 unmapped: 1490944 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 84 heartbeat osd_stat(store_statfs(0x1b9db0000/0x0/0x1bfc00000, data 0x1c6459f/0x1cdd000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:47:50.320540+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91340800 unmapped: 1490944 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:47:51.320697+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91340800 unmapped: 1490944 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 84 heartbeat osd_stat(store_statfs(0x1b9db0000/0x0/0x1bfc00000, data 0x1c6459f/0x1cdd000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:47:52.320905+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 835126 data_alloc: 184549376 data_used: 8830976
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91340800 unmapped: 1490944 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:47:53.321046+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91340800 unmapped: 1490944 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:47:54.321204+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 84 heartbeat osd_stat(store_statfs(0x1b9db0000/0x0/0x1bfc00000, data 0x1c6459f/0x1cdd000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91340800 unmapped: 1490944 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:47:55.321348+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91340800 unmapped: 1490944 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:47:56.321533+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91340800 unmapped: 1490944 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:47:57.321694+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 835126 data_alloc: 184549376 data_used: 8830976
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91340800 unmapped: 1490944 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:47:58.321857+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91340800 unmapped: 1490944 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:47:59.322040+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91340800 unmapped: 1490944 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:48:00.322176+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 84 heartbeat osd_stat(store_statfs(0x1b9db0000/0x0/0x1bfc00000, data 0x1c6459f/0x1cdd000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91340800 unmapped: 1490944 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:48:01.322301+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91340800 unmapped: 1490944 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:48:02.322485+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 835126 data_alloc: 184549376 data_used: 8830976
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91340800 unmapped: 1490944 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 84 heartbeat osd_stat(store_statfs(0x1b9db0000/0x0/0x1bfc00000, data 0x1c6459f/0x1cdd000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:48:03.322619+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91340800 unmapped: 1490944 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:48:04.322799+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91340800 unmapped: 1490944 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:48:05.323005+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_monmap mon_map magic: 0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient:  got monmap 12 from mon.np0005626461 (according to old e12)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: dump:
                                                          epoch 12
                                                          fsid f1fea371-cb69-578d-a3d0-b5c472a84b46
                                                          last_changed 2026-02-23T09:48:35.633872+0000
                                                          created 2026-02-23T07:36:01.997603+0000
                                                          min_mon_release 18 (reef)
                                                          election_strategy: 1
                                                          0: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005626466
                                                          1: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005626463
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: mon.np0005626461 at [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] went away
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _reopen_session rank -1
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _add_conns ranks=[1,0]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient(hunting): picked mon.np0005626463 con 0x557959aff400 addr [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient(hunting): picked mon.np0005626466 con 0x55795bd4e400 addr [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient(hunting): start opening mon connection
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient(hunting): start opening mon connection
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient(hunting): _renew_subs
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient(hunting): _finish_auth 0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient(hunting): get_auth_request con 0x557959aff400 auth_method 0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient(hunting): get_auth_request method 2 preferred_modes [2,1]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient(hunting): _init_auth method 2
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient(hunting): _init_auth already have auth, reseting
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient(hunting): handle_auth_reply_more payload 9
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient(hunting): handle_auth_reply_more payload_len 9
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient(hunting): handle_auth_reply_more responding with 132 bytes
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient(hunting): get_auth_request con 0x55795bd4e400 auth_method 0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient(hunting): get_auth_request method 2 preferred_modes [2,1]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient(hunting): _init_auth method 2
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient(hunting): _init_auth already have auth, reseting
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient(hunting): handle_auth_done global_id 24226 payload 293
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _finish_hunting 0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: found mon.np0005626463
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _send_mon_message to mon.np0005626463 at v2:172.18.0.103:3300/0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _finish_auth 0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:48:05.649324+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _send_mon_message to mon.np0005626463 at v2:172.18.0.103:3300/0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: ms_handle_reset current mon [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _reopen_session rank -1
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _add_conns ranks=[0,1]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient(hunting): picked mon.np0005626466 con 0x55795bd4e400 addr [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient(hunting): picked mon.np0005626463 con 0x55795a5efc00 addr [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient(hunting): start opening mon connection
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient(hunting): start opening mon connection
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient(hunting): _renew_subs
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 84 ms_handle_reset con 0x557959aff400 session 0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient(hunting): get_auth_request con 0x55795a5efc00 auth_method 0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient(hunting): get_auth_request method 2 preferred_modes [2,1]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient(hunting): _init_auth method 2
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient(hunting): _init_auth already have auth, reseting
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient(hunting): handle_auth_reply_more payload 9
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient(hunting): handle_auth_reply_more payload_len 9
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient(hunting): handle_auth_reply_more responding with 132 bytes
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient(hunting): handle_auth_done global_id 24226 payload 293
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _finish_hunting 0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: found mon.np0005626463
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _send_mon_message to mon.np0005626463 at v2:172.18.0.103:3300/0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _finish_auth 0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:48:05.663015+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _send_mon_message to mon.np0005626463 at v2:172.18.0.103:3300/0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_monmap mon_map magic: 0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient:  got monmap 12 from mon.np0005626463 (according to old e12)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: dump:
                                                          epoch 12
                                                          fsid f1fea371-cb69-578d-a3d0-b5c472a84b46
                                                          last_changed 2026-02-23T09:48:35.633872+0000
                                                          created 2026-02-23T07:36:01.997603+0000
                                                          min_mon_release 18 (reef)
                                                          election_strategy: 1
                                                          0: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005626466
                                                          1: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005626463
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_config config(7 keys)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: set_mon_vals no callback set
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: mgrc handle_mgr_map Got map version 32
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.106:6810/1055095676,v1:172.18.0.106:6811/1055095676]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 84 heartbeat osd_stat(store_statfs(0x1b9db0000/0x0/0x1bfc00000, data 0x1c6459f/0x1cdd000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91537408 unmapped: 1294336 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:48:06.323146+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91537408 unmapped: 1294336 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:48:07.323318+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 835126 data_alloc: 184549376 data_used: 8830976
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91537408 unmapped: 1294336 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:48:08.323476+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91537408 unmapped: 1294336 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_monmap mon_map magic: 0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient:  got monmap 13 from mon.np0005626463 (according to old e13)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: dump:
                                                          epoch 13
                                                          fsid f1fea371-cb69-578d-a3d0-b5c472a84b46
                                                          last_changed 2026-02-23T09:48:39.200687+0000
                                                          created 2026-02-23T07:36:01.997603+0000
                                                          min_mon_release 18 (reef)
                                                          election_strategy: 1
                                                          0: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005626466
                                                          1: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005626463
                                                          2: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005626465
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:48:09.323630+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 84 heartbeat osd_stat(store_statfs(0x1b9db0000/0x0/0x1bfc00000, data 0x1c6459f/0x1cdd000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 84 heartbeat osd_stat(store_statfs(0x1b9db0000/0x0/0x1bfc00000, data 0x1c6459f/0x1cdd000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91537408 unmapped: 1294336 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:48:10.323749+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91537408 unmapped: 1294336 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:48:11.323909+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91537408 unmapped: 1294336 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:48:12.324058+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 835126 data_alloc: 184549376 data_used: 8830976
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91537408 unmapped: 1294336 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:48:13.324194+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91537408 unmapped: 1294336 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 84 heartbeat osd_stat(store_statfs(0x1b9db0000/0x0/0x1bfc00000, data 0x1c6459f/0x1cdd000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:48:14.324342+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91537408 unmapped: 1294336 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:48:15.324541+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91537408 unmapped: 1294336 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:48:16.324678+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 84 heartbeat osd_stat(store_statfs(0x1b9db0000/0x0/0x1bfc00000, data 0x1c6459f/0x1cdd000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91537408 unmapped: 1294336 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:48:17.324841+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 835126 data_alloc: 184549376 data_used: 8830976
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91537408 unmapped: 1294336 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:48:18.325091+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91537408 unmapped: 1294336 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:48:19.325288+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91537408 unmapped: 1294336 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:48:20.325439+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91537408 unmapped: 1294336 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:48:21.325634+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 84 heartbeat osd_stat(store_statfs(0x1b9db0000/0x0/0x1bfc00000, data 0x1c6459f/0x1cdd000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91537408 unmapped: 1294336 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:48:22.325839+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 835126 data_alloc: 184549376 data_used: 8830976
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91537408 unmapped: 1294336 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:48:23.326017+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91537408 unmapped: 1294336 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:48:24.326183+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91537408 unmapped: 1294336 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:48:25.326358+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91537408 unmapped: 1294336 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:48:26.326503+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 84 heartbeat osd_stat(store_statfs(0x1b9db0000/0x0/0x1bfc00000, data 0x1c6459f/0x1cdd000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91537408 unmapped: 1294336 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:48:27.326700+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain systemd-journald[47710]: Data hash table of /run/log/journal/c0212a8b024a111cfc61293864f36c87/system.journal has a fill level at 75.0 (53723 of 71630 items, 25165824 file size, 468 bytes per hash table item), suggesting rotation.
Feb 23 10:10:39 np0005626463.localdomain systemd-journald[47710]: /run/log/journal/c0212a8b024a111cfc61293864f36c87/system.journal: Journal header limits reached or header out-of-date, rotating.
Feb 23 10:10:39 np0005626463.localdomain rsyslogd[758]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 835126 data_alloc: 184549376 data_used: 8830976
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91537408 unmapped: 1294336 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:48:28.326910+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91537408 unmapped: 1294336 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:48:29.327072+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91537408 unmapped: 1294336 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:48:30.327147+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 84 heartbeat osd_stat(store_statfs(0x1b9db0000/0x0/0x1bfc00000, data 0x1c6459f/0x1cdd000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91537408 unmapped: 1294336 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:48:31.327263+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 7800.1 total, 600.0 interval
                                                          Cumulative writes: 5308 writes, 23K keys, 5308 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s
                                                          Cumulative WAL: 5308 writes, 741 syncs, 7.16 writes per sync, written: 0.02 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 156 writes, 536 keys, 156 commit groups, 1.0 writes per commit group, ingest: 0.64 MB, 0.00 MB/s
                                                          Interval WAL: 156 writes, 62 syncs, 2.52 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91537408 unmapped: 1294336 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:48:32.327496+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 835126 data_alloc: 184549376 data_used: 8830976
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91537408 unmapped: 1294336 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:48:33.327670+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91537408 unmapped: 1294336 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:48:34.327818+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91537408 unmapped: 1294336 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 84 heartbeat osd_stat(store_statfs(0x1b9db0000/0x0/0x1bfc00000, data 0x1c6459f/0x1cdd000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:48:35.328028+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91537408 unmapped: 1294336 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:48:36.328269+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91537408 unmapped: 1294336 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:48:37.328432+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 835126 data_alloc: 184549376 data_used: 8830976
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91537408 unmapped: 1294336 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:48:38.328592+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91537408 unmapped: 1294336 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 84 heartbeat osd_stat(store_statfs(0x1b9db0000/0x0/0x1bfc00000, data 0x1c6459f/0x1cdd000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:48:39.328771+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 84 heartbeat osd_stat(store_statfs(0x1b9db0000/0x0/0x1bfc00000, data 0x1c6459f/0x1cdd000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91537408 unmapped: 1294336 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:48:40.328970+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91537408 unmapped: 1294336 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: ms_handle_reset current mon [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _reopen_session rank -1
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _add_conns ranks=[0,2,1]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient(hunting): picked mon.np0005626466 con 0x55795bd4e400 addr [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient(hunting): picked mon.np0005626465 con 0x557959aff400 addr [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient(hunting): picked mon.np0005626463 con 0x55795bb68c00 addr [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient(hunting): start opening mon connection
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient(hunting): start opening mon connection
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient(hunting): start opening mon connection
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient(hunting): _renew_subs
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 84 ms_handle_reset con 0x55795a5efc00 session 0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient(hunting): get_auth_request con 0x55795bb68c00 auth_method 0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient(hunting): get_auth_request method 2 preferred_modes [2,1]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient(hunting): _init_auth method 2
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient(hunting): _init_auth already have auth, reseting
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient(hunting): handle_auth_reply_more payload 9
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient(hunting): handle_auth_reply_more payload_len 9
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient(hunting): handle_auth_reply_more responding with 132 bytes
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient(hunting): handle_auth_done global_id 24226 payload 293
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _finish_hunting 0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: found mon.np0005626463
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _send_mon_message to mon.np0005626463 at v2:172.18.0.103:3300/0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _finish_auth 0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:48:41.078515+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _send_mon_message to mon.np0005626463 at v2:172.18.0.103:3300/0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_monmap mon_map magic: 0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient:  got monmap 14 from mon.np0005626463 (according to old e14)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: dump:
                                                          epoch 14
                                                          fsid f1fea371-cb69-578d-a3d0-b5c472a84b46
                                                          last_changed 2026-02-23T09:49:10.990173+0000
                                                          created 2026-02-23T07:36:01.997603+0000
                                                          min_mon_release 18 (reef)
                                                          election_strategy: 1
                                                          0: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005626463
                                                          1: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005626465
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_config config(7 keys)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: set_mon_vals no callback set
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: mgrc handle_mgr_map Got map version 32
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.106:6810/1055095676,v1:172.18.0.106:6811/1055095676]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_monmap mon_map magic: 0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient:  got monmap 14 from mon.np0005626463 (according to old e14)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: dump:
                                                          epoch 14
                                                          fsid f1fea371-cb69-578d-a3d0-b5c472a84b46
                                                          last_changed 2026-02-23T09:49:10.990173+0000
                                                          created 2026-02-23T07:36:01.997603+0000
                                                          min_mon_release 18 (reef)
                                                          election_strategy: 1
                                                          0: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005626463
                                                          1: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005626465
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:48:41.329141+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91537408 unmapped: 1294336 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:48:42.329309+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain rsyslogd[758]: imjournal from <localhost:ceph-osd>: begin to drop messages due to rate-limiting
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 835126 data_alloc: 184549376 data_used: 8830976
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91537408 unmapped: 1294336 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:48:43.329502+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 84 heartbeat osd_stat(store_statfs(0x1b9db0000/0x0/0x1bfc00000, data 0x1c6459f/0x1cdd000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91537408 unmapped: 1294336 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:48:44.329960+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91537408 unmapped: 1294336 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:48:45.330086+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91537408 unmapped: 1294336 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:48:46.330225+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91537408 unmapped: 1294336 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:48:47.330413+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 835126 data_alloc: 184549376 data_used: 8830976
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 84 heartbeat osd_stat(store_statfs(0x1b9db0000/0x0/0x1bfc00000, data 0x1c6459f/0x1cdd000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91537408 unmapped: 1294336 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:48:48.330584+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91537408 unmapped: 1294336 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:48:49.330734+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91537408 unmapped: 1294336 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:48:50.330978+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91537408 unmapped: 1294336 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:48:51.331132+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 84 heartbeat osd_stat(store_statfs(0x1b9db0000/0x0/0x1bfc00000, data 0x1c6459f/0x1cdd000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91537408 unmapped: 1294336 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:48:52.331335+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 835126 data_alloc: 184549376 data_used: 8830976
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91537408 unmapped: 1294336 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:48:53.331486+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91537408 unmapped: 1294336 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:48:54.331667+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91537408 unmapped: 1294336 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:48:55.331857+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91537408 unmapped: 1294336 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:48:56.332061+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 84 heartbeat osd_stat(store_statfs(0x1b9db0000/0x0/0x1bfc00000, data 0x1c6459f/0x1cdd000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_monmap mon_map magic: 0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient:  got monmap 15 from mon.np0005626463 (according to old e15)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: dump:
                                                          epoch 15
                                                          fsid f1fea371-cb69-578d-a3d0-b5c472a84b46
                                                          last_changed 2026-02-23T09:49:26.924061+0000
                                                          created 2026-02-23T07:36:01.997603+0000
                                                          min_mon_release 18 (reef)
                                                          election_strategy: 1
                                                          0: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005626463
                                                          1: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005626465
                                                          2: [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] mon.np0005626466
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91537408 unmapped: 1294336 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:48:57.332249+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 835126 data_alloc: 184549376 data_used: 8830976
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91537408 unmapped: 1294336 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:48:58.332436+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91537408 unmapped: 1294336 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:48:59.332610+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91537408 unmapped: 1294336 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:49:00.332790+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91537408 unmapped: 1294336 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:49:01.332934+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 84 heartbeat osd_stat(store_statfs(0x1b9db0000/0x0/0x1bfc00000, data 0x1c6459f/0x1cdd000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91537408 unmapped: 1294336 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:49:02.333121+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 835126 data_alloc: 184549376 data_used: 8830976
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91537408 unmapped: 1294336 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:49:03.333316+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91537408 unmapped: 1294336 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:49:04.333510+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 84 heartbeat osd_stat(store_statfs(0x1b9db0000/0x0/0x1bfc00000, data 0x1c6459f/0x1cdd000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91537408 unmapped: 1294336 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:49:05.333708+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91537408 unmapped: 1294336 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:49:06.333857+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91537408 unmapped: 1294336 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:49:07.337765+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 835126 data_alloc: 184549376 data_used: 8830976
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 84 heartbeat osd_stat(store_statfs(0x1b9db0000/0x0/0x1bfc00000, data 0x1c6459f/0x1cdd000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 90.164848328s of 90.185874939s, submitted: 6
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91586560 unmapped: 1245184 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:49:08.337923+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91586560 unmapped: 1245184 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:49:09.338073+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91586560 unmapped: 1245184 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:49:10.338255+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: mgrc handle_mgr_map Got map version 33
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.106:6810/1055095676,v1:172.18.0.106:6811/1055095676]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: mgrc handle_mgr_map Got map version 34
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: mgrc handle_mgr_map Active mgr is now 
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: mgrc reconnect Terminating session with v2:172.18.0.106:6810/1055095676
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: mgrc reconnect No active mgr available yet
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 84 ms_handle_reset con 0x55795bb68000 session 0x55795a223860
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91734016 unmapped: 1097728 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 84 handle_osd_map epochs [85,85], i have 84, src has [1,85]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795bd4ec00
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _renew_subs
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _send_mon_message to mon.np0005626463 at v2:172.18.0.103:3300/0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 85 handle_osd_map epochs [85,85], i have 85, src has [1,85]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:49:11.338409+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 85 heartbeat osd_stat(store_statfs(0x1b9dac000/0x0/0x1bfc00000, data 0x1c66bfd/0x1ce1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: mgrc handle_mgr_map Got map version 35
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/1471406,v1:172.18.0.108:6811/1471406]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: mgrc reconnect Starting new session with [v2:172.18.0.108:6810/1471406,v1:172.18.0.108:6811/1471406]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: get_auth_request con 0x55795a5efc00 auth_method 0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: mgrc handle_mgr_configure stats_period=5
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91824128 unmapped: 1007616 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:49:12.338592+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 838062 data_alloc: 184549376 data_used: 8839168
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: mgrc handle_mgr_map Got map version 36
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/1471406,v1:172.18.0.108:6811/1471406]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91824128 unmapped: 1007616 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:49:13.338785+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91824128 unmapped: 1007616 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:49:14.338965+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: mgrc handle_mgr_map Got map version 37
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/1471406,v1:172.18.0.108:6811/1471406]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91971584 unmapped: 860160 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:49:15.339184+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 85 heartbeat osd_stat(store_statfs(0x1b9dac000/0x0/0x1bfc00000, data 0x1c66bfd/0x1ce1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91971584 unmapped: 860160 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:49:16.339390+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 85 heartbeat osd_stat(store_statfs(0x1b9dac000/0x0/0x1bfc00000, data 0x1c66bfd/0x1ce1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: mgrc handle_mgr_map Got map version 38
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/1471406,v1:172.18.0.108:6811/1471406]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91971584 unmapped: 860160 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:49:17.339543+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 838062 data_alloc: 184549376 data_used: 8839168
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91971584 unmapped: 860160 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:49:18.339708+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91971584 unmapped: 860160 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:49:19.339842+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91971584 unmapped: 860160 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:49:20.339945+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91971584 unmapped: 860160 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 85 heartbeat osd_stat(store_statfs(0x1b9dac000/0x0/0x1bfc00000, data 0x1c66bfd/0x1ce1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:49:21.340094+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91971584 unmapped: 860160 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:49:22.340610+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 85 heartbeat osd_stat(store_statfs(0x1b9dac000/0x0/0x1bfc00000, data 0x1c66bfd/0x1ce1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 838062 data_alloc: 184549376 data_used: 8839168
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91971584 unmapped: 860160 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:49:23.340768+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91971584 unmapped: 860160 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:49:24.340912+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91971584 unmapped: 860160 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:49:25.341053+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91971584 unmapped: 860160 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:49:26.341346+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 85 heartbeat osd_stat(store_statfs(0x1b9dac000/0x0/0x1bfc00000, data 0x1c66bfd/0x1ce1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91971584 unmapped: 860160 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:49:27.341790+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 838062 data_alloc: 184549376 data_used: 8839168
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91971584 unmapped: 860160 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:49:28.341963+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:49:29.342104+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 85 heartbeat osd_stat(store_statfs(0x1b9dac000/0x0/0x1bfc00000, data 0x1c66bfd/0x1ce1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91971584 unmapped: 860160 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:49:30.342234+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91971584 unmapped: 860160 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:49:31.342366+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91971584 unmapped: 860160 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:49:32.342540+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91971584 unmapped: 860160 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 838062 data_alloc: 184549376 data_used: 8839168
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:49:33.342918+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91971584 unmapped: 860160 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:49:34.343370+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91971584 unmapped: 860160 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 85 heartbeat osd_stat(store_statfs(0x1b9dac000/0x0/0x1bfc00000, data 0x1c66bfd/0x1ce1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:49:35.343693+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91971584 unmapped: 860160 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: mgrc handle_mgr_map Got map version 39
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/1471406,v1:172.18.0.108:6811/1471406]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:49:36.343846+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91979776 unmapped: 851968 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 85 heartbeat osd_stat(store_statfs(0x1b9dac000/0x0/0x1bfc00000, data 0x1c66bfd/0x1ce1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:49:37.344035+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91979776 unmapped: 851968 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 838062 data_alloc: 184549376 data_used: 8839168
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:49:38.344272+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91979776 unmapped: 851968 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:49:39.344533+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91979776 unmapped: 851968 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:49:40.344757+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91979776 unmapped: 851968 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 85 heartbeat osd_stat(store_statfs(0x1b9dac000/0x0/0x1bfc00000, data 0x1c66bfd/0x1ce1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:49:41.344969+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91979776 unmapped: 851968 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:49:42.345242+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91979776 unmapped: 851968 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 85 heartbeat osd_stat(store_statfs(0x1b9dac000/0x0/0x1bfc00000, data 0x1c66bfd/0x1ce1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 838062 data_alloc: 184549376 data_used: 8839168
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:49:43.345398+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91979776 unmapped: 851968 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:49:44.345578+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91979776 unmapped: 851968 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:49:45.345737+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91979776 unmapped: 851968 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:49:46.345931+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91979776 unmapped: 851968 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:49:47.346130+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91979776 unmapped: 851968 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 838062 data_alloc: 184549376 data_used: 8839168
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:49:48.346281+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91979776 unmapped: 851968 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 85 heartbeat osd_stat(store_statfs(0x1b9dac000/0x0/0x1bfc00000, data 0x1c66bfd/0x1ce1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 85 heartbeat osd_stat(store_statfs(0x1b9dac000/0x0/0x1bfc00000, data 0x1c66bfd/0x1ce1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:49:49.346451+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91979776 unmapped: 851968 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:49:50.346614+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91979776 unmapped: 851968 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:49:51.346754+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91979776 unmapped: 851968 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:49:52.346922+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91979776 unmapped: 851968 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 838062 data_alloc: 184549376 data_used: 8839168
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 85 heartbeat osd_stat(store_statfs(0x1b9dac000/0x0/0x1bfc00000, data 0x1c66bfd/0x1ce1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:49:53.347041+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91979776 unmapped: 851968 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:49:54.347213+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91979776 unmapped: 851968 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:49:55.347344+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91979776 unmapped: 851968 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 85 heartbeat osd_stat(store_statfs(0x1b9dac000/0x0/0x1bfc00000, data 0x1c66bfd/0x1ce1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 85 heartbeat osd_stat(store_statfs(0x1b9dac000/0x0/0x1bfc00000, data 0x1c66bfd/0x1ce1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:49:56.347530+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91979776 unmapped: 851968 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:49:57.347667+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91979776 unmapped: 851968 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 838062 data_alloc: 184549376 data_used: 8839168
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:49:58.347842+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91979776 unmapped: 851968 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:49:59.347997+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91979776 unmapped: 851968 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:50:00.348142+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91979776 unmapped: 851968 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 85 heartbeat osd_stat(store_statfs(0x1b9dac000/0x0/0x1bfc00000, data 0x1c66bfd/0x1ce1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:50:01.348312+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91979776 unmapped: 851968 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:50:02.348554+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91979776 unmapped: 851968 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 838062 data_alloc: 184549376 data_used: 8839168
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:50:03.348721+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91979776 unmapped: 851968 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 85 heartbeat osd_stat(store_statfs(0x1b9dac000/0x0/0x1bfc00000, data 0x1c66bfd/0x1ce1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:50:04.348935+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91979776 unmapped: 851968 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:50:05.349116+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91979776 unmapped: 851968 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:50:06.349330+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91979776 unmapped: 851968 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:50:07.349505+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91979776 unmapped: 851968 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 838062 data_alloc: 184549376 data_used: 8839168
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:50:08.349629+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 85 heartbeat osd_stat(store_statfs(0x1b9dac000/0x0/0x1bfc00000, data 0x1c66bfd/0x1ce1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91979776 unmapped: 851968 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:50:09.349822+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91979776 unmapped: 851968 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:50:10.350001+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91979776 unmapped: 851968 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 85 heartbeat osd_stat(store_statfs(0x1b9dac000/0x0/0x1bfc00000, data 0x1c66bfd/0x1ce1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:50:11.350149+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91979776 unmapped: 851968 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 85 heartbeat osd_stat(store_statfs(0x1b9dac000/0x0/0x1bfc00000, data 0x1c66bfd/0x1ce1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:50:12.350346+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91996160 unmapped: 835584 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 838062 data_alloc: 184549376 data_used: 8839168
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:50:13.350503+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91996160 unmapped: 835584 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 85 heartbeat osd_stat(store_statfs(0x1b9dac000/0x0/0x1bfc00000, data 0x1c66bfd/0x1ce1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:50:14.350626+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91996160 unmapped: 835584 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:50:15.350765+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91996160 unmapped: 835584 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:50:16.350953+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91996160 unmapped: 835584 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:50:17.351143+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91996160 unmapped: 835584 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 838062 data_alloc: 184549376 data_used: 8839168
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:50:18.351285+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91996160 unmapped: 835584 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:50:19.351432+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91996160 unmapped: 835584 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 85 heartbeat osd_stat(store_statfs(0x1b9dac000/0x0/0x1bfc00000, data 0x1c66bfd/0x1ce1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:50:20.351530+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91996160 unmapped: 835584 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:50:21.351724+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91996160 unmapped: 835584 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:50:22.351965+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91996160 unmapped: 835584 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 838062 data_alloc: 184549376 data_used: 8839168
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:50:23.352171+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91996160 unmapped: 835584 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 85 heartbeat osd_stat(store_statfs(0x1b9dac000/0x0/0x1bfc00000, data 0x1c66bfd/0x1ce1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:50:24.352380+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91996160 unmapped: 835584 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:50:25.352536+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91996160 unmapped: 835584 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:50:26.352693+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91996160 unmapped: 835584 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:50:27.352959+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91996160 unmapped: 835584 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 838062 data_alloc: 184549376 data_used: 8839168
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 85 heartbeat osd_stat(store_statfs(0x1b9dac000/0x0/0x1bfc00000, data 0x1c66bfd/0x1ce1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:50:28.353576+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91996160 unmapped: 835584 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:50:29.353711+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91996160 unmapped: 835584 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:50:30.354053+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91996160 unmapped: 835584 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:50:31.354212+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91996160 unmapped: 835584 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 85 heartbeat osd_stat(store_statfs(0x1b9dac000/0x0/0x1bfc00000, data 0x1c66bfd/0x1ce1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:50:32.354376+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91996160 unmapped: 835584 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 838062 data_alloc: 184549376 data_used: 8839168
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:50:33.354587+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91996160 unmapped: 835584 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:50:34.354709+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91996160 unmapped: 835584 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: mgrc handle_mgr_map Got map version 40
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: mgrc handle_mgr_map Active mgr is now 
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: mgrc reconnect Terminating session with v2:172.18.0.108:6810/1471406
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: mgrc reconnect No active mgr available yet
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 85 handle_osd_map epochs [86,86], i have 85, src has [1,86]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 86.898468018s of 86.925666809s, submitted: 6
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 86 ms_handle_reset con 0x55795bd4ec00 session 0x557959ab6f00
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x557959aff400
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:50:35.354864+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91742208 unmapped: 1089536 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: mgrc handle_mgr_map Got map version 41
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.107:6810/2356945423,v1:172.18.0.107:6811/2356945423]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: mgrc reconnect Starting new session with [v2:172.18.0.107:6810/2356945423,v1:172.18.0.107:6811/2356945423]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: get_auth_request con 0x55795a602400 auth_method 0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: mgrc handle_mgr_configure stats_period=5
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:50:36.355040+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 86 heartbeat osd_stat(store_statfs(0x1b9da8000/0x0/0x1bfc00000, data 0x1c69409/0x1ce5000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91742208 unmapped: 1089536 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: mgrc handle_mgr_map Got map version 42
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.107:6810/2356945423,v1:172.18.0.107:6811/2356945423]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:50:37.355227+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91742208 unmapped: 1089536 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 841270 data_alloc: 184549376 data_used: 8839168
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:50:38.355344+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 86 heartbeat osd_stat(store_statfs(0x1b9da8000/0x0/0x1bfc00000, data 0x1c69409/0x1ce5000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91742208 unmapped: 1089536 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:50:39.355455+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: mgrc handle_mgr_map Got map version 43
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.107:6810/2356945423,v1:172.18.0.107:6811/2356945423]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91742208 unmapped: 1089536 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:50:40.355608+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: mgrc handle_mgr_map Got map version 44
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.107:6810/2356945423,v1:172.18.0.107:6811/2356945423]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91889664 unmapped: 942080 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:50:41.355759+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91889664 unmapped: 942080 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:50:42.356061+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91889664 unmapped: 942080 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 841270 data_alloc: 184549376 data_used: 8839168
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:50:43.356219+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 86 heartbeat osd_stat(store_statfs(0x1b9da8000/0x0/0x1bfc00000, data 0x1c69409/0x1ce5000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91889664 unmapped: 942080 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:50:44.356360+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91889664 unmapped: 942080 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:50:45.356555+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91889664 unmapped: 942080 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:50:46.356797+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91889664 unmapped: 942080 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:50:47.356985+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91889664 unmapped: 942080 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 86 heartbeat osd_stat(store_statfs(0x1b9da8000/0x0/0x1bfc00000, data 0x1c69409/0x1ce5000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 841270 data_alloc: 184549376 data_used: 8839168
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:50:48.357201+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91889664 unmapped: 942080 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:50:49.357349+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91889664 unmapped: 942080 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:50:50.357530+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91889664 unmapped: 942080 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:50:51.357710+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91889664 unmapped: 942080 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:50:52.357955+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91889664 unmapped: 942080 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 86 heartbeat osd_stat(store_statfs(0x1b9da8000/0x0/0x1bfc00000, data 0x1c69409/0x1ce5000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 841270 data_alloc: 184549376 data_used: 8839168
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:50:53.358126+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91889664 unmapped: 942080 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:50:54.358335+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 86 heartbeat osd_stat(store_statfs(0x1b9da8000/0x0/0x1bfc00000, data 0x1c69409/0x1ce5000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91889664 unmapped: 942080 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 86 heartbeat osd_stat(store_statfs(0x1b9da8000/0x0/0x1bfc00000, data 0x1c69409/0x1ce5000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:50:55.358478+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 86 heartbeat osd_stat(store_statfs(0x1b9da8000/0x0/0x1bfc00000, data 0x1c69409/0x1ce5000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91889664 unmapped: 942080 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:50:56.358669+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91889664 unmapped: 942080 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:50:57.358818+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91889664 unmapped: 942080 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 841270 data_alloc: 184549376 data_used: 8839168
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:50:58.359007+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91889664 unmapped: 942080 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:50:59.359202+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91889664 unmapped: 942080 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:51:00.359374+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91889664 unmapped: 942080 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 86 heartbeat osd_stat(store_statfs(0x1b9da8000/0x0/0x1bfc00000, data 0x1c69409/0x1ce5000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:51:01.359560+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91889664 unmapped: 942080 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:51:02.359782+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91889664 unmapped: 942080 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 841270 data_alloc: 184549376 data_used: 8839168
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:51:03.359969+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 86 heartbeat osd_stat(store_statfs(0x1b9da8000/0x0/0x1bfc00000, data 0x1c69409/0x1ce5000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91889664 unmapped: 942080 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 86 heartbeat osd_stat(store_statfs(0x1b9da8000/0x0/0x1bfc00000, data 0x1c69409/0x1ce5000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:51:04.360159+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91889664 unmapped: 942080 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:51:05.360352+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91889664 unmapped: 942080 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:51:06.360560+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91889664 unmapped: 942080 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:51:07.360738+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91889664 unmapped: 942080 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 841270 data_alloc: 184549376 data_used: 8839168
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:51:08.360937+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91889664 unmapped: 942080 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 86 heartbeat osd_stat(store_statfs(0x1b9da8000/0x0/0x1bfc00000, data 0x1c69409/0x1ce5000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:51:09.361108+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 86 heartbeat osd_stat(store_statfs(0x1b9da8000/0x0/0x1bfc00000, data 0x1c69409/0x1ce5000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91889664 unmapped: 942080 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:51:10.361276+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91889664 unmapped: 942080 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:51:11.361462+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 86 heartbeat osd_stat(store_statfs(0x1b9da8000/0x0/0x1bfc00000, data 0x1c69409/0x1ce5000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91889664 unmapped: 942080 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:51:12.361680+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91889664 unmapped: 942080 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 841270 data_alloc: 184549376 data_used: 8839168
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:51:13.361909+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91889664 unmapped: 942080 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:51:14.362052+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 86 heartbeat osd_stat(store_statfs(0x1b9da8000/0x0/0x1bfc00000, data 0x1c69409/0x1ce5000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91889664 unmapped: 942080 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:51:15.362242+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91889664 unmapped: 942080 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:51:16.362424+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91889664 unmapped: 942080 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:51:17.362561+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91889664 unmapped: 942080 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 841270 data_alloc: 184549376 data_used: 8839168
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:51:18.362760+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91889664 unmapped: 942080 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 86 heartbeat osd_stat(store_statfs(0x1b9da8000/0x0/0x1bfc00000, data 0x1c69409/0x1ce5000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:51:19.362957+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91889664 unmapped: 942080 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:51:20.363156+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91889664 unmapped: 942080 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:51:21.363370+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 86 heartbeat osd_stat(store_statfs(0x1b9da8000/0x0/0x1bfc00000, data 0x1c69409/0x1ce5000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91889664 unmapped: 942080 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:51:22.363680+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91889664 unmapped: 942080 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 841270 data_alloc: 184549376 data_used: 8839168
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:51:23.363902+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91889664 unmapped: 942080 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:51:24.364100+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91889664 unmapped: 942080 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:51:25.364293+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91889664 unmapped: 942080 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:51:26.364482+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91889664 unmapped: 942080 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:51:27.364685+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 86 heartbeat osd_stat(store_statfs(0x1b9da8000/0x0/0x1bfc00000, data 0x1c69409/0x1ce5000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91889664 unmapped: 942080 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 841270 data_alloc: 184549376 data_used: 8839168
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:51:28.364837+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91889664 unmapped: 942080 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:51:29.365010+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91889664 unmapped: 942080 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:51:30.365154+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91889664 unmapped: 942080 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:51:31.365343+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91889664 unmapped: 942080 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:51:32.365528+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91889664 unmapped: 942080 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 841270 data_alloc: 184549376 data_used: 8839168
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 86 heartbeat osd_stat(store_statfs(0x1b9da8000/0x0/0x1bfc00000, data 0x1c69409/0x1ce5000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:51:33.365662+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 86 heartbeat osd_stat(store_statfs(0x1b9da8000/0x0/0x1bfc00000, data 0x1c69409/0x1ce5000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91889664 unmapped: 942080 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:51:34.365906+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91889664 unmapped: 942080 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 86 heartbeat osd_stat(store_statfs(0x1b9da8000/0x0/0x1bfc00000, data 0x1c69409/0x1ce5000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:51:35.366069+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91889664 unmapped: 942080 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:51:36.366244+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 86 heartbeat osd_stat(store_statfs(0x1b9da8000/0x0/0x1bfc00000, data 0x1c69409/0x1ce5000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91889664 unmapped: 942080 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:51:37.366382+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91889664 unmapped: 942080 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 841270 data_alloc: 184549376 data_used: 8839168
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:51:38.366526+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91889664 unmapped: 942080 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:51:39.366723+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91889664 unmapped: 942080 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:51:40.366949+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91889664 unmapped: 942080 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:51:41.367160+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91889664 unmapped: 942080 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:51:42.367335+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 86 heartbeat osd_stat(store_statfs(0x1b9da8000/0x0/0x1bfc00000, data 0x1c69409/0x1ce5000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91889664 unmapped: 942080 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 841270 data_alloc: 184549376 data_used: 8839168
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:51:43.367506+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91889664 unmapped: 942080 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:51:44.367669+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91889664 unmapped: 942080 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:51:45.367848+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 86 heartbeat osd_stat(store_statfs(0x1b9da8000/0x0/0x1bfc00000, data 0x1c69409/0x1ce5000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91889664 unmapped: 942080 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:51:46.368066+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91889664 unmapped: 942080 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:51:47.368269+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91889664 unmapped: 942080 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 841270 data_alloc: 184549376 data_used: 8839168
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:51:48.368422+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91889664 unmapped: 942080 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:51:49.368577+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91889664 unmapped: 942080 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 86 heartbeat osd_stat(store_statfs(0x1b9da8000/0x0/0x1bfc00000, data 0x1c69409/0x1ce5000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:51:50.368703+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91889664 unmapped: 942080 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:51:51.368918+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91889664 unmapped: 942080 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:51:52.369146+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 86 heartbeat osd_stat(store_statfs(0x1b9da8000/0x0/0x1bfc00000, data 0x1c69409/0x1ce5000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91889664 unmapped: 942080 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 841270 data_alloc: 184549376 data_used: 8839168
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:51:53.369317+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91889664 unmapped: 942080 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:51:54.369503+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91889664 unmapped: 942080 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:51:55.369649+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91889664 unmapped: 942080 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:51:56.369786+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91889664 unmapped: 942080 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:51:57.369995+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 86 heartbeat osd_stat(store_statfs(0x1b9da8000/0x0/0x1bfc00000, data 0x1c69409/0x1ce5000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91889664 unmapped: 942080 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 841270 data_alloc: 184549376 data_used: 8839168
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:51:58.370198+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91889664 unmapped: 942080 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:51:59.370350+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91889664 unmapped: 942080 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:52:00.370544+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91889664 unmapped: 942080 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:52:01.370712+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91889664 unmapped: 942080 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:52:02.370947+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 86 heartbeat osd_stat(store_statfs(0x1b9da8000/0x0/0x1bfc00000, data 0x1c69409/0x1ce5000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 86 heartbeat osd_stat(store_statfs(0x1b9da8000/0x0/0x1bfc00000, data 0x1c69409/0x1ce5000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91889664 unmapped: 942080 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 841270 data_alloc: 184549376 data_used: 8839168
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:52:03.371116+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795bb69800
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 88.493705750s of 88.513702393s, submitted: 6
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91906048 unmapped: 925696 heap: 92831744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:52:04.371266+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91299840 unmapped: 10846208 heap: 102146048 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 86 handle_osd_map epochs [86,87], i have 86, src has [1,87]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 87 ms_handle_reset con 0x55795bb69800 session 0x55795a21ed20
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:52:05.371409+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795a304c00
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91430912 unmapped: 17006592 heap: 108437504 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:52:06.371579+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91414528 unmapped: 17022976 heap: 108437504 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: mgrc handle_mgr_map Got map version 45
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.107:6810/2356945423,v1:172.18.0.107:6811/2356945423]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 87 handle_osd_map epochs [88,88], i have 87, src has [1,88]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:52:07.371687+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 88 ms_handle_reset con 0x55795a304c00 session 0x55795a5b45a0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 88 heartbeat osd_stat(store_statfs(0x1b84c2000/0x0/0x1bfc00000, data 0x354b8f8/0x35cb000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91267072 unmapped: 17170432 heap: 108437504 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1025005 data_alloc: 184549376 data_used: 8851456
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:52:08.371852+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91267072 unmapped: 17170432 heap: 108437504 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:52:09.372064+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91267072 unmapped: 17170432 heap: 108437504 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:52:10.372327+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91267072 unmapped: 17170432 heap: 108437504 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:52:11.372519+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 88 heartbeat osd_stat(store_statfs(0x1b84bd000/0x0/0x1bfc00000, data 0x354dc9a/0x35cf000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91267072 unmapped: 17170432 heap: 108437504 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:52:12.372739+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91267072 unmapped: 17170432 heap: 108437504 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:52:13.372999+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1025005 data_alloc: 184549376 data_used: 8851456
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91267072 unmapped: 17170432 heap: 108437504 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:52:14.373329+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91267072 unmapped: 17170432 heap: 108437504 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:52:15.373509+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 88 heartbeat osd_stat(store_statfs(0x1b84bd000/0x0/0x1bfc00000, data 0x354dc9a/0x35cf000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91267072 unmapped: 17170432 heap: 108437504 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:52:16.373712+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91267072 unmapped: 17170432 heap: 108437504 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:52:17.373945+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 88 heartbeat osd_stat(store_statfs(0x1b84bd000/0x0/0x1bfc00000, data 0x354dc9a/0x35cf000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91267072 unmapped: 17170432 heap: 108437504 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:52:18.374753+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1025005 data_alloc: 184549376 data_used: 8851456
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91267072 unmapped: 17170432 heap: 108437504 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:52:19.374974+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91267072 unmapped: 17170432 heap: 108437504 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:52:20.375152+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91267072 unmapped: 17170432 heap: 108437504 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:52:21.375358+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91267072 unmapped: 17170432 heap: 108437504 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:52:22.375578+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91267072 unmapped: 17170432 heap: 108437504 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:52:23.375762+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1025005 data_alloc: 184549376 data_used: 8851456
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 88 heartbeat osd_stat(store_statfs(0x1b84bd000/0x0/0x1bfc00000, data 0x354dc9a/0x35cf000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91267072 unmapped: 17170432 heap: 108437504 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:52:24.375950+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91267072 unmapped: 17170432 heap: 108437504 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:52:25.376130+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91267072 unmapped: 17170432 heap: 108437504 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 88 heartbeat osd_stat(store_statfs(0x1b84bd000/0x0/0x1bfc00000, data 0x354dc9a/0x35cf000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:52:26.376285+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91267072 unmapped: 17170432 heap: 108437504 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:52:27.376540+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91267072 unmapped: 17170432 heap: 108437504 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:52:28.376744+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1025005 data_alloc: 184549376 data_used: 8851456
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91267072 unmapped: 17170432 heap: 108437504 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 88 heartbeat osd_stat(store_statfs(0x1b84bd000/0x0/0x1bfc00000, data 0x354dc9a/0x35cf000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:52:29.376960+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91267072 unmapped: 17170432 heap: 108437504 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:52:30.377168+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91267072 unmapped: 17170432 heap: 108437504 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:52:31.377400+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 88 heartbeat osd_stat(store_statfs(0x1b84bd000/0x0/0x1bfc00000, data 0x354dc9a/0x35cf000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91267072 unmapped: 17170432 heap: 108437504 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:52:32.377674+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91267072 unmapped: 17170432 heap: 108437504 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:52:33.377919+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1025005 data_alloc: 184549376 data_used: 8851456
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91267072 unmapped: 17170432 heap: 108437504 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:52:34.378147+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91267072 unmapped: 17170432 heap: 108437504 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:52:35.378337+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91267072 unmapped: 17170432 heap: 108437504 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:52:36.378483+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91267072 unmapped: 17170432 heap: 108437504 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:52:37.378643+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 88 heartbeat osd_stat(store_statfs(0x1b84bd000/0x0/0x1bfc00000, data 0x354dc9a/0x35cf000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91267072 unmapped: 17170432 heap: 108437504 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:52:38.378776+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1025005 data_alloc: 184549376 data_used: 8851456
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 88 heartbeat osd_stat(store_statfs(0x1b84bd000/0x0/0x1bfc00000, data 0x354dc9a/0x35cf000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 88 heartbeat osd_stat(store_statfs(0x1b84bd000/0x0/0x1bfc00000, data 0x354dc9a/0x35cf000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91267072 unmapped: 17170432 heap: 108437504 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:52:39.378939+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91267072 unmapped: 17170432 heap: 108437504 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:52:40.379126+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91267072 unmapped: 17170432 heap: 108437504 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:52:41.379277+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91267072 unmapped: 17170432 heap: 108437504 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:52:42.379457+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 88 heartbeat osd_stat(store_statfs(0x1b84bd000/0x0/0x1bfc00000, data 0x354dc9a/0x35cf000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91267072 unmapped: 17170432 heap: 108437504 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:52:43.379783+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1025005 data_alloc: 184549376 data_used: 8851456
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91267072 unmapped: 17170432 heap: 108437504 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:52:44.379946+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91267072 unmapped: 17170432 heap: 108437504 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:52:45.380026+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91267072 unmapped: 17170432 heap: 108437504 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:52:46.380168+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91267072 unmapped: 17170432 heap: 108437504 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:52:47.380298+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 88 heartbeat osd_stat(store_statfs(0x1b84bd000/0x0/0x1bfc00000, data 0x354dc9a/0x35cf000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91267072 unmapped: 17170432 heap: 108437504 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:52:48.380439+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1025005 data_alloc: 184549376 data_used: 8851456
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91267072 unmapped: 17170432 heap: 108437504 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:52:49.380607+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91267072 unmapped: 17170432 heap: 108437504 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:52:50.380760+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91267072 unmapped: 17170432 heap: 108437504 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:52:51.380932+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 88 heartbeat osd_stat(store_statfs(0x1b84bd000/0x0/0x1bfc00000, data 0x354dc9a/0x35cf000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91267072 unmapped: 17170432 heap: 108437504 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:52:52.381095+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91267072 unmapped: 17170432 heap: 108437504 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:52:53.381225+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1025005 data_alloc: 184549376 data_used: 8851456
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91267072 unmapped: 17170432 heap: 108437504 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:52:54.381322+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91267072 unmapped: 17170432 heap: 108437504 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:52:55.381529+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91267072 unmapped: 17170432 heap: 108437504 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795acc5800
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 88 ms_handle_reset con 0x55795acc5800 session 0x55795a21f4a0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:52:56.381671+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795bb04c00
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 88 ms_handle_reset con 0x55795bb04c00 session 0x55795a21e5a0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x557958fd8800
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 88 ms_handle_reset con 0x557958fd8800 session 0x5579584283c0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 88 heartbeat osd_stat(store_statfs(0x1b84bd000/0x0/0x1bfc00000, data 0x354dc9a/0x35cf000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91267072 unmapped: 17170432 heap: 108437504 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:52:57.381846+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795a5ee800
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 88 ms_handle_reset con 0x55795a5ee800 session 0x557958428000
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 91267072 unmapped: 17170432 heap: 108437504 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x557959afe000
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:52:58.382026+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1025005 data_alloc: 184549376 data_used: 8851456
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 88 ms_handle_reset con 0x557959afe000 session 0x55795a2870e0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x557958fd8800
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 88 ms_handle_reset con 0x557958fd8800 session 0x5579595ae3c0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 95928320 unmapped: 12509184 heap: 108437504 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795a5ee800
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 55.567615509s of 55.746898651s, submitted: 25
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:52:59.382180+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 88 heartbeat osd_stat(store_statfs(0x1b84bd000/0x0/0x1bfc00000, data 0x354dc9a/0x35cf000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 88 ms_handle_reset con 0x55795a5ee800 session 0x5579595af2c0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 96067584 unmapped: 12369920 heap: 108437504 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:53:00.382363+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 96067584 unmapped: 12369920 heap: 108437504 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795aa15400
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 88 ms_handle_reset con 0x55795aa15400 session 0x557959555c20
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:53:01.382493+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 96067584 unmapped: 12369920 heap: 108437504 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795aa14000
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 88 ms_handle_reset con 0x55795aa14000 session 0x557958302780
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:53:02.382649+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 96165888 unmapped: 12271616 heap: 108437504 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:53:03.382817+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1123348 data_alloc: 184549376 data_used: 13504512
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x557958f58000
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 88 ms_handle_reset con 0x557958f58000 session 0x557958302960
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x557958f58000
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 88 ms_handle_reset con 0x557958f58000 session 0x55795a609e00
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x557958fd8800
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795a5ee800
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 96124928 unmapped: 12312576 heap: 108437504 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 88 heartbeat osd_stat(store_statfs(0x1b7b53000/0x0/0x1bfc00000, data 0x3eb7cba/0x3f3b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:53:04.382990+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 97001472 unmapped: 11436032 heap: 108437504 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:53:05.383169+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 97034240 unmapped: 11403264 heap: 108437504 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:53:06.383354+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 97034240 unmapped: 11403264 heap: 108437504 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:53:07.383529+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 97034240 unmapped: 11403264 heap: 108437504 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 88 heartbeat osd_stat(store_statfs(0x1b7b53000/0x0/0x1bfc00000, data 0x3eb7cba/0x3f3b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:53:08.383649+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1134282 data_alloc: 184549376 data_used: 14565376
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 97034240 unmapped: 11403264 heap: 108437504 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:53:09.383958+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 97034240 unmapped: 11403264 heap: 108437504 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:53:10.384108+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 97034240 unmapped: 11403264 heap: 108437504 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:53:11.384285+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 97034240 unmapped: 11403264 heap: 108437504 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:53:12.384457+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 88 heartbeat osd_stat(store_statfs(0x1b7b53000/0x0/0x1bfc00000, data 0x3eb7cba/0x3f3b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x557958f58c00
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.537302971s of 13.788743019s, submitted: 40
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 97058816 unmapped: 11378688 heap: 108437504 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:53:13.384583+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1139564 data_alloc: 184549376 data_used: 14565376
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795a9b4400
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 88 handle_osd_map epochs [88,89], i have 88, src has [1,89]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 103497728 unmapped: 4939776 heap: 108437504 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 88 handle_osd_map epochs [89,89], i have 89, src has [1,89]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 89 handle_osd_map epochs [89,89], i have 89, src has [1,89]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 89 handle_osd_map epochs [89,89], i have 89, src has [1,89]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 89 ms_handle_reset con 0x557958f58c00 session 0x55795a5b05a0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:53:14.385688+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 89 ms_handle_reset con 0x55795a9b4400 session 0x557958428000
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 106184704 unmapped: 3596288 heap: 109780992 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 89 heartbeat osd_stat(store_statfs(0x1b68eb000/0x0/0x1bfc00000, data 0x5112e5a/0x519c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:53:15.385821+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795aca5000
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 105586688 unmapped: 5242880 heap: 110829568 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _renew_subs
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _send_mon_message to mon.np0005626463 at v2:172.18.0.103:3300/0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 89 handle_osd_map epochs [90,90], i have 89, src has [1,90]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:53:16.385963+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 90 ms_handle_reset con 0x55795aca5000 session 0x55795a21fa40
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 90 handle_osd_map epochs [90,90], i have 90, src has [1,90]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 90 handle_osd_map epochs [90,90], i have 90, src has [1,90]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 105193472 unmapped: 5636096 heap: 110829568 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:53:17.386101+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 105193472 unmapped: 5636096 heap: 110829568 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:53:18.386240+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x557957ff0800
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1305016 data_alloc: 184549376 data_used: 14819328
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 90 handle_osd_map epochs [90,91], i have 90, src has [1,91]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 91 handle_osd_map epochs [91,91], i have 91, src has [1,91]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 91 heartbeat osd_stat(store_statfs(0x1b680d000/0x0/0x1bfc00000, data 0x51f2c6b/0x527d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [0,0,0,0,0,0,1])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 91 ms_handle_reset con 0x557958fd8800 session 0x5579595590e0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 91 ms_handle_reset con 0x55795a5ee800 session 0x55795a2225a0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 91 ms_handle_reset con 0x557957ff0800 session 0x5579595aeb40
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 106070016 unmapped: 4759552 heap: 110829568 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:53:19.386399+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 91 heartbeat osd_stat(store_statfs(0x1b680e000/0x0/0x1bfc00000, data 0x51f2848/0x527b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 106250240 unmapped: 4579328 heap: 110829568 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:53:20.386558+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 91 handle_osd_map epochs [92,92], i have 91, src has [1,92]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795a9b4000
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795a305000
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 92 handle_osd_map epochs [92,92], i have 92, src has [1,92]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 92 ms_handle_reset con 0x55795a305000 session 0x55795a222960
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 92 ms_handle_reset con 0x55795a9b4000 session 0x55795a609860
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795898d400
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 113369088 unmapped: 4808704 heap: 118177792 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:53:21.386689+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 92 ms_handle_reset con 0x55795898d400 session 0x55795a608d20
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x557957ff0800
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 122617856 unmapped: 7077888 heap: 129695744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:53:22.386858+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 92 handle_osd_map epochs [92,93], i have 92, src has [1,93]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.4] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 93 handle_osd_map epochs [93,93], i have 93, src has [1,93]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.e] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.14] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.19] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 93 ms_handle_reset con 0x557957ff0800 session 0x557958738f00
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.6] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 93 heartbeat osd_stat(store_statfs(0x1b58ab000/0x0/0x1bfc00000, data 0x615408a/0x61e2000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.8] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.16] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.1f] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795a305000
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.11] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.7] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.2] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 122568704 unmapped: 7127040 heap: 129695744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:53:23.387044+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1481069 data_alloc: 201326592 data_used: 24006656
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 93 handle_osd_map epochs [93,94], i have 93, src has [1,94]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.194706917s of 10.760626793s, submitted: 411
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 94 ms_handle_reset con 0x55795a305000 session 0x557959b16000
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 123650048 unmapped: 6045696 heap: 129695744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:53:24.387207+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 123650048 unmapped: 6045696 heap: 129695744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:53:25.387399+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 123781120 unmapped: 5914624 heap: 129695744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:53:26.387499+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 94 handle_osd_map epochs [94,95], i have 94, src has [1,95]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795a9b5c00
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 95 handle_osd_map epochs [95,95], i have 95, src has [1,95]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 95 handle_osd_map epochs [95,95], i have 95, src has [1,95]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 95 ms_handle_reset con 0x55795a9b5c00 session 0x55795a632d20
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 106921984 unmapped: 22773760 heap: 129695744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:53:27.387656+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 95 heartbeat osd_stat(store_statfs(0x1b751e000/0x0/0x1bfc00000, data 0x44de6f8/0x456f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 106921984 unmapped: 22773760 heap: 129695744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:53:28.387804+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1215500 data_alloc: 184549376 data_used: 13553664
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795adb9c00
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 95 ms_handle_reset con 0x55795adb9c00 session 0x557959b17a40
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795bd4d000
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 95 ms_handle_reset con 0x55795bd4d000 session 0x557959b165a0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x557957ff0800
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 95 ms_handle_reset con 0x557957ff0800 session 0x557959b163c0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 106921984 unmapped: 22773760 heap: 129695744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:53:29.387951+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 106921984 unmapped: 22773760 heap: 129695744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:53:30.388136+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 106921984 unmapped: 22773760 heap: 129695744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:53:31.772511+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 95 handle_osd_map epochs [95,96], i have 95, src has [1,96]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.14] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.7] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.2] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x5579595ca000
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.8] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.16] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.11] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.4] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.6] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 96 ms_handle_reset con 0x5579595ca000 session 0x557959b17e00
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795bb69000
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.e] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.19] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.1f] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 96 ms_handle_reset con 0x55795bb69000 session 0x557958303e00
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets getting new tickets!
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _send_mon_message to mon.np0005626463 at v2:172.18.0.103:3300/0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:53:32.772726+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _finish_auth 0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:53:32.773550+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 106962944 unmapped: 22732800 heap: 129695744 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795898c800
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 96 ms_handle_reset con 0x55795898c800 session 0x5579595cc5a0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795a305800
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 96 ms_handle_reset con 0x55795a305800 session 0x5579595cc960
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x557957ff0800
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795898c800
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 96 ms_handle_reset con 0x55795898c800 session 0x55795a223a40
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x5579595ca000
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 96 ms_handle_reset con 0x5579595ca000 session 0x55795a223860
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 96 heartbeat osd_stat(store_statfs(0x1b753f000/0x0/0x1bfc00000, data 0x44bc970/0x454e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [0,0,0,0,0,1])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795bb69000
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795898c000
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 96 ms_handle_reset con 0x557957ff0800 session 0x55795a2225a0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795bd3ec00
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 96 ms_handle_reset con 0x55795898c000 session 0x557958429a40
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 96 ms_handle_reset con 0x55795bd3ec00 session 0x5579595aeb40
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 96 ms_handle_reset con 0x55795bb69000 session 0x55795a222d20
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1318549 data_alloc: 184549376 data_used: 13557760
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x557957ff0800
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 96 ms_handle_reset con 0x557957ff0800 session 0x55795a5b05a0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795898c000
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 96 ms_handle_reset con 0x55795898c000 session 0x557958302780
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795898c800
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x5579595ca000
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 96 ms_handle_reset con 0x5579595ca000 session 0x55795a250f00
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 96 ms_handle_reset con 0x55795898c800 session 0x557958302960
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:53:33.772906+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 105766912 unmapped: 28131328 heap: 133898240 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x557957ff0800
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795898c000
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 96 ms_handle_reset con 0x557957ff0800 session 0x55795a223a40
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 96 ms_handle_reset con 0x55795898c000 session 0x557957c21e00
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795898c800
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 96 ms_handle_reset con 0x55795898c800 session 0x55795a223860
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x5579595ca000
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 96 heartbeat osd_stat(store_statfs(0x1b69f2000/0x0/0x1bfc00000, data 0x50069f2/0x509b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.774333954s of 10.210538864s, submitted: 123
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795bb69000
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 96 ms_handle_reset con 0x55795bb69000 session 0x557957c203c0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 96 ms_handle_reset con 0x5579595ca000 session 0x55795a2225a0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:53:34.773057+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 114434048 unmapped: 23314432 heap: 137748480 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x557957ff0800
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 96 ms_handle_reset con 0x557957ff0800 session 0x557959aec960
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795898c000
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 96 ms_handle_reset con 0x55795898c000 session 0x55795a6374a0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:53:35.773241+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 114434048 unmapped: 23314432 heap: 137748480 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795898c800
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 96 ms_handle_reset con 0x55795898c800 session 0x557959b17680
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x5579595ca000
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 96 ms_handle_reset con 0x5579595ca000 session 0x55795a609860
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795bb69000
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 96 ms_handle_reset con 0x55795bb69000 session 0x55795a222960
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x557957ff0800
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795898c000
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 96 ms_handle_reset con 0x55795898c000 session 0x557957c20b40
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795898c800
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x5579595ca000
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 96 ms_handle_reset con 0x55795898c800 session 0x557957c20000
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:53:36.773429+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 114515968 unmapped: 23232512 heap: 137748480 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 96 handle_osd_map epochs [96,97], i have 96, src has [1,97]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795a5ef800
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795aca5400
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795a5ee400
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 97 handle_osd_map epochs [97,97], i have 97, src has [1,97]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 97 ms_handle_reset con 0x5579595ca000 session 0x557959b17e00
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 97 heartbeat osd_stat(store_statfs(0x1b585b000/0x0/0x1bfc00000, data 0x619aa58/0x6233000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:53:37.773584+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 110510080 unmapped: 27238400 heap: 137748480 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1375995 data_alloc: 184549376 data_used: 14041088
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:53:38.773746+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 110510080 unmapped: 27238400 heap: 137748480 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:53:39.773925+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 110510080 unmapped: 27238400 heap: 137748480 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:53:40.774087+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 110510080 unmapped: 27238400 heap: 137748480 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 97 heartbeat osd_stat(store_statfs(0x1b67aa000/0x0/0x1bfc00000, data 0x523fddc/0x52d8000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:53:41.774249+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 110510080 unmapped: 27238400 heap: 137748480 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:53:42.774415+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _send_mon_message to mon.np0005626463 at v2:172.18.0.103:3300/0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 110796800 unmapped: 26951680 heap: 137748480 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1389595 data_alloc: 184549376 data_used: 15601664
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:53:43.774565+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 110796800 unmapped: 26951680 heap: 137748480 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:53:44.774728+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 110796800 unmapped: 26951680 heap: 137748480 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 97 heartbeat osd_stat(store_statfs(0x1b67aa000/0x0/0x1bfc00000, data 0x523fddc/0x52d8000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.064633369s of 11.491760254s, submitted: 83
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:53:45.774866+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 108118016 unmapped: 29630464 heap: 137748480 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:53:46.775035+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 108658688 unmapped: 29089792 heap: 137748480 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 97 handle_osd_map epochs [98,98], i have 97, src has [1,98]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:53:47.775179+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 108724224 unmapped: 29024256 heap: 137748480 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 98 heartbeat osd_stat(store_statfs(0x1b67b1000/0x0/0x1bfc00000, data 0x5242064/0x52dc000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 98 heartbeat osd_stat(store_statfs(0x1b67b1000/0x0/0x1bfc00000, data 0x5242064/0x52dc000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1399231 data_alloc: 184549376 data_used: 16719872
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:53:48.775330+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 109223936 unmapped: 28524544 heap: 137748480 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:53:49.775525+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 109223936 unmapped: 28524544 heap: 137748480 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 98 ms_handle_reset con 0x55795a5ef800 session 0x557958313c20
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 98 ms_handle_reset con 0x557957ff0800 session 0x557959b16b40
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:53:50.775713+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 109297664 unmapped: 28450816 heap: 137748480 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:53:51.775918+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 111427584 unmapped: 26320896 heap: 137748480 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 98 handle_osd_map epochs [99,99], i have 98, src has [1,99]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _renew_subs
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _send_mon_message to mon.np0005626463 at v2:172.18.0.103:3300/0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 99 handle_osd_map epochs [99,99], i have 99, src has [1,99]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795a305000
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 99 ms_handle_reset con 0x55795a305000 session 0x55795bd463c0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795ad10000
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 99 heartbeat osd_stat(store_statfs(0x1b67ae000/0x0/0x1bfc00000, data 0x5246064/0x52e0000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795ad11000
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 99 ms_handle_reset con 0x55795ad11000 session 0x55795a636780
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 99 ms_handle_reset con 0x55795ad10000 session 0x5579587383c0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:53:52.776076+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795bd4e000
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 119267328 unmapped: 18481152 heap: 137748480 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1537843 data_alloc: 201326592 data_used: 24776704
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:53:53.776177+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 123150336 unmapped: 14598144 heap: 137748480 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795bde3c00
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795aa14000
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 99 ms_handle_reset con 0x55795bd4e000 session 0x55795a5b54a0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x557959afe800
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:53:54.776292+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 99 handle_osd_map epochs [99,100], i have 99, src has [1,100]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.8] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 100 handle_osd_map epochs [100,100], i have 100, src has [1,100]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.4] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.14] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.16] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.6] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 127434752 unmapped: 10313728 heap: 137748480 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.11] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.7] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.2] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 100 heartbeat osd_stat(store_statfs(0x1b56e5000/0x0/0x1bfc00000, data 0x630c468/0x63a9000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:53:55.776468+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 127541248 unmapped: 10207232 heap: 137748480 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.657592773s of 10.657130241s, submitted: 248
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.e] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.19] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.1f] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 100 ms_handle_reset con 0x557959afe800 session 0x557958302960
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x557959afe800
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:53:56.776634+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 123789312 unmapped: 13959168 heap: 137748480 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 100 ms_handle_reset con 0x55795aca5400 session 0x557958302780
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:53:57.776786+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 123920384 unmapped: 13828096 heap: 137748480 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:53:58.889680+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1574749 data_alloc: 201326592 data_used: 24870912
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 100 ms_handle_reset con 0x55795a5ee400 session 0x55795a58a000
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 100 handle_osd_map epochs [100,101], i have 100, src has [1,101]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 123936768 unmapped: 13811712 heap: 137748480 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 101 ms_handle_reset con 0x557959afe800 session 0x55795a21e000
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 101 heartbeat osd_stat(store_statfs(0x1b56b8000/0x0/0x1bfc00000, data 0x633785e/0x63d6000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795a9b5800
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 101 ms_handle_reset con 0x55795a9b5800 session 0x55795a6323c0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795ad10400
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 101 ms_handle_reset con 0x55795ad10400 session 0x55795bd47a40
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795ad10400
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 101 ms_handle_reset con 0x55795ad10400 session 0x557958302b40
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:53:59.889799+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 124026880 unmapped: 13721600 heap: 137748480 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x557959afe800
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:54:00.889927+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 101 ms_handle_reset con 0x557959afe800 session 0x557958429860
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 117940224 unmapped: 19808256 heap: 137748480 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:54:01.890060+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _renew_subs
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _send_mon_message to mon.np0005626463 at v2:172.18.0.103:3300/0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 101 handle_osd_map epochs [102,102], i have 101, src has [1,102]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.1f] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.e] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.19] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.16] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.8] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.11] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 117940224 unmapped: 19808256 heap: 137748480 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.4] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.6] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.14] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.7] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.2] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:54:02.890200+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 117948416 unmapped: 19800064 heap: 137748480 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:54:03.890633+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1402302 data_alloc: 184549376 data_used: 16138240
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 117948416 unmapped: 19800064 heap: 137748480 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:54:04.890747+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 102 heartbeat osd_stat(store_statfs(0x1b6848000/0x0/0x1bfc00000, data 0x51a6e61/0x5246000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 117948416 unmapped: 19800064 heap: 137748480 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795adb9c00
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 102 ms_handle_reset con 0x55795adb9c00 session 0x55795a635e00
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795bf8a800
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 102 ms_handle_reset con 0x55795bf8a800 session 0x55795a5b4780
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795a305c00
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 102 ms_handle_reset con 0x55795a305c00 session 0x55795a21f860
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 102 ms_handle_reset con 0x55795bde3c00 session 0x55795a637c20
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 102 ms_handle_reset con 0x55795aa14000 session 0x55795d1221e0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795a305c00
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 102 ms_handle_reset con 0x55795a305c00 session 0x557959aede00
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x557959afe800
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795ad10400
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795adb9c00
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 102 ms_handle_reset con 0x55795ad10400 session 0x55795d1223c0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 102 ms_handle_reset con 0x557959afe800 session 0x5579595cc1e0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795a305c00
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 102 ms_handle_reset con 0x55795a305c00 session 0x55795958a960
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795aa14000
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 102 ms_handle_reset con 0x55795aa14000 session 0x557959ab6000
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795ad10400
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 102 ms_handle_reset con 0x55795ad10400 session 0x55795a21e3c0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:54:05.890931+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795bde3c00
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 102 ms_handle_reset con 0x55795bde3c00 session 0x55795d37b680
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 115916800 unmapped: 21831680 heap: 137748480 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795bf8a800
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.351548195s of 10.013216972s, submitted: 175
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 102 ms_handle_reset con 0x55795adb9c00 session 0x557959aed0e0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 102 ms_handle_reset con 0x55795bf8a800 session 0x55795d37a5a0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:54:06.891074+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 123002880 unmapped: 29851648 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795bf8b000
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 102 ms_handle_reset con 0x55795bf8b000 session 0x55795a609a40
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795bd4d800
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 102 ms_handle_reset con 0x55795bd4d800 session 0x55795bd47680
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:54:07.891221+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 123002880 unmapped: 29851648 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795bde3000
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 102 ms_handle_reset con 0x55795bde3000 session 0x55795d37a1e0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x557957a9c400
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 102 ms_handle_reset con 0x557957a9c400 session 0x557958302780
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795bd4d800
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795bde3000
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:54:08.891358+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1434028 data_alloc: 184549376 data_used: 14409728
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 122970112 unmapped: 29884416 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _renew_subs
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _send_mon_message to mon.np0005626463 at v2:172.18.0.103:3300/0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 102 handle_osd_map epochs [103,103], i have 102, src has [1,103]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795bf8a800
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 103 ms_handle_reset con 0x55795bde3000 session 0x55795d37ab40
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:54:09.891489+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 122527744 unmapped: 30326784 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 103 ms_handle_reset con 0x55795bf8a800 session 0x55795a634000
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 103 ms_handle_reset con 0x55795bd4d800 session 0x55795bd47a40
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795bf8b000
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 103 heartbeat osd_stat(store_statfs(0x1b6b3b000/0x0/0x1bfc00000, data 0x4eac1f5/0x4f4c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:54:10.891610+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 115089408 unmapped: 37765120 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 103 ms_handle_reset con 0x55795bf8b000 session 0x55795a6323c0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:54:11.891776+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 115097600 unmapped: 37756928 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:54:12.891979+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 115097600 unmapped: 37756928 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:54:13.892143+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1186035 data_alloc: 184549376 data_used: 13242368
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 115097600 unmapped: 37756928 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:54:14.892281+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795bb04800
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 115105792 unmapped: 37748736 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 103 handle_osd_map epochs [104,104], i have 103, src has [1,104]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _renew_subs
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _send_mon_message to mon.np0005626463 at v2:172.18.0.103:3300/0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 104 handle_osd_map epochs [104,104], i have 104, src has [1,104]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 104 ms_handle_reset con 0x55795bb04800 session 0x5579595cc5a0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795bb04800
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:54:15.892420+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 115138560 unmapped: 37715968 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 104 handle_osd_map epochs [104,105], i have 104, src has [1,105]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.220032692s of 10.043045998s, submitted: 177
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 105 handle_osd_map epochs [105,105], i have 105, src has [1,105]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 105 ms_handle_reset con 0x55795bb04800 session 0x557959559e00
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 105 heartbeat osd_stat(store_statfs(0x1b807c000/0x0/0x1bfc00000, data 0x3571d31/0x3611000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _renew_subs
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _send_mon_message to mon.np0005626463 at v2:172.18.0.103:3300/0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 105 handle_osd_map epochs [106,106], i have 105, src has [1,106]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:54:16.892553+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 115171328 unmapped: 37683200 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 106 heartbeat osd_stat(store_statfs(0x1b8079000/0x0/0x1bfc00000, data 0x3573914/0x3613000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:54:17.892718+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 115171328 unmapped: 37683200 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 106 heartbeat osd_stat(store_statfs(0x1b8074000/0x0/0x1bfc00000, data 0x3575bb8/0x3617000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:54:18.892855+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1199471 data_alloc: 184549376 data_used: 13254656
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 115171328 unmapped: 37683200 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:54:19.893004+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 115179520 unmapped: 37675008 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:54:20.893132+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 115179520 unmapped: 37675008 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:54:21.893255+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 115179520 unmapped: 37675008 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 106 handle_osd_map epochs [107,107], i have 106, src has [1,107]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:54:22.893414+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 115253248 unmapped: 37601280 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:54:23.893569+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1201625 data_alloc: 184549376 data_used: 13254656
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 115253248 unmapped: 37601280 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 107 heartbeat osd_stat(store_statfs(0x1b8072000/0x0/0x1bfc00000, data 0x3577e40/0x361b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:54:24.893743+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 115253248 unmapped: 37601280 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:54:25.893994+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 115253248 unmapped: 37601280 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:54:26.894304+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 115253248 unmapped: 37601280 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 107 heartbeat osd_stat(store_statfs(0x1b8072000/0x0/0x1bfc00000, data 0x3577e40/0x361b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:54:27.894449+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 115269632 unmapped: 37584896 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:54:28.894604+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1201625 data_alloc: 184549376 data_used: 13254656
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 115269632 unmapped: 37584896 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:54:29.895020+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 115269632 unmapped: 37584896 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:54:30.895221+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 115269632 unmapped: 37584896 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:54:31.895375+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 115269632 unmapped: 37584896 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:54:32.895562+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 107 heartbeat osd_stat(store_statfs(0x1b8072000/0x0/0x1bfc00000, data 0x3577e40/0x361b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 115269632 unmapped: 37584896 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:54:33.895767+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1201625 data_alloc: 184549376 data_used: 13254656
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 115269632 unmapped: 37584896 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:54:34.895930+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 115269632 unmapped: 37584896 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:54:35.896063+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 115277824 unmapped: 37576704 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:54:36.896195+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 115277824 unmapped: 37576704 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:54:37.896384+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 115277824 unmapped: 37576704 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 107 heartbeat osd_stat(store_statfs(0x1b8072000/0x0/0x1bfc00000, data 0x3577e40/0x361b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:54:38.896532+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1201625 data_alloc: 184549376 data_used: 13254656
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 115277824 unmapped: 37576704 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795aa15400
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 23.160015106s of 23.390987396s, submitted: 80
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:54:39.896691+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 107 heartbeat osd_stat(store_statfs(0x1b7de2000/0x0/0x1bfc00000, data 0x3806e79/0x38ac000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 107 ms_handle_reset con 0x55795aa15400 session 0x557959ab6d20
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x5579595cbc00
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 116080640 unmapped: 36773888 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 107 ms_handle_reset con 0x5579595cbc00 session 0x55795cca0b40
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:54:40.896927+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 116080640 unmapped: 36773888 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:54:41.897079+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 116080640 unmapped: 36773888 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:54:42.899953+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 116080640 unmapped: 36773888 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 107 heartbeat osd_stat(store_statfs(0x1b7bd8000/0x0/0x1bfc00000, data 0x3a10eb2/0x3ab6000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:54:43.900062+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 107 heartbeat osd_stat(store_statfs(0x1b7bd8000/0x0/0x1bfc00000, data 0x3a10eb2/0x3ab6000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1253744 data_alloc: 184549376 data_used: 13254656
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 116146176 unmapped: 36708352 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:54:44.900155+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 116146176 unmapped: 36708352 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:54:45.900278+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795bde3400
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 116146176 unmapped: 36708352 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 107 ms_handle_reset con 0x55795bde3400 session 0x55795cca0d20
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:54:46.900418+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 116187136 unmapped: 36667392 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795d3f4400
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:54:47.900561+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 115490816 unmapped: 37363712 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 107 heartbeat osd_stat(store_statfs(0x1b7bd7000/0x0/0x1bfc00000, data 0x3a10ed5/0x3ab7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:54:48.900692+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1263941 data_alloc: 184549376 data_used: 14331904
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 115490816 unmapped: 37363712 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:54:49.900824+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 115490816 unmapped: 37363712 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:54:50.900953+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 115490816 unmapped: 37363712 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:54:51.901089+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 115490816 unmapped: 37363712 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:54:52.901248+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 115490816 unmapped: 37363712 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 107 heartbeat osd_stat(store_statfs(0x1b7bd7000/0x0/0x1bfc00000, data 0x3a10ed5/0x3ab7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:54:53.901412+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1263941 data_alloc: 184549376 data_used: 14331904
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 115507200 unmapped: 37347328 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:54:54.901538+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 115507200 unmapped: 37347328 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:54:55.901668+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 16.147470474s of 16.350759506s, submitted: 44
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 122183680 unmapped: 30670848 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:54:56.901786+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 123191296 unmapped: 29663232 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:54:57.901930+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 107 heartbeat osd_stat(store_statfs(0x1b6d37000/0x0/0x1bfc00000, data 0x48a2ed5/0x4949000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [1])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 123240448 unmapped: 29614080 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 107 heartbeat osd_stat(store_statfs(0x1b6d24000/0x0/0x1bfc00000, data 0x48c3ed5/0x496a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [0,0,0,0,0,1])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:54:58.902055+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1390825 data_alloc: 184549376 data_used: 14229504
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 121282560 unmapped: 31571968 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:54:59.902152+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 121282560 unmapped: 31571968 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:55:00.902279+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 121282560 unmapped: 31571968 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:55:01.902417+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x5579595cb400
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 121298944 unmapped: 31555584 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:55:02.902610+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 107 handle_osd_map epochs [107,108], i have 107, src has [1,108]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 108 handle_osd_map epochs [108,108], i have 108, src has [1,108]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 108 ms_handle_reset con 0x5579595cb400 session 0x55795d37ad20
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 122355712 unmapped: 30498816 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:55:03.902749+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 108 heartbeat osd_stat(store_statfs(0x1b6cfe000/0x0/0x1bfc00000, data 0x48e7277/0x498f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1389084 data_alloc: 184549376 data_used: 14249984
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 122355712 unmapped: 30498816 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x5579595cb400
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:55:04.902935+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 108 handle_osd_map epochs [108,109], i have 108, src has [1,109]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 109 ms_handle_reset con 0x5579595cb400 session 0x55795d37a960
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 122372096 unmapped: 30482432 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795a305000
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 109 ms_handle_reset con 0x55795a305000 session 0x55795d43de00
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x557958fd9c00
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 109 ms_handle_reset con 0x557958fd9c00 session 0x55795d43c1e0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x5579595cbc00
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:55:05.903096+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 109 handle_osd_map epochs [109,110], i have 109, src has [1,110]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.468571663s of 10.219685555s, submitted: 205
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 122445824 unmapped: 30408704 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 110 ms_handle_reset con 0x5579595cbc00 session 0x55795d43d2c0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795bb05800
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:55:06.903240+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 122486784 unmapped: 30367744 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 110 ms_handle_reset con 0x55795d3f4400 session 0x55795cca10e0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:55:07.903409+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 110 handle_osd_map epochs [110,111], i have 110, src has [1,111]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 111 handle_osd_map epochs [111,111], i have 111, src has [1,111]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x557958fd9c00
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 122544128 unmapped: 30310400 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 111 ms_handle_reset con 0x55795bb05800 session 0x557959ab7680
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 111 ms_handle_reset con 0x557958fd9c00 session 0x55795a5b14a0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:55:08.903554+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1232911 data_alloc: 184549376 data_used: 11870208
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 111 heartbeat osd_stat(store_statfs(0x1b8060000/0x0/0x1bfc00000, data 0x3580d93/0x362c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 121954304 unmapped: 30900224 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x5579595cb400
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:55:09.903697+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 111 handle_osd_map epochs [111,112], i have 111, src has [1,112]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 112 ms_handle_reset con 0x5579595cb400 session 0x5579595aeb40
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 121978880 unmapped: 30875648 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x5579595cbc00
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 112 handle_osd_map epochs [112,112], i have 112, src has [1,112]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:55:10.903842+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795a305000
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 112 ms_handle_reset con 0x55795a305000 session 0x55795a251a40
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795ad10000
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 112 ms_handle_reset con 0x55795ad10000 session 0x5579595554a0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 112 handle_osd_map epochs [112,113], i have 112, src has [1,113]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _renew_subs
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _send_mon_message to mon.np0005626463 at v2:172.18.0.103:3300/0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 112 handle_osd_map epochs [113,113], i have 113, src has [1,113]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x557958fd9c00
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 113 ms_handle_reset con 0x557958fd9c00 session 0x557959ab74a0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 113 ms_handle_reset con 0x5579595cbc00 session 0x5579595af4a0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 122036224 unmapped: 30818304 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x5579595cb400
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 113 ms_handle_reset con 0x5579595cb400 session 0x55795a5b1a40
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795a305000
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795bb05800
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 113 handle_osd_map epochs [113,113], i have 113, src has [1,113]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 113 handle_osd_map epochs [113,113], i have 113, src has [1,113]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:55:11.904011+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 113 handle_osd_map epochs [114,114], i have 113, src has [1,114]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 113 handle_osd_map epochs [114,114], i have 114, src has [1,114]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 114 ms_handle_reset con 0x55795a305000 session 0x55795a5b1860
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 114 heartbeat osd_stat(store_statfs(0x1b7d87000/0x0/0x1bfc00000, data 0x3856ce4/0x3906000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 121724928 unmapped: 31129600 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:55:12.904147+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 114 handle_osd_map epochs [114,115], i have 114, src has [1,115]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 115 ms_handle_reset con 0x55795bb05800 session 0x557959ab6780
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 121782272 unmapped: 31072256 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x557958fd9c00
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:55:13.904300+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 115 handle_osd_map epochs [115,116], i have 115, src has [1,116]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 116 ms_handle_reset con 0x557958fd9c00 session 0x55795a5b1e00
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1357472 data_alloc: 184549376 data_used: 11894784
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 121798656 unmapped: 31055872 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x5579595cb400
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:55:14.904440+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 116 handle_osd_map epochs [116,117], i have 116, src has [1,117]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 117 ms_handle_reset con 0x5579595cb400 session 0x55795a632b40
Feb 23 10:10:39 np0005626463.localdomain podman[242954]: time="2026-02-23T10:10:39Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 121856000 unmapped: 30998528 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x5579595cbc00
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 117 heartbeat osd_stat(store_statfs(0x1b7573000/0x0/0x1bfc00000, data 0x4060e91/0x411a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:55:15.904584+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 117 handle_osd_map epochs [118,118], i have 117, src has [1,118]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 118 ms_handle_reset con 0x5579595cbc00 session 0x55795d3932c0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795a305000
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.136994362s of 10.120767593s, submitted: 251
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 121896960 unmapped: 30957568 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 118 handle_osd_map epochs [118,118], i have 118, src has [1,118]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 118 ms_handle_reset con 0x55795a305000 session 0x55795d3934a0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:55:16.904745+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 122068992 unmapped: 30785536 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x557958fd9000
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _renew_subs
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _send_mon_message to mon.np0005626463 at v2:172.18.0.103:3300/0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 118 handle_osd_map epochs [119,119], i have 118, src has [1,119]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _renew_subs
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _send_mon_message to mon.np0005626463 at v2:172.18.0.103:3300/0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 119 handle_osd_map epochs [119,119], i have 119, src has [1,119]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795bb68000
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:55:17.904896+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 119 heartbeat osd_stat(store_statfs(0x1b7540000/0x0/0x1bfc00000, data 0x408c22e/0x414d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [1])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 119 handle_osd_map epochs [120,120], i have 119, src has [1,120]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 119 handle_osd_map epochs [119,120], i have 120, src has [1,120]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 120 handle_osd_map epochs [120,120], i have 120, src has [1,120]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 122175488 unmapped: 30679040 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 120 ms_handle_reset con 0x557958fd9000 session 0x55795d393860
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x557958fd9c00
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:55:18.905089+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _renew_subs
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _send_mon_message to mon.np0005626463 at v2:172.18.0.103:3300/0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 120 handle_osd_map epochs [121,121], i have 120, src has [1,121]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1407532 data_alloc: 184549376 data_used: 13860864
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 122167296 unmapped: 30687232 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 121 ms_handle_reset con 0x557958fd9c00 session 0x5579587c0b40
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x5579595cb400
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:55:19.905234+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 121 handle_osd_map epochs [121,122], i have 121, src has [1,122]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 122314752 unmapped: 30539776 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 122 ms_handle_reset con 0x5579595cb400 session 0x55795ef4d0e0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x5579595cbc00
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:55:20.905365+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _renew_subs
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _send_mon_message to mon.np0005626463 at v2:172.18.0.103:3300/0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 122 handle_osd_map epochs [123,123], i have 122, src has [1,123]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 122363904 unmapped: 30490624 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 123 ms_handle_reset con 0x5579595cbc00 session 0x557958312960
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795d3f5c00
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:55:21.905530+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 122413056 unmapped: 30441472 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:55:22.906039+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _renew_subs
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _send_mon_message to mon.np0005626463 at v2:172.18.0.103:3300/0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 123 handle_osd_map epochs [124,124], i have 123, src has [1,124]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 124 handle_osd_map epochs [124,124], i have 124, src has [1,124]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 122454016 unmapped: 30400512 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 124 ms_handle_reset con 0x55795d3f5c00 session 0x55795ae1fe00
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795bd4cc00
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 124 heartbeat osd_stat(store_statfs(0x1b7534000/0x0/0x1bfc00000, data 0x40964e9/0x4159000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:55:23.906197+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _renew_subs
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _send_mon_message to mon.np0005626463 at v2:172.18.0.103:3300/0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 124 handle_osd_map epochs [125,125], i have 124, src has [1,125]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 124 handle_osd_map epochs [125,125], i have 125, src has [1,125]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1412161 data_alloc: 184549376 data_used: 13873152
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 122486784 unmapped: 30367744 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 125 ms_handle_reset con 0x55795bd4cc00 session 0x55795a6092c0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:55:24.906366+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 122486784 unmapped: 30367744 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:55:25.906509+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 122527744 unmapped: 30326784 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 125 ms_handle_reset con 0x55795bb68000 session 0x55795cca14a0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.225192070s of 10.296829224s, submitted: 259
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:55:26.906643+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 122527744 unmapped: 30326784 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _renew_subs
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _send_mon_message to mon.np0005626463 at v2:172.18.0.103:3300/0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 125 handle_osd_map epochs [126,126], i have 125, src has [1,126]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:55:27.906769+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 126 heartbeat osd_stat(store_statfs(0x1b7532000/0x0/0x1bfc00000, data 0x40980f7/0x415c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795bd3fc00
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 126459904 unmapped: 26394624 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:55:28.906921+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 126 heartbeat osd_stat(store_statfs(0x1b6c24000/0x0/0x1bfc00000, data 0x49a439b/0x4a6a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1478453 data_alloc: 184549376 data_used: 13889536
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 126140416 unmapped: 26714112 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 126 heartbeat osd_stat(store_statfs(0x1b6c24000/0x0/0x1bfc00000, data 0x49a439b/0x4a6a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:55:29.907059+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 125583360 unmapped: 27271168 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:55:30.907222+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 125558784 unmapped: 27295744 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:55:31.907383+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 126033920 unmapped: 26820608 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _renew_subs
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _send_mon_message to mon.np0005626463 at v2:172.18.0.103:3300/0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 126 handle_osd_map epochs [127,127], i have 126, src has [1,127]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:55:32.907558+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 125353984 unmapped: 27500544 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:55:33.907694+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1494353 data_alloc: 184549376 data_used: 13901824
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 125353984 unmapped: 27500544 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:55:34.907818+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 125353984 unmapped: 27500544 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 127 heartbeat osd_stat(store_statfs(0x1b6b93000/0x0/0x1bfc00000, data 0x4a32623/0x4afa000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:55:35.907945+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 125493248 unmapped: 27361280 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 127 heartbeat osd_stat(store_statfs(0x1b6b73000/0x0/0x1bfc00000, data 0x4a53623/0x4b1b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:55:36.908078+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 125493248 unmapped: 27361280 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:55:37.908232+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 127 heartbeat osd_stat(store_statfs(0x1b6b73000/0x0/0x1bfc00000, data 0x4a53623/0x4b1b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 127 ms_handle_reset con 0x55795bd3fc00 session 0x557959555e00
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.177900314s of 11.654159546s, submitted: 145
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 125493248 unmapped: 27361280 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795d1ccc00
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:55:38.908359+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 127 ms_handle_reset con 0x55795d1ccc00 session 0x55795a5b45a0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1311993 data_alloc: 184549376 data_used: 10752000
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 123117568 unmapped: 29736960 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:55:39.908496+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 123117568 unmapped: 29736960 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:55:40.908645+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 123117568 unmapped: 29736960 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:55:41.908788+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 123117568 unmapped: 29736960 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:55:42.908965+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 127 heartbeat osd_stat(store_statfs(0x1b8022000/0x0/0x1bfc00000, data 0x35a4600/0x366b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 123117568 unmapped: 29736960 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:55:43.909103+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1311993 data_alloc: 184549376 data_used: 10752000
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 123117568 unmapped: 29736960 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:55:44.909272+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 123117568 unmapped: 29736960 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 127 heartbeat osd_stat(store_statfs(0x1b8022000/0x0/0x1bfc00000, data 0x35a4600/0x366b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:55:45.909409+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 123117568 unmapped: 29736960 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:55:46.909606+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 123117568 unmapped: 29736960 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:55:47.909754+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 123117568 unmapped: 29736960 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:55:48.909951+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1311993 data_alloc: 184549376 data_used: 10752000
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 127 heartbeat osd_stat(store_statfs(0x1b8022000/0x0/0x1bfc00000, data 0x35a4600/0x366b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 123117568 unmapped: 29736960 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 127 heartbeat osd_stat(store_statfs(0x1b8022000/0x0/0x1bfc00000, data 0x35a4600/0x366b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:55:49.910083+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 127 heartbeat osd_stat(store_statfs(0x1b8022000/0x0/0x1bfc00000, data 0x35a4600/0x366b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 123117568 unmapped: 29736960 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:55:50.910233+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 127 heartbeat osd_stat(store_statfs(0x1b8022000/0x0/0x1bfc00000, data 0x35a4600/0x366b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 123117568 unmapped: 29736960 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:55:51.910351+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 123117568 unmapped: 29736960 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:55:52.910543+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 123117568 unmapped: 29736960 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:55:53.910679+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1311993 data_alloc: 184549376 data_used: 10752000
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 123117568 unmapped: 29736960 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:55:54.910840+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 122634240 unmapped: 30220288 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:55:55.910973+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 122634240 unmapped: 30220288 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:55:56.911102+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 127 heartbeat osd_stat(store_statfs(0x1b8022000/0x0/0x1bfc00000, data 0x35a4600/0x366b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 122634240 unmapped: 30220288 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:55:57.911266+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 122634240 unmapped: 30220288 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:55:58.911395+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1311993 data_alloc: 184549376 data_used: 10752000
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 122634240 unmapped: 30220288 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:55:59.911538+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 122634240 unmapped: 30220288 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:56:00.911658+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 122634240 unmapped: 30220288 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:56:01.911779+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 122634240 unmapped: 30220288 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 127 heartbeat osd_stat(store_statfs(0x1b8022000/0x0/0x1bfc00000, data 0x35a4600/0x366b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:56:02.911965+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 122650624 unmapped: 30203904 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:56:03.912111+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1311993 data_alloc: 184549376 data_used: 10752000
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 122650624 unmapped: 30203904 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:56:04.912243+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 122650624 unmapped: 30203904 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 127 heartbeat osd_stat(store_statfs(0x1b8022000/0x0/0x1bfc00000, data 0x35a4600/0x366b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:56:05.912382+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 122650624 unmapped: 30203904 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:56:06.912550+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 122650624 unmapped: 30203904 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:56:07.912693+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 127 heartbeat osd_stat(store_statfs(0x1b8022000/0x0/0x1bfc00000, data 0x35a4600/0x366b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 122650624 unmapped: 30203904 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:56:08.912839+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1311993 data_alloc: 184549376 data_used: 10752000
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 122650624 unmapped: 30203904 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:56:09.914394+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 122650624 unmapped: 30203904 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 127 heartbeat osd_stat(store_statfs(0x1b8022000/0x0/0x1bfc00000, data 0x35a4600/0x366b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:56:10.914572+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 127 heartbeat osd_stat(store_statfs(0x1b8022000/0x0/0x1bfc00000, data 0x35a4600/0x366b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 122650624 unmapped: 30203904 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:56:11.914711+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 122650624 unmapped: 30203904 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:56:12.914923+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 122650624 unmapped: 30203904 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:56:13.915075+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1311993 data_alloc: 184549376 data_used: 10752000
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 122650624 unmapped: 30203904 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 127 heartbeat osd_stat(store_statfs(0x1b8022000/0x0/0x1bfc00000, data 0x35a4600/0x366b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:56:14.915248+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 122650624 unmapped: 30203904 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:56:15.915395+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 122650624 unmapped: 30203904 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:56:16.915541+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 122650624 unmapped: 30203904 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:56:17.915675+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 122650624 unmapped: 30203904 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:56:18.915858+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1311993 data_alloc: 184549376 data_used: 10752000
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 122650624 unmapped: 30203904 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:56:19.916079+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 127 heartbeat osd_stat(store_statfs(0x1b8022000/0x0/0x1bfc00000, data 0x35a4600/0x366b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 122650624 unmapped: 30203904 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:56:20.916249+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 122650624 unmapped: 30203904 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:56:21.916385+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 122650624 unmapped: 30203904 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:56:22.916557+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 127 heartbeat osd_stat(store_statfs(0x1b8022000/0x0/0x1bfc00000, data 0x35a4600/0x366b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 122650624 unmapped: 30203904 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:56:23.916704+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1311993 data_alloc: 184549376 data_used: 10752000
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 122650624 unmapped: 30203904 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:56:24.916834+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 122650624 unmapped: 30203904 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:56:25.916947+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 122650624 unmapped: 30203904 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:56:26.917110+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 122650624 unmapped: 30203904 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 127 heartbeat osd_stat(store_statfs(0x1b8022000/0x0/0x1bfc00000, data 0x35a4600/0x366b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:56:27.917262+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 122650624 unmapped: 30203904 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:56:28.917422+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 127 heartbeat osd_stat(store_statfs(0x1b8022000/0x0/0x1bfc00000, data 0x35a4600/0x366b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1311993 data_alloc: 184549376 data_used: 10752000
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 122650624 unmapped: 30203904 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:56:29.917601+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 122650624 unmapped: 30203904 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:56:30.917746+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 122650624 unmapped: 30203904 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:56:31.917937+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 122650624 unmapped: 30203904 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:56:32.918153+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 122650624 unmapped: 30203904 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:56:33.918322+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 127 heartbeat osd_stat(store_statfs(0x1b8022000/0x0/0x1bfc00000, data 0x35a4600/0x366b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1311993 data_alloc: 184549376 data_used: 10752000
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 122658816 unmapped: 30195712 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:56:34.918466+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 122658816 unmapped: 30195712 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 127 heartbeat osd_stat(store_statfs(0x1b8022000/0x0/0x1bfc00000, data 0x35a4600/0x366b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:56:35.918621+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 122658816 unmapped: 30195712 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:56:36.918811+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 122658816 unmapped: 30195712 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:56:37.918981+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 122658816 unmapped: 30195712 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:56:38.919145+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1311993 data_alloc: 184549376 data_used: 10752000
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 122658816 unmapped: 30195712 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 127 heartbeat osd_stat(store_statfs(0x1b8022000/0x0/0x1bfc00000, data 0x35a4600/0x366b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:56:39.919290+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 122658816 unmapped: 30195712 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:56:40.919417+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 127 heartbeat osd_stat(store_statfs(0x1b8022000/0x0/0x1bfc00000, data 0x35a4600/0x366b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 122658816 unmapped: 30195712 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:56:41.919565+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 122658816 unmapped: 30195712 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:56:42.919997+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 122658816 unmapped: 30195712 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:56:43.920193+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1311993 data_alloc: 184549376 data_used: 10752000
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 122658816 unmapped: 30195712 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:56:44.920638+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 122658816 unmapped: 30195712 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:56:45.920813+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 122658816 unmapped: 30195712 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 127 heartbeat osd_stat(store_statfs(0x1b8022000/0x0/0x1bfc00000, data 0x35a4600/0x366b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:56:46.920969+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 122658816 unmapped: 30195712 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:56:47.921114+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 122658816 unmapped: 30195712 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:56:48.921267+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1311993 data_alloc: 184549376 data_used: 10752000
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 122658816 unmapped: 30195712 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:56:49.921386+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 122658816 unmapped: 30195712 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:56:50.921524+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 127 heartbeat osd_stat(store_statfs(0x1b8022000/0x0/0x1bfc00000, data 0x35a4600/0x366b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 122658816 unmapped: 30195712 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:56:51.921669+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 127 heartbeat osd_stat(store_statfs(0x1b8022000/0x0/0x1bfc00000, data 0x35a4600/0x366b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 122658816 unmapped: 30195712 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:56:52.921894+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 122658816 unmapped: 30195712 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:56:53.922048+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1311993 data_alloc: 184549376 data_used: 10752000
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 122658816 unmapped: 30195712 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795ba70800
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 76.240913391s of 76.426788330s, submitted: 45
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:56:54.922212+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 127 handle_osd_map epochs [127,128], i have 127, src has [1,128]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _renew_subs
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _send_mon_message to mon.np0005626463 at v2:172.18.0.103:3300/0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 128 handle_osd_map epochs [128,128], i have 128, src has [1,128]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 128 ms_handle_reset con 0x55795ba70800 session 0x557959ab7a40
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 128 heartbeat osd_stat(store_statfs(0x1b8022000/0x0/0x1bfc00000, data 0x35a4623/0x366c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 122691584 unmapped: 30162944 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:56:55.922362+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 128 heartbeat osd_stat(store_statfs(0x1b801d000/0x0/0x1bfc00000, data 0x35a69c5/0x3670000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 122691584 unmapped: 30162944 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:56:56.922511+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 128 heartbeat osd_stat(store_statfs(0x1b801d000/0x0/0x1bfc00000, data 0x35a69c5/0x3670000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 122691584 unmapped: 30162944 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795aca5800
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:56:57.922603+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 122699776 unmapped: 30154752 heap: 152854528 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:56:58.922752+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1376460 data_alloc: 184549376 data_used: 10764288
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 122716160 unmapped: 38535168 heap: 161251328 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:56:59.922899+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 122716160 unmapped: 38535168 heap: 161251328 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:57:00.923044+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 131162112 unmapped: 30089216 heap: 161251328 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 128 heartbeat osd_stat(store_statfs(0x1b701d000/0x0/0x1bfc00000, data 0x45a69d5/0x4671000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:57:01.923186+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 122781696 unmapped: 38469632 heap: 161251328 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:57:02.923327+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 122781696 unmapped: 38469632 heap: 161251328 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:57:03.923470+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 128 handle_osd_map epochs [128,129], i have 128, src has [1,129]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 129 ms_handle_reset con 0x55795aca5800 session 0x55795a58ba40
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1599830 data_alloc: 184549376 data_used: 10776576
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 122789888 unmapped: 38461440 heap: 161251328 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:57:04.923563+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795ba70800
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.177377701s of 10.422365189s, submitted: 38
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 129 handle_osd_map epochs [129,130], i have 129, src has [1,130]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 122863616 unmapped: 38387712 heap: 161251328 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 130 handle_osd_map epochs [129,130], i have 130, src has [1,130]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 130 ms_handle_reset con 0x55795ba70800 session 0x55795d1230e0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:57:05.923686+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 130 heartbeat osd_stat(store_statfs(0x1b5818000/0x0/0x1bfc00000, data 0x5da8d77/0x5e75000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795bb68000
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 130 handle_osd_map epochs [130,131], i have 130, src has [1,131]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 122929152 unmapped: 38322176 heap: 161251328 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 131 ms_handle_reset con 0x55795bb68000 session 0x55795d122d20
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:57:06.923783+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 122920960 unmapped: 38330368 heap: 161251328 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:57:07.923949+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 122920960 unmapped: 38330368 heap: 161251328 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:57:08.924111+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1336531 data_alloc: 184549376 data_used: 10788864
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 122920960 unmapped: 38330368 heap: 161251328 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:57:09.924269+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 122920960 unmapped: 38330368 heap: 161251328 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:57:10.924396+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 122920960 unmapped: 38330368 heap: 161251328 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 131 heartbeat osd_stat(store_statfs(0x1b8010000/0x0/0x1bfc00000, data 0x35ad530/0x367b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:57:11.924597+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 122920960 unmapped: 38330368 heap: 161251328 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _renew_subs
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _send_mon_message to mon.np0005626463 at v2:172.18.0.103:3300/0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 131 handle_osd_map epochs [132,132], i have 131, src has [1,132]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:57:12.924767+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 122929152 unmapped: 38322176 heap: 161251328 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:57:13.924929+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 132 handle_osd_map epochs [132,133], i have 132, src has [1,133]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1342343 data_alloc: 184549376 data_used: 10788864
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 123994112 unmapped: 37257216 heap: 161251328 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 133 heartbeat osd_stat(store_statfs(0x1b800e000/0x0/0x1bfc00000, data 0x35af7b8/0x367f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:57:14.925124+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 133 heartbeat osd_stat(store_statfs(0x1b8009000/0x0/0x1bfc00000, data 0x35b1b5a/0x3683000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 133 handle_osd_map epochs [134,134], i have 133, src has [1,134]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.742342949s of 10.217072487s, submitted: 123
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 134 handle_osd_map epochs [134,134], i have 134, src has [1,134]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 124002304 unmapped: 37249024 heap: 161251328 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:57:15.925268+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 124002304 unmapped: 37249024 heap: 161251328 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:57:16.925451+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 134 handle_osd_map epochs [135,135], i have 134, src has [1,135]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 135 handle_osd_map epochs [134,135], i have 135, src has [1,135]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 124043264 unmapped: 37208064 heap: 161251328 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:57:17.925604+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795a602800
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 135 ms_handle_reset con 0x55795a602800 session 0x55795d123c20
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 124059648 unmapped: 37191680 heap: 161251328 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 135 heartbeat osd_stat(store_statfs(0x1b8001000/0x0/0x1bfc00000, data 0x35b6370/0x368c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:57:18.925750+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 135 handle_osd_map epochs [135,136], i have 135, src has [1,136]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 136 handle_osd_map epochs [136,136], i have 136, src has [1,136]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1355674 data_alloc: 184549376 data_used: 10797056
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 124133376 unmapped: 37117952 heap: 161251328 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:57:19.925917+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 136 handle_osd_map epochs [136,136], i have 136, src has [1,136]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 136 heartbeat osd_stat(store_statfs(0x1b7fff000/0x0/0x1bfc00000, data 0x35b8714/0x368f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 124116992 unmapped: 37134336 heap: 161251328 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:57:20.926106+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 124116992 unmapped: 37134336 heap: 161251328 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:57:21.926245+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 136 handle_osd_map epochs [136,137], i have 136, src has [1,137]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 124133376 unmapped: 37117952 heap: 161251328 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:57:22.926400+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 124133376 unmapped: 37117952 heap: 161251328 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:57:23.926555+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 137 heartbeat osd_stat(store_statfs(0x1b7ffa000/0x0/0x1bfc00000, data 0x35ba9b8/0x3693000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1356838 data_alloc: 184549376 data_used: 10813440
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 124149760 unmapped: 37101568 heap: 161251328 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:57:24.926692+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 137 heartbeat osd_stat(store_statfs(0x1b7ffa000/0x0/0x1bfc00000, data 0x35ba9b8/0x3693000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 124149760 unmapped: 37101568 heap: 161251328 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:57:25.926863+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.434501648s of 10.778075218s, submitted: 79
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 124157952 unmapped: 37093376 heap: 161251328 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:57:26.927348+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 124157952 unmapped: 37093376 heap: 161251328 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:57:27.927519+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 137 handle_osd_map epochs [137,138], i have 137, src has [1,138]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 138 handle_osd_map epochs [138,138], i have 138, src has [1,138]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 138 handle_osd_map epochs [138,138], i have 138, src has [1,138]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 138 handle_osd_map epochs [138,138], i have 138, src has [1,138]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795bd4d800
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 138 ms_handle_reset con 0x55795bd4d800 session 0x55795d1225a0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 124166144 unmapped: 37085184 heap: 161251328 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:57:28.927942+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x5579585a0c00
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 138 ms_handle_reset con 0x5579585a0c00 session 0x55795cca12c0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795a602800
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1365580 data_alloc: 184549376 data_used: 10825728
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 138 heartbeat osd_stat(store_statfs(0x1b7ff4000/0x0/0x1bfc00000, data 0x35bcd97/0x3699000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 124207104 unmapped: 37044224 heap: 161251328 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 138 ms_handle_reset con 0x55795a602800 session 0x557958303c20
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:57:29.928114+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 124223488 unmapped: 37027840 heap: 161251328 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:57:30.928503+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 124223488 unmapped: 37027840 heap: 161251328 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:57:31.928859+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 124223488 unmapped: 37027840 heap: 161251328 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _renew_subs
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _send_mon_message to mon.np0005626463 at v2:172.18.0.103:3300/0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 138 handle_osd_map epochs [139,139], i have 138, src has [1,139]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:57:32.929100+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 124239872 unmapped: 37011456 heap: 161251328 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 139 heartbeat osd_stat(store_statfs(0x1b7ff7000/0x0/0x1bfc00000, data 0x35bcd76/0x3697000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:57:33.929417+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1367749 data_alloc: 184549376 data_used: 10838016
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 124239872 unmapped: 37011456 heap: 161251328 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:57:34.929710+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 124239872 unmapped: 37011456 heap: 161251328 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:57:35.929906+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 124239872 unmapped: 37011456 heap: 161251328 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:57:36.930155+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 124239872 unmapped: 37011456 heap: 161251328 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:57:37.930327+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 139 heartbeat osd_stat(store_statfs(0x1b7ff2000/0x0/0x1bfc00000, data 0x35beffe/0x369b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 124239872 unmapped: 37011456 heap: 161251328 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:57:38.930565+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795ad11800
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.712350845s of 12.917268753s, submitted: 55
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 139 ms_handle_reset con 0x55795ad11800 session 0x55795a287a40
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1370481 data_alloc: 184549376 data_used: 10838016
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 124239872 unmapped: 37011456 heap: 161251328 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:57:39.930710+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795bb04800
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 139 handle_osd_map epochs [139,140], i have 139, src has [1,140]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 140 handle_osd_map epochs [140,140], i have 140, src has [1,140]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 140 ms_handle_reset con 0x55795bb04800 session 0x55795bd474a0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795d3f4000
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 140 handle_osd_map epochs [140,140], i have 140, src has [1,140]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 140 ms_handle_reset con 0x55795d3f4000 session 0x55795a2512c0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 124239872 unmapped: 37011456 heap: 161251328 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:57:40.930859+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795bf8bc00
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 140 ms_handle_reset con 0x55795bf8bc00 session 0x557957c21e00
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795a602800
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 124239872 unmapped: 37011456 heap: 161251328 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:57:41.931012+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 140 heartbeat osd_stat(store_statfs(0x1b7fed000/0x0/0x1bfc00000, data 0x35c1413/0x36a1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 140 handle_osd_map epochs [140,141], i have 140, src has [1,141]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 141 handle_osd_map epochs [141,141], i have 141, src has [1,141]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 141 ms_handle_reset con 0x55795a602800 session 0x55795a5b0000
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 124239872 unmapped: 37011456 heap: 161251328 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:57:42.931154+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 141 handle_osd_map epochs [141,141], i have 141, src has [1,141]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795a305000
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 141 ms_handle_reset con 0x55795a305000 session 0x5579595afe00
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 124239872 unmapped: 37011456 heap: 161251328 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:57:43.931316+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795a9b4000
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 141 ms_handle_reset con 0x55795a9b4000 session 0x55795a609c20
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1376282 data_alloc: 184549376 data_used: 10862592
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 124248064 unmapped: 37003264 heap: 161251328 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:57:44.931515+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 124248064 unmapped: 37003264 heap: 161251328 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:57:45.931703+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 124248064 unmapped: 37003264 heap: 161251328 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:57:46.931858+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 124248064 unmapped: 37003264 heap: 161251328 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 141 heartbeat osd_stat(store_statfs(0x1b7feb000/0x0/0x1bfc00000, data 0x35c3796/0x36a3000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:57:47.932041+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 124248064 unmapped: 37003264 heap: 161251328 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:57:48.932213+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1376282 data_alloc: 184549376 data_used: 10862592
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 124248064 unmapped: 37003264 heap: 161251328 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:57:49.932361+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 124248064 unmapped: 37003264 heap: 161251328 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:57:50.932546+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 124248064 unmapped: 37003264 heap: 161251328 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:57:51.932691+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 141 handle_osd_map epochs [142,142], i have 141, src has [1,142]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.392381668s of 13.579803467s, submitted: 45
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 124256256 unmapped: 36995072 heap: 161251328 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:57:52.932909+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 142 heartbeat osd_stat(store_statfs(0x1b7fe6000/0x0/0x1bfc00000, data 0x35c5a1e/0x36a7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 124256256 unmapped: 36995072 heap: 161251328 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:57:53.933102+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1380484 data_alloc: 184549376 data_used: 10874880
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 124256256 unmapped: 36995072 heap: 161251328 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 142 heartbeat osd_stat(store_statfs(0x1b7fe6000/0x0/0x1bfc00000, data 0x35c5a1e/0x36a7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:57:54.933225+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 124256256 unmapped: 36995072 heap: 161251328 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:57:55.933339+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 124256256 unmapped: 36995072 heap: 161251328 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:57:56.933478+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 124256256 unmapped: 36995072 heap: 161251328 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:57:57.933645+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 124256256 unmapped: 36995072 heap: 161251328 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:57:58.933839+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 142 heartbeat osd_stat(store_statfs(0x1b7fe6000/0x0/0x1bfc00000, data 0x35c5a1e/0x36a7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1380484 data_alloc: 184549376 data_used: 10874880
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:57:59.934051+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 124256256 unmapped: 36995072 heap: 161251328 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:58:00.934549+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 124256256 unmapped: 36995072 heap: 161251328 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:58:01.934813+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 124256256 unmapped: 36995072 heap: 161251328 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:58:02.935018+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 124256256 unmapped: 36995072 heap: 161251328 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:58:03.935172+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 124264448 unmapped: 36986880 heap: 161251328 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.401209831s of 11.420412064s, submitted: 14
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 142 heartbeat osd_stat(store_statfs(0x1b7fe6000/0x0/0x1bfc00000, data 0x35c5a1e/0x36a7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1381386 data_alloc: 184549376 data_used: 10874880
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 142 heartbeat osd_stat(store_statfs(0x1b7fe6000/0x0/0x1bfc00000, data 0x35c5a80/0x36a8000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:58:04.935467+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 124272640 unmapped: 36978688 heap: 161251328 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:58:05.935822+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 124272640 unmapped: 36978688 heap: 161251328 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:58:06.935938+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 125345792 unmapped: 35905536 heap: 161251328 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:58:07.936260+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 125345792 unmapped: 35905536 heap: 161251328 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 142 heartbeat osd_stat(store_statfs(0x1b7be7000/0x0/0x1bfc00000, data 0x35c5a1e/0x36a7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:58:08.936396+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 125345792 unmapped: 35905536 heap: 161251328 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1380513 data_alloc: 184549376 data_used: 10874880
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:58:09.936532+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 125345792 unmapped: 35905536 heap: 161251328 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:58:10.938121+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 125345792 unmapped: 35905536 heap: 161251328 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:58:11.938346+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 125345792 unmapped: 35905536 heap: 161251328 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:58:12.938566+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 125345792 unmapped: 35905536 heap: 161251328 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:58:13.938716+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 125345792 unmapped: 35905536 heap: 161251328 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1380513 data_alloc: 184549376 data_used: 10874880
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 142 heartbeat osd_stat(store_statfs(0x1b7be7000/0x0/0x1bfc00000, data 0x35c5a1e/0x36a7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:58:14.938949+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 125345792 unmapped: 35905536 heap: 161251328 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:58:15.939109+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 125345792 unmapped: 35905536 heap: 161251328 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:58:16.939256+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 125345792 unmapped: 35905536 heap: 161251328 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x5579585a0800
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 142 ms_handle_reset con 0x5579585a0800 session 0x557958300d20
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795a9b5000
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 142 ms_handle_reset con 0x55795a9b5000 session 0x55795a637860
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x5579585a0800
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795a305000
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:58:17.939395+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 125345792 unmapped: 35905536 heap: 161251328 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 14.308190346s of 14.351776123s, submitted: 10
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 142 ms_handle_reset con 0x55795a305000 session 0x5579587c03c0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 142 ms_handle_reset con 0x5579585a0800 session 0x55795958a960
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:58:18.939596+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 124297216 unmapped: 36954112 heap: 161251328 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 142 heartbeat osd_stat(store_statfs(0x1b7be6000/0x0/0x1bfc00000, data 0x35c5a80/0x36a8000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1382310 data_alloc: 184549376 data_used: 10940416
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:58:19.939741+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 124313600 unmapped: 36937728 heap: 161251328 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795a602800
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 142 ms_handle_reset con 0x55795a602800 session 0x55795d37b0e0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:58:20.939994+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 124313600 unmapped: 36937728 heap: 161251328 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795a9b4000
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 142 ms_handle_reset con 0x55795a9b4000 session 0x557958312f00
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:58:21.940197+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 124329984 unmapped: 36921344 heap: 161251328 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795904bc00
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795e472000
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 142 ms_handle_reset con 0x55795e472000 session 0x55795a286b40
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 142 ms_handle_reset con 0x55795904bc00 session 0x5579583123c0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x5579585a0800
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:58:22.940361+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 124346368 unmapped: 36904960 heap: 161251328 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 142 ms_handle_reset con 0x5579585a0800 session 0x55795a58b2c0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:58:23.940501+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 124346368 unmapped: 36904960 heap: 161251328 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 142 heartbeat osd_stat(store_statfs(0x1b7be4000/0x0/0x1bfc00000, data 0x35c5a2f/0x36a8000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1385467 data_alloc: 184549376 data_used: 10940416
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:58:24.940693+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795a305000
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 124354560 unmapped: 36896768 heap: 161251328 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 142 ms_handle_reset con 0x55795a305000 session 0x55795a58ab40
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795a602800
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 142 ms_handle_reset con 0x55795a602800 session 0x55795a58af00
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:58:25.940858+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795a9b4000
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795a9b5c00
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 124403712 unmapped: 36847616 heap: 161251328 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 142 ms_handle_reset con 0x55795a9b4000 session 0x55795a58a3c0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 142 ms_handle_reset con 0x55795a9b5c00 session 0x55795d37a960
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x5579585a0800
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 142 ms_handle_reset con 0x5579585a0800 session 0x55795a58b0e0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:58:26.941040+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 124411904 unmapped: 36839424 heap: 161251328 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 142 heartbeat osd_stat(store_statfs(0x1b66cd000/0x0/0x1bfc00000, data 0x4adda3f/0x4bc1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:58:27.941360+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 124428288 unmapped: 36823040 heap: 161251328 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.423421860s of 10.002863884s, submitted: 109
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795a5ef800
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795ba70800
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 142 ms_handle_reset con 0x55795ba70800 session 0x5579584285a0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 142 ms_handle_reset con 0x55795a5ef800 session 0x5579585d3860
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795bde3800
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 142 heartbeat osd_stat(store_statfs(0x1b66cd000/0x0/0x1bfc00000, data 0x4adda3f/0x4bc1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:58:28.941520+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 124444672 unmapped: 36806656 heap: 161251328 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 142 ms_handle_reset con 0x55795bde3800 session 0x5579585d30e0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1463756 data_alloc: 184549376 data_used: 10940416
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:58:29.941718+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 124452864 unmapped: 36798464 heap: 161251328 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x5579585a0800
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:58:30.941939+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 124452864 unmapped: 36798464 heap: 161251328 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 142 handle_osd_map epochs [142,143], i have 142, src has [1,143]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 8400.1 total, 600.0 interval
                                                          Cumulative writes: 10K writes, 40K keys, 10K commit groups, 1.0 writes per commit group, ingest: 0.04 GB, 0.00 MB/s
                                                          Cumulative WAL: 10K writes, 2868 syncs, 3.58 writes per sync, written: 0.04 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 4956 writes, 16K keys, 4956 commit groups, 1.0 writes per commit group, ingest: 15.78 MB, 0.03 MB/s
                                                          Interval WAL: 4955 writes, 2127 syncs, 2.33 writes per sync, written: 0.02 GB, 0.03 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:58:31.942106+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 124461056 unmapped: 36790272 heap: 161251328 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _renew_subs
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _send_mon_message to mon.np0005626463 at v2:172.18.0.103:3300/0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 143 handle_osd_map epochs [144,144], i have 143, src has [1,144]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.3] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.6] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1e] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.c] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.14] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1b] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1c] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.2] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.f] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.18] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.11] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.16] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 144 ms_handle_reset con 0x5579585a0800 session 0x55795a251680
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 144 heartbeat osd_stat(store_statfs(0x1b7270000/0x0/0x1bfc00000, data 0x3f37dd8/0x401d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain podman[242954]: @ - - [23/Feb/2026:10:10:39 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 157081 "" "Go-http-client/1.1"
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:58:32.942294+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 124469248 unmapped: 36782080 heap: 161251328 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 144 heartbeat osd_stat(store_statfs(0x1b726b000/0x0/0x1bfc00000, data 0x3f3a23b/0x4022000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 144 handle_osd_map epochs [145,145], i have 144, src has [1,145]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795acc5000
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x557958f58c00
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:58:33.942449+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 145 ms_handle_reset con 0x557958f58c00 session 0x55795a609e00
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 125288448 unmapped: 35962880 heap: 161251328 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 145 handle_osd_map epochs [145,146], i have 145, src has [1,146]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 146 ms_handle_reset con 0x55795acc5000 session 0x55795a21f4a0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1527833 data_alloc: 184549376 data_used: 10952704
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:58:34.942617+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 124780544 unmapped: 36470784 heap: 161251328 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:58:35.942841+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 124805120 unmapped: 36446208 heap: 161251328 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:58:36.943035+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x5579585a0400
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 146 ms_handle_reset con 0x5579585a0400 session 0x5579595cda40
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795bd4fc00
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 124813312 unmapped: 36438016 heap: 161251328 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795aca4800
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 146 heartbeat osd_stat(store_statfs(0x1b6d70000/0x0/0x1bfc00000, data 0x44309a7/0x451b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 146 ms_handle_reset con 0x55795aca4800 session 0x55795a6323c0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:58:37.943194+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 124084224 unmapped: 37167104 heap: 161251328 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x5579585a0400
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _renew_subs
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _send_mon_message to mon.np0005626463 at v2:172.18.0.103:3300/0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 146 handle_osd_map epochs [147,147], i have 146, src has [1,147]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.113096237s of 10.115760803s, submitted: 116
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 147 ms_handle_reset con 0x5579585a0400 session 0x55795a251680
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 147 handle_osd_map epochs [147,147], i have 147, src has [1,147]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 147 ms_handle_reset con 0x55795bd4fc00 session 0x5579595cd2c0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:58:38.943362+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 147 handle_osd_map epochs [147,147], i have 147, src has [1,147]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 124092416 unmapped: 37158912 heap: 161251328 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1616028 data_alloc: 184549376 data_used: 10964992
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:58:39.943494+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 124092416 unmapped: 37158912 heap: 161251328 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 147 handle_osd_map epochs [147,148], i have 147, src has [1,148]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 148 handle_osd_map epochs [148,148], i have 148, src has [1,148]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 148 handle_osd_map epochs [147,148], i have 148, src has [1,148]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 148 handle_osd_map epochs [148,148], i have 148, src has [1,148]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 148 handle_osd_map epochs [148,148], i have 148, src has [1,148]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:58:40.943655+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 124108800 unmapped: 37142528 heap: 161251328 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x5579585a0800
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 148 heartbeat osd_stat(store_statfs(0x1b6c54000/0x0/0x1bfc00000, data 0x44351d9/0x4524000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,3,4,5] op hist [0,0,1])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:58:41.943809+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _renew_subs
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _send_mon_message to mon.np0005626463 at v2:172.18.0.103:3300/0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 148 handle_osd_map epochs [149,149], i have 148, src has [1,149]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.c] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1e] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.c] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1e] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.11] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.11] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.16] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 124141568 unmapped: 37109760 heap: 161251328 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.3] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.6] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.14] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.3] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.16] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1b] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1c] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.2] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.f] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.18] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1b] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1c] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.2] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.f] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.6] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.14] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.18] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 149 handle_osd_map epochs [149,149], i have 149, src has [1,149]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 149 ms_handle_reset con 0x5579585a0800 session 0x55795ef4d680
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:58:42.944026+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 124166144 unmapped: 37085184 heap: 161251328 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:58:43.944160+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 124166144 unmapped: 37085184 heap: 161251328 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 149 handle_osd_map epochs [150,150], i have 149, src has [1,150]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _send_mon_message to mon.np0005626463 at v2:172.18.0.103:3300/0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 150 handle_osd_map epochs [150,150], i have 150, src has [1,150]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 150 handle_osd_map epochs [150,150], i have 150, src has [1,150]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795adb9400
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 150 handle_osd_map epochs [149,150], i have 150, src has [1,150]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 150 ms_handle_reset con 0x55795adb9400 session 0x55795bd46960
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1439008 data_alloc: 184549376 data_used: 10981376
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:58:44.944332+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795a5ee400
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 150 ms_handle_reset con 0x55795a5ee400 session 0x5579587c0960
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 124190720 unmapped: 37060608 heap: 161251328 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:58:45.944473+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 124190720 unmapped: 37060608 heap: 161251328 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 150 heartbeat osd_stat(store_statfs(0x1b7bc3000/0x0/0x1bfc00000, data 0x35d79e3/0x36ca000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:58:46.944689+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 124190720 unmapped: 37060608 heap: 161251328 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _renew_subs
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _send_mon_message to mon.np0005626463 at v2:172.18.0.103:3300/0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 150 handle_osd_map epochs [151,151], i have 150, src has [1,151]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795d3f5c00
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 151 handle_osd_map epochs [151,151], i have 151, src has [1,151]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 151 ms_handle_reset con 0x55795d3f5c00 session 0x5579587c1a40
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795bde3c00
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 151 ms_handle_reset con 0x55795bde3c00 session 0x5579587c1680
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:58:47.944836+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 124215296 unmapped: 37036032 heap: 161251328 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:58:48.945000+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.076016426s of 10.624968529s, submitted: 171
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 124215296 unmapped: 37036032 heap: 161251328 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1450206 data_alloc: 184549376 data_used: 10985472
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:58:49.945167+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 124215296 unmapped: 37036032 heap: 161251328 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795898c000
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 151 ms_handle_reset con 0x55795898c000 session 0x55795a634000
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795bb69c00
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 151 ms_handle_reset con 0x55795bb69c00 session 0x55795a5b41e0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795a305c00
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 151 ms_handle_reset con 0x55795a305c00 session 0x55795d37a780
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:58:50.945317+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 124264448 unmapped: 36986880 heap: 161251328 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 151 heartbeat osd_stat(store_statfs(0x1b7bbe000/0x0/0x1bfc00000, data 0x35d9d16/0x36d0000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:58:51.969347+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795a5ee000
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 151 handle_osd_map epochs [151,152], i have 151, src has [1,152]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1e] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.c] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1b] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1c] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.2] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.f] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.18] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.3] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.6] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.14] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.11] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.16] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 152 ms_handle_reset con 0x55795a5ee000 session 0x55795a6352c0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 124272640 unmapped: 36978688 heap: 161251328 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795d1cd800
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 152 handle_osd_map epochs [152,152], i have 152, src has [1,152]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 152 ms_handle_reset con 0x55795d1cd800 session 0x55795d392d20
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795d1cc000
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795bd3e800
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:58:52.969598+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 152 ms_handle_reset con 0x55795d1cc000 session 0x557957c20960
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 126304256 unmapped: 34947072 heap: 161251328 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 152 ms_handle_reset con 0x55795bd3e800 session 0x55795bd474a0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 152 heartbeat osd_stat(store_statfs(0x1b69e7000/0x0/0x1bfc00000, data 0x47ae129/0x48a6000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795acc4400
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 152 ms_handle_reset con 0x55795acc4400 session 0x55795d37b860
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795a5ee000
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:58:53.969759+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 126361600 unmapped: 34889728 heap: 161251328 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _renew_subs
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _send_mon_message to mon.np0005626463 at v2:172.18.0.103:3300/0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 152 handle_osd_map epochs [153,153], i have 152, src has [1,153]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 153 ms_handle_reset con 0x55795a5ee000 session 0x55795ef4da40
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1603902 data_alloc: 184549376 data_used: 11014144
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:58:54.969932+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 126369792 unmapped: 34881536 heap: 161251328 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795bd3e800
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:58:55.970095+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 153 handle_osd_map epochs [153,153], i have 153, src has [1,153]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 153 ms_handle_reset con 0x55795bd3e800 session 0x55795a222f00
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 126377984 unmapped: 34873344 heap: 161251328 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:58:56.970259+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 126377984 unmapped: 34873344 heap: 161251328 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:58:57.970394+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 153 heartbeat osd_stat(store_statfs(0x1b5845000/0x0/0x1bfc00000, data 0x47b04ac/0x48a8000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5b0f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 126386176 unmapped: 34865152 heap: 161251328 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:58:58.970535+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 126386176 unmapped: 34865152 heap: 161251328 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795b54e800
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.029722214s of 10.767278671s, submitted: 198
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 153 ms_handle_reset con 0x55795b54e800 session 0x5579595af2c0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1604731 data_alloc: 184549376 data_used: 11010048
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:58:59.970652+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795a9b5800
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 153 ms_handle_reset con 0x55795a9b5800 session 0x55795a21f0e0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 126746624 unmapped: 34504704 heap: 161251328 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795ad10400
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:59:00.970810+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 126763008 unmapped: 34488320 heap: 161251328 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795bb69c00
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 153 ms_handle_reset con 0x55795bb69c00 session 0x5579585d3860
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795a5ee000
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 153 ms_handle_reset con 0x55795a5ee000 session 0x5579584285a0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:59:01.970968+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795a9b5400
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 153 handle_osd_map epochs [153,154], i have 153, src has [1,154]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 154 handle_osd_map epochs [154,154], i have 154, src has [1,154]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 154 handle_osd_map epochs [154,154], i have 154, src has [1,154]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 126795776 unmapped: 34455552 heap: 161251328 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:59:02.971235+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 134529024 unmapped: 26722304 heap: 161251328 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 154 heartbeat osd_stat(store_statfs(0x1b5813000/0x0/0x1bfc00000, data 0x47dc778/0x48d9000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5b0f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:59:03.971929+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 134578176 unmapped: 26673152 heap: 161251328 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 154 handle_osd_map epochs [155,155], i have 154, src has [1,155]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 155 handle_osd_map epochs [155,155], i have 155, src has [1,155]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:59:04.972607+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1708475 data_alloc: 201326592 data_used: 22724608
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 134586368 unmapped: 26664960 heap: 161251328 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:59:05.972789+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 155 heartbeat osd_stat(store_statfs(0x1b580f000/0x0/0x1bfc00000, data 0x47deb7c/0x48de000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5b0f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 134619136 unmapped: 26632192 heap: 161251328 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 155 handle_osd_map epochs [155,156], i have 155, src has [1,156]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 156 handle_osd_map epochs [156,156], i have 156, src has [1,156]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795898c000
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795a603800
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 156 heartbeat osd_stat(store_statfs(0x1b580a000/0x0/0x1bfc00000, data 0x47e0f2f/0x48e3000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5b0f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:59:06.973022+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 156 ms_handle_reset con 0x55795a603800 session 0x55795ae1e3c0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 156 heartbeat osd_stat(store_statfs(0x1b580a000/0x0/0x1bfc00000, data 0x47e0f2f/0x48e3000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5b0f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 134709248 unmapped: 26542080 heap: 161251328 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795a9ec800
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _renew_subs
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _send_mon_message to mon.np0005626463 at v2:172.18.0.103:3300/0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 156 handle_osd_map epochs [157,157], i have 156, src has [1,157]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 157 ms_handle_reset con 0x55795898c000 session 0x557958302b40
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 157 handle_osd_map epochs [157,157], i have 157, src has [1,157]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x557959afe400
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 157 ms_handle_reset con 0x557959afe400 session 0x557959b16f00
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:59:07.973277+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 134881280 unmapped: 26370048 heap: 161251328 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 157 handle_osd_map epochs [157,158], i have 157, src has [1,158]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 158 handle_osd_map epochs [158,158], i have 158, src has [1,158]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x557958f59800
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795aca4000
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 158 ms_handle_reset con 0x55795aca4000 session 0x557959aedc20
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795898c000
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x557959afe400
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 158 ms_handle_reset con 0x55795898c000 session 0x557959b163c0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:59:08.973520+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 158 ms_handle_reset con 0x557959afe400 session 0x55795a6343c0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 135028736 unmapped: 26222592 heap: 161251328 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 158 handle_osd_map epochs [159,159], i have 158, src has [1,159]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:59:09.973704+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1836162 data_alloc: 201326592 data_used: 22740992
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.741077423s of 10.354707718s, submitted: 142
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 135077888 unmapped: 26173440 heap: 161251328 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 159 ms_handle_reset con 0x557958f59800 session 0x55795ae1e5a0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795a5ee000
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:59:10.973858+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 143581184 unmapped: 17670144 heap: 161251328 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 159 handle_osd_map epochs [159,160], i have 159, src has [1,160]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:59:11.974534+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 160 ms_handle_reset con 0x55795a5ee000 session 0x55795a58a5a0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 137576448 unmapped: 23674880 heap: 161251328 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 160 heartbeat osd_stat(store_statfs(0x1b3ffc000/0x0/0x1bfc00000, data 0x5fe7c87/0x60f0000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5b0f9b7), peers [0,1,3,4,5] op hist [0,0,0,0,0,0,0,0,3])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:59:12.974681+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795ba71400
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795a602800
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 160 ms_handle_reset con 0x55795a602800 session 0x55795a5b0000
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 142344192 unmapped: 18907136 heap: 161251328 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795898c000
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x557958f59800
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 160 handle_osd_map epochs [160,161], i have 160, src has [1,161]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:59:13.974845+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 161 handle_osd_map epochs [161,161], i have 161, src has [1,161]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 161 ms_handle_reset con 0x557958f59800 session 0x55795ae1eb40
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 161 handle_osd_map epochs [161,161], i have 161, src has [1,161]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 144203776 unmapped: 25444352 heap: 169648128 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 161 ms_handle_reset con 0x55795898c000 session 0x55795ae1e960
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 161 ms_handle_reset con 0x55795ba71400 session 0x55795a5b03c0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:59:14.974951+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2309220 data_alloc: 201326592 data_used: 23805952
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 144949248 unmapped: 24698880 heap: 169648128 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:59:15.975117+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 161 heartbeat osd_stat(store_statfs(0x1afadf000/0x0/0x1bfc00000, data 0xa500665/0xa60c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5b0f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x557959afe400
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 146227200 unmapped: 23420928 heap: 169648128 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:59:16.975295+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 161 handle_osd_map epochs [162,162], i have 161, src has [1,162]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 162 handle_osd_map epochs [162,162], i have 162, src has [1,162]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 162 ms_handle_reset con 0x557959afe400 session 0x55795d37a960
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 162 handle_osd_map epochs [162,162], i have 162, src has [1,162]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 162 handle_osd_map epochs [162,163], i have 162, src has [1,163]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795d3f4000
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 163 handle_osd_map epochs [163,163], i have 163, src has [1,163]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 163 handle_osd_map epochs [163,163], i have 163, src has [1,163]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 163 ms_handle_reset con 0x55795d3f4000 session 0x55795a609c20
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 146423808 unmapped: 23224320 heap: 169648128 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 163 handle_osd_map epochs [163,163], i have 163, src has [1,163]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:59:17.975463+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795898c000
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 163 ms_handle_reset con 0x55795898c000 session 0x55795bd470e0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 163 heartbeat osd_stat(store_statfs(0x1af2d7000/0x0/0x1bfc00000, data 0xad04dde/0xae15000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5b0f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x557958f59800
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 146432000 unmapped: 23216128 heap: 169648128 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x557959afe400
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 163 ms_handle_reset con 0x557959afe400 session 0x55795a21f4a0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795ba71400
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 163 ms_handle_reset con 0x55795ba71400 session 0x55795a21eb40
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 163 heartbeat osd_stat(store_statfs(0x1af2d7000/0x0/0x1bfc00000, data 0xad04dde/0xae15000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5b0f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 163 ms_handle_reset con 0x557958f59800 session 0x55795d43d860
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:59:18.975611+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 146522112 unmapped: 23126016 heap: 169648128 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:59:19.975813+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2641605 data_alloc: 201326592 data_used: 24281088
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 8.490909576s of 10.103609085s, submitted: 364
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 146653184 unmapped: 31391744 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x5579585a0000
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 163 ms_handle_reset con 0x5579585a0000 session 0x55795d123680
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:59:20.975988+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 146874368 unmapped: 31170560 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795a603800
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:59:21.976226+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 163 handle_osd_map epochs [163,164], i have 163, src has [1,164]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 146882560 unmapped: 31162368 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 164 handle_osd_map epochs [164,164], i have 164, src has [1,164]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795bd3f400
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:59:22.976416+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 164 ms_handle_reset con 0x55795bd3f400 session 0x55795a21e960
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 164 ms_handle_reset con 0x55795a603800 session 0x55795bd47a40
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 146956288 unmapped: 31088640 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:59:23.976618+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795b54ec00
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 164 heartbeat osd_stat(store_statfs(0x1aaad3000/0x0/0x1bfc00000, data 0xf50a156/0xf61b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5b0f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795acc5000
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 164 ms_handle_reset con 0x55795b54ec00 session 0x55795ae1f2c0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 147046400 unmapped: 30998528 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795bf8a400
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 164 ms_handle_reset con 0x55795acc5000 session 0x557958301860
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:59:24.976845+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 3030876 data_alloc: 201326592 data_used: 24293376
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 164 handle_osd_map epochs [164,165], i have 164, src has [1,165]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 156598272 unmapped: 21446656 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 165 ms_handle_reset con 0x55795bf8a400 session 0x55795a633680
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 165 handle_osd_map epochs [165,165], i have 165, src has [1,165]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795a603800
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 165 handle_osd_map epochs [165,165], i have 165, src has [1,165]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 165 ms_handle_reset con 0x55795a603800 session 0x55795ae1fa40
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:59:25.976990+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795acc5000
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795b54ec00
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 165 ms_handle_reset con 0x55795b54ec00 session 0x5579595af2c0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 165 handle_osd_map epochs [165,166], i have 165, src has [1,166]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795bd3f400
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 147931136 unmapped: 30113792 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 166 handle_osd_map epochs [166,166], i have 166, src has [1,166]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 166 ms_handle_reset con 0x55795acc5000 session 0x55795d392960
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795adb8c00
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 166 handle_osd_map epochs [166,166], i have 166, src has [1,166]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 166 ms_handle_reset con 0x55795bd3f400 session 0x55795ef4d680
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795bd4f000
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 166 ms_handle_reset con 0x55795bd4f000 session 0x55795d37b0e0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:59:26.977156+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 156450816 unmapped: 21594112 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 166 handle_osd_map epochs [166,167], i have 166, src has [1,167]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 167 handle_osd_map epochs [167,167], i have 167, src has [1,167]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 167 handle_osd_map epochs [167,167], i have 167, src has [1,167]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 167 ms_handle_reset con 0x55795adb8c00 session 0x55795d37a5a0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:59:27.977300+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 167 handle_osd_map epochs [167,167], i have 167, src has [1,167]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 148160512 unmapped: 29884416 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 167 heartbeat osd_stat(store_statfs(0x1a82c3000/0x0/0x1bfc00000, data 0x11d10d3a/0x11e29000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5b0f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:59:28.977456+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 156696576 unmapped: 21348352 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:59:29.977544+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 3450666 data_alloc: 201326592 data_used: 24317952
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 156696576 unmapped: 21348352 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795a603800
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 8.542109489s of 10.162899971s, submitted: 89
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 167 ms_handle_reset con 0x55795a603800 session 0x55795ae1f680
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:59:30.977738+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 167 handle_osd_map epochs [167,168], i have 167, src has [1,168]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795acc5000
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 148504576 unmapped: 29540352 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795b54ec00
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795bd3f400
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 168 handle_osd_map epochs [168,168], i have 168, src has [1,168]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 168 ms_handle_reset con 0x55795b54ec00 session 0x55795d392000
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:59:31.977933+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 168 handle_osd_map epochs [168,169], i have 168, src has [1,169]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 169 handle_osd_map epochs [169,169], i have 169, src has [1,169]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 169 handle_osd_map epochs [169,169], i have 169, src has [1,169]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 169 handle_osd_map epochs [168,169], i have 169, src has [1,169]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 169 ms_handle_reset con 0x55795bd3f400 session 0x557959aec960
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795adb9c00
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 169 ms_handle_reset con 0x55795adb9c00 session 0x5579585d3680
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 148725760 unmapped: 29319168 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 169 ms_handle_reset con 0x55795acc5000 session 0x55795ef4cf00
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:59:32.978162+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795a603800
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 148725760 unmapped: 29319168 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795adb8c00
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 169 ms_handle_reset con 0x55795adb8c00 session 0x5579585d2960
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:59:33.978411+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795b54ec00
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 169 heartbeat osd_stat(store_statfs(0x1a3ab9000/0x0/0x1bfc00000, data 0x16515a43/0x16634000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5b0f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _renew_subs
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _send_mon_message to mon.np0005626463 at v2:172.18.0.103:3300/0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 169 handle_osd_map epochs [170,170], i have 169, src has [1,170]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 169 handle_osd_map epochs [170,170], i have 170, src has [1,170]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.6] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.3] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.14] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.16] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.11] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1b] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1c] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.2] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.f] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.18] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1e] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 170 handle_osd_map epochs [170,170], i have 170, src has [1,170]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.c] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 157351936 unmapped: 20692992 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 170 ms_handle_reset con 0x55795a603800 session 0x5579595ae1e0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 170 ms_handle_reset con 0x55795b54ec00 session 0x55795d1232c0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:59:34.978563+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 3885271 data_alloc: 201326592 data_used: 24342528
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: mgrc handle_mgr_map Got map version 46
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.107:6810/2356945423,v1:172.18.0.107:6811/2356945423]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 148963328 unmapped: 29081600 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:59:35.978705+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 170 heartbeat osd_stat(store_statfs(0x1a22b1000/0x0/0x1bfc00000, data 0x17d1d9c5/0x17e3d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5b0f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 149037056 unmapped: 29007872 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795bd3f400
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795bd4fc00
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 170 ms_handle_reset con 0x55795bd3f400 session 0x55795a637c20
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 170 ms_handle_reset con 0x55795bd4fc00 session 0x557958313c20
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:59:36.978812+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 170 handle_osd_map epochs [171,171], i have 170, src has [1,171]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 149209088 unmapped: 28835840 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 171 heartbeat osd_stat(store_statfs(0x1a1aaf000/0x0/0x1bfc00000, data 0x1851da77/0x1863f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5b0f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x557958fd9c00
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x5579595cb800
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 171 ms_handle_reset con 0x557958fd9c00 session 0x55795d37a1e0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795a5ef800
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:59:37.979072+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 171 handle_osd_map epochs [171,172], i have 171, src has [1,172]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 172 handle_osd_map epochs [172,172], i have 172, src has [1,172]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 172 ms_handle_reset con 0x5579595cb800 session 0x55795ae1e960
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 149340160 unmapped: 28704768 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 172 ms_handle_reset con 0x55795a5ef800 session 0x55795d393680
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795a304400
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:59:38.979171+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 172 ms_handle_reset con 0x55795a304400 session 0x55795cca1a40
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x557958fd9c00
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 149454848 unmapped: 28590080 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 172 heartbeat osd_stat(store_statfs(0x1a02a4000/0x0/0x1bfc00000, data 0x19d228a6/0x19e49000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5b0f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:59:39.979304+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 4286646 data_alloc: 201326592 data_used: 24375296
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 149536768 unmapped: 28508160 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 172 handle_osd_map epochs [172,173], i have 172, src has [1,173]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _renew_subs
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _send_mon_message to mon.np0005626463 at v2:172.18.0.103:3300/0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 172 handle_osd_map epochs [173,173], i have 173, src has [1,173]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.151728630s of 10.196173668s, submitted: 140
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #44. Immutable memtables: 1.
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 173 handle_osd_map epochs [173,173], i have 173, src has [1,173]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x5579595cb800
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 173 ms_handle_reset con 0x557958fd9c00 session 0x55795a58b0e0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:59:40.979437+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795a5ef800
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 173 ms_handle_reset con 0x55795a5ef800 session 0x55795ae1ef00
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795bd4fc00
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 173 handle_osd_map epochs [173,173], i have 173, src has [1,173]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 151879680 unmapped: 26165248 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 173 handle_osd_map epochs [173,174], i have 173, src has [1,174]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 173 handle_osd_map epochs [174,174], i have 174, src has [1,174]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: mgrc handle_mgr_map Got map version 47
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.107:6810/2356945423,v1:172.18.0.107:6811/2356945423]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 173 handle_osd_map epochs [174,174], i have 174, src has [1,174]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.3] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1b] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.18] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.14] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.2] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1c] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.6] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.f] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.11] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.16] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.c] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 174 handle_osd_map epochs [174,174], i have 174, src has [1,174]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1e] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 174 ms_handle_reset con 0x5579595cb800 session 0x55795cca1c20
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:59:41.979583+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 174 ms_handle_reset con 0x55795bd4fc00 session 0x55795a632f00
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 174 handle_osd_map epochs [174,175], i have 174, src has [1,175]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1b] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1c] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.2] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.f] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.18] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.3] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.6] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.14] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.11] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.16] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1e] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.c] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 152231936 unmapped: 25812992 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:59:42.979746+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795ba70c00
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 175 handle_osd_map epochs [175,175], i have 175, src has [1,175]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 152346624 unmapped: 25698304 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:59:43.979860+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795e472c00
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 175 handle_osd_map epochs [175,175], i have 175, src has [1,175]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 175 ms_handle_reset con 0x55795e472c00 session 0x55795ae1f2c0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 152403968 unmapped: 25640960 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x557958fd9c00
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 175 ms_handle_reset con 0x557958fd9c00 session 0x55795cca0000
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 175 handle_osd_map epochs [175,176], i have 175, src has [1,176]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1b] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1c] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.2] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.f] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.18] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 176 handle_osd_map epochs [176,176], i have 176, src has [1,176]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.3] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 176 handle_osd_map epochs [176,176], i have 176, src has [1,176]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.11] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.6] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.16] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.14] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1e] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.c] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 176 handle_osd_map epochs [176,176], i have 176, src has [1,176]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:59:44.980142+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 176 heartbeat osd_stat(store_statfs(0x19bcea000/0x0/0x1bfc00000, data 0x1cd34925/0x1ce64000, compress 0x0/0x0/0x0, omap 0x649, meta 0x70af9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 4532334 data_alloc: 201326592 data_used: 24379392
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x5579595cb800
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795a5ef800
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 176 ms_handle_reset con 0x5579595cb800 session 0x55795cca10e0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 153657344 unmapped: 24387584 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 176 handle_osd_map epochs [176,177], i have 176, src has [1,177]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 177 ms_handle_reset con 0x55795ba70c00 session 0x557959aec1e0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 177 handle_osd_map epochs [177,177], i have 177, src has [1,177]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:59:45.980350+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 177 ms_handle_reset con 0x55795a5ef800 session 0x55795d392b40
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795bd4fc00
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 177 ms_handle_reset con 0x55795bd4fc00 session 0x55795a5b0f00
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 177 heartbeat osd_stat(store_statfs(0x19b4e3000/0x0/0x1bfc00000, data 0x1d5392cb/0x1d66a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x70af9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 153796608 unmapped: 24248320 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x557958fd9c00
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 177 heartbeat osd_stat(store_statfs(0x19b4e3000/0x0/0x1bfc00000, data 0x1d5392cb/0x1d66a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x70af9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 177 ms_handle_reset con 0x557958fd9c00 session 0x557959aed860
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:59:46.980482+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 177 handle_osd_map epochs [177,178], i have 177, src has [1,178]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 178 handle_osd_map epochs [178,178], i have 178, src has [1,178]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 178 handle_osd_map epochs [178,178], i have 178, src has [1,178]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 153935872 unmapped: 24109056 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 178 ms_handle_reset con 0x55795a9b5400 session 0x55795a58b2c0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 178 ms_handle_reset con 0x55795ad10400 session 0x55795ae1f0e0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 178 heartbeat osd_stat(store_statfs(0x1994d6000/0x0/0x1bfc00000, data 0x1f542f72/0x1f677000, compress 0x0/0x0/0x0, omap 0x649, meta 0x70af9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:59:47.980632+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795acc5000
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 178 heartbeat osd_stat(store_statfs(0x1994d6000/0x0/0x1bfc00000, data 0x1f542f72/0x1f677000, compress 0x0/0x0/0x0, omap 0x649, meta 0x70af9b7), peers [0,1,3,4,5] op hist [0,1,0,0,5,0,0,1])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x557958f59800
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 145760256 unmapped: 32284672 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 178 ms_handle_reset con 0x557958f59800 session 0x557958302b40
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 178 ms_handle_reset con 0x55795acc5000 session 0x55795a58a5a0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:59:48.980744+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _renew_subs
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _send_mon_message to mon.np0005626463 at v2:172.18.0.103:3300/0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 178 handle_osd_map epochs [179,179], i have 178, src has [1,179]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 179 ms_handle_reset con 0x55795a9ec800 session 0x5579595ae3c0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 145235968 unmapped: 32808960 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:59:49.980944+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x557958f59800
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 4614922 data_alloc: 184549376 data_used: 11186176
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795d3f5c00
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 179 ms_handle_reset con 0x55795d3f5c00 session 0x55795d1234a0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 179 handle_osd_map epochs [179,179], i have 179, src has [1,179]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795a9b5400
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x557958fd9c00
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 179 ms_handle_reset con 0x55795a9b5400 session 0x55795d123e00
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 145334272 unmapped: 32710656 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795ad10400
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 8.265284538s of 10.141423225s, submitted: 349
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:59:50.981131+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 179 heartbeat osd_stat(store_statfs(0x19b0de000/0x0/0x1bfc00000, data 0x1e919a52/0x1ea50000, compress 0x0/0x0/0x0, omap 0x649, meta 0x60cf9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _renew_subs
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _send_mon_message to mon.np0005626463 at v2:172.18.0.103:3300/0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 179 handle_osd_map epochs [180,180], i have 179, src has [1,180]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 180 ms_handle_reset con 0x557958fd9c00 session 0x5579595aeb40
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795d1ccc00
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 180 ms_handle_reset con 0x55795d1ccc00 session 0x5579585d2960
Feb 23 10:10:39 np0005626463.localdomain podman[242954]: @ - - [23/Feb/2026:10:10:39 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18832 "" "Go-http-client/1.1"
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 144695296 unmapped: 33349632 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 180 ms_handle_reset con 0x557958f59800 session 0x557959aed680
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 180 ms_handle_reset con 0x55795ad10400 session 0x55795d123a40
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:59:51.981279+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 180 handle_osd_map epochs [180,181], i have 180, src has [1,181]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.11] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.16] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 181 handle_osd_map epochs [181,181], i have 181, src has [1,181]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.3] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.6] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.14] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1b] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.2] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.f] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1c] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.18] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1e] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.c] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 181 heartbeat osd_stat(store_statfs(0x1b60d9000/0x0/0x1bfc00000, data 0x391bfe6/0x3a54000, compress 0x0/0x0/0x0, omap 0x649, meta 0x60cf9b7), peers [0,1,3,4,5] op hist [0,1])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795acc4c00
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 143089664 unmapped: 34955264 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795ad11000
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795adb9800
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 181 ms_handle_reset con 0x55795adb9800 session 0x5579585d2780
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:59:52.981526+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 181 handle_osd_map epochs [181,182], i have 181, src has [1,182]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 182 handle_osd_map epochs [182,182], i have 182, src has [1,182]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 182 ms_handle_reset con 0x55795ad11000 session 0x55795d37b4a0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 143122432 unmapped: 34922496 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795898c800
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 182 ms_handle_reset con 0x55795898c800 session 0x55795d37af00
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x557958f59800
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:59:53.981682+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 182 heartbeat osd_stat(store_statfs(0x1b60d3000/0x0/0x1bfc00000, data 0x39245cb/0x3a5a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x60cf9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _renew_subs
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _send_mon_message to mon.np0005626463 at v2:172.18.0.103:3300/0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 182 handle_osd_map epochs [183,183], i have 182, src has [1,183]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.f] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1e] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.11] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.2] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1c] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1b] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.3] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.18] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 183 handle_osd_map epochs [183,183], i have 183, src has [1,183]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.16] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.c] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.6] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.14] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 143196160 unmapped: 34848768 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 183 ms_handle_reset con 0x557958f59800 session 0x55795d37a5a0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:59:54.981844+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 183 handle_osd_map epochs [183,183], i have 183, src has [1,183]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1729958 data_alloc: 184549376 data_used: 11210752
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 183 handle_osd_map epochs [184,184], i have 183, src has [1,184]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 184 handle_osd_map epochs [184,184], i have 184, src has [1,184]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 184 ms_handle_reset con 0x55795acc4c00 session 0x55795a6341e0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 143253504 unmapped: 34791424 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:59:55.982015+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 184 handle_osd_map epochs [184,184], i have 184, src has [1,184]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 184 handle_osd_map epochs [185,185], i have 184, src has [1,185]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 185 handle_osd_map epochs [185,185], i have 185, src has [1,185]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 143228928 unmapped: 34816000 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:59:56.982173+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 185 handle_osd_map epochs [185,185], i have 185, src has [1,185]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 185 heartbeat osd_stat(store_statfs(0x1b60c3000/0x0/0x1bfc00000, data 0x3936015/0x3a6a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x60cf9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 144392192 unmapped: 33652736 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:59:57.982354+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 144408576 unmapped: 33636352 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:59:58.982540+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 144408576 unmapped: 33636352 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 185 heartbeat osd_stat(store_statfs(0x1b60bf000/0x0/0x1bfc00000, data 0x393c86a/0x3a6e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x60cf9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T09:59:59.982688+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1732449 data_alloc: 184549376 data_used: 11223040
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 144482304 unmapped: 33562624 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795e472400
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 8.567652702s of 10.004901886s, submitted: 373
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 185 ms_handle_reset con 0x55795e472400 session 0x55795d43c000
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:00:00.982834+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 185 handle_osd_map epochs [185,186], i have 185, src has [1,186]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 144482304 unmapped: 33562624 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 186 handle_osd_map epochs [186,186], i have 186, src has [1,186]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795bf8b800
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795d1cc000
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 186 handle_osd_map epochs [186,186], i have 186, src has [1,186]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 186 ms_handle_reset con 0x55795bf8b800 session 0x55795a222960
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:00:01.982980+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x5579595ca000
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 186 ms_handle_reset con 0x5579595ca000 session 0x55795a634d20
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 186 heartbeat osd_stat(store_statfs(0x1b60b1000/0x0/0x1bfc00000, data 0x3949cd5/0x3a7d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x60cf9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 186 handle_osd_map epochs [187,187], i have 186, src has [1,187]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.6] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.3] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.11] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.16] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.14] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1e] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.c] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x557958f59800
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1b] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1c] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.2] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.f] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.18] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 187 ms_handle_reset con 0x55795d1cc000 session 0x557959554000
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 187 handle_osd_map epochs [187,187], i have 187, src has [1,187]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 187 ms_handle_reset con 0x557958f59800 session 0x5579587c0780
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 187 heartbeat osd_stat(store_statfs(0x1b60b1000/0x0/0x1bfc00000, data 0x3949cd5/0x3a7d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x60cf9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 144556032 unmapped: 33488896 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:00:02.983197+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795acc4c00
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 144556032 unmapped: 33488896 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 187 handle_osd_map epochs [188,188], i have 187, src has [1,188]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.16] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.11] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1e] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.c] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1b] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1c] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.2] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.f] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.18] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.3] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.6] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.14] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 188 handle_osd_map epochs [188,188], i have 188, src has [1,188]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 188 handle_osd_map epochs [188,188], i have 188, src has [1,188]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:00:03.983337+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 188 handle_osd_map epochs [188,188], i have 188, src has [1,188]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 188 ms_handle_reset con 0x55795acc4c00 session 0x55795a633e00
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 145711104 unmapped: 32333824 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:00:04.983541+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1747190 data_alloc: 184549376 data_used: 11251712
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 145711104 unmapped: 32333824 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:00:05.983688+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795bf8b800
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 188 ms_handle_reset con 0x55795bf8b800 session 0x55795d1223c0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 144949248 unmapped: 33095680 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:00:06.983917+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 188 heartbeat osd_stat(store_statfs(0x1b577e000/0x0/0x1bfc00000, data 0x427784f/0x43b0000, compress 0x0/0x0/0x0, omap 0x649, meta 0x60cf9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 144949248 unmapped: 33095680 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:00:07.984149+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 144949248 unmapped: 33095680 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:00:08.984374+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 144949248 unmapped: 33095680 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:00:09.984547+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 188 heartbeat osd_stat(store_statfs(0x1b577b000/0x0/0x1bfc00000, data 0x427a0ae/0x43b3000, compress 0x0/0x0/0x0, omap 0x649, meta 0x60cf9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1828211 data_alloc: 184549376 data_used: 11255808
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795acc4800
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 188 ms_handle_reset con 0x55795acc4800 session 0x55795a21f2c0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 144769024 unmapped: 33275904 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:00:10.984627+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.806262016s of 10.465649605s, submitted: 178
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 188 handle_osd_map epochs [189,189], i have 188, src has [1,189]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 188 handle_osd_map epochs [189,189], i have 189, src has [1,189]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _renew_subs
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _send_mon_message to mon.np0005626463 at v2:172.18.0.103:3300/0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 188 handle_osd_map epochs [189,189], i have 189, src has [1,189]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 144064512 unmapped: 33980416 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 189 handle_osd_map epochs [189,189], i have 189, src has [1,189]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:00:11.984738+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795b54f800
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 189 ms_handle_reset con 0x55795b54f800 session 0x55795a5b01e0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 189 handle_osd_map epochs [190,190], i have 189, src has [1,190]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.c] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.6] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.3] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.18] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1e] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.11] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.16] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.f] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1b] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.14] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.2] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1c] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 144138240 unmapped: 33906688 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:00:12.984921+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 190 handle_osd_map epochs [190,190], i have 190, src has [1,190]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 190 handle_osd_map epochs [190,190], i have 190, src has [1,190]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x557958fd8c00
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 190 handle_osd_map epochs [190,190], i have 190, src has [1,190]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 190 ms_handle_reset con 0x557958fd8c00 session 0x5579587c1e00
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 144187392 unmapped: 33857536 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 190 heartbeat osd_stat(store_statfs(0x1b6083000/0x0/0x1bfc00000, data 0x396bcc7/0x3aa9000, compress 0x0/0x0/0x0, omap 0x649, meta 0x60cf9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:00:13.985082+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x5579595cbc00
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 190 handle_osd_map epochs [190,191], i have 190, src has [1,191]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 191 handle_osd_map epochs [191,191], i have 191, src has [1,191]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 191 handle_osd_map epochs [191,191], i have 191, src has [1,191]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 191 ms_handle_reset con 0x5579595cbc00 session 0x55795ef4d4a0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x557959affc00
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 144228352 unmapped: 33816576 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 191 ms_handle_reset con 0x557959affc00 session 0x55795a2861e0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x557958fd9c00
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:00:14.985209+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 191 ms_handle_reset con 0x557958fd9c00 session 0x55795a21f0e0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1778746 data_alloc: 184549376 data_used: 11268096
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x557958fd8c00
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x5579595cbc00
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 191 ms_handle_reset con 0x557958fd8c00 session 0x55795a21fc20
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 144220160 unmapped: 33824768 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:00:15.985341+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 191 handle_osd_map epochs [192,192], i have 191, src has [1,192]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 192 ms_handle_reset con 0x5579595cbc00 session 0x55795cca12c0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 144277504 unmapped: 33767424 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:00:16.985499+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 144277504 unmapped: 33767424 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:00:17.985586+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 192 handle_osd_map epochs [192,193], i have 192, src has [1,193]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 193 handle_osd_map epochs [193,193], i have 193, src has [1,193]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795aca4000
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x557957ff0000
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 193 handle_osd_map epochs [193,193], i have 193, src has [1,193]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 144302080 unmapped: 33742848 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 193 ms_handle_reset con 0x55795aca4000 session 0x55795d393c20
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 193 ms_handle_reset con 0x557957ff0000 session 0x557957c20000
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:00:18.985715+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795bd4d800
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 193 ms_handle_reset con 0x55795bd4d800 session 0x557959aedc20
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x557957ff0000
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x557958fd8c00
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _renew_subs
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _send_mon_message to mon.np0005626463 at v2:172.18.0.103:3300/0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 193 handle_osd_map epochs [194,194], i have 193, src has [1,194]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 194 ms_handle_reset con 0x557958fd8c00 session 0x557959559680
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 194 handle_osd_map epochs [194,194], i have 194, src has [1,194]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 194 heartbeat osd_stat(store_statfs(0x1b605b000/0x0/0x1bfc00000, data 0x398f1af/0x3ad2000, compress 0x0/0x0/0x0, omap 0x649, meta 0x60cf9b7), peers [0,1,3,4,5] op hist [0,1])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 144351232 unmapped: 33693696 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 194 ms_handle_reset con 0x557957ff0000 session 0x55795a5b5c20
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x5579595cbc00
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:00:19.985906+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795aca4000
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 194 ms_handle_reset con 0x5579595cbc00 session 0x557957c21860
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795bd4d800
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 194 ms_handle_reset con 0x55795aca4000 session 0x55795a2861e0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1794889 data_alloc: 184549376 data_used: 11292672
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795d1cd800
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 194 handle_osd_map epochs [194,195], i have 194, src has [1,195]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 195 handle_osd_map epochs [195,195], i have 195, src has [1,195]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 144375808 unmapped: 33669120 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 195 ms_handle_reset con 0x55795bd4d800 session 0x557957c23c20
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x557957ff0000
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:00:20.986118+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 195 handle_osd_map epochs [195,195], i have 195, src has [1,195]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x557958fd8c00
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 195 ms_handle_reset con 0x557957ff0000 session 0x557959559860
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 195 ms_handle_reset con 0x557958fd8c00 session 0x5579595cda40
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 8.988972664s of 10.099328995s, submitted: 283
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x5579595cbc00
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 144408576 unmapped: 33636352 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 195 handle_osd_map epochs [195,196], i have 195, src has [1,196]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 196 handle_osd_map epochs [196,196], i have 196, src has [1,196]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 196 ms_handle_reset con 0x5579595cbc00 session 0x55795ef4d860
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:00:21.986258+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 196 ms_handle_reset con 0x55795d1cd800 session 0x55795ef4d4a0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 196 handle_osd_map epochs [196,196], i have 196, src has [1,196]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 196 handle_osd_map epochs [196,197], i have 196, src has [1,197]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 197 handle_osd_map epochs [197,197], i have 197, src has [1,197]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 144465920 unmapped: 33579008 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795a9b5800
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 197 ms_handle_reset con 0x55795a9b5800 session 0x55795a21f4a0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:00:22.986420+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x557957ff0000
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 197 handle_osd_map epochs [197,197], i have 197, src has [1,197]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 144474112 unmapped: 33570816 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 197 handle_osd_map epochs [197,198], i have 197, src has [1,198]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 198 handle_osd_map epochs [198,198], i have 198, src has [1,198]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 198 ms_handle_reset con 0x557957ff0000 session 0x55795a635e00
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:00:23.986680+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 144547840 unmapped: 33497088 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x557958fd8c00
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 198 ms_handle_reset con 0x557958fd8c00 session 0x55795a250f00
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 198 heartbeat osd_stat(store_statfs(0x1b6036000/0x0/0x1bfc00000, data 0x39a97ea/0x3af7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x60cf9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:00:24.986843+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x5579595cbc00
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1806254 data_alloc: 184549376 data_used: 11313152
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 198 ms_handle_reset con 0x5579595cbc00 session 0x557959ab7c20
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795d1cd800
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 198 ms_handle_reset con 0x55795d1cd800 session 0x55795a635a40
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 144318464 unmapped: 33726464 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795898d000
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 198 handle_osd_map epochs [198,199], i have 198, src has [1,199]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 199 handle_osd_map epochs [199,199], i have 199, src has [1,199]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 199 ms_handle_reset con 0x55795898d000 session 0x557959555c20
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:00:25.986972+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x557957ff0000
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 199 handle_osd_map epochs [199,199], i have 199, src has [1,199]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 199 ms_handle_reset con 0x557957ff0000 session 0x557958428780
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x557958fd8c00
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 144351232 unmapped: 33693696 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 199 ms_handle_reset con 0x557958fd8c00 session 0x557958429a40
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x5579595cbc00
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:00:26.987101+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795ad10c00
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 199 ms_handle_reset con 0x55795ad10c00 session 0x55795d123c20
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 144375808 unmapped: 33669120 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 199 handle_osd_map epochs [200,200], i have 199, src has [1,200]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _renew_subs
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _send_mon_message to mon.np0005626463 at v2:172.18.0.103:3300/0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 199 handle_osd_map epochs [200,200], i have 200, src has [1,200]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:00:27.987257+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 200 ms_handle_reset con 0x5579595cbc00 session 0x55795cca1680
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795acc4000
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 200 ms_handle_reset con 0x55795acc4000 session 0x55795d392b40
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 145481728 unmapped: 32563200 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x557957ff0000
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 200 handle_osd_map epochs [201,201], i have 200, src has [1,201]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x557958fd8c00
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _renew_subs
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _send_mon_message to mon.np0005626463 at v2:172.18.0.103:3300/0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 201 handle_osd_map epochs [201,201], i have 201, src has [1,201]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:00:28.987407+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 201 ms_handle_reset con 0x557957ff0000 session 0x55795d392780
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x5579595cbc00
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 201 ms_handle_reset con 0x557958fd8c00 session 0x55795d122d20
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795ad10c00
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 201 handle_osd_map epochs [200,201], i have 201, src has [1,201]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 201 ms_handle_reset con 0x55795ad10c00 session 0x55795d1230e0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795bb05c00
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 145539072 unmapped: 32505856 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:00:29.987583+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _renew_subs
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _send_mon_message to mon.np0005626463 at v2:172.18.0.103:3300/0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 201 handle_osd_map epochs [202,202], i have 201, src has [1,202]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 202 handle_osd_map epochs [202,202], i have 202, src has [1,202]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 202 ms_handle_reset con 0x5579595cbc00 session 0x55795d3925a0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 202 handle_osd_map epochs [202,202], i have 202, src has [1,202]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 202 heartbeat osd_stat(store_statfs(0x1b5c19000/0x0/0x1bfc00000, data 0x39c0194/0x3b14000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,1,3,4,5] op hist [0,0,1])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 202 ms_handle_reset con 0x55795bb05c00 session 0x55795d1234a0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x557957ff0000
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1876743 data_alloc: 184549376 data_used: 11337728
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 202 ms_handle_reset con 0x557957ff0000 session 0x55795bd47860
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 146505728 unmapped: 31539200 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:00:30.987721+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x557958fd8c00
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 202 ms_handle_reset con 0x557958fd8c00 session 0x55795bd46f00
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x5579595cbc00
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 8.412206650s of 10.003239632s, submitted: 358
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 202 ms_handle_reset con 0x5579595cbc00 session 0x55795bd47680
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 146530304 unmapped: 31514624 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:00:31.987926+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 202 handle_osd_map epochs [203,203], i have 202, src has [1,203]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 146554880 unmapped: 31490048 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 203 handle_osd_map epochs [203,203], i have 203, src has [1,203]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:00:32.988117+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795ad10400
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 203 ms_handle_reset con 0x55795ad10400 session 0x55795cca1680
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 146571264 unmapped: 31473664 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:00:33.988257+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795d1cd000
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 203 ms_handle_reset con 0x55795d1cd000 session 0x55795d123c20
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 146579456 unmapped: 31465472 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:00:34.988401+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1827624 data_alloc: 184549376 data_used: 11350016
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 203 heartbeat osd_stat(store_statfs(0x1b5c0a000/0x0/0x1bfc00000, data 0x39d11fe/0x3b23000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 146579456 unmapped: 31465472 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x557957ff0000
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:00:35.988577+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 146636800 unmapped: 31408128 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:00:36.988759+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 203 handle_osd_map epochs [204,204], i have 203, src has [1,204]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 204 handle_osd_map epochs [204,204], i have 204, src has [1,204]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 146661376 unmapped: 31383552 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:00:37.988930+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 146669568 unmapped: 31375360 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:00:38.989083+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _renew_subs
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _send_mon_message to mon.np0005626463 at v2:172.18.0.103:3300/0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 204 handle_osd_map epochs [205,205], i have 204, src has [1,205]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 205 handle_osd_map epochs [205,205], i have 205, src has [1,205]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 146677760 unmapped: 31367168 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:00:39.989220+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1842028 data_alloc: 184549376 data_used: 11362304
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 205 handle_osd_map epochs [205,205], i have 205, src has [1,205]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 146677760 unmapped: 31367168 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:00:40.989399+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 205 heartbeat osd_stat(store_statfs(0x1b5bd4000/0x0/0x1bfc00000, data 0x3a031e8/0x3b59000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 146677760 unmapped: 31367168 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:00:41.989577+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 146677760 unmapped: 31367168 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 205 handle_osd_map epochs [206,206], i have 205, src has [1,206]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.078253746s of 11.561590195s, submitted: 165
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:00:42.990979+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 206 heartbeat osd_stat(store_statfs(0x1b5bd0000/0x0/0x1bfc00000, data 0x3a054dc/0x3b5d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 146694144 unmapped: 31350784 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:00:43.991149+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 146694144 unmapped: 31350784 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:00:44.991302+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1843044 data_alloc: 184549376 data_used: 11362304
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 146718720 unmapped: 31326208 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:00:45.991497+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 146718720 unmapped: 31326208 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:00:46.991750+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 206 heartbeat osd_stat(store_statfs(0x1b5bc3000/0x0/0x1bfc00000, data 0x3a134e6/0x3b6b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 206 handle_osd_map epochs [206,207], i have 206, src has [1,207]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 146726912 unmapped: 31318016 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:00:47.991978+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 207 heartbeat osd_stat(store_statfs(0x1b5bbe000/0x0/0x1bfc00000, data 0x3a1578e/0x3b6f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 146726912 unmapped: 31318016 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:00:48.992378+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 146726912 unmapped: 31318016 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:00:49.992572+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1845862 data_alloc: 184549376 data_used: 11374592
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 146563072 unmapped: 31481856 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:00:50.992753+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 146563072 unmapped: 31481856 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:00:51.992917+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 146563072 unmapped: 31481856 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:00:52.993124+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 146563072 unmapped: 31481856 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:00:53.993304+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 207 heartbeat osd_stat(store_statfs(0x1b5bbc000/0x0/0x1bfc00000, data 0x3a183a4/0x3b72000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 146563072 unmapped: 31481856 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:00:54.993444+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.919555664s of 12.039855003s, submitted: 38
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1846094 data_alloc: 184549376 data_used: 11374592
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 146563072 unmapped: 31481856 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:00:55.993580+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 146563072 unmapped: 31481856 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:00:56.993718+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 146563072 unmapped: 31481856 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:00:57.993931+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 207 heartbeat osd_stat(store_statfs(0x1b5bbb000/0x0/0x1bfc00000, data 0x3a19509/0x3b73000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 146571264 unmapped: 31473664 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:00:58.994086+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 146571264 unmapped: 31473664 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:00:59.994232+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1846094 data_alloc: 184549376 data_used: 11374592
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 146571264 unmapped: 31473664 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:01:00.994357+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 146571264 unmapped: 31473664 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 207 heartbeat osd_stat(store_statfs(0x1b5bbb000/0x0/0x1bfc00000, data 0x3a19509/0x3b73000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:01:01.994515+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 207 heartbeat osd_stat(store_statfs(0x1b5bbb000/0x0/0x1bfc00000, data 0x3a19509/0x3b73000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 146571264 unmapped: 31473664 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:01:02.994708+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 146571264 unmapped: 31473664 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:01:03.994913+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 146571264 unmapped: 31473664 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:01:04.995162+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1846094 data_alloc: 184549376 data_used: 11374592
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 146571264 unmapped: 31473664 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:01:05.995373+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 146571264 unmapped: 31473664 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:01:06.995505+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 207 heartbeat osd_stat(store_statfs(0x1b5bbb000/0x0/0x1bfc00000, data 0x3a19509/0x3b73000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.222109795s of 12.228106499s, submitted: 1
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 147628032 unmapped: 30416896 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:01:07.995687+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 147628032 unmapped: 30416896 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:01:08.995830+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 147644416 unmapped: 30400512 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:01:09.995980+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1851014 data_alloc: 184549376 data_used: 11374592
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 147644416 unmapped: 30400512 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:01:10.996108+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 207 heartbeat osd_stat(store_statfs(0x1b5b97000/0x0/0x1bfc00000, data 0x3a3ad22/0x3b97000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 147652608 unmapped: 30392320 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:01:11.996230+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 147652608 unmapped: 30392320 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:01:12.996411+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 147652608 unmapped: 30392320 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:01:13.996580+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 147652608 unmapped: 30392320 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:01:14.996729+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1855934 data_alloc: 184549376 data_used: 11374592
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 147668992 unmapped: 30375936 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:01:15.996838+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 147685376 unmapped: 30359552 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:01:16.996988+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.059313774s of 10.282251358s, submitted: 39
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 207 heartbeat osd_stat(store_statfs(0x1b5b70000/0x0/0x1bfc00000, data 0x3a62640/0x3bbe000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,1,3,4,5] op hist [0,0,1])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 147718144 unmapped: 30326784 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:01:17.997128+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 147718144 unmapped: 30326784 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:01:18.997266+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x5579595cac00
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 207 ms_handle_reset con 0x5579595cac00 session 0x557959555e00
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 147832832 unmapped: 30212096 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:01:19.997407+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 207 heartbeat osd_stat(store_statfs(0x1b5b47000/0x0/0x1bfc00000, data 0x3a87125/0x3be6000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,1,3,4,5] op hist [0,1])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1863788 data_alloc: 184549376 data_used: 11374592
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 147857408 unmapped: 30187520 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:01:20.997549+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 147857408 unmapped: 30187520 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:01:21.997680+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 147898368 unmapped: 30146560 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:01:22.997854+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 147898368 unmapped: 30146560 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:01:23.998030+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 150061056 unmapped: 27983872 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:01:24.998178+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1885985 data_alloc: 184549376 data_used: 11374592
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 150134784 unmapped: 27910144 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:01:25.998341+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _renew_subs
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _send_mon_message to mon.np0005626463 at v2:172.18.0.103:3300/0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 207 handle_osd_map epochs [208,208], i have 207, src has [1,208]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 208 handle_osd_map epochs [208,208], i have 208, src has [1,208]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 208 heartbeat osd_stat(store_statfs(0x1b5ad8000/0x0/0x1bfc00000, data 0x3af3f6a/0x3c55000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 150339584 unmapped: 27705344 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:01:26.998506+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.379565239s of 10.008890152s, submitted: 183
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 150396928 unmapped: 27648000 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:01:27.998642+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 150446080 unmapped: 27598848 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:01:28.998810+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 150773760 unmapped: 27271168 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:01:29.998965+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 208 handle_osd_map epochs [209,209], i have 208, src has [1,209]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 209 handle_osd_map epochs [209,209], i have 209, src has [1,209]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 209 handle_osd_map epochs [209,209], i have 209, src has [1,209]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 209 handle_osd_map epochs [209,209], i have 209, src has [1,209]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1891457 data_alloc: 184549376 data_used: 11399168
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 150863872 unmapped: 27181056 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:01:30.999054+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _renew_subs
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _send_mon_message to mon.np0005626463 at v2:172.18.0.103:3300/0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 209 handle_osd_map epochs [210,210], i have 209, src has [1,210]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 210 handle_osd_map epochs [210,210], i have 210, src has [1,210]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 210 heartbeat osd_stat(store_statfs(0x1b5a6a000/0x0/0x1bfc00000, data 0x3b60f68/0x3cc3000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 150896640 unmapped: 27148288 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:01:31.999182+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 210 handle_osd_map epochs [211,211], i have 210, src has [1,211]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 211 handle_osd_map epochs [211,211], i have 211, src has [1,211]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 211 handle_osd_map epochs [211,211], i have 211, src has [1,211]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 151896064 unmapped: 26148864 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:01:32.999356+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 151937024 unmapped: 26107904 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:01:33.999487+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 211 heartbeat osd_stat(store_statfs(0x1b5a43000/0x0/0x1bfc00000, data 0x3b84137/0x3ce9000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 151961600 unmapped: 26083328 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:01:34.999601+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1903081 data_alloc: 184549376 data_used: 11415552
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 211 handle_osd_map epochs [212,212], i have 211, src has [1,212]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 212 handle_osd_map epochs [212,212], i have 212, src has [1,212]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 212 handle_osd_map epochs [212,212], i have 212, src has [1,212]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 152051712 unmapped: 25993216 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:01:35.999716+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 152051712 unmapped: 25993216 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:01:36.999840+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 212 handle_osd_map epochs [212,213], i have 212, src has [1,213]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.099133492s of 10.000268936s, submitted: 237
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 213 handle_osd_map epochs [213,213], i have 213, src has [1,213]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 213 handle_osd_map epochs [213,213], i have 213, src has [1,213]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 152240128 unmapped: 25804800 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:01:37.999936+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 152313856 unmapped: 25731072 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:01:39.000082+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 213 heartbeat osd_stat(store_statfs(0x1b59cb000/0x0/0x1bfc00000, data 0x3bf85cd/0x3d63000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 152313856 unmapped: 25731072 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:01:40.000229+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 213 handle_osd_map epochs [214,214], i have 213, src has [1,214]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 214 handle_osd_map epochs [214,214], i have 214, src has [1,214]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 214 handle_osd_map epochs [214,214], i have 214, src has [1,214]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1920928 data_alloc: 184549376 data_used: 11440128
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 152444928 unmapped: 25600000 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:01:41.000419+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 214 heartbeat osd_stat(store_statfs(0x1b59cb000/0x0/0x1bfc00000, data 0x3bf94da/0x3d63000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 214 handle_osd_map epochs [215,215], i have 214, src has [1,215]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 215 handle_osd_map epochs [215,215], i have 215, src has [1,215]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 152469504 unmapped: 25575424 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:01:42.000565+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 215 handle_osd_map epochs [215,216], i have 215, src has [1,216]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 153526272 unmapped: 24518656 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:01:43.000727+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 216 heartbeat osd_stat(store_statfs(0x1b5982000/0x0/0x1bfc00000, data 0x3c3c032/0x3daa000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 153526272 unmapped: 24518656 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:01:44.000920+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:01:45.001072+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 153534464 unmapped: 24510464 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1932402 data_alloc: 184549376 data_used: 11444224
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:01:46.002301+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 153534464 unmapped: 24510464 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 216 heartbeat osd_stat(store_statfs(0x1b5955000/0x0/0x1bfc00000, data 0x3c6b9b6/0x3dd9000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:01:47.002478+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 153534464 unmapped: 24510464 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 216 handle_osd_map epochs [217,217], i have 216, src has [1,217]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.538256645s of 10.021533012s, submitted: 168
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 216 handle_osd_map epochs [217,217], i have 217, src has [1,217]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 217 handle_osd_map epochs [217,217], i have 217, src has [1,217]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:01:48.002864+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 153559040 unmapped: 24485888 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 217 handle_osd_map epochs [217,218], i have 217, src has [1,218]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 218 handle_osd_map epochs [218,218], i have 218, src has [1,218]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:01:49.003226+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 153624576 unmapped: 24420352 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:01:50.003951+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 218 handle_osd_map epochs [218,218], i have 218, src has [1,218]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 153665536 unmapped: 24379392 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1944216 data_alloc: 184549376 data_used: 11456512
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:01:51.004154+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 155033600 unmapped: 23011328 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:01:52.004854+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 153985024 unmapped: 24059904 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 218 heartbeat osd_stat(store_statfs(0x1b5901000/0x0/0x1bfc00000, data 0x3cbe91e/0x3e2d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:01:53.005345+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 153853952 unmapped: 24190976 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795ba70c00
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 218 ms_handle_reset con 0x55795ba70c00 session 0x55795a634d20
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:01:54.006311+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 153608192 unmapped: 24436736 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795bde2400
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 218 ms_handle_reset con 0x55795bde2400 session 0x55795a6341e0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:01:55.006462+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 155820032 unmapped: 22224896 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1960008 data_alloc: 184549376 data_used: 11456512
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:01:56.006651+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 155828224 unmapped: 22216704 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:01:57.006843+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 156033024 unmapped: 22011904 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 218 heartbeat osd_stat(store_statfs(0x1b5860000/0x0/0x1bfc00000, data 0x3d5ce84/0x3ece000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 218 handle_osd_map epochs [219,219], i have 218, src has [1,219]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.469116211s of 10.016029358s, submitted: 133
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 218 handle_osd_map epochs [219,219], i have 219, src has [1,219]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 219 handle_osd_map epochs [219,219], i have 219, src has [1,219]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:01:58.400818+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 156033024 unmapped: 22011904 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:01:59.400981+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 156246016 unmapped: 21798912 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795ba70800
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 219 heartbeat osd_stat(store_statfs(0x1b57e8000/0x0/0x1bfc00000, data 0x3dd0a3b/0x3f45000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,1,3,4,5] op hist [0,1])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:02:00.401165+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 155508736 unmapped: 22536192 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 219 ms_handle_reset con 0x55795ba70800 session 0x55795d37a5a0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1976262 data_alloc: 184549376 data_used: 11468800
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795a305000
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:02:01.401324+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 156614656 unmapped: 21430272 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 219 ms_handle_reset con 0x55795a305000 session 0x557959aed680
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x5579595cac00
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 219 ms_handle_reset con 0x5579595cac00 session 0x5579595ae3c0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:02:02.401757+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 156418048 unmapped: 21626880 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:02:03.401969+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 156680192 unmapped: 21364736 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:02:04.402104+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 156680192 unmapped: 21364736 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 219 heartbeat osd_stat(store_statfs(0x1b57af000/0x0/0x1bfc00000, data 0x3e0ddc4/0x3f7f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:02:05.402281+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 156696576 unmapped: 21348352 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1981871 data_alloc: 184549376 data_used: 11468800
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:02:06.405186+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 156868608 unmapped: 21176320 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 219 heartbeat osd_stat(store_statfs(0x1b5775000/0x0/0x1bfc00000, data 0x3e4839c/0x3fb8000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:02:07.405355+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 156868608 unmapped: 21176320 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.852539062s of 10.469260216s, submitted: 155
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:02:08.405513+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 157917184 unmapped: 20127744 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:02:09.405645+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 158048256 unmapped: 19996672 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 219 heartbeat osd_stat(store_statfs(0x1b5751000/0x0/0x1bfc00000, data 0x3e6d9d6/0x3fdd000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:02:10.405792+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 158048256 unmapped: 19996672 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1979137 data_alloc: 184549376 data_used: 11468800
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 219 heartbeat osd_stat(store_statfs(0x1b5751000/0x0/0x1bfc00000, data 0x3e6d9d6/0x3fdd000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:02:11.405941+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 158048256 unmapped: 19996672 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:02:12.406135+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 156598272 unmapped: 21446656 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:02:13.406311+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 156598272 unmapped: 21446656 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:02:14.406505+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 156598272 unmapped: 21446656 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:02:15.406676+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 156712960 unmapped: 21331968 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1977329 data_alloc: 184549376 data_used: 11468800
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 219 heartbeat osd_stat(store_statfs(0x1b572d000/0x0/0x1bfc00000, data 0x3e9231d/0x4001000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:02:16.406821+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 157761536 unmapped: 20283392 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 219 heartbeat osd_stat(store_statfs(0x1b5721000/0x0/0x1bfc00000, data 0x3e9d60d/0x400c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,1,3,4,5] op hist [0,0,0,1])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 219 heartbeat osd_stat(store_statfs(0x1b5721000/0x0/0x1bfc00000, data 0x3e9d60d/0x400c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:02:17.406946+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 157761536 unmapped: 20283392 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.954599380s of 10.163270950s, submitted: 44
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:02:18.407148+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 156712960 unmapped: 21331968 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 219 handle_osd_map epochs [220,220], i have 219, src has [1,220]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795898d400
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 220 handle_osd_map epochs [220,220], i have 220, src has [1,220]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 220 handle_osd_map epochs [220,220], i have 220, src has [1,220]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 220 ms_handle_reset con 0x55795898d400 session 0x55795a5b0b40
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:02:19.407279+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 156729344 unmapped: 21315584 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:02:20.407444+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 156819456 unmapped: 21225472 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1989533 data_alloc: 184549376 data_used: 11481088
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 220 heartbeat osd_stat(store_statfs(0x1b56e3000/0x0/0x1bfc00000, data 0x3ed8a3d/0x404a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:02:21.407622+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 156876800 unmapped: 21168128 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:02:22.407769+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 156999680 unmapped: 21045248 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:02:23.407926+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 156999680 unmapped: 21045248 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 220 handle_osd_map epochs [221,221], i have 220, src has [1,221]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 221 handle_osd_map epochs [221,221], i have 221, src has [1,221]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:02:24.408040+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 156983296 unmapped: 21061632 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 221 handle_osd_map epochs [221,221], i have 221, src has [1,221]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795aca4400
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 221 ms_handle_reset con 0x55795aca4400 session 0x55795a58a1e0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x5579595cb400
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 221 ms_handle_reset con 0x5579595cb400 session 0x5579595552c0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:02:25.408173+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 158089216 unmapped: 19955712 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2015548 data_alloc: 184549376 data_used: 11493376
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795acc5400
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 221 ms_handle_reset con 0x55795acc5400 session 0x55795d1223c0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:02:26.408307+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 158097408 unmapped: 19947520 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795898d400
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 221 ms_handle_reset con 0x55795898d400 session 0x5579587c14a0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 221 heartbeat osd_stat(store_statfs(0x1b5240000/0x0/0x1bfc00000, data 0x3f74ee1/0x40eb000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 221 handle_osd_map epochs [222,222], i have 221, src has [1,222]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 221 handle_osd_map epochs [222,222], i have 222, src has [1,222]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:02:27.408440+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 222 handle_osd_map epochs [222,222], i have 222, src has [1,222]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 158375936 unmapped: 19668992 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 222 handle_osd_map epochs [222,222], i have 222, src has [1,222]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.487211227s of 10.073509216s, submitted: 132
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:02:28.408664+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 158375936 unmapped: 19668992 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: mgrc handle_mgr_map Got map version 48
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.107:6810/2356945423,v1:172.18.0.107:6811/2356945423]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:02:29.408804+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795d1cd400
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 222 ms_handle_reset con 0x55795d1cd400 session 0x557958303680
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 158392320 unmapped: 19652608 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:02:30.447006+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 158597120 unmapped: 19447808 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2027725 data_alloc: 184549376 data_used: 11505664
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 222 handle_osd_map epochs [222,223], i have 222, src has [1,223]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 223 handle_osd_map epochs [223,223], i have 223, src has [1,223]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:02:31.447157+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 158597120 unmapped: 19447808 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 223 handle_osd_map epochs [223,223], i have 223, src has [1,223]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 223 handle_osd_map epochs [223,223], i have 223, src has [1,223]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 223 handle_osd_map epochs [223,223], i have 223, src has [1,223]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:02:32.447329+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 159981568 unmapped: 18063360 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 223 heartbeat osd_stat(store_statfs(0x1b518d000/0x0/0x1bfc00000, data 0x401bd9b/0x419a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:02:33.447545+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 160071680 unmapped: 17973248 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795d1cd800
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 223 heartbeat osd_stat(store_statfs(0x1b5195000/0x0/0x1bfc00000, data 0x401bd8a/0x4199000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 223 ms_handle_reset con 0x55795d1cd800 session 0x55795a21e000
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:02:34.447689+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 223 handle_osd_map epochs [224,224], i have 223, src has [1,224]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 160079872 unmapped: 17965056 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 224 handle_osd_map epochs [224,224], i have 224, src has [1,224]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:02:35.447818+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 160145408 unmapped: 17899520 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2035990 data_alloc: 184549376 data_used: 11534336
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 224 heartbeat osd_stat(store_statfs(0x1b5168000/0x0/0x1bfc00000, data 0x4046d3f/0x41c5000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:02:36.448002+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 160268288 unmapped: 17776640 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:02:37.448165+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 160268288 unmapped: 17776640 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 224 handle_osd_map epochs [225,225], i have 224, src has [1,225]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:02:38.448361+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 160268288 unmapped: 17776640 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:02:39.448515+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 160268288 unmapped: 17776640 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.155772209s of 11.742358208s, submitted: 152
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 225 handle_osd_map epochs [225,225], i have 225, src has [1,225]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:02:40.448664+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 160366592 unmapped: 17678336 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2049502 data_alloc: 184549376 data_used: 11550720
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:02:41.448845+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 160382976 unmapped: 17661952 heap: 178044928 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 225 heartbeat osd_stat(store_statfs(0x1b50e8000/0x0/0x1bfc00000, data 0x40c1d2f/0x4242000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:02:42.449019+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 225 handle_osd_map epochs [225,226], i have 225, src has [1,226]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 161472512 unmapped: 17620992 heap: 179093504 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 226 handle_osd_map epochs [226,226], i have 226, src has [1,226]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:02:43.449225+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 161472512 unmapped: 17620992 heap: 179093504 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:02:44.449391+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 161472512 unmapped: 17620992 heap: 179093504 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:02:45.449543+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 161488896 unmapped: 17604608 heap: 179093504 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 226 heartbeat osd_stat(store_statfs(0x1b507c000/0x0/0x1bfc00000, data 0x412f0aa/0x42b1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 226 handle_osd_map epochs [227,227], i have 226, src has [1,227]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 226 handle_osd_map epochs [227,227], i have 227, src has [1,227]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 226 handle_osd_map epochs [227,227], i have 227, src has [1,227]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2063528 data_alloc: 184549376 data_used: 11575296
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 227 heartbeat osd_stat(store_statfs(0x1b507c000/0x0/0x1bfc00000, data 0x412f0aa/0x42b1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:02:46.449775+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 162545664 unmapped: 16547840 heap: 179093504 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 227 heartbeat osd_stat(store_statfs(0x1b505e000/0x0/0x1bfc00000, data 0x414ac9c/0x42ce000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:02:47.449949+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 162545664 unmapped: 17596416 heap: 180142080 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:02:48.450169+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 162660352 unmapped: 17481728 heap: 180142080 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:02:49.450321+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 162660352 unmapped: 17481728 heap: 180142080 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.391491890s of 10.005722046s, submitted: 147
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:02:50.450486+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 162660352 unmapped: 17481728 heap: 180142080 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2062673 data_alloc: 184549376 data_used: 11575296
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:02:51.450625+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 227 heartbeat osd_stat(store_statfs(0x1b5045000/0x0/0x1bfc00000, data 0x41640ef/0x42e9000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 162660352 unmapped: 17481728 heap: 180142080 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 227 handle_osd_map epochs [228,228], i have 227, src has [1,228]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:02:52.450732+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 162668544 unmapped: 17473536 heap: 180142080 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 228 handle_osd_map epochs [228,229], i have 228, src has [1,229]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:02:53.450931+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 162676736 unmapped: 17465344 heap: 180142080 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:02:54.451135+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 162676736 unmapped: 17465344 heap: 180142080 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 229 handle_osd_map epochs [228,229], i have 229, src has [1,229]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:02:55.451313+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 163848192 unmapped: 16293888 heap: 180142080 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x557958f59800
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2087843 data_alloc: 184549376 data_used: 11587584
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 229 ms_handle_reset con 0x557958f59800 session 0x55795a21e3c0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:02:56.451510+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 163848192 unmapped: 17342464 heap: 181190656 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 229 handle_osd_map epochs [229,230], i have 229, src has [1,230]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 230 handle_osd_map epochs [230,230], i have 230, src has [1,230]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 230 handle_osd_map epochs [230,230], i have 230, src has [1,230]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 230 handle_osd_map epochs [230,230], i have 230, src has [1,230]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 230 heartbeat osd_stat(store_statfs(0x1b3e22000/0x0/0x1bfc00000, data 0x41e1919/0x436b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x7a6f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:02:57.451701+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 163373056 unmapped: 17817600 heap: 181190656 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:02:58.451854+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 230 heartbeat osd_stat(store_statfs(0x1b3df6000/0x0/0x1bfc00000, data 0x420b675/0x4397000, compress 0x0/0x0/0x0, omap 0x649, meta 0x7a6f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 163463168 unmapped: 17727488 heap: 181190656 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:02:59.452005+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 163479552 unmapped: 17711104 heap: 181190656 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.449725151s of 10.028799057s, submitted: 156
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795bde3400
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 230 ms_handle_reset con 0x55795bde3400 session 0x55795d122d20
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:03:00.452136+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 230 handle_osd_map epochs [230,231], i have 230, src has [1,231]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 164560896 unmapped: 16629760 heap: 181190656 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2091563 data_alloc: 184549376 data_used: 11612160
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 231 handle_osd_map epochs [231,231], i have 231, src has [1,231]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 231 handle_osd_map epochs [231,231], i have 231, src has [1,231]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:03:01.452305+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 164569088 unmapped: 16621568 heap: 181190656 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 231 heartbeat osd_stat(store_statfs(0x1b3d56000/0x0/0x1bfc00000, data 0x42acb2b/0x4436000, compress 0x0/0x0/0x0, omap 0x649, meta 0x7a6f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:03:02.452422+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 164577280 unmapped: 16613376 heap: 181190656 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:03:03.452627+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 163618816 unmapped: 17571840 heap: 181190656 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 231 heartbeat osd_stat(store_statfs(0x1b3d44000/0x0/0x1bfc00000, data 0x42c2e8f/0x444a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x7a6f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:03:04.452777+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 163700736 unmapped: 17489920 heap: 181190656 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:03:05.452950+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 164823040 unmapped: 16367616 heap: 181190656 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 231 heartbeat osd_stat(store_statfs(0x1b3cd0000/0x0/0x1bfc00000, data 0x4332912/0x44bc000, compress 0x0/0x0/0x0, omap 0x649, meta 0x7a6f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2115793 data_alloc: 184549376 data_used: 11612160
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:03:06.453132+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 165093376 unmapped: 16097280 heap: 181190656 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:03:07.453277+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 231 heartbeat osd_stat(store_statfs(0x1b3cd2000/0x0/0x1bfc00000, data 0x4332a41/0x44bc000, compress 0x0/0x0/0x0, omap 0x649, meta 0x7a6f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _renew_subs
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _send_mon_message to mon.np0005626463 at v2:172.18.0.103:3300/0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 231 handle_osd_map epochs [232,232], i have 231, src has [1,232]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 231 handle_osd_map epochs [232,232], i have 232, src has [1,232]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 231 handle_osd_map epochs [232,232], i have 232, src has [1,232]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 165093376 unmapped: 16097280 heap: 181190656 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 232 handle_osd_map epochs [232,232], i have 232, src has [1,232]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 232 handle_osd_map epochs [232,232], i have 232, src has [1,232]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:03:08.453486+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 165101568 unmapped: 17137664 heap: 182239232 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 232 handle_osd_map epochs [233,233], i have 232, src has [1,233]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 233 handle_osd_map epochs [233,233], i have 233, src has [1,233]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:03:09.453638+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 164806656 unmapped: 17432576 heap: 182239232 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 233 heartbeat osd_stat(store_statfs(0x1b3c7a000/0x0/0x1bfc00000, data 0x43850e5/0x4513000, compress 0x0/0x0/0x0, omap 0x649, meta 0x7a6f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 233 handle_osd_map epochs [234,234], i have 233, src has [1,234]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.656097412s of 10.382403374s, submitted: 200
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 234 handle_osd_map epochs [234,234], i have 234, src has [1,234]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:03:10.453812+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 164921344 unmapped: 17317888 heap: 182239232 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2128985 data_alloc: 184549376 data_used: 11636736
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 234 heartbeat osd_stat(store_statfs(0x1b3c63000/0x0/0x1bfc00000, data 0x4398131/0x4529000, compress 0x0/0x0/0x0, omap 0x649, meta 0x7a6f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:03:11.453952+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795a9b5400
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 234 ms_handle_reset con 0x55795a9b5400 session 0x55795a636d20
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #45. Immutable memtables: 2.
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 167043072 unmapped: 16244736 heap: 183287808 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:03:12.454117+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 234 handle_osd_map epochs [235,235], i have 234, src has [1,235]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 234 handle_osd_map epochs [235,235], i have 235, src has [1,235]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 235 handle_osd_map epochs [235,235], i have 235, src has [1,235]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795e472800
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 235 ms_handle_reset con 0x55795e472800 session 0x557957c201e0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 167346176 unmapped: 15941632 heap: 183287808 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:03:13.454311+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 235 handle_osd_map epochs [235,236], i have 235, src has [1,236]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 236 handle_osd_map epochs [236,236], i have 236, src has [1,236]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 167428096 unmapped: 15859712 heap: 183287808 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:03:14.454505+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 236 heartbeat osd_stat(store_statfs(0x1b2a56000/0x0/0x1bfc00000, data 0x4406040/0x4596000, compress 0x0/0x0/0x0, omap 0x649, meta 0x8c0f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 167436288 unmapped: 15851520 heap: 183287808 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:03:15.454658+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 167501824 unmapped: 16834560 heap: 184336384 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2139359 data_alloc: 184549376 data_used: 11644928
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:03:16.454798+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 167501824 unmapped: 16834560 heap: 184336384 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:03:17.454965+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 236 handle_osd_map epochs [237,237], i have 236, src has [1,237]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795b54ec00
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 237 handle_osd_map epochs [237,237], i have 237, src has [1,237]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 167731200 unmapped: 16605184 heap: 184336384 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 237 ms_handle_reset con 0x55795b54ec00 session 0x55795a222d20
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 237 handle_osd_map epochs [237,237], i have 237, src has [1,237]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:03:18.455146+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 237 heartbeat osd_stat(store_statfs(0x1b29be000/0x0/0x1bfc00000, data 0x449cdcc/0x462e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x8c0f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 170106880 unmapped: 15278080 heap: 185384960 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:03:19.455308+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 169058304 unmapped: 16326656 heap: 185384960 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 237 handle_osd_map epochs [237,238], i have 237, src has [1,238]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 238 handle_osd_map epochs [238,238], i have 238, src has [1,238]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:03:20.455529+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.384524345s of 10.359848022s, submitted: 251
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 169058304 unmapped: 16326656 heap: 185384960 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 238 handle_osd_map epochs [238,238], i have 238, src has [1,238]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 238 handle_osd_map epochs [238,238], i have 238, src has [1,238]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2156759 data_alloc: 184549376 data_used: 11677696
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x5579585a1400
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 238 ms_handle_reset con 0x5579585a1400 session 0x5579595af4a0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:03:21.455672+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 238 heartbeat osd_stat(store_statfs(0x1b2973000/0x0/0x1bfc00000, data 0x44e4418/0x4679000, compress 0x0/0x0/0x0, omap 0x649, meta 0x8c0f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 169230336 unmapped: 16154624 heap: 185384960 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:03:22.455838+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 238 handle_osd_map epochs [238,239], i have 238, src has [1,239]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 239 handle_osd_map epochs [239,239], i have 239, src has [1,239]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 239 handle_osd_map epochs [239,239], i have 239, src has [1,239]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 239 handle_osd_map epochs [239,240], i have 239, src has [1,240]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 240 handle_osd_map epochs [240,240], i have 240, src has [1,240]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 240 handle_osd_map epochs [240,240], i have 240, src has [1,240]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 169533440 unmapped: 15851520 heap: 185384960 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:03:23.456126+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 169541632 unmapped: 15843328 heap: 185384960 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 240 handle_osd_map epochs [239,240], i have 240, src has [1,240]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:03:24.457015+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 240 heartbeat osd_stat(store_statfs(0x1b290d000/0x0/0x1bfc00000, data 0x45445c3/0x46de000, compress 0x0/0x0/0x0, omap 0x649, meta 0x8c0f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 169844736 unmapped: 15540224 heap: 185384960 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:03:25.457536+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 170893312 unmapped: 14491648 heap: 185384960 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2169043 data_alloc: 184549376 data_used: 11694080
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:03:26.457781+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 170893312 unmapped: 14491648 heap: 185384960 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:03:27.457920+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 170893312 unmapped: 14491648 heap: 185384960 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:03:28.458233+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 170893312 unmapped: 14491648 heap: 185384960 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:03:29.458395+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 170893312 unmapped: 14491648 heap: 185384960 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 240 heartbeat osd_stat(store_statfs(0x1b28d5000/0x0/0x1bfc00000, data 0x4581ce8/0x4718000, compress 0x0/0x0/0x0, omap 0x649, meta 0x8c0f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:03:30.458521+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 170893312 unmapped: 14491648 heap: 185384960 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2165479 data_alloc: 184549376 data_used: 11694080
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.015050888s of 10.540500641s, submitted: 93
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:03:31.458773+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 170893312 unmapped: 14491648 heap: 185384960 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:03:32.458926+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _renew_subs
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _send_mon_message to mon.np0005626463 at v2:172.18.0.103:3300/0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 240 handle_osd_map epochs [241,241], i have 240, src has [1,241]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 170893312 unmapped: 14491648 heap: 185384960 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 241 handle_osd_map epochs [241,241], i have 241, src has [1,241]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 241 heartbeat osd_stat(store_statfs(0x1b28d3000/0x0/0x1bfc00000, data 0x4581ee4/0x4719000, compress 0x0/0x0/0x0, omap 0x649, meta 0x8c0f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:03:33.459227+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 241 handle_osd_map epochs [241,242], i have 241, src has [1,242]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 242 handle_osd_map epochs [242,242], i have 242, src has [1,242]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 170917888 unmapped: 14467072 heap: 185384960 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:03:34.459444+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 170917888 unmapped: 14467072 heap: 185384960 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:03:35.459710+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 170926080 unmapped: 14458880 heap: 185384960 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2175996 data_alloc: 184549376 data_used: 11718656
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: mgrc handle_mgr_map Got map version 49
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.107:6810/2356945423,v1:172.18.0.107:6811/2356945423]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:03:36.459912+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 170934272 unmapped: 14450688 heap: 185384960 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 242 heartbeat osd_stat(store_statfs(0x1b28cd000/0x0/0x1bfc00000, data 0x4586902/0x4721000, compress 0x0/0x0/0x0, omap 0x649, meta 0x8c0f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:03:37.460048+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 170934272 unmapped: 14450688 heap: 185384960 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:03:38.460202+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 170942464 unmapped: 14442496 heap: 185384960 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:03:39.460315+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 170942464 unmapped: 15491072 heap: 186433536 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:03:40.460456+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 170942464 unmapped: 15491072 heap: 186433536 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2173898 data_alloc: 184549376 data_used: 11718656
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:03:41.460613+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.656688690s of 10.146930695s, submitted: 129
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 170950656 unmapped: 15482880 heap: 186433536 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 242 heartbeat osd_stat(store_statfs(0x1b28cc000/0x0/0x1bfc00000, data 0x45869cc/0x4721000, compress 0x0/0x0/0x0, omap 0x649, meta 0x8c0f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:03:42.460771+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 242 heartbeat osd_stat(store_statfs(0x1b28cc000/0x0/0x1bfc00000, data 0x45869cc/0x4721000, compress 0x0/0x0/0x0, omap 0x649, meta 0x8c0f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 242 handle_osd_map epochs [243,243], i have 242, src has [1,243]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 242 handle_osd_map epochs [243,243], i have 243, src has [1,243]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 170958848 unmapped: 15474688 heap: 186433536 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 243 heartbeat osd_stat(store_statfs(0x1b28c8000/0x0/0x1bfc00000, data 0x4588c74/0x4725000, compress 0x0/0x0/0x0, omap 0x649, meta 0x8c0f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:03:43.460961+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 170958848 unmapped: 15474688 heap: 186433536 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:03:44.461203+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 170975232 unmapped: 15458304 heap: 186433536 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:03:45.461384+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _send_mon_message to mon.np0005626463 at v2:172.18.0.103:3300/0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 170975232 unmapped: 15458304 heap: 186433536 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2182228 data_alloc: 184549376 data_used: 11730944
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:03:46.461572+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 170975232 unmapped: 15458304 heap: 186433536 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:03:47.461795+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 170975232 unmapped: 15458304 heap: 186433536 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:03:48.462015+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 170983424 unmapped: 15450112 heap: 186433536 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 243 heartbeat osd_stat(store_statfs(0x1b28c5000/0x0/0x1bfc00000, data 0x4588f9f/0x4728000, compress 0x0/0x0/0x0, omap 0x649, meta 0x8c0f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:03:49.462172+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 170983424 unmapped: 15450112 heap: 186433536 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:03:50.462347+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 243 heartbeat osd_stat(store_statfs(0x1b28c3000/0x0/0x1bfc00000, data 0x45890d5/0x472a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x8c0f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 170983424 unmapped: 15450112 heap: 186433536 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2185466 data_alloc: 184549376 data_used: 11730944
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:03:51.462540+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 170983424 unmapped: 15450112 heap: 186433536 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 243 heartbeat osd_stat(store_statfs(0x1b28c1000/0x0/0x1bfc00000, data 0x458916c/0x472a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x8c0f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:03:52.462750+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 243 heartbeat osd_stat(store_statfs(0x1b28c1000/0x0/0x1bfc00000, data 0x458916c/0x472a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x8c0f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 170983424 unmapped: 15450112 heap: 186433536 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:03:53.462949+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 170983424 unmapped: 15450112 heap: 186433536 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:03:54.463173+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.917658806s of 13.181193352s, submitted: 56
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 171008000 unmapped: 15425536 heap: 186433536 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 243 heartbeat osd_stat(store_statfs(0x1b28c6000/0x0/0x1bfc00000, data 0x45891f6/0x4727000, compress 0x0/0x0/0x0, omap 0x649, meta 0x8c0f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:03:55.463326+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 171016192 unmapped: 15417344 heap: 186433536 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2192476 data_alloc: 184549376 data_used: 11730944
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:03:56.463502+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 171220992 unmapped: 15212544 heap: 186433536 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:03:57.463622+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 171237376 unmapped: 15196160 heap: 186433536 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:03:58.463802+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 171237376 unmapped: 15196160 heap: 186433536 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:03:59.463930+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 171237376 unmapped: 15196160 heap: 186433536 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:04:00.464115+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 172449792 unmapped: 13983744 heap: 186433536 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 243 heartbeat osd_stat(store_statfs(0x1b2824000/0x0/0x1bfc00000, data 0x462c1ba/0x47ca000, compress 0x0/0x0/0x0, omap 0x649, meta 0x8c0f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2200240 data_alloc: 184549376 data_used: 11730944
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:04:01.464276+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 243 heartbeat osd_stat(store_statfs(0x1b2824000/0x0/0x1bfc00000, data 0x462c1ba/0x47ca000, compress 0x0/0x0/0x0, omap 0x649, meta 0x8c0f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 172457984 unmapped: 13975552 heap: 186433536 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:04:02.464404+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 172605440 unmapped: 13828096 heap: 186433536 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:04:03.464586+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 243 heartbeat osd_stat(store_statfs(0x1b23ca000/0x0/0x1bfc00000, data 0x4685897/0x4823000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 172736512 unmapped: 13697024 heap: 186433536 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:04:04.464745+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.552519798s of 10.008470535s, submitted: 84
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 172752896 unmapped: 14729216 heap: 187482112 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:04:05.464909+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 173056000 unmapped: 14426112 heap: 187482112 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 243 heartbeat osd_stat(store_statfs(0x1b2369000/0x0/0x1bfc00000, data 0x46e4c3c/0x4883000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2206302 data_alloc: 184549376 data_used: 11730944
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:04:06.465013+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 173424640 unmapped: 14057472 heap: 187482112 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:04:07.465194+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 173506560 unmapped: 13975552 heap: 187482112 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: mgrc handle_mgr_map Got map version 50
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.107:6810/2356945423,v1:172.18.0.107:6811/2356945423]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:04:08.465349+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 243 heartbeat osd_stat(store_statfs(0x1b233d000/0x0/0x1bfc00000, data 0x4712734/0x48b1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 174555136 unmapped: 13975552 heap: 188530688 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:04:09.465510+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 174309376 unmapped: 14221312 heap: 188530688 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:04:10.465657+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 174366720 unmapped: 14163968 heap: 188530688 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2214226 data_alloc: 184549376 data_used: 11730944
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:04:11.465815+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 174383104 unmapped: 14147584 heap: 188530688 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:04:12.465991+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 174112768 unmapped: 14417920 heap: 188530688 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:04:13.466204+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 174112768 unmapped: 14417920 heap: 188530688 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 243 heartbeat osd_stat(store_statfs(0x1b2283000/0x0/0x1bfc00000, data 0x47cd174/0x496b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [0,0,0,0,0,0,0,0,0,0,1])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:04:14.466309+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.332860947s of 10.199210167s, submitted: 353
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 175161344 unmapped: 13369344 heap: 188530688 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:04:15.466449+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 175251456 unmapped: 13279232 heap: 188530688 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2234032 data_alloc: 184549376 data_used: 11730944
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:04:16.466623+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 175251456 unmapped: 13279232 heap: 188530688 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:04:17.466783+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 243 heartbeat osd_stat(store_statfs(0x1b21da000/0x0/0x1bfc00000, data 0x48757a5/0x4a13000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 175251456 unmapped: 13279232 heap: 188530688 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:04:18.466942+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 176349184 unmapped: 14278656 heap: 190627840 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:04:19.467095+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 176365568 unmapped: 14262272 heap: 190627840 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 243 heartbeat osd_stat(store_statfs(0x1b214d000/0x0/0x1bfc00000, data 0x490085f/0x4a9f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:04:20.467245+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 176439296 unmapped: 14188544 heap: 190627840 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _renew_subs
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _send_mon_message to mon.np0005626463 at v2:172.18.0.103:3300/0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 243 handle_osd_map epochs [244,244], i have 243, src has [1,244]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2243468 data_alloc: 184549376 data_used: 11743232
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:04:21.467399+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 244 heartbeat osd_stat(store_statfs(0x1b2108000/0x0/0x1bfc00000, data 0x4945a5e/0x4ae4000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 176750592 unmapped: 14925824 heap: 191676416 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:04:22.467566+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 176889856 unmapped: 14786560 heap: 191676416 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:04:23.467765+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 176889856 unmapped: 14786560 heap: 191676416 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:04:24.467936+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 176914432 unmapped: 14761984 heap: 191676416 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:04:25.468083+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 176914432 unmapped: 14761984 heap: 191676416 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2248770 data_alloc: 184549376 data_used: 11743232
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:04:26.468699+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 176914432 unmapped: 14761984 heap: 191676416 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 244 heartbeat osd_stat(store_statfs(0x1b20bf000/0x0/0x1bfc00000, data 0x498c04f/0x4b2c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:04:27.468918+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.765235901s of 12.614546776s, submitted: 178
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 176914432 unmapped: 14761984 heap: 191676416 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 244 handle_osd_map epochs [245,245], i have 244, src has [1,245]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:04:28.470373+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 176914432 unmapped: 14761984 heap: 191676416 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:04:29.470605+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 176922624 unmapped: 14753792 heap: 191676416 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:04:30.470815+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 176939008 unmapped: 14737408 heap: 191676416 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2247866 data_alloc: 184549376 data_used: 11755520
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:04:31.471116+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 176955392 unmapped: 14721024 heap: 191676416 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:04:32.471417+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 245 heartbeat osd_stat(store_statfs(0x1b20bf000/0x0/0x1bfc00000, data 0x498e2fe/0x4b2e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 176955392 unmapped: 14721024 heap: 191676416 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:04:33.472024+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 176963584 unmapped: 14712832 heap: 191676416 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:04:34.472314+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 176963584 unmapped: 14712832 heap: 191676416 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:04:35.472512+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 176963584 unmapped: 14712832 heap: 191676416 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2250664 data_alloc: 184549376 data_used: 11755520
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:04:36.472690+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 176963584 unmapped: 14712832 heap: 191676416 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:04:37.472854+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.727860451s of 10.007514000s, submitted: 60
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 245 heartbeat osd_stat(store_statfs(0x1b20ba000/0x0/0x1bfc00000, data 0x498e521/0x4b31000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 176971776 unmapped: 14704640 heap: 191676416 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:04:38.473141+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 176971776 unmapped: 14704640 heap: 191676416 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:04:39.473320+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 176971776 unmapped: 14704640 heap: 191676416 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:04:40.473555+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 176988160 unmapped: 14688256 heap: 191676416 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2252018 data_alloc: 184549376 data_used: 11755520
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:04:41.473748+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 176988160 unmapped: 14688256 heap: 191676416 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:04:42.473907+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 245 heartbeat osd_stat(store_statfs(0x1b20be000/0x0/0x1bfc00000, data 0x498e502/0x4b2f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 176988160 unmapped: 14688256 heap: 191676416 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:04:43.474077+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 177012736 unmapped: 14663680 heap: 191676416 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:04:44.474462+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 177012736 unmapped: 14663680 heap: 191676416 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:04:45.474663+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 177020928 unmapped: 14655488 heap: 191676416 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:04:46.474829+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2250672 data_alloc: 184549376 data_used: 11755520
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 177020928 unmapped: 14655488 heap: 191676416 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 245 heartbeat osd_stat(store_statfs(0x1b20c0000/0x0/0x1bfc00000, data 0x498e538/0x4b2e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:04:47.474944+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.843222618s of 10.030783653s, submitted: 38
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 177020928 unmapped: 14655488 heap: 191676416 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:04:48.475185+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 177020928 unmapped: 14655488 heap: 191676416 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:04:49.475341+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 177020928 unmapped: 14655488 heap: 191676416 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:04:50.475542+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 245 heartbeat osd_stat(store_statfs(0x1b20be000/0x0/0x1bfc00000, data 0x498e506/0x4b2e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178077696 unmapped: 13598720 heap: 191676416 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:04:51.475746+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2252808 data_alloc: 184549376 data_used: 11755520
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 245 heartbeat osd_stat(store_statfs(0x1b20bc000/0x0/0x1bfc00000, data 0x498e5bc/0x4b2f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178077696 unmapped: 13598720 heap: 191676416 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:04:52.475944+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178077696 unmapped: 13598720 heap: 191676416 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:04:53.476149+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178085888 unmapped: 13590528 heap: 191676416 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:04:54.476323+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178085888 unmapped: 13590528 heap: 191676416 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:04:55.476487+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178085888 unmapped: 13590528 heap: 191676416 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:04:56.489054+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2253064 data_alloc: 184549376 data_used: 11755520
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 245 heartbeat osd_stat(store_statfs(0x1b20bf000/0x0/0x1bfc00000, data 0x498e5f1/0x4b2e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178102272 unmapped: 13574144 heap: 191676416 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:04:57.489282+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178102272 unmapped: 13574144 heap: 191676416 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:04:58.489849+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178102272 unmapped: 13574144 heap: 191676416 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:04:59.489997+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.006628036s of 12.199718475s, submitted: 41
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178102272 unmapped: 13574144 heap: 191676416 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:05:00.490125+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 245 heartbeat osd_stat(store_statfs(0x1b20bd000/0x0/0x1bfc00000, data 0x498e752/0x4b2f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 245 heartbeat osd_stat(store_statfs(0x1b20be000/0x0/0x1bfc00000, data 0x498e761/0x4b2f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178110464 unmapped: 13565952 heap: 191676416 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:05:01.490317+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2253278 data_alloc: 184549376 data_used: 11755520
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178110464 unmapped: 13565952 heap: 191676416 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:05:02.490507+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178110464 unmapped: 13565952 heap: 191676416 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:05:03.491374+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 245 heartbeat osd_stat(store_statfs(0x1b20be000/0x0/0x1bfc00000, data 0x498e71d/0x4b2f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178110464 unmapped: 13565952 heap: 191676416 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:05:04.491584+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178110464 unmapped: 13565952 heap: 191676416 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795bd4e000
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:05:05.491829+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178577408 unmapped: 29892608 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 245 heartbeat osd_stat(store_statfs(0x1b18bc000/0x0/0x1bfc00000, data 0x518e81a/0x532f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:05:06.492104+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2364680 data_alloc: 184549376 data_used: 11755520
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 245 handle_osd_map epochs [245,246], i have 245, src has [1,246]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 246 ms_handle_reset con 0x55795bd4e000 session 0x55795a21f860
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795bb04000
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 246 handle_osd_map epochs [246,246], i have 246, src has [1,246]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 246 handle_osd_map epochs [246,246], i have 246, src has [1,246]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178585600 unmapped: 29884416 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:05:07.492241+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 246 handle_osd_map epochs [246,247], i have 246, src has [1,247]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 247 ms_handle_reset con 0x55795bb04000 session 0x5579583025a0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178593792 unmapped: 29876224 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:05:08.492476+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 247 handle_osd_map epochs [247,247], i have 247, src has [1,247]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178618368 unmapped: 29851648 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:05:09.492704+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178618368 unmapped: 29851648 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.277527809s of 10.602755547s, submitted: 64
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:05:10.492824+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178618368 unmapped: 29851648 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:05:11.492997+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2262216 data_alloc: 184549376 data_used: 11767808
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 247 heartbeat osd_stat(store_statfs(0x1b20b7000/0x0/0x1bfc00000, data 0x4992fe4/0x4b36000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178618368 unmapped: 29851648 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:05:12.493261+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178618368 unmapped: 29851648 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:05:13.493524+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178618368 unmapped: 29851648 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:05:14.493698+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 247 heartbeat osd_stat(store_statfs(0x1b20b7000/0x0/0x1bfc00000, data 0x4992fe4/0x4b36000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178618368 unmapped: 29851648 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:05:15.493841+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178618368 unmapped: 29851648 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:05:16.493950+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2263166 data_alloc: 184549376 data_used: 11767808
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 247 heartbeat osd_stat(store_statfs(0x1b20b7000/0x0/0x1bfc00000, data 0x499307f/0x4b37000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178618368 unmapped: 29851648 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:05:17.494104+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 247 handle_osd_map epochs [247,248], i have 247, src has [1,248]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178618368 unmapped: 29851648 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:05:18.494299+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178618368 unmapped: 29851648 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:05:19.494498+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 248 heartbeat osd_stat(store_statfs(0x1b20b2000/0x0/0x1bfc00000, data 0x4995307/0x4b3b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178618368 unmapped: 29851648 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:05:20.494715+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178618368 unmapped: 29851648 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:05:21.494932+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2267016 data_alloc: 184549376 data_used: 11780096
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 248 heartbeat osd_stat(store_statfs(0x1b20b2000/0x0/0x1bfc00000, data 0x4995307/0x4b3b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178618368 unmapped: 29851648 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:05:22.495087+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178618368 unmapped: 29851648 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:05:23.495333+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 248 heartbeat osd_stat(store_statfs(0x1b20b2000/0x0/0x1bfc00000, data 0x4995307/0x4b3b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178618368 unmapped: 29851648 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:05:24.495497+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 14.317897797s of 14.416247368s, submitted: 24
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178626560 unmapped: 29843456 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:05:25.495674+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178626560 unmapped: 29843456 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:05:26.495851+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2266440 data_alloc: 184549376 data_used: 11780096
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178626560 unmapped: 29843456 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:05:27.496103+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178626560 unmapped: 29843456 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:05:28.496263+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 248 heartbeat osd_stat(store_statfs(0x1b20b2000/0x0/0x1bfc00000, data 0x499533f/0x4b3b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178626560 unmapped: 29843456 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:05:29.496443+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 248 heartbeat osd_stat(store_statfs(0x1b20b2000/0x0/0x1bfc00000, data 0x499533f/0x4b3b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178634752 unmapped: 29835264 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:05:30.496641+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178634752 unmapped: 29835264 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:05:31.496790+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2265990 data_alloc: 184549376 data_used: 11780096
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178634752 unmapped: 29835264 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 248 heartbeat osd_stat(store_statfs(0x1b20b4000/0x0/0x1bfc00000, data 0x499536e/0x4b3a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:05:32.496944+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178634752 unmapped: 29835264 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:05:33.497194+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178642944 unmapped: 29827072 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:05:34.497356+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 248 ms_handle_reset con 0x557959aff400 session 0x55795a250960
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: handle_auth_request added challenge on 0x55795bd4f800
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178642944 unmapped: 29827072 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:05:35.497554+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178642944 unmapped: 29827072 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:05:36.497733+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2264434 data_alloc: 184549376 data_used: 11780096
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178642944 unmapped: 29827072 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:05:37.497911+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 248 heartbeat osd_stat(store_statfs(0x1b20b6000/0x0/0x1bfc00000, data 0x499536d/0x4b38000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178642944 unmapped: 29827072 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:05:38.498118+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178642944 unmapped: 29827072 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:05:39.498272+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 15.107774734s of 15.191800117s, submitted: 16
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178642944 unmapped: 29827072 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:05:40.498417+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178642944 unmapped: 29827072 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:05:41.498555+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2266202 data_alloc: 184549376 data_used: 11780096
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178642944 unmapped: 29827072 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:05:42.498741+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 248 heartbeat osd_stat(store_statfs(0x1b20b5000/0x0/0x1bfc00000, data 0x4995408/0x4b39000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178642944 unmapped: 29827072 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:05:43.498925+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178642944 unmapped: 29827072 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:05:44.499112+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178642944 unmapped: 29827072 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:05:45.499377+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178642944 unmapped: 29827072 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:05:46.499533+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2266202 data_alloc: 184549376 data_used: 11780096
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 248 heartbeat osd_stat(store_statfs(0x1b20b5000/0x0/0x1bfc00000, data 0x4995408/0x4b39000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178642944 unmapped: 29827072 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:05:47.499684+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178642944 unmapped: 29827072 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:05:48.499859+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178642944 unmapped: 29827072 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:05:49.500063+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178642944 unmapped: 29827072 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:05:50.500261+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.803414345s of 10.814392090s, submitted: 2
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178642944 unmapped: 29827072 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:05:51.500474+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2267602 data_alloc: 184549376 data_used: 11780096
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 248 heartbeat osd_stat(store_statfs(0x1b20b4000/0x0/0x1bfc00000, data 0x49954a3/0x4b3a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178642944 unmapped: 29827072 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:05:52.500671+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178642944 unmapped: 29827072 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:05:53.500940+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178642944 unmapped: 29827072 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:05:54.501104+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178642944 unmapped: 29827072 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:05:55.501277+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178642944 unmapped: 29827072 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:05:56.501428+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2267426 data_alloc: 184549376 data_used: 11780096
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178642944 unmapped: 29827072 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:05:57.501616+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 248 heartbeat osd_stat(store_statfs(0x1b20b4000/0x0/0x1bfc00000, data 0x49954a3/0x4b3a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178642944 unmapped: 29827072 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:05:58.501830+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178642944 unmapped: 29827072 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:05:59.502047+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178651136 unmapped: 29818880 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:06:00.502272+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178651136 unmapped: 29818880 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:06:01.502419+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2266928 data_alloc: 184549376 data_used: 11780096
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 248 heartbeat osd_stat(store_statfs(0x1b20b5000/0x0/0x1bfc00000, data 0x499559c/0x4b39000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178651136 unmapped: 29818880 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:06:02.502662+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.966525078s of 12.025456429s, submitted: 12
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178659328 unmapped: 29810688 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:06:03.502822+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 248 heartbeat osd_stat(store_statfs(0x1b20b5000/0x0/0x1bfc00000, data 0x4995666/0x4b39000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178659328 unmapped: 29810688 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:06:04.503468+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178659328 unmapped: 29810688 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:06:05.503807+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 248 heartbeat osd_stat(store_statfs(0x1b20b5000/0x0/0x1bfc00000, data 0x4995666/0x4b39000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178659328 unmapped: 29810688 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:06:06.503971+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2266414 data_alloc: 184549376 data_used: 11780096
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 248 heartbeat osd_stat(store_statfs(0x1b20b6000/0x0/0x1bfc00000, data 0x4995695/0x4b38000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:06:07.505220+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178659328 unmapped: 29810688 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 248 heartbeat osd_stat(store_statfs(0x1b20b6000/0x0/0x1bfc00000, data 0x4995695/0x4b38000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:06:08.505378+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178659328 unmapped: 29810688 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:06:09.506269+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178659328 unmapped: 29810688 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:06:10.506583+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178659328 unmapped: 29810688 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:06:11.506769+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 248 heartbeat osd_stat(store_statfs(0x1b20b6000/0x0/0x1bfc00000, data 0x4995695/0x4b38000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178659328 unmapped: 29810688 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2266414 data_alloc: 184549376 data_used: 11780096
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 248 heartbeat osd_stat(store_statfs(0x1b20b6000/0x0/0x1bfc00000, data 0x4995695/0x4b38000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:06:12.507308+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178667520 unmapped: 29802496 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:06:13.507563+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178667520 unmapped: 29802496 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:06:14.508141+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178667520 unmapped: 29802496 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 248 heartbeat osd_stat(store_statfs(0x1b20b6000/0x0/0x1bfc00000, data 0x4995695/0x4b38000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.139571190s of 12.184631348s, submitted: 9
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:06:15.508567+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178667520 unmapped: 29802496 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:06:16.508866+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178667520 unmapped: 29802496 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 248 heartbeat osd_stat(store_statfs(0x1b20b5000/0x0/0x1bfc00000, data 0x4995730/0x4b39000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2268182 data_alloc: 184549376 data_used: 11780096
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:06:17.509071+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178667520 unmapped: 29802496 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:06:18.509304+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178667520 unmapped: 29802496 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:06:19.509569+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178667520 unmapped: 29802496 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 248 heartbeat osd_stat(store_statfs(0x1b20b5000/0x0/0x1bfc00000, data 0x4995730/0x4b39000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:06:20.510098+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178675712 unmapped: 29794304 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:06:21.510263+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178675712 unmapped: 29794304 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2272942 data_alloc: 184549376 data_used: 11780096
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 248 heartbeat osd_stat(store_statfs(0x1b20b2000/0x0/0x1bfc00000, data 0x4995901/0x4b3c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:06:22.510496+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178675712 unmapped: 29794304 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:06:23.510667+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178692096 unmapped: 29777920 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:06:24.510853+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178692096 unmapped: 29777920 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 248 handle_osd_map epochs [248,249], i have 248, src has [1,249]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.042035103s of 10.128713608s, submitted: 16
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 249 heartbeat osd_stat(store_statfs(0x1b20b4000/0x0/0x1bfc00000, data 0x49958fa/0x4b3a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:06:25.511070+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178692096 unmapped: 29777920 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:06:26.511273+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178700288 unmapped: 29769728 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2274080 data_alloc: 184549376 data_used: 11792384
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 249 heartbeat osd_stat(store_statfs(0x1b20b1000/0x0/0x1bfc00000, data 0x4997cfe/0x4b3c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:06:27.511484+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178700288 unmapped: 29769728 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 249 handle_osd_map epochs [249,250], i have 249, src has [1,250]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:06:28.511669+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178708480 unmapped: 29761536 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 250 handle_osd_map epochs [250,250], i have 250, src has [1,250]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 250 heartbeat osd_stat(store_statfs(0x1b20ad000/0x0/0x1bfc00000, data 0x4999f86/0x4b40000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [0,0,0,0,0,0,0,0,0,0,1])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178708480 unmapped: 29761536 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:06:29.720483+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178708480 unmapped: 29761536 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:06:30.720629+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178708480 unmapped: 29761536 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:06:31.720808+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2277050 data_alloc: 184549376 data_used: 11792384
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178708480 unmapped: 29761536 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:06:32.720997+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 250 heartbeat osd_stat(store_statfs(0x1b20ad000/0x0/0x1bfc00000, data 0x499a050/0x4b40000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178708480 unmapped: 29761536 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:06:33.721168+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178708480 unmapped: 29761536 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:06:34.721306+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178716672 unmapped: 29753344 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:06:35.721473+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.490441322s of 10.721129417s, submitted: 95
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178716672 unmapped: 29753344 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:06:36.721663+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2277072 data_alloc: 184549376 data_used: 11792384
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178724864 unmapped: 29745152 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:06:37.721832+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178724864 unmapped: 29745152 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 250 heartbeat osd_stat(store_statfs(0x1b20ae000/0x0/0x1bfc00000, data 0x499a1e4/0x4b40000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:06:38.721975+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178724864 unmapped: 29745152 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:06:39.722160+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _renew_subs
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _send_mon_message to mon.np0005626463 at v2:172.18.0.103:3300/0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 250 handle_osd_map epochs [251,251], i have 250, src has [1,251]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178757632 unmapped: 29712384 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:06:40.722331+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178757632 unmapped: 29712384 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:06:41.722513+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2281274 data_alloc: 184549376 data_used: 11804672
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178757632 unmapped: 29712384 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:06:42.722682+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178757632 unmapped: 29712384 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:06:43.722853+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 251 heartbeat osd_stat(store_statfs(0x1b20a9000/0x0/0x1bfc00000, data 0x499c6b9/0x4b44000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178757632 unmapped: 29712384 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:06:44.723062+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178757632 unmapped: 29712384 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:06:45.723199+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.905615807s of 10.007020950s, submitted: 45
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: mgrc handle_mgr_map Got map version 51
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.107:6810/2356945423,v1:172.18.0.107:6811/2356945423]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178765824 unmapped: 29704192 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:06:46.723656+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2281312 data_alloc: 184549376 data_used: 11804672
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 251 heartbeat osd_stat(store_statfs(0x1b20aa000/0x0/0x1bfc00000, data 0x499c78b/0x4b44000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178765824 unmapped: 29704192 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:06:47.723837+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 251 handle_osd_map epochs [251,252], i have 251, src has [1,252]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178774016 unmapped: 29696000 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:06:48.724007+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178774016 unmapped: 29696000 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:06:49.724165+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178782208 unmapped: 29687808 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:06:50.724356+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 252 heartbeat osd_stat(store_statfs(0x1b20a6000/0x0/0x1bfc00000, data 0x499eba7/0x4b48000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:06:51.724541+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178782208 unmapped: 29687808 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2284778 data_alloc: 184549376 data_used: 11816960
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 252 heartbeat osd_stat(store_statfs(0x1b20a6000/0x0/0x1bfc00000, data 0x499eba7/0x4b48000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:06:52.724685+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178782208 unmapped: 29687808 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:06:53.724837+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178782208 unmapped: 29687808 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:06:54.724980+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178782208 unmapped: 29687808 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:06:55.725124+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178782208 unmapped: 29687808 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.895042419s of 10.007842064s, submitted: 28
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:06:56.725264+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178782208 unmapped: 29687808 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2284602 data_alloc: 184549376 data_used: 11816960
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:06:57.725423+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178782208 unmapped: 29687808 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 252 heartbeat osd_stat(store_statfs(0x1b20a6000/0x0/0x1bfc00000, data 0x499ec0c/0x4b48000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:06:58.725584+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178782208 unmapped: 29687808 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:06:59.725835+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178782208 unmapped: 29687808 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:07:00.725986+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178790400 unmapped: 29679616 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 252 heartbeat osd_stat(store_statfs(0x1b20a4000/0x0/0x1bfc00000, data 0x499ed42/0x4b4a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:07:01.726144+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178790400 unmapped: 29679616 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2287802 data_alloc: 184549376 data_used: 11816960
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 252 heartbeat osd_stat(store_statfs(0x1b20a4000/0x0/0x1bfc00000, data 0x499ee14/0x4b4a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:07:02.726318+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178798592 unmapped: 29671424 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:07:03.726485+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178806784 unmapped: 29663232 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 252 heartbeat osd_stat(store_statfs(0x1b20a5000/0x0/0x1bfc00000, data 0x499ee43/0x4b49000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:07:04.726671+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178806784 unmapped: 29663232 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 252 heartbeat osd_stat(store_statfs(0x1b20a5000/0x0/0x1bfc00000, data 0x499ee43/0x4b49000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 252 handle_osd_map epochs [253,253], i have 252, src has [1,253]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mgr module ls", "format": "json-pretty"} v 0)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/1447597336' entity='client.admin' cmd={"prefix": "mgr module ls", "format": "json-pretty"} : dispatch
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:07:05.726828+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178839552 unmapped: 29630464 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 253 handle_osd_map epochs [253,253], i have 253, src has [1,253]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.078696251s of 10.239264488s, submitted: 82
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:07:06.726989+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178839552 unmapped: 29630464 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2292776 data_alloc: 184549376 data_used: 11829248
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:07:07.727111+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178839552 unmapped: 29630464 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 253 handle_osd_map epochs [253,253], i have 253, src has [1,253]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 253 handle_osd_map epochs [253,254], i have 253, src has [1,254]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 254 heartbeat osd_stat(store_statfs(0x1b209f000/0x0/0x1bfc00000, data 0x49a1348/0x4b4e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:07:08.727279+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178847744 unmapped: 29622272 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:07:09.727433+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178847744 unmapped: 29622272 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _renew_subs
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _send_mon_message to mon.np0005626463 at v2:172.18.0.103:3300/0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 254 handle_osd_map epochs [255,255], i have 254, src has [1,255]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:07:10.727646+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178872320 unmapped: 29597696 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:07:11.727760+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178872320 unmapped: 29597696 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2297514 data_alloc: 184549376 data_used: 11841536
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:07:12.727957+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178872320 unmapped: 29597696 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 255 heartbeat osd_stat(store_statfs(0x1b2097000/0x0/0x1bfc00000, data 0x49a5ac5/0x4b56000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:07:13.728138+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178880512 unmapped: 29589504 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:07:14.728329+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178880512 unmapped: 29589504 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:07:15.728495+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178880512 unmapped: 29589504 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:07:16.728691+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178880512 unmapped: 29589504 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2298032 data_alloc: 184549376 data_used: 11841536
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:07:17.728807+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178880512 unmapped: 29589504 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _renew_subs
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _send_mon_message to mon.np0005626463 at v2:172.18.0.103:3300/0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 255 handle_osd_map epochs [256,256], i have 255, src has [1,256]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.345253944s of 12.466576576s, submitted: 60
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:07:18.728977+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178888704 unmapped: 29581312 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 256 handle_osd_map epochs [256,257], i have 256, src has [1,257]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 257 heartbeat osd_stat(store_statfs(0x1b2090000/0x0/0x1bfc00000, data 0x49aa187/0x4b5d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:07:19.729119+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178896896 unmapped: 29573120 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:07:20.729268+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178896896 unmapped: 29573120 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:07:21.729417+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178896896 unmapped: 29573120 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2305820 data_alloc: 184549376 data_used: 11853824
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 257 heartbeat osd_stat(store_statfs(0x1b2090000/0x0/0x1bfc00000, data 0x49aa2ec/0x4b5e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:07:22.729564+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178896896 unmapped: 29573120 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:07:23.762265+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178905088 unmapped: 29564928 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:07:24.762487+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178905088 unmapped: 29564928 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 257 heartbeat osd_stat(store_statfs(0x1b2090000/0x0/0x1bfc00000, data 0x49aa2ec/0x4b5e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 257 handle_osd_map epochs [258,258], i have 257, src has [1,258]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _renew_subs
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _send_mon_message to mon.np0005626463 at v2:172.18.0.103:3300/0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 257 handle_osd_map epochs [258,258], i have 258, src has [1,258]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 257 handle_osd_map epochs [258,258], i have 258, src has [1,258]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 257 handle_osd_map epochs [258,258], i have 258, src has [1,258]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:07:25.762654+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178921472 unmapped: 29548544 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:07:26.762779+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178921472 unmapped: 29548544 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2309654 data_alloc: 184549376 data_used: 11866112
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:07:27.762952+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178921472 unmapped: 29548544 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _renew_subs
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _send_mon_message to mon.np0005626463 at v2:172.18.0.103:3300/0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 258 handle_osd_map epochs [259,259], i have 258, src has [1,259]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 259 handle_osd_map epochs [259,259], i have 259, src has [1,259]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 259 handle_osd_map epochs [258,259], i have 259, src has [1,259]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:07:28.763082+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178929664 unmapped: 29540352 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:07:29.763222+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178929664 unmapped: 29540352 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 259 heartbeat osd_stat(store_statfs(0x1b2085000/0x0/0x1bfc00000, data 0x49aeab5/0x4b68000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:07:30.763400+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178929664 unmapped: 29540352 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.268356323s of 12.544970512s, submitted: 109
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:07:31.763556+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178954240 unmapped: 29515776 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2314108 data_alloc: 184549376 data_used: 11870208
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:07:32.763727+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178954240 unmapped: 29515776 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:07:33.763936+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178954240 unmapped: 29515776 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 259 heartbeat osd_stat(store_statfs(0x1b2087000/0x0/0x1bfc00000, data 0x49aeae4/0x4b67000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:07:34.764148+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178954240 unmapped: 29515776 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 259 heartbeat osd_stat(store_statfs(0x1b2087000/0x0/0x1bfc00000, data 0x49aeae4/0x4b67000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 259 handle_osd_map epochs [260,260], i have 259, src has [1,260]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 259 handle_osd_map epochs [260,260], i have 260, src has [1,260]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 259 handle_osd_map epochs [260,260], i have 260, src has [1,260]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:07:35.764298+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 178970624 unmapped: 29499392 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 260 handle_osd_map epochs [260,261], i have 260, src has [1,261]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:07:36.764528+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 180027392 unmapped: 28442624 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2323062 data_alloc: 184549376 data_used: 11882496
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:07:37.764706+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 180027392 unmapped: 28442624 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:07:38.765078+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 180027392 unmapped: 28442624 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 261 handle_osd_map epochs [260,261], i have 261, src has [1,261]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:07:39.765403+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 180035584 unmapped: 28434432 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:07:40.765563+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 180035584 unmapped: 28434432 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 261 heartbeat osd_stat(store_statfs(0x1b2081000/0x0/0x1bfc00000, data 0x49b3388/0x4b6d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:07:41.766032+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 180035584 unmapped: 28434432 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2320644 data_alloc: 184549376 data_used: 11882496
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:07:42.766326+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 180035584 unmapped: 28434432 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _renew_subs
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _send_mon_message to mon.np0005626463 at v2:172.18.0.103:3300/0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 261 handle_osd_map epochs [262,262], i have 261, src has [1,262]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.845068932s of 12.108688354s, submitted: 97
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:07:43.766501+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 180043776 unmapped: 28426240 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:07:44.766966+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 180043776 unmapped: 28426240 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:07:45.767364+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 180043776 unmapped: 28426240 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 262 heartbeat osd_stat(store_statfs(0x1b207c000/0x0/0x1bfc00000, data 0x49b5630/0x4b71000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:07:46.767714+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 180043776 unmapped: 28426240 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2324478 data_alloc: 184549376 data_used: 11894784
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:07:47.767986+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 180043776 unmapped: 28426240 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:07:48.768206+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 180043776 unmapped: 28426240 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 262 heartbeat osd_stat(store_statfs(0x1b207c000/0x0/0x1bfc00000, data 0x49b5630/0x4b71000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:07:49.768465+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 180043776 unmapped: 28426240 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 262 heartbeat osd_stat(store_statfs(0x1b207c000/0x0/0x1bfc00000, data 0x49b5630/0x4b71000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:07:50.768685+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 180043776 unmapped: 28426240 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:07:51.768910+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 180043776 unmapped: 28426240 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2324654 data_alloc: 184549376 data_used: 11894784
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:07:52.769120+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 180043776 unmapped: 28426240 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:07:53.769410+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 180043776 unmapped: 28426240 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:07:54.769906+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 180043776 unmapped: 28426240 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:07:55.770265+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 180043776 unmapped: 28426240 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 262 heartbeat osd_stat(store_statfs(0x1b207c000/0x0/0x1bfc00000, data 0x49b5630/0x4b71000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:07:56.770425+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 180043776 unmapped: 28426240 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2324654 data_alloc: 184549376 data_used: 11894784
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:07:57.770809+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 180043776 unmapped: 28426240 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:07:58.771013+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 180043776 unmapped: 28426240 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:07:59.771378+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 262 heartbeat osd_stat(store_statfs(0x1b207c000/0x0/0x1bfc00000, data 0x49b5630/0x4b71000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 180043776 unmapped: 28426240 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 262 heartbeat osd_stat(store_statfs(0x1b207c000/0x0/0x1bfc00000, data 0x49b5630/0x4b71000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:08:00.771549+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 180043776 unmapped: 28426240 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:08:01.771746+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 180043776 unmapped: 28426240 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2324654 data_alloc: 184549376 data_used: 11894784
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:08:02.771941+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 180043776 unmapped: 28426240 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 262 heartbeat osd_stat(store_statfs(0x1b207c000/0x0/0x1bfc00000, data 0x49b5630/0x4b71000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:08:03.772175+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 180043776 unmapped: 28426240 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:08:04.772381+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 262 heartbeat osd_stat(store_statfs(0x1b207c000/0x0/0x1bfc00000, data 0x49b5630/0x4b71000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 180043776 unmapped: 28426240 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 22.024707794s of 22.064247131s, submitted: 18
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:08:05.772534+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 180387840 unmapped: 28082176 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: mgrc handle_mgr_map Got map version 52
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.107:6810/2356945423,v1:172.18.0.107:6811/2356945423]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:08:06.772719+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 180396032 unmapped: 28073984 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2323774 data_alloc: 184549376 data_used: 11894784
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:08:07.772904+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 262 heartbeat osd_stat(store_statfs(0x1b207d000/0x0/0x1bfc00000, data 0x49b5630/0x4b71000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 180396032 unmapped: 28073984 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:08:08.773048+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 180396032 unmapped: 28073984 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:08:09.773251+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 180396032 unmapped: 28073984 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:08:10.773450+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 180404224 unmapped: 28065792 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:08:11.773653+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 180412416 unmapped: 28057600 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2323774 data_alloc: 184549376 data_used: 11894784
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:08:12.773773+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 180412416 unmapped: 28057600 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 262 heartbeat osd_stat(store_statfs(0x1b207d000/0x0/0x1bfc00000, data 0x49b5630/0x4b71000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:08:13.773962+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 180420608 unmapped: 28049408 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:08:14.774147+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 180420608 unmapped: 28049408 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:08:15.774286+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 180420608 unmapped: 28049408 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:08:16.774451+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 180420608 unmapped: 28049408 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 262 heartbeat osd_stat(store_statfs(0x1b207d000/0x0/0x1bfc00000, data 0x49b5630/0x4b71000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2323774 data_alloc: 184549376 data_used: 11894784
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:08:17.774656+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 180420608 unmapped: 28049408 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:08:18.774806+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 180428800 unmapped: 28041216 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:08:19.775010+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 180428800 unmapped: 28041216 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:08:20.775191+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 180428800 unmapped: 28041216 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:08:21.775332+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 262 heartbeat osd_stat(store_statfs(0x1b207d000/0x0/0x1bfc00000, data 0x49b5630/0x4b71000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 180428800 unmapped: 28041216 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2323774 data_alloc: 184549376 data_used: 11894784
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:08:22.775537+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 180428800 unmapped: 28041216 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:08:23.775721+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 180428800 unmapped: 28041216 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:08:24.775933+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 180428800 unmapped: 28041216 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:08:25.776106+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 180428800 unmapped: 28041216 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 262 heartbeat osd_stat(store_statfs(0x1b207d000/0x0/0x1bfc00000, data 0x49b5630/0x4b71000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:08:26.776272+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 180436992 unmapped: 28033024 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2323774 data_alloc: 184549376 data_used: 11894784
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:08:27.776419+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 180445184 unmapped: 28024832 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:08:28.776589+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 180445184 unmapped: 28024832 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:08:29.776751+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 180453376 unmapped: 28016640 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 262 heartbeat osd_stat(store_statfs(0x1b207d000/0x0/0x1bfc00000, data 0x49b5630/0x4b71000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:08:30.776935+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 180461568 unmapped: 28008448 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 9000.1 total, 600.0 interval
                                                          Cumulative writes: 21K writes, 79K keys, 21K commit groups, 1.0 writes per commit group, ingest: 0.07 GB, 0.01 MB/s
                                                          Cumulative WAL: 21K writes, 7586 syncs, 2.88 writes per sync, written: 0.07 GB, 0.01 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 11K writes, 39K keys, 11K commit groups, 1.0 writes per commit group, ingest: 36.14 MB, 0.06 MB/s
                                                          Interval WAL: 11K writes, 4718 syncs, 2.46 writes per sync, written: 0.04 GB, 0.06 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:08:31.777175+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 180461568 unmapped: 28008448 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2323774 data_alloc: 184549376 data_used: 11894784
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:08:32.777340+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 180461568 unmapped: 28008448 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:08:33.777534+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 180461568 unmapped: 28008448 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:08:34.777739+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 180461568 unmapped: 28008448 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:08:35.777942+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 262 heartbeat osd_stat(store_statfs(0x1b207d000/0x0/0x1bfc00000, data 0x49b5630/0x4b71000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 180469760 unmapped: 28000256 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:08:36.778157+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 180469760 unmapped: 28000256 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2323774 data_alloc: 184549376 data_used: 11894784
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 262 heartbeat osd_stat(store_statfs(0x1b207d000/0x0/0x1bfc00000, data 0x49b5630/0x4b71000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:08:37.778326+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 180469760 unmapped: 28000256 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:08:38.778490+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 180469760 unmapped: 28000256 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:08:39.778685+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 180469760 unmapped: 28000256 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:08:40.778947+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 180469760 unmapped: 28000256 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:08:41.779173+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 180469760 unmapped: 28000256 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2323774 data_alloc: 184549376 data_used: 11894784
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:08:42.779345+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 180469760 unmapped: 28000256 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 262 heartbeat osd_stat(store_statfs(0x1b207d000/0x0/0x1bfc00000, data 0x49b5630/0x4b71000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:08:43.779625+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 180469760 unmapped: 28000256 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:08:44.780411+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 180469760 unmapped: 28000256 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:08:45.780605+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 180469760 unmapped: 28000256 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:08:46.780926+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 180469760 unmapped: 28000256 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _send_mon_message to mon.np0005626463 at v2:172.18.0.103:3300/0
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2323774 data_alloc: 184549376 data_used: 11894784
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 262 heartbeat osd_stat(store_statfs(0x1b207d000/0x0/0x1bfc00000, data 0x49b5630/0x4b71000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:08:47.781767+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 180477952 unmapped: 27992064 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:08:48.782167+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 262 heartbeat osd_stat(store_statfs(0x1b207d000/0x0/0x1bfc00000, data 0x49b5630/0x4b71000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 180477952 unmapped: 27992064 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:08:49.782409+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 180477952 unmapped: 27992064 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:08:50.782576+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 180477952 unmapped: 27992064 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 262 heartbeat osd_stat(store_statfs(0x1b207d000/0x0/0x1bfc00000, data 0x49b5630/0x4b71000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:08:51.782794+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 180486144 unmapped: 27983872 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2323774 data_alloc: 184549376 data_used: 11894784
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:08:52.782943+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 180486144 unmapped: 27983872 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:08:53.783200+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 180486144 unmapped: 27983872 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:08:54.783690+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 180486144 unmapped: 27983872 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:08:55.784090+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 180486144 unmapped: 27983872 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:08:56.784330+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 180486144 unmapped: 27983872 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2323774 data_alloc: 184549376 data_used: 11894784
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 262 heartbeat osd_stat(store_statfs(0x1b207d000/0x0/0x1bfc00000, data 0x49b5630/0x4b71000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:08:57.784501+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 180486144 unmapped: 27983872 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 262 heartbeat osd_stat(store_statfs(0x1b207d000/0x0/0x1bfc00000, data 0x49b5630/0x4b71000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:08:58.784663+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 180486144 unmapped: 27983872 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:08:59.784956+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 180494336 unmapped: 27975680 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:09:00.785104+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 180494336 unmapped: 27975680 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:09:01.785329+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 180494336 unmapped: 27975680 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2323774 data_alloc: 184549376 data_used: 11894784
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:09:02.785548+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 262 heartbeat osd_stat(store_statfs(0x1b207d000/0x0/0x1bfc00000, data 0x49b5630/0x4b71000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 180494336 unmapped: 27975680 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:09:03.785857+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 180494336 unmapped: 27975680 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:09:04.786084+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 59.831336975s of 59.878295898s, submitted: 258
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 262 ms_handle_reset con 0x557957ff0000 session 0x557958428780
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 181665792 unmapped: 26804224 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:09:05.786321+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 181665792 unmapped: 26804224 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: mgrc handle_mgr_map Got map version 53
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.107:6810/2356945423,v1:172.18.0.107:6811/2356945423]
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 262 heartbeat osd_stat(store_statfs(0x1b207d000/0x0/0x1bfc00000, data 0x49b5630/0x4b71000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:09:06.786533+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 181665792 unmapped: 26804224 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2323774 data_alloc: 184549376 data_used: 11894784
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:09:07.786716+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 181665792 unmapped: 26804224 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:09:08.787039+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 181665792 unmapped: 26804224 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:09:09.787188+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 181665792 unmapped: 26804224 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:09:10.787425+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 262 heartbeat osd_stat(store_statfs(0x1b207d000/0x0/0x1bfc00000, data 0x49b5630/0x4b71000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 181682176 unmapped: 26787840 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:09:11.787642+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 181682176 unmapped: 26787840 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2323774 data_alloc: 184549376 data_used: 11894784
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:09:12.787827+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 181682176 unmapped: 26787840 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 262 heartbeat osd_stat(store_statfs(0x1b207d000/0x0/0x1bfc00000, data 0x49b5630/0x4b71000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:09:13.788135+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 181682176 unmapped: 26787840 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:09:14.788361+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 181682176 unmapped: 26787840 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:09:15.788559+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 181682176 unmapped: 26787840 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:09:16.788729+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 181682176 unmapped: 26787840 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2323774 data_alloc: 184549376 data_used: 11894784
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:09:17.788860+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 262 heartbeat osd_stat(store_statfs(0x1b207d000/0x0/0x1bfc00000, data 0x49b5630/0x4b71000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 181682176 unmapped: 26787840 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:09:18.789066+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 181682176 unmapped: 26787840 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:09:19.789235+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 181682176 unmapped: 26787840 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:09:20.789489+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 181682176 unmapped: 26787840 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:09:21.789684+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 181682176 unmapped: 26787840 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2323774 data_alloc: 184549376 data_used: 11894784
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:09:22.789892+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 181682176 unmapped: 26787840 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:09:23.790117+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 262 heartbeat osd_stat(store_statfs(0x1b207d000/0x0/0x1bfc00000, data 0x49b5630/0x4b71000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 181706752 unmapped: 26763264 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:09:24.790241+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 181706752 unmapped: 26763264 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:09:25.790451+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 181706752 unmapped: 26763264 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:09:26.790638+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 181706752 unmapped: 26763264 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2323774 data_alloc: 184549376 data_used: 11894784
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:09:27.790918+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 181706752 unmapped: 26763264 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 262 heartbeat osd_stat(store_statfs(0x1b207d000/0x0/0x1bfc00000, data 0x49b5630/0x4b71000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:09:28.791115+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 181706752 unmapped: 26763264 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 262 heartbeat osd_stat(store_statfs(0x1b207d000/0x0/0x1bfc00000, data 0x49b5630/0x4b71000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:09:29.791274+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 181714944 unmapped: 26755072 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:09:30.791452+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 262 heartbeat osd_stat(store_statfs(0x1b207d000/0x0/0x1bfc00000, data 0x49b5630/0x4b71000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 181714944 unmapped: 26755072 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:09:31.791658+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 181723136 unmapped: 26746880 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2323774 data_alloc: 184549376 data_used: 11894784
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:09:32.791929+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 181723136 unmapped: 26746880 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:09:33.792168+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 181723136 unmapped: 26746880 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:09:34.792349+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 181723136 unmapped: 26746880 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 262 heartbeat osd_stat(store_statfs(0x1b207d000/0x0/0x1bfc00000, data 0x49b5630/0x4b71000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:09:35.792509+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 181723136 unmapped: 26746880 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:09:36.792685+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 181723136 unmapped: 26746880 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2323774 data_alloc: 184549376 data_used: 11894784
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:09:37.792939+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 181723136 unmapped: 26746880 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:09:38.793115+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 181723136 unmapped: 26746880 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:09:39.793319+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 181723136 unmapped: 26746880 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:09:40.793526+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 262 heartbeat osd_stat(store_statfs(0x1b207d000/0x0/0x1bfc00000, data 0x49b5630/0x4b71000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 181723136 unmapped: 26746880 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:09:41.793686+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 181723136 unmapped: 26746880 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2323774 data_alloc: 184549376 data_used: 11894784
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:09:42.793856+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 181723136 unmapped: 26746880 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 262 heartbeat osd_stat(store_statfs(0x1b207d000/0x0/0x1bfc00000, data 0x49b5630/0x4b71000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:09:43.794117+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 181723136 unmapped: 26746880 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:09:44.794271+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 181723136 unmapped: 26746880 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:09:45.794473+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 262 heartbeat osd_stat(store_statfs(0x1b207d000/0x0/0x1bfc00000, data 0x49b5630/0x4b71000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 181723136 unmapped: 26746880 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 262 heartbeat osd_stat(store_statfs(0x1b207d000/0x0/0x1bfc00000, data 0x49b5630/0x4b71000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:09:46.794669+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 181723136 unmapped: 26746880 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2323774 data_alloc: 184549376 data_used: 11894784
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:09:47.794859+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 181731328 unmapped: 26738688 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:09:48.795779+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 181731328 unmapped: 26738688 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:09:49.796437+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 181731328 unmapped: 26738688 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:09:50.796592+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 262 heartbeat osd_stat(store_statfs(0x1b207d000/0x0/0x1bfc00000, data 0x49b5630/0x4b71000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 181731328 unmapped: 26738688 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:09:51.796715+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 181731328 unmapped: 26738688 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:09:52.797097+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2323774 data_alloc: 184549376 data_used: 11894784
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 262 heartbeat osd_stat(store_statfs(0x1b207d000/0x0/0x1bfc00000, data 0x49b5630/0x4b71000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 181739520 unmapped: 26730496 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:09:53.798042+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 181739520 unmapped: 26730496 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:09:54.798204+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 181739520 unmapped: 26730496 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:09:55.798352+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 262 heartbeat osd_stat(store_statfs(0x1b207d000/0x0/0x1bfc00000, data 0x49b5630/0x4b71000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 181747712 unmapped: 26722304 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:09:56.798674+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 262 heartbeat osd_stat(store_statfs(0x1b207d000/0x0/0x1bfc00000, data 0x49b5630/0x4b71000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 262 heartbeat osd_stat(store_statfs(0x1b207d000/0x0/0x1bfc00000, data 0x49b5630/0x4b71000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 181747712 unmapped: 26722304 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:09:57.798840+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2323774 data_alloc: 184549376 data_used: 11894784
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 181747712 unmapped: 26722304 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:09:58.798960+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 181747712 unmapped: 26722304 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:09:59.799654+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 181747712 unmapped: 26722304 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:10:00.799808+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 181747712 unmapped: 26722304 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:10:01.799986+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 181747712 unmapped: 26722304 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 262 heartbeat osd_stat(store_statfs(0x1b207d000/0x0/0x1bfc00000, data 0x49b5630/0x4b71000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:10:02.800112+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2323774 data_alloc: 184549376 data_used: 11894784
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 262 heartbeat osd_stat(store_statfs(0x1b207d000/0x0/0x1bfc00000, data 0x49b5630/0x4b71000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 181747712 unmapped: 26722304 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:10:03.800281+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 262 heartbeat osd_stat(store_statfs(0x1b207d000/0x0/0x1bfc00000, data 0x49b5630/0x4b71000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 181747712 unmapped: 26722304 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:10:04.800436+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 181747712 unmapped: 26722304 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:10:05.800603+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 181739520 unmapped: 26730496 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:10:06.800747+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: do_command 'config diff' '{prefix=config diff}'
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: do_command 'config diff' '{prefix=config diff}' result is 0 bytes
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: do_command 'config show' '{prefix=config show}'
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: do_command 'config show' '{prefix=config show}' result is 0 bytes
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: do_command 'counter dump' '{prefix=counter dump}'
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: do_command 'counter dump' '{prefix=counter dump}' result is 0 bytes
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: do_command 'counter schema' '{prefix=counter schema}'
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: do_command 'counter schema' '{prefix=counter schema}' result is 0 bytes
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 182009856 unmapped: 26460160 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:10:07.800876+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: bluestore.MempoolThread(0x557956d2db60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2323774 data_alloc: 184549376 data_used: 11894784
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: osd.2 262 heartbeat osd_stat(store_statfs(0x1b207d000/0x0/0x1bfc00000, data 0x49b5630/0x4b71000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,3,4,5] op hist [])
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: prioritycache tune_memory target: 3561601228 mapped: 181862400 unmapped: 26607616 heap: 208470016 old mem: 2222054675 new mem: 2222054675
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: tick
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_tickets
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-23T10:10:08.800995+0000)
Feb 23 10:10:39 np0005626463.localdomain ceph-osd[31633]: do_command 'log dump' '{prefix=log dump}'
Feb 23 10:10:39 np0005626463.localdomain rsyslogd[758]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Feb 23 10:10:39 np0005626463.localdomain podman[327052]: 
Feb 23 10:10:39 np0005626463.localdomain podman[327052]: 2026-02-23 10:10:39.705899992 +0000 UTC m=+0.069456519 container create 7ede5a6cc6c6159099fb810327b3a7e0aa1e0688ebd809510768d2d570f28a13 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=reverent_wright, io.buildah.version=1.42.2, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.created=2026-02-09T10:25:24Z, distribution-scope=public, build-date=2026-02-09T10:25:24Z, CEPH_POINT_RELEASE=, GIT_BRANCH=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1770267347, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, vcs-type=git, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, architecture=x86_64, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container)
Feb 23 10:10:39 np0005626463.localdomain systemd[1]: Started libpod-conmon-7ede5a6cc6c6159099fb810327b3a7e0aa1e0688ebd809510768d2d570f28a13.scope.
Feb 23 10:10:39 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 10:10:39 np0005626463.localdomain podman[327052]: 2026-02-23 10:10:39.760100667 +0000 UTC m=+0.123657204 container init 7ede5a6cc6c6159099fb810327b3a7e0aa1e0688ebd809510768d2d570f28a13 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=reverent_wright, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-type=git, io.openshift.expose-services=, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, description=Red Hat Ceph Storage 7, io.buildah.version=1.42.2, distribution-scope=public, name=rhceph, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, org.opencontainers.image.created=2026-02-09T10:25:24Z, build-date=2026-02-09T10:25:24Z, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1770267347, version=7, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main)
Feb 23 10:10:39 np0005626463.localdomain podman[327052]: 2026-02-23 10:10:39.681314241 +0000 UTC m=+0.044870788 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 23 10:10:39 np0005626463.localdomain podman[327052]: 2026-02-23 10:10:39.775161202 +0000 UTC m=+0.138717729 container start 7ede5a6cc6c6159099fb810327b3a7e0aa1e0688ebd809510768d2d570f28a13 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=reverent_wright, io.openshift.expose-services=, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, release=1770267347, GIT_CLEAN=True, version=7, description=Red Hat Ceph Storage 7, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, io.openshift.tags=rhceph ceph, RELEASE=main, io.buildah.version=1.42.2, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., build-date=2026-02-09T10:25:24Z, name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, org.opencontainers.image.created=2026-02-09T10:25:24Z)
Feb 23 10:10:39 np0005626463.localdomain podman[327052]: 2026-02-23 10:10:39.775574006 +0000 UTC m=+0.139130563 container attach 7ede5a6cc6c6159099fb810327b3a7e0aa1e0688ebd809510768d2d570f28a13 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=reverent_wright, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, io.openshift.tags=rhceph ceph, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-type=git, release=1770267347, version=7, description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2026-02-09T10:25:24Z, build-date=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.buildah.version=1.42.2, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers)
Feb 23 10:10:39 np0005626463.localdomain systemd[1]: libpod-7ede5a6cc6c6159099fb810327b3a7e0aa1e0688ebd809510768d2d570f28a13.scope: Deactivated successfully.
Feb 23 10:10:39 np0005626463.localdomain reverent_wright[327073]: 167 167
Feb 23 10:10:39 np0005626463.localdomain podman[327052]: 2026-02-23 10:10:39.777416782 +0000 UTC m=+0.140973309 container died 7ede5a6cc6c6159099fb810327b3a7e0aa1e0688ebd809510768d2d570f28a13 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=reverent_wright, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-02-09T10:25:24Z, name=rhceph, org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-type=git, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, distribution-scope=public, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, ceph=True, version=7, io.buildah.version=1.42.2, io.openshift.expose-services=, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, release=1770267347, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, RELEASE=main, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7)
Feb 23 10:10:39 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.106:0/1467611387' entity='client.admin' cmd={"prefix": "osd crush rule ls"} : dispatch
Feb 23 10:10:39 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.108:0/3021186186' entity='client.admin' cmd={"prefix": "mgr services", "format": "json-pretty"} : dispatch
Feb 23 10:10:39 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.107:0/1072074079' entity='client.admin' cmd={"prefix": "osd erasure-code-profile ls"} : dispatch
Feb 23 10:10:39 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.106:0/4266852345' entity='client.admin' cmd={"prefix": "mgr metadata", "format": "json-pretty"} : dispatch
Feb 23 10:10:39 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.108:0/1416433115' entity='client.admin' cmd={"prefix": "osd crush tree", "show_shadow": true} : dispatch
Feb 23 10:10:39 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.107:0/4145061573' entity='client.admin' cmd={"prefix": "mgr services", "format": "json-pretty"} : dispatch
Feb 23 10:10:39 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.106:0/2621701882' entity='client.admin' cmd={"prefix": "osd crush show-tunables"} : dispatch
Feb 23 10:10:39 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.108:0/1587917973' entity='client.admin' cmd={"prefix": "mgr stat", "format": "json-pretty"} : dispatch
Feb 23 10:10:39 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.107:0/1961588872' entity='client.admin' cmd={"prefix": "osd metadata"} : dispatch
Feb 23 10:10:39 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.108:0/1578668231' entity='client.admin' cmd={"prefix": "osd erasure-code-profile ls"} : dispatch
Feb 23 10:10:39 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.106:0/1447597336' entity='client.admin' cmd={"prefix": "mgr module ls", "format": "json-pretty"} : dispatch
Feb 23 10:10:39 np0005626463.localdomain ceph-mon[294160]: pgmap v766: 177 pgs: 177 active+clean; 226 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Feb 23 10:10:39 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.107:0/403833915' entity='client.admin' cmd={"prefix": "mgr stat", "format": "json-pretty"} : dispatch
Feb 23 10:10:39 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.106:0/1018236652' entity='client.admin' cmd={"prefix": "osd crush tree", "show_shadow": true} : dispatch
Feb 23 10:10:39 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.107:0/422682873' entity='client.admin' cmd={"prefix": "osd utilization"} : dispatch
Feb 23 10:10:39 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.108:0/1206515708' entity='client.admin' cmd={"prefix": "mgr versions", "format": "json-pretty"} : dispatch
Feb 23 10:10:39 np0005626463.localdomain podman[327081]: 2026-02-23 10:10:39.859269152 +0000 UTC m=+0.071623544 container remove 7ede5a6cc6c6159099fb810327b3a7e0aa1e0688ebd809510768d2d570f28a13 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=reverent_wright, io.buildah.version=1.42.2, version=7, CEPH_POINT_RELEASE=, build-date=2026-02-09T10:25:24Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, distribution-scope=public, ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_BRANCH=main, GIT_CLEAN=True, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1770267347)
Feb 23 10:10:39 np0005626463.localdomain systemd[1]: libpod-conmon-7ede5a6cc6c6159099fb810327b3a7e0aa1e0688ebd809510768d2d570f28a13.scope: Deactivated successfully.
Feb 23 10:10:39 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mgr services", "format": "json-pretty"} v 0)
Feb 23 10:10:39 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/1483951900' entity='client.admin' cmd={"prefix": "mgr services", "format": "json-pretty"} : dispatch
Feb 23 10:10:40 np0005626463.localdomain podman[327125]: 
Feb 23 10:10:40 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:10:40.053 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:10:40 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:10:40.054 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:10:40 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:10:40.055 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:10:40 np0005626463.localdomain podman[327125]: 2026-02-23 10:10:40.012583922 +0000 UTC m=+0.029855983 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 23 10:10:40 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "osd erasure-code-profile ls"} v 0)
Feb 23 10:10:40 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/2332952222' entity='client.admin' cmd={"prefix": "osd erasure-code-profile ls"} : dispatch
Feb 23 10:10:40 np0005626463.localdomain podman[327125]: 2026-02-23 10:10:40.13961586 +0000 UTC m=+0.156887871 container create 237e071dbf5474324d903ad57e7173515e62caa26152637d383e90af92d5125b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=jolly_bohr, com.redhat.component=rhceph-container, build-date=2026-02-09T10:25:24Z, io.k8s.description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, GIT_CLEAN=True, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., RELEASE=main, org.opencontainers.image.created=2026-02-09T10:25:24Z, CEPH_POINT_RELEASE=, ceph=True, release=1770267347, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.42.2, architecture=x86_64)
Feb 23 10:10:40 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:10:40.250 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 10:10:40 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:10:40.251 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 10:10:40 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:10:40.251 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 10:10:40 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:10:40.251 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Auditing locally available compute resources for np0005626463.localdomain (node: np0005626463.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 23 10:10:40 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:10:40.251 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 10:10:40 np0005626463.localdomain systemd[1]: Started libpod-conmon-237e071dbf5474324d903ad57e7173515e62caa26152637d383e90af92d5125b.scope.
Feb 23 10:10:40 np0005626463.localdomain systemd[1]: Started libcrun container.
Feb 23 10:10:40 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/43197119cdcf71fa4564257b70a616bb1834fe7cad5a169aae508a563fc912dc/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Feb 23 10:10:40 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/43197119cdcf71fa4564257b70a616bb1834fe7cad5a169aae508a563fc912dc/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Feb 23 10:10:40 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/43197119cdcf71fa4564257b70a616bb1834fe7cad5a169aae508a563fc912dc/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Feb 23 10:10:40 np0005626463.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/43197119cdcf71fa4564257b70a616bb1834fe7cad5a169aae508a563fc912dc/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Feb 23 10:10:40 np0005626463.localdomain podman[327125]: 2026-02-23 10:10:40.429539582 +0000 UTC m=+0.446811583 container init 237e071dbf5474324d903ad57e7173515e62caa26152637d383e90af92d5125b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=jolly_bohr, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, CEPH_POINT_RELEASE=, name=rhceph, build-date=2026-02-09T10:25:24Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, ceph=True, vcs-type=git, release=1770267347, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.openshift.expose-services=, version=7, RELEASE=main, io.buildah.version=1.42.2)
Feb 23 10:10:40 np0005626463.localdomain podman[327125]: 2026-02-23 10:10:40.444066671 +0000 UTC m=+0.461338672 container start 237e071dbf5474324d903ad57e7173515e62caa26152637d383e90af92d5125b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=jolly_bohr, RELEASE=main, build-date=2026-02-09T10:25:24Z, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, name=rhceph, ceph=True, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, architecture=x86_64, io.buildah.version=1.42.2, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, vendor=Red Hat, Inc., version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1770267347, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_BRANCH=main, vcs-type=git)
Feb 23 10:10:40 np0005626463.localdomain podman[327125]: 2026-02-23 10:10:40.444525565 +0000 UTC m=+0.461797576 container attach 237e071dbf5474324d903ad57e7173515e62caa26152637d383e90af92d5125b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=jolly_bohr, release=1770267347, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vendor=Red Hat, Inc., RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, io.buildah.version=1.42.2, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, build-date=2026-02-09T10:25:24Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.created=2026-02-09T10:25:24Z, com.redhat.component=rhceph-container, vcs-type=git, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=)
Feb 23 10:10:40 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "osd metadata"} v 0)
Feb 23 10:10:40 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/1513206490' entity='client.admin' cmd={"prefix": "osd metadata"} : dispatch
Feb 23 10:10:40 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:10:40.678 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.427s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 10:10:40 np0005626463.localdomain systemd[1]: tmp-crun.MySyqv.mount: Deactivated successfully.
Feb 23 10:10:40 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-0e6ae9d615a848ee1477ff1f14f88d1c62a37b1b758b6607a16f502d03f6cabc-merged.mount: Deactivated successfully.
Feb 23 10:10:40 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:10:40.736 282211 DEBUG nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 23 10:10:40 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:10:40.737 282211 DEBUG nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 23 10:10:40 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mgr versions", "format": "json-pretty"} v 0)
Feb 23 10:10:40 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/2179467379' entity='client.admin' cmd={"prefix": "mgr versions", "format": "json-pretty"} : dispatch
Feb 23 10:10:40 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "osd utilization"} v 0)
Feb 23 10:10:40 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/1162712184' entity='client.admin' cmd={"prefix": "osd utilization"} : dispatch
Feb 23 10:10:40 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.108:0/1803047198' entity='client.admin' cmd={"prefix": "osd metadata"} : dispatch
Feb 23 10:10:40 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.106:0/1483951900' entity='client.admin' cmd={"prefix": "mgr services", "format": "json-pretty"} : dispatch
Feb 23 10:10:40 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.107:0/2499614800' entity='client.admin' cmd={"prefix": "mgr versions", "format": "json-pretty"} : dispatch
Feb 23 10:10:40 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.106:0/2332952222' entity='client.admin' cmd={"prefix": "osd erasure-code-profile ls"} : dispatch
Feb 23 10:10:40 np0005626463.localdomain ceph-mon[294160]: from='client.59332 -' entity='client.admin' cmd=[{"prefix": "telemetry channel ls", "target": ["mon-mgr", ""]}]: dispatch
Feb 23 10:10:40 np0005626463.localdomain ceph-mon[294160]: from='client.69506 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 23 10:10:40 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.108:0/1230008388' entity='client.admin' cmd={"prefix": "osd utilization"} : dispatch
Feb 23 10:10:40 np0005626463.localdomain ceph-mon[294160]: from='client.59344 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 23 10:10:40 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.106:0/1579920425' entity='client.admin' cmd={"prefix": "mgr stat", "format": "json-pretty"} : dispatch
Feb 23 10:10:40 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.106:0/1513206490' entity='client.admin' cmd={"prefix": "osd metadata"} : dispatch
Feb 23 10:10:40 np0005626463.localdomain ceph-mon[294160]: from='client.69521 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 23 10:10:40 np0005626463.localdomain ceph-mon[294160]: from='client.59350 -' entity='client.admin' cmd=[{"prefix": "telemetry collection ls", "target": ["mon-mgr", ""]}]: dispatch
Feb 23 10:10:40 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.106:0/188295196' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 10:10:40 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.106:0/2179467379' entity='client.admin' cmd={"prefix": "mgr versions", "format": "json-pretty"} : dispatch
Feb 23 10:10:40 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:10:40.930 282211 WARNING nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 23 10:10:40 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:10:40.931 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Hypervisor/Node resource view: name=np0005626463.localdomain free_ram=10915MB free_disk=41.8366584777832GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 23 10:10:40 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:10:40.931 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 10:10:40 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:10:40.931 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 10:10:41 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:10:41.000 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Instance c2a7d92b-952f-46a7-8a6a-3322a48fcf4b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 23 10:10:41 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:10:41.000 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 23 10:10:41 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:10:41.001 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Final resource view: name=np0005626463.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 23 10:10:41 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626465.localdomain.devices.0}] v 0)
Feb 23 10:10:41 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 10:10:41 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626465.localdomain}] v 0)
Feb 23 10:10:41 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:10:41.033 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 10:10:41 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 10:10:41 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain.devices.0}] v 0)
Feb 23 10:10:41 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 10:10:41 np0005626463.localdomain jolly_bohr[327449]: [
Feb 23 10:10:41 np0005626463.localdomain jolly_bohr[327449]:     {
Feb 23 10:10:41 np0005626463.localdomain jolly_bohr[327449]:         "available": false,
Feb 23 10:10:41 np0005626463.localdomain jolly_bohr[327449]:         "ceph_device": false,
Feb 23 10:10:41 np0005626463.localdomain jolly_bohr[327449]:         "device_id": "QEMU_DVD-ROM_QM00001",
Feb 23 10:10:41 np0005626463.localdomain jolly_bohr[327449]:         "lsm_data": {},
Feb 23 10:10:41 np0005626463.localdomain jolly_bohr[327449]:         "lvs": [],
Feb 23 10:10:41 np0005626463.localdomain jolly_bohr[327449]:         "path": "/dev/sr0",
Feb 23 10:10:41 np0005626463.localdomain jolly_bohr[327449]:         "rejected_reasons": [
Feb 23 10:10:41 np0005626463.localdomain jolly_bohr[327449]:             "Has a FileSystem",
Feb 23 10:10:41 np0005626463.localdomain jolly_bohr[327449]:             "Insufficient space (<5GB)"
Feb 23 10:10:41 np0005626463.localdomain jolly_bohr[327449]:         ],
Feb 23 10:10:41 np0005626463.localdomain jolly_bohr[327449]:         "sys_api": {
Feb 23 10:10:41 np0005626463.localdomain jolly_bohr[327449]:             "actuators": null,
Feb 23 10:10:41 np0005626463.localdomain jolly_bohr[327449]:             "device_nodes": "sr0",
Feb 23 10:10:41 np0005626463.localdomain jolly_bohr[327449]:             "human_readable_size": "482.00 KB",
Feb 23 10:10:41 np0005626463.localdomain jolly_bohr[327449]:             "id_bus": "ata",
Feb 23 10:10:41 np0005626463.localdomain jolly_bohr[327449]:             "model": "QEMU DVD-ROM",
Feb 23 10:10:41 np0005626463.localdomain jolly_bohr[327449]:             "nr_requests": "2",
Feb 23 10:10:41 np0005626463.localdomain jolly_bohr[327449]:             "partitions": {},
Feb 23 10:10:41 np0005626463.localdomain jolly_bohr[327449]:             "path": "/dev/sr0",
Feb 23 10:10:41 np0005626463.localdomain jolly_bohr[327449]:             "removable": "1",
Feb 23 10:10:41 np0005626463.localdomain jolly_bohr[327449]:             "rev": "2.5+",
Feb 23 10:10:41 np0005626463.localdomain jolly_bohr[327449]:             "ro": "0",
Feb 23 10:10:41 np0005626463.localdomain jolly_bohr[327449]:             "rotational": "1",
Feb 23 10:10:41 np0005626463.localdomain jolly_bohr[327449]:             "sas_address": "",
Feb 23 10:10:41 np0005626463.localdomain jolly_bohr[327449]:             "sas_device_handle": "",
Feb 23 10:10:41 np0005626463.localdomain jolly_bohr[327449]:             "scheduler_mode": "mq-deadline",
Feb 23 10:10:41 np0005626463.localdomain jolly_bohr[327449]:             "sectors": 0,
Feb 23 10:10:41 np0005626463.localdomain jolly_bohr[327449]:             "sectorsize": "2048",
Feb 23 10:10:41 np0005626463.localdomain jolly_bohr[327449]:             "size": 493568.0,
Feb 23 10:10:41 np0005626463.localdomain jolly_bohr[327449]:             "support_discard": "0",
Feb 23 10:10:41 np0005626463.localdomain jolly_bohr[327449]:             "type": "disk",
Feb 23 10:10:41 np0005626463.localdomain jolly_bohr[327449]:             "vendor": "QEMU"
Feb 23 10:10:41 np0005626463.localdomain jolly_bohr[327449]:         }
Feb 23 10:10:41 np0005626463.localdomain jolly_bohr[327449]:     }
Feb 23 10:10:41 np0005626463.localdomain jolly_bohr[327449]: ]
Feb 23 10:10:41 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain}] v 0)
Feb 23 10:10:41 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 10:10:41 np0005626463.localdomain systemd[1]: libpod-237e071dbf5474324d903ad57e7173515e62caa26152637d383e90af92d5125b.scope: Deactivated successfully.
Feb 23 10:10:41 np0005626463.localdomain podman[327125]: 2026-02-23 10:10:41.272987387 +0000 UTC m=+1.290259388 container died 237e071dbf5474324d903ad57e7173515e62caa26152637d383e90af92d5125b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=jolly_bohr, release=1770267347, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, org.opencontainers.image.created=2026-02-09T10:25:24Z, distribution-scope=public, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, io.buildah.version=1.42.2, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-02-09T10:25:24Z, ceph=True, GIT_CLEAN=True, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, version=7, GIT_BRANCH=main, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14)
Feb 23 10:10:41 np0005626463.localdomain systemd[1]: var-lib-containers-storage-overlay-43197119cdcf71fa4564257b70a616bb1834fe7cad5a169aae508a563fc912dc-merged.mount: Deactivated successfully.
Feb 23 10:10:41 np0005626463.localdomain podman[329466]: 2026-02-23 10:10:41.352721452 +0000 UTC m=+0.064710052 container remove 237e071dbf5474324d903ad57e7173515e62caa26152637d383e90af92d5125b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=jolly_bohr, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vendor=Red Hat, Inc., version=7, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, CEPH_POINT_RELEASE=, release=1770267347, ceph=True, com.redhat.component=rhceph-container, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_CLEAN=True, build-date=2026-02-09T10:25:24Z, description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-type=git, io.buildah.version=1.42.2, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph)
Feb 23 10:10:41 np0005626463.localdomain systemd[1]: libpod-conmon-237e071dbf5474324d903ad57e7173515e62caa26152637d383e90af92d5125b.scope: Deactivated successfully.
Feb 23 10:10:41 np0005626463.localdomain sudo[326905]: pam_unix(sudo:session): session closed for user root
Feb 23 10:10:41 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626463.localdomain.devices.0}] v 0)
Feb 23 10:10:41 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 10:10:41 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626463.localdomain}] v 0)
Feb 23 10:10:41 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 10:10:41 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Feb 23 10:10:41 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 10:10:41 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:10:41.464 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.431s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 10:10:41 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:10:41.471 282211 DEBUG nova.compute.provider_tree [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Inventory has not changed in ProviderTree for provider: be63d86c-a403-4ec9-a515-07ea2962cb4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 23 10:10:41 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:10:41.482 282211 DEBUG nova.scheduler.client.report [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Inventory has not changed for provider be63d86c-a403-4ec9-a515-07ea2962cb4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 23 10:10:41 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:10:41.484 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Compute_service record updated for np0005626463.localdomain:np0005626463.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 23 10:10:41 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:10:41.484 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.553s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 10:10:41 np0005626463.localdomain sudo[329500]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 23 10:10:41 np0005626463.localdomain sudo[329500]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 23 10:10:41 np0005626463.localdomain sudo[329500]: pam_unix(sudo:session): session closed for user root
Feb 23 10:10:41 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:10:41.607 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 10:10:41 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:10:41.637 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 10:10:41 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:10:41.637 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5031 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 10:10:41 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:10:41.637 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 10:10:41 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:10:41.640 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:10:41 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:10:41.641 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 10:10:41 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:10:41.643 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:10:41 np0005626463.localdomain ceph-mon[294160]: from='client.59359 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 23 10:10:41 np0005626463.localdomain ceph-mon[294160]: from='client.69536 -' entity='client.admin' cmd=[{"prefix": "telemetry channel ls", "target": ["mon-mgr", ""]}]: dispatch
Feb 23 10:10:41 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.106:0/1162712184' entity='client.admin' cmd={"prefix": "osd utilization"} : dispatch
Feb 23 10:10:41 np0005626463.localdomain ceph-mon[294160]: from='client.69545 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 23 10:10:41 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 10:10:41 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 10:10:41 np0005626463.localdomain ceph-mon[294160]: from='client.59371 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 23 10:10:41 np0005626463.localdomain ceph-mon[294160]: from='client.49575 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 23 10:10:41 np0005626463.localdomain ceph-mon[294160]: from='client.69551 -' entity='client.admin' cmd=[{"prefix": "telemetry collection ls", "target": ["mon-mgr", ""]}]: dispatch
Feb 23 10:10:41 np0005626463.localdomain ceph-mon[294160]: from='client.49581 -' entity='client.admin' cmd=[{"prefix": "telemetry channel ls", "target": ["mon-mgr", ""]}]: dispatch
Feb 23 10:10:41 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 10:10:41 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 10:10:41 np0005626463.localdomain ceph-mon[294160]: from='client.69560 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 23 10:10:41 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 10:10:41 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 10:10:41 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 10:10:41 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 23 10:10:41 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 10:10:41 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 23 10:10:41 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.106:0/4127465571' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 10:10:41 np0005626463.localdomain ceph-mon[294160]: from='client.59386 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 23 10:10:41 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.107:0/1820329700' entity='client.admin' cmd={"prefix": "quorum_status"} : dispatch
Feb 23 10:10:41 np0005626463.localdomain ceph-mon[294160]: from='client.49596 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 23 10:10:41 np0005626463.localdomain ceph-mon[294160]: pgmap v767: 177 pgs: 177 active+clean; 226 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Feb 23 10:10:41 np0005626463.localdomain systemd[1]: Starting Hostname Service...
Feb 23 10:10:42 np0005626463.localdomain systemd[1]: Started Hostname Service.
Feb 23 10:10:42 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "versions"} v 0)
Feb 23 10:10:42 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/2133596257' entity='client.admin' cmd={"prefix": "versions"} : dispatch
Feb 23 10:10:42 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "quorum_status"} v 0)
Feb 23 10:10:42 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/832427830' entity='client.admin' cmd={"prefix": "quorum_status"} : dispatch
Feb 23 10:10:42 np0005626463.localdomain ceph-mon[294160]: from='client.49605 -' entity='client.admin' cmd=[{"prefix": "telemetry collection ls", "target": ["mon-mgr", ""]}]: dispatch
Feb 23 10:10:42 np0005626463.localdomain ceph-mon[294160]: from='client.69581 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 23 10:10:42 np0005626463.localdomain ceph-mon[294160]: from='client.59401 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 23 10:10:42 np0005626463.localdomain ceph-mon[294160]: from='client.49614 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 23 10:10:42 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.107:0/1811126054' entity='client.admin' cmd={"prefix": "versions"} : dispatch
Feb 23 10:10:42 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.108:0/3794353973' entity='client.admin' cmd={"prefix": "quorum_status"} : dispatch
Feb 23 10:10:42 np0005626463.localdomain ceph-mon[294160]: from='client.69602 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 23 10:10:42 np0005626463.localdomain ceph-mon[294160]: from='client.59407 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 23 10:10:42 np0005626463.localdomain ceph-mon[294160]: from='client.49629 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 23 10:10:42 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.107:0/2431656442' entity='client.admin' cmd={"prefix": "health", "detail": "detail", "format": "json-pretty"} : dispatch
Feb 23 10:10:42 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.108:0/2133596257' entity='client.admin' cmd={"prefix": "versions"} : dispatch
Feb 23 10:10:42 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.106:0/832427830' entity='client.admin' cmd={"prefix": "quorum_status"} : dispatch
Feb 23 10:10:43 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Feb 23 10:10:43 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Feb 23 10:10:43 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "versions"} v 0)
Feb 23 10:10:43 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/2399433245' entity='client.admin' cmd={"prefix": "versions"} : dispatch
Feb 23 10:10:43 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 23 10:10:43 np0005626463.localdomain openstack_network_exporter[245358]: ERROR   10:10:43 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 23 10:10:43 np0005626463.localdomain openstack_network_exporter[245358]: 
Feb 23 10:10:43 np0005626463.localdomain openstack_network_exporter[245358]: ERROR   10:10:43 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 23 10:10:43 np0005626463.localdomain openstack_network_exporter[245358]: 
Feb 23 10:10:43 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Feb 23 10:10:43 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Feb 23 10:10:43 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "health", "detail": "detail", "format": "json-pretty"} v 0)
Feb 23 10:10:43 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/754933428' entity='client.admin' cmd={"prefix": "health", "detail": "detail", "format": "json-pretty"} : dispatch
Feb 23 10:10:43 np0005626463.localdomain ceph-mon[294160]: from='client.59425 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 23 10:10:43 np0005626463.localdomain ceph-mon[294160]: from='client.69620 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 23 10:10:43 np0005626463.localdomain ceph-mon[294160]: from='client.49650 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 23 10:10:43 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.107:0/3287719225' entity='client.admin' cmd={"prefix": "osd tree", "format": "json-pretty"} : dispatch
Feb 23 10:10:43 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.108:0/2553256981' entity='client.admin' cmd={"prefix": "health", "detail": "detail", "format": "json-pretty"} : dispatch
Feb 23 10:10:43 np0005626463.localdomain ceph-mon[294160]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Feb 23 10:10:43 np0005626463.localdomain ceph-mon[294160]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Feb 23 10:10:43 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.106:0/2399433245' entity='client.admin' cmd={"prefix": "versions"} : dispatch
Feb 23 10:10:43 np0005626463.localdomain ceph-mon[294160]: from='client.49668 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 23 10:10:43 np0005626463.localdomain ceph-mon[294160]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Feb 23 10:10:43 np0005626463.localdomain ceph-mon[294160]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Feb 23 10:10:43 np0005626463.localdomain ceph-mon[294160]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Feb 23 10:10:43 np0005626463.localdomain ceph-mon[294160]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Feb 23 10:10:43 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.108:0/1007018253' entity='client.admin' cmd={"prefix": "osd tree", "format": "json-pretty"} : dispatch
Feb 23 10:10:43 np0005626463.localdomain ceph-mon[294160]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Feb 23 10:10:43 np0005626463.localdomain ceph-mon[294160]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Feb 23 10:10:43 np0005626463.localdomain ceph-mon[294160]: pgmap v768: 177 pgs: 177 active+clean; 226 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Feb 23 10:10:43 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.106:0/754933428' entity='client.admin' cmd={"prefix": "health", "detail": "detail", "format": "json-pretty"} : dispatch
Feb 23 10:10:43 np0005626463.localdomain ceph-mon[294160]: from='client.49692 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 23 10:10:43 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.107:0/2540061614' entity='client.admin' cmd={"prefix": "config dump"} : dispatch
Feb 23 10:10:43 np0005626463.localdomain ceph-mon[294160]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Feb 23 10:10:43 np0005626463.localdomain ceph-mon[294160]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Feb 23 10:10:43 np0005626463.localdomain ceph-mon[294160]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Feb 23 10:10:43 np0005626463.localdomain ceph-mon[294160]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Feb 23 10:10:43 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "osd tree", "format": "json-pretty"} v 0)
Feb 23 10:10:43 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/257104403' entity='client.admin' cmd={"prefix": "osd tree", "format": "json-pretty"} : dispatch
Feb 23 10:10:44 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Feb 23 10:10:44 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Feb 23 10:10:44 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:10:44.483 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:10:44 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:10:44.484 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 10:10:44 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "config dump"} v 0)
Feb 23 10:10:44 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/1799386145' entity='client.admin' cmd={"prefix": "config dump"} : dispatch
Feb 23 10:10:44 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.106:0/257104403' entity='client.admin' cmd={"prefix": "osd tree", "format": "json-pretty"} : dispatch
Feb 23 10:10:44 np0005626463.localdomain ceph-mon[294160]: from='client.59473 -' entity='client.admin' cmd=[{"prefix": "device ls", "target": ["mon-mgr", ""]}]: dispatch
Feb 23 10:10:44 np0005626463.localdomain ceph-mon[294160]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Feb 23 10:10:44 np0005626463.localdomain ceph-mon[294160]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Feb 23 10:10:44 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.108:0/2333321494' entity='client.admin' cmd={"prefix": "config dump"} : dispatch
Feb 23 10:10:44 np0005626463.localdomain ceph-mon[294160]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Feb 23 10:10:44 np0005626463.localdomain ceph-mon[294160]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Feb 23 10:10:44 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.107:0/917282255' entity='client.admin' cmd={"prefix": "df", "detail": "detail"} : dispatch
Feb 23 10:10:44 np0005626463.localdomain ceph-mon[294160]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Feb 23 10:10:44 np0005626463.localdomain ceph-mon[294160]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Feb 23 10:10:44 np0005626463.localdomain ceph-mon[294160]: from='client.69701 -' entity='client.admin' cmd=[{"prefix": "device ls", "target": ["mon-mgr", ""]}]: dispatch
Feb 23 10:10:44 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.106:0/1799386145' entity='client.admin' cmd={"prefix": "config dump"} : dispatch
Feb 23 10:10:44 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.108:0/3292624239' entity='client.admin' cmd={"prefix": "df", "detail": "detail"} : dispatch
Feb 23 10:10:44 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.107:0/3234225218' entity='client.admin' cmd={"prefix": "df"} : dispatch
Feb 23 10:10:45 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "df", "detail": "detail"} v 0)
Feb 23 10:10:45 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/1778361225' entity='client.admin' cmd={"prefix": "df", "detail": "detail"} : dispatch
Feb 23 10:10:45 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Feb 23 10:10:45 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 10:10:45 np0005626463.localdomain ceph-mon[294160]: from='client.49761 -' entity='client.admin' cmd=[{"prefix": "device ls", "target": ["mon-mgr", ""]}]: dispatch
Feb 23 10:10:45 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.107:0/2510509450' entity='client.admin' cmd={"prefix": "fs dump"} : dispatch
Feb 23 10:10:45 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.108:0/3057711078' entity='client.admin' cmd={"prefix": "df"} : dispatch
Feb 23 10:10:45 np0005626463.localdomain ceph-mon[294160]: pgmap v769: 177 pgs: 177 active+clean; 226 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Feb 23 10:10:45 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.106:0/1778361225' entity='client.admin' cmd={"prefix": "df", "detail": "detail"} : dispatch
Feb 23 10:10:45 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.107:0/2938499830' entity='client.admin' cmd={"prefix": "fs ls"} : dispatch
Feb 23 10:10:45 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.108:0/3316535170' entity='client.admin' cmd={"prefix": "fs dump"} : dispatch
Feb 23 10:10:45 np0005626463.localdomain ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 10:10:46 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "df"} v 0)
Feb 23 10:10:46 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/2603396901' entity='client.admin' cmd={"prefix": "df"} : dispatch
Feb 23 10:10:46 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "fs dump"} v 0)
Feb 23 10:10:46 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/276361435' entity='client.admin' cmd={"prefix": "fs dump"} : dispatch
Feb 23 10:10:46 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:10:46.643 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:10:46 np0005626463.localdomain nova_compute[282206]: 2026-02-23 10:10:46.646 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 10:10:46 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.106:0/2603396901' entity='client.admin' cmd={"prefix": "df"} : dispatch
Feb 23 10:10:46 np0005626463.localdomain ceph-mon[294160]: from='client.59512 -' entity='client.admin' cmd=[{"prefix": "fs status", "target": ["mon-mgr", ""]}]: dispatch
Feb 23 10:10:46 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.108:0/1206180281' entity='client.admin' cmd={"prefix": "fs ls"} : dispatch
Feb 23 10:10:46 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.106:0/276361435' entity='client.admin' cmd={"prefix": "fs dump"} : dispatch
Feb 23 10:10:46 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.107:0/3662158986' entity='client.admin' cmd={"prefix": "mds stat"} : dispatch
Feb 23 10:10:46 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "fs ls"} v 0)
Feb 23 10:10:46 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/3701863891' entity='client.admin' cmd={"prefix": "fs ls"} : dispatch
Feb 23 10:10:47 np0005626463.localdomain kernel: cfg80211: Loading compiled-in X.509 certificates for regulatory database
Feb 23 10:10:47 np0005626463.localdomain kernel: cfg80211: Loaded X.509 cert 'sforshee: 00b28ddf47aef9cea7'
Feb 23 10:10:47 np0005626463.localdomain kernel: platform regulatory.0: Direct firmware load for regulatory.db failed with error -2
Feb 23 10:10:47 np0005626463.localdomain kernel: cfg80211: failed to load regulatory.db
Feb 23 10:10:47 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mon dump"} v 0)
Feb 23 10:10:47 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/4278975874' entity='client.admin' cmd={"prefix": "mon dump"} : dispatch
Feb 23 10:10:47 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.
Feb 23 10:10:47 np0005626463.localdomain systemd[1]: Started /usr/bin/podman healthcheck run da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.
Feb 23 10:10:47 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mds stat"} v 0)
Feb 23 10:10:47 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/930001315' entity='client.admin' cmd={"prefix": "mds stat"} : dispatch
Feb 23 10:10:47 np0005626463.localdomain podman[330346]: 2026-02-23 10:10:47.933031347 +0000 UTC m=+0.105328036 container health_status da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 23 10:10:47 np0005626463.localdomain ceph-mon[294160]: from='client.69758 -' entity='client.admin' cmd=[{"prefix": "fs status", "target": ["mon-mgr", ""]}]: dispatch
Feb 23 10:10:47 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.106:0/3701863891' entity='client.admin' cmd={"prefix": "fs ls"} : dispatch
Feb 23 10:10:47 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.108:0/1240231362' entity='client.admin' cmd={"prefix": "mds stat"} : dispatch
Feb 23 10:10:47 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.107:0/2630354100' entity='client.admin' cmd={"prefix": "mon dump"} : dispatch
Feb 23 10:10:47 np0005626463.localdomain ceph-mon[294160]: from='client.49812 -' entity='client.admin' cmd=[{"prefix": "fs status", "target": ["mon-mgr", ""]}]: dispatch
Feb 23 10:10:47 np0005626463.localdomain ceph-mon[294160]: pgmap v770: 177 pgs: 177 active+clean; 226 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Feb 23 10:10:47 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.108:0/4278975874' entity='client.admin' cmd={"prefix": "mon dump"} : dispatch
Feb 23 10:10:47 np0005626463.localdomain ceph-mon[294160]: from='client.59533 -' entity='client.admin' cmd=[{"prefix": "osd blocked-by", "target": ["mon-mgr", ""]}]: dispatch
Feb 23 10:10:47 np0005626463.localdomain ceph-mon[294160]: from='client.? 172.18.0.106:0/930001315' entity='client.admin' cmd={"prefix": "mds stat"} : dispatch
Feb 23 10:10:47 np0005626463.localdomain podman[330346]: 2026-02-23 10:10:47.969836436 +0000 UTC m=+0.142133195 container exec_died da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 23 10:10:47 np0005626463.localdomain systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Deactivated successfully.
Feb 23 10:10:48 np0005626463.localdomain podman[330345]: 2026-02-23 10:10:47.973152628 +0000 UTC m=+0.146008924 container health_status 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, name=ubi9/ubi-minimal, org.opencontainers.image.created=2026-02-05T04:57:10Z, com.redhat.component=ubi9-minimal-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1770267347, version=9.7, config_id=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, vendor=Red Hat, Inc., container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, architecture=x86_64, build-date=2026-02-05T04:57:10Z)
Feb 23 10:10:48 np0005626463.localdomain podman[330345]: 2026-02-23 10:10:48.055322188 +0000 UTC m=+0.228178524 container exec_died 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, release=1770267347, build-date=2026-02-05T04:57:10Z, version=9.7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, io.openshift.expose-services=, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, managed_by=edpm_ansible, name=ubi9/ubi-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, org.opencontainers.image.created=2026-02-05T04:57:10Z, config_id=openstack_network_exporter, distribution-scope=public, container_name=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Feb 23 10:10:48 np0005626463.localdomain systemd[1]: 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.service: Deactivated successfully.
Feb 23 10:10:48 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 23 10:10:48 np0005626463.localdomain ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mon dump"} v 0)
Feb 23 10:10:48 np0005626463.localdomain ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/3412827818' entity='client.admin' cmd={"prefix": "mon dump"} : dispatch
Feb 23 10:10:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 10:10:48.570 163572 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 10:10:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 10:10:48.571 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 10:10:48 np0005626463.localdomain ovn_metadata_agent[163567]: 2026-02-23 10:10:48.571 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
